diff --git a/.gitattributes b/.gitattributes index 94928096cbb6b995a60b8717c555e9a6b12da653..d6a734edcc6d195d7b508527be94b747c762b899 100644 --- a/.gitattributes +++ b/.gitattributes @@ -111,3 +111,21 @@ Transformer[[:space:]]Mechanism/Transformer_Implementation/home/jovyan/work/W4A1 Transformer[[:space:]]Mechanism/Transformer_Implementation/home/jovyan/work/W4A1/transformer.png filter=lfs diff=lfs merge=lfs -text Transformer[[:space:]]Mechanism/Transformer[[:space:]]Pre-Processing/home/jovyan/work/W4A4_UGL_POS/glove/glove.6B.100d.txt filter=lfs diff=lfs merge=lfs -text Transformer[[:space:]]Mechanism/Transformer[[:space:]]Pre-Processing/home/jovyan/work/W4A4_UGL_POS/preprocessing.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/NMT_with_Attention/NMT[[:space:]]with[[:space:]]MBR/Files/tf/images/att.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/NMT_with_Attention/NMT[[:space:]]with[[:space:]]MBR/Files/tf/images/attention.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/NMT_with_Attention/NMT[[:space:]]with[[:space:]]MBR/Files/tf/images/NMTModel.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/NMT_with_Attention/NMT[[:space:]]with[[:space:]]MBR/Files/tf/por-eng/por.txt filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/QA/QA_T5/Files/tf/data/c4-en-10k.json filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/QA/QA_T5/Files/tf/data/c4-en-10k.jsonl filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/QA/QA_T5/Files/tf/data/train-v2.0.json filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/QA/QA_T5/Files/tf/images/colab_help_2.png filter=lfs diff=lfs merge=lfs -text 
+NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/QA/QA_T5/Files/tf/images/fulltransformer.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/QA/QA_T5/Files/tf/images/qa.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/QA/QA_T5/Files/tf/pretrained_models/model_c4.data-00000-of-00001 filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/QA/QA_T5/Files/tf/pretrained_models/model_qa3.data-00000-of-00001 filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/Text_Summarization/Summarization/tf/images/decoder_layer.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/Text_Summarization/Summarization/tf/images/decoder.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/Text_Summarization/Summarization/tf/images/encoder_layer.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/Text_Summarization/Summarization/tf/images/encoder.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/Text_Summarization/Summarization/tf/images/self-attention.png filter=lfs diff=lfs merge=lfs -text +NLP[[:space:]]with[[:space:]]Attention[[:space:]]Models/Text_Summarization/Summarization/tf/images/transformer.png filter=lfs diff=lfs merge=lfs -text diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/.ipynb_checkpoints/C4W1_Basic_Attention-checkpoint.ipynb b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/.ipynb_checkpoints/C4W1_Basic_Attention-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..e3008688fe6bedb8346f902305fc9d8d489853b6 --- /dev/null +++ b/NLP with Attention 
Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/.ipynb_checkpoints/C4W1_Basic_Attention-checkpoint.ipynb @@ -0,0 +1,302 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "9c74bac5", + "metadata": {}, + "source": [ + "# Basic Attention Operation: Ungraded Lab\n", + "\n", + "As you've learned, attention allows a seq2seq decoder to use information from each encoder step instead of just the final encoder hidden state. In the attention operation, the encoder outputs are weighted based on the decoder hidden state, then combined into one context vector. This vector is then used as input to the decoder to predict the next output step.\n", + "\n", + "In this ungraded lab, you'll implement a basic attention operation as described in [Bahdanau et al. (2014)](https://arxiv.org/abs/1409.0473) using NumPy.\n", + "\n", + "This is a practice notebook, where you can try writing the code yourself. All of the solutions are provided at the end of the notebook." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a5288920", + "metadata": {}, + "outputs": [], + "source": [ + "# Import the libraries and define the functions you will need for this lab\n", + "import numpy as np\n", + "\n", + "def softmax(x, axis=0):\n", + " \"\"\" Calculate softmax function for an array x along specified axis\n", + " \n", + " axis=0 calculates softmax across rows which means each column sums to 1 \n", + " axis=1 calculates softmax across columns which means each row sums to 1\n", + " \"\"\"\n", + " return np.exp(x) / np.expand_dims(np.sum(np.exp(x), axis=axis), axis)" + ] + }, + { + "cell_type": "markdown", + "id": "9a6e0293", + "metadata": {}, + "source": [ + "## 1: Calculating alignment scores\n", + "\n", + "The first step is to calculate the alignment scores. This is a measure of similarity between the decoder hidden state and each encoder hidden state.
From the paper, this operation looks like\n", + "\n", + "$$\n", + "\\large e_{ij} = v_a^\\top \\tanh{\\left(W_a s_{i-1} + U_a h_j\\right)}\n", + "$$\n", + "\n", + "where $W_a \\in \\mathbb{R}^{n\\times m}$, $U_a \\in \\mathbb{R}^{n \\times m}$, and $v_a \\in \\mathbb{R}^m$\n", + "are the weight matrices and $n$ is the hidden state size. In practice, this is implemented as a feedforward neural network with two layers, where $m$ is the size of the layers in the alignment network. It looks something like:\n", + "\n", + "![alignment model](./images/alignment_model_3.jpg)\n", + "\n", + "Here $h_j$ are the encoder hidden states for each input step $j$ and $s_{i - 1}$ is the decoder hidden state of the previous step. The first layer corresponds to $W_a$ and $U_a$, while the second layer corresponds to $v_a$.\n", + "\n", + "To implement this, first concatenate the encoder and decoder hidden states to produce an array with size $K \\times 2n$ where $K$ is the number of encoder states/steps. For this, use `np.concatenate` ([docs](https://numpy.org/doc/stable/reference/generated/numpy.concatenate.html)). Note that there is only one decoder state so you'll need to reshape it to successfully concatenate the arrays. The easiest way is to use `decoder_state.repeat` ([docs](https://numpy.org/doc/stable/reference/generated/numpy.repeat.html#numpy.repeat)) to match the hidden state array size.\n", + "\n", + "Then, apply the first layer as a matrix multiplication between the weights and the concatenated input. Use the tanh function to get the activations. Finally, compute the matrix multiplication of the second layer weights and the activations. This returns the alignment scores." 
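The concatenate-multiply-tanh-multiply recipe above can be sketched end to end in plain NumPy. This is a minimal, self-contained illustration with hypothetical toy shapes; `K`, `n`, and `m` are stand-ins for the lab's `input_length`, `hidden_size`, and `attention_size`, and the weights are random rather than learned:

```python
# Minimal sketch of the two-layer alignment network described above
# (hypothetical toy shapes, randomly initialized weights).
import numpy as np

K, n, m = 5, 16, 10  # encoder steps, hidden state size, alignment layer size
rng = np.random.default_rng(0)

encoder_states = rng.standard_normal((K, n))  # h_j for each input step j
decoder_state = rng.standard_normal((1, n))   # s_{i-1}, a single previous state

layer_1 = rng.standard_normal((2 * n, m))     # first layer: stacks W_a and U_a
layer_2 = rng.standard_normal((m, 1))         # second layer: v_a

# Repeat the decoder state K times so it lines up with each encoder state,
# then concatenate along the feature axis to get a (K, 2n) input
inputs = np.concatenate((encoder_states, decoder_state.repeat(K, axis=0)), axis=1)

# First layer with tanh activation, then the second layer: one e_{ij} per step
scores = np.tanh(inputs @ layer_1) @ layer_2
print(scores.shape)  # (5, 1)
```

The same sequence of operations, with the lab's variable names, is what the next cell asks you to fill in.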
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "72857076", + "metadata": {}, + "outputs": [], + "source": [ + "hidden_size = 16\n", + "attention_size = 10\n", + "input_length = 5\n", + "\n", + "np.random.seed(42)\n", + "\n", + "# Synthetic vectors used to test\n", + "encoder_states = np.random.randn(input_length, hidden_size)\n", + "decoder_state = np.random.randn(1, hidden_size)\n", + "\n", + "# Weights for the neural network, these are typically learned through training\n", + "# Use these in the alignment function below as the layer weights\n", + "layer_1 = np.random.randn(2 * hidden_size, attention_size)\n", + "layer_2 = np.random.randn(attention_size, 1)\n", + "\n", + "# Implement this function. Replace None with your code. Solution at the bottom of the notebook\n", + "def alignment(encoder_states, decoder_state):\n", + " # First, concatenate the encoder states and the decoder state\n", + " inputs = None\n", + " assert inputs.shape == (input_length, 2 * hidden_size)\n", + " \n", + " # Matrix multiplication of the concatenated inputs and layer_1, with tanh activation\n", + " activations = None\n", + " assert activations.shape == (input_length, attention_size)\n", + " \n", + " # Matrix multiplication of the activations with layer_2. 
Remember that you don't need tanh here\n", + " scores = None\n", + " assert scores.shape == (input_length, 1)\n", + " \n", + " return scores" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fb638355", + "metadata": {}, + "outputs": [], + "source": [ + "# Run this to test your alignment function\n", + "scores = alignment(encoder_states, decoder_state)\n", + "print(scores)" + ] + }, + { + "cell_type": "markdown", + "id": "f26aae76", + "metadata": {}, + "source": [ + "If you implemented the function correctly, you should get these scores:\n", + "\n", + "```python\n", + "[[4.35790943]\n", + " [5.92373433]\n", + " [4.18673175]\n", + " [2.11437202]\n", + " [0.95767155]]\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "58b8cfa9", + "metadata": {}, + "source": [ + "## 2: Turning alignment into weights\n", + "\n", + "The next step is to calculate the weights from the alignment scores. These weights determine which encoder outputs are most important for the decoder output, and they should be between 0 and 1. You can use the softmax function (which is already implemented above) to get these weights from the alignment scores. Pass the alignment scores vector to the softmax function to get the weights. Mathematically,\n", + "\n", + "$$\n", + "\\large \\alpha_{ij} = \\frac{\\exp{\\left(e_{ij}\\right)}}{\\sum_{k=1}^K \\exp{\\left(e_{ik}\\right)}}\n", + "$$\n", + "\n", + "\n", + "\n", + "## 3: Weight the encoder output vectors and sum\n", + "\n", + "The weights tell you the importance of each input word with respect to the decoder state. In this step, you use the weights to modulate the magnitude of the encoder vectors. Words with little importance will be scaled down relative to important words. Multiply each encoder vector by its respective weight to get the alignment vectors, then sum up the weighted alignment vectors to get the context vector.
Mathematically,\n", + "\n", + "$$\n", + "\\large c_i = \\sum_{j=1}^K\\alpha_{ij} h_{j}\n", + "$$\n", + "\n", + "Implement these steps in the `attention` function below." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4546cbb5", + "metadata": {}, + "outputs": [], + "source": [ + "# Implement this function. Replace None with your code.\n", + "def attention(encoder_states, decoder_state):\n", + " \"\"\" Example function that calculates attention, returns the context vector \n", + " \n", + " Arguments:\n", + " encoder_states: NxM numpy array, where N is the number of vectors and M is the vector length\n", + " decoder_state: 1xM numpy array, where M is the vector length and must be the same M as encoder_states\n", + " \"\"\" \n", + " \n", + " # First, calculate the alignment scores\n", + " scores = None\n", + " \n", + " # Then take the softmax of the alignment scores to get a weight distribution\n", + " weights = None\n", + " \n", + " # Multiply each encoder state by its respective weight\n", + " weighted_scores = None\n", + " \n", + " # Sum up the weighted alignment vectors to get the context vector and return it\n", + " context = None\n", + " return context\n", + "\n", + "context_vector = attention(encoder_states, decoder_state)\n", + "print(context_vector)" + ] + }, + { + "cell_type": "markdown", + "id": "5d9f3df4", + "metadata": {}, + "source": [ + "If you implemented the `attention` function correctly, the context vector should be\n", + "\n", + "```python\n", + "[-0.63514569 0.04917298 -0.43930867 -0.9268003 1.01903919 -0.43181409\n", + " 0.13365099 -0.84746874 -0.37572203 0.18279832 -0.90452701 0.17872958\n", + " -0.58015282 -0.58294027 -0.75457577 1.32985756]\n", + "```\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "id": "4210899c", + "metadata": {}, + "source": [ + "## See below for solutions" + ] + }, + { + "cell_type": "markdown", + "id": "3ba0d629", + "metadata": {}, + "source": [ + "```python\n", + "# Solution\n", + "def 
alignment(encoder_states, decoder_state):\n", + " # First, concatenate the encoder states and the decoder state.\n", + " inputs = np.concatenate((encoder_states, decoder_state.repeat(input_length, axis=0)), axis=1)\n", + " assert inputs.shape == (input_length, 2*hidden_size)\n", + " \n", + " # Matrix multiplication of the concatenated inputs and the first layer, with tanh activation\n", + " activations = np.tanh(np.matmul(inputs, layer_1))\n", + " assert activations.shape == (input_length, attention_size)\n", + " \n", + " # Matrix multiplication of the activations with the second layer. Remember that you don't need tanh here\n", + " scores = np.matmul(activations, layer_2)\n", + " assert scores.shape == (input_length, 1)\n", + " \n", + " return scores\n", + "\n", + "# Run this to test your alignment function\n", + "scores = alignment(encoder_states, decoder_state)\n", + "print(scores)\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "f80faecb", + "metadata": {}, + "source": [ + "```python\n", + "# Solution\n", + "def attention(encoder_states, decoder_state):\n", + " \"\"\" Example function that calculates attention, returns the context vector \n", + " \n", + " Arguments:\n", + " encoder_states: NxM numpy array, where N is the number of vectors and M is the vector length\n", + " decoder_state: 1xM numpy array, where M is the vector length and must be the same M as encoder_states\n", + " \"\"\" \n", + " \n", + " # First, calculate the alignment scores between the encoder states and the decoder state\n", + " scores = alignment(encoder_states, decoder_state)\n", + " \n", + " # Then take the softmax of those scores to get a weight distribution\n", + " weights = softmax(scores)\n", + " \n", + " # Multiply each encoder state by its respective weight\n", + " weighted_scores = encoder_states * weights\n", + " \n", + " # Sum up the weighted encoder states to get the context vector\n", + " context = np.sum(weighted_scores, axis=0)\n", + " \n", + " return context\n", + "\n", + "context_vector = 
attention(encoder_states, decoder_state)\n", + "print(context_vector)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "16a6caa8", + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/.ipynb_checkpoints/C4W1_Bleu_Score-checkpoint.ipynb b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/.ipynb_checkpoints/C4W1_Bleu_Score-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..4cec1d6c21500424aae1bcb670fc4adfa701929c --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/.ipynb_checkpoints/C4W1_Bleu_Score-checkpoint.ipynb @@ -0,0 +1,585 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Calculating the Bilingual Evaluation Understudy (BLEU) score: Ungraded Lab" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this ungraded lab, you will implement a popular metric for evaluating the quality of machine-translated text: the BLEU score proposed by Kishore Papineni, et al. in their 2002 paper [\"BLEU: a Method for Automatic Evaluation of Machine Translation\"](https://www.aclweb.org/anthology/P02-1040.pdf). The BLEU score works by comparing a \"candidate\" text to one or more \"reference\" texts. The higher the score, the better the result.
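As a quick illustration of that comparison before the full implementation, here is a hypothetical toy example (not part of the lab) of the core counting idea behind BLEU: each candidate n-gram is credited only up to the number of times it appears in the reference.

```python
# Hypothetical mini-example of clipped n-gram counting, shown for unigrams.
from collections import Counter

reference = "the cat is on the mat".split()
candidate = "the the the cat on the mat".split()

ref_counts = Counter(reference)
cand_counts = Counter(candidate)

# "the" occurs 4 times in the candidate but only 2 times in the reference,
# so it is credited at most twice ("clipping").
clipped = sum(min(count, ref_counts[gram]) for gram, count in cand_counts.items())
precision_1 = clipped / sum(cand_counts.values())
print(precision_1)  # 5 of 7 candidate words credited -> 0.7142857142857143
```

The implementations later in the notebook generalize this clipped counting to n-grams up to length 4 and combine it with the brevity penalty.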
In the following sections you will calculate this value using your own implementation as well as using functions from a library." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 1. Importing the Libraries\n", + "\n", + "You will start by importing the Python libraries. First, you will implement your own version of the BLEU Score using NumPy. To verify that your implementation is correct, you will compare the results with those generated by the [SacreBLEU library](https://github.com/mjpost/sacrebleu). This package provides hassle-free computation of shareable, comparable, and reproducible BLEU scores. It also knows all the standard test sets and handles downloading, processing, and tokenization." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "[nltk_data] Downloading package punkt to /home/jovyan/nltk_data...\n", + "[nltk_data] Package punkt is already up-to-date!\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Requirement already satisfied: sacrebleu in /opt/conda/lib/python3.10/site-packages (2.3.1)\n", + "Requirement already satisfied: portalocker in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (2.8.2)\n", + "Requirement already satisfied: regex in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (2023.10.3)\n", + "Requirement already satisfied: tabulate>=0.8.9 in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (0.9.0)\n", + "Requirement already satisfied: numpy>=1.17 in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (1.24.3)\n", + "Requirement already satisfied: colorama in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (0.4.6)\n", + "Requirement already satisfied: lxml in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (4.9.3)\n" + ] + } + ], + "source": [ + "import numpy as np # import numpy to make numerical 
computations.\n", + "import nltk # import NLTK to handle simple NL tasks like tokenization.\n", + "nltk.download(\"punkt\")\n", + "from nltk.util import ngrams\n", + "from collections import Counter # import a counter.\n", + "!pip3 install 'sacrebleu' # install the sacrebleu package.\n", + "import sacrebleu # import sacrebleu in order to compute the BLEU score.\n", + "import matplotlib.pyplot as plt # import pyplot in order to make some illustrations." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 2. BLEU score\n", + "\n", + "## 2.1 Definitions and formulas\n", + "\n", + "You have seen how to calculate the BLEU score in this week's lectures. Formally, you can express the BLEU score as:\n", + "\n", + "$$BLEU = BP\\times\\Bigl(\\prod_{i=1}^{n}precision_i\\Bigr)^{(1/n)}.\\tag{1}$$\n", + "\n", + "\n", + "The BLEU score depends on $BP$, which stands for Brevity Penalty, and on the geometric mean of the precisions for different lengths of n-grams, both of which are described below. The product runs from $i=1$ to $i=n$ to account for 1-grams up to n-grams, and the exponent of $1/n$ is there to calculate the geometric average. In this notebook, you will use $n=4$.\n", + "\n", + "The **Brevity Penalty** is defined as an exponential decay:\n", + "\n", + "$$BP = min\\Bigl(1, e^{(1-({len(ref)}/{len(cand)}))}\\Bigr),\\tag{2}$$\n", + "\n", + "where ${len(ref)}$ and ${len(cand)}$ refer to the length or count of words in the reference and candidate translations. The brevity penalty helps to handle very short translations. \n", + "\n", + "The **precision** is defined as:\n", + "\n", + "$$precision_i = \\frac {\\sum_{s_i \\in{cand}}min\\Bigl(C(s_i, cand), C(s_i, ref)\\Bigr)}{\\sum_{s_i \\in{cand}} C(s_i, cand)}.\\tag{3}$$\n", + "\n", + "The sum goes over all the i-grams $s_i$ in the candidate sentence $cand$. $C(s_i, cand)$ and $C(s_i, ref)$ are the counts of the i-grams in the candidate and reference sentences respectively.
So the sum counts all the n-grams in the candidate sentence that also appear in the reference sentence, but only counts them as many times as they appear in the reference sentence and not more. This is then divided by the total number of i-grams in the candidate sentence." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 2.2 Visualizing the BLEU score\n", + "\n", + "### Brevity Penalty:\n", + "The brevity penalty penalizes generated translations that are shorter than the reference sentence. It compensates for the fact that the BLEU score has no recall term." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAGwCAYAAABVdURTAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAABcvElEQVR4nO3deVwU9eMG8Gd2WW4BEUEEBLxRPBCVwEytxLRMy1LTvLPwzLM0y6vMX5rmrZlXlleKV2UmWZ5oCoIniiIKKoqgcsq1+/n9Ye7XlcNdBAaW5/167eslw8zOswPMPs5+ZkYSQggQERERGQmF3AGIiIiIShLLDRERERkVlhsiIiIyKiw3REREZFRYboiIiMiosNwQERGRUWG5ISIiIqNiIneAsqbRaHDr1i1UqVIFkiTJHYeIiIj0IIRAWloaatasCYWi6GMzla7c3Lp1C25ubnLHICIiomKIj4+Hq6trkfNUunJTpUoVAI82jo2NjcxpiIiISB+pqalwc3PTvo8XpdKVm8cfRdnY2LDcEBERVTD6DCnhgGIiIiIyKiw3REREZFRYboiIiMiosNwQERGRUWG5ISIiIqPCckNERERGheWGiIiIjArLDRERERkVlhsiIiIyKiw3REREZFRkLTeHDh1C165dUbNmTUiShJ07dz5zmYMHD8LX1xfm5uaoXbs2VqxYUfpBiYiIqMKQtdxkZGSgWbNmWLJkiV7zx8bGokuXLmjbti0iIiLw2WefYfTo0QgODi7lpERERFRRyHrjzM6dO6Nz5856z79ixQrUqlULCxYsAAB4eXkhLCwM3377LXr06FFKKfWj1ggkpDyUNQM9Hycbc6iU/KSWiKiiq1B3BT927BgCAwN1pnXq1AmrV69Gbm4uVCpVvmWys7ORnZ2t/To1NbVUsiVnZOPFb/4pleemslHDxhxL+/rA191e7ihERPQcKlS5uX37NpycnHSmOTk5IS8vD0lJSXB2ds63zOzZszFjxowyyWdmwv/1V1RqjcDt1Cz0XnkcU99ohPdfcIckSXLHIiKiYqhQ5QZAvjccIUSB0x+bPHkyxo0bp/06NTUVbm5uJZ7LsYo5Ln2l/0dsVL5kZOfhk21n8PvZBHyx6zwi4h9g9ttNYGailDsaEREZqEIdaqhRowZu376tMy0xMREmJiaoVq1agcuYmZnBxsZG50H0NCszEyzp44MpXbygkIDtp27imz8uyR2LiIiKoUKVG39/f4SEhOhM27dvH1q2bFngeBsiQ0iShKEv1cby930BAGtDY
xERd1/mVEREZChZy016ejoiIyMRGRkJ4NGp3pGRkYiLiwPw6COl/v37a+cPCgrC9evXMW7cOERFRWHNmjVYvXo1JkyYIEd8MlKdGtfAWz4uEAKYFHwWOXkauSMREZEBZC03YWFh8PHxgY+PDwBg3Lhx8PHxwdSpUwEACQkJ2qIDAJ6entizZw8OHDiA5s2b48svv8SiRYtkPw2cjM8XbzSCvZUpLt1Jw/cHY+SOQ0REBpDE4xG5lURqaipsbW2RkpLC8TdUpF2RN/Hx5kiYKhXY8/GLqOtYRe5IRESVliHv3xVqzA1RWXqzWU10aFAdOWoNJgWfhUZTqf4fQERUYbHcEBVCkiR89VYTWJkqEXb9Pn7+97rckYiISA8sN0RFcLGzwKTODQEA3/xxETfuZ8qciIiInoXlhugZ+vq5o5VHVWTkqPHZjnOoZMPUiIgqHJYbomdQKCT8X4+mMDVR4FD0XWw/dVPuSEREVASWGyI91KlujbGv1gcAzPztAhLTsmROREREhWG5IdLT0Lae8HaxQcrDXEzbdV7uOEREVAiWGyI9mSgVmNOjGUwUEv44dxt7zibIHYmIiArAckNkgEY1bTC8fR0AwNRd53AvI0fmRERE9DSWGyIDjXi5Luo7WSMpPQczf+XHU0RE5Q3LDZGBzEyUmPNOMygkYGfkLeyPuiN3JCIiegLLDVExNHezw9C2tQEAn+04i5SHuTInIiKix1huiIppbMf68HSwwp3UbMz6/YLccYiI6D8sN0TFZK5SYs47TSFJwC9hN3DgUqLckYiICCw3RM+llYc9BgV4AgAmbz+L1Cx+PEVEJDeWG6LnNLFTA7hXs0RCSha+/j1K7jhERJUeyw3Rc7IwVWLuO80gScDmk/E4FH1X7khERJUayw1RCWjtaY8B/h4AgEnBZ/jxFBGRjFhuiErIJ689+njqVkoWZv3Gj6eIiOTCckNUQixNTbQfT20Ji8c/PHuKiEgWLDdEJai15//OnpoUfAYpmfx4ioiorLHcEJWwiZ0aaC/uN/M3XtyPiKissdwQlTALUyW+fffRxf2CT93AXxd47ykiorLEckNUCnzd7bX3npq84yzuZ+TInIiIqPJguSEqJeM61kddR2vcTcvG1N3n5Y5DRFRpsNwQlRJzlRLz3m0GpULCr6dv4fczCXJHIiKqFFhuiEpRMzc7DG9fBwDw+c6zuJuWLXMiIiLjx3JDVMpGvVwPXs42uJ+Ziyk7zkIIIXckIiKjxnJDVMpMTRSY924zqJQS9l24g+2nbsodiYjIqLHcEJWBRjVtMObV+gCA6bvP49aDhzInIiIyXiw3RGXko5dqw6eWHdKy8zBx22loNPx4ioioNLDcEJURE6UC83s2h7lKgaNXkrH+2DW5IxERGSWWG6Iy5Olghc+6eAEAZv9xETF302VORERkfFhuiMrY+37uaFvPAdl5Goz75TTy1Bq5IxERGRWWG6IyplBImPNOU1QxN8Hp+AdY+k+M3JGIiIwKyw2RDJxtLfBlN28AwKK/L+N0/AN5AxERGRGWGyKZdGteE683dYZaIzD2l0g8zFHLHYmIyCiw3BDJRJIkzOruDccqZrh6NwPf7L0odyQiIqPAckMkIztLU8x9txkAYF3oNRyKvitzIiKiio/lhkhm7epXR39/dwDAxG2n8SAzR+ZEREQVG8sNUTkwubMXajtY4U5qNqbsOMebaxIRPQeWG6JywMJUiQW9m8NEIeH3swnYEcGbaxIRFZfs5WbZsmXw9PSEubk5fH19cfjw4SLnX7p0Kby8vGBhYYEGDRpg/fr1ZZSUqHQ1dbXDmFfrAQCm7jqP+HuZMiciIqqYZC03W7ZswZgxYzBlyhRERESgbdu26Ny5M+Li4gqcf/ny5Zg8eTKmT5+O8+fPY8aMGRgxYgR+/fXXMk5OVDqGta+Llu5VkZ6dh/G/nIaaN9ckIjKYJGT8cN/Pzw8tWrTA8uXLtdO8vLzQvXt3zJ49O9/8AQEBa
NOmDebOnaudNmbMGISFheHIkSN6rTM1NRW2trZISUmBjY3N878IohIWfy8TnRceRnp2Hj55rQGGt68rdyQiItkZ8v4t25GbnJwchIeHIzAwUGd6YGAgQkNDC1wmOzsb5ubmOtMsLCxw4sQJ5ObmFrpMamqqzoOoPHOzt8T0NxsDAObvi8bZGykyJyIiqlhkKzdJSUlQq9VwcnLSme7k5ITbt28XuEynTp2watUqhIeHQwiBsLAwrFmzBrm5uUhKSipwmdmzZ8PW1lb7cHNzK/HXQlTSerRwQZcmNZCnEfh4SwQyc/LkjkREVGHIPqBYkiSdr4UQ+aY99sUXX6Bz58544YUXoFKp0K1bNwwcOBAAoFQqC1xm8uTJSElJ0T7i4+NLND9RaZAkCV+/1QQ1bMxx9W4Gvvo9Su5IREQVhmzlxsHBAUqlMt9RmsTExHxHcx6zsLDAmjVrkJmZiWvXriEuLg4eHh6oUqUKHBwcClzGzMwMNjY2Og+iisDO0hTzezaDJAEb/43DvvMFH9EkIiJdspUbU1NT+Pr6IiQkRGd6SEgIAgICilxWpVLB1dUVSqUSmzdvxhtvvAGFQvaDUEQlLqCuAz5sWxsA8GnwGdxJzZI5ERFR+SdrIxg3bhxWrVqFNWvWICoqCmPHjkVcXByCgoIAPPpIqX///tr5o6Oj8fPPP+Py5cs4ceIEevfujXPnzuHrr7+W6yUQlbrxgQ3QuKYN7mfmYsLW09Dw9HAioiKZyLnyXr16ITk5GTNnzkRCQgK8vb2xZ88euLs/us9OQkKCzjVv1Go15s2bh0uXLkGlUqFDhw4IDQ2Fh4eHTK+AqPSZmiiwsLcP3lh8GIcvJ2HN0Vh88N/RHCIiyk/W69zIgde5oYpqw7/XMWXHOaiUEnYMbwNvF1u5IxERlZkKcZ0bIjJMn9a1ENjICblqgdGbeXo4EVFhWG6IKghJkvBNj6ZwsjHD1bsZmPnrBbkjERGVSyw3RBVIVStTfNerOSQJ2HwyHnvOJsgdiYio3GG5IapgAuo4YFi7OgCAScFncPPBQ5kTERGVLyw3RBXQ2I710czNDqlZeRi7OZJ3DyciegLLDVEFpFIqsKh3c1ibmeDEtXtY/PdluSMREZUbLDdEFZR7NSt82f3R3cMX7b+Mf68my5yIiKh8YLkhqsDe8nHF2y1coBHAmC2RuJ+RI3ckIiLZsdwQVXBfdvOGp4MVElKy8EnwGVSy63ISEeXDckNUwVmZmWDxez5QKSWEXLiDn45flzsSEZGsWG6IjIC3iy0mdfYCAHz1exQu3EqVORERkXxYboiMxOA2HniloSNy8jQYuekUMrJ5ewYiqpxYboiMhCRJmPtuM9SwMcfVuxmYuuu83JGIiGTBckNkROytTLGwd3MoJCD41A0Eh9+QOxIRUZljuSEyMn61q2HMq/UBAF/sOoeYu+kyJyIiKlssN0RGaESHugioUw2ZOWqM2HAKWblquSMREZUZlhsiI6RUSFjQqzmqWZni4u00zPo9Su5IRERlhuWGyEg52phjfq/mAICfjl/H72cS5A1ERFRGWG6IjFi7+tUxrH0dAMCnwWdwLSlD5kRERKWP5YbIyI3vWB8t3asiPTsPIzZy/A0RGT+WGyIjZ6JUYHEfH1S1VOH8rVR8vYfjb4jIuLHcEFUCzrYW2vE3649x/A0RGTeWG6JKokMDRwzn+BsiqgRYbogqkXEd66OVx6PxN8N5/RsiMlIsN0SViIlSgcXvtYC9lSkuJKRi5m8X5I5ERFTiWG6IKpkatuZY0Ks5JAnY+G8cdkbclDsSEVGJYrkhqoReql8do16uBwCYvP0sLt9JkzkREVHJYbkhqqQ+fqUeXqzrgIe5agzfcAqZOXlyRyIiKhEsN0SVlFIhYUHv5nCsYobLiemYsuMchBByxyIiem4sN0SVmIO1GZb0aQGlQsKOiJvYdCJe7khERM+N5YaokmvtaY+JnRoAAKbvPo8zNx7IG4iI6Dmx3
BARPnqpNjo2ckKOWoNhP5/Cg8wcuSMRERUbyw0RQZIkfPtuM7hXs8TNBw8xdkskNBqOvyGiionlhogAALYWKizv6wszEwX+uXQXS/+5InckIqJiYbkhIq1GNW3wVXdvAMD8v6Jx+PJdmRMRERmO5YaIdLzb0g29W7lBCODjzZG49eCh3JGIiAzCckNE+Ux/szG8XWxwLyMHwzacQnYeb7BJRBUHyw0R5WOuUmJ5X1/YWqhwOv4BZv7KG2wSUcXBckNEBXKzt8TC3o9usLnh3zhsDeMF/oioYmC5IaJCtW/giLGv1gcATNl5DudupsiciIjo2VhuiKhIIzvUxcsNHZGTp0HQz+G4n8EL/BFR+cZyQ0RFUigkfNezOWrZW+LG/Yf4eEsk1LzAHxGVYyw3RPRMtpYqrHjfF+YqBQ5F38WCv6LljkREVCjZy82yZcvg6ekJc3Nz+Pr64vDhw0XOv2HDBjRr1gyWlpZwdnbGoEGDkJycXEZpiSqvRjVt8H9vNwUALP77Cv48f1vmREREBZO13GzZsgVjxozBlClTEBERgbZt26Jz586Ii4srcP4jR46gf//+GDJkCM6fP4+tW7fi5MmT+OCDD8o4OVHl1N3HBYPaeAAAxv9yGlcS0+UNRERUAFnLzfz58zFkyBB88MEH8PLywoIFC+Dm5obly5cXOP/x48fh4eGB0aNHw9PTEy+++CI++ugjhIWFFbqO7OxspKam6jyIqPg+6+IFP097pGfn4cOfwpCWlSt3JCIiHbKVm5ycHISHhyMwMFBnemBgIEJDQwtcJiAgADdu3MCePXsghMCdO3ewbds2vP7664WuZ/bs2bC1tdU+3NzcSvR1EFU2KqUCS/u2gLOtOa7ezcC4X07zDuJEVK7IVm6SkpKgVqvh5OSkM93JyQm3bxf8WX5AQAA2bNiAXr16wdTUFDVq1ICdnR0WL15c6HomT56MlJQU7SM+nhciI3peDtZmWP6+L0yVCoRcuMM7iBNRuSL7gGJJknS+FkLkm/bYhQsXMHr0aEydOhXh4eHYu3cvYmNjERQUVOjzm5mZwcbGRudBRM+vuZudzh3E/754R+ZERESPyFZuHBwcoFQq8x2lSUxMzHc057HZs2ejTZs2mDhxIpo2bYpOnTph2bJlWLNmDRISEsoiNhE9oWcrN/T1q/XoDuKbIhFzlwOMiUh+spUbU1NT+Pr6IiQkRGd6SEgIAgICClwmMzMTCoVuZKVSCeDRER8iKnvTujZGK4+qSMvOw9D1YUjlAGMikpmsH0uNGzcOq1atwpo1axAVFYWxY8ciLi5O+zHT5MmT0b9/f+38Xbt2xfbt27F8+XJcvXoVR48exejRo9G6dWvUrFlTrpdBVKmZmiiwrK8vatj8N8B4SyQHGBORrEzkXHmvXr2QnJyMmTNnIiEhAd7e3tizZw/c3d0BAAkJCTrXvBk4cCDS0tKwZMkSjB8/HnZ2dnj55ZfxzTffyPUSiAhA9SpmWNnfF++sOIa/ohKx4K9ojAtsIHcsIqqkJFHJPs9JTU2Fra0tUlJSOLiYqIQFh9/A+K2nAQAr3m+B17ydZU5ERMbCkPdv2c+WIiLj0cPXFYPbeAIAxv1yGpdup8mciIgqI5YbIipRn3VpiIA61ZCZo8bQ9WG4n5EjdyQiqmRYboioRJkoFVjapwXc7C0Qdy8TwzecQq5aI3csIqpEDC437du3x/r16/Hw4cPSyENERqCqlSlW9W8FK1Mljl1Nxle/XZA7EhFVIgaXG19fX3zyySeoUaMGhg4diuPHj5dGLiKq4BrUqILvejUHAPx47Do2/htX9AJERCXE4HIzb9483Lx5E+vXr8fdu3fx0ksvoVGjRvj2229x5w4vv05E/xPYuAYmBNYHAEzddQ4nYu/JnIiIKoNijblRKpXo1q0bdu7ciZs3b6JPnz744osv4Obmhu7du+Pvv/8u6ZxEVEGN6FAXbzR1Rp5GYNjP4bhxP1PuSERk5J5rQPGJE
ycwdepUfPvtt3B0dMTkyZPh6OiIrl27YsKECSWVkYgqMEmSMPedZmhc0wbJGTkYuj4cGdl5csciIiNmcLlJTEzEvHnz4O3tjbZt2+Lu3bvYvHkzrl27hhkzZmDlypXYtWsXVqxYURp5iagCsjBV4of+LeFgbYaohFSM5S0aiKgUGVxuXF1dsWrVKgwYMAA3btzAtm3b8Nprr0GSJO08rVu3RqtWrUo0KBFVbDXtLPB9P1+YKhXYd+EO5oVckjsSERkpg2+/cPjwYbRt27a08pQ63n6BSF47Im5g7JZHt2j4rlczvOXjKnMiIqoISvX2C9OmTcODBw8KXOnLL79s6NMRUSXzlo8rhrWvAwD4NPgsTsXdlzkRERkbg8vNwYMHkZOT/3LqWVlZOHz4cImEIiLjNjGwATo2ckJOngYfrg/HzQe8KCgRlRwTfWc8c+YMAEAIgQsXLuD27dva76nVauzduxcuLi4ln5CIjI5CIWFBr+bosTwUF2+n4YMfw7AtyB9WZnrvkoiICqX3mBuFQqEdNFzQIhYWFli8eDEGDx5csglLGMfcEJUfN+5novvSo0hKz0HHRk74/n1fKBTSsxckokrHkPdvvcvN9evXIYRA7dq1ceLECVSvXl37PVNTUzg6OkKpVD5f8jLAckNUvoRfv4f3fvgXOXkafPRSbUzu4iV3JCIqhwx5/9b7GLC7uzsAQKPh3X2JqOT4uttj7jtN8fHmSHx/6Co8HazQu3UtuWMRUQWmV7nZvXu33k/45ptvFjsMEVVO3Zq7IOZuBhbtv4zPd55DLXtLBNR1kDsWEVVQen0spVDod1KVJElQq9XPHao08WMpovJJCIHRmyPx6+lbsDE3wY4RbVCnurXcsYionCjx69xoNBq9HuW92BBR+fXoHlRN4VPLDqlZeRiy7iTuZ+S/7AQR0bM8140ziYhKkrlKiZX9WsLFzgLXkjPx0c/hyMnjOD8iMozBt18AgIyMDBw8eBBxcXH5Lug3evToEgtXGvixFFH5d+l2GnosD0V6dh7e9nHBvJ7NdO5fR0SVT6mcCv5YREQEunTpgszMTGRkZMDe3h5JSUmwtLSEo6Mjrl69+lzhSxvLDVHFcDD6LgavOwm1RmDsq/Xx8av15I5ERDIq1XtLjR07Fl27dsW9e/dgYWGB48eP4/r16/D19cW3335b7NBERE9qV786vuzmDQD47q9o7Ii4IXMiIqooDC43kZGRGD9+PJRKJZRKJbKzs+Hm5oY5c+bgs88+K42MRFRJ9fGrhY9eqg0A+HTbWfx7NVnmRERUERhcblQqlfazbycnJ8TFxQEAbG1ttf8mIiopn77WEJ29ayBHrcGHP4Uj5m663JGIqJwzuNz4+PggLCwMANChQwdMnToVGzZswJgxY9CkSZMSD0hElZtCIeG7Xs3R3M0OKQ9zMXjdSdzjKeJEVASDy83XX38NZ2dnAMCXX36JatWqYdiwYUhMTMTKlStLPCARkblKiVUDWsK1qgWuJ2di6PowZOXyulpEVLBinQpekfFsKaKK60piGt5eForUrDx0aVIDS95rwbuIE1USpXq2FBGRXOo6VsHK/i1hqlRgz9nb+HpPlNyRiKgcMrjc3LlzB/369UPNmjVhYmKiPWvq8YOIqDS9ULsa5r7bFACw6kgs1h6NlTkREZU3et0V/EkDBw5EXFwcvvjiCzg7O/OqoURU5ro1d8HNBw8xZ+8lzPztApxtLfCadw25YxFROWHwmJsqVarg8OHDaN68eSlFKl0cc0NkHIQQmLLzHDb+GwczEwU2ffgCWtSqKncsIiolpTrmxs3NDZVsDDIRlUOSJGHmm43xckNHZOdp8MGPYbiWlCF3LCIqBwwuNwsWLMCkSZNw7dq1UohDRKQ/E6UCi9/zQRMXW9zLyMGAtSeQlJ4tdywikpnBH0tVrVoVmZmZyMvLg6WlJVQqlc737927V6IBSxo/liIyPolpWXh7WShu3H+IZq622Dj0BViZG
…(base64-encoded PNG of the brevity-penalty plot, elided)…",
+      "text/plain": [
+       ""
+      ]
+     },
+     "metadata": {},
+     "output_type": "display_data"
+    }
+   ],
+   "source": [
+    "reference_length = 1\n",
+    "candidate_length = np.linspace(1.5, 0.5, 100)\n",
+    "\n",
+    "length_ratio = reference_length / candidate_length\n",
+    "BP = np.minimum(1, np.exp(1 - length_ratio))\n",
+    "\n",
+    "# Plot the data\n",
+    "fig, ax = plt.subplots(1)\n",
+    "lines = ax.plot(length_ratio, BP)\n",
+    "ax.set(\n",
+    "    xlabel=\"Ratio of the length of the reference to the candidate text\",\n",
+    "    ylabel=\"Brevity Penalty\",\n",
+    ")\n",
+    "plt.show()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### N-Gram Precision:\n",
+    "The n-gram precision counts how many n-grams (in your case unigrams, bigrams, trigrams, and four-grams, for n = 1, ..., 4) match their n-gram counterparts in the reference translations. This term acts as a precision metric. Unigrams account for adequacy, while longer n-grams account for fluency of the translation. To avoid overcounting, the n-gram counts are clipped to the maximal n-gram count occurring in the reference ($m_{n}^{ref}$). Typically, precision shows exponential decay with the order of the n-gram."
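As a side note to the markdown cell above, the clipped-count idea can be sketched in a few lines of plain Python. This is not part of the assignment; `ngrams`, `clipped_precision`, and the toy `candidate`/`reference` sentences are illustrative names chosen here:

```python
from collections import Counter

def ngrams(tokens, n):
    # All contiguous n-grams of a token list
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def clipped_precision(candidate, reference, n):
    # Count candidate n-grams, clipping each count by its count in the reference
    cand_counts = Counter(ngrams(candidate, n))
    ref_counts = Counter(ngrams(reference, n))
    clipped = sum(min(count, ref_counts[gram]) for gram, count in cand_counts.items())
    total = sum(cand_counts.values())
    return clipped / total if total else 0.0

# "the" appears 3 times in the candidate but only once in the reference,
# so its count is clipped to 1; "cat" matches once: (1 + 1) / 4 = 0.5
candidate = "the the the cat".split()
reference = "the cat sat".split()
print(clipped_precision(candidate, reference, 1))  # → 0.5
```

Without the clipping step, a degenerate candidate that repeats a common reference word would score a perfect unigram precision, which is exactly the overcounting the cell above warns about.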
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 3,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "image/png": "
MoKAg44svvmi1D+PavUyfPt3IyMgwDMM4brhhbDsP4aYba0u4eeedd4zAwEDD7XZ72o72fwq5ubnHPedll11m3HXXXZ718ePHGxdeeKFnvbGx0QgJCTGmTZvmaXO73YYk4+OPP27jLzu5NTc3G5dffrnXn2trGNvu47PPPjNCQkKMgIAAIywszFi/fv1R+zKu3ceaNWuM2NhY45dffjEM4/jhhrHtPNxzYyEPP/ywevfu7VlcLpe++uorORwODRgwwNPvggsuaHX/+Ph4r/Wmpib9/e9/16hRoxQeHq7evXvrvffe8/okhiSv68wBAQEKDw/Xueee62k78pX3ysrKdv/Gk8Htt9+uzz77TGvWrPG0Mbbd24gRI7Rjxw5t3bpVt956q26++Wbt3LmTce3GysrKNGfOHL344osKDg5usZ2x9S+/flsKvpWWlqZrr73Wsz5o0CAZhiGbzdam/UNCQrzWn3jiCT355JPKzc3Vueeeq5CQEGVkZKihocGr3++/yG6z2bzajpy/ubnZ1O85Gc2ePVtvvfWWNm/erNNPP93Tzth2b0FBQRo2bJikw/9offrpp3rqqaeUnZ3NuHZTJSUlqqysVFxcnKetqalJmzdv1rPPPqt9+/Yxtn5EuLGQfv36qV+/fl5tI0eOlMvl0r59+zyJ/dNPP23T8YqKinTFFVfopptuknT4L8M333yjmJgY3xYOGYah2bNna926ddq4caOio6O9tjO21mIYhurr6xnXbmzixIn6/PPPvdpmzpypkSNH6t5771V4eLjCw8O9tjO2nYdw080cOHBA3377rWd9z5492rFjh/r166chQ4a06D958mSdeeaZuvnmm/Xoo4+qrq5O8+bNk6Tj/h/EsGHDtHbtWhUXF+vUU09VTk6OKioq+MvUAW677Ta9/PLLevPNN9WnTx9VVFRIksLCwtSrV69W92Fsu4f77rtPycnJcjgcqqur0yuvvKKNGzdqw4YNrfZnXLuHPn36KDY21qstJCRE4eHhLdqPYGw7D/fcdDPbtm3T6NGjNXr0aElSZmamRo8erQcffLDV/gEBAXrjjTd04MABjRkzRqmpqbr//vslqdXrxP/fAw88oPPPP1+XXHKJLrroIg0YMEBXXnmlT38PDsvPz1dNTY0uuugiDRw40LMUFBQcdR/GtnvYt2+fpk2bphEjRmjixIn65JNPtGHDBk2ePLnV/oyrdTG2ncdmGIbh7yLQuT766CNdeOGF+vbbb3XmmWf6uxz4EGNrTYyrdTG2HYNwcxJYt26devfureHDh+vbb7/VnDlzdOqpp2rLli3+Lg3txNhaE+NqXYxt5+Cem5NAXV2d7rnnHpWVlSkiIkKTJk3SE0884e+y4AOMrTUxrtbF2HYOZm4AAIClcEMxAACwFMINAACwFMINAACwFMINAACwFMINAACwFMINAACwFMINAACwFMINAACwFMINAACwlP8DyN1FcQKWPaEAAAAASUVORK5CYII=", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Mocked dataset showing the precision for different n-grams\n", + "data = {\"1-gram\": 0.8, \"2-gram\": 0.7, \"3-gram\": 0.6, \"4-gram\": 0.5}\n", + "\n", + "# Plot the datapoints defined above\n", + "fig, ax = plt.subplots(1)\n", + "bars = ax.bar(*zip(*data.items()))\n", + "ax.set(ylabel=\"N-gram precision\")\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### N-gram BLEU score:\n", + "When the n-gram precision is normalized by the brevity penalty (BP), then the exponential decay of n-grams is almost fully compensated. The BLEU score corresponds to a geometric average of this modified n-gram precision." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAGdCAYAAADuR1K7AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAyCklEQVR4nO3de1RU9f7/8deIMpgKKiR5GZHUDCU9Bl3wkt2ksFVZrbJMTYVOSppIZZp1SutEV8TqgFKax2+ldMJON9Kmm5fMSsJuVlpakA0SWOClIGD//nA5vzMBOhsGB3bPx1p7Leczn8/e7/GzXL367JvNMAxDAAAAFtHG3wUAAAD4EuEGAABYCuEGAABYCuEGAABYCuEGAABYCuEGAABYCuEGAABYCuEGAABYSlt/F3C81dbW6qefflKnTp1ks9n8XQ4AAPCCYRjav3+/evTooTZtjr4285cLNz/99JMcDoe/ywAAAI1QVFSkXr16HbXPXy7cdOrUSdLhv5zg4GA/VwMAALxRUVEhh8Ph/u/40fzlws2RU1HBwcGEGwAAWhlvLinhgmIAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGApfg83mZmZioyMVFBQkGJiYrRx48aj9n/uuec0ZMgQnXDCCerevbumTJmisrKy41QtAABo6fwabnJycpSSkqL58+eroKBAI0eOVEJCggoLC+vtv2nTJk2aNEmJiYn68ssv9Z///Ecff/yxkpKSjnPlAACgpfJruElPT1diYqKSkpIUFRWljIwMORwOZWVl1dt/y5Yt6tOnj2655RZFRkZqxIgRuummm7R169bjXDkAAGip/BZuqqqqlJ+fr/j4eI/2+Ph4bd68ud4xw4YN048//qi8vDwZhqG9e/fqxRdf1CWXXNLgcSorK1VRUeGxAQAA62rrrwOXlpaqpqZG4eHhHu3h4eEqLi6ud8ywYcP03HPPady4cfr9999VXV2tyy67TE888USDx0lLS9OCBQt8WvvR9Jn7+nE7Fjx9/2DDIRcA8Nfh9wuKbTabx2fDMOq0HbF9+3bdcsst+sc//qH8/HytXbtWu3fv1rRp0xrc/
7x581ReXu7eioqKfFo/AABoWfy2chMWFqaAgIA6qzQlJSV1VnOOSEtL0/Dhw3X77bdLkgYPHqwOHTpo5MiRuv/++9W9e/c6Y+x2u+x2u+9/AAAAaJH8tnITGBiomJgYOZ1Oj3an06lhw4bVO+bQoUNq08az5ICAAEmHV3wAAAD8eloqNTVVTz/9tJYvX66vvvpKs2fPVmFhofs007x58zRp0iR3/0svvVRr1qxRVlaWdu3apffff1+33HKLzjzzTPXo0cNfPwMAALQgfjstJUnjxo1TWVmZFi5cKJfLpejoaOXl5SkiIkKS5HK5PJ55M3nyZO3fv19PPvmkbr31VnXu3Fnnn3++HnroIX/9BAAA0MLYjL/Y+ZyKigqFhISovLxcwcHBPt8/d0v5D3dLAYB1mfnvt9/vlgIAAPAlwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUvz7nBmhNuM3ff7jNH4AZrNwAAABLIdwAAABLIdwAAABLIdwAAABLIdwAAABLIdwAAABLIdwAAABLIdwAAABLIdwAAABLIdwAAABLIdwAAABLIdwAAABLIdwAAABL4a3gAP7SeNu7//C2dzQXVm4AAIClEG4AAIClEG4AAIClEG4AAIClEG4AAIClEG4AAIClEG4AAIClEG4AAIClEG4AAIClEG4AAIClEG4AAIClEG4AAICl+P3FmZmZmXrkkUfkcrk0aNAgZWRkaOTIkfX2nTx5sv7973/XaR84cKC+/PLL5i4VANCK8FJU//H3S1H9unKTk5OjlJQUzZ8/XwUFBRo5cqQSEhJUWFhYb//FixfL5XK5t6KiInXt2lVXX331ca4cAAC0VH4NN+np6UpMTFRSUpKioqKUkZEhh8OhrKysevuHhITopJNOcm9bt27VL7/8oilTphznygEAQEvlt3BTVVWl/Px8xcfHe7THx8dr8+bNXu1j2bJluvDCCxUREdFgn8rKSlVUVHhsAADAuvwWbkpLS1VTU6Pw8HCP9vDwcBUXFx9zvMvl0htvvKGkpKSj9ktLS1NISIh7czgcTaobAAC0bH6/W8pms3l8NgyjTlt9VqxYoc6dO2vs2LFH7Tdv3jyVl5e7t6KioqaUCwAAWji/3S0VFhamgICAOqs0JSUldVZz/swwDC1fvlwTJ05UYGDgUfva7XbZ7fYm1wsAAFoHv63cBAYGKiYmRk6n06Pd6XRq2LBhRx27fv16ffvtt0pMTGzOEgEAQCvk1+fcpKamauLEiYqNjVVcXJyys7NVWFioadOmSTp8SmnPnj1auXKlx7hly5bprLPOUnR0tD/KBgAALZhfw824ceNUVlamhQsXyuVyKTo6Wnl5ee67n1wuV51n3pSXlys3N1eLFy/2R8kAAKCF8/sTipOTk5WcnFzvdytWrKjTFhISokOHDjVzVQAAoLXy+91SAAAAvkS4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAltLW7ICamhqtWLFCb7/9tkpKSlRbW+vx/TvvvOOz4gAAAMwyHW5mzZqlFStW6JJLLlF0dLRsNltz1AUAANAopsPN6tWr9cILL2jMmDHNUQ8AAECTmL7mJjAwUP369WuOWgAAAJrMdLi59dZbtXjxYhmG0Rz1AAAANInp01KbNm3Su+++qzfeeEODBg1Su3btPL5fs2aNz4oDAAAwy/TKTefOnXXFFVdo1KhRCgsLU0hIiMdmVmZmpiIjIxUUFKSYmBht3LjxqP0rKys1f/58RUREyG63q2/fvlq+fLnp4wIAAGsyvXLzzDPP+OzgOTk5SklJUWZmpoYPH66lS5cqISFB2
7dvV+/evesdc80112jv3r1atmyZ+vXrp5KSElVXV/usJgAA0LqZDjdH/Pzzz/rmm29ks9l0yimn6MQTTzS9j/T0dCUmJiopKUmSlJGRoXXr1ikrK0tpaWl1+q9du1br16/Xrl271LVrV0lSnz59GvsTAACABZk+LXXw4EFNnTpV3bt31znnnKORI0eqR48eSkxM1KFDh7zeT1VVlfLz8xUfH+/RHh8fr82bN9c75pVXXlFsbKwefvhh9ezZU6eccopuu+02/fbbbw0ep7KyUhUVFR4bAACwLtPhJjU1VevXr9err76qX3/9Vb/++qtefvllrV+/XrfeeqvX+yktLVVNTY3Cw8M92sPDw1VcXFzvmF27dmnTpk364osv9NJLLykjI0Mvvviibr755gaPk5aW5nFNkMPh8LpGAADQ+pgON7m5uVq2bJkSEhIUHBys4OBgjRkzRk899ZRefPFF0wX8+QnHhmE0+NTj2tpa2Ww2PffcczrzzDM1ZswYpaena8WKFQ2u3sybN0/l5eXuraioyHSNAACg9TB9zc2hQ4fqrLZIUrdu3UydlgoLC1NAQECdVZqSkpJ69y9J3bt3V8+ePT3uyoqKipJhGPrxxx/Vv3//OmPsdrvsdrvXdQEAgNbN9MpNXFyc7rnnHv3+++/utt9++00LFixQXFyc1/sJDAxUTEyMnE6nR7vT6dSwYcPqHTN8+HD99NNPOnDggLttx44datOmjXr16mXylwAAACsyvXKzePFiXXzxxerVq5eGDBkim82mbdu2KSgoSOvWrTO1r9TUVE2cOFGxsbGKi4tTdna2CgsLNW3aNEmHTynt2bNHK1eulCSNHz9e9913n6ZMmaIFCxaotLRUt99+u6ZOnar27dub/SkAAMCCTIeb6Oho7dy5U88++6y+/vprGYaha6+9Vtdff73pgDFu3DiVlZVp4cKFcrlcio6OVl5eniIiIiRJLpdLhYWF7v4dO3aU0+nUzJkzFRsbq9DQUF1zzTW6//77zf4MAABgUY16zk379u114403+qSA5ORkJScn1/vdihUr6rSdeuqpdU5lAQAAHOFVuHnllVeUkJCgdu3a6ZVXXjlq38suu8wnhQEAADSGV+Fm7NixKi4uVrdu3TR27NgG+9lsNtXU1PiqNgAAANO8Cje1tbX1/hkAAKClMX0reH1+/fVXX+wGAACgyUyHm4ceekg5OTnuz1dffbW6du2qnj176tNPP/VpcQAAAGaZDjdLly51v5/J6XTqrbfe0tq1a5WQkKDbb7/d5wUCAACYYfpWcJfL5Q43r732mq655hrFx8erT58+Ouuss3xeIAAAgBmmV266dOnifvnk2rVrdeGFF0o6/MJL7pQCAAD+Znrl5sorr9T48ePVv39/lZWVKSEhQZK0bds29evXz+cFAgAAmGE63CxatEh9+vRRUVGRHn74YXXs2FHS4dNVDT1pGAAA4HgxHW7atWun2267rU57SkqKL+oBAABoEl6/AAAALIXXLwAAAEvh9QsAAMBSfPL6BQAAgJbCdLi55ZZb9Pjjj9dpf/LJJ7moGAAA+J3pcJObm6vhw4fXaR82bJhefPFFnxQFAADQWKbDTVlZmUJCQuq0BwcHq7S01CdFAQAANJbpcNOvXz+tXbu2Tvsbb7yhk08+2SdFAQAANJbph/ilpqZqxowZ+vnnn3X++edLkt5++2099thjysjI8HV9AAAAppgON1OnTlVlZaX++c9/6r777pMk9enTR1lZWZo0aZLPCwQAADDDdLiRpOnTp2v69On6+eef1b59e/f7pQAAAPytUc+5qa6u1ltvvaU1a9bIMAxJ0k8//aQDBw74tDgAAACzTK/c/PDDD7r44otVWFioyspKjR49Wp06ddLDDz+s33//XUuWLGmOOgEAALxieuVm1qxZio2N1S+//KL27du726+44gq9/fbbPi0OAADALNMrN5s2bdL777+vwMBAj/aIiAjt2bPHZ4UBAAA0h
umVm9ra2nrf/P3jjz+qU6dOPikKAACgsUyHm9GjR3s8z8Zms+nAgQO65557NGbMGF/WBgAAYJrp01Lp6ek6//zzNXDgQP3+++8aP368du7cqbCwMK1atao5agQAAPCa6XDTs2dPbdu2TatXr1Z+fr5qa2uVmJio66+/3uMCYwAAAH8wFW7++OMPDRgwQK+99pqmTJmiKVOmNFddAAAAjWLqmpt27dqpsrJSNputueoBAABoEtMXFM+cOVMPPfSQqqurm6MeAACAJjEdbj788EOtWbNGvXv31kUXXaQrr7zSYzMrMzNTkZGRCgoKUkxMjDZu3Nhg3/fee082m63O9vXXX5s+LgAAsCbTFxR37txZV111lU8OnpOTo5SUFGVmZmr48OFaunSpEhIStH37dvXu3bvBcd98842Cg4Pdn0888USf1AMAAFo/0+HmmWee8dnB09PTlZiYqKSkJElSRkaG1q1bp6ysLKWlpTU4rlu3burcubPP6gAAANbRqLeCS1JJSYk2btyoTZs2qaSkxPT4qqoq5efnKz4+3qM9Pj5emzdvPurYoUOHqnv37rrgggv07rvvHrVvZWWlKioqPDYAAGBdpsNNRUWFJk6cqJ49e2rUqFE655xz1LNnT02YMEHl5eVe76e0tFQ1NTUKDw/3aA8PD1dxcXG9Y7p3767s7Gzl5uZqzZo1GjBggC644AJt2LChweOkpaUpJCTEvTkcDq9rBAAArY/pcJOUlKQPP/xQr732mn799VeVl5frtdde09atW3XjjTeaLuDPt5UbhtHgreYDBgzQjTfeqNNPP11xcXHKzMzUJZdcokcffbTB/c+bN0/l5eXuraioyHSNAACg9TB9zc3rr7+udevWacSIEe62iy66SE899ZQuvvhir/cTFhamgICAOqs0JSUldVZzjubss8/Ws88+2+D3drtddrvd6/0BAIDWzfTKTWhoqEJCQuq0h4SEqEuXLl7vJzAwUDExMXI6nR7tTqdTw4YN83o/BQUF6t69u9f9AQCAtZleubnrrruUmpqqlStXukNFcXGxbr/9dt19992m9pWamqqJEycqNjZWcXFxys7OVmFhoaZNmybp8CmlPXv2aOXKlZIO303Vp08fDRo0SFVVVXr22WeVm5ur3Nxcsz8DAABYlOlwk5WVpW+//VYRERHuZ9EUFhbKbrfr559/1tKlS919P/nkk6Pua9y4cSorK9PChQvlcrkUHR2tvLw8RURESJJcLpcKCwvd/auqqnTbbbdpz549at++vQYNGqTXX39dY8aMMfszAACARZkON2PHjvVpAcnJyUpOTq73uxUrVnh8njNnjubMmePT4wMAAGsxHW7uueee5qgDAADAJxr9ED8AAICWiHADAAAshXADAAAshXADAAAshXADAAAsxfTdUoZh6MUXX9S7776rkpIS1dbWeny/Zs0anxUHAABglulwM2vWLGVnZ+u8885TeHh4gy+5BAAA8AfT4ebZZ5/VmjVreCowAABokUxfcxMSEqKTTz65OWoBAABoMtPh5t5779WCBQv022+/NUc9AAAATWL6tNTVV1+tVatWqVu3burTp4/atWvn8f2xXpYJAADQnEyHm8mTJys/P18TJkzggmIAANDimA43r7/+utatW6cRI0Y0Rz0AAABNYvqaG4fDoeDg4OaoBQAAoMlMh5vHHntMc+bM0ffff98M5QAAADSN6dNSEyZM0KFDh9S3b1+dcMIJdS4o3rdvn8+KAwAAMMt0uMnIyGiGMgAAAHzDdLi54YYbmqMOAAAAnzAdbv7Xb7/9pj/++MOjjYuNAQCAP5m+oPjgwYOaMWOGunXrpo4dO6pLly4eGwAAgD+ZDjdz5szRO++8o8zMTNntdj399NNasGCBevTooZUrVzZHjQAAAF4zfVrq1Vdf1cqVK3Xuuedq6tSpGjlypPr166eIiAg999xzuv7665ujTgAAAK+YXrnZt2+fIiMjJR2+vubIrd8jRozQh
g0bfFsdAACASabDzcknn+x+gN/AgQP1wgsvSDq8otO5c2df1gYAAGCa6XAzZcoUffrpp5KkefPmua+9mT17tm6//XafFwgAAGCG6WtuZs+e7f7zeeedp6+//lpbt25V3759NWTIEJ8WBwAAYJaplZs//vhD5513nnbs2OFu6927t6688kqCDQAAaBFMhZt27drpiy++kM1ma656AAAAmsT0NTeTJk3SsmXLmqMWAACAJjN9zU1VVZWefvppOZ1OxcbGqkOHDh7fp6en+6w4AAAAs0yHmy+++EKnn366JHlceyOJ01UAAMDvTIebd999tznqAAAA8AnT19z4WmZmpiIjIxUUFKSYmBht3LjRq3Hvv/++2rZtq7/97W/NWyAAAGhVTK/cXHHFFfWefrLZbAoKClK/fv00fvx4DRgw4Jj7ysnJUUpKijIzMzV8+HAtXbpUCQkJ2r59u3r37t3guPLyck2aNEkXXHCB9u7da/YnAAAACzO9chMSEqJ33nlHn3zyiTvkFBQU6J133lF1dbVycnI0ZMgQvf/++8fcV3p6uhITE5WUlKSoqChlZGTI4XAoKyvrqONuuukmjR8/XnFxcWbLBwAAFmc63Jx00kkaP368du3apdzcXK1Zs0bfffedJkyYoL59++qrr77SDTfcoDvuuOOo+6mqqlJ+fr7i4+M92uPj47V58+YGxz3zzDP67rvvdM8993hVb2VlpSoqKjw2AABgXabDzbJly5SSkqI2bf7/0DZt2mjmzJnKzs6WzWbTjBkz9MUXXxx1P6WlpaqpqVF4eLhHe3h4uIqLi+sds3PnTs2dO1fPPfec2rb17oxaWlqaQkJC3JvD4fBqHAAAaJ1Mh5vq6mp9/fXXddq//vpr1dTUSJKCgoK8vi38z/0Mw6h3bE1NjcaPH68FCxbolFNO8breefPmqby83L0VFRV5PRYAALQ+pi8onjhxohITE3XnnXfqjDPOkM1m00cffaQHHnhAkyZNkiStX79egwYNOup+wsLCFBAQUGeVpqSkpM5qjiTt379fW7duVUFBgWbMmCFJqq2tlWEYatu2rd58802df/75dcbZ7XbZ7XazPxMAALRSpsPNokWLFB4erocffth9p1J4eLhmz57tvs4mPj5eF1988VH3ExgYqJiYGDmdTl1xxRXudqfTqcsvv7xO/+DgYH3++ecebZmZmXrnnXf04osvKjIy0uxPAQAAFmQ63AQEBGj+/PmaP3++++Lc4OBgjz5Hu437f6WmpmrixImKjY1VXFycsrOzVVhYqGnTpkk6fEppz549Wrlypdq0aaPo6GiP8d26dVNQUFCddgAA8NdlOtz8r8zMTHcQaYxx48aprKxMCxculMvlUnR0tPLy8hQRESFJcrlcKiwsbEqJAADgL6ZJTyh+4IEHtG/fviYVkJycrO+//16VlZXKz8/XOeec4/5uxYoVeu+99xoce++992rbtm1NOj4AALCWJoUbwzB8VQcAAIBP+P3dUgAAAL7UpGtutm/frh49eviqFgAAgCZrUrjhab8AAKCl8TrcREZGHvOpwzabTd99912TiwIAAGgsr8NNSkpKg999//33Wrp0qSorK31REwAAQKN5HW5mzZpVp23fvn267777lJWVpbPOOksPPfSQT4sDAAAwq1HX3Pz2229KT0/XI488oj59+mjNmjUaM2aMr2sDAAAwzVS4qamp0VNPPaUFCxYoKChITzzxhCZMmOD1G8ABAACam9fh5oUXXtBdd92l8vJy3XnnnZo+fboCAwObszYAAADTvA431157rdq3b6/rrrtOP/zwg+bOnVtvv/T0dJ8VBwAAYJbX4eacc8455q3enJ4CAAD+5nW4OdoLLAEAAFoK3i0FAAAshXADAAAshXADAAAshXADAAAshXADAAAsxau7pT777DOvdzh48OBGFwMAANBUXoWbv/3tb7LZbDIM45jPsqmpqfFJYQAAAI3h1Wmp3bt3a9euXdq9e7dyc
3MVGRmpzMxMFRQUqKCgQJmZmerbt69yc3Obu14AAICj8mrlJiIiwv3nq6++Wo8//rjHW8AHDx4sh8Ohu+++W2PHjvV5kQAAAN4yfUHx559/rsjIyDrtkZGR2r59u0+KAgAAaCzT4SYqKkr333+/fv/9d3dbZWWl7r//fkVFRfm0OAAAALO8frfUEUuWLNGll14qh8OhIUOGSJI+/fRT2Ww2vfbaaz4vEAAAwAzT4ebMM8/U7t279eyzz+rrr7+WYRgaN26cxo8frw4dOjRHjQAAAF4zHW4k6YQTTtDf//53X9cCAADQZI16QvH//d//acSIEerRo4d++OEHSdKiRYv08ssv+7Q4AAAAs0yHm6ysLKWmpiohIUG//PKL+6F9Xbp0UUZGhq/rAwAAMMV0uHniiSf01FNPaf78+Wrb9v+f1YqNjdXnn3/u0+IAAADMMh1udu/eraFDh9Zpt9vtOnjwoE+KAgAAaCzT4SYyMlLbtm2r0/7GG29o4MCBvqgJAACg0UzfLXX77bfr5ptv1u+//y7DMPTRRx9p1apVSktL09NPP90cNQIAAHjN9MrNlClTdM8992jOnDk6dOiQxo8fryVLlmjx4sW69tprTReQmZmpyMhIBQUFKSYmRhs3bmyw76ZNmzR8+HCFhoaqffv2OvXUU7Vo0SLTxwQAANbVqOfc3HjjjbrxxhtVWlqq2tpadevWrVEHz8nJUUpKijIzMzV8+HAtXbpUCQkJ2r59u3r37l2nf4cOHTRjxgwNHjxYHTp00KZNm3TTTTepQ4cOPHcHAABIauRzbo4ICwtrdLCRpPT0dCUmJiopKUlRUVHKyMiQw+FQVlZWvf2HDh2q6667ToMGDVKfPn00YcIEXXTRRUdd7QEAAH8tXq3cnH766Xr77bfVpUsXDR06VDabrcG+n3zyiVcHrqqqUn5+vubOnevRHh8fr82bN3u1j4KCAm3evFn3339/g30qKytVWVnp/lxRUeHVvgEAQOvkVbi5/PLLZbfbJUljx471yYFLS0tVU1Oj8PBwj/bw8HAVFxcfdWyvXr30888/q7q6Wvfee6+SkpIa7JuWlqYFCxb4pGYAANDyeRVuunTpojZtDp/BmjJlinr16uX+3FR/XgUyDOOoK0OStHHjRh04cEBbtmzR3Llz1a9fP1133XX19p03b55SU1PdnysqKuRwOJpeOAAAaJG8Cjepqam69tprFRQUpMjISLlcriZdayMdvl4nICCgzipNSUlJndWcP4uMjJQknXbaadq7d6/uvffeBsON3W53rzoBAADr82r5pUePHsrNzdUPP/wgwzD0448/qrCwsN7NW4GBgYqJiZHT6fRodzqdGjZsmNf7MQzD45oaAADw1+bVys1dd92lmTNnasaMGbLZbDrjjDPq9DlyOunIizS9kZqaqokTJyo2NlZxcXHKzs5WYWGhpk2bJunwKaU9e/Zo5cqVkqR//etf6t27t0499VRJh5978+ijj2rmzJleHxMAAFibV+Hm73//u6677jr98MMPGjx4sN566y2FhoY2+eDjxo1TWVmZFi5cKJfLpejoaOXl5SkiIkKS5HK5PFaDamtrNW/ePO3evVtt27ZV37599eCDD+qmm25qci0AAMAavH6IX6dOnRQdHa1nnnlGw4cP99l1LMnJyUpOTq73uxUrVnh8njlzJqs0AADgqEw/ofiGG25ojjoAAAB8wqtw07VrV+3YsUNhYWHq0qXLUW/V3rdvn8+KAwAAMMurcLNo0SJ16tTJ/edjPYcGAADAX7wKN/97Kmry5MnNVQsAAECTeRVuzLyPKTg4uNHFAAAANJVX4aZz585en4oy85wbAAAAX/Mq3Lz77rvuP3///feaO3euJk+erLi4OEnSBx98oH//+99KS0trnioBAAC85FW4GTVqlPvPCxcuVHp6use7nC677DKddtppys7O5lZxAADgV6Zf7f3BBx8oNja2TntsbKw++ugjnxQFAADQWKbDj
cPh0JIlS+q0L126VA6HwydFAQAANJbpJxQvWrRIV111ldatW6ezzz5bkrRlyxZ99913ys3N9XmBAAAAZpheuRkzZox27typyy67TPv27VNZWZkuv/xy7dixQ2PGjGmOGgEAALxmeuVGknr16qUHHnjA17UAAAA0WaPCza+//qply5bpq6++ks1m08CBAzV16lSFhIT4uj4AAABTTJ+W2rp1q/r27atFixZp3759Ki0tVXp6uvr27atPPvmkOWoEAADwmumVm9mzZ+uyyy7TU089pbZtDw+vrq5WUlKSUlJStGHDBp8XCQAA4C3T4Wbr1q0ewUaS2rZtqzlz5tT7/BsAAIDjyfRpqeDgYBUWFtZpLyoqUqdOnXxSFAAAQGOZDjfjxo1TYmKicnJyVFRUpB9//FGrV69WUlKSxysZAAAA/MH0aalHH31UNptNkyZNUnV1tSSpXbt2mj59uh588EGfFwgAAGCG6XATGBioxYsXKy0tTd99950Mw1C/fv10wgknNEd9AAAApjTqOTeSdMIJJ+i0007zZS0AAABN5nW4mTp1qlf9li9f3uhiAAAAmsrrcLNixQpFRERo6NChMgyjOWsCAABoNK/DzbRp07R69Wrt2rVLU6dO1YQJE9S1a9fmrA0AAMA0r28Fz8zMlMvl0h133KFXX31VDodD11xzjdatW8dKDgAAaDFMPefGbrfruuuuk9Pp1Pbt2zVo0CAlJycrIiJCBw4caK4aAQAAvGb6IX5H2Gw22Ww2GYah2tpaX9YEAADQaKbCTWVlpVatWqXRo0drwIAB+vzzz/Xkk0+qsLBQHTt2bK4aAQAAvOb1BcXJyclavXq1evfurSlTpmj16tUKDQ1tztoAAABM8zrcLFmyRL1791ZkZKTWr1+v9evX19tvzZo1PisOAADALK/DzaRJk2Sz2ZqzFgAAgCYz9RA/AACAlq7Rd0v5SmZmpiIjIxUUFKSYmBht3Lixwb5r1qzR6NGjdeKJJyo4OFhxcXFat27dcawWAAC0dH4NNzk5OUpJSdH8+fNVUFCgkSNHKiEhQYWFhfX237Bhg0aPHq28vDzl5+frvPPO06WXXqqCgoLjXDkAAGip/Bpu0tPTlZiYqKSkJEVFRSkjI0MOh0NZWVn19s/IyNCcOXN0xhlnqH///nrggQfUv39/vfrqq8e5cgAA0FL5LdxUVVUpPz9f8fHxHu3x8fHavHmzV/uora3V/v37j/qOq8rKSlVUVHhsAADAuvwWbkpLS1VTU6Pw8HCP9vDwcBUXF3u1j8cee0wHDx7UNddc02CftLQ0hYSEuDeHw9GkugEAQMvm9wuK/3x7uWEYXt1yvmrVKt17773KyclRt27dGuw3b948lZeXu7eioqIm1wwAAFour28F97WwsDAFBATUWaUpKSmps5rzZzk5OUpMTNR//vMfXXjhhUfta7fbZbfbm1wvAABoHfy2chMYGKiYmBg5nU6PdqfTqWHDhjU4btWqVZo8ebKef/55XXLJJc1dJgAAaGX8tnIjSampqZo4caJiY2MVFxen7OxsFRYWatq0aZIOn1Las2ePVq5cKelwsJk0aZIWL16ss88+273q0759e4WEhPjtdwAAgJbDr+Fm3LhxKisr08KFC+VyuRQdHa28vDxFRERIklwul8czb5YuXarq6mrdfPPNuvnmm93tN9xwA09QBgAAkvwcbqTDbxtPTk6u97s/B5b33nuv+QsCAACtmt/vlgIAAPAlwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AALAUwg0AA
LAUv4ebzMxMRUZGKigoSDExMdq4cWODfV0ul8aPH68BAwaoTZs2SklJOX6FAgCAVsGv4SYnJ0cpKSmaP3++CgoKNHLkSCUkJKiwsLDe/pWVlTrxxBM1f/58DRky5DhXCwAAWgO/hpv09HQlJiYqKSlJUVFRysjIkMPhUFZWVr39+/Tpo8WLF2vSpEkKCQk5ztUCAIDWwG/hpqqqSvn5+YqPj/doj4+P1+bNm312nMrKSlVUVHhsAADAuvwWbkpLS1VTU6Pw8HCP9vDwcBUXF/vsOGlpaQoJCXFvDofDZ/sGAAAtj98vKLbZbB6fDcOo09YU8+bNU3l5uXsrKiry2b4BAEDL09ZfBw4LC1NAQECdVZqSkpI6qzlNYbfbZbfbfbY/AADQsvlt5SYwMFAxMTFyOp0e7U6nU8OGDfNTVQAAoLXz28qNJKWmpmrixImKjY1VXFycsrOzVVhYqGnTpkk6fEppz549WrlypXvMtm3bJEkHDhzQzz//rG3btikwMFADBw70x08AAAAtjF/Dzbhx41RWVqaFCxfK5XIpOjpaeXl5ioiIkHT4oX1/fubN0KFD3X/Oz8/X888/r4iICH3//ffHs3QAANBC+TXcSFJycrKSk5Pr/W7FihV12gzDaOaKAABAa+b3u6UAAAB8iXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAshXADAAAsxe/hJjMzU5GRkQoKClJMTIw2btx41P7r169XTEyMgoKCdPLJJ2vJkiXHqVIAANAa+DXc5OTkKCUlRfPnz1dBQYFGjhyphIQEFRYW1tt/9+7dGjNmjEaOHKmCggLdeeeduuWWW5Sbm3ucKwcAAC2VX8NNenq6EhMTlZSUpKioKGVkZMjhcCgrK6ve/kuWLFHv3r2VkZGhqKgoJSUlaerUqXr00UePc+UAAKClauuvA1dVVSk/P19z5871aI+Pj9fmzZvrHfPBBx8oPj7eo+2iiy7SsmXL9Mcff6hdu3Z1xlRWVqqystL9uby8XJJUUVHR1J9Qr9rKQ82yXxxbc83pEcyt/zTn3DKv/sO/Wetqjrk9sk/DMI7Z12/hprS0VDU1NQoPD/doDw8PV3Fxcb1jiouL6+1fXV2t0tJSde/evc6YtLQ0LViwoE67w+FoQvVoiUIy/F0Bmgtza03Mq3U159zu379fISEhR+3jt3BzhM1m8/hsGEadtmP1r6/9iHnz5ik1NdX9uba2Vvv27VNoaOhRj/NXU1FRIYfDoaKiIgUHB/u7HPgQc2tdzK01Ma/1MwxD+/fvV48ePY7Z12/hJiwsTAEBAXVWaUpKSuqszhxx0kkn1du/bdu2Cg0NrXeM3W6X3W73aOvcuXPjC7e44OBg/jFZFHNrXcytNTGvdR1rxeYIv11QHBgYqJiYGDmdTo92p9OpYcOG1TsmLi6uTv8333xTsbGx9V5vAwAA/nr8erdUamqqnn76aS1fvlxfffWVZs+ercLCQk2bNk3S4VNKkyZNcvefNm2afvjhB6Wmpuqrr77S8uXLtWzZMt12223++gkAAKCF8es1N+PGjVNZWZkWLlwol8ul6Oho5eXlKSIiQpLkcrk8nnkTGRmpvLw8zZ49W//617/Uo0cPPf7447rqqqv89RMsw26365577qlzCg+tH3NrXcytNTGvTWczvLmnCgAAoJXw++sXAAAAfIlwAwAALIVwAwAALIVwAwAALIVw08ps2LBBl156qXr06CGbzab//ve//i4JPpCWlqYzzjhDnTp1Urdu3TR27Fh98803/i4LPpCVl
aXBgwe7H8gWFxenN954w99lwcfS0tJks9mUkpLi71Igwk2rc/DgQQ0ZMkRPPvlksx7njz/+aNb9w9P69et18803a8uWLXI6naqurlZ8fLwOHjzo82Mxt8dXr1699OCDD2rr1q3aunWrzj//fF1++eX68ssvfXoc5tV/Pv74Y2VnZ2vw4MHNsn/mthEMtFqSjJdeeumY/b766itj+PDhht1uN6Kiogyn0+kxdvfu3YYkIycnxxg1apRht9uN5cuXG6Wlpca1115r9OzZ02jfvr0RHR1tPP/88x77HjVqlDFjxgxj1qxZRufOnY1u3boZS5cuNQ4cOGBMnjzZ6Nixo3HyyScbeXl5zfA3YF0lJSWGJGP9+vVH7cfctk5dunQxnn766Qa/Z15bj/379xv9+/c3nE6nMWrUKGPWrFlH7c/cHh+Em1bMm3BTU1NjDBgwwBg9erSxbds2Y+PGjcaZZ55Z7z+mPn36GLm5ucauXbuMPXv2GD/++KPxyCOPGAUFBcZ3331nPP7440ZAQICxZcsW9/5HjRpldOrUybjvvvuMHTt2GPfdd5/Rpk0bIyEhwcjOzjZ27NhhTJ8+3QgNDTUOHjzYjH8b1rJz505DkvH555832Ie5bX2qq6uNVatWGYGBgcaXX35Zbx/mtXWZNGmSkZKSYhiGccxww9weP4SbVsybcPPGG28Ybdu2NVwul7utof9TyMjIOOYxx4wZY9x6663uz6NGjTJGjBjh/lxdXW106NDBmDhxorvN5XIZkowPPvjAy1/211ZbW2tceumlHn+v9WFuW4/PPvvM6NChgxEQEGCEhIQYr7/+eoN9mdfWY9WqVUZ0dLTx22+/GYZx7HDD3B4/XHNjIQ888IA6duzo3goLC/XNN9/I4XDopJNOcvc788wz6x0fGxvr8bmmpkb//Oc/NXjwYIWGhqpjx4568803PV6JIcnjPHNAQIBCQ0N12mmnuduOvOW9pKSkyb/xr2DGjBn67LPPtGrVKncbc9u6DRgwQNu2bdOWLVs0ffp03XDDDdq+fTvz2ooVFRVp1qxZevbZZxUUFFTne+bWv/z6bin41rRp03TNNde4P/fo0UOGYchms3k1vkOHDh6fH3vsMS1atEgZGRk67bTT1KFDB6WkpKiqqsqj35/fyG6z2Tzajhy/trbW1O/5K5o5c6ZeeeUVbdiwQb169XK3M7etW2BgoPr16yfp8H+0Pv74Yy1evFhpaWnMayuVn5+vkpISxcTEuNtqamq0YcMGPfnkk9q7dy9z60eEGwvp2rWrunbt6tF26qmnqrCwUHv37nUn9o8//tir/W3cuFGXX365JkyYIOnwP4adO3cqKirKt4VDhmFo5syZeumll/Tee+8pMjLS43vm1loMw1BlZSXz2opdcMEF+vzzzz3apkyZolNPPVV33HGHQkNDFRoa6vE9c3v8EG5amQMHDujbb791f969e7e2bdumrl27qnfv3nX6jx49Wn379tUNN9yghx9+WPv379f8+fMl6Zj/B9GvXz/l5uZq8+bN6tKli9LT01VcXMw/pmZw88036/nnn9fLL7+sTp06qbi4WJIUEhKi9u3b1zuGuW0d7rzzTiUkJMjhcGj//v1avXq13nvvPa1du7be/sxr69CpUydFR0d7tHXo0EGhoaF12o9gbo8frrlpZbZu3aqhQ4dq6NChkqTU1FQNHTpU//jHP+rtHxAQoP/+9786cOCAzjjjDCUlJemuu+6SpHrPE/+vu+++W6effrouuuginXvuuTrppJM0duxYn/4eHJaVlaXy8nKde+656t69u3vLyclpcAxz2zrs3btXEydO1IABA3TBBRfoww8/1Nq1azV69Oh6+zOv1sXcHj82wzAMfxeB4+v999/XiBEj9O2336pv377+Lgc+xNxaE/NqXcxt8yDc/AW89NJL6tixo/r3769vv/1Ws2bNUpcuXbRp0yZ/l4YmYm6tiXm1Lub2+OCam7+A/fv3a86cOSoqK
lJYWJguvPBCPfbYY/4uCz7A3FoT82pdzO3xwcoNAACwFC4oBgAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlvL/AEGIHigLKKERAAAAAElFTkSuQmCC", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Mocked dataset showing the precision multiplied by the BP for different n-grams\n", + "data = {\"1-gram\": 0.8, \"2-gram\": 0.77, \"3-gram\": 0.74, \"4-gram\": 0.71}\n", + "\n", + "# Plot the datapoints defined above\n", + "fig, ax = plt.subplots(1)\n", + "bars = ax.bar(*zip(*data.items()))\n", + "ax.set(ylabel=\"Modified N-gram precision\")\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 3. Example Calculations of the BLEU score\n", + "\n", + "In this example you will have a reference sentence and 2 candidate sentences. You will tokenize all sentences using the NLTK package. Then you will compare the two candidates to the reference using BLEU score.\n", + "\n", + "First you define and tokenize the sentences." + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "The NASA Opportunity rover is battling a massive dust storm on planet Mars. -> ['the', 'nasa', 'opportunity', 'rover', 'is', 'battling', 'a', 'massive', 'dust', 'storm', 'on', 'planet', 'mars', '.']\n", + "\n", + "\n", + "The Opportunity rover is combating a big sandstorm on planet Mars. -> ['the', 'opportunity', 'rover', 'is', 'combating', 'a', 'big', 'sandstorm', 'on', 'planet', 'mars', '.']\n", + "\n", + "\n", + "A NASA rover is fighting a massive storm on planet Mars. 
-> ['a', 'nasa', 'rover', 'is', 'fighting', 'a', 'massive', 'storm', 'on', 'planet', 'mars', '.']\n" + ] + } + ], + "source": [ + "reference = \"The NASA Opportunity rover is battling a massive dust storm on planet Mars.\"\n", + "candidate_1 = \"The Opportunity rover is combating a big sandstorm on planet Mars.\"\n", + "candidate_2 = \"A NASA rover is fighting a massive storm on planet Mars.\"\n", + "\n", + "tokenized_ref = nltk.word_tokenize(reference.lower())\n", + "tokenized_cand_1 = nltk.word_tokenize(candidate_1.lower())\n", + "tokenized_cand_2 = nltk.word_tokenize(candidate_2.lower())\n", + "\n", + "print(f\"{reference} -> {tokenized_ref}\")\n", + "print(\"\\n\")\n", + "print(f\"{candidate_1} -> {tokenized_cand_1}\")\n", + "print(\"\\n\")\n", + "print(f\"{candidate_2} -> {tokenized_cand_2}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 3.1 Define the functions to calculate the BLEU score\n", + "\n", + "### Computing the Brevity Penalty\n", + "You will start by defining the function for the brevity penalty according to equation (2) in section 2.1." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "def brevity_penalty(candidate, reference):\n", + " \"\"\"\n", + " Calculates the brevity penalty given the candidate and reference sentences.\n", + " \"\"\"\n", + " reference_length = len(reference)\n", + " candidate_length = len(candidate)\n", + "\n", + " # No penalty if the candidate is longer than the reference\n", + " if reference_length < candidate_length:\n", + " BP = 1\n", + " else:\n", + " penalty = 1 - (reference_length / candidate_length)\n", + " BP = np.exp(penalty)\n", + "\n", + " return BP" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Computing the Clipped Precision\n", + "Next, you need to define a function to calculate the geometrically averaged clipped precision. This function calculates how many of the n-grams in the candidate sentence actually appear in the reference sentence. 
The clipping takes care of overcounting. For example, if a certain n-gram appears five times in the candidate sentence, but only twice in the reference, the value is clipped to two." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [], + "source": [ + "def average_clipped_precision(candidate, reference):\n", + " \"\"\"\n", + " Calculates the precision given the candidate and reference sentences.\n", + " \"\"\"\n", + "\n", + " clipped_precision_score = []\n", + " \n", + " # Loop through values 1, 2, 3, 4. This is the length of n-grams\n", + " for n_gram_length in range(1, 5):\n", + " reference_n_gram_counts = Counter(ngrams(reference, n_gram_length)) \n", + " candidate_n_gram_counts = Counter(ngrams(candidate, n_gram_length)) \n", + "\n", + " total_candidate_ngrams = sum(candidate_n_gram_counts.values()) \n", + " \n", + " for ngram in candidate_n_gram_counts: \n", + " # Check if the n-gram appears in the reference\n", + " if ngram in reference_n_gram_counts:\n", + " # If the count of the candidate n-gram is bigger than the corresponding\n", + " # count in the reference, clip the candidate count \n", + " # to the reference count\n", + " \n", + " if candidate_n_gram_counts[ngram] > reference_n_gram_counts[ngram]: \n", + " candidate_n_gram_counts[ngram] = reference_n_gram_counts[ngram]\n", + " \n", + " else:\n", + " candidate_n_gram_counts[ngram] = 0 # otherwise set the candidate n-gram count to zero\n", + "\n", + " clipped_candidate_ngrams = sum(candidate_n_gram_counts.values())\n", + " \n", + " clipped_precision_score.append(clipped_candidate_ngrams / total_candidate_ngrams)\n", + " \n", + " # Calculate the geometric average: take the mean of elementwise log, then exponentiate\n", + " # This is equivalent to taking the n-th root of the product as shown in equation (1) above\n", + " s = np.exp(np.mean(np.log(clipped_precision_score)))\n", + " \n", + " return s\n" + ] + }, + { + "cell_type": 
"markdown", + "metadata": {}, + "source": [ + "### Computing the BLEU score\n", + "Finally, you can compute the BLEU score using the above two functions." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [], + "source": [ + "def bleu_score(candidate, reference):\n", + " BP = brevity_penalty(candidate, reference) \n", + " geometric_average_precision = average_clipped_precision(candidate, reference) \n", + " return BP * geometric_average_precision" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 3.2 Testing the functions\n", + "Now you can test the functions with the example reference and candidate sentences defined above." + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "BLEU score of reference versus candidate 1: 27.6\n", + "BLEU score of reference versus candidate 2: 35.3\n" + ] + } + ], + "source": [ + "result_candidate_1 = round(bleu_score(tokenized_cand_1, tokenized_ref) * 100, 1)\n", + "print(f\"BLEU score of reference versus candidate 1: {result_candidate_1}\")\n", + "result_candidate_2 = round(bleu_score(tokenized_cand_2, tokenized_ref) * 100, 1)\n", + "print(f\"BLEU score of reference versus candidate 2: {result_candidate_2}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 3.3 Comparing the Results from your Code with the SacreBLEU Library\n", + "Below you will do the same calculation, but using the `sacrebleu` library. Compare the results with your implementation above." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "BLEU score of reference versus candidate 1: 27.6\n", + "BLEU score of reference versus candidate 2: 35.3\n" + ] + } + ], + "source": [ + "result_candidate_1 = round(sacrebleu.sentence_bleu(candidate_1, [reference]).score, 1)\n", + "print(f\"BLEU score of reference versus candidate 1: {result_candidate_1}\")\n", + "result_candidate_2 = round(sacrebleu.sentence_bleu(candidate_2, [reference]).score, 1)\n", + "print(f\"BLEU score of reference versus candidate 2: {result_candidate_2}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 4. BLEU computation on a corpus\n", + "\n", + "## 4.1 Loading Datasets for Evaluation Using the BLEU Score\n", + "\n", + "In this section, you will use a simple pipeline for evaluating machine translated text. You will use English to German translations generated by [Google Translate](https://translate.google.com). There are three files you will need:\n", + "\n", + "1. A source text in English. In this lab, you will use the first 1671 words of the [wmt19](http://statmt.org/wmt19/translation-task.html) evaluation dataset downloaded via SacreBLEU.\n", + "2. A reference translation to German of the corresponding first 1671 words from the original English text. This is also provided by SacreBLEU.\n", + "3. A candidate machine translation to German from the same 1671 words. This is generated by Google Translate.\n", + "\n", + "With that, you can now compare the reference and candidate translation to get the BLEU Score." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [], + "source": [ + "# Load the raw data; the context managers close each file automatically\n", + "with open(\"data/wmt19_src.txt\", \"r\") as wmt19_src:\n", + "    wmt19_src_1 = wmt19_src.read()\n", + "\n", + "with open(\"data/wmt19_ref.txt\", \"r\") as wmt19_ref:\n", + "    wmt19_ref_1 = wmt19_ref.read()\n", + "\n", + "with open(\"data/wmt19_can.txt\", \"r\") as wmt19_can:\n", + "    wmt19_can_1 = wmt19_can.read()\n", + "\n", + "tokenized_corpus_src = nltk.word_tokenize(wmt19_src_1.lower())\n", + "tokenized_corpus_ref = nltk.word_tokenize(wmt19_ref_1.lower())\n", + "tokenized_corpus_cand = nltk.word_tokenize(wmt19_can_1.lower())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that you have your data loaded, you can inspect the first sentence of each dataset." + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "English source text:\n", + "\n", + "Welsh AMs worried about 'looking like muppets'\n", + "There is consternation among some AMs at a suggestion their title should change to MWPs (Member of the Welsh Parliament).\n", + " -> ['\\ufeffwelsh', 'ams', 'worried', 'about', \"'looking\", 'like', \"muppets'\", 'there', 'is', 'consternation', 'among', 'some', 'ams', 'at', 'a', 'suggestion', 'their', 'title', 'should', 'change', 'to', 'mwps', '(', 'member', 'of', 'the', 'welsh', 'parliament', ')', '.']\n", + "\n", + "\n", + "German reference translation:\n", + "\n", + "Walisische Ageordnete sorgen sich \"wie Dödel auszusehen\"\n", + "Es herrscht Bestürzung unter einigen Mitgliedern der Versammlung über einen Vorschlag, der ihren Titel zu MWPs (Mitglied der walisischen Parlament) ändern soll.\n", + " -> ['\\ufeffwalisische', 'ageordnete', 'sorgen', 'sich', '``', 'wie', 'dödel', 'auszusehen', \"''\", 'es', 'herrscht', 'bestürzung', 
'unter', 'einigen', 'mitgliedern', 'der', 'versammlung', 'über', 'einen', 'vorschlag', ',', 'der', 'ihren', 'titel', 'zu', 'mwps', '(', 'mitglied', 'der', 'walisischen', 'parlament', ')', 'ändern', 'soll', '.']\n", + "\n", + "\n", + "German machine translation:\n", + "\n", + "Walisische AMs machten sich Sorgen, dass sie wie Muppets aussehen könnten\n", + "Einige AMs sind bestürzt über den Vorschlag, ihren Titel in MWPs (Mitglied des walisischen Parlaments) zu ändern.\n", + "Es ist aufg -> ['walisische', 'ams', 'machten', 'sich', 'sorgen', ',', 'dass', 'sie', 'wie', 'muppets', 'aussehen', 'könnten', 'einige', 'ams', 'sind', 'bestürzt', 'über', 'den', 'vorschlag', ',', 'ihren', 'titel', 'in', 'mwps', '(', 'mitglied', 'des', 'walisischen', 'parlaments']\n" + ] + } + ], + "source": [ + "print(\"English source text:\\n\")\n", + "print(f\"{wmt19_src_1[0:170]} -> {tokenized_corpus_src[0:30]}\\n\\n\")\n", + "print(\"German reference translation:\\n\")\n", + "print(f\"{wmt19_ref_1[0:219]} -> {tokenized_corpus_ref[0:35]}\\n\\n\")\n", + "print(\"German machine translation:\\n\")\n", + "print(f\"{wmt19_can_1[0:199]} -> {tokenized_corpus_cand[0:29]}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And lastly, you can calculate the BLEU score of the translation." + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "BLEU score of the reference versus candidate translation: 43.2\n" + ] + } + ], + "source": [ + "result = round(sacrebleu.sentence_bleu(wmt19_can_1, [wmt19_ref_1]).score, 1)\n", + "print(f\"BLEU score of the reference versus candidate translation: {result}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 4.2 BLEU Score Interpretation on a Corpus\n", + "The table below (taken from [here](https://cloud.google.com/translate/automl/docs/evaluate)) shows the typical values of BLEU score. 
You can see that the translation above is of high quality according to this table and in comparison to the given reference sentence. (*if you see \"Hard to get the gist\", please open your workspace, delete `wmt19_can.txt` and get the latest version via the Lab Help button*)\n", + "\n", + "|Score | Interpretation |\n", + "|:---------:|:-------------------------------------------------------------:|\n", + "| < 10 | Almost useless |\n", + "| 10 - 19 | Hard to get the gist |\n", + "| 20 - 29 | The gist is clear, but has significant grammatical errors |\n", + "| 30 - 40 | Understandable to good translations |\n", + "| 40 - 50 | High quality translations |\n", + "| 50 - 60 | Very high quality, adequate, and fluent translations |\n", + "| > 60 | Quality often better than human |" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/.ipynb_checkpoints/C4W1_QKV_Attention-checkpoint.ipynb b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/.ipynb_checkpoints/C4W1_QKV_Attention-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..4059dbe6edcf3e7a1d9abc24a5170996cb8f9356 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/.ipynb_checkpoints/C4W1_QKV_Attention-checkpoint.ipynb @@ -0,0 +1,270 @@ +{ + "cells": [ + { + 
"cell_type": "markdown", + "id": "707052ae", + "metadata": {}, + "source": [ + "# Scaled Dot-Product Attention: Ungraded Lab\n", + "\n", + "The 2017 paper [Attention Is All You Need](https://arxiv.org/abs/1706.03762) introduced the Transformer model and scaled dot-product attention, sometimes also called QKV (**Q**ueries, **K**eys, **V**alues) attention. Since then, Transformers have come to dominate large-scale natural language applications. Scaled dot-product attention can be used to improve seq2seq models as well. In this ungraded lab, you'll implement a simplified version of scaled dot-product attention and replicate word alignment between English and French, as shown in [Bhadanau, et al. (2014)](https://arxiv.org/abs/1409.0473).\n", + "\n", + "The Transformer model learns how to align words in different languages. You won't be training any weights here, so instead you will use [pre-trained aligned word embeddings from here](https://fasttext.cc/docs/en/aligned-vectors.html). Run the cell below to load the embeddings and set up the rest of the notebook.\n", + "\n", + "This is a practice notebook, where you can train writing your code. All of the solutions are provided at the end of the notebook." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "aa4d9f30", + "metadata": {}, + "outputs": [], + "source": [ + "# Import the libraries\n", + "import pickle\n", + "import matplotlib.pyplot as plt\n", + "import numpy as np\n", + "\n", + "# Load the word2int dictionaries\n", + "with open(\"./data/word2int_en.pkl\", \"rb\") as f:\n", + " en_words = pickle.load(f)\n", + " \n", + "with open(\"./data/word2int_fr.pkl\", \"rb\") as f:\n", + " fr_words = pickle.load(f)\n", + "\n", + "# Load the word embeddings\n", + "en_embeddings = np.load(\"./data/embeddings_en.npz\")[\"embeddings\"]\n", + "fr_embeddings = np.load(\"./data/embeddings_fr.npz\")[\"embeddings\"]" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "a6914081", + "metadata": {}, + "outputs": [], + "source": [ + "# Define some helper functions\n", + "\n", + "def tokenize(sentence, token_mapping):\n", + " tokenized = []\n", + " \n", + " for word in sentence.lower().split(\" \"):\n", + " try:\n", + " tokenized.append(token_mapping[word])\n", + " except KeyError:\n", + " # Using -1 to indicate an unknown word\n", + " tokenized.append(-1)\n", + " \n", + " return tokenized\n", + "\n", + "\n", + "def embed(tokens, embeddings):\n", + " embed_size = embeddings.shape[1]\n", + " \n", + " output = np.zeros((len(tokens), embed_size))\n", + " for i, token in enumerate(tokens):\n", + " if token == -1:\n", + " output[i] = np.zeros((1, embed_size))\n", + " else:\n", + " output[i] = embeddings[token]\n", + " \n", + " return output" + ] + }, + { + "cell_type": "markdown", + "id": "6153d4b2", + "metadata": {}, + "source": [ + "Scaled dot-product attention consists of two matrix multiplications and a softmax scaling as shown in the diagram below from [Vaswani, et al. (2017)](https://arxiv.org/abs/1706.03762). 
It takes three input matrices, the queries, keys, and values.\n", + "\n", + "![scaled-dot product attention diagram](./images/attention.png)\n", + "\n", + "Mathematically, this is expressed as\n", + "\n", + "$$ \n", + "\\large \\mathrm{Attention}\\left(Q, K, V\\right) = \\mathrm{softmax}\\left(\\frac{QK^{\\top}}{\\sqrt{d_k}}\\right)V\n", + "$$\n", + "\n", + "where $Q$, $K$, and $V$ are the queries, keys, and values matrices respectively, and $d_k$ is the dimension of the keys. In practice, Q, K, and V all have the same dimensions. This form of attention is faster and more space-efficient than what you implemented before since it consists of only matrix multiplications instead of a learned feed-forward layer.\n", + "\n", + "Conceptually, the first matrix multiplication is a measure of the similarity between the queries and the keys. This is transformed into weights using the softmax function. These weights are then applied to the values with the second matrix multiplication resulting in output attention vectors. Typically, decoder states are used as the queries while encoder states are the keys and values.\n", + "\n", + "### Exercise 1\n", + "Implement the softmax function with Numpy and use it to calculate the weights from the queries and keys. Assume the queries and keys are 2D arrays (matrices). Note that since the dot-product of Q and K will be a matrix, you'll need to calculate softmax over a specific axis. See the end of the notebook for solutions." 
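Before writing your own version, it can help to see what the `axis` argument does on a toy matrix. The sketch below uses made-up numbers and adds a max-subtraction step for numerical stability (the reference solution at the end of the notebook uses a plain `np.exp`); it illustrates the axis semantics and is not the exercise solution:

```python
import numpy as np

# Toy matrix with made-up scores; two "queries" (rows), three "keys" (columns).
scores = np.array([[1.0, 2.0, 3.0],
                   [6.0, 4.0, 2.0]])

def softmax_demo(x, axis):
    # Subtracting the max along the axis keeps np.exp from overflowing
    # on large inputs and does not change the result.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    y = np.exp(shifted)
    return y / np.sum(y, axis=axis, keepdims=True)

row_wise = softmax_demo(scores, axis=1)  # each row sums to 1
col_wise = softmax_demo(scores, axis=0)  # each column sums to 1
print(row_wise.sum(axis=1))
print(col_wise.sum(axis=0))
```

For the attention weights in this lab, each row of the softmaxed score matrix should be a probability distribution over the keys, which is why `calculate_weights` uses `axis=1`.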
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3932b927", + "metadata": {}, + "outputs": [], + "source": [ + "def softmax(x, axis=0): \n", + " \"\"\" Calculate softmax function for an array x\n", + "\n", + " axis=0 calculates softmax across rows which means each column sums to 1 \n", + " axis=1 calculates softmax across columns which means each row sums to 1\n", + " \"\"\"\n", + " # Replace pass with your code.\n", + " pass\n", + "\n", + "def calculate_weights(queries, keys):\n", + " \"\"\" Calculate the weights for scaled dot-product attention\"\"\"\n", + " # Replace None with your code.\n", + " dot = None\n", + " weights = softmax(dot, axis=1)\n", + " \n", + " assert weights.sum(axis=1)[0] == 1, \"Each row in weights must sum to 1\"\n", + " \n", + " # Replace pass with your code.\n", + " pass" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "51f47450", + "metadata": {}, + "outputs": [], + "source": [ + "# Tokenize example sentences in English and French, then get their embeddings\n", + "sentence_en = \"The agreement on the European Economic Area was signed in August 1992 .\"\n", + "tokenized_en = tokenize(sentence_en, en_words)\n", + "embedded_en = embed(tokenized_en, en_embeddings)\n", + "\n", + "sentence_fr = \"L accord sur la zone économique européenne a été signé en août 1992 .\"\n", + "tokenized_fr = tokenize(sentence_fr, fr_words)\n", + "embedded_fr = embed(tokenized_fr, fr_embeddings)\n", + "\n", + "# These weights indicate alignment between words in English and French\n", + "alignment = calculate_weights(embedded_fr, embedded_en)\n", + "\n", + "# Visualize weights to check for alignment\n", + "fig, ax = plt.subplots(figsize=(7,7))\n", + "ax.imshow(alignment, cmap='gray')\n", + "ax.xaxis.tick_top()\n", + "ax.set_xticks(np.arange(alignment.shape[1]))\n", + "ax.set_xticklabels(sentence_en.split(\" \"), rotation=90, size=16);\n", + "ax.set_yticks(np.arange(alignment.shape[0]));\n", + 
"ax.set_yticklabels(sentence_fr.split(\" \"), size=16);" + ] + }, + { + "cell_type": "markdown", + "id": "d634f0ec", + "metadata": {}, + "source": [ + "If you implemented the weights calculations correctly, the alignment matrix should look like this:\n", + "\n", + "![alignment visualization](./images/alignment.png)\n", + "\n", + "This is a demonstration of alignment where the model has learned which words in English correspond to words in French. For example, the words *signed* and *signé* have a large weight because they have the same meaning. Typically, these alignments are learned using linear layers in the model, but you've used pre-trained embeddings here.\n", + "\n", + "### Exercise 2\n", + "Complete the implementation of scaled dot-product attention using your `calculate_weights` function (ignore the mask)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fbfc157e", + "metadata": {}, + "outputs": [], + "source": [ + "def attention_qkv(queries, keys, values):\n", + " \"\"\" Calculate scaled dot-product attention from queries, keys, and values matrices \"\"\"\n", + " \n", + " # Replace pass with your code.\n", + " pass\n", + "\n", + "\n", + "attention_qkv_result = attention_qkv(embedded_fr, embedded_en, embedded_en)\n", + "\n", + "print(f\"The shape of the attention_qkv function is {attention_qkv_result.shape}\")\n", + "print(f\"Some elements of the attention_qkv function are \\n{attention_qkv_result[0:2,:10]}\")" + ] + }, + { + "cell_type": "markdown", + "id": "f98335f0", + "metadata": {}, + "source": [ + "**Expected output**\n", + "\n", + "The shape of the attention_qkv function is `(14, 300)`\n", + "\n", + "Some elements of the attention_qkv function are \n", + "```python\n", + "[[-0.04039161 -0.00275749 0.00389873 0.04842744 -0.02472726 0.01435613\n", + " -0.00370253 -0.0619686 -0.00206159 0.01615228]\n", + " [-0.04083253 -0.00245985 0.00409068 0.04830341 -0.02479128 0.01447497\n", + " -0.00355203 -0.06196036 -0.00241327 
0.01582606]]\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "f87131fb", + "metadata": {}, + "source": [ + "## Solutions" + ] + }, + { + "cell_type": "markdown", + "id": "8470a024", + "metadata": {}, + "source": [ + "```python\n", + "def softmax(x, axis=0):\n", + " \"\"\" Calculate softmax function for an array x\n", + " \n", + " axis=0 calculates softmax across rows which means each column sums to 1 \n", + " axis=1 calculates softmax across columns which means each row sums to 1\n", + " \"\"\"\n", + " y = np.exp(x) \n", + " return y / np.expand_dims(np.sum(y, axis=axis), axis)\n", + "\n", + "def calculate_weights(queries, keys):\n", + " \"\"\" Calculate the weights for scaled dot-product attention\"\"\"\n", + " dot = np.matmul(queries, keys.T)/np.sqrt(keys.shape[1])\n", + " weights = softmax(dot, axis=1)\n", + " \n", + " assert weights.sum(axis=1)[0] == 1, \"Each row in weights must sum to 1\"\n", + " \n", + " return weights\n", + "\n", + "def attention_qkv(queries, keys, values):\n", + " \"\"\" Calculate scaled dot-product attention from queries, keys, and values matrices \"\"\"\n", + " weights = calculate_weights(queries, keys)\n", + " return np.matmul(weights, values)\n", + "```" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/C4W1_Basic_Attention.ipynb b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/C4W1_Basic_Attention.ipynb new file mode 100644 index 
0000000000000000000000000000000000000000..0628c0c6f8a6d1724468ededc50263ef8a516ccb --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/C4W1_Basic_Attention.ipynb @@ -0,0 +1,324 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "9c74bac5", + "metadata": {}, + "source": [ + "# Basic Attention Operation: Ungraded Lab\n", + "\n", + "As you've learned, attention allows a seq2seq decoder to use information from each encoder step instead of just the final encoder hidden state. In the attention operation, the encoder outputs are weighted based on the decoder hidden state, then combined into one context vector. This vector is then used as input to the decoder to predict the next output step.\n", + "\n", + "In this ungraded lab, you'll implement a basic attention operation as described in [Bahdanau, et al. (2014)](https://arxiv.org/abs/1409.0473) using Numpy.\n", + "\n", + "This is a practice notebook, where you can try writing the code yourself. All of the solutions are provided at the end of the notebook." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "a5288920", + "metadata": {}, + "outputs": [], + "source": [ + "# Import the libraries and define the functions you will need for this lab\n", + "import numpy as np\n", + "\n", + "def softmax(x, axis=0):\n", + " \"\"\" Calculate softmax function for an array x along specified axis\n", + " \n", + " axis=0 calculates softmax across rows which means each column sums to 1 \n", + " axis=1 calculates softmax across columns which means each row sums to 1\n", + " \"\"\"\n", + " return np.exp(x) / np.expand_dims(np.sum(np.exp(x), axis=axis), axis)" + ] + }, + { + "cell_type": "markdown", + "id": "9a6e0293", + "metadata": {}, + "source": [ + "## 1: Calculating alignment scores\n", + "\n", + "The first step is to calculate the alignment scores. This is a measure of similarity between the decoder hidden state and each encoder hidden state. 
From the paper, this operation looks like\n", + "\n", + "$$\n", + "\\large e_{ij} = v_a^\\top \\tanh{\\left(W_a s_{i-1} + U_a h_j\\right)}\n", + "$$\n", + "\n", + "where $W_a \\in \\mathbb{R}^{n\\times m}$, $U_a \\in \\mathbb{R}^{n \\times m}$, and $v_a \\in \\mathbb{R}^m$\n", + "are the weight matrices and $n$ is the hidden state size. In practice, this is implemented as a feedforward neural network with two layers, where $m$ is the size of the layers in the alignment network. It looks something like:\n", + "\n", + "![alignment model](./images/alignment_model_3.jpg)\n", + "\n", + "Here $h_j$ are the encoder hidden states for each input step $j$ and $s_{i - 1}$ is the decoder hidden state of the previous step. The first layer corresponds to $W_a$ and $U_a$, while the second layer corresponds to $v_a$.\n", + "\n", + "To implement this, first concatenate the encoder and decoder hidden states to produce an array with size $K \\times 2n$ where $K$ is the number of encoder states/steps. For this, use `np.concatenate` ([docs](https://numpy.org/doc/stable/reference/generated/numpy.concatenate.html)). Note that there is only one decoder state so you'll need to reshape it to successfully concatenate the arrays. The easiest way is to use `decoder_state.repeat` ([docs](https://numpy.org/doc/stable/reference/generated/numpy.repeat.html#numpy.repeat)) to match the hidden state array size.\n", + "\n", + "Then, apply the first layer as a matrix multiplication between the weights and the concatenated input. Use the tanh function to get the activations. Finally, compute the matrix multiplication of the second layer weights and the activations. This returns the alignment scores." 
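The concatenation step described above can be sanity-checked in isolation. This sketch uses zero-filled placeholder arrays with illustrative sizes (K = 5 encoder steps, hidden size n = 16) rather than the lab's real states:

```python
import numpy as np

# Illustrative sizes only: K encoder steps, hidden size n.
K, n = 5, 16
encoder_states = np.zeros((K, n))  # placeholder stand-ins for real states
decoder_state = np.zeros((1, n))

# Repeat the single decoder state K times, then concatenate column-wise.
tiled = decoder_state.repeat(K, axis=0)                   # shape (K, n)
inputs = np.concatenate((encoder_states, tiled), axis=1)  # shape (K, 2n)
print(inputs.shape)  # (5, 32)
```

The repeat is needed because `np.concatenate` requires matching sizes along the non-concatenated axis: one decoder state must be tiled to pair with all K encoder states.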
+ ] + }, + { + "cell_type": "code", + "execution_count": 19, + "id": "72857076", + "metadata": {}, + "outputs": [], + "source": [ + "hidden_size = 16\n", + "attention_size = 10\n", + "input_length = 5\n", + "\n", + "np.random.seed(42)\n", + "\n", + "# Synthetic vectors used to test\n", + "encoder_states = np.random.randn(input_length, hidden_size)\n", + "decoder_state = np.random.randn(1, hidden_size)\n", + "\n", + "# Weights for the neural network, these are typically learned through training\n", + "# Use these in the alignment function below as the layer weights\n", + "layer_1 = np.random.randn(2 * hidden_size, attention_size)\n", + "layer_2 = np.random.randn(attention_size, 1)\n", + "\n", + "# Implement this function. Replace None with your code. Solution at the bottom of the notebook\n", + "def alignment(encoder_states, decoder_state):\n", + " # First, concatenate the encoder states and the decoder state\n", + " inputs = np.concatenate((encoder_states, np.repeat(decoder_state, input_length, axis=0)),axis=1)\n", + " assert inputs.shape == (input_length, 2 * hidden_size)\n", + " \n", + " # Matrix multiplication of the concatenated inputs and layer_1, with tanh activation\n", + " activations = np.tanh(np.dot(inputs,layer_1))\n", + " assert activations.shape == (input_length, attention_size)\n", + " \n", + " # Matrix multiplication of the activations with layer_2. 
Remember that you don't need tanh here\n", + " scores = np.dot(activations, layer_2)\n", + " assert scores.shape == (input_length, 1)\n", + " \n", + " return scores" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "id": "fb638355", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[[4.35790943]\n", + " [5.92373433]\n", + " [4.18673175]\n", + " [2.11437202]\n", + " [0.95767155]]\n" + ] + } + ], + "source": [ + "# Run this to test your alignment function\n", + "scores = alignment(encoder_states, decoder_state)\n", + "print(scores)" + ] + }, + { + "cell_type": "markdown", + "id": "f26aae76", + "metadata": {}, + "source": [ + "If you implemented the function correctly, you should get these scores:\n", + "\n", + "```python\n", + "[[4.35790943]\n", + " [5.92373433]\n", + " [4.18673175]\n", + " [2.11437202]\n", + " [0.95767155]]\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "58b8cfa9", + "metadata": {}, + "source": [ + "## 2: Turning alignment into weights\n", + "\n", + "The next step is to calculate the weights from the alignment scores. These weights determine the encoder outputs that are the most important for the decoder output. These weights should be between 0 and 1. You can use the softmax function (which is already implemented above) to get these weights from the attention scores. Pass the attention scores vector to the softmax function to get the weights. Mathematically,\n", + "\n", + "$$\n", + "\\large \\alpha_{ij} = \\frac{\\exp{\\left(e_{ij}\\right)}}{\\sum_{k=1}^K \\exp{\\left(e_{ik}\\right)}}\n", + "$$\n", + "\n", + "\n", + "\n", + "## 3: Weight the encoder output vectors and sum\n", + "\n", + "The weights tell you the importance of each input word with respect to the decoder state. In this step, you use the weights to modulate the magnitude of the encoder vectors. Words with little importance will be scaled down relative to important words. 
Multiply each encoder vector by its respective weight to get the alignment vectors, then sum up the weighted alignment vectors to get the context vector. Mathematically,\n", + "\n", + "$$\n", + "\\large c_i = \\sum_{j=1}^K\\alpha_{ij} h_{j}\n", + "$$\n", + "\n", + "Implement these steps in the `attention` function below." + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "id": "4546cbb5", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[-0.63514569 0.04917298 -0.43930867 -0.9268003 1.01903919 -0.43181409\n", + " 0.13365099 -0.84746874 -0.37572203 0.18279832 -0.90452701 0.17872958\n", + " -0.58015282 -0.58294027 -0.75457577 1.32985756]\n" + ] + } + ], + "source": [ + "# Implement this function. Replace None with your code.\n", + "def attention(encoder_states, decoder_state):\n", + " \"\"\" Example function that calculates attention, returns the context vector \n", + " \n", + " Arguments:\n", + " encoder_vectors: NxM numpy array, where N is the number of vectors and M is the vector length\n", + " decoder_vector: 1xM numpy array, M is the vector length, must be the same M as encoder_vectors\n", + " \"\"\" \n", + " \n", + " # First, calculate the alignment scores\n", + " scores = alignment(encoder_states, decoder_state)\n", + " \n", + " # Then take the softmax of the alignment scores to get a weight distribution\n", + " weights = softmax(scores)\n", + " \n", + " # Multiply each encoder state by its respective weight\n", + " weighted_scores = weights * encoder_states\n", + " \n", + " # Sum up weighted alignment vectors to get the context vector and return it\n", + " context = np.sum(weighted_scores, axis=0)\n", + " return context\n", + "\n", + "context_vector = attention(encoder_states, decoder_state)\n", + "print(context_vector)" + ] + }, + { + "cell_type": "markdown", + "id": "5d9f3df4", + "metadata": {}, + "source": [ + "If you implemented the `attention` function correctly, the context vector should 
be\n", + "\n", + "```python\n", + "[-0.63514569 0.04917298 -0.43930867 -0.9268003 1.01903919 -0.43181409\n", + " 0.13365099 -0.84746874 -0.37572203 0.18279832 -0.90452701 0.17872958\n", + " -0.58015282 -0.58294027 -0.75457577 1.32985756]\n", + "```\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "id": "4210899c", + "metadata": {}, + "source": [ + "## See below for solutions" + ] + }, + { + "cell_type": "markdown", + "id": "3ba0d629", + "metadata": {}, + "source": [ + "```python\n", + "# Solution\n", + "def alignment(encoder_states, decoder_state):\n", + " # First, concatenate the encoder states and the decoder state.\n", + " inputs = np.concatenate((encoder_states, decoder_state.repeat(input_length, axis=0)), axis=1)\n", + " assert inputs.shape == (input_length, 2*hidden_size)\n", + " \n", + " # Matrix multiplication of the concatenated inputs and the first layer, with tanh activation\n", + " activations = np.tanh(np.matmul(inputs, layer_1))\n", + " assert activations.shape == (input_length, attention_size)\n", + " \n", + " # Matrix multiplication of the activations with the second layer. 
Remember that you don't need tanh here\n", + " scores = np.matmul(activations, layer_2)\n", + " assert scores.shape == (input_length, 1)\n", + " \n", + " return scores\n", + "\n", + "# Run this to test your alignment function\n", + "scores = alignment(encoder_states, decoder_state)\n", + "print(scores)\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "f80faecb", + "metadata": {}, + "source": [ + "```python\n", + "# Solution\n", + "def attention(encoder_states, decoder_state):\n", + " \"\"\" Example function that calculates attention, returns the context vector \n", + " \n", + " Arguments:\n", + " encoder_vectors: NxM numpy array, where N is the number of vectors and M is the vector length\n", + " decoder_vector: 1xM numpy array, M is the vector length, must be the same M as encoder_vectors\n", + " \"\"\" \n", + " \n", + " # First, calculate the alignment scores\n", + " scores = alignment(encoder_states, decoder_state)\n", + " \n", + " # Then take the softmax of those scores to get a weight distribution\n", + " weights = softmax(scores)\n", + " \n", + " # Multiply each encoder state by its respective weight\n", + " weighted_scores = encoder_states * weights\n", + " \n", + " # Sum up the weighted encoder states\n", + " context = np.sum(weighted_scores, axis=0)\n", + " \n", + " return context\n", + "\n", + "context_vector = attention(encoder_states, decoder_state)\n", + "print(context_vector)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "16a6caa8", + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } 
+ }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/C4W1_Bleu_Score.ipynb b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/C4W1_Bleu_Score.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..4cec1d6c21500424aae1bcb670fc4adfa701929c --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/C4W1_Bleu_Score.ipynb @@ -0,0 +1,585 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Calculating the Bilingual Evaluation Understudy (BLEU) score: Ungraded Lab" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this ungraded lab, you will implement a popular metric for evaluating the quality of machine-translated text: the BLEU score proposed by Kishore Papineni, et al. in their 2002 paper [\"BLEU: a Method for Automatic Evaluation of Machine Translation\"](https://www.aclweb.org/anthology/P02-1040.pdf). The BLEU score works by comparing a \"candidate\" text to one or more \"reference\" texts. The higher the score, the better the translation. In the following sections you will calculate this value using your own implementation as well as using functions from a library." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 1. Importing the Libraries\n", + "\n", + "You will start by importing the Python libraries. First, you will implement your own version of the BLEU Score using NumPy. To verify that your implementation is correct, you will compare the results with those generated by the [SacreBLEU library](https://github.com/mjpost/sacrebleu). This package provides hassle-free computation of shareable, comparable, and reproducible BLEU scores. It also knows all the standard test sets and handles downloading, processing, and tokenization."
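The bookkeeping underneath BLEU is n-gram counting, which `nltk.util.ngrams` plus `Counter` will handle for you below. As a dependency-free illustration (the five-token sentence is made up), the same counts can be produced with a sliding window:

```python
from collections import Counter

def ngram_counts(tokens, n):
    # Slide a window of length n over the token list and tally each n-gram.
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

tokens = "the cat the cat sat".split()
bigrams = ngram_counts(tokens, 2)
print(bigrams[("the", "cat")])  # 2 -- this bigram occurs twice
```

A sentence with T tokens yields T - n + 1 n-grams, which is the denominator that will appear in the precision formula later.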
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "[nltk_data] Downloading package punkt to /home/jovyan/nltk_data...\n", + "[nltk_data] Package punkt is already up-to-date!\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Requirement already satisfied: sacrebleu in /opt/conda/lib/python3.10/site-packages (2.3.1)\n", + "Requirement already satisfied: portalocker in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (2.8.2)\n", + "Requirement already satisfied: regex in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (2023.10.3)\n", + "Requirement already satisfied: tabulate>=0.8.9 in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (0.9.0)\n", + "Requirement already satisfied: numpy>=1.17 in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (1.24.3)\n", + "Requirement already satisfied: colorama in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (0.4.6)\n", + "Requirement already satisfied: lxml in /opt/conda/lib/python3.10/site-packages (from sacrebleu) (4.9.3)\n" + ] + } + ], + "source": [ + "import numpy as np # import numpy for numerical computations.\n", + "import nltk # import NLTK to handle simple NL tasks like tokenization.\n", + "nltk.download(\"punkt\")\n", + "from nltk.util import ngrams\n", + "from collections import Counter # import a counter.\n", + "!pip3 install 'sacrebleu' # install the sacrebleu package.\n", + "import sacrebleu # import sacrebleu in order to compute the BLEU score.\n", + "import matplotlib.pyplot as plt # import pyplot in order to make some illustrations." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 2. BLEU score\n", + "\n", + "## 2.1 Definitions and formulas\n", + "\n", + "You have seen how to calculate the BLEU score in this week's lectures. 
Formally, you can express the BLEU score as:\n", + "\n", + "$$BLEU = BP\\times\\Bigl(\\prod_{i=1}^{n}precision_i\\Bigr)^{(1/n)}.\\tag{1}$$\n", + "\n", + "\n", + "The BLEU score depends on the $BP$, which stands for Brevity Penalty, and the geometric mean of the i-gram precisions for i-gram lengths $1$ through $n$, both of which are described below. The product runs from $i=1$ to $i=n$ to account for 1-grams up to n-grams, and the exponent of $1/n$ computes the geometric mean. In this notebook, you will use $n=4$.\n", + "\n", + "The **Brevity Penalty** is defined as an exponential decay:\n", + "\n", + "$$BP = min\\Bigl(1, e^{(1-({len(ref)}/{len(cand)}))}\\Bigr),\\tag{2}$$\n", + "\n", + "where ${len(ref)}$ and ${len(cand)}$ refer to the length or count of words in the reference and candidate translations. The brevity penalty helps to handle very short translations. \n", + "\n", + "The **precision** is defined as:\n", + "\n", + "$$precision_i = \\frac {\\sum_{s_i \\in{cand}}min\\Bigl(C(s_i, cand), C(s_i, ref)\\Bigr)}{\\sum_{s_i \\in{cand}} C(s_i, cand)}.\\tag{3}$$\n", + "\n", + "The sum goes over all the i-grams $s_i$ in the candidate sentence $cand$. $C(s_i, cand)$ and $C(s_i, ref)$ are the counts of the i-grams in the candidate and reference sentences respectively. So the sum counts all the i-grams in the candidate sentence that also appear in the reference sentence, but only counts them as many times as they appear in the reference sentence and not more. This is then divided by the total number of i-grams in the candidate sentence." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 2.2 Visualizing the BLEU score\n", + "\n", + "### Brevity Penalty:\n", + "The brevity penalty penalizes generated translations that are shorter than the reference sentence. It compensates for the fact that the BLEU score has no recall term."
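Equations (2) and (3) can be sketched directly in plain Python. This is a toy illustration of the clipping and brevity-penalty behavior on an invented sentence pair, not the tokenization-aware algorithm that SacreBLEU implements:

```python
import math
from collections import Counter

def brevity_penalty(ref_len, cand_len):
    # Equation (2): no penalty once the candidate is at least as long
    # as the reference; exponential decay for shorter candidates.
    return min(1.0, math.exp(1 - ref_len / cand_len))

def clipped_precision(cand, ref, n):
    # Equation (3): count candidate n-grams, clipping each count at the
    # number of times that n-gram appears in the reference.
    cand_counts = Counter(tuple(cand[i:i + n]) for i in range(len(cand) - n + 1))
    ref_counts = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
    clipped = sum(min(count, ref_counts[gram]) for gram, count in cand_counts.items())
    return clipped / sum(cand_counts.values())

ref = "the cat sat on the mat".split()
cand = "the the the the the the the".split()
print(brevity_penalty(len(ref), len(cand)))  # 1.0 -- the candidate is longer
print(clipped_precision(cand, ref, 1))       # 2/7: only two "the" are credited
```

The degenerate candidate shows why clipping matters: without the `min`, a translation that repeats one reference word would get perfect unigram precision.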
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAGwCAYAAABVdURTAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAABcvElEQVR4nO3deVwU9eMG8Gd2WW4BEUEEBLxRPBCVwEytxLRMy1LTvLPwzLM0y6vMX5rmrZlXlleKV2UmWZ5oCoIniiIKKoqgcsq1+/n9Ye7XlcNdBAaW5/167eslw8zOswPMPs5+ZkYSQggQERERGQmF3AGIiIiIShLLDRERERkVlhsiIiIyKiw3REREZFRYboiIiMiosNwQERGRUWG5ISIiIqNiIneAsqbRaHDr1i1UqVIFkiTJHYeIiIj0IIRAWloaatasCYWi6GMzla7c3Lp1C25ubnLHICIiomKIj4+Hq6trkfNUunJTpUoVAI82jo2NjcxpiIiISB+pqalwc3PTvo8XpdKVm8cfRdnY2LDcEBERVTD6DCnhgGIiIiIyKiw3REREZFRYboiIiMiosNwQERGRUWG5ISIiIqPCckNERERGheWGiIiIjArLDRERERkVlhsiIiIyKiw3REREZFRkLTeHDh1C165dUbNmTUiShJ07dz5zmYMHD8LX1xfm5uaoXbs2VqxYUfpBiYiIqMKQtdxkZGSgWbNmWLJkiV7zx8bGokuXLmjbti0iIiLw2WefYfTo0QgODi7lpERERFRRyHrjzM6dO6Nz5856z79ixQrUqlULCxYsAAB4eXkhLCwM3377LXr06FFKKfWj1ggkpDyUNQM9Hycbc6iU/KSWiKiiq1B3BT927BgCAwN1pnXq1AmrV69Gbm4uVCpVvmWys7ORnZ2t/To1NbVUsiVnZOPFb/4pleemslHDxhxL+/rA191e7ihERPQcKlS5uX37NpycnHSmOTk5IS8vD0lJSXB2ds63zOzZszFjxowyyWdmwv/1V1RqjcDt1Cz0XnkcU99ohPdfcIckSXLHIiKiYqhQ5QZAvjccIUSB0x+bPHkyxo0bp/06NTUVbm5uJZ7LsYo5Ln2l/0dsVL5kZOfhk21n8PvZBHyx6zwi4h9g9ttNYGailDsaEREZqEIdaqhRowZu376tMy0xMREmJiaoVq1agcuYmZnBxsZG50H0NCszEyzp44MpXbygkIDtp27imz8uyR2LiIiKoUKVG39/f4SEhOhM27dvH1q2bFngeBsiQ0iShKEv1cby930BAGtDYxERd1/mVEREZChZy016ejoiIyMRGRkJ4NGp3pGRkYiLiwPw6COl/v37a+cPCgrC9evXMW7cOERFRWHNmjVYvXo1JkyYIEd8MlKdGtfAWz4uEAKYFHwWOXkauSMREZEBZC03YWFh8PHxgY+PDwBg3Lhx8PHxwdSpUwEACQkJ2qIDAJ6entizZw8OHDiA5s2b48svv8SiRYtkPw2cjM8XbzSCvZUpLt1Jw/cHY+SOQ0REBpDE4xG5lURqaipsbW2RkpLC8TdUpF2RN/Hx5kiYKhXY8/GLqOtYRe5IRESVliHv3xVqzA1RWXqzWU10aFAdOWoNJgWfhUZTqf4fQERUYbHcEBVCkiR89VYTWJkqEXb9Pn7+97rckYiISA8sN0RFcLGzwKTODQEA3/xxETfuZ8qciIiInoXlhugZ+vq5o5VHVWTkqPHZjnOoZMPUiIgqHJYbomdQKCT8X4+mMDVR4FD0XWw/dVPuSEREVASWGyI91KlujbGv1gcAzPztAhLTsmROREREhWG5IdLT0Lae8HaxQcrDXEzbdV7uOEREVAiWGyI9mSgVmNOjGUwUEv44dxt7zibIHYmIiArAckN
kgEY1bTC8fR0AwNRd53AvI0fmRERE9DSWGyIDjXi5Luo7WSMpPQczf+XHU0RE5Q3LDZGBzEyUmPNOMygkYGfkLeyPuiN3JCIiegLLDVExNHezw9C2tQEAn+04i5SHuTInIiKix1huiIppbMf68HSwwp3UbMz6/YLccYiI6D8sN0TFZK5SYs47TSFJwC9hN3DgUqLckYiICCw3RM+llYc9BgV4AgAmbz+L1Cx+PEVEJDeWG6LnNLFTA7hXs0RCSha+/j1K7jhERJUeyw3Rc7IwVWLuO80gScDmk/E4FH1X7khERJUayw1RCWjtaY8B/h4AgEnBZ/jxFBGRjFhuiErIJ689+njqVkoWZv3Gj6eIiOTCckNUQixNTbQfT20Ji8c/PHuKiEgWLDdEJai15//OnpoUfAYpmfx4ioiorLHcEJWwiZ0aaC/uN/M3XtyPiKissdwQlTALUyW+fffRxf2CT93AXxd47ykiorLEckNUCnzd7bX3npq84yzuZ+TInIiIqPJguSEqJeM61kddR2vcTcvG1N3n5Y5DRFRpsNwQlRJzlRLz3m0GpULCr6dv4fczCXJHIiKqFFhuiEpRMzc7DG9fBwDw+c6zuJuWLXMiIiLjx3JDVMpGvVwPXs42uJ+Ziyk7zkIIIXckIiKjxnJDVMpMTRSY924zqJQS9l24g+2nbsodiYjIqLHcEJWBRjVtMObV+gCA6bvP49aDhzInIiIyXiw3RGXko5dqw6eWHdKy8zBx22loNPx4ioioNLDcEJURE6UC83s2h7lKgaNXkrH+2DW5IxERGSWWG6Iy5Olghc+6eAEAZv9xETF302VORERkfFhuiMrY+37uaFvPAdl5Goz75TTy1Bq5IxERGRWWG6IyplBImPNOU1QxN8Hp+AdY+k+M3JGIiIwKyw2RDJxtLfBlN28AwKK/L+N0/AN5AxERGRGWGyKZdGteE683dYZaIzD2l0g8zFHLHYmIyCiw3BDJRJIkzOruDccqZrh6NwPf7L0odyQiIqPAckMkIztLU8x9txkAYF3oNRyKvitzIiKiio/lhkhm7epXR39/dwDAxG2n8SAzR+ZEREQVG8sNUTkwubMXajtY4U5qNqbsOMebaxIRPQeWG6JywMJUiQW9m8NEIeH3swnYEcGbaxIRFZfs5WbZsmXw9PSEubk5fH19cfjw4SLnX7p0Kby8vGBhYYEGDRpg/fr1ZZSUqHQ1dbXDmFfrAQCm7jqP+HuZMiciIqqYZC03W7ZswZgxYzBlyhRERESgbdu26Ny5M+Li4gqcf/ny5Zg8eTKmT5+O8+fPY8aMGRgxYgR+/fXXMk5OVDqGta+Llu5VkZ6dh/G/nIaaN9ckIjKYJGT8cN/Pzw8tWrTA8uXLtdO8vLzQvXt3zJ49O9/8AQEBaNOmDebOnaudNmbMGISFheHIkSN6rTM1NRW2trZISUmBjY3N878IohIWfy8TnRceRnp2Hj55rQGGt68rdyQiItkZ8v4t25GbnJwchIeHIzAwUGd6YGAgQkNDC1wmOzsb5ubmOtMsLCxw4sQJ5ObmFrpMamqqzoOoPHOzt8T0NxsDAObvi8bZGykyJyIiqlhkKzdJSUlQq9VwcnLSme7k5ITbt28XuEynTp2watUqhIeHQwiBsLAwrFmzBrm5uUhKSipwmdmzZ8PW1lb7cHNzK/HXQlTSerRwQZcmNZCnEfh4SwQyc/LkjkREVGHIPqBYkiSdr4UQ+aY99sUXX6Bz58544YUXoFKp0K1bNwwcOBAAoFQqC1xm8uTJSElJ0T7i4+NLND9RaZAkCV+/1QQ1bMxx9W4Gvvo9Su5IREQVhmzlxsHBAUqlMt9RmsTExHxHcx6zsLDAmjVrkJmZiWvXriEuLg4eHh6oUqUKHBwcClzGzMwMNjY2Og+iisDO0hTzezaDJAEb/43DvvMFH9EkIiJdspUbU1NT+Pr6IiQkRGd6SEgIAgICilxWpVLB1dUVSqUSmzdvxhtvvAGFQva
DUEQlLqCuAz5sWxsA8GnwGdxJzZI5ERFR+SdrIxg3bhxWrVqFNWvWICoqCmPHjkVcXByCgoIAPPpIqX///tr5o6Oj8fPPP+Py5cs4ceIEevfujXPnzuHrr7+W6yUQlbrxgQ3QuKYN7mfmYsLW09Dw9HAioiKZyLnyXr16ITk5GTNnzkRCQgK8vb2xZ88euLs/us9OQkKCzjVv1Go15s2bh0uXLkGlUqFDhw4IDQ2Fh4eHTK+AqPSZmiiwsLcP3lh8GIcvJ2HN0Vh88N/RHCIiyk/W69zIgde5oYpqw7/XMWXHOaiUEnYMbwNvF1u5IxERlZkKcZ0bIjJMn9a1ENjICblqgdGbeXo4EVFhWG6IKghJkvBNj6ZwsjHD1bsZmPnrBbkjERGVSyw3RBVIVStTfNerOSQJ2HwyHnvOJsgdiYio3GG5IapgAuo4YFi7OgCAScFncPPBQ5kTERGVLyw3RBXQ2I710czNDqlZeRi7OZJ3DyciegLLDVEFpFIqsKh3c1ibmeDEtXtY/PdluSMREZUbLDdEFZR7NSt82f3R3cMX7b+Mf68my5yIiKh8YLkhqsDe8nHF2y1coBHAmC2RuJ+RI3ckIiLZsdwQVXBfdvOGp4MVElKy8EnwGVSy63ISEeXDckNUwVmZmWDxez5QKSWEXLiDn45flzsSEZGsWG6IjIC3iy0mdfYCAHz1exQu3EqVORERkXxYboiMxOA2HniloSNy8jQYuekUMrJ5ewYiqpxYboiMhCRJmPtuM9SwMcfVuxmYuuu83JGIiGTBckNkROytTLGwd3MoJCD41A0Eh9+QOxIRUZljuSEyMn61q2HMq/UBAF/sOoeYu+kyJyIiKlssN0RGaESHugioUw2ZOWqM2HAKWblquSMREZUZlhsiI6RUSFjQqzmqWZni4u00zPo9Su5IRERlhuWGyEg52phjfq/mAICfjl/H72cS5A1ERFRGWG6IjFi7+tUxrH0dAMCnwWdwLSlD5kRERKWP5YbIyI3vWB8t3asiPTsPIzZy/A0RGT+WGyIjZ6JUYHEfH1S1VOH8rVR8vYfjb4jIuLHcEFUCzrYW2vE3649x/A0RGTeWG6JKokMDRwzn+BsiqgRYbogqkXEd66OVx6PxN8N5/RsiMlIsN0SViIlSgcXvtYC9lSkuJKRi5m8X5I5ERFTiWG6IKpkatuZY0Ks5JAnY+G8cdkbclDsSEVGJYrkhqoReql8do16uBwCYvP0sLt9JkzkREVHJYbkhqqQ+fqUeXqzrgIe5agzfcAqZOXlyRyIiKhEsN0SVlFIhYUHv5nCsYobLiemYsuMchBByxyIiem4sN0SVmIO1GZb0aQGlQsKOiJvYdCJe7khERM+N5YaokmvtaY+JnRoAAKbvPo8zNx7IG4iI6Dmx3BARPnqpNjo2ckKOWoNhP5/Cg8wcuSMRERUbyw0RQZIkfPtuM7hXs8TNBw8xdkskNBqOvyGiionlhogAALYWKizv6wszEwX+uXQXS/+5InckIqJiYbkhIq1GNW3wVXdvAMD8v6Jx+PJdmRMRERmO5YaIdLzb0g29W7lBCODjzZG49eCh3JGIiAzCckNE+Ux/szG8XWxwLyMHwzacQnYeb7BJRBUHyw0R5WOuUmJ5X1/YWqhwOv4BZv7KG2wSUcXBckNEBXKzt8TC3o9usLnh3zhsDeMF/oioYmC5IaJCtW/giLGv1gcATNl5DudupsiciIjo2VhuiKhIIzvUxcsNHZGTp0HQz+G4n8EL/BFR+cZyQ0RFUigkfNezOWrZW+LG/Yf4eEsk1LzAHxGVYyw3RPRMtpYqrHjfF+YqBQ5F38WCv6LljkREVCjZy82yZcvg6ekJc3Nz+Pr64vDhw0XOv2HDBjRr1gyWlpZwdnbGoEGDkJycXEZpiSqvRjVt8H9vNwUALP77Cv48f1vmREREBZO13GzZsgVjxozBlClTEBERgbZt26Jz586Ii4srcP4jR46gf//+GDJkCM6fP4+
tW7fi5MmT+OCDD8o4OVHl1N3HBYPaeAAAxv9yGlcS0+UNRERUAFnLzfz58zFkyBB88MEH8PLywoIFC+Dm5obly5cXOP/x48fh4eGB0aNHw9PTEy+++CI++ugjhIWFFbqO7OxspKam6jyIqPg+6+IFP097pGfn4cOfwpCWlSt3JCIiHbKVm5ycHISHhyMwMFBnemBgIEJDQwtcJiAgADdu3MCePXsghMCdO3ewbds2vP7664WuZ/bs2bC1tdU+3NzcSvR1EFU2KqUCS/u2gLOtOa7ezcC4X07zDuJEVK7IVm6SkpKgVqvh5OSkM93JyQm3bxf8WX5AQAA2bNiAXr16wdTUFDVq1ICdnR0WL15c6HomT56MlJQU7SM+nhciI3peDtZmWP6+L0yVCoRcuMM7iBNRuSL7gGJJknS+FkLkm/bYhQsXMHr0aEydOhXh4eHYu3cvYmNjERQUVOjzm5mZwcbGRudBRM+vuZudzh3E/754R+ZERESPyFZuHBwcoFQq8x2lSUxMzHc057HZs2ejTZs2mDhxIpo2bYpOnTph2bJlWLNmDRISEsoiNhE9oWcrN/T1q/XoDuKbIhFzlwOMiUh+spUbU1NT+Pr6IiQkRGd6SEgIAgICClwmMzMTCoVuZKVSCeDRER8iKnvTujZGK4+qSMvOw9D1YUjlAGMikpmsH0uNGzcOq1atwpo1axAVFYWxY8ciLi5O+zHT5MmT0b9/f+38Xbt2xfbt27F8+XJcvXoVR48exejRo9G6dWvUrFlTrpdBVKmZmiiwrK8vatj8N8B4SyQHGBORrEzkXHmvXr2QnJyMmTNnIiEhAd7e3tizZw/c3d0BAAkJCTrXvBk4cCDS0tKwZMkSjB8/HnZ2dnj55ZfxzTffyPUSiAhA9SpmWNnfF++sOIa/ohKx4K9ojAtsIHcsIqqkJFHJPs9JTU2Fra0tUlJSOLiYqIQFh9/A+K2nAQAr3m+B17ydZU5ERMbCkPdv2c+WIiLj0cPXFYPbeAIAxv1yGpdup8mciIgqI5YbIipRn3VpiIA61ZCZo8bQ9WG4n5EjdyQiqmRYboioRJkoFVjapwXc7C0Qdy8TwzecQq5aI3csIqpEDC437du3x/r16/Hw4cPSyENERqCqlSlW9W8FK1Mljl1Nxle/XZA7EhFVIgaXG19fX3zyySeoUaMGhg4diuPHj5dGLiKq4BrUqILvejUHAPx47Do2/htX9AJERCXE4HIzb9483Lx5E+vXr8fdu3fx0ksvoVGjRvj2229x5w4vv05E/xPYuAYmBNYHAEzddQ4nYu/JnIiIKoNijblRKpXo1q0bdu7ciZs3b6JPnz744osv4Obmhu7du+Pvv/8u6ZxEVEGN6FAXbzR1Rp5GYNjP4bhxP1PuSERk5J5rQPGJEycwdepUfPvtt3B0dMTkyZPh6OiIrl27YsKECSWVkYgqMEmSMPedZmhc0wbJGTkYuj4cGdl5csciIiNmcLlJTEzEvHnz4O3tjbZt2+Lu3bvYvHkzrl27hhkzZmDlypXYtWsXVqxYURp5iagCsjBV4of+LeFgbYaohFSM5S0aiKgUGVxuXF1dsWrVKgwYMAA3btzAtm3b8Nprr0GSJO08rVu3RqtWrUo0KBFVbDXtLPB9P1+YKhXYd+EO5oVckjsSERkpg2+/cPjwYbRt27a08pQ63n6BSF47Im5g7JZHt2j4rlczvOXjKnMiIqoISvX2C9OmTcODBw8KXOnLL79s6NMRUSXzlo8rhrWvAwD4NPgsTsXdlzkRERkbg8vNwYMHkZOT/3LqWVlZOHz4cImEIiLjNjGwATo2ckJOngYfrg/HzQe8KCgRlRwTfWc8c+YMAEAIgQsXLuD27dva76nVauzduxcuLi4ln5CIjI5CIWFBr+bosTwUF2+n4YMfw7AtyB9WZnrvkoiICqX3mBuFQqEdNFzQIhYWFli8eDEGDx5csglLGMfcEJUfN+5novvSo0hKz0HHRk74/n1fKBT
SsxckokrHkPdvvcvN9evXIYRA7dq1ceLECVSvXl37PVNTUzg6OkKpVD5f8jLAckNUvoRfv4f3fvgXOXkafPRSbUzu4iV3JCIqhwx5/9b7GLC7uzsAQKPh3X2JqOT4uttj7jtN8fHmSHx/6Co8HazQu3UtuWMRUQWmV7nZvXu33k/45ptvFjsMEVVO3Zq7IOZuBhbtv4zPd55DLXtLBNR1kDsWEVVQen0spVDod1KVJElQq9XPHao08WMpovJJCIHRmyPx6+lbsDE3wY4RbVCnurXcsYionCjx69xoNBq9HuW92BBR+fXoHlRN4VPLDqlZeRiy7iTuZ+S/7AQR0bM8140ziYhKkrlKiZX9WsLFzgLXkjPx0c/hyMnjOD8iMozBt18AgIyMDBw8eBBxcXH5Lug3evToEgtXGvixFFH5d+l2GnosD0V6dh7e9nHBvJ7NdO5fR0SVT6mcCv5YREQEunTpgszMTGRkZMDe3h5JSUmwtLSEo6Mjrl69+lzhSxvLDVHFcDD6LgavOwm1RmDsq/Xx8av15I5ERDIq1XtLjR07Fl27dsW9e/dgYWGB48eP4/r16/D19cW3335b7NBERE9qV786vuzmDQD47q9o7Ii4IXMiIqooDC43kZGRGD9+PJRKJZRKJbKzs+Hm5oY5c+bgs88+K42MRFRJ9fGrhY9eqg0A+HTbWfx7NVnmRERUERhcblQqlfazbycnJ8TFxQEAbG1ttf8mIiopn77WEJ29ayBHrcGHP4Uj5m663JGIqJwzuNz4+PggLCwMANChQwdMnToVGzZswJgxY9CkSZMSD0hElZtCIeG7Xs3R3M0OKQ9zMXjdSdzjKeJEVASDy83XX38NZ2dnAMCXX36JatWqYdiwYUhMTMTKlStLPCARkblKiVUDWsK1qgWuJ2di6PowZOXyulpEVLBinQpekfFsKaKK60piGt5eForUrDx0aVIDS95rwbuIE1USpXq2FBGRXOo6VsHK/i1hqlRgz9nb+HpPlNyRiKgcMrjc3LlzB/369UPNmjVhYmKiPWvq8YOIqDS9ULsa5r7bFACw6kgs1h6NlTkREZU3et0V/EkDBw5EXFwcvvjiCzg7O/OqoURU5ro1d8HNBw8xZ+8lzPztApxtLfCadw25YxFROWHwmJsqVarg8OHDaN68eSlFKl0cc0NkHIQQmLLzHDb+GwczEwU2ffgCWtSqKncsIiolpTrmxs3NDZVsDDIRlUOSJGHmm43xckNHZOdp8MGPYbiWlCF3LCIqBwwuNwsWLMCkSZNw7dq1UohDRKQ/E6UCi9/zQRMXW9zLyMGAtSeQlJ4tdywikpnBH0tVrVoVmZmZyMvLg6WlJVQqlc737927V6IBSxo/liIyPolpWXh7WShu3H+IZq622Dj0BViZGTykkIjKMUPevw3+61+wYEFxcxERlQrHKub4cXBr9FgeitM3UjBi4yn80L8lVEpe7YKoMuJF/IjIaIRfv4++q44jK1eDni1d8U2Ppjyjk8hIlPpF/GJiYvD555/jvffeQ2JiIgBg7969OH/+fHGejoioRPi6V8Xi91pAIQG/hN3Ad39dljsSEcnA4HJz8OBBNGnSBP/++y+2b9+O9PRHd+g9c+YMpk2bVuIBiYgM0bGRE77q/ugmvov2X8aGf6/LnIiIyprB5WbSpEn46quvEBISAlNTU+30Dh064NixYyUajoioOPr41cLoV+oBAL7YeQ4hF+7InIiIypLB5ebs2bN466238k2vXr06kpOTSyQUEdHzGvtqPfRq6QaNAEZuPIWwa+X7TE4iKjkGlxs7OzskJCTkmx4REQEXFxeDAyxbtgyenp4wNzeHr68vDh8+XOi8AwcOhCRJ+R6NGzc2eL1EZNwkScKst7y1F/kb8mMYou+kyR2LiMqAweWmT58++PTTT3H79m1IkgSNRoOjR49iwoQJ6N+/v0HPtWXLFowZMwZTpkxBREQE2rZti86dOyMuLq7A+RcuXIiEhAT
tIz4+Hvb29nj33XcNfRlEVAmYKBVY2qcFWtSyQ8rDXPRffQI3HzyUOxYRlTKDTwXPzc3FwIEDsXnzZgghYGJiArVajT59+mDdunUG3Rncz88PLVq0wPLly7XTvLy80L17d8yePfuZy+/cuRNvv/02YmNj4e7uXuA82dnZyM7+3xVLU1NT4ebmxlPBiSqRB5k5eHfFMVxOTEed6lbYGhQAeyvTZy9IROWGIaeCF/s6N1evXsWpU6eg0Wjg4+ODevXqGbR8Tk4OLC0tsXXrVp0xPB9//DEiIyNx8ODBZz5H165dkZ2djX379hU6z/Tp0zFjxox801luiCqXhJSH6LEsFLdSstDczQ4bh/rB0pRXMSaqKErlOjcajQZz585FmzZt0Lp1a6xatQpvvPEGevbsaXCxAYCkpCSo1Wo4OTnpTHdycsLt27efuXxCQgL++OMPfPDBB0XON3nyZKSkpGgf8fHxBmcloorP2dYC64e0hp2lCpHxDzB8wynkqjVyxyKiUqB3ufnmm28wadIkWFlZwdnZGfPnz8fo0aOfO8DTVw8VQuh1RdF169bBzs4O3bt3L3I+MzMz2NjY6DyIqHKq61gFawa2grlKgQOX7uKTbWeg0VSqi7QTVQp6l5t169Zh8eLF2LdvH3bt2oWdO3di/fr1KO7dGxwcHKBUKvMdpUlMTMx3NOdpQgisWbMG/fr107nWDhHRs7SoVRXL+/pCqZCwI+Imvvo9qtj7MSIqn/QuN9evX8cbb7yh/bpTp04QQuDWrVvFWrGpqSl8fX0REhKiMz0kJAQBAQFFLnvw4EFcuXIFQ4YMKda6iahy69DQEXPfaQoAWHM0Fkv+viJzIiIqSXqXm5ycHFhYWGi/liQJpqamOmciGWrcuHFYtWoV1qxZg6ioKIwdOxZxcXEICgoC8Gi8TEGnl69evRp+fn7w9vYu9rqJqHJ7u4Urpr7RCAAwLyQaPx3nbRqIjIVBpwp88cUXsLS01H6dk5ODWbNmwdbWVjtt/vz5ej9fr169kJycjJkzZyIhIQHe3t7Ys2eP9rTuhISEfNe8SUlJQXBwMBYuXGhIdCKifAa/6IkHmTlY9PcVTN11DjbmJujW3PCLkRJR+aL3qeDt27d/5kBfSZLw999/l0iw0mLIqWREZPyEEJi2+zzWH7sOE4WEHwa0RIcGjnLHIqKnlMl1bioqlhsieppGIzBmSyR2n74Fc5UCPw/xQ0sPe7ljEdETSuU6N0RExkqhkDCvZzO0b1AdWbkaDFp3Ehdupcodi4iKieWGiAiASqnA8r6+aOleFWlZeei/5gRikzLkjkVExcByQ0T0HwtTJVYPbAUvZxskpWfj/VX/8kabRBUQyw0R0RNsLVT4aUhr1K5uhZsPHqLfqn9xN634l7wgorLHckNE9BQHazP8PMQPLnYWuJqUgX6r/0VKZq7csYhITwaXGw8PD8ycOTPf9WeIiIxJTTsLbPjAD9WrmOHi7TQMXHcCGdl5csciIj0YXG7Gjx+PXbt2oXbt2ujYsSM2b978XFcpJiIqrzwcrPDTkNawtVAhIu4Bhq4PQ1auWu5YRPQMBpebUaNGITw8HOHh4WjUqBFGjx4NZ2dnjBw5EqdOnSqNjEREsmlYwwY/Dm4NK1MlQmOSMXLjKeSqNXLHIqIiFHvMTbNmzbBw4ULcvHkT06ZNw6pVq9CqVSs0a9YMa9as4V12ichoNHezw6oBrWBmosBfUYkY98tpqDXcxxGVV8UuN7m5ufjll1/w5ptvYvz48WjZsiVWrVqFnj17YsqUKejbt29J5iQikpV/nWpY8b4vVEoJv56+hU+Dz0DDgkNULhl040wAOHXqFNauXYtNmzZBqVSiX79++O6779CwYUPtPIGBgXjppZdKNCgRkdw6NHTEot4+GLkpAtvCb8BcpcCX3byfed89IipbBpebVq1aoWPHjli+fDm6d+8OlUqVb55GjRqhd+/eJRKQiKg86dzEGfPVGozZEomfj8f
BzESJz1/3YsEhKkcMLjdXr16Fu7t7kfNYWVlh7dq1xQ5FRFSedWvuguxcDT4JPoPVR2JhoVJiQqcGcsciov8YPOamQ4cOSE5Ozjf9wYMHqF27domEIiIq73q2csPMbo0BAEv+uYIlf1+WORERPWZwubl27RrU6vzXecjOzsbNmzdLJBQRUUXQ398Dn3V5NN7w233RWHX4qsyJiAgw4GOp3bt3a//9559/wtbWVvu1Wq3G/v374eHhUaLhiIjKuw9fqoOsXA3mh0Tjq9+jYGqiQH9/D7ljEVVqepeb7t27AwAkScKAAQN0vqdSqeDh4YF58+aVaDgioopg1Mt1kZWrxrIDMZi66zyUCgl9/Yoem0hEpUfvcqPRPLoip6enJ06ePAkHB4dSC0VEVJFIkoSJnRogTyOw8tBVTNlxDiYKCb1a1ZI7GlGlZPDZUrGxsaWRg4ioQpMkCZM7N0SeWmDN0VhM2n4WSoUC7/i6yh2NqNLRq9wsWrQIH374IczNzbFo0aIi5x09enSJBCMiqmgkScIXb3hBrdHgx2PXMXHbaSgVwFs+LDhEZUkSetwEytPTE2FhYahWrRo8PT0LfzJJwtWr5ftsgdTUVNja2iIlJQU2NjZyxyEiIySEwOc7z2HDv3FQSMCC3j54s1lNuWMRVWiGvH/rdeTmyY+i+LEUEVHRJEnCl928odYIbD4Zj7FbIqGUJLze1FnuaESVgsHXuTl48GBp5CAiMioKhYSv32qCd3xdodYIfLw5An+cTZA7FlGlYHC56dixI2rVqoVJkybh7NmzpZGJiMgoKBQSvunRFG/7uCBPIzByEwsOUVkwuNzcunULn3zyCQ4fPoxmzZqhadOmmDNnDm7cuFEa+YiIKjSlQsLcd5vhbR8XqP8rOHtYcIhKlV4DigsTGxuLjRs3YtOmTbh48SJeeukl/P333yWZr8RxQDERyUGtEZi49TS2R9yEUiFhUW8fjsEhMoAh79/PVW6AR7de+OOPP/DFF1/gzJkzBd53qjxhuSEiuag1AhO3ncb2U48KzsLezfFGU55FRaQPQ96/Df5Y6rGjR49i+PDhcHZ2Rp8+fdC4cWP89ttvxX06IiKjp1RImPtOM/Ro8XiQcSR+PX1L7lhERsfgKxR/9tln2LRpE27duoVXX30VCxYsQPfu3WFpaVka+YiIjIpSIWHOO00hScC28BsYsyUSANCV18EhKjEGl5sDBw5gwoQJ6NWrF+8vRURUDMr/zqKSAGwNv4GPN0dAIwS6NXeROxqRUTC43ISGhpZGDiKiSuVxwQEeFZyxWyKRqxa8FxVRCSjWmJuffvoJbdq0Qc2aNXH9+nUAwIIFC7Br164SDUdEZMweXwfnvdZu0Ahg4rbT2HwiTu5YRBWeweVm+fLlGDduHLp06YIHDx5oz46ys7PDggULSjofEZFRUygkzOreBP393SEEMGn7Wfx07JrcsYgqNIPLzeLFi/HDDz9gypQpUCqV2uktW7bkFYuJiIpBoZAw483GGPLioxsTf7HrPFYf4X38iIrL4HITGxsLHx+ffNPNzMyQkZFRIqGIiCobSZLw+eteGNa+DgDgy98uYPmBGJlTEVVMBpcbT09PREZG5pv+xx9/oFGjRiWRiYioUpIkCZ90aoCPX6kHAPhm70Us2n9Z5lREFY/BZ0tNnDgRI0aMQFZWFoQQOHHiBDZt2oTZs2dj1apVpZGRiKjSkCQJYzvWh6mJAnP/vIT5IdHIydNgfGB9SJIkdzyiCsHgcjNo0CDk5eXhk08+QWZmJvr06QMXFxcsXLgQvXv3Lo2MRESVzogOdaFSSvh6z0Us+ecKHuaq8fnrXiw4RHowqNzk5eVhw4YN6Nq1K4YOHYqkpCRoNBo4OjqWVj4iokrrw5fqwMxEiWm7Hw0wzsxR46vu3lAqWHCIimLQmBsTExMMGzYM2dnZAAAHBwcWGyKiUjQgwANz3mkKhQRsOhGH8b9EIk+tkTsWUblm8IBiPz8/RER
ElEYWIiIqQM+WbljY2wcmCgk7I29hxMZTyM5Tyx2LqNwyuNwMHz4c48ePx5IlS3Ds2DGcOXNG52GoZcuWwdPTE+bm5vD19cXhw4eLnD87OxtTpkyBu7s7zMzMUKdOHaxZs8bg9RIRVSRdm9XEivd9YWqiwJ/n7+DD9eF4mMOCQ1QQSQghDFlAocjfhyRJghACkiRpr1isjy1btqBfv35YtmwZ2rRpg++//x6rVq3ChQsXUKtWrQKX6datG+7cuYOvvvoKdevWRWJiIvLy8hAQEKDXOlNTU2Fra4uUlBTY2NjonZWIqDw4cjkJQ9eH4WGuGn6e9lg9sBWszQw+N4SowjHk/dvgcvP4XlKFcXd31/u5/Pz80KJFCyxfvlw7zcvLC927d8fs2bPzzb9371707t0bV69ehb29vf6hn8ByQ0QVXdi1exi09iTSsvPQ3M0OPw5qDVtLldyxiEqVIe/fBn8s5e7uXuRDXzk5OQgPD0dgYKDO9MDAwELvPL579260bNkSc+bMgYuLC+rXr48JEybg4cOHha4nOzsbqampOg8iooqspYc9Ng59AXaWKkTGP0CvlceQmJYldyyicsPgcpOcnKz9d3x8PKZOnYqJEyc+c6zM05KSkqBWq+Hk5KQz3cnJCbdv3y5wmatXr+LIkSM4d+4cduzYgQULFmDbtm0YMWJEoeuZPXs2bG1ttQ83NzeDchIRlUdNXG2x5UN/OFYxw8XbaXh3xTHE38uUOxZRuaB3uTl79iw8PDzg6OiIhg0bIjIyEq1atcJ3332HlStXokOHDti5c6fBAZ6+INXjsTsF0Wg0kCQJGzZsQOvWrdGlSxfMnz8f69atK/TozeTJk5GSkqJ9xMfHG5yRiKg8alCjCrYFBcDN3gLXkzPxzopQXL6TJncsItnpXW4++eQTNGnSBAcPHkT79u3xxhtvoEuXLkhJScH9+/fx0Ucf4f/+7//0XrGDgwOUSmW+ozSJiYn5juY85uzsDBcXF9ja2mqneXl5QQiBGzduFLiMmZkZbGxsdB5ERMaiVjVLbAsKQAOnKriTmo13vz+G0/EP5I5FJCu9y83Jkycxa9YsvPjii/j2229x69YtDB8+HAqFAgqFAqNGjcLFixf1XrGpqSl8fX0REhKiMz0kJKTQM5/atGmDW7duIT09XTstOjoaCoUCrq6ueq+biMiYONmYY8tHL6C5mx0eZOaizw/HERqTJHcsItnoXW7u3buHGjVqAACsra1hZWWlc8ZS1apVkZZm2OHQcePGYdWqVVizZg2ioqIwduxYxMXFISgoCMCjj5T69++vnb9Pnz6oVq0aBg0ahAsXLuDQoUOYOHEiBg8eDAsLC4PWTURkTOwsTbHhAz+0qVsNGTlqDFx7EvvOFzx+kcjYGTSg+OmxMM97A7devXphwYIFmDlzJpo3b45Dhw5hz5492rOuEhISEBcXp53f2toaISEhePDgAVq2bIm+ffuia9euWLRo0XPlICIyBlZmJlgzsBU6NXZCTp4GwzacQnB4wR/ZExkzva9zo1Ao0LlzZ5iZmQEAfv31V7z88suwsrIC8OiU67179xp0ET858Do3RGTs8tQaTNp+Ftv+KzZfvNEIQ170lDkV0fMplYv4DRo0SK+Vr127Vq/55MJyQ0SVgUYjMGtPFFYfiQUADGtfB590avDcR9yJ5FKqVyiu6FhuiKiyEEJg+cEYzNl7CQDQs6Urvn6rCUyUBl/ijEh2pXqFYiIiqhgkScLw9nXxTY8mUEjAL2E3EPTzKWTllu/hA0TPi+WGiMjI9WpVC9/3awkzEwX+irqDfqv/RUpmrtyxiEoNyw0RUSXQsZETfhrihyrmJjh57T56fn8Mt1N4PyoyTiw3RESVRGtPe2wNenQ/qkt30tBjeShi7qY/e0GiCoblhoioEmlYwwbBwwJQ28EKNx88xLsrjiGSt2sgI8NyQ0RUybjZW2JrkD+autriXkYO3lt5HH9fvCN3LKISw3JDRFQ
JVbM2w8ahL6BtPQc8zFVj6PpwbDkZ9+wFiSoAlhsiokrK+r/bNfRo4Qq1RuDT4LP4LiQalezyZ2SEWG6IiCoxlVKBb99tilEv1wUALNx/GZOCzyJPrZE5GVHxsdwQEVVykiRhfGADzHrLGwoJ2BIWj6Hrw5CZkyd3NKJiYbkhIiIAQF8/d3zfryXMVQr8c+kueq88jqT0bLljERmM5YaIiLQ6NnLCxqEvoKqlCmdupKDH8lBcS8qQOxaRQVhuiIhIR4taVRE8LABu9ha4npyJt5eHIvz6fbljEemN5YaIiPKpXd0a24e1QROXR9fC6fPDcew5myB3LCK9sNwQEVGBqlcxw+YPX8CrXo7IztNg+IZT+P5gDE8Vp3KP5YaIiAplZWaC7/u1xMAADwDA7D8u4vOd53iqOJVrLDdERFQkpULC9DcbY+objSBJwIZ/4zB0fRjSs3mqOJVPLDdERKSXwS96YsX7vtpTxXuuOIbbKVlyxyLKh+WGiIj01qlxDWz+0B8O1qa4kJCK7kuP4sKtVLljEelguSEiIoM0d7PDjuFtUNfRGrdTs/DuilAcuJQodywiLZYbIiIymJu9JYKDAuBfuxoyctQYvO4k1h+7JncsIgAsN0REVEy2lir8OLg1erRwhUYAU3edx7RdPJOK5MdyQ0RExWZq8uiu4p++1hAA8OOx6xj8YxhSs3JlTkaVGcsNERE9F0mSMKx9Hax43xcWKiUORd9Fj2WhiEvOlDsaVVIsN0REVCJe866BrUH+cLIxw+XEdHRfdhQnr92TOxZVQiw3RERUYrxdbLFrxIvwdrHBvYwc9P3hXwSH35A7FlUyLDdERFSiatia45eP/PFa4xrIUWswfutpzP3zIjQa3pOKygbLDRERlThLUxMs69sCw9vXAQAs/ScGwzecQmYOb9lApY/lhoiISoVCIeGT1xri23ebQaWUsPf8bfRYfgw37nOgMZUulhsiIipV7/i6YtPQF+BgbYqohFR0W3IUJ2I50JhKD8sNERGVupYe9tg18kU0rmmD5Iwc9F11HJtOxMkdi4wUyw0REZUJFzsLbAsKwOtNnZGrFpi8/Sym7TqHXF7RmEoYyw0REZUZC1MllrzngwmB9QE8uqLxgDUncD8jR+ZkZExYboiIqExJkoSRL9fD9/18YWmqRGhMMrovO4roO2lyRyMjwXJDRESy6NS4BrYPD4BrVQtcT87EW0uP4q8Ld+SORUaA5YaIiGTTsIYNdo98EX6e9sjIUWPoT2FYvP8yL/hHz4XlhoiIZGVvZYqfP/BDvxfcIQQwLyQaQT+HI413FqdiYrkhIiLZqZQKfNndG9/0aAJTpQL7LtxB96VHEXM3Xe5oVAGx3BARUbnRq1Ut/BLkjxo25oi5m4HuS44ihONwyEAsN0REVK40d7PDr6NeRGsPe6Rl52Ho+jAs+Cua43BIb7KXm2XLlsHT0xPm5ubw9fXF4cOHC533wIEDkCQp3+PixYtlmJiIiEpb9Spm2DDUDwP83QEAC/66jA9/Ckcqx+GQHmQtN1u2bMGYMWMwZcoUREREoG3btujcuTPi4oq+JPelS5eQkJCgfdSrV6+MEhMRUVlRKRWY0c0bc99pClMTBf6KejQO50oix+FQ0SQhhGzH+fz8/NCiRQssX75cO83Lywvdu3fH7Nmz881/4MABdOjQAffv34ednV2x1pmamgpbW1ukpKTAxsamuNGJiKgMnbnxAB/9FI6ElCxYm5lgXs9m6NS4htyxqAwZ8v4t25GbnJwchIeHIzAwUGd6YGAgQkNDi1zWx8cHzs7OeOWVV/DPP/8UOW92djZSU1N1HkREVLE0dX00DsfP0x7p2Xn46Kdw/N8fF5HH+1JRAWQrN0lJSVCr1XByctKZ7uTkhNu3bxe4jLOzM1auXIng4GBs374dDRo0wCuvvIJDhw4Vup7Zs2fD1tZW+3BzcyvR10FERGXDwdoMP3/gh8FtPAEAKw7GoN/qE7ibli1zMip
vZPtY6tatW3BxcUFoaCj8/f2102fNmoWffvpJ70HCXbt2hSRJ2L17d4Hfz87ORnb2/37xU1NT4ebmxo+liIgqsN/O3MKn284gI0cNJxszLOvbAr7u9nLHolJUIT6WcnBwgFKpzHeUJjExMd/RnKK88MILuHz5cqHfNzMzg42Njc6DiIgqtjea1sSukW1Q19Ead1Kz0ev741h7NBYyDiOlckS2cmNqagpfX1+EhIToTA8JCUFAQIDezxMREQFnZ+eSjkdEROVcXccq2DWiDd5o6ow8jcCMXy9g9OZIZGTnyR2NZGYi58rHjRuHfv36oWXLlvD398fKlSsRFxeHoKAgAMDkyZNx8+ZNrF+/HgCwYMECeHh4oHHjxsjJycHPP/+M4OBgBAcHy/kyiIhIJlZmJlj8ng983ati1u9R+PX0LUQlpGLF+76o62gtdzySiazlplevXkhOTsbMmTORkJAAb29v7NmzB+7ujy7alJCQoHPNm5ycHEyYMAE3b96EhYUFGjdujN9//x1dunSR6yUQEZHMJEnCoDaeaOJiixEbT+FKYjq6LTmCOe80w+tNeWS/MpL1Ojdy4HVuiIiM1920bIzadArHr94DAAxu44lJnRvC1ET2C/LTc6oQA4qJiIhKWvUqZvh5iB+C2tUBAKw5Got3vz+G+HuZMiejssRyQ0RERsVEqcCkzg2xqn9L2FqocDr+AV5fdJh3F69EWG6IiMgovdrICb+PfhHN3OyQmvXo7uKzfr+AXF7V2Oix3BARkdFyrWqJrR/544MXH13V+IfDsej5/THcfPBQ5mRUmlhuiIjIqJmaKPD5G42wsp8vbMxNEBH3AF0WHsb+KH5MZaxYboiIqFIIbFwDv49ui2autkh5mIshP4Zh9p4ofkxlhFhuiIio0nCzt8TWoAAMauMBAPj+0FX0Xnkct/gxlVFhuSEiokrF1ESBaV0bY8X7vqhiboLw6/fRhWdTGRWWGyIiqpRe866B30e1RVNXWzzIzMXQ9WGYtuscsnLVckej58RyQ0RElVatapbYFhSAoW0fnU3147HreGtZKK4kpsucjJ4Hyw0REVVqpiYKTHm9EdYOaoVqVqaISkhF18VHsOVkHCrZHYqMBssNERERgA4NHPHHmLZ4sa4DHuaq8WnwWYzaFIHUrFy5o5GBWG6IiIj+41jFHOsHt8anrzWEiULCb2cS8Pqiw4iIuy93NDIAyw0REdETFAoJw9rXwdYgf7jZWyD+3kO8u+IYlh24Ao2GH1NVBCw3REREBfCpVRW/j26LN5o6I08jMGfvJfRfcwKJqVlyR6NnYLkhIiIqhI25Covf88GcHk1hoVLiyJUkvLbwMPadvy13NCoCyw0REVERJElCz1Zu+HVUGzRytsG9jBx8+FM4Jm8/g8ycPLnjUQFYboiIiPRQ17EKdowIwEcv1YYkAZtOxOP1RUdwOv6B3NHoKSw3REREejIzUWJyFy9s+MAPzrbmiE3KQI/loVjy92WoOdi43GC5ISIiMlBAHQfs/fglvP7fYONv90Wj1/fHEH8vU+5oBJYbIiKiYrG1VGHJez6Y37MZrM1MEHb9PjovPIztp27wysYyY7khIiIqJkmS8HYLV/zxcVu0dK+K9Ow8jPvlNEZuikBKJq9sLBeWGyIioufkZm+JzR++gAmB9WGikPD7mQS8tvAQQq8kyR2tUmK5ISIiKgEmSgVGvlwPwcMC4OlghYSULPRZ9S+m7z6PhzlqueNVKiw3REREJaiZmx1+G/Ui+vjVAgCsC73G+1OVMZYbIiKiEmZlZoKv32qCdYNawcnGDFf/O2X82z8vISdPI3c8o8dyQ0REVEraN3DEvjHt0L15TWgEsOSfK+i+9Cgu3k6VO5pRY7khIiIqRbaWKizo7YNlfVugqqUKFxJS0XXxESw/EMML/5USlhsiIqIy0KWJM/aNbYdXvZyQqxb4Zu9F9Pz+GK4lZcgdzeiw3BAREZWR6lXM8EN/X3z7bjNUMTN
B+H8X/lt/7Bo0PIpTYlhuiIiIypAkSXjH1xV7x76ENnWr4WGuGlN3nUf/NSdw88FDueMZBZYbIiIiGbjYWeCnwX6Y8WZjmKsUOHIlCYHzD+Ln49d5FOc5sdwQERHJRKGQMCDAA3tGt0Urj6rIyFHj853n0HfVv4hL5k04i4vlhoiISGa1q1tjy4f+mNa1ESxUShy7moxOCw5h3dFYHsUpBpYbIiKickChkDCojSf2jmmLF2rb42GuGtN/vYDeK48jlmdUGYTlhoiIqBxxr2aFjR+8gC+7e8PKVIkT1+7htQWHsOrwVV4XR08sN0REROWMQiGh3wvu+HPsS2hbzwHZeRp89XsU3lkRiiuJaXLHK/dYboiIiMop16qWWD+4Nf7v7SaoYmaCiLgH6LLoCJYduII8Ne9RVRiWGyIionJMkiT0bl0L+8a9hA4NqiMnT4M5ey/h7eWhuHCL96gqCMsNERFRBeBsa4E1A1th3rvNYGNugjM3UtB1yRF8s/cisnLVcscrV1huiIiIKghJktDD1xV/jWuHLk1qQK0RWH4gBp0WHMLRK0lyxys3WG6IiIgqGEcbcyzr64sf+rdEDRtzXE/ORN9V/2LC1tO4n5EjdzzZyV5uli1bBk9PT5ibm8PX1xeHDx/Wa7mjR4/CxMQEzZs3L92ARERE5VTHRk4IGfcS+vu7Q5KAbeE38Or8g9gVeRNCVN7TxmUtN1u2bMGYMWMwZcoUREREoG3btujcuTPi4uKKXC4lJQX9+/fHK6+8UkZJiYiIyqcq5irM7OaNbUEBqO9kjeSMHHy8ORKD1p3EjfuV8xYOkpCx2vn5+aFFixZYvny5dpqXlxe6d++O2bNnF7pc7969Ua9ePSiVSuzcuRORkZF6rzM1NRW2trZISUmBjY3N88QnIiIqV3LyNFhxMAZL/r6CHLUGlqZKjA9sgIEBHlAqJLnjPRdD3r9lO3KTk5OD8PBwBAYG6kwPDAxEaGhoocutXbsWMTExmDZtml7ryc7ORmpqqs6DiIjIGJmaKDD6lXrY83FbtPawR2aOGl/+dgFvLztaqU4bl63cJCUlQa1Ww8nJSWe6k5MTbt++XeAyly9fxqRJk7BhwwaYmJjotZ7Zs2fD1tZW+3Bzc3vu7EREROVZXUdrbP7wBcx+uwmqmJvg9H+njc/eE4XMnDy545U62QcUS5LuYTIhRL5pAKBWq9GnTx/MmDED9evX1/v5J0+ejJSUFO0jPj7+uTMTERGVdwqFhPda18L+J04b//7QVbw67yD+PH/bqAcc63f4oxQ4ODhAqVTmO0qTmJiY72gOAKSlpSEsLAwREREYOXIkAECj0UAIARMTE+zbtw8vv/xyvuXMzMxgZmZWOi+CiIionHt82vj+qDuYtvs8btx/iI9+CscrDR0x/c3GcLO3lDtiiZPtyI2pqSl8fX0REhKiMz0kJAQBAQH55rexscHZs2cRGRmpfQQFBaFBgwaIjIyEn59fWUUnIiKqcF7xckLI2HYY0aEOVEoJ+y8mouN3B7H0nyvIyTOu+1TJduQGAMaNG4d+/fqhZcuW8Pf3x8qVKxEXF4egoCAAjz5SunnzJtavXw+FQgFvb2+d5R0dHWFubp5vOhEREeVnYarExE4N8ZaPCz7feQ7Hr97D3D8vYUfETXzZzRv+darJHbFEyFpuevXqheTkZMycORMJCQnw9vbGnj174O7uDgBISEh45jVviIiIyDB1Hatg09AXsCPiJmb9HoUriel474fjeNvHBZ+97gUH64o9nEPW69zIgde5ISIi+p+UzFzM+fMiNp6IgxCAjbkJPnmtIfq0rgVFObo2jiHv3yw3REREhMj4B5iy4yzO/3c9nGZudpjV3RveLrYyJ3uE5aYILDdEREQFy1Nr8NPx65i3Lxrp2XlQSMD7L7hjXMf6sLM0lTVbhbhCMREREZUvJkoFBrXxxN/j26Frs5rQCGD9sevo8O0BbPw3DmpNxTgewiM3REREVKDQK0m
Y/ut5RN9JBwB4u9hgxpve8HWvWuZZ+LFUEVhuiIiI9Jer1uCnY9fx3V/RSMt6dOuGt1u4YFLnhnCsYl5mOVhuisByQ0REZLik9GzM2XsRv4TdAABYm5ng41fqYUCAB0xNSn+UC8tNEVhuiIiIii8y/gGm7TqH0zdSAAB1qlth+puN0bZe9VJdL8tNEVhuiIiIno9GI7At/Aa+2XsRyRk5AIBOjZ3w+euNSu1eVSw3RWC5ISIiKhkpD3Ox4K9orD92HWqNgJmJAkHt6mBY+zowVylLdF0sN0VguSEiIipZl26nYfru8zh2NRkA4GJngR0jAkp0wDGvc0NERERlpkGNKtg41A9L+7RATVtz1K5uheoy3p9K1htnEhERkXGQJAmvN3XGyw0dkZqVC0mS775ULDdERERUYixMlbAwLdnxNobix1JERERkVFhuiIiIyKiw3BAREZFRYbkhIiIio8JyQ0REREaF5YaIiIiMCssNERERGRWWGyIiIjIqLDdERERkVFhuiIiIyKiw3BAREZFRYbkhIiIio8JyQ0REREal0t0VXAgBAEhNTZU5CREREenr8fv24/fxolS6cpOcnAwAcHNzkzkJERERGSotLQ22trZFzlPpyo29vT0AIC4u7pkbpzJITU2Fm5sb4uPjYWNjI3cc2XF7/A+3hS5uD13cHv/DbaGrtLaHEAJpaWmoWbPmM+etdOVGoXg0zMjW1pa/hE+wsbHh9ngCt8f/cFvo4vbQxe3xP9wWukpje+h7UIIDiomIiMiosNwQERGRUal05cbMzAzTpk2DmZmZ3FHKBW4PXdwe/8NtoYvbQxe3x/9wW+gqD9tDEvqcU0VERERUQVS6IzdERERk3FhuiIiIyKiw3BAREZFRYbkhIiIio2KU5WbZsmXw9PSEubk5fH19cfjw4SLnz87OxpQpU+Du7g4zMzPUqVMHa9asKaO0pc/Q7bFhwwY0a9YMlpaWcHZ2xqBBg7S3rajIDh06hK5du6JmzZqQJAk7d+585jIHDx6Er68vzM3NUbt2baxYsaL0g5YRQ7fH9u3b0bFjR1SvXh02Njbw9/fHn3/+WTZhS1lxfjceO3r0KExMTNC8efNSy1fWirM9jHk/WpztYaz70dmzZ6NVq1aoUqUKHB0d0b17d1y6dOmZy5X1vtToys2WLVswZswYTJkyBREREWjbti06d+6MuLi4Qpfp2bMn9u/fj9WrV+PSpUvYtGkTGjZsWIapS4+h2+PIkSPo378/hgwZgvPnz2Pr1q04efIkPvjggzJOXvIyMjLQrFkzLFmyRK/5Y2Nj0aVLF7Rt2xYRERH47LPPMHr0aAQHB5dy0rJh6PY4dOgQOnbsiD179iA8PBwdOnRA165dERERUcpJS5+h2+KxlJQU9O/fH6+88kopJZNHcbaHMe9HDd0exrwfPXjwIEaMGIHjx48jJCQEeXl5CAwMREZGRqHLyLIvFUamdevWIigoSGdaw4YNxaRJkwqc/48//hC2trYiOTm5LOKVOUO3x9y5c0Xt2rV1pi1atEi4urqWWkY5ABA7duwocp5PPvlENGzYUGfaRx99JF544YVSTCYPfbZHQRo1aiRmzJhR8oFkZMi26NWrl/j888/FtGnTRLNmzUo1l1z02R7Gvh99kj7bo7LsR4UQIjExUQAQBw8eLHQeOfalRnXkJicnB+Hh4QgMDNSZHhgYiNDQ0AKX2b17N1q2bIk5c+bAxcUF9evXx4QJE/Dw4cOyiFyqirM9AgICcOPGDezZswdCCNy5cwfbtm3D66+/XhaRy5Vjx47l23adOnVCWFgYcnNzZUpVfmg0GqSlpWlvRlvZrF27FjExMZg2bZrcUWRnzPvR4qhM+9GUlBQAKHI/IMe+1KhunJmUlAS1Wg0nJyed6U5OTrh9+3aBy1y9ehVHjhyBubk5duzYgaSkJAwfPhz37t2r8J8XF2d7BAQEYMOGDejVqxeysrKQl5eHN998E4sXLy6LyOXK7du
3C9x2eXl5SEpKgrOzs0zJyod58+YhIyMDPXv2lDtKmbt8+TImTZqEw4cPw8TEqHajxWLM+9HiqCz7USEExo0bhxdffBHe3t6FzifHvtSojtw8JkmSztdCiHzTHtNoNJAkCRs2bEDr1q3RpUsXzJ8/H+vWrTOa/3UYsj0uXLiA0aNHY+rUqQgPD8fevXsRGxuLoKCgsoha7hS07QqaXtls2rQJ06dPx5YtW+Do6Ch3nDKlVqvRp08fzJgxA/Xr15c7TrlQGfajhqgs+9GRI0fizJkz2LRp0zPnLet9qVH9l8PBwQFKpTLfUYnExMR8rfExZ2dnuLi46NxG3cvLC0II3LhxA/Xq1SvVzKWpONtj9uzZaNOmDSZOnAgAaNq0KaysrNC2bVt89dVXlepoRY0aNQrcdiYmJqhWrZpMqeS3ZcsWDBkyBFu3bsWrr74qd5wyl5aWhrCwMERERGDkyJEAHr25CyFgYmKCffv24eWXX5Y5Zdky5v1ocVSG/eioUaOwe/duHDp0CK6urkXOK8e+1KiO3JiamsLX1xchISE600NCQhAQEFDgMm3atMGtW7eQnp6unRYdHQ2FQvHMH1h5V5ztkZmZCYVC99dCqVQC+F/Triz8/f3zbbt9+/ahZcuWUKlUMqWS16ZNmzBw4EBs3LjRKMcP6MPGxgZnz55FZGSk9hEUFIQGDRogMjISfn5+ckcsc8a8Hy0OY96PCiEwcuRIbN++HX///Tc8PT2fuYws+9JSG6osk82bNwuVSiVWr14tLly4IMaMGSOsrKzEtWvXhBBCTJo0SfTr1087f1pamnB1dRXvvPOOOH/+vDh48KCoV6+e+OCDD+R6CSXK0O2xdu1aYWJiIpYtWyZiYmLEkSNHRMuWLUXr1q3legklJi0tTURERIiIiAgBQMyfP19ERESI69evCyHyb4urV68KS0tLMXbsWHHhwgWxevVqoVKpxLZt2+R6CSXK0O2xceNGYWJiIpYuXSoSEhK0jwcPHsj1EkqModviacZ2tpSh28PY96OGbg9j3o8OGzZM2NraigMHDujsBzIzM7XzlId9qdGVGyGEWLp0qXB3dxempqaiRYsWOqeoDRgwQLRr105n/qioKPHqq68KCwsL4erqKsaNG6fzg6roDN0eixYtEo0aNRIWFhbC2dlZ9O3bV9y4caOMU5e8f/75RwDI9xgwYIAQouBtceDAAeHj4yNMTU2Fh4eHWL58edkHLyWGbo927doVOX9FVpzfjScZW7kpzvYw5v1ocbaHse5HC9oOAMTatWu185SHfan0X1giIiIio2BUY26IiIiIWG6IiIjIqLDcEBERkVFhuSEiIiKjwnJDRERERoXlhoiIiIwKyw0REREZFZYbIiIiMiosN1Soa9euQZIkREZGlup6MjMz0aNHD9jY2ECSJDx48EDvZSVJws6dO0s0z7p162BnZ1eiz/k8PDw8sGDBghJ/3qNHj6JJkyZQqVTo3r273suVt+0jhMCHH34Ie3v7Mvl9NTZl9XdeVqZPn47mzZtrvx44cOAzf7/bt2+PMWPGlGouKlssNxXcwIEDIUkSJEmCiYkJatWqhWHDhuH+/fsGP8/TOwA3NzckJCTA29u7BBPn9+OPP+Lw4cMIDQ1FQkKCzp2FH3t6h2WMyro0jBs3Ds2bN0dsbCzWrVtX4DylVaxK0t69e7Fu3Tr89ttvZfL7KoeSKvH6vNEbm4ULFxb6+11cBw4cMPg/YoUprb+xivC3W5pM5A5Az++1117D2rVrkZeXhwsXLmDw4MF48OABNm3a9FzPq1QqUaNGjRJKWbiYmBh4eXkZ5ZtSeRYTE4OgoKBye9fmnJwcmJqaPnO+mJgYODs7F3qne30IIaBWq2Fiwl2isSnoP0tUCZTqnauo1A0YMEB069ZNZ9q4ceOEvb299uu8vDwxePBg4eHhIczNzUX9+vXFggULtN+fNm1avpug/fPPPyI2NlYAEBEREdp5Dxw4IFq1aiVMTU1
FjRo1xKeffipyc3OLzLht2zbRqFEjYWpqKtzd3cW3336r/d7TN2Ms6OaEa9euLfQmbQDEDz/8ILp37y4sLCxE3bp1xa5du3SWP3/+vOjcubOwsrISjo6O4v333xd3794tNO/atWuFra2tzrTdu3eLFi1aCDMzM+Hp6SmmT5+u87r1ybFr1y5Rt25dYW5uLtq3by/WrVsnAIj79+8XeGO+adOmCSGEcHd3F7NmzRKDBg0S1tbWws3NTXz//fdFbvOsrCwxatQoUb16dWFmZibatGkjTpw4IYQQ2p9rQdvzSQXdKPPJ7bN3717RsGFDYWVlJTp16iRu3bqls/yaNWtEw4YNhZmZmWjQoIFYunRpkZnbtWsnRowYIcaOHSuqVasmXnrpJSFE0T+/AQMG6ORzd3cXQgih0WjEN998Izw9PYW5ublo2rSp2Lp1q3Zdj7f33r17ha+vr1CpVOLvv//We7m//vpL+Pr6CgsLC+Hv7y8uXryY72ft6+srzMzMRLVq1cRbb72l/V52draYOHGiqFmzprC0tBStW7cW//zzT6Hbxd3dvcDXKIQQy5YtE7Vr1xYqlUrUr19frF+/vtDnedbfeXBwsGjfvr2wsLAQTZs2FaGhoTrLHz16VLRt21aYm5sLV1dXMWrUKJGenl7o+p61HX766Sfh6+srrK2thZOTk3jvvffEnTt3DN7Ws2fPFo6OjsLa2loMHjxYfPrppzo3MX16H5meni769esnrKysRI0aNcS3334r2rVrJz7++GO9shX09/P4BprP+v15WmF/Y8/a3j/++KOwsrIS0dHR2vlHjhwp6tWrJ9LT04t83sqi8r1iI/P0H25MTIxo1KiRcHJy0k7LyckRU6dOFSdOnBBXr14VP//8s7C0tBRbtmwRQgiRlpYmevbsKV577TXt7euzs7PzlZsbN24IS0tLMXz4cBEVFSV27NghHBwctG/CBQkLCxMKhULMnDlTXLp0Saxdu1ZYWFho30yTk5PF0KFDhb+/v0hISBDJycn5niMzM1OMHz9eNG7cWJvv8d2GAQhXV1exceNGcfnyZTF69GhhbW2tfZ5bt24JBwcHMXnyZBEVFSVOnTolOnbsKDp06FBo5qfLzd69e4WNjY1Yt26diImJEfv27RMeHh5i+vTp2nmelSM2NlaoVCoxYcIEcfHiRbFp0ybh4uKiLTfZ2dliwYIFwsbGRvsa09LShBCP3tzs7e3F0qVLxeXLl8Xs2bOFQqEQUVFRhb6G0aNHi5o1a4o9e/aI8+fPiwEDBoiqVauK5ORkkZeXJxISEoSNjY1YsGCBzvZ8UnJysnB1dRUzZ87UZnq8fVQqlXj11VfFyZMnRXh4uPDy8hJ9+vTRLrty5Urh7OwsgoODxdWrV0VwcLCwt7cX69atKzRzu3bthLW1tZg4caK4ePGiiIqKeubP78GDB2LmzJnC1dVVJCQkiMTERCGEEJ999plo2LCh2Lt3r4iJiRFr164VZmZm4sCBA0KI/71xNm3aVOzbt09cuXJFJCUl6b2cn5+fOHDggDh//rxo27atCAgI0L6O3377TSiVSjF16lRx4cIFERkZKWbNmqX9fp8+fURAQIA4dOiQuHLlipg7d64wMzPTeaN6UmJioraAPvkat2/fLlQqlVi6dKm4dOmSmDdvnlAqleLvv/8u8Hme9XfesGFD8dtvv4lLly6Jd955R7i7u2sL/JkzZ4S1tbX47rvvRHR0tDh69Kjw8fERAwcOLPTn+aztsHr1arFnzx4RExMjjh07Jl544QXRuXNn7ff12dZbtmwRpqam4ocffhAXL14UU6ZMEVWqVCmy3AwbNky4urqKffv2iTNnzog33nhDWFtb65SborLl5eWJ4OBgAUBcunRJJCQkiAcPHgghnv1797TC/sb02d7vvvuuaNWqlcjNzRV//PGHUKlU2v/AFPa8lQnLTQU3YMAAoVQqhZWVlTA3N9e29Pnz5xe53PDhw0WPHj10nufpI0B
Pl5vPPvtMNGjQQGg0Gu08S5cuFdbW1kKtVhe4nj59+oiOHTvqTJs4caJo1KiR9uuPP/64wCM2T5o2bZrODusxAOLzzz/Xfp2eni4kSRJ//PGHEEKIL774QgQGBuosEx8fr90xFeTpctO2bVvx9ddf68zz008/CWdnZ71zfPrpp8Lb21vnOaZMmaItNwWt9zF3d3fx/vvva7/WaDTC0dFRLF++vMD86enpQqVSiQ0bNmin5eTkiJo1a4o5c+Zop9na2hZ4xObpdX/33Xc60x4fSbty5Yp22tKlS3UKtZubm9i4caPOcl9++aXw9/cvdF3t2rUTzZs315mmz8/vu+++0zmakZ6eLszNzfMdeRgyZIh47733hBD/e+PcuXNnsZb766+/tN///fffBQDx8OFDIYQQ/v7+om/fvgW+xitXrghJksTNmzd1pr/yyiti8uTJBW8Y8ej3a8eOHTrTAgICxNChQ3Wmvfvuu6JLly6FPk9Rf+erVq3STjt//rwAoC3Q/fr1Ex9++KHOcocPHxYKhUL7up9W1HYoyIkTJwQAbanXd1sHBQXpPI+fn1+h5SYtLU2YmpqKzZs3a7+fnJwsLCwsdMqNvtke/+0Kod/vT0EK+hvTZ3vfu3dPuLq6imHDhgknJyfx1VdfPfN5KxN+wGwEOnTogOXLlyMzMxOrVq1CdHQ0Ro0apTPPihUrsGrVKly/fh0PHz5ETk6OwQN0o6Ki4O/vD0mStNPatGmD9PR03LhxA7Vq1SpwmW7duulMa9OmDRYsWAC1Wg2lUmlQhoI0bdpU+28rKytUqVIFiYmJAIDw8HD8888/sLa2zrdcTEwM6tev/8znDw8Px8mTJzFr1iztNLVajaysLGRmZsLS0vKZOS5duoRWrVrpPG/r1q2L9RolSUKNGjW0z13Q68rNzUWbNm2001QqFVq3bo2oqCi911kUS0tL1KlTR/u1s7OzNs/du3cRHx+PIUOGYOjQodp58vLynjn+oWXLljpfF+fnd+HCBWRlZaFjx44603NycuDj41Po+gxZ7smfh7OzMwAgMTERtWrVQmRkpM7rftKpU6cghMiXOzs7G9WqVStwmcJERUXhww8/1JnWpk0bLFy40KDneayw19SwYUOEh4fjypUr2LBhg3YeIQQ0Gg1iY2Ph5eWV7/mK2g4AEBERgenTpyMyMhL37t2DRqMBAMTFxaFRo0bPzFWrVi1ERUUhKChI53n9/f3xzz//FLjOmJgY5OTkwN/fXzvN3t4eDRo0KFa2Jxny+/Ms+mzvqlWrYvXq1ejUqRMCAgIwadIkg9Zh7FhujICVlRXq1q0LAFi0aBE6dOiAGTNm4MsvvwQA/PLLLxg7dizmzZsHf39/VKlSBXPnzsW///5r0HqEEDrF5vE0APmm67NMSVGpVDpfS5Kk3RlpNBp07doV33zzTb7lHu8on0Wj0WDGjBl4++23833P3NxcrxzPux2Keu6nFfYzKShDcRWU5/F6H+f64Ycf4OfnpzPfs8qslZWVztfF+fk9Xv/vv/8OFxcXne+ZmZkVuj5Dlnvy9T/epo+Xt7CwKDDX43mUSiXCw8PzbYuCCtyzlOTPuKjXpNFo8NFHH2H06NH5livoPzVA0dshIyMDgYGBCAwMxM8//4zq1asjLi4OnTp1Qk5Ojt65DKXP35wh2Z5kyO/Ps+i7vQ8dOgSlUolbt24hIyMDNjY2Bq3HmLHcGKFp06ahc+fOGDZsGGrWrInDhw8jICAAw4cP184TExOjs4ypqSnUanWRz9uoUSMEBwfr7EBDQ0NRpUqVfH/MTy5z5MgRnWmhoaGoX7++QUdt9MlXkBYtWiA4OBgeHh7FPhOmRYsWuHTpkrZAFkfDhg2xZ88enWlhYWE6Xxf3NT6tbt26MDU1xZEjR9CnTx8AQG5uLsLCwgy+lkdxMjk5OcHFxQVXr15F3759DVr2acX5+TVq1AhmZmaIi4tDu3bt9F5XcZd7WtOmTbF//34MGjQo3/d
+     "image/png": "<base64-encoded PNG omitted>",
+     "text/plain": [
+      "<figure text representation omitted>"
+     ]
+    },
+    "metadata": {},
+    "output_type": "display_data"
+   }
+  ],
+  "source": [
+   "reference_length = 1\n",
+   "candidate_length = np.linspace(1.5, 0.5, 100)\n",
+   "\n",
+   "length_ratio = reference_length / candidate_length\n",
+   "BP = np.minimum(1, np.exp(1 - length_ratio))\n",
+   "\n",
+   "# Plot the data\n",
+   "fig, ax = plt.subplots(1)\n",
+   "lines = ax.plot(length_ratio, BP)\n",
+   "ax.set(\n",
+   "    xlabel=\"Ratio of the length of the reference to the candidate text\",\n",
+   "    ylabel=\"Brevity Penalty\",\n",
+   ")\n",
+   "plt.show()"
+  ]
+ },
+ {
+  "cell_type": "markdown",
+  "metadata": {},
+  "source": [
+   "### N-Gram Precision:\n",
+   "The n-gram precision counts how many n-grams (in our case unigrams, bigrams, trigrams, and four-grams, for i = 1, ..., 4) match an n-gram in the reference translations. This term acts as a precision metric: unigrams account for adequacy, while longer n-grams account for fluency of the translation. To avoid overcounting, each n-gram count is clipped to the maximum count of that n-gram occurring in the reference ($m_{n}^{ref}$). Typically, precision shows exponential decay with the order of the n-gram."
+  ]
+ },
+ {
+  "cell_type": "code",
+  "execution_count": 3,
+  "metadata": {},
+  "outputs": [
+   {
+    "data": {
+     "image/png": "<base64-encoded PNG omitted>",
+     "text/plain": [
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Mocked dataset showing the precision for different n-grams\n", + "data = {\"1-gram\": 0.8, \"2-gram\": 0.7, \"3-gram\": 0.6, \"4-gram\": 0.5}\n", + "\n", + "# Plot the datapoints defined above\n", + "fig, ax = plt.subplots(1)\n", + "bars = ax.bar(*zip(*data.items()))\n", + "ax.set(ylabel=\"N-gram precision\")\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### N-gram BLEU score:\n", + "When the n-gram precision is normalized by the brevity penalty (BP), then the exponential decay of n-grams is almost fully compensated. The BLEU score corresponds to a geometric average of this modified n-gram precision." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAGdCAYAAADuR1K7AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAyCklEQVR4nO3de1RU9f7/8deIMpgKKiR5GZHUDCU9Bl3wkt2ksFVZrbJMTYVOSppIZZp1SutEV8TqgFKax2+ldMJON9Kmm5fMSsJuVlpakA0SWOClIGD//nA5vzMBOhsGB3bPx1p7Leczn8/e7/GzXL367JvNMAxDAAAAFtHG3wUAAAD4EuEGAABYCuEGAABYCuEGAABYCuEGAABYCuEGAABYCuEGAABYCuEGAABYSlt/F3C81dbW6qefflKnTp1ks9n8XQ4AAPCCYRjav3+/evTooTZtjr4285cLNz/99JMcDoe/ywAAAI1QVFSkXr16HbXPXy7cdOrUSdLhv5zg4GA/VwMAALxRUVEhh8Ph/u/40fzlws2RU1HBwcGEGwAAWhlvLinhgmIAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGAphBsAAGApfg83mZmZioyMVFBQkGJiYrRx48aj9n/uuec0ZMgQnXDCCerevbumTJmisrKy41QtAABo6fwabnJycpSSkqL58+eroKBAI0eOVEJCggoLC+vtv2nTJk2aNEmJiYn68ssv9Z///Ecff/yxkpKSjnPlAACgpfJruElPT1diYqKSkpIUFRWljIwMORwOZWVl1dt/y5Yt6tOnj2655RZFRkZqxIgRuummm7R169bjXDkAAGip/BZuqqqqlJ+fr/j4eI/2+Ph4bd68ud4xw4YN048//qi8vDwZhqG9e/fqxRdf1CWXXNLgcSorK1VRUeGxAQAA62rrrwOXlpaqpqZG4eHhHu3h4eEqLi6ud8ywYcP03HPPady4cfr9999VXV2tyy67TE888USDx0lLS9OCBQt8WvvR9Jn7+nE7Fjx9/2DDIRcA8Nfh9wuKbTabx2fDMOq0HbF9+3bdcsst+sc//qH8/HytXbtWu3fv1rRp0xrc/
lJYWJguvPBCPfbYY/4uCz7A3FoT82pdzO3xwcoNAACwFC4oBgAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlkK4AQAAlvL/AEGIHigLKKERAAAAAElFTkSuQmCC", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Mocked dataset showing the precision multiplied by the BP for different n-grams\n", + "data = {\"1-gram\": 0.8, \"2-gram\": 0.77, \"3-gram\": 0.74, \"4-gram\": 0.71}\n", + "\n", + "# Plot the datapoints defined above\n", + "fig, ax = plt.subplots(1)\n", + "bars = ax.bar(*zip(*data.items()))\n", + "ax.set(ylabel=\"Modified N-gram precision\")\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 3. Example Calculations of the BLEU score\n", + "\n", + "In this example you will have a reference sentence and 2 candidate sentences. You will tokenize all sentences using the NLTK package. Then you will compare the two candidates to the reference using BLEU score.\n", + "\n", + "First you define and tokenize the sentences." + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "The NASA Opportunity rover is battling a massive dust storm on planet Mars. -> ['the', 'nasa', 'opportunity', 'rover', 'is', 'battling', 'a', 'massive', 'dust', 'storm', 'on', 'planet', 'mars', '.']\n", + "\n", + "\n", + "The Opportunity rover is combating a big sandstorm on planet Mars. -> ['the', 'opportunity', 'rover', 'is', 'combating', 'a', 'big', 'sandstorm', 'on', 'planet', 'mars', '.']\n", + "\n", + "\n", + "A NASA rover is fighting a massive storm on planet Mars. 
-> ['a', 'nasa', 'rover', 'is', 'fighting', 'a', 'massive', 'storm', 'on', 'planet', 'mars', '.']\n" + ] + } + ], + "source": [ + "reference = \"The NASA Opportunity rover is battling a massive dust storm on planet Mars.\"\n", + "candidate_1 = \"The Opportunity rover is combating a big sandstorm on planet Mars.\"\n", + "candidate_2 = \"A NASA rover is fighting a massive storm on planet Mars.\"\n", + "\n", + "tokenized_ref = nltk.word_tokenize(reference.lower())\n", + "tokenized_cand_1 = nltk.word_tokenize(candidate_1.lower())\n", + "tokenized_cand_2 = nltk.word_tokenize(candidate_2.lower())\n", + "\n", + "print(f\"{reference} -> {tokenized_ref}\")\n", + "print(\"\\n\")\n", + "print(f\"{candidate_1} -> {tokenized_cand_1}\")\n", + "print(\"\\n\")\n", + "print(f\"{candidate_2} -> {tokenized_cand_2}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 3.1 Define the functions to calculate the BLEU score\n", + "\n", + "### Computing the Brevity Penalty\n", + "You will start by defining the function for the brevity penalty according to equation (2) in section 2.1." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "def brevity_penalty(candidate, reference):\n", + " \"\"\"\n", + " Calculates the brevity penalty given the candidate and reference sentences.\n", + " \"\"\"\n", + " reference_length = len(reference)\n", + " candidate_length = len(candidate)\n", + "\n", + " # No penalty when the candidate is longer than the reference\n", + " if reference_length < candidate_length:\n", + " BP = 1\n", + " else:\n", + " penalty = 1 - (reference_length / candidate_length)\n", + " BP = np.exp(penalty)\n", + "\n", + " return BP" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Computing the Clipped Precision\n", + "Next, you need to define a function to calculate the geometrically averaged clipped precision. This function calculates how many of the n-grams in the candidate sentence actually appear in the reference sentence. 
The clipping takes care of overcounting. For example, if a certain n-gram appears five times in the candidate sentence, but only twice in the reference, the value is clipped to two." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": {}, + "outputs": [], + "source": [ + "def average_clipped_precision(candidate, reference):\n", + " \"\"\"\n", + " Calculates the precision given the candidate and reference sentences.\n", + " \"\"\"\n", + "\n", + " clipped_precision_score = []\n", + " \n", + " # Loop through values 1, 2, 3, 4. This is the length of n-grams\n", + " for n_gram_length in range(1, 5):\n", + " reference_n_gram_counts = Counter(ngrams(reference, n_gram_length)) \n", + " candidate_n_gram_counts = Counter(ngrams(candidate, n_gram_length)) \n", + "\n", + " total_candidate_ngrams = sum(candidate_n_gram_counts.values()) \n", + " \n", + " for ngram in candidate_n_gram_counts: \n", + " # check if the n-gram is in the reference\n", + " if ngram in reference_n_gram_counts:\n", + " # if the count of the candidate n-gram is bigger than the corresponding\n", + " # count in the reference n-gram, then set the count of the candidate n-gram \n", + " # to be equal to the reference n-gram\n", + " \n", + " if candidate_n_gram_counts[ngram] > reference_n_gram_counts[ngram]: \n", + " candidate_n_gram_counts[ngram] = reference_n_gram_counts[ngram] # clip to the reference count\n", + " \n", + " else:\n", + " candidate_n_gram_counts[ngram] = 0 # else set the candidate n-gram equal to zero\n", + "\n", + " clipped_candidate_ngrams = sum(candidate_n_gram_counts.values())\n", + " \n", + " clipped_precision_score.append(clipped_candidate_ngrams / total_candidate_ngrams)\n", + " \n", + " # Calculate the geometric average: take the mean of elementwise log, then exponentiate\n", + " # This is equivalent to taking the n-th root of the product as shown in equation (1) above\n", + " s = np.exp(np.mean(np.log(clipped_precision_score)))\n", + " \n", + " return s\n" + ] + }, + { + "cell_type": 
"markdown", + "metadata": {}, + "source": [ + "### Computing the BLEU score\n", + "Finally, you can compute the BLEU score using the above two functions." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "metadata": {}, + "outputs": [], + "source": [ + "def bleu_score(candidate, reference):\n", + " BP = brevity_penalty(candidate, reference) \n", + " geometric_average_precision = average_clipped_precision(candidate, reference) \n", + " return BP * geometric_average_precision" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 3.2 Testing the functions\n", + "Now you can test the functions with your Example Reference and Candidates Sentences." + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "BLEU score of reference versus candidate 1: 27.6\n", + "BLEU score of reference versus candidate 2: 35.3\n" + ] + } + ], + "source": [ + "result_candidate_1 = round(bleu_score(tokenized_cand_1, tokenized_ref) * 100, 1)\n", + "print(f\"BLEU score of reference versus candidate 1: {result_candidate_1}\")\n", + "result_candidate_2 = round(bleu_score(tokenized_cand_2, tokenized_ref) * 100, 1)\n", + "print(f\"BLEU score of reference versus candidate 2: {result_candidate_2}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 3.3 Comparing the Results from your Code with the Sacrebleu Library\n", + "Below you will do the same calculation, but using the `sacrebleu` library. Compare them with your implementation above." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "scrolled": true + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "BLEU score of reference versus candidate 1: 27.6\n", + "BLEU score of reference versus candidate 2: 35.3\n" + ] + } + ], + "source": [ + "result_candidate_1 = round(sacrebleu.sentence_bleu(candidate_1, [reference]).score, 1)\n", + "print(f\"BLEU score of reference versus candidate 1: {result_candidate_1}\")\n", + "result_candidate_2 = round(sacrebleu.sentence_bleu(candidate_2, [reference]).score, 1)\n", + "print(f\"BLEU score of reference versus candidate 2: {result_candidate_2}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 4. BLEU computation on a corpus\n", + "\n", + "## 4.1 Loading Datasets for Evaluation Using the BLEU Score\n", + "\n", + "In this section, you will use a simple pipeline for evaluating machine translated text. You will use English to German translations generated by [Google Translate](https://translate.google.com). There are three files you will need:\n", + "\n", + "1. A source text in English. In this lab, you will use the first 1671 words of the [wmt19](http://statmt.org/wmt19/translation-task.html) evaluation dataset downloaded via SacreBLEU.\n", + "2. A reference translation to German of the corresponding first 1671 words from the original English text. This is also provided by SacreBLEU.\n", + "3. A candidate machine translation to German from the same 1671 words. This is generated by Google Translate.\n", + "\n", + "With that, you can now compare the reference and candidate translation to get the BLEU Score." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [], + "source": [ + "# Loading the raw data; each with-block closes its file automatically\n", + "with open(\"data/wmt19_src.txt\", \"r\") as wmt19_src:\n", + " wmt19_src_1 = wmt19_src.read()\n", + "\n", + "with open(\"data/wmt19_ref.txt\", \"r\") as wmt19_ref:\n", + " wmt19_ref_1 = wmt19_ref.read()\n", + "\n", + "with open(\"data/wmt19_can.txt\", \"r\") as wmt19_can:\n", + " wmt19_can_1 = wmt19_can.read()\n", + "\n", + "tokenized_corpus_src = nltk.word_tokenize(wmt19_src_1.lower())\n", + "tokenized_corpus_ref = nltk.word_tokenize(wmt19_ref_1.lower())\n", + "tokenized_corpus_cand = nltk.word_tokenize(wmt19_can_1.lower())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that you have your data loaded, you can inspect the first sentence of each dataset." + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "English source text:\n", + "\n", + "Welsh AMs worried about 'looking like muppets'\n", + "There is consternation among some AMs at a suggestion their title should change to MWPs (Member of the Welsh Parliament).\n", + " -> ['\\ufeffwelsh', 'ams', 'worried', 'about', \"'looking\", 'like', \"muppets'\", 'there', 'is', 'consternation', 'among', 'some', 'ams', 'at', 'a', 'suggestion', 'their', 'title', 'should', 'change', 'to', 'mwps', '(', 'member', 'of', 'the', 'welsh', 'parliament', ')', '.']\n", + "\n", + "\n", + "German reference translation:\n", + "\n", + "Walisische Ageordnete sorgen sich \"wie Dödel auszusehen\"\n", + "Es herrscht Bestürzung unter einigen Mitgliedern der Versammlung über einen Vorschlag, der ihren Titel zu MWPs (Mitglied der walisischen Parlament) ändern soll.\n", + " -> ['\\ufeffwalisische', 'ageordnete', 'sorgen', 'sich', '``', 'wie', 'dödel', 'auszusehen', \"''\", 'es', 'herrscht', 'bestürzung', 
'unter', 'einigen', 'mitgliedern', 'der', 'versammlung', 'über', 'einen', 'vorschlag', ',', 'der', 'ihren', 'titel', 'zu', 'mwps', '(', 'mitglied', 'der', 'walisischen', 'parlament', ')', 'ändern', 'soll', '.']\n", + "\n", + "\n", + "German machine translation:\n", + "\n", + "Walisische AMs machten sich Sorgen, dass sie wie Muppets aussehen könnten\n", + "Einige AMs sind bestürzt über den Vorschlag, ihren Titel in MWPs (Mitglied des walisischen Parlaments) zu ändern.\n", + "Es ist aufg -> ['walisische', 'ams', 'machten', 'sich', 'sorgen', ',', 'dass', 'sie', 'wie', 'muppets', 'aussehen', 'könnten', 'einige', 'ams', 'sind', 'bestürzt', 'über', 'den', 'vorschlag', ',', 'ihren', 'titel', 'in', 'mwps', '(', 'mitglied', 'des', 'walisischen', 'parlaments']\n" + ] + } + ], + "source": [ + "print(\"English source text:\\n\")\n", + "print(f\"{wmt19_src_1[0:170]} -> {tokenized_corpus_src[0:30]}\\n\\n\")\n", + "print(\"German reference translation:\\n\")\n", + "print(f\"{wmt19_ref_1[0:219]} -> {tokenized_corpus_ref[0:35]}\\n\\n\")\n", + "print(\"German machine translation:\\n\")\n", + "print(f\"{wmt19_can_1[0:199]} -> {tokenized_corpus_cand[0:29]}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And lastly, you can calculate the BLEU score of the translation." + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "metadata": { + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "BLEU score of the reference versus candidate translation: 43.2\n" + ] + } + ], + "source": [ + "result = round(sacrebleu.sentence_bleu(wmt19_can_1, [wmt19_ref_1]).score, 1)\n", + "print(f\"BLEU score of the reference versus candidate translation: {result}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 4.2 BLEU Score Interpretation on a Corpus\n", + "The table below (taken from [here](https://cloud.google.com/translate/automl/docs/evaluate)) shows the typical values of BLEU score. 
You can see that the translation above is of high quality according to this table and in comparison to the given reference sentence. (*if you see \"Hard to get the gist\", please open your workspace, delete `wmt19_can.txt` and get the latest version via the Lab Help button*)\n", + "\n", + "|Score | Interpretation |\n", + "|:---------:|:-------------------------------------------------------------:|\n", + "| < 10 | Almost useless |\n", + "| 10 - 19 | Hard to get the gist |\n", + "| 20 - 29 | The gist is clear, but has significant grammatical errors |\n", + "| 30 - 40 | Understandable to good translations |\n", + "| 40 - 50 | High quality translations |\n", + "| 50 - 60 | Very high quality, adequate, and fluent translations |\n", + "| > 60 | Quality often better than human |" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/C4W1_QKV_Attention.ipynb b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/C4W1_QKV_Attention.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..a91bf78316970414b40758878c7c9ef4adff68fc --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/C4W1_QKV_Attention.ipynb @@ -0,0 +1,281 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "707052ae", + "metadata": {}, + "source": [ + "# Scaled 
Dot-Product Attention: Ungraded Lab\n", + "\n", + "The 2017 paper [Attention Is All You Need](https://arxiv.org/abs/1706.03762) introduced the Transformer model and scaled dot-product attention, sometimes also called QKV (**Q**ueries, **K**eys, **V**alues) attention. Since then, Transformers have come to dominate large-scale natural language applications. Scaled dot-product attention can be used to improve seq2seq models as well. In this ungraded lab, you'll implement a simplified version of scaled dot-product attention and replicate word alignment between English and French, as shown in [Bahdanau, et al. (2014)](https://arxiv.org/abs/1409.0473).\n", + "\n", + "The Transformer model learns how to align words in different languages. You won't be training any weights here, so instead you will use [pre-trained aligned word embeddings from here](https://fasttext.cc/docs/en/aligned-vectors.html). Run the cell below to load the embeddings and set up the rest of the notebook.\n", + "\n", + "This is a practice notebook, where you can try writing the code yourself. All of the solutions are provided at the end of the notebook."
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "aa4d9f30", + "metadata": {}, + "outputs": [], + "source": [ + "# Import the libraries\n", + "import pickle\n", + "import matplotlib.pyplot as plt\n", + "import numpy as np\n", + "\n", + "# Load the word2int dictionaries\n", + "with open(\"./data/word2int_en.pkl\", \"rb\") as f:\n", + " en_words = pickle.load(f)\n", + " \n", + "with open(\"./data/word2int_fr.pkl\", \"rb\") as f:\n", + " fr_words = pickle.load(f)\n", + "\n", + "# Load the word embeddings\n", + "en_embeddings = np.load(\"./data/embeddings_en.npz\")[\"embeddings\"]\n", + "fr_embeddings = np.load(\"./data/embeddings_fr.npz\")[\"embeddings\"]" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "a6914081", + "metadata": {}, + "outputs": [], + "source": [ + "# Define some helper functions\n", + "\n", + "def tokenize(sentence, token_mapping):\n", + " tokenized = []\n", + " \n", + " for word in sentence.lower().split(\" \"):\n", + " try:\n", + " tokenized.append(token_mapping[word])\n", + " except KeyError:\n", + " # Using -1 to indicate an unknown word\n", + " tokenized.append(-1)\n", + " \n", + " return tokenized\n", + "\n", + "\n", + "def embed(tokens, embeddings):\n", + " embed_size = embeddings.shape[1]\n", + " \n", + " output = np.zeros((len(tokens), embed_size))\n", + " for i, token in enumerate(tokens):\n", + " if token == -1:\n", + " output[i] = np.zeros((1, embed_size))\n", + " else:\n", + " output[i] = embeddings[token]\n", + " \n", + " return output" + ] + }, + { + "cell_type": "markdown", + "id": "6153d4b2", + "metadata": {}, + "source": [ + "The scaled dot-product attention consists of two matrix multiplications and a softmax scaling as shown in the diagram below from [Vaswani, et al. (2017)](https://arxiv.org/abs/1706.03762). 
It takes three input matrices, the queries, keys, and values.\n", + "\n", + "![scaled-dot product attention diagram](./images/attention.png)\n", + "\n", + "Mathematically, this is expressed as\n", + "\n", + "$$ \n", + "\\large \\mathrm{Attention}\\left(Q, K, V\\right) = \\mathrm{softmax}\\left(\\frac{QK^{\\top}}{\\sqrt{d_k}}\\right)V\n", + "$$\n", + "\n", + "where $Q$, $K$, and $V$ are the queries, keys, and values matrices respectively, and $d_k$ is the dimension of the keys. In practice, Q, K, and V all have the same dimensions. This form of attention is faster and more space-efficient than what you implemented before since it consists of only matrix multiplications instead of a learned feed-forward layer.\n", + "\n", + "Conceptually, the first matrix multiplication is a measure of the similarity between the queries and the keys. This is transformed into weights using the softmax function. These weights are then applied to the values with the second matrix multiplication resulting in output attention vectors. Typically, decoder states are used as the queries while encoder states are the keys and values.\n", + "\n", + "### Exercise 1\n", + "Implement the softmax function with Numpy and use it to calculate the weights from the queries and keys. Assume the queries and keys are 2D arrays (matrices). Note that since the dot-product of Q and K will be a matrix, you'll need to calculate softmax over a specific axis. See the end of the notebook for solutions." 
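Before writing the exercise code, it can help to trace the formula on tiny matrices. The sketch below computes softmax(QKᵀ/√d_k)V for two queries and three key/value pairs; all numeric values are made up purely for illustration:

```python
import numpy as np

def softmax(x, axis):
    # Shift by the max along the softmax axis for numerical stability
    y = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return y / y.sum(axis=axis, keepdims=True)

# Illustrative inputs: 2 queries, 3 keys and values, d_k = 4
Q = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 1.0, 0.0]])
K = np.array([[1.0, 0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0, 0.0]])
V = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])

# Similarity of each query to each key, scaled by sqrt(d_k)
scores = Q @ K.T / np.sqrt(K.shape[-1])   # shape (2, 3)
# Softmax over the keys turns each row of scores into weights summing to 1
weights = softmax(scores, axis=1)
# Weighted sum of the values gives one attention vector per query
attention = weights @ V                   # shape (2, 2)

print(attention.shape)  # prints (2, 2)
```

Each row of `weights` is a probability distribution over the three keys, so each output row is a convex combination of the value vectors.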
+ ] + }, + { + "cell_type": "code", + "execution_count": 21, + "id": "3932b927", + "metadata": {}, + "outputs": [], + "source": [ + "def softmax(x, axis): \n", + " \"\"\" Calculate softmax function for an array x\n", + "\n", + " axis=0 calculates softmax across rows which means each column sums to 1 \n", + " axis=1 calculates softmax across columns which means each row sums to 1\n", + " \"\"\"\n", + " # Exponentiate a max-shifted copy of x for numerical stability\n", + " y = np.exp(x - np.max(x, axis=axis, keepdims=True))\n", + " return y / np.sum(y, axis=axis, keepdims=True)\n", + "\n", + "def calculate_weights(queries, keys):\n", + " \"\"\" Calculate the weights for scaled dot-product attention\"\"\"\n", + " # Scale the query-key dot product by the square root of the key dimension\n", + " dot = queries.dot(keys.T) / np.sqrt(keys.shape[-1])\n", + " weights = softmax(dot, axis=1)\n", + " \n", + " # Use np.isclose rather than == to tolerate floating-point rounding\n", + " assert np.isclose(weights.sum(axis=1)[0], 1), \"Each row in weights must sum to 1\"\n", + " \n", + " return weights" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "id": "51f47450", + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": 
"iVBORw0KGgoAAAANSUhEUgAAApoAAAKyCAYAAAB1836kAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAACdaUlEQVR4nOzdeVxU1f8/8NcddmRVERBRFkVFwRWXwn0hc9fc11wy00+aZmb6KS1Ns0xLS80lc0vN/WOumYBL7oioKKCyiCiCwICyc35/+GO+jswgBDN30Nfz8ZhHce+5974GEN6ce865khBCgIiIiIionCnkDkBEREREryYWmkRERESkEyw0iYiIiEgnWGgSERERkU6w0CQiIiIinWChSUREREQ6wUKTiIiIiHSChSYRERER6QQLTSIiIiLSCRaaRERERKQTLDSJiIiISCdYaBIRERGRTrDQJCqD4OBghIaGlqjt1atXERwcrONEREREhkMSQgi5QxBVVAqFAm3atEFQUNBL23bo0AEnT55EXl6eHpIRERHJjz2aRGVUmr/V+HcdERG9TlhoEulJcnIyLCws5I5BRESkN8ZyByCqSJRKJVJTU9W2ZWdnIy4uTmtvZWZmJoKCgnDt2jU0atRIDymJiIgMAwtNolJYunQpvvzyS7VtFy9ehJubW4mOHzt2rA5SERERGSYWmkSlYGdnh5o1a6o+jo2NhampKZycnDS2lyQJFhYW8PDwwKBBgzB8+HB9RSUiIpIdZ50TlYFCoYC/vz+XLSIiItKAPZpEZfDrr7/C0dFR7hhEREQGiT2aRERERKQT7NEkKkcpKSnIyMgodr3M58d4EhERvcpYaBKVUUREBObOnYvDhw8jLS2t2LaSJPHJQERE9NpgoUlUBleuXEG7du1UvZjm5uZwcHCAQsFnIRAREbHQJCqDzz77DOnp6ejUqROWLl2Khg0byh2JiIjIYHAyEFEZ2NnZoaCgAAkJCahUqZLccYiIiAwK7+8RlUFBQQHq1q3LIpOIiEgDFppEZdC4cWMkJCTIHYNeA+np6QgODsatW7eKbXfr1i0EBwcjIyNDT8mIiLRjoUlUBrNmzUJCQgI2bdokdxR6xa1atQodOnTAqVOnim136tQpdOjQAWvWrNFTMiIi7ThGk6iMVq9ejY8//hjjxo3D2LFj4enpCQsLC7lj0SvmzTffxKVLl5Camgpzc3Ot7TIzM2FnZ4cWLVrg5MmTekxIRFQUC02iMjAyMipVe66jafhyc3Px66+/4tChQ7hz506xC/BLkoTbt2/rJZeTkxNsbGwQERHx0rZ169ZFeno67t+/r4dkRETacXkjojIo7d9p/LvOsCUlJaFjx464fv16ib5WkiTpIdUzqampJX6qlK2tLWJiYnSciIjo5VhoEpVBQUGB3BGoHH366ae4du0aatSogU8++QR+fn6oVq2aQSzA7+joiMjISOTn5xfbk56Xl4fIyEhUrVpVj+mIiDRjoUlE9P8dOHAAJiYm+Pvvv1G7dm2546hp06YNfv/9d6xYsQJTpkzR2m7lypVIS0vDW2+9pcd0RESayf9nOhGRgUhLS0PdunUNrsgEgKlTpwIAZsyYga+//hpPnjxR2//kyRMsXLgQ06dPh0KhwEcffSRDSiIidZwMRFQOHj58iLVr1yIoKAjx8fHIyspSmySyd+9eJCYmYuTIkcXOGCZ5+fj4IDc3Fzdv3pQ7ikaLFy/Gp59+CkmSYGpqCm9vb9jZ2SE1NRU3btxATk4OhBBYtGgRPvnkE7njEhHx1jlRWe3duxejR49Genq6agLJi5NEbty4gf/+979wcHBA37595YhJJTBu3DhMmzYNly5dQrNmzeSOU8Qnn3yCunXr4rPPPkN4eDhCQkLU9jds2BDz589Hr169ZEpIRKSOPZpEZXDlyhW0bNkSQghMmTIFvXr1wrRp03D58mXk5+er2t29exeenp4YOnQoNm/eLGNiKo4QAiNGjEBQUBBWrFi
B3r17yx1Jq9u3byM8PBxKpRLW1tZo0KABPDw85I5FRKSGPZpEZfD1118jLy8Pa9euxbvvvgsAGm+Nu7u7w9HREVevXtV3RCqFTp06AQASExPRr18/2Nvbw9PTU+uz7CVJwvHjx/UZUcXT0xOenp6yXJuIqKTYo0lUBk5OTigoKEBiYqJqW5s2bXDmzBm1Hk0AaNGiBaKiovD48WN9x6QSKu0yRpIkFfk6ExHR/2GPJlEZpKSkwMfHp0RthRDIzs7WcSIqixMnTsgdAQAQHBwMALC0tETz5s3VtpVG27ZtyzUXEVFpsUeTqAxq1KiB7OxsPHr0SLVNU49mfn4+KleujGrVqiEyMlKOqFSBKBQKSJKEunXr4saNG2rbSoqPOyUiQ8AeTaIy8Pf3xx9//IF9+/YVO3Fkw4YNSE9Px+DBg/WYjiqqtm3bQpIktUdOFm4jIqpI2KNJVAYXLlxAq1atULVqVaxfvx7du3cv0qO5ceNGfPDBB8jJycGVK1fg7e0tc2oiIiL9YKFJVEZLly7Fxx9/DACoVq0asrKyoFQq8eabbyI8PFw1+WfFihWYOHGinFGphO7evYvt27cjNDQUjx8/Rm5ursZ2cs46JyKqCFhoEpWDw4cPY/bs2UUW0AaeLaL9zTffoFu3bjIko9L69ttvMXv2bOTl5aluVT//Y/L5bZx1TkRUPBaaROUoNjYWYWFhSEtLg5WVFby9vQ3yudmk2cGDB9GjRw84Ozvjq6++wrJly3D9+nUcPXoUcXFxCA0Nxbp165Cfn49FixbB19cX7dq103vO1NRU3L17FxkZGSjuRzhnnROR3FhoEhH9f926dcPRo0cRHByMN998U+MKAsnJyejfvz+uXLmCCxcuoE6dOnrL9/fff2P27Nk4f/78S9ty1jkRGQIWmkRE/1+1atVgamqKe/fuAdC++H5CQgJq1aqFgQMH6u2RogcPHkSfPn2Ql5cHc3NzuLu7w8HBodiZ6IayLigRvb64vBFROUhLS0NgYCDu3LlT7O1MSZLw3//+V8/pqKSUSiUaNWqk+rjwcaJKpRI2Njaq7c7OzmjYsKFeC7nPP/8c+fn5mDBhAhYtWgRbW1u9XZuI6N9ioUlURvPmzcM333yjeuqPpiJTkiTV5BEWmoarWrVqUCqVah8DwK1bt+Dn56fWNiMjA8nJyXrLduPGDVStWhUrV67U2zWJiMqKhSZRGXz77beYN28eAKBVq1Zo0qTJS29nkuHy9PTE5cuXVR+3bNkSv//+O1auXKlWaB4/fhxRUVFwd3fXWzZ7e3u4uLjo7XpEROWBhSZRGaxevRqSJGHLli186s8r4K233kJwcDAuXLgAPz8/DB06FF988QV+++03REREoHXr1nj48CF27NgBSZIwYsQIvWXr2rUrdu7ciSdPnqBSpUp6uy4RUVlwMhBRGVhYWKB69eq4ffu23FGoHNy9exdff/013nnnHQQEBAAAjh49isGDByM1NVWt7TvvvIOtW7fC2Fg/f6/HxsaiRYsW6Ny5M9auXasaP0pEZMhYaBKVQZ06dWBlZaVxoXZ6daSlpeHQoUOIjo6GhYUF2rRpg6ZNm+o9R0REBEaOHIl79+5hyJAh8PT0hKWlpdb2I0eO1GM6ovIXHBwMW1tbtUl62ly9ehWpqalcP9bAsNAkKoP//ve/+OabbxAZGYlatWrJHYdecZs3b8bMmTORkJBQonHAfGoRVXQKhQJt2rRBUFDQS9t26NABJ0+e5PqxBoZjNInKYPbs2Th27Bh69+6NjRs3wtfXV+5I9Iravn27qoeyRo0a8PHx4cQzei2Upj+MfWeGh4UmURmYm5sjKCgIgwYNQtOmTdGkSZNib2dKkoR169bpOSWVVlhYGH788UcEBQUhPj4e2dnZar0kq1evRkxMDD799FO19TV1aeHChZAkCQsXLsTHH38MhUKhl+sSVRTJycmwsLCQOwa9gIUmURnk5+dj0qRJOHDgAAoKCnDp0iVcunR
Ja3sWmobvp59+wkcffaRWWL7Ya5idnY1vvvkGDRo0wLBhw/SSKyIiAi4uLvjkk0/0cj0iOSiVyiIT77KzsxEXF6e1tzIzMxNBQUG4du1aicZykn6x0CQqg/nz52P9+vUwNTVF//790bhxY97OrMBOnDiBDz/8ENbW1liwYAF69eqFIUOG4J9//lFrN3DgQEydOhV79uzRW6FZpUoVODo66uVaRHJZunQpvvzyS7VtFy9ehJubW4mOHzt2rA5SUVmw0CQqgw0bNkChUODYsWNo06aN3HGojJYsWQIA2LJlC7p37w6gaG8mADg5OcHV1RU3btzQW7aePXvi119/RXJyMqpUqaK36xLpk52dHWrWrKn6ODY2FqampnByctLYXpIkWFhYwMPDA4MGDcLw4cP1FZVKiLPOicrA0tISbm5uei04SHeqVq0KMzMzxMfHq7a1adMGZ86cKTKDu1WrVggPD0daWppesj1+/BitWrVCrVq1sHnzZvZuVgAdO3Ys8zkkScLx48fLIU3FpFAo4O/vj+DgYLmj0L/EHk2iMnBzc+OkjFdIRkZGiZepysnJ0evyQStWrMDbb7+NlStXwtPTE926dXvpxLP//ve/estHRQUGBmrdV9hTrqmv5/l9r/swnF9//ZV/VFVw7NEkKoNvvvkGn332Ga5cuQIfHx+541AZubu7IyUlRW0ygqYezaysLFSuXBnu7u64fv26XrIpFApIkvTS5VsK20iSxHU0ZaZt7cdTp07hyy+/hL29PcaMGYP69evD0dERiYmJCA8Px/r165GSkoLPP/8cb775Jtq1a6fn5ETlhz2aRGUwY8YMXLx4ET169MCKFSvQs2dPuSNRGXTo0AG//fYb1q9fjzFjxmht98MPPyArK0v1mEp9+OKLL/R2LSofmgrES5cuYcGCBejfvz9+/fVXmJmZFWnzxRdf4N1338X8+fNx+vRpfUQ1WEqlEtHR0ahSpQpcXFzU9u3evRtr1qzB/fv30axZM3z55ZeoUaOGTElJG/ZoEpVB4Ris06dPIy8vD5UrV37p7czXebyVobt16xYaNWoEIyMjLF68GKNGjUK3bt1UPZqpqan48ccf8dVXX8Hc3Bw3btyAq6ur3LGpAunZsyeCg4ORkJBQ7ONDnz59CmdnZ7Rr1w779+/XY0LDMnfuXHz11VdYs2aN2h9/v/32G8aMGaPWw+/q6oqwsDC9rW1LJcNCk6gMSjs+k7czDd+OHTswatQo5OTkwMjICEZGRsjJyYGLiwsSEhJQUFAAU1NTbNu2Db1795Y7LlUwVatWhYeHB86fP//Sti1atMCdO3eQlJSkh2SG6c0338SFCxfw+PFjWFlZqba7u7sjNjYWM2fORKtWrfDDDz8gMDAQCxYswKeffipjYnoRC02iMijJ83dfxPFWhi8sLAxz587FoUOHkJWVpdpuYmKCgIAAfPXVV7IvDJ2ZmYnbt28jPT0d1tbW8PT05FNRKgArKyvY2dnh3r17L21bo0YNpKamIiMjQw/JDJOLiwtMTEwQHR2t2nb58mU0b94cHTt2xF9//QXg2VOBXFxc4OPjgwsXLsiUljThGE2iMmDR+Gry8fHBrl27kJubi4iICKSlpcHKygp16tSRvZg7cuQIFi5cWGSCkpGREfz9/fHpp5+ia9euMiak4vj6+uLcuXNYtWoV3n//fa3tVq9ejfv376NVq1Z6TGd4kpOT0bhxY7VtQUFBkCQJffr0UW2rUqUKvLy8EBMTo9+A9FJcl4WISAsTExM0aNAAb7zxBnx9fWUvMufOnYu3334bwcHByMvLg4mJCapXrw4TExPk5eUhMDAQ3bp1w9y5c2XNSdrNmDEDQghMnjwZQ4YMQVBQEBITEyGEQGJiIoKDgzF06FBMmjQJkiRhxowZckeWlampKR4/fqy2rXBNzbZt26ptt7CwwJMnT/SWjUqGt86Jysnp06cRFBSE+Ph4ZGVlqT3TPDo6Gjk5OfDy8pIxIZWWId2
ePnz4MN5++20YGRlhwoQJmDJlCurUqaPaHxkZiR9++AG//PIL8vPzcfDgQb3OiqeSW7x4MWbPno2CggKN+4UQUCgUmD9//ms/3rBFixa4dOkSwsPD4eXlhZSUFLi6usLS0hKJiYlqbV1dXWFsbIy7d+/KlJY0EkRUJpGRkaJFixZCoVAIhUIhJEkSCoVCrc3EiROFQqEQwcHBMqWk0jh48KBo3769MDExUX1dFQqFMDY2Fu3btxd//vmn3jN169ZNKBQKsXHjxmLbbdq0SUiSJLp166anZPRvhISEiOHDhwtHR0chSZLq5ejoKIYPHy4uXbokd0SDsGzZMiFJkqhVq5aYPn26aNy4sVAoFOKjjz5SaxcdHS0kSRIBAQEyJSVtWGgSlcGDBw9E9erVhSRJokWLFuLLL78UderUKVJonjt3TkiSJKZMmSJPUCqxKVOmqP5gkCRJmJubC1dXV2Fubq7aplAoxH/+8x+95qpataqoWbNmidrWrFlTVKlSRceJqLykpqaKe/fuidTUVLmjGJy8vDzRv39/tWK8VatWRT5XX331lZAkSXz77bcyJSVtOEaTqAy+/vprJCQkYNKkSTh79iz++9//anxcWosWLWBtbY0zZ87IkJJK6tdff8WPP/4IY2NjTJs2DVFRUcjMzERsbKzqNvq0adNgYmKCn376CevXr9dbtvT09BI/is/R0ZFj1SoQW1tbuLi4wNbWVu4oBsfIyAg7d+7ExYsX8fvvv+PUqVM4c+ZMkc+Vh4cHli5diiFDhsiUlLThGE2iMvD09ERiYiKSkpJUT/jQ9MhCAGjSpAkePHiAhIQEOaJSCTRt2hShoaH4448/0K9fP63t9uzZg/79+6NJkya4dOmSXrJ5eHggKSkJCQkJqFSpktZ2T548gZOTExwcHHDnzh29ZKN/Jy4uDidPnkR8fDwyMzPx+eefq/bl5uZCCAFTU1MZExKVHXs0icogPj4ederU0fgYuReZmZkhJSVFD6no37p58yZq1apVbJEJAH379oWbmxvCw8P1lAwICAhARkYGxo8fj5ycHI1tcnJyMG7cODx9+hRvvfWW3rJR6SQlJWHQoEFwd3fHiBEj8Omnn2LevHlqbd59911YWFjo7Q8ZIl3hOppEZWBlZYVHjx6VqG1sbCyqVKmi40RUFtbW1iX+GlWpUgVPnz7VcaL/89lnn2H79u3Yvn07AgMDMX78eHh7e6NatWpITEzEjRs3sGbNGjx8+BC2traYNWuW3rJRyaWnp6Ndu3YIDw+Hq6srOnfujGPHjiE+Pl6t3bhx47B161bs3r0bzZo1kymt/DZu3FjqY0aOHKmDJPRvsdAkKoMmTZrg77//RlhYGHx8fLS2CwoKwoMHD9C3b189pqPSat++Pf73v//h8ePHqFy5stZ2ycnJuH79Onr16qW3bK6urjh06BAGDhyIuLg4zJ8/v0gbIQRq1qyJHTt28BnsBmrx4sUIDw9H//79sXHjRlhYWKBNmzZFCs22bdvCwsICJ06ckCmpYRg9ejQkSSpRWyEEJElioWlgWGgSlcHYsWNx/PhxjBkzBvv374ezs3ORNrdv38aYMWMgSRLGjx8vQ0oqqfnz5+Po0aMYNGgQtm7dCgcHhyJtHj16hKFDh8Lc3FxjsadLLVu2xM2bN7F161YcPXoUERERyMjIgJWVFby8vBAQEIAhQ4bIvrA8abdz506YmZlh7dq1xX6dFAoFateujdjYWD2mMzwjR47UWmg+efIEUVFRCA0NhYmJCd555x2YmJjoOSG9DCcDEZXRwIEDsXPnTtja2iIgIAD//PMP7t27h9mzZ+PatWs4ePAgcnJyMGLECPz2229yx6VibNy4EREREVi8eDGMjY3Rr18/1K9fH9WqVcOjR48QHh6OXbt2IT8/HzNmzNC6AD97VEgbCwsLeHl5ITQ0VLVN2wTC1q1bIyQkBFlZWfqOWaFcvHgRo0ePhoODA44eParXYjMlJQVRUVGwsLC
uCBrl5eXJHUErMzMzuSNodPXqVbkjaPX999/LHUGj3bt3yx1Bq759+8odQaOqVavKHUGrrKwsuSNoZGlpKXeEMmGPJhERERHpBAtNIiIiItIJFppEREREpBMsNImIiIhIJ1hoEhEREZFOsNAkIiIiIp1goUlEREREOsFCk4iIiIh0goUmEREREekEC00iIiIi0gkWmlRiQgi0a9cOkiRhy5YtcschIiIiA8dC04AEBgZCkiS0b99e7iga/fDDDwgODsaHH36IYcOGyR2HiIiIDBwLTSqRW7du4bPPPkObNm2wZMkSueMQERFRBcBC04BYWlqibt26qFmzptxR1OTn52P06NGwt7fHjh07YGxsLHckIiIiqgBYMRiQFi1a4ObNm3LHKOL27dsICAhAz5494eTkJHccIiIiqiBYaNJLeXl5Ye7cuXLHICIiogqGt871ICYmBhMmTICHhwfMzMxgbW0NDw8P9O3bF9u2bVO1e9lkoJCQEPTs2RP29vawsrJCq1atsHPnTgCAJEmQJKnIMc9vP3ToENq2bQtra2vY2tqiW7duCAkJ0Zo7Ly8Pq1atgr+/P+zs7GBubo569ephzpw5UCqVZfiMEBER0euAPZo6Fh0dDT8/PyQlJanGYBoZGSE2NhZ79+7F3bt3MXjw4Jee56+//kKPHj2QnZ0NGxsb1K9fH7GxsRgwYAC+//77lx6/atUqfPDBB3BycoKXlxdu3bqFw4cP49SpU7hw4QLq1aun1l6pVKJnz54IDg6GQqGAq6srrK2tERERgQULFmD37t0IDAxEtWrV/vXnhoiIiF5t7NHUsSVLliApKQmjRo3Cw4cPcfXqVYSEhCA5ORnh4eH44IMPXnqO9PR0jBgxAtnZ2Xj33Xfx4MEDXLhwAfHx8VixYgVmzZr10nNMnz4d69evx/3793Hp0iUkJCSgU6dOyMjI0HhbfMKECQgODkanTp0QGRmJ6OhohIWF4cGDB+jXrx/Cw8MxadKkYq+ZnZ0NpVKp9iIiIqLXBwtNHYuMjAQATJs2DVZWVmr76tWrh/fee++l59i6dSsePHiAevXq4ZdffoGFhQWAZ7fFJ02aVKIe0bFjx2L06NGqj62trbF06VIAwOHDh9XaXr16Fdu2bUOtWrWwZ88eeHh4qPbZ29tj06ZNcHV1xa5duxATE6P1mgsXLoStra3q5erq+tKcRERE9OpgoaljhcXVzp07IYT4V+c4duwYAGDEiBEalxZ69913X3qOcePGFdnm4+MDc3NzpKWlITk5WbV9z549AICBAwfC2tq6yHGWlpbo3LkzhBA4efKk1mvOmjULaWlpqldcXNxLcxIREdGrg2M0dWzSpEn47bff8NVXX2Hjxo1466230KZNG3To0AHVq1cv0TkKe0V9fX017te2/Xmenp4atzs4OCAuLg4ZGRmoUqUKACAsLAzAs4LzzJkzGo8r7MmMj4/Xek0zMzOYmZm9NBsRERG9mlho6ljjxo0RHByML774An///TdWr16N1atXQ5IkdOnSBcuWLUP9+vWLPceTJ08AQGPvYnHbn1epUiWN2xWKZ53az/e2pqWlAQCioqIQFRVV7HkzMzNfem0iIiJ6PbHQ1INWrVrhyJEjyMjIwOnTp3HixAls3boVR48eRZcuXXDt2jXY2dlpPb6wSMzIyNC4Pz09vVzzFo4lXbNmjcZb7kREREQlwTGaemRlZYWAgAAsWrQIN2/ehKenJ+Lj43Ho0KFij/Py8gLwbJKOJoW3usuLt7c3AODatWvlel4iIiJ6vbDQlImlpSV8fHwAAPfv3y+2bZcuXQAAmzdvRn5+fpH9GzZsKNdsffv2VV3v+UlCRERERKXBQlPHJk6ciO3bt+Pp06dq24ODg3H8+HEAQNOmTYs9x5AhQ+Dk5IQbN27g/fffR1ZWFoBn4ypXrlyJrVu3lmvm5s2bY+DAgUhOTkaXLl2KPD0oPz8fgYGBGDZsGLKzs8v12kRERPTq4BhNHfv
nn3+watUqGBsbo06dOrC2tsbDhw9Vs7aHDx+ODh06FHsOa2trbNq0Cd27d8fatWvxxx9/wMvLC/Hx8bh//z6WLFmC6dOnqyb2lId169YhJSUFx44dQ9OmTVGzZk04Ozvj6dOniIqKUk0CWrduXbldk4iIiP5fe/ceFNV5uHH82eW2KrBqEsELEbRqFG8xarEai9XEyURrtK1NtU2xiTW29ZbYqKkTrbYlk5pqahIydTLxEqqJYyHVVpLQ1ipe0Ch4v0GiYtQQEFkUWRbY3x8O+3NlV6nJ4Szw/cycGfac87LPAWWeefdcmhZmNA22fPlyzZo1S3379lVRUZFyc3MlSaNHj9bf//53rV27tl7fZ9SoUdq9e7cef/xxSdKxY8fUsWNHrV+/XtOmTZNUv6vP6ys8PFwZGRlKTU3V6NGjVV5ergMHDqioqEh9+/bVvHnztHfvXtlstq/tPQEAQNPCjKbBRowYcccZy1qJiYm3van7gAEDtGXLljrr9+/fL0mKjY2ts+1ON4k/c+aM321Wq1WTJk3SpEmTbvs9AAAAfGFGswl45513JElDhw41OQkAAMD/o2g2Ev/5z3+0YcMGr4tvXC6X/vSnPyklJUVWq1VTp041MSEAAIA3PjpvJM6ePaspU6YoJCREcXFxioyM1KlTp+RwOCRJycnJ6t+/v7khAQAAbsKMZiPx8MMP61e/+pW6d++uL7/8Urm5ubLZbBo7dqw+/PBDzZ8/3+yIAAAAXpjRbCS6du2qlStXmh0DAACg3pjRBAAAgCEomgAAADAERRMAAACG4BxNNLj4+HiFhoaaHcPLiy++aHYEv4YMGWJ2BL9cLpfZEXzKz883O4Jfgfo0rcjISLMj+BUeHm52BJ+++93vmh3Br2eeecbsCD5t2rTJ7Ah+VVVVmR3Bp86dO5sdwSeXy6WDBw/ecT9mNAEAAGAIiiYAAAAMQdEEAACAISiaAAAAMARFEwAAAIagaAIAAMAQFE0AAAAYgqIJAAAAQ1A0AQAAYAiKJgAAAAxB0QQAAIAhKJoAAAAwBEUTAAAAhqBoAgAAwBAUTQAAABiCotlMVFVV6a233tKwYcPUunVr2Ww2PfDAA1q4cKEcDofXvqtXr5bFYlFSUpKcTqcWL16sb3zjG7LZbIqJidFzzz2na9eumXQkAACgsaBoNgMOh0MjR47U9OnTtXv3brVu3VrdunXTZ599pt///vdKSEhQYWFhnXEul0uPPvqolixZIpvNptjYWF24cEHLly/X+PHjTTgSAADQmFA0m4Fp06Zp+/btGjlypE6fPq0zZ87o8OHDunTpkiZMmKDjx4/rl7/8ZZ1xGzduVFFRkU6cOKEjR47oxIkT2rlzpyIjI/Xxxx8rIyPjtu/rdDrlcDi8FgAA0HxQNJu4Q4cOacOGDercubPS0tLUpUsXz7Y2bdpo3bp1iomJ0aZNm3T27FmvsVVVVVqzZo26d+/uWZeQkKBnnnlGkrR169bbvndycrLsdrtniYmJ+RqPDAAABDqKZhOXlpYmSZo4caIiIiLqbG/ZsqVGjRolt9utHTt2eG3r37+/Bg4cWGfMoEGDJEmffvrpbd97wYIFKi0t9SwFBQV3exgAAKARCjY7AIx1+PBhSTcK565du3zuUzuT+fnnn3ut79q1q8/927VrJ0m6evXqbd87LCxMYWFh/1NeAADQdFA0m7jS0lJJUl5envLy8m677/Xr171et2rVyud+VuuNiXC32/01JAQAAE0VRbOJCw8PlyStWrXKc24lAABAQ+AczSauV69ekqQjR46YnAQAADQ3FM0mrvZ+l++++66Ki4tNTgMAAJoTimYTN3DgQE2cOFHFxcV65JFHlJOT47W9urpa27Zt0+TJk+V0Ok1KCQAAmiLO0WwG3n77bZWUlOjjjz/WgAEDdP/996t9+/YqLy9XXl6e5yKgt99+2+SkAACgKWFGsxkIDw9XRkaGUlNTNXr0aJWXl+vAgQMqKipS3759NW/
ePO3du1c2m83sqAAAoAlhRrOZsFqtmjRpkiZNmnTHfZOSkpSUlOR3e2JiIrc2AgAAd8SMJgAAAAxB0QQAAIAhKJoAAAAwBEUTAAAAhqBoAgAAwBAUTQAAABiCogkAAABDUDQBAABgCG7YjgZ3+PBhBQUFmR3Dyz333GN2BL+CgwP3v+m+ffvMjuBTYWGh2RH8unLlitkRfLJaA3feITIy0uwIPnXo0MHsCH5t2LDB7Ag+ZWZmmh3Brz59+pgdwaeqqiqzI/hU31yB+5cFAAAAjRpFEwAAAIagaAIAAMAQFE0AAAAYgqIJAAAAQ1A0AQAAYAiKJgAAAAxB0QQAAIAhKJoAAAAwBEWzmbty5YratWunoKAg5eTkmB0HAAA0IRTNZu6ll15SaWmpoqKiNGPGDLPjAACAJoSi2YwdPXpUKSkpWrhwod555x3t3LlTqampfvdPT0/X4sWLlZub23AhAQBAo0XRbMZmzpyp3r17a/78+Ro9erR++tOf6oUXXtDVq1d97p+enq7f/va3FE0AAFAvFM1m6osvvtDDDz+sd999VyEhIZKk5cuX6+c//7lOnz5tcjoAANAUBJsdAOaIiorS4sWLvda1adNGixYtMicQAABocpjRDCBHjhzRokWLNGTIELVv316hoaFq3769JkyYoF27dvkdt2vXLk2YMEFRUVEKDQ1Vp06d9NRTT+n48eM+94+NjZXFYtGZM2d8bk9MTJTFYtG2bdskSWfOnJHFYtGaNWskSVOmTJHFYvEstxZWAAAAiaIZUGbPnq0lS5boxIkTatOmjfr06aOqqiqlpaVp+PDh+utf/1pnTEpKioYNG6a0tDRJUr9+/XTt2jWtW7dOAwYM0D/+8Y+vnMtms2no0KFq166dJKlbt24aOnSoZ7n//vu/8nsAAICmh6IZQJ599lkdOnRIJSUlOnbsmPbv36/CwkKlp6erRYsWmj59usrKyjz75+bmaubMmXK73XrllVd08eJF7du3T5cuXdIvfvELVVRUaPLkybp48eJXyhUdHa2srCw99thjkqQXX3xRWVlZnuVnP/vZV/r+AACgaaJoBpDvf//76tOnj9c6i8WicePGafbs2XI4HNq8ebNn27Jly1RVVaVx48bp17/+tazWG7/OsLAwvf7664qPj1dpaalSUlIa9DhqOZ1OORwOrwUAADQfFM0Ac+7cOb388suaOHGivvOd72jYsGEaNmyY3nvvPUnSwYMHPft+9NFHkuTzRusWi0UzZ8702q+hJScny263e5aYmBhTcgAAAHNw1XkAWbNmjZ599llVVFT43efy5cuSbjw68ssvv5Qk9erVy+e+8fHxkqRTp059zUnrZ8GCBXruuec8rx0OB2UTAIBmhBnNAJGfn6+pU6eqoqJCzz//vHJycuRwOFRTUyO3261Vq1ZJklwulyR53VS99iKdW0VFRUmS13mdDSksLEyRkZFeCwAAaD6Y0QwQ77//vlwul5588kktW7aszvaCggKv1+Hh4Z6vCwsL1b59+zpjvvjiC0lSRESE13qLxSJJcrvdPrNcu3btfwsPAADgAzOaAaL2npbf+ta3fG6/+dxMSWrdurXuu+8+SdKxY8d8jjl69KgkqXv37l7rW7VqJUmej95vlZ+f73N9bUEFAACoD4pmgGjRooWk/5+FvNmJEye8rjavNXr0aEnSypUr62xzu92e9bX71erSpYskad++fXXGbdq0SSUlJbfNeP36db/HAQAAUIuiGSCGDRsmSXrzzTeVm5vrWX/q1Cn94Ac/UGhoaJ0xzz//vIKDg/XBBx/o1VdfVU1NjSSpsrJSs2bN0pEjR2S32zV9+nSvcbX3w3zllVe8nmu+b98+zZw50/Ps81vVFtTt27f7/dgdAACgFkUzQDzxxBNKSEhQSUmJBg4cqF69eqlPnz564IEHVFxcrIULF9YZ079/f/35z3+WxWLR3Llz1aFDBw0ePFhRUVFauXKlwsLClJqaqujoaK9xU6Z
MUXx8vM6dO+d5nx49emjw4MEaPny434/vx48fr9DQUG3YsEFxcXEaPny4EhMTtXr1aiN+JAAAoJGjaAaI4OBgffjhh5oxY4aioqKUl5enK1eu6Omnn9b+/fvVsWNHn+OmT5+uHTt26IknnlBNTY1yc3PVsmVL/fjHP9aBAwf0+OOP1xljs9n073//W08//bTatm2r06dPy2q1atmyZUpNTfWbsWvXrtq8ebO+/e1vq6SkRFlZWfrvf//r95npAACgebO4+QwUDcThcMhut+vBBx9UUFCQ2XG82O12syP45e8+qYFg9+7dZkfwqbCw0OwIfp07d87sCD61bdvW7Ah+/ehHPzI7gk8XLlwwO4JfgfoktszMTLMj+HXrk/kCRVxcnNkRfHK5XMrIyFBpaeltb1/IjCYAAAAMQdEEAACAISiaAAAAMARFEwAAAIagaAIAAMAQFE0AAAAYgqIJAAAAQ1A0AQAAYAiKJgAAAAzBk4HQYGqfDDRq1CiFhISYHQcAANwll8ulzMxMngwEAAAAc1A0AQAAYAiKJgAAAAxB0QQAAIAhKJoAAAAwBEUTAAAAhqBoAgAAwBAUTQAAABiCogkAAABDUDQBAABgCIomAAAADEHRNMFnn32mVatWaerUqerXr5+Cg4NlsVj0u9/97rbjSktL9dJLL6l3795q2bKlWrdureHDh2v9+vW3Hed0OvXqq6/qoYceUnh4uCIiIjRo0CC9+eabqqmp8Tnm/PnzWrFihcaOHatOnTopNDRUdrtdQ4YM0fLly+V0Ou/6+AEAQPMQbHaA5ui1117Ta6+99j+N+fzzzzVixAidPn1aQUFB6t27t1wul7KysrRjxw5t375dKSkpdcaVlZXpkUceUXZ2tiwWi3r27KmQkBDl5OTok08+0datW5WWlqbgYO9/CkOGDNH58+clSVFRUerXr58uXryoPXv2aM+ePVq7dq0yMzN1zz333P0PAgAANGnMaJrg3nvv1ZgxY7RkyRJt3bpV3/ve9+445ic/+YlOnz6t+Ph45eXlKTc3V0ePHlVOTo46dOigt956S+vWraszbtasWcrOzlaHDh2Uk5Ojo0ePKjc3V3l5eYqPj9eWLVuUnJxcZ5zNZtPMmTN16NAhXbp0Sfv27dP58+eVmZmpdu3aKTc3V9OmTftafh4AAKBpsrjdbrfZIZq7pKQkrVmzRkuXLtXChQvrbD948KD69+8vSdq9e7cSEhK8tr/33nt68skn1aVLF+Xn53vWFxcXKyoqStXV1dqwYYN++MMfeo3bs2ePhgwZooiICF28eFGtWrXybLt8+bLatm3rM2/t+1mtVhUWFtZ7VtPhcMhut2vUqFEKCQmp1xgAABB4XC6XMjMzVVpaqsjISL/7MaPZCOzcuVOS1KlTpzolU5LGjx8vq9WqTz/9VPv37/esz87OVnV1taxWq8aPH19nXEJCgjp27KiysjJlZGR4bfNXMiXp0UcflSTV1NQoLy/vro4JAAA0fRTNRqCkpESS1LFjR5/bQ0NDde+990q6MUt567j77rtPoaGhPsfWfs+bx91JRUWF5+sWLVrUexwAAGheuBioEbDb7ZJuXBDkS2VlpYqKiiRJJ0+erDOuqKhIlZWVPstm7fe8edydvP/++5KkNm3aqFevXn73czqdXlenOxyOer8HAABo/JjRbAQGDRok6cYth/bu3Vtne3p6uuc2RbWzmJI0cOBAWSwWVVdX64MPPqgzbu/evZ6iefO427l48aKWLl0qSZozZ06dq9VvlpycLLvd7lliYmLq9R4AAKBpoGg2At/85jf10EMPSbpx4dCpU6c827KzszVnzhzP6+vXr3u+jo6O9pybOXv2bGVnZ3u2nTp1SklJST7H+VNZWamJEyequLhY/fv317x58267/4IFC1RaWupZCgoK7vgeAACg6aBoNhKpqamKjo7W8ePH1bNnT/Xo0UNxcXFKSEhQeXm5xo4dK0kKDw/3GpeSkqIePXrowoU
LSkhIUFxcnHr06KGePXsqPz9fEydO9DnuVm63W0lJScrKylL79u2Vlpbm97zPWmFhYYqMjPRaAABA80HRbCR69OihnJwczZo1S7GxsTpz5oyuXbumyZMn68CBA54SFx0d7TWuXbt2ys7O1sKFC9WzZ09dunRJhYWFGjNmjLKzs9WtWzef4241Y8YMrV+/Xm3bttVHH32k2NhYQ44TAAA0HVwM1IhER0drxYoVWrFiRZ1tn3zyiSR5PmK/md1u19KlSz3nVt5s/vz5fsfV+s1vfqM33nhD4eHh2rp1q3r37n2XRwAAAJoTZjSbgKNHj+rkyZOy2WwaNWpUvcddvnxZ27ZtkySNGTPG5z5//OMf9Yc//EE2m02bN2/W4MGDv47IAACgGaBoNnJut1sLFiyQJE2ePFlt2rSp99hFixbJ6XRq5MiR6tmzZ53tf/nLX/TCCy8oJCREGzduVGJi4tcVGwAANAMUzUYiKytL//rXv3TzE0OLi4s1ZcoUbd68WVFRUXr55ZfrjDt8+LDS09NVVVXlWXf16lXNnz9fr7/+ulq2bKk33nijzriNGzdq+vTpslqtWrt2rd8ZTwAAAH941rkJdu7cqXHjxnleX716VU6nUy1btvR60k5OTo7n3pMrVqzQnDlzFBERobi4OLndbh0/flxVVVXq2LGjMjIyfJ47mZ6ervHjx6tFixaKi4tTaGioTpw4oYqKCrVu3Vp/+9vfNGLEiDrjwsLCVFlZqcjISPXp08fvsaxcuVIPPvhgvY6bZ50DANA01PdZ51wMZAKXy6Xi4uI668vLy1VeXu55XV1d7fk6MTFRTz31lHbv3q38/HxZLBb16tVLEyZM0Jw5c/z+kvv166dp06Zpx44dKigoUFVVlTp37qwxY8Zo7ty5fq82r6yslHSjHNY+a92X0tLSeh0zAABofpjRRINhRhMAgKahvjOanKMJAAAAQ1A0AQAAYAiKJgAAAAxB0QQAAIAhKJoAAAAwBEUTAAAAhqBoAgAAwBAUTQAAABiCJwOhwXXu3FmhoaFmx/CyZcsWsyP49dhjj5kdwa+goCCzI/iUnp5udgS/bn76VyCJjY01O4Jfdrvd7Ag+nTx50uwIfs2aNcvsCD7985//NDuCX+fPnzc7gk+ZmZlmR/CprKysXo+gZkYTAAAAhqBoAgAAwBAUTQAAABiCogkAAABDUDQBAABgCIomAAAADEHRBAAAgCEomgAAADAERRMAAACGoGgCAADAEBRNAAAAGIKiCQAAAENQNAEAAGAIiiYAAAAMQdEEAACAISiaAAAAMARFEwAAAIagaAIAAMAQwWYHQNPldDrldDo9rx0Oh4lpAABAQ2NGE4ZJTk6W3W73LDExMWZHAgAADYiiCcMsWLBApaWlnqWgoMDsSAAAoAHx0TkMExYWprCwMLNjAAAAkzCjCQAAAENQNAEAAGAIiiYAAAAMQdHEXZk7d65iY2M1d+5cs6MAAIAARdHEXSkqKtLZs2dVVFRkdhQAABCgKJoAAAAwBLc3wl1ZvXq1Vq9ebXYMAAAQwJjRBAAAgCEomgAAADAERRMAAACGoGgCAADAEBRNAAAAGIKiCQAAAENQNAEAAGAIiiYAAAAMwQ3b0WDcbrckqbKy0uQkddXU1Jgdwa9A/HnVCgoKMjuCT4H8+6z9fxBoqqurzY7gV1VVldkRfArkf2cVFRVmR/ApUH+XUuD+PsvKysyO4NPVq1cl3flvmsUdqH/10OScP39eMTExZscAAABfk4KCAnXq1MnvdoomGkxNTY0uXLigiIgIWSwWs+MAAIC75Ha7VVZWpg4dOshq9X8mJkUTAAAAhuBiIAAAABiCogkAAABDUDQBAABgCIomAAAADEHRBAAAgCEomgAAADAERRMAAACG+D+eSyhaTYOgTgAAAABJRU5ErkJggg==", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "# Tokenize example sentences in English and French, then get their embeddings\n", + "sentence_en = \"The agreement on the European Economic Area was signed in August 1992 .\"\n", + "tokenized_en = tokenize(sentence_en, en_words)\n", + "embedded_en = embed(tokenized_en, en_embeddings)\n", + "\n", + "sentence_fr = \"L accord sur la zone économique européenne a été signé en août 1992 .\"\n", + "tokenized_fr = tokenize(sentence_fr, fr_words)\n", + "embedded_fr = embed(tokenized_fr, fr_embeddings)\n", + "\n", + "# These weights indicate alignment between words in English and French\n", + "alignment = calculate_weights(embedded_fr, embedded_en)\n", + "\n", + "# Visualize weights to check for alignment\n", + "fig, ax = plt.subplots(figsize=(7,7))\n", + "ax.imshow(alignment, cmap='gray')\n", + "ax.xaxis.tick_top()\n", + "ax.set_xticks(np.arange(alignment.shape[1]))\n", + "ax.set_xticklabels(sentence_en.split(\" \"), rotation=90, size=16);\n", + "ax.set_yticks(np.arange(alignment.shape[0]));\n", + "ax.set_yticklabels(sentence_fr.split(\" \"), size=16);" + ] + }, + { + "cell_type": "markdown", + "id": "d634f0ec", + "metadata": {}, + "source": [ + "If you implemented the weights calculations correctly, the alignment matrix should look like this:\n", + "\n", + "![alignment visualization](./images/alignment.png)\n", + "\n", + "This is a demonstration of alignment where the model has learned which words in English correspond to words in French. For example, the words *signed* and *signé* have a large weight because they have the same meaning. Typically, these alignments are learned using linear layers in the model, but you've used pre-trained embeddings here.\n", + "\n", + "### Exercise 2\n", + "Complete the implementation of scaled dot-product attention using your `calculate_weights` function (ignore the mask)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fbfc157e", + "metadata": {}, + "outputs": [], + "source": [ + "def attention_qkv(queries, keys, values):\n", + " \"\"\" Calculate scaled dot-product attention from queries, keys, and values matrices \"\"\"\n", + " \n", + " # Replace pass with your code.\n", + " return calculate_weights(queries, keys).dot(values)\n", + "\n", + "\n", + "attention_qkv_result = attention_qkv(embedded_fr, embedded_en, embedded_en)\n", + "\n", + "print(f\"The shape of the attention_qkv function is {attention_qkv_result.shape}\")\n", + "print(f\"Some elements of the attention_qkv function are \\n{attention_qkv_result[0:2,:10]}\")" + ] + }, + { + "cell_type": "markdown", + "id": "f98335f0", + "metadata": {}, + "source": [ + "**Expected output**\n", + "\n", + "The shape of the attention_qkv function is `(14, 300)`\n", + "\n", + "Some elements of the attention_qkv function are \n", + "```python\n", + "[[-0.04039161 -0.00275749 0.00389873 0.04842744 -0.02472726 0.01435613\n", + " -0.00370253 -0.0619686 -0.00206159 0.01615228]\n", + " [-0.04083253 -0.00245985 0.00409068 0.04830341 -0.02479128 0.01447497\n", + " -0.00355203 -0.06196036 -0.00241327 0.01582606]]\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "f87131fb", + "metadata": {}, + "source": [ + "## Solutions" + ] + }, + { + "cell_type": "markdown", + "id": "8470a024", + "metadata": {}, + "source": [ + "```python\n", + "def softmax(x, axis=0):\n", + " \"\"\" Calculate softmax function for an array x\n", + " \n", + " axis=0 calculates softmax across rows which means each column sums to 1 \n", + " axis=1 calculates softmax across columns which means each row sums to 1\n", + " \"\"\"\n", + " y = np.exp(x) \n", + " return y / np.expand_dims(np.sum(y, axis=axis), axis)\n", + "\n", + "def calculate_weights(queries, keys):\n", + " \"\"\" Calculate the weights for scaled dot-product attention\"\"\"\n", + " dot = np.matmul(queries, 
keys.T)/np.sqrt(keys.shape[1])\n", + "    weights = softmax(dot, axis=1)\n", + "    \n", + "    assert np.isclose(weights.sum(axis=1)[0], 1), \"Each row in weights must sum to 1\"\n", + "    \n", + "    return weights\n", + "\n", + "def attention_qkv(queries, keys, values):\n", + "    \"\"\" Calculate scaled dot-product attention from queries, keys, and values matrices \"\"\"\n", + "    weights = calculate_weights(queries, keys)\n", + "    return np.matmul(weights, values)\n", + "```" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/embeddings_en.npz b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/embeddings_en.npz new file mode 100644 index 0000000000000000000000000000000000000000..cd33945f592528e320a77d8016154e5863e4ff60 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/embeddings_en.npz @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:0dde813af6f3db6e520041567fe87619549e9a966515c534f050fe08413e0bc5 +size 6681346 diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/embeddings_fr.npz b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/embeddings_fr.npz new file mode 100644 index 0000000000000000000000000000000000000000..bf3c516d4814e01e3577b8a4bb23c09e492091c9 --- /dev/null +++ b/NLP with Attention 
Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/embeddings_fr.npz @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:02aeb7d162339f0440134f501e867b15203f0dfac2f4cde6c7333840642b03ad +size 6685746 diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/wmt19_can.txt b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/wmt19_can.txt new file mode 100644 index 0000000000000000000000000000000000000000..cbedae50faa75975fc059fd462799a76e373f5f3 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/wmt19_can.txt @@ -0,0 +1 @@ +Walisische AMs machten sich Sorgen, dass sie wie Muppets aussehen könnten Einige AMs sind bestürzt über den Vorschlag, ihren Titel in MWPs (Mitglied des walisischen Parlaments) zu ändern. Es ist aufgrund von Plänen entstanden, den Namen der Versammlung in walisisches Parlament zu ändern. AMs im gesamten politischen Spektrum befürchten, dass dies zu Spott führen könnte. Ein Labour AM sagte, seine Gruppe sei besorgt, "es reimt sich auf Twp und Pwp." Für Leser außerhalb von Wales: Auf Walisisch bedeutet twp dumm und pwp bedeutet poo. Ein Plaid AM sagte, die Gruppe als Ganzes sei "nicht glücklich" und habe Alternativen vorgeschlagen. Ein walisischer Konservativer sagte, seine Gruppe sei "aufgeschlossen" gegenüber der Namensänderung, bemerkte jedoch, dass es sich um einen kurzen verbalen Sprung von MWP nach Muppet handele. In diesem Zusammenhang wird der walisische Buchstabe w ähnlich wie die englische Aussprache des Buchstabens u in Yorkshire ausgesprochen. 
Die Versammlungskommission, die derzeit Gesetze zur Einführung der Namensänderungen ausarbeitet, sagte: "Die endgültige Entscheidung über Deskriptoren dessen, was Versammlungsmitglieder genannt werden, wird natürlich Sache der Mitglieder selbst sein." Der Government of Wales Act 2017 gab der walisischen Versammlung die Befugnis, ihren Namen zu ändern. Im Juni veröffentlichte die Kommission die Ergebnisse einer öffentlichen Konsultation zu den Vorschlägen, die breite Unterstützung für die Einberufung der Versammlung als walisisches Parlament fanden. In Bezug auf den Titel der AMs favorisierte die Kommission walisische Parlamentsmitglieder oder WMPs, aber die MWP-Option erhielt in einer öffentlichen Konsultation die größte Unterstützung. AMs schlagen offenbar alternative Optionen vor, aber der Kampf um einen Konsens könnte dem Vorsitzenden Elin Jones Kopfschmerzen bereiten, der voraussichtlich innerhalb von Wochen einen Gesetzesentwurf zu den Änderungen vorlegen wird. Die Gesetzgebung zu den Reformen wird weitere Änderungen in der Arbeitsweise der Versammlung enthalten, einschließlich Regeln zur Disqualifikation von AMs und zur Gestaltung des Ausschusssystems. AMs erhalten die endgültige Abstimmung über die Frage, wie sie genannt werden sollen, wenn sie über die Gesetzgebung debattieren. Mazedonier gehen im Referendum zur Wahl, um den Namen des Landes zu ändern Die Wähler werden am Sonntag darüber abstimmen, ob der Name ihres Landes in "Republik Nordmakedonien" geändert werden soll. Die Volksabstimmung wurde ins Leben gerufen, um einen jahrzehntelangen Streit mit dem benachbarten Griechenland zu lösen, das eine eigene Provinz namens Mazedonien hat. Athen hat lange darauf bestanden, dass der Name seines nördlichen Nachbarn einen Anspruch auf sein Territorium darstellt, und wiederholt gegen seine Beitrittsgesuche für die EU und die NATO protestiert. 
Der mazedonische Präsident Gjorge Ivanov, ein Gegner der Volksabstimmung über die Namensänderung, hat erklärt, er werde die Abstimmung ignorieren. Befürworter des Referendums, darunter auch Premierminister Zoran Zaev, argumentieren jedoch, dass die Namensänderung lediglich der Preis ist, der für den Beitritt zur EU und zur NATO zu zahlen ist. Die Glocken von St. Martin fallen still als Kirchen im Harlem-Kampf "Historisch gesehen haben die alten Leute, mit denen ich gesprochen habe, gesagt, dass es an jeder Ecke eine Bar und eine Kirche gibt", sagte Mr. Adams. "Heute gibt es keine." Er sagte, das Verschwinden von Balken sei verständlich. "Die Menschen knüpfen heutzutage andere Kontakte", sagte er. "Bars sind keine Wohnräume in der Nachbarschaft mehr, in die die Leute regelmäßig gehen." Was die Kirchen betrifft, befürchtet er, dass das Geld aus dem Verkauf von Vermögenswerten nicht so lange hält, wie die Führer es erwarten, "und früher oder später werden sie wieder da sein, wo sie angefangen haben". Kirchen, fügte er hinzu, könnten durch Wohnhäuser mit Eigentumswohnungen ersetzt werden, die mit Menschen gefüllt sind, die den verbleibenden Heiligtümern des Viertels nicht helfen werden. "Die überwiegende Mehrheit der Menschen, die Eigentumswohnungen in diesen Gebäuden kaufen, wird weiß sein", sagte er, "und daher den Tag beschleunigen, an dem diese Kirchen insgesamt schließen, da es unwahrscheinlich ist, dass die meisten dieser Menschen, die in diese Eigentumswohnungen ziehen, Mitglieder von werden." diese Kirchen. " Beide Kirchen wurden von weißen Gemeinden gebaut, bevor Harlem 1870 eine schwarze Metropole wurde - Metropolitan Community, ein Jahrzehnt später St. Martin's. Die ursprüngliche weiße methodistische Gemeinde zog in den 1930er Jahren aus. Eine schwarze Gemeinde, die in der Nähe angebetet hatte, nahm den Titel des Gebäudes an. St. Martin's wurde von einer schwarzen Gemeinde unter Rev. 
John Howard Johnson übernommen, die einen Boykott von Einzelhändlern in der 125th Street, einer Hauptstraße zum Einkaufen in Harlem, anführte und sich weigerte, Schwarze einzustellen oder zu fördern. Bei einem Brand im Jahr 1939 wurde das Gebäude schwer beschädigt, aber als die Gemeindemitglieder von Pater Johnson Pläne für den Wiederaufbau machten, gaben sie das Glockenspiel in Auftrag. Rev. David Johnson, Pater Johnsons Sohn und Nachfolger in St. Martin's, nannte das Glockenspiel stolz "die Glocken der Armen". Der Experte, der das Glockenspiel im Juli spielte, nannte es etwas anderes: "Ein kultureller Schatz" und "ein unersetzliches historisches Instrument". Die Expertin Tiffany Ng von der University of Michigan bemerkte auch, dass es das erste Glockenspiel der Welt war, das von einem schwarzen Musiker, Dionisio A. Lind, gespielt wurde, der vor 18 Jahren in das größere Glockenspiel der Riverside Church zog. Herr Merriweather sagte, dass St. Martin's ihn nicht ersetzte. Was sich in den letzten Monaten in St. Martin's abgespielt hat, war eine komplizierte Geschichte von Architekten und Bauunternehmern, einige von den Laienführern der Kirche, andere von der bischöflichen Diözese. Die Sakristei - das aus Laienführern bestehende Leitungsgremium der Gemeinde - schrieb im Juli an die Diözese mit der Sorge, dass die Diözese "versuchen würde, die Kosten an die Sakristei weiterzugeben", obwohl die Sakristei nicht an der Einstellung der Architekten beteiligt war und Auftragnehmer der Diözese geschickt. Einige Gemeindemitglieder beklagten sich über mangelnde Transparenz der Diözese. Hai verletzt 13-Jährigen beim Hummertauchen in Kalifornien Ein Hai hat am Samstag einen 13-jährigen Jungen angegriffen und verletzt, als er am Eröffnungstag der Hummersaison in Kalifornien nach Hummer tauchte. Der Angriff ereignete sich kurz vor 7 Uhr morgens in der Nähe von Beacon's Beach in Encinitas. 
Chad Hammel erzählte KSWB-TV in San Diego, er habe am Samstagmorgen etwa eine halbe Stunde lang mit Freunden getaucht, als er den Jungen um Hilfe schreien hörte, und paddelte dann mit einer Gruppe hinüber, um ihn aus dem Wasser zu ziehen. Hammel sagte zuerst, er dachte, es sei nur eine Aufregung, einen Hummer zu fangen, aber dann "bemerkte er, dass er schrie:" Ich habe gebissen! Ich habe gebissen! ' Sein ganzes Schlüsselbein wurde aufgerissen ", sagte Hammel, als er den Jungen erreichte. "Ich schrie alle an, aus dem Wasser zu steigen: 'Da ist ein Hai im Wasser!'", Fügte Hammel hinzu. Der Junge wurde in das Rady Kinderkrankenhaus in San Diego geflogen, wo er in einem kritischen Zustand ist. Die für den Angriff verantwortliche Haiart war unbekannt. Rettungsschwimmer-Kapitän Larry Giles sagte auf einer Pressekonferenz, dass einige Wochen zuvor ein Hai in der Gegend gesichtet worden sei, aber es wurde festgestellt, dass es sich nicht um eine gefährliche Haiart handelt. Giles fügte hinzu, dass das Opfer traumatische Verletzungen an seinem Oberkörperbereich erlitten habe. Beamte sperrten den Zugang zum Strand von Ponto Beach in Casablad zu Swami in Ecinitas für 48 Stunden zu Ermittlungs- und Sicherheitszwecken. Giles bemerkte, dass es in der Gegend mehr als 135 Haiarten gibt, aber die meisten gelten nicht als gefährlich. Sainsburys Pläne drängen auf den britischen Schönheitsmarkt Sainsbury's tritt gegen Boots, Superdrug und Debenhams mit Schönheitsgängen im Kaufhausstil an, die von spezialisierten Assistenten besetzt sind. Im Rahmen eines erheblichen Vorstoßes in den britischen Schönheitsmarkt von 2,8 Mrd. GBP, der weiter wächst, während die Verkäufe von Mode und Haushaltswaren zurückgehen, werden die größeren Schönheitsgänge in 11 Geschäften im ganzen Land getestet und im nächsten Jahr in weitere Geschäfte gebracht, wenn es ist ein Erfolg. 
Die Investition in Schönheit kommt daher, dass Supermärkte nach Möglichkeiten suchen, Regalflächen zu nutzen, die einst für Fernseher, Mikrowellen und Haushaltswaren verklagt wurden. Sainsbury's sagte, es würde die Größe seines Schönheitsangebots auf bis zu 3.000 Produkte verdoppeln, darunter erstmals Marken wie Revlon, Essie, Tweezerman und Dr. PawPaw. Bestehende Sortimente von L'Oreal, Maybelline und Burt's Bees werden auch mehr Platz mit Markenbereichen erhalten, die denen in Geschäften wie Boots ähneln. Der Supermarkt bringt auch sein Boutique-Make-up-Sortiment neu auf den Markt, sodass die meisten Produkte veganerfreundlich sind - was von jüngeren Käufern zunehmend gefordert wird. Darüber hinaus testet der Parfümhändler The Fragrance Shop Konzessionen in zwei Sainsbury's-Geschäften, von denen das erste letzte Woche in Croydon im Süden Londons eröffnet wurde, während das zweite Ende dieses Jahres in Selly Oak, Birmingham, eröffnet wird. Online-Einkäufe und die Verlagerung hin zum täglichen Kauf kleiner Mengen von Lebensmitteln in örtlichen Convenience-Läden bedeuten, dass Supermärkte mehr tun müssen, um die Besucher zum Besuch zu bewegen. Mike Coupe, der Geschäftsführer von Sainsbury's, sagte, dass die Filialen zunehmend wie Kaufhäuser aussehen werden, da die Supermarktkette versucht, sich mit mehr Dienstleistungen und Non-Food gegen die Discounter Aldi und Lidl zu wehren. Sainsbury's hat Argos-Filialen in Hunderten von Geschäften eingerichtet und seit dem Kauf beider Ketten vor zwei Jahren eine Reihe von Habitats eingeführt, was den Verkauf von Lebensmitteln gestärkt und die Akquisitionen rentabler gemacht hat. Der frühere Versuch des Supermarkts, seine Schönheits- und Apothekenabteilungen zu erneuern, schlug fehl. Sainsbury's testete Anfang der 2000er Jahre ein Joint Venture mit Boots, aber die Zusammenarbeit endete nach einem Streit darüber, wie die Einnahmen aus den Apotheken in den Supermärkten aufgeteilt werden sollten. 
Die neue Strategie kommt, nachdem Sainsbury's vor drei Jahren sein Apothekengeschäft mit 281 Filialen für 125 Millionen Pfund an Celesio, den Eigentümer der Lloyds Pharmacy-Kette, verkauft hat. Es hieß, Lloyds würde eine Rolle in dem Plan spielen, indem eine erweiterte Auswahl an Luxus-Hautpflegemarken, darunter La Roche-Posay und Vichy, in vier Geschäften hinzugefügt würde. Paul Mills-Hicks, kaufmännischer Leiter von Sainsbury, sagte: "Wir haben das Erscheinungsbild unserer Schönheitsgänge verändert, um die Umwelt für unsere Kunden zu verbessern. Wir haben auch in speziell ausgebildete Kollegen investiert, die Ihnen mit Rat und Tat zur Seite stehen. Unser Markensortiment ist auf alle Bedürfnisse zugeschnitten. Aufgrund der verführerischen Umgebung und der günstigen Lage sind wir jetzt ein überzeugendes Schönheitsziel, das die alte Art des Einkaufens in Frage stellt. " \ No newline at end of file diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/wmt19_ref.txt b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/wmt19_ref.txt new file mode 100644 index 0000000000000000000000000000000000000000..5bbaf9ea44e1593a987bcf9190d79aad2b275ac4 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/wmt19_ref.txt @@ -0,0 +1 @@ +Walisische Ageordnete sorgen sich "wie Dödel auszusehen" Es herrscht Bestürzung unter einigen Mitgliedern der Versammlung über einen Vorschlag, der ihren Titel zu MWPs (Mitglied der walisischen Parlament) ändern soll. Der Grund dafür waren Pläne, den Namen der Nationalversammlung in Walisisches Parlament zu ändern. Mitglieder aller Parteien der Nationalversammlung haben Bedenken, dass sie sich dadurch Spott aussetzen könnten. Ein Labour-Abgeordneter sagte, dass seine Gruppe "sich mit Twp und Pwp reimt". 
Hinweis für den Leser: „twp“ im Walisischen bedeutet „bescheuert“ und „pwp“ bedeutet „Kacke“. Ein Versammlungsmitglied von Plaid Cymru sagte, die Gruppe als Ganzes sei "nicht glücklich" und hat Alternativen vorgeschlagen. Ein walisischer Konservativer sagte, seine Gruppe wäre „offen“ für eine Namensänderung, wies aber darauf hin, dass es von „MWP“ (Mitglied des Walisischen Parlaments) nur ein kurzer verbaler Sprung zu „Muppet“ ist. Hinweis: Der walisische Buchstabe W wird ähnlich ausgesprochen wie das U im Englischen. Die Kommission der Nationalversammlung, die gerade an einem Gesetzentwurf für die Namensänderungen arbeitet, sagte: „Die finale Entscheidung über die Bezeichnung der Mitglieder der Nationalversammlung liegt natürlich bei den Mitgliedern selbst.“ Mit dem Government of Wales Act 2017 erhielt das walisische Parlament die Möglichkeit, seinen Namen zu ändern. Im Juni vergangenen Jahres hat die Kommission die Ergebnisse einer öffentlichen Anhörung zu den Vorschlägen veröffentlicht, wonach die Namensänderung in Walisisches Parlament breite Zustimmung findet. Bei der Frage um den Titel der Versammlungsmitglieder bevorzugte die Kommission walisischen Parlamentsmitglieder oder WMPs, jedoch bekam die MWP-Option die meiste Unterstützung in einer öffentlichen Befragung. Mitglieder des walisischen Parlaments schlagen offenbar alternative Optionen vor, aber der Kampf zu einem Konsens zu gelangen, könnte der Vorsitzenden Elin JONES Kopfschmerzen bereiten. Von ihr wird erwartet, dass sie einen Gesetzesentwurf für diese Änderungen in den nächsten Wochen vorlegt. Die Rechtsvorschriften über die Reformen wird Änderungen in der Arbeitsweise der Versammlung beinhalten, einschließlich der Vorschriften für die Disqualifikation von Mitgliedern des walisischen Parlaments und die Gestaltung des Auschusssystems. Die Mitglieder der Nationalversammlung können bei der Debatte um das Gesetz entscheiden, wie sie genannt werden sollen. 
Mazedonier halten über die Änderung des Landesnamens ein Referendum ab. Am Sonntag stimmen die Wahlberechtigten über die Änderung des Landesnamens zu „Republik Nordmazedonien“ ab. Die Volksabstimmung wird abgehalten, um einen jahrzehntelangen Streit mit dem benachbarten Griechenland beizulegen, in dem eine Provinz den Namen Mazedonien trägt. Athen beharrt seit langem darauf, dass der Name seines nördlichen Nachbarn einen Anspruch auf sein Territorium darstellt und hat wiederholt Einspruch gegen seinen Aufnahmeantrag für die EU und die NATO erhoben. Der mazedonische Präsident Gjorge Ivanov, ein Gegner des Referendums bezüglich der Namensänderung, hat gesagt, er werde die Abstimmung ignorieren. Die Befürworter des Referendums, einschließlich des Premierministers Zoran Zaev, argumentieren jedoch, dass die Namensänderung ganz einfach der Preis ist, den man für den Beitritt zur EU und zur NATO zahlen muss. Die Glocken von St. Martin verstummenn, da Kirchen in Harlem Probleme haben "Historisch gesehen haben die alten Leute, mit denen ich gesprochen habe, gesagt, dass es an jeder Ecke eine Bar und eine Kirche gab", sagte Herr Adams. „Heute gibt es weder noch." Er sagte, das Verschwinden von Kneipen sei verständlich. "Menschen knüpfen Komtakte auf eine andere Art und Weise", heutzutage sagte er. "Kneipen sind keine Wohnzimmer mehr in der Nachbarschaft, in denen man sich regelmäßig trifft." Was Kirchen angeht, fürchtet er, dass das Geld aus dem Verkauf von Vermögenswerten nicht so lange Bestand haben wird, wie die Anführer es erwarten. Kirchen könnten durch Mehrfamilienhäuser mit Eigentumswohnungen ersetzt werden, die mit der Art von Menschen gefüllt sind, die den verbleibenden Zufluchtsstätten des Stadtteils nicht helfen werden. 
Die überwältigende Mehrheit der Menschen, die Eigentumswohnungen in diesen Gebäuden kaufen, wird weiß sein, sagte er, "und wird daher den Tag, an dem diese Kirchen ganz geschlossen werden, beschleunigen, da es unwahrscheinlich ist, dass die meisten dieser Personen, die in diese Eigentumswohnungen einziehen, Mitglieder dieser Kirchen werden." Beide Kirchen wurden von weißen Gemeinden gebaut, bevor Harlem 1870 zur schwarzen Metropole wurde – Metropolitan Community, St. Martin's ein Jahrzehnt später. Die ursprüngliche weiße methodistische Gemeinde zog in den 1930er Jahren aus. Eine schwarze Gemeinde, die in der Nähe eine Religion ausübten, erwarb das Gebäude. St. Martin's wurde von einer schwarzen Gemeinde unter dem Pfarrer John Howard Johnson übernommen, der einen Boykott der Einzelhändler in der 125. Straße, einer Hauptstraße zum Einkaufen in Harlem, anführte, die sich der Einstellung oder Förderung von Schwarzen widersetzte. Ein Brand im Jahr 1939 hinterließ das Gebäude schwer beschädigt, aber als die Gemeindemitglieder von Pater Johnson Pläne zum Wiederaufbau machten, beauftragten sie das Glockenspiel. Pfarrer David Johnson, Sohn von Pater Johnson und Nachfolger in St. Martin's, nannte das Glockenspiel stolz „die Glocken der Armen." Der Experte, der im Juli das Glockenspiel spielte, nannte es noch etwas anderes: "Ein Kulturschatz" und "ein unersetzliches historisches Instrument". Der Experte, Tiffany Ng von der University of Michigan, stellte auch fest, dass es das erste Glockenspiel der Welt war, das von einem schwarzen Musiker, Dionisio A. Lind, gespielt wurde, der vor 18 Jahren in das größere Glockenspiel an der Riverside Church wechselte. Herr Merriweather sagte, dass St. Martin's ihn nicht ersetzt hat. Was sich in den letzten Monaten bei St. Martin abgespielt hat, war eine komplizierte Geschichte von Architekten und Bauunternehmern, einige wurden von den Laienführern der Kirche, andere von der Bischofsdiözese eingebracht. 
Die Sakristei – das Leitungsorgan der Pfarrei, das sich aus Laienführern zusammensetzt – schrieb im Juli an die Diözese mit der Sorge, dass die Diözese "versuchen würde, die Kosten an die Sakristei weiterzugeben", obwohl die Sakristei nicht an der Beauftragung der Architekten und Auftragnehmer der Diözese beteiligt war. Einige Gemeindemitglieder beklagten einen Mangel an Transparenz seitens der Diözese. Hai verletzt 13-jährigen Jungen beim Hummertauchen in Kalifornien Am Samstag griff ein Hai einen 13-jährigen Jungen an und verletzte ihn, während er in Kalifornien am Eröffnungstag der Hummersaison nach Hummern tauchte, sagten Beamte. Der Angriff fand kurz vor sieben Uhr morgens nahe dem Strand von Beacon in Encinitas statt. Chad Hammel sagte KSWB-TV in San Diego, er habe mit Freunden für eine halbe Stunde am Samstagmorgen getaucht, als er den Jungen um Hilfe schreien hörte. Er sei dann mit den anderen rübergepaddelt, um ihn aus dem Wasser zu retten. „Zuerst dachte ich, jemand freut sich, weil er einen Hummer gefangen hat“, sagte Hammel. Aber dann bemerkte ich, dass jemand schrie: „Ich wurde gebissen!“ Ich wurde gebissen! Sein ganzes Schlüsselbein wurde aufgerissen, sagte Hammel, er stellte dies fest als er zu dem Jungen kam. Ich schrie alle an, damit sie aus dem Wasser herauskommen: "Da ist ein Hai im Wasser!" sagte Hammel. Der Junge wurde ins Rady Children's Hospital in San Diego gebracht, wo sein Zustand als kritisch dokumentiert wurde. Die für den Angriff verantwortliche Haiart ist unbekannt. Rettungsschwimmer Kapitän Larry Giles sagte bei einer Medienbesprechung, dass ein Hai einige Wochen zuvor in der Gegend gesichtet worden war, aber es wurde festgestellt, dass es sich nicht um eine gefährliche Haiart handelt. Giles fügte im Oberkörperbereich seines Opfers traumatische Verletzungen hinzu. Beamte schlossen den Zugang zum Strand von Ponto Beach in Casablad zu Swami's in Ecinitas für 48 Stunden aus Sicherheitsgründen. 
Giles stellte fest, dass es mehr als 135 Haiarten in der Gegend gibt, aber die meisten gelten nicht als gefährlich. Sainsbury plant, den britischen Beauty-Markt zu erobern Sainsbury's übernimmt Boots, Superdrug und Debenhams mit Regalen mit Schönheitsprodukten im Kaufhausstil, die mit Fachassistenten besetzt sind. Im Rahmen eines umfangreichen Pushs in den britische Schönheitsmarkt in Großbritannien, der weiter wächst und £2.8bn wert ist, werden Mode-und Homeware-Verkäufe zurückfallen. Die größeren 11 Filialen, die das Land getestet und im nächsten Jahr wieder in mehr Läden gebracht hat, um Erfolg zu zeigen. Die Idee, in den Beauty-Markt zu investieren, ist daraus entstanden, dass Supermärkte nach Möglichkeiten suchen, den Regalplatz, der früher für Fernseher, Mikrowellen und Haushaltswaren verwendet wurde, für neue Produkte zu nutzen. Sainsbury teilte mit, dass das Angebot an Beauty-Produkten auf 3000 Produkte verdoppelt werde, indem erstmals auch Marken wie Revlon, Essie, Tweezerman und Dr. PawPaw in das Produktportfolio aufgenommen werden. Auch bestehende Sortimente von L'Oreal, Maybelline und Burt's Bees werden mehr Platz in Markenbereichen erhalten, ähnlich wie sie in Geschäften wie Boots zu finden sind. Die Supermarktkette belebt auch ihre hauseigene Makeup-Marke „Boutique“ neu, die zahlreiche bei jungen Käufern beliebte vegane Produkte im Angebot hat. Darüber hinaus testet der Parfümhändler Fragrance Shop Konzessionen in zwei Sainsburys Geschäften, von denen das erste letzte Woche in Croydon, Süd-London, eröffnet wurde, während ein zweites Ende dieses Jahres in Selly Oak, Birmingham, eröffnet wird. Online-Shopping und eine Verlagerung hin zum täglichen Einkauf kleiner Mengen Lebensmittel in Nachbarschaftsläden haben zur Folge, dass Supermärkte mehr tun müssen, um die Menschen zum Besuch zu bewegen. 
Mike Coupe, der Chef von Sainsbury's, hat gesagt, dass die Verkaufsstellen in zunehmendem Maße wie Warenhäuser aussehen werden, da die Supermarktkette versucht Discounter wie Aldi und Lidl mit mehr Dienstleistungen und Non-Food zu bekämpfen. Sainsbury's hat Argos-Outlets in Hunderten von Geschäften platziert und auch eine Reihe von Habitats eingeführt, seit es beide Ketten vor zwei Jahren gekauft hat, was angeblich den Lebensmittelverkauf gestärkt und die Akquisitionen profitabler gemacht hat. Der frühere Versuch des Supermarktes, seine Schönheits- und Apothekenabteilungen zu überarbeiten, endete mit einem Misserfolg. Sainsbury hat ein Joint Venture mit Boots in den frühen 2000er Jahren getestet, aber es endete unetschieden, und die Einnahmen aus den Geschäften der Apotheken konnten nicht geteilt werden. Die neue Strategie kommt, nachdem Sainsbury's sein 281-Filialen-Apothekengeschäft vor drei Jahren für 125 Millionen Pfund an Celesio, den Eigentümer der Lloyds Pharmacy Kette, verkauft hat. Laut Angaben soll Lloyds ebenfalls beteiligt sein. In vier Lloyds-Stores soll das Angebot an Luxus-Hautpflegeprodukten von Marken wie La Roche-Posay und Vichy stark erweitert werden. Paul Mills-Hicks, Werbechef von Sainsbury, sagte: Wir haben das Erscheinungsbild unserer Regale für Schönheitsprodukte verändert, um die Ausstattung für unsere Kunden zu verbessern. Zusätzlich haben wir in die Schulung von Mitarbeitern investiert, die dem Kunden beratend zur Seite stehen sollen. Unser Markenportfolio lässt keine Wünsche offen. Das elegante Ambiente, die günstige Innenstadtlage und ein revolutioniertes Kauferlebnis machen uns zu einem beliebten Anlaufziel in Beauty-Fragen. 
\ No newline at end of file diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/wmt19_src.txt b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/wmt19_src.txt new file mode 100644 index 0000000000000000000000000000000000000000..e70b7d2a9b8bcd9f626aa2c00fed5772a6653e7e --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/wmt19_src.txt @@ -0,0 +1 @@ +Welsh AMs worried about 'looking like muppets' There is consternation among some AMs at a suggestion their title should change to MWPs (Member of the Welsh Parliament). It has arisen because of plans to change the name of the assembly to the Welsh Parliament. AMs across the political spectrum are worried it could invite ridicule. One Labour AM said his group was concerned "it rhymes with Twp and Pwp." For readers outside of Wales: In Welsh twp means daft and pwp means poo. A Plaid AM said the group as a whole was "not happy" and has suggested alternatives. A Welsh Conservative said his group was "open minded" about the name change, but noted it was a short verbal hop from MWP to Muppet. In this context The Welsh letter w is pronounced similarly to the Yorkshire English pronunciation of the letter u. The Assembly Commission, which is currently drafting legislation to introduce the name changes, said: "The final decision on any descriptors of what Assembly Members are called will of course be a matter for the members themselves." The Government of Wales Act 2017 gave the Welsh assembly the power to change its name. In June, the Commission published the results of a public consultation on the proposals which found broad support for calling the assembly a Welsh Parliament. 
On the matter of the AMs' title, the Commission favoured Welsh Parliament Members or WMPs, but the MWP option received the most support in a public consultation. AMs are apparently suggesting alternative options, but the struggle to reach consensus could be a headache for the Presiding Officer, Elin Jones, who is expected to submit draft legislation on the changes within weeks. The legislation on the reforms will include other changes to the way the assembly works, including rules on disqualification of AMs and the design of the committee system. AMs will get the final vote on the question of what they should be called when they debate the legislation. Macedonians go to polls in referendum on changing country's name Voters will vote Sunday on whether to change their country's name to the "Republic of North Macedonia." The popular vote was set up in a bid to resolve a decades-long dispute with neighboring Greece, which has its own province called Macedonia. Athens has long insisted that its northern neighbor's name represents a claim on its territory and has repeatedly objected to its membership bids for the EU and NATO. Macedonian President Gjorge Ivanov, an opponent of the plebiscite on the name change, has said he will disregard the vote. However, supporters of the referendum, including Prime Minister Zoran Zaev, argue that the name change is simply the price to pay to join the EU and NATO. The Bells of St. Martin's Fall Silent as Churches in Harlem Struggle "Historically, the old people I've talked to say there was a bar and a church on every corner," Mr. Adams said. "Today, there's neither." He said the disappearance of bars was understandable. "People socialize in a different way" nowadays, he said. "Bars are no longer neighborhood living rooms where people go on a regular basis." As for churches, he worries that the money from selling assets will not last as long as leaders expect it to, "and sooner or later they'll be right back where they started." 
Churches, he added, could be replaced by apartment buildings with condominiums filled with the kind of people who will not help the neighborhood's remaining sanctuaries. "The overwhelming majority of people who buy condominiums in these buildings will be white," he said, "and therefore will hasten the day that these churches close altogether because it is unlikely that most of these people who move into these condominiums will become members of these churches." Both churches were built by white congregations before Harlem became a black metropolis - Metropolitan Community in 1870, St. Martin's a decade later. The original white Methodist congregation moved out in the 1930s. A black congregation that had been worshiping nearby took title to the building. St. Martin's was taken over by a black congregation under the Rev. John Howard Johnson, who led a boycott of retailers on 125th Street, a main street for shopping in Harlem, who resisted hiring or promoting blacks. A fire in 1939 left the building badly damaged, but as Father Johnson's parishioners made plans to rebuild, they commissioned the carillon. The Rev. David Johnson, Father Johnson's son and successor at St. Martin's, proudly called the carillon "the poor people's bells." The expert who played the carillon in July called it something else: "A cultural treasure" and "an irreplaceable historical instrument." The expert, Tiffany Ng of the University of Michigan, also noted that it was the first carillon in the world to be played by a black musician, Dionisio A. Lind, who moved to the larger carillon at the Riverside Church 18 years ago. Mr. Merriweather said that St. Martin's did not replace him. What has played out at St. Martin's over the last few months has been a complicated tale of architects and contractors, some brought in by the lay leaders of the church, others by the Episcopal diocese. 
The vestry - the parish's governing body, made up of lay leaders - wrote the diocese in July with concerns that the diocese "would seek to pass along the costs" to the vestry, even though the vestry had not been involved in hiring the architects and contractors the diocese sent in. Some parishioners complained of a lack of transparency on the diocese's part. Shark injures 13-year-old on lobster dive in California A shark attacked and injured a 13-year-old boy Saturday while he was diving for lobster in California on the opening day of lobster season, officials said. The attack occurred just before 7 a.m. near Beacon's Beach in Encinitas. Chad Hammel told KSWB-TV in San Diego he had been diving with friends for about half an hour Saturday morning when he heard the boy screaming for help and then paddled over with a group to help pull him out of the water. Hammel said at first he thought it was just excitement of catching a lobster, but then he "realized that he was yelling, 'I got bit! I got bit!' His whole clavicle was ripped open," Hammel said he noticed once he got to the boy. "I yelled at everyone to get out of the water: 'There's a shark in the water!'" Hammel added. The boy was airlifted to Rady Children's Hospital in San Diego where he is listed in critical condition. The species of shark responsible for the attack was unknown. Lifeguard Capt. Larry Giles said at a media briefing that a shark had been spotted in the area a few weeks earlier, but it was determined not to be a dangerous species of shark. Giles added the victim sustained traumatic injuries to his upper torso area. Officials shut down beach access from Ponto Beach in Casablad to Swami's in Ecinitas for 48 hours for investigation and safety purposes. Giles noted that there are more than 135 shark species in the area, but most are not considered dangerous. 
Sainsbury's plans push into UK beauty market Sainsbury's is taking on Boots, Superdrug and Debenhams with department store-style beauty aisles staffed with specialist assistants. As part of a substantial push into the UK's £2.8bn beauty market, which is continuing to grow while fashion and homeware sales fall back, the larger beauty aisles will be tested out in 11 stores around the country and taken to more stores next year if it proves a success. The investment in beauty comes as supermarkets hunt for ways to use up shelf space once sued for TVs, microwaves and homeware. Sainsbury's said it would be doubling the size of its beauty offering to up to 3,000 products, including brands such as Revlon, Essie, Tweezerman and Dr. PawPaw for the first time. Existing ranges from L'Oreal, Maybelline and Burt's Bees will also get more space with branded areas similar to those found in shops like Boots. The supermarket is also relaunching its Boutique makeup range so that the majority of products are vegan-friendly - something increasingly demanded by younger shoppers. In addition, perfume retailer the Fragrance Shop will be testing out concessions in two Sainsbury's stores, the first of which opened in Croydon, south London, last week while a second opens in Selly Oak, Birmingham, later this year. Online shopping and a shift towards buying small amounts of food daily at local convenience stores means supermarkets are having to do more to persuade people to visit. Mike Coupe, the chief executive of Sainsbury's, has said the outlets will look increasingly like department stores as the supermarket chain tries to fight back against the discounters Aldi and Lidl with more services and non-food. Sainsbury's has been putting Argos outlets in hundreds of stores and has also introduced a number of Habitats since it bought both chains two years ago, which it says has bolstered grocery sales and made the acquisitions more profitable. 
The supermarket's previous attempt to revamp its beauty and pharmacy departments ended in failure. Sainsbury's tested a joint venture with Boots in the early 2000s, but the tie-up ended after a row over how to split the revenues from the chemist's stores in its supermarkets. The new strategy comes after Sainsbury's sold its 281-store pharmacy business to Celesio, the owner of the Lloyds Pharmacy chain, for £125m, three years ago. It said Lloyds would play a role in the plan, by adding an extended range of luxury skincare brands including La Roche-Posay and Vichy in four stores. Paul Mills-Hicks, Sainsbury's commercial director, said: "We've transformed the look and feel of our beauty aisles to enhance the environment for our customers. We've also invested in specially trained colleagues who will be on hand to offer advice. Our range of brands is designed to suit every need and the alluring environment and convenient locations mean we're now a compelling beauty destination which challenges the old way of shopping." 
\ No newline at end of file diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/word2int_en.pkl b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/word2int_en.pkl new file mode 100644 index 0000000000000000000000000000000000000000..2cf293d245c03be1b0d0391972e453b766690840 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/word2int_en.pkl @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:42b2af079f58c498b5f437e1b6922d486a8baa721f254c15ccaba44e5a7c8165 +size 127796 diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/word2int_fr.pkl b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/word2int_fr.pkl new file mode 100644 index 0000000000000000000000000000000000000000..ab919ec5757136112c9b0356e0ecdf6612a04c14 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/data/word2int_fr.pkl @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:16e1f4953589bfcad3daae8002476a5727f9902dc8f48f513a20bcb1c94f81dd +size 133064 diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/images/alignment.png b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/images/alignment.png new file mode 100644 index 0000000000000000000000000000000000000000..f24eb9fa32898ed658388f7777f2bafdf89dc985 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/images/alignment.png differ diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV 
Attention/Files/home/jovyan/work/images/alignment_model_3.jpg b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/images/alignment_model_3.jpg new file mode 100644 index 0000000000000000000000000000000000000000..ea6497357f47d790799716c71d41f64622fc805d Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/images/alignment_model_3.jpg differ diff --git a/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/images/attention.png b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/images/attention.png new file mode 100644 index 0000000000000000000000000000000000000000..fc538b62e816a977b2605075eedd22bda1bcabc3 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/Basic Attention-BLEU-QKV Attention/Files/home/jovyan/work/images/attention.png differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/.ipynb_checkpoints/C4W1_Assignment-checkpoint.ipynb b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/.ipynb_checkpoints/C4W1_Assignment-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..cc31e334a1f73b5da2080e8241680b5163781ce9 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/.ipynb_checkpoints/C4W1_Assignment-checkpoint.ipynb @@ -0,0 +1,1994 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "9cb49525", + "metadata": {}, + "source": [ + "# Assignment 1: Neural Machine Translation\n", + "\n", + "Welcome to the first assignment of Course 4. Here, you will build an English-to-Portuguese neural machine translation (NMT) model using Long Short-Term Memory (LSTM) networks with attention. 
Machine translation is an important task in natural language processing and could be useful not only for translating one language to another but also for word sense disambiguation (e.g. determining whether the word \"bank\" refers to the financial bank, or the land alongside a river). Implementing this using just a Recurrent Neural Network (RNN) with LSTMs can work for short to medium length sentences but can result in vanishing gradients for very long sequences. To help with this, you will be adding an attention mechanism to allow the decoder to access all relevant parts of the input sentence regardless of its length. By completing this assignment, you will:\n", + "\n", + "- Implement an encoder-decoder system with attention\n", + "- Build the NMT model from scratch using Tensorflow\n", + "- Generate translations using greedy and Minimum Bayes Risk (MBR) decoding\n", + "\n", + "## Table of Contents\n", + "- [1 - Data Preparation](#1)\n", + "- [2 - NMT model with attention](#2)\n", + " - [Exercise 1 - Encoder](#ex1)\n", + " - [Exercise 2 - CrossAttention](#ex2)\n", + " - [Exercise 3 - Decoder](#ex3) \n", + " - [Exercise 4 - Translator](#ex4)\n", + "- [3 - Training](#3)\n", + "- [4 - Using the model for inference ](#4)\n", + " - [Exercise 5 - translate](#ex5)\n", + "- [5 - Minimum Bayes-Risk Decoding](#5)\n", + " - [Exercise 6 - rouge1_similarity](#ex6)\n", + " - [Exercise 7 - average_overlap](#ex7)\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f9ef370d", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3' # Setting this env variable prevents TF warnings from showing up\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "from collections import Counter\n", + "from utils import (sentences, train_data, val_data, english_vectorizer, portuguese_vectorizer, \n", + " masked_loss, masked_acc, 
tokens_to_text)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8adb8fd6", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "import w1_unittest" + ] + }, + { + "cell_type": "markdown", + "id": "e76be1dc", + "metadata": {}, + "source": [ + "\n", + "## 1. Data Preparation\n", + "\n", + "The text pre-processing bits have already been taken care of (if you are interested in this be sure to check the `utils.py` file). The steps performed can be summarized as:\n", + "\n", + "- Reading the raw data from the text files\n", + "- Cleaning the data (using lowercase, adding space around punctuation, trimming whitespaces, etc)\n", + "- Splitting it into training and validation sets\n", + "- Adding the start-of-sentence and end-of-sentence tokens to every sentence\n", + "- Tokenizing the sentences\n", + "- Creating a Tensorflow dataset out of the tokenized sentences\n", + "\n", + "Take a moment to inspect the raw sentences:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "226033a1", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "portuguese_sentences, english_sentences = sentences\n", + "\n", + "print(f\"English (to translate) sentence:\\n\\n{english_sentences[-5]}\\n\")\n", + "print(f\"Portuguese (translation) sentence:\\n\\n{portuguese_sentences[-5]}\")" + ] + }, + { + "cell_type": "markdown", + "id": "5ba90eb9", + "metadata": {}, + "source": [ + "You don't have much use for the raw sentences so delete them to save memory:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d9f081b0", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "del portuguese_sentences\n", + "del english_sentences\n", + "del sentences" + ] + }, + { + "cell_type": "markdown", + "id": "a2ff83d2", + "metadata": {}, + "source": [ 
+ "Notice that you imported an `english_vectorizer` and a `portuguese_vectorizer` from `utils.py`. These were created using [tf.keras.layers.TextVectorization](https://www.tensorflow.org/api_docs/python/tf/keras/layers/TextVectorization) and they provide interesting features such as ways to visualize the vocabulary and convert text into tokenized ids and vice versa. In fact, you can inspect the first ten words of the vocabularies for both languages:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2c1cfc17", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "print(f\"First 10 words of the english vocabulary:\\n\\n{english_vectorizer.get_vocabulary()[:10]}\\n\")\n", + "print(f\"First 10 words of the portuguese vocabulary:\\n\\n{portuguese_vectorizer.get_vocabulary()[:10]}\")" + ] + }, + { + "cell_type": "markdown", + "id": "3152b075", + "metadata": {}, + "source": [ + "Notice that the first 4 words are reserved for special words. 
In order, these are:\n", + "\n", + "- the empty string\n", + "- a special token to represent an unknown word\n", + "- a special token to represent the start of a sentence\n", + "- a special token to represent the end of a sentence\n", + "\n", + "You can see how many words are in a vocabulary by using the `vocabulary_size` method:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5facaa0c", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Size of the vocabulary\n", + "vocab_size_por = portuguese_vectorizer.vocabulary_size()\n", + "vocab_size_eng = english_vectorizer.vocabulary_size()\n", + "\n", + "print(f\"Portuguese vocabulary is made up of {vocab_size_por} words\")\n", + "print(f\"English vocabulary is made up of {vocab_size_eng} words\")" + ] + }, + { + "cell_type": "markdown", + "id": "53e4b615", + "metadata": { + "editable": true, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "You can define [tf.keras.layers.StringLookup](https://www.tensorflow.org/api_docs/python/tf/keras/layers/StringLookup) objects that will help you map from words to ids and vice versa. 
Do this for the Portuguese vocabulary since this will be useful later on when you decode the predictions from your model:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "218f7a36", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# This helps you convert from words to ids\n", + "word_to_id = tf.keras.layers.StringLookup(\n", + " vocabulary=portuguese_vectorizer.get_vocabulary(), \n", + " mask_token=\"\", \n", + " oov_token=\"[UNK]\"\n", + ")\n", + "\n", + "# This helps you convert from ids to words\n", + "id_to_word = tf.keras.layers.StringLookup(\n", + " vocabulary=portuguese_vectorizer.get_vocabulary(),\n", + " mask_token=\"\",\n", + " oov_token=\"[UNK]\",\n", + " invert=True,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "4af8b623", + "metadata": {}, + "source": [ + "Try it out for the special tokens and a random word:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "20076b9a", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "unk_id = word_to_id(\"[UNK]\")\n", + "sos_id = word_to_id(\"[SOS]\")\n", + "eos_id = word_to_id(\"[EOS]\")\n", + "baunilha_id = word_to_id(\"baunilha\")\n", + "\n", + "print(f\"The id for the [UNK] token is {unk_id}\")\n", + "print(f\"The id for the [SOS] token is {sos_id}\")\n", + "print(f\"The id for the [EOS] token is {eos_id}\")\n", + "print(f\"The id for baunilha (vanilla) is {baunilha_id}\")" + ] + }, + { + "cell_type": "markdown", + "id": "2f1d744c", + "metadata": {}, + "source": [ + "Finally, take a look at what the data that is going to be fed to the neural network looks like. Both `train_data` and `val_data` are of type `tf.data.Dataset` and are already arranged in batches of 64 examples. To get the first batch out of a tf dataset you can use the `take` method. 
To get the first example out of the batch you can slice the tensor and use the `numpy` method for nicer printing:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "739777eb", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "for (to_translate, sr_translation), translation in train_data.take(1):\n", + " print(f\"Tokenized english sentence:\\n{to_translate[0, :].numpy()}\\n\\n\")\n", + " print(f\"Tokenized portuguese sentence (shifted to the right):\\n{sr_translation[0, :].numpy()}\\n\\n\")\n", + " print(f\"Tokenized portuguese sentence:\\n{translation[0, :].numpy()}\\n\\n\")" + ] + }, + { + "cell_type": "markdown", + "id": "bdd9ee3c", + "metadata": { + "editable": true, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "There are a couple of important details to notice.\n", + "\n", + "- Padding has already been applied to the tensors and the value used for this is 0\n", + "- Each example consists of 3 different tensors:\n", + " - The sentence to translate\n", + " - The shifted-to-the-right translation\n", + " - The translation\n", + " \n", + "The first two can be considered as the features, while the third one as the target. By doing this your model can perform Teacher Forcing as you saw in the lectures.\n", + "\n", + "Now it is time to begin coding!" + ] + }, + { + "cell_type": "markdown", + "id": "dd41cb52", + "metadata": { + "editable": true, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "## 2. NMT model with attention\n", + "\n", + "The model you will build uses an encoder-decoder architecture. This Recurrent Neural Network (RNN) takes in a tokenized version of a sentence in its encoder, then passes it on to the decoder for translation. 
As mentioned in the lectures, just using a regular sequence-to-sequence model with LSTMs will work effectively for short to medium sentences but will start to degrade for longer ones. You can picture it like the figure below where all of the context of the input sentence is compressed into one vector that is passed into the decoder block. You can see how this will be an issue for very long sentences (e.g. 100 tokens or more) because the context of the first parts of the input will have very little effect on the final vector passed to the decoder.\n", + "\n", + "\n", + "\n", + "Adding an attention layer to this model avoids this problem by giving the decoder access to all parts of the input sentence. To illustrate, let's just use a 4-word input sentence as shown below. Remember that a hidden state is produced at each timestep of the encoder (represented by the orange rectangles). These are all passed to the attention layer and each is given a score based on the current activation (i.e. hidden state) of the decoder. For instance, let's consider the figure below where the first prediction \"como\" is already made. To produce the next prediction, the attention layer will first receive all the encoder hidden states (i.e. orange rectangles) as well as the decoder hidden state when producing the word \"como\" (i.e. first green rectangle). Given this information, it will score each of the encoder hidden states to know which one the decoder should focus on to produce the next word. As a result of training, the model might have learned that it should align to the second encoder hidden state and subsequently assigns a high probability to the word \"você\". 
If we are using greedy decoding, we output that word as the next symbol, then restart the process to produce the next word until we reach an end-of-sentence prediction.\n", + "\n", + "\n", + "\n", + "There are different ways to implement attention, and the one we'll use for this assignment is Scaled Dot Product Attention, which has the form:\n", + "\n", + "$$Attention(Q, K, V) = softmax(\\frac{QK^T}{\\sqrt{d_k}})V$$\n", + "\n", + "You will dive deeper into this equation in the next week but for now, you can think of it as computing scores using the queries (Q) and keys (K), followed by a multiplication with the values (V) to get a context vector at a particular timestep of the decoder. This context vector is fed to the decoder RNN to get a set of probabilities for the next predicted word. The division by the square root of the key dimensionality ($\\sqrt{d_k}$) improves model performance, and you'll also learn more about it next week. For our machine translation application, the encoder activations (i.e. encoder hidden states) will be the keys and values, while the decoder activations (i.e. decoder hidden states) will be the queries.\n", + "\n", + "You will see in the upcoming sections that this complex architecture and mechanism can be implemented with just a few lines of code.\n", + "\n", + "First, you will define two important global variables:\n", + "\n", + "- The size of the vocabulary\n", + "- The number of units in the LSTM layers (the same number will be used for all LSTM layers)\n", + "\n", + "In this assignment, the vocabulary sizes for English and Portuguese are the same, so a single constant VOCAB_SIZE is used throughout the notebook. In other settings the vocabulary sizes could differ, but that is not the case here." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2e484abf", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "VOCAB_SIZE = 12000\n", + "UNITS = 256" + ] + }, + { + "cell_type": "markdown", + "id": "cc251965", + "metadata": {}, + "source": [ + "\n", + "## Exercise 1 - Encoder\n", + "\n", + "Your first exercise is to code the encoder part of the neural network. For this, complete the `Encoder` class below. Notice that in the constructor (the `__init__` method) you need to define all of the sublayers of the encoder and then use these sublayers during the forward pass (the `call` method).\n", + "\n", + "The encoder consists of the following layers:\n", + "\n", + "- [Embedding](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Embedding). For this layer you need to define the appropriate `input_dim` and `output_dim` and let it know that you are using '0' as padding, which can be done by using the appropriate value for the `mask_zero` parameter.\n", + " \n", + "+ [Bidirectional](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Bidirectional) [LSTM](https://www.tensorflow.org/api_docs/python/tf/keras/layers/LSTM). In TF you can implement bidirectional behaviour for RNN-like layers. This part is already taken care of but you will need to specify the appropriate type of layer as well as its parameters. In particular you need to set the appropriate number of units and make sure that the LSTM returns the full sequence and not only the last output, which can be done by using the appropriate value for the `return_sequences` parameter.\n", + "\n", + "\n", + "You need to define the forward pass using the syntax of TF's [functional API](https://www.tensorflow.org/guide/keras/functional_api). 
What this means is that you chain function calls together to define your network like this:\n", + "\n", + "```python\n", + "encoder_input = keras.Input(shape=(28, 28, 1), name=\"original_img\")\n", + "x = layers.Conv2D(16, 3, activation=\"relu\")(encoder_input)\n", + "x = layers.MaxPooling2D(3)(x)\n", + "x = layers.Conv2D(16, 3, activation=\"relu\")(x)\n", + "encoder_output = layers.GlobalMaxPooling2D()(x)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b1db0a1d", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED CLASS: Encoder\n", + "class Encoder(tf.keras.layers.Layer):\n", + " def __init__(self, vocab_size, units):\n", + " \"\"\"Initializes an instance of this class\n", + "\n", + " Args:\n", + " vocab_size (int): Size of the vocabulary\n", + " units (int): Number of units in the LSTM layer\n", + " \"\"\"\n", + " super(Encoder, self).__init__()\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " self.embedding = tf.keras.layers.Embedding( \n", + " input_dim=None,\n", + " output_dim=None,\n", + " mask_zero=None\n", + " ) \n", + "\n", + " self.rnn = tf.keras.layers.Bidirectional( \n", + " merge_mode=\"sum\", \n", + " layer=tf.keras.layers.None(\n", + " units=None,\n", + " return_sequences=None\n", + " ), \n", + " ) \n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " def call(self, context):\n", + " \"\"\"Forward pass of this layer\n", + "\n", + " Args:\n", + " context (tf.Tensor): The sentence to translate\n", + "\n", + " Returns:\n", + " tf.Tensor: Encoded sentence to translate\n", + " \"\"\"\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " # Pass the context through the embedding layer\n", + " x = None\n", + "\n", + " # Pass the output of the embedding through the RNN\n", + " x = None\n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " return x" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "65034ffd", + "metadata": { + 
"deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Do a quick check of your implementation\n", + "\n", + "# Create an instance of your class\n", + "encoder = Encoder(VOCAB_SIZE, UNITS)\n", + "\n", + "# Pass a batch of sentences to translate from english to portuguese\n", + "encoder_output = encoder(to_translate)\n", + "\n", + "print(f'Tensor of sentences in english has shape: {to_translate.shape}\\n')\n", + "print(f'Encoder output has shape: {encoder_output.shape}')" + ] + }, + { + "cell_type": "markdown", + "id": "a909aea1", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Tensor of sentences in english has shape: (64, 14)\n", + "\n", + "Encoder output has shape: (64, 14, 256)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3031bb14", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# Test your code!\n", + "\n", + "w1_unittest.test_encoder(Encoder)" + ] + }, + { + "cell_type": "markdown", + "id": "1afe83f4", + "metadata": {}, + "source": [ + "\n", + "## Exercise 2 - CrossAttention\n", + "\n", + "Your next exercise is to code the layer that will perform cross attention between the original sentences and the translations. For this, complete the `CrossAttention` class below. Notice that in the constructor (the `__init__` method) you need to define all of the sublayers and then use these sublayers during the forward pass (the `call` method). For this particular case some of these bits are already taken care of.\n", + "\n", + "The cross attention consists of the following layers:\n", + "\n", + "- [MultiHeadAttention](https://www.tensorflow.org/api_docs/python/tf/keras/layers/MultiHeadAttention). For this layer you need to define the appropriate `key_dim`, which is the size of the key and query tensors. 
You will also need to set the number of heads to 1 since you aren't implementing multi head attention but attention between two tensors. The reason why this layer is preferred over [Attention](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Attention) is that it allows simpler code during the forward pass.\n", + " \n", + "A couple of things to notice:\n", + "- You need a way to pass both the output of the attention alongside the shifted-to-the-right translation (since this cross attention happens in the decoder side). For this you will use an [Add](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Add) layer so that the original dimension is preserved, which would not happen if you use something like a [Concatenate](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Concatenate) layer.\n", + "\n", + "+ Layer normalization is also performed for better stability of the network by using a [LayerNormalization](https://www.tensorflow.org/api_docs/python/tf/keras/layers/LayerNormalization) layer.\n", + "\n", + "- You don't need to worry about these last steps as these are already solved.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "74e71f3d", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED CLASS: CrossAttention\n", + "class CrossAttention(tf.keras.layers.Layer):\n", + " def __init__(self, units):\n", + " \"\"\"Initializes an instance of this class\n", + "\n", + " Args:\n", + " units (int): Number of units in the LSTM layer\n", + " \"\"\"\n", + " super().__init__()\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " self.mha = ( \n", + " tf.keras.layers.None(\n", + " key_dim=None,\n", + " num_heads=None\n", + " ) \n", + " ) \n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " self.layernorm = tf.keras.layers.LayerNormalization()\n", + " self.add = tf.keras.layers.Add()\n", + "\n", + " def call(self, context, target):\n", + " 
\"\"\"Forward pass of this layer\n", + "\n", + " Args:\n", + " context (tf.Tensor): Encoded sentence to translate\n", + " target (tf.Tensor): The embedded shifted-to-the-right translation\n", + "\n", + " Returns:\n", + " tf.Tensor: Cross attention between context and target\n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + "\n", + " # Call the MH attention by passing in the query and value\n", + " # For this case the query should be the translation and the value the encoded sentence to translate\n", + " # Hint: Check the call arguments of MultiHeadAttention in the docs\n", + " attn_output = None(\n", + " query=None,\n", + " value=None\n", + " ) \n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " x = self.add([target, attn_output])\n", + "\n", + " x = self.layernorm(x)\n", + "\n", + " return x" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4c62796f", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Do a quick check of your implementation\n", + "\n", + "# Create an instance of your class\n", + "attention_layer = CrossAttention(UNITS)\n", + "\n", + "# The attention layer expects the embedded sr-translation and the context\n", + "# The context (encoder_output) is already embedded so you need to do this for sr_translation:\n", + "sr_translation_embed = tf.keras.layers.Embedding(VOCAB_SIZE, output_dim=UNITS, mask_zero=True)(sr_translation)\n", + "\n", + "# Compute the cross attention\n", + "attention_result = attention_layer(encoder_output, sr_translation_embed)\n", + "\n", + "print(f'Tensor of contexts has shape: {encoder_output.shape}')\n", + "print(f'Tensor of translations has shape: {sr_translation_embed.shape}')\n", + "print(f'Tensor of attention scores has shape: {attention_result.shape}')" + ] + }, + { + "cell_type": "markdown", + "id": "41d4f99a", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Tensor of contexts has 
shape: (64, 14, 256)\n", + "Tensor of translations has shape: (64, 15, 256)\n", + "Tensor of attention scores has shape: (64, 15, 256)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4f658975", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# Test your code!\n", + "\n", + "w1_unittest.test_cross_attention(CrossAttention)" + ] + }, + { + "cell_type": "markdown", + "id": "aa296ee2", + "metadata": {}, + "source": [ + "\n", + "## Exercise 3 - Decoder\n", + "\n", + "\n", + "Now you will implement the decoder part of the neural network by completing the `Decoder` class below. Notice that in the constructor (the `__init__` method) you need to define all of the sublayers of the decoder and then use these sublayers during the forward pass (the `call` method).\n", + "\n", + "The decoder consists of the following layers:\n", + "\n", + "- [Embedding](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Embedding). For this layer you need to define the appropriate `input_dim` and `output_dim` and let it know that you are using '0' as padding, which can be done by using the appropriate value for the `mask_zero` parameter.\n", + " \n", + " \n", + "+ Pre-attention [LSTM](https://www.tensorflow.org/api_docs/python/tf/keras/layers/LSTM). Unlike in the encoder in which you used a Bidirectional LSTM, here you will use a vanilla LSTM. Don't forget to set the appropriate number of units and make sure that the LSTM returns the full sequence and not only the last output, which can be done by using the appropriate value for the `return_sequences` parameter. It is very important that this layer returns the state since this will be needed for inference so make sure to set the `return_state` parameter accordingly. 
Notice that LSTM layers return state as a tuple of two tensors called `memory_state` and `carry_state`, **however these names have been changed to better reflect what you have seen in the lectures to `hidden_state` and `cell_state` respectively**.\n", + "\n", + "- The attention layer that performs cross attention between the sentence to translate and the right-shifted translation. Here you need to use the `CrossAttention` layer you defined in the previous exercise.\n", + "\n", + "+ Post-attention [LSTM](https://www.tensorflow.org/api_docs/python/tf/keras/layers/LSTM). Another LSTM layer. For this one you don't need it to return the state.\n", + "\n", + "- Finally a [Dense](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Dense) layer. This one should have the same number of units as the size of the vocabulary since you expect it to compute the logits for every possible word in the vocabulary. Make sure to use a `logsoftmax` activation function for this one, which you can get as [tf.nn.log_softmax](https://www.tensorflow.org/api_docs/python/tf/nn/log_softmax).\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e9639bdb", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED CLASS: Decoder\n", + "class Decoder(tf.keras.layers.Layer):\n", + " def __init__(self, vocab_size, units):\n", + " \"\"\"Initializes an instance of this class\n", + "\n", + " Args:\n", + " vocab_size (int): Size of the vocabulary\n", + " units (int): Number of units in the LSTM layer\n", + " \"\"\"\n", + " super(Decoder, self).__init__()\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " # The embedding layer\n", + " self.embedding = tf.keras.layers.None(\n", + " input_dim=None,\n", + " output_dim=None,\n", + " mask_zero=None\n", + " ) \n", + "\n", + " # The RNN before attention\n", + " self.pre_attention_rnn = tf.keras.layers.None(\n", + " units=None,\n", + " return_sequences=None,\n", + " 
return_state=None\n", + " ) \n", + "\n", + " # The attention layer\n", + " self.attention = None(None)\n", + "\n", + " # The RNN after attention\n", + " self.post_attention_rnn = tf.keras.layers.None(\n", + " units=None,\n", + " return_sequences=None\n", + " ) \n", + "\n", + " # The dense layer with logsoftmax activation\n", + " self.output_layer = tf.keras.layers.None(\n", + " units=None,\n", + " activation=None\n", + " ) \n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " def call(self, context, target, state=None, return_state=False):\n", + " \"\"\"Forward pass of this layer\n", + "\n", + " Args:\n", + " context (tf.Tensor): Encoded sentence to translate\n", + " target (tf.Tensor): The shifted-to-the-right translation\n", + " state (list[tf.Tensor, tf.Tensor], optional): Hidden state of the pre-attention LSTM. Defaults to None.\n", + " return_state (bool, optional): If set to true return the hidden states of the LSTM. Defaults to False.\n", + "\n", + " Returns:\n", + " tf.Tensor: The log_softmax probabilities of predicting a particular token\n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + "\n", + " # Get the embedding of the input\n", + " x = self.None(None)\n", + "\n", + " # Pass the embedded input into the pre attention LSTM\n", + " # Hints:\n", + " # - The LSTM you defined earlier should return the output alongside the state (made up of two tensors)\n", + " # - Pass in the state to the LSTM (needed for inference)\n", + " x, hidden_state, cell_state = self.None(None, initial_state=None)\n", + "\n", + " # Perform cross attention between the context and the output of the LSTM (in that order)\n", + " x = self.None(None, None)\n", + "\n", + " # Do a pass through the post attention LSTM\n", + " x = self.None(None)\n", + "\n", + " # Compute the logits\n", + " logits = self.None(None)\n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " if return_state:\n", + " return logits, [hidden_state, cell_state]\n", + "\n", + " return logits" + ] + }, + { + 
"cell_type": "code", + "execution_count": null, + "id": "f6165cf2", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Do a quick check of your implementation\n", + "\n", + "# Create an instance of your class\n", + "decoder = Decoder(VOCAB_SIZE, UNITS)\n", + "\n", + "# Notice that you don't need the embedded version of sr_translation since this is done inside the class\n", + "logits = decoder(encoder_output, sr_translation)\n", + "\n", + "print(f'Tensor of contexts has shape: {encoder_output.shape}')\n", + "print(f'Tensor of right-shifted translations has shape: {sr_translation.shape}')\n", + "print(f'Tensor of logits has shape: {logits.shape}')" + ] + }, + { + "cell_type": "markdown", + "id": "6f2b5d7d", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Tensor of contexts has shape: (64, 14, 256)\n", + "Tensor of right-shifted translations has shape: (64, 15)\n", + "Tensor of logits has shape: (64, 15, 12000)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "1b61093a", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# Test your code!\n", + "\n", + "w1_unittest.test_decoder(Decoder, CrossAttention)" + ] + }, + { + "cell_type": "markdown", + "id": "9dcce3a7", + "metadata": {}, + "source": [ + "\n", + "## Exercise 4 - Translator\n", + "\n", + "Now you have to put together all of the layers you previously coded into an actual model. For this, complete the `Translator` class below. Notice how unlike the Encoder and Decoder classes inherited from `tf.keras.layers.Layer`, the Translator class inherits from `tf.keras.Model`.\n", + "\n", + "Remember that `train_data` will yield a tuple with the sentence to translate and the shifted-to-the-right translation, which are the \"features\" of the model. 
This means that the inputs of your network will be tuples containing context and targets." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "205fcf31", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED CLASS: Translator\n", + "class Translator(tf.keras.Model):\n", + " def __init__(self, vocab_size, units):\n", + " \"\"\"Initializes an instance of this class\n", + "\n", + " Args:\n", + " vocab_size (int): Size of the vocabulary\n", + " units (int): Number of units in the LSTM layer\n", + " \"\"\"\n", + " super().__init__()\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " # Define the encoder with the appropriate vocab_size and number of units\n", + " self.encoder = None\n", + "\n", + " # Define the decoder with the appropriate vocab_size and number of units\n", + " self.decoder = None\n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " def call(self, inputs):\n", + " \"\"\"Forward pass of this layer\n", + "\n", + " Args:\n", + " inputs (tuple(tf.Tensor, tf.Tensor)): Tuple containing the context (sentence to translate) and the target (shifted-to-the-right translation)\n", + "\n", + " Returns:\n", + " tf.Tensor: The log_softmax probabilities of predicting a particular token\n", + " \"\"\"\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " # In this case inputs is a tuple consisting of the context and the target, unpack it into single variables\n", + " context, target = None\n", + "\n", + " # Pass the context through the encoder\n", + " encoded_context = None\n", + "\n", + " # Compute the logits by passing the encoded context and the target to the decoder\n", + " logits = None\n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " return logits" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4d4a231c", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Do a quick check of 
your implementation\n", + "\n", + "# Create an instance of your class\n", + "translator = Translator(VOCAB_SIZE, UNITS)\n", + "\n", + "# Compute the logits for every word in the vocabulary\n", + "logits = translator((to_translate, sr_translation))\n", + "\n", + "print(f'Tensor of sentences to translate has shape: {to_translate.shape}')\n", + "print(f'Tensor of right-shifted translations has shape: {sr_translation.shape}')\n", + "print(f'Tensor of logits has shape: {logits.shape}')" + ] + }, + { + "cell_type": "markdown", + "id": "e3a162dd", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Tensor of sentences to translate has shape: (64, 14)\n", + "Tensor of right-shifted translations has shape: (64, 15)\n", + "Tensor of logits has shape: (64, 15, 12000)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "37009022", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "w1_unittest.test_translator(Translator, Encoder, Decoder)" + ] + }, + { + "cell_type": "markdown", + "id": "f81bc228", + "metadata": {}, + "source": [ + "\n", + "## 3. Training\n", + "\n", + "Now that you have an untrained instance of the NMT model, it is time to train it. 
You can use the `compile_and_train` function below to achieve this:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "8a61ef65", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def compile_and_train(model, epochs=20, steps_per_epoch=500):\n", + " model.compile(optimizer=\"adam\", loss=masked_loss, metrics=[masked_acc, masked_loss])\n", + "\n", + " history = model.fit(\n", + " train_data.repeat(),\n", + " epochs=epochs,\n", + " steps_per_epoch=steps_per_epoch,\n", + " validation_data=val_data,\n", + " validation_steps=50,\n", + " callbacks=[tf.keras.callbacks.EarlyStopping(patience=3)],\n", + " )\n", + "\n", + " return model, history" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "87d9bf9f", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# Train the translator (this takes some minutes so feel free to take a break)\n", + "\n", + "trained_translator, history = compile_and_train(translator)" + ] + }, + { + "cell_type": "markdown", + "id": "d23b9301", + "metadata": {}, + "source": [ + "\n", + "## 4. Using the model for inference \n", + "\n", + "\n", + "Now that your model is trained you can use it for inference. To help you with this the `generate_next_token` function is provided. Notice that this function is meant to be used inside a for-loop, so you feed to it the information of the previous step to generate the information of the next step. In particular you need to keep track of the state of the pre-attention LSTM in the decoder and if you are done with the translation. 
Also notice that a `temperature` variable is introduced which determines how to select the next token given the predicted logits: " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "522f6b6f", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def generate_next_token(decoder, context, next_token, done, state, temperature=0.0):\n", + " \"\"\"Generates the next token in the sequence\n", + "\n", + " Args:\n", + " decoder (Decoder): The decoder\n", + " context (tf.Tensor): Encoded sentence to translate\n", + " next_token (tf.Tensor): The predicted next token\n", + " done (bool): True if the translation is complete\n", + " state (list[tf.Tensor, tf.Tensor]): Hidden states of the pre-attention LSTM layer\n", + " temperature (float, optional): The temperature that controls the randomness of the predicted tokens. Defaults to 0.0.\n", + "\n", + " Returns:\n", + " tuple(tf.Tensor, np.float, list[tf.Tensor, tf.Tensor], bool): The next token, log prob of said token, hidden state of LSTM and if translation is done\n", + " \"\"\"\n", + " # Get the logits and state from the decoder\n", + " logits, state = decoder(context, next_token, state=state, return_state=True)\n", + " \n", + " # Trim the intermediate dimension \n", + " logits = logits[:, -1, :]\n", + " \n", + " # If temp is 0 then next_token is the argmax of logits\n", + " if temperature == 0.0:\n", + " next_token = tf.argmax(logits, axis=-1)\n", + " \n", + " # If temp is not 0 then next_token is sampled out of logits\n", + " else:\n", + " logits = logits / temperature\n", + " next_token = tf.random.categorical(logits, num_samples=1)\n", + " \n", + " # Trim dimensions of size 1\n", + " logits = tf.squeeze(logits)\n", + " next_token = tf.squeeze(next_token)\n", + " \n", + " # Get the logit of the selected next_token\n", + " logit = logits[next_token].numpy()\n", + " \n", + " # Reshape to (1,1) since this is the expected shape for 
text encoded as TF tensors\n", + " next_token = tf.reshape(next_token, shape=(1,1))\n", + " \n", + " # If next_token is End-of-Sentence token you are done\n", + " if next_token == eos_id:\n", + " done = True\n", + " \n", + " return next_token, logit, state, done" + ] + }, + { + "cell_type": "markdown", + "id": "190d2d76", + "metadata": {}, + "source": [ + "See how it works by running the following cell:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9937547a", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# PROCESS SENTENCE TO TRANSLATE AND ENCODE\n", + "\n", + "# A sentence you wish to translate\n", + "eng_sentence = \"I love languages\"\n", + "\n", + "# Convert it to a tensor\n", + "texts = tf.convert_to_tensor(eng_sentence)[tf.newaxis]\n", + "\n", + "# Vectorize it and pass it through the encoder\n", + "context = english_vectorizer(texts).to_tensor()\n", + "context = encoder(context)\n", + "\n", + "# SET STATE OF THE DECODER\n", + "\n", + "# Next token is Start-of-Sentence since you are starting fresh\n", + "next_token = tf.fill((1,1), sos_id)\n", + "\n", + "# Hidden and Cell states of the LSTM can be mocked using uniform samples\n", + "state = [tf.random.uniform((1, UNITS)), tf.random.uniform((1, UNITS))]\n", + "\n", + "# You are not done until next token is EOS token\n", + "done = False\n", + "\n", + "# Generate next token\n", + "next_token, logit, state, done = generate_next_token(decoder, context, next_token, done, state, temperature=0.5)\n", + "print(f\"Next token: {next_token}\\nLogit: {logit:.4f}\\nDone? {done}\")" + ] + }, + { + "cell_type": "markdown", + "id": "170323dd", + "metadata": {}, + "source": [ + "\n", + "## Exercise 5 - translate\n", + "\n", + "Now you can put everything together to translate a given sentence. For this, complete the `translate` function below. 
This function will take care of the following steps: \n", + "- Process the sentence to translate and encode it\n", + "\n", + "+ Set the initial state of the decoder\n", + "\n", + "- Get predictions of the next token (starting with the [SOS] token) for a maximum of `max_length` iterations (in case the [EOS] token is never returned)\n", + " \n", + "+ Return the translated text (as a string), the logit of the last iteration (this helps measure how certain the model was that the sequence was translated in its entirety) and the translation in token format.\n", + "\n", + "\n", + "Hints: \n", + "\n", + "- The previous cell provides a lot of insight into how this function should work, so if you get stuck refer to it.\n", + "\n", + "+ Some useful docs:\n", + " + [tf.newaxis](https://www.tensorflow.org/api_docs/python/tf#newaxis)\n", + "\n", + " - [tf.fill](https://www.tensorflow.org/api_docs/python/tf/fill)\n", + "\n", + " + [tf.zeros](https://www.tensorflow.org/api_docs/python/tf/zeros)\n", + "\n", + "\n", + "**IMPORTANT NOTE**: Due to randomness in TensorFlow's training and weight initialization, the results below may vary considerably, even if you retrain your model in the same session. \n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "42c74f1f", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: translate\n", + "def translate(model, text, max_length=50, temperature=0.0):\n", + " \"\"\"Translate a given sentence from English to Portuguese\n", + "\n", + " Args:\n", + " model (tf.keras.Model): The trained translator\n", + " text (string): The sentence to translate\n", + " max_length (int, optional): The maximum length of the translation. Defaults to 50.\n", + " temperature (float, optional): The temperature that controls the randomness of the predicted tokens.
Defaults to 0.0.\n", + "\n", + " Returns:\n", + " tuple(str, np.float, tf.Tensor): The translation, logit that predicted token and the tokenized translation\n", + " \"\"\"\n", + " # Lists to save tokens and logits\n", + " tokens, logits = [], []\n", + "\n", + " ### START CODE HERE ###\n", + " \n", + " # PROCESS THE SENTENCE TO TRANSLATE\n", + " \n", + " # Convert the original string into a tensor\n", + " text = tf.None(None)[tf.None]\n", + " \n", + " # Vectorize the text using the correct vectorizer\n", + " context = None(None).to_tensor()\n", + " \n", + " # Get the encoded context (pass the context through the encoder)\n", + " # Hint: Remember you can get the encoder by using model.encoder\n", + " context = None.None(None)\n", + " \n", + " # INITIAL STATE OF THE DECODER\n", + " \n", + " # First token should be SOS token with shape (1,1)\n", + " next_token = tf.None((None, None), None)\n", + " \n", + " # Initial hidden and cell states should be tensors of zeros with shape (1, UNITS)\n", + " state = [tf.None((None, None)), tf.None((None, None))]\n", + " \n", + " # You are done when you draw a EOS token as next token (initial state is False)\n", + " done = None\n", + "\n", + " # Iterate for max_length iterations\n", + " for None in None(None):\n", + " # Generate the next token\n", + " try:\n", + " next_token, logit, state, done = None(\n", + " decoder=None,\n", + " context=None,\n", + " next_token=None,\n", + " done=None,\n", + " state=None,\n", + " temperature=None\n", + " )\n", + " except:\n", + " raise Exception(\"Problem generating the next token\")\n", + " \n", + " # If done then break out of the loop\n", + " if None:\n", + " None\n", + " \n", + " # Add next_token to the list of tokens\n", + " None.None(None)\n", + " \n", + " # Add logit to the list of logits\n", + " None.None(None)\n", + " \n", + " ### END CODE HERE ###\n", + " \n", + " # Concatenate all tokens into a tensor\n", + " tokens = tf.concat(tokens, axis=-1)\n", + " \n", + " # Convert the translated 
tokens into text\n", + " translation = tf.squeeze(tokens_to_text(tokens, id_to_word))\n", + " translation = translation.numpy().decode()\n", + " \n", + " return translation, logits[-1], tokens" + ] + }, + { + "cell_type": "markdown", + "id": "3525e8ba", + "metadata": {}, + "source": [ + "Try your function with a temperature of 0, which yields a deterministic output and is equivalent to greedy decoding:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "daaea8c5", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# Running this cell multiple times should return the same output since temp is 0\n", + "\n", + "temp = 0.0 \n", + "original_sentence = \"I love languages\"\n", + "\n", + "translation, logit, tokens = translate(trained_translator, original_sentence, temperature=temp)\n", + "\n", + "print(f\"Temperature: {temp}\\n\\nOriginal sentence: {original_sentence}\\nTranslation: {translation}\\nTranslation tokens: {tokens}\\nLogit: {logit:.3f}\")" + ] + }, + { + "cell_type": "markdown", + "id": "7d05129b", + "metadata": {}, + "source": [ + "Try your function with a temperature of 0.7 (stochastic output):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0e0697db", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# Running this cell multiple times should return different outputs since temp is not 0\n", + "# You can try different temperatures\n", + "\n", + "temp = 0.7\n", + "original_sentence = \"I love languages\"\n", + "\n", + "translation, logit, tokens = translate(trained_translator, original_sentence, temperature=temp)\n", + "\n", + "print(f\"Temperature: {temp}\\n\\nOriginal sentence: {original_sentence}\\nTranslation: {translation}\\nTranslation tokens: {tokens}\\nLogit: {logit:.3f}\")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a3a9ea35", + "metadata": { + "deletable": 
false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "w1_unittest.test_translate(translate, trained_translator)" + ] + }, + { + "cell_type": "markdown", + "id": "ba027524", + "metadata": {}, + "source": [ + "\n", + "## 5. Minimum Bayes-Risk Decoding\n", + "\n", + "As mentioned in the lectures, getting the most probable token at each step may not necessarily produce the best results. Another approach is to do Minimum Bayes Risk Decoding, or MBR. The general steps to implement this are:\n", + "\n", + "- Take several random samples\n", + "- Score each sample against all other samples\n", + "- Select the one with the highest score\n", + "\n", + "You will be building helper functions for these steps in the following sections.\n", + "\n", + "With the ability to generate different translations by setting different temperature values, you can do what you saw in the lectures: generate a bunch of translations and then determine which one is the best candidate. You will now do this by using the provided `generate_samples` function. 
This function will return any desired number of candidate translations alongside the log-probability for each one:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "62301cd5", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def generate_samples(model, text, n_samples=4, temperature=0.6):\n", + " \n", + " samples, log_probs = [], []\n", + "\n", + " # Iterate for n_samples iterations\n", + " for _ in range(n_samples):\n", + " \n", + " # Save the logit and the translated tensor\n", + " _, logp, sample = translate(model, text, temperature=temperature)\n", + " \n", + " # Save the translated tensors\n", + " samples.append(np.squeeze(sample.numpy()).tolist())\n", + " \n", + " # Save the logits\n", + " log_probs.append(logp)\n", + " \n", + " return samples, log_probs" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "06bd792c", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "samples, log_probs = generate_samples(trained_translator, 'I love languages')\n", + "\n", + "for s, l in zip(samples, log_probs):\n", + " print(f\"Translated tensor: {s} has logit: {l:.3f}\")" + ] + }, + { + "cell_type": "markdown", + "id": "29b10677", + "metadata": {}, + "source": [ + "## Comparing overlaps\n", + "\n", + "Now that you can generate multiple translations it is time to come up with a method to measure the goodness of each one. As you saw in the lectures, one way to achieve this is by comparing each sample against the others. \n", + "\n", + "There are several metrics you can use for this purpose, as shown in the lectures and you can try experimenting with any one of these. For this assignment, you will be calculating scores for **unigram overlaps**. 
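As a quick, self-contained preview (illustrative only, not part of the graded exercises), the set operations behind a unigram overlap look like this:

```python
# Illustrative only: unigram overlap between two tokenized samples.
candidate = [1, 2, 3]
reference = [1, 2, 3, 4]

common = set(candidate) & set(reference)  # tokens in both: {1, 2, 3}
union = set(candidate) | set(reference)   # tokens in either: {1, 2, 3, 4}

overlap = len(common) / len(union)
print(overlap)  # 0.75
```

This intersection-over-union ratio is exactly the quantity the provided `jaccard_similarity` function computes.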
\n", + "\n", + "One of these metrics is the widely used yet simple [Jaccard similarity](https://en.wikipedia.org/wiki/Jaccard_index), which computes the intersection over union of two sets. The `jaccard_similarity` function returns this metric for any pair of candidate and reference translations:\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "edb54a71", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def jaccard_similarity(candidate, reference):\n", + " \n", + " # Convert the lists to sets to get the unique tokens\n", + " candidate_set = set(candidate)\n", + " reference_set = set(reference)\n", + " \n", + " # Get the set of tokens common to both candidate and reference\n", + " common_tokens = candidate_set.intersection(reference_set)\n", + " \n", + " # Get the set of all tokens found in either candidate or reference\n", + " all_tokens = candidate_set.union(reference_set)\n", + " \n", + " # Compute the fraction of overlap (divide the number of common tokens by the number of all tokens)\n", + " overlap = len(common_tokens) / len(all_tokens)\n", + " \n", + " return overlap" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fc3384bf", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "l1 = [1, 2, 3]\n", + "l2 = [1, 2, 3, 4]\n", + "\n", + "js = jaccard_similarity(l1, l2)\n", + "\n", + "print(f\"jaccard similarity between lists: {l1} and {l2} is {js:.3f}\")" + ] + }, + { + "cell_type": "markdown", + "id": "a6997662", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "jaccard similarity between lists: [1, 2, 3] and [1, 2, 3, 4] is 0.750\n", + "\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "b2510e3d", + "metadata": {}, + "source": [ + "\n", + "## Exercise 6 - rouge1_similarity\n", + "\n", + "Jaccard similarity is good but a 
more commonly used metric in machine translation is the ROUGE score. For unigrams, this is called ROUGE-1 and as shown in the lectures, you can output the scores for both precision and recall when comparing two samples. To get the final score, you will want to compute the F1-score as given by:\n", + "\n", + "$$score = 2* \\frac{(precision * recall)}{(precision + recall)}$$\n", + "\n", + "For the implementation of the `rouge1_similarity` function you want to use the [Counter](https://docs.python.org/3/library/collections.html#collections.Counter) class from the Python standard library:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fb2e0a00", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: rouge1_similarity\n", + "def rouge1_similarity(candidate, reference):\n", + " \"\"\"Computes the ROUGE 1 score between two token lists\n", + "\n", + " Args:\n", + " candidate (list[int]): Tokenized candidate translation\n", + " reference (list[int]): Tokenized reference translation\n", + "\n", + " Returns:\n", + " float: Overlap between the two token lists\n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + " \n", + " # Make a frequency table of the candidate and reference tokens\n", + " # Hint: use the Counter class (already imported)\n", + " candidate_word_counts = None\n", + " reference_word_counts = None\n", + " \n", + " # Initialize overlap at 0\n", + " overlap = None\n", + " \n", + " # Iterate over the tokens in the candidate frequency table\n", + " # Hint: Counter is a subclass of dict and you can get the keys \n", + " # out of a dict using the keys method like this: dict.keys()\n", + " for token in None:\n", + " \n", + " # Get the count of the current token in the candidate frequency table\n", + " # Hint: You can access the counts of a token as you would access values of a dictionary\n", + " token_count_candidate = None\n", + " \n", + " # Get the count of the current token 
in the reference frequency table\n", + " # Hint: You can access the counts of a token as you would access values of a dictionary\n", + " token_count_reference = None\n", + " \n", + " # Update the overlap by getting the minimum between the two token counts above\n", + " overlap += None\n", + " \n", + " # Compute the precision\n", + " # Hint: precision = overlap / (number of tokens in candidate list) \n", + " precision = None\n", + " \n", + " # Compute the recall\n", + " # Hint: recall = overlap / (number of tokens in reference list) \n", + " recall = None\n", + " \n", + " if precision + recall != 0:\n", + " # Compute the Rouge1 Score\n", + " # Hint: This is equivalent to the F1 score\n", + " f1_score = None\n", + " \n", + " return f1_score\n", + " \n", + " ### END CODE HERE ###\n", + " \n", + " return 0 # If precision + recall = 0 then return 0" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "14bb5295", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "l1 = [1, 2, 3]\n", + "l2 = [1, 2, 3, 4]\n", + "\n", + "r1s = rouge1_similarity(l1, l2)\n", + "\n", + "print(f\"rouge 1 similarity between lists: {l1} and {l2} is {r1s:.3f}\")" + ] + }, + { + "cell_type": "markdown", + "id": "afb8c61a", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "rouge 1 similarity between lists: [1, 2, 3] and [1, 2, 3, 4] is 0.857\n", + "\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a680132e", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "w1_unittest.test_rouge1_similarity(rouge1_similarity)" + ] + }, + { + "cell_type": "markdown", + "id": "aaf8a058", + "metadata": {}, + "source": [ + "## Computing the Overall Score\n", + "\n", + "\n", + "You will now build a function to generate the overall score for a particular sample. 
As mentioned in the lectures, you need to compare each sample with all other samples. For instance, if you generated 30 sentences, you will need to compare sentence 1 to sentences 2 through 30. Then, you compare sentence 2 to sentences 1 and 3 through 30, and so forth. At each step, you average all the comparisons to get the overall score for a particular sample. To illustrate, these are the steps to generate the scores of a 4-sample list.\n", + "\n", + "- Get similarity score between sample 1 and sample 2\n", + "- Get similarity score between sample 1 and sample 3\n", + "- Get similarity score between sample 1 and sample 4\n", + "- Get average score of the first 3 steps. This will be the overall score of sample 1\n", + "- Iterate and repeat until samples 1 to 4 have overall scores.\n", + "\n", + "\n", + "The results will be stored in a dictionary for easy lookups.\n", + "\n", + "\n", + "## Exercise 7 - average_overlap\n", + "\n", + "Complete the `average_overlap` function below, which should implement the process described above:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "142264ff", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: average_overlap\n", + "def average_overlap(samples, similarity_fn):\n", + " \"\"\"Computes the average overlap score of each candidate sentence against the other samples\n", + "\n", + " Args:\n", + " samples (list[list[int]]): Tokenized version of translated sentences\n", + " similarity_fn (Function): Similarity function used to compute the overlap\n", + "\n", + " Returns:\n", + " dict[int, float]: A dictionary mapping the index of each translation to its score\n", + " \"\"\"\n", + " # Initialize dictionary\n", + " scores = {}\n", + " \n", + " # Iterate through all samples (enumerate helps keep track of indexes)\n", + " for index_candidate, candidate in enumerate(samples): \n", + " \n", + " ### START CODE HERE ###\n", + " \n", + " # 
Initially overlap is zero\n", + " overlap = None\n", + " \n", + " # Iterate through all samples (enumerate helps keep track of indexes)\n", + " for index_sample, sample in enumerate(samples):\n", + "\n", + " # Skip if the candidate index is the same as the sample index\n", + " if None == None:\n", + " None\n", + " \n", + " # Get the overlap between candidate and sample using the similarity function\n", + " sample_overlap = None(None, None)\n", + " \n", + " # Add the sample overlap to the total overlap\n", + " overlap += None\n", + "\n", + " ### END CODE HERE ###\n", + " \n", + " # Get the score for the candidate by computing the average\n", + " score = overlap / (len(samples) - 1)\n", + "\n", + " # Only use 3 decimal points\n", + " score = round(score, 3)\n", + " \n", + " # Save the score in the dictionary. use index as the key.\n", + " scores[index_candidate] = score\n", + " \n", + " return scores" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "f36cf403", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Test with Jaccard similarity\n", + "\n", + "l1 = [1, 2, 3]\n", + "l2 = [1, 2, 4]\n", + "l3 = [1, 2, 4, 5]\n", + "\n", + "avg_ovlp = average_overlap([l1, l2, l3], jaccard_similarity)\n", + "\n", + "print(f\"average overlap between lists: {l1}, {l2} and {l3} using Jaccard similarity is:\\n\\n{avg_ovlp}\")" + ] + }, + { + "cell_type": "markdown", + "id": "e277aed2-a5c9-4ed0-9ee2-614939f2df7b", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "average overlap between lists: [1, 2, 3], [1, 2, 4] and [1, 2, 4, 5] using Jaccard similarity is:\n", + "\n", + "{0: 0.45, 1: 0.625, 2: 0.575}\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d961a304-7c03-4ecb-ba5f-c8747ed3ec39", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# 
Test with Rouge1 similarity\n", + "\n", + "l1 = [1, 2, 3]\n", + "l2 = [1, 4]\n", + "l3 = [1, 2, 4, 5]\n", + "l4 = [5,6]\n", + "\n", + "avg_ovlp = average_overlap([l1, l2, l3, l4], rouge1_similarity)\n", + "\n", + "print(f\"average overlap between lists: {l1}, {l2}, {l3} and {l4} using Rouge1 similarity is:\\n\\n{avg_ovlp}\")" + ] + }, + { + "cell_type": "markdown", + "id": "30adc749-ffcb-4e82-a8f0-c04a7e39da0a", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "average overlap between lists: [1, 2, 3], [1, 4], [1, 2, 4, 5] and [5, 6] using Rouge1 similarity is:\n", + "\n", + "{0: 0.324, 1: 0.356, 2: 0.524, 3: 0.111}\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c41b1fba-fd0f-41e6-9b07-746f64030fe3", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "w1_unittest.test_average_overlap(average_overlap)" + ] + }, + { + "cell_type": "markdown", + "id": "e4482249", + "metadata": {}, + "source": [ + "In practice, it is also common to see the weighted mean being used to calculate the overall score instead of just the arithmetic mean. 
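In symbols, and matching the `weighted_avg_overlap` implementation provided below, write $w_j = e^{\\log p_j}$ for the linear-scale probability of sample $j$; the weighted score of candidate $i$ over the remaining samples is then:

$$score_i = \\frac{\\sum_{j \\neq i} w_j \\cdot sim(s_i, s_j)}{\\sum_{j \\neq i} w_j}$$

Setting every $w_j = 1$ recovers the plain arithmetic mean computed by `average_overlap`.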
This is implemented in the `weighted_avg_overlap` function below and you can use it in your experiments to see which one will give better results:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "398714be", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def weighted_avg_overlap(samples, log_probs, similarity_fn):\n", + " \n", + " # Scores dictionary\n", + " scores = {}\n", + " \n", + " # Iterate over the samples\n", + " for index_candidate, candidate in enumerate(samples): \n", + " \n", + " # Initialize overlap and weighted sum\n", + " overlap, weight_sum = 0.0, 0.0\n", + " \n", + " # Iterate over all samples and log probabilities\n", + " for index_sample, (sample, logp) in enumerate(zip(samples, log_probs)):\n", + "\n", + " # Skip if the candidate index is the same as the sample index \n", + " if index_candidate == index_sample:\n", + " continue\n", + " \n", + " # Convert log probability to linear scale\n", + " sample_p = float(np.exp(logp))\n", + "\n", + " # Update the weighted sum\n", + " weight_sum += sample_p\n", + "\n", + " # Get the unigram overlap between candidate and sample\n", + " sample_overlap = similarity_fn(candidate, sample)\n", + " \n", + " # Update the overlap\n", + " overlap += sample_p * sample_overlap\n", + " \n", + " # Compute the score for the candidate\n", + " score = overlap / weight_sum\n", + "\n", + " # Only use 3 decimal points\n", + " score = round(score, 3)\n", + " \n", + " # Save the score in the dictionary. 
use index as the key.\n", + " scores[index_candidate] = score\n", + " \n", + " return scores" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e3dfd6d3", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "l1 = [1, 2, 3]\n", + "l2 = [1, 2, 4]\n", + "l3 = [1, 2, 4, 5]\n", + "log_probs = [0.4, 0.2, 0.5]\n", + "\n", + "w_avg_ovlp = weighted_avg_overlap([l1, l2, l3], log_probs, jaccard_similarity)\n", + "\n", + "print(f\"weighted average overlap using Jaccard similarity is:\\n\\n{w_avg_ovlp}\")" + ] + }, + { + "cell_type": "markdown", + "id": "cdb0b4db", + "metadata": {}, + "source": [ + "## mbr_decode\n", + "\n", + "You will now put everything together in the `mbr_decode` function below. This final step is not graded, as this function is just a wrapper around all the cool stuff you have coded so far! \n", + "\n", + "You can use it to play around, trying different numbers of samples, temperatures and similarity functions!" 
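The selection logic inside `mbr_decode` can also be sketched in isolation. The snippet below is a toy, self-contained illustration (the sentences and the `toy_similarity` helper are made up for demonstration, not taken from the assignment): it scores each candidate against all the others and keeps the highest-scoring one.

```python
# Toy, self-contained sketch of the MBR selection step.
def toy_similarity(a, b):
    # Unigram overlap (intersection over union) on whitespace tokens.
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb)

candidates = ["eu amo idiomas", "eu amo linguas", "amo idiomas", "ola mundo"]

# Average each candidate's similarity against all *other* candidates.
scores = {
    i: sum(toy_similarity(c, other) for j, other in enumerate(candidates) if j != i)
    / (len(candidates) - 1)
    for i, c in enumerate(candidates)
}

best = candidates[max(scores, key=scores.get)]
print(best)  # the candidate most similar to the rest: 'eu amo idiomas'
```

Note how the outlier "ola mundo" scores 0 against every other sample, so it can never be selected, which is exactly why MBR is more robust than trusting a single stochastic sample.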
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6fcfa640", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def mbr_decode(model, text, n_samples=5, temperature=0.6, similarity_fn=jaccard_similarity):\n", + " \n", + " # Generate samples\n", + " samples, log_probs = generate_samples(model, text, n_samples=n_samples, temperature=temperature)\n", + " \n", + " # Compute the overlap scores\n", + " scores = weighted_avg_overlap(samples, log_probs, similarity_fn)\n", + "\n", + " # Decode samples\n", + " decoded_translations = [tokens_to_text(s, id_to_word).numpy().decode('utf-8') for s in samples]\n", + " \n", + " # Find the key with the highest score\n", + " max_score_key = max(scores, key=lambda k: scores[k])\n", + " \n", + " # Get the translation \n", + " translation = decoded_translations[max_score_key]\n", + " \n", + " return translation, decoded_translations" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "99507fcc-7727-45e7-933b-d3a08034f731", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "english_sentence = \"I love languages\"\n", + "\n", + "translation, candidates = mbr_decode(trained_translator, english_sentence, n_samples=10, temperature=0.6)\n", + "\n", + "print(\"Translation candidates:\")\n", + "for c in candidates:\n", + " print(c)\n", + "\n", + "print(f\"\\nSelected translation: {translation}\")" + ] + }, + { + "cell_type": "markdown", + "id": "801b193f-4ea6-4ca1-ae29-a506cce656d9", + "metadata": {}, + "source": [ + "**Congratulations!** Next week, you'll dive deeper into attention models and study the Transformer architecture. You will build another network but without the recurrent part. It will show that attention is all you need! 
It should be fun!\n", + "\n", + "**Keep up the good work!**" + ] + } + ], + "metadata": { + "grader_version": "1", + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/.ipynb_checkpoints/w1_unittest-checkpoint.py b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/.ipynb_checkpoints/w1_unittest-checkpoint.py new file mode 100644 index 0000000000000000000000000000000000000000..f94f9aa69107477704a4d36dcb939a463ac228bf --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/.ipynb_checkpoints/w1_unittest-checkpoint.py @@ -0,0 +1,654 @@ +import math +from itertools import combinations +import tensorflow as tf +import numpy as np +from dlai_grader.grading import test_case, print_feedback +from utils import train_data + +VOCAB_SIZE = 12000 +UNITS = 256 + + +def test_encoder(encoder_to_test): + def g(): + vocab_sizes = [5, 20, 1000, 15000] + units = [32, 64, 256, 512] + + cases = [] + + vocab_size = 15000 + n_units = 512 + encoder = encoder_to_test(vocab_size, n_units) + + t = test_case() + if encoder.embedding.mask_zero != True: + t.failed = True + t.msg = "Embedding layer has incorrect value for 'mask_zero' attribute" + t.want = True + t.got = encoder.embedding.mask_zero + cases.append(t) + + for vs, u in zip(vocab_sizes, units): + encoder = encoder_to_test(vs, u) + + t = test_case() + if encoder.embedding.input_dim != vs: + t.failed = True + t.msg = "Incorrect input dim of embedding layer" + t.want = vs + t.got = encoder.embedding.input_dim + cases.append(t) + + t = test_case() + if 
encoder.embedding.output_dim != u: + t.failed = True + t.msg = "Incorrect output dim of embedding layer" + t.want = u + t.got = encoder.embedding.output_dim + cases.append(t) + + t = test_case() + if not isinstance(encoder.rnn.layer, tf.keras.layers.LSTM): + t.failed = True + t.msg = "Incorrect type of layer inside Bidirectional" + t.want = tf.keras.layers.LSTM + t.got = type(encoder.rnn.layer) + return [t] + + for u in units: + encoder = encoder_to_test(vocab_size, u) + t = test_case() + if encoder.rnn.layer.units != u: + t.failed = True + t.msg = "Incorrect number of units in LSTM layer" + t.want = u + t.got = encoder.rnn.layer.units + cases.append(t) + + t = test_case() + if encoder.rnn.layer.return_sequences != True: + t.failed = True + t.msg = "LSTM layer has incorrect value for 'return_sequences' attribute" + t.want = True + t.got = encoder.rnn.layer.return_sequences + cases.append(t) + + encoder = encoder_to_test(vocab_size, n_units) + + for (to_translate, _), _ in train_data.take(3): + first_dim_in, second_dim_in = to_translate.shape + encoder_output = encoder(to_translate) + t = test_case() + if len(encoder_output.shape) != 3: + t.failed = True + t.msg = "Incorrect shape of encoder output" + t.want = "a shape with 3 dimensions" + t.got = encoder_output.shape + return [t] + + first_dim_out, second_dim_out, third_dim_out = encoder_output.shape + + t = test_case() + if first_dim_in != first_dim_out: + t.failed = True + t.msg = "Incorrect first dimension of encoder output" + t.want = first_dim_in + t.got = first_dim_out + cases.append(t) + + t = test_case() + if second_dim_in != second_dim_out: + t.failed = True + t.msg = "Incorrect second dimension of encoder output" + t.want = second_dim_in + t.got = second_dim_out + cases.append(t) + + t = test_case() + if third_dim_out != n_units: + t.failed = True + t.msg = "Incorrect third dimension of encoder output" + t.want = n_units + t.got = third_dim_out + cases.append(t) + + return cases + + cases = g() + 
print_feedback(cases) + + +def test_cross_attention(cross_attention_to_test): + def g(): + units = [32, 64, 256, 512] + + cases = [] + + n_units = 512 + cross_attention = cross_attention_to_test(n_units) + + t = test_case() + if not isinstance(cross_attention.mha, tf.keras.layers.MultiHeadAttention): + t.failed = True + t.msg = "Incorrect type of layer for Multi Head Attention" + t.want = tf.keras.layers.MultiHeadAttention + t.got = type(cross_attention.mha) + return [t] + + # for u in units: + # cross_attention = cross_attention_to_test(u) + + # t = test_case() + # if cross_attention.mha.key_dim != u: + # t.failed = True + # t.msg = "Incorrect key dim of Multi Head Attention layer" + # t.want = u + # t.got = cross_attention.mha.key_dim + # cases.append(t) + + cross_attention = cross_attention_to_test(n_units) + embed = tf.keras.layers.Embedding(VOCAB_SIZE, output_dim=UNITS, mask_zero=True) + + for (to_translate, sr_translation), _ in train_data.take(3): + sr_translation_embed = embed(sr_translation) + first_dim_in, second_dim_in, third_dim_in = sr_translation_embed.shape + dummy_encoder_output = np.random.rand(64, 14, 512) + cross_attention_output = cross_attention( + dummy_encoder_output, sr_translation_embed + ) + # print(cross_attention_output.shape) + + t = test_case() + if len(cross_attention_output.shape) != 3: + t.failed = True + t.msg = "Incorrect shape of cross_attention output" + t.want = "a shape with 3 dimensions" + t.got = cross_attention_output.shape + return [t] + + first_dim_out, second_dim_out, third_dim_out = cross_attention_output.shape + + t = test_case() + if first_dim_in != first_dim_out: + t.failed = True + t.msg = "Incorrect first dimension of cross_attention output" + t.want = first_dim_in + t.got = first_dim_out + cases.append(t) + + t = test_case() + if second_dim_in != second_dim_out: + t.failed = True + t.msg = "Incorrect second dimension of cross_attention output" + t.want = second_dim_in + t.got = second_dim_out + cases.append(t) + + 
t = test_case() + if third_dim_in != third_dim_out: + t.failed = True + t.msg = "Incorrect third dimension of cross_attention output" + t.want = third_dim_in + t.got = third_dim_out + cases.append(t) + + _, n_heads, key_dim = cross_attention.mha.get_weights()[0].shape + + t = test_case() + if n_heads != 1: + t.failed = True + t.msg = "Incorrect number of attention heads" + t.want = 1 + t.got = n_heads + cases.append(t) + + t = test_case() + if key_dim != n_units: + t.failed = True + t.msg = f"Incorrect size of query and key for every attention head when passing {n_units} units to the constructor" + t.want = n_units + t.got = key_dim + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) + + +def test_decoder(decoder_to_test, CrossAttention): + def g(): + vocab_sizes = [5, 20, 1000, 15000] + units = [32, 64, 256, 512] + + cases = [] + + vocab_size = 10000 + n_units = 512 + decoder = decoder_to_test(vocab_size, n_units) + + t = test_case() + if not isinstance(decoder.embedding, tf.keras.layers.Embedding): + t.failed = True + t.msg = "Incorrect type of embedding layer" + t.want = tf.keras.layers.Embedding + t.got = type(decoder.embedding) + return [t] + + t = test_case() + if decoder.embedding.mask_zero != True: + t.failed = True + t.msg = "Embedding layer has incorrect value for 'mask_zero' attribute" + t.want = True + t.got = decoder.embedding.mask_zero + cases.append(t) + + for vs, u in zip(vocab_sizes, units): + decoder = decoder_to_test(vs, u) + + t = test_case() + if decoder.embedding.input_dim != vs: + t.failed = True + t.msg = "Incorrect input dim of embedding layer" + t.want = vs + t.got = decoder.embedding.input_dim + cases.append(t) + + t = test_case() + if decoder.embedding.output_dim != u: + t.failed = True + t.msg = "Incorrect output dim of embedding layer" + t.want = u + t.got = decoder.embedding.output_dim + cases.append(t) + + t = test_case() + if not isinstance(decoder.pre_attention_rnn, tf.keras.layers.LSTM): + t.failed = True + 
t.msg = "Incorrect type of pre_attention_rnn layer" + t.want = tf.keras.layers.LSTM + t.got = type(decoder.pre_attention_rnn) + return [t] + + for u in units: + decoder = decoder_to_test(vocab_size, u) + t = test_case() + if decoder.pre_attention_rnn.units != u: + t.failed = True + t.msg = "Incorrect number of units in pre_attention_rnn layer" + t.want = u + t.got = decoder.pre_attention_rnn.units + cases.append(t) + + # t = test_case() + # if decoder.attention.units != u: + # t.failed = True + # t.msg = "Incorrect number of units in attention layer" + # t.want = u + # t.got = decoder.attention.units + # cases.append(t) + + t = test_case() + if decoder.post_attention_rnn.units != u: + t.failed = True + t.msg = "Incorrect number of units in post_attention_rnn layer" + t.want = u + t.got = decoder.post_attention_rnn.units + cases.append(t) + + t = test_case() + if decoder.pre_attention_rnn.return_sequences != True: + t.failed = True + t.msg = "pre_attention_rnn layer has incorrect value for 'return_sequences' attribute" + t.want = True + t.got = decoder.pre_attention_rnn.return_sequences + cases.append(t) + + t = test_case() + if decoder.pre_attention_rnn.return_state != True: + t.failed = True + t.msg = "pre_attention_rnn layer has incorrect value for 'return_state' attribute" + t.want = True + t.got = decoder.pre_attention_rnn.return_state + cases.append(t) + + t = test_case() + if not isinstance(decoder.attention, CrossAttention): + t.failed = True + t.msg = "Incorrect type of attention layer" + t.want = CrossAttention + t.got = type(decoder.attention) + return [t] + + t = test_case() + if decoder.post_attention_rnn.return_sequences != True: + t.failed = True + t.msg = "post_attention_rnn layer has incorrect value for 'return_sequences' attribute" + t.want = True + t.got = decoder.post_attention_rnn.return_sequences + cases.append(t) + + t = test_case() + if not isinstance(decoder.post_attention_rnn, tf.keras.layers.LSTM): + t.failed = True + t.msg = "Incorrect 
type of post_attention_rnn layer" + t.want = tf.keras.layers.LSTM + t.got = type(decoder.post_attention_rnn) + return [t] + + t = test_case() + if not isinstance(decoder.output_layer, tf.keras.layers.Dense): + t.failed = True + t.msg = "Incorrect type of output_layer layer" + t.want = tf.keras.layers.Dense + t.got = type(decoder.output_layer) + return [t] + + t = test_case() + if ( + "log" not in decoder.output_layer.activation.__name__ + or "softmax" not in decoder.output_layer.activation.__name__ + ): + t.failed = True + t.msg = "output_layer layer has incorrect activation function" + t.want = "a log softmax activation function such as 'log_softmax_v2'" + t.got = decoder.output_layer.activation.__name__ + cases.append(t) + + vocab_size = 10000 + n_units = 512 + decoder = decoder_to_test(vocab_size, n_units) + + for (_, sr_translation), _ in train_data.take(3): + encoder_output = np.random.rand(64, 15, 256) + decoder_output = decoder(encoder_output, sr_translation) + + first_dim_in, second_dim_in = sr_translation.shape + + t = test_case() + if len(decoder_output.shape) != 3: + t.failed = True + t.msg = "Incorrect shape of decoder output" + t.want = "a shape with 3 dimensions" + t.got = decoder_output.shape + return [t] + + first_dim_out, second_dim_out, third_dim_out = decoder_output.shape + + t = test_case() + if first_dim_in != first_dim_out: + t.failed = True + t.msg = "Incorrect first dimension of decoder output" + t.want = first_dim_in + t.got = first_dim_out + cases.append(t) + + t = test_case() + if second_dim_in != second_dim_out: + t.failed = True + t.msg = "Incorrect second dimension of decoder output" + t.want = second_dim_in + t.got = second_dim_out + cases.append(t) + + t = test_case() + if third_dim_out != vocab_size: + t.failed = True + t.msg = "Incorrect third dimension of decoder output" + t.want = vocab_size + t.got = third_dim_out + cases.append(t) + return cases + + cases = g() + print_feedback(cases) + + +def test_translator(translator_to_test, 
Encoder, Decoder): + def g(): + vocab_sizes = [5, 20, 1000, 15000] + units = [32, 64, 256, 512] + + cases = [] + + vocab_size = 10000 + n_units = 512 + translator = translator_to_test(vocab_size, n_units) + + t = test_case() + if not isinstance(translator.encoder, Encoder): + t.failed = True + t.msg = "Incorrect type of encoder layer" + t.want = Encoder + t.got = type(translator.encoder) + return [t] + + t = test_case() + if not isinstance(translator.decoder, Decoder): + t.failed = True + t.msg = "Incorrect type of decoder layer" + t.want = Decoder + t.got = type(translator.decoder) + return [t] + + translator = translator_to_test(vocab_size, n_units) + + for (to_translate, sr_translation), _ in train_data.take(3): + first_dim_in, second_dim_in = sr_translation.shape + translator_output = translator((to_translate, sr_translation)) + t = test_case() + if len(translator_output.shape) != 3: + t.failed = True + t.msg = "Incorrect shape of translator output" + t.want = "a shape with 3 dimensions" + t.got = translator_output.shape + return [t] + + first_dim_out, second_dim_out, third_dim_out = translator_output.shape + + t = test_case() + if first_dim_in != first_dim_out: + t.failed = True + t.msg = "Incorrect first dimension of translator output" + t.want = first_dim_in + t.got = first_dim_out + cases.append(t) + + t = test_case() + if second_dim_in != second_dim_out: + t.failed = True + t.msg = "Incorrect second dimension of translator output" + t.want = second_dim_in + t.got = second_dim_out + cases.append(t) + + t = test_case() + if third_dim_out != vocab_size: + t.failed = True + t.msg = "Incorrect third dimension of translator output" + t.want = vocab_size + t.got = third_dim_out + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) + + + +def test_translate(learner_func, model): + def g(): + + cases = [] + + txt = "Hi, my name is Younes" + try: + translation, logit, tokens = learner_func(model, txt, temperature=0.9) + except Exception as e: + 
t = test_case() + t.failed = True + t.msg = "There was an exception when running your function" + t.want = "No exceptions" + t.got = f"{str(e)}" + return [t] + + txt = "Hi, my name is Alejandra" + translation, logit, tokens = learner_func(model, txt, temperature=0.0) + + t = test_case() + + if not isinstance(translation, str): + t.failed = True + t.msg = "'translation' has incorrect type" + t.want = str + t.got = type(translation) + cases.append(t) + + if not isinstance(logit, np.number): + t.failed = True + t.msg = "'logit' has incorrect type" + t.want = np.number + t.got = type(logit) + cases.append(t) + + if not isinstance(tokens, tf.Tensor): + t.failed = True + t.msg = "'tokens' has incorrect type" + t.want = tf.Tensor + t.got = type(tokens) + cases.append(t) + + translation2, logit2, tokens2 = learner_func(model, txt, temperature=0.0) + + t = test_case() + if translation != translation2: + t.failed = True + t.msg = "translate didn't return the same translation when using temperature of 0.0" + t.want = translation + t.got = translation2 + cases.append(t) + + t = test_case() + if logit != logit2: + t.failed = True + t.msg = "translate didn't return the same logit when using temperature of 0.0" + t.want = logit + t.got = logit2 + cases.append(t) + + t = test_case() + if not np.allclose(tokens, tokens2): + t.failed = True + t.msg = "translate didn't return the same tokens when using temperature of 0.0" + t.want = tokens + t.got = tokens2 + cases.append(t) + + + return cases + + cases = g() + print_feedback(cases) + + + + +def test_rouge1_similarity(learner_func): + + def g(): + + tensors = [ + [0], + [0, 1], + [0, 1, 2], + [1, 2, 4, 5], + [5, 5, 7, 0, 232] + ] + + expected = [0.6666666666666666, 0.5, 0, 0.33333333333333337, 0.8, 0.3333333333333333, 0.28571428571428575, 0.5714285714285715, 0.25] + + cases = [] + pairs = list(combinations(tensors, 2)) + + for (candidate, reference), solution in zip(pairs, expected): + answer = learner_func(candidate, reference) + t 
= test_case() + if not math.isclose(answer, solution): + t.failed = True + t.msg = f"Incorrect similarity for candidate={candidate} and reference={reference}" + t.want = solution + t.got = answer + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) + + +def test_average_overlap(learner_func): + + def jaccard_similarity(candidate, reference): + + # Convert the lists to sets to get the unique tokens + candidate_set = set(candidate) + reference_set = set(reference) + + # Get the set of tokens common to both candidate and reference + common_tokens = candidate_set.intersection(reference_set) + + # Get the set of all tokens found in either candidate or reference + all_tokens = candidate_set.union(reference_set) + + # Compute the percentage of overlap (divide the number of common tokens by the number of all tokens) + overlap = len(common_tokens) / len(all_tokens) + + return overlap + + def g(): + + l1 = [1, 2, 3] + l2 = [1, 2, 4] + l3 = [1, 2, 4, 5] + l4 = [5,6] + + elements = [l1, l2, l3, l4] + + all_combinations = [] + + for r in range(2, len(elements) + 1): + # Generate combinations of length r + combinations_r = combinations(elements, r) + + # Append the combinations to the result list + all_combinations.extend(combinations_r) + + expected = [{0: 0.5, 1: 0.5}, + {0: 0.4, 1: 0.4}, + {0: 0.0, 1: 0.0}, + {0: 0.75, 1: 0.75}, + {0: 0.0, 1: 0.0}, + {0: 0.2, 1: 0.2}, + {0: 0.45, 1: 0.625, 2: 0.575}, + {0: 0.25, 1: 0.25, 2: 0.0}, + {0: 0.2, 1: 0.3, 2: 0.1}, + {0: 0.375, 1: 0.475, 2: 0.1}, + {0: 0.3, 1: 0.417, 2: 0.45, 3: 0.067}] + + cases = [] + + for combination, solution in zip(all_combinations, expected): + answer = learner_func(combination, jaccard_similarity) + t = test_case() + if answer != solution: + t.failed = True + t.msg = f"Incorrect overlap for lists={combination}" + t.want = solution + t.got = answer + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with 
MBR/Files/tf/C4W1_Assignment.ipynb b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/C4W1_Assignment.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..6b8a7c0278750441b6bfb40810e2c12f0303aea4 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/C4W1_Assignment.ipynb @@ -0,0 +1,2312 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "9cb49525", + "metadata": {}, + "source": [ + "# Assignment 1: Neural Machine Translation\n", + "\n", + "Welcome to the first assignment of Course 4. Here, you will build an English-to-Portuguese neural machine translation (NMT) model using Long Short-Term Memory (LSTM) networks with attention. Machine translation is an important task in natural language processing and could be useful not only for translating one language to another but also for word sense disambiguation (e.g. determining whether the word \"bank\" refers to the financial bank, or the land alongside a river). Implementing this using just a Recurrent Neural Network (RNN) with LSTMs can work for short to medium length sentences but can result in vanishing gradients for very long sequences. To help with this, you will be adding an attention mechanism to allow the decoder to access all relevant parts of the input sentence regardless of its length. 
By completing this assignment, you will:\n", + "\n", + "- Implement an encoder-decoder system with attention\n", + "- Build the NMT model from scratch using Tensorflow\n", + "- Generate translations using greedy and Minimum Bayes Risk (MBR) decoding\n", + "\n", + "## Table of Contents\n", + "- [1 - Data Preparation](#1)\n", + "- [2 - NMT model with attention](#2)\n", + " - [Exercise 1 - Encoder](#ex1)\n", + " - [Exercise 2 - CrossAttention](#ex2)\n", + " - [Exercise 3 - Decoder](#ex3) \n", + " - [Exercise 4 - Translator](#ex4)\n", + "- [3 - Training](#3)\n", + "- [4 - Using the model for inference ](#4)\n", + " - [Exercise 5 - translate](#ex5)\n", + "- [5 - Minimum Bayes-Risk Decoding](#5)\n", + " - [Exercise 6 - rouge1_similarity](#ex6)\n", + " - [Exercise 7 - average_overlap](#ex7)\n" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "f9ef370d", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3' # Setting this env variable prevents TF warnings from showing up\n", + "\n", + "import numpy as np\n", + "import tensorflow as tf\n", + "from collections import Counter\n", + "from utils import (sentences, train_data, val_data, english_vectorizer, portuguese_vectorizer, \n", + " masked_loss, masked_acc, tokens_to_text)" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "8adb8fd6", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "import w1_unittest" + ] + }, + { + "cell_type": "markdown", + "id": "e76be1dc", + "metadata": {}, + "source": [ + "\n", + "## 1. Data Preparation\n", + "\n", + "The text pre-processing bits have already been taken care of (if you are interested in this be sure to check the `utils.py` file). 
The steps performed can be summarized as:\n", + "\n", + "- Reading the raw data from the text files\n", + "- Cleaning the data (using lowercase, adding space around punctuation, trimming whitespaces, etc)\n", + "- Splitting it into training and validation sets\n", + "- Adding the start-of-sentence and end-of-sentence tokens to every sentence\n", + "- Tokenizing the sentences\n", + "- Creating a Tensorflow dataset out of the tokenized sentences\n", + "\n", + "Take a moment to inspect the raw sentences:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "226033a1", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "English (to translate) sentence:\n", + "\n", + "No matter how much you try to convince people that chocolate is vanilla, it'll still be chocolate, even though you may manage to convince yourself and a few others that it's vanilla.\n", + "\n", + "Portuguese (translation) sentence:\n", + "\n", + "Não importa o quanto você tenta convencer os outros de que chocolate é baunilha, ele ainda será chocolate, mesmo que você possa convencer a si mesmo e poucos outros de que é baunilha.\n" + ] + } + ], + "source": [ + "portuguese_sentences, english_sentences = sentences\n", + "\n", + "print(f\"English (to translate) sentence:\\n\\n{english_sentences[-5]}\\n\")\n", + "print(f\"Portuguese (translation) sentence:\\n\\n{portuguese_sentences[-5]}\")" + ] + }, + { + "cell_type": "markdown", + "id": "5ba90eb9", + "metadata": {}, + "source": [ + "You don't have much use for the raw sentences so delete them to save memory:" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "d9f081b0", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "del portuguese_sentences\n", + "del english_sentences\n", + "del sentences" + ] + }, + { + "cell_type": "markdown", + 
"id": "a2ff83d2", + "metadata": {}, + "source": [ + "Notice that you imported an `english_vectorizer` and a `portuguese_vectorizer` from `utils.py`. These were created using [tf.keras.layers.TextVectorization](https://www.tensorflow.org/api_docs/python/tf/keras/layers/TextVectorization) and they provide interesting features such as ways to visualize the vocabulary and convert text into tokenized ids and vice versa. In fact, you can inspect the first ten words of the vocabularies for both languages:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "2c1cfc17", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "First 10 words of the english vocabulary:\n", + "\n", + "['', '[UNK]', '[SOS]', '[EOS]', '.', 'tom', 'i', 'to', 'you', 'the']\n", + "\n", + "First 10 words of the portuguese vocabulary:\n", + "\n", + "['', '[UNK]', '[SOS]', '[EOS]', '.', 'tom', 'que', 'o', 'nao', 'eu']\n" + ] + } + ], + "source": [ + "print(f\"First 10 words of the english vocabulary:\\n\\n{english_vectorizer.get_vocabulary()[:10]}\\n\")\n", + "print(f\"First 10 words of the portuguese vocabulary:\\n\\n{portuguese_vectorizer.get_vocabulary()[:10]}\")" + ] + }, + { + "cell_type": "markdown", + "id": "3152b075", + "metadata": {}, + "source": [ + "Notice that the first 4 words are reserved for special words. 
In order, these are:\n", + "\n", + "- the empty string\n", + "- a special token to represent an unknown word\n", + "- a special token to represent the start of a sentence\n", + "- a special token to represent the end of a sentence\n", + "\n", + "You can see how many words are in a vocabulary by using the `vocabulary_size` method:" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "5facaa0c", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Portuguese vocabulary is made up of 12000 words\n", + "English vocabulary is made up of 12000 words\n" + ] + } + ], + "source": [ + "# Size of the vocabulary\n", + "vocab_size_por = portuguese_vectorizer.vocabulary_size()\n", + "vocab_size_eng = english_vectorizer.vocabulary_size()\n", + "\n", + "print(f\"Portuguese vocabulary is made up of {vocab_size_por} words\")\n", + "print(f\"English vocabulary is made up of {vocab_size_eng} words\")" + ] + }, + { + "cell_type": "markdown", + "id": "53e4b615", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "You can define [tf.keras.layers.StringLookup](https://www.tensorflow.org/api_docs/python/tf/keras/layers/StringLookup) objects that will help you map from words to ids and vice versa. 
Do this for the portuguese vocabulary since this will be useful later on when you decode the predictions from your model:" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "218f7a36", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# This helps you convert from words to ids\n", + "word_to_id = tf.keras.layers.StringLookup(\n", + " vocabulary=portuguese_vectorizer.get_vocabulary(), \n", + " mask_token=\"\", \n", + " oov_token=\"[UNK]\"\n", + ")\n", + "\n", + "# This helps you convert from ids to words\n", + "id_to_word = tf.keras.layers.StringLookup(\n", + " vocabulary=portuguese_vectorizer.get_vocabulary(),\n", + " mask_token=\"\",\n", + " oov_token=\"[UNK]\",\n", + " invert=True,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "4af8b623", + "metadata": {}, + "source": [ + "Try it out for the special tokens and a random word:" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "20076b9a", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "The id for the [UNK] token is 1\n", + "The id for the [SOS] token is 2\n", + "The id for the [EOS] token is 3\n", + "The id for baunilha (vanilla) is 7079\n" + ] + } + ], + "source": [ + "unk_id = word_to_id(\"[UNK]\")\n", + "sos_id = word_to_id(\"[SOS]\")\n", + "eos_id = word_to_id(\"[EOS]\")\n", + "baunilha_id = word_to_id(\"baunilha\")\n", + "\n", + "print(f\"The id for the [UNK] token is {unk_id}\")\n", + "print(f\"The id for the [SOS] token is {sos_id}\")\n", + "print(f\"The id for the [EOS] token is {eos_id}\")\n", + "print(f\"The id for baunilha (vanilla) is {baunilha_id}\")" + ] + }, + { + "cell_type": "markdown", + "id": "2f1d744c", + "metadata": {}, + "source": [ + "Finally take a look at how the data that is going to be fed to the neural network looks like. 
Both `train_data` and `val_data` are of type `tf.data.Dataset` and are already arranged in batches of 64 examples. To get the first batch out of a tf dataset you can use the `take` method. To get the first example out of the batch you can slice the tensor and use the `numpy` method for nicer printing:" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "739777eb", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tokenized english sentence:\n", + "[ 2 210 9 146 123 38 9 1672 4 3 0 0 0 0]\n", + "\n", + "\n", + "Tokenized portuguese sentence (shifted to the right):\n", + "[ 2 1085 7 128 11 389 37 2038 4 0 0 0 0 0\n", + " 0]\n", + "\n", + "\n", + "Tokenized portuguese sentence:\n", + "[1085 7 128 11 389 37 2038 4 3 0 0 0 0 0\n", + " 0]\n", + "\n", + "\n" + ] + } + ], + "source": [ + "for (to_translate, sr_translation), translation in train_data.take(1):\n", + " print(f\"Tokenized english sentence:\\n{to_translate[0, :].numpy()}\\n\\n\")\n", + " print(f\"Tokenized portuguese sentence (shifted to the right):\\n{sr_translation[0, :].numpy()}\\n\\n\")\n", + " print(f\"Tokenized portuguese sentence:\\n{translation[0, :].numpy()}\\n\\n\")" + ] + }, + { + "cell_type": "markdown", + "id": "bdd9ee3c", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "There are a couple of important details to notice.\n", + "\n", + "- Padding has already been applied to the tensors and the value used for this is 0\n", + "- Each example consists of 3 different tensors:\n", + " - The sentence to translate\n", + " - The shifted-to-the-right translation\n", + " - The translation\n", + " \n", + "The first two can be considered as the features, while the third one as the target. By doing this your model can perform Teacher Forcing as you saw in the lectures.\n", + "\n", + "Now it is time to begin coding!" 
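The relationship between the three tensors can be sketched in plain Python (the token ids below are illustrative, not taken from the actual vectorizer; in the assignment the pipeline in `utils.py` builds these tensors for you):

```python
# Illustrative sketch of how each Teacher Forcing example is assembled.
# The ids 2 and 3 mirror the [SOS] and [EOS] ids shown earlier in the notebook.
SOS_ID, EOS_ID = 2, 3

def make_example(translation_ids):
    """Build (shifted-to-the-right translation, target) from a tokenized
    translation that carries no special tokens."""
    sr_translation = [SOS_ID] + translation_ids  # decoder input starts with [SOS]
    target = translation_ids + [EOS_ID]          # target ends with [EOS]
    return sr_translation, target

sr_translation, target = make_example([1085, 7, 128])
print(sr_translation)  # [2, 1085, 7, 128]
print(target)          # [1085, 7, 128, 3]
```

At each decoder timestep `t` the model is fed the ground-truth token `sr_translation[t]` (rather than its own previous prediction) and is trained to predict `target[t]`.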
+ ] + }, + { + "cell_type": "markdown", + "id": "dd41cb52", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "## 2. NMT model with attention\n", + "\n", + "The model you will build uses an encoder-decoder architecture. This Recurrent Neural Network (RNN) takes in a tokenized version of a sentence in its encoder, then passes it on to the decoder for translation. As mentioned in the lectures, just using a regular sequence-to-sequence model with LSTMs will work effectively for short to medium sentences but will start to degrade for longer ones. You can picture it like the figure below where all of the context of the input sentence is compressed into one vector that is passed into the decoder block. You can see how this will be an issue for very long sentences (e.g. 100 tokens or more) because the context of the first parts of the input will have very little effect on the final vector passed to the decoder.\n", + "\n", + "\n", + "\n", + "Adding an attention layer to this model avoids this problem by giving the decoder access to all parts of the input sentence. To illustrate, let's just use a 4-word input sentence as shown below. Remember that a hidden state is produced at each timestep of the encoder (represented by the orange rectangles). These are all passed to the attention layer and each is given a score based on the current activation (i.e. hidden state) of the decoder. For instance, let's consider the figure below where the first prediction \"como\" is already made. To produce the next prediction, the attention layer will first receive all the encoder hidden states (i.e. orange rectangles) as well as the decoder hidden state when producing the word \"como\" (i.e. first green rectangle). Given this information, it will score each of the encoder hidden states to know which one the decoder should focus on to produce the next word. 
As a result of training, the model might have learned that it should align to the second encoder hidden state and subsequently assigns a high probability to the word \"você\". If we are using greedy decoding, we will output the said word as the next symbol, then restart the process to produce the next word until we reach an end-of-sentence prediction.\n", + "\n", + "\n", + "\n", + "\n", + "There are different ways to implement attention and the one we'll use for this assignment is the Scaled Dot Product Attention which has the form:\n", + "\n", + "$$Attention(Q, K, V) = softmax(\\frac{QK^T}{\\sqrt{d_k}})V$$\n", + "\n", + "You will dive deeper into this equation in the next week but for now, you can think of it as computing scores using queries (Q) and keys (K), followed by a multiplication of values (V) to get a context vector at a particular timestep of the decoder. This context vector is fed to the decoder RNN to get a set of probabilities for the next predicted word. The division by square root of the keys dimensionality ($\\sqrt{d_k}$) is for improving model performance and you'll also learn more about it next week. For our machine translation application, the encoder activations (i.e. encoder hidden states) will be the keys and values, while the decoder activations (i.e. decoder hidden states) will be the queries.\n", + "\n", + "You will see in the upcoming sections that this complex architecture and mechanism can be implemented with just a few lines of code. \n", + "\n", + "First you will define two important global variables:\n", + "\n", + "- The size of the vocabulary\n", + "- The number of units in the LSTM layers (the same number will be used for all LSTM layers)\n", + "\n", + "In this assignment, the vocabulary sizes for English and Portuguese are the same. Therefore, we use a single constant VOCAB_SIZE throughout the notebook. While in other settings, vocabulary sizes could differ, that is not the case in our assignment." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "2e484abf", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "VOCAB_SIZE = 12000\n", + "UNITS = 256" + ] + }, + { + "cell_type": "markdown", + "id": "cc251965", + "metadata": {}, + "source": [ + "\n", + "## Exercise 1 - Encoder\n", + "\n", + "Your first exercise is to code the encoder part of the neural network. For this, complete the `Encoder` class below. Notice that in the constructor (the `__init__` method) you need to define all of the sublayers of the encoder and then use these sublayers during the forward pass (the `call` method).\n", + "\n", + "The encoder consists of the following layers:\n", + "\n", + "- [Embedding](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Embedding). For this layer you need to define the appropriate `input_dim` and `output_dim` and let it know that you are using '0' as padding, which can be done by using the appropriate value for the `mask_zero` parameter.\n", + " \n", + "+ [Bidirectional](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Bidirectional) [LSTM](https://www.tensorflow.org/api_docs/python/tf/keras/layers/LSTM). In TF you can implement bidirectional behaviour for RNN-like layers. This part is already taken care of but you will need to specify the appropriate type of layer as well as its parameters. In particular you need to set the appropriate number of units and make sure that the LSTM returns the full sequence and not only the last output, which can be done by using the appropriate value for the `return_sequences` parameter.\n", + "\n", + "\n", + "You need to define the forward pass using the syntax of TF's [functional API](https://www.tensorflow.org/guide/keras/functional_api). 
What this means is that you chain function calls together to define your network like this:\n", + "\n", + "```python\n", + "encoder_input = keras.Input(shape=(28, 28, 1), name=\"original_img\")\n", + "x = layers.Conv2D(16, 3, activation=\"relu\")(encoder_input)\n", + "x = layers.MaxPooling2D(3)(x)\n", + "x = layers.Conv2D(16, 3, activation=\"relu\")(x)\n", + "encoder_output = layers.GlobalMaxPooling2D()(x)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "id": "b1db0a1d", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED CLASS: Encoder\n", + "class Encoder(tf.keras.layers.Layer):\n", + " def __init__(self, vocab_size, units):\n", + " \"\"\"Initializes an instance of this class\n", + "\n", + " Args:\n", + " vocab_size (int): Size of the vocabulary\n", + " units (int): Number of units in the LSTM layer\n", + " \"\"\"\n", + " super(Encoder, self).__init__()\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " self.embedding = tf.keras.layers.Embedding( \n", + " input_dim=vocab_size,\n", + " output_dim=units,\n", + " mask_zero=True\n", + " ) \n", + "\n", + " self.rnn = tf.keras.layers.Bidirectional( \n", + " merge_mode=\"sum\", \n", + " layer=tf.keras.layers.LSTM(\n", + " units=units,\n", + " return_sequences=True\n", + " ), \n", + " ) \n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " def call(self, context):\n", + " \"\"\"Forward pass of this layer\n", + "\n", + " Args:\n", + " context (tf.Tensor): The sentence to translate\n", + "\n", + " Returns:\n", + " tf.Tensor: Encoded sentence to translate\n", + " \"\"\"\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " # Pass the context through the embedding layer\n", + " x = self.embedding(context)\n", + "\n", + " # Pass the output of the embedding through the RNN\n", + " x = self.rnn(x)\n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " return x" + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "id": 
"65034ffd", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tensor of sentences in english has shape: (64, 14)\n", + "\n", + "Encoder output has shape: (64, 14, 256)\n" + ] + } + ], + "source": [ + "# Do a quick check of your implementation\n", + "\n", + "# Create an instance of your class\n", + "encoder = Encoder(VOCAB_SIZE, UNITS)\n", + "\n", + "# Pass a batch of sentences to translate from english to portuguese\n", + "encoder_output = encoder(to_translate)\n", + "\n", + "print(f'Tensor of sentences in english has shape: {to_translate.shape}\\n')\n", + "print(f'Encoder output has shape: {encoder_output.shape}')" + ] + }, + { + "cell_type": "markdown", + "id": "a909aea1", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Tensor of sentences in english has shape: (64, 14)\n", + "\n", + "Encoder output has shape: (64, 14, 256)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "id": "3031bb14", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + "source": [ + "# Test your code!\n", + "\n", + "w1_unittest.test_encoder(Encoder)" + ] + }, + { + "cell_type": "markdown", + "id": "1afe83f4", + "metadata": {}, + "source": [ + "\n", + "## Exercise 2 - CrossAttention\n", + "\n", + "Your next exercise is to code the layer that will perform cross attention between the original sentences and the translations. For this, complete the `CrossAttention` class below. Notice that in the constructor (the `__init__` method) you need to define all of the sublayers and then use these sublayers during the forward pass (the `call` method). 
For this particular case some of these bits are already taken care of.\n", + "\n", + "The cross attention consists of the following layers:\n", + "\n", + "- [MultiHeadAttention](https://www.tensorflow.org/api_docs/python/tf/keras/layers/MultiHeadAttention). For this layer you need to define the appropriate `key_dim`, which is the size of the key and query tensors. You will also need to set the number of heads to 1 since you aren't implementing multi head attention but attention between two tensors. The reason why this layer is preferred over [Attention](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Attention) is that it allows simpler code during the forward pass.\n", + " \n", + "A couple of things to notice:\n", + "- You need a way to pass both the output of the attention alongside the shifted-to-the-right translation (since this cross attention happens in the decoder side). For this you will use an [Add](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Add) layer so that the original dimension is preserved, which would not happen if you use something like a [Concatenate](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Concatenate) layer.\n", + "\n", + "+ Layer normalization is also performed for better stability of the network by using a [LayerNormalization](https://www.tensorflow.org/api_docs/python/tf/keras/layers/LayerNormalization) layer.\n", + "\n", + "- You don't need to worry about these last steps as these are already solved.\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "id": "74e71f3d", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED CLASS: CrossAttention\n", + "class CrossAttention(tf.keras.layers.Layer):\n", + " def __init__(self, units):\n", + " \"\"\"Initializes an instance of this class\n", + "\n", + " Args:\n", + " units (int): Number of units in the LSTM layer\n", + " \"\"\"\n", + " super().__init__()\n", + "\n", + " ### 
START CODE HERE ###\n", + "\n", + " self.mha = ( \n", + " tf.keras.layers.MultiHeadAttention(\n", + " key_dim=units,\n", + " num_heads=1\n", + " ) \n", + " ) \n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " self.layernorm = tf.keras.layers.LayerNormalization()\n", + " self.add = tf.keras.layers.Add()\n", + "\n", + " def call(self, context, target):\n", + " \"\"\"Forward pass of this layer\n", + "\n", + " Args:\n", + " context (tf.Tensor): Encoded sentence to translate\n", + " target (tf.Tensor): The embedded shifted-to-the-right translation\n", + "\n", + " Returns:\n", + " tf.Tensor: Cross attention between context and target\n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + "\n", + " # Call the MH attention by passing in the query and value\n", + " # For this case the query should be the translation and the value the encoded sentence to translate\n", + " # Hint: Check the call arguments of MultiHeadAttention in the docs\n", + " attn_output =self.mha(\n", + " query=target,\n", + " value=context\n", + " ) \n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " x = self.add([target, attn_output])\n", + "\n", + " x = self.layernorm(x)\n", + "\n", + " return x" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "id": "4c62796f", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tensor of contexts has shape: (64, 14, 256)\n", + "Tensor of translations has shape: (64, 15, 256)\n", + "Tensor of attention scores has shape: (64, 15, 256)\n" + ] + } + ], + "source": [ + "# Do a quick check of your implementation\n", + "\n", + "# Create an instance of your class\n", + "attention_layer = CrossAttention(UNITS)\n", + "\n", + "# The attention layer expects the embedded sr-translation and the context\n", + "# The context (encoder_output) is already embedded so you need to do this for sr_translation:\n", + "sr_translation_embed = 
tf.keras.layers.Embedding(VOCAB_SIZE, output_dim=UNITS, mask_zero=True)(sr_translation)\n", + "\n", + "# Compute the cross attention\n", + "attention_result = attention_layer(encoder_output, sr_translation_embed)\n", + "\n", + "print(f'Tensor of contexts has shape: {encoder_output.shape}')\n", + "print(f'Tensor of translations has shape: {sr_translation_embed.shape}')\n", + "print(f'Tensor of attention scores has shape: {attention_result.shape}')" + ] + }, + { + "cell_type": "markdown", + "id": "41d4f99a", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Tensor of contexts has shape: (64, 14, 256)\n", + "Tensor of translations has shape: (64, 15, 256)\n", + "Tensor of attention scores has shape: (64, 15, 256)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "id": "4f658975", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + "source": [ + "# Test your code!\n", + "\n", + "w1_unittest.test_cross_attention(CrossAttention)" + ] + }, + { + "cell_type": "markdown", + "id": "aa296ee2", + "metadata": {}, + "source": [ + "\n", + "## Exercise 3 - Decoder\n", + "\n", + "\n", + "Now you will implement the decoder part of the neural network by completing the `Decoder` class below. Notice that in the constructor (the `__init__` method) you need to define all of the sublayers of the decoder and then use these sublayers during the forward pass (the `call` method).\n", + "\n", + "The decoder consists of the following layers:\n", + "\n", + "- [Embedding](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Embedding). 
For this layer you need to define the appropriate `input_dim` and `output_dim` and let it know that you are using '0' as padding, which can be done by using the appropriate value for the `mask_zero` parameter.\n", + " \n", + " \n", + "- Pre-attention [LSTM](https://www.tensorflow.org/api_docs/python/tf/keras/layers/LSTM). Unlike the encoder, which used a bidirectional LSTM, here you will use a vanilla LSTM. Don't forget to set the appropriate number of units and make sure that the LSTM returns the full sequence and not only the last output, which can be done by using the appropriate value for the `return_sequences` parameter. It is very important that this layer returns its state, since the state will be needed for inference, so make sure to set the `return_state` parameter accordingly. Notice that LSTM layers return the state as a tuple of two tensors called `memory_state` and `carry_state`; **in this assignment these have been renamed `hidden_state` and `cell_state`, respectively, to match the lectures**.\n", + "\n", + "- The attention layer that performs cross attention between the sentence to translate and the right-shifted translation. Here you need to use the `CrossAttention` layer you defined in the previous exercise.\n", + "\n", + "- Post-attention [LSTM](https://www.tensorflow.org/api_docs/python/tf/keras/layers/LSTM). Another LSTM layer. For this one you don't need it to return the state.\n", + "\n", + "- Finally, a [Dense](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Dense) layer. This one should have the same number of units as the size of the vocabulary, since it computes the logits for every possible word in the vocabulary. 
Make sure to use a `logsoftmax` activation function for this one, which you can get as [tf.nn.log_softmax](https://www.tensorflow.org/api_docs/python/tf/nn/log_softmax).\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 43, + "id": "e9639bdb", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED CLASS: Decoder\n", + "class Decoder(tf.keras.layers.Layer):\n", + " def __init__(self, vocab_size, units):\n", + " \"\"\"Initializes an instance of this class\n", + "\n", + " Args:\n", + " vocab_size (int): Size of the vocabulary\n", + " units (int): Number of units in the LSTM layer\n", + " \"\"\"\n", + " super(Decoder, self).__init__()\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " # The embedding layer\n", + " self.embedding = tf.keras.layers.Embedding(\n", + " input_dim=vocab_size,\n", + " output_dim=units,\n", + " mask_zero=True\n", + " ) \n", + "\n", + " # The RNN before attention\n", + " self.pre_attention_rnn = tf.keras.layers.LSTM(\n", + " units=units,\n", + " return_sequences=True,\n", + " return_state=True\n", + " ) \n", + "\n", + " # The attention layer\n", + " self.attention = CrossAttention(units)\n", + "\n", + " # The RNN after attention\n", + " self.post_attention_rnn = tf.keras.layers.LSTM(\n", + " units=units,\n", + " return_sequences=True\n", + " ) \n", + "\n", + " # The dense layer with logsoftmax activation\n", + " self.output_layer = tf.keras.layers.Dense(\n", + " units=vocab_size,\n", + " activation=tf.nn.log_softmax\n", + " ) \n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " def call(self, context, target, state=None, return_state=False):\n", + " \"\"\"Forward pass of this layer\n", + "\n", + " Args:\n", + " context (tf.Tensor): Encoded sentence to translate\n", + " target (tf.Tensor): The shifted-to-the-right translation\n", + " state (list[tf.Tensor, tf.Tensor], optional): Hidden state of the pre-attention LSTM. 
Defaults to None.\n", + " return_state (bool, optional): If set to true return the hidden states of the LSTM. Defaults to False.\n", + "\n", + " Returns:\n", + " tf.Tensor: The log_softmax probabilities of predicting a particular token\n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + "\n", + " # Get the embedding of the input\n", + " x = self.embedding(target)\n", + "\n", + " # Pass the embedded input into the pre attention LSTM\n", + " # Hints:\n", + " # - The LSTM you defined earlier should return the output alongside the state (made up of two tensors)\n", + " # - Pass in the state to the LSTM (needed for inference)\n", + " x, hidden_state, cell_state = self.pre_attention_rnn(x, initial_state=state)\n", + "\n", + " # Perform cross attention between the context and the output of the LSTM (in that order)\n", + " x = self.attention(context, x)\n", + "\n", + " # Do a pass through the post attention LSTM\n", + " x = self.post_attention_rnn(x)\n", + "\n", + " # Compute the logits\n", + " logits = self.output_layer(x)\n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " if return_state:\n", + " return logits, [hidden_state, cell_state]\n", + "\n", + " return logits" + ] + }, + { + "cell_type": "code", + "execution_count": 44, + "id": "f6165cf2", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Tensor of contexts has shape: (64, 14, 256)\n", + "Tensor of right-shifted translations has shape: (64, 15)\n", + "Tensor of logits has shape: (64, 15, 12000)\n" + ] + } + ], + "source": [ + "# Do a quick check of your implementation\n", + "\n", + "# Create an instance of your class\n", + "decoder = Decoder(VOCAB_SIZE, UNITS)\n", + "\n", + "# Notice that you don't need the embedded version of sr_translation since this is done inside the class\n", + "logits = decoder(encoder_output, sr_translation)\n", + "\n", + "print(f'Tensor of contexts has shape: 
{encoder_output.shape}')\n", + "print(f'Tensor of right-shifted translations has shape: {sr_translation.shape}')\n", + "print(f'Tensor of logits has shape: {logits.shape}')" + ] + }, + { + "cell_type": "markdown", + "id": "6f2b5d7d", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Tensor of contexts has shape: (64, 14, 256)\n", + "Tensor of right-shifted translations has shape: (64, 15)\n", + "Tensor of logits has shape: (64, 15, 12000)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 45, + "id": "1b61093a", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + "source": [ + "# Test your code!\n", + "\n", + "w1_unittest.test_decoder(Decoder, CrossAttention)" + ] + }, + { + "cell_type": "markdown", + "id": "9dcce3a7", + "metadata": {}, + "source": [ + "\n", + "## Exercise 4 - Translator\n", + "\n", + "Now you have to put together all of the layers you previously coded into an actual model. For this, complete the `Translator` class below. Notice how unlike the Encoder and Decoder classes inherited from `tf.keras.layers.Layer`, the Translator class inherits from `tf.keras.Model`.\n", + "\n", + "Remember that `train_data` will yield a tuple with the sentence to translate and the shifted-to-the-right translation, which are the \"features\" of the model. This means that the inputs of your network will be tuples containing context and targets." 
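The data flow described above can be sketched outside TensorFlow. The following is a minimal NumPy illustration, where `toy_encoder` and `toy_decoder` are hypothetical stand-ins for the graded `Encoder` and `Decoder` classes: the model's `call` unpacks the `(context, target)` tuple, encodes the context, and decodes into per-step log-probabilities over the vocabulary.

```python
import numpy as np

def log_softmax(logits):
    # Numerically stable log-probabilities over the last axis,
    # like the tf.nn.log_softmax activation of the output layer
    shifted = logits - logits.max(axis=-1, keepdims=True)
    return shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))

def toy_encoder(context_ids, units=4):
    # Stand-in for Encoder: (batch, Tx) token ids -> (batch, Tx, units) context
    batch, tx = context_ids.shape
    return np.zeros((batch, tx, units))

def toy_decoder(context, target_ids, vocab_size=12):
    # Stand-in for Decoder: context plus (batch, Ty) shifted targets
    # -> (batch, Ty, vocab_size) log-probabilities
    batch, ty = target_ids.shape
    return log_softmax(np.zeros((batch, ty, vocab_size)))

def toy_translator(inputs):
    # Like Translator.call: unpack the (context, target) tuple first
    context_ids, target_ids = inputs
    return toy_decoder(toy_encoder(context_ids), target_ids)

out = toy_translator((np.ones((64, 14), dtype=int), np.ones((64, 15), dtype=int)))
print(out.shape)  # (64, 15, 12)
```

Note how the output keeps the target's time dimension while the last axis carries the log-probabilities, matching the `(64, 15, 12000)` logits tensor you should obtain below (here the toy vocabulary has only 12 words).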
+ ] + }, + { + "cell_type": "code", + "execution_count": 46, + "id": "205fcf31", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED CLASS: Translator\n", + "class Translator(tf.keras.Model):\n", + " def __init__(self, vocab_size, units):\n", + " \"\"\"Initializes an instance of this class\n", + "\n", + " Args:\n", + " vocab_size (int): Size of the vocabulary\n", + " units (int): Number of units in the LSTM layer\n", + " \"\"\"\n", + " super().__init__()\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " # Define the encoder with the appropriate vocab_size and number of units\n", + " self.encoder = Encoder(vocab_size,units)\n", + "\n", + " # Define the decoder with the appropriate vocab_size and number of units\n", + " self.decoder = Decoder(vocab_size,units)\n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " def call(self, inputs):\n", + " \"\"\"Forward pass of this layer\n", + "\n", + " Args:\n", + " inputs (tuple(tf.Tensor, tf.Tensor)): Tuple containing the context (sentence to translate) and the target (shifted-to-the-right translation)\n", + "\n", + " Returns:\n", + " tf.Tensor: The log_softmax probabilities of predicting a particular token\n", + " \"\"\"\n", + "\n", + " ### START CODE HERE ###\n", + "\n", + " # In this case inputs is a tuple consisting of the context and the target, unpack it into single variables\n", + " context, target = inputs\n", + "\n", + " # Pass the context through the encoder\n", + " encoded_context = self.encoder(context)\n", + "\n", + " # Compute the logits by passing the encoded context and the target to the decoder\n", + " logits = self.decoder(target=target,context=encoded_context)\n", + "\n", + " ### END CODE HERE ###\n", + "\n", + " return logits" + ] + }, + { + "cell_type": "code", + "execution_count": 47, + "id": "4d4a231c", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + 
"output_type": "stream", + "text": [ + "Tensor of sentences to translate has shape: (64, 14)\n", + "Tensor of right-shifted translations has shape: (64, 15)\n", + "Tensor of logits has shape: (64, 15, 12000)\n" + ] + } + ], + "source": [ + "# Do a quick check of your implementation\n", + "\n", + "# Create an instance of your class\n", + "translator = Translator(VOCAB_SIZE, UNITS)\n", + "\n", + "# Compute the logits for every word in the vocabulary\n", + "logits = translator((to_translate, sr_translation))\n", + "\n", + "print(f'Tensor of sentences to translate has shape: {to_translate.shape}')\n", + "print(f'Tensor of right-shifted translations has shape: {sr_translation.shape}')\n", + "print(f'Tensor of logits has shape: {logits.shape}')" + ] + }, + { + "cell_type": "markdown", + "id": "e3a162dd", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Tensor of sentences to translate has shape: (64, 14)\n", + "Tensor of right-shifted translations has shape: (64, 15)\n", + "Tensor of logits has shape: (64, 15, 12000)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "id": "37009022", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + "source": [ + "w1_unittest.test_translator(Translator, Encoder, Decoder)" + ] + }, + { + "cell_type": "markdown", + "id": "f81bc228", + "metadata": {}, + "source": [ + "\n", + "## 3. Training\n", + "\n", + "Now that you have an untrained instance of the NMT model, it is time to train it. 
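The `masked_loss` and `masked_acc` metrics used during training ignore padding positions, so the model is never rewarded or penalized for what it predicts at padded steps. A NumPy sketch of that idea (`masked_nll` is illustrative only, not the notebook's exact implementation):

```python
import numpy as np

def masked_nll(log_probs, labels, pad_id=0):
    """Illustrative masked negative log-likelihood: positions whose label
    equals the padding id contribute nothing to the average loss."""
    mask = (labels != pad_id).astype(float)  # (batch, steps)
    # Pick the log-probability assigned to each true label
    picked = np.take_along_axis(log_probs, labels[..., None], axis=-1)[..., 0]
    return -(picked * mask).sum() / mask.sum()

# Uniform log-probs over a 4-word vocabulary -> loss is log(4) = 1.3863
log_probs = np.full((2, 3, 4), np.log(0.25))
labels = np.array([[1, 2, 0], [3, 0, 0]])  # zeros are padding
print(round(masked_nll(log_probs, labels), 4))  # 1.3863
```

Without the mask, the three padded positions would dilute the average and make the reported loss depend on how much padding each batch happens to contain.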
You can use the `compile_and_train` function below to achieve this:" + ] + }, + { + "cell_type": "code", + "execution_count": 79, + "id": "8a61ef65", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def compile_and_train(model, epochs=20, steps_per_epoch=500):\n", + " model.compile(optimizer=\"adam\", loss=masked_loss, metrics=[masked_acc, masked_loss])\n", + "\n", + " history = model.fit(\n", + " train_data.repeat(),\n", + " epochs=epochs,\n", + " steps_per_epoch=steps_per_epoch,\n", + " validation_data=val_data,\n", + " validation_steps=50,\n", + " callbacks=[tf.keras.callbacks.EarlyStopping(patience=3)],\n", + " )\n", + "\n", + " return model, history" + ] + }, + { + "cell_type": "code", + "execution_count": 80, + "id": "87d9bf9f", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1/20\n", + "500/500 [==============================] - 44s 62ms/step - loss: 0.7642 - masked_acc: 0.8158 - masked_loss: 0.7650 - val_loss: 1.0111 - val_masked_acc: 0.7826 - val_masked_loss: 1.0127\n", + "Epoch 2/20\n", + "500/500 [==============================] - 17s 33ms/step - loss: 0.7811 - masked_acc: 0.8130 - masked_loss: 0.7816 - val_loss: 0.9974 - val_masked_acc: 0.7832 - val_masked_loss: 0.9979\n", + "Epoch 3/20\n", + "500/500 [==============================] - 16s 32ms/step - loss: 0.7887 - masked_acc: 0.8119 - masked_loss: 0.7894 - val_loss: 0.9753 - val_masked_acc: 0.7860 - val_masked_loss: 0.9780\n", + "Epoch 4/20\n", + "500/500 [==============================] - 15s 31ms/step - loss: 0.7835 - masked_acc: 0.8125 - masked_loss: 0.7841 - val_loss: 0.9936 - val_masked_acc: 0.7821 - val_masked_loss: 0.9955\n", + "Epoch 5/20\n", + "500/500 [==============================] - 16s 32ms/step - loss: 0.7452 - masked_acc: 0.8175 - masked_loss: 0.7461 - val_loss: 0.9643 - val_masked_acc: 0.7896 - 
val_masked_loss: 0.9638\n", + "Epoch 6/20\n", + "500/500 [==============================] - 15s 30ms/step - loss: 0.6538 - masked_acc: 0.8322 - masked_loss: 0.6543 - val_loss: 0.9582 - val_masked_acc: 0.7932 - val_masked_loss: 0.9586\n", + "Epoch 7/20\n", + "500/500 [==============================] - 15s 30ms/step - loss: 0.6596 - masked_acc: 0.8308 - masked_loss: 0.6605 - val_loss: 0.9581 - val_masked_acc: 0.7916 - val_masked_loss: 0.9594\n", + "Epoch 8/20\n", + "500/500 [==============================] - 15s 30ms/step - loss: 0.6746 - masked_acc: 0.8267 - masked_loss: 0.6754 - val_loss: 0.9448 - val_masked_acc: 0.7925 - val_masked_loss: 0.9467\n", + "Epoch 9/20\n", + "500/500 [==============================] - 16s 32ms/step - loss: 0.6785 - masked_acc: 0.8266 - masked_loss: 0.6788 - val_loss: 0.9292 - val_masked_acc: 0.7928 - val_masked_loss: 0.9295\n", + "Epoch 10/20\n", + "500/500 [==============================] - 15s 31ms/step - loss: 0.6287 - masked_acc: 0.8356 - masked_loss: 0.6292 - val_loss: 0.9324 - val_masked_acc: 0.7955 - val_masked_loss: 0.9324\n", + "Epoch 11/20\n", + "500/500 [==============================] - 15s 30ms/step - loss: 0.5875 - masked_acc: 0.8432 - masked_loss: 0.5880 - val_loss: 0.9407 - val_masked_acc: 0.7978 - val_masked_loss: 0.9401\n", + "Epoch 12/20\n", + "500/500 [==============================] - 15s 31ms/step - loss: 0.5988 - masked_acc: 0.8398 - masked_loss: 0.5992 - val_loss: 0.9546 - val_masked_acc: 0.7926 - val_masked_loss: 0.9546\n" + ] + } + ], + "source": [ + "# Train the translator (this takes some minutes so feel free to take a break)\n", + "\n", + "trained_translator, history = compile_and_train(translator)" + ] + }, + { + "cell_type": "markdown", + "id": "d23b9301", + "metadata": {}, + "source": [ + "\n", + "## 4. Using the model for inference \n", + "\n", + "\n", + "Now that your model is trained you can use it for inference. To help you with this the `generate_next_token` function is provided. 
Notice that this function is meant to be used inside a for-loop: at each step you feed it the outputs of the previous step to generate the next one. In particular, you need to keep track of the state of the pre-attention LSTM in the decoder and of whether the translation is done. Also notice that a `temperature` parameter is introduced, which determines how the next token is selected given the predicted logits: " + ] + }, + { + "cell_type": "code", + "execution_count": 82, + "id": "522f6b6f", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def generate_next_token(decoder, context, next_token, done, state, temperature=0.0):\n", + " \"\"\"Generates the next token in the sequence\n", + "\n", + " Args:\n", + " decoder (Decoder): The decoder\n", + " context (tf.Tensor): Encoded sentence to translate\n", + " next_token (tf.Tensor): The predicted next token\n", + " done (bool): True if the translation is complete\n", + " state (list[tf.Tensor, tf.Tensor]): Hidden states of the pre-attention LSTM layer\n", + " temperature (float, optional): The temperature that controls the randomness of the predicted tokens. 
Defaults to 0.0.\n", + "\n", + " Returns:\n", + " tuple(tf.Tensor, np.float, list[tf.Tensor, tf.Tensor], bool): The next token, log prob of said token, hidden state of LSTM and if translation is done\n", + " \"\"\"\n", + " # Get the logits and state from the decoder\n", + " logits, state = decoder(context, next_token, state=state, return_state=True)\n", + " \n", + " # Trim the intermediate dimension \n", + " logits = logits[:, -1, :]\n", + " \n", + " # If temp is 0 then next_token is the argmax of logits\n", + " if temperature == 0.0:\n", + " next_token = tf.argmax(logits, axis=-1)\n", + " \n", + " # If temp is not 0 then next_token is sampled out of logits\n", + " else:\n", + " logits = logits / temperature\n", + " next_token = tf.random.categorical(logits, num_samples=1)\n", + " \n", + " # Trim dimensions of size 1\n", + " logits = tf.squeeze(logits)\n", + " next_token = tf.squeeze(next_token)\n", + " \n", + " # Get the logit of the selected next_token\n", + " logit = logits[next_token].numpy()\n", + " \n", + " # Reshape to (1,1) since this is the expected shape for text encoded as TF tensors\n", + " next_token = tf.reshape(next_token, shape=(1,1))\n", + " \n", + " # If next_token is End-of-Sentence token you are done\n", + " if next_token == eos_id:\n", + " done = True\n", + " \n", + " return next_token, logit, state, done" + ] + }, + { + "cell_type": "markdown", + "id": "190d2d76", + "metadata": {}, + "source": [ + "See how it works by running the following cell:" + ] + }, + { + "cell_type": "code", + "execution_count": 83, + "id": "9937547a", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Next token: [[11484]]\n", + "Logit: -18.7833\n", + "Done? 
False\n" + ] + } + ], + "source": [ + "# PROCESS SENTENCE TO TRANSLATE AND ENCODE\n", + "\n", + "# A sentence you wish to translate\n", + "eng_sentence = \"I love languages\"\n", + "\n", + "# Convert it to a tensor\n", + "texts = tf.convert_to_tensor(eng_sentence)[tf.newaxis]\n", + "\n", + "# Vectorize it and pass it through the encoder\n", + "context = english_vectorizer(texts).to_tensor()\n", + "context = encoder(context)\n", + "\n", + "# SET STATE OF THE DECODER\n", + "\n", + "# Next token is Start-of-Sentence since you are starting fresh\n", + "next_token = tf.fill((1,1), sos_id)\n", + "\n", + "# Hidden and Cell states of the LSTM can be mocked using uniform samples\n", + "state = [tf.random.uniform((1, UNITS)), tf.random.uniform((1, UNITS))]\n", + "\n", + "# You are not done until next token is EOS token\n", + "done = False\n", + "\n", + "# Generate next token\n", + "next_token, logit, state, done = generate_next_token(decoder, context, next_token, done, state, temperature=0.5)\n", + "print(f\"Next token: {next_token}\\nLogit: {logit:.4f}\\nDone? {done}\")" + ] + }, + { + "cell_type": "markdown", + "id": "170323dd", + "metadata": {}, + "source": [ + "\n", + "## Exercise 5 - translate\n", + "\n", + "Now you can put everything together to translate a given sentence. For this, complete the `translate` function below. 
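The temperature logic inside `generate_next_token` can be illustrated without TensorFlow. Below, `pick_next_token` is a hypothetical NumPy analogue of the `tf.argmax` / `tf.random.categorical` choice above, assuming a `(vocab,)` vector of log-probabilities:

```python
import numpy as np

rng = np.random.default_rng(42)

def pick_next_token(log_probs, temperature=0.0):
    """Select a token id from a (vocab,) vector of log-probabilities.
    Temperature 0 is the greedy argmax; otherwise sample from the
    temperature-scaled distribution."""
    if temperature == 0.0:
        return int(np.argmax(log_probs))
    scaled = log_probs / temperature
    probs = np.exp(scaled - scaled.max())  # softmax of the scaled logits
    probs = probs / probs.sum()
    return int(rng.choice(len(log_probs), p=probs))

log_probs = np.log(np.array([0.05, 0.70, 0.25]))
print(pick_next_token(log_probs))        # always 1 (greedy)
print(pick_next_token(log_probs, 0.7))   # usually 1, sometimes 0 or 2
```

Lower temperatures sharpen the distribution toward the argmax; higher temperatures flatten it, so less likely tokens get sampled more often.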
This function will take care of the following steps: \n", + "- Process the sentence to translate and encode it\n", + "\n", + "- Set the initial state of the decoder\n", + "\n", + "- Get predictions of the next token (starting with the SOS token) for a maximum of `max_length` iterations (in case the EOS token is never returned)\n", + " \n", + "- Return the translated text (as a string), the logit of the last iteration (this helps measure how certain the model was that the sequence was fully translated) and the translation in token format.\n", + "\n", + "\n", + "Hints: \n", + "\n", + "- The previous cell provides a lot of insight into how this function should work, so if you get stuck refer to it.\n", + "\n", + "- Some useful docs:\n", + "  - [tf.newaxis](https://www.tensorflow.org/api_docs/python/tf#newaxis)\n", + "\n", + "  - [tf.fill](https://www.tensorflow.org/api_docs/python/tf/fill)\n", + "\n", + "  - [tf.zeros](https://www.tensorflow.org/api_docs/python/tf/zeros)\n", + "\n", + "\n", + "**IMPORTANT NOTE**: Due to the randomness involved in TensorFlow training and weight initialization, the results below may vary considerably, even if you retrain your model in the same session. \n" + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "id": "42c74f1f", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: translate\n", + "def translate(model, text, max_length=50, temperature=0.0):\n", + " \"\"\"Translate a given sentence from English to Portuguese\n", + "\n", + " Args:\n", + " model (tf.keras.Model): The trained translator\n", + " text (string): The sentence to translate\n", + " max_length (int, optional): The maximum length of the translation. Defaults to 50.\n", + " temperature (float, optional): The temperature that controls the randomness of the predicted tokens. 
Defaults to 0.0.\n", + "\n", + " Returns:\n", + " tuple(str, float, tf.Tensor): The translation, the logit of the last predicted token and the tokenized translation\n", + " \"\"\"\n", + " # Lists to save tokens and logits\n", + " tokens, logits = [], []\n", + "\n", + " ### START CODE HERE ###\n", + " \n", + " # PROCESS THE SENTENCE TO TRANSLATE\n", + " \n", + " # Convert the original string into a tensor\n", + " text = tf.convert_to_tensor(text)[tf.newaxis]\n", + " \n", + " # Vectorize the text using the correct vectorizer\n", + " context = english_vectorizer(text).to_tensor()\n", + " \n", + " # Get the encoded context (pass the context through the encoder)\n", + " # Hint: Remember you can get the encoder by using model.encoder\n", + " context = model.encoder(context)\n", + " \n", + " # INITIAL STATE OF THE DECODER\n", + " \n", + " # First token should be SOS token with shape (1,1)\n", + " next_token = tf.fill((1,1), sos_id)\n", + " \n", + " # Initial hidden and cell states should be tensors of zeros with shape (1, UNITS)\n", + " state = [tf.zeros((1, UNITS)), tf.zeros((1, UNITS))]\n", + " \n", + " # You are done when you draw an EOS token as next token (initial state is False)\n", + " done = False\n", + "\n", + " # Iterate for max_length iterations\n", + " for _ in range(max_length):\n", + " # Generate the next token\n", + " try:\n", + " next_token, logit, state, done = generate_next_token(\n", + " decoder=model.decoder,\n", + " context=context,\n", + " next_token=next_token,\n", + " done=done,\n", + " state=state,\n", + " temperature=temperature\n", + " )\n", + " except:\n", + " raise Exception(\"Problem generating the next token\")\n", + " \n", + " # If done then break out of the loop\n", + " if done:\n", + " break\n", + " \n", + " # Add next_token to the list of tokens\n", + " tokens.append(next_token)\n", + " \n", + " # Add logit to the list of logits\n", + " logits.append(logit)\n", + " \n", + " ### END CODE HERE ###\n", + " \n", + " # Concatenate 
all tokens into a tensor\n", + " tokens = tf.concat(tokens, axis=-1)\n", + " \n", + " # Convert the translated tokens into text\n", + " translation = tf.squeeze(tokens_to_text(tokens, id_to_word))\n", + " translation = translation.numpy().decode()\n", + " \n", + " return translation, logits[-1], tokens" + ] + }, + { + "cell_type": "markdown", + "id": "3525e8ba", + "metadata": {}, + "source": [ + "Try your function with temperature of 0, which will yield a deterministic output and is equivalent to a greedy decoding:" + ] + }, + { + "cell_type": "code", + "execution_count": 85, + "id": "daaea8c5", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Temperature: 0.0\n", + "\n", + "Original sentence: I love languages\n", + "Translation: eu adoro idiomas .\n", + "Translation tokens:[[ 9 564 850 4]]\n", + "Logit: -0.074\n" + ] + } + ], + "source": [ + "# Running this cell multiple times should return the same output since temp is 0\n", + "\n", + "temp = 0.0 \n", + "original_sentence = \"I love languages\"\n", + "\n", + "translation, logit, tokens = translate(trained_translator, original_sentence, temperature=temp)\n", + "\n", + "print(f\"Temperature: {temp}\\n\\nOriginal sentence: {original_sentence}\\nTranslation: {translation}\\nTranslation tokens:{tokens}\\nLogit: {logit:.3f}\")" + ] + }, + { + "cell_type": "markdown", + "id": "7d05129b", + "metadata": {}, + "source": [ + "Try your function with temperature of 0.7 (stochastic output):" + ] + }, + { + "cell_type": "code", + "execution_count": 86, + "id": "0e0697db", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Temperature: 0.7\n", + "\n", + "Original sentence: I love languages\n", + "Translation: eu adoro idiomas .\n", + "Translation tokens:[[ 9 564 850 4]]\n", + "Logit: -0.093\n" + ] + } + ], + 
"source": [ + "# Running this cell multiple times should return different outputs since temp is not 0\n", + "# You can try different temperatures\n", + "\n", + "temp = 0.7\n", + "original_sentence = \"I love languages\"\n", + "\n", + "translation, logit, tokens = translate(trained_translator, original_sentence, temperature=temp)\n", + "\n", + "print(f\"Temperature: {temp}\\n\\nOriginal sentence: {original_sentence}\\nTranslation: {translation}\\nTranslation tokens:{tokens}\\nLogit: {logit:.3f}\")" + ] + }, + { + "cell_type": "code", + "execution_count": 87, + "id": "a3a9ea35", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[91mFailed test case: translate didn't return the same translation when using temperature of 0.0.\n", + "Expected: o meu nome e [UNK] a [UNK] .\n", + "Got: , meu nome e [UNK] a [UNK] .\n", + "\n", + "\u001b[91mFailed test case: translate didn't return the same logit when using temperature of 0.0.\n", + "Expected: -0.5501561164855957\n", + "Got: -0.6304512619972229\n", + "\n", + "\u001b[91mFailed test case: translate didn't return the same tokens when using temperature of 0.0.\n", + "Expected: [[ 7 43 175 13 1 12 1 4]]\n", + "Got: [[ 19 43 175 13 1 12 1 4]]\n", + "\n", + "\n" + ] + } + ], + "source": [ + "w1_unittest.test_translate(translate, trained_translator)" + ] + }, + { + "cell_type": "markdown", + "id": "ba027524", + "metadata": {}, + "source": [ + "\n", + "## 5. Minimum Bayes-Risk Decoding\n", + "\n", + "As mentioned in the lectures, getting the most probable token at each step may not necessarily produce the best results. Another approach is to do Minimum Bayes Risk Decoding or MBR. 
The general steps to implement this are:\n", + "\n", + "- Take several random samples\n", + "- Score each sample against all other samples\n", + "- Select the one with the highest score\n", + "\n", + "You will be building helper functions for these steps in the following sections.\n", + "\n", + "Since you can generate different translations by setting different temperature values, you can do what you saw in the lectures: generate several candidate translations and then determine which one is the best. You will now do this by using the provided `generate_samples` function, which returns any desired number of candidate translations alongside the log-probability of each one:" + ] + }, + { + "cell_type": "code", + "execution_count": 88, + "id": "62301cd5", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def generate_samples(model, text, n_samples=4, temperature=0.6):\n", + " \n", + " samples, log_probs = [], []\n", + "\n", + " # Iterate for n_samples iterations\n", + " for _ in range(n_samples):\n", + " \n", + " # Save the logit and the translated tensor\n", + " _, logp, sample = translate(model, text, temperature=temperature)\n", + " \n", + " # Save the translated tensors\n", + " samples.append(np.squeeze(sample.numpy()).tolist())\n", + " \n", + " # Save the logits\n", + " log_probs.append(logp)\n", + " \n", + " return samples, log_probs" + ] + }, + { + "cell_type": "code", + "execution_count": 89, + "id": "06bd792c", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Translated tensor: [9, 81, 850, 4] has logit: -0.080\n", + "Translated tensor: 4 has logit: -0.677\n", + "Translated tensor: [9, 98, 11, 850, 4] has logit: -0.063\n", + "Translated tensor: [9, 564, 850, 4] has logit: -0.110\n" + ] + } + ], + "source": [ + "samples, log_probs = 
generate_samples(trained_translator, 'I love languages')\n", + "\n", + "for s, l in zip(samples, log_probs):\n", + " print(f\"Translated tensor: {s} has logit: {l:.3f}\")" + ] + }, + { + "cell_type": "markdown", + "id": "29b10677", + "metadata": {}, + "source": [ + "## Comparing overlaps\n", + "\n", + "Now that you can generate multiple translations it is time to come up with a method to measure the goodness of each one. As you saw in the lectures, one way to achieve this is by comparing each sample against the others. \n", + "\n", + "There are several metrics you can use for this purpose, as shown in the lectures and you can try experimenting with any one of these. For this assignment, you will be calculating scores for **unigram overlaps**. \n", + "\n", + "One of these metrics is the widely used yet simple [Jaccard similarity](https://en.wikipedia.org/wiki/Jaccard_index) which gets the intersection over union of two sets. The `jaccard_similarity` function returns this metric for any pair of candidate and reference translations:\n" + ] + }, + { + "cell_type": "code", + "execution_count": 90, + "id": "edb54a71", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def jaccard_similarity(candidate, reference):\n", + " \n", + " # Convert the lists to sets to get the unique tokens\n", + " candidate_set = set(candidate)\n", + " reference_set = set(reference)\n", + " \n", + " # Get the set of tokens common to both candidate and reference\n", + " common_tokens = candidate_set.intersection(reference_set)\n", + " \n", + " # Get the set of all tokens found in either candidate or reference\n", + " all_tokens = candidate_set.union(reference_set)\n", + " \n", + " # Compute the percentage of overlap (divide the number of common tokens by the number of all tokens)\n", + " overlap = len(common_tokens) / len(all_tokens)\n", + " \n", + " return overlap" + ] + }, + { + "cell_type": "code", + "execution_count": 
91, + "id": "fc3384bf", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "jaccard similarity between lists: [1, 2, 3] and [1, 2, 3, 4] is 0.750\n" + ] + } + ], + "source": [ + "l1 = [1, 2, 3]\n", + "l2 = [1, 2, 3, 4]\n", + "\n", + "js = jaccard_similarity(l1, l2)\n", + "\n", + "print(f\"jaccard similarity between lists: {l1} and {l2} is {js:.3f}\")" + ] + }, + { + "cell_type": "markdown", + "id": "a6997662", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "jaccard similarity between lists: [1, 2, 3] and [1, 2, 3, 4] is 0.750\n", + "\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "b2510e3d", + "metadata": {}, + "source": [ + "\n", + "## Exercise 6 - rouge1_similarity\n", + "\n", + "Jaccard similarity is good, but a more commonly used metric in machine translation is the ROUGE score. For unigrams this is called ROUGE-1, and as shown in the lectures you can compute the scores for both precision and recall when comparing two samples. 
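As a quick illustration of the clipped unigram counts behind ROUGE-1 precision and recall (toy token lists, not the graded function):

```python
from collections import Counter

# Clipped unigram overlap: each candidate token is credited at most as many
# times as it appears in the reference.
candidate = [1, 2, 2, 3]
reference = [1, 2, 3, 4]

cand_counts, ref_counts = Counter(candidate), Counter(reference)
overlap = sum(min(count, ref_counts[tok]) for tok, count in cand_counts.items())

precision = overlap / len(candidate)  # 3 / 4 = 0.75
recall = overlap / len(reference)     # 3 / 4 = 0.75
print(overlap, precision, recall)     # 3 0.75 0.75
```

The duplicated `2` in the candidate only counts once toward the overlap, because the reference contains it once; this clipping is what distinguishes ROUGE-1 counts from a naive token count.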
To get the final score, you will want to compute the F1-score as given by:\n", + "\n", + "$$score = 2* \\frac{(precision * recall)}{(precision + recall)}$$\n", + "\n", + "For the implementation of the `rouge1_similarity` function you want to use the [Counter](https://docs.python.org/3/library/collections.html#collections.Counter) class from the Python standard library:" + ] + }, + { + "cell_type": "code", + "execution_count": 92, + "id": "fb2e0a00", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: rouge1_similarity\n", + "def rouge1_similarity(candidate, reference):\n", + " \"\"\"Computes the ROUGE 1 score between two token lists\n", + "\n", + " Args:\n", + " candidate (list[int]): Tokenized candidate translation\n", + " reference (list[int]): Tokenized reference translation\n", + "\n", + " Returns:\n", + " float: Overlap between the two token lists\n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + " \n", + " # Make a frequency table of the candidate and reference tokens\n", + " # Hint: use the Counter class (already imported)\n", + " candidate_word_counts = Counter(candidate)\n", + " reference_word_counts = Counter(reference)\n", + " \n", + " # Initialize overlap at 0\n", + " overlap = 0\n", + " \n", + " # Iterate over the tokens in the candidate frequency table\n", + " # Hint: Counter is a subclass of dict and you can get the keys \n", + " # out of a dict using the keys method like this: dict.keys()\n", + " for token in candidate_word_counts.keys():\n", + " \n", + " # Get the count of the current token in the candidate frequency table\n", + " # Hint: You can access the counts of a token as you would access values of a dictionary\n", + " token_count_candidate = candidate_word_counts[token]\n", + " \n", + " # Get the count of the current token in the reference frequency table\n", + " # Hint: You can access the counts of a token as you would access values of a dictionary\n", + " 
token_count_reference = reference_word_counts.get(token, 0)\n", + " \n", + " # Update the overlap by getting the minimum between the two token counts above\n", + " overlap += min(token_count_candidate, token_count_reference)\n", + " \n", + " # Compute the precision\n", + " # Hint: precision = overlap / (number of tokens in candidate list) \n", + " precision = overlap / len(candidate) if len(candidate) > 0 else 0\n", + " \n", + " # Compute the recall\n", + " # Hint: recall = overlap / (number of tokens in reference list) \n", + " recall = overlap / len(reference) if len(reference) > 0 else 0\n", + " \n", + " if precision + recall != 0:\n", + " # Compute the Rouge1 Score\n", + " # Hint: This is equivalent to the F1 score\n", + " f1_score = 2 * (precision * recall) / (precision + recall)\n", + " \n", + " return f1_score\n", + " \n", + " ### END CODE HERE ###\n", + " \n", + " return 0 # If precision + recall = 0 then return 0" + ] + }, + { + "cell_type": "code", + "execution_count": 93, + "id": "14bb5295", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "rouge 1 similarity between lists: [1, 2, 3] and [1, 2, 3, 4] is 0.857\n" + ] + } + ], + "source": [ + "l1 = [1, 2, 3]\n", + "l2 = [1, 2, 3, 4]\n", + "\n", + "r1s = rouge1_similarity(l1, l2)\n", + "\n", + "print(f\"rouge 1 similarity between lists: {l1} and {l2} is {r1s:.3f}\")" + ] + }, + { + "cell_type": "markdown", + "id": "afb8c61a", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "rouge 1 similarity between lists: [1, 2, 3] and [1, 2, 3, 4] is 0.857\n", + "\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 94, + "id": "a680132e", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + 
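As a quick sanity check outside the notebook, the clipped-overlap computation above can be sketched very compactly with `Counter` intersection, which keeps the minimum count per token. This is a minimal standalone illustration (the name `rouge1_f1` is ours, not the graded `rouge1_similarity`):

```python
from collections import Counter

def rouge1_f1(candidate, reference):
    """Standalone ROUGE-1 F1 sketch (illustrative, not the graded function)."""
    # Counter & Counter keeps min(candidate_count, reference_count) per token,
    # which is exactly the clipped unigram overlap that ROUGE-1 uses
    overlap = sum((Counter(candidate) & Counter(reference)).values())
    if overlap == 0:
        return 0.0  # also covers empty inputs, avoiding division by zero
    precision = overlap / len(candidate)
    recall = overlap / len(reference)
    # F1 score: harmonic mean of precision and recall
    return 2 * precision * recall / (precision + recall)

print(round(rouge1_f1([1, 2, 3], [1, 2, 3, 4]), 3))  # 0.857
```

The intersection trick replaces the explicit loop over candidate tokens, but computes the same overlap.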
"source": [ + "w1_unittest.test_rouge1_similarity(rouge1_similarity)" + ] + }, + { + "cell_type": "markdown", + "id": "aaf8a058", + "metadata": {}, + "source": [ + "## Computing the Overall Score\n", + "\n", + "\n", + "You will now build a function to generate the overall score for a particular sample. As mentioned in the lectures, you need to compare each sample with all other samples. For instance, if we generated 30 sentences, we will need to compare sentence 1 to sentences 2 through 30. Then, we compare sentence 2 to sentences 1 and 3 through 30, and so forth. At each step, we average the scores of all comparisons to obtain the overall score for a particular sample. To illustrate, these will be the steps to generate the scores of a 4-sample list.\n", + "\n", + "- Get similarity score between sample 1 and sample 2\n", + "- Get similarity score between sample 1 and sample 3\n", + "- Get similarity score between sample 1 and sample 4\n", + "- Get the average score of the first 3 steps. This will be the overall score of sample 1.\n", + "- Iterate and repeat until samples 1 to 4 have overall scores.\n", + "\n", + "\n", + "The results will be stored in a dictionary for easy lookups.\n", + "\n", + "\n", + "## Exercise 7 - average_overlap\n", + "\n", + "Complete the `average_overlap` function below, which should implement the process described above:" + ] + }, + { + "cell_type": "code", + "execution_count": 95, + "id": "142264ff", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: average_overlap\n", + "def average_overlap(samples, similarity_fn):\n", + " \"\"\"Computes the average overlap score of each candidate sentence in the samples\n", + "\n", + " Args:\n", + " samples (list[list[int]]): Tokenized version of translated sentences\n", + " similarity_fn (Function): Similarity function used to compute the overlap\n", + "\n", + " Returns:\n", + " dict[int, float]: A dictionary mapping the index of each 
translation to its score\n", + " \"\"\"\n", + " # Initialize dictionary\n", + " scores = {}\n", + " \n", + " # Iterate through all samples (enumerate helps keep track of indexes)\n", + " for index_candidate, candidate in enumerate(samples): \n", + " \n", + " ### START CODE HERE ###\n", + " \n", + " # Initially overlap is zero\n", + " overlap = 0.0\n", + " \n", + " # Iterate through all samples (enumerate helps keep track of indexes)\n", + " for index_sample, sample in enumerate(samples):\n", + "\n", + " # Skip if the candidate index is the same as the sample index\n", + " if index_candidate == index_sample:\n", + " continue\n", + " \n", + " # Get the overlap between candidate and sample using the similarity function\n", + " sample_overlap = similarity_fn(candidate, sample)\n", + " \n", + " # Add the sample overlap to the total overlap\n", + " overlap += sample_overlap\n", + "\n", + " ### END CODE HERE ###\n", + " \n", + " # Get the score for the candidate by computing the average\n", + " score = overlap / (len(samples) - 1)\n", + "\n", + " # Only use 3 decimal points\n", + " score = round(score, 3)\n", + " \n", + " # Save the score in the dictionary. 
use index as the key.\n", + " scores[index_candidate] = score\n", + " \n", + " return scores" + ] + }, + { + "cell_type": "code", + "execution_count": 96, + "id": "f36cf403", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "average overlap between lists: [1, 2, 3], [1, 2, 4] and [1, 2, 4, 5] using Jaccard similarity is:\n", + "\n", + "{0: 0.45, 1: 0.625, 2: 0.575}\n" + ] + } + ], + "source": [ + "# Test with Jaccard similarity\n", + "\n", + "l1 = [1, 2, 3]\n", + "l2 = [1, 2, 4]\n", + "l3 = [1, 2, 4, 5]\n", + "\n", + "avg_ovlp = average_overlap([l1, l2, l3], jaccard_similarity)\n", + "\n", + "print(f\"average overlap between lists: {l1}, {l2} and {l3} using Jaccard similarity is:\\n\\n{avg_ovlp}\")" + ] + }, + { + "cell_type": "markdown", + "id": "e277aed2-a5c9-4ed0-9ee2-614939f2df7b", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "average overlap between lists: [1, 2, 3], [1, 2, 4] and [1, 2, 4, 5] using Jaccard similarity is:\n", + "\n", + "{0: 0.45, 1: 0.625, 2: 0.575}\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 97, + "id": "d961a304-7c03-4ecb-ba5f-c8747ed3ec39", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "average overlap between lists: [1, 2, 3], [1, 4], [1, 2, 4, 5] and [5, 6] using Rouge1 similarity is:\n", + "\n", + "{0: 0.324, 1: 0.356, 2: 0.524, 3: 0.111}\n" + ] + } + ], + "source": [ + "# Test with Rouge1 similarity\n", + "\n", + "l1 = [1, 2, 3]\n", + "l2 = [1, 4]\n", + "l3 = [1, 2, 4, 5]\n", + "l4 = [5,6]\n", + "\n", + "avg_ovlp = average_overlap([l1, l2, l3, l4], rouge1_similarity)\n", + "\n", + "print(f\"average overlap between lists: {l1}, {l2}, {l3} and {l4} using Rouge1 similarity is:\\n\\n{avg_ovlp}\")" + ] + }, + { + "cell_type": 
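If you want to reproduce the Jaccard numbers above outside the notebook, the pairwise-averaging procedure fits in a few lines. This is a standalone sketch with our own illustrative names (`jaccard`, `avg_overlap`), not the graded code:

```python
def jaccard(a, b):
    # Jaccard similarity over the unique tokens of each list
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

def avg_overlap(samples, sim):
    # For each candidate, average its similarity against every other sample
    return {
        i: round(
            sum(sim(cand, s) for j, s in enumerate(samples) if j != i)
            / (len(samples) - 1),
            3,
        )
        for i, cand in enumerate(samples)
    }

print(avg_overlap([[1, 2, 3], [1, 2, 4], [1, 2, 4, 5]], jaccard))
# {0: 0.45, 1: 0.625, 2: 0.575}
```

Note that skipping the self-comparison (`j != i`) and dividing by `len(samples) - 1` is what makes each score an average over the *other* samples only.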
"markdown", + "id": "30adc749-ffcb-4e82-a8f0-c04a7e39da0a", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "average overlap between lists: [1, 2, 3], [1, 4], [1, 2, 4, 5] and [5, 6] using Rouge1 similarity is:\n", + "\n", + "{0: 0.324, 1: 0.356, 2: 0.524, 3: 0.111}\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 98, + "id": "c41b1fba-fd0f-41e6-9b07-746f64030fe3", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + "source": [ + "w1_unittest.test_average_overlap(average_overlap)" + ] + }, + { + "cell_type": "markdown", + "id": "e4482249", + "metadata": {}, + "source": [ + "In practice, it is also common to see the weighted mean being used to calculate the overall score instead of just the arithmetic mean. This is implemented in the `weighted_avg_overlap` function below and you can use it in your experiments to see which one will give better results:" + ] + }, + { + "cell_type": "code", + "execution_count": 99, + "id": "398714be", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def weighted_avg_overlap(samples, log_probs, similarity_fn):\n", + " \n", + " # Scores dictionary\n", + " scores = {}\n", + " \n", + " # Iterate over the samples\n", + " for index_candidate, candidate in enumerate(samples): \n", + " \n", + " # Initialize overlap and weighted sum\n", + " overlap, weight_sum = 0.0, 0.0\n", + " \n", + " # Iterate over all samples and log probabilities\n", + " for index_sample, (sample, logp) in enumerate(zip(samples, log_probs)):\n", + "\n", + " # Skip if the candidate index is the same as the sample index \n", + " if index_candidate == index_sample:\n", + " continue\n", + " \n", + " # Convert log probability to linear scale\n", + " sample_p = float(np.exp(logp))\n", + "\n", + " 
# Update the weighted sum\n", + " weight_sum += sample_p\n", + "\n", + " # Get the unigram overlap between candidate and sample\n", + " sample_overlap = similarity_fn(candidate, sample)\n", + " \n", + " # Update the overlap\n", + " overlap += sample_p * sample_overlap\n", + " \n", + " # Compute the score for the candidate\n", + " score = overlap / weight_sum\n", + "\n", + " # Only use 3 decimal points\n", + " score = round(score, 3)\n", + " \n", + " # Save the score in the dictionary. use index as the key.\n", + " scores[index_candidate] = score\n", + " \n", + " return scores" + ] + }, + { + "cell_type": "code", + "execution_count": 100, + "id": "e3dfd6d3", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "weighted average overlap using Jaccard similarity is:\n", + "\n", + "{0: 0.443, 1: 0.631, 2: 0.558}\n" + ] + } + ], + "source": [ + "l1 = [1, 2, 3]\n", + "l2 = [1, 2, 4]\n", + "l3 = [1, 2, 4, 5]\n", + "log_probs = [0.4, 0.2, 0.5]\n", + "\n", + "w_avg_ovlp = weighted_avg_overlap([l1, l2, l3], log_probs, jaccard_similarity)\n", + "\n", + "print(f\"weighted average overlap using Jaccard similarity is:\\n\\n{w_avg_ovlp}\")" + ] + }, + { + "cell_type": "markdown", + "id": "cdb0b4db", + "metadata": {}, + "source": [ + "## mbr_decode\n", + "\n", + "You will now put everything together in the `mbr_decode` function below. This final step is not graded, as this function is just a wrapper around all the cool stuff you have coded so far! \n", + "\n", + "You can use it to play around, trying different numbers of samples, temperatures, and similarity functions!" 
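The weighted variant can also be checked with a short standalone sketch: each pairwise similarity is weighted by `exp(log_prob)` of the *other* sample, then normalized by the sum of the weights. The names here (`weighted_avg`, `jaccard`) are ours for illustration, not the notebook's functions:

```python
import math

def jaccard(a, b):
    # Jaccard similarity over the unique tokens of each list
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb)

def weighted_avg(samples, log_probs, sim):
    scores = {}
    for i, cand in enumerate(samples):
        overlap, weight_sum = 0.0, 0.0
        for j, (s, lp) in enumerate(zip(samples, log_probs)):
            if i == j:
                continue  # never compare a candidate with itself
            w = math.exp(lp)  # convert the other sample's log prob to a linear weight
            weight_sum += w
            overlap += w * sim(cand, s)
        scores[i] = round(overlap / weight_sum, 3)
    return scores

print(weighted_avg([[1, 2, 3], [1, 2, 4], [1, 2, 4, 5]], [0.4, 0.2, 0.5], jaccard))
# {0: 0.443, 1: 0.631, 2: 0.558}
```

This matches the expected output of the cell above; with uniform log probs the weights cancel and the result reduces to the plain arithmetic mean.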
+ ] + }, + { + "cell_type": "code", + "execution_count": 101, + "id": "6fcfa640", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def mbr_decode(model, text, n_samples=5, temperature=0.6, similarity_fn=jaccard_similarity):\n", + " \n", + " # Generate samples\n", + " samples, log_probs = generate_samples(model, text, n_samples=n_samples, temperature=temperature)\n", + " \n", + " # Compute the overlap scores\n", + " scores = weighted_avg_overlap(samples, log_probs, similarity_fn)\n", + "\n", + " # Decode samples\n", + " decoded_translations = [tokens_to_text(s, id_to_word).numpy().decode('utf-8') for s in samples]\n", + " \n", + " # Find the key with the highest score\n", + " max_score_key = max(scores, key=lambda k: scores[k])\n", + " \n", + " # Get the translation \n", + " translation = decoded_translations[max_score_key]\n", + " \n", + " return translation, decoded_translations" + ] + }, + { + "cell_type": "code", + "execution_count": 102, + "id": "99507fcc-7727-45e7-933b-d3a08034f731", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Translation candidates:\n", + "eu adoro idiomas .\n", + "eu adoro idiomas .\n", + "eu sinto idiomas .\n", + "eu adoro idiomas .\n", + "eu adoro idiomas .\n", + "eu adoro idiomas .\n", + "eu adoro idiomas .\n", + "eu adoro idiomas .\n", + "eu adoro idiomas .\n", + "eu adoro idiomas .\n", + "\n", + "Selected translation: eu adoro idiomas .\n" + ] + } + ], + "source": [ + "english_sentence = \"I love languages\"\n", + "\n", + "translation, candidates = mbr_decode(trained_translator, english_sentence, n_samples=10, temperature=0.6)\n", + "\n", + "print(\"Translation candidates:\")\n", + "for c in candidates:\n", + " print(c)\n", + "\n", + "print(f\"\\nSelected translation: {translation}\")" + ] + }, + { + "cell_type": "markdown", + "id": 
"801b193f-4ea6-4ca1-ae29-a506cce656d9", + "metadata": {}, + "source": [ + "**Congratulations!** Next week, you'll dive deeper into attention models and study the Transformer architecture. You will build another network but without the recurrent part. It will show that attention is all you need! It should be fun!\n", + "\n", + "**Keep up the good work!**" + ] + } + ], + "metadata": { + "grader_version": "1", + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.10" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/ult.cpython-38.pyc b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/ult.cpython-38.pyc new file mode 100644 index 0000000000000000000000000000000000000000..fead874d1ddf27003ea5f34291518968e350f3a3 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/ult.cpython-38.pyc differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/utils.cpython-311.pyc b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/utils.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..a7a208a56b7f83e9e1b46384b7f35d1993b90881 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/utils.cpython-311.pyc differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/utils.cpython-38.pyc b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/utils.cpython-38.pyc new file mode 100644 index 
0000000000000000000000000000000000000000..3cc81c991d9baafab6c8de7861e621cc14e1b8a7 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/utils.cpython-38.pyc differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/w1_unittest.cpython-311.pyc b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/w1_unittest.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..82e9075a514e2062eec7eefbedac0f04bad832c8 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/w1_unittest.cpython-311.pyc differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/w1_unittest.cpython-37.pyc b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/w1_unittest.cpython-37.pyc new file mode 100644 index 0000000000000000000000000000000000000000..30a17cb358f2bc35bdf278475b1e0eca4133f883 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/w1_unittest.cpython-37.pyc differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/w1_unittest.cpython-38.pyc b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/w1_unittest.cpython-38.pyc new file mode 100644 index 0000000000000000000000000000000000000000..7913bb323260dfcd3c1e900469ec7848a3827375 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/__pycache__/w1_unittest.cpython-38.pyc differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/NMTModel.png b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/NMTModel.png new file mode 100644 index 0000000000000000000000000000000000000000..9eb9be706de3d39bc9b5f7fc8fb0829f0ae57737 --- /dev/null +++ b/NLP with Attention 
Models/NMT_with_Attention/NMT with MBR/Files/tf/images/NMTModel.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:f9b251a60aedde3c2140fd11a35c464974072ad3797302a04904c7925f11e16e +size 131252 diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/att.png b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/att.png new file mode 100644 index 0000000000000000000000000000000000000000..498372a0db8d9a6a0e40daf9c1debaee6d4773a2 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/att.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8e31fd29f7b79a45a65bae5b8a355857dd67bfcc5e01ea2d5f9cefa08e393044 +size 134527 diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/attention.png b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/attention.png new file mode 100644 index 0000000000000000000000000000000000000000..8571d6fccbb6e4bcae36052ab6599f507628cafd --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/attention.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:dda2f8d5bd98a195202b059c41088d33cbc430a9d5e5ad4bf8df4b4c205ed8a6 +size 244941 diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/attention_overview.png b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/attention_overview.png new file mode 100644 index 0000000000000000000000000000000000000000..b49d312485fba3a689ab9d63e37a115fa212f1d0 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/attention_overview.png differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/input_encoder.png b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/input_encoder.png new file mode 100644 index 
0000000000000000000000000000000000000000..0f86c9ae678dbb3906d4825486c0b581be3467af Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/input_encoder.png differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/plain_rnn.png b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/plain_rnn.png new file mode 100644 index 0000000000000000000000000000000000000000..ac0ea5b93fc616eafddac37c00fef9e858e65c49 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/plain_rnn.png differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/pre_attention_decoder.png b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/pre_attention_decoder.png new file mode 100644 index 0000000000000000000000000000000000000000..222629ff83656c04a4974d4d7a5f2db3b945ddc5 Binary files /dev/null and b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/images/pre_attention_decoder.png differ diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/por-eng/por.txt b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/por-eng/por.txt new file mode 100644 index 0000000000000000000000000000000000000000..96366b0f63623e56cbce5bf88711756b88644ba6 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/por-eng/por.txt @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:5307326410edc8f65c3d0213cf3e5544e41c400efed7fe9ba557c8847a6ee803 +size 27856622 diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/utils.py b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/utils.py new file mode 100644 index 0000000000000000000000000000000000000000..147521aaab7266fea918c9a9aaea9f5a17e08518 --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/NMT with 
MBR/Files/tf/utils.py @@ -0,0 +1,114 @@ +import numpy as np +import tensorflow as tf +import tensorflow_text as tf_text +import pathlib + +path_to_file = pathlib.Path("por-eng/por.txt") + +np.random.seed(1234) +tf.random.set_seed(1234) + +def load_data(path): + text = path.read_text(encoding="utf-8") + + lines = text.splitlines() + pairs = [line.split("\t") for line in lines] + + context = np.array([context for target, context, _ in pairs]) + target = np.array([target for target, context, _ in pairs]) + + return context, target + + +portuguese_sentences, english_sentences = load_data(path_to_file) + +sentences = (portuguese_sentences, english_sentences) + +BUFFER_SIZE = len(english_sentences) +BATCH_SIZE = 64 + +is_train = np.random.uniform(size=(len(portuguese_sentences),)) < 0.8 + +train_raw = ( + tf.data.Dataset.from_tensor_slices( + (english_sentences[is_train], portuguese_sentences[is_train]) + ) + .shuffle(BUFFER_SIZE) + .batch(BATCH_SIZE) +) +val_raw = ( + tf.data.Dataset.from_tensor_slices( + (english_sentences[~is_train], portuguese_sentences[~is_train]) + ) + .shuffle(BUFFER_SIZE) + .batch(BATCH_SIZE) +) + + +def tf_lower_and_split_punct(text): + text = tf_text.normalize_utf8(text, "NFKD") + text = tf.strings.lower(text) + text = tf.strings.regex_replace(text, "[^ a-z.?!,¿]", "") + text = tf.strings.regex_replace(text, "[.?!,¿]", r" \0 ") + text = tf.strings.strip(text) + text = tf.strings.join(["[SOS]", text, "[EOS]"], separator=" ") + return text + + +max_vocab_size = 12000 + +english_vectorizer = tf.keras.layers.TextVectorization( + standardize=tf_lower_and_split_punct, max_tokens=max_vocab_size, ragged=True +) + +english_vectorizer.adapt(train_raw.map(lambda context, target: context)) + +portuguese_vectorizer = tf.keras.layers.TextVectorization( + standardize=tf_lower_and_split_punct, max_tokens=max_vocab_size, ragged=True +) + +portuguese_vectorizer.adapt(train_raw.map(lambda context, target: target)) + + +def process_text(context, target): + context 
= english_vectorizer(context).to_tensor() + target = portuguese_vectorizer(target) + targ_in = target[:, :-1].to_tensor() + targ_out = target[:, 1:].to_tensor() + return (context, targ_in), targ_out + + +train_data = train_raw.map(process_text, tf.data.AUTOTUNE) +val_data = val_raw.map(process_text, tf.data.AUTOTUNE) + +del train_raw +del val_raw + + +def masked_loss(y_true, y_pred): + + loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True, reduction='none') + loss = loss_fn(y_true, y_pred) + + # Check which elements of y_true are padding + mask = tf.cast(y_true != 0, loss.dtype) + + loss *= mask + # Return the total. + return tf.reduce_sum(loss)/tf.reduce_sum(mask) + + +def masked_acc(y_true, y_pred): + y_pred = tf.argmax(y_pred, axis=-1) + y_pred = tf.cast(y_pred, y_true.dtype) + match = tf.cast(y_true == y_pred, tf.float32) + mask = tf.cast(y_true != 0, tf.float32) + match*= mask + + return tf.reduce_sum(match)/tf.reduce_sum(mask) + + +def tokens_to_text(tokens, id_to_word): + words = id_to_word(tokens) + result = tf.strings.reduce_join(words, axis=-1, separator=" ") + return result diff --git a/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/w1_unittest.py b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/w1_unittest.py new file mode 100644 index 0000000000000000000000000000000000000000..73e36ba8708bcab428cb00ef6758045684abe25b --- /dev/null +++ b/NLP with Attention Models/NMT_with_Attention/NMT with MBR/Files/tf/w1_unittest.py @@ -0,0 +1,702 @@ +import math +from itertools import combinations +import tensorflow as tf +import numpy as np +from dlai_grader.grading import test_case, print_feedback +from utils import train_data + +VOCAB_SIZE = 12000 +UNITS = 256 + + +def test_encoder(encoder_to_test): + def g(): + vocab_sizes = [5, 20, 1000, 15000] + units = [32, 64, 256, 512] + + cases = [] + + encoder = encoder_to_test(vocab_sizes[0], units[0]) + + t = test_case() + if encoder.embedding.mask_zero != True: 
+ t.failed = True + t.msg = "Embedding layer has incorrect value for 'mask_zero' attribute" + t.want = True + t.got = encoder.embedding.mask_zero + cases.append(t) + + for vs, u in zip(vocab_sizes, units): + encoder = encoder_to_test(vs, u) + + t = test_case() + if encoder.embedding.input_dim != vs: + t.failed = True + t.msg = "Incorrect input dim of embedding layer" + t.want = vs + t.got = encoder.embedding.input_dim + cases.append(t) + + t = test_case() + if encoder.embedding.output_dim != u: + t.failed = True + t.msg = "Incorrect output dim of embedding layer" + t.want = u + t.got = encoder.embedding.output_dim + cases.append(t) + + t = test_case() + if not isinstance(encoder.rnn.layer, tf.keras.layers.LSTM): + t.failed = True + t.msg = "Incorrect type of layer inside Bidirectional" + t.want = tf.keras.layers.LSTM + t.got = type(encoder.rnn.layer) + return [t] + + for u in units: + encoder = encoder_to_test(vocab_sizes[1], u) + t = test_case() + if encoder.rnn.layer.units != u: + t.failed = True + t.msg = "Incorrect number of units in LSTM layer" + t.want = u + t.got = encoder.rnn.layer.units + cases.append(t) + + t = test_case() + if encoder.rnn.layer.return_sequences != True: + t.failed = True + t.msg = "LSTM layer has incorrect value for 'return_sequences' attribute" + t.want = True + t.got = encoder.rnn.layer.return_sequences + cases.append(t) + + vocab_size = 16 + n_units = 8 + encoder = encoder_to_test(vocab_size, n_units) + to_translate = np.array([[1, 2, 3, 4, 5, 6, 14, 0, 0, 0], + [2, 1, 1, 1, 1, 1, 8, 0, 0, 0], + [5, 4, 2, 3, 3, 15, 11, 0, 0, 0]]) + #for (to_translate, _), _ in train_data.take(3): + + first_dim_in, second_dim_in = to_translate.shape + encoder_output = encoder(to_translate) + t = test_case() + if len(encoder_output.shape) != 3: + t.failed = True + t.msg = "Incorrect shape of encoder output" + t.want = "a shape with 3 dimensions" + t.got = encoder_output.shape + return [t] + + first_dim_out, second_dim_out, third_dim_out = 
encoder_output.shape + + t = test_case() + if first_dim_in != first_dim_out: + t.failed = True + t.msg = "Incorrect first dimension of encoder output" + t.want = first_dim_in + t.got = first_dim_out + cases.append(t) + + t = test_case() + if second_dim_in != second_dim_out: + t.failed = True + t.msg = "Incorrect second dimension of encoder output" + t.want = second_dim_in + t.got = second_dim_out + cases.append(t) + + t = test_case() + if third_dim_out != n_units: + t.failed = True + t.msg = "Incorrect third dimension of encoder output" + t.want = n_units + t.got = third_dim_out + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) + + +def test_cross_attention(cross_attention_to_test): + def g(): + units = [32, 64, 256, 512] + + cases = [] + + n_units = 512 + cross_attention = cross_attention_to_test(n_units) + + t = test_case() + if not isinstance(cross_attention.mha, tf.keras.layers.MultiHeadAttention): + t.failed = True + t.msg = "Incorrect type of layer for Multi Head Attention" + t.want = tf.keras.layers.MultiHeadAttention + t.got = type(cross_attention.mha) + return [t] + + # for u in units: + # cross_attention = cross_attention_to_test(u) + + # t = test_case() + # if cross_attention.mha.key_dim != u: + # t.failed = True + # t.msg = "Incorrect key dim of Multi Head Attention layer" + # t.want = u + # t.got = cross_attention.mha.key_dim + # cases.append(t) + + cross_attention = cross_attention_to_test(n_units) + embed = tf.keras.layers.Embedding(VOCAB_SIZE, output_dim=UNITS, mask_zero=True) + + for (to_translate, sr_translation), _ in train_data.take(3): + sr_translation_embed = embed(sr_translation) + first_dim_in, second_dim_in, third_dim_in = sr_translation_embed.shape + dummy_encoder_output = np.random.rand(64, 14, 512) + cross_attention_output = cross_attention( + dummy_encoder_output, sr_translation_embed + ) + # print(cross_attention_output.shape) + + t = test_case() + if len(cross_attention_output.shape) != 3: + t.failed = True + 
t.msg = "Incorrect shape of cross_attention output" + t.want = "a shape with 3 dimensions" + t.got = cross_attention_output.shape + return [t] + + first_dim_out, second_dim_out, third_dim_out = cross_attention_output.shape + + t = test_case() + if first_dim_in != first_dim_out: + t.failed = True + t.msg = "Incorrect first dimension of cross_attention output" + t.want = first_dim_in + t.got = first_dim_out + cases.append(t) + + t = test_case() + if second_dim_in != second_dim_out: + t.failed = True + t.msg = "Incorrect second dimension of cross_attention output" + t.want = second_dim_in + t.got = second_dim_out + cases.append(t) + + t = test_case() + if third_dim_in != third_dim_out: + t.failed = True + t.msg = "Incorrect third dimension of cross_attention output" + t.want = third_dim_in + t.got = third_dim_out + cases.append(t) + + _, n_heads, key_dim = cross_attention.mha.get_weights()[0].shape + + t = test_case() + if n_heads != 1: + t.failed = True + t.msg = "Incorrect number of attention heads" + t.want = 1 + t.got = n_heads + cases.append(t) + + t = test_case() + if key_dim != n_units: + t.failed = True + t.msg = f"Incorrect size of query and key for every attention head when passing {n_units} units to the constructor" + t.want = n_units + t.got = key_dim + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) + + +def test_decoder(decoder_to_test, CrossAttention): + def g(): + vocab_sizes = [5, 20, 1000, 15000] + units = [32, 64, 256, 512] + + cases = [] + + vocab_size = 10000 + n_units = 512 + decoder = decoder_to_test(vocab_size, n_units) + + t = test_case() + if not isinstance(decoder.embedding, tf.keras.layers.Embedding): + t.failed = True + t.msg = "Incorrect type of embedding layer" + t.want = tf.keras.layers.Embedding + t.got = type(decoder.embedding) + return [t] + + t = test_case() + if decoder.embedding.mask_zero != True: + t.failed = True + t.msg = "Embedding layer has incorrect value for 'mask_zero' attribute" + t.want = True + 
t.got = decoder.embedding.mask_zero + cases.append(t) + + for vs, u in zip(vocab_sizes, units): + decoder = decoder_to_test(vs, u) + + t = test_case() + if decoder.embedding.input_dim != vs: + t.failed = True + t.msg = "Incorrect input dim of embedding layer" + t.want = vs + t.got = decoder.embedding.input_dim + cases.append(t) + + t = test_case() + if decoder.embedding.output_dim != u: + t.failed = True + t.msg = "Incorrect output dim of embedding layer" + t.want = u + t.got = decoder.embedding.output_dim + cases.append(t) + + t = test_case() + if not isinstance(decoder.pre_attention_rnn, tf.keras.layers.LSTM): + t.failed = True + t.msg = "Incorrect type of pre_attention_rnn layer" + t.want = tf.keras.layers.LSTM + t.got = type(decoder.pre_attention_rnn) + return [t] + + for u in units: + decoder = decoder_to_test(vocab_size, u) + t = test_case() + if decoder.pre_attention_rnn.units != u: + t.failed = True + t.msg = "Incorrect number of units in pre_attention_rnn layer" + t.want = u + t.got = decoder.pre_attention_rnn.units + cases.append(t) + + # t = test_case() + # if decoder.attention.units != u: + # t.failed = True + # t.msg = "Incorrect number of units in attention layer" + # t.want = u + # t.got = decoder.attention.units + # cases.append(t) + + t = test_case() + if decoder.post_attention_rnn.units != u: + t.failed = True + t.msg = "Incorrect number of units in post_attention_rnn layer" + t.want = u + t.got = decoder.post_attention_rnn.units + cases.append(t) + + t = test_case() + if decoder.pre_attention_rnn.return_sequences != True: + t.failed = True + t.msg = "pre_attention_rnn layer has incorrect value for 'return_sequences' attribute" + t.want = True + t.got = decoder.pre_attention_rnn.return_sequences + cases.append(t) + + t = test_case() + if decoder.pre_attention_rnn.return_state != True: + t.failed = True + t.msg = "pre_attention_rnn layer has incorrect value for 'return_state' attribute" + t.want = True + t.got = 
decoder.pre_attention_rnn.return_state + cases.append(t) + + t = test_case() + if not isinstance(decoder.attention, CrossAttention): + t.failed = True + t.msg = "Incorrect type of attention layer" + t.want = CrossAttention + t.got = type(decoder.attention) + return [t] + + t = test_case() + if decoder.post_attention_rnn.return_sequences != True: + t.failed = True + t.msg = "post_attention_rnn layer has incorrect value for 'return_sequences' attribute" + t.want = True + t.got = decoder.post_attention_rnn.return_sequences + cases.append(t) + + t = test_case() + if not isinstance(decoder.post_attention_rnn, tf.keras.layers.LSTM): + t.failed = True + t.msg = "Incorrect type of post_attention_rnn layer" + t.want = tf.keras.layers.LSTM + t.got = type(decoder.post_attention_rnn) + return [t] + + t = test_case() + if not isinstance(decoder.output_layer, tf.keras.layers.Dense): + t.failed = True + t.msg = "Incorrect type of output_layer layer" + t.want = tf.keras.layers.Dense + t.got = type(decoder.output_layer) + return [t] + + t = test_case() + if ( + "log" not in decoder.output_layer.activation.__name__ + or "softmax" not in decoder.output_layer.activation.__name__ + ): + t.failed = True + t.msg = "output_layer layer has incorrect activation function" + t.want = "a log softmax activation function such as 'log_softmax_v2'" + t.got = decoder.output_layer.activation.__name__ + cases.append(t) + + vocab_size = 6 + n_units = 4 + decoder = decoder_to_test(vocab_size, n_units) + sr_translation = np.array([[3, 4, 5, 3, 3, 3, 5, 1, 1, 1, 1, 1], + [1, 2, 3, 4, 5, 1, 1, 0, 0, 0, 0, 0]]) + encoder_output = np.random.rand(2, 10, n_units) + decoder_output = decoder(encoder_output, sr_translation) + + first_dim_in, second_dim_in = sr_translation.shape + + t = test_case() + if len(decoder_output.shape) != 3: + t.failed = True + t.msg = "Incorrect shape of decoder output" + t.want = "a shape with 3 dimensions" + t.got = decoder_output.shape + return [t] + + first_dim_out, second_dim_out,
third_dim_out = decoder_output.shape + + t = test_case() + if first_dim_in != first_dim_out: + t.failed = True + t.msg = "Incorrect first dimension of decoder output" + t.want = first_dim_in + t.got = first_dim_out + cases.append(t) + + t = test_case() + if second_dim_in != second_dim_out: + t.failed = True + t.msg = "Incorrect second dimension of decoder output" + t.want = second_dim_in + t.got = second_dim_out + cases.append(t) + + t = test_case() + if third_dim_out != vocab_size: + t.failed = True + t.msg = "Incorrect third dimension of decoder output" + t.want = vocab_size + t.got = third_dim_out + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) + + +def test_translator(translator_to_test, Encoder, Decoder): + def g(): + vocab_sizes = [5, 20, 1000, 15000] + units = [32, 64, 256, 512] + + cases = [] + + vocab_size = 10000 + n_units = 512 + translator = translator_to_test(vocab_size, n_units) + + t = test_case() + if not isinstance(translator.encoder, Encoder): + t.failed = True + t.msg = "Incorrect type of encoder layer" + t.want = Encoder + t.got = type(translator.encoder) + return [t] + + t = test_case() + if not isinstance(translator.decoder, Decoder): + t.failed = True + t.msg = "Incorrect type of decoder layer" + t.want = Decoder + t.got = type(translator.decoder) + return [t] + + vocab_size = 16 + n_units = 8 + translator = translator_to_test(vocab_size, n_units) + + to_translate = np.array([[1, 2, 3, 4, 5, 0, 0], + [5, 2, 3, 4, 5, 6, 0], + [6, 3, 3, 4, 5, 3, 3], + [7, 9, 9, 6, 5, 3, 3]]) + + sr_translation = np.
array([[8, 1, 2, 3, 4, 5, 0, 0], + [9, 5, 2, 3, 4, 5, 6, 0], + [10, 6, 3, 3, 4, 5, 3, 3], + [11, 7, 9, 9, 6, 5, 3, 3]]) + + #for (to_translate, sr_translation), _ in train_data.take(3): + first_dim_in, second_dim_in = sr_translation.shape + translator_output = translator((to_translate, sr_translation)) + t = test_case() + if len(translator_output.shape) != 3: + t.failed = True + t.msg = "Incorrect shape of translator output" + t.want = "a shape with 3 dimensions" + t.got = translator_output.shape + return [t] + + first_dim_out, second_dim_out, third_dim_out = translator_output.shape + + t = test_case() + if first_dim_in != first_dim_out: + t.failed = True + t.msg = "Incorrect first dimension of translator output" + t.want = first_dim_in + t.got = first_dim_out + cases.append(t) + + t = test_case() + if second_dim_in != second_dim_out: + t.failed = True + t.msg = "Incorrect second dimension of translator output" + t.want = second_dim_in + t.got = second_dim_out + cases.append(t) + + t = test_case() + if third_dim_out != vocab_size: + t.failed = True + t.msg = "Incorrect third dimension of translator output" + t.want = vocab_size + t.got = third_dim_out + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) + + + +def test_translate(learner_func, model): + def g(): + + cases = [] + + txt = "Hi, my name is Younes" + try: + translation, logit, tokens = learner_func(model, txt, temperature=0.9) + except Exception as e: + t = test_case() + t.failed = True + t.msg = "There was an exception when running your function" + t.want = "No exceptions" + t.got = f"{str(e)}" + return [t] + + txt = "Hi, my name is Alejandra" + translation, logit, tokens = learner_func(model, txt, temperature=0.0) + + t = test_case() + + if not isinstance(translation, str): + t.failed = True + t.msg = "'translation' has incorrect type" + t.want = str + t.got = type(translation) + cases.append(t) + + if not isinstance(logit, np.number): + t.failed = True + t.msg = "'logit' has 
incorrect type" + t.want = np.number + t.got = type(logit) + cases.append(t) + + if not isinstance(tokens, tf.Tensor): + t.failed = True + t.msg = "'tokens' has incorrect type" + t.want = tf.Tensor + t.got = type(tokens) + cases.append(t) + + translation2, logit2, tokens2 = learner_func(model, txt, temperature=0.0) + + t = test_case() + if translation != translation2: + t.failed = True + t.msg = "translate didn't return the same translation when using temperature of 0.0" + t.want = translation + t.got = translation2 + cases.append(t) + + t = test_case() + if logit != logit2: + t.failed = True + t.msg = "translate didn't return the same logit when using temperature of 0.0" + t.want = logit + t.got = logit2 + cases.append(t) + + t = test_case() + if not np.allclose(tokens, tokens2): + t.failed = True + t.msg = "translate didn't return the same tokens when using temperature of 0.0" + t.want = tokens + t.got = tokens2 + cases.append(t) + + # Check that the function uses the model.decoder and model.encoder attributes + inputs = tf.keras.Input(shape=(37,)) + outputs = tf.keras.layers.Dense(5, activation="softmax")(inputs) + model_fake = tf.keras.Model(inputs = inputs, outputs = outputs) + + model_fake.encoder = model.encoder + model_fake.decoder = None + t = test_case() + try: + ff = learner_func(model_fake, "Hello world", temperature=0.0) + t.failed = True + t.msg = "The translator is not using the internal model.decoder. You are probably using a global variable" + t.want = "Fail translation" + t.got = "Succeed translation with wrong decoder" + except: + None + + cases.append(t) + + model_fake.encoder = None + model_fake.decoder = model.decoder + t = test_case() + try: + ff = learner_func(model_fake, "Hello world", temperature=0.0) + t.failed = True + t.msg = "The translator is not using the internal model.encoder.
You are probably using a global variable" + t.want = "Fail translation" + t.got = "Succeed translation with wrong encoder" + except: + None + + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) + + + + +def test_rouge1_similarity(learner_func): + + def g(): + + tensors = [ + [0], + [0, 1], + [0, 1, 2], + [1, 2, 4, 5], + [5, 5, 7, 0, 232] + ] + + expected = [0.6666666666666666, 0.5, 0, 0.33333333333333337, 0.8, 0.3333333333333333, 0.28571428571428575, 0.5714285714285715, 0.25] + + cases = [] + pairs = list(combinations(tensors, 2)) + + for (candidate, reference), solution in zip(pairs, expected): + answer = learner_func(candidate, reference) + t = test_case() + if not math.isclose(answer, solution): + t.failed = True + t.msg = f"Incorrect similarity for candidate={candidate} and reference={reference}" + t.want = solution + t.got = answer + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) + + +def test_average_overlap(learner_func): + + def jaccard_similarity(candidate, reference): + + # Convert the lists to sets to get the unique tokens + candidate_set = set(candidate) + reference_set = set(reference) + + # Get the set of tokens common to both candidate and reference + common_tokens = candidate_set.intersection(reference_set) + + # Get the set of all tokens found in either candidate or reference + all_tokens = candidate_set.union(reference_set) + + # Compute the percentage of overlap (divide the number of common tokens by the number of all tokens) + overlap = len(common_tokens) / len(all_tokens) + + return overlap + + def g(): + + l1 = [1, 2, 3] + l2 = [1, 2, 4] + l3 = [1, 2, 4, 5] + l4 = [5,6] + + elements = [l1, l2, l3, l4] + + all_combinations = [] + + for r in range(2, len(elements) + 1): + # Generate combinations of length r + combinations_r = combinations(elements, r) + + # Append the combinations to the result list + all_combinations.extend(combinations_r) + + expected = [{0: 0.5, 1: 0.5}, + {0: 0.4, 1: 0.4}, + 
{0: 0.0, 1: 0.0}, + {0: 0.75, 1: 0.75}, + {0: 0.0, 1: 0.0}, + {0: 0.2, 1: 0.2}, + {0: 0.45, 1: 0.625, 2: 0.575}, + {0: 0.25, 1: 0.25, 2: 0.0}, + {0: 0.2, 1: 0.3, 2: 0.1}, + {0: 0.375, 1: 0.475, 2: 0.1}, + {0: 0.3, 1: 0.417, 2: 0.45, 3: 0.067}] + + cases = [] + + for combination, solution in zip(all_combinations, expected): + answer = learner_func(combination, jaccard_similarity) + t = test_case() + if answer != solution: + t.failed = True + t.msg = f"Incorrect overlap for lists={combination}" + t.want = solution + t.got = answer + cases.append(t) + + return cases + + cases = g() + print_feedback(cases) diff --git a/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/.ipynb_checkpoints/C4W3_SentencePiece_and_BPE-checkpoint.ipynb b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/.ipynb_checkpoints/C4W3_SentencePiece_and_BPE-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..115fdbdb7fa989c62491a7ec5d6bafbea0686f11 --- /dev/null +++ b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/.ipynb_checkpoints/C4W3_SentencePiece_and_BPE-checkpoint.ipynb @@ -0,0 +1,633 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# SentencePiece and BPE " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Introduction to Tokenization" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In order to process text in neural network models it is first required to **encode** text as numbers with ids, since the tensor operations act on numbers. Finally, if the output of the network is to be words, it is required to **decode** the predicted tokens ids back to text.\n", + "\n", + "To encode text, the first decision that has to be made is to what level of granularity are we going to consider the text? Because ultimately, from these **tokens**, features are going to be created about them. 
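The ROUGE-1 and average-overlap tests above pin down the expected behavior numerically. A minimal reference sketch consistent with those expected values — ROUGE-1 as a unigram-overlap F1 score, and average overlap as each sample's mean similarity to the other samples, rounded to 3 decimals — might look like this (our own sketch, not the assignment's solution):

```python
from collections import Counter

def rouge1_similarity(candidate, reference):
    # Clipped unigram counts give the overlap used for precision and recall.
    cand, ref = Counter(candidate), Counter(reference)
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

def average_overlap(samples, similarity_fn):
    # Score each sample by its mean similarity to every other sample,
    # rounded to 3 decimals as the expected values suggest.
    scores = {}
    for i, cand in enumerate(samples):
        total = sum(similarity_fn(cand, other)
                    for j, other in enumerate(samples) if j != i)
        scores[i] = round(total / (len(samples) - 1), 3)
    return scores

print(rouge1_similarity([0, 1], [0, 1, 2]))
print(average_overlap([[1, 2, 3], [1, 2, 4]], rouge1_similarity))
```

Plugged together with the `jaccard_similarity` helper from the test file, `average_overlap` reproduces the expected dictionaries, e.g. `{0: 0.5, 1: 0.5}` for `[1, 2, 3]` vs `[1, 2, 4]`.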
Many different experiments have been carried out using *words*, *morphological units*, *phonemic units* or *characters* as tokens. For example, \n", + "\n", + "- Tokens are tricky. (raw text)\n", + "- Tokens are tricky . ([words](https://arxiv.org/pdf/1301.3781))\n", + "- Token s _ are _ trick _ y . ([morphemes](https://arxiv.org/pdf/1907.02423.pdf))\n", + "- t oʊ k ə n z _ ɑː _ ˈt r ɪ k i. ([phonemes](https://www.aclweb.org/anthology/W18-5812.pdf), for STT)\n", + "- T o k e n s _ a r e _ t r i c k y . ([character](https://www.aclweb.org/anthology/C18-1139/))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "But how to identify these units, such as words, is largely determined by the language they come from. For example, in many European languages a space is used to separate words, while in some Asian languages there are no spaces between words. Compare English and Mandarin.\n", + "\n", + "- Tokens are tricky. (original sentence)\n", + "- 标记很棘手 (Mandarin)\n", + "- Biāojì hěn jíshǒu (pinyin)\n", + "- 标记 很 棘手 (Mandarin with spaces)\n", + "\n", + "\n", + "So, the ability to **tokenize**, i.e. split text into meaningful fundamental units, is not always straightforward.\n", + "\n", + "Also, there are practical issues of how large our *vocabulary* of words, `vocab_size`, should be, considering memory limitations vs. coverage. A compromise may need to be made between: \n", + "* the finest-grained models employing characters which can be memory intensive and \n", + "* more computationally efficient *subword* units such as [n-grams](https://arxiv.org/pdf/1712.09405) or larger units.\n", + "\n", + "In [SentencePiece](https://www.aclweb.org/anthology/D18-2012.pdf) unicode characters are grouped together using either a [unigram language model](https://www.aclweb.org/anthology/P18-1007.pdf) (used in this week's assignment) or [BPE](https://arxiv.org/pdf/1508.07909.pdf), **byte-pair encoding**.
We will discuss BPE, since BERT and many of its variants use a modified version of BPE and its pseudocode is easy to implement and understand... hopefully!" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## SentencePiece Preprocessing\n", + "### NFKC Normalization" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Unsurprisingly, even using unicode to initially tokenize text can be ambiguous, e.g., " + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "é = é : False\n" + ] + } + ], + "source": [ + "eaccent = '\\u00E9'\n", + "e_accent = '\\u0065\\u0301'\n", + "print(f'{eaccent} = {e_accent} : {eaccent == e_accent}')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "SentencePiece uses the Unicode standard normalization form, [NFKC](https://en.wikipedia.org/wiki/Unicode_equivalence), so this isn't an issue. Looking at the example from above but with normalization:" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "é = é : True\n" + ] + } + ], + "source": [ + "from unicodedata import normalize\n", + "\n", + "norm_eaccent = normalize('NFKC', '\\u00E9')\n", + "norm_e_accent = normalize('NFKC', '\\u0065\\u0301')\n", + "print(f'{norm_eaccent} = {norm_e_accent} : {norm_eaccent == norm_e_accent}')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Normalization has actually changed the unicode code point (unicode unique id) for one of these two characters." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "def get_hex_encoding(s):\n", + " return ' '.join(hex(ord(c)) for c in s)\n", + "\n", + "def print_string_and_encoding(s):\n", + " print(f'{s} : {get_hex_encoding(s)}') " + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "é : 0xe9\n", + "é : 0x65 0x301\n", + "é : 0xe9\n", + "é : 0xe9\n" + ] + } + ], + "source": [ + "for s in [eaccent, e_accent, norm_eaccent, norm_e_accent]:\n", + " print_string_and_encoding(s)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This normalization has other side effects which may be considered useful, such as converting curly quotes “ to their ASCII equivalent \". (Although we now lose the directionality of the quote...)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Lossless Tokenization\n", + "\n", + "SentencePiece also ensures that when you tokenize and then detokenize your data, the original position of white space is preserved. However, tabs and newlines are converted to spaces.\n", + "\n", + "To ensure this **lossless tokenization**, SentencePiece replaces white space with _ (U+2581), so that a simple join of the tokens, replacing underscores with spaces, can restore the white space, even if there are consecutive spaces. But remember first to normalize and then replace spaces with _ (U+2581).
+ ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [], + "source": [ + "s = 'Tokenization is hard.'\n", + "sn = normalize('NFKC', s)\n", + "sn_ = sn.replace(' ', '\\u2581')" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0x54 0x6f 0x6b 0x65 0x6e 0x69 0x7a 0x61 0x74 0x69 0x6f 0x6e 0x20 0x69 0x73 0x20 0x68 0x61 0x72 0x64 0x2e\n", + "0x54 0x6f 0x6b 0x65 0x6e 0x69 0x7a 0x61 0x74 0x69 0x6f 0x6e 0x20 0x69 0x73 0x20 0x68 0x61 0x72 0x64 0x2e\n", + "0x54 0x6f 0x6b 0x65 0x6e 0x69 0x7a 0x61 0x74 0x69 0x6f 0x6e 0x2581 0x69 0x73 0x2581 0x68 0x61 0x72 0x64 0x2e\n" + ] + } + ], + "source": [ + "print(get_hex_encoding(s))\n", + "print(get_hex_encoding(sn))\n", + "print(get_hex_encoding(sn_))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## BPE Algorithm\n", + "\n", + "After discussing the preprocessing that SentencePiece performs, you will get the data, preprocess it, and apply the BPE algorithm. You will see how this reproduces the tokenization produced by training SentencePiece on the example dataset (from this week's assignment).\n", + "\n", + "### Preparing our Data\n", + "First, you get the Squad data and process it as above." 
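As a quick check of the round trip described above — normalize first, then mark every space — a small sketch in plain Python, independent of the SentencePiece library itself (the helper names are ours):

```python
from unicodedata import normalize

def encode_whitespace(s):
    # Normalize first, then mark every space with U+2581 so the original
    # whitespace (even consecutive spaces) can be restored exactly.
    return normalize('NFKC', s).replace(' ', '\u2581')

def decode_whitespace(s):
    return s.replace('\u2581', ' ')

s = 'Tokenization  is hard.'  # note the double space
print(encode_whitespace(s))
print(decode_whitespace(encode_whitespace(s)) == normalize('NFKC', s))
```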
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import ast\n", + "\n", + "def convert_json_examples_to_text(filepath):\n", + " example_jsons = list(map(ast.literal_eval, open(filepath))) # Read in the json from the example file\n", + " texts = [example_json['text'].decode('utf-8') for example_json in example_jsons] # Decode the byte sequences\n", + " text = '\\n\\n'.join(texts) # Separate different articles by two newlines\n", + " text = normalize('NFKC', text) # Normalize the text\n", + "\n", + " with open('example.txt', 'w') as fw:\n", + " fw.write(text)\n", + " \n", + " return text" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "text = convert_json_examples_to_text('./data/data.txt')\n", + "print(text[:900])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the algorithm the `vocab` variable is actually a frequency dictionary of the words. Those words have been prepended with an *underscore* to indicate that they are the beginning of a word. Finally, the characters have been delimited by spaces so that the BPE algorithm can group the most common characters together in the dictionary in a greedy fashion. You will see how that is done shortly." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from collections import Counter\n", + "\n", + "vocab = Counter(['\\u2581' + word for word in text.split()])\n", + "vocab = {' '.join([l for l in word]): freq for word, freq in vocab.items()}" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def show_vocab(vocab, end='\\n', limit=20):\n", + " \"\"\"Show word frequencies in vocab up to the limit number of words\"\"\"\n", + " shown = 0\n", + " for word, freq in vocab.items():\n", + " print(f'{word}: {freq}', end=end)\n", + " shown += 1\n", + " if shown > limit:\n", + " break" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "show_vocab(vocab)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You check the size of the vocabulary (the frequency dictionary) because the number of merges is the crucial hyperparameter for BPE: it determines how far words are broken up into sentence pieces. It turns out that, for the model trained on this small dataset, 60% of the 455 merges of the most frequent character pairs are needed to reproduce the tokenization of the model trained with a 32K `vocab_size` upper limit over the entire corpus of examples." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(f'Total number of unique words: {len(vocab)}')\n", + "print(f'Number of merges required to reproduce SentencePiece training on the whole corpus: {int(0.60*len(vocab))}')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### BPE Algorithm\n", + "Directly from the BPE paper you have the following algorithm.
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import re, collections\n", + "\n", + "def get_stats(vocab):\n", + " pairs = collections.defaultdict(int)\n", + " for word, freq in vocab.items():\n", + " symbols = word.split()\n", + " for i in range(len(symbols) - 1):\n", + " pairs[symbols[i], symbols[i+1]] += freq\n", + " return pairs\n", + "\n", + "def merge_vocab(pair, v_in):\n", + " v_out = {}\n", + " bigram = re.escape(' '.join(pair))\n", + " p = re.compile(r'(? id\n", + "print(sp.encode_as_pieces(s0))\n", + "print(sp.encode_as_ids(s0))\n", + "\n", + "# decode: id => text\n", + "print(sp.decode_pieces(sp.encode_as_pieces(s0)))\n", + "print(sp.decode_ids([12847, 277]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice how SentencePiece breaks the words into seemingly odd parts, but you have seen something similar with BPE. But how close was the model trained on the whole corpus of examples with a `vocab_size` of 32,000 instead of 455? Here you can also test what happens to white space, like '\\n'. \n", + "\n", + "But first note that SentencePiece encodes the SentencePieces, the tokens, and has reserved some of the ids as can be seen in this week's assignment." 
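The `merge_vocab` code above was mangled in this copy (its regex is cut off mid-pattern). For reference, a self-contained sketch following the pseudocode in the BPE paper, run on a toy vocabulary of our own rather than the assignment data:

```python
import re
import collections

def get_stats(vocab):
    # Count frequencies of adjacent symbol pairs across the vocabulary.
    pairs = collections.defaultdict(int)
    for word, freq in vocab.items():
        symbols = word.split()
        for i in range(len(symbols) - 1):
            pairs[symbols[i], symbols[i + 1]] += freq
    return pairs

def merge_vocab(pair, v_in):
    # Merge every standalone occurrence of `pair` into a single symbol.
    v_out = {}
    bigram = re.escape(' '.join(pair))
    p = re.compile(r'(?<!\S)' + bigram + r'(?!\S)')
    for word in v_in:
        v_out[p.sub(''.join(pair), word)] = v_in[word]
    return v_out

# Toy vocabulary in the same "space-delimited symbols" form used above
vocab = {'\u2581 l o w': 5, '\u2581 l o w e r': 2,
         '\u2581 n e w e s t': 6, '\u2581 w i d e s t': 3}

for step in range(3):
    pairs = get_stats(vocab)
    best = max(pairs, key=pairs.get)
    vocab = merge_vocab(best, vocab)
    print(step, best, vocab)
```

The greedy loop merges `('e', 's')`, then `('es', 't')`, then the word-initial marker pair, exactly the kind of progression the frequency dictionary above sets up.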
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "uid = 15068\n", + "spiece = \"\\u2581BBQ\"\n", + "unknown = \"__MUST_BE_UNKNOWN__\"\n", + "\n", + "# id <=> piece conversion\n", + "print(f'SentencePiece for ID {uid}: {sp.id_to_piece(uid)}')\n", + "print(f'ID for Sentence Piece {spiece}: {sp.piece_to_id(spiece)}')\n", + "\n", + "# returns 0 for unknown tokens (we can change the id for UNK)\n", + "print(f'ID for unknown text {unknown}: {sp.piece_to_id(unknown)}')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(f'Beginning of sentence id: {sp.bos_id()}')\n", + "print(f'Pad id: {sp.pad_id()}')\n", + "print(f'End of sentence id: {sp.eos_id()}')\n", + "print(f'Unknown id: {sp.unk_id()}')\n", + "print(f'Vocab size: {sp.vocab_size()}')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can also check what the ids are for the first and last parts of the vocabulary." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print('\\nId\\tSentP\\tControl?')\n", + "print('------------------------')\n", + "# <unk>, <s>, </s> are defined by default. Their ids are (0, 1, 2)\n", + "# <s> and </s> are defined as 'control' symbol.\n", + "for uid in range(10):\n", + " print(uid, sp.id_to_piece(uid), sp.is_control(uid), sep='\\t')\n", + " \n", + "# for uid in range(sp.vocab_size()-10,sp.vocab_size()):\n", + "# print(uid, sp.id_to_piece(uid), sp.is_control(uid), sep='\\t')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Train SentencePiece BPE model with our example.txt" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, train your own BPE model directly from the SentencePiece library and compare it to the results of the implementation of the algorithm from the BPE paper itself.
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "spm.SentencePieceTrainer.train('--input=example.txt --model_prefix=example_bpe --vocab_size=450 --model_type=bpe')\n", + "sp_bpe = spm.SentencePieceProcessor()\n", + "sp_bpe.load('example_bpe.model')\n", + "\n", + "print('*** BPE ***')\n", + "print(sp_bpe.encode_as_pieces(s0))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "show_vocab(sp_vocab, end = ', ')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The implementation of BPE's code from the paper matches up pretty well with the library itself! The differences are probably accounted for by the `vocab_size`. There is also another technical difference in that in the SentencePiece implementation of BPE a priority queue is used to more efficiently keep track of the *best pairs*. Actually, there is a priority queue in the Python standard library called `heapq` if you would like to give that a try below! 
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Optionally try to implement BPE using a priority queue below" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from heapq import heappush, heappop" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def heapsort(iterable):\n", + " h = []\n", + " for value in iterable:\n", + " heappush(h, value)\n", + " return [heappop(h) for i in range(len(h))]" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "a = [1,4,3,1,3,2,1,4,2]\n", + "heapsort(a)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For a more extensive example consider looking at the [SentencePiece repo](https://github.com/google/sentencepiece/blob/master/python/sentencepiece_python_module_example.ipynb). The last few sections of this code were repurposed from that tutorial. Thanks for your participation! Next stop BERT and T5!" 
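Following up on that optional exercise, one place a priority queue could slot in is best-pair selection: push negated counts so Python's min-heap behaves as a max-heap. This is a sketch of the idea only; a full implementation would maintain the heap incrementally across merges rather than rebuilding it each step:

```python
import heapq

def best_pair(pair_counts):
    # Negate the counts so the smallest heap element is the most frequent
    # pair; ties are broken lexicographically by the pair itself.
    heap = [(-freq, pair) for pair, freq in pair_counts.items()]
    heapq.heapify(heap)
    return heapq.heappop(heap)[1]

print(best_pair({('e', 's'): 9, ('s', 't'): 9, ('l', 'o'): 7}))
```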
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/C4W3_SentencePiece_and_BPE.ipynb b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/C4W3_SentencePiece_and_BPE.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..bb1d6c08a82829b0d01521d4d516266b86c6d9c9 --- /dev/null +++ b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/C4W3_SentencePiece_and_BPE.ipynb @@ -0,0 +1,724 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# SentencePiece and BPE " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Introduction to Tokenization" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In order to process text in neural network models it is first required to **encode** text as numbers with ids, since the tensor operations act on numbers. Finally, if the output of the network is to be words, it is required to **decode** the predicted tokens ids back to text.\n", + "\n", + "To encode text, the first decision that has to be made is to what level of granularity are we going to consider the text? Because ultimately, from these **tokens**, features are going to be created about them. Many different experiments have been carried out using *words*, *morphological units*, *phonemic units* or *characters* as tokens. For example, \n", + "\n", + "- Tokens are tricky. 
(raw text)\n", + "- Tokens are tricky . ([words](https://arxiv.org/pdf/1301.3781))\n", + "- Token s _ are _ trick _ y . ([morphemes](https://arxiv.org/pdf/1907.02423.pdf))\n", + "- t oʊ k ə n z _ ɑː _ ˈt r ɪ k i. ([phonemes](https://www.aclweb.org/anthology/W18-5812.pdf), for STT)\n", + "- T o k e n s _ a r e _ t r i c k y . ([character](https://www.aclweb.org/anthology/C18-1139/))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "But how to identify these units, such as words, is largely determined by the language they come from. For example, in many European languages a space is used to separate words, while in some Asian languages there are no spaces between words. Compare English and Mandarin.\n", + "\n", + "- Tokens are tricky. (original sentence)\n", + "- 标记很棘手 (Mandarin)\n", + "- Biāojì hěn jíshǒu (pinyin)\n", + "- 标记 很 棘手 (Mandarin with spaces)\n", + "\n", + "\n", + "So, the ability to **tokenize**, i.e. split text into meaningful fundamental units, is not always straightforward.\n", + "\n", + "Also, there are practical issues of how large our *vocabulary* of words, `vocab_size`, should be, considering memory limitations vs. coverage. A compromise may need to be made between: \n", + "* the finest-grained models employing characters which can be memory intensive and \n", + "* more computationally efficient *subword* units such as [n-grams](https://arxiv.org/pdf/1712.09405) or larger units.\n", + "\n", + "In [SentencePiece](https://www.aclweb.org/anthology/D18-2012.pdf) unicode characters are grouped together using either a [unigram language model](https://www.aclweb.org/anthology/P18-1007.pdf) (used in this week's assignment) or [BPE](https://arxiv.org/pdf/1508.07909.pdf), **byte-pair encoding**. We will discuss BPE, since BERT and many of its variants use a modified version of BPE and its pseudocode is easy to implement and understand... hopefully!"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## SentencePiece Preprocessing\n", + "### NFKC Normalization" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Unsurprisingly, even using unicode to initially tokenize text can be ambiguous, e.g., " + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "é = é : False\n" + ] + } + ], + "source": [ + "eaccent = '\\u00E9'\n", + "e_accent = '\\u0065\\u0301'\n", + "print(f'{eaccent} = {e_accent} : {eaccent == e_accent}')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "SentencePiece uses the Unicode standard normalization form, [NFKC](https://en.wikipedia.org/wiki/Unicode_equivalence), so this isn't an issue. Looking at the example from above but with normalization:" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "é = é : True\n" + ] + } + ], + "source": [ + "from unicodedata import normalize\n", + "\n", + "norm_eaccent = normalize('NFKC', '\\u00E9')\n", + "norm_e_accent = normalize('NFKC', '\\u0065\\u0301')\n", + "print(f'{norm_eaccent} = {norm_e_accent} : {norm_eaccent == norm_e_accent}')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Normalization has actually changed the unicode code point (unicode unique id) for one of these two characters." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [], + "source": [ + "def get_hex_encoding(s):\n", + " return ' '.join(hex(ord(c)) for c in s)\n", + "\n", + "def print_string_and_encoding(s):\n", + " print(f'{s} : {get_hex_encoding(s)}') " + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "é : 0xe9\n", + "é : 0x65 0x301\n", + "é : 0xe9\n", + "é : 0xe9\n" + ] + } + ], + "source": [ + "for s in [eaccent, e_accent, norm_eaccent, norm_e_accent]:\n", + " print_string_and_encoding(s)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This normalization has other side effects which may be considered useful, such as converting curly quotes “ to their ASCII equivalent \". (Although we now lose the directionality of the quote...)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Lossless Tokenization\n", + "\n", + "SentencePiece also ensures that when you tokenize and then detokenize your data, the original position of white space is preserved. However, tabs and newlines are converted to spaces.\n", + "\n", + "To ensure this **lossless tokenization**, SentencePiece replaces white space with _ (U+2581), so that a simple join of the tokens, replacing underscores with spaces, can restore the white space, even if there are consecutive spaces. But remember first to normalize and then replace spaces with _ (U+2581).
+ ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "s = 'Tokenization is hard.'\n", + "sn = normalize('NFKC', s)\n", + "sn_ = sn.replace(' ', '\\u2581')" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "0x54 0x6f 0x6b 0x65 0x6e 0x69 0x7a 0x61 0x74 0x69 0x6f 0x6e 0x20 0x69 0x73 0x20 0x68 0x61 0x72 0x64 0x2e\n", + "0x54 0x6f 0x6b 0x65 0x6e 0x69 0x7a 0x61 0x74 0x69 0x6f 0x6e 0x20 0x69 0x73 0x20 0x68 0x61 0x72 0x64 0x2e\n", + "0x54 0x6f 0x6b 0x65 0x6e 0x69 0x7a 0x61 0x74 0x69 0x6f 0x6e 0x2581 0x69 0x73 0x2581 0x68 0x61 0x72 0x64 0x2e\n" + ] + } + ], + "source": [ + "print(get_hex_encoding(s))\n", + "print(get_hex_encoding(sn))\n", + "print(get_hex_encoding(sn_))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## BPE Algorithm\n", + "\n", + "Having seen the preprocessing that SentencePiece performs, you will now get the data, preprocess it, and apply the BPE algorithm. You will see how this reproduces the tokenization produced by training SentencePiece on the example dataset (from this week's assignment).\n", + "\n", + "### Preparing our Data\n", + "First, get the example data and process it as above."
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [], + "source": [ + "import ast\n", + "\n", + "def convert_json_examples_to_text(filepath):\n", + " example_jsons = list(map(ast.literal_eval, open(filepath))) # Read in the json from the example file\n", + " texts = [example_json['text'].decode('utf-8') for example_json in example_jsons] # Decode the byte sequences\n", + " text = '\\n\\n'.join(texts) # Separate different articles by two newlines\n", + " text = normalize('NFKC', text) # Normalize the text\n", + "\n", + " with open('example.txt', 'w') as fw:\n", + " fw.write(text)\n", + " \n", + " return text" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Beginners BBQ Class Taking Place in Missoula!\n", + "Do you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. He will be teaching a beginner level class for everyone who wants to get better with their culinary skills.\n", + "He will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information.\n", + "The cost to be in the class is $35 per person, and for spectators it is free. 
Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared.\n", + "\n", + "Discussion in 'Mac OS X Lion (10.7)' started by axboi87, Jan 20, 2012.\n", + "I've got a 500gb internal drive and a 240gb SSD.\n", + "When trying to restore using di\n" + ] + } + ], + "source": [ + "text = convert_json_examples_to_text('./data/data.txt')\n", + "print(text[:900])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the algorithm, the `vocab` variable is actually a frequency dictionary of the words. Each word has been prepended with the ▁ (U+2581) marker to indicate that it is the beginning of a word. Finally, the characters have been delimited by spaces so that the BPE algorithm can greedily group the most common adjacent characters together. You will see how that is done shortly." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [], + "source": [ + "from collections import Counter\n", + "\n", + "vocab = Counter(['\\u2581' + word for word in text.split()])\n", + "vocab = {' '.join([l for l in word]): freq for word, freq in vocab.items()}" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [], + "source": [ + "def show_vocab(vocab, end='\\n', limit=20):\n", + " \"\"\"Show word frequencies in vocab, up to limit words\"\"\"\n", + " shown = 0\n", + " for word, freq in vocab.items():\n", + " print(f'{word}: {freq}', end=end)\n", + " shown += 1\n", + " if shown > limit:\n", + " break" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "▁ B e g i n n e r s: 1\n", + "▁ B B Q: 3\n", + "▁ C l a s s: 2\n", + "▁ T a k i n g: 1\n", + "▁ P l a c e: 1\n", + "▁ i n: 15\n", + "▁ M i s s o u l a !: 1\n", + "▁ D o: 1\n", + "▁ y o u: 13\n", + "▁ w a n t: 1\n", + "▁ t o: 33\n", + "▁ g e t: 2\n",
+ "▁ b e t t e r: 2\n", + "▁ a t: 1\n", + "▁ m a k i n g: 2\n", + "▁ d e l i c i o u s: 1\n", + "▁ B B Q ?: 1\n", + "▁ Y o u: 1\n", + "▁ w i l l: 6\n", + "▁ h a v e: 4\n", + "▁ t h e: 31\n" + ] + } + ], + "source": [ + "show_vocab(vocab)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Check the size of the vocabulary (the frequency dictionary), because the number of merges is the one hyperparameter BPE crucially depends on: it determines how far words are broken up into sentence pieces. It turns out that, for this small dataset, performing a number of merges equal to 60% of the 455 unique words (273 merges) reproduces the tokenization that training SentencePiece with an upper limit of a 32K `vocab_size` produces over the entire corpus of examples." + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Total number of unique words: 455\n", + "Number of merges required to reproduce SentencePiece training on the whole corpus: 273\n" + ] + } + ], + "source": [ + "print(f'Total number of unique words: {len(vocab)}')\n", + "print(f'Number of merges required to reproduce SentencePiece training on the whole corpus: {int(0.60*len(vocab))}')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### BPE Algorithm\n", + "Directly from the BPE paper, you have the following algorithm. " + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [], + "source": [ + "import re, collections\n", + "\n", + "def get_stats(vocab):\n", + " pairs = collections.defaultdict(int)\n", + " for word, freq in vocab.items():\n", + " symbols = word.split()\n", + " for i in range(len(symbols) - 1):\n", + " pairs[symbols[i], symbols[i+1]] += freq\n", + " return pairs\n", + "\n", + "def merge_vocab(pair, v_in):\n", + " v_out = {}\n", + " bigram = re.escape(' '.join(pair))\n", + " p = re.compile(r'(?
id\n", + "print(sp.encode_as_pieces(s0))\n", + "print(sp.encode_as_ids(s0))\n", + "\n", + "# decode: id => text\n", + "print(sp.decode_pieces(sp.encode_as_pieces(s0)))\n", + "print(sp.decode_ids([12847, 277]))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Notice how SentencePiece breaks the words into seemingly odd parts; you have seen something similar with BPE. But how close does this come to the model trained on the whole corpus of examples with a `vocab_size` of 32,000 instead of 455? Here you can also test what happens to white space, like '\n'. \n", + "\n", + "First, though, note that SentencePiece encodes the sentence pieces (the tokens) as ids, and has reserved some of those ids, as you saw in this week's assignment." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "uid = 15068\n", + "spiece = \"\\u2581BBQ\"\n", + "unknown = \"__MUST_BE_UNKNOWN__\"\n", + "\n", + "# id <=> piece conversion\n", + "print(f'SentencePiece for ID {uid}: {sp.id_to_piece(uid)}')\n", + "print(f'ID for SentencePiece {spiece}: {sp.piece_to_id(spiece)}')\n", + "\n", + "# returns 0 for unknown tokens (we can change the id for UNK)\n", + "print(f'ID for unknown text {unknown}: {sp.piece_to_id(unknown)}')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print(f'Beginning of sentence id: {sp.bos_id()}')\n", + "print(f'Pad id: {sp.pad_id()}')\n", + "print(f'End of sentence id: {sp.eos_id()}')\n", + "print(f'Unknown id: {sp.unk_id()}')\n", + "print(f'Vocab size: {sp.vocab_size()}')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can also check the ids at the beginning and end of the vocabulary."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print('\nId\tSentP\tControl?')\n", + "print('------------------------')\n", + "# <unk>, <s>, </s> are defined by default. Their ids are (0, 1, 2)\n", + "# <s> and </s> are defined as 'control' symbols.\n", + "for uid in range(10):\n", + " print(uid, sp.id_to_piece(uid), sp.is_control(uid), sep='\t')\n", + " \n", + "# for uid in range(sp.vocab_size()-10,sp.vocab_size()):\n", + "# print(uid, sp.id_to_piece(uid), sp.is_control(uid), sep='\t')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Train SentencePiece BPE model with our example.txt" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, train your own BPE model directly with the SentencePiece library and compare it to the results of the implementation of the algorithm from the BPE paper itself." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "spm.SentencePieceTrainer.train('--input=example.txt --model_prefix=example_bpe --vocab_size=450 --model_type=bpe')\n", + "sp_bpe = spm.SentencePieceProcessor()\n", + "sp_bpe.load('example_bpe.model')\n", + "\n", + "print('*** BPE ***')\n", + "print(sp_bpe.encode_as_pieces(s0))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "show_vocab(sp_vocab, end=', ')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The implementation of the algorithm from the BPE paper matches up pretty well with the library itself! The differences are probably accounted for by the `vocab_size`. There is also another technical difference: the SentencePiece implementation of BPE uses a priority queue to keep track of the *best pairs* more efficiently. The Python standard library has a priority queue implementation called `heapq` if you would like to give that a try below! 
" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Optionally try to implement BPE using a priority queue below" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from heapq import heappush, heappop" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def heapsort(iterable):\n", + " h = []\n", + " for value in iterable:\n", + " heappush(h, value)\n", + " return [heappop(h) for i in range(len(h))]" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "a = [1,4,3,1,3,2,1,4,2]\n", + "heapsort(a)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For a more extensive example consider looking at the [SentencePiece repo](https://github.com/google/sentencepiece/blob/master/python/sentencepiece_python_module_example.ipynb). The last few sections of this code were repurposed from that tutorial. Thanks for your participation! Next stop BERT and T5!" 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/data.txt b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/data.txt new file mode 100644 index 0000000000000000000000000000000000000000..2cd6069cceed2c351bd305aff5d7902af9a25058 --- /dev/null +++ b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/data.txt @@ -0,0 +1,5 @@ +{'content-length': b'1970', 'content-type': b'text/plain', 'text': b'Beginners BBQ Class Taking Place in Missoula!\nDo you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. He will be teaching a beginner level class for everyone who wants to get better with their culinary skills.\nHe will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information.\nThe cost to be in the class is $35 per person, and for spectators it is free. 
Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared.', 'timestamp': b'2019-04-25T12:57:54Z', 'url': b'https://klyq.com/beginners-bbq-class-taking-place-in-missoula/'} +{'content-length': b'12064', 'content-type': b'text/plain', 'text': b'Discussion in \'Mac OS X Lion (10.7)\' started by axboi87, Jan 20, 2012.\nI\'ve got a 500gb internal drive and a 240gb SSD.\nWhen trying to restore using disk utility i\'m given the error "Not enough space on disk ____ to restore"\nBut I shouldn\'t have to do that!!!\nAny ideas or workarounds before resorting to the above?\nUse Carbon Copy Cloner to copy one drive to the other. I\'ve done this several times going from larger HDD to smaller SSD and I wound up with a bootable SSD drive. One step you have to remember not to skip is to use Disk Utility to partition the SSD as GUID partition scheme HFS+ before doing the clone. If it came Apple Partition Scheme, even if you let CCC do the clone, the resulting drive won\'t be bootable. CCC usually works in "file mode" and it can easily copy a larger drive (that\'s mostly empty) onto a smaller drive. If you tell CCC to clone a drive you did NOT boot from, it can work in block copy mode where the destination drive must be the same size or larger than the drive you are cloning from (if I recall).\nI\'ve actually done this somehow on Disk Utility several times (booting from a different drive (or even the dvd) so not running disk utility from the drive your cloning) and had it work just fine from larger to smaller bootable clone. Definitely format the drive cloning to first, as bootable Apple etc..\nThanks for pointing this out. 
My only experience using DU to go larger to smaller was when I was trying to make a Lion install stick and I was unable to restore InstallESD.dmg to a 4 GB USB stick but of course the reason that wouldn\'t fit is there was slightly more than 4 GB of data.', 'timestamp': b'2019-04-21T10:07:13Z', 'url':b'https://forums.macrumors.com/threads/restore-from-larger-disk-to-smaller-disk.1311329/'} +{'content-length': b'5235', 'content-type': b'text/plain', 'text': b'Foil plaid lycra and spandex shortall with metallic slinky insets. Attached metallic elastic belt with O-ring. Headband included. Great hip hop or jazz dance costume. Made in the USA.', 'timestamp': b'2019-04-25T10:40:23Z', 'url': b'https://awishcometrue.com/Catalogs/Clearance/Tweens/V1960-Find-A-Way'} +{'content-length': b'4967', 'content-type': b'text/plain', 'text': b"How many backlinks per day for new site?\nDiscussion in 'Black Hat SEO' started by Omoplata, Dec 3, 2010.\n1) for a newly created site, what's the max # backlinks per day I should do to be safe?\n2) how long do I have to let my site age before I can start making more blinks?\nI did about 6000 forum profiles every 24 hours for 10 days for one of my sites which had a brand new domain.\nThere is three backlinks for every of these forum profile so thats 18 000 backlinks every 24 hours and nothing happened in terms of being penalized or sandboxed. This is now maybe 3 months ago and the site is ranking on first page for a lot of my targeted keywords.\nbuild more you can in starting but do manual submission and not spammy type means manual + relevant to the post.. then after 1 month you can make a big blast..\nWow, dude, you built 18k backlinks a day on a brand new site? How quickly did you rank up? 
What kind of competition/searches did those keywords have?", 'timestamp': b'2019-04-21T12:46:19Z', 'url': b'https://www.blackhatworld.com/seo/how-many-backlinks-per-day-for-new-site.258615/'} +{'content-length': b'4499', 'content-type': b'text/plain', 'text': b'The Denver Board of Education opened the 2017-18 school year with an update on projects that include new construction, upgrades, heat mitigation and quality learning environments.\nWe are excited that Denver students will be the beneficiaries of a four year, $572 million General Obligation Bond. Since the passage of the bond, our construction team has worked to schedule the projects over the four-year term of the bond.\nDenver voters on Tuesday approved bond and mill funding measures for students in Denver Public Schools, agreeing to invest $572 million in bond funding to build and improve schools and $56.6 million in operating dollars to support proven initiatives, such as early literacy.\nDenver voters say yes to bond and mill levy funding support for DPS students and schools. Click to learn more about the details of the voter-approved bond measure.\nDenver voters on Nov. 8 approved bond and mill funding measures for DPS students and schools. Learn more about what\xe2\x80\x99s included in the mill levy measure.', 'timestamp': b'2019-04-20T14:33:21Z', 'url': b'http://bond.dpsk12.org/category/news/'} \ No newline at end of file diff --git a/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/example.txt b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/example.txt new file mode 100644 index 0000000000000000000000000000000000000000..00bd6e2e4d3ff1cd7907f4a90f2e6ec4414600d3 --- /dev/null +++ b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/example.txt @@ -0,0 +1,30 @@ +Beginners BBQ Class Taking Place in Missoula! +Do you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. 
Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. He will be teaching a beginner level class for everyone who wants to get better with their culinary skills. +He will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information. +The cost to be in the class is $35 per person, and for spectators it is free. Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared. + +Discussion in 'Mac OS X Lion (10.7)' started by axboi87, Jan 20, 2012. +I've got a 500gb internal drive and a 240gb SSD. +When trying to restore using disk utility i'm given the error "Not enough space on disk ____ to restore" +But I shouldn't have to do that!!! +Any ideas or workarounds before resorting to the above? +Use Carbon Copy Cloner to copy one drive to the other. I've done this several times going from larger HDD to smaller SSD and I wound up with a bootable SSD drive. One step you have to remember not to skip is to use Disk Utility to partition the SSD as GUID partition scheme HFS+ before doing the clone. If it came Apple Partition Scheme, even if you let CCC do the clone, the resulting drive won't be bootable. CCC usually works in "file mode" and it can easily copy a larger drive (that's mostly empty) onto a smaller drive. If you tell CCC to clone a drive you did NOT boot from, it can work in block copy mode where the destination drive must be the same size or larger than the drive you are cloning from (if I recall). +I've actually done this somehow on Disk Utility several times (booting from a different drive (or even the dvd) so not running disk utility from the drive your cloning) and had it work just fine from larger to smaller bootable clone. Definitely format the drive cloning to first, as bootable Apple etc.. +Thanks for pointing this out. 
My only experience using DU to go larger to smaller was when I was trying to make a Lion install stick and I was unable to restore InstallESD.dmg to a 4 GB USB stick but of course the reason that wouldn't fit is there was slightly more than 4 GB of data. + +Foil plaid lycra and spandex shortall with metallic slinky insets. Attached metallic elastic belt with O-ring. Headband included. Great hip hop or jazz dance costume. Made in the USA. + +How many backlinks per day for new site? +Discussion in 'Black Hat SEO' started by Omoplata, Dec 3, 2010. +1) for a newly created site, what's the max # backlinks per day I should do to be safe? +2) how long do I have to let my site age before I can start making more blinks? +I did about 6000 forum profiles every 24 hours for 10 days for one of my sites which had a brand new domain. +There is three backlinks for every of these forum profile so thats 18 000 backlinks every 24 hours and nothing happened in terms of being penalized or sandboxed. This is now maybe 3 months ago and the site is ranking on first page for a lot of my targeted keywords. +build more you can in starting but do manual submission and not spammy type means manual + relevant to the post.. then after 1 month you can make a big blast.. +Wow, dude, you built 18k backlinks a day on a brand new site? How quickly did you rank up? What kind of competition/searches did those keywords have? + +The Denver Board of Education opened the 2017-18 school year with an update on projects that include new construction, upgrades, heat mitigation and quality learning environments. +We are excited that Denver students will be the beneficiaries of a four year, $572 million General Obligation Bond. Since the passage of the bond, our construction team has worked to schedule the projects over the four-year term of the bond. 
+Denver voters on Tuesday approved bond and mill funding measures for students in Denver Public Schools, agreeing to invest $572 million in bond funding to build and improve schools and $56.6 million in operating dollars to support proven initiatives, such as early literacy. +Denver voters say yes to bond and mill levy funding support for DPS students and schools. Click to learn more about the details of the voter-approved bond measure. +Denver voters on Nov. 8 approved bond and mill funding measures for DPS students and schools. Learn more about what’s included in the mill levy measure. \ No newline at end of file diff --git a/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/example_bpe.model b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/example_bpe.model new file mode 100644 index 0000000000000000000000000000000000000000..ad2aafb2227b8f1a34fda89743beabaf749dca41 --- /dev/null +++ b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/example_bpe.model @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:72964ffad404e80f7c7450e333730318ecb18b36c94cd8191b23b4461283610b +size 243359 diff --git a/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/example_bpe.vocab b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/example_bpe.vocab new file mode 100644 index 0000000000000000000000000000000000000000..71eb077172eff0dbde356024589189b2825997c7 --- /dev/null +++ b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/example_bpe.vocab @@ -0,0 +1,450 @@ + 0 + 0 + 0 +▁t -0 +in -1 +on -2 +▁a -3 +er -4 +▁s -5 +▁th -6 +or -7 +▁b -8 +▁d -9 +▁m -10 +it -11 +nd -12 +ou -13 +▁f -14 +ing -15 +▁the -16 +▁to -17 +ve -18 +▁w -19 +ar -20 +▁c -21 +at -22 +ll -23 +▁in -24 +re -25 +en -26 +is -27 +le -28 +st -29 +ion -30 +▁and -31 +an -32 +▁p -33 +ot -34 +▁y -35 +as -36 +ed -37 +▁o -38 +ch -39 +ro -40 +▁D -41 +▁I -42 +▁e -43 +▁be -44 +▁h -45 +▁for -46 +▁you -47 +ill -48 +ive -49 
+ver -50 +▁of -51 +▁n -52 +all -53 +▁dr -54 +▁on -55 +▁drive -56 +ck -57 +es -58 +▁u -59 +ore -60 +▁st -61 +et -62 +il -63 +ud -64 +▁C -65 +▁S -66 +▁re -67 +al -68 +ay -69 +pp -70 +▁2 -71 +▁B -72 +▁T -73 +▁l -74 +lin -75 +▁cl -76 +▁co -77 +ks -78 +me -79 +ow -80 +ts -81 +▁H -82 +ond -83 +one -84 +▁do -85 +▁ha -86 +▁is -87 +ly -88 +mp -89 +art -90 +rom -91 +▁le -92 +▁me -93 +▁bond -94 +▁from -95 +▁mill -96 +ic -97 +id -98 +la -99 +se -100 +▁g -101 +arg -102 +ers -103 +ite -104 +ith -105 +ity -106 +oot -107 +our -108 +▁Th -109 +▁ne -110 +▁wh -111 +▁Den -112 +▁sch -113 +links -114 +▁that -115 +▁will -116 +▁Denver -117 +SD -118 +ab -119 +ak -120 +ce -121 +cl -122 +ct -123 +ir -124 +ol -125 +▁( -126 +▁1 -127 +▁G -128 +▁O -129 +▁U -130 +▁W -131 +ack -132 +and -133 +ass -134 +isk -135 +ool -136 +ort -137 +▁bu -138 +▁it -139 +▁or -140 +▁sm -141 +▁te -142 +able -143 +clud -144 +ents -145 +rove -146 +very -147 +▁can -148 +▁new -149 +▁wor -150 +arger -151 +ation -152 +ition -153 +▁back -154 +▁boot -155 +▁have -156 +▁more -157 +▁site -158 +▁with -159 +▁every -160 +▁larger -161 +▁backlinks -162 +BQ -163 +ig -164 +ld -165 +py -166 +th -167 +▁$ -168 +▁A -169 +▁L -170 +▁k -171 +▁v -172 +ach -173 +asu -174 +ear -175 +ick -176 +out -177 +ter -178 +til -179 +und -180 +▁20 -181 +▁Cl -182 +▁ab -183 +▁sp -184 +▁su -185 +▁up -186 +ools -187 +▁BBQ -188 +▁SSD -189 +▁day -190 +▁did -191 +▁mak -192 +▁not -193 +▁pro -194 +▁vot -195 +▁was -196 +aller -197 +asure -198 +▁fund -199 +▁stud -200 +▁this -201 +▁work -202 +tility -203 +▁clone -204 +▁start -205 +▁includ -206 +▁funding -207 +▁measure -208 +▁smaller -209 +▁bootable -210 +▁students -211 +.. 
-212 +00 -213 +ad -214 +ec -215 +fi -216 +ge -217 +if -218 +im -219 +ip -220 +qu -221 +ru -222 +us -223 +▁M -224 +▁P -225 +▁j -226 +ere -227 +ree -228 +▁$5 -229 +▁24 -230 +▁CC -231 +▁He -232 +▁as -233 +▁mo -234 +▁my -235 +▁sa -236 +▁se -237 +▁sh -238 +▁so -239 +▁tr -240 +▁us -241 +file -242 +fore -243 +mpet -244 +ould -245 +sion -246 +▁201 -247 +▁CCC -248 +▁man -249 +▁per -250 +ction -251 +oning -252 +pport -253 +roved -254 +store -255 +▁buil -256 +▁copy -257 +▁cost -258 +▁disk -259 +▁about -260 +pproved -261 +▁before -262 +▁compet -263 +▁voters -264 +artition -265 +▁cloning -266 +▁million -267 +▁restore -268 +▁schools -269 +0. -270 +72 -271 +PS -272 +__ -273 +ac -274 +am -275 +bl -276 +bo -277 +de -278 +ds -279 +ef -280 +ep -281 +ey -282 +gb -283 +iz -284 +lt -285 +mb -286 +mo -287 +um -288 +ut -289 +vy -290 +▁" -291 +▁' -292 +▁3 -293 +▁4 -294 +▁N -295 +▁i -296 +▁r -297 +000 -298 +age -299 +ank -300 +ant -301 +arn -302 +ata -303 +cus -304 +day -305 +eme -306 +erm -307 +eyw -308 +gin -309 +ici -310 +jec -311 +oin -312 +per -313 +ual -314 +ust -315 +ven -316 +▁18 -317 +▁GB -318 +▁If -319 +▁In -320 +▁US -321 +▁Wh -322 +▁ag -323 +▁br -324 +▁by -325 +▁ca -326 +▁de -327 +▁en -328 +▁ex -329 +▁go -330 +▁qu -331 +▁sk -332 +ally -333 +ened -334 +ginn -335 +imes -336 +irst -337 +last -338 +mber -339 +onst -340 +onth -341 +ords -342 +ound -343 +ours -344 +pple -345 +reat -346 +tter -347 +ying -348 +▁DPS -349 +▁Dis -350 +▁How -351 +▁Sch -352 +▁The -353 +▁are -354 +▁but -355 +▁get -356 +▁had -357 +▁let -358 +▁met -359 +▁now -360 +▁one -361 +▁rec -362 +▁res -363 +allic -364 +jects -365 +ouldn -366 +stall -367 +ually -368 +veral -369 +▁$572 -370 +▁Disk -371 +▁Lion -372 +▁done -373 +▁ -374 +e -375 +o -376 +t -377 +n -378 +a -379 +i -380 +r -381 +s -382 +l -383 +d -384 +h -385 +u -386 +c -387 +m -388 +y -389 +p -390 +b -391 +f -392 +g -393 +v -394 +w -395 +k -396 +. -397 +, -398 +D -399 +S -400 +B -401 +C -402 +I -403 +0 -404 +' -405 +2 -406 +1 -407 +T -408 +? 
-409 +H -410 +) -411 +O -412 +U -413 +x -414 +( -415 +- -416 +4 -417 +5 -418 +7 -419 +8 -420 +A -421 +G -422 +P -423 +W -424 +j -425 +! -426 +" -427 +$ -428 +L -429 +M -430 +Q -431 +_ -432 +z -433 +3 -434 +6 -435 +E -436 +N -437 +q -438 ++ -439 +F -440 +# -441 +/ -442 +J -443 +K -444 +R -445 +X -446 diff --git a/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/sentencepiece.model b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/sentencepiece.model new file mode 100644 index 0000000000000000000000000000000000000000..317a5ccbde45300f5d1d970d4d449af2108b147e --- /dev/null +++ b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/data/sentencepiece.model @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:d60acb128cf7b7f2536e8f38a5b18a05535c9e14c7a355904270e15b0945ea86 +size 791656 diff --git a/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/example.txt b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/example.txt new file mode 100644 index 0000000000000000000000000000000000000000..00bd6e2e4d3ff1cd7907f4a90f2e6ec4414600d3 --- /dev/null +++ b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/example.txt @@ -0,0 +1,30 @@ +Beginners BBQ Class Taking Place in Missoula! +Do you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. He will be teaching a beginner level class for everyone who wants to get better with their culinary skills. +He will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information. +The cost to be in the class is $35 per person, and for spectators it is free. Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared. 
+ +Discussion in 'Mac OS X Lion (10.7)' started by axboi87, Jan 20, 2012. +I've got a 500gb internal drive and a 240gb SSD. +When trying to restore using disk utility i'm given the error "Not enough space on disk ____ to restore" +But I shouldn't have to do that!!! +Any ideas or workarounds before resorting to the above? +Use Carbon Copy Cloner to copy one drive to the other. I've done this several times going from larger HDD to smaller SSD and I wound up with a bootable SSD drive. One step you have to remember not to skip is to use Disk Utility to partition the SSD as GUID partition scheme HFS+ before doing the clone. If it came Apple Partition Scheme, even if you let CCC do the clone, the resulting drive won't be bootable. CCC usually works in "file mode" and it can easily copy a larger drive (that's mostly empty) onto a smaller drive. If you tell CCC to clone a drive you did NOT boot from, it can work in block copy mode where the destination drive must be the same size or larger than the drive you are cloning from (if I recall). +I've actually done this somehow on Disk Utility several times (booting from a different drive (or even the dvd) so not running disk utility from the drive your cloning) and had it work just fine from larger to smaller bootable clone. Definitely format the drive cloning to first, as bootable Apple etc.. +Thanks for pointing this out. My only experience using DU to go larger to smaller was when I was trying to make a Lion install stick and I was unable to restore InstallESD.dmg to a 4 GB USB stick but of course the reason that wouldn't fit is there was slightly more than 4 GB of data. + +Foil plaid lycra and spandex shortall with metallic slinky insets. Attached metallic elastic belt with O-ring. Headband included. Great hip hop or jazz dance costume. Made in the USA. + +How many backlinks per day for new site? +Discussion in 'Black Hat SEO' started by Omoplata, Dec 3, 2010. 
+1) for a newly created site, what's the max # backlinks per day I should do to be safe? +2) how long do I have to let my site age before I can start making more blinks? +I did about 6000 forum profiles every 24 hours for 10 days for one of my sites which had a brand new domain. +There is three backlinks for every of these forum profile so thats 18 000 backlinks every 24 hours and nothing happened in terms of being penalized or sandboxed. This is now maybe 3 months ago and the site is ranking on first page for a lot of my targeted keywords. +build more you can in starting but do manual submission and not spammy type means manual + relevant to the post.. then after 1 month you can make a big blast.. +Wow, dude, you built 18k backlinks a day on a brand new site? How quickly did you rank up? What kind of competition/searches did those keywords have? + +The Denver Board of Education opened the 2017-18 school year with an update on projects that include new construction, upgrades, heat mitigation and quality learning environments. +We are excited that Denver students will be the beneficiaries of a four year, $572 million General Obligation Bond. Since the passage of the bond, our construction team has worked to schedule the projects over the four-year term of the bond. +Denver voters on Tuesday approved bond and mill funding measures for students in Denver Public Schools, agreeing to invest $572 million in bond funding to build and improve schools and $56.6 million in operating dollars to support proven initiatives, such as early literacy. +Denver voters say yes to bond and mill levy funding support for DPS students and schools. Click to learn more about the details of the voter-approved bond measure. +Denver voters on Nov. 8 approved bond and mill funding measures for DPS students and schools. Learn more about what’s included in the mill levy measure. 
\ No newline at end of file diff --git a/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/example_bpe.model b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/example_bpe.model new file mode 100644 index 0000000000000000000000000000000000000000..8ee2062d39a3665cb0ec078737ed20d4d3c9a5db --- /dev/null +++ b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/example_bpe.model @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:e3c6c33b38015133abc0b477cfc51ef435a4741e2f907ae26760f61ce8ae85cb +size 243359 diff --git a/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/example_bpe.vocab b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/example_bpe.vocab new file mode 100644 index 0000000000000000000000000000000000000000..2e71429a32affdd50b643808ace4f1a46b8976f0 --- /dev/null +++ b/NLP with Attention Models/QA/BPE_algorithm/home/jovyan/work/example_bpe.vocab @@ -0,0 +1,450 @@ + 0 + 0 + 0 +▁t -0 +in -1 +on -2 +▁a -3 +er -4 +▁s -5 +▁th -6 +or -7 +▁b -8 +▁d -9 +▁m -10 +it -11 +nd -12 +ou -13 +▁f -14 +ing -15 +▁the -16 +▁to -17 +ve -18 +▁w -19 +ar -20 +▁c -21 +at -22 +ll -23 +▁in -24 +re -25 +en -26 +is -27 +le -28 +st -29 +ion -30 +▁and -31 +an -32 +▁p -33 +ot -34 +▁y -35 +as -36 +ed -37 +▁o -38 +ch -39 +ro -40 +▁D -41 +▁I -42 +▁e -43 +▁be -44 +▁h -45 +▁for -46 +▁you -47 +ill -48 +ive -49 +ver -50 +▁of -51 +▁n -52 +all -53 +▁dr -54 +▁on -55 +▁drive -56 +ck -57 +es -58 +▁u -59 +ore -60 +▁st -61 +et -62 +il -63 +ud -64 +▁C -65 +▁S -66 +▁re -67 +al -68 +ay -69 +pp -70 +▁2 -71 +▁B -72 +▁T -73 +▁l -74 +lin -75 +▁cl -76 +▁co -77 +ks -78 +me -79 +ow -80 +ts -81 +▁H -82 +ond -83 +one -84 +▁do -85 +▁ha -86 +▁is -87 +ly -88 +mp -89 +art -90 +rom -91 +▁le -92 +▁me -93 +▁bond -94 +▁from -95 +▁mill -96 +ic -97 +id -98 +la -99 +se -100 +▁g -101 +arg -102 +ers -103 +ite -104 +ith -105 +ity -106 +oot -107 +our -108 +▁Th -109 +▁ne -110 +▁wh -111 +▁Den -112 +▁sch -113 +links -114 +▁that -115 +▁will -116 +▁Denver -117 +00 
-118 +SD -119 +ab -120 +ak -121 +ce -122 +cl -123 +ct -124 +ir -125 +ol -126 +▁( -127 +▁1 -128 +▁G -129 +▁O -130 +▁U -131 +▁W -132 +ack -133 +and -134 +ass -135 +isk -136 +ool -137 +ort -138 +▁bu -139 +▁it -140 +▁or -141 +▁sm -142 +▁te -143 +able -144 +clud -145 +ents -146 +rove -147 +very -148 +▁can -149 +▁new -150 +▁wor -151 +arger -152 +ation -153 +ition -154 +▁back -155 +▁boot -156 +▁have -157 +▁more -158 +▁site -159 +▁with -160 +▁every -161 +▁larger -162 +▁backlinks -163 +BQ -164 +ig -165 +ld -166 +py -167 +th -168 +▁$ -169 +▁A -170 +▁L -171 +▁k -172 +▁v -173 +ach -174 +asu -175 +ear -176 +ick -177 +out -178 +ter -179 +til -180 +und -181 +▁20 -182 +▁Cl -183 +▁ab -184 +▁sp -185 +▁su -186 +▁up -187 +ools -188 +▁BBQ -189 +▁SSD -190 +▁day -191 +▁did -192 +▁mak -193 +▁not -194 +▁pro -195 +▁vot -196 +▁was -197 +aller -198 +asure -199 +▁fund -200 +▁stud -201 +▁this -202 +▁work -203 +tility -204 +▁clone -205 +▁start -206 +▁includ -207 +▁funding -208 +▁measure -209 +▁smaller -210 +▁bootable -211 +▁students -212 +.. -213 +CC -214 +__ -215 +ad -216 +ec -217 +fi -218 +ge -219 +if -220 +im -221 +ip -222 +qu -223 +ru -224 +us -225 +▁M -226 +▁P -227 +▁j -228 +ere -229 +ree -230 +▁$5 -231 +▁24 -232 +▁He -233 +▁as -234 +▁mo -235 +▁my -236 +▁sa -237 +▁se -238 +▁sh -239 +▁so -240 +▁tr -241 +▁us -242 +file -243 +fore -244 +mpet -245 +ould -246 +sion -247 +▁201 -248 +▁CCC -249 +▁man -250 +▁per -251 +ction -252 +oning -253 +pport -254 +roved -255 +store -256 +▁buil -257 +▁copy -258 +▁cost -259 +▁disk -260 +▁about -261 +pproved -262 +▁before -263 +▁compet -264 +▁voters -265 +artition -266 +▁cloning -267 +▁million -268 +▁restore -269 +▁schools -270 +!! -271 +0. 
-272 +72 -273 +PS -274 +ac -275 +am -276 +bl -277 +bo -278 +de -279 +ds -280 +ef -281 +ep -282 +ey -283 +gb -284 +iz -285 +lt -286 +mb -287 +mo -288 +um -289 +ut -290 +vy -291 +▁" -292 +▁' -293 +▁3 -294 +▁4 -295 +▁N -296 +▁i -297 +▁r -298 +000 -299 +age -300 +ank -301 +ant -302 +arn -303 +ata -304 +cus -305 +day -306 +eme -307 +erm -308 +eyw -309 +gin -310 +ici -311 +jec -312 +oin -313 +per -314 +ual -315 +ust -316 +ven -317 +▁18 -318 +▁GB -319 +▁If -320 +▁In -321 +▁US -322 +▁Wh -323 +▁ag -324 +▁br -325 +▁by -326 +▁ca -327 +▁de -328 +▁en -329 +▁ex -330 +▁go -331 +▁qu -332 +▁sk -333 +ally -334 +ened -335 +ginn -336 +imes -337 +irst -338 +last -339 +mber -340 +onst -341 +onth -342 +ords -343 +ound -344 +ours -345 +pple -346 +reat -347 +tter -348 +ying -349 +▁DPS -350 +▁Dis -351 +▁How -352 +▁Sch -353 +▁The -354 +▁are -355 +▁but -356 +▁get -357 +▁had -358 +▁let -359 +▁met -360 +▁now -361 +▁one -362 +▁rec -363 +▁res -364 +allic -365 +jects -366 +ouldn -367 +stall -368 +ually -369 +veral -370 +▁$572 -371 +▁Disk -372 +▁Lion -373 +▁ -374 +e -375 +o -376 +t -377 +n -378 +a -379 +i -380 +r -381 +s -382 +l -383 +d -384 +h -385 +u -386 +c -387 +m -388 +y -389 +p -390 +b -391 +f -392 +g -393 +v -394 +w -395 +k -396 +. -397 +, -398 +D -399 +S -400 +B -401 +C -402 +I -403 +0 -404 +' -405 +2 -406 +1 -407 +T -408 +? -409 +H -410 +) -411 +O -412 +U -413 +x -414 +( -415 +- -416 +4 -417 +5 -418 +7 -419 +8 -420 +A -421 +G -422 +P -423 +W -424 +j -425 +! 
-426 +" -427 +$ -428 +L -429 +M -430 +Q -431 +_ -432 +z -433 +3 -434 +6 -435 +E -436 +N -437 +q -438 ++ -439 +F -440 +# -441 +/ -442 +J -443 +K -444 +R -445 +X -446 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/.ipynb_checkpoints/C4W3_HF_Lab1_QA_BERT-checkpoint.ipynb b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/.ipynb_checkpoints/C4W3_HF_Lab1_QA_BERT-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..2b0b51ecb35f5bc0113530b0b3f978ffbc461ef2 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/.ipynb_checkpoints/C4W3_HF_Lab1_QA_BERT-checkpoint.ipynb @@ -0,0 +1,2110 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "id": "u2UXutvEvpUj" + }, + "source": [ + "# Question Answering with BERT and HuggingFace\n", + "\n", + "You've seen how to use BERT and other transformer models for a wide range of natural language tasks, including machine translation, summarization, and question answering. Transformers have become the standard model for NLP, similar to convolutional models in computer vision. And it all started with Attention!\n", + "\n", + "In practice, you'll rarely train a transformer model from scratch. Transformers tend to be very large, so they take time, money, and lots of data to train fully. Instead, you'll want to start with a pre-trained model and fine-tune it with your dataset if you need to.\n", + "\n", + "[Hugging Face](https://huggingface.co/) (🤗) is the best resource for pre-trained transformers. Their open-source libraries simplify downloading and using transformer models like BERT, T5, and GPT-2. And the best part: you can use them alongside TensorFlow, PyTorch, or Flax.\n", + "\n", + "In this notebook, you'll use 🤗 transformers to run the DistilBERT model for question answering."
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "tm675LmQvpUm" + }, + "source": [ + "## Pipelines\n", + "\n", + "Before fine-tuning a model, you will look at the pipelines from Hugging Face to use pre-trained transformer models for specific tasks. The `transformers` library provides pipelines for popular tasks like sentiment analysis, summarization, and text generation. A pipeline consists of a tokenizer, a model, and the model configuration. All these are packaged together into an easy-to-use object. Hugging Face makes life easier.\n", + "\n", + "Pipelines are intended to be used without fine-tuning and will often be immediately helpful in your projects. For example, `transformers` provides a pipeline for [question answering](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.QuestionAnsweringPipeline) that you can directly use to answer your questions if you give some context. Let's see how to do just that.\n", + "\n", + "You will import `pipeline` from `transformers` for creating pipelines." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "uNJGGbRWvpUm" + }, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "\n", + "from transformers import pipeline" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "_CeFTIr7P3QR" + }, + "source": [ + "Now, you will create the pipeline for question-answering, which uses the [DistilBert](https://hf.co/distilbert-base-cased-distilled-squad) model for extractive question answering (i.e., answering questions with the exact wording provided in the context)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 177, + "referenced_widgets": [ + "d7e158e614f44983b229d6dd0d8960f9", + "69aad11dbc914410b95f6c3cb17a2457", + "0302e718c6084fb0a96d92fd976738dc", + "72b47b116b0b4125a35d47e060f46807", + "6f5fbb8f0f5a4374a7bde870c64f1fa4", + "331c23df507e4d679e3aaf81af39cd22", + "e33617a01b03437986c143c9a69ba14f", + "b1271eb1b7e74250bd9273f229b49cd8", + "6b20c9e39d36404fb761b4d83954a278", + "322cc6ef697945ccbbc2b3029dfdf0e3", + "f9456ff5134242bc9541d9d60c753384", + "5b917388b6624637ad8d8f60516d4001", + "13e05f2f64a54245a2478393b1f6b409", + "3c63301478f54f95ba0f0f8c853a7266", + "bb455638d3ac451096fd7cce4cb0d82c", + "c4a6b418089147f6b50eab097ace0342", + "9f4a770ae6b84593ac7de85e15c305a9", + "65d019ca643045a2b0933411d059c920", + "794c088a92ba4f6798fa94cded51d0bd", + "deb9a0d8d3e1430bacab12c8b4ce7573", + "c53ce49d9b7c4a3c87ad7f7c75dce1f5", + "b099d3bb966a4e9b8d02710329030ff3", + "6d42d468b1d04c4a94fb2eed75e3c238", + "a371ed9c75184b78a80facda31086426", + "c363fb4238e0464cb3a4ab16250e554a", + "86dce8a2c404469dab0dc1a466788c0a", + "f6a334e9b5da4c82a1c4f1fbc1fe3c7e", + "e9f1a9476de147c3bc98c1c36960dad6", + "2f8a7ddfcee64b978ff23f8f43911e01", + "099a1ac5af9b4e38abb6afb9c333d37a", + "340a7a5171ba4b8aa9b70e945df1618c", + "b0f31194b8f24b5ab601f8edd5332e04", + "cbeb9e9dfdf6420d91ec1126fedd8e48", + "da1553ec3e044a4fb7bb9b0c2a84bfe0", + "b440f3b3937549d69a4f188bb8415531", + "1b937b91e8ee46c4a764a8091f365291", + "1369029047864cc68100a44ffaad35ca", + "80cb9869ae694ecda95aab298598c7a2", + "2aaa61bf4df248be970940e103af0276", + "3a7f8b6302034a0e889976ba0fbb4531", + "8cc02da6124448598923899b144afd6d", + "7361467d667d49ee85c432eec884d882", + "39c0c457e3bd46c1971ae2913fd66429", + "2985df1d86904019a8fdf69a356ade6d", + "6c291edce30b4cefb329113d5ecbe640", + "1249849d670f4824a0a21aa61d187b56", + "0e7b7a12422d49d4a33538e638bfc1c9", + "ec90909f38e0448bab37c735ea9b9ebe", 
+ "f4274a98f06945a6a1e4a56b680c1790", + "c04562a21d36405890c11e74915839c7", + "e6b91c2208e44524a9883309ae431277", + "9adf994d114141f98aeea509a73e9c59", + "520e9bf4c0164e60a2c3288bd97ef93e", + "dd420d29751341faa84c025afa743bb5", + "4d56ed704097453a930baa9ecdfb1156" + ] + }, + "id": "nKy4AAhLvpUo", + "outputId": "0419ab21-4237-4ad2-b076-9ed83377ed34" + }, + "outputs": [], + "source": [ + "# The task \"question-answering\" will return a QuestionAnsweringPipeline object\n", + "question_answerer = pipeline(task=\"question-answering\", model=\"distilbert-base-cased-distilled-squad\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "4ltQLVWgvpUo" + }, + "source": [ + "Notice that this environment already has the model stored in the directory `distilbert-base-cased-distilled-squad`. However, if you were to run that exact code on your local computer, Hugging Face would download the model for you, which is a great feature!\n", + "\n", + "\n", + "After running the last cell, you have a pipeline for performing question answering given a context string. The pipeline `question_answerer` you just created needs you to pass the question and context as strings. It returns an answer to the question from the context you provided. For example, here are the first few paragraphs from the [Wikipedia entry for tea](https://en.wikipedia.org/wiki/Tea) that you will use as the context.\n", + "\n", + "\n"
After water, it is the most widely consumed drink in the world.\n", + "There are many different types of tea; some, like Chinese greens and Darjeeling, have a cooling, slightly bitter,\n", + "and astringent flavour, while others have vastly different profiles that include sweet, nutty, floral, or grassy\n", + "notes. Tea has a stimulating effect in humans primarily due to its caffeine content.\n", + "\n", + "The tea plant originated in the region encompassing today's Southwest China, Tibet, north Myanmar and Northeast India,\n", + "where it was used as a medicinal drink by various ethnic groups. An early credible record of tea drinking dates to\n", + "the 3rd century AD, in a medical text written by Hua Tuo. It was popularised as a recreational drink during the\n", + "Chinese Tang dynasty, and tea drinking spread to other East Asian countries. Portuguese priests and merchants\n", + "introduced it to Europe during the 16th century. During the 17th century, drinking tea became fashionable among the\n", + "English, who started to plant tea on a large scale in India.\n", + "\n", + "The term herbal tea refers to drinks not made from Camellia sinensis: infusions of fruit, leaves, or other plant\n", + "parts, such as steeps of rosehip, chamomile, or rooibos. These may be called tisanes or herbal infusions to prevent\n", + "confusion with 'tea' made from the tea plant.\n", + "\"\"\"" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "HyR3o2mrvpUq" + }, + "source": [ + "Now, you can ask your model anything related to that passage. For instance, \"Where is tea native to?\"." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "eiRohAWWvpUq", + "outputId": "a1ddfca3-3723-4d43-cbda-0509337b60d6", + "scrolled": true + }, + "outputs": [], + "source": [ + "result = question_answerer(question=\"Where is tea native to?\", context=context)\n", + "\n", + "print(result['answer'])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "cRXzFlZ5vpUr" + }, + "source": [ + "You can also pass multiple questions to your pipeline within a list so that you can ask:\n", + "\n", + "* \"Where is tea native to?\"\n", + "* \"When was tea discovered?\"\n", + "* \"What is the species name for tea?\"\n", + "\n", + "at the same time, and your `question_answerer` will return all the answers." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "IMLyXeMZvpUr", + "outputId": "ac9badb1-083d-4234-9474-f112c1f2f20f" + }, + "outputs": [], + "source": [ + "questions = [\"Where is tea native to?\",\n", + " \"When was tea discovered?\",\n", + " \"What is the species name for tea?\"]\n", + "\n", + "results = question_answerer(question=questions, context=context)\n", + "\n", + "for q, r in zip(questions, results):\n", + " print(f\"{q} \\n>> {r['answer']}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "XXf18tVu8p70" + }, + "source": [ + "Although the models used in the Hugging Face pipelines generally give outstanding results, sometimes you will have particular examples where they don't perform so well.
Let's use the following example with a context string about the Golden Age of Comic Books:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "0v9C0TAqwinw" + }, + "outputs": [], + "source": [ + "context = \"\"\"\n", + "The Golden Age of Comic Books describes an era of American comic books from the\n", + "late 1930s to circa 1950. During this time, modern comic books were first published\n", + "and rapidly increased in popularity. The superhero archetype was created and many\n", + "well-known characters were introduced, including Superman, Batman, Captain Marvel\n", + "(later known as SHAZAM!), Captain America, and Wonder Woman.\n", + "Between 1939 and 1941 Detective Comics and its sister company, All-American Publications,\n", + "introduced popular superheroes such as Batman and Robin, Wonder Woman, the Flash,\n", + "Green Lantern, Doctor Fate, the Atom, Hawkman, Green Arrow and Aquaman.[7] Timely Comics,\n", + "the 1940s predecessor of Marvel Comics, had million-selling titles featuring the Human Torch,\n", + "the Sub-Mariner, and Captain America.[8]\n", + "As comic books grew in popularity, publishers began launching titles that expanded\n", + "into a variety of genres. 
Dell Comics' non-superhero characters (particularly the\n", + "licensed Walt Disney animated-character comics) outsold the superhero comics of the day.[12]\n", + "The publisher featured licensed movie and literary characters such as Mickey Mouse, Donald Duck,\n", + "Roy Rogers and Tarzan.[13] It was during this era that noted Donald Duck writer-artist\n", + "Carl Barks rose to prominence.[14] Additionally, MLJ's introduction of Archie Andrews\n", + "in Pep Comics #22 (December 1941) gave rise to teen humor comics,[15] with the Archie\n", + "Andrews character remaining in print well into the 21st century.[16]\n", + "At the same time in Canada, American comic books were prohibited importation under\n", + "the War Exchange Conservation Act[17] which restricted the importation of non-essential\n", + "goods. As a result, a domestic publishing industry flourished during the duration\n", + "of the war which were collectively informally called the Canadian Whites.\n", + "The educational comic book Dagwood Splits the Atom used characters from the comic\n", + "strip Blondie.[18] According to historian Michael A. Amundson, appealing comic-book\n", + "characters helped ease young readers' fear of nuclear war and neutralize anxiety\n", + "about the questions posed by atomic power.[19] It was during this period that long-running\n", + "humor comics debuted, including EC's Mad and Carl Barks' Uncle Scrooge in Dell's Four\n", + "Color Comics (both in 1952).[20][21]\n", + "\"\"\"" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "fYbERLKQbhyH" + }, + "source": [ + "Let's ask the following question: \"What popular superheroes were introduced between 1939 and 1941?\" The answer is in the fourth paragraph of the context string." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "SEmAbSSGbg0J", + "outputId": "35b5e3c4-2fd2-4f37-b674-014681ece042" + }, + "outputs": [], + "source": [ + "question = \"What popular superheroes were introduced between 1939 and 1941?\"\n", + "\n", + "result = question_answerer(question=question, context=context)\n", + "print(result['answer'])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "LGx_BHkN-ejY" + }, + "source": [ + "Here, the answer should be:\n", + "\"Batman and Robin, Wonder Woman, the Flash,\n", + "Green Lantern, Doctor Fate, the Atom, Hawkman, Green Arrow, and Aquaman\". Instead, the pipeline returned a different answer. You can even try different question wordings:\n", + "\n", + "* \"What superheroes were introduced between 1939 and 1941?\"\n", + "* \"What comic book characters were created between 1939 and 1941?\"\n", + "* \"What well-known characters were created between 1939 and 1941?\"\n", + "* \"What well-known superheroes were introduced between 1939 and 1941 by Detective Comics?\"\n", + "\n", + "and you will only get incorrect answers." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "f91kLn9VcRzK", + "outputId": "bb3942b6-321a-4466-ac18-9f173b115600" + }, + "outputs": [], + "source": [ + "questions = [\"What popular superheroes were introduced between 1939 and 1941?\",\n", + " \"What superheroes were introduced between 1939 and 1941 by Detective Comics and its sister company?\",\n", + " \"What comic book characters were created between 1939 and 1941?\",\n", + " \"What well-known characters were created between 1939 and 1941?\",\n", + " \"What well-known superheroes were introduced between 1939 and 1941 by Detective Comics?\"]\n", + "\n", + "results = question_answerer(question=questions, context=context)\n", + "\n", + "for q, r in zip(questions, results):\n", + " print(f\"{q} \\n>> {r['answer']}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "QCkLhf27cEsH" + }, + "source": [ + "It seems like this model is a **huge fan** of Archie Andrews. It even considers him a superhero!\n", + "\n", + "The example that fooled your `question_answerer` belongs to the [TyDi QA dataset](https://ai.google.com/research/tydiqa), a dataset from Google for question answering in diverse languages.
To achieve better results when you know that the pipeline isn't working as it should, you need to consider fine-tuning your model.\n", + "\n", + "In the next ungraded lab, you will get the chance to fine-tune the DistilBert model using the TyDi QA dataset.\n", + "\n" + ] + } + ], + "metadata": { + "accelerator": "GPU", + "colab": { + "provenance": [] + }, + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.10" + }, + "widgets": { + "application/vnd.jupyter.widget-state+json": { + "0302e718c6084fb0a96d92fd976738dc": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "FloatProgressModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "FloatProgressModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "ProgressView", + "bar_style": "success", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_b1271eb1b7e74250bd9273f229b49cd8", + "max": 473, + "min": 0, + "orientation": "horizontal", + "style": "IPY_MODEL_6b20c9e39d36404fb761b4d83954a278", + "value": 473 + } + }, + "099a1ac5af9b4e38abb6afb9c333d37a": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + 
"border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "0e7b7a12422d49d4a33538e638bfc1c9": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "FloatProgressModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "FloatProgressModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "ProgressView", + "bar_style": "success", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_9adf994d114141f98aeea509a73e9c59", + "max": 435797, + "min": 0, + "orientation": "horizontal", + "style": "IPY_MODEL_520e9bf4c0164e60a2c3288bd97ef93e", + "value": 435797 + } + }, + "1249849d670f4824a0a21aa61d187b56": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": 
"IPY_MODEL_c04562a21d36405890c11e74915839c7", + "placeholder": "​", + "style": "IPY_MODEL_e6b91c2208e44524a9883309ae431277", + "value": "Downloading: 100%" + } + }, + "1369029047864cc68100a44ffaad35ca": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_39c0c457e3bd46c1971ae2913fd66429", + "placeholder": "​", + "style": "IPY_MODEL_2985df1d86904019a8fdf69a356ade6d", + "value": " 213k/213k [00:00<00:00, 170kB/s]" + } + }, + "13e05f2f64a54245a2478393b1f6b409": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_9f4a770ae6b84593ac7de85e15c305a9", + "placeholder": "​", + "style": "IPY_MODEL_65d019ca643045a2b0933411d059c920", + "value": "Downloading: 100%" + } + }, + "1b937b91e8ee46c4a764a8091f365291": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "FloatProgressModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "FloatProgressModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "ProgressView", + "bar_style": "success", + 
"description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_8cc02da6124448598923899b144afd6d", + "max": 213450, + "min": 0, + "orientation": "horizontal", + "style": "IPY_MODEL_7361467d667d49ee85c432eec884d882", + "value": 213450 + } + }, + "2985df1d86904019a8fdf69a356ade6d": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "2aaa61bf4df248be970940e103af0276": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + 
"2f8a7ddfcee64b978ff23f8f43911e01": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "322cc6ef697945ccbbc2b3029dfdf0e3": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "331c23df507e4d679e3aaf81af39cd22": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": 
null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "340a7a5171ba4b8aa9b70e945df1618c": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "ProgressStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "ProgressStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "bar_color": null, + "description_width": "" + } + }, + "39c0c457e3bd46c1971ae2913fd66429": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": 
null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "3a7f8b6302034a0e889976ba0fbb4531": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "3c63301478f54f95ba0f0f8c853a7266": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "FloatProgressModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "FloatProgressModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "ProgressView", + "bar_style": "success", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_794c088a92ba4f6798fa94cded51d0bd", + "max": 260793700, + "min": 0, + "orientation": "horizontal", + "style": "IPY_MODEL_deb9a0d8d3e1430bacab12c8b4ce7573", + "value": 260793700 + } + }, + "4d56ed704097453a930baa9ecdfb1156": { + "model_module": "@jupyter-widgets/controls", + 
"model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "520e9bf4c0164e60a2c3288bd97ef93e": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "ProgressStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "ProgressStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "bar_color": null, + "description_width": "" + } + }, + "5b917388b6624637ad8d8f60516d4001": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HBoxModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HBoxModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HBoxView", + "box_style": "", + "children": [ + "IPY_MODEL_13e05f2f64a54245a2478393b1f6b409", + "IPY_MODEL_3c63301478f54f95ba0f0f8c853a7266", + "IPY_MODEL_bb455638d3ac451096fd7cce4cb0d82c" + ], + "layout": "IPY_MODEL_c4a6b418089147f6b50eab097ace0342" + } + }, + "65d019ca643045a2b0933411d059c920": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" 
+ } + }, + "69aad11dbc914410b95f6c3cb17a2457": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_331c23df507e4d679e3aaf81af39cd22", + "placeholder": "​", + "style": "IPY_MODEL_e33617a01b03437986c143c9a69ba14f", + "value": "Downloading: 100%" + } + }, + "6b20c9e39d36404fb761b4d83954a278": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "ProgressStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "ProgressStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "bar_color": null, + "description_width": "" + } + }, + "6c291edce30b4cefb329113d5ecbe640": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HBoxModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HBoxModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HBoxView", + "box_style": "", + "children": [ + "IPY_MODEL_1249849d670f4824a0a21aa61d187b56", + "IPY_MODEL_0e7b7a12422d49d4a33538e638bfc1c9", + "IPY_MODEL_ec90909f38e0448bab37c735ea9b9ebe" + ], + "layout": "IPY_MODEL_f4274a98f06945a6a1e4a56b680c1790" + } + }, + "6d42d468b1d04c4a94fb2eed75e3c238": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HBoxModel", + "state": { 
+ "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HBoxModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HBoxView", + "box_style": "", + "children": [ + "IPY_MODEL_a371ed9c75184b78a80facda31086426", + "IPY_MODEL_c363fb4238e0464cb3a4ab16250e554a", + "IPY_MODEL_86dce8a2c404469dab0dc1a466788c0a" + ], + "layout": "IPY_MODEL_f6a334e9b5da4c82a1c4f1fbc1fe3c7e" + } + }, + "6f5fbb8f0f5a4374a7bde870c64f1fa4": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "72b47b116b0b4125a35d47e060f46807": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + 
"_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_322cc6ef697945ccbbc2b3029dfdf0e3", + "placeholder": "​", + "style": "IPY_MODEL_f9456ff5134242bc9541d9d60c753384", + "value": " 473/473 [00:00<00:00, 13.5kB/s]" + } + }, + "7361467d667d49ee85c432eec884d882": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "ProgressStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "ProgressStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "bar_color": null, + "description_width": "" + } + }, + "794c088a92ba4f6798fa94cded51d0bd": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": 
null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "80cb9869ae694ecda95aab298598c7a2": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "86dce8a2c404469dab0dc1a466788c0a": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_b0f31194b8f24b5ab601f8edd5332e04", + "placeholder": "​", + "style": 
"IPY_MODEL_cbeb9e9dfdf6420d91ec1126fedd8e48", + "value": " 29.0/29.0 [00:00<00:00, 321B/s]" + } + }, + "8cc02da6124448598923899b144afd6d": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "9adf994d114141f98aeea509a73e9c59": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + 
"grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "9f4a770ae6b84593ac7de85e15c305a9": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "a371ed9c75184b78a80facda31086426": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + 
"state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_e9f1a9476de147c3bc98c1c36960dad6", + "placeholder": "​", + "style": "IPY_MODEL_2f8a7ddfcee64b978ff23f8f43911e01", + "value": "Downloading: 100%" + } + }, + "b099d3bb966a4e9b8d02710329030ff3": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "b0f31194b8f24b5ab601f8edd5332e04": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + 
"object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "b1271eb1b7e74250bd9273f229b49cd8": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "b440f3b3937549d69a4f188bb8415531": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": 
"IPY_MODEL_2aaa61bf4df248be970940e103af0276", + "placeholder": "​", + "style": "IPY_MODEL_3a7f8b6302034a0e889976ba0fbb4531", + "value": "Downloading: 100%" + } + }, + "bb455638d3ac451096fd7cce4cb0d82c": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_c53ce49d9b7c4a3c87ad7f7c75dce1f5", + "placeholder": "​", + "style": "IPY_MODEL_b099d3bb966a4e9b8d02710329030ff3", + "value": " 261M/261M [00:04<00:00, 53.4MB/s]" + } + }, + "c04562a21d36405890c11e74915839c7": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + 
"padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "c363fb4238e0464cb3a4ab16250e554a": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "FloatProgressModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "FloatProgressModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "ProgressView", + "bar_style": "success", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_099a1ac5af9b4e38abb6afb9c333d37a", + "max": 29, + "min": 0, + "orientation": "horizontal", + "style": "IPY_MODEL_340a7a5171ba4b8aa9b70e945df1618c", + "value": 29 + } + }, + "c4a6b418089147f6b50eab097ace0342": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": 
null, + "top": null, + "visibility": null, + "width": null + } + }, + "c53ce49d9b7c4a3c87ad7f7c75dce1f5": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "cbeb9e9dfdf6420d91ec1126fedd8e48": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "d7e158e614f44983b229d6dd0d8960f9": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HBoxModel", + "state": { + "_dom_classes": [], + "_model_module": 
"@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HBoxModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HBoxView", + "box_style": "", + "children": [ + "IPY_MODEL_69aad11dbc914410b95f6c3cb17a2457", + "IPY_MODEL_0302e718c6084fb0a96d92fd976738dc", + "IPY_MODEL_72b47b116b0b4125a35d47e060f46807" + ], + "layout": "IPY_MODEL_6f5fbb8f0f5a4374a7bde870c64f1fa4" + } + }, + "da1553ec3e044a4fb7bb9b0c2a84bfe0": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HBoxModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HBoxModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HBoxView", + "box_style": "", + "children": [ + "IPY_MODEL_b440f3b3937549d69a4f188bb8415531", + "IPY_MODEL_1b937b91e8ee46c4a764a8091f365291", + "IPY_MODEL_1369029047864cc68100a44ffaad35ca" + ], + "layout": "IPY_MODEL_80cb9869ae694ecda95aab298598c7a2" + } + }, + "dd420d29751341faa84c025afa743bb5": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": 
null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "deb9a0d8d3e1430bacab12c8b4ce7573": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "ProgressStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "ProgressStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "bar_color": null, + "description_width": "" + } + }, + "e33617a01b03437986c143c9a69ba14f": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "e6b91c2208e44524a9883309ae431277": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "e9f1a9476de147c3bc98c1c36960dad6": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + 
"_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "ec90909f38e0448bab37c735ea9b9ebe": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_dd420d29751341faa84c025afa743bb5", + "placeholder": "​", + "style": "IPY_MODEL_4d56ed704097453a930baa9ecdfb1156", + "value": " 436k/436k [00:01<00:00, 470kB/s]" + } + }, + "f4274a98f06945a6a1e4a56b680c1790": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", 
+ "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "f6a334e9b5da4c82a1c4f1fbc1fe3c7e": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + 
"min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "f9456ff5134242bc9541d9d60c753384": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + } + } + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/.ipynb_checkpoints/C4W3_HF_Lab2_QA_BERT-checkpoint.ipynb b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/.ipynb_checkpoints/C4W3_HF_Lab2_QA_BERT-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..1164f17406ee14009277dc7ac1784bf51e82d187 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/.ipynb_checkpoints/C4W3_HF_Lab2_QA_BERT-checkpoint.ipynb @@ -0,0 +1,644 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "id": "u2UXutvEvpUj" + }, + "source": [ + "# Question Answering with BERT and HuggingFace 🤗 (Fine-tuning)\n", + "\n", + "In the previous Hugging Face ungraded lab, you saw how to use the pipeline objects to use transformer models for NLP tasks. In that lab, the model didn't output the desired answers to a series of precise questions for a context related to the history of comic books.\n", + "\n", + "In this lab, you will fine-tune the model from that lab to give better answers for that type of context. 
To do that, you'll be using a filtered version of the [TyDi QA dataset](https://ai.google.com/research/tydiqa) that contains only English examples. Additionally, you will use a lot of the tools that Hugging Face has to offer.\n", + "\n", + "Note that, in general, you fine-tune general-purpose transformer models to work for specific tasks. However, fine-tuning a general-purpose model can take a lot of time. That's why you will be using the model from the question answering pipeline in this lab.\n", + "\n", + "Begin by importing some libraries and/or objects you will use throughout the lab:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "\n", + "import numpy as np\n", + "\n", + "from datasets import load_from_disk\n", + "from transformers import AutoTokenizer, AutoModelForQuestionAnswering, Trainer, TrainingArguments\n", + "\n", + "from sklearn.metrics import f1_score" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "FrEglXPmvpUr" + }, + "source": [ + "## Fine-tuning a BERT model\n", + "\n", + "As you saw in the previous lab, you can use these pipelines as they are. But sometimes, you'll need something more specific to your problem, or maybe you need it to perform better on your production data.
In these cases, you'll need to fine-tune a model.\n", + "\n", + "Here, you'll fine-tune a pre-trained DistilBERT model on the TyDi QA dataset.\n", + "\n", + "To fine-tune your model, you will leverage three components provided by Hugging Face:\n", + "\n", + "* Datasets: Library that contains some datasets and different metrics to evaluate the performance of your models.\n", + "* Tokenizer: Object in charge of preprocessing your text to be given as input for the transformer models.\n", + "* Transformers: Library with the pre-trained model checkpoints and the trainer object.\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "g0Rg-e4jBFFs" + }, + "source": [ + "### Datasets\n", + "\n", + "To get the dataset to fine-tune your model, you will use [🤗 Datasets](https://huggingface.co/docs/datasets/), a lightweight and extensible library to easily share and access datasets and evaluation metrics for NLP. You can download Hugging Face datasets directly using the `load_dataset` function from the `datasets` library.\n", + "\n", + "Hugging Face `datasets` allows you to load data in several formats, such as CSV, JSON, text files, and even Parquet. You can see more about the supported formats in the [documentation](https://huggingface.co/docs/datasets/loading).\n", + "\n", + "A common approach is to use `load_dataset` to get the full dataset, but **for this lab you will use a filtered version containing only the English examples**, which is already saved in this environment.
Since this filtered dataset is saved using the Apache Arrow format, you can read it by using the `load_from_disk` function.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "x68dqaoXg5Ra" + }, + "outputs": [], + "source": [ + "# The path where the dataset is stored\n", + "path = './tydiqa_data/'\n", + "\n", + "# Load the dataset\n", + "tydiqa_data = load_from_disk(path)\n", + "\n", + "tydiqa_data" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "1hfzBZU3T47O" + }, + "source": [ + "\n", + "You can check below that the type of the loaded dataset is a `datasets.arrow_dataset.Dataset`. This object type is backed by an Apache Arrow table, which keeps track of where each example is stored and memory-maps the data instead of loading the complete dataset into memory. But you don't have to worry too much about that. It is just an efficient way to work with lots of data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "gkeppC3GQiW6" + }, + "outputs": [], + "source": [ + "# Checking the object type for one of the elements in the dataset\n", + "type(tydiqa_data['train'])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "q_HLaNtQaFlR" + }, + "source": [ + "You can also check the structure of the dataset:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "2l9ANJTrbP-U" + }, + "outputs": [], + "source": [ + "tydiqa_data['train']" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "2xRO1yIkvpUt" + }, + "source": [ + "You can see that each example is like a dictionary object. This dataset consists of questions, contexts, and indices that point to the start and end position of the answer inside the context. You can access the indices using the `annotations` key, which behaves like a dictionary."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "KNVpW6lADk92" + }, + "outputs": [], + "source": [ + "idx = 600\n", + "\n", + "# start index\n", + "start_index = tydiqa_data['train'][idx]['annotations']['minimal_answers_start_byte'][0]\n", + "\n", + "# end index\n", + "end_index = tydiqa_data['train'][idx]['annotations']['minimal_answers_end_byte'][0]\n", + "\n", + "print(f\"Question: {tydiqa_data['train'][idx]['question_text']}\")\n", + "print(f\"\\nContext (truncated): {tydiqa_data['train'][idx]['document_plaintext'][0:512]} ...\")\n", + "print(f\"\\nAnswer: {tydiqa_data['train'][idx]['document_plaintext'][start_index:end_index]}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Z-lZgDTEYm74" + }, + "source": [ + "The question answering model predicts a start and an end position in the context to extract as the answer. That's why this NLP task is known as extractive question answering.\n", + "\n", + "To train your model, you need to pass the start and end positions as labels. So, you need to implement a function that extracts them from the dataset.\n", + "\n", + "The dataset contains unanswerable questions. For these, the start and end indices for the answer are equal to `-1`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "Ty_QDcdKYw9a" + }, + "outputs": [], + "source": [ + "tydiqa_data['train'][0]['annotations']" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "lHWcNMudcAuO" + }, + "source": [ + "Now, you have to flatten the dataset so that you work with an object with a table structure instead of a dictionary structure. This makes the pre-processing steps easier."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "xDCAQQtoCs_r" + }, + "outputs": [], + "source": [ + "# Flattening the datasets\n", + "flattened_train_data = tydiqa_data['train'].flatten()\n", + "flattened_test_data = tydiqa_data['validation'].flatten()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "q5wUa5xED0fK" + }, + "source": [ + "Also, to make the training more straightforward and faster, you will extract a subset of the train and test datasets. For that purpose, you will use the Hugging Face Dataset object's `select()` method, which takes data points by their index. Here, you will select the first 3000 rows; you can play with the number of data points, but keep in mind that more data points will increase the training time." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "BkcIhpEnDHSJ" + }, + "outputs": [], + "source": [ + "# Selecting a subset of the train dataset\n", + "flattened_train_data = flattened_train_data.select(range(3000))\n", + "\n", + "# Selecting a subset of the test dataset\n", + "flattened_test_data = flattened_test_data.select(range(1000))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "fBXrmwXhc13M" + }, + "source": [ + "### Tokenizers\n", + "\n", + "Now, you will use the [tokenizer](https://huggingface.co/transformers/main_classes/tokenizer.html) object from Hugging Face. You can load a tokenizer in different ways. Here, you will load it from the same checkpoint used by the pipeline in the previous Hugging Face lab. With this tokenizer, you can ensure that the tokens you get for the dataset will match the tokens used in the original DistilBERT implementation.\n", + "\n", + "When loading a tokenizer with any method, you must pass the model checkpoint that you want to fine-tune.
Here, you are using the `'distilbert-base-cased-distilled-squad'` checkpoint.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "LInV3b_HyAIF" + }, + "outputs": [], + "source": [ + "# Load the tokenizer for the checkpoint\n", + "tokenizer = AutoTokenizer.from_pretrained(\"distilbert-base-cased-distilled-squad\")\n", + "\n", + "# Define the max length of sequences in the tokenizer\n", + "tokenizer.model_max_length = 512" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "qz6YtVcOh3qP" + }, + "source": [ + "Given the characteristics of the dataset and the question-answering task, you will need to add some steps to pre-process the data after tokenization:\n", + "\n", + "1. When a question has no answer in the given context, you will use the `CLS` token, a unique token that represents the start of the sequence.\n", + "\n", + "2. Tokenizers can split a given string into substrings, resulting in a subtoken for each substring. This creates a misalignment between the character indices in the dataset and the token positions produced by the tokenizer. Therefore, you will need to align the start and end indices with the tokens associated with the target answer words.\n", + "\n", + "3. Finally, a tokenizer can truncate a very long sequence. So, if the start/end position of an answer is `None`, you will assume that it was truncated and assign the maximum length of the tokenizer to those positions.\n", + "\n", + "These three steps are done within the `process_samples` function defined below."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "3l-r4wI06LU7" + }, + "outputs": [], + "source": [ + "# Processing samples using the 3 steps described above\n", + "def process_samples(sample):\n", + " tokenized_data = tokenizer(sample['document_plaintext'], sample['question_text'], truncation=\"only_first\", padding=\"max_length\")\n", + "\n", + " input_ids = tokenized_data[\"input_ids\"]\n", + "\n", + " # We will label impossible answers with the index of the CLS token.\n", + " cls_index = input_ids.index(tokenizer.cls_token_id)\n", + "\n", + " # If no answers are given, set the cls_index as answer.\n", + " if sample[\"annotations.minimal_answers_start_byte\"][0] == -1:\n", + " start_position = cls_index\n", + " end_position = cls_index\n", + " else:\n", + " # Start/end character index of the answer in the text.\n", + " gold_text = sample[\"document_plaintext\"][sample['annotations.minimal_answers_start_byte'][0]:sample['annotations.minimal_answers_end_byte'][0]]\n", + " start_char = sample[\"annotations.minimal_answers_start_byte\"][0]\n", + " end_char = sample['annotations.minimal_answers_end_byte'][0] #start_char + len(gold_text)\n", + "\n", + " # sometimes answers are off by a character or two – fix this\n", + " if sample['document_plaintext'][start_char-1:end_char-1] == gold_text:\n", + " start_char = start_char - 1\n", + " end_char = end_char - 1 # When the gold label is off by one character\n", + " elif sample['document_plaintext'][start_char-2:end_char-2] == gold_text:\n", + " start_char = start_char - 2\n", + " end_char = end_char - 2 # When the gold label is off by two characters\n", + "\n", + " start_token = tokenized_data.char_to_token(start_char)\n", + " end_token = tokenized_data.char_to_token(end_char - 1)\n", + "\n", + " # if start position is None, the answer passage has been truncated\n", + " if start_token is None:\n", + " start_token = tokenizer.model_max_length\n", + " if end_token is None:\n", + " 
end_token = tokenizer.model_max_length\n", + "\n", + " start_position = start_token\n", + " end_position = end_token\n", + "\n", + " return {'input_ids': tokenized_data['input_ids'],\n", + " 'attention_mask': tokenized_data['attention_mask'],\n", + " 'start_positions': start_position,\n", + " 'end_positions': end_position}\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Q3LAsWSyk_Rm" + }, + "source": [ + "To apply the `process_samples` function defined above to the whole dataset, you can use the `map` method as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "rGbYd7QnFetG" + }, + "outputs": [], + "source": [ + "# Tokenizing and processing the flattened dataset\n", + "processed_train_data = flattened_train_data.map(process_samples)\n", + "processed_test_data = flattened_test_data.map(process_samples)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "wCpPhYKJluMA" + }, + "source": [ + "### Transformers\n", + "\n", + "The last Hugging Face component that is useful for fine-tuning a transformer is the collection of pre-trained models, which you can access in multiple ways.\n", + "\n", + "For this lab, you will use the same model from the question-answering pipeline that you loaded in the previous lab." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "jR3VqjNc1Vb3" + }, + "outputs": [], + "source": [ + "# Load the pre-trained model with a question-answering head. You will only fine-tune the head of the model\n", + "model = AutoModelForQuestionAnswering.from_pretrained(\"distilbert-base-cased-distilled-squad\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "K29BYtnsm1yH" + }, + "source": [ + "Now, you can take the necessary columns from the datasets to train/test and return them as PyTorch tensors."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "0X14G89noLfW" + }, + "outputs": [], + "source": [ + "columns_to_return = ['input_ids','attention_mask', 'start_positions', 'end_positions']\n", + "\n", + "processed_train_data.set_format(type='pt', columns=columns_to_return)\n", + "processed_test_data.set_format(type='pt', columns=columns_to_return)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "yjoUFWu_nLRq" + }, + "source": [ + "Here, we give you the F1 score as a metric to evaluate your model's performance. We use this metric for simplicity; it is computed from the start and end positions predicted by the model. If you want to dig deeper into other metrics that can be used for a question answering task, you can also check [this colab notebook resource](https://colab.research.google.com/github/huggingface/notebooks/blob/master/examples/question_answering.ipynb) from the Hugging Face team." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "xcW2wPnirsJk" + }, + "outputs": [], + "source": [ + "def compute_f1_metrics(pred):\n", + " start_labels = pred.label_ids[0]\n", + " start_preds = pred.predictions[0].argmax(-1)\n", + " end_labels = pred.label_ids[1]\n", + " end_preds = pred.predictions[1].argmax(-1)\n", + "\n", + " f1_start = f1_score(start_labels, start_preds, average='macro')\n", + " f1_end = f1_score(end_labels, end_preds, average='macro')\n", + "\n", + " return {\n", + " 'f1_start': f1_start,\n", + " 'f1_end': f1_end,\n", + " }" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "KuhASU4evpUu" + }, + "source": [ + "Now, you will use the Hugging Face [Trainer](https://huggingface.co/transformers/main_classes/trainer.html) to fine-tune your model."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "colab": { + "background_save": true + }, + "id": "nxyOwf5utXAt" + }, + "outputs": [], + "source": [ + "# Training hyperparameters\n", + "training_args = TrainingArguments(\n", + " output_dir='model_results', # output directory\n", + " overwrite_output_dir=True,\n", + " num_train_epochs=3, # total number of training epochs\n", + " per_device_train_batch_size=8, # batch size per device during training\n", + " per_device_eval_batch_size=8, # batch size for evaluation\n", + " warmup_steps=20, # number of warmup steps for learning rate scheduler\n", + " weight_decay=0.01, # strength of weight decay\n", + " logging_steps=50\n", + ")\n", + "\n", + "# Trainer object\n", + "trainer = Trainer(\n", + " model=model, # the instantiated 🤗 Transformers model to be trained\n", + " args=training_args, # training arguments, defined above\n", + " train_dataset=processed_train_data, # training dataset\n", + " eval_dataset=processed_test_data, # evaluation dataset\n", + " compute_metrics=compute_f1_metrics\n", + ")\n", + "\n", + "# Training loop\n", + "trainer.train()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Ic_wNlBHCRMn" + }, + "source": [ + "And, in the next cell, you can evaluate the fine-tuned model's performance on the test set." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "92N11A076wRA" + }, + "outputs": [], + "source": [ + "trainer.evaluate(processed_test_data)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "_HubPkRbnzh_" + }, + "source": [ + "### Using your Fine-Tuned Model\n", + "\n", + "After training and evaluating your fine-tuned model, you can check its results for the same questions from the previous lab.\n", + "\n", + "For that, you will tell Pytorch to use your GPU or your CPU to run the model. Additionally, you will need to tokenize your input context and questions. 
Finally, you need to post-process the output results to transform them from tokens to human-readable strings using the `tokenizer`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "text = r\"\"\"\n", + "The Golden Age of Comic Books describes an era of American comic books from the\n", + "late 1930s to circa 1950. During this time, modern comic books were first published\n", + "and rapidly increased in popularity. The superhero archetype was created and many\n", + "well-known characters were introduced, including Superman, Batman, Captain Marvel\n", + "(later known as SHAZAM!), Captain America, and Wonder Woman.\n", + "Between 1939 and 1941 Detective Comics and its sister company, All-American Publications,\n", + "introduced popular superheroes such as Batman and Robin, Wonder Woman, the Flash,\n", + "Green Lantern, Doctor Fate, the Atom, Hawkman, Green Arrow and Aquaman.[7] Timely Comics,\n", + "the 1940s predecessor of Marvel Comics, had million-selling titles featuring the Human Torch,\n", + "the Sub-Mariner, and Captain America.[8]\n", + "As comic books grew in popularity, publishers began launching titles that expanded\n", + "into a variety of genres. 
Dell Comics' non-superhero characters (particularly the\n", + "licensed Walt Disney animated-character comics) outsold the superhero comics of the day.[12]\n", + "The publisher featured licensed movie and literary characters such as Mickey Mouse, Donald Duck,\n", + "Roy Rogers and Tarzan.[13] It was during this era that noted Donald Duck writer-artist\n", + "Carl Barks rose to prominence.[14] Additionally, MLJ's introduction of Archie Andrews\n", + "in Pep Comics #22 (December 1941) gave rise to teen humor comics,[15] with the Archie\n", + "Andrews character remaining in print well into the 21st century.[16]\n", + "At the same time in Canada, American comic books were prohibited importation under\n", + "the War Exchange Conservation Act[17] which restricted the importation of non-essential\n", + "goods. As a result, a domestic publishing industry flourished during the duration\n", + "of the war which were collectively informally called the Canadian Whites.\n", + "The educational comic book Dagwood Splits the Atom used characters from the comic\n", + "strip Blondie.[18] According to historian Michael A. 
Amundson, appealing comic-book\n", + "characters helped ease young readers' fear of nuclear war and neutralize anxiety\n", + "about the questions posed by atomic power.[19] It was during this period that long-running\n", + "humor comics debuted, including EC's Mad and Carl Barks' Uncle Scrooge in Dell's Four\n", + "Color Comics (both in 1952).[20][21]\n", + "\"\"\"\n", + "\n", + "questions = [\"What superheroes were introduced between 1939 and 1941 by Detective Comics and its sister company?\",\n", + " \"What comic book characters were created between 1939 and 1941?\",\n", + " \"What well-known characters were created between 1939 and 1941?\",\n", + " \"What well-known superheroes were introduced between 1939 and 1941 by Detective Comics?\"]\n", + "\n", + "for question in questions:\n", + " inputs = tokenizer.encode_plus(question, text, return_tensors=\"pt\")\n", + "\n", + " input_ids = inputs[\"input_ids\"].tolist()[0]\n", + "\n", + " # Move the inputs to the same device as the model (the GPU after training)\n", + " inputs.to(\"cuda\")\n", + "\n", + " text_tokens = tokenizer.convert_ids_to_tokens(input_ids)\n", + " answer_model = model(**inputs)\n", + "\n", + " start_logits = answer_model['start_logits'].cpu().detach().numpy()\n", + "\n", + " # Get the most likely beginning of the answer with the argmax of the score\n", + " answer_start = np.argmax(start_logits)\n", + "\n", + " end_logits = answer_model['end_logits'].cpu().detach().numpy()\n", + "\n", + " # Get the most likely end of the answer with the argmax of the score\n", + " answer_end = np.argmax(end_logits) + 1\n", + "\n", + " answer = tokenizer.convert_tokens_to_string(tokenizer.convert_ids_to_tokens(input_ids[answer_start:answer_end]))\n", + "\n", + " print(f\"Question: {question}\")\n", + " print(f\"Answer: {answer}\\n\")\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "_yTDQ6kn6pWS" + }, + "source": [ + "By fine-tuning the model for only 3 epochs, you can already see an improvement!\n", + "\n", + "You can compare those results with those obtained using the base model (without
fine-tuning), as you did in the previous lab. As a reminder, here are those results:\n", + "\n", + "```\n", + "What popular superheroes were introduced between 1939 and 1941?\n", + ">> teen humor comics\n", + "What superheroes were introduced between 1939 and 1941 by Detective Comics and its sister company?\n", + ">> Archie Andrews\n", + "What comic book characters were created between 1939 and 1941?\n", + ">> Archie\n", + "Andrews\n", + "What well-known characters were created between 1939 and 1941?\n", + ">> Archie\n", + "Andrews\n", + "What well-known superheroes were introduced between 1939 and 1941 by Detective Comics?\n", + ">> Archie Andrews\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "uf-v8mUSLqXN" + }, + "source": [ + "**Congratulations!**\n", + "\n", + "You have finished this series of ungraded labs. You were able to:\n", + "\n", + "* Explore the Hugging Face Pipelines, which can be used right off the bat.\n", + "\n", + "* Fine-tune a model for the extractive question answering task.\n", + "\n", + "We also recommend you go through the free [Hugging Face course](https://huggingface.co/course/chapter1) to explore their ecosystem in more detail and find different ways to use the `transformers` library."
+ ] + } + ], + "metadata": { + "accelerator": "GPU", + "colab": { + "provenance": [] + }, + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.10" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/C4W3_HF_Lab1_QA_BERT.ipynb b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/C4W3_HF_Lab1_QA_BERT.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..ea1c4fdd9e0ab510a3e9d93e382e2cee05f17bc7 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/C4W3_HF_Lab1_QA_BERT.ipynb @@ -0,0 +1,2110 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "id": "u2UXutvEvpUj" + }, + "source": [ + "# Question Answering with BERT and HuggingFace\n", + "\n", + "You've seen how to use BERT and other transformer models for a wide range of natural language tasks, including machine translation, summarization, and question answering. Transformers have become the standard model for NLP, similar to convolutional models in computer vision. And it all started with Attention!\n", + "\n", + "In practice, you'll rarely train a transformer model from scratch. Transformers tend to be very large, so they take time, money, and lots of data to train fully. Instead, you'll want to start with a pre-trained model and fine-tune it with your dataset if you need to.\n", + "\n", + "[Hugging Face](https://huggingface.co/) (🤗) is the best resource for pre-trained transformers. Their open-source libraries simplify downloading and using transformer models like BERT, T5, and GPT-2.
And the best part: you can use them with TensorFlow, PyTorch, or Flax.\n", + "\n", + "In this notebook, you'll use 🤗 transformers to apply the DistilBERT model to question answering." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "tm675LmQvpUm" + }, + "source": [ + "## Pipelines\n", + "\n", + "Before fine-tuning a model, you will look at the pipelines from Hugging Face to use pre-trained transformer models for specific tasks. The `transformers` library provides pipelines for popular tasks like sentiment analysis, summarization, and text generation. A pipeline consists of a tokenizer, a model, and the model configuration, all packaged together into an easy-to-use object. Hugging Face makes life easier.\n", + "\n", + "Pipelines are intended to be used without fine-tuning and will often be immediately helpful in your projects. For example, `transformers` provides a pipeline for [question answering](https://huggingface.co/docs/transformers/main_classes/pipelines#transformers.QuestionAnsweringPipeline) that you can use directly to answer your questions if you give it some context. Let's see how to do just that.\n", + "\n", + "You will import `pipeline` from `transformers` for creating pipelines." + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": { + "id": "uNJGGbRWvpUm" + }, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "\n", + "from transformers import pipeline" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "_CeFTIr7P3QR" + }, + "source": [ + "Now, you will create the pipeline for question answering, which uses the [DistilBert](https://hf.co/distilbert-base-cased-distilled-squad) model for extractive question answering (i.e., answering questions with the exact wording provided in the context)."
+ ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 177, + "referenced_widgets": [ + "d7e158e614f44983b229d6dd0d8960f9", + "69aad11dbc914410b95f6c3cb17a2457", + "0302e718c6084fb0a96d92fd976738dc", + "72b47b116b0b4125a35d47e060f46807", + "6f5fbb8f0f5a4374a7bde870c64f1fa4", + "331c23df507e4d679e3aaf81af39cd22", + "e33617a01b03437986c143c9a69ba14f", + "b1271eb1b7e74250bd9273f229b49cd8", + "6b20c9e39d36404fb761b4d83954a278", + "322cc6ef697945ccbbc2b3029dfdf0e3", + "f9456ff5134242bc9541d9d60c753384", + "5b917388b6624637ad8d8f60516d4001", + "13e05f2f64a54245a2478393b1f6b409", + "3c63301478f54f95ba0f0f8c853a7266", + "bb455638d3ac451096fd7cce4cb0d82c", + "c4a6b418089147f6b50eab097ace0342", + "9f4a770ae6b84593ac7de85e15c305a9", + "65d019ca643045a2b0933411d059c920", + "794c088a92ba4f6798fa94cded51d0bd", + "deb9a0d8d3e1430bacab12c8b4ce7573", + "c53ce49d9b7c4a3c87ad7f7c75dce1f5", + "b099d3bb966a4e9b8d02710329030ff3", + "6d42d468b1d04c4a94fb2eed75e3c238", + "a371ed9c75184b78a80facda31086426", + "c363fb4238e0464cb3a4ab16250e554a", + "86dce8a2c404469dab0dc1a466788c0a", + "f6a334e9b5da4c82a1c4f1fbc1fe3c7e", + "e9f1a9476de147c3bc98c1c36960dad6", + "2f8a7ddfcee64b978ff23f8f43911e01", + "099a1ac5af9b4e38abb6afb9c333d37a", + "340a7a5171ba4b8aa9b70e945df1618c", + "b0f31194b8f24b5ab601f8edd5332e04", + "cbeb9e9dfdf6420d91ec1126fedd8e48", + "da1553ec3e044a4fb7bb9b0c2a84bfe0", + "b440f3b3937549d69a4f188bb8415531", + "1b937b91e8ee46c4a764a8091f365291", + "1369029047864cc68100a44ffaad35ca", + "80cb9869ae694ecda95aab298598c7a2", + "2aaa61bf4df248be970940e103af0276", + "3a7f8b6302034a0e889976ba0fbb4531", + "8cc02da6124448598923899b144afd6d", + "7361467d667d49ee85c432eec884d882", + "39c0c457e3bd46c1971ae2913fd66429", + "2985df1d86904019a8fdf69a356ade6d", + "6c291edce30b4cefb329113d5ecbe640", + "1249849d670f4824a0a21aa61d187b56", + "0e7b7a12422d49d4a33538e638bfc1c9", + "ec90909f38e0448bab37c735ea9b9ebe", + 
"f4274a98f06945a6a1e4a56b680c1790", + "c04562a21d36405890c11e74915839c7", + "e6b91c2208e44524a9883309ae431277", + "9adf994d114141f98aeea509a73e9c59", + "520e9bf4c0164e60a2c3288bd97ef93e", + "dd420d29751341faa84c025afa743bb5", + "4d56ed704097453a930baa9ecdfb1156" + ] + }, + "id": "nKy4AAhLvpUo", + "outputId": "0419ab21-4237-4ad2-b076-9ed83377ed34" + }, + "outputs": [], + "source": [ + "# The task \"question-answering\" will return a QuestionAnsweringPipeline object\n", + "question_answerer = pipeline(task=\"question-answering\", model=\"distilbert-base-cased-distilled-squad\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "4ltQLVWgvpUo" + }, + "source": [ + "Notice that this environment already has the model stored in the directory `distilbert-base-cased-distilled-squad`. However if you were to run that exact code on your local computer, Huggingface will download the model for you, which is a great feature!\n", + "\n", + "\n", + "After running the last cell, you have a pipeline for performing question answering given a context string. The pipeline `question_answerer` you just created needs you to pass the question and context as strings. It returns an answer to the question from the context you provided. For example, here are the first few paragraphs from the [Wikipedia entry for tea](https://en.wikipedia.org/wiki/Tea) that you will use as the context.\n", + "\n", + "\n" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": { + "id": "D_-MzZNJvpUp" + }, + "outputs": [], + "source": [ + "context = \"\"\"\n", + "Tea is an aromatic beverage prepared by pouring hot or boiling water over cured or fresh leaves of Camellia sinensis,\n", + "an evergreen shrub native to China and East Asia. 
After water, it is the most widely consumed drink in the world.\n", + "There are many different types of tea; some, like Chinese greens and Darjeeling, have a cooling, slightly bitter,\n", + "and astringent flavour, while others have vastly different profiles that include sweet, nutty, floral, or grassy\n", + "notes. Tea has a stimulating effect in humans primarily due to its caffeine content.\n", + "\n", + "The tea plant originated in the region encompassing today's Southwest China, Tibet, north Myanmar and Northeast India,\n", + "where it was used as a medicinal drink by various ethnic groups. An early credible record of tea drinking dates to\n", + "the 3rd century AD, in a medical text written by Hua Tuo. It was popularised as a recreational drink during the\n", + "Chinese Tang dynasty, and tea drinking spread to other East Asian countries. Portuguese priests and merchants\n", + "introduced it to Europe during the 16th century. During the 17th century, drinking tea became fashionable among the\n", + "English, who started to plant tea on a large scale in India.\n", + "\n", + "The term herbal tea refers to drinks not made from Camellia sinensis: infusions of fruit, leaves, or other plant\n", + "parts, such as steeps of rosehip, chamomile, or rooibos. These may be called tisanes or herbal infusions to prevent\n", + "confusion with 'tea' made from the tea plant.\n", + "\"\"\"" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "HyR3o2mrvpUq" + }, + "source": [ + "Now, you can ask your model anything related to that passage. For instance, \"Where is tea native to?\"." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 16, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "eiRohAWWvpUq", + "outputId": "a1ddfca3-3723-4d43-cbda-0509337b60d6", + "scrolled": true + }, + "outputs": [], + "source": [ + "result = question_answerer(question=\"Where is tea native to?\", context=context)\n", + "\n", + "print(result['answer'])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "cRXzFlZ5vpUr" + }, + "source": [ + "You can also pass multiple questions to your pipeline within a list so that you can ask:\n", + "\n", + "* \"Where is tea native to?\"\n", + "* \"When was tea discovered?\"\n", + "* \"What is the species name for tea?\"\n", + "\n", + "at the same time, and your `question_answerer` will return all the answers." + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "IMLyXeMZvpUr", + "outputId": "ac9badb1-083d-4234-9474-f112c1f2f20f" + }, + "outputs": [], + "source": [ + "questions = [\"Where is tea native to?\",\n", + " \"When was tea discovered?\",\n", + " \"What is the species name for tea?\"]\n", + "\n", + "results = question_answerer(question=questions, context=context)\n", + "\n", + "for q, r in zip(questions, results):\n", + " print(f\"{q} \\n>> {r['answer']}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "XXf18tVu8p70" + }, + "source": [ + "Although the models used in the Hugging Face pipelines generally give outstanding results, you will sometimes come across particular examples where they don't perform so well. Let's use the following example with a context string about the Golden Age of Comic Books:" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "metadata": { + "id": "0v9C0TAqwinw" + }, + "outputs": [], + "source": [ + "context = \"\"\"\n", + "The Golden Age of Comic Books describes an era of American comic books from the\n", + "late 1930s to circa 1950. 
During this time, modern comic books were first published\n", + "and rapidly increased in popularity. The superhero archetype was created and many\n", + "well-known characters were introduced, including Superman, Batman, Captain Marvel\n", + "(later known as SHAZAM!), Captain America, and Wonder Woman.\n", + "Between 1939 and 1941 Detective Comics and its sister company, All-American Publications,\n", + "introduced popular superheroes such as Batman and Robin, Wonder Woman, the Flash,\n", + "Green Lantern, Doctor Fate, the Atom, Hawkman, Green Arrow and Aquaman.[7] Timely Comics,\n", + "the 1940s predecessor of Marvel Comics, had million-selling titles featuring the Human Torch,\n", + "the Sub-Mariner, and Captain America.[8]\n", + "As comic books grew in popularity, publishers began launching titles that expanded\n", + "into a variety of genres. Dell Comics' non-superhero characters (particularly the\n", + "licensed Walt Disney animated-character comics) outsold the superhero comics of the day.[12]\n", + "The publisher featured licensed movie and literary characters such as Mickey Mouse, Donald Duck,\n", + "Roy Rogers and Tarzan.[13] It was during this era that noted Donald Duck writer-artist\n", + "Carl Barks rose to prominence.[14] Additionally, MLJ's introduction of Archie Andrews\n", + "in Pep Comics #22 (December 1941) gave rise to teen humor comics,[15] with the Archie\n", + "Andrews character remaining in print well into the 21st century.[16]\n", + "At the same time in Canada, American comic books were prohibited importation under\n", + "the War Exchange Conservation Act[17] which restricted the importation of non-essential\n", + "goods. As a result, a domestic publishing industry flourished during the duration\n", + "of the war which were collectively informally called the Canadian Whites.\n", + "The educational comic book Dagwood Splits the Atom used characters from the comic\n", + "strip Blondie.[18] According to historian Michael A. 
Amundson, appealing comic-book\n", + "characters helped ease young readers' fear of nuclear war and neutralize anxiety\n", + "about the questions posed by atomic power.[19] It was during this period that long-running\n", + "humor comics debuted, including EC's Mad and Carl Barks' Uncle Scrooge in Dell's Four\n", + "Color Comics (both in 1952).[20][21]\n", + "\"\"\"" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "fYbERLKQbhyH" + }, + "source": [ + "Let's ask the following question: \"What popular superheroes were introduced between 1939 and 1941?\" The answer is in the fourth paragraph of the context string." + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "SEmAbSSGbg0J", + "outputId": "35b5e3c4-2fd2-4f37-b674-014681ece042" + }, + "outputs": [], + "source": [ + "question = \"What popular superheroes were introduced between 1939 and 1941?\"\n", + "\n", + "result = question_answerer(question=question, context=context)\n", + "print(result['answer'])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "LGx_BHkN-ejY" + }, + "source": [ + "Here, the answer should be:\n", + "\"Batman and Robin, Wonder Woman, the Flash,\n", + "Green Lantern, Doctor Fate, the Atom, Hawkman, Green Arrow, and Aquaman\". Instead, the pipeline returned a different answer. You can even try different question wordings:\n", + "\n", + "* \"What superheroes were introduced between 1939 and 1941?\"\n", + "* \"What comic book characters were created between 1939 and 1941?\"\n", + "* \"What well-known characters were created between 1939 and 1941?\"\n", + "* \"What well-known superheroes were introduced between 1939 and 1941 by Detective Comics?\"\n", + "\n", + "and you will only get incorrect answers." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/" + }, + "id": "f91kLn9VcRzK", + "outputId": "bb3942b6-321a-4466-ac18-9f173b115600" + }, + "outputs": [], + "source": [ + "questions = [\"What popular superheroes were introduced between 1939 and 1941?\",\n", + " \"What superheroes were introduced between 1939 and 1941 by Detective Comics and its sister company?\",\n", + " \"What comic book characters were created between 1939 and 1941?\",\n", + " \"What well-known characters were created between 1939 and 1941?\",\n", + " \"What well-known superheroes were introduced between 1939 and 1941 by Detective Comics?\"]\n", + "\n", + "results = question_answerer(question=questions, context=context)\n", + "\n", + "for q, r in zip(questions, results):\n", + " print(f\"{q} \\n>> {r['answer']}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "QCkLhf27cEsH" + }, + "source": [ + "It seems like this model is a **huge fan** of Archie Andrews. It even considers him a superhero!\n", + "\n", + "The example that fooled your `question_answerer` belongs to the [TyDi QA dataset](https://ai.google.com/research/tydiqa), a question-answering dataset from Google that covers typologically diverse languages. 
To achieve better results when you find that a pipeline isn't working as it should, consider fine-tuning your model.\n", + "\n", + "In the next ungraded lab, you will get the chance to fine-tune the DistilBERT model using the TyDi QA dataset.\n", + "\n" + ] + } + ], + "metadata": { + "accelerator": "GPU", + "colab": { + "provenance": [] + }, + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.10" + }, + "widgets": { + "application/vnd.jupyter.widget-state+json": { + "0302e718c6084fb0a96d92fd976738dc": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "FloatProgressModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "FloatProgressModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "ProgressView", + "bar_style": "success", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_b1271eb1b7e74250bd9273f229b49cd8", + "max": 473, + "min": 0, + "orientation": "horizontal", + "style": "IPY_MODEL_6b20c9e39d36404fb761b4d83954a278", + "value": 473 + } + }, + "099a1ac5af9b4e38abb6afb9c333d37a": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + 
"border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "0e7b7a12422d49d4a33538e638bfc1c9": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "FloatProgressModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "FloatProgressModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "ProgressView", + "bar_style": "success", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_9adf994d114141f98aeea509a73e9c59", + "max": 435797, + "min": 0, + "orientation": "horizontal", + "style": "IPY_MODEL_520e9bf4c0164e60a2c3288bd97ef93e", + "value": 435797 + } + }, + "1249849d670f4824a0a21aa61d187b56": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": 
"IPY_MODEL_c04562a21d36405890c11e74915839c7", + "placeholder": "​", + "style": "IPY_MODEL_e6b91c2208e44524a9883309ae431277", + "value": "Downloading: 100%" + } + }, + "1369029047864cc68100a44ffaad35ca": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_39c0c457e3bd46c1971ae2913fd66429", + "placeholder": "​", + "style": "IPY_MODEL_2985df1d86904019a8fdf69a356ade6d", + "value": " 213k/213k [00:00<00:00, 170kB/s]" + } + }, + "13e05f2f64a54245a2478393b1f6b409": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_9f4a770ae6b84593ac7de85e15c305a9", + "placeholder": "​", + "style": "IPY_MODEL_65d019ca643045a2b0933411d059c920", + "value": "Downloading: 100%" + } + }, + "1b937b91e8ee46c4a764a8091f365291": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "FloatProgressModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "FloatProgressModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "ProgressView", + "bar_style": "success", + 
"description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_8cc02da6124448598923899b144afd6d", + "max": 213450, + "min": 0, + "orientation": "horizontal", + "style": "IPY_MODEL_7361467d667d49ee85c432eec884d882", + "value": 213450 + } + }, + "2985df1d86904019a8fdf69a356ade6d": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "2aaa61bf4df248be970940e103af0276": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + 
"2f8a7ddfcee64b978ff23f8f43911e01": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "322cc6ef697945ccbbc2b3029dfdf0e3": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "331c23df507e4d679e3aaf81af39cd22": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": 
null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "340a7a5171ba4b8aa9b70e945df1618c": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "ProgressStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "ProgressStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "bar_color": null, + "description_width": "" + } + }, + "39c0c457e3bd46c1971ae2913fd66429": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": 
null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "3a7f8b6302034a0e889976ba0fbb4531": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "3c63301478f54f95ba0f0f8c853a7266": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "FloatProgressModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "FloatProgressModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "ProgressView", + "bar_style": "success", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_794c088a92ba4f6798fa94cded51d0bd", + "max": 260793700, + "min": 0, + "orientation": "horizontal", + "style": "IPY_MODEL_deb9a0d8d3e1430bacab12c8b4ce7573", + "value": 260793700 + } + }, + "4d56ed704097453a930baa9ecdfb1156": { + "model_module": "@jupyter-widgets/controls", + 
"model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "520e9bf4c0164e60a2c3288bd97ef93e": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "ProgressStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "ProgressStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "bar_color": null, + "description_width": "" + } + }, + "5b917388b6624637ad8d8f60516d4001": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HBoxModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HBoxModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HBoxView", + "box_style": "", + "children": [ + "IPY_MODEL_13e05f2f64a54245a2478393b1f6b409", + "IPY_MODEL_3c63301478f54f95ba0f0f8c853a7266", + "IPY_MODEL_bb455638d3ac451096fd7cce4cb0d82c" + ], + "layout": "IPY_MODEL_c4a6b418089147f6b50eab097ace0342" + } + }, + "65d019ca643045a2b0933411d059c920": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" 
+ } + }, + "69aad11dbc914410b95f6c3cb17a2457": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_331c23df507e4d679e3aaf81af39cd22", + "placeholder": "​", + "style": "IPY_MODEL_e33617a01b03437986c143c9a69ba14f", + "value": "Downloading: 100%" + } + }, + "6b20c9e39d36404fb761b4d83954a278": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "ProgressStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "ProgressStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "bar_color": null, + "description_width": "" + } + }, + "6c291edce30b4cefb329113d5ecbe640": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HBoxModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HBoxModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HBoxView", + "box_style": "", + "children": [ + "IPY_MODEL_1249849d670f4824a0a21aa61d187b56", + "IPY_MODEL_0e7b7a12422d49d4a33538e638bfc1c9", + "IPY_MODEL_ec90909f38e0448bab37c735ea9b9ebe" + ], + "layout": "IPY_MODEL_f4274a98f06945a6a1e4a56b680c1790" + } + }, + "6d42d468b1d04c4a94fb2eed75e3c238": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HBoxModel", + "state": { 
+ "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HBoxModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HBoxView", + "box_style": "", + "children": [ + "IPY_MODEL_a371ed9c75184b78a80facda31086426", + "IPY_MODEL_c363fb4238e0464cb3a4ab16250e554a", + "IPY_MODEL_86dce8a2c404469dab0dc1a466788c0a" + ], + "layout": "IPY_MODEL_f6a334e9b5da4c82a1c4f1fbc1fe3c7e" + } + }, + "6f5fbb8f0f5a4374a7bde870c64f1fa4": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "72b47b116b0b4125a35d47e060f46807": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + 
"_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_322cc6ef697945ccbbc2b3029dfdf0e3", + "placeholder": "​", + "style": "IPY_MODEL_f9456ff5134242bc9541d9d60c753384", + "value": " 473/473 [00:00<00:00, 13.5kB/s]" + } + }, + "7361467d667d49ee85c432eec884d882": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "ProgressStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "ProgressStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "bar_color": null, + "description_width": "" + } + }, + "794c088a92ba4f6798fa94cded51d0bd": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": 
null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "80cb9869ae694ecda95aab298598c7a2": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "86dce8a2c404469dab0dc1a466788c0a": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_b0f31194b8f24b5ab601f8edd5332e04", + "placeholder": "​", + "style": 
"IPY_MODEL_cbeb9e9dfdf6420d91ec1126fedd8e48", + "value": " 29.0/29.0 [00:00<00:00, 321B/s]" + } + }, + "8cc02da6124448598923899b144afd6d": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "9adf994d114141f98aeea509a73e9c59": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + 
"grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "9f4a770ae6b84593ac7de85e15c305a9": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "a371ed9c75184b78a80facda31086426": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + 
"state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_e9f1a9476de147c3bc98c1c36960dad6", + "placeholder": "​", + "style": "IPY_MODEL_2f8a7ddfcee64b978ff23f8f43911e01", + "value": "Downloading: 100%" + } + }, + "b099d3bb966a4e9b8d02710329030ff3": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "b0f31194b8f24b5ab601f8edd5332e04": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + 
"object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "b1271eb1b7e74250bd9273f229b49cd8": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "b440f3b3937549d69a4f188bb8415531": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": 
"IPY_MODEL_2aaa61bf4df248be970940e103af0276", + "placeholder": "​", + "style": "IPY_MODEL_3a7f8b6302034a0e889976ba0fbb4531", + "value": "Downloading: 100%" + } + }, + "bb455638d3ac451096fd7cce4cb0d82c": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_c53ce49d9b7c4a3c87ad7f7c75dce1f5", + "placeholder": "​", + "style": "IPY_MODEL_b099d3bb966a4e9b8d02710329030ff3", + "value": " 261M/261M [00:04<00:00, 53.4MB/s]" + } + }, + "c04562a21d36405890c11e74915839c7": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + 
"padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "c363fb4238e0464cb3a4ab16250e554a": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "FloatProgressModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "FloatProgressModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "ProgressView", + "bar_style": "success", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_099a1ac5af9b4e38abb6afb9c333d37a", + "max": 29, + "min": 0, + "orientation": "horizontal", + "style": "IPY_MODEL_340a7a5171ba4b8aa9b70e945df1618c", + "value": 29 + } + }, + "c4a6b418089147f6b50eab097ace0342": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": 
null, + "top": null, + "visibility": null, + "width": null + } + }, + "c53ce49d9b7c4a3c87ad7f7c75dce1f5": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "cbeb9e9dfdf6420d91ec1126fedd8e48": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "d7e158e614f44983b229d6dd0d8960f9": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HBoxModel", + "state": { + "_dom_classes": [], + "_model_module": 
"@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HBoxModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HBoxView", + "box_style": "", + "children": [ + "IPY_MODEL_69aad11dbc914410b95f6c3cb17a2457", + "IPY_MODEL_0302e718c6084fb0a96d92fd976738dc", + "IPY_MODEL_72b47b116b0b4125a35d47e060f46807" + ], + "layout": "IPY_MODEL_6f5fbb8f0f5a4374a7bde870c64f1fa4" + } + }, + "da1553ec3e044a4fb7bb9b0c2a84bfe0": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HBoxModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HBoxModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HBoxView", + "box_style": "", + "children": [ + "IPY_MODEL_b440f3b3937549d69a4f188bb8415531", + "IPY_MODEL_1b937b91e8ee46c4a764a8091f365291", + "IPY_MODEL_1369029047864cc68100a44ffaad35ca" + ], + "layout": "IPY_MODEL_80cb9869ae694ecda95aab298598c7a2" + } + }, + "dd420d29751341faa84c025afa743bb5": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": 
null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "deb9a0d8d3e1430bacab12c8b4ce7573": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "ProgressStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "ProgressStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "bar_color": null, + "description_width": "" + } + }, + "e33617a01b03437986c143c9a69ba14f": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "e6b91c2208e44524a9883309ae431277": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + }, + "e9f1a9476de147c3bc98c1c36960dad6": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + 
"_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "ec90909f38e0448bab37c735ea9b9ebe": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "HTMLModel", + "state": { + "_dom_classes": [], + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "HTMLModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/controls", + "_view_module_version": "1.5.0", + "_view_name": "HTMLView", + "description": "", + "description_tooltip": null, + "layout": "IPY_MODEL_dd420d29751341faa84c025afa743bb5", + "placeholder": "​", + "style": "IPY_MODEL_4d56ed704097453a930baa9ecdfb1156", + "value": " 436k/436k [00:01<00:00, 470kB/s]" + } + }, + "f4274a98f06945a6a1e4a56b680c1790": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", 
+ "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + "min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "f6a334e9b5da4c82a1c4f1fbc1fe3c7e": { + "model_module": "@jupyter-widgets/base", + "model_module_version": "1.2.0", + "model_name": "LayoutModel", + "state": { + "_model_module": "@jupyter-widgets/base", + "_model_module_version": "1.2.0", + "_model_name": "LayoutModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "LayoutView", + "align_content": null, + "align_items": null, + "align_self": null, + "border": null, + "bottom": null, + "display": null, + "flex": null, + "flex_flow": null, + "grid_area": null, + "grid_auto_columns": null, + "grid_auto_flow": null, + "grid_auto_rows": null, + "grid_column": null, + "grid_gap": null, + "grid_row": null, + "grid_template_areas": null, + "grid_template_columns": null, + "grid_template_rows": null, + "height": null, + "justify_content": null, + "justify_items": null, + "left": null, + "margin": null, + "max_height": null, + "max_width": null, + "min_height": null, + 
"min_width": null, + "object_fit": null, + "object_position": null, + "order": null, + "overflow": null, + "overflow_x": null, + "overflow_y": null, + "padding": null, + "right": null, + "top": null, + "visibility": null, + "width": null + } + }, + "f9456ff5134242bc9541d9d60c753384": { + "model_module": "@jupyter-widgets/controls", + "model_module_version": "1.5.0", + "model_name": "DescriptionStyleModel", + "state": { + "_model_module": "@jupyter-widgets/controls", + "_model_module_version": "1.5.0", + "_model_name": "DescriptionStyleModel", + "_view_count": null, + "_view_module": "@jupyter-widgets/base", + "_view_module_version": "1.2.0", + "_view_name": "StyleView", + "description_width": "" + } + } + } + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/C4W3_HF_Lab2_QA_BERT.ipynb b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/C4W3_HF_Lab2_QA_BERT.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..1164f17406ee14009277dc7ac1784bf51e82d187 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/C4W3_HF_Lab2_QA_BERT.ipynb @@ -0,0 +1,644 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "id": "u2UXutvEvpUj" + }, + "source": [ + "# Question Answering with BERT and HuggingFace 🤗 (Fine-tuning)\n", + "\n", + "In the previous Hugging Face ungraded lab, you saw how to use the pipeline objects to use transformer models for NLP tasks. In that lab, the model didn't output the desired answers to a series of precise questions for a context related to the history of comic books.\n", + "\n", + "In this lab, you will fine-tune the model from that lab to give better answers for that type of context. To do that, you'll be using the [TyDi QA dataset](https://ai.google.com/research/tydiqa) but on a filtered version with only English examples. 
Additionally, you will use a lot of the tools that Hugging Face has to offer.\n", + "\n", + "Note that, in general, you fine-tune general-purpose transformer models to work on specific tasks. However, fine-tuning a general-purpose model can take a lot of time, which is why you will start from the model used by the question answering pipeline in this lab.\n", + "\n", + "Begin by importing some libraries and/or objects you will use throughout the lab:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "\n", + "import numpy as np\n", + "\n", + "from datasets import load_from_disk\n", + "from transformers import AutoTokenizer, AutoModelForQuestionAnswering, Trainer, TrainingArguments\n", + "\n", + "from sklearn.metrics import f1_score" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "FrEglXPmvpUr" + }, + "source": [ + "## Fine-tuning a BERT model\n", + "\n", + "As you saw in the previous lab, you can use these pipelines as they are. But sometimes, you'll need something more specific to your problem, or maybe you need it to perform better on your production data.
In these cases, you'll need to fine-tune a model.\n", + "\n", + "Here, you'll fine-tune a pre-trained DistilBERT model on the TyDi QA dataset.\n", + "\n", + "To fine-tune your model, you will leverage three components provided by Hugging Face:\n", + "\n", + "* Datasets: Library that contains some datasets and different metrics to evaluate the performance of your models.\n", + "* Tokenizer: Object in charge of preprocessing your text to be given as input to the transformer models.\n", + "* Transformers: Library with the pre-trained model checkpoints and the trainer object.\n", + "\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "g0Rg-e4jBFFs" + }, + "source": [ + "### Datasets\n", + "\n", + "To get the dataset to fine-tune your model, you will use [🤗 Datasets](https://huggingface.co/docs/datasets/), a lightweight and extensible library to easily share and access datasets and evaluation metrics for NLP. You can download Hugging Face datasets directly using the `load_dataset` function from the `datasets` library.\n", + "\n", + "Hugging Face `datasets` allows you to load data in several formats, such as CSV, JSON, text files, and even Parquet. You can see more about the supported formats in the [documentation](https://huggingface.co/docs/datasets/loading).\n", + "\n", + "A common approach is to use `load_dataset` and get the full dataset, but **for this lab you will use a filtered version containing only the English examples**, which is already saved in this environment.
Since this filtered dataset is saved using the Apache Arrow format, you can read it by using the `load_from_disk` function.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "x68dqaoXg5Ra" + }, + "outputs": [], + "source": [ + "# The path where the dataset is stored\n", + "path = './tydiqa_data/'\n", + "\n", + "# Load the dataset\n", + "tydiqa_data = load_from_disk(path)\n", + "\n", + "tydiqa_data" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "1hfzBZU3T47O" + }, + "source": [ + "\n", + "You can check below that the type of the loaded dataset is a `datasets.arrow_dataset.Dataset`. This object type corresponds to an Apache Arrow Table, which keeps a table of the positions in memory where the data is stored instead of loading the complete dataset into memory. But you don't have to worry too much about that. It is just an efficient way to work with lots of data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "gkeppC3GQiW6" + }, + "outputs": [], + "source": [ + "# Checking the object type for one of the elements in the dataset\n", + "type(tydiqa_data['train'])" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "q_HLaNtQaFlR" + }, + "source": [ + "You can also check the structure of the dataset:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "2l9ANJTrbP-U" + }, + "outputs": [], + "source": [ + "tydiqa_data['train']" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "2xRO1yIkvpUt" + }, + "source": [ + "You can see that each example is like a dictionary object. This dataset consists of questions, contexts, and indices that point to the start and end positions of the answer inside the context. You can access these indices using the `annotations` key, which behaves like a dictionary."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "KNVpW6lADk92" + }, + "outputs": [], + "source": [ + "idx = 600\n", + "\n", + "# start index\n", + "start_index = tydiqa_data['train'][idx]['annotations']['minimal_answers_start_byte'][0]\n", + "\n", + "# end index\n", + "end_index = tydiqa_data['train'][idx]['annotations']['minimal_answers_end_byte'][0]\n", + "\n", + "print(f\"Question: {tydiqa_data['train'][idx]['question_text']}\")\n", + "print(f\"\\nContext (truncated): {tydiqa_data['train'][idx]['document_plaintext'][0:512]} ...\")\n", + "print(f\"\\nAnswer: {tydiqa_data['train'][idx]['document_plaintext'][start_index:end_index]}\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Z-lZgDTEYm74" + }, + "source": [ + "The question answering model predicts start and end positions in the context, and the text between them is extracted as the answer. That's why this NLP task is known as extractive question answering.\n", + "\n", + "To train your model, you need to pass the start and end positions as labels. So, you need to implement a function that extracts the start and end positions from the dataset.\n", + "\n", + "The dataset contains unanswerable questions. For these, the start and end indices for the answer are equal to `-1`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "Ty_QDcdKYw9a" + }, + "outputs": [], + "source": [ + "tydiqa_data['train'][0]['annotations']" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "lHWcNMudcAuO" + }, + "source": [ + "Now, you have to flatten the dataset so you can work with a table structure instead of a nested dictionary structure. This simplifies the pre-processing steps."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "xDCAQQtoCs_r" + }, + "outputs": [], + "source": [ + "# Flattening the datasets\n", + "flattened_train_data = tydiqa_data['train'].flatten()\n", + "flattened_test_data = tydiqa_data['validation'].flatten()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "q5wUa5xED0fK" + }, + "source": [ + "Also, to make the training more straightforward and faster, you will extract a subset of the train and test datasets. For that purpose, you will use the `select()` method of the Hugging Face `Dataset` object. This method allows you to take data points by their index. Here, you will select the first 3000 rows; you can play with the number of data points, but keep in mind that a larger subset will increase the training time." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "BkcIhpEnDHSJ" + }, + "outputs": [], + "source": [ + "# Selecting a subset of the train dataset\n", + "flattened_train_data = flattened_train_data.select(range(3000))\n", + "\n", + "# Selecting a subset of the test dataset\n", + "flattened_test_data = flattened_test_data.select(range(1000))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "fBXrmwXhc13M" + }, + "source": [ + "### Tokenizers\n", + "\n", + "Now, you will use the [tokenizer](https://huggingface.co/transformers/main_classes/tokenizer.html) object from Hugging Face. You can load a tokenizer using different methods. Here, you will retrieve it from the same checkpoint as the pipeline object you created in the previous Hugging Face lab. With this tokenizer, you can ensure that the tokens you get for the dataset will match the tokens used in the original DistilBERT implementation.\n", + "\n", + "When loading a tokenizer with any method, you must pass the model checkpoint that you want to fine-tune.
Here, you are using the `'distilbert-base-cased-distilled-squad'` checkpoint.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "LInV3b_HyAIF" + }, + "outputs": [], + "source": [ + "# Load the tokenizer using AutoTokenizer from the transformers library\n", + "tokenizer = AutoTokenizer.from_pretrained(\"distilbert-base-cased-distilled-squad\")\n", + "\n", + "# Define the max length of sequences in the tokenizer\n", + "tokenizer.model_max_length = 512" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "qz6YtVcOh3qP" + }, + "source": [ + "Given the characteristics of the dataset and the question-answering task, you will need to add some steps to pre-process the data after the tokenization:\n", + "\n", + "1. When there is no answer to a question given a context, you will use the `CLS` token, a unique token used to represent the start of the sequence.\n", + "\n", + "2. Tokenizers can split a given string into substrings, resulting in a subtoken for each substring. This creates a misalignment between the character-level answer indices in the dataset and the token positions produced by the tokenizer. Therefore, you will need to align the start and end indices with the tokens associated with the target answer.\n", + "\n", + "3. Finally, a tokenizer can truncate a very long sequence. So, if the start/end position of an answer is `None`, you will assume that it was truncated and assign the maximum length of the tokenizer to those positions.\n", + "\n", + "Those three steps are done within the `process_samples` function defined below."
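+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To get a feel for the alignment in step 2, here is a small, optional check: tokenize a short context/question pair and use the tokenizer's `char_to_token` method to find which token covers a given character position in the context. The sentence below is only an illustrative example, using the `tokenizer` you just loaded; the full alignment logic lives in the `process_samples` function." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Optional sketch: map a character position in the context to its token index\n", + "example = tokenizer(\"Superman debuted in 1938.\", \"When did Superman debut?\")\n", + "\n", + "# Character 20 of the context falls inside '1938' (the context is the first sequence)\n", + "token_index = example.char_to_token(20)\n", + "print(token_index, tokenizer.convert_ids_to_tokens(example['input_ids'])[token_index])"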
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "3l-r4wI06LU7" + }, + "outputs": [], + "source": [ + "# Processing samples using the 3 steps described above\n", + "def process_samples(sample):\n", + " tokenized_data = tokenizer(sample['document_plaintext'], sample['question_text'], truncation=\"only_first\", padding=\"max_length\")\n", + "\n", + " input_ids = tokenized_data[\"input_ids\"]\n", + "\n", + " # We will label impossible answers with the index of the CLS token.\n", + " cls_index = input_ids.index(tokenizer.cls_token_id)\n", + "\n", + " # If no answers are given, set the cls_index as answer.\n", + " if sample[\"annotations.minimal_answers_start_byte\"][0] == -1:\n", + " start_position = cls_index\n", + " end_position = cls_index\n", + " else:\n", + " # Start/end character index of the answer in the text.\n", + " gold_text = sample[\"document_plaintext\"][sample['annotations.minimal_answers_start_byte'][0]:sample['annotations.minimal_answers_end_byte'][0]]\n", + " start_char = sample[\"annotations.minimal_answers_start_byte\"][0]\n", + " end_char = sample['annotations.minimal_answers_end_byte'][0] #start_char + len(gold_text)\n", + "\n", + " # sometimes answers are off by a character or two – fix this\n", + " if sample['document_plaintext'][start_char-1:end_char-1] == gold_text:\n", + " start_char = start_char - 1\n", + " end_char = end_char - 1 # When the gold label is off by one character\n", + " elif sample['document_plaintext'][start_char-2:end_char-2] == gold_text:\n", + " start_char = start_char - 2\n", + " end_char = end_char - 2 # When the gold label is off by two characters\n", + "\n", + " start_token = tokenized_data.char_to_token(start_char)\n", + " end_token = tokenized_data.char_to_token(end_char - 1)\n", + "\n", + " # if start position is None, the answer passage has been truncated\n", + " if start_token is None:\n", + " start_token = tokenizer.model_max_length\n", + " if end_token is None:\n", + " 
            end_token = tokenizer.model_max_length\n", + "\n", + "        start_position = start_token\n", + "        end_position = end_token\n", + "\n", + "    return {'input_ids': tokenized_data['input_ids'],\n", + "            'attention_mask': tokenized_data['attention_mask'],\n", + "            'start_positions': start_position,\n", + "            'end_positions': end_position}\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Q3LAsWSyk_Rm" + }, + "source": [ + "To apply the `process_samples` function defined above to the whole dataset, you can use the `map` method as follows:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "rGbYd7QnFetG" + }, + "outputs": [], + "source": [ + "# Tokenizing and processing the flattened dataset\n", + "processed_train_data = flattened_train_data.map(process_samples)\n", + "processed_test_data = flattened_test_data.map(process_samples)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "wCpPhYKJluMA" + }, + "source": [ + "# Transformers\n", + "\n", + "The last Hugging Face component that is useful for fine-tuning a transformer is the collection of pre-trained models, which you can access in multiple ways.\n", + "\n", + "For this lab, you will use the same model from the question-answering pipeline that you loaded in the previous lab." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "jR3VqjNc1Vb3" + }, + "outputs": [], + "source": [ + "# Load the pre-trained model with AutoModelForQuestionAnswering. You will only fine-tune the head of the model\n", + "model = AutoModelForQuestionAnswering.from_pretrained(\"distilbert-base-cased-distilled-squad\")" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "K29BYtnsm1yH" + }, + "source": [ + "Now, you can take the necessary columns from the datasets to train/test and return them as PyTorch tensors." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "0X14G89noLfW" + }, + "outputs": [], + "source": [ + "columns_to_return = ['input_ids', 'attention_mask', 'start_positions', 'end_positions']\n", + "\n", + "processed_train_data.set_format(type='pt', columns=columns_to_return)\n", + "processed_test_data.set_format(type='pt', columns=columns_to_return)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "yjoUFWu_nLRq" + }, + "source": [ + "Here, you are given the F1 score as a metric to evaluate your model's performance. For simplicity, it is computed separately over the start and end positions predicted by the model. If you want to dig deeper into other metrics that can be used for a question answering task, you can also check [this colab notebook resource](https://colab.research.google.com/github/huggingface/notebooks/blob/master/examples/question_answering.ipynb) from the Hugging Face team." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "xcW2wPnirsJk" + }, + "outputs": [], + "source": [ + "def compute_f1_metrics(pred):\n", + "    start_labels = pred.label_ids[0]\n", + "    start_preds = pred.predictions[0].argmax(-1)\n", + "    end_labels = pred.label_ids[1]\n", + "    end_preds = pred.predictions[1].argmax(-1)\n", + "\n", + "    f1_start = f1_score(start_labels, start_preds, average='macro')\n", + "    f1_end = f1_score(end_labels, end_preds, average='macro')\n", + "\n", + "    return {\n", + "        'f1_start': f1_start,\n", + "        'f1_end': f1_end,\n", + "    }" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "KuhASU4evpUu" + }, + "source": [ + "Now, you will use the Hugging Face [Trainer](https://huggingface.co/transformers/main_classes/trainer.html) to fine-tune your model." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "colab": { + "background_save": true + }, + "id": "nxyOwf5utXAt" + }, + "outputs": [], + "source": [ + "# Training hyperparameters\n", + "training_args = TrainingArguments(\n", + " output_dir='model_results', # output directory\n", + " overwrite_output_dir=True,\n", + " num_train_epochs=3, # total number of training epochs\n", + " per_device_train_batch_size=8, # batch size per device during training\n", + " per_device_eval_batch_size=8, # batch size for evaluation\n", + " warmup_steps=20, # number of warmup steps for learning rate scheduler\n", + " weight_decay=0.01, # strength of weight decay\n", + " logging_steps=50\n", + ")\n", + "\n", + "# Trainer object\n", + "trainer = Trainer(\n", + " model=model, # the instantiated 🤗 Transformers model to be trained\n", + " args=training_args, # training arguments, defined above\n", + " train_dataset=processed_train_data, # training dataset\n", + " eval_dataset=processed_test_data, # evaluation dataset\n", + " compute_metrics=compute_f1_metrics\n", + ")\n", + "\n", + "# Training loop\n", + "trainer.train()" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "Ic_wNlBHCRMn" + }, + "source": [ + "And, in the next cell, you can evaluate the fine-tuned model's performance on the test set." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "id": "92N11A076wRA" + }, + "outputs": [], + "source": [ + "trainer.evaluate(processed_test_data)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "_HubPkRbnzh_" + }, + "source": [ + "### Using your Fine-Tuned Model\n", + "\n", + "After training and evaluating your fine-tuned model, you can check its results for the same questions from the previous lab.\n", + "\n", + "For that, you will tell Pytorch to use your GPU or your CPU to run the model. Additionally, you will need to tokenize your input context and questions. 
Finally, you need to post-process the output results to transform them from tokens to human-readable strings using the `tokenizer`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "text = r\"\"\"\n", + "The Golden Age of Comic Books describes an era of American comic books from the\n", + "late 1930s to circa 1950. During this time, modern comic books were first published\n", + "and rapidly increased in popularity. The superhero archetype was created and many\n", + "well-known characters were introduced, including Superman, Batman, Captain Marvel\n", + "(later known as SHAZAM!), Captain America, and Wonder Woman.\n", + "Between 1939 and 1941 Detective Comics and its sister company, All-American Publications,\n", + "introduced popular superheroes such as Batman and Robin, Wonder Woman, the Flash,\n", + "Green Lantern, Doctor Fate, the Atom, Hawkman, Green Arrow and Aquaman.[7] Timely Comics,\n", + "the 1940s predecessor of Marvel Comics, had million-selling titles featuring the Human Torch,\n", + "the Sub-Mariner, and Captain America.[8]\n", + "As comic books grew in popularity, publishers began launching titles that expanded\n", + "into a variety of genres. 
Dell Comics' non-superhero characters (particularly the\n", + "licensed Walt Disney animated-character comics) outsold the superhero comics of the day.[12]\n", + "The publisher featured licensed movie and literary characters such as Mickey Mouse, Donald Duck,\n", + "Roy Rogers and Tarzan.[13] It was during this era that noted Donald Duck writer-artist\n", + "Carl Barks rose to prominence.[14] Additionally, MLJ's introduction of Archie Andrews\n", + "in Pep Comics #22 (December 1941) gave rise to teen humor comics,[15] with the Archie\n", + "Andrews character remaining in print well into the 21st century.[16]\n", + "At the same time in Canada, American comic books were prohibited importation under\n", + "the War Exchange Conservation Act[17] which restricted the importation of non-essential\n", + "goods. As a result, a domestic publishing industry flourished during the duration\n", + "of the war which were collectively informally called the Canadian Whites.\n", + "The educational comic book Dagwood Splits the Atom used characters from the comic\n", + "strip Blondie.[18] According to historian Michael A. 
Amundson, appealing comic-book\n", + "characters helped ease young readers' fear of nuclear war and neutralize anxiety\n", + "about the questions posed by atomic power.[19] It was during this period that long-running\n", + "humor comics debuted, including EC's Mad and Carl Barks' Uncle Scrooge in Dell's Four\n", + "Color Comics (both in 1952).[20][21]\n", + "\"\"\"\n", + "\n", + "questions = [\"What superheroes were introduced between 1939 and 1941 by Detective Comics and its sister company?\",\n", + "             \"What comic book characters were created between 1939 and 1941?\",\n", + "             \"What well-known characters were created between 1939 and 1941?\",\n", + "             \"What well-known superheroes were introduced between 1939 and 1941 by Detective Comics?\"]\n", + "\n", + "for question in questions:\n", + "    inputs = tokenizer.encode_plus(question, text, return_tensors=\"pt\")\n", + "\n", + "    input_ids = inputs[\"input_ids\"].tolist()[0]\n", + "\n", + "    # Move the inputs to the same device as the model (GPU if available, otherwise CPU)\n", + "    inputs = inputs.to(model.device)\n", + "\n", + "    text_tokens = tokenizer.convert_ids_to_tokens(input_ids)\n", + "    answer_model = model(**inputs)\n", + "\n", + "    # Get the most likely beginning of the answer with the argmax of the start logits\n", + "    start_logits = answer_model['start_logits'].cpu().detach().numpy()\n", + "    answer_start = np.argmax(start_logits)\n", + "\n", + "    # Get the most likely end of the answer with the argmax of the end logits\n", + "    end_logits = answer_model['end_logits'].cpu().detach().numpy()\n", + "    answer_end = np.argmax(end_logits) + 1\n", + "\n", + "    answer = tokenizer.convert_tokens_to_string(tokenizer.convert_ids_to_tokens(input_ids[answer_start:answer_end]))\n", + "\n", + "    print(f\"Question: {question}\")\n", + "    print(f\"Answer: {answer}\\n\")\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "_yTDQ6kn6pWS" + }, + "source": [ + "By fine-tuning the model for only 3 epochs you can already see an improvement!\n", + "\n", + "You can compare those results with those obtained using the base model (without 
fine-tuning), as you did in the previous lab. As a reminder, here are those results:\n", + "\n", + "```\n", + "What popular superheroes were introduced between 1939 and 1941?\n", + ">> teen humor comics\n", + "What superheroes were introduced between 1939 and 1941 by Detective Comics and its sister company?\n", + ">> Archie Andrews\n", + "What comic book characters were created between 1939 and 1941?\n", + ">> Archie\n", + "Andrews\n", + "What well-known characters were created between 1939 and 1941?\n", + ">> Archie\n", + "Andrews\n", + "What well-known superheroes were introduced between 1939 and 1941 by Detective Comics?\n", + ">> Archie Andrews\n", + "```" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "id": "uf-v8mUSLqXN" + }, + "source": [ + "**Congratulations!**\n", + "\n", + "You have finished this series of ungraded labs. You were able to:\n", + "\n", + "* Explore the Hugging Face Pipelines, which can be used right out of the box.\n", + "\n", + "* Fine-tune a model for the Extractive Question & Answering task.\n", + "\n", + "We also recommend you go through the free [Hugging Face course](https://huggingface.co/course/chapter1) to explore their ecosystem in more detail and find different ways to use the `transformers` library."
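+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As an optional final step, you can save your fine-tuned model and tokenizer so you can reload them later with `from_pretrained`. The directory name below is only an example; choose any path you like." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Optional: persist the fine-tuned model and tokenizer (the directory name is an arbitrary example)\n", + "save_dir = 'fine_tuned_distilbert_tydiqa'\n", + "model.save_pretrained(save_dir)\n", + "tokenizer.save_pretrained(save_dir)"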
+ ] + } + ], + "metadata": { + "accelerator": "GPU", + "colab": { + "provenance": [] + }, + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.10" + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/config.json b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/config.json new file mode 100644 index 0000000000000000000000000000000000000000..41b2befbb5d0e14dead1bcc87b900d49098d7079 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/config.json @@ -0,0 +1,22 @@ +{ + "activation": "gelu", + "architectures": [ + "DistilBertForQuestionAnswering" + ], + "attention_dropout": 0.1, + "dim": 768, + "dropout": 0.1, + "hidden_dim": 3072, + "initializer_range": 0.02, + "max_position_embeddings": 512, + "model_type": "distilbert", + "n_heads": 12, + "n_layers": 6, + "output_past": true, + "pad_token_id": 0, + "qa_dropout": 0.1, + "seq_classif_dropout": 0.2, + "sinusoidal_pos_embds": true, + "tie_weights_": true, + "vocab_size": 28996 +} diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/model.safetensors b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/model.safetensors new file mode 100644 index 0000000000000000000000000000000000000000..5f88965095b4f481ebbe669be77fcfdd6345ac95 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/model.safetensors @@ -0,0 +1,3 @@ 
+version https://git-lfs.github.com/spec/v1 +oid sha256:f198de8ef6e40aeccd6eaa86e34dde73c3bb4bf0e54003cd182a18c29a1811db +size 260782156 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/tokenizer.json b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/tokenizer.json new file mode 100644 index 0000000000000000000000000000000000000000..3506cd531024c4fad2649aec20f7aa2020bca693 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/tokenizer.json @@ -0,0 +1 @@ +{"version":"1.0","truncation":null,"padding":null,"added_tokens":[{"id":0,"special":true,"content":"[PAD]","single_word":false,"lstrip":false,"rstrip":false,"normalized":false},{"id":100,"special":true,"content":"[UNK]","single_word":false,"lstrip":false,"rstrip":false,"normalized":false},{"id":101,"special":true,"content":"[CLS]","single_word":false,"lstrip":false,"rstrip":false,"normalized":false},{"id":102,"special":true,"content":"[SEP]","single_word":false,"lstrip":false,"rstrip":false,"normalized":false},{"id":103,"special":true,"content":"[MASK]","single_word":false,"lstrip":false,"rstrip":false,"normalized":false}],"normalizer":{"type":"BertNormalizer","clean_text":true,"handle_chinese_chars":true,"strip_accents":null,"lowercase":false},"pre_tokenizer":{"type":"BertPreTokenizer"},"post_processor":{"type":"TemplateProcessing","single":[{"SpecialToken":{"id":"[CLS]","type_id":0}},{"Sequence":{"id":"A","type_id":0}},{"SpecialToken":{"id":"[SEP]","type_id":0}}],"pair":[{"SpecialToken":{"id":"[CLS]","type_id":0}},{"Sequence":{"id":"A","type_id":0}},{"SpecialToken":{"id":"[SEP]","type_id":0}},{"Sequence":{"id":"B","type_id":1}},{"SpecialToken":{"id":"[SEP]","type_id":1}}],"special_tokens":{"[CLS]":{"id":"[CLS]","ids":[101],"tokens":["[CLS]"]},"[SEP]":{"id":"[SEP]","ids":[102],"tokens":["[SEP]"]}}},"decoder":{"type":"WordPiece","pre
fix":"##","cleanup":true},"model":{"unk_token":"[UNK]","continuing_subword_prefix":"##","max_input_chars_per_word":100,"vocab":{"[PAD]":0,"[unused1]":1,"[unused2]":2,"[unused3]":3,"[unused4]":4,"[unused5]":5,"[unused6]":6,"[unused7]":7,"[unused8]":8,"[unused9]":9,"[unused10]":10,"[unused11]":11,"[unused12]":12,"[unused13]":13,"[unused14]":14,"[unused15]":15,"[unused16]":16,"[unused17]":17,"[unused18]":18,"[unused19]":19,"[unused20]":20,"[unused21]":21,"[unused22]":22,"[unused23]":23,"[unused24]":24,"[unused25]":25,"[unused26]":26,"[unused27]":27,"[unused28]":28,"[unused29]":29,"[unused30]":30,"[unused31]":31,"[unused32]":32,"[unused33]":33,"[unused34]":34,"[unused35]":35,"[unused36]":36,"[unused37]":37,"[unused38]":38,"[unused39]":39,"[unused40]":40,"[unused41]":41,"[unused42]":42,"[unused43]":43,"[unused44]":44,"[unused45]":45,"[unused46]":46,"[unused47]":47,"[unused48]":48,"[unused49]":49,"[unused50]":50,"[unused51]":51,"[unused52]":52,"[unused53]":53,"[unused54]":54,"[unused55]":55,"[unused56]":56,"[unused57]":57,"[unused58]":58,"[unused59]":59,"[unused60]":60,"[unused61]":61,"[unused62]":62,"[unused63]":63,"[unused64]":64,"[unused65]":65,"[unused66]":66,"[unused67]":67,"[unused68]":68,"[unused69]":69,"[unused70]":70,"[unused71]":71,"[unused72]":72,"[unused73]":73,"[unused74]":74,"[unused75]":75,"[unused76]":76,"[unused77]":77,"[unused78]":78,"[unused79]":79,"[unused80]":80,"[unused81]":81,"[unused82]":82,"[unused83]":83,"[unused84]":84,"[unused85]":85,"[unused86]":86,"[unused87]":87,"[unused88]":88,"[unused89]":89,"[unused90]":90,"[unused91]":91,"[unused92]":92,"[unused93]":93,"[unused94]":94,"[unused95]":95,"[unused96]":96,"[unused97]":97,"[unused98]":98,"[unused99]":99,"[UNK]":100,"[CLS]":101,"[SEP]":102,"[MASK]":103,"[unused100]":104,"[unused101]":105,"!":106,"\"":107,"#":108,"$":109,"%":110,"&":111,"'":112,"(":113,")":114,"*":115,"+":116,",":117,"-":118,".":119,"/":120,"0":121,"1":122,"2":123,"3":124,"4":125,"5":126,"6":127,"7":128,"8":129,"9":130,":":131,";
":132,"<":133,"=":134,">":135,"?":136,"@":137,"A":138,"B":139,"C":140,"D":141,"E":142,"F":143,"G":144,"H":145,"I":146,"J":147,"K":148,"L":149,"M":150,"N":151,"O":152,"P":153,"Q":154,"R":155,"S":156,"T":157,"U":158,"V":159,"W":160,"X":161,"Y":162,"Z":163,"[":164,"\\":165,"]":166,"^":167,"_":168,"`":169,"a":170,"b":171,"c":172,"d":173,"e":174,"f":175,"g":176,"h":177,"i":178,"j":179,"k":180,"l":181,"m":182,"n":183,"o":184,"p":185,"q":186,"r":187,"s":188,"t":189,"u":190,"v":191,"w":192,"x":193,"y":194,"z":195,"{":196,"|":197,"}":198,"~":199,"¡":200,"¢":201,"£":202,"¥":203,"§":204,"¨":205,"©":206,"ª":207,"«":208,"¬":209,"®":210,"°":211,"±":212,"²":213,"³":214,"´":215,"µ":216,"¶":217,"·":218,"¹":219,"º":220,"»":221,"¼":222,"½":223,"¾":224,"¿":225,"À":226,"Á":227,"Â":228,"Ä":229,"Å":230,"Æ":231,"Ç":232,"È":233,"É":234,"Í":235,"Î":236,"Ñ":237,"Ó":238,"Ö":239,"×":240,"Ø":241,"Ú":242,"Ü":243,"Þ":244,"ß":245,"à":246,"á":247,"â":248,"ã":249,"ä":250,"å":251,"æ":252,"ç":253,"è":254,"é":255,"ê":256,"ë":257,"ì":258,"í":259,"î":260,"ï":261,"ð":262,"ñ":263,"ò":264,"ó":265,"ô":266,"õ":267,"ö":268,"÷":269,"ø":270,"ù":271,"ú":272,"û":273,"ü":274,"ý":275,"þ":276,"ÿ":277,"Ā":278,"ā":279,"ă":280,"ą":281,"Ć":282,"ć":283,"Č":284,"č":285,"ď":286,"Đ":287,"đ":288,"ē":289,"ė":290,"ę":291,"ě":292,"ğ":293,"ġ":294,"Ħ":295,"ħ":296,"ĩ":297,"Ī":298,"ī":299,"İ":300,"ı":301,"ļ":302,"Ľ":303,"ľ":304,"Ł":305,"ł":306,"ń":307,"ņ":308,"ň":309,"ŋ":310,"Ō":311,"ō":312,"ŏ":313,"ő":314,"Œ":315,"œ":316,"ř":317,"Ś":318,"ś":319,"Ş":320,"ş":321,"Š":322,"š":323,"Ţ":324,"ţ":325,"ť":326,"ũ":327,"ū":328,"ŭ":329,"ů":330,"ű":331,"ų":332,"ŵ":333,"ŷ":334,"ź":335,"Ż":336,"ż":337,"Ž":338,"ž":339,"Ə":340,"ƒ":341,"ơ":342,"ư":343,"ǎ":344,"ǐ":345,"ǒ":346,"ǔ":347,"ǫ":348,"Ș":349,"ș":350,"Ț":351,"ț":352,"ɐ":353,"ɑ":354,"ɔ":355,"ɕ":356,"ə":357,"ɛ":358,"ɡ":359,"ɣ":360,"ɨ":361,"ɪ":362,"ɲ":363,"ɾ":364,"ʀ":365,"ʁ":366,"ʂ":367,"ʃ":368,"ʊ":369,"ʋ":370,"ʌ":371,"ʐ":372,"ʑ":373,"ʒ":374,"ʔ":375,"ʰ":376,"ʲ":377,"ʳ":378,"ʷ":379,"ʻ":380,"ʼ":381,"
ʾ":382,"ʿ":383,"ˈ":384,"ː":385,"ˡ":386,"ˢ":387,"ˣ":388,"́":389,"̃":390,"̍":391,"̯":392,"͡":393,"Α":394,"Β":395,"Γ":396,"Δ":397,"Ε":398,"Η":399,"Θ":400,"Ι":401,"Κ":402,"Λ":403,"Μ":404,"Ν":405,"Ο":406,"Π":407,"Σ":408,"Τ":409,"Φ":410,"Χ":411,"Ψ":412,"Ω":413,"ά":414,"έ":415,"ή":416,"ί":417,"α":418,"β":419,"γ":420,"δ":421,"ε":422,"ζ":423,"η":424,"θ":425,"ι":426,"κ":427,"λ":428,"μ":429,"ν":430,"ξ":431,"ο":432,"π":433,"ρ":434,"ς":435,"σ":436,"τ":437,"υ":438,"φ":439,"χ":440,"ψ":441,"ω":442,"ό":443,"ύ":444,"ώ":445,"І":446,"Ј":447,"А":448,"Б":449,"В":450,"Г":451,"Д":452,"Е":453,"Ж":454,"З":455,"И":456,"К":457,"Л":458,"М":459,"Н":460,"О":461,"П":462,"Р":463,"С":464,"Т":465,"У":466,"Ф":467,"Х":468,"Ц":469,"Ч":470,"Ш":471,"Э":472,"Ю":473,"Я":474,"а":475,"б":476,"в":477,"г":478,"д":479,"е":480,"ж":481,"з":482,"и":483,"й":484,"к":485,"л":486,"м":487,"н":488,"о":489,"п":490,"р":491,"с":492,"т":493,"у":494,"ф":495,"х":496,"ц":497,"ч":498,"ш":499,"щ":500,"ъ":501,"ы":502,"ь":503,"э":504,"ю":505,"я":506,"ё":507,"і":508,"ї":509,"ј":510,"њ":511,"ћ":512,"Ա":513,"Հ":514,"ա":515,"ե":516,"ի":517,"կ":518,"մ":519,"յ":520,"ն":521,"ո":522,"ս":523,"տ":524,"ր":525,"ւ":526,"ְ":527,"ִ":528,"ֵ":529,"ֶ":530,"ַ":531,"ָ":532,"ֹ":533,"ּ":534,"א":535,"ב":536,"ג":537,"ד":538,"ה":539,"ו":540,"ז":541,"ח":542,"ט":543,"י":544,"כ":545,"ל":546,"ם":547,"מ":548,"ן":549,"נ":550,"ס":551,"ע":552,"פ":553,"צ":554,"ק":555,"ר":556,"ש":557,"ת":558,"،":559,"ء":560,"آ":561,"أ":562,"إ":563,"ئ":564,"ا":565,"ب":566,"ة":567,"ت":568,"ث":569,"ج":570,"ح":571,"خ":572,"د":573,"ذ":574,"ر":575,"ز":576,"س":577,"ش":578,"ص":579,"ض":580,"ط":581,"ظ":582,"ع":583,"غ":584,"ف":585,"ق":586,"ك":587,"ل":588,"م":589,"ن":590,"ه":591,"و":592,"ى":593,"ي":594,"َ":595,"ِ":596,"ٹ":597,"پ":598,"چ":599,"ک":600,"گ":601,"ہ":602,"ی":603,"ے":604,"ं":605,"आ":606,"क":607,"ग":608,"च":609,"ज":610,"ण":611,"त":612,"द":613,"ध":614,"न":615,"प":616,"ब":617,"भ":618,"म":619,"य":620,"र":621,"ल":622,"व":623,"श":624,"ष":625,"स":626,"ह":627,"ा":628,"ि":629,"ी":630,"ु":631,"
े":632,"ो":633,"्":634,"।":635,"॥":636,"আ":637,"ই":638,"এ":639,"ও":640,"ক":641,"খ":642,"গ":643,"চ":644,"ছ":645,"জ":646,"ট":647,"ত":648,"থ":649,"দ":650,"ধ":651,"ন":652,"প":653,"ব":654,"ম":655,"য":656,"র":657,"ল":658,"শ":659,"স":660,"হ":661,"়":662,"া":663,"ি":664,"ী":665,"ু":666,"ে":667,"ো":668,"্":669,"য়":670,"க":671,"த":672,"ப":673,"ம":674,"ய":675,"ர":676,"ல":677,"வ":678,"ா":679,"ி":680,"ு":681,"்":682,"ร":683,"་":684,"ག":685,"ང":686,"ད":687,"ན":688,"བ":689,"མ":690,"ར":691,"ལ":692,"ས":693,"ི":694,"ུ":695,"ེ":696,"ོ":697,"ა":698,"ე":699,"ი":700,"ლ":701,"ნ":702,"ო":703,"რ":704,"ს":705,"ᴬ":706,"ᴵ":707,"ᵀ":708,"ᵃ":709,"ᵇ":710,"ᵈ":711,"ᵉ":712,"ᵍ":713,"ᵏ":714,"ᵐ":715,"ᵒ":716,"ᵖ":717,"ᵗ":718,"ᵘ":719,"ᵢ":720,"ᵣ":721,"ᵤ":722,"ᵥ":723,"ᶜ":724,"ᶠ":725,"ḍ":726,"Ḥ":727,"ḥ":728,"Ḩ":729,"ḩ":730,"ḳ":731,"ṃ":732,"ṅ":733,"ṇ":734,"ṛ":735,"ṣ":736,"ṭ":737,"ạ":738,"ả":739,"ấ":740,"ầ":741,"ẩ":742,"ậ":743,"ắ":744,"ế":745,"ề":746,"ể":747,"ễ":748,"ệ":749,"ị":750,"ọ":751,"ố":752,"ồ":753,"ổ":754,"ộ":755,"ớ":756,"ờ":757,"ợ":758,"ụ":759,"ủ":760,"ứ":761,"ừ":762,"ử":763,"ữ":764,"ự":765,"ỳ":766,"ỹ":767,"ἀ":768,"ἐ":769,"ὁ":770,"ὐ":771,"ὰ":772,"ὶ":773,"ὸ":774,"ῆ":775,"ῖ":776,"ῦ":777,"ῶ":778,"‐":779,"‑":780,"‒":781,"–":782,"—":783,"―":784,"‖":785,"‘":786,"’":787,"‚":788,"“":789,"”":790,"„":791,"†":792,"‡":793,"•":794,"…":795,"‰":796,"′":797,"″":798,"⁄":799,"⁰":800,"ⁱ":801,"⁴":802,"⁵":803,"⁶":804,"⁷":805,"⁸":806,"⁹":807,"⁺":808,"⁻":809,"ⁿ":810,"₀":811,"₁":812,"₂":813,"₃":814,"₄":815,"₅":816,"₆":817,"₇":818,"₈":819,"₉":820,"₊":821,"₍":822,"₎":823,"ₐ":824,"ₑ":825,"ₒ":826,"ₓ":827,"ₕ":828,"ₖ":829,"ₘ":830,"ₙ":831,"ₚ":832,"ₛ":833,"ₜ":834,"₤":835,"€":836,"₱":837,"₹":838,"ℓ":839,"№":840,"ℝ":841,"⅓":842,"←":843,"↑":844,"→":845,"↔":846,"⇌":847,"⇒":848,"∂":849,"∈":850,"−":851,"∗":852,"∘":853,"√":854,"∞":855,"∧":856,"∨":857,"∩":858,"∪":859,"≈":860,"≠":861,"≡":862,"≤":863,"≥":864,"⊂":865,"⊆":866,"⊕":867,"⋅":868,"─":869,"│":870,"■":871,"●":872,"★":873,"☆":874,"☉":875,"♠":876,"♣":877,"♥":878,"♦":879,"♭":880,"♯":881,
"⟨":882,"⟩":883,"ⱼ":884,"、":885,"。":886,"《":887,"》":888,"「":889,"」":890,"『":891,"』":892,"〜":893,"い":894,"う":895,"え":896,"お":897,"か":898,"き":899,"く":900,"け":901,"こ":902,"さ":903,"し":904,"す":905,"せ":906,"そ":907,"た":908,"ち":909,"つ":910,"て":911,"と":912,"な":913,"に":914,"の":915,"は":916,"ひ":917,"ま":918,"み":919,"む":920,"め":921,"も":922,"や":923,"ゆ":924,"よ":925,"ら":926,"り":927,"る":928,"れ":929,"ん":930,"ア":931,"ィ":932,"イ":933,"ウ":934,"エ":935,"オ":936,"カ":937,"ガ":938,"キ":939,"ク":940,"グ":941,"コ":942,"サ":943,"シ":944,"ジ":945,"ス":946,"ズ":947,"タ":948,"ダ":949,"ッ":950,"テ":951,"デ":952,"ト":953,"ド":954,"ナ":955,"ニ":956,"ハ":957,"バ":958,"パ":959,"フ":960,"ブ":961,"プ":962,"マ":963,"ミ":964,"ム":965,"ャ":966,"ュ":967,"ラ":968,"リ":969,"ル":970,"レ":971,"ロ":972,"ン":973,"・":974,"ー":975,"一":976,"三":977,"上":978,"下":979,"中":980,"事":981,"二":982,"井":983,"京":984,"人":985,"亻":986,"仁":987,"佐":988,"侍":989,"光":990,"公":991,"力":992,"北":993,"十":994,"南":995,"原":996,"口":997,"史":998,"司":999,"吉":1000,"同":1001,"和":1002,"囗":1003,"国":1004,"國":1005,"土":1006,"城":1007,"士":1008,"大":1009,"天":1010,"太":1011,"夫":1012,"女":1013,"子":1014,"宀":1015,"安":1016,"宮":1017,"宿":1018,"小":1019,"尚":1020,"山":1021,"島":1022,"川":1023,"州":1024,"平":1025,"年":1026,"心":1027,"愛":1028,"戸":1029,"文":1030,"新":1031,"方":1032,"日":1033,"明":1034,"星":1035,"書":1036,"月":1037,"木":1038,"本":1039,"李":1040,"村":1041,"東":1042,"松":1043,"林":1044,"正":1045,"武":1046,"氏":1047,"水":1048,"氵":1049,"江":1050,"河":1051,"海":1052,"版":1053,"犬":1054,"王":1055,"生":1056,"田":1057,"白":1058,"皇":1059,"省":1060,"真":1061,"石":1062,"社":1063,"神":1064,"竹":1065,"美":1066,"義":1067,"花":1068,"藤":1069,"西":1070,"谷":1071,"車":1072,"辶":1073,"道":1074,"郎":1075,"郡":1076,"部":1077,"野":1078,"金":1079,"長":1080,"門":1081,"陽":1082,"青":1083,"食":1084,"馬":1085,"高":1086,"龍":1087,"龸":1088,"사":1089,"씨":1090,"의":1091,"이":1092,"한":1093,"fi":1094,"fl":1095,"!":1096,"(":1097,")":1098,",":1099,"-":1100,"/":1101,":":1102,"the":1103,"of":1104,"and":1105,"to":1106,"in":1107,"was":1108,"The":1109,"is":1110,"for":1111,"as":1112,"on":1113,"with":1114,
"that":1115,"##s":1116,"his":1117,"by":1118,"he":1119,"at":1120,"from":1121,"it":1122,"her":1123,"He":1124,"had":1125,"an":1126,"were":1127,"you":1128,"be":1129,"In":1130,"she":1131,"are":1132,"but":1133,"which":1134,"It":1135,"not":1136,"or":1137,"have":1138,"my":1139,"him":1140,"one":1141,"this":1142,"me":1143,"has":1144,"also":1145,"up":1146,"their":1147,"first":1148,"out":1149,"who":1150,"been":1151,"they":1152,"She":1153,"into":1154,"all":1155,"would":1156,"its":1157,"##ing":1158,"time":1159,"two":1160,"##a":1161,"##e":1162,"said":1163,"about":1164,"when":1165,"over":1166,"more":1167,"other":1168,"can":1169,"after":1170,"back":1171,"them":1172,"then":1173,"##ed":1174,"there":1175,"like":1176,"so":1177,"only":1178,"##n":1179,"could":1180,"##d":1181,"##i":1182,"##y":1183,"what":1184,"no":1185,"##o":1186,"where":1187,"This":1188,"made":1189,"than":1190,"if":1191,"You":1192,"##ly":1193,"through":1194,"we":1195,"before":1196,"##r":1197,"just":1198,"some":1199,"##er":1200,"years":1201,"do":1202,"New":1203,"##t":1204,"down":1205,"between":1206,"new":1207,"now":1208,"will":1209,"three":1210,"most":1211,"On":1212,"around":1213,"year":1214,"used":1215,"such":1216,"being":1217,"well":1218,"during":1219,"They":1220,"know":1221,"against":1222,"under":1223,"later":1224,"did":1225,"part":1226,"known":1227,"off":1228,"while":1229,"His":1230,"re":1231,"...":1232,"##l":1233,"people":1234,"until":1235,"way":1236,"American":1237,"didn":1238,"University":1239,"your":1240,"both":1241,"many":1242,"get":1243,"United":1244,"became":1245,"head":1246,"There":1247,"second":1248,"As":1249,"work":1250,"any":1251,"But":1252,"still":1253,"again":1254,"born":1255,"even":1256,"eyes":1257,"After":1258,"including":1259,"de":1260,"took":1261,"And":1262,"long":1263,"team":1264,"season":1265,"family":1266,"see":1267,"right":1268,"same":1269,"called":1270,"name":1271,"because":1272,"film":1273,"don":1274,"10":1275,"found":1276,"much":1277,"school":1278,"##es":1279,"going":1280,"won":1281,"place":1282
,"away":1283,"We":1284,"day":1285,"left":1286,"John":1287,"000":1288,"hand":1289,"since":1290,"World":1291,"these":1292,"how":1293,"make":1294,"number":1295,"each":1296,"life":1297,"area":1298,"man":1299,"four":1300,"go":1301,"No":1302,"here":1303,"very":1304,"National":1305,"##m":1306,"played":1307,"released":1308,"never":1309,"began":1310,"States":1311,"album":1312,"home":1313,"last":1314,"too":1315,"held":1316,"several":1317,"May":1318,"own":1319,"##on":1320,"take":1321,"end":1322,"School":1323,"##h":1324,"ll":1325,"series":1326,"What":1327,"want":1328,"use":1329,"another":1330,"city":1331,"When":1332,"2010":1333,"side":1334,"At":1335,"may":1336,"That":1337,"came":1338,"face":1339,"June":1340,"think":1341,"game":1342,"those":1343,"high":1344,"March":1345,"early":1346,"September":1347,"##al":1348,"2011":1349,"looked":1350,"July":1351,"state":1352,"small":1353,"thought":1354,"went":1355,"January":1356,"October":1357,"##u":1358,"based":1359,"August":1360,"##us":1361,"world":1362,"good":1363,"April":1364,"York":1365,"us":1366,"12":1367,"2012":1368,"2008":1369,"For":1370,"2009":1371,"group":1372,"along":1373,"few":1374,"South":1375,"little":1376,"##k":1377,"following":1378,"November":1379,"something":1380,"2013":1381,"December":1382,"set":1383,"2007":1384,"old":1385,"2006":1386,"2014":1387,"located":1388,"##an":1389,"music":1390,"County":1391,"City":1392,"former":1393,"##in":1394,"room":1395,"ve":1396,"next":1397,"All":1398,"##man":1399,"got":1400,"father":1401,"house":1402,"##g":1403,"body":1404,"15":1405,"20":1406,"18":1407,"started":1408,"If":1409,"2015":1410,"town":1411,"our":1412,"line":1413,"War":1414,"large":1415,"population":1416,"named":1417,"British":1418,"company":1419,"member":1420,"five":1421,"My":1422,"single":1423,"##en":1424,"age":1425,"State":1426,"moved":1427,"February":1428,"11":1429,"Her":1430,"should":1431,"century":1432,"government":1433,"built":1434,"come":1435,"best":1436,"show":1437,"However":1438,"within":1439,"look":1440,"men":1441,"door":14
42,"without":1443,"need":1444,"wasn":1445,"2016":1446,"water":1447,"One":1448,"system":1449,"knew":1450,"every":1451,"died":1452,"League":1453,"turned":1454,"asked":1455,"North":1456,"St":1457,"wanted":1458,"building":1459,"received":1460,"song":1461,"served":1462,"though":1463,"felt":1464,"##ia":1465,"station":1466,"band":1467,"##ers":1468,"local":1469,"public":1470,"himself":1471,"different":1472,"death":1473,"say":1474,"##1":1475,"30":1476,"##2":1477,"2005":1478,"16":1479,"night":1480,"behind":1481,"children":1482,"English":1483,"members":1484,"near":1485,"saw":1486,"together":1487,"son":1488,"14":1489,"voice":1490,"village":1491,"13":1492,"hands":1493,"help":1494,"##3":1495,"due":1496,"French":1497,"London":1498,"top":1499,"told":1500,"open":1501,"published":1502,"third":1503,"2017":1504,"play":1505,"across":1506,"During":1507,"put":1508,"final":1509,"often":1510,"include":1511,"25":1512,"##le":1513,"main":1514,"having":1515,"2004":1516,"once":1517,"ever":1518,"let":1519,"book":1520,"led":1521,"gave":1522,"late":1523,"front":1524,"find":1525,"club":1526,"##4":1527,"German":1528,"included":1529,"species":1530,"College":1531,"form":1532,"opened":1533,"mother":1534,"women":1535,"enough":1536,"West":1537,"must":1538,"2000":1539,"power":1540,"really":1541,"17":1542,"making":1543,"half":1544,"##6":1545,"order":1546,"might":1547,"##is":1548,"given":1549,"million":1550,"times":1551,"days":1552,"point":1553,"full":1554,"service":1555,"With":1556,"km":1557,"major":1558,"##7":1559,"original":1560,"become":1561,"seen":1562,"II":1563,"north":1564,"six":1565,"##te":1566,"love":1567,"##0":1568,"national":1569,"International":1570,"##5":1571,"24":1572,"So":1573,"District":1574,"lost":1575,"run":1576,"couldn":1577,"career":1578,"always":1579,"##9":1580,"2003":1581,"##th":1582,"country":1583,"##z":1584,"House":1585,"air":1586,"tell":1587,"south":1588,"worked":1589,"woman":1590,"player":1591,"##A":1592,"almost":1593,"war":1594,"River":1595,"##ic":1596,"married":1597,"continued":15
98,"Then":1599,"James":1600,"close":1601,"black":1602,"short":1603,"##8":1604,"##na":1605,"using":1606,"history":1607,"returned":1608,"light":1609,"car":1610,"##ra":1611,"sure":1612,"William":1613,"things":1614,"General":1615,"##ry":1616,"2002":1617,"better":1618,"support":1619,"100":1620,"among":1621,"From":1622,"feet":1623,"King":1624,"anything":1625,"21":1626,"19":1627,"established":1628,"district":1629,"2001":1630,"feel":1631,"great":1632,"##ton":1633,"level":1634,"Cup":1635,"These":1636,"written":1637,"games":1638,"others":1639,"already":1640,"title":1641,"story":1642,"##p":1643,"law":1644,"thing":1645,"US":1646,"record":1647,"role":1648,"however":1649,"By":1650,"students":1651,"England":1652,"white":1653,"control":1654,"least":1655,"inside":1656,"land":1657,"##C":1658,"22":1659,"give":1660,"community":1661,"hard":1662,"##ie":1663,"non":1664,"##c":1665,"produced":1666,"George":1667,"round":1668,"period":1669,"Park":1670,"business":1671,"various":1672,"##ne":1673,"does":1674,"present":1675,"wife":1676,"far":1677,"taken":1678,"per":1679,"reached":1680,"David":1681,"able":1682,"version":1683,"working":1684,"young":1685,"live":1686,"created":1687,"joined":1688,"East":1689,"living":1690,"appeared":1691,"case":1692,"High":1693,"done":1694,"23":1695,"important":1696,"President":1697,"Award":1698,"France":1699,"position":1700,"office":1701,"looking":1702,"total":1703,"general":1704,"class":1705,"To":1706,"production":1707,"##S":1708,"football":1709,"party":1710,"brother":1711,"keep":1712,"mind":1713,"free":1714,"Street":1715,"hair":1716,"announced":1717,"development":1718,"either":1719,"nothing":1720,"moment":1721,"Church":1722,"followed":1723,"wrote":1724,"why":1725,"India":1726,"San":1727,"election":1728,"1999":1729,"lead":1730,"How":1731,"##ch":1732,"##rs":1733,"words":1734,"European":1735,"course":1736,"considered":1737,"America":1738,"arms":1739,"Army":1740,"political":1741,"##la":1742,"28":1743,"26":1744,"west":1745,"east":1746,"ground":1747,"further":1748,"churc
h":1749,"less":1750,"site":1751,"First":1752,"Not":1753,"Australia":1754,"toward":1755,"California":1756,"##ness":1757,"described":1758,"works":1759,"An":1760,"Council":1761,"heart":1762,"past":1763,"military":1764,"27":1765,"##or":1766,"heard":1767,"field":1768,"human":1769,"soon":1770,"founded":1771,"1998":1772,"playing":1773,"trying":1774,"##x":1775,"##ist":1776,"##ta":1777,"television":1778,"mouth":1779,"although":1780,"taking":1781,"win":1782,"fire":1783,"Division":1784,"##ity":1785,"Party":1786,"Royal":1787,"program":1788,"Some":1789,"Don":1790,"Association":1791,"According":1792,"tried":1793,"TV":1794,"Paul":1795,"outside":1796,"daughter":1797,"Best":1798,"While":1799,"someone":1800,"match":1801,"recorded":1802,"Canada":1803,"closed":1804,"region":1805,"Air":1806,"above":1807,"months":1808,"elected":1809,"##da":1810,"##ian":1811,"road":1812,"##ar":1813,"brought":1814,"move":1815,"1997":1816,"leave":1817,"##um":1818,"Thomas":1819,"1996":1820,"am":1821,"low":1822,"Robert":1823,"formed":1824,"person":1825,"services":1826,"points":1827,"Mr":1828,"miles":1829,"##b":1830,"stop":1831,"rest":1832,"doing":1833,"needed":1834,"international":1835,"release":1836,"floor":1837,"start":1838,"sound":1839,"call":1840,"killed":1841,"real":1842,"dark":1843,"research":1844,"finished":1845,"language":1846,"Michael":1847,"professional":1848,"change":1849,"sent":1850,"50":1851,"upon":1852,"29":1853,"track":1854,"hit":1855,"event":1856,"2018":1857,"term":1858,"example":1859,"Germany":1860,"similar":1861,"return":1862,"##ism":1863,"fact":1864,"pulled":1865,"stood":1866,"says":1867,"ran":1868,"information":1869,"yet":1870,"result":1871,"developed":1872,"girl":1873,"##re":1874,"God":1875,"1995":1876,"areas":1877,"signed":1878,"decided":1879,"##ment":1880,"Company":1881,"seemed":1882,"##el":1883,"co":1884,"turn":1885,"race":1886,"common":1887,"video":1888,"Charles":1889,"Indian":1890,"##ation":1891,"blood":1892,"art":1893,"red":1894,"##able":1895,"added":1896,"rather":1897,"1994":1898,"
met":1899,"director":1900,"addition":1901,"design":1902,"average":1903,"minutes":1904,"##ies":1905,"##ted":1906,"available":1907,"bed":1908,"coming":1909,"friend":1910,"idea":1911,"kind":1912,"Union":1913,"Road":1914,"remained":1915,"##ting":1916,"everything":1917,"##ma":1918,"running":1919,"care":1920,"finally":1921,"Chinese":1922,"appointed":1923,"1992":1924,"Australian":1925,"##ley":1926,"popular":1927,"mean":1928,"teams":1929,"probably":1930,"##land":1931,"usually":1932,"project":1933,"social":1934,"Championship":1935,"possible":1936,"word":1937,"Russian":1938,"instead":1939,"mi":1940,"herself":1941,"##T":1942,"Peter":1943,"Hall":1944,"Center":1945,"seat":1946,"style":1947,"money":1948,"1993":1949,"else":1950,"Department":1951,"table":1952,"Music":1953,"current":1954,"31":1955,"features":1956,"special":1957,"events":1958,"character":1959,"Two":1960,"square":1961,"sold":1962,"debut":1963,"##v":1964,"process":1965,"Although":1966,"Since":1967,"##ka":1968,"40":1969,"Central":1970,"currently":1971,"education":1972,"placed":1973,"lot":1974,"China":1975,"quickly":1976,"forward":1977,"seven":1978,"##ling":1979,"Europe":1980,"arm":1981,"performed":1982,"Japanese":1983,"1991":1984,"Henry":1985,"Now":1986,"Dr":1987,"##ion":1988,"week":1989,"Group":1990,"myself":1991,"big":1992,"UK":1993,"Washington":1994,"ten":1995,"deep":1996,"1990":1997,"Club":1998,"Japan":1999,"space":2000,"La":2001,"directed":2002,"smile":2003,"episode":2004,"hours":2005,"whole":2006,"##de":2007,"##less":2008,"Why":2009,"wouldn":2010,"designed":2011,"strong":2012,"training":2013,"changed":2014,"Society":2015,"stage":2016,"involved":2017,"hadn":2018,"towards":2019,"leading":2020,"police":2021,"eight":2022,"kept":2023,"Institute":2024,"study":2025,"largest":2026,"child":2027,"eventually":2028,"private":2029,"modern":2030,"Court":2031,"throughout":2032,"getting":2033,"originally":2034,"attack":2035,"##E":2036,"talk":2037,"Great":2038,"longer":2039,"songs":2040,"alone":2041,"##ine":2042,"wide":2043,"dead"
:2044,"walked":2045,"shot":2046,"##ri":2047,"Oh":2048,"force":2049,"##st":2050,"Art":2051,"today":2052,"friends":2053,"Island":2054,"Richard":2055,"1989":2056,"center":2057,"construction":2058,"believe":2059,"size":2060,"White":2061,"ship":2062,"completed":2063,"##B":2064,"gone":2065,"Just":2066,"rock":2067,"sat":2068,"##R":2069,"radio":2070,"below":2071,"entire":2072,"families":2073,"league":2074,"includes":2075,"type":2076,"lived":2077,"official":2078,"range":2079,"hold":2080,"featured":2081,"Most":2082,"##ter":2083,"president":2084,"passed":2085,"means":2086,"##f":2087,"forces":2088,"lips":2089,"Mary":2090,"Do":2091,"guitar":2092,"##ce":2093,"food":2094,"wall":2095,"Of":2096,"spent":2097,"Its":2098,"performance":2099,"hear":2100,"##P":2101,"Western":2102,"reported":2103,"sister":2104,"##et":2105,"morning":2106,"##M":2107,"especially":2108,"##ive":2109,"Minister":2110,"itself":2111,"post":2112,"bit":2113,"groups":2114,"1988":2115,"##tion":2116,"Black":2117,"##ng":2118,"Well":2119,"raised":2120,"sometimes":2121,"Canadian":2122,"Paris":2123,"Spanish":2124,"replaced":2125,"schools":2126,"Academy":2127,"leaving":2128,"central":2129,"female":2130,"Christian":2131,"Jack":2132,"whose":2133,"college":2134,"onto":2135,"provided":2136,"##D":2137,"##ville":2138,"players":2139,"actually":2140,"stopped":2141,"##son":2142,"Museum":2143,"doesn":2144,"##ts":2145,"books":2146,"fight":2147,"allowed":2148,"##ur":2149,"beginning":2150,"Records":2151,"awarded":2152,"parents":2153,"coach":2154,"##os":2155,"Red":2156,"saying":2157,"##ck":2158,"Smith":2159,"Yes":2160,"Lake":2161,"##L":2162,"aircraft":2163,"1987":2164,"##ble":2165,"previous":2166,"ft":2167,"action":2168,"Italian":2169,"African":2170,"happened":2171,"vocals":2172,"Act":2173,"future":2174,"court":2175,"##ge":2176,"1986":2177,"degree":2178,"phone":2179,"##ro":2180,"Is":2181,"countries":2182,"winning":2183,"breath":2184,"Love":2185,"river":2186,"matter":2187,"Lord":2188,"Other":2189,"list":2190,"self":2191,"parts":2192,"##ate
":2193,"provide":2194,"cut":2195,"shows":2196,"plan":2197,"1st":2198,"interest":2199,"##ized":2200,"Africa":2201,"stated":2202,"Sir":2203,"fell":2204,"owned":2205,"earlier":2206,"ended":2207,"competition":2208,"attention":2209,"1985":2210,"lower":2211,"nearly":2212,"bad":2213,"older":2214,"stay":2215,"Saint":2216,"##se":2217,"certain":2218,"1984":2219,"fingers":2220,"blue":2221,"try":2222,"fourth":2223,"Grand":2224,"##as":2225,"king":2226,"##nt":2227,"makes":2228,"chest":2229,"movement":2230,"states":2231,"moving":2232,"data":2233,"introduced":2234,"model":2235,"date":2236,"section":2237,"Los":2238,"deal":2239,"##I":2240,"skin":2241,"entered":2242,"middle":2243,"success":2244,"Texas":2245,"##w":2246,"summer":2247,"island":2248,"##N":2249,"Republic":2250,"length":2251,"husband":2252,"1980":2253,"##ey":2254,"reason":2255,"anyone":2256,"forced":2257,"via":2258,"base":2259,"500":2260,"job":2261,"covered":2262,"Festival":2263,"Roman":2264,"successful":2265,"rights":2266,"cover":2267,"Man":2268,"writing":2269,"Ireland":2270,"##F":2271,"related":2272,"goal":2273,"takes":2274,"buildings":2275,"true":2276,"weeks":2277,"1983":2278,"Because":2279,"opening":2280,"novel":2281,"ISBN":2282,"meet":2283,"gold":2284,"##ous":2285,"mid":2286,"km²":2287,"standing":2288,"Football":2289,"Chicago":2290,"shook":2291,"whom":2292,"##ki":2293,"1982":2294,"Day":2295,"feeling":2296,"scored":2297,"boy":2298,"higher":2299,"Force":2300,"leader":2301,"heavy":2302,"fall":2303,"question":2304,"sense":2305,"army":2306,"Second":2307,"energy":2308,"meeting":2309,"themselves":2310,"kill":2311,"##am":2312,"board":2313,"census":2314,"##ya":2315,"##ns":2316,"mine":2317,"meant":2318,"market":2319,"required":2320,"battle":2321,"campaign":2322,"attended":2323,"approximately":2324,"Kingdom":2325,"runs":2326,"active":2327,"##ha":2328,"contract":2329,"clear":2330,"previously":2331,"health":2332,"1979":2333,"Arts":2334,"complete":2335,"Catholic":2336,"couple":2337,"units":2338,"##ll":2339,"##ty":2340,"Committee":23
41,"shoulder":2342,"sea":2343,"systems":2344,"listed":2345,"##O":2346,"caught":2347,"tournament":2348,"##G":2349,"northern":2350,"author":2351,"Film":2352,"Your":2353,"##men":2354,"holding":2355,"offered":2356,"personal":2357,"1981":2358,"southern":2359,"artist":2360,"traditional":2361,"studio":2362,"200":2363,"capital":2364,"##ful":2365,"regular":2366,"ask":2367,"giving":2368,"organization":2369,"month":2370,"news":2371,"Are":2372,"read":2373,"managed":2374,"helped":2375,"studied":2376,"student":2377,"defeated":2378,"natural":2379,"industry":2380,"Year":2381,"noted":2382,"decision":2383,"Government":2384,"quite":2385,"##id":2386,"smiled":2387,"1972":2388,"Maybe":2389,"tracks":2390,"##ke":2391,"Mark":2392,"al":2393,"media":2394,"engine":2395,"hour":2396,"Their":2397,"relationship":2398,"plays":2399,"property":2400,"structure":2401,"1976":2402,"ago":2403,"Hill":2404,"Martin":2405,"1978":2406,"ready":2407,"Many":2408,"Like":2409,"Bay":2410,"immediately":2411,"generally":2412,"Italy":2413,"Greek":2414,"practice":2415,"caused":2416,"division":2417,"significant":2418,"Joseph":2419,"speed":2420,"Let":2421,"thinking":2422,"completely":2423,"1974":2424,"primary":2425,"mostly":2426,"##field":2427,"##K":2428,"1975":2429,"##to":2430,"Even":2431,"writer":2432,"##led":2433,"dropped":2434,"magazine":2435,"collection":2436,"understand":2437,"route":2438,"highest":2439,"particular":2440,"films":2441,"lines":2442,"network":2443,"Science":2444,"loss":2445,"carried":2446,"direction":2447,"green":2448,"1977":2449,"location":2450,"producer":2451,"according":2452,"Women":2453,"Queen":2454,"neck":2455,"thus":2456,"independent":2457,"view":2458,"1970":2459,"Angeles":2460,"Soviet":2461,"distance":2462,"problem":2463,"Board":2464,"tour":2465,"western":2466,"income":2467,"appearance":2468,"access":2469,"Mexico":2470,"nodded":2471,"street":2472,"surface":2473,"arrived":2474,"believed":2475,"Old":2476,"1968":2477,"1973":2478,"becoming":2479,"whether":2480,"1945":2481,"figure":2482,"singer":2483
,"stand":2484,"Following":2485,"issue":2486,"window":2487,"wrong":2488,"pain":2489,"everyone":2490,"lives":2491,"issues":2492,"park":2493,"slowly":2494,"la":2495,"act":2496,"##va":2497,"bring":2498,"Lee":2499,"operations":2500,"key":2501,"comes":2502,"fine":2503,"cold":2504,"famous":2505,"Navy":2506,"1971":2507,"Me":2508,"additional":2509,"individual":2510,"##ner":2511,"Zealand":2512,"goals":2513,"county":2514,"contains":2515,"Service":2516,"minute":2517,"2nd":2518,"reach":2519,"talking":2520,"particularly":2521,"##ham":2522,"movie":2523,"Director":2524,"glass":2525,"paper":2526,"studies":2527,"##co":2528,"railway":2529,"standard":2530,"Education":2531,"45":2532,"represented":2533,"Chief":2534,"Louis":2535,"launched":2536,"Star":2537,"terms":2538,"60":2539,"1969":2540,"experience":2541,"watched":2542,"Another":2543,"Press":2544,"Tom":2545,"staff":2546,"starting":2547,"subject":2548,"break":2549,"Virginia":2550,"nine":2551,"eye":2552,"##age":2553,"evidence":2554,"foot":2555,"##est":2556,"companies":2557,"Prince":2558,"##V":2559,"gun":2560,"create":2561,"Big":2562,"People":2563,"guy":2564,"Green":2565,"simply":2566,"numerous":2567,"##line":2568,"increased":2569,"twenty":2570,"##ga":2571,"##do":2572,"1967":2573,"award":2574,"officer":2575,"stone":2576,"Before":2577,"material":2578,"Northern":2579,"grew":2580,"male":2581,"plant":2582,"Life":2583,"legs":2584,"step":2585,"Al":2586,"unit":2587,"35":2588,"except":2589,"answer":2590,"##U":2591,"report":2592,"response":2593,"Edward":2594,"commercial":2595,"edition":2596,"trade":2597,"science":2598,"##ca":2599,"Irish":2600,"Law":2601,"shown":2602,"rate":2603,"failed":2604,"##ni":2605,"remains":2606,"changes":2607,"mm":2608,"limited":2609,"larger":2610,"Later":2611,"cause":2612,"waiting":2613,"Time":2614,"##wood":2615,"cost":2616,"Bill":2617,"manager":2618,"activities":2619,"likely":2620,"allow":2621,"operated":2622,"retired":2623,"##ping":2624,"65":2625,"directly":2626,"Who":2627,"associated":2628,"effect":2629,"hell":2630,"Fl
orida":2631,"straight":2632,"hot":2633,"Valley":2634,"management":2635,"girls":2636,"expected":2637,"eastern":2638,"Mike":2639,"chance":2640,"cast":2641,"centre":2642,"chair":2643,"hurt":2644,"problems":2645,"##li":2646,"walk":2647,"programs":2648,"Team":2649,"characters":2650,"Battle":2651,"edge":2652,"pay":2653,"maybe":2654,"corner":2655,"majority":2656,"medical":2657,"Joe":2658,"Summer":2659,"##io":2660,"attempt":2661,"Pacific":2662,"command":2663,"Radio":2664,"##by":2665,"names":2666,"municipality":2667,"1964":2668,"train":2669,"economic":2670,"Brown":2671,"feature":2672,"sex":2673,"source":2674,"agreed":2675,"remember":2676,"Three":2677,"1966":2678,"1965":2679,"Pennsylvania":2680,"victory":2681,"senior":2682,"annual":2683,"III":2684,"Southern":2685,"results":2686,"Sam":2687,"serving":2688,"religious":2689,"Jones":2690,"appears":2691,"##der":2692,"despite":2693,"claimed":2694,"Both":2695,"musical":2696,"matches":2697,"fast":2698,"security":2699,"selected":2700,"Young":2701,"double":2702,"complex":2703,"hospital":2704,"chief":2705,"Times":2706,"##ve":2707,"Championships":2708,"filled":2709,"Public":2710,"Despite":2711,"beautiful":2712,"Research":2713,"plans":2714,"Province":2715,"##ally":2716,"Wales":2717,"##ko":2718,"artists":2719,"metal":2720,"nearby":2721,"Spain":2722,"##il":2723,"32":2724,"houses":2725,"supported":2726,"piece":2727,"##no":2728,"stared":2729,"recording":2730,"nature":2731,"legal":2732,"Russia":2733,"##ization":2734,"remaining":2735,"looks":2736,"##sh":2737,"bridge":2738,"closer":2739,"cases":2740,"scene":2741,"marriage":2742,"Little":2743,"##é":2744,"uses":2745,"Earth":2746,"specific":2747,"Frank":2748,"theory":2749,"Good":2750,"discovered":2751,"referred":2752,"bass":2753,"culture":2754,"university":2755,"presented":2756,"Congress":2757,"##go":2758,"metres":2759,"continue":2760,"1960":2761,"isn":2762,"Awards":2763,"meaning":2764,"cell":2765,"composed":2766,"separate":2767,"Series":2768,"forms":2769,"Blue":2770,"cross":2771,"##tor":2772,"incre
ase":2773,"test":2774,"computer":2775,"slightly":2776,"Where":2777,"Jewish":2778,"Town":2779,"tree":2780,"status":2781,"1944":2782,"variety":2783,"responsible":2784,"pretty":2785,"initially":2786,"##way":2787,"realized":2788,"pass":2789,"provides":2790,"Captain":2791,"Alexander":2792,"recent":2793,"score":2794,"broke":2795,"Scott":2796,"drive":2797,"financial":2798,"showed":2799,"Line":2800,"stories":2801,"ordered":2802,"soldiers":2803,"genus":2804,"operation":2805,"gaze":2806,"sitting":2807,"society":2808,"Only":2809,"hope":2810,"actor":2811,"follow":2812,"Empire":2813,"Yeah":2814,"technology":2815,"happy":2816,"focus":2817,"policy":2818,"spread":2819,"situation":2820,"##ford":2821,"##ba":2822,"Mrs":2823,"watch":2824,"Can":2825,"1963":2826,"Commission":2827,"touch":2828,"earned":2829,"troops":2830,"Under":2831,"1962":2832,"individuals":2833,"cannot":2834,"19th":2835,"##lin":2836,"mile":2837,"expression":2838,"exactly":2839,"suddenly":2840,"weight":2841,"dance":2842,"stepped":2843,"places":2844,"appear":2845,"difficult":2846,"Railway":2847,"anti":2848,"numbers":2849,"kilometres":2850,"star":2851,"##ier":2852,"department":2853,"ice":2854,"Britain":2855,"removed":2856,"Once":2857,"##lo":2858,"Boston":2859,"value":2860,"##ant":2861,"mission":2862,"trees":2863,"Order":2864,"sports":2865,"join":2866,"serve":2867,"Major":2868,"poor":2869,"Poland":2870,"mainly":2871,"Theatre":2872,"pushed":2873,"Station":2874,"##it":2875,"Lady":2876,"federal":2877,"silver":2878,"##ler":2879,"foreign":2880,"##ard":2881,"Eastern":2882,"##den":2883,"box":2884,"hall":2885,"subsequently":2886,"lies":2887,"acquired":2888,"1942":2889,"ancient":2890,"CD":2891,"History":2892,"Jean":2893,"beyond":2894,"##ger":2895,"El":2896,"##les":2897,"growing":2898,"championship":2899,"native":2900,"Parliament":2901,"Williams":2902,"watching":2903,"direct":2904,"overall":2905,"offer":2906,"Also":2907,"80":2908,"Secretary":2909,"spoke":2910,"Latin":2911,"ability":2912,"##ated":2913,"safe":2914,"presence":2915,"##i
al":2916,"headed":2917,"regional":2918,"planned":2919,"1961":2920,"Johnson":2921,"throat":2922,"consists":2923,"##W":2924,"extended":2925,"Or":2926,"bar":2927,"walls":2928,"Chris":2929,"stations":2930,"politician":2931,"Olympics":2932,"influence":2933,"share":2934,"fighting":2935,"speak":2936,"hundred":2937,"Carolina":2938,"die":2939,"stars":2940,"##tic":2941,"color":2942,"Chapter":2943,"##ish":2944,"fear":2945,"sleep":2946,"goes":2947,"Francisco":2948,"oil":2949,"Bank":2950,"sign":2951,"physical":2952,"##berg":2953,"Dutch":2954,"seasons":2955,"##rd":2956,"Games":2957,"Governor":2958,"sorry":2959,"lack":2960,"Centre":2961,"memory":2962,"baby":2963,"smaller":2964,"charge":2965,"Did":2966,"multiple":2967,"ships":2968,"shirt":2969,"Assembly":2970,"amount":2971,"leaves":2972,"3rd":2973,"Foundation":2974,"conditions":2975,"1943":2976,"Rock":2977,"Democratic":2978,"Daniel":2979,"##at":2980,"winner":2981,"products":2982,"##ina":2983,"store":2984,"latter":2985,"Professor":2986,"civil":2987,"prior":2988,"host":2989,"1956":2990,"soft":2991,"vote":2992,"needs":2993,"Each":2994,"rules":2995,"1958":2996,"pressure":2997,"letter":2998,"normal":2999,"proposed":3000,"levels":3001,"records":3002,"1959":3003,"paid":3004,"intended":3005,"Victoria":3006,"purpose":3007,"okay":3008,"historical":3009,"issued":3010,"1980s":3011,"broadcast":3012,"rule":3013,"simple":3014,"picked":3015,"firm":3016,"Sea":3017,"1941":3018,"Elizabeth":3019,"1940":3020,"serious":3021,"featuring":3022,"highly":3023,"graduated":3024,"mentioned":3025,"choice":3026,"1948":3027,"replied":3028,"percent":3029,"Scotland":3030,"##hi":3031,"females":3032,"constructed":3033,"1957":3034,"settled":3035,"Steve":3036,"recognized":3037,"cities":3038,"crew":3039,"glanced":3040,"kiss":3041,"competed":3042,"flight":3043,"knowledge":3044,"editor":3045,"More":3046,"Conference":3047,"##H":3048,"fifth":3049,"elements":3050,"##ee":3051,"##tes":3052,"function":3053,"newspaper":3054,"recently":3055,"Miss":3056,"cultural":3057,"brown":3058
,"twice":3059,"Office":3060,"1939":3061,"truth":3062,"Creek":3063,"1946":3064,"households":3065,"USA":3066,"1950":3067,"quality":3068,"##tt":3069,"border":3070,"seconds":3071,"destroyed":3072,"pre":3073,"wait":3074,"ahead":3075,"build":3076,"image":3077,"90":3078,"cars":3079,"##mi":3080,"33":3081,"promoted":3082,"professor":3083,"et":3084,"bank":3085,"medal":3086,"text":3087,"broken":3088,"Middle":3089,"revealed":3090,"sides":3091,"wing":3092,"seems":3093,"channel":3094,"1970s":3095,"Ben":3096,"loved":3097,"effort":3098,"officers":3099,"Will":3100,"##ff":3101,"70":3102,"Israel":3103,"Jim":3104,"upper":3105,"fully":3106,"label":3107,"Jr":3108,"assistant":3109,"powerful":3110,"pair":3111,"positive":3112,"##ary":3113,"gives":3114,"1955":3115,"20th":3116,"races":3117,"remain":3118,"kitchen":3119,"primarily":3120,"##ti":3121,"Sydney":3122,"easy":3123,"Tour":3124,"whispered":3125,"buried":3126,"300":3127,"News":3128,"Polish":3129,"1952":3130,"Duke":3131,"Columbia":3132,"produce":3133,"accepted":3134,"00":3135,"approach":3136,"minor":3137,"1947":3138,"Special":3139,"44":3140,"Asian":3141,"basis":3142,"visit":3143,"Fort":3144,"Civil":3145,"finish":3146,"formerly":3147,"beside":3148,"leaned":3149,"##ite":3150,"median":3151,"rose":3152,"coast":3153,"effects":3154,"supposed":3155,"Cross":3156,"##hip":3157,"Corps":3158,"residents":3159,"Jackson":3160,"##ir":3161,"Bob":3162,"basketball":3163,"36":3164,"Asia":3165,"seem":3166,"Bishop":3167,"Book":3168,"##ber":3169,"ring":3170,"##ze":3171,"owner":3172,"BBC":3173,"##ja":3174,"transferred":3175,"acting":3176,"De":3177,"appearances":3178,"walking":3179,"Le":3180,"press":3181,"grabbed":3182,"1954":3183,"officially":3184,"1953":3185,"##pe":3186,"risk":3187,"taught":3188,"review":3189,"##X":3190,"lay":3191,"##well":3192,"council":3193,"Avenue":3194,"seeing":3195,"losing":3196,"Ohio":3197,"Super":3198,"province":3199,"ones":3200,"travel":3201,"##sa":3202,"projects":3203,"equipment":3204,"spot":3205,"Berlin":3206,"administrative":3207,"he
at":3208,"potential":3209,"shut":3210,"capacity":3211,"elections":3212,"growth":3213,"fought":3214,"Republican":3215,"mixed":3216,"Andrew":3217,"teacher":3218,"turning":3219,"strength":3220,"shoulders":3221,"beat":3222,"wind":3223,"1949":3224,"Health":3225,"follows":3226,"camp":3227,"suggested":3228,"perhaps":3229,"Alex":3230,"mountain":3231,"contact":3232,"divided":3233,"candidate":3234,"fellow":3235,"34":3236,"Show":3237,"necessary":3238,"workers":3239,"ball":3240,"horse":3241,"ways":3242,"questions":3243,"protect":3244,"gas":3245,"activity":3246,"younger":3247,"bottom":3248,"founder":3249,"Scottish":3250,"screen":3251,"treatment":3252,"easily":3253,"com":3254,"##house":3255,"dedicated":3256,"Master":3257,"warm":3258,"Night":3259,"Georgia":3260,"Long":3261,"von":3262,"##me":3263,"perfect":3264,"website":3265,"1960s":3266,"piano":3267,"efforts":3268,"##ide":3269,"Tony":3270,"sort":3271,"offers":3272,"Development":3273,"Simon":3274,"executive":3275,"##nd":3276,"save":3277,"Over":3278,"Senate":3279,"1951":3280,"1990s":3281,"draw":3282,"master":3283,"Police":3284,"##ius":3285,"renamed":3286,"boys":3287,"initial":3288,"prominent":3289,"damage":3290,"Co":3291,"##ov":3292,"##za":3293,"online":3294,"begin":3295,"occurred":3296,"captured":3297,"youth":3298,"Top":3299,"account":3300,"tells":3301,"Justice":3302,"conducted":3303,"forest":3304,"##town":3305,"bought":3306,"teeth":3307,"Jersey":3308,"##di":3309,"purchased":3310,"agreement":3311,"Michigan":3312,"##ure":3313,"campus":3314,"prison":3315,"becomes":3316,"product":3317,"secret":3318,"guess":3319,"Route":3320,"huge":3321,"types":3322,"drums":3323,"64":3324,"split":3325,"defeat":3326,"estate":3327,"housing":3328,"##ot":3329,"brothers":3330,"Coast":3331,"declared":3332,"happen":3333,"titled":3334,"therefore":3335,"sun":3336,"commonly":3337,"alongside":3338,"Stadium":3339,"library":3340,"Home":3341,"article":3342,"steps":3343,"telling":3344,"slow":3345,"assigned":3346,"refused":3347,"laughed":3348,"wants":3349,"Nick":3350
,"wearing":3351,"Rome":3352,"Open":3353,"##ah":3354,"Hospital":3355,"pointed":3356,"Taylor":3357,"lifted":3358,"escape":3359,"participated":3360,"##j":3361,"drama":3362,"parish":3363,"Santa":3364,"##per":3365,"organized":3366,"mass":3367,"pick":3368,"Airport":3369,"gets":3370,"Library":3371,"unable":3372,"pull":3373,"Live":3374,"##ging":3375,"surrounding":3376,"##ries":3377,"focused":3378,"Adam":3379,"facilities":3380,"##ning":3381,"##ny":3382,"38":3383,"##ring":3384,"notable":3385,"era":3386,"connected":3387,"gained":3388,"operating":3389,"laid":3390,"Regiment":3391,"branch":3392,"defined":3393,"Christmas":3394,"machine":3395,"Four":3396,"academic":3397,"Iran":3398,"adopted":3399,"concept":3400,"Men":3401,"compared":3402,"search":3403,"traffic":3404,"Max":3405,"Maria":3406,"greater":3407,"##ding":3408,"widely":3409,"##burg":3410,"serves":3411,"1938":3412,"37":3413,"Go":3414,"hotel":3415,"shared":3416,"typically":3417,"scale":3418,"1936":3419,"leg":3420,"suffered":3421,"yards":3422,"pieces":3423,"Ministry":3424,"Wilson":3425,"episodes":3426,"empty":3427,"1918":3428,"safety":3429,"continues":3430,"yellow":3431,"historic":3432,"settlement":3433,"400":3434,"Come":3435,"Corporation":3436,"enemy":3437,"content":3438,"picture":3439,"evening":3440,"territory":3441,"method":3442,"trial":3443,"solo":3444,"driver":3445,"Here":3446,"##ls":3447,"entrance":3448,"Prize":3449,"spring":3450,"whatever":3451,"##ent":3452,"75":3453,"##ji":3454,"reading":3455,"Arthur":3456,"##cy":3457,"Our":3458,"clothes":3459,"Prime":3460,"Illinois":3461,"Kong":3462,"code":3463,"##ria":3464,"sit":3465,"Harry":3466,"Federal":3467,"chosen":3468,"administration":3469,"bodies":3470,"begins":3471,"stomach":3472,"Though":3473,"seats":3474,"Hong":3475,"density":3476,"Sun":3477,"leaders":3478,"Field":3479,"museum":3480,"chart":3481,"platform":3482,"languages":3483,"##ron":3484,"birth":3485,"holds":3486,"Gold":3487,"##un":3488,"fish":3489,"combined":3490,"##ps":3491,"4th":3492,"1937":3493,"largely":3494,"capta
in":3495,"trust":3496,"Game":3497,"van":3498,"boat":3499,"Oxford":3500,"basic":3501,"beneath":3502,"Islands":3503,"painting":3504,"nice":3505,"Toronto":3506,"path":3507,"males":3508,"sources":3509,"block":3510,"conference":3511,"parties":3512,"murder":3513,"clubs":3514,"crowd":3515,"calling":3516,"About":3517,"Business":3518,"peace":3519,"knows":3520,"lake":3521,"speaking":3522,"stayed":3523,"Brazil":3524,"allowing":3525,"Born":3526,"unique":3527,"thick":3528,"Technology":3529,"##que":3530,"receive":3531,"des":3532,"semi":3533,"alive":3534,"noticed":3535,"format":3536,"##ped":3537,"coffee":3538,"digital":3539,"##ned":3540,"handed":3541,"guard":3542,"tall":3543,"faced":3544,"setting":3545,"plants":3546,"partner":3547,"claim":3548,"reduced":3549,"temple":3550,"animals":3551,"determined":3552,"classes":3553,"##out":3554,"estimated":3555,"##ad":3556,"Olympic":3557,"providing":3558,"Massachusetts":3559,"learned":3560,"Inc":3561,"Philadelphia":3562,"Social":3563,"carry":3564,"42":3565,"possibly":3566,"hosted":3567,"tonight":3568,"respectively":3569,"Today":3570,"shape":3571,"Mount":3572,"roles":3573,"designated":3574,"brain":3575,"etc":3576,"Korea":3577,"thoughts":3578,"Brian":3579,"Highway":3580,"doors":3581,"background":3582,"drew":3583,"models":3584,"footballer":3585,"tone":3586,"turns":3587,"1935":3588,"quiet":3589,"tower":3590,"wood":3591,"bus":3592,"write":3593,"software":3594,"weapons":3595,"flat":3596,"marked":3597,"1920":3598,"newly":3599,"tight":3600,"Eric":3601,"finger":3602,"Journal":3603,"FC":3604,"Van":3605,"rise":3606,"critical":3607,"Atlantic":3608,"granted":3609,"returning":3610,"communities":3611,"humans":3612,"quick":3613,"39":3614,"48":3615,"ranked":3616,"sight":3617,"pop":3618,"Swedish":3619,"Stephen":3620,"card":3621,"analysis":3622,"attacked":3623,"##wa":3624,"Sunday":3625,"identified":3626,"Jason":3627,"champion":3628,"situated":3629,"1930":3630,"expanded":3631,"tears":3632,"##nce":3633,"reaching":3634,"Davis":3635,"protection":3636,"Emperor":3637,
"positions":3638,"nominated":3639,"Bridge":3640,"tax":3641,"dress":3642,"allows":3643,"avoid":3644,"leadership":3645,"killing":3646,"actress":3647,"guest":3648,"steel":3649,"knowing":3650,"electric":3651,"cells":3652,"disease":3653,"grade":3654,"unknown":3655,"##ium":3656,"resulted":3657,"Pakistan":3658,"confirmed":3659,"##ged":3660,"tongue":3661,"covers":3662,"##Y":3663,"roof":3664,"entirely":3665,"applied":3666,"votes":3667,"drink":3668,"interview":3669,"exchange":3670,"Township":3671,"reasons":3672,"##ised":3673,"page":3674,"calls":3675,"dog":3676,"agent":3677,"nose":3678,"teaching":3679,"##ds":3680,"##ists":3681,"advanced":3682,"wish":3683,"Golden":3684,"existing":3685,"vehicle":3686,"del":3687,"1919":3688,"develop":3689,"attacks":3690,"pressed":3691,"Sports":3692,"planning":3693,"resulting":3694,"facility":3695,"Sarah":3696,"notes":3697,"1933":3698,"Class":3699,"Historic":3700,"winter":3701,"##mo":3702,"audience":3703,"Community":3704,"household":3705,"Netherlands":3706,"creation":3707,"##ize":3708,"keeping":3709,"1914":3710,"claims":3711,"dry":3712,"guys":3713,"opposite":3714,"##ak":3715,"explained":3716,"Ontario":3717,"secondary":3718,"difference":3719,"Francis":3720,"actions":3721,"organizations":3722,"yard":3723,"animal":3724,"Up":3725,"Lewis":3726,"titles":3727,"Several":3728,"1934":3729,"Ryan":3730,"55":3731,"Supreme":3732,"rolled":3733,"1917":3734,"distribution":3735,"figures":3736,"afraid":3737,"rural":3738,"yourself":3739,"##rt":3740,"sets":3741,"barely":3742,"Instead":3743,"passing":3744,"awards":3745,"41":3746,"silence":3747,"authority":3748,"occupied":3749,"environment":3750,"windows":3751,"engineering":3752,"surprised":3753,"flying":3754,"crime":3755,"reports":3756,"Mountain":3757,"powers":3758,"driving":3759,"succeeded":3760,"reviews":3761,"1929":3762,"Head":3763,"missing":3764,"Song":3765,"Jesus":3766,"opportunity":3767,"inspired":3768,"ends":3769,"albums":3770,"conversation":3771,"impact":3772,"injury":3773,"surprise":3774,"billion":3775,"learni
ng":3776,"heavily":3777,"oldest":3778,"union":3779,"creating":3780,"##ky":3781,"festival":3782,"literature":3783,"letters":3784,"sexual":3785,"##tte":3786,"apartment":3787,"Final":3788,"comedy":3789,"nation":3790,"orders":3791,"##sen":3792,"contemporary":3793,"Power":3794,"drawn":3795,"existence":3796,"connection":3797,"##ating":3798,"Post":3799,"Junior":3800,"remembered":3801,"message":3802,"Medal":3803,"castle":3804,"note":3805,"engineer":3806,"sounds":3807,"Beach":3808,"crossed":3809,"##dy":3810,"ear":3811,"scientific":3812,"sales":3813,"##ai":3814,"theme":3815,"starts":3816,"clearly":3817,"##ut":3818,"trouble":3819,"##gan":3820,"bag":3821,"##han":3822,"BC":3823,"sons":3824,"1928":3825,"silent":3826,"versions":3827,"daily":3828,"Studies":3829,"ending":3830,"Rose":3831,"guns":3832,"1932":3833,"headquarters":3834,"reference":3835,"obtained":3836,"Squadron":3837,"concert":3838,"none":3839,"du":3840,"Among":3841,"##don":3842,"prevent":3843,"Member":3844,"answered":3845,"staring":3846,"Between":3847,"##lla":3848,"portion":3849,"drug":3850,"liked":3851,"association":3852,"performances":3853,"Nations":3854,"formation":3855,"Castle":3856,"lose":3857,"learn":3858,"scoring":3859,"relatively":3860,"quarter":3861,"47":3862,"Premier":3863,"##ors":3864,"Sweden":3865,"baseball":3866,"attempted":3867,"trip":3868,"worth":3869,"perform":3870,"airport":3871,"fields":3872,"enter":3873,"honor":3874,"Medical":3875,"rear":3876,"commander":3877,"officials":3878,"condition":3879,"supply":3880,"materials":3881,"52":3882,"Anna":3883,"volume":3884,"threw":3885,"Persian":3886,"43":3887,"interested":3888,"Gallery":3889,"achieved":3890,"visited":3891,"laws":3892,"relief":3893,"Area":3894,"Matt":3895,"singles":3896,"Lieutenant":3897,"Country":3898,"fans":3899,"Cambridge":3900,"sky":3901,"Miller":3902,"effective":3903,"tradition":3904,"Port":3905,"##ana":3906,"minister":3907,"extra":3908,"entitled":3909,"System":3910,"sites":3911,"authorities":3912,"acres":3913,"committee":3914,"racing":3915,"19
31":3916,"desk":3917,"trains":3918,"ass":3919,"weren":3920,"Family":3921,"farm":3922,"##ance":3923,"industrial":3924,"##head":3925,"iron":3926,"49":3927,"abandoned":3928,"Out":3929,"Holy":3930,"chairman":3931,"waited":3932,"frequently":3933,"display":3934,"Light":3935,"transport":3936,"starring":3937,"Patrick":3938,"Engineering":3939,"eat":3940,"FM":3941,"judge":3942,"reaction":3943,"centuries":3944,"price":3945,"##tive":3946,"Korean":3947,"defense":3948,"Get":3949,"arrested":3950,"1927":3951,"send":3952,"urban":3953,"##ss":3954,"pilot":3955,"Okay":3956,"Media":3957,"reality":3958,"arts":3959,"soul":3960,"thirty":3961,"##be":3962,"catch":3963,"generation":3964,"##nes":3965,"apart":3966,"Anne":3967,"drop":3968,"See":3969,"##ving":3970,"sixth":3971,"trained":3972,"Management":3973,"magic":3974,"cm":3975,"height":3976,"Fox":3977,"Ian":3978,"resources":3979,"vampire":3980,"principal":3981,"Was":3982,"haven":3983,"##au":3984,"Walter":3985,"Albert":3986,"rich":3987,"1922":3988,"causing":3989,"entry":3990,"##ell":3991,"shortly":3992,"46":3993,"worry":3994,"doctor":3995,"composer":3996,"rank":3997,"Network":3998,"bright":3999,"showing":4000,"regions":4001,"1924":4002,"wave":4003,"carrying":4004,"kissed":4005,"finding":4006,"missed":4007,"Earl":4008,"lying":4009,"target":4010,"vehicles":4011,"Military":4012,"controlled":4013,"dinner":4014,"##board":4015,"briefly":4016,"lyrics":4017,"motion":4018,"duty":4019,"strange":4020,"attempts":4021,"invited":4022,"kg":4023,"villages":4024,"5th":4025,"Land":4026,"##mer":4027,"Christ":4028,"prepared":4029,"twelve":4030,"check":4031,"thousand":4032,"earth":4033,"copies":4034,"en":4035,"transfer":4036,"citizens":4037,"Americans":4038,"politics":4039,"nor":4040,"theatre":4041,"Project":4042,"##bo":4043,"clean":4044,"rooms":4045,"laugh":4046,"##ran":4047,"application":4048,"contained":4049,"anyway":4050,"containing":4051,"Sciences":4052,"1925":4053,"rare":4054,"speech":4055,"exist":4056,"1950s":4057,"falling":4058,"passenger":4059,"##im":406
0,"stands":4061,"51":4062,"##ol":4063,"##ow":4064,"phase":4065,"governor":4066,"kids":4067,"details":4068,"methods":4069,"Vice":4070,"employed":4071,"performing":4072,"counter":4073,"Jane":4074,"heads":4075,"Channel":4076,"wine":4077,"opposition":4078,"aged":4079,"1912":4080,"Every":4081,"1926":4082,"highway":4083,"##ura":4084,"1921":4085,"aired":4086,"978":4087,"permanent":4088,"Forest":4089,"finds":4090,"joint":4091,"approved":4092,"##pur":4093,"brief":4094,"doubt":4095,"acts":4096,"brand":4097,"wild":4098,"closely":4099,"Ford":4100,"Kevin":4101,"chose":4102,"shall":4103,"port":4104,"sweet":4105,"fun":4106,"asking":4107,"Be":4108,"##bury":4109,"sought":4110,"Dave":4111,"Mexican":4112,"mom":4113,"Right":4114,"Howard":4115,"Moscow":4116,"Charlie":4117,"Stone":4118,"##mann":4119,"admitted":4120,"##ver":4121,"wooden":4122,"1923":4123,"Officer":4124,"relations":4125,"Hot":4126,"combat":4127,"publication":4128,"chain":4129,"shop":4130,"inhabitants":4131,"proved":4132,"ideas":4133,"address":4134,"1915":4135,"Memorial":4136,"explain":4137,"increasing":4138,"conflict":4139,"Anthony":4140,"Melbourne":4141,"narrow":4142,"temperature":4143,"slid":4144,"1916":4145,"worse":4146,"selling":4147,"documentary":4148,"Ali":4149,"Ray":4150,"opposed":4151,"vision":4152,"dad":4153,"extensive":4154,"Infantry":4155,"commissioned":4156,"Doctor":4157,"offices":4158,"programming":4159,"core":4160,"respect":4161,"storm":4162,"##pa":4163,"##ay":4164,"##om":4165,"promotion":4166,"der":4167,"struck":4168,"anymore":4169,"shit":4170,"Region":4171,"receiving":4172,"DVD":4173,"alternative":4174,"##ue":4175,"ride":4176,"maximum":4177,"1910":4178,"##ious":4179,"Third":4180,"Affairs":4181,"cancer":4182,"Executive":4183,"##op":4184,"dream":4185,"18th":4186,"Due":4187,"##ker":4188,"##worth":4189,"economy":4190,"IV":4191,"Billboard":4192,"identity":4193,"subsequent":4194,"statement":4195,"skills":4196,"##back":4197,"funding":4198,"##ons":4199,"Round":4200,"Foreign":4201,"truck":4202,"Please":4203,"lights"
:4204,"wondered":4205,"##ms":4206,"frame":4207,"yes":4208,"Still":4209,"districts":4210,"fiction":4211,"Colonel":4212,"converted":4213,"150":4214,"grown":4215,"accident":4216,"critics":4217,"fit":4218,"Information":4219,"architecture":4220,"Point":4221,"Five":4222,"armed":4223,"Billy":4224,"poet":4225,"functions":4226,"consisted":4227,"suit":4228,"Turkish":4229,"Band":4230,"object":4231,"desire":4232,"##ities":4233,"sounded":4234,"flow":4235,"Norwegian":4236,"articles":4237,"Marie":4238,"pulling":4239,"thin":4240,"singing":4241,"Hunter":4242,"Human":4243,"Battalion":4244,"Federation":4245,"Kim":4246,"origin":4247,"represent":4248,"dangerous":4249,"weather":4250,"fuel":4251,"ex":4252,"##sing":4253,"Last":4254,"bedroom":4255,"aid":4256,"knees":4257,"Alan":4258,"angry":4259,"assumed":4260,"plane":4261,"Something":4262,"founding":4263,"concerned":4264,"global":4265,"Fire":4266,"di":4267,"please":4268,"Portuguese":4269,"touched":4270,"Roger":4271,"nuclear":4272,"Register":4273,"Jeff":4274,"fixed":4275,"royal":4276,"lie":4277,"finals":4278,"NFL":4279,"Manchester":4280,"towns":4281,"handle":4282,"shaped":4283,"Chairman":4284,"Dean":4285,"launch":4286,"understanding":4287,"Children":4288,"violence":4289,"failure":4290,"sector":4291,"Brigade":4292,"wrapped":4293,"fired":4294,"sharp":4295,"tiny":4296,"developing":4297,"expansion":4298,"Free":4299,"institutions":4300,"technical":4301,"Nothing":4302,"otherwise":4303,"Main":4304,"inch":4305,"Saturday":4306,"wore":4307,"Senior":4308,"attached":4309,"cheek":4310,"representing":4311,"Kansas":4312,"##chi":4313,"##kin":4314,"actual":4315,"advantage":4316,"Dan":4317,"Austria":4318,"##dale":4319,"hoped":4320,"multi":4321,"squad":4322,"Norway":4323,"streets":4324,"1913":4325,"Services":4326,"hired":4327,"grow":4328,"pp":4329,"wear":4330,"painted":4331,"Minnesota":4332,"stuff":4333,"Building":4334,"54":4335,"Philippines":4336,"1900":4337,"##ties":4338,"educational":4339,"Khan":4340,"Magazine":4341,"##port":4342,"Cape":4343,"signal":4344,
"Gordon":4345,"sword":4346,"Anderson":4347,"cool":4348,"engaged":4349,"Commander":4350,"images":4351,"Upon":4352,"tied":4353,"Security":4354,"cup":4355,"rail":4356,"Vietnam":4357,"successfully":4358,"##red":4359,"Muslim":4360,"gain":4361,"bringing":4362,"Native":4363,"hers":4364,"occurs":4365,"negative":4366,"Philip":4367,"Kelly":4368,"Colorado":4369,"category":4370,"##lan":4371,"600":4372,"Have":4373,"supporting":4374,"wet":4375,"56":4376,"stairs":4377,"Grace":4378,"observed":4379,"##ung":4380,"funds":4381,"restaurant":4382,"1911":4383,"Jews":4384,"##ments":4385,"##che":4386,"Jake":4387,"Back":4388,"53":4389,"asks":4390,"journalist":4391,"accept":4392,"bands":4393,"bronze":4394,"helping":4395,"##ice":4396,"decades":4397,"mayor":4398,"survived":4399,"usual":4400,"influenced":4401,"Douglas":4402,"Hey":4403,"##izing":4404,"surrounded":4405,"retirement":4406,"Temple":4407,"derived":4408,"Pope":4409,"registered":4410,"producing":4411,"##ral":4412,"structures":4413,"Johnny":4414,"contributed":4415,"finishing":4416,"buy":4417,"specifically":4418,"##king":4419,"patients":4420,"Jordan":4421,"internal":4422,"regarding":4423,"Samuel":4424,"Clark":4425,"##q":4426,"afternoon":4427,"Finally":4428,"scenes":4429,"notice":4430,"refers":4431,"quietly":4432,"threat":4433,"Water":4434,"Those":4435,"Hamilton":4436,"promise":4437,"freedom":4438,"Turkey":4439,"breaking":4440,"maintained":4441,"device":4442,"lap":4443,"ultimately":4444,"Champion":4445,"Tim":4446,"Bureau":4447,"expressed":4448,"investigation":4449,"extremely":4450,"capable":4451,"qualified":4452,"recognition":4453,"items":4454,"##up":4455,"Indiana":4456,"adult":4457,"rain":4458,"greatest":4459,"architect":4460,"Morgan":4461,"dressed":4462,"equal":4463,"Antonio":4464,"collected":4465,"drove":4466,"occur":4467,"Grant":4468,"graduate":4469,"anger":4470,"Sri":4471,"worried":4472,"standards":4473,"##ore":4474,"injured":4475,"somewhere":4476,"damn":4477,"Singapore":4478,"Jimmy":4479,"pocket":4480,"homes":4481,"stock":4482,"relig
ion":4483,"aware":4484,"regarded":4485,"Wisconsin":4486,"##tra":4487,"passes":4488,"fresh":4489,"##ea":4490,"argued":4491,"Ltd":4492,"EP":4493,"Diego":4494,"importance":4495,"Census":4496,"incident":4497,"Egypt":4498,"Missouri":4499,"domestic":4500,"leads":4501,"ceremony":4502,"Early":4503,"camera":4504,"Father":4505,"challenge":4506,"Switzerland":4507,"lands":4508,"familiar":4509,"hearing":4510,"spend":4511,"educated":4512,"Tennessee":4513,"Thank":4514,"##ram":4515,"Thus":4516,"concern":4517,"putting":4518,"inches":4519,"map":4520,"classical":4521,"Allen":4522,"crazy":4523,"valley":4524,"Space":4525,"softly":4526,"##my":4527,"pool":4528,"worldwide":4529,"climate":4530,"experienced":4531,"neighborhood":4532,"scheduled":4533,"neither":4534,"fleet":4535,"1908":4536,"Girl":4537,"##J":4538,"Part":4539,"engines":4540,"locations":4541,"darkness":4542,"Revolution":4543,"establishment":4544,"lawyer":4545,"objects":4546,"apparently":4547,"Queensland":4548,"Entertainment":4549,"bill":4550,"mark":4551,"Television":4552,"##ong":4553,"pale":4554,"demand":4555,"Hotel":4556,"selection":4557,"##rn":4558,"##ino":4559,"Labour":4560,"Liberal":4561,"burned":4562,"Mom":4563,"merged":4564,"Arizona":4565,"request":4566,"##lia":4567,"##light":4568,"hole":4569,"employees":4570,"##ical":4571,"incorporated":4572,"95":4573,"independence":4574,"Walker":4575,"covering":4576,"joining":4577,"##ica":4578,"task":4579,"papers":4580,"backing":4581,"sell":4582,"biggest":4583,"6th":4584,"strike":4585,"establish":4586,"##ō":4587,"gently":4588,"59":4589,"Orchestra":4590,"Winter":4591,"protein":4592,"Juan":4593,"locked":4594,"dates":4595,"Boy":4596,"aren":4597,"shooting":4598,"Luke":4599,"solid":4600,"charged":4601,"Prior":4602,"resigned":4603,"interior":4604,"garden":4605,"spoken":4606,"improve":4607,"wonder":4608,"promote":4609,"hidden":4610,"##med":4611,"combination":4612,"Hollywood":4613,"Swiss":4614,"consider":4615,"##ks":4616,"Lincoln":4617,"literary":4618,"drawing":4619,"Marine":4620,"weapon":4621,"
Victor":4622,"Trust":4623,"Maryland":4624,"properties":4625,"##ara":4626,"exhibition":4627,"understood":4628,"hung":4629,"Tell":4630,"installed":4631,"loud":4632,"fashion":4633,"affected":4634,"junior":4635,"landing":4636,"flowers":4637,"##he":4638,"Internet":4639,"beach":4640,"Heart":4641,"tries":4642,"Mayor":4643,"programme":4644,"800":4645,"wins":4646,"noise":4647,"##ster":4648,"##ory":4649,"58":4650,"contain":4651,"fair":4652,"delivered":4653,"##ul":4654,"wedding":4655,"Square":4656,"advance":4657,"behavior":4658,"Program":4659,"Oregon":4660,"##rk":4661,"residence":4662,"realize":4663,"certainly":4664,"hill":4665,"Houston":4666,"57":4667,"indicated":4668,"##water":4669,"wounded":4670,"Village":4671,"massive":4672,"Moore":4673,"thousands":4674,"personnel":4675,"dating":4676,"opera":4677,"poetry":4678,"##her":4679,"causes":4680,"feelings":4681,"Frederick":4682,"applications":4683,"push":4684,"approached":4685,"foundation":4686,"pleasure":4687,"sale":4688,"fly":4689,"gotten":4690,"northeast":4691,"costs":4692,"raise":4693,"paintings":4694,"##ney":4695,"views":4696,"horses":4697,"formal":4698,"Arab":4699,"hockey":4700,"typical":4701,"representative":4702,"rising":4703,"##des":4704,"clock":4705,"stadium":4706,"shifted":4707,"Dad":4708,"peak":4709,"Fame":4710,"vice":4711,"disappeared":4712,"users":4713,"Way":4714,"Naval":4715,"prize":4716,"hoping":4717,"values":4718,"evil":4719,"Bell":4720,"consisting":4721,"##ón":4722,"Regional":4723,"##ics":4724,"improved":4725,"circle":4726,"carefully":4727,"broad":4728,"##ini":4729,"Fine":4730,"maintain":4731,"operate":4732,"offering":4733,"mention":4734,"Death":4735,"stupid":4736,"Through":4737,"Princess":4738,"attend":4739,"interests":4740,"ruled":4741,"somewhat":4742,"wings":4743,"roads":4744,"grounds":4745,"##ual":4746,"Greece":4747,"Champions":4748,"facing":4749,"hide":4750,"voted":4751,"require":4752,"Dark":4753,"Matthew":4754,"credit":4755,"sighed":4756,"separated":4757,"manner":4758,"##ile":4759,"Boys":4760,"1905":4761,"co
mmitted":4762,"impossible":4763,"lip":4764,"candidates":4765,"7th":4766,"Bruce":4767,"arranged":4768,"Islamic":4769,"courses":4770,"criminal":4771,"##ened":4772,"smell":4773,"##bed":4774,"08":4775,"consecutive":4776,"##ening":4777,"proper":4778,"purchase":4779,"weak":4780,"Prix":4781,"1906":4782,"aside":4783,"introduction":4784,"Look":4785,"##ku":4786,"changing":4787,"budget":4788,"resistance":4789,"factory":4790,"Forces":4791,"agency":4792,"##tone":4793,"northwest":4794,"user":4795,"1907":4796,"stating":4797,"##one":4798,"sport":4799,"Design":4800,"environmental":4801,"cards":4802,"concluded":4803,"Carl":4804,"250":4805,"accused":4806,"##ology":4807,"Girls":4808,"sick":4809,"intelligence":4810,"Margaret":4811,"responsibility":4812,"Guard":4813,"##tus":4814,"17th":4815,"sq":4816,"goods":4817,"1909":4818,"hate":4819,"##ek":4820,"capture":4821,"stores":4822,"Gray":4823,"comic":4824,"Modern":4825,"Silver":4826,"Andy":4827,"electronic":4828,"wheel":4829,"##ied":4830,"Deputy":4831,"##bs":4832,"Czech":4833,"zone":4834,"choose":4835,"constant":4836,"reserve":4837,"##lle":4838,"Tokyo":4839,"spirit":4840,"sub":4841,"degrees":4842,"flew":4843,"pattern":4844,"compete":4845,"Dance":4846,"##ik":4847,"secretary":4848,"Imperial":4849,"99":4850,"reduce":4851,"Hungarian":4852,"confused":4853,"##rin":4854,"Pierre":4855,"describes":4856,"regularly":4857,"Rachel":4858,"85":4859,"landed":4860,"passengers":4861,"##ise":4862,"##sis":4863,"historian":4864,"meters":4865,"Youth":4866,"##ud":4867,"participate":4868,"##cing":4869,"arrival":4870,"tired":4871,"Mother":4872,"##gy":4873,"jumped":4874,"Kentucky":4875,"faces":4876,"feed":4877,"Israeli":4878,"Ocean":4879,"##Q":4880,"##án":4881,"plus":4882,"snow":4883,"techniques":4884,"plate":4885,"sections":4886,"falls":4887,"jazz":4888,"##ris":4889,"tank":4890,"loan":4891,"repeated":4892,"opinion":4893,"##res":4894,"unless":4895,"rugby":4896,"journal":4897,"Lawrence":4898,"moments":4899,"shock":4900,"distributed":4901,"##ded":4902,"adjacent":4903,"
Argentina":4904,"crossing":4905,"uncle":4906,"##ric":4907,"Detroit":4908,"communication":4909,"mental":4910,"tomorrow":4911,"session":4912,"Emma":4913,"Without":4914,"##gen":4915,"Miami":4916,"charges":4917,"Administration":4918,"hits":4919,"coat":4920,"protected":4921,"Cole":4922,"invasion":4923,"priest":4924,"09":4925,"Gary":4926,"enjoyed":4927,"plot":4928,"measure":4929,"bound":4930,"friendly":4931,"throw":4932,"musician":4933,"##lon":4934,"##ins":4935,"Age":4936,"knife":4937,"damaged":4938,"birds":4939,"driven":4940,"lit":4941,"ears":4942,"breathing":4943,"Arabic":4944,"Jan":4945,"faster":4946,"Jonathan":4947,"##gate":4948,"Independent":4949,"starred":4950,"Harris":4951,"teachers":4952,"Alice":4953,"sequence":4954,"mph":4955,"file":4956,"translated":4957,"decide":4958,"determine":4959,"Review":4960,"documents":4961,"sudden":4962,"threatened":4963,"##ft":4964,"bear":4965,"distinct":4966,"decade":4967,"burning":4968,"##sky":4969,"1930s":4970,"replace":4971,"begun":4972,"extension":4973,"##time":4974,"1904":4975,"equivalent":4976,"accompanied":4977,"Christopher":4978,"Danish":4979,"##ye":4980,"Besides":4981,"##more":4982,"persons":4983,"fallen":4984,"Rural":4985,"roughly":4986,"saved":4987,"willing":4988,"ensure":4989,"Belgium":4990,"05":4991,"musicians":4992,"##ang":4993,"giant":4994,"Six":4995,"Retrieved":4996,"worst":4997,"purposes":4998,"##bly":4999,"mountains":5000,"seventh":5001,"slipped":5002,"brick":5003,"07":5004,"##py":5005,"somehow":5006,"Carter":5007,"Iraq":5008,"cousin":5009,"favor":5010,"islands":5011,"journey":5012,"FIFA":5013,"contrast":5014,"planet":5015,"vs":5016,"calm":5017,"##ings":5018,"concrete":5019,"branches":5020,"gray":5021,"profit":5022,"Russell":5023,"##ae":5024,"##ux":5025,"##ens":5026,"philosophy":5027,"businesses":5028,"talked":5029,"parking":5030,"##ming":5031,"owners":5032,"Place":5033,"##tle":5034,"agricultural":5035,"Kate":5036,"06":5037,"southeast":5038,"draft":5039,"Eddie":5040,"earliest":5041,"forget":5042,"Dallas":5043,"Common
wealth":5044,"edited":5045,"66":5046,"inner":5047,"ed":5048,"operates":5049,"16th":5050,"Harvard":5051,"assistance":5052,"##si":5053,"designs":5054,"Take":5055,"bathroom":5056,"indicate":5057,"CEO":5058,"Command":5059,"Louisiana":5060,"1902":5061,"Dublin":5062,"Books":5063,"1901":5064,"tropical":5065,"1903":5066,"##tors":5067,"Places":5068,"tie":5069,"progress":5070,"forming":5071,"solution":5072,"62":5073,"letting":5074,"##ery":5075,"studying":5076,"##jo":5077,"duties":5078,"Baseball":5079,"taste":5080,"Reserve":5081,"##ru":5082,"Ann":5083,"##gh":5084,"visible":5085,"##vi":5086,"notably":5087,"link":5088,"NCAA":5089,"southwest":5090,"Never":5091,"storage":5092,"mobile":5093,"writers":5094,"favorite":5095,"Pro":5096,"pages":5097,"truly":5098,"count":5099,"##tta":5100,"string":5101,"kid":5102,"98":5103,"Ross":5104,"row":5105,"##idae":5106,"Kennedy":5107,"##tan":5108,"Hockey":5109,"hip":5110,"waist":5111,"grandfather":5112,"listen":5113,"##ho":5114,"feels":5115,"busy":5116,"72":5117,"stream":5118,"obvious":5119,"cycle":5120,"shaking":5121,"Knight":5122,"##ren":5123,"Carlos":5124,"painter":5125,"trail":5126,"web":5127,"linked":5128,"04":5129,"Palace":5130,"existed":5131,"##ira":5132,"responded":5133,"closing":5134,"End":5135,"examples":5136,"Marshall":5137,"weekend":5138,"jaw":5139,"Denmark":5140,"lady":5141,"township":5142,"medium":5143,"chin":5144,"Story":5145,"option":5146,"fifteen":5147,"Moon":5148,"represents":5149,"makeup":5150,"investment":5151,"jump":5152,"childhood":5153,"Oklahoma":5154,"roll":5155,"normally":5156,"Ten":5157,"Operation":5158,"Graham":5159,"Seattle":5160,"Atlanta":5161,"paused":5162,"promised":5163,"rejected":5164,"treated":5165,"returns":5166,"flag":5167,"##ita":5168,"Hungary":5169,"danger":5170,"glad":5171,"movements":5172,"visual":5173,"subjects":5174,"credited":5175,"soldier":5176,"Norman":5177,"ill":5178,"translation":5179,"José":5180,"Quebec":5181,"medicine":5182,"warning":5183,"theater":5184,"praised":5185,"municipal":5186,"01":5187,"com
mune":5188,"churches":5189,"acid":5190,"folk":5191,"8th":5192,"testing":5193,"add":5194,"survive":5195,"Sound":5196,"devices":5197,"residential":5198,"severe":5199,"presidential":5200,"Mississippi":5201,"Austin":5202,"Perhaps":5203,"Charlotte":5204,"hanging":5205,"Montreal":5206,"grin":5207,"##ten":5208,"racial":5209,"partnership":5210,"shoot":5211,"shift":5212,"##nie":5213,"Les":5214,"downtown":5215,"Brothers":5216,"Garden":5217,"matters":5218,"restored":5219,"mirror":5220,"forever":5221,"winners":5222,"rapidly":5223,"poverty":5224,"##ible":5225,"Until":5226,"DC":5227,"faith":5228,"hundreds":5229,"Real":5230,"Ukraine":5231,"Nelson":5232,"balance":5233,"Adams":5234,"contest":5235,"relative":5236,"ethnic":5237,"Edinburgh":5238,"composition":5239,"##nts":5240,"emergency":5241,"##van":5242,"marine":5243,"reputation":5244,"Down":5245,"pack":5246,"12th":5247,"Communist":5248,"Mountains":5249,"pro":5250,"stages":5251,"measures":5252,"##ld":5253,"ABC":5254,"Li":5255,"victims":5256,"benefit":5257,"Iowa":5258,"Broadway":5259,"gathered":5260,"rating":5261,"Defense":5262,"classic":5263,"##ily":5264,"ceiling":5265,"##ions":5266,"snapped":5267,"Everything":5268,"constituency":5269,"Franklin":5270,"Thompson":5271,"Stewart":5272,"entering":5273,"Judge":5274,"forth":5275,"##sk":5276,"wanting":5277,"smiling":5278,"moves":5279,"tunnel":5280,"premiered":5281,"grass":5282,"unusual":5283,"Ukrainian":5284,"bird":5285,"Friday":5286,"tail":5287,"Portugal":5288,"coal":5289,"element":5290,"Fred":5291,"guards":5292,"Senator":5293,"collaboration":5294,"beauty":5295,"Wood":5296,"chemical":5297,"beer":5298,"justice":5299,"signs":5300,"##Z":5301,"sees":5302,"##zi":5303,"Puerto":5304,"##zed":5305,"96":5306,"smooth":5307,"Bowl":5308,"gift":5309,"limit":5310,"97":5311,"heading":5312,"Source":5313,"wake":5314,"requires":5315,"Ed":5316,"Constitution":5317,"factor":5318,"Lane":5319,"factors":5320,"adding":5321,"Note":5322,"cleared":5323,"pictures":5324,"pink":5325,"##ola":5326,"Kent":5327,"Local":5328,
"Singh":5329,"moth":5330,"Ty":5331,"##ture":5332,"courts":5333,"Seven":5334,"temporary":5335,"involving":5336,"Vienna":5337,"emerged":5338,"fishing":5339,"agree":5340,"defensive":5341,"stuck":5342,"secure":5343,"Tamil":5344,"##ick":5345,"bottle":5346,"03":5347,"Player":5348,"instruments":5349,"Spring":5350,"patient":5351,"flesh":5352,"contributions":5353,"cry":5354,"Malaysia":5355,"120":5356,"Global":5357,"da":5358,"Alabama":5359,"Within":5360,"##work":5361,"debuted":5362,"expect":5363,"Cleveland":5364,"concerns":5365,"retained":5366,"horror":5367,"10th":5368,"spending":5369,"Peace":5370,"Transport":5371,"grand":5372,"Crown":5373,"instance":5374,"institution":5375,"acted":5376,"Hills":5377,"mounted":5378,"Campbell":5379,"shouldn":5380,"1898":5381,"##ably":5382,"chamber":5383,"soil":5384,"88":5385,"Ethan":5386,"sand":5387,"cheeks":5388,"##gi":5389,"marry":5390,"61":5391,"weekly":5392,"classification":5393,"DNA":5394,"Elementary":5395,"Roy":5396,"definitely":5397,"Soon":5398,"Rights":5399,"gate":5400,"suggests":5401,"aspects":5402,"imagine":5403,"golden":5404,"beating":5405,"Studios":5406,"Warren":5407,"differences":5408,"significantly":5409,"glance":5410,"occasionally":5411,"##od":5412,"clothing":5413,"Assistant":5414,"depth":5415,"sending":5416,"possibility":5417,"mode":5418,"prisoners":5419,"requirements":5420,"daughters":5421,"dated":5422,"Representatives":5423,"prove":5424,"guilty":5425,"interesting":5426,"smoke":5427,"cricket":5428,"93":5429,"##ates":5430,"rescue":5431,"Connecticut":5432,"underground":5433,"Opera":5434,"13th":5435,"reign":5436,"##ski":5437,"thanks":5438,"leather":5439,"equipped":5440,"routes":5441,"fan":5442,"##ans":5443,"script":5444,"Wright":5445,"bishop":5446,"Welsh":5447,"jobs":5448,"faculty":5449,"eleven":5450,"Railroad":5451,"appearing":5452,"anniversary":5453,"Upper":5454,"##down":5455,"anywhere":5456,"Rugby":5457,"Metropolitan":5458,"Meanwhile":5459,"Nicholas":5460,"champions":5461,"forehead":5462,"mining":5463,"drinking":5464,"76":5465,
"Jerry":5466,"membership":5467,"Brazilian":5468,"Wild":5469,"Rio":5470,"scheme":5471,"Unlike":5472,"strongly":5473,"##bility":5474,"fill":5475,"##rian":5476,"easier":5477,"MP":5478,"Hell":5479,"##sha":5480,"Stanley":5481,"banks":5482,"Baron":5483,"##ique":5484,"Robinson":5485,"67":5486,"Gabriel":5487,"Austrian":5488,"Wayne":5489,"exposed":5490,"##wan":5491,"Alfred":5492,"1899":5493,"manage":5494,"mix":5495,"visitors":5496,"eating":5497,"##rate":5498,"Sean":5499,"commission":5500,"Cemetery":5501,"policies":5502,"Camp":5503,"parallel":5504,"traveled":5505,"guitarist":5506,"02":5507,"supplies":5508,"couples":5509,"poem":5510,"blocks":5511,"Rick":5512,"Training":5513,"Energy":5514,"achieve":5515,"appointment":5516,"Wing":5517,"Jamie":5518,"63":5519,"novels":5520,"##em":5521,"1890":5522,"songwriter":5523,"Base":5524,"Jay":5525,"##gar":5526,"naval":5527,"scared":5528,"miss":5529,"labor":5530,"technique":5531,"crisis":5532,"Additionally":5533,"backed":5534,"destroy":5535,"seriously":5536,"tools":5537,"tennis":5538,"91":5539,"god":5540,"##ington":5541,"continuing":5542,"steam":5543,"obviously":5544,"Bobby":5545,"adapted":5546,"fifty":5547,"enjoy":5548,"Jacob":5549,"publishing":5550,"column":5551,"##ular":5552,"Baltimore":5553,"Donald":5554,"Liverpool":5555,"92":5556,"drugs":5557,"movies":5558,"##ock":5559,"Heritage":5560,"##je":5561,"##istic":5562,"vocal":5563,"strategy":5564,"gene":5565,"advice":5566,"##bi":5567,"Ottoman":5568,"riding":5569,"##side":5570,"Agency":5571,"Indonesia":5572,"11th":5573,"laughing":5574,"sleeping":5575,"und":5576,"muttered":5577,"listening":5578,"deck":5579,"tip":5580,"77":5581,"ownership":5582,"grey":5583,"Claire":5584,"deeply":5585,"provincial":5586,"popularity":5587,"Cooper":5588,"##á":5589,"Emily":5590,"##sed":5591,"designer":5592,"Murray":5593,"describe":5594,"Danny":5595,"Around":5596,"Parker":5597,"##dae":5598,"68":5599,"rates":5600,"suffering":5601,"considerable":5602,"78":5603,"nervous":5604,"powered":5605,"tons":5606,"circumstances":5607
,"wished":5608,"belonged":5609,"Pittsburgh":5610,"flows":5611,"9th":5612,"##use":5613,"belt":5614,"81":5615,"useful":5616,"15th":5617,"context":5618,"List":5619,"Dead":5620,"Iron":5621,"seek":5622,"Season":5623,"worn":5624,"frequency":5625,"legislation":5626,"replacement":5627,"memories":5628,"Tournament":5629,"Again":5630,"Barry":5631,"organisation":5632,"copy":5633,"Gulf":5634,"waters":5635,"meets":5636,"struggle":5637,"Oliver":5638,"1895":5639,"Susan":5640,"protest":5641,"kick":5642,"Alliance":5643,"components":5644,"1896":5645,"Tower":5646,"Windows":5647,"demanded":5648,"regiment":5649,"sentence":5650,"Woman":5651,"Logan":5652,"Referee":5653,"hosts":5654,"debate":5655,"knee":5656,"Blood":5657,"##oo":5658,"universities":5659,"practices":5660,"Ward":5661,"ranking":5662,"correct":5663,"happening":5664,"Vincent":5665,"attracted":5666,"classified":5667,"##stic":5668,"processes":5669,"immediate":5670,"waste":5671,"increasingly":5672,"Helen":5673,"##po":5674,"Lucas":5675,"Phil":5676,"organ":5677,"1897":5678,"tea":5679,"suicide":5680,"actors":5681,"lb":5682,"crash":5683,"approval":5684,"waves":5685,"##ered":5686,"hated":5687,"grip":5688,"700":5689,"amongst":5690,"69":5691,"74":5692,"hunting":5693,"dying":5694,"lasted":5695,"illegal":5696,"##rum":5697,"stare":5698,"defeating":5699,"##gs":5700,"shrugged":5701,"°C":5702,"Jon":5703,"Count":5704,"Orleans":5705,"94":5706,"affairs":5707,"formally":5708,"##and":5709,"##ves":5710,"criticized":5711,"Disney":5712,"Vol":5713,"successor":5714,"tests":5715,"scholars":5716,"palace":5717,"Would":5718,"celebrated":5719,"rounds":5720,"grant":5721,"Schools":5722,"Such":5723,"commanded":5724,"demon":5725,"Romania":5726,"##all":5727,"Karl":5728,"71":5729,"##yn":5730,"84":5731,"Daily":5732,"totally":5733,"Medicine":5734,"fruit":5735,"Die":5736,"upset":5737,"Lower":5738,"Conservative":5739,"14th":5740,"Mitchell":5741,"escaped":5742,"shoes":5743,"Morris":5744,"##tz":5745,"queen":5746,"harder":5747,"prime":5748,"Thanks":5749,"indeed":5750,"Sky"
:5751,"authors":5752,"rocks":5753,"definition":5754,"Nazi":5755,"accounts":5756,"printed":5757,"experiences":5758,"##ters":5759,"divisions":5760,"Cathedral":5761,"denied":5762,"depending":5763,"Express":5764,"##let":5765,"73":5766,"appeal":5767,"loose":5768,"colors":5769,"filed":5770,"##isation":5771,"gender":5772,"##ew":5773,"throne":5774,"forests":5775,"Finland":5776,"domain":5777,"boats":5778,"Baker":5779,"squadron":5780,"shore":5781,"remove":5782,"##ification":5783,"careful":5784,"wound":5785,"railroad":5786,"82":5787,"seeking":5788,"agents":5789,"##ved":5790,"Blues":5791,"##off":5792,"customers":5793,"ignored":5794,"net":5795,"##ction":5796,"hiding":5797,"Originally":5798,"declined":5799,"##ess":5800,"franchise":5801,"eliminated":5802,"NBA":5803,"merely":5804,"pure":5805,"appropriate":5806,"visiting":5807,"forty":5808,"markets":5809,"offensive":5810,"coverage":5811,"cave":5812,"##nia":5813,"spell":5814,"##lar":5815,"Benjamin":5816,"##ire":5817,"Convention":5818,"filmed":5819,"Trade":5820,"##sy":5821,"##ct":5822,"Having":5823,"palm":5824,"1889":5825,"Evans":5826,"intense":5827,"plastic":5828,"Julia":5829,"document":5830,"jeans":5831,"vessel":5832,"SR":5833,"##fully":5834,"proposal":5835,"Birmingham":5836,"le":5837,"##ative":5838,"assembly":5839,"89":5840,"fund":5841,"lock":5842,"1893":5843,"AD":5844,"meetings":5845,"occupation":5846,"modified":5847,"Years":5848,"odd":5849,"aimed":5850,"reform":5851,"Mission":5852,"Works":5853,"shake":5854,"cat":5855,"exception":5856,"convinced":5857,"executed":5858,"pushing":5859,"dollars":5860,"replacing":5861,"soccer":5862,"manufacturing":5863,"##ros":5864,"expensive":5865,"kicked":5866,"minimum":5867,"Josh":5868,"coastal":5869,"Chase":5870,"ha":5871,"Thailand":5872,"publications":5873,"deputy":5874,"Sometimes":5875,"Angel":5876,"effectively":5877,"##illa":5878,"criticism":5879,"conduct":5880,"Serbian":5881,"landscape":5882,"NY":5883,"absence":5884,"passage":5885,"##ula":5886,"Blake":5887,"Indians":5888,"1892":5889,"admit":589
0,"Trophy":5891,"##ball":5892,"Next":5893,"##rated":5894,"##ians":5895,"charts":5896,"kW":5897,"orchestra":5898,"79":5899,"heritage":5900,"1894":5901,"rough":5902,"exists":5903,"boundary":5904,"Bible":5905,"Legislative":5906,"moon":5907,"medieval":5908,"##over":5909,"cutting":5910,"print":5911,"##ett":5912,"birthday":5913,"##hood":5914,"destruction":5915,"Julian":5916,"injuries":5917,"influential":5918,"sisters":5919,"raising":5920,"statue":5921,"colour":5922,"dancing":5923,"characteristics":5924,"orange":5925,"##ok":5926,"##aries":5927,"Ken":5928,"colonial":5929,"twin":5930,"Larry":5931,"surviving":5932,"##shi":5933,"Barbara":5934,"personality":5935,"entertainment":5936,"assault":5937,"##ering":5938,"talent":5939,"happens":5940,"license":5941,"86":5942,"couch":5943,"Century":5944,"soundtrack":5945,"shower":5946,"swimming":5947,"cash":5948,"Staff":5949,"bent":5950,"1885":5951,"bay":5952,"lunch":5953,"##lus":5954,"dozen":5955,"vessels":5956,"CBS":5957,"greatly":5958,"critic":5959,"Test":5960,"symbol":5961,"panel":5962,"shell":5963,"output":5964,"reaches":5965,"87":5966,"Front":5967,"motor":5968,"ocean":5969,"##era":5970,"##ala":5971,"maintenance":5972,"violent":5973,"scent":5974,"Limited":5975,"Las":5976,"Hope":5977,"Theater":5978,"Which":5979,"survey":5980,"Robin":5981,"recordings":5982,"compilation":5983,"##ward":5984,"bomb":5985,"insurance":5986,"Authority":5987,"sponsored":5988,"satellite":5989,"Jazz":5990,"refer":5991,"stronger":5992,"blow":5993,"whilst":5994,"Wrestling":5995,"suggest":5996,"##rie":5997,"climbed":5998,"##els":5999,"voices":6000,"shopping":6001,"1891":6002,"Neil":6003,"discovery":6004,"##vo":6005,"##ations":6006,"burst":6007,"Baby":6008,"peaked":6009,"Brooklyn":6010,"knocked":6011,"lift":6012,"##try":6013,"false":6014,"nations":6015,"Hugh":6016,"Catherine":6017,"preserved":6018,"distinguished":6019,"terminal":6020,"resolution":6021,"ratio":6022,"pants":6023,"cited":6024,"competitions":6025,"completion":6026,"DJ":6027,"bone":6028,"uniform":6029,"s
chedule":6030,"shouted":6031,"83":6032,"1920s":6033,"rarely":6034,"Basketball":6035,"Taiwan":6036,"artistic":6037,"bare":6038,"vampires":6039,"arrest":6040,"Utah":6041,"Marcus":6042,"assist":6043,"gradually":6044,"qualifying":6045,"Victorian":6046,"vast":6047,"rival":6048,"Warner":6049,"Terry":6050,"Economic":6051,"##cia":6052,"losses":6053,"boss":6054,"versus":6055,"audio":6056,"runner":6057,"apply":6058,"surgery":6059,"Play":6060,"twisted":6061,"comfortable":6062,"##cs":6063,"Everyone":6064,"guests":6065,"##lt":6066,"Harrison":6067,"UEFA":6068,"lowered":6069,"occasions":6070,"##lly":6071,"##cher":6072,"chapter":6073,"youngest":6074,"eighth":6075,"Culture":6076,"##room":6077,"##stone":6078,"1888":6079,"Songs":6080,"Seth":6081,"Digital":6082,"involvement":6083,"expedition":6084,"relationships":6085,"signing":6086,"1000":6087,"fault":6088,"annually":6089,"circuit":6090,"afterwards":6091,"meat":6092,"creature":6093,"##ou":6094,"cable":6095,"Bush":6096,"##net":6097,"Hispanic":6098,"rapid":6099,"gonna":6100,"figured":6101,"extent":6102,"considering":6103,"cried":6104,"##tin":6105,"sigh":6106,"dynasty":6107,"##ration":6108,"cabinet":6109,"Richmond":6110,"stable":6111,"##zo":6112,"1864":6113,"Admiral":6114,"Unit":6115,"occasion":6116,"shares":6117,"badly":6118,"longest":6119,"##ify":6120,"Connor":6121,"extreme":6122,"wondering":6123,"girlfriend":6124,"Studio":6125,"##tions":6126,"1865":6127,"tribe":6128,"exact":6129,"muscles":6130,"hat":6131,"Luis":6132,"Orthodox":6133,"decisions":6134,"amateur":6135,"description":6136,"##lis":6137,"hips":6138,"kingdom":6139,"##ute":6140,"Portland":6141,"whereas":6142,"Bachelor":6143,"outer":6144,"discussion":6145,"partly":6146,"Arkansas":6147,"1880":6148,"dreams":6149,"perfectly":6150,"Lloyd":6151,"##bridge":6152,"asleep":6153,"##tti":6154,"Greg":6155,"permission":6156,"trading":6157,"pitch":6158,"mill":6159,"Stage":6160,"liquid":6161,"Keith":6162,"##tal":6163,"wolf":6164,"processing":6165,"stick":6166,"Jerusalem":6167,"profile":6168,"ru
shed":6169,"spiritual":6170,"argument":6171,"Ice":6172,"Guy":6173,"till":6174,"Delhi":6175,"roots":6176,"Section":6177,"missions":6178,"Glasgow":6179,"penalty":6180,"NBC":6181,"encouraged":6182,"identify":6183,"keyboards":6184,"##zing":6185,"##ston":6186,"disc":6187,"plain":6188,"informed":6189,"Bernard":6190,"thinks":6191,"fled":6192,"Justin":6193,"##day":6194,"newspapers":6195,"##wick":6196,"Ralph":6197,"##zer":6198,"unlike":6199,"Stars":6200,"artillery":6201,"##ified":6202,"recovered":6203,"arrangement":6204,"searching":6205,"##pers":6206,"##tory":6207,"##rus":6208,"deaths":6209,"Egyptian":6210,"diameter":6211,"##í":6212,"marketing":6213,"corporate":6214,"teach":6215,"marks":6216,"Turner":6217,"staying":6218,"hallway":6219,"Sebastian":6220,"chapel":6221,"naked":6222,"mistake":6223,"possession":6224,"1887":6225,"dominated":6226,"jacket":6227,"creative":6228,"Fellow":6229,"Falls":6230,"Defence":6231,"suspended":6232,"employment":6233,"##rry":6234,"Hebrew":6235,"Hudson":6236,"Week":6237,"Wars":6238,"recognize":6239,"Natural":6240,"controversial":6241,"Tommy":6242,"thank":6243,"Athletic":6244,"benefits":6245,"decline":6246,"intention":6247,"##ets":6248,"Lost":6249,"Wall":6250,"participation":6251,"elevation":6252,"supports":6253,"parliament":6254,"1861":6255,"concentration":6256,"Movement":6257,"##IS":6258,"competing":6259,"stops":6260,"behalf":6261,"##mm":6262,"limits":6263,"funded":6264,"discuss":6265,"Collins":6266,"departure":6267,"obtain":6268,"woods":6269,"latest":6270,"universe":6271,"alcohol":6272,"Laura":6273,"rush":6274,"blade":6275,"funny":6276,"Dennis":6277,"forgotten":6278,"Amy":6279,"Symphony":6280,"apparent":6281,"graduating":6282,"1862":6283,"Rob":6284,"Grey":6285,"collections":6286,"Mason":6287,"emotions":6288,"##ugh":6289,"literally":6290,"Any":6291,"counties":6292,"1863":6293,"nomination":6294,"fighter":6295,"habitat":6296,"respond":6297,"external":6298,"Capital":6299,"exit":6300,"Video":6301,"carbon":6302,"sharing":6303,"Bad":6304,"opportunities":
6305,"Perry":6306,"photo":6307,"##mus":6308,"Orange":6309,"posted":6310,"remainder":6311,"transportation":6312,"portrayed":6313,"Labor":6314,"recommended":6315,"percussion":6316,"rated":6317,"Grade":6318,"rivers":6319,"partially":6320,"suspected":6321,"strip":6322,"adults":6323,"button":6324,"struggled":6325,"intersection":6326,"Canal":6327,"##ability":6328,"poems":6329,"claiming":6330,"Madrid":6331,"1886":6332,"Together":6333,"##our":6334,"Much":6335,"Vancouver":6336,"instrument":6337,"instrumental":6338,"1870":6339,"mad":6340,"angle":6341,"Control":6342,"Phoenix":6343,"Leo":6344,"Communications":6345,"mail":6346,"##ette":6347,"##ev":6348,"preferred":6349,"adaptation":6350,"alleged":6351,"discussed":6352,"deeper":6353,"##ane":6354,"Yet":6355,"Monday":6356,"volumes":6357,"thrown":6358,"Zane":6359,"##logy":6360,"displayed":6361,"rolling":6362,"dogs":6363,"Along":6364,"Todd":6365,"##ivity":6366,"withdrew":6367,"representation":6368,"belief":6369,"##sia":6370,"crown":6371,"Late":6372,"Short":6373,"hardly":6374,"grinned":6375,"romantic":6376,"Pete":6377,"##ken":6378,"networks":6379,"enemies":6380,"Colin":6381,"Eventually":6382,"Side":6383,"donated":6384,"##su":6385,"steady":6386,"grab":6387,"guide":6388,"Finnish":6389,"Milan":6390,"pregnant":6391,"controversy":6392,"reminded":6393,"1884":6394,"Stuart":6395,"##bach":6396,"##ade":6397,"Race":6398,"Belgian":6399,"LP":6400,"Production":6401,"Zone":6402,"lieutenant":6403,"infantry":6404,"Child":6405,"confusion":6406,"sang":6407,"resident":6408,"##ez":6409,"victim":6410,"1881":6411,"channels":6412,"Ron":6413,"businessman":6414,"##gle":6415,"Dick":6416,"colony":6417,"pace":6418,"producers":6419,"##ese":6420,"agencies":6421,"Craig":6422,"Lucy":6423,"Very":6424,"centers":6425,"Yorkshire":6426,"photography":6427,"##ched":6428,"Album":6429,"championships":6430,"Metro":6431,"substantial":6432,"Standard":6433,"terrible":6434,"directors":6435,"contribution":6436,"advertising":6437,"emotional":6438,"##its":6439,"layer":6440,"segment":
6441,"sir":6442,"folded":6443,"Roberts":6444,"ceased":6445,"Hampshire":6446,"##ray":6447,"detailed":6448,"partners":6449,"m²":6450,"##pt":6451,"Beth":6452,"genre":6453,"commented":6454,"generated":6455,"remote":6456,"aim":6457,"Hans":6458,"credits":6459,"concerts":6460,"periods":6461,"breakfast":6462,"gay":6463,"shadow":6464,"defence":6465,"Too":6466,"Had":6467,"transition":6468,"Afghanistan":6469,"##book":6470,"eggs":6471,"defend":6472,"##lli":6473,"writes":6474,"Systems":6475,"bones":6476,"mess":6477,"seed":6478,"scientists":6479,"Shortly":6480,"Romanian":6481,"##zy":6482,"Freedom":6483,"muscle":6484,"hero":6485,"parent":6486,"agriculture":6487,"checked":6488,"Islam":6489,"Bristol":6490,"Freyja":6491,"Arena":6492,"cabin":6493,"Germans":6494,"electricity":6495,"ranks":6496,"viewed":6497,"medals":6498,"Wolf":6499,"associate":6500,"Madison":6501,"Sorry":6502,"fort":6503,"Chile":6504,"detail":6505,"widespread":6506,"attorney":6507,"boyfriend":6508,"##nan":6509,"Students":6510,"Spencer":6511,"##ig":6512,"bite":6513,"Maine":6514,"demolished":6515,"Lisa":6516,"erected":6517,"Someone":6518,"operational":6519,"Commissioner":6520,"NHL":6521,"Coach":6522,"Bar":6523,"forcing":6524,"Dream":6525,"Rico":6526,"cargo":6527,"Murphy":6528,"##fish":6529,"##ase":6530,"distant":6531,"##master":6532,"##ora":6533,"Organization":6534,"doorway":6535,"Steven":6536,"traded":6537,"electrical":6538,"frequent":6539,"##wn":6540,"Branch":6541,"Sure":6542,"1882":6543,"placing":6544,"Manhattan":6545,"attending":6546,"attributed":6547,"excellent":6548,"pounds":6549,"ruling":6550,"principles":6551,"component":6552,"Mediterranean":6553,"Vegas":6554,"machines":6555,"percentage":6556,"infrastructure":6557,"throwing":6558,"affiliated":6559,"Kings":6560,"secured":6561,"Caribbean":6562,"Track":6563,"Ted":6564,"honour":6565,"opponent":6566,"Virgin":6567,"Construction":6568,"grave":6569,"produces":6570,"Challenge":6571,"stretched":6572,"paying":6573,"murmured":6574,"##ata":6575,"integrated":6576,"waved":6577
,"Nathan":6578,"##ator":6579,"transmission":6580,"videos":6581,"##yan":6582,"##hu":6583,"Nova":6584,"descent":6585,"AM":6586,"Harold":6587,"conservative":6588,"Therefore":6589,"venue":6590,"competitive":6591,"##ui":6592,"conclusion":6593,"funeral":6594,"confidence":6595,"releases":6596,"scholar":6597,"##sson":6598,"Treaty":6599,"stress":6600,"mood":6601,"##sm":6602,"Mac":6603,"residing":6604,"Action":6605,"Fund":6606,"##ship":6607,"animated":6608,"fitted":6609,"##kar":6610,"defending":6611,"voting":6612,"tend":6613,"##berry":6614,"answers":6615,"believes":6616,"##ci":6617,"helps":6618,"Aaron":6619,"##tis":6620,"themes":6621,"##lay":6622,"populations":6623,"Players":6624,"stroke":6625,"Trinity":6626,"electoral":6627,"paint":6628,"abroad":6629,"charity":6630,"keys":6631,"Fair":6632,"##pes":6633,"interrupted":6634,"participants":6635,"murdered":6636,"Days":6637,"supporters":6638,"##ab":6639,"expert":6640,"borders":6641,"mate":6642,"##llo":6643,"solar":6644,"architectural":6645,"tension":6646,"##bling":6647,"Parish":6648,"tape":6649,"operator":6650,"Cultural":6651,"Clinton":6652,"indicates":6653,"publisher":6654,"ordinary":6655,"sugar":6656,"arrive":6657,"rifle":6658,"acoustic":6659,"##uring":6660,"assets":6661,"##shire":6662,"SS":6663,"sufficient":6664,"options":6665,"HMS":6666,"Classic":6667,"bars":6668,"rebuilt":6669,"governments":6670,"Beijing":6671,"reporter":6672,"screamed":6673,"Abbey":6674,"crying":6675,"mechanical":6676,"instantly":6677,"communications":6678,"Political":6679,"cemetery":6680,"Cameron":6681,"Stop":6682,"representatives":6683,"USS":6684,"texts":6685,"mathematics":6686,"innings":6687,"civilian":6688,"Serbia":6689,"##hill":6690,"practical":6691,"patterns":6692,"dust":6693,"Faculty":6694,"debt":6695,"##end":6696,"##cus":6697,"junction":6698,"suppose":6699,"experimental":6700,"Computer":6701,"Food":6702,"wrist":6703,"abuse":6704,"dealing":6705,"bigger":6706,"cap":6707,"principle":6708,"##pin":6709,"Muhammad":6710,"Fleet":6711,"Collection":6712,"attemp
ting":6713,"dismissed":6714,"##burn":6715,"regime":6716,"Herbert":6717,"##ua":6718,"shadows":6719,"1883":6720,"Eve":6721,"Lanka":6722,"1878":6723,"Performance":6724,"fictional":6725,"##lock":6726,"Noah":6727,"Run":6728,"Voivodeship":6729,"exercise":6730,"broadcasting":6731,"##fer":6732,"RAF":6733,"Magic":6734,"Bangladesh":6735,"suitable":6736,"##low":6737,"##del":6738,"styles":6739,"toured":6740,"Code":6741,"identical":6742,"links":6743,"insisted":6744,"110":6745,"flash":6746,"Model":6747,"slave":6748,"Derek":6749,"Rev":6750,"fairly":6751,"Greater":6752,"sole":6753,"##lands":6754,"connecting":6755,"zero":6756,"bench":6757,"##ome":6758,"switched":6759,"Fall":6760,"Owen":6761,"yours":6762,"Electric":6763,"shocked":6764,"convention":6765,"##bra":6766,"climb":6767,"memorial":6768,"swept":6769,"Racing":6770,"decides":6771,"belong":6772,"##nk":6773,"parliamentary":6774,"##und":6775,"ages":6776,"proof":6777,"##dan":6778,"delivery":6779,"1860":6780,"##ów":6781,"sad":6782,"publicly":6783,"leaning":6784,"Archbishop":6785,"dirt":6786,"##ose":6787,"categories":6788,"1876":6789,"burn":6790,"##bing":6791,"requested":6792,"Guinea":6793,"Historical":6794,"rhythm":6795,"relation":6796,"##heim":6797,"ye":6798,"pursue":6799,"merchant":6800,"##mes":6801,"lists":6802,"continuous":6803,"frowned":6804,"colored":6805,"tool":6806,"gods":6807,"involves":6808,"Duncan":6809,"photographs":6810,"Cricket":6811,"slight":6812,"Gregory":6813,"atmosphere":6814,"wider":6815,"Cook":6816,"##tar":6817,"essential":6818,"Being":6819,"FA":6820,"emperor":6821,"wealthy":6822,"nights":6823,"##bar":6824,"licensed":6825,"Hawaii":6826,"viewers":6827,"Language":6828,"load":6829,"nearest":6830,"milk":6831,"kilometers":6832,"platforms":6833,"##ys":6834,"territories":6835,"Rogers":6836,"sheet":6837,"Rangers":6838,"contested":6839,"##lation":6840,"isolated":6841,"assisted":6842,"swallowed":6843,"Small":6844,"Contemporary":6845,"Technical":6846,"Edwards":6847,"express":6848,"Volume":6849,"endemic":6850,"##ei":6851,"tig
htly":6852,"Whatever":6853,"indigenous":6854,"Colombia":6855,"##ulation":6856,"hp":6857,"characterized":6858,"##ida":6859,"Nigeria":6860,"Professional":6861,"duo":6862,"Soccer":6863,"slaves":6864,"Farm":6865,"smart":6866,"Attorney":6867,"Attendance":6868,"Common":6869,"salt":6870,"##vin":6871,"tribes":6872,"nod":6873,"sentenced":6874,"bid":6875,"sample":6876,"Drive":6877,"switch":6878,"instant":6879,"21st":6880,"Cuba":6881,"drunk":6882,"Alaska":6883,"proud":6884,"awareness":6885,"hitting":6886,"sessions":6887,"Thai":6888,"locally":6889,"elsewhere":6890,"Dragon":6891,"gentle":6892,"touching":6893,"##lee":6894,"Springs":6895,"Universal":6896,"Latino":6897,"spin":6898,"1871":6899,"Chart":6900,"recalled":6901,"Type":6902,"pointing":6903,"##ii":6904,"lowest":6905,"##ser":6906,"grandmother":6907,"Adelaide":6908,"Jacques":6909,"spotted":6910,"Buffalo":6911,"restoration":6912,"Son":6913,"Joan":6914,"farmers":6915,"Lily":6916,"1879":6917,"lucky":6918,"##dal":6919,"luck":6920,"eldest":6921,"##rant":6922,"Market":6923,"drummer":6924,"deployed":6925,"warned":6926,"prince":6927,"sing":6928,"amazing":6929,"sailed":6930,"##oon":6931,"1875":6932,"Primary":6933,"traveling":6934,"Masters":6935,"Sara":6936,"cattle":6937,"Trail":6938,"gang":6939,"Further":6940,"desert":6941,"relocated":6942,"##tch":6943,"##ord":6944,"Flight":6945,"illness":6946,"Munich":6947,"ninth":6948,"repair":6949,"Singles":6950,"##lated":6951,"Tyler":6952,"tossed":6953,"boots":6954,"Work":6955,"sized":6956,"earning":6957,"shoved":6958,"magazines":6959,"housed":6960,"dam":6961,"researchers":6962,"Former":6963,"spun":6964,"premiere":6965,"spaces":6966,"organised":6967,"wealth":6968,"crimes":6969,"devoted":6970,"stones":6971,"Urban":6972,"automatic":6973,"hop":6974,"affect":6975,"outstanding":6976,"tanks":6977,"mechanism":6978,"Muslims":6979,"Ms":6980,"shots":6981,"argue":6982,"Jeremy":6983,"connections":6984,"Armenian":6985,"increases":6986,"rubbed":6987,"1867":6988,"retail":6989,"gear":6990,"Pan":6991,"bonus":6992,
"jurisdiction":6993,"weird":6994,"concerning":6995,"whisper":6996,"##gal":6997,"Microsoft":6998,"tenure":6999,"hills":7000,"www":7001,"Gmina":7002,"porch":7003,"files":7004,"reportedly":7005,"venture":7006,"Storm":7007,"##ence":7008,"Nature":7009,"killer":7010,"panic":7011,"fate":7012,"Secret":7013,"Wang":7014,"scream":7015,"drivers":7016,"belongs":7017,"Chamber":7018,"clan":7019,"monument":7020,"mixing":7021,"Peru":7022,"bet":7023,"Riley":7024,"Friends":7025,"Isaac":7026,"submarine":7027,"1877":7028,"130":7029,"judges":7030,"harm":7031,"ranging":7032,"affair":7033,"prepare":7034,"pupils":7035,"householder":7036,"Policy":7037,"decorated":7038,"Nation":7039,"slammed":7040,"activist":7041,"implemented":7042,"Room":7043,"qualify":7044,"Publishing":7045,"establishing":7046,"Baptist":7047,"touring":7048,"subsidiary":7049,"##nal":7050,"legend":7051,"1872":7052,"laughter":7053,"PC":7054,"Athens":7055,"settlers":7056,"ties":7057,"dual":7058,"dear":7059,"Draft":7060,"strategic":7061,"Ivan":7062,"reveal":7063,"closest":7064,"dominant":7065,"Ah":7066,"##ult":7067,"Denver":7068,"bond":7069,"boundaries":7070,"drafted":7071,"tables":7072,"##TV":7073,"eyed":7074,"Edition":7075,"##ena":7076,"1868":7077,"belonging":7078,"1874":7079,"Industrial":7080,"cream":7081,"Ridge":7082,"Hindu":7083,"scholarship":7084,"Ma":7085,"opens":7086,"initiated":7087,"##ith":7088,"yelled":7089,"compound":7090,"random":7091,"Throughout":7092,"grades":7093,"physics":7094,"sank":7095,"grows":7096,"exclusively":7097,"settle":7098,"Saints":7099,"brings":7100,"Amsterdam":7101,"Make":7102,"Hart":7103,"walks":7104,"battery":7105,"violin":7106,"##born":7107,"explanation":7108,"##ware":7109,"1873":7110,"##har":7111,"provinces":7112,"thrust":7113,"exclusive":7114,"sculpture":7115,"shops":7116,"##fire":7117,"VI":7118,"constitution":7119,"Barcelona":7120,"monster":7121,"Devon":7122,"Jefferson":7123,"Sullivan":7124,"bow":7125,"##din":7126,"desperate":7127,"##ć":7128,"Julie":7129,"##mon":7130,"##ising":7131,"terminus":
7132,"Jesse":7133,"abilities":7134,"golf":7135,"##ple":7136,"##via":7137,"##away":7138,"Raymond":7139,"measured":7140,"jury":7141,"firing":7142,"revenue":7143,"suburb":7144,"Bulgarian":7145,"1866":7146,"##cha":7147,"timber":7148,"Things":7149,"##weight":7150,"Morning":7151,"spots":7152,"Alberta":7153,"Data":7154,"explains":7155,"Kyle":7156,"friendship":7157,"raw":7158,"tube":7159,"demonstrated":7160,"aboard":7161,"immigrants":7162,"reply":7163,"breathe":7164,"Manager":7165,"ease":7166,"##ban":7167,"##dia":7168,"Diocese":7169,"##vy":7170,"##ía":7171,"pit":7172,"ongoing":7173,"##lie":7174,"Gilbert":7175,"Costa":7176,"1940s":7177,"Report":7178,"voters":7179,"cloud":7180,"traditions":7181,"##MS":7182,"gallery":7183,"Jennifer":7184,"swung":7185,"Broadcasting":7186,"Does":7187,"diverse":7188,"reveals":7189,"arriving":7190,"initiative":7191,"##ani":7192,"Give":7193,"Allied":7194,"Pat":7195,"Outstanding":7196,"monastery":7197,"blind":7198,"Currently":7199,"##war":7200,"bloody":7201,"stopping":7202,"focuses":7203,"managing":7204,"Florence":7205,"Harvey":7206,"creatures":7207,"900":7208,"breast":7209,"internet":7210,"Artillery":7211,"purple":7212,"##mate":7213,"alliance":7214,"excited":7215,"fee":7216,"Brisbane":7217,"lifetime":7218,"Private":7219,"##aw":7220,"##nis":7221,"##gue":7222,"##ika":7223,"phrase":7224,"regulations":7225,"reflected":7226,"manufactured":7227,"conventional":7228,"pleased":7229,"client":7230,"##ix":7231,"##ncy":7232,"Pedro":7233,"reduction":7234,"##con":7235,"welcome":7236,"jail":7237,"comfort":7238,"Iranian":7239,"Norfolk":7240,"Dakota":7241,"##tein":7242,"evolution":7243,"everywhere":7244,"Initially":7245,"sensitive":7246,"Olivia":7247,"Oscar":7248,"implementation":7249,"sits":7250,"stolen":7251,"demands":7252,"slide":7253,"grandson":7254,"##ich":7255,"merger":7256,"##mic":7257,"Spirit":7258,"##°":7259,"ticket":7260,"root":7261,"difficulty":7262,"Nevada":7263,"##als":7264,"lined":7265,"Dylan":7266,"Original":7267,"Call":7268,"biological":7269,"EU":727
0,"dramatic":7271,"##hn":7272,"Operations":7273,"treaty":7274,"gap":7275,"##list":7276,"Am":7277,"Romanized":7278,"moral":7279,"Butler":7280,"perspective":7281,"Furthermore":7282,"Manuel":7283,"absolutely":7284,"unsuccessful":7285,"disaster":7286,"dispute":7287,"preparation":7288,"tested":7289,"discover":7290,"##ach":7291,"shield":7292,"squeezed":7293,"brushed":7294,"battalion":7295,"Arnold":7296,"##ras":7297,"superior":7298,"treat":7299,"clinical":7300,"##so":7301,"Apple":7302,"Syria":7303,"Cincinnati":7304,"package":7305,"flights":7306,"editions":7307,"Leader":7308,"minority":7309,"wonderful":7310,"hang":7311,"Pop":7312,"Philippine":7313,"telephone":7314,"bell":7315,"honorary":7316,"##mar":7317,"balls":7318,"Democrat":7319,"dirty":7320,"thereafter":7321,"collapsed":7322,"Inside":7323,"slip":7324,"wrestling":7325,"##ín":7326,"listened":7327,"regard":7328,"bowl":7329,"None":7330,"Sport":7331,"completing":7332,"trapped":7333,"##view":7334,"copper":7335,"Wallace":7336,"Honor":7337,"blame":7338,"Peninsula":7339,"##ert":7340,"##oy":7341,"Anglo":7342,"bearing":7343,"simultaneously":7344,"honest":7345,"##ias":7346,"Mix":7347,"Got":7348,"speaker":7349,"voiced":7350,"impressed":7351,"prices":7352,"error":7353,"1869":7354,"##feld":7355,"trials":7356,"Nine":7357,"Industry":7358,"substitute":7359,"Municipal":7360,"departed":7361,"slept":7362,"##ama":7363,"Junction":7364,"Socialist":7365,"flower":7366,"dropping":7367,"comment":7368,"fantasy":7369,"##ress":7370,"arrangements":7371,"travelled":7372,"furniture":7373,"fist":7374,"relieved":7375,"##tics":7376,"Leonard":7377,"linear":7378,"earn":7379,"expand":7380,"Soul":7381,"Plan":7382,"Leeds":7383,"Sierra":7384,"accessible":7385,"innocent":7386,"Winner":7387,"Fighter":7388,"Range":7389,"winds":7390,"vertical":7391,"Pictures":7392,"101":7393,"charter":7394,"cooperation":7395,"prisoner":7396,"interviews":7397,"recognised":7398,"sung":7399,"manufacturer":7400,"exposure":7401,"submitted":7402,"Mars":7403,"leaf":7404,"gauge":7405,"scre
aming":7406,"likes":7407,"eligible":7408,"##ac":7409,"gathering":7410,"columns":7411,"##dra":7412,"belly":7413,"UN":7414,"maps":7415,"messages":7416,"speakers":7417,"##ants":7418,"garage":7419,"unincorporated":7420,"Number":7421,"Watson":7422,"sixteen":7423,"lots":7424,"beaten":7425,"Could":7426,"Municipality":7427,"##ano":7428,"Horse":7429,"talks":7430,"Drake":7431,"scores":7432,"Venice":7433,"genetic":7434,"##mal":7435,"##ère":7436,"Cold":7437,"Jose":7438,"nurse":7439,"traditionally":7440,"##bus":7441,"Territory":7442,"Key":7443,"Nancy":7444,"##win":7445,"thumb":7446,"São":7447,"index":7448,"dependent":7449,"carries":7450,"controls":7451,"Comics":7452,"coalition":7453,"physician":7454,"referring":7455,"Ruth":7456,"Based":7457,"restricted":7458,"inherited":7459,"internationally":7460,"stretch":7461,"THE":7462,"plates":7463,"margin":7464,"Holland":7465,"knock":7466,"significance":7467,"valuable":7468,"Kenya":7469,"carved":7470,"emotion":7471,"conservation":7472,"municipalities":7473,"overseas":7474,"resumed":7475,"Finance":7476,"graduation":7477,"blinked":7478,"temperatures":7479,"constantly":7480,"productions":7481,"scientist":7482,"ghost":7483,"cuts":7484,"permitted":7485,"##ches":7486,"firmly":7487,"##bert":7488,"patrol":7489,"##yo":7490,"Croatian":7491,"attacking":7492,"1850":7493,"portrait":7494,"promoting":7495,"sink":7496,"conversion":7497,"##kov":7498,"locomotives":7499,"Guide":7500,"##val":7501,"nephew":7502,"relevant":7503,"Marc":7504,"drum":7505,"originated":7506,"Chair":7507,"visits":7508,"dragged":7509,"Price":7510,"favour":7511,"corridor":7512,"properly":7513,"respective":7514,"Caroline":7515,"reporting":7516,"inaugural":7517,"1848":7518,"industries":7519,"##ching":7520,"edges":7521,"Christianity":7522,"Maurice":7523,"Trent":7524,"Economics":7525,"carrier":7526,"Reed":7527,"##gon":7528,"tribute":7529,"Pradesh":7530,"##ale":7531,"extend":7532,"attitude":7533,"Yale":7534,"##lu":7535,"settlements":7536,"glasses":7537,"taxes":7538,"targets":7539,"##ids":75
40,"quarters":7541,"##ological":7542,"connect":7543,"hence":7544,"metre":7545,"collapse":7546,"underneath":7547,"banned":7548,"Future":7549,"clients":7550,"alternate":7551,"explosion":7552,"kinds":7553,"Commons":7554,"hungry":7555,"dragon":7556,"Chapel":7557,"Buddhist":7558,"lover":7559,"depression":7560,"pulls":7561,"##ges":7562,"##uk":7563,"origins":7564,"computers":7565,"crosses":7566,"kissing":7567,"assume":7568,"emphasis":7569,"lighting":7570,"##ites":7571,"personally":7572,"crashed":7573,"beam":7574,"touchdown":7575,"lane":7576,"comparison":7577,"##mont":7578,"Hitler":7579,"##las":7580,"execution":7581,"##ene":7582,"acre":7583,"sum":7584,"Pearl":7585,"ray":7586,"##point":7587,"essentially":7588,"worker":7589,"convicted":7590,"tear":7591,"Clay":7592,"recovery":7593,"Literature":7594,"Unfortunately":7595,"##row":7596,"partial":7597,"Petersburg":7598,"Bulgaria":7599,"coaching":7600,"evolved":7601,"reception":7602,"enters":7603,"narrowed":7604,"elevator":7605,"therapy":7606,"defended":7607,"pairs":7608,"##lam":7609,"breaks":7610,"Bennett":7611,"Uncle":7612,"cylinder":7613,"##ison":7614,"passion":7615,"bases":7616,"Actor":7617,"cancelled":7618,"battles":7619,"extensively":7620,"oxygen":7621,"Ancient":7622,"specialized":7623,"negotiations":7624,"##rat":7625,"acquisition":7626,"convince":7627,"interpretation":7628,"##00":7629,"photos":7630,"aspect":7631,"colleges":7632,"Artist":7633,"keeps":7634,"##wing":7635,"Croatia":7636,"##ona":7637,"Hughes":7638,"Otto":7639,"comments":7640,"##du":7641,"Ph":7642,"Sweet":7643,"adventure":7644,"describing":7645,"Student":7646,"Shakespeare":7647,"scattered":7648,"objective":7649,"Aviation":7650,"Phillips":7651,"Fourth":7652,"athletes":7653,"##hal":7654,"##tered":7655,"Guitar":7656,"intensity":7657,"née":7658,"dining":7659,"curve":7660,"Obama":7661,"topics":7662,"legislative":7663,"Mill":7664,"Cruz":7665,"##ars":7666,"Members":7667,"recipient":7668,"Derby":7669,"inspiration":7670,"corresponding":7671,"fed":7672,"YouTube":7673,"coins"
:7674,"pressing":7675,"intent":7676,"Karen":7677,"cinema":7678,"Delta":7679,"destination":7680,"shorter":7681,"Christians":7682,"imagined":7683,"canal":7684,"Newcastle":7685,"Shah":7686,"Adrian":7687,"super":7688,"Males":7689,"160":7690,"liberal":7691,"lord":7692,"bat":7693,"supplied":7694,"Claude":7695,"meal":7696,"worship":7697,"##atic":7698,"Han":7699,"wire":7700,"°F":7701,"##tha":7702,"punishment":7703,"thirteen":7704,"fighters":7705,"##ibility":7706,"1859":7707,"Ball":7708,"gardens":7709,"##ari":7710,"Ottawa":7711,"pole":7712,"indicating":7713,"Twenty":7714,"Higher":7715,"Bass":7716,"Ivy":7717,"farming":7718,"##urs":7719,"certified":7720,"Saudi":7721,"plenty":7722,"##ces":7723,"restaurants":7724,"Representative":7725,"Miles":7726,"payment":7727,"##inger":7728,"##rit":7729,"Confederate":7730,"festivals":7731,"references":7732,"##ić":7733,"Mario":7734,"PhD":7735,"playoffs":7736,"witness":7737,"rice":7738,"mask":7739,"saving":7740,"opponents":7741,"enforcement":7742,"automatically":7743,"relegated":7744,"##oe":7745,"radar":7746,"whenever":7747,"Financial":7748,"imperial":7749,"uncredited":7750,"influences":7751,"Abraham":7752,"skull":7753,"Guardian":7754,"Haven":7755,"Bengal":7756,"impressive":7757,"input":7758,"mixture":7759,"Warsaw":7760,"altitude":7761,"distinction":7762,"1857":7763,"collective":7764,"Annie":7765,"##ean":7766,"##bal":7767,"directions":7768,"Flying":7769,"##nic":7770,"faded":7771,"##ella":7772,"contributing":7773,"##ó":7774,"employee":7775,"##lum":7776,"##yl":7777,"ruler":7778,"oriented":7779,"conductor":7780,"focusing":7781,"##die":7782,"Giants":7783,"Mills":7784,"mines":7785,"Deep":7786,"curled":7787,"Jessica":7788,"guitars":7789,"Louise":7790,"procedure":7791,"Machine":7792,"failing":7793,"attendance":7794,"Nepal":7795,"Brad":7796,"Liam":7797,"tourist":7798,"exhibited":7799,"Sophie":7800,"depicted":7801,"Shaw":7802,"Chuck":7803,"##can":7804,"expecting":7805,"challenges":7806,"##nda":7807,"equally":7808,"resignation":7809,"##logical":7810,"Tig
ers":7811,"loop":7812,"pitched":7813,"outdoor":7814,"reviewed":7815,"hopes":7816,"True":7817,"temporarily":7818,"Borough":7819,"torn":7820,"jerked":7821,"collect":7822,"Berkeley":7823,"Independence":7824,"cotton":7825,"retreat":7826,"campaigns":7827,"participating":7828,"Intelligence":7829,"Heaven":7830,"##ked":7831,"situations":7832,"borough":7833,"Democrats":7834,"Harbor":7835,"##len":7836,"Liga":7837,"serial":7838,"circles":7839,"fourteen":7840,"##lot":7841,"seized":7842,"filling":7843,"departments":7844,"finance":7845,"absolute":7846,"Roland":7847,"Nate":7848,"floors":7849,"raced":7850,"struggling":7851,"deliver":7852,"protests":7853,"##tel":7854,"Exchange":7855,"efficient":7856,"experiments":7857,"##dar":7858,"faint":7859,"3D":7860,"binding":7861,"Lions":7862,"lightly":7863,"skill":7864,"proteins":7865,"difficulties":7866,"##cal":7867,"monthly":7868,"camps":7869,"flood":7870,"loves":7871,"Amanda":7872,"Commerce":7873,"##oid":7874,"##lies":7875,"elementary":7876,"##tre":7877,"organic":7878,"##stein":7879,"##ph":7880,"receives":7881,"Tech":7882,"enormous":7883,"distinctive":7884,"Joint":7885,"experiment":7886,"Circuit":7887,"citizen":7888,"##hy":7889,"shelter":7890,"ideal":7891,"practically":7892,"formula":7893,"addressed":7894,"Foster":7895,"Productions":7896,"##ax":7897,"variable":7898,"punk":7899,"Voice":7900,"fastest":7901,"concentrated":7902,"##oma":7903,"##yer":7904,"stored":7905,"surrender":7906,"vary":7907,"Sergeant":7908,"Wells":7909,"ward":7910,"Wait":7911,"##ven":7912,"playoff":7913,"reducing":7914,"cavalry":7915,"##dle":7916,"Venezuela":7917,"tissue":7918,"amounts":7919,"sweat":7920,"##we":7921,"Non":7922,"##nik":7923,"beetle":7924,"##bu":7925,"##tu":7926,"Jared":7927,"Hunt":7928,"##₂":7929,"fat":7930,"Sultan":7931,"Living":7932,"Circle":7933,"Secondary":7934,"Suddenly":7935,"reverse":7936,"##min":7937,"Travel":7938,"##bin":7939,"Lebanon":7940,"##mas":7941,"virus":7942,"Wind":7943,"dissolved":7944,"enrolled":7945,"holiday":7946,"Keep":7947,"helicopter
":7948,"Clarke":7949,"constitutional":7950,"technologies":7951,"doubles":7952,"instructions":7953,"##ace":7954,"Azerbaijan":7955,"##ill":7956,"occasional":7957,"frozen":7958,"trick":7959,"wiped":7960,"writings":7961,"Shanghai":7962,"preparing":7963,"challenged":7964,"mainstream":7965,"summit":7966,"180":7967,"##arian":7968,"##rating":7969,"designation":7970,"##ada":7971,"revenge":7972,"filming":7973,"tightened":7974,"Miguel":7975,"Montana":7976,"reflect":7977,"celebration":7978,"bitch":7979,"flashed":7980,"signals":7981,"rounded":7982,"peoples":7983,"##tation":7984,"renowned":7985,"Google":7986,"characteristic":7987,"Campaign":7988,"sliding":7989,"##rman":7990,"usage":7991,"Record":7992,"Using":7993,"woke":7994,"solutions":7995,"holes":7996,"theories":7997,"logo":7998,"Protestant":7999,"relaxed":8000,"brow":8001,"nickname":8002,"Reading":8003,"marble":8004,"##tro":8005,"symptoms":8006,"Overall":8007,"capita":8008,"##ila":8009,"outbreak":8010,"revolution":8011,"deemed":8012,"Principal":8013,"Hannah":8014,"approaches":8015,"inducted":8016,"Wellington":8017,"vulnerable":8018,"Environmental":8019,"Drama":8020,"incumbent":8021,"Dame":8022,"1854":8023,"travels":8024,"samples":8025,"accurate":8026,"physically":8027,"Sony":8028,"Nashville":8029,"##sville":8030,"##lic":8031,"##og":8032,"Producer":8033,"Lucky":8034,"tough":8035,"Stanford":8036,"resort":8037,"repeatedly":8038,"eyebrows":8039,"Far":8040,"choir":8041,"commenced":8042,"##ep":8043,"##ridge":8044,"rage":8045,"swing":8046,"sequel":8047,"heir":8048,"buses":8049,"ad":8050,"Grove":8051,"##late":8052,"##rick":8053,"updated":8054,"##SA":8055,"Delaware":8056,"##fa":8057,"Athletics":8058,"warmth":8059,"Off":8060,"excitement":8061,"verse":8062,"Protection":8063,"Villa":8064,"corruption":8065,"intellectual":8066,"Jenny":8067,"##lyn":8068,"mystery":8069,"prayer":8070,"healthy":8071,"##ologist":8072,"Bear":8073,"lab":8074,"Ernest":8075,"Remix":8076,"register":8077,"basement":8078,"Montgomery":8079,"consistent":8080,"tier":8081
,"1855":8082,"Preston":8083,"Brooks":8084,"##maker":8085,"vocalist":8086,"laboratory":8087,"delayed":8088,"wheels":8089,"rope":8090,"bachelor":8091,"pitcher":8092,"Block":8093,"Nevertheless":8094,"suspect":8095,"efficiency":8096,"Nebraska":8097,"siege":8098,"FBI":8099,"planted":8100,"##AC":8101,"Newton":8102,"breeding":8103,"##ain":8104,"eighteen":8105,"Argentine":8106,"encounter":8107,"servant":8108,"1858":8109,"elder":8110,"Shadow":8111,"Episode":8112,"fabric":8113,"doctors":8114,"survival":8115,"removal":8116,"chemistry":8117,"volunteers":8118,"Kane":8119,"variant":8120,"arrives":8121,"Eagle":8122,"Left":8123,"##fe":8124,"Jo":8125,"divorce":8126,"##ret":8127,"yesterday":8128,"Bryan":8129,"handling":8130,"diseases":8131,"customer":8132,"Sheriff":8133,"Tiger":8134,"Harper":8135,"##oi":8136,"resting":8137,"Linda":8138,"Sheffield":8139,"gasped":8140,"sexy":8141,"economics":8142,"alien":8143,"tale":8144,"footage":8145,"Liberty":8146,"yeah":8147,"fundamental":8148,"Ground":8149,"flames":8150,"Actress":8151,"photographer":8152,"Maggie":8153,"Additional":8154,"joke":8155,"custom":8156,"Survey":8157,"Abu":8158,"silk":8159,"consumption":8160,"Ellis":8161,"bread":8162,"##uous":8163,"engagement":8164,"puts":8165,"Dog":8166,"##hr":8167,"poured":8168,"guilt":8169,"CDP":8170,"boxes":8171,"hardware":8172,"clenched":8173,"##cio":8174,"stem":8175,"arena":8176,"extending":8177,"##com":8178,"examination":8179,"Steel":8180,"encountered":8181,"revised":8182,"140":8183,"picking":8184,"Car":8185,"hasn":8186,"Minor":8187,"pride":8188,"Roosevelt":8189,"boards":8190,"##mia":8191,"blocked":8192,"curious":8193,"drag":8194,"narrative":8195,"brigade":8196,"Prefecture":8197,"mysterious":8198,"namely":8199,"connects":8200,"Devil":8201,"historians":8202,"CHAPTER":8203,"quit":8204,"installation":8205,"Golf":8206,"empire":8207,"elevated":8208,"##eo":8209,"releasing":8210,"Bond":8211,"##uri":8212,"harsh":8213,"ban":8214,"##BA":8215,"contracts":8216,"cloth":8217,"presents":8218,"stake":8219,"chorus":
8220,"##eau":8221,"swear":8222,"##mp":8223,"allies":8224,"generations":8225,"Motor":8226,"meter":8227,"pen":8228,"warrior":8229,"veteran":8230,"##EC":8231,"comprehensive":8232,"missile":8233,"interaction":8234,"instruction":8235,"Renaissance":8236,"rested":8237,"Dale":8238,"fix":8239,"fluid":8240,"les":8241,"investigate":8242,"loaded":8243,"widow":8244,"exhibit":8245,"artificial":8246,"select":8247,"rushing":8248,"tasks":8249,"signature":8250,"nowhere":8251,"Engineer":8252,"feared":8253,"Prague":8254,"bother":8255,"extinct":8256,"gates":8257,"Bird":8258,"climbing":8259,"heels":8260,"striking":8261,"artwork":8262,"hunt":8263,"awake":8264,"##hin":8265,"Formula":8266,"thereby":8267,"commitment":8268,"imprisoned":8269,"Beyond":8270,"##MA":8271,"transformed":8272,"Agriculture":8273,"Low":8274,"Movie":8275,"radical":8276,"complicated":8277,"Yellow":8278,"Auckland":8279,"mansion":8280,"tenth":8281,"Trevor":8282,"predecessor":8283,"##eer":8284,"disbanded":8285,"sucked":8286,"circular":8287,"witch":8288,"gaining":8289,"lean":8290,"Behind":8291,"illustrated":8292,"rang":8293,"celebrate":8294,"bike":8295,"consist":8296,"framework":8297,"##cent":8298,"Shane":8299,"owns":8300,"350":8301,"comprises":8302,"collaborated":8303,"colleagues":8304,"##cast":8305,"engage":8306,"fewer":8307,"##ave":8308,"1856":8309,"observation":8310,"diplomatic":8311,"legislature":8312,"improvements":8313,"Interstate":8314,"craft":8315,"MTV":8316,"martial":8317,"administered":8318,"jet":8319,"approaching":8320,"permanently":8321,"attraction":8322,"manuscript":8323,"numbered":8324,"Happy":8325,"Andrea":8326,"shallow":8327,"Gothic":8328,"Anti":8329,"##bad":8330,"improvement":8331,"trace":8332,"preserve":8333,"regardless":8334,"rode":8335,"dies":8336,"achievement":8337,"maintaining":8338,"Hamburg":8339,"spine":8340,"##air":8341,"flowing":8342,"encourage":8343,"widened":8344,"posts":8345,"##bound":8346,"125":8347,"Southeast":8348,"Santiago":8349,"##bles":8350,"impression":8351,"receiver":8352,"Single":8353,"
closure":8354,"##unt":8355,"communist":8356,"honors":8357,"Northwest":8358,"105":8359,"##ulated":8360,"cared":8361,"un":8362,"hug":8363,"magnetic":8364,"seeds":8365,"topic":8366,"perceived":8367,"prey":8368,"prevented":8369,"Marvel":8370,"Eight":8371,"Michel":8372,"Transportation":8373,"rings":8374,"Gate":8375,"##gne":8376,"Byzantine":8377,"accommodate":8378,"floating":8379,"##dor":8380,"equation":8381,"ministry":8382,"##ito":8383,"##gled":8384,"Rules":8385,"earthquake":8386,"revealing":8387,"Brother":8388,"Celtic":8389,"blew":8390,"chairs":8391,"Panama":8392,"Leon":8393,"attractive":8394,"descendants":8395,"Care":8396,"Ambassador":8397,"tours":8398,"breathed":8399,"threatening":8400,"##cho":8401,"smiles":8402,"Lt":8403,"Beginning":8404,"##iness":8405,"fake":8406,"assists":8407,"fame":8408,"strings":8409,"Mobile":8410,"Liu":8411,"parks":8412,"http":8413,"1852":8414,"brush":8415,"Aunt":8416,"bullet":8417,"consciousness":8418,"##sta":8419,"##ther":8420,"consequences":8421,"gather":8422,"dug":8423,"1851":8424,"bridges":8425,"Doug":8426,"##sion":8427,"Artists":8428,"ignore":8429,"Carol":8430,"brilliant":8431,"radiation":8432,"temples":8433,"basin":8434,"clouds":8435,"##cted":8436,"Stevens":8437,"spite":8438,"soap":8439,"consumer":8440,"Damn":8441,"Snow":8442,"recruited":8443,"##craft":8444,"Advanced":8445,"tournaments":8446,"Quinn":8447,"undergraduate":8448,"questioned":8449,"Palmer":8450,"Annual":8451,"Others":8452,"feeding":8453,"Spider":8454,"printing":8455,"##orn":8456,"cameras":8457,"functional":8458,"Chester":8459,"readers":8460,"Alpha":8461,"universal":8462,"Faith":8463,"Brandon":8464,"François":8465,"authored":8466,"Ring":8467,"el":8468,"aims":8469,"athletic":8470,"possessed":8471,"Vermont":8472,"programmes":8473,"##uck":8474,"bore":8475,"Fisher":8476,"statements":8477,"shed":8478,"saxophone":8479,"neighboring":8480,"pronounced":8481,"barrel":8482,"bags":8483,"##dge":8484,"organisations":8485,"pilots":8486,"casualties":8487,"Kenneth":8488,"##brook":8489,"silentl
y":8490,"Malcolm":8491,"span":8492,"Essex":8493,"anchor":8494,"##hl":8495,"virtual":8496,"lessons":8497,"Henri":8498,"Trump":8499,"Page":8500,"pile":8501,"locomotive":8502,"wounds":8503,"uncomfortable":8504,"sustained":8505,"Diana":8506,"Eagles":8507,"##pi":8508,"2000s":8509,"documented":8510,"##bel":8511,"Cassie":8512,"delay":8513,"kisses":8514,"##ines":8515,"variation":8516,"##ag":8517,"growled":8518,"##mark":8519,"##ways":8520,"Leslie":8521,"studios":8522,"Friedrich":8523,"aunt":8524,"actively":8525,"armor":8526,"eaten":8527,"historically":8528,"Better":8529,"purse":8530,"honey":8531,"ratings":8532,"##ée":8533,"naturally":8534,"1840":8535,"peer":8536,"Kenny":8537,"Cardinal":8538,"database":8539,"Looking":8540,"runners":8541,"handsome":8542,"Double":8543,"PA":8544,"##boat":8545,"##sted":8546,"protecting":8547,"##jan":8548,"Diamond":8549,"concepts":8550,"interface":8551,"##aki":8552,"Watch":8553,"Article":8554,"Columbus":8555,"dialogue":8556,"pause":8557,"##rio":8558,"extends":8559,"blanket":8560,"pulse":8561,"1853":8562,"affiliate":8563,"ladies":8564,"Ronald":8565,"counted":8566,"kills":8567,"demons":8568,"##zation":8569,"Airlines":8570,"Marco":8571,"Cat":8572,"companion":8573,"mere":8574,"Yugoslavia":8575,"Forum":8576,"Allan":8577,"pioneer":8578,"Competition":8579,"Methodist":8580,"patent":8581,"nobody":8582,"Stockholm":8583,"##ien":8584,"regulation":8585,"##ois":8586,"accomplished":8587,"##itive":8588,"washed":8589,"sake":8590,"Vladimir":8591,"crops":8592,"prestigious":8593,"humor":8594,"Sally":8595,"labour":8596,"tributary":8597,"trap":8598,"altered":8599,"examined":8600,"Mumbai":8601,"bombing":8602,"Ash":8603,"noble":8604,"suspension":8605,"ruins":8606,"##bank":8607,"spare":8608,"displays":8609,"guided":8610,"dimensional":8611,"Iraqi":8612,"##hon":8613,"sciences":8614,"Franz":8615,"relating":8616,"fence":8617,"followers":8618,"Palestine":8619,"invented":8620,"proceeded":8621,"Batman":8622,"Bradley":8623,"##yard":8624,"##ova":8625,"crystal":8626,"Kerala":8627,"
##ima":8628,"shipping":8629,"handled":8630,"Want":8631,"abolished":8632,"Drew":8633,"##tter":8634,"Powell":8635,"Half":8636,"##table":8637,"##cker":8638,"exhibitions":8639,"Were":8640,"assignment":8641,"assured":8642,"##rine":8643,"Indonesian":8644,"Grammy":8645,"acknowledged":8646,"Kylie":8647,"coaches":8648,"structural":8649,"clearing":8650,"stationed":8651,"Say":8652,"Total":8653,"Rail":8654,"besides":8655,"glow":8656,"threats":8657,"afford":8658,"Tree":8659,"Musical":8660,"##pp":8661,"elite":8662,"centered":8663,"explore":8664,"Engineers":8665,"Stakes":8666,"Hello":8667,"tourism":8668,"severely":8669,"assessment":8670,"##tly":8671,"crack":8672,"politicians":8673,"##rrow":8674,"sheets":8675,"volunteer":8676,"##borough":8677,"##hold":8678,"announcement":8679,"recover":8680,"contribute":8681,"lungs":8682,"##ille":8683,"mainland":8684,"presentation":8685,"Johann":8686,"Writing":8687,"1849":8688,"##bird":8689,"Study":8690,"Boulevard":8691,"coached":8692,"fail":8693,"airline":8694,"Congo":8695,"Plus":8696,"Syrian":8697,"introduce":8698,"ridge":8699,"Casey":8700,"manages":8701,"##fi":8702,"searched":8703,"Support":8704,"succession":8705,"progressive":8706,"coup":8707,"cultures":8708,"##lessly":8709,"sensation":8710,"Cork":8711,"Elena":8712,"Sofia":8713,"Philosophy":8714,"mini":8715,"trunk":8716,"academy":8717,"Mass":8718,"Liz":8719,"practiced":8720,"Reid":8721,"##ule":8722,"satisfied":8723,"experts":8724,"Wilhelm":8725,"Woods":8726,"invitation":8727,"Angels":8728,"calendar":8729,"joy":8730,"Sr":8731,"Dam":8732,"packed":8733,"##uan":8734,"bastard":8735,"Workers":8736,"broadcasts":8737,"logic":8738,"cooking":8739,"backward":8740,"##ack":8741,"Chen":8742,"creates":8743,"enzyme":8744,"##xi":8745,"Davies":8746,"aviation":8747,"VII":8748,"Conservation":8749,"fucking":8750,"Knights":8751,"##kan":8752,"requiring":8753,"hectares":8754,"wars":8755,"ate":8756,"##box":8757,"Mind":8758,"desired":8759,"oak":8760,"absorbed":8761,"Really":8762,"Vietnamese":8763,"Paulo":8764,"athlete":
8765,"##car":8766,"##eth":8767,"Talk":8768,"Wu":8769,"##cks":8770,"survivors":8771,"Yang":8772,"Joel":8773,"Almost":8774,"Holmes":8775,"Armed":8776,"Joshua":8777,"priests":8778,"discontinued":8779,"##sey":8780,"blond":8781,"Rolling":8782,"suggesting":8783,"CA":8784,"clay":8785,"exterior":8786,"Scientific":8787,"##sive":8788,"Giovanni":8789,"Hi":8790,"farther":8791,"contents":8792,"Winners":8793,"animation":8794,"neutral":8795,"mall":8796,"Notes":8797,"layers":8798,"professionals":8799,"Armstrong":8800,"Against":8801,"Piano":8802,"involve":8803,"monitor":8804,"angel":8805,"parked":8806,"bears":8807,"seated":8808,"feat":8809,"beliefs":8810,"##kers":8811,"Version":8812,"suffer":8813,"##ceae":8814,"guidance":8815,"##eur":8816,"honored":8817,"raid":8818,"alarm":8819,"Glen":8820,"Ellen":8821,"Jamaica":8822,"trio":8823,"enabled":8824,"##ils":8825,"procedures":8826,"##hus":8827,"moderate":8828,"upstairs":8829,"##ses":8830,"torture":8831,"Georgian":8832,"rebellion":8833,"Fernando":8834,"Nice":8835,"##are":8836,"Aires":8837,"Campus":8838,"beast":8839,"##hing":8840,"1847":8841,"##FA":8842,"Isle":8843,"##logist":8844,"Princeton":8845,"cathedral":8846,"Oakland":8847,"Solomon":8848,"##tto":8849,"Milwaukee":8850,"upcoming":8851,"midfielder":8852,"Neither":8853,"sacred":8854,"Eyes":8855,"appreciate":8856,"Brunswick":8857,"secrets":8858,"Rice":8859,"Somerset":8860,"Chancellor":8861,"Curtis":8862,"##gel":8863,"Rich":8864,"separation":8865,"grid":8866,"##los":8867,"##bon":8868,"urge":8869,"##ees":8870,"##ree":8871,"freight":8872,"towers":8873,"psychology":8874,"requirement":8875,"dollar":8876,"##fall":8877,"##sman":8878,"exile":8879,"tomb":8880,"Salt":8881,"Stefan":8882,"Buenos":8883,"Revival":8884,"Porter":8885,"tender":8886,"diesel":8887,"chocolate":8888,"Eugene":8889,"Legion":8890,"Laboratory":8891,"sheep":8892,"arched":8893,"hospitals":8894,"orbit":8895,"Full":8896,"##hall":8897,"drinks":8898,"ripped":8899,"##RS":8900,"tense":8901,"Hank":8902,"leagues":8903,"##nberg":8904,"PlaySta
tion":8905,"fool":8906,"Punjab":8907,"relatives":8908,"Comedy":8909,"sur":8910,"1846":8911,"Tonight":8912,"Sox":8913,"##if":8914,"Rabbi":8915,"org":8916,"speaks":8917,"institute":8918,"defender":8919,"painful":8920,"wishes":8921,"Weekly":8922,"literacy":8923,"portions":8924,"snake":8925,"item":8926,"deals":8927,"##tum":8928,"autumn":8929,"sharply":8930,"reforms":8931,"thighs":8932,"prototype":8933,"##ition":8934,"argues":8935,"disorder":8936,"Physics":8937,"terror":8938,"provisions":8939,"refugees":8940,"predominantly":8941,"independently":8942,"march":8943,"##graphy":8944,"Arabia":8945,"Andrews":8946,"Bus":8947,"Money":8948,"drops":8949,"##zar":8950,"pistol":8951,"matrix":8952,"revolutionary":8953,"##ust":8954,"Starting":8955,"##ptic":8956,"Oak":8957,"Monica":8958,"##ides":8959,"servants":8960,"##hed":8961,"archaeological":8962,"divorced":8963,"rocket":8964,"enjoying":8965,"fires":8966,"##nel":8967,"assembled":8968,"qualification":8969,"retiring":8970,"##fied":8971,"Distinguished":8972,"handful":8973,"infection":8974,"Durham":8975,"##itz":8976,"fortune":8977,"renewed":8978,"Chelsea":8979,"##sley":8980,"curved":8981,"gesture":8982,"retain":8983,"exhausted":8984,"##ifying":8985,"Perth":8986,"jumping":8987,"Palestinian":8988,"Simpson":8989,"colonies":8990,"steal":8991,"##chy":8992,"corners":8993,"Finn":8994,"arguing":8995,"Martha":8996,"##var":8997,"Betty":8998,"emerging":8999,"Heights":9000,"Hindi":9001,"Manila":9002,"pianist":9003,"founders":9004,"regret":9005,"Napoleon":9006,"elbow":9007,"overhead":9008,"bold":9009,"praise":9010,"humanity":9011,"##ori":9012,"Revolutionary":9013,"##ere":9014,"fur":9015,"##ole":9016,"Ashley":9017,"Official":9018,"##rm":9019,"lovely":9020,"Architecture":9021,"##sch":9022,"Baronet":9023,"virtually":9024,"##OS":9025,"descended":9026,"immigration":9027,"##das":9028,"##kes":9029,"Holly":9030,"Wednesday":9031,"maintains":9032,"theatrical":9033,"Evan":9034,"Gardens":9035,"citing":9036,"##gia":9037,"segments":9038,"Bailey":9039,"Ghost":9040,
"##city":9041,"governing":9042,"graphics":9043,"##ined":9044,"privately":9045,"potentially":9046,"transformation":9047,"Crystal":9048,"Cabinet":9049,"sacrifice":9050,"hesitated":9051,"mud":9052,"Apollo":9053,"Desert":9054,"bin":9055,"victories":9056,"Editor":9057,"Railways":9058,"Web":9059,"Case":9060,"tourists":9061,"Brussels":9062,"Franco":9063,"compiled":9064,"topped":9065,"Gene":9066,"engineers":9067,"commentary":9068,"egg":9069,"escort":9070,"nerve":9071,"arch":9072,"necessarily":9073,"frustration":9074,"Michelle":9075,"democracy":9076,"genes":9077,"Facebook":9078,"halfway":9079,"##ient":9080,"102":9081,"flipped":9082,"Won":9083,"##mit":9084,"NASA":9085,"Lynn":9086,"Provincial":9087,"ambassador":9088,"Inspector":9089,"glared":9090,"Change":9091,"McDonald":9092,"developments":9093,"tucked":9094,"noting":9095,"Gibson":9096,"circulation":9097,"dubbed":9098,"armies":9099,"resource":9100,"Headquarters":9101,"##iest":9102,"Mia":9103,"Albanian":9104,"Oil":9105,"Albums":9106,"excuse":9107,"intervention":9108,"Grande":9109,"Hugo":9110,"integration":9111,"civilians":9112,"depends":9113,"reserves":9114,"Dee":9115,"compositions":9116,"identification":9117,"restrictions":9118,"quarterback":9119,"Miranda":9120,"Universe":9121,"favourite":9122,"ranges":9123,"hint":9124,"loyal":9125,"Op":9126,"entity":9127,"Manual":9128,"quoted":9129,"dealt":9130,"specialist":9131,"Zhang":9132,"download":9133,"Westminster":9134,"Rebecca":9135,"streams":9136,"Anglican":9137,"variations":9138,"Mine":9139,"detective":9140,"Films":9141,"reserved":9142,"##oke":9143,"##key":9144,"sailing":9145,"##gger":9146,"expanding":9147,"recall":9148,"discovers":9149,"particles":9150,"behaviour":9151,"Gavin":9152,"blank":9153,"permit":9154,"Java":9155,"Fraser":9156,"Pass":9157,"##non":9158,"##TA":9159,"panels":9160,"statistics":9161,"notion":9162,"courage":9163,"dare":9164,"venues":9165,"##roy":9166,"Box":9167,"Newport":9168,"travelling":9169,"Thursday":9170,"warriors":9171,"Glenn":9172,"criteria":9173,"360":917
4,"mutual":9175,"restore":9176,"varied":9177,"bitter":9178,"Katherine":9179,"##lant":9180,"ritual":9181,"bits":9182,"##à":9183,"Henderson":9184,"trips":9185,"Richardson":9186,"Detective":9187,"curse":9188,"psychological":9189,"Il":9190,"midnight":9191,"streak":9192,"facts":9193,"Dawn":9194,"Indies":9195,"Edmund":9196,"roster":9197,"Gen":9198,"##nation":9199,"1830":9200,"congregation":9201,"shaft":9202,"##ically":9203,"##mination":9204,"Indianapolis":9205,"Sussex":9206,"loving":9207,"##bit":9208,"sounding":9209,"horrible":9210,"Continental":9211,"Griffin":9212,"advised":9213,"magical":9214,"millions":9215,"##date":9216,"1845":9217,"Safety":9218,"lifting":9219,"determination":9220,"valid":9221,"dialect":9222,"Penn":9223,"Know":9224,"triple":9225,"avoided":9226,"dancer":9227,"judgment":9228,"sixty":9229,"farmer":9230,"lakes":9231,"blast":9232,"aggressive":9233,"Abby":9234,"tag":9235,"chains":9236,"inscription":9237,"##nn":9238,"conducting":9239,"Scout":9240,"buying":9241,"##wich":9242,"spreading":9243,"##OC":9244,"array":9245,"hurried":9246,"Environment":9247,"improving":9248,"prompted":9249,"fierce":9250,"Taking":9251,"Away":9252,"tune":9253,"pissed":9254,"Bull":9255,"catching":9256,"##ying":9257,"eyebrow":9258,"metropolitan":9259,"terrain":9260,"##rel":9261,"Lodge":9262,"manufacturers":9263,"creator":9264,"##etic":9265,"happiness":9266,"ports":9267,"##ners":9268,"Relations":9269,"fortress":9270,"targeted":9271,"##ST":9272,"allegedly":9273,"blues":9274,"##osa":9275,"Bosnia":9276,"##dom":9277,"burial":9278,"similarly":9279,"stranger":9280,"pursued":9281,"symbols":9282,"rebels":9283,"reflection":9284,"routine":9285,"traced":9286,"indoor":9287,"eventual":9288,"##ska":9289,"##ão":9290,"##una":9291,"MD":9292,"##phone":9293,"oh":9294,"grants":9295,"Reynolds":9296,"rid":9297,"operators":9298,"##nus":9299,"Joey":9300,"vital":9301,"siblings":9302,"keyboard":9303,"br":9304,"removing":9305,"societies":9306,"drives":9307,"solely":9308,"princess":9309,"lighter":9310,"Various":9311
,"Cavalry":9312,"believing":9313,"SC":9314,"underwent":9315,"relay":9316,"smelled":9317,"syndrome":9318,"welfare":9319,"authorized":9320,"seemingly":9321,"Hard":9322,"chicken":9323,"##rina":9324,"Ages":9325,"Bo":9326,"democratic":9327,"barn":9328,"Eye":9329,"shorts":9330,"##coming":9331,"##hand":9332,"disappointed":9333,"unexpected":9334,"centres":9335,"Exhibition":9336,"Stories":9337,"Site":9338,"banking":9339,"accidentally":9340,"Agent":9341,"conjunction":9342,"André":9343,"Chloe":9344,"resist":9345,"width":9346,"Queens":9347,"provision":9348,"##art":9349,"Melissa":9350,"Honorary":9351,"Del":9352,"prefer":9353,"abruptly":9354,"duration":9355,"##vis":9356,"Glass":9357,"enlisted":9358,"##ado":9359,"discipline":9360,"Sisters":9361,"carriage":9362,"##ctor":9363,"##sburg":9364,"Lancashire":9365,"log":9366,"fuck":9367,"##iz":9368,"closet":9369,"collecting":9370,"holy":9371,"rape":9372,"trusted":9373,"cleaning":9374,"inhabited":9375,"Rocky":9376,"104":9377,"editorial":9378,"##yu":9379,"##ju":9380,"succeed":9381,"strict":9382,"Cuban":9383,"##iya":9384,"Bronze":9385,"outcome":9386,"##ifies":9387,"##set":9388,"corps":9389,"Hero":9390,"barrier":9391,"Kumar":9392,"groaned":9393,"Nina":9394,"Burton":9395,"enable":9396,"stability":9397,"Milton":9398,"knots":9399,"##ination":9400,"slavery":9401,"##borg":9402,"curriculum":9403,"trailer":9404,"warfare":9405,"Dante":9406,"Edgar":9407,"revival":9408,"Copenhagen":9409,"define":9410,"advocate":9411,"Garrett":9412,"Luther":9413,"overcome":9414,"pipe":9415,"750":9416,"construct":9417,"Scotia":9418,"kings":9419,"flooding":9420,"##hard":9421,"Ferdinand":9422,"Felix":9423,"forgot":9424,"Fish":9425,"Kurt":9426,"elaborate":9427,"##BC":9428,"graphic":9429,"gripped":9430,"colonel":9431,"Sophia":9432,"Advisory":9433,"Self":9434,"##uff":9435,"##lio":9436,"monitoring":9437,"seal":9438,"senses":9439,"rises":9440,"peaceful":9441,"journals":9442,"1837":9443,"checking":9444,"legendary":9445,"Ghana":9446,"##power":9447,"ammunition":9448,"Rosa":9449,"R
ichards":9450,"nineteenth":9451,"ferry":9452,"aggregate":9453,"Troy":9454,"inter":9455,"##wall":9456,"Triple":9457,"steep":9458,"tent":9459,"Cyprus":9460,"1844":9461,"##woman":9462,"commanding":9463,"farms":9464,"doi":9465,"navy":9466,"specified":9467,"na":9468,"cricketer":9469,"transported":9470,"Think":9471,"comprising":9472,"grateful":9473,"solve":9474,"##core":9475,"beings":9476,"clerk":9477,"grain":9478,"vector":9479,"discrimination":9480,"##TC":9481,"Katie":9482,"reasonable":9483,"drawings":9484,"veins":9485,"consideration":9486,"Monroe":9487,"repeat":9488,"breed":9489,"dried":9490,"witnessed":9491,"ordained":9492,"Current":9493,"spirits":9494,"remarkable":9495,"consultant":9496,"urged":9497,"Remember":9498,"anime":9499,"singers":9500,"phenomenon":9501,"Rhode":9502,"Carlo":9503,"demanding":9504,"findings":9505,"manual":9506,"varying":9507,"Fellowship":9508,"generate":9509,"safely":9510,"heated":9511,"withdrawn":9512,"##ao":9513,"headquartered":9514,"##zon":9515,"##lav":9516,"##ency":9517,"Col":9518,"Memphis":9519,"imposed":9520,"rivals":9521,"Planet":9522,"healing":9523,"##hs":9524,"ensemble":9525,"Warriors":9526,"##bone":9527,"cult":9528,"Frankfurt":9529,"##HL":9530,"diversity":9531,"Gerald":9532,"intermediate":9533,"##izes":9534,"reactions":9535,"Sister":9536,"##ously":9537,"##lica":9538,"quantum":9539,"awkward":9540,"mentions":9541,"pursuit":9542,"##ography":9543,"varies":9544,"profession":9545,"molecular":9546,"consequence":9547,"lectures":9548,"cracked":9549,"103":9550,"slowed":9551,"##tsu":9552,"cheese":9553,"upgraded":9554,"suite":9555,"substance":9556,"Kingston":9557,"1800":9558,"Idaho":9559,"Theory":9560,"##een":9561,"ain":9562,"Carson":9563,"Molly":9564,"##OR":9565,"configuration":9566,"Whitney":9567,"reads":9568,"audiences":9569,"##tie":9570,"Geneva":9571,"Outside":9572,"##nen":9573,"##had":9574,"transit":9575,"volleyball":9576,"Randy":9577,"Chad":9578,"rubber":9579,"motorcycle":9580,"respected":9581,"eager":9582,"Level":9583,"coin":9584,"##lets":95
85,"neighbouring":9586,"##wski":9587,"confident":9588,"##cious":9589,"poll":9590,"uncertain":9591,"punch":9592,"thesis":9593,"Tucker":9594,"IATA":9595,"Alec":9596,"##ographic":9597,"##law":9598,"1841":9599,"desperately":9600,"1812":9601,"Lithuania":9602,"accent":9603,"Cox":9604,"lightning":9605,"skirt":9606,"##load":9607,"Burns":9608,"Dynasty":9609,"##ug":9610,"chapters":9611,"Working":9612,"dense":9613,"Morocco":9614,"##kins":9615,"casting":9616,"Set":9617,"activated":9618,"oral":9619,"Brien":9620,"horn":9621,"HIV":9622,"dawn":9623,"stumbled":9624,"altar":9625,"tore":9626,"considerably":9627,"Nicole":9628,"interchange":9629,"registration":9630,"biography":9631,"Hull":9632,"Stan":9633,"bulk":9634,"consent":9635,"Pierce":9636,"##ER":9637,"Fifth":9638,"marched":9639,"terrorist":9640,"##piece":9641,"##itt":9642,"Presidential":9643,"Heather":9644,"staged":9645,"Plant":9646,"relegation":9647,"sporting":9648,"joins":9649,"##ced":9650,"Pakistani":9651,"dynamic":9652,"Heat":9653,"##lf":9654,"ourselves":9655,"Except":9656,"Elliott":9657,"nationally":9658,"goddess":9659,"investors":9660,"Burke":9661,"Jackie":9662,"##ā":9663,"##RA":9664,"Tristan":9665,"Associate":9666,"Tuesday":9667,"scope":9668,"Near":9669,"bunch":9670,"##abad":9671,"##ben":9672,"sunlight":9673,"##aire":9674,"manga":9675,"Willie":9676,"trucks":9677,"boarding":9678,"Lion":9679,"lawsuit":9680,"Learning":9681,"Der":9682,"pounding":9683,"awful":9684,"##mine":9685,"IT":9686,"Legend":9687,"romance":9688,"Serie":9689,"AC":9690,"gut":9691,"precious":9692,"Robertson":9693,"hometown":9694,"realm":9695,"Guards":9696,"Tag":9697,"batting":9698,"##vre":9699,"halt":9700,"conscious":9701,"1838":9702,"acquire":9703,"collar":9704,"##gg":9705,"##ops":9706,"Herald":9707,"nationwide":9708,"citizenship":9709,"Aircraft":9710,"decrease":9711,"em":9712,"Fiction":9713,"Female":9714,"corporation":9715,"Located":9716,"##ip":9717,"fights":9718,"unconscious":9719,"Tampa":9720,"Poetry":9721,"lobby":9722,"Malta":9723,"##sar":9724,"##bie":97
25,"layout":9726,"Tate":9727,"reader":9728,"stained":9729,"##bre":9730,"##rst":9731,"##ulate":9732,"loudly":9733,"Eva":9734,"Cohen":9735,"exploded":9736,"Merit":9737,"Maya":9738,"##rable":9739,"Rovers":9740,"##IC":9741,"Morrison":9742,"Should":9743,"vinyl":9744,"##mie":9745,"onwards":9746,"##gie":9747,"vicinity":9748,"Wildlife":9749,"probability":9750,"Mar":9751,"Barnes":9752,"##ook":9753,"spinning":9754,"Moses":9755,"##vie":9756,"Surrey":9757,"Planning":9758,"conferences":9759,"protective":9760,"Plaza":9761,"deny":9762,"Canterbury":9763,"manor":9764,"Estate":9765,"tilted":9766,"comics":9767,"IBM":9768,"destroying":9769,"server":9770,"Dorothy":9771,"##horn":9772,"Oslo":9773,"lesser":9774,"heaven":9775,"Marshal":9776,"scales":9777,"strikes":9778,"##ath":9779,"firms":9780,"attract":9781,"##BS":9782,"controlling":9783,"Bradford":9784,"southeastern":9785,"Amazon":9786,"Travis":9787,"Janet":9788,"governed":9789,"1842":9790,"Train":9791,"Holden":9792,"bleeding":9793,"gifts":9794,"rent":9795,"1839":9796,"palms":9797,"##ū":9798,"judicial":9799,"Ho":9800,"Finals":9801,"conflicts":9802,"unlikely":9803,"draws":9804,"##cies":9805,"compensation":9806,"adds":9807,"elderly":9808,"Anton":9809,"lasting":9810,"Nintendo":9811,"codes":9812,"ministers":9813,"pot":9814,"associations":9815,"capabilities":9816,"##cht":9817,"libraries":9818,"##sie":9819,"chances":9820,"performers":9821,"runway":9822,"##af":9823,"##nder":9824,"Mid":9825,"Vocals":9826,"##uch":9827,"##eon":9828,"interpreted":9829,"priority":9830,"Uganda":9831,"ruined":9832,"Mathematics":9833,"cook":9834,"AFL":9835,"Lutheran":9836,"AIDS":9837,"Capitol":9838,"chase":9839,"axis":9840,"Moreover":9841,"María":9842,"Saxon":9843,"storyline":9844,"##ffed":9845,"Tears":9846,"Kid":9847,"cent":9848,"colours":9849,"Sex":9850,"##long":9851,"pm":9852,"blonde":9853,"Edwin":9854,"CE":9855,"diocese":9856,"##ents":9857,"##boy":9858,"Inn":9859,"##ller":9860,"Saskatchewan":9861,"##kh":9862,"stepping":9863,"Windsor":9864,"##oka":9865,"##eri":9866,
"Xavier":9867,"Resources":9868,"1843":9869,"##top":9870,"##rad":9871,"##lls":9872,"Testament":9873,"poorly":9874,"1836":9875,"drifted":9876,"slope":9877,"CIA":9878,"remix":9879,"Lords":9880,"mature":9881,"hosting":9882,"diamond":9883,"beds":9884,"##ncies":9885,"luxury":9886,"trigger":9887,"##lier":9888,"preliminary":9889,"hybrid":9890,"journalists":9891,"Enterprise":9892,"proven":9893,"expelled":9894,"insects":9895,"Beautiful":9896,"lifestyle":9897,"vanished":9898,"##ake":9899,"##ander":9900,"matching":9901,"surfaces":9902,"Dominican":9903,"Kids":9904,"referendum":9905,"Orlando":9906,"Truth":9907,"Sandy":9908,"privacy":9909,"Calgary":9910,"Speaker":9911,"sts":9912,"Nobody":9913,"shifting":9914,"##gers":9915,"Roll":9916,"Armenia":9917,"Hand":9918,"##ES":9919,"106":9920,"##ont":9921,"Guild":9922,"larvae":9923,"Stock":9924,"flame":9925,"gravity":9926,"enhanced":9927,"Marion":9928,"surely":9929,"##tering":9930,"Tales":9931,"algorithm":9932,"Emmy":9933,"darker":9934,"VIII":9935,"##lash":9936,"hamlet":9937,"deliberately":9938,"occurring":9939,"choices":9940,"Gage":9941,"fees":9942,"settling":9943,"ridiculous":9944,"##ela":9945,"Sons":9946,"cop":9947,"custody":9948,"##ID":9949,"proclaimed":9950,"Cardinals":9951,"##pm":9952,"Metal":9953,"Ana":9954,"1835":9955,"clue":9956,"Cardiff":9957,"riders":9958,"observations":9959,"MA":9960,"sometime":9961,"##och":9962,"performer":9963,"intact":9964,"Points":9965,"allegations":9966,"rotation":9967,"Tennis":9968,"tenor":9969,"Directors":9970,"##ats":9971,"Transit":9972,"thigh":9973,"Complex":9974,"##works":9975,"twentieth":9976,"Factory":9977,"doctrine":9978,"Daddy":9979,"##ished":9980,"pretend":9981,"Winston":9982,"cigarette":9983,"##IA":9984,"specimens":9985,"hydrogen":9986,"smoking":9987,"mathematical":9988,"arguments":9989,"openly":9990,"developer":9991,"##iro":9992,"fists":9993,"somebody":9994,"##san":9995,"Standing":9996,"Caleb":9997,"intelligent":9998,"Stay":9999,"Interior":10000,"echoed":10001,"Valentine":10002,"varieties":10003
,"Brady":10004,"cluster":10005,"Ever":10006,"voyage":10007,"##of":10008,"deposits":10009,"ultimate":10010,"Hayes":10011,"horizontal":10012,"proximity":10013,"##ás":10014,"estates":10015,"exploration":10016,"NATO":10017,"Classical":10018,"##most":10019,"bills":10020,"condemned":10021,"1832":10022,"hunger":10023,"##ato":10024,"planes":10025,"deserve":10026,"offense":10027,"sequences":10028,"rendered":10029,"acceptance":10030,"##ony":10031,"manufacture":10032,"Plymouth":10033,"innovative":10034,"predicted":10035,"##RC":10036,"Fantasy":10037,"##une":10038,"supporter":10039,"absent":10040,"Picture":10041,"bassist":10042,"rescued":10043,"##MC":10044,"Ahmed":10045,"Monte":10046,"##sts":10047,"##rius":10048,"insane":10049,"novelist":10050,"##és":10051,"agrees":10052,"Antarctic":10053,"Lancaster":10054,"Hopkins":10055,"calculated":10056,"startled":10057,"##star":10058,"tribal":10059,"Amendment":10060,"##hoe":10061,"invisible":10062,"patron":10063,"deer":10064,"Walk":10065,"tracking":10066,"Lyon":10067,"tickets":10068,"##ED":10069,"philosopher":10070,"compounds":10071,"chuckled":10072,"##wi":10073,"pound":10074,"loyalty":10075,"Academic":10076,"petition":10077,"refuses":10078,"marking":10079,"Mercury":10080,"northeastern":10081,"dimensions":10082,"scandal":10083,"Canyon":10084,"patch":10085,"publish":10086,"##oning":10087,"Peak":10088,"minds":10089,"##boro":10090,"Presbyterian":10091,"Hardy":10092,"theoretical":10093,"magnitude":10094,"bombs":10095,"cage":10096,"##ders":10097,"##kai":10098,"measuring":10099,"explaining":10100,"avoiding":10101,"touchdowns":10102,"Card":10103,"theology":10104,"##ured":10105,"Popular":10106,"export":10107,"suspicious":10108,"Probably":10109,"photograph":10110,"Lou":10111,"Parks":10112,"Arms":10113,"compact":10114,"Apparently":10115,"excess":10116,"Banks":10117,"lied":10118,"stunned":10119,"territorial":10120,"Filipino":10121,"spectrum":10122,"learns":10123,"wash":10124,"imprisonment":10125,"ugly":10126,"##rose":10127,"Albany":10128,"Erik":10129,
"sends":10130,"##hara":10131,"##rid":10132,"consumed":10133,"##gling":10134,"Belgrade":10135,"Da":10136,"opposing":10137,"Magnus":10138,"footsteps":10139,"glowing":10140,"delicate":10141,"Alexandria":10142,"Ludwig":10143,"gorgeous":10144,"Bros":10145,"Index":10146,"##PA":10147,"customs":10148,"preservation":10149,"bonds":10150,"##mond":10151,"environments":10152,"##nto":10153,"instructed":10154,"parted":10155,"adoption":10156,"locality":10157,"workshops":10158,"goalkeeper":10159,"##rik":10160,"##uma":10161,"Brighton":10162,"Slovenia":10163,"##ulating":10164,"##tical":10165,"towel":10166,"hugged":10167,"stripped":10168,"Bears":10169,"upright":10170,"Wagner":10171,"##aux":10172,"secretly":10173,"Adventures":10174,"nest":10175,"Course":10176,"Lauren":10177,"Boeing":10178,"Abdul":10179,"Lakes":10180,"450":10181,"##cu":10182,"USSR":10183,"caps":10184,"Chan":10185,"##nna":10186,"conceived":10187,"Actually":10188,"Belfast":10189,"Lithuanian":10190,"concentrate":10191,"possess":10192,"militia":10193,"pine":10194,"protagonist":10195,"Helena":10196,"##PS":10197,"##band":10198,"Belle":10199,"Clara":10200,"Reform":10201,"currency":10202,"pregnancy":10203,"1500":10204,"##rim":10205,"Isabella":10206,"hull":10207,"Name":10208,"trend":10209,"journalism":10210,"diet":10211,"##mel":10212,"Recording":10213,"acclaimed":10214,"Tang":10215,"Jace":10216,"steering":10217,"vacant":10218,"suggestion":10219,"costume":10220,"laser":10221,"##š":10222,"##ink":10223,"##pan":10224,"##vić":10225,"integral":10226,"achievements":10227,"wise":10228,"classroom":10229,"unions":10230,"southwestern":10231,"##uer":10232,"Garcia":10233,"toss":10234,"Tara":10235,"Large":10236,"##tate":10237,"evident":10238,"responsibilities":10239,"populated":10240,"satisfaction":10241,"##bia":10242,"casual":10243,"Ecuador":10244,"##ght":10245,"arose":10246,"##ović":10247,"Cornwall":10248,"embrace":10249,"refuse":10250,"Heavyweight":10251,"XI":10252,"Eden":10253,"activists":10254,"##uation":10255,"biology":10256,"##shan":102
57,"fraud":10258,"Fuck":10259,"matched":10260,"legacy":10261,"Rivers":10262,"missionary":10263,"extraordinary":10264,"Didn":10265,"holder":10266,"wickets":10267,"crucial":10268,"Writers":10269,"Hurricane":10270,"Iceland":10271,"gross":10272,"trumpet":10273,"accordance":10274,"hurry":10275,"flooded":10276,"doctorate":10277,"Albania":10278,"##yi":10279,"united":10280,"deceased":10281,"jealous":10282,"grief":10283,"flute":10284,"portraits":10285,"##а":10286,"pleasant":10287,"Founded":10288,"Face":10289,"crowned":10290,"Raja":10291,"advisor":10292,"Salem":10293,"##ec":10294,"Achievement":10295,"admission":10296,"freely":10297,"minimal":10298,"Sudan":10299,"developers":10300,"estimate":10301,"disabled":10302,"##lane":10303,"downstairs":10304,"Bruno":10305,"##pus":10306,"pinyin":10307,"##ude":10308,"lecture":10309,"deadly":10310,"underlying":10311,"optical":10312,"witnesses":10313,"Combat":10314,"Julius":10315,"tapped":10316,"variants":10317,"##like":10318,"Colonial":10319,"Critics":10320,"Similarly":10321,"mouse":10322,"voltage":10323,"sculptor":10324,"Concert":10325,"salary":10326,"Frances":10327,"##ground":10328,"hook":10329,"premises":10330,"Software":10331,"instructor":10332,"nominee":10333,"##ited":10334,"fog":10335,"slopes":10336,"##zu":10337,"vegetation":10338,"sail":10339,"##rch":10340,"Body":10341,"Apart":10342,"atop":10343,"View":10344,"utility":10345,"ribs":10346,"cab":10347,"migration":10348,"##wyn":10349,"bounded":10350,"2019":10351,"pillow":10352,"trails":10353,"##ub":10354,"Halifax":10355,"shade":10356,"Rush":10357,"##lah":10358,"##dian":10359,"Notre":10360,"interviewed":10361,"Alexandra":10362,"Springfield":10363,"Indeed":10364,"rubbing":10365,"dozens":10366,"amusement":10367,"legally":10368,"##lers":10369,"Jill":10370,"Cinema":10371,"ignoring":10372,"Choice":10373,"##ures":10374,"pockets":10375,"##nell":10376,"laying":10377,"Blair":10378,"tackles":10379,"separately":10380,"##teen":10381,"Criminal":10382,"performs":10383,"theorem":10384,"Communication":10
385,"suburbs":10386,"##iel":10387,"competitors":10388,"rows":10389,"##hai":10390,"Manitoba":10391,"Eleanor":10392,"interactions":10393,"nominations":10394,"assassination":10395,"##dis":10396,"Edmonton":10397,"diving":10398,"##dine":10399,"essay":10400,"##tas":10401,"AFC":10402,"Edge":10403,"directing":10404,"imagination":10405,"sunk":10406,"implement":10407,"Theodore":10408,"trembling":10409,"sealed":10410,"##rock":10411,"Nobel":10412,"##ancy":10413,"##dorf":10414,"##chen":10415,"genuine":10416,"apartments":10417,"Nicolas":10418,"AA":10419,"Bach":10420,"Globe":10421,"Store":10422,"220":10423,"##10":10424,"Rochester":10425,"##ño":10426,"alert":10427,"107":10428,"Beck":10429,"##nin":10430,"Naples":10431,"Basin":10432,"Crawford":10433,"fears":10434,"Tracy":10435,"##hen":10436,"disk":10437,"##pped":10438,"seventeen":10439,"Lead":10440,"backup":10441,"reconstruction":10442,"##lines":10443,"terrified":10444,"sleeve":10445,"nicknamed":10446,"popped":10447,"##making":10448,"##ern":10449,"Holiday":10450,"Gospel":10451,"ibn":10452,"##ime":10453,"convert":10454,"divine":10455,"resolved":10456,"##quet":10457,"ski":10458,"realizing":10459,"##RT":10460,"Legislature":10461,"reservoir":10462,"Rain":10463,"sinking":10464,"rainfall":10465,"elimination":10466,"challenging":10467,"tobacco":10468,"##outs":10469,"Given":10470,"smallest":10471,"Commercial":10472,"pin":10473,"rebel":10474,"comedian":10475,"exchanged":10476,"airing":10477,"dish":10478,"Salvador":10479,"promising":10480,"##wl":10481,"relax":10482,"presenter":10483,"toll":10484,"aerial":10485,"##eh":10486,"Fletcher":10487,"brass":10488,"disappear":10489,"zones":10490,"adjusted":10491,"contacts":10492,"##lk":10493,"sensed":10494,"Walt":10495,"mild":10496,"toes":10497,"flies":10498,"shame":10499,"considers":10500,"wildlife":10501,"Hanna":10502,"Arsenal":10503,"Ladies":10504,"naming":10505,"##ishing":10506,"anxiety":10507,"discussions":10508,"cute":10509,"undertaken":10510,"Cash":10511,"strain":10512,"Wyoming":10513,"dishes":105
14,"precise":10515,"Angela":10516,"##ided":10517,"hostile":10518,"twins":10519,"115":10520,"Built":10521,"##pel":10522,"Online":10523,"tactics":10524,"Newman":10525,"##bourne":10526,"unclear":10527,"repairs":10528,"embarrassed":10529,"listing":10530,"tugged":10531,"Vale":10532,"##gin":10533,"Meredith":10534,"bout":10535,"##cle":10536,"velocity":10537,"tips":10538,"froze":10539,"evaluation":10540,"demonstrate":10541,"##card":10542,"criticised":10543,"Nash":10544,"lineup":10545,"Rao":10546,"monks":10547,"bacteria":10548,"lease":10549,"##lish":10550,"frightened":10551,"den":10552,"revived":10553,"finale":10554,"##rance":10555,"flee":10556,"Letters":10557,"decreased":10558,"##oh":10559,"Sounds":10560,"wrap":10561,"Sharon":10562,"incidents":10563,"renovated":10564,"everybody":10565,"stole":10566,"Bath":10567,"boxing":10568,"1815":10569,"withdraw":10570,"backs":10571,"interim":10572,"react":10573,"murders":10574,"Rhodes":10575,"Copa":10576,"framed":10577,"flown":10578,"Estonia":10579,"Heavy":10580,"explored":10581,"##rra":10582,"##GA":10583,"##ali":10584,"Istanbul":10585,"1834":10586,"##rite":10587,"##aging":10588,"##ues":10589,"Episcopal":10590,"arc":10591,"orientation":10592,"Maxwell":10593,"infected":10594,"##rot":10595,"BCE":10596,"Brook":10597,"grasp":10598,"Roberto":10599,"Excellence":10600,"108":10601,"withdrawal":10602,"Marines":10603,"rider":10604,"Lo":10605,"##sin":10606,"##run":10607,"Subsequently":10608,"garrison":10609,"hurricane":10610,"facade":10611,"Prussia":10612,"crushed":10613,"enterprise":10614,"##mber":10615,"Twitter":10616,"Generation":10617,"Physical":10618,"Sugar":10619,"editing":10620,"communicate":10621,"Ellie":10622,"##hurst":10623,"Ernst":10624,"wagon":10625,"promotional":10626,"conquest":10627,"Parliamentary":10628,"courtyard":10629,"lawyers":10630,"Superman":10631,"email":10632,"Prussian":10633,"lately":10634,"lecturer":10635,"Singer":10636,"Majesty":10637,"Paradise":10638,"sooner":10639,"Heath":10640,"slot":10641,"curves":10642,"convoy":1064
3,"##vian":10644,"induced":10645,"synonym":10646,"breeze":10647,"##plane":10648,"##ox":10649,"peered":10650,"Coalition":10651,"##hia":10652,"odds":10653,"##esh":10654,"##lina":10655,"Tomorrow":10656,"Nadu":10657,"##ico":10658,"##rah":10659,"damp":10660,"autonomous":10661,"console":10662,"Victory":10663,"counts":10664,"Luxembourg":10665,"intimate":10666,"Archived":10667,"Carroll":10668,"spy":10669,"Zero":10670,"habit":10671,"Always":10672,"faction":10673,"teenager":10674,"Johnston":10675,"chaos":10676,"ruin":10677,"commerce":10678,"blog":10679,"##shed":10680,"##the":10681,"reliable":10682,"Word":10683,"Yu":10684,"Norton":10685,"parade":10686,"Catholics":10687,"damned":10688,"##iling":10689,"surgeon":10690,"##tia":10691,"Allison":10692,"Jonas":10693,"remarked":10694,"##ès":10695,"idiot":10696,"Making":10697,"proposals":10698,"Industries":10699,"strategies":10700,"artifacts":10701,"batteries":10702,"reward":10703,"##vers":10704,"Agricultural":10705,"distinguish":10706,"lengths":10707,"Jeffrey":10708,"Progressive":10709,"kicking":10710,"Patricia":10711,"##gio":10712,"ballot":10713,"##ios":10714,"skilled":10715,"##gation":10716,"Colt":10717,"limestone":10718,"##AS":10719,"peninsula":10720,"##itis":10721,"LA":10722,"hotels":10723,"shapes":10724,"Crime":10725,"depicting":10726,"northwestern":10727,"HD":10728,"silly":10729,"Das":10730,"##²":10731,"##ws":10732,"##ash":10733,"##matic":10734,"thermal":10735,"Has":10736,"forgive":10737,"surrendered":10738,"Palm":10739,"Nacional":10740,"drank":10741,"haired":10742,"Mercedes":10743,"##foot":10744,"loading":10745,"Timothy":10746,"##roll":10747,"mechanisms":10748,"traces":10749,"digging":10750,"discussing":10751,"Natalie":10752,"##zhou":10753,"Forbes":10754,"landmark":10755,"Anyway":10756,"Manor":10757,"conspiracy":10758,"gym":10759,"knocking":10760,"viewing":10761,"Formation":10762,"Pink":10763,"Beauty":10764,"limbs":10765,"Phillip":10766,"sponsor":10767,"Joy":10768,"granite":10769,"Harbour":10770,"##ero":10771,"payments":10772,"B
allet":10773,"conviction":10774,"##dam":10775,"Hood":10776,"estimates":10777,"lacked":10778,"Mad":10779,"Jorge":10780,"##wen":10781,"refuge":10782,"##LA":10783,"invaded":10784,"Kat":10785,"suburban":10786,"##fold":10787,"investigated":10788,"Ari":10789,"complained":10790,"creek":10791,"Georges":10792,"##uts":10793,"powder":10794,"accepting":10795,"deserved":10796,"carpet":10797,"Thunder":10798,"molecules":10799,"Legal":10800,"cliff":10801,"strictly":10802,"enrollment":10803,"ranch":10804,"##rg":10805,"##mba":10806,"proportion":10807,"renovation":10808,"crop":10809,"grabbing":10810,"##liga":10811,"finest":10812,"entries":10813,"receptor":10814,"helmet":10815,"blown":10816,"Listen":10817,"flagship":10818,"workshop":10819,"resolve":10820,"nails":10821,"Shannon":10822,"portal":10823,"jointly":10824,"shining":10825,"Violet":10826,"overwhelming":10827,"upward":10828,"Mick":10829,"proceedings":10830,"##dies":10831,"##aring":10832,"Laurence":10833,"Churchill":10834,"##rice":10835,"commit":10836,"170":10837,"inclusion":10838,"Examples":10839,"##verse":10840,"##rma":10841,"fury":10842,"paths":10843,"##SC":10844,"ankle":10845,"nerves":10846,"Chemistry":10847,"rectangular":10848,"sworn":10849,"screenplay":10850,"cake":10851,"Mann":10852,"Seoul":10853,"Animal":10854,"sizes":10855,"Speed":10856,"vol":10857,"Population":10858,"Southwest":10859,"Hold":10860,"continuously":10861,"Qualified":10862,"wishing":10863,"Fighting":10864,"Made":10865,"disappointment":10866,"Portsmouth":10867,"Thirty":10868,"##beck":10869,"Ahmad":10870,"teammate":10871,"MLB":10872,"graph":10873,"Charleston":10874,"realizes":10875,"##dium":10876,"exhibits":10877,"preventing":10878,"##int":10879,"fever":10880,"rivalry":10881,"Male":10882,"mentally":10883,"dull":10884,"##lor":10885,"##rich":10886,"consistently":10887,"##igan":10888,"Madame":10889,"certificate":10890,"suited":10891,"Krishna":10892,"accuracy":10893,"Webb":10894,"Budapest":10895,"Rex":10896,"1831":10897,"Cornell":10898,"OK":10899,"surveillance":109
00,"##gated":10901,"habitats":10902,"Adventure":10903,"Conrad":10904,"Superior":10905,"Gay":10906,"sofa":10907,"aka":10908,"boot":10909,"Statistics":10910,"Jessie":10911,"Liberation":10912,"##lip":10913,"##rier":10914,"brands":10915,"saint":10916,"Heinrich":10917,"Christine":10918,"bath":10919,"Rhine":10920,"ballet":10921,"Jin":10922,"consensus":10923,"chess":10924,"Arctic":10925,"stack":10926,"furious":10927,"cheap":10928,"toy":10929,"##yre":10930,"##face":10931,"##gging":10932,"gastropod":10933,"##nne":10934,"Romans":10935,"membrane":10936,"answering":10937,"25th":10938,"architects":10939,"sustainable":10940,"##yne":10941,"Hon":10942,"1814":10943,"Baldwin":10944,"dome":10945,"##awa":10946,"##zen":10947,"celebrity":10948,"enclosed":10949,"##uit":10950,"##mmer":10951,"Electronic":10952,"locals":10953,"##CE":10954,"supervision":10955,"mineral":10956,"Chemical":10957,"Slovakia":10958,"alley":10959,"hub":10960,"##az":10961,"heroes":10962,"Creative":10963,"##AM":10964,"incredible":10965,"politically":10966,"ESPN":10967,"yanked":10968,"halls":10969,"Aboriginal":10970,"Greatest":10971,"yield":10972,"##20":10973,"congressional":10974,"robot":10975,"Kiss":10976,"welcomed":10977,"MS":10978,"speeds":10979,"proceed":10980,"Sherman":10981,"eased":10982,"Greene":10983,"Walsh":10984,"Geoffrey":10985,"variables":10986,"rocky":10987,"##print":10988,"acclaim":10989,"Reverend":10990,"Wonder":10991,"tonnes":10992,"recurring":10993,"Dawson":10994,"continent":10995,"finite":10996,"AP":10997,"continental":10998,"ID":10999,"facilitate":11000,"essays":11001,"Rafael":11002,"Neal":11003,"1833":11004,"ancestors":11005,"##met":11006,"##gic":11007,"Especially":11008,"teenage":11009,"frustrated":11010,"Jules":11011,"cock":11012,"expense":11013,"##oli":11014,"##old":11015,"blocking":11016,"Notable":11017,"prohibited":11018,"ca":11019,"dock":11020,"organize":11021,"##wald":11022,"Burma":11023,"Gloria":11024,"dimension":11025,"aftermath":11026,"choosing":11027,"Mickey":11028,"torpedo":11029,"pub":1
1030,"##used":11031,"manuscripts":11032,"laps":11033,"Ulster":11034,"staircase":11035,"sphere":11036,"Insurance":11037,"Contest":11038,"lens":11039,"risks":11040,"investigations":11041,"ERA":11042,"glare":11043,"##play":11044,"Graduate":11045,"auction":11046,"Chronicle":11047,"##tric":11048,"##50":11049,"Coming":11050,"seating":11051,"Wade":11052,"seeks":11053,"inland":11054,"Thames":11055,"Rather":11056,"butterfly":11057,"contracted":11058,"positioned":11059,"consumers":11060,"contestants":11061,"fragments":11062,"Yankees":11063,"Santos":11064,"administrator":11065,"hypothesis":11066,"retire":11067,"Denis":11068,"agreements":11069,"Winnipeg":11070,"##rill":11071,"1820":11072,"trophy":11073,"crap":11074,"shakes":11075,"Jenkins":11076,"##rium":11077,"ya":11078,"twist":11079,"labels":11080,"Maritime":11081,"##lings":11082,"##iv":11083,"111":11084,"##ensis":11085,"Cairo":11086,"Anything":11087,"##fort":11088,"opinions":11089,"crowded":11090,"##nian":11091,"abandon":11092,"##iff":11093,"drained":11094,"imported":11095,"##rr":11096,"tended":11097,"##rain":11098,"Going":11099,"introducing":11100,"sculptures":11101,"bankruptcy":11102,"danced":11103,"demonstration":11104,"stance":11105,"settings":11106,"gazed":11107,"abstract":11108,"pet":11109,"Calvin":11110,"stiff":11111,"strongest":11112,"wrestler":11113,"##dre":11114,"Republicans":11115,"grace":11116,"allocated":11117,"cursed":11118,"snail":11119,"advancing":11120,"Return":11121,"errors":11122,"Mall":11123,"presenting":11124,"eliminate":11125,"Amateur":11126,"Institution":11127,"counting":11128,"##wind":11129,"warehouse":11130,"##nde":11131,"Ethiopia":11132,"trailed":11133,"hollow":11134,"##press":11135,"Literary":11136,"capability":11137,"nursing":11138,"preceding":11139,"lamp":11140,"Thomson":11141,"Morton":11142,"##ctic":11143,"Crew":11144,"Close":11145,"composers":11146,"boom":11147,"Clare":11148,"missiles":11149,"112":11150,"hunter":11151,"snap":11152,"##oni":11153,"##tail":11154,"Us":11155,"declaration":11156,"##c
ock":11157,"rally":11158,"huh":11159,"lion":11160,"straightened":11161,"Philippe":11162,"Sutton":11163,"alpha":11164,"valued":11165,"maker":11166,"navigation":11167,"detected":11168,"favorable":11169,"perception":11170,"Charter":11171,"##ña":11172,"Ricky":11173,"rebounds":11174,"tunnels":11175,"slapped":11176,"Emergency":11177,"supposedly":11178,"##act":11179,"deployment":11180,"socialist":11181,"tubes":11182,"anybody":11183,"corn":11184,"##NA":11185,"Seminary":11186,"heating":11187,"pump":11188,"##AA":11189,"achieving":11190,"souls":11191,"##ass":11192,"Link":11193,"##ele":11194,"##smith":11195,"greeted":11196,"Bates":11197,"Americas":11198,"Elder":11199,"cure":11200,"contestant":11201,"240":11202,"fold":11203,"Runner":11204,"Uh":11205,"licked":11206,"Politics":11207,"committees":11208,"neighbors":11209,"fairy":11210,"Silva":11211,"Leipzig":11212,"tipped":11213,"correctly":11214,"exciting":11215,"electronics":11216,"foundations":11217,"cottage":11218,"governmental":11219,"##hat":11220,"allied":11221,"claws":11222,"presidency":11223,"cruel":11224,"Agreement":11225,"slender":11226,"accompanying":11227,"precisely":11228,"##pass":11229,"driveway":11230,"swim":11231,"Stand":11232,"crews":11233,"##mission":11234,"rely":11235,"everyday":11236,"Wings":11237,"demo":11238,"##hic":11239,"recreational":11240,"min":11241,"nationality":11242,"##duction":11243,"Easter":11244,"##hole":11245,"canvas":11246,"Kay":11247,"Leicester":11248,"talented":11249,"Discovery":11250,"shells":11251,"##ech":11252,"Kerry":11253,"Ferguson":11254,"Leave":11255,"##place":11256,"altogether":11257,"adopt":11258,"butt":11259,"wolves":11260,"##nsis":11261,"##ania":11262,"modest":11263,"soprano":11264,"Boris":11265,"##ught":11266,"electron":11267,"depicts":11268,"hid":11269,"cruise":11270,"differ":11271,"treasure":11272,"##nch":11273,"Gun":11274,"Mama":11275,"Bengali":11276,"trainer":11277,"merchants":11278,"innovation":11279,"presumably":11280,"Shirley":11281,"bottles":11282,"proceeds":11283,"Fear":11284
,"invested":11285,"Pirates":11286,"particle":11287,"Dominic":11288,"blamed":11289,"Fight":11290,"Daisy":11291,"##pper":11292,"##graphic":11293,"nods":11294,"knight":11295,"Doyle":11296,"tales":11297,"Carnegie":11298,"Evil":11299,"Inter":11300,"Shore":11301,"Nixon":11302,"transform":11303,"Savannah":11304,"##gas":11305,"Baltic":11306,"stretching":11307,"worlds":11308,"protocol":11309,"Percy":11310,"Toby":11311,"Heroes":11312,"brave":11313,"dancers":11314,"##aria":11315,"backwards":11316,"responses":11317,"Chi":11318,"Gaelic":11319,"Berry":11320,"crush":11321,"embarked":11322,"promises":11323,"Madonna":11324,"researcher":11325,"realised":11326,"inaugurated":11327,"Cherry":11328,"Mikhail":11329,"Nottingham":11330,"reinforced":11331,"subspecies":11332,"rapper":11333,"##kie":11334,"Dreams":11335,"Re":11336,"Damon":11337,"Minneapolis":11338,"monsters":11339,"suspicion":11340,"Tel":11341,"surroundings":11342,"afterward":11343,"complaints":11344,"OF":11345,"sectors":11346,"Algeria":11347,"lanes":11348,"Sabha":11349,"objectives":11350,"Donna":11351,"bothered":11352,"distracted":11353,"deciding":11354,"##ives":11355,"##CA":11356,"##onia":11357,"bishops":11358,"Strange":11359,"machinery":11360,"Voiced":11361,"synthesis":11362,"reflects":11363,"interference":11364,"##TS":11365,"##ury":11366,"keen":11367,"##ign":11368,"frown":11369,"freestyle":11370,"ton":11371,"Dixon":11372,"Sacred":11373,"Ruby":11374,"Prison":11375,"##ión":11376,"1825":11377,"outfit":11378,"##tain":11379,"curiosity":11380,"##ight":11381,"frames":11382,"steadily":11383,"emigrated":11384,"horizon":11385,"##erly":11386,"Doc":11387,"philosophical":11388,"Table":11389,"UTC":11390,"Marina":11391,"##DA":11392,"secular":11393,"##eed":11394,"Zimbabwe":11395,"cops":11396,"Mack":11397,"sheriff":11398,"Sanskrit":11399,"Francesco":11400,"catches":11401,"questioning":11402,"streaming":11403,"Kill":11404,"testimony":11405,"hissed":11406,"tackle":11407,"countryside":11408,"copyright":11409,"##IP":11410,"Buddhism":11411,"##rat
or":11412,"ladder":11413,"##ON":11414,"Past":11415,"rookie":11416,"depths":11417,"##yama":11418,"##ister":11419,"##HS":11420,"Samantha":11421,"Dana":11422,"Educational":11423,"brows":11424,"Hammond":11425,"raids":11426,"envelope":11427,"##sco":11428,"##hart":11429,"##ulus":11430,"epic":11431,"detection":11432,"Streets":11433,"Potter":11434,"statistical":11435,"für":11436,"ni":11437,"accounting":11438,"##pot":11439,"employer":11440,"Sidney":11441,"Depression":11442,"commands":11443,"Tracks":11444,"averaged":11445,"lets":11446,"Ram":11447,"longtime":11448,"suits":11449,"branded":11450,"chip":11451,"Shield":11452,"loans":11453,"ought":11454,"Said":11455,"sip":11456,"##rome":11457,"requests":11458,"Vernon":11459,"bordered":11460,"veterans":11461,"##ament":11462,"Marsh":11463,"Herzegovina":11464,"Pine":11465,"##igo":11466,"mills":11467,"anticipation":11468,"reconnaissance":11469,"##ef":11470,"expectations":11471,"protested":11472,"arrow":11473,"guessed":11474,"depot":11475,"maternal":11476,"weakness":11477,"##ap":11478,"projected":11479,"pour":11480,"Carmen":11481,"provider":11482,"newer":11483,"remind":11484,"freed":11485,"##rily":11486,"##wal":11487,"##tones":11488,"intentions":11489,"Fiji":11490,"timing":11491,"Match":11492,"managers":11493,"Kosovo":11494,"Herman":11495,"Wesley":11496,"Chang":11497,"135":11498,"semifinals":11499,"shouting":11500,"Indo":11501,"Janeiro":11502,"Chess":11503,"Macedonia":11504,"Buck":11505,"##onies":11506,"rulers":11507,"Mail":11508,"##vas":11509,"##sel":11510,"MHz":11511,"Programme":11512,"Task":11513,"commercially":11514,"subtle":11515,"propaganda":11516,"spelled":11517,"bowling":11518,"basically":11519,"Raven":11520,"1828":11521,"Colony":11522,"109":11523,"##ingham":11524,"##wara":11525,"anticipated":11526,"1829":11527,"##iers":11528,"graduates":11529,"##rton":11530,"##fication":11531,"endangered":11532,"ISO":11533,"diagnosed":11534,"##tage":11535,"exercises":11536,"Battery":11537,"bolt":11538,"poison":11539,"cartoon":11540,"##ción":115
41,"hood":11542,"bowed":11543,"heal":11544,"Meyer":11545,"Reagan":11546,"##wed":11547,"subfamily":11548,"##gent":11549,"momentum":11550,"infant":11551,"detect":11552,"##sse":11553,"Chapman":11554,"Darwin":11555,"mechanics":11556,"NSW":11557,"Cancer":11558,"Brooke":11559,"Nuclear":11560,"comprised":11561,"hire":11562,"sanctuary":11563,"wingspan":11564,"contrary":11565,"remembering":11566,"surprising":11567,"Basic":11568,"stealing":11569,"OS":11570,"hatred":11571,"##lled":11572,"masters":11573,"violation":11574,"Rule":11575,"##nger":11576,"assuming":11577,"conquered":11578,"louder":11579,"robe":11580,"Beatles":11581,"legitimate":11582,"##vation":11583,"massacre":11584,"Rica":11585,"unsuccessfully":11586,"poets":11587,"##enberg":11588,"careers":11589,"doubled":11590,"premier":11591,"battalions":11592,"Dubai":11593,"Paper":11594,"Louisville":11595,"gestured":11596,"dressing":11597,"successive":11598,"mumbled":11599,"Vic":11600,"referee":11601,"pupil":11602,"##cated":11603,"##rre":11604,"ceremonies":11605,"picks":11606,"##IN":11607,"diplomat":11608,"alike":11609,"geographical":11610,"rays":11611,"##HA":11612,"##read":11613,"harbour":11614,"factories":11615,"pastor":11616,"playwright":11617,"Ultimate":11618,"nationalist":11619,"uniforms":11620,"obtaining":11621,"kit":11622,"Amber":11623,"##pling":11624,"screenwriter":11625,"ancestry":11626,"##cott":11627,"Fields":11628,"PR":11629,"Coleman":11630,"rat":11631,"Bavaria":11632,"squeeze":11633,"highlighted":11634,"Adult":11635,"reflecting":11636,"Mel":11637,"1824":11638,"bicycle":11639,"organizing":11640,"sided":11641,"Previously":11642,"Underground":11643,"Prof":11644,"athletics":11645,"coupled":11646,"mortal":11647,"Hampton":11648,"worthy":11649,"immune":11650,"Ava":11651,"##gun":11652,"encouraging":11653,"simplified":11654,"##ssa":11655,"##nte":11656,"##ann":11657,"Providence":11658,"entities":11659,"Pablo":11660,"Strong":11661,"Housing":11662,"##ista":11663,"##ators":11664,"kidnapped":11665,"mosque":11666,"Kirk":11667,"whi
spers":11668,"fruits":11669,"shattered":11670,"fossil":11671,"Empress":11672,"Johns":11673,"Webster":11674,"Thing":11675,"refusing":11676,"differently":11677,"specimen":11678,"Ha":11679,"##EN":11680,"##tina":11681,"##elle":11682,"##night":11683,"Horn":11684,"neighbourhood":11685,"Bolivia":11686,"##rth":11687,"genres":11688,"Pre":11689,"##vich":11690,"Amelia":11691,"swallow":11692,"Tribune":11693,"Forever":11694,"Psychology":11695,"Use":11696,"##bers":11697,"Gazette":11698,"ash":11699,"##usa":11700,"Monster":11701,"##cular":11702,"delegation":11703,"blowing":11704,"Oblast":11705,"retreated":11706,"automobile":11707,"##ex":11708,"profits":11709,"shirts":11710,"devil":11711,"Treasury":11712,"##backs":11713,"Drums":11714,"Ronnie":11715,"gameplay":11716,"expertise":11717,"Evening":11718,"resides":11719,"Caesar":11720,"unity":11721,"Crazy":11722,"linking":11723,"Vision":11724,"donations":11725,"Isabel":11726,"valve":11727,"Sue":11728,"WWE":11729,"logical":11730,"availability":11731,"fitting":11732,"revolt":11733,"##mill":11734,"Linux":11735,"taxi":11736,"Access":11737,"pollution":11738,"statues":11739,"Augustus":11740,"##pen":11741,"cello":11742,"##some":11743,"lacking":11744,"##ati":11745,"Gwen":11746,"##aka":11747,"##ovich":11748,"1821":11749,"Wow":11750,"initiatives":11751,"Uruguay":11752,"Cain":11753,"stroked":11754,"examine":11755,"##ī":11756,"mentor":11757,"moist":11758,"disorders":11759,"buttons":11760,"##tica":11761,"##anna":11762,"Species":11763,"Lynch":11764,"museums":11765,"scorer":11766,"Poor":11767,"eligibility":11768,"op":11769,"unveiled":11770,"cats":11771,"Title":11772,"wheat":11773,"critically":11774,"Syracuse":11775,"##osis":11776,"marketed":11777,"enhance":11778,"Ryder":11779,"##NG":11780,"##ull":11781,"##rna":11782,"embedded":11783,"throws":11784,"foods":11785,"happily":11786,"##ami":11787,"lesson":11788,"formats":11789,"punched":11790,"##rno":11791,"expressions":11792,"qualities":11793,"##sal":11794,"Gods":11795,"##lity":11796,"elect":11797,"wives":11
798,"##lling":11799,"jungle":11800,"Toyota":11801,"reversed":11802,"Grammar":11803,"Cloud":11804,"Agnes":11805,"##ules":11806,"disputed":11807,"verses":11808,"Lucien":11809,"threshold":11810,"##rea":11811,"scanned":11812,"##bled":11813,"##dley":11814,"##lice":11815,"Kazakhstan":11816,"Gardner":11817,"Freeman":11818,"##rz":11819,"inspection":11820,"Rita":11821,"accommodation":11822,"advances":11823,"chill":11824,"Elliot":11825,"thriller":11826,"Constantinople":11827,"##mos":11828,"debris":11829,"whoever":11830,"1810":11831,"Santo":11832,"Carey":11833,"remnants":11834,"Guatemala":11835,"##irs":11836,"carriers":11837,"equations":11838,"mandatory":11839,"##WA":11840,"anxious":11841,"measurement":11842,"Summit":11843,"Terminal":11844,"Erin":11845,"##zes":11846,"LLC":11847,"##uo":11848,"glancing":11849,"sin":11850,"##₃":11851,"Downtown":11852,"flowering":11853,"Euro":11854,"Leigh":11855,"Lance":11856,"warn":11857,"decent":11858,"recommendations":11859,"##ote":11860,"Quartet":11861,"##rrell":11862,"Clarence":11863,"colleague":11864,"guarantee":11865,"230":11866,"Clayton":11867,"Beast":11868,"addresses":11869,"prospect":11870,"destroyer":11871,"vegetables":11872,"Leadership":11873,"fatal":11874,"prints":11875,"190":11876,"##makers":11877,"Hyde":11878,"persuaded":11879,"illustrations":11880,"Southampton":11881,"Joyce":11882,"beats":11883,"editors":11884,"mount":11885,"##grave":11886,"Malaysian":11887,"Bombay":11888,"endorsed":11889,"##sian":11890,"##bee":11891,"applying":11892,"Religion":11893,"nautical":11894,"bomber":11895,"Na":11896,"airfield":11897,"gravel":11898,"##rew":11899,"Cave":11900,"bye":11901,"dig":11902,"decree":11903,"burden":11904,"Election":11905,"Hawk":11906,"Fe":11907,"##iled":11908,"reunited":11909,"##tland":11910,"liver":11911,"Teams":11912,"Put":11913,"delegates":11914,"Ella":11915,"##fect":11916,"Cal":11917,"invention":11918,"Castro":11919,"bored":11920,"##kawa":11921,"##ail":11922,"Trinidad":11923,"NASCAR":11924,"pond":11925,"develops":11926,"##pton":
11927,"expenses":11928,"Zoe":11929,"Released":11930,"##rf":11931,"organs":11932,"beta":11933,"parameters":11934,"Neill":11935,"##lene":11936,"lateral":11937,"Beat":11938,"blades":11939,"Either":11940,"##hale":11941,"Mitch":11942,"##ET":11943,"##vous":11944,"Rod":11945,"burnt":11946,"phones":11947,"Rising":11948,"##front":11949,"investigating":11950,"##dent":11951,"Stephanie":11952,"##keeper":11953,"screening":11954,"##uro":11955,"Swan":11956,"Sinclair":11957,"modes":11958,"bullets":11959,"Nigerian":11960,"melody":11961,"##ques":11962,"Rifle":11963,"##12":11964,"128":11965,"##jin":11966,"charm":11967,"Venus":11968,"##tian":11969,"fusion":11970,"advocated":11971,"visitor":11972,"pinned":11973,"genera":11974,"3000":11975,"Ferry":11976,"Solo":11977,"quantity":11978,"regained":11979,"platinum":11980,"shoots":11981,"narrowly":11982,"preceded":11983,"update":11984,"##ichi":11985,"equality":11986,"unaware":11987,"regiments":11988,"ally":11989,"##tos":11990,"transmitter":11991,"locks":11992,"Seeing":11993,"outlets":11994,"feast":11995,"reopened":11996,"##ows":11997,"struggles":11998,"Buddy":11999,"1826":12000,"bark":12001,"elegant":12002,"amused":12003,"Pretty":12004,"themed":12005,"schemes":12006,"Lisbon":12007,"Te":12008,"patted":12009,"terrorism":12010,"Mystery":12011,"##croft":12012,"##imo":12013,"Madagascar":12014,"Journey":12015,"dealer":12016,"contacted":12017,"##quez":12018,"ITV":12019,"vacation":12020,"Wong":12021,"Sacramento":12022,"organisms":12023,"##pts":12024,"balcony":12025,"coloured":12026,"sheer":12027,"defines":12028,"MC":12029,"abortion":12030,"forbidden":12031,"accredited":12032,"Newfoundland":12033,"tendency":12034,"entrepreneur":12035,"Benny":12036,"Tanzania":12037,"needing":12038,"finalist":12039,"mythology":12040,"weakened":12041,"gown":12042,"sentences":12043,"Guest":12044,"websites":12045,"Tibetan":12046,"UFC":12047,"voluntary":12048,"annoyed":12049,"Welcome":12050,"honestly":12051,"correspondence":12052,"geometry":12053,"Deutsche":12054,"Biology":1
2055,"Help":12056,"##aya":12057,"Lines":12058,"Hector":12059,"##ael":12060,"reluctant":12061,"##ages":12062,"wears":12063,"inquiry":12064,"##dell":12065,"Holocaust":12066,"Tourism":12067,"Wei":12068,"volcanic":12069,"##mates":12070,"Visual":12071,"sorts":12072,"neighborhoods":12073,"Running":12074,"apple":12075,"shy":12076,"Laws":12077,"bend":12078,"Northeast":12079,"feminist":12080,"Speedway":12081,"Murder":12082,"visa":12083,"stuffed":12084,"fangs":12085,"transmitted":12086,"fiscal":12087,"Ain":12088,"enlarged":12089,"##ndi":12090,"Cecil":12091,"Peterson":12092,"Benson":12093,"Bedford":12094,"acceptable":12095,"##CC":12096,"##wer":12097,"purely":12098,"triangle":12099,"foster":12100,"Alberto":12101,"educator":12102,"Highland":12103,"acute":12104,"LGBT":12105,"Tina":12106,"Mi":12107,"adventures":12108,"Davidson":12109,"Honda":12110,"translator":12111,"monk":12112,"enacted":12113,"summoned":12114,"##ional":12115,"collector":12116,"Genesis":12117,"Un":12118,"liner":12119,"Di":12120,"Statistical":12121,"##CS":12122,"filter":12123,"Knox":12124,"Religious":12125,"Stella":12126,"Estonian":12127,"Turn":12128,"##ots":12129,"primitive":12130,"parishes":12131,"##lles":12132,"complexity":12133,"autobiography":12134,"rigid":12135,"cannon":12136,"pursuing":12137,"exploring":12138,"##gram":12139,"##mme":12140,"freshman":12141,"caves":12142,"Expedition":12143,"Traditional":12144,"iTunes":12145,"certification":12146,"cooling":12147,"##ort":12148,"##gna":12149,"##IT":12150,"##lman":12151,"##VA":12152,"Motion":12153,"explosive":12154,"licence":12155,"boxer":12156,"shrine":12157,"loosely":12158,"Brigadier":12159,"Savage":12160,"Brett":12161,"MVP":12162,"heavier":12163,"##elli":12164,"##gged":12165,"Buddha":12166,"Easy":12167,"spells":12168,"fails":12169,"incredibly":12170,"Georg":12171,"stern":12172,"compatible":12173,"Perfect":12174,"applies":12175,"cognitive":12176,"excessive":12177,"nightmare":12178,"neighbor":12179,"Sicily":12180,"appealed":12181,"static":12182,"##₁":12183,"Aberd
een":12184,"##leigh":12185,"slipping":12186,"bride":12187,"##guard":12188,"Um":12189,"Clyde":12190,"1818":12191,"##gible":12192,"Hal":12193,"Frost":12194,"Sanders":12195,"interactive":12196,"Hour":12197,"##vor":12198,"hurting":12199,"bull":12200,"termed":12201,"shelf":12202,"capturing":12203,"##pace":12204,"rolls":12205,"113":12206,"##bor":12207,"Chilean":12208,"teaches":12209,"##rey":12210,"exam":12211,"shipped":12212,"Twin":12213,"borrowed":12214,"##lift":12215,"Shit":12216,"##hot":12217,"Lindsay":12218,"Below":12219,"Kiev":12220,"Lin":12221,"leased":12222,"##sto":12223,"Eli":12224,"Diane":12225,"Val":12226,"subtropical":12227,"shoe":12228,"Bolton":12229,"Dragons":12230,"##rification":12231,"Vatican":12232,"##pathy":12233,"Crisis":12234,"dramatically":12235,"talents":12236,"babies":12237,"##ores":12238,"surname":12239,"##AP":12240,"##cology":12241,"cubic":12242,"opted":12243,"Archer":12244,"sweep":12245,"tends":12246,"Karnataka":12247,"Judy":12248,"stint":12249,"Similar":12250,"##nut":12251,"explicitly":12252,"##nga":12253,"interact":12254,"Mae":12255,"portfolio":12256,"clinic":12257,"abbreviated":12258,"Counties":12259,"##iko":12260,"hearts":12261,"##ı":12262,"providers":12263,"screams":12264,"Individual":12265,"##etti":12266,"Monument":12267,"##iana":12268,"accessed":12269,"encounters":12270,"gasp":12271,"##rge":12272,"defunct":12273,"Avery":12274,"##rne":12275,"nobility":12276,"useless":12277,"Phase":12278,"Vince":12279,"senator":12280,"##FL":12281,"1813":12282,"surprisingly":12283,"##illo":12284,"##chin":12285,"Boyd":12286,"rumors":12287,"equity":12288,"Gone":12289,"Hearts":12290,"chassis":12291,"overnight":12292,"Trek":12293,"wrists":12294,"submit":12295,"civic":12296,"designers":12297,"##rity":12298,"prominence":12299,"decorative":12300,"derives":12301,"starter":12302,"##AF":12303,"wisdom":12304,"Powers":12305,"reluctantly":12306,"measurements":12307,"doctoral":12308,"Noel":12309,"Gideon":12310,"Baden":12311,"Cologne":12312,"lawn":12313,"Hawaiian":12314,"ant
hology":12315,"##rov":12316,"Raiders":12317,"embassy":12318,"Sterling":12319,"##pal":12320,"Telugu":12321,"troubled":12322,"##FC":12323,"##bian":12324,"fountain":12325,"observe":12326,"ore":12327,"##uru":12328,"##gence":12329,"spelling":12330,"Border":12331,"grinning":12332,"sketch":12333,"Benedict":12334,"Xbox":12335,"dialects":12336,"readily":12337,"immigrant":12338,"Constitutional":12339,"aided":12340,"nevertheless":12341,"SE":12342,"tragedy":12343,"##ager":12344,"##rden":12345,"Flash":12346,"##MP":12347,"Europa":12348,"emissions":12349,"##ield":12350,"panties":12351,"Beverly":12352,"Homer":12353,"curtain":12354,"##oto":12355,"toilet":12356,"Isn":12357,"Jerome":12358,"Chiefs":12359,"Hermann":12360,"supernatural":12361,"juice":12362,"integrity":12363,"Scots":12364,"auto":12365,"Patriots":12366,"Strategic":12367,"engaging":12368,"prosecution":12369,"cleaned":12370,"Byron":12371,"investments":12372,"adequate":12373,"vacuum":12374,"laughs":12375,"##inus":12376,"##nge":12377,"Usually":12378,"Roth":12379,"Cities":12380,"Brand":12381,"corpse":12382,"##ffy":12383,"Gas":12384,"rifles":12385,"Plains":12386,"sponsorship":12387,"Levi":12388,"tray":12389,"owed":12390,"della":12391,"commanders":12392,"##ead":12393,"tactical":12394,"##rion":12395,"García":12396,"harbor":12397,"discharge":12398,"##hausen":12399,"gentleman":12400,"endless":12401,"highways":12402,"##itarian":12403,"pleaded":12404,"##eta":12405,"archive":12406,"Midnight":12407,"exceptions":12408,"instances":12409,"Gibraltar":12410,"cart":12411,"##NS":12412,"Darren":12413,"Bonnie":12414,"##yle":12415,"##iva":12416,"OCLC":12417,"bra":12418,"Jess":12419,"##EA":12420,"consulting":12421,"Archives":12422,"Chance":12423,"distances":12424,"commissioner":12425,"##AR":12426,"LL":12427,"sailors":12428,"##sters":12429,"enthusiasm":12430,"Lang":12431,"##zia":12432,"Yugoslav":12433,"confirm":12434,"possibilities":12435,"Suffolk":12436,"##eman":12437,"banner":12438,"1822":12439,"Supporting":12440,"fingertips":12441,"civilization"
:12442,"##gos":12443,"technically":12444,"1827":12445,"Hastings":12446,"sidewalk":12447,"strained":12448,"monuments":12449,"Floyd":12450,"Chennai":12451,"Elvis":12452,"villagers":12453,"Cumberland":12454,"strode":12455,"albeit":12456,"Believe":12457,"planets":12458,"combining":12459,"Mohammad":12460,"container":12461,"##mouth":12462,"##tures":12463,"verb":12464,"BA":12465,"Tank":12466,"Midland":12467,"screened":12468,"Gang":12469,"Democracy":12470,"Helsinki":12471,"screens":12472,"thread":12473,"charitable":12474,"##version":12475,"swiftly":12476,"ma":12477,"rational":12478,"combine":12479,"##SS":12480,"##antly":12481,"dragging":12482,"Cliff":12483,"Tasmania":12484,"quest":12485,"professionally":12486,"##aj":12487,"rap":12488,"##lion":12489,"livestock":12490,"##hua":12491,"informal":12492,"specially":12493,"lonely":12494,"Matthews":12495,"Dictionary":12496,"1816":12497,"Observatory":12498,"correspondent":12499,"constitute":12500,"homeless":12501,"waving":12502,"appreciated":12503,"Analysis":12504,"Meeting":12505,"dagger":12506,"##AL":12507,"Gandhi":12508,"flank":12509,"Giant":12510,"Choir":12511,"##not":12512,"glimpse":12513,"toe":12514,"Writer":12515,"teasing":12516,"springs":12517,"##dt":12518,"Glory":12519,"healthcare":12520,"regulated":12521,"complaint":12522,"math":12523,"Publications":12524,"makers":12525,"##hips":12526,"cement":12527,"Need":12528,"apologize":12529,"disputes":12530,"finishes":12531,"Partners":12532,"boring":12533,"ups":12534,"gains":12535,"1793":12536,"Congressional":12537,"clergy":12538,"Folk":12539,"##made":12540,"##nza":12541,"Waters":12542,"stays":12543,"encoded":12544,"spider":12545,"betrayed":12546,"Applied":12547,"inception":12548,"##urt":12549,"##zzo":12550,"wards":12551,"bells":12552,"UCLA":12553,"Worth":12554,"bombers":12555,"Mo":12556,"trademark":12557,"Piper":12558,"##vel":12559,"incorporates":12560,"1801":12561,"##cial":12562,"dim":12563,"Twelve":12564,"##word":12565,"Appeals":12566,"tighter":12567,"spacecraft":12568,"##tine":1256
9,"coordinates":12570,"##iac":12571,"mistakes":12572,"Zach":12573,"laptop":12574,"Teresa":12575,"##llar":12576,"##yr":12577,"favored":12578,"Nora":12579,"sophisticated":12580,"Irving":12581,"hammer":12582,"División":12583,"corporations":12584,"niece":12585,"##rley":12586,"Patterson":12587,"UNESCO":12588,"trafficking":12589,"Ming":12590,"balanced":12591,"plaque":12592,"Latvia":12593,"broader":12594,"##owed":12595,"Save":12596,"confined":12597,"##vable":12598,"Dalton":12599,"tide":12600,"##right":12601,"##ural":12602,"##num":12603,"swords":12604,"caring":12605,"##eg":12606,"IX":12607,"Acting":12608,"paved":12609,"##moto":12610,"launching":12611,"Antoine":12612,"substantially":12613,"Pride":12614,"Philharmonic":12615,"grammar":12616,"Indoor":12617,"Ensemble":12618,"enabling":12619,"114":12620,"resided":12621,"Angelo":12622,"publicity":12623,"chaired":12624,"crawled":12625,"Maharashtra":12626,"Telegraph":12627,"lengthy":12628,"preference":12629,"differential":12630,"anonymous":12631,"Honey":12632,"##itation":12633,"wage":12634,"##iki":12635,"consecrated":12636,"Bryant":12637,"regulatory":12638,"Carr":12639,"##én":12640,"functioning":12641,"watches":12642,"##ú":12643,"shifts":12644,"diagnosis":12645,"Search":12646,"app":12647,"Peters":12648,"##SE":12649,"##cat":12650,"Andreas":12651,"honours":12652,"temper":12653,"counsel":12654,"Urdu":12655,"Anniversary":12656,"maritime":12657,"##uka":12658,"harmony":12659,"##unk":12660,"essence":12661,"Lorenzo":12662,"choked":12663,"Quarter":12664,"indie":12665,"##oll":12666,"loses":12667,"##prints":12668,"amendment":12669,"Adolf":12670,"scenario":12671,"similarities":12672,"##rade":12673,"##LC":12674,"technological":12675,"metric":12676,"Russians":12677,"thoroughly":12678,"##tead":12679,"cruiser":12680,"1806":12681,"##nier":12682,"1823":12683,"Teddy":12684,"##psy":12685,"au":12686,"progressed":12687,"exceptional":12688,"broadcaster":12689,"partnered":12690,"fitness":12691,"irregular":12692,"placement":12693,"mothers":12694,"unofficial
":12695,"Garion":12696,"Johannes":12697,"1817":12698,"regain":12699,"Solar":12700,"publishes":12701,"Gates":12702,"Broken":12703,"thirds":12704,"conversations":12705,"dive":12706,"Raj":12707,"contributor":12708,"quantities":12709,"Worcester":12710,"governance":12711,"##flow":12712,"generating":12713,"pretending":12714,"Belarus":12715,"##voy":12716,"radius":12717,"skating":12718,"Marathon":12719,"1819":12720,"affection":12721,"undertook":12722,"##wright":12723,"los":12724,"##bro":12725,"locate":12726,"PS":12727,"excluded":12728,"recreation":12729,"tortured":12730,"jewelry":12731,"moaned":12732,"##logue":12733,"##cut":12734,"Complete":12735,"##rop":12736,"117":12737,"##II":12738,"plantation":12739,"whipped":12740,"slower":12741,"crater":12742,"##drome":12743,"Volunteer":12744,"attributes":12745,"celebrations":12746,"regards":12747,"Publishers":12748,"oath":12749,"utilized":12750,"Robbie":12751,"Giuseppe":12752,"fiber":12753,"indication":12754,"melted":12755,"archives":12756,"Damien":12757,"storey":12758,"affecting":12759,"identifying":12760,"dances":12761,"alumni":12762,"comparable":12763,"upgrade":12764,"rented":12765,"sprint":12766,"##kle":12767,"Marty":12768,"##lous":12769,"treating":12770,"railways":12771,"Lebanese":12772,"erupted":12773,"occupy":12774,"sympathy":12775,"Jude":12776,"Darling":12777,"Qatar":12778,"drainage":12779,"McCarthy":12780,"heel":12781,"Klein":12782,"computing":12783,"wireless":12784,"flip":12785,"Du":12786,"Bella":12787,"##ast":12788,"##ssen":12789,"narrator":12790,"mist":12791,"sings":12792,"alignment":12793,"121":12794,"2020":12795,"securing":12796,"##rail":12797,"Progress":12798,"missionaries":12799,"brutal":12800,"mercy":12801,"##shing":12802,"Hip":12803,"##ache":12804,"##olo":12805,"switching":12806,"##here":12807,"Malay":12808,"##ob":12809,"constituted":12810,"Mohammed":12811,"Often":12812,"standings":12813,"surge":12814,"teachings":12815,"ink":12816,"detached":12817,"systematic":12818,"Trial":12819,"Myanmar":12820,"##wo":12821,"offs":
12822,"Reyes":12823,"decoration":12824,"translations":12825,"wherever":12826,"reviewer":12827,"speculation":12828,"Bangkok":12829,"terminated":12830,"##ester":12831,"beard":12832,"RCA":12833,"Aidan":12834,"Associated":12835,"Emerson":12836,"Charity":12837,"1803":12838,"generous":12839,"Dudley":12840,"ATP":12841,"##haven":12842,"prizes":12843,"toxic":12844,"gloves":12845,"##iles":12846,"##dos":12847,"Turning":12848,"myth":12849,"Parade":12850,"##building":12851,"Hits":12852,"##eva":12853,"teamed":12854,"Above":12855,"Duchess":12856,"Holt":12857,"##oth":12858,"Sub":12859,"Ace":12860,"atomic":12861,"inform":12862,"Ship":12863,"depend":12864,"Jun":12865,"##bes":12866,"Norwich":12867,"globe":12868,"Baroque":12869,"Christina":12870,"Cotton":12871,"Tunnel":12872,"kidding":12873,"Concerto":12874,"Brittany":12875,"tasted":12876,"phases":12877,"stems":12878,"angles":12879,"##TE":12880,"##nam":12881,"##40":12882,"charted":12883,"Alison":12884,"intensive":12885,"Willis":12886,"glory":12887,"##lit":12888,"Bergen":12889,"est":12890,"taller":12891,"##dicate":12892,"labeled":12893,"##ido":12894,"commentator":12895,"Warrior":12896,"Viscount":12897,"shortened":12898,"aisle":12899,"Aria":12900,"Spike":12901,"spectators":12902,"goodbye":12903,"overlooking":12904,"mammals":12905,"##lude":12906,"wholly":12907,"Barrett":12908,"##gus":12909,"accompany":12910,"seventy":12911,"employ":12912,"##mb":12913,"ambitious":12914,"beloved":12915,"basket":12916,"##mma":12917,"##lding":12918,"halted":12919,"descendant":12920,"pad":12921,"exclaimed":12922,"cloak":12923,"##pet":12924,"Strait":12925,"Bang":12926,"Aviv":12927,"sadness":12928,"##ffer":12929,"Donovan":12930,"1880s":12931,"agenda":12932,"swinging":12933,"##quin":12934,"jerk":12935,"Boat":12936,"##rist":12937,"nervously":12938,"Silence":12939,"Echo":12940,"shout":12941,"implies":12942,"##iser":12943,"##cking":12944,"Shiva":12945,"Weston":12946,"damages":12947,"##tist":12948,"effectiveness":12949,"Horace":12950,"cycling":12951,"Rey":12952,"ache
":12953,"Photography":12954,"PDF":12955,"Dear":12956,"leans":12957,"Lea":12958,"##vision":12959,"booth":12960,"attained":12961,"disbelief":12962,"##eus":12963,"##ution":12964,"Hop":12965,"pension":12966,"toys":12967,"Eurovision":12968,"faithful":12969,"##heads":12970,"Andre":12971,"owe":12972,"default":12973,"Atlas":12974,"Megan":12975,"highlights":12976,"lovers":12977,"Constantine":12978,"Sixth":12979,"masses":12980,"##garh":12981,"emerge":12982,"Auto":12983,"Slovak":12984,"##oa":12985,"##vert":12986,"Superintendent":12987,"flicked":12988,"inventor":12989,"Chambers":12990,"Frankie":12991,"Romeo":12992,"pottery":12993,"companions":12994,"Rudolf":12995,"##liers":12996,"diary":12997,"Unless":12998,"tap":12999,"alter":13000,"Randall":13001,"##ddle":13002,"##eal":13003,"limitations":13004,"##boards":13005,"utterly":13006,"knelt":13007,"guaranteed":13008,"Cowboys":13009,"Islander":13010,"horns":13011,"##ike":13012,"Wendy":13013,"sexually":13014,"Smart":13015,"breasts":13016,"##cian":13017,"compromise":13018,"Duchy":13019,"AT":13020,"Galaxy":13021,"analog":13022,"Style":13023,"##aking":13024,"weighed":13025,"Nigel":13026,"optional":13027,"Czechoslovakia":13028,"practicing":13029,"Ham":13030,"##0s":13031,"feedback":13032,"batted":13033,"uprising":13034,"operative":13035,"applicable":13036,"criminals":13037,"classrooms":13038,"Somehow":13039,"##ode":13040,"##OM":13041,"Naomi":13042,"Winchester":13043,"##pping":13044,"Bart":13045,"Regina":13046,"competitor":13047,"Recorded":13048,"Yuan":13049,"Vera":13050,"lust":13051,"Confederation":13052,"##test":13053,"suck":13054,"1809":13055,"Lambert":13056,"175":13057,"Friend":13058,"##ppa":13059,"Slowly":13060,"##⁺":13061,"Wake":13062,"Dec":13063,"##aneous":13064,"chambers":13065,"Color":13066,"Gus":13067,"##site":13068,"Alternative":13069,"##world":13070,"Exeter":13071,"Omaha":13072,"celebrities":13073,"striker":13074,"210":13075,"dwarf":13076,"meals":13077,"Oriental":13078,"Pearson":13079,"financing":13080,"revenues":13081,"underwat
er":13082,"Steele":13083,"screw":13084,"Feeling":13085,"Mt":13086,"acids":13087,"badge":13088,"swore":13089,"theaters":13090,"Moving":13091,"admired":13092,"lung":13093,"knot":13094,"penalties":13095,"116":13096,"fork":13097,"##cribed":13098,"Afghan":13099,"outskirts":13100,"Cambodia":13101,"oval":13102,"wool":13103,"fossils":13104,"Ned":13105,"Countess":13106,"Darkness":13107,"delicious":13108,"##nica":13109,"Evelyn":13110,"Recordings":13111,"guidelines":13112,"##CP":13113,"Sandra":13114,"meantime":13115,"Antarctica":13116,"modeling":13117,"granddaughter":13118,"##rial":13119,"Roma":13120,"Seventh":13121,"Sunshine":13122,"Gabe":13123,"##nton":13124,"Shop":13125,"Turks":13126,"prolific":13127,"soup":13128,"parody":13129,"##nta":13130,"Judith":13131,"disciplines":13132,"resign":13133,"Companies":13134,"Libya":13135,"Jets":13136,"inserted":13137,"Mile":13138,"retrieve":13139,"filmmaker":13140,"##rand":13141,"realistic":13142,"unhappy":13143,"##30":13144,"sandstone":13145,"##nas":13146,"##lent":13147,"##ush":13148,"##rous":13149,"Brent":13150,"trash":13151,"Rescue":13152,"##unted":13153,"Autumn":13154,"disgust":13155,"flexible":13156,"infinite":13157,"sideways":13158,"##oss":13159,"##vik":13160,"trailing":13161,"disturbed":13162,"50th":13163,"Newark":13164,"posthumously":13165,"##rol":13166,"Schmidt":13167,"Josef":13168,"##eous":13169,"determining":13170,"menu":13171,"Pole":13172,"Anita":13173,"Luc":13174,"peaks":13175,"118":13176,"Yard":13177,"warrant":13178,"generic":13179,"deserted":13180,"Walking":13181,"stamp":13182,"tracked":13183,"##berger":13184,"paired":13185,"surveyed":13186,"sued":13187,"Rainbow":13188,"##isk":13189,"Carpenter":13190,"submarines":13191,"realization":13192,"touches":13193,"sweeping":13194,"Fritz":13195,"module":13196,"Whether":13197,"resembles":13198,"##form":13199,"##lop":13200,"unsure":13201,"hunters":13202,"Zagreb":13203,"unemployment":13204,"Senators":13205,"Georgetown":13206,"##onic":13207,"Barker":13208,"foul":13209,"commercials":13210,
"Dresden":13211,"Words":13212,"collision":13213,"Carlton":13214,"Fashion":13215,"doubted":13216,"##ril":13217,"precision":13218,"MIT":13219,"Jacobs":13220,"mob":13221,"Monk":13222,"retaining":13223,"gotta":13224,"##rod":13225,"remake":13226,"Fast":13227,"chips":13228,"##pled":13229,"sufficiently":13230,"##lights":13231,"delivering":13232,"##enburg":13233,"Dancing":13234,"Barton":13235,"Officers":13236,"metals":13237,"##lake":13238,"religions":13239,"##ré":13240,"motivated":13241,"differs":13242,"dorsal":13243,"##birds":13244,"##rts":13245,"Priest":13246,"polished":13247,"##aling":13248,"Saxony":13249,"Wyatt":13250,"knockout":13251,"##hor":13252,"Lopez":13253,"RNA":13254,"##link":13255,"metallic":13256,"##kas":13257,"daylight":13258,"Montenegro":13259,"##lining":13260,"wrapping":13261,"resemble":13262,"Jam":13263,"Viking":13264,"uncertainty":13265,"angels":13266,"enables":13267,"##fy":13268,"Stuttgart":13269,"tricks":13270,"tattoo":13271,"127":13272,"wicked":13273,"asset":13274,"breach":13275,"##yman":13276,"MW":13277,"breaths":13278,"Jung":13279,"im":13280,"1798":13281,"noon":13282,"vowel":13283,"##qua":13284,"calmly":13285,"seasonal":13286,"chat":13287,"ingredients":13288,"cooled":13289,"Randolph":13290,"ensuring":13291,"##ib":13292,"##idal":13293,"flashing":13294,"1808":13295,"Macedonian":13296,"Cool":13297,"councils":13298,"##lick":13299,"advantages":13300,"Immediately":13301,"Madras":13302,"##cked":13303,"Pain":13304,"fancy":13305,"chronic":13306,"Malayalam":13307,"begged":13308,"##nese":13309,"Inner":13310,"feathers":13311,"##vey":13312,"Names":13313,"dedication":13314,"Sing":13315,"pan":13316,"Fischer":13317,"nurses":13318,"Sharp":13319,"inning":13320,"stamps":13321,"Meg":13322,"##ello":13323,"edged":13324,"motioned":13325,"Jacksonville":13326,"##ffle":13327,"##dic":13328,"##US":13329,"divide":13330,"garnered":13331,"Ranking":13332,"chasing":13333,"modifications":13334,"##oc":13335,"clever":13336,"midst":13337,"flushed":13338,"##DP":13339,"void":13340,"##sby":
13341,"ambulance":13342,"beaches":13343,"groan":13344,"isolation":13345,"strengthen":13346,"prevention":13347,"##ffs":13348,"Scouts":13349,"reformed":13350,"geographic":13351,"squadrons":13352,"Fiona":13353,"Kai":13354,"Consequently":13355,"##uss":13356,"overtime":13357,"##yas":13358,"Fr":13359,"##BL":13360,"Papua":13361,"Mixed":13362,"glances":13363,"Haiti":13364,"Sporting":13365,"sandy":13366,"confronted":13367,"René":13368,"Tanner":13369,"1811":13370,"##IM":13371,"advisory":13372,"trim":13373,"##ibe":13374,"González":13375,"gambling":13376,"Jupiter":13377,"##ility":13378,"##owski":13379,"##nar":13380,"122":13381,"apology":13382,"teased":13383,"Pool":13384,"feminine":13385,"wicket":13386,"eagle":13387,"shiny":13388,"##lator":13389,"blend":13390,"peaking":13391,"nasty":13392,"nodding":13393,"fraction":13394,"tech":13395,"Noble":13396,"Kuwait":13397,"brushing":13398,"Italia":13399,"Canberra":13400,"duet":13401,"Johan":13402,"1805":13403,"Written":13404,"cameo":13405,"Stalin":13406,"pig":13407,"cord":13408,"##zio":13409,"Surely":13410,"SA":13411,"owing":13412,"holidays":13413,"123":13414,"Ranger":13415,"lighthouse":13416,"##ige":13417,"miners":13418,"1804":13419,"##ë":13420,"##gren":13421,"##ried":13422,"crashing":13423,"##atory":13424,"wartime":13425,"highlight":13426,"inclined":13427,"Torres":13428,"Tax":13429,"##zel":13430,"##oud":13431,"Own":13432,"##corn":13433,"Divine":13434,"EMI":13435,"Relief":13436,"Northwestern":13437,"ethics":13438,"BMW":13439,"click":13440,"plasma":13441,"Christie":13442,"coordinator":13443,"Shepherd":13444,"washing":13445,"cooked":13446,"##dio":13447,"##eat":13448,"Cerambycidae":13449,"algebra":13450,"Engine":13451,"costumes":13452,"Vampire":13453,"vault":13454,"submission":13455,"virtue":13456,"assumption":13457,"##rell":13458,"Toledo":13459,"##oting":13460,"##rva":13461,"crept":13462,"emphasized":13463,"##lton":13464,"##ood":13465,"Greeks":13466,"surgical":13467,"crest":13468,"Patrol":13469,"Beta":13470,"Tessa":13471,"##GS":13472,"pizz
a":13473,"traits":13474,"rats":13475,"Iris":13476,"spray":13477,"##GC":13478,"Lightning":13479,"binary":13480,"escapes":13481,"##take":13482,"Clary":13483,"crowds":13484,"##zong":13485,"hauled":13486,"maid":13487,"##fen":13488,"Manning":13489,"##yang":13490,"Nielsen":13491,"aesthetic":13492,"sympathetic":13493,"affiliation":13494,"soaked":13495,"Mozart":13496,"personalities":13497,"begging":13498,"##iga":13499,"clip":13500,"Raphael":13501,"yearly":13502,"Lima":13503,"abundant":13504,"##lm":13505,"1794":13506,"strips":13507,"Initiative":13508,"reporters":13509,"##vsky":13510,"consolidated":13511,"##itated":13512,"Civic":13513,"rankings":13514,"mandate":13515,"symbolic":13516,"##ively":13517,"1807":13518,"rental":13519,"duck":13520,"nave":13521,"complications":13522,"##nor":13523,"Irene":13524,"Nazis":13525,"haunted":13526,"scholarly":13527,"Pratt":13528,"Gran":13529,"Embassy":13530,"Wave":13531,"pity":13532,"genius":13533,"bats":13534,"canton":13535,"Tropical":13536,"marker":13537,"##cos":13538,"escorted":13539,"Climate":13540,"##posed":13541,"appreciation":13542,"freezing":13543,"puzzle":13544,"Internal":13545,"pools":13546,"Shawn":13547,"pathway":13548,"Daniels":13549,"Fitzgerald":13550,"extant":13551,"olive":13552,"Vanessa":13553,"marriages":13554,"cocked":13555,"##dging":13556,"prone":13557,"chemicals":13558,"doll":13559,"drawer":13560,"##HF":13561,"Stark":13562,"Property":13563,"##tai":13564,"flowed":13565,"Sheridan":13566,"##uated":13567,"Less":13568,"Omar":13569,"remarks":13570,"catalogue":13571,"Seymour":13572,"wreck":13573,"Carrie":13574,"##bby":13575,"Mercer":13576,"displaced":13577,"sovereignty":13578,"rip":13579,"Flynn":13580,"Archie":13581,"Quarterfinals":13582,"Hassan":13583,"##ards":13584,"vein":13585,"Osaka":13586,"pouring":13587,"wages":13588,"Romance":13589,"##cript":13590,"##phere":13591,"550":13592,"##eil":13593,"##stown":13594,"Documentary":13595,"ancestor":13596,"CNN":13597,"Panthers":13598,"publishers":13599,"Rise":13600,"##mu":13601,"biting":1
3602,"Bright":13603,"String":13604,"succeeding":13605,"119":13606,"loaned":13607,"Warwick":13608,"Sheikh":13609,"Von":13610,"Afterwards":13611,"Jax":13612,"Camden":13613,"helicopters":13614,"Hence":13615,"Laurel":13616,"##ddy":13617,"transaction":13618,"Corp":13619,"clause":13620,"##owing":13621,"##kel":13622,"Investment":13623,"cups":13624,"Lucia":13625,"Moss":13626,"Giles":13627,"chef":13628,"López":13629,"decisive":13630,"30th":13631,"distress":13632,"linguistic":13633,"surveys":13634,"Ready":13635,"maiden":13636,"Touch":13637,"frontier":13638,"incorporate":13639,"exotic":13640,"mollusk":13641,"Leopold":13642,"Ride":13643,"##wain":13644,"##ndo":13645,"teammates":13646,"tones":13647,"drift":13648,"ordering":13649,"Feb":13650,"Penny":13651,"Normandy":13652,"Present":13653,"Flag":13654,"pipes":13655,"##rro":13656,"delight":13657,"motto":13658,"Tibet":13659,"leap":13660,"Eliza":13661,"Produced":13662,"teenagers":13663,"sitcom":13664,"Try":13665,"Hansen":13666,"Cody":13667,"wandered":13668,"terrestrial":13669,"frog":13670,"scare":13671,"resisted":13672,"employers":13673,"coined":13674,"##DS":13675,"resistant":13676,"Fly":13677,"captive":13678,"dissolution":13679,"judged":13680,"associates":13681,"defining":13682,"##court":13683,"Hale":13684,"##mbo":13685,"raises":13686,"clusters":13687,"twelfth":13688,"##metric":13689,"Roads":13690,"##itude":13691,"satisfy":13692,"Android":13693,"Reds":13694,"Gloucester":13695,"Category":13696,"Valencia":13697,"Daemon":13698,"stabbed":13699,"Luna":13700,"Churches":13701,"Canton":13702,"##eller":13703,"Attack":13704,"Kashmir":13705,"annexed":13706,"grabs":13707,"asteroid":13708,"Hartford":13709,"recommendation":13710,"Rodriguez":13711,"handing":13712,"stressed":13713,"frequencies":13714,"delegate":13715,"Bones":13716,"Erie":13717,"Weber":13718,"Hands":13719,"Acts":13720,"millimetres":13721,"24th":13722,"Fat":13723,"Howe":13724,"casually":13725,"##SL":13726,"convent":13727,"1790":13728,"IF":13729,"##sity":13730,"1795":13731,"yelling":13
732,"##ises":13733,"drain":13734,"addressing":13735,"amino":13736,"Marcel":13737,"Sylvia":13738,"Paramount":13739,"Gerard":13740,"Volleyball":13741,"butter":13742,"124":13743,"Albion":13744,"##GB":13745,"triggered":13746,"1792":13747,"folding":13748,"accepts":13749,"##ße":13750,"preparations":13751,"Wimbledon":13752,"dose":13753,"##grass":13754,"escaping":13755,"##tling":13756,"import":13757,"charging":13758,"##dation":13759,"280":13760,"Nolan":13761,"##fried":13762,"Calcutta":13763,"##pool":13764,"Cove":13765,"examining":13766,"minded":13767,"heartbeat":13768,"twisting":13769,"domains":13770,"bush":13771,"Tunisia":13772,"Purple":13773,"Leone":13774,"##code":13775,"evacuated":13776,"battlefield":13777,"tiger":13778,"Electrical":13779,"##ared":13780,"chased":13781,"##cre":13782,"cultivated":13783,"Jet":13784,"solved":13785,"shrug":13786,"ringing":13787,"Impact":13788,"##iant":13789,"kilometre":13790,"##log":13791,"commemorate":13792,"migrated":13793,"singular":13794,"designing":13795,"promptly":13796,"Higgins":13797,"##own":13798,"##aves":13799,"freshwater":13800,"Marketing":13801,"Payne":13802,"beg":13803,"locker":13804,"pray":13805,"implied":13806,"AAA":13807,"corrected":13808,"Trans":13809,"Europeans":13810,"Ashe":13811,"acknowledge":13812,"Introduction":13813,"##writer":13814,"##llen":13815,"Munster":13816,"auxiliary":13817,"growl":13818,"Hours":13819,"Poems":13820,"##AT":13821,"reduces":13822,"Plain":13823,"plague":13824,"canceled":13825,"detention":13826,"polite":13827,"necklace":13828,"Gustav":13829,"##gu":13830,"##lance":13831,"En":13832,"Angola":13833,"##bb":13834,"dwelling":13835,"##hea":13836,"5000":13837,"Qing":13838,"Dodgers":13839,"rim":13840,"##ored":13841,"##haus":13842,"spilled":13843,"Elisabeth":13844,"Viktor":13845,"backpack":13846,"1802":13847,"amended":13848,"##worthy":13849,"Phantom":13850,"##ctive":13851,"keeper":13852,"##loom":13853,"Vikings":13854,"##gua":13855,"employs":13856,"Tehran":13857,"specialty":13858,"##bate":13859,"Marx":13860,"Mirr
or":13861,"Jenna":13862,"rides":13863,"needle":13864,"prayers":13865,"clarinet":13866,"forewings":13867,"##walk":13868,"Midlands":13869,"convincing":13870,"advocacy":13871,"Cao":13872,"Birds":13873,"cycles":13874,"Clement":13875,"Gil":13876,"bubble":13877,"Maximum":13878,"humanitarian":13879,"Tan":13880,"cries":13881,"##SI":13882,"Parsons":13883,"Trio":13884,"offshore":13885,"Innovation":13886,"clutched":13887,"260":13888,"##mund":13889,"##duct":13890,"Prairie":13891,"relied":13892,"Falcon":13893,"##ste":13894,"Kolkata":13895,"Gill":13896,"Swift":13897,"Negro":13898,"Zoo":13899,"valleys":13900,"##OL":13901,"Opening":13902,"beams":13903,"MPs":13904,"outline":13905,"Bermuda":13906,"Personal":13907,"exceed":13908,"productive":13909,"##MT":13910,"republic":13911,"forum":13912,"##sty":13913,"tornado":13914,"Known":13915,"dipped":13916,"Edith":13917,"folks":13918,"mathematician":13919,"watershed":13920,"Ricardo":13921,"synthetic":13922,"##dication":13923,"deity":13924,"##₄":13925,"gaming":13926,"subjected":13927,"suspects":13928,"Foot":13929,"swollen":13930,"Motors":13931,"##tty":13932,"##ý":13933,"aloud":13934,"ceremonial":13935,"es":13936,"nuts":13937,"intend":13938,"Carlisle":13939,"tasked":13940,"hesitation":13941,"sponsors":13942,"unified":13943,"inmates":13944,"##ctions":13945,"##stan":13946,"tiles":13947,"jokes":13948,"whereby":13949,"outcomes":13950,"Lights":13951,"scary":13952,"Stoke":13953,"Portrait":13954,"Blind":13955,"sergeant":13956,"violations":13957,"cultivation":13958,"fuselage":13959,"Mister":13960,"Alfonso":13961,"candy":13962,"sticks":13963,"teen":13964,"agony":13965,"Enough":13966,"invite":13967,"Perkins":13968,"Appeal":13969,"mapping":13970,"undergo":13971,"Glacier":13972,"Melanie":13973,"affects":13974,"incomplete":13975,"##dd":13976,"Colombian":13977,"##nate":13978,"CBC":13979,"purchasing":13980,"bypass":13981,"Drug":13982,"Electronics":13983,"Frontier":13984,"Coventry":13985,"##aan":13986,"autonomy":13987,"scrambled":13988,"Recent":13989,"bounced"
:13990,"cow":13991,"experiencing":13992,"Rouge":13993,"cuisine":13994,"Elite":13995,"disability":13996,"Ji":13997,"inheritance":13998,"wildly":13999,"Into":14000,"##wig":14001,"confrontation":14002,"Wheeler":14003,"shiver":14004,"Performing":14005,"aligned":14006,"consequently":14007,"Alexis":14008,"Sin":14009,"woodland":14010,"executives":14011,"Stevenson":14012,"Ferrari":14013,"inevitable":14014,"##cist":14015,"##dha":14016,"##base":14017,"Corner":14018,"comeback":14019,"León":14020,"##eck":14021,"##urus":14022,"MacDonald":14023,"pioneering":14024,"breakdown":14025,"landscapes":14026,"Veterans":14027,"Rican":14028,"Theological":14029,"stirred":14030,"participant":14031,"Credit":14032,"Hyderabad":14033,"snails":14034,"Claudia":14035,"##ocene":14036,"compliance":14037,"##MI":14038,"Flags":14039,"Middlesex":14040,"storms":14041,"winding":14042,"asserted":14043,"er":14044,"##ault":14045,"##kal":14046,"waking":14047,"##rates":14048,"abbey":14049,"Augusta":14050,"tooth":14051,"trustees":14052,"Commodore":14053,"##uded":14054,"Cunningham":14055,"NC":14056,"Witch":14057,"marching":14058,"Sword":14059,"Same":14060,"spiral":14061,"Harley":14062,"##ahan":14063,"Zack":14064,"Audio":14065,"1890s":14066,"##fit":14067,"Simmons":14068,"Kara":14069,"Veronica":14070,"negotiated":14071,"Speaking":14072,"FIBA":14073,"Conservatory":14074,"formations":14075,"constituencies":14076,"explicit":14077,"facial":14078,"eleventh":14079,"##ilt":14080,"villain":14081,"##dog":14082,"##case":14083,"##hol":14084,"armored":14085,"tin":14086,"hairs":14087,"##umi":14088,"##rai":14089,"mattress":14090,"Angus":14091,"cease":14092,"verbal":14093,"Recreation":14094,"savings":14095,"Aurora":14096,"peers":14097,"Monastery":14098,"Airways":14099,"drowned":14100,"additions":14101,"downstream":14102,"sticking":14103,"Shi":14104,"mice":14105,"skiing":14106,"##CD":14107,"Raw":14108,"Riverside":14109,"warming":14110,"hooked":14111,"boost":14112,"memorable":14113,"posed":14114,"treatments":14115,"320":14116,"##dai
":14117,"celebrating":14118,"blink":14119,"helpless":14120,"circa":14121,"Flowers":14122,"PM":14123,"uncommon":14124,"Oct":14125,"Hawks":14126,"overwhelmed":14127,"Sparhawk":14128,"repaired":14129,"Mercy":14130,"pose":14131,"counterpart":14132,"compare":14133,"survives":14134,"##½":14135,"##eum":14136,"coordinate":14137,"Lil":14138,"grandchildren":14139,"notorious":14140,"Yi":14141,"Judaism":14142,"Juliet":14143,"accusations":14144,"1789":14145,"floated":14146,"marathon":14147,"roar":14148,"fortified":14149,"reunion":14150,"145":14151,"Nov":14152,"Paula":14153,"##fare":14154,"##toria":14155,"tearing":14156,"Cedar":14157,"disappearance":14158,"Si":14159,"gifted":14160,"scar":14161,"270":14162,"PBS":14163,"Technologies":14164,"Marvin":14165,"650":14166,"roller":14167,"cupped":14168,"negotiate":14169,"##erman":14170,"passport":14171,"tram":14172,"miracle":14173,"styled":14174,"##tier":14175,"necessity":14176,"Des":14177,"rehabilitation":14178,"Lara":14179,"USD":14180,"psychic":14181,"wipe":14182,"##lem":14183,"mistaken":14184,"##lov":14185,"charming":14186,"Rider":14187,"pageant":14188,"dynamics":14189,"Cassidy":14190,"##icus":14191,"defenses":14192,"##tadt":14193,"##vant":14194,"aging":14195,"##inal":14196,"declare":14197,"mistress":14198,"supervised":14199,"##alis":14200,"##rest":14201,"Ashton":14202,"submerged":14203,"sack":14204,"Dodge":14205,"grocery":14206,"ramp":14207,"Teacher":14208,"lineage":14209,"imagery":14210,"arrange":14211,"inscriptions":14212,"Organisation":14213,"Siege":14214,"combines":14215,"pounded":14216,"Fleming":14217,"legends":14218,"columnist":14219,"Apostolic":14220,"prose":14221,"insight":14222,"Arabian":14223,"expired":14224,"##uses":14225,"##nos":14226,"Alone":14227,"elbows":14228,"##asis":14229,"##adi":14230,"##combe":14231,"Step":14232,"Waterloo":14233,"Alternate":14234,"interval":14235,"Sonny":14236,"plains":14237,"Goals":14238,"incorporating":14239,"recruit":14240,"adjoining":14241,"Cheshire":14242,"excluding":14243,"marrying":14244,"du
cked":14245,"Cherokee":14246,"par":14247,"##inate":14248,"hiking":14249,"Coal":14250,"##bow":14251,"natives":14252,"ribbon":14253,"Allies":14254,"con":14255,"descriptions":14256,"positively":14257,"##lal":14258,"defendant":14259,"22nd":14260,"Vivian":14261,"##beat":14262,"Weather":14263,"possessions":14264,"Date":14265,"sweetheart":14266,"inability":14267,"Salisbury":14268,"adviser":14269,"ideology":14270,"Nordic":14271,"##eu":14272,"Cubs":14273,"IP":14274,"Administrative":14275,"##nick":14276,"facto":14277,"liberation":14278,"Burnett":14279,"Javier":14280,"fashioned":14281,"Electoral":14282,"Turin":14283,"theft":14284,"unanimous":14285,"Per":14286,"1799":14287,"Clan":14288,"Hawkins":14289,"Teachers":14290,"##wes":14291,"Cameroon":14292,"Parkway":14293,"##gment":14294,"demolition":14295,"atoms":14296,"nucleus":14297,"##thi":14298,"recovering":14299,"##yte":14300,"##vice":14301,"lifts":14302,"Must":14303,"deposit":14304,"Hancock":14305,"Semi":14306,"darkened":14307,"Declaration":14308,"moan":14309,"muscular":14310,"Myers":14311,"attractions":14312,"sauce":14313,"simulation":14314,"##weed":14315,"Alps":14316,"barriers":14317,"##baum":14318,"Barack":14319,"galleries":14320,"Min":14321,"holders":14322,"Greenwich":14323,"donation":14324,"Everybody":14325,"Wolfgang":14326,"sandwich":14327,"Kendra":14328,"Collegiate":14329,"casino":14330,"Slavic":14331,"ensuing":14332,"Porto":14333,"##grapher":14334,"Jesuit":14335,"suppressed":14336,"tires":14337,"Ibrahim":14338,"protesters":14339,"Ibn":14340,"Amos":14341,"1796":14342,"phenomena":14343,"Hayden":14344,"Paraguay":14345,"Squad":14346,"Reilly":14347,"complement":14348,"aluminum":14349,"##eers":14350,"doubts":14351,"decay":14352,"demise":14353,"Practice":14354,"patience":14355,"fireplace":14356,"transparent":14357,"monarchy":14358,"##person":14359,"Rodney":14360,"mattered":14361,"rotating":14362,"Clifford":14363,"disposal":14364,"Standards":14365,"paced":14366,"##llie":14367,"arise":14368,"tallest":14369,"tug":14370,"documentat
ion":14371,"node":14372,"freeway":14373,"Nikolai":14374,"##cite":14375,"clicked":14376,"imaging":14377,"Lorraine":14378,"Tactical":14379,"Different":14380,"Regular":14381,"Holding":14382,"165":14383,"Pilot":14384,"guarded":14385,"##polis":14386,"Classics":14387,"Mongolia":14388,"Brock":14389,"monarch":14390,"cellular":14391,"receptors":14392,"Mini":14393,"Chandler":14394,"financed":14395,"financially":14396,"Lives":14397,"erection":14398,"Fuller":14399,"unnamed":14400,"Kannada":14401,"cc":14402,"passive":14403,"plateau":14404,"##arity":14405,"freak":14406,"##rde":14407,"retrieved":14408,"transactions":14409,"##sus":14410,"23rd":14411,"swimmer":14412,"beef":14413,"fulfill":14414,"Arlington":14415,"offspring":14416,"reasoning":14417,"Rhys":14418,"saves":14419,"pseudonym":14420,"centimetres":14421,"shivered":14422,"shuddered":14423,"##ME":14424,"Feel":14425,"##otic":14426,"professors":14427,"Blackburn":14428,"##eng":14429,"##life":14430,"##haw":14431,"interred":14432,"lodge":14433,"fragile":14434,"Della":14435,"guardian":14436,"##bbled":14437,"catalog":14438,"clad":14439,"observer":14440,"tract":14441,"declaring":14442,"##headed":14443,"Lok":14444,"dean":14445,"Isabelle":14446,"1776":14447,"irrigation":14448,"spectacular":14449,"shuttle":14450,"mastering":14451,"##aro":14452,"Nathaniel":14453,"Retired":14454,"##lves":14455,"Brennan":14456,"##kha":14457,"dick":14458,"##dated":14459,"##hler":14460,"Rookie":14461,"leapt":14462,"televised":14463,"weekends":14464,"Baghdad":14465,"Yemen":14466,"##fo":14467,"factions":14468,"ion":14469,"Lab":14470,"mortality":14471,"passionate":14472,"Hammer":14473,"encompasses":14474,"confluence":14475,"demonstrations":14476,"Ki":14477,"derivative":14478,"soils":14479,"##unch":14480,"Ranch":14481,"Universities":14482,"conventions":14483,"outright":14484,"aiming":14485,"hierarchy":14486,"reside":14487,"illusion":14488,"graves":14489,"rituals":14490,"126":14491,"Antwerp":14492,"Dover":14493,"##ema":14494,"campuses":14495,"Hobart":14496,"lifelo
ng":14497,"aliens":14498,"##vity":14499,"Memory":14500,"coordination":14501,"alphabet":14502,"##mina":14503,"Titans":14504,"pushes":14505,"Flanders":14506,"##holder":14507,"Normal":14508,"excellence":14509,"capped":14510,"profound":14511,"Taipei":14512,"portrayal":14513,"sparked":14514,"scratch":14515,"se":14516,"##eas":14517,"##hir":14518,"Mackenzie":14519,"##cation":14520,"Neo":14521,"Shin":14522,"##lined":14523,"magnificent":14524,"poster":14525,"batsman":14526,"##rgent":14527,"persuade":14528,"##ement":14529,"Icelandic":14530,"miserable":14531,"collegiate":14532,"Feature":14533,"geography":14534,"##mura":14535,"Comic":14536,"Circus":14537,"processor":14538,"barracks":14539,"Tale":14540,"##11":14541,"Bulls":14542,"##rap":14543,"strengthened":14544,"##bell":14545,"injection":14546,"miniature":14547,"broadly":14548,"Letter":14549,"fare":14550,"hostage":14551,"traders":14552,"##nium":14553,"##mere":14554,"Fortune":14555,"Rivera":14556,"Lu":14557,"triumph":14558,"Browns":14559,"Bangalore":14560,"cooperative":14561,"Basel":14562,"announcing":14563,"Sawyer":14564,"##him":14565,"##cco":14566,"##kara":14567,"darted":14568,"##AD":14569,"##nova":14570,"sucking":14571,"##position":14572,"perimeter":14573,"flung":14574,"Holdings":14575,"##NP":14576,"Basque":14577,"sketches":14578,"Augustine":14579,"Silk":14580,"Elijah":14581,"analyst":14582,"armour":14583,"riots":14584,"acquiring":14585,"ghosts":14586,"##ems":14587,"132":14588,"Pioneer":14589,"Colleges":14590,"Simone":14591,"Economy":14592,"Author":14593,"semester":14594,"Soldier":14595,"il":14596,"##unting":14597,"##bid":14598,"freaking":14599,"Vista":14600,"tumor":14601,"##bat":14602,"murderer":14603,"##eda":14604,"unreleased":14605,"##grove":14606,"##sser":14607,"##té":14608,"edit":14609,"statute":14610,"sovereign":14611,"##gawa":14612,"Killer":14613,"stares":14614,"Fury":14615,"comply":14616,"##lord":14617,"##nant":14618,"barrels":14619,"Andhra":14620,"Maple":14621,"generator":14622,"mascot":14623,"unusually":14624,"eds"
:14625,"##ante":14626,"##runner":14627,"rod":14628,"##tles":14629,"Historically":14630,"Jennings":14631,"dumped":14632,"Established":14633,"resemblance":14634,"##lium":14635,"##cise":14636,"##body":14637,"##voke":14638,"Lydia":14639,"##hou":14640,"##iring":14641,"nonetheless":14642,"1797":14643,"corrupt":14644,"patrons":14645,"physicist":14646,"sneak":14647,"Livingston":14648,"Citizens":14649,"Architects":14650,"Werner":14651,"trends":14652,"Melody":14653,"eighty":14654,"markings":14655,"brakes":14656,"##titled":14657,"oversaw":14658,"processed":14659,"mock":14660,"Midwest":14661,"intervals":14662,"##EF":14663,"stretches":14664,"werewolf":14665,"##MG":14666,"Pack":14667,"controller":14668,"##dition":14669,"Honours":14670,"cane":14671,"Griffith":14672,"vague":14673,"repertoire":14674,"Courtney":14675,"orgasm":14676,"Abdullah":14677,"dominance":14678,"occupies":14679,"Ya":14680,"introduces":14681,"Lester":14682,"instinct":14683,"collaborative":14684,"Indigenous":14685,"refusal":14686,"##rank":14687,"outlet":14688,"debts":14689,"spear":14690,"155":14691,"##keeping":14692,"##ulu":14693,"Catalan":14694,"##osh":14695,"tensions":14696,"##OT":14697,"bred":14698,"crude":14699,"Dunn":14700,"abdomen":14701,"accurately":14702,"##fu":14703,"##lough":14704,"accidents":14705,"Row":14706,"Audrey":14707,"rude":14708,"Getting":14709,"promotes":14710,"replies":14711,"Paolo":14712,"merge":14713,"##nock":14714,"trans":14715,"Evangelical":14716,"automated":14717,"Canon":14718,"##wear":14719,"##ggy":14720,"##gma":14721,"Broncos":14722,"foolish":14723,"icy":14724,"Voices":14725,"knives":14726,"Aside":14727,"dreamed":14728,"generals":14729,"molecule":14730,"AG":14731,"rejection":14732,"insufficient":14733,"##nagar":14734,"deposited":14735,"sacked":14736,"Landing":14737,"arches":14738,"helpful":14739,"devotion":14740,"intake":14741,"Flower":14742,"PGA":14743,"dragons":14744,"evolutionary":14745,"##mail":14746,"330":14747,"GM":14748,"tissues":14749,"##tree":14750,"arcade":14751,"composite":14
752,"lid":14753,"Across":14754,"implications":14755,"lacks":14756,"theological":14757,"assessed":14758,"concentrations":14759,"Den":14760,"##mans":14761,"##ulous":14762,"Fu":14763,"homeland":14764,"##stream":14765,"Harriet":14766,"ecclesiastical":14767,"troop":14768,"ecological":14769,"winked":14770,"##xed":14771,"eighteenth":14772,"Casino":14773,"specializing":14774,"##sworth":14775,"unlocked":14776,"supreme":14777,"devastated":14778,"snatched":14779,"trauma":14780,"GDP":14781,"Nord":14782,"saddle":14783,"Wes":14784,"convenient":14785,"competes":14786,"##nu":14787,"##iss":14788,"Marian":14789,"subway":14790,"##rri":14791,"successes":14792,"umbrella":14793,"##far":14794,"##ually":14795,"Dundee":14796,"##cence":14797,"spark":14798,"##rix":14799,"##я":14800,"Quality":14801,"Geological":14802,"cockpit":14803,"rpm":14804,"Cam":14805,"Bucharest":14806,"riot":14807,"##PM":14808,"Leah":14809,"##dad":14810,"##pose":14811,"Ka":14812,"m³":14813,"Bundesliga":14814,"Wolfe":14815,"grim":14816,"textile":14817,"quartet":14818,"expressing":14819,"fantastic":14820,"destroyers":14821,"eternal":14822,"picnic":14823,"##oro":14824,"contractor":14825,"1775":14826,"spanning":14827,"declining":14828,"##cating":14829,"Lowe":14830,"Sutherland":14831,"Emirates":14832,"downward":14833,"nineteen":14834,"violently":14835,"scout":14836,"viral":14837,"melting":14838,"enterprises":14839,"##cer":14840,"Crosby":14841,"Jubilee":14842,"antenna":14843,"urgent":14844,"Rory":14845,"##uin":14846,"##sure":14847,"wandering":14848,"##gler":14849,"##vent":14850,"Suzuki":14851,"Lifetime":14852,"Dirty":14853,"occupying":14854,"##quent":14855,"Disc":14856,"Guru":14857,"mound":14858,"Lennon":14859,"Humanities":14860,"listeners":14861,"Walton":14862,"uh":14863,"Braves":14864,"Bologna":14865,"##bis":14866,"##gra":14867,"Dwight":14868,"crawl":14869,"flags":14870,"memoir":14871,"Thorne":14872,"Archdiocese":14873,"dairy":14874,"##uz":14875,"##tery":14876,"roared":14877,"adjust":14878,"patches":14879,"inn":14880,"Knowin
g":14881,"##bbed":14882,"##zan":14883,"scan":14884,"Papa":14885,"precipitation":14886,"angrily":14887,"passages":14888,"postal":14889,"Phi":14890,"embraced":14891,"blacks":14892,"economist":14893,"triangular":14894,"Sen":14895,"shooter":14896,"punished":14897,"Millennium":14898,"Swimming":14899,"confessed":14900,"Aston":14901,"defeats":14902,"Era":14903,"cousins":14904,"Williamson":14905,"##rer":14906,"daytime":14907,"dumb":14908,"##rek":14909,"underway":14910,"specification":14911,"Buchanan":14912,"prayed":14913,"concealed":14914,"activation":14915,"##issa":14916,"canon":14917,"awesome":14918,"Starr":14919,"plural":14920,"summers":14921,"##fields":14922,"Slam":14923,"unnecessary":14924,"1791":14925,"resume":14926,"trilogy":14927,"compression":14928,"##rough":14929,"selective":14930,"dignity":14931,"Yan":14932,"##xton":14933,"immense":14934,"##yun":14935,"lone":14936,"seeded":14937,"hiatus":14938,"lightweight":14939,"summary":14940,"Yo":14941,"approve":14942,"Galway":14943,"rejoined":14944,"Elise":14945,"garbage":14946,"burns":14947,"speeches":14948,"129":14949,"Honduras":14950,"##liness":14951,"inventory":14952,"jersey":14953,"FK":14954,"assure":14955,"slumped":14956,"Lionel":14957,"Suite":14958,"##sbury":14959,"Lena":14960,"continuation":14961,"##AN":14962,"brightly":14963,"##nti":14964,"GT":14965,"Knowledge":14966,"##park":14967,"##lius":14968,"lethal":14969,"##tribution":14970,"##sions":14971,"Certificate":14972,"Mara":14973,"##lby":14974,"algorithms":14975,"Jade":14976,"blows":14977,"pirates":14978,"fleeing":14979,"wheelchair":14980,"Stein":14981,"sophomore":14982,"Alt":14983,"Territorial":14984,"diploma":14985,"snakes":14986,"##olic":14987,"##tham":14988,"Tiffany":14989,"Pius":14990,"flush":14991,"urging":14992,"Hanover":14993,"Reich":14994,"##olate":14995,"Unity":14996,"Pike":14997,"collectively":14998,"Theme":14999,"ballad":15000,"kindergarten":15001,"rocked":15002,"zoo":15003,"##page":15004,"whip":15005,"Rodríguez":15006,"strokes":15007,"checks":15008,"Beck
y":15009,"Stern":15010,"upstream":15011,"##uta":15012,"Silent":15013,"volunteered":15014,"Sigma":15015,"##ingen":15016,"##tract":15017,"##ede":15018,"Gujarat":15019,"screwed":15020,"entertaining":15021,"##action":15022,"##ryn":15023,"defenders":15024,"innocence":15025,"lesbian":15026,"que":15027,"Richie":15028,"nodes":15029,"Lie":15030,"juvenile":15031,"Jakarta":15032,"safer":15033,"confront":15034,"Bert":15035,"breakthrough":15036,"gospel":15037,"Cable":15038,"##zie":15039,"institutional":15040,"Archive":15041,"brake":15042,"liquor":15043,"feeds":15044,"##iate":15045,"chancellor":15046,"Encyclopedia":15047,"Animation":15048,"scanning":15049,"teens":15050,"##mother":15051,"Core":15052,"Rear":15053,"Wine":15054,"##flower":15055,"reactor":15056,"Ave":15057,"cardinal":15058,"sodium":15059,"strands":15060,"Olivier":15061,"crouched":15062,"Vaughan":15063,"Sammy":15064,"Image":15065,"scars":15066,"Emmanuel":15067,"flour":15068,"bias":15069,"nipple":15070,"revelation":15071,"##ucci":15072,"Denny":15073,"##ssy":15074,"Form":15075,"Runners":15076,"admits":15077,"Rama":15078,"violated":15079,"Burmese":15080,"feud":15081,"underwear":15082,"Mohamed":15083,"Named":15084,"swift":15085,"statewide":15086,"Door":15087,"Recently":15088,"comparing":15089,"Hundred":15090,"##idge":15091,"##nity":15092,"##rds":15093,"Rally":15094,"Reginald":15095,"Auburn":15096,"solving":15097,"waitress":15098,"Treasurer":15099,"##ilization":15100,"Halloween":15101,"Ministers":15102,"Boss":15103,"Shut":15104,"##listic":15105,"Rahman":15106,"demonstrating":15107,"##pies":15108,"Gaza":15109,"Yuri":15110,"installations":15111,"Math":15112,"schooling":15113,"##bble":15114,"Bronx":15115,"exiled":15116,"gasoline":15117,"133":15118,"bundle":15119,"humid":15120,"FCC":15121,"proportional":15122,"relate":15123,"VFL":15124,"##dez":15125,"continuity":15126,"##cene":15127,"syndicated":15128,"atmospheric":15129,"arrows":15130,"Wanderers":15131,"reinforcements":15132,"Willow":15133,"Lexington":15134,"Rotten":15135,"##y
on":15136,"discovering":15137,"Serena":15138,"portable":15139,"##lysis":15140,"targeting":15141,"£1":15142,"Goodman":15143,"Steam":15144,"sensors":15145,"detachment":15146,"Malik":15147,"##erie":15148,"attitudes":15149,"Goes":15150,"Kendall":15151,"Read":15152,"Sleep":15153,"beans":15154,"Nikki":15155,"modification":15156,"Jeanne":15157,"knuckles":15158,"Eleven":15159,"##iously":15160,"Gross":15161,"Jaime":15162,"dioxide":15163,"moisture":15164,"Stones":15165,"UCI":15166,"displacement":15167,"Metacritic":15168,"Jury":15169,"lace":15170,"rendering":15171,"elephant":15172,"Sergei":15173,"##quire":15174,"GP":15175,"Abbott":15176,"##type":15177,"projection":15178,"Mouse":15179,"Bishops":15180,"whispering":15181,"Kathleen":15182,"Rams":15183,"##jar":15184,"whites":15185,"##oran":15186,"assess":15187,"dispatched":15188,"##hire":15189,"kin":15190,"##mir":15191,"Nursing":15192,"advocates":15193,"tremendous":15194,"sweater":15195,"assisting":15196,"##bil":15197,"Farmer":15198,"prominently":15199,"reddish":15200,"Hague":15201,"cyclone":15202,"##SD":15203,"Sage":15204,"Lawson":15205,"Sanctuary":15206,"discharged":15207,"retains":15208,"##ube":15209,"shotgun":15210,"wilderness":15211,"Reformed":15212,"similarity":15213,"Entry":15214,"Watts":15215,"Bahá":15216,"Quest":15217,"Looks":15218,"visions":15219,"Reservoir":15220,"Arabs":15221,"curls":15222,"Blu":15223,"dripping":15224,"accomplish":15225,"Verlag":15226,"drill":15227,"sensor":15228,"Dillon":15229,"physicians":15230,"smashed":15231,"##dir":15232,"painters":15233,"Renault":15234,"straw":15235,"fading":15236,"Directorate":15237,"lounge":15238,"commissions":15239,"Brain":15240,"##graph":15241,"neo":15242,"##urg":15243,"plug":15244,"coordinated":15245,"##houses":15246,"Critical":15247,"lamps":15248,"illustrator":15249,"Returning":15250,"erosion":15251,"Crow":15252,"##ciation":15253,"blessing":15254,"Thought":15255,"Wife":15256,"medalist":15257,"synthesizer":15258,"Pam":15259,"Thornton":15260,"Esther":15261,"HBO":15262,"fond":1
5263,"Associates":15264,"##raz":15265,"pirate":15266,"permits":15267,"Wide":15268,"tire":15269,"##PC":15270,"Ernie":15271,"Nassau":15272,"transferring":15273,"RFC":15274,"##ntly":15275,"um":15276,"spit":15277,"AS":15278,"##mps":15279,"Mining":15280,"polar":15281,"villa":15282,"anchored":15283,"##zzi":15284,"embarrassment":15285,"relates":15286,"##ă":15287,"Rupert":15288,"counterparts":15289,"131":15290,"Baxter":15291,"##18":15292,"Igor":15293,"recognizes":15294,"Clive":15295,"##hane":15296,"##eries":15297,"##ibly":15298,"occurrence":15299,"##scope":15300,"fin":15301,"colorful":15302,"Rapids":15303,"banker":15304,"tile":15305,"##rative":15306,"##dus":15307,"delays":15308,"destinations":15309,"##llis":15310,"Pond":15311,"Dane":15312,"grandparents":15313,"rewarded":15314,"socially":15315,"motorway":15316,"##hof":15317,"##lying":15318,"##human":15319,"modeled":15320,"Dayton":15321,"Forward":15322,"conscience":15323,"Sharma":15324,"whistle":15325,"Mayer":15326,"Sasha":15327,"##pical":15328,"circuits":15329,"Zhou":15330,"##ça":15331,"Latvian":15332,"finalists":15333,"predators":15334,"Lafayette":15335,"closes":15336,"obligations":15337,"Resolution":15338,"##vier":15339,"Trustees":15340,"reminiscent":15341,"##hos":15342,"Highlands":15343,"Protected":15344,"asylum":15345,"evacuation":15346,"##acy":15347,"Chevrolet":15348,"confession":15349,"Somalia":15350,"emergence":15351,"separating":15352,"##rica":15353,"alright":15354,"calcium":15355,"Laurent":15356,"Welfare":15357,"Leonardo":15358,"ashes":15359,"dental":15360,"Deal":15361,"minerals":15362,"##lump":15363,"##mount":15364,"accounted":15365,"staggered":15366,"slogan":15367,"photographic":15368,"builder":15369,"##imes":15370,"##raft":15371,"tragic":15372,"144":15373,"SEC":15374,"Hit":15375,"tailed":15376,"##ples":15377,"##rring":15378,"##rson":15379,"ethical":15380,"wrestlers":15381,"concludes":15382,"lunar":15383,"##ept":15384,"nitrogen":15385,"Aid":15386,"cyclist":15387,"quarterfinals":15388,"##ه":15389,"harvest":15390,"#
#hem":15391,"Pasha":15392,"IL":15393,"##mis":15394,"continually":15395,"##forth":15396,"Intel":15397,"bucket":15398,"##ended":15399,"witches":15400,"pretended":15401,"dresses":15402,"viewer":15403,"peculiar":15404,"lowering":15405,"volcano":15406,"Marilyn":15407,"Qualifier":15408,"clung":15409,"##sher":15410,"Cut":15411,"modules":15412,"Bowie":15413,"##lded":15414,"onset":15415,"transcription":15416,"residences":15417,"##pie":15418,"##itor":15419,"scrapped":15420,"##bic":15421,"Monaco":15422,"Mayo":15423,"eternity":15424,"Strike":15425,"uncovered":15426,"skeleton":15427,"##wicz":15428,"Isles":15429,"bug":15430,"Promoted":15431,"##rush":15432,"Mechanical":15433,"XII":15434,"##ivo":15435,"gripping":15436,"stubborn":15437,"velvet":15438,"TD":15439,"decommissioned":15440,"operas":15441,"spatial":15442,"unstable":15443,"Congressman":15444,"wasted":15445,"##aga":15446,"##ume":15447,"advertisements":15448,"##nya":15449,"obliged":15450,"Cannes":15451,"Conway":15452,"bricks":15453,"##gnant":15454,"##mity":15455,"##uise":15456,"jumps":15457,"Clear":15458,"##cine":15459,"##sche":15460,"chord":15461,"utter":15462,"Su":15463,"podium":15464,"spokesman":15465,"Royce":15466,"assassin":15467,"confirmation":15468,"licensing":15469,"liberty":15470,"##rata":15471,"Geographic":15472,"individually":15473,"detained":15474,"##ffe":15475,"Saturn":15476,"crushing":15477,"airplane":15478,"bushes":15479,"knights":15480,"##PD":15481,"Lilly":15482,"hurts":15483,"unexpectedly":15484,"Conservatives":15485,"pumping":15486,"Forty":15487,"candle":15488,"Pérez":15489,"peasants":15490,"supplement":15491,"Sundays":15492,"##ggs":15493,"##rries":15494,"risen":15495,"enthusiastic":15496,"corresponds":15497,"pending":15498,"##IF":15499,"Owens":15500,"floods":15501,"Painter":15502,"inflation":15503,"presumed":15504,"inscribed":15505,"Chamberlain":15506,"bizarre":15507,"1200":15508,"liability":15509,"reacted":15510,"tub":15511,"Legacy":15512,"##eds":15513,"##pted":15514,"shone":15515,"##litz":15516,"##NC":155
17,"Tiny":15518,"genome":15519,"bays":15520,"Eduardo":15521,"robbery":15522,"stall":15523,"hatch":15524,"Depot":15525,"Variety":15526,"Flora":15527,"reprinted":15528,"trembled":15529,"outlined":15530,"CR":15531,"Theresa":15532,"spans":15533,"##plication":15534,"Jensen":15535,"##eering":15536,"posting":15537,"##rky":15538,"pays":15539,"##ost":15540,"Marcos":15541,"fortifications":15542,"inferior":15543,"##ential":15544,"Devi":15545,"despair":15546,"Talbot":15547,"##chus":15548,"updates":15549,"ego":15550,"Booth":15551,"Darius":15552,"tops":15553,"##lau":15554,"Scene":15555,"##DC":15556,"Harlem":15557,"Trey":15558,"Generally":15559,"candles":15560,"##α":15561,"Neville":15562,"Admiralty":15563,"##hong":15564,"iconic":15565,"victorious":15566,"1600":15567,"Rowan":15568,"abundance":15569,"miniseries":15570,"clutching":15571,"sanctioned":15572,"##words":15573,"obscure":15574,"##ision":15575,"##rle":15576,"##EM":15577,"disappearing":15578,"Resort":15579,"Obviously":15580,"##eb":15581,"exceeded":15582,"1870s":15583,"Adults":15584,"##cts":15585,"Cry":15586,"Kerr":15587,"ragged":15588,"selfish":15589,"##lson":15590,"circled":15591,"pillars":15592,"galaxy":15593,"##asco":15594,"##mental":15595,"rebuild":15596,"caution":15597,"Resistance":15598,"Start":15599,"bind":15600,"splitting":15601,"Baba":15602,"Hogan":15603,"ps":15604,"partnerships":15605,"slam":15606,"Peggy":15607,"courthouse":15608,"##OD":15609,"organizational":15610,"packages":15611,"Angie":15612,"##nds":15613,"possesses":15614,"##rp":15615,"Expressway":15616,"Gould":15617,"Terror":15618,"Him":15619,"Geoff":15620,"nobles":15621,"##ope":15622,"shark":15623,"##nh":15624,"identifies":15625,"##oor":15626,"testified":15627,"Playing":15628,"##ump":15629,"##isa":15630,"stool":15631,"Idol":15632,"##pice":15633,"##tana":15634,"Byrne":15635,"Gerry":15636,"grunted":15637,"26th":15638,"observing":15639,"habits":15640,"privilege":15641,"immortal":15642,"wagons":15643,"##thy":15644,"dot":15645,"Bring":15646,"##lian":15647,"##witz"
:15648,"newest":15649,"##uga":15650,"constraints":15651,"Screen":15652,"Issue":15653,"##RNA":15654,"##vil":15655,"reminder":15656,"##gles":15657,"addiction":15658,"piercing":15659,"stunning":15660,"var":15661,"##rita":15662,"Signal":15663,"accumulated":15664,"##wide":15665,"float":15666,"devastating":15667,"viable":15668,"cartoons":15669,"Uttar":15670,"flared":15671,"##encies":15672,"Theology":15673,"patents":15674,"##bahn":15675,"privileges":15676,"##ava":15677,"##CO":15678,"137":15679,"##oped":15680,"##NT":15681,"orchestral":15682,"medication":15683,"225":15684,"erect":15685,"Nadia":15686,"École":15687,"fried":15688,"Sales":15689,"scripts":15690,"##rease":15691,"airs":15692,"Cage":15693,"inadequate":15694,"structured":15695,"countless":15696,"Avengers":15697,"Kathy":15698,"disguise":15699,"mirrors":15700,"Investigation":15701,"reservation":15702,"##nson":15703,"Legends":15704,"humorous":15705,"Mona":15706,"decorations":15707,"attachment":15708,"Via":15709,"motivation":15710,"Browne":15711,"strangers":15712,"##ński":15713,"Shadows":15714,"Twins":15715,"##pressed":15716,"Alma":15717,"Nominated":15718,"##ott":15719,"Sergio":15720,"canopy":15721,"152":15722,"Semifinals":15723,"devised":15724,"##irk":15725,"upwards":15726,"Traffic":15727,"Goddess":15728,"Move":15729,"beetles":15730,"138":15731,"spat":15732,"##anne":15733,"holdings":15734,"##SP":15735,"tangled":15736,"Whilst":15737,"Fowler":15738,"anthem":15739,"##ING":15740,"##ogy":15741,"snarled":15742,"moonlight":15743,"songwriting":15744,"tolerance":15745,"Worlds":15746,"exams":15747,"##pia":15748,"notices":15749,"sensitivity":15750,"poetic":15751,"Stephens":15752,"Boone":15753,"insect":15754,"reconstructed":15755,"Fresh":15756,"27th":15757,"balloon":15758,"##ables":15759,"Brendan":15760,"mug":15761,"##gee":15762,"1780":15763,"apex":15764,"exports":15765,"slides":15766,"Lahore":15767,"hiring":15768,"Shell":15769,"electorate":15770,"sexuality":15771,"poker":15772,"nonprofit":15773,"##imate":15774,"cone":15775,"##uce"
:15776,"Okinawa":15777,"superintendent":15778,"##HC":15779,"referenced":15780,"turret":15781,"Sprint":15782,"Citizen":15783,"equilibrium":15784,"Stafford":15785,"curb":15786,"Driver":15787,"Valerie":15788,"##rona":15789,"aching":15790,"impacts":15791,"##bol":15792,"observers":15793,"Downs":15794,"Shri":15795,"##uth":15796,"airports":15797,"##uda":15798,"assignments":15799,"curtains":15800,"solitary":15801,"icon":15802,"patrols":15803,"substances":15804,"Jasper":15805,"mountainous":15806,"Published":15807,"ached":15808,"##ingly":15809,"announce":15810,"dove":15811,"damaging":15812,"##tism":15813,"Primera":15814,"Dexter":15815,"limiting":15816,"batch":15817,"##uli":15818,"undergoing":15819,"refugee":15820,"Ye":15821,"admiral":15822,"pavement":15823,"##WR":15824,"##reed":15825,"pipeline":15826,"desires":15827,"Ramsey":15828,"Sheila":15829,"thickness":15830,"Brotherhood":15831,"Tea":15832,"instituted":15833,"Belt":15834,"Break":15835,"plots":15836,"##ais":15837,"masculine":15838,"##where":15839,"Theo":15840,"##aged":15841,"##mined":15842,"Experience":15843,"scratched":15844,"Ethiopian":15845,"Teaching":15846,"##nov":15847,"Aiden":15848,"Abe":15849,"Samoa":15850,"conditioning":15851,"##mous":15852,"Otherwise":15853,"fade":15854,"Jenks":15855,"##encing":15856,"Nat":15857,"##lain":15858,"Anyone":15859,"##kis":15860,"smirk":15861,"Riding":15862,"##nny":15863,"Bavarian":15864,"blessed":15865,"potatoes":15866,"Hook":15867,"##wise":15868,"likewise":15869,"hardened":15870,"Merry":15871,"amid":15872,"persecution":15873,"##sten":15874,"Elections":15875,"Hoffman":15876,"Pitt":15877,"##vering":15878,"distraction":15879,"exploitation":15880,"infamous":15881,"quote":15882,"averaging":15883,"healed":15884,"Rhythm":15885,"Germanic":15886,"Mormon":15887,"illuminated":15888,"guides":15889,"##ische":15890,"interfere":15891,"##ilized":15892,"rector":15893,"perennial":15894,"##ival":15895,"Everett":15896,"courtesy":15897,"##nham":15898,"Kirby":15899,"Mk":15900,"##vic":15901,"Medieval":15902
,"##tale":15903,"Luigi":15904,"limp":15905,"##diction":15906,"Alive":15907,"greeting":15908,"shove":15909,"##force":15910,"##fly":15911,"Jasmine":15912,"Bend":15913,"Capt":15914,"Suzanne":15915,"ditch":15916,"134":15917,"##nning":15918,"Host":15919,"fathers":15920,"rebuilding":15921,"Vocal":15922,"wires":15923,"##manship":15924,"tan":15925,"Factor":15926,"fixture":15927,"##LS":15928,"Māori":15929,"Plate":15930,"pyramid":15931,"##umble":15932,"slap":15933,"Schneider":15934,"yell":15935,"##ulture":15936,"##tional":15937,"Goodbye":15938,"sore":15939,"##pher":15940,"depressed":15941,"##dox":15942,"pitching":15943,"Find":15944,"Lotus":15945,"##wang":15946,"strand":15947,"Teen":15948,"debates":15949,"prevalent":15950,"##bilities":15951,"exposing":15952,"hears":15953,"billed":15954,"##rse":15955,"reorganized":15956,"compelled":15957,"disturbing":15958,"displaying":15959,"##tock":15960,"Clinical":15961,"emotionally":15962,"##iah":15963,"Derbyshire":15964,"grouped":15965,"##quel":15966,"Bahrain":15967,"Journalism":15968,"IN":15969,"persistent":15970,"blankets":15971,"Crane":15972,"camping":15973,"Direct":15974,"proving":15975,"Lola":15976,"##dding":15977,"Corporate":15978,"birthplace":15979,"##boats":15980,"##ender":15981,"Figure":15982,"dared":15983,"Assam":15984,"precursor":15985,"##nched":15986,"Tribe":15987,"Restoration":15988,"slate":15989,"Meyrick":15990,"hunted":15991,"stroking":15992,"Earlier":15993,"Kind":15994,"polls":15995,"appeals":15996,"monetary":15997,"##reate":15998,"Kira":15999,"Langdon":16000,"explores":16001,"GPS":16002,"extensions":16003,"squares":16004,"Results":16005,"draped":16006,"announcer":16007,"merit":16008,"##ennial":16009,"##tral":16010,"##roved":16011,"##cion":16012,"robots":16013,"supervisor":16014,"snorted":16015,"##group":16016,"Cannon":16017,"procession":16018,"monkey":16019,"freeze":16020,"sleeves":16021,"Nile":16022,"verdict":16023,"ropes":16024,"firearms":16025,"extraction":16026,"tensed":16027,"EC":16028,"Saunders":16029,"##tches":16030
,"diamonds":16031,"Marriage":16032,"##amble":16033,"curling":16034,"Amazing":16035,"##haling":16036,"unrelated":16037,"##roads":16038,"Daughter":16039,"cum":16040,"discarded":16041,"kidney":16042,"cliffs":16043,"forested":16044,"Candy":16045,"##lap":16046,"authentic":16047,"tablet":16048,"notation":16049,"##nburg":16050,"Bulldogs":16051,"Callum":16052,"Meet":16053,"mouths":16054,"coated":16055,"##xe":16056,"Truman":16057,"combinations":16058,"##mation":16059,"Steelers":16060,"Fan":16061,"Than":16062,"paternal":16063,"##father":16064,"##uti":16065,"Rebellion":16066,"inviting":16067,"Fun":16068,"theatres":16069,"##ي":16070,"##rom":16071,"curator":16072,"##cision":16073,"networking":16074,"Oz":16075,"drought":16076,"##ssel":16077,"granting":16078,"MBA":16079,"Shelby":16080,"Elaine":16081,"jealousy":16082,"Kyoto":16083,"shores":16084,"signaling":16085,"tenants":16086,"debated":16087,"Intermediate":16088,"Wise":16089,"##hes":16090,"##pu":16091,"Havana":16092,"duke":16093,"vicious":16094,"exited":16095,"servers":16096,"Nonetheless":16097,"Reports":16098,"explode":16099,"##beth":16100,"Nationals":16101,"offerings":16102,"Oval":16103,"conferred":16104,"eponymous":16105,"folklore":16106,"##NR":16107,"Shire":16108,"planting":16109,"1783":16110,"Zeus":16111,"accelerated":16112,"Constable":16113,"consuming":16114,"troubles":16115,"McCartney":16116,"texture":16117,"bust":16118,"Immigration":16119,"excavated":16120,"hopefully":16121,"##cession":16122,"##coe":16123,"##name":16124,"##ully":16125,"lining":16126,"Einstein":16127,"Venezuelan":16128,"reissued":16129,"minorities":16130,"Beatrice":16131,"crystals":16132,"##nies":16133,"circus":16134,"lava":16135,"Beirut":16136,"extinction":16137,"##shu":16138,"Becker":16139,"##uke":16140,"issuing":16141,"Zurich":16142,"extract":16143,"##esta":16144,"##rred":16145,"regulate":16146,"progression":16147,"hut":16148,"alcoholic":16149,"plea":16150,"AB":16151,"Norse":16152,"Hubert":16153,"Mansfield":16154,"ashamed":16155,"##put":16156,"Bombardm
ent":16157,"stripes":16158,"electrons":16159,"Denise":16160,"horrified":16161,"Nor":16162,"arranger":16163,"Hay":16164,"Koch":16165,"##ddling":16166,"##iner":16167,"Birthday":16168,"Josie":16169,"deliberate":16170,"explorer":16171,"##jiang":16172,"##signed":16173,"Arrow":16174,"wiping":16175,"satellites":16176,"baritone":16177,"mobility":16178,"##rals":16179,"Dorset":16180,"turbine":16181,"Coffee":16182,"185":16183,"##lder":16184,"Cara":16185,"Colts":16186,"pits":16187,"Crossing":16188,"coral":16189,"##birth":16190,"Tai":16191,"zombie":16192,"smoothly":16193,"##hp":16194,"mates":16195,"##ady":16196,"Marguerite":16197,"##tary":16198,"puzzled":16199,"tapes":16200,"overly":16201,"Sonic":16202,"Prayer":16203,"Thinking":16204,"##uf":16205,"IEEE":16206,"obligation":16207,"##cliffe":16208,"Basil":16209,"redesignated":16210,"##mmy":16211,"nostrils":16212,"Barney":16213,"XIII":16214,"##phones":16215,"vacated":16216,"unused":16217,"Berg":16218,"##roid":16219,"Towards":16220,"viola":16221,"136":16222,"Event":16223,"subdivided":16224,"rabbit":16225,"recruiting":16226,"##nery":16227,"Namibia":16228,"##16":16229,"##ilation":16230,"recruits":16231,"Famous":16232,"Francesca":16233,"##hari":16234,"Goa":16235,"##lat":16236,"Karachi":16237,"haul":16238,"biblical":16239,"##cible":16240,"MGM":16241,"##rta":16242,"horsepower":16243,"profitable":16244,"Grandma":16245,"importantly":16246,"Martinez":16247,"incoming":16248,"##kill":16249,"beneficial":16250,"nominal":16251,"praying":16252,"##isch":16253,"gable":16254,"nail":16255,"noises":16256,"##ttle":16257,"Polytechnic":16258,"rub":16259,"##cope":16260,"Thor":16261,"audition":16262,"erotic":16263,"##ending":16264,"##iano":16265,"Ultimately":16266,"armoured":16267,"##mum":16268,"presently":16269,"pedestrian":16270,"##tled":16271,"Ipswich":16272,"offence":16273,"##ffin":16274,"##borne":16275,"Flemish":16276,"##hman":16277,"echo":16278,"##cting":16279,"auditorium":16280,"gentlemen":16281,"winged":16282,"##tched":16283,"Nicaragua":16284,"Unkno
wn":16285,"prosperity":16286,"exhaust":16287,"pie":16288,"Peruvian":16289,"compartment":16290,"heights":16291,"disabilities":16292,"##pole":16293,"Harding":16294,"Humphrey":16295,"postponed":16296,"moths":16297,"Mathematical":16298,"Mets":16299,"posters":16300,"axe":16301,"##nett":16302,"Nights":16303,"Typically":16304,"chuckle":16305,"councillors":16306,"alternating":16307,"141":16308,"Norris":16309,"##ately":16310,"##etus":16311,"deficit":16312,"dreaming":16313,"cooler":16314,"oppose":16315,"Beethoven":16316,"##esis":16317,"Marquis":16318,"flashlight":16319,"headache":16320,"investor":16321,"responding":16322,"appointments":16323,"##shore":16324,"Elias":16325,"ideals":16326,"shades":16327,"torch":16328,"lingering":16329,"##real":16330,"pier":16331,"fertile":16332,"Diploma":16333,"currents":16334,"Snake":16335,"##horse":16336,"##15":16337,"Briggs":16338,"##ota":16339,"##hima":16340,"##romatic":16341,"Coastal":16342,"Kuala":16343,"ankles":16344,"Rae":16345,"slice":16346,"Hilton":16347,"locking":16348,"Approximately":16349,"Workshop":16350,"Niagara":16351,"strangely":16352,"##scence":16353,"functionality":16354,"advertisement":16355,"Rapid":16356,"Anders":16357,"ho":16358,"Soviets":16359,"packing":16360,"basal":16361,"Sunderland":16362,"Permanent":16363,"##fting":16364,"rack":16365,"tying":16366,"Lowell":16367,"##ncing":16368,"Wizard":16369,"mighty":16370,"tertiary":16371,"pencil":16372,"dismissal":16373,"torso":16374,"grasped":16375,"##yev":16376,"Sand":16377,"gossip":16378,"##nae":16379,"Beer":16380,"implementing":16381,"##19":16382,"##riya":16383,"Fork":16384,"Bee":16385,"##eria":16386,"Win":16387,"##cid":16388,"sailor":16389,"pressures":16390,"##oping":16391,"speculated":16392,"Freddie":16393,"originating":16394,"##DF":16395,"##SR":16396,"##outh":16397,"28th":16398,"melt":16399,"Brenda":16400,"lump":16401,"Burlington":16402,"USC":16403,"marginal":16404,"##bine":16405,"Dogs":16406,"swamp":16407,"cu":16408,"Ex":16409,"uranium":16410,"metro":16411,"spill":16412,"Pie
tro":16413,"seize":16414,"Chorus":16415,"partition":16416,"##dock":16417,"##media":16418,"engineered":16419,"##oria":16420,"conclusions":16421,"subdivision":16422,"##uid":16423,"Illustrated":16424,"Leading":16425,"##hora":16426,"Berkshire":16427,"definite":16428,"##books":16429,"##cin":16430,"##suke":16431,"noun":16432,"winced":16433,"Doris":16434,"dissertation":16435,"Wilderness":16436,"##quest":16437,"braced":16438,"arbitrary":16439,"kidnapping":16440,"Kurdish":16441,"##but":16442,"clearance":16443,"excavations":16444,"wanna":16445,"Allmusic":16446,"insult":16447,"presided":16448,"yacht":16449,"##SM":16450,"Honour":16451,"Tin":16452,"attracting":16453,"explosives":16454,"Gore":16455,"Bride":16456,"##ience":16457,"Packers":16458,"Devils":16459,"Observer":16460,"##course":16461,"Loser":16462,"##erry":16463,"##hardt":16464,"##mble":16465,"Cyrillic":16466,"undefeated":16467,"##stra":16468,"subordinate":16469,"##ame":16470,"Wigan":16471,"compulsory":16472,"Pauline":16473,"Cruise":16474,"Opposition":16475,"##ods":16476,"Period":16477,"dispersed":16478,"expose":16479,"##60":16480,"##has":16481,"Certain":16482,"Clerk":16483,"Wolves":16484,"##hibition":16485,"apparatus":16486,"allegiance":16487,"orbital":16488,"justified":16489,"thanked":16490,"##ević":16491,"Biblical":16492,"Carolyn":16493,"Graves":16494,"##tton":16495,"Hercules":16496,"backgrounds":16497,"replica":16498,"1788":16499,"aquatic":16500,"Mega":16501,"Stirling":16502,"obstacles":16503,"filing":16504,"Founder":16505,"vowels":16506,"Deborah":16507,"Rotterdam":16508,"surpassed":16509,"Belarusian":16510,"##ologists":16511,"Zambia":16512,"Ren":16513,"Olga":16514,"Alpine":16515,"bi":16516,"councillor":16517,"Oaks":16518,"Animals":16519,"eliminating":16520,"digit":16521,"Managing":16522,"##GE":16523,"laundry":16524,"##rdo":16525,"presses":16526,"slamming":16527,"Tudor":16528,"thief":16529,"posterior":16530,"##bas":16531,"Rodgers":16532,"smells":16533,"##ining":16534,"Hole":16535,"SUV":16536,"trombone":16537,"numberin
g":16538,"representations":16539,"Domingo":16540,"Paralympics":16541,"cartridge":16542,"##rash":16543,"Combined":16544,"shelves":16545,"Kraków":16546,"revision":16547,"##frame":16548,"Sánchez":16549,"##tracted":16550,"##bler":16551,"Alain":16552,"townships":16553,"sic":16554,"trousers":16555,"Gibbs":16556,"anterior":16557,"symmetry":16558,"vaguely":16559,"Castile":16560,"IRA":16561,"resembling":16562,"Penguin":16563,"##ulent":16564,"infections":16565,"##stant":16566,"raped":16567,"##pressive":16568,"worrying":16569,"brains":16570,"bending":16571,"JR":16572,"Evidence":16573,"Venetian":16574,"complexes":16575,"Jonah":16576,"850":16577,"exported":16578,"Ambrose":16579,"Gap":16580,"philanthropist":16581,"##atus":16582,"Marxist":16583,"weighing":16584,"##KO":16585,"##nath":16586,"Soldiers":16587,"chiefs":16588,"reject":16589,"repeating":16590,"shaky":16591,"Zürich":16592,"preserving":16593,"##xin":16594,"cigarettes":16595,"##break":16596,"mortar":16597,"##fin":16598,"Already":16599,"reproduction":16600,"socks":16601,"Waiting":16602,"amazed":16603,"##aca":16604,"dash":16605,"##path":16606,"Airborne":16607,"##harf":16608,"##get":16609,"descending":16610,"OBE":16611,"Sant":16612,"Tess":16613,"Lucius":16614,"enjoys":16615,"##ttered":16616,"##ivation":16617,"##ete":16618,"Leinster":16619,"Phillies":16620,"execute":16621,"geological":16622,"unfinished":16623,"Courts":16624,"SP":16625,"Beaver":16626,"Duck":16627,"motions":16628,"Platinum":16629,"friction":16630,"##aud":16631,"##bet":16632,"Parts":16633,"Stade":16634,"entirety":16635,"sprang":16636,"Smithsonian":16637,"coffin":16638,"prolonged":16639,"Borneo":16640,"##vise":16641,"unanimously":16642,"##uchi":16643,"Cars":16644,"Cassandra":16645,"Australians":16646,"##CT":16647,"##rgen":16648,"Louisa":16649,"spur":16650,"Constance":16651,"##lities":16652,"Patent":16653,"racism":16654,"tempo":16655,"##ssion":16656,"##chard":16657,"##nology":16658,"##claim":16659,"Million":16660,"Nichols":16661,"##dah":16662,"Numerous":16663,"ing":
16664,"Pure":16665,"plantations":16666,"donor":16667,"##EP":16668,"##rip":16669,"convenience":16670,"##plate":16671,"dots":16672,"indirect":16673,"##written":16674,"Dong":16675,"failures":16676,"adapt":16677,"wizard":16678,"unfortunately":16679,"##gion":16680,"practitioners":16681,"economically":16682,"Enrique":16683,"unchanged":16684,"kingdoms":16685,"refined":16686,"definitions":16687,"lazy":16688,"worries":16689,"railing":16690,"##nay":16691,"Kaiser":16692,"##lug":16693,"cracks":16694,"sells":16695,"ninety":16696,"##WC":16697,"Directed":16698,"denotes":16699,"developmental":16700,"papal":16701,"unfortunate":16702,"disappointing":16703,"sixteenth":16704,"Jen":16705,"##urier":16706,"NWA":16707,"drifting":16708,"Horror":16709,"##chemical":16710,"behaviors":16711,"bury":16712,"surfaced":16713,"foreigners":16714,"slick":16715,"AND":16716,"##rene":16717,"##ditions":16718,"##teral":16719,"scrap":16720,"kicks":16721,"comprise":16722,"buddy":16723,"##anda":16724,"Mental":16725,"##ype":16726,"Dom":16727,"wines":16728,"Limerick":16729,"Luca":16730,"Rand":16731,"##won":16732,"Tomatoes":16733,"homage":16734,"geometric":16735,"##nted":16736,"telescope":16737,"Shelley":16738,"poles":16739,"##fan":16740,"shareholders":16741,"Autonomous":16742,"cope":16743,"intensified":16744,"Genoa":16745,"Reformation":16746,"grazing":16747,"##tern":16748,"Zhao":16749,"provisional":16750,"##bies":16751,"Con":16752,"##riel":16753,"Cynthia":16754,"Raleigh":16755,"vivid":16756,"threaten":16757,"Length":16758,"subscription":16759,"roses":16760,"Müller":16761,"##isms":16762,"robin":16763,"##tial":16764,"Laos":16765,"Stanton":16766,"nationalism":16767,"##clave":16768,"##ND":16769,"##17":16770,"##zz":16771,"staging":16772,"Busch":16773,"Cindy":16774,"relieve":16775,"##spective":16776,"packs":16777,"neglected":16778,"CBE":16779,"alpine":16780,"Evolution":16781,"uneasy":16782,"coastline":16783,"Destiny":16784,"Barber":16785,"Julio":16786,"##tted":16787,"informs":16788,"unprecedented":16789,"Pavilion":167
90,"##bei":16791,"##ference":16792,"betrayal":16793,"awaiting":16794,"leaked":16795,"V8":16796,"puppet":16797,"adverse":16798,"Bourne":16799,"Sunset":16800,"collectors":16801,"##glass":16802,"##sque":16803,"copied":16804,"Demon":16805,"conceded":16806,"resembled":16807,"Rafe":16808,"Levy":16809,"prosecutor":16810,"##ject":16811,"flora":16812,"manned":16813,"deaf":16814,"Mosque":16815,"reminds":16816,"Lizzie":16817,"Products":16818,"Funny":16819,"cassette":16820,"congress":16821,"##rong":16822,"Rover":16823,"tossing":16824,"prompting":16825,"chooses":16826,"Satellite":16827,"cautiously":16828,"Reese":16829,"##UT":16830,"Huang":16831,"Gloucestershire":16832,"giggled":16833,"Kitty":16834,"##å":16835,"Pleasant":16836,"Aye":16837,"##ond":16838,"judging":16839,"1860s":16840,"intentionally":16841,"Hurling":16842,"aggression":16843,"##xy":16844,"transfers":16845,"employing":16846,"##fies":16847,"##oda":16848,"Archibald":16849,"Blessed":16850,"Ski":16851,"flavor":16852,"Rosie":16853,"##burgh":16854,"sunset":16855,"Scholarship":16856,"WC":16857,"surround":16858,"ranged":16859,"##jay":16860,"Degree":16861,"Houses":16862,"squeezing":16863,"limb":16864,"premium":16865,"Leningrad":16866,"steals":16867,"##inated":16868,"##ssie":16869,"madness":16870,"vacancy":16871,"hydraulic":16872,"Northampton":16873,"##prise":16874,"Marks":16875,"Boxing":16876,"##fying":16877,"academics":16878,"##lich":16879,"##TY":16880,"CDs":16881,"##lma":16882,"hardcore":16883,"monitors":16884,"paperback":16885,"cables":16886,"Dimitri":16887,"upside":16888,"advent":16889,"Ra":16890,"##clusive":16891,"Aug":16892,"Christchurch":16893,"objected":16894,"stalked":16895,"Simple":16896,"colonists":16897,"##laid":16898,"CT":16899,"discusses":16900,"fellowship":16901,"Carnival":16902,"cares":16903,"Miracle":16904,"pastoral":16905,"rooted":16906,"shortage":16907,"borne":16908,"Quentin":16909,"meditation":16910,"tapping":16911,"Novel":16912,"##ades":16913,"Alicia":16914,"Burn":16915,"famed":16916,"residency":16917,"Fer
nández":16918,"Johannesburg":16919,"Zhu":16920,"offended":16921,"Mao":16922,"outward":16923,"##inas":16924,"XV":16925,"denial":16926,"noticing":16927,"##ís":16928,"quarry":16929,"##hound":16930,"##amo":16931,"Bernie":16932,"Bentley":16933,"Joanna":16934,"mortgage":16935,"##rdi":16936,"##sumption":16937,"lenses":16938,"extracted":16939,"depiction":16940,"##RE":16941,"Networks":16942,"Broad":16943,"Revenue":16944,"flickered":16945,"virgin":16946,"flanked":16947,"##о":16948,"Enterprises":16949,"probable":16950,"Liberals":16951,"Falcons":16952,"drowning":16953,"phrases":16954,"loads":16955,"assumes":16956,"inhaled":16957,"awe":16958,"logs":16959,"slightest":16960,"spiders":16961,"waterfall":16962,"##pate":16963,"rocking":16964,"shrub":16965,"##uil":16966,"roofs":16967,"##gard":16968,"prehistoric":16969,"wary":16970,"##rak":16971,"TO":16972,"clips":16973,"sustain":16974,"treason":16975,"microphone":16976,"voter":16977,"Lamb":16978,"psychologist":16979,"wrinkled":16980,"##ères":16981,"mating":16982,"Carrier":16983,"340":16984,"##lbert":16985,"sensing":16986,"##rino":16987,"destiny":16988,"distract":16989,"weaker":16990,"UC":16991,"Nearly":16992,"neurons":16993,"spends":16994,"Apache":16995,"##rem":16996,"genuinely":16997,"wells":16998,"##lanted":16999,"stereo":17000,"##girl":17001,"Lois":17002,"Leaving":17003,"consul":17004,"fungi":17005,"Pier":17006,"Cyril":17007,"80s":17008,"Jungle":17009,"##tani":17010,"illustration":17011,"Split":17012,"##hana":17013,"Abigail":17014,"##patrick":17015,"1787":17016,"diminished":17017,"Selected":17018,"packaging":17019,"##EG":17020,"Martínez":17021,"communal":17022,"Manufacturing":17023,"sentiment":17024,"143":17025,"unwilling":17026,"praising":17027,"Citation":17028,"pills":17029,"##iti":17030,"##rax":17031,"muffled":17032,"neatly":17033,"workforce":17034,"Yep":17035,"leisure":17036,"Tu":17037,"##nding":17038,"Wakefield":17039,"ancestral":17040,"##uki":17041,"destructive":17042,"seas":17043,"Passion":17044,"showcase":17045,"##ceptive":1
7046,"heroic":17047,"142":17048,"exhaustion":17049,"Customs":17050,"##aker":17051,"Scholar":17052,"sliced":17053,"##inian":17054,"Direction":17055,"##OW":17056,"Swansea":17057,"aluminium":17058,"##eep":17059,"ceramic":17060,"McCoy":17061,"Career":17062,"Sector":17063,"chartered":17064,"Damascus":17065,"pictured":17066,"Interest":17067,"stiffened":17068,"Plateau":17069,"obsolete":17070,"##tant":17071,"irritated":17072,"inappropriate":17073,"overs":17074,"##nko":17075,"bail":17076,"Talent":17077,"Sur":17078,"ours":17079,"##nah":17080,"barred":17081,"legged":17082,"sociology":17083,"Bud":17084,"dictionary":17085,"##luk":17086,"Cover":17087,"obey":17088,"##oring":17089,"annoying":17090,"##dong":17091,"apprentice":17092,"Cyrus":17093,"Role":17094,"##GP":17095,"##uns":17096,"##bag":17097,"Greenland":17098,"Porsche":17099,"Rocket":17100,"##32":17101,"organism":17102,"##ntary":17103,"reliability":17104,"##vocation":17105,"##й":17106,"Found":17107,"##hine":17108,"motors":17109,"promoter":17110,"unfair":17111,"##oms":17112,"##note":17113,"distribute":17114,"eminent":17115,"rails":17116,"appealing":17117,"chiefly":17118,"meaningful":17119,"Stephan":17120,"##rehension":17121,"Consumer":17122,"psychiatric":17123,"bowler":17124,"saints":17125,"##iful":17126,"##н":17127,"1777":17128,"Pol":17129,"Dorian":17130,"Townsend":17131,"hastily":17132,"##jima":17133,"Quincy":17134,"Sol":17135,"fascinated":17136,"Scarlet":17137,"alto":17138,"Avon":17139,"certainty":17140,"##eding":17141,"Keys":17142,"##chu":17143,"Chu":17144,"##VE":17145,"ions":17146,"tributaries":17147,"Thanksgiving":17148,"##fusion":17149,"astronomer":17150,"oxide":17151,"pavilion":17152,"Supply":17153,"Casa":17154,"Bollywood":17155,"sadly":17156,"mutations":17157,"Keller":17158,"##wave":17159,"nationals":17160,"##rgo":17161,"##ym":17162,"predict":17163,"Catholicism":17164,"Vega":17165,"##eration":17166,"##ums":17167,"Mali":17168,"tuned":17169,"Lankan":17170,"Plans":17171,"radial":17172,"Bosnian":17173,"Lexi":17174,"##14":
17175,"##ü":17176,"sacks":17177,"unpleasant":17178,"Empty":17179,"handles":17180,"##taking":17181,"Bon":17182,"switches":17183,"intently":17184,"tuition":17185,"antique":17186,"##jk":17187,"fraternity":17188,"notebook":17189,"Desmond":17190,"##sei":17191,"prostitution":17192,"##how":17193,"deed":17194,"##OP":17195,"501":17196,"Somewhere":17197,"Rocks":17198,"##mons":17199,"campaigned":17200,"frigate":17201,"gases":17202,"suppress":17203,"##hang":17204,"Merlin":17205,"Northumberland":17206,"dominate":17207,"expeditions":17208,"thunder":17209,"##ups":17210,"##rical":17211,"Cap":17212,"thorough":17213,"Ariel":17214,"##kind":17215,"renewable":17216,"constructing":17217,"pacing":17218,"terrorists":17219,"Bowen":17220,"documentaries":17221,"westward":17222,"##lass":17223,"##nage":17224,"Merchant":17225,"##ued":17226,"Beaumont":17227,"Din":17228,"##hian":17229,"Danube":17230,"peasant":17231,"Garrison":17232,"encourages":17233,"gratitude":17234,"reminding":17235,"stormed":17236,"##ouse":17237,"pronunciation":17238,"##ailed":17239,"Weekend":17240,"suggestions":17241,"##ffing":17242,"##DI":17243,"Active":17244,"Colombo":17245,"##logists":17246,"Merrill":17247,"##cens":17248,"Archaeological":17249,"Medina":17250,"captained":17251,"##yk":17252,"duel":17253,"cracking":17254,"Wilkinson":17255,"Guam":17256,"pickup":17257,"renovations":17258,"##ël":17259,"##izer":17260,"delighted":17261,"##iri":17262,"Weaver":17263,"##ctional":17264,"tens":17265,"##hab":17266,"Clint":17267,"##usion":17268,"##each":17269,"petals":17270,"Farrell":17271,"##sable":17272,"caste":17273,"##will":17274,"Ezra":17275,"##qi":17276,"##standing":17277,"thrilled":17278,"ambush":17279,"exhaled":17280,"##SU":17281,"Resource":17282,"blur":17283,"forearm":17284,"specifications":17285,"contingent":17286,"cafe":17287,"##iology":17288,"Antony":17289,"fundraising":17290,"grape":17291,"##rgy":17292,"turnout":17293,"##udi":17294,"Clifton":17295,"laboratories":17296,"Irvine":17297,"##opus":17298,"##lid":17299,"Monthly":173
00,"Bihar":17301,"statutory":17302,"Roses":17303,"Emil":17304,"##rig":17305,"lumber":17306,"optimal":17307,"##DR":17308,"pumps":17309,"plaster":17310,"Mozambique":17311,"##aco":17312,"nightclub":17313,"propelled":17314,"##hun":17315,"ked":17316,"surplus":17317,"wax":17318,"##urai":17319,"pioneered":17320,"Sunny":17321,"imprint":17322,"Forget":17323,"Eliot":17324,"approximate":17325,"patronage":17326,"##bek":17327,"##ely":17328,"##mbe":17329,"Partnership":17330,"curl":17331,"snapping":17332,"29th":17333,"Patriarch":17334,"##jord":17335,"seldom":17336,"##ature":17337,"astronomy":17338,"Bremen":17339,"XIV":17340,"airborne":17341,"205":17342,"1778":17343,"recognizing":17344,"stranded":17345,"arrogant":17346,"bombardment":17347,"destined":17348,"ensured":17349,"146":17350,"robust":17351,"Davenport":17352,"Interactive":17353,"Offensive":17354,"Fi":17355,"prevents":17356,"probe":17357,"propeller":17358,"sorrow":17359,"Blade":17360,"mounting":17361,"automotive":17362,"##dged":17363,"wallet":17364,"201":17365,"lashes":17366,"Forrest":17367,"##ift":17368,"Cell":17369,"Younger":17370,"shouts":17371,"##cki":17372,"folds":17373,"##chet":17374,"Epic":17375,"yields":17376,"homosexual":17377,"tunes":17378,"##minate":17379,"##text":17380,"Manny":17381,"chemist":17382,"hindwings":17383,"##urn":17384,"pilgrimage":17385,"##sfield":17386,"##riff":17387,"MLS":17388,"##rive":17389,"Huntington":17390,"translates":17391,"Path":17392,"slim":17393,"##ndra":17394,"##oz":17395,"climax":17396,"commuter":17397,"desperation":17398,"##reet":17399,"denying":17400,"##rious":17401,"daring":17402,"seminary":17403,"polo":17404,"##clamation":17405,"Teatro":17406,"Torah":17407,"Cats":17408,"identities":17409,"Poles":17410,"photographed":17411,"fiery":17412,"popularly":17413,"##cross":17414,"winters":17415,"Hesse":17416,"##vio":17417,"Nurse":17418,"Senegal":17419,"Salon":17420,"prescribed":17421,"justify":17422,"##gues":17423,"##и":17424,"##orted":17425,"HQ":17426,"##hiro":17427,"evaluated":17428,"momentar
ily":17429,"##unts":17430,"Debbie":17431,"##licity":17432,"##TP":17433,"Mighty":17434,"Rabbit":17435,"##chal":17436,"Events":17437,"Savoy":17438,"##ht":17439,"Brandenburg":17440,"Bordeaux":17441,"##laus":17442,"Release":17443,"##IE":17444,"##kowski":17445,"1900s":17446,"SK":17447,"Strauss":17448,"##aly":17449,"Sonia":17450,"Updated":17451,"synagogue":17452,"McKay":17453,"flattened":17454,"370":17455,"clutch":17456,"contests":17457,"toast":17458,"evaluate":17459,"pope":17460,"heirs":17461,"jam":17462,"tutor":17463,"reverted":17464,"##ading":17465,"nonsense":17466,"hesitate":17467,"Lars":17468,"Ceylon":17469,"Laurie":17470,"##guchi":17471,"accordingly":17472,"customary":17473,"148":17474,"Ethics":17475,"Multiple":17476,"instincts":17477,"IGN":17478,"##ä":17479,"bullshit":17480,"##hit":17481,"##par":17482,"desirable":17483,"##ducing":17484,"##yam":17485,"alias":17486,"ashore":17487,"licenses":17488,"##lification":17489,"misery":17490,"147":17491,"Cola":17492,"assassinated":17493,"fiercely":17494,"##aft":17495,"las":17496,"goat":17497,"substrate":17498,"lords":17499,"Cass":17500,"Bridges":17501,"ICC":17502,"lasts":17503,"sights":17504,"reproductive":17505,"##asi":17506,"Ivory":17507,"Clean":17508,"fixing":17509,"##lace":17510,"seeming":17511,"aide":17512,"1850s":17513,"harassment":17514,"##FF":17515,"##LE":17516,"reasonably":17517,"##coat":17518,"##cano":17519,"NYC":17520,"1784":17521,"Fifty":17522,"immunity":17523,"Canadians":17524,"Cheng":17525,"comforting":17526,"meanwhile":17527,"##tera":17528,"##blin":17529,"breeds":17530,"glowed":17531,"##vour":17532,"Aden":17533,"##verted":17534,"##aded":17535,"##oral":17536,"neat":17537,"enforced":17538,"poisoning":17539,"##ews":17540,"##hone":17541,"enforce":17542,"predecessors":17543,"survivor":17544,"Month":17545,"unfamiliar":17546,"pierced":17547,"waived":17548,"dump":17549,"responds":17550,"Mai":17551,"Declan":17552,"angular":17553,"Doesn":17554,"interpretations":17555,"##yar":17556,"invest":17557,"Dhaka":17558,"policeman":
17559,"Congregation":17560,"Eighth":17561,"painfully":17562,"##este":17563,"##vior":17564,"Württemberg":17565,"##cles":17566,"blockade":17567,"encouragement":17568,"##fie":17569,"Caucasus":17570,"Malone":17571,"Universidad":17572,"utilize":17573,"Nissan":17574,"inherent":17575,"151":17576,"agreeing":17577,"syllable":17578,"determines":17579,"Protocol":17580,"conclude":17581,"##gara":17582,"40th":17583,"Xu":17584,"Taiwanese":17585,"##ather":17586,"boiler":17587,"printer":17588,"Lacey":17589,"titular":17590,"Klaus":17591,"Fallon":17592,"Wembley":17593,"fox":17594,"Chandra":17595,"Governorate":17596,"obsessed":17597,"##Ps":17598,"micro":17599,"##25":17600,"Cooke":17601,"gymnasium":17602,"weaving":17603,"Shall":17604,"Hussein":17605,"glaring":17606,"softball":17607,"Reader":17608,"Dominion":17609,"Trouble":17610,"varsity":17611,"Cooperation":17612,"Chaos":17613,"Kang":17614,"Kramer":17615,"Eisenhower":17616,"proves":17617,"Connie":17618,"consortium":17619,"governors":17620,"Bethany":17621,"opener":17622,"Normally":17623,"Willy":17624,"linebacker":17625,"Regent":17626,"Used":17627,"AllMusic":17628,"Twilight":17629,"##shaw":17630,"Companion":17631,"Tribunal":17632,"simpler":17633,"##gam":17634,"Experimental":17635,"Slovenian":17636,"cellar":17637,"deadline":17638,"trout":17639,"Hubbard":17640,"ads":17641,"idol":17642,"##hetto":17643,"Granada":17644,"clues":17645,"salmon":17646,"1700":17647,"Omega":17648,"Caldwell":17649,"softened":17650,"Bills":17651,"Honolulu":17652,"##gn":17653,"Terrace":17654,"suitcase":17655,"##IL":17656,"frantic":17657,"##oons":17658,"Abbot":17659,"Sitting":17660,"Fortress":17661,"Riders":17662,"sickness":17663,"enzymes":17664,"trustee":17665,"Bern":17666,"forged":17667,"##13":17668,"##ruff":17669,"##rl":17670,"##versity":17671,"inspector":17672,"champagne":17673,"##held":17674,"##FI":17675,"hereditary":17676,"Taliban":17677,"handball":17678,"##wine":17679,"Sioux":17680,"##dicated":17681,"honoured":17682,"139":17683,"##tude":17684,"Skye":17685,"meani
ngs":17686,"##rkin":17687,"cardiac":17688,"analyzed":17689,"vegetable":17690,"##FS":17691,"Royals":17692,"dial":17693,"freelance":17694,"##fest":17695,"partisan":17696,"petroleum":17697,"ridden":17698,"Lincolnshire":17699,"panting":17700,"##comb":17701,"presidents":17702,"Haley":17703,"##chs":17704,"contributes":17705,"Jew":17706,"discoveries":17707,"panicked":17708,"Woody":17709,"eyelids":17710,"Fate":17711,"Tulsa":17712,"mg":17713,"whiskey":17714,"zombies":17715,"Wii":17716,"##udge":17717,"investigators":17718,"##bull":17719,"centred":17720,"##screen":17721,"Bone":17722,"Lana":17723,"##oise":17724,"forts":17725,"##ske":17726,"Conan":17727,"Lyons":17728,"##writing":17729,"SH":17730,"##ride":17731,"rhythmic":17732,"154":17733,"##llah":17734,"pioneers":17735,"##bright":17736,"captivity":17737,"Sanchez":17738,"Oman":17739,"##mith":17740,"Flint":17741,"Platform":17742,"##ioned":17743,"emission":17744,"packet":17745,"Persia":17746,"##formed":17747,"takeover":17748,"tempted":17749,"Vance":17750,"Few":17751,"Toni":17752,"receptions":17753,"##ن":17754,"exchanges":17755,"Camille":17756,"whale":17757,"Chronicles":17758,"##rent":17759,"##ushing":17760,"##rift":17761,"Alto":17762,"Genus":17763,"##asing":17764,"onward":17765,"foremost":17766,"longing":17767,"Rockefeller":17768,"containers":17769,"##cribe":17770,"intercepted":17771,"##olt":17772,"pleading":17773,"Bye":17774,"bee":17775,"##umbling":17776,"153":17777,"undertake":17778,"Izzy":17779,"cheaper":17780,"Ultra":17781,"validity":17782,"##pse":17783,"Sa":17784,"hovering":17785,"##pert":17786,"vintage":17787,"engraved":17788,"##rise":17789,"farmland":17790,"##ever":17791,"##ifier":17792,"Atlantis":17793,"propose":17794,"Catalonia":17795,"plunged":17796,"##edly":17797,"demonstrates":17798,"gig":17799,"##cover":17800,"156":17801,"Osborne":17802,"cowboy":17803,"herd":17804,"investigator":17805,"loops":17806,"Burning":17807,"rests":17808,"Instrumental":17809,"embarrassing":17810,"focal":17811,"install":17812,"readings":17813,"s
wirling":17814,"Chatham":17815,"parameter":17816,"##zin":17817,"##holders":17818,"Mandarin":17819,"Moody":17820,"converting":17821,"Escape":17822,"warnings":17823,"##chester":17824,"incarnation":17825,"##ophone":17826,"adopting":17827,"##lins":17828,"Cromwell":17829,"##laws":17830,"Axis":17831,"Verde":17832,"Kappa":17833,"Schwartz":17834,"Serbs":17835,"caliber":17836,"Wanna":17837,"Chung":17838,"##ality":17839,"nursery":17840,"principally":17841,"Bulletin":17842,"likelihood":17843,"logging":17844,"##erty":17845,"Boyle":17846,"supportive":17847,"twitched":17848,"##usive":17849,"builds":17850,"Marseille":17851,"omitted":17852,"motif":17853,"Lands":17854,"##lusion":17855,"##ssed":17856,"Barrow":17857,"Airfield":17858,"Harmony":17859,"WWF":17860,"endured":17861,"merging":17862,"convey":17863,"branding":17864,"examinations":17865,"167":17866,"Italians":17867,"##dh":17868,"dude":17869,"1781":17870,"##teau":17871,"crawling":17872,"thoughtful":17873,"clasped":17874,"concluding":17875,"brewery":17876,"Moldova":17877,"Wan":17878,"Towers":17879,"Heidelberg":17880,"202":17881,"##ict":17882,"Lagos":17883,"imposing":17884,"##eval":17885,"##serve":17886,"Bacon":17887,"frowning":17888,"thirteenth":17889,"conception":17890,"calculations":17891,"##ович":17892,"##mile":17893,"##ivated":17894,"mutation":17895,"strap":17896,"##lund":17897,"demographic":17898,"nude":17899,"perfection":17900,"stocks":17901,"##renched":17902,"##dit":17903,"Alejandro":17904,"bites":17905,"fragment":17906,"##hack":17907,"##rchy":17908,"GB":17909,"Surgery":17910,"Berger":17911,"punish":17912,"boiling":17913,"consume":17914,"Elle":17915,"Sid":17916,"Dome":17917,"relies":17918,"Crescent":17919,"treasurer":17920,"Bloody":17921,"1758":17922,"upheld":17923,"Guess":17924,"Restaurant":17925,"signatures":17926,"font":17927,"millennium":17928,"mural":17929,"stakes":17930,"Abel":17931,"hailed":17932,"insists":17933,"Alumni":17934,"Breton":17935,"##jun":17936,"digits":17937,"##FM":17938,"##thal":17939,"Talking":17940,"m
otive":17941,"reigning":17942,"babe":17943,"masks":17944,"##ø":17945,"Shaun":17946,"potato":17947,"sour":17948,"whitish":17949,"Somali":17950,"##derman":17951,"##rab":17952,"##wy":17953,"chancel":17954,"telecommunications":17955,"Noise":17956,"messenger":17957,"tidal":17958,"grinding":17959,"##ogenic":17960,"Rebel":17961,"constituent":17962,"peripheral":17963,"recruitment":17964,"##ograph":17965,"##tler":17966,"pumped":17967,"Ravi":17968,"poked":17969,"##gley":17970,"Olive":17971,"diabetes":17972,"discs":17973,"liking":17974,"sting":17975,"fits":17976,"stir":17977,"Mari":17978,"Sega":17979,"creativity":17980,"weights":17981,"Macau":17982,"mandated":17983,"Bohemia":17984,"disastrous":17985,"Katrina":17986,"Baku":17987,"Rajasthan":17988,"waiter":17989,"##psis":17990,"Siberia":17991,"verbs":17992,"##truction":17993,"patented":17994,"1782":17995,"##ndon":17996,"Relegated":17997,"Hunters":17998,"Greenwood":17999,"Shock":18000,"accusing":18001,"skipped":18002,"Sessions":18003,"markers":18004,"subset":18005,"monumental":18006,"Viola":18007,"comparative":18008,"Alright":18009,"Barbados":18010,"setup":18011,"Session":18012,"standardized":18013,"##ík":18014,"##sket":18015,"appoint":18016,"AFB":18017,"Nationalist":18018,"##WS":18019,"Troop":18020,"leaped":18021,"Treasure":18022,"goodness":18023,"weary":18024,"originates":18025,"100th":18026,"compassion":18027,"expresses":18028,"recommend":18029,"168":18030,"composing":18031,"seventeenth":18032,"Tex":18033,"Atlético":18034,"bald":18035,"Finding":18036,"Presidency":18037,"Sharks":18038,"favoured":18039,"inactive":18040,"##lter":18041,"suffix":18042,"princes":18043,"brighter":18044,"##ctus":18045,"classics":18046,"defendants":18047,"culminated":18048,"terribly":18049,"Strategy":18050,"evenings":18051,"##ção":18052,"##iver":18053,"##urance":18054,"absorb":18055,"##rner":18056,"Territories":18057,"RBI":18058,"soothing":18059,"Martín":18060,"concurrently":18061,"##tr":18062,"Nicholson":18063,"fibers":18064,"swam":18065,"##oney":1806
6,"Allie":18067,"Algerian":18068,"Dartmouth":18069,"Mafia":18070,"##bos":18071,"##tts":18072,"Councillor":18073,"vocabulary":18074,"##bla":18075,"##lé":18076,"intending":18077,"##dler":18078,"Guerrero":18079,"sunshine":18080,"pedal":18081,"##TO":18082,"administrators":18083,"periodic":18084,"scholarships":18085,"Loop":18086,"Madeline":18087,"exaggerated":18088,"##ressed":18089,"Regan":18090,"##cellular":18091,"Explorer":18092,"##oids":18093,"Alexandre":18094,"vows":18095,"Reporter":18096,"Unable":18097,"Average":18098,"absorption":18099,"##bedience":18100,"Fortunately":18101,"Auxiliary":18102,"Grandpa":18103,"##HP":18104,"##ovo":18105,"potent":18106,"temporal":18107,"adrenaline":18108,"##udo":18109,"confusing":18110,"guiding":18111,"Dry":18112,"qualifications":18113,"joking":18114,"wherein":18115,"heavyweight":18116,"##ices":18117,"nightmares":18118,"pharmaceutical":18119,"Commanding":18120,"##aled":18121,"##ove":18122,"Gregor":18123,"##UP":18124,"censorship":18125,"degradation":18126,"glorious":18127,"Austro":18128,"##rench":18129,"380":18130,"Miriam":18131,"sped":18132,"##orous":18133,"offset":18134,"##KA":18135,"fined":18136,"specialists":18137,"Pune":18138,"João":18139,"##dina":18140,"propped":18141,"fungus":18142,"##ς":18143,"frantically":18144,"Gabrielle":18145,"Hare":18146,"committing":18147,"##plied":18148,"Ask":18149,"Wilmington":18150,"stunt":18151,"numb":18152,"warmer":18153,"preacher":18154,"earnings":18155,"##lating":18156,"integer":18157,"##ija":18158,"federation":18159,"homosexuality":18160,"##cademia":18161,"epidemic":18162,"grumbled":18163,"shoving":18164,"Milk":18165,"Satan":18166,"Tobias":18167,"innovations":18168,"##dington":18169,"geology":18170,"memoirs":18171,"##IR":18172,"spared":18173,"culminating":18174,"Daphne":18175,"Focus":18176,"severed":18177,"stricken":18178,"Paige":18179,"Mans":18180,"flats":18181,"Russo":18182,"communes":18183,"litigation":18184,"strengthening":18185,"##powered":18186,"Staffordshire":18187,"Wiltshire":18188,"Paintin
g":18189,"Watkins":18190,"##د":18191,"specializes":18192,"Select":18193,"##rane":18194,"##aver":18195,"Fulton":18196,"playable":18197,"##VN":18198,"openings":18199,"sampling":18200,"##coon":18201,"##21":18202,"Allah":18203,"travelers":18204,"allocation":18205,"##arily":18206,"Loch":18207,"##hm":18208,"commentators":18209,"fulfilled":18210,"##troke":18211,"Emeritus":18212,"Vanderbilt":18213,"Vijay":18214,"pledged":18215,"##tative":18216,"diagram":18217,"drilling":18218,"##MD":18219,"##plain":18220,"Edison":18221,"productivity":18222,"31st":18223,"##rying":18224,"##ption":18225,"##gano":18226,"##oration":18227,"##bara":18228,"posture":18229,"bothering":18230,"platoon":18231,"politely":18232,"##inating":18233,"redevelopment":18234,"Job":18235,"##vale":18236,"stark":18237,"incorrect":18238,"Mansion":18239,"renewal":18240,"threatens":18241,"Bahamas":18242,"fridge":18243,"##tata":18244,"Uzbekistan":18245,"##edia":18246,"Sainte":18247,"##mio":18248,"gaps":18249,"neural":18250,"##storm":18251,"overturned":18252,"Preservation":18253,"shields":18254,"##ngo":18255,"##physics":18256,"ah":18257,"gradual":18258,"killings":18259,"##anza":18260,"consultation":18261,"premiership":18262,"Felipe":18263,"coincidence":18264,"##ène":18265,"##any":18266,"Handbook":18267,"##loaded":18268,"Edit":18269,"Guns":18270,"arguably":18271,"##ş":18272,"compressed":18273,"depict":18274,"seller":18275,"##qui":18276,"Kilkenny":18277,"##kling":18278,"Olympia":18279,"librarian":18280,"##acles":18281,"dramas":18282,"JP":18283,"Kit":18284,"Maj":18285,"##lists":18286,"proprietary":18287,"##nged":18288,"##ettes":18289,"##tok":18290,"exceeding":18291,"Lock":18292,"induction":18293,"numerical":18294,"##vist":18295,"Straight":18296,"foyer":18297,"imaginary":18298,"##pop":18299,"violinist":18300,"Carla":18301,"bouncing":18302,"##ashi":18303,"abolition":18304,"##uction":18305,"restoring":18306,"scenic":18307,"##č":18308,"Doom":18309,"overthrow":18310,"para":18311,"##vid":18312,"##ughty":18313,"Concord":18314,"HC"
:18315,"cocaine":18316,"deputies":18317,"##aul":18318,"visibility":18319,"##wart":18320,"Kapoor":18321,"Hutchinson":18322,"##agan":18323,"flashes":18324,"kn":18325,"decreasing":18326,"##ronology":18327,"quotes":18328,"vain":18329,"satisfying":18330,"##iam":18331,"##linger":18332,"310":18333,"Hanson":18334,"fauna":18335,"##zawa":18336,"##rrel":18337,"Trenton":18338,"##VB":18339,"Employment":18340,"vocational":18341,"Exactly":18342,"bartender":18343,"butterflies":18344,"tow":18345,"##chers":18346,"##ocks":18347,"pigs":18348,"merchandise":18349,"##game":18350,"##pine":18351,"Shea":18352,"##gration":18353,"Connell":18354,"Josephine":18355,"monopoly":18356,"##dled":18357,"Cobb":18358,"warships":18359,"cancellation":18360,"someday":18361,"stove":18362,"##Cs":18363,"candidacy":18364,"superhero":18365,"unrest":18366,"Toulouse":18367,"admiration":18368,"undergone":18369,"whirled":18370,"Reconnaissance":18371,"costly":18372,"##ships":18373,"290":18374,"Cafe":18375,"amber":18376,"Tory":18377,"##mpt":18378,"definitive":18379,"##dress":18380,"proposes":18381,"redesigned":18382,"acceleration":18383,"##asa":18384,"##raphy":18385,"Presley":18386,"exits":18387,"Languages":18388,"##cel":18389,"Mode":18390,"spokesperson":18391,"##tius":18392,"Ban":18393,"forthcoming":18394,"grounded":18395,"ACC":18396,"compelling":18397,"logistics":18398,"retailers":18399,"abused":18400,"##gating":18401,"soda":18402,"##yland":18403,"##lution":18404,"Landmark":18405,"XVI":18406,"blush":18407,"##tem":18408,"hurling":18409,"dread":18410,"Tobago":18411,"Foley":18412,"##uad":18413,"scenarios":18414,"##mentation":18415,"##rks":18416,"Score":18417,"fatigue":18418,"hairy":18419,"correspond":18420,"##iard":18421,"defences":18422,"confiscated":18423,"##rudence":18424,"1785":18425,"Formerly":18426,"Shot":18427,"advertised":18428,"460":18429,"Text":18430,"ridges":18431,"Promise":18432,"Dev":18433,"exclusion":18434,"NHS":18435,"tuberculosis":18436,"rockets":18437,"##offs":18438,"sparkling":18439,"256":18440,"disap
pears":18441,"mankind":18442,"##hore":18443,"HP":18444,"##omo":18445,"taxation":18446,"Multi":18447,"DS":18448,"Virgil":18449,"##ams":18450,"Dell":18451,"stacked":18452,"guessing":18453,"Jump":18454,"Nope":18455,"cheer":18456,"hates":18457,"ballots":18458,"overlooked":18459,"analyses":18460,"Prevention":18461,"maturity":18462,"dos":18463,"##cards":18464,"##lect":18465,"Mare":18466,"##yssa":18467,"Petty":18468,"##wning":18469,"differing":18470,"iOS":18471,"##ior":18472,"Joachim":18473,"Sentinel":18474,"##nstein":18475,"90s":18476,"Pamela":18477,"480":18478,"Asher":18479,"##lary":18480,"Vicente":18481,"landings":18482,"portray":18483,"##rda":18484,"##xley":18485,"Virtual":18486,"##uary":18487,"finances":18488,"Jain":18489,"Somebody":18490,"Tri":18491,"behave":18492,"Michele":18493,"##ider":18494,"dwellings":18495,"FAA":18496,"Gallagher":18497,"##lide":18498,"Monkey":18499,"195":18500,"aforementioned":18501,"##rism":18502,"##bey":18503,"##kim":18504,"##puted":18505,"Mesa":18506,"hopped":18507,"unopposed":18508,"recipients":18509,"Reality":18510,"Been":18511,"gritted":18512,"149":18513,"playground":18514,"pillar":18515,"##rone":18516,"Guinness":18517,"##tad":18518,"Théâtre":18519,"depended":18520,"Tipperary":18521,"Reuben":18522,"frightening":18523,"wooded":18524,"Target":18525,"globally":18526,"##uted":18527,"Morales":18528,"Baptiste":18529,"drunken":18530,"Institut":18531,"characterised":18532,"##chemistry":18533,"Strip":18534,"discrete":18535,"Premiership":18536,"##zzling":18537,"gazing":18538,"Outer":18539,"##quisition":18540,"Sikh":18541,"Booker":18542,"##yal":18543,"contemporaries":18544,"Jericho":18545,"##chan":18546,"##physical":18547,"##witch":18548,"Militia":18549,"##rez":18550,"##zard":18551,"dangers":18552,"##utter":18553,"##₀":18554,"Programs":18555,"darling":18556,"participates":18557,"railroads":18558,"##ienne":18559,"behavioral":18560,"bureau":18561,"##rook":18562,"161":18563,"Hicks":18564,"##rises":18565,"Comes":18566,"inflicted":18567,"bees":18568,"kin
dness":18569,"norm":18570,"##ković":18571,"generators":18572,"##pard":18573,"##omy":18574,"##ili":18575,"methodology":18576,"Alvin":18577,"façade":18578,"latitude":18579,"##plified":18580,"DE":18581,"Morse":18582,"##mered":18583,"educate":18584,"intersects":18585,"##MF":18586,"##cz":18587,"##vated":18588,"AL":18589,"##graded":18590,"##fill":18591,"constitutes":18592,"artery":18593,"feudal":18594,"avant":18595,"cautious":18596,"##ogue":18597,"immigrated":18598,"##chenko":18599,"Saul":18600,"Clinic":18601,"Fang":18602,"choke":18603,"Cornelius":18604,"flexibility":18605,"temperate":18606,"pins":18607,"##erson":18608,"oddly":18609,"inequality":18610,"157":18611,"Natasha":18612,"Sal":18613,"##uter":18614,"215":18615,"aft":18616,"blinking":18617,"##ntino":18618,"northward":18619,"Exposition":18620,"cookies":18621,"Wedding":18622,"impulse":18623,"Overseas":18624,"terrifying":18625,"##ough":18626,"Mortimer":18627,"##see":18628,"440":18629,"https":18630,"og":18631,"imagining":18632,"##cars":18633,"Nicola":18634,"exceptionally":18635,"threads":18636,"##cup":18637,"Oswald":18638,"Provisional":18639,"dismantled":18640,"deserves":18641,"1786":18642,"Fairy":18643,"discourse":18644,"Counsel":18645,"departing":18646,"Arc":18647,"guarding":18648,"##orse":18649,"420":18650,"alterations":18651,"vibrant":18652,"Em":18653,"squinted":18654,"terrace":18655,"rowing":18656,"Led":18657,"accessories":18658,"SF":18659,"Sgt":18660,"cheating":18661,"Atomic":18662,"##raj":18663,"Blackpool":18664,"##iary":18665,"boarded":18666,"substituted":18667,"bestowed":18668,"lime":18669,"kernel":18670,"##jah":18671,"Belmont":18672,"shaken":18673,"sticky":18674,"retrospective":18675,"Louie":18676,"migrants":18677,"weigh":18678,"sunglasses":18679,"thumbs":18680,"##hoff":18681,"excavation":18682,"##nks":18683,"Extra":18684,"Polo":18685,"motives":18686,"Drum":18687,"infrared":18688,"tastes":18689,"berth":18690,"verge":18691,"##stand":18692,"programmed":18693,"warmed":18694,"Shankar":18695,"Titan":18696,"chromoso
me":18697,"cafeteria":18698,"dividing":18699,"pepper":18700,"CPU":18701,"Stevie":18702,"satirical":18703,"Nagar":18704,"scowled":18705,"Died":18706,"backyard":18707,"##gata":18708,"##reath":18709,"##bir":18710,"Governors":18711,"portraying":18712,"##yah":18713,"Revenge":18714,"##acing":18715,"1772":18716,"margins":18717,"Bahn":18718,"OH":18719,"lowland":18720,"##razed":18721,"catcher":18722,"replay":18723,"##yoshi":18724,"Seriously":18725,"##licit":18726,"Aristotle":18727,"##ald":18728,"Habsburg":18729,"weekday":18730,"Secretariat":18731,"CO":18732,"##dly":18733,"##joy":18734,"##stad":18735,"litre":18736,"ultra":18737,"##cke":18738,"Mongol":18739,"Tucson":18740,"correlation":18741,"compose":18742,"traps":18743,"Groups":18744,"Hai":18745,"Salvatore":18746,"##dea":18747,"cents":18748,"##eese":18749,"concession":18750,"clash":18751,"Trip":18752,"Panzer":18753,"Moroccan":18754,"cruisers":18755,"torque":18756,"Ba":18757,"grossed":18758,"##arate":18759,"restriction":18760,"concentrating":18761,"FDA":18762,"##Leod":18763,"##ones":18764,"Scholars":18765,"##esi":18766,"throbbing":18767,"specialised":18768,"##heses":18769,"Chicken":18770,"##fia":18771,"##ificant":18772,"Erich":18773,"Residence":18774,"##trate":18775,"manipulation":18776,"namesake":18777,"##tom":18778,"Hoover":18779,"cue":18780,"Lindsey":18781,"Lonely":18782,"275":18783,"##HT":18784,"combustion":18785,"subscribers":18786,"Punjabi":18787,"respects":18788,"Jeremiah":18789,"penned":18790,"##gor":18791,"##rilla":18792,"suppression":18793,"##tration":18794,"Crimson":18795,"piston":18796,"Derry":18797,"crimson":18798,"lyrical":18799,"oversee":18800,"portrays":18801,"CF":18802,"Districts":18803,"Lenin":18804,"Cora":18805,"searches":18806,"clans":18807,"VHS":18808,"##hel":18809,"Jacqueline":18810,"Redskins":18811,"Clubs":18812,"desktop":18813,"indirectly":18814,"alternatives":18815,"marijuana":18816,"suffrage":18817,"##smos":18818,"Irwin":18819,"##liff":18820,"Process":18821,"##hawks":18822,"Sloane":18823,"##bson":188
24,"Sonata":18825,"yielded":18826,"Flores":18827,"##ares":18828,"armament":18829,"adaptations":18830,"integrate":18831,"neighbours":18832,"shelters":18833,"##tour":18834,"Skinner":18835,"##jet":18836,"##tations":18837,"1774":18838,"Peterborough":18839,"##elles":18840,"ripping":18841,"Liang":18842,"Dickinson":18843,"charities":18844,"Rwanda":18845,"monasteries":18846,"crossover":18847,"racist":18848,"barked":18849,"guerrilla":18850,"##ivate":18851,"Grayson":18852,"##iques":18853,"##vious":18854,"##got":18855,"Rolls":18856,"denominations":18857,"atom":18858,"affinity":18859,"##delity":18860,"Wish":18861,"##inted":18862,"##inae":18863,"interrogation":18864,"##cey":18865,"##erina":18866,"##lifting":18867,"192":18868,"Sands":18869,"1779":18870,"mast":18871,"Likewise":18872,"##hyl":18873,"##oft":18874,"contempt":18875,"##por":18876,"assaulted":18877,"fills":18878,"establishments":18879,"Mal":18880,"consulted":18881,"##omi":18882,"##sight":18883,"greet":18884,"##roma":18885,"##egan":18886,"Pulitzer":18887,"##rried":18888,"##dius":18889,"##ractical":18890,"##voked":18891,"Hasan":18892,"CB":18893,"##zzy":18894,"Romanesque":18895,"Panic":18896,"wheeled":18897,"recorder":18898,"##tters":18899,"##warm":18900,"##gly":18901,"botanist":18902,"Balkan":18903,"Lockheed":18904,"Polly":18905,"farewell":18906,"suffers":18907,"purchases":18908,"Eaton":18909,"##80":18910,"Quick":18911,"commenting":18912,"Saga":18913,"beasts":18914,"hides":18915,"motifs":18916,"##icks":18917,"Alonso":18918,"Springer":18919,"Wikipedia":18920,"circulated":18921,"encoding":18922,"jurisdictions":18923,"snout":18924,"UAE":18925,"Integrated":18926,"unmarried":18927,"Heinz":18928,"##lein":18929,"##figured":18930,"deleted":18931,"##tley":18932,"Zen":18933,"Cycling":18934,"Fuel":18935,"Scandinavian":18936,"##rants":18937,"Conner":18938,"reef":18939,"Marino":18940,"curiously":18941,"lingered":18942,"Gina":18943,"manners":18944,"activism":18945,"Mines":18946,"Expo":18947,"Micah":18948,"promotions":18949,"Server":1895
0,"booked":18951,"derivatives":18952,"eastward":18953,"detailing":18954,"reelection":18955,"##chase":18956,"182":18957,"Campeonato":18958,"Po":18959,"158":18960,"Peel":18961,"winger":18962,"##itch":18963,"canyon":18964,"##pit":18965,"LDS":18966,"A1":18967,"##shin":18968,"Giorgio":18969,"pathetic":18970,"##rga":18971,"##mist":18972,"Aren":18973,"##lag":18974,"confronts":18975,"motel":18976,"textbook":18977,"shine":18978,"turbines":18979,"1770":18980,"Darcy":18981,"##cot":18982,"Southeastern":18983,"##lessness":18984,"Banner":18985,"recognise":18986,"stray":18987,"Kitchen":18988,"paperwork":18989,"realism":18990,"Chrysler":18991,"filmmakers":18992,"fishermen":18993,"##hetic":18994,"variously":18995,"Vishnu":18996,"fiddle":18997,"Eddy":18998,"Origin":18999,"##tec":19000,"##ulin":19001,"Flames":19002,"Rs":19003,"bankrupt":19004,"Extreme":19005,"Pomeranian":19006,"##emption":19007,"ratified":19008,"##iu":19009,"jockey":19010,"Stratford":19011,"##ivating":19012,"##oire":19013,"Babylon":19014,"pardon":19015,"AI":19016,"affordable":19017,"deities":19018,"disturbance":19019,"Trying":19020,"##sai":19021,"Ida":19022,"Papers":19023,"advancement":19024,"70s":19025,"archbishop":19026,"Luftwaffe":19027,"announces":19028,"tugging":19029,"##lphin":19030,"##sistence":19031,"##eel":19032,"##ishes":19033,"ambition":19034,"aura":19035,"##fled":19036,"##lected":19037,"##vue":19038,"Prasad":19039,"boiled":19040,"clarity":19041,"Violin":19042,"investigative":19043,"routing":19044,"Yankee":19045,"##uckle":19046,"McMahon":19047,"bugs":19048,"eruption":19049,"##rooms":19050,"Minutes":19051,"relics":19052,"##ckle":19053,"##nse":19054,"sipped":19055,"valves":19056,"weakly":19057,"##ital":19058,"Middleton":19059,"collided":19060,"##quer":19061,"bamboo":19062,"insignia":19063,"Tyne":19064,"exercised":19065,"Ninth":19066,"echoing":19067,"polynomial":19068,"considerations":19069,"lunged":19070,"##bius":19071,"objections":19072,"complain":19073,"disguised":19074,"plaza":19075,"##VC":19076,"institute
s":19077,"Judicial":19078,"ascent":19079,"imminent":19080,"Waterford":19081,"hello":19082,"Lumpur":19083,"Niger":19084,"Goldman":19085,"vendors":19086,"Kensington":19087,"Wren":19088,"browser":19089,"##bner":19090,"##tri":19091,"##mize":19092,"##pis":19093,"##lea":19094,"Cheyenne":19095,"Bold":19096,"Settlement":19097,"Hollow":19098,"Paralympic":19099,"axle":19100,"##toire":19101,"##actic":19102,"impose":19103,"perched":19104,"utilizing":19105,"slips":19106,"Benz":19107,"Michaels":19108,"manipulate":19109,"Chiang":19110,"##mian":19111,"Dolphins":19112,"prohibition":19113,"attacker":19114,"ecology":19115,"Estadio":19116,"##SB":19117,"##uild":19118,"attracts":19119,"recalls":19120,"glacier":19121,"lad":19122,"##rima":19123,"Barlow":19124,"kHz":19125,"melodic":19126,"##aby":19127,"##iracy":19128,"assumptions":19129,"Cornish":19130,"##aru":19131,"DOS":19132,"Maddie":19133,"##mers":19134,"lyric":19135,"Luton":19136,"nm":19137,"##tron":19138,"Reno":19139,"Fin":19140,"YOU":19141,"Broadcast":19142,"Finch":19143,"sensory":19144,"##bent":19145,"Jeep":19146,"##uman":19147,"additionally":19148,"Buildings":19149,"businessmen":19150,"treaties":19151,"235":19152,"Stranger":19153,"gateway":19154,"Charlton":19155,"accomplishments":19156,"Diary":19157,"apologized":19158,"zinc":19159,"histories":19160,"supplier":19161,"##tting":19162,"162":19163,"asphalt":19164,"Treatment":19165,"Abbas":19166,"##pating":19167,"##yres":19168,"Bloom":19169,"sedan":19170,"soloist":19171,"##cum":19172,"antagonist":19173,"denounced":19174,"Fairfax":19175,"##aving":19176,"##enko":19177,"noticeable":19178,"Budget":19179,"Buckingham":19180,"Snyder":19181,"retreating":19182,"Jai":19183,"spoon":19184,"invading":19185,"giggle":19186,"woven":19187,"gunfire":19188,"arrests":19189,"##vered":19190,"##come":19191,"respiratory":19192,"violet":19193,"##aws":19194,"Byrd":19195,"shocking":19196,"tenant":19197,"Jamaican":19198,"Ottomans":19199,"Seal":19200,"theirs":19201,"##isse":19202,"##48":19203,"cooperate":19204,"peer
ing":19205,"##nius":19206,"163":19207,"Composer":19208,"organist":19209,"Mongolian":19210,"Bauer":19211,"Spy":19212,"collects":19213,"prophecy":19214,"congregations":19215,"##moor":19216,"Brick":19217,"calculation":19218,"fixtures":19219,"exempt":19220,"##dden":19221,"Ada":19222,"Thousand":19223,"##lue":19224,"tracing":19225,"##achi":19226,"bodyguard":19227,"vicar":19228,"supplying":19229,"Łódź":19230,"interception":19231,"monitored":19232,"##heart":19233,"Paso":19234,"overlap":19235,"annoyance":19236,"##dice":19237,"yellowish":19238,"stables":19239,"elders":19240,"illegally":19241,"honesty":19242,"##oar":19243,"skinny":19244,"spinal":19245,"##puram":19246,"Bourbon":19247,"##cor":19248,"flourished":19249,"Medium":19250,"##stics":19251,"##aba":19252,"Follow":19253,"##ckey":19254,"stationary":19255,"##scription":19256,"dresser":19257,"scrutiny":19258,"Buckley":19259,"Clearly":19260,"##SF":19261,"Lyrics":19262,"##heimer":19263,"drying":19264,"Oracle":19265,"internally":19266,"rains":19267,"##last":19268,"Enemy":19269,"##oes":19270,"McLean":19271,"Ole":19272,"phosphate":19273,"Rosario":19274,"Rifles":19275,"##mium":19276,"battered":19277,"Pepper":19278,"Presidents":19279,"conquer":19280,"Château":19281,"castles":19282,"##aldo":19283,"##ulf":19284,"Depending":19285,"Lesser":19286,"Boom":19287,"trades":19288,"Peyton":19289,"164":19290,"emphasize":19291,"accustomed":19292,"SM":19293,"Ai":19294,"Classification":19295,"##mins":19296,"##35":19297,"##rons":19298,"leak":19299,"piled":19300,"deeds":19301,"lush":19302,"##self":19303,"beginnings":19304,"breathless":19305,"1660":19306,"McGill":19307,"##ago":19308,"##chaft":19309,"##gies":19310,"humour":19311,"Bomb":19312,"securities":19313,"Might":19314,"##zone":19315,"##eves":19316,"Matthias":19317,"Movies":19318,"Levine":19319,"vengeance":19320,"##ads":19321,"Challenger":19322,"Misty":19323,"Traditionally":19324,"constellation":19325,"##rass":19326,"deepest":19327,"workplace":19328,"##oof":19329,"##vina":19330,"impatient":19331,"
##ML":19332,"Mughal":19333,"Alessandro":19334,"scenery":19335,"Slater":19336,"postseason":19337,"troupe":19338,"##ń":19339,"Volunteers":19340,"Facility":19341,"militants":19342,"Reggie":19343,"sanctions":19344,"Expeditionary":19345,"Nam":19346,"countered":19347,"interpret":19348,"Basilica":19349,"coding":19350,"expectation":19351,"Duffy":19352,"def":19353,"Tong":19354,"wakes":19355,"Bowling":19356,"Vehicle":19357,"Adler":19358,"salad":19359,"intricate":19360,"stronghold":19361,"medley":19362,"##uries":19363,"##bur":19364,"joints":19365,"##rac":19366,"##yx":19367,"##IO":19368,"Ordnance":19369,"Welch":19370,"distributor":19371,"Ark":19372,"cavern":19373,"trench":19374,"Weiss":19375,"Mauritius":19376,"decreases":19377,"docks":19378,"eagerly":19379,"irritation":19380,"Matilda":19381,"biographer":19382,"Visiting":19383,"##marked":19384,"##iter":19385,"##ear":19386,"##gong":19387,"Moreno":19388,"attendant":19389,"Bury":19390,"instrumentation":19391,"theologian":19392,"clit":19393,"nuns":19394,"symphony":19395,"translate":19396,"375":19397,"loser":19398,"##user":19399,"##VR":19400,"##meter":19401,"##orious":19402,"harmful":19403,"##yuki":19404,"Commissioners":19405,"Mendoza":19406,"sniffed":19407,"Hulk":19408,"##dded":19409,"##ulator":19410,"##nz":19411,"Donnell":19412,"##eka":19413,"deported":19414,"Met":19415,"SD":19416,"Aerospace":19417,"##cultural":19418,"##odes":19419,"Fantastic":19420,"cavity":19421,"remark":19422,"emblem":19423,"fearing":19424,"##iance":19425,"ICAO":19426,"Liberia":19427,"stab":19428,"##yd":19429,"Pac":19430,"Gymnasium":19431,"IS":19432,"Everton":19433,"##vanna":19434,"mantle":19435,"##ief":19436,"Ramon":19437,"##genic":19438,"Shooting":19439,"Smoke":19440,"Random":19441,"Africans":19442,"MB":19443,"tavern":19444,"bargain":19445,"voluntarily":19446,"Ion":19447,"Peoples":19448,"Rusty":19449,"attackers":19450,"Patton":19451,"sins":19452,"##cake":19453,"Hat":19454,"moderately":19455,"##hala":19456,"##alia":19457,"requesting":19458,"mechanic":19459,"##e
ae":19460,"Seine":19461,"Robbins":19462,"##ulum":19463,"susceptible":19464,"Bravo":19465,"Slade":19466,"Strasbourg":19467,"rubble":19468,"entrusted":19469,"Creation":19470,"##amp":19471,"smoothed":19472,"##uintet":19473,"evenly":19474,"reviewers":19475,"skip":19476,"Sculpture":19477,"177":19478,"Rough":19479,"##rrie":19480,"Reeves":19481,"##cede":19482,"Administrator":19483,"garde":19484,"minus":19485,"carriages":19486,"grenade":19487,"Ninja":19488,"fuscous":19489,"##kley":19490,"Punk":19491,"contributors":19492,"Aragon":19493,"Tottenham":19494,"##cca":19495,"##sir":19496,"VA":19497,"laced":19498,"dealers":19499,"##sonic":19500,"crisp":19501,"harmonica":19502,"Artistic":19503,"Butch":19504,"Andes":19505,"Farmers":19506,"corridors":19507,"unseen":19508,"##tium":19509,"Countries":19510,"Lone":19511,"envisioned":19512,"Katy":19513,"##lang":19514,"##cc":19515,"Quarterly":19516,"##neck":19517,"consort":19518,"##aceae":19519,"bidding":19520,"Corey":19521,"concurrent":19522,"##acts":19523,"##gum":19524,"Highness":19525,"##lient":19526,"##rators":19527,"arising":19528,"##unta":19529,"pathways":19530,"49ers":19531,"bolted":19532,"complaining":19533,"ecosystem":19534,"libretto":19535,"Ser":19536,"narrated":19537,"212":19538,"Soft":19539,"influx":19540,"##dder":19541,"incorporation":19542,"plagued":19543,"tents":19544,"##ddled":19545,"1750":19546,"Risk":19547,"citation":19548,"Tomas":19549,"hostilities":19550,"seals":19551,"Bruins":19552,"Dominique":19553,"attic":19554,"competent":19555,"##UR":19556,"##cci":19557,"hugging":19558,"Breuning":19559,"bacterial":19560,"Shrewsbury":19561,"vowed":19562,"eh":19563,"elongated":19564,"hangs":19565,"render":19566,"centimeters":19567,"##ficient":19568,"Mu":19569,"turtle":19570,"besieged":19571,"##gaard":19572,"grapes":19573,"bravery":19574,"collaborations":19575,"deprived":19576,"##amine":19577,"##using":19578,"##gins":19579,"arid":19580,"##uve":19581,"coats":19582,"hanged":19583,"##sting":19584,"Pa":19585,"prefix":19586,"##ranged":19587,
"Exit":19588,"Chain":19589,"Flood":19590,"Materials":19591,"suspicions":19592,"##ö":19593,"hovered":19594,"Hidden":19595,"##state":19596,"Malawi":19597,"##24":19598,"Mandy":19599,"norms":19600,"fascinating":19601,"airlines":19602,"delivers":19603,"##rust":19604,"Cretaceous":19605,"spanned":19606,"pillows":19607,"##onomy":19608,"jar":19609,"##kka":19610,"regent":19611,"fireworks":19612,"morality":19613,"discomfort":19614,"lure":19615,"uneven":19616,"##jack":19617,"Lucian":19618,"171":19619,"archaeology":19620,"##til":19621,"mornings":19622,"Billie":19623,"Marquess":19624,"impending":19625,"spilling":19626,"tombs":19627,"##volved":19628,"Celia":19629,"Coke":19630,"underside":19631,"##bation":19632,"Vaughn":19633,"Daytona":19634,"Godfrey":19635,"Pascal":19636,"Alien":19637,"##sign":19638,"172":19639,"##lage":19640,"iPhone":19641,"Gonna":19642,"genocide":19643,"##rber":19644,"oven":19645,"endure":19646,"dashed":19647,"simultaneous":19648,"##phism":19649,"Wally":19650,"##rō":19651,"ants":19652,"predator":19653,"reissue":19654,"##aper":19655,"Speech":19656,"funk":19657,"Rudy":19658,"claw":19659,"Hindus":19660,"Numbers":19661,"Bing":19662,"lantern":19663,"##aurus":19664,"scattering":19665,"poisoned":19666,"##active":19667,"Andrei":19668,"algebraic":19669,"baseman":19670,"##ritz":19671,"Gregg":19672,"##cola":19673,"selections":19674,"##putation":19675,"lick":19676,"Laguna":19677,"##IX":19678,"Sumatra":19679,"Warning":19680,"turf":19681,"buyers":19682,"Burgess":19683,"Oldham":19684,"exploit":19685,"worm":19686,"initiate":19687,"strapped":19688,"tuning":19689,"filters":19690,"haze":19691,"##е":19692,"##ledge":19693,"##ydro":19694,"##culture":19695,"amendments":19696,"Promotion":19697,"##union":19698,"Clair":19699,"##uria":19700,"petty":19701,"shutting":19702,"##eveloped":19703,"Phoebe":19704,"Zeke":19705,"conducts":19706,"grains":19707,"clashes":19708,"##latter":19709,"illegitimate":19710,"willingly":19711,"Deer":19712,"Lakers":19713,"Reference":19714,"chaplain":19715,"commit
ments":19716,"interrupt":19717,"salvation":19718,"Panther":19719,"Qualifying":19720,"Assessment":19721,"cancel":19722,"efficiently":19723,"attorneys":19724,"Dynamo":19725,"impress":19726,"accession":19727,"clinging":19728,"randomly":19729,"reviewing":19730,"Romero":19731,"Cathy":19732,"charting":19733,"clapped":19734,"rebranded":19735,"Azerbaijani":19736,"coma":19737,"indicator":19738,"punches":19739,"##tons":19740,"Sami":19741,"monastic":19742,"prospects":19743,"Pastor":19744,"##rville":19745,"electrified":19746,"##CI":19747,"##utical":19748,"tumbled":19749,"Chef":19750,"muzzle":19751,"selecting":19752,"UP":19753,"Wheel":19754,"protocols":19755,"##tat":19756,"Extended":19757,"beautifully":19758,"nests":19759,"##stal":19760,"Andersen":19761,"##anu":19762,"##³":19763,"##rini":19764,"kneeling":19765,"##reis":19766,"##xia":19767,"anatomy":19768,"dusty":19769,"Safe":19770,"turmoil":19771,"Bianca":19772,"##elo":19773,"analyze":19774,"##ر":19775,"##eran":19776,"podcast":19777,"Slovene":19778,"Locke":19779,"Rue":19780,"##retta":19781,"##uni":19782,"Person":19783,"Prophet":19784,"crooked":19785,"disagreed":19786,"Versailles":19787,"Sarajevo":19788,"Utrecht":19789,"##ogen":19790,"chewing":19791,"##ception":19792,"##iidae":19793,"Missile":19794,"attribute":19795,"majors":19796,"Arch":19797,"intellectuals":19798,"##andra":19799,"ideological":19800,"Cory":19801,"Salzburg":19802,"##fair":19803,"Lot":19804,"electromagnetic":19805,"Distribution":19806,"##oper":19807,"##pered":19808,"Russ":19809,"Terra":19810,"repeats":19811,"fluttered":19812,"Riga":19813,"##ific":19814,"##gt":19815,"cows":19816,"Hair":19817,"labelled":19818,"protects":19819,"Gale":19820,"Personnel":19821,"Düsseldorf":19822,"Moran":19823,"rematch":19824,"##OE":19825,"Slow":19826,"forgiveness":19827,"##ssi":19828,"proudly":19829,"Macmillan":19830,"insist":19831,"undoubtedly":19832,"Québec":19833,"Violence":19834,"##yuan":19835,"##aine":19836,"mourning":19837,"linen":19838,"accidental":19839,"##iol":19840,"##arium":1
9841,"grossing":19842,"lattice":19843,"maneuver":19844,"##marine":19845,"prestige":19846,"petrol":19847,"gradient":19848,"invasive":19849,"militant":19850,"Galerie":19851,"widening":19852,"##aman":19853,"##quist":19854,"disagreement":19855,"##ales":19856,"creepy":19857,"remembers":19858,"buzz":19859,"##erial":19860,"Exempt":19861,"Dirk":19862,"mon":19863,"Addison":19864,"##inen":19865,"deposed":19866,"##agon":19867,"fifteenth":19868,"Hang":19869,"ornate":19870,"slab":19871,"##lades":19872,"Fountain":19873,"contractors":19874,"das":19875,"Warwickshire":19876,"1763":19877,"##rc":19878,"Carly":19879,"Essays":19880,"Indy":19881,"Ligue":19882,"greenhouse":19883,"slit":19884,"##sea":19885,"chewed":19886,"wink":19887,"##azi":19888,"Playhouse":19889,"##kon":19890,"Gram":19891,"Ko":19892,"Samson":19893,"creators":19894,"revive":19895,"##rians":19896,"spawned":19897,"seminars":19898,"Craft":19899,"Tall":19900,"diverted":19901,"assistants":19902,"computational":19903,"enclosure":19904,"##acity":19905,"Coca":19906,"##eve":19907,"databases":19908,"Drop":19909,"##loading":19910,"##hage":19911,"Greco":19912,"Privy":19913,"entrances":19914,"pork":19915,"prospective":19916,"Memories":19917,"robes":19918,"##market":19919,"transporting":19920,"##lik":19921,"Rudolph":19922,"Horton":19923,"visually":19924,"##uay":19925,"##nja":19926,"Centro":19927,"Tor":19928,"Howell":19929,"##rsey":19930,"admitting":19931,"postgraduate":19932,"herbs":19933,"##att":19934,"Chin":19935,"Rutherford":19936,"##bot":19937,"##etta":19938,"Seasons":19939,"explanations":19940,"##bery":19941,"Friedman":19942,"heap":19943,"##ryl":19944,"##sberg":19945,"jaws":19946,"##agh":19947,"Choi":19948,"Killing":19949,"Fanny":19950,"##suming":19951,"##hawk":19952,"hopeful":19953,"##aid":19954,"Monty":19955,"gum":19956,"remarkably":19957,"Secrets":19958,"disco":19959,"harp":19960,"advise":19961,"##avia":19962,"Marathi":19963,"##cycle":19964,"Truck":19965,"abbot":19966,"sincere":19967,"urine":19968,"##mology":19969,"masked":199
70,"bathing":19971,"##tun":19972,"Fellows":19973,"##TM":19974,"##gnetic":19975,"owl":19976,"##jon":19977,"hymn":19978,"##leton":19979,"208":19980,"hostility":19981,"##cée":19982,"baked":19983,"Bottom":19984,"##AB":19985,"shudder":19986,"##ater":19987,"##von":19988,"##hee":19989,"reorganization":19990,"Cycle":19991,"##phs":19992,"Lex":19993,"##style":19994,"##rms":19995,"Translation":19996,"##erick":19997,"##imeter":19998,"##ière":19999,"attested":20000,"Hillary":20001,"##DM":20002,"gal":20003,"wander":20004,"Salle":20005,"##laming":20006,"Perez":20007,"Pit":20008,"##LP":20009,"USAF":20010,"contexts":20011,"Disease":20012,"blazing":20013,"aroused":20014,"razor":20015,"walled":20016,"Danielle":20017,"Mont":20018,"Funk":20019,"royalty":20020,"thee":20021,"203":20022,"donors":20023,"##erton":20024,"famously":20025,"processors":20026,"reassigned":20027,"welcoming":20028,"Goldberg":20029,"##quities":20030,"undisclosed":20031,"Orient":20032,"Patty":20033,"vaccine":20034,"refrigerator":20035,"Cypriot":20036,"consonant":20037,"##waters":20038,"176":20039,"sober":20040,"##lement":20041,"Racecourse":20042,"##uate":20043,"Luckily":20044,"Selection":20045,"conceptual":20046,"vines":20047,"Breaking":20048,"wa":20049,"lions":20050,"oversight":20051,"sheltered":20052,"Dancer":20053,"ponds":20054,"borrow":20055,"##BB":20056,"##pulsion":20057,"Daly":20058,"##eek":20059,"fertility":20060,"spontaneous":20061,"Worldwide":20062,"gasping":20063,"##tino":20064,"169":20065,"ABS":20066,"Vickers":20067,"ambient":20068,"energetic":20069,"prisons":20070,"##eson":20071,"Stacy":20072,"##roach":20073,"GmbH":20074,"Afro":20075,"Marin":20076,"farmhouse":20077,"pinched":20078,"##cursion":20079,"##sp":20080,"Sabine":20081,"##pire":20082,"181":20083,"nak":20084,"swelling":20085,"humble":20086,"perfume":20087,"##balls":20088,"Rai":20089,"cannons":20090,"##taker":20091,"Married":20092,"Maltese":20093,"canals":20094,"interceptions":20095,"hats":20096,"lever":20097,"slowing":20098,"##ppy":20099,"Nike":2010
0,"Silas":20101,"Scarborough":20102,"skirts":20103,"166":20104,"inauguration":20105,"Shuttle":20106,"alloy":20107,"beads":20108,"belts":20109,"Compton":20110,"Cause":20111,"battling":20112,"critique":20113,"surf":20114,"Dock":20115,"roommate":20116,"##ulet":20117,"invade":20118,"Garland":20119,"##slow":20120,"nutrition":20121,"persona":20122,"##zam":20123,"Wichita":20124,"acquaintance":20125,"coincided":20126,"##cate":20127,"Dracula":20128,"clamped":20129,"##gau":20130,"overhaul":20131,"##broken":20132,"##rrier":20133,"melodies":20134,"ventures":20135,"Paz":20136,"convex":20137,"Roots":20138,"##holding":20139,"Tribute":20140,"transgender":20141,"##ò":20142,"chimney":20143,"##riad":20144,"Ajax":20145,"Thereafter":20146,"messed":20147,"nowadays":20148,"pH":20149,"##100":20150,"##alog":20151,"Pomerania":20152,"##yra":20153,"Rossi":20154,"glove":20155,"##TL":20156,"Races":20157,"##asily":20158,"tablets":20159,"Jase":20160,"##ttes":20161,"diner":20162,"##rns":20163,"Hu":20164,"Mohan":20165,"anytime":20166,"weighted":20167,"remixes":20168,"Dove":20169,"cherry":20170,"imports":20171,"##urity":20172,"GA":20173,"##TT":20174,"##iated":20175,"##sford":20176,"Clarkson":20177,"evidently":20178,"rugged":20179,"Dust":20180,"siding":20181,"##ometer":20182,"acquitted":20183,"choral":20184,"##mite":20185,"infants":20186,"Domenico":20187,"gallons":20188,"Atkinson":20189,"gestures":20190,"slated":20191,"##xa":20192,"Archaeology":20193,"unwanted":20194,"##ibes":20195,"##duced":20196,"premise":20197,"Colby":20198,"Geelong":20199,"disqualified":20200,"##pf":20201,"##voking":20202,"simplicity":20203,"Walkover":20204,"Qaeda":20205,"Warden":20206,"##bourg":20207,"##ān":20208,"Invasion":20209,"Babe":20210,"harness":20211,"183":20212,"##tated":20213,"maze":20214,"Burt":20215,"bedrooms":20216,"##nsley":20217,"Horizon":20218,"##oast":20219,"minimize":20220,"peeked":20221,"MLA":20222,"Trains":20223,"tractor":20224,"nudged":20225,"##iform":20226,"Growth":20227,"Benton":20228,"separates":20229,"##a
bout":20230,"##kari":20231,"buffer":20232,"anthropology":20233,"brigades":20234,"foil":20235,"##wu":20236,"Domain":20237,"licking":20238,"whore":20239,"##rage":20240,"##sham":20241,"Initial":20242,"Courthouse":20243,"Rutgers":20244,"dams":20245,"villains":20246,"supermarket":20247,"##brush":20248,"Brunei":20249,"Palermo":20250,"arises":20251,"Passenger":20252,"outreach":20253,"##gill":20254,"Labrador":20255,"McLaren":20256,"##uy":20257,"Lori":20258,"##fires":20259,"Heads":20260,"magistrate":20261,"¹⁄₂":20262,"Weapons":20263,"##wai":20264,"##roke":20265,"projecting":20266,"##ulates":20267,"bordering":20268,"McKenzie":20269,"Pavel":20270,"midway":20271,"Guangzhou":20272,"streamed":20273,"racer":20274,"##lished":20275,"eccentric":20276,"spectral":20277,"206":20278,"##mism":20279,"Wilde":20280,"Grange":20281,"preparatory":20282,"lent":20283,"##tam":20284,"starving":20285,"Gertrude":20286,"##cea":20287,"##ricted":20288,"Breakfast":20289,"Mira":20290,"blurted":20291,"derive":20292,"##lair":20293,"blunt":20294,"sob":20295,"Cheltenham":20296,"Henrik":20297,"reinstated":20298,"intends":20299,"##istan":20300,"unite":20301,"##ector":20302,"playful":20303,"sparks":20304,"mapped":20305,"Cadet":20306,"luggage":20307,"prosperous":20308,"##ein":20309,"salon":20310,"##utes":20311,"Biological":20312,"##rland":20313,"Tyrone":20314,"buyer":20315,"##lose":20316,"amounted":20317,"Saw":20318,"smirked":20319,"Ronan":20320,"Reviews":20321,"Adele":20322,"trait":20323,"##proof":20324,"Bhutan":20325,"Ginger":20326,"##junct":20327,"digitally":20328,"stirring":20329,"##isted":20330,"coconut":20331,"Hamlet":20332,"Dinner":20333,"Scale":20334,"pledge":20335,"##RP":20336,"Wrong":20337,"Goal":20338,"Panel":20339,"therapeutic":20340,"elevations":20341,"infectious":20342,"priesthood":20343,"##inda":20344,"Guyana":20345,"diagnostic":20346,"##mbre":20347,"Blackwell":20348,"sails":20349,"##arm":20350,"literal":20351,"periodically":20352,"gleaming":20353,"Robot":20354,"Rector":20355,"##abulous":20356,"##t
res":20357,"Reaching":20358,"Romantic":20359,"CP":20360,"Wonderful":20361,"##tur":20362,"ornamental":20363,"##nges":20364,"traitor":20365,"##zilla":20366,"genetics":20367,"mentioning":20368,"##eim":20369,"resonance":20370,"Areas":20371,"Shopping":20372,"##nard":20373,"Gail":20374,"Solid":20375,"##rito":20376,"##mara":20377,"Willem":20378,"Chip":20379,"Matches":20380,"Volkswagen":20381,"obstacle":20382,"Organ":20383,"invites":20384,"Coral":20385,"attain":20386,"##anus":20387,"##dates":20388,"Midway":20389,"shuffled":20390,"Cecilia":20391,"dessert":20392,"Gateway":20393,"Ch":20394,"Napoleonic":20395,"Petroleum":20396,"jets":20397,"goose":20398,"striped":20399,"bowls":20400,"vibration":20401,"Sims":20402,"nickel":20403,"Thirteen":20404,"problematic":20405,"intervene":20406,"##grading":20407,"##unds":20408,"Mum":20409,"semifinal":20410,"Radical":20411,"##izations":20412,"refurbished":20413,"##sation":20414,"##harine":20415,"Maximilian":20416,"cites":20417,"Advocate":20418,"Potomac":20419,"surged":20420,"preserves":20421,"Curry":20422,"angled":20423,"ordination":20424,"##pad":20425,"Cade":20426,"##DE":20427,"##sko":20428,"researched":20429,"torpedoes":20430,"Resident":20431,"wetlands":20432,"hay":20433,"applicants":20434,"depart":20435,"Bernstein":20436,"##pic":20437,"##ario":20438,"##rae":20439,"favourable":20440,"##wari":20441,"##р":20442,"metabolism":20443,"nobleman":20444,"Defaulted":20445,"calculate":20446,"ignition":20447,"Celebrity":20448,"Belize":20449,"sulfur":20450,"Flat":20451,"Sc":20452,"USB":20453,"flicker":20454,"Hertfordshire":20455,"Sept":20456,"CFL":20457,"Pasadena":20458,"Saturdays":20459,"Titus":20460,"##nir":20461,"Canary":20462,"Computing":20463,"Isaiah":20464,"##mler":20465,"formidable":20466,"pulp":20467,"orchid":20468,"Called":20469,"Solutions":20470,"kilograms":20471,"steamer":20472,"##hil":20473,"Doncaster":20474,"successors":20475,"Stokes":20476,"Holstein":20477,"##sius":20478,"sperm":20479,"API":20480,"Rogue":20481,"instability":20482,"Acousti
c":20483,"##rag":20484,"159":20485,"undercover":20486,"Wouldn":20487,"##pra":20488,"##medical":20489,"Eliminated":20490,"honorable":20491,"##chel":20492,"denomination":20493,"abrupt":20494,"Buffy":20495,"blouse":20496,"fi":20497,"Regardless":20498,"Subsequent":20499,"##rdes":20500,"Lover":20501,"##tford":20502,"bacon":20503,"##emia":20504,"carving":20505,"##cripts":20506,"Massacre":20507,"Ramos":20508,"Latter":20509,"##ulp":20510,"ballroom":20511,"##gement":20512,"richest":20513,"bruises":20514,"Rest":20515,"Wiley":20516,"##aster":20517,"explosions":20518,"##lastic":20519,"Edo":20520,"##LD":20521,"Mir":20522,"choking":20523,"disgusted":20524,"faintly":20525,"Barracks":20526,"blasted":20527,"headlights":20528,"Tours":20529,"ensued":20530,"presentations":20531,"##cale":20532,"wrought":20533,"##oat":20534,"##coa":20535,"Quaker":20536,"##sdale":20537,"recipe":20538,"##gny":20539,"corpses":20540,"##liance":20541,"comfortably":20542,"##wat":20543,"Landscape":20544,"niche":20545,"catalyst":20546,"##leader":20547,"Securities":20548,"messy":20549,"##RL":20550,"Rodrigo":20551,"backdrop":20552,"##opping":20553,"treats":20554,"Emilio":20555,"Anand":20556,"bilateral":20557,"meadow":20558,"VC":20559,"socialism":20560,"##grad":20561,"clinics":20562,"##itating":20563,"##ppe":20564,"##ymphonic":20565,"seniors":20566,"Advisor":20567,"Armoured":20568,"Method":20569,"Alley":20570,"##orio":20571,"Sad":20572,"fueled":20573,"raided":20574,"Axel":20575,"NH":20576,"rushes":20577,"Dixie":20578,"Otis":20579,"wrecked":20580,"##22":20581,"capitalism":20582,"café":20583,"##bbe":20584,"##pion":20585,"##forcing":20586,"Aubrey":20587,"Lublin":20588,"Whenever":20589,"Sears":20590,"Scheme":20591,"##lana":20592,"Meadows":20593,"treatise":20594,"##RI":20595,"##ustic":20596,"sacrifices":20597,"sustainability":20598,"Biography":20599,"mystical":20600,"Wanted":20601,"multiplayer":20602,"Applications":20603,"disliked":20604,"##tisfied":20605,"impaired":20606,"empirical":20607,"forgetting":20608,"Fairfield"
:20609,"Sunni":20610,"blurred":20611,"Growing":20612,"Avalon":20613,"coil":20614,"Camera":20615,"Skin":20616,"bruised":20617,"terminals":20618,"##fted":20619,"##roving":20620,"Commando":20621,"##hya":20622,"##sper":20623,"reservations":20624,"needles":20625,"dangling":20626,"##rsch":20627,"##rsten":20628,"##spect":20629,"##mbs":20630,"yoga":20631,"regretted":20632,"Bliss":20633,"Orion":20634,"Rufus":20635,"glucose":20636,"Olsen":20637,"autobiographical":20638,"##dened":20639,"222":20640,"humidity":20641,"Shan":20642,"##ifiable":20643,"supper":20644,"##rou":20645,"flare":20646,"##MO":20647,"campaigning":20648,"descend":20649,"socio":20650,"declares":20651,"Mounted":20652,"Gracie":20653,"Arte":20654,"endurance":20655,"##ety":20656,"Copper":20657,"costa":20658,"airplay":20659,"##MB":20660,"Proceedings":20661,"dislike":20662,"grimaced":20663,"occupants":20664,"births":20665,"glacial":20666,"oblivious":20667,"cans":20668,"installment":20669,"muddy":20670,"##ł":20671,"captains":20672,"pneumonia":20673,"Quiet":20674,"Sloan":20675,"Excuse":20676,"##nine":20677,"Geography":20678,"gymnastics":20679,"multimedia":20680,"drains":20681,"Anthology":20682,"Gear":20683,"cylindrical":20684,"Fry":20685,"undertaking":20686,"##pler":20687,"##tility":20688,"Nan":20689,"##recht":20690,"Dub":20691,"philosophers":20692,"piss":20693,"Atari":20694,"##pha":20695,"Galicia":20696,"México":20697,"##nking":20698,"Continuing":20699,"bump":20700,"graveyard":20701,"persisted":20702,"Shrine":20703,"##erapy":20704,"defects":20705,"Advance":20706,"Bomber":20707,"##oil":20708,"##ffling":20709,"cheerful":20710,"##lix":20711,"scrub":20712,"##eto":20713,"awkwardly":20714,"collaborator":20715,"fencing":20716,"##alo":20717,"prophet":20718,"Croix":20719,"coughed":20720,"##lication":20721,"roadway":20722,"slaughter":20723,"elephants":20724,"##erated":20725,"Simpsons":20726,"vulnerability":20727,"ivory":20728,"Birth":20729,"lizard":20730,"scarce":20731,"cylinders":20732,"fortunes":20733,"##NL":20734,"Hate":20735
,"Priory":20736,"##lai":20737,"McBride":20738,"##copy":20739,"Lenny":20740,"liaison":20741,"Triangle":20742,"coronation":20743,"sampled":20744,"savage":20745,"amidst":20746,"Grady":20747,"whatsoever":20748,"instinctively":20749,"Reconstruction":20750,"insides":20751,"seizure":20752,"Drawing":20753,"##rlin":20754,"Antioch":20755,"Gao":20756,"Díaz":20757,"1760":20758,"Sparks":20759,"##tien":20760,"##bidae":20761,"rehearsal":20762,"##bbs":20763,"botanical":20764,"##hers":20765,"compensate":20766,"wholesale":20767,"Seville":20768,"shareholder":20769,"prediction":20770,"astronomical":20771,"Reddy":20772,"hardest":20773,"circling":20774,"whereabouts":20775,"termination":20776,"Rep":20777,"Assistance":20778,"Dramatic":20779,"Herb":20780,"##ghter":20781,"climbs":20782,"188":20783,"Poole":20784,"301":20785,"##pable":20786,"wit":20787,"##istice":20788,"Walters":20789,"relying":20790,"Jakob":20791,"##redo":20792,"proceeding":20793,"Langley":20794,"affiliates":20795,"ou":20796,"##allo":20797,"##holm":20798,"Samsung":20799,"##ishi":20800,"Missing":20801,"Xi":20802,"vertices":20803,"Claus":20804,"foam":20805,"restless":20806,"##uating":20807,"##sso":20808,"##ttering":20809,"Philips":20810,"delta":20811,"bombed":20812,"Catalogue":20813,"coaster":20814,"Ling":20815,"Willard":20816,"satire":20817,"410":20818,"Composition":20819,"Net":20820,"Orioles":20821,"##ldon":20822,"fins":20823,"Palatinate":20824,"Woodward":20825,"tease":20826,"tilt":20827,"brightness":20828,"##70":20829,"##bbling":20830,"##loss":20831,"##dhi":20832,"##uilt":20833,"Whoever":20834,"##yers":20835,"hitter":20836,"Elton":20837,"Extension":20838,"ace":20839,"Affair":20840,"restructuring":20841,"##loping":20842,"Paterson":20843,"hi":20844,"##rya":20845,"spouse":20846,"Shay":20847,"Himself":20848,"piles":20849,"preaching":20850,"##gical":20851,"bikes":20852,"Brave":20853,"expulsion":20854,"Mirza":20855,"stride":20856,"Trees":20857,"commemorated":20858,"famine":20859,"masonry":20860,"Selena":20861,"Watt":20862,"Banking
":20863,"Rancho":20864,"Stockton":20865,"dip":20866,"tattoos":20867,"Vlad":20868,"acquainted":20869,"Flyers":20870,"ruthless":20871,"fourteenth":20872,"illustrate":20873,"##akes":20874,"EPA":20875,"##rows":20876,"##uiz":20877,"bumped":20878,"Designed":20879,"Leaders":20880,"mastered":20881,"Manfred":20882,"swirled":20883,"McCain":20884,"##rout":20885,"Artemis":20886,"rabbi":20887,"flinched":20888,"upgrades":20889,"penetrate":20890,"shipyard":20891,"transforming":20892,"caretaker":20893,"##eiro":20894,"Maureen":20895,"tightening":20896,"##founded":20897,"RAM":20898,"##icular":20899,"##mper":20900,"##rung":20901,"Fifteen":20902,"exploited":20903,"consistency":20904,"interstate":20905,"##ynn":20906,"Bridget":20907,"contamination":20908,"Mistress":20909,"##rup":20910,"coating":20911,"##FP":20912,"##jective":20913,"Libyan":20914,"211":20915,"Gemma":20916,"dependence":20917,"shrubs":20918,"##ggled":20919,"Germain":20920,"retaliation":20921,"traction":20922,"##PP":20923,"Dangerous":20924,"terminology":20925,"psychiatrist":20926,"##garten":20927,"hurdles":20928,"Natal":20929,"wasting":20930,"Weir":20931,"revolves":20932,"stripe":20933,"##reased":20934,"preferences":20935,"##entation":20936,"##lde":20937,"##áil":20938,"##otherapy":20939,"Flame":20940,"##ologies":20941,"viruses":20942,"Label":20943,"Pandora":20944,"veil":20945,"##ogical":20946,"Coliseum":20947,"Cottage":20948,"creeping":20949,"Jong":20950,"lectured":20951,"##çaise":20952,"shoreline":20953,"##fference":20954,"##hra":20955,"Shade":20956,"Clock":20957,"Faye":20958,"bilingual":20959,"Humboldt":20960,"Operating":20961,"##fter":20962,"##was":20963,"algae":20964,"towed":20965,"amphibious":20966,"Parma":20967,"impacted":20968,"smacked":20969,"Piedmont":20970,"Monsters":20971,"##omb":20972,"Moor":20973,"##lberg":20974,"sinister":20975,"Postal":20976,"178":20977,"Drummond":20978,"Sign":20979,"textbooks":20980,"hazardous":20981,"Brass":20982,"Rosemary":20983,"Pick":20984,"Sit":20985,"Architect":20986,"transverse":20987,
"Centennial":20988,"confess":20989,"polling":20990,"##aia":20991,"Julien":20992,"##mand":20993,"consolidation":20994,"Ethel":20995,"##ulse":20996,"severity":20997,"Yorker":20998,"choreographer":20999,"1840s":21000,"##ltry":21001,"softer":21002,"versa":21003,"##geny":21004,"##quila":21005,"##jō":21006,"Caledonia":21007,"Friendship":21008,"Visa":21009,"rogue":21010,"##zzle":21011,"bait":21012,"feather":21013,"incidence":21014,"Foods":21015,"Ships":21016,"##uto":21017,"##stead":21018,"arousal":21019,"##rote":21020,"Hazel":21021,"##bolic":21022,"Swing":21023,"##ej":21024,"##cule":21025,"##jana":21026,"##metry":21027,"##uity":21028,"Valuable":21029,"##ₙ":21030,"Shropshire":21031,"##nect":21032,"365":21033,"Ones":21034,"realise":21035,"Café":21036,"Albuquerque":21037,"##grown":21038,"##stadt":21039,"209":21040,"##ᵢ":21041,"prefers":21042,"withstand":21043,"Lillian":21044,"MacArthur":21045,"Hara":21046,"##fulness":21047,"domination":21048,"##VO":21049,"##school":21050,"Freddy":21051,"ethnicity":21052,"##while":21053,"adorned":21054,"hormone":21055,"Calder":21056,"Domestic":21057,"Freud":21058,"Shields":21059,"##phus":21060,"##rgan":21061,"BP":21062,"Segunda":21063,"Mustang":21064,"##GI":21065,"Bonn":21066,"patiently":21067,"remarried":21068,"##umbria":21069,"Crete":21070,"Elephant":21071,"Nuremberg":21072,"tolerate":21073,"Tyson":21074,"##evich":21075,"Programming":21076,"##lander":21077,"Bethlehem":21078,"segregation":21079,"Constituency":21080,"quarterly":21081,"blushed":21082,"photographers":21083,"Sheldon":21084,"porcelain":21085,"Blanche":21086,"goddamn":21087,"lively":21088,"##fused":21089,"bumps":21090,"##eli":21091,"curated":21092,"coherent":21093,"provoked":21094,"##vet":21095,"Madeleine":21096,"##isco":21097,"rainy":21098,"Bethel":21099,"accusation":21100,"ponytail":21101,"gag":21102,"##lington":21103,"quicker":21104,"scroll":21105,"##vate":21106,"Bow":21107,"Gender":21108,"Ira":21109,"crashes":21110,"ACT":21111,"Maintenance":21112,"##aton":21113,"##ieu":21114,"b
itterly":21115,"strains":21116,"rattled":21117,"vectors":21118,"##arina":21119,"##ishly":21120,"173":21121,"parole":21122,"##nx":21123,"amusing":21124,"Gonzalez":21125,"##erative":21126,"Caucus":21127,"sensual":21128,"Penelope":21129,"coefficient":21130,"Mateo":21131,"##mani":21132,"proposition":21133,"Duty":21134,"lacrosse":21135,"proportions":21136,"Plato":21137,"profiles":21138,"Botswana":21139,"Brandt":21140,"reins":21141,"mandolin":21142,"encompassing":21143,"##gens":21144,"Kahn":21145,"prop":21146,"summon":21147,"##MR":21148,"##yrian":21149,"##zaki":21150,"Falling":21151,"conditional":21152,"thy":21153,"##bao":21154,"##ych":21155,"radioactive":21156,"##nics":21157,"Newspaper":21158,"##people":21159,"##nded":21160,"Gaming":21161,"sunny":21162,"##look":21163,"Sherwood":21164,"crafted":21165,"NJ":21166,"awoke":21167,"187":21168,"timeline":21169,"giants":21170,"possessing":21171,"##ycle":21172,"Cheryl":21173,"ng":21174,"Ruiz":21175,"polymer":21176,"potassium":21177,"Ramsay":21178,"relocation":21179,"##leen":21180,"Sociology":21181,"##bana":21182,"Franciscan":21183,"propulsion":21184,"denote":21185,"##erjee":21186,"registers":21187,"headline":21188,"Tests":21189,"emerges":21190,"Articles":21191,"Mint":21192,"livery":21193,"breakup":21194,"kits":21195,"Rap":21196,"Browning":21197,"Bunny":21198,"##mington":21199,"##watch":21200,"Anastasia":21201,"Zachary":21202,"arranging":21203,"biographical":21204,"Erica":21205,"Nippon":21206,"##membrance":21207,"Carmel":21208,"##sport":21209,"##xes":21210,"Paddy":21211,"##holes":21212,"Issues":21213,"Spears":21214,"compliment":21215,"##stro":21216,"##graphs":21217,"Castillo":21218,"##MU":21219,"##space":21220,"Corporal":21221,"##nent":21222,"174":21223,"Gentlemen":21224,"##ilize":21225,"##vage":21226,"convinces":21227,"Carmine":21228,"Crash":21229,"##hashi":21230,"Files":21231,"Doctors":21232,"brownish":21233,"sweating":21234,"goats":21235,"##conductor":21236,"rendition":21237,"##bt":21238,"NL":21239,"##spiration":21240,"generates
":21241,"##cans":21242,"obsession":21243,"##noy":21244,"Danger":21245,"Diaz":21246,"heats":21247,"Realm":21248,"priorities":21249,"##phon":21250,"1300":21251,"initiation":21252,"pagan":21253,"bursts":21254,"archipelago":21255,"chloride":21256,"Screenplay":21257,"Hewitt":21258,"Khmer":21259,"bang":21260,"judgement":21261,"negotiating":21262,"##ait":21263,"Mabel":21264,"densely":21265,"Boulder":21266,"knob":21267,"430":21268,"Alfredo":21269,"##kt":21270,"pitches":21271,"##ées":21272,"##ان":21273,"Macdonald":21274,"##llum":21275,"imply":21276,"##mot":21277,"Smile":21278,"spherical":21279,"##tura":21280,"Derrick":21281,"Kelley":21282,"Nico":21283,"cortex":21284,"launches":21285,"differed":21286,"parallels":21287,"Navigation":21288,"##child":21289,"##rming":21290,"canoe":21291,"forestry":21292,"reinforce":21293,"##mote":21294,"confirming":21295,"tasting":21296,"scaled":21297,"##resh":21298,"##eting":21299,"Understanding":21300,"prevailing":21301,"Pearce":21302,"CW":21303,"earnest":21304,"Gaius":21305,"asserts":21306,"denoted":21307,"landmarks":21308,"Chargers":21309,"warns":21310,"##flies":21311,"Judges":21312,"jagged":21313,"##dain":21314,"tails":21315,"Historian":21316,"Millie":21317,"##sler":21318,"221":21319,"##uard":21320,"absurd":21321,"Dion":21322,"##ially":21323,"makeshift":21324,"Specifically":21325,"ignorance":21326,"Eat":21327,"##ieri":21328,"comparisons":21329,"forensic":21330,"186":21331,"Giro":21332,"skeptical":21333,"disciplinary":21334,"battleship":21335,"##45":21336,"Libby":21337,"520":21338,"Odyssey":21339,"ledge":21340,"##post":21341,"Eternal":21342,"Missionary":21343,"deficiency":21344,"settler":21345,"wonders":21346,"##gai":21347,"raging":21348,"##cis":21349,"Romney":21350,"Ulrich":21351,"annexation":21352,"boxers":21353,"sect":21354,"204":21355,"ARIA":21356,"dei":21357,"Hitchcock":21358,"te":21359,"Varsity":21360,"##fic":21361,"CC":21362,"lending":21363,"##nial":21364,"##tag":21365,"##rdy":21366,"##obe":21367,"Defensive":21368,"##dson":21369,"##pore
":21370,"stellar":21371,"Lam":21372,"Trials":21373,"contention":21374,"Sung":21375,"##uminous":21376,"Poe":21377,"superiority":21378,"##plicate":21379,"325":21380,"bitten":21381,"conspicuous":21382,"##olly":21383,"Lila":21384,"Pub":21385,"Petit":21386,"distorted":21387,"ISIL":21388,"distinctly":21389,"##family":21390,"Cowboy":21391,"mutant":21392,"##cats":21393,"##week":21394,"Changes":21395,"Sinatra":21396,"epithet":21397,"neglect":21398,"Innocent":21399,"gamma":21400,"thrill":21401,"reggae":21402,"##adia":21403,"##ational":21404,"##due":21405,"landlord":21406,"##leaf":21407,"visibly":21408,"##ì":21409,"Darlington":21410,"Gomez":21411,"##iting":21412,"scarf":21413,"##lade":21414,"Hinduism":21415,"Fever":21416,"scouts":21417,"##roi":21418,"convened":21419,"##oki":21420,"184":21421,"Lao":21422,"boycott":21423,"unemployed":21424,"##lore":21425,"##ß":21426,"##hammer":21427,"Curran":21428,"disciples":21429,"odor":21430,"##ygiene":21431,"Lighthouse":21432,"Played":21433,"whales":21434,"discretion":21435,"Yves":21436,"##ceived":21437,"pauses":21438,"coincide":21439,"##nji":21440,"dizzy":21441,"##scopic":21442,"routed":21443,"Guardians":21444,"Kellan":21445,"carnival":21446,"nasal":21447,"224":21448,"##awed":21449,"Mitsubishi":21450,"640":21451,"Cast":21452,"silky":21453,"Projects":21454,"joked":21455,"Huddersfield":21456,"Rothschild":21457,"zu":21458,"##olar":21459,"Divisions":21460,"mildly":21461,"##eni":21462,"##lge":21463,"Appalachian":21464,"Sahara":21465,"pinch":21466,"##roon":21467,"wardrobe":21468,"##dham":21469,"##etal":21470,"Bubba":21471,"##lini":21472,"##rumbling":21473,"Communities":21474,"Poznań":21475,"unification":21476,"Beau":21477,"Kris":21478,"SV":21479,"Rowing":21480,"Minh":21481,"reconciliation":21482,"##saki":21483,"##sor":21484,"taped":21485,"##reck":21486,"certificates":21487,"gubernatorial":21488,"rainbow":21489,"##uing":21490,"litter":21491,"##lique":21492,"##oted":21493,"Butterfly":21494,"benefited":21495,"Images":21496,"induce":21497,"Balkans":2
1498,"Velvet":21499,"##90":21500,"##xon":21501,"Bowman":21502,"##breaker":21503,"penis":21504,"##nitz":21505,"##oint":21506,"##otive":21507,"crust":21508,"##pps":21509,"organizers":21510,"Outdoor":21511,"nominees":21512,"##rika":21513,"TX":21514,"##ucks":21515,"Protestants":21516,"##imation":21517,"appetite":21518,"Baja":21519,"awaited":21520,"##points":21521,"windshield":21522,"##igh":21523,"##zled":21524,"Brody":21525,"Buster":21526,"stylized":21527,"Bryce":21528,"##sz":21529,"Dollar":21530,"vest":21531,"mold":21532,"ounce":21533,"ok":21534,"receivers":21535,"##uza":21536,"Purdue":21537,"Harrington":21538,"Hodges":21539,"captures":21540,"##ggio":21541,"Reservation":21542,"##ssin":21543,"##tman":21544,"cosmic":21545,"straightforward":21546,"flipping":21547,"remixed":21548,"##athed":21549,"Gómez":21550,"Lim":21551,"motorcycles":21552,"economies":21553,"owning":21554,"Dani":21555,"##rosis":21556,"myths":21557,"sire":21558,"kindly":21559,"1768":21560,"Bean":21561,"graphs":21562,"##mee":21563,"##RO":21564,"##geon":21565,"puppy":21566,"Stephenson":21567,"notified":21568,"##jer":21569,"Watching":21570,"##rama":21571,"Sino":21572,"urgency":21573,"Islanders":21574,"##mash":21575,"Plata":21576,"fumble":21577,"##chev":21578,"##stance":21579,"##rack":21580,"##she":21581,"facilitated":21582,"swings":21583,"akin":21584,"enduring":21585,"payload":21586,"##phine":21587,"Deputies":21588,"murals":21589,"##tooth":21590,"610":21591,"Jays":21592,"eyeing":21593,"##quito":21594,"transparency":21595,"##cote":21596,"Timor":21597,"negatively":21598,"##isan":21599,"battled":21600,"##fected":21601,"thankful":21602,"Rage":21603,"hospitality":21604,"incorrectly":21605,"207":21606,"entrepreneurs":21607,"##cula":21608,"##wley":21609,"hedge":21610,"##cratic":21611,"Corpus":21612,"Odessa":21613,"Whereas":21614,"##ln":21615,"fetch":21616,"happier":21617,"Amherst":21618,"bullying":21619,"graceful":21620,"Height":21621,"Bartholomew":21622,"willingness":21623,"qualifier":21624,"191":21625,"Syed":21626
,"Wesleyan":21627,"Layla":21628,"##rrence":21629,"Webber":21630,"##hum":21631,"Rat":21632,"##cket":21633,"##herence":21634,"Monterey":21635,"contaminated":21636,"Beside":21637,"Mustafa":21638,"Nana":21639,"213":21640,"##pruce":21641,"Reason":21642,"##spense":21643,"spike":21644,"##gé":21645,"AU":21646,"disciple":21647,"charcoal":21648,"##lean":21649,"formulated":21650,"Diesel":21651,"Mariners":21652,"accreditation":21653,"glossy":21654,"1800s":21655,"##ih":21656,"Mainz":21657,"unison":21658,"Marianne":21659,"shear":21660,"overseeing":21661,"vernacular":21662,"bowled":21663,"##lett":21664,"unpopular":21665,"##ckoned":21666,"##monia":21667,"Gaston":21668,"##TI":21669,"##oters":21670,"Cups":21671,"##bones":21672,"##ports":21673,"Museo":21674,"minors":21675,"1773":21676,"Dickens":21677,"##EL":21678,"##NBC":21679,"Presents":21680,"ambitions":21681,"axes":21682,"Río":21683,"Yukon":21684,"bedside":21685,"Ribbon":21686,"Units":21687,"faults":21688,"conceal":21689,"##lani":21690,"prevailed":21691,"214":21692,"Goodwin":21693,"Jaguar":21694,"crumpled":21695,"Cullen":21696,"Wireless":21697,"ceded":21698,"remotely":21699,"Bin":21700,"mocking":21701,"straps":21702,"ceramics":21703,"##avi":21704,"##uding":21705,"##ader":21706,"Taft":21707,"twenties":21708,"##aked":21709,"Problem":21710,"quasi":21711,"Lamar":21712,"##ntes":21713,"##avan":21714,"Barr":21715,"##eral":21716,"hooks":21717,"sa":21718,"##ône":21719,"194":21720,"##ross":21721,"Nero":21722,"Caine":21723,"trance":21724,"Homeland":21725,"benches":21726,"Guthrie":21727,"dismiss":21728,"##lex":21729,"César":21730,"foliage":21731,"##oot":21732,"##alty":21733,"Assyrian":21734,"Ahead":21735,"Murdoch":21736,"dictatorship":21737,"wraps":21738,"##ntal":21739,"Corridor":21740,"Mackay":21741,"respectable":21742,"jewels":21743,"understands":21744,"##pathic":21745,"Bryn":21746,"##tep":21747,"ON":21748,"capsule":21749,"intrigued":21750,"Sleeping":21751,"communists":21752,"##chayat":21753,"##current":21754,"##vez":21755,"doubling":21756,"
booklet":21757,"##uche":21758,"Creed":21759,"##NU":21760,"spies":21761,"##sef":21762,"adjusting":21763,"197":21764,"Imam":21765,"heaved":21766,"Tanya":21767,"canonical":21768,"restraint":21769,"senators":21770,"stainless":21771,"##gnate":21772,"Matter":21773,"cache":21774,"restrained":21775,"conflicting":21776,"stung":21777,"##ool":21778,"Sustainable":21779,"antiquity":21780,"193":21781,"heavens":21782,"inclusive":21783,"##ador":21784,"fluent":21785,"303":21786,"911":21787,"archaeologist":21788,"superseded":21789,"##plex":21790,"Tammy":21791,"inspire":21792,"##passing":21793,"##lub":21794,"Lama":21795,"Mixing":21796,"##activated":21797,"##yote":21798,"parlor":21799,"tactic":21800,"198":21801,"Stefano":21802,"prostitute":21803,"recycling":21804,"sorted":21805,"banana":21806,"Stacey":21807,"Musée":21808,"aristocratic":21809,"cough":21810,"##rting":21811,"authorised":21812,"gangs":21813,"runoff":21814,"thoughtfully":21815,"##nish":21816,"Fisheries":21817,"Provence":21818,"detector":21819,"hum":21820,"##zhen":21821,"pill":21822,"##árez":21823,"Map":21824,"Leaves":21825,"Peabody":21826,"skater":21827,"vent":21828,"##color":21829,"390":21830,"cerebral":21831,"hostages":21832,"mare":21833,"Jurassic":21834,"swell":21835,"##isans":21836,"Knoxville":21837,"Naked":21838,"Malaya":21839,"scowl":21840,"Cobra":21841,"##anga":21842,"Sexual":21843,"##dron":21844,"##iae":21845,"196":21846,"##drick":21847,"Ravens":21848,"Blaine":21849,"##throp":21850,"Ismail":21851,"symmetric":21852,"##lossom":21853,"Leicestershire":21854,"Sylvester":21855,"glazed":21856,"##tended":21857,"Radar":21858,"fused":21859,"Families":21860,"Blacks":21861,"Sale":21862,"Zion":21863,"foothills":21864,"microwave":21865,"slain":21866,"Collingwood":21867,"##pants":21868,"##dling":21869,"killers":21870,"routinely":21871,"Janice":21872,"hearings":21873,"##chanted":21874,"##ltration":21875,"continents":21876,"##iving":21877,"##yster":21878,"##shot":21879,"##yna":21880,"injected":21881,"Guillaume":21882,"##ibi":21883,"
kinda":21884,"Confederacy":21885,"Barnett":21886,"disasters":21887,"incapable":21888,"##grating":21889,"rhythms":21890,"betting":21891,"draining":21892,"##hak":21893,"Callie":21894,"Glover":21895,"##iliated":21896,"Sherlock":21897,"hearted":21898,"punching":21899,"Wolverhampton":21900,"Leaf":21901,"Pi":21902,"builders":21903,"furnished":21904,"knighted":21905,"Photo":21906,"##zle":21907,"Touring":21908,"fumbled":21909,"pads":21910,"##ий":21911,"Bartlett":21912,"Gunner":21913,"eerie":21914,"Marius":21915,"Bonus":21916,"pots":21917,"##hino":21918,"##pta":21919,"Bray":21920,"Frey":21921,"Ortiz":21922,"stalls":21923,"belongings":21924,"Subway":21925,"fascination":21926,"metaphor":21927,"Bat":21928,"Boer":21929,"Colchester":21930,"sway":21931,"##gro":21932,"rhetoric":21933,"##dheim":21934,"Fool":21935,"PMID":21936,"admire":21937,"##hsil":21938,"Strand":21939,"TNA":21940,"##roth":21941,"Nottinghamshire":21942,"##mat":21943,"##yler":21944,"Oxfordshire":21945,"##nacle":21946,"##roner":21947,"BS":21948,"##nces":21949,"stimulus":21950,"transports":21951,"Sabbath":21952,"##postle":21953,"Richter":21954,"4000":21955,"##grim":21956,"##shima":21957,"##lette":21958,"deteriorated":21959,"analogous":21960,"##ratic":21961,"UHF":21962,"energies":21963,"inspiring":21964,"Yiddish":21965,"Activities":21966,"##quential":21967,"##boe":21968,"Melville":21969,"##ilton":21970,"Judd":21971,"consonants":21972,"labs":21973,"smuggling":21974,"##fari":21975,"avid":21976,"##uc":21977,"truce":21978,"undead":21979,"##raith":21980,"Mostly":21981,"bracelet":21982,"Connection":21983,"Hussain":21984,"awhile":21985,"##UC":21986,"##vention":21987,"liable":21988,"genetically":21989,"##phic":21990,"Important":21991,"Wildcats":21992,"daddy":21993,"transmit":21994,"##cas":21995,"conserved":21996,"Yesterday":21997,"##lite":21998,"Nicky":21999,"Guys":22000,"Wilder":22001,"Lay":22002,"skinned":22003,"Communists":22004,"Garfield":22005,"Nearby":22006,"organizer":22007,"Loss":22008,"crafts":22009,"walkway":22010,"C
hocolate":22011,"Sundance":22012,"Synod":22013,"##enham":22014,"modify":22015,"swayed":22016,"Surface":22017,"analysts":22018,"brackets":22019,"drone":22020,"parachute":22021,"smelling":22022,"Andrés":22023,"filthy":22024,"frogs":22025,"vertically":22026,"##OK":22027,"localities":22028,"marries":22029,"AHL":22030,"35th":22031,"##pian":22032,"Palazzo":22033,"cube":22034,"dismay":22035,"relocate":22036,"##на":22037,"Hear":22038,"##digo":22039,"##oxide":22040,"prefecture":22041,"converts":22042,"hangar":22043,"##oya":22044,"##ucking":22045,"Spectrum":22046,"deepened":22047,"spoiled":22048,"Keeping":22049,"##phobic":22050,"Verona":22051,"outrage":22052,"Improvement":22053,"##UI":22054,"masterpiece":22055,"slung":22056,"Calling":22057,"chant":22058,"Haute":22059,"mediated":22060,"manipulated":22061,"affirmed":22062,"##hesis":22063,"Hangul":22064,"skies":22065,"##llan":22066,"Worcestershire":22067,"##kos":22068,"mosaic":22069,"##bage":22070,"##wned":22071,"Putnam":22072,"folder":22073,"##LM":22074,"guts":22075,"noteworthy":22076,"##rada":22077,"AJ":22078,"sculpted":22079,"##iselle":22080,"##rang":22081,"recognizable":22082,"##pent":22083,"dolls":22084,"lobbying":22085,"impatiently":22086,"Se":22087,"staple":22088,"Serb":22089,"tandem":22090,"Hiroshima":22091,"thieves":22092,"##ynx":22093,"faculties":22094,"Norte":22095,"##alle":22096,"##trusion":22097,"chords":22098,"##ylon":22099,"Gareth":22100,"##lops":22101,"##escu":22102,"FIA":22103,"Levin":22104,"auspices":22105,"groin":22106,"Hui":22107,"nun":22108,"Listed":22109,"Honourable":22110,"Larsen":22111,"rigorous":22112,"##erer":22113,"Tonga":22114,"##pment":22115,"##rave":22116,"##track":22117,"##aa":22118,"##enary":22119,"540":22120,"clone":22121,"sediment":22122,"esteem":22123,"sighted":22124,"cruelty":22125,"##boa":22126,"inverse":22127,"violating":22128,"Amtrak":22129,"Status":22130,"amalgamated":22131,"vertex":22132,"AR":22133,"harmless":22134,"Amir":22135,"mounts":22136,"Coronation":22137,"counseling":22138,"Audi":2
2139,"CO₂":22140,"splits":22141,"##eyer":22142,"Humans":22143,"Salmon":22144,"##have":22145,"##rado":22146,"##čić":22147,"216":22148,"takeoff":22149,"classmates":22150,"psychedelic":22151,"##gni":22152,"Gypsy":22153,"231":22154,"Anger":22155,"GAA":22156,"ME":22157,"##nist":22158,"##tals":22159,"Lissa":22160,"Odd":22161,"baptized":22162,"Fiat":22163,"fringe":22164,"##hren":22165,"179":22166,"elevators":22167,"perspectives":22168,"##TF":22169,"##ngle":22170,"Question":22171,"frontal":22172,"950":22173,"thicker":22174,"Molecular":22175,"##nological":22176,"Sixteen":22177,"Baton":22178,"Hearing":22179,"commemorative":22180,"dorm":22181,"Architectural":22182,"purity":22183,"##erse":22184,"risky":22185,"Georgie":22186,"relaxing":22187,"##ugs":22188,"downed":22189,"##rar":22190,"Slim":22191,"##phy":22192,"IUCN":22193,"##thorpe":22194,"Parkinson":22195,"217":22196,"Marley":22197,"Shipping":22198,"sweaty":22199,"Jesuits":22200,"Sindh":22201,"Janata":22202,"implying":22203,"Armenians":22204,"intercept":22205,"Ankara":22206,"commissioners":22207,"ascended":22208,"sniper":22209,"Grass":22210,"Walls":22211,"salvage":22212,"Dewey":22213,"generalized":22214,"learnt":22215,"PT":22216,"##fighter":22217,"##tech":22218,"DR":22219,"##itrus":22220,"##zza":22221,"mercenaries":22222,"slots":22223,"##burst":22224,"##finger":22225,"##nsky":22226,"Princes":22227,"Rhodesia":22228,"##munication":22229,"##strom":22230,"Fremantle":22231,"homework":22232,"ins":22233,"##Os":22234,"##hao":22235,"##uffed":22236,"Thorpe":22237,"Xiao":22238,"exquisite":22239,"firstly":22240,"liberated":22241,"technician":22242,"Oilers":22243,"Phyllis":22244,"herb":22245,"sharks":22246,"MBE":22247,"##stock":22248,"Product":22249,"banjo":22250,"##morandum":22251,"##than":22252,"Visitors":22253,"unavailable":22254,"unpublished":22255,"oxidation":22256,"Vogue":22257,"##copic":22258,"##etics":22259,"Yates":22260,"##ppard":22261,"Leiden":22262,"Trading":22263,"cottages":22264,"Principles":22265,"##Millan":22266,"##wife":222
67,"##hiva":22268,"Vicar":22269,"nouns":22270,"strolled":22271,"##eorological":22272,"##eton":22273,"##science":22274,"precedent":22275,"Armand":22276,"Guido":22277,"rewards":22278,"##ilis":22279,"##tise":22280,"clipped":22281,"chick":22282,"##endra":22283,"averages":22284,"tentatively":22285,"1830s":22286,"##vos":22287,"Certainly":22288,"305":22289,"Société":22290,"Commandant":22291,"##crats":22292,"##dified":22293,"##nka":22294,"marsh":22295,"angered":22296,"ventilation":22297,"Hutton":22298,"Ritchie":22299,"##having":22300,"Eclipse":22301,"flick":22302,"motionless":22303,"Amor":22304,"Fest":22305,"Loire":22306,"lays":22307,"##icit":22308,"##sband":22309,"Guggenheim":22310,"Luck":22311,"disrupted":22312,"##ncia":22313,"Disco":22314,"##vigator":22315,"criticisms":22316,"grins":22317,"##lons":22318,"##vial":22319,"##ody":22320,"salute":22321,"Coaches":22322,"junk":22323,"saxophonist":22324,"##eology":22325,"Uprising":22326,"Diet":22327,"##marks":22328,"chronicles":22329,"robbed":22330,"##iet":22331,"##ahi":22332,"Bohemian":22333,"magician":22334,"wavelength":22335,"Kenyan":22336,"augmented":22337,"fashionable":22338,"##ogies":22339,"Luce":22340,"F1":22341,"Monmouth":22342,"##jos":22343,"##loop":22344,"enjoyment":22345,"exemption":22346,"Centers":22347,"##visor":22348,"Soundtrack":22349,"blinding":22350,"practitioner":22351,"solidarity":22352,"sacrificed":22353,"##oso":22354,"##cture":22355,"##riated":22356,"blended":22357,"Abd":22358,"Copyright":22359,"##nob":22360,"34th":22361,"##reak":22362,"Claudio":22363,"hectare":22364,"rotor":22365,"testify":22366,"##ends":22367,"##iably":22368,"##sume":22369,"landowner":22370,"##cess":22371,"##ckman":22372,"Eduard":22373,"Silesian":22374,"backseat":22375,"mutually":22376,"##abe":22377,"Mallory":22378,"bounds":22379,"Collective":22380,"Poet":22381,"Winkler":22382,"pertaining":22383,"scraped":22384,"Phelps":22385,"crane":22386,"flickering":22387,"Proto":22388,"bubbles":22389,"popularized":22390,"removes":22391,"##86":22392,"Cad
illac":22393,"Warfare":22394,"audible":22395,"rites":22396,"shivering":22397,"##sist":22398,"##nst":22399,"##biotic":22400,"Mon":22401,"fascist":22402,"Bali":22403,"Kathryn":22404,"ambiguous":22405,"furiously":22406,"morale":22407,"patio":22408,"Sang":22409,"inconsistent":22410,"topology":22411,"Greens":22412,"monkeys":22413,"Köppen":22414,"189":22415,"Toy":22416,"vow":22417,"##ías":22418,"bombings":22419,"##culus":22420,"improvised":22421,"lodged":22422,"subsidiaries":22423,"garment":22424,"startling":22425,"practised":22426,"Hume":22427,"Thorn":22428,"categorized":22429,"Till":22430,"Eileen":22431,"wedge":22432,"##64":22433,"Federico":22434,"patriotic":22435,"unlock":22436,"##oshi":22437,"badminton":22438,"Compared":22439,"Vilnius":22440,"##KE":22441,"Crimean":22442,"Kemp":22443,"decks":22444,"spaced":22445,"resolutions":22446,"sighs":22447,"##mind":22448,"Imagine":22449,"Cartoon":22450,"huddled":22451,"policemen":22452,"forwards":22453,"##rouch":22454,"equals":22455,"##nter":22456,"inspected":22457,"Charley":22458,"MG":22459,"##rte":22460,"pamphlet":22461,"Arturo":22462,"dans":22463,"scarcely":22464,"##ulton":22465,"##rvin":22466,"parental":22467,"unconstitutional":22468,"watts":22469,"Susannah":22470,"Dare":22471,"##sitive":22472,"Rowland":22473,"Valle":22474,"invalid":22475,"##ué":22476,"Detachment":22477,"acronym":22478,"Yokohama":22479,"verified":22480,"##lsson":22481,"groove":22482,"Liza":22483,"clarified":22484,"compromised":22485,"265":22486,"##rgon":22487,"##orf":22488,"hesitant":22489,"Fruit":22490,"Application":22491,"Mathias":22492,"icons":22493,"##cell":22494,"Qin":22495,"interventions":22496,"##uron":22497,"punt":22498,"remnant":22499,"##rien":22500,"Ames":22501,"manifold":22502,"spines":22503,"floral":22504,"##zable":22505,"comrades":22506,"Fallen":22507,"orbits":22508,"Annals":22509,"hobby":22510,"Auditorium":22511,"implicated":22512,"researching":22513,"Pueblo":22514,"Ta":22515,"terminate":22516,"##pella":22517,"Rings":22518,"approximation":22519,
"fuzzy":22520,"##ús":22521,"thriving":22522,"##ket":22523,"Conor":22524,"alarmed":22525,"etched":22526,"Cary":22527,"##rdon":22528,"Ally":22529,"##rington":22530,"Pay":22531,"mint":22532,"##hasa":22533,"##unity":22534,"##dman":22535,"##itate":22536,"Oceania":22537,"furrowed":22538,"trams":22539,"##aq":22540,"Wentworth":22541,"ventured":22542,"choreography":22543,"prototypes":22544,"Patel":22545,"mouthed":22546,"trenches":22547,"##licing":22548,"##yya":22549,"Lies":22550,"deception":22551,"##erve":22552,"##vations":22553,"Bertrand":22554,"earthquakes":22555,"##tography":22556,"Southwestern":22557,"##aja":22558,"token":22559,"Gupta":22560,"##yō":22561,"Beckett":22562,"initials":22563,"ironic":22564,"Tsar":22565,"subdued":22566,"shootout":22567,"sobbing":22568,"liar":22569,"Scandinavia":22570,"Souls":22571,"ch":22572,"therapist":22573,"trader":22574,"Regulation":22575,"Kali":22576,"busiest":22577,"##pation":22578,"32nd":22579,"Telephone":22580,"Vargas":22581,"##moky":22582,"##nose":22583,"##uge":22584,"Favorite":22585,"abducted":22586,"bonding":22587,"219":22588,"255":22589,"correction":22590,"mat":22591,"drown":22592,"fl":22593,"unbeaten":22594,"Pocket":22595,"Summers":22596,"Quite":22597,"rods":22598,"Percussion":22599,"##ndy":22600,"buzzing":22601,"cadet":22602,"Wilkes":22603,"attire":22604,"directory":22605,"utilities":22606,"naive":22607,"populous":22608,"Hendrix":22609,"##actor":22610,"disadvantage":22611,"1400":22612,"Landon":22613,"Underworld":22614,"##ense":22615,"Occasionally":22616,"mercury":22617,"Davey":22618,"Morley":22619,"spa":22620,"wrestled":22621,"##vender":22622,"eclipse":22623,"Sienna":22624,"supplemented":22625,"thou":22626,"Stream":22627,"liturgical":22628,"##gall":22629,"##berries":22630,"##piration":22631,"1769":22632,"Bucks":22633,"abandoning":22634,"##jutant":22635,"##nac":22636,"232":22637,"venom":22638,"##31":22639,"Roche":22640,"dotted":22641,"Currie":22642,"Córdoba":22643,"Milo":22644,"Sharif":22645,"divides":22646,"justification":22647,"
prejudice":22648,"fortunate":22649,"##vide":22650,"##ābād":22651,"Rowe":22652,"inflammatory":22653,"##eld":22654,"avenue":22655,"Sources":22656,"##rimal":22657,"Messenger":22658,"Blanco":22659,"advocating":22660,"formulation":22661,"##pute":22662,"emphasizes":22663,"nut":22664,"Armored":22665,"##ented":22666,"nutrients":22667,"##tment":22668,"insistence":22669,"Martins":22670,"landowners":22671,"##RB":22672,"comparatively":22673,"headlines":22674,"snaps":22675,"##qing":22676,"Celebration":22677,"##mad":22678,"republican":22679,"##NE":22680,"Trace":22681,"##500":22682,"1771":22683,"proclamation":22684,"NRL":22685,"Rubin":22686,"Buzz":22687,"Weimar":22688,"##AG":22689,"199":22690,"posthumous":22691,"##ental":22692,"##deacon":22693,"Distance":22694,"intensely":22695,"overheard":22696,"Arcade":22697,"diagonal":22698,"hazard":22699,"Giving":22700,"weekdays":22701,"##ù":22702,"Verdi":22703,"actresses":22704,"##hare":22705,"Pulling":22706,"##erries":22707,"##pores":22708,"catering":22709,"shortest":22710,"##ctors":22711,"##cure":22712,"##restle":22713,"##reta":22714,"##runch":22715,"##brecht":22716,"##uddin":22717,"Moments":22718,"senate":22719,"Feng":22720,"Prescott":22721,"##thest":22722,"218":22723,"divisional":22724,"Bertie":22725,"sparse":22726,"surrounds":22727,"coupling":22728,"gravitational":22729,"werewolves":22730,"##lax":22731,"Rankings":22732,"##mated":22733,"##tries":22734,"Shia":22735,"##mart":22736,"##23":22737,"##vocative":22738,"interfaces":22739,"morphology":22740,"newscast":22741,"##bide":22742,"inputs":22743,"solicitor":22744,"Olaf":22745,"cabinets":22746,"puzzles":22747,"##tains":22748,"Unified":22749,"##firmed":22750,"WA":22751,"solemn":22752,"##opy":22753,"Tito":22754,"Jaenelle":22755,"Neolithic":22756,"horseback":22757,"##ires":22758,"pharmacy":22759,"prevalence":22760,"##lint":22761,"Swami":22762,"##bush":22763,"##tudes":22764,"Philipp":22765,"mythical":22766,"divers":22767,"Scouting":22768,"aperture":22769,"progressively":22770,"##bay":22771,"##ni
o":22772,"bounce":22773,"Floor":22774,"##elf":22775,"Lucan":22776,"adulthood":22777,"helm":22778,"Bluff":22779,"Passage":22780,"Salvation":22781,"lemon":22782,"napkin":22783,"scheduling":22784,"##gets":22785,"Elements":22786,"Mina":22787,"Novak":22788,"stalled":22789,"##llister":22790,"Infrastructure":22791,"##nky":22792,"##tania":22793,"##uished":22794,"Katz":22795,"Norma":22796,"sucks":22797,"trusting":22798,"1765":22799,"boilers":22800,"Accordingly":22801,"##hered":22802,"223":22803,"Crowley":22804,"##fight":22805,"##ulo":22806,"Henrietta":22807,"##hani":22808,"pounder":22809,"surprises":22810,"##chor":22811,"##glia":22812,"Dukes":22813,"##cracy":22814,"##zier":22815,"##fs":22816,"Patriot":22817,"silicon":22818,"##VP":22819,"simulcast":22820,"telegraph":22821,"Mysore":22822,"cardboard":22823,"Len":22824,"##QL":22825,"Auguste":22826,"accordion":22827,"analytical":22828,"specify":22829,"ineffective":22830,"hunched":22831,"abnormal":22832,"Transylvania":22833,"##dn":22834,"##tending":22835,"Emilia":22836,"glittering":22837,"Maddy":22838,"##wana":22839,"1762":22840,"External":22841,"Lecture":22842,"endorsement":22843,"Hernández":22844,"Anaheim":22845,"Ware":22846,"offences":22847,"##phorus":22848,"Plantation":22849,"popping":22850,"Bonaparte":22851,"disgusting":22852,"neared":22853,"##notes":22854,"Identity":22855,"heroin":22856,"nicely":22857,"##raverse":22858,"apron":22859,"congestion":22860,"##PR":22861,"padded":22862,"##fts":22863,"invaders":22864,"##came":22865,"freshly":22866,"Halle":22867,"endowed":22868,"fracture":22869,"ROM":22870,"##max":22871,"sediments":22872,"diffusion":22873,"dryly":22874,"##tara":22875,"Tam":22876,"Draw":22877,"Spin":22878,"Talon":22879,"Anthropology":22880,"##lify":22881,"nausea":22882,"##shirt":22883,"insert":22884,"Fresno":22885,"capitalist":22886,"indefinitely":22887,"apples":22888,"Gift":22889,"scooped":22890,"60s":22891,"Cooperative":22892,"mistakenly":22893,"##lover":22894,"murmur":22895,"##iger":22896,"Equipment":22897,"abusive
":22898,"orphanage":22899,"##9th":22900,"##lterweight":22901,"##unda":22902,"Baird":22903,"ant":22904,"saloon":22905,"33rd":22906,"Chesapeake":22907,"##chair":22908,"##sound":22909,"##tend":22910,"chaotic":22911,"pornography":22912,"brace":22913,"##aret":22914,"heiress":22915,"SSR":22916,"resentment":22917,"Arbor":22918,"headmaster":22919,"##uren":22920,"unlimited":22921,"##with":22922,"##jn":22923,"Bram":22924,"Ely":22925,"Pokémon":22926,"pivotal":22927,"##guous":22928,"Database":22929,"Marta":22930,"Shine":22931,"stumbling":22932,"##ovsky":22933,"##skin":22934,"Henley":22935,"Polk":22936,"functioned":22937,"##layer":22938,"##pas":22939,"##udd":22940,"##MX":22941,"blackness":22942,"cadets":22943,"feral":22944,"Damian":22945,"##actions":22946,"2D":22947,"##yla":22948,"Apocalypse":22949,"##aic":22950,"inactivated":22951,"##china":22952,"##kovic":22953,"##bres":22954,"destroys":22955,"nap":22956,"Macy":22957,"sums":22958,"Madhya":22959,"Wisdom":22960,"rejects":22961,"##amel":22962,"60th":22963,"Cho":22964,"bandwidth":22965,"##sons":22966,"##obbing":22967,"##orama":22968,"Mutual":22969,"shafts":22970,"##estone":22971,"##rsen":22972,"accord":22973,"replaces":22974,"waterfront":22975,"##gonal":22976,"##rida":22977,"convictions":22978,"##ays":22979,"calmed":22980,"suppliers":22981,"Cummings":22982,"GMA":22983,"fearful":22984,"Scientist":22985,"Sinai":22986,"examines":22987,"experimented":22988,"Netflix":22989,"Enforcement":22990,"Scarlett":22991,"##lasia":22992,"Healthcare":22993,"##onte":22994,"Dude":22995,"inverted":22996,"##36":22997,"##regation":22998,"##lidae":22999,"Munro":23000,"##angay":23001,"Airbus":23002,"overlapping":23003,"Drivers":23004,"lawsuits":23005,"bodily":23006,"##udder":23007,"Wanda":23008,"Effects":23009,"Fathers":23010,"##finery":23011,"##islav":23012,"Ridley":23013,"observatory":23014,"pod":23015,"##utrition":23016,"Electricity":23017,"landslide":23018,"##mable":23019,"##zoic":23020,"##imator":23021,"##uration":23022,"Estates":23023,"sleepy":23024
,"Nickelodeon":23025,"steaming":23026,"irony":23027,"schedules":23028,"snack":23029,"spikes":23030,"Hmm":23031,"##nesia":23032,"##bella":23033,"##hibit":23034,"Greenville":23035,"plucked":23036,"Harald":23037,"##ono":23038,"Gamma":23039,"infringement":23040,"roaring":23041,"deposition":23042,"##pol":23043,"##orum":23044,"660":23045,"seminal":23046,"passports":23047,"engagements":23048,"Akbar":23049,"rotated":23050,"##bina":23051,"##gart":23052,"Hartley":23053,"##lown":23054,"##truct":23055,"uttered":23056,"traumatic":23057,"Dex":23058,"##ôme":23059,"Holloway":23060,"MV":23061,"apartheid":23062,"##nee":23063,"Counter":23064,"Colton":23065,"OR":23066,"245":23067,"Spaniards":23068,"Regency":23069,"Schedule":23070,"scratching":23071,"squads":23072,"verify":23073,"##alk":23074,"keyboardist":23075,"rotten":23076,"Forestry":23077,"aids":23078,"commemorating":23079,"##yed":23080,"##érie":23081,"Sting":23082,"##elly":23083,"Dai":23084,"##fers":23085,"##berley":23086,"##ducted":23087,"Melvin":23088,"cannabis":23089,"glider":23090,"##enbach":23091,"##rban":23092,"Costello":23093,"Skating":23094,"cartoonist":23095,"AN":23096,"audit":23097,"##pectator":23098,"distributing":23099,"226":23100,"312":23101,"interpreter":23102,"header":23103,"Alternatively":23104,"##ases":23105,"smug":23106,"##kumar":23107,"cabins":23108,"remastered":23109,"Connolly":23110,"Kelsey":23111,"LED":23112,"tentative":23113,"Check":23114,"Sichuan":23115,"shaved":23116,"##42":23117,"Gerhard":23118,"Harvest":23119,"inward":23120,"##rque":23121,"Hopefully":23122,"hem":23123,"##34":23124,"Typical":23125,"binds":23126,"wrath":23127,"Woodstock":23128,"forcibly":23129,"Fergus":23130,"##charged":23131,"##tured":23132,"prepares":23133,"amenities":23134,"penetration":23135,"##ghan":23136,"coarse":23137,"##oned":23138,"enthusiasts":23139,"##av":23140,"##twined":23141,"fielded":23142,"##cky":23143,"Kiel":23144,"##obia":23145,"470":23146,"beers":23147,"tremble":23148,"youths":23149,"attendees":23150,"##cademies":23151,"
##sex":23152,"Macon":23153,"communism":23154,"dir":23155,"##abi":23156,"Lennox":23157,"Wen":23158,"differentiate":23159,"jewel":23160,"##SO":23161,"activate":23162,"assert":23163,"laden":23164,"unto":23165,"Gillespie":23166,"Guillermo":23167,"accumulation":23168,"##GM":23169,"NGO":23170,"Rosenberg":23171,"calculating":23172,"drastically":23173,"##omorphic":23174,"peeled":23175,"Liège":23176,"insurgents":23177,"outdoors":23178,"##enia":23179,"Aspen":23180,"Sep":23181,"awakened":23182,"##eye":23183,"Consul":23184,"Maiden":23185,"insanity":23186,"##brian":23187,"furnace":23188,"Colours":23189,"distributions":23190,"longitudinal":23191,"syllables":23192,"##scent":23193,"Martian":23194,"accountant":23195,"Atkins":23196,"husbands":23197,"sewage":23198,"zur":23199,"collaborate":23200,"highlighting":23201,"##rites":23202,"##PI":23203,"colonization":23204,"nearer":23205,"##XT":23206,"dunes":23207,"positioning":23208,"Ku":23209,"multitude":23210,"luxurious":23211,"Volvo":23212,"linguistics":23213,"plotting":23214,"squared":23215,"##inder":23216,"outstretched":23217,"##uds":23218,"Fuji":23219,"ji":23220,"##feit":23221,"##ahu":23222,"##loat":23223,"##gado":23224,"##luster":23225,"##oku":23226,"América":23227,"##iza":23228,"Residents":23229,"vine":23230,"Pieces":23231,"DD":23232,"Vampires":23233,"##ová":23234,"smoked":23235,"harshly":23236,"spreads":23237,"##turn":23238,"##zhi":23239,"betray":23240,"electors":23241,"##settled":23242,"Considering":23243,"exploits":23244,"stamped":23245,"Dusty":23246,"enraged":23247,"Nairobi":23248,"##38":23249,"intervened":23250,"##luck":23251,"orchestras":23252,"##lda":23253,"Hereford":23254,"Jarvis":23255,"calf":23256,"##itzer":23257,"##CH":23258,"salesman":23259,"Lovers":23260,"cigar":23261,"Angelica":23262,"doomed":23263,"heroine":23264,"##tible":23265,"Sanford":23266,"offenders":23267,"##ulously":23268,"articulated":23269,"##oam":23270,"Emanuel":23271,"Gardiner":23272,"Edna":23273,"Shu":23274,"gigantic":23275,"##stable":23276,"Tallinn":23277
,"coasts":23278,"Maker":23279,"ale":23280,"stalking":23281,"##oga":23282,"##smus":23283,"lucrative":23284,"southbound":23285,"##changing":23286,"Reg":23287,"##lants":23288,"Schleswig":23289,"discount":23290,"grouping":23291,"physiological":23292,"##OH":23293,"##sun":23294,"Galen":23295,"assurance":23296,"reconcile":23297,"rib":23298,"scarlet":23299,"Thatcher":23300,"anarchist":23301,"##oom":23302,"Turnpike":23303,"##ceding":23304,"cocktail":23305,"Sweeney":23306,"Allegheny":23307,"concessions":23308,"oppression":23309,"reassuring":23310,"##poli":23311,"##ticus":23312,"##TR":23313,"##VI":23314,"##uca":23315,"##zione":23316,"directional":23317,"strikeouts":23318,"Beneath":23319,"Couldn":23320,"Kabul":23321,"##national":23322,"hydroelectric":23323,"##jit":23324,"Desire":23325,"##riot":23326,"enhancing":23327,"northbound":23328,"##PO":23329,"Ok":23330,"Routledge":23331,"volatile":23332,"Bernardo":23333,"Python":23334,"333":23335,"ample":23336,"chestnut":23337,"automobiles":23338,"##innamon":23339,"##care":23340,"##hering":23341,"BWF":23342,"salaries":23343,"Turbo":23344,"acquisitions":23345,"##stituting":23346,"strengths":23347,"pilgrims":23348,"Ponce":23349,"Pig":23350,"Actors":23351,"Beard":23352,"sanitation":23353,"##RD":23354,"##mett":23355,"Telecommunications":23356,"worms":23357,"##idas":23358,"Juno":23359,"Larson":23360,"Ventura":23361,"Northeastern":23362,"weighs":23363,"Houghton":23364,"collaborating":23365,"lottery":23366,"##rano":23367,"Wonderland":23368,"gigs":23369,"##lmer":23370,"##zano":23371,"##edd":23372,"##nife":23373,"mixtape":23374,"predominant":23375,"tripped":23376,"##ruly":23377,"Alexei":23378,"investing":23379,"Belgarath":23380,"Brasil":23381,"hiss":23382,"##crat":23383,"##xham":23384,"Côte":23385,"560":23386,"kilometer":23387,"##cological":23388,"analyzing":23389,"##As":23390,"engined":23391,"listener":23392,"##cakes":23393,"negotiation":23394,"##hisky":23395,"Santana":23396,"##lemma":23397,"IAAF":23398,"Seneca":23399,"skeletal":23400,"Covenant"
:23401,"Steiner":23402,"##lev":23403,"##uen":23404,"Neptune":23405,"retention":23406,"##upon":23407,"Closing":23408,"Czechoslovak":23409,"chalk":23410,"Navarre":23411,"NZ":23412,"##IG":23413,"##hop":23414,"##oly":23415,"##quatorial":23416,"##sad":23417,"Brewery":23418,"Conflict":23419,"Them":23420,"renew":23421,"turrets":23422,"disagree":23423,"Petra":23424,"Slave":23425,"##reole":23426,"adjustment":23427,"##dela":23428,"##regard":23429,"##sner":23430,"framing":23431,"stature":23432,"##rca":23433,"##sies":23434,"##46":23435,"##mata":23436,"Logic":23437,"inadvertently":23438,"naturalist":23439,"spheres":23440,"towering":23441,"heightened":23442,"Dodd":23443,"rink":23444,"##fle":23445,"Keyboards":23446,"bulb":23447,"diver":23448,"ul":23449,"##tsk":23450,"Exodus":23451,"Deacon":23452,"España":23453,"Canadiens":23454,"oblique":23455,"thud":23456,"reigned":23457,"rug":23458,"Whitman":23459,"Dash":23460,"##iens":23461,"Haifa":23462,"pets":23463,"##arland":23464,"manually":23465,"dart":23466,"##bial":23467,"Sven":23468,"textiles":23469,"subgroup":23470,"Napier":23471,"graffiti":23472,"revolver":23473,"humming":23474,"Babu":23475,"protector":23476,"typed":23477,"Provinces":23478,"Sparta":23479,"Wills":23480,"subjective":23481,"##rella":23482,"temptation":23483,"##liest":23484,"FL":23485,"Sadie":23486,"manifest":23487,"Guangdong":23488,"Transfer":23489,"entertain":23490,"eve":23491,"recipes":23492,"##33":23493,"Benedictine":23494,"retailer":23495,"##dence":23496,"establishes":23497,"##cluded":23498,"##rked":23499,"Ursula":23500,"##ltz":23501,"##lars":23502,"##rena":23503,"qualifiers":23504,"##curement":23505,"colt":23506,"depictions":23507,"##oit":23508,"Spiritual":23509,"differentiation":23510,"staffed":23511,"transitional":23512,"##lew":23513,"1761":23514,"fatalities":23515,"##oan":23516,"Bayern":23517,"Northamptonshire":23518,"Weeks":23519,"##CU":23520,"Fife":23521,"capacities":23522,"hoarse":23523,"##latt":23524,"##ة":23525,"evidenced":23526,"##HD":23527,"##ographer":235
28,"assessing":23529,"evolve":23530,"hints":23531,"42nd":23532,"streaked":23533,"##lve":23534,"Yahoo":23535,"##estive":23536,"##rned":23537,"##zas":23538,"baggage":23539,"Elected":23540,"secrecy":23541,"##champ":23542,"Character":23543,"Pen":23544,"Decca":23545,"cape":23546,"Bernardino":23547,"vapor":23548,"Dolly":23549,"counselor":23550,"##isers":23551,"Benin":23552,"##khar":23553,"##CR":23554,"notch":23555,"##thus":23556,"##racy":23557,"bounty":23558,"lend":23559,"grassland":23560,"##chtenstein":23561,"##dating":23562,"pseudo":23563,"golfer":23564,"simplest":23565,"##ceive":23566,"Lucivar":23567,"Triumph":23568,"dinosaur":23569,"dinosaurs":23570,"##šić":23571,"Seahawks":23572,"##nco":23573,"resorts":23574,"reelected":23575,"1766":23576,"reproduce":23577,"universally":23578,"##OA":23579,"ER":23580,"tendencies":23581,"Consolidated":23582,"Massey":23583,"Tasmanian":23584,"reckless":23585,"##icz":23586,"##ricks":23587,"1755":23588,"questionable":23589,"Audience":23590,"##lates":23591,"preseason":23592,"Quran":23593,"trivial":23594,"Haitian":23595,"Freeway":23596,"dialed":23597,"Appointed":23598,"Heard":23599,"ecosystems":23600,"##bula":23601,"hormones":23602,"Carbon":23603,"Rd":23604,"##arney":23605,"##working":23606,"Christoph":23607,"presiding":23608,"pu":23609,"##athy":23610,"Morrow":23611,"Dar":23612,"ensures":23613,"posing":23614,"remedy":23615,"EA":23616,"disclosed":23617,"##hui":23618,"##rten":23619,"rumours":23620,"surveying":23621,"##ficiency":23622,"Aziz":23623,"Jewel":23624,"Plays":23625,"##smatic":23626,"Bernhard":23627,"Christi":23628,"##eanut":23629,"##friend":23630,"jailed":23631,"##dr":23632,"govern":23633,"neighbour":23634,"butler":23635,"Acheron":23636,"murdering":23637,"oils":23638,"mac":23639,"Editorial":23640,"detectives":23641,"bolts":23642,"##ulon":23643,"Guitars":23644,"malaria":23645,"36th":23646,"Pembroke":23647,"Opened":23648,"##hium":23649,"harmonic":23650,"serum":23651,"##sio":23652,"Franks":23653,"fingernails":23654,"##gli":23655,"cultura
lly":23656,"evolving":23657,"scalp":23658,"VP":23659,"deploy":23660,"uploaded":23661,"mater":23662,"##evo":23663,"Jammu":23664,"Spa":23665,"##icker":23666,"flirting":23667,"##cursions":23668,"Heidi":23669,"Majority":23670,"sprawled":23671,"##alytic":23672,"Zheng":23673,"bunker":23674,"##lena":23675,"ST":23676,"##tile":23677,"Jiang":23678,"ceilings":23679,"##ently":23680,"##ols":23681,"Recovery":23682,"dire":23683,"##good":23684,"Manson":23685,"Honestly":23686,"Montréal":23687,"1764":23688,"227":23689,"quota":23690,"Lakshmi":23691,"incentive":23692,"Accounting":23693,"##cilla":23694,"Eureka":23695,"Reaper":23696,"buzzed":23697,"##uh":23698,"courtroom":23699,"dub":23700,"##mberg":23701,"KC":23702,"Gong":23703,"Theodor":23704,"Académie":23705,"NPR":23706,"criticizing":23707,"protesting":23708,"##pired":23709,"##yric":23710,"abuses":23711,"fisheries":23712,"##minated":23713,"1767":23714,"yd":23715,"Gemini":23716,"Subcommittee":23717,"##fuse":23718,"Duff":23719,"Wasn":23720,"Wight":23721,"cleaner":23722,"##tite":23723,"planetary":23724,"Survivor":23725,"Zionist":23726,"mounds":23727,"##rary":23728,"landfall":23729,"disruption":23730,"yielding":23731,"##yana":23732,"bids":23733,"unidentified":23734,"Garry":23735,"Ellison":23736,"Elmer":23737,"Fishing":23738,"Hayward":23739,"demos":23740,"modelling":23741,"##anche":23742,"##stick":23743,"caressed":23744,"entertained":23745,"##hesion":23746,"piers":23747,"Crimea":23748,"##mass":23749,"WHO":23750,"boulder":23751,"trunks":23752,"1640":23753,"Biennale":23754,"Palestinians":23755,"Pursuit":23756,"##udes":23757,"Dora":23758,"contender":23759,"##dridge":23760,"Nanjing":23761,"##ezer":23762,"##former":23763,"##ibel":23764,"Whole":23765,"proliferation":23766,"##tide":23767,"##weiler":23768,"fuels":23769,"predictions":23770,"##ente":23771,"##onium":23772,"Filming":23773,"absorbing":23774,"Ramón":23775,"strangled":23776,"conveyed":23777,"inhabit":23778,"prostitutes":23779,"recession":23780,"bonded":23781,"clinched":23782,"##eak":2378
3,"##iji":23784,"##edar":23785,"Pleasure":23786,"Rite":23787,"Christy":23788,"Therapy":23789,"sarcasm":23790,"##collegiate":23791,"hilt":23792,"probation":23793,"Sarawak":23794,"coefficients":23795,"underworld":23796,"biodiversity":23797,"SBS":23798,"groom":23799,"brewing":23800,"dungeon":23801,"##claiming":23802,"Hari":23803,"turnover":23804,"##ntina":23805,"##omer":23806,"##opped":23807,"orthodox":23808,"styling":23809,"##tars":23810,"##ulata":23811,"priced":23812,"Marjorie":23813,"##eley":23814,"##abar":23815,"Yong":23816,"##tically":23817,"Crambidae":23818,"Hernandez":23819,"##ego":23820,"##rricular":23821,"##ark":23822,"##lamour":23823,"##llin":23824,"##augh":23825,"##tens":23826,"Advancement":23827,"Loyola":23828,"##4th":23829,"##hh":23830,"goin":23831,"marshes":23832,"Sardinia":23833,"##ša":23834,"Ljubljana":23835,"Singing":23836,"suspiciously":23837,"##hesive":23838,"Félix":23839,"Regarding":23840,"flap":23841,"stimulation":23842,"##raught":23843,"Apr":23844,"Yin":23845,"gaping":23846,"tighten":23847,"skier":23848,"##itas":23849,"##lad":23850,"##rani":23851,"264":23852,"Ashes":23853,"Olson":23854,"Problems":23855,"Tabitha":23856,"##rading":23857,"balancing":23858,"sunrise":23859,"##ease":23860,"##iture":23861,"##ritic":23862,"Fringe":23863,"##iciency":23864,"Inspired":23865,"Linnaeus":23866,"PBA":23867,"disapproval":23868,"##kles":23869,"##rka":23870,"##tails":23871,"##urger":23872,"Disaster":23873,"Laboratories":23874,"apps":23875,"paradise":23876,"Aero":23877,"Came":23878,"sneaking":23879,"Gee":23880,"Beacon":23881,"ODI":23882,"commodity":23883,"Ellington":23884,"graphical":23885,"Gretchen":23886,"spire":23887,"##skaya":23888,"##trine":23889,"RTÉ":23890,"efficacy":23891,"plc":23892,"tribunal":23893,"##ytic":23894,"downhill":23895,"flu":23896,"medications":23897,"##kaya":23898,"widen":23899,"Sunrise":23900,"##nous":23901,"distinguishing":23902,"pawn":23903,"##BO":23904,"##irn":23905,"##ssing":23906,"##ν":23907,"Easton":23908,"##vila":23909,"Rhineland":23910
,"##aque":23911,"defect":23912,"##saurus":23913,"Goose":23914,"Ju":23915,"##classified":23916,"Middlesbrough":23917,"shaping":23918,"preached":23919,"1759":23920,"##erland":23921,"Ein":23922,"Hailey":23923,"musicals":23924,"##altered":23925,"Galileo":23926,"Hilda":23927,"Fighters":23928,"Lac":23929,"##ometric":23930,"295":23931,"Leafs":23932,"Milano":23933,"##lta":23934,"##VD":23935,"##ivist":23936,"penetrated":23937,"Mask":23938,"Orchard":23939,"plaintiff":23940,"##icorn":23941,"Yvonne":23942,"##fred":23943,"outfielder":23944,"peek":23945,"Collier":23946,"Caracas":23947,"repealed":23948,"Bois":23949,"dell":23950,"restrict":23951,"Dolores":23952,"Hadley":23953,"peacefully":23954,"##LL":23955,"condom":23956,"Granny":23957,"Orders":23958,"sabotage":23959,"##toon":23960,"##rings":23961,"compass":23962,"marshal":23963,"gears":23964,"brigadier":23965,"dye":23966,"Yunnan":23967,"communicating":23968,"donate":23969,"emerald":23970,"vitamin":23971,"administer":23972,"Fulham":23973,"##classical":23974,"##llas":23975,"Buckinghamshire":23976,"Held":23977,"layered":23978,"disclosure":23979,"Akira":23980,"programmer":23981,"shrimp":23982,"Crusade":23983,"##ximal":23984,"Luzon":23985,"bakery":23986,"##cute":23987,"Garth":23988,"Citadel":23989,"uniquely":23990,"Curling":23991,"info":23992,"mum":23993,"Para":23994,"##ști":23995,"sleek":23996,"##ione":23997,"hey":23998,"Lantern":23999,"mesh":24000,"##lacing":24001,"##lizzard":24002,"##gade":24003,"prosecuted":24004,"Alba":24005,"Gilles":24006,"greedy":24007,"twists":24008,"##ogged":24009,"Viper":24010,"##kata":24011,"Appearances":24012,"Skyla":24013,"hymns":24014,"##pelled":24015,"curving":24016,"predictable":24017,"Grave":24018,"Watford":24019,"##dford":24020,"##liptic":24021,"##vary":24022,"Westwood":24023,"fluids":24024,"Models":24025,"statutes":24026,"##ynamite":24027,"1740":24028,"##culate":24029,"Framework":24030,"Johanna":24031,"##gression":24032,"Vuelta":24033,"imp":24034,"##otion":24035,"##raga":24036,"##thouse":24037,"Ciud
ad":24038,"festivities":24039,"##love":24040,"Beyoncé":24041,"italics":24042,"##vance":24043,"DB":24044,"##haman":24045,"outs":24046,"Singers":24047,"##ueva":24048,"##urning":24049,"##51":24050,"##ntiary":24051,"##mobile":24052,"285":24053,"Mimi":24054,"emeritus":24055,"nesting":24056,"Keeper":24057,"Ways":24058,"##onal":24059,"##oux":24060,"Edmond":24061,"MMA":24062,"##bark":24063,"##oop":24064,"Hampson":24065,"##ñez":24066,"##rets":24067,"Gladstone":24068,"wreckage":24069,"Pont":24070,"Playboy":24071,"reluctance":24072,"##ná":24073,"apprenticeship":24074,"preferring":24075,"Value":24076,"originate":24077,"##wei":24078,"##olio":24079,"Alexia":24080,"##rog":24081,"Parachute":24082,"jammed":24083,"stud":24084,"Eton":24085,"vols":24086,"##ganized":24087,"1745":24088,"straining":24089,"creep":24090,"indicators":24091,"##mán":24092,"humiliation":24093,"hinted":24094,"alma":24095,"tanker":24096,"##egation":24097,"Haynes":24098,"Penang":24099,"amazement":24100,"branched":24101,"rumble":24102,"##ddington":24103,"archaeologists":24104,"paranoid":24105,"expenditure":24106,"Absolutely":24107,"Musicians":24108,"banished":24109,"##fining":24110,"baptism":24111,"Joker":24112,"Persons":24113,"hemisphere":24114,"##tieth":24115,"##ück":24116,"flock":24117,"##xing":24118,"lbs":24119,"Kung":24120,"crab":24121,"##dak":24122,"##tinent":24123,"Regulations":24124,"barrage":24125,"parcel":24126,"##ós":24127,"Tanaka":24128,"##rsa":24129,"Natalia":24130,"Voyage":24131,"flaws":24132,"stepfather":24133,"##aven":24134,"##eological":24135,"Botanical":24136,"Minsk":24137,"##ckers":24138,"Cinderella":24139,"Feast":24140,"Loving":24141,"Previous":24142,"Shark":24143,"##took":24144,"barrister":24145,"collaborators":24146,"##nnes":24147,"Croydon":24148,"Graeme":24149,"Juniors":24150,"##7th":24151,"##formation":24152,"##ulos":24153,"##ák":24154,"£2":24155,"##hwa":24156,"##rove":24157,"##ș":24158,"Whig":24159,"demeanor":24160,"Otago":24161,"##TH":24162,"##ooster":24163,"Faber":24164,"instructors":2416
5,"##ahl":24166,"##bha":24167,"emptied":24168,"##schen":24169,"saga":24170,"##lora":24171,"exploding":24172,"##rges":24173,"Crusaders":24174,"##caster":24175,"##uations":24176,"streaks":24177,"CBN":24178,"bows":24179,"insights":24180,"ka":24181,"1650":24182,"diversion":24183,"LSU":24184,"Wingspan":24185,"##liva":24186,"Response":24187,"sanity":24188,"Producers":24189,"imitation":24190,"##fine":24191,"Lange":24192,"Spokane":24193,"splash":24194,"weed":24195,"Siberian":24196,"magnet":24197,"##rocodile":24198,"capitals":24199,"##rgus":24200,"swelled":24201,"Rani":24202,"Bells":24203,"Silesia":24204,"arithmetic":24205,"rumor":24206,"##hampton":24207,"favors":24208,"Weird":24209,"marketplace":24210,"##orm":24211,"tsunami":24212,"unpredictable":24213,"##citation":24214,"##ferno":24215,"Tradition":24216,"postwar":24217,"stench":24218,"succeeds":24219,"##roup":24220,"Anya":24221,"Users":24222,"oversized":24223,"totaling":24224,"pouch":24225,"##nat":24226,"Tripoli":24227,"leverage":24228,"satin":24229,"##cline":24230,"Bathurst":24231,"Lund":24232,"Niall":24233,"thereof":24234,"##quid":24235,"Bangor":24236,"barge":24237,"Animated":24238,"##53":24239,"##alan":24240,"Ballard":24241,"utilizes":24242,"Done":24243,"ballistic":24244,"NDP":24245,"gatherings":24246,"##elin":24247,"##vening":24248,"Rockets":24249,"Sabrina":24250,"Tamara":24251,"Tribal":24252,"WTA":24253,"##citing":24254,"blinded":24255,"flux":24256,"Khalid":24257,"Una":24258,"prescription":24259,"##jee":24260,"Parents":24261,"##otics":24262,"##food":24263,"Silicon":24264,"cured":24265,"electro":24266,"perpendicular":24267,"intimacy":24268,"##rified":24269,"Lots":24270,"##ceiving":24271,"##powder":24272,"incentives":24273,"McKenna":24274,"##arma":24275,"##ounced":24276,"##rinkled":24277,"Alzheimer":24278,"##tarian":24279,"262":24280,"Seas":24281,"##cam":24282,"Novi":24283,"##hout":24284,"##morphic":24285,"##hazar":24286,"##hul":24287,"##nington":24288,"Huron":24289,"Bahadur":24290,"Pirate":24291,"pursed":24292,"Griffit
hs":24293,"indicted":24294,"swap":24295,"refrain":24296,"##mulating":24297,"Lal":24298,"stomped":24299,"##Pad":24300,"##mamoto":24301,"Reef":24302,"disposed":24303,"plastered":24304,"weeping":24305,"##rato":24306,"Minas":24307,"hourly":24308,"tumors":24309,"##ruising":24310,"Lyle":24311,"##yper":24312,"##sol":24313,"Odisha":24314,"credibility":24315,"##Dowell":24316,"Braun":24317,"Graphic":24318,"lurched":24319,"muster":24320,"##nex":24321,"##ührer":24322,"##connected":24323,"##iek":24324,"##ruba":24325,"Carthage":24326,"Peck":24327,"maple":24328,"bursting":24329,"##lava":24330,"Enrico":24331,"rite":24332,"##jak":24333,"Moment":24334,"##skar":24335,"Styx":24336,"poking":24337,"Spartan":24338,"##urney":24339,"Hepburn":24340,"Mart":24341,"Titanic":24342,"newsletter":24343,"waits":24344,"Mecklenburg":24345,"agitated":24346,"eats":24347,"##dious":24348,"Chow":24349,"matrices":24350,"Maud":24351,"##sexual":24352,"sermon":24353,"234":24354,"##sible":24355,"##lung":24356,"Qi":24357,"cemeteries":24358,"mined":24359,"sprinter":24360,"##ckett":24361,"coward":24362,"##gable":24363,"##hell":24364,"##thin":24365,"##FB":24366,"Contact":24367,"##hay":24368,"rainforest":24369,"238":24370,"Hemisphere":24371,"boasts":24372,"##nders":24373,"##verance":24374,"##kat":24375,"Convent":24376,"Dunedin":24377,"Lecturer":24378,"lyricist":24379,"##bject":24380,"Iberian":24381,"comune":24382,"##pphire":24383,"chunk":24384,"##boo":24385,"thrusting":24386,"fore":24387,"informing":24388,"pistols":24389,"echoes":24390,"Tier":24391,"battleships":24392,"substitution":24393,"##belt":24394,"moniker":24395,"##charya":24396,"##lland":24397,"Thoroughbred":24398,"38th":24399,"##01":24400,"##tah":24401,"parting":24402,"tongues":24403,"Cale":24404,"##seau":24405,"Unionist":24406,"modular":24407,"celebrates":24408,"preview":24409,"steamed":24410,"Bismarck":24411,"302":24412,"737":24413,"vamp":24414,"##finity":24415,"##nbridge":24416,"weaknesses":24417,"husky":24418,"##berman":24419,"absently":24420,"##icide":
24421,"Craven":24422,"tailored":24423,"Tokugawa":24424,"VIP":24425,"syntax":24426,"Kazan":24427,"captives":24428,"doses":24429,"filtered":24430,"overview":24431,"Cleopatra":24432,"Conversely":24433,"stallion":24434,"Burger":24435,"Suez":24436,"Raoul":24437,"th":24438,"##reaves":24439,"Dickson":24440,"Nell":24441,"Rate":24442,"anal":24443,"colder":24444,"##sław":24445,"Arm":24446,"Semitic":24447,"##green":24448,"reflective":24449,"1100":24450,"episcopal":24451,"journeys":24452,"##ours":24453,"##pository":24454,"##dering":24455,"residue":24456,"Gunn":24457,"##27":24458,"##ntial":24459,"##crates":24460,"##zig":24461,"Astros":24462,"Renee":24463,"Emerald":24464,"##vili":24465,"connectivity":24466,"undrafted":24467,"Sampson":24468,"treasures":24469,"##kura":24470,"##theon":24471,"##vern":24472,"Destroyer":24473,"##iable":24474,"##ener":24475,"Frederic":24476,"briefcase":24477,"confinement":24478,"Bree":24479,"##WD":24480,"Athena":24481,"233":24482,"Padres":24483,"Thom":24484,"speeding":24485,"##hali":24486,"Dental":24487,"ducks":24488,"Putin":24489,"##rcle":24490,"##lou":24491,"Asylum":24492,"##usk":24493,"dusk":24494,"pasture":24495,"Institutes":24496,"ONE":24497,"jack":24498,"##named":24499,"diplomacy":24500,"Intercontinental":24501,"Leagues":24502,"Towns":24503,"comedic":24504,"premature":24505,"##edic":24506,"##mona":24507,"##ories":24508,"trimmed":24509,"Charge":24510,"Cream":24511,"guarantees":24512,"Dmitry":24513,"splashed":24514,"Philosophical":24515,"tramway":24516,"##cape":24517,"Maynard":24518,"predatory":24519,"redundant":24520,"##gratory":24521,"##wry":24522,"sobs":24523,"Burgundy":24524,"edible":24525,"outfits":24526,"Handel":24527,"dazed":24528,"dangerously":24529,"idle":24530,"Operational":24531,"organizes":24532,"##sional":24533,"blackish":24534,"broker":24535,"weddings":24536,"##halt":24537,"Becca":24538,"McGee":24539,"##gman":24540,"protagonists":24541,"##pelling":24542,"Keynes":24543,"aux":24544,"stumble":24545,"##ordination":24546,"Nokia":24547,"reel
":24548,"sexes":24549,"##woods":24550,"##pheric":24551,"##quished":24552,"##voc":24553,"##oir":24554,"##pathian":24555,"##ptus":24556,"##sma":24557,"##tating":24558,"##ê":24559,"fulfilling":24560,"sheath":24561,"##ayne":24562,"Mei":24563,"Ordinary":24564,"Collin":24565,"Sharpe":24566,"grasses":24567,"interdisciplinary":24568,"##OX":24569,"Background":24570,"##ignment":24571,"Assault":24572,"transforms":24573,"Hamas":24574,"Serge":24575,"ratios":24576,"##sik":24577,"swaying":24578,"##rcia":24579,"Rosen":24580,"##gant":24581,"##versible":24582,"cinematographer":24583,"curly":24584,"penny":24585,"Kamal":24586,"Mellon":24587,"Sailor":24588,"Spence":24589,"phased":24590,"Brewers":24591,"amassed":24592,"Societies":24593,"##ropriations":24594,"##buted":24595,"mythological":24596,"##SN":24597,"##byss":24598,"##ired":24599,"Sovereign":24600,"preface":24601,"Parry":24602,"##ife":24603,"altitudes":24604,"crossings":24605,"##28":24606,"Crewe":24607,"southernmost":24608,"taut":24609,"McKinley":24610,"##owa":24611,"##tore":24612,"254":24613,"##ckney":24614,"compiling":24615,"Shelton":24616,"##hiko":24617,"228":24618,"Poll":24619,"Shepard":24620,"Labs":24621,"Pace":24622,"Carlson":24623,"grasping":24624,"##ов":24625,"Delaney":24626,"Winning":24627,"robotic":24628,"intentional":24629,"shattering":24630,"##boarding":24631,"##git":24632,"##grade":24633,"Editions":24634,"Reserves":24635,"ignorant":24636,"proposing":24637,"##hanna":24638,"cutter":24639,"Mongols":24640,"NW":24641,"##eux":24642,"Codex":24643,"Cristina":24644,"Daughters":24645,"Rees":24646,"forecast":24647,"##hita":24648,"NGOs":24649,"Stations":24650,"Beaux":24651,"Erwin":24652,"##jected":24653,"##EX":24654,"##trom":24655,"Schumacher":24656,"##hrill":24657,"##rophe":24658,"Maharaja":24659,"Oricon":24660,"##sul":24661,"##dynamic":24662,"##fighting":24663,"Ce":24664,"Ingrid":24665,"rumbled":24666,"Prospect":24667,"stairwell":24668,"Barnard":24669,"applause":24670,"complementary":24671,"##uba":24672,"grunt":24673,"##mented":
24674,"Bloc":24675,"Carleton":24676,"loft":24677,"noisy":24678,"##hey":24679,"490":24680,"contrasted":24681,"##inator":24682,"##rief":24683,"##centric":24684,"##fica":24685,"Cantonese":24686,"Blanc":24687,"Lausanne":24688,"License":24689,"artifact":24690,"##ddin":24691,"rot":24692,"Amongst":24693,"Prakash":24694,"RF":24695,"##topia":24696,"milestone":24697,"##vard":24698,"Winters":24699,"Mead":24700,"churchyard":24701,"Lulu":24702,"estuary":24703,"##ind":24704,"Cha":24705,"Infinity":24706,"Meadow":24707,"subsidies":24708,"##valent":24709,"CONCACAF":24710,"Ching":24711,"medicinal":24712,"navigate":24713,"Carver":24714,"Twice":24715,"abdominal":24716,"regulating":24717,"RB":24718,"toilets":24719,"Brewer":24720,"weakening":24721,"ambushed":24722,"##aut":24723,"##vignon":24724,"Lansing":24725,"unacceptable":24726,"reliance":24727,"stabbing":24728,"##mpo":24729,"##naire":24730,"Interview":24731,"##ested":24732,"##imed":24733,"bearings":24734,"##lts":24735,"Rashid":24736,"##iation":24737,"authenticity":24738,"vigorous":24739,"##frey":24740,"##uel":24741,"biologist":24742,"NFC":24743,"##rmaid":24744,"##wash":24745,"Makes":24746,"##aunt":24747,"##steries":24748,"withdrawing":24749,"##qa":24750,"Buccaneers":24751,"bleed":24752,"inclination":24753,"stain":24754,"##ilo":24755,"##ppel":24756,"Torre":24757,"privileged":24758,"cereal":24759,"trailers":24760,"alumnus":24761,"neon":24762,"Cochrane":24763,"Mariana":24764,"caress":24765,"##47":24766,"##ients":24767,"experimentation":24768,"Window":24769,"convict":24770,"signaled":24771,"##YP":24772,"rower":24773,"Pharmacy":24774,"interacting":24775,"241":24776,"Strings":24777,"dominating":24778,"kinase":24779,"Dinamo":24780,"Wire":24781,"pains":24782,"sensations":24783,"##suse":24784,"Twenty20":24785,"##39":24786,"spotlight":24787,"##hend":24788,"elemental":24789,"##pura":24790,"Jameson":24791,"Swindon":24792,"honoring":24793,"pained":24794,"##ediatric":24795,"##lux":24796,"Psychological":24797,"assemblies":24798,"ingredient":24799,"
Martial":24800,"Penguins":24801,"beverage":24802,"Monitor":24803,"mysteries":24804,"##ION":24805,"emigration":24806,"mused":24807,"##sique":24808,"crore":24809,"AMC":24810,"Funding":24811,"Chinatown":24812,"Establishment":24813,"Finalist":24814,"enjoyable":24815,"1756":24816,"##mada":24817,"##rams":24818,"NO":24819,"newborn":24820,"CS":24821,"comprehend":24822,"Invisible":24823,"Siemens":24824,"##acon":24825,"246":24826,"contraction":24827,"##volving":24828,"##moration":24829,"##rok":24830,"montane":24831,"##ntation":24832,"Galloway":24833,"##llow":24834,"Verity":24835,"directorial":24836,"pearl":24837,"Leaning":24838,"##rase":24839,"Fernandez":24840,"swallowing":24841,"Automatic":24842,"Madness":24843,"haunting":24844,"paddle":24845,"##UE":24846,"##rrows":24847,"##vies":24848,"##zuki":24849,"##bolt":24850,"##iber":24851,"Fender":24852,"emails":24853,"paste":24854,"##lancing":24855,"hind":24856,"homestead":24857,"hopeless":24858,"##dles":24859,"Rockies":24860,"garlic":24861,"fatty":24862,"shrieked":24863,"##ismic":24864,"Gillian":24865,"Inquiry":24866,"Schultz":24867,"XML":24868,"##cius":24869,"##uld":24870,"Domesday":24871,"grenades":24872,"northernmost":24873,"##igi":24874,"Tbilisi":24875,"optimistic":24876,"##poon":24877,"Refuge":24878,"stacks":24879,"Bose":24880,"smash":24881,"surreal":24882,"Nah":24883,"Straits":24884,"Conquest":24885,"##roo":24886,"##weet":24887,"##kell":24888,"Gladys":24889,"CH":24890,"##lim":24891,"##vitation":24892,"Doctorate":24893,"NRHP":24894,"knocks":24895,"Bey":24896,"Romano":24897,"##pile":24898,"242":24899,"Diamonds":24900,"strides":24901,"eclectic":24902,"Betsy":24903,"clade":24904,"##hady":24905,"##leashed":24906,"dissolve":24907,"moss":24908,"Suburban":24909,"silvery":24910,"##bria":24911,"tally":24912,"turtles":24913,"##uctive":24914,"finely":24915,"industrialist":24916,"##nary":24917,"Ernesto":24918,"oz":24919,"pact":24920,"loneliness":24921,"##hov":24922,"Tomb":24923,"multinational":24924,"risked":24925,"Layne":24926,"USL":2492
7,"ne":24928,"##quiries":24929,"Ad":24930,"Message":24931,"Kamen":24932,"Kristen":24933,"reefs":24934,"implements":24935,"##itative":24936,"educators":24937,"garments":24938,"gunshot":24939,"##essed":24940,"##rve":24941,"Montevideo":24942,"vigorously":24943,"Stamford":24944,"assemble":24945,"packaged":24946,"##same":24947,"état":24948,"Viva":24949,"paragraph":24950,"##eter":24951,"##wire":24952,"Stick":24953,"Navajo":24954,"MCA":24955,"##pressing":24956,"ensembles":24957,"ABA":24958,"##zor":24959,"##llus":24960,"Partner":24961,"raked":24962,"##BI":24963,"Iona":24964,"thump":24965,"Celeste":24966,"Kiran":24967,"##iscovered":24968,"##rith":24969,"inflammation":24970,"##arel":24971,"Features":24972,"loosened":24973,"##yclic":24974,"Deluxe":24975,"Speak":24976,"economical":24977,"Frankenstein":24978,"Picasso":24979,"showcased":24980,"##zad":24981,"##eira":24982,"##planes":24983,"##linear":24984,"##overs":24985,"monsoon":24986,"prosecutors":24987,"slack":24988,"Horses":24989,"##urers":24990,"Angry":24991,"coughing":24992,"##truder":24993,"Questions":24994,"##tō":24995,"##zak":24996,"challenger":24997,"clocks":24998,"##ieving":24999,"Newmarket":25000,"##acle":25001,"cursing":25002,"stimuli":25003,"##mming":25004,"##qualified":25005,"slapping":25006,"##vasive":25007,"narration":25008,"##kini":25009,"Advertising":25010,"CSI":25011,"alliances":25012,"mixes":25013,"##yes":25014,"covert":25015,"amalgamation":25016,"reproduced":25017,"##ardt":25018,"##gis":25019,"1648":25020,"id":25021,"Annette":25022,"Boots":25023,"Champagne":25024,"Brest":25025,"Daryl":25026,"##emon":25027,"##jou":25028,"##llers":25029,"Mean":25030,"adaptive":25031,"technicians":25032,"##pair":25033,"##usal":25034,"Yoga":25035,"fronts":25036,"leaping":25037,"Jul":25038,"harvesting":25039,"keel":25040,"##44":25041,"petitioned":25042,"##lved":25043,"yells":25044,"Endowment":25045,"proponent":25046,"##spur":25047,"##tised":25048,"##zal":25049,"Homes":25050,"Includes":25051,"##ifer":25052,"##oodoo":25053,"##rvett
e":25054,"awarding":25055,"mirrored":25056,"ransom":25057,"Flute":25058,"outlook":25059,"##ganj":25060,"DVDs":25061,"Sufi":25062,"frontman":25063,"Goddard":25064,"barren":25065,"##astic":25066,"Suicide":25067,"hillside":25068,"Harlow":25069,"Lau":25070,"notions":25071,"Amnesty":25072,"Homestead":25073,"##irt":25074,"GE":25075,"hooded":25076,"umpire":25077,"mustered":25078,"Catch":25079,"Masonic":25080,"##erd":25081,"Dynamics":25082,"Equity":25083,"Oro":25084,"Charts":25085,"Mussolini":25086,"populace":25087,"muted":25088,"accompaniment":25089,"##lour":25090,"##ndes":25091,"ignited":25092,"##iferous":25093,"##laced":25094,"##atch":25095,"anguish":25096,"registry":25097,"##tub":25098,"##hards":25099,"##neer":25100,"251":25101,"Hooker":25102,"uncomfortably":25103,"##6th":25104,"##ivers":25105,"Catalina":25106,"MiG":25107,"giggling":25108,"1754":25109,"Dietrich":25110,"Kaladin":25111,"pricing":25112,"##quence":25113,"Sabah":25114,"##lving":25115,"##nical":25116,"Gettysburg":25117,"Vita":25118,"Telecom":25119,"Worst":25120,"Palais":25121,"Pentagon":25122,"##brand":25123,"##chichte":25124,"Graf":25125,"unnatural":25126,"1715":25127,"bio":25128,"##26":25129,"Radcliffe":25130,"##utt":25131,"chatting":25132,"spices":25133,"##aus":25134,"untouched":25135,"##eper":25136,"Doll":25137,"turkey":25138,"Syndicate":25139,"##rlene":25140,"##JP":25141,"##roots":25142,"Como":25143,"clashed":25144,"modernization":25145,"1757":25146,"fantasies":25147,"##iating":25148,"dissipated":25149,"Sicilian":25150,"inspect":25151,"sensible":25152,"reputed":25153,"##final":25154,"Milford":25155,"poised":25156,"RC":25157,"metabolic":25158,"Tobacco":25159,"Mecca":25160,"optimization":25161,"##heat":25162,"lobe":25163,"rabbits":25164,"NAS":25165,"geologist":25166,"##liner":25167,"Kilda":25168,"carpenter":25169,"nationalists":25170,"##brae":25171,"summarized":25172,"##venge":25173,"Designer":25174,"misleading":25175,"beamed":25176,"##meyer":25177,"Matrix":25178,"excuses":25179,"##aines":25180,"##biology"
:25181,"401":25182,"Moose":25183,"drafting":25184,"Sai":25185,"##ggle":25186,"Comprehensive":25187,"dripped":25188,"skate":25189,"##WI":25190,"##enan":25191,"##ruk":25192,"narrower":25193,"outgoing":25194,"##enter":25195,"##nounce":25196,"overseen":25197,"##structure":25198,"travellers":25199,"banging":25200,"scarred":25201,"##thing":25202,"##arra":25203,"Ebert":25204,"Sometime":25205,"##nated":25206,"BAFTA":25207,"Hurricanes":25208,"configurations":25209,"##MLL":25210,"immortality":25211,"##heus":25212,"gothic":25213,"##mpest":25214,"clergyman":25215,"viewpoint":25216,"Maxim":25217,"Instituto":25218,"emitted":25219,"quantitative":25220,"1689":25221,"Consortium":25222,"##rsk":25223,"Meat":25224,"Tao":25225,"swimmers":25226,"Shaking":25227,"Terence":25228,"mainline":25229,"##linity":25230,"Quantum":25231,"##rogate":25232,"Nair":25233,"banquet":25234,"39th":25235,"reprised":25236,"lagoon":25237,"subdivisions":25238,"synonymous":25239,"incurred":25240,"password":25241,"sprung":25242,"##vere":25243,"Credits":25244,"Petersen":25245,"Faces":25246,"##vu":25247,"statesman":25248,"Zombie":25249,"gesturing":25250,"##going":25251,"Sergey":25252,"dormant":25253,"possessive":25254,"totals":25255,"southward":25256,"Ángel":25257,"##odies":25258,"HM":25259,"Mariano":25260,"Ramirez":25261,"Wicked":25262,"impressions":25263,"##Net":25264,"##cap":25265,"##ème":25266,"Transformers":25267,"Poker":25268,"RIAA":25269,"Redesignated":25270,"##chuk":25271,"Harcourt":25272,"Peña":25273,"spacious":25274,"tinged":25275,"alternatively":25276,"narrowing":25277,"Brigham":25278,"authorization":25279,"Membership":25280,"Zeppelin":25281,"##amed":25282,"Handball":25283,"steer":25284,"##orium":25285,"##rnal":25286,"##rops":25287,"Committees":25288,"endings":25289,"##MM":25290,"##yung":25291,"ejected":25292,"grams":25293,"##relli":25294,"Birch":25295,"Hilary":25296,"Stadion":25297,"orphan":25298,"clawed":25299,"##kner":25300,"Motown":25301,"Wilkins":25302,"ballads":25303,"outspoken":25304,"##ancipation"
:25305,"##bankment":25306,"##cheng":25307,"Advances":25308,"harvested":25309,"novelty":25310,"ineligible":25311,"oversees":25312,"##´s":25313,"obeyed":25314,"inevitably":25315,"Kingdoms":25316,"burying":25317,"Fabian":25318,"relevance":25319,"Tatiana":25320,"##MCA":25321,"sarcastic":25322,"##onda":25323,"Akron":25324,"229":25325,"sandwiches":25326,"Adobe":25327,"Maddox":25328,"##azar":25329,"Hunting":25330,"##onized":25331,"Smiling":25332,"##tology":25333,"Juventus":25334,"Leroy":25335,"Poets":25336,"attach":25337,"lo":25338,"##rly":25339,"##film":25340,"Structure":25341,"##igate":25342,"olds":25343,"projections":25344,"SMS":25345,"outnumbered":25346,"##tase":25347,"judiciary":25348,"paramilitary":25349,"playfully":25350,"##rsing":25351,"##tras":25352,"Chico":25353,"Vin":25354,"informally":25355,"abandonment":25356,"##russ":25357,"Baroness":25358,"injuring":25359,"octagonal":25360,"deciduous":25361,"##nea":25362,"##olm":25363,"Hz":25364,"Norwood":25365,"poses":25366,"Marissa":25367,"alerted":25368,"willed":25369,"##KS":25370,"Dino":25371,"##ddler":25372,"##vani":25373,"Barbie":25374,"Thankfully":25375,"625":25376,"bicycles":25377,"shimmering":25378,"##tinuum":25379,"##wolf":25380,"Chesterfield":25381,"##idy":25382,"##urgency":25383,"Knowles":25384,"sweetly":25385,"Ventures":25386,"##ponents":25387,"##valence":25388,"Darryl":25389,"Powerplant":25390,"RAAF":25391,"##pec":25392,"Kingsley":25393,"Parramatta":25394,"penetrating":25395,"spectacle":25396,"##inia":25397,"Marlborough":25398,"residual":25399,"compatibility":25400,"hike":25401,"Underwood":25402,"depleted":25403,"ministries":25404,"##odus":25405,"##ropriation":25406,"rotting":25407,"Faso":25408,"##inn":25409,"Happiness":25410,"Lille":25411,"Suns":25412,"cookie":25413,"rift":25414,"warmly":25415,"##lvin":25416,"Bugs":25417,"Gotham":25418,"Gothenburg":25419,"Properties":25420,"##seller":25421,"##ubi":25422,"Created":25423,"MAC":25424,"Noelle":25425,"Requiem":25426,"Ulysses":25427,"##ails":25428,"franchises":25429
,"##icious":25430,"##rwick":25431,"celestial":25432,"kinetic":25433,"720":25434,"STS":25435,"transmissions":25436,"amplitude":25437,"forums":25438,"freeing":25439,"reptiles":25440,"tumbling":25441,"##continent":25442,"##rising":25443,"##tropy":25444,"physiology":25445,"##uster":25446,"Loves":25447,"bodied":25448,"neutrality":25449,"Neumann":25450,"assessments":25451,"Vicky":25452,"##hom":25453,"hampered":25454,"##uku":25455,"Custom":25456,"timed":25457,"##eville":25458,"##xious":25459,"elastic":25460,"##section":25461,"rig":25462,"stilled":25463,"shipment":25464,"243":25465,"artworks":25466,"boulders":25467,"Bournemouth":25468,"##hly":25469,"##LF":25470,"##linary":25471,"rumored":25472,"##bino":25473,"##drum":25474,"Chun":25475,"Freiburg":25476,"##dges":25477,"Equality":25478,"252":25479,"Guadalajara":25480,"##sors":25481,"##taire":25482,"Roach":25483,"cramped":25484,"##ultural":25485,"Logistics":25486,"Punch":25487,"fines":25488,"Lai":25489,"caravan":25490,"##55":25491,"lame":25492,"Collector":25493,"pausing":25494,"315":25495,"migrant":25496,"hawk":25497,"signalling":25498,"##erham":25499,"##oughs":25500,"Demons":25501,"surfing":25502,"Rana":25503,"insisting":25504,"Wien":25505,"adolescent":25506,"##jong":25507,"##rera":25508,"##umba":25509,"Regis":25510,"brushes":25511,"##iman":25512,"residues":25513,"storytelling":25514,"Consider":25515,"contrasting":25516,"regeneration":25517,"##elling":25518,"##hlete":25519,"afforded":25520,"reactors":25521,"costing":25522,"##biotics":25523,"##gat":25524,"##евич":25525,"chanting":25526,"secondly":25527,"confesses":25528,"##ikos":25529,"##uang":25530,"##ronological":25531,"##−":25532,"Giacomo":25533,"##eca":25534,"vaudeville":25535,"weeds":25536,"rejecting":25537,"revoked":25538,"affluent":25539,"fullback":25540,"progresses":25541,"geologic":25542,"proprietor":25543,"replication":25544,"gliding":25545,"recounted":25546,"##bah":25547,"##igma":25548,"Flow":25549,"ii":25550,"newcomer":25551,"##lasp":25552,"##miya":25553,"Candace":
25554,"fractured":25555,"interiors":25556,"confidential":25557,"Inverness":25558,"footing":25559,"##robe":25560,"Coordinator":25561,"Westphalia":25562,"jumper":25563,"##chism":25564,"dormitory":25565,"##gno":25566,"281":25567,"acknowledging":25568,"leveled":25569,"##éra":25570,"Algiers":25571,"migrate":25572,"Frog":25573,"Rare":25574,"##iovascular":25575,"##urous":25576,"DSO":25577,"nomadic":25578,"##iera":25579,"woken":25580,"lifeless":25581,"##graphical":25582,"##ifications":25583,"Dot":25584,"Sachs":25585,"crow":25586,"nmi":25587,"Tacoma":25588,"Weight":25589,"mushroom":25590,"RS":25591,"conditioned":25592,"##zine":25593,"Tunisian":25594,"altering":25595,"##mizing":25596,"Handicap":25597,"Patti":25598,"Monsieur":25599,"clicking":25600,"gorge":25601,"interrupting":25602,"##powerment":25603,"drawers":25604,"Serra":25605,"##icides":25606,"Specialist":25607,"##itte":25608,"connector":25609,"worshipped":25610,"##ask":25611,"consoles":25612,"tags":25613,"##iler":25614,"glued":25615,"##zac":25616,"fences":25617,"Bratislava":25618,"honeymoon":25619,"313":25620,"A2":25621,"disposition":25622,"Gentleman":25623,"Gilmore":25624,"glaciers":25625,"##scribed":25626,"Calhoun":25627,"convergence":25628,"Aleppo":25629,"shortages":25630,"##43":25631,"##orax":25632,"##worm":25633,"##codes":25634,"##rmal":25635,"neutron":25636,"##ossa":25637,"Bloomberg":25638,"Salford":25639,"periodicals":25640,"##ryan":25641,"Slayer":25642,"##ynasties":25643,"credentials":25644,"##tista":25645,"surveyor":25646,"File":25647,"stinging":25648,"unnoticed":25649,"Medici":25650,"ecstasy":25651,"espionage":25652,"Jett":25653,"Leary":25654,"circulating":25655,"bargaining":25656,"concerto":25657,"serviced":25658,"37th":25659,"HK":25660,"##fueling":25661,"Delilah":25662,"Marcia":25663,"graded":25664,"##join":25665,"Kaplan":25666,"feasible":25667,"##nale":25668,"##yt":25669,"Burnley":25670,"dreadful":25671,"ministerial":25672,"Brewster":25673,"Judah":25674,"##ngled":25675,"##rrey":25676,"recycled":25677,"Iroqu
ois":25678,"backstage":25679,"parchment":25680,"##numbered":25681,"Kern":25682,"Motorsports":25683,"Organizations":25684,"##mini":25685,"Seems":25686,"Warrington":25687,"Dunbar":25688,"Ezio":25689,"##eor":25690,"paralyzed":25691,"Ara":25692,"yeast":25693,"##olis":25694,"cheated":25695,"reappeared":25696,"banged":25697,"##ymph":25698,"##dick":25699,"Lyndon":25700,"glide":25701,"Mat":25702,"##natch":25703,"Hotels":25704,"Household":25705,"parasite":25706,"irrelevant":25707,"youthful":25708,"##smic":25709,"##tero":25710,"##anti":25711,"2d":25712,"Ignacio":25713,"squash":25714,"##nets":25715,"shale":25716,"##اد":25717,"Abrams":25718,"##oese":25719,"assaults":25720,"##dier":25721,"##otte":25722,"Swamp":25723,"287":25724,"Spurs":25725,"##economic":25726,"Fargo":25727,"auditioned":25728,"##mé":25729,"Haas":25730,"une":25731,"abbreviation":25732,"Turkic":25733,"##tisfaction":25734,"favorites":25735,"specials":25736,"##lial":25737,"Enlightenment":25738,"Burkina":25739,"##vir":25740,"Comparative":25741,"Lacrosse":25742,"elves":25743,"##lerical":25744,"##pear":25745,"Borders":25746,"controllers":25747,"##villa":25748,"excelled":25749,"##acher":25750,"##varo":25751,"camouflage":25752,"perpetual":25753,"##ffles":25754,"devoid":25755,"schooner":25756,"##bered":25757,"##oris":25758,"Gibbons":25759,"Lia":25760,"discouraged":25761,"sue":25762,"##gnition":25763,"Excellent":25764,"Layton":25765,"noir":25766,"smack":25767,"##ivable":25768,"##evity":25769,"##lone":25770,"Myra":25771,"weaken":25772,"weaponry":25773,"##azza":25774,"Shake":25775,"backbone":25776,"Certified":25777,"clown":25778,"occupational":25779,"caller":25780,"enslaved":25781,"soaking":25782,"Wexford":25783,"perceive":25784,"shortlisted":25785,"##pid":25786,"feminism":25787,"Bari":25788,"Indie":25789,"##avelin":25790,"##ldo":25791,"Hellenic":25792,"Hundreds":25793,"Savings":25794,"comedies":25795,"Honors":25796,"Mohawk":25797,"Told":25798,"coded":25799,"Incorporated":25800,"hideous":25801,"trusts":25802,"hose":25803,"Ca
lais":25804,"Forster":25805,"Gabon":25806,"Internationale":25807,"AK":25808,"Colour":25809,"##UM":25810,"##heist":25811,"McGregor":25812,"localized":25813,"##tronomy":25814,"Darrell":25815,"##iara":25816,"squirrel":25817,"freaked":25818,"##eking":25819,"##manned":25820,"##ungen":25821,"radiated":25822,"##dua":25823,"commence":25824,"Donaldson":25825,"##iddle":25826,"MR":25827,"SAS":25828,"Tavern":25829,"Teenage":25830,"admissions":25831,"Instruments":25832,"##ilizer":25833,"Konrad":25834,"contemplated":25835,"##ductor":25836,"Jing":25837,"Reacher":25838,"recalling":25839,"Dhabi":25840,"emphasizing":25841,"illumination":25842,"##tony":25843,"legitimacy":25844,"Goethe":25845,"Ritter":25846,"McDonnell":25847,"Polar":25848,"Seconds":25849,"aspiring":25850,"derby":25851,"tunic":25852,"##rmed":25853,"outlines":25854,"Changing":25855,"distortion":25856,"##cter":25857,"Mechanics":25858,"##urly":25859,"##vana":25860,"Egg":25861,"Wolverine":25862,"Stupid":25863,"centralized":25864,"knit":25865,"##Ms":25866,"Saratoga":25867,"Ogden":25868,"storylines":25869,"##vres":25870,"lavish":25871,"beverages":25872,"##grarian":25873,"Kyrgyzstan":25874,"forcefully":25875,"superb":25876,"Elm":25877,"Thessaloniki":25878,"follower":25879,"Plants":25880,"slang":25881,"trajectory":25882,"Nowadays":25883,"Bengals":25884,"Ingram":25885,"perch":25886,"coloring":25887,"carvings":25888,"doubtful":25889,"##aph":25890,"##gratulations":25891,"##41":25892,"Curse":25893,"253":25894,"nightstand":25895,"Campo":25896,"Meiji":25897,"decomposition":25898,"##giri":25899,"McCormick":25900,"Yours":25901,"##amon":25902,"##bang":25903,"Texans":25904,"injunction":25905,"organise":25906,"periodical":25907,"##peculative":25908,"oceans":25909,"##aley":25910,"Success":25911,"Lehigh":25912,"##guin":25913,"1730":25914,"Davy":25915,"allowance":25916,"obituary":25917,"##tov":25918,"treasury":25919,"##wayne":25920,"euros":25921,"readiness":25922,"systematically":25923,"##stered":25924,"##igor":25925,"##xen":25926,"##cliff":
25927,"##lya":25928,"Send":25929,"##umatic":25930,"Celtics":25931,"Judiciary":25932,"425":25933,"propagation":25934,"rebellious":25935,"##ims":25936,"##lut":25937,"Dal":25938,"##ayman":25939,"##cloth":25940,"Boise":25941,"pairing":25942,"Waltz":25943,"torment":25944,"Hatch":25945,"aspirations":25946,"diaspora":25947,"##hame":25948,"Rank":25949,"237":25950,"Including":25951,"Muir":25952,"chained":25953,"toxicity":25954,"Université":25955,"##aroo":25956,"Mathews":25957,"meadows":25958,"##bio":25959,"Editing":25960,"Khorasan":25961,"##them":25962,"##ahn":25963,"##bari":25964,"##umes":25965,"evacuate":25966,"##sium":25967,"gram":25968,"kidnap":25969,"pinning":25970,"##diation":25971,"##orms":25972,"beacon":25973,"organising":25974,"McGrath":25975,"##ogist":25976,"Qur":25977,"Tango":25978,"##ceptor":25979,"##rud":25980,"##cend":25981,"##cie":25982,"##jas":25983,"##sided":25984,"Tuscany":25985,"Venture":25986,"creations":25987,"exhibiting":25988,"##rcerer":25989,"##tten":25990,"Butcher":25991,"Divinity":25992,"Pet":25993,"Whitehead":25994,"falsely":25995,"perished":25996,"handy":25997,"Moines":25998,"cyclists":25999,"synthesizers":26000,"Mortal":26001,"notoriety":26002,"##ronic":26003,"Dialogue":26004,"expressive":26005,"uk":26006,"Nightingale":26007,"grimly":26008,"vineyards":26009,"Driving":26010,"relentless":26011,"compiler":26012,"##district":26013,"##tuated":26014,"Hades":26015,"medicines":26016,"objection":26017,"Answer":26018,"Soap":26019,"Chattanooga":26020,"##gogue":26021,"Haryana":26022,"Parties":26023,"Turtle":26024,"##ferred":26025,"explorers":26026,"stakeholders":26027,"##aar":26028,"##rbonne":26029,"tempered":26030,"conjecture":26031,"##tee":26032,"##hur":26033,"Reeve":26034,"bumper":26035,"stew":26036,"##church":26037,"##generate":26038,"##ilitating":26039,"##chanized":26040,"##elier":26041,"##enne":26042,"translucent":26043,"##lows":26044,"Publisher":26045,"evangelical":26046,"inherit":26047,"##rted":26048,"247":26049,"SmackDown":26050,"bitterness":26051,"
lesions":26052,"##worked":26053,"mosques":26054,"wed":26055,"##lashes":26056,"Ng":26057,"Rebels":26058,"booking":26059,"##nail":26060,"Incident":26061,"Sailing":26062,"yo":26063,"confirms":26064,"Chaplin":26065,"baths":26066,"##kled":26067,"modernist":26068,"pulsing":26069,"Cicero":26070,"slaughtered":26071,"boasted":26072,"##losure":26073,"zipper":26074,"##hales":26075,"aristocracy":26076,"halftime":26077,"jolt":26078,"unlawful":26079,"Marching":26080,"sustaining":26081,"Yerevan":26082,"bracket":26083,"ram":26084,"Markus":26085,"##zef":26086,"butcher":26087,"massage":26088,"##quisite":26089,"Leisure":26090,"Pizza":26091,"collapsing":26092,"##lante":26093,"commentaries":26094,"scripted":26095,"##disciplinary":26096,"##sused":26097,"eroded":26098,"alleging":26099,"vase":26100,"Chichester":26101,"Peacock":26102,"commencement":26103,"dice":26104,"hotter":26105,"poisonous":26106,"executions":26107,"##occo":26108,"frost":26109,"fielding":26110,"vendor":26111,"Counts":26112,"Troops":26113,"maize":26114,"Divisional":26115,"analogue":26116,"shadowy":26117,"Nuevo":26118,"Ville":26119,"radiating":26120,"worthless":26121,"Adriatic":26122,"Buy":26123,"blaze":26124,"brutally":26125,"horizontally":26126,"longed":26127,"##matical":26128,"federally":26129,"Rolf":26130,"Root":26131,"exclude":26132,"rag":26133,"agitation":26134,"Lounge":26135,"astonished":26136,"##wirl":26137,"Impossible":26138,"transformations":26139,"##IVE":26140,"##ceded":26141,"##slav":26142,"downloaded":26143,"fucked":26144,"Egyptians":26145,"Welles":26146,"##ffington":26147,"U2":26148,"befriended":26149,"radios":26150,"##jid":26151,"archaic":26152,"compares":26153,"##ccelerator":26154,"##imated":26155,"##tosis":26156,"Hung":26157,"Scientists":26158,"Thousands":26159,"geographically":26160,"##LR":26161,"Macintosh":26162,"fluorescent":26163,"##ipur":26164,"Wehrmacht":26165,"##BR":26166,"##firmary":26167,"Chao":26168,"##ague":26169,"Boyer":26170,"##grounds":26171,"##hism":26172,"##mento":26173,"##taining":26174,"i
nfancy":26175,"##cton":26176,"510":26177,"Boca":26178,"##loy":26179,"1644":26180,"ben":26181,"dong":26182,"stresses":26183,"Sweat":26184,"expressway":26185,"graders":26186,"ochreous":26187,"nets":26188,"Lawn":26189,"thirst":26190,"Uruguayan":26191,"satisfactory":26192,"##tracts":26193,"baroque":26194,"rusty":26195,"##ław":26196,"Shen":26197,"Gdańsk":26198,"chickens":26199,"##graving":26200,"Hodge":26201,"Papal":26202,"SAT":26203,"bearer":26204,"##ogo":26205,"##rger":26206,"merits":26207,"Calendar":26208,"Highest":26209,"Skills":26210,"##ortex":26211,"Roberta":26212,"paradigm":26213,"recounts":26214,"frigates":26215,"swamps":26216,"unitary":26217,"##oker":26218,"balloons":26219,"Hawthorne":26220,"Muse":26221,"spurred":26222,"advisors":26223,"reclaimed":26224,"stimulate":26225,"fibre":26226,"pat":26227,"repeal":26228,"##dgson":26229,"##iar":26230,"##rana":26231,"anthropologist":26232,"descends":26233,"flinch":26234,"reared":26235,"##chang":26236,"##eric":26237,"##lithic":26238,"commissioning":26239,"##cumenical":26240,"##lume":26241,"##rchen":26242,"Wolff":26243,"##tsky":26244,"Eurasian":26245,"Nepali":26246,"Nightmare":26247,"ZIP":26248,"playback":26249,"##latz":26250,"##vington":26251,"Warm":26252,"##75":26253,"Martina":26254,"Rollins":26255,"Saetan":26256,"Variations":26257,"sorting":26258,"##م":26259,"530":26260,"Joaquin":26261,"Ptolemy":26262,"thinner":26263,"##iator":26264,"##pticism":26265,"Cebu":26266,"Highlanders":26267,"Linden":26268,"Vanguard":26269,"##SV":26270,"##mor":26271,"##ulge":26272,"ISSN":26273,"cartridges":26274,"repression":26275,"Étienne":26276,"311":26277,"Lauderdale":26278,"commodities":26279,"null":26280,"##rb":26281,"1720":26282,"gearbox":26283,"##reator":26284,"Ang":26285,"Forgotten":26286,"dubious":26287,"##rls":26288,"##dicative":26289,"##phate":26290,"Groove":26291,"Herrera":26292,"##çais":26293,"Collections":26294,"Maximus":26295,"##published":26296,"Fell":26297,"Qualification":26298,"filtering":26299,"##tized":26300,"Roe":26301,"hazard
s":26302,"##37":26303,"##lative":26304,"##tröm":26305,"Guadalupe":26306,"Tajikistan":26307,"Preliminary":26308,"fronted":26309,"glands":26310,"##paper":26311,"##iche":26312,"##iding":26313,"Cairns":26314,"rallies":26315,"Location":26316,"seduce":26317,"##mple":26318,"BYU":26319,"##itic":26320,"##FT":26321,"Carmichael":26322,"Prentice":26323,"songwriters":26324,"forefront":26325,"Physicians":26326,"##rille":26327,"##zee":26328,"Preparatory":26329,"##cherous":26330,"UV":26331,"##dized":26332,"Navarro":26333,"misses":26334,"##nney":26335,"Inland":26336,"resisting":26337,"##sect":26338,"Hurt":26339,"##lino":26340,"galaxies":26341,"##raze":26342,"Institutions":26343,"devote":26344,"##lamp":26345,"##ciating":26346,"baron":26347,"##bracing":26348,"Hess":26349,"operatic":26350,"##CL":26351,"##ος":26352,"Chevalier":26353,"Guiana":26354,"##lattered":26355,"Fed":26356,"##cuted":26357,"##smo":26358,"Skull":26359,"denies":26360,"236":26361,"Waller":26362,"##mah":26363,"Sakura":26364,"mole":26365,"nominate":26366,"sermons":26367,"##bering":26368,"widowed":26369,"##röm":26370,"Cavendish":26371,"##struction":26372,"Nehru":26373,"Revelation":26374,"doom":26375,"Gala":26376,"baking":26377,"Nr":26378,"Yourself":26379,"banning":26380,"Individuals":26381,"Sykes":26382,"orchestrated":26383,"630":26384,"Phone":26385,"steered":26386,"620":26387,"specialising":26388,"starvation":26389,"##AV":26390,"##alet":26391,"##upation":26392,"seductive":26393,"##jects":26394,"##zure":26395,"Tolkien":26396,"Benito":26397,"Wizards":26398,"Submarine":26399,"dictator":26400,"Duo":26401,"Caden":26402,"approx":26403,"basins":26404,"##nc":26405,"shrink":26406,"##icles":26407,"##sponsible":26408,"249":26409,"mit":26410,"outpost":26411,"##bayashi":26412,"##rouse":26413,"##tl":26414,"Jana":26415,"Lombard":26416,"RBIs":26417,"finalized":26418,"humanities":26419,"##function":26420,"Honorable":26421,"tomato":26422,"##iot":26423,"Pie":26424,"tee":26425,"##pect":26426,"Beaufort":26427,"Ferris":26428,"bucks":26429,"##
graduate":26430,"##ocytes":26431,"Directory":26432,"anxiously":26433,"##nating":26434,"flanks":26435,"##Ds":26436,"virtues":26437,"##believable":26438,"Grades":26439,"criterion":26440,"manufactures":26441,"sourced":26442,"##balt":26443,"##dance":26444,"##tano":26445,"Ying":26446,"##BF":26447,"##sett":26448,"adequately":26449,"blacksmith":26450,"totaled":26451,"trapping":26452,"expanse":26453,"Historia":26454,"Worker":26455,"Sense":26456,"ascending":26457,"housekeeper":26458,"##oos":26459,"Crafts":26460,"Resurrection":26461,"##verty":26462,"encryption":26463,"##aris":26464,"##vat":26465,"##pox":26466,"##runk":26467,"##iability":26468,"gazes":26469,"spying":26470,"##ths":26471,"helmets":26472,"wired":26473,"##zophrenia":26474,"Cheung":26475,"WR":26476,"downloads":26477,"stereotypes":26478,"239":26479,"Lucknow":26480,"bleak":26481,"Bragg":26482,"hauling":26483,"##haft":26484,"prohibit":26485,"##ermined":26486,"##castle":26487,"barony":26488,"##hta":26489,"Typhoon":26490,"antibodies":26491,"##ascism":26492,"Hawthorn":26493,"Kurdistan":26494,"Minority":26495,"Gorge":26496,"Herr":26497,"appliances":26498,"disrupt":26499,"Drugs":26500,"Lazarus":26501,"##ilia":26502,"##ryo":26503,"##tany":26504,"Gotta":26505,"Masovian":26506,"Roxy":26507,"choreographed":26508,"##rissa":26509,"turbulent":26510,"##listed":26511,"Anatomy":26512,"exiting":26513,"##det":26514,"##isław":26515,"580":26516,"Kaufman":26517,"sage":26518,"##apa":26519,"Symposium":26520,"##rolls":26521,"Kaye":26522,"##ptera":26523,"##rocław":26524,"jerking":26525,"##menclature":26526,"Guo":26527,"M1":26528,"resurrected":26529,"trophies":26530,"##lard":26531,"Gathering":26532,"nestled":26533,"serpent":26534,"Dow":26535,"reservoirs":26536,"Claremont":26537,"arbitration":26538,"chronicle":26539,"eki":26540,"##arded":26541,"##zers":26542,"##mmoth":26543,"Congregational":26544,"Astronomical":26545,"NE":26546,"RA":26547,"Robson":26548,"Scotch":26549,"modelled":26550,"slashed":26551,"##imus":26552,"exceeds":26553,"##roper":26
554,"##utile":26555,"Laughing":26556,"vascular":26557,"superficial":26558,"##arians":26559,"Barclay":26560,"Caucasian":26561,"classmate":26562,"sibling":26563,"Kimberly":26564,"Shreveport":26565,"##ilde":26566,"##liche":26567,"Cheney":26568,"Deportivo":26569,"Veracruz":26570,"berries":26571,"##lase":26572,"Bed":26573,"MI":26574,"Anatolia":26575,"Mindanao":26576,"broadband":26577,"##olia":26578,"##arte":26579,"##wab":26580,"darts":26581,"##immer":26582,"##uze":26583,"believers":26584,"ordinance":26585,"violate":26586,"##wheel":26587,"##ynth":26588,"Alongside":26589,"Coupe":26590,"Hobbs":26591,"arrondissement":26592,"earl":26593,"townland":26594,"##dote":26595,"##lihood":26596,"##sla":26597,"Ghosts":26598,"midfield":26599,"pulmonary":26600,"##eno":26601,"cues":26602,"##gol":26603,"##zda":26604,"322":26605,"Siena":26606,"Sultanate":26607,"Bradshaw":26608,"Pieter":26609,"##thical":26610,"Raceway":26611,"bared":26612,"competence":26613,"##ssent":26614,"Bet":26615,"##urer":26616,"##ła":26617,"Alistair":26618,"Göttingen":26619,"appropriately":26620,"forge":26621,"##osterone":26622,"##ugen":26623,"DL":26624,"345":26625,"convoys":26626,"inventions":26627,"##resses":26628,"##cturnal":26629,"Fay":26630,"Integration":26631,"slash":26632,"##roats":26633,"Widow":26634,"barking":26635,"##fant":26636,"1A":26637,"Hooper":26638,"##cona":26639,"##runched":26640,"unreliable":26641,"##emont":26642,"##esign":26643,"##stabulary":26644,"##stop":26645,"Journalists":26646,"bony":26647,"##iba":26648,"##trata":26649,"##ège":26650,"horrific":26651,"##bish":26652,"Jocelyn":26653,"##rmon":26654,"##apon":26655,"##cier":26656,"trainers":26657,"##ulatory":26658,"1753":26659,"BR":26660,"corpus":26661,"synthesized":26662,"##bidden":26663,"##rafford":26664,"Elgin":26665,"##entry":26666,"Doherty":26667,"clockwise":26668,"##played":26669,"spins":26670,"##ample":26671,"##bley":26672,"Cope":26673,"constructions":26674,"seater":26675,"warlord":26676,"Voyager":26677,"documenting":26678,"fairies":26679,"##via
tor":26680,"Lviv":26681,"jewellery":26682,"suites":26683,"##gold":26684,"Maia":26685,"NME":26686,"##eavor":26687,"##kus":26688,"Eugène":26689,"furnishings":26690,"##risto":26691,"MCC":26692,"Metropolis":26693,"Older":26694,"Telangana":26695,"##mpus":26696,"amplifier":26697,"supervising":26698,"1710":26699,"buffalo":26700,"cushion":26701,"terminating":26702,"##powering":26703,"steak":26704,"Quickly":26705,"contracting":26706,"dem":26707,"sarcastically":26708,"Elsa":26709,"##hein":26710,"bastards":26711,"narratives":26712,"Takes":26713,"304":26714,"composure":26715,"typing":26716,"variance":26717,"##ifice":26718,"Softball":26719,"##rations":26720,"McLaughlin":26721,"gaped":26722,"shrines":26723,"##hogany":26724,"Glamorgan":26725,"##icle":26726,"##nai":26727,"##ntin":26728,"Fleetwood":26729,"Woodland":26730,"##uxe":26731,"fictitious":26732,"shrugs":26733,"##iper":26734,"BWV":26735,"conform":26736,"##uckled":26737,"Launch":26738,"##ductory":26739,"##mized":26740,"Tad":26741,"##stituted":26742,"##free":26743,"Bel":26744,"Chávez":26745,"messing":26746,"quartz":26747,"##iculate":26748,"##folia":26749,"##lynn":26750,"ushered":26751,"##29":26752,"##ailing":26753,"dictated":26754,"Pony":26755,"##opsis":26756,"precinct":26757,"802":26758,"Plastic":26759,"##ughter":26760,"##uno":26761,"##porated":26762,"Denton":26763,"Matters":26764,"SPD":26765,"hating":26766,"##rogen":26767,"Essential":26768,"Deck":26769,"Dortmund":26770,"obscured":26771,"##maging":26772,"Earle":26773,"##bred":26774,"##ittle":26775,"##ropolis":26776,"saturated":26777,"##fiction":26778,"##ression":26779,"Pereira":26780,"Vinci":26781,"mute":26782,"warehouses":26783,"##ún":26784,"biographies":26785,"##icking":26786,"sealing":26787,"##dered":26788,"executing":26789,"pendant":26790,"##wives":26791,"murmurs":26792,"##oko":26793,"substrates":26794,"symmetrical":26795,"Susie":26796,"##mare":26797,"Yusuf":26798,"analogy":26799,"##urage":26800,"Lesley":26801,"limitation":26802,"##rby":26803,"##ío":26804,"disagreements":
26805,"##mise":26806,"embroidered":26807,"nape":26808,"unarmed":26809,"Sumner":26810,"Stores":26811,"dwell":26812,"Wilcox":26813,"creditors":26814,"##rivatization":26815,"##shes":26816,"##amia":26817,"directs":26818,"recaptured":26819,"scouting":26820,"McGuire":26821,"cradle":26822,"##onnell":26823,"Sato":26824,"insulin":26825,"mercenary":26826,"tolerant":26827,"Macquarie":26828,"transitions":26829,"cradled":26830,"##berto":26831,"##ivism":26832,"##yotes":26833,"FF":26834,"Ke":26835,"Reach":26836,"##dbury":26837,"680":26838,"##bill":26839,"##oja":26840,"##sui":26841,"prairie":26842,"##ogan":26843,"reactive":26844,"##icient":26845,"##rits":26846,"Cyclone":26847,"Sirius":26848,"Survival":26849,"Pak":26850,"##coach":26851,"##trar":26852,"halves":26853,"Agatha":26854,"Opus":26855,"contrasts":26856,"##jection":26857,"ominous":26858,"##iden":26859,"Baylor":26860,"Woodrow":26861,"duct":26862,"fortification":26863,"intercourse":26864,"##rois":26865,"Colbert":26866,"envy":26867,"##isi":26868,"Afterward":26869,"geared":26870,"##flections":26871,"accelerate":26872,"##lenching":26873,"Witness":26874,"##rrer":26875,"Angelina":26876,"Material":26877,"assertion":26878,"misconduct":26879,"Nix":26880,"cringed":26881,"tingling":26882,"##eti":26883,"##gned":26884,"Everest":26885,"disturb":26886,"sturdy":26887,"##keepers":26888,"##vied":26889,"Profile":26890,"heavenly":26891,"##kova":26892,"##victed":26893,"translating":26894,"##sses":26895,"316":26896,"Invitational":26897,"Mention":26898,"martyr":26899,"##uristic":26900,"Barron":26901,"hardness":26902,"Nakamura":26903,"405":26904,"Genevieve":26905,"reflections":26906,"##falls":26907,"jurist":26908,"##LT":26909,"Pyramid":26910,"##yme":26911,"Shoot":26912,"heck":26913,"linguist":26914,"##tower":26915,"Ives":26916,"superiors":26917,"##leo":26918,"Achilles":26919,"##phological":26920,"Christophe":26921,"Padma":26922,"precedence":26923,"grassy":26924,"Oral":26925,"resurrection":26926,"##itting":26927,"clumsy":26928,"##lten":26929,"##rue":2
6930,"huts":26931,"##stars":26932,"Equal":26933,"##queduct":26934,"Devin":26935,"Gaga":26936,"diocesan":26937,"##plating":26938,"##upe":26939,"##graphers":26940,"Patch":26941,"Scream":26942,"hail":26943,"moaning":26944,"tracts":26945,"##hdi":26946,"Examination":26947,"outsider":26948,"##ergic":26949,"##oter":26950,"Archipelago":26951,"Havilland":26952,"greenish":26953,"tilting":26954,"Aleksandr":26955,"Konstantin":26956,"warship":26957,"##emann":26958,"##gelist":26959,"##ought":26960,"billionaire":26961,"##blivion":26962,"321":26963,"Hungarians":26964,"transplant":26965,"##jured":26966,"##fters":26967,"Corbin":26968,"autism":26969,"pitchers":26970,"Garner":26971,"thence":26972,"Scientology":26973,"transitioned":26974,"integrating":26975,"repetitive":26976,"##dant":26977,"Rene":26978,"vomit":26979,"##burne":26980,"1661":26981,"Researchers":26982,"Wallis":26983,"insulted":26984,"wavy":26985,"##wati":26986,"Ewing":26987,"excitedly":26988,"##kor":26989,"frescoes":26990,"injustice":26991,"##achal":26992,"##lumber":26993,"##úl":26994,"novella":26995,"##sca":26996,"Liv":26997,"##enstein":26998,"##river":26999,"monstrous":27000,"topping":27001,"downfall":27002,"looming":27003,"sinks":27004,"trillion":27005,"##pont":27006,"Effect":27007,"##phi":27008,"##urley":27009,"Sites":27010,"catchment":27011,"##H1":27012,"Hopper":27013,"##raiser":27014,"1642":27015,"Maccabi":27016,"lance":27017,"##chia":27018,"##sboro":27019,"NSA":27020,"branching":27021,"retorted":27022,"tensor":27023,"Immaculate":27024,"drumming":27025,"feeder":27026,"##mony":27027,"Dyer":27028,"homicide":27029,"Temeraire":27030,"fishes":27031,"protruding":27032,"skins":27033,"orchards":27034,"##nso":27035,"inlet":27036,"ventral":27037,"##finder":27038,"Asiatic":27039,"Sul":27040,"1688":27041,"Melinda":27042,"assigns":27043,"paranormal":27044,"gardening":27045,"Tau":27046,"calming":27047,"##inge":27048,"##crow":27049,"regimental":27050,"Nik":27051,"fastened":27052,"correlated":27053,"##gene":27054,"##rieve":27055,"Si
ck":27056,"##minster":27057,"##politan":27058,"hardwood":27059,"hurled":27060,"##ssler":27061,"Cinematography":27062,"rhyme":27063,"Montenegrin":27064,"Packard":27065,"debating":27066,"##itution":27067,"Helens":27068,"Trick":27069,"Museums":27070,"defiance":27071,"encompassed":27072,"##EE":27073,"##TU":27074,"##nees":27075,"##uben":27076,"##ünster":27077,"##nosis":27078,"435":27079,"Hagen":27080,"cinemas":27081,"Corbett":27082,"commended":27083,"##fines":27084,"##oman":27085,"bosses":27086,"ripe":27087,"scraping":27088,"##loc":27089,"filly":27090,"Saddam":27091,"pointless":27092,"Faust":27093,"Orléans":27094,"Syriac":27095,"##♭":27096,"longitude":27097,"##ropic":27098,"Alfa":27099,"bliss":27100,"gangster":27101,"##ckling":27102,"SL":27103,"blending":27104,"##eptide":27105,"##nner":27106,"bends":27107,"escorting":27108,"##bloid":27109,"##quis":27110,"burials":27111,"##sle":27112,"##è":27113,"Ambulance":27114,"insults":27115,"##gth":27116,"Antrim":27117,"unfolded":27118,"##missible":27119,"splendid":27120,"Cure":27121,"warily":27122,"Saigon":27123,"Waste":27124,"astonishment":27125,"boroughs":27126,"##VS":27127,"##dalgo":27128,"##reshing":27129,"##usage":27130,"rue":27131,"marital":27132,"versatile":27133,"unpaid":27134,"allotted":27135,"bacterium":27136,"##coil":27137,"##cue":27138,"Dorothea":27139,"IDF":27140,"##location":27141,"##yke":27142,"RPG":27143,"##tropical":27144,"devotees":27145,"liter":27146,"##pree":27147,"Johnstone":27148,"astronaut":27149,"attends":27150,"pollen":27151,"periphery":27152,"doctrines":27153,"meta":27154,"showered":27155,"##tyn":27156,"GO":27157,"Huh":27158,"laude":27159,"244":27160,"Amar":27161,"Christensen":27162,"Ping":27163,"Pontifical":27164,"Austen":27165,"raiding":27166,"realities":27167,"##dric":27168,"urges":27169,"##dek":27170,"Cambridgeshire":27171,"##otype":27172,"Cascade":27173,"Greenberg":27174,"Pact":27175,"##cognition":27176,"##aran":27177,"##urion":27178,"Riot":27179,"mimic":27180,"Eastwood":27181,"##imating":27182,"revers
al":27183,"##blast":27184,"##henian":27185,"Pitchfork":27186,"##sunderstanding":27187,"Staten":27188,"WCW":27189,"lieu":27190,"##bard":27191,"##sang":27192,"experimenting":27193,"Aquino":27194,"##lums":27195,"TNT":27196,"Hannibal":27197,"catastrophic":27198,"##lsive":27199,"272":27200,"308":27201,"##otypic":27202,"41st":27203,"Highways":27204,"aggregator":27205,"##fluenza":27206,"Featured":27207,"Reece":27208,"dispatch":27209,"simulated":27210,"##BE":27211,"Communion":27212,"Vinnie":27213,"hardcover":27214,"inexpensive":27215,"til":27216,"##adores":27217,"groundwater":27218,"kicker":27219,"blogs":27220,"frenzy":27221,"##wala":27222,"dealings":27223,"erase":27224,"Anglia":27225,"##umour":27226,"Hapoel":27227,"Marquette":27228,"##raphic":27229,"##tives":27230,"consult":27231,"atrocities":27232,"concussion":27233,"##érard":27234,"Decree":27235,"ethanol":27236,"##aen":27237,"Rooney":27238,"##chemist":27239,"##hoot":27240,"1620":27241,"menacing":27242,"Schuster":27243,"##bearable":27244,"laborers":27245,"sultan":27246,"Juliana":27247,"erased":27248,"onstage":27249,"##ync":27250,"Eastman":27251,"##tick":27252,"hushed":27253,"##yrinth":27254,"Lexie":27255,"Wharton":27256,"Lev":27257,"##PL":27258,"Testing":27259,"Bangladeshi":27260,"##bba":27261,"##usions":27262,"communicated":27263,"integers":27264,"internship":27265,"societal":27266,"##odles":27267,"Loki":27268,"ET":27269,"Ghent":27270,"broadcasters":27271,"Unix":27272,"##auer":27273,"Kildare":27274,"Yamaha":27275,"##quencing":27276,"##zman":27277,"chilled":27278,"##rapped":27279,"##uant":27280,"Duval":27281,"sentiments":27282,"Oliveira":27283,"packets":27284,"Horne":27285,"##rient":27286,"Harlan":27287,"Mirage":27288,"invariant":27289,"##anger":27290,"##tensive":27291,"flexed":27292,"sweetness":27293,"##wson":27294,"alleviate":27295,"insulting":27296,"limo":27297,"Hahn":27298,"##llars":27299,"##hesia":27300,"##lapping":27301,"buys":27302,"##oaming":27303,"mocked":27304,"pursuits":27305,"scooted":27306,"##conscious":27307
,"##ilian":27308,"Ballad":27309,"jackets":27310,"##kra":27311,"hilly":27312,"##cane":27313,"Scenic":27314,"McGraw":27315,"silhouette":27316,"whipping":27317,"##roduced":27318,"##wark":27319,"##chess":27320,"##rump":27321,"Lemon":27322,"calculus":27323,"demonic":27324,"##latine":27325,"Bharatiya":27326,"Govt":27327,"Que":27328,"Trilogy":27329,"Ducks":27330,"Suit":27331,"stairway":27332,"##ceipt":27333,"Isa":27334,"regulator":27335,"Automobile":27336,"flatly":27337,"##buster":27338,"##lank":27339,"Spartans":27340,"topography":27341,"Tavi":27342,"usable":27343,"Chartered":27344,"Fairchild":27345,"##sance":27346,"##vyn":27347,"Digest":27348,"nuclei":27349,"typhoon":27350,"##llon":27351,"Alvarez":27352,"DJs":27353,"Grimm":27354,"authoritative":27355,"firearm":27356,"##chschule":27357,"Origins":27358,"lair":27359,"unmistakable":27360,"##xial":27361,"##cribing":27362,"Mouth":27363,"##genesis":27364,"##shū":27365,"##gaon":27366,"##ulter":27367,"Jaya":27368,"Neck":27369,"##UN":27370,"##oing":27371,"##static":27372,"relativity":27373,"##mott":27374,"##utive":27375,"##esan":27376,"##uveau":27377,"BT":27378,"salts":27379,"##roa":27380,"Dustin":27381,"preoccupied":27382,"Novgorod":27383,"##asus":27384,"Magnum":27385,"tempting":27386,"##histling":27387,"##ilated":27388,"Musa":27389,"##ghty":27390,"Ashland":27391,"pubs":27392,"routines":27393,"##etto":27394,"Soto":27395,"257":27396,"Featuring":27397,"Augsburg":27398,"##alaya":27399,"Bit":27400,"loomed":27401,"expects":27402,"##abby":27403,"##ooby":27404,"Auschwitz":27405,"Pendleton":27406,"vodka":27407,"##sent":27408,"rescuing":27409,"systemic":27410,"##inet":27411,"##leg":27412,"Yun":27413,"applicant":27414,"revered":27415,"##nacht":27416,"##ndas":27417,"Muller":27418,"characterization":27419,"##patient":27420,"##roft":27421,"Carole":27422,"##asperated":27423,"Amiga":27424,"disconnected":27425,"gel":27426,"##cologist":27427,"Patriotic":27428,"rallied":27429,"assign":27430,"veterinary":27431,"installing":27432,"##cedural":27433,"2
58":27434,"Jang":27435,"Parisian":27436,"incarcerated":27437,"stalk":27438,"##iment":27439,"Jamal":27440,"McPherson":27441,"Palma":27442,"##oken":27443,"##viation":27444,"512":27445,"Rourke":27446,"irrational":27447,"##rippled":27448,"Devlin":27449,"erratic":27450,"##NI":27451,"##payers":27452,"Ni":27453,"engages":27454,"Portal":27455,"aesthetics":27456,"##rrogance":27457,"Milne":27458,"assassins":27459,"##rots":27460,"335":27461,"385":27462,"Cambodian":27463,"Females":27464,"fellows":27465,"si":27466,"##block":27467,"##otes":27468,"Jayne":27469,"Toro":27470,"flutter":27471,"##eera":27472,"Burr":27473,"##lanche":27474,"relaxation":27475,"##fra":27476,"Fitzroy":27477,"##undy":27478,"1751":27479,"261":27480,"comb":27481,"conglomerate":27482,"ribbons":27483,"veto":27484,"##Es":27485,"casts":27486,"##ege":27487,"1748":27488,"Ares":27489,"spears":27490,"spirituality":27491,"comet":27492,"##nado":27493,"##yeh":27494,"Veterinary":27495,"aquarium":27496,"yer":27497,"Councils":27498,"##oked":27499,"##ynamic":27500,"Malmö":27501,"remorse":27502,"auditions":27503,"drilled":27504,"Hoffmann":27505,"Moe":27506,"Nagoya":27507,"Yacht":27508,"##hakti":27509,"##race":27510,"##rrick":27511,"Talmud":27512,"coordinating":27513,"##EI":27514,"##bul":27515,"##his":27516,"##itors":27517,"##ligent":27518,"##uerra":27519,"Narayan":27520,"goaltender":27521,"taxa":27522,"##asures":27523,"Det":27524,"##mage":27525,"Infinite":27526,"Maid":27527,"bean":27528,"intriguing":27529,"##cription":27530,"gasps":27531,"socket":27532,"##mentary":27533,"##reus":27534,"sewing":27535,"transmitting":27536,"##different":27537,"##furbishment":27538,"##traction":27539,"Grimsby":27540,"sprawling":27541,"Shipyard":27542,"##destine":27543,"##hropic":27544,"##icked":27545,"trolley":27546,"##agi":27547,"##lesh":27548,"Josiah":27549,"invasions":27550,"Content":27551,"firefighters":27552,"intro":27553,"Lucifer":27554,"subunit":27555,"Sahib":27556,"Myrtle":27557,"inhibitor":27558,"maneuvers":27559,"##teca":27560,"Wrath":2
7561,"slippery":27562,"##versing":27563,"Shoes":27564,"##dial":27565,"##illiers":27566,"##luded":27567,"##mmal":27568,"##pack":27569,"handkerchief":27570,"##edestal":27571,"##stones":27572,"Fusion":27573,"cumulative":27574,"##mell":27575,"##cacia":27576,"##rudge":27577,"##utz":27578,"foe":27579,"storing":27580,"swiped":27581,"##meister":27582,"##orra":27583,"batter":27584,"strung":27585,"##venting":27586,"##kker":27587,"Doo":27588,"Taste":27589,"immensely":27590,"Fairbanks":27591,"Jarrett":27592,"Boogie":27593,"1746":27594,"mage":27595,"Kick":27596,"legislators":27597,"medial":27598,"##ilon":27599,"##logies":27600,"##ranton":27601,"Hybrid":27602,"##uters":27603,"Tide":27604,"deportation":27605,"Metz":27606,"##secration":27607,"##virus":27608,"UFO":27609,"##fell":27610,"##orage":27611,"##raction":27612,"##rrigan":27613,"1747":27614,"fabricated":27615,"##BM":27616,"##GR":27617,"##rter":27618,"muttering":27619,"theorist":27620,"##tamine":27621,"BMG":27622,"Kincaid":27623,"solvent":27624,"##azed":27625,"Thin":27626,"adorable":27627,"Wendell":27628,"ta":27629,"##viour":27630,"pulses":27631,"##pologies":27632,"counters":27633,"exposition":27634,"sewer":27635,"Luciano":27636,"Clancy":27637,"##angelo":27638,"##riars":27639,"Showtime":27640,"observes":27641,"frankly":27642,"##oppy":27643,"Bergman":27644,"lobes":27645,"timetable":27646,"##bri":27647,"##uest":27648,"FX":27649,"##dust":27650,"##genus":27651,"Glad":27652,"Helmut":27653,"Meridian":27654,"##besity":27655,"##ontaine":27656,"Revue":27657,"miracles":27658,"##titis":27659,"PP":27660,"bluff":27661,"syrup":27662,"307":27663,"Messiah":27664,"##erne":27665,"interfering":27666,"picturesque":27667,"unconventional":27668,"dipping":27669,"hurriedly":27670,"Kerman":27671,"248":27672,"Ethnic":27673,"Toward":27674,"acidic":27675,"Harrisburg":27676,"##65":27677,"intimidating":27678,"##aal":27679,"Jed":27680,"Pontiac":27681,"munitions":27682,"##nchen":27683,"growling":27684,"mausoleum":27685,"##ération":27686,"##wami":27687,"Cy":2
7688,"aerospace":27689,"caucus":27690,"Doing":27691,"##around":27692,"##miring":27693,"Cuthbert":27694,"##poradic":27695,"##rovisation":27696,"##wth":27697,"evaluating":27698,"##scraper":27699,"Belinda":27700,"owes":27701,"##sitic":27702,"##thermal":27703,"##fast":27704,"economists":27705,"##lishing":27706,"##uerre":27707,"##ân":27708,"credible":27709,"##koto":27710,"Fourteen":27711,"cones":27712,"##ebrates":27713,"bookstore":27714,"towels":27715,"##phony":27716,"Appearance":27717,"newscasts":27718,"##olin":27719,"Karin":27720,"Bingham":27721,"##elves":27722,"1680":27723,"306":27724,"disks":27725,"##lston":27726,"##secutor":27727,"Levant":27728,"##vout":27729,"Micro":27730,"snuck":27731,"##ogel":27732,"##racker":27733,"Exploration":27734,"drastic":27735,"##kening":27736,"Elsie":27737,"endowment":27738,"##utnant":27739,"Blaze":27740,"##rrosion":27741,"leaking":27742,"45th":27743,"##rug":27744,"##uernsey":27745,"760":27746,"Shapiro":27747,"cakes":27748,"##ehan":27749,"##mei":27750,"##ité":27751,"##kla":27752,"repetition":27753,"successively":27754,"Friendly":27755,"Île":27756,"Koreans":27757,"Au":27758,"Tirana":27759,"flourish":27760,"Spirits":27761,"Yao":27762,"reasoned":27763,"##leam":27764,"Consort":27765,"cater":27766,"marred":27767,"ordeal":27768,"supremacy":27769,"##ritable":27770,"Paisley":27771,"euro":27772,"healer":27773,"portico":27774,"wetland":27775,"##kman":27776,"restart":27777,"##habilitation":27778,"##zuka":27779,"##Script":27780,"emptiness":27781,"communion":27782,"##CF":27783,"##inhabited":27784,"##wamy":27785,"Casablanca":27786,"pulsed":27787,"##rrible":27788,"##safe":27789,"395":27790,"Dual":27791,"Terrorism":27792,"##urge":27793,"##found":27794,"##gnolia":27795,"Courage":27796,"patriarch":27797,"segregated":27798,"intrinsic":27799,"##liography":27800,"##phe":27801,"PD":27802,"convection":27803,"##icidal":27804,"Dharma":27805,"Jimmie":27806,"texted":27807,"constituents":27808,"twitch":27809,"##calated":27810,"##mitage":27811,"##ringing":27812,"415"
:27813,"milling":27814,"##geons":27815,"Armagh":27816,"Geometridae":27817,"evergreen":27818,"needy":27819,"reflex":27820,"template":27821,"##pina":27822,"Schubert":27823,"##bruck":27824,"##icted":27825,"##scher":27826,"##wildered":27827,"1749":27828,"Joanne":27829,"clearer":27830,"##narl":27831,"278":27832,"Print":27833,"automation":27834,"consciously":27835,"flashback":27836,"occupations":27837,"##ests":27838,"Casimir":27839,"differentiated":27840,"policing":27841,"repay":27842,"##aks":27843,"##gnesium":27844,"Evaluation":27845,"commotion":27846,"##CM":27847,"##smopolitan":27848,"Clapton":27849,"mitochondrial":27850,"Kobe":27851,"1752":27852,"Ignoring":27853,"Vincenzo":27854,"Wet":27855,"bandage":27856,"##rassed":27857,"##unate":27858,"Maris":27859,"##eted":27860,"##hetical":27861,"figuring":27862,"##eit":27863,"##nap":27864,"leopard":27865,"strategically":27866,"##reer":27867,"Fen":27868,"Iain":27869,"##ggins":27870,"##pipe":27871,"Matteo":27872,"McIntyre":27873,"##chord":27874,"##feng":27875,"Romani":27876,"asshole":27877,"flopped":27878,"reassure":27879,"Founding":27880,"Styles":27881,"Torino":27882,"patrolling":27883,"##erging":27884,"##ibrating":27885,"##ructural":27886,"sincerity":27887,"##ät":27888,"##teacher":27889,"Juliette":27890,"##cé":27891,"##hog":27892,"##idated":27893,"##span":27894,"Winfield":27895,"##fender":27896,"##nast":27897,"##pliant":27898,"1690":27899,"Bai":27900,"Je":27901,"Saharan":27902,"expands":27903,"Bolshevik":27904,"rotate":27905,"##root":27906,"Britannia":27907,"Severn":27908,"##cini":27909,"##gering":27910,"##say":27911,"sly":27912,"Steps":27913,"insertion":27914,"rooftop":27915,"Piece":27916,"cuffs":27917,"plausible":27918,"##zai":27919,"Provost":27920,"semantic":27921,"##data":27922,"##vade":27923,"##cimal":27924,"IPA":27925,"indictment":27926,"Libraries":27927,"flaming":27928,"highlands":27929,"liberties":27930,"##pio":27931,"Elders":27932,"aggressively":27933,"##pecific":27934,"Decision":27935,"pigeon":27936,"nominally":27937,"
descriptive":27938,"adjustments":27939,"equestrian":27940,"heaving":27941,"##mour":27942,"##dives":27943,"##fty":27944,"##yton":27945,"intermittent":27946,"##naming":27947,"##sets":27948,"Calvert":27949,"Casper":27950,"Tarzan":27951,"##kot":27952,"Ramírez":27953,"##IB":27954,"##erus":27955,"Gustavo":27956,"Roller":27957,"vaulted":27958,"##solation":27959,"##formatics":27960,"##tip":27961,"Hunger":27962,"colloquially":27963,"handwriting":27964,"hearth":27965,"launcher":27966,"##idian":27967,"##ilities":27968,"##lind":27969,"##locating":27970,"Magdalena":27971,"Soo":27972,"clubhouse":27973,"##kushima":27974,"##ruit":27975,"Bogotá":27976,"Organic":27977,"Worship":27978,"##Vs":27979,"##wold":27980,"upbringing":27981,"##kick":27982,"groundbreaking":27983,"##urable":27984,"##ván":27985,"repulsed":27986,"##dira":27987,"##ditional":27988,"##ici":27989,"melancholy":27990,"##bodied":27991,"##cchi":27992,"404":27993,"concurrency":27994,"H₂O":27995,"bouts":27996,"##gami":27997,"288":27998,"Leto":27999,"troll":28000,"##lak":28001,"advising":28002,"bundled":28003,"##nden":28004,"lipstick":28005,"littered":28006,"##leading":28007,"##mogeneous":28008,"Experiment":28009,"Nikola":28010,"grove":28011,"##ogram":28012,"Mace":28013,"##jure":28014,"cheat":28015,"Annabelle":28016,"Tori":28017,"lurking":28018,"Emery":28019,"Walden":28020,"##riz":28021,"paints":28022,"Markets":28023,"brutality":28024,"overrun":28025,"##agu":28026,"##sat":28027,"din":28028,"ostensibly":28029,"Fielding":28030,"flees":28031,"##eron":28032,"Pound":28033,"ornaments":28034,"tornadoes":28035,"##nikov":28036,"##organisation":28037,"##reen":28038,"##Works":28039,"##ldred":28040,"##olten":28041,"##stillery":28042,"soluble":28043,"Mata":28044,"Grimes":28045,"Léon":28046,"##NF":28047,"coldly":28048,"permitting":28049,"##inga":28050,"##reaked":28051,"Agents":28052,"hostess":28053,"##dl":28054,"Dyke":28055,"Kota":28056,"avail":28057,"orderly":28058,"##saur":28059,"##sities":28060,"Arroyo":28061,"##ceps":28062,"##egro":280
63,"Hawke":28064,"Noctuidae":28065,"html":28066,"seminar":28067,"##ggles":28068,"##wasaki":28069,"Clube":28070,"recited":28071,"##sace":28072,"Ascension":28073,"Fitness":28074,"dough":28075,"##ixel":28076,"Nationale":28077,"##solidate":28078,"pulpit":28079,"vassal":28080,"570":28081,"Annapolis":28082,"bladder":28083,"phylogenetic":28084,"##iname":28085,"convertible":28086,"##ppan":28087,"Comet":28088,"paler":28089,"##definite":28090,"Spot":28091,"##dices":28092,"frequented":28093,"Apostles":28094,"slalom":28095,"##ivision":28096,"##mana":28097,"##runcated":28098,"Trojan":28099,"##agger":28100,"##iq":28101,"##league":28102,"Concept":28103,"Controller":28104,"##barian":28105,"##curate":28106,"##spersed":28107,"##tring":28108,"engulfed":28109,"inquired":28110,"##hmann":28111,"286":28112,"##dict":28113,"##osy":28114,"##raw":28115,"MacKenzie":28116,"su":28117,"##ienced":28118,"##iggs":28119,"##quitaine":28120,"bisexual":28121,"##noon":28122,"runways":28123,"subsp":28124,"##!":28125,"##\"":28126,"###":28127,"##$":28128,"##%":28129,"##&":28130,"##'":28131,"##(":28132,"##)":28133,"##*":28134,"##+":28135,"##,":28136,"##-":28137,"##.":28138,"##/":28139,"##:":28140,"##;":28141,"##<":28142,"##=":28143,"##>":28144,"##?":28145,"##@":28146,"##[":28147,"##\\":28148,"##]":28149,"##^":28150,"##_":28151,"##`":28152,"##{":28153,"##|":28154,"##}":28155,"##~":28156,"##¡":28157,"##¢":28158,"##£":28159,"##¥":28160,"##§":28161,"##¨":28162,"##©":28163,"##ª":28164,"##«":28165,"##¬":28166,"##®":28167,"##±":28168,"##´":28169,"##µ":28170,"##¶":28171,"##·":28172,"##¹":28173,"##º":28174,"##»":28175,"##¼":28176,"##¾":28177,"##¿":28178,"##À":28179,"##Á":28180,"##Â":28181,"##Ä":28182,"##Å":28183,"##Æ":28184,"##Ç":28185,"##È":28186,"##É":28187,"##Í":28188,"##Î":28189,"##Ñ":28190,"##Ó":28191,"##Ö":28192,"##×":28193,"##Ø":28194,"##Ú":28195,"##Ü":28196,"##Þ":28197,"##â":28198,"##ã":28199,"##æ":28200,"##ç":28201,"##î":28202,"##ï":28203,"##ð":28204,"##ñ":28205,"##ô":28206,"##õ":28207,"##÷":28208,"##û":2820
9,"##þ":28210,"##ÿ":28211,"##Ā":28212,"##ą":28213,"##Ć":28214,"##Č":28215,"##ď":28216,"##Đ":28217,"##đ":28218,"##ē":28219,"##ė":28220,"##ę":28221,"##ě":28222,"##ğ":28223,"##ġ":28224,"##Ħ":28225,"##ħ":28226,"##ĩ":28227,"##Ī":28228,"##İ":28229,"##ļ":28230,"##Ľ":28231,"##ľ":28232,"##Ł":28233,"##ņ":28234,"##ň":28235,"##ŋ":28236,"##Ō":28237,"##ŏ":28238,"##ő":28239,"##Œ":28240,"##œ":28241,"##ř":28242,"##Ś":28243,"##ś":28244,"##Ş":28245,"##Š":28246,"##Ţ":28247,"##ţ":28248,"##ť":28249,"##ũ":28250,"##ŭ":28251,"##ů":28252,"##ű":28253,"##ų":28254,"##ŵ":28255,"##ŷ":28256,"##ź":28257,"##Ż":28258,"##ż":28259,"##Ž":28260,"##ž":28261,"##Ə":28262,"##ƒ":28263,"##ơ":28264,"##ư":28265,"##ǎ":28266,"##ǐ":28267,"##ǒ":28268,"##ǔ":28269,"##ǫ":28270,"##Ș":28271,"##Ț":28272,"##ț":28273,"##ɐ":28274,"##ɑ":28275,"##ɔ":28276,"##ɕ":28277,"##ə":28278,"##ɛ":28279,"##ɡ":28280,"##ɣ":28281,"##ɨ":28282,"##ɪ":28283,"##ɲ":28284,"##ɾ":28285,"##ʀ":28286,"##ʁ":28287,"##ʂ":28288,"##ʃ":28289,"##ʊ":28290,"##ʋ":28291,"##ʌ":28292,"##ʐ":28293,"##ʑ":28294,"##ʒ":28295,"##ʔ":28296,"##ʰ":28297,"##ʲ":28298,"##ʳ":28299,"##ʷ":28300,"##ʻ":28301,"##ʼ":28302,"##ʾ":28303,"##ʿ":28304,"##ˈ":28305,"##ː":28306,"##ˡ":28307,"##ˢ":28308,"##ˣ":28309,"##́":28310,"##̃":28311,"##̍":28312,"##̯":28313,"##͡":28314,"##Α":28315,"##Β":28316,"##Γ":28317,"##Δ":28318,"##Ε":28319,"##Η":28320,"##Θ":28321,"##Ι":28322,"##Κ":28323,"##Λ":28324,"##Μ":28325,"##Ν":28326,"##Ο":28327,"##Π":28328,"##Σ":28329,"##Τ":28330,"##Φ":28331,"##Χ":28332,"##Ψ":28333,"##Ω":28334,"##ά":28335,"##έ":28336,"##ή":28337,"##ί":28338,"##β":28339,"##γ":28340,"##δ":28341,"##ε":28342,"##ζ":28343,"##η":28344,"##θ":28345,"##ι":28346,"##κ":28347,"##λ":28348,"##μ":28349,"##ξ":28350,"##ο":28351,"##π":28352,"##ρ":28353,"##σ":28354,"##τ":28355,"##υ":28356,"##φ":28357,"##χ":28358,"##ψ":28359,"##ω":28360,"##ό":28361,"##ύ":28362,"##ώ":28363,"##І":28364,"##Ј":28365,"##А":28366,"##Б":28367,"##В":28368,"##Г":28369,"##Д":28370,"##Е":28371,"##Ж":28372,"##З":28373,"##И":28374,"##К":28375,"##Л":
28376,"##М":28377,"##Н":28378,"##О":28379,"##П":28380,"##Р":28381,"##С":28382,"##Т":28383,"##У":28384,"##Ф":28385,"##Х":28386,"##Ц":28387,"##Ч":28388,"##Ш":28389,"##Э":28390,"##Ю":28391,"##Я":28392,"##б":28393,"##в":28394,"##г":28395,"##д":28396,"##ж":28397,"##з":28398,"##к":28399,"##л":28400,"##м":28401,"##п":28402,"##с":28403,"##т":28404,"##у":28405,"##ф":28406,"##х":28407,"##ц":28408,"##ч":28409,"##ш":28410,"##щ":28411,"##ъ":28412,"##ы":28413,"##ь":28414,"##э":28415,"##ю":28416,"##ё":28417,"##і":28418,"##ї":28419,"##ј":28420,"##њ":28421,"##ћ":28422,"##Ա":28423,"##Հ":28424,"##ա":28425,"##ե":28426,"##ի":28427,"##կ":28428,"##մ":28429,"##յ":28430,"##ն":28431,"##ո":28432,"##ս":28433,"##տ":28434,"##ր":28435,"##ւ":28436,"##ְ":28437,"##ִ":28438,"##ֵ":28439,"##ֶ":28440,"##ַ":28441,"##ָ":28442,"##ֹ":28443,"##ּ":28444,"##א":28445,"##ב":28446,"##ג":28447,"##ד":28448,"##ה":28449,"##ו":28450,"##ז":28451,"##ח":28452,"##ט":28453,"##י":28454,"##כ":28455,"##ל":28456,"##ם":28457,"##מ":28458,"##ן":28459,"##נ":28460,"##ס":28461,"##ע":28462,"##פ":28463,"##צ":28464,"##ק":28465,"##ר":28466,"##ש":28467,"##ת":28468,"##،":28469,"##ء":28470,"##آ":28471,"##أ":28472,"##إ":28473,"##ئ":28474,"##ا":28475,"##ب":28476,"##ت":28477,"##ث":28478,"##ج":28479,"##ح":28480,"##خ":28481,"##ذ":28482,"##ز":28483,"##س":28484,"##ش":28485,"##ص":28486,"##ض":28487,"##ط":28488,"##ظ":28489,"##ع":28490,"##غ":28491,"##ف":28492,"##ق":28493,"##ك":28494,"##ل":28495,"##و":28496,"##ى":28497,"##َ":28498,"##ِ":28499,"##ٹ":28500,"##پ":28501,"##چ":28502,"##ک":28503,"##گ":28504,"##ہ":28505,"##ی":28506,"##ے":28507,"##ं":28508,"##आ":28509,"##क":28510,"##ग":28511,"##च":28512,"##ज":28513,"##ण":28514,"##त":28515,"##द":28516,"##ध":28517,"##न":28518,"##प":28519,"##ब":28520,"##भ":28521,"##म":28522,"##य":28523,"##र":28524,"##ल":28525,"##व":28526,"##श":28527,"##ष":28528,"##स":28529,"##ह":28530,"##ा":28531,"##ि":28532,"##ी":28533,"##ु":28534,"##े":28535,"##ो":28536,"##्":28537,"##।":28538,"##॥":28539,"##আ":28540,"##ই":28541,"##এ":28542,"#
#ও":28543,"##ক":28544,"##খ":28545,"##গ":28546,"##চ":28547,"##ছ":28548,"##জ":28549,"##ট":28550,"##ত":28551,"##থ":28552,"##দ":28553,"##ধ":28554,"##ন":28555,"##প":28556,"##ব":28557,"##ম":28558,"##য":28559,"##র":28560,"##ল":28561,"##শ":28562,"##স":28563,"##হ":28564,"##়":28565,"##া":28566,"##ি":28567,"##ী":28568,"##ু":28569,"##ে":28570,"##ো":28571,"##্":28572,"##য়":28573,"##க":28574,"##த":28575,"##ப":28576,"##ம":28577,"##ய":28578,"##ர":28579,"##ல":28580,"##வ":28581,"##ா":28582,"##ி":28583,"##ு":28584,"##்":28585,"##ร":28586,"##་":28587,"##ག":28588,"##ང":28589,"##ད":28590,"##ན":28591,"##བ":28592,"##མ":28593,"##ར":28594,"##ལ":28595,"##ས":28596,"##ི":28597,"##ུ":28598,"##ེ":28599,"##ོ":28600,"##ა":28601,"##ე":28602,"##ი":28603,"##ლ":28604,"##ნ":28605,"##ო":28606,"##რ":28607,"##ს":28608,"##ᴬ":28609,"##ᴵ":28610,"##ᵀ":28611,"##ᵃ":28612,"##ᵇ":28613,"##ᵈ":28614,"##ᵉ":28615,"##ᵍ":28616,"##ᵏ":28617,"##ᵐ":28618,"##ᵒ":28619,"##ᵖ":28620,"##ᵗ":28621,"##ᵘ":28622,"##ᵣ":28623,"##ᵤ":28624,"##ᵥ":28625,"##ᶜ":28626,"##ᶠ":28627,"##ḍ":28628,"##Ḥ":28629,"##ḥ":28630,"##Ḩ":28631,"##ḩ":28632,"##ḳ":28633,"##ṃ":28634,"##ṅ":28635,"##ṇ":28636,"##ṛ":28637,"##ṣ":28638,"##ṭ":28639,"##ạ":28640,"##ả":28641,"##ấ":28642,"##ầ":28643,"##ẩ":28644,"##ậ":28645,"##ắ":28646,"##ế":28647,"##ề":28648,"##ể":28649,"##ễ":28650,"##ệ":28651,"##ị":28652,"##ọ":28653,"##ố":28654,"##ồ":28655,"##ổ":28656,"##ộ":28657,"##ớ":28658,"##ờ":28659,"##ợ":28660,"##ụ":28661,"##ủ":28662,"##ứ":28663,"##ừ":28664,"##ử":28665,"##ữ":28666,"##ự":28667,"##ỳ":28668,"##ỹ":28669,"##ἀ":28670,"##ἐ":28671,"##ὁ":28672,"##ὐ":28673,"##ὰ":28674,"##ὶ":28675,"##ὸ":28676,"##ῆ":28677,"##ῖ":28678,"##ῦ":28679,"##ῶ":28680,"##‐":28681,"##‑":28682,"##‒":28683,"##–":28684,"##—":28685,"##―":28686,"##‖":28687,"##‘":28688,"##’":28689,"##‚":28690,"##“":28691,"##”":28692,"##„":28693,"##†":28694,"##‡":28695,"##•":28696,"##…":28697,"##‰":28698,"##′":28699,"##″":28700,"##⁄":28701,"##⁰":28702,"##ⁱ":28703,"##⁴":28704,"##⁵":28705,"##⁶":28706,"##⁷":28707,"##⁸":28708,"##⁹":287
09,"##⁻":28710,"##ⁿ":28711,"##₅":28712,"##₆":28713,"##₇":28714,"##₈":28715,"##₉":28716,"##₊":28717,"##₍":28718,"##₎":28719,"##ₐ":28720,"##ₑ":28721,"##ₒ":28722,"##ₓ":28723,"##ₕ":28724,"##ₖ":28725,"##ₘ":28726,"##ₚ":28727,"##ₛ":28728,"##ₜ":28729,"##₤":28730,"##€":28731,"##₱":28732,"##₹":28733,"##ℓ":28734,"##№":28735,"##ℝ":28736,"##⅓":28737,"##←":28738,"##↑":28739,"##→":28740,"##↔":28741,"##⇌":28742,"##⇒":28743,"##∂":28744,"##∈":28745,"##∗":28746,"##∘":28747,"##√":28748,"##∞":28749,"##∧":28750,"##∨":28751,"##∩":28752,"##∪":28753,"##≈":28754,"##≠":28755,"##≡":28756,"##≤":28757,"##≥":28758,"##⊂":28759,"##⊆":28760,"##⊕":28761,"##⋅":28762,"##─":28763,"##│":28764,"##■":28765,"##●":28766,"##★":28767,"##☆":28768,"##☉":28769,"##♠":28770,"##♣":28771,"##♥":28772,"##♦":28773,"##♯":28774,"##⟨":28775,"##⟩":28776,"##ⱼ":28777,"##、":28778,"##。":28779,"##《":28780,"##》":28781,"##「":28782,"##」":28783,"##『":28784,"##』":28785,"##〜":28786,"##い":28787,"##う":28788,"##え":28789,"##お":28790,"##か":28791,"##き":28792,"##く":28793,"##け":28794,"##こ":28795,"##さ":28796,"##し":28797,"##す":28798,"##せ":28799,"##そ":28800,"##た":28801,"##ち":28802,"##つ":28803,"##て":28804,"##と":28805,"##な":28806,"##に":28807,"##の":28808,"##は":28809,"##ひ":28810,"##ま":28811,"##み":28812,"##む":28813,"##め":28814,"##も":28815,"##や":28816,"##ゆ":28817,"##よ":28818,"##ら":28819,"##り":28820,"##る":28821,"##れ":28822,"##ん":28823,"##ア":28824,"##ィ":28825,"##イ":28826,"##ウ":28827,"##エ":28828,"##オ":28829,"##カ":28830,"##ガ":28831,"##キ":28832,"##ク":28833,"##グ":28834,"##コ":28835,"##サ":28836,"##シ":28837,"##ジ":28838,"##ス":28839,"##ズ":28840,"##タ":28841,"##ダ":28842,"##ッ":28843,"##テ":28844,"##デ":28845,"##ト":28846,"##ド":28847,"##ナ":28848,"##ニ":28849,"##ハ":28850,"##バ":28851,"##パ":28852,"##フ":28853,"##ブ":28854,"##プ":28855,"##マ":28856,"##ミ":28857,"##ム":28858,"##ャ":28859,"##ュ":28860,"##ラ":28861,"##リ":28862,"##ル":28863,"##レ":28864,"##ロ":28865,"##ン":28866,"##・":28867,"##ー":28868,"##一":28869,"##三":28870,"##上":28871,"##下":28872,"##中":28873,"##事":28874,"##二":28875,"##井"
:28876,"##京":28877,"##人":28878,"##亻":28879,"##仁":28880,"##佐":28881,"##侍":28882,"##光":28883,"##公":28884,"##力":28885,"##北":28886,"##十":28887,"##南":28888,"##原":28889,"##口":28890,"##史":28891,"##司":28892,"##吉":28893,"##同":28894,"##和":28895,"##囗":28896,"##国":28897,"##國":28898,"##土":28899,"##城":28900,"##士":28901,"##大":28902,"##天":28903,"##太":28904,"##夫":28905,"##女":28906,"##子":28907,"##宀":28908,"##安":28909,"##宮":28910,"##宿":28911,"##小":28912,"##尚":28913,"##山":28914,"##島":28915,"##川":28916,"##州":28917,"##平":28918,"##年":28919,"##心":28920,"##愛":28921,"##戸":28922,"##文":28923,"##新":28924,"##方":28925,"##日":28926,"##明":28927,"##星":28928,"##書":28929,"##月":28930,"##木":28931,"##本":28932,"##李":28933,"##村":28934,"##東":28935,"##松":28936,"##林":28937,"##正":28938,"##武":28939,"##氏":28940,"##水":28941,"##氵":28942,"##江":28943,"##河":28944,"##海":28945,"##版":28946,"##犬":28947,"##王":28948,"##生":28949,"##田":28950,"##白":28951,"##皇":28952,"##省":28953,"##真":28954,"##石":28955,"##社":28956,"##神":28957,"##竹":28958,"##美":28959,"##義":28960,"##花":28961,"##藤":28962,"##西":28963,"##谷":28964,"##車":28965,"##辶":28966,"##道":28967,"##郎":28968,"##郡":28969,"##部":28970,"##野":28971,"##金":28972,"##長":28973,"##門":28974,"##陽":28975,"##青":28976,"##食":28977,"##馬":28978,"##高":28979,"##龍":28980,"##龸":28981,"##사":28982,"##씨":28983,"##의":28984,"##이":28985,"##한":28986,"##fi":28987,"##fl":28988,"##!":28989,"##(":28990,"##)":28991,"##,":28992,"##-":28993,"##/":28994,"##:":28995}}} \ No newline at end of file diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/tokenizer_config.json b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/tokenizer_config.json new file mode 100644 index 0000000000000000000000000000000000000000..e3c6d456fb2616f01a9a6cd01a1be1a36353ed22 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/tokenizer_config.json @@ -0,0 +1,3 @@ 
+{ + "do_lower_case": false +} diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/vocab.txt b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/vocab.txt new file mode 100644 index 0000000000000000000000000000000000000000..2ea941cc79a6f3d7985ca6991ef4f67dad62af04 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/distilbert-base-cased-distilled-squad/vocab.txt @@ -0,0 +1,28996 @@ +[PAD] +[unused1] +[unused2] +[unused3] +[unused4] +[unused5] +[unused6] +[unused7] +[unused8] +[unused9] +[unused10] +[unused11] +[unused12] +[unused13] +[unused14] +[unused15] +[unused16] +[unused17] +[unused18] +[unused19] +[unused20] +[unused21] +[unused22] +[unused23] +[unused24] +[unused25] +[unused26] +[unused27] +[unused28] +[unused29] +[unused30] +[unused31] +[unused32] +[unused33] +[unused34] +[unused35] +[unused36] +[unused37] +[unused38] +[unused39] +[unused40] +[unused41] +[unused42] +[unused43] +[unused44] +[unused45] +[unused46] +[unused47] +[unused48] +[unused49] +[unused50] +[unused51] +[unused52] +[unused53] +[unused54] +[unused55] +[unused56] +[unused57] +[unused58] +[unused59] +[unused60] +[unused61] +[unused62] +[unused63] +[unused64] +[unused65] +[unused66] +[unused67] +[unused68] +[unused69] +[unused70] +[unused71] +[unused72] +[unused73] +[unused74] +[unused75] +[unused76] +[unused77] +[unused78] +[unused79] +[unused80] +[unused81] +[unused82] +[unused83] +[unused84] +[unused85] +[unused86] +[unused87] +[unused88] +[unused89] +[unused90] +[unused91] +[unused92] +[unused93] +[unused94] +[unused95] +[unused96] +[unused97] +[unused98] +[unused99] +[UNK] +[CLS] +[SEP] +[MASK] +[unused100] +[unused101] +! +" +# +$ +% +& +' +( +) +* ++ +, +- +. +/ +0 +1 +2 +3 +4 +5 +6 +7 +8 +9 +: +; +< += +> +? 
+@ +A +B +C +D +E +F +G +H +I +J +K +L +M +N +O +P +Q +R +S +T +U +V +W +X +Y +Z +[ +\ +] +^ +_ +` +a +b +c +d +e +f +g +h +i +j +k +l +m +n +o +p +q +r +s +t +u +v +w +x +y +z +{ +| +} +~ +¡ +¢ +£ +¥ +§ +¨ +© +ª +« +¬ +® +° +± +² +³ +´ +µ +¶ +· +¹ +º +» +¼ +½ +¾ +¿ +À +Á + +Ä +Å +Æ +Ç +È +É +Í +Î +Ñ +Ó +Ö +× +Ø +Ú +Ü +Þ +ß +à +á +â +ã +ä +å +æ +ç +è +é +ê +ë +ì +í +î +ï +ð +ñ +ò +ó +ô +õ +ö +÷ +ø +ù +ú +û +ü +ý +þ +ÿ +Ā +ā +ă +ą +Ć +ć +Č +č +ď +Đ +đ +ē +ė +ę +ě +ğ +ġ +Ħ +ħ +ĩ +Ī +ī +İ +ı +ļ +Ľ +ľ +Ł +ł +ń +ņ +ň +ŋ +Ō +ō +ŏ +ő +Œ +œ +ř +Ś +ś +Ş +ş +Š +š +Ţ +ţ +ť +ũ +ū +ŭ +ů +ű +ų +ŵ +ŷ +ź +Ż +ż +Ž +ž +Ə +ƒ +ơ +ư +ǎ +ǐ +ǒ +ǔ +ǫ +Ș +ș +Ț +ț +ɐ +ɑ +ɔ +ɕ +ə +ɛ +ɡ +ɣ +ɨ +ɪ +ɲ +ɾ +ʀ +ʁ +ʂ +ʃ +ʊ +ʋ +ʌ +ʐ +ʑ +ʒ +ʔ +ʰ +ʲ +ʳ +ʷ +ʻ +ʼ +ʾ +ʿ +ˈ +ː +ˡ +ˢ +ˣ +́ +̃ +̍ +̯ +͡ +Α +Β +Γ +Δ +Ε +Η +Θ +Ι +Κ +Λ +Μ +Ν +Ο +Π +Σ +Τ +Φ +Χ +Ψ +Ω +ά +έ +ή +ί +α +β +γ +δ +ε +ζ +η +θ +ι +κ +λ +μ +ν +ξ +ο +π +ρ +ς +σ +τ +υ +φ +χ +ψ +ω +ό +ύ +ώ +І +Ј +А +Б +В +Г +Д +Е +Ж +З +И +К +Л +М +Н +О +П +Р +С +Т +У +Ф +Х +Ц +Ч +Ш +Э +Ю +Я +а +б +в +г +д +е +ж +з +и +й +к +л +м +н +о +п +р +с +т +у +ф +х +ц +ч +ш +щ +ъ +ы +ь +э +ю +я +ё +і +ї +ј +њ +ћ +Ա +Հ +ա +ե +ի +կ +մ +յ +ն +ո +ս +տ +ր +ւ +ְ +ִ +ֵ +ֶ +ַ +ָ +ֹ +ּ +א +ב +ג +ד +ה +ו +ז +ח +ט +י +כ +ל +ם +מ +ן +נ +ס +ע +פ +צ +ק +ר +ש +ת +، +ء +آ +أ +إ +ئ +ا +ب +ة +ت +ث +ج +ح +خ +د +ذ +ر +ز +س +ش +ص +ض +ط +ظ +ع +غ +ف +ق +ك +ل +م +ن +ه +و +ى +ي +َ +ِ +ٹ +پ +چ +ک +گ +ہ +ی +ے +ं +आ +क +ग +च +ज +ण +त +द +ध +न +प +ब +भ +म +य +र +ल +व +श +ष +स +ह +ा +ि +ी +ु +े +ो +् +। +॥ +আ +ই +এ +ও +ক +খ +গ +চ +ছ +জ +ট +ত +থ +দ +ধ +ন +প +ব +ম +য +র +ল +শ +স +হ +় +া +ি +ী +ু +ে +ো +্ +য় +க +த +ப +ம +ய +ர +ல +வ +ா +ி +ு +் +ร +་ +ག +ང +ད +ན +བ +མ +ར +ལ +ས +ི +ུ +ེ +ོ +ა +ე +ი +ლ +ნ +ო +რ +ს +ᴬ +ᴵ +ᵀ +ᵃ +ᵇ +ᵈ +ᵉ +ᵍ +ᵏ +ᵐ +ᵒ +ᵖ +ᵗ +ᵘ +ᵢ +ᵣ +ᵤ +ᵥ +ᶜ +ᶠ +ḍ +Ḥ +ḥ +Ḩ +ḩ +ḳ +ṃ +ṅ +ṇ +ṛ +ṣ +ṭ +ạ +ả +ấ +ầ +ẩ +ậ +ắ +ế +ề +ể +ễ +ệ +ị +ọ +ố +ồ +ổ +ộ +ớ +ờ +ợ +ụ +ủ +ứ +ừ +ử +ữ +ự +ỳ +ỹ +ἀ +ἐ +ὁ +ὐ +ὰ +ὶ +ὸ +ῆ +ῖ +ῦ +ῶ +‐ +‑ +‒ +– +— +― +‖ +‘ +’ +‚ +“ +” +„ +† +‡ +• +… +‰ +′ +″ +⁄ +⁰ +ⁱ +⁴ 
+⁵ +⁶ +⁷ +⁸ +⁹ +⁺ +⁻ +ⁿ +₀ +₁ +₂ +₃ +₄ +₅ +₆ +₇ +₈ +₉ +₊ +₍ +₎ +ₐ +ₑ +ₒ +ₓ +ₕ +ₖ +ₘ +ₙ +ₚ +ₛ +ₜ +₤ +€ +₱ +₹ +ℓ +№ +ℝ +⅓ +← +↑ +→ +↔ +⇌ +⇒ +∂ +∈ +− +∗ +∘ +√ +∞ +∧ +∨ +∩ +∪ +≈ +≠ +≡ +≤ +≥ +⊂ +⊆ +⊕ +⋅ +─ +│ +■ +● +★ +☆ +☉ +♠ +♣ +♥ +♦ +♭ +♯ +⟨ +⟩ +ⱼ +、 +。 +《 +》 +「 +」 +『 +』 +〜 +い +う +え +お +か +き +く +け +こ +さ +し +す +せ +そ +た +ち +つ +て +と +な +に +の +は +ひ +ま +み +む +め +も +や +ゆ +よ +ら +り +る +れ +ん +ア +ィ +イ +ウ +エ +オ +カ +ガ +キ +ク +グ +コ +サ +シ +ジ +ス +ズ +タ +ダ +ッ +テ +デ +ト +ド +ナ +ニ +ハ +バ +パ +フ +ブ +プ +マ +ミ +ム +ャ +ュ +ラ +リ +ル +レ +ロ +ン +・ +ー +一 +三 +上 +下 +中 +事 +二 +井 +京 +人 +亻 +仁 +佐 +侍 +光 +公 +力 +北 +十 +南 +原 +口 +史 +司 +吉 +同 +和 +囗 +国 +國 +土 +城 +士 +大 +天 +太 +夫 +女 +子 +宀 +安 +宮 +宿 +小 +尚 +山 +島 +川 +州 +平 +年 +心 +愛 +戸 +文 +新 +方 +日 +明 +星 +書 +月 +木 +本 +李 +村 +東 +松 +林 +正 +武 +氏 +水 +氵 +江 +河 +海 +版 +犬 +王 +生 +田 +白 +皇 +省 +真 +石 +社 +神 +竹 +美 +義 +花 +藤 +西 +谷 +車 +辶 +道 +郎 +郡 +部 +野 +金 +長 +門 +陽 +青 +食 +馬 +高 +龍 +龸 +사 +씨 +의 +이 +한 +fi +fl +! +( +) +, +- +/ +: +the +of +and +to +in +was +The +is +for +as +on +with +that +##s +his +by +he +at +from +it +her +He +had +an +were +you +be +In +she +are +but +which +It +not +or +have +my +him +one +this +me +has +also +up +their +first +out +who +been +they +She +into +all +would +its +##ing +time +two +##a +##e +said +about +when +over +more +other +can +after +back +them +then +##ed +there +like +so +only +##n +could +##d +##i +##y +what +no +##o +where +This +made +than +if +You +##ly +through +we +before +##r +just +some +##er +years +do +New +##t +down +between +new +now +will +three +most +On +around +year +used +such +being +well +during +They +know +against +under +later +did +part +known +off +while +His +re +... 
+##l +people +until +way +American +didn +University +your +both +many +get +United +became +head +There +second +As +work +any +But +still +again +born +even +eyes +After +including +de +took +And +long +team +season +family +see +right +same +called +name +because +film +don +10 +found +much +school +##es +going +won +place +away +We +day +left +John +000 +hand +since +World +these +how +make +number +each +life +area +man +four +go +No +here +very +National +##m +played +released +never +began +States +album +home +last +too +held +several +May +own +##on +take +end +School +##h +ll +series +What +want +use +another +city +When +2010 +side +At +may +That +came +face +June +think +game +those +high +March +early +September +##al +2011 +looked +July +state +small +thought +went +January +October +##u +based +August +##us +world +good +April +York +us +12 +2012 +2008 +For +2009 +group +along +few +South +little +##k +following +November +something +2013 +December +set +2007 +old +2006 +2014 +located +##an +music +County +City +former +##in +room +ve +next +All +##man +got +father +house +##g +body +15 +20 +18 +started +If +2015 +town +our +line +War +large +population +named +British +company +member +five +My +single +##en +age +State +moved +February +11 +Her +should +century +government +built +come +best +show +However +within +look +men +door +without +need +wasn +2016 +water +One +system +knew +every +died +League +turned +asked +North +St +wanted +building +received +song +served +though +felt +##ia +station +band +##ers +local +public +himself +different +death +say +##1 +30 +##2 +2005 +16 +night +behind +children +English +members +near +saw +together +son +14 +voice +village +13 +hands +help +##3 +due +French +London +top +told +open +published +third +2017 +play +across +During +put +final +often +include +25 +##le +main +having +2004 +once +ever +let +book +led +gave +late +front +find +club +##4 +German +included +species +College +form +opened +mother 
+women +enough +West +must +2000 +power +really +17 +making +half +##6 +order +might +##is +given +million +times +days +point +full +service +With +km +major +##7 +original +become +seen +II +north +six +##te +love +##0 +national +International +##5 +24 +So +District +lost +run +couldn +career +always +##9 +2003 +##th +country +##z +House +air +tell +south +worked +woman +player +##A +almost +war +River +##ic +married +continued +Then +James +close +black +short +##8 +##na +using +history +returned +light +car +##ra +sure +William +things +General +##ry +2002 +better +support +100 +among +From +feet +King +anything +21 +19 +established +district +2001 +feel +great +##ton +level +Cup +These +written +games +others +already +title +story +##p +law +thing +US +record +role +however +By +students +England +white +control +least +inside +land +##C +22 +give +community +hard +##ie +non +##c +produced +George +round +period +Park +business +various +##ne +does +present +wife +far +taken +per +reached +David +able +version +working +young +live +created +joined +East +living +appeared +case +High +done +23 +important +President +Award +France +position +office +looking +total +general +class +To +production +##S +football +party +brother +keep +mind +free +Street +hair +announced +development +either +nothing +moment +Church +followed +wrote +why +India +San +election +1999 +lead +How +##ch +##rs +words +European +course +considered +America +arms +Army +political +##la +28 +26 +west +east +ground +further +church +less +site +First +Not +Australia +toward +California +##ness +described +works +An +Council +heart +past +military +27 +##or +heard +field +human +soon +founded +1998 +playing +trying +##x +##ist +##ta +television +mouth +although +taking +win +fire +Division +##ity +Party +Royal +program +Some +Don +Association +According +tried +TV +Paul +outside +daughter +Best +While +someone +match +recorded +Canada +closed +region +Air +above +months +elected +##da 
+##ian +road +##ar +brought +move +1997 +leave +##um +Thomas +1996 +am +low +Robert +formed +person +services +points +Mr +miles +##b +stop +rest +doing +needed +international +release +floor +start +sound +call +killed +real +dark +research +finished +language +Michael +professional +change +sent +50 +upon +29 +track +hit +event +2018 +term +example +Germany +similar +return +##ism +fact +pulled +stood +says +ran +information +yet +result +developed +girl +##re +God +1995 +areas +signed +decided +##ment +Company +seemed +##el +co +turn +race +common +video +Charles +Indian +##ation +blood +art +red +##able +added +rather +1994 +met +director +addition +design +average +minutes +##ies +##ted +available +bed +coming +friend +idea +kind +Union +Road +remained +##ting +everything +##ma +running +care +finally +Chinese +appointed +1992 +Australian +##ley +popular +mean +teams +probably +##land +usually +project +social +Championship +possible +word +Russian +instead +mi +herself +##T +Peter +Hall +Center +seat +style +money +1993 +else +Department +table +Music +current +31 +features +special +events +character +Two +square +sold +debut +##v +process +Although +Since +##ka +40 +Central +currently +education +placed +lot +China +quickly +forward +seven +##ling +Europe +arm +performed +Japanese +1991 +Henry +Now +Dr +##ion +week +Group +myself +big +UK +Washington +ten +deep +1990 +Club +Japan +space +La +directed +smile +episode +hours +whole +##de +##less +Why +wouldn +designed +strong +training +changed +Society +stage +involved +hadn +towards +leading +police +eight +kept +Institute +study +largest +child +eventually +private +modern +Court +throughout +getting +originally +attack +##E +talk +Great +longer +songs +alone +##ine +wide +dead +walked +shot +##ri +Oh +force +##st +Art +today +friends +Island +Richard +1989 +center +construction +believe +size +White +ship +completed +##B +gone +Just +rock +sat +##R +radio +below +entire +families +league +includes +type 
+lived +official +range +hold +featured +Most +##ter +president +passed +means +##f +forces +lips +Mary +Do +guitar +##ce +food +wall +Of +spent +Its +performance +hear +##P +Western +reported +sister +##et +morning +##M +especially +##ive +Minister +itself +post +bit +groups +1988 +##tion +Black +##ng +Well +raised +sometimes +Canadian +Paris +Spanish +replaced +schools +Academy +leaving +central +female +Christian +Jack +whose +college +onto +provided +##D +##ville +players +actually +stopped +##son +Museum +doesn +##ts +books +fight +allowed +##ur +beginning +Records +awarded +parents +coach +##os +Red +saying +##ck +Smith +Yes +Lake +##L +aircraft +1987 +##ble +previous +ft +action +Italian +African +happened +vocals +Act +future +court +##ge +1986 +degree +phone +##ro +Is +countries +winning +breath +Love +river +matter +Lord +Other +list +self +parts +##ate +provide +cut +shows +plan +1st +interest +##ized +Africa +stated +Sir +fell +owned +earlier +ended +competition +attention +1985 +lower +nearly +bad +older +stay +Saint +##se +certain +1984 +fingers +blue +try +fourth +Grand +##as +king +##nt +makes +chest +movement +states +moving +data +introduced +model +date +section +Los +deal +##I +skin +entered +middle +success +Texas +##w +summer +island +##N +Republic +length +husband +1980 +##ey +reason +anyone +forced +via +base +500 +job +covered +Festival +Roman +successful +rights +cover +Man +writing +Ireland +##F +related +goal +takes +buildings +true +weeks +1983 +Because +opening +novel +ISBN +meet +gold +##ous +mid +km² +standing +Football +Chicago +shook +whom +##ki +1982 +Day +feeling +scored +boy +higher +Force +leader +heavy +fall +question +sense +army +Second +energy +meeting +themselves +kill +##am +board +census +##ya +##ns +mine +meant +market +required +battle +campaign +attended +approximately +Kingdom +runs +active +##ha +contract +clear +previously +health +1979 +Arts +complete +Catholic +couple +units +##ll +##ty +Committee +shoulder +sea 
+systems +listed +##O +caught +tournament +##G +northern +author +Film +Your +##men +holding +offered +personal +1981 +southern +artist +traditional +studio +200 +capital +##ful +regular +ask +giving +organization +month +news +Are +read +managed +helped +studied +student +defeated +natural +industry +Year +noted +decision +Government +quite +##id +smiled +1972 +Maybe +tracks +##ke +Mark +al +media +engine +hour +Their +relationship +plays +property +structure +1976 +ago +Hill +Martin +1978 +ready +Many +Like +Bay +immediately +generally +Italy +Greek +practice +caused +division +significant +Joseph +speed +Let +thinking +completely +1974 +primary +mostly +##field +##K +1975 +##to +Even +writer +##led +dropped +magazine +collection +understand +route +highest +particular +films +lines +network +Science +loss +carried +direction +green +1977 +location +producer +according +Women +Queen +neck +thus +independent +view +1970 +Angeles +Soviet +distance +problem +Board +tour +western +income +appearance +access +Mexico +nodded +street +surface +arrived +believed +Old +1968 +1973 +becoming +whether +1945 +figure +singer +stand +Following +issue +window +wrong +pain +everyone +lives +issues +park +slowly +la +act +##va +bring +Lee +operations +key +comes +fine +cold +famous +Navy +1971 +Me +additional +individual +##ner +Zealand +goals +county +contains +Service +minute +2nd +reach +talking +particularly +##ham +movie +Director +glass +paper +studies +##co +railway +standard +Education +45 +represented +Chief +Louis +launched +Star +terms +60 +1969 +experience +watched +Another +Press +Tom +staff +starting +subject +break +Virginia +nine +eye +##age +evidence +foot +##est +companies +Prince +##V +gun +create +Big +People +guy +Green +simply +numerous +##line +increased +twenty +##ga +##do +1967 +award +officer +stone +Before +material +Northern +grew +male +plant +Life +legs +step +Al +unit +35 +except +answer +##U +report +response +Edward +commercial +edition +trade 
+science +##ca +Irish +Law +shown +rate +failed +##ni +remains +changes +mm +limited +larger +Later +cause +waiting +Time +##wood +cost +Bill +manager +activities +likely +allow +operated +retired +##ping +65 +directly +Who +associated +effect +hell +Florida +straight +hot +Valley +management +girls +expected +eastern +Mike +chance +cast +centre +chair +hurt +problems +##li +walk +programs +Team +characters +Battle +edge +pay +maybe +corner +majority +medical +Joe +Summer +##io +attempt +Pacific +command +Radio +##by +names +municipality +1964 +train +economic +Brown +feature +sex +source +agreed +remember +Three +1966 +1965 +Pennsylvania +victory +senior +annual +III +Southern +results +Sam +serving +religious +Jones +appears +##der +despite +claimed +Both +musical +matches +fast +security +selected +Young +double +complex +hospital +chief +Times +##ve +Championships +filled +Public +Despite +beautiful +Research +plans +Province +##ally +Wales +##ko +artists +metal +nearby +Spain +##il +32 +houses +supported +piece +##no +stared +recording +nature +legal +Russia +##ization +remaining +looks +##sh +bridge +closer +cases +scene +marriage +Little +##é +uses +Earth +specific +Frank +theory +Good +discovered +referred +bass +culture +university +presented +Congress +##go +metres +continue +1960 +isn +Awards +meaning +cell +composed +separate +Series +forms +Blue +cross +##tor +increase +test +computer +slightly +Where +Jewish +Town +tree +status +1944 +variety +responsible +pretty +initially +##way +realized +pass +provides +Captain +Alexander +recent +score +broke +Scott +drive +financial +showed +Line +stories +ordered +soldiers +genus +operation +gaze +sitting +society +Only +hope +actor +follow +Empire +Yeah +technology +happy +focus +policy +spread +situation +##ford +##ba +Mrs +watch +Can +1963 +Commission +touch +earned +troops +Under +1962 +individuals +cannot +19th +##lin +mile +expression +exactly +suddenly +weight +dance +stepped +places +appear +difficult 
+Railway +anti +numbers +kilometres +star +##ier +department +ice +Britain +removed +Once +##lo +Boston +value +##ant +mission +trees +Order +sports +join +serve +Major +poor +Poland +mainly +Theatre +pushed +Station +##it +Lady +federal +silver +##ler +foreign +##ard +Eastern +##den +box +hall +subsequently +lies +acquired +1942 +ancient +CD +History +Jean +beyond +##ger +El +##les +growing +championship +native +Parliament +Williams +watching +direct +overall +offer +Also +80 +Secretary +spoke +Latin +ability +##ated +safe +presence +##ial +headed +regional +planned +1961 +Johnson +throat +consists +##W +extended +Or +bar +walls +Chris +stations +politician +Olympics +influence +share +fighting +speak +hundred +Carolina +die +stars +##tic +color +Chapter +##ish +fear +sleep +goes +Francisco +oil +Bank +sign +physical +##berg +Dutch +seasons +##rd +Games +Governor +sorry +lack +Centre +memory +baby +smaller +charge +Did +multiple +ships +shirt +Assembly +amount +leaves +3rd +Foundation +conditions +1943 +Rock +Democratic +Daniel +##at +winner +products +##ina +store +latter +Professor +civil +prior +host +1956 +soft +vote +needs +Each +rules +1958 +pressure +letter +normal +proposed +levels +records +1959 +paid +intended +Victoria +purpose +okay +historical +issued +1980s +broadcast +rule +simple +picked +firm +Sea +1941 +Elizabeth +1940 +serious +featuring +highly +graduated +mentioned +choice +1948 +replied +percent +Scotland +##hi +females +constructed +1957 +settled +Steve +recognized +cities +crew +glanced +kiss +competed +flight +knowledge +editor +More +Conference +##H +fifth +elements +##ee +##tes +function +newspaper +recently +Miss +cultural +brown +twice +Office +1939 +truth +Creek +1946 +households +USA +1950 +quality +##tt +border +seconds +destroyed +pre +wait +ahead +build +image +90 +cars +##mi +33 +promoted +professor +et +bank +medal +text +broken +Middle +revealed +sides +wing +seems +channel +1970s +Ben +loved +effort +officers +Will +##ff +70 
+Israel +Jim +upper +fully +label +Jr +assistant +powerful +pair +positive +##ary +gives +1955 +20th +races +remain +kitchen +primarily +##ti +Sydney +easy +Tour +whispered +buried +300 +News +Polish +1952 +Duke +Columbia +produce +accepted +00 +approach +minor +1947 +Special +44 +Asian +basis +visit +Fort +Civil +finish +formerly +beside +leaned +##ite +median +rose +coast +effects +supposed +Cross +##hip +Corps +residents +Jackson +##ir +Bob +basketball +36 +Asia +seem +Bishop +Book +##ber +ring +##ze +owner +BBC +##ja +transferred +acting +De +appearances +walking +Le +press +grabbed +1954 +officially +1953 +##pe +risk +taught +review +##X +lay +##well +council +Avenue +seeing +losing +Ohio +Super +province +ones +travel +##sa +projects +equipment +spot +Berlin +administrative +heat +potential +shut +capacity +elections +growth +fought +Republican +mixed +Andrew +teacher +turning +strength +shoulders +beat +wind +1949 +Health +follows +camp +suggested +perhaps +Alex +mountain +contact +divided +candidate +fellow +34 +Show +necessary +workers +ball +horse +ways +questions +protect +gas +activity +younger +bottom +founder +Scottish +screen +treatment +easily +com +##house +dedicated +Master +warm +Night +Georgia +Long +von +##me +perfect +website +1960s +piano +efforts +##ide +Tony +sort +offers +Development +Simon +executive +##nd +save +Over +Senate +1951 +1990s +draw +master +Police +##ius +renamed +boys +initial +prominent +damage +Co +##ov +##za +online +begin +occurred +captured +youth +Top +account +tells +Justice +conducted +forest +##town +bought +teeth +Jersey +##di +purchased +agreement +Michigan +##ure +campus +prison +becomes +product +secret +guess +Route +huge +types +drums +64 +split +defeat +estate +housing +##ot +brothers +Coast +declared +happen +titled +therefore +sun +commonly +alongside +Stadium +library +Home +article +steps +telling +slow +assigned +refused +laughed +wants +Nick +wearing +Rome +Open +##ah +Hospital +pointed +Taylor +lifted 
+escape +participated +##j +drama +parish +Santa +##per +organized +mass +pick +Airport +gets +Library +unable +pull +Live +##ging +surrounding +##ries +focused +Adam +facilities +##ning +##ny +38 +##ring +notable +era +connected +gained +operating +laid +Regiment +branch +defined +Christmas +machine +Four +academic +Iran +adopted +concept +Men +compared +search +traffic +Max +Maria +greater +##ding +widely +##burg +serves +1938 +37 +Go +hotel +shared +typically +scale +1936 +leg +suffered +yards +pieces +Ministry +Wilson +episodes +empty +1918 +safety +continues +yellow +historic +settlement +400 +Come +Corporation +enemy +content +picture +evening +territory +method +trial +solo +driver +Here +##ls +entrance +Prize +spring +whatever +##ent +75 +##ji +reading +Arthur +##cy +Our +clothes +Prime +Illinois +Kong +code +##ria +sit +Harry +Federal +chosen +administration +bodies +begins +stomach +Though +seats +Hong +density +Sun +leaders +Field +museum +chart +platform +languages +##ron +birth +holds +Gold +##un +fish +combined +##ps +4th +1937 +largely +captain +trust +Game +van +boat +Oxford +basic +beneath +Islands +painting +nice +Toronto +path +males +sources +block +conference +parties +murder +clubs +crowd +calling +About +Business +peace +knows +lake +speaking +stayed +Brazil +allowing +Born +unique +thick +Technology +##que +receive +des +semi +alive +noticed +format +##ped +coffee +digital +##ned +handed +guard +tall +faced +setting +plants +partner +claim +reduced +temple +animals +determined +classes +##out +estimated +##ad +Olympic +providing +Massachusetts +learned +Inc +Philadelphia +Social +carry +42 +possibly +hosted +tonight +respectively +Today +shape +Mount +roles +designated +brain +etc +Korea +thoughts +Brian +Highway +doors +background +drew +models +footballer +tone +turns +1935 +quiet +tower +wood +bus +write +software +weapons +flat +marked +1920 +newly +tight +Eric +finger +Journal +FC +Van +rise +critical +Atlantic +granted +returning 
+communities +humans +quick +39 +48 +ranked +sight +pop +Swedish +Stephen +card +analysis +attacked +##wa +Sunday +identified +Jason +champion +situated +1930 +expanded +tears +##nce +reaching +Davis +protection +Emperor +positions +nominated +Bridge +tax +dress +allows +avoid +leadership +killing +actress +guest +steel +knowing +electric +cells +disease +grade +unknown +##ium +resulted +Pakistan +confirmed +##ged +tongue +covers +##Y +roof +entirely +applied +votes +drink +interview +exchange +Township +reasons +##ised +page +calls +dog +agent +nose +teaching +##ds +##ists +advanced +wish +Golden +existing +vehicle +del +1919 +develop +attacks +pressed +Sports +planning +resulting +facility +Sarah +notes +1933 +Class +Historic +winter +##mo +audience +Community +household +Netherlands +creation +##ize +keeping +1914 +claims +dry +guys +opposite +##ak +explained +Ontario +secondary +difference +Francis +actions +organizations +yard +animal +Up +Lewis +titles +Several +1934 +Ryan +55 +Supreme +rolled +1917 +distribution +figures +afraid +rural +yourself +##rt +sets +barely +Instead +passing +awards +41 +silence +authority +occupied +environment +windows +engineering +surprised +flying +crime +reports +Mountain +powers +driving +succeeded +reviews +1929 +Head +missing +Song +Jesus +opportunity +inspired +ends +albums +conversation +impact +injury +surprise +billion +learning +heavily +oldest +union +creating +##ky +festival +literature +letters +sexual +##tte +apartment +Final +comedy +nation +orders +##sen +contemporary +Power +drawn +existence +connection +##ating +Post +Junior +remembered +message +Medal +castle +note +engineer +sounds +Beach +crossed +##dy +ear +scientific +sales +##ai +theme +starts +clearly +##ut +trouble +##gan +bag +##han +BC +sons +1928 +silent +versions +daily +Studies +ending +Rose +guns +1932 +headquarters +reference +obtained +Squadron +concert +none +du +Among +##don +prevent +Member +answered +staring +Between +##lla +portion +drug 
+liked +association +performances +Nations +formation +Castle +lose +learn +scoring +relatively +quarter +47 +Premier +##ors +Sweden +baseball +attempted +trip +worth +perform +airport +fields +enter +honor +Medical +rear +commander +officials +condition +supply +materials +52 +Anna +volume +threw +Persian +43 +interested +Gallery +achieved +visited +laws +relief +Area +Matt +singles +Lieutenant +Country +fans +Cambridge +sky +Miller +effective +tradition +Port +##ana +minister +extra +entitled +System +sites +authorities +acres +committee +racing +1931 +desk +trains +ass +weren +Family +farm +##ance +industrial +##head +iron +49 +abandoned +Out +Holy +chairman +waited +frequently +display +Light +transport +starring +Patrick +Engineering +eat +FM +judge +reaction +centuries +price +##tive +Korean +defense +Get +arrested +1927 +send +urban +##ss +pilot +Okay +Media +reality +arts +soul +thirty +##be +catch +generation +##nes +apart +Anne +drop +See +##ving +sixth +trained +Management +magic +cm +height +Fox +Ian +resources +vampire +principal +Was +haven +##au +Walter +Albert +rich +1922 +causing +entry +##ell +shortly +46 +worry +doctor +composer +rank +Network +bright +showing +regions +1924 +wave +carrying +kissed +finding +missed +Earl +lying +target +vehicles +Military +controlled +dinner +##board +briefly +lyrics +motion +duty +strange +attempts +invited +kg +villages +5th +Land +##mer +Christ +prepared +twelve +check +thousand +earth +copies +en +transfer +citizens +Americans +politics +nor +theatre +Project +##bo +clean +rooms +laugh +##ran +application +contained +anyway +containing +Sciences +1925 +rare +speech +exist +1950s +falling +passenger +##im +stands +51 +##ol +##ow +phase +governor +kids +details +methods +Vice +employed +performing +counter +Jane +heads +Channel +wine +opposition +aged +1912 +Every +1926 +highway +##ura +1921 +aired +978 +permanent +Forest +finds +joint +approved +##pur +brief +doubt +acts +brand +wild +closely +Ford +Kevin 
+chose +shall +port +sweet +fun +asking +Be +##bury +sought +Dave +Mexican +mom +Right +Howard +Moscow +Charlie +Stone +##mann +admitted +##ver +wooden +1923 +Officer +relations +Hot +combat +publication +chain +shop +inhabitants +proved +ideas +address +1915 +Memorial +explain +increasing +conflict +Anthony +Melbourne +narrow +temperature +slid +1916 +worse +selling +documentary +Ali +Ray +opposed +vision +dad +extensive +Infantry +commissioned +Doctor +offices +programming +core +respect +storm +##pa +##ay +##om +promotion +der +struck +anymore +shit +Region +receiving +DVD +alternative +##ue +ride +maximum +1910 +##ious +Third +Affairs +cancer +Executive +##op +dream +18th +Due +##ker +##worth +economy +IV +Billboard +identity +subsequent +statement +skills +##back +funding +##ons +Round +Foreign +truck +Please +lights +wondered +##ms +frame +yes +Still +districts +fiction +Colonel +converted +150 +grown +accident +critics +fit +Information +architecture +Point +Five +armed +Billy +poet +functions +consisted +suit +Turkish +Band +object +desire +##ities +sounded +flow +Norwegian +articles +Marie +pulling +thin +singing +Hunter +Human +Battalion +Federation +Kim +origin +represent +dangerous +weather +fuel +ex +##sing +Last +bedroom +aid +knees +Alan +angry +assumed +plane +Something +founding +concerned +global +Fire +di +please +Portuguese +touched +Roger +nuclear +Register +Jeff +fixed +royal +lie +finals +NFL +Manchester +towns +handle +shaped +Chairman +Dean +launch +understanding +Children +violence +failure +sector +Brigade +wrapped +fired +sharp +tiny +developing +expansion +Free +institutions +technical +Nothing +otherwise +Main +inch +Saturday +wore +Senior +attached +cheek +representing +Kansas +##chi +##kin +actual +advantage +Dan +Austria +##dale +hoped +multi +squad +Norway +streets +1913 +Services +hired +grow +pp +wear +painted +Minnesota +stuff +Building +54 +Philippines +1900 +##ties +educational +Khan +Magazine +##port +Cape +signal +Gordon 
+sword +Anderson +cool +engaged +Commander +images +Upon +tied +Security +cup +rail +Vietnam +successfully +##red +Muslim +gain +bringing +Native +hers +occurs +negative +Philip +Kelly +Colorado +category +##lan +600 +Have +supporting +wet +56 +stairs +Grace +observed +##ung +funds +restaurant +1911 +Jews +##ments +##che +Jake +Back +53 +asks +journalist +accept +bands +bronze +helping +##ice +decades +mayor +survived +usual +influenced +Douglas +Hey +##izing +surrounded +retirement +Temple +derived +Pope +registered +producing +##ral +structures +Johnny +contributed +finishing +buy +specifically +##king +patients +Jordan +internal +regarding +Samuel +Clark +##q +afternoon +Finally +scenes +notice +refers +quietly +threat +Water +Those +Hamilton +promise +freedom +Turkey +breaking +maintained +device +lap +ultimately +Champion +Tim +Bureau +expressed +investigation +extremely +capable +qualified +recognition +items +##up +Indiana +adult +rain +greatest +architect +Morgan +dressed +equal +Antonio +collected +drove +occur +Grant +graduate +anger +Sri +worried +standards +##ore +injured +somewhere +damn +Singapore +Jimmy +pocket +homes +stock +religion +aware +regarded +Wisconsin +##tra +passes +fresh +##ea +argued +Ltd +EP +Diego +importance +Census +incident +Egypt +Missouri +domestic +leads +ceremony +Early +camera +Father +challenge +Switzerland +lands +familiar +hearing +spend +educated +Tennessee +Thank +##ram +Thus +concern +putting +inches +map +classical +Allen +crazy +valley +Space +softly +##my +pool +worldwide +climate +experienced +neighborhood +scheduled +neither +fleet +1908 +Girl +##J +Part +engines +locations +darkness +Revolution +establishment +lawyer +objects +apparently +Queensland +Entertainment +bill +mark +Television +##ong +pale +demand +Hotel +selection +##rn +##ino +Labour +Liberal +burned +Mom +merged +Arizona +request +##lia +##light +hole +employees +##ical +incorporated +95 +independence +Walker +covering +joining +##ica +task +papers 
+backing +sell +biggest +6th +strike +establish +##ō +gently +59 +Orchestra +Winter +protein +Juan +locked +dates +Boy +aren +shooting +Luke +solid +charged +Prior +resigned +interior +garden +spoken +improve +wonder +promote +hidden +##med +combination +Hollywood +Swiss +consider +##ks +Lincoln +literary +drawing +Marine +weapon +Victor +Trust +Maryland +properties +##ara +exhibition +understood +hung +Tell +installed +loud +fashion +affected +junior +landing +flowers +##he +Internet +beach +Heart +tries +Mayor +programme +800 +wins +noise +##ster +##ory +58 +contain +fair +delivered +##ul +wedding +Square +advance +behavior +Program +Oregon +##rk +residence +realize +certainly +hill +Houston +57 +indicated +##water +wounded +Village +massive +Moore +thousands +personnel +dating +opera +poetry +##her +causes +feelings +Frederick +applications +push +approached +foundation +pleasure +sale +fly +gotten +northeast +costs +raise +paintings +##ney +views +horses +formal +Arab +hockey +typical +representative +rising +##des +clock +stadium +shifted +Dad +peak +Fame +vice +disappeared +users +Way +Naval +prize +hoping +values +evil +Bell +consisting +##ón +Regional +##ics +improved +circle +carefully +broad +##ini +Fine +maintain +operate +offering +mention +Death +stupid +Through +Princess +attend +interests +ruled +somewhat +wings +roads +grounds +##ual +Greece +Champions +facing +hide +voted +require +Dark +Matthew +credit +sighed +separated +manner +##ile +Boys +1905 +committed +impossible +lip +candidates +7th +Bruce +arranged +Islamic +courses +criminal +##ened +smell +##bed +08 +consecutive +##ening +proper +purchase +weak +Prix +1906 +aside +introduction +Look +##ku +changing +budget +resistance +factory +Forces +agency +##tone +northwest +user +1907 +stating +##one +sport +Design +environmental +cards +concluded +Carl +250 +accused +##ology +Girls +sick +intelligence +Margaret +responsibility +Guard +##tus +17th +sq +goods +1909 +hate +##ek +capture +stores 
+Gray +comic +Modern +Silver +Andy +electronic +wheel +##ied +Deputy +##bs +Czech +zone +choose +constant +reserve +##lle +Tokyo +spirit +sub +degrees +flew +pattern +compete +Dance +##ik +secretary +Imperial +99 +reduce +Hungarian +confused +##rin +Pierre +describes +regularly +Rachel +85 +landed +passengers +##ise +##sis +historian +meters +Youth +##ud +participate +##cing +arrival +tired +Mother +##gy +jumped +Kentucky +faces +feed +Israeli +Ocean +##Q +##án +plus +snow +techniques +plate +sections +falls +jazz +##ris +tank +loan +repeated +opinion +##res +unless +rugby +journal +Lawrence +moments +shock +distributed +##ded +adjacent +Argentina +crossing +uncle +##ric +Detroit +communication +mental +tomorrow +session +Emma +Without +##gen +Miami +charges +Administration +hits +coat +protected +Cole +invasion +priest +09 +Gary +enjoyed +plot +measure +bound +friendly +throw +musician +##lon +##ins +Age +knife +damaged +birds +driven +lit +ears +breathing +Arabic +Jan +faster +Jonathan +##gate +Independent +starred +Harris +teachers +Alice +sequence +mph +file +translated +decide +determine +Review +documents +sudden +threatened +##ft +bear +distinct +decade +burning +##sky +1930s +replace +begun +extension +##time +1904 +equivalent +accompanied +Christopher +Danish +##ye +Besides +##more +persons +fallen +Rural +roughly +saved +willing +ensure +Belgium +05 +musicians +##ang +giant +Six +Retrieved +worst +purposes +##bly +mountains +seventh +slipped +brick +07 +##py +somehow +Carter +Iraq +cousin +favor +islands +journey +FIFA +contrast +planet +vs +calm +##ings +concrete +branches +gray +profit +Russell +##ae +##ux +##ens +philosophy +businesses +talked +parking +##ming +owners +Place +##tle +agricultural +Kate +06 +southeast +draft +Eddie +earliest +forget +Dallas +Commonwealth +edited +66 +inner +ed +operates +16th +Harvard +assistance +##si +designs +Take +bathroom +indicate +CEO +Command +Louisiana +1902 +Dublin +Books +1901 +tropical +1903 +##tors +Places 
+tie +progress +forming +solution +62 +letting +##ery +studying +##jo +duties +Baseball +taste +Reserve +##ru +Ann +##gh +visible +##vi +notably +link +NCAA +southwest +Never +storage +mobile +writers +favorite +Pro +pages +truly +count +##tta +string +kid +98 +Ross +row +##idae +Kennedy +##tan +Hockey +hip +waist +grandfather +listen +##ho +feels +busy +72 +stream +obvious +cycle +shaking +Knight +##ren +Carlos +painter +trail +web +linked +04 +Palace +existed +##ira +responded +closing +End +examples +Marshall +weekend +jaw +Denmark +lady +township +medium +chin +Story +option +fifteen +Moon +represents +makeup +investment +jump +childhood +Oklahoma +roll +normally +Ten +Operation +Graham +Seattle +Atlanta +paused +promised +rejected +treated +returns +flag +##ita +Hungary +danger +glad +movements +visual +subjects +credited +soldier +Norman +ill +translation +José +Quebec +medicine +warning +theater +praised +municipal +01 +commune +churches +acid +folk +8th +testing +add +survive +Sound +devices +residential +severe +presidential +Mississippi +Austin +Perhaps +Charlotte +hanging +Montreal +grin +##ten +racial +partnership +shoot +shift +##nie +Les +downtown +Brothers +Garden +matters +restored +mirror +forever +winners +rapidly +poverty +##ible +Until +DC +faith +hundreds +Real +Ukraine +Nelson +balance +Adams +contest +relative +ethnic +Edinburgh +composition +##nts +emergency +##van +marine +reputation +Down +pack +12th +Communist +Mountains +pro +stages +measures +##ld +ABC +Li +victims +benefit +Iowa +Broadway +gathered +rating +Defense +classic +##ily +ceiling +##ions +snapped +Everything +constituency +Franklin +Thompson +Stewart +entering +Judge +forth +##sk +wanting +smiling +moves +tunnel +premiered +grass +unusual +Ukrainian +bird +Friday +tail +Portugal +coal +element +Fred +guards +Senator +collaboration +beauty +Wood +chemical +beer +justice +signs +##Z +sees +##zi +Puerto +##zed +96 +smooth +Bowl +gift +limit +97 +heading +Source +wake +requires 
+Ed +Constitution +factor +Lane +factors +adding +Note +cleared +pictures +pink +##ola +Kent +Local +Singh +moth +Ty +##ture +courts +Seven +temporary +involving +Vienna +emerged +fishing +agree +defensive +stuck +secure +Tamil +##ick +bottle +03 +Player +instruments +Spring +patient +flesh +contributions +cry +Malaysia +120 +Global +da +Alabama +Within +##work +debuted +expect +Cleveland +concerns +retained +horror +10th +spending +Peace +Transport +grand +Crown +instance +institution +acted +Hills +mounted +Campbell +shouldn +1898 +##ably +chamber +soil +88 +Ethan +sand +cheeks +##gi +marry +61 +weekly +classification +DNA +Elementary +Roy +definitely +Soon +Rights +gate +suggests +aspects +imagine +golden +beating +Studios +Warren +differences +significantly +glance +occasionally +##od +clothing +Assistant +depth +sending +possibility +mode +prisoners +requirements +daughters +dated +Representatives +prove +guilty +interesting +smoke +cricket +93 +##ates +rescue +Connecticut +underground +Opera +13th +reign +##ski +thanks +leather +equipped +routes +fan +##ans +script +Wright +bishop +Welsh +jobs +faculty +eleven +Railroad +appearing +anniversary +Upper +##down +anywhere +Rugby +Metropolitan +Meanwhile +Nicholas +champions +forehead +mining +drinking +76 +Jerry +membership +Brazilian +Wild +Rio +scheme +Unlike +strongly +##bility +fill +##rian +easier +MP +Hell +##sha +Stanley +banks +Baron +##ique +Robinson +67 +Gabriel +Austrian +Wayne +exposed +##wan +Alfred +1899 +manage +mix +visitors +eating +##rate +Sean +commission +Cemetery +policies +Camp +parallel +traveled +guitarist +02 +supplies +couples +poem +blocks +Rick +Training +Energy +achieve +appointment +Wing +Jamie +63 +novels +##em +1890 +songwriter +Base +Jay +##gar +naval +scared +miss +labor +technique +crisis +Additionally +backed +destroy +seriously +tools +tennis +91 +god +##ington +continuing +steam +obviously +Bobby +adapted +fifty +enjoy +Jacob +publishing +column +##ular +Baltimore +Donald 
+Liverpool +92 +drugs +movies +##ock +Heritage +##je +##istic +vocal +strategy +gene +advice +##bi +Ottoman +riding +##side +Agency +Indonesia +11th +laughing +sleeping +und +muttered +listening +deck +tip +77 +ownership +grey +Claire +deeply +provincial +popularity +Cooper +##á +Emily +##sed +designer +Murray +describe +Danny +Around +Parker +##dae +68 +rates +suffering +considerable +78 +nervous +powered +tons +circumstances +wished +belonged +Pittsburgh +flows +9th +##use +belt +81 +useful +15th +context +List +Dead +Iron +seek +Season +worn +frequency +legislation +replacement +memories +Tournament +Again +Barry +organisation +copy +Gulf +waters +meets +struggle +Oliver +1895 +Susan +protest +kick +Alliance +components +1896 +Tower +Windows +demanded +regiment +sentence +Woman +Logan +Referee +hosts +debate +knee +Blood +##oo +universities +practices +Ward +ranking +correct +happening +Vincent +attracted +classified +##stic +processes +immediate +waste +increasingly +Helen +##po +Lucas +Phil +organ +1897 +tea +suicide +actors +lb +crash +approval +waves +##ered +hated +grip +700 +amongst +69 +74 +hunting +dying +lasted +illegal +##rum +stare +defeating +##gs +shrugged +°C +Jon +Count +Orleans +94 +affairs +formally +##and +##ves +criticized +Disney +Vol +successor +tests +scholars +palace +Would +celebrated +rounds +grant +Schools +Such +commanded +demon +Romania +##all +Karl +71 +##yn +84 +Daily +totally +Medicine +fruit +Die +upset +Lower +Conservative +14th +Mitchell +escaped +shoes +Morris +##tz +queen +harder +prime +Thanks +indeed +Sky +authors +rocks +definition +Nazi +accounts +printed +experiences +##ters +divisions +Cathedral +denied +depending +Express +##let +73 +appeal +loose +colors +filed +##isation +gender +##ew +throne +forests +Finland +domain +boats +Baker +squadron +shore +remove +##ification +careful +wound +railroad +82 +seeking +agents +##ved +Blues +##off +customers +ignored +net +##ction +hiding +Originally +declined +##ess +franchise 
+eliminated +NBA +merely +pure +appropriate +visiting +forty +markets +offensive +coverage +cave +##nia +spell +##lar +Benjamin +##ire +Convention +filmed +Trade +##sy +##ct +Having +palm +1889 +Evans +intense +plastic +Julia +document +jeans +vessel +SR +##fully +proposal +Birmingham +le +##ative +assembly +89 +fund +lock +1893 +AD +meetings +occupation +modified +Years +odd +aimed +reform +Mission +Works +shake +cat +exception +convinced +executed +pushing +dollars +replacing +soccer +manufacturing +##ros +expensive +kicked +minimum +Josh +coastal +Chase +ha +Thailand +publications +deputy +Sometimes +Angel +effectively +##illa +criticism +conduct +Serbian +landscape +NY +absence +passage +##ula +Blake +Indians +1892 +admit +Trophy +##ball +Next +##rated +##ians +charts +kW +orchestra +79 +heritage +1894 +rough +exists +boundary +Bible +Legislative +moon +medieval +##over +cutting +print +##ett +birthday +##hood +destruction +Julian +injuries +influential +sisters +raising +statue +colour +dancing +characteristics +orange +##ok +##aries +Ken +colonial +twin +Larry +surviving +##shi +Barbara +personality +entertainment +assault +##ering +talent +happens +license +86 +couch +Century +soundtrack +shower +swimming +cash +Staff +bent +1885 +bay +lunch +##lus +dozen +vessels +CBS +greatly +critic +Test +symbol +panel +shell +output +reaches +87 +Front +motor +ocean +##era +##ala +maintenance +violent +scent +Limited +Las +Hope +Theater +Which +survey +Robin +recordings +compilation +##ward +bomb +insurance +Authority +sponsored +satellite +Jazz +refer +stronger +blow +whilst +Wrestling +suggest +##rie +climbed +##els +voices +shopping +1891 +Neil +discovery +##vo +##ations +burst +Baby +peaked +Brooklyn +knocked +lift +##try +false +nations +Hugh +Catherine +preserved +distinguished +terminal +resolution +ratio +pants +cited +competitions +completion +DJ +bone +uniform +schedule +shouted +83 +1920s +rarely +Basketball +Taiwan +artistic +bare +vampires +arrest +Utah 
+Marcus +assist +gradually +qualifying +Victorian +vast +rival +Warner +Terry +Economic +##cia +losses +boss +versus +audio +runner +apply +surgery +Play +twisted +comfortable +##cs +Everyone +guests +##lt +Harrison +UEFA +lowered +occasions +##lly +##cher +chapter +youngest +eighth +Culture +##room +##stone +1888 +Songs +Seth +Digital +involvement +expedition +relationships +signing +1000 +fault +annually +circuit +afterwards +meat +creature +##ou +cable +Bush +##net +Hispanic +rapid +gonna +figured +extent +considering +cried +##tin +sigh +dynasty +##ration +cabinet +Richmond +stable +##zo +1864 +Admiral +Unit +occasion +shares +badly +longest +##ify +Connor +extreme +wondering +girlfriend +Studio +##tions +1865 +tribe +exact +muscles +hat +Luis +Orthodox +decisions +amateur +description +##lis +hips +kingdom +##ute +Portland +whereas +Bachelor +outer +discussion +partly +Arkansas +1880 +dreams +perfectly +Lloyd +##bridge +asleep +##tti +Greg +permission +trading +pitch +mill +Stage +liquid +Keith +##tal +wolf +processing +stick +Jerusalem +profile +rushed +spiritual +argument +Ice +Guy +till +Delhi +roots +Section +missions +Glasgow +penalty +NBC +encouraged +identify +keyboards +##zing +##ston +disc +plain +informed +Bernard +thinks +fled +Justin +##day +newspapers +##wick +Ralph +##zer +unlike +Stars +artillery +##ified +recovered +arrangement +searching +##pers +##tory +##rus +deaths +Egyptian +diameter +##í +marketing +corporate +teach +marks +Turner +staying +hallway +Sebastian +chapel +naked +mistake +possession +1887 +dominated +jacket +creative +Fellow +Falls +Defence +suspended +employment +##rry +Hebrew +Hudson +Week +Wars +recognize +Natural +controversial +Tommy +thank +Athletic +benefits +decline +intention +##ets +Lost +Wall +participation +elevation +supports +parliament +1861 +concentration +Movement +##IS +competing +stops +behalf +##mm +limits +funded +discuss +Collins +departure +obtain +woods +latest +universe +alcohol +Laura +rush +blade 
+funny +Dennis +forgotten +Amy +Symphony +apparent +graduating +1862 +Rob +Grey +collections +Mason +emotions +##ugh +literally +Any +counties +1863 +nomination +fighter +habitat +respond +external +Capital +exit +Video +carbon +sharing +Bad +opportunities +Perry +photo +##mus +Orange +posted +remainder +transportation +portrayed +Labor +recommended +percussion +rated +Grade +rivers +partially +suspected +strip +adults +button +struggled +intersection +Canal +##ability +poems +claiming +Madrid +1886 +Together +##our +Much +Vancouver +instrument +instrumental +1870 +mad +angle +Control +Phoenix +Leo +Communications +mail +##ette +##ev +preferred +adaptation +alleged +discussed +deeper +##ane +Yet +Monday +volumes +thrown +Zane +##logy +displayed +rolling +dogs +Along +Todd +##ivity +withdrew +representation +belief +##sia +crown +Late +Short +hardly +grinned +romantic +Pete +##ken +networks +enemies +Colin +Eventually +Side +donated +##su +steady +grab +guide +Finnish +Milan +pregnant +controversy +reminded +1884 +Stuart +##bach +##ade +Race +Belgian +LP +Production +Zone +lieutenant +infantry +Child +confusion +sang +resident +##ez +victim +1881 +channels +Ron +businessman +##gle +Dick +colony +pace +producers +##ese +agencies +Craig +Lucy +Very +centers +Yorkshire +photography +##ched +Album +championships +Metro +substantial +Standard +terrible +directors +contribution +advertising +emotional +##its +layer +segment +sir +folded +Roberts +ceased +Hampshire +##ray +detailed +partners +m² +##pt +Beth +genre +commented +generated +remote +aim +Hans +credits +concerts +periods +breakfast +gay +shadow +defence +Too +Had +transition +Afghanistan +##book +eggs +defend +##lli +writes +Systems +bones +mess +seed +scientists +Shortly +Romanian +##zy +Freedom +muscle +hero +parent +agriculture +checked +Islam +Bristol +Freyja +Arena +cabin +Germans +electricity +ranks +viewed +medals +Wolf +associate +Madison +Sorry +fort +Chile +detail +widespread +attorney +boyfriend 
+##nan +Students +Spencer +##ig +bite +Maine +demolished +Lisa +erected +Someone +operational +Commissioner +NHL +Coach +Bar +forcing +Dream +Rico +cargo +Murphy +##fish +##ase +distant +##master +##ora +Organization +doorway +Steven +traded +electrical +frequent +##wn +Branch +Sure +1882 +placing +Manhattan +attending +attributed +excellent +pounds +ruling +principles +component +Mediterranean +Vegas +machines +percentage +infrastructure +throwing +affiliated +Kings +secured +Caribbean +Track +Ted +honour +opponent +Virgin +Construction +grave +produces +Challenge +stretched +paying +murmured +##ata +integrated +waved +Nathan +##ator +transmission +videos +##yan +##hu +Nova +descent +AM +Harold +conservative +Therefore +venue +competitive +##ui +conclusion +funeral +confidence +releases +scholar +##sson +Treaty +stress +mood +##sm +Mac +residing +Action +Fund +##ship +animated +fitted +##kar +defending +voting +tend +##berry +answers +believes +##ci +helps +Aaron +##tis +themes +##lay +populations +Players +stroke +Trinity +electoral +paint +abroad +charity +keys +Fair +##pes +interrupted +participants +murdered +Days +supporters +##ab +expert +borders +mate +##llo +solar +architectural +tension +##bling +Parish +tape +operator +Cultural +Clinton +indicates +publisher +ordinary +sugar +arrive +rifle +acoustic +##uring +assets +##shire +SS +sufficient +options +HMS +Classic +bars +rebuilt +governments +Beijing +reporter +screamed +Abbey +crying +mechanical +instantly +communications +Political +cemetery +Cameron +Stop +representatives +USS +texts +mathematics +innings +civilian +Serbia +##hill +practical +patterns +dust +Faculty +debt +##end +##cus +junction +suppose +experimental +Computer +Food +wrist +abuse +dealing +bigger +cap +principle +##pin +Muhammad +Fleet +Collection +attempting +dismissed +##burn +regime +Herbert +##ua +shadows +1883 +Eve +Lanka +1878 +Performance +fictional +##lock +Noah +Run +Voivodeship +exercise +broadcasting +##fer +RAF +Magic 
+Bangladesh +suitable +##low +##del +styles +toured +Code +identical +links +insisted +110 +flash +Model +slave +Derek +Rev +fairly +Greater +sole +##lands +connecting +zero +bench +##ome +switched +Fall +Owen +yours +Electric +shocked +convention +##bra +climb +memorial +swept +Racing +decides +belong +##nk +parliamentary +##und +ages +proof +##dan +delivery +1860 +##ów +sad +publicly +leaning +Archbishop +dirt +##ose +categories +1876 +burn +##bing +requested +Guinea +Historical +rhythm +relation +##heim +ye +pursue +merchant +##mes +lists +continuous +frowned +colored +tool +gods +involves +Duncan +photographs +Cricket +slight +Gregory +atmosphere +wider +Cook +##tar +essential +Being +FA +emperor +wealthy +nights +##bar +licensed +Hawaii +viewers +Language +load +nearest +milk +kilometers +platforms +##ys +territories +Rogers +sheet +Rangers +contested +##lation +isolated +assisted +swallowed +Small +Contemporary +Technical +Edwards +express +Volume +endemic +##ei +tightly +Whatever +indigenous +Colombia +##ulation +hp +characterized +##ida +Nigeria +Professional +duo +Soccer +slaves +Farm +smart +Attorney +Attendance +Common +salt +##vin +tribes +nod +sentenced +bid +sample +Drive +switch +instant +21st +Cuba +drunk +Alaska +proud +awareness +hitting +sessions +Thai +locally +elsewhere +Dragon +gentle +touching +##lee +Springs +Universal +Latino +spin +1871 +Chart +recalled +Type +pointing +##ii +lowest +##ser +grandmother +Adelaide +Jacques +spotted +Buffalo +restoration +Son +Joan +farmers +Lily +1879 +lucky +##dal +luck +eldest +##rant +Market +drummer +deployed +warned +prince +sing +amazing +sailed +##oon +1875 +Primary +traveling +Masters +Sara +cattle +Trail +gang +Further +desert +relocated +##tch +##ord +Flight +illness +Munich +ninth +repair +Singles +##lated +Tyler +tossed +boots +Work +sized +earning +shoved +magazines +housed +dam +researchers +Former +spun +premiere +spaces +organised +wealth +crimes +devoted +stones +Urban +automatic +hop 
+affect +outstanding +tanks +mechanism +Muslims +Ms +shots +argue +Jeremy +connections +Armenian +increases +rubbed +1867 +retail +gear +Pan +bonus +jurisdiction +weird +concerning +whisper +##gal +Microsoft +tenure +hills +www +Gmina +porch +files +reportedly +venture +Storm +##ence +Nature +killer +panic +fate +Secret +Wang +scream +drivers +belongs +Chamber +clan +monument +mixing +Peru +bet +Riley +Friends +Isaac +submarine +1877 +130 +judges +harm +ranging +affair +prepare +pupils +householder +Policy +decorated +Nation +slammed +activist +implemented +Room +qualify +Publishing +establishing +Baptist +touring +subsidiary +##nal +legend +1872 +laughter +PC +Athens +settlers +ties +dual +dear +Draft +strategic +Ivan +reveal +closest +dominant +Ah +##ult +Denver +bond +boundaries +drafted +tables +##TV +eyed +Edition +##ena +1868 +belonging +1874 +Industrial +cream +Ridge +Hindu +scholarship +Ma +opens +initiated +##ith +yelled +compound +random +Throughout +grades +physics +sank +grows +exclusively +settle +Saints +brings +Amsterdam +Make +Hart +walks +battery +violin +##born +explanation +##ware +1873 +##har +provinces +thrust +exclusive +sculpture +shops +##fire +VI +constitution +Barcelona +monster +Devon +Jefferson +Sullivan +bow +##din +desperate +##ć +Julie +##mon +##ising +terminus +Jesse +abilities +golf +##ple +##via +##away +Raymond +measured +jury +firing +revenue +suburb +Bulgarian +1866 +##cha +timber +Things +##weight +Morning +spots +Alberta +Data +explains +Kyle +friendship +raw +tube +demonstrated +aboard +immigrants +reply +breathe +Manager +ease +##ban +##dia +Diocese +##vy +##ía +pit +ongoing +##lie +Gilbert +Costa +1940s +Report +voters +cloud +traditions +##MS +gallery +Jennifer +swung +Broadcasting +Does +diverse +reveals +arriving +initiative +##ani +Give +Allied +Pat +Outstanding +monastery +blind +Currently +##war +bloody +stopping +focuses +managing +Florence +Harvey +creatures +900 +breast +internet +Artillery +purple +##mate 
+alliance +excited +fee +Brisbane +lifetime +Private +##aw +##nis +##gue +##ika +phrase +regulations +reflected +manufactured +conventional +pleased +client +##ix +##ncy +Pedro +reduction +##con +welcome +jail +comfort +Iranian +Norfolk +Dakota +##tein +evolution +everywhere +Initially +sensitive +Olivia +Oscar +implementation +sits +stolen +demands +slide +grandson +##ich +merger +##mic +Spirit +##° +ticket +root +difficulty +Nevada +##als +lined +Dylan +Original +Call +biological +EU +dramatic +##hn +Operations +treaty +gap +##list +Am +Romanized +moral +Butler +perspective +Furthermore +Manuel +absolutely +unsuccessful +disaster +dispute +preparation +tested +discover +##ach +shield +squeezed +brushed +battalion +Arnold +##ras +superior +treat +clinical +##so +Apple +Syria +Cincinnati +package +flights +editions +Leader +minority +wonderful +hang +Pop +Philippine +telephone +bell +honorary +##mar +balls +Democrat +dirty +thereafter +collapsed +Inside +slip +wrestling +##ín +listened +regard +bowl +None +Sport +completing +trapped +##view +copper +Wallace +Honor +blame +Peninsula +##ert +##oy +Anglo +bearing +simultaneously +honest +##ias +Mix +Got +speaker +voiced +impressed +prices +error +1869 +##feld +trials +Nine +Industry +substitute +Municipal +departed +slept +##ama +Junction +Socialist +flower +dropping +comment +fantasy +##ress +arrangements +travelled +furniture +fist +relieved +##tics +Leonard +linear +earn +expand +Soul +Plan +Leeds +Sierra +accessible +innocent +Winner +Fighter +Range +winds +vertical +Pictures +101 +charter +cooperation +prisoner +interviews +recognised +sung +manufacturer +exposure +submitted +Mars +leaf +gauge +screaming +likes +eligible +##ac +gathering +columns +##dra +belly +UN +maps +messages +speakers +##ants +garage +unincorporated +Number +Watson +sixteen +lots +beaten +Could +Municipality +##ano +Horse +talks +Drake +scores +Venice +genetic +##mal +##ère +Cold +Jose +nurse +traditionally +##bus +Territory +Key +Nancy 
+##win +thumb +São +index +dependent +carries +controls +Comics +coalition +physician +referring +Ruth +Based +restricted +inherited +internationally +stretch +THE +plates +margin +Holland +knock +significance +valuable +Kenya +carved +emotion +conservation +municipalities +overseas +resumed +Finance +graduation +blinked +temperatures +constantly +productions +scientist +ghost +cuts +permitted +##ches +firmly +##bert +patrol +##yo +Croatian +attacking +1850 +portrait +promoting +sink +conversion +##kov +locomotives +Guide +##val +nephew +relevant +Marc +drum +originated +Chair +visits +dragged +Price +favour +corridor +properly +respective +Caroline +reporting +inaugural +1848 +industries +##ching +edges +Christianity +Maurice +Trent +Economics +carrier +Reed +##gon +tribute +Pradesh +##ale +extend +attitude +Yale +##lu +settlements +glasses +taxes +targets +##ids +quarters +##ological +connect +hence +metre +collapse +underneath +banned +Future +clients +alternate +explosion +kinds +Commons +hungry +dragon +Chapel +Buddhist +lover +depression +pulls +##ges +##uk +origins +computers +crosses +kissing +assume +emphasis +lighting +##ites +personally +crashed +beam +touchdown +lane +comparison +##mont +Hitler +##las +execution +##ene +acre +sum +Pearl +ray +##point +essentially +worker +convicted +tear +Clay +recovery +Literature +Unfortunately +##row +partial +Petersburg +Bulgaria +coaching +evolved +reception +enters +narrowed +elevator +therapy +defended +pairs +##lam +breaks +Bennett +Uncle +cylinder +##ison +passion +bases +Actor +cancelled +battles +extensively +oxygen +Ancient +specialized +negotiations +##rat +acquisition +convince +interpretation +##00 +photos +aspect +colleges +Artist +keeps +##wing +Croatia +##ona +Hughes +Otto +comments +##du +Ph +Sweet +adventure +describing +Student +Shakespeare +scattered +objective +Aviation +Phillips +Fourth +athletes +##hal +##tered +Guitar +intensity +née +dining +curve +Obama +topics +legislative +Mill +Cruz +##ars 
+Members +recipient +Derby +inspiration +corresponding +fed +YouTube +coins +pressing +intent +Karen +cinema +Delta +destination +shorter +Christians +imagined +canal +Newcastle +Shah +Adrian +super +Males +160 +liberal +lord +bat +supplied +Claude +meal +worship +##atic +Han +wire +°F +##tha +punishment +thirteen +fighters +##ibility +1859 +Ball +gardens +##ari +Ottawa +pole +indicating +Twenty +Higher +Bass +Ivy +farming +##urs +certified +Saudi +plenty +##ces +restaurants +Representative +Miles +payment +##inger +##rit +Confederate +festivals +references +##ić +Mario +PhD +playoffs +witness +rice +mask +saving +opponents +enforcement +automatically +relegated +##oe +radar +whenever +Financial +imperial +uncredited +influences +Abraham +skull +Guardian +Haven +Bengal +impressive +input +mixture +Warsaw +altitude +distinction +1857 +collective +Annie +##ean +##bal +directions +Flying +##nic +faded +##ella +contributing +##ó +employee +##lum +##yl +ruler +oriented +conductor +focusing +##die +Giants +Mills +mines +Deep +curled +Jessica +guitars +Louise +procedure +Machine +failing +attendance +Nepal +Brad +Liam +tourist +exhibited +Sophie +depicted +Shaw +Chuck +##can +expecting +challenges +##nda +equally +resignation +##logical +Tigers +loop +pitched +outdoor +reviewed +hopes +True +temporarily +Borough +torn +jerked +collect +Berkeley +Independence +cotton +retreat +campaigns +participating +Intelligence +Heaven +##ked +situations +borough +Democrats +Harbor +##len +Liga +serial +circles +fourteen +##lot +seized +filling +departments +finance +absolute +Roland +Nate +floors +raced +struggling +deliver +protests +##tel +Exchange +efficient +experiments +##dar +faint +3D +binding +Lions +lightly +skill +proteins +difficulties +##cal +monthly +camps +flood +loves +Amanda +Commerce +##oid +##lies +elementary +##tre +organic +##stein +##ph +receives +Tech +enormous +distinctive +Joint +experiment +Circuit +citizen +##hy +shelter +ideal +practically +formula 
+addressed +Foster +Productions +##ax +variable +punk +Voice +fastest +concentrated +##oma +##yer +stored +surrender +vary +Sergeant +Wells +ward +Wait +##ven +playoff +reducing +cavalry +##dle +Venezuela +tissue +amounts +sweat +##we +Non +##nik +beetle +##bu +##tu +Jared +Hunt +##₂ +fat +Sultan +Living +Circle +Secondary +Suddenly +reverse +##min +Travel +##bin +Lebanon +##mas +virus +Wind +dissolved +enrolled +holiday +Keep +helicopter +Clarke +constitutional +technologies +doubles +instructions +##ace +Azerbaijan +##ill +occasional +frozen +trick +wiped +writings +Shanghai +preparing +challenged +mainstream +summit +180 +##arian +##rating +designation +##ada +revenge +filming +tightened +Miguel +Montana +reflect +celebration +bitch +flashed +signals +rounded +peoples +##tation +renowned +Google +characteristic +Campaign +sliding +##rman +usage +Record +Using +woke +solutions +holes +theories +logo +Protestant +relaxed +brow +nickname +Reading +marble +##tro +symptoms +Overall +capita +##ila +outbreak +revolution +deemed +Principal +Hannah +approaches +inducted +Wellington +vulnerable +Environmental +Drama +incumbent +Dame +1854 +travels +samples +accurate +physically +Sony +Nashville +##sville +##lic +##og +Producer +Lucky +tough +Stanford +resort +repeatedly +eyebrows +Far +choir +commenced +##ep +##ridge +rage +swing +sequel +heir +buses +ad +Grove +##late +##rick +updated +##SA +Delaware +##fa +Athletics +warmth +Off +excitement +verse +Protection +Villa +corruption +intellectual +Jenny +##lyn +mystery +prayer +healthy +##ologist +Bear +lab +Ernest +Remix +register +basement +Montgomery +consistent +tier +1855 +Preston +Brooks +##maker +vocalist +laboratory +delayed +wheels +rope +bachelor +pitcher +Block +Nevertheless +suspect +efficiency +Nebraska +siege +FBI +planted +##AC +Newton +breeding +##ain +eighteen +Argentine +encounter +servant +1858 +elder +Shadow +Episode +fabric +doctors +survival +removal +chemistry +volunteers +Kane +variant +arrives +Eagle 
+Left +##fe +Jo +divorce +##ret +yesterday +Bryan +handling +diseases +customer +Sheriff +Tiger +Harper +##oi +resting +Linda +Sheffield +gasped +sexy +economics +alien +tale +footage +Liberty +yeah +fundamental +Ground +flames +Actress +photographer +Maggie +Additional +joke +custom +Survey +Abu +silk +consumption +Ellis +bread +##uous +engagement +puts +Dog +##hr +poured +guilt +CDP +boxes +hardware +clenched +##cio +stem +arena +extending +##com +examination +Steel +encountered +revised +140 +picking +Car +hasn +Minor +pride +Roosevelt +boards +##mia +blocked +curious +drag +narrative +brigade +Prefecture +mysterious +namely +connects +Devil +historians +CHAPTER +quit +installation +Golf +empire +elevated +##eo +releasing +Bond +##uri +harsh +ban +##BA +contracts +cloth +presents +stake +chorus +##eau +swear +##mp +allies +generations +Motor +meter +pen +warrior +veteran +##EC +comprehensive +missile +interaction +instruction +Renaissance +rested +Dale +fix +fluid +les +investigate +loaded +widow +exhibit +artificial +select +rushing +tasks +signature +nowhere +Engineer +feared +Prague +bother +extinct +gates +Bird +climbing +heels +striking +artwork +hunt +awake +##hin +Formula +thereby +commitment +imprisoned +Beyond +##MA +transformed +Agriculture +Low +Movie +radical +complicated +Yellow +Auckland +mansion +tenth +Trevor +predecessor +##eer +disbanded +sucked +circular +witch +gaining +lean +Behind +illustrated +rang +celebrate +bike +consist +framework +##cent +Shane +owns +350 +comprises +collaborated +colleagues +##cast +engage +fewer +##ave +1856 +observation +diplomatic +legislature +improvements +Interstate +craft +MTV +martial +administered +jet +approaching +permanently +attraction +manuscript +numbered +Happy +Andrea +shallow +Gothic +Anti +##bad +improvement +trace +preserve +regardless +rode +dies +achievement +maintaining +Hamburg +spine +##air +flowing +encourage +widened +posts +##bound +125 +Southeast +Santiago +##bles +impression +receiver 
+Single +closure +##unt +communist +honors +Northwest +105 +##ulated +cared +un +hug +magnetic +seeds +topic +perceived +prey +prevented +Marvel +Eight +Michel +Transportation +rings +Gate +##gne +Byzantine +accommodate +floating +##dor +equation +ministry +##ito +##gled +Rules +earthquake +revealing +Brother +Celtic +blew +chairs +Panama +Leon +attractive +descendants +Care +Ambassador +tours +breathed +threatening +##cho +smiles +Lt +Beginning +##iness +fake +assists +fame +strings +Mobile +Liu +parks +http +1852 +brush +Aunt +bullet +consciousness +##sta +##ther +consequences +gather +dug +1851 +bridges +Doug +##sion +Artists +ignore +Carol +brilliant +radiation +temples +basin +clouds +##cted +Stevens +spite +soap +consumer +Damn +Snow +recruited +##craft +Advanced +tournaments +Quinn +undergraduate +questioned +Palmer +Annual +Others +feeding +Spider +printing +##orn +cameras +functional +Chester +readers +Alpha +universal +Faith +Brandon +François +authored +Ring +el +aims +athletic +possessed +Vermont +programmes +##uck +bore +Fisher +statements +shed +saxophone +neighboring +pronounced +barrel +bags +##dge +organisations +pilots +casualties +Kenneth +##brook +silently +Malcolm +span +Essex +anchor +##hl +virtual +lessons +Henri +Trump +Page +pile +locomotive +wounds +uncomfortable +sustained +Diana +Eagles +##pi +2000s +documented +##bel +Cassie +delay +kisses +##ines +variation +##ag +growled +##mark +##ways +Leslie +studios +Friedrich +aunt +actively +armor +eaten +historically +Better +purse +honey +ratings +##ée +naturally +1840 +peer +Kenny +Cardinal +database +Looking +runners +handsome +Double +PA +##boat +##sted +protecting +##jan +Diamond +concepts +interface +##aki +Watch +Article +Columbus +dialogue +pause +##rio +extends +blanket +pulse +1853 +affiliate +ladies +Ronald +counted +kills +demons +##zation +Airlines +Marco +Cat +companion +mere +Yugoslavia +Forum +Allan +pioneer +Competition +Methodist +patent +nobody +Stockholm +##ien +regulation 
+##ois +accomplished +##itive +washed +sake +Vladimir +crops +prestigious +humor +Sally +labour +tributary +trap +altered +examined +Mumbai +bombing +Ash +noble +suspension +ruins +##bank +spare +displays +guided +dimensional +Iraqi +##hon +sciences +Franz +relating +fence +followers +Palestine +invented +proceeded +Batman +Bradley +##yard +##ova +crystal +Kerala +##ima +shipping +handled +Want +abolished +Drew +##tter +Powell +Half +##table +##cker +exhibitions +Were +assignment +assured +##rine +Indonesian +Grammy +acknowledged +Kylie +coaches +structural +clearing +stationed +Say +Total +Rail +besides +glow +threats +afford +Tree +Musical +##pp +elite +centered +explore +Engineers +Stakes +Hello +tourism +severely +assessment +##tly +crack +politicians +##rrow +sheets +volunteer +##borough +##hold +announcement +recover +contribute +lungs +##ille +mainland +presentation +Johann +Writing +1849 +##bird +Study +Boulevard +coached +fail +airline +Congo +Plus +Syrian +introduce +ridge +Casey +manages +##fi +searched +Support +succession +progressive +coup +cultures +##lessly +sensation +Cork +Elena +Sofia +Philosophy +mini +trunk +academy +Mass +Liz +practiced +Reid +##ule +satisfied +experts +Wilhelm +Woods +invitation +Angels +calendar +joy +Sr +Dam +packed +##uan +bastard +Workers +broadcasts +logic +cooking +backward +##ack +Chen +creates +enzyme +##xi +Davies +aviation +VII +Conservation +fucking +Knights +##kan +requiring +hectares +wars +ate +##box +Mind +desired +oak +absorbed +Really +Vietnamese +Paulo +athlete +##car +##eth +Talk +Wu +##cks +survivors +Yang +Joel +Almost +Holmes +Armed +Joshua +priests +discontinued +##sey +blond +Rolling +suggesting +CA +clay +exterior +Scientific +##sive +Giovanni +Hi +farther +contents +Winners +animation +neutral +mall +Notes +layers +professionals +Armstrong +Against +Piano +involve +monitor +angel +parked +bears +seated +feat +beliefs +##kers +Version +suffer +##ceae +guidance +##eur +honored +raid +alarm +Glen +Ellen 
+Jamaica +trio +enabled +##ils +procedures +##hus +moderate +upstairs +##ses +torture +Georgian +rebellion +Fernando +Nice +##are +Aires +Campus +beast +##hing +1847 +##FA +Isle +##logist +Princeton +cathedral +Oakland +Solomon +##tto +Milwaukee +upcoming +midfielder +Neither +sacred +Eyes +appreciate +Brunswick +secrets +Rice +Somerset +Chancellor +Curtis +##gel +Rich +separation +grid +##los +##bon +urge +##ees +##ree +freight +towers +psychology +requirement +dollar +##fall +##sman +exile +tomb +Salt +Stefan +Buenos +Revival +Porter +tender +diesel +chocolate +Eugene +Legion +Laboratory +sheep +arched +hospitals +orbit +Full +##hall +drinks +ripped +##RS +tense +Hank +leagues +##nberg +PlayStation +fool +Punjab +relatives +Comedy +sur +1846 +Tonight +Sox +##if +Rabbi +org +speaks +institute +defender +painful +wishes +Weekly +literacy +portions +snake +item +deals +##tum +autumn +sharply +reforms +thighs +prototype +##ition +argues +disorder +Physics +terror +provisions +refugees +predominantly +independently +march +##graphy +Arabia +Andrews +Bus +Money +drops +##zar +pistol +matrix +revolutionary +##ust +Starting +##ptic +Oak +Monica +##ides +servants +##hed +archaeological +divorced +rocket +enjoying +fires +##nel +assembled +qualification +retiring +##fied +Distinguished +handful +infection +Durham +##itz +fortune +renewed +Chelsea +##sley +curved +gesture +retain +exhausted +##ifying +Perth +jumping +Palestinian +Simpson +colonies +steal +##chy +corners +Finn +arguing +Martha +##var +Betty +emerging +Heights +Hindi +Manila +pianist +founders +regret +Napoleon +elbow +overhead +bold +praise +humanity +##ori +Revolutionary +##ere +fur +##ole +Ashley +Official +##rm +lovely +Architecture +##sch +Baronet +virtually +##OS +descended +immigration +##das +##kes +Holly +Wednesday +maintains +theatrical +Evan +Gardens +citing +##gia +segments +Bailey +Ghost +##city +governing +graphics +##ined +privately +potentially +transformation +Crystal +Cabinet +sacrifice 
+hesitated +mud +Apollo +Desert +bin +victories +Editor +Railways +Web +Case +tourists +Brussels +Franco +compiled +topped +Gene +engineers +commentary +egg +escort +nerve +arch +necessarily +frustration +Michelle +democracy +genes +Facebook +halfway +##ient +102 +flipped +Won +##mit +NASA +Lynn +Provincial +ambassador +Inspector +glared +Change +McDonald +developments +tucked +noting +Gibson +circulation +dubbed +armies +resource +Headquarters +##iest +Mia +Albanian +Oil +Albums +excuse +intervention +Grande +Hugo +integration +civilians +depends +reserves +Dee +compositions +identification +restrictions +quarterback +Miranda +Universe +favourite +ranges +hint +loyal +Op +entity +Manual +quoted +dealt +specialist +Zhang +download +Westminster +Rebecca +streams +Anglican +variations +Mine +detective +Films +reserved +##oke +##key +sailing +##gger +expanding +recall +discovers +particles +behaviour +Gavin +blank +permit +Java +Fraser +Pass +##non +##TA +panels +statistics +notion +courage +dare +venues +##roy +Box +Newport +travelling +Thursday +warriors +Glenn +criteria +360 +mutual +restore +varied +bitter +Katherine +##lant +ritual +bits +##à +Henderson +trips +Richardson +Detective +curse +psychological +Il +midnight +streak +facts +Dawn +Indies +Edmund +roster +Gen +##nation +1830 +congregation +shaft +##ically +##mination +Indianapolis +Sussex +loving +##bit +sounding +horrible +Continental +Griffin +advised +magical +millions +##date +1845 +Safety +lifting +determination +valid +dialect +Penn +Know +triple +avoided +dancer +judgment +sixty +farmer +lakes +blast +aggressive +Abby +tag +chains +inscription +##nn +conducting +Scout +buying +##wich +spreading +##OC +array +hurried +Environment +improving +prompted +fierce +Taking +Away +tune +pissed +Bull +catching +##ying +eyebrow +metropolitan +terrain +##rel +Lodge +manufacturers +creator +##etic +happiness +ports +##ners +Relations +fortress +targeted +##ST +allegedly +blues +##osa +Bosnia +##dom +burial 
+similarly +stranger +pursued +symbols +rebels +reflection +routine +traced +indoor +eventual +##ska +##ão +##una +MD +##phone +oh +grants +Reynolds +rid +operators +##nus +Joey +vital +siblings +keyboard +br +removing +societies +drives +solely +princess +lighter +Various +Cavalry +believing +SC +underwent +relay +smelled +syndrome +welfare +authorized +seemingly +Hard +chicken +##rina +Ages +Bo +democratic +barn +Eye +shorts +##coming +##hand +disappointed +unexpected +centres +Exhibition +Stories +Site +banking +accidentally +Agent +conjunction +André +Chloe +resist +width +Queens +provision +##art +Melissa +Honorary +Del +prefer +abruptly +duration +##vis +Glass +enlisted +##ado +discipline +Sisters +carriage +##ctor +##sburg +Lancashire +log +fuck +##iz +closet +collecting +holy +rape +trusted +cleaning +inhabited +Rocky +104 +editorial +##yu +##ju +succeed +strict +Cuban +##iya +Bronze +outcome +##ifies +##set +corps +Hero +barrier +Kumar +groaned +Nina +Burton +enable +stability +Milton +knots +##ination +slavery +##borg +curriculum +trailer +warfare +Dante +Edgar +revival +Copenhagen +define +advocate +Garrett +Luther +overcome +pipe +750 +construct +Scotia +kings +flooding +##hard +Ferdinand +Felix +forgot +Fish +Kurt +elaborate +##BC +graphic +gripped +colonel +Sophia +Advisory +Self +##uff +##lio +monitoring +seal +senses +rises +peaceful +journals +1837 +checking +legendary +Ghana +##power +ammunition +Rosa +Richards +nineteenth +ferry +aggregate +Troy +inter +##wall +Triple +steep +tent +Cyprus +1844 +##woman +commanding +farms +doi +navy +specified +na +cricketer +transported +Think +comprising +grateful +solve +##core +beings +clerk +grain +vector +discrimination +##TC +Katie +reasonable +drawings +veins +consideration +Monroe +repeat +breed +dried +witnessed +ordained +Current +spirits +remarkable +consultant +urged +Remember +anime +singers +phenomenon +Rhode +Carlo +demanding +findings +manual +varying +Fellowship +generate +safely +heated 
+withdrawn +##ao +headquartered +##zon +##lav +##ency +Col +Memphis +imposed +rivals +Planet +healing +##hs +ensemble +Warriors +##bone +cult +Frankfurt +##HL +diversity +Gerald +intermediate +##izes +reactions +Sister +##ously +##lica +quantum +awkward +mentions +pursuit +##ography +varies +profession +molecular +consequence +lectures +cracked +103 +slowed +##tsu +cheese +upgraded +suite +substance +Kingston +1800 +Idaho +Theory +##een +ain +Carson +Molly +##OR +configuration +Whitney +reads +audiences +##tie +Geneva +Outside +##nen +##had +transit +volleyball +Randy +Chad +rubber +motorcycle +respected +eager +Level +coin +##lets +neighbouring +##wski +confident +##cious +poll +uncertain +punch +thesis +Tucker +IATA +Alec +##ographic +##law +1841 +desperately +1812 +Lithuania +accent +Cox +lightning +skirt +##load +Burns +Dynasty +##ug +chapters +Working +dense +Morocco +##kins +casting +Set +activated +oral +Brien +horn +HIV +dawn +stumbled +altar +tore +considerably +Nicole +interchange +registration +biography +Hull +Stan +bulk +consent +Pierce +##ER +Fifth +marched +terrorist +##piece +##itt +Presidential +Heather +staged +Plant +relegation +sporting +joins +##ced +Pakistani +dynamic +Heat +##lf +ourselves +Except +Elliott +nationally +goddess +investors +Burke +Jackie +##ā +##RA +Tristan +Associate +Tuesday +scope +Near +bunch +##abad +##ben +sunlight +##aire +manga +Willie +trucks +boarding +Lion +lawsuit +Learning +Der +pounding +awful +##mine +IT +Legend +romance +Serie +AC +gut +precious +Robertson +hometown +realm +Guards +Tag +batting +##vre +halt +conscious +1838 +acquire +collar +##gg +##ops +Herald +nationwide +citizenship +Aircraft +decrease +em +Fiction +Female +corporation +Located +##ip +fights +unconscious +Tampa +Poetry +lobby +Malta +##sar +##bie +layout +Tate +reader +stained +##bre +##rst +##ulate +loudly +Eva +Cohen +exploded +Merit +Maya +##rable +Rovers +##IC +Morrison +Should +vinyl +##mie +onwards +##gie +vicinity +Wildlife 
+probability +Mar +Barnes +##ook +spinning +Moses +##vie +Surrey +Planning +conferences +protective +Plaza +deny +Canterbury +manor +Estate +tilted +comics +IBM +destroying +server +Dorothy +##horn +Oslo +lesser +heaven +Marshal +scales +strikes +##ath +firms +attract +##BS +controlling +Bradford +southeastern +Amazon +Travis +Janet +governed +1842 +Train +Holden +bleeding +gifts +rent +1839 +palms +##ū +judicial +Ho +Finals +conflicts +unlikely +draws +##cies +compensation +adds +elderly +Anton +lasting +Nintendo +codes +ministers +pot +associations +capabilities +##cht +libraries +##sie +chances +performers +runway +##af +##nder +Mid +Vocals +##uch +##eon +interpreted +priority +Uganda +ruined +Mathematics +cook +AFL +Lutheran +AIDS +Capitol +chase +axis +Moreover +María +Saxon +storyline +##ffed +Tears +Kid +cent +colours +Sex +##long +pm +blonde +Edwin +CE +diocese +##ents +##boy +Inn +##ller +Saskatchewan +##kh +stepping +Windsor +##oka +##eri +Xavier +Resources +1843 +##top +##rad +##lls +Testament +poorly +1836 +drifted +slope +CIA +remix +Lords +mature +hosting +diamond +beds +##ncies +luxury +trigger +##lier +preliminary +hybrid +journalists +Enterprise +proven +expelled +insects +Beautiful +lifestyle +vanished +##ake +##ander +matching +surfaces +Dominican +Kids +referendum +Orlando +Truth +Sandy +privacy +Calgary +Speaker +sts +Nobody +shifting +##gers +Roll +Armenia +Hand +##ES +106 +##ont +Guild +larvae +Stock +flame +gravity +enhanced +Marion +surely +##tering +Tales +algorithm +Emmy +darker +VIII +##lash +hamlet +deliberately +occurring +choices +Gage +fees +settling +ridiculous +##ela +Sons +cop +custody +##ID +proclaimed +Cardinals +##pm +Metal +Ana +1835 +clue +Cardiff +riders +observations +MA +sometime +##och +performer +intact +Points +allegations +rotation +Tennis +tenor +Directors +##ats +Transit +thigh +Complex +##works +twentieth +Factory +doctrine +Daddy +##ished +pretend +Winston +cigarette +##IA +specimens +hydrogen +smoking 
+mathematical +arguments +openly +developer +##iro +fists +somebody +##san +Standing +Caleb +intelligent +Stay +Interior +echoed +Valentine +varieties +Brady +cluster +Ever +voyage +##of +deposits +ultimate +Hayes +horizontal +proximity +##ás +estates +exploration +NATO +Classical +##most +bills +condemned +1832 +hunger +##ato +planes +deserve +offense +sequences +rendered +acceptance +##ony +manufacture +Plymouth +innovative +predicted +##RC +Fantasy +##une +supporter +absent +Picture +bassist +rescued +##MC +Ahmed +Monte +##sts +##rius +insane +novelist +##és +agrees +Antarctic +Lancaster +Hopkins +calculated +startled +##star +tribal +Amendment +##hoe +invisible +patron +deer +Walk +tracking +Lyon +tickets +##ED +philosopher +compounds +chuckled +##wi +pound +loyalty +Academic +petition +refuses +marking +Mercury +northeastern +dimensions +scandal +Canyon +patch +publish +##oning +Peak +minds +##boro +Presbyterian +Hardy +theoretical +magnitude +bombs +cage +##ders +##kai +measuring +explaining +avoiding +touchdowns +Card +theology +##ured +Popular +export +suspicious +Probably +photograph +Lou +Parks +Arms +compact +Apparently +excess +Banks +lied +stunned +territorial +Filipino +spectrum +learns +wash +imprisonment +ugly +##rose +Albany +Erik +sends +##hara +##rid +consumed +##gling +Belgrade +Da +opposing +Magnus +footsteps +glowing +delicate +Alexandria +Ludwig +gorgeous +Bros +Index +##PA +customs +preservation +bonds +##mond +environments +##nto +instructed +parted +adoption +locality +workshops +goalkeeper +##rik +##uma +Brighton +Slovenia +##ulating +##tical +towel +hugged +stripped +Bears +upright +Wagner +##aux +secretly +Adventures +nest +Course +Lauren +Boeing +Abdul +Lakes +450 +##cu +USSR +caps +Chan +##nna +conceived +Actually +Belfast +Lithuanian +concentrate +possess +militia +pine +protagonist +Helena +##PS +##band +Belle +Clara +Reform +currency +pregnancy +1500 +##rim +Isabella +hull +Name +trend +journalism +diet +##mel +Recording +acclaimed 
+Tang +Jace +steering +vacant +suggestion +costume +laser +##š +##ink +##pan +##vić +integral +achievements +wise +classroom +unions +southwestern +##uer +Garcia +toss +Tara +Large +##tate +evident +responsibilities +populated +satisfaction +##bia +casual +Ecuador +##ght +arose +##ović +Cornwall +embrace +refuse +Heavyweight +XI +Eden +activists +##uation +biology +##shan +fraud +Fuck +matched +legacy +Rivers +missionary +extraordinary +Didn +holder +wickets +crucial +Writers +Hurricane +Iceland +gross +trumpet +accordance +hurry +flooded +doctorate +Albania +##yi +united +deceased +jealous +grief +flute +portraits +##а +pleasant +Founded +Face +crowned +Raja +advisor +Salem +##ec +Achievement +admission +freely +minimal +Sudan +developers +estimate +disabled +##lane +downstairs +Bruno +##pus +pinyin +##ude +lecture +deadly +underlying +optical +witnesses +Combat +Julius +tapped +variants +##like +Colonial +Critics +Similarly +mouse +voltage +sculptor +Concert +salary +Frances +##ground +hook +premises +Software +instructor +nominee +##ited +fog +slopes +##zu +vegetation +sail +##rch +Body +Apart +atop +View +utility +ribs +cab +migration +##wyn +bounded +2019 +pillow +trails +##ub +Halifax +shade +Rush +##lah +##dian +Notre +interviewed +Alexandra +Springfield +Indeed +rubbing +dozens +amusement +legally +##lers +Jill +Cinema +ignoring +Choice +##ures +pockets +##nell +laying +Blair +tackles +separately +##teen +Criminal +performs +theorem +Communication +suburbs +##iel +competitors +rows +##hai +Manitoba +Eleanor +interactions +nominations +assassination +##dis +Edmonton +diving +##dine +essay +##tas +AFC +Edge +directing +imagination +sunk +implement +Theodore +trembling +sealed +##rock +Nobel +##ancy +##dorf +##chen +genuine +apartments +Nicolas +AA +Bach +Globe +Store +220 +##10 +Rochester +##ño +alert +107 +Beck +##nin +Naples +Basin +Crawford +fears +Tracy +##hen +disk +##pped +seventeen +Lead +backup +reconstruction +##lines +terrified +sleeve +nicknamed 
+popped +##making +##ern +Holiday +Gospel +ibn +##ime +convert +divine +resolved +##quet +ski +realizing +##RT +Legislature +reservoir +Rain +sinking +rainfall +elimination +challenging +tobacco +##outs +Given +smallest +Commercial +pin +rebel +comedian +exchanged +airing +dish +Salvador +promising +##wl +relax +presenter +toll +aerial +##eh +Fletcher +brass +disappear +zones +adjusted +contacts +##lk +sensed +Walt +mild +toes +flies +shame +considers +wildlife +Hanna +Arsenal +Ladies +naming +##ishing +anxiety +discussions +cute +undertaken +Cash +strain +Wyoming +dishes +precise +Angela +##ided +hostile +twins +115 +Built +##pel +Online +tactics +Newman +##bourne +unclear +repairs +embarrassed +listing +tugged +Vale +##gin +Meredith +bout +##cle +velocity +tips +froze +evaluation +demonstrate +##card +criticised +Nash +lineup +Rao +monks +bacteria +lease +##lish +frightened +den +revived +finale +##rance +flee +Letters +decreased +##oh +Sounds +wrap +Sharon +incidents +renovated +everybody +stole +Bath +boxing +1815 +withdraw +backs +interim +react +murders +Rhodes +Copa +framed +flown +Estonia +Heavy +explored +##rra +##GA +##ali +Istanbul +1834 +##rite +##aging +##ues +Episcopal +arc +orientation +Maxwell +infected +##rot +BCE +Brook +grasp +Roberto +Excellence +108 +withdrawal +Marines +rider +Lo +##sin +##run +Subsequently +garrison +hurricane +facade +Prussia +crushed +enterprise +##mber +Twitter +Generation +Physical +Sugar +editing +communicate +Ellie +##hurst +Ernst +wagon +promotional +conquest +Parliamentary +courtyard +lawyers +Superman +email +Prussian +lately +lecturer +Singer +Majesty +Paradise +sooner +Heath +slot +curves +convoy +##vian +induced +synonym +breeze +##plane +##ox +peered +Coalition +##hia +odds +##esh +##lina +Tomorrow +Nadu +##ico +##rah +damp +autonomous +console +Victory +counts +Luxembourg +intimate +Archived +Carroll +spy +Zero +habit +Always +faction +teenager +Johnston +chaos +ruin +commerce +blog +##shed +##the +reliable 
+Word +Yu +Norton +parade +Catholics +damned +##iling +surgeon +##tia +Allison +Jonas +remarked +##ès +idiot +Making +proposals +Industries +strategies +artifacts +batteries +reward +##vers +Agricultural +distinguish +lengths +Jeffrey +Progressive +kicking +Patricia +##gio +ballot +##ios +skilled +##gation +Colt +limestone +##AS +peninsula +##itis +LA +hotels +shapes +Crime +depicting +northwestern +HD +silly +Das +##² +##ws +##ash +##matic +thermal +Has +forgive +surrendered +Palm +Nacional +drank +haired +Mercedes +##foot +loading +Timothy +##roll +mechanisms +traces +digging +discussing +Natalie +##zhou +Forbes +landmark +Anyway +Manor +conspiracy +gym +knocking +viewing +Formation +Pink +Beauty +limbs +Phillip +sponsor +Joy +granite +Harbour +##ero +payments +Ballet +conviction +##dam +Hood +estimates +lacked +Mad +Jorge +##wen +refuge +##LA +invaded +Kat +suburban +##fold +investigated +Ari +complained +creek +Georges +##uts +powder +accepting +deserved +carpet +Thunder +molecules +Legal +cliff +strictly +enrollment +ranch +##rg +##mba +proportion +renovation +crop +grabbing +##liga +finest +entries +receptor +helmet +blown +Listen +flagship +workshop +resolve +nails +Shannon +portal +jointly +shining +Violet +overwhelming +upward +Mick +proceedings +##dies +##aring +Laurence +Churchill +##rice +commit +170 +inclusion +Examples +##verse +##rma +fury +paths +##SC +ankle +nerves +Chemistry +rectangular +sworn +screenplay +cake +Mann +Seoul +Animal +sizes +Speed +vol +Population +Southwest +Hold +continuously +Qualified +wishing +Fighting +Made +disappointment +Portsmouth +Thirty +##beck +Ahmad +teammate +MLB +graph +Charleston +realizes +##dium +exhibits +preventing +##int +fever +rivalry +Male +mentally +dull +##lor +##rich +consistently +##igan +Madame +certificate +suited +Krishna +accuracy +Webb +Budapest +Rex +1831 +Cornell +OK +surveillance +##gated +habitats +Adventure +Conrad +Superior +Gay +sofa +aka +boot +Statistics +Jessie +Liberation +##lip +##rier 
+brands +saint +Heinrich +Christine +bath +Rhine +ballet +Jin +consensus +chess +Arctic +stack +furious +cheap +toy +##yre +##face +##gging +gastropod +##nne +Romans +membrane +answering +25th +architects +sustainable +##yne +Hon +1814 +Baldwin +dome +##awa +##zen +celebrity +enclosed +##uit +##mmer +Electronic +locals +##CE +supervision +mineral +Chemical +Slovakia +alley +hub +##az +heroes +Creative +##AM +incredible +politically +ESPN +yanked +halls +Aboriginal +Greatest +yield +##20 +congressional +robot +Kiss +welcomed +MS +speeds +proceed +Sherman +eased +Greene +Walsh +Geoffrey +variables +rocky +##print +acclaim +Reverend +Wonder +tonnes +recurring +Dawson +continent +finite +AP +continental +ID +facilitate +essays +Rafael +Neal +1833 +ancestors +##met +##gic +Especially +teenage +frustrated +Jules +cock +expense +##oli +##old +blocking +Notable +prohibited +ca +dock +organize +##wald +Burma +Gloria +dimension +aftermath +choosing +Mickey +torpedo +pub +##used +manuscripts +laps +Ulster +staircase +sphere +Insurance +Contest +lens +risks +investigations +ERA +glare +##play +Graduate +auction +Chronicle +##tric +##50 +Coming +seating +Wade +seeks +inland +Thames +Rather +butterfly +contracted +positioned +consumers +contestants +fragments +Yankees +Santos +administrator +hypothesis +retire +Denis +agreements +Winnipeg +##rill +1820 +trophy +crap +shakes +Jenkins +##rium +ya +twist +labels +Maritime +##lings +##iv +111 +##ensis +Cairo +Anything +##fort +opinions +crowded +##nian +abandon +##iff +drained +imported +##rr +tended +##rain +Going +introducing +sculptures +bankruptcy +danced +demonstration +stance +settings +gazed +abstract +pet +Calvin +stiff +strongest +wrestler +##dre +Republicans +grace +allocated +cursed +snail +advancing +Return +errors +Mall +presenting +eliminate +Amateur +Institution +counting +##wind +warehouse +##nde +Ethiopia +trailed +hollow +##press +Literary +capability +nursing +preceding +lamp +Thomson +Morton +##ctic +Crew +Close 
+composers +boom +Clare +missiles +112 +hunter +snap +##oni +##tail +Us +declaration +##cock +rally +huh +lion +straightened +Philippe +Sutton +alpha +valued +maker +navigation +detected +favorable +perception +Charter +##ña +Ricky +rebounds +tunnels +slapped +Emergency +supposedly +##act +deployment +socialist +tubes +anybody +corn +##NA +Seminary +heating +pump +##AA +achieving +souls +##ass +Link +##ele +##smith +greeted +Bates +Americas +Elder +cure +contestant +240 +fold +Runner +Uh +licked +Politics +committees +neighbors +fairy +Silva +Leipzig +tipped +correctly +exciting +electronics +foundations +cottage +governmental +##hat +allied +claws +presidency +cruel +Agreement +slender +accompanying +precisely +##pass +driveway +swim +Stand +crews +##mission +rely +everyday +Wings +demo +##hic +recreational +min +nationality +##duction +Easter +##hole +canvas +Kay +Leicester +talented +Discovery +shells +##ech +Kerry +Ferguson +Leave +##place +altogether +adopt +butt +wolves +##nsis +##ania +modest +soprano +Boris +##ught +electron +depicts +hid +cruise +differ +treasure +##nch +Gun +Mama +Bengali +trainer +merchants +innovation +presumably +Shirley +bottles +proceeds +Fear +invested +Pirates +particle +Dominic +blamed +Fight +Daisy +##pper +##graphic +nods +knight +Doyle +tales +Carnegie +Evil +Inter +Shore +Nixon +transform +Savannah +##gas +Baltic +stretching +worlds +protocol +Percy +Toby +Heroes +brave +dancers +##aria +backwards +responses +Chi +Gaelic +Berry +crush +embarked +promises +Madonna +researcher +realised +inaugurated +Cherry +Mikhail +Nottingham +reinforced +subspecies +rapper +##kie +Dreams +Re +Damon +Minneapolis +monsters +suspicion +Tel +surroundings +afterward +complaints +OF +sectors +Algeria +lanes +Sabha +objectives +Donna +bothered +distracted +deciding +##ives +##CA +##onia +bishops +Strange +machinery +Voiced +synthesis +reflects +interference +##TS +##ury +keen +##ign +frown +freestyle +ton +Dixon +Sacred +Ruby +Prison +##ión +1825 
+outfit +##tain +curiosity +##ight +frames +steadily +emigrated +horizon +##erly +Doc +philosophical +Table +UTC +Marina +##DA +secular +##eed +Zimbabwe +cops +Mack +sheriff +Sanskrit +Francesco +catches +questioning +streaming +Kill +testimony +hissed +tackle +countryside +copyright +##IP +Buddhism +##rator +ladder +##ON +Past +rookie +depths +##yama +##ister +##HS +Samantha +Dana +Educational +brows +Hammond +raids +envelope +##sco +##hart +##ulus +epic +detection +Streets +Potter +statistical +für +ni +accounting +##pot +employer +Sidney +Depression +commands +Tracks +averaged +lets +Ram +longtime +suits +branded +chip +Shield +loans +ought +Said +sip +##rome +requests +Vernon +bordered +veterans +##ament +Marsh +Herzegovina +Pine +##igo +mills +anticipation +reconnaissance +##ef +expectations +protested +arrow +guessed +depot +maternal +weakness +##ap +projected +pour +Carmen +provider +newer +remind +freed +##rily +##wal +##tones +intentions +Fiji +timing +Match +managers +Kosovo +Herman +Wesley +Chang +135 +semifinals +shouting +Indo +Janeiro +Chess +Macedonia +Buck +##onies +rulers +Mail +##vas +##sel +MHz +Programme +Task +commercially +subtle +propaganda +spelled +bowling +basically +Raven +1828 +Colony +109 +##ingham +##wara +anticipated +1829 +##iers +graduates +##rton +##fication +endangered +ISO +diagnosed +##tage +exercises +Battery +bolt +poison +cartoon +##ción +hood +bowed +heal +Meyer +Reagan +##wed +subfamily +##gent +momentum +infant +detect +##sse +Chapman +Darwin +mechanics +NSW +Cancer +Brooke +Nuclear +comprised +hire +sanctuary +wingspan +contrary +remembering +surprising +Basic +stealing +OS +hatred +##lled +masters +violation +Rule +##nger +assuming +conquered +louder +robe +Beatles +legitimate +##vation +massacre +Rica +unsuccessfully +poets +##enberg +careers +doubled +premier +battalions +Dubai +Paper +Louisville +gestured +dressing +successive +mumbled +Vic +referee +pupil +##cated +##rre +ceremonies +picks +##IN +diplomat +alike 
+geographical +rays +##HA +##read +harbour +factories +pastor +playwright +Ultimate +nationalist +uniforms +obtaining +kit +Amber +##pling +screenwriter +ancestry +##cott +Fields +PR +Coleman +rat +Bavaria +squeeze +highlighted +Adult +reflecting +Mel +1824 +bicycle +organizing +sided +Previously +Underground +Prof +athletics +coupled +mortal +Hampton +worthy +immune +Ava +##gun +encouraging +simplified +##ssa +##nte +##ann +Providence +entities +Pablo +Strong +Housing +##ista +##ators +kidnapped +mosque +Kirk +whispers +fruits +shattered +fossil +Empress +Johns +Webster +Thing +refusing +differently +specimen +Ha +##EN +##tina +##elle +##night +Horn +neighbourhood +Bolivia +##rth +genres +Pre +##vich +Amelia +swallow +Tribune +Forever +Psychology +Use +##bers +Gazette +ash +##usa +Monster +##cular +delegation +blowing +Oblast +retreated +automobile +##ex +profits +shirts +devil +Treasury +##backs +Drums +Ronnie +gameplay +expertise +Evening +resides +Caesar +unity +Crazy +linking +Vision +donations +Isabel +valve +Sue +WWE +logical +availability +fitting +revolt +##mill +Linux +taxi +Access +pollution +statues +Augustus +##pen +cello +##some +lacking +##ati +Gwen +##aka +##ovich +1821 +Wow +initiatives +Uruguay +Cain +stroked +examine +##ī +mentor +moist +disorders +buttons +##tica +##anna +Species +Lynch +museums +scorer +Poor +eligibility +op +unveiled +cats +Title +wheat +critically +Syracuse +##osis +marketed +enhance +Ryder +##NG +##ull +##rna +embedded +throws +foods +happily +##ami +lesson +formats +punched +##rno +expressions +qualities +##sal +Gods +##lity +elect +wives +##lling +jungle +Toyota +reversed +Grammar +Cloud +Agnes +##ules +disputed +verses +Lucien +threshold +##rea +scanned +##bled +##dley +##lice +Kazakhstan +Gardner +Freeman +##rz +inspection +Rita +accommodation +advances +chill +Elliot +thriller +Constantinople +##mos +debris +whoever +1810 +Santo +Carey +remnants +Guatemala +##irs +carriers +equations +mandatory +##WA +anxious 
+measurement +Summit +Terminal +Erin +##zes +LLC +##uo +glancing +sin +##₃ +Downtown +flowering +Euro +Leigh +Lance +warn +decent +recommendations +##ote +Quartet +##rrell +Clarence +colleague +guarantee +230 +Clayton +Beast +addresses +prospect +destroyer +vegetables +Leadership +fatal +prints +190 +##makers +Hyde +persuaded +illustrations +Southampton +Joyce +beats +editors +mount +##grave +Malaysian +Bombay +endorsed +##sian +##bee +applying +Religion +nautical +bomber +Na +airfield +gravel +##rew +Cave +bye +dig +decree +burden +Election +Hawk +Fe +##iled +reunited +##tland +liver +Teams +Put +delegates +Ella +##fect +Cal +invention +Castro +bored +##kawa +##ail +Trinidad +NASCAR +pond +develops +##pton +expenses +Zoe +Released +##rf +organs +beta +parameters +Neill +##lene +lateral +Beat +blades +Either +##hale +Mitch +##ET +##vous +Rod +burnt +phones +Rising +##front +investigating +##dent +Stephanie +##keeper +screening +##uro +Swan +Sinclair +modes +bullets +Nigerian +melody +##ques +Rifle +##12 +128 +##jin +charm +Venus +##tian +fusion +advocated +visitor +pinned +genera +3000 +Ferry +Solo +quantity +regained +platinum +shoots +narrowly +preceded +update +##ichi +equality +unaware +regiments +ally +##tos +transmitter +locks +Seeing +outlets +feast +reopened +##ows +struggles +Buddy +1826 +bark +elegant +amused +Pretty +themed +schemes +Lisbon +Te +patted +terrorism +Mystery +##croft +##imo +Madagascar +Journey +dealer +contacted +##quez +ITV +vacation +Wong +Sacramento +organisms +##pts +balcony +coloured +sheer +defines +MC +abortion +forbidden +accredited +Newfoundland +tendency +entrepreneur +Benny +Tanzania +needing +finalist +mythology +weakened +gown +sentences +Guest +websites +Tibetan +UFC +voluntary +annoyed +Welcome +honestly +correspondence +geometry +Deutsche +Biology +Help +##aya +Lines +Hector +##ael +reluctant +##ages +wears +inquiry +##dell +Holocaust +Tourism +Wei +volcanic +##mates +Visual +sorts +neighborhoods +Running +apple +shy +Laws 
+bend +Northeast +feminist +Speedway +Murder +visa +stuffed +fangs +transmitted +fiscal +Ain +enlarged +##ndi +Cecil +Peterson +Benson +Bedford +acceptable +##CC +##wer +purely +triangle +foster +Alberto +educator +Highland +acute +LGBT +Tina +Mi +adventures +Davidson +Honda +translator +monk +enacted +summoned +##ional +collector +Genesis +Un +liner +Di +Statistical +##CS +filter +Knox +Religious +Stella +Estonian +Turn +##ots +primitive +parishes +##lles +complexity +autobiography +rigid +cannon +pursuing +exploring +##gram +##mme +freshman +caves +Expedition +Traditional +iTunes +certification +cooling +##ort +##gna +##IT +##lman +##VA +Motion +explosive +licence +boxer +shrine +loosely +Brigadier +Savage +Brett +MVP +heavier +##elli +##gged +Buddha +Easy +spells +fails +incredibly +Georg +stern +compatible +Perfect +applies +cognitive +excessive +nightmare +neighbor +Sicily +appealed +static +##₁ +Aberdeen +##leigh +slipping +bride +##guard +Um +Clyde +1818 +##gible +Hal +Frost +Sanders +interactive +Hour +##vor +hurting +bull +termed +shelf +capturing +##pace +rolls +113 +##bor +Chilean +teaches +##rey +exam +shipped +Twin +borrowed +##lift +Shit +##hot +Lindsay +Below +Kiev +Lin +leased +##sto +Eli +Diane +Val +subtropical +shoe +Bolton +Dragons +##rification +Vatican +##pathy +Crisis +dramatically +talents +babies +##ores +surname +##AP +##cology +cubic +opted +Archer +sweep +tends +Karnataka +Judy +stint +Similar +##nut +explicitly +##nga +interact +Mae +portfolio +clinic +abbreviated +Counties +##iko +hearts +##ı +providers +screams +Individual +##etti +Monument +##iana +accessed +encounters +gasp +##rge +defunct +Avery +##rne +nobility +useless +Phase +Vince +senator +##FL +1813 +surprisingly +##illo +##chin +Boyd +rumors +equity +Gone +Hearts +chassis +overnight +Trek +wrists +submit +civic +designers +##rity +prominence +decorative +derives +starter +##AF +wisdom +Powers +reluctantly +measurements +doctoral +Noel +Gideon +Baden +Cologne +lawn +Hawaiian 
+anthology +##rov +Raiders +embassy +Sterling +##pal +Telugu +troubled +##FC +##bian +fountain +observe +ore +##uru +##gence +spelling +Border +grinning +sketch +Benedict +Xbox +dialects +readily +immigrant +Constitutional +aided +nevertheless +SE +tragedy +##ager +##rden +Flash +##MP +Europa +emissions +##ield +panties +Beverly +Homer +curtain +##oto +toilet +Isn +Jerome +Chiefs +Hermann +supernatural +juice +integrity +Scots +auto +Patriots +Strategic +engaging +prosecution +cleaned +Byron +investments +adequate +vacuum +laughs +##inus +##nge +Usually +Roth +Cities +Brand +corpse +##ffy +Gas +rifles +Plains +sponsorship +Levi +tray +owed +della +commanders +##ead +tactical +##rion +García +harbor +discharge +##hausen +gentleman +endless +highways +##itarian +pleaded +##eta +archive +Midnight +exceptions +instances +Gibraltar +cart +##NS +Darren +Bonnie +##yle +##iva +OCLC +bra +Jess +##EA +consulting +Archives +Chance +distances +commissioner +##AR +LL +sailors +##sters +enthusiasm +Lang +##zia +Yugoslav +confirm +possibilities +Suffolk +##eman +banner +1822 +Supporting +fingertips +civilization +##gos +technically +1827 +Hastings +sidewalk +strained +monuments +Floyd +Chennai +Elvis +villagers +Cumberland +strode +albeit +Believe +planets +combining +Mohammad +container +##mouth +##tures +verb +BA +Tank +Midland +screened +Gang +Democracy +Helsinki +screens +thread +charitable +##version +swiftly +ma +rational +combine +##SS +##antly +dragging +Cliff +Tasmania +quest +professionally +##aj +rap +##lion +livestock +##hua +informal +specially +lonely +Matthews +Dictionary +1816 +Observatory +correspondent +constitute +homeless +waving +appreciated +Analysis +Meeting +dagger +##AL +Gandhi +flank +Giant +Choir +##not +glimpse +toe +Writer +teasing +springs +##dt +Glory +healthcare +regulated +complaint +math +Publications +makers +##hips +cement +Need +apologize +disputes +finishes +Partners +boring +ups +gains +1793 +Congressional +clergy +Folk +##made +##nza 
+Waters +stays +encoded +spider +betrayed +Applied +inception +##urt +##zzo +wards +bells +UCLA +Worth +bombers +Mo +trademark +Piper +##vel +incorporates +1801 +##cial +dim +Twelve +##word +Appeals +tighter +spacecraft +##tine +coordinates +##iac +mistakes +Zach +laptop +Teresa +##llar +##yr +favored +Nora +sophisticated +Irving +hammer +División +corporations +niece +##rley +Patterson +UNESCO +trafficking +Ming +balanced +plaque +Latvia +broader +##owed +Save +confined +##vable +Dalton +tide +##right +##ural +##num +swords +caring +##eg +IX +Acting +paved +##moto +launching +Antoine +substantially +Pride +Philharmonic +grammar +Indoor +Ensemble +enabling +114 +resided +Angelo +publicity +chaired +crawled +Maharashtra +Telegraph +lengthy +preference +differential +anonymous +Honey +##itation +wage +##iki +consecrated +Bryant +regulatory +Carr +##én +functioning +watches +##ú +shifts +diagnosis +Search +app +Peters +##SE +##cat +Andreas +honours +temper +counsel +Urdu +Anniversary +maritime +##uka +harmony +##unk +essence +Lorenzo +choked +Quarter +indie +##oll +loses +##prints +amendment +Adolf +scenario +similarities +##rade +##LC +technological +metric +Russians +thoroughly +##tead +cruiser +1806 +##nier +1823 +Teddy +##psy +au +progressed +exceptional +broadcaster +partnered +fitness +irregular +placement +mothers +unofficial +Garion +Johannes +1817 +regain +Solar +publishes +Gates +Broken +thirds +conversations +dive +Raj +contributor +quantities +Worcester +governance +##flow +generating +pretending +Belarus +##voy +radius +skating +Marathon +1819 +affection +undertook +##wright +los +##bro +locate +PS +excluded +recreation +tortured +jewelry +moaned +##logue +##cut +Complete +##rop +117 +##II +plantation +whipped +slower +crater +##drome +Volunteer +attributes +celebrations +regards +Publishers +oath +utilized +Robbie +Giuseppe +fiber +indication +melted +archives +Damien +storey +affecting +identifying +dances +alumni +comparable +upgrade +rented +sprint 
+##kle +Marty +##lous +treating +railways +Lebanese +erupted +occupy +sympathy +Jude +Darling +Qatar +drainage +McCarthy +heel +Klein +computing +wireless +flip +Du +Bella +##ast +##ssen +narrator +mist +sings +alignment +121 +2020 +securing +##rail +Progress +missionaries +brutal +mercy +##shing +Hip +##ache +##olo +switching +##here +Malay +##ob +constituted +Mohammed +Often +standings +surge +teachings +ink +detached +systematic +Trial +Myanmar +##wo +offs +Reyes +decoration +translations +wherever +reviewer +speculation +Bangkok +terminated +##ester +beard +RCA +Aidan +Associated +Emerson +Charity +1803 +generous +Dudley +ATP +##haven +prizes +toxic +gloves +##iles +##dos +Turning +myth +Parade +##building +Hits +##eva +teamed +Above +Duchess +Holt +##oth +Sub +Ace +atomic +inform +Ship +depend +Jun +##bes +Norwich +globe +Baroque +Christina +Cotton +Tunnel +kidding +Concerto +Brittany +tasted +phases +stems +angles +##TE +##nam +##40 +charted +Alison +intensive +Willis +glory +##lit +Bergen +est +taller +##dicate +labeled +##ido +commentator +Warrior +Viscount +shortened +aisle +Aria +Spike +spectators +goodbye +overlooking +mammals +##lude +wholly +Barrett +##gus +accompany +seventy +employ +##mb +ambitious +beloved +basket +##mma +##lding +halted +descendant +pad +exclaimed +cloak +##pet +Strait +Bang +Aviv +sadness +##ffer +Donovan +1880s +agenda +swinging +##quin +jerk +Boat +##rist +nervously +Silence +Echo +shout +implies +##iser +##cking +Shiva +Weston +damages +##tist +effectiveness +Horace +cycling +Rey +ache +Photography +PDF +Dear +leans +Lea +##vision +booth +attained +disbelief +##eus +##ution +Hop +pension +toys +Eurovision +faithful +##heads +Andre +owe +default +Atlas +Megan +highlights +lovers +Constantine +Sixth +masses +##garh +emerge +Auto +Slovak +##oa +##vert +Superintendent +flicked +inventor +Chambers +Frankie +Romeo +pottery +companions +Rudolf +##liers +diary +Unless +tap +alter +Randall +##ddle +##eal +limitations +##boards +utterly 
+knelt +guaranteed +Cowboys +Islander +horns +##ike +Wendy +sexually +Smart +breasts +##cian +compromise +Duchy +AT +Galaxy +analog +Style +##aking +weighed +Nigel +optional +Czechoslovakia +practicing +Ham +##0s +feedback +batted +uprising +operative +applicable +criminals +classrooms +Somehow +##ode +##OM +Naomi +Winchester +##pping +Bart +Regina +competitor +Recorded +Yuan +Vera +lust +Confederation +##test +suck +1809 +Lambert +175 +Friend +##ppa +Slowly +##⁺ +Wake +Dec +##aneous +chambers +Color +Gus +##site +Alternative +##world +Exeter +Omaha +celebrities +striker +210 +dwarf +meals +Oriental +Pearson +financing +revenues +underwater +Steele +screw +Feeling +Mt +acids +badge +swore +theaters +Moving +admired +lung +knot +penalties +116 +fork +##cribed +Afghan +outskirts +Cambodia +oval +wool +fossils +Ned +Countess +Darkness +delicious +##nica +Evelyn +Recordings +guidelines +##CP +Sandra +meantime +Antarctica +modeling +granddaughter +##rial +Roma +Seventh +Sunshine +Gabe +##nton +Shop +Turks +prolific +soup +parody +##nta +Judith +disciplines +resign +Companies +Libya +Jets +inserted +Mile +retrieve +filmmaker +##rand +realistic +unhappy +##30 +sandstone +##nas +##lent +##ush +##rous +Brent +trash +Rescue +##unted +Autumn +disgust +flexible +infinite +sideways +##oss +##vik +trailing +disturbed +50th +Newark +posthumously +##rol +Schmidt +Josef +##eous +determining +menu +Pole +Anita +Luc +peaks +118 +Yard +warrant +generic +deserted +Walking +stamp +tracked +##berger +paired +surveyed +sued +Rainbow +##isk +Carpenter +submarines +realization +touches +sweeping +Fritz +module +Whether +resembles +##form +##lop +unsure +hunters +Zagreb +unemployment +Senators +Georgetown +##onic +Barker +foul +commercials +Dresden +Words +collision +Carlton +Fashion +doubted +##ril +precision +MIT +Jacobs +mob +Monk +retaining +gotta +##rod +remake +Fast +chips +##pled +sufficiently +##lights +delivering +##enburg +Dancing +Barton +Officers +metals +##lake +religions +##ré 
+motivated +differs +dorsal +##birds +##rts +Priest +polished +##aling +Saxony +Wyatt +knockout +##hor +Lopez +RNA +##link +metallic +##kas +daylight +Montenegro +##lining +wrapping +resemble +Jam +Viking +uncertainty +angels +enables +##fy +Stuttgart +tricks +tattoo +127 +wicked +asset +breach +##yman +MW +breaths +Jung +im +1798 +noon +vowel +##qua +calmly +seasonal +chat +ingredients +cooled +Randolph +ensuring +##ib +##idal +flashing +1808 +Macedonian +Cool +councils +##lick +advantages +Immediately +Madras +##cked +Pain +fancy +chronic +Malayalam +begged +##nese +Inner +feathers +##vey +Names +dedication +Sing +pan +Fischer +nurses +Sharp +inning +stamps +Meg +##ello +edged +motioned +Jacksonville +##ffle +##dic +##US +divide +garnered +Ranking +chasing +modifications +##oc +clever +midst +flushed +##DP +void +##sby +ambulance +beaches +groan +isolation +strengthen +prevention +##ffs +Scouts +reformed +geographic +squadrons +Fiona +Kai +Consequently +##uss +overtime +##yas +Fr +##BL +Papua +Mixed +glances +Haiti +Sporting +sandy +confronted +René +Tanner +1811 +##IM +advisory +trim +##ibe +González +gambling +Jupiter +##ility +##owski +##nar +122 +apology +teased +Pool +feminine +wicket +eagle +shiny +##lator +blend +peaking +nasty +nodding +fraction +tech +Noble +Kuwait +brushing +Italia +Canberra +duet +Johan +1805 +Written +cameo +Stalin +pig +cord +##zio +Surely +SA +owing +holidays +123 +Ranger +lighthouse +##ige +miners +1804 +##ë +##gren +##ried +crashing +##atory +wartime +highlight +inclined +Torres +Tax +##zel +##oud +Own +##corn +Divine +EMI +Relief +Northwestern +ethics +BMW +click +plasma +Christie +coordinator +Shepherd +washing +cooked +##dio +##eat +Cerambycidae +algebra +Engine +costumes +Vampire +vault +submission +virtue +assumption +##rell +Toledo +##oting +##rva +crept +emphasized +##lton +##ood +Greeks +surgical +crest +Patrol +Beta +Tessa +##GS +pizza +traits +rats +Iris +spray +##GC +Lightning +binary +escapes +##take +Clary +crowds 
+##zong +hauled +maid +##fen +Manning +##yang +Nielsen +aesthetic +sympathetic +affiliation +soaked +Mozart +personalities +begging +##iga +clip +Raphael +yearly +Lima +abundant +##lm +1794 +strips +Initiative +reporters +##vsky +consolidated +##itated +Civic +rankings +mandate +symbolic +##ively +1807 +rental +duck +nave +complications +##nor +Irene +Nazis +haunted +scholarly +Pratt +Gran +Embassy +Wave +pity +genius +bats +canton +Tropical +marker +##cos +escorted +Climate +##posed +appreciation +freezing +puzzle +Internal +pools +Shawn +pathway +Daniels +Fitzgerald +extant +olive +Vanessa +marriages +cocked +##dging +prone +chemicals +doll +drawer +##HF +Stark +Property +##tai +flowed +Sheridan +##uated +Less +Omar +remarks +catalogue +Seymour +wreck +Carrie +##bby +Mercer +displaced +sovereignty +rip +Flynn +Archie +Quarterfinals +Hassan +##ards +vein +Osaka +pouring +wages +Romance +##cript +##phere +550 +##eil +##stown +Documentary +ancestor +CNN +Panthers +publishers +Rise +##mu +biting +Bright +String +succeeding +119 +loaned +Warwick +Sheikh +Von +Afterwards +Jax +Camden +helicopters +Hence +Laurel +##ddy +transaction +Corp +clause +##owing +##kel +Investment +cups +Lucia +Moss +Giles +chef +López +decisive +30th +distress +linguistic +surveys +Ready +maiden +Touch +frontier +incorporate +exotic +mollusk +Leopold +Ride +##wain +##ndo +teammates +tones +drift +ordering +Feb +Penny +Normandy +Present +Flag +pipes +##rro +delight +motto +Tibet +leap +Eliza +Produced +teenagers +sitcom +Try +Hansen +Cody +wandered +terrestrial +frog +scare +resisted +employers +coined +##DS +resistant +Fly +captive +dissolution +judged +associates +defining +##court +Hale +##mbo +raises +clusters +twelfth +##metric +Roads +##itude +satisfy +Android +Reds +Gloucester +Category +Valencia +Daemon +stabbed +Luna +Churches +Canton +##eller +Attack +Kashmir +annexed +grabs +asteroid +Hartford +recommendation +Rodriguez +handing +stressed +frequencies +delegate +Bones +Erie +Weber 
+Hands +Acts +millimetres +24th +Fat +Howe +casually +##SL +convent +1790 +IF +##sity +1795 +yelling +##ises +drain +addressing +amino +Marcel +Sylvia +Paramount +Gerard +Volleyball +butter +124 +Albion +##GB +triggered +1792 +folding +accepts +##ße +preparations +Wimbledon +dose +##grass +escaping +##tling +import +charging +##dation +280 +Nolan +##fried +Calcutta +##pool +Cove +examining +minded +heartbeat +twisting +domains +bush +Tunisia +Purple +Leone +##code +evacuated +battlefield +tiger +Electrical +##ared +chased +##cre +cultivated +Jet +solved +shrug +ringing +Impact +##iant +kilometre +##log +commemorate +migrated +singular +designing +promptly +Higgins +##own +##aves +freshwater +Marketing +Payne +beg +locker +pray +implied +AAA +corrected +Trans +Europeans +Ashe +acknowledge +Introduction +##writer +##llen +Munster +auxiliary +growl +Hours +Poems +##AT +reduces +Plain +plague +canceled +detention +polite +necklace +Gustav +##gu +##lance +En +Angola +##bb +dwelling +##hea +5000 +Qing +Dodgers +rim +##ored +##haus +spilled +Elisabeth +Viktor +backpack +1802 +amended +##worthy +Phantom +##ctive +keeper +##loom +Vikings +##gua +employs +Tehran +specialty +##bate +Marx +Mirror +Jenna +rides +needle +prayers +clarinet +forewings +##walk +Midlands +convincing +advocacy +Cao +Birds +cycles +Clement +Gil +bubble +Maximum +humanitarian +Tan +cries +##SI +Parsons +Trio +offshore +Innovation +clutched +260 +##mund +##duct +Prairie +relied +Falcon +##ste +Kolkata +Gill +Swift +Negro +Zoo +valleys +##OL +Opening +beams +MPs +outline +Bermuda +Personal +exceed +productive +##MT +republic +forum +##sty +tornado +Known +dipped +Edith +folks +mathematician +watershed +Ricardo +synthetic +##dication +deity +##₄ +gaming +subjected +suspects +Foot +swollen +Motors +##tty +##ý +aloud +ceremonial +es +nuts +intend +Carlisle +tasked +hesitation +sponsors +unified +inmates +##ctions +##stan +tiles +jokes +whereby +outcomes +Lights +scary +Stoke +Portrait +Blind +sergeant 
+violations +cultivation +fuselage +Mister +Alfonso +candy +sticks +teen +agony +Enough +invite +Perkins +Appeal +mapping +undergo +Glacier +Melanie +affects +incomplete +##dd +Colombian +##nate +CBC +purchasing +bypass +Drug +Electronics +Frontier +Coventry +##aan +autonomy +scrambled +Recent +bounced +cow +experiencing +Rouge +cuisine +Elite +disability +Ji +inheritance +wildly +Into +##wig +confrontation +Wheeler +shiver +Performing +aligned +consequently +Alexis +Sin +woodland +executives +Stevenson +Ferrari +inevitable +##cist +##dha +##base +Corner +comeback +León +##eck +##urus +MacDonald +pioneering +breakdown +landscapes +Veterans +Rican +Theological +stirred +participant +Credit +Hyderabad +snails +Claudia +##ocene +compliance +##MI +Flags +Middlesex +storms +winding +asserted +er +##ault +##kal +waking +##rates +abbey +Augusta +tooth +trustees +Commodore +##uded +Cunningham +NC +Witch +marching +Sword +Same +spiral +Harley +##ahan +Zack +Audio +1890s +##fit +Simmons +Kara +Veronica +negotiated +Speaking +FIBA +Conservatory +formations +constituencies +explicit +facial +eleventh +##ilt +villain +##dog +##case +##hol +armored +tin +hairs +##umi +##rai +mattress +Angus +cease +verbal +Recreation +savings +Aurora +peers +Monastery +Airways +drowned +additions +downstream +sticking +Shi +mice +skiing +##CD +Raw +Riverside +warming +hooked +boost +memorable +posed +treatments +320 +##dai +celebrating +blink +helpless +circa +Flowers +PM +uncommon +Oct +Hawks +overwhelmed +Sparhawk +repaired +Mercy +pose +counterpart +compare +survives +##½ +##eum +coordinate +Lil +grandchildren +notorious +Yi +Judaism +Juliet +accusations +1789 +floated +marathon +roar +fortified +reunion +145 +Nov +Paula +##fare +##toria +tearing +Cedar +disappearance +Si +gifted +scar +270 +PBS +Technologies +Marvin +650 +roller +cupped +negotiate +##erman +passport +tram +miracle +styled +##tier +necessity +Des +rehabilitation +Lara +USD +psychic +wipe +##lem +mistaken +##lov +charming 
+Rider +pageant +dynamics +Cassidy +##icus +defenses +##tadt +##vant +aging +##inal +declare +mistress +supervised +##alis +##rest +Ashton +submerged +sack +Dodge +grocery +ramp +Teacher +lineage +imagery +arrange +inscriptions +Organisation +Siege +combines +pounded +Fleming +legends +columnist +Apostolic +prose +insight +Arabian +expired +##uses +##nos +Alone +elbows +##asis +##adi +##combe +Step +Waterloo +Alternate +interval +Sonny +plains +Goals +incorporating +recruit +adjoining +Cheshire +excluding +marrying +ducked +Cherokee +par +##inate +hiking +Coal +##bow +natives +ribbon +Allies +con +descriptions +positively +##lal +defendant +22nd +Vivian +##beat +Weather +possessions +Date +sweetheart +inability +Salisbury +adviser +ideology +Nordic +##eu +Cubs +IP +Administrative +##nick +facto +liberation +Burnett +Javier +fashioned +Electoral +Turin +theft +unanimous +Per +1799 +Clan +Hawkins +Teachers +##wes +Cameroon +Parkway +##gment +demolition +atoms +nucleus +##thi +recovering +##yte +##vice +lifts +Must +deposit +Hancock +Semi +darkened +Declaration +moan +muscular +Myers +attractions +sauce +simulation +##weed +Alps +barriers +##baum +Barack +galleries +Min +holders +Greenwich +donation +Everybody +Wolfgang +sandwich +Kendra +Collegiate +casino +Slavic +ensuing +Porto +##grapher +Jesuit +suppressed +tires +Ibrahim +protesters +Ibn +Amos +1796 +phenomena +Hayden +Paraguay +Squad +Reilly +complement +aluminum +##eers +doubts +decay +demise +Practice +patience +fireplace +transparent +monarchy +##person +Rodney +mattered +rotating +Clifford +disposal +Standards +paced +##llie +arise +tallest +tug +documentation +node +freeway +Nikolai +##cite +clicked +imaging +Lorraine +Tactical +Different +Regular +Holding +165 +Pilot +guarded +##polis +Classics +Mongolia +Brock +monarch +cellular +receptors +Mini +Chandler +financed +financially +Lives +erection +Fuller +unnamed +Kannada +cc +passive +plateau +##arity +freak +##rde +retrieved +transactions +##sus +23rd 
+swimmer +beef +fulfill +Arlington +offspring +reasoning +Rhys +saves +pseudonym +centimetres +shivered +shuddered +##ME +Feel +##otic +professors +Blackburn +##eng +##life +##haw +interred +lodge +fragile +Della +guardian +##bbled +catalog +clad +observer +tract +declaring +##headed +Lok +dean +Isabelle +1776 +irrigation +spectacular +shuttle +mastering +##aro +Nathaniel +Retired +##lves +Brennan +##kha +dick +##dated +##hler +Rookie +leapt +televised +weekends +Baghdad +Yemen +##fo +factions +ion +Lab +mortality +passionate +Hammer +encompasses +confluence +demonstrations +Ki +derivative +soils +##unch +Ranch +Universities +conventions +outright +aiming +hierarchy +reside +illusion +graves +rituals +126 +Antwerp +Dover +##ema +campuses +Hobart +lifelong +aliens +##vity +Memory +coordination +alphabet +##mina +Titans +pushes +Flanders +##holder +Normal +excellence +capped +profound +Taipei +portrayal +sparked +scratch +se +##eas +##hir +Mackenzie +##cation +Neo +Shin +##lined +magnificent +poster +batsman +##rgent +persuade +##ement +Icelandic +miserable +collegiate +Feature +geography +##mura +Comic +Circus +processor +barracks +Tale +##11 +Bulls +##rap +strengthened +##bell +injection +miniature +broadly +Letter +fare +hostage +traders +##nium +##mere +Fortune +Rivera +Lu +triumph +Browns +Bangalore +cooperative +Basel +announcing +Sawyer +##him +##cco +##kara +darted +##AD +##nova +sucking +##position +perimeter +flung +Holdings +##NP +Basque +sketches +Augustine +Silk +Elijah +analyst +armour +riots +acquiring +ghosts +##ems +132 +Pioneer +Colleges +Simone +Economy +Author +semester +Soldier +il +##unting +##bid +freaking +Vista +tumor +##bat +murderer +##eda +unreleased +##grove +##sser +##té +edit +statute +sovereign +##gawa +Killer +stares +Fury +comply +##lord +##nant +barrels +Andhra +Maple +generator +mascot +unusually +eds +##ante +##runner +rod +##tles +Historically +Jennings +dumped +Established +resemblance +##lium +##cise +##body +##voke +Lydia 
+##hou +##iring +nonetheless +1797 +corrupt +patrons +physicist +sneak +Livingston +Citizens +Architects +Werner +trends +Melody +eighty +markings +brakes +##titled +oversaw +processed +mock +Midwest +intervals +##EF +stretches +werewolf +##MG +Pack +controller +##dition +Honours +cane +Griffith +vague +repertoire +Courtney +orgasm +Abdullah +dominance +occupies +Ya +introduces +Lester +instinct +collaborative +Indigenous +refusal +##rank +outlet +debts +spear +155 +##keeping +##ulu +Catalan +##osh +tensions +##OT +bred +crude +Dunn +abdomen +accurately +##fu +##lough +accidents +Row +Audrey +rude +Getting +promotes +replies +Paolo +merge +##nock +trans +Evangelical +automated +Canon +##wear +##ggy +##gma +Broncos +foolish +icy +Voices +knives +Aside +dreamed +generals +molecule +AG +rejection +insufficient +##nagar +deposited +sacked +Landing +arches +helpful +devotion +intake +Flower +PGA +dragons +evolutionary +##mail +330 +GM +tissues +##tree +arcade +composite +lid +Across +implications +lacks +theological +assessed +concentrations +Den +##mans +##ulous +Fu +homeland +##stream +Harriet +ecclesiastical +troop +ecological +winked +##xed +eighteenth +Casino +specializing +##sworth +unlocked +supreme +devastated +snatched +trauma +GDP +Nord +saddle +Wes +convenient +competes +##nu +##iss +Marian +subway +##rri +successes +umbrella +##far +##ually +Dundee +##cence +spark +##rix +##я +Quality +Geological +cockpit +rpm +Cam +Bucharest +riot +##PM +Leah +##dad +##pose +Ka +m³ +Bundesliga +Wolfe +grim +textile +quartet +expressing +fantastic +destroyers +eternal +picnic +##oro +contractor +1775 +spanning +declining +##cating +Lowe +Sutherland +Emirates +downward +nineteen +violently +scout +viral +melting +enterprises +##cer +Crosby +Jubilee +antenna +urgent +Rory +##uin +##sure +wandering +##gler +##vent +Suzuki +Lifetime +Dirty +occupying +##quent +Disc +Guru +mound +Lennon +Humanities +listeners +Walton +uh +Braves +Bologna +##bis +##gra +Dwight +crawl +flags 
+memoir +Thorne +Archdiocese +dairy +##uz +##tery +roared +adjust +patches +inn +Knowing +##bbed +##zan +scan +Papa +precipitation +angrily +passages +postal +Phi +embraced +blacks +economist +triangular +Sen +shooter +punished +Millennium +Swimming +confessed +Aston +defeats +Era +cousins +Williamson +##rer +daytime +dumb +##rek +underway +specification +Buchanan +prayed +concealed +activation +##issa +canon +awesome +Starr +plural +summers +##fields +Slam +unnecessary +1791 +resume +trilogy +compression +##rough +selective +dignity +Yan +##xton +immense +##yun +lone +seeded +hiatus +lightweight +summary +Yo +approve +Galway +rejoined +Elise +garbage +burns +speeches +129 +Honduras +##liness +inventory +jersey +FK +assure +slumped +Lionel +Suite +##sbury +Lena +continuation +##AN +brightly +##nti +GT +Knowledge +##park +##lius +lethal +##tribution +##sions +Certificate +Mara +##lby +algorithms +Jade +blows +pirates +fleeing +wheelchair +Stein +sophomore +Alt +Territorial +diploma +snakes +##olic +##tham +Tiffany +Pius +flush +urging +Hanover +Reich +##olate +Unity +Pike +collectively +Theme +ballad +kindergarten +rocked +zoo +##page +whip +Rodríguez +strokes +checks +Becky +Stern +upstream +##uta +Silent +volunteered +Sigma +##ingen +##tract +##ede +Gujarat +screwed +entertaining +##action +##ryn +defenders +innocence +lesbian +que +Richie +nodes +Lie +juvenile +Jakarta +safer +confront +Bert +breakthrough +gospel +Cable +##zie +institutional +Archive +brake +liquor +feeds +##iate +chancellor +Encyclopedia +Animation +scanning +teens +##mother +Core +Rear +Wine +##flower +reactor +Ave +cardinal +sodium +strands +Olivier +crouched +Vaughan +Sammy +Image +scars +Emmanuel +flour +bias +nipple +revelation +##ucci +Denny +##ssy +Form +Runners +admits +Rama +violated +Burmese +feud +underwear +Mohamed +Named +swift +statewide +Door +Recently +comparing +Hundred +##idge +##nity +##rds +Rally +Reginald +Auburn +solving +waitress +Treasurer +##ilization +Halloween 
+Ministers +Boss +Shut +##listic +Rahman +demonstrating +##pies +Gaza +Yuri +installations +Math +schooling +##bble +Bronx +exiled +gasoline +133 +bundle +humid +FCC +proportional +relate +VFL +##dez +continuity +##cene +syndicated +atmospheric +arrows +Wanderers +reinforcements +Willow +Lexington +Rotten +##yon +discovering +Serena +portable +##lysis +targeting +£1 +Goodman +Steam +sensors +detachment +Malik +##erie +attitudes +Goes +Kendall +Read +Sleep +beans +Nikki +modification +Jeanne +knuckles +Eleven +##iously +Gross +Jaime +dioxide +moisture +Stones +UCI +displacement +Metacritic +Jury +lace +rendering +elephant +Sergei +##quire +GP +Abbott +##type +projection +Mouse +Bishops +whispering +Kathleen +Rams +##jar +whites +##oran +assess +dispatched +##hire +kin +##mir +Nursing +advocates +tremendous +sweater +assisting +##bil +Farmer +prominently +reddish +Hague +cyclone +##SD +Sage +Lawson +Sanctuary +discharged +retains +##ube +shotgun +wilderness +Reformed +similarity +Entry +Watts +Bahá +Quest +Looks +visions +Reservoir +Arabs +curls +Blu +dripping +accomplish +Verlag +drill +sensor +Dillon +physicians +smashed +##dir +painters +Renault +straw +fading +Directorate +lounge +commissions +Brain +##graph +neo +##urg +plug +coordinated +##houses +Critical +lamps +illustrator +Returning +erosion +Crow +##ciation +blessing +Thought +Wife +medalist +synthesizer +Pam +Thornton +Esther +HBO +fond +Associates +##raz +pirate +permits +Wide +tire +##PC +Ernie +Nassau +transferring +RFC +##ntly +um +spit +AS +##mps +Mining +polar +villa +anchored +##zzi +embarrassment +relates +##ă +Rupert +counterparts +131 +Baxter +##18 +Igor +recognizes +Clive +##hane +##eries +##ibly +occurrence +##scope +fin +colorful +Rapids +banker +tile +##rative +##dus +delays +destinations +##llis +Pond +Dane +grandparents +rewarded +socially +motorway +##hof +##lying +##human +modeled +Dayton +Forward +conscience +Sharma +whistle +Mayer +Sasha +##pical +circuits +Zhou +##ça +Latvian 
+finalists +predators +Lafayette +closes +obligations +Resolution +##vier +Trustees +reminiscent +##hos +Highlands +Protected +asylum +evacuation +##acy +Chevrolet +confession +Somalia +emergence +separating +##rica +alright +calcium +Laurent +Welfare +Leonardo +ashes +dental +Deal +minerals +##lump +##mount +accounted +staggered +slogan +photographic +builder +##imes +##raft +tragic +144 +SEC +Hit +tailed +##ples +##rring +##rson +ethical +wrestlers +concludes +lunar +##ept +nitrogen +Aid +cyclist +quarterfinals +##ه +harvest +##hem +Pasha +IL +##mis +continually +##forth +Intel +bucket +##ended +witches +pretended +dresses +viewer +peculiar +lowering +volcano +Marilyn +Qualifier +clung +##sher +Cut +modules +Bowie +##lded +onset +transcription +residences +##pie +##itor +scrapped +##bic +Monaco +Mayo +eternity +Strike +uncovered +skeleton +##wicz +Isles +bug +Promoted +##rush +Mechanical +XII +##ivo +gripping +stubborn +velvet +TD +decommissioned +operas +spatial +unstable +Congressman +wasted +##aga +##ume +advertisements +##nya +obliged +Cannes +Conway +bricks +##gnant +##mity +##uise +jumps +Clear +##cine +##sche +chord +utter +Su +podium +spokesman +Royce +assassin +confirmation +licensing +liberty +##rata +Geographic +individually +detained +##ffe +Saturn +crushing +airplane +bushes +knights +##PD +Lilly +hurts +unexpectedly +Conservatives +pumping +Forty +candle +Pérez +peasants +supplement +Sundays +##ggs +##rries +risen +enthusiastic +corresponds +pending +##IF +Owens +floods +Painter +inflation +presumed +inscribed +Chamberlain +bizarre +1200 +liability +reacted +tub +Legacy +##eds +##pted +shone +##litz +##NC +Tiny +genome +bays +Eduardo +robbery +stall +hatch +Depot +Variety +Flora +reprinted +trembled +outlined +CR +Theresa +spans +##plication +Jensen +##eering +posting +##rky +pays +##ost +Marcos +fortifications +inferior +##ential +Devi +despair +Talbot +##chus +updates +ego +Booth +Darius +tops +##lau +Scene +##DC +Harlem +Trey +Generally +candles 
+##α +Neville +Admiralty +##hong +iconic +victorious +1600 +Rowan +abundance +miniseries +clutching +sanctioned +##words +obscure +##ision +##rle +##EM +disappearing +Resort +Obviously +##eb +exceeded +1870s +Adults +##cts +Cry +Kerr +ragged +selfish +##lson +circled +pillars +galaxy +##asco +##mental +rebuild +caution +Resistance +Start +bind +splitting +Baba +Hogan +ps +partnerships +slam +Peggy +courthouse +##OD +organizational +packages +Angie +##nds +possesses +##rp +Expressway +Gould +Terror +Him +Geoff +nobles +##ope +shark +##nh +identifies +##oor +testified +Playing +##ump +##isa +stool +Idol +##pice +##tana +Byrne +Gerry +grunted +26th +observing +habits +privilege +immortal +wagons +##thy +dot +Bring +##lian +##witz +newest +##uga +constraints +Screen +Issue +##RNA +##vil +reminder +##gles +addiction +piercing +stunning +var +##rita +Signal +accumulated +##wide +float +devastating +viable +cartoons +Uttar +flared +##encies +Theology +patents +##bahn +privileges +##ava +##CO +137 +##oped +##NT +orchestral +medication +225 +erect +Nadia +École +fried +Sales +scripts +##rease +airs +Cage +inadequate +structured +countless +Avengers +Kathy +disguise +mirrors +Investigation +reservation +##nson +Legends +humorous +Mona +decorations +attachment +Via +motivation +Browne +strangers +##ński +Shadows +Twins +##pressed +Alma +Nominated +##ott +Sergio +canopy +152 +Semifinals +devised +##irk +upwards +Traffic +Goddess +Move +beetles +138 +spat +##anne +holdings +##SP +tangled +Whilst +Fowler +anthem +##ING +##ogy +snarled +moonlight +songwriting +tolerance +Worlds +exams +##pia +notices +sensitivity +poetic +Stephens +Boone +insect +reconstructed +Fresh +27th +balloon +##ables +Brendan +mug +##gee +1780 +apex +exports +slides +Lahore +hiring +Shell +electorate +sexuality +poker +nonprofit +##imate +cone +##uce +Okinawa +superintendent +##HC +referenced +turret +Sprint +Citizen +equilibrium +Stafford +curb +Driver +Valerie +##rona +aching +impacts +##bol +observers 
+Downs +Shri +##uth +airports +##uda +assignments +curtains +solitary +icon +patrols +substances +Jasper +mountainous +Published +ached +##ingly +announce +dove +damaging +##tism +Primera +Dexter +limiting +batch +##uli +undergoing +refugee +Ye +admiral +pavement +##WR +##reed +pipeline +desires +Ramsey +Sheila +thickness +Brotherhood +Tea +instituted +Belt +Break +plots +##ais +masculine +##where +Theo +##aged +##mined +Experience +scratched +Ethiopian +Teaching +##nov +Aiden +Abe +Samoa +conditioning +##mous +Otherwise +fade +Jenks +##encing +Nat +##lain +Anyone +##kis +smirk +Riding +##nny +Bavarian +blessed +potatoes +Hook +##wise +likewise +hardened +Merry +amid +persecution +##sten +Elections +Hoffman +Pitt +##vering +distraction +exploitation +infamous +quote +averaging +healed +Rhythm +Germanic +Mormon +illuminated +guides +##ische +interfere +##ilized +rector +perennial +##ival +Everett +courtesy +##nham +Kirby +Mk +##vic +Medieval +##tale +Luigi +limp +##diction +Alive +greeting +shove +##force +##fly +Jasmine +Bend +Capt +Suzanne +ditch +134 +##nning +Host +fathers +rebuilding +Vocal +wires +##manship +tan +Factor +fixture +##LS +Māori +Plate +pyramid +##umble +slap +Schneider +yell +##ulture +##tional +Goodbye +sore +##pher +depressed +##dox +pitching +Find +Lotus +##wang +strand +Teen +debates +prevalent +##bilities +exposing +hears +billed +##rse +reorganized +compelled +disturbing +displaying +##tock +Clinical +emotionally +##iah +Derbyshire +grouped +##quel +Bahrain +Journalism +IN +persistent +blankets +Crane +camping +Direct +proving +Lola +##dding +Corporate +birthplace +##boats +##ender +Figure +dared +Assam +precursor +##nched +Tribe +Restoration +slate +Meyrick +hunted +stroking +Earlier +Kind +polls +appeals +monetary +##reate +Kira +Langdon +explores +GPS +extensions +squares +Results +draped +announcer +merit +##ennial +##tral +##roved +##cion +robots +supervisor +snorted +##group +Cannon +procession +monkey +freeze +sleeves +Nile +verdict 
+ropes +firearms +extraction +tensed +EC +Saunders +##tches +diamonds +Marriage +##amble +curling +Amazing +##haling +unrelated +##roads +Daughter +cum +discarded +kidney +cliffs +forested +Candy +##lap +authentic +tablet +notation +##nburg +Bulldogs +Callum +Meet +mouths +coated +##xe +Truman +combinations +##mation +Steelers +Fan +Than +paternal +##father +##uti +Rebellion +inviting +Fun +theatres +##ي +##rom +curator +##cision +networking +Oz +drought +##ssel +granting +MBA +Shelby +Elaine +jealousy +Kyoto +shores +signaling +tenants +debated +Intermediate +Wise +##hes +##pu +Havana +duke +vicious +exited +servers +Nonetheless +Reports +explode +##beth +Nationals +offerings +Oval +conferred +eponymous +folklore +##NR +Shire +planting +1783 +Zeus +accelerated +Constable +consuming +troubles +McCartney +texture +bust +Immigration +excavated +hopefully +##cession +##coe +##name +##ully +lining +Einstein +Venezuelan +reissued +minorities +Beatrice +crystals +##nies +circus +lava +Beirut +extinction +##shu +Becker +##uke +issuing +Zurich +extract +##esta +##rred +regulate +progression +hut +alcoholic +plea +AB +Norse +Hubert +Mansfield +ashamed +##put +Bombardment +stripes +electrons +Denise +horrified +Nor +arranger +Hay +Koch +##ddling +##iner +Birthday +Josie +deliberate +explorer +##jiang +##signed +Arrow +wiping +satellites +baritone +mobility +##rals +Dorset +turbine +Coffee +185 +##lder +Cara +Colts +pits +Crossing +coral +##birth +Tai +zombie +smoothly +##hp +mates +##ady +Marguerite +##tary +puzzled +tapes +overly +Sonic +Prayer +Thinking +##uf +IEEE +obligation +##cliffe +Basil +redesignated +##mmy +nostrils +Barney +XIII +##phones +vacated +unused +Berg +##roid +Towards +viola +136 +Event +subdivided +rabbit +recruiting +##nery +Namibia +##16 +##ilation +recruits +Famous +Francesca +##hari +Goa +##lat +Karachi +haul +biblical +##cible +MGM +##rta +horsepower +profitable +Grandma +importantly +Martinez +incoming +##kill +beneficial +nominal +praying +##isch 
+gable +nail +noises +##ttle +Polytechnic +rub +##cope +Thor +audition +erotic +##ending +##iano +Ultimately +armoured +##mum +presently +pedestrian +##tled +Ipswich +offence +##ffin +##borne +Flemish +##hman +echo +##cting +auditorium +gentlemen +winged +##tched +Nicaragua +Unknown +prosperity +exhaust +pie +Peruvian +compartment +heights +disabilities +##pole +Harding +Humphrey +postponed +moths +Mathematical +Mets +posters +axe +##nett +Nights +Typically +chuckle +councillors +alternating +141 +Norris +##ately +##etus +deficit +dreaming +cooler +oppose +Beethoven +##esis +Marquis +flashlight +headache +investor +responding +appointments +##shore +Elias +ideals +shades +torch +lingering +##real +pier +fertile +Diploma +currents +Snake +##horse +##15 +Briggs +##ota +##hima +##romatic +Coastal +Kuala +ankles +Rae +slice +Hilton +locking +Approximately +Workshop +Niagara +strangely +##scence +functionality +advertisement +Rapid +Anders +ho +Soviets +packing +basal +Sunderland +Permanent +##fting +rack +tying +Lowell +##ncing +Wizard +mighty +tertiary +pencil +dismissal +torso +grasped +##yev +Sand +gossip +##nae +Beer +implementing +##19 +##riya +Fork +Bee +##eria +Win +##cid +sailor +pressures +##oping +speculated +Freddie +originating +##DF +##SR +##outh +28th +melt +Brenda +lump +Burlington +USC +marginal +##bine +Dogs +swamp +cu +Ex +uranium +metro +spill +Pietro +seize +Chorus +partition +##dock +##media +engineered +##oria +conclusions +subdivision +##uid +Illustrated +Leading +##hora +Berkshire +definite +##books +##cin +##suke +noun +winced +Doris +dissertation +Wilderness +##quest +braced +arbitrary +kidnapping +Kurdish +##but +clearance +excavations +wanna +Allmusic +insult +presided +yacht +##SM +Honour +Tin +attracting +explosives +Gore +Bride +##ience +Packers +Devils +Observer +##course +Loser +##erry +##hardt +##mble +Cyrillic +undefeated +##stra +subordinate +##ame +Wigan +compulsory +Pauline +Cruise +Opposition +##ods +Period +dispersed +expose 
+##60 +##has +Certain +Clerk +Wolves +##hibition +apparatus +allegiance +orbital +justified +thanked +##ević +Biblical +Carolyn +Graves +##tton +Hercules +backgrounds +replica +1788 +aquatic +Mega +Stirling +obstacles +filing +Founder +vowels +Deborah +Rotterdam +surpassed +Belarusian +##ologists +Zambia +Ren +Olga +Alpine +bi +councillor +Oaks +Animals +eliminating +digit +Managing +##GE +laundry +##rdo +presses +slamming +Tudor +thief +posterior +##bas +Rodgers +smells +##ining +Hole +SUV +trombone +numbering +representations +Domingo +Paralympics +cartridge +##rash +Combined +shelves +Kraków +revision +##frame +Sánchez +##tracted +##bler +Alain +townships +sic +trousers +Gibbs +anterior +symmetry +vaguely +Castile +IRA +resembling +Penguin +##ulent +infections +##stant +raped +##pressive +worrying +brains +bending +JR +Evidence +Venetian +complexes +Jonah +850 +exported +Ambrose +Gap +philanthropist +##atus +Marxist +weighing +##KO +##nath +Soldiers +chiefs +reject +repeating +shaky +Zürich +preserving +##xin +cigarettes +##break +mortar +##fin +Already +reproduction +socks +Waiting +amazed +##aca +dash +##path +Airborne +##harf +##get +descending +OBE +Sant +Tess +Lucius +enjoys +##ttered +##ivation +##ete +Leinster +Phillies +execute +geological +unfinished +Courts +SP +Beaver +Duck +motions +Platinum +friction +##aud +##bet +Parts +Stade +entirety +sprang +Smithsonian +coffin +prolonged +Borneo +##vise +unanimously +##uchi +Cars +Cassandra +Australians +##CT +##rgen +Louisa +spur +Constance +##lities +Patent +racism +tempo +##ssion +##chard +##nology +##claim +Million +Nichols +##dah +Numerous +ing +Pure +plantations +donor +##EP +##rip +convenience +##plate +dots +indirect +##written +Dong +failures +adapt +wizard +unfortunately +##gion +practitioners +economically +Enrique +unchanged +kingdoms +refined +definitions +lazy +worries +railing +##nay +Kaiser +##lug +cracks +sells +ninety +##WC +Directed +denotes +developmental +papal +unfortunate +disappointing 
+sixteenth +Jen +##urier +NWA +drifting +Horror +##chemical +behaviors +bury +surfaced +foreigners +slick +AND +##rene +##ditions +##teral +scrap +kicks +comprise +buddy +##anda +Mental +##ype +Dom +wines +Limerick +Luca +Rand +##won +Tomatoes +homage +geometric +##nted +telescope +Shelley +poles +##fan +shareholders +Autonomous +cope +intensified +Genoa +Reformation +grazing +##tern +Zhao +provisional +##bies +Con +##riel +Cynthia +Raleigh +vivid +threaten +Length +subscription +roses +Müller +##isms +robin +##tial +Laos +Stanton +nationalism +##clave +##ND +##17 +##zz +staging +Busch +Cindy +relieve +##spective +packs +neglected +CBE +alpine +Evolution +uneasy +coastline +Destiny +Barber +Julio +##tted +informs +unprecedented +Pavilion +##bei +##ference +betrayal +awaiting +leaked +V8 +puppet +adverse +Bourne +Sunset +collectors +##glass +##sque +copied +Demon +conceded +resembled +Rafe +Levy +prosecutor +##ject +flora +manned +deaf +Mosque +reminds +Lizzie +Products +Funny +cassette +congress +##rong +Rover +tossing +prompting +chooses +Satellite +cautiously +Reese +##UT +Huang +Gloucestershire +giggled +Kitty +##å +Pleasant +Aye +##ond +judging +1860s +intentionally +Hurling +aggression +##xy +transfers +employing +##fies +##oda +Archibald +Blessed +Ski +flavor +Rosie +##burgh +sunset +Scholarship +WC +surround +ranged +##jay +Degree +Houses +squeezing +limb +premium +Leningrad +steals +##inated +##ssie +madness +vacancy +hydraulic +Northampton +##prise +Marks +Boxing +##fying +academics +##lich +##TY +CDs +##lma +hardcore +monitors +paperback +cables +Dimitri +upside +advent +Ra +##clusive +Aug +Christchurch +objected +stalked +Simple +colonists +##laid +CT +discusses +fellowship +Carnival +cares +Miracle +pastoral +rooted +shortage +borne +Quentin +meditation +tapping +Novel +##ades +Alicia +Burn +famed +residency +Fernández +Johannesburg +Zhu +offended +Mao +outward +##inas +XV +denial +noticing +##ís +quarry +##hound +##amo +Bernie +Bentley +Joanna 
+mortgage +##rdi +##sumption +lenses +extracted +depiction +##RE +Networks +Broad +Revenue +flickered +virgin +flanked +##о +Enterprises +probable +Liberals +Falcons +drowning +phrases +loads +assumes +inhaled +awe +logs +slightest +spiders +waterfall +##pate +rocking +shrub +##uil +roofs +##gard +prehistoric +wary +##rak +TO +clips +sustain +treason +microphone +voter +Lamb +psychologist +wrinkled +##ères +mating +Carrier +340 +##lbert +sensing +##rino +destiny +distract +weaker +UC +Nearly +neurons +spends +Apache +##rem +genuinely +wells +##lanted +stereo +##girl +Lois +Leaving +consul +fungi +Pier +Cyril +80s +Jungle +##tani +illustration +Split +##hana +Abigail +##patrick +1787 +diminished +Selected +packaging +##EG +Martínez +communal +Manufacturing +sentiment +143 +unwilling +praising +Citation +pills +##iti +##rax +muffled +neatly +workforce +Yep +leisure +Tu +##nding +Wakefield +ancestral +##uki +destructive +seas +Passion +showcase +##ceptive +heroic +142 +exhaustion +Customs +##aker +Scholar +sliced +##inian +Direction +##OW +Swansea +aluminium +##eep +ceramic +McCoy +Career +Sector +chartered +Damascus +pictured +Interest +stiffened +Plateau +obsolete +##tant +irritated +inappropriate +overs +##nko +bail +Talent +Sur +ours +##nah +barred +legged +sociology +Bud +dictionary +##luk +Cover +obey +##oring +annoying +##dong +apprentice +Cyrus +Role +##GP +##uns +##bag +Greenland +Porsche +Rocket +##32 +organism +##ntary +reliability +##vocation +##й +Found +##hine +motors +promoter +unfair +##oms +##note +distribute +eminent +rails +appealing +chiefly +meaningful +Stephan +##rehension +Consumer +psychiatric +bowler +saints +##iful +##н +1777 +Pol +Dorian +Townsend +hastily +##jima +Quincy +Sol +fascinated +Scarlet +alto +Avon +certainty +##eding +Keys +##chu +Chu +##VE +ions +tributaries +Thanksgiving +##fusion +astronomer +oxide +pavilion +Supply +Casa +Bollywood +sadly +mutations +Keller +##wave +nationals +##rgo +##ym +predict +Catholicism +Vega 
+##eration +##ums +Mali +tuned +Lankan +Plans +radial +Bosnian +Lexi +##14 +##ü +sacks +unpleasant +Empty +handles +##taking +Bon +switches +intently +tuition +antique +##jk +fraternity +notebook +Desmond +##sei +prostitution +##how +deed +##OP +501 +Somewhere +Rocks +##mons +campaigned +frigate +gases +suppress +##hang +Merlin +Northumberland +dominate +expeditions +thunder +##ups +##rical +Cap +thorough +Ariel +##kind +renewable +constructing +pacing +terrorists +Bowen +documentaries +westward +##lass +##nage +Merchant +##ued +Beaumont +Din +##hian +Danube +peasant +Garrison +encourages +gratitude +reminding +stormed +##ouse +pronunciation +##ailed +Weekend +suggestions +##ffing +##DI +Active +Colombo +##logists +Merrill +##cens +Archaeological +Medina +captained +##yk +duel +cracking +Wilkinson +Guam +pickup +renovations +##ël +##izer +delighted +##iri +Weaver +##ctional +tens +##hab +Clint +##usion +##each +petals +Farrell +##sable +caste +##will +Ezra +##qi +##standing +thrilled +ambush +exhaled +##SU +Resource +blur +forearm +specifications +contingent +cafe +##iology +Antony +fundraising +grape +##rgy +turnout +##udi +Clifton +laboratories +Irvine +##opus +##lid +Monthly +Bihar +statutory +Roses +Emil +##rig +lumber +optimal +##DR +pumps +plaster +Mozambique +##aco +nightclub +propelled +##hun +ked +surplus +wax +##urai +pioneered +Sunny +imprint +Forget +Eliot +approximate +patronage +##bek +##ely +##mbe +Partnership +curl +snapping +29th +Patriarch +##jord +seldom +##ature +astronomy +Bremen +XIV +airborne +205 +1778 +recognizing +stranded +arrogant +bombardment +destined +ensured +146 +robust +Davenport +Interactive +Offensive +Fi +prevents +probe +propeller +sorrow +Blade +mounting +automotive +##dged +wallet +201 +lashes +Forrest +##ift +Cell +Younger +shouts +##cki +folds +##chet +Epic +yields +homosexual +tunes +##minate +##text +Manny +chemist +hindwings +##urn +pilgrimage +##sfield +##riff +MLS +##rive +Huntington +translates +Path +slim +##ndra 
+##oz +climax +commuter +desperation +##reet +denying +##rious +daring +seminary +polo +##clamation +Teatro +Torah +Cats +identities +Poles +photographed +fiery +popularly +##cross +winters +Hesse +##vio +Nurse +Senegal +Salon +prescribed +justify +##gues +##и +##orted +HQ +##hiro +evaluated +momentarily +##unts +Debbie +##licity +##TP +Mighty +Rabbit +##chal +Events +Savoy +##ht +Brandenburg +Bordeaux +##laus +Release +##IE +##kowski +1900s +SK +Strauss +##aly +Sonia +Updated +synagogue +McKay +flattened +370 +clutch +contests +toast +evaluate +pope +heirs +jam +tutor +reverted +##ading +nonsense +hesitate +Lars +Ceylon +Laurie +##guchi +accordingly +customary +148 +Ethics +Multiple +instincts +IGN +##ä +bullshit +##hit +##par +desirable +##ducing +##yam +alias +ashore +licenses +##lification +misery +147 +Cola +assassinated +fiercely +##aft +las +goat +substrate +lords +Cass +Bridges +ICC +lasts +sights +reproductive +##asi +Ivory +Clean +fixing +##lace +seeming +aide +1850s +harassment +##FF +##LE +reasonably +##coat +##cano +NYC +1784 +Fifty +immunity +Canadians +Cheng +comforting +meanwhile +##tera +##blin +breeds +glowed +##vour +Aden +##verted +##aded +##oral +neat +enforced +poisoning +##ews +##hone +enforce +predecessors +survivor +Month +unfamiliar +pierced +waived +dump +responds +Mai +Declan +angular +Doesn +interpretations +##yar +invest +Dhaka +policeman +Congregation +Eighth +painfully +##este +##vior +Württemberg +##cles +blockade +encouragement +##fie +Caucasus +Malone +Universidad +utilize +Nissan +inherent +151 +agreeing +syllable +determines +Protocol +conclude +##gara +40th +Xu +Taiwanese +##ather +boiler +printer +Lacey +titular +Klaus +Fallon +Wembley +fox +Chandra +Governorate +obsessed +##Ps +micro +##25 +Cooke +gymnasium +weaving +Shall +Hussein +glaring +softball +Reader +Dominion +Trouble +varsity +Cooperation +Chaos +Kang +Kramer +Eisenhower +proves +Connie +consortium +governors +Bethany +opener +Normally +Willy +linebacker +Regent 
+Used +AllMusic +Twilight +##shaw +Companion +Tribunal +simpler +##gam +Experimental +Slovenian +cellar +deadline +trout +Hubbard +ads +idol +##hetto +Granada +clues +salmon +1700 +Omega +Caldwell +softened +Bills +Honolulu +##gn +Terrace +suitcase +##IL +frantic +##oons +Abbot +Sitting +Fortress +Riders +sickness +enzymes +trustee +Bern +forged +##13 +##ruff +##rl +##versity +inspector +champagne +##held +##FI +hereditary +Taliban +handball +##wine +Sioux +##dicated +honoured +139 +##tude +Skye +meanings +##rkin +cardiac +analyzed +vegetable +##FS +Royals +dial +freelance +##fest +partisan +petroleum +ridden +Lincolnshire +panting +##comb +presidents +Haley +##chs +contributes +Jew +discoveries +panicked +Woody +eyelids +Fate +Tulsa +mg +whiskey +zombies +Wii +##udge +investigators +##bull +centred +##screen +Bone +Lana +##oise +forts +##ske +Conan +Lyons +##writing +SH +##ride +rhythmic +154 +##llah +pioneers +##bright +captivity +Sanchez +Oman +##mith +Flint +Platform +##ioned +emission +packet +Persia +##formed +takeover +tempted +Vance +Few +Toni +receptions +##ن +exchanges +Camille +whale +Chronicles +##rent +##ushing +##rift +Alto +Genus +##asing +onward +foremost +longing +Rockefeller +containers +##cribe +intercepted +##olt +pleading +Bye +bee +##umbling +153 +undertake +Izzy +cheaper +Ultra +validity +##pse +Sa +hovering +##pert +vintage +engraved +##rise +farmland +##ever +##ifier +Atlantis +propose +Catalonia +plunged +##edly +demonstrates +gig +##cover +156 +Osborne +cowboy +herd +investigator +loops +Burning +rests +Instrumental +embarrassing +focal +install +readings +swirling +Chatham +parameter +##zin +##holders +Mandarin +Moody +converting +Escape +warnings +##chester +incarnation +##ophone +adopting +##lins +Cromwell +##laws +Axis +Verde +Kappa +Schwartz +Serbs +caliber +Wanna +Chung +##ality +nursery +principally +Bulletin +likelihood +logging +##erty +Boyle +supportive +twitched +##usive +builds +Marseille +omitted +motif +Lands +##lusion 
+##ssed +Barrow +Airfield +Harmony +WWF +endured +merging +convey +branding +examinations +167 +Italians +##dh +dude +1781 +##teau +crawling +thoughtful +clasped +concluding +brewery +Moldova +Wan +Towers +Heidelberg +202 +##ict +Lagos +imposing +##eval +##serve +Bacon +frowning +thirteenth +conception +calculations +##ович +##mile +##ivated +mutation +strap +##lund +demographic +nude +perfection +stocks +##renched +##dit +Alejandro +bites +fragment +##hack +##rchy +GB +Surgery +Berger +punish +boiling +consume +Elle +Sid +Dome +relies +Crescent +treasurer +Bloody +1758 +upheld +Guess +Restaurant +signatures +font +millennium +mural +stakes +Abel +hailed +insists +Alumni +Breton +##jun +digits +##FM +##thal +Talking +motive +reigning +babe +masks +##ø +Shaun +potato +sour +whitish +Somali +##derman +##rab +##wy +chancel +telecommunications +Noise +messenger +tidal +grinding +##ogenic +Rebel +constituent +peripheral +recruitment +##ograph +##tler +pumped +Ravi +poked +##gley +Olive +diabetes +discs +liking +sting +fits +stir +Mari +Sega +creativity +weights +Macau +mandated +Bohemia +disastrous +Katrina +Baku +Rajasthan +waiter +##psis +Siberia +verbs +##truction +patented +1782 +##ndon +Relegated +Hunters +Greenwood +Shock +accusing +skipped +Sessions +markers +subset +monumental +Viola +comparative +Alright +Barbados +setup +Session +standardized +##ík +##sket +appoint +AFB +Nationalist +##WS +Troop +leaped +Treasure +goodness +weary +originates +100th +compassion +expresses +recommend +168 +composing +seventeenth +Tex +Atlético +bald +Finding +Presidency +Sharks +favoured +inactive +##lter +suffix +princes +brighter +##ctus +classics +defendants +culminated +terribly +Strategy +evenings +##ção +##iver +##urance +absorb +##rner +Territories +RBI +soothing +Martín +concurrently +##tr +Nicholson +fibers +swam +##oney +Allie +Algerian +Dartmouth +Mafia +##bos +##tts +Councillor +vocabulary +##bla +##lé +intending +##dler +Guerrero +sunshine +pedal +##TO 
+administrators +periodic +scholarships +Loop +Madeline +exaggerated +##ressed +Regan +##cellular +Explorer +##oids +Alexandre +vows +Reporter +Unable +Average +absorption +##bedience +Fortunately +Auxiliary +Grandpa +##HP +##ovo +potent +temporal +adrenaline +##udo +confusing +guiding +Dry +qualifications +joking +wherein +heavyweight +##ices +nightmares +pharmaceutical +Commanding +##aled +##ove +Gregor +##UP +censorship +degradation +glorious +Austro +##rench +380 +Miriam +sped +##orous +offset +##KA +fined +specialists +Pune +João +##dina +propped +fungus +##ς +frantically +Gabrielle +Hare +committing +##plied +Ask +Wilmington +stunt +numb +warmer +preacher +earnings +##lating +integer +##ija +federation +homosexuality +##cademia +epidemic +grumbled +shoving +Milk +Satan +Tobias +innovations +##dington +geology +memoirs +##IR +spared +culminating +Daphne +Focus +severed +stricken +Paige +Mans +flats +Russo +communes +litigation +strengthening +##powered +Staffordshire +Wiltshire +Painting +Watkins +##د +specializes +Select +##rane +##aver +Fulton +playable +##VN +openings +sampling +##coon +##21 +Allah +travelers +allocation +##arily +Loch +##hm +commentators +fulfilled +##troke +Emeritus +Vanderbilt +Vijay +pledged +##tative +diagram +drilling +##MD +##plain +Edison +productivity +31st +##rying +##ption +##gano +##oration +##bara +posture +bothering +platoon +politely +##inating +redevelopment +Job +##vale +stark +incorrect +Mansion +renewal +threatens +Bahamas +fridge +##tata +Uzbekistan +##edia +Sainte +##mio +gaps +neural +##storm +overturned +Preservation +shields +##ngo +##physics +ah +gradual +killings +##anza +consultation +premiership +Felipe +coincidence +##ène +##any +Handbook +##loaded +Edit +Guns +arguably +##ş +compressed +depict +seller +##qui +Kilkenny +##kling +Olympia +librarian +##acles +dramas +JP +Kit +Maj +##lists +proprietary +##nged +##ettes +##tok +exceeding +Lock +induction +numerical +##vist +Straight +foyer +imaginary +##pop 
+violinist +Carla +bouncing +##ashi +abolition +##uction +restoring +scenic +##č +Doom +overthrow +para +##vid +##ughty +Concord +HC +cocaine +deputies +##aul +visibility +##wart +Kapoor +Hutchinson +##agan +flashes +kn +decreasing +##ronology +quotes +vain +satisfying +##iam +##linger +310 +Hanson +fauna +##zawa +##rrel +Trenton +##VB +Employment +vocational +Exactly +bartender +butterflies +tow +##chers +##ocks +pigs +merchandise +##game +##pine +Shea +##gration +Connell +Josephine +monopoly +##dled +Cobb +warships +cancellation +someday +stove +##Cs +candidacy +superhero +unrest +Toulouse +admiration +undergone +whirled +Reconnaissance +costly +##ships +290 +Cafe +amber +Tory +##mpt +definitive +##dress +proposes +redesigned +acceleration +##asa +##raphy +Presley +exits +Languages +##cel +Mode +spokesperson +##tius +Ban +forthcoming +grounded +ACC +compelling +logistics +retailers +abused +##gating +soda +##yland +##lution +Landmark +XVI +blush +##tem +hurling +dread +Tobago +Foley +##uad +scenarios +##mentation +##rks +Score +fatigue +hairy +correspond +##iard +defences +confiscated +##rudence +1785 +Formerly +Shot +advertised +460 +Text +ridges +Promise +Dev +exclusion +NHS +tuberculosis +rockets +##offs +sparkling +256 +disappears +mankind +##hore +HP +##omo +taxation +Multi +DS +Virgil +##ams +Dell +stacked +guessing +Jump +Nope +cheer +hates +ballots +overlooked +analyses +Prevention +maturity +dos +##cards +##lect +Mare +##yssa +Petty +##wning +differing +iOS +##ior +Joachim +Sentinel +##nstein +90s +Pamela +480 +Asher +##lary +Vicente +landings +portray +##rda +##xley +Virtual +##uary +finances +Jain +Somebody +Tri +behave +Michele +##ider +dwellings +FAA +Gallagher +##lide +Monkey +195 +aforementioned +##rism +##bey +##kim +##puted +Mesa +hopped +unopposed +recipients +Reality +Been +gritted +149 +playground +pillar +##rone +Guinness +##tad +Théâtre +depended +Tipperary +Reuben +frightening +wooded +Target +globally +##uted +Morales +Baptiste +drunken 
+Institut +characterised +##chemistry +Strip +discrete +Premiership +##zzling +gazing +Outer +##quisition +Sikh +Booker +##yal +contemporaries +Jericho +##chan +##physical +##witch +Militia +##rez +##zard +dangers +##utter +##₀ +Programs +darling +participates +railroads +##ienne +behavioral +bureau +##rook +161 +Hicks +##rises +Comes +inflicted +bees +kindness +norm +##ković +generators +##pard +##omy +##ili +methodology +Alvin +façade +latitude +##plified +DE +Morse +##mered +educate +intersects +##MF +##cz +##vated +AL +##graded +##fill +constitutes +artery +feudal +avant +cautious +##ogue +immigrated +##chenko +Saul +Clinic +Fang +choke +Cornelius +flexibility +temperate +pins +##erson +oddly +inequality +157 +Natasha +Sal +##uter +215 +aft +blinking +##ntino +northward +Exposition +cookies +Wedding +impulse +Overseas +terrifying +##ough +Mortimer +##see +440 +https +og +imagining +##cars +Nicola +exceptionally +threads +##cup +Oswald +Provisional +dismantled +deserves +1786 +Fairy +discourse +Counsel +departing +Arc +guarding +##orse +420 +alterations +vibrant +Em +squinted +terrace +rowing +Led +accessories +SF +Sgt +cheating +Atomic +##raj +Blackpool +##iary +boarded +substituted +bestowed +lime +kernel +##jah +Belmont +shaken +sticky +retrospective +Louie +migrants +weigh +sunglasses +thumbs +##hoff +excavation +##nks +Extra +Polo +motives +Drum +infrared +tastes +berth +verge +##stand +programmed +warmed +Shankar +Titan +chromosome +cafeteria +dividing +pepper +CPU +Stevie +satirical +Nagar +scowled +Died +backyard +##gata +##reath +##bir +Governors +portraying +##yah +Revenge +##acing +1772 +margins +Bahn +OH +lowland +##razed +catcher +replay +##yoshi +Seriously +##licit +Aristotle +##ald +Habsburg +weekday +Secretariat +CO +##dly +##joy +##stad +litre +ultra +##cke +Mongol +Tucson +correlation +compose +traps +Groups +Hai +Salvatore +##dea +cents +##eese +concession +clash +Trip +Panzer +Moroccan +cruisers +torque +Ba +grossed +##arate +restriction 
+concentrating +FDA +##Leod +##ones +Scholars +##esi +throbbing +specialised +##heses +Chicken +##fia +##ificant +Erich +Residence +##trate +manipulation +namesake +##tom +Hoover +cue +Lindsey +Lonely +275 +##HT +combustion +subscribers +Punjabi +respects +Jeremiah +penned +##gor +##rilla +suppression +##tration +Crimson +piston +Derry +crimson +lyrical +oversee +portrays +CF +Districts +Lenin +Cora +searches +clans +VHS +##hel +Jacqueline +Redskins +Clubs +desktop +indirectly +alternatives +marijuana +suffrage +##smos +Irwin +##liff +Process +##hawks +Sloane +##bson +Sonata +yielded +Flores +##ares +armament +adaptations +integrate +neighbours +shelters +##tour +Skinner +##jet +##tations +1774 +Peterborough +##elles +ripping +Liang +Dickinson +charities +Rwanda +monasteries +crossover +racist +barked +guerrilla +##ivate +Grayson +##iques +##vious +##got +Rolls +denominations +atom +affinity +##delity +Wish +##inted +##inae +interrogation +##cey +##erina +##lifting +192 +Sands +1779 +mast +Likewise +##hyl +##oft +contempt +##por +assaulted +fills +establishments +Mal +consulted +##omi +##sight +greet +##roma +##egan +Pulitzer +##rried +##dius +##ractical +##voked +Hasan +CB +##zzy +Romanesque +Panic +wheeled +recorder +##tters +##warm +##gly +botanist +Balkan +Lockheed +Polly +farewell +suffers +purchases +Eaton +##80 +Quick +commenting +Saga +beasts +hides +motifs +##icks +Alonso +Springer +Wikipedia +circulated +encoding +jurisdictions +snout +UAE +Integrated +unmarried +Heinz +##lein +##figured +deleted +##tley +Zen +Cycling +Fuel +Scandinavian +##rants +Conner +reef +Marino +curiously +lingered +Gina +manners +activism +Mines +Expo +Micah +promotions +Server +booked +derivatives +eastward +detailing +reelection +##chase +182 +Campeonato +Po +158 +Peel +winger +##itch +canyon +##pit +LDS +A1 +##shin +Giorgio +pathetic +##rga +##mist +Aren +##lag +confronts +motel +textbook +shine +turbines +1770 +Darcy +##cot +Southeastern +##lessness +Banner +recognise +stray 
+Kitchen +paperwork +realism +Chrysler +filmmakers +fishermen +##hetic +variously +Vishnu +fiddle +Eddy +Origin +##tec +##ulin +Flames +Rs +bankrupt +Extreme +Pomeranian +##emption +ratified +##iu +jockey +Stratford +##ivating +##oire +Babylon +pardon +AI +affordable +deities +disturbance +Trying +##sai +Ida +Papers +advancement +70s +archbishop +Luftwaffe +announces +tugging +##lphin +##sistence +##eel +##ishes +ambition +aura +##fled +##lected +##vue +Prasad +boiled +clarity +Violin +investigative +routing +Yankee +##uckle +McMahon +bugs +eruption +##rooms +Minutes +relics +##ckle +##nse +sipped +valves +weakly +##ital +Middleton +collided +##quer +bamboo +insignia +Tyne +exercised +Ninth +echoing +polynomial +considerations +lunged +##bius +objections +complain +disguised +plaza +##VC +institutes +Judicial +ascent +imminent +Waterford +hello +Lumpur +Niger +Goldman +vendors +Kensington +Wren +browser +##bner +##tri +##mize +##pis +##lea +Cheyenne +Bold +Settlement +Hollow +Paralympic +axle +##toire +##actic +impose +perched +utilizing +slips +Benz +Michaels +manipulate +Chiang +##mian +Dolphins +prohibition +attacker +ecology +Estadio +##SB +##uild +attracts +recalls +glacier +lad +##rima +Barlow +kHz +melodic +##aby +##iracy +assumptions +Cornish +##aru +DOS +Maddie +##mers +lyric +Luton +nm +##tron +Reno +Fin +YOU +Broadcast +Finch +sensory +##bent +Jeep +##uman +additionally +Buildings +businessmen +treaties +235 +Stranger +gateway +Charlton +accomplishments +Diary +apologized +zinc +histories +supplier +##tting +162 +asphalt +Treatment +Abbas +##pating +##yres +Bloom +sedan +soloist +##cum +antagonist +denounced +Fairfax +##aving +##enko +noticeable +Budget +Buckingham +Snyder +retreating +Jai +spoon +invading +giggle +woven +gunfire +arrests +##vered +##come +respiratory +violet +##aws +Byrd +shocking +tenant +Jamaican +Ottomans +Seal +theirs +##isse +##48 +cooperate +peering +##nius +163 +Composer +organist +Mongolian +Bauer +Spy +collects +prophecy 
+congregations +##moor +Brick +calculation +fixtures +exempt +##dden +Ada +Thousand +##lue +tracing +##achi +bodyguard +vicar +supplying +Łódź +interception +monitored +##heart +Paso +overlap +annoyance +##dice +yellowish +stables +elders +illegally +honesty +##oar +skinny +spinal +##puram +Bourbon +##cor +flourished +Medium +##stics +##aba +Follow +##ckey +stationary +##scription +dresser +scrutiny +Buckley +Clearly +##SF +Lyrics +##heimer +drying +Oracle +internally +rains +##last +Enemy +##oes +McLean +Ole +phosphate +Rosario +Rifles +##mium +battered +Pepper +Presidents +conquer +Château +castles +##aldo +##ulf +Depending +Lesser +Boom +trades +Peyton +164 +emphasize +accustomed +SM +Ai +Classification +##mins +##35 +##rons +leak +piled +deeds +lush +##self +beginnings +breathless +1660 +McGill +##ago +##chaft +##gies +humour +Bomb +securities +Might +##zone +##eves +Matthias +Movies +Levine +vengeance +##ads +Challenger +Misty +Traditionally +constellation +##rass +deepest +workplace +##oof +##vina +impatient +##ML +Mughal +Alessandro +scenery +Slater +postseason +troupe +##ń +Volunteers +Facility +militants +Reggie +sanctions +Expeditionary +Nam +countered +interpret +Basilica +coding +expectation +Duffy +def +Tong +wakes +Bowling +Vehicle +Adler +salad +intricate +stronghold +medley +##uries +##bur +joints +##rac +##yx +##IO +Ordnance +Welch +distributor +Ark +cavern +trench +Weiss +Mauritius +decreases +docks +eagerly +irritation +Matilda +biographer +Visiting +##marked +##iter +##ear +##gong +Moreno +attendant +Bury +instrumentation +theologian +clit +nuns +symphony +translate +375 +loser +##user +##VR +##meter +##orious +harmful +##yuki +Commissioners +Mendoza +sniffed +Hulk +##dded +##ulator +##nz +Donnell +##eka +deported +Met +SD +Aerospace +##cultural +##odes +Fantastic +cavity +remark +emblem +fearing +##iance +ICAO +Liberia +stab +##yd +Pac +Gymnasium +IS +Everton +##vanna +mantle +##ief +Ramon +##genic +Shooting +Smoke +Random +Africans +MB +tavern 
+bargain +voluntarily +Ion +Peoples +Rusty +attackers +Patton +sins +##cake +Hat +moderately +##hala +##alia +requesting +mechanic +##eae +Seine +Robbins +##ulum +susceptible +Bravo +Slade +Strasbourg +rubble +entrusted +Creation +##amp +smoothed +##uintet +evenly +reviewers +skip +Sculpture +177 +Rough +##rrie +Reeves +##cede +Administrator +garde +minus +carriages +grenade +Ninja +fuscous +##kley +Punk +contributors +Aragon +Tottenham +##cca +##sir +VA +laced +dealers +##sonic +crisp +harmonica +Artistic +Butch +Andes +Farmers +corridors +unseen +##tium +Countries +Lone +envisioned +Katy +##lang +##cc +Quarterly +##neck +consort +##aceae +bidding +Corey +concurrent +##acts +##gum +Highness +##lient +##rators +arising +##unta +pathways +49ers +bolted +complaining +ecosystem +libretto +Ser +narrated +212 +Soft +influx +##dder +incorporation +plagued +tents +##ddled +1750 +Risk +citation +Tomas +hostilities +seals +Bruins +Dominique +attic +competent +##UR +##cci +hugging +Breuning +bacterial +Shrewsbury +vowed +eh +elongated +hangs +render +centimeters +##ficient +Mu +turtle +besieged +##gaard +grapes +bravery +collaborations +deprived +##amine +##using +##gins +arid +##uve +coats +hanged +##sting +Pa +prefix +##ranged +Exit +Chain +Flood +Materials +suspicions +##ö +hovered +Hidden +##state +Malawi +##24 +Mandy +norms +fascinating +airlines +delivers +##rust +Cretaceous +spanned +pillows +##onomy +jar +##kka +regent +fireworks +morality +discomfort +lure +uneven +##jack +Lucian +171 +archaeology +##til +mornings +Billie +Marquess +impending +spilling +tombs +##volved +Celia +Coke +underside +##bation +Vaughn +Daytona +Godfrey +Pascal +Alien +##sign +172 +##lage +iPhone +Gonna +genocide +##rber +oven +endure +dashed +simultaneous +##phism +Wally +##rō +ants +predator +reissue +##aper +Speech +funk +Rudy +claw +Hindus +Numbers +Bing +lantern +##aurus +scattering +poisoned +##active +Andrei +algebraic +baseman +##ritz +Gregg +##cola +selections +##putation +lick 
+Laguna +##IX +Sumatra +Warning +turf +buyers +Burgess +Oldham +exploit +worm +initiate +strapped +tuning +filters +haze +##е +##ledge +##ydro +##culture +amendments +Promotion +##union +Clair +##uria +petty +shutting +##eveloped +Phoebe +Zeke +conducts +grains +clashes +##latter +illegitimate +willingly +Deer +Lakers +Reference +chaplain +commitments +interrupt +salvation +Panther +Qualifying +Assessment +cancel +efficiently +attorneys +Dynamo +impress +accession +clinging +randomly +reviewing +Romero +Cathy +charting +clapped +rebranded +Azerbaijani +coma +indicator +punches +##tons +Sami +monastic +prospects +Pastor +##rville +electrified +##CI +##utical +tumbled +Chef +muzzle +selecting +UP +Wheel +protocols +##tat +Extended +beautifully +nests +##stal +Andersen +##anu +##³ +##rini +kneeling +##reis +##xia +anatomy +dusty +Safe +turmoil +Bianca +##elo +analyze +##ر +##eran +podcast +Slovene +Locke +Rue +##retta +##uni +Person +Prophet +crooked +disagreed +Versailles +Sarajevo +Utrecht +##ogen +chewing +##ception +##iidae +Missile +attribute +majors +Arch +intellectuals +##andra +ideological +Cory +Salzburg +##fair +Lot +electromagnetic +Distribution +##oper +##pered +Russ +Terra +repeats +fluttered +Riga +##ific +##gt +cows +Hair +labelled +protects +Gale +Personnel +Düsseldorf +Moran +rematch +##OE +Slow +forgiveness +##ssi +proudly +Macmillan +insist +undoubtedly +Québec +Violence +##yuan +##aine +mourning +linen +accidental +##iol +##arium +grossing +lattice +maneuver +##marine +prestige +petrol +gradient +invasive +militant +Galerie +widening +##aman +##quist +disagreement +##ales +creepy +remembers +buzz +##erial +Exempt +Dirk +mon +Addison +##inen +deposed +##agon +fifteenth +Hang +ornate +slab +##lades +Fountain +contractors +das +Warwickshire +1763 +##rc +Carly +Essays +Indy +Ligue +greenhouse +slit +##sea +chewed +wink +##azi +Playhouse +##kon +Gram +Ko +Samson +creators +revive +##rians +spawned +seminars +Craft +Tall +diverted +assistants 
+computational +enclosure +##acity +Coca +##eve +databases +Drop +##loading +##hage +Greco +Privy +entrances +pork +prospective +Memories +robes +##market +transporting +##lik +Rudolph +Horton +visually +##uay +##nja +Centro +Tor +Howell +##rsey +admitting +postgraduate +herbs +##att +Chin +Rutherford +##bot +##etta +Seasons +explanations +##bery +Friedman +heap +##ryl +##sberg +jaws +##agh +Choi +Killing +Fanny +##suming +##hawk +hopeful +##aid +Monty +gum +remarkably +Secrets +disco +harp +advise +##avia +Marathi +##cycle +Truck +abbot +sincere +urine +##mology +masked +bathing +##tun +Fellows +##TM +##gnetic +owl +##jon +hymn +##leton +208 +hostility +##cée +baked +Bottom +##AB +shudder +##ater +##von +##hee +reorganization +Cycle +##phs +Lex +##style +##rms +Translation +##erick +##imeter +##ière +attested +Hillary +##DM +gal +wander +Salle +##laming +Perez +Pit +##LP +USAF +contexts +Disease +blazing +aroused +razor +walled +Danielle +Mont +Funk +royalty +thee +203 +donors +##erton +famously +processors +reassigned +welcoming +Goldberg +##quities +undisclosed +Orient +Patty +vaccine +refrigerator +Cypriot +consonant +##waters +176 +sober +##lement +Racecourse +##uate +Luckily +Selection +conceptual +vines +Breaking +wa +lions +oversight +sheltered +Dancer +ponds +borrow +##BB +##pulsion +Daly +##eek +fertility +spontaneous +Worldwide +gasping +##tino +169 +ABS +Vickers +ambient +energetic +prisons +##eson +Stacy +##roach +GmbH +Afro +Marin +farmhouse +pinched +##cursion +##sp +Sabine +##pire +181 +nak +swelling +humble +perfume +##balls +Rai +cannons +##taker +Married +Maltese +canals +interceptions +hats +lever +slowing +##ppy +Nike +Silas +Scarborough +skirts +166 +inauguration +Shuttle +alloy +beads +belts +Compton +Cause +battling +critique +surf +Dock +roommate +##ulet +invade +Garland +##slow +nutrition +persona +##zam +Wichita +acquaintance +coincided +##cate +Dracula +clamped +##gau +overhaul +##broken +##rrier +melodies +ventures +Paz +convex +Roots 
+##holding +Tribute +transgender +##ò +chimney +##riad +Ajax +Thereafter +messed +nowadays +pH +##100 +##alog +Pomerania +##yra +Rossi +glove +##TL +Races +##asily +tablets +Jase +##ttes +diner +##rns +Hu +Mohan +anytime +weighted +remixes +Dove +cherry +imports +##urity +GA +##TT +##iated +##sford +Clarkson +evidently +rugged +Dust +siding +##ometer +acquitted +choral +##mite +infants +Domenico +gallons +Atkinson +gestures +slated +##xa +Archaeology +unwanted +##ibes +##duced +premise +Colby +Geelong +disqualified +##pf +##voking +simplicity +Walkover +Qaeda +Warden +##bourg +##ān +Invasion +Babe +harness +183 +##tated +maze +Burt +bedrooms +##nsley +Horizon +##oast +minimize +peeked +MLA +Trains +tractor +nudged +##iform +Growth +Benton +separates +##about +##kari +buffer +anthropology +brigades +foil +##wu +Domain +licking +whore +##rage +##sham +Initial +Courthouse +Rutgers +dams +villains +supermarket +##brush +Brunei +Palermo +arises +Passenger +outreach +##gill +Labrador +McLaren +##uy +Lori +##fires +Heads +magistrate +¹⁄₂ +Weapons +##wai +##roke +projecting +##ulates +bordering +McKenzie +Pavel +midway +Guangzhou +streamed +racer +##lished +eccentric +spectral +206 +##mism +Wilde +Grange +preparatory +lent +##tam +starving +Gertrude +##cea +##ricted +Breakfast +Mira +blurted +derive +##lair +blunt +sob +Cheltenham +Henrik +reinstated +intends +##istan +unite +##ector +playful +sparks +mapped +Cadet +luggage +prosperous +##ein +salon +##utes +Biological +##rland +Tyrone +buyer +##lose +amounted +Saw +smirked +Ronan +Reviews +Adele +trait +##proof +Bhutan +Ginger +##junct +digitally +stirring +##isted +coconut +Hamlet +Dinner +Scale +pledge +##RP +Wrong +Goal +Panel +therapeutic +elevations +infectious +priesthood +##inda +Guyana +diagnostic +##mbre +Blackwell +sails +##arm +literal +periodically +gleaming +Robot +Rector +##abulous +##tres +Reaching +Romantic +CP +Wonderful +##tur +ornamental +##nges +traitor +##zilla +genetics +mentioning +##eim +resonance 
+Areas +Shopping +##nard +Gail +Solid +##rito +##mara +Willem +Chip +Matches +Volkswagen +obstacle +Organ +invites +Coral +attain +##anus +##dates +Midway +shuffled +Cecilia +dessert +Gateway +Ch +Napoleonic +Petroleum +jets +goose +striped +bowls +vibration +Sims +nickel +Thirteen +problematic +intervene +##grading +##unds +Mum +semifinal +Radical +##izations +refurbished +##sation +##harine +Maximilian +cites +Advocate +Potomac +surged +preserves +Curry +angled +ordination +##pad +Cade +##DE +##sko +researched +torpedoes +Resident +wetlands +hay +applicants +depart +Bernstein +##pic +##ario +##rae +favourable +##wari +##р +metabolism +nobleman +Defaulted +calculate +ignition +Celebrity +Belize +sulfur +Flat +Sc +USB +flicker +Hertfordshire +Sept +CFL +Pasadena +Saturdays +Titus +##nir +Canary +Computing +Isaiah +##mler +formidable +pulp +orchid +Called +Solutions +kilograms +steamer +##hil +Doncaster +successors +Stokes +Holstein +##sius +sperm +API +Rogue +instability +Acoustic +##rag +159 +undercover +Wouldn +##pra +##medical +Eliminated +honorable +##chel +denomination +abrupt +Buffy +blouse +fi +Regardless +Subsequent +##rdes +Lover +##tford +bacon +##emia +carving +##cripts +Massacre +Ramos +Latter +##ulp +ballroom +##gement +richest +bruises +Rest +Wiley +##aster +explosions +##lastic +Edo +##LD +Mir +choking +disgusted +faintly +Barracks +blasted +headlights +Tours +ensued +presentations +##cale +wrought +##oat +##coa +Quaker +##sdale +recipe +##gny +corpses +##liance +comfortably +##wat +Landscape +niche +catalyst +##leader +Securities +messy +##RL +Rodrigo +backdrop +##opping +treats +Emilio +Anand +bilateral +meadow +VC +socialism +##grad +clinics +##itating +##ppe +##ymphonic +seniors +Advisor +Armoured +Method +Alley +##orio +Sad +fueled +raided +Axel +NH +rushes +Dixie +Otis +wrecked +##22 +capitalism +café +##bbe +##pion +##forcing +Aubrey +Lublin +Whenever +Sears +Scheme +##lana +Meadows +treatise +##RI +##ustic +sacrifices +sustainability 
+Biography +mystical +Wanted +multiplayer +Applications +disliked +##tisfied +impaired +empirical +forgetting +Fairfield +Sunni +blurred +Growing +Avalon +coil +Camera +Skin +bruised +terminals +##fted +##roving +Commando +##hya +##sper +reservations +needles +dangling +##rsch +##rsten +##spect +##mbs +yoga +regretted +Bliss +Orion +Rufus +glucose +Olsen +autobiographical +##dened +222 +humidity +Shan +##ifiable +supper +##rou +flare +##MO +campaigning +descend +socio +declares +Mounted +Gracie +Arte +endurance +##ety +Copper +costa +airplay +##MB +Proceedings +dislike +grimaced +occupants +births +glacial +oblivious +cans +installment +muddy +##ł +captains +pneumonia +Quiet +Sloan +Excuse +##nine +Geography +gymnastics +multimedia +drains +Anthology +Gear +cylindrical +Fry +undertaking +##pler +##tility +Nan +##recht +Dub +philosophers +piss +Atari +##pha +Galicia +México +##nking +Continuing +bump +graveyard +persisted +Shrine +##erapy +defects +Advance +Bomber +##oil +##ffling +cheerful +##lix +scrub +##eto +awkwardly +collaborator +fencing +##alo +prophet +Croix +coughed +##lication +roadway +slaughter +elephants +##erated +Simpsons +vulnerability +ivory +Birth +lizard +scarce +cylinders +fortunes +##NL +Hate +Priory +##lai +McBride +##copy +Lenny +liaison +Triangle +coronation +sampled +savage +amidst +Grady +whatsoever +instinctively +Reconstruction +insides +seizure +Drawing +##rlin +Antioch +Gao +Díaz +1760 +Sparks +##tien +##bidae +rehearsal +##bbs +botanical +##hers +compensate +wholesale +Seville +shareholder +prediction +astronomical +Reddy +hardest +circling +whereabouts +termination +Rep +Assistance +Dramatic +Herb +##ghter +climbs +188 +Poole +301 +##pable +wit +##istice +Walters +relying +Jakob +##redo +proceeding +Langley +affiliates +ou +##allo +##holm +Samsung +##ishi +Missing +Xi +vertices +Claus +foam +restless +##uating +##sso +##ttering +Philips +delta +bombed +Catalogue +coaster +Ling +Willard +satire +410 +Composition +Net +Orioles +##ldon 
+fins +Palatinate +Woodward +tease +tilt +brightness +##70 +##bbling +##loss +##dhi +##uilt +Whoever +##yers +hitter +Elton +Extension +ace +Affair +restructuring +##loping +Paterson +hi +##rya +spouse +Shay +Himself +piles +preaching +##gical +bikes +Brave +expulsion +Mirza +stride +Trees +commemorated +famine +masonry +Selena +Watt +Banking +Rancho +Stockton +dip +tattoos +Vlad +acquainted +Flyers +ruthless +fourteenth +illustrate +##akes +EPA +##rows +##uiz +bumped +Designed +Leaders +mastered +Manfred +swirled +McCain +##rout +Artemis +rabbi +flinched +upgrades +penetrate +shipyard +transforming +caretaker +##eiro +Maureen +tightening +##founded +RAM +##icular +##mper +##rung +Fifteen +exploited +consistency +interstate +##ynn +Bridget +contamination +Mistress +##rup +coating +##FP +##jective +Libyan +211 +Gemma +dependence +shrubs +##ggled +Germain +retaliation +traction +##PP +Dangerous +terminology +psychiatrist +##garten +hurdles +Natal +wasting +Weir +revolves +stripe +##reased +preferences +##entation +##lde +##áil +##otherapy +Flame +##ologies +viruses +Label +Pandora +veil +##ogical +Coliseum +Cottage +creeping +Jong +lectured +##çaise +shoreline +##fference +##hra +Shade +Clock +Faye +bilingual +Humboldt +Operating +##fter +##was +algae +towed +amphibious +Parma +impacted +smacked +Piedmont +Monsters +##omb +Moor +##lberg +sinister +Postal +178 +Drummond +Sign +textbooks +hazardous +Brass +Rosemary +Pick +Sit +Architect +transverse +Centennial +confess +polling +##aia +Julien +##mand +consolidation +Ethel +##ulse +severity +Yorker +choreographer +1840s +##ltry +softer +versa +##geny +##quila +##jō +Caledonia +Friendship +Visa +rogue +##zzle +bait +feather +incidence +Foods +Ships +##uto +##stead +arousal +##rote +Hazel +##bolic +Swing +##ej +##cule +##jana +##metry +##uity +Valuable +##ₙ +Shropshire +##nect +365 +Ones +realise +Café +Albuquerque +##grown +##stadt +209 +##ᵢ +prefers +withstand +Lillian +MacArthur +Hara +##fulness +domination +##VO 
+##school +Freddy +ethnicity +##while +adorned +hormone +Calder +Domestic +Freud +Shields +##phus +##rgan +BP +Segunda +Mustang +##GI +Bonn +patiently +remarried +##umbria +Crete +Elephant +Nuremberg +tolerate +Tyson +##evich +Programming +##lander +Bethlehem +segregation +Constituency +quarterly +blushed +photographers +Sheldon +porcelain +Blanche +goddamn +lively +##fused +bumps +##eli +curated +coherent +provoked +##vet +Madeleine +##isco +rainy +Bethel +accusation +ponytail +gag +##lington +quicker +scroll +##vate +Bow +Gender +Ira +crashes +ACT +Maintenance +##aton +##ieu +bitterly +strains +rattled +vectors +##arina +##ishly +173 +parole +##nx +amusing +Gonzalez +##erative +Caucus +sensual +Penelope +coefficient +Mateo +##mani +proposition +Duty +lacrosse +proportions +Plato +profiles +Botswana +Brandt +reins +mandolin +encompassing +##gens +Kahn +prop +summon +##MR +##yrian +##zaki +Falling +conditional +thy +##bao +##ych +radioactive +##nics +Newspaper +##people +##nded +Gaming +sunny +##look +Sherwood +crafted +NJ +awoke +187 +timeline +giants +possessing +##ycle +Cheryl +ng +Ruiz +polymer +potassium +Ramsay +relocation +##leen +Sociology +##bana +Franciscan +propulsion +denote +##erjee +registers +headline +Tests +emerges +Articles +Mint +livery +breakup +kits +Rap +Browning +Bunny +##mington +##watch +Anastasia +Zachary +arranging +biographical +Erica +Nippon +##membrance +Carmel +##sport +##xes +Paddy +##holes +Issues +Spears +compliment +##stro +##graphs +Castillo +##MU +##space +Corporal +##nent +174 +Gentlemen +##ilize +##vage +convinces +Carmine +Crash +##hashi +Files +Doctors +brownish +sweating +goats +##conductor +rendition +##bt +NL +##spiration +generates +##cans +obsession +##noy +Danger +Diaz +heats +Realm +priorities +##phon +1300 +initiation +pagan +bursts +archipelago +chloride +Screenplay +Hewitt +Khmer +bang +judgement +negotiating +##ait +Mabel +densely +Boulder +knob +430 +Alfredo +##kt +pitches +##ées +##ان +Macdonald +##llum +imply 
+##mot +Smile +spherical +##tura +Derrick +Kelley +Nico +cortex +launches +differed +parallels +Navigation +##child +##rming +canoe +forestry +reinforce +##mote +confirming +tasting +scaled +##resh +##eting +Understanding +prevailing +Pearce +CW +earnest +Gaius +asserts +denoted +landmarks +Chargers +warns +##flies +Judges +jagged +##dain +tails +Historian +Millie +##sler +221 +##uard +absurd +Dion +##ially +makeshift +Specifically +ignorance +Eat +##ieri +comparisons +forensic +186 +Giro +skeptical +disciplinary +battleship +##45 +Libby +520 +Odyssey +ledge +##post +Eternal +Missionary +deficiency +settler +wonders +##gai +raging +##cis +Romney +Ulrich +annexation +boxers +sect +204 +ARIA +dei +Hitchcock +te +Varsity +##fic +CC +lending +##nial +##tag +##rdy +##obe +Defensive +##dson +##pore +stellar +Lam +Trials +contention +Sung +##uminous +Poe +superiority +##plicate +325 +bitten +conspicuous +##olly +Lila +Pub +Petit +distorted +ISIL +distinctly +##family +Cowboy +mutant +##cats +##week +Changes +Sinatra +epithet +neglect +Innocent +gamma +thrill +reggae +##adia +##ational +##due +landlord +##leaf +visibly +##ì +Darlington +Gomez +##iting +scarf +##lade +Hinduism +Fever +scouts +##roi +convened +##oki +184 +Lao +boycott +unemployed +##lore +##ß +##hammer +Curran +disciples +odor +##ygiene +Lighthouse +Played +whales +discretion +Yves +##ceived +pauses +coincide +##nji +dizzy +##scopic +routed +Guardians +Kellan +carnival +nasal +224 +##awed +Mitsubishi +640 +Cast +silky +Projects +joked +Huddersfield +Rothschild +zu +##olar +Divisions +mildly +##eni +##lge +Appalachian +Sahara +pinch +##roon +wardrobe +##dham +##etal +Bubba +##lini +##rumbling +Communities +Poznań +unification +Beau +Kris +SV +Rowing +Minh +reconciliation +##saki +##sor +taped +##reck +certificates +gubernatorial +rainbow +##uing +litter +##lique +##oted +Butterfly +benefited +Images +induce +Balkans +Velvet +##90 +##xon +Bowman +##breaker +penis +##nitz +##oint +##otive +crust +##pps 
+organizers +Outdoor +nominees +##rika +TX +##ucks +Protestants +##imation +appetite +Baja +awaited +##points +windshield +##igh +##zled +Brody +Buster +stylized +Bryce +##sz +Dollar +vest +mold +ounce +ok +receivers +##uza +Purdue +Harrington +Hodges +captures +##ggio +Reservation +##ssin +##tman +cosmic +straightforward +flipping +remixed +##athed +Gómez +Lim +motorcycles +economies +owning +Dani +##rosis +myths +sire +kindly +1768 +Bean +graphs +##mee +##RO +##geon +puppy +Stephenson +notified +##jer +Watching +##rama +Sino +urgency +Islanders +##mash +Plata +fumble +##chev +##stance +##rack +##she +facilitated +swings +akin +enduring +payload +##phine +Deputies +murals +##tooth +610 +Jays +eyeing +##quito +transparency +##cote +Timor +negatively +##isan +battled +##fected +thankful +Rage +hospitality +incorrectly +207 +entrepreneurs +##cula +##wley +hedge +##cratic +Corpus +Odessa +Whereas +##ln +fetch +happier +Amherst +bullying +graceful +Height +Bartholomew +willingness +qualifier +191 +Syed +Wesleyan +Layla +##rrence +Webber +##hum +Rat +##cket +##herence +Monterey +contaminated +Beside +Mustafa +Nana +213 +##pruce +Reason +##spense +spike +##gé +AU +disciple +charcoal +##lean +formulated +Diesel +Mariners +accreditation +glossy +1800s +##ih +Mainz +unison +Marianne +shear +overseeing +vernacular +bowled +##lett +unpopular +##ckoned +##monia +Gaston +##TI +##oters +Cups +##bones +##ports +Museo +minors +1773 +Dickens +##EL +##NBC +Presents +ambitions +axes +Río +Yukon +bedside +Ribbon +Units +faults +conceal +##lani +prevailed +214 +Goodwin +Jaguar +crumpled +Cullen +Wireless +ceded +remotely +Bin +mocking +straps +ceramics +##avi +##uding +##ader +Taft +twenties +##aked +Problem +quasi +Lamar +##ntes +##avan +Barr +##eral +hooks +sa +##ône +194 +##ross +Nero +Caine +trance +Homeland +benches +Guthrie +dismiss +##lex +César +foliage +##oot +##alty +Assyrian +Ahead +Murdoch +dictatorship +wraps +##ntal +Corridor +Mackay +respectable +jewels +understands 
+##pathic +Bryn +##tep +ON +capsule +intrigued +Sleeping +communists +##chayat +##current +##vez +doubling +booklet +##uche +Creed +##NU +spies +##sef +adjusting +197 +Imam +heaved +Tanya +canonical +restraint +senators +stainless +##gnate +Matter +cache +restrained +conflicting +stung +##ool +Sustainable +antiquity +193 +heavens +inclusive +##ador +fluent +303 +911 +archaeologist +superseded +##plex +Tammy +inspire +##passing +##lub +Lama +Mixing +##activated +##yote +parlor +tactic +198 +Stefano +prostitute +recycling +sorted +banana +Stacey +Musée +aristocratic +cough +##rting +authorised +gangs +runoff +thoughtfully +##nish +Fisheries +Provence +detector +hum +##zhen +pill +##árez +Map +Leaves +Peabody +skater +vent +##color +390 +cerebral +hostages +mare +Jurassic +swell +##isans +Knoxville +Naked +Malaya +scowl +Cobra +##anga +Sexual +##dron +##iae +196 +##drick +Ravens +Blaine +##throp +Ismail +symmetric +##lossom +Leicestershire +Sylvester +glazed +##tended +Radar +fused +Families +Blacks +Sale +Zion +foothills +microwave +slain +Collingwood +##pants +##dling +killers +routinely +Janice +hearings +##chanted +##ltration +continents +##iving +##yster +##shot +##yna +injected +Guillaume +##ibi +kinda +Confederacy +Barnett +disasters +incapable +##grating +rhythms +betting +draining +##hak +Callie +Glover +##iliated +Sherlock +hearted +punching +Wolverhampton +Leaf +Pi +builders +furnished +knighted +Photo +##zle +Touring +fumbled +pads +##ий +Bartlett +Gunner +eerie +Marius +Bonus +pots +##hino +##pta +Bray +Frey +Ortiz +stalls +belongings +Subway +fascination +metaphor +Bat +Boer +Colchester +sway +##gro +rhetoric +##dheim +Fool +PMID +admire +##hsil +Strand +TNA +##roth +Nottinghamshire +##mat +##yler +Oxfordshire +##nacle +##roner +BS +##nces +stimulus +transports +Sabbath +##postle +Richter +4000 +##grim +##shima +##lette +deteriorated +analogous +##ratic +UHF +energies +inspiring +Yiddish +Activities +##quential +##boe +Melville +##ilton +Judd +consonants 
+labs +smuggling +##fari +avid +##uc +truce +undead +##raith +Mostly +bracelet +Connection +Hussain +awhile +##UC +##vention +liable +genetically +##phic +Important +Wildcats +daddy +transmit +##cas +conserved +Yesterday +##lite +Nicky +Guys +Wilder +Lay +skinned +Communists +Garfield +Nearby +organizer +Loss +crafts +walkway +Chocolate +Sundance +Synod +##enham +modify +swayed +Surface +analysts +brackets +drone +parachute +smelling +Andrés +filthy +frogs +vertically +##OK +localities +marries +AHL +35th +##pian +Palazzo +cube +dismay +relocate +##на +Hear +##digo +##oxide +prefecture +converts +hangar +##oya +##ucking +Spectrum +deepened +spoiled +Keeping +##phobic +Verona +outrage +Improvement +##UI +masterpiece +slung +Calling +chant +Haute +mediated +manipulated +affirmed +##hesis +Hangul +skies +##llan +Worcestershire +##kos +mosaic +##bage +##wned +Putnam +folder +##LM +guts +noteworthy +##rada +AJ +sculpted +##iselle +##rang +recognizable +##pent +dolls +lobbying +impatiently +Se +staple +Serb +tandem +Hiroshima +thieves +##ynx +faculties +Norte +##alle +##trusion +chords +##ylon +Gareth +##lops +##escu +FIA +Levin +auspices +groin +Hui +nun +Listed +Honourable +Larsen +rigorous +##erer +Tonga +##pment +##rave +##track +##aa +##enary +540 +clone +sediment +esteem +sighted +cruelty +##boa +inverse +violating +Amtrak +Status +amalgamated +vertex +AR +harmless +Amir +mounts +Coronation +counseling +Audi +CO₂ +splits +##eyer +Humans +Salmon +##have +##rado +##čić +216 +takeoff +classmates +psychedelic +##gni +Gypsy +231 +Anger +GAA +ME +##nist +##tals +Lissa +Odd +baptized +Fiat +fringe +##hren +179 +elevators +perspectives +##TF +##ngle +Question +frontal +950 +thicker +Molecular +##nological +Sixteen +Baton +Hearing +commemorative +dorm +Architectural +purity +##erse +risky +Georgie +relaxing +##ugs +downed +##rar +Slim +##phy +IUCN +##thorpe +Parkinson +217 +Marley +Shipping +sweaty +Jesuits +Sindh +Janata +implying +Armenians +intercept +Ankara 
+commissioners +ascended +sniper +Grass +Walls +salvage +Dewey +generalized +learnt +PT +##fighter +##tech +DR +##itrus +##zza +mercenaries +slots +##burst +##finger +##nsky +Princes +Rhodesia +##munication +##strom +Fremantle +homework +ins +##Os +##hao +##uffed +Thorpe +Xiao +exquisite +firstly +liberated +technician +Oilers +Phyllis +herb +sharks +MBE +##stock +Product +banjo +##morandum +##than +Visitors +unavailable +unpublished +oxidation +Vogue +##copic +##etics +Yates +##ppard +Leiden +Trading +cottages +Principles +##Millan +##wife +##hiva +Vicar +nouns +strolled +##eorological +##eton +##science +precedent +Armand +Guido +rewards +##ilis +##tise +clipped +chick +##endra +averages +tentatively +1830s +##vos +Certainly +305 +Société +Commandant +##crats +##dified +##nka +marsh +angered +ventilation +Hutton +Ritchie +##having +Eclipse +flick +motionless +Amor +Fest +Loire +lays +##icit +##sband +Guggenheim +Luck +disrupted +##ncia +Disco +##vigator +criticisms +grins +##lons +##vial +##ody +salute +Coaches +junk +saxophonist +##eology +Uprising +Diet +##marks +chronicles +robbed +##iet +##ahi +Bohemian +magician +wavelength +Kenyan +augmented +fashionable +##ogies +Luce +F1 +Monmouth +##jos +##loop +enjoyment +exemption +Centers +##visor +Soundtrack +blinding +practitioner +solidarity +sacrificed +##oso +##cture +##riated +blended +Abd +Copyright +##nob +34th +##reak +Claudio +hectare +rotor +testify +##ends +##iably +##sume +landowner +##cess +##ckman +Eduard +Silesian +backseat +mutually +##abe +Mallory +bounds +Collective +Poet +Winkler +pertaining +scraped +Phelps +crane +flickering +Proto +bubbles +popularized +removes +##86 +Cadillac +Warfare +audible +rites +shivering +##sist +##nst +##biotic +Mon +fascist +Bali +Kathryn +ambiguous +furiously +morale +patio +Sang +inconsistent +topology +Greens +monkeys +Köppen +189 +Toy +vow +##ías +bombings +##culus +improvised +lodged +subsidiaries +garment +startling +practised +Hume +Thorn +categorized +Till 
+Eileen +wedge +##64 +Federico +patriotic +unlock +##oshi +badminton +Compared +Vilnius +##KE +Crimean +Kemp +decks +spaced +resolutions +sighs +##mind +Imagine +Cartoon +huddled +policemen +forwards +##rouch +equals +##nter +inspected +Charley +MG +##rte +pamphlet +Arturo +dans +scarcely +##ulton +##rvin +parental +unconstitutional +watts +Susannah +Dare +##sitive +Rowland +Valle +invalid +##ué +Detachment +acronym +Yokohama +verified +##lsson +groove +Liza +clarified +compromised +265 +##rgon +##orf +hesitant +Fruit +Application +Mathias +icons +##cell +Qin +interventions +##uron +punt +remnant +##rien +Ames +manifold +spines +floral +##zable +comrades +Fallen +orbits +Annals +hobby +Auditorium +implicated +researching +Pueblo +Ta +terminate +##pella +Rings +approximation +fuzzy +##ús +thriving +##ket +Conor +alarmed +etched +Cary +##rdon +Ally +##rington +Pay +mint +##hasa +##unity +##dman +##itate +Oceania +furrowed +trams +##aq +Wentworth +ventured +choreography +prototypes +Patel +mouthed +trenches +##licing +##yya +Lies +deception +##erve +##vations +Bertrand +earthquakes +##tography +Southwestern +##aja +token +Gupta +##yō +Beckett +initials +ironic +Tsar +subdued +shootout +sobbing +liar +Scandinavia +Souls +ch +therapist +trader +Regulation +Kali +busiest +##pation +32nd +Telephone +Vargas +##moky +##nose +##uge +Favorite +abducted +bonding +219 +255 +correction +mat +drown +fl +unbeaten +Pocket +Summers +Quite +rods +Percussion +##ndy +buzzing +cadet +Wilkes +attire +directory +utilities +naive +populous +Hendrix +##actor +disadvantage +1400 +Landon +Underworld +##ense +Occasionally +mercury +Davey +Morley +spa +wrestled +##vender +eclipse +Sienna +supplemented +thou +Stream +liturgical +##gall +##berries +##piration +1769 +Bucks +abandoning +##jutant +##nac +232 +venom +##31 +Roche +dotted +Currie +Córdoba +Milo +Sharif +divides +justification +prejudice +fortunate +##vide +##ābād +Rowe +inflammatory +##eld +avenue +Sources +##rimal +Messenger +Blanco 
+advocating +formulation +##pute +emphasizes +nut +Armored +##ented +nutrients +##tment +insistence +Martins +landowners +##RB +comparatively +headlines +snaps +##qing +Celebration +##mad +republican +##NE +Trace +##500 +1771 +proclamation +NRL +Rubin +Buzz +Weimar +##AG +199 +posthumous +##ental +##deacon +Distance +intensely +overheard +Arcade +diagonal +hazard +Giving +weekdays +##ù +Verdi +actresses +##hare +Pulling +##erries +##pores +catering +shortest +##ctors +##cure +##restle +##reta +##runch +##brecht +##uddin +Moments +senate +Feng +Prescott +##thest +218 +divisional +Bertie +sparse +surrounds +coupling +gravitational +werewolves +##lax +Rankings +##mated +##tries +Shia +##mart +##23 +##vocative +interfaces +morphology +newscast +##bide +inputs +solicitor +Olaf +cabinets +puzzles +##tains +Unified +##firmed +WA +solemn +##opy +Tito +Jaenelle +Neolithic +horseback +##ires +pharmacy +prevalence +##lint +Swami +##bush +##tudes +Philipp +mythical +divers +Scouting +aperture +progressively +##bay +##nio +bounce +Floor +##elf +Lucan +adulthood +helm +Bluff +Passage +Salvation +lemon +napkin +scheduling +##gets +Elements +Mina +Novak +stalled +##llister +Infrastructure +##nky +##tania +##uished +Katz +Norma +sucks +trusting +1765 +boilers +Accordingly +##hered +223 +Crowley +##fight +##ulo +Henrietta +##hani +pounder +surprises +##chor +##glia +Dukes +##cracy +##zier +##fs +Patriot +silicon +##VP +simulcast +telegraph +Mysore +cardboard +Len +##QL +Auguste +accordion +analytical +specify +ineffective +hunched +abnormal +Transylvania +##dn +##tending +Emilia +glittering +Maddy +##wana +1762 +External +Lecture +endorsement +Hernández +Anaheim +Ware +offences +##phorus +Plantation +popping +Bonaparte +disgusting +neared +##notes +Identity +heroin +nicely +##raverse +apron +congestion +##PR +padded +##fts +invaders +##came +freshly +Halle +endowed +fracture +ROM +##max +sediments +diffusion +dryly +##tara +Tam +Draw +Spin +Talon +Anthropology +##lify +nausea 
+##shirt +insert +Fresno +capitalist +indefinitely +apples +Gift +scooped +60s +Cooperative +mistakenly +##lover +murmur +##iger +Equipment +abusive +orphanage +##9th +##lterweight +##unda +Baird +ant +saloon +33rd +Chesapeake +##chair +##sound +##tend +chaotic +pornography +brace +##aret +heiress +SSR +resentment +Arbor +headmaster +##uren +unlimited +##with +##jn +Bram +Ely +Pokémon +pivotal +##guous +Database +Marta +Shine +stumbling +##ovsky +##skin +Henley +Polk +functioned +##layer +##pas +##udd +##MX +blackness +cadets +feral +Damian +##actions +2D +##yla +Apocalypse +##aic +inactivated +##china +##kovic +##bres +destroys +nap +Macy +sums +Madhya +Wisdom +rejects +##amel +60th +Cho +bandwidth +##sons +##obbing +##orama +Mutual +shafts +##estone +##rsen +accord +replaces +waterfront +##gonal +##rida +convictions +##ays +calmed +suppliers +Cummings +GMA +fearful +Scientist +Sinai +examines +experimented +Netflix +Enforcement +Scarlett +##lasia +Healthcare +##onte +Dude +inverted +##36 +##regation +##lidae +Munro +##angay +Airbus +overlapping +Drivers +lawsuits +bodily +##udder +Wanda +Effects +Fathers +##finery +##islav +Ridley +observatory +pod +##utrition +Electricity +landslide +##mable +##zoic +##imator +##uration +Estates +sleepy +Nickelodeon +steaming +irony +schedules +snack +spikes +Hmm +##nesia +##bella +##hibit +Greenville +plucked +Harald +##ono +Gamma +infringement +roaring +deposition +##pol +##orum +660 +seminal +passports +engagements +Akbar +rotated +##bina +##gart +Hartley +##lown +##truct +uttered +traumatic +Dex +##ôme +Holloway +MV +apartheid +##nee +Counter +Colton +OR +245 +Spaniards +Regency +Schedule +scratching +squads +verify +##alk +keyboardist +rotten +Forestry +aids +commemorating +##yed +##érie +Sting +##elly +Dai +##fers +##berley +##ducted +Melvin +cannabis +glider +##enbach +##rban +Costello +Skating +cartoonist +AN +audit +##pectator +distributing +226 +312 +interpreter +header +Alternatively +##ases +smug +##kumar +cabins 
+remastered +Connolly +Kelsey +LED +tentative +Check +Sichuan +shaved +##42 +Gerhard +Harvest +inward +##rque +Hopefully +hem +##34 +Typical +binds +wrath +Woodstock +forcibly +Fergus +##charged +##tured +prepares +amenities +penetration +##ghan +coarse +##oned +enthusiasts +##av +##twined +fielded +##cky +Kiel +##obia +470 +beers +tremble +youths +attendees +##cademies +##sex +Macon +communism +dir +##abi +Lennox +Wen +differentiate +jewel +##SO +activate +assert +laden +unto +Gillespie +Guillermo +accumulation +##GM +NGO +Rosenberg +calculating +drastically +##omorphic +peeled +Liège +insurgents +outdoors +##enia +Aspen +Sep +awakened +##eye +Consul +Maiden +insanity +##brian +furnace +Colours +distributions +longitudinal +syllables +##scent +Martian +accountant +Atkins +husbands +sewage +zur +collaborate +highlighting +##rites +##PI +colonization +nearer +##XT +dunes +positioning +Ku +multitude +luxurious +Volvo +linguistics +plotting +squared +##inder +outstretched +##uds +Fuji +ji +##feit +##ahu +##loat +##gado +##luster +##oku +América +##iza +Residents +vine +Pieces +DD +Vampires +##ová +smoked +harshly +spreads +##turn +##zhi +betray +electors +##settled +Considering +exploits +stamped +Dusty +enraged +Nairobi +##38 +intervened +##luck +orchestras +##lda +Hereford +Jarvis +calf +##itzer +##CH +salesman +Lovers +cigar +Angelica +doomed +heroine +##tible +Sanford +offenders +##ulously +articulated +##oam +Emanuel +Gardiner +Edna +Shu +gigantic +##stable +Tallinn +coasts +Maker +ale +stalking +##oga +##smus +lucrative +southbound +##changing +Reg +##lants +Schleswig +discount +grouping +physiological +##OH +##sun +Galen +assurance +reconcile +rib +scarlet +Thatcher +anarchist +##oom +Turnpike +##ceding +cocktail +Sweeney +Allegheny +concessions +oppression +reassuring +##poli +##ticus +##TR +##VI +##uca +##zione +directional +strikeouts +Beneath +Couldn +Kabul +##national +hydroelectric +##jit +Desire +##riot +enhancing +northbound +##PO +Ok +Routledge 
+volatile +Bernardo +Python +333 +ample +chestnut +automobiles +##innamon +##care +##hering +BWF +salaries +Turbo +acquisitions +##stituting +strengths +pilgrims +Ponce +Pig +Actors +Beard +sanitation +##RD +##mett +Telecommunications +worms +##idas +Juno +Larson +Ventura +Northeastern +weighs +Houghton +collaborating +lottery +##rano +Wonderland +gigs +##lmer +##zano +##edd +##nife +mixtape +predominant +tripped +##ruly +Alexei +investing +Belgarath +Brasil +hiss +##crat +##xham +Côte +560 +kilometer +##cological +analyzing +##As +engined +listener +##cakes +negotiation +##hisky +Santana +##lemma +IAAF +Seneca +skeletal +Covenant +Steiner +##lev +##uen +Neptune +retention +##upon +Closing +Czechoslovak +chalk +Navarre +NZ +##IG +##hop +##oly +##quatorial +##sad +Brewery +Conflict +Them +renew +turrets +disagree +Petra +Slave +##reole +adjustment +##dela +##regard +##sner +framing +stature +##rca +##sies +##46 +##mata +Logic +inadvertently +naturalist +spheres +towering +heightened +Dodd +rink +##fle +Keyboards +bulb +diver +ul +##tsk +Exodus +Deacon +España +Canadiens +oblique +thud +reigned +rug +Whitman +Dash +##iens +Haifa +pets +##arland +manually +dart +##bial +Sven +textiles +subgroup +Napier +graffiti +revolver +humming +Babu +protector +typed +Provinces +Sparta +Wills +subjective +##rella +temptation +##liest +FL +Sadie +manifest +Guangdong +Transfer +entertain +eve +recipes +##33 +Benedictine +retailer +##dence +establishes +##cluded +##rked +Ursula +##ltz +##lars +##rena +qualifiers +##curement +colt +depictions +##oit +Spiritual +differentiation +staffed +transitional +##lew +1761 +fatalities +##oan +Bayern +Northamptonshire +Weeks +##CU +Fife +capacities +hoarse +##latt +##ة +evidenced +##HD +##ographer +assessing +evolve +hints +42nd +streaked +##lve +Yahoo +##estive +##rned +##zas +baggage +Elected +secrecy +##champ +Character +Pen +Decca +cape +Bernardino +vapor +Dolly +counselor +##isers +Benin +##khar +##CR +notch +##thus +##racy +bounty +lend 
+grassland +##chtenstein +##dating +pseudo +golfer +simplest +##ceive +Lucivar +Triumph +dinosaur +dinosaurs +##šić +Seahawks +##nco +resorts +reelected +1766 +reproduce +universally +##OA +ER +tendencies +Consolidated +Massey +Tasmanian +reckless +##icz +##ricks +1755 +questionable +Audience +##lates +preseason +Quran +trivial +Haitian +Freeway +dialed +Appointed +Heard +ecosystems +##bula +hormones +Carbon +Rd +##arney +##working +Christoph +presiding +pu +##athy +Morrow +Dar +ensures +posing +remedy +EA +disclosed +##hui +##rten +rumours +surveying +##ficiency +Aziz +Jewel +Plays +##smatic +Bernhard +Christi +##eanut +##friend +jailed +##dr +govern +neighbour +butler +Acheron +murdering +oils +mac +Editorial +detectives +bolts +##ulon +Guitars +malaria +36th +Pembroke +Opened +##hium +harmonic +serum +##sio +Franks +fingernails +##gli +culturally +evolving +scalp +VP +deploy +uploaded +mater +##evo +Jammu +Spa +##icker +flirting +##cursions +Heidi +Majority +sprawled +##alytic +Zheng +bunker +##lena +ST +##tile +Jiang +ceilings +##ently +##ols +Recovery +dire +##good +Manson +Honestly +Montréal +1764 +227 +quota +Lakshmi +incentive +Accounting +##cilla +Eureka +Reaper +buzzed +##uh +courtroom +dub +##mberg +KC +Gong +Theodor +Académie +NPR +criticizing +protesting +##pired +##yric +abuses +fisheries +##minated +1767 +yd +Gemini +Subcommittee +##fuse +Duff +Wasn +Wight +cleaner +##tite +planetary +Survivor +Zionist +mounds +##rary +landfall +disruption +yielding +##yana +bids +unidentified +Garry +Ellison +Elmer +Fishing +Hayward +demos +modelling +##anche +##stick +caressed +entertained +##hesion +piers +Crimea +##mass +WHO +boulder +trunks +1640 +Biennale +Palestinians +Pursuit +##udes +Dora +contender +##dridge +Nanjing +##ezer +##former +##ibel +Whole +proliferation +##tide +##weiler +fuels +predictions +##ente +##onium +Filming +absorbing +Ramón +strangled +conveyed +inhabit +prostitutes +recession +bonded +clinched +##eak +##iji +##edar +Pleasure +Rite 
+Christy +Therapy +sarcasm +##collegiate +hilt +probation +Sarawak +coefficients +underworld +biodiversity +SBS +groom +brewing +dungeon +##claiming +Hari +turnover +##ntina +##omer +##opped +orthodox +styling +##tars +##ulata +priced +Marjorie +##eley +##abar +Yong +##tically +Crambidae +Hernandez +##ego +##rricular +##ark +##lamour +##llin +##augh +##tens +Advancement +Loyola +##4th +##hh +goin +marshes +Sardinia +##ša +Ljubljana +Singing +suspiciously +##hesive +Félix +Regarding +flap +stimulation +##raught +Apr +Yin +gaping +tighten +skier +##itas +##lad +##rani +264 +Ashes +Olson +Problems +Tabitha +##rading +balancing +sunrise +##ease +##iture +##ritic +Fringe +##iciency +Inspired +Linnaeus +PBA +disapproval +##kles +##rka +##tails +##urger +Disaster +Laboratories +apps +paradise +Aero +Came +sneaking +Gee +Beacon +ODI +commodity +Ellington +graphical +Gretchen +spire +##skaya +##trine +RTÉ +efficacy +plc +tribunal +##ytic +downhill +flu +medications +##kaya +widen +Sunrise +##nous +distinguishing +pawn +##BO +##irn +##ssing +##ν +Easton +##vila +Rhineland +##aque +defect +##saurus +Goose +Ju +##classified +Middlesbrough +shaping +preached +1759 +##erland +Ein +Hailey +musicals +##altered +Galileo +Hilda +Fighters +Lac +##ometric +295 +Leafs +Milano +##lta +##VD +##ivist +penetrated +Mask +Orchard +plaintiff +##icorn +Yvonne +##fred +outfielder +peek +Collier +Caracas +repealed +Bois +dell +restrict +Dolores +Hadley +peacefully +##LL +condom +Granny +Orders +sabotage +##toon +##rings +compass +marshal +gears +brigadier +dye +Yunnan +communicating +donate +emerald +vitamin +administer +Fulham +##classical +##llas +Buckinghamshire +Held +layered +disclosure +Akira +programmer +shrimp +Crusade +##ximal +Luzon +bakery +##cute +Garth +Citadel +uniquely +Curling +info +mum +Para +##ști +sleek +##ione +hey +Lantern +mesh +##lacing +##lizzard +##gade +prosecuted +Alba +Gilles +greedy +twists +##ogged +Viper +##kata +Appearances +Skyla +hymns +##pelled +curving 
+predictable +Grave +Watford +##dford +##liptic +##vary +Westwood +fluids +Models +statutes +##ynamite +1740 +##culate +Framework +Johanna +##gression +Vuelta +imp +##otion +##raga +##thouse +Ciudad +festivities +##love +Beyoncé +italics +##vance +DB +##haman +outs +Singers +##ueva +##urning +##51 +##ntiary +##mobile +285 +Mimi +emeritus +nesting +Keeper +Ways +##onal +##oux +Edmond +MMA +##bark +##oop +Hampson +##ñez +##rets +Gladstone +wreckage +Pont +Playboy +reluctance +##ná +apprenticeship +preferring +Value +originate +##wei +##olio +Alexia +##rog +Parachute +jammed +stud +Eton +vols +##ganized +1745 +straining +creep +indicators +##mán +humiliation +hinted +alma +tanker +##egation +Haynes +Penang +amazement +branched +rumble +##ddington +archaeologists +paranoid +expenditure +Absolutely +Musicians +banished +##fining +baptism +Joker +Persons +hemisphere +##tieth +##ück +flock +##xing +lbs +Kung +crab +##dak +##tinent +Regulations +barrage +parcel +##ós +Tanaka +##rsa +Natalia +Voyage +flaws +stepfather +##aven +##eological +Botanical +Minsk +##ckers +Cinderella +Feast +Loving +Previous +Shark +##took +barrister +collaborators +##nnes +Croydon +Graeme +Juniors +##7th +##formation +##ulos +##ák +£2 +##hwa +##rove +##ș +Whig +demeanor +Otago +##TH +##ooster +Faber +instructors +##ahl +##bha +emptied +##schen +saga +##lora +exploding +##rges +Crusaders +##caster +##uations +streaks +CBN +bows +insights +ka +1650 +diversion +LSU +Wingspan +##liva +Response +sanity +Producers +imitation +##fine +Lange +Spokane +splash +weed +Siberian +magnet +##rocodile +capitals +##rgus +swelled +Rani +Bells +Silesia +arithmetic +rumor +##hampton +favors +Weird +marketplace +##orm +tsunami +unpredictable +##citation +##ferno +Tradition +postwar +stench +succeeds +##roup +Anya +Users +oversized +totaling +pouch +##nat +Tripoli +leverage +satin +##cline +Bathurst +Lund +Niall +thereof +##quid +Bangor +barge +Animated +##53 +##alan +Ballard +utilizes +Done +ballistic +NDP 
+gatherings +##elin +##vening +Rockets +Sabrina +Tamara +Tribal +WTA +##citing +blinded +flux +Khalid +Una +prescription +##jee +Parents +##otics +##food +Silicon +cured +electro +perpendicular +intimacy +##rified +Lots +##ceiving +##powder +incentives +McKenna +##arma +##ounced +##rinkled +Alzheimer +##tarian +262 +Seas +##cam +Novi +##hout +##morphic +##hazar +##hul +##nington +Huron +Bahadur +Pirate +pursed +Griffiths +indicted +swap +refrain +##mulating +Lal +stomped +##Pad +##mamoto +Reef +disposed +plastered +weeping +##rato +Minas +hourly +tumors +##ruising +Lyle +##yper +##sol +Odisha +credibility +##Dowell +Braun +Graphic +lurched +muster +##nex +##ührer +##connected +##iek +##ruba +Carthage +Peck +maple +bursting +##lava +Enrico +rite +##jak +Moment +##skar +Styx +poking +Spartan +##urney +Hepburn +Mart +Titanic +newsletter +waits +Mecklenburg +agitated +eats +##dious +Chow +matrices +Maud +##sexual +sermon +234 +##sible +##lung +Qi +cemeteries +mined +sprinter +##ckett +coward +##gable +##hell +##thin +##FB +Contact +##hay +rainforest +238 +Hemisphere +boasts +##nders +##verance +##kat +Convent +Dunedin +Lecturer +lyricist +##bject +Iberian +comune +##pphire +chunk +##boo +thrusting +fore +informing +pistols +echoes +Tier +battleships +substitution +##belt +moniker +##charya +##lland +Thoroughbred +38th +##01 +##tah +parting +tongues +Cale +##seau +Unionist +modular +celebrates +preview +steamed +Bismarck +302 +737 +vamp +##finity +##nbridge +weaknesses +husky +##berman +absently +##icide +Craven +tailored +Tokugawa +VIP +syntax +Kazan +captives +doses +filtered +overview +Cleopatra +Conversely +stallion +Burger +Suez +Raoul +th +##reaves +Dickson +Nell +Rate +anal +colder +##sław +Arm +Semitic +##green +reflective +1100 +episcopal +journeys +##ours +##pository +##dering +residue +Gunn +##27 +##ntial +##crates +##zig +Astros +Renee +Emerald +##vili +connectivity +undrafted +Sampson +treasures +##kura +##theon +##vern +Destroyer +##iable +##ener +Frederic 
+briefcase +confinement +Bree +##WD +Athena +233 +Padres +Thom +speeding +##hali +Dental +ducks +Putin +##rcle +##lou +Asylum +##usk +dusk +pasture +Institutes +ONE +jack +##named +diplomacy +Intercontinental +Leagues +Towns +comedic +premature +##edic +##mona +##ories +trimmed +Charge +Cream +guarantees +Dmitry +splashed +Philosophical +tramway +##cape +Maynard +predatory +redundant +##gratory +##wry +sobs +Burgundy +edible +outfits +Handel +dazed +dangerously +idle +Operational +organizes +##sional +blackish +broker +weddings +##halt +Becca +McGee +##gman +protagonists +##pelling +Keynes +aux +stumble +##ordination +Nokia +reel +sexes +##woods +##pheric +##quished +##voc +##oir +##pathian +##ptus +##sma +##tating +##ê +fulfilling +sheath +##ayne +Mei +Ordinary +Collin +Sharpe +grasses +interdisciplinary +##OX +Background +##ignment +Assault +transforms +Hamas +Serge +ratios +##sik +swaying +##rcia +Rosen +##gant +##versible +cinematographer +curly +penny +Kamal +Mellon +Sailor +Spence +phased +Brewers +amassed +Societies +##ropriations +##buted +mythological +##SN +##byss +##ired +Sovereign +preface +Parry +##ife +altitudes +crossings +##28 +Crewe +southernmost +taut +McKinley +##owa +##tore +254 +##ckney +compiling +Shelton +##hiko +228 +Poll +Shepard +Labs +Pace +Carlson +grasping +##ов +Delaney +Winning +robotic +intentional +shattering +##boarding +##git +##grade +Editions +Reserves +ignorant +proposing +##hanna +cutter +Mongols +NW +##eux +Codex +Cristina +Daughters +Rees +forecast +##hita +NGOs +Stations +Beaux +Erwin +##jected +##EX +##trom +Schumacher +##hrill +##rophe +Maharaja +Oricon +##sul +##dynamic +##fighting +Ce +Ingrid +rumbled +Prospect +stairwell +Barnard +applause +complementary +##uba +grunt +##mented +Bloc +Carleton +loft +noisy +##hey +490 +contrasted +##inator +##rief +##centric +##fica +Cantonese +Blanc +Lausanne +License +artifact +##ddin +rot +Amongst +Prakash +RF +##topia +milestone +##vard +Winters +Mead +churchyard +Lulu +estuary 
+##ind +Cha +Infinity +Meadow +subsidies +##valent +CONCACAF +Ching +medicinal +navigate +Carver +Twice +abdominal +regulating +RB +toilets +Brewer +weakening +ambushed +##aut +##vignon +Lansing +unacceptable +reliance +stabbing +##mpo +##naire +Interview +##ested +##imed +bearings +##lts +Rashid +##iation +authenticity +vigorous +##frey +##uel +biologist +NFC +##rmaid +##wash +Makes +##aunt +##steries +withdrawing +##qa +Buccaneers +bleed +inclination +stain +##ilo +##ppel +Torre +privileged +cereal +trailers +alumnus +neon +Cochrane +Mariana +caress +##47 +##ients +experimentation +Window +convict +signaled +##YP +rower +Pharmacy +interacting +241 +Strings +dominating +kinase +Dinamo +Wire +pains +sensations +##suse +Twenty20 +##39 +spotlight +##hend +elemental +##pura +Jameson +Swindon +honoring +pained +##ediatric +##lux +Psychological +assemblies +ingredient +Martial +Penguins +beverage +Monitor +mysteries +##ION +emigration +mused +##sique +crore +AMC +Funding +Chinatown +Establishment +Finalist +enjoyable +1756 +##mada +##rams +NO +newborn +CS +comprehend +Invisible +Siemens +##acon +246 +contraction +##volving +##moration +##rok +montane +##ntation +Galloway +##llow +Verity +directorial +pearl +Leaning +##rase +Fernandez +swallowing +Automatic +Madness +haunting +paddle +##UE +##rrows +##vies +##zuki +##bolt +##iber +Fender +emails +paste +##lancing +hind +homestead +hopeless +##dles +Rockies +garlic +fatty +shrieked +##ismic +Gillian +Inquiry +Schultz +XML +##cius +##uld +Domesday +grenades +northernmost +##igi +Tbilisi +optimistic +##poon +Refuge +stacks +Bose +smash +surreal +Nah +Straits +Conquest +##roo +##weet +##kell +Gladys +CH +##lim +##vitation +Doctorate +NRHP +knocks +Bey +Romano +##pile +242 +Diamonds +strides +eclectic +Betsy +clade +##hady +##leashed +dissolve +moss +Suburban +silvery +##bria +tally +turtles +##uctive +finely +industrialist +##nary +Ernesto +oz +pact +loneliness +##hov +Tomb +multinational +risked +Layne +USL +ne +##quiries 
+Ad +Message +Kamen +Kristen +reefs +implements +##itative +educators +garments +gunshot +##essed +##rve +Montevideo +vigorously +Stamford +assemble +packaged +##same +état +Viva +paragraph +##eter +##wire +Stick +Navajo +MCA +##pressing +ensembles +ABA +##zor +##llus +Partner +raked +##BI +Iona +thump +Celeste +Kiran +##iscovered +##rith +inflammation +##arel +Features +loosened +##yclic +Deluxe +Speak +economical +Frankenstein +Picasso +showcased +##zad +##eira +##planes +##linear +##overs +monsoon +prosecutors +slack +Horses +##urers +Angry +coughing +##truder +Questions +##tō +##zak +challenger +clocks +##ieving +Newmarket +##acle +cursing +stimuli +##mming +##qualified +slapping +##vasive +narration +##kini +Advertising +CSI +alliances +mixes +##yes +covert +amalgamation +reproduced +##ardt +##gis +1648 +id +Annette +Boots +Champagne +Brest +Daryl +##emon +##jou +##llers +Mean +adaptive +technicians +##pair +##usal +Yoga +fronts +leaping +Jul +harvesting +keel +##44 +petitioned +##lved +yells +Endowment +proponent +##spur +##tised +##zal +Homes +Includes +##ifer +##oodoo +##rvette +awarding +mirrored +ransom +Flute +outlook +##ganj +DVDs +Sufi +frontman +Goddard +barren +##astic +Suicide +hillside +Harlow +Lau +notions +Amnesty +Homestead +##irt +GE +hooded +umpire +mustered +Catch +Masonic +##erd +Dynamics +Equity +Oro +Charts +Mussolini +populace +muted +accompaniment +##lour +##ndes +ignited +##iferous +##laced +##atch +anguish +registry +##tub +##hards +##neer +251 +Hooker +uncomfortably +##6th +##ivers +Catalina +MiG +giggling +1754 +Dietrich +Kaladin +pricing +##quence +Sabah +##lving +##nical +Gettysburg +Vita +Telecom +Worst +Palais +Pentagon +##brand +##chichte +Graf +unnatural +1715 +bio +##26 +Radcliffe +##utt +chatting +spices +##aus +untouched +##eper +Doll +turkey +Syndicate +##rlene +##JP +##roots +Como +clashed +modernization +1757 +fantasies +##iating +dissipated +Sicilian +inspect +sensible +reputed +##final +Milford +poised +RC +metabolic 
+Tobacco +Mecca +optimization +##heat +lobe +rabbits +NAS +geologist +##liner +Kilda +carpenter +nationalists +##brae +summarized +##venge +Designer +misleading +beamed +##meyer +Matrix +excuses +##aines +##biology +401 +Moose +drafting +Sai +##ggle +Comprehensive +dripped +skate +##WI +##enan +##ruk +narrower +outgoing +##enter +##nounce +overseen +##structure +travellers +banging +scarred +##thing +##arra +Ebert +Sometime +##nated +BAFTA +Hurricanes +configurations +##MLL +immortality +##heus +gothic +##mpest +clergyman +viewpoint +Maxim +Instituto +emitted +quantitative +1689 +Consortium +##rsk +Meat +Tao +swimmers +Shaking +Terence +mainline +##linity +Quantum +##rogate +Nair +banquet +39th +reprised +lagoon +subdivisions +synonymous +incurred +password +sprung +##vere +Credits +Petersen +Faces +##vu +statesman +Zombie +gesturing +##going +Sergey +dormant +possessive +totals +southward +Ángel +##odies +HM +Mariano +Ramirez +Wicked +impressions +##Net +##cap +##ème +Transformers +Poker +RIAA +Redesignated +##chuk +Harcourt +Peña +spacious +tinged +alternatively +narrowing +Brigham +authorization +Membership +Zeppelin +##amed +Handball +steer +##orium +##rnal +##rops +Committees +endings +##MM +##yung +ejected +grams +##relli +Birch +Hilary +Stadion +orphan +clawed +##kner +Motown +Wilkins +ballads +outspoken +##ancipation +##bankment +##cheng +Advances +harvested +novelty +ineligible +oversees +##´s +obeyed +inevitably +Kingdoms +burying +Fabian +relevance +Tatiana +##MCA +sarcastic +##onda +Akron +229 +sandwiches +Adobe +Maddox +##azar +Hunting +##onized +Smiling +##tology +Juventus +Leroy +Poets +attach +lo +##rly +##film +Structure +##igate +olds +projections +SMS +outnumbered +##tase +judiciary +paramilitary +playfully +##rsing +##tras +Chico +Vin +informally +abandonment +##russ +Baroness +injuring +octagonal +deciduous +##nea +##olm +Hz +Norwood +poses +Marissa +alerted +willed +##KS +Dino +##ddler +##vani +Barbie +Thankfully +625 +bicycles +shimmering 
+##tinuum +##wolf +Chesterfield +##idy +##urgency +Knowles +sweetly +Ventures +##ponents +##valence +Darryl +Powerplant +RAAF +##pec +Kingsley +Parramatta +penetrating +spectacle +##inia +Marlborough +residual +compatibility +hike +Underwood +depleted +ministries +##odus +##ropriation +rotting +Faso +##inn +Happiness +Lille +Suns +cookie +rift +warmly +##lvin +Bugs +Gotham +Gothenburg +Properties +##seller +##ubi +Created +MAC +Noelle +Requiem +Ulysses +##ails +franchises +##icious +##rwick +celestial +kinetic +720 +STS +transmissions +amplitude +forums +freeing +reptiles +tumbling +##continent +##rising +##tropy +physiology +##uster +Loves +bodied +neutrality +Neumann +assessments +Vicky +##hom +hampered +##uku +Custom +timed +##eville +##xious +elastic +##section +rig +stilled +shipment +243 +artworks +boulders +Bournemouth +##hly +##LF +##linary +rumored +##bino +##drum +Chun +Freiburg +##dges +Equality +252 +Guadalajara +##sors +##taire +Roach +cramped +##ultural +Logistics +Punch +fines +Lai +caravan +##55 +lame +Collector +pausing +315 +migrant +hawk +signalling +##erham +##oughs +Demons +surfing +Rana +insisting +Wien +adolescent +##jong +##rera +##umba +Regis +brushes +##iman +residues +storytelling +Consider +contrasting +regeneration +##elling +##hlete +afforded +reactors +costing +##biotics +##gat +##евич +chanting +secondly +confesses +##ikos +##uang +##ronological +##− +Giacomo +##eca +vaudeville +weeds +rejecting +revoked +affluent +fullback +progresses +geologic +proprietor +replication +gliding +recounted +##bah +##igma +Flow +ii +newcomer +##lasp +##miya +Candace +fractured +interiors +confidential +Inverness +footing +##robe +Coordinator +Westphalia +jumper +##chism +dormitory +##gno +281 +acknowledging +leveled +##éra +Algiers +migrate +Frog +Rare +##iovascular +##urous +DSO +nomadic +##iera +woken +lifeless +##graphical +##ifications +Dot +Sachs +crow +nmi +Tacoma +Weight +mushroom +RS +conditioned +##zine +Tunisian +altering +##mizing +Handicap 
+Patti +Monsieur +clicking +gorge +interrupting +##powerment +drawers +Serra +##icides +Specialist +##itte +connector +worshipped +##ask +consoles +tags +##iler +glued +##zac +fences +Bratislava +honeymoon +313 +A2 +disposition +Gentleman +Gilmore +glaciers +##scribed +Calhoun +convergence +Aleppo +shortages +##43 +##orax +##worm +##codes +##rmal +neutron +##ossa +Bloomberg +Salford +periodicals +##ryan +Slayer +##ynasties +credentials +##tista +surveyor +File +stinging +unnoticed +Medici +ecstasy +espionage +Jett +Leary +circulating +bargaining +concerto +serviced +37th +HK +##fueling +Delilah +Marcia +graded +##join +Kaplan +feasible +##nale +##yt +Burnley +dreadful +ministerial +Brewster +Judah +##ngled +##rrey +recycled +Iroquois +backstage +parchment +##numbered +Kern +Motorsports +Organizations +##mini +Seems +Warrington +Dunbar +Ezio +##eor +paralyzed +Ara +yeast +##olis +cheated +reappeared +banged +##ymph +##dick +Lyndon +glide +Mat +##natch +Hotels +Household +parasite +irrelevant +youthful +##smic +##tero +##anti +2d +Ignacio +squash +##nets +shale +##اد +Abrams +##oese +assaults +##dier +##otte +Swamp +287 +Spurs +##economic +Fargo +auditioned +##mé +Haas +une +abbreviation +Turkic +##tisfaction +favorites +specials +##lial +Enlightenment +Burkina +##vir +Comparative +Lacrosse +elves +##lerical +##pear +Borders +controllers +##villa +excelled +##acher +##varo +camouflage +perpetual +##ffles +devoid +schooner +##bered +##oris +Gibbons +Lia +discouraged +sue +##gnition +Excellent +Layton +noir +smack +##ivable +##evity +##lone +Myra +weaken +weaponry +##azza +Shake +backbone +Certified +clown +occupational +caller +enslaved +soaking +Wexford +perceive +shortlisted +##pid +feminism +Bari +Indie +##avelin +##ldo +Hellenic +Hundreds +Savings +comedies +Honors +Mohawk +Told +coded +Incorporated +hideous +trusts +hose +Calais +Forster +Gabon +Internationale +AK +Colour +##UM +##heist +McGregor +localized +##tronomy +Darrell +##iara +squirrel +freaked +##eking 
+##manned +##ungen +radiated +##dua +commence +Donaldson +##iddle +MR +SAS +Tavern +Teenage +admissions +Instruments +##ilizer +Konrad +contemplated +##ductor +Jing +Reacher +recalling +Dhabi +emphasizing +illumination +##tony +legitimacy +Goethe +Ritter +McDonnell +Polar +Seconds +aspiring +derby +tunic +##rmed +outlines +Changing +distortion +##cter +Mechanics +##urly +##vana +Egg +Wolverine +Stupid +centralized +knit +##Ms +Saratoga +Ogden +storylines +##vres +lavish +beverages +##grarian +Kyrgyzstan +forcefully +superb +Elm +Thessaloniki +follower +Plants +slang +trajectory +Nowadays +Bengals +Ingram +perch +coloring +carvings +doubtful +##aph +##gratulations +##41 +Curse +253 +nightstand +Campo +Meiji +decomposition +##giri +McCormick +Yours +##amon +##bang +Texans +injunction +organise +periodical +##peculative +oceans +##aley +Success +Lehigh +##guin +1730 +Davy +allowance +obituary +##tov +treasury +##wayne +euros +readiness +systematically +##stered +##igor +##xen +##cliff +##lya +Send +##umatic +Celtics +Judiciary +425 +propagation +rebellious +##ims +##lut +Dal +##ayman +##cloth +Boise +pairing +Waltz +torment +Hatch +aspirations +diaspora +##hame +Rank +237 +Including +Muir +chained +toxicity +Université +##aroo +Mathews +meadows +##bio +Editing +Khorasan +##them +##ahn +##bari +##umes +evacuate +##sium +gram +kidnap +pinning +##diation +##orms +beacon +organising +McGrath +##ogist +Qur +Tango +##ceptor +##rud +##cend +##cie +##jas +##sided +Tuscany +Venture +creations +exhibiting +##rcerer +##tten +Butcher +Divinity +Pet +Whitehead +falsely +perished +handy +Moines +cyclists +synthesizers +Mortal +notoriety +##ronic +Dialogue +expressive +uk +Nightingale +grimly +vineyards +Driving +relentless +compiler +##district +##tuated +Hades +medicines +objection +Answer +Soap +Chattanooga +##gogue +Haryana +Parties +Turtle +##ferred +explorers +stakeholders +##aar +##rbonne +tempered +conjecture +##tee +##hur +Reeve +bumper +stew +##church +##generate 
+##ilitating +##chanized +##elier +##enne +translucent +##lows +Publisher +evangelical +inherit +##rted +247 +SmackDown +bitterness +lesions +##worked +mosques +wed +##lashes +Ng +Rebels +booking +##nail +Incident +Sailing +yo +confirms +Chaplin +baths +##kled +modernist +pulsing +Cicero +slaughtered +boasted +##losure +zipper +##hales +aristocracy +halftime +jolt +unlawful +Marching +sustaining +Yerevan +bracket +ram +Markus +##zef +butcher +massage +##quisite +Leisure +Pizza +collapsing +##lante +commentaries +scripted +##disciplinary +##sused +eroded +alleging +vase +Chichester +Peacock +commencement +dice +hotter +poisonous +executions +##occo +frost +fielding +vendor +Counts +Troops +maize +Divisional +analogue +shadowy +Nuevo +Ville +radiating +worthless +Adriatic +Buy +blaze +brutally +horizontally +longed +##matical +federally +Rolf +Root +exclude +rag +agitation +Lounge +astonished +##wirl +Impossible +transformations +##IVE +##ceded +##slav +downloaded +fucked +Egyptians +Welles +##ffington +U2 +befriended +radios +##jid +archaic +compares +##ccelerator +##imated +##tosis +Hung +Scientists +Thousands +geographically +##LR +Macintosh +fluorescent +##ipur +Wehrmacht +##BR +##firmary +Chao +##ague +Boyer +##grounds +##hism +##mento +##taining +infancy +##cton +510 +Boca +##loy +1644 +ben +dong +stresses +Sweat +expressway +graders +ochreous +nets +Lawn +thirst +Uruguayan +satisfactory +##tracts +baroque +rusty +##ław +Shen +Gdańsk +chickens +##graving +Hodge +Papal +SAT +bearer +##ogo +##rger +merits +Calendar +Highest +Skills +##ortex +Roberta +paradigm +recounts +frigates +swamps +unitary +##oker +balloons +Hawthorne +Muse +spurred +advisors +reclaimed +stimulate +fibre +pat +repeal +##dgson +##iar +##rana +anthropologist +descends +flinch +reared +##chang +##eric +##lithic +commissioning +##cumenical +##lume +##rchen +Wolff +##tsky +Eurasian +Nepali +Nightmare +ZIP +playback +##latz +##vington +Warm +##75 +Martina +Rollins +Saetan +Variations +sorting 
+##م +530 +Joaquin +Ptolemy +thinner +##iator +##pticism +Cebu +Highlanders +Linden +Vanguard +##SV +##mor +##ulge +ISSN +cartridges +repression +Étienne +311 +Lauderdale +commodities +null +##rb +1720 +gearbox +##reator +Ang +Forgotten +dubious +##rls +##dicative +##phate +Groove +Herrera +##çais +Collections +Maximus +##published +Fell +Qualification +filtering +##tized +Roe +hazards +##37 +##lative +##tröm +Guadalupe +Tajikistan +Preliminary +fronted +glands +##paper +##iche +##iding +Cairns +rallies +Location +seduce +##mple +BYU +##itic +##FT +Carmichael +Prentice +songwriters +forefront +Physicians +##rille +##zee +Preparatory +##cherous +UV +##dized +Navarro +misses +##nney +Inland +resisting +##sect +Hurt +##lino +galaxies +##raze +Institutions +devote +##lamp +##ciating +baron +##bracing +Hess +operatic +##CL +##ος +Chevalier +Guiana +##lattered +Fed +##cuted +##smo +Skull +denies +236 +Waller +##mah +Sakura +mole +nominate +sermons +##bering +widowed +##röm +Cavendish +##struction +Nehru +Revelation +doom +Gala +baking +Nr +Yourself +banning +Individuals +Sykes +orchestrated +630 +Phone +steered +620 +specialising +starvation +##AV +##alet +##upation +seductive +##jects +##zure +Tolkien +Benito +Wizards +Submarine +dictator +Duo +Caden +approx +basins +##nc +shrink +##icles +##sponsible +249 +mit +outpost +##bayashi +##rouse +##tl +Jana +Lombard +RBIs +finalized +humanities +##function +Honorable +tomato +##iot +Pie +tee +##pect +Beaufort +Ferris +bucks +##graduate +##ocytes +Directory +anxiously +##nating +flanks +##Ds +virtues +##believable +Grades +criterion +manufactures +sourced +##balt +##dance +##tano +Ying +##BF +##sett +adequately +blacksmith +totaled +trapping +expanse +Historia +Worker +Sense +ascending +housekeeper +##oos +Crafts +Resurrection +##verty +encryption +##aris +##vat +##pox +##runk +##iability +gazes +spying +##ths +helmets +wired +##zophrenia +Cheung +WR +downloads +stereotypes +239 +Lucknow +bleak +Bragg +hauling +##haft 
+prohibit +##ermined +##castle +barony +##hta +Typhoon +antibodies +##ascism +Hawthorn +Kurdistan +Minority +Gorge +Herr +appliances +disrupt +Drugs +Lazarus +##ilia +##ryo +##tany +Gotta +Masovian +Roxy +choreographed +##rissa +turbulent +##listed +Anatomy +exiting +##det +##isław +580 +Kaufman +sage +##apa +Symposium +##rolls +Kaye +##ptera +##rocław +jerking +##menclature +Guo +M1 +resurrected +trophies +##lard +Gathering +nestled +serpent +Dow +reservoirs +Claremont +arbitration +chronicle +eki +##arded +##zers +##mmoth +Congregational +Astronomical +NE +RA +Robson +Scotch +modelled +slashed +##imus +exceeds +##roper +##utile +Laughing +vascular +superficial +##arians +Barclay +Caucasian +classmate +sibling +Kimberly +Shreveport +##ilde +##liche +Cheney +Deportivo +Veracruz +berries +##lase +Bed +MI +Anatolia +Mindanao +broadband +##olia +##arte +##wab +darts +##immer +##uze +believers +ordinance +violate +##wheel +##ynth +Alongside +Coupe +Hobbs +arrondissement +earl +townland +##dote +##lihood +##sla +Ghosts +midfield +pulmonary +##eno +cues +##gol +##zda +322 +Siena +Sultanate +Bradshaw +Pieter +##thical +Raceway +bared +competence +##ssent +Bet +##urer +##ła +Alistair +Göttingen +appropriately +forge +##osterone +##ugen +DL +345 +convoys +inventions +##resses +##cturnal +Fay +Integration +slash +##roats +Widow +barking +##fant +1A +Hooper +##cona +##runched +unreliable +##emont +##esign +##stabulary +##stop +Journalists +bony +##iba +##trata +##ège +horrific +##bish +Jocelyn +##rmon +##apon +##cier +trainers +##ulatory +1753 +BR +corpus +synthesized +##bidden +##rafford +Elgin +##entry +Doherty +clockwise +##played +spins +##ample +##bley +Cope +constructions +seater +warlord +Voyager +documenting +fairies +##viator +Lviv +jewellery +suites +##gold +Maia +NME +##eavor +##kus +Eugène +furnishings +##risto +MCC +Metropolis +Older +Telangana +##mpus +amplifier +supervising +1710 +buffalo +cushion +terminating +##powering +steak +Quickly +contracting +dem 
+sarcastically +Elsa +##hein +bastards +narratives +Takes +304 +composure +typing +variance +##ifice +Softball +##rations +McLaughlin +gaped +shrines +##hogany +Glamorgan +##icle +##nai +##ntin +Fleetwood +Woodland +##uxe +fictitious +shrugs +##iper +BWV +conform +##uckled +Launch +##ductory +##mized +Tad +##stituted +##free +Bel +Chávez +messing +quartz +##iculate +##folia +##lynn +ushered +##29 +##ailing +dictated +Pony +##opsis +precinct +802 +Plastic +##ughter +##uno +##porated +Denton +Matters +SPD +hating +##rogen +Essential +Deck +Dortmund +obscured +##maging +Earle +##bred +##ittle +##ropolis +saturated +##fiction +##ression +Pereira +Vinci +mute +warehouses +##ún +biographies +##icking +sealing +##dered +executing +pendant +##wives +murmurs +##oko +substrates +symmetrical +Susie +##mare +Yusuf +analogy +##urage +Lesley +limitation +##rby +##ío +disagreements +##mise +embroidered +nape +unarmed +Sumner +Stores +dwell +Wilcox +creditors +##rivatization +##shes +##amia +directs +recaptured +scouting +McGuire +cradle +##onnell +Sato +insulin +mercenary +tolerant +Macquarie +transitions +cradled +##berto +##ivism +##yotes +FF +Ke +Reach +##dbury +680 +##bill +##oja +##sui +prairie +##ogan +reactive +##icient +##rits +Cyclone +Sirius +Survival +Pak +##coach +##trar +halves +Agatha +Opus +contrasts +##jection +ominous +##iden +Baylor +Woodrow +duct +fortification +intercourse +##rois +Colbert +envy +##isi +Afterward +geared +##flections +accelerate +##lenching +Witness +##rrer +Angelina +Material +assertion +misconduct +Nix +cringed +tingling +##eti +##gned +Everest +disturb +sturdy +##keepers +##vied +Profile +heavenly +##kova +##victed +translating +##sses +316 +Invitational +Mention +martyr +##uristic +Barron +hardness +Nakamura +405 +Genevieve +reflections +##falls +jurist +##LT +Pyramid +##yme +Shoot +heck +linguist +##tower +Ives +superiors +##leo +Achilles +##phological +Christophe +Padma +precedence +grassy +Oral +resurrection +##itting +clumsy +##lten 
+##rue +huts +##stars +Equal +##queduct +Devin +Gaga +diocesan +##plating +##upe +##graphers +Patch +Scream +hail +moaning +tracts +##hdi +Examination +outsider +##ergic +##oter +Archipelago +Havilland +greenish +tilting +Aleksandr +Konstantin +warship +##emann +##gelist +##ought +billionaire +##blivion +321 +Hungarians +transplant +##jured +##fters +Corbin +autism +pitchers +Garner +thence +Scientology +transitioned +integrating +repetitive +##dant +Rene +vomit +##burne +1661 +Researchers +Wallis +insulted +wavy +##wati +Ewing +excitedly +##kor +frescoes +injustice +##achal +##lumber +##úl +novella +##sca +Liv +##enstein +##river +monstrous +topping +downfall +looming +sinks +trillion +##pont +Effect +##phi +##urley +Sites +catchment +##H1 +Hopper +##raiser +1642 +Maccabi +lance +##chia +##sboro +NSA +branching +retorted +tensor +Immaculate +drumming +feeder +##mony +Dyer +homicide +Temeraire +fishes +protruding +skins +orchards +##nso +inlet +ventral +##finder +Asiatic +Sul +1688 +Melinda +assigns +paranormal +gardening +Tau +calming +##inge +##crow +regimental +Nik +fastened +correlated +##gene +##rieve +Sick +##minster +##politan +hardwood +hurled +##ssler +Cinematography +rhyme +Montenegrin +Packard +debating +##itution +Helens +Trick +Museums +defiance +encompassed +##EE +##TU +##nees +##uben +##ünster +##nosis +435 +Hagen +cinemas +Corbett +commended +##fines +##oman +bosses +ripe +scraping +##loc +filly +Saddam +pointless +Faust +Orléans +Syriac +##♭ +longitude +##ropic +Alfa +bliss +gangster +##ckling +SL +blending +##eptide +##nner +bends +escorting +##bloid +##quis +burials +##sle +##è +Ambulance +insults +##gth +Antrim +unfolded +##missible +splendid +Cure +warily +Saigon +Waste +astonishment +boroughs +##VS +##dalgo +##reshing +##usage +rue +marital +versatile +unpaid +allotted +bacterium +##coil +##cue +Dorothea +IDF +##location +##yke +RPG +##tropical +devotees +liter +##pree +Johnstone +astronaut +attends +pollen +periphery +doctrines +meta 
+showered +##tyn +GO +Huh +laude +244 +Amar +Christensen +Ping +Pontifical +Austen +raiding +realities +##dric +urges +##dek +Cambridgeshire +##otype +Cascade +Greenberg +Pact +##cognition +##aran +##urion +Riot +mimic +Eastwood +##imating +reversal +##blast +##henian +Pitchfork +##sunderstanding +Staten +WCW +lieu +##bard +##sang +experimenting +Aquino +##lums +TNT +Hannibal +catastrophic +##lsive +272 +308 +##otypic +41st +Highways +aggregator +##fluenza +Featured +Reece +dispatch +simulated +##BE +Communion +Vinnie +hardcover +inexpensive +til +##adores +groundwater +kicker +blogs +frenzy +##wala +dealings +erase +Anglia +##umour +Hapoel +Marquette +##raphic +##tives +consult +atrocities +concussion +##érard +Decree +ethanol +##aen +Rooney +##chemist +##hoot +1620 +menacing +Schuster +##bearable +laborers +sultan +Juliana +erased +onstage +##ync +Eastman +##tick +hushed +##yrinth +Lexie +Wharton +Lev +##PL +Testing +Bangladeshi +##bba +##usions +communicated +integers +internship +societal +##odles +Loki +ET +Ghent +broadcasters +Unix +##auer +Kildare +Yamaha +##quencing +##zman +chilled +##rapped +##uant +Duval +sentiments +Oliveira +packets +Horne +##rient +Harlan +Mirage +invariant +##anger +##tensive +flexed +sweetness +##wson +alleviate +insulting +limo +Hahn +##llars +##hesia +##lapping +buys +##oaming +mocked +pursuits +scooted +##conscious +##ilian +Ballad +jackets +##kra +hilly +##cane +Scenic +McGraw +silhouette +whipping +##roduced +##wark +##chess +##rump +Lemon +calculus +demonic +##latine +Bharatiya +Govt +Que +Trilogy +Ducks +Suit +stairway +##ceipt +Isa +regulator +Automobile +flatly +##buster +##lank +Spartans +topography +Tavi +usable +Chartered +Fairchild +##sance +##vyn +Digest +nuclei +typhoon +##llon +Alvarez +DJs +Grimm +authoritative +firearm +##chschule +Origins +lair +unmistakable +##xial +##cribing +Mouth +##genesis +##shū +##gaon +##ulter +Jaya +Neck +##UN +##oing +##static +relativity +##mott +##utive +##esan +##uveau +BT +salts 
+##roa +Dustin +preoccupied +Novgorod +##asus +Magnum +tempting +##histling +##ilated +Musa +##ghty +Ashland +pubs +routines +##etto +Soto +257 +Featuring +Augsburg +##alaya +Bit +loomed +expects +##abby +##ooby +Auschwitz +Pendleton +vodka +##sent +rescuing +systemic +##inet +##leg +Yun +applicant +revered +##nacht +##ndas +Muller +characterization +##patient +##roft +Carole +##asperated +Amiga +disconnected +gel +##cologist +Patriotic +rallied +assign +veterinary +installing +##cedural +258 +Jang +Parisian +incarcerated +stalk +##iment +Jamal +McPherson +Palma +##oken +##viation +512 +Rourke +irrational +##rippled +Devlin +erratic +##NI +##payers +Ni +engages +Portal +aesthetics +##rrogance +Milne +assassins +##rots +335 +385 +Cambodian +Females +fellows +si +##block +##otes +Jayne +Toro +flutter +##eera +Burr +##lanche +relaxation +##fra +Fitzroy +##undy +1751 +261 +comb +conglomerate +ribbons +veto +##Es +casts +##ege +1748 +Ares +spears +spirituality +comet +##nado +##yeh +Veterinary +aquarium +yer +Councils +##oked +##ynamic +Malmö +remorse +auditions +drilled +Hoffmann +Moe +Nagoya +Yacht +##hakti +##race +##rrick +Talmud +coordinating +##EI +##bul +##his +##itors +##ligent +##uerra +Narayan +goaltender +taxa +##asures +Det +##mage +Infinite +Maid +bean +intriguing +##cription +gasps +socket +##mentary +##reus +sewing +transmitting +##different +##furbishment +##traction +Grimsby +sprawling +Shipyard +##destine +##hropic +##icked +trolley +##agi +##lesh +Josiah +invasions +Content +firefighters +intro +Lucifer +subunit +Sahib +Myrtle +inhibitor +maneuvers +##teca +Wrath +slippery +##versing +Shoes +##dial +##illiers +##luded +##mmal +##pack +handkerchief +##edestal +##stones +Fusion +cumulative +##mell +##cacia +##rudge +##utz +foe +storing +swiped +##meister +##orra +batter +strung +##venting +##kker +Doo +Taste +immensely +Fairbanks +Jarrett +Boogie +1746 +mage +Kick +legislators +medial +##ilon +##logies +##ranton +Hybrid +##uters +Tide +deportation +Metz 
+##secration +##virus +UFO +##fell +##orage +##raction +##rrigan +1747 +fabricated +##BM +##GR +##rter +muttering +theorist +##tamine +BMG +Kincaid +solvent +##azed +Thin +adorable +Wendell +ta +##viour +pulses +##pologies +counters +exposition +sewer +Luciano +Clancy +##angelo +##riars +Showtime +observes +frankly +##oppy +Bergman +lobes +timetable +##bri +##uest +FX +##dust +##genus +Glad +Helmut +Meridian +##besity +##ontaine +Revue +miracles +##titis +PP +bluff +syrup +307 +Messiah +##erne +interfering +picturesque +unconventional +dipping +hurriedly +Kerman +248 +Ethnic +Toward +acidic +Harrisburg +##65 +intimidating +##aal +Jed +Pontiac +munitions +##nchen +growling +mausoleum +##ération +##wami +Cy +aerospace +caucus +Doing +##around +##miring +Cuthbert +##poradic +##rovisation +##wth +evaluating +##scraper +Belinda +owes +##sitic +##thermal +##fast +economists +##lishing +##uerre +##ân +credible +##koto +Fourteen +cones +##ebrates +bookstore +towels +##phony +Appearance +newscasts +##olin +Karin +Bingham +##elves +1680 +306 +disks +##lston +##secutor +Levant +##vout +Micro +snuck +##ogel +##racker +Exploration +drastic +##kening +Elsie +endowment +##utnant +Blaze +##rrosion +leaking +45th +##rug +##uernsey +760 +Shapiro +cakes +##ehan +##mei +##ité +##kla +repetition +successively +Friendly +Île +Koreans +Au +Tirana +flourish +Spirits +Yao +reasoned +##leam +Consort +cater +marred +ordeal +supremacy +##ritable +Paisley +euro +healer +portico +wetland +##kman +restart +##habilitation +##zuka +##Script +emptiness +communion +##CF +##inhabited +##wamy +Casablanca +pulsed +##rrible +##safe +395 +Dual +Terrorism +##urge +##found +##gnolia +Courage +patriarch +segregated +intrinsic +##liography +##phe +PD +convection +##icidal +Dharma +Jimmie +texted +constituents +twitch +##calated +##mitage +##ringing +415 +milling +##geons +Armagh +Geometridae +evergreen +needy +reflex +template +##pina +Schubert +##bruck +##icted +##scher +##wildered +1749 +Joanne +clearer 
+##narl +278 +Print +automation +consciously +flashback +occupations +##ests +Casimir +differentiated +policing +repay +##aks +##gnesium +Evaluation +commotion +##CM +##smopolitan +Clapton +mitochondrial +Kobe +1752 +Ignoring +Vincenzo +Wet +bandage +##rassed +##unate +Maris +##eted +##hetical +figuring +##eit +##nap +leopard +strategically +##reer +Fen +Iain +##ggins +##pipe +Matteo +McIntyre +##chord +##feng +Romani +asshole +flopped +reassure +Founding +Styles +Torino +patrolling +##erging +##ibrating +##ructural +sincerity +##ät +##teacher +Juliette +##cé +##hog +##idated +##span +Winfield +##fender +##nast +##pliant +1690 +Bai +Je +Saharan +expands +Bolshevik +rotate +##root +Britannia +Severn +##cini +##gering +##say +sly +Steps +insertion +rooftop +Piece +cuffs +plausible +##zai +Provost +semantic +##data +##vade +##cimal +IPA +indictment +Libraries +flaming +highlands +liberties +##pio +Elders +aggressively +##pecific +Decision +pigeon +nominally +descriptive +adjustments +equestrian +heaving +##mour +##dives +##fty +##yton +intermittent +##naming +##sets +Calvert +Casper +Tarzan +##kot +Ramírez +##IB +##erus +Gustavo +Roller +vaulted +##solation +##formatics +##tip +Hunger +colloquially +handwriting +hearth +launcher +##idian +##ilities +##lind +##locating +Magdalena +Soo +clubhouse +##kushima +##ruit +Bogotá +Organic +Worship +##Vs +##wold +upbringing +##kick +groundbreaking +##urable +##ván +repulsed +##dira +##ditional +##ici +melancholy +##bodied +##cchi +404 +concurrency +H₂O +bouts +##gami +288 +Leto +troll +##lak +advising +bundled +##nden +lipstick +littered +##leading +##mogeneous +Experiment +Nikola +grove +##ogram +Mace +##jure +cheat +Annabelle +Tori +lurking +Emery +Walden +##riz +paints +Markets +brutality +overrun +##agu +##sat +din +ostensibly +Fielding +flees +##eron +Pound +ornaments +tornadoes +##nikov +##organisation +##reen +##Works +##ldred +##olten +##stillery +soluble +Mata +Grimes +Léon +##NF +coldly +permitting +##inga +##reaked 
+Agents +hostess +##dl +Dyke +Kota +avail +orderly +##saur +##sities +Arroyo +##ceps +##egro +Hawke +Noctuidae +html +seminar +##ggles +##wasaki +Clube +recited +##sace +Ascension +Fitness +dough +##ixel +Nationale +##solidate +pulpit +vassal +570 +Annapolis +bladder +phylogenetic +##iname +convertible +##ppan +Comet +paler +##definite +Spot +##dices +frequented +Apostles +slalom +##ivision +##mana +##runcated +Trojan +##agger +##iq +##league +Concept +Controller +##barian +##curate +##spersed +##tring +engulfed +inquired +##hmann +286 +##dict +##osy +##raw +MacKenzie +su +##ienced +##iggs +##quitaine +bisexual +##noon +runways +subsp +##! +##" +### +##$ +##% +##& +##' +##( +##) +##* +##+ +##, +##- +##. +##/ +##: +##; +##< +##= +##> +##? +##@ +##[ +##\ +##] +##^ +##_ +##` +##{ +##| +##} +##~ +##¡ +##¢ +##£ +##¥ +##§ +##¨ +##© +##ª +##« +##¬ +##® +##± +##´ +##µ +##¶ +##· +##¹ +##º +##» +##¼ +##¾ +##¿ +##À +##Á +## +##Ä +##Å +##Æ +##Ç +##È +##É +##Í +##Î +##Ñ +##Ó +##Ö +##× +##Ø +##Ú +##Ü +##Þ +##â +##ã +##æ +##ç +##î +##ï +##ð +##ñ +##ô +##õ +##÷ +##û +##þ +##ÿ +##Ā +##ą +##Ć +##Č +##ď +##Đ +##đ +##ē +##ė +##ę +##ě +##ğ +##ġ +##Ħ +##ħ +##ĩ +##Ī +##İ +##ļ +##Ľ +##ľ +##Ł +##ņ +##ň +##ŋ +##Ō +##ŏ +##ő +##Œ +##œ +##ř +##Ś +##ś +##Ş +##Š +##Ţ +##ţ +##ť +##ũ +##ŭ +##ů +##ű +##ų +##ŵ +##ŷ +##ź +##Ż +##ż +##Ž +##ž +##Ə +##ƒ +##ơ +##ư +##ǎ +##ǐ +##ǒ +##ǔ +##ǫ +##Ș +##Ț +##ț +##ɐ +##ɑ +##ɔ +##ɕ +##ə +##ɛ +##ɡ +##ɣ +##ɨ +##ɪ +##ɲ +##ɾ +##ʀ +##ʁ +##ʂ +##ʃ +##ʊ +##ʋ +##ʌ +##ʐ +##ʑ +##ʒ +##ʔ +##ʰ +##ʲ +##ʳ +##ʷ +##ʻ +##ʼ +##ʾ +##ʿ +##ˈ +##ː +##ˡ +##ˢ +##ˣ +##́ +##̃ +##̍ +##̯ +##͡ +##Α +##Β +##Γ +##Δ +##Ε +##Η +##Θ +##Ι +##Κ +##Λ +##Μ +##Ν +##Ο +##Π +##Σ +##Τ +##Φ +##Χ +##Ψ +##Ω +##ά +##έ +##ή +##ί +##β +##γ +##δ +##ε +##ζ +##η +##θ +##ι +##κ +##λ +##μ +##ξ +##ο +##π +##ρ +##σ +##τ +##υ +##φ +##χ +##ψ +##ω +##ό +##ύ +##ώ +##І +##Ј +##А +##Б +##В +##Г +##Д +##Е +##Ж +##З +##И +##К +##Л +##М +##Н +##О +##П +##Р +##С +##Т +##У +##Ф +##Х +##Ц +##Ч +##Ш +##Э +##Ю +##Я +##б +##в +##г 
+##д +##ж +##з +##к +##л +##м +##п +##с +##т +##у +##ф +##х +##ц +##ч +##ш +##щ +##ъ +##ы +##ь +##э +##ю +##ё +##і +##ї +##ј +##њ +##ћ +##Ա +##Հ +##ա +##ե +##ի +##կ +##մ +##յ +##ն +##ո +##ս +##տ +##ր +##ւ +##ְ +##ִ +##ֵ +##ֶ +##ַ +##ָ +##ֹ +##ּ +##א +##ב +##ג +##ד +##ה +##ו +##ז +##ח +##ט +##י +##כ +##ל +##ם +##מ +##ן +##נ +##ס +##ע +##פ +##צ +##ק +##ר +##ש +##ת +##، +##ء +##آ +##أ +##إ +##ئ +##ا +##ب +##ت +##ث +##ج +##ح +##خ +##ذ +##ز +##س +##ش +##ص +##ض +##ط +##ظ +##ع +##غ +##ف +##ق +##ك +##ل +##و +##ى +##َ +##ِ +##ٹ +##پ +##چ +##ک +##گ +##ہ +##ی +##ے +##ं +##आ +##क +##ग +##च +##ज +##ण +##त +##द +##ध +##न +##प +##ब +##भ +##म +##य +##र +##ल +##व +##श +##ष +##स +##ह +##ा +##ि +##ी +##ु +##े +##ो +##् +##। +##॥ +##আ +##ই +##এ +##ও +##ক +##খ +##গ +##চ +##ছ +##জ +##ট +##ত +##থ +##দ +##ধ +##ন +##প +##ব +##ম +##য +##র +##ল +##শ +##স +##হ +##় +##া +##ি +##ী +##ু +##ে +##ো +##্ +##য় +##க +##த +##ப +##ம +##ய +##ர +##ல +##வ +##ா +##ி +##ு +##் +##ร +##་ +##ག +##ང +##ད +##ན +##བ +##མ +##ར +##ལ +##ས +##ི +##ུ +##ེ +##ོ +##ა +##ე +##ი +##ლ +##ნ +##ო +##რ +##ს +##ᴬ +##ᴵ +##ᵀ +##ᵃ +##ᵇ +##ᵈ +##ᵉ +##ᵍ +##ᵏ +##ᵐ +##ᵒ +##ᵖ +##ᵗ +##ᵘ +##ᵣ +##ᵤ +##ᵥ +##ᶜ +##ᶠ +##ḍ +##Ḥ +##ḥ +##Ḩ +##ḩ +##ḳ +##ṃ +##ṅ +##ṇ +##ṛ +##ṣ +##ṭ +##ạ +##ả +##ấ +##ầ +##ẩ +##ậ +##ắ +##ế +##ề +##ể +##ễ +##ệ +##ị +##ọ +##ố +##ồ +##ổ +##ộ +##ớ +##ờ +##ợ +##ụ +##ủ +##ứ +##ừ +##ử +##ữ +##ự +##ỳ +##ỹ +##ἀ +##ἐ +##ὁ +##ὐ +##ὰ +##ὶ +##ὸ +##ῆ +##ῖ +##ῦ +##ῶ +##‐ +##‑ +##‒ +##– +##— +##― +##‖ +##‘ +##’ +##‚ +##“ +##” +##„ +##† +##‡ +##• +##… +##‰ +##′ +##″ +##⁄ +##⁰ +##ⁱ +##⁴ +##⁵ +##⁶ +##⁷ +##⁸ +##⁹ +##⁻ +##ⁿ +##₅ +##₆ +##₇ +##₈ +##₉ +##₊ +##₍ +##₎ +##ₐ +##ₑ +##ₒ +##ₓ +##ₕ +##ₖ +##ₘ +##ₚ +##ₛ +##ₜ +##₤ +##€ +##₱ +##₹ +##ℓ +##№ +##ℝ +##⅓ +##← +##↑ +##→ +##↔ +##⇌ +##⇒ +##∂ +##∈ +##∗ +##∘ +##√ +##∞ +##∧ +##∨ +##∩ +##∪ +##≈ +##≠ +##≡ +##≤ +##≥ +##⊂ +##⊆ +##⊕ +##⋅ +##─ +##│ +##■ +##● +##★ +##☆ +##☉ +##♠ +##♣ +##♥ +##♦ +##♯ +##⟨ +##⟩ +##ⱼ +##、 +##。 +##《 +##》 +##「 +##」 +##『 +##』 +##〜 +##い +##う +##え +##お +##か +##き +##く +##け 
+##こ +##さ +##し +##す +##せ +##そ +##た +##ち +##つ +##て +##と +##な +##に +##の +##は +##ひ +##ま +##み +##む +##め +##も +##や +##ゆ +##よ +##ら +##り +##る +##れ +##ん +##ア +##ィ +##イ +##ウ +##エ +##オ +##カ +##ガ +##キ +##ク +##グ +##コ +##サ +##シ +##ジ +##ス +##ズ +##タ +##ダ +##ッ +##テ +##デ +##ト +##ド +##ナ +##ニ +##ハ +##バ +##パ +##フ +##ブ +##プ +##マ +##ミ +##ム +##ャ +##ュ +##ラ +##リ +##ル +##レ +##ロ +##ン +##・ +##ー +##一 +##三 +##上 +##下 +##中 +##事 +##二 +##井 +##京 +##人 +##亻 +##仁 +##佐 +##侍 +##光 +##公 +##力 +##北 +##十 +##南 +##原 +##口 +##史 +##司 +##吉 +##同 +##和 +##囗 +##国 +##國 +##土 +##城 +##士 +##大 +##天 +##太 +##夫 +##女 +##子 +##宀 +##安 +##宮 +##宿 +##小 +##尚 +##山 +##島 +##川 +##州 +##平 +##年 +##心 +##愛 +##戸 +##文 +##新 +##方 +##日 +##明 +##星 +##書 +##月 +##木 +##本 +##李 +##村 +##東 +##松 +##林 +##正 +##武 +##氏 +##水 +##氵 +##江 +##河 +##海 +##版 +##犬 +##王 +##生 +##田 +##白 +##皇 +##省 +##真 +##石 +##社 +##神 +##竹 +##美 +##義 +##花 +##藤 +##西 +##谷 +##車 +##辶 +##道 +##郎 +##郡 +##部 +##野 +##金 +##長 +##門 +##陽 +##青 +##食 +##馬 +##高 +##龍 +##龸 +##사 +##씨 +##의 +##이 +##한 +##fi +##fl +##! +##( +##) +##, +##- +##/ +##: diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/dataset_dict.json b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/dataset_dict.json new file mode 100644 index 0000000000000000000000000000000000000000..ce084f11866187cc2a80681a7feed2433220a9e0 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/dataset_dict.json @@ -0,0 +1 @@ +{"splits": ["train", "validation"]} \ No newline at end of file diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-21c3b3af5ad138a8.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-21c3b3af5ad138a8.arrow new file mode 100644 index 0000000000000000000000000000000000000000..813b41fd9d1ad3890e50e89773b891bbd38d7ca1 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-21c3b3af5ad138a8.arrow @@ -0,0 +1,3 
@@ +version https://git-lfs.github.com/spec/v1 +oid sha256:b0ef5552477d30f553dc99e680e841122af4b6158cec92a833571c9e4b7a48b8 +size 88142224 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-a7d4fcf0afedf699.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-a7d4fcf0afedf699.arrow new file mode 100644 index 0000000000000000000000000000000000000000..5e309567221acc9b121a58ad779bdb21a3627796 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-a7d4fcf0afedf699.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:15414003cf5f6854a1d1a8cddf27ed5ec5fb80067a5244c411828a6fb31aa751 +size 88142616 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-bec06ea6cf14cfc1.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-bec06ea6cf14cfc1.arrow new file mode 100644 index 0000000000000000000000000000000000000000..ec6ac6f5a919eb023c53f21d519ba90b3106c633 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-bec06ea6cf14cfc1.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:20e02df2e0d92c47d03e4c4dab0bbac8ad45b318a6c97bf07d2c078de63e7be5 +size 88142616 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-ce4e04eb371cb7de.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-ce4e04eb371cb7de.arrow new file mode 100644 index 0000000000000000000000000000000000000000..ec1992b9c4df4308fa17742bf8654d76644f368d --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-ce4e04eb371cb7de.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid 
sha256:afa75dfcbfb6bb8ecb2e35ac30cee80bf6134749a80edc5266a954612fc78ab7 +size 276686136 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-dae547115b62dff9.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-dae547115b62dff9.arrow new file mode 100644 index 0000000000000000000000000000000000000000..de33f350099e839894937c6345f7c442cae82aa5 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-dae547115b62dff9.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:59788e35943a26c7857835d35bb03a94e797462f7e3678c876c8c93a2a809a40 +size 88142224 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-f01ce9f0d79ac965.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-f01ce9f0d79ac965.arrow new file mode 100644 index 0000000000000000000000000000000000000000..969be060df6fbe731731b0f45343359e8b9c522e --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/cache-f01ce9f0d79ac965.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:3f6de821f2d6a457650b65630836c45cb75dc35fdaa39dd60551beb75ccef73b +size 88142224 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/dataset.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/dataset.arrow new file mode 100644 index 0000000000000000000000000000000000000000..d9e7658a47c2cc3f1a10bd2284254da9820434b0 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/dataset.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:62c4ea077b01f48685ceadb29b8a98773906355882c26c40ed26a746e9f81e60 +size 253066616 diff --git a/NLP with Attention 
Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/dataset_info.json b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/dataset_info.json new file mode 100644 index 0000000000000000000000000000000000000000..c881d5286d36f69ed7c9087edad7378123b872c2 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/dataset_info.json @@ -0,0 +1,125 @@ +{ + "builder_name": "tydiqa", + "citation": "@article{tydiqa,\ntitle = {TyDi QA: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},\nauthor = {Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki}\nyear = {2020},\njournal = {Transactions of the Association for Computational Linguistics}\n}\n", + "config_name": "primary_task", + "dataset_size": 6034955060, + "description": "TyDi QA is a question answering dataset covering 11 typologically diverse languages with 204K question-answer pairs.\nThe languages of TyDi QA are diverse with regard to their typology -- the set of linguistic features that each language\nexpresses -- such that we expect models performing well on this set to generalize across a large number of the languages\nin the world. It contains language phenomena that would not be found in English-only corpora. 
To provide a realistic\ninformation-seeking task and avoid priming effects, questions are written by people who want to know the answer, but\ndon\u2019t know the answer yet, (unlike SQuAD and its descendents) and the data is collected directly in each language without\nthe use of translation (unlike MLQA and XQuAD).\n", + "download_checksums": { + "https://storage.googleapis.com/tydiqa/v1.0/tydiqa-v1.0-train.jsonl.gz": { + "num_bytes": 1729651634, + "checksum": "8eeedfee7593db7c3637d65a3d5c67b82486137ac6ac3ea7d08be9a64d71b629" + }, + "https://storage.googleapis.com/tydiqa/v1.0/tydiqa-v1.0-dev.jsonl.gz": { + "num_bytes": 160614310, + "checksum": "b52b8d4db1850b1549e960219e6056d8139986f8caf1b5e8b4eecadabed24413" + }, + "https://storage.googleapis.com/tydiqa/v1.1/tydiqa-goldp-v1.1-train.json": { + "num_bytes": 58004076, + "checksum": "cefc8e09ff2548d9b10a678d3a6bbbe5bc036be543f92418819ea676c97be23b" + }, + "https://storage.googleapis.com/tydiqa/v1.1/tydiqa-goldp-v1.1-dev.json": { + "num_bytes": 5617409, + "checksum": "b286e0f34bc7f52259359989716f369b160565bd12ad8f3a3e311f9b0dbad1c0" + } + }, + "download_size": 1953887429, + "features": { + "passage_answer_candidates": { + "feature": { + "plaintext_start_byte": { + "dtype": "int32", + "id": null, + "_type": "Value" + }, + "plaintext_end_byte": { + "dtype": "int32", + "id": null, + "_type": "Value" + } + }, + "length": -1, + "id": null, + "_type": "Sequence" + }, + "question_text": { + "dtype": "string", + "id": null, + "_type": "Value" + }, + "document_title": { + "dtype": "string", + "id": null, + "_type": "Value" + }, + "language": { + "dtype": "string", + "id": null, + "_type": "Value" + }, + "annotations": { + "feature": { + "passage_answer_candidate_index": { + "dtype": "int32", + "id": null, + "_type": "Value" + }, + "minimal_answers_start_byte": { + "dtype": "int32", + "id": null, + "_type": "Value" + }, + "minimal_answers_end_byte": { + "dtype": "int32", + "id": null, + "_type": "Value" + }, + "yes_no_answer": 
{ + "dtype": "string", + "id": null, + "_type": "Value" + } + }, + "length": -1, + "id": null, + "_type": "Sequence" + }, + "document_plaintext": { + "dtype": "string", + "id": null, + "_type": "Value" + }, + "document_url": { + "dtype": "string", + "id": null, + "_type": "Value" + } + }, + "homepage": "https://github.com/google-research-datasets/tydiqa", + "license": "", + "post_processed": null, + "post_processing_size": null, + "size_in_bytes": 7988842489, + "splits": { + "train": { + "name": "train", + "num_bytes": 5550574617, + "num_examples": 166916, + "dataset_name": "tydiqa" + }, + "validation": { + "name": "validation", + "num_bytes": 484380443, + "num_examples": 18670, + "dataset_name": "tydiqa" + } + }, + "supervised_keys": null, + "task_templates": null, + "version": { + "version_str": "1.0.0", + "description": "", + "major": 1, + "minor": 0, + "patch": 0 + } +} \ No newline at end of file diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/state.json b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/state.json new file mode 100644 index 0000000000000000000000000000000000000000..42980fdb95f1d9083c3c63ea8f4ec34d6f4bf1fe --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/state.json @@ -0,0 +1,15 @@ +{ + "_data_files": [ + { + "filename": "dataset.arrow" + } + ], + "_fingerprint": "531e2357b5b18c4a", + "_format_columns": null, + "_format_kwargs": {}, + "_format_type": null, + "_indexes": {}, + "_indices_data_files": null, + "_output_all_columns": false, + "_split": "train" +} \ No newline at end of file diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/tmp91o4sskf b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/tmp91o4sskf new file mode 100644 index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 diff --git a/NLP with 
Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/tmpcx8p0ni5 b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/tmpcx8p0ni5 new file mode 100644 index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/tmphmn7cgck b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/train/tmphmn7cgck new file mode 100644 index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0adce067eac1391a.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0adce067eac1391a.arrow new file mode 100644 index 0000000000000000000000000000000000000000..a98ed3d29ff2ea356a684151df20292fdcaf70f5 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0adce067eac1391a.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:1ad55689373389ce9d6e1879d7430a446a92599348da0a9f68f24b6162caec82 +size 33058928 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0b6a95603380c6f5.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0b6a95603380c6f5.arrow new file mode 100644 index 0000000000000000000000000000000000000000..854ad1a8e1e7d807ffd22dcede65875c90ca453c --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0b6a95603380c6f5.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8df97a375bcab11a525a70a0f51b1aca3a1b676b9bc41d25ed356b15efd1fe87 +size 33058536 diff --git a/NLP with Attention 
Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0bf3746900e14840.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0bf3746900e14840.arrow new file mode 100644 index 0000000000000000000000000000000000000000..a47aced2b1e10dcf1a03e01dc2a1ef02ff5f7525 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0bf3746900e14840.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:74afb7e4e75b90710ef695a9fb5b1e5fd5b9bbc08c676a2b293a6d0f6c8ae100 +size 33058536 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0c773b45eb4c1c17.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0c773b45eb4c1c17.arrow new file mode 100644 index 0000000000000000000000000000000000000000..9f3c58e581c138e72a9742302d7d1eb36de1b11e --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-0c773b45eb4c1c17.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:4e31a1a09e3f5c5937666a8a43d07ba63ff8d269ac94232a2ed7197177a9db24 +size 33058536 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-14bfc163d5792383.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-14bfc163d5792383.arrow new file mode 100644 index 0000000000000000000000000000000000000000..f84fc265ba5fe94b9e22b630b8f3d2830eaee30e --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-14bfc163d5792383.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:89c1615193ddfd88ea2af45cd1b7426d0dad4ff5453dc5fa6cd66137459e9476 +size 33058536 diff --git a/NLP with Attention 
Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-22dd192df839003a.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-22dd192df839003a.arrow new file mode 100644 index 0000000000000000000000000000000000000000..fdbaa13586a0d4f8f26b0aeb4b8a0f77bbd5b413 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-22dd192df839003a.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:95fe6e34295f87b19e7718362e4969a69612b0870f1006946a02e63090fe772f +size 33058928 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-32664b2bb6ecb93c.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-32664b2bb6ecb93c.arrow new file mode 100644 index 0000000000000000000000000000000000000000..52967dd4e9e669fc809e7a12b55fd726870c0979 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-32664b2bb6ecb93c.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:c6520663168e4d52ad2e41e3f949b62d558e7c811a999346fc1187397de47545 +size 33058928 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-981c6a4602432980.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-981c6a4602432980.arrow new file mode 100644 index 0000000000000000000000000000000000000000..611cb4882115639eee39f9dc063e39b9f13e2043 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-981c6a4602432980.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:fc029701f06e0d1f884928c60e666b2497f90bf48fc4ea6b2c00cac184216a6e +size 33058928 diff --git a/NLP with Attention 
Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-de50d25427e34427.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-de50d25427e34427.arrow new file mode 100644 index 0000000000000000000000000000000000000000..e57db3b6036455919296f5314250485d6eb836c8 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/cache-de50d25427e34427.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:7e7e7a4a9d44733871fedfab65c989b9a21215bb2b85eeac9a28427905f2e978 +size 34073752 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/dataset.arrow b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/dataset.arrow new file mode 100644 index 0000000000000000000000000000000000000000..3df82a3009a058692648bf5766e6d8a47aca6736 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/dataset.arrow @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:31a39789d3db909bd38c471aeb437bc34593ad6eef5b9d403a95bb169cf69c8a +size 31428392 diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/dataset_info.json b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/dataset_info.json new file mode 100644 index 0000000000000000000000000000000000000000..c881d5286d36f69ed7c9087edad7378123b872c2 --- /dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/dataset_info.json @@ -0,0 +1,125 @@ +{ + "builder_name": "tydiqa", + "citation": "@article{tydiqa,\ntitle = {TyDi QA: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},\nauthor = {Jonathan H. 
Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki}\nyear = {2020},\njournal = {Transactions of the Association for Computational Linguistics}\n}\n", + "config_name": "primary_task", + "dataset_size": 6034955060, + "description": "TyDi QA is a question answering dataset covering 11 typologically diverse languages with 204K question-answer pairs.\nThe languages of TyDi QA are diverse with regard to their typology -- the set of linguistic features that each language\nexpresses -- such that we expect models performing well on this set to generalize across a large number of the languages\nin the world. It contains language phenomena that would not be found in English-only corpora. To provide a realistic\ninformation-seeking task and avoid priming effects, questions are written by people who want to know the answer, but\ndon\u2019t know the answer yet, (unlike SQuAD and its descendents) and the data is collected directly in each language without\nthe use of translation (unlike MLQA and XQuAD).\n", + "download_checksums": { + "https://storage.googleapis.com/tydiqa/v1.0/tydiqa-v1.0-train.jsonl.gz": { + "num_bytes": 1729651634, + "checksum": "8eeedfee7593db7c3637d65a3d5c67b82486137ac6ac3ea7d08be9a64d71b629" + }, + "https://storage.googleapis.com/tydiqa/v1.0/tydiqa-v1.0-dev.jsonl.gz": { + "num_bytes": 160614310, + "checksum": "b52b8d4db1850b1549e960219e6056d8139986f8caf1b5e8b4eecadabed24413" + }, + "https://storage.googleapis.com/tydiqa/v1.1/tydiqa-goldp-v1.1-train.json": { + "num_bytes": 58004076, + "checksum": "cefc8e09ff2548d9b10a678d3a6bbbe5bc036be543f92418819ea676c97be23b" + }, + "https://storage.googleapis.com/tydiqa/v1.1/tydiqa-goldp-v1.1-dev.json": { + "num_bytes": 5617409, + "checksum": "b286e0f34bc7f52259359989716f369b160565bd12ad8f3a3e311f9b0dbad1c0" + } + }, + "download_size": 1953887429, + "features": { + "passage_answer_candidates": { + "feature": { + "plaintext_start_byte": { + "dtype": 
"int32", + "id": null, + "_type": "Value" + }, + "plaintext_end_byte": { + "dtype": "int32", + "id": null, + "_type": "Value" + } + }, + "length": -1, + "id": null, + "_type": "Sequence" + }, + "question_text": { + "dtype": "string", + "id": null, + "_type": "Value" + }, + "document_title": { + "dtype": "string", + "id": null, + "_type": "Value" + }, + "language": { + "dtype": "string", + "id": null, + "_type": "Value" + }, + "annotations": { + "feature": { + "passage_answer_candidate_index": { + "dtype": "int32", + "id": null, + "_type": "Value" + }, + "minimal_answers_start_byte": { + "dtype": "int32", + "id": null, + "_type": "Value" + }, + "minimal_answers_end_byte": { + "dtype": "int32", + "id": null, + "_type": "Value" + }, + "yes_no_answer": { + "dtype": "string", + "id": null, + "_type": "Value" + } + }, + "length": -1, + "id": null, + "_type": "Sequence" + }, + "document_plaintext": { + "dtype": "string", + "id": null, + "_type": "Value" + }, + "document_url": { + "dtype": "string", + "id": null, + "_type": "Value" + } + }, + "homepage": "https://github.com/google-research-datasets/tydiqa", + "license": "", + "post_processed": null, + "post_processing_size": null, + "size_in_bytes": 7988842489, + "splits": { + "train": { + "name": "train", + "num_bytes": 5550574617, + "num_examples": 166916, + "dataset_name": "tydiqa" + }, + "validation": { + "name": "validation", + "num_bytes": 484380443, + "num_examples": 18670, + "dataset_name": "tydiqa" + } + }, + "supervised_keys": null, + "task_templates": null, + "version": { + "version_str": "1.0.0", + "description": "", + "major": 1, + "minor": 0, + "patch": 0 + } +} \ No newline at end of file diff --git a/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/state.json b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/state.json new file mode 100644 index 0000000000000000000000000000000000000000..cc10d4a25d538cf7ac77b48b5d26ba90b0528e2b --- 
/dev/null +++ b/NLP with Attention Models/QA/QA_DistilBERT_pipline_FT/Files/tf/tydiqa_data/validation/state.json @@ -0,0 +1,15 @@ +{ + "_data_files": [ + { + "filename": "dataset.arrow" + } + ], + "_fingerprint": "1a623ff2b1b90fec", + "_format_columns": null, + "_format_kwargs": {}, + "_format_type": null, + "_indexes": {}, + "_indices_data_files": null, + "_output_all_columns": false, + "_split": "validation" +} \ No newline at end of file diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/C4W3_Assignment-checkpoint.ipynb b/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/C4W3_Assignment-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..fbcc58aae697c97ce8cb118b441643e55575656a --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/C4W3_Assignment-checkpoint.ipynb @@ -0,0 +1,2187 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "f368f78e", + "metadata": { + "colab_type": "text", + "id": "7yuytuIllsv1" + }, + "source": [ + "# Assignment 3: Question Answering\n", + "\n", + "Welcome to the third assignment of course 4. In this assignment, you will explore question answering. You will implement the \"Text-to-Text Transfer Transformer\" (better known as T5). Since you implemented transformers from scratch last week, you will now be able to use them. 
\n", + "\n", + " \n" + ] + }, + { + "cell_type": "markdown", + "id": "bf3ec561", + "metadata": { + "colab_type": "text", + "id": "Db6LQW5cMSgx" + }, + "source": [ + "## Table of Contents\n", + "\n", + "- [Overview](#0-1)\n", + "- [Importing the Packages](#0-2)\n", + "- [1 - Prepare the data for pretraining T5](#1)\n", + " - [1.1 - Pre-Training Objective](#1-1)\n", + " - [1.2 - C4 Dataset](#1-2)\n", + " - [1.3 - Process C4](#1-3)\n", + " - [1.4 - Decode to Natural Language](#1-4)\n", + " - [1.5 - Tokenizing and Masking](#1-5)\n", + " - [Exercise 1 - tokenize_and_mask](#ex-1)\n", + " - [1.6 - Creating the Pairs](#1-6)\n", + "- [2 - Pretrain a T5 model using C4](#2)\n", + " - [2.1 - Instantiate a new transformer model](#2-1)\n", + " - [2.2 - C4 pretraining](#2-2)\n", + "- [3 - Fine tune the T5 model for Question Answering](#3)\n", + " - [3.1 - Creating a list of paired question and answers](#3-1)\n", + " - [Exercise 2 - Parse the SQuaD 2.0 dataset](#ex-2)\n", + " - [3.2 - Fine tune the T5 model](#3-2) \n", + " - [3.3 - Implement your Question Answering model](#3-3)\n", + " - [Exercise 3 - Implement the question answering function](#ex-3) " + ] + }, + { + "cell_type": "markdown", + "id": "0595e9c4", + "metadata": { + "colab_type": "text", + "id": "ysxogfC1M158" + }, + "source": [ + "\n", + "## Overview\n", + "\n", + "This assignment will be different from the two previous ones. Due to memory constraints of this environment and for the sake of time, your model will be trained with small datasets, so you won't get models that you could use in production but you will gain the necessary knowledge about how the Generative Language models are trained and used. 
You also won't spend too much time on the architecture of the models; instead, you will take a model that is pre-trained on a larger dataset and fine-tune it to get better results.\n", + "\n", + "After completing this lab you will:\n", + "* Understand how the C4 dataset is structured.\n", + "* Pretrain a transformer model using a masked language model.\n", + "* Understand how the \"Text-to-Text Transfer Transformer\", or T5, model works.\n", + "* Fine-tune the T5 model for question answering.\n", + "\n", + "Before getting started, take some time to read the following tips:\n", + "#### TIPS FOR SUCCESSFUL GRADING OF YOUR ASSIGNMENT:\n", + "- All cells are frozen except for the ones where you need to submit your solutions.\n", + "- You can add new cells to experiment, but these will be omitted by the grader, so don't rely on newly created cells to host your solution code; use the provided places for this.\n", + "- You can add the comment # grade-up-to-here in any graded cell to signal the grader that it must only evaluate up to that point. This is helpful if you want to check whether you are on the right track even if you are not done with the whole assignment. Be sure to remember to delete the comment afterwards!\n", + "- To submit your notebook, save it and then click on the blue submit button at the beginning of the page." + ] + }, + { + "cell_type": "markdown", + "id": "2156cf78", + "metadata": {}, + "source": [ + "\n", + "## Importing the Packages\n", + "\n", + "Let's start by importing all the required libraries. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "3a532381", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 34 + }, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "uDhi6qLQMHzs", + "outputId": "64947d91-eef3-425b-9b4b-7ca7cefcc823", + "slideshow": { + "slide_type": "" + }, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "\n", + "import traceback\n", + "import time\n", + "import json\n", + "from termcolor import colored\n", + "import string\n", + "import textwrap\n", + "import itertools\n", + "import numpy as np\n", + "import tensorflow_text as tf_text\n", + "import tensorflow as tf\n", + "\n", + "import transformer_utils \n", + "import utils\n", + "\n", + "# Will come in handy later\n", + "wrapper = textwrap.TextWrapper(width=70)\n", + "\n", + "# Set random seed\n", + "np.random.seed(42)" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "id": "bf711eba", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "import w3_unittest" + ] + }, + { + "cell_type": "markdown", + "id": "47bea693", + "metadata": { + "colab_type": "text", + "id": "t7A-LAxsYpDd", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "## 1 - Prepare the data for pretraining T5 \n", + "\n", + "\n", + "### 1.1 - Pre-Training Objective\n", + "\n", + "In the initial phase of training a T5 model for a Question Answering task, the pre-training process involves leveraging a masked language model (MLM) on a very large dataset, such as the C4 dataset. The objective is to allow the model to learn contextualized representations of words and phrases, fostering a deeper understanding of language semantics. To initiate pre-training, it is essential to employ the Transformer architecture, which forms the backbone of T5. 
The Transformer's self-attention mechanism enables the model to weigh different parts of the input sequence dynamically, capturing long-range dependencies effectively.\n", + "\n", + "Before delving into pre-training, thorough data preprocessing is crucial. The C4 dataset, a diverse and extensive collection of web pages, provides a rich source for language understanding tasks. The dataset needs to be tokenized into smaller units, such as subwords or words, to facilitate model input. Additionally, the text is often segmented into fixed-length sequences or batches, optimizing computational efficiency during training.\n", + "\n", + "For the masked language modeling objective, a percentage of the tokenized input is randomly masked, and the model is trained to predict the original content of these masked tokens. This process encourages the T5 model to grasp contextual relationships between words and phrases, enhancing its ability to generate coherent and contextually appropriate responses during downstream tasks like question answering.\n", + "\n", + "In summary, the pre-training of the T5 model involves utilizing the Transformer architecture on a sizable dataset like C4, coupled with meticulous data preprocessing to convert raw text into a format suitable for training. The incorporation of a masked language modeling objective ensures that the model learns robust contextual representations, laying a solid foundation for subsequent fine-tuning on specific tasks such as question answering.\n", + "\n", + "**Note:** The word \"mask\" will be used throughout this assignment in context of hiding/removing word(s)\n", + "\n", + "You will be implementing the Masked language model (MLM) as shown in the following image. 
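Before looking at the masked example below, here is a minimal plain-Python sketch of the masking objective. It works at the word level and uses `<extra_id_n>`-style sentinel names (the convention of the original T5 release) purely for illustration; the assignment itself masks subword ids drawn from the end of the vocabulary.

```python
import random

def mask_words(words, noise=0.15, rng=None):
    """Toy word-level MLM masking: hide random words behind numbered
    sentinel markers (inputs) and collect the hidden words (targets)."""
    rng = rng or random.Random(0)
    inputs, targets = [], []
    sentinel = 0
    prev_masked = False
    for word in words:
        if rng.random() < noise:
            if not prev_masked:
                # A run of consecutive masked words shares one sentinel
                marker = f"<extra_id_{sentinel}>"
                inputs.append(marker)
                targets.append(marker)
                sentinel += 1
            targets.append(word)
            prev_masked = True
        else:
            inputs.append(word)
            prev_masked = False
    targets.append("[EOS]")  # mark the end of the target sequence
    return " ".join(inputs), " ".join(targets)

text = "The quick brown fox jumps over the lazy dog"
inp, tgt = mask_words(text.split(), noise=0.3)
print(inp)
print(tgt)
```

Note that consecutive masked words share a single sentinel; this is exactly the special case you will handle later in Exercise 1.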
\n", + "\n", + "\n", + "\n", + "Assume you have the following text: **Thank you for inviting me to your party last week** \n", + "\n", + "\n", + "Now as input you will mask the words in red in the text: \n", + "\n", + " **Input:** Thank you **X** me to your party **Y** week.\n", + "\n", + "**Output:** The model should predict the words(s) for **X** and **Y**. \n", + "\n", + "**[EOS]** will be used to mark the end of the target sequence." + ] + }, + { + "cell_type": "markdown", + "id": "1dc25302", + "metadata": { + "colab_type": "text", + "id": "Cwr7LoXwQUW5", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "### 1.2 - C4 Dataset\n", + "\n", + "The [C4 dataset](https://www.tensorflow.org/datasets/catalog/c4), also known as the Common Crawl C4 (Common Crawl Corpus C4), is a large-scale dataset of web pages collected by the [Common Crawl organization](https://commoncrawl.org/). It is commonly used for various natural language processing tasks and machine learning research. Each sample in the C4 dataset follows a consistent format, making it suitable for pretraining models like BERT. Here's a short explanation and description of the C4 dataset:\n", + "\n", + "- Format: Each sample in the C4 dataset is represented as a JSON object, containing several key-value pairs.\n", + "\n", + "- Content: The 'text' field in each sample contains the actual text content extracted from web pages. This text often includes a wide range of topics and writing styles, making it diverse and suitable for training language models.\n", + "\n", + "- Metadata: The dataset includes metadata such as 'content-length,' 'content-type,' 'timestamp,' and 'url,' providing additional information about each web page. 
'Content-length' specifies the length of the content, 'content-type' describes the type of content (e.g., 'text/plain'), 'timestamp' indicates when the web page was crawled, and 'url' provides the source URL of the web page.\n", + "\n", + "- Applications: The C4 dataset is commonly used for training and fine-tuning large-scale language models, such as BERT. It serves as a valuable resource for tasks like text classification, named entity recognition, question answering, and more.\n", + "\n", + "- Size: The C4 dataset contains more than 800 GiB of text data, making it suitable for training models with billions of parameters.\n", + "\n", + "Run the cell below to see what the C4 dataset looks like. " + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "aa56acc9", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "example number 1: \n", + "\n", + "{'text': 'Beginners BBQ Class Taking Place in Missoula!\\nDo you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. He will be teaching a beginner level class for everyone who wants to get better with their culinary skills.\\nHe will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information.\\nThe cost to be in the class is $35 per person, and for spectators it is free. 
Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared.'} \n", + "\n", + "example number 2: \n", + "\n", + "{'text': 'Discussion in \\'Mac OS X Lion (10.7)\\' started by axboi87, Jan 20, 2012.\\nI\\'ve got a 500gb internal drive and a 240gb SSD.\\nWhen trying to restore using disk utility i\\'m given the error \"Not enough space on disk ____ to restore\"\\nBut I shouldn\\'t have to do that!!!\\nAny ideas or workarounds before resorting to the above?\\nUse Carbon Copy Cloner to copy one drive to the other. I\\'ve done this several times going from larger HDD to smaller SSD and I wound up with a bootable SSD drive. One step you have to remember not to skip is to use Disk Utility to partition the SSD as GUID partition scheme HFS+ before doing the clone. If it came Apple Partition Scheme, even if you let CCC do the clone, the resulting drive won\\'t be bootable. CCC usually works in \"file mode\" and it can easily copy a larger drive (that\\'s mostly empty) onto a smaller drive. If you tell CCC to clone a drive you did NOT boot from, it can work in block copy mode where the destination drive must be the same size or larger than the drive you are cloning from (if I recall).\\nI\\'ve actually done this somehow on Disk Utility several times (booting from a different drive (or even the dvd) so not running disk utility from the drive your cloning) and had it work just fine from larger to smaller bootable clone. Definitely format the drive cloning to first, as bootable Apple etc..\\nThanks for pointing this out. My only experience using DU to go larger to smaller was when I was trying to make a Lion install stick and I was unable to restore InstallESD.dmg to a 4 GB USB stick but of course the reason that wouldn\\'t fit is there was slightly more than 4 GB of data.'} \n", + "\n", + "example number 3: \n", + "\n", + "{'text': 'Foil plaid lycra and spandex shortall with metallic slinky insets. 
Attached metallic elastic belt with O-ring. Headband included. Great hip hop or jazz dance costume. Made in the USA.'} \n", + "\n", + "example number 4: \n", + "\n", + "{'text': \"How many backlinks per day for new site?\\nDiscussion in 'Black Hat SEO' started by Omoplata, Dec 3, 2010.\\n1) for a newly created site, what's the max # backlinks per day I should do to be safe?\\n2) how long do I have to let my site age before I can start making more blinks?\\nI did about 6000 forum profiles every 24 hours for 10 days for one of my sites which had a brand new domain.\\nThere is three backlinks for every of these forum profile so thats 18 000 backlinks every 24 hours and nothing happened in terms of being penalized or sandboxed. This is now maybe 3 months ago and the site is ranking on first page for a lot of my targeted keywords.\\nbuild more you can in starting but do manual submission and not spammy type means manual + relevant to the post.. then after 1 month you can make a big blast..\\nWow, dude, you built 18k backlinks a day on a brand new site? How quickly did you rank up? What kind of competition/searches did those keywords have?\"} \n", + "\n", + "example number 5: \n", + "\n", + "{'text': 'The Denver Board of Education opened the 2017-18 school year with an update on projects that include new construction, upgrades, heat mitigation and quality learning environments.\\nWe are excited that Denver students will be the beneficiaries of a four year, $572 million General Obligation Bond. 
Since the passage of the bond, our construction team has worked to schedule the projects over the four-year term of the bond.\\nDenver voters on Tuesday approved bond and mill funding measures for students in Denver Public Schools, agreeing to invest $572 million in bond funding to build and improve schools and $56.6 million in operating dollars to support proven initiatives, such as early literacy.\\nDenver voters say yes to bond and mill levy funding support for DPS students and schools. Click to learn more about the details of the voter-approved bond measure.\\nDenver voters on Nov. 8 approved bond and mill funding measures for DPS students and schools. Learn more about what’s included in the mill levy measure.'} \n", + "\n" + ] + } + ], + "source": [ + "# Load example jsons\n", + "with open('data/c4-en-10k.jsonl', 'r') as file:\n", + " example_jsons = [json.loads(line.strip()) for line in file]\n", + "\n", + "# Print the examples to see what the data looks like\n", + "for i in range(5):\n", + " print(f'example number {i+1}: \\n\\n{example_jsons[i]} \\n')" + ] + }, + { + "cell_type": "markdown", + "id": "48901d97", + "metadata": { + "colab_type": "text", + "id": "eeihIgtiaSfh", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "### 1.3 - Process C4\n", + "\n", + "For the purpose of pretraining the T5 model, you will only use the text content of each entry. In the following code, you filter only the field `text` from all the entries in the dataset. This is the data that you will use to create the `inputs` and `targets` of your language model." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "af728cb2", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Beginners BBQ Class Taking Place in Missoula!\n", + "Do you want to get better at making delicious BBQ? 
You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. He will be teaching a beginner level class for everyone who wants to get better with their culinary skills.\n", + "He will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information.\n", + "The cost to be in the class is $35 per person, and for spectators it is free. Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared.\n" + ] + } + ], + "source": [ + "# Grab text field from dictionary\n", + "natural_language_texts = [example_json['text'] for example_json in example_jsons]\n", + "\n", + "# Print the first text example\n", + "print(natural_language_texts[0])" + ] + }, + { + "cell_type": "markdown", + "id": "ee4a25a2", + "metadata": { + "colab_type": "text", + "id": "1rMrONRqcCYi", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "### 1.4 - Decode to Natural Language\n", + "\n", + "The [SentencePieceTokenizer](https://www.tensorflow.org/text/api_docs/python/text/SentencepieceTokenizer), used in the code snippet, tokenizes text into subword units, enhancing handling of complex word structures, out-of-vocabulary words, and multilingual support. It simplifies preprocessing, ensures consistent tokenization, and seamlessly integrates with machine learning frameworks.\n", + "\n", + "In this task, a SentencePiece model is loaded from a file, which is used to tokenize text into subwords represented by integer IDs." 
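To build some intuition for why a single word can map to several ids, here is a deliberately simplified greedy longest-match segmenter over a tiny hand-made vocabulary. This is only a sketch of the subword idea; the real SentencePiece model learns its vocabulary and segmentation from data rather than using this rule.

```python
def greedy_subword_tokenize(word, vocab):
    """Toy greedy longest-match subword segmentation: repeatedly take the
    longest prefix found in the vocabulary, falling back to single
    characters when nothing matches."""
    pieces, start = [], 0
    while start < len(word):
        for end in range(len(word), start, -1):
            piece = word[start:end]
            # Single characters are always accepted as a fallback
            if piece in vocab or end == start + 1:
                pieces.append(piece)
                start = end
                break
    return pieces

# Hypothetical mini-vocabulary, purely for illustration
vocab = {"token", "iz", "ation", "un", "believ", "able"}
print(greedy_subword_tokenize("tokenization", vocab))
print(greedy_subword_tokenize("unbelievable", vocab))
```

A rare or out-of-vocabulary word never fails to tokenize: in the worst case it decomposes into individual characters, which is the same robustness property the SentencePiece tokenizer gives you below.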
+ ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "2ac53d57", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "# Special tokens\n", + "# PAD, EOS = 0, 1\n", + "\n", + "with open(\"./models/sentencepiece.model\", \"rb\") as f:\n", + " pre_trained_tokenizer = f.read()\n", + " \n", + "tokenizer = tf_text.SentencepieceTokenizer(pre_trained_tokenizer, out_type=tf.int32)" + ] + }, + { + "cell_type": "markdown", + "id": "658b0e86", + "metadata": {}, + "source": [ + "In this tokenizer the string `</s>` is used as the `EOS` token. By default, the tokenizer does not add the `EOS` to the end of each sentence, so you need to add it manually when required. Let's verify which id corresponds to this token:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "7d2fec4b", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "EOS: 1\n" + ] + } + ], + "source": [ + "eos = tokenizer.string_to_id(\"</s>\").numpy()\n", + "\n", + "print(\"EOS: \" + str(eos))" + ] + }, + { + "cell_type": "markdown", + "id": "6e87756f", + "metadata": {}, + "source": [ + "This code shows the process of tokenizing individual words from a given text, in this case, the third entry of the dataset."
+ ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "83c48352", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 54 + }, + "colab_type": "code", + "deletable": false, + "id": "iCCjgiVZgTSK", + "outputId": "023a227c-d895-4fd9-ae83-9394fe48cebd", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Word\t\t-->\tTokenization\n", + "----------------------------------------\n", + "Foil \t-->\t[4452, 173]\n", + "plaid \t-->\t[30772]\n", + "lycra \t-->\t[3, 120, 2935]\n", + "and \t-->\t[11]\n", + "spandex \t-->\t[8438, 26, 994]\n", + "shortall\t-->\t[710, 1748]\n", + "with \t-->\t[28]\n", + "metallic\t-->\t[18813]\n", + "slinky \t-->\t[3, 7, 4907, 63]\n", + "insets. \t-->\t[16, 2244, 7, 5]\n", + "Attached\t-->\t[28416, 15, 26]\n", + "metallic\t-->\t[18813]\n", + "elastic \t-->\t[15855]\n", + "belt \t-->\t[6782]\n", + "with \t-->\t[28]\n", + "O-ring. \t-->\t[411, 18, 1007, 5]\n", + "Headband\t-->\t[3642, 3348]\n", + "included.\t-->\t[1285, 5]\n", + "Great \t-->\t[1651]\n", + "hip \t-->\t[5436]\n", + "hop \t-->\t[13652]\n", + "or \t-->\t[42]\n", + "jazz \t-->\t[9948]\n", + "dance \t-->\t[2595]\n", + "costume.\t-->\t[11594, 5]\n", + "Made \t-->\t[6465]\n", + "in \t-->\t[16]\n", + "the \t-->\t[8]\n", + "USA. 
\t-->\t[2312, 5]\n" + ] + } + ], + "source": [ + "# printing the encoding of each word to see how subwords are tokenized\n", + "tokenized_text = [(list(tokenizer.tokenize(word).numpy()), word) for word in natural_language_texts[2].split()]\n", + "\n", + "print(\"Word\\t\\t-->\\tTokenization\")\n", + "print(\"-\"*40)\n", + "for element in tokenized_text:\n", + " print(f\"{element[1]:<8}\\t-->\\t{element[0]}\")" + ] + }, + { + "cell_type": "markdown", + "id": "d4616cf3", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "As usual, the library also provides a function to turn numeric tokens back into human-readable text. See how it works. " + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "92d7037b", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "tokenized: [12847 277]\n", + "detokenized: b'Beginners'\n" + ] + } + ], + "source": [ + "# We can see that detokenize successfully undoes the tokenization\n", + "print(f\"tokenized: {tokenizer.tokenize('Beginners')}\\ndetokenized: {tokenizer.detokenize(tokenizer.tokenize('Beginners'))}\")" + ] + }, + { + "cell_type": "markdown", + "id": "52f63624", + "metadata": { + "colab_type": "text", + "id": "vPKgGOeOxv3w", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "As you can see above, you were able to take a piece of string and tokenize it. \n", + "\n", + "Now you will create `input` and `target` pairs that will allow you to train your model. T5 uses the ids at the end of the vocab file as sentinels. For example, it will replace: \n", + " - `vocab_size - 1` by `<Z>`\n", + " - `vocab_size - 2` by `<Y>`\n", + " - and so forth. \n", + " \n", + "Each of these sentinel ids is mapped to a single character (`chr`) so it is easier to read.\n", + "\n", + "The `pretty_decode` function below, which you will use in a bit, helps in handling the type when decoding. 
Take a look and try to understand what the function is doing.\n", + "\n", + "\n", + "Notice that:\n", + "```python\n", + "string.ascii_letters = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'\n", + "```\n", + "\n", + "**NOTE:** Targets may have more than the 52 sentinels we replace, but this is just to give you an idea of things." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "b25bb46d", + "metadata": { + "colab": {}, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "fCPQL5FTxv3w", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "def get_sentinels(tokenizer, display=False):\n", + " sentinels = {}\n", + " vocab_size = tokenizer.vocab_size(name=None)\n", + " for i, char in enumerate(reversed(string.ascii_letters), 1):\n", + " decoded_text = tokenizer.detokenize([vocab_size - i]).numpy().decode(\"utf-8\")\n", + " \n", + " # Sentinels, ex: <Z> - <a>\n", + " sentinels[decoded_text] = f'<{char}>' \n", + " \n", + " if display:\n", + " print(f'The sentinel is <{char}> and the decoded token is:', decoded_text)\n", + "\n", + " return sentinels\n", + "\n", + "def pretty_decode(encoded_str_list, sentinels, tokenizer):\n", + " # If already a string, just do the replacements.\n", + " if tf.is_tensor(encoded_str_list) and encoded_str_list.dtype == tf.string:\n", + " for token, char in sentinels.items():\n", + " encoded_str_list = tf.strings.regex_replace(encoded_str_list, token, char)\n", + " return encoded_str_list\n", + " \n", + " # We need to decode and then prettify it.\n", + " return pretty_decode(tokenizer.detokenize(encoded_str_list), sentinels, tokenizer)" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "id": "56d75b6c", + "metadata": { + "colab": {}, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "fCPQL5FTxv3w", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + 
"text": [ + "The sentinel is and the decoded token is: Internațional\n", + "The sentinel is and the decoded token is: erwachsene\n", + "The sentinel is and the decoded token is: Cushion\n", + "The sentinel is and the decoded token is: imunitar\n", + "The sentinel is and the decoded token is: Intellectual\n", + "The sentinel is and the decoded token is: traditi\n", + "The sentinel is and the decoded token is: disguise\n", + "The sentinel is and the decoded token is: exerce\n", + "The sentinel is and the decoded token is: nourishe\n", + "The sentinel is and the decoded token is: predominant\n", + "The sentinel is

and the decoded token is: amitié\n", + "The sentinel is and the decoded token is: erkennt\n", + "The sentinel is and the decoded token is: dimension\n", + "The sentinel is and the decoded token is: inférieur\n", + "The sentinel is and the decoded token is: refugi\n", + "The sentinel is and the decoded token is: cheddar\n", + "The sentinel is and the decoded token is: unterlieg\n", + "The sentinel is and the decoded token is: garanteaz\n", + "The sentinel is and the decoded token is: făcute\n", + "The sentinel is and the decoded token is: réglage\n", + "The sentinel is and the decoded token is: pedepse\n", + "The sentinel is and the decoded token is: Germain\n", + "The sentinel is and the decoded token is: distinctly\n", + "The sentinel is and the decoded token is: Schraub\n", + "The sentinel is and the decoded token is: emanat\n", + "The sentinel is and the decoded token is: trimestre\n", + "The sentinel is and the decoded token is: disrespect\n", + "The sentinel is and the decoded token is: Erasmus\n", + "The sentinel is and the decoded token is: Australia\n", + "The sentinel is and the decoded token is: permeabil\n", + "The sentinel is and the decoded token is: deseori\n", + "The sentinel is and the decoded token is: manipulated\n", + "The sentinel is and the decoded token is: suggér\n", + "The sentinel is and the decoded token is: corespund\n", + "The sentinel is and the decoded token is: nitro\n", + "The sentinel is and the decoded token is: oyons\n", + "The sentinel is

and the decoded token is: Account\n", + "The sentinel is and the decoded token is: échéan\n", + "The sentinel is and the decoded token is: laundering\n", + "The sentinel is and the decoded token is: genealogy\n", + "The sentinel is and the decoded token is: QuickBooks\n", + "The sentinel is and the decoded token is: constituted\n", + "The sentinel is and the decoded token is: Fertigung\n", + "The sentinel is and the decoded token is: goutte\n", + "The sentinel is and the decoded token is: regulă\n", + "The sentinel is and the decoded token is: overwhelmingly\n", + "The sentinel is and the decoded token is: émerg\n", + "The sentinel is and the decoded token is: broyeur\n", + "The sentinel is and the decoded token is: povești\n", + "The sentinel is and the decoded token is: emulator\n", + "The sentinel is and the decoded token is: halloween\n", + "The sentinel is and the decoded token is: combustibil\n" + ] + } + ], + "source": [ + "sentinels = get_sentinels(tokenizer, display=True)" + ] + }, + { + "cell_type": "markdown", + "id": "be73a35d", + "metadata": { + "colab": {}, + "colab_type": "code", + "id": "fCPQL5FTxv3w", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "Now, let's use the `pretty_decode` function in the following sentence. Note that all the words listed as sentinels, will be replaced by the function with the corresponding sentinel. It could be a drawback of this method, but don't worry about it now." 
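The replacement step itself is just string substitution. A plain-Python sketch behaves the same way on ordinary strings, assuming, for illustration, that the tokens "Intellectual" and "halloween" decode to the sentinels `<V>` and `<b>`:

```python
def pretty_decode_str(text, sentinels):
    """Plain-string analogue of pretty_decode: swap each sentinel's
    decoded token for its short <letter> marker."""
    for token, marker in sentinels.items():
        text = text.replace(token, marker)
    return text

# Hypothetical sentinel mappings, purely for illustration
demo_sentinels = {"Intellectual": "<V>", "halloween": "<b>"}
result = pretty_decode_str("I want to dress up as an Intellectual this halloween.", demo_sentinels)
print(result)  # -> I want to dress up as an <V> this <b>.
```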
+ ] + }, + { + "cell_type": "code", + "execution_count": 12, + "id": "1fe92253", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "data": { + "text/plain": [ + "<tf.Tensor: shape=(), dtype=string, numpy=b'I want to dress up as an <V> this <b>.'>" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pretty_decode(tf.constant(\"I want to dress up as an Intellectual this halloween.\"), sentinels, tokenizer)" + ] + }, + { + "cell_type": "markdown", + "id": "559b04b7", + "metadata": { + "colab_type": "text", + "id": "Y64F--Nzxv30", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "The functions above make your `inputs` and `targets` more readable. For example, you might see something like this once you implement the masking function below. \n", + "\n", + "- Input sentence: Younes and Lukasz were working together in the lab yesterday after lunch. \n", + "- Input: Younes and Lukasz **Z** together in the **Y** yesterday after lunch.\n", + "- Target: **Z** were working **Y** lab.\n" + ] + }, + { + "cell_type": "markdown", + "id": "244cd7a8", + "metadata": { + "colab_type": "text", + "id": "NvvNd7n6xv30" + }, + "source": [ + "\n", + "### 1.5 - Tokenizing and Masking\n", + "\n", + "In this task, you will implement the `tokenize_and_mask` function, which tokenizes and masks input words based on a given probability. The probability is controlled by the `noise` parameter, typically set to mask around `15%` of the words in the input text. 
The function will generate two lists of tokenized sequences following the algorithm outlined below:" + ] + }, + { + "cell_type": "markdown", + "id": "7050f25c", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "### Exercise 1 - tokenize_and_mask\n", + "\n", + "- Start with two empty lists: `inps` and `targs`\n", + "- Tokenize the input text using the given tokenizer.\n", + "- For each `token` in the tokenized sequence:\n", + " - Generate a random number (simulating a weighted coin toss)\n", + " - If the random value is greater than the given threshold (noise):\n", + " - Add the current token to the `inps` list\n", + " - Else:\n", + " - If a new sentinel must be included (read note **):\n", + " - Compute the next sentinel ID using a progression.\n", + " - Add a sentinel into the `inps` and `targs` to mark the position of the masked element.\n", + " - Add the current token to the `targs` list.\n", + "\n", + "** There's a special case to consider. If two or more consecutive tokens get masked during the process, you don't need to add a new sentinel to the sequences. To account for this, use the `prev_no_mask` flag, which starts as `True` but is turned to `False` each time you mask a new element. The code that adds sentinels will only be executed if, before masking the token, the flag was in the `True` state.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "id": "c660bf97", + "metadata": { + "colab": {}, + "colab_type": "code", + "deletable": false, + "id": "Bi33WKgRxv31", + "slideshow": { + "slide_type": "" + }, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: tokenize_and_mask\n", + "def tokenize_and_mask(text, \n", + " noise=0.15, \n", + " randomizer=np.random.uniform, \n", + " tokenizer=None):\n", + " \"\"\"Tokenizes and masks a given input.\n", + "\n", + " Args:\n", + " text (str or bytes): Text input.\n", + " noise (float, optional): Probability of masking a token. 
Defaults to 0.15.\n", + " randomizer (function, optional): Function that generates random values. Defaults to np.random.uniform.\n", + " tokenizer (function, optional): Tokenizer function. Defaults to tokenize.\n", + "\n", + " Returns:\n", + " inps, targs: Lists of integers associated to inputs and targets.\n", + " \"\"\"\n", + " \n", + " # Current sentinel number (starts at 0)\n", + " cur_sentinel_num = 0\n", + " \n", + " # Inputs and targets\n", + " inps, targs = [], []\n", + "\n", + " # Vocab_size\n", + " vocab_size = int(tokenizer.vocab_size())\n", + " \n", + " # EOS token id \n", + " # Must be at the end of each target!\n", + " eos = tokenizer.string_to_id(\"</s>\").numpy()\n", + " \n", + " ### START CODE HERE ###\n", + " \n", + " # prev_no_mask is True if the previous token was NOT masked, False otherwise\n", + " # set prev_no_mask to True\n", + " prev_no_mask = True\n", + " \n", + " # Loop over the tokenized text\n", + " for token in tokenizer.tokenize(text).numpy():\n", + " \n", + " # Generate a random value between 0 and 1\n", + " rnd_val = randomizer() \n", + " \n", + " # Check if the noise is greater than a random value (weighted coin flip)\n", + " if noise > rnd_val:\n", + " \n", + " # Check if previous token was NOT masked\n", + " if prev_no_mask:\n", + " \n", + " # Current sentinel increases by 1\n", + " cur_sentinel_num += 1\n", + " \n", + " # Compute end_id by subtracting current sentinel value out of the total vocabulary size\n", + " end_id = vocab_size - cur_sentinel_num\n", + " \n", + " # Append end_id at the end of the targets\n", + " targs.append(end_id)\n", + " \n", + " # Append end_id at the end of the inputs\n", + " inps.append(end_id)\n", + " \n", + " # Append token at the end of the targets\n", + " targs.append(token)\n", + " \n", + " # set prev_no_mask accordingly\n", + " prev_no_mask = False\n", + "\n", + " else:\n", + " \n", + " # Append token at the end of the inputs\n", + " inps.append(token)\n", + " \n", + " # Set prev_no_mask 
accordingly\n", + " prev_no_mask = True\n", + " \n", + " \n", + " # Add EOS token to the end of the targets\n", + " targs.append(eos)\n", + " \n", + " ### END CODE HERE ###\n", + " \n", + " return inps, targs" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "id": "e92edca1", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 122 + }, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "OlPySQo9xv34", + "outputId": "2b0dc5e4-8d58-4eb0-a146-0c9f158264ac", + "scrolled": true, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "tokenized inputs - shape=53:\n", + "\n", + "[31999, 15068, 4501, 3, 12297, 3399, 16, 5964, 7115, 31998, 531, 25, 241, 12, 129, 394, 44, 492, 31997, 58, 148, 56, 43, 8, 1004, 6, 474, 31996, 39, 4793, 230, 5, 2721, 6, 1600, 1630, 31995, 1150, 4501, 15068, 16127, 6, 9137, 2659, 5595, 31994, 782, 3624, 14627, 15, 12612, 277, 5]\n", + "\n", + "targets - shape=19:\n", + "\n", + "[31999, 12847, 277, 31998, 9, 55, 31997, 3326, 15068, 31996, 48, 30, 31995, 727, 1715, 31994, 45, 301, 1]\n" + ] + } + ], + "source": [ + "# Some logic to mock a np.random value generator\n", + "# Needs to be in the same cell for it to always generate same output\n", + "def testing_rnd():\n", + " def dummy_generator():\n", + " vals = np.linspace(0, 1, 10)\n", + " cyclic_vals = itertools.cycle(vals)\n", + " for _ in range(100):\n", + " yield next(cyclic_vals)\n", + "\n", + " dumr = itertools.cycle(dummy_generator())\n", + "\n", + " def dummy_randomizer():\n", + " return next(dumr)\n", + " \n", + " return dummy_randomizer\n", + "\n", + "input_str = 'Beginners BBQ Class Taking Place in Missoula!\\nDo you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. 
Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers.'\n", + "\n", + "inps, targs = tokenize_and_mask(input_str, randomizer=testing_rnd(), tokenizer=tokenizer)\n", + "print(f\"tokenized inputs - shape={len(inps)}:\\n\\n{inps}\\n\\ntargets - shape={len(targs)}:\\n\\n{targs}\")" + ] + }, + { + "cell_type": "markdown", + "id": "07996252", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "#### **Expected Output:**\n", + "```\n", + "tokenized inputs - shape=53:\n", + "\n", + "[31999 15068 4501 3 12297 3399 16 5964 7115 31998 531 25\n", + " 241 12 129 394 44 492 31997 58 148 56 43 8\n", + " 1004 6 474 31996 39 4793 230 5 2721 6 1600 1630\n", + " 31995 1150 4501 15068 16127 6 9137 2659 5595 31994 782 3624\n", + " 14627 15 12612 277 5]\n", + "\n", + "targets - shape=19:\n", + "\n", + "[31999 12847 277 31998 9 55 31997 3326 15068 31996 48 30\n", + " 31995 727 1715 31994 45 301 1]\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "id": "76daaa5b", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed\n" + ] + } + ], + "source": [ + "# Test your implementation!\n", + "w3_unittest.test_tokenize_and_mask(tokenize_and_mask)" + ] + }, + { + "cell_type": "markdown", + "id": "9c87bea8", + "metadata": { + "colab_type": "text", + "id": "_omCqbkLxv36", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "You will now use the inputs and the targets from the `tokenize_and_mask` function you implemented above. Take a look at the decoded version of your masked sentence using your `inps` and `targs` from the sentence above. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "id": "054d51bf", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 105 + }, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "y6xwo6lGxv37", + "outputId": "4330ae1e-1805-40c9-daf3-c6bbe92d957b", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Inputs: \n", + "\n", + " b' BBQ Class Taking Place in Missoul Do you want to get better at making ? You will have the opportunity, put your calendar now. Thursday, September 22 World Class BBQ Champion, Tony Balay onestar Smoke Rangers.'\n", + "\n", + "Targets: \n", + "\n", + " b' Beginners a! delicious BBQ this on nd join from L'\n" + ] + } + ], + "source": [ + "print('Inputs: \\n\\n', pretty_decode(inps, sentinels, tokenizer).numpy())\n", + "print('\\nTargets: \\n\\n', pretty_decode(targs, sentinels, tokenizer).numpy())" + ] + }, + { + "cell_type": "markdown", + "id": "0707c320", + "metadata": { + "colab_type": "text", + "id": "24HZiIBLxv3-", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "### 1.6 - Creating the Pairs\n", + "\n", + "You will now create pairs using your dataset. You will iterate over your data and create (inp, targ) pairs using the functions that we have given you. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "id": "ae83fff0", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "# Apply tokenize_and_mask\n", + "inputs_targets_pairs = [tokenize_and_mask(text.encode('utf-8', errors='ignore').decode('utf-8'), tokenizer=tokenizer) \n", + " for text in natural_language_texts[0:2000]]" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "id": "3f157ad1", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 1000 + }, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "c1HiKreWokhs", + "outputId": "fc194524-41de-4d3b-87d9-ae35c29c9f79", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[1]\n", + "\n", + "inputs:\n", + "il plaid lycra spandex shortall with metallic slinky\n", + "sets. Attache metallic elastic belt with O ring. Head \n", + "included. Great hip hop jazz dance costume. in the USA.\n", + "\n", + "targets:\n", + " Fo and in d - band or Made\n", + "\n", + "\n", + "\n", + "[2]\n", + "\n", + "inputs:\n", + "I thought I was going to 3rd season Wire tonight. there\n", + "was a commentary 11, so I had to re watch Ground with \n", + "commentary. Hopefully can finish season .\n", + "\n", + "targets:\n", + " finish the of the But on episode - Middle \n", + "the I the next weekend\n", + "\n", + "\n", + "\n", + "[3]\n", + "\n", + "inputs:\n", + "Pencarian FILM Untuk \" eace er 2017 yuk mampir ke channel\n", + "say . Edges provides the l.. A corrupt cop makes one w.. er\n", + "2017 ⁇ ⁇ .. Náo Lo ⁇ n - Peace Break.. Please subscribe and hit\n", + ".. in HD at http://.. cannot believe I manage..\n", + "\n", + "targets:\n", + " P Break \" . 
East Peace Break uploaded\n", + " I\n", + "\n", + "\n", + "\n" + ] + } + ], + "source": [ + "def display_input_target_pairs(inputs_targets_pairs, sentinels, wrapper=textwrap.TextWrapper(width=70), tokenizer=tokenizer):\n", + " for i, inp_tgt_pair in enumerate(inputs_targets_pairs, 1):\n", + " inps, tgts = inp_tgt_pair\n", + " inps = str(pretty_decode(inps, sentinels, tokenizer).numpy(), encoding='utf-8')\n", + " tgts = str(pretty_decode(tgts, sentinels, tokenizer).numpy(), encoding='utf-8')\n", + " print(f'[{i}]\\n\\n'\n", + " f'inputs:\\n{wrapper.fill(text=inps)}\\n\\n'\n", + " f'targets:\\n{wrapper.fill(text=tgts)}\\n\\n\\n')\n", + "\n", + "# Print 3 samples. Only inputs with fewer than 100 tokens are shown, just to give you an idea of the process\n", + "display_input_target_pairs(filter(lambda x: len(x[0]) < 100, inputs_targets_pairs[0:12]), sentinels, wrapper, tokenizer)" + ] + }, + { + "cell_type": "markdown", + "id": "d7d5e6d9", + "metadata": { + "colab_type": "text", + "id": "hQI5Jgov5X-d", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "## 2 - Pretrain a T5 model using C4\n", + "\n", + "Now you are going to reuse the Transformer architecture that you coded in the previous assignment to summarize text, this time to answer questions. Instead of training the question answering model from scratch, you will first \"pre-train\" the model using the C4 dataset you just processed. This helps the model learn the general structure of language from a large corpus, and it is much easier to do because no labeling is needed: the masking, which is done automatically, provides the training signal. You will then use the data from the SQuAD set to teach the model to answer questions given a context. To start, let's review the Transformer's architecture. 
\n", + "\n", + "\n", + "\n", + "### 2.1 - Instantiate a new transformer model\n", + "\n", + "We have packaged the code implemented in the previous week into the `Transformer.py` file. You can import it here and set it up with the same configuration used there. " + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "id": "58ce75dc", + "metadata": { + "colab": {}, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "eScMhEG7xv4H", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "# Define the model parameters\n", + "num_layers = 2\n", + "embedding_dim = 128\n", + "fully_connected_dim = 128\n", + "num_heads = 2\n", + "positional_encoding_length = 256\n", + "\n", + "encoder_vocab_size = int(tokenizer.vocab_size())\n", + "decoder_vocab_size = encoder_vocab_size\n", + "\n", + "# Initialize the model\n", + "transformer = transformer_utils.Transformer(\n", + " num_layers, \n", + " embedding_dim, \n", + " num_heads, \n", + " fully_connected_dim,\n", + " encoder_vocab_size, \n", + " decoder_vocab_size, \n", + " positional_encoding_length, \n", + " positional_encoding_length,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "618697cf", + "metadata": {}, + "source": [ + "Now, you will define the optimizer and the loss function. For this task, the model will try to predict the masked words, so, as in the previous lab, the loss function will be `SparseCategoricalCrossentropy`." 
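Because the loss is built with `reduction='none'`, it returns one loss value per target token, and the training step can then ignore the entries that correspond to padding before averaging. The masking idea can be sketched in plain Python (this is a conceptual stand-in, not the course's `transformer_utils.train_step` code):

```python
import math

def masked_sparse_ce(logits, labels, pad_id=0):
    """Average cross-entropy over positions whose label is not padding."""
    total, count = 0.0, 0
    for position_logits, label in zip(logits, labels):
        if label == pad_id:  # padded targets contribute nothing to the loss
            continue
        # Numerically stable softmax + negative log-likelihood of the true class
        m = max(position_logits)
        exps = [math.exp(x - m) for x in position_logits]
        log_prob = (position_logits[label] - m) - math.log(sum(exps))
        total += -log_prob
        count += 1
    return total / count

# Two real positions and one padded position (label 0 is skipped)
logits = [[2.0, 0.5, 0.1], [0.2, 1.5, 0.3], [9.0, 9.0, 9.0]]
labels = [1, 2, 0]
loss = masked_sparse_ce(logits, labels)
```

Without the mask, long runs of padding zeros would dominate the average and the model would be rewarded for predicting padding instead of real tokens.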
+ ] + }, + { + "cell_type": "code", + "execution_count": 22, + "id": "7df2d1d1", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "learning_rate = transformer_utils.CustomSchedule(embedding_dim)\n", + "optimizer = tf.keras.optimizers.Adam(0.0001, beta_1=0.9, beta_2=0.98, epsilon=1e-9)\n", + "\n", + "loss_object = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True, reduction='none')\n", + "train_loss = tf.keras.metrics.Mean(name='train_loss')\n", + "\n", + "# Here you will store the losses, so you can later plot them\n", + "losses = []" + ] + }, + { + "cell_type": "markdown", + "id": "03d54376", + "metadata": {}, + "source": [ + "\n", + "### 2.2 - C4 pretraining\n", + "\n", + "For training a TensorFlow model, you need to arrange the data into datasets. Now, you will get the `inputs` and the `targets` for the transformer model from the `inputs_targets_pairs`. Before creating the dataset, you need to make sure that all `inputs` have the same length by truncating the longer sequences and padding the shorter ones with `0`. The same must be done for the targets. 
The function `tf.keras.preprocessing.sequence.pad_sequences` will help you here, as in the previous week's assignment.\n", + "\n", + "You will use a `BATCH_SIZE = 64`." + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "id": "b03eb998", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# Limit the size of the input and output data so this can run in this environment\n", + "encoder_maxlen = 150\n", + "decoder_maxlen = 50\n", + "\n", + "inputs = tf.keras.preprocessing.sequence.pad_sequences([x[0] for x in inputs_targets_pairs], maxlen=encoder_maxlen, padding='post', truncating='post')\n", + "targets = tf.keras.preprocessing.sequence.pad_sequences([x[1] for x in inputs_targets_pairs], maxlen=decoder_maxlen, padding='post', truncating='post')\n", + "\n", + "inputs = tf.cast(inputs, dtype=tf.int32)\n", + "targets = tf.cast(targets, dtype=tf.int32)\n", + "\n", + "# Create the final training dataset.\n", + "BUFFER_SIZE = 10000\n", + "BATCH_SIZE = 64\n", + "\n", + "dataset = tf.data.Dataset.from_tensor_slices((inputs, targets)).shuffle(BUFFER_SIZE).batch(BATCH_SIZE)" + ] + }, + { + "cell_type": "markdown", + "id": "4e32ae0c", + "metadata": {}, + "source": [ + "Now, you can run the training loop for 10 epochs. Running it with a big dataset such as C4 on a good computer with enough memory and a good GPU could take more than 24 hours. Here, you will run a few epochs using a small portion of the C4 dataset for illustration. It will only take a few minutes, but the model won't be very powerful. 
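For reference, the `padding='post', truncating='post'` behaviour used in the cell above can be mimicked in a few lines of plain Python (a sketch of the idea, not the Keras implementation, which also handles dtypes and 2-D arrays):

```python
def pad_post(sequences, maxlen, value=0):
    """Truncate sequences longer than maxlen and right-pad shorter ones with value."""
    out = []
    for seq in sequences:
        seq = list(seq)[:maxlen]                   # truncating='post': cut the tail
        seq = seq + [value] * (maxlen - len(seq))  # padding='post': pad on the right
        out.append(seq)
    return out

batch = pad_post([[5, 6, 7, 8, 9], [1, 2]], maxlen=4)
print(batch)  # -> [[5, 6, 7, 8], [1, 2, 0, 0]]
```

Padding on the right (rather than the left) keeps the real tokens aligned at the start of each row, which is what the attention padding mask in the transformer expects here.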
" + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "id": "44fc5f76", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1, Loss 10.1104\n", + "Time taken for one epoch: 26.227202892303467 sec\n", + "Epoch 2, Loss 9.5002\n", + "Time taken for one epoch: 9.266757726669312 sec\n", + "Epoch 3, Loss 8.9102\n", + "Time taken for one epoch: 8.233482360839844 sec\n", + "Epoch 4, Loss 8.3658\n", + "Time taken for one epoch: 6.5507917404174805 sec\n", + "Epoch 5, Loss 7.8717\n", + "Time taken for one epoch: 6.289384365081787 sec\n", + "Epoch 6, Loss 7.4269\n", + "Time taken for one epoch: 6.0538671016693115 sec\n", + "Epoch 7, Loss 7.0341\n", + "Time taken for one epoch: 5.186856508255005 sec\n", + "Epoch 8, Loss 6.7001\n", + "Time taken for one epoch: 3.4987030029296875 sec\n", + "Epoch 9, Loss 6.4342\n", + "Time taken for one epoch: 4.493892669677734 sec\n", + "Epoch 10, Loss 6.2358\n", + "Time taken for one epoch: 3.7167770862579346 sec\n" + ] + } + ], + "source": [ + "# Define the number of epochs\n", + "epochs = 10\n", + "\n", + "# Training loop\n", + "for epoch in range(epochs):\n", + " \n", + " start = time.time()\n", + " train_loss.reset_states()\n", + " number_of_batches=len(list(enumerate(dataset)))\n", + "\n", + " for (batch, (inp, tar)) in enumerate(dataset):\n", + " print(f'Epoch {epoch+1}, Batch {batch+1}/{number_of_batches}', end='\\r')\n", + " transformer_utils.train_step(inp, tar, transformer, loss_object, optimizer, train_loss)\n", + " \n", + " print (f'Epoch {epoch+1}, Loss {train_loss.result():.4f}')\n", + " losses.append(train_loss.result())\n", + " \n", + " print (f'Time taken for one epoch: {time.time() - start} sec')\n", + "\n", + "# Save the pretrained model\n", + "# transformer.save_weights('./model_c4_temp')" + ] + }, + { + "cell_type": "markdown", + "id": "2e8135b5", + "metadata": {}, + "source": [ + "**Load a pretrained 
model**\n", + "\n", + "To show how powerful this model actually is, we trained it for several epochs with the full dataset in Colab and saved the weights for you. You can load them using the cell below. For the rest of the notebook, you will see the power of transfer learning in action." + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "id": "55360633", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 25, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "transformer.load_weights('./pretrained_models/model_c4')" + ] + }, + { + "cell_type": "markdown", + "id": "b8822756", + "metadata": {}, + "source": [ + "\n", + "## 3 - Fine-tune the T5 model for Question Answering\n", + "\n", + "Now, you are going to fine-tune the pretrained model for Question Answering using the [SQuAD 2.0 dataset](https://rajpurkar.github.io/SQuAD-explorer/).\n", + "\n", + "SQuAD, short for Stanford Question Answering Dataset, is a dataset designed for training and evaluating question answering systems. It consists of real questions posed by humans on a set of Wikipedia articles, where the answer to each question is a specific span of text within the corresponding article.\n", + "\n", + "SQuAD 1.1, the previous version of the SQuAD dataset, contains 100,000+ question-answer pairs on about 500 articles.\n", + "SQuAD 2.0 adds 50,000+ questions that cannot be answered from the given context. This extra set of questions helps train models to detect unanswerable questions.\n", + "\n", + "Let's load the dataset." 
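To make the nested layout concrete before loading the real file, here is a toy example in the SQuAD 2.0 shape (the entries are fabricated for illustration, not taken from the actual dataset) and a walk over its articles, paragraphs, and questions to count the answerable ones:

```python
# Toy data in the SQuAD 2.0 shape (fabricated example, not real entries)
toy_data = [
    {
        "title": "Example",
        "paragraphs": [
            {
                "context": "The sky is blue.",
                "qas": [
                    {"question": "What color is the sky?", "id": "q1",
                     "answers": [{"text": "blue", "answer_start": 11}],
                     "is_impossible": False},
                    {"question": "What color is grass?", "id": "q2",
                     "answers": [], "is_impossible": True},
                ],
            }
        ],
    }
]

# Count questions that can actually be answered from their context
answerable = sum(
    not qa["is_impossible"]
    for article in toy_data
    for paragraph in article["paragraphs"]
    for qa in paragraph["qas"]
)
print(answerable)  # -> 1
```

Note how `answer_start` (11 here) is a character offset into the paragraph's `context` string, so `context[11:15]` recovers the answer text `"blue"`.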
+ ] + }, + { + "cell_type": "code", + "execution_count": 26, + "id": "987571df", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Number of articles: 442\n" + ] + } + ], + "source": [ + "with open('data/train-v2.0.json', 'r') as f:\n", + " example_jsons = json.load(f)\n", + "\n", + "example_jsons = example_jsons['data']\n", + "\n", + "print('Number of articles: ' + str(len(example_jsons)))" + ] + }, + { + "cell_type": "markdown", + "id": "c941761f", + "metadata": {}, + "source": [ + "The structure of each article is as follows:\n", + "- `title`: The article title\n", + "- `paragraphs`: A list of paragraphs and questions related to them\n", + " - `context`: The actual paragraph text\n", + " - `qas`: A set of questions related to the paragraph\n", + " - `question`: A question\n", + " - `id`: The question's unique identifier\n", + " - `is_impossible`: Boolean, specifies whether the question can be answered or not\n", + " - `answers`: A set of possible answers for the question\n", + " - `text`: The answer\n", + " - `answer_start`: The character index in the `context` at which the answer starts\n", + " \n", + "Take a look at an article by running the next cell. 
Notice that the `context` is usually the last element for every paragraph: " + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "id": "7c4c4cfa", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Title: Beyoncé\n", + "{'qas': [{'question': 'When did Beyonce start becoming popular?', 'id': '56be85543aeaaa14008c9063', 'answers': [{'text': 'in the late 1990s', 'answer_start': 269}], 'is_impossible': False}, {'question': 'What areas did Beyonce compete in when she was growing up?', 'id': '56be85543aeaaa14008c9065', 'answers': [{'text': 'singing and dancing', 'answer_start': 207}], 'is_impossible': False}, {'question': \"When did Beyonce leave Destiny's Child and become a solo singer?\", 'id': '56be85543aeaaa14008c9066', 'answers': [{'text': '2003', 'answer_start': 526}], 'is_impossible': False}, {'question': 'In what city and state did Beyonce grow up? ', 'id': '56bf6b0f3aeaaa14008c9601', 'answers': [{'text': 'Houston, Texas', 'answer_start': 166}], 'is_impossible': False}, {'question': 'In which decade did Beyonce become famous?', 'id': '56bf6b0f3aeaaa14008c9602', 'answers': [{'text': 'late 1990s', 'answer_start': 276}], 'is_impossible': False}, {'question': 'In what R&B group was she the lead singer?', 'id': '56bf6b0f3aeaaa14008c9603', 'answers': [{'text': \"Destiny's Child\", 'answer_start': 320}], 'is_impossible': False}, {'question': 'What album made her a worldwide known artist?', 'id': '56bf6b0f3aeaaa14008c9604', 'answers': [{'text': 'Dangerously in Love', 'answer_start': 505}], 'is_impossible': False}, {'question': \"Who managed the Destiny's Child group?\", 'id': '56bf6b0f3aeaaa14008c9605', 'answers': [{'text': 'Mathew Knowles', 'answer_start': 360}], 'is_impossible': False}, {'question': 'When did Beyoncé rise to fame?', 'id': '56d43c5f2ccc5a1400d830a9', 'answers': [{'text': 'late 1990s', 'answer_start': 276}], 'is_impossible': False}, 
{'question': \"What role did Beyoncé have in Destiny's Child?\", 'id': '56d43c5f2ccc5a1400d830aa', 'answers': [{'text': 'lead singer', 'answer_start': 290}], 'is_impossible': False}, {'question': 'What was the first album Beyoncé released as a solo artist?', 'id': '56d43c5f2ccc5a1400d830ab', 'answers': [{'text': 'Dangerously in Love', 'answer_start': 505}], 'is_impossible': False}, {'question': 'When did Beyoncé release Dangerously in Love?', 'id': '56d43c5f2ccc5a1400d830ac', 'answers': [{'text': '2003', 'answer_start': 526}], 'is_impossible': False}, {'question': 'How many Grammy awards did Beyoncé win for her first solo album?', 'id': '56d43c5f2ccc5a1400d830ad', 'answers': [{'text': 'five', 'answer_start': 590}], 'is_impossible': False}, {'question': \"What was Beyoncé's role in Destiny's Child?\", 'id': '56d43ce42ccc5a1400d830b4', 'answers': [{'text': 'lead singer', 'answer_start': 290}], 'is_impossible': False}, {'question': \"What was the name of Beyoncé's first solo album?\", 'id': '56d43ce42ccc5a1400d830b5', 'answers': [{'text': 'Dangerously in Love', 'answer_start': 505}], 'is_impossible': False}], 'context': 'Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an American singer, songwriter, record producer and actress. Born and raised in Houston, Texas, she performed in various singing and dancing competitions as a child, and rose to fame in the late 1990s as lead singer of R&B girl-group Destiny\\'s Child. Managed by her father, Mathew Knowles, the group became one of the world\\'s best-selling girl groups of all time. 
Their hiatus saw the release of Beyoncé\\'s debut album, Dangerously in Love (2003), which established her as a solo artist worldwide, earned five Grammy Awards and featured the Billboard Hot 100 number-one singles \"Crazy in Love\" and \"Baby Boy\".'}\n" + ] + } + ], + "source": [ + "example_article = example_jsons[0]\n", + "example_article\n", + "\n", + "print(\"Title: \" + example_article[\"title\"])\n", + "print(example_article[\"paragraphs\"][0])" + ] + }, + { + "cell_type": "markdown", + "id": "2982be57", + "metadata": {}, + "source": [ + "The previous article might be difficult to navigate so here is a nicely formatted example paragraph:\n", + "```python\n", + "{\n", + " \"context\": \"Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an American singer, songwriter, record producer and actress. Born and raised in Houston, Texas, she performed in various singing and dancing competitions as a child, and rose to fame in the late 1990s as lead singer of R&B girl-group Destiny's Child. Managed by her father, Mathew Knowles, the group became one of the world's best-selling girl groups of all time. 
Their hiatus saw the release of Beyoncé's debut album, Dangerously in Love (2003), which established her as a solo artist worldwide, earned five Grammy Awards and featured the Billboard Hot 100 number-one singles 'Crazy in Love' and 'Baby Boy'\",\n", + " \"qas\": [\n", + " {\n", + " \"question\": \"When did Beyonce start becoming popular?\",\n", + " \"id\": \"56be85543aeaaa14008c9063\",\n", + " \"answers\": [\n", + " {\n", + " \"text\": \"in the late 1990s\",\n", + " \"answer_start\": 269\n", + " }\n", + " ],\n", + " \"is_impossible\": false\n", + " },\n", + " {\n", + " \"question\": \"What areas did Beyonce compete in when she was growing up?\",\n", + " \"id\": \"56be85543aeaaa14008c9065\",\n", + " \"answers\": [\n", + " {\n", + " \"text\": \"singing and dancing\",\n", + " \"answer_start\": 207\n", + " }\n", + " ],\n", + " \"is_impossible\": false\n", + " }\n", + " ]\n", + "}\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "b3345571", + "metadata": {}, + "source": [ + "\n", + "### 3.1 - Creating a list of paired question and answers \n", + "\n", + "You are tasked with generating input/output pairs for a Question Answering (QA) model using the SQuAD 2.0 dataset. Each pair follows the structure:\n", + "\n", + "- inputs: `question: context:

`\n", + "- targets: `answer: `\n", + " \n", + "Here, `` represents the question in the context of the given paragraph `

`, and `` is a possible answer.\n", + "\n", + "In this notebook, we will focus on a single answer per question. However, it's essential to note that the dataset contains questions with multiple answers. When training a model in real-life scenarios, consider including all available information.\n", + "\n", + "\n", + "### Exercise 2 - Parse the SQuAD 2.0 Dataset\n", + "\n", + "Your task is to implement the `parse_squad` function, which iterates over all the articles, paragraphs, and questions in the SQuAD dataset. Extract pairs of inputs and targets for the QA model using the provided code template.\n", + "- Start with two empty lists: `inputs` and `targets`.\n", + "- Loop over all the articles in the dataset.\n", + "- For each article, loop over each paragraph.\n", + "- Extract the context from the paragraph.\n", + "- Loop over each question in the given paragraph.\n", + "- Check if the question is not impossible and has at least one answer.\n", + "- If the above condition is met, create the `question_context` sequence as described in the input structure.\n", + "- Create the `answer` sequence using the first answer from the available answers.\n", + "- Append the `question_context` to the `inputs` list.\n", + "- Append the `answer` to the `targets` list." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 29, + "id": "b5344f35", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: parse_squad\n", + "def parse_squad(dataset):\n", + " \"\"\"Extract all the answers/questions pairs from the SQuAD dataset\n", + "\n", + " Args:\n", + " dataset (dict): The imported JSON dataset\n", + "\n", + " Returns:\n", + " inputs, targets: Two lists containing the inputs and the targets for the QA model\n", + " \"\"\"\n", + "\n", + " inputs, targets = [], []\n", + "\n", + " ### START CODE HERE ###\n", + " \n", + " # Loop over all the articles\n", + " for article in dataset:\n", + " \n", + " # Loop over each paragraph of each article\n", + " for paragraph in article[\"paragraphs\"]:\n", + " \n", + " # Extract context from the paragraph\n", + " context = paragraph[\"context\"]\n", + " \n", + " #Loop over each question of the given paragraph\n", + " for qa in paragraph[\"qas\"]:\n", + " \n", + " # If this question is not impossible and there is at least one answer\n", + " if len(qa['answers']) > 0 and not(qa['is_impossible']):\n", + " \n", + " # Create the question/context sequence\n", + " question_context = 'question: ' + qa[\"question\"] + ' context: ' + context\n", + " \n", + " # Create the answer sequence. 
Use the text field of the first answer\n", + " answer = 'answer: ' + qa[\"answers\"][0][\"text\"]\n", + " \n", + " # Add the question_context to the inputs list\n", + " inputs.append(question_context)\n", + " \n", + " # Add the answer to the targets list\n", + " targets.append(answer)\n", + " \n", + " ### END CODE HERE ###\n", + " \n", + " return inputs, targets" + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "id": "6744c424", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Number of question/answer pairs: 86821\n", + "\n", + "First Q/A pair:\n", + "\n", + "inputs: \u001b[34mquestion: When did Beyonce start becoming popular? context: Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an American singer, songwriter, record producer and actress. Born and raised in Houston, Texas, she performed in various singing and dancing competitions as a child, and rose to fame in the late 1990s as lead singer of R&B girl-group Destiny's Child. Managed by her father, Mathew Knowles, the group became one of the world's best-selling girl groups of all time. Their hiatus saw the release of Beyoncé's debut album, Dangerously in Love (2003), which established her as a solo artist worldwide, earned five Grammy Awards and featured the Billboard Hot 100 number-one singles \"Crazy in Love\" and \"Baby Boy\".\u001b[0m\n", + "\n", + "targets: \u001b[32manswer: in the late 1990s\u001b[0m\n", + "\n", + "Last Q/A pair:\n", + "\n", + "inputs: \u001b[34mquestion: What is KMC an initialism of? context: Kathmandu Metropolitan City (KMC), in order to promote international relations has established an International Relations Secretariat (IRC). KMC's first international relationship was established in 1975 with the city of Eugene, Oregon, United States. 
This activity has been further enhanced by establishing formal relationships with 8 other cities: Motsumoto City of Japan, Rochester of the USA, Yangon (formerly Rangoon) of Myanmar, Xi'an of the People's Republic of China, Minsk of Belarus, and Pyongyang of the Democratic Republic of Korea. KMC's constant endeavor is to enhance its interaction with SAARC countries, other International agencies and many other major cities of the world to achieve better urban management and developmental programs for Kathmandu.\u001b[0m\n", + "\n", + "targets: \u001b[32manswer: Kathmandu Metropolitan City\u001b[0m\n" + ] + } + ], + "source": [ + "inputs, targets = parse_squad(example_jsons) \n", + "print(\"Number of question/answer pairs: \" + str(len(inputs)))\n", + "\n", + "print('\\nFirst Q/A pair:\\n\\ninputs: ' + colored(inputs[0], 'blue'))\n", + "print('\\ntargets: ' + colored(targets[0], 'green'))\n", + "print('\\nLast Q/A pair:\\n\\ninputs: ' + colored(inputs[-1], 'blue'))\n", + "print('\\ntargets: ' + colored(targets[-1], 'green'))" + ] + }, + { + "cell_type": "markdown", + "id": "e2b164c2", + "metadata": {}, + "source": [ + "#### **Expected Output:**\n", + "```\n", + "Number of question/answer pairs: 86821\n", + "\n", + "First Q/A pair:\n", + "\n", + "inputs: question: When did Beyonce start becoming popular? context: Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an American singer, songwriter, record producer and actress. Born and raised in Houston, Texas, she performed in various singing and dancing competitions as a child, and rose to fame in the late 1990s as lead singer of R&B girl-group Destiny's Child. Managed by her father, Mathew Knowles, the group became one of the world's best-selling girl groups of all time. 
Their hiatus saw the release of Beyoncé's debut album, Dangerously in Love (2003), which established her as a solo artist worldwide, earned five Grammy Awards and featured the Billboard Hot 100 number-one singles \"Crazy in Love\" and \"Baby Boy\".\n", + "\n", + "targets: answer: in the late 1990s\n", + "\n", + "Last Q/A pair:\n", + "\n", + "inputs: question: What is KMC an initialism of? context: Kathmandu Metropolitan City (KMC), in order to promote international relations has established an International Relations Secretariat (IRC). KMC's first international relationship was established in 1975 with the city of Eugene, Oregon, United States. This activity has been further enhanced by establishing formal relationships with 8 other cities: Motsumoto City of Japan, Rochester of the USA, Yangon (formerly Rangoon) of Myanmar, Xi'an of the People's Republic of China, Minsk of Belarus, and Pyongyang of the Democratic Republic of Korea. KMC's constant endeavor is to enhance its interaction with SAARC countries, other International agencies and many other major cities of the world to achieve better urban management and developmental programs for Kathmandu.\n", + "\n", + "targets: answer: Kathmandu Metropolitan City\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "id": "f197bb69", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed\n" + ] + } + ], + "source": [ + "# UNIT TEST\n", + "w3_unittest.test_parse_squad(parse_squad)" + ] + }, + { + "cell_type": "markdown", + "id": "d1f69b24", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "You will use 40000 samples for training and 5000 samples for testing" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "id": "947354ad", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": 
"" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "# 40K pairs for training\n", + "inputs_train = inputs[0:40000] \n", + "targets_train = targets[0:40000] \n", + "\n", + "# 5K pairs for testing\n", + "inputs_test = inputs[40000:45000] \n", + "targets_test = targets[40000:45000] " + ] + }, + { + "cell_type": "markdown", + "id": "1c21fd31", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "Now, you can create the batch dataset of padded sequences. You will first tokenize the inputs and the targets. Then, using the function `tf.keras.preprocessing.sequence.pad_sequences`, you will ensure that the inputs and the outputs have the required lengths. Remember that the sequences longer than the required size will be truncated and the shorter ones will be padded with `0`. This setup is very similar to the other one used in this and the previous notebook." + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "id": "83393c74", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# Limit the size of the input and output data so this can run in this environment\n", + "encoder_maxlen = 150\n", + "decoder_maxlen = 50\n", + "\n", + "inputs_str = [tokenizer.tokenize(s) for s in inputs_train]\n", + "targets_str = [tf.concat([tokenizer.tokenize(s), [1]], 0) for s in targets_train]\n", + "\n", + "inputs = tf.keras.preprocessing.sequence.pad_sequences(inputs_str, maxlen=encoder_maxlen, padding='post', truncating='post')\n", + "targets = tf.keras.preprocessing.sequence.pad_sequences(targets_str, maxlen=decoder_maxlen, padding='post', truncating='post')\n", + "\n", + "inputs = tf.cast(inputs, dtype=tf.int32)\n", + "targets = tf.cast(targets, dtype=tf.int32)\n", + "\n", + "# Create the final training dataset.\n", + "BUFFER_SIZE = 10000\n", + "BATCH_SIZE = 64\n", + "dataset = tf.data.Dataset.from_tensor_slices((inputs, targets)).shuffle(BUFFER_SIZE).batch(BATCH_SIZE)" + ] + 
}, + { + "cell_type": "markdown", + "id": "df82c8a9", + "metadata": {}, + "source": [ + "\n", + "### 3.2 - Fine-tune the T5 model\n", + "\n", + "Now, you will train the model for 2 epochs. In the T5 model, all the weights are adjusted during fine-tuning. As usual, fine-tuning this model to get state-of-the-art results would require more time and resources than are available in this environment, but you are welcome to train the model for more epochs and with more data using Colab GPUs." + ] + }, + { + "cell_type": "code", + "execution_count": 34, + "id": "aaaba558", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1, Loss 5.993725\n", + "Time taken for one epoch: 72.94187331199646 sec\n", + "Epoch 2, Loss 5.356225\n", + "Time taken for one epoch: 33.04398775100708 sec\n" + ] + } + ], + "source": [ + "# Define the number of epochs\n", + "epochs = 2\n", + "losses = []\n", + "\n", + "# Training loop\n", + "for epoch in range(epochs):\n", + " \n", + " start = time.time()\n", + " train_loss.reset_states()\n", + " number_of_batches=len(list(enumerate(dataset)))\n", + "\n", + " for (batch, (inp, tar)) in enumerate(dataset):\n", + " print(f'Epoch {epoch+1}, Batch {batch+1}/{number_of_batches}', end='\\r')\n", + " transformer_utils.train_step(inp, tar, transformer, loss_object, optimizer, train_loss)\n", + " \n", + " print (f'Epoch {epoch+1}, Loss {train_loss.result():.4f}')\n", + " losses.append(train_loss.result())\n", + " \n", + " print (f'Time taken for one epoch: {time.time() - start} sec')\n", + " #if epoch % 15 == 0:\n", + " #transformer.save_weights('./pretrained_models/model_qa_temp')\n", + "# Save the final model\n", + "#transformer.save_weights('./pretrained_models/model_qa_temp')" + ] + }, + { + "cell_type": "markdown", + "id": "23b8dc0c", + "metadata": {}, + "source": [ + "To get a model that works properly, you would need to train for about 
100 epochs. So, we have pretrained a model for you. Just load the weights into the current model and use it to answer questions." + ] + }, + { + "cell_type": "code", + "execution_count": 35, + "id": "144e769b", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# Restore the weights\n", + "transformer.load_weights('./pretrained_models/model_qa3')" + ] + }, + { + "cell_type": "markdown", + "id": "360e09fd", + "metadata": {}, + "source": [ + "\n", + "### 3.3 - Implement your Question Answering model\n", + "\n", + "In this final step, you will implement the `answer_question` function, utilizing a pre-trained transformer model for question answering.\n", + "\n", + "To help you out, the `transformer_utils.next_word` function is provided. This function receives the question and the beginning of the answer (both in tensor format) alongside the model, and predicts the next token in the answer. The next cell shows how to use this:" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "id": "92b40de0", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Predicted next word is: 'blue'\n", + "Answer so far: 'answer: blue'\n" + ] + } + ], + "source": [ + "# Define an example question\n", + "example_question = \"question: What color is the sky? 
context: Sky is blue\"\n", + "\n", + "# Question is tokenized and padded\n", + "# Note that this is hardcoded here but you must implement this in the upcoming exercise\n", + "tokenized_padded_question = tf.constant([[822, 10, 363, 945, 19, 8, 5796, 58, 2625, 10, 5643, 19, 1692, 0, 0]])\n", + "\n", + "# All answers begin with the string \"answer: \"\n", + "# Feel free to check that this is indeed the tokenized version of that string\n", + "tokenized_answer = tf.constant([[1525, 10]])\n", + "\n", + "# Predict the next word using the transformer_utils.next_word function\n", + "# Notice that it expects the question, answer and model (in that order)\n", + "next_word = transformer_utils.next_word(tokenized_padded_question, tokenized_answer, transformer)\n", + "\n", + "print(f\"Predicted next word is: '{tokenizer.detokenize(next_word).numpy()[0].decode('utf-8')}'\")\n", + "\n", + "# Concatenate predicted word with answer so far\n", + "answer_so_far = tf.concat([tokenized_answer, next_word], axis=-1)\n", + "\n", + "print(f\"Answer so far: '{tokenizer.detokenize(answer_so_far).numpy()[0].decode('utf-8')}'\")" + ] + }, + { + "cell_type": "markdown", + "id": "8e23a6be", + "metadata": {}, + "source": [ + "\n", + "### Exercise 3 - Implement the question answering function\n", + "\n", + "Implement the `answer_question` function. Here are the steps:\n", + "- **Question Setup:**\n", + "\n", + " - Tokenize the given question using the provided tokenizer.\n", + " - Add an extra dimension to the tensor for compatibility.\n", + " - Pad the question tensor using `pad_sequences` to ensure the sequence has the specified max length. 
This function will truncate the sequence if it is larger or pad with zeros if it is shorter.\n", + "- **Answer Setup:**\n", + " - Tokenize the initial answer, noting that all answers begin with the string \"answer: \".\n", + " - Add an extra dimension to the tensor for compatibility.\n", + " - Get the id of the `EOS` token, typically represented by 1.\n", + "- **Generate Answer:**\n", + " - Loop for `decoder_maxlen` iterations.\n", + " - Use the `transformer_utils.next_word` function, which predicts the next token in the answer using the model, input document, and the current state of the output.\n", + " - Concatenate the predicted next word to the output tensor.\n", + "- **Stop Condition:**\n", + " - The text generation stops if the model predicts the `EOS` token.\n", + " - If the `EOS` token is predicted, break out of the loop." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "91def253", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: answer_question\n", + "def answer_question(question, model, tokenizer, encoder_maxlen=150, decoder_maxlen=50):\n", + " \"\"\"\n", + " A function for question answering using the transformer model\n", + " Arguments:\n", + " question (tf.Tensor): Input data with question and context\n", + " model (tf.keras.Model): The transformer model\n", + " tokenizer (function): The SentencePiece tokenizer\n", + " encoder_maxlen (int): Max length of the encoded sequence\n", + " decoder_maxlen (int): Max length of the decoded sequence\n", + " Returns:\n", + " _ (tf.Tensor): The tokenized answer to the question\n", + " \"\"\"\n", + " \n", + " ### START CODE HERE ###\n", + " \n", + " # QUESTION SETUP\n", + " \n", + " # Tokenize the question\n", + " tokenized_question = tokenizer.tokenize(question)\n", + " \n", + " # Add an extra dimension to the tensor\n", + " tokenized_question = tf.expand_dims(tokenized_question, 0) \n", + " \n", + " # Pad the question tensor\n", 
+ " padded_question = tf.keras.preprocessing.sequence.pad_sequences(tokenized_question,\n", + " maxlen=encoder_maxlen,\n", + " padding='post', \n", + " truncating='post') \n", + " # ANSWER SETUP\n", + " \n", + " # Tokenize the answer\n", + " # Hint: All answers begin with the string \"answer: \"\n", + " tokenized_answer = tokenizer.tokenize(answer)\n", + " \n", + " # Add an extra dimension to the tensor\n", + " tokenized_answer = tf.expand_dims(None, None)\n", + " \n", + " # Get the id of the EOS token\n", + " eos = tokenizer.string_to_id(\"\") \n", + " \n", + " # Loop for decoder_maxlen iterations\n", + " for i in range(None):\n", + " \n", + " # Predict the next word using the model, the input document and the current state of output\n", + " next_word = transformer_utils.next_word(None, None, None)\n", + " \n", + " # Concat the predicted next word to the output \n", + " tokenized_answer = tf.concat([None, None], axis=1)\n", + " \n", + " # The text generation stops if the model predicts the EOS token\n", + " if None == None:\n", + " break \n", + " \n", + " ### END CODE HERE ###\n", + "\n", + " return tokenized_answer " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c8f1e5a4", + "metadata": {}, + "outputs": [], + "source": [] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c8abd19c", + "metadata": {}, + "outputs": [], + "source": [] + }, + { + "cell_type": "markdown", + "id": "de501d8c", + "metadata": {}, + "source": [ + "Let's test the model with some question from the training dataset. Check if the answers match the correct one." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "163e79eb", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "idx = 10408\n", + "\n", + "result = answer_question(inputs_train[idx], transformer, tokenizer)\n", + "print(colored(pretty_decode(result, sentinels, tokenizer).numpy()[0], 'blue'))\n", + "print()\n", + "print(inputs_train[idx])\n", + "print(colored(targets_train[idx], 'green'))" + ] + }, + { + "cell_type": "markdown", + "id": "2fae9e8d", + "metadata": {}, + "source": [ + "#### **Expected Output:**\n", + "```\n", + "b'answer: January 9, 1957'\n", + "\n", + "question: When was the Chechen-Ingush Autonomous Soviet Socialist Republic transferred from the Georgian SSR? context: On January 9, 1957, Karachay Autonomous Oblast and Chechen-Ingush Autonomous Soviet Socialist Republic were restored by Khrushchev and they were transferred from the Georgian SSR back to the Russian SFSR.\n", + "answer: January 9, 1957\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "19ac8067", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# UNIT TEST\n", + "w3_unittest.test_answer_question(answer_question)" + ] + }, + { + "cell_type": "markdown", + "id": "06588341", + "metadata": {}, + "source": [ + "Test the model with question 110" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6c381df3", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "idx = 110\n", + "result = answer_question(inputs_test[idx], transformer, tokenizer)\n", + "print(colored(pretty_decode(result, sentinels, tokenizer).numpy()[0], 'blue'))\n", + "print()\n", + "print(inputs_test[idx])\n", + "print(colored(targets_test[idx], 'green'))" + ] + }, + { + "cell_type": "markdown", + "id": "fd09ec41", + "metadata": {}, + "source": [ + "Test the model with question 
311. Use this cell to play with the model by selecting other test questions. Check whether the model has learned something or is just generating random text." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "fc9c898f", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "idx = 311\n", + "result = answer_question(inputs_test[idx], transformer, tokenizer)\n", + "print(colored(pretty_decode(result, sentinels, tokenizer).numpy()[0], 'blue'))\n", + "print()\n", + "print(inputs_test[idx])\n", + "print(colored(targets_test[idx], 'green'))" + ] + }, + { + "cell_type": "markdown", + "id": "0e3d00de", + "metadata": {}, + "source": [ + "Congratulations, you have finished the last assignment of this specialization. Now you know what is behind powerful models like ChatGPT, and it is time for you to find and solve the huge range of problems that can be approached with NLP." + ] + } + ], + "metadata": { + "grader_version": "1", + "jupytext": { + "encoding": "# -*- coding: utf-8 -*-" + }, + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.10" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/test_utils-checkpoint.py b/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/test_utils-checkpoint.py new file mode 100644 index 0000000000000000000000000000000000000000..dc3971ce88825edc90be6a80726ae97db305eea9 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/test_utils-checkpoint.py @@ -0,0 +1,69 @@ +import numpy as np +from termcolor import colored + +# + +from tensorflow.keras.layers 
import Embedding +from tensorflow.keras.layers import GRU +from tensorflow.keras.layers import Dense +from tensorflow.keras.layers import Dropout + +from utils import PositionalEmbedding +from dlai_grader.grading import test_case, object_to_grade +from types import ModuleType, FunctionType + +# Compare the two inputs + +def comparator(learner, instructor, modelId): + cases = [] + t = test_case() + if len(learner) != len(instructor): + t.failed = True + t.msg = f"{modelId}: The number of layers in the proposed model does not agree with the expected model" + t.want = len(instructor) + t.got = len(learner) + cases.append(t) + index_layer = 1 + + for a, b in zip(learner, instructor): + t = test_case() + if tuple(a) != tuple(b): + t.failed = True + t.msg = f"{modelId}: Test failed in layer {index_layer}" + t.want = b + t.got = a + cases.append(t) + index_layer = index_layer + 1 + return cases + + +def summary(model): + result = [] + for layer in model.layers: + descriptors = [layer.__class__.__name__, + layer.output_shape, layer.count_params()] + if (type(layer) == Dense): + descriptors.append(layer.activation.__name__) + if (type(layer) == Dropout): + descriptors.append(f"rate={layer.rate}") + if (type(layer) == GRU): + descriptors.append(f"return_sequences={layer.return_sequences}") + descriptors.append(f"return_state={layer.return_state}") + if (type(layer) == PositionalEmbedding): + descriptors.append(f"vocab_size={layer.vocab_size}") + descriptors.append(f"d_model={layer.d_model}") + descriptors.append(f"max_length={layer.max_length}") + if hasattr(layer, 'd_model'): + descriptors.append(f"d_model={layer.d_model}") + if hasattr(layer, 'd_ff'): + descriptors.append(f"d_ff={layer.d_ff}") + if hasattr(layer, 'n_heads'): + descriptors.append(f"n_heads={layer.n_heads}") + if hasattr(layer, 'dropout'): + descriptors.append(f"dropout={layer.dropout}") + if hasattr(layer, 'ff_activation'): + descriptors.append(f"ff_activation={layer.ff_activation}") + + 
result.append(descriptors) + return result + + diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/transformer_utils-checkpoint.py b/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/transformer_utils-checkpoint.py new file mode 100644 index 0000000000000000000000000000000000000000..2670f2fab031be3636508ee5e7900f25e273b192 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/transformer_utils-checkpoint.py @@ -0,0 +1,598 @@ +# import os + +import numpy as np +#import pandas as pd +import tensorflow as tf +#import matplotlib.pyplot as plt +#import time +#import utils + +def positional_encoding(positions, d_model): + """ + Precomputes a matrix with all the positional encodings + + Arguments: + positions (int): Maximum number of positions to be encoded + d (int): Encoding size + + Returns: + pos_encoding (tf.Tensor): A matrix of shape (1, position, d_model) with the positional encodings + """ + + position = np.arange(positions)[:, np.newaxis] + k = np.arange(d_model)[np.newaxis, :] + i = k // 2 + + # initialize a matrix angle_rads of all the angles + angle_rates = 1 / np.power(10000, (2 * i) / np.float32(d_model)) + angle_rads = position * angle_rates + + # apply sin to even indices in the array; 2i + angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2]) + + # apply cos to odd indices in the array; 2i+1 + angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2]) + + pos_encoding = angle_rads[np.newaxis, ...] + + return tf.cast(pos_encoding, dtype=tf.float32) + +def create_padding_mask(decoder_token_ids): + """ + Creates a matrix mask for the padding cells + + Arguments: + decoder_token_ids (matrix like): matrix of size (n, m) + + Returns: + mask (tf.Tensor): binary tensor of size (n, 1, m) + """ + seq = 1 - tf.cast(tf.math.equal(decoder_token_ids, 0), tf.float32) + + # add extra dimensions to add the padding to the attention logits. 
+ # this will allow for broadcasting later when comparing sequences + return seq[:, tf.newaxis, :] + + +def create_look_ahead_mask(sequence_length): + """ + Returns a lower triangular matrix filled with ones + + Arguments: + sequence_length (int): matrix size + + Returns: + mask (tf.Tensor): binary tensor of size (1, sequence_length, sequence_length) + """ + mask = tf.linalg.band_part(tf.ones((1, sequence_length, sequence_length)), -1, 0) + return mask + +def scaled_dot_product_attention(q, k, v, mask): + """ + Calculate the attention weights. + q, k, v must have matching leading dimensions. + k, v must have matching penultimate dimension, i.e.: seq_len_k = seq_len_v. + The mask has different shapes depending on its type (padding or look-ahead) + but it must be broadcastable for addition. + + Arguments: + q (tf.Tensor): query of shape (..., seq_len_q, depth) + k (tf.Tensor): key of shape (..., seq_len_k, depth) + v (tf.Tensor): value of shape (..., seq_len_v, depth_v) + mask (tf.Tensor): mask with shape broadcastable + to (..., seq_len_q, seq_len_k). Defaults to None. + + Returns: + output (tf.Tensor): the attention output + attention_weights (tf.Tensor): the attention weights + """ + ### START CODE HERE ### + + # Multiply q and k transposed. + matmul_qk = tf.matmul(q, k, transpose_b=True) # (..., seq_len_q, seq_len_k) + + # scale matmul_qk with the square root of dk + dk = tf.cast(tf.shape(k)[-1], tf.float32) + scaled_attention_logits = matmul_qk / tf.math.sqrt(dk) + + # add the mask to the scaled tensor. + if mask is not None: # Don't replace this None + scaled_attention_logits += (1. - mask) * -1e9 + + # softmax is normalized on the last axis (seq_len_k) so that the scores add up to 1. 
+ attention_weights = tf.keras.activations.softmax(scaled_attention_logits) # (..., seq_len_q, seq_len_k) + + # Multiply the attention weights by v + output = tf.matmul(attention_weights, v) # (..., seq_len_q, depth_v) + + ### END CODE HERE ### + + return output, attention_weights + +def FullyConnected(embedding_dim, fully_connected_dim): + """ + Returns a sequential model consisting of two dense layers. The first dense layer has + fully_connected_dim neurons and is activated by relu. The second dense layer has + embedding_dim neurons and no activation. + + Arguments: + embedding_dim (int): output dimension + fully_connected_dim (int): dimension of the hidden layer + + Returns: + _ (tf.keras.Model): sequential model + """ + return tf.keras.Sequential([ + tf.keras.layers.Dense(fully_connected_dim, activation='relu'), # (batch_size, seq_len, dff) + tf.keras.layers.Dense(embedding_dim) # (batch_size, seq_len, d_model) + ]) + +# GRADED FUNCTION EncoderLayer +class EncoderLayer(tf.keras.layers.Layer): + """ + The encoder layer is composed of a multi-head self-attention mechanism, + followed by a simple, position-wise fully connected feed-forward network. + This architecture includes a residual connection around each of the two + sub-layers, followed by layer normalization. 
+ """ + def __init__(self, embedding_dim, num_heads, fully_connected_dim, + dropout_rate=0.1, layernorm_eps=1e-6): + + super(EncoderLayer, self).__init__() + + self.mha = tf.keras.layers.MultiHeadAttention( + num_heads=num_heads, + key_dim=embedding_dim, + dropout=dropout_rate + ) + + self.ffn = FullyConnected( + embedding_dim=embedding_dim, + fully_connected_dim=fully_connected_dim + ) + + self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps) + self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps) + + self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate) + + def call(self, x, training, mask): + """ + Forward pass for the Encoder Layer + + Arguments: + x (tf.Tensor): Tensor of shape (batch_size, input_seq_len, fully_connected_dim) + training (bool): Boolean, set to true to activate + the training mode for dropout layers + mask (tf.Tensor): Boolean mask to ensure that the padding is not + treated as part of the input + Returns: + encoder_layer_out (tf.Tensor): Tensor of shape (batch_size, input_seq_len, embedding_dim) + """ + # START CODE HERE + # calculate self-attention using mha(~1 line). 
+ # Dropout is added by Keras automatically if the dropout parameter is non-zero during training + self_mha_output = self.mha(x, x, x, mask) # Self attention (batch_size, input_seq_len, fully_connected_dim) + + # skip connection + # apply layer normalization on sum of the input and the attention output to get the + # output of the multi-head attention layer (~1 line) + skip_x_attention = self.layernorm1(x + self_mha_output) # (batch_size, input_seq_len, fully_connected_dim) + + # pass the output of the multi-head attention layer through a ffn (~1 line) + ffn_output = self.ffn(skip_x_attention) # (batch_size, input_seq_len, fully_connected_dim) + + # apply dropout layer to ffn output during training (~1 line) + # use `training=training` + ffn_output = self.dropout_ffn(ffn_output, training=training) + + # apply layer normalization on sum of the output from multi-head attention (skip connection) and ffn output to get the + # output of the encoder layer (~1 line) + encoder_layer_out = self.layernorm2(skip_x_attention + ffn_output) # (batch_size, input_seq_len, embedding_dim) + # END CODE HERE + + return encoder_layer_out + + +class Encoder(tf.keras.layers.Layer): + """ + The entire Encoder starts by passing the input to an embedding layer + and using positional encoding to then pass the output through a stack of + encoder Layers + + """ + def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, input_vocab_size, + maximum_position_encoding, dropout_rate=0.1, layernorm_eps=1e-6): + super(Encoder, self).__init__() + + self.embedding_dim = embedding_dim + self.num_layers = num_layers + + self.embedding = tf.keras.layers.Embedding(input_vocab_size, self.embedding_dim) + self.pos_encoding = positional_encoding(maximum_position_encoding, + self.embedding_dim) + + self.enc_layers = [EncoderLayer(embedding_dim=self.embedding_dim, + num_heads=num_heads, + fully_connected_dim=fully_connected_dim, + dropout_rate=dropout_rate, + layernorm_eps=layernorm_eps) + 
for _ in range(self.num_layers)] + + self.dropout = tf.keras.layers.Dropout(dropout_rate) + + def call(self, x, training, mask): + """ + Forward pass for the Encoder + + Arguments: + x (tf.Tensor): Tensor of shape (batch_size, seq_len, embedding_dim) + training (bool): Boolean, set to true to activate + the training mode for dropout layers + mask (tf.Tensor): Boolean mask to ensure that the padding is not + treated as part of the input + + Returns: + x (tf.Tensor): Tensor of shape (batch_size, seq_len, embedding_dim) + """ + seq_len = tf.shape(x)[1] + + # START CODE HERE + # Pass input through the Embedding layer + x = self.embedding(x) # (batch_size, input_seq_len, embedding_dim) + # Scale embedding by multiplying it by the square root of the embedding dimension + x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32)) + # Add the position encoding to embedding + x += self.pos_encoding[:, :seq_len, :] + # Pass the encoded embedding through a dropout layer + # use `training=training` + x = self.dropout(x, training=training) + # Pass the output through the stack of encoding layers + for i in range(self.num_layers): + x = self.enc_layers[i](x, training, mask) + # END CODE HERE + + return x # (batch_size, input_seq_len, embedding_dim) + +class DecoderLayer(tf.keras.layers.Layer): + """ + The decoder layer is composed by two multi-head attention blocks, + one that takes the new input and uses self-attention, and the other + one that combines it with the output of the encoder, followed by a + fully connected block. 
+ """ + def __init__(self, embedding_dim, num_heads, fully_connected_dim, dropout_rate=0.1, layernorm_eps=1e-6): + super(DecoderLayer, self).__init__() + + self.mha1 = tf.keras.layers.MultiHeadAttention( + num_heads=num_heads, + key_dim=embedding_dim, + dropout=dropout_rate + ) + + self.mha2 = tf.keras.layers.MultiHeadAttention( + num_heads=num_heads, + key_dim=embedding_dim, + dropout=dropout_rate + ) + + self.ffn = FullyConnected( + embedding_dim=embedding_dim, + fully_connected_dim=fully_connected_dim + ) + + self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps) + self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps) + self.layernorm3 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps) + + self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate) + + def call(self, x, enc_output, training, look_ahead_mask, padding_mask): + """ + Forward pass for the Decoder Layer + + Arguments: + x (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim) + enc_output (tf.Tensor): Tensor of shape(batch_size, input_seq_len, fully_connected_dim) + training (bool): Boolean, set to true to activate + the training mode for dropout layers + look_ahead_mask (tf.Tensor): Boolean mask for the target_input + padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer + Returns: + out3 (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim) + attn_weights_block1 (tf.Tensor): Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len) + attn_weights_block2 (tf.Tensor): Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len) + """ + + # START CODE HERE + # enc_output.shape == (batch_size, input_seq_len, fully_connected_dim) + + # BLOCK 1 + # calculate self-attention and return attention scores as attn_weights_block1. + # Dropout will be applied during training (~1 line). 
+ mult_attn_out1, attn_weights_block1 = self.mha1(x, x, x, look_ahead_mask, return_attention_scores=True) # (batch_size, target_seq_len, d_model) + + # apply layer normalization (layernorm1) to the sum of the attention output and the input (~1 line) + Q1 = self.layernorm1(mult_attn_out1 + x) + + # BLOCK 2 + # calculate cross-attention using the Q from the first block and K and V from the encoder output. + # Dropout will be applied during training + # Return attention scores as attn_weights_block2 (~1 line) + mult_attn_out2, attn_weights_block2 = self.mha2(Q1, enc_output, enc_output, padding_mask, return_attention_scores=True) # (batch_size, target_seq_len, d_model) + + # apply layer normalization (layernorm2) to the sum of the attention output and the output of the first block (~1 line) + mult_attn_out2 = self.layernorm2(mult_attn_out2 + Q1) # (batch_size, target_seq_len, fully_connected_dim) + + # BLOCK 3 + # pass the output of the second block through a ffn + ffn_output = self.ffn(mult_attn_out2) # (batch_size, target_seq_len, fully_connected_dim) + + # apply a dropout layer to the ffn output + # use `training=training` + ffn_output = self.dropout_ffn(ffn_output, training=training) + + # apply layer normalization (layernorm3) to the sum of the ffn output and the output of the second block + out3 = self.layernorm3(ffn_output + mult_attn_out2) # (batch_size, target_seq_len, fully_connected_dim) + # END CODE HERE + + return out3, attn_weights_block1, attn_weights_block2 + +# GRADED FUNCTION Decoder +class Decoder(tf.keras.layers.Layer): + """ + The entire Decoder starts by passing the target input to an embedding layer + and using positional encoding to then pass the output through a stack of + decoder layers + + """ + def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, target_vocab_size, + maximum_position_encoding, dropout_rate=0.1, layernorm_eps=1e-6): + super(Decoder, self).__init__() + + self.embedding_dim = embedding_dim + 
self.num_layers = num_layers + + self.embedding = tf.keras.layers.Embedding(target_vocab_size, self.embedding_dim) + self.pos_encoding = positional_encoding(maximum_position_encoding, self.embedding_dim) + + self.dec_layers = [DecoderLayer(embedding_dim=self.embedding_dim, + num_heads=num_heads, + fully_connected_dim=fully_connected_dim, + dropout_rate=dropout_rate, + layernorm_eps=layernorm_eps) + for _ in range(self.num_layers)] + self.dropout = tf.keras.layers.Dropout(dropout_rate) + + def call(self, x, enc_output, training, + look_ahead_mask, padding_mask): + """ + Forward pass for the Decoder + + Arguments: + x (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim) + enc_output (tf.Tensor): Tensor of shape(batch_size, input_seq_len, fully_connected_dim) + training (bool): Boolean, set to true to activate + the training mode for dropout layers + look_ahead_mask (tf.Tensor): Boolean mask for the target_input + padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer + Returns: + x (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim) + attention_weights (dict[str: tf.Tensor]): Dictionary of tensors containing all the attention weights + each of shape Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len) + """ + + seq_len = tf.shape(x)[1] + attention_weights = {} + + # START CODE HERE + # create word embeddings + x = self.embedding(x) # (batch_size, target_seq_len, fully_connected_dim) + + # scale embeddings by multiplying by the square root of their dimension + x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32)) + + # calculate positional encodings and add to word embedding + x += self.pos_encoding[:, :seq_len, :] + + # apply a dropout layer to x + # use `training=training` + x = self.dropout(x, training=training) + + # use a for loop to pass x through a stack of decoder layers and update attention_weights (~4 lines total) + for i in range(self.num_layers): + # pass x 
and the encoder output through a stack of decoder layers and save the attention weights + # of block 1 and 2 (~1 line) + x, block1, block2 = self.dec_layers[i](x, enc_output, training, + look_ahead_mask, padding_mask) + + #update attention_weights dictionary with the attention weights of block 1 and block 2 + attention_weights['decoder_layer{}_block1_self_att'.format(i+1)] = block1 + attention_weights['decoder_layer{}_block2_decenc_att'.format(i+1)] = block2 + # END CODE HERE + + # x.shape == (batch_size, target_seq_len, fully_connected_dim) + return x, attention_weights + +# + +class Transformer(tf.keras.Model): + """ + Complete transformer with an Encoder and a Decoder + """ + def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, input_vocab_size, + target_vocab_size, max_positional_encoding_input, + max_positional_encoding_target, dropout_rate=0.1, layernorm_eps=1e-6): + super(Transformer, self).__init__() + + self.encoder = Encoder(num_layers=num_layers, + embedding_dim=embedding_dim, + num_heads=num_heads, + fully_connected_dim=fully_connected_dim, + input_vocab_size=input_vocab_size, + maximum_position_encoding=max_positional_encoding_input, + dropout_rate=dropout_rate, + layernorm_eps=layernorm_eps) + + self.decoder = Decoder(num_layers=num_layers, + embedding_dim=embedding_dim, + num_heads=num_heads, + fully_connected_dim=fully_connected_dim, + target_vocab_size=target_vocab_size, + maximum_position_encoding=max_positional_encoding_target, + dropout_rate=dropout_rate, + layernorm_eps=layernorm_eps) + + self.final_layer = tf.keras.layers.Dense(target_vocab_size, activation='softmax') + + def call(self, input_sentence, output_sentence, training, enc_padding_mask, look_ahead_mask, dec_padding_mask): + """ + Forward pass for the entire Transformer + Arguments: + input_sentence (tf.Tensor): Tensor of shape (batch_size, input_seq_len, fully_connected_dim) + An array of the indexes of the words in the input sentence + output_sentence 
(tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim) + An array of the indexes of the words in the output sentence + training (bool): Boolean, set to true to activate + the training mode for dropout layers + enc_padding_mask (tf.Tensor): Boolean mask to ensure that the padding is not + treated as part of the input + look_ahead_mask (tf.Tensor): Boolean mask for the target_input + dec_padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer + Returns: + final_output (tf.Tensor): The final output of the model + attention_weights (dict[str: tf.Tensor]): Dictionary of tensors containing all the attention weights for the decoder + each of shape Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len) + + """ + # START CODE HERE + # call self.encoder with the appropriate arguments to get the encoder output + enc_output = self.encoder(input_sentence, training, enc_padding_mask) # (batch_size, inp_seq_len, fully_connected_dim) + + # call self.decoder with the appropriate arguments to get the decoder output + # dec_output.shape == (batch_size, tar_seq_len, fully_connected_dim) + dec_output, attention_weights = self.decoder( + output_sentence, enc_output, training, look_ahead_mask, dec_padding_mask) + + # pass decoder output through a linear layer and softmax (~2 lines) + final_output = self.final_layer(dec_output) # (batch_size, tar_seq_len, target_vocab_size) + # END CODE HERE + + return final_output, attention_weights + +class CustomSchedule(tf.keras.optimizers.schedules.LearningRateSchedule): + def __init__(self, d_model, warmup_steps=1000): + super(CustomSchedule, self).__init__() + self.d_model = tf.cast(d_model, dtype=tf.float32) + self.warmup_steps = warmup_steps + + def __call__(self, step): + step = tf.cast(step, dtype=tf.float32) + arg1 = tf.math.rsqrt(step) + arg2 = step * (self.warmup_steps ** -1.5) + + return tf.math.rsqrt(self.d_model) * tf.math.minimum(arg1, arg2) + +def masked_loss(real, pred, 
loss_object): + mask = tf.math.logical_not(tf.math.equal(real, 0)) + loss_ = loss_object(real, pred) + + mask = tf.cast(mask, dtype=loss_.dtype) + loss_ *= mask + + return tf.reduce_sum(loss_)/tf.reduce_sum(mask) + +@tf.function +def train_step(inp, tar, model, loss_object, optimizer, train_loss): + """ + One training step for the transformer + Arguments: + inp (tf.Tensor): Input data (question and context) + tar (tf.Tensor): Target (answer) + Returns: + None + """ + tar_inp = tar[:, :-1] + tar_real = tar[:, 1:] + + # Create masks + enc_padding_mask = create_padding_mask(inp) + look_ahead_mask = create_look_ahead_mask(tf.shape(tar_inp)[1]) + + with tf.GradientTape() as tape: + predictions, _ = model( + inp, + tar_inp, + True, + enc_padding_mask, + look_ahead_mask, + enc_padding_mask + ) + loss = masked_loss(tar_real, predictions, loss_object) + + gradients = tape.gradient(loss, model.trainable_variables) + optimizer.apply_gradients(zip(gradients, model.trainable_variables)) + + train_loss(loss) + +@tf.function +def create_padding_mask(decoder_token_ids): + """ + Creates a matrix mask for the padding cells + + Arguments: + decoder_token_ids (matrix like): matrix of size (n, m) + + Returns: + mask (tf.Tensor): binary tensor of size (n, 1, m) + """ + seq = 1 - tf.cast(tf.math.equal(decoder_token_ids, 0), tf.float32) + + # add extra dimensions to add the padding to the attention logits. + # this will allow for broadcasting later when comparing sequences + return seq[:, tf.newaxis, :] + +@tf.function +def create_look_ahead_mask(sequence_length): + """ + Returns a lower triangular matrix filled with ones + + Arguments: + sequence_length (int): matrix size + + Returns: + mask (tf.Tensor): binary tensor of size (1, sequence_length, sequence_length) + """ + mask = tf.linalg.band_part(tf.ones((1, sequence_length, sequence_length)), -1, 0) + return mask + +def next_word(encoder_input, output, model): + """ + Helper function that uses the model to predict just the next word. 
+ Arguments: + encoder_input (tf.Tensor): Input question + output (tf.Tensor): Current state of the answer + Returns: + predicted_id (tf.Tensor): The id of the predicted word + """ + # Create a padding mask for the input + enc_padding_mask = create_padding_mask(encoder_input) + # Create a look-ahead mask for the output + look_ahead_mask = create_look_ahead_mask(tf.shape(output)[1]) + # Run the prediction of the next word with the transformer model + predictions, attention_weights = model( + encoder_input, + output, + False, + enc_padding_mask, + look_ahead_mask, + enc_padding_mask + ) + + predictions = predictions[: ,-1:, :] + predicted_id = tf.cast(tf.argmax(predictions, axis=-1), tf.int32) + + return predicted_id + + +# - + + diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/utils-checkpoint.py b/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/utils-checkpoint.py new file mode 100644 index 0000000000000000000000000000000000000000..91c6b862b1a24de02289ef17bd4b04266bcf0405 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/utils-checkpoint.py @@ -0,0 +1,51 @@ +import tensorflow as tf +import numpy as np + +def positional_encoding(length, depth): + depth = depth/2 + + positions = np.arange(length)[:, np.newaxis] # (seq, 1) + depths = np.arange(depth)[np.newaxis, :]/depth # (1, depth) + + angle_rates = 1 / (10000**depths) # (1, depth) + angle_rads = positions * angle_rates # (pos, depth) + + pos_encoding = np.concatenate( + [np.sin(angle_rads), np.cos(angle_rads)], + axis=-1) + + return tf.cast(pos_encoding, dtype=tf.float32) + +class PositionalEmbedding(tf.keras.layers.Layer): + def __init__(self, vocab_size, d_model, max_length=2048): + super().__init__() + self.vocab_size = vocab_size + self.d_model = d_model + self.max_length = max_length + self.embedding = tf.keras.layers.Embedding(self.vocab_size, self.d_model, mask_zero=True) + self.pos_encoding = 
positional_encoding(length=self.max_length, depth=self.d_model) + + def compute_mask(self, *args, **kwargs): + return self.embedding.compute_mask(*args, **kwargs) + + def call(self, x): + length = tf.shape(x)[1] + x = self.embedding(x) + # This factor sets the relative scale of the embedding and positional_encoding. + x *= tf.math.sqrt(tf.cast(self.d_model, tf.float32)) + x = x + self.pos_encoding[tf.newaxis, :length, :] + return x + +# Dummy encoder block. Will be replaced by the learner implementation +class EncoderBlock(tf.keras.Model): + def __init__(self, d_model, d_ff, n_heads, dropout, ff_activation='relu'): + super().__init__() + self.d_model = d_model + self.d_ff = d_ff + self.n_heads = n_heads + self.dropout = dropout + self.ff_activation = ff_activation + + def call(self, inputs, states=None, return_state=False, training=False): + x = inputs + return x \ No newline at end of file diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/w3_unittest-checkpoint.py b/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/w3_unittest-checkpoint.py new file mode 100644 index 0000000000000000000000000000000000000000..7fe546f6371d26e3f0116756aca0b8da9361f88d --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/.ipynb_checkpoints/w3_unittest-checkpoint.py @@ -0,0 +1,355 @@ +import sys +import itertools +import numpy as np +import traceback +import test_utils +from utils import EncoderBlock + +import tensorflow as tf +from tensorflow.keras.layers import MultiHeadAttention, ReLU, Attention, LayerNormalization, Input +import tensorflow_text as tf_text + +from dlai_grader.grading import test_case, object_to_grade +from types import ModuleType, FunctionType +import transformer_utils + +def testing_rnd(): + def dummy_generator(): + vals = np.linspace(0, 1, 10) + cyclic_vals = itertools.cycle(vals) + for _ in range(100): + yield next(cyclic_vals) + + dumr = itertools.cycle(dummy_generator()) + + def dummy_randomizer(): + return
next(dumr) + + return dummy_randomizer + +f = open("./models/sentencepiece.model", "rb") +tokenizer = tf_text.SentencepieceTokenizer(f.read(), out_type=tf.int32) + +# + +def test_tokenize_and_mask(target): + + t = test_case() + if not isinstance(target, FunctionType): + t.failed = True + t.msg = "target has incorrect type" + t.want = FunctionType + t.got = type(target) + return [t] + + text1 = b"Beginners BBQ Class Taking Place in Missoula!" + text2 = b"Foil plaid lycra and spandex shortall with metallic slinky insets." + text3 = 'Beginners BBQ Class Taking Place in Missoula!\nDo you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers.' + + + test_cases = [{"text": text1, "noise": 0, + "expected_output":([12847, 277, 15068, 4501, 3, 12297, 3399, 16, 5964, 7115, 9, 55], [1])}, + {"text": text1, "noise": 0.2, + "expected_output": ([31999, 15068, 4501, 3, 12297, 3399, 16, 5964, 7115, 31998], [31999, 12847, 277, 31998, 9, 55, 1])}, + {"text": text1, "noise": 0.5, + "expected_output": ([31999, 12297, 3399, 16, 5964, 7115, 31998], [31999, 12847, 277, 15068, 4501, 3, 31998, 9, 55, 1])}, + {"text": text2, "noise": 0.1, + "expected_output": ([31999, 173, 30772, 3, 120, 2935, 11, 8438, 26, 994, 31998, 1748, 28, 18813, 3, 7, 4907, 63, 16, 2244, 31997, 5], [31999, 4452, 31998, 710, 31997, 7, 1])}, + {"text": text2, "noise": 0.2, + "expected_output": ([31999, 30772, 3, 120, 2935, 11, 8438, 26, 994, 31998, 28, 18813, 3, 7, 4907, 63, 16, 2244, 31997], [31999, 4452, 173, 31998, 710, 1748, 31997, 7, 5, 1])}, + {"text": text2, "noise": 1.0, + "expected_output": ([31999, 994, 31998, 2244, 31997], [31999, 4452, 173, 30772, 3, 120, 2935, 11, 8438, 26, 31998, 710, 1748, 28, 18813, 3, 7, 4907, 63, 16, 31997, 7, 5, 1])}, + {"text": text3, "noise": 0.15, + "expected_output": ([31999, 15068, 4501, 3, 12297, 3399, 16, 5964, 7115, 31998, 531, 25, 
241, 12, 129, 394, 44, 492, 31997, 58, 148, 56, 43, 8, 1004, 6, 474, 31996, 39, 4793, 230, 5, 2721, 6, 1600, 1630, 31995, 1150, 4501, 15068, 16127, 6, 9137, 2659, 5595, 31994, 782, 3624, 14627, 15, 12612, 277, 5], [31999, 12847, 277, 31998, 9, 55, 31997, 3326, 15068, 31996, 48, 30, 31995, 727, 1715, 31994, 45, 301, 1])} + ] + cases = [] + + for case in test_cases: + output = target(case.get('text'), + noise=case.get('noise'), + randomizer=testing_rnd(), + tokenizer=tokenizer) + t = test_case() #inps, targs + if not isinstance(output[0], list): + t.failed = True + t.msg = "Wrong type. inps expected to be a list" + t.want = list + t.got = type(output[0]) + cases.append(t) + + t = test_case() + if not isinstance(output[1], list): + t.failed = True + t.msg = "Wrong type. targs expected to be a list" + t.want = list + t.got = type(output[1]) + cases.append(t) + + t = test_case() + if len(case.get('expected_output')[0]) != len(output[0]): + t.failed = True + t.msg = "Wrong length for inps" + t.want = len(case.get('expected_output')[0]) + t.got = len(output[0]) + cases.append(t) + + t = test_case() + if len(case.get('expected_output')[1]) != len(output[1]): + t.failed = True + t.msg = "Wrong length for targs" + t.want = len(case.get('expected_output')[1]) + t.got = len(output[1]) + cases.append(t) + + t = test_case() + if len(output[0])>0 and not (isinstance(output[0][0], (int, np.int32, np.int64, type(tf.constant(1.0))))): + t.failed = True + t.msg = "Wrong type. inps expected to be an int" + t.want = type(1) + t.got = type(output[0][0]) + cases.append(t) + + t = test_case() + if len(output[1])>0 and not (isinstance(output[1][0], (int, np.int32, np.int64, type(tf.constant(1.0))))): + t.failed = True + t.msg = "Wrong type. targs expected to be an int" + t.want = type(1) + t.got = type(output[1][0]) + cases.append(t) + + t = test_case() + if not np.array_equal(output[0], case.get('expected_output')[0]): + t.failed = True + t.msg = f"Wrong values for inps for input: {case.get('text')}" + t.want = case.get('expected_output')[0] + t.got = output[0] + cases.append(t) + + t = test_case() + if not np.array_equal(output[1], case.get('expected_output')[1]): + t.failed = True + t.msg = f"Wrong values for targs for input: {case.get('text')}" + t.want = case.get('expected_output')[1] + t.got = output[1] + cases.append(t) + + for i in range(len(cases)): + if cases[i].failed: + print(f"\033[91mTest case {i} failed\n") + print(cases[i]) + return + + print("\033[92m All tests passed") + +def test_parse_squad(target): + t = test_case() + if not isinstance(target, FunctionType): + t.failed = True + t.msg = "target has incorrect type" + t.want = FunctionType + t.got = type(target) + return [t] + dataset1 = [{"title": "t1", "paragraphs": [ + {"context": "very long context one", + "qas": [{ "question": "question is abc?", + "id": "1", + "answers": [ + { + "text": "here is abc", + "answer_start": 8 + }, + { + "text": "abc here abc", + "answer_start": 0 + } + ], + "is_impossible": False}, + { "question": "unanswerable question?", + "id": "2", + "answers": [ + { + "text": "what?", + "answer_start": 9 + } + ], + "is_impossible": True}, + { "question": "question is xyz?", + "id": "3", + "answers": [ + { + "text": "here is xyz", + "answer_start": 9 + } + ], + "is_impossible": False} + ]}]}] + + pairs = target(dataset1) + expected_pairs1 = (['question: question is abc? context: very long context one', + 'question: question is xyz? 
context: very long context one'], + ['answer: here is abc', 'answer: here is xyz']) + cases = [] + + t = test_case() + if not isinstance(pairs[0], list): + t.failed = True + t.msg = "Wrong type for returned inputs" + t.want = list + t.got = type(pairs[0]) + cases.append(t) + + t = test_case() + if not isinstance(pairs[1], list): + t.failed = True + t.msg = "Wrong type for returned outputs" + t.want = list + t.got = type(pairs[1]) + cases.append(t) + + t = test_case() #inps, targs + if len(pairs[0]) != 2: + t.failed = True + t.msg = "Wrong length for returned inputs" + t.want = 2 + t.got = len(pairs[0]) + cases.append(t) + + t = test_case() #inps, targs + if len(pairs[1]) != 2: + t.failed = True + t.msg = "Wrong length for returned outputs" + t.want = 2 + t.got = len(pairs[1]) + cases.append(t) + + t = test_case() #inps, targs + if not(pairs[0][0] == expected_pairs1[0][0]): + t.failed = True + t.msg = "Wrong input 0" + t.want = expected_pairs1[0][0] + t.got = pairs[0][0] + cases.append(t) + + t = test_case() #inps, targs + if not(pairs[0][1] == expected_pairs1[0][1]): + t.failed = True + t.msg = "Wrong input 1" + t.want = expected_pairs1[0][1] + t.got = pairs[0][1] + cases.append(t) + + t = test_case() #inps, targs + if not(pairs[1][0] == expected_pairs1[1][0]): + t.failed = True + t.msg = "Wrong output 0" + t.want = expected_pairs1[1][0] + t.got = pairs[1][0] + cases.append(t) + + t = test_case() #inps, targs + if not(pairs[1][1] == expected_pairs1[1][1]): + t.failed = True + t.msg = "Wrong output 1" + t.want = expected_pairs1[1][1] + t.got = pairs[1][1] + cases.append(t) + + for i in range(len(cases)): + if cases[i].failed: + print(f"\033[91mTest case {i} failed\n") + print(cases[i]) + return + + print("\033[92m All tests passed") + +def test_answer_question(target): + + t = test_case() + if not isinstance(target, FunctionType): + t.failed = True + t.msg = "target has incorrect type" + t.want = FunctionType + t.got = type(target) + return [t] + + # Define the model parameters + num_layers = 2 + embedding_dim = 128 + fully_connected_dim = 128 + num_heads = 2 + 
positional_encoding_length = 256 + + encoder_vocab_size = int(tokenizer.vocab_size()) + decoder_vocab_size = encoder_vocab_size + + # Initialize the model + modelx = transformer_utils.Transformer( + num_layers, + embedding_dim, + num_heads, + fully_connected_dim, + encoder_vocab_size, + decoder_vocab_size, + positional_encoding_length, + positional_encoding_length, + ) + + if False: + print("Not all tests were performed due to missing files. Don't worry, this has no impact on the assignment and we are working to fix it.") + else: + modelx.load_weights('./pretrained_models/model_qa3') + + question = "question: How many are this? context: This is five." + result = tokenizer.detokenize(target(question, modelx, tokenizer)).numpy()[0].decode() + cases = [] + + t = test_case() #inps, targs + if not ("answer:" in result): + t.failed = True + t.msg = "Wrong preamble" + t.want = "answer:" + t.got = result + cases.append(t) + + if not ("five" in result): + t.failed = True + t.msg = "Wrong answer" + t.want = "five" + t.got = result + cases.append(t) + + + question = "question: When did that happen? context: That happen on August 17, 1715" + result = tokenizer.detokenize(target(question, modelx, tokenizer)).numpy()[0].decode() + t = test_case() #inps, targs + if not ("answer:" in result): + t.failed = True + t.msg = "Wrong preamble" + t.want = "answer:" + t.got = result + cases.append(t) + + if not ("August" in result): + t.failed = True + t.msg = "Wrong answer" + t.want = "August" + t.got = result + cases.append(t) + + question = "question: Who is the king? 
context: In this country the king is Charles from here in advance" + result = tokenizer.detokenize(target(question, modelx, tokenizer)).numpy()[0].decode() + + if not ("answer:" in result): + t.failed = True + t.msg = "Wrong preamble" + t.want = "answer:" + t.got = result + cases.append(t) + + if not ("Charles V" in result): + t.failed = True + t.msg = "Wrong answer" + t.want = "Charles V" + t.got = result + cases.append(t) + + for i in range(len(cases)): + if cases[i].failed: + print(f"\033[91mTest case {i} failed\n") + print(cases[i]) + return + + print("\033[92m All tests passed") diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/C4W3_Assignment.ipynb b/NLP with Attention Models/QA/QA_T5/Files/tf/C4W3_Assignment.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..d2e70558592198ef0676f25a516f78905cb63c6f --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/C4W3_Assignment.ipynb @@ -0,0 +1,2232 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "f368f78e", + "metadata": { + "colab_type": "text", + "id": "7yuytuIllsv1" + }, + "source": [ + "# Assignment 3: Question Answering\n", + "\n", + "Welcome to the third assignment of Course 4. In this assignment you will explore question answering. You will implement the \"Text-to-Text Transfer Transformer\" (better known as T5). Since you implemented transformers from scratch last week you will now be able to use them. 
\n", + "\n", + " \n" + ] + }, + { + "cell_type": "markdown", + "id": "bf3ec561", + "metadata": { + "colab_type": "text", + "id": "Db6LQW5cMSgx" + }, + "source": [ + "## Table of Contents\n", + "\n", + "- [Overview](#0-1)\n", + "- [Importing the Packages](#0-2)\n", + "- [1 - Prepare the data for pretraining T5](#1)\n", + " - [1.1 - Pre-Training Objective](#1-1)\n", + " - [1.2 - C4 Dataset](#1-2)\n", + " - [1.3 - Process C4](#1-3)\n", + " - [1.4 - Decode to Natural Language](#1-4)\n", + " - [1.5 - Tokenizing and Masking](#1-5)\n", + " - [Exercise 1 - tokenize_and_mask](#ex-1)\n", + " - [1.6 - Creating the Pairs](#1-6)\n", + "- [2 - Pretrain a T5 model using C4](#2)\n", + " - [2.1 - Instantiate a new transformer model](#2-1)\n", + " - [2.2 - C4 pretraining](#2-2)\n", + "- [3 - Fine tune the T5 model for Question Answering](#3)\n", + " - [3.1 - Creating a list of paired question and answers](#3-1)\n", + " - [Exercise 2 - Parse the SQuaD 2.0 dataset](#ex-2)\n", + " - [3.2 - Fine tune the T5 model](#3-2) \n", + " - [3.3 - Implement your Question Answering model](#3-3)\n", + " - [Exercise 3 - Implement the question answering function](#ex-3) " + ] + }, + { + "cell_type": "markdown", + "id": "0595e9c4", + "metadata": { + "colab_type": "text", + "id": "ysxogfC1M158" + }, + "source": [ + "\n", + "## Overview\n", + "\n", + "This assignment will be different from the two previous ones. Due to memory constraints of this environment and for the sake of time, your model will be trained with small datasets, so you won't get models that you could use in production but you will gain the necessary knowledge about how the Generative Language models are trained and used. 
Also you won't spend too much time with the architecture of the models but you will instead take a model that is pre-trained on a larger dataset and fine-tune it to get better results.\n", + "\n", + "After completing this lab you will:\n", + "* Understand how the C4 dataset is structured.\n", + "* Pretrain a transformer model using a Masked Language Model.\n", + "* Understand how the \"Text-to-Text Transfer Transformer\" or T5 model works. \n", + "* Fine-tune the T5 model for Question Answering\n", + "\n", + "Before getting started, take some time to read the following tips:\n", + "#### TIPS FOR SUCCESSFUL GRADING OF YOUR ASSIGNMENT:\n", + "- All cells are frozen except for the ones where you need to submit your solutions.\n", + "- You can add new cells to experiment but these will be omitted by the grader, so don't rely on newly created cells to host your solution code; use the provided places for this.\n", + "- You can add the comment # grade-up-to-here in any graded cell to signal the grader that it must only evaluate up to that point. This is helpful if you want to check if you are on the right track even if you are not done with the whole assignment. Be sure to remember to delete the comment afterwards!\n", + "- To submit your notebook, save it and then click on the blue submit button at the beginning of the page." + ] + }, + { + "cell_type": "markdown", + "id": "2156cf78", + "metadata": {}, + "source": [ + "\n", + "## Importing the Packages\n", + "\n", + "Let's start by importing all the required libraries. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "3a532381", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 34 + }, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "uDhi6qLQMHzs", + "outputId": "64947d91-eef3-425b-9b4b-7ca7cefcc823", + "slideshow": { + "slide_type": "" + }, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "\n", + "import traceback\n", + "import time\n", + "import json\n", + "from termcolor import colored\n", + "import string\n", + "import textwrap\n", + "import itertools\n", + "import numpy as np\n", + "import tensorflow_text as tf_text\n", + "import tensorflow as tf\n", + "\n", + "import transformer_utils \n", + "import utils\n", + "\n", + "# Will come in handy later\n", + "wrapper = textwrap.TextWrapper(width=70)\n", + "\n", + "# Set random seed\n", + "np.random.seed(42)" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "bf711eba", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "import w3_unittest" + ] + }, + { + "cell_type": "markdown", + "id": "47bea693", + "metadata": { + "colab_type": "text", + "id": "t7A-LAxsYpDd", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "## 1 - Prepare the data for pretraining T5 \n", + "\n", + "\n", + "### 1.1 - Pre-Training Objective\n", + "\n", + "In the initial phase of training a T5 model for a Question Answering task, the pre-training process involves leveraging a masked language model (MLM) on a very large dataset, such as the C4 dataset. The objective is to allow the model to learn contextualized representations of words and phrases, fostering a deeper understanding of language semantics. To initiate pre-training, it is essential to employ the Transformer architecture, which forms the backbone of T5. 
The Transformer's self-attention mechanism enables the model to weigh different parts of the input sequence dynamically, capturing long-range dependencies effectively.\n", + "\n", + "Before delving into pre-training, thorough data preprocessing is crucial. The C4 dataset, a diverse and extensive collection of web pages, provides a rich source for language understanding tasks. The dataset needs to be tokenized into smaller units, such as subwords or words, to facilitate model input. Additionally, the text is often segmented into fixed-length sequences or batches, optimizing computational efficiency during training.\n", + "\n", + "For the masked language modeling objective, a percentage of the tokenized input is randomly masked, and the model is trained to predict the original content of these masked tokens. This process encourages the T5 model to grasp contextual relationships between words and phrases, enhancing its ability to generate coherent and contextually appropriate responses during downstream tasks like question answering.\n", + "\n", + "In summary, the pre-training of the T5 model involves utilizing the Transformer architecture on a sizable dataset like C4, coupled with meticulous data preprocessing to convert raw text into a format suitable for training. The incorporation of a masked language modeling objective ensures that the model learns robust contextual representations, laying a solid foundation for subsequent fine-tuning on specific tasks such as question answering.\n", + "\n", + "**Note:** The word \"mask\" will be used throughout this assignment in context of hiding/removing word(s)\n", + "\n", + "You will be implementing the Masked language model (MLM) as shown in the following image. 
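The span-masking idea described above can be sketched in plain Python. This is a simplified illustration only, not the `tokenize_and_mask` function you will implement later in the assignment: the `vocab_size` of 32,000 and the per-token coin flip are assumptions for the sketch, and the real pipeline also appends a closing sentinel and an `EOS` id to the targets.

```python
import random

def demo_mask(tokens, noise=0.15, vocab_size=32000, seed=0):
    """Split a token list into (inputs, targets), T5-style.

    Each masked run of consecutive tokens is replaced in the inputs by one
    sentinel id (counting down from vocab_size - 1); the targets list each
    sentinel followed by the tokens it hides.
    """
    rng = random.Random(seed)
    inputs, targets = [], []
    sentinel = vocab_size - 1
    prev_masked = False
    for tok in tokens:
        if rng.random() < noise:
            if not prev_masked:           # start of a new masked span
                inputs.append(sentinel)   # placeholder in the input
                targets.append(sentinel)  # matching marker in the target
                sentinel -= 1
            targets.append(tok)           # the hidden token goes to the target
            prev_masked = True
        else:
            inputs.append(tok)
            prev_masked = False
    return inputs, targets
```

Every original token ends up exactly once in either `inputs` or `targets`, so the pair still carries the full sentence; the model is trained to emit `targets` when given `inputs`.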
\n", + "\n", + "\n", + "\n", + "Assume you have the following text: **Thank you for inviting me to your party last week** \n", + "\n", + "\n", + "Now as input you will mask the words in red in the text: \n", + "\n", + " **Input:** Thank you **X** me to your party **Y** week.\n", + "\n", + "**Output:** The model should predict the words(s) for **X** and **Y**. \n", + "\n", + "**[EOS]** will be used to mark the end of the target sequence." + ] + }, + { + "cell_type": "markdown", + "id": "1dc25302", + "metadata": { + "colab_type": "text", + "id": "Cwr7LoXwQUW5", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "### 1.2 - C4 Dataset\n", + "\n", + "The [C4 dataset](https://www.tensorflow.org/datasets/catalog/c4), also known as the Common Crawl C4 (Common Crawl Corpus C4), is a large-scale dataset of web pages collected by the [Common Crawl organization](https://commoncrawl.org/). It is commonly used for various natural language processing tasks and machine learning research. Each sample in the C4 dataset follows a consistent format, making it suitable for pretraining models like BERT. Here's a short explanation and description of the C4 dataset:\n", + "\n", + "- Format: Each sample in the C4 dataset is represented as a JSON object, containing several key-value pairs.\n", + "\n", + "- Content: The 'text' field in each sample contains the actual text content extracted from web pages. This text often includes a wide range of topics and writing styles, making it diverse and suitable for training language models.\n", + "\n", + "- Metadata: The dataset includes metadata such as 'content-length,' 'content-type,' 'timestamp,' and 'url,' providing additional information about each web page. 
'Content-length' specifies the length of the content, 'content-type' describes the type of content (e.g., 'text/plain'), 'timestamp' indicates when the web page was crawled, and 'url' provides the source URL of the web page.\n", + "\n", + "- Applications: The C4 dataset is commonly used for training and fine-tuning large-scale language models, such as BERT. It serves as a valuable resource for tasks like text classification, named entity recognition, question answering, and more.\n", + "\n", + "- Size: The C4 dataset contains more than 800 GiB of text data, making it suitable for training models with billions of parameters.\n", + "\n", + "Run the cell below to see what the C4 dataset looks like. " + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "aa56acc9", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "example number 1: \n", + "\n", + "{'text': 'Beginners BBQ Class Taking Place in Missoula!\\nDo you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. He will be teaching a beginner level class for everyone who wants to get better with their culinary skills.\\nHe will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information.\\nThe cost to be in the class is $35 per person, and for spectators it is free. 
Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared.'} \n", + "\n", + "example number 2: \n", + "\n", + "{'text': 'Discussion in \\'Mac OS X Lion (10.7)\\' started by axboi87, Jan 20, 2012.\\nI\\'ve got a 500gb internal drive and a 240gb SSD.\\nWhen trying to restore using disk utility i\\'m given the error \"Not enough space on disk ____ to restore\"\\nBut I shouldn\\'t have to do that!!!\\nAny ideas or workarounds before resorting to the above?\\nUse Carbon Copy Cloner to copy one drive to the other. I\\'ve done this several times going from larger HDD to smaller SSD and I wound up with a bootable SSD drive. One step you have to remember not to skip is to use Disk Utility to partition the SSD as GUID partition scheme HFS+ before doing the clone. If it came Apple Partition Scheme, even if you let CCC do the clone, the resulting drive won\\'t be bootable. CCC usually works in \"file mode\" and it can easily copy a larger drive (that\\'s mostly empty) onto a smaller drive. If you tell CCC to clone a drive you did NOT boot from, it can work in block copy mode where the destination drive must be the same size or larger than the drive you are cloning from (if I recall).\\nI\\'ve actually done this somehow on Disk Utility several times (booting from a different drive (or even the dvd) so not running disk utility from the drive your cloning) and had it work just fine from larger to smaller bootable clone. Definitely format the drive cloning to first, as bootable Apple etc..\\nThanks for pointing this out. My only experience using DU to go larger to smaller was when I was trying to make a Lion install stick and I was unable to restore InstallESD.dmg to a 4 GB USB stick but of course the reason that wouldn\\'t fit is there was slightly more than 4 GB of data.'} \n", + "\n", + "example number 3: \n", + "\n", + "{'text': 'Foil plaid lycra and spandex shortall with metallic slinky insets. 
Attached metallic elastic belt with O-ring. Headband included. Great hip hop or jazz dance costume. Made in the USA.'} \n", + "\n", + "example number 4: \n", + "\n", + "{'text': \"How many backlinks per day for new site?\\nDiscussion in 'Black Hat SEO' started by Omoplata, Dec 3, 2010.\\n1) for a newly created site, what's the max # backlinks per day I should do to be safe?\\n2) how long do I have to let my site age before I can start making more blinks?\\nI did about 6000 forum profiles every 24 hours for 10 days for one of my sites which had a brand new domain.\\nThere is three backlinks for every of these forum profile so thats 18 000 backlinks every 24 hours and nothing happened in terms of being penalized or sandboxed. This is now maybe 3 months ago and the site is ranking on first page for a lot of my targeted keywords.\\nbuild more you can in starting but do manual submission and not spammy type means manual + relevant to the post.. then after 1 month you can make a big blast..\\nWow, dude, you built 18k backlinks a day on a brand new site? How quickly did you rank up? What kind of competition/searches did those keywords have?\"} \n", + "\n", + "example number 5: \n", + "\n", + "{'text': 'The Denver Board of Education opened the 2017-18 school year with an update on projects that include new construction, upgrades, heat mitigation and quality learning environments.\\nWe are excited that Denver students will be the beneficiaries of a four year, $572 million General Obligation Bond. 
Since the passage of the bond, our construction team has worked to schedule the projects over the four-year term of the bond.\\nDenver voters on Tuesday approved bond and mill funding measures for students in Denver Public Schools, agreeing to invest $572 million in bond funding to build and improve schools and $56.6 million in operating dollars to support proven initiatives, such as early literacy.\\nDenver voters say yes to bond and mill levy funding support for DPS students and schools. Click to learn more about the details of the voter-approved bond measure.\\nDenver voters on Nov. 8 approved bond and mill funding measures for DPS students and schools. Learn more about what’s included in the mill levy measure.'} \n", + "\n" + ] + } + ], + "source": [ + "# Load example jsons\n", + "with open('data/c4-en-10k.jsonl', 'r') as file:\n", + " example_jsons = [json.loads(line.strip()) for line in file]\n", + "\n", + "# Printing the examples to see what the data looks like\n", + "for i in range(5):\n", + " print(f'example number {i+1}: \\n\\n{example_jsons[i]} \\n')" + ] + }, + { + "cell_type": "markdown", + "id": "48901d97", + "metadata": { + "colab_type": "text", + "id": "eeihIgtiaSfh", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "### 1.3 - Process C4\n", + "\n", + "For the purpose of pretraining the T5 model, you will only use the `content` of each entry. In the following code, you filter only the field `text` from all the entries in the dataset. This is the data that you will use to create the `inputs` and `targets` of your language model." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "af728cb2", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Beginners BBQ Class Taking Place in Missoula!\n", + "Do you want to get better at making delicious BBQ? 
You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. He will be teaching a beginner level class for everyone who wants to get better with their culinary skills.\n", + "He will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information.\n", + "The cost to be in the class is $35 per person, and for spectators it is free. Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared.\n" + ] + } + ], + "source": [ + "# Grab text field from dictionary\n", + "natural_language_texts = [example_json['text'] for example_json in example_jsons]\n", + "\n", + "# Print the first text example\n", + "print(natural_language_texts[0])" + ] + }, + { + "cell_type": "markdown", + "id": "ee4a25a2", + "metadata": { + "colab_type": "text", + "id": "1rMrONRqcCYi", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "### 1.4 - Decode to Natural Language\n", + "\n", + "The [SentencePieceTokenizer](https://www.tensorflow.org/text/api_docs/python/text/SentencepieceTokenizer), used in the code snippet, tokenizes text into subword units, enhancing handling of complex word structures, out-of-vocabulary words, and multilingual support. It simplifies preprocessing, ensures consistent tokenization, and seamlessly integrates with machine learning frameworks.\n", + "\n", + "In this task, a SentencePiece model is loaded from a file, which is used to tokenize text into subwords represented by integer IDs." 
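SentencePiece learns its subword inventory from data, but the basic mechanics of mapping one word to several integer ids can be illustrated with a toy greedy longest-match segmenter over a hand-made vocabulary. Everything below — the vocabulary, the ids, the `<unk>` fallback — is a made-up illustration of the idea, not how SentencePiece works internally (it uses a learned unigram/BPE model):

```python
def greedy_subword_ids(text, vocab):
    """Segment text into the longest matching vocabulary pieces, left to right."""
    ids = []
    i = 0
    while i < len(text):
        # try the longest remaining substring first
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab:
                ids.append(vocab[piece])
                i = j
                break
        else:  # no piece matched: emit <unk> for a single character
            ids.append(vocab["<unk>"])
            i += 1
    return ids

# Hypothetical mini-vocabulary, just for the demo
toy_vocab = {"<unk>": 0, "short": 5, "all": 6, "s": 7, "h": 8}
```

With this vocabulary, `"shortall"` segments into the two pieces `short` + `all`, mirroring how the real tokenizer splits rare words like "shortall" into known subwords.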
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "2ac53d57", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "# Special tokens\n", + "# PAD, EOS = 0, 1\n", + "\n", + "with open(\"./models/sentencepiece.model\", \"rb\") as f:\n", + " pre_trained_tokenizer = f.read()\n", + " \n", + "tokenizer = tf_text.SentencepieceTokenizer(pre_trained_tokenizer, out_type=tf.int32)" + ] + }, + { + "cell_type": "markdown", + "id": "658b0e86", + "metadata": {}, + "source": [ + "In this tokenizer the string `` is used as the `EOS` token. By default, the tokenizer does not add the `EOS` to the end of each sentence, so you need to add it manually when required. Let's verify which id corresponds to this token:" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "7d2fec4b", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "EOS: 1\n" + ] + } + ], + "source": [ + "eos = tokenizer.string_to_id(\"\").numpy()\n", + "\n", + "print(\"EOS: \" + str(eos))" + ] + }, + { + "cell_type": "markdown", + "id": "6e87756f", + "metadata": {}, + "source": [ + "This code shows the process of tokenizing individual words from a given text, in this case, the third entry of the dataset."
+ ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "83c48352", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 54 + }, + "colab_type": "code", + "deletable": false, + "id": "iCCjgiVZgTSK", + "outputId": "023a227c-d895-4fd9-ae83-9394fe48cebd", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Word\t\t-->\tTokenization\n", + "----------------------------------------\n", + "Foil \t-->\t[4452, 173]\n", + "plaid \t-->\t[30772]\n", + "lycra \t-->\t[3, 120, 2935]\n", + "and \t-->\t[11]\n", + "spandex \t-->\t[8438, 26, 994]\n", + "shortall\t-->\t[710, 1748]\n", + "with \t-->\t[28]\n", + "metallic\t-->\t[18813]\n", + "slinky \t-->\t[3, 7, 4907, 63]\n", + "insets. \t-->\t[16, 2244, 7, 5]\n", + "Attached\t-->\t[28416, 15, 26]\n", + "metallic\t-->\t[18813]\n", + "elastic \t-->\t[15855]\n", + "belt \t-->\t[6782]\n", + "with \t-->\t[28]\n", + "O-ring. \t-->\t[411, 18, 1007, 5]\n", + "Headband\t-->\t[3642, 3348]\n", + "included.\t-->\t[1285, 5]\n", + "Great \t-->\t[1651]\n", + "hip \t-->\t[5436]\n", + "hop \t-->\t[13652]\n", + "or \t-->\t[42]\n", + "jazz \t-->\t[9948]\n", + "dance \t-->\t[2595]\n", + "costume.\t-->\t[11594, 5]\n", + "Made \t-->\t[6465]\n", + "in \t-->\t[16]\n", + "the \t-->\t[8]\n", + "USA. 
\t-->\t[2312, 5]\n" + ] + } + ], + "source": [ + "# printing the encoding of each word to see how subwords are tokenized\n", + "tokenized_text = [(list(tokenizer.tokenize(word).numpy()), word) for word in natural_language_texts[2].split()]\n", + "\n", + "print(\"Word\\t\\t-->\\tTokenization\")\n", + "print(\"-\"*40)\n", + "for element in tokenized_text:\n", + " print(f\"{element[1]:<8}\\t-->\\t{element[0]}\")" + ] + }, + { + "cell_type": "markdown", + "id": "d4616cf3", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "And as usual, the library provides a function to turn numeric tokens into human-readable text. Take a look at how it works. " + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "92d7037b", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "tokenized: [12847 277]\n", + "detokenized: b'Beginners'\n" + ] + } + ], + "source": [ + "# We can see that detokenize successfully undoes the tokenization\n", + "print(f\"tokenized: {tokenizer.tokenize('Beginners')}\\ndetokenized: {tokenizer.detokenize(tokenizer.tokenize('Beginners'))}\")" + ] + }, + { + "cell_type": "markdown", + "id": "52f63624", + "metadata": { + "colab_type": "text", + "id": "vPKgGOeOxv3w", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "As you can see above, you were able to take a piece of text and tokenize it. \n", + "\n", + "Now you will create `input` and `target` pairs that will allow you to train your model. T5 uses the ids at the end of the vocab file as sentinels. For example, it will replace: \n", + " - `vocab_size - 1` by `<Z>`\n", + " - `vocab_size - 2` by `<Y>`\n", + " - and so forth. \n", + " \n", + "Each sentinel is assigned a `chr` from `string.ascii_letters`, starting from the end (`Z`, `Y`, ..., `a`).\n", + "\n", + "The `pretty_decode` function below, which you will use in a bit, helps in handling the type when decoding.
Take a look and try to understand what the function is doing.\n", + "\n", + "\n", + "Notice that:\n", + "```python\n", + "string.ascii_letters = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'\n", + "```\n", + "\n", + "**NOTE:** Targets may have more than the 52 sentinels we replace, but this is just to give you an idea of things." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "b25bb46d", + "metadata": { + "colab": {}, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "fCPQL5FTxv3w", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "def get_sentinels(tokenizer, display=False):\n", + " sentinels = {}\n", + " vocab_size = tokenizer.vocab_size(name=None)\n", + " for i, char in enumerate(reversed(string.ascii_letters), 1):\n", + " decoded_text = tokenizer.detokenize([vocab_size - i]).numpy().decode(\"utf-8\")\n", + " \n", + " # Sentinels, ex: <Z> - <a>\n", + " sentinels[decoded_text] = f'<{char}>' \n", + " \n", + " if display:\n", + " print(f'The sentinel is <{char}> and the decoded token is:', decoded_text)\n", + "\n", + " return sentinels\n", + "\n", + "def pretty_decode(encoded_str_list, sentinels, tokenizer):\n", + " # If already a string, just do the replacements.\n", + " if tf.is_tensor(encoded_str_list) and encoded_str_list.dtype == tf.string:\n", + " for token, char in sentinels.items():\n", + " encoded_str_list = tf.strings.regex_replace(encoded_str_list, token, char)\n", + " return encoded_str_list\n", + " \n", + " # We need to decode and then prettify it.\n", + " return pretty_decode(tokenizer.detokenize(encoded_str_list), sentinels, tokenizer)" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "56d75b6c", + "metadata": { + "colab": {}, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "fCPQL5FTxv3w", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", +
"text": [ + "The sentinel is and the decoded token is: Internațional\n", + "The sentinel is and the decoded token is: erwachsene\n", + "The sentinel is and the decoded token is: Cushion\n", + "The sentinel is and the decoded token is: imunitar\n", + "The sentinel is and the decoded token is: Intellectual\n", + "The sentinel is and the decoded token is: traditi\n", + "The sentinel is and the decoded token is: disguise\n", + "The sentinel is and the decoded token is: exerce\n", + "The sentinel is and the decoded token is: nourishe\n", + "The sentinel is and the decoded token is: predominant\n", + "The sentinel is

and the decoded token is: amitié\n", + "The sentinel is and the decoded token is: erkennt\n", + "The sentinel is and the decoded token is: dimension\n", + "The sentinel is and the decoded token is: inférieur\n", + "The sentinel is and the decoded token is: refugi\n", + "The sentinel is and the decoded token is: cheddar\n", + "The sentinel is and the decoded token is: unterlieg\n", + "The sentinel is and the decoded token is: garanteaz\n", + "The sentinel is and the decoded token is: făcute\n", + "The sentinel is and the decoded token is: réglage\n", + "The sentinel is and the decoded token is: pedepse\n", + "The sentinel is and the decoded token is: Germain\n", + "The sentinel is and the decoded token is: distinctly\n", + "The sentinel is and the decoded token is: Schraub\n", + "The sentinel is and the decoded token is: emanat\n", + "The sentinel is and the decoded token is: trimestre\n", + "The sentinel is and the decoded token is: disrespect\n", + "The sentinel is and the decoded token is: Erasmus\n", + "The sentinel is and the decoded token is: Australia\n", + "The sentinel is and the decoded token is: permeabil\n", + "The sentinel is and the decoded token is: deseori\n", + "The sentinel is and the decoded token is: manipulated\n", + "The sentinel is and the decoded token is: suggér\n", + "The sentinel is and the decoded token is: corespund\n", + "The sentinel is and the decoded token is: nitro\n", + "The sentinel is and the decoded token is: oyons\n", + "The sentinel is

and the decoded token is: Account\n", + "The sentinel is and the decoded token is: échéan\n", + "The sentinel is and the decoded token is: laundering\n", + "The sentinel is and the decoded token is: genealogy\n", + "The sentinel is and the decoded token is: QuickBooks\n", + "The sentinel is and the decoded token is: constituted\n", + "The sentinel is and the decoded token is: Fertigung\n", + "The sentinel is and the decoded token is: goutte\n", + "The sentinel is and the decoded token is: regulă\n", + "The sentinel is and the decoded token is: overwhelmingly\n", + "The sentinel is and the decoded token is: émerg\n", + "The sentinel is and the decoded token is: broyeur\n", + "The sentinel is and the decoded token is: povești\n", + "The sentinel is and the decoded token is: emulator\n", + "The sentinel is and the decoded token is: halloween\n", + "The sentinel is and the decoded token is: combustibil\n" + ] + } + ], + "source": [ + "sentinels = get_sentinels(tokenizer, display=True)" + ] + }, + { + "cell_type": "markdown", + "id": "be73a35d", + "metadata": { + "colab": {}, + "colab_type": "code", + "id": "fCPQL5FTxv3w", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "Now, let's use the `pretty_decode` function in the following sentence. Note that all the words listed as sentinels, will be replaced by the function with the corresponding sentinel. It could be a drawback of this method, but don't worry about it now." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 11, + "id": "1fe92253", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "data": { + "text/plain": [ + "<tf.Tensor: shape=(), dtype=string, numpy=b'I want to dress up as an <V> this <b>.'>" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "pretty_decode(tf.constant(\"I want to dress up as an Intellectual this halloween.\"), sentinels, tokenizer)" + ] + }, + { + "cell_type": "markdown", + "id": "559b04b7", + "metadata": { + "colab_type": "text", + "id": "Y64F--Nzxv30", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "The functions above make your `inputs` and `targets` more readable. For example, you might see something like this once you implement the masking function below. \n", + "\n", + "- Input sentence: Younes and Lukasz were working together in the lab yesterday after lunch. \n", + "- Input: Younes and Lukasz **Z** together in the **Y** yesterday after lunch.\n", + "- Target: **Z** were working **Y** lab.\n" + ] + }, + { + "cell_type": "markdown", + "id": "244cd7a8", + "metadata": { + "colab_type": "text", + "id": "NvvNd7n6xv30" + }, + "source": [ + "\n", + "### 1.5 - Tokenizing and Masking\n", + "\n", + "In this task, you will implement the `tokenize_and_mask` function, which tokenizes and masks input words based on a given probability. The probability is controlled by the `noise` parameter, typically set to mask around `15%` of the words in the input text.
The function will generate two lists of tokenized sequences following the algorithm outlined below:" + ] + }, + { + "cell_type": "markdown", + "id": "7050f25c", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "### Exercise 1 - tokenize_and_mask\n", + "\n", + "- Start with two empty lists: `inps` and `targs`\n", + "- Tokenize the input text using the given tokenizer.\n", + "- For each `token` in the tokenized sequence:\n", + " - Generate a random number (simulating a weighted coin toss)\n", + " - If the random value is greater than the given threshold (noise):\n", + " - Add the current token to the `inps` list\n", + " - Else:\n", + " - If a new sentinel must be included (read note ** below):\n", + " - Compute the next sentinel ID using a progression.\n", + " - Add a sentinel into the `inps` and `targs` to mark the position of the masked element.\n", + " - Add the current token to the `targs` list.\n", + "\n", + "** There's a special case to consider. If two or more consecutive tokens get masked during the process, you don't need to add a new sentinel to the sequences. To account for this, use the `prev_no_mask` flag, which starts as `True` but is set to `False` each time you mask a new element. The code that adds sentinels will only be executed if, before masking the token, the flag was in the `True` state.\n" + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "id": "c660bf97", + "metadata": { + "colab": {}, + "colab_type": "code", + "deletable": false, + "id": "Bi33WKgRxv31", + "slideshow": { + "slide_type": "" + }, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: tokenize_and_mask\n", + "def tokenize_and_mask(text, \n", + " noise=0.15, \n", + " randomizer=np.random.uniform, \n", + " tokenizer=None):\n", + " \"\"\"Tokenizes and masks a given input.\n", + "\n", + " Args:\n", + " text (str or bytes): Text input.\n", + " noise (float, optional): Probability of masking a token.
Defaults to 0.15.\n", + " randomizer (function, optional): Function that generates random values. Defaults to np.random.uniform.\n", + " tokenizer (function, optional): Tokenizer function. Defaults to None.\n", + "\n", + " Returns:\n", + " inps, targs: Lists of integers associated with inputs and targets.\n", + " \"\"\"\n", + " \n", + " # Current sentinel number (starts at 0)\n", + " cur_sentinel_num = 0\n", + " \n", + " # Inputs and targets\n", + " inps, targs = [], []\n", + "\n", + " # Vocab_size\n", + " vocab_size = int(tokenizer.vocab_size())\n", + " \n", + " # EOS token id \n", + " # Must be at the end of each target!\n", + " eos = tokenizer.string_to_id(\"</s>\").numpy()\n", + " \n", + " ### START CODE HERE ###\n", + " \n", + " # prev_no_mask is True if the previous token was NOT masked, False otherwise\n", + " # set prev_no_mask to True\n", + " prev_no_mask = True\n", + " \n", + " # Loop over the tokenized text\n", + " for token in tokenizer.tokenize(text).numpy():\n", + " \n", + " # Generate a random value between 0 and 1\n", + " rnd_val = randomizer() \n", + " \n", + " # Check if the noise is greater than a random value (weighted coin flip)\n", + " if noise > rnd_val:\n", + " \n", + " # Check if previous token was NOT masked\n", + " if prev_no_mask:\n", + " \n", + " # Current sentinel increases by 1\n", + " cur_sentinel_num += 1\n", + " \n", + " # Compute end_id by subtracting current sentinel value out of the total vocabulary size\n", + " end_id = vocab_size - cur_sentinel_num\n", + " \n", + " # Append end_id at the end of the targets\n", + " targs.append(end_id)\n", + " \n", + " # Append end_id at the end of the inputs\n", + " inps.append(end_id)\n", + " \n", + " # Append token at the end of the targets\n", + " targs.append(token)\n", + " \n", + " # set prev_no_mask accordingly\n", + " prev_no_mask = False\n", + "\n", + " else:\n", + " \n", + " # Append token at the end of the inputs\n", + " inps.append(token)\n", + " \n", + " # Set prev_no_mask
accordingly\n", + " prev_no_mask = True\n", + " \n", + " \n", + " # Add EOS token to the end of the targets\n", + " targs.append(eos)\n", + " \n", + " ### END CODE HERE ###\n", + " \n", + " return inps, targs" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "id": "e92edca1", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 122 + }, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "OlPySQo9xv34", + "outputId": "2b0dc5e4-8d58-4eb0-a146-0c9f158264ac", + "scrolled": true, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "tokenized inputs - shape=53:\n", + "\n", + "[31999, 15068, 4501, 3, 12297, 3399, 16, 5964, 7115, 31998, 531, 25, 241, 12, 129, 394, 44, 492, 31997, 58, 148, 56, 43, 8, 1004, 6, 474, 31996, 39, 4793, 230, 5, 2721, 6, 1600, 1630, 31995, 1150, 4501, 15068, 16127, 6, 9137, 2659, 5595, 31994, 782, 3624, 14627, 15, 12612, 277, 5]\n", + "\n", + "targets - shape=19:\n", + "\n", + "[31999, 12847, 277, 31998, 9, 55, 31997, 3326, 15068, 31996, 48, 30, 31995, 727, 1715, 31994, 45, 301, 1]\n" + ] + } + ], + "source": [ + "# Some logic to mock a np.random value generator\n", + "# Needs to be in the same cell for it to always generate same output\n", + "def testing_rnd():\n", + " def dummy_generator():\n", + " vals = np.linspace(0, 1, 10)\n", + " cyclic_vals = itertools.cycle(vals)\n", + " for _ in range(100):\n", + " yield next(cyclic_vals)\n", + "\n", + " dumr = itertools.cycle(dummy_generator())\n", + "\n", + " def dummy_randomizer():\n", + " return next(dumr)\n", + " \n", + " return dummy_randomizer\n", + "\n", + "input_str = 'Beginners BBQ Class Taking Place in Missoula!\\nDo you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. 
Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers.'\n", + "\n", + "inps, targs = tokenize_and_mask(input_str, randomizer=testing_rnd(), tokenizer=tokenizer)\n", + "print(f\"tokenized inputs - shape={len(inps)}:\\n\\n{inps}\\n\\ntargets - shape={len(targs)}:\\n\\n{targs}\")" + ] + }, + { + "cell_type": "markdown", + "id": "07996252", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "#### **Expected Output:**\n", + "```\n", + "tokenized inputs - shape=53:\n", + "\n", + "[31999 15068 4501 3 12297 3399 16 5964 7115 31998 531 25\n", + " 241 12 129 394 44 492 31997 58 148 56 43 8\n", + " 1004 6 474 31996 39 4793 230 5 2721 6 1600 1630\n", + " 31995 1150 4501 15068 16127 6 9137 2659 5595 31994 782 3624\n", + " 14627 15 12612 277 5]\n", + "\n", + "targets - shape=19:\n", + "\n", + "[31999 12847 277 31998 9 55 31997 3326 15068 31996 48 30\n", + " 31995 727 1715 31994 45 301 1]\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "id": "76daaa5b", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed\n" + ] + } + ], + "source": [ + "# Test your implementation!\n", + "w3_unittest.test_tokenize_and_mask(tokenize_and_mask)" + ] + }, + { + "cell_type": "markdown", + "id": "9c87bea8", + "metadata": { + "colab_type": "text", + "id": "_omCqbkLxv36", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "You will now use the inputs and the targets from the `tokenize_and_mask` function you implemented above. Take a look at the decoded version of your masked sentence using your `inps` and `targs` from the sentence above. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "id": "054d51bf", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 105 + }, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "y6xwo6lGxv37", + "outputId": "4330ae1e-1805-40c9-daf3-c6bbe92d957b", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Inputs: \n", + "\n", + " b' BBQ Class Taking Place in Missoul Do you want to get better at making ? You will have the opportunity, put your calendar now. Thursday, September 22 World Class BBQ Champion, Tony Balay onestar Smoke Rangers.'\n", + "\n", + "Targets: \n", + "\n", + " b' Beginners a! delicious BBQ this on nd join from L'\n" + ] + } + ], + "source": [ + "print('Inputs: \\n\\n', pretty_decode(inps, sentinels, tokenizer).numpy())\n", + "print('\\nTargets: \\n\\n', pretty_decode(targs, sentinels, tokenizer).numpy())" + ] + }, + { + "cell_type": "markdown", + "id": "0707c320", + "metadata": { + "colab_type": "text", + "id": "24HZiIBLxv3-", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "### 1.6 - Creating the Pairs\n", + "\n", + "You will now create pairs using your dataset. You will iterate over your data and create (inp, targ) pairs using the functions that we have given you. 
" + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "id": "ae83fff0", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "# Apply tokenize_and_mask\n", + "inputs_targets_pairs = [tokenize_and_mask(text.encode('utf-8', errors='ignore').decode('utf-8'), tokenizer=tokenizer) \n", + " for text in natural_language_texts[0:2000]]" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "id": "3f157ad1", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 1000 + }, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "c1HiKreWokhs", + "outputId": "fc194524-41de-4d3b-87d9-ae35c29c9f79", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[1]\n", + "\n", + "inputs:\n", + "il plaid lycra spandex shortall with metallic slinky\n", + "sets. Attache metallic elastic belt with O ring. Head \n", + "included. Great hip hop jazz dance costume. in the USA.\n", + "\n", + "targets:\n", + " Fo and in d - band or Made\n", + "\n", + "\n", + "\n", + "[2]\n", + "\n", + "inputs:\n", + "I thought I was going to 3rd season Wire tonight. there\n", + "was a commentary 11, so I had to re watch Ground with \n", + "commentary. Hopefully can finish season .\n", + "\n", + "targets:\n", + " finish the of the But on episode - Middle \n", + "the I the next weekend\n", + "\n", + "\n", + "\n", + "[3]\n", + "\n", + "inputs:\n", + "Pencarian FILM Untuk \" eace er 2017 yuk mampir ke channel\n", + "say . Edges provides the l.. A corrupt cop makes one w.. er\n", + "2017 ⁇ ⁇ .. Náo Lo ⁇ n - Peace Break.. Please subscribe and hit\n", + ".. in HD at http://.. cannot believe I manage..\n", + "\n", + "targets:\n", + " P Break \" . 
East Peace Break uploaded\n", + " I\n", + "\n", + "\n", + "\n" + ] + } + ], + "source": [ + "def display_input_target_pairs(inputs_targets_pairs, sentinels, wrapper=textwrap.TextWrapper(width=70), tokenizer=tokenizer):\n", + " for i, inp_tgt_pair in enumerate(inputs_targets_pairs, 1):\n", + " inps, tgts = inp_tgt_pair\n", + " inps = str(pretty_decode(inps, sentinels, tokenizer).numpy(), encoding='utf-8')\n", + " tgts = str(pretty_decode(tgts, sentinels, tokenizer).numpy(), encoding='utf-8')\n", + " print(f'[{i}]\\n\\n'\n", + " f'inputs:\\n{wrapper.fill(text=inps)}\\n\\n'\n", + " f'targets:\\n{wrapper.fill(text=tgts)}\\n\\n\\n')\n", + "\n", + "# Print 3 samples. We print inputs with fewer than 100 tokens. It is just to give you an idea of the process\n", + "display_input_target_pairs(filter(lambda x: len(x[0]) < 100, inputs_targets_pairs[0:12]), sentinels, wrapper, tokenizer)" + ] + }, + { + "cell_type": "markdown", + "id": "d7d5e6d9", + "metadata": { + "colab_type": "text", + "id": "hQI5Jgov5X-d", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "\n", + "## 2 - Pretrain a T5 model using C4\n", + "\n", + "Now you are going to use the Transformer architecture that you coded in the previous assignment for text summarization, but this time to answer questions. Instead of training the question answering model from scratch, you will first \"pre-train\" the model using the C4 data set you just processed. This will help the model to learn the general structure of language from a large dataset. This is much easier to do, as you don't need to label any data: you just use the masking, which is applied automatically. You will then use the data from the SQuAD set to teach the model to answer questions given a context. To start, let's review the Transformer architecture.
\n", + "\n", + "\n", + "\n", + "\n", + "### 2.1 - Instantiate a new transformer model\n", + "\n", + "We have packaged the code implemented in the previous week into the `Transformer.py` file. You can import it here, and setup with the same configuration used there. " + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "id": "58ce75dc", + "metadata": { + "colab": {}, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "eScMhEG7xv4H", + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "# Define the model parameters\n", + "num_layers = 2\n", + "embedding_dim = 128\n", + "fully_connected_dim = 128\n", + "num_heads = 2\n", + "positional_encoding_length = 256\n", + "\n", + "encoder_vocab_size = int(tokenizer.vocab_size())\n", + "decoder_vocab_size = encoder_vocab_size\n", + "\n", + "# Initialize the model\n", + "transformer = transformer_utils.Transformer(\n", + " num_layers, \n", + " embedding_dim, \n", + " num_heads, \n", + " fully_connected_dim,\n", + " encoder_vocab_size, \n", + " decoder_vocab_size, \n", + " positional_encoding_length, \n", + " positional_encoding_length,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "618697cf", + "metadata": {}, + "source": [ + "Now, you will define the optimizer and the loss function. For this task the model will try to predict the masked words, so, as in the previous lab, the loss function will be the `SparseCategoricalCrossEntropy`." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 19, + "id": "7df2d1d1", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "learning_rate = transformer_utils.CustomSchedule(embedding_dim)\n", + "optimizer = tf.keras.optimizers.Adam(0.0001, beta_1=0.9, beta_2=0.98, epsilon=1e-9)\n", + "\n", + "loss_object = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True, reduction='none')\n", + "train_loss = tf.keras.metrics.Mean(name='train_loss')\n", + "\n", + "# Here you will store the losses, so you can later plot them\n", + "losses = []" + ] + }, + { + "cell_type": "markdown", + "id": "03d54376", + "metadata": {}, + "source": [ + "\n", + "### 2.2 - C4 pretraining\n", + "\n", + "For training a TensorFlow model you need to arrange the data into datasets. Now, you will get the `inputs` and the `targets` for the transformer model from the `inputs_targets_pairs`. Before creating the dataset, you need to be sure that all `inputs` have the same length by truncating the longer sequences and padding the shorter ones with `0`. The same must be done for the targets.
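The truncate-then-pad step can be sketched in plain Python for a single sequence; `pad_post` below is a hypothetical helper written for illustration, mirroring the `padding='post', truncating='post'` behavior (trim or fill at the tail):

```python
# What post-padding / post-truncating does for one sequence,
# sketched in plain Python (pad_sequences works on a whole batch).
def pad_post(seq, maxlen, pad_value=0):
    seq = seq[:maxlen]                                # truncate the tail if too long
    return seq + [pad_value] * (maxlen - len(seq))    # pad the tail if too short

print(pad_post([5, 6, 7], 5))          # [5, 6, 7, 0, 0]
print(pad_post([1, 2, 3, 4, 5, 6], 4)) # [1, 2, 3, 4]
```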
The function `tf.keras.preprocessing.sequence.pad_sequences` will help you here, as in the previous week's assignment.\n", + "\n", + "You will use a `BATCH_SIZE = 64`" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "id": "b03eb998", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# Limit the size of the input and output data so this can run in this environment\n", + "encoder_maxlen = 150\n", + "decoder_maxlen = 50\n", + "\n", + "inputs = tf.keras.preprocessing.sequence.pad_sequences([x[0] for x in inputs_targets_pairs], maxlen=encoder_maxlen, padding='post', truncating='post')\n", + "targets = tf.keras.preprocessing.sequence.pad_sequences([x[1] for x in inputs_targets_pairs], maxlen=decoder_maxlen, padding='post', truncating='post')\n", + "\n", + "inputs = tf.cast(inputs, dtype=tf.int32)\n", + "targets = tf.cast(targets, dtype=tf.int32)\n", + "\n", + "# Create the final training dataset.\n", + "BUFFER_SIZE = 10000\n", + "BATCH_SIZE = 64\n", + "\n", + "dataset = tf.data.Dataset.from_tensor_slices((inputs, targets)).shuffle(BUFFER_SIZE).batch(BATCH_SIZE)" + ] + }, + { + "cell_type": "markdown", + "id": "4e32ae0c", + "metadata": {}, + "source": [ + "Now, you can run the training loop for 10 epochs. Running it with a big dataset such as C4 on a good computer with enough memory and a good GPU could take more than 24 hours. Here, you will run a few epochs using a small portion of the C4 dataset for illustration. It will only take a few minutes, but the model won't be very powerful. " + ] + }, + { + "cell_type": "code", + "execution_count": 21, + "id": "44fc5f76", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "WARNING:tensorflow:5 out of the last 5 calls to triggered tf.function retracing.
Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has reduce_retracing=True option that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details.\n", + "WARNING:tensorflow:6 out of the last 6 calls to triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has reduce_retracing=True option that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details.\n", + "Epoch 1, Loss 10.1072\n", + "Time taken for one epoch: 27.537233591079712 sec\n", + "Epoch 2, Loss 9.4925\n", + "Time taken for one epoch: 15.818775415420532 sec\n", + "Epoch 3, Loss 8.9007\n", + "Time taken for one epoch: 13.021282196044922 sec\n", + "Epoch 4, Loss 8.3569\n", + "Time taken for one epoch: 12.486879110336304 sec\n", + "Epoch 5, Loss 7.8626\n", + "Time taken for one epoch: 11.662268877029419 sec\n", + "Epoch 6, Loss 7.4189\n", + "Time taken for one epoch: 9.629899740219116 sec\n", + "Epoch 7, Loss 7.0271\n", + "Time taken for one epoch: 11.970134019851685 sec\n", + "Epoch 8, Loss 6.6958\n", + "Time taken for one epoch: 10.43953561782837 sec\n", + "Epoch 9, Loss 6.4291\n", + "Time taken for one epoch: 9.317028760910034 sec\n", + "Epoch 10, Loss 6.2340\n", + "Time taken for one epoch: 9.726068019866943 sec\n" + ] 
+ } + ], + "source": [ + "# Define the number of epochs\n", + "epochs = 10\n", + "\n", + "# Training loop\n", + "for epoch in range(epochs):\n", + " \n", + " start = time.time()\n", + " train_loss.reset_states()\n", + " number_of_batches=len(list(enumerate(dataset)))\n", + "\n", + " for (batch, (inp, tar)) in enumerate(dataset):\n", + " print(f'Epoch {epoch+1}, Batch {batch+1}/{number_of_batches}', end='\\r')\n", + " transformer_utils.train_step(inp, tar, transformer, loss_object, optimizer, train_loss)\n", + " \n", + " print (f'Epoch {epoch+1}, Loss {train_loss.result():.4f}')\n", + " losses.append(train_loss.result())\n", + " \n", + " print (f'Time taken for one epoch: {time.time() - start} sec')\n", + "\n", + "# Save the pretrained model\n", + "# transformer.save_weights('./model_c4_temp')" + ] + }, + { + "cell_type": "markdown", + "id": "2e8135b5", + "metadata": {}, + "source": [ + "**Load a pretrained model**\n", + "\n", + "To show how powerful this model actually is, we trained it for several epochs with the full dataset in Colab and saved the weights for you. You can load them using the cell below. For the rest of the notebook, you will see the power of transfer learning in action." + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "id": "55360633", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "transformer.load_weights('./pretrained_models/model_c4')" + ] + }, + { + "cell_type": "markdown", + "id": "b8822756", + "metadata": {}, + "source": [ + "\n", + "## 3. Fine-tune the T5 model for Question Answering\n", + "\n", + "Now, you are going to fine-tune the pretrained model for Question Answering using the [SQuAD 2.0 dataset](https://rajpurkar.github.io/SQuAD-explorer/).\n", + "\n", + "SQuAD, short for Stanford Question Answering Dataset, is a dataset designed for training and evaluating question answering systems. It consists of real questions posed by humans on a set of Wikipedia articles, where the answer to each question is a specific span of text within the corresponding article.\n", + "\n", + "SQuAD 1.1, the previous version of the SQuAD dataset, contains 100,000+ question-answer pairs on about 500 articles.\n", + "SQuAD 2.0 contains 50,000 additional questions that cannot be answered from the provided context. This extra set of questions can help to train models to detect unanswerable questions.\n", + "\n", + "Let's load the dataset." + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "id": "987571df", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Number of articles: 442\n" + ] + } + ], + "source": [ + "with open('data/train-v2.0.json', 'r') as f:\n", + " example_jsons = json.load(f)\n", + "\n", + "example_jsons = example_jsons['data']\n", + "\n", + "print('Number of articles: ' + str(len(example_jsons)))" + ] + }, + { + "cell_type": "markdown", + "id": "c941761f", + "metadata": {}, + "source": [ + "The structure of each article is as follows:\n", + "- `title`: The article title\n", + "- `paragraphs`: A list of paragraphs and questions related to them\n", + " - `context`: The actual paragraph text\n", + " - `qas`: A set of questions related to the paragraph\n", + " - `question`: A question\n", + " - `id`: The question's unique identifier\n", + " - `is_impossible`: Boolean, specifies whether the question can be answered or not\n", + " - `answers`: A set of possible answers for the question\n", + " - `text`: The
answer\n", + " - `answer_start`: The index of the character in the context at which the answer begins\n", + " \n", + "Take a look at an article by running the next cell. Notice that the `context` is usually the last element for every paragraph: " + ] + }, + { + "cell_type": "code", + "execution_count": 24, + "id": "7c4c4cfa", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Title: Beyoncé\n", + "{'qas': [{'question': 'When did Beyonce start becoming popular?', 'id': '56be85543aeaaa14008c9063', 'answers': [{'text': 'in the late 1990s', 'answer_start': 269}], 'is_impossible': False}, {'question': 'What areas did Beyonce compete in when she was growing up?', 'id': '56be85543aeaaa14008c9065', 'answers': [{'text': 'singing and dancing', 'answer_start': 207}], 'is_impossible': False}, {'question': \"When did Beyonce leave Destiny's Child and become a solo singer?\", 'id': '56be85543aeaaa14008c9066', 'answers': [{'text': '2003', 'answer_start': 526}], 'is_impossible': False}, {'question': 'In what city and state did Beyonce grow up? 
', 'id': '56bf6b0f3aeaaa14008c9601', 'answers': [{'text': 'Houston, Texas', 'answer_start': 166}], 'is_impossible': False}, {'question': 'In which decade did Beyonce become famous?', 'id': '56bf6b0f3aeaaa14008c9602', 'answers': [{'text': 'late 1990s', 'answer_start': 276}], 'is_impossible': False}, {'question': 'In what R&B group was she the lead singer?', 'id': '56bf6b0f3aeaaa14008c9603', 'answers': [{'text': \"Destiny's Child\", 'answer_start': 320}], 'is_impossible': False}, {'question': 'What album made her a worldwide known artist?', 'id': '56bf6b0f3aeaaa14008c9604', 'answers': [{'text': 'Dangerously in Love', 'answer_start': 505}], 'is_impossible': False}, {'question': \"Who managed the Destiny's Child group?\", 'id': '56bf6b0f3aeaaa14008c9605', 'answers': [{'text': 'Mathew Knowles', 'answer_start': 360}], 'is_impossible': False}, {'question': 'When did Beyoncé rise to fame?', 'id': '56d43c5f2ccc5a1400d830a9', 'answers': [{'text': 'late 1990s', 'answer_start': 276}], 'is_impossible': False}, {'question': \"What role did Beyoncé have in Destiny's Child?\", 'id': '56d43c5f2ccc5a1400d830aa', 'answers': [{'text': 'lead singer', 'answer_start': 290}], 'is_impossible': False}, {'question': 'What was the first album Beyoncé released as a solo artist?', 'id': '56d43c5f2ccc5a1400d830ab', 'answers': [{'text': 'Dangerously in Love', 'answer_start': 505}], 'is_impossible': False}, {'question': 'When did Beyoncé release Dangerously in Love?', 'id': '56d43c5f2ccc5a1400d830ac', 'answers': [{'text': '2003', 'answer_start': 526}], 'is_impossible': False}, {'question': 'How many Grammy awards did Beyoncé win for her first solo album?', 'id': '56d43c5f2ccc5a1400d830ad', 'answers': [{'text': 'five', 'answer_start': 590}], 'is_impossible': False}, {'question': \"What was Beyoncé's role in Destiny's Child?\", 'id': '56d43ce42ccc5a1400d830b4', 'answers': [{'text': 'lead singer', 'answer_start': 290}], 'is_impossible': False}, {'question': \"What was the name of Beyoncé's first solo 
album?\", 'id': '56d43ce42ccc5a1400d830b5', 'answers': [{'text': 'Dangerously in Love', 'answer_start': 505}], 'is_impossible': False}], 'context': 'Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an American singer, songwriter, record producer and actress. Born and raised in Houston, Texas, she performed in various singing and dancing competitions as a child, and rose to fame in the late 1990s as lead singer of R&B girl-group Destiny\\'s Child. Managed by her father, Mathew Knowles, the group became one of the world\\'s best-selling girl groups of all time. Their hiatus saw the release of Beyoncé\\'s debut album, Dangerously in Love (2003), which established her as a solo artist worldwide, earned five Grammy Awards and featured the Billboard Hot 100 number-one singles \"Crazy in Love\" and \"Baby Boy\".'}\n" + ] + } + ], + "source": [ + "example_article = example_jsons[0]\n", + "example_article\n", + "\n", + "print(\"Title: \" + example_article[\"title\"])\n", + "print(example_article[\"paragraphs\"][0])" + ] + }, + { + "cell_type": "markdown", + "id": "2982be57", + "metadata": {}, + "source": [ + "The previous article might be difficult to navigate so here is a nicely formatted example paragraph:\n", + "```python\n", + "{\n", + " \"context\": \"Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an American singer, songwriter, record producer and actress. Born and raised in Houston, Texas, she performed in various singing and dancing competitions as a child, and rose to fame in the late 1990s as lead singer of R&B girl-group Destiny's Child. Managed by her father, Mathew Knowles, the group became one of the world's best-selling girl groups of all time. 
Their hiatus saw the release of Beyoncé's debut album, Dangerously in Love (2003), which established her as a solo artist worldwide, earned five Grammy Awards and featured the Billboard Hot 100 number-one singles 'Crazy in Love' and 'Baby Boy'\",\n", + " \"qas\": [\n", + " {\n", + " \"question\": \"When did Beyonce start becoming popular?\",\n", + " \"id\": \"56be85543aeaaa14008c9063\",\n", + " \"answers\": [\n", + " {\n", + " \"text\": \"in the late 1990s\",\n", + " \"answer_start\": 269\n", + " }\n", + " ],\n", + " \"is_impossible\": false\n", + " },\n", + " {\n", + " \"question\": \"What areas did Beyonce compete in when she was growing up?\",\n", + " \"id\": \"56be85543aeaaa14008c9065\",\n", + " \"answers\": [\n", + " {\n", + " \"text\": \"singing and dancing\",\n", + " \"answer_start\": 207\n", + " }\n", + " ],\n", + " \"is_impossible\": false\n", + " }\n", + " ]\n", + "}\n", + "```" + ] + }, + { + "cell_type": "markdown", + "id": "b3345571", + "metadata": {}, + "source": [ + "\n", + "### 3.1 - Creating a list of paired questions and answers\n", + "\n", + "You are tasked with generating input/output pairs for a Question Answering (QA) model using the SQuAD 2.0 dataset. Each pair follows the structure:\n", + "\n", + "- inputs: `question: <question> context: <context>`\n", + "- targets: `answer: <answer>`\n", + " \n", + "Here, `<question>` represents the question in the context of the given paragraph `<context>`, and `<answer>` is a possible answer.\n", + "\n", + "In this notebook, we will focus on a single answer per question. However, it's essential to note that the dataset contains questions with multiple answers. When training a model in real-life scenarios, consider including all available information.\n", + "\n", + "\n", + "### Exercise 2 - Parse the SQuAD 2.0 Dataset\n", + "\n", + "Your task is to implement the `parse_squad` function, which iterates over all the articles, paragraphs, and questions in the SQuAD dataset. Extract pairs of inputs and targets for the QA model using the provided code template.\n", + "- Start with two empty lists: `inputs` and `targets`.\n", + "- Loop over all the articles in the dataset.\n", + "- For each article, loop over each paragraph.\n", + "- Extract the context from the paragraph.\n", + "- Loop over each question in the given paragraph.\n", + "- Check if the question is not impossible and has at least one answer.\n", + "- If the above condition is met, create the `question_context` sequence as described in the input structure.\n", + "- Create the `answer` sequence using the first answer from the available answers.\n", + "- Append the `question_context` to the `inputs` list.\n", + "- Append the `answer` to the `targets` list."
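+ ,"\n",
+ "For instance, building one input/target pair by hand from a minimal, made-up article (a toy illustration only, not the graded solution) looks like this:\n",
+ "\n",
+ "```python\n",
+ "toy_article = {\"paragraphs\": [{\"context\": \"Sky is blue.\",\n",
+ "                               \"qas\": [{\"question\": \"What color is the sky?\",\n",
+ "                                        \"is_impossible\": False,\n",
+ "                                        \"answers\": [{\"text\": \"blue\", \"answer_start\": 7}]}]}]}\n",
+ "\n",
+ "context = toy_article[\"paragraphs\"][0][\"context\"]\n",
+ "qa = toy_article[\"paragraphs\"][0][\"qas\"][0]\n",
+ "\n",
+ "# Build the sequences exactly as described in the steps above\n",
+ "question_context = 'question: ' + qa[\"question\"] + ' context: ' + context\n",
+ "answer = 'answer: ' + qa[\"answers\"][0][\"text\"]\n",
+ "\n",
+ "print(question_context)  # question: What color is the sky? context: Sky is blue.\n",
+ "print(answer)            # answer: blue\n",
+ "```"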
+ ] + }, + { + "cell_type": "code", + "execution_count": 25, + "id": "b5344f35", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: parse_squad\n", + "def parse_squad(dataset):\n", + " \"\"\"Extract all the answers/questions pairs from the SQuAD dataset\n", + "\n", + " Args:\n", + " dataset (dict): The imported JSON dataset\n", + "\n", + " Returns:\n", + " inputs, targets: Two lists containing the inputs and the targets for the QA model\n", + " \"\"\"\n", + "\n", + " inputs, targets = [], []\n", + "\n", + " ### START CODE HERE ###\n", + " \n", + " # Loop over all the articles\n", + " for article in dataset:\n", + " \n", + " # Loop over each paragraph of each article\n", + " for paragraph in article[\"paragraphs\"]:\n", + " \n", + " # Extract context from the paragraph\n", + " context = paragraph[\"context\"]\n", + " \n", + " #Loop over each question of the given paragraph\n", + " for qa in paragraph[\"qas\"]:\n", + " \n", + " # If this question is not impossible and there is at least one answer\n", + " if len(qa['answers']) > 0 and not(qa['is_impossible']):\n", + " \n", + " # Create the question/context sequence\n", + " question_context = 'question: ' + qa[\"question\"] + ' context: ' + context\n", + " \n", + " # Create the answer sequence. 
Use the text field of the first answer\n", + " answer = 'answer: ' + qa[\"answers\"][0][\"text\"]\n", + " \n", + " # Add the question_context to the inputs list\n", + " inputs.append(question_context)\n", + " \n", + " # Add the answer to the targets list\n", + " targets.append(answer)\n", + " \n", + " ### END CODE HERE ###\n", + " \n", + " return inputs, targets" + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "id": "6744c424", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Number of question/answer pairs: 86821\n", + "\n", + "First Q/A pair:\n", + "\n", + "inputs: \u001b[34mquestion: When did Beyonce start becoming popular? context: Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an American singer, songwriter, record producer and actress. Born and raised in Houston, Texas, she performed in various singing and dancing competitions as a child, and rose to fame in the late 1990s as lead singer of R&B girl-group Destiny's Child. Managed by her father, Mathew Knowles, the group became one of the world's best-selling girl groups of all time. Their hiatus saw the release of Beyoncé's debut album, Dangerously in Love (2003), which established her as a solo artist worldwide, earned five Grammy Awards and featured the Billboard Hot 100 number-one singles \"Crazy in Love\" and \"Baby Boy\".\u001b[0m\n", + "\n", + "targets: \u001b[32manswer: in the late 1990s\u001b[0m\n", + "\n", + "Last Q/A pair:\n", + "\n", + "inputs: \u001b[34mquestion: What is KMC an initialism of? context: Kathmandu Metropolitan City (KMC), in order to promote international relations has established an International Relations Secretariat (IRC). KMC's first international relationship was established in 1975 with the city of Eugene, Oregon, United States. 
This activity has been further enhanced by establishing formal relationships with 8 other cities: Motsumoto City of Japan, Rochester of the USA, Yangon (formerly Rangoon) of Myanmar, Xi'an of the People's Republic of China, Minsk of Belarus, and Pyongyang of the Democratic Republic of Korea. KMC's constant endeavor is to enhance its interaction with SAARC countries, other International agencies and many other major cities of the world to achieve better urban management and developmental programs for Kathmandu.\u001b[0m\n", + "\n", + "targets: \u001b[32manswer: Kathmandu Metropolitan City\u001b[0m\n" + ] + } + ], + "source": [ + "inputs, targets = parse_squad(example_jsons) \n", + "print(\"Number of question/answer pairs: \" + str(len(inputs)))\n", + "\n", + "print('\\nFirst Q/A pair:\\n\\ninputs: ' + colored(inputs[0], 'blue'))\n", + "print('\\ntargets: ' + colored(targets[0], 'green'))\n", + "print('\\nLast Q/A pair:\\n\\ninputs: ' + colored(inputs[-1], 'blue'))\n", + "print('\\ntargets: ' + colored(targets[-1], 'green'))" + ] + }, + { + "cell_type": "markdown", + "id": "e2b164c2", + "metadata": {}, + "source": [ + "#### **Expected Output:**\n", + "```\n", + "Number of question/answer pairs: 86821\n", + "\n", + "First Q/A pair:\n", + "\n", + "inputs: question: When did Beyonce start becoming popular? context: Beyoncé Giselle Knowles-Carter (/biːˈjɒnseɪ/ bee-YON-say) (born September 4, 1981) is an American singer, songwriter, record producer and actress. Born and raised in Houston, Texas, she performed in various singing and dancing competitions as a child, and rose to fame in the late 1990s as lead singer of R&B girl-group Destiny's Child. Managed by her father, Mathew Knowles, the group became one of the world's best-selling girl groups of all time. 
Their hiatus saw the release of Beyoncé's debut album, Dangerously in Love (2003), which established her as a solo artist worldwide, earned five Grammy Awards and featured the Billboard Hot 100 number-one singles \"Crazy in Love\" and \"Baby Boy\".\n", + "\n", + "targets: answer: in the late 1990s\n", + "\n", + "Last Q/A pair:\n", + "\n", + "inputs: question: What is KMC an initialism of? context: Kathmandu Metropolitan City (KMC), in order to promote international relations has established an International Relations Secretariat (IRC). KMC's first international relationship was established in 1975 with the city of Eugene, Oregon, United States. This activity has been further enhanced by establishing formal relationships with 8 other cities: Motsumoto City of Japan, Rochester of the USA, Yangon (formerly Rangoon) of Myanmar, Xi'an of the People's Republic of China, Minsk of Belarus, and Pyongyang of the Democratic Republic of Korea. KMC's constant endeavor is to enhance its interaction with SAARC countries, other International agencies and many other major cities of the world to achieve better urban management and developmental programs for Kathmandu.\n", + "\n", + "targets: answer: Kathmandu Metropolitan City\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "id": "f197bb69", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed\n" + ] + } + ], + "source": [ + "# UNIT TEST\n", + "w3_unittest.test_parse_squad(parse_squad)" + ] + }, + { + "cell_type": "markdown", + "id": "d1f69b24", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "You will use 40000 samples for training and 5000 samples for testing" + ] + }, + { + "cell_type": "code", + "execution_count": 28, + "id": "947354ad", + "metadata": { + "deletable": false, + "editable": false, + "slideshow": { + "slide_type": 
"" + }, + "tags": [] + }, + "outputs": [], + "source": [ + "# 40K pairs for training\n", + "inputs_train = inputs[0:40000] \n", + "targets_train = targets[0:40000] \n", + "\n", + "# 5K pairs for testing\n", + "inputs_test = inputs[40000:45000] \n", + "targets_test = targets[40000:45000] " + ] + }, + { + "cell_type": "markdown", + "id": "1c21fd31", + "metadata": { + "slideshow": { + "slide_type": "" + }, + "tags": [] + }, + "source": [ + "Now, you can create the batch dataset of padded sequences. You will first tokenize the inputs and the targets. Then, using the function `tf.keras.preprocessing.sequence.pad_sequences`, you will ensure that the inputs and the outputs have the required lengths. Remember that the sequences longer than the required size will be truncated and the shorter ones will be padded with `0`. This setup is very similar to the other one used in this and the previous notebook." + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "id": "83393c74", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# Limit the size of the input and output data so this can run in this environment\n", + "encoder_maxlen = 150\n", + "decoder_maxlen = 50\n", + "\n", + "inputs_str = [tokenizer.tokenize(s) for s in inputs_train]\n", + "targets_str = [tf.concat([tokenizer.tokenize(s), [1]], 0) for s in targets_train]\n", + "\n", + "inputs = tf.keras.preprocessing.sequence.pad_sequences(inputs_str, maxlen=encoder_maxlen, padding='post', truncating='post')\n", + "targets = tf.keras.preprocessing.sequence.pad_sequences(targets_str, maxlen=decoder_maxlen, padding='post', truncating='post')\n", + "\n", + "inputs = tf.cast(inputs, dtype=tf.int32)\n", + "targets = tf.cast(targets, dtype=tf.int32)\n", + "\n", + "# Create the final training dataset.\n", + "BUFFER_SIZE = 10000\n", + "BATCH_SIZE = 64\n", + "dataset = tf.data.Dataset.from_tensor_slices((inputs, targets)).shuffle(BUFFER_SIZE).batch(BATCH_SIZE)" + ] + 
}, + { + "cell_type": "markdown", + "id": "df82c8a9", + "metadata": {}, + "source": [ + "\n", + "### 3.2 - Fine tune the T5 model\n", + "\n", + "Now, you will train the model for 2 epochs. In the T5 model, all the weights are adjusted during fine tuning. As usual, fine tuning this model to get state-of-the-art results would require more time and resources than are available in this environment, but you are welcome to train the model for more epochs and with more data using Colab GPUs." + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "id": "aaaba558", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1, Loss 5.883525\n", + "Time taken for one epoch: 173.80748438835144 sec\n", + "Epoch 2, Loss 5.307125\n", + "Time taken for one epoch: 130.27699828147888 sec\n" + ] + } + ], + "source": [ + "# Define the number of epochs\n", + "epochs = 2\n", + "losses = []\n", + "\n", + "# Training loop\n", + "for epoch in range(epochs):\n", + " \n", + " start = time.time()\n", + " train_loss.reset_states()\n", + " number_of_batches=len(list(enumerate(dataset)))\n", + "\n", + " for (batch, (inp, tar)) in enumerate(dataset):\n", + " print(f'Epoch {epoch+1}, Batch {batch+1}/{number_of_batches}', end='\\r')\n", + " transformer_utils.train_step(inp, tar, transformer, loss_object, optimizer, train_loss)\n", + " \n", + " print (f'Epoch {epoch+1}, Loss {train_loss.result():.4f}')\n", + " losses.append(train_loss.result())\n", + " \n", + " print (f'Time taken for one epoch: {time.time() - start} sec')\n", + " #if epoch % 15 == 0:\n", + " #transformer.save_weights('./pretrained_models/model_qa_temp')\n", + "# Save the final model\n", + "#transformer.save_weights('./pretrained_models/model_qa_temp')" + ] + }, + { + "cell_type": "markdown", + "id": "23b8dc0c", + "metadata": {}, + "source": [ + "To get a model that works properly, you would need to train for 
about 100 epochs. So, we have pretrained a model for you. Just load the weights into the current model and use it to answer questions." + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "id": "144e769b", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "data": { + "text/plain": [ + "" + ] + }, + "execution_count": 31, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# Restore the weights\n", + "transformer.load_weights('./pretrained_models/model_qa3')" + ] + }, + { + "cell_type": "markdown", + "id": "360e09fd", + "metadata": {}, + "source": [ + "\n", + "### 3.3 - Implement your Question Answering model\n", + "In this final step, you will implement the `answer_question` function, utilizing a pre-trained transformer model for question answering.\n", + "\n", + "To help you out, the `transformer_utils.next_word` function is provided. This function receives the question and the beginning of the answer (both in tensor format) alongside the model to predict the next token in the answer. The next cell shows how to use this:" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "id": "92b40de0", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Predicted next word is: ''\n", + "Answer so far: 'answer: '\n" + ] + } + ], + "source": [ + "# Define an example question\n", + "example_question = \"question: What color is the sky? 
context: Sky is blue\"\n", + "\n", + "# Question is tokenized and padded\n", + "# Note that this is hardcoded here but you must implement this in the upcoming exercise\n", + "tokenized_padded_question = tf.constant([[822, 10, 363, 945, 19, 8, 5796, 58, 2625, 10, 5643, 19, 1692, 0, 0]])\n", + "\n", + "# All answers begin with the string \"answer: \"\n", + "# Feel free to check that this is indeed the tokenized version of that string\n", + "tokenized_answer = tf.constant([[1525, 10]])\n", + "\n", + "# Predict the next word using the transformer_utils.next_word function\n", + "# Notice that it expects the question, answer and model (in that order)\n", + "next_word = transformer_utils.next_word(tokenized_padded_question, tokenized_answer, transformer)\n", + "\n", + "print(f\"Predicted next word is: '{tokenizer.detokenize(next_word).numpy()[0].decode('utf-8')}'\")\n", + "\n", + "# Concatenate predicted word with answer so far\n", + "answer_so_far = tf.concat([tokenized_answer, next_word], axis=-1)\n", + "\n", + "print(f\"Answer so far: '{tokenizer.detokenize(answer_so_far).numpy()[0].decode('utf-8')}'\")" + ] + }, + { + "cell_type": "markdown", + "id": "8e23a6be", + "metadata": {}, + "source": [ + "\n", + "### Exercise 3 - Implement the question answering function\n", + "\n", + "Implement the `answer_question` function. Here are the steps:\n", + "- **Question Setup:**\n", + "\n", + " - Tokenize the given question using the provided tokenizer.\n", + " - Add an extra dimension to the tensor for compatibility.\n", + " - Pad the question tensor using `pad_sequences` to ensure the sequence has the specified max length. 
This function will truncate the sequence if it is larger or pad with zeros if it is shorter.\n", + "- **Answer Setup:**\n", + " - Tokenize the initial answer, noting that all answers begin with the string \"answer: \".\n", + " - Add an extra dimension to the tensor for compatibility.\n", + " - Get the id of the `EOS` token, typically represented by 1.\n", + "- **Generate Answer:**\n", + " - Loop for `decoder_maxlen` iterations.\n", + " - Use the `transformer_utils.next_word` function, which predicts the next token in the answer using the model, input document, and the current state of the output.\n", + " - Concatenate the predicted next word to the output tensor.\n", + "- **Stop Condition:**\n", + " - The text generation stops if the model predicts the `EOS` token.\n", + " - If the `EOS` token is predicted, break out of the loop." + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "id": "91def253", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: answer_question\n", + "def answer_question(question, model, tokenizer, encoder_maxlen=150, decoder_maxlen=50):\n", + " \"\"\"\n", + " A function for question answering using the transformer model\n", + " Arguments:\n", + " question (tf.Tensor): Input data with question and context\n", + " model (tf.keras.model): The transformer model\n", + " tokenizer (function): The SentencePiece tokenizer\n", + " encoder_maxlen (number): Max length of the encoded sequence\n", + " decoder_maxlen (number): Max length of the decoded sequence\n", + " Returns:\n", + " _ (str): The answer to the question\n", + " \"\"\"\n", + " \n", + " ### START CODE HERE ###\n", + " \n", + " # QUESTION SETUP\n", + " \n", + " # Tokenize the question\n", + " tokenized_question = tokenizer.tokenize(question)\n", + " \n", + " # Add an extra dimension to the tensor\n", + " tokenized_question = tf.expand_dims(tokenized_question, 0) \n", + " \n", + " # Pad the question tensor\n", + 
" padded_question = tf.keras.preprocessing.sequence.pad_sequences(tokenized_question,\n", + " maxlen=encoder_maxlen,\n", + " padding='post', \n", + " truncating='post') \n", + " # ANSWER SETUP\n", + " \n", + " # Tokenize the answer\n", + " # Hint: All answers begin with the string \"answer: \"\n", + " tokenized_answer = tokenizer.tokenize(\"answer: \")\n", + " \n", + " # Add an extra dimension to the tensor\n", + " tokenized_answer = tf.expand_dims(tokenized_answer, 0)\n", + " \n", + " # Get the id of the EOS token\n", + " eos = tokenizer.string_to_id(\"</s>\") \n", + " \n", + " # Loop for decoder_maxlen iterations\n", + " for i in range(decoder_maxlen):\n", + " \n", + " # Predict the next word using the model, the input document and the current state of output\n", + " next_word = transformer_utils.next_word(padded_question, tokenized_answer, model)\n", + " \n", + " # Concatenate the predicted next word to the output (append, so tokens stay in order)\n", + " tokenized_answer = tf.concat([tokenized_answer, next_word], axis=1)\n", + " \n", + " # The text generation stops if the model predicts the EOS token\n", + " if next_word == eos:\n", + " break \n", + " ### END CODE HERE ###\n", + "\n", + " return tokenized_answer" + ] + }, + { + "cell_type": "markdown", + "id": "de501d8c", + "metadata": {}, + "source": [ + "Let's test the model with some questions from the training dataset. Check if the answers match the correct ones." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 34, + "id": "163e79eb", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[34mb'Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat Treat 2002 2002 Treat 2002 2002 2002 2002 2002 2002 Treat Treat Treat 2002 2002 2002 economic economic 2002 2002 2002 2002 2002 2002 2002 2002 2002 2002 Treat answer:'\u001b[0m\n", + "\n", + "question: When was the Chechen-Ingush Autonomous Soviet Socialist Republic transferred from the Georgian SSR? context: On January 9, 1957, Karachay Autonomous Oblast and Chechen-Ingush Autonomous Soviet Socialist Republic were restored by Khrushchev and they were transferred from the Georgian SSR back to the Russian SFSR.\n", + "\u001b[32manswer: January 9, 1957\u001b[0m\n" + ] + } + ], + "source": [ + "idx = 10408\n", + "\n", + "result = answer_question(inputs_train[idx], transformer, tokenizer)\n", + "print(colored(pretty_decode(result, sentinels, tokenizer).numpy()[0], 'blue'))\n", + "print()\n", + "print(inputs_train[idx])\n", + "print(colored(targets_train[idx], 'green'))" + ] + }, + { + "cell_type": "markdown", + "id": "2fae9e8d", + "metadata": {}, + "source": [ + "#### **Expected Output:**\n", + "```\n", + "b'answer: January 9, 1957'\n", + "\n", + "question: When was the Chechen-Ingush Autonomous Soviet Socialist Republic transferred from the Georgian SSR? 
context: On January 9, 1957, Karachay Autonomous Oblast and Chechen-Ingush Autonomous Soviet Socialist Republic were restored by Khrushchev and they were transferred from the Georgian SSR back to the Russian SFSR.\n", + "answer: January 9, 1957\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "id": "19ac8067", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[91mTese case 0 failed\n", + "\n", + "test_case(msg='Wrong answer', want='five', got='answer: on on re on re on re on re on re on re on re on re on on on on on on on on on on on on on on on ', failed=True)\n", + "WARNING:tensorflow:Detecting that an object or model or tf.train.Checkpoint is being deleted with unrestored values. See the following logs for the specific values in question. To silence these warnings, use `status.expect_partial()`. See https://www.tensorflow.org/api_docs/python/tf/train/Checkpoint#restorefor details about the status object returned by the restore function.\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).encoder.enc_layers.0.ffn.layer_with_weights-0.kernel\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).encoder.enc_layers.0.ffn.layer_with_weights-0.bias\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).encoder.enc_layers.0.ffn.layer_with_weights-1.kernel\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).encoder.enc_layers.0.ffn.layer_with_weights-1.bias\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).encoder.enc_layers.1.ffn.layer_with_weights-0.kernel\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).encoder.enc_layers.1.ffn.layer_with_weights-0.bias\n", + 
"WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).encoder.enc_layers.1.ffn.layer_with_weights-1.kernel\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).encoder.enc_layers.1.ffn.layer_with_weights-1.bias\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).decoder.dec_layers.0.ffn.layer_with_weights-0.kernel\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).decoder.dec_layers.0.ffn.layer_with_weights-0.bias\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).decoder.dec_layers.0.ffn.layer_with_weights-1.kernel\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).decoder.dec_layers.0.ffn.layer_with_weights-1.bias\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).decoder.dec_layers.1.ffn.layer_with_weights-0.kernel\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).decoder.dec_layers.1.ffn.layer_with_weights-0.bias\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).decoder.dec_layers.1.ffn.layer_with_weights-1.kernel\n", + "WARNING:tensorflow:Value in checkpoint could not be found in the restored object: (root).decoder.dec_layers.1.ffn.layer_with_weights-1.bias\n" + ] + } + ], + "source": [ + "# UNIT TEST\n", + "w3_unittest.test_answer_question(answer_question)" + ] + }, + { + "cell_type": "markdown", + "id": "06588341", + "metadata": {}, + "source": [ + "Test the model with question 110" + ] + }, + { + "cell_type": "code", + "execution_count": 84, + "id": "6c381df3", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[34mb'answer: 50'\u001b[0m\n", + "\n", + 
"question: What percentage of the vote was recorded as approving Napoleon's constitution? context: Napoleon established a political system that historian Martyn Lyons called \"dictatorship by plebiscite.\" Worried by the democratic forces unleashed by the Revolution, but unwilling to ignore them entirely, Napoleon resorted to regular electoral consultations with the French people on his road to imperial power. He drafted the Constitution of the Year VIII and secured his own election as First Consul, taking up residence at the Tuileries. The constitution was approved in a rigged plebiscite held the following January, with 99.94 percent officially listed as voting \"yes.\" Napoleon's brother, Lucien, had falsified the returns to show that 3 million people had participated in the plebiscite; the real number was 1.5 million. Political observers at the time assumed the eligible French voting public numbered about 5 million people, so the regime artificially doubled the participation rate to indicate popular enthusiasm for the Consulate. In the first few months of the Consulate, with war in Europe still raging and internal instability still plaguing the country, Napoleon's grip on power remained very tenuous.\u001b[0m\n", + "\u001b[32manswer: 99.94\u001b[0m\n" + ] + } + ], + "source": [ + "idx = 110\n", + "result = answer_question(inputs_test[idx], transformer, tokenizer)\n", + "print(colored(pretty_decode(result, sentinels, tokenizer).numpy()[0], 'blue'))\n", + "print()\n", + "print(inputs_test[idx])\n", + "print(colored(targets_test[idx], 'green'))" + ] + }, + { + "cell_type": "markdown", + "id": "fd09ec41", + "metadata": {}, + "source": [ + "Test the model with question 311. Use this cell to play with the model by selecting other test questions. Check whether the model has learned something or is just generating random text." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 85, + "id": "fc9c898f", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[34mb'answer: June 1840'\u001b[0m\n", + "\n", + "question: On what date was a state funeral held for Napoleon? context: In 1840, Louis Philippe I obtained permission from the British to return Napoleon's remains to France. On 15 December 1840, a state funeral was held. The hearse proceeded from the Arc de Triomphe down the Champs-Élysées, across the Place de la Concorde to the Esplanade des Invalides and then to the cupola in St Jérôme's Chapel, where it remained until the tomb designed by Louis Visconti was completed. In 1861, Napoleon's remains were entombed in a porphyry sarcophagus in the crypt under the dome at Les Invalides.\n", + "\u001b[32manswer: 15 December 1840\u001b[0m\n" + ] + } + ], + "source": [ + "idx = 311\n", + "result = answer_question(inputs_test[idx], transformer, tokenizer)\n", + "print(colored(pretty_decode(result, sentinels, tokenizer).numpy()[0], 'blue'))\n", + "print()\n", + "print(inputs_test[idx])\n", + "print(colored(targets_test[idx], 'green'))" + ] + }, + { + "cell_type": "markdown", + "id": "0e3d00de", + "metadata": {}, + "source": [ + "Congratulations, you have finished the last assignment of this specialization. You now know what is behind powerful models like ChatGPT. Now it is time to find and solve the many problems that can be approached with NLP."
+ ] + } + ], + "metadata": { + "grader_version": "1", + "jupytext": { + "encoding": "# -*- coding: utf-8 -*-" + }, + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.10" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/test_utils.cpython-311.pyc b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/test_utils.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..bd68ece18dde7f7616ea35fb1b6ea9f8f732d630 Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/test_utils.cpython-311.pyc differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/test_utils.cpython-38.pyc b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/test_utils.cpython-38.pyc new file mode 100644 index 0000000000000000000000000000000000000000..390bb487ffab1941edaf5940cf8dc89f76f0575f Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/test_utils.cpython-38.pyc differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/transformer.cpython-38.pyc b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/transformer.cpython-38.pyc new file mode 100644 index 0000000000000000000000000000000000000000..e7d7175c610cfbd498d635ea5cce7150641a9110 Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/transformer.cpython-38.pyc differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/transformer_utils.cpython-38.pyc b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/transformer_utils.cpython-38.pyc new file mode 100644 index 
0000000000000000000000000000000000000000..04a445cf8b62cbda310046b7dfcbac0b9de8338b Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/transformer_utils.cpython-38.pyc differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/utils.cpython-311.pyc b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/utils.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..f86ca8d7da77e0202efad0fbbba80a59fca135b8 Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/utils.cpython-311.pyc differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/utils.cpython-38.pyc b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/utils.cpython-38.pyc new file mode 100644 index 0000000000000000000000000000000000000000..edf8ccd68f4a9cf1d3e776030d582dd4d9c3b5b2 Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/utils.cpython-38.pyc differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/w3_tests.cpython-38.pyc b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/w3_tests.cpython-38.pyc new file mode 100644 index 0000000000000000000000000000000000000000..42e86ea90dfea64c088a9df633ce398287c739a4 Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/w3_tests.cpython-38.pyc differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/w3_unittest.cpython-311.pyc b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/w3_unittest.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..995f52f8c7a025483ec5a6f9963b6b7683444f64 Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/w3_unittest.cpython-311.pyc differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/w3_unittest.cpython-38.pyc b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/w3_unittest.cpython-38.pyc new file 
mode 100644 index 0000000000000000000000000000000000000000..51bcaf4a0b29b9dd160f9acc3d2221018429679a Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/__pycache__/w3_unittest.cpython-38.pyc differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/data/c4-en-10k.json b/NLP with Attention Models/QA/QA_T5/Files/tf/data/c4-en-10k.json new file mode 100644 index 0000000000000000000000000000000000000000..f84b5d3bf565620aa41b557d0366501d95c46cdf --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/data/c4-en-10k.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:6952c080b8fc8be3d5ca2889747f1e668405a659876ec0a90874d75da46e823b +size 21889789 diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/data/c4-en-10k.jsonl b/NLP with Attention Models/QA/QA_T5/Files/tf/data/c4-en-10k.jsonl new file mode 100644 index 0000000000000000000000000000000000000000..cf4105c348f7d062aa47327472402b4d00947b8f --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/data/c4-en-10k.jsonl @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:2e822b5bad87c5fa0315cc59ef9531116d62d42b7d9b11323588bcaf45f49fde +size 21879788 diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/data/data.txt b/NLP with Attention Models/QA/QA_T5/Files/tf/data/data.txt new file mode 100644 index 0000000000000000000000000000000000000000..2cd6069cceed2c351bd305aff5d7902af9a25058 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/data/data.txt @@ -0,0 +1,5 @@ +{'content-length': b'1970', 'content-type': b'text/plain', 'text': b'Beginners BBQ Class Taking Place in Missoula!\nDo you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers. 
He will be teaching a beginner level class for everyone who wants to get better with their culinary skills.\nHe will teach you everything you need to know to compete in a KCBS BBQ competition, including techniques, recipes, timelines, meat selection and trimming, plus smoker and fire information.\nThe cost to be in the class is $35 per person, and for spectators it is free. Included in the cost will be either a t-shirt or apron and you will be tasting samples of each meat that is prepared.', 'timestamp': b'2019-04-25T12:57:54Z', 'url': b'https://klyq.com/beginners-bbq-class-taking-place-in-missoula/'} +{'content-length': b'12064', 'content-type': b'text/plain', 'text': b'Discussion in \'Mac OS X Lion (10.7)\' started by axboi87, Jan 20, 2012.\nI\'ve got a 500gb internal drive and a 240gb SSD.\nWhen trying to restore using disk utility i\'m given the error "Not enough space on disk ____ to restore"\nBut I shouldn\'t have to do that!!!\nAny ideas or workarounds before resorting to the above?\nUse Carbon Copy Cloner to copy one drive to the other. I\'ve done this several times going from larger HDD to smaller SSD and I wound up with a bootable SSD drive. One step you have to remember not to skip is to use Disk Utility to partition the SSD as GUID partition scheme HFS+ before doing the clone. If it came Apple Partition Scheme, even if you let CCC do the clone, the resulting drive won\'t be bootable. CCC usually works in "file mode" and it can easily copy a larger drive (that\'s mostly empty) onto a smaller drive. If you tell CCC to clone a drive you did NOT boot from, it can work in block copy mode where the destination drive must be the same size or larger than the drive you are cloning from (if I recall).\nI\'ve actually done this somehow on Disk Utility several times (booting from a different drive (or even the dvd) so not running disk utility from the drive your cloning) and had it work just fine from larger to smaller bootable clone. 
Definitely format the drive cloning to first, as bootable Apple etc..\nThanks for pointing this out. My only experience using DU to go larger to smaller was when I was trying to make a Lion install stick and I was unable to restore InstallESD.dmg to a 4 GB USB stick but of course the reason that wouldn\'t fit is there was slightly more than 4 GB of data.', 'timestamp': b'2019-04-21T10:07:13Z', 'url':b'https://forums.macrumors.com/threads/restore-from-larger-disk-to-smaller-disk.1311329/'} +{'content-length': b'5235', 'content-type': b'text/plain', 'text': b'Foil plaid lycra and spandex shortall with metallic slinky insets. Attached metallic elastic belt with O-ring. Headband included. Great hip hop or jazz dance costume. Made in the USA.', 'timestamp': b'2019-04-25T10:40:23Z', 'url': b'https://awishcometrue.com/Catalogs/Clearance/Tweens/V1960-Find-A-Way'} +{'content-length': b'4967', 'content-type': b'text/plain', 'text': b"How many backlinks per day for new site?\nDiscussion in 'Black Hat SEO' started by Omoplata, Dec 3, 2010.\n1) for a newly created site, what's the max # backlinks per day I should do to be safe?\n2) how long do I have to let my site age before I can start making more blinks?\nI did about 6000 forum profiles every 24 hours for 10 days for one of my sites which had a brand new domain.\nThere is three backlinks for every of these forum profile so thats 18 000 backlinks every 24 hours and nothing happened in terms of being penalized or sandboxed. This is now maybe 3 months ago and the site is ranking on first page for a lot of my targeted keywords.\nbuild more you can in starting but do manual submission and not spammy type means manual + relevant to the post.. then after 1 month you can make a big blast..\nWow, dude, you built 18k backlinks a day on a brand new site? How quickly did you rank up? 
What kind of competition/searches did those keywords have?", 'timestamp': b'2019-04-21T12:46:19Z', 'url': b'https://www.blackhatworld.com/seo/how-many-backlinks-per-day-for-new-site.258615/'} +{'content-length': b'4499', 'content-type': b'text/plain', 'text': b'The Denver Board of Education opened the 2017-18 school year with an update on projects that include new construction, upgrades, heat mitigation and quality learning environments.\nWe are excited that Denver students will be the beneficiaries of a four year, $572 million General Obligation Bond. Since the passage of the bond, our construction team has worked to schedule the projects over the four-year term of the bond.\nDenver voters on Tuesday approved bond and mill funding measures for students in Denver Public Schools, agreeing to invest $572 million in bond funding to build and improve schools and $56.6 million in operating dollars to support proven initiatives, such as early literacy.\nDenver voters say yes to bond and mill levy funding support for DPS students and schools. Click to learn more about the details of the voter-approved bond measure.\nDenver voters on Nov. 8 approved bond and mill funding measures for DPS students and schools. 
Learn more about what\xe2\x80\x99s included in the mill levy measure.', 'timestamp': b'2019-04-20T14:33:21Z', 'url': b'http://bond.dpsk12.org/category/news/'} \ No newline at end of file diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/data/inputs_targets_pairs_file.txt b/NLP with Attention Models/QA/QA_T5/Files/tf/data/inputs_targets_pairs_file.txt new file mode 100644 index 0000000000000000000000000000000000000000..d45ccc83e766c3d400d57e592f6ab370637d50ed Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/data/inputs_targets_pairs_file.txt differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/data/train-v2.0.json b/NLP with Attention Models/QA/QA_T5/Files/tf/data/train-v2.0.json new file mode 100644 index 0000000000000000000000000000000000000000..07e3c5fd9fc26adbdc85dbac3ae64d803d891daa --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/data/train-v2.0.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:68dcfbb971bd3e96d5b46c7177b16c1a4e7d4bdef19fb204502738552dede002 +size 42123633 diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/images/colab_help_1.png b/NLP with Attention Models/QA/QA_T5/Files/tf/images/colab_help_1.png new file mode 100644 index 0000000000000000000000000000000000000000..c5ba273392ec227384b39811c6d7f5e646508979 Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/images/colab_help_1.png differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/images/colab_help_2.png b/NLP with Attention Models/QA/QA_T5/Files/tf/images/colab_help_2.png new file mode 100644 index 0000000000000000000000000000000000000000..9db1fda5adc9a9e8d15d2632b49cffed638b62a1 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/images/colab_help_2.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:9839fc77e5254fc80f7d292eb722050b0a53315062d10d1d74dfff3dbb0b37bd +size 133878 diff --git a/NLP with Attention 
Models/QA/QA_T5/Files/tf/images/encoder.png b/NLP with Attention Models/QA/QA_T5/Files/tf/images/encoder.png new file mode 100644 index 0000000000000000000000000000000000000000..8950c960acb6e0053ea967370cf9300ec36ecadd Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/images/encoder.png differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/images/fulltransformer.png b/NLP with Attention Models/QA/QA_T5/Files/tf/images/fulltransformer.png new file mode 100644 index 0000000000000000000000000000000000000000..529d7ff5ede9d0fbe7f76984788f9601b214a798 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/images/fulltransformer.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:6086864d162483989b6d9e9f49c1ea59b106c48e4b3875c284567df52c01442d +size 131507 diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/images/loss.png b/NLP with Attention Models/QA/QA_T5/Files/tf/images/loss.png new file mode 100644 index 0000000000000000000000000000000000000000..961c2013dd9bbefb3bc3e0dbaa409ad1e06d346a Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/images/loss.png differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/images/qa.png b/NLP with Attention Models/QA/QA_T5/Files/tf/images/qa.png new file mode 100644 index 0000000000000000000000000000000000000000..7a61a3a100e809efea550b6678edd7b87f0f43f9 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/images/qa.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:227d1c9fc5c6b377b99af1b7a541acb8a7dd7e35d75e9d8318a7084a9c16b28b +size 1910396 diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/models/sentencepiece.model b/NLP with Attention Models/QA/QA_T5/Files/tf/models/sentencepiece.model new file mode 100644 index 0000000000000000000000000000000000000000..317a5ccbde45300f5d1d970d4d449af2108b147e --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/models/sentencepiece.model @@ 
-0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:d60acb128cf7b7f2536e8f38a5b18a05535c9e14c7a355904270e15b0945ea86 +size 791656 diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_c4.data-00000-of-00001 b/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_c4.data-00000-of-00001 new file mode 100644 index 0000000000000000000000000000000000000000..307e8af63420fe0b9d3572d9ed1e204d326121ce --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_c4.data-00000-of-00001 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8d82336c883d73ce51dfff3c52281277502662579527f5484a47aa0a681334e8 +size 53003988 diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_c4.index b/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_c4.index new file mode 100644 index 0000000000000000000000000000000000000000..7191f912611edad03ba2a5e47c33bc4ce5b250a4 Binary files /dev/null and b/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_c4.index differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_qa3.data-00000-of-00001 b/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_qa3.data-00000-of-00001 new file mode 100644 index 0000000000000000000000000000000000000000..564d4e1ff329fc7519ea52132f7fb5c2a1ecc3e0 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_qa3.data-00000-of-00001 @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:5e23b1c44883369302e5f5f31fae85a4ee889e1881d3fd2695420e22f9f26572 +size 53003988 diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_qa3.index b/NLP with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_qa3.index new file mode 100644 index 0000000000000000000000000000000000000000..903a53967bf96a714fc5f0b8a62cda9c9e24e883 Binary files /dev/null and b/NLP 
with Attention Models/QA/QA_T5/Files/tf/pretrained_models/model_qa3.index differ diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/test_utils.py b/NLP with Attention Models/QA/QA_T5/Files/tf/test_utils.py new file mode 100644 index 0000000000000000000000000000000000000000..dc3971ce88825edc90be6a80726ae97db305eea9 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/test_utils.py @@ -0,0 +1,69 @@ +import numpy as np +from termcolor import colored + +# + +from tensorflow.keras.layers import Embedding +from tensorflow.keras.layers import GRU +from tensorflow.keras.layers import Dense +from tensorflow.keras.layers import Dropout + +from utils import PositionalEmbedding +from dlai_grader.grading import test_case, object_to_grade +from types import ModuleType, FunctionType + +# Compare the two inputs + +def comparator(learner, instructor, modelId): + cases = [] + t = test_case() + if len(learner) != len(instructor): + t.failed = True + t.msg = f"{modelId}: The number of layers in the proposed model does not agree with the expected model" + t.want = len(instructor) + t.got = len(learner) + cases.append(t) + index_layer = 1 + + for a, b in zip(learner, instructor): + t = test_case() + if tuple(a) != tuple(b): + t.failed = True + t.msg = f"{modelId}: Test failed in layer {index_layer}" + t.want = b + t.got = a + cases.append(t) + index_layer = index_layer + 1 + return cases + + +def summary(model): + result = [] + for layer in model.layers: + descriptors = [layer.__class__.__name__, + layer.output_shape, layer.count_params()] + if (type(layer) == Dense): + descriptors.append(layer.activation.__name__) + if (type(layer) == Dropout): + descriptors.append(f"rate={layer.rate}") + if (type(layer) == GRU): + descriptors.append(f"return_sequences={layer.return_sequences}") + descriptors.append(f"return_state={layer.return_state}") + if (type(layer) == PositionalEmbedding): + descriptors.append(f"vocab_size={layer.vocab_size}") + 
descriptors.append(f"d_model={layer.d_model}") + descriptors.append(f"max_length={layer.max_length}") + if hasattr(layer, 'd_model'): + descriptors.append(f"d_model={layer.d_model}") + if hasattr(layer, 'd_ff'): + descriptors.append(f"d_ff={layer.d_ff}") + if hasattr(layer, 'n_heads'): + descriptors.append(f"n_heads={layer.n_heads}") + if hasattr(layer, 'dropout'): + descriptors.append(f"dropout={layer.dropout}") + if hasattr(layer, 'ff_activation'): + descriptors.append(f"ff_activation={layer.ff_activation}") + + result.append(descriptors) + return result + + diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/transformer_utils.py b/NLP with Attention Models/QA/QA_T5/Files/tf/transformer_utils.py new file mode 100644 index 0000000000000000000000000000000000000000..39996c359dcffb80b2263420e50ed96291685475 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/transformer_utils.py @@ -0,0 +1,591 @@ +# import os + +import numpy as np +#import pandas as pd +import tensorflow as tf +#import matplotlib.pyplot as plt +#import time +#import utils + +def positional_encoding(positions, d_model): + """ + Precomputes a matrix with all the positional encodings + + Arguments: + positions (int): Maximum number of positions to be encoded + d_model (int): Encoding size + + Returns: + pos_encoding (tf.Tensor): A matrix of shape (1, positions, d_model) with the positional encodings + """ + + position = np.arange(positions)[:, np.newaxis] + k = np.arange(d_model)[np.newaxis, :] + i = k // 2 + + # compute the angle rates and the matrix angle_rads of all the angles + angle_rates = 1 / np.power(10000, (2 * i) / np.float32(d_model)) + angle_rads = position * angle_rates + + # apply sin to even indices in the array; 2i + angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2]) + + # apply cos to odd indices in the array; 2i+1 + angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2]) + + pos_encoding = angle_rads[np.newaxis, ...]
+ + return tf.cast(pos_encoding, dtype=tf.float32) + +def create_padding_mask(decoder_token_ids): + """ + Creates a matrix mask for the padding cells + + Arguments: + decoder_token_ids (matrix like): matrix of size (n, m) + + Returns: + mask (tf.Tensor): binary tensor of size (n, 1, m) + """ + seq = 1 - tf.cast(tf.math.equal(decoder_token_ids, 0), tf.float32) + + # add extra dimensions to add the padding to the attention logits. + # this will allow for broadcasting later when comparing sequences + return seq[:, tf.newaxis, :] + + +def create_look_ahead_mask(sequence_length): + """ + Returns a lower triangular matrix filled with ones + + Arguments: + sequence_length (int): matrix size + + Returns: + mask (tf.Tensor): binary tensor of size (sequence_length, sequence_length) + """ + mask = tf.linalg.band_part(tf.ones((1, sequence_length, sequence_length)), -1, 0) + return mask + +def scaled_dot_product_attention(q, k, v, mask): + """ + Calculate the attention weights. + q, k, v must have matching leading dimensions. + k, v must have matching penultimate dimension, i.e.: seq_len_k = seq_len_v. + The mask has different shapes depending on its type(padding or look ahead) + but it must be broadcastable for addition. + + Arguments: + q (tf.Tensor): query of shape (..., seq_len_q, depth) + k (tf.Tensor): key of shape (..., seq_len_k, depth) + v (tf.Tensor): value of shape (..., seq_len_v, depth_v) + mask (tf.Tensor): mask with shape broadcastable + to (..., seq_len_q, seq_len_k). Defaults to None. + + Returns: + output -- attention_weights + """ + ### START CODE HERE ### + + # Multiply q and k transposed. + matmul_qk = tf.matmul(q, k, transpose_b=True) # (..., seq_len_q, seq_len_k) + + # scale matmul_qk with the square root of dk + dk = tf.cast(tf.shape(k)[-1], tf.float32) + scaled_attention_logits = matmul_qk / tf.math.sqrt(dk) + + # add the mask to the scaled tensor. + if mask is not None: # Don't replace this None + scaled_attention_logits += (1. 
- mask) * -1e9 + + # softmax is normalized on the last axis (seq_len_k) so that the scores add up to 1. + attention_weights = tf.keras.activations.softmax(scaled_attention_logits) # (..., seq_len_q, seq_len_k) + + # Multiply the attention weights by v + output = tf.matmul(attention_weights, v) # (..., seq_len_q, depth_v) + + ### END CODE HERE ### + + return output, attention_weights + +class FullyConnected(tf.keras.layers.Layer): + + def __init__(self, embedding_dim, fully_connected_dim): + super(FullyConnected, self).__init__() + self.dense1 = tf.keras.layers.Dense(fully_connected_dim, activation='relu') + self.dense2 = tf.keras.layers.Dense(embedding_dim) + def call(self,x): + x = self.dense1(x) + return self.dense2(x) + + +# GRADED FUNCTION EncoderLayer +class EncoderLayer(tf.keras.layers.Layer): + """ + The encoder layer is composed by a multi-head self-attention mechanism, + followed by a simple, positionwise fully connected feed-forward network. + This architecture includes a residual connection around each of the two + sub-layers, followed by layer normalization. 
+ """ + def __init__(self, embedding_dim, num_heads, fully_connected_dim, + dropout_rate=0.1, layernorm_eps=1e-6): + + super(EncoderLayer, self).__init__() + + self.mha = tf.keras.layers.MultiHeadAttention( + num_heads=num_heads, + key_dim=embedding_dim, + dropout=dropout_rate + ) + + self.ffn = FullyConnected( + embedding_dim=embedding_dim, + fully_connected_dim=fully_connected_dim + ) + + self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps) + self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps) + + self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate) + + def call(self, x, training, mask): + """ + Forward pass for the Encoder Layer + + Arguments: + x (tf.Tensor): Tensor of shape (batch_size, input_seq_len, fully_connected_dim) + training (bool): Boolean, set to true to activate + the training mode for dropout layers + mask (tf.Tensor): Boolean mask to ensure that the padding is not + treated as part of the input + Returns: + encoder_layer_out (tf.Tensor): Tensor of shape (batch_size, input_seq_len, embedding_dim) + """ + # START CODE HERE + # calculate self-attention using mha(~1 line). 
+ # Dropout is added by Keras automatically if the dropout parameter is non-zero during training + self_mha_output = self.mha(x, x, x, mask) # Self attention (batch_size, input_seq_len, fully_connected_dim) + + # skip connection + # apply layer normalization on sum of the input and the attention output to get the + # output of the multi-head attention layer (~1 line) + skip_x_attention = self.layernorm1(x + self_mha_output) # (batch_size, input_seq_len, fully_connected_dim) + + # pass the output of the multi-head attention layer through a ffn (~1 line) + ffn_output = self.ffn(skip_x_attention) # (batch_size, input_seq_len, fully_connected_dim) + + # apply dropout layer to ffn output during training (~1 line) + # use `training=training` + ffn_output = self.dropout_ffn(ffn_output, training=training) + + # apply layer normalization on sum of the output from multi-head attention (skip connection) and ffn output to get the + # output of the encoder layer (~1 line) + encoder_layer_out = self.layernorm2(skip_x_attention + ffn_output) # (batch_size, input_seq_len, embedding_dim) + # END CODE HERE + + return encoder_layer_out + + +class Encoder(tf.keras.layers.Layer): + """ + The entire Encoder starts by passing the input to an embedding layer + and using positional encoding to then pass the output through a stack of + encoder Layers + + """ + def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, input_vocab_size, + maximum_position_encoding, dropout_rate=0.1, layernorm_eps=1e-6): + super(Encoder, self).__init__() + + self.embedding_dim = embedding_dim + self.num_layers = num_layers + + self.embedding = tf.keras.layers.Embedding(input_vocab_size, self.embedding_dim) + self.pos_encoding = positional_encoding(maximum_position_encoding, + self.embedding_dim) + + self.enc_layers = [EncoderLayer(embedding_dim=self.embedding_dim, + num_heads=num_heads, + fully_connected_dim=fully_connected_dim, + dropout_rate=dropout_rate, + layernorm_eps=layernorm_eps) + 
for _ in range(self.num_layers)] + + self.dropout = tf.keras.layers.Dropout(dropout_rate) + + def call(self, x, training, mask): + """ + Forward pass for the Encoder + + Arguments: + x (tf.Tensor): Tensor of shape (batch_size, seq_len, embedding_dim) + training (bool): Boolean, set to true to activate + the training mode for dropout layers + mask (tf.Tensor): Boolean mask to ensure that the padding is not + treated as part of the input + + Returns: + x (tf.Tensor): Tensor of shape (batch_size, seq_len, embedding_dim) + """ + seq_len = tf.shape(x)[1] + + # START CODE HERE + # Pass input through the Embedding layer + x = self.embedding(x) # (batch_size, input_seq_len, embedding_dim) + # Scale embedding by multiplying it by the square root of the embedding dimension + x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32)) + # Add the position encoding to embedding + x += self.pos_encoding[:, :seq_len, :] + # Pass the encoded embedding through a dropout layer + # use `training=training` + x = self.dropout(x, training=training) + # Pass the output through the stack of encoding layers + for i in range(self.num_layers): + x = self.enc_layers[i](x, training, mask) + # END CODE HERE + + return x # (batch_size, input_seq_len, embedding_dim) + +class DecoderLayer(tf.keras.layers.Layer): + """ + The decoder layer is composed of two multi-head attention blocks, + one that takes the new input and uses self-attention, and the other + one that combines it with the output of the encoder, followed by a + fully connected block.
+ """ + def __init__(self, embedding_dim, num_heads, fully_connected_dim, dropout_rate=0.1, layernorm_eps=1e-6): + super(DecoderLayer, self).__init__() + + self.mha1 = tf.keras.layers.MultiHeadAttention( + num_heads=num_heads, + key_dim=embedding_dim, + dropout=dropout_rate + ) + + self.mha2 = tf.keras.layers.MultiHeadAttention( + num_heads=num_heads, + key_dim=embedding_dim, + dropout=dropout_rate + ) + + self.ffn = FullyConnected( + embedding_dim=embedding_dim, + fully_connected_dim=fully_connected_dim + ) + + self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps) + self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps) + self.layernorm3 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps) + + self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate) + + def call(self, x, enc_output, training, look_ahead_mask, padding_mask): + """ + Forward pass for the Decoder Layer + + Arguments: + x (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim) + enc_output (tf.Tensor): Tensor of shape(batch_size, input_seq_len, fully_connected_dim) + training (bool): Boolean, set to true to activate + the training mode for dropout layers + look_ahead_mask (tf.Tensor): Boolean mask for the target_input + padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer + Returns: + out3 (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim) + attn_weights_block1 (tf.Tensor): Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len) + attn_weights_block2 (tf.Tensor): Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len) + """ + + # START CODE HERE + # enc_output.shape == (batch_size, input_seq_len, fully_connected_dim) + + # BLOCK 1 + # calculate self-attention and return attention scores as attn_weights_block1. + # Dropout will be applied during training (~1 line). 
+ mult_attn_out1, attn_weights_block1 = self.mha1(x, x, x, look_ahead_mask, return_attention_scores=True) # (batch_size, target_seq_len, d_model) + + # apply layer normalization (layernorm1) to the sum of the attention output and the input (~1 line) + Q1 = self.layernorm1(mult_attn_out1 + x) + + # BLOCK 2 + # calculate cross-attention using the Q from the first block and K and V from the encoder output. + # Dropout will be applied during training + # Return attention scores as attn_weights_block2 (~1 line) + mult_attn_out2, attn_weights_block2 = self.mha2(Q1, enc_output, enc_output, padding_mask, return_attention_scores=True) # (batch_size, target_seq_len, d_model) + + # apply layer normalization (layernorm2) to the sum of the attention output and the output of the first block (~1 line) + mult_attn_out2 = self.layernorm2(mult_attn_out2 + Q1) # (batch_size, target_seq_len, fully_connected_dim) + + #BLOCK 3 + # pass the output of the second block through a ffn + ffn_output = self.ffn(mult_attn_out2) # (batch_size, target_seq_len, fully_connected_dim) + + # apply a dropout layer to the ffn output + # use `training=training` + ffn_output = self.dropout_ffn(ffn_output, training=training) + + # apply layer normalization (layernorm3) to the sum of the ffn output and the output of the second block + out3 = self.layernorm3(ffn_output + mult_attn_out2) # (batch_size, target_seq_len, fully_connected_dim) + # END CODE HERE + + return out3, attn_weights_block1, attn_weights_block2 + +# GRADED FUNCTION Decoder +class Decoder(tf.keras.layers.Layer): + """ + The entire Decoder starts by passing the target input to an embedding layer + and using positional encoding to then pass the output through a stack of + decoder Layers + + """ + def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, target_vocab_size, + maximum_position_encoding, dropout_rate=0.1, layernorm_eps=1e-6): + super(Decoder, self).__init__() + + self.embedding_dim = embedding_dim +
self.num_layers = num_layers + + self.embedding = tf.keras.layers.Embedding(target_vocab_size, self.embedding_dim) + self.pos_encoding = positional_encoding(maximum_position_encoding, self.embedding_dim) + + self.dec_layers = [DecoderLayer(embedding_dim=self.embedding_dim, + num_heads=num_heads, + fully_connected_dim=fully_connected_dim, + dropout_rate=dropout_rate, + layernorm_eps=layernorm_eps) + for _ in range(self.num_layers)] + self.dropout = tf.keras.layers.Dropout(dropout_rate) + + def call(self, x, enc_output, training, + look_ahead_mask, padding_mask): + """ + Forward pass for the Decoder + + Arguments: + x (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim) + enc_output (tf.Tensor): Tensor of shape(batch_size, input_seq_len, fully_connected_dim) + training (bool): Boolean, set to true to activate + the training mode for dropout layers + look_ahead_mask (tf.Tensor): Boolean mask for the target_input + padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer + Returns: + x (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim) + attention_weights (dict[str: tf.Tensor]): Dictionary of tensors containing all the attention weights + each of shape Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len) + """ + + seq_len = tf.shape(x)[1] + attention_weights = {} + + # START CODE HERE + # create word embeddings + x = self.embedding(x) # (batch_size, target_seq_len, fully_connected_dim) + + # scale embeddings by multiplying by the square root of their dimension + x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32)) + + # calculate positional encodings and add to word embedding + x += self.pos_encoding[:, :seq_len, :] + + # apply a dropout layer to x + # use `training=training` + x = self.dropout(x, training=training) + + # use a for loop to pass x through a stack of decoder layers and update attention_weights (~4 lines total) + for i in range(self.num_layers): + # pass x 
and the encoder output through a stack of decoder layers and save the attention weights + # of block 1 and 2 (~1 line) + x, block1, block2 = self.dec_layers[i](x, enc_output, training, + look_ahead_mask, padding_mask) + + #update attention_weights dictionary with the attention weights of block 1 and block 2 + attention_weights['decoder_layer{}_block1_self_att'.format(i+1)] = block1 + attention_weights['decoder_layer{}_block2_decenc_att'.format(i+1)] = block2 + # END CODE HERE + + # x.shape == (batch_size, target_seq_len, fully_connected_dim) + return x, attention_weights + +# + +class Transformer(tf.keras.Model): + """ + Complete transformer with an Encoder and a Decoder + """ + def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, input_vocab_size, + target_vocab_size, max_positional_encoding_input, + max_positional_encoding_target, dropout_rate=0.1, layernorm_eps=1e-6): + super(Transformer, self).__init__() + + self.encoder = Encoder(num_layers=num_layers, + embedding_dim=embedding_dim, + num_heads=num_heads, + fully_connected_dim=fully_connected_dim, + input_vocab_size=input_vocab_size, + maximum_position_encoding=max_positional_encoding_input, + dropout_rate=dropout_rate, + layernorm_eps=layernorm_eps) + + self.decoder = Decoder(num_layers=num_layers, + embedding_dim=embedding_dim, + num_heads=num_heads, + fully_connected_dim=fully_connected_dim, + target_vocab_size=target_vocab_size, + maximum_position_encoding=max_positional_encoding_target, + dropout_rate=dropout_rate, + layernorm_eps=layernorm_eps) + + self.final_layer = tf.keras.layers.Dense(target_vocab_size, activation='softmax') + + def call(self, input_sentence, output_sentence, training, enc_padding_mask, look_ahead_mask, dec_padding_mask): + """ + Forward pass for the entire Transformer + Arguments: + input_sentence (tf.Tensor): Tensor of shape (batch_size, input_seq_len, fully_connected_dim) + An array of the indexes of the words in the input sentence + output_sentence 
(tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim) + An array of the indexes of the words in the output sentence + training (bool): Boolean, set to true to activate + the training mode for dropout layers + enc_padding_mask (tf.Tensor): Boolean mask to ensure that the padding is not + treated as part of the input + look_ahead_mask (tf.Tensor): Boolean mask for the target_input + dec_padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer + Returns: + final_output (tf.Tensor): The final output of the model + attention_weights (dict[str: tf.Tensor]): Dictionary of tensors containing all the attention weights for the decoder + each of shape Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len) + + """ + # START CODE HERE + # call self.encoder with the appropriate arguments to get the encoder output + enc_output = self.encoder(input_sentence, training, enc_padding_mask) # (batch_size, inp_seq_len, fully_connected_dim) + + # call self.decoder with the appropriate arguments to get the decoder output + # dec_output.shape == (batch_size, tar_seq_len, fully_connected_dim) + dec_output, attention_weights = self.decoder( + output_sentence, enc_output, training, look_ahead_mask, dec_padding_mask) + + # pass decoder output through a linear layer and softmax (~2 lines) + final_output = self.final_layer(dec_output) # (batch_size, tar_seq_len, target_vocab_size) + # END CODE HERE + + return final_output, attention_weights + +class CustomSchedule(tf.keras.optimizers.schedules.LearningRateSchedule): + def __init__(self, d_model, warmup_steps=1000): + super(CustomSchedule, self).__init__() + self.d_model = tf.cast(d_model, dtype=tf.float32) + self.warmup_steps = warmup_steps + + def __call__(self, step): + step = tf.cast(step, dtype=tf.float32) + arg1 = tf.math.rsqrt(step) + arg2 = step * (self.warmup_steps ** -1.5) + + return tf.math.rsqrt(self.d_model) * tf.math.minimum(arg1, arg2) + +def masked_loss(real, pred, 
loss_object): + mask = tf.math.logical_not(tf.math.equal(real, 0)) + loss_ = loss_object(real, pred) + + mask = tf.cast(mask, dtype=loss_.dtype) + loss_ *= mask + + return tf.reduce_sum(loss_)/tf.reduce_sum(mask) + +#@tf.function +def train_step(inp, tar, model, loss_object, optimizer, train_loss): + """ + One training step for the transformer + Arguments: + inp (tf.Tensor): Input data to summarize + tar (tf.Tensor): Target (summary) + model (tf.keras.Model): The transformer being trained + loss_object (tf.keras.losses.Loss): Loss function used inside masked_loss + optimizer (tf.keras.optimizers.Optimizer): Optimizer that applies the gradients + train_loss (tf.keras.metrics.Mean): Running mean of the training loss + Returns: + None + """ + tar_inp = tar[:, :-1] + tar_real = tar[:, 1:] + + # Create masks + enc_padding_mask = create_padding_mask(inp) + look_ahead_mask = create_look_ahead_mask(tf.shape(tar_inp)[1]) + + with tf.GradientTape() as tape: + predictions, _ = model( + inp, + tar_inp, + True, + enc_padding_mask, + look_ahead_mask, + enc_padding_mask + ) + loss = masked_loss(tar_real, predictions, loss_object) + + gradients = tape.gradient(loss, model.trainable_variables) + optimizer.apply_gradients(zip(gradients, model.trainable_variables)) + + train_loss(loss) + +#@tf.function +def create_padding_mask(decoder_token_ids): + """ + Creates a matrix mask for the padding cells + + Arguments: + decoder_token_ids (matrix like): matrix of size (n, m) + + Returns: + mask (tf.Tensor): binary tensor of size (n, 1, m) + """ + seq = 1 - tf.cast(tf.math.equal(decoder_token_ids, 0), tf.float32) + + # add extra dimensions to add the padding to the attention logits. + # this will allow for broadcasting later when comparing sequences + return seq[:, tf.newaxis, :] + +#@tf.function +def create_look_ahead_mask(sequence_length): + """ + Returns a lower triangular matrix filled with ones + + Arguments: + sequence_length (int): matrix size + + Returns: + mask (tf.Tensor): binary tensor of size (1, sequence_length, sequence_length) + """ + mask = tf.linalg.band_part(tf.ones((1, sequence_length, sequence_length)), -1, 0) + return mask + +def next_word(encoder_input, output, model): + """ + Helper function that uses the model to predict just the next word. 
+ Arguments: + encoder_input (tf.Tensor): Input question + output (tf.Tensor): Current state of the answer + Returns: + predicted_id (tf.Tensor): The id of the predicted word + """ + # Create a padding mask for the input + enc_padding_mask = create_padding_mask(encoder_input) + # Create a look-ahead mask for the output + look_ahead_mask = create_look_ahead_mask(tf.shape(output)[1]) + # Run the prediction of the next word with the transformer model + predictions, attention_weights = model( + encoder_input, + output, + False, + enc_padding_mask, + look_ahead_mask, + enc_padding_mask + ) + + predictions = predictions[: ,-1:, :] + predicted_id = tf.cast(tf.argmax(predictions, axis=-1), tf.int32) + + return predicted_id + + +# - + + diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/utils.py b/NLP with Attention Models/QA/QA_T5/Files/tf/utils.py new file mode 100644 index 0000000000000000000000000000000000000000..91c6b862b1a24de02289ef17bd4b04266bcf0405 --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/utils.py @@ -0,0 +1,51 @@ +import tensorflow as tf +import numpy as np + +def positional_encoding(length, depth): + depth = depth/2 + + positions = np.arange(length)[:, np.newaxis] # (seq, 1) + depths = np.arange(depth)[np.newaxis, :]/depth # (1, depth) + + angle_rates = 1 / (10000**depths) # (1, depth) + angle_rads = positions * angle_rates # (pos, depth) + + pos_encoding = np.concatenate( + [np.sin(angle_rads), np.cos(angle_rads)], + axis=-1) + + return tf.cast(pos_encoding, dtype=tf.float32) + +class PositionalEmbedding(tf.keras.layers.Layer): + def __init__(self, vocab_size, d_model, max_length=2048): + super().__init__() + self.vocab_size = vocab_size + self.d_model = d_model + self.max_length = max_length + self.embedding = tf.keras.layers.Embedding(self.vocab_size, self.d_model, mask_zero=True) + self.pos_encoding = positional_encoding(length=self.max_length, depth=self.d_model) + + def compute_mask(self, *args, **kwargs): + return 
self.embedding.compute_mask(*args, **kwargs) + + def call(self, x): + length = tf.shape(x)[1] + x = self.embedding(x) + # This factor sets the relative scale of the embedding and positional_encoding. + x *= tf.math.sqrt(tf.cast(self.d_model, tf.float32)) + x = x + self.pos_encoding[tf.newaxis, :length, :] + return x + +# Dummy encoder block. Will be replaced by learner implementation +class EncoderBlock(tf.keras.Model): + def __init__(self, d_model, d_ff, n_heads, dropout, ff_activation='relu'): + super().__init__() + self.d_model = d_model + self.d_ff = d_ff + self.n_heads = n_heads + self.dropout = dropout + self.ff_activation = ff_activation + + def call(self, inputs, states=None, return_state=False, training=False): + x = inputs + return x \ No newline at end of file diff --git a/NLP with Attention Models/QA/QA_T5/Files/tf/w3_unittest.py b/NLP with Attention Models/QA/QA_T5/Files/tf/w3_unittest.py new file mode 100644 index 0000000000000000000000000000000000000000..7fe546f6371d26e3f0116756aca0b8da9361f88d --- /dev/null +++ b/NLP with Attention Models/QA/QA_T5/Files/tf/w3_unittest.py @@ -0,0 +1,355 @@ +import sys +import itertools +import numpy as np +import traceback +import test_utils +from utils import EncoderBlock + +import tensorflow as tf +from tensorflow.keras.layers import MultiHeadAttention, ReLU, Attention, LayerNormalization, Input +import tensorflow_text as tf_text + +from dlai_grader.grading import test_case, object_to_grade +from types import ModuleType, FunctionType +import transformer_utils + +def testing_rnd(): + def dummy_generator(): + vals = np.linspace(0, 1, 10) + cyclic_vals = itertools.cycle(vals) + for _ in range(100): + yield next(cyclic_vals) + + dumr = itertools.cycle(dummy_generator()) + + def dummy_randomizer(): + return next(dumr) + + return dummy_randomizer + +f = open("./models/sentencepiece.model", "rb") +tokenizer = tf_text.SentencepieceTokenizer(f.read(), out_type=tf.int32) + +# + +def test_tokenize_and_mask(target): + + t 
= test_case() + if not isinstance(target, FunctionType): + t.failed = True + t.msg = "target has incorrect type" + t.want = FunctionType + t.got = type(target) + return [t] + + text1 = b"Beginners BBQ Class Taking Place in Missoula!" + text2 = b"Foil plaid lycra and spandex shortall with metallic slinky insets." + text3 = 'Beginners BBQ Class Taking Place in Missoula!\nDo you want to get better at making delicious BBQ? You will have the opportunity, put this on your calendar now. Thursday, September 22nd join World Class BBQ Champion, Tony Balay from Lonestar Smoke Rangers.' + + + test_cases = [{"text": text1, "noise": 0, + "expected_output":([12847, 277, 15068, 4501, 3, 12297, 3399, 16, 5964, 7115, 9, 55], [1])}, + {"text": text1, "noise": 0.2, + "expected_output": ([31999, 15068, 4501, 3, 12297, 3399, 16, 5964, 7115, 31998], [31999, 12847, 277, 31998, 9, 55, 1])}, + {"text": text1, "noise": 0.5, + "expected_output": ([31999, 12297, 3399, 16, 5964, 7115, 31998], [31999, 12847, 277, 15068, 4501, 3, 31998, 9, 55, 1])}, + {"text": text2, "noise": 0.1, + "expected_output": ([31999, 173, 30772, 3, 120, 2935, 11, 8438, 26, 994, 31998, 1748, 28, 18813, 3, 7, 4907, 63, 16, 2244, 31997, 5], [31999, 4452, 31998, 710, 31997, 7, 1])}, + {"text": text2, "noise": 0.2, + "expected_output": ([31999, 30772, 3, 120, 2935, 11, 8438, 26, 994, 31998, 28, 18813, 3, 7, 4907, 63, 16, 2244, 31997], [31999, 4452, 173, 31998, 710, 1748, 31997, 7, 5, 1])}, + {"text": text2, "noise": 1.0, + "expected_output": ([31999, 994, 31998, 2244, 31997], [31999, 4452, 173, 30772, 3, 120, 2935, 11, 8438, 26, 31998, 710, 1748, 28, 18813, 3, 7, 4907, 63, 16, 31997, 7, 5, 1])}, + {"text": text3, "noise": 0.15, + "expected_output": ([31999, 15068, 4501, 3, 12297, 3399, 16, 5964, 7115, 31998, 531, 25, 241, 12, 129, 394, 44, 492, 31997, 58, 148, 56, 43, 8, 1004, 6, 474, 31996, 39, 4793, 230, 5, 2721, 6, 1600, 1630, 31995, 1150, 4501, 15068, 16127, 6, 9137, 2659, 5595, 31994, 782, 3624, 14627, 15, 12612, 277, 
5], [31999, 12847, 277, 31998, 9, 55, 31997, 3326, 15068, 31996, 48, 30, 31995, 727, 1715, 31994, 45, 301, 1])} + ] + cases = [] + + for case in test_cases: + output = target(case.get('text'), + noise=case.get('noise'), + randomizer=testing_rnd(), + tokenizer=tokenizer) + t = test_case() #inps, targs + if not isinstance(output[0], list): + t.failed = True + t.msg = "Wrong type. inps expected to be a list" + t.want = list + t.got = type(output[0]) + cases.append(t) + + t = test_case() + if not isinstance(output[1], list): + t.failed = True + t.msg = "Wrong type. targs expected to be a list" + t.want = list + t.got = type(output[1]) + cases.append(t) + + t = test_case() + if len(case.get('expected_output')[0]) != len(output[0]): + t.failed = True + t.msg = "Wrong length for inps" + t.want = len(case.get('expected_output')[0]) + t.got = len(output[0]) + cases.append(t) + + t = test_case() + if len(case.get('expected_output')[1]) != len(output[1]): + t.failed = True + t.msg = "Wrong length for targs" + t.want = len(case.get('expected_output')[1]) + t.got = len(output[1]) + cases.append(t) + + t = test_case() + if len(output[0])>0 and not (isinstance(output[0][0], (int, np.int32, np.int64, type(tf.constant(1.0))))): + t.failed = True + t.msg = "Wrong type. inps expected to be an int" + t.want = type(1) + t.got = type(output[0][0]) + cases.append(t) + + t = test_case() + if len(output[1])>0 and not (isinstance(output[1][0], (int, np.int32, np.int64, type(tf.constant(1.0))))): + t.failed = True + t.msg = "Wrong type. 
targs expected to be an int" + t.want = type(1) + t.got = type(output[1][0]) + cases.append(t) + + t = test_case() + if not np.array_equal(output[0], case.get('expected_output')[0]): + t.failed = True + t.msg = f"Wrong values for inps for input: {case.get('text')}" + t.want = case.get('expected_output')[0] + t.got = output[0] + cases.append(t) + + t = test_case() + if not np.array_equal(output[1], case.get('expected_output')[1]): + t.failed = True + t.msg = f"Wrong values for targs for input: {case.get('text')}" + t.want = case.get('expected_output')[1] + t.got = output[1] + cases.append(t) + + for i in range(len(cases)): + if cases[i].failed: + print(f"\033[91mTest case {i} failed\n") + print(cases[i]) + return + + print("\033[92m All tests passed") + +def test_parse_squad(target): + t = test_case() + if not isinstance(target, FunctionType): + t.failed = True + t.msg = "target has incorrect type" + t.want = FunctionType + t.got = type(target) + return [t] + dataset1 = [{"title": "t1", "paragraphs": [ + {"context": "very long context one", + "qas": [{ "question": "question is abc?", + "id": "1", + "answers": [ + { + "text": "here is abc", + "answer_start": 8 + }, + { + "text": "abc here abc", + "answer_start": 0 + } + ], + "is_impossible": False}, + { "question": "unanswerable question?", + "id": "2", + "answers": [ + { + "text": "what?", + "answer_start": 9 + } + ], + "is_impossible": True}, + { "question": "question is xyz?", + "id": "3", + "answers": [ + { + "text": "here is xyz", + "answer_start": 9 + } + ], + "is_impossible": False} + ]}]}] + + pairs = target(dataset1) + expected_pairs1 = (['question: question is abc? context: very long context one', + 'question: question is xyz? 
context: very long context one'], + ['answer: here is abc', 'answer: here is xyz']) + cases = [] + + t = test_case() + if not isinstance(pairs[0], list): + t.failed = True + t.msg = "Wrong type for returned inputs" + t.want = list + t.got = type(pairs[0]) + cases.append(t) + + t = test_case() + if not isinstance(pairs[1], list): + t.failed = True + t.msg = "Wrong type for returned outputs" + t.want = list + t.got = type(pairs[1]) + cases.append(t) + + t = test_case() #inps, targs + if len(pairs[0]) != 2: + t.failed = True + t.msg = "Wrong length for returned inputs" + t.want = 2 + t.got = len(pairs[0]) + cases.append(t) + + t = test_case() #inps, targs + if len(pairs[1]) != 2: + t.failed = True + t.msg = "Wrong length for returned outputs" + t.want = 2 + t.got = len(pairs[1]) + cases.append(t) + + t = test_case() #inps, targs + if not(pairs[0][0] == expected_pairs1[0][0]): + t.failed = True + t.msg = "Wrong input 0" + t.want = expected_pairs1[0][0] + t.got = pairs[0][0] + cases.append(t) + + t = test_case() #inps, targs + if not(pairs[0][1] == expected_pairs1[0][1]): + t.failed = True + t.msg = "Wrong input 1" + t.want = expected_pairs1[0][1] + t.got = pairs[0][1] + cases.append(t) + + t = test_case() #inps, targs + if not(pairs[1][0] == expected_pairs1[1][0]): + t.failed = True + t.msg = "Wrong output 0" + t.want = expected_pairs1[1][0] + t.got = pairs[1][0] + cases.append(t) + + t = test_case() #inps, targs + if not(pairs[1][1] == expected_pairs1[1][1]): + t.failed = True + t.msg = "Wrong output 1" + t.want = expected_pairs1[1][1] + t.got = pairs[1][1] + cases.append(t) + + for i in range(len(cases)): + if cases[i].failed: + print(f"\033[91mTest case {i} failed\n") + print(cases[i]) + return + + print("\033[92m All tests passed") + +def test_answer_question(target): + + t = test_case() + if not isinstance(target, FunctionType): + t.failed = True + t.msg = "target has incorrect type" + t.want = FunctionType + t.got = type(target) + return [t] + + # Define the model parameters + num_layers = 2 + embedding_dim = 128 + fully_connected_dim = 128 + num_heads = 2 + 
positional_encoding_length = 256 + + encoder_vocab_size = int(tokenizer.vocab_size()) + decoder_vocab_size = encoder_vocab_size + + # Initialize the model + modelx = transformer_utils.Transformer( + num_layers, + embedding_dim, + num_heads, + fully_connected_dim, + encoder_vocab_size, + decoder_vocab_size, + positional_encoding_length, + positional_encoding_length, + ) + + if False: + print("Not all tests were performed due to missing files. Don't worry, this has no impact on the assignment and we are working to fix it.") + else: + modelx.load_weights('./pretrained_models/model_qa3') + + question = "question: How many are this? context: This is five." + result = tokenizer.detokenize(target(question, modelx, tokenizer)).numpy()[0].decode() + cases = [] + + t = test_case() #inps, targs + if not ("answer:" in result): + t.failed = True + t.msg = "Wrong preamble" + t.want = "answer:" + t.got = result + cases.append(t) + + t = test_case() + if not ("five" in result): + t.failed = True + t.msg = "Wrong answer" + t.want = "five" + t.got = result + cases.append(t) + + + question = "question: When did that happen? context: That happen on August 17, 1715" + result = tokenizer.detokenize(target(question, modelx, tokenizer)).numpy()[0].decode() + t = test_case() #inps, targs + if not ("answer:" in result): + t.failed = True + t.msg = "Wrong preamble" + t.want = "answer:" + t.got = result + cases.append(t) + + t = test_case() + if not ("August" in result): + t.failed = True + t.msg = "Wrong answer" + t.want = "August" + t.got = result + cases.append(t) + + question = "question: Who is the king? 
context: In this country the king is Charles from here in advance" + result = tokenizer.detokenize(target(question, modelx, tokenizer)).numpy()[0].decode() + + t = test_case() + if not ("answer:" in result): + t.failed = True + t.msg = "Wrong preamble" + t.want = "answer:" + t.got = result + cases.append(t) + + t = test_case() + if not ("Charles V" in result): + t.failed = True + t.msg = "Wrong answer" + t.want = "Charles V" + t.got = result + cases.append(t) + + for i in range(len(cases)): + if cases[i].failed: + print(f"\033[91mTest case {i} failed\n") + print(cases[i]) + return + + print("\033[92m All tests passed") diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/.ipynb_checkpoints/C4W2_Attention-checkpoint.ipynb b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/.ipynb_checkpoints/C4W2_Attention-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..76d71c006950f83941755ec93c649ab3190db9b0 --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/.ipynb_checkpoints/C4W2_Attention-checkpoint.ipynb @@ -0,0 +1,232 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# The Three Ways of Attention and Dot Product Attention: Ungraded Lab Notebook\n", + "\n", + "In this notebook you'll explore the three ways of attention (encoder-decoder attention, causal attention, and bi-directional self-attention) and how to implement the latter two with dot product attention. \n", + "\n", + "## Background\n", + "\n", + "As you learned last week, **attention models** constitute powerful tools in the NLP practitioner's toolkit. Like LSTMs, they learn which words are most important to phrases, sentences, paragraphs, and so on. Moreover, they mitigate the vanishing gradient problem even better than LSTMs. 
You've already seen how to combine attention with LSTMs to build **encoder-decoder models** for applications such as machine translation. \n", + "\n", + "\n", + "\n", + "This week, you'll see how to integrate attention into **transformers**. Because transformers do not process one token at a time, they are much easier to parallelize and accelerate. Beyond text summarization, applications of transformers include: \n", + "* Machine translation\n", + "* Auto-completion\n", + "* Named Entity Recognition\n", + "* Chatbots\n", + "* Question-Answering\n", + "* And more!\n", + "\n", + "Along with embedding, positional encoding, dense layers, and residual connections, attention is a crucial component of transformers. At the heart of any attention scheme used in a transformer is **dot product attention**, of which the figures below display a simplified picture:\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "With basic dot product attention, you capture the interactions between every word (embedding) in your query and every word in your key. If the queries and keys belong to the same sentences, this constitutes **bi-directional self-attention**. In some situations, however, it's more appropriate to consider only words which have come before the current one. Such cases, particularly when the queries and keys come from the same sentences, fall into the category of **causal attention**. \n", + "\n", + "\n", + "\n", + "For causal attention, you add a **mask** to the argument of our softmax function, as illustrated below: \n", + "\n", + "\n", + "\n", + "\n", + "\n", + "Now let's see how to implement the attention mechanism." 
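Before the TensorFlow implementation that follows, the computation just described (scores from QK^T, scaled by sqrt(d), an optional additive mask, softmax, then a weighted sum of V) can be previewed in a tiny framework-free sketch. This is our own illustration, not part of the lab: the helper name `sdpa` and the toy tensors are assumptions, and the causal branch reuses the additive `-1e9` masking trick described above.

```python
import numpy as np

def sdpa(q, k, v, causal=False):
    """Toy scaled dot-product attention: softmax(QK^T / sqrt(d) + M) V."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)              # (seq_q, seq_k)
    if causal:
        # zero out (via -1e9) every position after the current one
        mask = np.tril(np.ones_like(scores))
        scores = scores + (1.0 - mask) * -1e9
    # numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v                          # (seq_q, depth_v)

q = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
k = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
v = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0]])

print(sdpa(q, k, v))               # bi-directional self-attention
print(sdpa(q, k, v, causal=True))  # causal: row 0 attends only to position 0
```

With the causal mask, the first query can only attend to the first key, so its output is exactly the first value row; the bi-directional version mixes both value rows.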
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Imports" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "\n", + "import sys\n", + "\n", + "import tensorflow as tf\n", + "\n", + "import textwrap\n", + "wrapper = textwrap.TextWrapper(width=70)\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here is a helper function that will help you display useful information:\n", + "\n", + "* `display_tensor()` prints out the shape and the actual tensor." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def display_tensor(t, name):\n", + " \"\"\"Display shape and tensor\"\"\"\n", + " print(f'{name} shape: {t.shape}\\n')\n", + " print(f'{t}\\n')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Create some tensors and display their shapes. Feel free to experiment with your own tensors. Keep in mind, though, that the query, key, and value arrays must all have the same embedding dimensions (number of columns), and the mask array must have the same shape as `tf.matmul(query, key_transposed)`. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "q = tf.constant([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])\n", + "display_tensor(q, 'query')\n", + "k = tf.constant([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])\n", + "display_tensor(k, 'key')\n", + "v = tf.constant([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0]])\n", + "display_tensor(v, 'value')\n", + "m = tf.constant([[1.0, 0.0], [1.0, 1.0]])\n", + "display_tensor(m, 'mask')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Dot product attention\n", + "\n", + "Here you compute \n", + "$\\textrm{softmax} \\left(\\frac{Q K^T}{\\sqrt{d}} + M \\right) V$, where the (optional, but default) scaling factor $\\sqrt{d}$ is the square root of the embedding dimension." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def dot_product_attention(q, k, v, mask, scale=True):\n", + " \"\"\"\n", + " Calculate the attention weights.\n", + " q, k, v must have matching leading dimensions.\n", + " k, v must have matching penultimate dimension, i.e.: seq_len_k = seq_len_v.\n", + " The mask has different shapes depending on its type(padding or look ahead) \n", + " but it must be broadcastable for addition.\n", + "\n", + " Arguments:\n", + " q (tf.Tensor): query of shape (..., seq_len_q, depth)\n", + " k (tf.Tensor): key of shape (..., seq_len_k, depth)\n", + " v (tf.Tensor): value of shape (..., seq_len_v, depth_v)\n", + " mask (tf.Tensor): mask with shape broadcastable \n", + " to (..., seq_len_q, seq_len_k). Defaults to None.\n", + " scale (boolean): if True, the result is a scaled dot-product attention. 
Defaults to True.\n", + "\n", + " Returns:\n", + " attention_output (tf.Tensor): the result of the attention function\n", + " \"\"\"\n", + " \n", + " # Multiply q and k transposed.\n", + " matmul_qk = tf.matmul(q, k, transpose_b=True) # (..., seq_len_q, seq_len_k)\n", + "\n", + " # scale matmul_qk with the square root of dk\n", + " if scale:\n", + " dk = tf.cast(tf.shape(k)[-1], tf.float32)\n", + " matmul_qk = matmul_qk / tf.math.sqrt(dk)\n", + " # add the mask to the scaled tensor.\n", + " if mask is not None:\n", + " matmul_qk = matmul_qk + (1. - mask) * -1e9 \n", + "\n", + " # softmax is normalized on the last axis (seq_len_k) so that the scores add up to 1.\n", + " attention_weights = tf.keras.activations.softmax(matmul_qk)\n", + "\n", + " # Multiply the attention weights by v\n", + " attention_output = tf.matmul(attention_weights, v) # (..., seq_len_q, depth_v)\n", + "\n", + " return attention_output" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, you implement the *masked* dot product self-attention (at the heart of causal attention) as a special case of dot product attention" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def causal_dot_product_attention(q, k, v, scale=True):\n", + " \"\"\" Masked dot product self attention.\n", + " Args:\n", + " q (numpy.ndarray): queries.\n", + " k (numpy.ndarray): keys.\n", + " v (numpy.ndarray): values.\n", + " Returns:\n", + " numpy.ndarray: masked dot product self attention tensor.\n", + " \"\"\"\n", + " \n", + " # Size of the penultimate dimension of the query\n", + " mask_size = q.shape[-2]\n", + "\n", + " # Creates a matrix with ones below the diagonal and 0s above. 
It should have shape (mask_size, mask_size) and will broadcast across any leading batch dimensions\n", + " mask = tf.experimental.numpy.tril(tf.ones((mask_size, mask_size))) \n", + " \n", + " return dot_product_attention(q, k, v, mask, scale=scale)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "result = causal_dot_product_attention(q, k, v)\n", + "display_tensor(result, 'result')" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/.ipynb_checkpoints/C4W2_Masking-checkpoint.ipynb b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/.ipynb_checkpoints/C4W2_Masking-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..bfeb87dff8b095828462383fdc4d04f5a03317b5 --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/.ipynb_checkpoints/C4W2_Masking-checkpoint.ipynb @@ -0,0 +1,212 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "c2b5b44f", + "metadata": {}, + "source": [ + "# Masking\n", + "\n", + "In this lab, you will implement masking, which is one of the essential building blocks of the transformer. You will see how to define the masks and test how they work. You will use the masks later in the programming assignment." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ff0def87", + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "id": "9c0b0358", + "metadata": {}, + "source": [ + "## 1 - Masking\n", + "\n", + "There are two types of masks that are useful when building your Transformer network: the *padding mask* and the *look-ahead mask*. Both help the softmax computation give the appropriate weights to the words in your input sentence. \n", + "\n", + "### 1.1 - Padding Mask\n", + "\n", + "Oftentimes your input sequence will exceed the maximum length of a sequence your network can process. Let's say the maximum length of your model is five and it is fed the following sequences:\n", + "\n", + "    [[\"Do\", \"you\", \"know\", \"when\", \"Jane\", \"is\", \"going\", \"to\", \"visit\", \"Africa\"], \n", + "     [\"Jane\", \"visits\", \"Africa\", \"in\", \"September\" ],\n", + "     [\"Exciting\", \"!\"]\n", + "    ]\n", + "\n", + "which might get vectorized as:\n", + "\n", + "    [[ 71, 121, 4, 56, 99, 2344, 345, 1284, 15],\n", + "     [ 56, 1285, 15, 181, 545],\n", + "     [ 87, 600]\n", + "    ]\n", + "    \n", + "When passing sequences into a transformer model, it is important that they are of uniform length. You can achieve this by padding the sequence with zeros, and truncating sentences that exceed the maximum length of your model:\n", + "\n", + "    [[ 71, 121, 4, 56, 99],\n", + "     [ 2344, 345, 1284, 15, 0],\n", + "     [ 56, 1285, 15, 181, 545],\n", + "     [ 87, 600, 0, 0, 0],\n", + "    ]\n", + "    \n", + "Sequences longer than the maximum length of five will be truncated, and zeros will be added to the truncated sequence to achieve uniform length. 
Similarly, for sequences shorter than the maximum length, zeros will also be added for padding.\n", + "\n", + "When passing these vectors through the attention layers, the zeros will typically disappear (you will get completely new vectors given the mathematical operations that happen in the attention block). However, you still want the network to attend only to the first few numbers in that vector (given by the sentence length); this is where a padding mask comes in handy. You will need to define a boolean mask that specifies to which elements you must attend (1) and which elements you must ignore (0); you do this by looking at all the zeros in the sequence. Then you use the mask to set the values of the vectors (corresponding to the zeros in the initial vector) close to negative infinity (-1e9).\n", + "\n", + "Imagine your input vector is `[87, 600, 0, 0, 0]`. This would give you a mask of `[1, 1, 0, 0, 0]`. When your vector passes through the attention mechanism, you get another (random-looking) vector, let's say `[1, 2, 3, 4, 5]`, which after masking becomes `[1, 2, -1e9, -1e9, -1e9]`, so that when you take the softmax, the last three elements (where there were zeros in the input) don't affect the score.\n", + "\n", + "The [MultiheadAttention](https://keras.io/api/layers/attention_layers/multi_head_attention/) layer implemented in Keras uses this masking logic.\n", + "\n", + "**Note:** The function below only creates the mask of an _already padded sequence_."
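The worked example above can be sketched in plain NumPy (the notebook itself works in TensorFlow; the logit values here are made up, as in the text):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

ids = np.array([87, 600, 0, 0, 0])       # padded input vector
mask = (ids != 0).astype(np.float32)     # [1, 1, 0, 0, 0]

logits = np.array([1., 2., 3., 4., 5.])  # made-up attention scores
masked = logits + (1 - mask) * -1e9      # [1, 2, -1e9, -1e9, -1e9]

weights = softmax(masked)
# The padded positions receive (numerically) zero softmax weight.
```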
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b4be6e26", + "metadata": {}, + "outputs": [], + "source": [ + "def create_padding_mask(decoder_token_ids):\n", + " \"\"\"\n", + " Creates a matrix mask for the padding cells\n", + " \n", + " Arguments:\n", + " decoder_token_ids (matrix like): matrix of size (n, m)\n", + " \n", + " Returns:\n", + " mask (tf.Tensor): binary tensor of size (n, 1, m)\n", + " \"\"\" \n", + " seq = 1 - tf.cast(tf.math.equal(decoder_token_ids, 0), tf.float32)\n", + " \n", + " # add extra dimensions to add the padding\n", + " # to the attention logits. \n", + " # this will allow for broadcasting later when comparing sequences\n", + " return seq[:, tf.newaxis, :] " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "13a484c2", + "metadata": {}, + "outputs": [], + "source": [ + "x = tf.constant([[7., 6., 0., 0., 0.], [1., 2., 3., 0., 0.], [3., 0., 0., 0., 0.]])\n", + "print(create_padding_mask(x))" + ] + }, + { + "cell_type": "markdown", + "id": "ce1d7106", + "metadata": {}, + "source": [ + "If you multiply (1 - mask) by -1e9 and add it to the sample input sequences, the zeros are essentially set to negative infinity. 
Notice the difference when taking the softmax of the original sequence and the masked sequence:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ba9a0bdd", + "metadata": {}, + "outputs": [], + "source": [ + "# Create the mask for x\n", + "mask = create_padding_mask(x)\n", + "\n", + "# Extend the dimension of x to match the dimension of the mask\n", + "x_extended = x[:, tf.newaxis, :]\n", + "\n", + "print(\"Softmax of non-masked vectors:\\n\")\n", + "print(tf.keras.activations.softmax(x_extended))\n", + "\n", + "print(\"\\nSoftmax of masked vectors:\\n\")\n", + "print(tf.keras.activations.softmax(x_extended + (1 - mask) * -1.0e9))" + ] + }, + { + "cell_type": "markdown", + "id": "da92b367", + "metadata": {}, + "source": [ + "### 1.2 - Look-ahead Mask\n", + "\n", + "The look-ahead mask follows similar intuition. In training, you will have access to the complete correct output of your training example. The look-ahead mask helps your model pretend that it correctly predicted a part of the output and see if, *without looking ahead*, it can correctly predict the next output. \n", + "\n", + "For example, if the expected correct output is `[1, 2, 3]` and you wanted to see if given that the model correctly predicted the first value it could predict the second value, you would mask out the second and third values. So you would input the masked sequence `[1, -1e9, -1e9]` and see if it could generate `[1, 2, -1e9]`.\n", + "\n", + "Just because you've worked so hard, we'll also implement this mask for you 😇😇. Again, take a close look at the code so you can effectively implement it later." 
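The same idea can be checked numerically before looking at the TensorFlow version: add a large negative number above the diagonal of the score matrix so each query ignores all later positions. A minimal NumPy sketch (the scores here are random placeholders):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

scores = np.random.randn(4, 4)         # raw attention scores, one row per query
look_ahead = np.tril(np.ones((4, 4)))  # 1 on/below the diagonal, 0 above

weights = softmax(scores + (1 - look_ahead) * -1e9)
# Every entry above the diagonal is (numerically) zero:
# query t attends only to positions 0..t.
```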
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "843dd37e", + "metadata": {}, + "outputs": [], + "source": [ + "def create_look_ahead_mask(sequence_length):\n", + " \"\"\"\n", + " Returns a lower triangular matrix filled with ones\n", + " \n", + " Arguments:\n", + " sequence_length (int): matrix size\n", + " \n", + " Returns:\n", + " mask (tf.Tensor): binary tensor of size (1, sequence_length, sequence_length)\n", + " \"\"\"\n", + " mask = tf.linalg.band_part(tf.ones((1, sequence_length, sequence_length)), -1, 0)\n", + " return mask " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "393f4398", + "metadata": {}, + "outputs": [], + "source": [ + "x = tf.random.uniform((1, 3))\n", + "temp = create_look_ahead_mask(x.shape[1])\n", + "temp" + ] + }, + { + "cell_type": "markdown", + "id": "50e1114d", + "metadata": {}, + "source": [ + "**Congratulations on finishing this Lab!** Now you should have a better understanding of masking in the transformer, and this will surely help you with this week's assignment!\n", + "\n", + "**Keep it up!**" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/.ipynb_checkpoints/C4W2_Positional_Encoding-checkpoint.ipynb b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/.ipynb_checkpoints/C4W2_Positional_Encoding-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..a03885c1273632dee6a6368c8ad8616cf9b4bd2d --- /dev/null +++ b/NLP with Attention
Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/.ipynb_checkpoints/C4W2_Positional_Encoding-checkpoint.ipynb @@ -0,0 +1,221 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "a9479dda", + "metadata": {}, + "source": [ + "\n", + "# Positional Encoding\n", + "\n", + "In this lab, you will learn how to implement the positional encoding of words in the transformer." + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "f97b2311", + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "import numpy as np\n", + "import matplotlib.pyplot as plt\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "id": "14ea5651", + "metadata": {}, + "source": [ + "## 1. Positional Encoding\n", + "\n", + "In sequence to sequence tasks, the relative order of your data is extremely important to its meaning. When you were training sequential neural networks such as RNNs, you fed your inputs into the network in order. Information about the order of your data was automatically fed into your model. However, when you train a Transformer network using multi-head attention, you feed your data into the model all at once. While this dramatically reduces training time, there is no information about the order of your data. This is where positional encoding is useful - you can specifically encode the positions of your inputs and pass them into the network using these sine and cosine formulas:\n", + " \n", + "$$\n", + "PE_{(pos, 2i)}= sin\\left(\\frac{pos}{{10000}^{\\frac{2i}{d}}}\\right)\n", + "\\tag{1}$$\n", + "
\n", + "$$\n", + "PE_{(pos, 2i+1)}= cos\\left(\\frac{pos}{{10000}^{\\frac{2i}{d}}}\\right)\n", + "\\tag{2}$$\n", + "\n", + "* $d$ is the dimension of the word embedding and positional encoding\n", + "* $pos$ is the position of the word.\n", + "* $k$ refers to each of the different dimensions in the positional encodings, with $i$ equal to $k$ $//$ $2$.\n", + "\n", + "To develop some intuition about positional encodings, you can think of them broadly as a feature that contains the information about the relative positions of words. The sum of the positional encoding and word embedding is ultimately what is fed into the model. If you just hard code the positions in, say by adding a matrix of 1's or whole numbers to the word embedding, the semantic meaning is distorted. Conversely, the values of the sine and cosine equations are small enough (between -1 and 1) that when you add the positional encoding to a word embedding, the word embedding is not significantly distorted, and is instead enriched with positional information. Using a combination of these two equations helps your Transformer network attend to the relative positions of your input data.\n", + "\n", + "### 1.1 - Sine and Cosine Angles\n", + "\n", + "Notice that even though the sine and cosine positional encoding equations take in different arguments (`2i` versus `2i+1`, or even versus odd numbers) the inner terms for both equations are the same: $$\\theta(pos, i, d) = \\frac{pos}{10000^{\\frac{2i}{d}}} \\tag{3}$$\n", + "\n", + "Consider the inner term as you calculate the positional encoding for a word in a sequence.
\n", + "$PE_{(pos, 0)}= sin\\left(\\frac{pos}{{10000}^{\\frac{0}{d}}}\\right)$, since solving `2i = 0` gives `i = 0`
\n", + "$PE_{(pos, 1)}= cos\\left(\\frac{pos}{{10000}^{\\frac{0}{d}}}\\right)$, since solving `2i + 1 = 1` gives `i = 0`\n", + "\n", + "The angle is the same for both! The angles for $PE_{(pos, 2)}$ and $PE_{(pos, 3)}$ are the same as well, since for both, `i = 1` and therefore the inner term is $\\left(\\frac{pos}{{10000}^{\\frac{2}{d}}}\\right)$. This relationship holds true for all paired sine and cosine curves:\n", + "\n", + "| k | 0 | 1 | 2 | 3 | ... | d - 2 | d - 1 | \n", + "| ---------------- | :------: | ----------------- | ----------------- | ----------------- | ----- | ----------------- | ----------------- |\n", + "| encoding(0) = |[$sin(\\theta(0, 0, d))$| $cos(\\theta(0, 0, d))$| $sin(\\theta(0, 1, d))$| $cos(\\theta(0, 1, d))$|... |$sin(\\theta(0, d//2, d))$| $cos(\\theta(0, d//2, d))$]|\n", + "| encoding(1) = | [$sin(\\theta(1, 0, d))$| $cos(\\theta(1, 0, d))$| $sin(\\theta(1, 1, d))$| $cos(\\theta(1, 1, d))$|... |$sin(\\theta(1, d//2, d))$| $cos(\\theta(1, d//2, d))$]|\n", + "...\n", + "| encoding(pos) = | [$sin(\\theta(pos, 0, d))$| $cos(\\theta(pos, 0, d))$| $sin(\\theta(pos, 1, d))$| $cos(\\theta(pos, 1, d))$|... 
|$sin(\\theta(pos, d//2, d))$| $cos(\\theta(pos, d//2, d))]$|" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "f7c6a09e", + "metadata": {}, + "outputs": [], + "source": [ + "def get_angles(position, k, d_model):\n", + " \"\"\"\n", + " Computes a positional encoding for a word \n", + " \n", + " Arguments:\n", + " position (int): position of the word\n", + " k (int): refers to each of the different dimensions in the positional encodings, with i equal to k//2\n", + " d_model(int): the dimension of the word embedding and positional encoding\n", + " \n", + " Returns:\n", + " _ (float): positional embedding value for the word\n", + " \"\"\"\n", + " i = k // 2\n", + " angle_rates = 1 / np.power(10000, (2 * i) / np.float32(d_model))\n", + " return position * angle_rates" + ] + }, + { + "cell_type": "markdown", + "id": "6107ee72", + "metadata": {}, + "source": [ + "### 1.2 - Sine and Cosine Positional Encodings\n", + "\n", + "Now you can use the angles you computed to calculate the sine and cosine positional encodings, shown in equations (1) and (2)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "7c654219", + "metadata": {}, + "outputs": [], + "source": [ + "def positional_encoding(positions, d):\n", + " \"\"\"\n", + " Precomputes a matrix with all the positional encodings \n", + " \n", + " Arguments:\n", + " positions (int): Maximum number of positions to be encoded \n", + " d (int): Encoding size \n", + " \n", + " Returns:\n", + " pos_encoding (tf.Tensor): A matrix of shape (1, positions, d) with the positional encodings\n", + " \"\"\"\n", + " # initialize a matrix angle_rads of all the angles \n", + " angle_rads = get_angles(np.arange(positions)[:, np.newaxis],\n", + " np.arange(d)[np.newaxis, :],\n", + " d)\n", + " \n", + " # apply sin to even indices in the array; 2i\n", + " angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2])\n", + " \n", + " # apply cos to odd indices in the array; 2i+1\n", + " angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2])\n", + " \n", + " pos_encoding = angle_rads[np.newaxis, ...]\n", + " \n", + " return tf.cast(pos_encoding, dtype=tf.float32)" + ] + }, + { + "cell_type": "markdown", + "id": "537c0575", + "metadata": {}, + "source": [ + "Now you can visualize the positional encodings."
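Before plotting, a self-contained NumPy sketch (restating the functions above without the TensorFlow cast) can verify two properties from the text: the encoding has one row per position and one column per dimension, and every value lies between -1 and 1:

```python
import numpy as np

def get_angles(position, k, d_model):
    i = k // 2
    return position / np.power(10000, (2 * i) / np.float32(d_model))

def positional_encoding(positions, d):
    angle_rads = get_angles(np.arange(positions)[:, np.newaxis],
                            np.arange(d)[np.newaxis, :], d)
    angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2])  # even dimensions
    angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2])  # odd dimensions
    return angle_rads[np.newaxis, ...]

pe = positional_encoding(50, 128)
print(pe.shape)  # (1, 50, 128)
```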
+ ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "a9308263", + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjgAAAG2CAYAAAByJ/zDAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAAEAAElEQVR4nOydd3gUVd+G75nZmt30QELovYsISLOACIoNK9h97X76qogVK1iwvEpV7IqKvYHYKBawYAEBRXoNJYWE9GT7fn+c2dnZkEAKaoRzX1cuD7MzZ87MJu7sec7ze5RwOBxGIpFIJBKJ5BBC/acHIJFIJBKJRHKwkQ84EolEIpFIDjnkA45EIpFIJJJDDvmAI5FIJBKJ5JBDPuBIJBKJRCI55JAPOBKJRCKRSA455AOORCKRSCSSQw75gCORSCQSieSQQz7gSCQSiUQiOeSQDzgSiUQikUgOOeQDjkQikUgkEgCWLFnC6aefTmZmJoqiMGfOnAMes3jxYvr06YPD4aBdu3Y899xz++zz4Ycf0q1bN+x2O926dePjjz/+C0Yfi3zAkUgkEolEAkB5eTm9evXi6aefrtX+W7du5ZRTTuHYY49lxYoV3H333dx00018+OGHxj5Lly5lzJgxXHLJJaxatYpLLrmE0aNH8/PPP/9VlwGAIsM2JRKJRCKRVEVRFD7++GPOPPPMGve58847+eSTT1i7dq2x7brrrmPVqlUsXboUgDFjxlBSUsIXX3xh7HPyySeTnJzM22+//ZeN3/KX9fwvIhQKsXv3buLj41EU5Z8ejkQikUgaMeFwmNLSUjIzM1HVv04I8Xg8+Hy+BvcTDof3+Wyz2+3Y7fYG97106VJGjBgRs+2kk07i5Zdfxu/3Y7VaWbp0Kbfccss++0ydOrXB598f8gEH2L17Ny1btvynhyGRSCSSfxE7duygRYsWf0nfHo8HZ3wKBCob3Jfb7aasrCxm2wMPPMCECRMa3HdOTg7p6ekx29LT0wkEAuTn59OsWbMa98nJyWnw+feHfMAB4uPjAZi79Hdc7ngW9T0BgBZfzmN05yQA7swcyFt9TgJg97RT6DldvDH9h3RmT7EXgJ0b8kV/KXFG3xu+/ZIvXrsXgP+b+RMz1z8DQJthPXh/0gIAUm0anz/8LAA/fbMGgGOH9+DZY5zi3N3O44FnLgCg50ca7Y8+GoAJnz5MOCgUxomj7gPA7rIy7vk7RR/TrqPd/9YDsGP6qdzT/1oAPr1oAp17iz/KLrdfzYm/fi2O1b+JfHLU8Xx60QQAOvduwf89fRsAj1/yMM/nzAJg2furmXDSraK/hPkAXJvxH/bmiD+iN4teY0zcRQCc9+b93PbmjQCo/c/ijlbHA/DZqBtZ3FVc7zF/dATgokuGccXRYmz9zrqbzxM2AXBT77EMmC7O92juMpaddCoAtjgrAEe9/jy3dxDbyl94nSF3XgPAXUeMYv1l4v1oPWMH3737IAAXT/me5KYuAC4f2h6AO+57ma2v/geAjLMe549TRd/Hr+3GtKWvAPDcbVPpOV7cx8h9m3vUEJ7YuhCAyV1GctOzlwDww4Mf07RbGgDxLZNZ8cFqAE59/nqevHAqAHeu+QSA2zucytgtYjp3RruBDPxe/G6sO2Ekec+/AUDvsZfz1tjJAIx/S7zftw29g1nZLwNwjnqycT/7L01j/eXi97rNjB3snnIyAOk3fEzuB+I+pp/7FAC7Fj5F8+Fi2++fPMGRZ4vf16/emsCI/zwGwKwZ47j69ucBmHj/ZQA8+MQHXHm1uOdvvPkdw0/rB8C3X6+nS5/WAGxZk0dGmyQACvMqiHOJexoMhQDwe0O4Eu366+Wkt0oUY9p
UQPvu4n+I65ZvZ8Cx4j364as/ARhxypF8MUfcr3PHHMu7b4r34qorRvDCC58CcMtNo5g8+X0A7r7rfB6Z9BYAj9wv3p/x97/C5EeuEvuOf56Z//s/AP7v1qd5ZdrNAPznxsnMfkbcm4uu+x8A771wB6OveQKAT165i9MvfwSAL167l5GXPQzAotn3MezCCeJ+vDORIec/AMCSd8Tv33Hn388P74n2oHPvYemHoo+B59zDzx9NEu/h2Xfz68ei3e+suwFY9vEk+urt3+Y8Su8z7gBg5SdPcKTe/v2TJzhCb/8x73/0PP12AP78VIy/+2m3s+Yz0e526u2s09tdqrTXf/4kAJ1PEX//G794ko4ja9/e/OWTtD852gZof/JtMe0t88XvYLuTbq2xDbBtwVO0GVH79vYFT9Ha1AZoPeLWOrcBdiycTMvh42La4aCf4Jr3jM+OvwKfzweBSizdRoNmrX9HQT9la95jx44dJCQkGJsPxuxNhKqzQ5GVL+bt1e3zVysm8gGH6I13ueNxxSdgV8QHvdMdb/xC2FBRNBsACW4Xqk08fFidbixecRtVWwUAmj36gKNoNtzxCcZ2t0XsG++w41A0AOJUDavTLfqwin5tTjcJ8XpbUUlwil9GxWJBc4gPZpemEUb8IkW2WRxW4lTRb4LLiWKxG2OOXJdqi8PqFPvbFRWXW/yR2jXV2Kba4vTrc+HSNL1vF/F2qzHmyD2IbLM4XGh28cHltlmNPuyKSkKc2FdNiMemr21XrU7iHTa97RD7xrmJ1++Xotlw6fdLs8cZxyUkJODSxHab/npCfLRfv9ONU7+3isVOQpyj2vfCot8zp379isVOgv4/LEWzEe+wGuOM3FOLM3ofI/fNhkpCgmg7FNU4n0vVcFstxj0yvy+RPiLns6Ea121TVOIi74miYosTvxtORTPGHHlPNFuccQ5Vjd7Pqted4HZF2/Fuox25n5F2vKntjk8wfn/i3PFG2+mKN94ze1z09zYyTvPvl2qL3mfNDpo+PoLi9yREEIvDob8eNvZVbZUxfezz9xHnNtr2OHf098cVbTtc8SgWhzFmY/ym9zuuFm2X0RZjd8XHx7bN96sW7ci9r6kdX0MbIN7UV0y7hvewatvot47t/fWbUMt2pK+6tutyjr/z3LDvh/ZfgWJ1xJyzroQj/99JSIh5wDlYZGRk7DMTk5eXh8ViITU1db/7VJ3VOdjIBxwTZ143FcViZ7o+K3DChw9wlPdsAF49qR3zknsBcPkfKbz7UB8Aho25n6JF4ttX/HHiKX/dwml0PUm0jxw1msun/wDApsWf0OdD8Q28rEln1o4TNrm1pTB5VHcA7tc/VL/4+Ce+7X8uAE5NZfl0MUvS/syH2PzrctH3NUOZeatYqX7J8A4APPb013Q59wgAtn/wGSkdRB/FzftQqc/2tOzclM1/5gHQKhCiX6b48Jj8/XbxutNKeqsUALI27WVLnnhw690xjfSW3QDY+uJyynK2AtDk+E4AdLQnsW2VmHHJ31JA0gjxAZVms1C2SWwPDXYa99tTvIfEDs0BcGxtAsDmPWWkOsWvZTgUpDxXnNuVEP22oZXlE5cm+inZUSr6jUs2Xs8r8ZJoFQ8QQZ8HNbktAIq6kyKPHwCb3YK3UrQT7OJ8oYAPxVeu76vhK9UfWG1OKvUPZLfDQlD/duK2acY5FZ+YRrapCoEK0bY4LPg9AdGHw27cf8XhQm8S1j+MAQIho4lX/4emKPj0tk1VCOptRZ9tCwZDaPo4wr4gqi1y70Io1ui3PnM7rMb+2YdN/5M2DYFgCFT9f47+YAhFb4ci385UjWCo+nYEVVUIhaI+htB+PA1h037hUBhNrfnDw/yaxdSu6RjtIH4QNaSvmg5VqXuf5r7UmO1yHeGhhKJqMX9TdSbcgGNrwcCBA5k3b17MtgULFtC3b1+s+v93Bg4cyMKFC2PW4SxYsIBBgwb9pWOTDzgSiUQikTRS/u4HnLKyMjbpX0h
B2MBXrlxJSkoKrVq1Yvz48ezatYvXX38dEI6pp59+mnHjxnH11VezdOlSXn755Rh31M0338xxxx3H448/zqhRo5g7dy6LFi3i+++/r/911QJZB0cikUgkEgkAy5Yto3fv3vTu3RuAcePG0bt3b+6//34AsrOzycrKMvZv27Ytn3/+Od9++y1HHnkkDz30ENOnT+ecc84x9hk0aBDvvPMOr776KkcccQSzZs3i3XffpX///n/ptcgZHBNNuvRFtcVx4QdiQexNTY5h01HiFqXNn8sXPjGBf/SoO3lih3gqTmzRiVXnCRmr0zAhSzlfvIvkNj0AePv6AYZc5UxO54WcJAB+/P5PhqUImSWrwk/yIrH4+MGTbgDgjcdnMHFOFwDuaZXI4nUFYkynd+PKOe8A4D53AjuuF+1r9AWZd2xZRctx5wPw5el30e4OsWB30dYiUnQp49S+LfjfNz8CQvaIz14FwNe/C2nlnqYuenURktG8979nhy7lDO2UhiM0AIB836t4isWiamePEwHoH0xhbsEuAAq2F5PWTKxdaO60ULhWyF+eigA2XUbwVZRgayV+weMSxb3I3VOB01dsvCcVe8WY3EkONH3mXSvLx60vEM5ZJaS28nBUgikv9RGnS1pBvw8tWVyLarWxt1JIRo44KxVlwn6Z4rQa+6pek0RVItqa3WnIS0lxNkNecun3U1NACXiM++kv1yUqp4WAfj6Lw2ZIW4rdYbTDJm3dr0s0mqLg1yUxFfAFgsZ5gvp2zSa+m4RDYTSrLh15gqhaZHsQLCbd3tyuImEEQ2HjG6K5bZaTzO2guR0yt0PVbg/rm1VVMWQoRf8dCIViFxqq1UhM4VDQkJ7CoeA+r9dEVbmqumOrO9+B+vknURWlTvegrkOX8pa4x40JRWngDE6obscOGTKE/ZXHmzVr1j7bjj/+eH777bf99nvuuedy7rnn1mksDUU+4EgkEolE0khRNBVFa4hEdfgKNYfvlUskEolEIjlkkTM4Jn69uzcJ8fEcOWkFAK+NbE/5haLmybE3vs3i1EUANO1+OjP0AknPLr+X146YAsAns0V9l1da3cL4haKGQuI7DxKfKWp4DDz9eB57QUhDFQW7mHyvqE2y6/s1LLlLLMg6fppuXU7OYI0uI/W/7STm6FLU+I5urnUI19PiIidui3hGTVor6qYEPGUUdxoKwLLCSv5zrHAQvfVLFhfo9UZ6dElnwp4dgHBMlS4WdUN2bxaaa4sBmZzYpSkAb+duo9gvpsR7N4snZDkSgMpgmIBHLxzVWri2epTFG7LVrsoAvVomAZCS7mLvJiGx7cwvN8Yc9FZiaS7cX+4k8XppYSVasbATKqrGHq+QeFqnuXBG5JeCXcQ1Ey6vYr9YDFfsDRoSVmWZF2eycCeFAj60ZHEtFpuTggohS9mcFor2CAkq0RF1UYVK9hrn9peL+kbWRIchUbntUReVQx+PpiioMS4qIVdZHRYChovKhi8i1dgchsxlrm8RMMk6nmDUReU1uahCEXeVLo8FAwHUiIsqFES1Rv+kFYvV2K6Yqq1WdVEFTbPR5olpv1lyMr1gdksFanBRBatIUbXB7KKC/UtDZieTpkZlG01VCAerkaLM+9cgQTRUmmhs0obk0EBt4CLjcEPkrX858gFHIpFIJJJGSoNdVIfxA46UqCQSiUQikRxyyBkcEy/1GoVD0djcWUg8ljkf8vKWLwFI3rGWKe/+AsDCnJf45IOHADjh52coTtar9P7vvwDk+wLc6RbSyXP3fMKlr4tifPee0I6MZ180zme7Rjinjjj1N17pejEAjgmzAeh46gR+/+RdAOIunkyb20Qf3g8mk97zOAAem7+e63Qn1rZZ4jhncifmrhcyUY4nwA0dRSXJSbOW02mQcFR1tJcRCgippkeak23zReHAojLRV8tzeuPKFA4oX3mxIU+0sFSSHYxWaY5Q5BASUAdbmKAu1eR6AxzZQpTdT+mYTM5K4XZal1dGcsT
1E/ARSBZjSkwVx21bk0doj7AgajYne3Vdp10TFx5dgwrkZuHKEBJV5PVSX8iQHjzlfhy6RBXeEyTkTNT7cxgSVZzLhrdSSFAua1TiCZUVAcJx5S0R+1rSNHyhiIvKSr5+P+wWcT5NgbAu19lUhUClOE61qfjKRdvqchh9KE6XIXOFTBJVtLgfVPqizimfqeif4aIyjVmxRJ1TEYlKbDeVdzeXeteqFPqLcUUR0zY7qsxOKxBT57EuqthCfSAkKqNsuxrdrmlRF5hi+ppldleZ0aoJNDyQm6gh7idtP4eapaiaZKkDyVXhULDGon9mDqbo1VgUNOnUqhtyBqf+yAcciUQikUgaKYqqxqyhqzN/Ydp5Y+fwvXKJRCKRSCSHLHIGx4RDU3AqCg89KRJwT7hyOrcuEUnOj3+1gFZni8To4G0XccNrIq36yYuf487PRIXHiSMnAvCfs7uwcJRI8F1d4mXGAOGMyn/6dlLaiTwrzebkvx+LZOSL+3U0it8t+ENIOfe/3otLf/gOgFlrijm+fyYAy6fP55hbzwDgmy9W0XNUZwBWfbQWgIzRF/PqN5sB6KsppOwUstqejb/T4UKR+xFa9gX2eCHxtOzWgm3fCEmoouVuANyDz8eJKLYXDgVx6vP16tbf+MMhnFaJVhVND9vcXChcQ0clBox7WewP0SNdyFwJHTJY+a04x7rsEkY4olOmZRYR/tamqXCGrf1hNYFsUVjQ6nRTrmsm7ZPj2KXLGr7cbOIyhPRWpss3eeU+4x56KnzEpYqxhXODRk6VZneyp0TIUqluG4FKISvFWaMST6i8ROxrsUXlJbvFyKJKjLOSr4/doUUkKoVwpXBkOTUFv8lFVeaJFPqzRyUqk4vKZ7IneUxhVEb+lKIYcpVNVYyieYoWLXynRfKngkE0W0SK8sUU94txTimx32vMDqkQNRT6q0GKina5n/ypavavjn1lKf0aq3FFabXIn4IaivvVQmI6EHUqtldHoakR1RX8x6jPPThUb5uUqOqPfMCRSCQSiaSRIiSqhjzgHL5CjXzAMXHRHwtISIin8HkxE/M/Z1tj9uKC5TNJ/E2kf/83czhH3yQWCId4jlkuUf/GoX/t6PLy2zybcCQAZ3VM4ferxGzP0iVZXPj6/wFgt6i89MJnAKzdeARP9hJRC+//lg3AKQkFdBh8LAAzP/6Tn8aNBuDx86Zw+wkdxb7TXqLt9IsAePFVMWN0dP8WfP2ZKJl9Sbyd3I/EQuXyPQGcJ4wFIOupR0hqPQKA1sPdfDpnAwDBdLHQ19emH9oKMTarK5FMPeG8/Lfv+SmzNQCZDiv2eDEzsiJbzHr0q9yBqs8a+EJhmseL46xdWpPjETV9tmWX0iRRLABWVI09epRBx3Qxg/Nx8R4qskr1c6caMzTNE+yU6Qtry3btIXXA0WJM+sxKdqnXeK985aU40+Ijbyshh1hkbHW4ySsVMzgpLrtRx8dpMaVplxYBoFqs+EsjyePRRcbpjuhiXbslUgcHQhVizDZVIaBHNVjdNiOqQTNFNYQt0agGc+0br6n2TWTWxqlgJIirSjRNPFIHJ+T3xSwsjrRDoUojQTwcCsZ8i6taB8c8cWJeZOw3xTaY2z4j0dxU70Yz18SJnSExFhwr0ZkdxYheCMfEJdQUyVDbqIaa08T3e1hMHERV6jKbUJeZHYmkNjQ4qkE5fGdwDt9HO4lEIpFIJIcs/+gDzpIlSzj99NPJzMxEURTmzJljvOb3+7nzzjvp2bMnLpeLzMxMLr30Unbv3h3Th9fr5cYbbyQtLQ2Xy8UZZ5zBzp07/+YrkUgkEonkL0DTUBrwQ0NyrP7l/KMSVXl5Ob169eLyyy+PiVYHqKio4LfffuO+++6jV69eFBYWMnbsWM444wyWLVtm7Dd27FjmzZvHO++8Q2pqKrfeeiunnXYay5cvR6vjG9t7/FeotjhOmzUPgB9z/6DNbpEKfnufa1lWeQoA97RL5qL7xeLjzdNG03H
SewCsmXQ6AOe+vZbL9LTrYW8/xF0DRDp5WSDE9OPSACFTPHWvWBi8vjCXgU8I6Sr3/McA2PbYRG67WkRAXHPzdHz3TQAgx/M/uvu3iT5CQUp7jNT7FtEQ1wxqywczXgWgZ+901n+oJ7yqR7DD0QqALQs20vyCKwBwHdOD3Z7XAIxFw6vzKmn9wxLxepPWdHALqSN76WqWHiGW2N7ituFqIvr7cZPY9h/3MmPxMkC8V8Qe0KEbhXrcQ9GeChJbi4XFWqWTbfoC5Xap4n75K0oo3bEHALu7o7G4NzPezh49Qbsip4Cmqc0AjNf3lHtx6YuQ/RXFODsn66PwUh4QsojF6aagTEhUvVomEdBr9sSZJCp/SYlxL3xlukTltOLXJaV4W/R3yq5/PdAUhZC+yNimKgQ8+uJkh4WgXqdHc8YZMlfYVJ/GH9p3kbG59o1NVQiZEsSjdXBMtW9sVlM7uuA4RpYy176pMt0dNNW4CYcxgv1iatzUlCBew/bq0ohVNVrHxyxj1RTnYKlTzMP+FyLX9hiAUCj4l8Uu1DSkv7s0TEO/2R6qC3obIw1dZNwgeetfzj/6gDNy5EhGjhxZ7WuJiYksXLgwZtuMGTM4+uijycrKolWrVhQXF/Pyyy/zxhtvcOKJJwIwe/ZsWrZsyaJFizjppJP+8muQSCQSiUTS+PhXrcEpLi5GURSSkpIAWL58OX6/nxEjRhj7ZGZm0qNHD3788cca+/F6vZSUlMT8SCQSiUTS2IjM4DTk53DlX+Oi8ng83HXXXVx44YUkJAiJIycnB5vNRnJycsy+6enp5OTk1NjXo48+ysSJE/fZXrJzPYrFzvn9mwOQfcbJ3HTZ/wC4Oi2OZ94SstQJv36G5wyx/ftj76L0xQcA2DVayEtfj5nIB+/eDcB7ai/D3TM41cXWO68HwJmaSJMuAwHI3/ArW7ufCcCxNy0FYO6T33DlBDHNf2nxHl7+Taw9ahNnJWfWTABS2vXi7dW5gEgFB+if7DfiErr8ZyCv3vERAHEDM/lST/T27ShhaG9RV6ewSRtDOolLFdu+2pzPSd8L+Sy55VAyuglZLXt5NrlJoj5Ok+5pJDUTMtHGbUUA7A2uIy61p7g+TUHLE3EVtOpquKFK95aR3E68X7ZtCWwprACgb6ZwOgW8lZRmCYnKmZxm1GhJcmgkxos09LLsYpQUce7I2HfuraSd7mryV5YR11ScQ1H3UOLV68jEuSguE/JRkwQ7Qa+4T6pXOKAAvIWirVoT8et1cMwuKrfdlNbtjyaIhz26RGXVosfFO4xUcMXuIGJQEi4q0Q6YitB4A5F4BoUSX7QdDEScSmo06sCUIK6ZEsRVk/wVE9UQm4eAmZh4BnPtm3DYqKAaCoX3ka5qTBA3O6fUaLs6zLV2wuFwtbJSdQ4nTVUImRLEzdtrkp7CoagEaOxfD61FM92+2FTzuvfVmKlPLRpZw+fgo6oaqqyDUy/+FX+Sfr+f888/n1AoxMyZMw+4fzgc3m/eyfjx4ykuLjZ+duzYcTCHK5FIJBKJ5B+m0c/g+P1+Ro8ezdatW/n666+N2RuAjIwMfD4fhYWFMbM4eXl5DBo0qMY+7XY7drv9Lx23RCKRSCQNpaGF/hqUY/Uvp1E/4EQebjZu3Mg333xDampqzOt9+vTBarWycOFCRo8WhfCys7NZvXo1TzzxRJ3P9+ULN+GOT8BpuROAKek9WeR5CYA5q+fTcdIKAI6buYbLb7kcgKvvf5+BF10IwLmPfA2IMv8fJw8BYPy0xSy4VMQztDxpEE9e/BwAiVaNi94UkRAffpbK/729EoD5Nwi5a9V9X1D4yuMAJLXpwavz1gHwQv9Mfn/1JwA63XgBbywUMtCjHcQDXuDr2YbUlDDyXDbf8A4AqR168e7S7QD0rvBxYU8h8fyyq5RE3ZGT3KarGMOqbHouFxJfs+OTae5uB8DXL/xE4S4hlaUf1Yq0pqKY3u4thQDk78nC3fRk4/p8G8T9Uk+
8wpBkPIU5JPYVEqCjsAnrsoUkdGonIYMFfZWU7hYF+FytHcZ7k2SFuDTh8irPrSDoEr8LkX5ziivprV9H0FeJLVW4uRR1L2V+IU3YHVY8FcIZlWi3EvQJB5fiLTfO4ysVkpnV4cajxyw4nFbjPG6Ti0rx6fsqihHxYHFajOJ+cU1t0XgGh8tom5O9I9s0JTZN3BeMuKiIOqdsarTon0miMhf6wxJ1VCl60UVR6C/6p1610F+4iiwVwR/jloq6MWpKEDe3zQX9QmGTXCVuf0xxv+pmW2sqvFeXQnpm+UithXZSnXMqHAzG9FNbZGC25GAhXVT15x99wCkrK2PTpk3Gv7du3crKlStJSUkhMzOTc889l99++41PP/2UYDBorKtJSUnBZrORmJjIlVdeya233kpqaiopKSncdttt9OzZ03BVSSQSiUQiOfz4Rx9wli1bxtChQ41/jxs3DoDLLruMCRMm8MknnwBw5JFHxhz3zTffMGTIEACmTJmCxWJh9OjRVFZWMmzYMGbNmlXnGjgSiUQikTQ25AxO/flHH3CGDBlSbUGwCPt7LYLD4WDGjBnMmDGjwePZO+ZsvJqF0S3PA2D1E6P4sVBILv2n/MGPD4lZoSbH3MAPb14KwMtZa/nylYsBiD9OPKCddfN13PbElwDkrfmBNt+LQnqbS8Pk+0SG1W5PgGlDhFSTkeTg4UfeBODVgZGsJws/T/kGgCPuGcOvH30OwJFjz+Gh86YBMPakzlx739sA9LhUOLLWv7GAjO43AJDlbm9cW6cjMlizTCym7hIMc0SKkHPG/pJDb5eQMpq1FTLXzo0FrC8S8s3x3dJp0l4kiG+b+h3le0QqeNOzu3OkRchE639cBUDBxgJSeohMqXS7haK1YnYueFz0D8xTvIfETm0BcGzOYJ2eY9UkLvqrWJotJKr4ZKexTSvNw6UXT8xfX0DIFStX5hZ5SNaL3AV9HrTkpgColiz2VkQK9lnw6A6nZKeVYEC0VY8Yg6Jq+EqEXKXZnFTqupTbYTHcRfE2k2PJ5KIKmBLE/bq0ZY1zGhKU6nRFs6isUenNLAP5DblKMQr9aYpCKNK2asbfhFHoz2Mq7hcKoVijCeKRLCqIlaVCVcq0xcpPGI6NYKj6ZPEDuahqIwcZ4zIXBwxV76KCfYv2Wao4p2ru/5/PhqpJrqprynjVvtSY7VITawh/VWHHg4F8wKk/jXoNjkQikUgkhzMNDdtUZNimRCKRSCQSyaGDnMEx8dnvedgVlWBTIT280ue//NxTyCTu4fezevEkALqNHM+7x4uCfac8/iqrzjsbgI5DbwHgjVGtcE8Tbqn0Hsdx6fuiaN62rGIe7d4EgKxdpeROEvv/987HuSt3GwDT3/0dgFePb8Xbi7YC8OgZ3Rn88isA+Ibdx17fZABObWmlOEv0nTr9JgA+f3Q0A0a3BODjtXlGAcDB/Vtx3TtzAUi0qih/LALg1z/cjGkvpKkTjhDOqmeW/MxuXWY5vUMaNvUYAPZ4J+MpFrlTjl4j6V8ujnu1QDircrLLaNNKFOxrGWelYLW4ppISr1Hs0Fdegq2NkNPcSWGK8oUTyVISLcxYpMtjaalObLr8oJXm4m4mXFtbf9lNic9UnQ6oKPPhSBbST8BbiZaaIY6z2sivEFKU3WmldK94b1OcFkJ+sZ1Kk0Slu6g0u9PIuUqKsxkuKpdNM4rDRVxUNlXBXy76tTgt+PUMK81hclHZosX9wlpURtJr+6EpipFFpQKVvoDRd8RFpdq0qItKixTgixb6E86pqIsKS/Q85mJfwXBUVoKqzqnofTVv9wVDNbioItlSZrlKIax3o6qxRf9CpmKAxtAOUNyvTs6pmiSuGs5XkzRxoByrv4vI+OpyD+o6dClvNW6M0MwGHH+4Ih9wJBKJRCJppMg6OPXn8L1yiUQikUgkhyxyBsfEhA9uJ8Hl5PIe5wJw9Kg76bzjOwBOnPAib/yfKOj308Kh3PqYkCfeGx7HTVd
tAWDe7CEA/H7RmbQfcjMAt17Qi9vufhEQ8syxL98LQOHir3jvIeG0GqWNJ61TPwC2LhUJ6r2n3sTshULC6uNbjzu9DQAv/7bbkJ18n840nuzXWFoBsKrYwzWDhBPr1rdW8mArUfm5V7tkPMUi46mT20bel+LcOZsG0GZYJ3GNHYV89kTeDiM7qmfTOLyWboBw24R055E/swe9SiOyk8inyqrw07etKLCX3C6JvZtEAcBN+eUkWnV5w1cJGaJwYGJaNntzhWNKK9oFgGqxsUfPjmrXxI1V14MC2dtwNRPOqb2+IIXeSFaTeO8qy7w4dYkqVOlDSRQuKs3mIF93UTlcVvJ3Czkq0W41pv1DpYXGub0lXgCsTWyGiyrRacWryzUOi2LIHeYsKn+5kNUsDgu+cnE+q8thKvQXF3VRWaJVtM2SkMeURWUu+hc0XFSqMeaYQn82c6E/k4vKlEUV1kwuqirmRFMclp4/peljC1crSwWqcU5VpToHpFkOisRhhasUB4zNlYp+/9KqSDXmzClNVQgH95Vw6uqMqW0ulbnfmto1UZshScFIYka6qOqPfMCRSCQSiaSRIh9w6o+UqCQSiUQikRxyyBkcE2eva4fF4eKJyWcB0P7YW1gyfT4Anwws45fjhAz007En8H8XdAfgy0EX0U8vSGd5SjiZXp+/hXdfOh6A7kUruDEigVhtzHf3B+DY/xvO6tuFq8n2zA+c9vyNALz2+G8AbGx9AselxYn2E0/QfuB1ALw6bx0zewn5ZcXT80lucwEAM3/YBkA4GKavW0gnWX9spuMZPQBI2/MHqi5fdOnWhC1f/glAsT2RjMuOAyA1Q5wv6Ks0pusTizbzhypcWTZVMb4N7KhUaZskJJCIbJXvC3JypnBRpXZJZe2nGwFYtaOYLvaojBJIFvexSVoxO9YJacq/swgAq9NNsZ4d1a6pi726WyiQk4W7uZDQiv0hCiLF+3RJo7LUhytdjD+8JUgoLgkQBfvyy4TslBRvZ0uFKOQXZ1VMEpU4t2aNSlS2FhZDokqKs7JbV1Lsmmrcm0j+lE2FQESiclqoyBf3P8ZFZXcaUlDApN54TFJUpS9oXFOk0J9NVQy5R9VU415rzuj91IyCfoEYWcqcP2VoQrCPrBQ2FfrzB2MlI/MxahW5qmpxv2rdUqpJhtJiJano0GpXtK82mKWrg8XBKAL3TxUcbCwGqdo4terz1jeSy/tLUVWlTsUz9+3gcLhL1SMfcCQSiUQiaaQoqhLzJaA+xx+uSIlKIpFIJBLJIYecwTHx89tvoWg2XlkpXFEr3+hFSTvhqJp+zH+5InslAHfFd2PCV6JQ3tNJ3Zn21YMATBw5EYCWTiutPn0cgMVPL+KIcx8BwGLTuP2FnwHo368FQ1KEtPXT3koeGymcTAsWiLyrWz76g1euFwX2Ppu6mP8u6grATbc/y5H/NxyAZ/7vTdqfKaSyb77bBsCFDguBr94AoHBbNs3HnQPA3s/fx53RBoC2/RU+fUrkXHnb7EXrLfqz7RaZUqrFRhNdUvKtWsLStFMBSLNp2FxCglqVU0rbpAIgKm+UBUJ01K8prntbst4TRQj/3FXMMGf0Vy3PI2SKrs0S+KFQFPjz7RTSkTUugRLdTdQz0UlQd1+V78zG3V5kWJUHQ2SXCinJqUtYnvIKnKkiq4otEHImif6cbrL1woGpbjuBSuHacpkcSaGSAuO6vcW6ROW0GPJSitvGbn3sTpOLKlRRaowhUujP6rAQiGRRuZymQn/RXC2fSQaq8EedUxGJKlEBry+6PahrWqpNNdxChnMqGHVRhUJeI4sqHArG5E+Z5apgFZUoaM6ZMrmoYjOq9s2igqijSpxfH6fZLaWY5aqojBWZcjc7p8wyTjgYW+ivqnSl1ULWMruianI71XRsdZtrkpmq217XnKnD+Eu25AAoitKgYoyHcyFHOYMjkUgkEkkjRdHX4NT3p74S1cyZM2nbti0Oh4M+ffrw3Xff1bj
vf/7zH+NBzPzTvXt3Y59Zs2ZVu4/H46nX+GqDnMExMfaBm7HHublmifiW+XrnEVR+9gUAlcE5DLxPzNq8MbI9Jz4kZkBeOakdb7jFgmKH/ot0+aTTee22DwHYUObl7a8HACKpucdIkThesK0rT4wfAcCeCV+gvSdiIEaPEouGX33pc5q+cBcAv02Yz7ROIqbg6sIc7Gc8BsC2y2Zx9YkdADGzA9C/ZQJrZ38NQMDTEk93MTuz+e4pZPS6FYAmJ3Vjw0SxeFpRNbYjIhcyfp4FgDM5nTZxYrFq7ne/sKhzXwDOcdtwpmYC8MOWvZwWvwLAmNUByLCJ2Ytw167k+z4BoDC3jCS9Ho+Gk536Qt4uGfH49AXYJVvFLIo9vqexuLdNkpNim3gGL9uVT9IgcZ/LAiHy9FRwlz6D4y8vJq5jsj6KMB5V1JqxONzsKRV/QO2auPF7xAyO2xa1TgZKRB0fzebAn68vXrZb8FeTIO6wqMbC5nClmHVyatE6OFaXlaA++2KJiy4sDlvtxkyJedbDXO/GnCAeiWfQLGo0TdymRevgWKORDKrV9GdsXmRsKtFuns0JV4lqCJpSL2IWGVdJEI/O7OjHa1XiGUz7V7eYuDZYGjiVUZdkcfO/Q3/hIuDGMjvT0G+zjeQyDjsUpYFrcOoxg/Puu+8yduxYZs6cyeDBg3n++ecZOXIka9asoVWrVvvsP23aNB577DHj34FAgF69enHeeefF7JeQkMD69etjtjkcjjqPr7bIGRyJRCKRSCQGkydP5sorr+Sqq66ia9euTJ06lZYtW/Lss89Wu39iYiIZGRnGz7JlyygsLOTyyy+P2U9RlJj9MjIy/tLrkA84EolEIpE0UiIuqob8AJSUlMT8eL3eas/n8/lYvnw5I0aMiNk+YsQIfvzxx1qN+eWXX+bEE0+kdevWMdvLyspo3bo1LVq04LTTTmPFihX1uCO1R0pUJq5cNpN4h40vzpoAwO6pw3j89ukA7P18AhNvmQNA828+YcNQUbemx4+fc9qFMwFY/ZBYjJs3+gE2jBUSVXuXjcR3xCJke3I8cbrEU7RtNfbZzwBw6u9bWHyv6PueTfcAMPX+tXy6R0zdpdg0fB9PBcCd3oYvdwtNIdGqck7XNEBIVwDdRvdmwcwfxPl69GLBZiEBFS3Ppu/l4tzB7p2NKAZncjrfbt0LwOBFywBIanMpbfYuEPfgp61ss4k+WrRLIjFTLPRdvimfElX8cjqSRb82VcG6Z5O4mR2OMOrZFBdUkNJByEe27EQ2FoiYi3bJcfj1Rb8l27L1voZHF/c6NFL0WIqyXQVoTVoA4AuFydL7yLCIZ3RfRTFxGSn6O1lAqR7lYHUlkq9LYv3bpxL06jVq/BVE8BaJxcKaPcOIWTAvMk6wR/9MtIDHkKgii4ytqoq/XJzD4rTi0xcOK3aHIXOFLdE0cfMiY49Jlio31cEJmmQpQ66yakYdHIsjupjYkKuCwdh4hphFxuY0cWIIETYSh0VUgy77BUPG9mA1sQ0xCeJKbB0c82LiyPgVVTFkr8j/dMPhcLWyUtWFxZF2qJqohsj+1RE2RWEciNrWu9FiFinXuvt9aIxrP+ujhDQWCe5QRVWUBtViCuvHtmzZMmb7Aw88wIQJE/bZPz8/n2AwSHp6esz29PR0cnJyDni+7OxsvvjiC956662Y7V26dGHWrFn07NmTkpISpk2bxuDBg1m1ahUdO3as41XVDvmAI5FIJBLJIc6OHTtISEgw/m232/ez975rd8LhcK3W88yaNYukpCTOPPPMmO0DBgxgwIABxr8HDx7MUUcdxYwZM5g+fXotrqDuyAcciUQikUgaKQer0F9CQkLMA05NpKWloWnaPrM1eXl5+8zqVCUcDvPKK69wySWXYLPZ9ruvqqr069ePjRs3HnBM9UU+4Jh44rGvsKHy+pfiybZ4yWSenyze5Kt2dmbgxc0BOO6eBXQ9SdTHOe6xxZTs3ADA7gvEKvJLJn/
HS8OElNNmeA/evke4iZrYNU547FUAflq4ipvnijoxLz/0MM/MHgXACXP+Bwgp6rGPVgPwaOdUlk0Rbq72Zz7E5PliFfrt6W6sP74DgD1eyDOZF57G6seEiyqlw1G8rEc4dCv1cUEfIfH8lufBqRcJSWzVjY9+E3EJrX8S/828LIUWNjGdufy91expK1LIM/u1oGkL4ZjK21nCnoJN+lgHi76sKr4NImpCHXCmIfGU79lNcg9xbltZMutyhLRzbOskkS4OlOwU21y9oivqU+MsuNJFbZvS3WUE3UKO84XCFBYKiamzVY9yqCzD2UTcA9VSSolPSBN2pxWPHuuQGmcj6BNuJ9VTYpzHWyhkMovNiU/f1+60GuOPN0lUir/CWLgW1qMaLE4Lfl3acqbFmxLEXdEEcS0qHflDYaNGizdoimowauJASNeRNJtqyD2iHXFRRaMaFIvL1I5KV+baN2a5qmpUQ0yNG9NL5nbE4bXP/gdwSymqQljcmphy89V9E6yu3k1k+/6osZaNWeIy7VJTfRzjfFWSyWsrQzVGyUny7+fvrmRss9no06cPCxcu5KyzzjK2L1y4kFGjRu332MWLF7Np0yauvPLKA54nHA6zcuVKevbsWafx1QX5gCORSCQSicRg3LhxXHLJJfTt25eBAwfywgsvkJWVxXXXiUzE8ePHs2vXLl5//fWY415++WX69+9Pjx499ulz4sSJDBgwgI4dO1JSUsL06dNZuXIlzzzzzF92HfIBRyKRSCSSRkpDwzbD9Th2zJgxFBQU8OCDD5KdnU2PHj34/PPPDVdUdnY2WVlZMccUFxfz4YcfMm3atGr7LCoq4pprriEnJ4fExER69+7NkiVLOProo+t+UbVEPuCYuPGK3rhtVhYpQwAY/n0ii6cPBaD7yeMon3cbAK7T32T3t+JNbHbcfxl06WUAnPuIkIayln7KUfOFFJVrSWP1nfOMc7wy5ggAHmniZtYrotjehz3OJ1GPJPhh4hwAul35FH9+9S0AR995Gk9eJZ6U/3taV+56TKSQ97qiPxtf/QCApt2uACAvs59RKK/dES1Yv1K4k1oFQhzXWuiv98zfRHuXkDJadMpg6/p8ANbvFlLN4O7pZLYXxf3eevk3SncLKarZ5UfQXRMy0Lzla8jbKo5LPVUUIUy3WyhZLVLKgwMuMq7ZU7yH5C5tAIjb3pzVu0RhvaZxbQ35oWSnkHvih0YjDSwlObibioTwwq1FBN1NjNci8QspesG+oM+DltoeAEXdaaSNO1xWPHpRwGSnlYAuianeUsMVZLiobE4qdckoMc5qyEuJZonKV2m4qAIVoi+Lw4Jfj2fQHHbj/isOlyFXha1R6c2k9pgK/cUmiBsuKqvJRWXTCPuqRDWEQij2aN+KtXoXVchUpi1yenOhP9VIEw8Z20PhWOfU/lxUqhobyWAuEmimagHAcKh6FxXESk9VCwDWHM9wcHSihvZT0+F1jXCo2o8a85rUxBrCwUiJ/ztQVPHTkOPrw/XXX8/1119f7WuzZs3aZ1tiYiIVFRX77qwzZcoUpkyZUr/B1BNZB0cikUgkEskhh5zBkUgkEomkkSLDNuuPfMAxsfiiR3C64vmzexIA8cfeQtFCUaSvw/Fj+byPSOY+9p7n2PEfsbq87TH/x/yreon9jxM5U026DODqRaJ43sasLdzfUcg62/dUUPbkWAAeGvc4U+8XLqmHZq/ghcHCZfTeEqFrPnreEQx7R6SCq2fPJOfiVwAY3zmBa7aI1O9m/7uBb/qK7Kr+k0U+yIdr82ipF8cbeGxbbvrkSwDcFhXLKtH+bmUcp7cT1zi0VzOef+5TALbpss7JXZri0E4AIN/3LJ5iIUU5ep/PsRXCyfRuwW52bxdSU+vWoq9WcVbyfxdJ7CXFXsOp5S0txNpOpKTHp6jsyRMZTvayXOPeF+8VklNaWpwhAWmlecTrrq0dv+VQ4o++V+V68T5nknC8BbyVaKnNALDYneRXCFn
KEWejdK+QktLirIT8YrtSWWJILr5SMa1qcbqNAohJcTbDRRRn1QznjeotMxLMfaXl+nEWApV6gnicw5ClVKcrmkVlicpIvmDYkEAihf5UoNKny1wKBPwRKUqLkatCHt1pFeOiiuZSYTFZM03F/ULhfWWlCH5TMTyztOQzyVWxzql9i+eJ4n76adXYon8hUzFAY2gHKO53IOeUmapyVeRYrYbz1SRN7C/H6u8kMr663IO6Dv1w/tD7t6Gq1f+91JbwYazTyAcciUQikUgaKX+3TfxQ4jB+tpNIJBKJRHKoImdwTNx522QUzUZq1ncAnPTQS7x0rZCAfnt/COOmCpfR/NPc3HSTKLb3Zd4wVp0nih91HHoLALde2Itb7ngOAH9lGcPefgiAwiWLeOuBzwE4y3MLTboMBGDzd/PpO1Uc+/Yg8d/BgXW409sA8MwvO2kTJ2QI30dTDNlgjb0tv+luopuOFw6iW95cwYO6W6pXhxQq9YyqrvE2cuYJN9fuzf1pO6ILAKd0bcpTOdsADHnmqAwXFVp3cb5Q2CjGF2hxBH3KIrLTXrJ0SWtA+1QAUtsnk7++AIBNe8oMZ1jQV4nSvDMAiWm72Jsr7qNWtAtVl1RydBdSx/R4rLoeFNi9BXdz4Zza411DoTdaCK9Cl6hcTUWRu1CZDyVZJNNqNgd5unPK4bKSv1s4tBLtVmPaP1hcYJzbq/dlbWIzHFCJTiteXa5xWhVD7lD8UReVv1zce5vLamRYWV0OU6G/uGihP0u0LLpZEvIEonKKt1oXlam4n00ztaMSlVmWismi0kzF/Woo4Aexbil/NW4pgMABXFQgCndVxSwHKeq+LqpQFReVpka/c2kmqaaqdKWpyj4F+aqerzZotdi9ap/mf9fmfLUZ0uH7HVtyIBSlgTM4h7EcKR9wJBKJRCJppByssM3DESlRSSQSiUQiOeSQMzgmup90OprDxW+TFgLwQbt1rD1bSCtfdz+GcTeKJNQPjr6QoU2ENOK7+zJeWyCcQ3PfGAJAq00LuFGXdSx2J++pwmV1whVDWDvuYwC0mT9y0WxROHDGw6v4PVMcOyLdDcCaiZPorLuyXpmzhlf6ZwKwbMoXpHYROR9PfLURl6439HEIR9O2FWvpOvooAFJ3/YpmE4Xzuh2ZzqbPhGurJD6ZZteOACA9w2VIUJHp+sSCDfymioqVTk0xpIltFQptk4QcEgr4yNeLzp3SXDidmvZMZ9VHIl/rt+1FdLdHZRR/iuivRUYxWX8Kp5g/qwCrU1zvXr2vjulu9ugupcDurbhbCImq2B8ir0zIQDZVMYr3xaWJ6wuXBAnFJYl77nSTWyLko6R4O1sqhNvJbVMJBcRxoeICNGusRGVvZTUkqlS3jR26muK0qMa9CZWXGO4wf0nEfWWhIl/cQ6vLWa2LKmBSZir8QaO/Sv26baoS045mUWnGmC1uW1SiMQr6BWJkKXP+FDXkTwVNjioAv0mzClXJmVL3U+ivRreUWYrS9pWlIvtHaKh7SVOVOjmOakN9vjFXLeJ3sMdUWxrLF/baSCP1eesbyeX9fTRwkXG9bvIhgnzAkUgkEomkkSJdVPVHSlQSiUQikUgOOeQMjolFp0GCG4KD7wVg4qkPc/H2FQB80bwXzXRn1A9P92bq+rcBGNv5AvolC5nENvkmAL6c9SsD7xH72iwq42csAaDjUa24KUNIMp/nlDFtSHMAPlk0gpvfEueZd9+pALx6x0fcc69IZB1z9WMceYcoMvj4mOl0vbIbAN8v2cb1eqZU+dwXASjKKqbZQ5cCkPvBWyS2EG6pDsfYDAeXt/1elN4nifFt+sGQsdJ1Scn76wK+bXq2sc0eLwoV/rKzhA7JeQCoFpvhuuqcKvKi4o7oQNbs3wH4c0cRJ+vOL0XVyNblpS7NEviqSDi7PNvLsLmTxfj1vKV+SU6Cer5U2Y4cErqJ8ZcEgmSX6cX9NJXKUpEf5WoqcrDYAiGXcHNZHW6yi4VE1TT
Bgb9cyHduW/R5PlhcgGYTxfe8xbqLyq4Z8lKKy8YOfV+nRTGcU6GK0qiLSs+isrmsBHQXmMXkogpbo7laPpMM5AmEDIdQRJZKVCCg30+rqhpyj2bTDLeQarUYhQojLqpQyIuiS23hUDAmf8rcNjunqpqdgubMqTA1FPeLdVRBrCwV45ZSYnOpwiZJy7iuagr6hYOxbiltPzJWzVlU0XZVt5PZgVUd1W2uSWY6GPLTYfzFWlIHGhq22ZBj/+3IBxyJRCKRSBopMqqh/kiJSiKRSCQSySGHnMEx8cjQcdgVlT9mzAbgsiQHw258HYBl94/g6NveA2DpHUO4epko3DYo0c4F748HYOLIiYBwBH16RW8AQopG0vMvAVC0LYPhz18j9rn8WXIniaJ+t100gdvuFhKTc94DAGy4/h1uSiwCIFBZRuDEqwHI8Uzm9uGdADjrzfcY0FO4jP54dTEAQV9nctKFi2rjnDtoceqZAKSe2pfNt80BhLy0rlzIR5lff46raUsAuiWIa9r19S8s6jkYgOvibUbBwW837OFc91IAQ7YCaIKQi8Jde5LrFfdob04ZKR2F/KT5nGzTCxJ2zYjHV1oIQPGmPByJothhxL3UOslJiV1IJKU7ckkZOsJ4fWexkIQSLKohO8W1TdVH4acsJI6zuhLJ011UXZsl4PeIwoJx1ujzfKCkxFToT8g+dqcVj16Ez22L/mnYLaohKYVKi6IuKr3Qn8URzaLSnHEmicpuOJZ8Jomnwh/ax0XVRFWM/CnNphIyFfqLuKhUW7RQoebQZalgCZhdVFo0f8qcRRWMyaIiBrOLylyE0BcwZVGFwyjavtKVubhfuJrttVngaKnDFHp10tD+XFj7k5JCoWCD6ovsj8aiCjTkG2wjuYTDHkUVPw05/nBFPuBIJBKJRNJIkWtw6o98wDExoEkccarGk8+K2ZQPNv/IFadNAuC7M+9n7zWivf2RJ3n7IjFb89wXj/GaImZrHPov0mnN4tl09WgAnGnxpPc4GYA9635iY987AThl/Gbee0ike187NsT1BbsBeOpHsbS1vctG9swnAEjt0JcXl4vX28RZOSFFzBwEPGUc8d+hALxws5g5cQ06kQ/WiJTu4PoCTn1E1J8pyGhlzCK4M9ow50+x0PfUhb+T1u6/ALT0NAVgxw9Z7HSKNPTmR6aT0lLM8KzekE9BcCUAcam9jJkMbdef4ga2P8JYeFycX0pqZ5E8bt+WzJo9YhZlYItkfPrsS/G2XJzJYp/I2NKcGonxYiapZPtelDSRsl4ZDLFTTwVvZ1GNPlzNxAyOou6hRI9ysMW5KNAXGWd0bUrQq9f58YjIBgBPQTEWp7g3/vLIDI7FGEeiI/qnofoqjIXF4YoSbHoEReQ4W7wDjz4tojhdROZAwhaHsbjXvMjYGwgaM0IlvmhUQ1AvlqPaNIJ6f5opnkGzWqJRDaZ08ph4hpoWGZtmbYzZHH1GRkQ1iK95oVDsTI2ynzo4ihJbB8e8mDhorq1jms2JzOxUN+tS08LiUJXtkddqXgCs3zvTIbWJZIjt33xs1XPXrS8zjXE5RH0+/w7jz8y/HWkTrz+H8eSVRCKRSCSSQxU5gyORSCQSSSNFuqjqj3zAMdHjh4XEJyQw6LXlAHS/4ytmPCXkm2tveYZr7roOgPNun20szL0/vxMvv/QhAKsfEjVsko8fxvhBYwFwW1QeWPw4ADM+asoVz/8EwDd3Psbq2+cCkDVpvCFjvfqhiFP4+PQO/PiCWNB79KSreXXeOgBe6J9J+ceixk58s/bEnXYhAJuvEgujM3v05e1vRHREn3IfNx4pIh6+2lpIE33xbpMO3Zi/fBcAPX/dTeuRQiZq2UTUnJn75Dfs3bEdgOYDO5KRmiTO8XsOudnbAEjqcDpp+kJc75+/AKCMusWQZMr3ZJEyREhAjsIm/L5DSErndG1qLJot2lZMfLtorRiARNVPfDNRK6g0u4xgQjog6rNsLxCRC31tGgG
PaNszRIK4ou6lWJeoHHE2PHq6d1qcjaBPyFVqZbFxHm9RGRa9/o9Hr2HjctmM8SfaTRKVtwyr/j+JYHkpFqd4zafX9nFlJEXjGRymRcamBPFAKGzIJN6AaZGxPxLPQFSWsmpGmnhMgrjV9Oeqy1LhUBBFr+cTDgUJa9Fk8bBpkbF5MXCoSiEcfygc0z5QHZxgNXVtFFWJlaL8GPtEMP+PtqZ6N8Z4Q8FaLz42S0i1WW9Q3cLicDC4jxRVGw7jzw7J34Rcg1N/pEQlkUgkEonkkEPO4EgkEolE0khRlAYuMj6MpxnlA46Jwde9gGp1kDOhJwAJ733LyV99A4A9sS3/ayqSsl/K3cbLz4gk8Gtunm44enZf8BgAH+8sJsEiJscK/UHuc28FoPM1/Rlxwf0A3Ni5CUNShETyxawVXDD7PgCmPzgDgG5vjOfF90TkwqTTujHwQtH3kWPP4acH3gSg3ZkPsdwjkrwjjqahg1vz4TsiGuLIMHQICUfV7T/nc1mikDL+6N6U377fKNrFXs7uI5xKab7jAdj24HzKcreJbdcOYKgi3FUrPl9M3hoR1dB0SALNdakm7zchn/mH+qMyTHE+8V27AuDamMaGXeIepWh+436X7CwlMU3EPESO04p34c4UEtXuX3bjtSca++cXCjdUsstKIJKA3kTEXaiWTeSWCenL4bJSqbfT4mwEdBeV4imN1r4p2otFTzKPOL+S4qwxLqrImBRfeTSeoaQCm0vIQz7dRWV1Oak0uagiLiWsUadTTFRDMBrV4AuYEsQjtW9sKqFgtA5OsEJPE3faDIdQJJ6hapsYWSq6OXJ6RdWMujfVJoWHY6WoyHZfIFitdBWuUlNHnMMkXZmcVmL/fZPF9xfJUNP2qv+ub3RCQ5PMzdT0OVI1Zbwu/VSdYj+cP6wOBn9V3aO/Ek1VGvR7GpYSlUQikUgkEsmhwz/6gLNkyRJOP/10MjMzURSFOXPmxLweDoeZMGECmZmZOJ1OhgwZwp9//hmzj9fr5cYbbyQtLQ2Xy8UZZ5zBzp07/8arkEgkEonkr0HVZ3Dq+3M4LzL+RyWq8vJyevXqxeWXX84555yzz+tPPPEEkydPZtasWXTq1ImHH36Y4cOHs379euLjRYr02LFjmTdvHu+88w6pqanceuutnHbaaSxfvhzNXLa+FlgdLlSrk0nH3AzA9MVf8cjAIQAs2rGSZ7scDcAt787h9DWvAqDZHAwYdQoA5z7ytbiuPbv59Q5RgG/Xj5tYfP4dAAye+l8sdiFLff7+tzwxXsQQ3H/7XKYMbw/AjIfFmH92dDMkkk65P1FZKArzBU68j+/GTAfgqpM7M33xZgBGJgk5pOeAVrz0mHBitYmz4vlapJ6vW9WTjse3AmB0nxZ8PftjAHK9AS5oq8cuWI4DoNgfwlsqCv2pPY7jmHIhEz1SmMvmQuFI6tMhjRZ6MnreKlGcMCe/nES9CJ6vvBhbxyMBSGpSSNEe4Xqy7N1uSB17Kvy0SRfvY0QCYs8OEloJV9fvX29jb2VUeigv0dPE0+KM4n0WXaKy2J3k6ZKR022jdK+QxNLirIZrK1ySb5zbU1iONV7IY2WBSIJ4NFrBaY3GM6jeclM8QyUWvQhgJJ5BJIiLMaoOlyEHmV1UvmDY6K/CHzRcWRWmQn/RqAYtNk28LOqiikgxZudUTYX+YuIZwmanVKyu5A9VL0sdsNCf6X+cqhpb9M+MuQBghGrTxEOxTqaaEserYpYdanJUNXZpIjK+ukhtdf3ckvLWv5OGSlQh+YDzzzBy5EhGjhxZ7WvhcJipU6dyzz33cPbZZwPw2muvkZ6ezltvvcW1115LcXExL7/8Mm+88QYnnngiALNnz6Zly5YsWrSIk0466W+7FolEIpFIJI2HRrsGZ+vWreTk5DBixAhjm91u5/jjj+fHH38EYPny5fj9/ph9MjMz6dGjh7FPdXi9XkpKSmJ+JBKJRCJpbDREnmro7M+/nUbrosrJEZJ
Menp6zPb09HS2b99u7GOz2UhOTt5nn8jx1fHoo48yceLEfbb/OvMSEhIS+LDV0wCM+ORBmmUKCUW9/UJyvUKSuF/7kaeufA2ACQvnc10vkYcUf+wtRl+h54WM1OeaHdzcUsxS+a6ewpHXTQbg57fewPnmMwB0ffBLyp4VieStjh4OwJ3v/85dzcS51z0xDXd6fwBeXL6b3Xphult6NuWJV5cBcPdJ7QDobCk0JJk+rRJY/853AOR7g3S4XCSEp7VJxlO8BxDumtaqkHOygkKKMuf25Dsy6BYnNgQ8Zca5B7RNIf0IkWS+c6koGrh8RxFpNj3fKOAjkCbGlJq+nvW/iXVRwd2bDPfSHm+QLvo1eiL5Trs2E99KuLb2+kIU6uezqYohUbmbxhHKFtcYcol7b3G4yCkV8ll8vJ0dlfq+pkJ5oZK9qLrjyFPowdZUSDsRB1Sq20aurta4LKpxH8KeMkOi8pVWYHOLPspzhexmdTkM95XZRRVUo9JRTHE/X3CfNHGbqsQU+ou8hxaHLabQnyFRxSSIm9vm/Cmz9MQ+7YjkJLKoIm6pkOGu8gZCpmPChuQTkz9lyGDR7ZqmmvaJntfsrjKjqft+z6oq1dQli+pA1JRLZZaxampX928zxvtTi8+Ug/mx01jUJymDHXykRFV/Gu0MToSqfzDhcPiAf0QH2mf8+PEUFxcbPzt27DgoY5VIJBKJ5GBiUcGiKg34+aev4J+j0V56hl6Cv+pMTF5enjGrk5GRgc/no7CwsMZ9qsNut5OQkBDzI5FIJBKJ5NCh0UpUbdu2JSMjg4ULF9K7d28AfD4fixcv5vHHRbZTnz59sFqtLFy4kNGjRwOQnZ3N6tWreeKJJ+p8zm97DSFO1bh0/QIAbmx6HNOKfwPg5sSjeOIVUXhv6lmP4dGdKLeWLmLVue8A0HGokKgsNo2zp/0AwKmDWtPeJSSNeVsLef3yvgAc89tGbpwjCgfee2kvvn50PgBXvX8DAI8++hZHjxsGwFsPfE77W4Wz67XP1nOGnillXfIGeWs2iHNPGAOAZ/5ruNPbiG29gnz7ksiJqmizG/vxojhhRskmQ5JIsWmEVy4E4IdksVC7id2C1SXkquXZZZycFF2jVKw7fY5qFo/riLYAfP+FyL76deteroiLyiUFYeEY6948kRULfwbAv60CW5x4oCz0B+mT5gJgu/41o2LbNlytWhivZxUL2cmpKVSWCueUOzMessU5ghGJyukmu0js2zTBwSq9+KLbphqyQbAwz8if8pb4sOuFCit121OK20auPnaziypUWmS4vHwl5UahvyJdPrO6nFGJyh5nuKh8Jl2owh91CFX6grgj/ekykE1VCOpuLtWmEvILiUqNtxjOIYvTTigkZDH0goVV86eIcVFhakcdVfu4qII1509FtgdM20PVtBUlNpfKGE6NspQudwWruKhqKPpX2yn6/UlL1e+/77aapK/qtte1iN9hrBYY1OceHM63TUpU9ecffcApKytj06ZNxr+3bt3KypUrSUlJoVWrVowdO5ZJkybRsWNHOnbsyKRJk4iLi+PCC0XAZGJiIldeeSW33norqamppKSkcNttt9GzZ0/DVSWRSCQSyb8VtYEPOEH5gPPPsGzZMoYOHWr8e9y4cQBcdtllzJo1izvuuIPKykquv/56CgsL6d+/PwsWLDBq4ABMmTIFi8XC6NGjqaysZNiwYcyaNavONXAkEolEIpEcOvyjDzhDhgwxXBjVoSgKEyZMYMKECTXu43A4mDFjBjNmzGjweNaV+nAoKg88ugqAV09qR9+HlwLw0olt+aDnVQB4Qm9x06NnAPD86KfYUCYcO3PfGAIIOeCIkbcCkPV7V/64T9TjeXXil6TO+x8AF104hlmvCFnqxWcf5clnhBw1rY9wJt2Tu43E/0wCYO24j/nvaSLX6abbn+WYVkI++vOZ9/GWtgTAN+A8ANaMOoXMXuLczc8+gtWTvzOuL8uWCUDTb6YQlyr
andw2cr8S2VWfd+4OwCi3DVcT0e/XG/YwPF7IdPb4FEP2aOkMEe7VA4Ddni8ByN1VQtM2uhMLJ1t1yahH8wS8xfkAFG0owJEosr7KAiHap4hiexW67FayLYeEfoON13eViD4SLBo+vfigq2MqIAbi0YTkZItLZFdhBQDtmrjxVegSlTW6zCxQtBdNL5Dnzfdidwipya//DiY5ovKa06IaslSovASnJvrxl3uw6hJVpNCf1e0yJKqwzWnIQeb8KSFRiXalL0iiLp8Yxf0sqimLSjM5p6wxLqoIij2ac4XpYT6m0F8oKkuZXVSRLCqzi8p8jDmXStHMhf50WUnfX62SM2WmpsypyLVY6iE/Ve3LaFc5vGrxwMj+oXq6rmpLY/mi3JCFlY3kEiQmNEWt1mlYl+MPVw7fK5dIJBKJpJHzT9XBmTlzJm3btsXhcNCnTx++++67Gvf99ttvRep5lZ9169bF7Pfhhx/SrVs37HY73bp14+OPP67X2GqLfMCRSCQSiURi8O677zJ27FjuueceVqxYwbHHHsvIkSPJysra73Hr168nOzvb+OnYsaPx2tKlSxkzZgyXXHIJq1at4pJLLmH06NH8/PPPf9l1NFoX1T/BHT+/QkK8m0ljXgLAMfdj1p8qnFEZi+Zx0pjHANj9+jX81PtyAHbcMZd+ybpMMvkmcVxqAomthKRUtG01/pdF0b8xeUXMvf19AB7aditT7xeZUW/sPJcMPd+o5MUJACS26sq7W4UEkuGwcH5n4Ty6ujCHI68SEs77jyzA2acfAB+tFRJQxa/ZDLu+NQCenh0o02WPuNRM5q0Xxf1O+PQnUjsIua1jxdds+3o9AOvCoo+bu6WR0kpkY/28No8i9VcAnKltDdnGuns14a6i+GCxX5xjb24ZaZ2Fq8m+O5k1eWUA9EiPx6e7moo37yYuVVSe9oXCpOtyT7ZePK8kKx+1qcjMqgyG2KpnWLWyqlHZqXkTIE/cX4+QHezxSeTrhQAHdUwjUCnObfPrriPAU1CMZhfSnLfEi013UUXkpQR79M9BC3iiElVZEU69gKG/3IvVJTKmfLq8pDhdhswVtjhMLqqoTOMJhAwXVbkvaPQdNMlSByr0p9kdhuuopvwp1KhcZXZRhXRJT9E0Q5JS9GlvfzBkSFG+QKjmLCojL8lU6M+UMxUZv6IqpnNEiwFW902yJudUyLQ9XKVdHWHdGVZTET8ztXFXxWZiHbjP/dEYa9/V50t9Y5HgDjca6qKqz7GTJ0/myiuv5KqrxOfE1KlTmT9/Ps8++yyPPvpojcc1bdqUpKSkal+bOnUqw4cPZ/x4UdR2/PjxLF68mKlTp/L222/XeYy1Qc7gSCQSiUTSSPm7JSqfz8fy5ctjIpAARowYsd8IJIDevXvTrFkzhg0bxjfffBPz2tKlS/fp86STTjpgnw1BzuBIJBKJRHKIUzVz0W63Y7fb99kvPz+fYDBYbUxSTRFIzZo144UXXqBPnz54vV7eeOMNhg0bxrfffstxxx0HiKK9denzYCAfcEwMemUPmr2cCU8IF9LQq57mrJuvA+CYsR8Y0/UvZp7LE/eJxVFLbzue9GHiDZx46sMAJFhUbv/8CwBemdOBMS+IYntfPfgcPzwlnErHPzKWJl0GAvDEGyt480RRNG/pE6LoXq97nmPqx0LCmnJEU3wfTQHAnd6G1IuEY2rtXfNo1lPPqFok6gkdVerl6gFC4lmwuZAUXVpp2rk3H/woMrzafptF6/HCrdW6SRfmzxCLx/a0Ea+3PLYjzdJEvlfWuj3szhESVnLLYUbWlGfVD6iniqKEEYmnfE8WaYNF/pSzMoOVO4SkdHKHVII+UaSvcEsR8cc5jXueYhEynLuZyKcq2VFCMFFUsQ6GYXuBkJiOtGn4y8UfqKNZOqqlCIBir5AsHHE2I6uqSZyNoE+4r5TKYuNcnoISrI5Oou0JEKcXYIyMP8UZlX1UTwnWSKG/8hIsupzlLfER11Q4xSI
FAhWHyUVlif4PwxcKR51TZheVP4hNnzsN+CMSlWrIVRanpdr8KSxRR5Wiu8HCoWBMcb/qXFRV2/4qDqdgmBoL/ZnbqiEfReWnGCnKr9870zfGfaJWTK6mqtQlW8osIak1fEM1y1XVyVLhYLBeUlRjlJwkhy6aosT8ntbneICWLVvGbH/ggQf261CuS0xS586d6dy5s/HvgQMHsmPHDp588knjAaeufR4M5AOORCKRSCSNlIYW+ot8AdixY0dMLFF1szcAaWlpaJq235ik2jBgwABmz55t/DsjI6PBfdYV+YBjYssPC1A0G1c0FVENk+2teaOzCOJMfmkVz00XMzs3jJsRXTQ7eSoLdhQB4FAfASDXG+DJdJElcPR/BzHy4gfFcW2SGZQsvnl/MvNHLpotohNmPDyTI1+9B4DXeorFy9POOYJBF4lFzf3Gn8fPD78HQLtTJ7AsIH4hbKrC0GPbADDnXTEL0yMUpqu+APfupflcliTO90fPdFb9KGZ5VhZ5GN1fzPI09Q1jw6NfAVCavVlsu/oYhobFOaYt+I6cNSLAIOPYJFrpUQx5y/7Ef3zswk5PYS4JPcQMlWtLGquzREZYE2vAuMdF24tJauIyjtOKdwOQ0FL84e3+ZTdeRzQdPqdA1LZJdlkJ6LNAWpPmqJatAGSXilkbZ7yNyjKxMLep207AK/ZVK4tR9VgDb9FeI8m8LBAi1R07g5PosBjXovjKjYXA/tIKI57BV+7D6hIzUJEUctUVb9S+CdvijLHHLDIOhkxRDYHoAubIrI3DQshYZKwSrNAXGTttxgJaxRqNZDC3w6Y08ZCpkknk9IqqGbVvVFWLqY8DxEQ3mBcW+wLBamd2wrFJD3pfsbM24Zj9911kXFO7pvo4DZ3xMdOQD4uq1PTls64RDuZ+1JjtcrqoodRmUXlj5mAtMq5t7qLNZqNPnz4sXLiQs846y9i+cOFCRo0aVevzrlixgmbNmhn/HjhwIAsXLuSWW24xti1YsIBBgwbVus+6Ih9wJBKJRCKRGIwbN45LLrmEvn37MnDgQF544QWysrK47jqxZGP8+PHs2rWL119/HRAOqTZt2tC9e3d8Ph+zZ8/mww8/5MMPPzT6vPnmmznuuON4/PHHGTVqFHPnzmXRokV8//33f9l1yAcciUQikUgaKRZViZnZrCv1yaIaM2YMBQUFPPjgg2RnZ9OjRw8+//xzWrcWJUiys7NjauL4fD5uu+02du3ahdPppHv37nz22Weccsopxj6DBg3inXfe4d577+W+++6jffv2vPvuu/Tv37/e13Yg5AOOiRdn3EacO56JvcSU2Q85fzC57VEAPPTll5z289MA3BKfzDHni6m60+/7koqCXQCsfuhUAHZ9v5ZFZ4j07+OnXo3FLiSNuW/O58lHzgRg/PXvMG2oSM1+epLGYpuQdiLSRaed31JZKPTKymH38u0FzwBw4/Pd+N9XGwEYneKkx2CxOPnVx58FoE2clfLPXwNg7cqj6HJiGwAu6t+KxW/NAYSEdnH7FHHR2hCjjo1Xj0JQew5haLmYyny0YDebC8WC3f6dmtCqucgBy/0ti525pQAkWnVJo7wYWydxv5KaFFCYKxYIW/ZuN6SOnHIf7ZuJvp2aCjkiiTyxbVMAfv96G3sro9JDmR734Ep3EdRlJ0t6K6y61JSjx2Q43TZK9wrZMC3OatSRCZfkG+f2FJZjSxTyWFkgTIpezyYiL7msmiEjqd5ynLpe5S0qxaLXKQpUBrC4hOzn06Ua1eEy5KCYRcbBsNFfhT9oLFqu8EUXtxpRDTaNYEBIeRanlXDZvouMFZsj2q6hDo7fHM8QNi8sDu2zTwSzLOXdXx2cKv+jVE11cJQqqeHm+jgRzFEN0f2qLvTdt96NOXHcOLf5mBoWHDd2aUJV9r3WAx9Tt3NIievfzz9RBwfg+uuv5/rrr6/2tVmzZsX8+4477uCOO+44YJ/nnnsu5557br3GUx9kHRyJRCKRSCS
HHHIGRyKRSCSSRso/NYNzKCAfcEy0e+K/uK0WmvZvDkDOqJON6e8rN7zK42NFzMKLy7/n9FQRBRD3/EuGSyfnwscB6H61n+eT+gJQevnTHDPxRQAWPf8SFeeL1PN+98wh95GxALQ/9lzufHMFAJM6CAfRqgemkdjqZAAmf7+dfI+QL8Z3TuD+Z4Rj6rFzutEuIGrXRKa5+3dKYc2bel0bQrS/6UQAmrVLxlMsohqCYWjlF/LXhkD6PuXtd1ua0LOpmNwL+irZUSkKnBzTPpWMI4W7asuibSzLKgIgQ484CAV8+JuK7JGmzXz8+cs20UdWieFe2uMN0r25kKhKLCr+HRsAiG/Z1Hg9r1ycz6kpRm0bV1MXoZ1Cdgq6m2BxCKkpW08bT0p0kFUunGsJ9mgad7BwjyERego92NOFtFMZDNE0QchJu3VlxWlRjHsRrig2JCpfSQU23XFVnluOLT7O6ANAcSUYclBQjUpHVRPEzW0jqsEUzxCNZIjKUhaHnXBIvPdmWQqTFIZWU+0b8V9zmriiagSMSIVoPINagyylGpLRvmniimpySGmqSa6KDq2qdGUMuZp05KpSTXUuqno7p2pyOynVS1o1tavDkA1r8TlyMD9qGov6JGWwvxZNaeADzmH8/kiJSiKRSCQSySGHnMGRSCQSiaSRcrAK/R2OyAccEy/PWY8NlTv2/AHAU017MvVnkQQ+ftBY2uhF7gZ/+RiLnl0CwNHXTMai190/+2FRMK9X3+aMbiJkjEV55XxwmXAWHfH7aVz82nIA3r37JN576EsA7vvuEa7872TR9wSxwvzJ/7xMt4dFUvi7n67jSv3c/rnTyPtTFN5r+8I1FLwrks8j6eXdBjp4/xFRqLCyXQ7acf8FIGnXckNKy3BY8P/8KQALm4wi0yH6diSK+IalO4s5L0kkiyuqZrisjsxwE9dXRB0snLOBXzYXAHCLO1p0Lscr5I2jWifz6xc/AODZVIrdLaS3Qn+QY9OEXLXZqlG2VVgN3R07GK9nFQu3lFNTqSwVTqyEFomwU5wj5E4zXFQ794p9myU5Wa4XX4y3ReWeUFGecd2VhR4jQbwyGCZFH/dufexxVjUqHZUWCZcX4CstNwr9FZb7jUJ/kQKBis1puKg8gahbSUhUor8yT4BEvW+vL4hD7zsY0N1GNpWQX0hwWqLNcA6pNguhkJDpFLvTuK7YBHGTRGVSg6pzUQl3FUYbao5nCJjaoVDYkGqMqAZFiYltMDunQqbifuZ4hsh1VZcUbt5etV2VmmIY1GocWVWpbnPNKeXVOLjqKDQdxp8vBnV2f/01w/hXItfg1B8pUUkkEolEIjnkkDM4EolEIpE0UuQMTv2RDzgm7n/iTOKddrpeLVxPfzx2OuetEu6eC5IdnD1vEgDjjxlHmS5FfH1dTyMLKP5YkbGRuzqFt966HYDgBf8j+67/APDg1U9y3S3CReX8/DFW3z4XgGeSC7hEl1cqTxG5Vbs9z/PgmT0AGHnxgxzXV2R6LJvyBcGgkKN2tRjE5tfvBaDtuacB0OTc49gwXshPms3JH+VCKmv1xUfEZ7YHoNeO79j+6WIAPjmiL7foeVXxzcTrX/6Zy6g48XpEtgLICBVB7376+D4mb6dI907tIooGWirdrMsX2VFHNE/EUyiktL3r9uBMHgyIDKh2yULiKXdoFG8WRRKTh5xkvL69SMhOCRYVr+78cvVuAgippixsxRYvzrmzUJyvV8skfBW6RGWN/kH7CgoMx5U314ddTwz3hEK0dZhcSUCcJZraGyotMlxU/nIPdt1xFagMYHELeSwiUYVsTkMO8plknQp/KMY51USNFvfTdFkzUujP6rAYxQlVW7RQoeawEQ6K+4zZRaVpRtOcRRUMmwr9RZxTWjSLCmKL/oFwURlyVTiMou1b3C8crr6IX9BwZJmyqGr4H2pdqrHuz1F1oP3N7VAklf0vcpLUNKy/27jSkKn4w/fj79+BpjbsIUU7jHUa+YA
jkUgkEkkjRc7g1J/D+NlOIpFIJBLJoYqcwTExoclobHFuApWfAPB6/5v5fKyQpd7/4zNuWCwkkL5uGz2OENLVj8NOx91UyEAdh44FIGvZYj5NF5LLWS8W8bieI3Xf/23lykpRIHDsJ+sYlCykoQ3330Nmn0sBeHDRZgB6JTo4RhO2oaCvkt53itj6xy94hsQRZwPw3E9ZuDcVAvCfEaLA3vaEVEOmSGzRideX7QBgzJzlZA4cDkBbSwu2fSMKBGbZ82l5jMjEatquJQCr/swlx/sbAPHNhuC2iOdgZfMylA59ACj2hyjKFvJR055CPnOsT+P3HCGnnNKpKX79WgvXZ+PWC/kFw5DhEr92u1KdFG0RTiyaihA3XyjMxlxxXG+bhk+X7tzNm6CoopBfoSeI3SVkosJiUegvs6eTgH4+tbIw8pbiKShGsws5y1vixaG7oXyhMMnOWIlK9ZbiUCMSVSEOPX/KW+IlLlXIXJ5gCMUp2n79PoetcYYzyRc0S1RRF1WJubhfIIxqEzJQSJc6NZPzy+KwGW3V4oiOzxZth7Woc83sqAqa1KdYF1VEulKjzidt3+J+vqpZVEZekkmuqqZwH0DILGOFa3BRhWJdVOZ2qAZHVbVOJlUhrEttZkdVTQX9asI8fR+biRXdfqjUSavrF/nD+It/o0LO4NQf+YAjkUgkEkkjRdbBqT9SopJIJBKJRHLIIWdwTLw5+TkUzcYvn4qiewPOvIuuJ4nCewNnbuDPL4V09ewPzxFoLaSam909jWnxua8NAeDOeU255SlR9K/5xMso9otigT9ddTedh40HYN673zDpLiEZvf7gfK6d1wuAma/9BMBXF/Rg25THAGja7WQCI8YAkOOZRof+onDgnEWbOF7Pibquu5CAnvtlJ23ihHzRokcXvv1ZSFTdf8+j/w2ZALTr3J8vv3wbgL1b19D6zN4AdHWkAbBk/u/s3r4NgLTjmpGpSzUlv/4A3UYY96t8jyjS13R4FwBce1uxbOteAK7uk2k4gQq3FJF0lMs4zu0vAiChRQIlO0sBCCYImSsYhqx8UdxvhF0j4BFtW7P2qBYhZ+VX+ImLF66mSFZVU5eNoE/IVWpFoSGzVBaUYItLBKDMGyBJP646iUrxlhsyUqCkBKtL3Ed/mQ9LK3FvKoMh1Lh4Y6wAYWs0F8obCBu/D95AyOivzBuI9u0Poll1GUjXlCxOCyGf7qKyWgz5RbE7ollHprZZlgqrUUdVjCwVjLidNPwm7SoqV1WfP1VdFlXIJFGFwqbifuLXT5eMogUAw6ZCf9Vhlp8s1chS++xfD52oqnPKKDL4Vzmq6uhHMg9Djdl++H7jluyLpigN+p09nLOo5AOORCKRSCSNFFVRGlTm4K8qkfBvQEpUEolEIpFIDjnkDI6JEVdfjtXpInD1OQC0PeZmfr2rLwDuoXfRtJsoVnfJMhdbPvoZgIe7NWF7tpBZ7NPGAvD27Y/jnvE8AFfNSObloW0AeOebbTz35gAAjjvzA+LeFO6qzXfOY1qfVAAeuE1kXLWb8wBvHDkagGEz7uCVFcJB1NJpZfDJnQG4YdwMEq3iGTV+9RcAfLzEzZNdRF9lg1rzzNNzANhQ5uP8o5oD4NLOZLfnDQDK9+wgYcglAJxWLmSiT1/5gO3rhRzU+ao0OumyTu4v6yg6LZITpeApFnlVzu7CMZb4e5htWcL15KrIM+7rnpwymmUKWcemKlj2CmkrqW0yO34X+xWHonJRoZ4vFZ8aZzixLM3aYLFvAGBXiQenniNVqu+b7rYR8Iq2WlFkyCyegmIsem5VWSBEqltcSzAMCXbx6x+RlFRPSWz+lFuMyVviwxYvnHKVwTCqK0HsE4q6qCL4gmFjSrjCHzS+QVT6YiUqi56JFYy4qKwaIc++LirFFpWlsESdU5iK+wVC5uJ+0ba5oF/IVADQF4zmUoFwTkUIhkJVHFURKSm6WDFsyp8y51JFqLqosToXVW2oTq4y+qrhfDV
9Wz2YTpKGfCFWlag0V7v9634OKXEdWrMWGnV3B1Y9/nBFPuBIJBKJRNJIUVWlQU4o6aKSSCQSiUQiOYSQMzgmXnYvISHOwbgvRLG9Va8fxZddTwDg2PHPMv084XTqc8YdhkPo+G8/JLRKOKYmnvowAOetvpB2x90AwMZv5tLnk5kAfNr+NI5c/zEAaZ36MfaTdQD0T3KQP0W4qyJyyhdlTfm1UEgu94/oyFlPfgfAs/2acVSXZACuLN5DPz3XaeurbwKQtecYup1/NABNe2Xy2G5xLWWBEIMyhZRS4O8czVEK+ChvIVxZAysCAHhL97KpTFzf8G7pZBwpHFq7l+fw5y4hQaXZLAQ8Qj4Kt+wOQGqzbezeIorsKdkb0GxibDmeID2aCydTyKLi37YWgIS2zdjjXQPAngoxZW9TFUN2is90E9orxkFqcyNTKrvMi0vPhsrNKgIgxRnNbwoU5KDZxbkr8yuwt7Tr9yBMU11uKwyFcenyXkTuULxlRv6Ut7AMq14U0FPowZZglqgiLir9HlqjBfg8gWj+VLlJlqqIKfQXQtPPHc2cshIqNOVPVSNLKSa3VlgzF/eLOqdMdQYNF5WQq/aVsSIEqjinjHOEYgv3KWrsdk1TTXJV9dJVbOG+6PcpzSTVmB1VRjsYK+HUR3KozbS+ud+a2tVhSIi1OMfB/P7cWJQXKYP9fUgXVf2RDzgSiUQikTRSpIuq/sgHHBP3XfEqNkXl7ruGAvBW5+Gs0qMAFoyATQ/8B4D0nudidYhv7SPe3MYlxwwDwKY+AsDrn27kza3HA3DSlrVM+F30P+asznxzjaiJc/60N5k9awEAD1w3kC+nifTudlc9BcDE91Zxgh6R0GLjArb9shyA3recReiLZwFwJmfQa4BIFl/1kZgVKW6RSurk/wMgJZxrfNNMtKqoKz4D4Nv4Y0jRowIsDje/7hYzMccmipmTcChIrlfM5pzbKpmEfu0A+G3pdyxZL+IZznGZFgXbReJ4zzYlbPhpNQC+DRXY48VMU74vQI9MsTB3h1WjcstGMaYOrdmjnyerOLp4ubxE3PP4Zm7C+WL8IXcTrC4xC5SVX0FmipihWVcq6u4k2qNRB8GCHDR95qOy0IOji94Ohmiiz/wUAnH6LIoxs1JcYMzg+EorsOl1cEp2lmJLELNHvlAYxRlbB8ccz1DmCxjfmMo8AZx6u9I0gxPwh4xFxkY8g9MWrdNisxIKifthXmRsrn2DKarBPGtjThM3LzI2z+ZEFhWr1cQzeAOh6mvfmGZzQqaZmgiqadbGjKYq+8zGRLZX165un2qjGuow2yL22ffYmhb6Vt1e1/o2Vc93uFLXeyBvWfWoSsMWGR/Ov4tyDY5EIpFIJJJDDjmDI5FIJBJJI0W6qOqPfMAxMebYlrgsFhadMxGArKkncPf/zgTg6aOvYkOZiAX4Kvc1Qwo4YuSt/LlYyER/TBwJwKsTv6TjIhH3cMU1Y3jhBVGj5oFnZ/BMMyFnTRvWkqcnCDkn45VH+fUxUR/ntjFHAHDT7c/ycJsk0e+k56gsFEnfgRH3sWbUKQC0POpW2pwnFhQ//9I4cREtYHu8SBZvOn8KribiuK7xdnI+FeP4sHMbRulSjTujDZ/+mQvA4HiRIG6PTzFkj/buMGE9GmJH5dds2yoWEbdol4SGkIk2FQpJ6ajWSbxRsBuAwjUFOJN7AiJ5vEuansDt0CjcIOIjWpx3NCW6XLKtSEgyCRYNb7GQweI7NoU/xEAqrfFG5ML2gnLaNRGLsX0VYtFzvC06GRnYu8dYkOzZ7cERpy8WDoVpFheVdlxVJKpQaVG0Dk5JOfYEsW+gMoDVHZWowvri6cjvgNdUR0YkiIt2pS9IojmewRJZWBxCi6SJ+yPxDNZojReHDdAlKnt0ATOWqCwYmyAelZ8CJnXFLEuZk76DVaMawuEqyeK6hBMOm+SccLULS6tLFjfXu6kaw3Cg+i81SVfVTdGb+zLX2gnVodZOfWg
snxcNmX5vJJcgqQVyDU79kRKVRCKRSCSSQw45gyORSCQSSSNFa+Ai44Yc+29HPuCYCMx4k0B8PP+9SEhUxUsm81pY1L4p9s9jZIZwz3jGno8zTbSbdjuZPetEAnjuxY8DcIWq8vrYdwF4ZMPlTL1fSFFP/Hka7XVnTt5j40jr1A+Ap1eX00aXUS5sJab8ry7MYcCdQop64eb3cA0aBMArK7IJ/rQLgHNu60BR11aAqHMDEJ/ZntkrhEx06jvfkd71vwB0DS9h02d/ArDam8utfTLE+Nt34odVIgZiT1DU2nGn9zLcRJZty6DnMQDs9T1OQbaQhNJ7ZeDYJhK2f8suAWBgi2S8uqtp77oduJqICAdfKEymHnuQk+igaJOQxFpltKVSjw3YmCucXB1tmtFHfKt0FFXIVXsrAzgShSsrt7CSYzsL51ZAj3Kw+UqN97EyrxCLszUA3hIvDt3xVRkMxSSIa/4KcWxEoiopwKlLR75SD7Z4IQ9V+oIoejyDPxwmpEczRGQ8j8nGJCQq0V+JJ0CriETlizqngsEQVj2hPVIHx+KwRWvi2F2G80ipQZYKm6Ia/FGFLOqi0jRTPINqpIkrWtRFVWOauFmWMsUzRLT8oBH1oMQmi4erTxA3y0cRIu1QTXVwanQ46dEWplPU9D/wmmMbzMcq1W6vD41RCairnNZY5DdJFClR1R8pUUkkEolEIjnkkDM4EolEIpE0UjRVaVBY7MEMmv23IR9wTJz7f0+hWOxk9h4CwIgfklj20QwA8ufdg9pTbL8xY6gh4by/eSZ3fdQWgLMfFpENn014iLV3zAXg96uuof2QGwGY+dK3LL1RJJLPnfE9579+CwDPvLmSuWd2AiD36YcAEeXgOP9iADZfNZv2A3WJat46BuoxCvcd3ZJ3dQdUS116adHzCOYu2QpA26U76TdGJIh36NiX18aLMeVtXke7U3oD0NndhF+XiJTuHVniv2l9zyTTIfor+/kbOP8eQEgypbs3AZB+Zhdc5UIe+3mzSB6/oEe6IbPkrysg+Uy3cW+TwuUAJLZOoGi7kLkCyS0NmWdjrpCYBtk1/BVC8nK0aINqEfvmVwRw6kng5SVemuntgEf0q5YXGDJLxZ4irA4RH1FeGSBBj2fwhcKkxUWlHdWjnyciI5UURxPEi724m4tU9spgCDUu3ugj4qKKIBLERbvCH41qqPQHiZi7Av4gmlWXgQJRuSrkM8cz6NKPPVrcL6bQn6m4X1iNZgQHTQ6piHNKNcUzANVGNSjVFPoLmqSomEJ/YZNc5dfvn6pE4xmUaNvslqoqS9XkrqqOmhLD90fV6XijeGIdp+lru3t9igBGj5VIDoyUqOqP/BuTSCQSiURyyCFncCQSiUQiaaRIF1X9kQ84JtK7D0C1xbH6iRMBiD/mZsPpdH3+kayftQ2A+zqmsCVPOHBavXEP88c+JvY/VkhOoyc7ePZYId+8/eVmnlsn5KXh595Bk1dFjtTqx7owbZgowvfMg9Po+oGQpt7qdxEAQyZfwyt/COmnpdPK0aeLYoK33PEcw/WCcU03fcVrC4TT54nOKQCUDW7DSy+IzKlVxV4u6y/cRMnqeWy7+QOxT842kk88D4AzPc1ZNFtIV1mrhWOp08VpdI4Xcsju73+n+MRoTlRloZDEXL1PJHGd+MvZuK0IgARPvnEvC3aXkdFMyDo2VcFSIO5dcrtk1n4qsqiKw9F07Px8cT8TUxz4dWeUpVlbLHaRhr6zxGMkiJfuraSZLjsFvGJsavleQ2bxFBQbuVXF/iBNE8Q9CoYh0S4kKE0BtbJYvy5xP71FpVh1l5uv3I8t3pwgLlxUvlCYsO6iiuANhA0JpMIfxGpkUfmjOVfBqCxllqtCHl2yMSWIm2UpxW6Sw0zOqWANklNs/lTULeUzt02FCUVfIVPb7JyKSkPhanKplP1ISGaJ6kDF/cxUJ1eFQ8F9JKbaSFYHa+1BQ2f4VSV6D2rqK+Ze1uN8Mt370JV
ilAZKVIfz74Z8wJFIJBKJpJEiFxnXn0a9BicQCHDvvffStm1bnE4n7dq148EHHyRk+sYZDoeZMGECmZmZOJ1OhgwZwp9//vkPjloikUgkkn83M2fOpG3btjgcDvr06cN3331X474fffQRw4cPp0mTJiQkJDBw4EDmz58fs8+sWbNQFGWfH4/H85ddQ6OewXn88cd57rnneO211+jevTvLli3j8ssvJzExkZtvvhmAJ554gsmTJzNr1iw6derEww8/zPDhw1m/fj3x8fF1Ot8vd3QnIT6eue1ELtSJE19kytk9AJE5FZlmHvrz5wxd/S0A9w27m/N+FO6j9kPEmNbO/4B+X7wKwBdtRnD0WlH0r0mXAVzzgSj6NyTZQd5jIj/K5krmk/JmAPxQICSXR0/tymmPLwbghQGZHNVVFLm7rjCHwalCttj8/Cts2yNcWT0uFTJYs74tmHy/GE9ZIMSxmUKeyQt0x6dLC0FfJWWt+wNwnCdIZWGOGHepcPScekQzMnunA7Dzp138kSXyp9LtFgIeIR/R5kiatBRurZ0bhZSm7FyDpjuMsir8HNVajNlvUfFt/gOA5E4t2VW5FoA9FUFDwindq2dRtUggtFeMgyYtjUypXaUe3EniWnKzikjV3VAR11agIAdNl3Mq8ytwtHbo9yBMU13OKgyFceu2Jk1RoFK4qCKOOG9hmZE/5Sn0YEswS1TidykYDhOymvKhAE8g6pwq9wWMa6rwBQ35K+ALolkjWVQ+NN2lFio0u6h0KccSdUsp1qiMF9aif64Bk0PKVGcwJn+qOueU+djqCv0FAqGYLCpF/woUDoXR9GsxnFOqua0Y0lWscyr6HUpT9nVXxRT3C8ZKWbWdljfvV5v1BmqMHFR9uzoOJDOZOVjfmRuTunA4Sx3/JCoNK8BYn1mMd999l7FjxzJz5kwGDx7M888/z8iRI1mzZg2tWrXaZ/8lS5YwfPhwJk2aRFJSEq+++iqnn346P//8M7179zb2S0hIYP369THHOhyOqt0dNBr1A87SpUsZNWoUp556KgBt2rTh7bffZtmyZYD4H/DUqVO55557OPvsswF47bXXSE9P56233uLaa6/9x8YukUgkEklD0RSlzmUOqh5fVyZPnsyVV17JVVddBcDUqVOZP38+zz77LI8++ug++0+dOjXm35MmTWLu3LnMmzcv5gFHURQyMjLqPJ760qglqmOOOYavvvqKDRvEjMSqVav4/vvvOeUUEWGwdetWcnJyGDFihHGM3W7n+OOP58cff6yxX6/XS0lJScyPRCKRSCSHKlU/87xeb7X7+Xw+li9fHvO5CjBixIj9fq6aCYVClJaWkpKSErO9rKyM1q1b06JFC0477TRWrFhRv4upJY16BufOO++kuLiYLl26oGkawWCQRx55hAsuuACAnBwhraSnp8ccl56ezvbt22vs99FHH2XixIn7bH++9zk4FI18n5iK/rjTJtZcMwGAtsfcgEV3vhw74w/OGHwUAGk2C69+KZw+n8waCsBxO7O4YYkoXHf75b357MqZANz4zhz+97/3AHjyruG8/8gCADrdNI2Jb4k3+hQ9Cyl9+bts+0lIOb3vvRT/nCkAuJq05MhhotLa97OWUdJaZDIlnzcWgBRf9LpTbBr8/DEAC5NOIEPPP7K6Evlxh3ioOzGpzNh/jzcAwEWtknENEoUHf/3xG75ZmwfAZe6odJJvTaVXG+FCWvf9SgB8GypwJIp8qnxfkD6Zwnm03aZRuVk4p+Lbt2avfn83F1YY8lBZUaQQYCLhfPF6MKGZ4YbakldOixQhGa0t3UuiXdyniGwQLMjBostjFfmVOLpF86cydGmrEAyJyqYqhIqFtBYZg6+0Aofu1CrZWYotQchjvlAYxRmRqMBbxYVU5gsY35LKPAHc+nxypS8qwQX8puJ+fh8W3aUWkWUsTjuhkLgHNRX3w9Q2y1KBGl1U0e0R55Sq7ptF5Q2EYor7RQiZcqnMRf8iqCZ
ZyoymKjFy0/6yqKo71nyMca6qLqpafCs1n8LsZKqOfc5XD5HpMF7LaVDXeyBv2YE5WIX+WrZsGbP9gQceYMKECfvsn5+fTzAYrPZzNfKZeyCeeuopysvLGT16tLGtS5cuzJo1i549e1JSUsK0adMYPHgwq1atomPHjnW8qtrRqB9w3n33XWbPns1bb71F9+7dWblyJWPHjiUzM5PLLrvM2K+qNhwOh/erF48fP55x48YZ/y4pKdnnzZdIJBKJ5J9GUxsWBBs5dseOHSQkJBjb7XZ7DUcI6vq5GuHtt99mwoQJzJ07l6ZNmxrbBwwYwIABA4x/Dx48mKOOOooZM2Ywffr02lxKnWnUDzi33347d911F+effz4APXv2ZPv27Tz66KNcdtllhpaXk5NDs2bNjOPy8vL2efo0Y7fbD/jmSiQSiURyqJCQkBDzgFMTaWlpaJq2z2zNgT5XQUxKXHnllbz//vuceOKJ+91XVVX69evHxo0bDzz4etKoH3AqKipQ1dhHV03TDJt427ZtycjIYOHChcZCJp/Px+LFi3n88cfrfL4Eq4pTUbn+wzsBmDhyoiGnrKg8yZACUgffwLpv9Yer2f/HW1c8D4Bj+lgAJtx2P3fe8yIA0z95jaeeE7lIT3eB+7OFnOX67zOsvnMeAJMuPoqzrxQLt0YeJfr99YFX8Ho6A1DQ73w2Dhe/LO1PnUDri4Q2OmXKxShthcywRm0BQPOPHiY+sz0AvXb8QNb7nwAwu2d7rksUD3WJzTvxwcrdAAxx/4gjUchcEdmjtbWc8DHHALDjoQVkbREuqswuqVh8Il9qbX4FA9sJffXVAtFX/so84lIHAsLB1SVN7Ot1WNi7bpvo++hjKdElkq2FFSRYdJmkWBQZjO+RDsuFBFeuOIyxbS8op2sz8cfpLYtKVBH8e3LRbEKKqsz14IgTck55MERzZ1TaibNGJaqgLlG5LeZCf0La8pf5sbqFRFUZDBGyu/R7FMYTjJVlKvxBw71T5gmQYshSQaz6OEOBEFZdIhQuKrvRBt1FFRSyoWI3uQq06HWGTYX+Yh1SpvsQrF6uMstSkewqRTO7qKLOqciUdriK/BQtAFiDLFWHnCnjmkwuqqr7V+eIMktJZukrpLf/imJvjUl6asiiyUZ0GZI6oCoN+72u6++vzWajT58+LFy4kLPOOsvYvnDhQkaNGlXjcW+//TZXXHEFb7/9tmEM2h/hcJiVK1fSs2fPug2wDjTqB5zTTz+dRx55hFatWtG9e3dWrFjB5MmTueKKKwAxhTZ27FgmTZpEx44d6dixI5MmTSIuLo4LL7zwHx69RCKRSCQNQ22gi6o+D0fjxo3jkksuoW/fvgwcOJAXXniBrKwsrrvuOkAs89i1axevv/46IB5uLr30UqZNm8aAAQOM2R+n00liolhHOXHiRAYMGEDHjh0pKSlh+vTprFy5kmeeeabe13YgGvUDzowZM7jvvvu4/vrrycvLIzMzk2uvvZb777/f2OeOO+6gsrKS66+/nsLCQvr378+CBQvqXAMHYMyfX5OQkMCYd8Ti3gsS7XToL9K4lxwxiDi9/kz3U+5l688/APBm5tlc8l4SABNPfVj8d9QobikX38Yvemc156SJxbF/3nwTrQddA8DNn6xjULL4pj5c20JQjxzo95BYW/TImY+TcqpYoPXo15tJ+1PMcFz/XFc2J4n6MsFwmJR2vQB4+ntRk+aSN3+k5Qn3AtDN3YYNnwoH2mYllw5DRWxDsw5tWf57NgC7y78jseXJACTqsxusWYLaVdTV2esLUrBD7NusX2uca8QU5dLthZzVXcw2eUv3ArBn9U7i24nXfaEwLRLEjEN2UxcF68VC5dbNOlCpTzn8uauEvvoMR6SPhLbNUNRdABRUBnHq72NeQQUj9PMFKsvQKvbGvHflOQXYXGJhtKfQg1NfEF01QVwzJYiHSsXMlEOfWfGWeIlLFTM15YEganwSAP5wNJ5BLDKOncEo9UW
jBAq90To4fm8QVV80HvAHjUXG4VAQi8NmtAFUm8voT7VFZ3BiE8Sjf67mWRvzjEx0ZkeNzuZosXEOVRcZB02LicMxUQ3RZPGg6YShyPlUhXC4+gTxmhYWh/az4Hifhb7G9tA+szl1zdcxr2Ewf1iYt9fnM6Qxloap6zf2xjRDJdmXfyJNfMyYMRQUFPDggw+SnZ1Njx49+Pzzz2ndWnyGZGdnk5WVZez//PPPEwgEuOGGG7jhhhuM7ZdddhmzZs0CoKioiGuuuYacnBwSExPp3bs3S5Ys4eijj673tR2IRv2AEx8fz9SpU/fx2JtRFIUJEyZUuxpcIpFIJBJJ3bn++uu5/vrrq30t8tAS4dtvvz1gf1OmTGHKlCkHYWS1p1E/4EgkEolEcjhzsFxUhyPyAcdE79u+QLU5yVn1DQDvbv6RbZXit+OVjCMI6ou9f1p0DFN+6QDAnRNmoz54CQA29REA5o+6h/53CF1x0ey5zH7xagAeGvM0j/wqIhL++8A7TLprOAC/3/kArQf8HwC5/cRi4j3eSfQa1heAT79Yy8keUaNmfNdkbv1SyFE9Ehz83rcLAIu/2wZAl9V7OPX+NgB06DecD+eIceRv+I321w4BYKCayQdviWvcvnELmecKGa6NvjC3YMliQt2ii8TKcsX50s/siXuvWMz8w8Z8bhkkSnZHpIX89QWkDo5Kg3FlQodNbpdE/nqxoDeQ3MpYzLw5p5TTdHnIXyGkI1uLI9BsQn7aXeolTq9LU1bkobmeCu73lKOVCskuIrNU5hVijRNab4k/SIp+nJCoojKPOUHcX1QEgF3f11viJbm9kMEqg2GUOFOCuC2a6u3R5ZqITFLpj9a7ial94wsaC4uDwRCaLleFfD40Q6LS5SK7OUG8+jo4ISW64Dim9o1+Q1VVi0kQj0pJsbJUsEpUg69KHRwjNTxskqv8scniIGZPw9XEM5ip6+JjTalertofVafgIzV4GrJuYX+H1rVGjrkvNWa71IYayqGaIG7mn5CoDhUO42c7iUQikUgkhypyBkcikUgkkkaKojRsMfthPIEjH3DMlOVsQbHYGXjxpQD0uOsbyvOFzPLz3cPY+eMmAH485kRumXITAA+V5HPXIx8A8MfEkQA8cOc8PrtWrAxPfftdVvW+DYDK4HTOUYVD6+Jtq3G9KeSjTx7oydhvjgRg/Gfi9ZGpTq4bJernDHj1DVo69fosc6Yw/0dRN+DqEW1xDxMlrm8YNwOAzeU+xvYRMpKmnkeOZxoAFQW7sR0jHFpnlyfxypOiHs/6rBIG9BSyTMfmQl7asXgtOaNE1ESiVcVTnA+A46gzSPtdl4+yirDnC80uInVkF1TSsVUSIOIPlN0iNTalcwZrlu4EILciKj0UF1SQoJ/TXykiI6wt2mOx/wlAVrEHly5L7dlZTAtdSgp6KwkX5QKg6snbFXm7semOq7JAiGaJQlLyhcIkOMT4bKqC4ik1xuctFO1I7RtvsRerKUFc011UwTCETS6niIsqIoGU+gJYI22P3xTPEHVOBQMhLBG5qsK3j4vKHM+gxLioog6wYDia+h1xTgF4TbEIflONGl+welnKFwgabahSBycUK0UpZumqSn0ctYrkVK2LSql+H+P6grF1cGKTwU3bq8ZE1PB/7QPV3fm7OFA0xL771/0cUuI6PFBR6hUdYj7+cEVKVBKJRCKRSA455AyORCKRSCSNFClR1R/5gGNi0Qv/xR2fQIed3wGQ8ME3aFYhJfxy70MMuTMJgHsSurHuzIcAuOrV93l+kpCaci8W8RCnTPmG7deLIn29Tr+Jq2aIiPnnB7TghyvvAyCt0yXc/Mk6AKyeAOPbi8m0h6YsA+Dx6wbRvPxPY2xD+4msrWVTviAnTjiOutw/mo5dRXr3lXrUAUC7yi0ArFRbG3KJomps1YQUNaC5lYBHSEKby32c3kNsbzlYBI7+MXc9360X/XWJsxlxAr7MnrRp+7u4H9+sJbBeyEQ2PfF
7V2WA/np8wx6rhm+T2Dela2tyPD+LfUq9Rnp36d5KkloLp1JoizhHMKk5Vt29tLWgnDS9uGLW2p0km6IOgntEMUCLXU8QL6jEkS7eq2J/iGZ6gviOMCTo7iVNgXCJkNucmoKnQNxHux5hUbKjFFt8RKIKGYX+fKEwfiX6p1LmCxj9gYhniNznMk/AuL5gMISmJ9AHfZVY3WJM4fwgFj2qIRwSfZllKSymnDRL1EXlN0VEROrumV1UiqoRMMlS5gRxbzUuquqkqGAwhKLP64ZDYTTdYyr2IQbFlCYeW7gvdsfqpJraFPqrLTUV/TPLWLVqVzOVHw4Fa/0BcbA+RxrLB1JtJTCZIP7XIqIaGnb84YqUqCQSiUQikRxyyBkciUQikUgaKVKiqj/yAcdE9qhRuDSN9zYKp9CUb78ypr+vvGUmGd37AfDmaR2Z/YVwVE3umMfK888H4JxHvgbgz48mct/xtwPw9u7+dDlpHAAD336MsV0vBuD8u49l9qwFADyV6mTHpLsAyN8gpIqMpyewYbxwamX0uoDep4lk8ccveAZvFzG+ygFjcP7wNgD2eCENdXLbKZwrAtBeb/kfusYLucPVpCVz1gpJ6ZaETWh64bqyQIh+mSL12zpMXN8Hs//gl7UiO+q0Zi7DqbS5yM9xnUW699fvfE7JH3quU3KmGLsvwHFNhZNprcPC3tXCZdXk+GMp9Av5YWNBBYm6bFO+t4DEtk3FDRaqGv74DCNBfMueclrr2VA/le4lyREtdBcsEO42i0O8XpFfiTNejLMyGCJTv+4dgNsWTRAPlYh757aoeIuEiypS6M9fXoA9OV7vI2w4p4LhMJWm/KkKf6TQX1SWaqLPA/t8Qay6RBPwhQwXVchvShD3Rwv9hUJeoGpxv6hzypw/FTA5pPzVtEWCePXF/QxHlaaZigRGHVLmBPGINBGqIZfKLEvFpHsHq3FR1ZQUXmOhv2i7Oimpun7NVN28P9mrvpLY/s4nkRxspIuq/sgHHIlEIpFIGisNnME5jJ9v5BociUQikUgkhx5yBsfEgvUF2BWVrrrUMWLegzhShaNnvCuTLUvmApDx9RxufFDkS7079EYWbBMOofhjbgZg9uUTcOruk/i3JpDQohMAb5a1JtEqtt97YmuenrBanOeOE3n/ESFXOfucA8BbWQoVHwiX1WkvdyU44gwAcjzTcDURbqfXVuVwwsvvA5De4yoA+oZ+Zs2bSwH4dugwxvQSElDTzr2Y+9MOAC5VPiM+sz0gZJv4Xb8BED76JAByvS+Qs60QgOb9MnHsFk6tpTsK6ddcOKY8xfnsWZEFQEKzYwAh63RIERJbWaKdgrWiuF+z//SgWJeo/txdQivd1eQp2UPSMSIHi6+EJJZfGcCRmArAjvxyevcRr3tL9+IMlBvvVflusb/FKeQxT6EHp1u8b55QmFRT/pQtKGQgp6YSLBTHue0WvCWVQFSi8ngCRnE/fzhM2B6RqDBcSACl3oBx7wBKPQGaV1PcL+APGsX9QoHY4n6WOCERhoNFQJXifibnFDGF/tin0J+imfOn1Ghbqzl/ytgekaKCoWqlqKBpe0wuVXhf55RlP7JUqLoCgCbnVHVyUzgUMuSqmqSkSL+1ydqJLTh4wN33Kz39nWsaavsNtKbx1jRUKa3Vj39iPYt0UdUf+YAjkUgkEkkjRaFhKtNh/HwjJSqJRCKRSCSHHvWawcnNzeW2227jq6++Ii8vz5iyjhAMNtyd8E9w/8fjSXA5UXscB8CN6UOMom2fb/mN2z8UheuOv2c+n014DoAfph1Bp/POBKD9ECFR3ffUlywdeywA79z3KZfP/giACS/8zNxzugCQ9+hY0joJ11LSDRez+s55AHQ89ngAJr//BwOLPaK/E9rz8opsAFo6rbTu2x+A17/cQOKCrQAMnNYKgG49B/PqHeJ8uzP+pNM54hzd3Rn88o3Iudqy63fS+55s9Ff63RcAhMfcDYjCdiU7RY5U83N64f6
6LQBfr8vjrK7C4RT0VZL7h5B7Us6MN+5hKkJGSmqXxF7djRZIbUOkRt3a7BKO1mUbX2khztZtANBswtGUW+YnTs+fKimspIXeDnjKUctE8UFF1SjPEX3b4roCUFbmIylR7FsZDJHujso8aoWQ2xyqQqBIP85txVMo7m9qM+FAKwuEUN1Jxj0I2aP5U55gJH8KynxBow1Q5g2gG7Xwe6OyVDAQwuYWElPIZ3JOBfwodnGe6rKowlp07CGTRBWTMxWIFOvTanRUVStLVVPoz+yWipGi/FWKAVbJotqfWyqy3VIr51TNmVPRfWL/XVWWMhxcdZSiaiM5NNSFIr9FShqCqii1kmH3d/zhSr0ecP7zn/+QlZXFfffdR7NmzWTom0QikUgkfwEKDayDc9BG8u+jXg8433//Pd999x1HHnnkQR6ORCKRSCQSScOp1wNOy5Yt95GlDgXOWNkci8NF4efC3fTyiW3JXS1kEfcDl7HoYZE15TpjNqfcL65/7gXdefZtkRn1yayhAHQ/6Wa0mU8DsPqx7szoJ6STqfd/Ruf5Irfq+Y4ncf7rtwAw6ftddNIllVPG9ALggmse51TdbZT66zu88rXIonq2XzNsJwlX1qOPvsUqXca66TjhiopXL2HzDe8AULp7M8mnXwPAhWUZLJglHFfr/8yjz1Wivx5pTrbNXw5AwfFCXnJbVCoKdotr7TuK1I3CNbRh814Sy3YZ9yt7p5CVWrUUziqbqqDlCGmrSbd0lr8n7uPeUDRbKS+njJR0If34K8uwthLXYrFvBmBrUSVuPUeqaE85zSMSlbcSpVgUKlQtNirz9BwsvcBhsT9IsyThTPKFwqQ4hbSjKaBWFgPCReUpEG17gh1vici/sieJQoeVwXBM/lSk0B9AZSBa3C+SRWU1Cv35DddcjS4qTzDGRRVxTRkSlZ6pJU4SlaUCobDhnDK3/aGoq8sfrKag3z6yVHXbI5JT9blUIZMsBfvKR1UlqprcTjUVA6xuv5rkqv1Ns9ckfdWVhk5Eq0r0HtTUV8z9rMf55Gz54Se5qDRM5jycJdJ6XfvUqVO566672LZt20EejkQikUgkkgiKojT453ClXjM4Y8aMoaKigvbt2xMXF4fVao15fe/evQdlcBKJRCKRSCT1oV4POFOnTj3Iw2gcLHv/HRTNZmQvxX0yh5OdokjcTclHc9IX5wIwbMKLLHr+JQBSv3mPo77oA4Bj+lgAOg07m7Nn/AjAw11S+f3q6wBIaDGMh1cISSm3xMu0E1uL/W+cx6KLjwCgbUIBAIHKMoZ3EwX2Vjz2OtsYDEDvW86iY29R3O6e3G1U6u6eI23ioXKT0tK4nlDAx65EIQENa6LiLRX7rC31cvaRoo/Wx7Viy6JtAPywXshxLZ1Wgj5RBC/Y7miat1kDwJ+/bCO0STiSrK5Esir8AAzuKMZZYtPwrVsGQHKX1uyqXAnAzhKf4UYrLqggsbWQtII7KwmltQHA5k4GYEtBOckpQq7ZvSmXpi6rcS2BPCGPWexOynOFnBbXRchfxf4QzXQXVW4Y3NZo/hRl4p66LQqeApGfZU+wG31EJKqyQAg1XowjGA4T1KLSWrnJOVVWTaG/SDvgD2KtprhfKODD6tKL+4UCMYX9ABRbVKIyF/rzB6NScDAULfQX2a6omlGE0Cw/qTVuVwwXlLlwn6LP5YZDYTRdbhNyFaZ2bBaVGU2NTgZrSu2zqMKhYL0kh6quqgjV5Vftr10d4VCw1nLVwfpu3Fi+ZB/O3/YbK7LQX/2p1wPOZZdddrDHIZFIJBKJpAoyTbz+1LuScTAYZM6cOaxduxZFUejWrRtnnHEGmqYd+OBGyu0P3oLD5Ta+XQ658mlaHNEbgGePa8U734uog3knORicfyEAJz74NcvfEcnhD54+CYD3tk+h96m3AjD0rUe46+jrATj7zf/x4ivfAPBQsoO8R8cCkLNKpe28RwDYNuk+ANJ7nk7/ke0AmHL5K1R2FunZwRH
3kbhcREZYXYm0d4lv+2UfvwjAa60upZM7miD+iT4r83+J240E8WJ/iGNaiggK20n9+FiPhPhutVi4e1+Gy5gp2FyuMrSriHv4ac4iSlaIRbpxqZnk67Map2eKGZn1Dgt7V4m+0gYPIN8n6vGsyy83JYgXktxB1NJhJwSSRBRDJA19Y24Z7ZqIGZXfiveQoieIh0NBgrkiGkKzOSjLqwDA2U+//kCITH2RcS4QbxfH2VSFYKG+UNyUIO5Idhh1eswJ4jhE2xcKGwuLwVz7RqFYn7lqos90VHoCOLR9E8SD3kq0RH0Gp0qCuOJ0GdcFVRLETXVwTBM4MfVuvKbFxHVNEI/MwERmMoKBUK0SxM2zLqAvLP6bEsT3t5C4tgni1W2vT42bw/kbcYS63gN5y+qPXGRcf+r1gLNp0yZOOeUUdu3aRefOnQmHw2zYsIGWLVvy2Wef0b59+4M9TolEIpFIJJJaU6+Hu5tuuon27duzY8cOfvvtN1asWEFWVhZt27blpptuOthjlEgkEonksES6qOpPvWZwFi9ezE8//URKSoqxLTU1lccee4zBgwcftMH93Vz8w1Ti7TYjQXyqNZO18z8AoPXXXzA2kiB+9MUsNiWIv3XhBABs6qMANH1vopEg/na4h7HA9pFT2zPr0ekAnHbXcCNB3NHnHN7MjgOgYraIgzjj5XsJjRSLmndUPm8kiL+8IpsTZr4OQLMjrmIQYlHvH7OWAPD50ON4tnc6AOldj+KdJdsAOE/5KCZBPCl7BQDhQaeR630VgF2bxWLclgOa49wt+vgxq5DBrcXC28rCXPKWRRLEB1GmyyGdUoU0VJFoJ/9P8XrGpdcaCeJ/7Co2EsQri3JIHiRiJfg2jz0VYsGuM1nIYNvyyjjDlCAeF6ww3p9IgrjVlYlnq1isbU4Qb+qKSjuOUPUJ4p5CfWFxgh2PR5y7pgTxSn/1CeKl+nG1SRCPLiyOTRCvusi4pgRxc+2bYDiMokUWGdc/QTwUrFoHp3YJ4lVlovokiFe3v3EPGmmC+N/9+SATxBsX//TzgVxkXH/qNYNjt9spLS3dZ3tZWRk2m62aIyQSiUQikUj+Pur1gHPaaadxzTXX8PPPPxMOhwmHw/z0009cd911nHHGGQd7jBKJRCKRHLYoDfg5nKmXRDV9+nQuu+wyBg4caBT5CwQCnHHGGUybNu2gDvDv5MknF2NDNabJf8z5g9s/ETEMA2/5iA8fFDELv844itYnnQJAt5HjuesRIWP9MXEkAK/ePY+xcz8F4N6nl7DocuHE2nXvtaT3GAGA679jjATxbicO44k3VwJwrB698NDwDjz9804A2rtstB84CIBX5q3DPX8LACc8144efUT6+As3vwfAztQVdLvoGACOcmSyZL6QvDZtX0nmcWJ8bRZZKf5KnDt08f34dCmjOEvUu2kxpjfxXwsH18I1uZzbLZognrNSOK3Szk0w7ltqSDirUjqmULBOr+OT1s5wAK3eVcxApylBvK2QyjRbKbtLRVyCW3dAFRdU0CpRtAOectRScT5zgrjd3YPiUiFBpSaLfWuTIG5PsMckiEcktpoSxCsD+yaI21SF3ZV+vS32MyeIB/zBWiWIq/u4qGqXIK4aUQ31TxAPVamDU9sE8epcVBH+6gTx6iSpxpYgbu5Xjdl+uH/MSBqClKjqT70ecJKSkpg7dy4bN25k3bp1hMNhunX7f/bOOzyKquHiv+27aZve6L0I0qUpRQQbYu9iRxFFUWzYKCp88CqiAioWUAHBBoIiAkpRmjTpvQZIIL1ustny/XFnZ2dDAimgAe7vefK8N5NpO4TX4Z57zmlOw4YNz/b9SSQSiUQikVSYSufgADRq1IhGjRqdrXuRSCQSiUSioapOqIt5BrHcLzjPPfccb775JsHBwTz33HOn3Xf8+PFVvrH/gqcf70CoxcSxNcIJdPKW6/juPRHiZ/98HTe+kAfA2pev5O23fwfgj197kHjlPLF/f9E2njT
sZybUEbLIyE1LqLXySwAmxLTm2YVvAvDMvF10iRBOmvvvaU2vO98A4E5F6gj+bSLT/moCwIyedYi6oRkAz774MRuzhMwypFt9gvSPArD/0ekA5J04RMj14s/ngbwo5n70NQBbdqZy+bM1AWhZM5QDC0SDeHKPXOxKrYGvQTyka3+i94lKgz170wnJOggIqePIEbG4vHH9CNUdpk8SreExLRLY/88/AJwo9v9qpZ7IIzxBhPc587Mx178EAJNtPwcyhEsqVJGa0o7nqBJVsSMPXZYIONQbzRQki7ElNJxsxeGkbRCPCtI0iCsSVYhRj+OkGFvCzDgUicoaFUaeT4KyR6nn0DaIF7pLbxDPLRQSVWkN4m6XR5Wr3AVOTMH+1nCd1S9Lneqi8tdClKdBvLC0QL+SQX8udynbdXiV05ypQVzrrgJKD/orxe1UcntpDeIBIX4XSIN4+fav2Pkv5v84abnYGsS1SImq8pT7BWfTpk0UFxerY4lEIpFIJJLqSrldVEuXLiU8PFwdn+5LIpFIJBJJ1amKg6oqTqrJkydTr149rFYr7dq1488//zzt/suXL6ddu3ZYrVbq16/Pxx9/fMo+P/zwA82bN8disdC8eXPmzJlTybsrH5Vag/Pwww/z/vvvExoaGrA9Pz+fwYMH88UXX5yVm/u3WXzbcGzBoVz3umjHfje2Jat6Pg/AG/N/YdQrHwKw9+3/4+ZvrwbgcP+b6HjXKwDc8tYfAHxzdX1+v+FJAOLaD+beWULCiXcU83594RpKHL+U0S/3BiAqc516D9d0FyF4q0Z8R1KEcEO1/N9AWjQTYXsDM1PUKf2m+btYYxCOJJ9cpNMb2K0XQXk9aupwFQpZbU9eEXe2EdsTejVgw7finpZtT6FNsL/xGqCwVjuaNdkKwIrftuDaJjq4LKGRaoP45Y1iSDOLX5+iHX8DEH1pQ44UiM9yMLOQEKN4f846mU9kIxEK6dnrxB0hQgvNoRHsPinuLy5GBB0e3naYmGB/UF7x8UOAaBDPTRb72lpZVImqpiJtJXkhzOzvn/JmKeF+Rr3aIG6LsJJ1WIyt4aE4FGlHrwT9OT1enHq/m8kX7mfQQbbqnNKRpwT9+Z55cZEbs9J67nY6MIUI+clzwonRKqSnUxrENZKU+L70BvFijURV5PL4patSZKmSDeJa55S2QdytHotyb6U3iPv2h8AuKh9lNYgH7lO606qilNUeDhVvEC/NISUbxM+M7J/6b9DrdFWS6Cpz7OzZsxkyZAiTJ0+ma9eufPLJJ1x77bXs2LGD2rVrn7L/wYMHue666xgwYADTp09n5cqVDBo0iJiYGG699VYAVq9ezZ133smbb77JzTffzJw5c7jjjjv466+/6NixY6U/3+moVA7Ol19+icPhOGW7w+Hgq6++qvJNSSQSiUQi8beJV+WroowfP55HHnmERx99lGbNmjFhwgRq1arFRx99VOr+H3/8MbVr12bChAk0a9aMRx99lIcffph33nlH3WfChAn07t2bYcOG0bRpU4YNG0avXr2YMGFCJZ/MmanQC05OTg7Z2dl4vV5yc3PJyclRvzIzM1mwYAGxsbHn6l4lEolEIpFUAu1/r3NycigqKip1P6fTyYYNG+jTp0/A9j59+rBq1apSj1m9evUp+1999dWsX79eXbtb1j5lnfNsUCGJKjw8XLWsNW7c+JSf63Q6Ro4cedZu7t/m1effRWcw81bdFgBsevt6vnxL9EU9mTSDky8/BsBdL3zN8eWzAXgqsTeLZrQDRC8VwKV/zODpyE4AjPz6Sp59UWiR3zaMZOuTYp+swzUJniGCAzfceB0NrhgKQJvbhWw17PLncF0qJJmjjfoQO/c9AIJjatE2XEgdRz/7iPebPAHAjYpUM6NmYz7/W7jAxoRuxBIqpCGH28tlseKP23tdT778TLio1m1N4c4GQv4yGMQ5tpws4Krmoovql2n7SV+brly7JWlK4F2/mna2BonzndywG4C
at99KmlO4uXak5hGmSFT5aSlENk4UD3mvl4IgERxoDYthV7KQjBrHCblzRU4qkVaD+mdSnCLkMVNQGPnHRY9UcJiFfEVmqae4qJIAu0UcF9A/ZdTjSBdBhBa7haIc8ZfaEmnHoUhBXpsILXR7vQH9U9mF/v6pbEWai9frKPL1UpkUucjpd1F5ip0YVFnKjVFxUXk82eg04X5eU6BEpQ36C+if8t8OhRqJqriMQD+nppeqSOOc8slSwvWjHKvTOKdK6aXylAj38wX4aV1RZYX+nXFcSoif1+MOdFep93GqrFVSLilL+jqTJFYe2eXfcKFUaipdg5SDLlx0Xi86r/fMO57meIBatWoFbB8+fDgjRow4Zf+0tDTcbjdxcXEB2+Pi4khJSSn1GikpKaXu73K5SEtLIyEhocx9yjrn2aBCLzhLly7F6/Vy5ZVX8sMPPwSUbZrNZurUqUNiYuJZv0mJRCKRSC5KvB7UbIfKHg8kJSURFuZPoLdYLGUdAZy6Nsvr9Z52vVZp+5fcXtFzVpUKveB07y5qAQ4ePEjt2rVlRoNEIpFIJOcBYWFhAS84ZREdHY3BYDhlZuXkyZOnzMD4iI+PL3V/o9FIVFTUafcp65xng3K/4GzZsoUWLVqg1+vJzs5m69atZe576aWXnpWb+7e59LqbMViD2btqNQDfXP4czy25EoDXu7/AmEUitfmT1CSunyMkkIFxwWy9/SYAGvQQ8lP/n47QN0pIJ/dY9/JkvpBIekwZysirRaBf9E0388y8XQAk/nmEV8a1BmBvgvgFdHq8RDfuAMCoRXt58L1fxTWuH8Fl8RsB2PLlev7uK+So4dcKN1WdBi1YvPIwAIMKfiai/g0A2Nf8iG6TOIe+w/WkFolQwuQ9h6nbS8iNwTvEFOaiPanccWkCAIXZqSSvFd1X4U2uV3urGkRYyEgUslLqtmMA1Bp6idrvtOlIFh0V2caRmUJEszoA6H5NIiVfyD3BkVEcTxWyU29FEivOz8ZckK7+meQdSwXAHNyYgjSxsD0ozKI6oOJD/P8KMRZmAaJ/yp0u/iIF24w40sU1QhLs5Cnykj40nEIlOM9jFiGEbq+/fwog1+nvOspUXFT1DDqKi5TOJZumf0rrovLJUi4nBkX+8roz0GtcVFpJSnyv7Z/yb3d5vOgMfllKpziX1KA/Q8lwv1NdVDpdoIvKq3FXAbjdntJ7qbyBEtWZuqg8pQQAluWc8ocMegLkqrLcUqcP/dMeX3ovVUX/LVZd/+1WUbnsYg55qyzV7c9e5/Wgq8IMTkWPNZvNtGvXjsWLF3PzzTer2xcvXsyNN95Y6jGdO3dm/vz5AdsWLVpE+/bt1b7Kzp07s3jxYp599tmAfbp06VKh+6sI5X7Bad26NSkpKcTGxtK6dWvxf5ql6II6nQ63u3JWUIlEIpFIJBrOkkRVEZ577jn69+9P+/bt6dy5M1OmTOHIkSMMHDgQgGHDhnHs2DHVNT1w4EAmTpzIc889x4ABA1i9ejWff/4533zzjXrOZ555hm7dujF27FhuvPFGfvrpJ5YsWcJff/1V+c92Bsr9gnPw4EFiYmLUsUQikUgkkguPO++8k/T0dEaNGkVycjItWrRgwYIF1KkjZuKTk5M5cuSIun+9evVYsGABzz77LJMmTSIxMZEPPvhAzcAB6NKlC7NmzeK1117j9ddfp0GDBsyePfucZeBABV5wfB+s5PhC4rfehYSF6Nny0NMAdL3lZdY++TAAN9qtTLpBOMSe/34eY1+bAMCvK6cyuNEdAMyb1hOAdne8xZfTxTTc0jtfptXdYwBYU6s1Ts/rANx1zxVMnyYcWrc73dwRKzqerv9RyEsDY4I41rsVAL//uoW624VUM+jjZjQsfhCAL2e9QNquNQA0GHUXALd66/G//30LwO6tu6j/nAgcbPGzhWM//QKAs8l1qhSQc2wP8Q91AyAiTywa/31LMi91FC+zXo+blH+EHBffJ1w9zpZxgJhLRCD
ikb+OAuCw11Sf5e6kLG4OE/KRsyAHc30RWmi0ZXJI6YMKjbSRky5kp7oRIujPmZ+DMcffP5Xvk6hCO5Gp9EElRNpUB1RCqLhGyf6pojTRBWaNsKr9U1HNaqoSmt4epcptXqs/sNLh8qjny1E6p6x6HXlq/5QOp89FpchSrmJ3Gf1TngDnVICLqkTQn0vjo3GVCPfTq2P3Kb1U5emf0ut1uF1+icqjzLzqFZebs8gdGPSn6aIqywXl+94nPxnL5ZyqaG/TqedRHVxVKh8s43rl9CLpAxZNao/XXqOa6RznGRdz99QpeL3iqyrHV4JBgwYxaNCgUn82bdq0U7Z1796djRs3nvact912G7fddlul7qcyVDro75dfflG/f/HFFwkPD6dLly4cPnz4rN0cwLFjx7jvvvuIiooiKCiI1q1bs2HDBvXnXq+XESNGkJiYiM1mo0ePHmzfvv2s3oNEIpFIJP8JPomqKl8XKZV6wRk9ejQ2m1hEu3r1aiZOnMi4ceOIjo4OWEBUVTIzM+natSsmk4lff/2VHTt28O6776qdWADjxo1j/PjxTJw4kXXr1hEfH0/v3r3Jzc09a/chkUgkEonk/KJSXVRJSUk0bNgQgLlz53Lbbbfx2GOP0bVrV3r06HHWbm7s2LHUqlWLqVOnqtvq1q2rjr1eLxMmTODVV1/llltuAcTsUlxcHDNnzuTxxx+v0PVGXz0Mi07PrX3qA1Dv8sHMeV+E9M3Y8hMj610DwKvupXzWsC0AD681coUSvGeb9BwAblcIi2r3A+C3naP5ZqDQGPuNW8GUK4RTqe1VdZg4QvRBXRUbzL7hwwDYmNoVgMtf6kOz65sB0GrSZ6QooXPDmkWQ4RL3kV08lKJcIcU4LxOf/26HmzeS9wOwIa2A+7qK59XsskT2/bIDgB03p1HLJuSVotwMDG1EuGCN7WJt1dG96Rj2i34pg9nGnlwRjtetWSzFSride9ffxLYRzq3lC8T16mYXY1akhPTkPLV/ypWRh75WUwDMQVvYmyFcTfaoIA5sOwFALbuQbNxOB64TYhbQYLGRezQLgKA6YWQ4xb9E6kQFk6nIS75wP4NOhy43DRASVcFJIVdZI6wUKhKVNSqMPEWqMWgkKrdFuKgA8pVrmPU6cor8QX9ZBT6JSo9LcVf5ZCmXswhTsLh/T44TY5DfReXrn/J63OgsQep1vMZAF5VWlvKF+MGp4X6+sVaWKq1/Sqc34PJJVEa9aggwGHV4lO1G5c9S2z/l9XgxKNJVyaA/X/eU6pDSle6WMuh1qpSkJUBy0nZDaWSswH1OOUWplKd/qjTK2z9V2i4Vk9o05yrP9f4FeUZKaOcPIuivKi6qKshb5zmVmsEJCQkhPV1YeRctWsRVV10FgNVqLbWjqrLMmzeP9u3bc/vttxMbG0ubNm349NNP1Z8fPHiQlJSUgPhni8VC9+7dTxv/XFRUdEpstUQikUgk1Q4pUVWaSs3g9O7dm0cffZQ2bdqwZ88err/+egC2b98eMMNSVQ4cOMBHH33Ec889xyuvvMLff//N008/jcVi4f7771dDg0qLfz7dWqAxY8aUWinRpUYowXoDn/68F4DN6e1pqyy+7PrpIZZM6Q/AhNvH8/Vm8QLVt/+bfDzrBQBG3TAagCs/nM7gd0Sz+HMhFkJnjgBg79J8Os76HwDHxz2n5tz0eCyWqS/+CEB2QzHrEfboeCL2iWsYzDZ1xsU1730+jxWLmhsEm7FFxAPw3Q6xGPfBiBT0yuzA8UIXjzURC4GtN3Xk0+e+B+CXv5N4IVJIjDq9gcOIqoYrW4iX0/cX/0XeOjHLEhSVyHFl9qh33Uj2KrMWWRs3Yr+0JQAnihYAsPVkLnaTeF45aZlENVEWKq9044qsK+7DHsP2Y0o9Q2IYO/7cDEC0kinj9bhxHRczSSZrsNogHtzSQnax+Fdz7aggMpU/M7tFXM9m0OFOTxbbTHoKM0T2kC3CSravQTzKrubnEByhzuA
E1DMUiZkag85fz2DX6yhQnoHVoFdzcMwhSgt7sROT3eofB4tn6/HkotcuLNZk33iNmmZxSsm+0SwmVsfuU2dwDEZzqbM5BqMejy/vRqdTxzrtuBz1DNrFvWfKwSlr7J/x8X/G0mZatNcrScnN+tMsWC41d6eCZQYyP6Zyz0A+tnPAf2ATv1Co1AzOpEmT6Ny5M6mpqfzwww9qUuGGDRu4++67z9rNeTwe2rZty+jRo2nTpg2PP/44AwYMOKXRtKLxz8OGDSM7O1v9SkpKOmv3LJFIJBKJ5L+nUjM44eHhTJw48ZTtZ7toMyEhgebNmwdsa9asGT/88AMgop9BFH0lJCSo+5wp/tlisZyxh0MikUgkkv8cOYNTaSr1ggOQlZXF559/zs6dO9HpdDRr1oxHHnkEu91+1m6ua9eu7N69O2Dbnj171ByeevXqER8fz+LFi2nTpg0gqt6XL1/O2LFjK3y9JksXEhoaRtNxIjtmZqNebDi4EoDQ7s8z8+Ph4hqe6bSY+xYAwbG1mB0l1gCZ9SLv5vu7mxH6kVgrdPcb1zDr9Z8BsLS7lWmZ4qWs6P0/uX2qyNsxXv0gewbNAiAkri4Ak/5Jp/fkDwCo0XYAPQybANgwYQHf9hQLnD9qF09CS7GA+fMl+wC41vsD4bXF4mTzxt+IPfEPAN7et3GoYCYAh3emUv8qcR1bUhzLDoqFyn0aC0lpTPpxji4/BEB47YfV7JhL44LxRAhpJWX9PqJvfwiADGXR7cYjWdS2iF+pgvRjRHUUi7VZeZITDiGLBEUlsOuokI9u6lCTwmwhrYV6CtQ/h/wjovrBFJxI/j6lZiHcSqEircSHWNis7GvzOsX/GvRqPUOIxYgjTbjobFHBOPLFPsbwSPUcXmsoSpQOBWU0iPsWFscb9BQ5xHajzYhLkcoC6xmELOX1uDEGKWN3FjqrZmGxtkFcU80Apy4y9tUzFLk8aj1Dkcutbnc4NXk3vqZwk0au0unw+JrF9f6FxXpNs/iZ6hnAn29TmnxUnnqGksd4Pf6cId95S8OjaRYvjfLUM5RFWdLLv73utjzT56eTicr6kZTXKk61XnPt9YBHvuBUhkpJVOvXr6dBgwa89957ZGRkkJaWxnvvvUeDBg3OGPRTEZ599lnWrFnD6NGj2bdvHzNnzmTKlCk8+aR4AdHpdAwZMoTRo0czZ84ctm3bxoMPPkhQUBD33HPPWbsPiUQikUgk5xeVmsF59tln6devH59++ilGo2KVdbl49NFHGTJkCCtWrDgrN9ehQwfmzJnDsGHDGDVqFPXq1WPChAnce++96j4vvvgiDoeDQYMGkZmZSceOHVm0aBGhoaGnObNEIpFIJNWff7ts80KiUi8469evD3i5ATAajbz44ou0b9/+rN0cQN++fenbt2+ZP9fpdIwYMYIRI0ZU+VrdBnyEzmRl/JtCfjr0QU+WXNIDgC7PfchLrwrZ6fj0gYy5Vyx0nvT3nzw98jsANr1xNQC7HrmbOl0eA6Dw4d5se+EnADrcch1vThH5MldnF/J+n7oADP11L60UF862nlcA8Mm3WzH8IvJl7pzdlDZXihbXcfd9zOFQ4a5qPeharjLWA+D7mUvFtQ+sp87NNwHQeImZ1LlC+nIN+D9Vksk88A91HhfVCfZfa/LjRiEJ3fOAkL7cTgfH1glHUvzDEerziS46SWzLWABObj1JcbyQwnzn3XAwg8uDlHyd7DSCmohWeaO1gENZIosmLCqILKVBvEFkMM4C4XAyZIl70OkN5B4R2TjWsFZkKs3j0RE21QFVM8zvQDLkCYkr2KDHmSoqJWwRVgqUCoj42rFkKxKUPixKPYdHU8+Qr61n0GTfHHf46xmKNfUMPonK1ybucToxqg3ixehsYtG91+MOdFFpnFNufaBE5QxwSHnVeoby5OCUNtbrA91SvnoGnV6HV3wsTaN36fUMJWWpki6qsuoZtGizcvRl7uMflyZLed3uSlUzaA+paj1DWee
V9QxnD1nPUAZyDU6lqZREFRYWFlC05SMpKUnOnEgkEolEIvnPqdQLzp133skjjzzC7NmzSUpK4ujRo8yaNYtHH330rNrEJRKJRCK5qPGVbVbl6yKlUhLVO++8g16v5/7778flElP3JpOJJ554gv/7v/87qzf4b2IOjUBvsvHEUyKML3vZOwztIrq1/rjBQvRvIqjt49ibseo/AeC6rZ9x/5HjAKRNFM6taa+35st9oqH7zo/XMqqpkCxa9W9DzR6DAegQYSV1nKh2mLuvE88/LtxQre4Usk6vO99gp1KRMPryOngMYmF1SuFECtLF9QzXv8egPDGt+9n/7QRg7Z4MbusuZKs2i2LZ/YMoJt13ZTrxSkifI/ME5i5iHVPN/ckc3C0qDqxJYl+90czeE0JG6tgsFpMS3ufds5a4tnUB2PBXErVzfaF44vmlp+QS1UBIWs78bIz1xWcxB+9hV5pSzxAdxNG9IgW7XrgNd5GQkjwnlXoGs43cY6Kd3BpjVx1adaKCKVIkl6ggo1oJoc8X5ypZz1CQJs4r6hkUCSoiRg3381j9bj9f0J9BpyNLaQ036XRkFfgcWv4GcVOwCZdT2SdY/D5oG8Q9Lic6q6ZBXBlDoIuqWFOpoP0eoMjtbw0vcvslKkexZnsZDeJurSylaId6vQ6vst1g0FPkEZ9Fbzhz0J/ZaFDHBl2gRFVWPYNWrqpKPUNZslfJYypSzwDlrEso5VqVrWcoD1LekpSJlKgqTYVecAoKCnjhhReYO3cuxcXF3HTTTTz11FPY7XYaNmxIUFDQmU8ikUgkEolEco6p0AvO8OHDmTZtGvfeey82m42ZM2fi8Xj47rvvztX9SSQSiURy0SLLNitPhV5wfvzxRz7//HPuuusuAO699166du2K2+3GoASQnc9snHwvYWFhtHpOdCtdszaKmWOEg2tyuweYvEL0Sz32zAckf/4AABMe+ZKObwp31S1viZ8/aDZy6cpJAPzz0wl6znwbgMwvXsOuhPBdd4eNuWN/B+BkPTeJ74t9amfsAsR0uE9Ssv3+CZ/H3QRALZsJq10E8s3Zn8ft4Wnq/gCHCop5paUIEwy64zJmvCZCBuetOswz4f5m62NW0WresxV88rHYJ2+lCMcLikokSXEQXdk4hv1W4fjJXr+WqA6tATheuJRtJ0RPlF1ppc5MySK2pUiQ9q5x44oWUpnVHsNWJdyvQUIYe9aIVvPYYCMel5CBXMeEY8xkCyEnSTirQhpZVQdU/dhgtil/T8MtBlWicqUeU+5Bj0ORqGwRVnKOis9ijbKrEpU+NNIf7ufy/6XPUD6rWa8jI0/cj92gI1cJ+gs2GnAp92G0GVVZzRwtZiw9x7X9U/llOqfQdFE53YH/p6MN+gtwTpXon9Kfpk1cr9fh9slxBk0XlcG/3WgyBIT+iXv2YjT6m8LL6p8qLeivrHF5+6dKHqtFX8qxVemfOpPEdK4C8qqL+lQeGUz2T1VDpERVaSq0yDgpKYkrrrhC/f6yyy7DaDRy/Pjxs35jEolEIpFc9Mg28UpToRcct9uN2WwO2GY0GtWFxhKJRCKRSCTVgQpJVF6vlwcffDCgqLKwsJCBAwcSHOx3i/z4449n7w7/RRZc0o0gvYHth9cBENZ1MAs/EwWiacN/5e4/JwBgDo3ip6b3AVDomcqige0ACL38GQAefr0PXz8pep9MrW/kG28LAByjhnDnZ9+Kfa9txOZXhFwVHFOLKXuFHNJj4mgAanZ4hF7GrQCsH/s9n/VsAsCktvF83aoLABMX7qGnTjxrn/Rl/mcJNTO2iQ/U9272Pyt+fmjHSRr2FpKR7Wg8i/cL99F1zWJ5N+UQAEeXimyjiLoPqtJQu4QQDJFCZjm+ehctbuoPQIbzHdYeEpJQDaV/Kj/1CJFXi2uwJp0TToPy+Wqy85iQqPq1q8EcpX/K7vX3T+UdPKI820TyD4vtIeFW8pVgvuahVpRPRTBOQhRJxe2TqKxGCk6Ka9i
ig9T+KVNUtNo/5bHZVRdVefqnfM4po82oCfoz43YKiUqVpVxOTCGKLOXOKbN/ymv0j12luKjK0z+lN4p/YJTWP6U36MvVP6UN/YPy908ZVEmr8v1TJWUi7fdn6p/SIvunyn+MpHSqi3R4RqREVWkq9ILzwAMPnLLtvvvuO2s3I5FIJBKJxI+saqg8FXrBmTp16rm6D4lEIpFIJJKzRqWC/i5UDhe4sOo8/Fr/MgCuGvEpg4dOBODkD0MZfpMIAJy2eTUPvfwVADvH3cS2u24BoIES4ucY0JNtr7YEoPs9N/HaRFE+em1WIe/3rgnA4J930zFCSD87r+7J+9P/AcA5by8A9/94Ce2vE43oY29/j0PBon+qzbM3cp2xAQDTpy1i6541ANS/W3RVNVtqIWXWlwC4Bv1PlWTS9myg7pPdAYj4tRbfrRdhenMfbqdKLklrhNyT8Eik+kyiC1OIby2cUSn/nKCJ0j/l9HjZfjADgG4hQjYpyk4jpEUbAIzWtRzM9PdPZSiOqwaRwRTlCWnLkHVMlWi0/VMZOSLgMDEmWO2Oqm23qfdkyEslWNEltP1T+SeFtJXQvq4qsRkiYv39UzZ/uF++y6M6fDIL/S6qpHxxbZtBh9Nxav+UJcyMp1DIX1Xpn/K5qCraP3W6Lqry9k/5Qv8q2j9lPI2LKmD7OeifKo8MBeemf6rkeWT/1NlD9k+VA49HfFXl+IsU+YIjkUgkEkl1pap1CxdxDk6luqgkEolEIpFIqjNyBkfDc+u+Iiw0hFca3QzAnKYHqRsjAvGed3TlkqD3AWg3dxT5J4VU8M+1rzNnSAcA5qb0AODW9/7ig441AGh/fxvsH4kgwJ4xQSS9MhCAecldGflCLwAuu6cNHfq9BMAeJWjuvSvqkOd8CICUwv+p/VNc9xpP5Qu9YdKonaw6IOSeh3o3AqDNX4ns+lZ0Su3olUotm5BFHJkpmC4fAECdQ4c5sF1IO+Z9KzGYhfyza7/oi7qiRRw6s/h8nh0rie8ozr1m2RFqZfvlnJNHRSBfVFMhaTmzszE2EP1TltA9bD0pwvYi40I4tENcr2GkDZdDyFXu4/vUa+co7fRBieGkFgl5o35MCDmKnBJp84f76XNPYlf6sfKThRvMFm3DoUhitthwjUSl6Z+y+Jvu84o86vmylKA/q8ZFFWLUB/RPFRcWKmMLrhwl6C9UCfor2T9l819H66Jyuv2ykrZ7CqCguHz9U36Jyt8/5XJpHFKK9GUw+l1URpOBQpf4XHqDppdKcaKVt39Kuw/8u/1TJaWM8vZP+e71Yu2fkhLaBYB0UVUa+YIjkUgkEkk1RbqoKo+UqCQSiUQikVxwyBkcDV0+OYbBEsT6eS8D8NY1b7Dk4CYA2vV7kbzf3wXg5U6DufvrHwC4/7XveVPpeAqeNBSAHQuh8/wvADj80oPEt7oegL4v1mfCI8LhlNEkBPv0yQBErZ+LwSKkmgbBiiPpq1GMryEyhpqFWghWpLIv/knmCfthAPRGM8eVkLqnmot+KtM93Zj0xAwAfl5xiNdrCrlEpzewH+H06dvezZjFwtmV+Uc2IfF1AUjaKGSMG5rGsjtISFvpq1YT1Um4ypIcC9lwXMhSkWYDWckisC+ulZDjvMvcOGOEnGWLiGfT4Sxx/zXsbF++EYD4YKM67e86sgdzcBgAWYfFeUNb2MhWHEt1o4PZpCg5kTYjNkXLcJ1IUiWqghQhUQVFB5F1QFzPFhuh9k8REqVKVPma/qnMwmJVPvH1T8UY9OQpEpXVoKdYkcrMIWa1f8pkt+IpVkIE1f6pXPTB4jkL55Q23M/votJkC57iotLKUlopyuF0Y/CF+7k86E1mdR8QMpPPOWUw6NXOKZ0u0FHlk6W0TqvSnFOn6586UxdVefqnygoD9O9z6rFV6Z86Exd6/1R5kP1T1RwpUVUa+YIjkUgkEkl1xeut4gvOxeuiki84Gg6t+QOd0cy9ze8F4I4wC65n7gSg1mWP8fj
eRAC6hJp5qL3ISwl6dzP3znoBgLf7iZqFiOufZshq8fPoKesZ8ec4AI7WDud44Wdie+MODJkvmsP7v/chja98DYBrau0GYOWbP/PdDe0BmHtDQ76q3xmAz+bv5Nr8r5RzXIN9068AhGwRjeC6qx/gUME08Xm2HKBxv0vEz3fU5YdtKQDcfEk8r6UmAXB48SGi6ncDUGc9WscHU5woZiSO/rWb2IeHAJBd7ObPfaK9vL3VSH6qWBgc07uxuPaKYxzNETMgobEJ7EnKAuC+bvWZrtQzBBVmqM87+8BRrHZxbN4OsfA4NMKm1jPUtlvZpOxrK85Vs29cJ48Spsww5Z/0NaAHk6PMvhgjYij0+LNvfMXdeU7//0lkOIrVRcbpSq1DbYMOpzIjZgoxqfUMljCz2npuDgtWx8YQMfvkdWeht/qrSrwm/6yNdjZHO2ujVjUYtNk3ev/YcGr2jXbBsVM7U6OpYdDO1JRVz+D1njqDYymlTVy7j6esRcalzMiIrJ1T6xkMpfyz/3T1DNrsG4Ou5OyRf1xm3s0FUM9wuluV9QwV53yaWVPxuqECi91LPf4iRa7BkUgkEolEcsEhZ3AkEolEIqmmeD0edUa0ssdfrMgXHA1TJw0lKCSU2+59BYCZ+/7iGXtbADblX0N092cBmLLqU37s+ggAV478lNlRYgGwQTcGgNee78frw6cB8GCxmweson7h8s9gVBOx0Nd41+XMnv4HABEbknnzC3Gd5rphAHx6yf0c2/A7AE2nPMMTRaJN/PlXPmXTVtEy3vrNZ+i0WCx0PTBVtJfnD++jSi+Zh7dR46mbAIh1m/h5rZClXmgVpMoGR9Ydp/71YoGy77jQE9upcVkCAHsXHiA7VHw+txe2HxAS0+3xwRRmC7nKdsnV4vjgPHamiSyd8JhgMk+IcfOYEJz5YhGxMeuo2oidczAZa4SQx9IUOahObAgORcqpbbepsoYh94TaIO5IPoEtWmTQ5CvXiGpWU82+0UfEqucIqGco9mffZDqKsSrj7AIhOdkMOjX7RlvPYA4248r1Z994lEwZnaaSQacsli65yNilTJLq9IaAHJyCYl+VgW9hsVtTz+BRn5HD6Q7IvikpURmMetwuf66NT5bSG/U4lUXSBoPen32jHZe2yLjE2FjeRcbaBcRlZN9oCcjB0S5wrqKGUJV6Bu2iZu15Sk5z6wIWTlf0/s5HjeTsIusZKoinihJVVY49z5ESlUQikUgkkgsO+YIjkUgkEkl1xTeDU5Wvc0RmZib9+/fHbrdjt9vp378/WVlZZe5fXFzMSy+9RMuWLQkODiYxMZH777+f48ePB+zXo0cPdDpdwNddd91V4fuTEpWGmm8NJMRkpPN9QiZqMWw531wrmrvXdOhGTNsnAbhjlZn4VCGNzLshgppP/wjApjeEVBNj3c0Linxz91X1WHqnyNXZEnE53T8TlQxXtKnPR6PeA8Dh9nKdRchHa/TNgUD3yZ64TtwXJqSJQenHWZkuWrNfvKoxCVuES2rDt9sAWHbNYdqECYmkOD8bV2uRwdNm71ZW/LYFAM/6/VjtQpbauquIG9uIHJs0i/h1cKxdSI1urQH46dudRKQKeSbEqOdkUjYAsS1jce0VzidvbdGcbrFvZYPinKpZM4x1K/YAUNtuwVUo9nUe2I7JFgJA1sHdhLQSMo+vnqFRXAhJpdQzeNOOEanUR+QfSyMoWql4SBIuquD4KNUFZoyKV7NvnEZ/C3lWYbG/QbzAiU2x4qQrOTghRr3aIG4Js1BcKJ6zOcyGO03JwQm24fWIffRB/uwbndl/Ha85SB37nFOAek86vUHNsSkr+0av2a7X5OAYjOLPSFvP4HdLUaoUpd2u0+TgmI3+f98Y9JpxGfUMWilJHKNpDS+H7FBWPUNZ+5yujiHgZyXkp8rWM1SV6qK8lEcGk9k35w9et1v9O1fZ488V99xzD0ePHmXhwoUAPPbYY/Tv35/58+eXun9BQQEbN27k9dd
fp1WrVmRmZjJkyBD69evH+vXrA/YdMGAAo0aNUr+32WwlT3dG5AuORCKRSCSSCrFz504WLlzImjVr6NixIwCffvopnTt3Zvfu3TRp0uSUY+x2O4sXLw7Y9uGHH3LZZZdx5MgRateurW4PCgoiPj6+SvcoJSqJRCKRSKorHk/Vv4CcnJyAr6Kioird1urVq7Hb7erLDUCnTp2w2+2sWrWq3OfJzs5Gp9MRHh4esH3GjBlER0dzySWX8Pzzz5Obm1vhe5QzOBq+mL8XM3p+f000R4d+u4TIX+YCMDW2Jb/9fgsAba4fytqbxdvpkm73kBUp/oDTJo4F4J/O3bhssGgeb3t3As/UFNKVt6ObFbE9AWg6dgjRjUUL+bWZG9j+6hsAvNRWhAa+WiecHy7pCsArP+9gRvASAIJjaqkOoU5BWfCAqHP4+JMhAKxYdYRHOgvJyZRjZ8nBLADual+Ln6YIp9WxX5Ox17oGgBMrXNxdT7SBbwsTUsjRpRtpOORp8fOir1l5WDin4ixGso8fBCC+c0PYK2S6THOUcm+12XhYtJt3ahDFspMiCDDe6pdpCg/txxwSAUDOwRzsvYSck6O0Y18SHUySsm+EGdU55TpxxC9RpaQTEiukrZTNoqXcEhutSlSe4EjcSphdYLifS5W8TuYUUV/RSQp9beJWY4CLylfPYE7wh/uZwoJwFwv5Ua9xTnm0EpU23E8jSykfEZ3BQGEJiarQ7a9h0DqnHMVuNfRPyFjinn2yVEmHlG+72WYKaA1X99fISkateymgTdzXMu4flwwA9J1LHZeoWNBKV9owQE8pklZpcklF6hm0lCfc798IyKvKvxylFHR2qS7SYaXxeKroohJ/92vVqhWwefjw4YwYMaLSp01JSSE2NvaU7bGxsaSkpJTrHIWFhbz88svcc889hIWFqdvvvfde6tWrR3x8PNu2bWPYsGFs3rz5lNmfMyFfcCQSiUQiucBJSkoKeImwWCyl7jdixAhGjhx52nOtW7cOKH29l9frLdc6sOLiYu666y48Hg+TJ08O+NmAAQPUcYsWLWjUqBHt27dn48aNtG3b9ozn9iFfcCQSiUQiqaZ4Pe4zzlye6XiAsLCwgBecsnjqqafO6FiqW7cuW7Zs4cSJE6f8LDU1lbi4uNMeX1xczB133MHBgwf5448/znhfbdu2xWQysXfvXvmCU1lGTLidMJuF0V2eAmD80iVcMfgbAFa+0J3C1+4HIKHN3dR67wYAPg5rTsdB4pfhlrdEcF+vw9n8/ISQrUYsP0TjEPGmfOn1/RjyyVoAnvzsT26fKmSgbjelqy3j2zP/BKDr8BvpbW4BwK9z1rDh0AIA6vd5g2ZrZgGQ9c1EvA+/Bfh7pI5vW0+zh4QMFvVbHb5YdQiAr+9pRXG+cEAdXLKfGnfXBER4X12dkJWy24oFXUkrk6j7XjtAOLyW7hQy0MAQE450Yeezt26N4QfREL4vU0h6EfHhHFdcVi0vr4czT5zXkHlElVwy9yRhi+gBQMZ6BzFKYJ/v/uuG+6UeQ06K2j/lTD5KUKT4WW5yHrGtxHRrhnOf2DcqgUJFkvEERQT0T/nkk3SNc+pEvpNLFfmlSHFOmYLN/v4puwW3U3FOhfnD/QzB4Xg94i91QIO4ye+c8pTRP+WTpfTaoD+j6NQ6xUWlbRDXhP4ZlPt3l+qi0uEVtxnYRaXzd1SZjfpT+qO8JaSossL9TumiKhHop+2fUo8po1ncR8lwv7L6pwwlNB/tqaoS7lfWOfUB28+NxlEZqUz2T12EeP3raCp9fAWIjo4mOjr6jPt17tyZ7Oxs/v77by677DIA1q5dS3Z2Nl26dCnzON/Lzd69e1m6dClRUVFnvNb27dspLi4mISGh/B8EuchYIpFIJJJqi28Gpypf54JmzZpxzTXXMGDAANasWcOaNWsYMGAAffv2DXBQNW3alDlz5gDgcrm47bbbWL9+PTNmzMDtdpOSkkJKSgpOp1jnuH//fka
NGsX69es5dOgQCxYs4Pbbb6dNmzZ07dq1QvcoX3AkEolEIpFUmBkzZtCyZUv69OlDnz59uPTSS/n6668D9tm9ezfZ2WJm/+jRo8ybN4+jR4/SunVrEhIS1C+f88psNvP7779z9dVX06RJE55++mn69OnDkiVLMCiGi/IiJSoNLwffiDkohC5WEdx33a9v83ySCKXb8vpYfmvRCYA/Tn7NlW8vA+CjbrVp+4RwQ4V1HQzA1XHBHB9yLwBTUi9n2xvCsXR1/8todd1QADZnFzLh6voA5BQ/S5LjMwDyThwSN3PbJF4rEHLJ9P9NZtkOIRM98WEzWm8Q8tLmz/5i+2VCLqllE1JHfmoS5qtGAFDv+AG2b0oGwNYhB4Pi9NmyL5ur2gmnlcdswLNZdF7V6iFCBteNXkxCjiJp6HUcOyCkpsSWsTgzxC+qsXknrHbhktqYLHqmouJDObBN3E/jqCCciiTmSdqF0Rfut+8wIXXE9GdKoZtmCUJ7zVSknIQQs+p0MuSkqM6p3CMnCI7z908Fxwvnl69/yhgdj0ORZNy2cHzkFLnV86UXONX+qfS8ItWhVaS4qCxhZooLhdxmDrXiStb2T4l/XeiCwvxBcrZQ9Tpek1+W0ob4aYP+fLKUTm+gyB3oonIUu0sN/dP2T7lcHvQ+h5Pb55DS4fEFHJoMFCpSmt7gl6UMRr0a7qd1UZmN4rxej7tc4X4lXVQBDqky+qfKCvcrea7S9j9d0F9JKhvup+2fKi/6cshjAdf7F2w81b3jSvZPVYFq3EUVGRnJ9OnTT7uP1+v//8C6desGfF8atWrVYvny5Wfl/uQLjkQikUgk1RVPFdfgXMRt4lKikkgkEolEcsEhZ3A0fPfBFHQGM5/u+hWAwfE9GbtqKQAPDPmI6TWEnJI/+E527hKOo7Yrf2X7nTcB0KCHkKiuv6UPr/d6BYCsS8Ip/PwDAGr/PhlbhLDPdYiwkjZ2CADDGj1OL8Uh9EPtZgCMXnqQkWGiO8poC+F4oZCrXmwRjXfgjQCMvfMDflkiXETjmgjJxmC28Y9DSCf9r6jH87+KgMCT89Ox12wKQNKGBdzXQqxG3xpiJnnxMgASb+oHwKGCX9AfEbJUtNlA5jHhnEpoXxfvQjHd6YisT3CMiNVetVcE37WrF8nWpSIfoWaoSZ32dx7YjjlIPLusw9mEthGfNbPYTaM4IV39rfwZRNkMqnRUfHS/KlHlHUslJE6E+2XszSQoXqy897mvCI1WQ/Vyi/xTsmkFTlWiSs0pIl6x4+TmO7Ep51b7p+wWf7hfZDCeI0KWMocF4/EIuU0fGq5+Lq823M90av+UVqIqKUuV7KJyON0YfG6pYrca+lfgdGNQnofb7VGn+lUXlc7fLaXT+2UpvaZzSitLWYz6gC4pODXEr6yx1mkFFQ/306Ivcaz2vCUpbbvWIVUeZ1F1dx/9G7dX0WdQzR/ZRUN17qKq7sgXHIlEIpFIqitnKcn4YkRKVBKJRCKRSC445AyOhmsffxiTLZg2/9sOwOdX1aPhBhEhPcISRZ+NPwMwOLYbjYcOAqD7O6to/8teAOam9ABg9tEcVRZp0OMm7vxYhPsNff9Luoz8FIDremUyd6xwLy3u1YKxz4pjp4d3BmDWTzu49ajYt3aHZ6ilXNv903vob34egJTC8RzauBmAlg9dDkDEyoZ8uuYwACP7NGKQEsy3b/4uavQR0pbjRy8twsVnLmwWzZFl4v4TXxYZA9nFHhZtF10it4aYVWdXzN1t0C/ZAcD+zCLCE4Tctv9QFgD9WiXycaZwUVmzj6rPNXPnYWwR7QHI2pJHjCJL5bk8NIgQ0o5PorLkp6rhfsXJh7BHWsW+yblENIwR91d4CGOMcIHl+5xTwVH+cL/iwHA/35/F8dwiGhrFuMjhwhImnE++/ilLmD/czxIeojqnDMHheIpTAdAHhaqSiUcb7meyqmOtc8rpk6UMBo2LSu8P+iutf0ozdro8AeF+eo1cBcI
hVaxIcgajHo/iUNAb9KpboWS4X2lBfxZN/1SpLipPYCAfVDzcT6/T+eWxCob7VcaEU9FjqvqvvfIcX1GZqLpLa9WVC8q0VY1dVNUd+YIjkUgkEkk1xevxqP+QqOzxFytSopJIJBKJRHLBIWdwNHxsXkKYxUr0ciHrhMz7ibdqtQJg3r719PhcyEEjmkYx4nXR91S7x2AesQt5IniSCPEbtq8Tfz4lujj6DOpKn7vfAGDJyXy+vrc1ADZjWza/IlxNGQc2EzVtPADv54l3znZfTmfJDhGk9/BLzemyTri2NkxYwL5GohMr3mok5+geAMJvfQ6AegXJ/LlaHBfXOl/tMdq+O4PLXxayjtWkR7dlEQB1ejVlwQei/yqhWLiUDDrYvy9d/LxJFIXZQp4xX3oHVrsIHFx7NJvoGsKtdXSv2LdlbKga7uc9uhOjVUhRGXv2Ehwj7v+Yw0VTJdzP4fZQyy5kIjXcL/t4YLhfrLin3OQ8anUXQYQZTjcGRaLyhft5giLUP8ecIr+cklrgxKoXz1Qb7ud0FGMOEeGIxUVCirKEWXClKi4qTf+UPjQCr0e41fTB/lI4rYsqoHNKM/ZJUQajWdM/ZcahCf0D4Zby908FdlHplGfjcXvVED01xM+gp8gjJDa9wR/6p9frVKdVSVlKDfoz+F1RpfZSlQj3K+l2+i/D/SraP1VWuF9Z59EF3EeZt3Ga+7uQNJLKIcP9zhJSoqo08gVHIpFIJJLqireKLzhe+YIjAYY/Nh2zTs8rv/0GQLdHPuQzpe06avQA1q0LB6D7mt/YPfB2AGp3fpT7x4gKh7dvGgvAyWZubJM/BqD9n9MwB9sBaGW34pz4AgDDGgygfbiY+fm+ZmPGrhGLc4eF7QTEv/IPFYgZhPGX1YRnxQLhd+77mAULxazN6EaR/J8yA7BDlwjAfT1svPLGVABS52YQVrMxAHs2/cbdbcWsx/YQMyd+FZ8x7ure7B8jFjvrD2UAEGMxknpQLBKu0bU+3kVK9k1sEzX75q+9qbStL7JodqzYIJ6F3aQuzHXu24IlVMyqZOzNIPwSMROT5nTTQskT+tsLsUHiV9Cm/FO/+Oh+YixK9s2RE4QmilmgI38dJbiGssi42IMuXCxwVrNvXP5/LZ7Ic6rnS8kqJF6ZkcjOcxJiEdcrcriwRojn73LkAWBJCMV91LfIOBS3S8m+CfYvLPaaSs++Ka2eQczglJ5943D6Z3ag7OwbV3Fgg7jeEJiDYzQZAmZtfNk3Rs3CYm32jdloOGWRcXnGYpExAZwu+6Y0qpJ9U9rMzJnqGc7VAt2zNTFR1dOUZ5ZIZt+c/8g1OJVHrsGRSCQSiURywXFeveCMGTMGnU7HkCFD1G1er5cRI0aQmJiIzWajR48ebN++/b+7SYlEIpFIzha+oL9Kf128MzjnjUS1bt06pkyZwqWXXhqwfdy4cYwfP55p06bRuHFj3nrrLXr37s3u3bsJDQ0t42ylc3evuoQYjdTfOQWAD03xXL9NSDlPx1xOkxcmAdB1wkYumy3yYH5N6c3spCwADLpxADTqeTM3fbgagOc/nEy3N0VT+C3X5PLd22Jx77yrGjPyxV4A/GC/gi9m/wPAdcfFtet1HkKtzaIywvPjOPS3vAjA8cKJ7Fu7HoBWj3Un8k/RSP7+igMAvH1tY55JTQJg9w+7qNXnegAcc720UdbhelrGsn+huP+458eojdw/bxHN43eH+rNv4u5uh/4Pse++zCIiaggpbM+BTK6/USz6/UzJvrFpsm/St+zHFnGZGG8OzL5pFCnkqr8RuTcAYUqzdfGx/Wr2Tc7RbDX7JsNxMCD7xh0s5DFf5Ex2kfuM2TeFBcVq9k2Rwz/WZt+4naJN3BAc6c++CfHXM3gsIepn1GbfFFYw+8YnUVU0+8YnXVUl+0YrXcHps2/MmjAa/wLnM2ffeLTS1TnIvilLepHZNxcvF+yaZrnIuNKcFzM4eXl53HvvvXz66adERPjdMl6
vlwkTJvDqq69yyy230KJFC7788ksKCgqYOXPmf3jHEolEIpFI/kvOixecJ598kuuvv56rrroqYPvBgwdJSUmhT58+6jaLxUL37t1ZtWpVmecrKioiJycn4EsikUgkkuqGr2yzKl8XK9Veopo1axYbN25k3bp1p/wsJUXUCcTFxQVsj4uL4/Dhw2Wec8yYMYwcOfKU7bn/m4YnJJThTToA8FfKFjq+txKADzrVYPxIISmFdxnE84q7SjfqEV5KFhUHm964GoBbH+xGh34vASL75tv72wBQUNyKbS/NB0T2jX26qIGYlFfMJZ9PA2DBDnHfT73RkrZbawLw99if2VG/PwC1bCY1+ybs9tdp4Dgk7nWlOC6mRQYGJZ9l0850+rwhXE86swHWi7qHete0VGsiYgutqrywf4/Is6ndIsaffdPmbmwRYvvqpCziagtH2KEdJ2kdLyTAolzhvvIc2qLJvtlNaKJoLD/mcNGihjgu3+2hZphwC5n1OgyZQk7zOadyDiYHZN/UvrKFOJ/TgzFefBaH24NHkah8ZBe6VVkqJa9IrXs4mVOI3STOXZjvxKJc2+lwYA0Xz6k4WXFRaeoZArJvQsLV62izb8pyTuWq8lHZ2TcFTv92CMy+KXK6A7JvDIrE5nF5MPo+iy+jR5N9YzDqS82+0TqnzAb9KW6n8mbflFbVUHI/ODfZN77NpTmnypt9cybX1fmQfVPd83Vk9s05wOOp2jqai3gNTrWewUlKSuKZZ55h+vTpWK3WMvcr+Zfe6/We9v8Ihg0bRnZ2tvqVlJR01u5ZIpFIJBLJf0+1nsHZsGEDJ0+epF27duo2t9vNihUrmDhxIrt37wbETE5CQoK6z8mTJ0+Z1dFisViwWCzn7sYlEolEIjkbyEXGlaZav+D06tWLrVu3Bmx76KGHaNq0KS+99BL169cnPj6exYsX06aNkIGcTifLly9n7NixFb7e3U+8g85o4Ze24mUp/fa+bM9pCECjZYtY3+NKAFr2fYPb7hZ1Cc9f+hBZbSIByJgoXFQ1544mJL4uAD1jgjj2vNj36RZDeFiRX35q2JYh83cB8EHIaiyh4hzHC0Xs/stNg/AOewiAkdeOZMF8EQA4qW08b7uFDLQqL5QnrmoEwMBnPxTHz0ojoq5o7t6/fgGPtxUy1+YwC0d/WgBAnfv7s/8N4dBy7k8j0SoqC9IOHgSgdvemeOcKCSQ7vAEhcfUAWLLjBJ0bRQOw5fc11LELScU39V+4azNWu/h52q7VRLQR95nmdNFSkahWeiFeqUgIMeopPiLkNlWiOpRMWE0hfR1aeoTgxFhxH8VusIux0+Mlyxk47Xoyv8gvUWUVUluRXHLzndiU6genw4VNE+5nqS2u4zmiVDVEhOIuTgNAH+p3TmllKW24n7aSoch1aj1DybE23M9Rok1cG+7ndge6qHyzkR6PV3VU+aoatOF+JR1SqnuplAZx7VjrnAL8clWJcL+S4Xwlw/20zin/MYHHlxXspz2vuj8Vl4zOhQOp5GRwVaa9/w0B52IK97sYFDGvx33avzflOf5ipVq/4ISGhtKiRYuAbcHBwURFRanbhwwZwujRo2nUqBGNGjVi9OjRBAUFcc899/wXtyyRSCQSiaQaUK1fcMrDiy++iMPhYNCgQWRmZtKxY0cWLVpU4QwciUQikUiqG7KqofKcdy84y5YtC/hep9MxYsQIRowYUeVz12h9BQZLEHVeHwXAO7EtuWry/wHQaejPXPO3aBlf9Xtnnlp4CIA2oRY63HEnALeM/gOAx6f9yJPfzwOg3yMWxt4tAgJXdV3A9xMfAOBPY3dmTxf737V3Bs1uexuAVv/8BED6RyNxPy5ktgznGxz6W9je2zx7I7G/1gJg7OI9/PBgWwAeVFxPO779h/qP3AeA81svjfXCAVXYqQb7fhVyUI3/u5I8xW0zb8MxnlU6sXzhftFPdMW4UIQJbj1ZQHRtsZ7pwP4MHupcF4D30o9jPiHWQPk
C6tK27CM4Rlj2T27Ip0aieMnMc3loGi2kuZWASXFORZgMFB7eD4Bdke5yjuYQ36aOOF/RAQxxwjmV7/bgDhX34faKYD/wO3ZO5juxKbLOodwimisyiyPXqcpSRYXFav+Uu9CBJVxIaGq4X2g8Xs8JZawJ9zP7w/3cRv9id60sVehzL5m0zimT6rTSm8zkKfKj3ugP+vM5pxxOd7nC/XzOJbcSJqg36PEo47LC/cwleqk8mu0+Ajun/GNtuF/JLqrS5CsfZyPc77RBfxphpTwyRaDrSrv93GgcF1O4n3ROnVu8Hi9ed1VecLxn3ukC5bx7wZFIJBKJ5GLB6/ZU7QWnCsee71Rrm7hEIpFIJBJJZZAzOBrWPtuYsNAQaj7+BQCb3r6emM4iBC506goevlr0Pq3s0puvwkS43/s/j+L+NqJzKazrYAAOFRQzoZE47m/DA2QXfwCAI/MEh64QnVJjwwx8NOo9ABbsTOOd+4TUlHBSdG2teG8pyxoJ+aZtuJUv0oU8xnWv0TF3i7iPxVvR1RfuKqtddDat3ZVD/x7iPtNsJpx/zACg0S2dWfzUNwAEJRcQosgTR3anUety4bQq2isC+/QtexAULa63/EA69RsIh9e6FXtoqUhJxfnZuPZuBMAcLBxSqdv2YW8lXFTHHC7a1hG1GklurxruZzPoIEV8rhiLgex9xwAISRQyUE5SLo1uFFJUhtONKbGueHZuL0Vm/7qq9ALh8vI5p5JzCtVwv+QsB3aTGBcW+GUpZ0E+1gjl/g/kYQ1XXFS+cD9N55QuyK5ey2sJVseFLo8qyRW6/UF/eU6XOtYG+uUVie0Gozkg3M/h9G0XfwVdGlnKVewpM9xPb/BJRopzyqhXXVRaKUob7mfQld5FpcpSbneAcyowtE/jZCqho5Qn3K88gX4B41I8PV6Pu9xumbLC/cqDvpzyWMD1zrE8U97zX0zOqYsNuQan8sgXHIlEIpFIqilSoqo8UqKSSCQSiURywSFncDRMansHVp2Boi53A/Btt6G4L78GgBe/n0e9zsLR87q9Oba+Qkb5v4JWXH/nTQDU7/YUAPdGH2TZTYMAGHzV60y9SgTlLW5xIw9NWQvAD+5viWooZCn31qV0KfhH3MRLrwPw3uQbWLBgOwBDbm9O8CHhnPrin2Re7NUYgK6TvmDv1CMAxDZ/FIDjy2czsKmQiTYmhLBn9goALn3/fyQ5vgLg+83HaRosJKOMgzuoc5MISdTtF06so7oIwmuJa/yxLYXbOonP/cesBSQYCtTnlf2PuGdblHA9ZWxaTNQ1/nC/rglhACQBkfoiAOwmA8WHhawWE2Qi+5BwLdlrhwNwZNMJrLXEZ81xeXCHxQMi3C+z0C83JOeK8/mcU0czHFyiyCz5uU6CwkRSdZGjODDcTwkRdDsdWCKFDOUuTgbAYI/yO6c0spTH7B87XF5VolJdVAZDgHOqQBPiF+Cc0mxX5SqDT5byu6g8Lk3Qn9uD2WZSx9owQAiUpYx6HZ5i5ynbzSWcU94SLqqSQX8+ucnr8WDSuKu0YyhfuJ9BT4BUpt2uPVdpnE52CXBFnWO9pbz/CizrPsq6vfPZOfVfcrGZtuQMTuWRLzgSiUQikVRTvG43nio0gl/MbeJSopJIJBKJRHLBIWdwNERbDNh0BhZ9NgSAbre/xmtK39CLGd/T9lVRDzH7liZ0fuVeAJ5+4SNOLNsLwE8pPQGor2vD4NhuAOwzzaPt/I8AmJAfSb/73wRg/s4/uWPq0wBct20qm14cDUDKuzMB0dN0YquQlxp98yL1v8wC4LP5O3niCSHheD1uNiwS/VGXf1gXANsUHZGHVgLQuF8zVn69CYCo0AYoH4W1/xzn5obC4ZSfmkRYj/4AWGaKcMK/jmQTrzigUg5l0fm2VoBwgekP/wOAwWzj5KZ9AITGic+alF1Ey7rCcZXn8tAkWvQ2LdHrMKYfEs/YbCB3nxiH1Qwl61A2AA37CRkvtWgHpoS66jk
8obHqn0+WIlGZ9TqS84RE5XNO7c520EVxTjnyivzhfg6/i6o4Iw9rlJDNPK5i9PYoZXwYCOyf8lj8jq1i9H5Zyu13UalSlN5ArtPvnMpX5arAQL+8wuJTtquSk8sbID/5HFXOotKlK6/GOeVzgQXIT4bAcL8AF5Wmo8qHSe+Xq0ya5D19GY6okt8HnLeCGkJ5wv1Kc1eVup/GOVWecL/KyETn2jl1PiDD/f49vN4quqi8UqKSSCQSiURSzZBrcCqPlKgkEolEIpFccMgZHA237lhOWFgY2xVXVJ1OT/H82C4AvNHndQ5e0gOAyN9/4PbfJwPwjF5PK7uQQIInDQXg4UaP0yvSBsAPNRszYrN4g34jbCk6g5A3NmcXMkEJDvQaHmDsnSIM8JdvNwMw7pIYPlCkkA3mJjxxQw4Az7/yKSeCRL9URN0WbN68BIDBV4hzbbVbOT5LBPrVvO0mtk1aA0DB3nQSreKPO3nPYer3bgqAZ6GT/JrCzRVWYxcAv25LpktzIQ1NW7OJJlGiX8rjclK4TZzPao8mdYfoq4q8Scg5xwtdtK0TDsDfXqgRItw/IUY9roPbAIi3GsncI7qowuvaOfLXUXHtugmACPfTRYvgQYfbQ7bL/w7ud07pOJbhEOdTpJz07ELsyucrzNfIUvnZWBMUt1SyA2uUMnZlqxKVT9Lw2sLUa3kt/v4pR7Em3E/josrTBvcVa6Uof7hfrhL0p3VOGYx6XMr+Bo2Lyhfi53Z5sCjOKY9LE/rn8WJUJS1xXotRr0pDAeF+pQT6iX30p4xF/5S6WZUftKF/WqeVR3VFaY8JPN5bqruqjO2cKhmdKdwvIJSv7N0qxNlUXapyKhnuJ9EiZ3Aqj5zBkUgkEomkmuL1eNU048p9nbuyzczMTPr374/dbsdut9O/f3+ysrJOe8yDDz6ITqcL+OrUqVPAPkVFRQwePJjo6GiCg4Pp168fR48erfD9yRkcDW2emYvebOP6n8Wi4S0ZHXnkF/FQe4VauOKhhwDo+cpvDPj6UwBeXfAr9ztETszbN4n273ldYpgy9QkAduh68cnHPwPQfc8sLu0/DoC2238lbewQANxPv0dK4XgA9ixfBkCnkfcS/3MiAMPmbefnh0VWzaD042yZKhYONxp4B44fxS9vK6PIsDH0qM2Ob/8BIPqNj8guFtUQs1Yf5nll0W9W0k7iB10NgHHZKtYdF7UScfVriHvelUr/+9sBMCk1CdtJf2v4ib93ABASdw3Jm3IBqK/M2mQ43dwQJ2ZBNujAmiUyeiJMBhz7xOxQZFwwWQczAUhoX5dUZZG0qWZDAHJcbtx2MZvj9kKmpjX8WK5o/bYZ9OzPFHk8TXwLi0u0hgdFiRk0V6F/YbHbWYgxXOTqeIpTMYSK+z5Ta7g2+ybf6cZgEhlC6uyM0USu0z9TU1ZruLqw2KDH7WsZL6M1PGDBsW+RsdsTMOsCp+balNUaXto+5WkNP11juF5X+sLis9UaXtHG8JL7VaU1vKrZN2dr/+rEf7mw+GJe0+xxe/BUYRamKseeiXvuuYejR4+ycOFCAB577DH69+/P/PnzT3vcNddcw9SpU9XvzWZzwM+HDBnC/PnzmTVrFlFRUQwdOpS+ffuyYcMGDIoKUh7kC45EIpFIJJIKsXPnThYuXMiaNWvo2LEjAJ9++imdO3dm9+7dNGnSpMxjLRYL8fHxpf4sOzubzz//nK+//pqrrroKgOnTp1OrVi2WLFnC1VdfXe57lBKVRCKRSCTVFN8anKp8AeTk5AR8FRUVVem+Vq9ejd1uV19uADp16oTdbmfVqlWnPXbZsmXExsbSuHFjBgwYwMmTJ9WfbdiwgeLiYvr06aNuS0xMpEWLFmc8b0nkDI6G/NTD6IwWXhp6BQCzG/Xk2zqXAzB18/f0V/IEgr78ihxFYhjuXsW3sdcBYNAJ+cnldLCs8Z0AvB2ex4Q3RDXB3N3pfPWoaB4P0fVi7tjfAVhQYysPKC3dX2S
mAJDV7VX6uQ+I+5j+B+6oZQAEx9RixY7lAAy5pgmHRylyydzPAWh8T2/mLfgYAOfBLLVV++C2ZBooi5qLN2TjbSV+eULij/PzdlGX0LKZWFi8eP562iR0F/vmZ+PcJnJ1rPYYUjYIqSmqWywH80WuS5dGohpip8dL3XBfa7geb5L43PFWg7qwOKJ+OCd3pAHQ9K4apCoSlDFR3JvD7aXA6JeKUnKd6vmOZoqFxWFGPclZQq6KNCuZNHlObNFClirKzcEWLRY+F+/NIyg2QvlzyUcf5su+cUJIFFq8Vn/2jUPTGq4d5zpdpS4yDlhYrJGocpXsG4PRiFPdX4+rWPz+GJU/H5fTrS4sLswvVre7XX7pyuPxqhKVr5LBoNfk4BjKkKJKZN+UttDXl4MDYNTm4JTRGA5g0myoysJiH6dbWFxS2vk3W8P/jdwbubBYUhZna5FxLaUCx8fw4cMZMWJEpc+bkpJCbGzsKdtjY2NJSUkp87hrr72W22+/nTp16nDw4EFef/11rrzySjZs2IDFYiElJQWz2UxERETAcXFxcac9b2nIFxyJRCKRSC5wkpKSCAvzO0UtFkup+40YMYKRI0ee9lzr1q0DSn8x93q9p31hv/POO9VxixYtaN++PXXq1OGXX37hlltuKfO4M523NOQLjkQikUgk1ZSzlWQcFhYW8IJTFk899RR33XXXafepW7cuW7Zs4cSJE6f8LDU1lbi4uHLfX0JCAnXq1GHvXmHuiY+Px+l0kpmZGTCLc/LkSbp06VLu84J8wQlg6aeDCQkN46+joj5g/6QeNOzeD4Arvkph6PuiWqHPm58xOF1IMV/cMpqRvYQraNMbYvHTavttPPHOMgC+Pz6FBj2eASBx958k/v4+ALoh77L5FZFFs3r+cqa+KlrLo1cKN9FrC/fw9rWi0fujUe+xYYKQhur3eYOMxV8C0LeOlfWtxC/S1mmi1qHTb3M5XjgRgM1rDnNHuHADfXxgM3WGXAmAfvMOduYKGSK2QWNWbhZt2s/e2ByAH6fsJ7rguPpc0tZsBCA4pj0nl2eIz3JfOJlKlks3JWdmJxBWKOSnCJOBon1bAIiPCSJjt5hajGgYw86VwplmrdNAPYfLLhxjTo+XNIfGOZXjc07pOJwunFNdTHryc4R+HKTkDRXmOwlSXGKuwjxsMeIvhnu7A1N4OACe4kwMETGAkEM8GkkKwK3JvskPyL7xqPlFeU43eqNJGZfunMrTZN/4nFNGk0HNvtGOz+Sc8nq92BQZzuNyYjmNi0rrnNJKUSUrHHz4JCatc0qbiaOtbfBoM3HOgnNKS5kt3JXQV6qyqFC2hlcfLmbXVEn+7Ryc6OhooqOjz7hf586dyc7O5u+//+ayy8TSi7Vr15KdnV2hF5H09HSSkpJISBDu2Xbt2mEymVi8eDF33HEHAMnJyWzbto1x48ZV6LPIRcYSiUQikUgqRLNmzbjmmmsYMGAAa9asYc2aNQwYMIC+ffsGOKiaNm3KnDlzAMjLy+P5559n9erVHDp0iGXLlnHDDTcQHR3NzTffDIDdbueRRx5h6NCh/P7772zatIn77ruPli1bqq6q8iJncCQSiUQiqaZU5yTjGTNm8PTTT6uOp379+jFx4sSAfXbv3k12tlBFDAYDW7du5auvviIrK4uEhAR69uzJ7NmzCQ31z6i/9957GI1G7rjjDhwOB7169WLatGkVysAB+YITwJHrbyDYYGBghHBOZf4xlmGX9gAg9PJnWJEmJJKfehlYYxgFwJ7XF5B1RLiFMiaK6bN5YTpCPxWuphlbDzBlYlcAEorb8PMzoi18mekG2iry0RdH9xA0QFQ/9AjfCsBv8zcyKVws5LLaY1i2VkhQj33QjLR3hUTi+mUyLR8RDeaTnxL1DN50PSGKjPH7huO80aMOAI69KRg7DwQgJL6AOduEZNSkeQx/LxX336VWZwAKs1NxKU3mltBIjq/dAEBkq5vZlydcQV0bRZOkOMkaRor
PYTPo0B0VQYA1bEYytokQv4h64WQeyAKg/nXtSCkUFQ+m2o3JU85RFOR3NJ3MV1xBeh1HsnzOKQN/Z4jnH2k2UOCTqHzOqfw8gmPFXxDngWxsvkoGZyGGCLHS3+vZhz40Ur2Ot4REpa1kcLg86I1ChswudGFQxnlFLnW7T4rSOqcMZhsORboymi0UleKcMhh1eFw+F5VPBitWqxo8GueU2xUY7mc2GtQxBDqnTpGl3KdKVFrHkVZKKqs1PCD0r4TWUtItdcZxKcF9JZ1TZbWGl9xaVmt4wDFltIafT86pSrWdV+JeSr+21ImqAx6PB08V1uBU5dgzERkZyfTp00+7j9frT1K22Wz89ttvZzyv1Wrlww8/5MMPP6zS/UmJSiKRSCQSyQWHnMGRSCQSiaSaUp0lquqOfMHRsGRfBhadntiHhKR07bpYnr9byFWdB7/Ps8lCnpnR8QGGXf08AH8OuZz1ccLXf8voPwD4Pm0KtTs/CoB91wra75gFgH7MFN6Z1BqAud8s4/WhQl56bUMjXlu0D4Ax1zcDoNF7H7FxjNhWr/PLHF8+G4AXW0bzd0vhBPpn0m9cNmcGAIce/gqALSsPcq1d5Bt8tnsDjR4X19CPO8AeVzgAMQ2bs2jDMQAGXdOEJdN/AiChOFV9FmkrRWt4SFwrTvwlekbibwznhCLLXFcngiRl3yiXcJFFm40494ierIQoG+k7xTUiGsWwZ51watkaNFKdU+6IWjiVIri0AkXi0cGRbOGcCjHqOZCaD0B7k54cJdwvJMJKQW6gRFWcn42tnuKc2uXAEitcAK6ig6pE5XE58djs6mf0WAIlqgKtRFXsd05lF7lU51Su041e6aLKLhC/DyWdU2rQn0GPU9luNBkCZKliJeDQGizO5XZ7VLnKremc0jqnPMXOgO0g5KczdU6VdE75LKcmtX/KHTguwzllKqGXGPSUKndpt1fFOfVvuI/KM4V9uvuQzqmzh1TESke84JQ/yLK04y9W5AuORCKRSCTVFF8reFWOv1iRa3AkEolEIpFccMgZHA3Df36DsOAg3mp/PQBhXZ7kkqQcAJbeHMLfBhHSt3FCO1J3CQnH8e4kFocJaSGs62AAvty2i893dQMg0duaeQ9PAmDZ5Mv9zqkDmwmfLpxTPWdv4bvvhbNotG0tINxLS/4WYXuDxl1C2gdCIvHM/4DWA0UWwOSnvsGTKzqsbIrd5fe1R3npyroAFOw9jqnHYwCEfDWTH7YJmah5yzjWrdgDQLfHO+JQ+q88W5ep1z62SjinolrdyO5fhRzSs1ms6pxqFh2kXlN/zO+cSvtHnDeifjjpe4V0VffqtqQUirBAc92mZCtuosLgGPXZJ+f5nVOHFLdUmNHA32lCorrW4ndOBccGU5SfJz6XEjLoPJCtdk6d4pyy+0OrtBJVgUvIYz5ZKr8czqmcwmKMZiGLaZ1TeUrnlNY5ZTQbApxT2qC/Qpcib5XhnPKF+5V0TvnkJm2437/lnCrpqqmMc6o06ao8zilVKiv1eM0x0jlVJaRzqvrh9VRxDc5FPIMjX3AkEolEIqmuVHGRMRfxGhwpUUkkEolEIrngkDM4GvqujcVoDWbo7e0B6PnGFF5KFdLQZ+3uY/hVTwGw/vU+bIm4D4Dr31jE98mfAFC/m/h52K4VdFj/KQD68V/z7sfCGfXD14sY9ca1AAxf1YChv+wGYPyNl1Dv/Y8BWDdcbGt45Wsc//N7AF5qbmd923hx7fHz6bTwR0A4p9YtE06rWyOEbPLFznU0flqkSurf3s3OYiHJxDW6hAVrhe9pyA3NWPyVOEcNZyf1859c9hcAoQntOb78V/HzmyNU59QNdSP5Rtk3ujidaLP49SncJuS6hCgbqdtEz1R0s3j2bRDSV1CjJqQpso0rso7qnEotcKlyyKFMEeindU51NJfunAqOC6Y4XyRjlumcihLP63TOqQJFPiqPc8pgEc83u6D4lKA/rXPKaDIEOKe0spTWOeV
2Bwb9ldc5pY6r4Jzy7VNR55TvlNI5VfFjJKUjFbEz43F78FRhFqYqx57vyBcciUQikUiqKdJFVXmkRCWRSCQSieSCQ87gaNg45zt0BjNrMoRcMr9lEj+GvQFA0phuZB8VDqGdo99iYZhw8QRPncoXm/cCMDdZhOqFRvZi+sAvAViQ25O7Y4IA+OLITtwPi26NW2vv5/uZSwF41/sbQVGJACz6azkAwz5pxeH3hRRSMP3/aPPMDQC8c/8UclLEvK7dpOe3NUJ2eqtvQwAc/6Rg6PkcAGFffsW0deLnbdsm8McvwsnU/anOFGaLUL/iDYuw2oWbKenP1QDEdL6V3T8LV9PVLePZozinmscEqT1XHPqH2kHi1yd1k/j80U2iSNudDkDDmzpyzKF0TtW7RO2cKrD6u6CSsouwKbrHPsUtFWEysPqkeLZ9LUbyfBJVQgiFOdnK2I5zrzKuKe69uDAfQ5SQAj2u3ejCNM6poAh1nK9xTuUpEpVPcspwFKvOqexCFwbFLZVZ4FS3ZxUUq9uzCsQzMpotOHzhfprOKaNJj0uR5iw2E4X5xadsL805JWQpZexyYjP5XVRB5hJdVOVwTpn0/n/HGDXSVVnOKZNBd8o+Xo10pf2Z9nrq9go6pwLcTiXO77t2SSlDOqekc+piQSYZVx75giORSCQSSTXF6/bidXvPvONpjr9YkRKVRCKRSCSSCw45g6PhlbeGYA0OYUhRBwDe7P0qH3QRPVMp3z5Dka4XAPe8+DUzdokOqHb9x9Hq0CoAQj9+AQDXU++ycUQLAJbOnMc3nw0EoMbPidz3tehq+vmRdkx5awIAK17eRMuB4wFI/V1Uz98Sk8/Wq+oCsPad3+i+XchHxwsns/Q34bQaGh3MxB0rAaj/lrhP4661/JUq3thrtWjOH2uPADDm/nb8+LG459isvapz6PjiP7HXugaAQwvnA9D0oWiOK5LLA3Wj2K/MYEfkJRFnEb8y+ZvWUCNROJJObhbOqYT2ddm6SoxtjVuQ5pwpnkdUXdU5dULjnNqfWaBKXntP5ALQy2wgV5EIQ2KCcOQJ51RoQgjFBUKWCmoehWurkLTMMYpbqng3xmgx9nrcAbKUyxyijvOcflkqV3E1GUxaWUoEMWYXFQd0Tvlkqbwil+qoyiss3S3lGxuMetU5ZTDq1e16g151NqiylMsZID9pg/4sGueUz0Xlk31sZkOAc0qVpTTOKYNO63xCs0/pzqmS7ih1XELK0P5MV4Y0pC9DliqffKQ9tuTPKiarVMU5dborna/Oqf9SlpKKWMXweKroorqIFxnLFxyJRCKRSKopXo8Xr6cKElUVjj3fkS84Gm5dMo5Qi5l7mwwS34dasCgx/696r2RMiKgv+Cj9OD/sEotplz3dHm9bsaj37ZvGArDA9RfvXSYWDX+Zn82mS4cCMKJGAU+/8BEA6elfE9Wwrdj/x6V8eE8bALaOFbMDSe+OosUwcR8zLn+Og/+ITJlaNhPzVm8HoNXDl1H8i5jVKOxwCwCR9Qv55K+DAPTpVJvPpvwCQPc6V6rZMQUr5hIcUwuAw0t/JvG2GgDs+V4smr2uZQKrlb8UzaKt2H1ZLbvWUi9YZMOcWL+LWKXV/PAKsZC55aN9OOYQWTqGuv6FxVm6YPUZH8hwaGZt8ohXzp2kZN/EBJn8C4sTQyjMFnUPwQmROLcpC4trxOByirEhRty7x7UV7LHqdbQzOHnKgl690azO4Oj0BjIc/jZwgBxNJUN2gb+SIctRrM7aZBU4MZrFMyjS5N34ZmpKLix25DrV7W7leZjNBtwucWxQGQuLtZUM2lkbdVzKwmJtZo1J799uDJjZ8bePV3Rhccm6hLIWFuvLWFispTwLi8tLWQuLy5q1Kc/sT1UnGar7wmLJ+YPHDR595V9SKvBX6YJDrsGRSCQSiURywSFncCQSiUQiqaZ43R68emkTrwzyBUfDe++vxIye+a2FVPDNvhW0yBdyxGU3vkTCzj8AePm
[... base64-encoded PNG image data for the positional-encoding plot omitted ...]", + "text/plain": [ + "
<Figure size 640x480 with 2 Axes>
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "pos_encoding = positional_encoding(128, 256)\n", + "\n", + "plt.pcolormesh(pos_encoding[0], cmap='RdBu')\n", + "plt.xlabel('d')\n", + "plt.xlim((0, 256))\n", + "plt.ylabel('Position')\n", + "plt.colorbar()\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "f633ce92", + "metadata": {}, + "source": [ + "Each row represents a positional encoding - notice how none of the rows are identical! You have created a unique positional encoding for each of the words." + ] + }, + { + "cell_type": "markdown", + "id": "0dd7e035", + "metadata": {}, + "source": [ + "**Congratulations on finishing this Lab!** Now you should have a better understanding of the positional encoding in the transformer and this will surely help you with this week's assignment!\n", + "\n", + "**Keep it up!**" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "85e793c3", + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/C4W2_Attention.ipynb b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/C4W2_Attention.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..e1eb616a828ecd297c0e2a99c25032712c96fadf --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/C4W2_Attention.ipynb @@ -0,0 +1,259 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + 
"source": [ + "# The Three Ways of Attention and Dot Product Attention: Ungraded Lab Notebook\n", + "\n", + "In this notebook you'll explore the three ways of attention (encoder-decoder attention, causal attention, and bi-directional self attention) and how to implement the latter two with dot product attention. \n", + "\n", + "## Background\n", + "\n", + "As you learned last week, **attention models** constitute powerful tools in the NLP practitioner's toolkit. Like LSTMs, they learn which words are most important to phrases, sentences, paragraphs, and so on. Moreover, they mitigate the vanishing gradient problem even better than LSTMs. You've already seen how to combine attention with LSTMs to build **encoder-decoder models** for applications such as machine translation. \n", + "\n", + "\n", + "\n", + "This week, you'll see how to integrate attention into **transformers**. Because transformers do not process one token at a time, they are much easier to parallelize and accelerate. Beyond text summarization, applications of transformers include: \n", + "* Machine translation\n", + "* Auto-completion\n", + "* Named Entity Recognition\n", + "* Chatbots\n", + "* Question-Answering\n", + "* And more!\n", + "\n", + "Along with embedding, positional encoding, dense layers, and residual connections, attention is a crucial component of transformers. At the heart of any attention scheme used in a transformer is **dot product attention**, of which the figures below display a simplified picture:\n", + "\n", + "\n", + "\n", + "\n", + "\n", + "With basic dot product attention, you capture the interactions between every word (embedding) in your query and every word in your key. If the queries and keys belong to the same sentences, this constitutes **bi-directional self-attention**. In some situations, however, it's more appropriate to consider only words which have come before the current one. 
Such cases, particularly when the queries and keys come from the same sentences, fall into the category of **causal attention**. \n", + "\n", + "\n", + "\n", + "For causal attention, you add a **mask** to the argument of our softmax function, as illustrated below: \n", + "\n", + "\n", + "\n", + "\n", + "\n", + "Now let's see how to implement the attention mechanism." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Imports" + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "\n", + "import sys\n", + "\n", + "import tensorflow as tf\n", + "\n", + "import textwrap\n", + "wrapper = textwrap.TextWrapper(width=70)\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Here is a helper function that will help you display useful information:\n", + "\n", + "* `display_tensor()` prints out the shape and the actual tensor." + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "def display_tensor(t, name):\n", + " \"\"\"Display shape and tensor\"\"\"\n", + " print(f'{name} shape: {t.shape}\\n')\n", + " print(f'{t}\\n')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Create some tensors and display their shapes. Feel free to experiment with your own tensors. Keep in mind, though, that the query, key, and value arrays must all have the same embedding dimensions (number of columns), and the mask array must have the same shape as `tf.matmul(query, key_transposed)`. " + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "query shape: (2, 3)\n", + "\n", + "[[1. 0. 0.]\n", + " [0. 1. 0.]]\n", + "\n", + "key shape: (2, 3)\n", + "\n", + "[[1. 2. 3.]\n", + " [4. 5. 6.]]\n", + "\n", + "value shape: (2, 3)\n", + "\n", + "[[0. 
1. 0.]\n", + " [1. 0. 1.]]\n", + "\n", + "mask shape: (2, 2)\n", + "\n", + "[[1. 0.]\n", + " [1. 1.]]\n", + "\n" + ] + } + ], + "source": [ + "q = tf.constant([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])\n", + "display_tensor(q, 'query')\n", + "k = tf.constant([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])\n", + "display_tensor(k, 'key')\n", + "v = tf.constant([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0]])\n", + "display_tensor(v, 'value')\n", + "m = tf.constant([[1.0, 0.0], [1.0, 1.0]])\n", + "display_tensor(m, 'mask')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Dot product attention\n", + "\n", + "Here you compute \n", + "$\\textrm{softmax} \\left(\\frac{Q K^T}{\\sqrt{d}} + M \\right) V$, where the (optional, but default) scaling factor $\\sqrt{d}$ is the square root of the embedding dimension." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def dot_product_attention(q, k, v, mask, scale=True):\n", + " \"\"\"\n", + " Calculate the attention weights.\n", + " q, k, v must have matching leading dimensions.\n", + " k, v must have matching penultimate dimension, i.e.: seq_len_k = seq_len_v.\n", + " The mask has different shapes depending on its type(padding or look ahead) \n", + " but it must be broadcastable for addition.\n", + "\n", + " Arguments:\n", + " q (tf.Tensor): query of shape (..., seq_len_q, depth)\n", + " k (tf.Tensor): key of shape (..., seq_len_k, depth)\n", + " v (tf.Tensor): value of shape (..., seq_len_v, depth_v)\n", + " mask (tf.Tensor): mask with shape broadcastable \n", + " to (..., seq_len_q, seq_len_k). Defaults to None.\n", + " scale (boolean): if True, the result is a scaled dot-product attention. 
Defaults to True.\n", + "\n", + "    Returns:\n", + "        attention_output (tf.Tensor): the result of the attention function\n", + "    \"\"\"\n", + "    \n", + "    # Multiply q and k transposed.\n", + "    matmul_qk = tf.matmul(q, k, transpose_b=True)  # (..., seq_len_q, seq_len_k)\n", + "\n", + "    # scale matmul_qk with the square root of dk\n", + "    if scale:\n", + "        dk = tf.cast(tf.shape(k)[-1], tf.float32)\n", + "        matmul_qk = matmul_qk / tf.math.sqrt(dk)\n", + "    # add the mask to the scaled tensor.\n", + "    if mask is not None:\n", + "        matmul_qk = matmul_qk + (1. - mask) * -1e9 \n", + "\n", + "    # softmax is normalized on the last axis (seq_len_k) so that the scores add up to 1.\n", + "    attention_weights = tf.keras.activations.softmax(matmul_qk)\n", + "\n", + "    # Multiply the attention weights by v\n", + "    attention_output = tf.matmul(attention_weights, v)  # (..., seq_len_q, depth_v)\n", + "\n", + "    return attention_output" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, you implement the *masked* dot product self-attention (at the heart of causal attention) as a special case of dot product attention." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def causal_dot_product_attention(q, k, v, scale=True):\n", + "    \"\"\" Masked dot product self attention.\n", + "    Args:\n", + "        q (tf.Tensor): queries.\n", + "        k (tf.Tensor): keys.\n", + "        v (tf.Tensor): values.\n", + "    Returns:\n", + "        tf.Tensor: masked dot product self attention tensor.\n", + "    \"\"\"\n", + "    \n", + "    # Size of the penultimate dimension of the query\n", + "    mask_size = q.shape[-2]\n", + "\n", + "    # Creates a matrix with ones on and below the diagonal and 0s above.
It should have shape (1, mask_size, mask_size)\n", + " mask = tf.experimental.numpy.tril(tf.ones((mask_size, mask_size))) \n", + " \n", + " return dot_product_attention(q, k, v, mask, scale=scale)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "result = causal_dot_product_attention(q, k, v)\n", + "display_tensor(result, 'result')" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/C4W2_Masking.ipynb b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/C4W2_Masking.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..b603d8c306de8ba8fe74115284b291a76fe27916 --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/C4W2_Masking.ipynb @@ -0,0 +1,263 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "c2b5b44f", + "metadata": {}, + "source": [ + "# Masking\n", + "\n", + "In this lab, you will implement the masking, that is one of the essential building blocks of the transformer. You will see how to define the masks and test how they work. You will use the masks later in the programming assignment." 
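As a plain-NumPy cross-check of the scaled dot-product formula used in these labs, here is a self-contained sketch (the helper name `dot_product_attention_np` is ours, not part of the lab) applied to the same small `q`, `k`, `v`, and `mask` values:

```python
import numpy as np

def dot_product_attention_np(q, k, v, mask=None, scale=True):
    """Compute softmax(q k^T / sqrt(d) + M) v, with M = (1 - mask) * -1e9."""
    logits = q @ k.T
    if scale:
        logits = logits / np.sqrt(q.shape[-1])
    if mask is not None:
        logits = logits + (1.0 - mask) * -1e9
    # row-wise softmax, subtracting the row max for numerical stability
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)
    return w @ v

q = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
k = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
v = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
mask = np.array([[1.0, 0.0], [1.0, 1.0]])

out = dot_product_attention_np(q, k, v, mask)
# With this mask the first query can only attend to the first key,
# so out[0] equals v[0] = [0., 1., 0.]
```

The mask convention here, `logits + (1 - mask) * -1e9`, matches the one the lab's TensorFlow implementation uses.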
+ ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "ff0def87", + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "id": "9c0b0358", + "metadata": {}, + "source": [ + "## 1 - Masking\n", + "\n", + "There are two types of masks that are useful when building your Transformer network: the *padding mask* and the *look-ahead mask*. Both help the softmax computation give the appropriate weights to the words in your input sentence. \n", + "\n", + "### 1.1 - Padding Mask\n", + "\n", + "Oftentimes your input sequence will exceed the maximum length of a sequence your network can process. Let's say the maximum length of your model is five and it is fed the following sequences:\n", + "\n", + "    [[\"Do\", \"you\", \"know\", \"when\", \"Jane\", \"is\", \"going\", \"to\", \"visit\", \"Africa\"], \n", + "     [\"Jane\", \"visits\", \"Africa\", \"in\", \"September\" ],\n", + "     [\"Exciting\", \"!\"]\n", + "    ]\n", + "\n", + "which might get vectorized as:\n", + "\n", + "    [[ 71, 121, 4, 56, 99, 2344, 345, 1284, 15],\n", + "     [ 56, 1285, 15, 181, 545],\n", + "     [ 87, 600]\n", + "    ]\n", + "    \n", + "When passing sequences into a transformer model, it is important that they are of uniform length. You can achieve this by padding the sequence with zeros, and truncating sentences that exceed the maximum length of your model:\n", + "\n", + "    [[ 71, 121, 4, 56, 99],\n", + "     [ 2344, 345, 1284, 15, 0],\n", + "     [ 56, 1285, 15, 181, 545],\n", + "     [ 87, 600, 0, 0, 0],\n", + "    ]\n", + "    \n", + "Sequences longer than the maximum length of five will be truncated, and zeros will be added to the truncated sequence to achieve uniform length.
Similarly, for sequences shorter than the maximum length, zeros will also be added for padding.\n", + "\n", + "When passing these vectors through the attention layers, the zeros will typically disappear (you will get completely new vectors given the mathematical operations that happen in the attention block). However, you still want the network to attend only to the first few numbers in that vector (given by the sentence length), and this is when a padding mask comes in handy. You will need to define a boolean mask that specifies to which elements you must attend (1) and which elements you must ignore (0), and you do this by looking at all the zeros in the sequence. Then you use the mask to set the values of the vectors (corresponding to the zeros in the initial vector) close to negative infinity (-1e9).\n", + "\n", + "Imagine your input vector is `[87, 600, 0, 0, 0]`. This would give you a mask of `[1, 1, 0, 0, 0]`. When your vector passes through the attention mechanism, you get another (random-looking) vector, let's say `[1, 2, 3, 4, 5]`, which after masking becomes `[1, 2, -1e9, -1e9, -1e9]`, so that when you take the softmax, the last three elements (where there were zeros in the input) don't affect the score.\n", + "\n", + "The [MultiHeadAttention](https://keras.io/api/layers/attention_layers/multi_head_attention/) layer implemented in Keras uses this masking logic.\n", + "\n", + "**Note:** The function below only creates the mask of an _already padded sequence_."
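The pad-and-truncate step described above is assumed to have already happened before this lab's functions run, but a minimal sketch of it (the helper name `pad_and_truncate` is ours, not part of the lab) could look like this:

```python
def pad_and_truncate(sequences, max_len, pad_value=0):
    """Truncate each sequence to max_len, then right-pad with pad_value."""
    padded = []
    for seq in sequences:
        clipped = list(seq)[:max_len]
        padded.append(clipped + [pad_value] * (max_len - len(clipped)))
    return padded

batch = [[71, 121, 4, 56, 99, 2344, 345, 1284, 15],
         [56, 1285, 15, 181, 545],
         [87, 600]]
padded_batch = pad_and_truncate(batch, 5)
# -> [[71, 121, 4, 56, 99], [56, 1285, 15, 181, 545], [87, 600, 0, 0, 0]]
```

In practice, `tf.keras.preprocessing.sequence.pad_sequences` (with `padding='post'` and `truncating='post'`) performs this same job.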
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "b4be6e26", + "metadata": {}, + "outputs": [], + "source": [ + "def create_padding_mask(decoder_token_ids):\n", + " \"\"\"\n", + " Creates a matrix mask for the padding cells\n", + " \n", + " Arguments:\n", + " decoder_token_ids (matrix like): matrix of size (n, m)\n", + " \n", + " Returns:\n", + " mask (tf.Tensor): binary tensor of size (n, 1, m)\n", + " \"\"\" \n", + " seq = 1 - tf.cast(tf.math.equal(decoder_token_ids, 0), tf.float32)\n", + " \n", + " # add extra dimensions to add the padding\n", + " # to the attention logits. \n", + " # this will allow for broadcasting later when comparing sequences\n", + " return seq[:, tf.newaxis, :] " + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "13a484c2", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "tf.Tensor(\n", + "[[[1. 1. 0. 0. 0.]]\n", + "\n", + " [[1. 1. 1. 0. 0.]]\n", + "\n", + " [[1. 0. 0. 0. 0.]]], shape=(3, 1, 5), dtype=float32)\n" + ] + } + ], + "source": [ + "x = tf.constant([[7., 6., 0., 0., 0.], [1., 2., 3., 0., 0.], [3., 0., 0., 0., 0.]])\n", + "print(create_padding_mask(x))" + ] + }, + { + "cell_type": "markdown", + "id": "ce1d7106", + "metadata": {}, + "source": [ + "If you multiply (1 - mask) by -1e9 and add it to the sample input sequences, the zeros are essentially set to negative infinity. 
Notice the difference when taking the softmax of the original sequence and the masked sequence:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "ba9a0bdd", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Softmax of non-masked vectors:\n", + "\n", + "tf.Tensor(\n", + "[[[7.2959954e-01 2.6840466e-01 6.6530867e-04 6.6530867e-04 6.6530867e-04]]\n", + "\n", + " [[8.4437378e-02 2.2952460e-01 6.2391251e-01 3.1062774e-02 3.1062774e-02]]\n", + "\n", + " [[8.3392531e-01 4.1518696e-02 4.1518696e-02 4.1518696e-02 4.1518696e-02]]], shape=(3, 1, 5), dtype=float32)\n", + "\n", + "Softmax of masked vectors:\n", + "\n", + "tf.Tensor(\n", + "[[[0.7310586 0.26894143 0. 0. 0. ]]\n", + "\n", + " [[0.09003057 0.24472848 0.66524094 0. 0. ]]\n", + "\n", + " [[1. 0. 0. 0. 0. ]]], shape=(3, 1, 5), dtype=float32)\n" + ] + } + ], + "source": [ + "# Create the mask for x\n", + "mask = create_padding_mask(x)\n", + "\n", + "# Extend the dimension of x to match the dimension of the mask\n", + "x_extended = x[:, tf.newaxis, :]\n", + "\n", + "print(\"Softmax of non-masked vectors:\\n\")\n", + "print(tf.keras.activations.softmax(x_extended))\n", + "\n", + "print(\"\\nSoftmax of masked vectors:\\n\")\n", + "print(tf.keras.activations.softmax(x_extended + (1 - mask) * -1.0e9))" + ] + }, + { + "cell_type": "markdown", + "id": "da92b367", + "metadata": {}, + "source": [ + "### 1.2 - Look-ahead Mask\n", + "\n", + "The look-ahead mask follows similar intuition. In training, you will have access to the complete correct output of your training example. The look-ahead mask helps your model pretend that it correctly predicted a part of the output and see if, *without looking ahead*, it can correctly predict the next output. 
\n", + "\n", + "For example, if the expected correct output is `[1, 2, 3]` and you wanted to see whether, given a correctly predicted first value, the model could predict the second value, you would mask out the second and third values. So you would input the masked sequence `[1, -1e9, -1e9]` and see if it could generate `[1, 2, -1e9]`.\n", + "\n", + "Just because you've worked so hard, we'll also implement this mask for you 😇😇. Again, take a close look at the code so you can effectively implement it later." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "843dd37e", + "metadata": {}, + "outputs": [], + "source": [ + "def create_look_ahead_mask(sequence_length):\n", + " \"\"\"\n", + " Returns a lower triangular matrix filled with ones\n", + " \n", + " Arguments:\n", + " sequence_length (int): matrix size\n", + " \n", + " Returns:\n", + " mask (tf.Tensor): binary tensor of size (1, sequence_length, sequence_length)\n", + " \"\"\"\n", + " mask = tf.linalg.band_part(tf.ones((1, sequence_length, sequence_length)), -1, 0)\n", + " return mask " + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "393f4398", + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "<tf.Tensor: shape=(1, 3, 3), dtype=float32, numpy=\n", + "array([[[1., 0., 0.],\n", + " [1., 1., 0.],\n", + " [1., 1., 1.]]], dtype=float32)>" + ] + }, + "execution_count": 10, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "x = tf.random.uniform((1, 3))\n", + "temp = create_look_ahead_mask(x.shape[1])\n", + "temp" + ] + }, + { + "cell_type": "markdown", + "id": "50e1114d", + "metadata": {}, + "source": [ + "**Congratulations on finishing this Lab!** Now you should have a better understanding of the masking in the transformer and this will surely help you with this week's assignment!\n", + "\n", + "**Keep it up!**" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", +
"mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/C4W2_Positional_Encoding.ipynb b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/C4W2_Positional_Encoding.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..f6898066f3ad5cfa88293784ed72fd43d1b6a792 --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/C4W2_Positional_Encoding.ipynb @@ -0,0 +1,296 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "a9479dda", + "metadata": {}, + "source": [ + "\n", + "# Positional Encoding\n", + "\n", + "In this lab, you will learn how to implement the positional encoding of words in the transformer." + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "f97b2311", + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "import numpy as np\n", + "import matplotlib.pyplot as plt\n", + "import tensorflow as tf" + ] + }, + { + "cell_type": "markdown", + "id": "14ea5651", + "metadata": {}, + "source": [ + "## 1. Positional Encoding\n", + "\n", + "In sequence to sequence tasks, the relative order of your data is extremely important to its meaning. When you were training sequential neural networks such as RNNs, you fed your inputs into the network in order. Information about the order of your data was automatically fed into your model. However, when you train a Transformer network using multi-head attention, you feed your data into the model all at once. While this dramatically reduces training time, there is no information about the order of your data. 
This is where positional encoding is useful - you can specifically encode the positions of your inputs and pass them into the network using these sine and cosine formulas:\n", + " \n", + "$$\n", + "PE_{(pos, 2i)}= sin\\left(\\frac{pos}{{10000}^{\\frac{2i}{d}}}\\right)\n", + "\\tag{1}$$\n", + "
\n", + "$$\n", + "PE_{(pos, 2i+1)}= cos\\left(\\frac{pos}{{10000}^{\\frac{2i}{d}}}\\right)\n", + "\\tag{2}$$\n", + "\n", + "* $d$ is the dimension of the word embedding and positional encoding\n", + "* $pos$ is the position of the word.\n", + "* $k$ refers to each of the different dimensions in the positional encodings, with $i$ equal to $k$ $//$ $2$.\n", + "\n", + "To develop some intuition about positional encodings, you can think of them broadly as a feature that contains the information about the relative positions of words. The sum of the positional encoding and word embedding is ultimately what is fed into the model. If you just hard code the positions in, say by adding a matrix of 1's or whole numbers to the word embedding, the semantic meaning is distorted. Conversely, the values of the sine and cosine equations are small enough (between -1 and 1) that when you add the positional encoding to a word embedding, the word embedding is not significantly distorted, and is instead enriched with positional information. Using a combination of these two equations helps your Transformer network attend to the relative positions of your input data.\n", + "\n", + "### 1.1 - Sine and Cosine Angles\n", + "\n", + "Notice that even though the sine and cosine positional encoding equations take in different arguments (`2i` versus `2i+1`, or even versus odd numbers) the inner terms for both equations are the same: $$\\theta(pos, i, d) = \\frac{pos}{10000^{\\frac{2i}{d}}} \\tag{3}$$\n", + "\n", + "Consider the inner term as you calculate the positional encoding for a word in a sequence.
\n", + "$PE_{(pos, 0)}= sin\\left(\\frac{pos}{{10000}^{\\frac{0}{d}}}\\right)$, since solving `2i = 0` gives `i = 0`
\n", + "$PE_{(pos, 1)}= cos\\left(\\frac{pos}{{10000}^{\\frac{0}{d}}}\\right)$, since solving `2i + 1 = 1` gives `i = 0`\n", + "\n", + "The angle is the same for both! The angles for $PE_{(pos, 2)}$ and $PE_{(pos, 3)}$ are the same as well, since for both, `i = 1` and therefore the inner term is $\\left(\\frac{pos}{{10000}^{\\frac{2}{d}}}\\right)$. This relationship holds true for all paired sine and cosine curves:\n", + "\n", + "| k | 0 | 1 | 2 | 3 | ... | d - 2 | d - 1 | \n", + "| ---------------- | :------: | ----------------- | ----------------- | ----------------- | ----- | ----------------- | ----------------- |\n", + "| encoding(0) = |[$sin(\\theta(0, 0, d))$| $cos(\\theta(0, 0, d))$| $sin(\\theta(0, 1, d))$| $cos(\\theta(0, 1, d))$|... |$sin(\\theta(0, d//2, d))$| $cos(\\theta(0, d//2, d))$]|\n", + "| encoding(1) = | [$sin(\\theta(1, 0, d))$| $cos(\\theta(1, 0, d))$| $sin(\\theta(1, 1, d))$| $cos(\\theta(1, 1, d))$|... |$sin(\\theta(1, d//2, d))$| $cos(\\theta(1, d//2, d))$]|\n", + "...\n", + "| encoding(pos) = | [$sin(\\theta(pos, 0, d))$| $cos(\\theta(pos, 0, d))$| $sin(\\theta(pos, 1, d))$| $cos(\\theta(pos, 1, d))$|... 
|$sin(\\theta(pos, d//2, d))$| $cos(\\theta(pos, d//2, d))]$|" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "f7c6a09e", + "metadata": {}, + "outputs": [], + "source": [ + "def get_angles(position, k, d_model):\n", + " \"\"\"\n", + " Computes a positional encoding for a word \n", + " \n", + " Arguments:\n", + " position (int): position of the word\n", + " k (int): refers to each of the different dimensions in the positional encodings, with i equal to k//2\n", + " d_model(int): the dimension of the word embedding and positional encoding\n", + " \n", + " Returns:\n", + " _ (float): positional embedding value for the word\n", + " \"\"\"\n", + " i = k // 2\n", + " angle_rates = 1 / np.power(10000, (2 * i) / np.float32(d_model))\n", + " return position * angle_rates" + ] + }, + { + "cell_type": "markdown", + "id": "6107ee72", + "metadata": {}, + "source": [ + "### 1.2 - Sine and Cosine Positional Encodings\n", + "\n", + "Now you can use the angles you computed to calculate the sine and cosine positional encodings, shown in equations (1) and (2)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "7c654219", + "metadata": {}, + "outputs": [], + "source": [ + "def positional_encoding(positions, d):\n", + " \"\"\"\n", + " Precomputes a matrix with all the positional encodings \n", + " \n", + " Arguments:\n", + " positions (int): Maximum number of positions to be encoded \n", + " d (int): Encoding size \n", + " \n", + " Returns:\n", + " pos_encoding (tf.Tensor): A matrix of shape (1, positions, d) with the positional encodings\n", + " \"\"\"\n", + " # initialize a matrix angle_rads of all the angles \n", + " angle_rads = get_angles(np.arange(positions)[:, np.newaxis],\n", + " np.arange(d)[np.newaxis, :],\n", + " d)\n", + " \n", + " # apply sin to even indices in the array; 2i\n", + " angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2])\n", + " \n", + " # apply cos to odd indices in the array; 2i+1\n", + " angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2])\n", + " \n", + " pos_encoding = angle_rads[np.newaxis, ...]\n", + " \n", + " return tf.cast(pos_encoding, dtype=tf.float32)" + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "c822e06a", + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[0. , 0. , 0. , 0. ],\n", + " [1. , 1. , 0.01, 0.01],\n", + " [2. , 2. , 0.02, 0.02],\n", + " [3. , 3. , 0.03, 0.03]])" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "angle_rads = get_angles(np.arange(4)[:, np.newaxis],\n", + " np.arange(4)[np.newaxis, :],\n", + " 4)\n", + "angle_rads" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "e6cb84ed", + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[0. , 0. ],\n", + " [1. , 0.01],\n", + " [2. , 0.02],\n", + " [3. 
, 0.03]])" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "angle_rads[:, 0::2]" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "b73a2670", + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[0. , 0. ],\n", + " [1. , 0.01],\n", + " [2. , 0.02],\n", + " [3. , 0.03]])" + ] + }, + "execution_count": 10, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "angle_rads[:, 1::2]" + ] + }, + { + "cell_type": "markdown", + "id": "537c0575", + "metadata": {}, + "source": [ + "Now you can visualize the positional encodings." + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "a9308263", + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjgAAAG2CAYAAAByJ/zDAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAAEAAElEQVR4nOydd3gUVd+G75nZmt30QELovYsISLOACIoNK9h97X76qogVK1iwvEpV7IqKvYHYKBawYAEBRXoNJYWE9GT7fn+c2dnZkEAKaoRzX1cuD7MzZ87MJu7sec7ze5RwOBxGIpFIJBKJ5BBC/acHIJFIJBKJRHKwkQ84EolEIpFIDjnkA45EIpFIJJJDDvmAI5FIJBKJ5JBDPuBIJBKJRCI55JAPOBKJRCKRSA455AOORCKRSCSSQw75gCORSCQSieSQQz7gSCQSiUQiOeSQDzgSiUQikUgOOeQDjkQikUgkEgCWLFnC6aefTmZmJoqiMGfOnAMes3jxYvr06YPD4aBdu3Y899xz++zz4Ycf0q1bN+x2O926dePjjz/+C0Yfi3zAkUgkEolEAkB5eTm9evXi6aefrtX+W7du5ZRTTuHYY49lxYoV3H333dx00018+OGHxj5Lly5lzJgxXHLJJaxatYpLLrmE0aNH8/PPP/9VlwGAIsM2JRKJRCKRVEVRFD7++GPOPPPMGve58847+eSTT1i7dq2x7brrrmPVqlUsXboUgDFjxlBSUsIXX3xh7HPyySeTnJzM22+//ZeN3/KX9fwvIhQKsXv3buLj41EU5Z8ejkQikUgaMeFwmNLSUjIzM1HVv04I8Xg8+Hy+BvcTDof3+Wyz2+3Y7fYG97106VJGjBgRs+2kk07i5Zdfxu/3Y7VaWbp0Kbfccss++0ydOrXB598f8gEH2L17Ny1btvynhyGRSCSSfxE7duygRYsWf0nfHo8HZ3wKBCob3Jfb7aasrCxm2wMPPMCECRMa3HdOTg7p6ekx29LT0wkEAuTn59OsWbMa98nJyWnw+feHfMAB4uPjAZi79Hdc7ngW9T0BgBZfzmN05yQA7swcyFt9TgJg97RT6DldvDH9h3RmT7EXgJ0b8kV/KXFG3xu+/ZIvXrsXgP+b+RMz1z8DQJthPXh/0gIAUm0anz/8LAA/fbMGgGOH9+DZY5zi3N3O44FnLgCg50ca7Y8+GoAJnz5M
OCgUxomj7gPA7rIy7vk7RR/TrqPd/9YDsGP6qdzT/1oAPr1oAp17iz/KLrdfzYm/fi2O1b+JfHLU8Xx60QQAOvduwf89fRsAj1/yMM/nzAJg2furmXDSraK/hPkAXJvxH/bmiD+iN4teY0zcRQCc9+b93PbmjQCo/c/ijlbHA/DZqBtZ3FVc7zF/dATgokuGccXRYmz9zrqbzxM2AXBT77EMmC7O92juMpaddCoAtjgrAEe9/jy3dxDbyl94nSF3XgPAXUeMYv1l4v1oPWMH3737IAAXT/me5KYuAC4f2h6AO+57ma2v/geAjLMe549TRd/Hr+3GtKWvAPDcbVPpOV7cx8h9m3vUEJ7YuhCAyV1GctOzlwDww4Mf07RbGgDxLZNZ8cFqAE59/nqevHAqAHeu+QSA2zucytgtYjp3RruBDPxe/G6sO2Ekec+/AUDvsZfz1tjJAIx/S7zftw29g1nZLwNwjnqycT/7L01j/eXi97rNjB3snnIyAOk3fEzuB+I+pp/7FAC7Fj5F8+Fi2++fPMGRZ4vf16/emsCI/zwGwKwZ47j69ucBmHj/ZQA8+MQHXHm1uOdvvPkdw0/rB8C3X6+nS5/WAGxZk0dGmyQACvMqiHOJexoMhQDwe0O4Eu366+Wkt0oUY9pUQPvu4n+I65ZvZ8Cx4j364as/ARhxypF8MUfcr3PHHMu7b4r34qorRvDCC58CcMtNo5g8+X0A7r7rfB6Z9BYAj9wv3p/x97/C5EeuEvuOf56Z//s/AP7v1qd5ZdrNAPznxsnMfkbcm4uu+x8A771wB6OveQKAT165i9MvfwSAL167l5GXPQzAotn3MezCCeJ+vDORIec/AMCSd8Tv33Hn388P74n2oHPvYemHoo+B59zDzx9NEu/h2Xfz68ei3e+suwFY9vEk+urt3+Y8Su8z7gBg5SdPcKTe/v2TJzhCb/8x73/0PP12AP78VIy/+2m3s+Yz0e526u2s09tdqrTXf/4kAJ1PEX//G794ko4ja9/e/OWTtD852gZof/JtMe0t88XvYLuTbq2xDbBtwVO0GVH79vYFT9Ha1AZoPeLWOrcBdiycTMvh42La4aCf4Jr3jM+OvwKfzweBSizdRoNmrX9HQT9la95jx44dJCQkGJsPxuxNhKqzQ5GVL+bt1e3zVysm8gGH6I13ueNxxSdgV8QHvdMdb/xC2FBRNBsACW4Xqk08fFidbixecRtVWwUAmj36gKNoNtzxCcZ2t0XsG++w41A0AOJUDavTLfqwin5tTjcJ8XpbUUlwil9GxWJBc4gPZpemEUb8IkW2WRxW4lTRb4LLiWKxG2OOXJdqi8PqFPvbFRWXW/yR2jXV2Kba4vTrc+HSNL1vF/F2qzHmyD2IbLM4XGh28cHltlmNPuyKSkKc2FdNiMemr21XrU7iHTa97RD7xrmJ1++Xotlw6fdLs8cZxyUkJODSxHab/npCfLRfv9ONU7+3isVOQpyj2vfCot8zp379isVOgv4/LEWzEe+wGuOM3FOLM3ofI/fNhkpCgmg7FNU4n0vVcFstxj0yvy+RPiLns6Ea121TVOIi74miYosTvxtORTPGHHlPNFuccQ5Vjd7Pqted4HZF2/Fuox25n5F2vKntjk8wfn/i3PFG2+mKN94ze1z09zYyTvPvl2qL3mfNDpo+PoLi9yREEIvDob8eNvZVbZUxfezz9xHnNtr2OHf098cVbTtc8SgWhzFmY/ym9zuuFm2X0RZjd8XHx7bN96sW7ci9r6kdX0MbIN7UV0y7hvewatvot47t/fWbUMt2pK+6tutyjr/z3LDvh/ZfgWJ1xJyzroQj/99JSIh5wDlYZGRk7DMTk5eXh8ViITU1db/7VJ3VOdjIBxwTZ143FcViZ7o+K3DChw9wlPdsAF49qR3zknsBcPkfKbz7UB8Aho25n6JF4ttX/HHiKX/dwml0PUm0jxw1msun/wDApsWf0OdD8Q28rEln1o4TNrm1pTB5VHcA
7tc/VL/4+Ce+7X8uAE5NZfl0MUvS/syH2PzrctH3NUOZeatYqX7J8A4APPb013Q59wgAtn/wGSkdRB/FzftQqc/2tOzclM1/5gHQKhCiX6b48Jj8/XbxutNKeqsUALI27WVLnnhw690xjfSW3QDY+uJyynK2AtDk+E4AdLQnsW2VmHHJ31JA0gjxAZVms1C2SWwPDXYa99tTvIfEDs0BcGxtAsDmPWWkOsWvZTgUpDxXnNuVEP22oZXlE5cm+inZUSr6jUs2Xs8r8ZJoFQ8QQZ8HNbktAIq6kyKPHwCb3YK3UrQT7OJ8oYAPxVeu76vhK9UfWG1OKvUPZLfDQlD/duK2acY5FZ+YRrapCoEK0bY4LPg9AdGHw27cf8XhQm8S1j+MAQIho4lX/4emKPj0tk1VCOptRZ9tCwZDaPo4wr4gqi1y70Io1ui3PnM7rMb+2YdN/5M2DYFgCFT9f47+YAhFb4ci385UjWCo+nYEVVUIhaI+htB+PA1h037hUBhNrfnDw/yaxdSu6RjtIH4QNaSvmg5VqXuf5r7UmO1yHeGhhKJqMX9TdSbcgGNrwcCBA5k3b17MtgULFtC3b1+s+v93Bg4cyMKFC2PW4SxYsIBBgwb9pWOTDzgSiUQikTRS/u4HnLKyMjbpX0hB2MBXrlxJSkoKrVq1Yvz48ezatYvXX38dEI6pp59+mnHjxnH11VezdOlSXn755Rh31M0338xxxx3H448/zqhRo5g7dy6LFi3i+++/r/911QJZB0cikUgkEgkAy5Yto3fv3vTu3RuAcePG0bt3b+6//34AsrOzycrKMvZv27Ytn3/+Od9++y1HHnkkDz30ENOnT+ecc84x9hk0aBDvvPMOr776KkcccQSzZs3i3XffpX///n/ptcgZHBNNuvRFtcVx4QdiQexNTY5h01HiFqXNn8sXPjGBf/SoO3lih3gqTmzRiVXnCRmr0zAhSzlfvIvkNj0AePv6AYZc5UxO54WcJAB+/P5PhqUImSWrwk/yIrH4+MGTbgDgjcdnMHFOFwDuaZXI4nUFYkynd+PKOe8A4D53AjuuF+1r9AWZd2xZRctx5wPw5el30e4OsWB30dYiUnQp49S+LfjfNz8CQvaIz14FwNe/C2nlnqYuenURktG8979nhy7lDO2UhiM0AIB836t4isWiamePEwHoH0xhbsEuAAq2F5PWTKxdaO60ULhWyF+eigA2XUbwVZRgayV+weMSxb3I3VOB01dsvCcVe8WY3EkONH3mXSvLx60vEM5ZJaS28nBUgikv9RGnS1pBvw8tWVyLarWxt1JIRo44KxVlwn6Z4rQa+6pek0RVItqa3WnIS0lxNkNecun3U1NACXiM++kv1yUqp4WAfj6Lw2ZIW4rdYbTDJm3dr0s0mqLg1yUxFfAFgsZ5gvp2zSa+m4RDYTSrLh15gqhaZHsQLCbd3tyuImEEQ2HjG6K5bZaTzO2guR0yt0PVbg/rm1VVMWQoRf8dCIViFxqq1UhM4VDQkJ7CoeA+r9dEVbmqumOrO9+B+vknURWlTvegrkOX8pa4x40JRWngDE6obscOGTKE/ZXHmzVr1j7bjj/+eH777bf99nvuuedy7rnn1mksDUU+4EgkEolE0khRNBVFa4hEdfgKNYfvlUskEolEIjlkkTM4Jn69uzcJ8fEcOWkFAK+NbE/5haLmybE3vs3i1EUANO1+OjP0AknPLr+X146YAsAns0V9l1da3cL4haKGQuI7DxKfKWp4DDz9eB57QUhDFQW7mHyvqE2y6/s1LLlLLMg6fppuXU7OYI0uI/W/7STm6FLU+I5urnUI19PiIidui3hGTVor6qYEPGUUdxoKwLLCSv5zrHAQvfVLFhfo9UZ6dElnwp4dgHBMlS4WdUN2bxaaa4sBmZzYpSkAb+duo9gvpsR7N4snZDkSgMpgmIBHLxzVWri2epTFG7LVrsoAvVomAZCS7mLvJiGx7cwvN8Yc9FZiaS7cX+4k8Xpp
YSVasbATKqrGHq+QeFqnuXBG5JeCXcQ1Ey6vYr9YDFfsDRoSVmWZF2eycCeFAj60ZHEtFpuTggohS9mcFor2CAkq0RF1UYVK9hrn9peL+kbWRIchUbntUReVQx+PpiioMS4qIVdZHRYChovKhi8i1dgchsxlrm8RMMk6nmDUReU1uahCEXeVLo8FAwHUiIsqFES1Rv+kFYvV2K6Yqq1WdVEFTbPR5olpv1lyMr1gdksFanBRBatIUbXB7KKC/UtDZieTpkZlG01VCAerkaLM+9cgQTRUmmhs0obk0EBt4CLjcEPkrX858gFHIpFIJJJGSoNdVIfxA46UqCQSiUQikRxyyBkcEy/1GoVD0djcWUg8ljkf8vKWLwFI3rGWKe/+AsDCnJf45IOHADjh52coTtar9P7vvwDk+wLc6RbSyXP3fMKlr4tifPee0I6MZ180zme7Rjinjjj1N17pejEAjgmzAeh46gR+/+RdAOIunkyb20Qf3g8mk97zOAAem7+e63Qn1rZZ4jhncifmrhcyUY4nwA0dRSXJSbOW02mQcFR1tJcRCgippkeak23zReHAojLRV8tzeuPKFA4oX3mxIU+0sFSSHYxWaY5Q5BASUAdbmKAu1eR6AxzZQpTdT+mYTM5K4XZal1dGcsT1E/ARSBZjSkwVx21bk0doj7AgajYne3Vdp10TFx5dgwrkZuHKEBJV5PVSX8iQHjzlfhy6RBXeEyTkTNT7cxgSVZzLhrdSSFAua1TiCZUVAcJx5S0R+1rSNHyhiIvKSr5+P+wWcT5NgbAu19lUhUClOE61qfjKRdvqchh9KE6XIXOFTBJVtLgfVPqizimfqeif4aIyjVmxRJ1TEYlKbDeVdzeXeteqFPqLcUUR0zY7qsxOKxBT57EuqthCfSAkKqNsuxrdrmlRF5hi+ppldleZ0aoJNDyQm6gh7idtP4eapaiaZKkDyVXhULDGon9mDqbo1VgUNOnUqhtyBqf+yAcciUQikUgaKYqqxqyhqzN/Ydp5Y+fwvXKJRCKRSCSHLHIGx4RDU3AqCg89KRJwT7hyOrcuEUnOj3+1gFZni8To4G0XccNrIq36yYuf487PRIXHiSMnAvCfs7uwcJRI8F1d4mXGAOGMyn/6dlLaiTwrzebkvx+LZOSL+3U0it8t+ENIOfe/3otLf/gOgFlrijm+fyYAy6fP55hbzwDgmy9W0XNUZwBWfbQWgIzRF/PqN5sB6KsppOwUstqejb/T4UKR+xFa9gX2eCHxtOzWgm3fCEmoouVuANyDz8eJKLYXDgVx6vP16tbf+MMhnFaJVhVND9vcXChcQ0clBox7WewP0SNdyFwJHTJY+a04x7rsEkY4olOmZRYR/tamqXCGrf1hNYFsUVjQ6nRTrmsm7ZPj2KXLGr7cbOIyhPRWpss3eeU+4x56KnzEpYqxhXODRk6VZneyp0TIUqluG4FKISvFWaMST6i8ROxrsUXlJbvFyKJKjLOSr4/doUUkKoVwpXBkOTUFv8lFVeaJFPqzRyUqk4vKZ7IneUxhVEb+lKIYcpVNVYyieYoWLXynRfKngkE0W0SK8sUU94txTimx32vMDqkQNRT6q0GKina5n/ypavavjn1lKf0aq3FFabXIn4IaivvVQmI6EHUqtldHoakR1RX8x6jPPThUb5uUqOqPfMCRSCQSiaSRIiSqhjzgHL5CjXzAMXHRHwtISIin8HkxE/M/Z1tj9uKC5TNJ/E2kf/83czhH3yQWCId4jlkuUf/GoX/t6PLy2zybcCQAZ3VM4ferxGzP0iVZXPj6/wFgt6i89MJnAKzdeARP9hJRC+//lg3AKQkFdBh8LAAzP/6Tn8aNBuDx86Zw+wkdxb7TXqLt9IsAePFVMWN0dP8WfP2ZKJl9Sbyd3I/EQuXyPQGcJ4wFIOupR0hqPQKA1sPdfDpnAwDBdLHQ19emH9oKMTarK5FMPeG8/Lfv+SmzNQCZDiv2eDEzsiJbzHr0q9yBqs8a
+EJhmseL46xdWpPjETV9tmWX0iRRLABWVI09epRBx3Qxg/Nx8R4qskr1c6caMzTNE+yU6Qtry3btIXXA0WJM+sxKdqnXeK985aU40+Ijbyshh1hkbHW4ySsVMzgpLrtRx8dpMaVplxYBoFqs+EsjyePRRcbpjuhiXbslUgcHQhVizDZVIaBHNVjdNiOqQTNFNYQt0agGc+0br6n2TWTWxqlgJIirSjRNPFIHJ+T3xSwsjrRDoUojQTwcCsZ8i6taB8c8cWJeZOw3xTaY2z4j0dxU70Yz18SJnSExFhwr0ZkdxYheCMfEJdQUyVDbqIaa08T3e1hMHERV6jKbUJeZHYmkNjQ4qkE5fGdwDt9HO4lEIpFIJIcs/+gDzpIlSzj99NPJzMxEURTmzJljvOb3+7nzzjvp2bMnLpeLzMxMLr30Unbv3h3Th9fr5cYbbyQtLQ2Xy8UZZ5zBzp07/+YrkUgkEonkL0DTUBrwQ0NyrP7l/KMSVXl5Ob169eLyyy+PiVYHqKio4LfffuO+++6jV69eFBYWMnbsWM444wyWLVtm7Dd27FjmzZvHO++8Q2pqKrfeeiunnXYay5cvR6vjG9t7/FeotjhOmzUPgB9z/6DNbpEKfnufa1lWeQoA97RL5qL7xeLjzdNG03HSewCsmXQ6AOe+vZbL9LTrYW8/xF0DRDp5WSDE9OPSACFTPHWvWBi8vjCXgU8I6Sr3/McA2PbYRG67WkRAXHPzdHz3TQAgx/M/uvu3iT5CQUp7jNT7FtEQ1wxqywczXgWgZ+901n+oJ7yqR7DD0QqALQs20vyCKwBwHdOD3Z7XAIxFw6vzKmn9wxLxepPWdHALqSN76WqWHiGW2N7ituFqIvr7cZPY9h/3MmPxMkC8V8Qe0KEbhXrcQ9GeChJbi4XFWqWTbfoC5Xap4n75K0oo3bEHALu7o7G4NzPezh49Qbsip4Cmqc0AjNf3lHtx6YuQ/RXFODsn66PwUh4QsojF6aagTEhUvVomEdBr9sSZJCp/SYlxL3xlukTltOLXJaV4W/R3yq5/PdAUhZC+yNimKgQ8+uJkh4WgXqdHc8YZMlfYVJ/GH9p3kbG59o1NVQiZEsSjdXBMtW9sVlM7uuA4RpYy176pMt0dNNW4CYcxgv1iatzUlCBew/bq0ohVNVrHxyxj1RTnYKlTzMP+FyLX9hiAUCj4l8Uu1DSkv7s0TEO/2R6qC3obIw1dZNwgeetfzj/6gDNy5EhGjhxZ7WuJiYksXLgwZtuMGTM4+uijycrKolWrVhQXF/Pyyy/zxhtvcOKJJwIwe/ZsWrZsyaJFizjppJP+8muQSCQSiUTS+PhXrcEpLi5GURSSkpIAWL58OX6/nxEjRhj7ZGZm0qNHD3788cca+/F6vZSUlMT8SCQSiUTS2IjM4DTk53DlX+Oi8ng83HXXXVx44YUkJAiJIycnB5vNRnJycsy+6enp5OTk1NjXo48+ysSJE/fZXrJzPYrFzvn9mwOQfcbJ3HTZ/wC4Oi2OZ94SstQJv36G5wyx/ftj76L0xQcA2DVayEtfj5nIB+/eDcB7ai/D3TM41cXWO68HwJmaSJMuAwHI3/ArW7ufCcCxNy0FYO6T33DlBDHNf2nxHl7+Taw9ahNnJWfWTABS2vXi7dW5gEgFB+if7DfiErr8ZyCv3vERAHEDM/lST/T27ShhaG9RV6ewSRtDOolLFdu+2pzPSd8L+Sy55VAyuglZLXt5NrlJoj5Ok+5pJDUTMtHGbUUA7A2uIy61p7g+TUHLE3EVtOpquKFK95aR3E68X7ZtCWwprACgb6ZwOgW8lZRmCYnKmZxm1GhJcmgkxos09LLsYpQUce7I2HfuraSd7mryV5YR11ScQ1H3UOLV68jEuSguE/JRkwQ7Qa+4T6pXOKAAvIWirVoT8et1cMwuKrfdlNbtjyaIhz26RGXVosfFO4xUcMXuIGJQEi4q0Q6YitB4A5F4BoUSX7QdDEScSmo06sCUIK6Z
EsRVk/wVE9UQm4eAmZh4BnPtm3DYqKAaCoX3ka5qTBA3O6fUaLs6zLV2wuFwtbJSdQ4nTVUImRLEzdtrkp7CoagEaOxfD61FM92+2FTzuvfVmKlPLRpZw+fgo6oaqqyDUy/+FX+Sfr+f888/n1AoxMyZMw+4fzgc3m/eyfjx4ykuLjZ+duzYcTCHK5FIJBKJ5B+m0c/g+P1+Ro8ezdatW/n666+N2RuAjIwMfD4fhYWFMbM4eXl5DBo0qMY+7XY7drv9Lx23RCKRSCQNpaGF/hqUY/Uvp1E/4EQebjZu3Mg333xDampqzOt9+vTBarWycOFCRo8WhfCys7NZvXo1TzzxRJ3P9+ULN+GOT8BpuROAKek9WeR5CYA5q+fTcdIKAI6buYbLb7kcgKvvf5+BF10IwLmPfA2IMv8fJw8BYPy0xSy4VMQztDxpEE9e/BwAiVaNi94UkRAffpbK/729EoD5Nwi5a9V9X1D4yuMAJLXpwavz1gHwQv9Mfn/1JwA63XgBbywUMtCjHcQDXuDr2YbUlDDyXDbf8A4AqR168e7S7QD0rvBxYU8h8fyyq5RE3ZGT3KarGMOqbHouFxJfs+OTae5uB8DXL/xE4S4hlaUf1Yq0pqKY3u4thQDk78nC3fRk4/p8G8T9Uk+8wpBkPIU5JPYVEqCjsAnrsoUkdGonIYMFfZWU7hYF+FytHcZ7k2SFuDTh8irPrSDoEr8LkX5ziivprV9H0FeJLVW4uRR1L2V+IU3YHVY8FcIZlWi3EvQJB5fiLTfO4ysVkpnV4cajxyw4nFbjPG6Ti0rx6fsqihHxYHFajOJ+cU1t0XgGh8tom5O9I9s0JTZN3BeMuKiIOqdsarTon0miMhf6wxJ1VCl60UVR6C/6p1610F+4iiwVwR/jloq6MWpKEDe3zQX9QmGTXCVuf0xxv+pmW2sqvFeXQnpm+UithXZSnXMqHAzG9FNbZGC25GAhXVT15x99wCkrK2PTpk3Gv7du3crKlStJSUkhMzOTc889l99++41PP/2UYDBorKtJSUnBZrORmJjIlVdeya233kpqaiopKSncdttt9OzZ03BVSSQSiUQiOfz4Rx9wli1bxtChQ41/jxs3DoDLLruMCRMm8MknnwBw5JFHxhz3zTffMGTIEACmTJmCxWJh9OjRVFZWMmzYMGbNmlXnGjgSiUQikTQ25AxO/flHH3CGDBlSbUGwCPt7LYLD4WDGjBnMmDGjwePZO+ZsvJqF0S3PA2D1E6P4sVBILv2n/MGPD4lZoSbH3MAPb14KwMtZa/nylYsBiD9OPKCddfN13PbElwDkrfmBNt+LQnqbS8Pk+0SG1W5PgGlDhFSTkeTg4UfeBODVgZGsJws/T/kGgCPuGcOvH30OwJFjz+Gh86YBMPakzlx739sA9LhUOLLWv7GAjO43AJDlbm9cW6cjMlizTCym7hIMc0SKkHPG/pJDb5eQMpq1FTLXzo0FrC8S8s3x3dJp0l4kiG+b+h3le0QqeNOzu3OkRchE639cBUDBxgJSeohMqXS7haK1YnYueFz0D8xTvIfETm0BcGzOYJ2eY9UkLvqrWJotJKr4ZKexTSvNw6UXT8xfX0DIFStX5hZ5SNaL3AV9HrTkpgColiz2VkQK9lnw6A6nZKeVYEC0VY8Yg6Jq+EqEXKXZnFTqupTbYTHcRfE2k2PJ5KIKmBLE/bq0ZY1zGhKU6nRFs6isUenNLAP5DblKMQr9aYpCKNK2asbfhFHoz2Mq7hcKoVijCeKRLCqIlaVCVcq0xcpPGI6NYKj6ZPEDuahqIwcZ4zIXBwxV76KCfYv2Wao4p2ru/5/PhqpJrqprynjVvtSY7VITawh/VWHHg4F8wKk/jXoNjkQikUgkhzMNDdtUZNimRCKRSCQSyaGDnMEx8dnvedgVlWBTIT280ue//NxTyCTu4fezevEkALqNHM+7x4uCfac8/iqrzjsbgI5DbwHgjVGtcE8Tbqn0Hsdx6fuiaN62
rGIe7d4EgKxdpeROEvv/987HuSt3GwDT3/0dgFePb8Xbi7YC8OgZ3Rn88isA+Ibdx17fZABObWmlOEv0nTr9JgA+f3Q0A0a3BODjtXlGAcDB/Vtx3TtzAUi0qih/LALg1z/cjGkvpKkTjhDOqmeW/MxuXWY5vUMaNvUYAPZ4J+MpFrlTjl4j6V8ujnu1QDircrLLaNNKFOxrGWelYLW4ppISr1Hs0Fdegq2NkNPcSWGK8oUTyVISLcxYpMtjaalObLr8oJXm4m4mXFtbf9lNic9UnQ6oKPPhSBbST8BbiZaaIY6z2sivEFKU3WmldK94b1OcFkJ+sZ1Kk0Slu6g0u9PIuUqKsxkuKpdNM4rDRVxUNlXBXy76tTgt+PUMK81hclHZosX9wlpURtJr+6EpipFFpQKVvoDRd8RFpdq0qItKixTgixb6E86pqIsKS/Q85mJfwXBUVoKqzqnofTVv9wVDNbioItlSZrlKIax3o6qxRf9CpmKAxtAOUNyvTs6pmiSuGs5XkzRxoByrv4vI+OpyD+o6dClvNW6M0MwGHH+4Ih9wJBKJRCJppMg6OPXn8L1yiUQikUgkhyxyBsfEhA9uJ8Hl5PIe5wJw9Kg76bzjOwBOnPAib/yfKOj308Kh3PqYkCfeGx7HTVdtAWDe7CEA/H7RmbQfcjMAt17Qi9vufhEQ8syxL98LQOHir3jvIeG0GqWNJ61TPwC2LhUJ6r2n3sTshULC6uNbjzu9DQAv/7bbkJ18n840nuzXWFoBsKrYwzWDhBPr1rdW8mArUfm5V7tkPMUi46mT20bel+LcOZsG0GZYJ3GNHYV89kTeDiM7qmfTOLyWboBw24R055E/swe9SiOyk8inyqrw07etKLCX3C6JvZtEAcBN+eUkWnV5w1cJGaJwYGJaNntzhWNKK9oFgGqxsUfPjmrXxI1V14MC2dtwNRPOqb2+IIXeSFaTeO8qy7w4dYkqVOlDSRQuKs3mIF93UTlcVvJ3Czkq0W41pv1DpYXGub0lXgCsTWyGiyrRacWryzUOi2LIHeYsKn+5kNUsDgu+cnE+q8thKvQXF3VRWaJVtM2SkMeURWUu+hc0XFSqMeaYQn82c6E/k4vKlEUV1kwuqirmRFMclp4/peljC1crSwWqcU5VpToHpFkOisRhhasUB4zNlYp+/9KqSDXmzClNVQgH95Vw6uqMqW0ulbnfmto1UZshScFIYka6qOqPfMCRSCQSiaSRIh9w6o+UqCQSiUQikRxyyBkcE2eva4fF4eKJyWcB0P7YW1gyfT4Anwws45fjhAz007En8H8XdAfgy0EX0U8vSGd5SjiZXp+/hXdfOh6A7kUruDEigVhtzHf3B+DY/xvO6tuFq8n2zA+c9vyNALz2+G8AbGx9AselxYn2E0/QfuB1ALw6bx0zewn5ZcXT80lucwEAM3/YBkA4GKavW0gnWX9spuMZPQBI2/MHqi5fdOnWhC1f/glAsT2RjMuOAyA1Q5wv6Ks0pusTizbzhypcWTZVMb4N7KhUaZskJJCIbJXvC3JypnBRpXZJZe2nGwFYtaOYLvaojBJIFvexSVoxO9YJacq/swgAq9NNsZ4d1a6pi726WyiQk4W7uZDQiv0hCiLF+3RJo7LUhytdjD+8JUgoLgkQBfvyy4TslBRvZ0uFKOQXZ1VMEpU4t2aNSlS2FhZDokqKs7JbV1Lsmmrcm0j+lE2FQESiclqoyBf3P8ZFZXcaUlDApN54TFJUpS9oXFOk0J9NVQy5R9VU415rzuj91IyCfoEYWcqcP2VoQrCPrBQ2FfrzB2MlI/MxahW5qmpxv2rdUqpJhtJiJano0GpXtK82mKWrg8XBKAL3TxUcbCwGqdo4terz1jeSy/tLUVWlTsUz9+3gcLhL1SMfcCQSiUQiaaQoqhLzJaA+xx+uSIlKIpFIJBLJIYecwTHx89tvoWg2XlkpXFEr3+hFSTvhqJp+zH+5InslAHfFd2PCV6JQ3tNJ3Zn2
1YMATBw5EYCWTiutPn0cgMVPL+KIcx8BwGLTuP2FnwHo368FQ1KEtPXT3koeGymcTAsWiLyrWz76g1euFwX2Ppu6mP8u6grATbc/y5H/NxyAZ/7vTdqfKaSyb77bBsCFDguBr94AoHBbNs3HnQPA3s/fx53RBoC2/RU+fUrkXHnb7EXrLfqz7RaZUqrFRhNdUvKtWsLStFMBSLNp2FxCglqVU0rbpAIgKm+UBUJ01K8prntbst4TRQj/3FXMMGf0Vy3PI2SKrs0S+KFQFPjz7RTSkTUugRLdTdQz0UlQd1+V78zG3V5kWJUHQ2SXCinJqUtYnvIKnKkiq4otEHImif6cbrL1woGpbjuBSuHacpkcSaGSAuO6vcW6ROW0GPJSitvGbn3sTpOLKlRRaowhUujP6rAQiGRRuZymQn/RXC2fSQaq8EedUxGJKlEBry+6PahrWqpNNdxChnMqGHVRhUJeI4sqHArG5E+Z5apgFZUoaM6ZMrmoYjOq9s2igqijSpxfH6fZLaWY5aqojBWZcjc7p8wyTjgYW+ivqnSl1ULWMruianI71XRsdZtrkpmq217XnKnD+Eu25AAoitKgYoyHcyFHOYMjkUgkEkkjRdHX4NT3p74S1cyZM2nbti0Oh4M+ffrw3Xff1bjvf/7zH+NBzPzTvXt3Y59Zs2ZVu4/H46nX+GqDnMExMfaBm7HHublmifiW+XrnEVR+9gUAlcE5DLxPzNq8MbI9Jz4kZkBeOakdb7jFgmKH/ot0+aTTee22DwHYUObl7a8HACKpucdIkThesK0rT4wfAcCeCV+gvSdiIEaPEouGX33pc5q+cBcAv02Yz7ROIqbg6sIc7Gc8BsC2y2Zx9YkdADGzA9C/ZQJrZ38NQMDTEk93MTuz+e4pZPS6FYAmJ3Vjw0SxeFpRNbYjIhcyfp4FgDM5nTZxYrFq7ne/sKhzXwDOcdtwpmYC8MOWvZwWvwLAmNUByLCJ2Ytw167k+z4BoDC3jCS9Ho+Gk536Qt4uGfH49AXYJVvFLIo9vqexuLdNkpNim3gGL9uVT9IgcZ/LAiHy9FRwlz6D4y8vJq5jsj6KMB5V1JqxONzsKRV/QO2auPF7xAyO2xa1TgZKRB0fzebAn68vXrZb8FeTIO6wqMbC5nClmHVyatE6OFaXlaA++2KJiy4sDlvtxkyJedbDXO/GnCAeiWfQLGo0TdymRevgWKORDKrV9GdsXmRsKtFuns0JV4lqCJpSL2IWGVdJEI/O7OjHa1XiGUz7V7eYuDZYGjiVUZdkcfO/Q3/hIuDGMjvT0G+zjeQyDjsUpYFrcOoxg/Puu+8yduxYZs6cyeDBg3n++ecZOXIka9asoVWrVvvsP23aNB577DHj34FAgF69enHeeefF7JeQkMD69etjtjkcjjqPr7bIGRyJRCKRSCQGkydP5sorr+Sqq66ia9euTJ06lZYtW/Lss89Wu39iYiIZGRnGz7JlyygsLOTyyy+P2U9RlJj9MjIy/tLrkA84EolEIpE0UiIuqob8AJSUlMT8eL3eas/n8/lYvnw5I0aMiNk+YsQIfvzxx1qN+eWXX+bEE0+kdevWMdvLyspo3bo1LVq04LTTTmPFihX1uCO1R0pUJq5cNpN4h40vzpoAwO6pw3j89ukA7P18AhNvmQNA828+YcNQUbemx4+fc9qFMwFY/ZBYjJs3+gE2jBUSVXuXjcR3xCJke3I8cbrEU7RtNfbZzwBw6u9bWHyv6PueTfcAMPX+tXy6R0zdpdg0fB9PBcCd3oYvdwtNIdGqck7XNEBIVwDdRvdmwcwfxPl69GLBZiEBFS3Ppu/l4tzB7p2NKAZncjrfbt0LwOBFywBIanMpbfYuEPfgp61ss4k+WrRLIjFTLPRdvimfElX8cjqSRb82VcG6Z5O4mR2OMOrZFBdUkNJByEe27EQ2FoiYi3bJcfj1Rb8l27L1voZHF/c6NFL0WIqyXQVoTVoA4AuFydL7yLCIZ3RfRTFx
GSn6O1lAqR7lYHUlkq9LYv3bpxL06jVq/BVE8BaJxcKaPcOIWTAvMk6wR/9MtIDHkKgii4ytqoq/XJzD4rTi0xcOK3aHIXOFLdE0cfMiY49Jlio31cEJmmQpQ66yakYdHIsjupjYkKuCwdh4hphFxuY0cWIIETYSh0VUgy77BUPG9mA1sQ0xCeJKbB0c82LiyPgVVTFkr8j/dMPhcLWyUtWFxZF2qJqohsj+1RE2RWEciNrWu9FiFinXuvt9aIxrP+ujhDQWCe5QRVWUBtViCuvHtmzZMmb7Aw88wIQJE/bZPz8/n2AwSHp6esz29PR0cnJyDni+7OxsvvjiC956662Y7V26dGHWrFn07NmTkpISpk2bxuDBg1m1ahUdO3as41XVDvmAI5FIJBLJIc6OHTtISEgw/m232/ez975rd8LhcK3W88yaNYukpCTOPPPMmO0DBgxgwIABxr8HDx7MUUcdxYwZM5g+fXotrqDuyAcciUQikUgaKQer0F9CQkLMA05NpKWloWnaPrM1eXl5+8zqVCUcDvPKK69wySWXYLPZ9ruvqqr069ePjRs3HnBM9UU+4Jh44rGvsKHy+pfiybZ4yWSenyze5Kt2dmbgxc0BOO6eBXQ9SdTHOe6xxZTs3ADA7gvEKvJLJn/HS8OElNNmeA/evke4iZrYNU547FUAflq4ipvnijoxLz/0MM/MHgXACXP+Bwgp6rGPVgPwaOdUlk0Rbq72Zz7E5PliFfrt6W6sP74DgD1eyDOZF57G6seEiyqlw1G8rEc4dCv1cUEfIfH8lufBqRcJSWzVjY9+E3EJrX8S/828LIUWNjGdufy91expK1LIM/u1oGkL4ZjK21nCnoJN+lgHi76sKr4NImpCHXCmIfGU79lNcg9xbltZMutyhLRzbOskkS4OlOwU21y9oivqU+MsuNJFbZvS3WUE3UKO84XCFBYKiamzVY9yqCzD2UTcA9VSSolPSBN2pxWPHuuQGmcj6BNuJ9VTYpzHWyhkMovNiU/f1+60GuOPN0lUir/CWLgW1qMaLE4Lfl3acqbFmxLEXdEEcS0qHflDYaNGizdoimowauJASNeRNJtqyD2iHXFRRaMaFIvL1I5KV+baN2a5qmpUQ0yNG9NL5nbE4bXP/gdwSymqQljcmphy89V9E6yu3k1k+/6osZaNWeIy7VJTfRzjfFWSyWsrQzVGyUny7+fvrmRss9no06cPCxcu5KyzzjK2L1y4kFGjRu332MWLF7Np0yauvPLKA54nHA6zcuVKevbsWafx1QX5gCORSCQSicRg3LhxXHLJJfTt25eBAwfywgsvkJWVxXXXiUzE8ePHs2vXLl5//fWY415++WX69+9Pjx499ulz4sSJDBgwgI4dO1JSUsL06dNZuXIlzzzzzF92HfIBRyKRSCSSRkpDwzbD9Th2zJgxFBQU8OCDD5KdnU2PHj34/PPPDVdUdnY2WVlZMccUFxfz4YcfMm3atGr7LCoq4pprriEnJ4fExER69+7NkiVLOProo+t+UbVEPuCYuPGK3rhtVhYpQwAY/n0ii6cPBaD7yeMon3cbAK7T32T3t+JNbHbcfxl06WUAnPuIkIayln7KUfOFFJVrSWP1nfOMc7wy5ggAHmniZtYrotjehz3OJ1GPJPhh4hwAul35FH9+9S0AR995Gk9eJZ6U/3taV+56TKSQ97qiPxtf/QCApt2uACAvs59RKK/dES1Yv1K4k1oFQhzXWuiv98zfRHuXkDJadMpg6/p8ANbvFlLN4O7pZLYXxf3eevk3SncLKarZ5UfQXRMy0Lzla8jbKo5LPVUUIUy3WyhZLVLKgwMuMq7ZU7yH5C5tAIjb3pzVu0RhvaZxbQ35oWSnkHvih0YjDSwlObibioTwwq1FBN1NjNci8QspesG+oM+DltoeAEXdaaSNO1xWPHpRwGSnlYAuianeUsMVZLiobE4qdckoMc5qyEuJZonKV2m4qAIVoi+Lw4Jfj2fQHHbj
/isOlyFXha1R6c2k9pgK/cUmiBsuKqvJRWXTCPuqRDWEQij2aN+KtXoXVchUpi1yenOhP9VIEw8Z20PhWOfU/lxUqhobyWAuEmimagHAcKh6FxXESk9VCwDWHM9wcHSihvZT0+F1jXCo2o8a85rUxBrCwUiJ/ztQVPHTkOPrw/XXX8/1119f7WuzZs3aZ1tiYiIVFRX77qwzZcoUpkyZUr/B1BNZB0cikUgkEskhh5zBkUgkEomkkSLDNuuPfMAxsfiiR3C64vmzexIA8cfeQtFCUaSvw/Fj+byPSOY+9p7n2PEfsbq87TH/x/yreon9jxM5U026DODqRaJ43sasLdzfUcg62/dUUPbkWAAeGvc4U+8XLqmHZq/ghcHCZfTeEqFrPnreEQx7R6SCq2fPJOfiVwAY3zmBa7aI1O9m/7uBb/qK7Kr+k0U+yIdr82ipF8cbeGxbbvrkSwDcFhXLKtH+bmUcp7cT1zi0VzOef+5TALbpss7JXZri0E4AIN/3LJ5iIUU5ep/PsRXCyfRuwW52bxdSU+vWoq9WcVbyfxdJ7CXFXsOp5S0txNpOpKTHp6jsyRMZTvayXOPeF+8VklNaWpwhAWmlecTrrq0dv+VQ4o++V+V68T5nknC8BbyVaKnNALDYneRXCFnKEWejdK+QktLirIT8YrtSWWJILr5SMa1qcbqNAohJcTbDRRRn1QznjeotMxLMfaXl+nEWApV6gnicw5ClVKcrmkVlicpIvmDYkEAihf5UoNKny1wKBPwRKUqLkatCHt1pFeOiiuZSYTFZM03F/ULhfWWlCH5TMTyztOQzyVWxzql9i+eJ4n76adXYon8hUzFAY2gHKO53IOeUmapyVeRYrYbz1SRN7C/H6u8kMr663IO6Dv1w/tD7t6Gq1f+91JbwYazTyAcciUQikUgaKX+3TfxQ4jB+tpNIJBKJRHKoImdwTNx522QUzUZq1ncAnPTQS7x0rZCAfnt/COOmCpfR/NPc3HSTKLb3Zd4wVp0nih91HHoLALde2Itb7ngOAH9lGcPefgiAwiWLeOuBzwE4y3MLTboMBGDzd/PpO1Uc+/Yg8d/BgXW409sA8MwvO2kTJ2QI30dTDNlgjb0tv+luopuOFw6iW95cwYO6W6pXhxQq9YyqrvE2cuYJN9fuzf1pO6ILAKd0bcpTOdsADHnmqAwXFVp3cb5Q2CjGF2hxBH3KIrLTXrJ0SWtA+1QAUtsnk7++AIBNe8oMZ1jQV4nSvDMAiWm72Jsr7qNWtAtVl1RydBdSx/R4rLoeFNi9BXdz4Zza411DoTdaCK9Cl6hcTUWRu1CZDyVZJNNqNgd5unPK4bKSv1s4tBLtVmPaP1hcYJzbq/dlbWIzHFCJTiteXa5xWhVD7lD8UReVv1zce5vLamRYWV0OU6G/uGihP0u0LLpZEvIEonKKt1oXlam4n00ztaMSlVmWismi0kzF/Woo4Aexbil/NW4pgMABXFQgCndVxSwHKeq+LqpQFReVpka/c2kmqaaqdKWpyj4F+aqerzZotdi9ap/mf9fmfLUZ0uH7HVtyIBSlgTM4h7EcKR9wJBKJRCJppByssM3DESlRSSQSiUQiOeSQMzgmup90OprDxW+TFgLwQbt1rD1bSCtfdz+GcTeKJNQPjr6QoU2ENOK7+zJeWyCcQ3PfGAJAq00LuFGXdSx2J++pwmV1whVDWDvuYwC0mT9y0WxROHDGw6v4PVMcOyLdDcCaiZPorLuyXpmzhlf6ZwKwbMoXpHYROR9PfLURl6439HEIR9O2FWvpOvooAFJ3/YpmE4Xzuh2ZzqbPhGurJD6ZZteOACA9w2VIUJHp+sSCDfymioqVTk0xpIltFQptk4QcEgr4yNeLzp3SXDidmvZMZ9VHIl/rt+1FdLdHZRR/iuivRUYxWX8Kp5g/qwCrU1zvXr2vjulu9ugupcDurbhbCImq2B8ir0zIQDZVMYr3xaWJ6wuXBAnFJYl77nSTWyLko6R4
aOcPIuivKi6qKshb5zmVmsEJCQkhPV1YeRctWsRVV10FgNVqLbWjqrLMmzeP9u3bc/vttxMbG0ubNm349NNP1Z8fPHiQlJSUgPhni8VC9+7dTxv/XFRUdEpstUQikUgk1Q4pUVWaSs3g9O7dm0cffZQ2bdqwZ88err/+egC2b98eMMNSVQ4cOMBHH33Ec889xyuvvMLff//N008/jcVi4f7771dDg0qLfz7dWqAxY8aUWinRpUYowXoDn/68F4DN6e1pqyy+7PrpIZZM6Q/AhNvH8/Vm8QLVt/+bfDzrBQBG3TAagCs/nM7gd0Sz+HMhFkJnjgBg79J8Os76HwDHxz2n5tz0eCyWqS/+CEB2QzHrEfboeCL2iWsYzDZ1xsU1730+jxWLmhsEm7FFxAPw3Q6xGPfBiBT0yuzA8UIXjzURC4GtN3Xk0+e+B+CXv5N4IVJIjDq9gcOIqoYrW4iX0/cX/0XeOjHLEhSVyHFl9qh33Uj2KrMWWRs3Yr+0JQAnihYAsPVkLnaTeF45aZlENVEWKq9044qsK+7DHsP2Y0o9Q2IYO/7cDEC0kinj9bhxHRczSSZrsNogHtzSQnax+Fdz7aggMpU/M7tFXM9m0OFOTxbbTHoKM0T2kC3CSravQTzKrubnEByhzuAE1DMUiZkag85fz2DX6yhQnoHVoFdzcMwhSgt7sROT3eofB4tn6/HkotcuLNZk33iNmmZxSsm+0SwmVsfuU2dwDEZzqbM5BqMejy/vRqdTxzrtuBz1DNrFvWfKwSlr7J/x8X/G0mZatNcrScnN+tMsWC41d6eCZQYyP6Zyz0A+tnPAf2ATv1Co1AzOpEmT6Ny5M6mpqfzwww9qUuGGDRu4++67z9rNeTwe2rZty+jRo2nTpg2PP/44AwYMOKXRtKLxz8OGDSM7O1v9SkpKOmv3LJFIJBKJ5L+nUjM44eHhTJw48ZTtZ7toMyEhgebNmwdsa9asGT/88AMgop9BFH0lJCSo+5wp/tlisZyxh0MikUgkkv8cOYNTaSr1ggOQlZXF559/zs6dO9HpdDRr1oxHHnkEu91+1m6ua9eu7N69O2Dbnj171ByeevXqER8fz+LFi2nTpg0gqt6XL1/O2LFjK3y9JksXEhoaRtNxIjtmZqNebDi4EoDQ7s8z8+Ph4hqe6bSY+xYAwbG1mB0l1gCZ9SLv5vu7mxH6kVgrdPcb1zDr9Z8BsLS7lWmZ4qWs6P0/uX2qyNsxXv0gewbNAiAkri4Ak/5Jp/fkDwCo0XYAPQybANgwYQHf9hQLnD9qF09CS7GA+fMl+wC41vsD4bXF4mTzxt+IPfEPAN7et3GoYCYAh3emUv8qcR1bUhzLDoqFyn0aC0lpTPpxji4/BEB47YfV7JhL44LxRAhpJWX9PqJvfwiADGXR7cYjWdS2iF+pgvRjRHUUi7VZeZITDiGLBEUlsOuokI9u6lCTwmwhrYV6CtQ/h/wjovrBFJxI/j6lZiHcSqEircSHWNis7GvzOsX/GvRqPUOIxYgjTbjobFHBOPLFPsbwSPUcXmsoSpQOBWU0iPsWFscb9BQ5xHajzYhLkcoC6xmELOX1uDEGKWN3FjqrZmGxtkFcU80Apy4y9tUzFLk8aj1Dkcutbnc4NXk3vqZwk0au0unw+JrF9f6FxXpNs/iZ6hnAn29TmnxUnnqGksd4Pf6cId95S8OjaRYvjfLUM5RFWdLLv73utjzT56eTicr6kZTXKk61XnPt9YBHvuBUhkpJVOvXr6dBgwa89957ZGRkkJaWxnvvvUeDBg3OGPRTEZ599lnWrFnD6NGj2bdvHzNnzmTKlCk8+aR4AdHpdAwZMoTRo0czZ84ctm3bxoMPPkhQUBD33HPPWbsPiUQikUgk5xeVmsF59tln6devH59++ilGo2KVdbl49NFHGTJkCCtWrDgrN9ehQwfmzJnDsGHDGDVqFPXq1WPChAnce++96j4vvvgiDoeDQYMGkZmZSceOHVm0aBGh
oaGnObNEIpFIJNWff7ts80KiUi8469evD3i5ATAajbz44ou0b9/+rN0cQN++fenbt2+ZP9fpdIwYMYIRI0ZU+VrdBnyEzmRl/JtCfjr0QU+WXNIDgC7PfchLrwrZ6fj0gYy5Vyx0nvT3nzw98jsANr1xNQC7HrmbOl0eA6Dw4d5se+EnADrcch1vThH5MldnF/J+n7oADP11L60UF862nlcA8Mm3WzH8IvJl7pzdlDZXihbXcfd9zOFQ4a5qPeharjLWA+D7mUvFtQ+sp87NNwHQeImZ1LlC+nIN+D9Vksk88A91HhfVCfZfa/LjRiEJ3fOAkL7cTgfH1glHUvzDEerziS46SWzLWABObj1JcbyQwnzn3XAwg8uDlHyd7DSCmohWeaO1gENZIosmLCqILKVBvEFkMM4C4XAyZIl70OkN5B4R2TjWsFZkKs3j0RE21QFVM8zvQDLkCYkr2KDHmSoqJWwRVgqUCoj42rFkKxKUPixKPYdHU8+Qr61n0GTfHHf46xmKNfUMPonK1ybucToxqg3ixehsYtG91+MOdFFpnFNufaBE5QxwSHnVeoby5OCUNtbrA91SvnoGnV6HV3wsTaN36fUMJWWpki6qsuoZtGizcvRl7uMflyZLed3uSlUzaA+paj1DWeeV9QxnD1nPUAZyDU6lqZREFRYWFlC05SMpKUnOnEgkEolEIvnPqdQLzp133skjjzzC7NmzSUpK4ujRo8yaNYtHH330rNrEJRKJRCK5qPGVbVbl6yKlUhLVO++8g16v5/7778flElP3JpOJJ554gv/7v/87qzf4b2IOjUBvsvHEUyKML3vZOwztIrq1/rjBQvRvIqjt49ibseo/AeC6rZ9x/5HjAKRNFM6taa+35st9oqH7zo/XMqqpkCxa9W9DzR6DAegQYSV1nKh2mLuvE88/LtxQre4Usk6vO99gp1KRMPryOngMYmF1SuFECtLF9QzXv8egPDGt+9n/7QRg7Z4MbusuZKs2i2LZ/YMoJt13ZTrxSkifI/ME5i5iHVPN/ckc3C0qDqxJYl+90czeE0JG6tgsFpMS3ufds5a4tnUB2PBXErVzfaF44vmlp+QS1UBIWs78bIz1xWcxB+9hV5pSzxAdxNG9IgW7XrgNd5GQkjwnlXoGs43cY6Kd3BpjVx1adaKCKVIkl6ggo1oJoc8X5ypZz1CQJs4r6hkUCSoiRg3381j9bj9f0J9BpyNLaQ036XRkFfgcWv4GcVOwCZdT2SdY/D5oG8Q9Lic6q6ZBXBlDoIuqWFOpoP0eoMjtbw0vcvslKkexZnsZDeJurSylaId6vQ6vst1g0FPkEZ9Fbzhz0J/ZaFDHBl2gRFVWPYNWrqpKPUNZslfJYypSzwDlrEso5VqVrWcoD1LekpSJlKgqTYVecAoKCnjhhReYO3cuxcXF3HTTTTz11FPY7XYaNmxIUFDQmU8ikUgkEolEco6p0AvO8OHDmTZtGvfeey82m42ZM2fi8Xj47rvvztX9SSQSiURy0SLLNitPhV5wfvzxRz7//HPuuusuAO699166du2K2+3GoASQnc9snHwvYWFhtHpOdCtdszaKmWOEg2tyuweYvEL0Sz32zAckf/4AABMe+ZKObwp31S1viZ8/aDZy6cpJAPzz0wl6znwbgMwvXsOuhPBdd4eNuWN/B+BkPTeJ74t9amfsAsR0uE9Ssv3+CZ/H3QRALZsJq10E8s3Zn8ft4Wnq/gCHCop5paUIEwy64zJmvCZCBuetOswz4f5m62NW0WresxV88rHYJ2+lCMcLikokSXEQXdk4hv1W4fjJXr+WqA6tATheuJRtJ0RPlF1ppc5MySK2pUiQ9q5x44oWUpnVHsNWJdyvQUIYe9aIVvPYYCMel5CBXMeEY8xkCyEnSTirQhpZVQdU/dhgtil/T8MtBlWicqUeU+5Bj0ORqGwRVnKOis9ijbKrEpU+NNIf7ufy/6XPUD6rWa8jI0/cj92gI1cJ+gs2GnAp
92G0GVVZzRwtZiw9x7X9U/llOqfQdFE53YH/p6MN+gtwTpXon9Kfpk1cr9fh9slxBk0XlcG/3WgyBIT+iXv2YjT6m8LL6p8qLeivrHF5+6dKHqtFX8qxVemfOpPEdK4C8qqL+lQeGUz2T1VDpERVaSq0yDgpKYkrrrhC/f6yyy7DaDRy/Pjxs35jEolEIpFc9Mg28UpToRcct9uN2WwO2GY0GtWFxhKJRCKRSCTVgQpJVF6vlwcffDCgqLKwsJCBAwcSHOx3i/z4449n7w7/RRZc0o0gvYHth9cBENZ1MAs/EwWiacN/5e4/JwBgDo3ip6b3AVDomcqige0ACL38GQAefr0PXz8pep9MrW/kG28LAByjhnDnZ9+Kfa9txOZXhFwVHFOLKXuFHNJj4mgAanZ4hF7GrQCsH/s9n/VsAsCktvF83aoLABMX7qGnTjxrn/Rl/mcJNTO2iQ/U9272Pyt+fmjHSRr2FpKR7Wg8i/cL99F1zWJ5N+UQAEeXimyjiLoPqtJQu4QQDJFCZjm+ehctbuoPQIbzHdYeEpJQDaV/Kj/1CJFXi2uwJp0TToPy+Wqy85iQqPq1q8EcpX/K7vX3T+UdPKI820TyD4vtIeFW8pVgvuahVpRPRTBOQhRJxe2TqKxGCk6Ka9iig9T+KVNUtNo/5bHZVRdVefqnfM4po82oCfoz43YKiUqVpVxOTCGKLOXOKbN/ymv0j12luKjK0z+lN4p/YJTWP6U36MvVP6UN/YPy908ZVEmr8v1TJWUi7fdn6p/SIvunyn+MpHSqi3R4RqREVWkq9ILzwAMPnLLtvvvuO2s3I5FIJBKJxI+saqg8FXrBmTp16rm6D4lEIpFIJJKzRqWC/i5UDhe4sOo8/Fr/MgCuGvEpg4dOBODkD0MZfpMIAJy2eTUPvfwVADvH3cS2u24BoIES4ucY0JNtr7YEoPs9N/HaRFE+em1WIe/3rgnA4J930zFCSD87r+7J+9P/AcA5by8A9/94Ce2vE43oY29/j0PBon+qzbM3cp2xAQDTpy1i6541ANS/W3RVNVtqIWXWlwC4Bv1PlWTS9myg7pPdAYj4tRbfrRdhenMfbqdKLklrhNyT8Eik+kyiC1OIby2cUSn/nKCJ0j/l9HjZfjADgG4hQjYpyk4jpEUbAIzWtRzM9PdPZSiOqwaRwRTlCWnLkHVMlWi0/VMZOSLgMDEmWO2Oqm23qfdkyEslWNEltP1T+SeFtJXQvq4qsRkiYv39UzZ/uF++y6M6fDIL/S6qpHxxbZtBh9Nxav+UJcyMp1DIX1Xpn/K5qCraP3W6Lqry9k/5Qv8q2j9lPI2LKmD7OeifKo8MBeemf6rkeWT/1NlD9k+VA49HfFXl+IsU+YIjkUgkEkl1pap1CxdxDk6luqgkEolEIpFIqjNyBkfDc+u+Iiw0hFca3QzAnKYHqRsjAvGed3TlkqD3AWg3dxT5J4VU8M+1rzNnSAcA5qb0AODW9/7ig441AGh/fxvsH4kgwJ4xQSS9MhCAecldGflCLwAuu6cNHfq9BMAeJWjuvSvqkOd8CICUwv+p/VNc9xpP5Qu9YdKonaw6IOSeh3o3AqDNX4ns+lZ0Su3olUotm5BFHJkpmC4fAECdQ4c5sF1IO+Z9KzGYhfyza7/oi7qiRRw6s/h8nh0rie8ozr1m2RFqZfvlnJNHRSBfVFMhaTmzszE2EP1TltA9bD0pwvYi40I4tENcr2GkDZdDyFXu4/vUa+co7fRBieGkFgl5o35MCDmKnBJp84f76XNPYlf6sfKThRvMFm3DoUhitthwjUSl6Z+y+Jvu84o86vmylKA/q8ZFFWLUB/RPFRcWKmMLrhwl6C9UCfor2T9l819H66Jyuv2ykrZ7CqCguHz9U36Jyt8/5XJpHFKK9GUw+l1URpOBQpf4XHqDppdKcaKVt39Kuw/8u/1TJaWM8vZP+e71Yu2fkhLaBYB0UVUa+YIjkUgkEkk1RbqoKo+UqCQSiUQi
kVxwyBkcDV0+OYbBEsT6eS8D8NY1b7Dk4CYA2vV7kbzf3wXg5U6DufvrHwC4/7XveVPpeAqeNBSAHQuh8/wvADj80oPEt7oegL4v1mfCI8LhlNEkBPv0yQBErZ+LwSKkmgbBiiPpq1GMryEyhpqFWghWpLIv/knmCfthAPRGM8eVkLqnmot+KtM93Zj0xAwAfl5xiNdrCrlEpzewH+H06dvezZjFwtmV+Uc2IfF1AUjaKGSMG5rGsjtISFvpq1YT1Um4ypIcC9lwXMhSkWYDWckisC+ulZDjvMvcOGOEnGWLiGfT4Sxx/zXsbF++EYD4YKM67e86sgdzcBgAWYfFeUNb2MhWHEt1o4PZpCg5kTYjNkXLcJ1IUiWqghQhUQVFB5F1QFzPFhuh9k8REqVKVPma/qnMwmJVPvH1T8UY9OQpEpXVoKdYkcrMIWa1f8pkt+IpVkIE1f6pXPTB4jkL55Q23M/votJkC57iotLKUlopyuF0Y/CF+7k86E1mdR8QMpPPOWUw6NXOKZ0u0FHlk6W0TqvSnFOn6586UxdVefqnygoD9O9z6rFV6Z86Exd6/1R5kP1T1RwpUVUa+YIjkUgkEkl1xeut4gvOxeuiki84Gg6t+QOd0cy9ze8F4I4wC65n7gSg1mWP8fjeRAC6hJp5qL3ISwl6dzP3znoBgLf7iZqFiOufZshq8fPoKesZ8ec4AI7WDud44Wdie+MODJkvmsP7v/chja98DYBrau0GYOWbP/PdDe0BmHtDQ76q3xmAz+bv5Nr8r5RzXIN9068AhGwRjeC6qx/gUME08Xm2HKBxv0vEz3fU5YdtKQDcfEk8r6UmAXB48SGi6ncDUGc9WscHU5woZiSO/rWb2IeHAJBd7ObPfaK9vL3VSH6qWBgc07uxuPaKYxzNETMgobEJ7EnKAuC+bvWZrtQzBBVmqM87+8BRrHZxbN4OsfA4NMKm1jPUtlvZpOxrK85Vs29cJ48Spsww5Z/0NaAHk6PMvhgjYij0+LNvfMXdeU7//0lkOIrVRcbpSq1DbYMOpzIjZgoxqfUMljCz2npuDgtWx8YQMfvkdWeht/qrSrwm/6yNdjZHO2ujVjUYtNk3ev/YcGr2jXbBsVM7U6OpYdDO1JRVz+D1njqDYymlTVy7j6esRcalzMiIrJ1T6xkMpfyz/3T1DNrsG4Ou5OyRf1xm3s0FUM9wuluV9QwV53yaWVPxuqECi91LPf4iRa7BkUgkEolEcsEhZ3AkEolEIqmmeD0edUa0ssdfrMgXHA1TJw0lKCSU2+59BYCZ+/7iGXtbADblX0N092cBmLLqU37s+ggAV478lNlRYgGwQTcGgNee78frw6cB8GCxmweson7h8s9gVBOx0Nd41+XMnv4HABEbknnzC3Gd5rphAHx6yf0c2/A7AE2nPMMTRaJN/PlXPmXTVtEy3vrNZ+i0WCx0PTBVtJfnD++jSi+Zh7dR46mbAIh1m/h5rZClXmgVpMoGR9Ydp/71YoGy77jQE9upcVkCAHsXHiA7VHw+txe2HxAS0+3xwRRmC7nKdsnV4vjgPHamiSyd8JhgMk+IcfOYEJz5YhGxMeuo2oidczAZa4SQx9IUOahObAgORcqpbbepsoYh94TaIO5IPoEtWmTQ5CvXiGpWU82+0UfEqucIqGco9mffZDqKsSrj7AIhOdkMOjX7RlvPYA4248r1Z994lEwZnaaSQacsli65yNilTJLq9IaAHJyCYl+VgW9hsVtTz+BRn5HD6Q7IvikpURmMetwuf66NT5bSG/U4lUXSBoPen32jHZe2yLjE2FjeRcbaBcRlZN9oCcjB0S5wrqKGUJV6Bu2iZu15Sk5z6wIWTlf0/s5HjeTsIusZKoinihJVVY49z5ESlUQikUgkkgsO+YIjkUgkEkl1xTeDU5Wvc0RmZib9+/fHbrdjt9vp378/WVlZZe5fXFzMSy+9RMuWLQkODiYxMZH777+f48ePB+zXo0cPdDpdwNdd
d91V4fuTEpWGmm8NJMRkpPN9QiZqMWw531wrmrvXdOhGTNsnAbhjlZn4VCGNzLshgppP/wjApjeEVBNj3c0Linxz91X1WHqnyNXZEnE53T8TlQxXtKnPR6PeA8Dh9nKdRchHa/TNgUD3yZ64TtwXJqSJQenHWZkuWrNfvKoxCVuES2rDt9sAWHbNYdqECYmkOD8bV2uRwdNm71ZW/LYFAM/6/VjtQpbauquIG9uIHJs0i/h1cKxdSI1urQH46dudRKQKeSbEqOdkUjYAsS1jce0VzidvbdGcbrFvZYPinKpZM4x1K/YAUNtuwVUo9nUe2I7JFgJA1sHdhLQSMo+vnqFRXAhJpdQzeNOOEanUR+QfSyMoWql4SBIuquD4KNUFZoyKV7NvnEZ/C3lWYbG/QbzAiU2x4qQrOTghRr3aIG4Js1BcKJ6zOcyGO03JwQm24fWIffRB/uwbndl/Ha85SB37nFOAek86vUHNsSkr+0av2a7X5OAYjOLPSFvP4HdLUaoUpd2u0+TgmI3+f98Y9JpxGfUMWilJHKNpDS+H7FBWPUNZ+5yujiHgZyXkp8rWM1SV6qK8lEcGk9k35w9et1v9O1fZ488V99xzD0ePHmXhwoUAPPbYY/Tv35/58+eXun9BQQEbN27k9ddfp1WrVmRmZjJkyBD69evH+vXrA/YdMGAAo0aNUr+32WwlT3dG5AuORCKRSCSSCrFz504WLlzImjVr6NixIwCffvopnTt3Zvfu3TRp0uSUY+x2O4sXLw7Y9uGHH3LZZZdx5MgRateurW4PCgoiPj6+SvcoJSqJRCKRSKorHk/Vv4CcnJyAr6Kioird1urVq7Hb7erLDUCnTp2w2+2sWrWq3OfJzs5Gp9MRHh4esH3GjBlER0dzySWX8Pzzz5Obm1vhe5QzOBq+mL8XM3p+f000R4d+u4TIX+YCMDW2Jb/9fgsAba4fytqbxdvpkm73kBUp/oDTJo4F4J/O3bhssGgeb3t3As/UFNKVt6ObFbE9AWg6dgjRjUUL+bWZG9j+6hsAvNRWhAa+WiecHy7pCsArP+9gRvASAIJjaqkOoU5BWfCAqHP4+JMhAKxYdYRHOgvJyZRjZ8nBLADual+Ln6YIp9WxX5Ox17oGgBMrXNxdT7SBbwsTUsjRpRtpOORp8fOir1l5WDin4ixGso8fBCC+c0PYK2S6THOUcm+12XhYtJt3ahDFspMiCDDe6pdpCg/txxwSAUDOwRzsvYSck6O0Y18SHUySsm+EGdU55TpxxC9RpaQTEiukrZTNoqXcEhutSlSe4EjcSphdYLifS5W8TuYUUV/RSQp9beJWY4CLylfPYE7wh/uZwoJwFwv5Ua9xTnm0EpU23E8jSykfEZ3BQGEJiarQ7a9h0DqnHMVuNfRPyFjinn2yVEmHlG+72WYKaA1X99fISkateymgTdzXMu4flwwA9J1LHZeoWNBKV9owQE8pklZpcklF6hm0lCfc798IyKvKvxylFHR2qS7SYaXxeKroohJ/92vVqhWwefjw4YwYMaLSp01JSSE2NvaU7bGxsaSkpJTrHIWFhbz88svcc889hIWFqdvvvfde6tWrR3x8PNu2bWPYsGFs3rz5lNmfMyFfcCQSiUQiucBJSkoKeImwWCyl7jdixAhGjhx52nOtW7cOKH29l9frLdc6sOLiYu666y48Hg+TJ08O+NmAAQPUcYsWLWjUqBHt27dn48aNtG3b9ozn9iFfcCQSiUQiqaZ4Pe4zzlye6XiAsLCwgBecsnjqqafO6FiqW7cuW7Zs4cSJE6f8LDU1lbi4uNMeX1xczB133MHBgwf5448/znhfbdu2xWQysXfvXvmCU1lGTLidMJuF0V2eAmD80iVcMfgbAFa+0J3C1+4HIKHN3dR67wYAPg5rTsdB4pfhlrdEcF+vw9n8/ISQrUYsP0TjEPGmfOn1/RjyyVoAnvzsT26fKmSgbjelqy3j2zP/BKDr8BvpbW4BwK9z1rDh
0AIA6vd5g2ZrZgGQ9c1EvA+/Bfh7pI5vW0+zh4QMFvVbHb5YdQiAr+9pRXG+cEAdXLKfGnfXBER4X12dkJWy24oFXUkrk6j7XjtAOLyW7hQy0MAQE450Yeezt26N4QfREL4vU0h6EfHhHFdcVi0vr4czT5zXkHlElVwy9yRhi+gBQMZ6BzFKYJ/v/uuG+6UeQ06K2j/lTD5KUKT4WW5yHrGtxHRrhnOf2DcqgUJFkvEERQT0T/nkk3SNc+pEvpNLFfmlSHFOmYLN/v4puwW3U3FOhfnD/QzB4Xg94i91QIO4ye+c8pTRP+WTpfTaoD+j6NQ6xUWlbRDXhP4ZlPt3l+qi0uEVtxnYRaXzd1SZjfpT+qO8JaSossL9TumiKhHop+2fUo8po1ncR8lwv7L6pwwlNB/tqaoS7lfWOfUB28+NxlEZqUz2T12EeP3raCp9fAWIjo4mOjr6jPt17tyZ7Oxs/v77by677DIA1q5dS3Z2Nl26dCnzON/Lzd69e1m6dClRUVFnvNb27dspLi4mISGh/B8EuchYIpFIJJJqi28Gpypf54JmzZpxzTXXMGDAANasWcOaNWsYMGAAffv2DXBQNW3alDlz5gDgcrm47bbbWL9+PTNmzMDtdpOSkkJKSgpOp1jnuH//fkaNGsX69es5dOgQCxYs4Pbbb6dNmzZ07dq1QvcoX3AkEolEIpFUmBkzZtCyZUv69OlDnz59uPTSS/n6668D9tm9ezfZ2WJm/+jRo8ybN4+jR4/SunVrEhIS1C+f88psNvP7779z9dVX06RJE55++mn69OnDkiVLMCiGi/IiJSoNLwffiDkohC5WEdx33a9v83ySCKXb8vpYfmvRCYA/Tn7NlW8vA+CjbrVp+4RwQ4V1HQzA1XHBHB9yLwBTUi9n2xvCsXR1/8todd1QADZnFzLh6voA5BQ/S5LjMwDyThwSN3PbJF4rEHLJ9P9NZtkOIRM98WEzWm8Q8tLmz/5i+2VCLqllE1JHfmoS5qtGAFDv+AG2b0oGwNYhB4Pi9NmyL5ur2gmnlcdswLNZdF7V6iFCBteNXkxCjiJp6HUcOyCkpsSWsTgzxC+qsXknrHbhktqYLHqmouJDObBN3E/jqCCciiTmSdqF0Rfut+8wIXXE9GdKoZtmCUJ7zVSknIQQs+p0MuSkqM6p3CMnCI7z908Fxwvnl69/yhgdj0ORZNy2cHzkFLnV86UXONX+qfS8ItWhVaS4qCxhZooLhdxmDrXiStb2T4l/XeiCwvxBcrZQ9Tpek1+W0ob4aYP+fLKUTm+gyB3oonIUu0sN/dP2T7lcHvQ+h5Pb55DS4fEFHJoMFCpSmt7gl6UMRr0a7qd1UZmN4rxej7tc4X4lXVQBDqky+qfKCvcrea7S9j9d0F9JKhvup+2fKi/6cshjAdf7F2w81b3jSvZPVYFq3EUVGRnJ9OnTT7uP1+v//8C6desGfF8atWrVYvny5Wfl/uQLjkQikUgk1RVPFdfgXMRt4lKikkgkEolEcsEhZ3A0fPfBFHQGM5/u+hWAwfE9GbtqKQAPDPmI6TWEnJI/+E527hKOo7Yrf2X7nTcB0KCHkKiuv6UPr/d6BYCsS8Ip/PwDAGr/PhlbhLDPdYiwkjZ2CADDGj1OL8Uh9EPtZgCMXnqQkWGiO8poC+F4oZCrXmwRjXfgjQCMvfMDflkiXETjmgjJxmC28Y9DSCf9r6jH87+KgMCT89Ox12wKQNKGBdzXQqxG3xpiJnnxMgASb+oHwKGCX9AfEbJUtNlA5jHhnEpoXxfvQjHd6YisT3CMiNVetVcE37WrF8nWpSIfoWaoSZ32dx7YjjlIPLusw9mEthGfNbPYTaM4IV39rfwZRNkMqnRUfHS/KlHlHUslJE6E+2XszSQoXqy897mvCI1WQ/Vyi/xTsmkFTlWiSs0pIl6x4+TmO7Ep51b7p+wWf7hfZDCeI0KWMocF4/EIuU0fGq5+Lq823M90av+UVqIqKUuV7KJy
ON0YfG6pYrca+lfgdGNQnofb7VGn+lUXlc7fLaXT+2UpvaZzSitLWYz6gC4pODXEr6yx1mkFFQ/306Ivcaz2vCUpbbvWIVUeZ1F1dx/9G7dX0WdQzR/ZRUN17qKq7sgXHIlEIpFIqitnKcn4YkRKVBKJRCKRSC445AyOhmsffxiTLZg2/9sOwOdX1aPhBhEhPcISRZ+NPwMwOLYbjYcOAqD7O6to/8teAOam9ABg9tEcVRZp0OMm7vxYhPsNff9Luoz8FIDremUyd6xwLy3u1YKxz4pjp4d3BmDWTzu49ajYt3aHZ6ilXNv903vob34egJTC8RzauBmAlg9dDkDEyoZ8uuYwACP7NGKQEsy3b/4uavQR0pbjRy8twsVnLmwWzZFl4v4TXxYZA9nFHhZtF10it4aYVWdXzN1t0C/ZAcD+zCLCE4Tctv9QFgD9WiXycaZwUVmzj6rPNXPnYWwR7QHI2pJHjCJL5bk8NIgQ0o5PorLkp6rhfsXJh7BHWsW+yblENIwR91d4CGOMcIHl+5xTwVH+cL/iwHA/35/F8dwiGhrFuMjhwhImnE++/ilLmD/czxIeojqnDMHheIpTAdAHhaqSiUcb7meyqmOtc8rpk6UMBo2LSu8P+iutf0ozdro8AeF+eo1cBcIhVaxIcgajHo/iUNAb9KpboWS4X2lBfxZN/1SpLipPYCAfVDzcT6/T+eWxCob7VcaEU9FjqvqvvfIcX1GZqLpLa9WVC8q0VY1dVNUd+YIjkUgkEkk1xevxqP+QqOzxFytSopJIJBKJRHLBIWdwNHxsXkKYxUr0ciHrhMz7ibdqtQJg3r719PhcyEEjmkYx4nXR91S7x2AesQt5IniSCPEbtq8Tfz4lujj6DOpKn7vfAGDJyXy+vrc1ADZjWza/IlxNGQc2EzVtPADv54l3znZfTmfJDhGk9/BLzemyTri2NkxYwL5GohMr3mok5+geAMJvfQ6AegXJ/LlaHBfXOl/tMdq+O4PLXxayjtWkR7dlEQB1ejVlwQei/yqhWLiUDDrYvy9d/LxJFIXZQp4xX3oHVrsIHFx7NJvoGsKtdXSv2LdlbKga7uc9uhOjVUhRGXv2Ehwj7v+Yw0VTJdzP4fZQyy5kIjXcL/t4YLhfrLin3OQ8anUXQYQZTjcGRaLyhft5giLUP8ecIr+cklrgxKoXz1Qb7ud0FGMOEeGIxUVCirKEWXClKi4qTf+UPjQCr0e41fTB/lI4rYsqoHNKM/ZJUQajWdM/ZcahCf0D4Zby908FdlHplGfjcXvVED01xM+gp8gjJDa9wR/6p9frVKdVSVlKDfoz+F1RpfZSlQj3K+l2+i/D/SraP1VWuF9Z59EF3EeZt3Ga+7uQNJLKIcP9zhJSoqo08gVHIpFIJJLqireKLzhe+YIjAYY/Nh2zTs8rv/0GQLdHPuQzpe06avQA1q0LB6D7mt/YPfB2AGp3fpT7x4gKh7dvGgvAyWZubJM/BqD9n9MwB9sBaGW34pz4AgDDGgygfbiY+fm+ZmPGrhGLc4eF7QTEv/IPFYgZhPGX1YRnxQLhd+77mAULxazN6EaR/J8yA7BDlwjAfT1svPLGVABS52YQVrMxAHs2/cbdbcWsx/YQMyd+FZ8x7ure7B8jFjvrD2UAEGMxknpQLBKu0bU+3kVK9k1sEzX75q+9qbStL7JodqzYIJ6F3aQuzHXu24IlVMyqZOzNIPwSMROT5nTTQskT+tsLsUHiV9Cm/FO/+Oh+YixK9s2RE4QmilmgI38dJbiGssi42IMuXCxwVrNvXP5/LZ7Ic6rnS8kqJF6ZkcjOcxJiEdcrcriwRojn73LkAWBJCMV91LfIOBS3S8m+CfYvLPaaSs++Ka2eQczglJ5943D6Z3ag7OwbV3Fgg7jeEJiDYzQZAmZtfNk3Rs3CYm32jdloOGWRcXnGYpExAZwu+6Y0qpJ9U9rMzJnqGc7VAt2zNTFR1dOUZ5ZIZt+c/8g1OJVHrsGRSCQS
iURywXFeveCMGTMGnU7HkCFD1G1er5cRI0aQmJiIzWajR48ebN++/b+7SYlEIpFIzha+oL9Kf128MzjnjUS1bt06pkyZwqWXXhqwfdy4cYwfP55p06bRuHFj3nrrLXr37s3u3bsJDQ0t42ylc3evuoQYjdTfOQWAD03xXL9NSDlPx1xOkxcmAdB1wkYumy3yYH5N6c3spCwADLpxADTqeTM3fbgagOc/nEy3N0VT+C3X5PLd22Jx77yrGjPyxV4A/GC/gi9m/wPAdcfFtet1HkKtzaIywvPjOPS3vAjA8cKJ7Fu7HoBWj3Un8k/RSP7+igMAvH1tY55JTQJg9w+7qNXnegAcc720UdbhelrGsn+huP+458eojdw/bxHN43eH+rNv4u5uh/4Pse++zCIiaggpbM+BTK6/USz6/UzJvrFpsm/St+zHFnGZGG8OzL5pFCnkqr8RuTcAYUqzdfGx/Wr2Tc7RbDX7JsNxMCD7xh0s5DFf5Ex2kfuM2TeFBcVq9k2Rwz/WZt+4naJN3BAc6c++CfHXM3gsIepn1GbfFFYw+8YnUVU0+8YnXVUl+0YrXcHps2/MmjAa/wLnM2ffeLTS1TnIvilLepHZNxcvF+yaZrnIuNKcFzM4eXl53HvvvXz66adERPjdMl6vlwkTJvDqq69yyy230KJFC7788ksKCgqYOXPmf3jHEolEIpFI/kvOixecJ598kuuvv56rrroqYPvBgwdJSUmhT58+6jaLxUL37t1ZtWpVmecrKioiJycn4EsikUgkkuqGr2yzKl8XK9Veopo1axYbN25k3bp1p/wsJUXUCcTFxQVsj4uL4/Dhw2Wec8yYMYwcOfKU7bn/m4YnJJThTToA8FfKFjq+txKADzrVYPxIISmFdxnE84q7SjfqEV5KFhUHm964GoBbH+xGh34vASL75tv72wBQUNyKbS/NB0T2jX26qIGYlFfMJZ9PA2DBDnHfT73RkrZbawLw99if2VG/PwC1bCY1+ybs9tdp4Dgk7nWlOC6mRQYGJZ9l0850+rwhXE86swHWi7qHete0VGsiYgutqrywf4/Is6ndIsaffdPmbmwRYvvqpCziagtH2KEdJ2kdLyTAolzhvvIc2qLJvtlNaKJoLD/mcNGihjgu3+2hZphwC5n1OgyZQk7zOadyDiYHZN/UvrKFOJ/TgzFefBaH24NHkah8ZBe6VVkqJa9IrXs4mVOI3STOXZjvxKJc2+lwYA0Xz6k4WXFRaeoZArJvQsLV62izb8pyTuWq8lHZ2TcFTv92CMy+KXK6A7JvDIrE5nF5MPo+iy+jR5N9YzDqS82+0TqnzAb9KW6n8mbflFbVUHI/ODfZN77NpTmnypt9cybX1fmQfVPd83Vk9s05wOOp2jqai3gNTrWewUlKSuKZZ55h+vTpWK3WMvcr+Zfe6/We9v8Ihg0bRnZ2tvqVlJR01u5ZIpFIJBLJf0+1nsHZsGEDJ0+epF27duo2t9vNihUrmDhxIrt37wbETE5CQoK6z8mTJ0+Z1dFisViwWCzn7sYlEolEIjkbyEXGlaZav+D06tWLrVu3Bmx76KGHaNq0KS+99BL169cnPj6exYsX06aNkIGcTifLly9n7NixFb7e3U+8g85o4Ze24mUp/fa+bM9pCECjZYtY3+NKAFr2fYPb7hZ1Cc9f+hBZbSIByJgoXFQ1544mJL4uAD1jgjj2vNj36RZDeFiRX35q2JYh83cB8EHIaiyh4hzHC0Xs/stNg/AOewiAkdeOZMF8EQA4qW08b7uFDLQqL5QnrmoEwMBnPxTHz0ojoq5o7t6/fgGPtxUy1+YwC0d/WgBAnfv7s/8N4dBy7k8j0SoqC9IOHgSgdvemeOcKCSQ7vAEhcfUAWLLjBJ0bRQOw5fc11LELScU39V+4azNWu/h52q7VRLQR95nmdNFSkahWeiFeqUgIMeopPiLkNlWiOpRMWE0hfR1aeoTgxFhxH8VusIux0+Mlyxk47Xoyv8gv
UWUVUluRXHLzndiU6genw4VNE+5nqS2u4zmiVDVEhOIuTgNAH+p3TmllKW24n7aSoch1aj1DybE23M9Rok1cG+7ndge6qHyzkR6PV3VU+aoatOF+JR1SqnuplAZx7VjrnAL8clWJcL+S4Xwlw/20zin/MYHHlxXspz2vuj8Vl4zOhQOp5GRwVaa9/w0B52IK97sYFDGvx33avzflOf5ipVq/4ISGhtKiRYuAbcHBwURFRanbhwwZwujRo2nUqBGNGjVi9OjRBAUFcc899/wXtyyRSCQSiaQaUK1fcMrDiy++iMPhYNCgQWRmZtKxY0cWLVpU4QwciUQikUiqG7KqofKcdy84y5YtC/hep9MxYsQIRowYUeVz12h9BQZLEHVeHwXAO7EtuWry/wHQaejPXPO3aBlf9Xtnnlp4CIA2oRY63HEnALeM/gOAx6f9yJPfzwOg3yMWxt4tAgJXdV3A9xMfAOBPY3dmTxf737V3Bs1uexuAVv/8BED6RyNxPy5ktgznGxz6W9je2zx7I7G/1gJg7OI9/PBgWwAeVFxPO779h/qP3AeA81svjfXCAVXYqQb7fhVyUI3/u5I8xW0zb8MxnlU6sXzhftFPdMW4UIQJbj1ZQHRtsZ7pwP4MHupcF4D30o9jPiHWQPkC6tK27CM4Rlj2T27Ip0aieMnMc3loGi2kuZWASXFORZgMFB7eD4Bdke5yjuYQ36aOOF/RAQxxwjmV7/bgDhX34faKYD/wO3ZO5juxKbLOodwimisyiyPXqcpSRYXFav+Uu9CBJVxIaGq4X2g8Xs8JZawJ9zP7w/3cRv9id60sVehzL5m0zimT6rTSm8zkKfKj3ugP+vM5pxxOd7nC/XzOJbcSJqg36PEo47LC/cwleqk8mu0+Ajun/GNtuF/JLqrS5CsfZyPc77RBfxphpTwyRaDrSrv93GgcF1O4n3ROnVu8Hi9ed1VecLxn3ukC5bx7wZFIJBKJ5GLB6/ZU7QWnCsee71Rrm7hEIpFIJBJJZZAzOBrWPtuYsNAQaj7+BQCb3r6emM4iBC506goevlr0Pq3s0puvwkS43/s/j+L+NqJzKazrYAAOFRQzoZE47m/DA2QXfwCAI/MEh64QnVJjwwx8NOo9ABbsTOOd+4TUlHBSdG2teG8pyxoJ+aZtuJUv0oU8xnWv0TF3i7iPxVvR1RfuKqtddDat3ZVD/x7iPtNsJpx/zACg0S2dWfzUNwAEJRcQosgTR3anUety4bQq2isC+/QtexAULa63/EA69RsIh9e6FXtoqUhJxfnZuPZuBMAcLBxSqdv2YW8lXFTHHC7a1hG1GklurxruZzPoIEV8rhiLgex9xwAISRQyUE5SLo1uFFJUhtONKbGueHZuL0Vm/7qq9ALh8vI5p5JzCtVwv+QsB3aTGBcW+GUpZ0E+1gjl/g/kYQ1XXFS+cD9N55QuyK5ey2sJVseFLo8qyRW6/UF/eU6XOtYG+uUVie0Gozkg3M/h9G0XfwVdGlnKVewpM9xPb/BJRopzyqhXXVRaKUob7mfQld5FpcpSbneAcyowtE/jZCqho5Qn3K88gX4B41I8PV6Pu9xumbLC/cqDvpzyWMD1zrE8U97zX0zOqYsNuQan8sgXHIlEIpFIqilSoqo8UqKSSCQSiURywSFncDRMansHVp2Boi53A/Btt6G4L78GgBe/n0e9zsLR87q9Oba+Qkb5v4JWXH/nTQDU7/YUAPdGH2TZTYMAGHzV60y9SgTlLW5xIw9NWQvAD+5viWooZCn31qV0KfhH3MRLrwPw3uQbWLBgOwBDbm9O8CHhnPrin2Re7NUYgK6TvmDv1CMAxDZ/FIDjy2czsKmQiTYmhLBn9goALn3/fyQ5vgLg+83HaRosJKOMgzuoc5MISdTtF06so7oIwmuJa/yxLYXbOonP/cesBSQYCtTnlf2PuGdblHA9ZWxaTNQ1/nC/rglhACQBkfoiAOwmA8WHhawWE2Qi+5BwLdlrhwNw
ZNMJrLXEZ81xeXCHxQMi3C+z0C83JOeK8/mcU0czHFyiyCz5uU6CwkRSdZGjODDcTwkRdDsdWCKFDOUuTgbAYI/yO6c0spTH7B87XF5VolJdVAZDgHOqQBPiF+Cc0mxX5SqDT5byu6g8Lk3Qn9uD2WZSx9owQAiUpYx6HZ5i5ynbzSWcU94SLqqSQX8+ucnr8WDSuKu0YyhfuJ9BT4BUpt2uPVdpnE52CXBFnWO9pbz/CizrPsq6vfPZOfVfcrGZtuQMTuWRLzgSiUQikVRTvG43nio0gl/MbeJSopJIJBKJRHLBIWdwNERbDNh0BhZ9NgSAbre/xmtK39CLGd/T9lVRDzH7liZ0fuVeAJ5+4SNOLNsLwE8pPQGor2vD4NhuAOwzzaPt/I8AmJAfSb/73wRg/s4/uWPq0wBct20qm14cDUDKuzMB0dN0YquQlxp98yL1v8wC4LP5O3niCSHheD1uNiwS/VGXf1gXANsUHZGHVgLQuF8zVn69CYCo0AYoH4W1/xzn5obC4ZSfmkRYj/4AWGaKcMK/jmQTrzigUg5l0fm2VoBwgekP/wOAwWzj5KZ9AITGic+alF1Ey7rCcZXn8tAkWvQ2LdHrMKYfEs/YbCB3nxiH1Qwl61A2AA37CRkvtWgHpoS66jk8obHqn0+WIlGZ9TqS84RE5XNO7c520EVxTjnyivzhfg6/i6o4Iw9rlJDNPK5i9PYoZXwYCOyf8lj8jq1i9H5Zyu13UalSlN5ArtPvnMpX5arAQL+8wuJTtquSk8sbID/5HFXOotKlK6/GOeVzgQXIT4bAcL8AF5Wmo8qHSe+Xq0ya5D19GY6okt8HnLeCGkJ5wv1Kc1eVup/GOVWecL/KyETn2jl1PiDD/f49vN4quqi8UqKSSCQSiURSzZBrcCqPlKgkEolEIpFccMgZHA237lhOWFgY2xVXVJ1OT/H82C4AvNHndQ5e0gOAyN9/4PbfJwPwjF5PK7uQQIInDQXg4UaP0yvSBsAPNRszYrN4g34jbCk6g5A3NmcXMkEJDvQaHmDsnSIM8JdvNwMw7pIYPlCkkA3mJjxxQw4Az7/yKSeCRL9URN0WbN68BIDBV4hzbbVbOT5LBPrVvO0mtk1aA0DB3nQSreKPO3nPYer3bgqAZ6GT/JrCzRVWYxcAv25LpktzIQ1NW7OJJlGiX8rjclK4TZzPao8mdYfoq4q8Scg5xwtdtK0TDsDfXqgRItw/IUY9roPbAIi3GsncI7qowuvaOfLXUXHtugmACPfTRYvgQYfbQ7bL/w7ud07pOJbhEOdTpJz07ELsyucrzNfIUvnZWBMUt1SyA2uUMnZlqxKVT9Lw2sLUa3kt/v4pR7Em3E/josrTBvcVa6Uof7hfrhL0p3VOGYx6XMr+Bo2Lyhfi53Z5sCjOKY9LE/rn8WJUJS1xXotRr0pDAeF+pQT6iX30p4xF/5S6WZUftKF/WqeVR3VFaY8JPN5bqruqjO2cKhmdKdwvIJSv7N0qxNlUXapyKhnuJ9EiZ3Aqj5zBkUgkEomkmuL1eNU048p9nbuyzczMTPr374/dbsdut9O/f3+ysrJOe8yDDz6ITqcL+OrUqVPAPkVFRQwePJjo6GiCg4Pp168fR48erfD9yRkcDW2emYvebOP6n8Wi4S0ZHXnkF/FQe4VauOKhhwDo+cpvDPj6UwBeXfAr9ztETszbN4n273ldYpgy9QkAduh68cnHPwPQfc8sLu0/DoC2238lbewQANxPv0dK4XgA9ixfBkCnkfcS/3MiAMPmbefnh0VWzaD042yZKhYONxp4B44fxS9vK6PIsDH0qM2Ob/8BIPqNj8guFtUQs1Yf5nll0W9W0k7iB10NgHHZKtYdF7UScfVriHvelUr/+9sBMCk1CdtJf2v4ib93ABASdw3Jm3IBqK/M2mQ43dwQJ2ZBNujAmiUyeiJMBhz7xOxQZFwwWQczAUhoX5dUZZG0qWZDAHJcbtx2MZvj9kKmpjX8WK5o
/bYZ9OzPFHk8TXwLi0u0hgdFiRk0V6F/YbHbWYgxXOTqeIpTMYSK+z5Ta7g2+ybf6cZgEhlC6uyM0USu0z9TU1ZruLqw2KDH7WsZL6M1PGDBsW+RsdsTMOsCp+balNUaXto+5WkNP11juF5X+sLis9UaXtHG8JL7VaU1vKrZN2dr/+rEf7mw+GJe0+xxe/BUYRamKseeiXvuuYejR4+ycOFCAB577DH69+/P/PnzT3vcNddcw9SpU9XvzWZzwM+HDBnC/PnzmTVrFlFRUQwdOpS+ffuyYcMGDIoKUh7kC45EIpFIJJIKsXPnThYuXMiaNWvo2LEjAJ9++imdO3dm9+7dNGnSpMxjLRYL8fHxpf4sOzubzz//nK+//pqrrroKgOnTp1OrVi2WLFnC1VdfXe57lBKVRCKRSCTVFN8anKp8AeTk5AR8FRUVVem+Vq9ejd1uV19uADp16oTdbmfVqlWnPXbZsmXExsbSuHFjBgwYwMmTJ9WfbdiwgeLiYvr06aNuS0xMpEWLFmc8b0nkDI6G/NTD6IwWXhp6BQCzG/Xk2zqXAzB18/f0V/IEgr78ihxFYhjuXsW3sdcBYNAJ+cnldLCs8Z0AvB2ex4Q3RDXB3N3pfPWoaB4P0fVi7tjfAVhQYysPKC3dX2SmAJDV7VX6uQ+I+5j+B+6oZQAEx9RixY7lAAy5pgmHRylyydzPAWh8T2/mLfgYAOfBLLVV++C2ZBooi5qLN2TjbSV+eULij/PzdlGX0LKZWFi8eP562iR0F/vmZ+PcJnJ1rPYYUjYIqSmqWywH80WuS5dGohpip8dL3XBfa7geb5L43PFWg7qwOKJ+OCd3pAHQ9K4apCoSlDFR3JvD7aXA6JeKUnKd6vmOZoqFxWFGPclZQq6KNCuZNHlObNFClirKzcEWLRY+F+/NIyg2QvlzyUcf5su+cUJIFFq8Vn/2jUPTGq4d5zpdpS4yDlhYrJGocpXsG4PRiFPdX4+rWPz+GJU/H5fTrS4sLswvVre7XX7pyuPxqhKVr5LBoNfk4BjKkKJKZN+UttDXl4MDYNTm4JTRGA5g0myoysJiH6dbWFxS2vk3W8P/jdwbubBYUhZna5FxLaUCx8fw4cMZMWJEpc+bkpJCbGzsKdtjY2NJSUkp87hrr72W22+/nTp16nDw4EFef/11rrzySjZs2IDFYiElJQWz2UxERETAcXFxcac9b2nIFxyJRCKRSC5wkpKSCAvzO0UtFkup+40YMYKRI0ee9lzr1q0DSn8x93q9p31hv/POO9VxixYtaN++PXXq1OGXX37hlltuKfO4M523NOQLjkQikUgk1ZSzlWQcFhYW8IJTFk899RR33XXXafepW7cuW7Zs4cSJE6f8LDU1lbi4uHLfX0JCAnXq1GHvXmHuiY+Px+l0kpmZGTCLc/LkSbp06VLu84J8wQlg6aeDCQkN46+joj5g/6QeNOzeD4Arvkph6PuiWqHPm58xOF1IMV/cMpqRvYQraNMbYvHTavttPPHOMgC+Pz6FBj2eASBx958k/v4+ALoh77L5FZFFs3r+cqa+KlrLo1cKN9FrC/fw9rWi0fujUe+xYYKQhur3eYOMxV8C0LeOlfWtxC/S1mmi1qHTb3M5XjgRgM1rDnNHuHADfXxgM3WGXAmAfvMOduYKGSK2QWNWbhZt2s/e2ByAH6fsJ7rguPpc0tZsBCA4pj0nl2eIz3JfOJlKlks3JWdmJxBWKOSnCJOBon1bAIiPCSJjt5hajGgYw86VwplmrdNAPYfLLhxjTo+XNIfGOZXjc07pOJwunFNdTHryc4R+HKTkDRXmOwlSXGKuwjxsMeIvhnu7A1N4OACe4kwMETGAkEM8GkkKwK3JvskPyL7xqPlFeU43eqNJGZfunMrTZN/4nFNGk0HNvtGOz+Sc8nq92BQZzuNyYjmNi0rrnNJKUSUrHHz4JCatc0qbiaOtbfBoM3HOgnNKS5kt3JXQV6qyqFC2hlcfLmbXVEn+
7Ryc6OhooqOjz7hf586dyc7O5u+//+ayy8TSi7Vr15KdnV2hF5H09HSSkpJISBDu2Xbt2mEymVi8eDF33HEHAMnJyWzbto1x48ZV6LPIRcYSiUQikUgqRLNmzbjmmmsYMGAAa9asYc2aNQwYMIC+ffsGOKiaNm3KnDlzAMjLy+P5559n9erVHDp0iGXLlnHDDTcQHR3NzTffDIDdbueRRx5h6NCh/P7772zatIn77ruPli1bqq6q8iJncCQSiUQiqaZU5yTjGTNm8PTTT6uOp379+jFx4sSAfXbv3k12tlBFDAYDW7du5auvviIrK4uEhAR69uzJ7NmzCQ31z6i/9957GI1G7rjjDhwOB7169WLatGkVysAB+YITwJHrbyDYYGBghHBOZf4xlmGX9gAg9PJnWJEmJJKfehlYYxgFwJ7XF5B1RLiFMiaK6bN5YTpCPxWuphlbDzBlYlcAEorb8PMzoi18mekG2iry0RdH9xA0QFQ/9AjfCsBv8zcyKVws5LLaY1i2VkhQj33QjLR3hUTi+mUyLR8RDeaTnxL1DN50PSGKjPH7huO80aMOAI69KRg7DwQgJL6AOduEZNSkeQx/LxX336VWZwAKs1NxKU3mltBIjq/dAEBkq5vZlydcQV0bRZOkOMkaRorPYTPo0B0VQYA1bEYytokQv4h64WQeyAKg/nXtSCkUFQ+m2o3JU85RFOR3NJ3MV1xBeh1HsnzOKQN/Z4jnH2k2UOCTqHzOqfw8gmPFXxDngWxsvkoGZyGGCLHS3+vZhz40Ur2Ot4REpa1kcLg86I1ChswudGFQxnlFLnW7T4rSOqcMZhsORboymi0UleKcMhh1eFw+F5VPBitWqxo8GueU2xUY7mc2GtQxBDqnTpGl3KdKVFrHkVZKKqs1PCD0r4TWUtItdcZxKcF9JZ1TZbWGl9xaVmt4wDFltIafT86pSrWdV+JeSr+21ImqAx6PB08V1uBU5dgzERkZyfTp00+7j9frT1K22Wz89ttvZzyv1Wrlww8/5MMPP6zS/UmJSiKRSCQSyQWHnMGRSCQSiaSaUp0lquqOfMHRsGRfBhadntiHhKR07bpYnr9byFWdB7/Ps8lCnpnR8QGGXf08AH8OuZz1ccLXf8voPwD4Pm0KtTs/CoB91wra75gFgH7MFN6Z1BqAud8s4/WhQl56bUMjXlu0D4Ax1zcDoNF7H7FxjNhWr/PLHF8+G4AXW0bzd0vhBPpn0m9cNmcGAIce/gqALSsPcq1d5Bt8tnsDjR4X19CPO8AeVzgAMQ2bs2jDMQAGXdOEJdN/AiChOFV9FmkrRWt4SFwrTvwlekbibwznhCLLXFcngiRl3yiXcJFFm40494ierIQoG+k7xTUiGsWwZ51watkaNFKdU+6IWjiVIri0AkXi0cGRbOGcCjHqOZCaD0B7k54cJdwvJMJKQW6gRFWcn42tnuKc2uXAEitcAK6ig6pE5XE58djs6mf0WAIlqgKtRFXsd05lF7lU51Su041e6aLKLhC/DyWdU2rQn0GPU9luNBkCZKliJeDQGizO5XZ7VLnKremc0jqnPMXOgO0g5KczdU6VdE75LKcmtX/KHTguwzllKqGXGPSUKndpt1fFOfVvuI/KM4V9uvuQzqmzh1TESke84JQ/yLK04y9W5AuORCKRSCTVFF8reFWOv1iRa3AkEolEIpFccMgZHA3Df36DsOAg3mp/PQBhXZ7kkqQcAJbeHMLfBhHSt3FCO1J3CQnH8e4kFocJaSGs62AAvty2i893dQMg0duaeQ9PAmDZ5Mv9zqkDmwmfLpxTPWdv4bvvhbNotG0tINxLS/4WYXuDxl1C2gdCIvHM/4DWA0UWwOSnvsGTKzqsbIrd5fe1R3npyroAFOw9jqnHYwCEfDWTH7YJmah5yzjWrdgDQLfHO+JQ+q88W5ep1z62SjinolrdyO5fhRzSs1ms6pxqFh2kXlN/zO+cSvtHnDeifjjpe4V0VffqtqQUirBA
c92mZCtuosLgGPXZJ+f5nVOHFLdUmNHA32lCorrW4ndOBccGU5SfJz6XEjLoPJCtdk6d4pyy+0OrtBJVgUvIYz5ZKr8czqmcwmKMZiGLaZ1TeUrnlNY5ZTQbApxT2qC/Qpcib5XhnPKF+5V0TvnkJm2437/lnCrpqqmMc6o06ao8zilVKiv1eM0x0jlVJaRzqvrh9VRxDc5FPIMjX3AkEolEIqmuVHGRMRfxGhwpUUkkEolEIrngkDM4GvqujcVoDWbo7e0B6PnGFF5KFdLQZ+3uY/hVTwGw/vU+bIm4D4Dr31jE98mfAFC/m/h52K4VdFj/KQD68V/z7sfCGfXD14sY9ca1AAxf1YChv+wGYPyNl1Dv/Y8BWDdcbGt45Wsc//N7AF5qbmd923hx7fHz6bTwR0A4p9YtE06rWyOEbPLFznU0flqkSurf3s3OYiHJxDW6hAVrhe9pyA3NWPyVOEcNZyf1859c9hcAoQntOb78V/HzmyNU59QNdSP5Rtk3ujidaLP49SncJuS6hCgbqdtEz1R0s3j2bRDSV1CjJqQpso0rso7qnEotcKlyyKFMEeindU51NJfunAqOC6Y4XyRjlumcihLP63TOqQJFPiqPc8pgEc83u6D4lKA/rXPKaDIEOKe0spTWOeV2Bwb9ldc5pY6r4Jzy7VNR55TvlNI5VfFjJKUjFbEz43F78FRhFqYqx57vyBcciUQikUiqKdJFVXmkRCWRSCQSieSCQ87gaNg45zt0BjNrMoRcMr9lEj+GvQFA0phuZB8VDqGdo99iYZhw8QRPncoXm/cCMDdZhOqFRvZi+sAvAViQ25O7Y4IA+OLITtwPi26NW2vv5/uZSwF41/sbQVGJACz6azkAwz5pxeH3hRRSMP3/aPPMDQC8c/8UclLEvK7dpOe3NUJ2eqtvQwAc/6Rg6PkcAGFffsW0deLnbdsm8McvwsnU/anOFGaLUL/iDYuw2oWbKenP1QDEdL6V3T8LV9PVLePZozinmscEqT1XHPqH2kHi1yd1k/j80U2iSNudDkDDmzpyzKF0TtW7RO2cKrD6u6CSsouwKbrHPsUtFWEysPqkeLZ9LUbyfBJVQgiFOdnK2I5zrzKuKe69uDAfQ5SQAj2u3ejCNM6poAh1nK9xTuUpEpVPcspwFKvOqexCFwbFLZVZ4FS3ZxUUq9uzCsQzMpotOHzhfprOKaNJj0uR5iw2E4X5xadsL805JWQpZexyYjP5XVRB5hJdVOVwTpn0/n/HGDXSVVnOKZNBd8o+Xo10pf2Z9nrq9go6pwLcTiXO77t2SSlDOqekc+piQSYZVx75giORSCQSSTXF6/bidXvPvONpjr9YkRKVRCKRSCSSCw45g6PhlbeGYA0OYUhRBwDe7P0qH3QRPVMp3z5Dka4XAPe8+DUzdokOqHb9x9Hq0CoAQj9+AQDXU++ycUQLAJbOnMc3nw0EoMbPidz3tehq+vmRdkx5awIAK17eRMuB4wFI/V1Uz98Sk8/Wq+oCsPad3+i+XchHxwsns/Q34bQaGh3MxB0rAaj/lrhP4661/JUq3thrtWjOH2uPADDm/nb8+LG459isvapz6PjiP7HXugaAQwvnA9D0oWiOK5LLA3Wj2K/MYEfkJRFnEb8y+ZvWUCNROJJObhbOqYT2ddm6SoxtjVuQ5pwpnkdUXdU5dULjnNqfWaBKXntP5ALQy2wgV5EIQ2KCcOQJ51RoQgjFBUKWCmoehWurkLTMMYpbqng3xmgx9nrcAbKUyxyijvOcflkqV3E1GUxaWUoEMWYXFQd0Tvlkqbwil+qoyiss3S3lGxuMetU5ZTDq1e16g151NqiylMsZID9pg/4sGueUz0Xlk31sZkOAc0qVpTTOKYNO63xCs0/pzqmS7ih1XELK0P5MV4Y0pC9DliqffKQ9tuTPKiarVMU5dborna/Oqf9SlpKKWMXweKroorqIFxnLFxyJRCKRSKopXo8Xr6cKElUVjj3fkS84Gm5d
Mo5Qi5l7mwwS34dasCgx/696r2RMiKgv+Cj9OD/sEotplz3dHm9bsaj37ZvGArDA9RfvXSYWDX+Zn82mS4cCMKJGAU+/8BEA6elfE9Wwrdj/x6V8eE8bALaOFbMDSe+OosUwcR8zLn+Og/+ITJlaNhPzVm8HoNXDl1H8i5jVKOxwCwCR9Qv55K+DAPTpVJvPpvwCQPc6V6rZMQUr5hIcUwuAw0t/JvG2GgDs+V4smr2uZQKrlb8UzaKt2H1ZLbvWUi9YZMOcWL+LWKXV/PAKsZC55aN9OOYQWTqGuv6FxVm6YPUZH8hwaGZt8ohXzp2kZN/EBJn8C4sTQyjMFnUPwQmROLcpC4trxOByirEhRty7x7UV7LHqdbQzOHnKgl690azO4Oj0BjIc/jZwgBxNJUN2gb+SIctRrM7aZBU4MZrFMyjS5N34ZmpKLix25DrV7W7leZjNBtwucWxQGQuLtZUM2lkbdVzKwmJtZo1J799uDJjZ8bePV3Rhccm6hLIWFuvLWFispTwLi8tLWQuLy5q1Kc/sT1UnGar7wmLJ+YPHDR595V9SKvBX6YJDrsGRSCQSiURywSFncCQSiUQiqaZ43R68emkTrwzyBUfDe++vxIye+a2FVPDNvhW0yBdyxGU3vkTCzj8AePmnn+n2mlj0+3fv6zg0aTYABt04ALYt+IGuc0V9Q6svTzDgA7EIeeuTcTyemwHAvPeWc+vnTwJg+2k8zY/8DkDUwC4ALJq8ku4vibZxh9vDJ/NEY/ektvG8eWAzAIn/9yRBa0TlwnfbTwLQsH1DNvwtFvq+9nov3ju+HwDzjt8xWsVi2wM/ryWqgWgZ37F6Bl1bJQCQqkgu99UOZ4Myf249voVaNvEMMtauJaGRyLE5sTmFJre2A2DZz6Iu4urGbUhzinMUR9XH5048nleMWTnfnvR8whQpZl1yDq2tYuxbWByaGEJ+jtImXjOMolTxvEJrx+JaL2QsU2wD3EVCsjMqEpXX48Zt88tShXoLIKSoHI0slV2kZNFYbGQX+drAxcLi9AKnKkWl5znVhcXZBf5xXqFLrVdQZSmzZpGx2aDm3QSHlb742GY24HEp7emaGobS8m7cLmeJfJzARcZaKcpk0PsrGQxauar0pm/fIuOSC4sDj/VP8hpKzPeWtbBYS1kLi8uTd1NW1o04pvTrlUVZklFlZCK5sLjiyIXFlcfr9uKtgkQlbeISiUQikUgkFxByBkcikUgkkmqKx+2t4iLji3cGR77gaHjmiY6EWkxsT+wPQMvX/uKHDNEKXrPDI+RvWwLAS47F6H8TmTHP2Nsy9+0fANj4am8AJm9oxAsbxZzs7Cc60eiqpwFY+89+GvV8EYDN6xcy4bpGAPzdJp51L0wA4LI5Iqtm3dvdWKbk3fSJtDHjbyFztXnmBhh3AIBd1oYktLwMgM+XCJlo0DVNeGr+bwA01bdRpZATP8/DXrMlAIf//IM6rwoH1KGCYvpeEgfAt8rfg9q6bDXvpnD979SPFVUTKWv3Et9OSEJrZ27hstatxbmLFgDgrdkch/KXKbnAo7p0dqXlYzeJycLtx3JobxGSy4mT+URGCeknR5GowmqGUqTUSIS2iKP4kJJ9UyOBYodwhxnjauNxiQZzT3CU+uenHfsybnR6g99FZTKTVuBzTpnI9LmolLybjDyn6pzKDnBOFauyVJGjGJPybHxuKZPF76IKCbeqLjCTxaC2hls0slSQ2YCnWIxtSiO7tinc7XKW7pzyuANkLDh9U3hpzimTXqc6sAJcVwa/TFRaDo7X48ZQQpspj3Pq32gKL49zSktF827Ke9/nwjl1Pis7UpY6O8g1OJVHSlQSiUQikUguOKr1C86YMWPo0KEDoaGhxMbGctNNN7F79+6AfbxeLyNGjCAxMRGbzUaPHj3Yvn37f3THEolEIpGcPTxeLx5PFb68UqKqlixfvpwnn3ySDh064HK5ePXVV+nTpw87duwgOFiEx40bN47x48cz
bdo0GjduzFtvvUXv3r3ZvXs3oaGhFbreL/1exRocytIw4TwKfWwxE/8WL0vrc6/BdIVwMo2/YwIL3hDN1W80imTakZ0AZE0SLqqXu6fy1ttCahp2IBN7bbHvDwuWMfmzTgDs/NRK+rghAHQcM5CR144EYN8RMa8bbzXy2a/CLTWyfxsKlh0HQHfDa0R+P1Xcx/L99OlWD4DZ04XD64anO/NIpnAYFS+bpbaU753/OzWuF2GA//xcyF2XiaC/9S4PHRKFu+o3RUZi5180DBHOqeN/biK+tZCwjv99nEa3Xw7AwU83YGosXFQZilSTY/U3eO/PLFSbwrcdzyFGkWL+Op5DP8WVlZvhIKxmGACOLOGWsteLpWibzzkVR3GhqHAwxl+KxyVebnWRCep13BpZSuuWytFUMvhkKaPZpspSBrONNKUGwqS4y9LznRhtYpxV4MRkFe4qR6ELsyJLFRe5MSrPyam4sEoG+vmkK7PZgMsprmcrIUv55KqymsLVqgZ3oETlk5XKagovWcMAVQv0KyvMD85NU3h5WsJLXq8sztem8LN517Ip/ALA7cWrq8JLilyDUz1ZuHBhwPdTp04lNjaWDRs20K1bN7xeLxMmTODVV1/lllvEf7y//PJL4uLimDlzJo8//vh/cdsSiUQikUj+Y6q1RFWS7Gyx4DQyUmSxHDx4kJSUFPr06aPuY7FY6N69O6tWrSrzPEVFReTk5AR8SSQSiURS3fC4PVX+ulip1jM4WrxeL8899xyXX345LVqIpu6UFCHFxMXFBewbFxfH4cOHyzzXmDFjGDly5CnbR7z8HjqDGcfOZQD87/dFdHhDyFIbu/Rg95RvAXB7p7Futmj9vnLFd3T4WjR23/y2kIl2PBHOq5knAJg5fAmPfjMHAN3Cj7nskOiGajC0Jz+9sxSAngP/R3bxGwC8M3sLAFM612DMLuEUqjP6JYK3i0C/af+k0Kyr+PzLlh1g/gjh3Pr4rW0ABG9fpAb67fl2GXHNngBg21ffcWX7mgCcKHLRv754Sdyi1xF8dCMAdYOEmyht+XJqNRNy09E1STS/Szi1/vztAL1bCIktzTmV4rgmyvMQz+9QllMN9NuSkkO0IrOsTMriPiXQLys1H3sdIUvlZeUTXk+E8xUmK86pjrEU/y1eOM01muEuWgcI55RPDnGHxqh/Zg69kJFKylLpBf5Av7QCIQcZzFbSfWOLjfQ83/ZTA/2y8pyqc8pZ6MJk9YX7+eWq/BwhcYWEW88Y6BdqNQXIUlpHFZw+0M9q9EtXajifJtCv9C4qnd85ZdCrzqnyBPppt5fmiFKbyTX/PCpvoF/pkpb2GmXLUmVxukDAM52rMs6pf0P6qgoy0O/CwltFiUoG/Z0HPPXUU2zZsoVvvvnmlJ+V/D8cr9d72v8TGjZsGNnZ2epXUlLSWb9fiUQikUiqitftrfLXxcp5MYMzePBg5s2bx4oVK6hZs6a6PT4+HhAzOQkJ/oWnJ0+ePGVWR4vFYsFisZy7G5ZIJBKJRPKfUq1fcLxeL4MHD2bOnDksW7aMevXqBfy8Xr16xMfHs3jxYtq0aQOA0+lk+fLljB07tsLXa9XvVozWYGL+9ycA1y/6P1yzRKDfmOgWfPPKFABSvnmSqQuEC+na75NZ/JSQbcK6iG6pxQs20X7geAB2vvgLE9sK19DWaxuwdMB7AHTfvprNrzcF4Ntv/mFovHB8TVslwgTbvvUYhmFrAVhenEjdDuIaU+bvZHT/tgDc9u2PNClsEPAZjsycRVTDawDYsXAJrf8nXFR7Jjt5oLUI6fsCqOUUrqxaNhO5f/4KQJPaQjpKWr6Lml3Eef+YsoYuHTuL7Y75eOq0BsDh9nI4V3EkKZNlm0/kEKnILJsOZ9LLKn69TqbkERMvZLPcTAcR9cMBKMxMIay9eDEt3idkqeC6dXAVCbeUKbEuHtdKANyh/hdWly1SHWcX+WWpTIdwNZWUpdI0slRqjt85lZGvOJ8U51R6
nlOVn5xFLjXQz1kU2D9lDRZSXmmyVJDVqJGljBrnVImgP49fxoISzilNoJ/H5dSE8HlKcVGVHuin15Ud6KdKTGUE+mknPw2nkYzKkqXKlIPK2TN1pmNLHl8W5yrQrzz8284pKUtduHjcHjy6yq+jkWtwqilPPvkkM2fO5KeffiI0NFRdc2O327HZbOh0OoYMGcLo0aNp1KgRjRo1YvTo0QQFBXHPPff8x3cvkUgkEknV8Hq9eD1VWIMjc3CqJx999BEAPXr0CNg+depUHnzwQQBefPFFHA4HgwYNIjMzk44dO7Jo0aIKZ+BIJBKJRCK5cKjWLzjlefPU6XSMGDGCESNGVPl6v3bLJiy4GN3Dwuk0OKEXv+wT8szyIV34brcIF3zf3pefRwrJpF2/F9m5ZBQA9bsJiWruB38w/4mOAKz7Iop/HhkIQOtPJ/JZrWsB+Onn3bQNFw6gbxcuo/v/3QqA6/1jABxp3o8abUUH1MiftjOwX3MAXnr1U3oNbQVAcX42J78RXVkRdYWzatePn9Nw6CAANs8u5MFOdcT1PF5ahgqJJM5ixLlaSG8tom0c+m0DALW61hbH/biTVs/eAcD+91agayw+S3axh2R3kPq8NqfkAWBX5JR1hzJpo8hSS47lkqD0TGWl5quyVEH6CcKbCtmscGUq9obi/ornHQXAVKMBHtdWALyR/vVW7tBYdZxZ6EZvFDJRdpG/Z+pEvl+WOplfuix1MleMjbYQ0n1BfxZxrkJNz5TT4cJs8TunbCFizVZ+ThEmZbvqlrIacTnFuUKtRtxFolerZKCf23WqXFWWc8pi9LulrIbStpfeM+UjwCEV4LQ61S1VsmeqtKC+ktIVlC1LleWcKg8B5ylnx1R5nFdVkaXK65qSPVOSc4HH7cWDLNusDOeNi0oikUgkkosN4YTyVOHr3L3gZGZm0r9/f+x2O3a7nf79+5OVlXXaY3Q6Xalf//vf/9R9evToccrP77rrrgrfX7WewZFIJBKJRFI9ueeeezh69KjaOvDYY4/Rv39/5s+fX+YxycnJAd//+uuvPPLII9x6660B2wcMGMCoUaPU7202W4XvT77gaHjr2tex6PQsGPIhAFN61uHrg6IPau9rE/hI6R565Knx3NVZBNNFN+7GZ9+IP9y5yT0B2PxNEGkvPwhAz1ljeKGdqIxYus9ELaWHacI3y3h9qNj/qR8O4bx5IgBxf/wAwAvztnNPP9FhNWniXB4YJGSiwenHccyZDEBoQgO2T58LQL17RVXFml/yGdC9PgArij30VEL1/jQb8Pwtfula2i0c+fUvAOp0q82BJYcAuPL9/gB8M+0fbrq0GwAZzrGkW/3y0NaT+QDYTXrWH84EoLYiS608nEk/u5By0lNyiWooQvzy0k4S0US4pRybUghvLBxozkVpmGq3B8Dt3CsuEFfXH+gXFq9eN6sYVZbKKnRjsIhf9mSf5GS2cTJfjA1mGydyCoHTyFJWK7mKjOVzThU5XFhsyriwWJWlctIKCI8R211ONxaLbyxcZIFuKWOZgX5aWcrncNI6p7RSlE9K8nrcGLXblbFPlrIYAiUnv3PK30tVsmeq1O0awaQ8jipxvH98tgL9KiNLlbnPGfeo2vnh4uqZkrLUf4PX7cVbBYnqXM3g7Ny5k4ULF7JmzRo6dhT/ffr000/p3Lkzu3fvpkmTJqUe54t38fHTTz/Rs2dP6tevH7A9KCjolH0ripSoJBKJRCKppnjc3ip/nQtWr16N3W5XX24AOnXqhN1uP21VkpYTJ07wyy+/8Mgjj5zysxkzZhAdHc0ll1zC888/T25uboXvUc7gSCQSiURygVOyc7GqgbcpKSnExsaesj02NlaNdDkTX375JaGhoWpZto97771Xzbnbtm0bw4YNY/PmzSxevLhC9yhfcDR0q20n2GDg/aVCyrHPn8+be4UUdfczH7PnOlH2aY2IZ+J40Q01Y9e77F4gFkeFfvwCADfNepmR14quqyYP
vau6jD747C+WP9YBgDeWbSZihpCaordO5cVfRLjdzTe2BmD29D+Y9uUAAMa8vAfPvAkABEUlsumj3wGod/0IVi2dCsCDvRoCsOlVNy82iQJgs0mPfv1PALSyWzg67zcA6nerzcElBwG44u1bmfP9LgCu7SBKS08UTSbbLkIV3V7YdrIAgBCjnjWHMgBItJqYuz9dPDcl+C49OY/IRiKEL/dkKpFNhNPMsTOFyD7CLeVcmYO5rnCmuZ2L0ceL6/hlKX8idbbbiE4vnl2Gw++cSs4rwqh0RqUokpPRGkxytpClzMF2dWwKtnNSkavMQcFkK/1TFquJwnwhMZl9spSjGIsiIRZmOLBHKV1UzGpOgAAAJo9JREFURS5VliouchKiSHI+t1SI1YTbqYwtgUF/PudUiNUYIEv59gl0SPnkKk+AW8pq1MhQJTSRsmSpkj1TpW3XSk7anqlAmahsh9K5kKXK4nSSUUVlorKDCKUsJal+eD0evFX4c/f11tWqVStg+/Dhw0t1H48YMaLUrkYt69aJjsDS/s6cqSpJyxdffMG9996L1WoN2D5gwAB13KJFCxo1akT79u3ZuHEjbdu2Lde5Qb7gSCQSiURSbTlbNvGkpCTCwsLU7WXN3jz11FNndCzVrVuXLVu2cOLEiVN+lpqaetqqJB9//vknu3fvZvbs2Wfct23btphMJvbu3StfcCpLgyULCA0N46UNIoum24DJ7L9JZL28rbfxv+dFK/hXW1azf9lnADT9YSQdZzwj9rnlHQBa3jwcg068Ab/+4TL+eET8gYxeuYIaf30CQNSDX/Lcz2LWpu+tnZn/nVj0u+2LhwH4aNR7mH8TMzxBUYlsmPAzAPWufI2V74lfiEcmNmHzKLHw+elLRMP2AZMeyyYxA9XKbiXp+7kANLmiFvt+FQt5uw6/kV9/mQFAny59OVEkZoGyoxoDYtZm8wmxmDjEqOfPA2KmJtFq5Je9aQA8G2Im9aiY8oy5RDSPZ59II6aFmIEp2HmM6KvErFLRukysDYVO63L8jqGGuI7XsxBXROC/KnKw+mdtCt1qu3dKXhEmpVIhObcIo1VkEiX7FhPbQkjOEmOjNSRg1iZDycGxWE0UFvhnbYocYuybtcnNcBAWKa6XXuTCqmwvOWsTrrSul2fWprSqhmCTQZ3tKG3WxuNyljpr49Hm4JzlWZuymsG120v+m+xczNqUt8LhbOXdlOdfmpWpcDhb8yyygkHi9VRxkbGSghwWFhbwglMW0dHRREdHn3G/zp07k52dzd9//81ll10GwNq1a8nOzqZLly5nPP7zzz+nXbt2tGrV6oz7bt++neLi4oDOyfIgFxlLJBKJRCKpEM2aNeOaa65hwIABrFmzhjVr1jBgwAD69u0b4KBq2rQpc+bMCTg2JyeH7777jkcfffSU8+7fv59Ro0axfv16Dh06xIIFC7j99ttp06YNXbt2rdA9yhcciUQikUiqK1UK+fPAOSzbnDFjBi1btqRPnz706dOHSy+9lK+//jpgn927d5OdnR2wbdasWXi9Xu6+++5Tzmk2m/n999+5+uqradKkCU8//TR9+vRhyZIlGAyGCt2flKg09BgwCZ3Ryt6+wo42iSjeHDgTgO92rmV3e/EH1+r7EVz5w8sAjLz+LZqtfkk5g5Cohv5vEcuVqoaxy5ZSc9WXAEQ/OI1B8/cBcOtd3fh+5lIAdkwbwLQxHwBgWygW9AZFJbJ+nFjI3KDPGyx9ZxYAAyc0Z9NoIa081TKOQybxjhq0aR4AbcOtHPnmewAu6VmHPT/tBODyUTezcKGQpXp3u4kkxzQAsqKb4lSmMDem+GWpPxQpKtFqZP7Ok+JzhVr44oj4RY25JJqsZKG/+mSp/J1HiL5SI0s1FouJXY7FGGqLTB+vZxGuyNrqM89BLC7zyVKpBS5Vljqa45elkrILVVnqSGYB5mC72CdDyETm0EiSs8XYEhoWIEs5lIXFZpuRQiX7xmIzkascW1FZ
KkRZcFwVWcq38K+ispQ2HwcqJ0upbeIVlKV8csl/IUudTiaSstTZQ8pS1Q+P24unCoWZnioUdZ6JyMhIpk+fftp9Sqtceuyxx3jsscdK3b9WrVosX778rNyfnMGRSCQSiURywSFncCQSiUQiqaZ43d5yFU+Xefw5nMGp7sgXHA3W8Dj0JhtvPP0FAH8lb2Fzy68AqD3hSbotnQDACx2eoNbS5wEIM+p54W1Rr7DxNZEjM/aHFdhXCJdV/OAfuGuGqHsY8HBPJk2cC8DRb59hylvifIZvRxOa0ACAVSO+A6DZvf/H4lHfADD04xaseEu4pV68NJbDSry/ZeVMOinyyoEvhYTV8poG7PpRyFJXvt+fn+ZPAaD3lberslRqRCN84Zarj+ZiV2SuRbuEFFU3yMR3O4T89HKUjU8PZ4nP0jqWzGPHAYhrW5v8TUcAiL2uKQCFa9KwNu0OgMuxEH2dSwDFLRVVV33OmV5hT9QbzZzMF5/LqJGifLLUkWyHKksdztDIUpkOTOpYSHrmoGDSlewbrSxlDTZRWKBIRiEWctLE/vboINKThRSplaXCg8T4dLJUqNVX1eDLwfHLUiFWY5mylMeluLaMen8Ojk+i8rjV2oXTyVJa6QnKJ0sZ9KXLUiX38VGWLFVSvqhK9UJFZamSe0hZqmroyiEdSqoHHm8VJaoqHHu+IyUqiUQikUgkFxxyBkcikUgkkmqK2+vFXYVZmKoce74jX3A0/DP5bsLCwli7VwTfZdzRl1u2LQBgcFwP8ju9CEAvu5UhI8U+x794mBHvCEkoa9I4AOqnLqXfpDUAvDm4B0+/8BEA3816jDFH9wCQPm4IUQ1FAODy1z6mzasfA7Dw2c/FcbdfyvevClnh9Tp69imyiGfeBLrVFGFNOyfP5tI7WwKw/pstANz41RC++XYsAFd3v4PjhSIsMMnqdy4tO5RFjEXIXPO2JtNeqVr4cruQpXonhpByKAuAhHbxZB0VUlRCx4bk/6nIUrdcQuGywwBYm98AgKvwW/R1LwXA6/kFV5S/HTbVKa6nN5pJyRNSjcFi42CWkHlMihR1KMuBKVh8vgOp+VhCRfXD4fQCzL5xWj6WYCFjZSqylDXIjCNXkaJCzapEFRJuJeOECGuMiA0htVBcO8hmwukQ144KEZ/f5cjDrshSrsI87Ip05XI6VOnK5XRgV8alyVJBJr8UFTj2S0lBJsMpbeBet19+8rr99Qwlqxq02wHMxjPLUmU1gp+uDdx3jdO1e1elEbysY08nS2kp65iyrlGe85SXC0GWkpw/uL1Qlb7Mc9S1eV4gJSqJRCKRSCQXHHIGRyKRSCSSaoqUqCqPfMHR8FPTy7HpDXTbLZpSx0S34OlJoi/qnXYJXPO+cEZ9uXYqTz8qoqd/aNSfDneIjqeb3/4DgNkv96DTTSII8N5rb2FgvgjH2/3sk9TqKLqm5r33Krd+/iQAv/3wDpPuENLOpGfEL+OVhsPsChNuo6wvxtKztSgv2zDhZ1o9Kno+5o79nQG/ix6sjyYPBuCmK+4mtWg0ADtdEZiVOfhf9qaRaBXSyncbjnJrqDj3gq0p3NdYSD/HD4im8JqdapBxRHymGj2ak/9TEgBRHdtR+PNGAKytrsXtFC3knlotlCf4LQ57TUAE9x1XHFIGs40jSvCeyRbCnnSf8ymM/RkiXNAnP+0/mYc1TOnVSs3DYhfjw2n52EKFjJWTXUSQcv/5ynltoWYKFFnKHhVEboZ45tGJoTiVzil7iJn9BeJ6USFmXA4hXUUGi3O5nQ6iFLlOK0u5iwJlqTDlOapBf2ZjgCzlk4ZsJV1Uythq1KtdUmXJTz7nFPjdUaIzKlBnKNMtpStdutKXIldpz+v1uDGUIWmVvHZ5ZCmtTFSWLKWlou6oU44/B7JUVZUdfcDnlrKUpGJIiaryyBcciUQikUiqKZ4qzuBIm7hEIpFIJBLJBYScwdFwvNCNVeel42PCybTxzWup+8NsAFqu+J3Y
wSLQ75Zl8OywhwB4dvh0jn4rZKKwLkJyqrdwPcExtQBYds+rtLp7DACz33qcN1eLjqoVnxQy4bpGAIwyGajzjwj46xkTBMCe0W9x5Y2NAfh7/B9c+X5/AN65fwqdvxsEwLZXfyGtYQ8A8lyi22jxkQI1uG/6hqM0VhxC3606zMux4tzTt55gWAfRH5Wy7wi1e4j7yNq0FYBaD7Qnf6KQpeydulM0fSEApkvuw+MS7rDiRJ8sBRmmCEA4pI5kC6nGaAthb7rSExVsZ1eaIkWFRLDrhBKwZ49hV7J/DLDnRC5Wuzjf8fQCgkKEfJSXVUiwItnlZTsIUsa+4L6YWmFknRTXSKwbzlGHkK6iwyzsyRfXiAqxBMhSbiWozydFFRfmBUhRkcqz87ichGmC/oJMBmUsPqvNZAhwUbm1jirVOeWXpSwGvUa68stSFqVIzqsN+ispV2mkJAiUpYz6QLlKHZeQnM5Wl5ShHMdrKcshVRlZqizpq6zjyyNLleeeyosM7pOcLdxUUaI6a3dy/iFfcCQSiUQiqaa4vV7cyEXGlUFKVBKJRCKRSC445AyOhqc3zCQsNISxD38LwI+9XqRb+DEA2g9dwOJxNwJw6bVD+XbiNQC8k3mCnXeL7fW7CYlq2rMv8tg3wmX1fd9Pmf1EJwDeHe7hTttBAPLCraSPGwLADV1rsvol0Rl1+Uuiz2rm8AU8tUqE9H0681F63fA0AMcLJ7PZWwMAs17HzC0pgOiPAvjkzwM8ECH6qd5ec4QPmkcD8L/tx6nfR/Rdpe7bQb3rRMhgzrd7iH+oGwCOP4R7LKjTw7jeEcGDNOmM1/MLALkRDdDphYyS5NBjMIvr7E0XYXvmYDtbTwo5yBISwXbf2B7NtmM5ANgi4tmVLMbWiHj2HBfj4AjRLZWaVkBIuFVcL8OhjvOyComIEy6qk0eyiU4UYYAphzIBCA+1cCBfyE8J4XXYpjjXYkKtFBeIcWyoheJCsU9smIVin1zlk6KKnURo+qdCzT5ZqljTRVVMiNkvJQGEmrVSlCHAIeUba91VPolLO/Z63AGhfUaNFGVUdhcOp9O4qMqQogx6zbgcDqmAXqpyBv1VJbivzP3LIUOd7lzlQTqkJNUdt7dqMpN0UUkkEolEIql2yBecyiMlKolEIpFIJBcccgZHQ6cPD2KwBPHrp8IV1fOO18mbejsAIV8vxfPSVwAktrufr69/FYB7P57FZ3fdBMDc5J4AfPhhER9cIpw0E0MtBH31OgC3tU9gzaOvANB3aE9+fncpAA8tGsewLkMA6DJgBAA7n5vDjqj2gHgDn/aPkKISrUbG/S76rG6MsDF+yT4A3mkmpKhP1h/jkuuEFHVsx06a3CrOkfnzZuq82BuAvOGbCb/qXgCKPp+KoY3Y7nGtBCA/oaUqRR31hKpS1K70QszBQkranJKL1S6uuTFZSEDWiDg2HskCICiqBpsOC/koOKY2W5LE9pDoaI4kC2koLDKIbCX0zydF5WQ4sEcJt1fa8RxqNIhUxw2biusd3n6UhHARKLgzV4QTJoQ3ojjfL0U5fbKURoqKCbPgLhLOqchgs+qiilJlqULsVr8UpXVOhShyldvlJFTZ7i72u6X8UlRg55RXG+6n6Y8qzfmkdUiZtJKRrnT5yHdMZaWo8gT1GU4jE5VHigrsuCp9rKUyUtS5dkiVlJ7+TSmq5KWkLHXxIRcZVx75giORSCQSSTXFU0WJynPxvt9IiUoikUgkEsmFh5zB0XB04wp0RguWZyYCUKfTk0zs9DgAQ76Zw/t9rwNgWdo1vPv5CwB8dGkun9uFvBL6sdh2X486rLhZHHfPyOuYOXwBAE+tmszzrR8FYPSSxWx8vSkA7eK7qdOIkzalAcIV9cr8HQA8EB3E2/N3AvBB6zg+XyFkqVE3N+XQxk0AtHzwcgDSvl1DvdeEqyvn1XVE3yy6rwpnfIy+k5DePK515NZsB4BO/xWHEcF6Pinqn5QCLEo31N/HcrBF
iB6sVUcysUUlArDyQAbBMbUBWLs/HRBS1IaDQjIKjY3hwFHhkLJHB5F5QshE9qggslJFIF9EXDAnjwgpqVYTIT+lHUuhQeMoAI7sPEqtaBGYuCs7nZoRIpBwTW4GNSOEjOWTompG2HAqElV8uFWVouLCrAFSlNspHF8RNpMa1KeVouyW0qWoUJ9zyu0+xUUVYjaWOrZq5CqzQTv2S0laWUorV2kqqkrd52y4oso6tjyOKKi4FFWWK+psBfWd7p7KQ3VxREkZSqJFSlSVR77gSCQSiURSTZEuqsojX3A0fD3xOYJCQvmkmahT2JLZmf+bIn61Xi9cwPI6YoFt1hO3M/gxMQPyfbfHuX+GmBl5+5Z3AHjj8BIGx4sFx/WXvMnO55TmcU8zbMo/p5/7eTdtlYW1T8/YyBv1xSzK09+KuoTZ19Tng9//AeCjgV04uFIsAG733I2kjhd1CQ0mPkLuY3MBsN/yPABFn4+Fy+8CxKLh5MhLAFGjsKtQ5MiYgu38lSQyamwRcfyutIgHx4rZkiV7UwmJqwvA4p0nCU1sCMDSnSexJ9YDYMO+NMITxMzO/kNZAETGhZCeIs4bHhNMhjJrE1fbztG9Ypan8aXxpBxKBeDSS+M4+M8B8ZxixfV2ZKdSP0bMbK3KTqWOsuC4KC+DOtHKrE1+NjUixWyTr3ohwW7FVShmhuJDLOo4OsgUMGvjUmZzIqwmtVLBt7DYXez0LzIu9s/meD1uzeJjJ8ElZnACFhBrZmdKjn2YyjFTU552byh7pqasGoXyLDjWHqu9mqHEzEJV8mvKOra8LePn00xNWbMzcqZGUh7EC05VZnDO4s2cZ8g1OBKJRCKRSC445AyORCKRSCTVFClRVR75gqMh+rUBhJiMvDikCwCzG3bnpfki72bk9W8x4thyAJ6OuZznTwopacUnLXEn3KCcQUhUA5dm0SVCyE93fbKWN5uLpuwnPlnL90pD+DWz/+T1oULGemHREq4Ycw8AR8YtBqDFlKGkPTgLgLj3X6Vg3tsA6G54Dddo0SZ+vF53dPr5AGzzCrnIFGxn0SEhEwVFJTJvt5CDQhMb8O3m4wDYazTm+02igsJeuzk/KeOoukIa+mNLMlF1xALiLbtTiamlLPo9kElsLSHTnTyaQ3ydcACS9oiF0c1aJ7BljaiiaN6jESu2HwLgis612b1KPK8WNZqwPlNk+jSKa8uinFRlHAJAYXaqOi7Oz1YlquL8HGqEiWfqKswnQWkZ90lRscEWVX6KCjLjUhYZRweZ1bya6CATHmUcaQscg5CfQjWylLZGwapp9/bJTf5G8NKlKHOJjBt1H/2ZFxkHyFgB8hYBlG+Rcel1DmXJT+WRnkp+X56FyeWRn6oiPYlrlC4/nSspSspPknONXGRceaREJZFIJBKJ5IJDzuBIJBKJRFJN8QKeKh5/sSJfcDRM/20/Zp2erhumArDno+4MSmsNQJdgE92n7AbgzeYxXDtiCQDf39aMa0bPBWDDa6IJvPlnP/DhpyLv5qlxP9JjppCXDj84i+a/ioyd9GvfJuJr0Rbu+GEQaT1FhYP3/4Qbalv0ZZiCfwXg14xggpT8mWn/pGCv3QyAKWuTiKzfCoAJy4UbKbpxBz5eIcZxzdoz808hGcU3ac5vfycBkNi0PpuUFvKajeM5uFtITIlKLULSnjSatU4AYMuag3TtIfJnVvy2hY5thXts/oYd3HClqITYvnw9AG1vbM7q+ULGa1PnMhamC+mrTe1efJctpKimCaEUZYvrNYkNwZkr6hzqRypSVEEOtezCIeUsyKFGqJClih15fomqyEFCqEUdA8QFm1XJKS7EjEdxSEUGmdSx3WLyN4BbDOp2m8kvP9k0UpTN5JePLBoNx2rw7yO+L11+Kpl940MrOZnLkJ/KkqsMJXQeYxnyU1njALeUvnSJSVcOp1XJ78tT1VBR+el0EtO5kJ/KKzdJ+UnybyIlqsojJSqJRCKRSCQXHHIGRyKRSCSS
aop0UVUe+YKjYfikuwmzWUgY/C4Amb+PIfRpISNN2TSbJ256H4ArVv/G4e5CSkpY9j0ZVw4BIOtDcZzzl9dYdakI/zMHv8cMl5CUQhMaMG67+FWNatiW534Wkld8q548O3c7ADU7CJnr2e+2UPeyKwEY/eM26nXsKu5j/k4adGgDwJzF+2h0mTj3ypWHAWjavg471gspql2Xuvy9VFQ89Lm+Fb/OEQGBt991ObOn/wHADY/24ZOPfwbggX43AfC/RX9y1b2tAfjr+4Vc2VRce8FXB7mi4dUAzE4/Tvs6Ipzwo3ThzmpXK1yVn1rEhVKUJ+SnJtHBFClSVKPIYLVSoX5kkNr0XTdckagcedRWqi/cRQ5qKrKUx+UkJtjX+u0gKsjvfAIItxrVcZjF3+IdoqlLCDHr1X20rd/BxjIkqgC5yj+2GAM1Cq3MZNEk95nLlKK0QX+lj01lhPYZS+hEZbqlyjEuK9yvLKdVecP2yuNkOldup/LITNLtJDmfkBJV5ZESlUQikUgkkgsOOYMjkUgkEkk1RUpUlUe+4Gh41nANJkMIsc2F3HPdpnjqXd4XgO6zMmh29W0AdH13Ha1uEn1Pvd5eRoc7REjfzW8L2afz3bfz+LsrAOh5Tz9e/3CZON89VzP5MzG+9a5u/DBL7PPgg1fy2ZRfABj8VD8A3n/ve14dJq7x1tszGDPyAQBeevVT3h8rmsqffuEjXn/gSQAee+YDAMY9MYT7Z4nuq4eG9mDhFyIs8O521/PN+0KuuuXS2/n0+H5xT81ieTflkPgsDUSj98j043SvKxxVjswTdKwVDkBhdhrtEsMAKMrN4FIlkM8nOTWL9stPjaKC1J6ouuH+Ru/ados6Tgw1q+O4YH/XU5TNP460GdRxhNUfvGe3BPZB2S1+ySnUrJGoAsb+CctgjQYUpBnbNPKTdmw1lD6G8slSZcpVZYzLkq5K9kGV7ZzSVWhcGfmorJ9VRSaqjJNJykySCxkpUVUe+YIjkUgkEkk1xVPFGRzPxft+c+GswZk8eTL16tXDarXSrl07/vzzz//6liQSiUQiuWB5++236dKlC0FBQYSHh5frGK/Xy4gRI0hMTMRms9GjRw+2b98esE9RURGDBw8mOjqa4OBg+vXrx9GjRyt8fxfEDM7s2bMZMmQIkydPpmvXrnzyySdce+217Nixg9q1a5f7PHMnfY7OYCZ7tXBO2TsPqvAY4J//TVbH2yZMxj5pCgBTPr4d+7ti3/9dfz9T3poAwBu9HuXd14R89MIVgwF468X9PNGhBgAvnzjE/a1E19Tg9OPccYnothqQmcKNTURP1P1KkN41DSMoys0AoFc9O8WKZNS1ViiuQiEZXVYjRB23iQ9WZaIWsSJgz+100CTK715qGGFRx/XCzeq4jl2MfRJQzTB/kF6NUP84IcQ/jg32/8rFBPnHUZqxT5YCCLeWPvZJVD5CNd9rpSjtOKgsWaqMsVZ6KmsM5ZOiypKcypSiyjE+3c+kTCSRXBhUZ4nK6XRy++2307lzZz7//PNyHTNu3DjGjx/PtGnTaNy4MW+99Ra9e/dm9+7dhIaGAjBkyBDmz5/PrFmziIqKYujQofTt25cNGzZgMBjOcAU/F8QMzvjx43nkkUd49NFHadasGRMmTKBWrVp89NFH//WtSSQSiURSadwoC40r+3UO723kyJE8++yztGzZslz7e71eJkyYwKuvvsott9xCixYt+PLLLykoKGDmzJkAZGdn8/nnn/Puu+9y1VVX0aZNG6ZPn87WrVtZsmRJhe7vvJ/BcTqdbNiwgZdffjlge58+fVi1alWpxxQVFVFUVKR+n50tZjm87mIAcnJylO+dFR77jq/KWF5bXlteW15bXrv6Xtv33wrvv7CA11mlJir/8b7P5cNisWCxWKp07opy8OBBUlJS6NOnT8B9dO/enVWrVvH444+zYcMGiouLA/ZJTEykRYsWrFq1iquvvrr8F/Se5xw7dswLeFeuXBmw
/e233/Y2bty41GOGDx/uRXSQyS/5Jb/kl/ySX5X6SkpKOmf/bXM4HN74+Pizcp8hISGnbBs+fPhZu9epU6d67Xb7GfdbuXKlF/AeO3YsYPuAAQO8ffr08Xq9Xu+MGTO8ZrP5lGN79+7tfeyxxyp0X+f9DI4PXYlFAF6v95RtPoYNG8Zzzz2nfp+VlUWdOnU4cuQIdrv9nN7nhUJOTg61atUiKSmJsLCw//p2zhvkc6s48plVDvncKk55n5nX6yU3N5fExMRzdi9Wq5WDBw/idDqrfK7S/ntY1uzNiBEjGDly5GnPt27dOtq3b1/p+6nIf68rsk9JzvsXnOjoaAwGAykpKQHbT548SVxcXKnHlDU1Z7fb5f8RVJCwsDD5zCqBfG4VRz6zyiGfW8UpzzP7N/4xbLVasVqt5/w6Wp566inuuuuu0+5Tt27dSp07Pj4egJSUFBISEtTt2v9ex8fH43Q6yczMJCIiImCfLl26VOh65/0iY7PZTLt27Vi8eHHA9sWLF1f4YUgkEolEcjETHR1N06ZNT/tV2ZeuevXqER8fH/Dfa6fTyfLly9X/Xrdr1w6TyRSwT3JyMtu2bavwf9PP+xkcgOeee47+/fvTvn17OnfuzJQpUzhy5AgDBw78r29NIpFIJJILkiNHjpCRkcGRI0dwu938888/ADRs2JCQEJF037RpU8aMGcPNN9+MTqdjyJAhjB49mkaNGtGoUSNGjx5NUFAQ99wjGgHsdjuPPPIIQ4cOJSoqisjISJ5//nlatmzJVVddVaH7uyBecO68807S09MZNWoUycnJtGjRggULFlCnTp1yHW+xWBg+fPi/vqL8fEY+s8ohn1vFkc+scsjnVnHkM6sYb7zxBl9++aX6fZs2bQBYunQpPXr0AGD37t2qUxngxRdfxOFwMGjQIDIzM+nYsSOLFi1SM3AA3nvvPYxGI3fccQcOh4NevXoxbdq0CmXgAOi83ou4qEIikUgkEskFyXm/BkcikUgkEomkJPIFRyKRSCQSyQWHfMGRSCQSiURywSFfcCQSiUQikVxwXPQvOJMnT6ZevXpYrVbatWvHn3/++V/fUrVixIgR6HS6gC9fWBOIdMkRI0aQmJiIzWajR48ebN++/T+843+fFStWcMMNN5CYmIhOp2Pu3LkBPy/PMyoqKmLw4MFER0cTHBxMv379OHr06L/4Kf59zvTcHnzwwVN+9zp16hSwz8X23MaMGUOHDh0IDQ0lNjaWm266id27dwfsI3/fAinPM5O/axcmF/ULzuzZsxkyZAivvvoqmzZt4oorruDaa6/lyJEj//WtVSsuueQSkpOT1a+tW7eqPxs3bhzjx49n4sSJrFu3jvj4eHr37k1ubu5/eMf/Lvn5+bRq1YqJEyeW+vPyPKMhQ4YwZ84cZs2axV9//UVeXh59+/bF7T6XXcD/LWd6bgDXXHNNwO/eggULAn5+sT235cuX8+STT7JmzRoWL16My+WiT58+5Ofnq/vI37dAyvPMQP6uXZBUqLnqAuOyyy7zDhw4MGBb06ZNvS+//PJ/dEfVj+HDh3tbtWpV6s88Ho83Pj7e+3//93/qtsLCQq/dbvd+/PHH/9IdVi8A75w5c9Tvy/OMsrKyvCaTyTtr1ix1n2PHjnn1er134cKF/9q9/5eUfG7e/2/vXkKbWMMwjj/qSYq0pRAvzcRgCIpuUgQrSERUChYKRaGb6sYsRFCIUAxduXArSN25kiItCK4qCC680KQgpSAxYL2AwdbLoqFYirZWGi/v2ZwTTkxro6LxzPx/EAiZCXx5eBdPp5N8ZpZIJOzw4cPLvofczKanp02SjYyMmBnzVo2vMzNj1tzKs1dwisWistls2ZbsktTe3q7R0dEarerPlM/nFQqFFI1GdeTIEU1MTEiSJicnVSgUyjKsq6vT/v37yfAf1WSUzWb18ePHsnNCoZBisZjnc8xkMtq4caO2bdumEydOaHp6unSM3FT6AbVAICCJeavG15n9
i1lzH88WnDdv3ujz588VG3I2NzdXbNzpZbt379bg4KBu3bqly5cvq1AoaM+ePZqZmSnlRIbLqyajQqEgv99ftrHc1+d4UUdHh65evarh4WH19fXp/v37amtr0+LioiRyMzOdOXNGe/fuVSwWk8S8rWSpzCRmza1csVXDz/iRbdu9pKOjo/S8paVF8XhcW7Zs0cDAQOkmPDJc2Y9k5PUcu7u7S89jsZh27dqlSCSimzdvqqura9n3eSW3ZDKphw8f6t69exXHmLelLZcZs+ZOnr2Cs379eq1Zs6aiff9323ZUqq+vV0tLi/L5fOnbVGS4vGoyCgaDKhaLmp2dXfYcSI7jKBKJKJ/PS/J2bqdPn9aNGzeUTqcVDodLrzNvy1sus6Uwa+7g2YLj9/vV2tpatiW7JN25c+e7t2T3ksXFRT19+lSO4ygajSoYDJZlWCwWNTIyQob/qCaj1tZW+Xy+snOmpqb06NEjcvyPmZkZvX79Wo7jSPJmbmamZDKpoaEhDQ8PKxqNlh1n3iqtlNlSmDWXqM29zX+Ga9eumc/ns/7+fnvy5In19PRYfX29vXjxotZL+2OkUinLZDI2MTFhY2Nj1tnZaY2NjaWMzp8/b01NTTY0NGTj4+N29OhRcxzH3r17V+OV/z5zc3OWy+Usl8uZJLt48aLlcjl7+fKlmVWX0cmTJy0cDtvdu3ftwYMH1tbWZjt27LBPnz7V6mP9ct/KbW5uzlKplI2Ojtrk5KSl02mLx+O2adMmT+d26tQpa2pqskwmY1NTU6XHwsJC6RzmrdxKmTFr7uXpgmNmdunSJYtEIub3+23nzp1lXx2EWXd3tzmOYz6fz0KhkHV1ddnjx49Lx798+WLnzp2zYDBodXV1tm/fPhsfH6/hin+/dDptkioeiUTCzKrL6MOHD5ZMJi0QCNjatWuts7PTXr16VYNP8/t8K7eFhQVrb2+3DRs2mM/ns82bN1sikajIxGu5LZWXJLty5UrpHOat3EqZMWvutcrM7PddLwIAAPj1PHsPDgAAcC8KDgAAcB0KDgAAcB0KDgAAcB0KDgAAcB0KDgAAcB0KDgAAcB0KDoCqHDhwQD09PbVeBgBUhYIDAABch4IDAABch4IDoML79+917NgxNTQ0yHEc9fX11XpJAPBdKDgAKvT29iqdTuv69eu6ffu2MpmMstlsrZcFAFX7q9YLAPBnmZ+fV39/vwYHB3Xw4EFJ0sDAgMLhcI1XBgDV4woOgDLPnz9XsVhUPB4vvRYIBLR9+/YargoAvg8FB0AZM6v1EgDgp1FwAJTZunWrfD6fxsbGSq/Nzs7q2bNnNVwVAHwf7sEBUKahoUHHjx9Xb2+v1q1bp+bmZp09e1arV/P3EID/DwoOgAoXLlzQ/Py8Dh06pMbGRqVSKb19+7bWywKAqq0y/uEOAABchmvOAADAdSg4AADAdSg4AADAdSg4AADAdSg4AADAdSg4AADAdSg4AADAdSg4AADAdSg4AADAdSg4AADAdSg4AADAdSg4AADAdf4GfrCb/RNooScAAAAASUVORK5CYII=", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "pos_encoding = positional_encoding(128, 256)\n", + "\n", + "plt.pcolormesh(pos_encoding[0], cmap='RdBu')\n", + "plt.xlabel('d')\n", + "plt.xlim((0, 256))\n", + "plt.ylabel('Position')\n", + "plt.colorbar()\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "id": "f633ce92", + "metadata": {}, + "source": [ + "Each row represents a positional encoding - notice how none of the rows are identical! You have created a unique positional encoding for each of the words." + ] + }, + { + "cell_type": "markdown", + "id": "0dd7e035", + "metadata": {}, + "source": [ + "**Congratulations on finishing this Lab!** Now you should have a better understanding of the positional encoding in the transformer and this will surely help you with this week's assignment!\n", + "\n", + "**Keep it up!**" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "85e793c3", + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.10.11" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L3_dot-product-attention_S01_introducing-attention_stripped.png b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L3_dot-product-attention_S01_introducing-attention_stripped.png new file mode 100644 index 0000000000000000000000000000000000000000..674b2b4d83eed0f49ae959225666ec0aeef93f08 Binary files /dev/null and b/NLP with Attention 
Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L3_dot-product-attention_S01_introducing-attention_stripped.png differ diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L3_dot-product-attention_S03_concept-of-attention_stripped.png b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L3_dot-product-attention_S03_concept-of-attention_stripped.png new file mode 100644 index 0000000000000000000000000000000000000000..de8f2de2c7148521578e465ff8664624f876f8b6 Binary files /dev/null and b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L3_dot-product-attention_S03_concept-of-attention_stripped.png differ diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L3_dot-product-attention_S04_attention-math_stripped.png b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L3_dot-product-attention_S04_attention-math_stripped.png new file mode 100644 index 0000000000000000000000000000000000000000..c8ae372d45ffd43a58c5339043fdf34940c5847d Binary files /dev/null and b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L3_dot-product-attention_S04_attention-math_stripped.png differ diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L4_causal-attention_S02_causal-attention_stripped.png b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L4_causal-attention_S02_causal-attention_stripped.png new file mode 100644 index 0000000000000000000000000000000000000000..33a98d098d16a7ee1dec3001b34ea2f525d6fb06 Binary files /dev/null and b/NLP with Attention 
Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L4_causal-attention_S02_causal-attention_stripped.png differ diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L4_causal-attention_S03_causal-attention-math_stripped.png b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L4_causal-attention_S03_causal-attention-math_stripped.png new file mode 100644 index 0000000000000000000000000000000000000000..18d9609f9d84259c9dc622451c6f7723eabd494b Binary files /dev/null and b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L4_causal-attention_S03_causal-attention-math_stripped.png differ diff --git a/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L4_causal-attention_S04_causal-attention-math-2_stripped.png b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L4_causal-attention_S04_causal-attention-math-2_stripped.png new file mode 100644 index 0000000000000000000000000000000000000000..3608ef08b23c4dde801739526c4553246dbc12eb Binary files /dev/null and b/NLP with Attention Models/Text_Summarization/Attention_Masking_PE/home/jovyan/work/images/C4_W2_L4_causal-attention_S04_causal-attention-math-2_stripped.png differ diff --git a/NLP with Attention Models/Text_Summarization/Summarization/tf/.ipynb_checkpoints/C4W2_Assignment-checkpoint.ipynb b/NLP with Attention Models/Text_Summarization/Summarization/tf/.ipynb_checkpoints/C4W2_Assignment-checkpoint.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..01bf9322060a15633256b4376d8fd1b06a7a882c --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Summarization/tf/.ipynb_checkpoints/C4W2_Assignment-checkpoint.ipynb @@ -0,0 +1,2001 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "05014ac7", + "metadata": { + 
"colab_type": "text", + "id": "7yuytuIllsv1" + }, + "source": [ + "\n", + "# Assignment 2: Transformer Summarizer\n", + "\n", + "Welcome to the second assignment of Course 4. In this assignment you will explore summarization using the transformer model. **Unlike the lecture, you will be implementing an encoder-decoder model. However, don't worry; you will be guided through all the steps, and you will find numerous hints to assist you!**\n", + "\n", + "There are many hints in this notebook, so feel free to use them as needed. In fact, by the end of this notebook you will have implemented the full transformer (both encoder and decoder), but you will only be graded on the implementation of the decoder, as the encoder is provided for you.\n" + ] + }, + { + "cell_type": "markdown", + "id": "d00e9709", + "metadata": { + "colab_type": "text", + "id": "4-3lxSnXRWPx" + }, + "source": [ + "## Table of Contents\n", + "\n", + "- [Introduction](#0)\n", + "- [1 - Importing the Dataset](#1)\n", + "- [2 - Preprocess the Data](#2)\n", + "- [3 - Positional Encoding](#3)\n", + "- [4 - Masking](#4)\n", + "- [5 - Self-attention](#5)\n", + " - [Exercise 1 - scaled_dot_product_attention](#ex-1)\n", + "- [6 - Encoder](#6)\n", + " - [6.1 - Encoder Layer](#6-1)\n", + " - [6.2 - Full Encoder](#6-2)\n", + "- [7 - Decoder](#7)\n", + " - [7.1 - Decoder Layer](#7-1)\n", + " - [Exercise 2 - DecoderLayer](#ex-2)\n", + " - [7.2 - Full Decoder](#7-2)\n", + " - [Exercise 3 - Decoder](#ex-3)\n", + "- [8 - Transformer](#8)\n", + " - [Exercise 4 - Transformer](#ex-4)\n", + "- [9 - Initialize the Model](#9)\n", + "- [10 - Prepare for Training the Model](#10)\n", + "- [11 - Summarization](#11)\n", + " - [Exercise 5 - next_word](#ex-5)\n", + "- [12 - Train the Model](#12)\n", + "- [13 - Summarize some sentences!](#13)\n" + ] + }, + { + "cell_type": "markdown", + "id": "ee0da363", + "metadata": { + "colab_type": "text", + "id": "H4NlfEQhRWPy" + }, + "source": [ + "\n", + "## Introduction\n", + "\n", +
"Summarization is an important task in natural language processing and can be useful in consumer and enterprise applications. For example, bots can be used to scrape articles, summarize them, and then you can use sentiment analysis to identify the sentiment about certain stocks. Who wants to read an article or a long email today anyway, when you can build a transformer to summarize text for you? Let's get started. By completing this assignment you will learn to: \n", + "\n", + "- Use built-in functions to preprocess your data\n", + "- Implement DotProductAttention\n", + "- Implement Causal Attention\n", + "- Understand how attention works\n", + "- Build the transformer model\n", + "- Evaluate your model\n", + "- Summarize an article\n", + "\n", + "As you can tell, this model is slightly different from the ones you have already implemented. It is heavily based on attention and does not rely on sequential processing, which allows for parallel computation." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "7b49d856", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 34 + }, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "CChWzW-rEHVb", + "outputId": "a0b3e98b-7fc6-492d-c8ad-3a263b54f670", + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "import os\n", + "#os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "\n", + "import numpy as np\n", + "import pandas as pd\n", + "import tensorflow as tf\n", + "import matplotlib.pyplot as plt\n", + "import time\n", + "import utils\n", + "\n", + "import textwrap\n", + "wrapper = textwrap.TextWrapper(width=70)\n", + "\n", + "tf.keras.utils.set_random_seed(10)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "cfe093e6", + "metadata": { + "deletable": false, + "editable": false + }, + "outputs": [], + "source": [ + "import w2_unittest" + ] + }, + { + "cell_type": "markdown", + "id": "d56fc570", + "metadata": { + "colab_type": "text", + "id":
"kEL2rvaHRWP4" + }, + "source": [ + "\n", + "## 1 - Importing the Dataset\n", + "You have the dataset saved in a .json file, which you can easily open with pandas. The loading function has already been taken care of in `utils.py`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "074bcce3", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "data_dir = \"data/corpus\"\n", + "\n", + "train_data, test_data = utils.get_train_test_data(data_dir)\n", + "\n", + "# Take one example from the dataset and print it\n", + "example_summary, example_dialogue = train_data.iloc[10]\n", + "print(f\"Dialogue:\\n{example_dialogue}\")\n", + "print(f\"\\nSummary:\\n{example_summary}\")" + ] + }, + { + "cell_type": "markdown", + "id": "04210324", + "metadata": {}, + "source": [ + "\n", + "## 2 - Preprocess the Data\n", + "\n", + "First you will do some preprocessing of the data and split it into inputs and outputs. Here you also remove some of the characters that are specific to this dataset and add the `[EOS]` (end of sentence) token to the end, as discussed in the lecture videos. You will also add a `[SOS]` (start of sentence) token to the beginning of the sentences." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "9ba397a0", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "document, summary = utils.preprocess(train_data)\n", + "document_test, summary_test = utils.preprocess(test_data)" + ] + }, + { + "cell_type": "markdown", + "id": "0fe70280", + "metadata": {}, + "source": [ + "Now perform the standard preprocessing with the TensorFlow library. 
You will need to modify the filters, because you don't want the `[EOS]` tokens to be removed.\n", + "\n", + "Then create the vocabulary by combining the data in the documents and the summaries and using `.fit_on_texts()`:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5dfab3c8", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# The [ and ] characters are excluded from the default filters, because they mark the SOS and EOS tokens.\n", + "filters = '!\"#$%&()*+,-./:;<=>?@\\\\^_`{|}~\\t\\n'\n", + "oov_token = '[UNK]'\n", + "\n", + "tokenizer = tf.keras.preprocessing.text.Tokenizer(filters=filters, oov_token=oov_token, lower=False)\n", + "\n", + "documents_and_summary = pd.concat([document, summary], ignore_index=True)\n", + "\n", + "tokenizer.fit_on_texts(documents_and_summary)\n", + "\n", + "inputs = tokenizer.texts_to_sequences(document)\n", + "targets = tokenizer.texts_to_sequences(summary)\n", + "\n", + "vocab_size = len(tokenizer.word_index) + 1\n", + "\n", + "print(f'Size of vocabulary: {vocab_size}')" + ] + }, + { + "cell_type": "markdown", + "id": "7341b3f5", + "metadata": {}, + "source": [ + "Now you can pad the tokenized sequences for the training data.\n", + "\n", + "For the purposes of this notebook, you need to limit the length of the sequences, as transformers are very large models and are not meant to be trained in such a small environment."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c5846dd5", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Limit the size of the input and output data so that it can run in this environment.\n", + "encoder_maxlen = 150\n", + "decoder_maxlen = 50\n", + "\n", + "# Pad the sequences.\n", + "inputs = tf.keras.preprocessing.sequence.pad_sequences(inputs, maxlen=encoder_maxlen, padding='post', truncating='post')\n", + "targets = tf.keras.preprocessing.sequence.pad_sequences(targets, maxlen=decoder_maxlen, padding='post', truncating='post')\n", + "\n", + "inputs = tf.cast(inputs, dtype=tf.int32)\n", + "targets = tf.cast(targets, dtype=tf.int32)\n", + "\n", + "# Create the final training dataset.\n", + "BUFFER_SIZE = 10000\n", + "BATCH_SIZE = 64\n", + "\n", + "dataset = tf.data.Dataset.from_tensor_slices((inputs, targets)).shuffle(BUFFER_SIZE).batch(BATCH_SIZE)" + ] + }, + { + "cell_type": "markdown", + "id": "58b25fb2", + "metadata": {}, + "source": [ + "\n", + "## 3 - Positional Encoding\n", + "\n", + "In sequence-to-sequence tasks, the relative order of your data is extremely important to its meaning. When you were training sequential neural networks such as RNNs, you fed your inputs into the network in order. Information about the order of your data was automatically fed into your model. However, when you train a Transformer network using multi-head attention, you feed your data into the model all at once. While this dramatically reduces training time, there is no information about the order of your data. This is where positional encoding is useful.\n", + "\n", + "You have learned how to implement the positional encoding in one of this week's labs. Here you will use the `positional_encoding` function to create positional encodings for your transformer. The function is already implemented for you."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "0e65672c", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def positional_encoding(positions, d_model):\n", + " \"\"\"\n", + " Precomputes a matrix with all the positional encodings \n", + " \n", + " Arguments:\n", + " positions (int): Maximum number of positions to be encoded \n", + " d_model (int): Encoding size \n", + " \n", + " Returns:\n", + " pos_encoding (tf.Tensor): A matrix of shape (1, position, d_model) with the positional encodings\n", + " \"\"\"\n", + " \n", + " position = np.arange(positions)[:, np.newaxis]\n", + " k = np.arange(d_model)[np.newaxis, :]\n", + " i = k // 2\n", + " \n", + " # initialize a matrix angle_rads of all the angles \n", + " angle_rates = 1 / np.power(10000, (2 * i) / np.float32(d_model))\n", + " angle_rads = position * angle_rates\n", + " \n", + " # apply sin to even indices in the array; 2i\n", + " angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2])\n", + " \n", + " # apply cos to odd indices in the array; 2i+1\n", + " angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2])\n", + " \n", + " pos_encoding = angle_rads[np.newaxis, ...]\n", + " \n", + " return tf.cast(pos_encoding, dtype=tf.float32)" + ] + }, + { + "cell_type": "markdown", + "id": "9e1f1063", + "metadata": {}, + "source": [ + "\n", + "## 4 - Masking\n", + "\n", + "There are two types of masks that are useful when building your Transformer network: the *padding mask* and the *look-ahead mask*. Both help the softmax computation give the appropriate weights to the words in your input sentence. \n", + "\n", + "You have already learned how to implement and use them in one of this week's labs. Here they are implemented for you." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "cfc7471c", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def create_padding_mask(decoder_token_ids):\n", + " \"\"\"\n", + " Creates a matrix mask for the padding cells\n", + " \n", + " Arguments:\n", + " decoder_token_ids (matrix like): matrix of size (n, m)\n", + " \n", + " Returns:\n", + " mask (tf.Tensor): binary tensor of size (n, 1, m)\n", + " \"\"\" \n", + " seq = 1 - tf.cast(tf.math.equal(decoder_token_ids, 0), tf.float32)\n", + " \n", + " # add extra dimensions to add the padding to the attention logits. \n", + " # this will allow for broadcasting later when comparing sequences\n", + " return seq[:, tf.newaxis, :] \n", + "\n", + "\n", + "def create_look_ahead_mask(sequence_length):\n", + " \"\"\"\n", + " Returns a lower triangular matrix filled with ones\n", + " \n", + " Arguments:\n", + " sequence_length (int): matrix size\n", + " \n", + " Returns:\n", + " mask (tf.Tensor): binary tensor of size (1, sequence_length, sequence_length)\n", + " \"\"\"\n", + " mask = tf.linalg.band_part(tf.ones((1, sequence_length, sequence_length)), -1, 0)\n", + " return mask " + ] + }, + { + "cell_type": "markdown", + "id": "89110af6", + "metadata": {}, + "source": [ + "\n", + "## 5 - Self-Attention\n", + "\n", + "As the authors of the Transformer paper state, \"Attention is All You Need\". \n", + "\n", + "\"Encoder\"\n", + "
Figure 1: Self-Attention calculation visualization
\n", + " \n", + "The use of self-attention paired with traditional convolutional networks allows for parallelization which speeds up training. You will implement **scaled dot product attention** which takes in a query, key, value, and a mask as inputs to return rich, attention-based vector representations of the words in your sequence. This type of self-attention can be mathematically expressed as:\n", + "$$\n", + "\\text { Attention }(Q, K, V)=\\operatorname{softmax}\\left(\\frac{Q K^{T}}{\\sqrt{d_{k}}}+{M}\\right) V\\tag{4}\\\n", + "$$\n", + "\n", + "* $Q$ is the matrix of queries \n", + "* $K$ is the matrix of keys\n", + "* $V$ is the matrix of values\n", + "* $M$ is the optional mask you choose to apply \n", + "* ${d_k}$ is the dimension of the keys, which is used to scale everything down so the softmax doesn't explode\n", + "\n", + "\n", + "### Exercise 1 - scaled_dot_product_attention \n", + "\n", + "Implement the function `scaled_dot_product_attention()` to create attention-based representations.\n", + "\n", + "**Reminder**: The boolean mask parameter can be passed in as `none` or as either padding or look-ahead. \n", + " \n", + "* Multiply (1. - mask) by -1e9 before adding it to the scaled attention logits. \n", + "\n", + "**Additional Hints**\n", + "* You may find [tf.matmul](https://www.tensorflow.org/api_docs/python/tf/linalg/matmul) useful for matrix multiplication (check how you can use the parameter transpose_b).\n", + "* You can use [tf.keras.activations.softmax](https://www.tensorflow.org/api_docs/python/tf/keras/activations/softmax) for softmax." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3f434073", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: scaled_dot_product_attention\n", + "def scaled_dot_product_attention(q, k, v, mask):\n", + " \"\"\"\n", + " Calculate the attention weights.\n", + " q, k, v must have matching leading dimensions.\n", + " k, v must have matching penultimate dimension, i.e.: seq_len_k = seq_len_v.\n", + " The mask has different shapes depending on its type(padding or look ahead) \n", + " but it must be broadcastable for addition.\n", + "\n", + " Arguments:\n", + " q (tf.Tensor): query of shape (..., seq_len_q, depth)\n", + " k (tf.Tensor): key of shape (..., seq_len_k, depth)\n", + " v (tf.Tensor): value of shape (..., seq_len_v, depth_v)\n", + " mask (tf.Tensor): mask with shape broadcastable \n", + " to (..., seq_len_q, seq_len_k). Defaults to None.\n", + "\n", + " Returns:\n", + " output -- attention_weights\n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + " \n", + " # Multiply q and k transposed.\n", + " matmul_qk = None\n", + "\n", + " # scale matmul_qk with the square root of dk\n", + " dk = tf.cast(None, tf.float32)\n", + " scaled_attention_logits = None\n", + "\n", + " # add the mask to the scaled tensor.\n", + " if mask is not None: # Don't replace this None\n", + " scaled_attention_logits += None\n", + "\n", + " # softmax is normalized on the last axis (seq_len_k) so that the scores add up to 1.\n", + " attention_weights = None\n", + "\n", + " # Multiply the attention weights by v\n", + " output = None\n", + " \n", + " ### END CODE HERE ###\n", + "\n", + " return output, attention_weights" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "5ed8f5af", + "metadata": { + "deletable": false, + "editable": false + }, + "outputs": [], + "source": [ + "# Test your function!\n", + "q = np.array([[1, 1, 0, 1], [0, 1, 1, 1], [1, 0, 1, 
1]]).astype(np.float32)\n", + "k = np.array([[1, 1, 0, 1], [1, 0, 1, 1 ], [1, 1, 1, 0], [0, 0, 0, 1], [0, 1, 0, 1]]).astype(np.float32)\n", + "v = np.array([[0, 0], [1, 0], [1, 0], [1, 1], [1, 1]]).astype(np.float32)\n", + "mask = np.array([[[0, 1, 0, 1, 1], [1, 0, 0, 1, 1], [1, 1, 0, 1, 1]]])\n", + "\n", + "ou, atw = scaled_dot_product_attention(q, k, v, mask)\n", + "ou = np.around(ou, decimals=2)\n", + "atw = np.around(atw, decimals=2)\n", + "\n", + "print(f\"Output:\\n {ou}\")\n", + "print(f\"\\nAttention weights:\\n {atw}\")" + ] + }, + { + "cell_type": "markdown", + "id": "7b970a6e", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Output:\n", + " [[[1. 0.62]\n", + " [0.62 0.62]\n", + " [0.74 0.31]]]\n", + "\n", + "Attention weights:\n", + " [[[0. 0.38 0. 0.23 0.38]\n", + " [0.38 0. 0. 0.23 0.38]\n", + " [0.26 0.43 0. 0.16 0.16]]]\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "4755bb0b", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# UNIT TEST\n", + "w2_unittest.test_scaled_dot_product_attention(scaled_dot_product_attention)" + ] + }, + { + "cell_type": "markdown", + "id": "8dcbd521", + "metadata": {}, + "source": [ + "Excellent work! You can now implement self-attention. With that, you can start building the encoder block! " + ] + }, + { + "cell_type": "markdown", + "id": "00b9c92a", + "metadata": {}, + "source": [ + "\n", + "## 6 - Encoder\n", + "\n", + "The Transformer Encoder layer pairs self-attention with position-wise feed-forward processing to improve the speed of training, and passes the K and V matrices to the Decoder, which you'll build later in the assignment. In this section of the assignment, you will implement the Encoder by pairing multi-head attention and a feed forward neural network (Figure 2a). \n", + "\"Encoder\"\n", + "
Figure 2a: Transformer encoder layer
\n", + "\n", + "* `MultiHeadAttention` you can think of as computing the self-attention several times to detect different features. \n", + "* Feed forward neural network contains two Dense layers which we'll implement as the function `FullyConnected`\n", + "\n", + "Your input sentence first passes through a *multi-head attention layer*, where the encoder looks at other words in the input sentence as it encodes a specific word. The outputs of the multi-head attention layer are then fed to a *feed forward neural network*. The exact same feed forward network is independently applied to each position.\n", + " \n", + "* For the `MultiHeadAttention` layer, you will use the [MultiHeadAttention](https://www.tensorflow.org/api_docs/python/tf/keras/layers/MultiHeadAttention) implemented in Keras. If you're curious about how to split the query matrix Q, key matrix K, and value matrix V into different heads, you can look through the implementation. \n", + "* You will also use the [Sequential API](https://www.tensorflow.org/api_docs/python/tf/keras/Sequential) with two dense layers to built the feed forward neural network layers." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c3fd59d0", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def FullyConnected(embedding_dim, fully_connected_dim):\n", + " \"\"\"\n", + " Returns a sequential model consisting of two dense layers. The first dense layer has\n", + " fully_connected_dim neurons and is activated by relu. 
The second dense layer has\n", + " embedding_dim and no activation.\n", + "\n", + " Arguments:\n", + " embedding_dim (int): output dimension\n", + " fully_connected_dim (int): dimension of the hidden layer\n", + "\n", + " Returns:\n", + " _ (tf.keras.Model): sequential model\n", + " \"\"\"\n", + " return tf.keras.Sequential([\n", + " tf.keras.layers.Dense(fully_connected_dim, activation='relu'), # (batch_size, seq_len, d_model)\n", + " tf.keras.layers.Dense(embedding_dim) # (batch_size, seq_len, d_model)\n", + " ])" + ] + }, + { + "cell_type": "markdown", + "id": "99d7003a", + "metadata": {}, + "source": [ + "\n", + "### 6.1 Encoder Layer\n", + "\n", + "Now you can pair multi-head attention and feed forward neural network together in an encoder layer! You will also use residual connections and layer normalization to help speed up training (Figure 2a).\n", + "\n", + "The encoder block (Figure 2) is is already implemented for you. Take a very close look at its implementation, as you will later have to create the decoder yourself, and a lot of the code is very similar. The encoder block performs the following steps: \n", + "1. It takes the Q, V, K matrices and a boolean mask to a multi-head attention layer. Remember that to compute *self*-attention Q, V and K are the same. You will also perform Dropout in this multi-head attention layer during training. \n", + "2. There is a skip connection to add your original input `x` and the output of the multi-head attention layer. \n", + "3. After adding the skip connection, the output passes through the first normalization layer.\n", + "4. Finally, steps 1-3 are repeated but with the feed forward neural network with a dropout layer instead of the multi-head attention layer. \n", + "\n", + "
\n", + " Additional Information (Click to expand)\n", + " \n", + "* The `__init__` method creates all the layers that will be accesed by the the `call` method. Wherever you want to use a layer defined inside the `__init__` method you will have to use the syntax `self.[insert layer name]`. \n", + "* You will find the documentation of [MultiHeadAttention](https://www.tensorflow.org/api_docs/python/tf/keras/layers/MultiHeadAttention) helpful. *Note that if query, key and value are the same, then this function performs self-attention.*\n", + "* The call arguments for `self.mha` are (Where B is for batch_size, T is for target sequence shapes, and S is output_shape):\n", + " - `query`: Query Tensor of shape (B, T, dim).\n", + " - `value`: Value Tensor of shape (B, S, dim).\n", + " - `key`: Optional key Tensor of shape (B, S, dim). If not given, will use the same value for both key and value, which is the most common case.\n", + " - `attention_mask`: a boolean mask of shape (B, T, S), that prevents attention to certain positions. The boolean mask specifies which query elements can attend to which key elements, 1 indicates attention and 0 indicates no attention. Broadcasting can happen for the missing batch dimensions and the head dimension.\n", + " - `return_attention_scores`: A boolean to indicate whether the output should be attention output if True, or (attention_output, attention_scores) if False. Defaults to False.\n", + " - `training`: Python boolean indicating whether the layer should behave in training mode (adding dropout) or in inference mode (no dropout). Defaults to either using the training mode of the parent layer/model, or False (inference) if there is no parent layer. 
Take a look at [tf.keras.layers.Dropout](https://www.tensorflow.org/versions/r2.4/api_docs/python/tf/keras/layers/Dropout) for more details (Additional reading in [Keras FAQ](https://keras.io/getting_started/faq/#whats-the-difference-between-the-training-argument-in-call-and-the-trainable-attribute))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "51c1452b", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "class EncoderLayer(tf.keras.layers.Layer):\n", + " \"\"\"\n", + " The encoder layer is composed by a multi-head self-attention mechanism,\n", + " followed by a simple, positionwise fully connected feed-forward network. \n", + " This architecture includes a residual connection around each of the two \n", + " sub-layers, followed by layer normalization.\n", + " \"\"\"\n", + " def __init__(self, embedding_dim, num_heads, fully_connected_dim,\n", + " dropout_rate=0.1, layernorm_eps=1e-6):\n", + " \n", + " super(EncoderLayer, self).__init__()\n", + "\n", + " self.mha = tf.keras.layers.MultiHeadAttention(\n", + " num_heads=num_heads,\n", + " key_dim=embedding_dim,\n", + " dropout=dropout_rate\n", + " )\n", + "\n", + " self.ffn = FullyConnected(\n", + " embedding_dim=embedding_dim,\n", + " fully_connected_dim=fully_connected_dim\n", + " )\n", + "\n", + " self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)\n", + " self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)\n", + "\n", + " self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate)\n", + " \n", + " def call(self, x, training, mask):\n", + " \"\"\"\n", + " Forward pass for the Encoder Layer\n", + " \n", + " Arguments:\n", + " x (tf.Tensor): Tensor of shape (batch_size, input_seq_len, fully_connected_dim)\n", + " training (bool): Boolean, set to true to activate\n", + " the training mode for dropout layers\n", + " mask (tf.Tensor): Boolean mask to ensure that the padding 
is not \n", + " treated as part of the input\n", + " Returns:\n", + " encoder_layer_out (tf.Tensor): Tensor of shape (batch_size, input_seq_len, embedding_dim)\n", + " \"\"\"\n", + " # calculate self-attention using mha(~1 line).\n", + " # Dropout is added by Keras automatically if the dropout parameter is non-zero during training\n", + " self_mha_output = self.mha(x, x, x, mask) # Self attention (batch_size, input_seq_len, fully_connected_dim)\n", + " \n", + " # skip connection\n", + " # apply layer normalization on sum of the input and the attention output to get the \n", + " # output of the multi-head attention layer\n", + " skip_x_attention = self.layernorm1(x + self_mha_output) # (batch_size, input_seq_len, fully_connected_dim)\n", + "\n", + " # pass the output of the multi-head attention layer through a ffn\n", + " ffn_output = self.ffn(skip_x_attention) # (batch_size, input_seq_len, fully_connected_dim)\n", + " \n", + " # apply dropout layer to ffn output during training\n", + " # use `training=training`\n", + " ffn_output = self.dropout_ffn(ffn_output, training=training)\n", + " \n", + " # apply layer normalization on sum of the output from multi-head attention (skip connection) and ffn output\n", + " # to get the output of the encoder layer\n", + " encoder_layer_out = self.layernorm2(skip_x_attention + ffn_output) # (batch_size, input_seq_len, embedding_dim)\n", + " \n", + " return encoder_layer_out\n", + " " + ] + }, + { + "cell_type": "markdown", + "id": "2e36f13b", + "metadata": {}, + "source": [ + "\n", + "### 6.2 - Full Encoder\n", + "\n", + "Now you're ready to build the full Transformer Encoder (Figure 2b), where you will embed your input and add the positional encodings you calculated. You will then feed your encoded embeddings to a stack of Encoder layers. \n", + "\n", + "\"Encoder\"\n", + "
Figure 2b: Transformer Encoder
\n", + "\n", + "The Encoder class is implemented for you. It performs the following steps: \n", + "1. Pass the input through the Embedding layer.\n", + "2. Scale the embedding by multiplying it by the square root of the embedding dimension. \n", + "3. Add the position encoding: self.pos_encoding `[:, :seq_len, :]` to the embedding.\n", + "4. Pass the encoded embedding through a dropout layer\n", + "5. Pass the output of the dropout layer through the stack of encoding layers using a for loop." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d677d14e", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "class Encoder(tf.keras.layers.Layer):\n", + " \"\"\"\n", + " The entire Encoder starts by passing the input to an embedding layer \n", + " and using positional encoding to then pass the output through a stack of\n", + " encoder Layers\n", + " \n", + " \"\"\" \n", + " def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, input_vocab_size,\n", + " maximum_position_encoding, dropout_rate=0.1, layernorm_eps=1e-6):\n", + " super(Encoder, self).__init__()\n", + "\n", + " self.embedding_dim = embedding_dim\n", + " self.num_layers = num_layers\n", + "\n", + " self.embedding = tf.keras.layers.Embedding(input_vocab_size, self.embedding_dim)\n", + " self.pos_encoding = positional_encoding(maximum_position_encoding, \n", + " self.embedding_dim)\n", + "\n", + "\n", + " self.enc_layers = [EncoderLayer(embedding_dim=self.embedding_dim,\n", + " num_heads=num_heads,\n", + " fully_connected_dim=fully_connected_dim,\n", + " dropout_rate=dropout_rate,\n", + " layernorm_eps=layernorm_eps) \n", + " for _ in range(self.num_layers)]\n", + "\n", + " self.dropout = tf.keras.layers.Dropout(dropout_rate)\n", + " \n", + " def call(self, x, training, mask):\n", + " \"\"\"\n", + " Forward pass for the Encoder\n", + " \n", + " Arguments:\n", + " x (tf.Tensor): Tensor of shape 
(batch_size, seq_len)\n", + " training (bool): Boolean, set to true to activate\n", + " the training mode for dropout layers\n", + " mask (tf.Tensor): Boolean mask to ensure that the padding is not \n", + " treated as part of the input\n", + "\n", + " Returns:\n", + " x (tf.Tensor): Tensor of shape (batch_size, seq_len, embedding_dim)\n", + " \"\"\"\n", + " seq_len = tf.shape(x)[1]\n", + " \n", + " # Pass input through the Embedding layer\n", + " x = self.embedding(x) # (batch_size, input_seq_len, embedding_dim)\n", + " # Scale embedding by multiplying it by the square root of the embedding dimension\n", + " x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32))\n", + " # Add the position encoding to embedding\n", + " x += self.pos_encoding[:, :seq_len, :]\n", + " # Pass the encoded embedding through a dropout layer\n", + " # use `training=training`\n", + " x = self.dropout(x, training=training)\n", + " # Pass the output through the stack of encoding layers \n", + " for i in range(self.num_layers):\n", + " x = self.enc_layers[i](x, training, mask)\n", + "\n", + " return x # (batch_size, input_seq_len, embedding_dim)" + ] + }, + { + "cell_type": "markdown", + "id": "9c7356fd", + "metadata": {}, + "source": [ + "\n", + "## 7 - Decoder\n", + "\n", + "Now it is time to implement the decoder. You have seen it in the videos, and it may help to look back at the encoder implementation above. The Decoder layer takes the K and V matrices generated by the Encoder and computes the second multi-head attention layer with the Q matrix from the output (Figure 3a).\n", + "\n", + "\"Decoder\"\n", + "
Figure 3a: Transformer Decoder layer
\n", + "\n", + " \n", + "### 7.1 - Decoder Layer\n", + "Again, you'll pair multi-head attention with a feed forward neural network, but this time you'll implement two multi-head attention layers. You will also use residual connections and layer normalization to help speed up training (Figure 3a).\n", + "\n", + " \n", + "### Exercise 2 - DecoderLayer\n", + " \n", + "Implement `DecoderLayer()` using the `call()` method\n", + " \n", + "1. Block 1 is a multi-head attention layer with a residual connection, and look-ahead mask. Like in the `EncoderLayer`, Dropout is defined within the multi-head attention layer.\n", + "2. Block 2 will take into account the output of the Encoder, so the multi-head attention layer will receive K and V from the encoder, and Q from the Block 1. You will then apply a normalization layer and a residual connection, just like you did before with the `EncoderLayer`.\n", + "3. Finally, Block 3 is a feed forward neural network with dropout and normalization layers and a residual connection.\n", + " \n", + "**Additional Hints:**\n", + "* The first two blocks are fairly similar to the EncoderLayer except you will return `attention_scores` when computing self-attention" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "d8d3a38d", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: DecoderLayer\n", + "class DecoderLayer(tf.keras.layers.Layer):\n", + " \"\"\"\n", + " The decoder layer is composed by two multi-head attention blocks, \n", + " one that takes the new input and uses self-attention, and the other \n", + " one that combines it with the output of the encoder, followed by a\n", + " fully connected block. 
\n", + " \"\"\"\n", + " def __init__(self, embedding_dim, num_heads, fully_connected_dim, dropout_rate=0.1, layernorm_eps=1e-6):\n", + " super(DecoderLayer, self).__init__()\n", + "\n", + " self.mha1 = tf.keras.layers.MultiHeadAttention(\n", + " num_heads=num_heads,\n", + " key_dim=embedding_dim,\n", + " dropout=dropout_rate\n", + " )\n", + "\n", + " self.mha2 = tf.keras.layers.MultiHeadAttention(\n", + " num_heads=num_heads,\n", + " key_dim=embedding_dim,\n", + " dropout=dropout_rate\n", + " )\n", + "\n", + " self.ffn = FullyConnected(\n", + " embedding_dim=embedding_dim,\n", + " fully_connected_dim=fully_connected_dim\n", + " )\n", + "\n", + " self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)\n", + " self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)\n", + " self.layernorm3 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)\n", + "\n", + " self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate)\n", + " \n", + " def call(self, x, enc_output, training, look_ahead_mask, padding_mask):\n", + " \"\"\"\n", + " Forward pass for the Decoder Layer\n", + " \n", + " Arguments:\n", + " x (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim)\n", + " enc_output (tf.Tensor): Tensor of shape(batch_size, input_seq_len, fully_connected_dim)\n", + " training (bool): Boolean, set to true to activate\n", + " the training mode for dropout layers\n", + " look_ahead_mask (tf.Tensor): Boolean mask for the target_input\n", + " padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer\n", + " Returns:\n", + " out3 (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim)\n", + " attn_weights_block1 (tf.Tensor): Tensor of shape (batch_size, num_heads, target_seq_len, target_seq_len)\n", + " attn_weights_block2 (tf.Tensor): Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len)\n", + " \"\"\"\n", + " \n", + " ### START CODE HERE ###\n", + " # 
enc_output.shape == (batch_size, input_seq_len, fully_connected_dim)\n", + " \n", + " # BLOCK 1\n", + " # calculate self-attention and return attention scores as attn_weights_block1.\n", + " # Dropout will be applied during training (~1 line).\n", + " mult_attn_out1, attn_weights_block1 = None\n", + " \n", + " # apply layer normalization (layernorm1) to the sum of the attention output and the input (~1 line)\n", + " Q1 = None\n", + "\n", + " # BLOCK 2\n", + " # calculate attention using the Q from the first block and K and V from the encoder output. \n", + " # Dropout will be applied during training\n", + " # Return attention scores as attn_weights_block2 (~1 line) \n", + " mult_attn_out2, attn_weights_block2 = None\n", + " \n", + " # apply layer normalization (layernorm2) to the sum of the attention output and the Q from the first block (~1 line)\n", + " mult_attn_out2 = None\n", + " \n", + " # BLOCK 3\n", + " # pass the output of the second block through a ffn\n", + " ffn_output = None\n", + " \n", + " # apply a dropout layer to the ffn output\n", + " # use `training=training`\n", + " ffn_output = None\n", + " \n", + " # apply layer normalization (layernorm3) to the sum of the ffn output and the output of the second block\n", + " out3 = None\n", + " ### END CODE HERE ###\n", + "\n", + " return out3, attn_weights_block1, attn_weights_block2\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "41686c8b", + "metadata": { + "deletable": false, + "editable": false + }, + "outputs": [], + "source": [ + "# Test your function!\n", + "key_dim = 12\n", + "n_heads = 16\n", + "\n", + "decoderLayer_test = DecoderLayer(embedding_dim=key_dim, num_heads=n_heads, fully_connected_dim=32)\n", + "\n", + "q = np.ones((1, 15, key_dim))\n", + "encoder_test_output = tf.convert_to_tensor(np.random.rand(1, 7, 8))\n", + "look_ahead_mask = create_look_ahead_mask(q.shape[1])\n", + "\n", + "out, attn_w_b1, attn_w_b2 = decoderLayer_test(q, 
encoder_test_output, False, look_ahead_mask, None)\n", + "\n", + "print(f\"Using embedding_dim={key_dim} and num_heads={n_heads}:\\n\")\n", + "print(f\"q has shape:{q.shape}\")\n", + "print(f\"Output of encoder has shape:{encoder_test_output.shape}\\n\")\n", + "\n", + "print(f\"Output of decoder layer has shape:{out.shape}\")\n", + "print(f\"Att Weights Block 1 has shape:{attn_w_b1.shape}\")\n", + "print(f\"Att Weights Block 2 has shape:{attn_w_b2.shape}\")" + ] + }, + { + "cell_type": "markdown", + "id": "af9b85a3", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Output:\n", + "Using embedding_dim=12 and num_heads=16:\n", + "\n", + "q has shape:(1, 15, 12)\n", + "Output of encoder has shape:(1, 7, 8)\n", + "\n", + "Output of decoder layer has shape:(1, 15, 12)\n", + "Att Weights Block 1 has shape:(1, 16, 15, 15)\n", + "Att Weights Block 2 has shape:(1, 16, 15, 7)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "932f7320", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# UNIT TEST\n", + "w2_unittest.test_decoderlayer(DecoderLayer, create_look_ahead_mask)" + ] + }, + { + "cell_type": "markdown", + "id": "66b82ccf", + "metadata": {}, + "source": [ + " \n", + "### 7.2 - Full Decoder\n", + "You're almost there! Time to use your Decoder layer to build a full Transformer Decoder (Figure 3b). You will embed your output and add positional encodings. You will then feed your encoded embeddings to a stack of Decoder layers. \n", + "\n", + "\n", + "\"Decoder\"\n", + "
Figure 3b: Transformer Decoder
\n", + "\n", + " \n", + "### Exercise 3 - Decoder\n", + "\n", + "Implement `Decoder()` using the `call()` method to embed your output, add positional encoding, and implement multiple decoder layers.\n", + " \n", + "In this exercise, you will initialize your Decoder with an Embedding layer, positional encoding, and multiple DecoderLayers. Your `call()` method will perform the following steps: \n", + "1. Pass your generated output through the Embedding layer.\n", + "2. Scale your embedding by multiplying it by the square root of your embedding dimension. Remember to cast the embedding dimension to data type `tf.float32` before computing the square root.\n", + "3. Add the position encoding: self.pos_encoding `[:, :seq_len, :]` to your embedding.\n", + "4. Pass the encoded embedding through a dropout layer, remembering to use the `training` parameter to set the model training mode. \n", + "5. Pass the output of the dropout layer through the stack of Decoding layers using a for loop." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "57dde3be", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: Decoder\n", + "class Decoder(tf.keras.layers.Layer):\n", + " \"\"\"\n", + " The entire Encoder starts by passing the target input to an embedding layer \n", + " and using positional encoding to then pass the output through a stack of\n", + " decoder Layers\n", + " \n", + " \"\"\" \n", + " def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, target_vocab_size,\n", + " maximum_position_encoding, dropout_rate=0.1, layernorm_eps=1e-6):\n", + " super(Decoder, self).__init__()\n", + "\n", + " self.embedding_dim = embedding_dim\n", + " self.num_layers = num_layers\n", + "\n", + " self.embedding = tf.keras.layers.Embedding(target_vocab_size, self.embedding_dim)\n", + " self.pos_encoding = positional_encoding(maximum_position_encoding, self.embedding_dim)\n", + "\n", + " 
self.dec_layers = [DecoderLayer(embedding_dim=self.embedding_dim,\n", + " num_heads=num_heads,\n", + " fully_connected_dim=fully_connected_dim,\n", + " dropout_rate=dropout_rate,\n", + " layernorm_eps=layernorm_eps) \n", + " for _ in range(self.num_layers)]\n", + " self.dropout = tf.keras.layers.Dropout(dropout_rate)\n", + " \n", + " def call(self, x, enc_output, training, \n", + " look_ahead_mask, padding_mask):\n", + " \"\"\"\n", + " Forward pass for the Decoder\n", + " \n", + " Arguments:\n", + " x (tf.Tensor): Tensor of shape (batch_size, target_seq_len)\n", + " enc_output (tf.Tensor): Tensor of shape(batch_size, input_seq_len, fully_connected_dim)\n", + " training (bool): Boolean, set to true to activate\n", + " the training mode for dropout layers\n", + " look_ahead_mask (tf.Tensor): Boolean mask for the target_input\n", + " padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer\n", + " Returns:\n", + " x (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim)\n", + " attention_weights (dict[str: tf.Tensor]): Dictionary of tensors containing all the attention weights\n", + " each of shape Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len)\n", + " \"\"\"\n", + "\n", + " seq_len = tf.shape(x)[1]\n", + " attention_weights = {}\n", + " \n", + " ### START CODE HERE ###\n", + " # create word embeddings \n", + " x = None\n", + " \n", + " # scale embeddings by multiplying by the square root of their dimension\n", + " x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32))\n", + " \n", + " # add positional encodings to word embedding\n", + " x += None\n", + "\n", + " # apply a dropout layer to x\n", + " # use `training=training`\n", + " x = None\n", + "\n", + " # use a for loop to pass x through a stack of decoder layers and update attention_weights (~4 lines total)\n", + " for i in range(self.num_layers):\n", + " # pass x and the encoder output through a stack of decoder layers and save the 
attention weights\n", + " # of block 1 and 2 (~1 line)\n", + " x, block1, block2 = None\n", + "\n", + " #update attention_weights dictionary with the attention weights of block 1 and block 2\n", + " attention_weights['decoder_layer{}_block1_self_att'.format(i+1)] = None\n", + " attention_weights['decoder_layer{}_block2_decenc_att'.format(i+1)] = None\n", + " ### END CODE HERE ###\n", + " \n", + " # x.shape == (batch_size, target_seq_len, fully_connected_dim)\n", + " return x, attention_weights" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "04e877fb", + "metadata": { + "deletable": false, + "editable": false + }, + "outputs": [], + "source": [ + "# Test your function!\n", + "n_layers = 5\n", + "emb_d = 13\n", + "n_heads = 17\n", + "fully_connected_dim = 16\n", + "target_vocab_size = 300\n", + "maximum_position_encoding = 6\n", + "\n", + "x = np.array([[3, 2, 1, 1], [2, 1, 1, 0], [2, 1, 1, 0]])\n", + "\n", + "encoder_test_output = tf.convert_to_tensor(np.random.rand(3, 7, 9))\n", + "\n", + "look_ahead_mask = create_look_ahead_mask(x.shape[1])\n", + "\n", + "decoder_test = Decoder(n_layers, emb_d, n_heads, fully_connected_dim, target_vocab_size,maximum_position_encoding)\n", + " \n", + "outd, att_weights = decoder_test(x, encoder_test_output, False, look_ahead_mask, None)\n", + "\n", + "print(f\"Using num_layers={n_layers}, embedding_dim={emb_d} and num_heads={n_heads}:\\n\")\n", + "print(f\"x has shape:{x.shape}\")\n", + "print(f\"Output of encoder has shape:{encoder_test_output.shape}\\n\")\n", + "\n", + "print(f\"Output of decoder has shape:{outd.shape}\\n\")\n", + "print(\"Attention weights:\")\n", + "for name, tensor in att_weights.items():\n", + " print(f\"{name} has shape:{tensor.shape}\")" + ] + }, + { + "cell_type": "markdown", + "id": "9aa2ff15", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Using num_layers=5, embedding_dim=13 and num_heads=17:\n", + "\n", + "x has shape:(3, 4)\n", + "Output 
of encoder has shape:(3, 7, 9)\n", + "\n", + "Output of decoder has shape:(3, 4, 13)\n", + "\n", + "Attention weights:\n", + "decoder_layer1_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer1_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer2_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer2_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer3_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer3_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer4_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer4_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer5_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer5_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "e92745de", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# UNIT TEST\n", + "w2_unittest.test_decoder(Decoder, create_look_ahead_mask, create_padding_mask)" + ] + }, + { + "cell_type": "markdown", + "id": "848ba4b5", + "metadata": {}, + "source": [ + " \n", + "## 8 - Transformer\n", + "\n", + "Phew! This has been quite the assignment! Congratulations! You've done all the hard work, now it's time to put it all together. \n", + "\n", + "\"Transformer\"\n", + "
Figure 4: Transformer
\n", + " \n", + "The flow of data through the Transformer Architecture is as follows:\n", + "* First your input passes through an Encoder, which is just repeated Encoder layers that you implemented:\n", + " - embedding and positional encoding of your input\n", + " - multi-head attention on your input\n", + " - feed forward neural network to help detect features\n", + "* Then the predicted output passes through a Decoder, consisting of the decoder layers that you implemented:\n", + " - embedding and positional encoding of the output\n", + " - multi-head attention on your generated output\n", + " - multi-head attention with the Q from the first multi-head attention layer and the K and V from the Encoder\n", + " - a feed forward neural network to help detect features\n", + "* Finally, after the Nth Decoder layer, one dense layer and a softmax are applied to generate prediction for the next output in your sequence.\n", + "\n", + " \n", + "### Exercise 4 - Transformer\n", + "\n", + "Implement `Transformer()` using the `call()` method\n", + "1. Pass the input through the Encoder with the appropiate mask.\n", + "2. Pass the encoder output and the target through the Decoder with the appropiate mask.\n", + "3. Apply a linear transformation and a softmax to get a prediction." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "c9e6cb07", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: Transformer\n", + "class Transformer(tf.keras.Model):\n", + " \"\"\"\n", + " Complete transformer with an Encoder and a Decoder\n", + " \"\"\"\n", + " def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, input_vocab_size, \n", + " target_vocab_size, max_positional_encoding_input,\n", + " max_positional_encoding_target, dropout_rate=0.1, layernorm_eps=1e-6):\n", + " super(Transformer, self).__init__()\n", + "\n", + " self.encoder = Encoder(num_layers=num_layers,\n", + " embedding_dim=embedding_dim,\n", + " num_heads=num_heads,\n", + " fully_connected_dim=fully_connected_dim,\n", + " input_vocab_size=input_vocab_size,\n", + " maximum_position_encoding=max_positional_encoding_input,\n", + " dropout_rate=dropout_rate,\n", + " layernorm_eps=layernorm_eps)\n", + "\n", + " self.decoder = Decoder(num_layers=num_layers, \n", + " embedding_dim=embedding_dim,\n", + " num_heads=num_heads,\n", + " fully_connected_dim=fully_connected_dim,\n", + " target_vocab_size=target_vocab_size, \n", + " maximum_position_encoding=max_positional_encoding_target,\n", + " dropout_rate=dropout_rate,\n", + " layernorm_eps=layernorm_eps)\n", + "\n", + " self.final_layer = tf.keras.layers.Dense(target_vocab_size, activation='softmax')\n", + " \n", + " def call(self, input_sentence, output_sentence, training, enc_padding_mask, look_ahead_mask, dec_padding_mask):\n", + " \"\"\"\n", + " Forward pass for the entire Transformer\n", + " Arguments:\n", + " input_sentence (tf.Tensor): Tensor of shape (batch_size, input_seq_len)\n", + " An array of the indexes of the words in the input sentence\n", + " output_sentence (tf.Tensor): Tensor of shape (batch_size, target_seq_len)\n", + " An array of the indexes of the words in the output sentence\n", + " training (bool): Boolean, set 
to true to activate\n", + " the training mode for dropout layers\n", + " enc_padding_mask (tf.Tensor): Boolean mask to ensure that the padding is not \n", + " treated as part of the input\n", + " look_ahead_mask (tf.Tensor): Boolean mask for the target_input\n", + " dec_padding_mask (tf.Tensor): Boolean mask for the second multi-head attention layer\n", + " Returns:\n", + " final_output (tf.Tensor): The final output of the model\n", + " attention_weights (dict[str: tf.Tensor]): Dictionary of tensors containing all the attention weights for the decoder,\n", + " each of shape (batch_size, num_heads, target_seq_len, input_seq_len)\n", + " \n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + " # call self.encoder with the appropriate arguments to get the encoder output\n", + " enc_output = None\n", + " \n", + " # call self.decoder with the appropriate arguments to get the decoder output\n", + " # dec_output.shape == (batch_size, tar_seq_len, fully_connected_dim)\n", + " dec_output, attention_weights = None\n", + " \n", + " # pass decoder output through a linear layer and softmax (~1 line)\n", + " final_output = None\n", + " ### END CODE HERE ###\n", + "\n", + " return final_output, attention_weights" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3cd93c99", + "metadata": { + "deletable": false, + "editable": false + }, + "outputs": [], + "source": [ + "# Test your function!\n", + "n_layers = 3\n", + "emb_d = 13\n", + "n_heads = 17\n", + "fully_connected_dim = 8\n", + "input_vocab_size = 300\n", + "target_vocab_size = 350\n", + "max_positional_encoding_input = 12\n", + "max_positional_encoding_target = 12\n", + "\n", + "transformer = Transformer(n_layers, \n", + " emb_d, \n", + " n_heads, \n", + " fully_connected_dim, \n", + " input_vocab_size, \n", + " target_vocab_size, \n", + " max_positional_encoding_input,\n", + " max_positional_encoding_target)\n", + "\n", + "# 0 is the padding value\n", + "sentence_a = np.array([[2, 3, 1,
3, 0, 0, 0]])\n", + "sentence_b = np.array([[1, 3, 4, 0, 0, 0, 0]])\n", + "\n", + "enc_padding_mask = create_padding_mask(sentence_a)\n", + "dec_padding_mask = create_padding_mask(sentence_a)\n", + "\n", + "look_ahead_mask = create_look_ahead_mask(sentence_a.shape[1])\n", + "\n", + "test_summary, att_weights = transformer(\n", + " sentence_a,\n", + " sentence_b,\n", + " False,\n", + " enc_padding_mask,\n", + " look_ahead_mask,\n", + " dec_padding_mask\n", + ")\n", + "\n", + "print(f\"Using num_layers={n_layers}, target_vocab_size={target_vocab_size} and num_heads={n_heads}:\\n\")\n", + "print(f\"sentence_a has shape:{sentence_a.shape}\")\n", + "print(f\"sentence_b has shape:{sentence_b.shape}\")\n", + "\n", + "print(f\"\\nOutput of transformer (summary) has shape:{test_summary.shape}\\n\")\n", + "print(\"Attention weights:\")\n", + "for name, tensor in att_weights.items():\n", + " print(f\"{name} has shape:{tensor.shape}\")" + ] + }, + { + "cell_type": "markdown", + "id": "95c9f812", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Using num_layers=3, target_vocab_size=350 and num_heads=17:\n", + "\n", + "sentence_a has shape:(1, 7)\n", + "sentence_b has shape:(1, 7)\n", + "\n", + "Output of transformer (summary) has shape:(1, 7, 350)\n", + "\n", + "Attention weights:\n", + "decoder_layer1_block1_self_att has shape:(1, 17, 7, 7)\n", + "decoder_layer1_block2_decenc_att has shape:(1, 17, 7, 7)\n", + "decoder_layer2_block1_self_att has shape:(1, 17, 7, 7)\n", + "decoder_layer2_block2_decenc_att has shape:(1, 17, 7, 7)\n", + "decoder_layer3_block1_self_att has shape:(1, 17, 7, 7)\n", + "decoder_layer3_block2_decenc_att has shape:(1, 17, 7, 7)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a2d035a5", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# UNIT TEST\n", + "w2_unittest.test_transformer(Transformer, create_look_ahead_mask, 
create_padding_mask)" + ] + }, + { + "cell_type": "markdown", + "id": "33e8a0c2", + "metadata": {}, + "source": [ + "\n", + "## 9 - Initialize the Model\n", + "Now that you have defined the model, you can initialize and train it. First you can initialize the model with the parameters below. Note that generally these models are much larger and you are using a smaller version to fit this environment and to be able to train it in just a few minutes.\n", + "\n", + "The base model described in the original Transformer paper used `num_layers=6`, `embedding_dim=512`, and `fully_connected_dim=2048`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "a5f79f64", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Define the model parameters\n", + "num_layers = 2\n", + "embedding_dim = 128\n", + "fully_connected_dim = 128\n", + "num_heads = 2\n", + "positional_encoding_length = 256\n", + "\n", + "# Initialize the model\n", + "transformer = Transformer(\n", + " num_layers, \n", + " embedding_dim, \n", + " num_heads, \n", + " fully_connected_dim,\n", + " vocab_size, \n", + " vocab_size, \n", + " positional_encoding_length, \n", + " positional_encoding_length,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "71473c27", + "metadata": {}, + "source": [ + "\n", + "## 10 - Prepare for Training the Model\n", + "\n", + "The original transformer paper uses Adam optimizer with custom learning rate scheduling, which we define in the cell below. This was empirically shown to produce faster convergence." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "eb402089", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "class CustomSchedule(tf.keras.optimizers.schedules.LearningRateSchedule):\n", + " def __init__(self, d_model, warmup_steps=4000):\n", + " super(CustomSchedule, self).__init__()\n", + " self.d_model = tf.cast(d_model, dtype=tf.float32)\n", + " self.warmup_steps = warmup_steps\n", + " \n", + " def __call__(self, step):\n", + " step = tf.cast(step, dtype=tf.float32)\n", + " arg1 = tf.math.rsqrt(step)\n", + " arg2 = step * (self.warmup_steps ** -1.5)\n", + "\n", + " return tf.math.rsqrt(self.d_model) * tf.math.minimum(arg1, arg2)\n", + "\n", + "learning_rate = CustomSchedule(embedding_dim)\n", + "\n", + "optimizer = tf.keras.optimizers.Adam(0.0002, beta_1=0.9, beta_2=0.98, epsilon=1e-9)" + ] + }, + { + "cell_type": "markdown", + "id": "ad854ab6", + "metadata": {}, + "source": [ + "Below you can plot, how the custom learning rate looks like." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "35a17a59", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "plt.plot(learning_rate(tf.range(40000, dtype=tf.float32)))\n", + "plt.ylabel('Learning Rate')\n", + "plt.xlabel('Train Step')" + ] + }, + { + "cell_type": "markdown", + "id": "4cfba386", + "metadata": {}, + "source": [ + "Next, you set up the loss. 
Since the target sequences are padded, it is important to apply a padding mask when calculating the loss.\n", + "\n", + "You will use the sparse categorical cross-entropy loss function (`tf.keras.losses.SparseCategoricalCrossentropy`) and set the parameter `from_logits` to `False`, since the Transformer does not output raw logits (its last layer has a softmax activation):" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "99fc8885", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "loss_object = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False, reduction='none')\n", + "\n", + "def masked_loss(real, pred):\n", + " mask = tf.math.logical_not(tf.math.equal(real, 0))\n", + " loss_ = loss_object(real, pred)\n", + "\n", + " mask = tf.cast(mask, dtype=loss_.dtype)\n", + " loss_ *= mask\n", + "\n", + " return tf.reduce_sum(loss_)/tf.reduce_sum(mask)\n", + "\n", + "\n", + "train_loss = tf.keras.metrics.Mean(name='train_loss')\n", + "\n", + "# Here you will store the losses, so you can later plot them\n", + "losses = []" + ] + }, + { + "cell_type": "markdown", + "id": "33db3f0b", + "metadata": {}, + "source": [ + "Now you can define your custom training function. If you are not very advanced with TensorFlow, you can think of this function as an alternative to using `model.compile()` and `model.fit()`, but with extra flexibility."
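To see what the mask in `masked_loss` does, here is a toy recreation in plain NumPy with made-up per-token loss values (in the real function these come from `loss_object`):

```python
import numpy as np

# Made-up per-token losses; token id 0 marks padding positions.
real = np.array([[5, 3, 0, 0]])                     # two real tokens, two pads
per_token_loss = np.array([[0.2, 0.4, 7.0, 9.0]])   # hypothetical loss values

mask = (real != 0).astype(per_token_loss.dtype)
masked = per_token_loss * mask           # zero out the padded positions
avg = masked.sum() / mask.sum()          # average over real tokens only
print(avg)  # ~0.3 -- the large "losses" on padding never contribute
```

Without the mask, the average would be dominated by the meaningless loss values computed on padding.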
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "79092091", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "@tf.function\n", + "def train_step(model, inp, tar):\n", + " \"\"\"\n", + " One training step for the transformer\n", + " Arguments:\n", + " inp (tf.Tensor): Input data to summarize\n", + " tar (tf.Tensor): Target (summary)\n", + " Returns:\n", + " None\n", + " \"\"\"\n", + " tar_inp = tar[:, :-1]\n", + " tar_real = tar[:, 1:]\n", + "\n", + " # Create masks\n", + " enc_padding_mask = create_padding_mask(inp)\n", + " look_ahead_mask = create_look_ahead_mask(tf.shape(tar_inp)[1])\n", + " dec_padding_mask = create_padding_mask(inp) # Notice that both encoder and decoder padding masks are equal\n", + "\n", + " with tf.GradientTape() as tape:\n", + " predictions, _ = model(\n", + " inp,\n", + " tar_inp, \n", + " True, \n", + " enc_padding_mask, \n", + " look_ahead_mask, \n", + " dec_padding_mask\n", + " )\n", + " loss = masked_loss(tar_real, predictions)\n", + "\n", + " gradients = tape.gradient(loss, transformer.trainable_variables) \n", + " optimizer.apply_gradients(zip(gradients, transformer.trainable_variables))\n", + "\n", + " train_loss(loss)" + ] + }, + { + "cell_type": "markdown", + "id": "1480d5fd", + "metadata": {}, + "source": [ + "Now you are ready for training the model. But before starting the training, you can also define one more set of functions to perform the inference. Because you are using a custom training loop, you can do whatever you want between the training steps. And wouldnt't it be fun to see after each epoch some examples of how the model performs?" + ] + }, + { + "cell_type": "markdown", + "id": "79e05c54", + "metadata": {}, + "source": [ + "\n", + "## 11 - Summarization\n", + "\n", + "The last thing you will implement is inference. With this, you will be able to produce actual summaries of the documents. 
You will use a simple method called greedy decoding, which means you will predict one word at a time and append it to the output. You will start with an `[SOS]` token and repeat the word-by-word inference until the model returns the `[EOS]` token or until you reach the maximum sentence length (you need this limit, because otherwise a poorly trained model could produce infinite sentences without ever emitting the `[EOS]` token).\n", + "\n", + " \n", + "### Exercise 5 - next_word\n", + "Write a helper function that predicts the next word, so you can use it to generate whole sentences. Hint: this is very similar to what happens in `train_step`, but you have to set the `training` flag of the model to `False`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "175fae70", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: next_word\n", + "def next_word(model, encoder_input, output):\n", + " \"\"\"\n", + " Helper function for summarization that uses the model to predict just the next word.\n", + " Arguments:\n", + " encoder_input (tf.Tensor): Input data to summarize\n", + " output (tf.Tensor): (incomplete) target (summary)\n", + " Returns:\n", + " predicted_id (tf.Tensor): The id of the predicted word\n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + " # Create a padding mask for the input (encoder)\n", + " enc_padding_mask = None\n", + " # Create a look-ahead mask for the output\n", + " look_ahead_mask = None\n", + " # Create a padding mask for the input (decoder)\n", + " dec_padding_mask = None\n", + "\n", + " # Run the prediction of the next word with the transformer model\n", + " predictions, attention_weights = None(\n", + " None,\n", + " None,\n", + " None,\n", + " None,\n", + " None,\n", + " None\n", + " )\n", + " ### END CODE HERE ###\n", + "\n", + " predictions = predictions[:, -1:, :]\n", + " predicted_id = tf.cast(tf.argmax(predictions, axis=-1), tf.int32)\n",
+ " \n", + " return predicted_id" + ] + }, + { + "cell_type": "markdown", + "id": "29af50d0", + "metadata": {}, + "source": [ + "Check if your function works." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "3e97ba77", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Take a random sentence as an input\n", + "input_document = tokenizer.texts_to_sequences([\"a random sentence\"])\n", + "input_document = tf.keras.preprocessing.sequence.pad_sequences(input_document, maxlen=encoder_maxlen, padding='post', truncating='post')\n", + "encoder_input = tf.expand_dims(input_document[0], 0)\n", + "\n", + "# Take the start of sentence token as the only token in the output to predict the next word\n", + "output = tf.expand_dims([tokenizer.word_index[\"[SOS]\"]], 0)\n", + "\n", + "# predict the next word with your function\n", + "predicted_token = next_word(transformer, encoder_input, output)\n", + "print(f\"Predicted token: {predicted_token}\")\n", + "\n", + "predicted_word = tokenizer.sequences_to_texts(predicted_token.numpy())[0]\n", + "print(f\"Predicted word: {predicted_word}\")" + ] + }, + { + "cell_type": "markdown", + "id": "7157031c", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Predicted token: [[14859]]\n", + "Predicted word: masses\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6bd98959", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [], + "source": [ + "# UNIT TEST\n", + "w2_unittest.test_next_word(next_word, transformer, encoder_input, output)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "6177dc6a", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def summarize(model, input_document):\n", + " \"\"\"\n", + " A function for summarization using the 
transformer model\n", + " Arguments:\n", + " input_document (tf.Tensor): Input data to summarize\n", + " Returns:\n", + " _ (str): The summary of the input_document\n", + " \"\"\" \n", + " input_document = tokenizer.texts_to_sequences([input_document])\n", + " input_document = tf.keras.preprocessing.sequence.pad_sequences(input_document, maxlen=encoder_maxlen, padding='post', truncating='post')\n", + " encoder_input = tf.expand_dims(input_document[0], 0)\n", + " \n", + " output = tf.expand_dims([tokenizer.word_index[\"[SOS]\"]], 0)\n", + " \n", + " for i in range(decoder_maxlen):\n", + " predicted_id = next_word(model, encoder_input, output)\n", + " output = tf.concat([output, predicted_id], axis=-1)\n", + " \n", + " if predicted_id == tokenizer.word_index[\"[EOS]\"]:\n", + " break\n", + "\n", + " return tokenizer.sequences_to_texts(output.numpy())[0] # since there is just one translated document" + ] + }, + { + "cell_type": "markdown", + "id": "d3b15117", + "metadata": {}, + "source": [ + "Now you can already summarize a sentence! But beware, since the model was not yet trained at all, it will just produce nonsense." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "bae4d5f1", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "training_set_example = 0\n", + "\n", + "# Check a summary of a document from the training set\n", + "print('Training set example:')\n", + "print(document[training_set_example])\n", + "print('\\nHuman written summary:')\n", + "print(summary[training_set_example])\n", + "print('\\nModel written summary:')\n", + "summarize(transformer, document[training_set_example])" + ] + }, + { + "cell_type": "markdown", + "id": "90d6f836", + "metadata": {}, + "source": [ + "\n", + "# 12 - Train the model\n", + "\n", + "Now you can finally train the model. Below is a loop that will train your model for 20 epochs. 
Note that it should take about 30 seconds per epoch (with the exception of the first few epochs, which can take a few minutes each).\n", + "\n", + "After each epoch, the loop also performs summarization on one of the sentences in the test set and prints it out, so you can see how your model is improving." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "ebe2bf5f", + "metadata": { + "deletable": false, + "editable": false, + "scrolled": true, + "tags": [] + }, + "outputs": [], + "source": [ + "# Take an example from the test set, to monitor it during training\n", + "test_example = 0\n", + "true_summary = summary_test[test_example]\n", + "true_document = document_test[test_example]\n", + "\n", + "# Define the number of epochs\n", + "epochs = 20\n", + "\n", + "# Training loop\n", + "for epoch in range(epochs):\n", + " \n", + " start = time.time()\n", + " train_loss.reset_states()\n", + " number_of_batches = len(list(enumerate(dataset)))\n", + "\n", + " for (batch, (inp, tar)) in enumerate(dataset):\n", + " print(f'Epoch {epoch+1}, Batch {batch+1}/{number_of_batches}', end='\\r')\n", + " train_step(transformer, inp, tar)\n", + " \n", + " print(f'Epoch {epoch+1}, Loss {train_loss.result():.4f}')\n", + " losses.append(train_loss.result())\n", + " \n", + " print(f'Time taken for one epoch: {time.time() - start} sec')\n", + " print('Example summarization on the test set:')\n", + " print(' True summarization:')\n", + " print(f' {true_summary}')\n", + " print(' Predicted summarization:')\n", + " print(f' {summarize(transformer, true_document)}\\n')" + ] + }, + { + "cell_type": "markdown", + "id": "35687ddc", + "metadata": {}, + "source": [ + "Plot the loss function."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "eb3d5335", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "plt.plot(losses)\n", + "plt.ylabel('Loss')\n", + "plt.xlabel('Epoch')" + ] + }, + { + "cell_type": "markdown", + "id": "b6a53f16", + "metadata": {}, + "source": [ + "\n", + "# 13 - Summarize some Sentences!\n", + "\n", + "Below you can see an example of summarization of a sentence from the training set and a sentence from the test set. See if you notice anything interesting about them!" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "2493b755", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "training_set_example = 0\n", + "\n", + "# Check a summary of a document from the training set\n", + "print('Training set example:')\n", + "print(document[training_set_example])\n", + "print('\\nHuman written summary:')\n", + "print(summary[training_set_example])\n", + "print('\\nModel written summary:')\n", + "print(summarize(transformer, document[training_set_example]))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "15baaa47", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "test_set_example = 3\n", + "\n", + "# Check a summary of a document from the test set\n", + "print('Test set example:')\n", + "print(document_test[test_set_example])\n", + "print('\\nHuman written summary:')\n", + "print(summary_test[test_set_example])\n", + "print('\\nModel written summary:')\n", + "print(summarize(transformer, document_test[test_set_example]))" + ] + }, + { + "cell_type": "markdown", + "id": "aebd7ef5", + "metadata": {}, + "source": [ + "If you critically examine the output of the model, you can notice a few things:\n", + " - In the training set the model output is (almost) identical 
to the real output (already after 20 epochs and even more so with more epochs). This might be because the training set is relatively small and the model is relatively big and has thus learned the sentences in the training set by heart (overfitting).\n", + " - While the performance on the training set looks amazing, it is not so good on the test set. The model overfits, but fails to generalize. Again an easy candidate to blame is the small training set and a comparatively large model, but there might be a variety of other factors.\n", + " - Look at the test set example 3 and its summarization. Would you summarize it the same way as it is written here? Sometimes the data may be ambiguous. And the training of **your model can only be as good as your data**.\n", + "\n", + "Here you only use a small dataset, to show that something can be learned in a reasonable amount of time in a relatively small environment. Generally, large transformers are trained on more than one task and on very large quantities of data to achieve superb performance. You will learn more about this in the rest of this course." + ] + }, + { + "cell_type": "markdown", + "id": "41014aac", + "metadata": {}, + "source": [ + "**Congratulations on finishing this week's assignment!** You did a lot of work and now you should have a better understanding of the Transformers and their building blocks (encoder and decoder) and how they can be used for text summarization. 
And remember: you don't need to change much to use the same model for a translator; just change the dataset and it should work!\n", + "\n", + "**Keep it up!**" + ] + } + ], + "metadata": { + "grader_version": "1", + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.1.-1" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/Text_Summarization/Summarization/tf/C4W2_Assignment.ipynb b/NLP with Attention Models/Text_Summarization/Summarization/tf/C4W2_Assignment.ipynb new file mode 100644 index 0000000000000000000000000000000000000000..d5519b0f25b82ff21e4a8fb079d2d32a665115e1 --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Summarization/tf/C4W2_Assignment.ipynb @@ -0,0 +1,2467 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "id": "05014ac7", + "metadata": { + "colab_type": "text", + "id": "7yuytuIllsv1" + }, + "source": [ + "\n", + "# Assignment 2: Transformer Summarizer\n", + "\n", + "Welcome to the second assignment of course 4. In this assignment you will explore summarization using the transformer model. **Unlike the lecture, you will be implementing an encoder-decoder model. However, don't worry; you will be guided through all the steps, and you will find numerous hints to assist you!**\n", + "\n", + "
By the end of this notebook you will have implemented the full transformer (both encoder and decoder), but you will only be graded on the implementation of the decoder, as the encoder is provided for you.\n" + ] + }, + { + "cell_type": "markdown", + "id": "d00e9709", + "metadata": { + "colab_type": "text", + "id": "4-3lxSnXRWPx" + }, + "source": [ + "## Table of Contents\n", + "\n", + "- [Introduction](#0)\n", + "- [1 - Importing the Dataset](#1)\n", + "- [2 - Preprocess the Data](#2)\n", + "- [3 - Positional Encoding](#3)\n", + "- [4 - Masking](#4)\n", + "- [5 - Self-attention](#5)\n", + " - [Exercise 1 - scaled_dot_product_attention](#ex-1)\n", + "- [6 - Encoder](#6)\n", + " - [6.1 - Encoder Layer](#6-1)\n", + " - [6.2 - Full Encoder](#6-2)\n", + "- [7 - Decoder](#7)\n", + " - [7.1 - Decoder Layer](#7-1)\n", + " - [Exercise 2 - DecoderLayer](#ex-2)\n", + " - [7.2 - Full Decoder](#7-2)\n", + " - [Exercise 3 - Decoder](#ex-3)\n", + "- [8 - Transformer](#8)\n", + " - [Exercise 4 - Transformer](#ex-4)\n", + "- [9 - Initialize the Model](#9)\n", + "- [10 - Prepare for Training the Model](#10)\n", + "- [11 - Summarization](#11)\n", + " - [Exercise 5 - next_word](#ex-5)\n", + "- [12 - Train the Model](#12)\n", + "- [13 - Summarize some sentences!](#13)\n"
By completing this assignment you will learn to: \n", + "\n", + "- Use built-in functions to preprocess your data\n", + "- Implement DotProductAttention\n", + "- Implement Causal Attention\n", + "- Understand how attention works\n", + "- Build the transformer model\n", + "- Evaluate your model\n", + "- Summarize an article\n", + "\n", + "As you can tell, this model is slightly different than the ones you have already implemented. This is heavily based on attention and does not rely on sequences, which allows for parallel computing. " + ] + }, + { + "cell_type": "code", + "execution_count": 1, + "id": "7b49d856", + "metadata": { + "colab": { + "base_uri": "https://localhost:8080/", + "height": 34 + }, + "colab_type": "code", + "deletable": false, + "editable": false, + "id": "CChWzW-rEHVb", + "outputId": "a0b3e98b-7fc6-492d-c8ad-3a263b54f670", + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "2025-06-12 13:07:48.525884: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.\n", + "To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.\n" + ] + } + ], + "source": [ + "import os\n", + "#os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'\n", + "\n", + "import numpy as np\n", + "import pandas as pd\n", + "import tensorflow as tf\n", + "import matplotlib.pyplot as plt\n", + "import time\n", + "import utils\n", + "\n", + "import textwrap\n", + "wrapper = textwrap.TextWrapper(width=70)\n", + "\n", + "tf.keras.utils.set_random_seed(10)" + ] + }, + { + "cell_type": "code", + "execution_count": 2, + "id": "cfe093e6", + "metadata": { + "deletable": false, + "editable": false + }, + "outputs": [], + "source": [ + "import w2_unittest" + ] + }, + { + "cell_type": "markdown", + "id": "d56fc570", + "metadata": { + "colab_type": "text", + "id": "kEL2rvaHRWP4" 
+ }, + "source": [ + "\n", + "## 1 - Import the Dataset\n", + "You have the dataset saved in a .json file, which you can easily open with pandas. The loading function has already been taken care of in `utils.py`." + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "074bcce3", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Dialogue:\n", + "Lucas: Hey! How was your day?\r\n", + "Demi: Hey there! \r\n", + "Demi: It was pretty fine, actually, thank you!\r\n", + "Demi: I just got promoted! :D\r\n", + "Lucas: Whoa! Great news!\r\n", + "Lucas: Congratulations!\r\n", + "Lucas: Such a success has to be celebrated.\r\n", + "Demi: I agree! :D\r\n", + "Demi: Tonight at Death & Co.?\r\n", + "Lucas: Sure!\r\n", + "Lucas: See you there at 10pm?\r\n", + "Demi: Yeah! See you there! :D\n", + "\n", + "Summary:\n", + "Demi got promoted. She will celebrate that with Lucas at Death & Co at 10 pm.\n" + ] + } + ], + "source": [ + "data_dir = \"data/corpus\"\n", + "\n", + "train_data, test_data = utils.get_train_test_data(data_dir)\n", + "\n", + "# Take one example from the dataset and print it\n", + "example_summary, example_dialogue = train_data.iloc[10]\n", + "print(f\"Dialogue:\\n{example_dialogue}\")\n", + "print(f\"\\nSummary:\\n{example_summary}\")" + ] + }, + { + "cell_type": "markdown", + "id": "04210324", + "metadata": {}, + "source": [ + "\n", + "## 2 - Preprocess the data\n", + "\n", + "First you will do some preprocessing of the data and split it into inputs and outputs. Here you also remove some of the characters that are specific to this dataset and add the `[EOS]` (end of sentence) token to the end, like it was discussed in the lecture videos. You will also add a `[SOS]` (start of sentence) token to the beginning of the sentences." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "9ba397a0", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "document, summary = utils.preprocess(train_data)\n", + "document_test, summary_test = utils.preprocess(test_data)" + ] + }, + { + "cell_type": "markdown", + "id": "0fe70280", + "metadata": {}, + "source": [ + "Now perform the standard preprocessing with the TensorFlow library. You will need to modify the filters, because you don't want the `[EOS]` tokens to be removed.\n", + "\n", + "Then create the vocabulary by combining the data in the documents and the summaries and using `.fit_on_texts()`:" + ] + }, + { + "cell_type": "code", + "execution_count": 5, + "id": "5dfab3c8", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Size of vocabulary: 34250\n" + ] + } + ], + "source": [ + "# The [ and ] are excluded from the default filters, because they mark the SOS and EOS tokens.\n", + "filters = '!\"#$%&()*+,-./:;<=>?@\\\\^_`{|}~\\t\\n'\n", + "oov_token = '[UNK]'\n", + "\n", + "tokenizer = tf.keras.preprocessing.text.Tokenizer(filters=filters, oov_token=oov_token, lower=False)\n", + "\n", + "documents_and_summary = pd.concat([document, summary], ignore_index=True)\n", + "\n", + "tokenizer.fit_on_texts(documents_and_summary)\n", + "\n", + "inputs = tokenizer.texts_to_sequences(document)\n", + "targets = tokenizer.texts_to_sequences(summary)\n", + "\n", + "vocab_size = len(tokenizer.word_index) + 1\n", + "\n", + "print(f'Size of vocabulary: {vocab_size}')" + ] + }, + { + "cell_type": "markdown", + "id": "7341b3f5", + "metadata": {}, + "source": [ + "Now you can pad the tokenized sequences for the training data.\n", + "\n", + "For the purpose of this notebook you need to limit the length of the sequences, as transformers are really big models and are not
meant to be trained in such small environments." + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "id": "c5846dd5", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Limit the size of the input and output data for being able to run it in this environment.\n", + "encoder_maxlen = 150\n", + "decoder_maxlen = 50\n", + "\n", + "# Pad the sequences.\n", + "inputs = tf.keras.preprocessing.sequence.pad_sequences(inputs, maxlen=encoder_maxlen, padding='post', truncating='post')\n", + "targets = tf.keras.preprocessing.sequence.pad_sequences(targets, maxlen=decoder_maxlen, padding='post', truncating='post')\n", + "\n", + "inputs = tf.cast(inputs, dtype=tf.int32)\n", + "targets = tf.cast(targets, dtype=tf.int32)\n", + "\n", + "# Create the final training dataset.\n", + "BUFFER_SIZE = 10000\n", + "BATCH_SIZE = 64\n", + "\n", + "dataset = tf.data.Dataset.from_tensor_slices((inputs, targets)).shuffle(BUFFER_SIZE).batch(BATCH_SIZE)" + ] + }, + { + "cell_type": "markdown", + "id": "58b25fb2", + "metadata": {}, + "source": [ + "\n", + "## 3 - Positional Encoding\n", + "\n", + "In sequence to sequence tasks, the relative order of your data is extremely important to its meaning. When you were training sequential neural networks such as RNNs, you fed your inputs into the network in order. Information about the order of your data was automatically fed into your model. However, when you train a Transformer network using multi-head attention, you feed your data into the model all at once. While this dramatically reduces training time, there is no information about the order of your data. This is where positional encoding is useful.\n", + "\n", + "You have learned how to implement the positional encoding in one of this week's labs. Here you will use the `positional_encoding` function to create positional encodings for your transformer.
The function is already implemented for you." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "id": "0e65672c", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def positional_encoding(positions, d_model):\n", + " \"\"\"\n", + " Precomputes a matrix with all the positional encodings \n", + " \n", + " Arguments:\n", + " positions (int): Maximum number of positions to be encoded \n", + " d_model (int): Encoding size \n", + " \n", + " Returns:\n", + " pos_encoding (tf.Tensor): A matrix of shape (1, position, d_model) with the positional encodings\n", + " \"\"\"\n", + " \n", + " position = np.arange(positions)[:, np.newaxis]\n", + " k = np.arange(d_model)[np.newaxis, :]\n", + " i = k // 2\n", + " \n", + " # initialize a matrix angle_rads of all the angles \n", + " angle_rates = 1 / np.power(10000, (2 * i) / np.float32(d_model))\n", + " angle_rads = position * angle_rates\n", + " \n", + " # apply sin to even indices in the array; 2i\n", + " angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2])\n", + " \n", + " # apply cos to odd indices in the array; 2i+1\n", + " angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2])\n", + " \n", + " pos_encoding = angle_rads[np.newaxis, ...]\n", + " \n", + " return tf.cast(pos_encoding, dtype=tf.float32)" + ] + }, + { + "cell_type": "markdown", + "id": "9e1f1063", + "metadata": {}, + "source": [ + "\n", + "## 4 - Masking\n", + "\n", + "There are two types of masks that are useful when building your Transformer network: the *padding mask* and the *look-ahead mask*. Both help the softmax computation give the appropriate weights to the words in your input sentence. \n", + "\n", + "You have already learned how to implement and use them in one of this week's labs. Here they are implemented for you." 
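Before looking at the implementations in the next cell, it can help to see the two mask shapes on a toy example. This is a NumPy sketch mirroring the semantics of the TensorFlow helpers (same values, simplified for inspection — not the graded code):

```python
import numpy as np

# NumPy illustration of the two masks described above.
def padding_mask(token_ids):
    # 1 where there is a real token, 0 where the id is padding (0).
    seq = (np.array(token_ids) != 0).astype(np.float32)
    return seq[:, np.newaxis, :]           # (n, 1, m), broadcastable over query positions

def look_ahead_mask(size):
    # Lower-triangular ones: position i may attend only to positions <= i.
    return np.tril(np.ones((1, size, size), dtype=np.float32))

print(padding_mask([[7, 6, 0, 0]])[0])     # [[1. 1. 0. 0.]]
print(look_ahead_mask(3)[0])
# [[1. 0. 0.]
#  [1. 1. 0.]
#  [1. 1. 1.]]
```

In the attention formula that follows, a 0 in either mask is turned into a large negative logit, so the softmax assigns those positions (near-)zero weight.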
+ ] + }, + { + "cell_type": "code", + "execution_count": 8, + "id": "cfc7471c", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def create_padding_mask(decoder_token_ids):\n", + " \"\"\"\n", + " Creates a matrix mask for the padding cells\n", + " \n", + " Arguments:\n", + " decoder_token_ids (matrix like): matrix of size (n, m)\n", + " \n", + " Returns:\n", + " mask (tf.Tensor): binary tensor of size (n, 1, m)\n", + " \"\"\" \n", + " seq = 1 - tf.cast(tf.math.equal(decoder_token_ids, 0), tf.float32)\n", + " \n", + " # add extra dimensions to add the padding to the attention logits. \n", + " # this will allow for broadcasting later when comparing sequences\n", + " return seq[:, tf.newaxis, :] \n", + "\n", + "\n", + "def create_look_ahead_mask(sequence_length):\n", + " \"\"\"\n", + " Returns a lower triangular matrix filled with ones\n", + " \n", + " Arguments:\n", + " sequence_length (int): matrix size\n", + " \n", + " Returns:\n", + " mask (tf.Tensor): binary tensor of size (sequence_length, sequence_length)\n", + " \"\"\"\n", + " mask = tf.linalg.band_part(tf.ones((1, sequence_length, sequence_length)), -1, 0)\n", + " return mask " + ] + }, + { + "cell_type": "markdown", + "id": "89110af6", + "metadata": {}, + "source": [ + "\n", + "## 5 - Self-Attention\n", + "\n", + "As the authors of the Transformers paper state, \"Attention is All You Need\". \n", + "\n", + "\"Encoder\"\n", + "
Figure 1: Self-Attention calculation visualization
\n", + " \n", + "The use of self-attention in place of recurrence allows for parallelization, which speeds up training. You will implement **scaled dot product attention** which takes in a query, key, value, and a mask as inputs to return rich, attention-based vector representations of the words in your sequence. This type of self-attention can be mathematically expressed as:\n", + "$$\n", + "\\text { Attention }(Q, K, V)=\\operatorname{softmax}\\left(\\frac{Q K^{T}}{\\sqrt{d_{k}}}+{M}\\right) V\\tag{4}\n", + "$$\n", + "\n", + "* $Q$ is the matrix of queries \n", + "* $K$ is the matrix of keys\n", + "* $V$ is the matrix of values\n", + "* $M$ is the optional mask you choose to apply \n", + "* ${d_k}$ is the dimension of the keys, which is used to scale everything down so the softmax doesn't explode\n", + "\n", + "\n", + "### Exercise 1 - scaled_dot_product_attention \n", + "\n", + "Implement the function `scaled_dot_product_attention()` to create attention-based representations.\n", + "\n", + "**Reminder**: The boolean mask parameter can be passed in as `None`, a padding mask, or a look-ahead mask. \n", + " \n", + "* Multiply (1. - mask) by -1e9 before adding it to the scaled attention logits. \n", + "\n", + "**Additional Hints**\n", + "* You may find [tf.matmul](https://www.tensorflow.org/api_docs/python/tf/linalg/matmul) useful for matrix multiplication (check how you can use the parameter transpose_b).\n", + "* You can use [tf.keras.activations.softmax](https://www.tensorflow.org/api_docs/python/tf/keras/activations/softmax) for softmax."
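If you want a framework-free reference to sanity-check your implementation against on tiny inputs, here is a 2-D NumPy sketch of the same formula (the helper name `attention_np` is ours, not part of the assignment, and it handles only single, unbatched matrices):

```python
import numpy as np

# 2-D NumPy cross-check of softmax(QK^T / sqrt(dk) + M)V, where M adds -1e9
# wherever the boolean mask is 0 so those positions get ~zero weight.
def attention_np(q, k, v, mask=None):
    dk = k.shape[-1]
    logits = q @ k.T / np.sqrt(dk)                      # (seq_len_q, seq_len_k)
    if mask is not None:
        logits += (1.0 - mask) * -1e9                   # mask out forbidden positions
    # numerically stable softmax over the key axis
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

q = np.eye(2)
k = np.eye(2)
v = np.array([[10.0, 0.0], [0.0, 10.0]])
mask = np.array([[1.0, 0.0], [1.0, 1.0]])               # query 0 may only see key 0
out, w = attention_np(q, k, v, mask)
print(np.round(w, 2))                                   # row 0 attends only to key 0
```

Each row of `w` sums to 1, and masked positions receive (effectively) zero weight — the same behavior your TensorFlow version should show.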
+ ] + }, + { + "cell_type": "code", + "execution_count": 9, + "id": "3f434073", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: scaled_dot_product_attention\n", + "def scaled_dot_product_attention(q, k, v, mask):\n", + " \"\"\"\n", + " Calculate the attention weights.\n", + " q, k, v must have matching leading dimensions.\n", + " k, v must have matching penultimate dimension, i.e.: seq_len_k = seq_len_v.\n", + " The mask has different shapes depending on its type (padding or look-ahead) \n", + " but it must be broadcastable for addition.\n", + "\n", + " Arguments:\n", + " q (tf.Tensor): query of shape (..., seq_len_q, depth)\n", + " k (tf.Tensor): key of shape (..., seq_len_k, depth)\n", + " v (tf.Tensor): value of shape (..., seq_len_v, depth_v)\n", + " mask (tf.Tensor): mask with shape broadcastable \n", + " to (..., seq_len_q, seq_len_k). Defaults to None.\n", + "\n", + " Returns:\n", + " output (tf.Tensor): attention output of shape (..., seq_len_q, depth_v)\n", + " attention_weights (tf.Tensor): attention weights of shape (..., seq_len_q, seq_len_k)\n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + " \n", + " # Multiply q and k transposed.\n", + " matmul_qk = tf.matmul(q, k, transpose_b=True)\n", + "\n", + " # scale matmul_qk with the square root of dk\n", + " dk = tf.cast(tf.shape(k)[-1], tf.float32)\n", + " scaled_attention_logits = matmul_qk / tf.math.sqrt(dk)\n", + "\n", + " # add the mask to the scaled tensor.\n", + " if mask is not None: # Don't replace this None\n", + " scaled_attention_logits += (1. - mask) * (-1e9)\n", + "\n", + " # softmax is normalized on the last axis (seq_len_k) so that the scores add up to 1.\n", + " attention_weights = tf.keras.activations.softmax(scaled_attention_logits, axis=-1)\n", + "\n", + " # Multiply the attention weights by v\n", + " output = tf.matmul(attention_weights, v)\n", + " \n", + " ### END CODE HERE ###\n", + "\n", + " return output, attention_weights" + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "id": "5ed8f5af", + "metadata": { + "deletable": false, + "editable": false + }, + "outputs": [
+ { + "name": "stdout", + "output_type": "stream", + "text": [ + "Output:\n", + " [[[1. 0.62]\n", + " [0.62 0.62]\n", + " [0.74 0.31]]]\n", + "\n", + "Attention weights:\n", + " [[[0. 0.38 0. 0.23 0.38]\n", + " [0.38 0. 0. 0.23 0.38]\n", + " [0.26 0.43 0. 0.16 0.16]]]\n" + ] + } + ], + "source": [ + "# Test your function!\n", + "q = np.array([[1, 1, 0, 1], [0, 1, 1, 1], [1, 0, 1, 1]]).astype(np.float32)\n", + "k = np.array([[1, 1, 0, 1], [1, 0, 1, 1 ], [1, 1, 1, 0], [0, 0, 0, 1], [0, 1, 0, 1]]).astype(np.float32)\n", + "v = np.array([[0, 0], [1, 0], [1, 0], [1, 1], [1, 1]]).astype(np.float32)\n", + "mask = np.array([[[0, 1, 0, 1, 1], [1, 0, 0, 1, 1], [1, 1, 0, 1, 1]]])\n", + "\n", + "ou, atw = scaled_dot_product_attention(q, k, v, mask)\n", + "ou = np.around(ou, decimals=2)\n", + "atw = np.around(atw, decimals=2)\n", + "\n", + "print(f\"Output:\\n {ou}\")\n", + "print(f\"\\nAttention weights:\\n {atw}\")" + ] + }, + { + "cell_type": "markdown", + "id": "7b970a6e", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Output:\n", + " [[[1. 0.62]\n", + " [0.62 0.62]\n", + " [0.74 0.31]]]\n", + "\n", + "Attention weights:\n", + " [[[0. 0.38 0. 0.23 0.38]\n", + " [0.38 0. 0. 0.23 0.38]\n", + " [0.26 0.43 0. 0.16 0.16]]]\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "id": "4755bb0b", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + "source": [ + "# UNIT TEST\n", + "w2_unittest.test_scaled_dot_product_attention(scaled_dot_product_attention)" + ] + }, + { + "cell_type": "markdown", + "id": "8dcbd521", + "metadata": {}, + "source": [ + "Excellent work! You can now implement self-attention. With that, you can start building the encoder block! " + ] + }, + { + "cell_type": "markdown", + "id": "00b9c92a", + "metadata": {}, + "source": [ + "\n", + "## 6 - Encoder\n", + "\n", + "The Transformer Encoder layer pairs self-attention with position-wise feed-forward processing to improve the speed of training, and passes K and V matrices to the Decoder, which you'll build later in the assignment. In this section of the assignment, you will implement the Encoder by pairing multi-head attention and a feed forward neural network (Figure 2a). \n", + "\"Encoder\"\n", + "
Figure 2a: Transformer encoder layer
\n", + "\n", + "* You can think of `MultiHeadAttention` as computing the self-attention several times in parallel to detect different features. \n", + "* The feed forward neural network contains two Dense layers, which we'll implement as the function `FullyConnected`.\n", + "\n", + "Your input sentence first passes through a *multi-head attention layer*, where the encoder looks at other words in the input sentence as it encodes a specific word. The outputs of the multi-head attention layer are then fed to a *feed forward neural network*. The exact same feed forward network is independently applied to each position.\n", + " \n", + "* For the `MultiHeadAttention` layer, you will use the [MultiHeadAttention](https://www.tensorflow.org/api_docs/python/tf/keras/layers/MultiHeadAttention) implemented in Keras. If you're curious about how to split the query matrix Q, key matrix K, and value matrix V into different heads, you can look through the implementation. \n", + "* You will also use the [Sequential API](https://www.tensorflow.org/api_docs/python/tf/keras/Sequential) with two dense layers to build the feed forward neural network layers." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "id": "c3fd59d0", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def FullyConnected(embedding_dim, fully_connected_dim):\n", + " \"\"\"\n", + " Returns a sequential model consisting of two dense layers. The first dense layer has\n", + " fully_connected_dim neurons and is activated by relu.
The second dense layer has\n", + " embedding_dim neurons and no activation.\n", + "\n", + " Arguments:\n", + " embedding_dim (int): output dimension\n", + " fully_connected_dim (int): dimension of the hidden layer\n", + "\n", + " Returns:\n", + " _ (tf.keras.Model): sequential model\n", + " \"\"\"\n", + " return tf.keras.Sequential([\n", + " tf.keras.layers.Dense(fully_connected_dim, activation='relu'), # (batch_size, seq_len, d_model)\n", + " tf.keras.layers.Dense(embedding_dim) # (batch_size, seq_len, d_model)\n", + " ])" + ] + }, + { + "cell_type": "markdown", + "id": "99d7003a", + "metadata": {}, + "source": [ + "\n", + "### 6.1 Encoder Layer\n", + "\n", + "Now you can pair multi-head attention and feed forward neural network together in an encoder layer! You will also use residual connections and layer normalization to help speed up training (Figure 2a).\n", + "\n", + "The encoder block (Figure 2a) is already implemented for you. Take a very close look at its implementation, as you will later have to create the decoder yourself, and a lot of the code is very similar. The encoder block performs the following steps: \n", + "1. It passes the Q, V, K matrices and a boolean mask to a multi-head attention layer. Remember that to compute *self*-attention Q, V and K are the same. You will also perform Dropout in this multi-head attention layer during training. \n", + "2. There is a skip connection to add your original input `x` and the output of the multi-head attention layer. \n", + "3. After adding the skip connection, the output passes through the first normalization layer.\n", + "4. Finally, steps 1-3 are repeated but with the feed forward neural network with a dropout layer instead of the multi-head attention layer. \n", + "\n", + "
\n", + " Additional Information (Click to expand)\n", + " \n", + "* The `__init__` method creates all the layers that will be accessed by the `call` method. Wherever you want to use a layer defined inside the `__init__` method you will have to use the syntax `self.[insert layer name]`. \n", + "* You will find the documentation of [MultiHeadAttention](https://www.tensorflow.org/api_docs/python/tf/keras/layers/MultiHeadAttention) helpful. *Note that if query, key and value are the same, then this function performs self-attention.*\n", + "* The call arguments for `self.mha` are (where B is the batch size, T is the target sequence length, and S is the source sequence length):\n", + " - `query`: Query Tensor of shape (B, T, dim).\n", + " - `value`: Value Tensor of shape (B, S, dim).\n", + " - `key`: Optional key Tensor of shape (B, S, dim). If not given, will use the same value for both key and value, which is the most common case.\n", + " - `attention_mask`: a boolean mask of shape (B, T, S), that prevents attention to certain positions. The boolean mask specifies which query elements can attend to which key elements, 1 indicates attention and 0 indicates no attention. Broadcasting can happen for the missing batch dimensions and the head dimension.\n", + " - `return_attention_scores`: A boolean to indicate whether the output should be (attention_output, attention_scores) if True, or attention_output if False. Defaults to False.\n", + " - `training`: Python boolean indicating whether the layer should behave in training mode (adding dropout) or in inference mode (no dropout). Defaults to either using the training mode of the parent layer/model, or False (inference) if there is no parent layer.
Take a look at [tf.keras.layers.Dropout](https://www.tensorflow.org/versions/r2.4/api_docs/python/tf/keras/layers/Dropout) for more details (Additional reading in [Keras FAQ](https://keras.io/getting_started/faq/#whats-the-difference-between-the-training-argument-in-call-and-the-trainable-attribute))" + ] + }, + { + "cell_type": "code", + "execution_count": 13, + "id": "51c1452b", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "class EncoderLayer(tf.keras.layers.Layer):\n", + " \"\"\"\n", + " The encoder layer is composed by a multi-head self-attention mechanism,\n", + " followed by a simple, positionwise fully connected feed-forward network. \n", + " This architecture includes a residual connection around each of the two \n", + " sub-layers, followed by layer normalization.\n", + " \"\"\"\n", + " def __init__(self, embedding_dim, num_heads, fully_connected_dim,\n", + " dropout_rate=0.1, layernorm_eps=1e-6):\n", + " \n", + " super(EncoderLayer, self).__init__()\n", + "\n", + " self.mha = tf.keras.layers.MultiHeadAttention(\n", + " num_heads=num_heads,\n", + " key_dim=embedding_dim,\n", + " dropout=dropout_rate\n", + " )\n", + "\n", + " self.ffn = FullyConnected(\n", + " embedding_dim=embedding_dim,\n", + " fully_connected_dim=fully_connected_dim\n", + " )\n", + "\n", + " self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)\n", + " self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)\n", + "\n", + " self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate)\n", + " \n", + " def call(self, x, training, mask):\n", + " \"\"\"\n", + " Forward pass for the Encoder Layer\n", + " \n", + " Arguments:\n", + " x (tf.Tensor): Tensor of shape (batch_size, input_seq_len, fully_connected_dim)\n", + " training (bool): Boolean, set to true to activate\n", + " the training mode for dropout layers\n", + " mask (tf.Tensor): Boolean mask to ensure that the padding is 
not \n", + " treated as part of the input\n", + " Returns:\n", + " encoder_layer_out (tf.Tensor): Tensor of shape (batch_size, input_seq_len, embedding_dim)\n", + " \"\"\"\n", + " # calculate self-attention using mha(~1 line).\n", + " # Dropout is added by Keras automatically if the dropout parameter is non-zero during training\n", + " self_mha_output = self.mha(x, x, x, mask) # Self attention (batch_size, input_seq_len, fully_connected_dim)\n", + " \n", + " # skip connection\n", + " # apply layer normalization on sum of the input and the attention output to get the \n", + " # output of the multi-head attention layer\n", + " skip_x_attention = self.layernorm1(x + self_mha_output) # (batch_size, input_seq_len, fully_connected_dim)\n", + "\n", + " # pass the output of the multi-head attention layer through a ffn\n", + " ffn_output = self.ffn(skip_x_attention) # (batch_size, input_seq_len, fully_connected_dim)\n", + " \n", + " # apply dropout layer to ffn output during training\n", + " # use `training=training`\n", + " ffn_output = self.dropout_ffn(ffn_output, training=training)\n", + " \n", + " # apply layer normalization on sum of the output from multi-head attention (skip connection) and ffn output\n", + " # to get the output of the encoder layer\n", + " encoder_layer_out = self.layernorm2(skip_x_attention + ffn_output) # (batch_size, input_seq_len, embedding_dim)\n", + " \n", + " return encoder_layer_out\n", + " " + ] + }, + { + "cell_type": "markdown", + "id": "2e36f13b", + "metadata": {}, + "source": [ + "\n", + "### 6.2 - Full Encoder\n", + "\n", + "Now you're ready to build the full Transformer Encoder (Figure 2b), where you will embed your input and add the positional encodings you calculated. You will then feed your encoded embeddings to a stack of Encoder layers. \n", + "\n", + "\"Encoder\"\n", + "
Figure 2b: Transformer Encoder
\n", + "\n", + "The Encoder class is implemented for you. It performs the following steps: \n", + "1. Pass the input through the Embedding layer.\n", + "2. Scale the embedding by multiplying it by the square root of the embedding dimension. \n", + "3. Add the position encoding: self.pos_encoding `[:, :seq_len, :]` to the embedding.\n", + "4. Pass the encoded embedding through a dropout layer\n", + "5. Pass the output of the dropout layer through the stack of encoding layers using a for loop." + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "id": "d677d14e", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "class Encoder(tf.keras.layers.Layer):\n", + " \"\"\"\n", + " The entire Encoder starts by passing the input to an embedding layer \n", + " and using positional encoding to then pass the output through a stack of\n", + " encoder Layers\n", + " \n", + " \"\"\" \n", + " def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, input_vocab_size,\n", + " maximum_position_encoding, dropout_rate=0.1, layernorm_eps=1e-6):\n", + " super(Encoder, self).__init__()\n", + "\n", + " self.embedding_dim = embedding_dim\n", + " self.num_layers = num_layers\n", + "\n", + " self.embedding = tf.keras.layers.Embedding(input_vocab_size, self.embedding_dim)\n", + " self.pos_encoding = positional_encoding(maximum_position_encoding, \n", + " self.embedding_dim)\n", + "\n", + "\n", + " self.enc_layers = [EncoderLayer(embedding_dim=self.embedding_dim,\n", + " num_heads=num_heads,\n", + " fully_connected_dim=fully_connected_dim,\n", + " dropout_rate=dropout_rate,\n", + " layernorm_eps=layernorm_eps) \n", + " for _ in range(self.num_layers)]\n", + "\n", + " self.dropout = tf.keras.layers.Dropout(dropout_rate)\n", + " \n", + " def call(self, x, training, mask):\n", + " \"\"\"\n", + " Forward pass for the Encoder\n", + " \n", + " Arguments:\n", + " x (tf.Tensor): Tensor of shape 
(batch_size, seq_len)\n", + " training (bool): Boolean, set to true to activate\n", + " the training mode for dropout layers\n", + " mask (tf.Tensor): Boolean mask to ensure that the padding is not \n", + " treated as part of the input\n", + "\n", + " Returns:\n", + " x (tf.Tensor): Tensor of shape (batch_size, seq_len, embedding_dim)\n", + " \"\"\"\n", + " seq_len = tf.shape(x)[1]\n", + " \n", + " # Pass input through the Embedding layer\n", + " x = self.embedding(x) # (batch_size, input_seq_len, embedding_dim)\n", + " # Scale embedding by multiplying it by the square root of the embedding dimension\n", + " x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32))\n", + " # Add the position encoding to embedding\n", + " x += self.pos_encoding[:, :seq_len, :]\n", + " # Pass the encoded embedding through a dropout layer\n", + " # use `training=training`\n", + " x = self.dropout(x, training=training)\n", + " # Pass the output through the stack of encoding layers \n", + " for i in range(self.num_layers):\n", + " x = self.enc_layers[i](x, training, mask)\n", + "\n", + " return x # (batch_size, input_seq_len, embedding_dim)" + ] + }, + { + "cell_type": "markdown", + "id": "9c7356fd", + "metadata": {}, + "source": [ + "\n", + "## 7 - Decoder\n", + "\n", + "Now it is time to implement the decoder. You have seen it in the videos, and you can refer to the encoder implementation above for help. The Decoder layer takes the K and V matrices generated by the Encoder and computes its second multi-head attention layer with the Q matrix coming from the decoder's own first attention block (Figure 3a).\n", + "\n", + "\"Decoder\"\n", + "
Figure 3a: Transformer Decoder layer
\n", + "\n", + " \n", + "### 7.1 - Decoder Layer\n", + "Again, you'll pair multi-head attention with a feed forward neural network, but this time you'll implement two multi-head attention layers. You will also use residual connections and layer normalization to help speed up training (Figure 3a).\n", + "\n", + " \n", + "### Exercise 2 - DecoderLayer\n", + " \n", + "Implement `DecoderLayer()` using the `call()` method\n", + " \n", + "1. Block 1 is a multi-head attention layer with a residual connection, and look-ahead mask. Like in the `EncoderLayer`, Dropout is defined within the multi-head attention layer.\n", + "2. Block 2 will take into account the output of the Encoder, so the multi-head attention layer will receive K and V from the encoder, and Q from the Block 1. You will then apply a normalization layer and a residual connection, just like you did before with the `EncoderLayer`.\n", + "3. Finally, Block 3 is a feed forward neural network with dropout and normalization layers and a residual connection.\n", + " \n", + "**Additional Hints:**\n", + "* The first two blocks are fairly similar to the EncoderLayer except you will return `attention_scores` when computing self-attention" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "id": "d8d3a38d", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: DecoderLayer\n", + "class DecoderLayer(tf.keras.layers.Layer):\n", + " \"\"\"\n", + " The decoder layer is composed by two multi-head attention blocks, \n", + " one that takes the new input and uses self-attention, and the other \n", + " one that combines it with the output of the encoder, followed by a\n", + " fully connected block. 
\n", + " \"\"\"\n", + " def __init__(self, embedding_dim, num_heads, fully_connected_dim, dropout_rate=0.1, layernorm_eps=1e-6):\n", + " super(DecoderLayer, self).__init__()\n", + "\n", + " self.mha1 = tf.keras.layers.MultiHeadAttention(\n", + " num_heads=num_heads,\n", + " key_dim=embedding_dim,\n", + " dropout=dropout_rate\n", + " )\n", + "\n", + " self.mha2 = tf.keras.layers.MultiHeadAttention(\n", + " num_heads=num_heads,\n", + " key_dim=embedding_dim,\n", + " dropout=dropout_rate\n", + " )\n", + "\n", + " self.ffn = FullyConnected(\n", + " embedding_dim=embedding_dim,\n", + " fully_connected_dim=fully_connected_dim\n", + " )\n", + "\n", + " self.layernorm1 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)\n", + " self.layernorm2 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)\n", + " self.layernorm3 = tf.keras.layers.LayerNormalization(epsilon=layernorm_eps)\n", + "\n", + " self.dropout_ffn = tf.keras.layers.Dropout(dropout_rate)\n", + " \n", + " def call(self, x, enc_output, training, look_ahead_mask, padding_mask):\n", + " \"\"\"\n", + " Forward pass for the Decoder Layer\n", + " \n", + " Arguments:\n", + " x (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim)\n", + " enc_output (tf.Tensor): Tensor of shape(batch_size, input_seq_len, fully_connected_dim)\n", + " training (bool): Boolean, set to true to activate\n", + " the training mode for dropout layers\n", + " look_ahead_mask (tf.Tensor): Boolean mask for the target_input\n", + " padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer\n", + " Returns:\n", + " out3 (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim)\n", + " attn_weights_block1 (tf.Tensor): Tensor of shape (batch_size, num_heads, target_seq_len, target_seq_len)\n", + " attn_weights_block2 (tf.Tensor): Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len)\n", + " \"\"\"\n", + " \n", + " ### START CODE HERE ###\n", + " # 
enc_output.shape == (batch_size, input_seq_len, fully_connected_dim)\n", + " \n", + " # BLOCK 1\n", + " # calculate self-attention and return attention scores as attn_weights_block1.\n", + " # Dropout will be applied during training (~1 line).\n", + " mult_attn_out1, attn_weights_block1 = self.mha1(x, x, x, look_ahead_mask, return_attention_scores=True)\n", + " \n", + " # apply layer normalization (layernorm1) to the sum of the attention output and the input (~1 line)\n", + " Q1 = self.layernorm1(x + mult_attn_out1)\n", + "\n", + " # BLOCK 2\n", + " # calculate cross-attention using the Q from the first block and K and V from the encoder output. \n", + " # Dropout will be applied during training\n", + " # Return attention scores as attn_weights_block2 (~1 line) \n", + " mult_attn_out2, attn_weights_block2 = self.mha2(Q1, enc_output, enc_output, padding_mask, return_attention_scores=True)\n", + " \n", + " # apply layer normalization (layernorm2) to the sum of the attention output and the Q from the first block (~1 line)\n", + " mult_attn_out2 = self.layernorm2(mult_attn_out2 + Q1)\n", + " \n", + " # pass the output of the second block through an ffn layer\n", + " ffn_output = self.ffn(mult_attn_out2)\n", + " \n", + " # apply a dropout layer to the ffn output\n", + " # use `training=training`\n", + " ffn_output = self.dropout_ffn(ffn_output, training=training)\n", + " \n", + " # apply layer normalization (layernorm3) to the sum of the ffn output and the output of the second block\n", + " out3 = self.layernorm3(ffn_output + mult_attn_out2)\n", + " ### END CODE HERE ###\n", + "\n", + " return out3, attn_weights_block1, attn_weights_block2\n", + " " + ] + }, + { + "cell_type": "code", + "execution_count": 16, + "id": "41686c8b", + "metadata": { + "deletable": false, + "editable": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Using embedding_dim=12 and num_heads=16:\n", + "\n", + "q has shape:(1, 15, 12)\n", + "Output of encoder has 
shape:(1, 7, 8)\n", + "\n", + "Output of decoder layer has shape:(1, 15, 12)\n", + "Att Weights Block 1 has shape:(1, 16, 15, 15)\n", + "Att Weights Block 2 has shape:(1, 16, 15, 7)\n" + ] + } + ], + "source": [ + "# Test your function!\n", + "key_dim = 12\n", + "n_heads = 16\n", + "\n", + "decoderLayer_test = DecoderLayer(embedding_dim=key_dim, num_heads=n_heads, fully_connected_dim=32)\n", + "\n", + "q = np.ones((1, 15, key_dim))\n", + "encoder_test_output = tf.convert_to_tensor(np.random.rand(1, 7, 8))\n", + "look_ahead_mask = create_look_ahead_mask(q.shape[1])\n", + "\n", + "out, attn_w_b1, attn_w_b2 = decoderLayer_test(q, encoder_test_output, False, look_ahead_mask, None)\n", + "\n", + "print(f\"Using embedding_dim={key_dim} and num_heads={n_heads}:\\n\")\n", + "print(f\"q has shape:{q.shape}\")\n", + "print(f\"Output of encoder has shape:{encoder_test_output.shape}\\n\")\n", + "\n", + "print(f\"Output of decoder layer has shape:{out.shape}\")\n", + "print(f\"Att Weights Block 1 has shape:{attn_w_b1.shape}\")\n", + "print(f\"Att Weights Block 2 has shape:{attn_w_b2.shape}\")" + ] + }, + { + "cell_type": "markdown", + "id": "af9b85a3", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Output:\n", + "Using embedding_dim=12 and num_heads=16:\n", + "\n", + "q has shape:(1, 15, 12)\n", + "Output of encoder has shape:(1, 7, 8)\n", + "\n", + "Output of decoder layer has shape:(1, 15, 12)\n", + "Att Weights Block 1 has shape:(1, 16, 15, 15)\n", + "Att Weights Block 2 has shape:(1, 16, 15, 7)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 17, + "id": "932f7320", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + "source": [ + "# UNIT TEST\n", + "w2_unittest.test_decoderlayer(DecoderLayer, create_look_ahead_mask)" + ] + }, + { + "cell_type": "markdown", + "id": 
"66b82ccf", + "metadata": {}, + "source": [ + " \n", + "### 7.2 - Full Decoder\n", + "You're almost there! Time to use your Decoder layer to build a full Transformer Decoder (Figure 3b). You will embed your output and add positional encodings. You will then feed your encoded embeddings to a stack of Decoder layers. \n", + "\n", + "\n", + "\"Decoder\"\n", + "
Figure 3b: Transformer Decoder
\n", + "\n", + " \n", + "### Exercise 3 - Decoder\n", + "\n", + "Implement `Decoder()` using the `call()` method to embed your output, add positional encoding, and implement multiple decoder layers.\n", + " \n", + "In this exercise, you will initialize your Decoder with an Embedding layer, positional encoding, and multiple DecoderLayers. Your `call()` method will perform the following steps: \n", + "1. Pass your generated output through the Embedding layer.\n", + "2. Scale your embedding by multiplying it by the square root of your embedding dimension. Remember to cast the embedding dimension to data type `tf.float32` before computing the square root.\n", + "3. Add the position encoding: self.pos_encoding `[:, :seq_len, :]` to your embedding.\n", + "4. Pass the encoded embedding through a dropout layer, remembering to use the `training` parameter to set the model training mode. \n", + "5. Pass the output of the dropout layer through the stack of Decoding layers using a for loop." + ] + }, + { + "cell_type": "code", + "execution_count": 18, + "id": "57dde3be", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: Decoder\n", + "class Decoder(tf.keras.layers.Layer):\n", + " \"\"\"\n", + " The entire Encoder starts by passing the target input to an embedding layer \n", + " and using positional encoding to then pass the output through a stack of\n", + " decoder Layers\n", + " \n", + " \"\"\" \n", + " def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, target_vocab_size,\n", + " maximum_position_encoding, dropout_rate=0.1, layernorm_eps=1e-6):\n", + " super(Decoder, self).__init__()\n", + "\n", + " self.embedding_dim = embedding_dim\n", + " self.num_layers = num_layers\n", + "\n", + " self.embedding = tf.keras.layers.Embedding(target_vocab_size, self.embedding_dim)\n", + " self.pos_encoding = positional_encoding(maximum_position_encoding, self.embedding_dim)\n", + "\n", + " 
self.dec_layers = [DecoderLayer(embedding_dim=self.embedding_dim,\n", + " num_heads=num_heads,\n", + " fully_connected_dim=fully_connected_dim,\n", + " dropout_rate=dropout_rate,\n", + " layernorm_eps=layernorm_eps) \n", + " for _ in range(self.num_layers)]\n", + " self.dropout = tf.keras.layers.Dropout(dropout_rate)\n", + " \n", + " def call(self, x, enc_output, training, \n", + " look_ahead_mask, padding_mask):\n", + " \"\"\"\n", + " Forward pass for the Decoder\n", + " \n", + " Arguments:\n", + " x (tf.Tensor): Tensor of shape (batch_size, target_seq_len)\n", + " enc_output (tf.Tensor): Tensor of shape(batch_size, input_seq_len, fully_connected_dim)\n", + " training (bool): Boolean, set to true to activate\n", + " the training mode for dropout layers\n", + " look_ahead_mask (tf.Tensor): Boolean mask for the target_input\n", + " padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer\n", + " Returns:\n", + " x (tf.Tensor): Tensor of shape (batch_size, target_seq_len, fully_connected_dim)\n", + " attention_weights (dict[str: tf.Tensor]): Dictionary of tensors containing all the attention weights\n", + " each of shape Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len)\n", + " \"\"\"\n", + "\n", + " seq_len = tf.shape(x)[1]\n", + " attention_weights = {}\n", + " \n", + " ### START CODE HERE ###\n", + " # create word embeddings \n", + " x = self.embedding(x)\n", + " \n", + " # scale embeddings by multiplying by the square root of their dimension\n", + " x *= tf.math.sqrt(tf.cast(self.embedding_dim, tf.float32))\n", + " \n", + " # add positional encodings to word embedding\n", + " x += self.pos_encoding[:,:seq_len,:]\n", + "\n", + " # apply a dropout layer to x\n", + " # use `training=training`\n", + " x = self.dropout(x,training=training)\n", + "\n", + " # use a for loop to pass x through a stack of decoder layers and update attention_weights (~4 lines total)\n", + " for i in range(self.num_layers):\n", + " # pass x and the 
encoder output through a stack of decoder layers and save the attention weights\n", + " # of block 1 and 2 (~1 line)\n", + " x, block1, block2 = self.dec_layers[i](x, enc_output,training, look_ahead_mask, padding_mask)\n", + "\n", + " #update attention_weights dictionary with the attention weights of block 1 and block 2\n", + " attention_weights['decoder_layer{}_block1_self_att'.format(i+1)] = block1\n", + " attention_weights['decoder_layer{}_block2_decenc_att'.format(i+1)] = block2\n", + " ### END CODE HERE ###\n", + " \n", + " # x.shape == (batch_size, target_seq_len, fully_connected_dim)\n", + " return x, attention_weights" + ] + }, + { + "cell_type": "code", + "execution_count": 19, + "id": "04e877fb", + "metadata": { + "deletable": false, + "editable": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Using num_layers=5, embedding_dim=13 and num_heads=17:\n", + "\n", + "x has shape:(3, 4)\n", + "Output of encoder has shape:(3, 7, 9)\n", + "\n", + "Output of decoder has shape:(3, 4, 13)\n", + "\n", + "Attention weights:\n", + "decoder_layer1_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer1_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer2_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer2_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer3_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer3_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer4_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer4_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer5_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer5_block2_decenc_att has shape:(3, 17, 4, 7)\n" + ] + } + ], + "source": [ + "# Test your function!\n", + "n_layers = 5\n", + "emb_d = 13\n", + "n_heads = 17\n", + "fully_connected_dim = 16\n", + "target_vocab_size = 300\n", + "maximum_position_encoding = 6\n", + "\n", + "x = np.array([[3, 2, 1, 1], [2, 1, 1, 0], [2, 1, 1, 
0]])\n", + "\n", + "encoder_test_output = tf.convert_to_tensor(np.random.rand(3, 7, 9))\n", + "\n", + "look_ahead_mask = create_look_ahead_mask(x.shape[1])\n", + "\n", + "decoder_test = Decoder(n_layers, emb_d, n_heads, fully_connected_dim, target_vocab_size,maximum_position_encoding)\n", + " \n", + "outd, att_weights = decoder_test(x, encoder_test_output, False, look_ahead_mask, None)\n", + "\n", + "print(f\"Using num_layers={n_layers}, embedding_dim={emb_d} and num_heads={n_heads}:\\n\")\n", + "print(f\"x has shape:{x.shape}\")\n", + "print(f\"Output of encoder has shape:{encoder_test_output.shape}\\n\")\n", + "\n", + "print(f\"Output of decoder has shape:{outd.shape}\\n\")\n", + "print(\"Attention weights:\")\n", + "for name, tensor in att_weights.items():\n", + " print(f\"{name} has shape:{tensor.shape}\")" + ] + }, + { + "cell_type": "markdown", + "id": "9aa2ff15", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Using num_layers=5, embedding_dim=13 and num_heads=17:\n", + "\n", + "x has shape:(3, 4)\n", + "Output of encoder has shape:(3, 7, 9)\n", + "\n", + "Output of decoder has shape:(3, 4, 13)\n", + "\n", + "Attention weights:\n", + "decoder_layer1_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer1_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer2_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer2_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer3_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer3_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer4_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer4_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "decoder_layer5_block1_self_att has shape:(3, 17, 4, 4)\n", + "decoder_layer5_block2_decenc_att has shape:(3, 17, 4, 7)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 20, + "id": "e92745de", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + 
"outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + "source": [ + "# UNIT TEST\n", + "w2_unittest.test_decoder(Decoder, create_look_ahead_mask, create_padding_mask)" + ] + }, + { + "cell_type": "markdown", + "id": "848ba4b5", + "metadata": {}, + "source": [ + " \n", + "## 8 - Transformer\n", + "\n", + "Phew! This has been quite the assignment! Congratulations! You've done all the hard work, now it's time to put it all together. \n", + "\n", + "\"Transformer\"\n", + "
Figure 4: Transformer
\n", + " \n", + "The flow of data through the Transformer Architecture is as follows:\n", + "* First your input passes through an Encoder, which is just repeated Encoder layers that you implemented:\n", + " - embedding and positional encoding of your input\n", + " - multi-head attention on your input\n", + " - feed forward neural network to help detect features\n", + "* Then the predicted output passes through a Decoder, consisting of the decoder layers that you implemented:\n", + " - embedding and positional encoding of the output\n", + " - multi-head attention on your generated output\n", + " - multi-head attention with the Q from the first multi-head attention layer and the K and V from the Encoder\n", + " - a feed forward neural network to help detect features\n", + "* Finally, after the Nth Decoder layer, one dense layer and a softmax are applied to generate prediction for the next output in your sequence.\n", + "\n", + " \n", + "### Exercise 4 - Transformer\n", + "\n", + "Implement `Transformer()` using the `call()` method\n", + "1. Pass the input through the Encoder with the appropiate mask.\n", + "2. Pass the encoder output and the target through the Decoder with the appropiate mask.\n", + "3. Apply a linear transformation and a softmax to get a prediction." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 21, + "id": "c9e6cb07", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: Transformer\n", + "class Transformer(tf.keras.Model):\n", + " \"\"\"\n", + " Complete transformer with an Encoder and a Decoder\n", + " \"\"\"\n", + " def __init__(self, num_layers, embedding_dim, num_heads, fully_connected_dim, input_vocab_size, \n", + " target_vocab_size, max_positional_encoding_input,\n", + " max_positional_encoding_target, dropout_rate=0.1, layernorm_eps=1e-6):\n", + " super(Transformer, self).__init__()\n", + "\n", + " self.encoder = Encoder(num_layers=num_layers,\n", + " embedding_dim=embedding_dim,\n", + " num_heads=num_heads,\n", + " fully_connected_dim=fully_connected_dim,\n", + " input_vocab_size=input_vocab_size,\n", + " maximum_position_encoding=max_positional_encoding_input,\n", + " dropout_rate=dropout_rate,\n", + " layernorm_eps=layernorm_eps)\n", + "\n", + " self.decoder = Decoder(num_layers=num_layers, \n", + " embedding_dim=embedding_dim,\n", + " num_heads=num_heads,\n", + " fully_connected_dim=fully_connected_dim,\n", + " target_vocab_size=target_vocab_size, \n", + " maximum_position_encoding=max_positional_encoding_target,\n", + " dropout_rate=dropout_rate,\n", + " layernorm_eps=layernorm_eps)\n", + "\n", + " self.final_layer = tf.keras.layers.Dense(target_vocab_size, activation='softmax')\n", + " \n", + " def call(self, input_sentence, output_sentence, training, enc_padding_mask, look_ahead_mask, dec_padding_mask):\n", + " \"\"\"\n", + " Forward pass for the entire Transformer\n", + " Arguments:\n", + " input_sentence (tf.Tensor): Tensor of shape (batch_size, input_seq_len)\n", + " An array of the indexes of the words in the input sentence\n", + " output_sentence (tf.Tensor): Tensor of shape (batch_size, target_seq_len)\n", + " An array of the indexes of the words in the output sentence\n", + " training (bool): Boolean, set to 
true to activate\n", + " the training mode for dropout layers\n", + " enc_padding_mask (tf.Tensor): Boolean mask to ensure that the padding is not \n", + " treated as part of the input\n", + " look_ahead_mask (tf.Tensor): Boolean mask for the target_input\n", + " dec_padding_mask (tf.Tensor): Boolean mask for the second multihead attention layer\n", + " Returns:\n", + " final_output (tf.Tensor): The final output of the model\n", + " attention_weights (dict[str: tf.Tensor]): Dictionary of tensors containing all the attention weights for the decoder\n", + " each of shape Tensor of shape (batch_size, num_heads, target_seq_len, input_seq_len)\n", + " \n", + " \"\"\"\n", + " ### START CODE HERE ###\n", + " # call self.encoder with the appropriate arguments to get the encoder output\n", + " enc_output = self.encoder(input_sentence, training, enc_padding_mask)\n", + " \n", + " # call self.decoder with the appropriate arguments to get the decoder output\n", + " # dec_output.shape == (batch_size, tar_seq_len, fully_connected_dim)\n", + " dec_output, attention_weights = self.decoder(output_sentence, enc_output, training, look_ahead_mask, dec_padding_mask)\n", + " \n", + " # pass decoder output through a linear layer and softmax (~1 line)\n", + " final_output = self.final_layer(dec_output)\n", + " ### END CODE HERE ###\n", + "\n", + " return final_output, attention_weights" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "id": "3cd93c99", + "metadata": { + "deletable": false, + "editable": false + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Using num_layers=3, target_vocab_size=350 and num_heads=17:\n", + "\n", + "sentence_a has shape:(1, 7)\n", + "sentence_b has shape:(1, 7)\n", + "\n", + "Output of transformer (summary) has shape:(1, 7, 350)\n", + "\n", + "Attention weights:\n", + "decoder_layer1_block1_self_att has shape:(1, 17, 7, 7)\n", + "decoder_layer1_block2_decenc_att has shape:(1, 17, 7, 7)\n", + 
"decoder_layer2_block1_self_att has shape:(1, 17, 7, 7)\n", + "decoder_layer2_block2_decenc_att has shape:(1, 17, 7, 7)\n", + "decoder_layer3_block1_self_att has shape:(1, 17, 7, 7)\n", + "decoder_layer3_block2_decenc_att has shape:(1, 17, 7, 7)\n" + ] + } + ], + "source": [ + "# Test your function!\n", + "n_layers = 3\n", + "emb_d = 13\n", + "n_heads = 17\n", + "fully_connected_dim = 8\n", + "input_vocab_size = 300\n", + "target_vocab_size = 350\n", + "max_positional_encoding_input = 12\n", + "max_positional_encoding_target = 12\n", + "\n", + "transformer = Transformer(n_layers, \n", + " emb_d, \n", + " n_heads, \n", + " fully_connected_dim, \n", + " input_vocab_size, \n", + " target_vocab_size, \n", + " max_positional_encoding_input,\n", + " max_positional_encoding_target)\n", + "\n", + "# 0 is the padding value\n", + "sentence_a = np.array([[2, 3, 1, 3, 0, 0, 0]])\n", + "sentence_b = np.array([[1, 3, 4, 0, 0, 0, 0]])\n", + "\n", + "enc_padding_mask = create_padding_mask(sentence_a)\n", + "dec_padding_mask = create_padding_mask(sentence_a)\n", + "\n", + "look_ahead_mask = create_look_ahead_mask(sentence_a.shape[1])\n", + "\n", + "test_summary, att_weights = transformer(\n", + " sentence_a,\n", + " sentence_b,\n", + " False,\n", + " enc_padding_mask,\n", + " look_ahead_mask,\n", + " dec_padding_mask\n", + ")\n", + "\n", + "print(f\"Using num_layers={n_layers}, target_vocab_size={target_vocab_size} and num_heads={n_heads}:\\n\")\n", + "print(f\"sentence_a has shape:{sentence_a.shape}\")\n", + "print(f\"sentence_b has shape:{sentence_b.shape}\")\n", + "\n", + "print(f\"\\nOutput of transformer (summary) has shape:{test_summary.shape}\\n\")\n", + "print(\"Attention weights:\")\n", + "for name, tensor in att_weights.items():\n", + " print(f\"{name} has shape:{tensor.shape}\")" + ] + }, + { + "cell_type": "markdown", + "id": "95c9f812", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Using num_layers=3, target_vocab_size=350 and 
num_heads=17:\n", + "\n", + "sentence_a has shape:(1, 7)\n", + "sentence_b has shape:(1, 7)\n", + "\n", + "Output of transformer (summary) has shape:(1, 7, 350)\n", + "\n", + "Attention weights:\n", + "decoder_layer1_block1_self_att has shape:(1, 17, 7, 7)\n", + "decoder_layer1_block2_decenc_att has shape:(1, 17, 7, 7)\n", + "decoder_layer2_block1_self_att has shape:(1, 17, 7, 7)\n", + "decoder_layer2_block2_decenc_att has shape:(1, 17, 7, 7)\n", + "decoder_layer3_block1_self_att has shape:(1, 17, 7, 7)\n", + "decoder_layer3_block2_decenc_att has shape:(1, 17, 7, 7)\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 23, + "id": "a2d035a5", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + "source": [ + "# UNIT TEST\n", + "w2_unittest.test_transformer(Transformer, create_look_ahead_mask, create_padding_mask)" + ] + }, + { + "cell_type": "markdown", + "id": "33e8a0c2", + "metadata": {}, + "source": [ + "\n", + "## 9 - Initialize the Model\n", + "Now that you have defined the model, you can initialize and train it. First you can initialize the model with the parameters below. Note that generally these models are much larger and you are using a smaller version to fit this environment and to be able to train it in just a few minutes.\n", + "\n", + "The base model described in the original Transformer paper used `num_layers=6`, `embedding_dim=512`, and `fully_connected_dim=2048`." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 24, + "id": "a5f79f64", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# Define the model parameters\n", + "num_layers = 2\n", + "embedding_dim = 128\n", + "fully_connected_dim = 128\n", + "num_heads = 2\n", + "positional_encoding_length = 256\n", + "\n", + "# Initialize the model\n", + "transformer = Transformer(\n", + " num_layers, \n", + " embedding_dim, \n", + " num_heads, \n", + " fully_connected_dim,\n", + " vocab_size, \n", + " vocab_size, \n", + " positional_encoding_length, \n", + " positional_encoding_length,\n", + ")" + ] + }, + { + "cell_type": "markdown", + "id": "71473c27", + "metadata": {}, + "source": [ + "\n", + "## 10 - Prepare for Training the Model\n", + "\n", + "The original transformer paper uses Adam optimizer with custom learning rate scheduling, which we define in the cell below. This was empirically shown to produce faster convergence." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 25, + "id": "eb402089", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "class CustomSchedule(tf.keras.optimizers.schedules.LearningRateSchedule):\n", + " def __init__(self, d_model, warmup_steps=4000):\n", + " super(CustomSchedule, self).__init__()\n", + " self.d_model = tf.cast(d_model, dtype=tf.float32)\n", + " self.warmup_steps = warmup_steps\n", + " \n", + " def __call__(self, step):\n", + " step = tf.cast(step, dtype=tf.float32)\n", + " arg1 = tf.math.rsqrt(step)\n", + " arg2 = step * (self.warmup_steps ** -1.5)\n", + "\n", + " return tf.math.rsqrt(self.d_model) * tf.math.minimum(arg1, arg2)\n", + "\n", + "learning_rate = CustomSchedule(embedding_dim)\n", + "\n", + "# use the custom schedule defined above as the learning rate\n", + "optimizer = tf.keras.optimizers.Adam(learning_rate, beta_1=0.9, beta_2=0.98, epsilon=1e-9)" + ] + }, + { + "cell_type": "markdown", + "id": "ad854ab6", + "metadata": {}, + "source": [ + "Below you can plot what the custom learning rate schedule looks like." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 26, + "id": "35a17a59", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "data": { + "text/plain": [ + "Text(0.5, 0, 'Train Step')" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAlEAAAGwCAYAAACJjDBkAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8pXeV/AAAACXBIWXMAAA9hAAAPYQGoP6dpAABrBklEQVR4nO3de1xUdf4/8NcMMDNcB5DLgCLg/YaXvCCmmSuFZSbVlpq/dF2/2bZauVqZruJWtprZVpZlbRdrt/JSrZmpRXjLRBQEFUW8IeBluMNwv8x8fn8gRydRAWc4zPB6Ph7zQM58zpn3h0Hn5fl8zucohBACRERERNQsSrkLICIiIrJFDFFERERELcAQRURERNQCDFFERERELcAQRURERNQCDFFERERELcAQRURERNQCjnIXYM9MJhMuXboEd3d3KBQKucshIiKiJhBCoLS0FIGBgVAqb3y+iSHKii5duoSgoCC5yyAiIqIWyM7ORqdOnW74PEOUFbm7uwOofxM8PDxkroaIiIiawmAwICgoSPocvxGGKCtqGMLz8PBgiCIiIrIxt5qKw4nlRERERC3AEEVERETUAgxRRERERC3AEEVERETUAgxRRERERC3AEEVERETUAgxRRERERC3AEEVERETUAgxRRERERC3AEEVERETUArKHqDVr1iAkJAQajQbh4eE4ePDgTdtv2rQJvXr1gkajQVhYGLZt22b2vBACMTExCAgIgLOzMyIjI3H69GmzNq+99hpGjBgBFxcXeHp63vT1CgoK0KlTJygUChQXF7eki0RERGSHZA1RGzZswLx587B06VIcPnwYAwYMQFRUFHJzcxttv3//fkyZMgUzZ85EcnIyoqOjER0djdTUVKnNypUrsXr1aqxduxYJCQlwdXVFVFQUqqqqpDY1NTV49NFH8fTTT9+yxpkzZ6J///6331kiIiKyKwohhJDrxcPDwzF06FC89957AACTyYSgoCA888wzeOmll65rP2nSJJSXl2Pr1q3StuHDh2PgwIFYu3YthBAIDAzE/Pnz8fzzzwMASkpK4O/vj3Xr1mHy5Mlmx1u3bh3mzp17wzNMH3zwATZs2ICYmBiMHTsWRUVFNz1zVV1djerqaun7hrtAl5SUtPsbEAshYDQJODrIfvKTiIjopgwGA7Ra7S0/v2X7RKupqUFSUhIiIyOvFqNUIjIyEvHx8Y3uEx8fb9YeAKKioqT2GRkZ0Ov1Zm20Wi3Cw8NveMwbOXHiBF555RV88cUXUCqb9mNavnw5tFqt9AgKCmrWa9qzOV8lY/jyOOSWVt26MRERkQ2QLUTl5+fDaDTC39/fbLu/vz/0en2j++j1+pu2b/janGM2prq6GlOmTMEbb7yBzp07N3m/hQsXoqSkRHpkZ2c3eV97JoTAj8cuI7+sBp/sy5C7HCIiIotwlLuAtmjhwoXo3bs3/t//+3/N2k+tVkOtVlupKtuVW3p1iPOUvlTGSoiIiCxHtjNRPj4+cHBwQE5Ojtn2nJwc6HS6RvfR6XQ3bd/wtTnHbMzOnTuxadMmODo6wtHREWPHjpVqXrp0aZOPQ/WyCiukPx86X4SaOpOM1RAREVmGbCFKpVJh8ODBiIuLk7aZTCbExcUhI
iKi0X0iIiLM2gNAbGys1D40NBQ6nc6sjcFgQEJCwg2P2Zhvv/0WR44cQUpKClJSUvDxxx8DAH799VfMnj27ycehelkFV0NUWXUdDmcVyVgNERGRZcg6nDdv3jxMnz4dQ4YMwbBhw/D222+jvLwcM2bMAABMmzYNHTt2xPLlywEAzz33HEaPHo0333wT48ePx/r165GYmIiPPvoIAKBQKDB37lwsW7YM3bt3R2hoKJYsWYLAwEBER0dLr5uVlYXCwkJkZWXBaDQiJSUFANCtWze4ubmha9euZnXm5+cDAHr37n3LdaXoepnXnIkCgD2n8jC8SweZqiEiIrIMWUPUpEmTkJeXh5iYGOj1egwcOBA7duyQJoZnZWWZXRk3YsQIfPXVV1i8eDEWLVqE7t27Y/PmzejXr5/U5sUXX0R5eTlmzZqF4uJijBw5Ejt27IBGo5HaxMTE4PPPP5e+HzRoEABg165duPvuu63c6/Yn+0qI6unvjvScUuxJz8OCcb1kroqIiOj2yLpOlL1r6joT9u6RD/YjKbMIr0zsi6VbjkMI4OCisfDz0Nx6ZyIiolbW5teJovYj88qcqEFBXgjrqAUA7D2dL2dJREREt40hiqyqoqYO+WX1Sxx09nbB6B6+AOrnRREREdkyhiiyquzCSgCA1tkJWhcnKUT9ejoPdUYudUBERLaLIYqsKrOgHED9WSgAGBjkCU8XJxRX1CIpk0sdEBGR7WKIIqtqWGizIUQ5Oijxh55+AIBf0nJuuB8REVFbxxBFViWFqA4u0rZ7+tQvYRF7Ige8OJSIiGwVQxRZ1e/PRAHAqB6+UDkocb6gAmfzyuQqjYiI6LYwRJFVNRai3NSOiOhav2J57IlcWeoiIiK6XQxRZDVGk8CFK1fnXRuigGuH9PStXhcREZElMESR1eQYqlBjNMFRqUCA1nx18rG96yeXJ2cXI6+0Wo7yiIiIbgtDFFlNw1BeRy9nODqY/6oFaJ0R1lELIYBdJzmkR0REtochiqwmq+D6+VDXahjS+5lDekREZIMYoshqGptUfq2ovjoAwN5T+TBU1bZaXURERJbAEEVWc6sQ1cPfDV19XVFjNCGOC28SEZGNYYgiq8m8EqKCOzQeohQKBcaHBQAAfjzKIT0iIrItDFFkNdlXQlTQDc5EAcD9/etD1N7TeSjlkB4REdkQhiiyitKqWhSW1wC48XAeAPT0d0cXX1fU1JkQl8ar9IiIyHYwRJFVNMyH8nZVwV3jdMN2ZkN6xy63Sm1ERESWwBBFVtGUobwG918JUXtOcUiPiIhsB0MUWUXDmajgJoSoXjp3dPGpH9LbyYU3iYjIRjBEkVVk3mKhzWspFAqMvzLBfEvKJavWRUREZCkMUWQVt1oj6vcmDgwEUD+kV1DGe+kREVHbxxBFViGFqBusEfV73fzcEdZRizqT4ARzIiKyCQxRZHF1RhMuFlUCaPqZKACIHtQRAPDd4YtWqYuIiMiSGKLI4i6XVKHOJKByUMLfQ9Pk/R4cEAgHpQIp2cXIyC+3YoVERES3jyGKLK5hKK+TtzMclIom7+frrsbIbj4AgP8l82wUERG1bQxRZHHNnVR+rYfvqB/S25x8EUIIi9ZFRERkSQxRZHG3E6Lu6eMPF5UDsgorcDiryNKlERERWQxDFFlcVjPWiPo9F5UjxvXTAQC+SeKQHhERtV0MUWRxt3MmCgD+OLgTAOCHI5dQUVNnsbqIiIgsiSGKLE665UsH1xbtPzy0A4I7uKCsug4/HuWaUURE1DYxRJFFlVTUoqSy/ibCQd7OLTqGUqnApKFBAIANh7ItVhsREZElMUSRRTWchfJxU8NF5dji4/zxjk5wUCqQmFmEM7mlliqPiIjIYhiiyKKuDuW1bD5UAz8PDf7Qyw8Az0YREVHbxBBFFpVZWL/SeEsnlV9r8pUhvW8PX0RNnem2j0dERGRJDFFkUdlXzkQFWSBEje7hC38PNQrLa/BLWs5tH4+IiMiSGKLIo
jKvrBEVbIEQ5eigxKOD689G/fdA5m0fj4iIyJJkD1Fr1qxBSEgINBoNwsPDcfDgwZu237RpE3r16gWNRoOwsDBs27bN7HkhBGJiYhAQEABnZ2dERkbi9OnTZm1ee+01jBgxAi4uLvD09LzuNY4cOYIpU6YgKCgIzs7O6N27N955553b7mt7IK0RdZtzohpMHhYEpQLYf7YAp3M4wZyIiNoOWUPUhg0bMG/ePCxduhSHDx/GgAEDEBUVhdzc3Ebb79+/H1OmTMHMmTORnJyM6OhoREdHIzU1VWqzcuVKrF69GmvXrkVCQgJcXV0RFRWFqqoqqU1NTQ0effRRPP30042+TlJSEvz8/PDf//4Xx48fx9///ncsXLgQ7733nmV/AHam1mjCpeJKAJaZEwUAnbxccE8ffwDAF/E8G0VERG2HQsh4l9fw8HAMHTpUCicmkwlBQUF45pln8NJLL13XftKkSSgvL8fWrVulbcOHD8fAgQOxdu1aCCEQGBiI+fPn4/nnnwcAlJSUwN/fH+vWrcPkyZPNjrdu3TrMnTsXxcXFt6x19uzZSEtLw86dO2/Yprq6GtXV1dL3BoMBQUFBKCkpgYeHxy1fw9adzy/H3at2Q+2oxMlXx0GhUFjkuPvP5OPxjxPgonLAgUVj4aFxsshxiYiIGmMwGKDVam/5+S3bmaiamhokJSUhMjLyajFKJSIjIxEfH9/oPvHx8WbtASAqKkpqn5GRAb1eb9ZGq9UiPDz8hsdsqpKSEnh7e9+0zfLly6HVaqVHUFDQbb2mrbn2di+WClAAENG1A7r7uaGixohvky5Y7LhERES3Q7YQlZ+fD6PRCH9/f7Pt/v7+0Ov1je6j1+tv2r7ha3OO2RT79+/Hhg0bMGvWrJu2W7hwIUpKSqRHdnb7Wt/odu+ZdyMKhQLTRoQAAP4TnwmTSbaTp0RERBLZJ5a3dampqZg4cSKWLl2Ke++996Zt1Wo1PDw8zB7tiaUnlV/r4UEd4a52xLn8cvx6Jt/ixyciImou2UKUj48PHBwckJNjvv5PTk4OdDpdo/vodLqbtm/42pxj3syJEycwduxYzJo1C4sXL272/u1NVoF1zkQBgKvaEY8M7gQAWPdbhsWPT0RE1FyyhSiVSoXBgwcjLi5O2mYymRAXF4eIiIhG94mIiDBrDwCxsbFS+9DQUOh0OrM2BoMBCQkJNzzmjRw/fhxjxozB9OnT8dprrzVr3/bKUrd8uZHpI0KgUAC70vO43AEREclO1uG8efPm4d///jc+//xzpKWl4emnn0Z5eTlmzJgBAJg2bRoWLlwotX/uueewY8cOvPnmmzh58iT+8Y9/IDExEXPmzAFQP3dm7ty5WLZsGbZs2YJjx45h2rRpCAwMRHR0tHScrKwspKSkICsrC0ajESkpKUhJSUFZWRmA+iG8MWPG4N5778W8efOg1+uh1+uRl5fXej8cGyOEsNqcqAahPq6498pyB//+9ZxVXoOIiKipHOV88UmTJiEvLw8xMTHQ6/UYOHAgduzYIU0Mz8rKglJ5NeeNGDECX331FRYvXoxFixahe/fu2Lx5M/r16ye1efHFF1FeXo5Zs2ahuLgYI0eOxI4dO6DRaKQ2MTEx+Pzzz6XvBw0aBADYtWsX7r77bnzzzTfIy8vDf//7X/z3v/+V2gUHB+P8+fPW+nHYtKKKWpRV1wGoX9vJWmbd1RU/Hc/B5uRLeP7envDz0Nx6JyIiIiuQdZ0oe9fUdSbsQUp2MaLX/AadhwYHFo216mv98YP9SMwswl/v7ooXx/Wy6msREVH70+bXiSL7kllQDsB6Q3nXmnVXFwD199NrOPtFRETU2hiiyCKyr8yHCmqFEBXZ2x9dfF1hqKrDhkPtay0uIiJqOxiiyCIyC6x7Zd61lEoFnhxVfzbq030ZqDWarP6aREREv8cQRRZh7Svzfu+hQR3h667GxeJK/O/wxVZ5TSIiomsxRJFFtOZwHgBonBzw1JW5Ue/tOsOzU
URE1OoYoui2VdcZcdlQBaB1hvMaPB7eGR1cVcgqrMD3KZda7XWJiIgAhiiygAtFlRACcFE5oIOrqtVe10XliCevnI1as+sM6ng2ioiIWhFDFN22a+dDKRSKVn3tJ4YHw8vFCRn55dh69HKrvjYREbVvDFF026x54+FbcVU74v+uXKn37s7TMJq4diwREbUOhii6ba19Zd7vTYsIhtbZCWfzyrH1KOdGERFR62CIotsmhahWnFR+LXeNE54cFQoA+FfsKV6pR0RErYIhim6bnMN5DWbcGQofNxUyCyq4ijkREbUKhii6LUII2YfzgPq5Uc/8oTsAYHXcaVTWGGWrhYiI2geGKLot+WU1qKw1QqEAOnnJF6IAYMqwzujk5Yzc0mqs239e1lqIiMj+MUTRbckqLAcABGqdoXKU99dJ5ajEvHt6AAA+2H0GJRW1stZDRET2jSGKbkuWdLsXZ5krqTdxYEf09HeHoaoOa/eelbscIiKyYwxRdFsyr0wqD/Z2lbmSeg5KBV6I6gkA+HRfBi4UVchcERER2SuGKLotci9v0Jixvf0wvIs3qutMeH1HutzlEBGRnWKIotuSLQ3ntZ0QpVAosOSBPlAogB+OXEJSZqHcJRERkR1iiKLbcnU4r+2EKADoG6jFpCFBAIBXfjgBE28HQ0REFsYQRS1WWWNEbmk1AHnXiLqR+ff2hJvaEUculGBzykW5yyEiIjvDEEUt1jBp213tCE8XJ5mruZ6vuxqzx3QDALy+4yQqaupkroiIiOwJQxS1WMNQXucOLlAoFDJX07gZd4YgyNsZOYZqvLfzjNzlEBGRHWGIohZrC7d7uRWNkwMWj+8DAPj3r+dwJrdU5oqIiMheMERRi9lCiAKAe/v44w+9/FBrFFi8ORVCcJI5ERHdPoYoarG2uEZUYxQKBV5+sC80TkocOFfISeZERGQRDFHUYrZyJgqoX8fqmT90BwC89mMa76tHRES3jSGKWsRkEtJCm23lli+38uSoLujq64r8shq88fNJucshIiIbxxBFLZJbWo3qOhMclAoEeGrkLqdJVI5KvDqxHwDgy4QsJGUWyVwRERHZMoYoapGGobxATw2cHGzn12hENx88fEdHCAG8+M0RVNUa5S6JiIhslO18+lGbkmVjQ3nXinmgD3zd1TibV47VcaflLoeIiGwUQxS1SFZBOYC2dePhpvJ0UWFZdP2w3od7z+HYhRKZKyIiIlvEEEUtYktX5jUmqq8OD/QPgNEk8MI3R1BTZ5K7JCIisjEMUdQimQ3DeW18jaibefnBvvB2VeGkvhTv7+YtYYiIqHkYoqhFsm38TBQAdHBT4+UH+wIA3tt5BkcvFMtbEBER2RSGKGq28uo65JfVALDNOVHXeqB/AO4P06HOJDB3fQoqaurkLomIiGwEQxQ1W8N8KE8XJ2idnWSu5vYoFAr886Ew+HuocS6/HK/9mCZ3SUREZCNkD1Fr1qxBSEgINBoNwsPDcfDgwZu237RpE3r16gWNRoOwsDBs27bN7HkhBGJiYhAQEABnZ2dERkbi9Gnzy9hfe+01jBgxAi4uLvD09Gz0dbKysjB+/Hi4uLjAz88PL7zwAurqeJYCsP1J5b/n6aLCm48OBFC/CGdcWo68BRERkU2QNURt2LAB8+bNw9KlS3H48GEMGDAAUVFRyM3NbbT9/v37MWXKFMycORPJycmIjo5GdHQ0UlNTpTYrV67E6tWrsXbtWiQkJMDV1RVRUVGoqqqS2tTU1ODRRx/F008/3ejrGI1GjB8/HjU1Ndi/fz8+//xzrFu3DjExMZb9AdiohvlQtj6Ud62R3X3wfyNDAQAvfnMUeaXVMldERERtnpDRsGHDxOzZs6XvjUajCAwMFMuXL2+0/WOPPSbGjx9vti08PFw89dRTQgghTCaT0Ol04o033pCeLy4uFmq1Wnz99dfXHe+zzz4TWq32uu3btm0TSqVS6PV6adsHH3wgPDw8RHV1dZP7V1JSIgCIk
Zb0PQCxePFi6fuysjIBQGzfvv2Gx3zvvfeEi4uLcHd3F2PGjBGvvPKKOHv2rPR8RkaGACCSk5PN9uvatav46quvzLa9+uqrIiIiwmy/FStWSM/X1taKTp06iddff/2WfSUi6+CZKCKyO0OGDDH7vqysDM8//zx69+4NT09PuLm5IS0t7ZZnovr37y/92dXVFR4eHsjNzb1h+9mzZ0Ov1+PLL79EREQENm3ahL59+yI2NvaG+5SXl+Ps2bOYOXMm3NzcpMeyZctw9uxZs7YRERHSnx0dHTFkyBCkpaXdtA9EZD2cWE5EdsfV1dXs++effx6xsbFYtWoVunXrBmdnZ/zxj39ETU3NTY/j5ORk9r1CoYDJZLrpPu7u7pgwYQImTJiAZcuWISoqCsuWLcM999zTaPuysjIAwL///W+Eh4ebPefg4HDT1yIiefFMFBHZvd9++w1/+tOf8NBDDyEsLAw6nQ7nz5+3+usqFAr06tUL5eXlAACVSgUAZnOm/P39ERgYiHPnzqFbt25mj4aJ6A0OHDgg/bmurg5JSUno3bu31ftBRI3jmSgisnvdu3fHd999hwkTJkChUGDJkiW3PKPUXCkpKVi6dCmeeOIJ9OnTByqVCnv27MGnn36KBQsWAAD8/Pzg7OyMHTt2oFOnTtBoNNBqtXj55Zfx7LPPQqvVYty4caiurkZiYiKKioowb9486TXWrFmD7t27o3fv3njrrbdQVFSEP//5zxbtBxE1HUMUEdm9f/3rX/jzn/+MESNGwMfHBwsWLIDBYLDoa3Tq1AkhISF4+eWXpSUJGr7/29/+BqB+HtPq1avxyiuvICYmBqNGjcLu3bvxf//3f3BxccEbb7yBF154Aa6urggLC8PcuXPNXmPFihVYsWIFUlJS0K1bN2zZsgU+Pj4W7QcRNZ1CCCHkLoKIiG7s/PnzCA0NRXJyMgYOHCh3OUR0BedEEREREbUAQxQRERFRC3A4j4iIiKgFeCaKiIiIqAUYooiIiIhagCGKiIiIqAUYooiIiIhagCGKiIiIqAUYooiIiIhagCGKiIiIqAUYooiIiIha4P8DAa3C4Nz+Mz4AAAAASUVORK5CYII=", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plt.plot(learning_rate(tf.range(40000, dtype=tf.float32)))\n", + "plt.ylabel('Learning Rate')\n", + "plt.xlabel('Train Step')" + ] + }, + { + "cell_type": "markdown", + "id": "4cfba386", + "metadata": {}, + "source": [ + "Next, you set up the loss. Since the target sequences are padded, it is important to apply a padding mask when calculating the loss.\n", + "\n", + "You will use the sparse categorical cross-entropy loss function (`tf.keras.losses.SparseCategoricalCrossentropy`) and set the parameter `from_logits` to False since the Transformer does not output raw logits since the last layer has a softmax activation:" + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "id": "99fc8885", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "loss_object = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False, reduction='none')\n", + "\n", + "def masked_loss(real, pred):\n", + " mask = tf.math.logical_not(tf.math.equal(real, 0))\n", + " loss_ = loss_object(real, pred)\n", + "\n", + " mask = tf.cast(mask, dtype=loss_.dtype)\n", + " loss_ *= mask\n", + "\n", + " return tf.reduce_sum(loss_)/tf.reduce_sum(mask)\n", + "\n", + "\n", + "train_loss = tf.keras.metrics.Mean(name='train_loss')\n", + "\n", + "# Here you will store the losses, so you can later plot them\n", + "losses = []" + ] + }, + { + "cell_type": "markdown", + "id": "33db3f0b", + "metadata": {}, + "source": [ + "Now you can define your custom training function. If you are not very advanced with tensorflow, you can understand this function as an alternative to using `model.compile()` and `model.fit()`, but with added extra flexibility." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 28, + "id": "79092091", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "@tf.function\n", + "def train_step(model, inp, tar):\n", + "    \"\"\"\n", + "    One training step for the transformer\n", + "    Arguments:\n", + "        model (tf.keras.Model): The transformer model\n", + "        inp (tf.Tensor): Input data to summarize\n", + "        tar (tf.Tensor): Target (summary)\n", + "    Returns:\n", + "        None\n", + "    \"\"\"\n", + "    tar_inp = tar[:, :-1]\n", + "    tar_real = tar[:, 1:]\n", + "\n", + "    # Create masks\n", + "    enc_padding_mask = create_padding_mask(inp)\n", + "    look_ahead_mask = create_look_ahead_mask(tf.shape(tar_inp)[1])\n", + "    dec_padding_mask = create_padding_mask(inp) # Notice that both encoder and decoder padding masks are equal\n", + "\n", + "    with tf.GradientTape() as tape:\n", + "        predictions, _ = model(\n", + "            inp,\n", + "            tar_inp,\n", + "            True,\n", + "            enc_padding_mask,\n", + "            look_ahead_mask,\n", + "            dec_padding_mask\n", + "        )\n", + "        loss = masked_loss(tar_real, predictions)\n", + "\n", + "    gradients = tape.gradient(loss, model.trainable_variables)\n", + "    optimizer.apply_gradients(zip(gradients, model.trainable_variables))\n", + "\n", + "    train_loss(loss)" + ] + }, + { + "cell_type": "markdown", + "id": "1480d5fd", + "metadata": {}, + "source": [ + "Now you are ready to train the model. But before starting the training, you can also define one more set of functions to perform inference. Because you are using a custom training loop, you can do whatever you want between the training steps. And wouldn't it be fun to see after each epoch some examples of how the model performs?" + ] + }, + { + "cell_type": "markdown", + "id": "79e05c54", + "metadata": {}, + "source": [ + "\n", + "## 11 - Summarization\n", + "\n", + "The last thing you will implement is inference. With this, you will be able to produce actual summaries of the documents. 
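The shift at the top of `train_step` above is easiest to see on a concrete sequence: the decoder input drops the final token and the labels drop the first one, so at every position the model is trained to predict the following word (teacher forcing). A tiny sketch with hypothetical token ids:

```python
# Teacher-forcing shift from train_step, on made-up token ids:
# 101 stands for [SOS] and 102 for [EOS] here.
tar = [101, 7, 42, 9, 102]
tar_inp, tar_real = tar[:-1], tar[1:]
print(tar_inp)   # [101, 7, 42, 9]  (decoder input: starts with [SOS], no [EOS])
print(tar_real)  # [7, 42, 9, 102]  (labels: shifted left, ends with [EOS])
```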
You will use a simple method called greedy decoding, which means you will predict one word at a time and append it to the output. You will start with an `[SOS]` token and repeat the word-by-word inference until the model returns the `[EOS]` token or until you reach the maximum length of the sentence (you need to add this limit; otherwise, a poorly trained model could give you infinite sentences without ever producing the `[EOS]` token).\n", + " \n", + "### Exercise 5 - next_word\n", + "Write a helper function that predicts the next word, so you can use it to write whole sentences. Hint: this is very similar to what happens in `train_step`, but you have to set the model's `training` argument to False." + ] + }, + { + "cell_type": "code", + "execution_count": 29, + "id": "175fae70", + "metadata": { + "deletable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "# GRADED FUNCTION: next_word\n", + "def next_word(model, encoder_input, output):\n", + "    \"\"\"\n", + "    Helper function for summarization that uses the model to predict just the next word.\n", + "    Arguments:\n", + "        model (tf.keras.Model): The transformer model\n", + "        encoder_input (tf.Tensor): Input data to summarize\n", + "        output (tf.Tensor): (incomplete) target (summary)\n", + "    Returns:\n", + "        predicted_id (tf.Tensor): The id of the predicted word\n", + "    \"\"\"\n", + "    ### START CODE HERE ###\n", + "    # Create a padding mask for the input (encoder)\n", + "    enc_padding_mask = create_padding_mask(encoder_input)\n", + "    # Create a look-ahead mask for the output\n", + "    look_ahead_mask = create_look_ahead_mask(tf.shape(output)[1])\n", + "    # Create a padding mask for the input (decoder)\n", + "    dec_padding_mask = create_padding_mask(encoder_input)\n", + "\n", + "    # Run the prediction of the next word with the transformer model\n", + "    predictions, attention_weights = model(\n", + "        encoder_input, output, False, enc_padding_mask, look_ahead_mask, dec_padding_mask\n", + "    )\n", + "    ### END CODE HERE ###\n", + "\n", + "    
predictions = predictions[:, -1:, :]\n", + " predicted_id = tf.cast(tf.argmax(predictions, axis=-1), tf.int32)\n", + " \n", + " return predicted_id" + ] + }, + { + "cell_type": "markdown", + "id": "29af50d0", + "metadata": {}, + "source": [ + "Check if your function works." + ] + }, + { + "cell_type": "code", + "execution_count": 30, + "id": "3e97ba77", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Predicted token: [[14859]]\n", + "Predicted word: masses\n" + ] + } + ], + "source": [ + "# Take a random sentence as an input\n", + "input_document = tokenizer.texts_to_sequences([\"a random sentence\"])\n", + "input_document = tf.keras.preprocessing.sequence.pad_sequences(input_document, maxlen=encoder_maxlen, padding='post', truncating='post')\n", + "encoder_input = tf.expand_dims(input_document[0], 0)\n", + "\n", + "# Take the start of sentence token as the only token in the output to predict the next word\n", + "output = tf.expand_dims([tokenizer.word_index[\"[SOS]\"]], 0)\n", + "\n", + "# Predict the next word with your function\n", + "predicted_token = next_word(transformer, encoder_input, output)\n", + "print(f\"Predicted token: {predicted_token}\")\n", + "\n", + "predicted_word = tokenizer.sequences_to_texts(predicted_token.numpy())[0]\n", + "print(f\"Predicted word: {predicted_word}\")" + ] + }, + { + "cell_type": "markdown", + "id": "7157031c", + "metadata": {}, + "source": [ + "##### __Expected Output__\n", + "\n", + "```\n", + "Predicted token: [[14859]]\n", + "Predicted word: masses\n", + "```" + ] + }, + { + "cell_type": "code", + "execution_count": 31, + "id": "6bd98959", + "metadata": { + "deletable": false, + "editable": false, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "\u001b[92m All tests passed!\n" + ] + } + ], + "source": [ + "# UNIT TEST\n", + 
"w2_unittest.test_next_word(next_word, transformer, encoder_input, output)" + ] + }, + { + "cell_type": "code", + "execution_count": 32, + "id": "6177dc6a", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [], + "source": [ + "def summarize(model, input_document):\n", + " \"\"\"\n", + " A function for summarization using the transformer model\n", + " Arguments:\n", + " model (tf.keras.Model): The transformer model\n", + " input_document (tf.Tensor): Input data to summarize\n", + " Returns:\n", + " _ (str): The summary of the input_document\n", + " \"\"\" \n", + " input_document = tokenizer.texts_to_sequences([input_document])\n", + " input_document = tf.keras.preprocessing.sequence.pad_sequences(input_document, maxlen=encoder_maxlen, padding='post', truncating='post')\n", + " encoder_input = tf.expand_dims(input_document[0], 0)\n", + " \n", + " output = tf.expand_dims([tokenizer.word_index[\"[SOS]\"]], 0)\n", + " \n", + " for i in range(decoder_maxlen):\n", + " predicted_id = next_word(model, encoder_input, output)\n", + " output = tf.concat([output, predicted_id], axis=-1)\n", + " \n", + " if predicted_id == tokenizer.word_index[\"[EOS]\"]:\n", + " break\n", + "\n", + " return tokenizer.sequences_to_texts(output.numpy())[0] # since there is just one summarized document" + ] + }, + { + "cell_type": "markdown", + "id": "d3b15117", + "metadata": {}, + "source": [ + "Now you can already summarize a sentence! But beware: since the model has not been trained yet, it will just produce nonsense." + ] + }, + { + "cell_type": "code", + "execution_count": 33, + "id": "bae4d5f1", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training set example:\n", + "[SOS] amanda: i baked cookies. do you want some? jerry: sure! 
amanda: i'll bring you tomorrow :-) [EOS]\n", + "\n", + "Human written summary:\n", + "[SOS] amanda baked cookies and will bring jerry some tomorrow. [EOS]\n", + "\n", + "Model written summary:\n" + ] + }, + { + "data": { + "text/plain": [ + "\"[SOS] masses kindergarten concept kindergarten concept bloomer wilingness sux sam kindergarten lisabeth kindergarten sawyer's sawyer's masses concept bloomer lisabeth bloomer wilingness 80000 bt hotsummer hoax hoax kieslowski wilingness 80000 dont't elis' 🐶❤️👍 cots saaaad evelynn inexperienced suji zubac forthcoming callum farmers extraordinary callum kindergarten worthy extraordinary readable 🐶❤️👍 thinkgn 🐶❤️👍 cots\"" + ] + }, + "execution_count": 33, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "training_set_example = 0\n", + "\n", + "# Check a summary of a document from the training set\n", + "print('Training set example:')\n", + "print(document[training_set_example])\n", + "print('\\nHuman written summary:')\n", + "print(summary[training_set_example])\n", + "print('\\nModel written summary:')\n", + "summarize(transformer, document[training_set_example])" + ] + }, + { + "cell_type": "markdown", + "id": "90d6f836", + "metadata": {}, + "source": [ + "\n", + "## 12 - Train the model\n", + "\n", + "Now you can finally train the model. Below is a loop that will train your model for 20 epochs. Note that it should take about 30 seconds per epoch (with the exception of the first few epochs, which can take a few minutes each).\n", + "\n", + "After each epoch, the loop also runs the summarization on one of the sentences in the test set and prints it out, so you can see how your model is improving." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 34, + "id": "ebe2bf5f", + "metadata": { + "deletable": false, + "editable": false, + "scrolled": true, + "tags": [] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1, Batch 1/231\r" + ] + }, + { + "name": "stderr", + "output_type": "stream", + "text": [ + "2025-06-12 13:11:54.587111: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7bcf29ee4ae0 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:\n", + "2025-06-12 13:11:54.587172: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): NVIDIA A10G, Compute Capability 8.6\n", + "2025-06-12 13:11:54.636169: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:255] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.\n", + "2025-06-12 13:11:54.723376: I tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:432] Loaded cuDNN version 8600\n", + "2025-06-12 13:11:55.021293: I ./tensorflow/compiler/jit/device_compiler.h:186] Compiled cluster using XLA! This line is logged at most once for the lifetime of the process.\n" + ] + }, + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Epoch 1, Loss 7.886631\n", + "Time taken for one epoch: 65.2928831577301 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] [EOS]\n", + "\n", + "Epoch 2, Loss 6.600031\n", + "Time taken for one epoch: 24.169685125350952 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. 
[EOS]\n", + " Predicted summarization:\n", + " [SOS] is going to the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the the\n", + "\n", + "Epoch 3, Loss 6.029531\n", + "Time taken for one epoch: 16.651750326156616 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] tom is going to the new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new new\n", + "\n", + "Epoch 4, Loss 5.683831\n", + "Time taken for one epoch: 12.880155563354492 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] tom is going to the new new new new new new new new new new new new new new new job [EOS]\n", + "\n", + "Epoch 5, Loss 5.475431\n", + "Time taken for one epoch: 13.034060716629028 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] the new new new new new new new job and she will be at the weekend [EOS]\n", + "\n", + "Epoch 6, Loss 5.322731\n", + "Time taken for one epoch: 10.890370845794678 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. 
[EOS]\n", + " Predicted summarization:\n", + " [SOS] tom is going to the new job [EOS]\n", + "\n", + "Epoch 7, Loss 5.195131\n", + "Time taken for one epoch: 10.396798133850098 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] tom is going to the party with her [EOS]\n", + "\n", + "Epoch 8, Loss 5.083531\n", + "Time taken for one epoch: 11.252682209014893 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] the new year's eve is going to the party [EOS]\n", + "\n", + "Epoch 9, Loss 4.977131\n", + "Time taken for one epoch: 9.915740489959717 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] the new year's eve is going to the party [EOS]\n", + "\n", + "Epoch 10, Loss 4.876331\n", + "Time taken for one epoch: 9.256117343902588 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] the car is going to the party with her [EOS]\n", + "\n", + "Epoch 11, Loss 4.776131\n", + "Time taken for one epoch: 9.405097723007202 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. 
[EOS]\n", + " Predicted summarization:\n", + " [SOS] mark has just arrived to the office today [EOS]\n", + "\n", + "Epoch 12, Loss 4.674431\n", + "Time taken for one epoch: 9.421130657196045 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] the car is going to the cinema with her [EOS]\n", + "\n", + "Epoch 13, Loss 4.571131\n", + "Time taken for one epoch: 9.543477296829224 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] ben will buy the red dress for the movie [EOS]\n", + "\n", + "Epoch 14, Loss 4.470531\n", + "Time taken for one epoch: 9.424731492996216 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] alice has just arrived at the cinema with her [EOS]\n", + "\n", + "Epoch 15, Loss 4.368131\n", + "Time taken for one epoch: 9.076642751693726 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] alice has just arrived to the cinema with her [EOS]\n", + "\n", + "Epoch 16, Loss 4.267731\n", + "Time taken for one epoch: 9.078428030014038 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. 
[EOS]\n", + " Predicted summarization:\n", + " [SOS] hannah has just arrived to the cinema with her [EOS]\n", + "\n", + "Epoch 17, Loss 4.164831\n", + "Time taken for one epoch: 10.068711042404175 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] alice has just arrived to the cinema with her [EOS]\n", + "\n", + "Epoch 18, Loss 4.069731\n", + "Time taken for one epoch: 8.947221755981445 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] hannah has just arrived to the cinema with amanda and sara will go to the cinema [EOS]\n", + "\n", + "Epoch 19, Loss 3.967131\n", + "Time taken for one epoch: 9.104878425598145 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. [EOS]\n", + " Predicted summarization:\n", + " [SOS] alice has just finished the book and he will be at the cinema with amanda [EOS]\n", + "\n", + "Epoch 20, Loss 3.876331\n", + "Time taken for one epoch: 8.590877771377563 sec\n", + "Example summarization on the test set:\n", + " True summarization:\n", + " [SOS] hannah needs betty's number but amanda doesn't have it. she needs to contact larry. 
[EOS]\n", + " Predicted summarization:\n", + " [SOS] alice and hannah are going to the cinema with amanda and sara will see him [EOS]\n", + "\n" + ] + } + ], + "source": [ + "# Take an example from the test set, to monitor it during training\n", + "test_example = 0\n", + "true_summary = summary_test[test_example]\n", + "true_document = document_test[test_example]\n", + "\n", + "# Define the number of epochs\n", + "epochs = 20\n", + "\n", + "# Training loop\n", + "for epoch in range(epochs):\n", + " \n", + " start = time.time()\n", + " train_loss.reset_states()\n", + " number_of_batches = len(list(enumerate(dataset)))\n", + "\n", + " for (batch, (inp, tar)) in enumerate(dataset):\n", + " print(f'Epoch {epoch+1}, Batch {batch+1}/{number_of_batches}', end='\\r')\n", + " train_step(transformer, inp, tar)\n", + " \n", + " print(f'Epoch {epoch+1}, Loss {train_loss.result():.4f}')\n", + " losses.append(train_loss.result())\n", + " \n", + " print(f'Time taken for one epoch: {time.time() - start} sec')\n", + " print('Example summarization on the test set:')\n", + " print(' True summarization:')\n", + " print(f' {true_summary}')\n", + " print(' Predicted summarization:')\n", + " print(f' {summarize(transformer, true_document)}\\n')" + ] + }, + { + "cell_type": "markdown", + "id": "35687ddc", + "metadata": {}, + "source": [ + "Plot the loss function." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 35, + "id": "eb3d5335", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "data": { + "text/plain": [ + "Text(0.5, 0, 'Epoch')" + ] + }, + "execution_count": 35, + "metadata": {}, + "output_type": "execute_result" + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAjcAAAGwCAYAAABVdURTAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8pXeV/AAAACXBIWXMAAA9hAAAPYQGoP6dpAABLsElEQVR4nO3deVxU9f4/8NcZlmGRGZRtGBgRV5AdVzSvlRaapmiL+rOrVrZdW6zb95a3rMwKu9263Vs3TW9qZdq1BSwtFSzNFHMB3DcE2RdFYVhkgJnz+4MY5QojIMyZ5fV8PObxcM58zuF9Oo28/JzP+XwEURRFEBEREdkImdQFEBEREXUlhhsiIiKyKQw3REREZFMYboiIiMimMNwQERGRTWG4ISIiIpvCcENEREQ2xVHqAszNYDCgqKgIHh4eEARB6nKIiIioHURRRFVVFdRqNWQy030zdhduioqKoNFopC6DiIiIOiE/Px+BgYEm29hduPHw8ADQ9B9HoVBIXA0RERG1h1arhUajMf4eN8Xuwk3zrSiFQsFwQ0REZGXaM6SEA4qJiIjIpkgabvR6PRYvXozg4GC4urqiX79+WLp0KW60lufOnTsRGxsLuVyO/v37Y+3ateYpmIiIiCyepLel3n77bSxfvhyffvopwsLCcPDgQTz44INQKpV4+umnW90nJycHkyZNwuOPP44vvvgCO3bswPz58+Hv74/4+HgznwERERFZGkG8UTdJN5o8eTL8/PzwySefGLfdc889cHV1xbp161rd54UXXsCWLVtw7Ngx47aZM2eioqICW7duva69TqeDTqczvm8ekFRZWckxN0RERFZCq9VCqVS26/e3pLelRo0ahR07duDMmTMAgMOHD+PXX3/FxIkT29wnLS0N48ePb7EtPj4eaWlprbZPTEyEUqk0vvgYOBERkW2T9LbUiy++CK1Wi5CQEDg4OECv1+PNN9/E7Nmz29ynpKQEfn5+Lbb5+flBq9XiypUrcHV1bfHZokWL8NxzzxnfN/fcEBERkW2SNNxs3LgRX3zxBdavX4+wsDBkZmZi4cKFUKvVmDt3bpf8DLlcDrlc3iXHIiIiIssnabj5v//7P7z44ouYOXMmACAiIgK5ublITExsM9yoVCqUlpa22FZaWgqFQnFdrw0RERHZH0nH3NTW1l63PoSDgwMMBkOb+8TFxWHHjh0ttqWkpCAuLq5baiQiIiLrImm4ufvuu/Hmm29iy5YtOH/+PJKSkvDee+9h2rRpxjaLFi3CnDlzjO8ff/xxZGdn4y9/+QtOnTqFjz76CBs3bsSzzz4rxSkQERGRhZH0ttQHH3yAxYsX409/+hPKysqgVqvx2GOP4ZVXXjG2KS4uRl5envF9cHAwtmzZgmeffRb//Oc/ERgYiP/85z+c44aIiIgASDzPjRQ68pw8ERERWQarmefG1pRX63CmtErqMoiIiOwaw00XST1RiiFvpOLPGw9LXQoREZFdY7jpIqHqpi6yk8VaXKnXS1wNERGR/WK46SJqpQt8PeRoNIg4WlgpdTlERER2i+GmiwiCgJjengCAjLzL0hZDRERkxxhuulBs754AgIy8CmkLISIismMMN10o5vdwk553GXb2hD0REZHFYL
jpQhEBSjjKBJRV6VBUWSd1OURERHaJ4aYLuTo7INS/6akpjrshIiKSBsNNF2seVJyeWyFpHURERPaK4aaLGQcV57PnhoiISAoMN12suefmeKEWukZO5kdERGRuDDddrHcvN/Ryd0a93oDjRVqpyyEiIrI7DDddTBAExBon86uQtBYiIiJ7xHDTDa6d74aIiIjMi+GmG8RoPAEAmey5ISIiMjuGm24QqfGETAAKK66gVMvJ/IiIiMyJ4aYb9JA7YqCfBwBO5kdERGRuDDfdJIaLaBIREUmC4aab8IkpIiIiaTDcdJPmnpsjhRVo0BskroaIiMh+MNx0k77e7lC4OKKuwYBTxVVSl0NERGQ3GG66iUwmcL4bIiIiCTDcdKMY47gbhhsiIiJzYbjpRldXCK+QthAiIiI7wnDTjaJ+n6k4t7wWF6t10hZDRERkJxhuupHS1Qn9fXsA4FIMRERE5sJw082M893kc9wNERGROTDcdDPjE1O5FdIWQkREZCcYbrpZ8xNThwsqoDeI0hZDRERkBxhuutkAXw/0kDuitl6PM6WczI+IiKi7SRpu+vTpA0EQrnstWLCg1fZr1669rq2Li4uZq+4YB5mAKI0SACfzIyIiMgdJw82BAwdQXFxsfKWkpAAA7rvvvjb3USgULfbJzc01V7mdFqPhCuFERETm4ijlD/fx8WnxftmyZejXrx/Gjh3b5j6CIEClUrX7Z+h0Ouh0V+eY0Wq1HS/0JsUGeQJgzw0REZE5WMyYm/r6eqxbtw4PPfQQBEFos111dTWCgoKg0WgwdepUHD9+3ORxExMToVQqjS+NRtPVpd9Q9O89N9kXalBRW2/2n09ERGRPLCbcJCcno6KiAvPmzWuzzaBBg7B69Wps2rQJ69atg8FgwKhRo1BQUNDmPosWLUJlZaXxlZ+f3w3Vm9bL3RnB3u4AgEwuxUBERNStJL0tda1PPvkEEydOhFqtbrNNXFwc4uLijO9HjRqF0NBQfPzxx1i6dGmr+8jlcsjl8i6vt6NiNJ7IuViD9LwK3DrIV+pyiIiIbJZF9Nzk5uYiNTUV8+fP79B+Tk5OiImJQVZWVjdV1nW4QjgREZF5WES4WbNmDXx9fTFp0qQO7afX63H06FH4+/t3U2Vdp3mm4sz8Chg4mR8REVG3kTzcGAwGrFmzBnPnzoWjY8u7ZHPmzMGiRYuM719//XVs374d2dnZSE9PxwMPPIDc3NwO9/hIIUTlARcnGarqGnHuQrXU5RAREdksycfcpKamIi8vDw899NB1n+Xl5UEmu5q/Ll++jEceeQQlJSXo2bMnhgwZgr1792Lw4MHmLLlTHB1kiAz0xP6cS8jIq8AAPw+pSyIiIrJJgiiKdnWPRKvVQqlUorKyEgqFwqw/e9mPp7Bi1znMGq5B4vRIs/5sIiIia9aR39+S35ayJ82DirlCOBERUfdhuDGj5nBzpqwKVXUN0hZDRERkoxhuzMjXwwWBPV0hisDh/EqpyyEiIrJJDDdm1vxIOOe7ISIi6h4MN2YW2zyZH5dhICIi6hYMN2Z2bc+NnT2oRkREZBYMN2Y22F8BZ0cZLtc24Hx5rdTlEBER2RyGGzNzdpQhIkAJgONuiIiIugPDjQRiNJ4AgHSGGyIioi7HcCOBq+NuKqQthIiIyAYx3EggNsgTAHCqpAq19Y3SFkNERGRjGG4k4K90hUrhAr1BxJECTuZHRETUlRhuJNK8FANvTREREXUthhuJxP4+7oaDiomIiLoWw41Eru254WR+REREXYfhRiLhAUo4OQi4WK1DweUrUpdDRERkMxhuJOLi5IDB/goAvDVFRETUlRhuJMT5boiIiLoew42EYrhCOBERUZdjuJFQ8xNTJ4oqUdegl7gaIiIi28BwI6HAnq7w7uGMBr2I40WczI+IiKgrMNxISBAEjrshIiLqYgw3Emsed8MnpoiIiLoGw43EYjTsuSEiIupKDDcSi9IoIROA4so6FFdyMj8iIqKbxX
AjMTdnR4SomibzY+8NERHRzWO4sQCxQZ4AgAyOuyEiIrppDDcWoHncTTp7boiIiG4aw40FaH5i6mhhJeobDdIWQ0REZOUYbixAsLc7PN2cUN9owMlirdTlEBERWTWGGwsgCAJiNJ4AON8NERHRzZI03PTp0weCIFz3WrBgQZv7fPXVVwgJCYGLiwsiIiLwww8/mLHi7sOZiomIiLqGpOHmwIEDKC4uNr5SUlIAAPfdd1+r7ffu3YtZs2bh4YcfRkZGBhISEpCQkIBjx46Zs+xu0byIZkY+e26IiIhuhiCKoih1Ec0WLlyIzZs34+zZsxAE4brPZ8yYgZqaGmzevNm4beTIkYiOjsaKFStaPaZOp4NOpzO+12q10Gg0qKyshEKh6PqT6CRtXQOilmyHKAIHXhoPHw+51CURERFZDK1WC6VS2a7f3xYz5qa+vh7r1q3DQw891GqwAYC0tDSMHz++xbb4+HikpaW1edzExEQolUrjS6PRdGndXUXh4oQBvj0AcL4bIiKim2Ex4SY5ORkVFRWYN29em21KSkrg5+fXYpufnx9KSkra3GfRokWorKw0vvLz87uq5C7XfGuK890QERF1nsWEm08++QQTJ06EWq3u0uPK5XIoFIoWL0vVPN8Ne26IiIg6z1HqAgAgNzcXqamp+Pbbb022U6lUKC0tbbGttLQUKpWqO8szm+aemyMFlWjUG+DoYDHZk4iIyGpYxG/PNWvWwNfXF5MmTTLZLi4uDjt27GixLSUlBXFxcd1Zntn08+kBD7kjrjTocaqkSupyiIiIrJLk4cZgMGDNmjWYO3cuHB1bdiTNmTMHixYtMr5/5plnsHXrVrz77rs4deoUXnvtNRw8eBBPPvmkucvuFjKZgOjmW1P5FZLWQkREZK0kDzepqanIy8vDQw89dN1neXl5KC4uNr4fNWoU1q9fj5UrVyIqKgpff/01kpOTER4ebs6Su9XVyfw47oaIiKgzLGqeG3PoyHPyUvj5dBkeXHMAwd7u+Pn5W6Uuh4iIyCJY5Tw31KR5jamcizW4XFMvbTFERERWiOHGwni6OaOvjzsAIJPjboiIiDqM4cYCxWiaJ/PjuBsiIqKOYrixQFcn86uQtA4iIiJrxHBjgZon88vMr4DeYFfjvYmIiG4aw40FGujXA27ODqjWNSKrrFrqcoiIiKwKw40FcnSQISrQEwDnuyEiIuoohhsL1TzuhoOKiYiIOobhxkJdnam4QtpCiIiIrAzDjYVq7rk5W1aNyisN0hZDRERkRRhuLJR3Dzl693IDABzmZH5ERETtxnBjwTjfDRERUccx3Fiw5vluMvI5qJiIiKi9GG4s2LU9NwZO5kdERNQuDDcWLESlgNxRhsorDcgpr5G6HCIiIqvAcGPBnB1liAxUAgDSc3lrioiIqD0Ybiyccb4bPjFFRETULgw3Fi6WT0wRERF1CMONhWvuuTldokW1rlHiaoiIiCwfw42F81O4QK10gUEEjhRUSF0OERGRxWO4sQIxQVxnioiIqL0YbqxAjMYTAJDBFcKJiIhuiOHGCly7QrgocjI/IiIiUxhurEB4gALODjKU19Qj/9IVqcshIiKyaAw3VkDu6IDBagUAIJ23poiIiExiuLESzetM/XLmgrSFEBERWTiGGytxd5QaAPDd4SLkX6qVuBoiIiLLxXBjJWJ798SYAd5oNIj4989ZUpdDRERksRhurMgz4wYAAL4+VMDeGyIiojYw3FiRoX16YXR/LzQaRHy085zU5RAREVkkycNNYWEhHnjgAXh5ecHV1RURERE4ePBgm+137twJQRCue5WUlJixauk8M24gAODrQ/korOBj4URERP9L0nBz+fJljB49Gk5OTvjxxx9x4sQJvPvuu+jZs+cN9z19+jSKi4uNL19fXzNULL3hwb0Q19cLDXoRH3HsDRER0XUcpfzhb7/9NjQaDdasWWPcFhwc3K59fX194enp2U2VWbZnxg9A2spybDyYjwW39Y
fa01XqkoiIiCyGpD033333HYYOHYr77rsPvr6+iImJwapVq9q1b3R0NPz9/XHHHXdgz549bbbT6XTQarUtXtZuZF8vjAjuhQa9iOUce0NERNSCpOEmOzsby5cvx4ABA7Bt2zY88cQTePrpp/Hpp5+2uY+/vz9WrFiBb775Bt988w00Gg1uvfVWpKent9o+MTERSqXS+NJoNN11Oma1cHzT2Jv/HshHcSXH3hARETUTRAlXYnR2dsbQoUOxd+9e47ann34aBw4cQFpaWruPM3bsWPTu3Ruff/75dZ/pdDrodDrje61WC41Gg8rKSigUips7AYnd/3Ea9udcwpy4ILw+NVzqcoiIiLqNVquFUqls1+9vSXtu/P39MXjw4BbbQkNDkZeX16HjDB8+HFlZrQ+ulcvlUCgULV62YuHv8958uT8fJZV1EldDRERkGSQNN6NHj8bp06dbbDtz5gyCgoI6dJzMzEz4+/t3ZWlWIa6fF4b16Yl6vQErdnHsDRERESBxuHn22Wexb98+vPXWW8jKysL69euxcuVKLFiwwNhm0aJFmDNnjvH9+++/j02bNiErKwvHjh3DwoUL8dNPP7XYx14IgmCc92b9/jyUatl7Q0REJGm4GTZsGJKSkrBhwwaEh4dj6dKleP/99zF79mxjm+Li4ha3qerr6/HnP/8ZERERGDt2LA4fPozU1FSMGzdOilOQ3Oj+XhgS1BP1jey9ISIiAiQeUCyFjgxIsha/nLmAOav3Q+4ow+6/3AZfhYvUJREREXUpqxlQTF1jzABvxPT2hK7RgI9/yZa6HCIiIkkx3NiAprE3TU9OffFbLi5U6W6wBxERke1iuLERYwf6IErjiboGA1b+wrE3RERkvxhubIQgCFg4vqn35vN9ubhYzd4bIiKyTww3NuTWgT6IClSirsGAVRx7Q0REdorhxoYIgoBnfu+9+SwtF+XsvSEiIjvEcGNjbhvki8hAJa406LFqd47U5RAREZkdw42NEQQBT9/e3HtzHpdq6iWuiIiIyLwYbmzQuFBfhAcoUFuvx6rdHHtDRET2heHGBrXovdl7HpfZe0NERHaE4cZG3THYD4P9Faip1+M/v7L3hoiI7AfDjY0SBAFP/z5r8ad7c1FRy94bIiKyDww3NuzOwX4IUXmgWteIT37lk1NERGQfGG5smEx2ddbitXvOo7K2QeKKiIiIuh/DjY27c7AKISoPVOka8cke9t4QEZHtY7ixcTLZ1bE3a/bkoPIKe2+IiMi2MdzYgQlhKgzy80BVXSPWsPeGiIhsHMONHZDJBDw1rj8AYPWvOdDWsfeGiIhsF8ONnbgr3B8DfHtAW9eItXvOS10OERFRt2G4sRNNvTdNY28+Ye8NERHZMIYbOzIpwh/9fNxReaUBn7L3hoiIbBTDjR1xuObJqf/8moMq9t4QEZENYrixM5Mj1ej7e+/NZ2m5UpdDRETU5Rhu7IyD7OqK4at2Z6Na1yhxRURERF2L4cYO3R2lRl9vd1TUNuCztPNSl0NERNSlGG7skINMwJO3N817s+qXbNSw94aIiGwIw42dmhKlRh8vN1yubcDn+zj2hoiIbAfDjZ1ydJDhyeaxN79ko7aevTdERGQbGG7sWEK0GkFebiivqcc69t4QEZGNYLixY44OMiy4rWnszcpfsnGlXi9xRURERDeP4cbOTYsJgKaXKy5W1+OL39h7Q0RE1k/ycFNYWIgHHngAXl5ecHV1RUREBA4ePGhyn507dyI2NhZyuRz9+/fH2rVrzVOsDXJykOHJ33tvVuxi7w0REVk/ScPN5cuXMXr0aDg5OeHHH3/EiRMn8O6776Jnz55t7pOTk4NJkybhtttuQ2ZmJhYuXIj58+dj27ZtZqzctkyPDURgT1dcrNbhlU3HIIqi1CURERF1miBK+JvsxRdfxJ49e7B79+527/PCCy9gy5YtOHbsmHHbzJkzUVFRga1bt95wf61WC6VSicrKSigUik7VbYt2n72Aua
v3wyACr0wejIduCZa6JCIiIqOO/P7uVM9Nfn4+CgoKjO/379+PhQsXYuXKlR06znfffYehQ4fivvvug6+vL2JiYrBq1SqT+6SlpWH8+PEttsXHxyMtLa3V9jqdDlqttsWLrjdmgA/+elcoAOCNLSfwy5kLEldERETUOZ0KN//v//0//PzzzwCAkpIS3HHHHdi/fz9eeuklvP766+0+TnZ2NpYvX44BAwZg27ZteOKJJ/D000/j008/bXOfkpIS+Pn5tdjm5+cHrVaLK1euXNc+MTERSqXS+NJoNO2uz948fEsw7h0SCIMIPLk+HTkXa6QuiYiIqMM6FW6OHTuG4cOHAwA2btyI8PBw7N27F1988UWHBvcaDAbExsbirbfeQkxMDB599FE88sgjWLFiRWfKatWiRYtQWVlpfOXn53fZsW2NIAh4c1o4Ynt7QlvXiEc+OwhtXYPUZREREXVIp8JNQ0MD5HI5ACA1NRVTpkwBAISEhKC4uLjdx/H398fgwYNbbAsNDUVeXl6b+6hUKpSWlrbYVlpaCoVCAVdX1+vay+VyKBSKFi9qm9zRASv+OAT+ShdklVVj4ZeZ0Bs4wJiIiKxHp8JNWFgYVqxYgd27dyMlJQUTJkwAABQVFcHLy6vdxxk9ejROnz7dYtuZM2cQFBTU5j5xcXHYsWNHi20pKSmIi4vrwBmQKb4eLlj5x6GQO8rw06kyvLPt9I13IiIishCdCjdvv/02Pv74Y9x6662YNWsWoqKiADQNEG6+XdUezz77LPbt24e33noLWVlZWL9+PVauXIkFCxYY2yxatAhz5swxvn/88ceRnZ2Nv/zlLzh16hQ++ugjbNy4Ec8++2xnToXaEBGoxN/ujQQArNh1DskZhRJXRERE1D6dfhRcr9dDq9W2mJPm/PnzcHNzg6+vb7uPs3nzZixatAhnz55FcHAwnnvuOTzyyCPGz+fNm4fz589j586dxm07d+7Es88+ixMnTiAwMBCLFy/GvHnz2vXz+Ch4x/xt6yl8tPMc5I4ybHwsDlEaT6lLIiIiO9SR39+dCjdXrlyBKIpwc3MDAOTm5iIpKQmhoaGIj4/vXNVmwnDTMQaDiEc/P4jUk2XwU8jx/ZO3wFfhInVZRERkZ7p9npupU6fis88+AwBUVFRgxIgRePfdd5GQkIDly5d35pBkoWQyAf+YEY0Bvj1QqtXh0c8Poa6BSzQQEZHl6lS4SU9Px5gxYwAAX3/9Nfz8/JCbm4vPPvsM//rXv7q0QJKeh4sTVs0ZCqWrEzLzK/BSEpdoICIiy9WpcFNbWwsPDw8AwPbt2zF9+nTIZDKMHDkSublcWdoW9fF2x0ezY+EgE/BNegE++TVH6pKIiIha1alw079/fyQnJyM/Px/btm3DnXfeCQAoKyvjOBYbNrq/N16e1LREw1s/nMTO02USV0RERHS9ToWbV155Bc8//zz69OmD4cOHG+eY2b59O2JiYrq0QLIs80b1wYyhGhhE4KkNGTh3oVrqkoiIiFro9KPgJSUlKC4uRlRUFGSypoy0f/9+KBQKhISEdGmRXYlPS908XaMes1f9hoO5l9HXxx1JfxoNpauT1GUREZEN6/ZHwa/VvDp4YGDgzRzGbBhuusaFKh2mfvgriirrMHagD1bPGwYHmSB1WUREZKO6/VFwg8GA119/HUqlEkFBQQgKCoKnpyeWLl0Kg8HQqaLJuvh4yLFyzlC4OMmw68wFvL31lNQlERERAehkuHnppZfw4YcfYtmyZcjIyEBGRgbeeustfPDBB1i8eHFX10gWKjxAib/f17T0xspfsvHNoQKJKyIiIurkbSm1Wo0VK1YYVwNvtmnTJvzpT39CYaHlrkPE21Jd793tp/HBT1lwdpThv4+OREzvnjfeiYiIqAO6/bbUpUuXWh00HBISgkuXLnXmkGTFnh0/EHcM9kN9owGPfX4IJZV1UpdERER2rFPhJioqCh9++OF12z/88ENERkbedFFkXZqXaB
jo1wNlVTo89vlBLtFARESS6dRtqV27dmHSpEno3bu3cY6btLQ05Ofn44cffjAuzWCJeFuq++SV12LKv39FRW0DEqLV+MeMaAgCn6AiIqKb1+23pcaOHYszZ85g2rRpqKioQEVFBaZPn47jx4/j888/71TRZP16e7kZl2hIzizCyl+ypS6JiIjs0E3Pc3Otw4cPIzY2Fnq95d6SYM9N9/s87TwWbzoOQQBWzx2G20J8pS6JiIisXLf33BCZ8sDIIMwa3huiCDy9IQNZZVVSl0RERHaE4Ya6nCAIWDIlDMP79EKVrhHzPz2IytoGqcsiIiI7wXBD3cLZUYblD8QiwNMV58tr8eSGdDTqOXs1ERF1P8eONJ4+fbrJzysqKm6mFrIxXj3kWDVnKO5Zvhe7z17E0s0n8OrdYZBxDSoiIupGHQo3SqXyhp/PmTPnpgoi2zJYrcB790fhiS/S8WlaLs6X1+Ld+6Pg3UMudWlERGSjuvRpKWvAp6WksfFgPhYnH4Ou0QBfDznenxGNUf29pS6LiIisBJ+WIotz/1ANvnvyFgzwbZrFePYnv+Hd7ac5DoeIiLocww2ZzSCVB7578hbMHKaBKAIf/JSFWav2oajiitSlERGRDWG4IbNydXbAsnsi8a9ZMeghd8SB85cx8Z+7sf14idSlERGRjWC4IUlMiVJjy9O3IDJQicorDXj080N47bvj0DVa7uzWRERkHRhuSDJBXu74+vFReGRMMABg7d7zmP7RXmRfqJa4MiIismYMNyQpZ0cZXpo0GGvmDUMvd2ccL9Ji8ge/4tv0AqlLIyIiK8VwQxbhthBf/PD0GIzs2wu19Xo8t/Ew/rzxMGp0jVKXRkREVobhhiyGSumCL+aPxHN3DIRMAL5JL8DdH/yK40WVUpdGRERWhOGGLIqDTMDT4wZgwyMjoVK4IPtiDaZ9tBefpZ2Hnc03SUREncRwQxZpRF8v/PjMGIwP9UV9owGvbDqOxz4/hIraeqlLIyIiCydpuHnttdcgCEKLV0hISJvt165de117FxcXM1ZM5tTT3Rmr5gzFK5MHw8lBwPYTpZj0r19x8PwlqUsjIiIL1qGFM7tDWFgYUlNTje8dHU2XpFAocPr0aeN7QeAK07ZMEAQ8dEswhvXphac2pON8eS1mrNyH5+4YiMfH9oMDVxgnIqL/IXm4cXR0hEqland7QRA61J5sQ0SgEpufHoOXk44iObMI72w7jb3nLuIfM6Lh68HeOyIiukryMTdnz56FWq1G3759MXv2bOTl5ZlsX11djaCgIGg0GkydOhXHjx832V6n00Gr1bZ4kXXqIXfEP2ZE42/3RsLVyQF7sspx1z93Y9eZC1KXRkREFkTScDNixAisXbsWW7duxfLly5GTk4MxY8agqqqq1faDBg3C6tWrsWnTJqxbtw4GgwGjRo1CQUHbE74lJiZCqVQaXxqNprtOh8xAEATcP1SD758ajRCVBy5W12Pu6v1I/PEkGrjCOBERARBEC3q+tqKiAkFBQXjvvffw8MMP37B9Q0MDQkNDMWvWLCxdurTVNjqdDjqdzvheq9VCo9GgsrISCoWiy2on86tr0OONLSewbl9Tb99Avx54YUIIbg/x5VgsIiIbo9VqoVQq2/X7W/LbUtfy9PTEwIEDkZWV1a72Tk5OiImJMdleLpdDoVC0eJFtcHFywBsJEVg+OxZKVyecKa3Gw58exIyP9+FQ7mWpyyMiIolYVLiprq7GuXPn4O/v3672er0eR48ebXd7sk0TI/zxy//dhsfH9oPcUYb95y/hnuV78djnB5FVxkU4iYjsjaTh5vnnn8euXbtw/vx57N27F9OmTYODgwNmzZoFAJgzZw4WLVpkbP/6669j+/btyM7ORnp6Oh544AHk5uZi/vz5Up0CWQilmxNenBiCnf93K2YM1UAmANuOlyL+/V+w6NsjKNXWSV0iERGZiaSPghcUFGDWrFkoLy+Hj48PbrnlFuzbtw8+Pj4AgL
y8PMhkV/PX5cuX8cgjj6CkpAQ9e/bEkCFDsHfvXgwePFiqUyAL4690xdv3RmL+mGD8bdtppJwoxYb9+UjKKMRDo4Px2Nh+ULo6SV0mERF1I4saUGwOHRmQRNbv4PlLSPzxlHEMjqebE568rT8eGBkEFycHiasjIqL26sjvb4YbsnmiKCL1ZBne3nrKOAYnwNMVz90xEAkxAZzlmIjICjDcmMBwY78a9QZ8m16I91LOoOT3MTghKg+8MCEEtw7y4ePjREQWjOHGBIYbqmvQY+3e8/j3z1moqmsEAIwI7oUXJ4YgpndPiasjIqLWMNyYwHBDzSpq6/HRznNYu/c86hubZjeeGK7C8/GD0M+nh8TVERHRtRhuTGC4of9VWHEF/0g5g2/SCyCKgINMwIxhGiwcNwC+Ci7KSURkCRhuTGC4obacLqnCO9tOIfVkGQDA1ckB88cE49E/9IWHCx8fJyKSEsONCQw3dCP7cy5h2Y8nkZ5XAQDo6eaEJ28fgNkjevPxcSIiiTDcmMBwQ+0hiiK2nyjF37aewrkLNQAAL3dn/DEuCH8cGQSvHnKJKyQisi8MNyYw3FBHNOoN+PpQAT74KQuFFVcAAHJHGabHBmL+mGAOPCYiMhOGGxMYbqgzGvUG/HisBKt2Z+NIQaVx+/hQX8wf0xcjgntxnhwiom7EcGMCww3dDFEUsT/nElbtzkHqyVLj9ogAJR75Q1/cFa6Co4Ok69ESEdkkhhsTGG6oq5y7UI1Pfs3BN4cKoPt9npwAT1c8OLoPZgzT8AkrIqIuxHBjAsMNdbXyah3W7cvDZ2nnUV5TDwDwkDti1ojemDeqD9SerhJXSERk/RhuTGC4oe5S16BHUkYhVu3ORvbvT1g5ygRMjvTH/DF9ER6glLhCIiLrxXBjAsMNdTeDQcTPp8uwanc29mVfMm6P6+uFR//QF2MH+kDGlciJiDqE4cYEhhsyp6MFlVi1OxtbjhZDb2j6qvX37YH5twQjISaAkwISEbUTw40JDDckhcKKK1i7Jwcb9uejWte0Erl3D2fMieuDB0YGoZe7s8QVEhFZNoYbExhuSEraugb8d38+Vu/JQXFlHQDAxalpUsDZI3ojTM1xOURErWG4MYHhhixBg96AH44WY9XubBwr1Bq3RwYqMXNYb0yJVqOH3FHCComILAvDjQkMN2RJRFHEvuxLWLcvF9tPlKBB3/R1dHN2wN2RaswYrkGMxpOzHxOR3WO4MYHhhixVebUO36YXYsOBPOOj5AAwyM8DM4drMC0mAJ5uHJtDRPaJ4cYEhhuydKIo4sD5y/hyfx62HC02zn7s7CjDXeEqzBzem2tZEZHdYbgxgeGGrEllbQM2HS7Ehv35OFl8dWxOX293zBimwT1DAuHdQy5hhURE5sFwYwLDDVkjURRxpKASXx7Iw3eZRaip1wNomgH5jsF+mDm8N8b09+bkgERksxhuTGC4IWtXo2vE94eL8OWBfGTmVxi3B3i6YsYwDe4bGgh/JdezIiLbwnBjAsMN2ZKTxVr890A+vk0vgLauaXJAmQDcNsgXM4f3xm2DfODoIJO4SiKim8dwYwLDDdmiugY9fjxWjA3787E/5+p6Vr4ectw3NBAzhvZGby83CSskIro5DDcmMNyQrcu+UI3/HsjH14cKUF5Tb9w+IrgX7h0SiLsi/OHOCQKJyMow3JjAcEP2or7RgNSTpdiwPw+/Zl1E8zfdzdkBE8P9ce+QQIwI7sVByERkFRhuTGC4IXtUXHkF36YX4ptDBci+eHWCQE0vV9wTG4h7YgOh6cXbVkRkuRhuTGC4IXsmiiLS8yrw9aECbD5chKrfVygHgJF9e+HeIRpMDFfxthURWZyO/P6W9DGK1157DYIgtHiFhISY3Oerr75CSEgIXFxcEBERgR9++MFM1RJZP0EQMCSoJxKnR2D/S+Pxz5nRGDPAG4IA7Mu+hOe/Ooxhb6bi+a8O47fschgMdvVvHyKyEZL/8ywsLAypqa
nG946ObZe0d+9ezJo1C4mJiZg8eTLWr1+PhIQEpKenIzw83BzlEtkMV2cHTI0OwNToABRVXMG36QX4+lABzpfX4utDTX/u3csN98QGYnpsAG9bEZHVkPS21GuvvYbk5GRkZma2q/2MGTNQU1ODzZs3G7eNHDkS0dHRWLFiRbuOwdtSRG0TRRGHci833bY6Uozqa25bxfX1wr1DAjExQgU3Z8n/XUREdsZqbksBwNmzZ6FWq9G3b1/Mnj0beXl5bbZNS0vD+PHjW2yLj49HWlpam/vodDpotdoWLyJqnSAIGNqnF5bdE4kDL43H+zOiMbq/FwQBSMsux5+/Ooxhb6TiL18fxv6cS7CzIXtEZCUk/efXiBEjsHbtWgwaNAjFxcVYsmQJxowZg2PHjsHDw+O69iUlJfDz82uxzc/PDyUlJW3+jMTERCxZsqTLayeyda7ODkiICUBCTAAKLtciKb0QX6cXILe8FhsPFmDjwabbVvcOCcS0GN62IiLLYVFPS1VUVCAoKAjvvfceHn744es+d3Z2xqeffopZs2YZt3300UdYsmQJSktLWz2mTqeDTqczvtdqtdBoNLwtRdQJoijiYO5lfH2wAJuPXF3AEwCGBPVEQrQakyLV6OXuLGGVRGSLOnJbyqJunHt6emLgwIHIyspq9XOVSnVdiCktLYVKpWrzmHK5HHK5vEvrJLJXgiBgWJ9eGNanF16dMhhbj5Xgm/QC7D1XjkO5l3Eo9zKWfH8CYwf6YGpMAO4I9YOrs4PUZRORnZF8zM21qqurce7cOfj7+7f6eVxcHHbs2NFiW0pKCuLi4sxRHhFdw83ZEdNjA/HF/JFIe3EcXrorFGFqBRoNInacKsPTGzIw9I0UPPffTOw6cwGNeoPUJRORnZD0ttTzzz+Pu+++G0FBQSgqKsKrr76KzMxMnDhxAj4+PpgzZw4CAgKQmJgIoOlR8LFjx2LZsmWYNGkSvvzyS7z11lsdehScT0sRda+ssiokZxRh0+FC5F+6Ytzu3UOOyZH+SIgJQFSgEoLAZR+IqP2s5rZUQUEBZs2ahfLycvj4+OCWW27Bvn374OPjAwDIy8uDTHa1c2nUqFFYv349Xn75Zfz1r3/FgAEDkJyczDluiCxIf18PPB8/CH++cyDS8y4jOaMIm48U4WK1Dmv3nsfavecR7O2OKVFqJMQEINjbXeqSicjGWNSAYnNgzw2R+TXoDdh99gKSM4qw/UQJ6hqu3qKK0ngiIVqNyZFq+HhwfBwRtY5rS5nAcEMkrRpdI7afKEFyRhF+zboI/e9LPDjIBIzu742EaDXuDFOhB9e3IqJrMNyYwHBDZDkuVOmw+UgRkjOLcDi/wrjdxUmGOwarkBCtxh8G+sDJwaKefSAiCTDcmMBwQ2SZci7WYFNmITZlFiHnYo1xe083J0yK9EdCdABie/eETMaByET2iOHGBIYbIssmiiKOFFQiObMQ3x8uxsXqq5NwBni6Ymq0GlOjAzBIdf0s5kRkuxhuTGC4IbIejXoD9p4rx6bMImw7XtJiIc8QlQcSYgJwd5QaAZ6uElZJRObAcGMCww2Rdapr0GPHyTIkZxZi5+kyNOiv/tU1PLgXpkarMSnCH55uXPqByBYx3JjAcENk/SprG/DDsWJsyizEbzmX0Py3mJOD0LT0Q3QAxnPpByKbwnBjAsMNkW0pqriC7w8XYVNmEU4Ua43b3Z0dEB+mwtSYAIzu5wVHPnFFZNUYbkxguCGyXWdLq7Aps7WlH5wxOVKNKdFqxGg8ufQDkRViuDGB4YbI9omiiPS8CmzKLMTmI8W4VFNv/Kx3LzfjE1f9fXtIWCURdQTDjQkMN0T2pUFvwK9ZF7EpoxDbT5Sitl5v/Cw8QIGpUQGYEq2Gn8JFwiqJ6EYYbkxguCGyX7X1jUg5UYrvMouw68wFNP6+9IMgAKP7eSMhJgATwrn0A5ElYrgxgeGGiADgUk09thwtxqaMQhzMvWzczqUfiCwTw40JDD
dE9L/yL9ViU2Yhvs0oRPaFq0s/9HJ3xuRIfyTEBHAgMpHEGG5MYLghoraIoohjhVokZRTiu8NFLZZ+CPJyQ0J0ABJiAhDs7S5hlUT2ieHGBIYbImqPRr0Be86VIzmjEFuPleBKw9WByNEaT0yLCcDkSH949ZBLWCWR/WC4MYHhhog6qkbXNBA5KaMQu89ewO/jkOEgE/CHAU0Dke8crOKMyETdiOHGBIYbIroZZVV12Hy4GMmZhThSUGnc7u7sgPhwFabFBGBUP284yDg+h6grMdyYwHBDRF0lq6wamzILkZRRiILLV2dE9vWQY0qUGgkxAQhTKzgQmagLMNyYwHBDRF2taUbky0jKaJoRuaK2wfhZf98emB4bgIToAKg9XSWsksi6MdyYwHBDRN2pvtGAXWcuIDmjECknS1HfaADQNFFgXF8vTIsJwMQIf04USNRBDDcmMNwQkblo6xrw49FifJNeiP05l4zbXZxkiA9rGp9zS39vrlhO1A4MNyYw3BCRFPIv1SI5o2l8TvbFqxMF+njIMTVKjemxgRis5t9JRG1huDGB4YaIpCSKIjLzK5CUUYjvDxfh8jXjc0JUHpgW0zRRIBfyJGqJ4cYEhhsishT1jQbsPF2GpIxC7DhZhnp90/gcmQCM7u+NaTEBiA9TwZ3jc4gYbkxhuCEiS1RZ24DNR4uQlN5yIU83ZwdMCFNhWiznzyH7xnBjAsMNEVm63PIaJP0+Pie3vNa43U8hR0J0AKbFBiBExb+/yL4w3JjAcENE1qJp/pwKfJtegM1HilF55er4nMH+CkyPDcCUKDV8OT6H7ADDjQkMN0RkjXSNevx8qgzfphfi59NlaNA3/dUtE4BR/bwxNVqNCeEqeLg4SVwpUfdguDGB4YaIrN3lmnpsPlKEpIxCpOdVGLfLHWUYH+qHqdFq3DrIF86OnD+HbAfDjQkMN0RkS/LKa7EpsxDJmYU4d+Hq/DlKVyfcFeGPhGg1hvXpBRkHIpOV68jvb4uJ9cuWLYMgCFi4cGGbbdauXQtBEFq8XFx4r5mI7FdvLzc8NW4AUp8bi81P3YL5twTD10OOyisN2LA/DzNW7sOYv/2MZT+ewqkSrdTlEpmFRUyecODAAXz88ceIjIy8YVuFQoHTp08b33O1XSKipr8LwwOUCA9QYtFdodiXXY7kjEJsPVaCwoorWLHrHFbsOocQlQemRgdgSrQaAVzIk2yU5OGmuroas2fPxqpVq/DGG2/csL0gCFCpVGaojIjIOjnIBIzu743R/b2xNCEcP50qQ3JG00DkUyVVOLX1FN7eegrDg3shIToAd0Wo4OnmLHXZRF1G8ttSCxYswKRJkzB+/Ph2ta+urkZQUBA0Gg2mTp2K48ePm2yv0+mg1WpbvIiI7IWLkwPuivDHyjlDcfClO5A4PQIjgnsBAPbnXMJfk45i2JupeOSzg9hypBh1DXqJKya6eZL23Hz55ZdIT0/HgQMH2tV+0KBBWL16NSIjI1FZWYm///3vGDVqFI4fP47AwMBW90lMTMSSJUu6smwiIqukdHPCrOG9MWt4bxRVXMF3h4uQnFGIUyVVSDlRipQTpeghd8SEcBUSogMQ18+LMyKTVZLsaan8/HwMHToUKSkpxrE2t956K6Kjo/H++++36xgNDQ0IDQ3FrFmzsHTp0lbb6HQ66HQ643utVguNRsOnpYiIfne6pArJmYX4LrMIhRVXjNt9PeSYEqXGtNgADPZXcIwjScoqHgVPTk7GtGnT4ODgYNym1+shCAJkMhl0Ol2Lz9py3333wdHRERs2bGjXz+Wj4ERErTMYRBzMvYzkzEL8cLQYFdesWD7QrwemxQRiarQaag5EJglYRbipqqpCbm5ui20PPvggQkJC8MILLyA8PPyGx9Dr9QgLC8Ndd92F9957r10/l+GGiOjG6hsN2HXmApIyCpB6sgz1jU0rlgsCMDLYC9NiAjAhQgUFZ0QmM+nI72/Jxtx4eHhcF2Dc3d3h5e
Vl3D5nzhwEBAQgMTERAPD6669j5MiR6N+/PyoqKvDOO+8gNzcX8+fPN3v9RES2zNlRhjsG++GOwX6ovNKAH48WIymjEL/lXEJadjnSssuxeNMxjB/sh+kxAfjDQB84OUj+jAoRAAt4FNyUvLw8yGRXvyyXL1/GI488gpKSEvTs2RNDhgzB3r17MXjwYAmrJCKybUpXJ8wc3hszh/dGweVabMpsWvohq6waW44UY8uRYvRyd8bdkf5IiAlAtMaT43NIUlx+gYiIOkwURRwv0uLb9EJ8d7gIF6uvPrgR7O2OhOgATIsJQG8vNwmrJFtiFWNupMJwQ0TUtRr1BvyadRHJGYXYdrwUV66ZK2dIUE9MiwnApAh/9HTnRIHUeQw3JjDcEBF1nxpdI7YdL0FSRiH2ZF2E4fffME4OAm4b5ItpMQG4PdQXcscbPw1LdC2GGxMYboiIzKNUW4fvDxfh2/RCnCi+Oju8wsURkyL9kRAdwBXLqd0YbkxguCEiMr/TJVVIyijEpsxCFFfWGbcHeLri7ig1EmLUCFHx72RqG8ONCQw3RETSMRhE7MspR1J604rlVbpG42chKg9MiVZjSpQagT05EJlaYrgxgeGGiMgy1DXo8dOpMmzKLMTPpy6gXm8wfja8Ty9MiVZzIDIZMdyYwHBDRGR5Kmsb8OOxYiRnNk0UKF4zEHnsQB9MjQ7A+FA/uDpzILK9YrgxgeGGiMiyFVdewfeHi5CcUdRiILK7swPiw1SYGhOA0f284MgZke0Kw40JDDdERNbjbGkVNmUWYdPhQuRfurpiuXcPZ0yOVGNqtJozItsJhhsTGG6IiKyPKIpIz7uMTZlF2HykGJdq6o2f9fFyw5ToACREq9HXp4eEVVJ3YrgxgeGGiMi6NegN+PXsRSRnFmL7/8yIHBGgxNTfn7jyVbhIWCV1NYYbExhuiIhsR42uEaknS5GcUYhfzl6E/vcpkWUCMKqfN6bFBGBCuArucoteJ5rageHGBIYbIiLbVF6tw5ajxdiUWYRDuZeN212dHBAf5odpsYG4pb83HDgjslViuDGB4YaIyPblldciObMQ36YX4Hx5rXG7j4ccU6PUmBYbgMH+Cg5EtiIMNyYw3BAR2Q9RFJGRX4HkjEJ8f7gIl2sbjJ8N8vPAtNgAJEQHQKXk+BxLx3BjAsMNEZF9qm80YOfpMiRlFGLHyTLjjMiCAIzq54VpMYGYEK5CD47PsUgMNyYw3BARUWVtA7YcLUZSRgEOnL86PsfFSYb4MBWmxQTglv7enCjQgjDcmMBwQ0RE18q/VIukjEIkZRQi52KNcbuPhxxTotSYFhOAMDXH50iN4cYEhhsiImqNKIrIzK9AUivjcwb69cC0mEAkxKjhr3SVsEr7xXBjAsMNERHdSH2jAbvOXEBSRgFST5ahvvHq+Jy4vl6YFhOA+HAVFC5OEldqPxhuTGC4ISKijqi80oAfjhYjKb0Q+89fMm53dpDhDwN9MDnSH+NCfeHBoNOtGG5MYLghIqLOyr9Ui+SMQiRnFuLchavjc5wdZbh1oA8mRfpjXKgfn7jqBgw3JjDcEBHRzRJFEWdKq7HlSNNCntnXDESWO8pw2yBfTIr0x+0hvlz6oYsw3JjAcENERF1JFEWcKqnCliPF2HykqMWMyC5OMtwe4otJEWrcFuIDN2cGnc5iuDGB4YaIiLqLKIo4Uaz9PegUI+/S1aDj6uSA20N9MTnCH7cO8oWrs4OElVofhhsTGG6IiMgcRFHE8SItNh8pxpajRci/dMX4mZuzA8aF+mFShD9uHeQDFycGnRthuDGB4YaIiMxNFEUcLaw09ugUVlwNOu7ODhg/uCno/GEgg05bGG5MYLghIiIpiaKIwwWV2HKkCFuOFKOoss74WQ+5I+64Jug4O3L5h2YMNyYw3BARkaUwGERkFlRgy5Fi/HC0GMXXBJ2ebk6YHKnGtNgAxGg87X75B4YbExhuiIjIEhkMIjLyL+P7w8XYcrQYF6
p0xs+Cvd2REB2AhBg1grzcJaxSOgw3JjDcEBGRpWvUG7DnXDmS0guw7XgprjTojZ8NCeqJaTEBmBzpD083ZwmrNK+O/P62mJt5y5YtgyAIWLhwocl2X331FUJCQuDi4oKIiAj88MMP5imQiIjITBwdZBg70Afvz4zBwZfH4737ozBmgDdkAnAo9zJeTj6GYW+m4tHPDmLrsWLoGvU3PqgdsYjZhA4cOICPP/4YkZGRJtvt3bsXs2bNQmJiIiZPnoz169cjISEB6enpCA8PN1O1RERE5uMud8T02EBMjw1EqbYO32UW4duMQpws1mL7iVJsP1EKhYsjJkWqMT02AEODetr9+BzJb0tVV1cjNjYWH330Ed544w1ER0fj/fffb7XtjBkzUFNTg82bNxu3jRw5EtHR0VixYkW7fh5vSxERkS04VaJFUkYhNmUUoUR7dSCyppcrpkUHICEmAH19ekhYYdeyqttSCxYswKRJkzB+/Pgbtk1LS7uuXXx8PNLS0trcR6fTQavVtngRERFZuxCVAosmhmLPi7dj/fwRuHdIINydHZB/6Qr+9VMWbn93F6b+ew8+3Xse5dW6Gx/Qhkh6W+rLL79Eeno6Dhw40K72JSUl8PPza7HNz88PJSUlbe6TmJiIJUuW3FSdRERElspBJmBUf2+M6u+NpVPDkXKyFEnpBfjl7EUczq/A4fwKLN18AmMH+mBabADGh/rZ/ESBkoWb/Px8PPPMM0hJSYGLi0u3/ZxFixbhueeeM77XarXQaDTd9vOIiIik4ursgClRakyJUuNClQ6bjxQhKaMQRwoqseNUGXacKoOH3BHx4SpMDFdhdH9vmww6koWbQ4cOoaysDLGxscZter0ev/zyCz788EPodDo4OLT8D65SqVBaWtpiW2lpKVQqVZs/Ry6XQy6Xd23xREREFs7HQ44HRwfjwdHByCqrRnJGIZIyClFYcQVfHyrA14cK4O7sgNtCfDEhXIVbB/mih9winjO6aZINKK6qqkJubm6LbQ8++CBCQkLwwgsvtPr004wZM1BbW4vvv//euG3UqFGIjIzkgGIiIqIbMBhEHMy9jB+OFmPb8ZIWMyI7O8rwhwHeiA9TYXyoH3q6W9YcOlY7id+tt97a4mmpOXPmICAgAImJiQCaHgUfO3Ysli1bhkmTJuHLL7/EW2+91aFHwRluiIiImta4OlJQia3HS7D1WAlyLtYYP3OQCRjZtxcmhKlwZ5gKforuGz7SXh35/W3R/U95eXmQya4+0DVq1CisX78eL7/8Mv76179iwIABSE5O5hw3REREHSQIAqI0nojSeOIv8YNwtqwaW481BZ0TxVrsySrHnqxyLN50HLG9PTEx3B/xYSr09nKTuvQbsqieG3Ngzw0REZFpueU12PZ7j056XkWLzwb7KzAhXIUJ4SoM8O1htgkDrfa2lDkw3BAREbVfqbYO24+XYOvxEuzLvgS94Wps6OvtjvhwFSaEqRAZqOzWoMNwYwLDDRERUedcrqlH6slSbDtegl/OXkR9o8H4mVrpgjvDmnp0hvXpBQdZ1wYdhhsTGG6IiIhuXrWuETtPl2HrsRL8fKoMNfVXF+/s4+WGn5+/tUt7cmxmQDERERFZph5yR0yOVGNypBp1DXrsybqIrcdKkHKyFNEaT0kX72S4ISIiopvi4uSAcaF+GBfqh0a9Adq6RknrkXzhTCIiIrIdjg4y9JJ4AkCGGyIiIrIpDDdERERkUxhuiIiIyKYw3BAREZFNYbghIiIim8JwQ0RERDaF4YaIiIhsCsMNERER2RSGGyIiIrIpDDdERERkUxhuiIiIyKYw3BAREZFNYbghIiIim+IodQHmJooiAECr1UpcCREREbVX8+/t5t/jpthduKmqqgIAaDQaiSshIiKijqqqqoJSqTTZRhDbE4FsiMFgQFFRETw8PCAIQpceW6vVQqPRID8/HwqFokuPbWl4rrbLns6X52q77Ol87eVcRVFEVVUV1Go1ZD
LTo2rsrudGJpMhMDCwW3+GQqGw6f/BrsVztV32dL48V9tlT+drD+d6ox6bZhxQTERERDaF4YaIiIhsCsNNF5LL5Xj11Vchl8ulLqXb8Vxtlz2dL8/VdtnT+drTubaX3Q0oJiIiItvGnhsiIiKyKQw3REREZFMYboiIiMimMNwQERGRTWG46aB///vf6NOnD1xcXDBixAjs37/fZPuvvvoKISEhcHFxQUREBH744QczVdp5iYmJGDZsGDw8PODr64uEhAScPn3a5D5r166FIAgtXi4uLmaq+Oa89tpr19UeEhJich9rvK4A0KdPn+vOVRAELFiwoNX21nRdf/nlF9x9991Qq9UQBAHJycktPhdFEa+88gr8/f3h6uqK8ePH4+zZszc8bke/8+Zi6nwbGhrwwgsvICIiAu7u7lCr1ZgzZw6KiopMHrMz3wVzuNG1nTdv3nV1T5gw4YbHtcRre6Nzbe37KwgC3nnnnTaPaanXtTsx3HTAf//7Xzz33HN49dVXkZ6ejqioKMTHx6OsrKzV9nv37sWsWbPw8MMPIyMjAwkJCUhISMCxY8fMXHnH7Nq1CwsWLMC+ffuQkpKChoYG3HnnnaipqTG5n0KhQHFxsfGVm5trpopvXlhYWIvaf/311zbbWut1BYADBw60OM+UlBQAwH333dfmPtZyXWtqahAVFYV///vfrX7+t7/9Df/617+wYsUK/Pbbb3B3d0d8fDzq6uraPGZHv/PmZOp8a2trkZ6ejsWLFyM9PR3ffvstTp8+jSlTptzwuB35LpjLja4tAEyYMKFF3Rs2bDB5TEu9tjc612vPsbi4GKtXr4YgCLjnnntMHtcSr2u3Eqndhg8fLi5YsMD4Xq/Xi2q1WkxMTGy1/f333y9OmjSpxbYRI0aIjz32WLfW2dXKyspEAOKuXbvabLNmzRpRqVSar6gu9Oqrr4pRUVHtbm8r11UURfGZZ54R+/XrJxoMhlY/t9brCkBMSkoyvjcYDKJKpRLfeecd47aKigpRLpeLGzZsaPM4Hf3OS+V/z7c1+/fvFwGIubm5bbbp6HdBCq2d69y5c8WpU6d26DjWcG3bc12nTp0q3n777SbbWMN17WrsuWmn+vp6HDp0COPHjzduk8lkGD9+PNLS0lrdJy0trUV7AIiPj2+zvaWqrKwEAPTq1ctku+rqagQFBUGj0WDq1Kk4fvy4OcrrEmfPnoVarUbfvn0xe/Zs5OXltdnWVq5rfX091q1bh4ceesjkIrLWfF2b5eTkoKSkpMV1UyqVGDFiRJvXrTPfeUtWWVkJQRDg6elpsl1HvguWZOfOnfD19cWgQYPwxBNPoLy8vM22tnJtS0tLsWXLFjz88MM3bGut17WzGG7a6eLFi9Dr9fDz82ux3c/PDyUlJa3uU1JS0qH2lshgMGDhwoUYPXo0wsPD22w3aNAgrF69Gps2bcK6detgMBgwatQoFBQUmLHazhkxYgTWrl2LrVu3Yvny5cjJycGYMWNQVVXVantbuK4AkJycjIqKCsybN6/NNtZ8Xa/VfG06ct068523VHV1dXjhhRcwa9YskwsrdvS7YCkmTJiAzz77DDt27MDbb7+NXbt2YeLEidDr9a22t5Vr++mnn8LDwwPTp0832c5ar+vNsLtVwaljFixYgGPHjt3w/mxcXBzi4uKM70eNGoXQ0FB8/PHHWLp0aXeXeVMmTpxo/HNkZCRGjBiBoKAgbNy4sV3/IrJWn3zyCSZOnAi1Wt1mG2u+rtSkoaEB999/P0RRxPLly022tdbvwsyZM41/joiIQGRkJPr164edO3di3LhxElbWvVavXo3Zs2ffcJC/tV7Xm8Gem3by9vaGg4MDSktLW2wvLS2FSqVqdR+VStWh9pbmySefxObNm/Hzzz8jMDCwQ/s6OTkhJiYGWVlZ3VRd9/H09MTAgQPbrN3arysA5ObmIjU1FfPnz+/QftZ6XZuvTUeuW2e+85amOdjk5uYiJSXFZK9Na270Xb
BUffv2hbe3d5t128K13b17N06fPt3h7zBgvde1Ixhu2snZ2RlDhgzBjh07jNsMBgN27NjR4l+214qLi2vRHgBSUlLabG8pRFHEk08+iaSkJPz0008IDg7u8DH0ej2OHj0Kf3//bqiwe1VXV+PcuXNt1m6t1/Vaa9asga+vLyZNmtSh/az1ugYHB0OlUrW4blqtFr/99lub160z33lL0hxszp49i9TUVHh5eXX4GDf6LliqgoIClJeXt1m3tV9boKnndciQIYiKiurwvtZ6XTtE6hHN1uTLL78U5XK5uHbtWvHEiRPio48+Knp6eoolJSWiKIriH//4R/HFF180tt+zZ4/o6Ogo/v3vfxdPnjwpvvrqq6KTk5N49OhRqU6hXZ544glRqVSKO3fuFIuLi42v2tpaY5v/PdclS5aI27ZtE8+dOyceOnRInDlzpuji4iIeP35cilPokD//+c/izp07xZycHHHPnj3i+PHjRW9vb7GsrEwURdu5rs30er3Yu3dv8YUXXrjuM2u+rlVVVWJGRoaYkZEhAhDfe+89MSMjw/h00LJly0RPT09x06ZN4pEjR8SpU6eKwcHB4pUrV4zHuP3228UPPvjA+P5G33kpmTrf+vp6ccqUKWJgYKCYmZnZ4nus0+mMx/jf873Rd0Eqps61qqpKfP7558W0tDQxJydHTE1NFWNjY8UBAwaIdXV1xmNYy7W90f/HoiiKlZWVopubm7h8+fJWj2Et17U7Mdx00AcffCD27t1bdHZ2FocPHy7u27fP+NnYsWPFuXPntmi/ceNGceDAgaKzs7MYFhYmbtmyxcwVdxyAVl9r1qwxtvnfc124cKHxv4ufn5941113ienp6eYvvhNmzJgh+vv7i87OzmJAQIA4Y8YMMSsry/i5rVzXZtu2bRMBiKdPn77uM2u+rj///HOr/982n4/BYBAXL14s+vn5iXK5XBw3btx1/w2CgoLEV199tcU2U995KZk635ycnDa/xz///LPxGP97vjf6LkjF1LnW1taKd955p+jj4yM6OTmJQUFB4iOPPHJdSLGWa3uj/49FURQ//vhj0dXVVayoqGj1GNZyXbuTIIqi2K1dQ0RERERmxDE3REREZFMYboiIiMimMNwQERGRTWG4ISIiIpvCcENEREQ2heGGiIiIbArDDREREdkUhhsiIiKyKQw3RGT3BEFAcnKy1GUQURdhuCEiSc2bNw+CIFz3mjBhgtSlEZGVcpS6ACKiCRMmYM2aNS22yeVyiaohImvHnhsikpxcLodKpWrx6tmzJ4CmW0bLly/HxIkT4erqir59++Lrr79usf/Ro0dx++23w9XVFV5eXnj00UdRXV3dos3q1asRFhYGuVwOf39/PPnkky0+v3jxIqZNmwY3NzcMGDAA3333XfeeNBF1G4YbIrJ4ixcvxj333IPDhw9j9uzZmDlzJk6ePAkAqKmpQXx8PHr27IkDBw7gq6++Qmpqaovwsnz5cixYsACPPvoojh49iu+++w79+/dv8TOWLFmC+++/H0eOHMFdd92F2bNn49KlS2Y9TyLqIlIvS05E9m3u3Lmig4OD6O7u3uL15ptviqIoigDExx9/vMU+I0aMEJ944glRFEVx5cqVYs+ePcXq6mrj51u2bBFlMplYUlIiiqIoqtVq8aWXXmqzBgDiyy+/bHxfXV0tAhB//PHHLjtPIjIfjrkhIsnddtttWL58eYttvXr1Mv45Li6uxWdxcXHIzMwEAJw8eRJRUVFwd3c3fj569GgYDAacPn0agiCgqKgI48aNM1lDZGSk8c/u7u5QKBQoKyvr7CkRkYQYbohIcu7u7tfdJuoqrq6u7Wrn5OTU4r0gCDAYDN1REhF1M465ISKLt2/fvuveh4aGAgBCQ0Nx+PBh1NTUGD/fs2cPZDIZBg0aBA8PD/Tp0wc7duwwa81EJB323BCR5HQ6HUpKSlpsc3R0hLe3NwDgq6++wtChQ3HLLbfgiy++wP79+/HJJ5
8AAGbPno1XX30Vc+fOxWuvvYYLFy7gqaeewh//+Ef4+fkBAF577TU8/vjj8PX1xcSJE1FVVYU9e/bgqaeeMu+JEpFZMNwQkeS2bt0Kf3//FtsGDRqEU6dOAWh6kunLL7/En/70J/j7+2PDhg0YPHgwAMDNzQ3btm3DM888g2HDhsHNzQ333HMP3nvvPeOx5s6di7q6OvzjH//A888/D29vb9x7773mO0EiMitBFEVR6iKIiNoiCAKSkpKQkJAgdSlEZCU45oaIiIhsCsMNERER2RSOuSEii8Y750TUUey5ISIiIpvCcENEREQ2heGGiIiIbArDDREREdkUhhsiIiKyKQw3REREZFMYboiIiMimMNwQERGRTfn/aaDqQaPvUZcAAAAASUVORK5CYII=", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plt.plot(losses)\n", + "plt.ylabel('Loss')\n", + "plt.xlabel('Epoch')" + ] + }, + { + "cell_type": "markdown", + "id": "b6a53f16", + "metadata": {}, + "source": [ + "\n", + "# 13 - Summarize some Sentences!\n", + "\n", + "Below you can see an example of summarization of a sentence from the training set and a sentence from the test set. See if you notice anything interesting about them!" + ] + }, + { + "cell_type": "code", + "execution_count": 36, + "id": "2493b755", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Training set example:\n", + "[SOS] amanda: i baked cookies. do you want some? jerry: sure! amanda: i'll bring you tomorrow :-) [EOS]\n", + "\n", + "Human written summary:\n", + "[SOS] amanda baked cookies and will bring jerry some tomorrow. [EOS]\n", + "\n", + "Model written summary:\n", + "[SOS] amanda will bring some cookies [EOS]\n" + ] + } + ], + "source": [ + "training_set_example = 0\n", + "\n", + "# Check a summary of a document from the training set\n", + "print('Training set example:')\n", + "print(document[training_set_example])\n", + "print('\\nHuman written summary:')\n", + "print(summary[training_set_example])\n", + "print('\\nModel written summary:')\n", + "print(summarize(transformer, document[training_set_example]))" + ] + }, + { + "cell_type": "code", + "execution_count": 37, + "id": "15baaa47", + "metadata": { + "deletable": false, + "editable": false, + "tags": [ + "graded" + ] + }, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "Test set example:\n", + "[SOS] will: hey babe, what do you want for dinner tonight? emma: gah, don't even worry about it tonight will: what do you mean? everything ok? 
emma: not really, but it's ok, don't worry about cooking though, i'm not hungry will: well what time will you be home? emma: soon, hopefully will: you sure? maybe you want me to pick you up? emma: no no it's alright. i'll be home soon, i'll tell you when i get home. will: alright, love you. emma: love you too. [EOS]\n", + "\n", + "Human written summary:\n", + "[SOS] emma will be home soon and she will let will know. [EOS]\n", + "\n", + "Model written summary:\n", + "[SOS] emma will pick up with emma at home tonight [EOS]\n" + ] + } + ], + "source": [ + "test_set_example = 3\n", + "\n", + "# Check a summary of a document from the test set\n", + "print('Test set example:')\n", + "print(document_test[test_set_example])\n", + "print('\\nHuman written summary:')\n", + "print(summary_test[test_set_example])\n", + "print('\\nModel written summary:')\n", + "print(summarize(transformer, document_test[test_set_example]))" + ] + }, + { + "cell_type": "markdown", + "id": "aebd7ef5", + "metadata": {}, + "source": [ + "If you critically examine the output of the model, you can notice a few things:\n", + " - In the training set the model output is (almost) identical to the real output (already after 20 epochs and even more so with more epochs). This might be because the training set is relatively small and the model is relatively big and has thus learned the sentences in the training set by heart (overfitting).\n", + " - While the performance on the training set looks amazing, it is not so good on the test set. The model overfits, but fails to generalize. Again an easy candidate to blame is the small training set and a comparatively large model, but there might be a variety of other factors.\n", + " - Look at the test set example 3 and its summarization. Would you summarize it the same way as it is written here? Sometimes the data may be ambiguous. 
And the training of **your model can only be as good as your data**.\n", + "\n", + "Here you only use a small dataset to show that something can be learned in a reasonable amount of time in a relatively small environment. Generally, large transformers are trained on more than one task and on very large quantities of data to achieve superb performance. You will learn more about this in the rest of this course." + ] + }, + { + "cell_type": "markdown", + "id": "41014aac", + "metadata": {}, + "source": [ + "**Congratulations on finishing this week's assignment!** You did a lot of work and now you should have a better understanding of Transformers and their building blocks (encoder and decoder) and how they can be used for text summarization. And remember: you don't need to change much to use the same model for a translator — just change the dataset and it should work!\n", + "\n", + "**Keep it up!**" + ] + } + ], + "metadata": { + "grader_version": "1", + "kernelspec": { + "display_name": "Python 3 (ipykernel)", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.8.10" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/NLP with Attention Models/Text_Summarization/Summarization/tf/__pycache__/utils.cpython-38.pyc b/NLP with Attention Models/Text_Summarization/Summarization/tf/__pycache__/utils.cpython-38.pyc new file mode 100644 index 0000000000000000000000000000000000000000..d53bbd767ec4b4d83d58b9697d8be81193cf2fba Binary files /dev/null and b/NLP with Attention Models/Text_Summarization/Summarization/tf/__pycache__/utils.cpython-38.pyc differ diff --git a/NLP with Attention Models/Text_Summarization/Summarization/tf/__pycache__/w2_unittest.cpython-311.pyc b/NLP with Attention 
Models/Text_Summarization/Summarization/tf/__pycache__/w2_unittest.cpython-311.pyc new file mode 100644 index 0000000000000000000000000000000000000000..23b07c2ecd9214906060b00898f0ea4caf0815b0 Binary files /dev/null and b/NLP with Attention Models/Text_Summarization/Summarization/tf/__pycache__/w2_unittest.cpython-311.pyc differ diff --git a/NLP with Attention Models/Text_Summarization/Summarization/tf/__pycache__/w2_unittest.cpython-38.pyc b/NLP with Attention Models/Text_Summarization/Summarization/tf/__pycache__/w2_unittest.cpython-38.pyc new file mode 100644 index 0000000000000000000000000000000000000000..787d23826c9342033317e0dd3d69dff7c7c08196 Binary files /dev/null and b/NLP with Attention Models/Text_Summarization/Summarization/tf/__pycache__/w2_unittest.cpython-38.pyc differ diff --git a/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus.tar.gz b/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus.tar.gz new file mode 100644 index 0000000000000000000000000000000000000000..f509463ca307798cb2a0d21858394b92f3a55e7e --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus.tar.gz @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:56c98bc3b18ffc673912d24ca40836f983cd0e95058a0f52c4dffcf26fee3376 +size 4130712 diff --git a/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus/README.txt b/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus/README.txt new file mode 100644 index 0000000000000000000000000000000000000000..265b8f996ab3eb7d4d7069f91a30ad8a3a632c8c --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus/README.txt @@ -0,0 +1,22 @@ +Dataset +The SAMSum dataset contains about 16k messenger-like conversations with summaries. Conversations were created and written down by linguists fluent in English. 
Linguists were asked to create conversations similar to those they write on a daily basis, reflecting the proportion of topics of their real-life messenger conversations. The style and register are diversified - conversations could be informal, semi-formal or formal, they may contain slang words, emoticons and typos. Then, the conversations were annotated with summaries. It was assumed that summaries should be a concise brief of what people talked about in the conversation in third person. +The SAMSum dataset was prepared by Samsung R&D Institute Poland and is distributed for research purposes (non-commercial licence: CC BY-NC-ND 4.0). + +Paper +The dataset and experiments performed using it were described in the paper: "SAMSum Corpus: A Human-annotated Dialogue Dataset for Abstractive Summarization". Please cite our paper if you use this dataset: + +@inproceedings{gliwa-etal-2019-samsum, + title = "{SAMS}um Corpus: A Human-annotated Dialogue Dataset for Abstractive Summarization", + author = "Gliwa, Bogdan and + Mochol, Iwona and + Biesek, Maciej and + Wawer, Aleksander", + booktitle = "Proceedings of the 2nd Workshop on New Frontiers in Summarization", + month = nov, + year = "2019", + address = "Hong Kong, China", + publisher = "Association for Computational Linguistics", + url = "https://www.aclweb.org/anthology/D19-5409", + doi = "10.18653/v1/D19-5409", + pages = "70--79" +} diff --git a/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus/licence.txt b/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus/licence.txt new file mode 100644 index 0000000000000000000000000000000000000000..cfe676c541842bd2b766787b1f6983e063a1eee2 --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus/licence.txt @@ -0,0 +1,403 @@ +Attribution-NonCommercial-NoDerivatives 4.0 International + +======================================================================= + +Creative Commons Corporation ("Creative 
Commons") is not a law firm and +does not provide legal services or legal advice. Distribution of +Creative Commons public licenses does not create a lawyer-client or +other relationship. Creative Commons makes its licenses and related +information available on an "as-is" basis. Creative Commons gives no +warranties regarding its licenses, any material licensed under their +terms and conditions, or any related information. Creative Commons +disclaims all liability for damages resulting from their use to the +fullest extent possible. + +Using Creative Commons Public Licenses + +Creative Commons public licenses provide a standard set of terms and +conditions that creators and other rights holders may use to share +original works of authorship and other material subject to copyright +and certain other rights specified in the public license below. The +following considerations are for informational purposes only, are not +exhaustive, and do not form part of our licenses. + + Considerations for licensors: Our public licenses are + intended for use by those authorized to give the public + permission to use material in ways otherwise restricted by + copyright and certain other rights. Our licenses are + irrevocable. Licensors should read and understand the terms + and conditions of the license they choose before applying it. + Licensors should also secure all rights necessary before + applying our licenses so that the public can reuse the + material as expected. Licensors should clearly mark any + material not subject to the license. This includes other CC- + licensed material, or material used under an exception or + limitation to copyright. More considerations for licensors: + wiki.creativecommons.org/Considerations_for_licensors + + Considerations for the public: By using one of our public + licenses, a licensor grants the public permission to use the + licensed material under specified terms and conditions. 
If + the licensor's permission is not necessary for any reason--for + example, because of any applicable exception or limitation to + copyright--then that use is not regulated by the license. Our + licenses grant only permissions under copyright and certain + other rights that a licensor has authority to grant. Use of + the licensed material may still be restricted for other + reasons, including because others have copyright or other + rights in the material. A licensor may make special requests, + such as asking that all changes be marked or described. + Although not required by our licenses, you are encouraged to + respect those requests where reasonable. More considerations + for the public: + wiki.creativecommons.org/Considerations_for_licensees + +======================================================================= + +Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 +International Public License + +By exercising the Licensed Rights (defined below), You accept and agree +to be bound by the terms and conditions of this Creative Commons +Attribution-NonCommercial-NoDerivatives 4.0 International Public +License ("Public License"). To the extent this Public License may be +interpreted as a contract, You are granted the Licensed Rights in +consideration of Your acceptance of these terms and conditions, and the +Licensor grants You such rights in consideration of benefits the +Licensor receives from making the Licensed Material available under +these terms and conditions. + + +Section 1 -- Definitions. + + a. Adapted Material means material subject to Copyright and Similar + Rights that is derived from or based upon the Licensed Material + and in which the Licensed Material is translated, altered, + arranged, transformed, or otherwise modified in a manner requiring + permission under the Copyright and Similar Rights held by the + Licensor. 
For purposes of this Public License, where the Licensed + Material is a musical work, performance, or sound recording, + Adapted Material is always produced where the Licensed Material is + synched in timed relation with a moving image. + + b. Copyright and Similar Rights means copyright and/or similar rights + closely related to copyright including, without limitation, + performance, broadcast, sound recording, and Sui Generis Database + Rights, without regard to how the rights are labeled or + categorized. For purposes of this Public License, the rights + specified in Section 2(b)(1)-(2) are not Copyright and Similar + Rights. + + c. Effective Technological Measures means those measures that, in the + absence of proper authority, may not be circumvented under laws + fulfilling obligations under Article 11 of the WIPO Copyright + Treaty adopted on December 20, 1996, and/or similar international + agreements. + + d. Exceptions and Limitations means fair use, fair dealing, and/or + any other exception or limitation to Copyright and Similar Rights + that applies to Your use of the Licensed Material. + + e. Licensed Material means the artistic or literary work, database, + or other material to which the Licensor applied this Public + License. + + f. Licensed Rights means the rights granted to You subject to the + terms and conditions of this Public License, which are limited to + all Copyright and Similar Rights that apply to Your use of the + Licensed Material and that the Licensor has authority to license. + + g. Licensor means the individual(s) or entity(ies) granting rights + under this Public License. + + h. NonCommercial means not primarily intended for or directed towards + commercial advantage or monetary compensation. 
For purposes of + this Public License, the exchange of the Licensed Material for + other material subject to Copyright and Similar Rights by digital + file-sharing or similar means is NonCommercial provided there is + no payment of monetary compensation in connection with the + exchange. + + i. Share means to provide material to the public by any means or + process that requires permission under the Licensed Rights, such + as reproduction, public display, public performance, distribution, + dissemination, communication, or importation, and to make material + available to the public including in ways that members of the + public may access the material from a place and at a time + individually chosen by them. + + j. Sui Generis Database Rights means rights other than copyright + resulting from Directive 96/9/EC of the European Parliament and of + the Council of 11 March 1996 on the legal protection of databases, + as amended and/or succeeded, as well as other essentially + equivalent rights anywhere in the world. + + k. You means the individual or entity exercising the Licensed Rights + under this Public License. Your has a corresponding meaning. + + +Section 2 -- Scope. + + a. License grant. + + 1. Subject to the terms and conditions of this Public License, + the Licensor hereby grants You a worldwide, royalty-free, + non-sublicensable, non-exclusive, irrevocable license to + exercise the Licensed Rights in the Licensed Material to: + + a. reproduce and Share the Licensed Material, in whole or + in part, for NonCommercial purposes only; and + + b. produce and reproduce, but not Share, Adapted Material + for NonCommercial purposes only. + + 2. Exceptions and Limitations. For the avoidance of doubt, where + Exceptions and Limitations apply to Your use, this Public + License does not apply, and You do not need to comply with + its terms and conditions. + + 3. Term. The term of this Public License is specified in Section + 6(a). + + 4. 
Media and formats; technical modifications allowed. The + Licensor authorizes You to exercise the Licensed Rights in + all media and formats whether now known or hereafter created, + and to make technical modifications necessary to do so. The + Licensor waives and/or agrees not to assert any right or + authority to forbid You from making technical modifications + necessary to exercise the Licensed Rights, including + technical modifications necessary to circumvent Effective + Technological Measures. For purposes of this Public License, + simply making modifications authorized by this Section 2(a) + (4) never produces Adapted Material. + + 5. Downstream recipients. + + a. Offer from the Licensor -- Licensed Material. Every + recipient of the Licensed Material automatically + receives an offer from the Licensor to exercise the + Licensed Rights under the terms and conditions of this + Public License. + + b. No downstream restrictions. You may not offer or impose + any additional or different terms or conditions on, or + apply any Effective Technological Measures to, the + Licensed Material if doing so restricts exercise of the + Licensed Rights by any recipient of the Licensed + Material. + + 6. No endorsement. Nothing in this Public License constitutes or + may be construed as permission to assert or imply that You + are, or that Your use of the Licensed Material is, connected + with, or sponsored, endorsed, or granted official status by, + the Licensor or others designated to receive attribution as + provided in Section 3(a)(1)(A)(i). + + b. Other rights. + + 1. Moral rights, such as the right of integrity, are not + licensed under this Public License, nor are publicity, + privacy, and/or other similar personality rights; however, to + the extent possible, the Licensor waives and/or agrees not to + assert any such rights held by the Licensor to the limited + extent necessary to allow You to exercise the Licensed + Rights, but not otherwise. + + 2. 
Patent and trademark rights are not licensed under this + Public License. + + 3. To the extent possible, the Licensor waives any right to + collect royalties from You for the exercise of the Licensed + Rights, whether directly or through a collecting society + under any voluntary or waivable statutory or compulsory + licensing scheme. In all other cases the Licensor expressly + reserves any right to collect such royalties, including when + the Licensed Material is used other than for NonCommercial + purposes. + + +Section 3 -- License Conditions. + +Your exercise of the Licensed Rights is expressly made subject to the +following conditions. + + a. Attribution. + + 1. If You Share the Licensed Material, You must: + + a. retain the following if it is supplied by the Licensor + with the Licensed Material: + + i. identification of the creator(s) of the Licensed + Material and any others designated to receive + attribution, in any reasonable manner requested by + the Licensor (including by pseudonym if + designated); + + ii. a copyright notice; + + iii. a notice that refers to this Public License; + + iv. a notice that refers to the disclaimer of + warranties; + + v. a URI or hyperlink to the Licensed Material to the + extent reasonably practicable; + + b. indicate if You modified the Licensed Material and + retain an indication of any previous modifications; and + + c. indicate the Licensed Material is licensed under this + Public License, and include the text of, or the URI or + hyperlink to, this Public License. + + For the avoidance of doubt, You do not have permission under + this Public License to Share Adapted Material. + + 2. You may satisfy the conditions in Section 3(a)(1) in any + reasonable manner based on the medium, means, and context in + which You Share the Licensed Material. For example, it may be + reasonable to satisfy the conditions by providing a URI or + hyperlink to a resource that includes the required + information. + + 3. 
If requested by the Licensor, You must remove any of the + information required by Section 3(a)(1)(A) to the extent + reasonably practicable. + + +Section 4 -- Sui Generis Database Rights. + +Where the Licensed Rights include Sui Generis Database Rights that +apply to Your use of the Licensed Material: + + a. for the avoidance of doubt, Section 2(a)(1) grants You the right + to extract, reuse, reproduce, and Share all or a substantial + portion of the contents of the database for NonCommercial purposes + only and provided You do not Share Adapted Material; + + b. if You include all or a substantial portion of the database + contents in a database in which You have Sui Generis Database + Rights, then the database in which You have Sui Generis Database + Rights (but not its individual contents) is Adapted Material; and + + c. You must comply with the conditions in Section 3(a) if You Share + all or a substantial portion of the contents of the database. + +For the avoidance of doubt, this Section 4 supplements and does not +replace Your obligations under this Public License where the Licensed +Rights include other Copyright and Similar Rights. + + +Section 5 -- Disclaimer of Warranties and Limitation of Liability. + + a. UNLESS OTHERWISE SEPARATELY UNDERTAKEN BY THE LICENSOR, TO THE + EXTENT POSSIBLE, THE LICENSOR OFFERS THE LICENSED MATERIAL AS-IS + AND AS-AVAILABLE, AND MAKES NO REPRESENTATIONS OR WARRANTIES OF + ANY KIND CONCERNING THE LICENSED MATERIAL, WHETHER EXPRESS, + IMPLIED, STATUTORY, OR OTHER. THIS INCLUDES, WITHOUT LIMITATION, + WARRANTIES OF TITLE, MERCHANTABILITY, FITNESS FOR A PARTICULAR + PURPOSE, NON-INFRINGEMENT, ABSENCE OF LATENT OR OTHER DEFECTS, + ACCURACY, OR THE PRESENCE OR ABSENCE OF ERRORS, WHETHER OR NOT + KNOWN OR DISCOVERABLE. WHERE DISCLAIMERS OF WARRANTIES ARE NOT + ALLOWED IN FULL OR IN PART, THIS DISCLAIMER MAY NOT APPLY TO YOU. + + b. 
TO THE EXTENT POSSIBLE, IN NO EVENT WILL THE LICENSOR BE LIABLE + TO YOU ON ANY LEGAL THEORY (INCLUDING, WITHOUT LIMITATION, + NEGLIGENCE) OR OTHERWISE FOR ANY DIRECT, SPECIAL, INDIRECT, + INCIDENTAL, CONSEQUENTIAL, PUNITIVE, EXEMPLARY, OR OTHER LOSSES, + COSTS, EXPENSES, OR DAMAGES ARISING OUT OF THIS PUBLIC LICENSE OR + USE OF THE LICENSED MATERIAL, EVEN IF THE LICENSOR HAS BEEN + ADVISED OF THE POSSIBILITY OF SUCH LOSSES, COSTS, EXPENSES, OR + DAMAGES. WHERE A LIMITATION OF LIABILITY IS NOT ALLOWED IN FULL OR + IN PART, THIS LIMITATION MAY NOT APPLY TO YOU. + + c. The disclaimer of warranties and limitation of liability provided + above shall be interpreted in a manner that, to the extent + possible, most closely approximates an absolute disclaimer and + waiver of all liability. + + +Section 6 -- Term and Termination. + + a. This Public License applies for the term of the Copyright and + Similar Rights licensed here. However, if You fail to comply with + this Public License, then Your rights under this Public License + terminate automatically. + + b. Where Your right to use the Licensed Material has terminated under + Section 6(a), it reinstates: + + 1. automatically as of the date the violation is cured, provided + it is cured within 30 days of Your discovery of the + violation; or + + 2. upon express reinstatement by the Licensor. + + For the avoidance of doubt, this Section 6(b) does not affect any + right the Licensor may have to seek remedies for Your violations + of this Public License. + + c. For the avoidance of doubt, the Licensor may also offer the + Licensed Material under separate terms or conditions or stop + distributing the Licensed Material at any time; however, doing so + will not terminate this Public License. + + d. Sections 1, 5, 6, 7, and 8 survive termination of this Public + License. + + +Section 7 -- Other Terms and Conditions. + + a. 
The Licensor shall not be bound by any additional or different + terms or conditions communicated by You unless expressly agreed. + + b. Any arrangements, understandings, or agreements regarding the + Licensed Material not stated herein are separate from and + independent of the terms and conditions of this Public License. + + +Section 8 -- Interpretation. + + a. For the avoidance of doubt, this Public License does not, and + shall not be interpreted to, reduce, limit, restrict, or impose + conditions on any use of the Licensed Material that could lawfully + be made without permission under this Public License. + + b. To the extent possible, if any provision of this Public License is + deemed unenforceable, it shall be automatically reformed to the + minimum extent necessary to make it enforceable. If the provision + cannot be reformed, it shall be severed from this Public License + without affecting the enforceability of the remaining terms and + conditions. + + c. No term or condition of this Public License will be waived and no + failure to comply consented to unless expressly agreed to by the + Licensor. + + d. Nothing in this Public License constitutes or may be interpreted + as a limitation upon, or waiver of, any privileges and immunities + that apply to the Licensor or You, including from the legal + processes of any jurisdiction or authority. + +======================================================================= + +Creative Commons is not a party to its public +licenses. Notwithstanding, Creative Commons may elect to apply one of +its public licenses to material it publishes and in those instances +will be considered the “Licensor.” The text of the Creative Commons +public licenses is dedicated to the public domain under the CC0 Public +Domain Dedication. 
Except for the limited purpose of indicating that +material is shared under a Creative Commons public license or as +otherwise permitted by the Creative Commons policies published at +creativecommons.org/policies, Creative Commons does not authorize the +use of the trademark "Creative Commons" or any other trademark or logo +of Creative Commons without its prior written consent including, +without limitation, in connection with any unauthorized modifications +to any of its public licenses or any other arrangements, +understandings, or agreements concerning use of licensed material. For +the avoidance of doubt, this paragraph does not form part of the +public licenses. + +Creative Commons may be contacted at creativecommons.org. + diff --git a/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus/test.json b/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus/test.json new file mode 100644 index 0000000000000000000000000000000000000000..6f747a76d0b820c09ba1a4a655332ae13208c945 --- /dev/null +++ b/NLP with Attention Models/Text_Summarization/Summarization/tf/data/corpus/test.json @@ -0,0 +1,4097 @@ +[ + { + "id": "13862856", + "summary": "Hannah needs Betty's number but Amanda doesn't have it. She needs to contact Larry.", + "dialogue": "Hannah: Hey, do you have Betty's number?\nAmanda: Lemme check\nHannah: \nAmanda: Sorry, can't find it.\nAmanda: Ask Larry\nAmanda: He called her last time we were at the park together\nHannah: I don't know him well\nHannah: \nAmanda: Don't be shy, he's very nice\nHannah: If you say so..\nHannah: I'd rather you texted him\nAmanda: Just text him 🙂\nHannah: Urgh.. Alright\nHannah: Bye\nAmanda: Bye bye" + }, + { + "id": "13729565", + "summary": "Eric and Rob are going to watch a stand-up on youtube.", + "dialogue": "Eric: MACHINE!\r\nRob: That's so gr8!\r\nEric: I know! And shows how Americans see Russian ;)\r\nRob: And it's really funny!\r\nEric: I know! 
I especially like the train part!\r\nRob: Hahaha! No one talks to the machine like that!\r\nEric: Is this his only stand-up?\r\nRob: Idk. I'll check.\r\nEric: Sure.\r\nRob: Turns out no! There are some of his stand-ups on youtube.\r\nEric: Gr8! I'll watch them now!\r\nRob: Me too!\r\nEric: MACHINE!\r\nRob: MACHINE!\r\nEric: TTYL?\r\nRob: Sure :)" + }, + { + "id": "13680171", + "summary": "Lenny can't decide which trousers to buy. Bob advised Lenny on that topic. Lenny goes with Bob's advice to pick the trousers that are of best quality.", + "dialogue": "Lenny: Babe, can you help me with something?\r\nBob: Sure, what's up?\r\nLenny: Which one should I pick?\r\nBob: Send me photos\r\nLenny: \r\nLenny: \r\nLenny: \r\nBob: I like the first ones best\r\nLenny: But I already have purple trousers. Does it make sense to have two pairs?\r\nBob: I have four black pairs :D :D\r\nLenny: yeah, but shouldn't I pick a different color?\r\nBob: what matters is what you'll give you the most outfit options\r\nLenny: So I guess I'll buy the first or the third pair then\r\nBob: Pick the best quality then\r\nLenny: ur right, thx\r\nBob: no prob :)" + }, + { + "id": "13729438", + "summary": "Emma will be home soon and she will let Will know.", + "dialogue": "Will: hey babe, what do you want for dinner tonight?\r\nEmma: gah, don't even worry about it tonight\r\nWill: what do you mean? everything ok?\r\nEmma: not really, but it's ok, don't worry about cooking though, I'm not hungry\r\nWill: Well what time will you be home?\r\nEmma: soon, hopefully\r\nWill: you sure? Maybe you want me to pick you up?\r\nEmma: no no it's alright. I'll be home soon, i'll tell you when I get home. \r\nWill: Alright, love you. \r\nEmma: love you too. " + }, + { + "id": "13828600", + "summary": "Jane is in Warsaw. Ollie and Jane has a party. Jane lost her calendar. They will get a lunch this week on Friday. Ollie accidentally called Jane and talked about whisky. Jane cancels lunch. 
They'll meet for a tea at 6 pm.", + "dialogue": "Ollie: Hi , are you in Warsaw\r\nJane: yes, just back! Btw are you free for diner the 19th?\r\nOllie: nope!\r\nJane: and the 18th?\r\nOllie: nope, we have this party and you must be there, remember?\r\nJane: oh right! i lost my calendar.. thanks for reminding me\r\nOllie: we have lunch this week?\r\nJane: with pleasure!\r\nOllie: friday?\r\nJane: ok\r\nJane: what do you mean \" we don't have any more whisky!\" lol..\r\nOllie: what!!!\r\nJane: you just call me and the all thing i heard was that sentence about whisky... what's wrong with you?\r\nOllie: oh oh... very strange! i have to be carefull may be there is some spy in my mobile! lol\r\nJane: dont' worry, we'll check on friday.\r\nOllie: don't forget to bring some sun with you\r\nJane: I can't wait to be in Morocco..\r\nOllie: enjoy and see you friday\r\nJane: sorry Ollie, i'm very busy, i won't have time for lunch tomorrow, but may be at 6pm after my courses?this trip to Morocco was so nice, but time consuming!\r\nOllie: ok for tea!\r\nJane: I'm on my way..\r\nOllie: tea is ready, did you bring the pastries?\r\nJane: I already ate them all... see you in a minute\r\nOllie: ok" + }, + { + "id": "13716964", + "summary": "Hilary has the keys to the apartment. Benjamin wants to get them and go take a nap. Hilary is having lunch with some French people at La Cantina. Hilary is meeting them at the entrance to the conference hall at 2 pm. Benjamin and Elliot might join them. They're meeting for the drinks in the evening.", + "dialogue": "Benjamin: Hey guys, what are we doing with the keys today?\r\nHilary: I've got them. Whoever wants them can meet me at lunchtime or after\r\nElliot: I'm ok. 
We're meeting for the drinks in the evening anyway and I guess we'll be going back to the apartment together?\r\nHilary: Yeah, I guess so\r\nDaniel: I'm with Hilary atm and won't let go of her for the rest of the day, so any option you guys choose is good for me\r\nBenjamin: Hmm I might actually pass by at lunchtime, take the keys and go take a nap. I'm sooo tired after yesterday\r\nHilary: Sounds good. We'll be having lunch with some French people (the ones who work on the history of food in colonial Mexico - I already see you yawning your head off)\r\nBenjamin: YAAAAWN 🙊 Where and where are you meeting?\r\nHilary: So I'm meeting them at the entrance to the conference hall at 2 pm and then we'll head to this place called La Cantina. Italian cuisine, which is quite funny, but that's what they've chosen\r\nBenjamin: Interesting 😱 To be honest, Hilary, I almost feel like changing my mind. Wanting to take this nap might end up costing me to dear\r\nHilary: Oh come on 😂\r\nBenjamin: All these terrible obstacles on mu way to bed might just prove to much to take\r\nHilary: We'll try to avoid talking about their subject of research. Oh wait, no, I'm actually meeting them because I wanted to chat about their research lol\r\nElliot: 🙉\r\nHilary: Do join us, we're going to have fun. And then you'll take the keys and take this most deserved of naps\r\nElliot: Sounds like a plan 😂\r\nHilary: 😎\r\nElliot: See you at 2 then xx" + }, + { + "id": "13731487", + "summary": "Payton provides Max with websites selling clothes. Payton likes browsing and trying on the clothes but not necessarily buying them. Payton usually buys clothes and books as he loves reading.", + "dialogue": "Max: Know any good sites to buy clothes from?\r\nPayton: Sure :) \r\nMax: That's a lot of them!\r\nPayton: Yeah, but they have different things so I usually buy things from 2 or 3 of them.\r\nMax: I'll check them out. Thanks. 
\r\nPayton: No problem :)\r\nMax: How about u?\r\nPayton: What about me?\r\nMax: Do u like shopping?\r\nPayton: Yes and no.\r\nMax: How come?\r\nPayton: I like browsing, trying on, looking in the mirror and seeing how I look, but not always buying.\r\nMax: Y not?\r\nPayton: Isn't it obvious? ;)\r\nMax: Sry ;)\r\nPayton: If I bought everything I liked, I'd have nothing left to live on ;)\r\nMax: Same here, but probably different category ;)\r\nPayton: Lol\r\nMax: So what do u usually buy?\r\nPayton: Well, I have 2 things I must struggle to resist!\r\nMax: Which are?\r\nPayton: Clothes, ofc ;)\r\nMax: Right. And the second one?\r\nPayton: Books. I absolutely love reading!\r\nMax: Gr8! What books do u read?\r\nPayton: Everything I can get my hands on :)\r\nMax: Srsly?\r\nPayton: Yup :)" + }, + { + "id": "13814882", + "summary": "Rita and Tina are bored at work and have still 4 hours left.", + "dialogue": "Rita: I'm so bloody tired. Falling asleep at work. :-(\r\nTina: I know what you mean.\r\nTina: I keep on nodding off at my keyboard hoping that the boss doesn't notice..\r\nRita: The time just keeps on dragging on and on and on.... \r\nRita: I keep on looking at the clock and there's still 4 hours of this drudgery to go.\r\nTina: Times like these I really hate my work.\r\nRita: I'm really not cut out for this level of boredom.\r\nTina: Neither am I." + }, + { + "id": "13680876", + "summary": "Beatrice wants to buy Leo a scarf, but he doesn't like scarves. She cares about his health and will buy him a scarf no matter his opinion.", + "dialogue": "Beatrice: I am in town, shopping. They have nice scarfs in the shop next to the church. Do you want one?\r\nLeo: No, thanks\r\nBeatrice: But you don't have a scarf.\r\nLeo: Because I don't need it.\r\nBeatrice: Last winter you had a cold all the time. A scarf could help.\r\nLeo: I don't like them.\r\nBeatrice: Actually, I don't care. 
You will get a scarf.\r\nLeo: How understanding of you!\r\nBeatrice: You were complaining the whole winter that you're going to die. I've had enough.\r\nLeo: Eh." + }, + { + "id": "13809974", + "summary": "Eric doesn't know if his parents let him go to Ivan's brother's wedding. Ivan will talk to them.", + "dialogue": "Ivan: hey eric\r\nEric: yeah man\r\nIvan: so youre coming to the wedding\r\nEric: your brother's\r\nIvan: yea\r\nEric: i dont know mannn\r\nIvan: YOU DONT KNOW??\r\nEric: i just have a lot to do at home, plus i dont know if my parents would let me\r\nIvan: ill take care of your parents\r\nEric: youre telling me you have the guts to talk to them XD\r\nIvan: thats my problem\r\nEric: okay man, if you say so\r\nIvan: yea just be there \r\nEric: alright" + }, + { + "id": "13680771", + "summary": "Wanda wants to throw a party. She asks Gina to borrow her father's car and go do groceries together. They set the date for Friday. ", + "dialogue": "Wanda: Let's make a party!\r\nGina: Why?\r\nWanda: beacuse. I want some fun!\r\nGina: ok, what do u need?\r\nWanda: 1st I need too make a list\r\nGina: noted and then?\r\nWanda: well, could u take yours father car and go do groceries with me?\r\nGina: don't know if he'll agree\r\nWanda: I know, but u can ask :)\r\nGina: I'll try but theres no promisess\r\nWanda: I know, u r the best!\r\nGina: When u wanna go\r\nWanda: Friday?\r\nGina: ok, I'll ask" + }, + { + "id": "13729249", + "summary": "Martin wrote a short review and won 2 cinema tickets on FB. Martin wants Aggie to go with him this week for the new film with Redford.", + "dialogue": "Martin: I won two cinema tickets!\r\nAggie: oh cool, how come?\r\nMartin: online. on fb, the movie mag organized it\r\nAggie: so what did you do\r\nMartin: just write a short review and that's it\r\nAggie: well done :) so what and when. and where?\r\nMartin: the new film with Redford\r\nAggie: i guess i heard sth\r\nMartin: it's pretty cool i heard. 
till the end of the week\r\nAggie: sounds good. we'll find time XD" + }, + { + "id": "13680137", + "summary": "Charlee is attending Portuguese theater as a subject at university. He and other students are preparing a play by Mrożek translated into Portuguese.", + "dialogue": "Charlee: I'm in class. Theatre in Portuguese lol\r\nCurtis: Realllly?\r\nCharlee: Yes. One of my subjects at the university that I attend is portuguese theatre. We are preparing a performance\r\nCurtis: What performance is this? Are you devising it?\r\nCharlee: A polish one translated into portuguese\r\nCurtis: Thats quite cool. Who is the writer?\r\nCharlee: Mrożek" + }, + { + "id": "13864627", + "summary": "Ella rented a car, this makes things much faster for her and Tom. ", + "dialogue": "Mary: Are you going by car or train?\nTom: Ella rented a car\nElla: this makes all of this much faster\nMary: good decision" + }, + { + "id": "13681139", + "summary": "Paul is going to share his Netflix account with Luke. In exchange Luke is going to contribute to the subscription. Paul will send Luke his bank details. Paul is on vacation with his girlfriend till tomorrow.", + "dialogue": "Luke: are you still looking for someone to join netflix family?\r\nPaul: yes, 1 person :)\r\nLuke: i am the one!\r\nPaul: sure, i will send you the login and password on sunday\r\nLuke: ok we can talk tomorrow\r\nPaul: i don't really remember it now\r\nLuke: send me also the bank account details so I can wire you the money every month. Are you paying for this or someone else?\r\nPaul: I do, and I keep track of everyone accessing so you should not expect any bans :D\r\nLuke: easy mate :D you still on holidays with your girl?\r\nPaul: last dinner :( tomorrow we are out\r\nLuke: how long have you been there?\r\nPaul: less than 8 days :/" + }, + { + "id": "13680757", + "summary": "Greg and Betsy have a lot of work today, so they cannot pick up Johnny from the kindergarten. However, it's Greg's turn to do it. 
Greg will try to find a solution.", + "dialogue": "Greg: Hi, honey. I need to stay after hours :-(\r\nBetsy: Again?\r\nGreg: I'm sorry!\r\nBetsy: What about Johnny?\r\nGreg: Well, could you pick him up? \r\nBetsy: What if I can't?\r\nGreg: Betsy?\r\nBetsy: What if I can't?\r\nGreg: Can't you, really?\r\nBetsy: I can't. Today I need to work long hours as well. Tuesdays are your days in the kindergarten.\r\nGreg: Talk to you later. I'll see what I can do.\r\nBetsy: You'd better think of something.\r\nGreg: Oh. Just stop it now." + }, + { + "id": "13716777", + "summary": "Ethan, Toby and Marshall are making fun of Scott.", + "dialogue": "Ethan: somethin for Scott \r\nToby: haha, totally\r\nMarshall: pretty much sums it up\r\nScott: you know you're exactly fuckin the same\r\nToby: oh we know honey bunny\r\nMarshall: we just enjoy making fun of YOU\r\nEthan: xD\r\nScott: oh fuck y'all\r\nToby: " + }, + { + "id": "13828901", + "summary": "Igor has a lot of work on his notice period and he feels demotivated. John thinks he should do what he has to do nevertheless. ", + "dialogue": "Igor: Shit, I've got so much to do at work and I'm so demotivated. \r\nJohn: It's pretty irresponsible to give that much work to someone on their notice period.\r\nIgor: Yeah, exactly! Should I even care?\r\nJohn: It's up to you, but you know what they say...\r\nIgor: What do you mean?\r\nJohn: Well, they say how you end things shows how you really are...\r\nIgor: And now how you start, right?\r\nJohn: Gotcha! \r\nIgor: So what shall I do then? \r\nJohn: It's only two weeks left, so grit your teeth and do what you have to do. \r\nIgor: Easy to say, hard to perform.\r\nJohn: Come on, stop thinking, start doing! \r\nIgor: That's so typical of you! 
;) " + }, + { + "id": "13728509", + "summary": "Clara is rewatching Dear White People and strongly recommends it to Neela.", + "dialogue": "Clara: Hi, what you up to?\r\nNeela: Not much, chilling out.\r\nClara: Just rewatching Dear White People on Netflix, love it!😍\r\nNeela: Oh yeah, heard of it, but not seen it yet? Any good?\r\nClara: Well, yes! I just said it was, LOL. It's about a fictional Ivy League University and the students in one House of Residence.\r\nNeela: Why is it called Dear White People?\r\nClara: That's the name of the radio show the main character, Sam, presents on college radio.\r\nNeela: Yeah, but why is it so good?\r\nClara: Well, it's mainly stories from the perspective of black students there, which I find very interesting. The characters are strong and likeable too.\r\nNeela: I suppose it's rather different from the UK, then?\r\nClara: It seems so, as there is a lot more racial awareness and discrimination there than here. It all kicks off when there is a Blackface party held by an elite group of white students, which gets out of hand.\r\nNeela: How's that?\r\nClara: Well, obviously, the black students try to break it up and there's also an incident where one guy, Reggie, gets a loaded gun pointed at him by a campus policeman after he gets into an argument with a white student. It may be at another party, though, I'm not sure of that.\r\nNeela: Oh, that sounds pretty strong stuff. What else happens?\r\nClara: Well, there is a young black guy called Lionel who is coming to terms with being gay and is finding his voice as a journalist. 
He unearths corruption at the uni and he and Sam also uncover some conspiracy theory stuff about secret societies.\r\nNeela: Well, I must say, it does sound good, I'll check it out soon!\r\nClara: Definitely, there is supposed to be a Series 3 coming up next year, really looking forward to it!\r\nNeela: Well, thanks Clara, I'm just watching the rest of a movie and I'll try Dear White People.\r\nClara: Don't blame me if you get hooked and stay up till 4!\r\nNeela: See ya, love!\r\nClara: Bye!" + }, + { + "id": "13728442", + "summary": "Mike took his car into garage today. Ernest is relieved as someone had just crashed into a red Honda which looks like Mike's. ", + "dialogue": "Ernest: hey Mike, did you park your car on our street?\r\nMike: no, took it into garage today\r\nErnest: ok good\r\nMike: why?\r\nErnest: someone just crashed into a red honda looking just like yours\r\nMike: lol lucky me" + }, + { + "id": "13728958", + "summary": "Beth wants to organize a girls weekend to celebrate her mother's 40th birthday. She also wants to work at Deidre's beauty salon. Deidre offers her a few hours on Saturdays as work experience. They set up for a meeting tomorrow.", + "dialogue": "Deirdre: Hi Beth, how are you love?\r\nBeth: Hi Auntie Deirdre, I'm been meaning to message you, had a favour to ask.\r\nDeirdre: Wondered if you had any thought about your Mum's 40th, we've got to do something special!\r\nBeth: How about a girls weekend, just mum, me, you and the girls, Kira will have to come back from Uni, of course.\r\nDeirdre: Sounds fab! Get your thinking cap on, it's only in 6 weeks! Bet she's dreading it, I remember doing that!\r\nBeth: Oh yeah, we had a surprise party for you, you nearly had a heart attack! \r\nDeirdre: Well, it was a lovely surprise! Gosh, thats nearly 4 years ago now, time flies! 
What was the favour, darling?\r\nBeth: Oh, it was just that I fancied trying a bit of work experience in the salon, auntie.\r\nDeirdre: Well, I am looking for Saturday girls, are you sure about it? you could do well in the exams and go on to college or 6th form.\r\nBeth: I know, but it's not for me, auntie, I am doing all foundation papers and I'm struggling with those.\r\nDeirdre: What about a tutor? Kira could help you in the hols.\r\nBeth: Maybe, but I'd like to try working. I'm 16 soon, I'm old enough.\r\nDeirdre: I know. Look, pop in tomorrow after school and we'll have a cuppa and a chat.\r\nBeth: Yes, thanks auntie. I'd really like to try the beauty therapy side.\r\nDeirdre: Its not for the squeamish, mind. Massage, pedicures, not to mention waxing!\r\nBeth: Oh yes, I was chatting to a friend about it yesterday!\r\nDeirdre: Maxine manages the beauty side, you can meet her tomorrow and we'll see how it goes.\r\nBeth: Yes, I'd really like that. \r\nDeirdre: We can try a few hours on a Saturday for a couple of weeks as work experience. I'll give you a tenner or so per session to start off for your lunch, coffee and bus fare etc. If you like, we'll take it from there.\r\nBeth: OK, I like the sound of it! See you tomorrow Auntie! Love you!\r\nDeirdre: Bye, lovely girl! Xx" + }, + { + "id": "13862397", + "summary": "Gloria has an exam soon. It lasts 4 hours. 
Emma sent her a link to a website with some texts from previous years so that she can prepare for the exam better.", + "dialogue": "Gloria: This exam is a bit of a lottery in fact\nGloria: You can't really get prepared, it's all about experience\nEmma: But there are some rules and some typical texts right?\nGloria: You can see some texts from previous years\nGloria: \nEmma: Wow that's very useful\nEmma: I have never seen this site\nGloria: Yes it's very good\nGloria: Actually it's good to read all the texts because you will see that some phrases repeat very often\nEmma: How much time do you have for all 4 parts?\nGloria: 4 hours\nEmma: Is it enough?\nGloria: Well it has to be\nGloria: Would be perfect to have 2 more hours... But on the other hand it would be really exhausting\nEmma: 4 hours and no breaks?\nGloria: No breaks :/ So it's really important to be really focused and try to write as fast as you can\nGloria: And read it carefully and correct during the last hour\nEmma: I'm going to read everything from that website, it's great" + }, + { + "id": "13829966", + "summary": "Adam and Karen are worried that May suffers from depression. Karen will call her friend who is a psychologist and ask for advice. ", + "dialogue": "Adam: Have you talked to May?\r\nKaren: Yes, yesterday, why?\r\nAdam: I just talked to her and I must admit I worry about her\r\nKaren: Me too, I suggested she should see a specialist, but she wasn't very happy about it\r\nAdam: No wonder...\r\nKaren: I know, but I think this is serious. She's saying she's depressed, like everyone around, but in her case it may be true\r\nAdam: She was telling me she doesn't feel like doing anything, she's bored all the time, she never feels happy. It sounds like a real, typical depression\r\nAdam: She also told me that she has trouble sleeping. I asked her to go out for a beer or anything basically, but she doesn't want to leave the flat\r\nKaren: Oh my, it sounds really serious. 
I don't what to tell you\r\nAdam: I was wondering how I can help her\r\nKaren: Honestly I don't know if we can help her, Adam. I suggested a specialist because these are very sensitive issues and I'm afraid we may unintentionally make it worse\r\nAdam: Yes, but she doesn't want to see a specialist. Basically, she doesn't want to see anyone\r\nKaren: Hm... I don't know... How about I call someone for advice? So we could know what to do\r\nAdam: Sounds rational, do you know anyone you could call? Don't mention her name\r\nKaren: Of course I won't! I have a friend who's a psychologist, we can trust her. I'll let you know\r\nAdam: Thank you Karen!" + }, + { + "id": "13864400", + "summary": "Mark lied to Anne about his age. Mark is 40.", + "dialogue": "Anne: You were right, he was lying to me :/\nIrene: Oh no, what happened?\nJane: who? that Mark guy?\nAnne: yeah, he told me he's 30, today I saw his passport - he's 40\nIrene: You sure it's so important?\nAnne: he lied to me Irene" + }, + { + "id": "13716103", + "summary": "Next week is Wharton's birthday. Augustine, Darlene, Heather and Walker want to buy him a paper shredder. Walker will make sure if Wharton really wants it. ", + "dialogue": "Augustine: Guys, remember it's Wharton's bday next week?\r\nDarlene: yay, a party!\r\nHeather: yay! crap we need to buy him a present\r\nWalker: he mentioned paper shredder once\r\nAugustine: wtf?!?\r\nWalker: he did really. for no reason at all.\r\nHeather: whatever that make him happy\r\nDarlene: cool with me. we can shred some papers at the party \r\nAugustine: so much fun\r\nHeather: srsly guys, you mean we should really get office equipment???\r\nDarlene: Walk, ask him if he really wnts it and if he yes then we get it\r\nWalker: i heard him say that. wasn;t drunk. me neither.\r\nDarlene: but better ask him twice\r\nWalker: will do\r\nAugustine: 2moro ok?\r\nDarlene: and sure ask ab the party!" 
+ }, + { + "id": "13716128", + "summary": "Kelly is scared of sculpture garden figures in Finnland, she finds figure's faces morbid. For Ollie it's Nagoro village in Japan, it's creepy. ", + "dialogue": "Ollie: Okay, Kelly! Ur up nxt!\r\nKelly: Me? I don't wanna.\r\nMickey: C'mon!\r\nJessica: Yeah! What's yours?\r\nKelly: Fine. It's a sculpture garden in Finnland.\r\nOllie: What's scary about sculptures? Wait! Do they resemble vampires and stuff?\r\nMickey: Nah, I'm sure they look rly nice.\r\nKelly: It's not the sculptures, it's the amount of them and their faces!\r\nJessica: Faces? What faces?\r\nKelly: Well, they resemble ppl in different activities like hugging, training, doing sport and so on. But the faces are just morbid and there's like a hundred of them. All staring at you!\r\nOllie: Another one?\r\nMickey: Certainly!\r\nJessica: Well, Ollie, ur turn!\r\nOllie: Nagoro village in Japan!\r\nMickey: Y?\r\nOllie: Well, maybe it's not scary, but it similar to Kelly's place. It's just creepy as hell.\r\nJessica: Bt y?\r\nOllie: Imagine a village with ppl living in it. And in the same village u have these human-sized figures. And there's more of them than the ppl that actually live there!\r\nKelly: Creepy AH!\r\nMickey: WTF?! Y would ppl even do that?\r\nJessica: Idk. Idc. Never. Going. There.\r\nOllie: See! Mine was the worst!\r\nJessica: Bt not the scariest!\r\nOllie: Point taken.\r\nMickey: Listen, guys, fun talking to u, bt gotta go. \r\nKelly: Yeah, me too. Bye!\r\nJessica: Bye!\r\nOllie: Cu!" + }, + { + "id": "13682496", + "summary": "Selah called a person that did not pick up.", + "dialogue": "Myah: \r\nSelah: I can't see the phone number very well. Rewrite it plz\r\nMyah: \r\nSelah: The phone of that person is off" + }, + { + "id": "13730015", + "summary": "Bella and Eric dismissed a request of a client. Their boss appreciated the decision. He brings in new clients.", + "dialogue": "Eric: Hey Bella, What happened today in boss's room?? 
Was he angry??\r\nBella: NO NO!!! He wasn't angry at all.. He actually appreciated on our brave deccision to dismiss the request of client..\r\nEric: REALLY!! He appreciated this decision..\r\nBella: Yeah he really did.. I too was astounded by his reaction...\r\nEric: What could possibly lead to this?? I mean , they were potential clients...\r\nBella: What he told me was that he was looking forward to bring in new clients which were our current client's competitor..\r\nEric: Oh that could possibly be the reason.Well anyways you got appreciation xD congo\t\r\nBella: hahaha Blessing in disguise xD" + }, + { + "id": "13731152", + "summary": "Emma is about to take a nap in the back of the bus to New York. Ben and Emma will be there around 4.30 pm. Ben will wake Emma up 15 minutes prior to their arrival.", + "dialogue": "Ben: Where are you?\r\nEmma: at the rare of the bus\r\nBen: why?\r\nEmma: there are some free seats here\r\nEmma: so I can have a nap even\r\nBen: good idea\r\nEmma: when are we going to arrive to NY?\r\nBen: around 4.30 PM\r\nEmma: if traffic is not crazy\r\nBen: right, we will see\r\nEmma: could you come here and wake me up around 4.15?\r\nBen: sure\r\nEmma: thanks!\r\nBen: sleep well\r\nEmma: I'll try" + }, + { + "id": "13716107", + "summary": "Jesse, Melvin, Lee and Maxine are going to take part in the Christmas charity action of the foundation called Refuge, which helps women and children who escape from abuse.", + "dialogue": "Jesse: I have an idea that'll cheer u up!\r\nMelvin: What is it?\r\nJesse: I was thinking about doing something 4 the less fortunate this year. \r\nLee: Gr8 idea! Anything in mind?\r\nMaxine: So no presents 4 me? :(\r\nJesse: U'll get ur presents, no worries ;)\r\nMaxine: Phew! Was getting a bit worried for a moment ;)\r\nMelvin: Bt what do u have in store?\r\nJesse: Well, have u heard about the Refuge?\r\nLee: No. 
What's that?\r\nMelvin: That's the Christmas foundation to help women and children?\r\nMaxine: I think I've heard of them. So what about them?\r\nJesse: That's right! They help women and children who escape from abuse. And every year they post wish lists of such ppl online and I thought that we could choose one and chip in. \r\nMelvin: That's a great idea!\r\nLee: Count me in!\r\nMaxine: Me too.\r\nJesse: Have a look at these 3 lists: \r\nLee: I think the second one would be the easiest to arrange.\r\nMaxine: Agree.\r\nMelvin: What about number 3? A bit ambitious, but if we pull together, we'll manage.\r\nJesse: Actually, I'm in for the 3rd one.\r\nMaxine: I think the 2nd list would be better. The items cos more or less the same and we can easily divide it.\r\nMelvin: But if we agree to chip in the same amount of money, we can deal with the 3rd one easily. \r\nLee: Come to think of it, the 3rd one is not that bad. A bit of planning and logistics and were good to go. \r\nJesse: So it's settled?\r\nMelvin: Yup.\r\nLee: Sure. \r\nMaxine: Fine." + }, + { + "id": "13728966", + "summary": "Mary ran out of money. Carter is going to lend her some in an hour.", + "dialogue": "Mary: hey, im kinda broke, lend me a few box\r\nCarter: okay, give me an hour, im at the train station\r\nMary: cool, thanks" + }, + { + "id": "13828761", + "summary": "Paula helped Charlotte with correct pronunciation of \"Natal lily\".", + "dialogue": "Charlotte: Hello Paula, a funny question: how do you pronounce 'Natal lily', the name of the plant? It refers to the region of ZA and not to the word 'natal' as in 'his natal day', right?\r\nPaula: Hi Charlotte, 'nu tell', 'nu' as in 'number'.\r\nCharlotte: And the stress on the second syllable? 
Or the first?\r\nPaula: 2nd\r\nCharlotte: Thank you dear.\r\nPaula: \r\nCharlotte: Lovely to hear your voice!!\r\nPaula: :$\r\nPaula: \r\nCharlotte: :X" + }, + { + "id": "13728735", + "summary": "Jack and May will drink cocktails later.", + "dialogue": "Jack: Cocktails later?\r\nMay: YES!!!\r\nMay: You read my mind...\r\nJack: Possibly a little tightly strung today?\r\nMay: Sigh... without question.\r\nJack: Thought so.\r\nMay: A little drink will help!\r\nJack: Maybe two!" + }, + { + "id": "13828647", + "summary": "Margaret is suffering from a terrible headache and wants Jack to buy her some painkillers. ", + "dialogue": "Margaret: Honey, buy me some painkiller.\r\nJack: What is going on?\r\nMargaret: Terrible headache!\r\nJack: Maybe you should rest!" + }, + { + "id": "13680875", + "summary": "Serge is on his way to pick up the film equipment for the shooting tonight. Andrei and Serge are late with a large payment to the company. Serge and Andrei will try to use the credit card to pay the company.", + "dialogue": "Andrei: hey, did you pick up the film equipment for tonite's shooting?\r\nSerge: no, im on my way there now.\r\nAndrei: cool. do you happen to have your credit card with you? we have an outstanding bill to pay with the company.\r\nSerge: yeah, i do. not a lot of available credit on it, but we'll see when we get there.\r\nAndrei: OK, thanks. theyll be glad when we pay it. its long overdue.\r\nSerge: ill let you know if it works out. getting of the metro now\r\nAndrei: ok" + }, + { + "id": "13829728", + "summary": "Martina advises against getting a hamster. ", + "dialogue": "Janice: my son has been asking me to get him a hamster for his birthday\r\nJanice: should i?\r\nMartina: NO! NO! NO! NO! NO!\r\nMartina: i got one for my son and it stank up the whole house\r\nMartina: so don't do it!!!" + }, + { + "id": "13864532", + "summary": "Mary has played DA Inquisition. Lucas has played DA II. 
Daniel started playing DA Inquisition.", + "dialogue": "Daniel: have you guys played DA?\nMary: which one?\nDaniel: Inquisition\nMary: damn yes\nMary: love it to bits - team Dorian <3\nLucas: is it any good? I played just DA II\nMary: is it any good?! it's fucking brilliant!!!\nMary: Dan, are you playing?\nDaniel: just started and I'm not sure, trying to get used to the mechanics\nMary: you have to give it a go, I was sceptical at first, but... omg, wait for Dorian\nLucas: Is it better than 2?\nMary: oh yes" + }, + { + "id": "13818537", + "summary": "Judy thinks she's always attracted to bad guys.", + "dialogue": "Judy: Why am I always attracted to jerks??\r\nJanice: It didn’t work out with Andrew?\r\nJudy: He just wanted to fuck me\r\nJudy: When he got what he wanted he stopped calling and texting.\r\nJanice: And Bruce? He’s not a jerk.\r\nJudy: He’s sweet. Maybe too sweet for me…\r\nJudy: He’s a lovely and caring guy but I don’t feel the butterflies… " + }, + { + "id": "13810120", + "summary": "Riley and James watch Chloe on tv undergoing a metamorphosis.", + "dialogue": "Riley: Chloe is on tv!!\r\nJames: on which channel?\r\nJames: never mind i've found it\r\nJames: what is she doing? i don't get it\r\nRiley: this is a programme in which women undergo a complete metamorphosis.\r\nRiley: OMG she looks drop dead gorgeous!" + }, + { + "id": "13731040", + "summary": "Tina will catch the evening flight back home. Ala is on her way to the meeting. 
She will let Tina know how it went.", + "dialogue": "Tina: I'll tell you something, this Emirate staff looks amazing, as movie stars\r\nAla: Oh yes, I know, that's for purpose \r\nAla: It's how it's suppose to be\r\nAla: They pay a lot of attention to the image\r\nTina: Looks nice, pleasure to observe\r\nTina: And I sucked at the airport, they've kept us one hour in the plane and finally I'll catch the evening flight back home\r\nTina: Could You imagine?\r\nTina: And you know, this way we had such a talkative pilot :-)\r\nAla: Oh, poor you\r\nAla: Pfff\r\nAla: And I'm on my way to a meeting\r\nTina: THE meeting?\r\nAla: Yes, keep your finger crossed\r\nTina: Sure, let me know how did it go\r\nAla: Ok darling, in touch" + }, + { + "id": "13728229", + "summary": "Sebastian is very happy with his life, and shares this happiness with Kevin.", + "dialogue": "Sebastian: It's been already a year since we moved here.\r\nSebastian: This is totally the best time of my life.\r\nKevin: Really? \r\nSebastian: Yeah! Totally maaan.\r\nSebastian: During this 1 year I learned more than ever. \r\nSebastian: I learned how to be resourceful, I'm learning responsibility, and I literally have the power to make my dreams come true.\r\nKevin: It's great to hear that.\r\nKevin: It's great that you are satisfied with your decisions.\r\nKevin: And above all it's great to see that you have someone you love by your side :)\r\nSebastian: Exactly!\r\nSebastian: That's another part of my life that is going great.\r\nKevin: I wish I had such a person by my side.\r\nSebastian: Don't worry about it.\r\nSebastian: I have a feeling this day will come shortly.\r\nKevin: Haha. 
I don' think so, but thanks.\r\nSebastian: This one year proved to me that when you want something really badly, you can achieve it.\r\nKevin: I want to win lottery and I never did :D\r\nSebastian: If you devoted your life to analyze all of the winning numbers, and with your math skills you could win.\r\nKevin: Devote myself and million dollars for lottery tickets.\r\nSebastian: Something like that xD\r\nKevin: I'm happy for you man.\r\nKevin: I really am\r\nSebastian: Thanks. It means a lot my friend :)" + }, + { + "id": "13680571", + "summary": "Son is coming to see his parents' this weekend.", + "dialogue": "Frank: Son, will you come home this weekend?\r\nSon: not sure yet. Something happened?\r\nFrank: Of course not. Your mother miss you.\r\nSon: I miss her too.\r\nFrank: So will you com?\r\nSon: I will try.\r\nFrank: Good, I will tell your mother that you will come\r\nSon: oh, dad.. ok I will come." + }, + { + "id": "13829773", + "summary": "Ola is in Cuba and is enjoying her trip. She has problems with connectivity there. Momo has recovered from her injury. Ola doesn't like the clothes in Cuba. Ola will try to find a blouse for mum in Cuba, as Kate suggested.", + "dialogue": "Ola: Hello Kate, sorry for not keeping in touch properly. As expected, we have hardly any connectivity here in Cuba. But we're doing fine and enjoying our trip. How are the things at home?\r\nKate: At long last! Started to worry. Nothing new happening, if you disregard all that Xmas craze. Momo has recovered from her injury and frolicking again.\r\nKate: \r\nKate: Good old Momo! Yes, it is your scarf!\r\nOla: NO!!! It's one of my favorites! The one from Laos!\r\nKate: Too late. Momo thinks it belongs to her now. Get yourself a new one. They surely have nice ones there.\r\nOla: Not at all. Only cheapish cotton blouses with horrible multi-coloured embroidery or some equally horrible crochetted tops. No shawls or scarfs.\r\nOla: \r\nKate: Wait a sec!\r\nKate: \r\nKate: Isn't it similar?! 
Mum would probably like it. Why don't you?\r\nOla: Not a bad idea. But the quality is usually crappy.\r\nKate: And if you go to some boutique shop or something? Not at a market as in your pics?\r\nOla: I might try and find some. Would you like one too?\r\nKate: Not really. And Mum would prefer to be the only one with an authentic Cuban blouse :))\r\nOla: OK I'll have a look. Greets to everyone at home pls.\r\nKate: Take care!" + }, + { + "id": "13829332", + "summary": "Mike will ask Mary for John's new number.", + "dialogue": "Mike: Do u have new John's number?\r\nAnn: No, u should ask Mary.\r\nMike: Ok, thank u :*" + }, + { + "id": "13862428", + "summary": "Joseph has sent Ella a photo of Wujek Janek's twin baby cows. Ella is delighted.", + "dialogue": "Joseph: It's fuzzy but I think you can recognize what's that(^_-)-☆\nJoseph: \nElla: Ooooo\nElla: Baby cows??(/◕ヮ◕)/(/◕ヮ◕)/(/◕ヮ◕)/\nJoseph: Wujek Janek has tween cows:D\nElla: Twins* darling xD\nJoseph: Oh yeah, sorry Twins*\nElla: Good for him!! So cool❤️❤️\nElla: Wanna touch them❤️❤️❤️" + }, + { + "id": "13716025", + "summary": "Josh thinks Stephen accidentally took his notebook. Jack has it and will bring it tomorrow.", + "dialogue": "Josh: Stephen, I think you've accidentaly taken my notebook home\r\nStephen: wait lemme check\r\nStephen: nope, I don't see it anywhere\r\nJack: oh shit, I've got it xDDD I don't even know why\r\nJosh: xDDD ok, no problem, cool I know where it is\r\nJack: I'll bring it tomorow" + }, + { + "id": "13810245", + "summary": "Adele got a new biscuit Labrador Chewy that is 4 months. Her cats keep their distance, and Poppy and Lulu seem to mother Chewy and Speedy wants to play.", + "dialogue": "Lola: hey girlfriend, what's up?\r\nAdele: Oh, hi Lols, not much.\r\nAdele: got a new dog.\r\nLola: another one?\r\nAdele: Yup. a pup biscuit lab. 4 months. Chewy.\r\nLola: how did the others react?\r\nAdele: the cats keep their distance, Poppy and Lulu seem to mother him. 
Speedy wants to play.\r\nLola: no fighting? that's new.\r\nAdele: they say puppies are accepted by other animals more easily than older dogs\r\nLola: especially girl dogs, probably\r\nAdele: with the other ones I had to wean them because I took them in as adult dogs. And girls like to fight. like crazy.\r\nLola: doggies, right/.\r\nAdele: that too :P\r\nLola: haha. true though.\r\nAdele: I know, right. Anyway, called him Bones. He's so plump it kinda fit.\r\nLola: cute. can't wait to see him." + }, + { + "id": "13863202", + "summary": "Kristian and Tabora are playing a game about what they like best.", + "dialogue": "Kristian: Adidas Or Nike?😃\nTabora: Adidas(^v^)\nKristian: Watermelon or orange?😃😃\nTabora: Orange(^v^)\nKristian: Superman or batman?😃😃😃\nTabora: Batman(^v^)\nTabora: What are we doing now?\nKristian: Just playing games😃😃😃😃\nKristian: Wanna keep going?😃\nTabora: Sure! It is kinda fun!(*^0^*)\nKristian: Films or books?😃😃😀😀😀😀😀😀\nTabora: Films!(^v^)\nTabora: When is my turn?" + }, + { + "id": "13829473", + "summary": "Cathy will pick up her glasses tonight at 10.", + "dialogue": "Cathy: Just realized I left my sunglasses at yours\r\nBroke: Yes, they are waiting for you to pick them up\r\nCathy: Might come round at 10 tonight if that's alright\r\nBroke: Yeah okay, see ya" + }, + { + "id": "13819035", + "summary": "Petra is very sleepy at work today, Andy finds the day boring, and Ezgi is working. ", + "dialogue": "Petra: I need to sleep, I can't stand how sleepy I am\r\nAndy: I know, and it's so boring today, nobody's working at the office\r\nEzgi: I am working! 
lazy pigs\r\nPetra: I'm sleeping with my eyes open, kill me\r\nAndy: ask the fat woman from HR\r\nPetra: she would kill me on spot without batting an eye\r\nAndy: she always repeats she has a black belt in karate \r\nPetra: it's hard to believe she can move, but let her have whatever belt she wants\r\nAndy: LOL\r\nPetra: sooooo sleepy" + }, + { + "id": "13728057-1", + "summary": "Nick finds Jane pretty and invites her for a drink to get to know her better. Jane rejects Nick and is unpleasant to him. Nick suggests Jane to forget about their conversation.", + "dialogue": "Nick: You look absolutely gorgeous and have a lovely smile. \r\nNick: Would love to get to know you a bit more. How about we meet up for a drink sometime?\r\nJane: Hmmm... You're shooting a bit above your range aren't you?\r\nNick: Why would you think that hon?\r\nJane: Because I'm not that desperate.\r\nNick: That was a bit below the belt.\r\nNick: You're nice but you're not THAT hot.\r\nJane: Oh is your poor little dick shriveling at the thought?\r\nNick: Actually I'll take it back. Forget about the drink.\r\nNick: Forget I ever wrote to you.\r\nJane: Bye loser!\r\nNick: Fucking bitch!\r\nJane: You're welcome!" + }, + { + "id": "13865165", + "summary": "Julia knew Tim was gay, while Adam and Nate didn't. ", + "dialogue": "Adam: My friend told me he saw Tim with a guy.\nNate: And?\nAdam: \nNate: omg\nJulia: Yeah, what a shocker\nAdam: ??? You knew?!\nJulia: I thought everyone knew\nNate: I had no idea\nNate: Did he tell you anything?\nJulia: That he’s gay? God no\nAdam: Why didn’t you tell us?\nJulia: First: I assumed you knew\nJulia: Second: Why would I? it’s not my business\nNate: I think he should’ve told us ;/ not cool\nAdam: yeah, I made a completely idiot out of myself defending him\nJulia: Against whom? I’m not surprised he didn’t tell you\nNate: Well, it’s not fair, we’re his friends\nJulia: And? Does it change anything?" + }, + { + "id": "13817042", + "summary": "Lilly will be late. 
Gabriel will order pasta with salmon and basil for her.", + "dialogue": "Lilly: sorry, I'm gonna be late\r\nLilly: don't wait for me and order the food\r\nGabriel: no problem, shall we also order something for you?\r\nGabriel: so that you get it as soon as you get to us?\r\nLilly: good idea!\r\nLilly: pasta with salmon and basil is always very tasty there" + }, + { + "id": "13730929", + "summary": "Celine is not at home, but she will call Cara before visiting her.", + "dialogue": "Cara: hey\r\nCara: are you at home\r\nCeline: hey Cara\r\nCeline: No i'm not\r\nCara: okay then, i just wanted to pass by\r\nCeline: im sorry, i can drop by in the evening if you dont mind\r\nCara: its fine, call me then if you decide to come\r\nCeline: ok" + }, + { + "id": "13863033", + "summary": "Derek will be at Craig's in 20 minutes to help him with his malfunctioning computer.", + "dialogue": "Craig: Man, u there?\nDerek: Yeah, tell me\nCraig: I need help with my computer\nDerek: What happened?\nCraig: I don't know exactly but it's not working\nDerek: Well, ok... give me 20 minutes, got to get to my car\nCraig: Ok, thanks\nDerek: No prob" + }, + { + "id": "13829923", + "summary": "Abigail is not going to take a stroll with the little ones. Her smog alert app is showing that the norms have been exceeded by 30% today.", + "dialogue": "Emma: Hi neighbour :)\r\nEmma: Do you want to take a stroll with the little ones?\r\nAbigail: Hey Emma :) I don't think that's a good idea.\r\nAbigail: My smog alert app is showing that the norms have been exceeded by 30% today :O\r\nEmma: Oh my, that sounds serious.\r\nEmma: I need to install that app." + }, + { + "id": "13729905", + "summary": "Paul will buy red roses following Cindy's advice.", + "dialogue": "Paul: What color flowers should I get\r\nCindy: any just not yellow\r\nPaul: ok, pink?\r\nCindy: no maybe red\r\nPaul: just tell me what color and what type ok?\r\nCindy: ugh, red roses! 
" + }, + { + "id": "13829307", + "summary": "Jenny has left her credit car at the Mary's shop.", + "dialogue": "Mary: Hello, I think you've left your credit card at our shop\r\nJenny: Thank you for getting in touch! Thank you so much!\r\nMary: No worries :)\r\nJenny: When can I pick it up?\r\nMary: Whenever you come, it's safe with one of our cashiers :)" + }, + { + "id": "13611548", + "summary": "It's Tom's birthday. Lara and Gary will come to Tom's place about 5 pm to prepare everything before Tom gets back home at 5:30. Gary has already paid for the cake - Lara will pick it up and she will also get the balloons. ", + "dialogue": "Gary: Hey, don't forget about Tom's bday party!\r\nLara: I won't! What time should I show up?\r\nGary: Around 5 pm. He's supposed to be back home at 5:30, so we'll have just enough time to prep things up.\r\nLara: You're such a great boyfriend. He will be so happy!\r\nGary: Yep, I am :)\r\nLara: So I'll just pick up the cake and get the balloons...\r\nGary: Thanks, you're so helpful. I've already paid for the cake.\r\nLara: No problem, see you at 5 pm!\r\nGary: See you!" + }, + { + "id": "13680802", + "summary": "Paul is late for a meeting with Laura and she refuses to wait any longer.", + "dialogue": "Laura: Where are you?\r\nPaul: Almost there.\r\nLaura: Which is?\r\nPaul: Close to the Mac.\r\nLaura: That's so far away!\r\nPaul: 15 mins\r\nLaura: I am not waiting any more, see you some other time.\r\nPaul: Please, wait!\r\nLaura: I've waited 30 minutes, 15 minutes ago you wrote you were almost here. This is too much.\r\nPaul: I am so sorry.\r\nLaura: I am not. " + }, + { + "id": "13818220", + "summary": "Salma and Hugh like cat memes.", + "dialogue": "Salma: \r\nSalma: the latest cat meme\r\nHugh: oh sweet, I can never get enough of those lol\r\nHugh: \r\nSalma: hahaha same" + }, + { + "id": "13819911", + "summary": "Todays results show that Matt and Oliver got into Stanford University, Peter did not. 
", + "dialogue": "Matt: results should be announced soon\r\nMatt: probably today \r\nOliver: they posted it\r\nOliver: \r\nPeter: I didn't get into Stanford :(\r\nMatt: let me see\r\nMatt: yup, I did\r\nOliver: me too\r\nOliver: barely\r\nPeter: I'm happy for you guys\r\nMatt: chin up! there are many other options\r\nOliver: exactly, don't give up\r\nPeter: thanks guys, that means a lot\r\nPeter: send your documents asap\r\nPeter: otherwise you'll stuck in the queue\r\nMatt: thanks for a heads-up\r\nOliver: yea, we owe you one\r\nPeter: I have to look for other university\r\nPeter: see ya\r\nMatt: bye" + }, + { + "id": "13864554", + "summary": "English classes were cancelled because Smith called in sick and replacement could not be found.", + "dialogue": "Jake: we don't have english today\nNadia: whooooah!\nVanessa: cool! you sure though?\nJake: Smith called in sick, they couldn't find a replacement, oh yeah" + }, + { + "id": "13715960", + "summary": "Brandon has lost his credit card and blocked it in the bank. It will take some time before he gets a new one, and he needs money. Luke is broke, but Ian will lend Brandon some money.", + "dialogue": "Brandon: Shit, I've lost my credit card!\r\nBrandon: I blocked it in the bank but it will take time before I get a new one\r\nBrandon: can you lend me $ 100 guys?\r\nLuke: sorry man, Im broke:/\r\nBrandon: I see\r\nIan: what a misfortune, dude\r\nIan: I can lend you $$, no problem\r\nBrandon: thanks, dude!" + }, + { + "id": "13716051", + "summary": "Patrycja and Inez enjoyed the Italian evening. Gosia chose a great place. Alicja hasn't been to the new restaurant. 
They all want a Korean evening on Wednesday in two weeks time.", + "dialogue": "Inez: My dears, our evening inspired me to create this group conversation to plan further Food Evenings :)\r\nInez: This is my proposal for the next one: \r\nAlicja: Wow, I will actually feel happy going to work thanks to this :D\r\nGosia: Happy going to work and even happier leaving it haha\r\nAlicja: Just too bad that the time between 9 and 17 will be wasted :P\r\nPatrycja: I really liked our evening, even the pizza was delicious :) How did you girls like it?\r\nInez: I loved it, Gosia really chose a great place :)\r\nGosia: I am an expert at eating :D\r\nAlicja: Have you been to this restaurant Inez sent?\r\nGosia: No, it is quite new. But I heard good opinions! :D\r\nPatrycja: And it fits with our theme of exploring world cuisines :)\r\nAlicja: First Italy, now Korea! :D\r\nInez: So when could we do this again?\r\nGosia: I am not sure, but we have to before the holidays!\r\nPatrycja: We should have less work in two weeks, we could go again on Wednesday?\r\nAlicja: Sounds good to me!\r\nInez: Me too :D" + }, + { + "id": "13680457", + "summary": "Ana wants to visit grandma tomorrow. Catherine will go with her. She will call Anna when she wakes up.", + "dialogue": "Ana: You sleeping?\r\nCatherine: Not yet.\r\nAna: Wanna go visit grandma tomorrow? I miss her.\r\nCatherine: Yeah that would be nice :) I'll call you when I wake up\r\nAna: Oki :) sleep well, good night.\r\nCatherine: Good night, u too." + }, + { + "id": "13821684", + "summary": "Edson is booking his ticket now.", + "dialogue": "Joyce: Check this out!\r\nJoyce: \r\nMichael: That's cheap!\r\nEdson: No way! I'm booking my ticket now!! " + }, + { + "id": "13817976", + "summary": "Jane wants to leave at 4.30 instead of 5 because Google Maps suggests the 300 km drive should take them at least 3 hours and she doesn't want to be late. 
She will wait for Steven at the main entrance.", + "dialogue": "Jane: google maps says it is at least 3h \r\nSteven: I used to make it in 2, trust me :D\r\nJane: but it's almost 300km..\r\nSteven: the road is new , we will make it ^^\r\nJane: I don't want to stress out , let's meet at 4:30 instead of 5, ok?\r\nSteven: ok, if u reaaly want, we can meet at 4:30\r\nJane: thx! \r\nJane: I will wait at the main entrance or where?\r\nSteven: main entrance is good for me;-) \r\nSteven: cu" + }, + { + "id": "13612149-1", + "summary": "Suzanne is at work and is having a break now. Morgan invites Suzanne to a concert of Maroon 5 which takes place next week at the Hulu Theatre at Madison Square Garden. Suzzanne agrees.", + "dialogue": "Morgan: Hey gorgeous, how’s your day?\r\nSuzanne: Nothing special, it’s just one of many boring days at work. But… better now though!\r\nMorgan: Are you working at all? 😉\r\nSuzanne: I’m trying 😉 But you aren’t helping me, at all\r\nSuzanne: I’m just taking a well-deserved break 😉\r\nMorgan: I miss you Suzie\r\nSuzanne: I miss you too Morgan\r\nMorgan: Do you feel like going to a concert next week? Maroon 5 is playing at the Hulu Theater at Madison Square Garden.\r\nMorgan: As it happens, I’ve got two tickets 😉\r\nMorgan: Do you want to go?\r\nSuzanne: Really? OMG! That’s wonderful!\r\nSuzanne: Thank you sweetheart!\r\nMorgan: Oh, nothing. I just want you to be happy 😉" + }, + { + "id": "13818620", + "summary": "The last one Julia read was Die again from 2014. There's going to be a meeting with Tess organized by the city library. ", + "dialogue": "Liam: \r\nLiam: have you read this one?\r\nJulia: wow, I didn't even know that it existed\r\nJulia: thanks! The last one I read was Die again\r\nLiam: this one is like from 2014?\r\nJulia: yep\r\nLiam: I've heard that city library is organizing a meeting with Tess\r\nJulia: really? 
That would be really awesome\r\nLiam: just check it out on their facebook" + }, + { + "id": "13810034", + "summary": "Ali left his wallet at Mohammad's place. Mohammad'll bring it to uni tomorrow.", + "dialogue": "Ali: I think I left my wallet at your place yesterday. Could you check? \r\nMohammad: Give me a sec, I'll have a look around my room.\r\nAli: OK.\r\nMohammad: Found it!\r\nAli: Phew, I don't know what I'd do if it wasn't there. Can you bring it to uni tomorrow?\r\nMohammad: Sure thing." + }, + { + "id": "13680607", + "summary": "Linda and Laura are going to an Italian restaurant tomorrow.", + "dialogue": "Laura: where will we go tommorow?\r\nLinda: maybe this italian restaurant?\r\nLaura: hmm ok\r\nLinda: i love their pasta\r\nLinda: and brownie :)\r\nLaura: i thouht you dont eat sweets\r\nLinda: this brownie is not so sweet" + }, + { + "id": "13821155", + "summary": "Anna organises a birthday's party on the 6th of November at 19:30.", + "dialogue": "Natalie: Well well weeeeeell, I see somethings going on here at last\r\nMartin: (Y)\r\nAdam: any confirmed data?\r\nAnna: Hello everyone!!! Id love to invite everybody to my bday. I would be extremaly happy if you could come 6th of November at 1930\r\nMartin: <3\r\nMargot: <3\r\nMia: (Y)" + }, + { + "id": "13681586", + "summary": "Mia is going out after work tonight with her female friends. If she wants, Elliot will come and pick her up.", + "dialogue": "Mia: babe, do you mind if i go out after work tonight?\r\nElliot: yeah, sure. go for it ;) \r\nMia: thanks, babe. it's gonna be girls night out \r\nElliot: what're u up to?\r\nMia: well, you know we gonna grab some drinks have some nice time outside of work\r\nElliot: don't be too late please and if you need me to pick u up just give me a ring " + }, + { + "id": "13681541", + "summary": "Jayden explains Brennan why she doesn't want to be pregnant.", + "dialogue": "Jayden: But I don't need kids. Kids means over. 
At least for a woman\r\nBrennan: Over what ?\r\nJayden: The end of normal life. Being pregnant, suffering because of this etc\r\nBrennan: Hmm so I need to look for another mother to my kids then. Haha\r\nJayden: Being obligated to be with the. 24h. Men have only sex and they wait for kids while women suffer\r\nBrennan: I don't agree...\r\nJayden: I wish I could do the same. Then probably i would say the same like u.\r\nBrennan: Guys like me would be there through it all to reduce the suffering\r\nJayden: Physical suffering. No one can do anything with this. I wish I could just have sex and wait for a baby while having a normal life. Not getting fat, having the same body, the same breast and not disgusting ... Not feeling sick, not having pain, being able to do every day stuff even like walking...\r\nBrennan: It's gonna happen eventually\r\nJayden: I was I'm a store, behind me there was a pregnant woman, she dropped some money and she couldn't even take them from the floor... I had to help her\r\nBrennan: That's because she's about to give birth\r\nJayden: I hope that maybe soon they will be possible to have a child without being pregnant. Yes! And she's suffering\r\nBrennan: Any I'm sorry for feeding you with my bullshit\r\nJayden: While a man is doing his normal stuff. U mean the conversation?\r\nBrennan: I hope you find a guy that can give you the sex you want and not get pregnant\r\nJayden: Would be awesome\r\nBrennan: I'm gonna go to sleep now. Good night\r\nJayden: I said I don't want to have any children now! Maybe in the future when I have a good job, I'm financially independent. Good night" + }, + { + "id": "13680611", + "summary": "USA won last night. England is playing against Croatia tomorrow at 2.", + "dialogue": "Steve: BTW, USA won last night!\r\nGulab: I forgot to check!\r\nSteve: England playing tomorrow at 2:00!\r\nGulab: That's right, Croatia?\r\nSteve: Yep." + }, + { + "id": "13729722", + "summary": "John forgot his wallet. 
He wants Ela to give it to him outside in 10 minutes.", + "dialogue": "John: Ela i am coming in 10 mins please give me my walle outside t i forgot it\r\nEla: yes just saw it when you are here call me\r\nJohn: but your phone is busy thats why i messaged keep it free please i am getting late\r\nEla: oh yes was talking to mom ... its free now\r\nJohn: ok" + }, + { + "id": "13730529", + "summary": "Mark told his sister that Mary is doing an online job. Mark's sister is contacting people to confirm it. Mark thinks she's jealous. Mary hates Mark's sister.", + "dialogue": "Mary: Did you tell your sister I am doing online job?\r\nMark: yes !\r\nMary: why\r\nMark: because she keep saying your good for nothing?\r\nMary: dint I tell you I don’t care?\r\nMark: what happened?\r\nMary: see I don’t want to prove anything to anyone..\r\nMark: I know… but I was just feeling proud so it was kind of show off…\r\nMary: she is asking everyone… and trying to get to the people I am working for\r\nMark: really!! I am sorry for that…\r\nMary: don’t be! I understand your feelings… but u know how she is…\r\nMark: I know!! :? \r\nMary: don’t be sad now its ok.. she cant do much about it… chill its ok but just be careful\r\nMark: I will be ..\r\nMary: btw it feels good that she is so jealous :P\r\nMark: lol my aim was to make her feel jealous\r\nMary: but i dont like it that she tries to contact the people i am working for ... what does she want?\r\nMark: may be she wants to confirm if its true... because its not easy to digest that your working from home and earning well!!!\r\nMary: whatever i just hate her\r\nMark: chill now .... :) love you\r\nMary: i am chilled :cool: ... love you too honey" + }, + { + "id": "13813575", + "summary": "Fiona wants to prepare dinner for Chris. She is thinking of Tina's tart. 
She will help her make it.", + "dialogue": "Fiona: Are you free?\r\nTina: Yes, what's up?\r\nFiona: I'm trying to prepare a nice dinner for Chris and I thought that maybe I could prepare this tart of yours :)\r\nTina: I'm flattered!\r\nFiona: Well, it IS delicious :) Could help me do it?\r\nTina: Sure! It's not difficult. Do you have anything ready?\r\nFiona: I must admit I bought the crust...\r\nTina: Oh, ok :P Pity, but well, it's too late now.\r\nFiona: I tried making the filling once, but I finished with lemony scrambled eggs...\r\nTina: It happens, don't worry. The thing is that once you start adding eggs you can't stop mixing it, otherwise you'll end up with scrambled eggs." + }, + { + "id": "13865384", + "summary": "Annette is sick. James is going to the Jesus bar. Oli couldn't find anyone near the bar.", + "dialogue": "Peadar: So who's coming for a drink later then??!\nClare: Oh lols soz I'm not in town\nAnnette: A drink?!!?!!\nPeadar: A drink!!!\nAnnette: I’m home sick soz huns\nAnnette: Got lung lurgy\nOli: Are people at Jesus bar now?\nJames: Heading to Jesus bar now! Come join :) turn left as you come into college\nAnne: Woops.. . Sorry guys I double booked myself!\nHelen: Yes! Jesus bar, from around 9.15 ish\nPeadar: Was that just a question oli😋 or did you mean... I'll join ye😁\nOli: I cycled and ran around the bar, but couldn't find anyone 😢\nClare: You cycled around the bar?\nOli: To the bar 😅\nHelen: 🙈😍😇" + }, + { + "id": "13728017", + "summary": "Ryan and Sebastian are worried about the political situation in Ukraine.", + "dialogue": "Ryan: I have a bad feeling about this\r\nRyan: \r\nSebastian: Ukraine...\r\nSebastian: This russian circus will never end...\r\nRyan: I hope the leaders of of nations will react somehow to this shit.\r\nSebastian: I hope so too :(" + }, + { + "id": "13729058", + "summary": "Finn and Zadie are going to Elephant and Castle tomorrow at 2. 
They will meet at the main entrance.", + "dialogue": "Finn: Hey\r\nZadie: Hi there! What's up?\r\nFinn: All fine. You? \r\nZadie: Not bad, thanks \r\nFinn: Look, I was thinking of going to this neighborhood called Elephant and Castle tomorrow, it's apparently full of Latin American stuff. Fancy joining? \r\nZadie: Sure! But what's \"stuff\"? 😂\r\nFinn: lol So apparently it's a place were random people from \"Latin America\" (meaning fuck knows which countries) started running small businesses and restaurant, and a nice little community was formed\r\nZadie: Oh cool \r\nFinn: Then capitalism came and it's all going to be demolished soon, so it's like the last chance to go\r\nZadie: What a shame :( Yeah, I haven't had Latin American 😂 food for ages so I'm totally up for it\r\nFinn: Can't wait to taste this cuisine of unspecified Latino origin lol\r\nZadie: 😂😂😂\r\nFinn: But we can specify time and place if and only if you wish \r\nZadie: I might be tempted to lol I'd say early evening, 2-ish?\r\nFinn: Yeah, that's fine by me. So most of the places we want to visit are in this Elephant and Castle shopping centre. Shall I see you at the main entrance, wherever that is \r\nZadie: 2 o'clock at unspecified main entrance then? Sounds good to mw\r\nFinn: Yer\r\nZadie: Cool, see you there! And thanks so much for remembering about me 💜💜💜\r\nFinn: Thanks for saying yes to such an ill-defined plan lmao \r\nZadie: Ha ha You know I love those\r\nFinn: See you tomorrow then \r\nZadie: Yep Call me if you get lost\r\nFinn: I will I will 🤙 byeeee \r\nZadie: Toodles" + }, + { + "id": "13611803", + "summary": "Jacob hasn't listened to the song Harry sent him 3 days ago. Jacob will do it later tonight and let Harry know what he thinks.", + "dialogue": "Harry: and? 
have you listened to it?\r\nJacob: listened to what?\r\nHarry: to the song i sent you 3 days ago -.-\r\nJacob: oh shit, i completely forgot...\r\nHarry: ofc again\r\nJacob: don't be like this :* i'll do that later tonight\r\nHarry: heh, okay\r\nHarry: i'm really curious what you'll think about it\r\nJacob: i'll let you know, a bit busy right now, speak to you later!\r\nHarry: okay" + }, + { + "id": "13729817", + "summary": "Ray is locked in the room from the outside and he has to pee. Max's roommate will come and let him out.", + "dialogue": "Ray: u in ur room?\r\nMax: no whats up\r\nRay: someone locked the door from outside -_-\r\nMax: wtf xD\r\nRay: yeah dude cmon u gotta let me out\r\nMax: but im out\r\nRay: are u kidding me\r\nMax: sorry man XD\r\nRay: dude i have to pee\r\nMax: HAHAHAHHAHAHA XD\r\nRay: thats not funny >.<\r\nMax: it actually is xD\r\nRay: can u ask someone else to do it\r\nMax: yea let me see if my roommates there\r\nRay: HURRY\r\nMax: hes coming\r\nRay: tell him to HURRYYY\r\nMax: hes on his way \r\nRay: he opened it, thanks\r\nMax: enjoy XD" + }, + { + "id": "13728051", + "summary": "Kim is about to tell mom that Harry bought a new sofa, and he needs grey pillows.", + "dialogue": "Kim: What kind of gift would you like to get?\r\nKim: Mom's asking.\r\nHarry: Haha. No need for a gift for me :D\r\nHarry: But you can tell your mom I just bought a new sofa and I need pillows.\r\nHarry: If she asks for the colour, tell her that grey is the best :D\r\nKim: Sure! Thanks for info :) " + }, + { + "id": "13810694", + "summary": "Josh wants to buy a tablet and doesn't know which brand he should choose. According to Brian, other brands are better than Apple and he can get a Samsung tablet cheaper. 
Josh will call Brian after work to talk about it.", + "dialogue": "Josh: I need to buy an iPad?\r\nJosh: do u think apple it's a good choice?\r\nBrian: Nope, u gonna pay to have the sign of apple on yr fucking iPad.\r\nJosh: so what would u recommend?\r\nBrian: u mean brands?\r\nBrian: Samsung, Xiaomi, Sony\r\nBrian: depends on yr budget..\r\nJosh: let's say 2000..\r\nBrian: what about this ? \r\nBrian: or this \r\nBrian: actually, if u want samsung I can get it cheaper for u\r\nJosh: oh, good\r\nBrian: call me after work, ok?\r\nJosh: ok" + }, + { + "id": "13820547", + "summary": "Nathalie, Olafur and Zoe are planning the New Year's Eve. Nathalie wants something classy. Olafur doesn't like opera. They want to go to the Breakfast at Tiffany's party in Soho.", + "dialogue": "Olafur: are we doing anything for New Year's Eve?\r\nNathalie: I was thinking about something classy, like opera or sth like that\r\nZoe: how much does it cost?\r\nOlafur: opera is not for me\r\nNathalie: so what do you propose?\r\nNathalie: it's 100$ \r\nOlafur: I was thinking about partying somewhere\r\nNathalie: partying sounds fun, as long as it will be classy\r\nZoe: \r\nZoe: Breakfast at Tiffany's party sounds classy\r\nOlafur: \r\nOlafur: is it classy enough?\r\nNathalie: :O\r\nNathalie: this club is AMAZING\r\nZoe: whoa\r\nNathalie: we'll going to Soho then\r\nOlafur: we just need to hurry up and buy some tickets soon\r\nZoe: sure" + }, + { + "id": "13731456", + "summary": "Frank tries to encourage Andy to learn for the tomorrow's quiz.", + "dialogue": "Frank: wat are u doing??\r\nAndy: watching Arrow B)\r\nFrank: dont u have a quiz tomorrow :/\r\nAndy: yeah, so? 
:3\r\nFrank: so go study for it \r\nAndy: its a small quiz\r\nFrank: so it doesnot matter??\r\nAndy: it does, but ..\r\nFrank: but??\r\nAndy: i'll study for it tomorrow\r\nFrank: yea like ur gonna wake up on time for that -_-\r\nAndy: dude your not my dad\r\nFrank: -_-" + }, + { + "id": "13728313", + "summary": "Kim needs to get some fresh fish in Warsaw. Kim will ask at restaurants about their source. Kim is planning something special but won't share any details with Margot for now.", + "dialogue": "Kim: Hey, listen, have you got any idea where I can get some fresh ramal fish in Warsaw?\r\nMargot: Hi, I don't know about \"fresh\", but you could try the marketplace at Polna street. \r\nMargot: Other than that, I haven't got a clue...\r\nKim: Already tried Polna :(\r\nMargot: Ah!\r\nMargot: Hm...\r\nMargot: And did you check Internet for fish market in Warsaw?\r\nKim: Yeah, there's none.\r\nKim: Or I didn't find it. \r\nMargot: You could have a look at restaurants' menus. And if you find it there, you could ask about their source. \r\nMargot: Although I'm not sure they'd eager to give you such infromation. But you can always try..\r\nKim: Great idea, I haven't though about this! Thx!\r\nKim: I'll try. After all, I have nothing to loose. \r\nMargot: Btw, is there a special occasion you want to cook it for?\r\nKim: Yes! Acctually I'll grill it ;)\r\nKim: And it's a surprise, so I'm not giving away any details!\r\nMargot: I'm so curious!\r\nMargot: I understant it's sth good, so I'm happy for you girl :)\r\nKim: Indeed, it's good. But you won't get any more infromation from me now.\r\nKim: All in good time :)\r\nMargot: Cruel you! Find that fish!!! ;)\r\nKim: :D" + }, + { + "id": "13730868", + "summary": "Tom will meet Ben in the Oval Room at 2pm and tells him to bring the papers. ", + "dialogue": "Tom: Ben. We've decided. 2pm in the Oval Room.\r\nBen: Ok, I'll be there\r\nTom: Take all your papers, it's going to be a fight! 
And remember: take no prisoners, shoot to kill!\r\nBen: hahaha, we have to win this battle.\r\nTom: We will, the justice is on our side." + }, + { + "id": "13862919", + "summary": "Ashleigh got the job.", + "dialogue": "Ashleigh: Looks like we're going to the cinema!! \nAshleigh: \nPeter: You got the job??\nAshleigh: I got hte job! :D\nPeter: \nAshleigh: " + }, + { + "id": "13682034", + "summary": "Danna has a boring weekend and is watching TV. Reed is in bed. He has a free day tomorrow.", + "dialogue": "Danna: How's your Saturday?\r\nReed: It was alright thanks\r\nDanna: Good\r\nReed: Yours ?\r\nDanna: Boring\r\nReed: Why?\r\nDanna: I'm angry I called maybe 5-8 of my friends and they aren't around or are busy.\r\nReed: Shame\r\nDanna: So it's is the next boring weekend for me\r\nReed: That sucks\r\nDanna: The only thing I can do is watching TV -.-\r\nReed: Haha lucky you\r\nDanna: Yeah haha\r\nReed: I don't have tv, our subscription expired and they never renewed it. They want us to pay for it so fuck it\r\nDanna: Yeah. What are you doing?\r\nReed: I'm in bed\r\nDanna: Work tomorrow?\r\nReed: No. Off tomorrow\r\nDanna: Nice\r\nReed: Indeed" + }, + { + "id": "13814407", + "summary": "Alivia has been taciturn lately. She was trying to write her thesis. She can't focus on writing. She'll try to follow Antonio's advice to start writing without overthinking.", + "dialogue": "Antonio: Is everything okay? 
You've been quiet lately\r\nAlivia: Oh, hi, yeah, I've just been working on my thesis\r\nAlivia: Or rather trying to work, it's not going too well\r\nAntonio: Oh :( Problems finding research materials?\r\nAlivia: Well\r\nAlivia: That isn't really as big a problem, the worst part is actually sitting down and writing\r\nAlivia: I find the topic interesting and all, I don't mind reading articles and books\r\nAlivia: But when I'm supposed to write, it's like I blank out and can't type a single word w/o thinking I sound stupid...\r\nAntonio: I know the feeling...\r\nAntonio: You should probably stop thinking about it so seriously, just write and you can edit it later\r\nAntonio: Once you get past the initial difficulty, it'll get better, at least that's what it was like for me\r\nAlivia: I'd like to think so... Thanks... I'll try. And thanks for your concern <3" + }, + { + "id": "13728444-1", + "summary": "Maddie will buy a white bread and apples on John's request.", + "dialogue": "Maddie: I'm in Asda, do you need anything?\r\nJohn: could do with a white bread and some apples \r\nMaddie: ok. Gala?\r\nJohn: yes please ta" + }, + { + "id": "13829161", + "summary": "Elliot can't talk to Jordan now, he's busy. He'll call him back at 8 pm. Jordan is going to Brad's funeral. He had liver cancer.", + "dialogue": "Elliot: i can't talk rn, i'm rly busy\r\nElliot: can i call u back in about 2 hours?\r\nJordan: Not really, I'm going to a funeral.\r\nJordan: I'll call you tonight, ok?\r\nElliot: sure\r\nElliot: whose funeral is it?\r\nJordan: My colleague's, Brad.\r\nJordan: I told you about him, he had a liver cancer.\r\nElliot: i'm so sorry man, i hope u're ok\r\nElliot: i'll call u at 8 pm" + }, + { + "id": "13680110", + "summary": "Flo cannot get an appointment at the salon until the 6th. Flo worries she's going to be gray. Flo will have to get a touch-up kit at Tesco.", + "dialogue": "Flo: OMG, I can't get into the salon until the 6th!\r\nGina: What? 
Why?\r\nFlo: They're just too busy. I'm going to be gray! LOL!\r\nGina: Get you a touch-up kit at Tesco!\r\nFlo: Gonna have to!" + }, + { + "id": "13862785", + "summary": "Rob is doing shopping at the grocery store. Ann ordered him to buy a cucumber, some tomatoes, bananas and apples.", + "dialogue": "Rob: hey, pick up your phone :)\nAnn: can't - meeting :)\nRob: sorry...\nAnn: no problem - super boring one :) \nAnn: what you need babe?\nRob: I'm at the grocery store and was wondering if we need anything\nAnn: some food :)\nRob: yeah, I figured that smartass :)\nAnn: :*\nRob: details? so that you won't moan we don't have anything to eat :)\nAnn: from what I remember we have everything for supper and lunch tomorrow, maybe some fruit and vegetables?\nRob: anything in particular?\nAnn: cucumber, tomatoes, bananas, apples and whatever you like\nRob: ok" + }, + { + "id": "13681246", + "summary": "It's been very long since Melany last had sex. Marvin made an inappropriate joke about it.", + "dialogue": "Marvin: When's the last time you got laid ?\r\nMelany: I don't even remember..\r\nMarvin: Hmm so there must be lots of cobwebs between your legs now huh hahaha" + }, + { + "id": "13715777", + "summary": "Eric, Samantha and Noah's professor is commenting a recent scandal on the news. ", + "dialogue": "Eric: , check it out :D\r\nSamantha: HAHA, what is our favorite professor?\r\nEric: Talking about this recent scandal on the news :P\r\nNoah: \"I am the smartest person alive, I knew this will happen\" :D\r\nSamantha: Hahaha, now I don't even need to open the video" + }, + { + "id": "13729335", + "summary": "According to Jacky, David did the right thing taking the blame. They will talk when Jack comes back home.", + "dialogue": "Jacky: I think you were right yesterday. \r\nDavid: What about? I'm right about most things :P\r\nJacky: Yeah, whole you ;)\r\nJacky: About taking the blame etc. \r\nDavid: Okey, I remeber. We'll talk later?\r\nJacky: With pleasure. 
I'll call you when I get home." + }, + { + "id": "13827914", + "summary": "Rick and Helen are in Cancun. They're flying to Havana in two days. Chris and Rick will talk on Skype at 3 PM in Mexico.", + "dialogue": "Chris: Hi there! Where are you? Any chance of skyping?\r\nRick: Hi! Our last two days in Cancun before flying to Havana. Yeah, skyping is an idea. When would it suit you?\r\nRick: We don't have the best of connections in the room but I can get you pretty well in the lobby.\r\nChris: What's the time in your place now?\r\nRick: 6:45 pm\r\nChris: It's a quarter to one in the morning here. Am still in front of the box.\r\nRick: Gracious me! Sorry mate. You needn't have answered.\r\nChris: 8-D\r\nRick: Just tell me when we could skype.\r\nChris: Preferably in the evening. Just a few hours earlier than now. And not tomorrow.\r\nRick: Shute! Only tomorrow makes sense as there's no workable internet in Cuba.\r\nChris: Could you make it like 3 pm your time?\r\nRick: Sure.\r\nChris: Perfect. So talk to you tomorrow.\r\nChris: Give my love to Helen please.\r\nRick: I will. Thx." + }, + { + "id": "13864619", + "summary": "Ying sent a photo of his 10 years challenge to Helen, Norma and Zazu.", + "dialogue": "Ying: \nYing: my 10 years challenge everyone\nHelen: wow\nNorma: :O\nZazu: dude...\nZazu: I'm impressed" + }, + { + "id": "13612174", + "summary": "Daniel will see Missy after 6 for drinks.", + "dialogue": "Daniel: Yo, at what time do you get out of work?\r\nMissy: At 6.\r\nDaniel: Drinks after dinner?\r\nMissy: Totally!\r\nDaniel: Cool." + }, + { + "id": "13682284", + "summary": "Adelle has to clean the hamster cage after school.", + "dialogue": "Pete: Did you clean the hamster cage?\r\nAdelle: No. Is it my turn?\r\nPete: Yes. After school, no excuses.\r\nAdelle: Fine." 
+ }, + { + "id": "13862773", + "summary": "Maya will buy 5 packs of earplugs for Randolph at the pharmacy.", + "dialogue": "Randolph: Honey\nRandolph: Are you still in the pharmacy?\nMaya: Yes\nRandolph: Buy me some earplugs please\nMaya: How many pairs?\nRandolph: 4 or 5 packs\nMaya: I'll get you 5\nRandolph: Thanks darling" + }, + { + "id": "13729857", + "summary": "David is coming home for Christmas next week. Jane has no idea what to buy their father so David is going to order an ipad online.", + "dialogue": "Jane: hey david, you're coming home for christmas next week right?\r\nDavid: of course\r\nJane: good\r\nJane: do you know what your dad would like for christmas?\r\nJane: i can't think of anything\r\nDavid: you should get him an ipad\r\nDavid: he can read books, email, watch movies, play games\r\nJane: ok, that sounds good, where can i get one?\r\nDavid: i'll order it online and have it shipped home\r\nJane: thanks for your help\r\nJane: and please let me know when you'll get here once your travel arrangements are set." + }, + { + "id": "13731140-1", + "summary": "Isabella is grateful to Betty for sharing the information about her work yesterday. Isabella offers Betty her company, should Betty want to do something together.", + "dialogue": "Isabella: Hi Betty!\r\nIsabella: It was very nice to listen about your work yesterday. Thank you for sharing that!\r\nIsabella: If you wanted to do sth together, let me know. \r\nBetty: Thank you! " + }, + { + "id": "13680398", + "summary": "Hollie says hello to Amy, but Amy is busy working and can't chat right now.", + "dialogue": "Hollie: How are you?\r\nAmy: hey\r\nAmy: i'll get back to you later, working now\r\nHollie: Ok." + }, + { + "id": "13730772", + "summary": "Alex will go swimming toghether with Huda in two hours. ", + "dialogue": "Huda: going swimming wanna join?\r\nAlex: sure.. 
what time?\r\nHuda: in 2 hours \r\nAlex: ok i will be there then\r\nHuda: see ya" + }, + { + "id": "13819021", + "summary": "Janet, Nicole, Alison, Arlene, Leslie, Ros, Eric and Sue are all complaining about Donald Trump and his absence at the ceremony. ", + "dialogue": "Janet: I am ashamed. Who voted for this pussy? It's your fault.\r\nAlison: Remember the Wizard of Oz? He might have melted.\r\nNicole: He’s a sissy boy.\r\nCheryl: RAIN omfg thats so shameful and disrespectful 😡\r\nBuff: Pussy in Chief.\r\nLinda: Trump is selfish and inconsiderate.\r\nJanet: What an embarrassment to our nation and the world!!!\r\nRoz: Where is Elsie? I miss you vomit 😊\r\nCheryl: WTF EVER trumpola didnt want to mess up his pity full comb over.....\r\nJanet: Trump = snowflake\r\nLinda: Baby. He’s a spoiled brat baby. Nothing about Donald Trump to be proud of or want to defend.\r\nArlene: HIs hair and makeup would have been ruined!\r\nRoz: Exactly 😊\r\nLeslie: Which adviser, who he doesn't listen to anyway, thought missing this ceremony was a good idea? Shameful!\r\nEric: What a pussy. We should grab him and kick him to the curb.\r\nSue: All the other leaders managed to make it, so there is no excuse, for me.\r\nRoz: It's all about the hair.\r\nSue: afraid the colour of his orange hair would run????\r\nLinda: Never heard of an umbrella :)" + }, + { + "id": "13611559", + "summary": "Rashi is confused by too many career choices. Teacher advises him to choose something he has passion for and what interests him.", + "dialogue": "Teacher: Rashi, why are you so low? \r\nRashi: Ma’am I’m a bit confused about my career. \r\nTeacher: What is your confusion?\r\nRashi: I was discussing with my friends about the career options. \r\nTeacher: Hmm.\r\nRashi: There are too many to choose from.\r\nTeacher: Choose a career based on what truly interests you. \r\nRashi: I have many that interests me. 
How does it determine the career?\r\nTeacher: The passion you have for what you do drives you to success. \r\nRashi: But what about earnings?\r\nTeacher: Remember at some point of time one should learn to balance between duties and success.\r\nRashi: How do I do that?\r\nTeacher: Choose a career which interests you, get experienced and try to progress and widen the scope after a while.\r\nRashi: Hmm, ok.\r\nTeacher: Something like earn and learn sort of..\r\nRashi: You are so right. I will remember this.\r\nTeacher: So hope I managed to answer your questions.\r\nRashi: Yes mam! Thank you very much! \r\nTeacher : You are most welcome, Rashi." + }, + { + "id": "13862818", + "summary": "Corbin reported to the department in charge of school violence that his friend has been beaten.", + "dialogue": "Corbin: Is this the department in charge of school violence?\nDimitri: Yes, it is.\nCorbin: I want to report school violence in our school.\nDimitri: Okay. What school are you in?\nCorbin: Jungang high school. The student who is victim of the violence is my friend.\nCorbin: They are not hitting him any more. But you should help my friend.\nCorbin: If they notice I was the one who reported, they will hit me as well.\nDimitri: First, calm down. Give us your phone number. You will be safe.\nCorbin: 486-984-324 It is.\nDimitri: Don’t worry. I will call you now, ok?" + }, + { + "id": "13829829", + "summary": "Marta needs help with the PC. On Joel's advice, she will contact Cynthia or Elena as they might know someone. ", + "dialogue": "Marta: Hey hey :)\r\nMarta: 😊\r\nMarta: Do you happen to know a good technician, that fixes PC's? 😑\r\nJoel: Hello! No, sorry. I only know of the guys in IT, back at the office. However Pablo, the best one on the team, he is on vacation this week..:/\r\nJoel: Ask Cynthia or Elena, they might know someone.\r\nMarta: Ok, thank you! 
Yes, I will give them a try 🤞" + }, + { + "id": "13611640", + "summary": "Ann doesn't know what she should give to her dad as a birthday gift. He's turning 50. Fiona tries to help her and suggests a paintball match.", + "dialogue": "Ann: What should I prepare 4 my dad's birthday?\r\nFiona: How old is he?\r\nAnn: turning 50\r\nFiona: wow, a round birthay, it must be sth big\r\nAnn: I know, but I don't have any idea\r\nFiona: surprise party?\r\nAnn: My dad hates dose\r\nFiona: ok, so what does he like?\r\nAnn: I don't know, he watch a lot of military movie\r\nFiona: well, a movie ticket is probably not what you thought of\r\nAnn: not even close\r\nFiona: Maybe some event. U know like bungee jumping or parachute jump\r\nAnn: that would be nice but he's afraid of heights\r\nFiona: damn, maybe sth you can do together\r\nAnn: well I was plannig dinner with the whole family, but that's not enough\r\nFiona: yes, there should be sth special also\r\nAnn: I know, but I'm out of \r\nFiona: Let me think. Nothing with heights but maybe sth on the ground? Racing? Horse riding?\r\nAnn: ok, it's a good direction. Maybe some team play, we could go with the whole family\r\nFiona: u said he likes military... maybe paintball? \r\nAnn: I don't know how my mum will react but I like it :D\r\nFiona: I guess she's not into military\r\nAnn: not really, no. But it's dad's birthday so she has to accept it. Thx for the help\r\nFiona: no problem" + }, + { + "id": "13828103", + "summary": "Fatima is worried about Jenson and Alene. Alene has issues. 
Lincoln doesn't want Fatima to worry about others too much.", + "dialogue": "Lincoln: Heeyyy ;* whats up\r\nFatima: I talked to Jenson, he’s not too happy ;p\r\nLincoln: the place sucks??\r\nFatima: No, the place is ok, I think, we can go there, it’s about Alene\r\nLincoln: typical, dont worry about it\r\nFatima: He thinks she may have a depression :[\r\nLincoln: nothin new, everyone has it, she needs a doctor then\r\nFatima: But she won’t go ;/\r\nLincoln: so she’s destroying her life fuck it its not your problem\r\nFatima: It is, they’re both my friends!\r\nLincoln: you better think what to do if they break up\r\nFatima: Ehh yes Ill have a problem ;//\r\nLincoln: both blaming each other and talking with you about it, perfect\r\nFatima: Alene is just troubled… She’d been through a lot…\r\nLincoln: everyone has their problems, the question is are ya doin sth about them\r\nFatima: She has problems facing it, don’t be surprised :[\r\nLincoln: then it is her problem\r\nFatima: You are so cruel at times… o.O\r\nLincoln: maybe, for me its just a common sense\r\nFatima: Why can’t everyone be just happy???\r\nLincoln: youll not understand, you had good childhood, nice parents, you have no idea\r\nFatima: Probably, true… Well I can be just grateful o.o\r\nLincoln: do that and stop worrying about others, youre way to bautful for that <3\r\nFatima: :*:*:*" + }, + { + "id": "13829756", + "summary": "Bob is going to help Lisa clean the house, he will clean the bathroom. ", + "dialogue": "Lisa: I have to clean the house.\r\nBob: Yes, it's very dirty.\r\nLisa: You can help me.\r\nBob: Why me?\r\nLisa: Because you helped make it dirty.\r\nBob: What do you want me to do?\r\nLisa: I want you to clean the bathroom.\r\nBob: Oh, that's easy.\r\nLisa: Clean the sink, the tub, the counter, and the toilet.\r\nBob: That's a lot of work.\r\nLisa: Tell me when you finish.\r\nBob: I don't think so. You'll just give me more work. 
" + }, + { + "id": "13729790", + "summary": "luke and martial want to help the team and play despite their injuries. They will meet at carrington and go to the coach's office.", + "dialogue": "luke: Hey, was just thinking, we should avail ourselves for team selection tomorrow regardless of our injuries\r\nmartial: thats what i was thinking also\r\nluke: we should let Jose know that tomorrow\r\nmartial: the first thing in the morning infact\r\nluke: the fixtures are really piling up and we need to help the team\r\nmartial: yeah, thats for sure, we are a family\r\nluke: we will the coach know that we are ready to play\r\nmartial: despite the little pain, me i'm ready\r\nluke: me too\r\nmartial: so we meet up at carrington and go to his office very early\r\nluke: yeah, both of us\r\nmartial: ok, we'll go together\r\nluke: cool \r\nmartial: ok" + }, + { + "id": "13812001", + "summary": "Emily and Julie wish Merry Christmas to each other.", + "dialogue": "Julie: \r\nEmily: <3 Julie Love, i'm sending tons of kisses :* :* :* 🎄🎄🎄\r\nJulie: Merry Christmas and a lovely mood throughout the whole year, darling\r\nEmily: Thank you, for you too <3\r\nJulie: Thanks :* " + }, + { + "id": "13828618", + "summary": "Ricky's new neighbours are nice but loud. They own a parakeet that makes a lot of noise throughout the night.", + "dialogue": "Frederick: do u like ur new next door neighbors?\r\nFrederick: they seemed really cool yesterday when we ran into them\r\nRicky: they're nice people but they're incredibly noise\r\nRicky: they also have parakeet that wouldn't stop squawking all night long hahaha\r\nFrederick: sucks to be you" + }, + { + "id": "13729101", + "summary": "Sandra and Brenda used to work together in the clothes factory 25 years ago. Sandra still lives in Kings Norton. Brenda lives in Stoke now. Her husband Bill died 5 years ago. They will meet in Birmingham for a lunch next Saturday about 11. They want to organize a reunion for the Lister's girls. 
", + "dialogue": "Brenda: Hello, is this Sandra Donovan?\r\nSandra: Yes, well that was my maiden name, it's Sandra Taylor now.\r\nBrenda: It's Brenda Riley, we used to work together in the clothes factory!\r\nSandra: Oh my God! Bren! How are you, it must be 25 years!\r\nBrenda: I'm fine, I live in Stoke now, moved away from Brum in the late 90s.\r\nSandra: I still live in Kings Norton, same house, same husband! I've got 4 grandchildren now, ages 2, 4, 9 and 15! How about you?\r\nBrenda: Unfortunately, my husband Bill died 5 years ago, I have only one grandchild, she's 7, my little angel, she doesn't remember her Gramps. \r\nSandra: So sorry, love, I remember your Bill, he had long black hair and massive sideburns, didn't he?\r\nBrenda: Well, yes, about 45 years ago, he was bald when he passed away. He loved to dance, he did Northern Soul, we both did actually. Went up to Wigan on weekends, happy times!\r\nSandra: Oh yes, I remember that craze, bit energetic for me! We liked disco instead! We had some great dinner dances with the factory, do you remember them?\r\nBrenda: Yes! Us all dressed up with our long dresses and the men with their frilly evening shirts, lovely memories!\r\nSandra: Do you still see any of the girls from Lister's?\r\nBrenda: No, but I heard that Marigold Carter died, very sad.\r\nSandra: Hey Bren, I've had a brainwave! How about we organise a reunion for the Lister's girls, look on social media for them?\r\nBrenda: Actually, I was thinking along those lines! Do you fancy meeting up, just you and me? I can come down to Birmingham anytime.\r\nSandra: That would be lovely! Can you manage it next Saturday? We could meet about 11ish and go for lunch and a good old trip down memory lane!\r\nBrenda: Oh yes! I'd love that! I'll get back to you about train times soon!\r\nSandra: Ok! Bye love!" 
+ }, + { + "id": "13731428", + "summary": "Joan and John are going to watch \"A Star is Born\" on Thursday around 8 p.m.", + "dialogue": "John: wanna go see \"A Star is Born\" on Wed?\r\nJoan: sorry can't\r\nJoan: super busy \r\nJoan: don't have time for anything :( \r\nJohn: that's a shame\r\nJoan: I'm free on Thursday\r\nJohn: I could do Thursday\r\nJoan: ok! so around 8pm?\r\nJohn: sure sounds great\r\nJohn: I'll see where it's palying and send you the details\r\nJoan: ok great!" + }, + { + "id": "13680972", + "summary": "Donna will pay George a visit tonight to discuss a personal matter.", + "dialogue": "George: Hi Donna. I've been trying to catch you.\r\nDonna: What about?\r\nGeorge: A rather delicate matter.\r\nDonna: Did you catch AIDS?\r\nGeorge: Very funny!\r\nDonna: It is, isn't it?\r\nGeorge: I don't think so.\r\nDonna: Too bad. So what do you want?\r\nGeorge: Could we meet and discuss it somewhere?\r\nDonna: Like where?\r\nGeorge: Like in a coffee shop or somewhere.\r\nDonna: You sure a coffee shop is better to discuss delicate matters.\r\nGeorge: Come to think of it, you are right. It's not.\r\nDonna: See? So what's up?\r\nGeorge: Couldn't you come to my place tonight?" + }, + { + "id": "13863010", + "summary": "Louis finishes the conversation with Fabian because his mother is calling.", + "dialogue": "Louis: Gotta go, my mom's calling me\nFabian: Ok, see ya\nLouis: See ya" + }, + { + "id": "13681192", + "summary": "Irene will take Crystal's son shopping for clothes.", + "dialogue": "Crystal: \r\nIrene: He's so big!\r\nCrystal: \r\nCrystal: I know right!\r\nIrene: and so cute!\r\nCrystal: he got so big he doesn't fit his clothes anymore\r\nIrene: time to go shopping with my little boy <3\r\nCrystal: yeah im just gonna go brankrupt\r\nIrene: Let me take him, I also promise to buy him something\r\nCrystal: you really wanna do that?\r\nIrene: why nont? 
I'm his aunt!\r\nCrystal: well yeah it's just such a drag\r\nIrene: you were always a bore when shopping :P just let me take the little man\r\nIrene: well have fun!\r\nCrystal: ok " + }, + { + "id": "13864835", + "summary": "Tony sent a photo of his cat to Amy and Lucas.", + "dialogue": "Tony: \nAmy: Sweet little cat <3\nLucas: Adorable!! " + }, + { + "id": "13727809", + "summary": "Matt got a ticket for Dawid Podsiadlo's concert. Thomas is going, too.", + "dialogue": "Matt: Hey\r\nMatt: I got my ticket for Dawid Podsiadlo!!!\r\nMatt: So stoked!\r\nThomas: Whooaa that's great!!\r\nMatt: I will see you there then! \r\nThomas: Yes for sure\r\nThomas: Remind me before \r\nThomas: Like the day before k? \r\nMatt: For sure \r\nMatt: Who you're going with \r\nThomas: by myself for now\r\nThomas: I meant until now haha\r\nMatt: Right on \r\nMatt: I am so stoked tho\r\nThomas: Me too \r\nMatt: I might ask a few more people if they're coming ;) \r\nThomas: Maria was interested I think \r\nThomas: But I am not too sure\r\nThomas: i will ask her" + }, + { + "id": "13727769", + "summary": "Some girls had to undress because they had been pushed into the pool.", + "dialogue": "Chris: Oh, and we pushed some girls into the pool :)\r\nJune: That's not nice!\r\nChris: Depends how u look at it ;)\r\nJune: How come?\r\nChris: Well, we got them to undress, 'cause they were soaking wet :)\r\nJune: Ur awful!" + }, + { + "id": "13865290", + "summary": "Mico and Jeff will go to the village party. 
Jeff will drive.", + "dialogue": "Jeff: Should we go to the village party?\nLia: I'm too tired after hiking\nMico: I'd like to go, there may be some hot boys!\nLia: I doubt\nJim: like a real village boy?\nJim: who doesn't even speak English?\nMico: yes, the dummer, the better\nJim: haha, stupid fucks good, they say\nMico: I confirm!\nLia: not my cup of tea\nMico: I'll go there, who wants to join?\nJeff: I'll go as well\nMico: wanna drive?\nJeff: so you could drink?\nMico: would be nice, hahah\nJeff: not excited, but ok\nMico: thanks!" + }, + { + "id": "13828832", + "summary": "Paul can couch the game on Saturday as Matthew hasn't found anyone to do that yet. ", + "dialogue": "Paul: hey Matthew did you find anyone to couch the game Saturday?\r\nMatthew: hey Paul, no still looking \r\nPaul: my plans changed so I can do it if you need \r\nMatthew: ahh yes that be great! thank you \r\nPaul: no problem see you Saturday " + }, + { + "id": "13716127", + "summary": "Freddy will pick Luke up at about 3:15 pm.", + "dialogue": "Luke: my train arrives at 3 pm\r\nLuke: anyone can pick me up?\r\nJacob: i am at work till 5\r\nFred: i can pick you up at 3:15 approximately\r\nFred: is that ok?\r\nLuke: yes i will wait! thanks Freddy" + }, + { + "id": "13729042", + "summary": "Ryan and Jack are going to the casting for a dance show.", + "dialogue": "Ryan: You're going to the casting? \r\nRyan: So you think you can dance 🤩\r\nJack: I am! \r\nJack: this time im going\r\nRyan: U should go really\r\nJack: I know, wanna come with me? \r\nRyan: I thought about it! \r\nJack: Nice well! I will meet you there! 😝😝😝" + }, + { + "id": "13818511", + "summary": "Joanne is going to go back home to France for the holidays. She's going to cheer her mum up because her parents separated a few months ago. Evelyn offers Joanne to spend Christmas together if she brings her mum over here. ", + "dialogue": "Joanne: What are your plans for the holidays?\r\nEvelyn: Nothing. 
I’ll stay at home and rest.\r\nJoanne: You must be exhausted after the past few weeks\r\nEvelyn: It’s been hectic\r\nJoanne: I’m going back home.\r\nEvelyn: To France?\r\nJoanne: Yes. Not that I want to go…\r\nEvelyn: Why? You always liked spending Christmas with your family.\r\nJoanne: I did. But my parents separated a few months ago\r\nJoanne: It is still pretty tense…\r\nEvelyn: I’m sorry to hear that\r\nJoanne: My dad left my mum for his secretary\r\nJoanne: Such a cliché \r\nJoanne: My mum is devastated\r\nJoanne: So I’m basically going to cheer her up \r\nJoanne: It’s really hard for her now\r\nJoanne: For me it’s also not easy\r\nEvelyn: I can imagine!\r\nEvelyn: If you want to bring your mum over here we could spend Christmas together.\r\nJoanne: Thanks, that’s really sweet. But I don’t think she’s in a condition for that. She’s been very depressive lately. " + }, + { + "id": "13829996", + "summary": "Raul's had a bad night and day. ", + "dialogue": "Carla: Hey\r\nCarla: how are you today?\r\nRaul: not too well\r\nCarla: what's wrong? did you sleep?\r\nRaul: it was a really crappy night\r\nRaul: and yesterday evening\r\nRaul: and today morning\r\nRaul: i started off with a fag\r\nRaul: i don't even know what triggered me off \r\nRaul: yesterday Jen had to go to a friend\r\nRaul: she'd broken up with her bf and Jen went to sit with her\r\nRaul: and she stayed for the night\r\nRaul: so i thought it's be a good evening just for myself\r\nRaul: but i was just so pissed off all the time\r\nRaul: and then i had those weird dreams of my uncle\r\nCarla: shit ;/ sounds awful\r\nRaul: yeah it was :/\r\nCarla: ;*" + }, + { + "id": "13728493", + "summary": "Angie's having an appointment with Doctor McCormick in an hour. She has strong abdominal pain.", + "dialogue": "Angie: Hello. I'd like to make an appointment. \r\nMs. Quinn: Hello. Of course. What's your medical issue?\r\nAngie: It's kind of private...\r\nMs. Quinn: I understand. No chats are recorded. 
And I need to know what you're coming in with to direct you to the correct doctor. \r\nAngie: Well, fine. I have severe stomach pain.\r\nMs. Quinn: When did it start?\r\nAngie: Like an hour ago. I took some pills but to no avail. \r\nMs. Quinn: I understand. Do you have your doctor?\r\nAngie: Yes. Doctor Cartman.\r\nMs. Quinn: He's not in today. Doctor McCormick can see you in an hour. Will that be fine?\r\nAngie: Yes, thank you. I'm on my way.\r\nMs. Quinn: I've signed you up and I'll tell the doctor you're coming. " + }, + { + "id": "13729529", + "summary": "Kelvin and the other class members will discuss the time for their CAT 2 and share their decision with Naheeda soon.", + "dialogue": "Kelvin: Excuse me Miss. When do we sit for our CAT 2?\r\nNaheeda: Can we have it during the statistics class?\r\nKelvin: I don't think so because we will be having the statistics CAT.\r\nNaheeda: Okay it is up to you guys to choose the time then.\r\nKelvin: Okay. Wait I will talk with the other class members then I'll tell you their decision\r\nNaheeda: Okay then don't take too long.\r\nKelvin: Sure" + }, + { + "id": "13716897", + "summary": "Karen wants something cheaper than sushi for lunch. Linda, Ronnie and Karen will get takeout pasta boxes to the park.", + "dialogue": "Linda: hey have we decided on a lunch place yet?\r\nRonnie: thought we were going for sushi\r\nLinda: Karen said she hates raw fish or something\r\nKaren: nah I'm ok with sushi, just thought we would go for smth cheaper this time ;p \r\nKaren: hard times are a-coming xd\r\nRonnie: been craving sushi all week long :( but i sooo get, K, maybe we should dial it down with the fancy places\r\nLinda: hey there is this pasta joint right next to our apartement\r\nLinda: you get it in like takeout boxes. we could take these out to the park\r\nRonnie: so down with that!\r\nKaren: same here! 
let's wait for Amanda tho\r\nLinda: @Amanda are you ok with pasta for lunch?\r\nRonnie: let's hope she checks her fb this time haha\r\nAmanda: hey guys, yes! whatever works for me :)" + }, + { + "id": "13730939", + "summary": "Marty thinks she has sprained her ankle. Marty wants to go to the doctor tomorrow. Christine will pick up Marcel from school today. Tomorrow Christine will take Marcel to school and Marty to the doctor. Marty will call the school. Christine and Marty will meet around 4. ", + "dialogue": "Marty: Hiya, I have a favour to ask... can you pick up Marcel from school?\r\nChristine: Sure, you ok?\r\nMarty: Not really, I think I have sprained my ankle...\r\nChristine: Oh no, have you seen see a doctor?\r\nMarty: I was gonna see how it went today and might go tomorrow...\r\nChristine: Are you sure? I'm happy to take you now if you want?\r\nMarty: Nah, it can wait, that'll leave us in trouble with the kids...\r\nChristine: OK, I'll pick up Marcel then.\r\nChristine: Do you need anything from the shops or something?\r\nMarty: No we are good thanks. we'll have pizza night, Marcel can sort us out...\r\nChristine: I'm on a late shift tomorrow, shall I take Marcel in tomorrow morning? I'll take you to the doctor afterwards if you want?\r\nMarty: That would be awesome, thank you for your help...\r\nChristine: No problem, you'll have to call the school though.\r\nMarty: Good one, will do that now...\r\nChristine: See you around 4.\r\nMarty: Thank you so much!" + }, + { + "id": "13813429", + "summary": "Peyton is expecting Cameron to bring the video game. Cameron will probably be out for another week.", + "dialogue": "Peyton: I have been asking you to bring that video game for me\r\nCameron: Honey, I am not having enough time to come home\r\nPeyton: When would you come home?\r\nCameron: I will have to stay out of town for another week i guess\r\nPeyton: Cant you just deliver that game through the courier? 
:P\r\nCameron: Dont be mean :/\r\nPeyton: Get the job done and come to home then. ASAP :P" + }, + { + "id": "13828511", + "summary": "Alicja's job interview is tomorrow. She will inform Willyx how it goes.", + "dialogue": "Willyx: how did your job interview go?\r\nAlicja: it's tomorrow :D\r\nWillyx: sorry\r\nAlicja: it's ok :)\r\nWillyx: let me know how it went\r\nAlicja: sure" + }, + { + "id": "13729320-1", + "summary": "Paul forgot about his physiotherapy and he will schedule a new appointment. Emma will be home after midnight, so Paul will prepare some food for her.", + "dialogue": "Paul: I just came back home\r\nPaul: What a busy day\r\nPaul: I forgot about my physiotherapy \r\nEmma: Oh no\r\nPaul: It's ok, I'll schedule a new appointment\r\nEmma: I'll be home after midnight\r\nPaul: Do you want me to prepare some food for you?\r\nEmma: That would be lovely" + }, + { + "id": "13729210", + "summary": "Sean overslept again.", + "dialogue": "Sean: I overslept :/\r\nSam: Again??\r\nSean: I know." + }, + { + "id": "13680312", + "summary": "Chris and Tom are planning a meeting at Chris' place. Chris has a Jacuzzi in his garden. Chris has WiFi and can bring his TV outside. Tom has a low internet limit whenever he's outside of Ireland.", + "dialogue": "Chris: \r\nChris: Maybe not he best photo XD\r\nChris: and im the middle one here\r\nChris: and you can bring swimming trunks as well because there's opportunity to go to jacuzzi in our garden :))\r\nTom: a jacuzzzzzi????\r\nChris: oohhhh yeeaahh\r\nTom: O my godddd.\r\nTom: Is it big enough for a few people? I feel I woul feel wierd out there on my own :)\r\nChris: Yeee, for 5-6 people no problem.\r\nTom: So you and your brother will join me? 
:D\r\nChris: Yes hahaha meybe we can invite someone else or only our little group.\r\nChris: An maybe watch sth on TV or just make a conversation hhahaha\r\nTom: Wait you've got a tv outside????\r\nChris: well we can bring it there :P\r\nChris: From our living room\r\nTom: Do you have WiFi?\r\nChris: Yes\r\nTom: Nice, I only get 6GB on my phone when I get outside Ireland\r\nChris: Yeah, kind a low amount\r\nTom: Can't wait!\r\nChris: Me too! :)" + }, + { + "id": "13730736", + "summary": "Ela is not taking Harry's phone calls. Cindy calls Ela at Harry's request.", + "dialogue": "Harry: heyyyy are you there??\r\nCindy: Yes dear what is it?\r\nHarry: Can you call Ela and tell her i need to talk urgent please pick my call.\r\nCindy: what happened now? an other fight :O\r\nHarry: please tell her\r\nCindy: MAN! you guys... am i some kind of a messenger service here?\r\nHarry: PLEASEEEEEEEEE ?\r\nCindy: ok doing.... but thats the last time.\r\nHarry: Yes like always:P\r\nCindy: Hate you seriously man.\r\nHarry: Thank you\r\nCindy: Done you can call her now." + }, + { + "id": "13829218", + "summary": "Mike didn't have time to take the dog for a walk, so Adam will take it with him.", + "dialogue": "Adam: Did you take the dog for a walk?\r\nMike: No, I did not have time,\r\nAdam: Ok, I'll take him with me." + }, + { + "id": "13729697", + "summary": "Murphy is going to Poznań on Tuesday and coming back on the same day in the afternoon.", + "dialogue": "Sophie: When r u going to Poznań?\r\nMurphy: On Tuesday. \r\nSophie: And you're coming back the same day?\r\nMurphy: Yes, in the afternoon, but I don't know the exact hour. " + }, + { + "id": "13729407", + "summary": "Gaia has 6 exams this semester. 
One is very difficult.", + "dialogue": "Monica: How are you doing?\r\nGaia: I'm fine, mum.\r\nMonica: All good at the university?\r\nGaia: A lot of work, but all good.\r\nMonica: I guess you'll have a lot of exams this term\r\nGaia: 6\r\nGaia: But only one is really hard " + }, + { + "id": "13680992", + "summary": "Mr. Williams invites Ms. Blair for a coffee. They will go to her favourite coffee place near the square in a side alley at 2 p.m.", + "dialogue": "Mr. Williams: Ms. Blair, would you like to go for a coffee?\r\nMs. Blair: I thought you'd never ask.\r\nMr. Williams: That's outstanding. Do you have a favourite coffee place?\r\nMs. Blair: I actually do. It's near the square in a side alley.\r\nMr. Williams: I think I know which one. Let's say 2 p.m.?\r\nMs. Blair: Sounds great. See you there :)" + }, + { + "id": "13810012", + "summary": "Jeff's skin condition is fine now and he doesn't have to take medication all the time. Tina has a similar condition but takes medication on a daily basis. Tina can call Jeff if she has questions. ", + "dialogue": "Serena: Have you been to the doctor lately?\r\nJeff: No, why?\r\nSerena: Just wondering what he says about your skin condition?\r\nJeff: It's fine right now. \r\nSerena: That's good!\r\nJeff: The cold weather sets it off and if I eat too much of the wrong foods, but otherwise fine.\r\nSerena: So you don't have to be on meds?\r\nJeff: Not all the time. Why?\r\nSerena: Tina has the same thing and takes meds on the daily.\r\nJeff: She must have a different kind than me or a worse kind.\r\nSerena: I guess so.\r\nJeff: It sucks, but it doesn't have to be every day.\r\nSerena: That's good. I'll tell her. That will cheer her up!\r\nJeff: Good! Tell her to hang in there. She can call me if she has any questions.\r\nSerena: Thanks!" 
+ }, + { + "id": "13821805", + "summary": "Kristina, Estefania and Jannette are watching America's Top Model.", + "dialogue": "Kristina: Girls!\r\nKristina: America's top model \r\nKristina: on tv\r\nKristina: Watching? \r\nJannette: omg\r\nJannette: Im not home yet\r\nKristina: New season ye!!\r\nEstefania: Hmm \r\nEstefania: yeah Im watching this rn \r\nKristina: Tyra Banks\r\nKristina: She never gets old\r\nEstefania: I wanna look like her haha\r\nJannette: K I just got home \r\nJannette: Had to run \r\nJannette: \r\nEstefania: Hahaha" + }, + { + "id": "13819314", + "summary": "Daniel is going to Bologna today. He has to transfer there for a further flight. He will stay at the airport for two hours. Simone will visit Marco in December.", + "dialogue": "Marco: Are you coming to Bologna this fall?\r\nDaniel: accidentally, I am flying today thought Bologna but only became I have to change there\r\nMarco: wow! how long will you stay at the airport?\r\nDaniel: 2 hours only\r\nMarco: ok, that may be too short for me to see you\r\nSimone: I may come in December for a weekend, if you want to host\r\nMarco: with pleasure! and always, as I always tell you :)\r\nSimone: great, I'll let you know" + }, + { + "id": "13829353", + "summary": "Mike will call Dale back in 2 hours.", + "dialogue": "Mike: will call u back in 2 hrs, ok? \r\nMike: can't talk right now, sry\r\nMike: *sorry\r\nDale: cool, no problem\r\nDale: until then" + }, + { + "id": "13729285", + "summary": "Timmy had a bad day at work. Timmy will bring some wine to Gemma's bbq at the weekend.", + "dialogue": "Gemma: How's it going?\r\nTimmy: A bit down 2day.\r\nGemma: Y?\r\nTimmy: Oh, bad day at work. Can u imagine? Boss snapped at me!\r\nGemma: That bitch! What did u do?\r\nTimmy: Nothing. Minding my own business, doing work stuff and suddenly starts shouting and screaming. Doesn't matter. 
How about u?\r\nGemma: Well, I think this might cheer u up a bit :)\r\nTimmy: What is it?\r\nGemma: I'm organising a bbq at the weekend :) wanna come?\r\nTimmy: Love to! What do I bring?\r\nGemma: Some wine will be fine.\r\nTimmy: What about food?\r\nGemma: Others and I will cover it.\r\nTimmy: Others? I thought it was a date :P\r\nGemma: U remember I have a bf, right?\r\nTimmy: Yeah. Just messing around ;) how many ppl?\r\nGemma: Don't know yet. " + }, + { + "id": "13812250", + "summary": "Pat will arrive at around 9 pm. Bart will open the door and work in the morning from home. ", + "dialogue": "Pat: Hi, it's Pat here. I have a slight delay, a couple of hours, so I will come in the evening, around 9pm. I hope it is not a problem...\r\nBart: Hi, no, not at all :) you didn't catch the train?\r\nPat: hehe, no. I thought today i was free from work, but i got another project to finish. So i will be tomorrow morning. Im sorry for those changes, i myself am surprise.\r\nBart: hahaha a tiny regret. Will you manage to get here on your own?\r\nPat: yeah, its ok. what would you suggest? if you could come and unlock the door, it would be great, but i can walk around the city as well.\r\nBart: of course I will open it for you. I can work in the morning from home. No problem. So we see each other tomorrow?\r\nPat: Thank you so much! haha\r\nBart: Are you a painter?\r\nPat: I hope I will eventually get in that train. Nope.\r\nBart: I thought you were.\r\nPat: this project is a stage project :)\r\nBart: Ah alright, youll tell all about it tomorrow :))\r\nPat: OK,, later" + }, + { + "id": "13730167", + "summary": "Erin will meet Ashley in the restaurant for the interview. ", + "dialogue": "Erin: hey what's up. what you're doing today? Would you find some time to do the interview? :)\r\nAshley: Hi! I’m free whenever\r\nErin: Great, are you at the camp?\r\nAshley: Yep, I’m here now.\r\nErin: Ok I can come over there\r\nAshley: Sounds good. 
Just shoot me a message and we can meet at the pool or restaurant\r\nErin: Alright!\r\nErin: Is the wifi good today?\r\nAshley: Yeah pretty decent, but usually it's less spotty in the restaurant area\r\nErin: Ok I see, so the restaurant it is\r\nAshley: Alright\r\nErin: I'm on my way!\r\nAshley: I'm sitting in the back" + }, + { + "id": "13865416", + "summary": "Maria, Kate, Tommy and Sam are going to a conference. Tommy will use Prezi instead of Power Point. He has a Prezi subscription for $10 a month.", + "dialogue": "Maria: hey guys!\nMaria: everything ready for the conference?\nKate: yes, almost\nTommy: I think we will have a good panel\nSam: I really hope there will be some people\nTommy: we should have good audience\nMaria: Are you preparing a power point?\nTommy: I'll have a prezi\nMaria: you're paying for it?\nTommy: it's good, worth the money\nMaria: how much is it?\nTommy: I believe $10 a month for academics\nMaria: not that bad" + }, + { + "id": "13828681", + "summary": "Maxwell pays Jeanice for 8 hours of babysitting and is grateful he found her. His son argued with a friend at school and got aggressive when the teacher reacted. Jeanice hasn't noticed if he has acted strangely recently. Maxwell and Jeanice can grab a coffee some time to discuss all the issues. 
", + "dialogue": "Maxwell: Thank you for tonight, payment as usual?\r\nJeanice: Yes, 8 hours\r\nMaxwell: What would I do without you…\r\nJeanice: That’s my job :D\r\nMaxwell: But you can do your job good or bad… I’m grateful I found you\r\nJeanice: Me too, our cooperation is very good ^^\r\nMaxwell: And the kids… They really like you\r\nJeanice: I know, they’re cute\r\nMaxwell: Wait, I heard there was some problem with Marcus at school…\r\nJeanice: Nothing very serious, he argued with a friend but they didn’t fight\r\nMaxwell: Oh, ok, the teacher sounded like it was something terrible\r\nJeanice: In a way it was, he didn’t want to stop when a teacher reacted and he was pretty aggressive\r\nMaxwell: That’s worse… Have you noticed him behaving in a strange way recently?\r\nJeanice: Not at all, cute as usual. But… he gets very angry for stupid reasons\r\nMaxwell: For example?\r\nJeanice: You know, his sister taking sth from him… He’s not aggressive but more… loud than usual, if you know what I mean\r\nMaxwell: Yes, I understand, I need to do something about it. Anyway, thank you for informing me.\r\nJeanice: No problem Mr. Hall, I’m always here to help ;)\r\nMaxwell: We could grab a coffee some time and talk through all the issues.\r\nJeanice: Yea, maybe, I have to go, I'll let you know\r\nMaxwell: OK, thanks, you're the best :)" + }, + { + "id": "13864454", + "summary": "Mike, Tom and Ben will go for a beer.", + "dialogue": "Mike: Let's go for a beer\nTom: Now?\nMike: Yes\nBen: Ok" + }, + { + "id": "13828973", + "summary": "Leah met a creepy guy last night at a poetry reading. He knew she speaks German and named all the friends who went to a past event with her. She didn't tell him any of this. He googled her before. Leah had liked a post mentioning him 2 weeks earlier. 
", + "dialogue": "Leah: i've just met the creepiest guy on the earth :|\r\nSamantha: :o i'm all ears\r\nLeah: last night i went to the poetry reading\r\nLeah: this guy approached me, introduced himself and we started talking\r\nLeah: he was rly boring, but, you know, i didn't want to be rude, he didn't seem to be a psycho, so i kept talkig to him\r\nLeah: at one point he mentioned that \"we have something in common\", because we both speak german\r\nSamantha: charming :D\r\nLeah: i was like: \"erm, ok, how do you know about this, i don't know you, i didn't tell you about this\"\r\nLeah: he just brushed it off and kept talking\r\nLeah: at one point he asked me about an event i had attended and he NAMED ALL OF MY FRIENDS THAT HAD COME WITH ME\r\nLeah: of course i didn't tell him about it\r\nSamantha: wow, that's creepy as hell\r\nSamantha: what did you do?\r\nLeah: i asked him once again how did he know about all this stuff and he said that he googled me out\r\nSamantha: WHAAAAT?\r\nSamantha: i googled people out before, but i’m not that stupid (and creepy) to tell them about it :o\r\nLeah: right?!\r\nLeah: and now the best part\r\nLeah: he googled me out, because he thought that i had a crush on him, because 2 weeks earlier I LIKED A POST ON FB announcing that he was going to accompany poets on the guitar during the poetry reading\r\nLeah: can you believe this?!\r\nSamantha: LOOOOOOL\r\nSamantha: men are fckin weird\r\nLeah: AND he tried to walk me home, even though i told him straightforward that he was a creep\r\nSamantha: girl, you're lucky that he didn’t kill you, what a psycho\r\nSamantha: i would totally freak out" + }, + { + "id": "13728295", + "summary": "Nicky has just left Sam's place. Her phone is off.", + "dialogue": "Dave: Hey, is Nicky still at your place? Her phone is off\r\nSam: She just left\r\nDave: Thanks!" 
+ }, + { + "id": "13863025", + "summary": "Ken is trying to play a prank on Greg.", + "dialogue": "Ken: Fuck you, you pimp\nGreg: What?\nKen: Fuck you man, I want my money back\nGreg: Was your account hacked by some prankster?\nKen: No, I'm the prankster, just having a laugh at your expense\nGreg: Well, fuck you too then XD" + }, + { + "id": "13728896", + "summary": "Tom arrived safely, but without his luggage.", + "dialogue": "Alexander: Personal request to send me message when you will be in taxi\r\nAlexander: If any problem, call me\r\nTom: ;)\r\nTom: Thank You, I appreciate it\r\nAlexander: Taxi confirmation below\r\nAlexander: \r\nTom: Thank you for the transport, we arrived safely, although without luggages :/\r\nAlexander: Good but bad\r\nTom: Yeeees" + }, + { + "id": "13830023", + "summary": "The university is throwing a carnival party for kids.", + "dialogue": "Asher: Have you seen there're 5 (!) bouncy castles in the hall?\r\nAsher: And a HUGE bouncy dragon!! :o\r\nAsher: \r\nShane: wow! :D it looks fantastic :o\r\nAsher: What’s the occasion?\r\nShane: the university organised a carnival party for children :)\r\nShane: or at least this is what i heard in the dean's office\r\nAsher: That's so cool! :>" + }, + { + "id": "13830056", + "summary": "Ian is looking for his green folder. Sophie hasn't seen it but maybe Alex will know.", + "dialogue": "Ian: Honey, do not you know where my green folder is?\r\nSophie: Green?\r\nSophie: I have no idea.\r\nIan: I was sure I left it on the dining room table.\r\nSophie: Maybe Alex put it somewhere.\r\nSophie: Ask her.\r\nIan: ok, thx :*" + }, + { + "id": "13809969", + "summary": "Simon will talk to Adrian in 5 minutes.", + "dialogue": "Adrian: Can you talk?\r\nSimon: Not really, anything important?\r\nAdrian: Not that much.\r\nSimon: I'll be free at 5\r\nAdrian: i'll you then" + }, + { + "id": "13727853", + "summary": "Jen is about to break up with her boyfriend. 
Jane knew from the beginning that they were not a good match. Jane is going to support Jen.", + "dialogue": "Jen: I think I'm through with the dickhead. He's being a pain again. I'm going to tell him to move out.\r\nJane: Did he at least give you back the money he owes you?\r\nJen: No. He's freeloading and said he has no intention of giving back my money because I don't deserve it with the way I act.\r\nJane: He has a nerve doesn't he? How dare he?!!!\r\nJen: I've learnt to hate him with a passion. He's like vermin that you can't cull.\r\nJen: Abusive, nasty, annoying, irresponsible. He disgusts me.\r\nJane: Get rid of him! I told you right from the beginning that he's no good.\r\nJen: Yeah you were absolutely right.. as always.\r\nJane: I hate to say it but I can smell a fucktard a mile away.\r\nJen: I seem to pick them, don't I?\r\nJen: Sometimes I think it is my fault. If only I could be a better person, if only, if only....\r\nJane: You know that way of thinking will get you nowhere fast. It's self defeating. That's what the abusers want you think that you're bad and you deserve every bit of abuse that they dish out.\r\nJen: What if he doesn't leave? I'm afraid of asking him to leave 'cause it will only cause another fight.\r\nJane: You can't spend your whole life walking on eggshells.\r\nJen: Yeah, you're right... but how do I get out of this mess?\r\nJane: I think you've got to cut your losses and just move on.\r\nJen: Easier said than done.\r\nJane: I know. You've gotta do it Hon. If you don't things will only get worse. Think of how much worse they've already got since you met him.\r\nJen: Yeah you're right. Sometimes I just don't have the strength. \r\nJane: I believe in you. Please do it! Remember I'm always there for you." + }, + { + "id": "13828797", + "summary": "Poppy is not going to be home tonight but she won't reveal the reason to Dean. 
She won't be making dinner so Dean has to get something on his way home.", + "dialogue": "Dean: Hey sweetheart\r\nDean: What's for dinner tonight :D\r\nPoppy: Hey\r\nPoppy: Dunno?\r\nDean: What do you mean you dunno :D\r\nPoppy: Well, I'm not preparing anything tonight\r\nPoppy: So I dunno\r\nDean: Nooo whyy I'm starving\r\nPoppy: Grab something on the way back home\r\nDean: I guess I will!\r\nDean: What would you like me to get for you? :*\r\nPoppy: Nothing I won't be home tonight\r\nDean: ???\r\nDean: You won't?\r\nPoppy: Yeah I won't\r\nDean: What's up?\r\nPoppy: Nothing\r\nPoppy: I'll see you tomorrow\r\nDean: Where are you going to be tonight then?\r\nPoppy: Not telling\r\nDean: :O\r\nDean: Why\r\nPoppy: Because it's a secret\r\nPoppy: :O\r\nDean: :O\r\nPoppy: :O" + }, + { + "id": "13729708", + "summary": "Gab wants to meet Kat in real life. Kat doesn't like Gab's insisting so she won't talk to him at all.", + "dialogue": "Gab: Ah that's better. Now we can message all we like.\r\nKat: :-)\r\nGab: So when can we meet up for a drink?\r\nKat: Fairly busy at the moment so I'm not sure when.\r\nGab: I'd love to meet you in real life.\r\nKat: I don't meet strange men from the internet.\r\nGab: Why not?\r\nKat: Not that sort of gal.\r\nGab: But you're talking to me, aren't you?\r\nKat: Yes, but that's different.\r\nGab: How so?\r\nKat: Just is.\r\nGab: But it would be so much nicer to do this cuddled up right next to you...\r\nKat: I think this was a mistake.\r\nGab: Are you there?\r\nGab: Where did you go?\r\nGab: Kat?" + }, + { + "id": "13681082", + "summary": "Agatha is proud of herself because she has finished her presentation in Economics. She is very interested in Economics.", + "dialogue": "Agatha: My presentation is ready as we speak :)\r\nAdam: oh cool, what course?\r\nAgatha: Economics.\r\nAdam: Is it interesting? I mean... 
your presentation.\r\nAgatha: Definitely, I used recent research.\r\nAdam: Sounds like you know what you're doing :P\r\nAgatha: I'm just really into economics :)\r\nAdam: no doubt about that" + }, + { + "id": "13810132", + "summary": "Imagine Dragons have a concert at ABC Theatre on 12 July. Sally wants to go with Tim. She bought tickets, they cost 70.", + "dialogue": "Sally: Hey! Imagine Dragons are coming to us!\r\nTim: So I've heard.\r\nSally: And you didn't tell me?! \r\nTim: Come on. It's just a band...\r\nSally: It's not JUST a band, you jerk!\r\nSally: \r\nSally: I've already checked the ticket availability. There are still some tickets for the standing area at our ABC Theatre. Shall we go together?\r\nTim: How much are they?\r\nSally: 70\r\nTim: When is the gig?\r\nSally: 12 July\r\nTim: Well, I may go.\r\nSally: Your enthusiasm is infectious, really... Try inviting me for some sports events and you'll see how happy I'll be.\r\nTim: Ok! Let's go! It'll be an unforgettable evening!\r\nSally: Jerk! I've aready bought the tickets, so put it in your diary\r\nTim: Done. " + }, + { + "id": "13829686", + "summary": "Andrew has discovered an issue with Bez's car in her absence but it seems to be ok. He will also take care of her plants until she is back on the 21st. ", + "dialogue": "Andrew: Hello Bez, this morning I wanted to take your car and go to Krizingen but it showed \"low brake fluid level\". So I went only to the garage next to Willig and they said it's ok, probably a display error. Do you want me to have the car checked properly? Have you got a contract with some garage in particular?\r\nAndrew: Kate took your car today, just to check, and everything was fine. It must have been in fact display error.\r\nAndrew: Hi Bez, when are you back? What date?\r\nBez: Hello Andy and Kate! Sorry for not replying immediately. We have here no wifi internet connection. 
I'm in a hotel lobby now and use a \"phone credit\".\r\nBez: The car should be ok as it passed general inspection only in June. I usually go to a garage in Crinch. But don't bother as long it shows no more alarms.\r\nBez: Apart from that is everything alright? Automatic lights going on and off? One downstairs, one upstairs.\r\nBez: Please don't forget to give the plants in the basement an occasional drink! Thank you.\r\nBez: I'm home on the 21st in the evening. Could you please switch the heating on a bit on that day?\r\nAndrew: Hello Bez, just as we thought. No internet. Everything is fine here, also the car. The plants in the basement are watered once a week but not too much.\r\nAndrew: Your natal lily has just started to flower.\r\nAndrew: and look at this:\r\nAndrew: \r\nAndrew: Greetings from your frost-bound garden!\r\nBez: Thank you Andy!" + }, + { + "id": "13612210", + "summary": "Laura is going to visit her parents next Saturday. Keith might make a lasagne for her. Laura's mom has a birthday gift for her. ", + "dialogue": "Keith: Hi there kiddo, when are you planning to visit you old parents? :)\r\nLaura: Hey Dad, I'm not sure yet. I've been pretty busy recenlty.There is this big project coming…\r\nKeith: Oh, I understand, all work and no play…XD\r\nLaura: Daad! Don't be mean! You know I treat studying seriously!\r\nKeith: I know, you take after your mum :) By the way I think she bought some b-day gift for you…\r\nLaura: Next Saturday it is then :D \r\nKeith: I'll tell mum, she'll be really happy:)\r\nLaura: And please cook your lasagne! I miss it so badly …\r\nKeith: I'll se what I can do Pumpkin, le'ts stay in touch :)" + }, + { + "id": "13812772", + "summary": "Pegah is in class till 15:00. She will work from 17:00 till around 21:30. She will be back at 22:00. Miriam invited people over and wants Pegah to come. Pegah will have a cup of tea with her when she gets back. 
Miriam will save Pegah some wine.", + "dialogue": "Miriam: heyo\r\nMiriam: when do you get back?\r\nPegah: hey hey\r\nPegah: I'm in class till 15:00 and then I work from 17:00 till about 21:30\r\nPegah: so I'll be back at 22:00 D:\r\nMiriam: oh damn\r\nMiriam: that's late!\r\nPegah: I know :( but I need as many shifts as possible\r\nPegah: I'm gonna be a zombie all week :(\r\nMiriam: ok, well I asked coz I invited a few people over\r\nMiriam: and was hoping you would be there too\r\nPegah: awwww\r\nPegah: well I can have a cup of tea with you when i get back lol\r\nMiriam: I'll save you some wine as well :) " + }, + { + "id": "13681309", + "summary": "Robert will pick up floating balloons for Tom's birthday.", + "dialogue": "Pam: Hey Robert, you said you cold help with Tom's birthday?\r\nRobert: Sure, what do you need?\r\nPam: I have to go shopping, cook and clean and I figured out I don't have time to pick up the balloons\r\nRobert: from where?\r\nPam: there this store in the city centre that sells these awesome floating balloons\r\nRobert: No problem just text me the address\r\nPam: bless you!\r\nRobert: ;)" + }, + { + "id": "13716296", + "summary": "Shelly is volunteering at a food shelter and asks if others do some volunteer work. Tracy is not into that, but Jody always does some charity for Christmas.", + "dialogue": "Shelly: This year I'm volunteering at the food shelter!\r\nTracy: Good 4 u!\r\nJody: Gr8!\r\nShelly: How about u? Any volunteer work?\r\nTracy: Nah. Not into that.\r\nJody: Sure! 
Every year I do some charity 4 Xmas :)" + }, + { + "id": "13729524", + "summary": "Jim will check out Max's latest music project when he gets home.", + "dialogue": "Max: I know I will never be famous music producer\r\nMax: But check this out\r\nMax: My latest project\r\nMax: \r\nJim: I'll listen to it when I get home.\r\nJim: Knowing you I'm sure it's good.\r\nMax: Thanks\r\nMax: Let me know what you think later\r\nJim: I will" + }, + { + "id": "13828739", + "summary": "Kane recommends the new 30 Seconds to Mars album to Shannon.", + "dialogue": "Kane: have you heard the new 30 seconds to mars album?\r\nShannon: no, is it good?\r\nKane: you should so check it out\r\nShannon: ok thanks for the recommendation\r\nKane: no prob" + }, + { + "id": "13727631", + "summary": "Andy is going to visit Paul in about 1 hour.", + "dialogue": "Andy: Hi nephew!\r\nPaul: Hi uncle!\r\nAndy: Are you home? I'm nearby and thought I would drink coffee with you :)\r\nPaul: Yup. I'm home. Feel free to come!\r\nAndy: If that is ok I will visit you in about 1 hour. \r\nPaul: Sure. A lot of political cases for us to talk about :D\r\nAndy: Haha. No.\r\nAndy: Too much politics with Hannah's father.\r\nAndy: I have enough arguments over politics forever.\r\nPaul: Hahah. Ok. Waiting for you then.\r\nAndy: See you." + }, + { + "id": "13680506", + "summary": "Caroline and Megan play a guessing game - they need to guess which film a quote comes from.", + "dialogue": "Caroline: \"I am this close to tugging on my testicles again\"\r\nMegan: Friends, right?\r\nCaroline: Bravo! Who said it?\r\nMegan: Ross or Chandler... Ross!\r\nCaroline: " + }, + { + "id": "13810947", + "summary": "Josh should check the email from Ron. ", + "dialogue": "Ron: check your email :P\r\nJosh: what did u send me?\r\nRon: sth you want to have. \r\nRon: Check :D :D" + }, + { + "id": "13681439", + "summary": "Phoebe cannot go out today because she broke a bottle of her mother's expensive perfume. Phoebe's mother is angry. 
The smell of the perfume in the apartment is too intense now.", + "dialogue": "Phil: can you go out today?\r\nPhoebe: no\r\nPhoebe: my mum is still angry\r\nPhil: why?\r\nPhoebe: i used her perfume\r\nPhil: so what?\r\nPhoebe: i used it and broke it\r\nPhil: really?\r\nPhil: xd lol\r\nPhoebe: not funny\r\nPhoebe: it was very expensive\r\nPhoebe: besides, our whole house stinks\r\nPhil: so it was not so beautiful perfume?\r\nPhoebe: it was, but not 100 ml for 80 square meters" + }, + { + "id": "13728634", + "summary": "Rob and Eve will meet on Sunday morning to go to the shops. Eve has something to do at about 3. ", + "dialogue": "Rob: Are we meeting up 2morrow?\r\nEve: How about Sunday\r\nEve: stores are open so we can go then\r\nRob: Ok I just don't know what time they close\r\nEve: we have ot go in the moring \r\nEve: I have some stuff to do around 3\r\nRob: ok" + }, + { + "id": "13728288", + "summary": "Betty shares a photo of a man with a cat with Sandra. Sandra's ex wants to get back. She misses him. Betty comes over with wine at 6.", + "dialogue": "Betty: \r\nSandra: Hahaha!\r\nBetty: This guy has totally nailed it!\r\nSandra: I have a special place in my heart for men that take care of animals.\r\nBetty: I know! He and his cat look so cute!\r\nSandra: Haha! I'm sure he could've taken good care of us too!\r\nBetty: :) :) :)\r\nBetty: Sandra you naughty girl!\r\nSandra: Oh stop it. We both know that you think that too.\r\nBetty: Maybe a little.\r\nSandra: Btw, my ex just messeged me. He said he had been thinking about our split-up recently and he stated that it had been a mistake.\r\nBetty: Seriously?! After 3 months?\r\nSandra: Yeah. The worst thing is that I think that too :(\r\nBetty: You are not thinking of getting back to him are you?\r\nSandra: ...\r\nBetty: Come on! He was such a jerk back then! Don't you remember how sick you felt when he left you?\r\nSandra: I know but\r\nBetty: No buts! I'll be at your place at 6. I'll bring the wine!" 
+ }, + { + "id": "13730105", + "summary": "Anna likes a new app in which you can virtually try on clothes. Peter is not quite convinced it is necessary.", + "dialogue": "Anna: Still, great app! And it's for men too! U choose ur sex and then add photos of clothes :)\r\nPeter: Is there a minimum number of clothes I should add?\r\nAnna: No! And there is no maximum! You can add as many as u want!\r\nPeter: So if I only add 3 items, I will always get the same results?\r\nAnna: Ur still making fun of me!\r\nPeter: I just don't get it. Every morning I approach my wardrobe, open it, choose the clothes I want to wear or prepare them the previous day and that's it. I don't need an app to tell me what to wear with what.\r\nAnna: Okay... Let's play a game. \r\nPeter: You're not going to make me cut my fingers off?\r\nAnna: No ;) how many pairs of trousers do u have?\r\nPeter: 5 or 6.\r\nAnna: How many shirts do you have?\r\nPeter: T-shirts or regular shirts?\r\nAnna: Shirts. Let's focus on ur work look.\r\nPeter: 3 or 4.\r\nAnna: What about jackets?\r\nPeter: 2.\r\nAnna: Great! Now imagine I have 12 shirts, 10 pairs of trousers, and 6 jackets. And I have to decide what to wear in a split second.\r\nPeter: All right. It's slowly dawning on me. " + }, + { + "id": "13717162", + "summary": "Dan's had an injection with anaesthesis because he got swollen. He feels it's not working though and it still hurts him.", + "dialogue": "Dan: \r\nHulk: omg! Did it hurt?\r\nDan: no. I got an injection with anaesthesis. But its not working any more and it hurts:(\r\nPete: you're quite swollen\r\nDan: Its nothing compared to yesterday!\r\nHulk: you're kidding\r\nHulk: were you even more swollen than this?\r\nDan: Yes\r\nPete: Jesus" + }, + { + "id": "13727567", + "summary": "Archie is arriving from Southampton around midnight. He will travel by bus. 
He will call Judah.", + "dialogue": "Judah: Hello\r\nArchie: Hey\r\nJudah: So what time are you arriving from Southampton?\r\nArchie: Around midnight; I'll be travelling to yours by bus, so it might take an hour or so\r\nJudah: No worries. Call me on the phone if I happen to be dead by the time you get here\r\nArchie: Lol Will do!" + }, + { + "id": "13729057", + "summary": "Chloe will watch the series recommended by Biwott at the weekend.", + "dialogue": "Biwott: Did you watch the series I told you\r\nChloe: No not yet.\r\nChloe: I have been busy this week but I will watch it during the weekend\r\nBiwott: 👍" + }, + { + "id": "13716791", + "summary": "Lauren wants to have a small tattoo above her ankle.", + "dialogue": "Lauren: ladies, i'm thinking of getting a tattoo\r\nNelly: oh cool i'd love to get one too\r\nTessie: ru thinking of sth specific?\r\nNelly: neh, probably i won't get one ever. i'm afraid of pain\r\nLauren: i'd like to get sth small on my leg.\r\nTessie: where exactly?\r\nLauren: above the ankle. sth small. a bird?\r\nTessie: i know a couple of guys who do it. let me know\r\nLauren: gr8. i'm like 87% sure yet" + }, + { + "id": "13729824", + "summary": "The toilet upstairs is blocked again. Wendy and David can't afford the plumber as Wendy spent the money on her sister's birthday present.", + "dialogue": "Wendy: I think the upstairs toilet might be blocked.\r\nDavid: Oh NO! NOT AGAIN!\r\nWendy: I'm not sure if it is but it is not draining very well.\r\nDavid: Then it probably is blocked.\r\nDavid: You haven't like flushed any tampons or pads down there?\r\nWendy: Of course not!\r\nWendy: I know not to do that.\r\nWendy: Do you think I'm that stupid?\r\nDavid: Didn't say that. I was just checking.\r\nDavid: That toilet is a major issue ever since we got the house.\r\nWendy: Yeah, where are the good old days of renting where you ring the landlord and it is his problem! :-P\r\nDavid: Hahaha! 
True!\r\nDavid: Did you call the plumber already?\r\nWendy: No. I don't think we can afford to pay the plumber this month.\r\nDavid: Fuck! Don't you have anything left in the kitty?\r\nWendy: Well I did but then it was my sisters birthday and I had to get a present...\r\nDavid: So, what are we supposed to do now then?!\r\nWendy: Wait until next paycheck? Fix it ourselves?\r\nDavid: Great :-/ just fucking great. grrr!" + }, + { + "id": "13681892", + "summary": "Kaylen wants to know if there is left-hand traffic in Rowen's country. He confirms there is. She thinks she wouldn't be able to drive there.", + "dialogue": "Kaylen: In ur country there is left-hand traffic?\r\nRowen: Yes we drive on the left\r\nKaylen: Hehe ok\r\nKaylen: \r\nRowen: You look gorgeous!\r\nKaylen: Thank you. I'm sure I wouldn't be able to drive there then\r\nRowen: Haha. You're welcome\r\nKaylen: Or I would have to close my eyes while driving hahhaha\r\nRowen: Hahaha" + }, + { + "id": "13864540", + "summary": "Tom wants to go to Robinson Crusoe's island.", + "dialogue": "Tom: I have to go there:\nTom: \nJonathan: this is insane, you know it, right?\nTom: I know, I love insane things\nOscar: are you kidding?\nTom: not at all\nOscar: I'm not spending a fortune to get to a piece of land in the middle of nowhere\nKit: But the idea is amazing\nKit: is it the real Robinson Crusoe's island??\nTom: it seems it is!" + }, + { + "id": "13680525", + "summary": "Titus agrees to help Emely with a language exercise. Emely sends Titus a photo of the filled out exercise, but he has trouble reading it.", + "dialogue": "Emely: Hey. Could u help me with one task? I have to fill in the gaps and I find it a little bit strange task\r\nTitus: How is that strange ?\r\nEmely: I have problems to so it haha. Can I send a photo of this task and then I will wrote u my answers? 
There are 10 sentences\r\nTitus: Ok I'm doing something now but I can look at it quickly if it won't take up a lot of my time\r\nEmely: Ok\r\nEmely: \r\nTitus: Can't really see that\r\nEmely: A) a cash cow b) hoes down the drain c) nepotism d) a golden opportunity e) a sweetheart deal f) win-win g) grease sb's palm h) a license to print money i) in full swing j) implementation\r\nTitus: But I cannot read it\r\nEmely: Really? I can see it very well on my phone\r\nTitus: That sounds about right\r\nEmely: So u can see it?\r\nTitus: Barely\r\nEmely: Oh ok" + }, + { + "id": "13863152", + "summary": "Marco will read Aldo's 12 page article that he spent 2 weeks writing.", + "dialogue": "Aldo: Hi, did you get my email?\nMarco: Yes I did, thank you\nAldo: I hope you like my article\nAldo: It took me weeks to finish it\nMarco: Ok, I'll print it and read it right now\nAldo: Thanks\nMarco: How many pages is it?\nMarco: Ok, got it, it's 12\nAldo: Yep, 12\nMarco: Ok" + }, + { + "id": "13729679", + "summary": "Harris' friend, Aoki, who lives in Michigan, died yesterday. Harris hasn't seen her for a few months.", + "dialogue": "Harris: How are U?\r\nLena: Fine, U?\r\nHarris: Been better.\r\nLena: ?\r\nHarris: My friend Aoki died yesterday.\r\nLena: O No!\r\nHarris: Yeah.\r\nLena: What happened?\r\nHarris: Not sure yet. Thinking the worst...\r\nLena: O how awful!\r\nHarris: Yes.\r\nLena: You just never know.\r\nHarris: True.\r\nLena: Had you seen her lately?\r\nHarris: Not for a few months. She lives in Michigan.\r\nLena: Oh, that's far.\r\nHarris: Not too far but far enough.\r\nLena: Right.\r\nHarris: Got to go, mom's calling.\r\nLena: K bi. Feel better!\r\nHarris: K will do" + }, + { + "id": "13813794", + "summary": "Hannah's New Year's resolutions are: work out, cook for herself, start dating. Brooklyn didn't make any. In the past she had, but she never fulfilled them. ", + "dialogue": "Hannah: HAPPY NEW YEAR!!! :))) \r\nBrooklyn: Same to you! 
Have you made any New Year's resolutions? ;>\r\nHannah: A whole list ^^\r\nHannah: New year, new me :D\r\nBrooklyn: Really? :D So how you're going to change your life this year?\r\nHannah: First, I'm gonna lose weight and exercise everyday (or at least 3 times a week).\r\nHannah: Second, I want to learn how to cook and start meal prepping\r\nHannah: Finally, I'm gonna find my future husband (or at least start using dating apps):D\r\nHannah: And you? Have you set any goals for 2019?\r\nBrooklyn: Hmm, interesting :D especially the last point ^^\r\nBrooklyn: Yeah, just one - not to make any resolutions.\r\nHannah: Whyyyy??\r\nBrooklyn: I just don't believe that a new year means a fresh start.\r\nBrooklyn: Every past year I told myself I would lose weight, quit smoking, start going to the gym etc. And I'm still a fatty, who smokes like a chimney and rarely leaves their couch.\r\nHannah: You're a little ray of sunshine, aren't you? ;)\r\nBrooklyn: :p And you're an undaunted optimist. What makes you believe that everything is going to change for the better?\r\nHannah: I don't know, I just like to think that we get numerous second chances to change our lives, to make right what's wrong.\r\nHannah: That's it, I don't have any better explanation.\r\nBrooklyn: Well, you haven't convinced me. :) Nevertheless good luck with all your resolutions. ;)\r\nHannah: Thanks! :*" + }, + { + "id": "13682006", + "summary": "Trevor got Abigail pregnant. When they were having sex without protection her father, a church pastor, kicked Trevor in the butt and Trevor came inside Abigail. ", + "dialogue": "Trevor: I've got a bit of a problem, Uncle\r\nRichard: Have you, Trevor?\r\nTrevor: Yes, Uncle I need to get you advice on how to break this to Dad\r\nRichard: You haven't gone and got that Abigail up the duff have you?\r\nTrevor: That's the long and short of it, Uncle.\r\nRichard: Shit. How did that happen?\r\nTrevor: Well, it's her father's fault, Uncle.\r\nRichard: Her father's fault? 
And him a church pastor?\r\nTrevor: Yes, Uncle. She has sneaked me into the house really quietly and we are in her bedroom doing it.\r\nRichard: Fucking hell. Do go on.\r\nTrevor: And we haven't got any condoms so I am planning to pull out at the last moment\r\nRichard: Unbelieveable: Bloody idiot. What then?\r\nTrevor: So I am just pulling out and all of a sudden in bursts her father and delivers me such a kick up the arse that I am back in there coming.\r\nRichard: So you say it is all his fault, then? Incredible.\r\nTrevor: Yes. \r\nRichard: So maybe we can get him to pay the child support instead of you then, eh?" + }, + { + "id": "13728809", + "summary": "Jessica posts a lot regarding subjects she does nothing about in reality. Julia posts in a more genuine way. But Hillary thinks she does it to death and lacks real life.", + "dialogue": "Julia: I mean I like my Instagram. And my Snapchat. Oh and Twitter. And sometimes Facebook.\r\nGail: So do I. But this doesn't mean I'm addicted. \r\nJulia: Neither am I. I like looking at photos of my friends and sharing stuff with them. And I think they like it when I post stuff.\r\nGail: I certainly do. And I like to spy on ppl :)\r\nJulia: Rly?\r\nGail: Yeah! It's a lot of fun! Like I was spying on Em and turns out she's into some guy from work ;)\r\nJulia: Rly? Intriguing.\r\nGail: I know! :) and Jessica is thinking about going on diet.\r\nJulia: Wasn't she on one already?\r\nGail: Nah. Whenever she posts a lot of fitness-related stuff, she's just thinking about, but doing nothing rly. \r\nJulia: I always thought she was training day and night!\r\nGail: Nah. That's just how she is. The more she posts, the less she does. Like she had a phase for animal shelters. Remember?\r\nJulia: Yeah. She just wouldn't shut up about it. She posted every single thing she could find on the subject!\r\nGail: Right. And turned out that was everything she did. Never visited one. Never donated a dime. 
Never did anything.\r\nJulia: And all the time I thought she was so active and pro-active and charitable. She had me fooled ;)\r\nGail: U see? Spying on ppl is fun :)\r\nJulia: Speaking of which, did u spy on me?\r\nGail: No, y would I?\r\nJulia: It's fun?\r\nGail: Oh no! Don't get me wrong! We talk all the time, so no need to spy on u ;)\r\nJulia: Gr8.\r\nGail: Besides ur pretty straightforward.\r\nJulia: What do u mean?\r\nGail: When ur eating, u post food. When ur training, u post fitness materials or photos. When ur relaxing, u post a bunch of stuff.\r\nJulia: I know :) that's y ppl like what I do, 'caus I genuine :)\r\nGail: Yeah...\r\nJulia: What's that supposed to mean?\r\nGail: Hillary thinks u overdo it and have no life in real life.\r\nJulia: That bitch!\r\nGail: I know!" + }, + { + "id": "13716618", + "summary": "Miranda called Tom yesterday and spoke to him in a sweet way. Anne is angry with her because Anne is dating Tom.", + "dialogue": "Anne: I hate that bitch!\r\nCatherine: What did she do again???\r\nNora: Who's the bitch?\r\nNora: Sorry I missed something\r\nCatherine: Miranda!!! \r\nCatherine: We all hate her\r\nAnne: Yesterday she called Tom, was all sweet with him you know how she can be....\r\nCatherine: No way she did it!! How dare she!! She knows you guys are dating.\r\nAnne: I think that's why she did it. She wants to take Tom away from me.\r\nNora: Oh, come on! She's not his type! " + }, + { + "id": "13821297", + "summary": "Barbara and Eva described their dietary requirements in the website given by Ella.", + "dialogue": "Ella: Hey my dear family, could i ask you to confrim that youre coming and if you have any dietary requirements through this link. I need this information for the catering company. HUgs \r\nBarbara: Done!!!\r\nEva: <3\r\nBarbara: I flled out for mom too\r\nElla: I saw that\r\nBarbara: (Y)" + }, + { + "id": "13727877", + "summary": "Sarah will arrive to New York on Thursday. 
Joshua expects to get a gift.", + "dialogue": "Joshua: Hi sister, when will you be arriving\r\nSarah: i will reach New York this coming Thursday.\r\nJoshua: That's great. See you then.. I will be waiting for my gift\r\nSarah: yeah sure.. hahaha" + }, + { + "id": "13821268", + "summary": "Allison has got a scholarship.", + "dialogue": "Allison: Hey girls! \r\nMaya: hey!\r\nSarah: hey, why you so cheerful?\r\nAllison: Guess what!\r\nAllison: I've got a scholarship!\r\nMaya: no way! you have made it!\r\nSarah: shut up!\r\nAllison: yeee, and it is the highest posible rank i could get\r\nMaya: we so proud, when do we celebrate!?\r\nSarah: \r\nAllison: Whenever you want! thank you <3" + }, + { + "id": "13611693", + "summary": "Sam won't finish work till 5. Sam is bringing him over about 9 am. Sam will see Abdellilah in the morning. ", + "dialogue": "Abdellilah: Where are you?\r\nSam: work\r\nAbdellilah: What time you finish?\r\nSam: Not til 5\r\nAbdellilah: Are your bringing him over tonight:\r\nSam: No in the morning:\r\nAbdellilah: ok, what time?\r\nSam: About 9. Is that ok?\r\nAbdellilah: ok - see you then" + }, + { + "id": "13680969", + "summary": "Betty and Phil are meeting at 6.30 to watch a thriller at the movies. They will have dinner at Phil's afterwards.", + "dialogue": "Betty: What's on at the cinema tonight?\r\nPhil: I don't know. I haven't checked it either.\r\nBetty: I'm looking at the website. There are two comedies and one thriller which seem interesting.\r\nPhil: Choose. I'm fine with whatever.\r\nBetty: What if you get bored?\r\nPhil: I won't. Don't worry. Just choose.\r\nBetty: Ok. So the thriller. We watch lots of comedies at home.\r\nPhil: Fine :-) Could you book the tickets if you're already on the site?\r\nBetty: Sure.\r\nBetty: Done :-)\r\nPhil: When and where would you like to meet?\r\nBetty: How about 6pm near the theatre?\r\nPhil: It may be a little difficult for me. 6.30? \r\nBetty: Ok. 
I was thinking about a short walk before the movie but we can have a stroll afterwards as well :-)\r\nPhil: Thanks :-)\r\nBetty: Shall I prepare dinner or do you fancy eating out tonight?\r\nPhil: Dinner with you at home.\r\nBetty: Ok. I have a few ideas for our menu :-)\r\nPhil: Won't it be a big problem for you? \r\nBetty: Not at all. Pleasure.\r\nPhil: :-D Thanks and see you :-)\r\nBetty: :-*" + }, + { + "id": "13730168", + "summary": "Robert has a new phone number starting with 304. Robert has lost his job and is looking for a new one. Serge offers to pass on Roberts CV to a contact. ", + "dialogue": "Robert: Happy Christmas! Wishing you and Elena all the best for the Christmas season and a Happy New Year!\r\nSerge: Thanks, you too, Robert.\r\nRobert: By the way - please cancel the phone number for me that starts with 713. The one beginning with 304 is the only number for me now.\r\nSerge: OK, Robert. Is all OK?\r\nRobert: Well, I will tell you more in a few weeks, but in short from today I am looking for a new job.\r\nSerge: What? They sacked you? After your huge success in the Ukraine?\r\nRobert: That's life.\r\nSerge: Those fucking bastards.\r\nRobert: Well, I could see it coming, ever since the merger. The Swedes didn't even bother to find out what people did before they started laying them off.\r\nSerge: Send me your CV, I know one Belgian guy who is thinking of opening a new factory here. Can't promise anything, but at least I can try.\r\nRobert: Serge, I really appreciate that." + }, + { + "id": "13681928", + "summary": "Lily spent an amazing night with Thomas. 
Thomas is Romanian and comes from a very rich family.", + "dialogue": "Lily: I spent an amazing night with this guy\r\nKate: What's his name?\r\nLily: Thomas\r\nKate: Is he English?\r\nLily: He is Romanian, but from an extremely rich family, it seems\r\nKate: What does he do in London?\r\nLily: LSE\r\nKate: of course, so predictable \r\nLily: Hahah, but I expected him to be more banal and boring\r\nKate: but?\r\nLily: to be more mediocre...\r\nKate: in size, in passion or in wallet?\r\nLily: In everything. I've had so much pleasure that I have a headache\r\nKate: I'm glad to read that. You deserved some fun after the last weeks.\r\nLily: haha, I agree!" + }, + { + "id": "13730545", + "summary": "Sam wants to buy a custom dress as a surprise for his wife. It should be black and elegant. The store employee sent him some pictures for reference. Sam decided on the features he likes. His product number is 898998 and he will place the order on the company's website.", + "dialogue": "Sam: hi, i need a help\r\nSarah fashion: hello how can i help?\r\nSam: Actually i was looking for a nice black dress for my wife, i mean i dont want the in-store product..\r\nSarah fashion: Yes sir, we make dresses on order as per customer requirements.\r\nSam: yeah i saw that option on the web page, actually its a surprise gift for her, but i have no idea what should be the requirements of the dress.\r\nSarah fashion: oh in that case why dont you choose something ready made sir\r\nSam: Actually i want something different for her something she has not seen before\r\nSarah fashion: that nice, do you have any sketch in your mind it would be easier to help \r\nSam: yes that it should be a dress, black in color decent and elegant, and.... thats it :(\r\nSarah fashion: :) dont worry Sir we will try to help you as much as we can but you have to choose between the choices we give you\r\nSam: Sure.\r\nSarah fashion: Would you mind coming to the store? 
or you want to place order here only?\r\nSam: i was wondering if i could get help and decide i would place order right here...\r\nSarah fashion: Sure sir i am sending you few pictures you can mix and match the designs and that way we would be able to create a new design?\r\nSam: that sounds like a good idea..\r\nSarah fashion: \r\nSam: wow! they are all so good but they are available for every one right?\r\nSarah fashion: yes sir!\r\nSam: ok so i want the cut that is in sleeves like this length and buttons \r\nSarah fashion: Nice choice sir, your product number is 898998 now you can order on the website with this product number and the same procedure would be applied to your order.\r\nSam: Thank you so much, i didnt know it was so easy.\r\nSarah fashion: Your welcome sir, We are glad your liked the service and we hope you like the dress too. \r\nSam: :)" + }, + { + "id": "13681194", + "summary": "Kamden hasn't used social media recently. He uses messenger only and wants to get Mckinley's photographs.", + "dialogue": "Kamden: Hey!\r\nMckinley: Hi!\r\nKamden: I haven't seen you in a while - i've mostly been off social media. Maybe you'll let me have a little peek?\r\nMckinley: You aren't on fb anymore?\r\nKamden: I use chat on fb. I'm not big on social media use\r\nMckinley: Hmm...\r\nKamden: It helps me keep up with good friends\r\nMckinley: But you always can open it and check my photos lol\r\nMckinley: Yeah I use fb mostly to keep contact with people\r\nKamden: It's true. But I guess it would be more enticing to get it from you. 😏 Yeah it's my main reason. But I spend zero time therecc\r\nMckinley: Lol I'm not a phone selfie person\r\nKamden: Thats a shame. Lol\r\nMckinley: Lol" + }, + { + "id": "13612013", + "summary": "After work, Mike is going to go to the gym and then home. He invites Jason to bring some food and come over. 
They can play 2 on ps4.", + "dialogue": "Jason: Yo, what are you doing after work?\r\nMike: Going to the gym and then home boy.\r\nJason: You eating at home?\r\nMike: Yep, bring your food and come over.\r\nJason: Will do. Afterwards we play some Destiny 2 on ps4?\r\nMike: You bet ya, that game is so addictive!\r\nJason: Ok cya later then!´\r\nMike: Oh yeah👌" + }, + { + "id": "13731393", + "summary": "Linda got a new job and is moving to Ohio for 6 months. Her brother will stay at her current apartment. Daisy will come by Linda's place to meet her and help her pack as she is too busy to go out.", + "dialogue": "Daisy: hey whats up\r\nLinda: Not much\r\nLinda: Im just packing\r\nDaisy: off on a vacation\r\nLinda: didn't I tell ya?\r\nLinda: I'm moving\r\nDaisy: really?\r\nDaisy: what happened?\r\nDaisy: you had a great place!\r\nLinda: got a new job\r\nLinda: I'm moving to Ohio\r\nDaisy: OHIO!\r\nDaisy: that so far away :(\r\nLinda: yeah well\r\nLinda: It's just for 6 months\r\nDaisy: and your apartment?\r\nLinda: my brother is going to live there\r\nDaisy: can we meet up?\r\nDaisy: before you go?\r\nLinda: ehh I have like no time for anything\r\nLinda: maybe you can come by\r\nDaisy: Sure!\r\nDaisy: I'll help you pack :D\r\nLinda: that would be great! " + }, + { + "id": "13681560", + "summary": "Hayden must write her thesis in 1 month. She wonders what degree course would be the most beneficial for her. She's interested in African studies. Hayden claims she could be a flight attendant as she can swim and knows foreign languages.", + "dialogue": "Hayden: Anyway I have 1 month to write my thesis. And then I need to decide what studies I should choose and I have a problem because I don't know what I can do in the future to make good money\r\nMargaret: You'll find something\r\nHayden: And the only studies I'm interested in are African studies but I'm not sure I can make big money later on haha except for working in the embassy or something like that. 
I was thinking about working as a flight attendant. It would be easy for me to get that job since I can swim (and here it's obligatory) I'm even a water rescuer. I know English italian and polish and a bit of german.\r\nMargaret: So go ahead for it\r\nHayden: But to be honest , I don't think so that job is so great. I can't work there forever and I'm not that sure I wanna risk every time hahah since flight accidents happen\r\nMargaret: Hahahaha you shouldn't think about that\r\nHayden: But I don't wanna die hhahahahah\r\nMargaret: It would be good you would get to travel a lot" + }, + { + "id": "13828403", + "summary": "Alan has found some cinnamon whiskey and sends Robert photos of it. ", + "dialogue": "Alan: \r\nAlan: look what I just found :)\r\nRobert: dude, that's just nasty and you know it :) \r\nRobert: it has no sugar, no taste, and additional cinnamon flavoring\r\nAlan: yeah, I know - that's awesome :)\r\nRobert: you sir have a very strange tastes :P\r\nAlan: well, and I found a perfect company for it \r\nRobert: oh, that's more like it!\r\nRobert: but does the whiskey go well with the cinnamon? flavored whiskey is the worst... \r\nAlan: Actually it does taste surprisingly well. The cinnamon is not overpowering. If you put enough whiskey that is :)\r\nRob: Lol, thought so :)\r\nRob: I just wish the brought the old cherry flavor back...\r\nRob: not the useless no-sugar stuff\r\nAlan: Ah, that is true :)" + }, + { + "id": "13680216", + "summary": "Yaz and Mary are meeting tonight around 6 and going for the slimming club together. They expect to be scolded for eating too much. ", + "dialogue": "Yaz: Going to slimming club tonight?\r\nMary: Well, I don't want to, but I really should. I've been so bad, though!\r\nYaz: Me too, choccies, wine, cake, you name it!\r\nMary: Well, we should bite the bullet. She'll tell us off, I expect. Feels like being back at school!\r\nYaz: Well, she IS our old cookery teacher! See you at 6ish, pick you up!?\r\nMary: Yep! 
See you then!" + }, + { + "id": "13716003", + "summary": "Ellie's class is in 342 on the second floor.", + "dialogue": "Ellie: hey, are you at the university?\r\nAaliyah: yep\r\nCamille: I'm sick :/\r\nEllie: shiet....that's bad\r\nEllie: Aali, where are we having our classes now?\r\nAaliyah: 342 on the second floor" + }, + { + "id": "13829937", + "summary": "Nathan and Aaron are discussing a video which Nathan sent. ", + "dialogue": "Nathan: \r\nAaron: OMG!!! \r\nAaron: 😂😂😂😂\r\nAaron: looool\r\nAaron: do you know her mate? 🤣🤣\r\nAaron: my cat's face looks like that when he's taking a dump..🤣🤣🤣\r\nAaron: \r\nAaron: the angels must be weeping 🤭🤭🤭 \r\nAaron: \r\nAaron: \r\nNathan: Hahahaha\r\nNathan: She's having a spiritual moment 😉\r\nNathan: No clue mate, Dan sent it over\r\nNathan: A tragedy to say the least 😂😂\r\nAaron: \r\nAaron: looking for Jesus\r\nNathan: 😂😂😂\r\nAaron: hilarious..hahahaha" + }, + { + "id": "13812140", + "summary": "William is coming back in 5 minutes as he had to queue for 20 minutes.", + "dialogue": "Emma: You havent been back yet? and where's shake\r\nWilliam: I have been in que for past 20 minutes. Too much rush here\r\nEmma: Ok Hurry up. We cant wait anymore :( \r\nWilliam: Dont worry. Its my turn up next\r\nEmma: :D \r\nWilliam: Coming back in 5\r\nEmma: Ok waiting" + }, + { + "id": "13818707", + "summary": "Jake reserved 3 tickets for tomorrow 7 pm. He got 30% discount.", + "dialogue": "Bob: did you reserve the tickets for tomorrow?\r\nMelanie: Jake promised me he will do that\r\nMelanie: he has some workplace discount on them\r\nBob: Jake are you here?? Did you reserve the tickets?\r\nJake: yes I did, 3 tickets for tomorrow, 7 pm\r\nJake: and we got a 30% discount on them too :)" + }, + { + "id": "13716495", + "summary": "Freddie, Kelly, Jim, Greg, Bob, Mike, Mary, Alan and Nancy are watching different shows on Netflix.", + "dialogue": "Freddie: What are you watching on Netflix? 
I've just finished Mad Men.\r\nKelly: I've just finished The Crown. \r\nFreddie: There should be new episodes coming soon.\r\nKelly: OMG! i can't wait! :)\r\nFreddie: Try Outlander or The Tudors in the meantime.\r\nJim: i'm watching I, Zombie.\r\nBob: try Grimm or Sabrina. They're both great!\r\nKelly: they are all horrible and disgusting! \r\nBob: i think they are funny. They're just TV shows don't take them too seriously. x \r\nGreg: I'm watching House of Cards though must admit the new series is boring.\r\nMike: i agree. not the same without Kevin Spacey! You should try The West Wing if you enjoy political dramas.\r\nGreg: Netflix original Bodyguard is supposed to be good from what I've heard.\r\nMary: I'm a bit ashamed to confess i enjoy stuff like Pretty Little Liars or Gossip Girl ;)\r\nNancy: don't be ashamed i enjoy them too! and my favourite one is Desperate Housewives!!!\r\nMary: I know! I watched it twice!\r\nAlan: I'm catching up with Friends.\r\nNancy: OMG! i used to love Friends!" + }, + { + "id": "13729128", + "summary": "Sonia is going to San Sebastian in a month. Toni enjoyed her the airbnb place there. 
Sonia isn't convinced about it and will let Toni know.", + "dialogue": "Sonia: hey remember last year when you guys went to San Sebastian\r\nToni: yup \r\nToni: hi\r\nSonia: how was that airbnb place you stayed in\r\nToni: it wasnt bad\r\nToni: maybe a bit small for the three of us\r\nToni: but it was next to the Playa de la Concha\r\nToni: and anyway we just used it to sleep\r\nToni: u guys going?\r\nSonia: yeah in a month or so\r\nSonia: we r still planning\r\nSonia: we checked some hostels and so but they r pretty expensive\r\nSonia: plus we rather have more privacy\r\nToni: well i can definitely reccomend the airbnb place\r\nToni: i can get you in contact with the landlady too if u want\r\nSonia: that would be sweet\r\nToni: its this old basque lady, widow, pretty lonely but nice\r\nToni: she even made a tortilla for us\r\nSonia: so shes in the appartment too?\r\nToni: yes she rents a room in the appartment\r\nSonia: oh\r\nSonia: hm I dont know\r\nToni: well think about it and let me know" + }, + { + "id": "13729680", + "summary": "Malik and Samanta want to lose weight. They will try to keep a diet, keto or paleo, and go for runs together.", + "dialogue": "Malik: have you heard of that paleo diet?\r\nMalik: i need to lose some weight and i really want to try it\r\nSamantha: i've heard of it but i've also heard about the keto diet\r\nSamantha: AAAAANNNDDDD... i also need to lose weight lol\r\nMalik: what are you talking about?!? lol\r\nMalik: you're so skinny\r\nSamantha: whatever :-)\r\nMalik: should we try one of those together?\r\nMalik: it's always easier when someone's doing it with you\r\nSamantha: YES!!!!\r\nMalik: we can also go for runs together like we used to :-D\r\nSamantha: let's do it!! 
i'm so pumped!\r\nMalik: so paleo or keto?\r\nSamantha: what's the difference?\r\nMalik: i think they're practically the same, but you can't have dairy on paleo\r\nSamantha: can you have dairy on keto?\r\nMalik: i think you can, i'm no sure though\r\nSamantha: ok let me go online and read more about this\r\nSamantha: and i'll text you back later with more info\r\nMalik: ok\r\nMalik: are you excited??\r\nSamantha: i really am!!!!!!!!! :-D" + }, + { + "id": "13865119", + "summary": "Derek closed some deals today. Phil didn't manage to do it.", + "dialogue": "Derek: It's been a long day\nPhil: Same here\nCynthia: good or bad?\nDerek: Very busy\nDerek: I closed some deals but I had a lot of stress also\nPhil: Lucky you\nPhil: I couldn't close any deal\nPhil: It annoys me \nPhil: Some customers are negotiating for days, weeks\nPhil: And then they pull out \nCynthia: Maybe they're just fishing\nCynthia: Want to compare offers \nPhil: That's what they do " + }, + { + "id": "13813011", + "summary": "People are photoshopping Timothée Chalamet into artworks. Dominic and Nova agree that he looks like a 19th century man.", + "dialogue": "Nova: Do know that people are photoshoping Timothée Chalamet into artworks? :D\r\nNova: It's hilarious, check it out: \r\nDominic: lol XD\r\nDominic: it kinda looks good\r\nNova: Right? :D\r\nDominic: he looks like a typical young man from a 19th-century portrait\r\nNova: omg you're so right! :D he looks as if he read Rimbaud's poems and drank absinthe on a daily basis\r\nDominic: hahaha XD it's so accurate" + }, + { + "id": "13611436", + "summary": "Peter has been working out at the gym near their office lately to improve his health. Lisa is considering working out and eating better to be healthier.", + "dialogue": "Lisa: Hello Peter. 
What have you been doing off late?\r\nPeter: Bit busy with work.\r\nLisa: Too much of work, huh?\r\nPeter: Not really, I’ve been working out\r\nLisa: Ohh, trying to get in shape?\r\nPeter: Nah, I don’t have much of weight to lose, just want to improve my health.\r\nLisa: That’s a good thought. I was also thinking of working out a bit. What do you do? Do you go to a Gym?\r\nPeter: Yes, I hold a membership with the one near our office.\r\nLisa: Which exercises do you do regularly?\r\nPeter: I do weights and run on the treadmill.\r\nLisa: Besides exercises, I think I need to eat better to help me keep in shape.\r\nPeter: Another major requirement for good health is sleep. On an average one requires at least 7-8 hours of sleep.\r\nLisa: There are a lot of things we can do to stay healthy.\r\nPeter: Yes. One needs to maintain a regime to stay healthy for a long term." + }, + { + "id": "13680625", + "summary": "Julia broke Tom's cup, which made him sad. She will buy him a new one.", + "dialogue": "Tom: Where is my cup?!\r\nJulia: I broke it, I think, sorry\r\nTom: You think? Were you drugged up?\r\nJulia: I'll buy you a new one, don't make a drama \r\nTom: :(" + }, + { + "id": "13682111", + "summary": "The CSS tests for the hockey players are today and will last 3 hours, starting 5pm. Hank will bring his son and Don's son as well. Don is glad.", + "dialogue": "Don: The CSS tests are today.\r\nHank: I know! Have you registered in their system?\r\nDon: No, not yet.\r\nHank: I'm not sure if I have to fill out all the fields. Most of them don't even apply to Rodney. It's strange that they're incorporating such professional tests for little leaguers.\r\nDon: Hmm... I don't know. I'll check it later tonight when I get home. Apparently it's supposed to be a database of all the young up-and-coming hockey stars. Scouts use that information for future contracts, etc.\r\nHank: OK, but our kids are 10 years old! 
They're not signing any contracts for now :)\r\nDon: Well, obviously not, but it's kind of cool that they'll be in a world database of peewee hockey players.\r\nHank: Yeah, I guess. So apparently the tests are supposed to last 5-8 pm. 3 hours! I'll probably wait and work on the computer.\r\nDon: Bring a blanket :) It can get mighty cold sitting in a rink for 3 hours.\r\nHank: No shit :) Are you coming.\r\nDon: Well, since you're going anyway, maybe you can take my kid :) \r\nHank: Hey, that's not fair! \r\nDon: I'll take the kids next time, I promise!\r\nHank: Yeah, yeah, we'll see. I'll tell you about the tests when I bring Oscar and Roger back.\r\nDon: Ok, thanks again, I owe you." + }, + { + "id": "13864408", + "summary": "Jessica bought a table, six chairs, a vase and a pile of clothes and the second hand shop downtown. She paid 70 euros for everything. ", + "dialogue": "Jessica: I went to the second hand shop downtown \nFrank: Cool\nRaphael: What did you get?\nJessica: Lots of stuff\nJessica: A table, six chairs, a vase, a pile of clothes\nFrank: That's really a lot of stuff :-)\nRaphael: send us pictures\nJessica: \nJessica: \nRaphael: Beautiful table\nRaphael: Is it wood?\nJessica: yes, oak \nRaphael: Awesome\nJessica: \nFrank: These are the clothes?\nFrank: I love the black dress\nJessica: It's absolutely beautiful\nJessica: As soon as I saw it I knew it would be mine\nJessica: I'm so happy with what I bought\nJessica: And the best is I paid only 70 euros for all of that!! " + }, + { + "id": "13681509", + "summary": "Abigail and Damien are going to church on Sunday. Damien has to put on a coat and tie.", + "dialogue": "Abigail: It's Sundaay.\r\nDamien: So?..\r\nAbigail: You know what that means.\r\nDamien: Hmm no I don't x)\r\nAbigail: Sunday means we go to church~.\r\nDamien: Oh, yeah..\r\nAbigail: Don't forget to put on a coat and tie.\r\nDamien: A coat and tie?.. 
Why?\r\nAbigail: To show respect to God and others.\r\nDamien: Omg..I'm glad Sunday is only once a week.\r\nAbigail: I hope God didn't hear that.\r\nDamien: He'll forgive me 😇\r\nAbigail: Just be ready on time please." + }, + { + "id": "13682223", + "summary": "Lucian is not at home. Desiree wants Lucian to keep her pasta in the microwave.", + "dialogue": "Desiree: U both at home?\r\nLucian: No. I've just got ur msg. Why did u ask about it?\r\nDesiree: No reason. Keep my pasta in the microwave\r\nLucian: I haven't cooked anything" + }, + { + "id": "13729602", + "summary": "Doug has a cool pair of shoes.", + "dialogue": "Doug: These shoes are SWEET!!!!!!!!!!!!!!!!!\r\nJeni: Jealous!\r\nDoug: You should be! I be fly!" + }, + { + "id": "13818361", + "summary": "Tessa doesn't like Chloe texting her boyfriend, Jim Andrews. Jim is Chloe's co-worker so Chloe needs to communicate with him.", + "dialogue": "Tessa: Stop texting my boyfriend ;/\r\nChloe: Uhm, excuse me?\r\nChloe: Who's your boyfriend?\r\nTessa: Jim Andrews.\r\nChloe: Oh, ok! He's a friend from work.\r\nTessa: yeah, but stop texting him. I saw the messages you sent him.\r\nChloe: Which ones in particular?\r\nTessa: You don't text your co-worker like that\r\nChloe: Like what?\r\nTessa: I can see you're being all smiley and touchy-feely with him.\r\nTessa: He's taken\r\nChloe: I'm his friend, I get he's taken, but we work together.\r\nTessa: I see what you're up to, so stop texting him ;/\r\nChloe: Jesus girl, aren't you a bit paranoid?\r\nChloe: We WORK together. How am I supposed to stop texting him? We have to communicate.\r\nTessa: I don't care. I don't want you texting my boyfriend :/ bye" + }, + { + "id": "13821778", + "summary": "Emily, Kate and Marta are going to the Pub X at the central station today for a drink.", + "dialogue": "Emily: fancy a drink after work today?\r\nKate: sure!\r\nMarta: Good idea! \r\nMarta: Where? 
When?\r\nEmily: Maybe in the Pub X at the central station at 5.30?\r\nKate: I may be closer to 6, traffic on my way\r\nMarta: Fine for me.\r\nMarta: See you then, Ladies!\r\nEmily: Bye! see ya :*\r\nKate: :*" + }, + { + "id": "13865298", + "summary": "Jane, Anne and Ella have been to La Perle. Jane ate cheesecake and got an allergic reaction. They are getting out of ER. ", + "dialogue": "Jane: Don't ever go to La Perle :<\nEddie: Why? You love this place\nJane: Used to love this place, not any more\nAnne: We've just been there with Jane and Ella\nAnne: Now we're getting out ER\nEddie: Dear god what happened?! Are you all right?\nJane: Now I'm good, but they almost killed me\nJane: You know I'm allergic to peanuts\nEddie: Oh no they didn't...\nAnne: we ordered some cake, Jane asked for a cheesecake just to be sure\nJane: I told them I'm allergic and they ensured me there's not even a trace of peanuts\nJane: I started swelling after the first bite - apparently there were crushed nuts in the crust\nEddie: I think you should sue them. They really could have killed you\nAnne: I told her the same thing, who knows how many people they killed?\nJane: I just don't get how ignorant you have to be to do something like this" + }, + { + "id": "13680874", + "summary": "Railey will buy Tiffany a burger.", + "dialogue": "Tiffany: buy me a burger on your way home\r\nRailey: ok\r\nTiffany: thx, sis :)" + }, + { + "id": "13727702", + "summary": "Ariana will do shopping in Midtown. Aviana can't join her.", + "dialogue": "Ariana: I think I am going shopping\r\nAviana: Where? \r\nAviana: Midtown? \r\nAriana: Yeah\r\nAriana: I wanna buy some stuff\r\nAviana: I wish I could go with you \r\nAriana: Thats ok 🙂" + }, + { + "id": "13819698", + "summary": "Connor bought his halloween costumes at Value Village where Jane plans to get her and her sister's costume.", + "dialogue": "Jane: Hey\r\nMartin: Whats up\r\nMaria: Hey\r\nJane: Anyone going to Value Village? 
\r\nJane: I am getting halloween costumes with my sister\r\nJane: If anyone wants to join\r\nConnor: I got mine today so thanks\r\nConnor: At Value Village\r\nJane: I am going with u ok? priv msg\r\nJane: ok ok" + }, + { + "id": "13828146", + "summary": "Today Mary didn't go to school, she stayed at home.", + "dialogue": "Mary: Hi my friend :*\r\nAlice: U re not at school?\r\nMary: No i stayed at home today.\r\nAlice: Lucky u!" + }, + { + "id": "13819724", + "summary": "Elena is wearing the red jacket and Jeffrey can't see her nor Tom.", + "dialogue": "Tom: we're few meters from you, can't you see us?\r\nJeffrey: lol, nope\r\nElena: c'mon, the red jacket!" + }, + { + "id": "13681924", + "summary": "Sophia apologizes to Mason. She sends him a kiss photo on his request.", + "dialogue": "Sophia: I'm sorry\r\nMason: It's fine\r\nSophia: Ok...If u was there, I would give you a hot kiss for apologize\r\nMason: Hahaha. You still send me a photo one\r\nSophia: What photo?\r\nMason: Kiss photo. Haha\r\nSophia: Hehe I sent u already such a photo\r\nMason: Another one wouldn't hurt\r\nSophia: Maybe later :) When I take a shower and look good\r\nMason: And who says you don't look good now ?\r\nSophia: Me\r\nMason: Let me be the judge of that\r\nSophia: No\r\nMason: I'll still love you the same. Whether you have make up or not\r\nSophia: Hehe but I don't want u to see me when I do not look good\r\nMason: You must\r\nSophia: I must what?\r\nMason: Send me the kiss now\r\nSophia: Haha\r\nMason: Doesn't matter how you look, you'll still look good to me\r\nSophia: Hahah\r\nSophia: " + }, + { + "id": "13821035", + "summary": "Steffen twisted his ankle yesterday and needs a lift to the infinity pool. Irene's car probably won't make it up the hill, so they'd have to park at the bottom and hike up. Mr.Budd should make it up the hill since it's a 4-wheel drive.", + "dialogue": "Steffen: Any room in any of the cars going to the infinity pool? 
Im more handicapped than usual since I twisted my ancle yesterday :(\r\nIrene: we can give you a lift. Don’t think the car can make it all the way up, so will park at the bottom and hike up \r\nSteffen: Then I think I have to skip - cant really walk on my leg atm :confused: But thanks anyway\r\nIrene: :(\r\nDan: I’m pretty sure Mr.Budd could make it, it’s 4wheel drive, if mr.budd is going, although I haven’t seen the hill \r\nLuke: have you been up there? how bad is the road actually?\r\nLuke: lol, that explains it\r\nLuke: Sandy, is it vistas de olas?\r\nBen: Yes! Vistas de olas" + }, + { + "id": "13864917", + "summary": "Jake, Florence, Margot and others are going on a research trip to Swazi. The name of the country was changed last year and it's now Eswatini.", + "dialogue": "Mike: who is going for the research trip?\nJake: Me, Florence, Margot and others\nJake: but that's not important\nFlorence: hahah\nMargot: true, we have our nice bunch of people\nMike: so maybe I'll apply too\nJake: but remember it can be harsh in Swazi now\nMike: gosh, Jake, have you realised at least that it's not even Swazi anymore?\nFlorence: hahaha, quite hilarious\nJake: ?\nMargot: they changed the name of the country last year\nJake: what? so what's the name now?\nMargot: Eswatini\nJake: are you kidding me?\nMargot: Jake, it's basic knowledge before the trip LOL" + }, + { + "id": "13730730", + "summary": "Christine is sick and won't come to school tomorrow. Annie will leave Theraflu sachets in a mailbox. Christine doesn't want to get her sick.", + "dialogue": "Annie: Are you going to be at school?\r\nChristine: Not tomorrow. I am not well.\r\nAnnie: Oh noes! What happened?\r\nChristine: Got the flu, I think.\r\nAnnie: what's your temperature?\r\nChristine: Not high, I'm not running a fever or anything\r\nAnnie: Are you eating ok?\r\nChristine: Yeah. Just blocked nose, sore throat. Tired.\r\nAnnie: Sounds like you've got a cold. 
You need anything?\r\nChristine: I could do with some Theraflu.\r\nAnnie: OK, I think we've still got some sachets, should be in date. I'll drop them through your letterbox later on.\r\nChristine: Yeah. Don't call in because I'll feel bad if you catch this cold off me.\r\nAnnie: I think I probably had it already, but you might be sleeping.\r\nChristine: If the light in my room is on, call if you want." + }, + { + "id": "13730522", + "summary": "Jill called Sarah. She also sent her some old pictures. ", + "dialogue": "Bob: you bitch... why you called Sarah?\r\nJill: because i want to.. who are you question me?\r\nBob: try whatever you can bitch you cant get me back\r\nJill: huh? excuse me i dont want you back so just fuck off\r\nBob: really then why your calling my girlfriend and sending her our pictures..\r\nJill: its just that i hate you and i dont want you to be happy :haha:\r\nBob: really bitch ? but u told her she cannot snatch me from you? \r\nJill: yesss! its fun to hurt her \r\nBob: i cant believe i was living with a bitch like you\r\nJill: oh yes and you would have lived if i wouldnt have kicked you out\r\nBob: What? huh! i left you.. dont remember you were begging me to love you not to leave you?\r\nJill: whatever.. i am glad i could make you angry and hurt you\r\nBob: ok thank you dear :)\r\nJill: thank you?\r\nBob: yes for saying all this, this is Sarah and all is good now... better luck next time?\r\nJill: get lost," + }, + { + "id": "13730778", + "summary": "Peter starts his new job on the 6th. Peter wanted a free babysitter. Aggie will arrange for a babysitter. 
", + "dialogue": "Aggie: When do you start work?\r\nPeter: at 8 \r\nPeter: why?\r\nAggie: no when do you start the new job?\r\nPeter: oh on the 6th\r\nAggie: Ok I'll get a babysitter then\r\nPeter: lol wanted a free babysitter\r\nAggie: yeah sorry :P" + }, + { + "id": "13815679", + "summary": "Aimee is looking for Maryam.", + "dialogue": "Aimee: Do you know where Maryam is?\r\nSoren: Nope\r\nSoren: You tried his number?\r\nAimee: Yes\r\nAimee: I even went to her home\r\nSoren: She might have gone somewhere with his father\r\nAimee: Maybe" + }, + { + "id": "13716512", + "summary": "George, Robert and Paul are going to play basketball on Friday at 7. Yousuf will be late.", + "dialogue": "George: Yo! Who wants to go play basketball on Friday, 7 p.m.?\r\nRobert: Count me in! \r\nYousuf: Can I come half an hour later? I need to help my sister with her car.\r\nGeorge: No prob. Paul, u coming?\r\nPaul: Hell yeah! I'll bring some beers too!\r\nRobert: Sounds like a plan!" + }, + { + "id": "13612049", + "summary": "Rory wants Mitch to take Bill and Sammy and they'll chip in for gas. Mitch will be leaving Sunday, the 29th to get there by 9 am on Monday. Bill will arrive around 10 am Sunday and Joanna will be picking him. Mitch will meet Bill after.", + "dialogue": "Rory: Hey Mitch, how are you? I hope you're doing ok. We are thinking of signing up Bill for that International camp. Sammy will be going too. Do you think it would be possible for you to take Bill and Sammy if you go down?\r\nMitch: I'm doing well - pretty tired. Yes, I do think that's possible :)\r\nRory: Great, that would mean a lot to us. Thanks a lot. \r\nMitch: That'd be wonderful! July, I'll be at a few camps ;)\r\nRory: Of course, we'll chip in for gas. Ok, no problem :)\r\nMitch: Thx, no worries. I hope you're all doing well. Time is flying by ;)\r\nRory: Ok, great. I know, it's crazy.\r\nRory: Do you know exactly which date you'll be leaving?\r\nMitch: Sunday, the 29th\r\nRory: Ok, cool. 
It's such a long drive.\r\nMitch: Yeah, around 1000 km, but I'll try to get there early Mon morning.\r\nRory: That's a pretty ambitious undertaking :)\r\nMitch: I have to, camp starts Mon 9.\r\nRory: Just be careful, and take lots of breaks, don't fall asleep at the wheel, etc.\r\nMitch: I'll have the boys to keep me awake, and lots of snacks :)\r\nRory: Ok. We will probably put Bill on a Flixbus, so he'll arrive around 10 am Sun\r\nMitch: Cool, just make sure he has his phone on him.\r\nRory: Ok, no problem. We'll give him your number, but most likely Joanna will be picking him up.\r\nMitch: Oh, ok. That makes things easier. I'll meet him after.\r\nRory: Great, thanks a lot for doing this, I don't know how else we would get him down to Croatia.\r\nMitch: No problem. Talk to you later.\r\nRory: Ok, bye." + }, + { + "id": "13730333", + "summary": "Jones and Angelina will meet in town in the afternoon.", + "dialogue": "Jones: Hey.\r\nAngelina: Hey.\r\nAngelina: Long time. How are you doing?\r\nJones: I'm fine\r\nJones: You?\r\nAngelina: I'm cool too.\r\nJones: You think we can meet today later in the afternoon in town?\r\nAngelina: Definitely.\r\nJones: Okay. I will call you to confirm where we will meet.\r\nAngelina: Cool" + }, + { + "id": "13681165-1", + "summary": "Derek and Alyssa make fun of Fergie's performance of the national anthem.", + "dialogue": "Alyssa: Have you seen Fergie’s national anthem? Illuminati does a great job.\r\nDerek: This is not normal. I saw it last week…\r\nAlyssa: What do you think about it?\r\nDerek: I can fart bright stripes and bright stars better then she sings.\r\nAlyssa: The best part is that she acts like she nailed it. But at least it's funny in a good way.\r\nDerek: It is 😂" + }, + { + "id": "13611834", + "summary": "Pam doesn't have rota for Lauren, but Manager may give Lauren more tomorrow. Pam and Lauren will meet tomorrow and discuss Lauren's holiday. 
", + "dialogue": "Lauren: Hi do you still need me for tomorrow\r\nPam: Yes please!!\r\nLauren: Do you have any more rota?\r\nPam: No, but the Manager's back tomorrow so she may do some more then. I'll ring in the morning and let you know.\r\nLauren: ok that's great\r\nPam: Did you have a good holiday?\r\nLauren: Yes, will tell you all about it tomorrow\r\nPam: Look forward to it!" + }, + { + "id": "13816051", + "summary": "Jamie has never gone ghost hunting but Harriette did with her friends once in high school. They did not see any ghosts and she only got frightened by a cat's miaowing.", + "dialogue": "Harriette: Have you ever gone ghost hunting? ;o\r\nJamie: Ghost hunting? Nah, not really... Have you?\r\nHarriette: Yeah, once when I was in high school! There was a run-down building in the neighbourhood and we went to investigate it with my friends\r\nJamie: How was it? Did you find something?\r\nHarriette: We didn't see any ghosts, haha\r\nHarriette: But let me tell you that I never thought I'd freak out this much at hearing a cat meow\r\nHarriette: There's just something about the atmosphere... that makes you overreact and find normal but unexpected things really creepy\r\nJamie: I guess that's part of the experience? :p\r\nHarriette: Yeah, if I could choose again, I'd probably still decide to go - I don't regret it! But I definitely wouldn't try something like that alone ^^;" + }, + { + "id": "13810093", + "summary": "Jack needs Kev's help as he cannot get the application running. ", + "dialogue": "Jack: Kev, I need your help?\r\nKev: What's up, mate?\r\nJack: I can't get the application running.\r\nKev: Have you switched the computer on?\r\nJack: Very funny!!!!!!!!!!!!!\r\nKev: OK. Sorry. I can see it's serious.\r\nJack: Yeah, man. It is f**cking serious.\r\nKev: I'll be with you right now.\r\nJack: Thanks." + }, + { + "id": "13727962", + "summary": "Dan wants to apologize to Angela. 
They will meet at school later.", + "dialogue": "Dan: look, i'm sorry\r\nDan: please text back\r\nDan: I'll explain everything if you agree to meet up\r\nAngela: there's nothing to explain\r\nDan: please Angela, hear me out first\r\nAngela: should i trust you\r\nDan: let me show you why you should.\r\nAngela: Okay, meet me at school later\r\nDan: Okay" + }, + { + "id": "13862566", + "summary": "Shaldona sends mobile invitations to her wedding, as she has no time to give them in person.", + "dialogue": "Shaldona: WE ARE GONNA GET MARRIED ❤️❤️\nShaldona: \nShaldona: This is our mobile inviation for our wedding.\nShaldona: Invitation*\nPiper: Hey. You haven’t sent me any messages for a few years.\nPiper: And now you are sending me your wedding invitation \nPiper: THROUGH MESSENGER?\nShaldona: .....\nShaldona: Well..\nShaldona: I had no enough time to meet everybody and give this in person.\nShaldona: Hope you understand.\nPiper: If you don't have time to give the invitation card in person but expect people go to your wedding\nPiper: Shaldona, if so, you are too greedy." + }, + { + "id": "13730927", + "summary": "Paula and Ralph will meet the new person in an hour. ", + "dialogue": "Paula: Can we meet with the new person soon?\r\nRalph: Sure. In an hour okay?\r\nPaula: Perfect." + }, + { + "id": "13819626", + "summary": "Ania, Kasia, Zuzia and Jan want to go to the church tomorrow. Ania and Zuzia do not find it appropriate to go to the church with a boy.", + "dialogue": "Ania: Let's go together to the church tomorrow\r\nKasia: what a wonderful idea! to praise Mary the Queen of Poland together! So beautiful\r\nJan: Yes, let's do it!\r\nZuzia: I don't think a boy should go with us, find yourself other friends, some boys\r\nAnia: I agree, it's inappropriate " + }, + { + "id": "13729256", + "summary": "Harry waits outside. The movie has already started but Ema needs another 5 minutes, which made Harry angry.", + "dialogue": "Harry: Where are you? 
i am outside\r\nEma: coming just 2 mins...\r\nHarry: You told me you were ready.. you know movie has already started\r\nEma: i am sorry give me 5 minss\r\nHarry: Damn! take foreverrr" + }, + { + "id": "13864921", + "summary": "In the agreement it was decided that it's neither a sea nor a lake and it will have a special legal status. They will also completely divide the seabed up. It's rich in resources, mostly gas and oil.", + "dialogue": "Jeff: Do you know guys anything about the agreement?\nVladimir: the most important is that they decided it's neither a sea nor a lake\nVladimir: so it will have a special legal status\nTanya: and they will completely divide the seabed up\nJeff: sure, it's rich in resources\nDonald: yeah, mostly gas and oil\nVladimir: and \"between 80-90% of the world's caviar is sourced from the Caspian\"!!!\nJeff: hahaha, right!" + }, + { + "id": "13717284", + "summary": "Joe's job is wearing him up. Tim's friend Terry quit his job because he was burned out.", + "dialogue": "Joe: This job is wearing me up\r\nTim: Oh no! I thought you love it\r\nJoe: I do, but because of it they give me more work\r\nSam: Shit, this sucks man, don't let it burn you out\r\nTim: Exactly, like my friend Terry\r\nJoe: What did he do?\r\nTim: Quit eventually, but he had trouble sleeping, constantly tired, and turned out to have an ulcer\r\nJoe: Fuck! I gotta slow down" + }, + { + "id": "13820710", + "summary": "Natalie is checking if it's worth going to the new club at Regents Street. Denise thinks the club is great. Judy's friends also recommend the place, so Judy is going there this weekend. Natalie will go to the club with Judy, Miranda and Helen on Saturday.", + "dialogue": "Natalie: Have you been to this new club at Regents Street?\r\nJudy: I'm going there this weekend!\r\nJudy: I heard it's nice \r\nDenise: Yes! 
It's cool\r\nDenise: I was there a few times already\r\nDenise: I think it might be my new favourite club in town\r\nDenise: The DJ is awesome\r\nJudy: My friends were also praising the music\r\nNatalie: That sounds great.\r\nNatalie: I want to go.\r\nNatalie: Can I go with you Judy?\r\nNatalie: Are you going on Friday?\r\nJudy: Sure.\r\nJudy: I'm going on Saturday\r\nJudy: With Miranda and Helen.\r\nNatalie: Cool" + }, + { + "id": "13715786", + "summary": "Jamie, Marlo, Jimmy and Alex's teacher requires their class to divide into 2 groups, each making a presentation. The teacher sent them the presentation subjects via e-mail. Jamie, Marlo, Jimmy and Alex consider dividing the class into groups by gender. ", + "dialogue": "Jamie: What do you think about doing those presentations in groups?\r\nMarlo: I’m so down man, I don’t wanna do it alone, it’s a lot of work\r\nJamie: I know, interviews, then transcriptions, then compiling material, then writing\r\nJimmy: Geeeeez, you guys are so right\r\nJamie: I think we should talk to him to make like 3 presentations, so that gives us 3-4 people per team\r\nAlex: You guys are so not up to date:D\r\nJamie: What you talking about?\r\nAlex: It’s already done, we have 2 groups, he send us subjects in email\r\nJamie: LOL I was so convinced we gotta think of everything ourselves\r\nAlex: hahah nah \r\nJimmy: Did he divide us too?\r\nAlex: No we gotta do it ourselves\r\nJimmy: Girls v Guys? \r\nJamie: hahahaha that’s gonna be fun, we kinda have parity right?\r\nAlex: Yeah I think so, I’m ok with that\r\nMarlo: But dudes, girls are so much better in this subject\r\nAlex: Thanks:D but I think you’ll manage\r\nJamie: Ladies first, I give you right to choose first\r\nAlex: You give me rights:D that’s freakin new! THANK YOU\r\nJamie: Just tryin to be a gentleman;)\r\nJimmy: Yeah, you wish, you sexist pig:d\r\nJamie: Hey, hey no name calling here! 
Ladies read it\r\nJimmy: Oh you just diggin your own grave:D lol" + }, + { + "id": "13864952", + "summary": "John is in the park. He is leaving now.", + "dialogue": "Nora: John, are you at school?\nJohn: yes, maths now\nNora: I've just talked on the phone to your math teacher\nJohn: oh\nTheresa: yes, where are you?\nJohn: in the park\nTheresa: Wait there, we have to talk\nJohn: no, I'm leaving\nTheresa: wait for me! \nTheresa: We're your mothers!" + }, + { + "id": "13730578", + "summary": "There was a crowd outside the bookshop today. Cole Grant, who writes about vampires, was allegedly in the bookshop signing his books.", + "dialogue": "Louis: did you see all the people outside the book shop today?\r\nLouis: it was insane!!!\r\nSara: YES!!!\r\nSara: i saw a hugeeeeeeee crowd\r\nSara: do you know what was going on?\r\nLouis: my friend told me this writer, this new writer...\r\nLouis: i can't remember his name...\r\nLouis: the one that writes about vampires\r\nSara: dante kyle?\r\nLouis: no, the other one\r\nSara: cole grant?\r\nLouis: YES!! my friends told me he was there signing copies of his books\r\nSara: no big loss then\r\nSara: i'm not a fan of his" + }, + { + "id": "13716485", + "summary": "Debbie can't decide between buying a red dress and a green one. On Kelly and Denise's advice she will buy the green one. Kelly is considering buying the red one for herself.", + "dialogue": "Debbie: Help, I don't know which dress to buy! or ?\r\nKelly: The red one! It's beautiful.\r\nDenise: It is, but the green one will suit you better.\r\nKelly: Why? Debbie looks good in red.\r\nDenise: She does, but in my opinion that dress would look better on someone taller. Deb needs a shorter one.\r\nKelly: Right, I haven't thought about it.\r\nDebbie: So the green one?\r\nDenise: Definitely!\r\nKelly: Yeah. But can you send me the link to the store? I'm considering buying the red one for myself :D\r\nDebbie: LOL, okay. 
Here's the link: " + }, + { + "id": "13680358", + "summary": "Emilia is still angry.", + "dialogue": "William: are you still angry?\r\nEmilia: YES \r\nWilliam: :(" + }, + { + "id": "13819017", + "summary": "Tom will help Mia buy a flight ticket as she doesn't have a credit card and doesn’t want to use Peter's now. Tom needs the flight, company and your personal data.", + "dialogue": "Mia: Could anybody help me to buy a flight ticket?\r\nRebecca: Sure, but what's the problem?\r\nMia: I don't have a credit card at the moment \r\nMia: I've always used Peter's card, but now you know... I'd prefer not to\r\nTom: you can use mine!\r\nMia: Should I send you the link?\r\nTom: Just send me the flight, company and your personal data that I may need\r\nMia: great, so nice of you, thanks!" + }, + { + "id": "13819627", + "summary": "Hugh has a toothache and needs to go to the dentist. Andy and Wade recommend him dentists at ProDent. Hugh will call ProDent today.", + "dialogue": "Hugh: can you recommend a good dentist?\r\nHugh: I have a toothache and need a dentist urgently\r\nAndy: Im sorry mate\r\nAndy: try ProDent in the centre\r\nAndy: Dr Smith, Ive been a few times\r\nHugh: thanks, mate!\r\nWade: Ive heard all dentist in that clinic are good\r\nWade: I need to go for a checkup too, havent been to a dentist for ages\r\nHugh: You'd better go soon!\r\nHugh: I also havent been for ages and now it hurts horribly\r\nAndy: You should go today\r\nHugh: Im working late today\r\nHugh: I'll call them straight away and arrange sth for tomorrow maybe\r\nAndy: good luck! Take care!\r\nHugh: thanks, man" + }, + { + "id": "13731107", + "summary": "Before Christmas, Ella's mom won a hundred thousand in a lottery. Both Ella and Noah are excited.", + "dialogue": "Ella: OMG!\r\nNoah: ???\r\nElla: Just got a text from my mom!\r\nNoah: ???\r\nElla: She won a hundred thou in the lottery!\r\nNoah: Get. Out. !!!!!!!!!!!!!!!!!\r\nElla: Yes!!! Christmas is gonna be good! 
LOL!\r\nNoah: Don't forget the little people!\r\nElla: LOL!" + }, + { + "id": "13819533", + "summary": "There has been an accident on Circle Drive, neat Circle Mall. There are no fatalities.", + "dialogue": "Oli: Theres a car accident \r\nKatie: Where? \r\nOli: On circle drive\r\nOli: I tried to get to the Circle Mall \r\nKatie: Oh no \r\nPavel: Its on the news now \r\nPavel: Theres no deaths \r\nKatie: Thank god 👼 " + }, + { + "id": "13829543", + "summary": "Rob and Bob are watching the game. Bob will run some errands on the weekend. Jim's birthday is next Wednesday. He might organize a meetup this weekend. Bob will see Rob on the weekend.", + "dialogue": "Rob: Hey there, what's up?\r\nBob: Not much, watching the game. You?\r\nRob: Same. Having a few people over.\r\nRob: But the game is boring as fuck lol. That's why I'm writing\r\nBob: Yeah, true that\r\nRob: Any plans for the weekend?\r\nBob: Most likely the usual - run some errands, cook some food, go out for a few beers. Nothing super interesting have appeared yet :)\r\nRob: I've heard that Jim is planning to celebrate his birthday\r\nBob: Oh right, his birthday is like next Wednesday?\r\nRob: Yeah, normally that would make the next weekend a good time but he is going for a skiing trip with his family\r\nRob: So he said that he might organize something this weekend\r\nRob: Nothing super fancy - most likely a meetup with a few friends at some bar\r\nRob: Would you like to come?\r\nBob: Sure, that would be nice\r\nBob: But he has not invited me, so I don't want to be rude\r\nRob: Most likely because it is not a real party. When I see him I'll let him know :)\r\nBob: That would be cool - I actually haven't seen him in person for a while now :)\r\nRob: Yeah, facebook does that to people :)\r\nBob: ok, take care and see you on weekend!\r\nRob: yeah, see you then!" 
+ }, + { + "id": "13830143", + "summary": "Julie and Debra are discussing the event, there will be about 20 people, mostly girls from the village 40+.", + "dialogue": "Julie: Most of the people will be girls from the village 40+\r\nDebra: i dont care, village girls are super cool\r\nJulie: it will be awesome to meet up again\r\nDebra: it is over the weekend and looking into my calendar it seems im available\r\nJulie: awesome! so youre on the list <3\r\nDebra: <3\r\nJulie: yeah, we'e gonna drink a little vodka :P\r\nDebra: :D\r\nJulie: go on debbie and share the event on your wall, we need some recognition haha\r\nDebra: brace yourself Jules :D how many people are coming?\r\nJulie: its 20" + }, + { + "id": "13812901", + "summary": "Mike's had an accident on his motorcycle and he's broken his leg.", + "dialogue": "Ian: Did you hear?\r\nKate: What happened?\r\nIan: Mike had an accident on his motorcycle.\r\nIan: He broke his leg" + }, + { + "id": "13815477", + "summary": "David was looking after Ethan's sister. Ethan is grateful. David won't do it again. ", + "dialogue": "Avery: You went to Ethan's house?\r\nDavid: yeah I had to babysit\r\nAvery: Aww, how do you babysit, just curious\r\nDavid: I had to go through a lot :/\r\nAvery: Was his sister naughty\r\nDavid: Tooo much\r\nAvery: Lol\r\nDavid: I will just refuse net time :/\r\nAvery: As you wish\r\nDavid: :/\r\nAvery: I just got his text \r\nDavid: What is he saying\r\nAvery: He is asking me to say thanks to you\r\nDavid: yeah whatever<3\r\nAvery: He was saying that your phone was switched off\r\nDavid: Yeah i have just turned it on\r\nAvery: I have told him about that\r\nDavid: k\r\nAvery: Gotta go now" + }, + { + "id": "13864977", + "summary": "Conrad can't enter the house because he forgot his keys. Since Rebecca and Tiffany are coming back late, he'll wait in the coffee shop. 
", + "dialogue": "Conrad: I'm outside the house\nConrad: I forgot my keys...\nRebecca: 💩\nTiffany: I'll be home at 10-11\nRebecca: I'm coming back even later\nConrad: Oh no...\nConrad: I'll wait in the coffee shop" + }, + { + "id": "13728129", + "summary": "Clara and Ron are wondering what that weird smell at Kasia's place last night was.", + "dialogue": "Clara: Did you notice that weird smell at Kasia's place last night?\r\nRon: YES!!!!!! I didn't want to say anything about it, though. I didn't want to be rude.\r\nClara: I think it was her 21 cats roaming around lol\r\nRon: lol don't say that, those cats were cute.\r\nClara: so what? they can still smell \r\nRon: i think it was her sleazy boyfriend\r\nClara: lol you're bad\r\nRon: jk\r\nClara: in all honesty I don't know what it was.\r\nRon: i guess we'll never know" + }, + { + "id": "13811007", + "summary": "Holly is not feeling very well, so she's not coming to Jake's tonight. ", + "dialogue": "Jake: Holly ru coming tonight?\r\nHolly: tbh I don't feel well\r\nHolly: I think I caught flu\r\nJake: oh no, I was so excited that everyone would be here\r\nHolly: I know, sorry\r\nHolly: but I srsly feel shitty.." + }, + { + "id": "13682536", + "summary": "Ludmila's favourite dinosaur when she was little was the Triceratops.", + "dialogue": "Jacopo: hey, did you have a favorite dinosaur growing up?\r\nLudmila: yes, triceratops. why?\r\nJacopo: i'll tell you why later." + }, + { + "id": "13820543", + "summary": "Josh, Sean and Logan are going to the pub tonight to pick up some girls. 
Logan doesn't want Sean to scare the girls away with his inappropriate comments.", + "dialogue": "Josh: Going to the pub tonight?\r\nSean: sure, pick up some chicks!\r\nLogan: Please, behave Sean, I actually would like to meet some girls \r\nSean: ??\r\nLogan: don't bullshit around with you sexist comments, it's counterproductive " + }, + { + "id": "13818311", + "summary": "Someone left a phone at Liam's place, but it wasn't Indiana.", + "dialogue": "Liam: Dude you left your phone at my place?\r\nIndiana: what? I didn't -- using it right now\r\nLiam: shit whose is it then? :D\r\nIndiana: dunno" + }, + { + "id": "13819098", + "summary": "Claire is ordering her wedding dress, adviced by Maria and Nicole.", + "dialogue": "Claire: Check this out :)))\r\nClaire: \r\nMaria: !!\r\nNicole: Absolutely perfect for you!!\r\nClaire: I guess so\r\nClaire: But it could be a bit darker, cause this color is not very vivid\r\nNicole: Noooooo, I think this color is perfectly good for the bride\r\nMaria: And it's no that expensive :)))\r\nClaire: Yes, the price appeals to me xD\r\nNicole: You should order that dress\r\nNicole: I think you won't regret\r\nMaria: Don't hesitate, it's incredibly beauty!!\r\nClaire: Ok, if you say so :))" + }, + { + "id": "13728763", + "summary": "Victor took over Chris's company, which was under a huge debt. He sold the office and did some changes but Chris still works there as Director. David's business goes very slow but he expects it to get better by the end of the year.", + "dialogue": "David: Hi victor how are you?\r\nVictor: i am fine thanks, what about you?\r\nDavid: very well thanks, i heard you have taken over Chris's company? is that true?\r\nVictor: Yes he was under huge debt, but he is still working as Director\r\nDavid: That is good, your running company at its old premises?\r\nVictor: No i sold off the office and accommodated them in my office, you know there is a lot of free space.\r\nDavid: yes i know. 
thats good to hear i was worried about Chris but really appreciate what you are doing.\r\nVictor: he was not willing to get help so i thought to do it this way.. and he can own it back anytime he wants\r\nDavid: God bless you\r\nVictor: Thanks, hows your business?\r\nDavid: its good but not too much work these days..\r\nVictor: yes market is very slow..\r\nDavid: yes expecting it to get better by the end of the year.\r\nVictor: really? we can hope for the best\r\nDavid: yes." + }, + { + "id": "13715895", + "summary": "Patricia is recommending a fair-trade brand to Elle and Florence.", + "dialogue": "Patricia: Hello, here's the fair-trade brand I've been talking about \r\nElle: Oh, thanks!\r\nFlorence: Looks great!\r\nPatricia: I'm glad, I hope you enjoy it. The quality's really great and knowing where it came from makes it easier to spend the extra dollar ;)\r\nElle: I'll look into it :)\r\nFlorence: Thx" + }, + { + "id": "13813827", + "summary": "Ethan didn't come to the party last night because he is in Los Angeles. Abigail didn't know about it. Ethan will be back in a couple of days, the he will reach out to Abigail.", + "dialogue": "Abigail: Why didn't you attend the part last night? :/\r\nEthan: I am currently in Los Angeles dat's why it was impossible for me\r\nAbigail: When did you leave for LA? :o You didnt even tell me\r\nEthan: Why did you wanted know about it ?\r\nAbigail: Well! you are my friend and I dont even know that you are outta town. Dont seems good\r\nEthan: Don't worry. I would be back after 3 or 2 days\r\nAbigail: Let me know when you reach back\r\nEthan: Sure. Will meet soon" + }, + { + "id": "13681721", + "summary": "Stan is meeting the girl of his dreams today in Pat&Gill's. Later he's going to tell Dave how his date went.", + "dialogue": "Stan: She replied :-)\r\nDave: She did?\r\nStan: \r\nDave: Lucky you!\r\nStan: I can't believe it! She's my dream come true!\r\nDave: Good luck today! 
Where are you going to take her?\r\nStan: Pat&Gill's\r\nDave: Good choice. Let me know how it was :-)\r\nStan: I will.\r\nDave: In minute detail :-)\r\nStan: Forget it!" + }, + { + "id": "13821284", + "summary": "Miro speaks Albanian with his parents. His family left Albania illegally in 1990s.", + "dialogue": "Abby: Have you talked to Miro?\r\nDylan: No, not really, I've never had an opportunity\r\nBrandon: me neither, but he seems a nice guy\r\nBrenda: you met him yesterday at the party?\r\nAbby: yes, he's so interesting\r\nAbby: told me the story of his father coming from Albania to the US in the early 1990s\r\nDylan: really, I had no idea he is Albanian\r\nAbby: he is, he speaks only Albanian with his parents\r\nDylan: fascinating, where does he come from in Albania?\r\nAbby: from the seacoast\r\nAbby: Duress I believe, he told me they are not from Tirana\r\nDylan: what else did he tell you?\r\nAbby: That they left kind of illegally\r\nAbby: it was a big mess and extreme poverty everywhere\r\nAbby: then suddenly the border was open and they just left \r\nAbby: people were boarding available ships, whatever, just to get out of there\r\nAbby: he showed me some pictures, like \r\nDylan: insane\r\nAbby: yes, and his father was among the people\r\nDylan: scary but interesting\r\nAbby: very!" + }, + { + "id": "13821073", + "summary": "Julie has just watched a Japanese horror. She's alone at home and really scared. Paula and Rose are going to come to her place for a spontaneous sleepover. They'll drink cocoa and watch \"When Harry met Sally.\" Rose will bring cookies.", + "dialogue": "Julie: hey guys... could you just talk to me for a bit? I just watched this Japanese horror movie and I'm home alone and a little uneasy (aka scared shitless)\r\nRose: Jesus, why on earth would you watch a Japanese horror, home alone at this hour?\r\nJulie: Cause I'm a fucking moron?\r\nRose: Cause you're a fuckin moron. \r\nPaula: 5 point for Gryffindor, my friend. 
But in your defense, I always thought you were quite unaffected by horror movies.\r\nJulie: That's what I thought too!\r\nPaula: So what kinda movie was it?\r\nJulie: \r\nRose: seems pretty generic\r\nJulie: It's scarier than it looks.\r\nRose: you know that japanese horror are the worst ones, right?\r\nJulie: I know that now.\r\nPaula: If... you want me to come over I can.\r\nJulie: omg really? <3\r\nPaula: Yeah, sure, it's like 20 minutes by bike to your house, we can drink cocoa and watch \"when harry met Sally\" until we fall asleep ;)\r\nJulie: Omg Thank you so much, I love you!\r\nRose: I wish I lived in your neighborhood :(\r\nPaula: We can chip in for an Uber for you :D\r\nJulie: Let's have a spontaneous sleepover :D\r\nRose: Oh, man it's late, but why the hell not.\r\nRose: I'll bring cookies for the cocoa :D\r\nJulie: yay :D" + }, + { + "id": "13863055", + "summary": "Lucy is panicking because her daughter is 15 now and she is not sure she is prepared as a mother.", + "dialogue": "Lucy: 15. My little girl is 15. What should I do now? Where is the rule book?\nPatricia: Why? It's like yesterday she was 5 years old...\nPatricia: You know, you have like 15 years of preparation and it shouldn't come as a surprise :) \nLucy: Pat, as a mother I'm not as good as I always wanted to be...\nPatricia: Don't exaggerate...\nLucy: Are you numb to all emotions? How did you feel when Patrick turned 15?\nPatricia: I just don't see things as you do :)\nLucy: :)" + }, + { + "id": "13716119", + "summary": "Dan,Tim, Chris and Martin will meet at 8. Dan and Martin will take it easy this time. Tom can't make it as he has a party at in-laws.", + "dialogue": "Dan: Guys, so are we going out on Sat?\r\nTim: I'm ready when you are\r\nChris: I'm OK too, in the centre right?\r\nTom: I'm out guys sorry. \r\nDan: come on Tom, what's wring this time\r\nTom: told ya before. party at in-laws\r\nChris: always the same story right\r\nMartin: right guys, so what time?\r\nTim: start @8? finish. 
8 if we're lucky\r\nDan: yeah, i need to be home at 2 and ready to drive on Sunday noon\r\nChris: so you takin it easy this time\r\nMartin: i guess i'd have to as well\r\nTim: guys you sound like 60 yos\r\nChris: Tim, are the last two standing or what?\r\nTim: i gather\r\nDan: you two will also grow up one day. " + }, + { + "id": "13716169", + "summary": "Casey got a new nail polish and did her nails herself. It took her nearly 4 hours, so she won't do her friends' nails, as it takes too long.", + "dialogue": "Casey: \r\nAmelia: these are so nice!!! did you do them yourself?\r\nKristen: wooow amazing\r\nAmelia: i want my nails done like that too!\r\nCasey: yeah i did it myself :D got a new nail polish but damn it took me nearly 4 hours lol\r\nAmelia: can you do it for us too?\r\nKristen: pretty please!\r\nCasey: sorry you guys... it was a nightmare :( seriously 4 hours for nails is too much" + }, + { + "id": "13612245", + "summary": "Sharol forgot about her sociology assignment. She needs to research feminist act, and it's due tomorrow. Kate has already finished it, and she rushes Sharol.", + "dialogue": "Kate: hey whats up\r\nSharol: hii nothing much man.. so bored\r\nKate: done with the assignment?\r\nSharol: which assignment?\r\nKate: that sociology one?\r\nSharol: which ? what? i am lost\r\nKate: omg dont tell me you forgot?\r\nSharol: seriously i dont know what your talking about?\r\nKate: haha! no wonder your bored girl \r\nSharol: now would you please tell me which assignment? \r\n Kate: yes ! we had to find answers of the questions about feminist act.? remember\r\nSharol: ohhh goodnesssss!!! i dont know how i forgot.......................................\r\nKate: now dont waste time and do it\r\nSharol: ya man. when is it due?\r\nKate: Tomorrow.. \r\nSharol: What! \r\nKate: yes now stop all those what and why and get to work....\r\nSharol: yes i am... are you done?\r\nKate: yes.... phewwwww\r\nSharol: :( talk to you later babes\r\nKate: haha.. 
i know i shouldnt be laughing but this is so funny...\r\nSharol: i hate you!\r\nKate: excuse me? i reminded you about assignment...\r\nSharol: oh yesssss... i love you now let me work. bye\r\nKate: lol ok byeee" + }, + { + "id": "13828780", + "summary": "Jack had to miss school because he is sick. Jamie was sick a lot last year but he got better thanks to doctor Tornez at City Medical Centre, next to the mall. Linda learns the school trip has been cancelled as many students are sick.", + "dialogue": "Linda: Hi Helen, was Jamie at school today?\r\nHelen: Yes, he was, but I've heard that Jack is sick again? Poor baby.\r\nLinda: I'm afraid so :( High fever and a terible cough\r\nHelen: Jamie had it all the time last year\r\nLinda: how did you get rid of it?\r\nHelen: I have this wonderful perdiatrician, doctor Tornez, he is great with kids and treats every case individually\r\nLinda: that's rare in these days - sometimes I think all doctors do is prescribe antibiotics :/\r\nHelen: I know! But doctor Torez i quite different. You can find him at the City Medical Centre\r\nLinda: The one next to the mall?\r\nHelen: exactly\r\nLinda: Thanks! anyway, I just wanted to ask if there was any importans news concerning the school trip\r\nHelen: You haven't heard? It's off!\r\nLinda: What, why?\r\nHelen: Half of the class is sick...\r\nLinda: Oh no :/" + }, + { + "id": "13682307", + "summary": "Tim does not need mum for anything important at the moment.", + "dialogue": "Tim: Seen mum 2day?\r\nLouise: Nope. she's with autn Grace\r\nTim: oh yeah i forgot\r\nLouise: why u need her\r\nTim: nothing important. ok for now\r\nLouise: ciao" + }, + { + "id": "13815437", + "summary": "Kate believes her boyfriend's mother dislikes her. 
He is a nerd who lives and has always lived with his mother and grandmother.", + "dialogue": "Caroline: I think his mother doesn't like me...;-( :-( \r\nKate: how come??\r\nCaroline: I just see it in her eyes...\r\nKate: any example?\r\nCaroline: I just feel it... \r\nCaroline: hard to give any example.\r\nCaroline: I'm his first gf and she's jelous...\r\nKate: what??? u r his first gf??\r\nKate: how old is he??? o_O ???\r\nCaroline: 26 but he's a nerd.\r\nCaroline: he used to spend all the time at home..\r\nKate: so he lives at home with mummy?? LOL\r\nCaroline: yeah.. mummy and gramma\r\nKate: fuck, really??\r\nKate: and u think he's normal?\r\nCaroline: he's an introvert...\r\nCaroline: it's a big house..\r\nKate: but has he ever lived somewhere else?\r\nKate: u know shared flat, erasmus?\r\nCaroline: nope...\r\nKate: he's really weird..\r\nCaroline: he's just a nerd, but v.intelligent!\r\nCaroline: w8 , he's writing sth, will text u l8er.\r\nKate: ok" + }, + { + "id": "13730881", + "summary": "Vincent's new lamp should be ready to be picked up on Tuesday.", + "dialogue": "Vincent: \r\nDamian: What happened to your lamp?\r\nVincent: I broke it xD \r\nVincent: With my bare hand\r\nDamian: You didn't do this just to show off did you?\r\nVincent: Hahaha. xD No.\r\nVincent: I was playing with my cat with a ribbon\r\nVincent: And while raising my hand I just hit the lamp and the glass cover broke\r\nDamian: Shit happens. You ordered new one yet?\r\nVincent: Yeah. Should be ready to pick up on Tuesday xD" + }, + { + "id": "13728369-1", + "summary": "Nestor wanted to buy a laptop on Black Friday sales, but Olaf advise against it, as the prices in reality are not reduced. 
Nestor will check it if Olaf helps him to get a good deal from a guy he knows.", + "dialogue": "Nestor: I'm thinking of buying a new laptop\r\nNestor: And it seems that now is a perfect time for it as Black Friday is coming\r\nOlaf: Hahaha\r\nNestor: I haven't said anything funny, what's wrong with you\r\nOlaf: Everything's fine with me, you're just being silly\r\nNestor: Why, cuz I want to save some money buying what I need when it's cheaper?\r\nNestor: Do you actually know what Blak Friday is about?\r\nOlaf: Of course I know\r\nOlaf: It's a cunningly contrived sales pitch\r\nOlaf: Prices are thought to be reduced but in fact they're the same or even higher...\r\nNestor: You're just repeating a stupid theory some people made up and try to convince others that it's based on facts\r\nOlaf: You're really naive...\r\nOlaf: \r\nOlaf: Click on this link and see for yourself\r\nOlaf: This is one of the sites where people upload photos of the same product and its price 2 weeks before Black Friday and during the promotion\r\nNestor: And how do I know whether it's a reliable source or not?\r\nOlaf: Carry out your own investigation\r\nOlaf: Go and check prices of the chosen products when the promotion is still on and check them again when the promotion is over\r\nOlaf: Simple\r\nNestor: What if the laptop I want has been really cheaper and I'll miss a perfect deal?\r\nOlaf: Then I'll help you\r\nOlaf: I know a dude who sells almost new computers at very reasonable prices\r\nNestor: So ask him to send the offer of what he has in stock to me first\r\nNestor: If he can offer me a good deal, I'm going to the mall and starting investigation tomorrow\r\nNestor: Deal?\r\nOlaf: Deal!\r\nOlaf: I'll call or text him and ask to contact you.\r\nOlaf: I'm sure he'll help you and you'll appreciate my advice :-)\r\nNestor: We'll see about that" + }, + { + "id": "13715824", + "summary": "Nancy asks Vic and Phil about various social media, which prompts them to discuss and compare the 
different platforms. Phil is not into Instagram but likes Twitter. Vic prefers Facebook over Twitter and likes Instagram. Phil and Vic both don't use Tumblr. ", + "dialogue": "Nancy: Yeah, but u can also read the news online ;)\r\nPhil: I know, but imagine - ur keen on technology and u get all the news in one place. Then u can choose what to read and what not.\r\nNancy: Sounds sensible. Does it have something 4 fashion?\r\nPhil: Probably so. Not sure, though.\r\nNancy: How about u, Vic?\r\nVic: I still prefer Facebook. Had Twitter once, but the interaction with others is nothing compared with Facebook.\r\nPhil: Well, it's intended in a completely different way. \r\nVic: What do u mean?\r\nPhil: IMO, Twitter is not for interacting like on Facebook, but for getting news fast and from reliable, more or less, sources. U can follow anyone and anyone can follow u.\r\nVic: Still, on Facebook I can share stuff with my friends, join groups and talk about things that interest me. And I don't have to limit myself to 140 characters.\r\nPhil: 280. Still, a downside. OTOH, if ppl were able to write as much as they wanted, Twitted would get as cluttered as Facebook.\r\nNancy: So no one uses 4 example Instagram?\r\nVic: I do. \r\nPhil: Not 4 me. I don't post that many pictures online. \r\nNancy: But u can follow ppl and see what they're doing or offering. There are also companies on Instagram.\r\nPhil: I know, but I'm more interested in news and gossip than seeing someone post a picture of their breakfast. \r\nVic: That's not all ppl post on Instagram!\r\nPhil: So, what else?\r\nVic: Depends on ur interest. I, for one, like to observe all the fitness accounts :)\r\nPhil: And does this motivate u to train?\r\nVic: No, but it gives me hints what to do and what not. \r\nNancy: What do u think about Tumblr?\r\nPhil: What?\r\nVic: Heard about it, but never had an account.\r\nNancy: It's a microblogging website. U can post blog entries, pictures and basically everything u want there. 
And ppl can observe u!\r\nPhil: Don't have that much time to write a blog.\r\nVic: Me neither. \r\nPhil: Nancy, y do u ask these questions? ;)\r\nNancy: I have my reasons ;)" + }, + { + "id": "13865176", + "summary": "Amanda and Peter don't like what the man in dreads looks like, but Dan does.", + "dialogue": "Amanda: have you seen the guy with dreads?\nPeter: yes, so awkward!\nDan: hahaha, very, but cool!\nAmanda: I'm not convinced" + }, + { + "id": "13682269", + "summary": "James will pick up the car after his work tomorrow. Sue already have sent him money. ", + "dialogue": "Sue: can you pick the car up after work tomorrow please\r\nJames: yes and pay?\r\nSue: yes I will transfer the money in \r\nJames: ok x" + }, + { + "id": "13728099", + "summary": "Jess is in a traffic jam in West Bronx.", + "dialogue": "Alice: Are you on the way\r\nJess: I'm in a traffic jam\r\nAlice: oh, no, where?\r\nJess: West Bronx\r\nAlice: :/" + }, + { + "id": "13862329", + "summary": "Dima's laptop is broken, as her cat spilled coffee on the laptop. Dima is worried, because she has to deliver a translation for Trados tomorrow. Dima will come to Nada in an hour to borrow Nada's laptop. ", + "dialogue": "Dima: hello! \nNada: hey girl, what's up?\nDima: I'm in a huge trouble, my laptop is broken and I have to deliver a translation tomorrow @9 😱😱\nNada: fuck what happened??\nDima: the stupid cat spilled coffee on it 😣😣 I'm freaking out!\nDima: you still have your old laptop? is it possible to lend it to me please?\nNada: no sorry, I've given it to my brother - but you're lucky! I've taken these two days off so you can take mine\nDima: ooh man! thank you sooo much!!! if it weren't for Trados, I wouldn't be panicking :( \nNada: no worries, it happened... but I always think about this... like man, we need some back up laptops!\nDima: I know! but I always change my mind and spend the money elsewhere lol\nNada: yeah, but it's like our only tool! 
so we need to invest in it\nDima: yup, true !\nDima: can I come in an hour to pick it up?\nNada: yes :) ttyl!" + }, + { + "id": "13829943", + "summary": "Afterwards, Kelly wandered around and met some people from school. She also met him. They almost didn't talk and he was with someone. ", + "dialogue": "Betty: so where did you go after?\r\nKelly: we wander around and met more people from school\r\nBetty: anyone i know\r\nKelly: well, yeah\r\nBetty: tell me!!!!!\r\nKelly: guess...\r\nBetty: oh c'mon!!\r\nKelly: guess who could pass THE pub at 2 am...?\r\nBetty: oh no!!!\r\nKelly: oh yes!!!\r\nBetty: damned bastard\r\nKelly: haha but we didnt talk almost, just hi-hi hows it goin\r\nBetty: tell me everything\r\nKelly: he looked nice, standard\r\nBetty: was he with someone?..\r\nKelly: yes....\r\nBetty: :/\r\nKelly: I'm sorry sweetie, i told him a few words, he heard from me a bit. Bastard\r\nBetty: thanks, babes, youre the best, youre my bestie\r\nKelly: not at all love, he is such an ass!" + }, + { + "id": "13862409", + "summary": "Anne is inviting Adele for Easter. Adele will bring some chocolate eggs.", + "dialogue": "Anne: Hi darling, do you went to come for Easter?\nAdele: love to, i'm off on friday\nAnne: it's could be nice, i'll invite Louise too\nAdele: great, i'll bring you eggs, chocolat ones of course!\nAnne: thanks darling." + }, + { + "id": "13681010", + "summary": "Melody's 5-year-old laptop is broken. Tomorrow she'll know what's wrong. She won't be repairing it, because her laptop is too old. Instead, she'll buy a new one.", + "dialogue": "Melody: did you get your computer fixed yet?\r\nPeggy: no, im spending a lot of time using the library computers.\r\nMelody: do they know whats wrong with it? \r\nPeggy: might be something with the circuit board. they hope to have an answer tomorrow\r\nMelody: thats pretty serious. might be cheaper just to buy a new one\r\nPeggy: thats true. 
well see.\r\nMelody: if you need to get a new one, i highly recommend the mac model that i have\r\nPeggy: ok, good to know. i'll write if i have any questions\r\nMelody: youre probably due for a new one anyway, no?\r\nPeggy: you're right. 5 years is a long time to own one.\r\nMelody: yes, thats ancient by laptop standards\r\nPeggy: ok. i might just not bother getting it repaired after all.\r\nMelody: sounds like a good idea" + }, + { + "id": "13828943", + "summary": "Jerry will be home in 40 minutes. ", + "dialogue": "Jerry: Hi sweetie :)\r\nJanet: Hi sugar ;)\r\nJerry: I'm coming home\r\nJanet: Can't wait ;)\r\nJerry: I should be there in 40 minutes\r\nJanet: Ok, I'm waiting for you :)\r\nJerry: How was your day?\r\nJanet: Oh, it was ok but my boss is a pain in the ass sometimes\r\nJerry: I know, she can be a bitch :P\r\nJanet: Yes she can! ;)\r\nJerry: See you later darling\r\nJanet: <3" + }, + { + "id": "13717052", + "summary": "They are going to do some research on holiday options and discuss them later. They will most likely choose a cheap offer from a tour operator.", + "dialogue": "Mark: So, we've got our where and when. Package tour or self-organised?\r\nAnna: Package. More convenient.\r\nGeorge: Self-organised. Cheaper.\r\nJulia: Do we need a 5* hotel? MAybe let's choose one of the cheaper options from a tour operator?\r\nMark: Actually, not a bad idea. That'll be both cheap and convenient.\r\nAnna: I'm in!\r\nGeorge: So, let's start digging and we'll talk about it l8r?\r\nMark: SLAP\r\nJulia: Ok. But maybe let's divide ourselves so that we don't check the same websites?\r\nGeorge: Ur right!\r\nAnna: Sure. XOXOXOX\r\nMark: Let's do this asap!" + }, + { + "id": "13829210", + "summary": "Del accused Stanley of having an affair, because he couldn't go with her this weekend due to his work. They've only been together for 4 months, so it's not a good sign. Now Bill and Stanley need to take care of the Lidem project. 
Division of tasks is on Stanley, because Alison is unreachable.", + "dialogue": "Stanley: I can’t believe in her…\r\nBill: What is it? And who?\r\nStanley: Del, she’s behaving kind of… stupid\r\nBill: Meaning?\r\nStanley: I told her I can’t go for this weekend with her because of work\r\nBill: And? How did she react?\r\nStanley: She was angry and wouldn’t listen to me. She accused me of having an affair o.O\r\nBill: You’ve been together for like 4 moths and she’s already doing sth like that???\r\nStanley: I know, it’s crazy. I starting to give up.\r\nBill: So what, that’s it?\r\nStanley: No, I’ll talk to her when she’s back, I’ll tell her it can’t be like this, see what will she say\r\nBill: Sorry to say that, I don’t think you’ll bring any good news xD\r\nStanley: Actually, me too. But at least I’ll try\r\nBill: Well, I’m sorry. We need to take care of the Lidem project right now.\r\nStanley: Yes I’m finishing the analysis for the pervious one, but it should be done tonight, so I can start working on it today\r\nBill: The problem is I don’t what you should do ;p\r\nStanley: whaaat\r\nBill: The analysis of tasks is not done yet, Alison disappeared from all the media and won’t answer her phone\r\nStanley: Great. So division of the tasks is on me?\r\nBill: Yes because I’m already researching when it comes to the funding.\r\nStanley: We’ll do, it’ll be ready tomorrow, not sure what time.\r\nBill: OK, keep me posted.\r\nStanley: Btw, it’s weird that Alison is out of touch\r\nBill: No at all, it’s not the first time\r\nStanley: Women…" + }, + { + "id": "13828473", + "summary": "Kimberly might have left her umbrella she got from her mother at the cafe yesterday. Laura gives her the cafe's phone number to check with the staff.", + "dialogue": "Kimberly: Hi Laura :)\r\nLaura: Hey Kim :)\r\nKimberly: Do u remember if I left the cafe yesterday with my umbrella?\r\nLaura: I'm not sure...\r\nLaura: I remember you arriving with it. 
It was a pinkish-blue one, right?\r\nKimberly: Yes, with a flower motive.\r\nKimberly: I got it as a present from my Mom a few years back.\r\nKimberly: It's not worth anything, but it's about the sentimental value.\r\nLaura: I understand, maybe try calling the cafe? \r\nLaura: Maybe a customer found it or a waiter?\r\nLaura: Here's the number: 613-785-4329.\r\nKimberly: Thanks! It's worth a try :) " + }, + { + "id": "13828949", + "summary": "Catherine applied for an accounting position at Pandora. Jake has been working there for 5 years. This job offers a clear career path and benefits. Jake got promoted twice with salary increase. Catherine will have an interview on Monday.", + "dialogue": "Catherine: Hi Jake, last week I applied for an accounting position @ Pandora.\r\nJake: Hey Cathy! Really? That's great!\r\nCatherine: R u still working there?\r\nJake: I sure am. It's already been 5 yrs.\r\nCatherine: Time flies. So I take it ur satisfied?\r\nJake: I am, I rarely think about changing companies. \r\nCatherine: Does the company offer possibilities of personal development & promotion?\r\nCatherine: I mean is there a clear career path?\r\nJake: Yup, that's the main pro of this place & the benefits.\r\nJake: I was promoted twice during these 5 yrs & the salary increase with each promotion is quite significant.\r\nCatherine: That's good to hear. Most companies 2day are implementing cost saving policies & the employees are the ones who suffer.\r\nCatherine: I mean at my current workplace u can only count on inflation salary increases. So u barely see the difference on ur paycheck.\r\nJake: That's what I hear from most of my friends. It's a tough time 4 employees on the market nowadays.\r\nJake: 4 now our company is still dynamically expanding. 
\r\nCatherine: And how about the benefits u mentioned?\r\nJake: The company partially finances language & accounting courses.\r\nJake: There's also a yearly bonus that depends on your achievements and private healthcare.\r\nCatherine: That sounds like a dream :D Your HR dept. gave me a call yesterday and invited me to an interview on Monday.\r\nJake: That's great! I'll keep my fingers crossed :)\r\nCatherine: Thanks Jake :) Is there anything I should prepare/review before the interview?\r\nJake: Well I think it would be beneficial to know some facts about the company.\r\nJake: You can find them on the official website, in the \"about us\" section: \r\nCatherine: OK, I'll definitely take a look.\r\nJake: And they may ask u some Qs about accounting principles, the basics. But I mean that won't be a problem 4 u.\r\nJake: I remember u graduated with honours from uni.\r\nCatherine: That's true. :) I'm a bit nervous though.\r\nJake: There's no need to be. Give me a call after the interview. Good luck!\r\nCatherine: Thanks. TTYS!" + }, + { + "id": "13681987", + "summary": "Anna will go with Fiona to a doctor tomorrow at 8 a.m.", + "dialogue": "Fiona: hey\r\nAnna: hello \r\nFiona: can you go with me to a doc? \r\nFiona: I need support \r\nAnna: yeah sure \r\nFiona: thank you so much \r\nFiona: it's tomorrow at 8 a.m." + }, + { + "id": "13716619", + "summary": "Pipe under the wash basin exploded when Dan wanted to wash his hands. Andrea is going to write to the owner about it.", + "dialogue": "Dan: Fuck me guys, I wanted to wash my hands and the pipe under the wash basin literally exploded! 😱/😂\r\nSteve: lol sounds like fun. What shall we do?\r\nAndrea: That's so ridiculous. I'll write to the owner now\r\nDan: Thanks Andrea. Or you can give me her address and I'll do it myself\r\nAndrea: No, no worries, it's no big deal\r\nDan: Thanks! 💓" + }, + { + "id": "13716075", + "summary": "Both Claire and Linda are making curry for dinner. 
", + "dialogue": "Claire: \r\nKim: Looks delicious...\r\nLinda: No way... Look what I'm cooking right now:\r\nLinda: \r\nClaire: hahahaha \r\nKim: Curry dream team\r\nClaire: Enjoy your dinner :*" + }, + { + "id": "13731085", + "summary": "Matt will be staying with homestay parents for one more month. They seem to talk more to Carlos. They have higher electricity bills because Matt spends a lot of time at home.", + "dialogue": "Matt: I feel like my homestay parents don't treat me that well\r\nJorge: Why u think so?\r\nMatt: They take Carlos for subway\r\nMatt: And I feel like they prefer talking to Carlos\r\nJorge: Do they feed you well tho \r\nMatt: Yea I guess\r\nJorge: Maybe you're just overthinking\r\nMatt: I don't know\r\nMatt: It feels like a negative vibe\r\nMatt: is it because I stay at home too much \r\nMatt: And so They have to pay more for electricity \r\nJorge: I doubt it but hmm \r\nJorge: I would give a fuck\r\nJorge: You're there one more month and go \r\nMatt: True" + }, + { + "id": "13680843", + "summary": "Ann thanks Katie for hosting her son Tim on Monday evening and driving him to the railway station next morning. Kate will send a message to Tim tomorrow, to ask about the place where she should be waiting for him.", + "dialogue": "Ann: hello Katie, thank you so much for hosting Tim monday evening and for dropping him at the railway station next morning. Here is mobile number... .I 'll confirm you , but again and in advance: thanks a lot\r\nKatie: Hello Ann, i didn't realise but we spend a week end together 20 years ago in Saint Fargeau!!!\r\nAnn: it goes back so long... but it's possible\r\nAnn: It's true that i know Ben since school.\r\nKatie: as far as i'm concern i remember very well this week end as i had a very good friends call Ann Cairns ( like you)😜, Erik's wife\r\nAnn: Erik's wife? of course he's my cousin, and Ann his wife is the cousin of one of my best friend... 
Small word\r\nKatie: indeed\r\nAnn: Hi Katie, is it still ok for hosting my son tomorrow? You may send him a text to tell him where you want to pick him up. Thanks so much\r\nKatie: Yes Ann, i'll send a text to your son for tomorrow.\r\nAnn: Thanks. Funny you also know Stef and Leo, friends of us in Berlin. But it's true they lived in Reims before\r\nKatie: your son is really nice. don't hesitate to contact me again if needed." + }, + { + "id": "13729583", + "summary": "Ahmed wants Sharon to move in with him but she's afraid of her parents' reaction. Ahmed is angry.", + "dialogue": "Sharon: My mum knows\r\nAhmed: Aaaand?\r\nSharon: She’s angry, I’m grounded\r\nAhmed: For fuck’s sake you’re 19\r\nSharon: But I still live with my parents, let me remind you -_-\r\nAhmed: So move out\r\nSharon: It’s not that easy, hello\r\nAhmed: Move in with me, we would be together all the time :*\r\nSharon: I’d have to go to work, I’m still a student\r\nAhmed: Don’t worry, I’ll take care of you\r\nSharon: My parents will stop talking to me at all\r\nAhmed: So what\r\nSharon: I care about my parents, cmon!\r\nAhmed: But they’re stupid\r\nSharon: Don’t talk about them like that!! They’re just… old and manipulated\r\nAhmed: How can you manipulated into being an asshole -_-\r\nSharon: Ok, I know you fell hurt by their behavior but stop it\r\nAhmed: You don’t care about me\r\nSharon: I do! But my family is important to me, you should understand that\r\nAhmed: OK, whatever" + }, + { + "id": "13731321", + "summary": "Rob is disappointed with memes he watches. Tom suggests he should get a girlfriend instead of complaining about the memes.", + "dialogue": "Rob: \r\nRob: Not sure if I'm getting dumber, or this is how it feels like to get older\r\nTom: What?\r\nRob: I'm looking at today's memes and they mostly refer to things that are either completely stupid, or have no humour value.\r\nTom: Rob, get yourself a girlfriend please. You're talking bullshit :D\r\nRob: Ehh. Fuck you." 
+ }, + { + "id": "13728067", + "summary": "Mike will refill the hand sanitizer on Grace's request.", + "dialogue": "Grace: The hand sanitizer by the restrooms is empty. Do we have a refill/\r\nMike: Yes, no problem. I'll get to it right away.\r\nGrace: Thx." + }, + { + "id": "13611929", + "summary": "Cheryl had an argument with her mom. She forgot to close the window, got angry and started a fight. Her mom gave her time till the end of the year to move out.", + "dialogue": "Louis: Hey, hows your day? :D\r\nCheryl: Okaaay… I guess\r\nLouis: Aha, someone’s feeling a bit down, am I right?\r\nCheryl: yea, sort of…\r\nLouis: Go on, tell me what happened\r\nCheryl: I…just had an argument with my mom\r\nLouis: Jesus what again\r\nCheryl: I forgot to close the window when I was leaving home!\r\nLouis: And that’s it?\r\nCheryl: No, not only… Ya know, wouldn’t be that bad, but I got angry, started screaming and everything ;/\r\nLouis: not a good idea, babe\r\nCheryl: I knoooow \r\nLouis: Was it really bad? \r\nCheryl: I suppose yea, she kicked me out xd\r\nLouis: WHAT\r\nCheryl: I mean I don’t have to move right now, but she gave me time till the end of the year\r\nLouis: I’m sorry…\r\nCheryl: Naah, don’t be, I believe it’s for good. I couldn’t stand her anyway xD" + }, + { + "id": "13731391", + "summary": "David lands at 17:30 at Sevilla airport and Victor will pick him up.", + "dialogue": "Victor: do you need me to pick you up from the airport\r\nDavid: that would be neat\r\nVictor: what time are you landing\r\nDavid: 17:30 at Sevilla\r\nVictor: SEVILLA???\r\nDavid: yeah sorry\r\nDavid: Jerez was too expensive\r\nDavid: you don't have to pick me up if you can't\r\nVictor: no its ok" + }, + { + "id": "13612194", + "summary": "Ann wants to buy Josh's laptop for $200. Josh doesn't want to negotiate the price. Ann will take it for $250 with accessories. 
", + "dialogue": "Ann: Hi, is the laptop still available?\r\nJosh: Yes it is\r\nAnn: I can pay 200 dollars\r\nJosh: The price is 250 and it's non-negotiable\r\nAnn: Do you have a bag for it? Some other accessories?\r\nJosh: I have a bag and a small usb mouse\r\nAnn: Sounds good, I'll take it, where can I pick it up?" + }, + { + "id": "13716087", + "summary": "Mike wants someone else to do the washing up this time. Sara agrees, but when she returns from the cinema she is at with Jack.", + "dialogue": "Mike: Can anybody do the washing up?\r\nMike: I did it last time\r\nMike: So now its your turn\r\nSara: I'll do it\r\nMike: Ok\r\nSara: But when I get back\r\nSara: From the cinema\r\nSam: You're going to the cinema?? With whom??\r\nSara: With Jack\r\nSam: Ahhhh\r\nSam: I thought that maybe I'll join you\r\nSara: Not this time ;-)))\r\nSam: \r\nSara: xDD" + }, + { + "id": "13828227", + "summary": "Gary is a driver for Uber and he really enjoys it.", + "dialogue": "Gary: remember i told you i wanted to drive for uber?\r\nEllie: yes.... :-D\r\nGary: and how you told me it was a terrible idea\r\nEllie: yes, I do remember lol\r\nGary: well, i'm driving for uber :-D\r\nEllie: really??? hahahha i'm sure i was right\r\nEllie: i'm sure you hate it\r\nGary: no! i love it <3\r\nGary: i'm actually parked waiting for a ping for my next ride\r\nGary: it's so much fun!!!\r\nGary: i've met loads of cool people! :-D\r\nEllie: really? \r\nEllie: i would have NEVER expected YOU would enjoy it\r\nGary: why is that?\r\nEllie: you're... \"peculiar\" when you meet new people, lol\r\nGary: what do you mean???\r\nEllie: don't take this the wrong way...\r\nEllie: but you're not good at being around new people lol\r\nEllie: you're always awkward and uncomfortable :-/\r\nGary: lol i didn't know i was perceived that way\r\nEllie: yeah, so when i hear you're having fun meeting all thiese strangers i'm really surprised lol\r\nEllie: maybe your social skills are improving!! 
hahaha\r\nGary: whatever, I'm having so much fun\r\nEllie: i'm glad you are!! :-)" + }, + { + "id": "13863223", + "summary": "Kristi needs new trainers. Leah has a link for a discount coupon at an online store.", + "dialogue": "Leah: \nKristi: What is this?(?_?)\nLeah: If you go to this link, you can get 20% discount coupon <3<3<3\nKristi: What coupon is this? (@_@;)\nLeah: All the items on this online shop. \nLeah: Havent you talked to me that you needed new trainers?\nKristi: You remembered! Yes I did! (*^0^*)\nLeah: I have a favor as well_(._.)_\nLeah: Put my ID onto the reference code. “Direndia45”∩(·ω·)∩\nKristi: What can you get if I do that?\nLeah: I will get some accumulated money for the shopping next time.(*_*)(*_*)(*_*)\nKristi: Alright. (^.^)\nKristi: You smart consumer! (^_-)-☆(^_-)-☆" + }, + { + "id": "13829640", + "summary": "Steve is calling Sue at her request.", + "dialogue": "Sue: Call me when you get this.\r\nSue: it's important\r\nSteve: OK, I'm calling" + }, + { + "id": "13828336", + "summary": "Andre is shocked after reading the news about a bear attack at the zoo. Megan is not surprised that an animal kept in a cage reacted that way.", + "dialogue": "Andre: i just read the news about the bear attack on the zoo :-(\r\nAndre: yikes - i would've never had imagine something like that could happen\r\nMegan: that's why I always say you can't keep animals y cages!!!\r\nMegan: it's cruel and it's wrong!!!\r\nMegan: i'm not surprised the bear reacted that way" + }, + { + "id": "13862341", + "summary": "Barbara got the confirmation email from AES. Mick did not get the email and will call them.", + "dialogue": "Mick: I didn't get the confirmation emai from AES yet\nBarbara: I did\nMick: You did? \nMick: I gotta call them\nBarbara: Yes" + }, + { + "id": "13612024", + "summary": "Allison send Alan budget estimation for this year. 
The extra other expenses come from the boss's trip back from Japan for a convention.", "dialogue": "Alan: Can you send me the file with budget estimation\r\nAllison: Done\r\nAlan: Sorry, not this one, estimation for current year, not for next year\r\nAllison: Sorry for the mistake, sent it to you\r\nAlan: Thanks. Tell me, do you know, why we doubled the costs in the line \"other expenses\" compared to last year\r\nAllison: It was when our boss had to come back from Japan for the convention\r\nAlan: I see, I see..." + }, + { + "id": "13731505", + "summary": "Liam and Nate will meet spontaneously in 15 minutes.", "dialogue": "Liam: Yo. You free now?\r\nNate: Yup. What's up.\r\nLiam: I feel like a stroll around. You comin?\r\nNate: Sure. I'll be ready in 10.\r\nLiam: Be there in 15 :D\r\nNate: See ya." + }, + { + "id": "13727992-1", + "summary": "Carl will compete in the championship this year. Duncan can't miss this event. Carl will register his family members as VIP.", "dialogue": "Duncan: btw bro, all the best in this year's championship\r\nCarl: thanks bro, hope my evo 10 wont let me down this year.\r\nDuncan: relax, last year you were just unfortunate with the gearbox.\r\nCarl: yeah, but this year im using a 6 speed hydrolic shift gearbox\r\nDuncan: that will really service you, i know that\r\nCarl: will you guys attend?\r\nDuncan: we cant miss watching our youngest cousin sweep away the title\r\nCarl: haha, stop exaggerating\r\nDuncan: haha, i mean it, we cant miss it for the world!\r\nCarl: thanks, ill have you reserved in the VIp\r\nDuncan: cool\r\nCarl: thanks\r\nDuncan: go get them!\r\nCarl: i will" + }, + { + "id": "13681079", + "summary": "Dinny's afraid of Terry's dog so Terry should keep it away.", "dialogue": "Dinny: can you take your dog away before i come?\r\nTerry: are you afraid?\r\nDinny: a little\r\nTerry: ok than" + }, + { + "id": "13814769", + "summary": "Anastasia sent her new school photos to Darrell.", "dialogue": "Anastasia: Our new school 
photos\r\nAnastasia: \r\nAnastasia: Look how happy I am\r\nDarrell: You don't look unhappy to me but it's like you're, uh\r\nDarrell: What was the word\r\nDarrell: Sceptical of something\r\nDarrell: \"what am I doing here\"\r\nAnastasia: Ahahaha\r\nAnastasia: That's my mood everywhere I step in\r\nDarrell: Hahaha\r\nAnastasia: Well\r\nAnastasia: They took the photo in less than a minute actually\r\nDarrell: Oh wow... well, I guess there were a lot of people?\r\nAnastasia: Yeah\r\nAnastasia: School photos always suck\r\nAnastasia: They take them so fast and carelessly\r\nDarrell: They would only really take group photos of us in middle and high school\r\nDarrell: If someone wanted a portrait photo, I guess it was possible\r\nDarrell: But not obligatory\r\nAnastasia: Well, I needed a new one for my school ID\r\nAnastasia: So I had no choice here\r\nDarrell: Luckily no one really has to look at your school ID most of the time, haha\r\nDarrell: Don't worry about it\r\nAnastasia: Ah no, I'm not worried, I actually find it kind of funny, it's fine" + }, + { + "id": "13864517", + "summary": "Kim is going with Jane to Seoul in April. Jane will be their tour guide.", + "dialogue": "Kim: I'm going to Seoul!\nAgatha: wow finally!\nMark: When? Bring us some kimchi!\nKim: People on a plane will kill me if I do :D\nKim: I'm going in April <3\nAgatha: Are you going with Jane?\nKim: yes, of course - she'll be our tour guide\nMark: so jealous... I'd love to come as well\nKim: Really? I asked you like a hundred times" + }, + { + "id": "13809884", + "summary": "Dad of Aubrianna's friend died of malaria in Kongo.", + "dialogue": "Aubrianna: My friend's dad died because of malaria in kongo\r\nDarien: It's dangerous for all people not just whites\r\nAubrianna: So read about this" + }, + { + "id": "13680352", + "summary": "Josh is upset, because he lost his new sneakers on Tuesday. Josh has already called the gym, but they didn't find anything. 
Josh will check the swimming pool and give Mark a call.", + "dialogue": "Josh: Man, I’m really pissed off.\r\nMark: What’s up, bro?\r\nJosh: I lost my brand new sneakers somewhere.\r\nMark: Oh, man, that sucks, when?\r\nJosh: I don’t know exactly, last Tuesday I think.\r\nMark: Where’ve you been to?\r\nJosh: Many places, swimming pool and fitness, but I thought I left them in the gym.\r\nMark: You should call them or go there. \r\nJosh: Yeah, I did call them, but they said nothing was found when they’re closing.\r\nMark: So what are the options now?\r\nJosh: Dunno, will try to check the swimming pool.\r\n Mark: Ok, good luck. Gimme a call.\r\nJosh: I will. Take care, bro.\r\nMark: You too, man. " + }, + { + "id": "13830106", + "summary": "It's Valentine's day. Bella plans to order some pizza home. Aria will come to Warsaw as soon as she quits. This year Bella will probably go to Korea to get regular checkup. ", + "dialogue": "Bella: It's valentine's day!😁😁😁\r\nAria: For somebody without bf today is kinda miserable day.....😢😢\r\nBella: There are a lot of vendors selling roses on the street here.\r\nBella: \r\nAria: \r\nBella: hahahahahahaha!! That looks SO SAD! :'‑(:'‑(:'‑(\r\nAria: I feel like the weather is colder than it really is. How's the weather there?\r\nBella: Here? it's 3 degree today. Wow! Even coffee shops are all decorated with all the heart-shaped balloons.\r\nBella: \r\nAria: Here everywhere just chocolate. :‑/ :‑/Of course it isn't related to me at all. :‑/:‑/\r\nBella: Only shops seems to get money.\r\nAria: What's your plan?\r\nBella: Maybe I will order some pizza home. (Sounds not that fun. right?) When will you come to Warsaw?\r\nAria: As soon as I quit, I will fly to you.😑😑 For now what I only hope is to leave work at 6. :‑|:‑|:‑|:‑|\r\nBella: This year I will probably go to Korea to get regular checkup. \r\nAria: Good. Take care there.\r\nBella: Make bf asap and visit us together.😝😝😝 You too!\r\nAria: I gotta go. 
Ttyl" + }, + { + "id": "13716129", + "summary": "No one wants to play ball with Sawyer tonight.", + "dialogue": "Sawyer: ball game 2nite?\r\nSutter: injured\r\nWheeler: what time?\r\nSawyer: 5.30? 6?\r\nWheeler: anytime l8r?\r\nSawyer: pitch booked at 8 so we need to start earlier\r\nWheeler: so im out sry\r\nYardley: i guess next time be better" + }, + { + "id": "13681757", + "summary": "Daina needs about an hour more to get ready.", + "dialogue": "Sarah: how much longer?\r\nDaina: I need to put my make up\r\nSarah: OMG\r\nSarah: casual or party type?\r\nDaina: casual\r\nSarah: so it's about an hour?\r\nDaina: u know me, sorry :)" + }, + { + "id": "13820984", + "summary": "Viola is having her wedding soon and still has some things to organize. Carmen comes on Friday and is willing to help Viola.", + "dialogue": "Carmen: how are you feeling, Viola? it is so so close...\r\nAlfred: My dearest Viola <3\r\nViola: I think as one's feeling before the wedding - a little bit light in the stomach! ive got some things to organize still!\r\nCarmen: i will be on friday night, i could give you a helping hand :))\r\nViola: Thanks darling, i will let you know x\r\nCarmen: (Y) my number just in case +00123456789\r\nViola: (Y) <3" + }, + { + "id": "13681755", + "summary": "Madeline is in conflict with Martin and Jada. Alex and Madeline will go for a beer tomorrow. Madeline will explain her issues with Martin and Jada to Alex.", + "dialogue": "Madeline: I'm really not happy with Martin’s requests, and Jada's way of dealing with his problems. But it's her responsibility. I probably shouldn't care, should I?\r\nAlex: What happened?\r\nMadeline: We had a little argument today, didn't you hear?\r\nAlex: No, I think I was away or oblivious.\r\nMadeline: Oh, ok. That's probably better 😂\r\nAlex: Tomorrow u need to tell me what happened lol\r\nMadeline: Are we going for a beer after?\r\nAlex: Sure!\r\nMadeline: Good, have a good night. See you tomorrow. More than 8 hours. 
Excited?\r\nAlex: I can’t contain myself…" + }, + { + "id": "13829396", + "summary": "Mom wants Betty to call the grandfather from time to time.", + "dialogue": "Mom: Hi, Betty, how are you?\r\nBetty: Hi, everything's fine and you?\r\nMom: me too\r\nMom: You could call Grandpa from time to time, you know\r\nMom: He's always asking about you.\r\nBetty: I know, Mom, it's just I've been so busy.\r\nMom: we're all busy. You don't have to talk long, just check in.\r\nBetty: OK, I will.\r\nBetty: ;*" + }, + { + "id": "13681405", + "summary": "William is making spaghetti alla vongole for dinner. It's an Italian dish and it involves pasta, garlic, wine and clams.", + "dialogue": "Whitney: What will be for dinner?\r\nWilliam: Spaghetti alla vongole\r\nWhitney: What's this vongole? Never seen this word\r\nWilliam: It's in Italian\r\nWhitney: What does it mean?\r\nWilliam: Clams, it's a kind of seashells, seafood\r\nWhitney: Oho, ambitious, how are you going to do that?\r\nWilliam: First I will cook pasta\r\nWhitney: quite obvious, this is spagghetti alla vontale\r\nWilliam: Alla vongole. Afterwards I make garlic golden, add wine and then clams\r\nWhitney: Gonna get drunk :))\r\nWilliam: Very funny. I wait as clams open, and I'll add pasta\r\nWhitney: Sounds delicious, I am mouth watering!" + }, + { + "id": "13820950", + "summary": "Maria suggests to meet after the IMF lecture to discuss the presentation which is due on Monday. Maria, Alexander, Martha and Sarah will meet tomorrow at 17:15. Lawrence will be late.", + "dialogue": "Maria: Who's gonna be at IMF lecture tomorrow? We can discuss all remaining questions after and do the calculations?\r\nAlexander: I don't attend that class, but it is fine by me to meet\r\nSarah: I will not be there, sorry. I am working\r\nMartha: So when? 
We are due on Monday\r\nMartha: That doesn't leave many options\r\nAlexander: On Saturday I already have to meet for another presentation, so my option is Friday afternoon or tomorrow\r\nSarah: Tomorrow and on Friday I am available from 5pm, during the weekend for the whole day\r\nLawrence: I am meet after class anytime or make time over the weekend if needed\r\nSarah: So can we meet tomorrow evening? 17:15?\r\nAlexander: It is fine by me\r\nLawrence: I will be late, but you can start without me" + }, + { + "id": "13864654", + "summary": "Molly and Anna will go to the Muse concert in Cracow.", + "dialogue": "Molly: listen I've got a free ticket to the Muse concert in Cracow, want to come with me?\nHannah: nah, I don't like them\nMolly: what about you Anna\nAnna: yassss please\nAnna: let's go! <3" + }, + { + "id": "13828411", + "summary": "Tabby has 2 exams next week. Laura passed all her exams but one. Tabby may come to Daisy's party on Saturday, depending on her studying progress. ", + "dialogue": "Laura: Hi Tabs, how are you doing?\r\nTabby: I'm ok, the only problem is I have 2 exams next week. What about you?\r\nLaura: Oh, I'm ok, passed all exams but one. But that one was not so important.\r\nTabby: Ok.\r\nLaura: Are you coming to Daisy's party on Saturday?\r\nTabby: I must see how much progress I make with studying over the next couple of days. :(\r\nLaura: Ok... I understand that. :)\r\nTabby: Yeah! But I'm optimistic.\r\nLaura: Good! That's my girl!" + }, + { + "id": "13716504", + "summary": "Becky and Trent are taking care of Joel's cat Coco. Coco likes them both and is behaving rather well so far. ", + "dialogue": "Joel: Hey, how are you guys doing? How's Coco?\r\nBecky: She's actually sitting on my lap purring right now.\r\nTrent: confirmed\r\nJoel: no way :D pics or didn't happen!\r\nBecky: \r\nJoel: haha! It took you guys 2 days to win her over. She's usually not that easy.\r\nTrent: maybe she remembers us visiting you?\r\nJoel: Maybe... 
oh, man, I'm jealous ;) Is she behaving ok?\r\nBecky: She's a total gem. Absolutely adorable.\r\nTrent: tbh she does try to scratch the couch from time to time\r\nJoel: you can just hiss at her or clap and she should stop\r\nTrent: yeah, we do that like you said and it works\r\nJoel: and if she wakes you in the morning you can just lock her out of the room until feeding time\r\nBecky: she hasn't done that yet actually\r\nJoel: I'm pretty sure it's coming :P\r\nTrent: so far she's been really sweet\r\nJoel: I'm glad. Just let me know if you need anything ok?\r\nBecky: ok sure" + }, + { + "id": "13821656", + "summary": "Sosie will be at Kyra's flat in 5 minutes. The flat number is 187.", + "dialogue": "Sosie: What's your flat apartment number again?\r\nKyra: 187\r\nSosie: thanks! Will be there in 5 mins\r\nKevin: Hurry up! \r\nKevin: We're all waiting :P" + }, + { + "id": "13730814", + "summary": "Ronnie uses three different bins for waste. He doesn't use straws, neither plastic bags. Clint and Ronnie agree that environment protection depends on the government. ", + "dialogue": "Clint: I'm curious. How's the waste management working out over there?\r\nRonnie: It actually works very well. We use three different bins: compost, recycle and trash. Compost anything organic from yard waste to food scraps. Recycle for paper, bottles, cardboard, plastic, metals. And trash everything else. Why do you ask?\r\nClint: Sheer curiosity. What about general awareness? Plastic bags, etc.?\r\nRonnie: Very high. No more single use plastic bags in grocery stores. No more straws in restaurants.\r\nClint: Yeah, I read yours was one of the cities that banned them. I wonder when that will get here.\r\nRonnie: Never...\r\nClint:: What do you mean?\r\nRonnie: I highly doubt that type of environmental awareness will happen there. People there live in a different world when it comes to that.\r\nClint: People don't have to have any awareness, tbh. It's above them. 
If there were such decisions made they would follow\r\nRonnie: That is true. But at the state government level I don't think there is that awareness either. But hopefully I am proven wrong!\r\nClint: Really bizarre. I noticed it's struggling here. On the other hand, it's money.\r\nRonnie: Money in the sense of?\r\nClint: Reducing waste=lower utilization costs. Do they sort through everything further down?\r\nRonnie: Yes here they do. \r\nClint: It seems to be the same way here. Dry recyclables, food and general, compost " + }, + { + "id": "13814888", + "summary": "Adam called Tina. She is at work. Her cell's battery's low. She will call Adam later in the evening or tomorrow after 9 am.", "dialogue": "Adam: it was me who called u\r\nAdam: call me back or text me when u can;-)\r\nTina: sorry, I'm still at work..\r\nTina: so many pple today:/ :/\r\nTina: dunno what time I can leave and actually my phone is dying...\r\nAdam: I see, u can call me once you're back home, I won't be sleeping until late\r\nTina: ok, and if it's really late then I'll call u 2morrow\r\nAdam: but don't do it before 9!;-)\r\nTina: sure thing;-)" + }, + { + "id": "13821251", + "summary": "Natalie is pregnant with Dave's baby. Jane didn't think Sally should share it with others as Natalie told them that in confidence. Henriette and Greg are surprised that the father is Dave, not Mike.", "dialogue": "Sally: Hi guys have you heard about Natalie?\r\nTom: No. What about her?\r\nJane: You're such a gossip girl, Sally!\r\nSally: Come on! I'm sure you're all curious.\r\nTom: I am. \r\nSally: Relax Jane, what's your problem?!\r\nJane: She told us in confidence, I don't think we should share it with others.\r\nGreg: Now I'm also curious\r\nHenriette: Tell us! \r\nHenriette: Is she pregnant? \r\nSally: 💣💣💣\r\nSally: Guess who's the father!!!\r\nHenriette: I knew it!\r\nHenriette: Is it Mike?\r\nSally: Dave!!! \r\nHenriette: WHAT????\r\nGreg: That's insane! 
" + }, + { + "id": "13830077", + "summary": "Nick will got some lunch for Steve - it can be anything but chicken. ", + "dialogue": "Steve: can you byt me some lunch?\r\nNick: sure, what do you want?\r\nSteve: Anything but chicken.\r\nNick: gotcha" + }, + { + "id": "13727832", + "summary": "Gemma will invite Timmy and his Date, as well as Lona and Michelle to her wedding.", + "dialogue": "Timmy: Who do u wanna invite?\r\nGemma: Well, there's u and ur date :)\r\nTimmy: Not sure about the date, but I'll be there :) who else?\r\nGemma: I was thinking about Lona and Michelle.\r\nTimmy: Still thinking ur getting invites to their weddings?\r\nGemma: It's not like that.\r\nTimmy: Sure it is ;) just don't wanna admit it ;)\r\nGemma: Fine! I want to come to their wedding receptions!\r\nTimmy: There we go :) was it so hard?\r\nGemma: Yes.\r\nTimmy: And u think u can change their minds?\r\nGemma: Dunno." + }, + { + "id": "13730976", + "summary": "Millie is sick, so she won't come today.", + "dialogue": "Millie: Heeey I’m sick I won’t come today\r\nSal: I’m sorry! Get better soon :*\r\nMillie: <3" + }, + { + "id": "13611757", + "summary": "Lisa isn't going home yet. Daisy wants her to be back before 11 p.m.", + "dialogue": "Daisy: going home?\r\nLisa: not yet\r\nDaisy: please be back before 11 pm\r\nLisa: ok" + }, + { + "id": "13681621", + "summary": "Tina will make it for the bus that is leaving in 3 minutes.", + "dialogue": "Sophie: the bus is leaving in 3 minutes, are you coming???\r\nTina: almost there, i am already dressed\r\nSophie: FASTER!\r\nTina: chill, I will make it" + }, + { + "id": "13680702", + "summary": "Sophie accepts some quince from Noah. Noah has left the quince in a basket on his terrace and the twins can pick it up any time. The twins are in college. Noah's son is in the military. He is still single but reportedly not gay.", + "dialogue": "Noah: Hi there! The quince we talked about the other day... 
Are you still interested?\r\nSophie: Hello Noah, but of course I am.Thank you.\r\nNoah: Actually William went to collect the rest of them for you immediately. They're in a basket on our terrace, so you call collect them any time.\r\nSophie: That's very very kind of him! He's really a darling.\r\nSophie: We won't be going your direction any time soon I'm afraid. How long will they keep? Is it windfall?\r\nNoah: Both really but they all look very healthy. No bruises afa one can see. Or only odd small ones. They' be alright for a couple of days I guess. The weather's cool.\r\nSophie: It would be a shame if they rotted. I'll talk to Frank and maybe to the twins too and go back to you asap.\r\nNoah: OK.\r\nSophie: Hey Noah, the twins will be on their way from Notts tomorrow afternoon and passing Windfield. Is it alright if they pop in and collect the quince?\r\nNoah: Absolutely! In fact anyone can come any time and just take them from our terrace. Of course it would be great to see your twins again. Haven't seen them for ages!\r\nSophie: Well we don't see that much of them either :( They'll be coming home this weekend only because of Alexa's ceremony.\r\nNoah: Notts is 3 hours' drive away so small wonder they don't fancy it so much.\r\nSophie: I don't blame them! We're both happy they've been doing fine at college. Everything's absolutely fine. You don't have William at home all that often either, do you?\r\nNoah: He gets 5 days at Xmas by way of \"family care\" and 2 weeks off spread over the whole year but never longer that 5 days. They have a strict regime at the academy. But after all these years we've learned to cope with it. What worries me is that he seems to have no time to even think about getting married and start a family of his own.\r\nSophie: The fate of most military I guess. Hugely unfair I'd say. Like being married to your regiment!\r\nNoah: That's what it sounds like when he talks about himself! I never hear a girl's name! 
And when I ask, he gets brusque.\r\nSophie: Oh dear! What could it mean?\r\nNoah: No, not what you think! We know from Capt. Broomsberg about their common escapades. Plenty of women but only that sort of women. No strings.\r\nSophie: I never even imagined he might be gay! Surely not. But there must be sth else.\r\nNoah: We think he's just so obsessively career-minded. Nothing else counts for him.\r\nSophie: He's always been very strong-willed. And he knows what he wants. Well if I were you I wouldn't worry at all.\r\nNoah: You're probably right. But you know I'd love it so much to have plenty of grandchildren around me.\r\nSophie: They'll come, don't you worry! Look the twins will contact you about the quince. Alright?\r\nNoah: Sure. Thank you Sophie for a nice chat!\r\nSophie: Thank you for the quince!" + }, + { + "id": "13811296", + "summary": "Harry goes to Ikea. He will buy some furniture, frozen cake and a bag of meatballs for Sarah. Harry will use Sarah’s Ikea Family card.", + "dialogue": "Harry: Going to Ikea, need anything?\r\nSarah: Oh yes! :D\r\nHarry: perfect XD\r\nSarah: So could you please buy for me: white table cloth 120 cm x 140 cm, as simple as it gets, two medium size wardrobe organisers (the ones with grey flowers), one wooden spatula and three bottle cleaners.\r\nSarah: If you happen to go the food section, I wouldn't mind a Daim's frozen cake and a bag of meatballs ;)\r\nHarry: Ok, got it. Will do my best.\r\nSarah: You can use my Ikea Family card, just give them the number and my last name and tell them you forgot to bring it ;) 17927192.\r\nHarry: Thanks! Stay online, I may need your help.\r\nHarry: \r\nHarry: Is this table cloth ok? Or that one?\r\nHarry: \r\nSarah: First one's perfect ;)" + }, + { + "id": "13810083", + "summary": "Cindy is sad, but doesn't want to talk about the reason. Ellie hasn't seen the funny video that went viral. ", + "dialogue": "Cindy: \r\nEllie: Why are you so sad? 
Something’s happened?\r\nCindy: I don’t want to talk about it…\r\nEllie: Cheer up! Tomorrow’s another day 😊\r\nCindy: \r\nCindy: Have you seen it? It’s viral on the internet\r\nEllie: Nope, but it’s very funny 😊\r\nEllie: " + }, + { + "id": "13611740", + "summary": "Jannet thanks Nadia for coming to her place yesterday. Nadia enjoyed the party and is still in a dancing mood. They are going to the disco next time.", + "dialogue": "Nadia: heeeeey how are you?\r\nJannet: hey, I'm cool, we just got up and now we're doing some breakfast :D how are you guys?\r\nNadia: cool, we just woke up like 20min ago and will eat something too\r\nJannet: nice :D thank you guys for coming yesterday, it was really nice to have you here, I hope you had fun\r\nNadia: of course we had, I hope we will repeat it soon :)\r\nJannet: sure, we just have to wait for the neighbours to forget about us, cause it was fuckin loud yesterday xD\r\nNadia: hahahah yeah it was\r\nNadia: but it was really cool, I'm still in a very dancing mood\r\nJannet: perfect, the next time we're going to the disco then :D\r\nNadia: sounds cool ;)" + }, + { + "id": "13611714", + "summary": "Jeff and Mark are amazed by his car. They bet 100 dollars who gets to drive it first.", + "dialogue": "Mark: Have you seen his new car?!!\r\nJeff: Dude, wtf, it's like insane. How the hell did he afford it???\r\nMark: No fucking clue, but the ride is legit\r\nJeff: Hell yeah, I'd drive this baby\r\nMark: Over my dead body:D I gotta be first one to try it out\r\nJeff: Yeah, you wish:D \r\nMark: wanna bet he'll let me first:D?\r\nJeff: 100 bucks dude, I'll bet your ass:D \r\nMark: hahaha deal!" + }, + { + "id": "13716588", + "summary": "Ann, Sue and Julie did a great job and they will have a little celebration tonight.", + "dialogue": "Ann: Congratulations!!\r\nAnn: You did great, both of you!\r\nSue: Thanks, Ann\r\nJulie: I'm glad it's over!\r\nJulie: That's co cute of you, girl!\r\nAnn: Let's have a little celebration tonight! 
\r\nSue: I'm in\r\nJulie: me too!!! aww" + }, + { + "id": "13728148", + "summary": "Kathy had her hair cut.", "dialogue": "Kathy: \r\nKathy: Aunt on the chair getting her haircut today :)\r\nKathy: I think I'm also going to get something done today\r\nKathy: Maybe get it cut a bit shorter\r\nOlivia: ooo how fun! I am just chillin today \r\nOlivia: \r\nKathy: \r\nKathy: : \r\nKathy: The end results :) \r\nOlivia: Very cute! \r\nOlivia: \r\nOlivia: Although not a big difference with Aunt's hair. But yours looks really nice!\r\nKathy: Thanks!" + }, + { + "id": "13716254", + "summary": "Claudia, Andy and Mark stayed after hours at Mr. Benson's request. Mr. Benson is currently discussing a contract with a new client in Peru and he wants to organize their work remotely.", "dialogue": "Mr. Benson: Hello everyone. First of all, thank you for staying after hours. As you know, I'm currently in Peru discussing a contract with a new client. \r\nMr. Benson: We need to discuss a couple of matters: your work while I'm gone and things I need from you asap. \r\nClaudia: Hello, Mr. Benson. Of course. What do you require from us?\r\nAndy: Hello, Mr. Benson. I'm here. \r\nMark: Hello, so am I. Sorry for being a little late. Had a call with a client. Did I miss anything?\r\nMr. Benson: No, Mark, we're just starting." 
+ }, + { + "id": "13810149", + "summary": "Jane thinks Den's mum needs to get out more, so Den suggests to invite her for tea, fish and chips on Friday after work.", + "dialogue": "Jane: your mums losing it\r\nDen: why you say that?\r\nJane: she just text me about crying over this morning?\r\nDen: what did she do this morning?\r\nJane: no the telly program with phil and holly\r\nDen: oh right y?\r\nJane: some baby reveal😂\r\nDen: I'm confused\r\nJane: I think she needs to get out of the house more:\r\nDen: invite her for tea on Friday I'll bring fish and chips home after work\r\nJane: thats a good idea x" + }, + { + "id": "13728835", + "summary": "Mum is at school in front of the door. Ludo's rooms is 112 and his class is class 3. The same goes for Hugo and Charles. The building is big and has a big garden. The meeting is about to start.", + "dialogue": "Ludo: hi mum, did you arrive at school?\r\nMum: yes , i'm in front of the door: you 're room 112 with Hugo and Charles.\r\nLudo: and do you know my class number?\r\nMum: yes class 3, as Hugo and Charles\r\nLudo: good, so cool\r\nMum: I'll show you the picture of the listing, but i couldn't manage to see the room.\r\nLudo: Ok no problem\r\nMum: the building is quite big and there is also a huge garden, you'll have room enough\r\nLudo: Yes i saw\r\nMum: i have to go, meeting is starting\r\nLudo: ok thanks\r\nMum: don't wait for me" + }, + { + "id": "13731212", + "summary": "Chandler asks Phoebe to open the door and pay the delivery guy standing outside his door.", + "dialogue": "Chandler: Phoebe!! Do you have money?? \r\nPhoebe: Yes I have .. But why do you need it..\r\nChandler: Open your door.. And pay the delivery guy standing outside my door..\r\nPhoebe: Oh chandler !! you idiot... On my way.." + }, + { + "id": "13865040", + "summary": "Max's sister is studying in Shanghai and she already speaks Chinese. 
She doesn't find the whole experience amazing but she believes it's a good investment.", + "dialogue": "Rory: Max, is your sister studying in China?\nMax: She is\nRory: How does she like it?\nMax: it's not amazing, but she believes it's a good investment\nRory: does she speak Chinese?\nMax: I think she does already\nJoseph: Hard to control I imagine\nMax: hahaha\nEliza: I think this is the best investment imaginable\nMax: really?\nEliza: sure, the every 5th earthling is Chinese\nEliza: or so\nRory: true, I don't think you could stay unemployed speaking Chinese\nMax: but there are Chinese everywhere and they speak foreign languages\nRory: true\nRory: anyway, where is is? Beijing?\nMax: Nope. Shanghai\nRory: I've just checked, it has 25 million inhabitants!\nMax: yes, she said it's actually bigger than the capital\nRory: Insane, it's the population of the whole Australia" + }, + { + "id": "13682204", + "summary": "Russ received David's report but hasn't read it yet.", + "dialogue": "David: Morning Russ. Have you seen the report I emailed yesterday?\r\nRuss: Hi David. Well received thank you. But I haven't read it yet.\r\nDavid: Is there anything you'd like me to do right now?\r\nRuss: I'll take a look at the report in a moment and will send you remarks if I have any.\r\nDavid: Sounds good. I guess I'll just answer some emails.\r\nRuss: Please do. I should be done by midday with the report." + }, + { + "id": "13829899", + "summary": "Ben won't go with Catherine to visit uncle Steve. He will visit her and the boys. 
", + "dialogue": "Catherine: We're going to visit uncle Steve, want to come with us?\r\nBen: why would I?\r\nCatherine: He's family and we haven't seen him in a long time.\r\nBen: you know very well that I'm not fond of such visits\r\nBen: they stress me out\r\nCatherine: Come on, even you can do it from time to time.\r\nBen: maybe I can but I don't want to\r\nBen: it will end the same way as usual\r\nBen: with me being pissed at everyone\r\nCatherine: The boys want to see you\r\nBen: I can visit you and play with them some other day\r\nBen: I don't have to drive to another city to do it\r\nCatherine: You know that he'll be disappointed?\r\nBen: uncle Steve?\r\nCatherine: Yes.\r\nBen: I don't think so, we don't have anything in common\r\nBen: all he ever does is bothering me with stupid questions about my private life\r\nCatherine: Maybe that's his way of showing that he cares about you\r\nBen: yes... sure... sometimes I wonder if you're thinking about what you type :P\r\nCatherine: Depends on the day\r\nCatherine: Today I'm tired so I can't guarantee it\r\nBen: you should rest then\r\nBen: instead of asking me to do stuff I don't like\r\nCatherine: Alright, don't worry, I'll stop\r\nCatherine: But still the part about the boys missing you was true\r\nCatherine: So come and visit them when you'll have the chance\r\nBen: I will" + }, + { + "id": "13816899", + "summary": "Kaya is looking for Clay, who is in the classroom.", + "dialogue": "Kaya: We have been looking for you in Library\r\nClay: I am the class room\r\nKaya: Be right there" + }, + { + "id": "13813888", + "summary": "Alice and Sean will wash the car on their way tomorrow.", + "dialogue": "Sean: Hey, I won't be able to take the car to the carwash\r\nSean: They want me to finish report first :(\r\nAlice: shoot, but it's crazily dirty\r\nAlice: Will we have tomorrow?\r\nSean: \r\nSean: We can leave a bit earlier or get it washed somewhere on the road\r\nAlice: it might be good idea, let's do it tomorrow 
then\r\nSean: great!" + }, + { + "id": "13731225", + "summary": "Julia will be waiting for Bert with the dinner. Bert is coming home around 8.", + "dialogue": "Julia: Hey, what time are you getting home?\r\nBert: 8-ish. Why?\r\nJulia: I was wondering if we should wait for you with the dinner?\r\nBert: Yeah, that would be nice of you. I'll try to get there on time\r\nJulia: Ok. Call me if you're running late\r\nBert: I will. xx" + }, + { + "id": "13682248", + "summary": "Steve is happy that he got a new dishwasher installed.", + "dialogue": "Steve: hiya the dishwasher has turned up\r\nMum: good did they install it for you\r\nSteve: yes they did a good job\r\nSteve: \r\nMum: looks really nice\r\nSteve: yes it does its nice and quiet\r\nMum: is it a 12 place setting one?\r\nSteve: I think so, not that I ever have 12 for dinner lol\r\nMum: not but you dont have to put it on every day\r\nSteve: no every other\r\nMum: yes I'm glad you are pleased with it xx\r\nSteve: I am thanks mum xxx" + }, + { + "id": "13810207", + "summary": "Michelle is still researching and Harvey cannot wait all day for the update. Michelle can't see most of the things as it's black hat and it installs bugs on her computer.", + "dialogue": "Harvey: Update?\r\nMichelle: Too soon, still researching.\r\nHarvey: That is an update, Michelle.\r\nMichelle: Fine. \r\nHarvey: I can't wait all day.\r\nMichelle: If you stop texting me, I could get on with it!\r\nHarvey: Fair point.\r\nMichelle: Besides, half of this stuff is black hat. I can't see most of it.\r\nHarvey: Why not?\r\nMichelle: It installs bugs on my computer.\r\nHarvey: Oh, right.\r\nMichelle: It's fine. I'm back on it.\r\nHarvey: Thanks." 
+ }, + { + "id": "13729149", + "summary": "Joy is coming back on Thursday.", + "dialogue": "Bill: lovely pie\r\nJoy: hey lover boy\r\nBill: i really miss you\r\nJoy: i miss you too😘\r\nBill: when are you coming back\r\nJoy: on thursday\r\nBill: cant wait to see you\r\nJoy: me too" + }, + { + "id": "13828432", + "summary": "Julia was at cafe Kohaku near Covent Garden last week, but it's closed down now to her and Henry's surprise. She contacted the owners, but they haven't replied yet. Henry and his cousin went to the Lily's in the end, but it had a different vibe.", + "dialogue": "Henry: Do you know what happened to cafe Kohaku?\r\nJulia: The one near Covent Garden?\r\nHenry: Yeah. I wanted to reserve a table, but I can't find them anywhere. Their Facebook page disappeared.\r\nJulia: I'm afraid they're closed :(\r\nHenry: For good?!\r\nJulia: Afraid so. I've been there last week and wanted to take my cousin there, but the place's empty\r\nJulia: Probably the rent was too high.\r\nHenry: Oh no, I loved that place\r\nJulia: Me too. I checked online for the owners and contacted them, but they haven't replied yet. I hope they will reopen somewhere else\r\nHenry: Unless they do it far away, the location was perfect. It was a real gem in this touristy district\r\nJulia: I know, I know... We went the Lilly's in the end, my cousin and I, but it's not the same\r\nHenry: Eh, that's a pity then, I was so looking forward to their brownie\r\nHenry: How was the Lilly's?\r\nJulia: It's all right, it's a different vibe though. I think you should go to the Japan Centre for the same\r\nHenry: Uhm, don't think so. It's always awfully crowded and there are more tourists than normal people like us" + }, + { + "id": "13728300", + "summary": "Olivia has to sort out her accounts and upload a few videos on YouTube. Jake is complaining that Sony Music tried to appropriate his own music.", + "dialogue": "Jake: What are your plans for the day?\r\nOlivia: I haven't really got anything planned. 
There are some things I should do which I haven't looked at for a while\r\nJake: Like what?\r\nOlivia: I ought to do my accounts. At least get all the incoming and outgoing invoices sorted by months.\r\nJake: Tax filing is a long way off.\r\nOlivia: I know, but doing it all at the last minute is a dreadful headache.\r\nJake: Anything else?\r\nOlivia: I also have a few videos I need to upload to YouTube.\r\nJake: Already ready or you still need to do post-production?\r\nOlivia: I don't really do much by way of post-production. I'm not exactly Steven Spielberg.\r\nJake: I do, but I dont always have time to do much.\r\nOlivia: If I put in any music I always get a copyright strike and someone else is taking the ad revenue\r\nJake: Yeah. And some of those claims are purely speculative. One time I put on one of my own pieces and it was challenged by Sony Music\r\nOlivia: No way!\r\nJake: It's true. On the form you fill in to counter the claim I said if Sony want to make me one of their signed artists, that's fine. Otherwise to get they paws off of my music." + }, + { + "id": "13810177", + "summary": "Emily's a guest at Linda's house. She broke one of Linda's green tea cups when she was cleaning the cupboards. Linda doesn't like them and she offers Emily the whole set.", + "dialogue": "Emily: Oh Linda...Something horrible has happened. I'm so sorry. I have broken one of your green tea cups. Just looked away and brushed it off the table with my skirt. SORRY!\r\nLinda: No worries babe! It's nothing. Really.\r\nEmily: I'm feeling so bad about it. So clumsy of me.\r\nLinda: Look. I never really liked the set, so one cup makes no difference. I hardly use it anyway. How come it was on the table BTW?\r\nEmily: I was cleaning the cupboards in the dining room.\r\nLinda: You did what? Whatever for? The crockery there is more for decoration.\r\nEmily: Exactly. I thought they're so pretty, especially the green ones, and then noticed how dusty they are. 
So I thought I'd be a helpful guest and wash them. I did. And then I made myself a cuppa in the green one. And it happened.\r\nLinda: Ya golden! Really! Look, calm down. You are a fantastic guest.\r\nEmily: Ya. Smashing. As we say in GB.\r\nLinda: You know what? Take out the whole green set. If you like it so much - it is yours.\r\nEmily: But Linda! I'm speechless.\r\nLinda: Precious! See you in the evening!\r\nEmily: CU\r\nEmily: And THANK YOU" + }, + { + "id": "13730403", + "summary": "Camilla and Tom will go to Dublin this weekend.", + "dialogue": "Tom: Have you ever been to Dublin?\r\nCamilla: Never in Ireland!\r\nTom: So let's go there this weekend!\r\nCamilla: for 2 days?\r\nTom: Yes, the weather forecast is great!\r\nCamilla: I love your crazy ideas!\r\nTom: ok, so I'm buying the tickets\r\nCamilla: <3" + }, + { + "id": "13681886", + "summary": "Stella doesn't want to visit Sandra because she doesn't want to get infected by Sandra's disease. Sandra's doctor prescribed her effective medicines, so she will be fine soon. Stella wants to take Sandra to the cinema for some action movie when she recovers.", + "dialogue": "Stella: Hi Sandra\r\nStella: Did you recover finally?\r\nStella: I wanted to come and visit you but I didn't want to get infected by the disease that you've been carrying\r\nStella: You know... 
I'm a teacher, I could spread the disease\r\nSandra: Hello my dear\r\nSandra: I'm getting better every day but there's still something left\r\nSandra: I have some effective medicines now\r\nSandra: All the drugs that doctors have been prescribing to me so far were useless\r\nSandra: Now the difference in my well-being is surprisingly considerable\r\nStella: It's very nice to here that, I'm so happy for you!\r\nStella: Hope you'll be in a perfect health soon\r\nStella: I want to take you to the cinema\r\nStella: They have a brilliant repertoire now\r\nSandra: Oh, I dream of getting out of the house!\r\nSandra: I used to love laying in the bed for hours, but now I hate it\r\nStella: I understand\r\nStella: Just tell me, what kind of movie would you like to watch?\r\nStella: A romantic comedy, horror, fantasy or something else?\r\nStella: I'll do some research in my friends' midst and try to choose the best option :)\r\nSandra: I'm dreaming of watching some action movie\r\nSandra: With all the special effects and dynamic action\r\nSandra: Something that would be the opposite of my current situation :)\r\nStella: You got it! :)\r\nStella: I'll find something special for you\r\nStella: So I'm keeping my fingers crossed for your quick recovery\r\nStella: Talk to you soon! :)\r\nSandra: Thanks, Stella, my darling. You made my day :)\r\nSandra: Bye bye" + }, + { + "id": "13728819", + "summary": "Harley and Ruby are discussing the divorce filing. Harley and Ruby agree there are always two sides.", + "dialogue": "Harley: You should see this divorce filing. OMG...\r\nRuby: Bad?\r\nHarley: The guy was a serial cheater!\r\nRuby: OMG!\r\nHarley: She's taking him to the cleaners. Monthly settlement, half his pension, you name it.\r\nRuby: He deserves it!\r\nHarley: There are always two sides...\r\nRuby: Maybe she was a piss poor wife?\r\nHarley: Never cooked. Never cleaned. 
He paid for everything yet still managed to buy her mom a house and take extravagant vacays.\r\nRuby: Geez, down boy!\r\nHarley: Voice of experience, sorry!\r\nRuby: Well, people do crazy things. \r\nHarley: Yes, and some times theres karma, but...\r\nRuby: You're right, two sides." + }, + { + "id": "13611458-1", + "summary": "Ken feels stressed because of work and fighting with Brad. There is also too much going on at mom's. Ken is going to a show on Saturday night. On Sunday Ken is seeing the grandkids at the zoo.", + "dialogue": "Ken: Hi, how are you?\r\nAng: Just peachy! You?\r\nKen: I'm okay...\r\nAng: Just okay? What's wrong?\r\nKen: Just stressed; work stuff, fighting with Brad, too much going on at mom's.\r\nAng: Hang in there, it will get better!\r\nKen: I know, but it's a lot.\r\nAng: Can I do anything to help?\r\nKen: You are! Listening to me vent! LOL!\r\nAng: Are you at least doing anything fun this weekend?\r\nKen: Show Saturday night, then seeing the grandkids on Sunday at the zoo.\r\nAng: Sounds great! That will cheer you up!\r\nKen: Gotta run, work calls. Love you!\r\nAng: Love you too! Have a fantastic day!\r\nKen: You too!" + }, + { + "id": "13730606", + "summary": "It's derby day today. Titus supports Manchester United. Julius supports City.", + "dialogue": "Julius: Yoh, today its derby day⚽\r\nTitus:⚽⚽ yeah, finally man\r\nJulius: which team are you supporting today\r\nTitus: Manchester united anytime, any day\r\nJulius: city will win\r\nTitus: haha, united will win\r\nJulius: lets see then\r\nTitus: cool" + }, + { + "id": "13682266", + "summary": "Addisyn hasn't talked to Dexter for a long time and he thinks she doesn't love him anymore.", + "dialogue": "Dexter: Hello ;)\r\nAddisyn: :)\r\nDexter: I miss you\r\nAddisyn: Long time we haven't talked. But maybe thanks to this u miss me.. :)\r\nDexter: Yea you don't love me anymore\r\nAddisyn: Why should I have stopped loving you? Xd\r\nDexter: Like you didn't miss me too. 
Because you've been quiet that's why" + }, + { + "id": "13821126", + "summary": "Jason had a dental appointment today and that's why he was absent.", + "dialogue": "Megan: Any word from Jason?\r\nNathan: Why do you ask?\r\nMegan: He was absent today :c\r\nSusy: He had a dental appointment.\r\nMegan: Thanks for the info :)" + }, + { + "id": "13829710", + "summary": "Cindy has made arrangements for today's meeting at 2 pm in the conference room. She also organised flights and hotel for next week's trip. Don is appreciative. ", + "dialogue": "Don: Hi Cindy. Have you made all arrangements?\r\nCindy: It's about today's meeting or your trip next week?\r\nDon: Both, I suppose:)\r\nCindy: You have meeting with management board today at 2 pm.\r\nDon: Where did you set it up?\r\nCindy: In our conference room.\r\nCindy: Catering will bring some tea, coffee and snacks.\r\nDon: That's good.\r\nDon: Did everybody got the agenda?\r\nCindy: Yep.\r\nDon: How did Andy react when he saw it?\r\nCindy: Can't say, really. Not sure if he even read it.\r\nDon: That's Andy all right.\r\nDon: And how about the trip.\r\nCindy: I've got your plane tickets and booked the hotel.\r\nDon: Which one?\r\nCindy: Hilton, as usual.\r\nDon: Perfect:=)\r\nCindy: But nobody is gonna pick you up at the airport. You'll have to get a cab.\r\nDon: I think, I can manage that;=)\r\nDon: Good job, Cindy. No idea, where I'd be without you." + }, + { + "id": "13821325", + "summary": "Samuel ordered a smoke.", + "dialogue": "Samuel: \r\nVirginia: hahaha\r\nJack: Hilarious\r\nVirginia: Where are you?\r\nSamuel: \r\nJack: I love the \"happy menu\"\r\nVirginia: What did you order?\r\nSamuel: Just a joint for now\r\nSamuel: But mushroom pizza also looks tempting\r\nVirginia: Is it legal in Laos?\r\nSamuel: I guess the owner bribed the police\r\nJack: Man, it's heaven\r\nJack: I need to go there :P " + }, + { + "id": "13612147", + "summary": "Mike considers going to Egypt for holiday. 
It's too hot for Celia, she suggests Croatia instead. Mark likes the idea, he's never been there. ", + "dialogue": "Celia: Where do you want to go for Holiday ?\r\nMike: I was thinking about Egypt\r\nCelia: Too hot. What about Croatia ?\r\nMike: Good idea, I've never been there" + }, + { + "id": "13680480", + "summary": "Jasmine loves Charlie Puth and his new song. Paola thinks Charlie Puth is attractive. Paola likes the song \"Galway Girl\" by Ed Sheeran.", + "dialogue": "Jasmine: Have you heard this song?\r\nJasmine: