Eric03 committed on
Commit ce4cdfa · verified · 1 Parent(s): 9357a7c

Add files using upload-large-folder tool

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. 2003.12753/main_diagram/main_diagram.drawio +0 -0
  2. 2003.12753/paper_text/intro_method.md +63 -0
  3. 2005.09812/main_diagram/main_diagram.drawio +1 -0
  4. 2005.09812/paper_text/intro_method.md +17 -0
  5. 2011.02048/main_diagram/main_diagram.drawio +1 -0
  6. 2011.02048/main_diagram/main_diagram.pdf +0 -0
  7. 2011.02048/paper_text/intro_method.md +53 -0
  8. 2102.04362/main_diagram/main_diagram.drawio +1 -0
  9. 2102.04362/main_diagram/main_diagram.pdf +0 -0
  10. 2102.04362/paper_text/intro_method.md +236 -0
  11. 2102.08201/main_diagram/main_diagram.drawio +1 -0
  12. 2102.08201/main_diagram/main_diagram.pdf +0 -0
  13. 2102.08201/paper_text/intro_method.md +91 -0
  14. 2104.02409/main_diagram/main_diagram.drawio +1 -0
  15. 2104.02409/main_diagram/main_diagram.pdf +0 -0
  16. 2104.02409/paper_text/intro_method.md +32 -0
  17. 2104.05591/main_diagram/main_diagram.drawio +1 -0
  18. 2104.05591/main_diagram/main_diagram.pdf +0 -0
  19. 2104.05591/paper_text/intro_method.md +34 -0
  20. 2104.08793/main_diagram/main_diagram.drawio +1 -0
  21. 2104.08793/main_diagram/main_diagram.pdf +0 -0
  22. 2104.08793/paper_text/intro_method.md +100 -0
  23. 2104.09667/main_diagram/main_diagram.drawio +1 -0
  24. 2104.09667/main_diagram/main_diagram.pdf +0 -0
  25. 2104.09667/paper_text/intro_method.md +104 -0
  26. 2107.08929/main_diagram/main_diagram.drawio +1 -0
  27. 2107.08929/main_diagram/main_diagram.pdf +0 -0
  28. 2107.08929/paper_text/intro_method.md +62 -0
  29. 2108.05997/main_diagram/main_diagram.drawio +0 -0
  30. 2108.05997/paper_text/intro_method.md +90 -0
  31. 2109.04683/main_diagram/main_diagram.drawio +0 -0
  32. 2109.04683/paper_text/intro_method.md +33 -0
  33. 2110.08207/main_diagram/main_diagram.drawio +1 -0
  34. 2110.08207/main_diagram/main_diagram.pdf +0 -0
  35. 2110.08207/paper_text/intro_method.md +11 -0
  36. 2110.08421/main_diagram/main_diagram.drawio +1 -0
  37. 2110.08421/main_diagram/main_diagram.pdf +0 -0
  38. 2110.08421/paper_text/intro_method.md +20 -0
  39. 2112.04386/main_diagram/main_diagram.drawio +0 -0
  40. 2112.04386/paper_text/intro_method.md +78 -0
  41. 2203.02574/main_diagram/main_diagram.drawio +0 -0
  42. 2203.02574/main_diagram/main_diagram.pdf +0 -0
  43. 2203.02574/paper_text/intro_method.md +89 -0
  44. 2203.10321/main_diagram/main_diagram.drawio +0 -0
  45. 2203.10321/paper_text/intro_method.md +17 -0
  46. 2203.14675/main_diagram/main_diagram.drawio +0 -0
  47. 2203.14675/paper_text/intro_method.md +116 -0
  48. 2204.01188/main_diagram/main_diagram.drawio +1 -0
  49. 2204.01188/main_diagram/main_diagram.pdf +0 -0
  50. 2204.01188/paper_text/intro_method.md +176 -0
2003.12753/main_diagram/main_diagram.drawio ADDED
The diff for this file is too large to render.
 
2003.12753/paper_text/intro_method.md ADDED
@@ -0,0 +1,63 @@
+ # Introduction
+
+ Human digitization is essential to a variety of applications, ranging from visual effects and video gaming to telepresence in VR/AR. The advent of deep learning techniques has brought impressive progress in recovering unclothed human shape and pose from multiple [29, 62] or even single [44, 56, 5] images. However, these leaps in performance come only when a large amount of labeled training data is available. This limitation has led to inferior performance in reconstructing clothing – the key element of casting a photorealistic digital human – compared to naked human body reconstruction. One primary reason is the scarcity of 3D garment datasets, in contrast with the large collections of naked body scans behind models such as SMPL [38] and SCAPE [6]. In addition, the complex surface deformation and large diversity of clothing topologies introduce further challenges in modeling realistic 3D garments.
+
+ ![](_page_1_Picture_4.jpeg)
+
+ Fig. 1: We present Deep Fashion3D, a large-scale repository of 3D clothing models reconstructed from real garments. It contains over 2000 3D garment models spanning 10 different cloth categories. Each model is richly labeled with a ground-truth point cloud, multi-view real images, 3D body pose, and a novel annotation named feature lines. With Deep Fashion3D, inferring garment geometry from a single image becomes possible.
+
+ To address the above issues, there is an increasing need for a high-quality 3D garment database that satisfies the following properties. First, it should contain a large-scale repository of 3D garment models covering a wide range of clothing styles and topologies. Second, it is preferable to have models reconstructed from real images with physically correct clothing wrinkles, to accommodate the requirement of modeling the complicated dynamics and deformations caused by body motion. Lastly, the dataset should carry sufficient annotations, e.g. the corresponding real images and body pose, to provide strong supervision for deep generative models.
+
+ Multi-Garment Net (MGN) [7] introduced the first dataset specialized for digital clothing obtained from real scans. The proposed digital wardrobe contains 356 digital scans of clothed people, fitted to pre-defined parametric cloth templates. However, the digital wardrobe covers only 5 garment categories, which is quite limited compared to the large variety of garment styles. Apart from 3D scans, some recent works [60, 25] leverage synthetic data obtained from physical simulation. However, synthetic models lack realism compared to 3D scans and cannot provide corresponding real images, which are critical for generalizing the trained model to images in the wild.
+
+ In this paper, we address the lack of data by introducing Deep Fashion3D, the largest 3D garment dataset to date, containing thousands of 3D clothing models with comprehensive annotations. Compared to MGN, the collection of Deep Fashion3D is one order of magnitude larger, including 2078 3D models reconstructed from real garments. It is built from 563 diverse garment instances covering 10 different clothing categories. Annotation-wise, we introduce a new type of annotation tailored to 3D garments – 3D feature lines. Feature lines denote the most prominent geometric features on garment surfaces (see Fig. 3), including necklines, cuff contours, hemlines, etc., which provide strong priors for 3D garment reconstruction. Apart from feature lines, our annotations also include calibrated multi-view real images and the corresponding 3D body pose. Furthermore, each garment item is randomly posed to enhance the dataset's capacity for modeling dynamic wrinkles.
+
+ To fully exploit the power of Deep Fashion3D, we propose a novel baseline approach capable of inferring realistic 3D garments from a single image. Despite the large diversity of clothing styles, most existing works are limited to one fixed topology [18, 32]. MGN [7] introduces class-specific garment networks – each deals with a particular topology and is trained on a one-category subset of the database. However, given the very limited data, each branch is prone to overfitting. We propose a novel representation, named adaptable template, that can scale to varying topologies during training. It enables our network to be trained on the entire dataset, leading to stronger expressiveness. Another challenge of reconstructing 3D garments is that a clothing model is typically a shell structure with open boundaries. Such topology can hardly be handled by implicit or voxel representations. Yet, methods based on deep implicit functions [42, 47] have shown their ability to model fine-scale deformations that the mesh representation cannot. We propose to connect the best of both worlds by transferring the high-fidelity local details learned from implicit reconstruction to the template mesh, which has the correct topology and robust global deformations. In addition, since our adaptable template is built upon the SMPL topology, it is convenient to repose or animate the reconstructed results. The proposed framework is implemented in a multi-stage manner with a novel feature line loss to regularize mesh generation.
+
+ We have conducted extensive benchmarking and ablation analysis on the proposed dataset. Experimental results demonstrate that the proposed baseline model trained on Deep Fashion3D sets a new state of the art on the task of single-view garment reconstruction. Our contributions can be summarized as follows:
+
+ - We build Deep Fashion3D, a large-scale, richly annotated 3D clothing dataset reconstructed from real garments. To the best of our knowledge, this is the largest dataset of its kind.
+ - We introduce a novel baseline approach that combines the merits of mesh and implicit representations and is able to faithfully reconstruct a 3D garment from a single image.
+ - We propose a novel representation, called adaptable template, that enables encoding clothing of various topologies in a single mesh template.
+ - We are the first to present the feature line annotation specialized for 3D garments, which provides strong priors for garment reasoning tasks, e.g., 3D garment reconstruction, classification, and retrieval.
+ - We build a benchmark for single-image garment reconstruction by conducting extensive experiments evaluating a number of state-of-the-art single-view reconstruction approaches on Deep Fashion3D.
+
+ # Method
+
+ To demonstrate the usefulness of Deep Fashion3D, we propose a novel baseline approach for single-view garment reconstruction. Specifically, taking a single image I of a garment as input, we aim to reconstruct its 3D shape represented as a triangular mesh. Although recent advances in 3D deep learning have achieved promising progress in single-view reconstruction of general objects, we found that all existing approaches have difficulty scaling to cloth reconstruction. The main reasons are threefold: (1) Non-closed surfaces. Unlike the general objects in ShapeNet [13], a garment typically appears as a thin layer with an open boundary. While implicit representations [42, 47] can only model closed surfaces, voxel-based approaches [16] are not suited for recovering shell-like structures such as the garment surface. (2) Complex shape topologies. As all existing mesh-based approaches [23, 59, 46] rely on deforming a fixed template, they fail to handle the highly diversified topologies introduced by different clothing categories. (3) Complicated geometric details. While general man-made objects typically consist of smooth surfaces, clothing dynamics often introduce intricate high-frequency surface deformations that are challenging to capture.
+
+ Overview. To address the above issues, we employ a hybrid representation that leverages the merits of each embedding. In particular, we harness both the capability of implicit surfaces to model fine geometric details and the flexibility of the mesh representation in handling open surfaces. Our method starts by generating a template mesh M<sup>t</sup>, which automatically adapts its topology to fit the target clothing category in the input image. It is then deformed to M<sup>p</sup> by fitting the estimated 3D pose. By treating the feature lines as a graph, we then apply an image-guided graph convolutional network (GCN) to capture the 3D feature lines, which subsequently drive a handle-based deformation that generates mesh M<sup>l</sup>. To exploit the power of the implicit representation, we first employ OccNet [42] to generate a mesh model M<sup>I</sup> and then adaptively register M<sup>l</sup> to M<sup>I</sup>, incorporating the learned fine surface details from M<sup>I</sup> while discarding the outliers and noise caused by the enforcement of a closed surface. The proposed pipeline is illustrated in Figure 4.
+
+ Adaptable template. We propose the adaptable template, a new representation that is scalable to different cloth topologies, enabling the generation of all types of cloth in the dataset with a single network. The adaptable template is built on the SMPL [38] model by removing the head, hands and feet regions. As seen in Figure 4, it is then segmented into 6 semantic regions: torso, waist, and upper/lower limbs/legs. During training, the entire adaptable template is fed into the pipeline, but different semantic regions are activated according to the estimated cloth topology. We denote the template mesh as M<sup>t</sup> = (V, E, B), where V = {v<sub>i</sub>} and E are the sets of vertices and edges respectively, and B = {b<sub>i</sub>} is a per-vertex binary activation mask. v<sub>i</sub> is activated only if b<sub>i</sub> = 1; otherwise v<sub>i</sub> is detached during training and removed from the output. The activation mask is determined by the estimated cloth category, with regions of vertices labeled as a whole. For instance, to model a short-sleeve dress, vertices belonging to the lower limbs and legs are deactivated. Note that in order to adapt the waist region to the large deformations needed for modeling long dresses, we densify its triangulation accordingly using mesh subdivision.
+
+ Cloth classification. We build a cloth classification network based on a pretrained VGGNet. The classification network is trained on both real and synthetic images; the synthetic images provide augmented lighting conditions. In particular, we render each garment model under different global illuminations from 5 random views, generating around 10,000 synthetic images, 90% of which are used for training while the rest are reserved for testing. Our classification network achieves an accuracy of 99.3%, yielding an appropriate template at both train and test time.
+
+ To achieve a balanced trade-off between mesh smoothness and reconstruction accuracy, we propose a multi-stage pipeline that progressively deforms M<sup>t</sup> to fit the target shape.
+
+ Feature line-guided Mesh Generation. It is well understood that feature lines, such as necklines and hemlines, play a key role in casting the shape contours of 3D clothing. Therefore, we propose to first infer the 3D feature lines and then deform $M_t$ by treating the feature lines as deformation handles.
+
+ Pose Estimation. Due to the large degrees of freedom of 3D lines, directly regressing their positions is highly challenging. To reduce the search space and make the problem tractable, we first estimate the body pose and deform $M_t$ to obtain a new mesh $M_p$, which provides an initialization $\{l_i^p\}$ of the 3D feature lines. Here, the pose of the 3D garment is represented by the SMPL pose parameters $\theta$ [38], which are regressed by a pose estimation network.
+
+ GCN-based Feature line regression. We represent the feature lines $\{l_i^p\}$ as polygons during pose estimation. This enables us to treat them as a graph and employ an image-guided GCN to regress vertex-wise displacements. We employ another VGG module to extract features from the input image and follow a learning strategy similar to Pixel2Mesh [59] to infer the deformation of the feature lines. Note that before the regression step, we first determine the activated subset of feature lines according to the estimated cloth category and feed only the activated ones into the network.
+
+ Handle-based deformation. We denote the output feature lines of the above steps as $\{l_i^o\}$. $M_l$ is obtained by deforming $M_p$ so that its feature lines $\{l_i^p\}$ fit our prediction $\{l_i^o\}$. We use handle-based Laplacian deformation [53], setting the alignment between $\{l_i^p\}$ and $\{l_i^o\}$ as hard constraints while optimizing the displacements of the remaining vertices to achieve smooth and visually pleasing deformations. Note that the explicit handle-based deformation quickly leads to a result that is close to the target surface, which alleviates the difficulty of regressing a large number of vertices.
+
+ Surface Refinement by Fitting Implicit Reconstruction. After obtaining $M_l$, a straightforward way to obtain refined surface details is to apply Pixel2Mesh [59] directly, taking $M_l$ as input. However, as illustrated in Fig. 5, this method fails, probably due to the inherent difficulty of learning high-frequency details while preserving surface smoothness. In contrast, our empirical results indicate that implicit-surface-based methods, such as OccNet [42], can faithfully recover the details but only generate closed surfaces. We therefore directly perform an adaptive non-rigid registration from $M_l$ to the output of OccNet to transfer the learned surface details.
+
+ Learning implicit surface. We directly employ OccNet [42] for learning the implicit surface. Specifically, the input image is first encoded into a latent vector using ResNet-18. For each 3D point in space, an MLP consumes its coordinates and the latent code and predicts whether the point is inside or outside the surface. Note that we convert all the data into closed meshes using the Poisson reconstruction method. With the trained network, we first generate an implicit field and then extract the reconstructed surface $M_I$ using the marching cubes algorithm [39].
+
+ Detail transfer with adaptive registration. Though OccNet can synthesize high-quality geometric details, it may also introduce outliers due to its enforcement of a closed surface. We therefore propose an adaptive registration that transfers only correct high-frequency details by imposing two additional constraints on the conventional non-rigid ICP algorithm: (1) the two points of a valid correspondence should have consistent normal directions (i.e., the angle between the two normals should be smaller than a threshold, set to 60°); (2) the bi-directional Chamfer distance between the corresponding points should be less than a preset threshold σ (set to 0.01). The adaptive registration helps to remove erroneous correspondences and produces our final output $M_r$.
+
+ Four sub-networks need to be trained: cloth classification, pose estimation, GCN-based feature line fitting, and implicit reconstruction. Each sub-network is trained independently. In the following subsections, we provide details on training data preparation and loss functions.
+
+ Pose estimation. We obtain the 3D pose of each garment model by fitting the SMPL model to the reconstructed dense point cloud. The data processing procedure is as follows: 1) for each annotated feature line, we take its center point as the corresponding skeleton joint; 2) we use the joints in the torso region to align all the point clouds, ensuring a consistent orientation and scale; 3) lastly, we compute the SMPL pose parameters for each model by fitting the joints and point cloud. The obtained pose parameters are used to supervise the pose estimation module in Section 4.2.
+
+ Image rendering. We augment the input with synthetic images. In particular, for each model, we generate rendered images by randomly sampling 3 viewpoints and 3 different lighting environments, obtaining 9 images in total. Note that we only sample viewpoints from the front viewing angles, as we focus on front-view reconstruction in this work. However, our approach can easily scale to side- or back-view prediction given corresponding training images.
+
+ Loss functions. The training of cloth classification, pose estimation and implicit reconstruction exactly follows mainstream protocols. Hence, due to the page limit, we focus only on the feature line regression here, leaving the other details to the appendix.
+
+ Feature line regression. Our training goal is to minimize the average distance between the vertices on the obtained feature lines and the ground-truth annotations. Therefore, our loss function is a weighted sum of a distance metric (we use the Chamfer distance) and an edge length regularization loss [59], which helps to smooth the deformed feature lines. More details can be found in the appendix.
+
+ ![](_page_11_Figure_2.jpeg)
+
+ Fig. 5: Experimental results against other methods. Given an input image, results follow with (a) PSG (Point Set Generation) [21]; (b) 3D-R2N2 [16]; (c) AtlasNet [23] with 25 square patches; (d) AtlasNet [23] with a sphere template; (e) Pixel2Mesh [59]; (f) MVD [40] (multi-view depth generation); (g) TMN [46] (topology modification network); (h) MGN (Multi-Garment Network) [7]; (i) OccNet [42]; (j) Ours; (k) the ground-truth point clouds. The input images are shown on the top. A blank entry means the method fails to generate a result.
2005.09812/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
+ <mxfile host="www.draw.io" modified="2019-11-15T21:49:20.425Z" agent="Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.102 Safari/537.36 OPR/57.0.3098.76" etag="XT-EGBz__InXL2taSWQz" version="12.2.6" type="dropbox" pages="1"><diagram id="Vp9xubMKPnU8rZLMBq8i" name="Page-1">7V1tU6M6FP41ftQhCa8f1a06e/WOq3tf5n65g5C2WApdSq3ur98EkhZCXOsuDQS7zmzhEAKc8+S85RCO0Pn8+TLzF9ObNMTxETTC5yP06QhCGwDyPyW8lARko5IwyaKwJIEt4T76jhnRYNRVFOJlrWGepnEeLerEIE0SHOQ1mp9l6brebJzG9asu/AluEO4DP25S/4nCfFpSXehs6Vc4mkz5lYHtlUfmPm/MnmQ59cN0XSGh0RE6z9I0L7fmz+c4przjfCnPu3jl6ObGMpzku5zw93/2+d1q/Hnxxfv+1793afz1ND4GwCr7efLjFXtkdrv5C+dBlq6SENNujCN0tp5GOb5f+AE9uiZCJ7RpPo/JHiCb4yiOz9M4zYpzEQahhR1CX+ZZOsOVI57tIN8mR5oPwp7tCWc5fq6Q2INd4nSO8+yFNGFHkcOYzFFmsP31VmbAZrRpRV6I0XwGk8mm6y0nyQZj5rsYa7bM2BbYBNDbbEKetR82pX/eXs6+2w/2deBN7h6v1gBMjkHLTBLQR7DnhqYMfS58QHZL6AMeOLFqjPVk8HMV8hXtl69jN8BBIOPrg2uZltEOXwkbBb5aMr7aCvna9qAW+ToeQzlfQ3IjVmt4hQJfN4qhwli4L8bK9aXRtipog1HQrbFpY8tr+hKqZFP/mQRcCZMslbZXA8sLbRmSTIVMsgehydAO2HNVDlBnEIZXRKtE7218HCVsdffLVkV+4i5KwFAapRgSvtpxTjmUkuesMtj+tkr5geNlEaifkgbAXTxvD5KtCf1d8W7IbZU9lfR3927Iejde7V0/DbaL9UQqjaeekAADgoQNegYJmdd5gIRKyyFAQmo5lEJCljbsPySUGA5FkBBj084hIYso+g+J6+EYDtNDPTMcsmjoAAmFkZwICWkGSykkZJFc/yExG47hQJ7TL8NhySYr+g+JQQWh9g4TAmpBIUvi9h8UQwpDLa93oDhEHZ1HHaKmkEx1q8WEnmHHkJITDUXROSY8LTExICfTtNyeYcLWM4n5OJxYFELr7XIetZiAB0x0i4lmKVLnmNAzGh0SJhr+ROeY0DMYHRImGnFH55jQMxa9PlaTolCT3bZ651HoGY2qQoWaabAGKjpPXNl6ToQRA6Ioo6kmABGrKLqeDLP1nAwbGiwauYrOrYim+Ss1VkTRNCl3+vsyTcpfcdQMFHlPHIvyCH+HF7UEEtB4k6rrmhtHzzznsOISriv64mg4eia1hlSb53r9CkkcPXNaQ6rNc/vmdzp6Zi+GNG/q1UNU2awpUIgI/u7kz168w0l4SldaIXujuzQh1LPQX04LPoM6T+tOGGQtb/08x1lStIEGpT6u5gu2Fkzpu/lZLl6jIF5E9HkKcZLbqOwVPbB7TMoTUiLqKKeMLUZwQ45wRP9+Jkcc1haLaUqxIibZG8+cluHYz6MnXOtcJjt2hds0KsYHD1idegUnMgXxL9NVFmB21hYBb3Yk9kM4PMF5o58CSZun/g1wtZBCt2QKgfz6czqck4cl/blfYH+GM9p+V1VBxmsuwy7HC8NUVUkwkh9HE4rlgKCHXBOd0dEfBX58yg7MozCkl5EqoLqKordYQahR/GN0jm7JQkDvVzt1fxVwyVTw7EiWL4H7UjueLNYVJLSc+gu6Gayy+OUs84MZzt/W6lv+lns5GYhpoXrcQimUS
sc4sZHjIWAg03ShY7mOVHm1wHkLwbdcACjRJADsjfUtxI/SQXlOGtFBBY1RssTzB9IHXY8r470/ZLylZLhuD+o1gNtAiBBMcse9ugaO7KVxXsjQPkJ28An44IyjZCZKYTdbLjapm/fmWBSse99tOICCynV+0YZb6I2O9mzEvRZSC+804vDD6QBTzCd1bZ9bSB68U+jowwvd7ljowJAp/v6nB14vg6r3ztce5ecDKTx/0R3pfzoK7rLYmdIqfj2LJoY1C9q/dzv0rJkYUI4SNWq2u8YEAHpapgGBwmy83NE5KKAsibRfJ9X86E4qMrp2UqGeqkDJvKaaxdhNYYEJ6SrjajWBpuUPajxJRagQ6ul6gIrDbHfXn23on6Y4xJzdo6JDTSFd9F/mUQhcPdRAqK6B4MsN/nYNhNCR4hoIYOywtPUBXYrRZQKjHXSJHSlH1w7r0R/QpRhdyG0JXWJH+0OX1C7q6T8P/IWUev2oyjLzTxeXV5/nN9bF7A9gL7Jv9ujLzfEgPi9nIvFDVCp9UilfwQ5uQ1NbV1U5q9CZP0/ohxtPxnG6DqZEJZ/4ScLK4v6nDXnCMcZjOkwWVJvgbPRE2LnkQqHliKW6J/dVK6ujpXztIBvWowIg+xaYKZGA5+5LBM23NL82ZNCPRK8ooDbkYYgJ/03cVpGIKbGi/JOT7xAI2d1+KLM0X9uvjaLRDw==</diagram></mxfile>
2005.09812/paper_text/intro_method.md ADDED
@@ -0,0 +1,17 @@
+ # Introduction
+
+ Active speaker detection is a multi-modal task that relies on the careful integration of audiovisual information. It aims at identifying active speakers, among a set of possible candidates, by analyzing subtle facial motion patterns and carefully aligning them with their characteristic speech waveforms. Although it has a long history in computer vision [\[11\]](#page-10-0), and despite its many applications such as speaker diarization or video re-framing, detecting active speakers in the wild remains an open problem. Towards that goal, the recently released AVA Active-Speaker benchmark [\[31\]](#page-11-0) provides an adequate experimental framework to study the problem.
+
+ Recent approaches for active speaker detection [\[5,](#page-10-1) [39\]](#page-11-1) have focused on developing sophisticated 3D convolutional models that fuse local audiovisual patterns to estimate binary labels over short-term sequences. These methods perform well in scenarios with a single speaker, but they meet their limits when multiple speakers are present. We argue that this limitation stems from the insufficiency of audio cues to fully solve the problem and from the high ambiguity of visual cues when considered in isolation [\[31\]](#page-11-0).
+
+ ![](_page_0_Figure_8.jpeg)
+
+ <span id="page-0-0"></span>Figure 1. Active Speakers in Context. Our goal is to identify the active speaker at a reference time. Let us assume we only have access to a short audiovisual sample from a single speaker (a). By looking at the lips of the speaker, it is hard to tell if he is talking, but the audio indicates that someone is talking at that moment. We have no other option than to provide an educated guess. To increase our chances of a successful prediction, let us leverage multi-speaker context (b). We now observe all speakers in the scene over the long term. From this enriched observation, we can infer two things. First, Speaker B is not talking over the whole sequence; instead, he is listening to Speaker A. Second, looking at Speaker A (*e.g*. his lips) over the long term helps us to smooth out local uncertainties. We propose a new representation, the Active Speaker Context, which learns long-term relationships between multiple speakers to make accurate active speaker detections.
+
+ In a multi-speaker scenario, an appropriate disambiguation strategy would exploit rich, long-term contextual information extracted from each candidate speaker. Figure [1](#page-0-0) illustrates the challenges in active speaker detection when there is more than one candidate speaker. Intuitively, we can fuse information from multiple speakers to disambiguate single-speaker predictions. For instance, by analyzing a speaker over an extended period, we can smooth out wrong speech-activity predictions caused by short filler words. Likewise, observing multiple candidate speakers jointly enables us to understand conversational patterns, *e.g*. that a natural two-speaker conversation consists of an interleaved sequence of the speakers' utterances.
+
+ In this paper, we introduce the Active Speaker Context, a novel representation that models long-term interactions between multiple speakers in in-the-wild videos. Our method estimates active speaker scores by integrating audiovisual cues from every speaker present in a conversation (or scene). It leverages two-stream architectures [\[6,](#page-10-2) [9,](#page-10-3) [10\]](#page-10-4) to encode short-term audiovisual observations sampled from the speakers in the conversation, thus creating a rich context ensemble. Our experiments indicate that this context, by itself, helps improve accuracy in active speaker detection. Furthermore, we refine the computed context representation by learning pairwise relationships via self-attention [\[33\]](#page-11-2) and by modeling the temporal structure with a sequence-to-sequence model [\[17\]](#page-10-5). Our model not only improves the state of the art but also exhibits robust performance in challenging scenarios that contain multiple speakers in the scene.
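The pairwise-relationship refinement can be sketched as scaled dot-product self-attention over per-speaker clip embeddings. This is a plain-Python toy: the embeddings are made up, and the learned query/key/value projections of a real attention layer are omitted.

```python
import math

def softmax(xs):
    m = max(xs)                          # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(ctx):
    """ctx: list of per-speaker embedding vectors. Each output vector is an
    attention-weighted mix of all speakers' embeddings (queries = keys =
    values = the raw embeddings in this sketch)."""
    d = len(ctx[0])
    out = []
    for q in ctx:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in ctx]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, ctx)) for j in range(d)])
    return out

# Three candidate speakers' (toy) audiovisual embeddings.
context = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
refined = self_attention(context)        # each speaker now "sees" the others
```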
+
+ Contributions. In this work we design and validate a model that learns audiovisual relationships among multiple speakers. To this end, our work makes two contributions.[1](#page-1-0)
+
+ (1) We develop a model that learns non-local relationships between multiple speakers over long timespans (Section [3\)](#page-1-1). (2) We observe that this model improves the state-of-the-art in the AVA-ActiveSpeaker dataset by 1.6%, and that this improvement is a direct result of modeling long-term multi-speaker context (Section [4\)](#page-4-0).
2011.02048/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
+ <mxfile host="Electron" modified="2020-05-27T23:26:59.704Z" agent="5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/13.0.3 Chrome/80.0.3987.163 Electron/8.2.1 Safari/537.36" etag="AH4ZnFeBXn7NfKh9bfC8" version="13.0.3" type="device"><diagram id="wpJ7OWGR6X-xnTAnxLlr" name="Page-1">7V1rk5pIF/41fhyq78DHue5uVfJWKknVZj9toTLKBMVFJjOTX/82AgrdoC1yaVG3Kisttsy5P31Onx7h+8X7H6Gzmn8Opq4/QmD6PsIPI4QYRvzfeOAjGSA2TAZmoTdNhnID37zfbjoI0tFXb+quCzdGQeBH3qo4OAmWS3cSFcacMAzeirc9B37xV1fOzJUGvk0cXx7925tG82TUomA3/qfrzebZL0OQfrJwspvTKdZzZxq8JUObe/DjCN+HQRAl7xbv964f0y6jSzLRU8Wn2wcL3WWk8oWn35/eF+DP+Y8vL9hfLcd/TV7ADYQsmeeX47+mf3L6uNFHRoMweF1O3XgaOMJ3b3Mvcr+tnEn86RtnOh+bRws//fjZ8/37wA9Cfr0Mlm48FCyjJ2fh+bEMfPcWnJ8I/M994/9+DRbOkt+yjsLg55bCaDuSzTRCeMrGjLJ0ulRQYHw9C52px4kg/Gr6d7lh5L5XkgxuGcEF2A0WbhR+8FvSLxCMk6+kwgutlLlvO1GA1EzG5nkxwOmNTip+s+3cOw7xNymTjmAYOhd2PT0x/iphVwN8QaYCXzCV+ZLpSPN8wRIb3Ck3JOllEEbzYBYsHf9xN3o3eQ1/bfm04xrgV7svfAqCVXrLixtFHykpndcoKHLSffeiH7n3/8RTGTS9enhPZ95cfGQXS/63/9jdGF/+k/9s97XNVfY9idtg81ITnnKBiMm1Xxw4dYPXcOLu4QJJPYQTztxoz302K5ev0PWdyPtVfJAyWUm/+iXw+CPu7IVVlEtEiUGLkySPln5PkLntg9QXQ1JiHvgYWf87Mu+8GzgyH5KBPVYDKFiNulw+0RxbRbW3ZbUv03rSltLTGkrfmJ7vdHursZV6rsCvKlMAD5gC1x9vopqYsx6Pm0S7lncwsRd3XOt5cpTDmDrr+Xa6Bo0FUzQWjLRiLDApGgvMbEVjcRuGzkfutlV8w7r6l5BglnhYk/7SU9WzoQPf4G+Sp2jUeh0Vi6qYqaLsPdP4vzLZYywne0catvg6N1XyaifOIeZWRHImL4tpOjF5pswiBkb0PoY742fuZN5jF8OHEDA2L4mDnBxRkU2O7804ZR8mnCwuJ+NdZktu0w8W3nSaGE937f12xpupYv6nks/npXcj+hDPxe3lOuVOb46KB6QFvjFUxrctSChEqG0xzhqSrzrgc+r6Mn19ld23ryKCrwIt+SrR82DbPtJXSd9ox1fZTfuqGp6nlntryVdhqJ2vglBiyRnbvIuLzyHq2+h1FaBjJobb5OgAnXRh9DKWXCP0jA9UP6snr+2cr9UbeFgHac8mThBfTNsycQSIBgsdbeJQJyYOK9i4DhTqQLA3pa41JWVCbKEx3ghxWZLGCbNJ2UZNprdxjpRfTnxnvfYm3+deEmxMnzzf71HgoXmiwJ9mQ+nVzxX9nK2fn8uSs5op6s5L5X1UzmUpeqkzVXBTUcFzNRB9KHjJOuZFK7hY6qCDglsSj+LfAou1xKqLWFO2iwyyUQmDaJdLyrCnBZa2YQCbWO74udTA1q9d6K0mQfTcliAOiUlvrSIhkz6pJCGfNJqPzId/XxqpTDiayT3VMog5WQ3sLS6rHrkGVP0GVMr2I2FWXwFVNvE1oEq/QDUMqLBCgcVVwbtWcNU0R7Ie1niAwBgQAkvVNcDGQoRrVkGwHf
rV/SCFxZZBBmpMzGuX1JqXcQJLWtQcL2RgfCIv+kG0rFglRUulHHYq5RpUdTRBWVM7yuIyGHiGlLX0o2zjkXc/lLX1o2zjcUkvlMVAP8o2Hk30Q1moH2UV1nDOgbJIP8o2nhPuh7JYP8o2vu+jH8oS/SjbeJqzH8pS/Sg7DAyG9cNgeBgYDOuHwcgwMBjWD4ORYWAwrB8GI8PAYEQ/DEaGgcGIfhiMDAODEf0wGBkGBhMbWehA2WFgMKIfBiPDwGBEPwxGhoHBiH4YjAwDgxH9MBgdBgYj+mEwOgwMRvTDYHQYGIzqh8HoMDAY1Q+D0WFgMKofBqPDwGBUPwxGh4HBqH4YjA4Dg1H9MBgdBgaj+mEwOgwMRvXDYGwYGIzqh8HYMDAY1Q+DMRmDPS4nwdQNz5HAUul3Rrd8J2ezrIF7e1X1WQ8uDVq4H9NEqdCaqLqjUXHn00mcb6b5smqr9kRSZEnqZlsjk2Hkt+A5WjjvI8T8uOfAOCzIDPvvNT7QY0O6m6RpwC2/AbLV+4aA2ef83Sz+/20Ucdp5wfIs1VjYxIyhrMZWp1ZSxqb/vbrpxBfWMEIsR0QlG5c6bRfBZHi7eRtT4yIZJKyaYSoziHXKIPMsPWCuHwixix1BDEjwaP8m4s3VFzfkXN+I1Vk7VUvVqdq9OtWyNYPEm65XnEan+NPsuBMvayaSzMufM5k6u+v8nK3YYJkb7743BbOSJYrNaQAfnP7sIo26uI0NgZ69rikvdvz99a/vj5fIHCyd51aG681O2TOMbArTL7dqDiObwvTLrZrDyKYw/XKrphwZfX28fbhEWy3mZ3HpGTqd4iOzLCMTR5mr+HS9KKw+XK9zfomNag7wrwF+QSHwIbSMX3aX/LLk0AcYcsvKC2AOoirMIZ0yR04VAUNegr8E5ogdOjVgDiphjuzkL4A5ImTQgTly0eCFmjVx97IOzJETRsC4yIBA3PKoA3NkSAgM+yKZI55bxORlqm5ZI2PKC/U4Ypm6Dnojw1JgaLC82wNzKM2YoQ97bBnnfPMWr37kLF+D13VDlQoP7sRbbwoVGpnus/PTW84uUYbEvVPlq9AQlIgQa02EFAr35NbIRUZJeVKweZ2ygnQ4KypT+UCiJRs7sWsyYqIhYESqTas4WkFuwSysdZVM1XILZlsGfMlSVlSxohVz5pMzdv32lLVJ7ZwEy6U7idLfG21tcV6e9ihGpS7zGIWAohzcoEYEDBJoZO1o05mt4hzB8/PabUcaVE7WOtYaFA5XPNIabMtCRrmikG2JSEVZyHGlK4WSD7GBuDVxJ7vm1PmCD1U7pVC9UXHEYzf2jAhH0GK5C7yqNRNPs8ViuqSpU3OFdbMsVVP5XHb5c7V7TPjZnMZa+1ilw+p8xmdQqx5n19IBrYRULj8erZbCearbcsOmT3oVHxmZx570Kn6jJdWkvarmMa4sU01oAJjXTWDEq2w1ChxVaxqRmoJLoVqF/2xQM22mqJnoRMU8DVeVFaFsU8beCN2/VGeNw3mwGHP8frB+osE61BM72TABFCFShm2zxcdOaipshWqVylN/9gW0e04VOiHEhcfYhYPmquqIpCpnDw4ak5ZOQVJX515hv3hsH4a1/TEVp2Kdg/6yOuy0atpZxDYmrZ0Gm0rjM1oH8OMnunMmP2cbf51pwDJYuu2sElScVp5OfwMMi2BhmQA2IpA3EBomKcwMLcPubqmgqiZqIzJxLTin5VV0VJwnsQwAgE25Z4n/RUVHKh7bqrBEabDiHHZXQrFdu9Yf9aqg15rA+HxRbya8faHeEqhqgNxLmLDCTZY577JpjobWRyNi8azfbdCgjIjFb7SDiCE4Lg2UGkWlHFAjGKWuKqrrxuH4s2I/ekdpJ+FkSIZp7fUgIEwlh7Itx58QoMsWt96kyGpMiqSpepCiNrJVzQqS7gIh2QK5z0ptgZCnal0gVM6xvwrE/vKG5gRCdFk9CIS8yTLOFYLFWpKLC6g/QsKOwpzJ7muTMg
QqJ9MPOBJQwVo6RQwUw6bKA2gPEcNxKYHBSVtvYQZCRlaD34AglczWgywpNEDtOdjYppYMgK1CesmE5v4EE78QU8e6CxkygWFXrvJQxIxsj86poW08V9fSptAU9hra7jdC5CAX65bU9CAQWXuCnED8dD80iGs16EZYWnvQ6c4MCOW1iQvdcSZ2K9Zg3wyEDfjuWkkn1cwPm1ju+FmyzOgEs6uS/sF92mcm5j7ETS/KuyqEiZClGhwem+ARfwmDDiqLIZIxs7F56WVeGrAdTFie5AO92w5Uy3bIZWXV2C+P/HJAsCrbXKN+rKo67JhStzXXoewPSJ6/pQox1QKxitaWHYEPYY8BBfXhrVT6LU3VdnCJaqGNq4xrBnjEFibx1h5YUyjFvgElc7UtlVjh+I6rVGovlWaDUil2UOpDKuU15S+he7PbNa9LTEaaicmgiOdKjvAo3cNeIyTjl2EQdw/YcSx0VvPPwTQuHn38Pw==</diagram></mxfile>
2011.02048/main_diagram/main_diagram.pdf ADDED
Binary file (41.7 kB).
 
2011.02048/paper_text/intro_method.md ADDED
@@ -0,0 +1,53 @@
+ # Introduction
+
+ Simultaneous speech translation (SimulST) generates a translation from an input speech utterance before the end of the utterance has been heard. SimulST systems aim at generating translations with maximum quality and minimum latency, targeting applications such as video caption translations and real-time language interpretation. While great progress has recently been achieved on both end-to-end speech translation [\(Ansari et al.,](#page-4-0) [2020\)](#page-4-0) and simultaneous text translation (SimulMT) [\(Gris](#page-4-1)[som II et al.,](#page-4-1) [2014;](#page-4-1) [Gu et al.,](#page-4-2) [2017;](#page-4-2) [Luo et al.,](#page-4-3) [2017;](#page-4-3) [Lawson et al.,](#page-4-4) [2018;](#page-4-4) [Alinejad et al.,](#page-4-5) [2018;](#page-4-5) [Zheng](#page-5-0) [et al.,](#page-5-0) [2019b](#page-5-0)[,a;](#page-5-1) [Ma et al.,](#page-4-6) [2020;](#page-4-6) [Arivazhagan et al.,](#page-4-7) [2019,](#page-4-7) [2020\)](#page-4-8), little work has combined the two tasks together [\(Ren et al.,](#page-5-2) [2020\)](#page-5-2).
+
+ End-to-end SimulST models feature a smaller model size, greater inference speed and fewer compounding errors than their cascade counterparts, which perform streaming speech recognition followed by simultaneous machine translation. In addition, it has been demonstrated that end-to-end SimulST systems can achieve lower latency than cascade systems [\(Ren et al.,](#page-5-2) [2020\)](#page-5-2).
+
+ In this paper, we study how to adapt methods developed for SimulMT to end-to-end SimulST. To this end, we introduce the concept of a pre-decision module, which guides how to group encoder states into meaningful units prior to making a READ/WRITE decision. We provide a detailed analysis of the latency-quality trade-offs obtained when combining a fixed or flexible pre-decision module with a fixed or flexible policy. We also introduce a novel computation-aware latency metric, adapted from Average Lagging (AL) [\(Ma et al.,](#page-4-9) [2019\)](#page-4-9).
+
+ # Method
+
+ A SimulST model takes as input a sequence of acoustic features $X = [x_1, ..., x_{|X|}]$ extracted from speech samples every $T_s$ ms, and generates a sequence of text tokens $Y = [y_1, ..., y_{|Y|}]$ in a target language. Additionally, it is able to generate $y_i$ with only partial input $X_{1:n(y_i)} = [x_1, ..., x_{n(y_i)}]$, where $n(y_i) \le |X|$ is the number of frames needed to generate the $i$-th target token $y_i$. Note that $n$ is a monotonic function, i.e. $n(y_{i-1}) \le n(y_i)$.
+
+ A SimulST model is evaluated with respect to quality, using BLEU [\(Papineni et al.,](#page-4-10) [2002\)](#page-4-10), and latency. We introduce two latency evaluation methods for SimulST that are adapted from SimulMT. We first define two types of delays incurred when generating the word $y_i$: a computation-aware (CA) and a non computation-aware (NCA) delay. The CA delay of $y_i$, $d_{CA}(y_i)$, is defined as the time that elapses (speech duration) from the beginning of the process to the prediction of $y_i$, while the NCA delay of $y_i$, $d_{NCA}(y_i)$, is defined by $d_{NCA}(y_i) = T_s \cdot n(y_i)$. Note that $d_{NCA}$ is an ideal case of $d_{CA}$ in which the computation time of the model is ignored. Both delays are measured in milliseconds. Two types of latency measurements, $L_{CA}$ and $L_{NCA}$, are calculated accordingly as $L = C(D)$, where $C$ is a latency metric and $D = [d(y_1), ..., d(y_{|Y|})]$.
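As a minimal sketch of the NCA delay defined above (the function name and the list encoding of $n$ are ours, not from the fairseq implementation):

```python
# Minimal sketch of the non computation-aware delay d_NCA(y_i) = T_s * n(y_i).
# `n` is a list where n[i] is the number of frames read before emitting y_i;
# it must be monotonic, mirroring n(y_{i-1}) <= n(y_i) in the text.
def nca_delays(n, T_s=10.0):
    assert all(a <= b for a, b in zip(n, n[1:])), "n must be monotonic"
    return [T_s * n_i for n_i in n]

# Three tokens emitted after 30, 50 and 80 frames of 10 ms each:
print(nca_delays([30, 50, 80]))  # [300.0, 500.0, 800.0]
```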
+
+ <span id="page-0-0"></span><sup>1</sup>The code is available at [https://github.com/pytorch/fairseq](https://github.com/pytorch/fairseq)
+
+ To better evaluate the latency for SimulST, we introduce a modification to AL. We assume an oracle system that can perform perfect simultaneous translation for both latency and quality, while in [Ma et al.](#page-4-9) [\(2019\)](#page-4-9) the oracle is ideal only from the latency perspective. We evaluate the lagging based on time rather than steps. The modified AL metric is defined in [Eq. \(1\):](#page-1-0)
+
+ <span id="page-1-0"></span>
+ $$AL = \frac{1}{\tau(|\boldsymbol{X}|)} \sum_{i=1}^{\tau(|\boldsymbol{X}|)} \left[ d(y_i) - \frac{|\boldsymbol{X}|}{|\boldsymbol{Y}^*|} \cdot T_s \cdot (i-1) \right]$$
+ (1)
+
+ where $|\boldsymbol{Y}^*|$ is the length of the reference translation and $\tau(|\boldsymbol{X}|)$ is the index of the first target token generated when the model has read the full input. There are two benefits to this modification. The first is that latency is measured in time instead of steps, which makes it agnostic to preprocessing and segmentation. The second is that it is more robust and prevents an extremely low, trivial value when the prediction is significantly shorter than the reference.
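To make the modified AL concrete, here is a small sketch (names and argument conventions are illustrative; this is not the authors' evaluation code):

```python
# Sketch of the modified Average Lagging in Eq. (1).
# d:         per-token delays in ms (CA or NCA)
# n_X:       number of input frames |X|
# len_Y_ref: reference translation length |Y*|
# tau:       1-based index of the first token generated after the full input
def average_lagging(d, n_X, len_Y_ref, tau, T_s=10.0):
    oracle_rate = n_X / len_Y_ref * T_s  # ms of source per reference token
    # 0-based i here plays the role of (i - 1) in the 1-based sum of Eq. (1)
    lags = [d[i] - oracle_rate * i for i in range(tau)]
    return sum(lags) / tau

# Toy example: 3 tokens, 100 frames of 10 ms, reference length 5
print(average_lagging([400.0, 600.0, 800.0], n_X=100, len_Y_ref=5, tau=3))  # 400.0
```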
+
+ End-to-end ST models directly map a source speech utterance into a sequence of target tokens. We use the S-Transformer architecture proposed by [\(Di Gangi et al.,](#page-4-11) [2019b\)](#page-4-11), which achieves competitive performance on the MuST-C dataset [\(Di Gangi](#page-4-12) [et al.,](#page-4-12) [2019a\)](#page-4-12). In the encoder, a two-dimensional attention is applied after the CNN layers and a distance penalty is introduced to bias the attention towards short-range dependencies.
+
+ We investigate two types of simultaneous translation policies, flexible and fixed. In particular, we investigate monotonic multihead attention [\(Ma et al.,](#page-4-6) [2020\)](#page-4-6), an instance of a flexible policy, and the prefix-to-prefix model [\(Ma](#page-4-9) [et al.,](#page-4-9) [2019\)](#page-4-9), an instance of a fixed policy, designated by wait-k from now on.
+
+ Monotonic Multihead Attention (MMA) [\(Ma](#page-4-6) [et al.,](#page-4-6) [2020\)](#page-4-6) extends monotonic attention [\(Raf](#page-5-3)[fel et al.,](#page-5-3) [2017;](#page-5-3) [Arivazhagan et al.,](#page-4-7) [2019\)](#page-4-7) to Transformer-based models. Each head in each layer has an independent stepwise probability $p_{ij}$ for the $i$-th target and $j$-th source step, and uses a closed-form expected attention for training. A weighted average loss and a variance loss were proposed to control the behavior of the attention heads and thus the trade-off between quality and latency.
+
+ Wait-k [\(Ma et al.,](#page-4-9) [2019\)](#page-4-9) is a fixed policy that first waits for $k$ source tokens, and then alternates between reading and writing. Wait-k can be seen as a special case of Monotonic Infinite-Lookback Attention (MILk) [\(Arivazhagan et al.,](#page-4-7) [2019\)](#page-4-7) or MMA in which the stepwise probability is $p_{ij} = 0$ if $j - i < k$ and $p_{ij} = 1$ otherwise.
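The wait-k stepwise probability can be sketched directly (a toy illustration, not the fairseq implementation):

```python
# Wait-k as a stepwise probability: p_ij = 0 if j - i < k, else 1,
# where i is the target step and j the source step (both 1-based).
def wait_k_probability(i, j, k):
    return 0 if j - i < k else 1

# With k = 3, target step 1 may only write once source step 4 is reached.
assert wait_k_probability(1, 3, 3) == 0
assert wait_k_probability(1, 4, 3) == 1
```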
+
+ ![](_page_1_Figure_9.jpeg)
+
+ Figure 1: Simul-ST architecture with pre-decision module. Blue states in the figure indicate when the Simul-ST model triggers the simultaneous decision making process.
+
+ In SimulMT, READ or WRITE decisions are made at the token (word or BPE) level. However, with speech input, it is unclear when to make such decisions. For example, one could choose to read or write after each frame or after generating each encoder state. Meanwhile, a frame typically covers only 10ms of the input and an encoder state generally covers 40ms (assuming a subsampling factor of 4), whereas the average length of a word in our dataset is 270ms. Intuitively, a policy like wait-k will not have enough information to write a token after reading a frame or generating an encoder state. In principle, a flexible or model-based policy such as MMA should be able to handle granular input. Our analysis will show, however, that while MMA is more robust to the granularity of the input, it also performs poorly when the input is too fine-grained.
+
+ In order to overcome these issues, we introduce the notion of a pre-decision module, which groups frames or encoder states prior to making a decision. A pre-decision module generates a trigger probability $p_{tr}$ for each encoder state to indicate whether a simultaneous decision should be made. If $p_{tr} > 0.5$, the model triggers the simultaneous decision making process; otherwise it keeps reading new frames. We propose two types of pre-decision modules.
+
+ Fixed Pre-Decision A straightforward policy for a fixed pre-decision module is to trigger simultaneous decision making every fixed number of frames. Let $\Delta t$ be the time corresponding to this fixed number of frames, with $\Delta t$ a multiple of $T_s$, and let $r_e = \text{int}(|X|/|H|)$ be the ratio between the number of input frames and encoder states. $p_{tr}$ at encoder step $j$ is defined in [Eq. \(2\):](#page-2-0)
+
+ <span id="page-2-0"></span>
+ $$p_{tr}(j) = \begin{cases} 1 & \text{if } \text{mod}(j \cdot r_e \cdot T_s, \Delta t) = 0, \\ 0 & \text{Otherwise.} \end{cases}$$
+ (2)
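Eq. (2) can be sketched as follows, assuming integer millisecond values so the modulo test is exact (the names are ours):

```python
# Fixed pre-decision trigger of Eq. (2): fire whenever the source time
# covered by encoder step j, i.e. j * r_e * T_s, is a multiple of delta_t.
def fixed_trigger(j, r_e, T_s, delta_t):
    return 1 if (j * r_e * T_s) % delta_t == 0 else 0

# r_e = 4 (subsampling factor), T_s = 10 ms, trigger every 280 ms:
fires = [j for j in range(1, 22) if fixed_trigger(j, r_e=4, T_s=10, delta_t=280)]
print(fires)  # [7, 14, 21]
```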
+
+ Flexible Pre-Decision We use an oracle flexible pre-decision module that uses the source boundaries either at the word or at the phoneme level. Let $\mathbf{A}$ be the alignment between encoder states and source labels (words or phonemes). $\mathbf{A}(h_i)$ represents the token that $h_i$ aligns to. The trigger probability can then be defined in [Eq. \(3\):](#page-2-1)
+
+ <span id="page-2-1"></span>
+ $$p_{tr}(j) = \begin{cases} 0 & \text{if } \mathbf{A}(h_j) = \mathbf{A}(h_{j-1}) \\ 1 & \text{Otherwise.} \end{cases}$$
+ (3)
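Eq. (3) can be sketched as follows; the alignment is represented as a list of hypothetical source-label indices, and we trigger on the first state by convention since Eq. (3) leaves $j = 0$ undefined:

```python
# Oracle flexible pre-decision trigger of Eq. (3): fire whenever the
# alignment label of consecutive encoder states changes.
# `alignment[j]` stands in for A(h_j), the source word/phoneme index.
def flexible_triggers(alignment):
    return [0 if j > 0 and alignment[j] == alignment[j - 1] else 1
            for j in range(len(alignment))]

# Encoder states aligned to words 0,0,0,1,1,2 -> trigger at each boundary:
print(flexible_triggers([0, 0, 0, 1, 1, 2]))  # [1, 0, 0, 1, 0, 1]
```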
2102.04362/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
+ <mxfile host="www.draw.io" modified="2019-11-13T17:10:28.994Z" agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.87 Safari/537.36" version="12.2.4" etag="3r_zBHX37QpKTidlSG_3" type="onedrive" pages="1"><diagram id="EhTU76tvGheJokLREZDe">jLzXlqTKsi34Nfux70ATPKI1BFq89EBrrfn6hszcqs+9o7tWVWbg4Qg3N5s2p7mz/gHT3cnP0ViqQ5q1/4CA9PwHzPwDgiAARZ5fb8v12wJCIPHbUsxV+tf27warurPfRuyvcavSbPmvfuswtGs1/ndjMvR9lqz/1RbN83D8d7d8aP/7pmNU/N0Q+HeDlURt9j+6eVW6lr+tH/Q/egtZVZT/vDMI/H3TRf/s/NewlFE6HP9xVZj9B0zPw7D+fupOOmtf4/3TLL/ncf+Hb//1YHPWr/9/ToB+T9ijdvsbG8kw/7dH2qypkqb895Dr9c+RZ+ljiL/DYV7LoRj6qGX/3UrNw9an2Xt54Dn6dx9lGManEXwa62xdr79JjbZ1eJrKtWv/vs3OavX/43PwXup/oX9HzPl35Z+D658H/Tpf/n8e/MdZ7+G/T/s5+ud5bRRn7XdYqrUa+qcteb7O5ueLPZvX6plw5f/VIR7WdeieDlFbFf/bM8i/L9Z3uNSyzkPzLyeBn5Z86Nd/WvBvul6j/tf8LcM2J39N8J9/R3OR/U0p8j9n+d++8wRdNnTZM8qny1/EAf8LACH896w5a6O12v/7htFfQBT/OvVfV/sO1XOTf13q/yKAv1D9i12EwP77Gr9P+nfaf7reP6/0z45Dni/Zf/V5PvzH4/+76cdp//cODP8PB/4fPvvE2Ph+rLqfsP7/ntsfr6CipCl+nJke2mH+uRSc//z5P852tIy/cJNX5xsC1M8tyX+2Av9seT6n0Rr9AyZ/DyFu7It/QHTlUrp5ADJfDOTzR7OcknWK55P4HtLPf+8HSvW/UfL8RvimZQ3XRHr2JqyY6jxfzCoB3vIFI4jzuWz+/ONl/VYLNAJOx+34DDDsoFULaK1MOyV0VWPb2LKCuzODS/F8vnVO67rMU3Q7my060BICxg5Y1jAkdjBIJyrkaZnFabpm4aKFjnunDxvvLXnckkKj/rETFwrBri6KpXLPAXHgvQDHsPw4CvX3UCmEyqYU8A2nbK4hlNZGgUKk1/P6GN8HCav87fvaBjcvy2lqemSN0wEW0ApJTucdE4FFe3Q1U7s5UrJrTcIZWWribig6VqNvOzocgfmaJNI5p2aNSZktnoUYpcgxDGSyE5OojBU01SJIfWmVt1Em2vZ8cZHbzQkAdBbhXOHdtQ4Y2R9p+jxKv3Zb5SvAxW+gFpnWrOaOBPQuG1NWaFlSCOMLvp38nd2RbuoKO7bYN8/Tft6VTfRsPvN7mCEIFAE+zwDnsYTHgUVUabGNxLCLglC5yZRP9WLjnZzUkrcZoL8DTkBK6zWb1n2YlKZ8sVNTQ7c6RghgNyjboGIHNuR67NTCYKEuqZMlhlEh91LLAFHtyIlUm0XVhFPpDufIIAt5m6QpUmDboGQS32E6uiBDskg/KpmhekPqwVhYPVtQDqjSAUmR6IAud3GToAuQTqsH6r2h26EWWufSaRXCVHKsYWrKwUoupDHR+llP5lLriZgxCE2TlwpT5BMAe7ryWfI48GcHsKr2DiP1P8hw6XZL4g1yBFWA6O5VimSYkCjPkPUmncjw4xCU2jPEiSzkYwwxuouC1BM+rbRb7EWj7SiqMDsEDY8jJgoWPVCIs4JezL4LOtWdSSlVvCFqKopMYgewiOhGY2z2Vqi8pZ/sSSqVaZwA+5wCO3XCX5xehKq5xR3KWpt14NSt9ua9US
zYKbrdvUhM+T1JVupj4yzvaRJsrmogpfTTf91z5PUBp0hAFM/SQ2l7D+iAPja8o3geUQ8VSg+XTJfvWCy2mgPXESVfpnqGVut7r72hM31SRgFPiRbpI4QcXs/oBe/MgmdU1S6YBKvGAzWfnsn2/Lgn5kPoIPQixahJurjYJEaWkSMJXCkprCqAoJFFQgoVharr4maoks8gC+bxBnJriHeTF9Xfz6BwPaHFPr0rkx3975VMPS3VzJHhjdUNdNl8Rd6jbNEw+A0X0SJwBpUVDZXSRCPD2Vv3w6LQSRU4XJM1WYU1rMso1SWIxWT8xm4NxEvJoRNTO0VZNcGEheKt3mqo4kN8hmXUoYXSEvA3+CoYS8iaAiAq/vXRTwYKIGSn3S2iG7LYfaIutdwA44KwpJ6RJXX92O0Zwvd+gSarr5nVh2teoYOCulDrUiMQA9kIBCOB3Vu7tee4QJ+YOrQKFDj0Vkv6eM60rbDKhAYNK8uByGMAj3f6LK1TD5a4JME4JKMiYbJURYaSSZ611ZM+ZDKjnpk5EDbQJIPFWYoOKl5qbjWopVYzay7oWEGXtmdu2PCBVLbjjzTceIrThnSYWFa0Uz14MEqhDk7Pi3t9HILqtgdk+XlmLnTdDCVho+BNSG9EfPLFUYuPksBGw0qDTZoMVdLMdlKiTSLgMBF1nus5AMP3gzkchBWGJfnSBbSlWEmdBAc0HbPIzT7RC5YQK/Kc0RsPrgeBKd6iluguTJJGY68JGNSJCnAl1avVpqoq3YewiteMncBlNNDxNK5d8cwg6U2O2vkgLkdT1Aab47uwi3uJH0CqON4iHALqk8iE7lSKecRsTKU9gZ281tOHoVtoWrw2HlZ8PiKL5RwcQ3ys4DnWehhI4TglPvkLr17S5u/R+uQ5KnNAnGqGHDsbItVjPItUbKLGKpoAXbul69KBTGoib8KGoXmMoDk4BxJEhnd0SQ+0Uomqhlq2uQWk6nMFsx0GBTaQBmckLUosHESC36pXgW42H1QSTVrOATUyy7BBIqZ3qdDRVO/JkMVtUqFwGU6x0cDPeAFxYMcqJLPRe7MjETEwOIrfpuZvA+Wjbm5be4aCEGURNm0zbkIXShWyx3Her0RW5/axJmvRZrHyapqlsAqSoSp6exDLuwMyENRgOVWTjEiy5UU1BVtiUaueA6jr1OcD+bb5RnK8vbba488SRYVFImrdhTfyyaMv83qsF9VfGIR3l7e+3Mu4ppCd0ykwujjh+sNYOH4uL9bQy2gLi7l2LESf2ZMXO5K7wpoML5Yt1J2G8Cf9qkAt3tblW13ZqYLaS7B4iLS9SFovCgFtHC2ckqyMGoo+TdNCU4xWJbh0yYbKTlRDHwJ5iXcxLzF7sKH/PE2xkc7FQaJJwqEuSFylP75TfKf+ERFHU9jd4d6F+VU76SDLmyIV2qQl0qBZY2vDFPvkCLoxDswTOdrP4M+QOVzys6JqP5futnpBmwaF1Fu6wnB/X+b9wSJi2TIq7qSaZNDU3b4AeTRH6SuUY9c6czy502bXU4EthzYc65O+ZmsXP6lVOIV1hr/1zynBCQTlyCTt+c1Aknmju6JOkqdo8UaSnYBXcoEdBilU5xZOW7ygejiN09ihkeW7Y8oJVukWCdNBX8wCFmxWEZbewjIuW7Yo55LmnC/Uc94dDjklPHnySeNn6KWQdMrOM68Lx4pkIJGD9kw0lbDX176Y9cHnhoD7FxxBDl5ftkbwo6QadztAcsWKKhm/uWRA5XV/7R8ZDpeCjPwlKbqQjLRBWUqjKUqTELYiRWtrcX3eSEnFAx3hJ267Zn7exrS3Ww1OK7GRCkmdG3tCm3NXA2fcfvKmCnO2UmJpQ/Wpu6YS3YTGl2M6Wx7uX/IXNHXQp7D9MBqZsCU4Y/niYZtU/CLiA2dkCClAFsBSIc6Ts4q22iPXILMvLyl3FYq1eB2N1TtbJaDaJ+2SCEyCTlYMiMFQl/g8XCRnM/HIHaokj1J+OUYHvrZa84hT/J
Rob3d5nmQTHmBivs8nZUXRD+FkPU/4Ivc4AU/WMXQp8LkcvNQWXRDhYh3dGjT5CeoqNpaqduhTYHEvKJmdrPp9XUXAnskKfediG9ZuYpap5OCdpMNw3WTnDK3x+UlhSVGdd0plBkregJO9VflpcrS2te7pUFHUx454ppObvLdRnyf5yX2U2BxOmb5mGmx2QluSvNMgCp1mZE1dVCXXv1B5ygQytB2U2uErUQt6Cr0+IOUXDcrp0Q+69YTrtmSKGOI20WJLd5bJ/lV+3GW348co1Aid4yWQBy6DsNC/WbcPdm8TbiB1yft20M5R4fAr3NF3oDmIPlevBC6fZa3zOCvXp1Ga68kj8GUviKhKbEM47Au8OlRuoEebUyqDjMyWjq06YROIhRJaj7uz4o3NVwsHLcyAFRKEJr+Ald24bNeJXWfNPUX89jDyVLEX+DzvcdGKphLHOLDYBwH5QEhqnaQTKxE6+uGDR4Zs8Tq7BRrCHqsbcFB5lEWzKBsdKfDQ1NQ88V/UKFbI1WQeyHW6gNx479Vspm5rvaMVSb/15yZwBH+IGEX0QFkxPPX5BVig9+EXdIZCFYiqSEbGbijaEuShCsiiyvpIe6CewON8Tx4lIz0wfWeGQomCWJU6xGIYohdGem3KomyPamGlPlDrjqGj0WnNxjCrk7V1qSjmL0Oz0hI1CEfV1OWcYmq/FSDqviTvdWwkS68QlFwnzZjTW7DBSsoqu2AgEmo0fTFBbvJcBvAhmdTBeHIPuSlIrL7aQ78pIxvVWIvCJuM3ahf39E2+6DhKV1n0aBScol2FelprjDdvVLku1e0h7Ztm4u/WxQxK6H4J2jQ92wORZXsO62/++cIc6LP6L5/Jch3oORy0C7K9WTFkQOMdwUjyU5H2aQGlddmY1e3o+aee29ARHd5fUDaxdZNFdA1U4Fm1l0Rwew0tcPagO3V2apCWGvTRZEfCsGYzRRYtmkbsJyppgJon0mx4OcMSnvp6a6kLgIlXCNiQYoK+vWww2sWWKdTYkfQh6M525ecXpc4PuvZl0PdocXwgHEYCL3hGQw+w9IQrsor8pmyWYVtzEUDFXIiMjqv2543T0EFEk1IHVr1SunOEFFZkrAjKRDDCvmVxtangTO86/YSPjzucCPFm5hQG6wecCex1SUj5vn7VJ53mnBdFsrRUgZDxjIQ/2WY0EiIlHzxYGt+5R803ejgn5NoX1ANrYz+qnfESO+o2cJ6x9bEMncKgw1vtgsehmnBD2bFwFH5qKJYW4QQW2YtiW81/NBEsjg+9MKH2EUrujECtVkvnqqeW+TXCL1SB0cMAuZDxtqiTesh/M0rbSsux2xG82VN7Efn+7YEnYoDrs5uyVByTpT7KCJSr7ooa1J3P9AWhqvIGzujUsngcP11a8iAXwdGUHeeG8yanCwg3P05K8j70KkunJf+i+C6c9JfL/YfrUekO1sGvaxGvtsr2z+k8ch3hHD+5RUhsBj0gK9bmwbahW1kC9XvanHUEws8nXt3zc1YtbGUVOcsetRrS3VT6oKKs2FmTzTMx3FytQ/ZMnzBVO/ak2CHdSaJ9J6+2c7LCNx3iWhlBG4S+s4aDcoeHDFnZVZG3kQHaSFZNdncy1ajyNyUFJ0vKUvZACTzDaLiDYPAVYLieP0OS1xXEjaV4mSF5SY8Q7Xkq9BKAkLKELDTQgMco5CEqMriiF0s2ZVcjoQqZHQyDpnI+mYybkFRbPcIkn9kIFj4BziKrPWTzdqLZV6vHLajMihz4hJewxxWhtq5F1ctYd0Bm2mdUemU3zArT74YQ+lbx4cP45ILdkVQqAc1Xvt46LJqfKlBypCNN3eW8TWN4u4OKbU+bOdRzYt0wED+s4c59acC/HLI+aWZl7Q2vgJENDKrLFcv0nAXuIFCPEltMPqnUsUFnXCczGQu1kGnrKvRc6NAD+d1POJzRN395zzh9uWFxx03Nyxqb5WnCt8mZE2pOJNodEUosaffQbT4DHWgF0VsWXw
7QJaSONmY3ICyPvQow4RaSYV3bsk/jOf7u3ALvQv18ZHAczf6yQhJd4RV2n3o5WNacHnylCrvtbi+wBVlcIOoKqR43X/bguW8aIIQuGNUvcS7wk0aMVArWfWxcyAPjXCwYGXGMB1ISxqdga8Ainr07inNkshS0u0MaRgiCxdrdT25vneVZrXhLIy0ilWor0PUQj88BTifHctntprTBcoV04SPwcEfNIY3ryWydvVEN6uf9h8a1+HwrhR+4TvUXTLCDdnHfDR5RIUeZb55kFKhUJ25k6XA6y+hXqwEiPcmN0HYBDZqMKfq5mpCapPtKcNakMpzyZ50/zZHvLyto498s+NZsclhFgyf4BEfZouVxOzjzEsvggOGJZOH0+guPACKg9awrstTLv+0r1/SNkCJmH9hBI0N3WN3LMrdwidRDFSlSDNXkK8Ii8+BsKd6OR29FEvDpnUJql4BKN4ZRnuVCbZrUHMGN11Crq9lGPKcSZfpBWnhGrzqmr8qGxnwasJeTXOq4llQo71ZJqpfK8FJv97gEcHqc6iXFQPY7+TqS4+H1GpAQMCz9SpOMz+SjsJQQ0nnnLkWB0sJH/ygcC055Pg/mmijiR03c6GYZUSS7qDG5xSgZnAMQPzAN5pzGIP/TKa+PYcTvQS4wa+pQDGO6NHl/s6xaU3TUjTaeXO/bQJG9lfB13vPT9/XnWqPKi1uTfs/gd2qdcBRIs3BMZVdHFk2hOuyCixVo+8E2FfVspqCokudYXA+2hOapKW5KmjqeIDDjQEb6nRnzJgu2wvL5snZ7ZvgKFW9bJgXxgXq6qfjEggxftgLGFc86ozg0xsCXeFkMHF1o/Sx2CRSO4uq7Kf+ODHifa3eR9OX13/hvnLme7291A94x1pCQWzXXMxmQxSy4RiQjH2iCama1qJ89jiaL0qJgbZeZjcrcS+4spQG3kXQMa+iOTXstCIfIJ4uVw2qEtWE9+WolsbsUS4pXbKZtMqzv/iiATLlxAiG838dAS1jPHcx/VSeG5bPCG0tIGlMdFow/j+xnUYZuUGXHqRqqVXdbpNHWTL1oTvGkpu2WnQglGTYxjhR8lxf3ynTbLGCzOixKNhl3VgGJy5WTaWKQK23xm4LnYUIsmOg3dwX6HVYukUHR9YE8y19wEfsS6JxGUVWHyWMeBn8ZiP+6Y/nN5241y8x/cot2sJ25oEu9Yd08YQq5SoFymyStgHkKG5nPnMheAiAknuGZEiKdUVoJoQdC36xUBMUHM38tcCZq/t0mekIL8GjdxlMxuZ8LSdKv+PAoz4XZq/efyIRv/CP/noPtuj359Ks/qKSfEOLrlfJWFIWpx5dBkYyYuxsDJpDbBbKzvPN+FN1gqYEbjOVDrJHj0ZTqlF8jw57ceWbxyx+Ve19QzuCMlmULUbjb0MiHlXZDw8gsw6F58koN2fR6ugUWqin97dEwz3k+nLnRgVdo2rmfF7+NglIbA+Bf95OwyMmf5PpjwDeBaKcV7W0H+Y92uEh2NxQpTq/SvuQEHU5alp6nDXT7NpInwWzKndKRc9GS/UiuC84ffncZfNRRmzyPs/l5q5ChmvfB2qoRxLiCW1BD7UGObojKvbVkV2PecfVYvpz4Xr+PUbccHMOh0iTG/oiPXx1MZSOwuAOrOFtQVMwiQ/pLBVfdww8StbSQtFmcw2VPJtFDqVRJn/W7OeNPV6oDZulZ+WmMEGSGJ0ivkrYTiGwpsgJDz+hunbEYUAinjiWjmU1+ZLdWmUPZL9lnn08+IaAQz3ZX+0axsXeOxlj8yB6dppGwPvcPjXMrfPkhopjvox+0jAKatld/rUHvkLNWGXT2iPkpGPcFH78PG9M455MG9l2wAjxXZEPTMmVitaiJSpR8lV9HWlOAWDURZPhPN0tzhevEoNL2NYapPpKu4wa3+2DUW0n4BP5XcDECiKwNR7/nFX5BBPB01bB4Nb4PB0I76NNwJ0MLFfVNHolnVYLuHDwjtxMrP3S2F5e9dRTWZk
MLaOgvLW5rRn1Al3NUEVGLCh8jK42AcZORea6m42QLSmmANEJaV1Wh0YGMOjk6umeSPvzs9p3X4TtNyw6TCI3Hsq+PHyK+57W5l4Ly2P72mfuvHGX3BEE8Ltpz6N4m9RfwEyyiP7j4eIVzkuXCUMF0DZzw9SFkEVewJ7aMoa56glXzy9Si6Dw8L0J/ltaimgEQHGgxUi/srzlEpY3Ot72ATFr4AksVDVvgHRFtu8BdqF7hRzC/Ag8ZiFeCHZbs06WgFG/KRgNZP7A7wcmzNx5MeuSBaz+OaavSo7DUfkgUvcXTr83t21mxC7NMktGq84I3lv5RQfbwJNsrIsRiOlMVWQI92hCvWjFazYTSydKFbcxJvyUhfDDm+9WUL+808gA7kPCdszSyZEnmyri9rvAV1rmKna7yzac/w73Yh8h/gG1uMapBF5heONO4Epm+a2zQTfXBIFuhx6rKj75cC+ILYmvbI6kzUnH7nHkE1EmZ9PwF+viGUWIF2ltirI3ESrZTZMvIwMzStohMtMIy2Mp31uzboVgAxMNZAXeKL9rMSogtKSwODs4yuY8/rB9ZOgI+6UYaLu3G1EdxSmabGSmJIWfvtQh6qFcOs6xCzZ2yVsj6PCAmSCkFSXqgZQsplYzuQjW0dJwkYhWrMt1Jd6iXx8sWYN76psxX3ejM7b4FpFjYCFSYzZXM5AAMK9keCs6hAsr0ZJrjRCehPli3oVq+vyYkmg1/ixQ1+FcK/q38nYJxTYNKqY9rMY2t+vU8yKL7uoT+tzA7w2+SYJtQoynvUfTnPt6FOW/+E3zc/hI/EGnjS/fJReJxEXcOqWIoqQG9xTUeaW++CWwCw0yPV3R4tIZlY6GrmSKpyIhKzuxZoCeDUGRIqlUCh2ACR9fRdHHzSNaLewTLNPAhXUxlQu8go9iEOLyGKP1mBaDxhLyYadRDKh5BS78coOVEo0Rpgf9Eh1zznFiRsshCTqsCa7pDtVsPBOTjBQRBi5zqATbK6HfC3Q3sgs01wSqkUZlOgQzUlIbnyVdP6gVMLK8PYvTq59vUopH0NcDQXY5bQ/WRi1DuKlIoZH/AnNMEm9hBnEPzHsPtzSYhrwl1fLHJ5RUu0C70xNFFs9j6pA09Wril/G6StwOvZfKgalXIFZk+sjyHna3Yne4t81KYRJN0gS2FyaOxGXpx2Lp3WBSM2ZkGZaSiSuKsQ6d9EXYxe38ThiYlkzQrsWom1s/nt8SZPZLmBLSvRUy9aqj8aqlQVomcIdTP7BTnzokKk8xFCn6KC5o56H3gTbhiRaPSvJbraNmF2xEyPB5TAR7xNW5nyyqq6l7myqCLUUeedBE3D9iqq2mQwnG7jE+8Ls/hYBYOKc7Jsr2gH7nLXus6a/PRxI51EFrxpXw/f+ow6td+9wY8dGwfhRq+cwlX56iVCAnKv19ISc8gkWKFFVnjSI20J6LBSu9KcCy3W9cZuAnk3YegSbN5cwj8pugnCerjQYLh111I52A/F+dr33gdpYLGFJfpPEJWYJqlXA4B2E91LTUtzkkdBdszfVPAlu2WH/D1SaGXx3LEkmfQqWrBA/zKFnet+egnN5Eb0UXczhWnxWygOR5xXMPu2HcPYt/9aNbTmwK0ueDKfdfxzwe4q2kstY0fMrx7cvdyachgV/fj9yTAkAIx3S2xodEbsxH4o1if0YDbnv/x/+dXN11A2t6HhpIWBcWnyEWYpVLBd26CcM6713fHOIfhvfUWnhG3cu57EJzf+MXmd8vDzpgbtjCbkVZBoItlxewvim4mqPitiHsPFT6vV6As7vw5Ggg7jal/F6MdRaDujmQzZxO4IDNEk9QeEQVoZaBuDf6M6VCmbtcNmL5iJhjlR0VkLFKGxeRTnXuLN3ebad5H7YGNNtSliIWwDVkxNVdVjWf1PGsFlnV8qTqtXN6e0Sz+7LqG7S+Vm9TjYXZBKs5j7EUrxjfmxnYwqyaUXVb2V9TkjKK13rEsA1DDwLRQj620Pg2Suo
C1TXktYIfCa0rlBdl4dQSTZMnMObhYDJO1a5fzJlrvcuGXh/8g8f2eBcUugtfZyguX97pqcE70W8G0XWKjYiyXqNDEJmPd5MNXYYoNqpKlXzB1ReLUZ2Um/gqhPgAQZGdAHs5xFos7zNsa3kFNAKVTmXakUIZUeZlzNi8yu/5FVXcog7GKLTdZqw8xNY3S0AY1jrwIUeHOONAcTlOIsuK87llORgP+SUeBGb0Vyf0mcuUZGUt2k999YE5qbh6E8YkYPX9ubfOztmiOo0eV4WKc5FmRsWKoj2OhfpP9dUMr60BJcsKyHJtuuYS+VZwc5plQi7c4JsIXM/O+bn/zddSCuXI3UCre1dFyWW+O0Z36EIZrYvSxX4C53654/9v/NS6S2lB1ZcSmM5BfTJSgg3AOd+PPHDx/ozPje50c7KaXvNK6s4lq/KGnxzcucRny/bWisJE8oQAoSdaFXbncx4u0LkElOyW96WQEIhaNN2M6N/R94qNiWRYSK2kYuZb1jDjRKIoXPbGWvYfdanNzRxrwBsztkQ3kt+wlwPIT4s24icGWAWBqRlFWIx1G7FkkvCvBwwoT4ZuVrlw7Yg9fwm3gp8ZehjEaPOkzSxPQHHU9A6eKoJEfO6nthM0V+0cQUt0SSpfU0ohHrwArT/hEAgj7mufzZhkCzFfYITL9S+nBRQtN1PfiM5/HR7n2N6ZfgGh+bdrmv3Z7gWBQ/Xt7OVP13YTX+GknR0I6PCJvSSJNRxJV6Uf/kTpp1Tl5L02T2Eizrl3vBQlzBrFEW70ltbFpkgezKH6G+nxHh2zIR4f1IIIEmKWywTixD1SLdV/LoErWDwPXpwaWVGlVFZICKcXD68s8/ST3pg6KtqhjQ3aSvOqya+HIycyszhBSlXcsb3yYZlQf//FfYB+L16xPovn4BZYflwNZpCLmUssrovEhXMHLtLRoRY6Vsnnp0vDUygFRwpc8GZVeWdu7lScORZMKWcMRyrxd7TgSXjsBb64iuh+8nTEuF50IIcUo0dBsPqGcsN5bA+OLBOcLQjDoEfm7m5Jb+zv359cU3YADB67FrkEFRrXACTw8TKcqCMIe+qSNC4kdaT8CdVaBf9aSZp/JukKYHPFGJ2knaDlVgwQzjdZFWAUhdVvyYkjMhj50goBGL3iSAHm4VYal6CcNYt+hH59hVb1w0RVGi9d+i+2b8n1mNDnQKga0mvgN6pW5IUtRb65ieUTbNmHAe3/4DR30E80gSvPri+wpzewPsbiYiBMldRwHEB0wF/Xalv54ruaCLAgG0Ik/qQ5IyDGk9Sm/39K7P565/ZhclE0SE4CUHG/Zh+TX9Cj0W1CjtPwJ6INRoiKGg9Tdz357veutAXyOYfnthJ/rtrltv14fueGXCws3yDJkY+KZJZ4TVRRS2fO+TQqsE2Iaphsr7QtwLwyH2fedm10mk9tRzUYIRtLLFlT2bKe6TTR4RAlPGUZJhhPp8CHrGkMhW2pEMnRdkVAiKKELn0sCZeShmcB1hvSdzqBn4tpnXZUMawU3O0Y6nMdknCdLLwxGGgBk5WENVvbsbFIcRPF8vbiwA91G1CwfI4QT5sVDbCEYFeOx1Zlg/mqFycnf4V4/ACgV2KlfTg1ZLF2KKTxlypjXCAy8Ye8CZiJoZoGve9Ym3xLJkNxIvvDuOQ3youby2ph4Y3tKL1aD86E3c2Fcq+nCeGNoRIQd1aJWmY2hyitnjeWUVCjpCejwbBZ6PPvNZOoXHkGI0M5VgqdoDI45VhTETsZbJENcJLU7ApFSp1XgaAqWTbqE5AifOhMzcCCqKeqTO8y8eYXCvSnk2YVdGL5l8XS6tJjBee9iVRsSa81mO3Wj9ivjbNgA8E1N7JfqEvh6ECBaZk6nY05QPsAbAgDIG2A6ThM0rwmrycI10s3XRnP3E2Iohk29WV0o8zFZkDUqYUQ+H7NKR2nSFKv7GC/TfGLLjPmSnOiKh/FHfQF1ehHx+2yplm1wMezfn5qE/VM1gr
1JzICkrOZYutNTLcW9Alb3YTiS1f/su9XRBHNKmtqyyWiDijvterM351ZOPN/r/mvAj38v9uNphVDiVK8eiTyVuHxYAYkXLpP26i19kIbMvC3x+HPOo9xKoMaQRZ69wluLk8oBA9h8/NZJQn/s7VNjTPChKeAEOI4L5qqicBW5nsOXxGfQJtGH1aTgQGqxiVWEvGuhnI1TRxbX2X6VRfsY9Vqn0uNO1OWqCuiVe7RDJCsWIobiohMQiFqwteKG2xP2CByZGy1romvYRQc62acNEPjrTdYWorw+S+1LnToOm2oh2x5nF3goQwa3p4YWAG42ZlmmsO9emMtuYIMlbgeBAHAaRYbQJ8kHhCfRkuXFfvdkwH2PgtFqH9YY8mwim2kb7CzHHTsx0mbASp9uGjQFN5Dikqm2sBdRpGgpkBbomMOEQTtUNpIiQfVA7ahvcD++6O4u5q65LY8f1L2t0G3OYCUGfFMoxbIcnpnvMJLOb4pg3ZQB6UgrjLzAnYhgpXX0KSvv5DvVy1J/XVoD+mkrB8kWm6ArILL6emHkknQpe7rStHZZvFEaS/KdlBPA4/eumL0mrTisV0aelPP1qMWXRXIBB31t59an9Fwgfq08/Ylsu2nJqOkx9OKB2cVwsgqLn6Xy/V2Aj763NFIt2jBS/bCCljee89+dgYlwfgDZeLcOU/Q0NnLAnqfOkKLNfnQRfsisxidGYvYfkvEC7ninrB1ovqtXHHAqsZALpDnPuEUzTB35+rmMxLC3BD4ShjruT6hpITIu2BoJu/RMmoiVDQfuzoBrBw5JqIDBGzOPQgMK1KmFrt0aAoYxvQ/I3ah/9/LYO0QPb/khh+2VPH4ok/GPGg2IKq86xu8PJwTwFMPpUQm3D8+SnpTMbyhygfhTO/bq+cfTxDD8wBlhQ5AYO9vyJFneFKiyRvoEqMqhraonG3Z4umvUSZgDQ2bZy7lN+6/cON6fUaYKfwglptEclMfwpR4x/OsJF+MsdxXRa1tSqkpD/vegKgmWaerDFLMeHhWVS9wD3wqnsKdH7ippmB3e2LW4A11zioYrRW6V1uJb3qmGZK50S4sjrRbIYYt2DR6S8dHyL30RtwXtS3wRPigTwS27LwfUd9fuo6DCpUuVsHxWC6L0Zuaxsoso9Imj+Pj80VBd3tjORmkPKMCSiMdotEoCaRcHnrENwWexH26tj5madrrF3XP3ld4yrpvXV45i5ty1doYUVesgQSQ5fenEVSyQLaYne67fyW4hUL134m43fL7Pe3gy9Zt8Qgh2zZPTWkvoR0uKoibXYRnJtt23myBJs7mmF49k7UevrhqHXtDo17P78i2y4QYzUibvHjGUI527lSSSGSLquk0l1sNOGGTm9LUTEGkRdmiHS6HCTmAxstYPW0jmrDqJU/O48iSAVzbFJzzfG1cD4NL+FNSTQK+v8LiB7zWyWpEMZyUFDPwKM04vhxUURex2nDF0VUAlPw7GaJBR1EWh2lRy8/zkh3IS5PTCqm0ag9Be72PxHUHLqvpw1aa4tOfmXWVG9VA3adNJvzFriuTh2lfSK9FbzaYmAsve31ZQgsDHgmH85VQ4gYcbLwWqniUlyLZRbWR1+GiEfKf7EFq6sZu2/sxQ1rmPHRvXzlVW86VJAUz10sPW+WMMWNJRlUCqD3KMsepka4pOKPLoODrLvIh3YZKq5D4sbjsKH+Z9SBpZGYZJizmjz1/J5VfrO2/weutjecnd5A5LVpzvckniLo0Q7+9WL9F2HVB0P8sWzCA+GBB59P2hxCN3g66kqbIgXHfE6MiUjPYlTVfkR7qplj13Ubzr9XE6Q7wVu7fOK3JYKnCKGR4hr1uYBO5BXVz1WE5hJTbsSN7Nrq49P59GjrdnXkHBBBz0Z8PgvmMbiySTuCWLOKnmEYCoWr71y3fur0+VXpDnRGbk7dXgnYV4gPyZ5oXJmh1TKdfBUYbosB9KHhj3MAJYSaeTJcbo0jqWK29rHCKuTKCDgcYTrJ
GEz+gOaqB4ktWoHEB11mTEam0lUqEn4aY3QsAornV7dpuuvmo40XzrJaaaLh9fUbViL1mVOrC3Fa1U3Ddovwk6rBpcMQYvcImPXU65+1PMeC7xlfr5puzrEZQP+teh/tH6fpIcywpXL1HOn1cD9lbz+/GL3Ss6+F0GBnY7qMXnPFkq4ZtifLUy6rGlFwRt+QttLIOL1sd7pXSWzwsmoDCOVs+Pv3XlCrbIxvBIMmKsKadq9tjAR3iuwppHEqWwXflwVNNFpzHRgJWQFe6R9jd9GLTH8LbIkGVMcfAwhtNzMcZgeJHW3WZ6V6BJhEvoyucGndPkC1CV0KcckUsddyG5KyIVBwO6BE9QXvVwrwDTc8uw5QYSofnUBsq+6xKxMjp1Erszh+VzNHVTl9UVkCnEq7mvdx2ZcpTXCb7c0IXBaPn+vL51uJhBgXhXtHeYSnbqWb0ctA8qLYduGNN8YEJncQZB9Ole4XSyZJ3I+kval/diM8+Gql5AI9QCpVmKbu2aJS9VSqavnlRl64DBPozEUP/Fe/yT7vCcePeP6gv5VwceWJ3+SJWlry+8bwASDW21G1sUf7V3C3w2Od2d2uP7wnp3tDoJJQQDC17qzDEL/i40NKz2eURxIQIPnLbAKk4GOQfmQVrdsDmF2ovbox3DbWYtCmNJopKAhz2dCbMs/IP7Ge7GUDSuChFaFNKqHwZtH6HIx/6a6Cfqz9J0RJKADWbDBe9eqVlbDKNjYgWJbBADU+HCv9q4wHPor1cTxOORwOZhH4OQwakO/tZlKI7Pn6gRkk+DvdsIqp+pMMMktdi3B/fbK4aJctRo2Ev6q4UfOZ9lVVrSrfVxL+eesOGec8w4pa6IfM2Ek/ynwLzWBIH8XqCvw+AG6zDLs9yD9/ldY/a/l/LKw7q3ze8kT0N8i2LRHYeUQvjuzwAGdTbIy8MUcx4Yd9SVhgzTaecyfYyuE2URNnRq0qvvbellO0EyXRgiKapSyG7QctIQ3aae3I99xyy+QPSe16ygFG72y6kWIVi4zzVLw00kmTHtVliOLiynny0P4jfDRW7A1t4ErbyKYaFkfCZj02Rw24FP4GTC8PEr3E4m8ou/r63kcfOuWb+lLOV14Fy7Fzw8P7kwX4iTJBPRm594hXvMdvwbRVe++dFlZ/ZW4znt104VKM/uqr3rZVwyZEz0ILSHcK3qGid7C0kbNYNnyUqkfX5rPzmHvy/t4dP5WjrLCGtvfSu4sVhIoZ/NJ9KWPzNu/O3P6HO4vIAFi0TVbBf5iSnY9MDcVX52Hasab6nSwLPr7bmQBnIey0dgDIvbHBE2I7jlqHRGQM/OKODx5huhYTCWuD5ieLETijIfAZ5Ys8lGvpx2GuyC4fWwUuKtUCXxbOUGNCATWoJeHbW25uDrspo7kD3+JzgNP/tTozRQVFHUdXUHJgQ/5ZqcKvBcC+No3zTGuR5FcYVCfyJfe5lZzRM8/BU+TXW/oEgh7uprUPgVYC9AobQlhjkeP7jbMy/h2V5iAV+Gi5bqCqHfKU/Cd+lpw38KXu6FT0UpE45s98BqlMDe4i9FvYv0134ArqEjPqtsl8HqCD9uOoMvlqVdnvdR+jeNPzuqnR/M3gUbS0E/aJzWU1tKZ9k3NoR3C8MunBdntezUsBD6MBnZjNRGE42FDdZeylWW0mmWZUNplDqODSzjczwmtR2tlEVP2FDyhMM6vQv2SqNod2T5FIOsSJ1lf196hbh3p/73nmT8SycI3LXPmTVm7m+VFpYacAFaEZ/WzHndtZZbUs3fkmmNwQW0jEfu81VQW7HR3/cG3/Dae98oprG8l90SUn5M/a4DekTuvwWaRH/9WLcHogK6h9i/CRNiWX0mgnzjlJeqY99fI4Y9nls03h74xggjV25acDZnFApWrKdfVfwGosXLNIv/7aSqGe38qB2OEDg6GMeLzG+AjD5KmAuewaub5/rx8jLrvUNBvOfl/PfGonoaBtp7F/M7L9smPpMGfft+n1
kJQM2G3ebDkC3ZwAN6KSYTVERsH18l3/pFCvWQ6tgH1JiDVfCGRCt3LdMjIYssjg/Ysmlq8e09DhFDDsB2ULBaUHsI+fp3Dc/+Nyd4IbSL4AR3fLpStNW0agkxFVneAm1F1coFvRuDKcGIKqsHt2QrvFSDI/uaRlqY3+Xk3yBf4RPpm/iGcuWy4g0BpKMFtTTbWD2WzvcSBJJrcIsEWrQraKgLBJoraIq7P9ns+vFg5/X8CjVuAId/zSqHWPQQ7kaHPJ1EYffLCGI9dA+OShU3lewu1K9E/bz+nb0xYq8VFGDbHn+QZNv3mynTWcSjbuhBRcT5cJ5+Lyz83oAihZi4cz5NXZa45O+FsPcD6p7vwfPDETNYn6CwIDh14ujJtDtB3aqBXtcVoGLp0TpFDBrvSieWqCEswe6hOSs0FtQwiTV5PIQVbhTywUWmV6vTEpZ6MGuKF+UxNmUGfauOt/HIa92cgI8+4mzWz29js+Yjoo4OWYfrd92bAxfIncvgWNJGFXt512eu0z4zwaPPsw28O1Vq1bsh9HJCNDW6/AlKs9zHUUK3Tx36wC/nAOMZvxdEWOEvymEuCxMV+Hnn72Hnzpq/IN2mGf8unevalK53g5zRw4/OAGrUsWHFhRaF9GepAyQI7Oflnucvj+CnnubZxe1eA6LvdI6Lspwm7FYbrodvGSP4i5LX6h90zzcbSM1SrbUqASjb4mT0MqqbgCPBzhwKW+Eny6IQEfi0SMoUmfBe1l0N9QjEzsXSKCBlpiY5L976og5n1ozJzJDTXjp5UeMmraobzeNoiODnLeZEBy7Y3aU7FqfRX8n0S00pVGvzHEIQHFas7UHCPg/e7AhqnT/BFxqJOEQB9kx9VHtTO3Oupi8OLz+MoAxG5jz0rzQrA/LNNgzfI+V+B48/gviRKOd4zODPxki3KQHwBaJVEOSe/+iwLr3ZOxNWKIOs1Ft9+Gq71Rz70KF1Kmm33CHOZKaKtGgdahx4Cn1DCPhFeMoliBN7QgchenDO36Khz+NCg3mJLbaTo91vusgz5POzZrO9g/1gIKxMAQgiPM2KCQlG/GA+8mNVPvfei5/YXluiJRVYJdU0q70F8JeYe1kImz4TQXoYQYonxd/MXoyt3VRRcfoHKEEi+Rm+yeGo2wzJM0ttAZVsi3EsqI4sLmnSbkIP9PuMn/b+vG643m40p/y6o8s3bD3OPswZnrQBnKZpzJB2AoAgq3gwmmtemniosW/voQTvenPitZ81fliDIBXouuLbDV8f36E2rZvVfvzA72rcO/YNC9/lfFCM/V98ItbXdh/QG4+aQuyHpMgkvcQfMJm2ki3gPojpGQ7bM93j35IN5aXQFyrmqnVxKIGhs4mAKPQ2W1fAj6j/OrRuvRNM8HH9kh3cSlxNsd6XGy6vwcZIFCVH3D1twr9nqX72EtPnADTox228uk03Bi6l+BoYYRh4+dYh0qo5zNjJmywPZLPQ++c9DOEdw4NF7Jd1hZIaRhw4ro/07hlPSo/SUQutjJKC2p0NRiiLmd9nW4qfDVvgo2fQ3LFrUlxbuLkV5ZpCQutUsI5jbOrM0vPuMkOzl6Ew47m3iQl8GSjuO4Rdz6t8/PTlHdAPhQw/Q/+3gCxMRXfT9npWksNKJFF+lQHKs0BRX1Ng+wJkRofioaDSzcjfFBMJqghwmlXz5TSK2oHm+d5PUPwuKuotxE0j8snf7Gj+rCfgeCtEP6QQt39c+Pd9Uu4qELR2wMTUjTgjZ6KTrQWfllrKmWucXxkaO7Zqz1VAHd0Lb7er1QinhwP/ualgFq85HaIvFxd9HXcTp15SoL1v/nGp4mKi+eC4sgJjjDstpA5toZYFtwI6uKDxYF5mQpmj3CKE36t3u3KDJnElyT/gI36PaIQAG0i0uZn2t96afh5P7G0wk+42Tk+E0JUzieMgP1MmiGpiBHV0XvEXDPC0FzCX+XmTK58B9CsQ9o2GvQQmMKBlfm4VMpFFTZyniNagSrwBvt
xOwYvWMnzbTcrQW9h7C1Zyikh+fSLDXoB2V/tFgRpBEl2w77zHM1/qpvENj74k8Af67sEtkHYAl/SNd1FW4pHWboODNpx9qWgNhCggCBy7lO5gTHSLXdmcdCbv4D9vDbWLqFQGe/aioaJRJpQliXZoBHRQAF+R3iqfw0Fjh69n8iTNsJOgsGQtJALpdx8ObSWLy5Y0WTmidPn2b/A9P4cvh9lj2yNQHQHY6wcdmZ0HvyKdHb6eDeECjO+7r7azlPjVB8aSuXllG2zu7d03PRX570YV1ffPfhdaWrQ8iMAbfcm3QdEOLCY08WHHr5q+dpuD7ivSmCtdaGoF66FRMfVdvtl9HoID6NhR/V3HsiQBNR4hcfQH1oHGDQ45LOxpmsLdlZBk+W2h2aUVtXCYTVHFpKO9mbUfyWioJ4xfpwrh5LE7n67rS+eJ5cuHFYiUeXJsV+RjIZpUqpJADp4KqTqVltVR0Bxos4dB3wAaFaofVi3I8EiqaO1UXUjvQsHQcueTs54RylvUNV9wnZftXYSn8oq4MPNHrfmGdATpnXynYPeHndlwjEbxDd66r9TmPXPxYxMdPn26uVfh09RWAHq+iM2/+qXuCC/AfA+3HlXzmv8D+T7R9+H7P9fi8GLZwWhxCRBDKVpp3HATQtosv7KifIj33cDl/VEKjdvJjoSzNMtrJKidvXyoBJTflmYB8/cO/Q2dp6j2taSvAEu3vgWQqzmlavpCD4koZerLOnHt/2HqupbcZLrtK4kMl+Sco7gj55x5+gMaf3+dKpdsjzQIundYa6d+OE3W6WP/kA2DkeZ0vtBf+4thD8ebSXhsfyUj6I4fUzZoKk9viLTfITbdjRMr1Bw1K62SnMjKuLbRmlglAJs/dmmgSDnkx8XxLTi9q8wBJWIrNfbzGZHd2xPfp9uWXx6TH+30Vq97+bihYK+ZCopuX44WnOhgoc7nfCz1EFSyx6kjeGWrZ2FHwvnBEuEXvsmUD2agxCNG5SvaEG+/FVZLxR0ENC9E/AouJm/KWA4ERGI+M9qndruPC9X5vxIzHN50uWI7VlJJSkxM8ltTQ3ijxNLjeF6Tcj4jXyfE/PbVdeME0l4teyFvfcsu6cPWyW+oAZ1KWJbNlMAmjKvxGi3+tmONQQdTdprP5se9UvMOzVWJ+CYsM2InbXH7Hu5HHG9WOuayoAtkQATptc1NFMfQya3qAVE+rPcedTCnPbH2WElVERiCGQZcCXP2mKw0pIqepdWM+OdQcAsxwgZgoN133PDSBjAy1AyVkTnEVsU0HGz1emLVv+jR2vTXrtx2RZWp/EFYK4GcB/6qOTSjwL0jvxqzOrmJK8INgYIDDQvaN8nb3y5A7A95r80Gnib35cDQcAUVUUW6ZaGCVfHqYlcjWx6MmtCyWbOxBidIwwnfye+9cckw4wbGEipbofXJREhsqtSadGoCqOBfk8E7LhTjWDr76CMLrFCSlU4qWzTiAoboEP+qwp4QYWR38VYv9PsEjo/cctfiMe3L9HwCn232uzeLoTckdeCO6HLHVwjLg656q9BPLa7yxPpa3yjrfaYSc1JNqk+7R8S7ETOcOB9JYa4cIGR1fdxL9pvo0QdZtr/kbsPBBeIc51PgxoPgOUfhHggU8RuK3M6Zqf1bssIl7yrmVBAqxwIQOLKmGbj1DKpT8nH3VsxejuujgdBN8Bql/d4ihlMRgiWwKH4yFM6zBc7EbpStoXpapUdByFdIoRA6YG7qZ89JE2OGVbP5pDlbFGLRb6EMj6GpPgCvRkt9DVIUSYVonXrua9vfsqpl1p6c3j1K13l5Nk185hlAkiiICGLR3jawrlvAF+XH41eQoPRWbtaRRpFMAfJmkbI7g96TL9hS7W4kddkEw3gQ19xP5gc4mKBE8V9/TD+fhyOd6198kRPTfUWg/W7ilNCZNyymv10I0Fzgq/B8JVDcoyGt6nQt3Y/u6kRBgEQUP7uMo+VkgmIaPxohP+91A5wZQvtCxmyn5w
iwLKYjrbEpUzwY91QZWrtIqDd7VmB/AQdhUoVUO8GN/1QFsvw2vmWyEnnRYvtoCThsuh47Hd2V1SmNY1tsiK0dLDeREwWRd0HQZrtUeq70PI/RiD5IBqtLGzhbvHXMd58CA5kqszhmM/sJeB42qnSIAkERuk7fa6o/ZaRoyP/QdBVLkhtb9GveXgxLMZS4xDsxM+vrn1Q9jvCEY+xulSrzwjkXc6sTPofS7X4TVHuH0vhM7MnWq9Y5eeCeJ+ShjME21CLXdxqaSGi6ZYr+AaF5ww4yas8uw7jEATgM35Kb4R6nRXKMI6tQ7c/PCRemwDvZmT6I4UVF1ltNPhRhi3DMgNJ8l75uXfymO6zetmDPYDcliVWs0odlophFfwVukNQlCMUd+mdBRf3DotKCWHQQnqpbvDjr/D5kobNrKhu1tDGLB5V5FqhW4RXiVfhB9r4WrjJ4vJAizFc4QUpojcbO0JP3EuLtH0fSzG7ALLyXwNflLwoIpc+DY6PXmF4LnrcF4YccsvYX3znlLeH0gUW9l3p+SPes9R9E9tNMOUcZYQYBj4FSXFN0SzHEjUeuAXmPFPLeoFtDeYVo6npliu3vnFYzFBsQht8GxVN7Z0Y2LeK9lRp+Cft4yre73bznRu47XL12No/yPFdiNHS5bJfaXsmb73DKmTP/et5/ieWr3cM5tjvXWSiqcPo8i5S5WNnuoz38kkZYhx6EDgPZUH2Db2LxIG0JeQSW3giPYs4uXFGGQrxf+RWrO/ZCT58lErf124T6i/of8ieV1SeLzGJtycW5Y/Ag4bBzcByr3klU0TyD8xtrV6UQx0e1x03dHhCMpzR0/PHU9QYzWufwuVU+Q1e2WNX5A3QjfOLQVaBUwQHascz2LMI3ZbPuOtQgKZqiizoJjjPL8mKrWpiUKQzn8h1C0leqgozR8IKlXPi3TEh7O6yN63ES77/RsxJc97TTfP8Nh4C3bgj44kDRG99KChRWHwTuDO5xCGxd53ZATgeRR7Ip8L1woI7VhjZQFdUZHVeLW9Woq/t8BHaQ2FowFUrfTeo0e+qryjWViNItQEWrHacgPGDntqPMopVD195GaIuRctZbH4HAVseYt3Oxk1+vFtYM7a7qu4UmXenwSyPE87JJrss7Ukjsm1S35Fh7htn3hxJvaAm8s34ZUTzI+P4gnyHqG9ZtXzh8JFVIGw8KUIeV6SJ70RBInnco/oSlDK+Y6aTqCkVJwB/0xVeM6SiczIUdJfDZ1f36/cnmMy/KGLaCBTghpvn7PkEcklTBSAug7dO13xDCDd/k5/rArBViSwWoOW9qM2PRk/EO96RpzhICXdoVytFoqugLyaWCjWEZpHjI8+eVQGC1uJv5HKjsAG51LDEi405wmVRx9vxtm7ELilOzjZ+xDQZDcKbRw79EaLi4SOPaBHrVgbqLO8J9D+8BbS5ZGG0rHt5ShHVuvA5SK6BGMmrGqyS9Zn407nFmb3T8M/oXaLSnJivwfXG13vb8Zh81KJALXkyhGlihHCb3KNaKPnuQX+pRu+Pp91anHaxAzM6CoqS3T1nQb+3X3yKyQaWc94vNLE3OYGa/0ZPNIJss/XFk6LcoD/+35uQX+GIb6UtTEYXN87GXiaJhpDqXkANVzveUpYCpHZVopfjTnLAAzRaVq6V6rBw1Fid30YfkXeZKCAv6ZS7W5zV7DITcKgmOydFFaNCDLojFs8NlbiaOGiYGejSRp/hJzmtdgzKwFjYXB/wk5qc5WcSX4xoSoFvkGwbhY1f71kE7bfaY8m5o//ql7rdIRINrQ+tGgvBDzD5OB7nfPLyebb4G7fleQ8jqwsSCBTLuhgi1/DbZnWm7iXK3viYYI0Q678UZHAMnyfaJFrs0d8NMh9jv3uYaXkpjNdOHQkVCE3SSBqBgsEkX1M3u/oKe/bJW2Ojve6/d3yymXyXzo8j+Pkcx/K5ApzWRj+ljGKFbODVrDO9PXL9AAF8tQxYK3UAgCj0xpnn4r9KY8P
bxE+YrdQXdFwW90IIoQkOqfKWG4hcJoWRbLaharaXMq7lb+2qtXUh0TBlmmwjHLb6WB9VpCld9CHHWt5IqJGTPMy/K3p0yqtY16w7UZ51EmJhcFaZsm/WSxr06eIAUA/i7Lda/lmafIfVXj5ZuPmON8PtJhncfw8HQYecHnGRv6GMVamWBDRKKvr+JwJmypb/6UhXSUeYNsP7VEtDprw7d9brorQG0D2duQPrxIrrZ2qPK36prQa/jHYnhJppH+AOvJ98F6zwH+7+lca2f5QaPcx6P4kjU+fkO3BOyoy9XJPEcPN3he1Xbwi2MIss9zM7oHr+b33kHm4jU5B8geTvwY0eHorwFFHIi2bBqs0oBtvBqU0RX0ay6x0k5hUlRgc28VRee2oYMvfGfa9GyLgv0W1sVfeD4alA8i8w9LbEZKLDbvr3XXTKF0ekcEPxNglH4ayLTrvQYo+36uezdSfh68GeRanMM5/Qjoj9d1OXZAKDPym5nBkPg5MP2mf2W/vg6uIg+iST76xPp81KwRCwhONOdbdjqEls7MighNP21lgB/es1fVjdFwd1ixJlwwPSNY+mx2fiCqqdI+3Ifus1SVFDtigY898eb5qJJ/UFuJPj55nuYijKCEYHyHnssZ8MryMgbhsP3zKjDduAlC4muSzkyb+W9d2oiHxGhmjsByVZfQnf1Bz1DHB9ZIfzwnXuKPMG5wMCpb0ml3Shk8Xi+QPN4LAQlEY1yb9IbVA9iKgwqT5DGRIZCOLXdfkj8OoxzKCkv8vqWFd7RvJ/avMJ6ivUIM4unYgMOV6Q15eq1iRV8TQeK7Ssqt3aH2PaX0Bi/cNyAthzp+xCJb/QT0W1g5VwP98gtw+DwTQh9knki9fLcQExd8cgAtTDdd+XtOH6vE38nL/HBBWyqDK5+iD74UhPdPGD31uwL000Lo6mN8Tuyo6K9O2B/wSPcwmLIhoUHoD5uz/ajzf+NcMZW9U81cnN6KxN/6VP0X+WVHhl9c3+WkgpdCeM5yXyxi9bOG5KJa9R+JFeStNrvTVmcykpwVIFSBbzZBYbqS4Zi1AVobpE14m2g0SncmGKIP19b7cO7vLezY0dmZGE7sEjoFOCKApsN7cTHHqJomvccRvA3OCvMhLQTZrVJX2HgqFmQiCr4ookO9CkUHhAddBuwbs18AuFvdqVRFobrZPfhr55kgN8SSR+9eXIYB7H410QLpj4PJerzYxXyaJMcXBf+nDod/ozH/ZvYueX/BX+x5nlfe/6uOEjEa/4BT5n17GncsdcFAH3HiUtD0zzN5K/Nf2OA73wLLE2zZHU9KfBXnCSVBZu9tQ+halfw3Mnnf+NHkv2X+hFHVMc+H4kC5xYF589L+NrSGPdLfiU7m5t51q+mpSjq4bSQ8YC3ejkpQzntD5IdSVEsB8HI6siEuohw6lYYnbk28sY+nPW2EyH6BqY2Yeltq9eH55dBgQpxyLfXU8PlW3eXbbkx3b/ZSwxc1dnz3TLUnl4x8Q9w3owsbiD1dDwc64GrSr7Otp4doNgTVH8nBuulbXtu5jqyfnIxrQUTkfXvhADDB7C1kLsBxt8oQIx09Xsr5zHwK0wXybeKOMHb6LeAJHzxafVLPKv7iGW4j58cqo3+25yO4axIEdhQ03NjsWYAUurJWAl803+DpxLDx3EU9Rt4+uA57MM1Ctncr7wawj4jaF57hMzIT9J/hffwb51J0pOzS4duImp+idgFlPTea7PjDoyhfgYIQMB1sfA/JZV+TsQsUp7i4jheWIipwdq8p8KsOatBwnuzt7mIhni1KZY6qsdJIhWbfZeT5UzdPETMRFrNkwPppFyFdVL7xMjLhm/diMAkxxuwhmwSIe9pA1Koqahs6fqHIT0v05QFPLXxugsNtvq70WHQCXcu8QFhpQzQWs3r35Dheu9Ahv3sPDF1fvym4DOR/bUlEeC09nt9WkmSv90buv/tHyf6+ZV33PSIB962jb+h1HUVKWTW3yaqE17hfQ
Ejrr4uYT9OyYkXQMd+c10g7b0ymIYAII//coTxuWm6lz4mLFS5FAzwDuidvH39Y/XnGRZhhvE3NjtFqtSEi90demSOtvd1vtkWc+RU1bDboTzXf+D6S5gHl5+KeV1DMjJpTaEsCc3q4VKHKWzq4z0stexM3HTND1rIZyaqbsXR/eIsD5o85M16nX9fNJ9I+8iD3MDCY72s82fn4BdCYe7L1aZbqzLtne3lj984A3VI7FBbzm9tI3FLpB9IXWifsSo/O/aOo1jzvN+DExMHqD/zXo7SDdLQf+UceTjPZ15Hrsdn/0a64kkPvW/CNsVBukhX4lgpvR6VD3e7g9Kq/Ld6B5JOZVWnz+Ee4mg3YO84zJGIwZhS0y+j+F6XrYkw/ruDybPhYBMBbI2uCB+g2H17Ufgq4+Rvuc0ROMPivyneQww/hijOdfny/CKQzPJgXfOb4DSvrPEAbKKDRe7YkC0lwSqnivyLnQr7YWja55Dia8aoeAYoT5IEqJ8QLhAkr9G1genxxkxU4zhcgik6DJbopdBwVWYVSlU2e1EeGiRVCIMx7xGQF5noBuyhEINZs72m9fdRqNfDcReA1r4zWikt+8vIdaTwpop9JUhM0LUaqICUX8scTuJE0Is1uIDmGaPjiqfQRjhJ1Ltpb2GpKy4itvdHF+VQeqkgnPa/3g4MbDFTyjMChb40T3n0y0no8Gsx+XQnGubHpf+aXuUaU9AwcJ/0W81U5ElJ5+FhPTI4wNYovnNdBBG+cfyXwX7+uEne435dTOvXU2sVYoDejV2KZlru6jZ/xNSzB2GARlPxQ1kHKgFJTBYkwmkYQLKFGdRJI+WIcClUeV5l/TWyKjR6Zd6rABeBQoAxqyM2+hWlpIPigvX7QmkM4EYYlxs+x4Kcgs626b+VOi4J4sj3lUe1bVNxotSV7/v0xoFhRLUJdnf7Uf3Jojnv3jI0MzJShvbWsXqYxMIeQ5H1eY3HdDQQTMJpmMziv7Vz4lDMtmqvZyU5nEyRpft+XmgYXviJ0xSETEOyk3SpG/pKypAaV7bjIoYrrdpyvOYtlFz2XWRRUl7ynjlZJ0Dyfd8HzCFT8ltq829/yvmvHO5X+ov6y3Z+hk0SKW0vflXhreykVmY1Rjn4LkYuU9jI7/9QH9Cc5+hDFR808kYCBHYUNVMnRyHrj/fT5aCAR6sT2LW7Bj8LunIlJBqXYb8b7L6rlYQXlILn6NkYxpmH0lqnSq5Aqtbklbc0ed/oUfbTtMb8pWiXBmUY0v/GK+DowpWK43V2aZ/vctNlJQD5OILpVrWvUScCUEc+mNBtgOdMOzLZfAKV2gmfZ/fK3yJYGs73+81XGS9DQQqFIHDg48xuBmoQvEL2sEBKGE36TBkFSQCoYsKfKWwVdYrkhyNcsIKfG6ZV4b9xkgZbHlc+/5KIj01weV3ciG3PdStbGVXk1qmuktzhOQ1oRXRgMF+bUpzGnS1NrcNSJDfiPkVBjLa1xNytJ+Yrge2CiIXZoI1EqShGsCxz7tLNQ2LI7oJ1h4tURSJwBI6wSInHfYMq7ULLqTjtGLRC6kJtbHLT+Ra07l25cZO/ode2m7y6Jzq+32txClfXc/AzT0TKgzjy1+yRCB4CY6AkeI+RPh8UPXQlCJQQBXp/Ca8kfiRCm7PhZOyXI5duTZxCtuFu/pvtTyZ5HvOXr8KbhfoG3d3SB/ShKrmf38DAdIrDdqxTbPsetNUw1MYVks5BnokpU7Swv7gQvs4Q0tcbxcnE0kqjJRdskZ5hyTbHFwzqTKA/dPeaiLwDZ1cS+AtsAADhFSE9FbzcBb6m1EZucDYaSXhhVmrIGDC5JOKiF84RC8pyT1Vll5Xhh1qGA9m8V4JnNWCglUPuperr6GqF2tZFOTfvlHDWfn5r08WKyHefsTVowjGlFoeAjjyUWMUX2cJj5u8WFII2mv+2pxvFqfEec/pXifWwzT/a/+bpZ+SFr9m9Drk/g5F9eZ/0b6Mo8sJoktCl7fXn8K
WiJLq3x/YAR0a2qFJB90S+VHlAh+2eI5z0MHoZcNUsVOOLmIHDO4C49BnYWbA+rNL3tfkGv8D8NNbO9r0MFGmSxqbNw9Ggamw71y65D93m0AhmmzvFWF9GbumDD2hafsOVW5TL4M9+1F8JkspK7pGrjwZN4t66F/Sz1zw+1YvNekg5FIB8m0VhxUwJmSAtdCakvqMWVkqiWQl8/P/lv7jIODcj6KOLd9wHIX0g/7V8YfDpnMjOts7xxLRvXTKEfygq49EFRXZnjcOU/+pT6/v2FdqqR36X+7cWcfckU9FZ88h/W6vpoEfvXydQUkj51frvvF76/ELWhSS4BuWa5a6cwfc34kfFOGfO+J3Cyvr1VymPf5NA0qWpswopTmcYTpAChyGnyP7iqzggGfiAInTZz1cilApJ+4FobWEgi5UBJZkqA1mU9KXrLkEtBnk8I38mzq7B6Wwk24z7Sm15xT6uHV1YSnR1G81AvjGYSluvFC3GwK6CMqMTlTIFzVcRY5Nq237rj6RCWd7gT+3cHv1NqMOpTuoTPOovfOVv4NMtYo/m1pVBx6nDVJZ1ZnzBFC6I+Ttfa7n7iT19uhoWNgaKXtcaW8f6jiSIwjSN20lbQrPonKKR312SdJxv0Ktn+4Tooyuvr8vIPP0tXn17sF/MM3b5VovdgcPPfUXERzyg2fmZQ84556bm3G/XPwpiRYqnpxloTR8EPRteLrPirUzVafmDzXsEn73/SEAdjns36fe6lmI784/2JL5p0qfcpVihb0XSEDCf+vGvBh/PLmSWO/YROZF6wPKbwiBToovXiPS2DFjGrxjGDwbJZSNB629DxJzmuFYhsPbccGRpHhpNLSxfYiESiHRxVnzRUOiSdVKM9CDXST5mr8DjmviDZaVx9Pv+S//G/y0ZXXrZ7Uxj9MLBOStwbd99H8DHXp/b03feUjGQLTfNajCr0hmTU6frZrvUuBFs/4W74H6Fv6r8G0AZjT78q285dhCevE+CKWOlhmBcBtl3LkAZe7GpN4CG0k2VLrdYJdAIfGrIm20dAJfmDlb9DOquZOQee5FINM4YK+tMIjktcFMtIz1Ac9x6yz5vfMrqzXA18cRNaAbP7ZFN3pTwtCxXwkcdLGrUQ5uhLYNFZPg77SIPlcnsPyTn1dYoupPepR4Pd5C9dutig1bBVQhIEU4JJRQ0K0wXYqnfbmRBp4m1B6RUVMlXwjx20uwQlao4EBaP2vNr/5Z11TDcnRGM4h7u2Tvi1Tv+BnRpIsV5CTrWDsTSFx5oMpf5CLzD8r3i0D1Z8XzbvSM7wql8Ht0n3Fcz53IgEr8+cQIos/wqcbgn/0Z3jUpIld9zgsaEd6HfUtAmZvH5cZ8nOPsi8+HKQj2W2GRsuVJj5U1KRQLyRbYbTgWL1IOGHxwI1/nju2NtcXCPKnJeZK1tcrOIcGfFYxbTzUfo9pN+3YaddFpnjhqMPDWqmo/STJ7Exc3AouObTVmnLt5c1I01rXmw0VoQlEvXGhj0Q3EIAnM8ADZ80AnXtho8SAgMPFisMUbLYLaqeKwG3HEuzcfHFgqvXkKtC07StAr4YmTZHiC7XX39+f0o7RQHDEqN3a/hDNqwE0A2cVy1D27UCHYJziLdHGzbfgxf71wJFk6hsCdBTS+7LwkbsANaXOvC+YPnaQrdFDwWf5ht38lu9SqFk8bzIbfHxQ/SBCpxlGoT3kA3muLPQUgLdaCXkDjH3KvQYt4vIElb7IFlHjlXUwC2vjxwgtgmGjdWg5PiYedCmCusO884v6rTIU4J9h10wn0YafFlK6OLilZ5G4T5dsjAEYYnrNzD8RweabdoykqmRBVGKwxn5CyQXFO/FFeWQmixTKynJ/+RF2PxkIK/C9vTWe5S8BuKEuz7YuOj/rcLFPFfjal+7QTA/BfX4zGZJOClf8Mc2Z7LdKeB6G8iEI578Io7y5uw9BxSSH3tXOsC9SbYl5zAemBgwDfqciX9i5yiPx2ADV/hLfCP0m
5RM9R3eO9ZzsmV4gzfN7qms+6vNRSJtgaZioGDUij1+xlblAmgv6GP1FjpeU1YBfq8tSt1htdY/AAq6a+wWB/Iv4VRRHgnl1+nKJRbDyNXoUItZIUR4csD3sksnSf6N5bU3f6Ore2P4nSXQyxTlRDS6sFrOX1RgMtPjS1Ilv5WAbbBlYIrEzteWr6ZDLwhmw5dizShX/XnZIMzYR0XUDbxwtgYR4KhA1mmxgcI2x/gUwR1MK5xjweGT79FKqFjPuZA++X4yWmH75+3+mbrFWLKb705pKEQho/Z9wE8prl9b1GKDp9HNkf52Y77E9mo/4ro30G/dbfxkpBZv8wEDAcTtEY7t8U5hlYeo7yhK2zP5q0hSBJ/fLFb5nNKYu/HklFMIo+g98ave+5lWdfca57TPwTAfOPGXVw38/uIj02G2v6uGP6FHCbDtpANwDp04ZEFSol4Xttd60iYrGxZXGGuHOd2bE5OTEppKsADEylSMm/g88iwvxy9qKJMvA2FmYrFdkulSRtcAEu9Q1d86e4aN5e2WvNFQYVzwqddUqXnWAkTfbByUxmmyRWouluTDWQvydQs4LrJu2BZTfFdaU7noPfuHE4VONruB0T/NajwqMHLzLb/E8jUyPs68R8o+4Z8NMO/f+dqwB5wpj2KxncqLHC7+BcyfbLYejDlizGgD5DyiKzhb0z+sYsKmikf9vkoGJq8EU53VtU65cCzF6cdRO4iMeL8IPD7p/ZzHcHsbG6Dvo6hR0xfExeBkzvk/PT6DhQxuBpSXMNIXJeX9iAdBob6zCSN5fD+cLEHwOtvmMhyhTwUTDe7VGzZlPabAvPkw56cEIpimMgETmZHC98eB7sLWN/g0XfmfZkrIu4WvgjQBMCHegurnxPWHpSXw0sXEhXFDAsbXhEjg/bBybB8ObUDJsTY1ZKwIZmwlUVVFOhYYRpByKZtFEGw8K3EwxQ09aXhahiqxRtqAOUtqdUMM/COhj0EuDm87TBOGtx6JuIPyPbIUjLW8pxWusG779zbA+ynqguBvAcvLMmMo/NyA58Rf/dXAH339bhv9gfd4jJmv3jusUcm0vI3mUPdyNM3G4FDD0kkcZKLdqECb3x+l+R4752rnQGnL6rG3sxn6nvKbd4p4PeJcOigLrlB0d5G/SMuOPrbBM3jBRHpygI4PrUkgfV4xG1aPl3uyeED3d1ps3lbj1ZH0brwbMAoiEFiS72kTF0y4GOPCSl7hcRWkb7prdKlOEkUD0K1zYdCLJngUJqmNErR13n7mRhAsjfAjkIuj8hjDKDhcYd0JHP2rSjouX0STVHl1wAm9O6wObr22m83yHsm75fi/q3M3uH2FqB5SYz+eCNzW2kY9cswfBy8mxZHji1lsAYGSaM+M2HB7SXu4D5JC//RHDJTMp/EiEWIYxw/iDJhr0x5nwvCsQheUb7PXmzcKmoJvctd5Cxw6Gv2Z+w156D/i/UegC/uWGcfo2ghv8ZA+z3RR7J+48qPf4vRyYCh/F/kJhsFNyE+E4ENVBMc80X8i3N2/37S+9+/leC8iImEZoNfMtmge91hitz2dJpAk3qgETp+v4XE20KQOehUD1wQUrhPLlkrqK5pF7IK+o1Eq0HBVgIjBI+gfy36FvCm7u4mrjt8qlz4RnF6pCMpMushBfJPWyfEcKo41d9HXywmMz3k9BhNrvuYzy3+zIxG20IsLw3Hlumx3UfNP2i5f8tq4CrYxSE0umHMkr+vEfUQKlUJWuZ3Ul+/zSO2DXvzOHktRkC2GyoWjEVM3sPL239Cp3FOWA9d8Rpiv+8eb6Mbi+39K8UAdBkXqPTT7+cgi0zh55qa+uYI7dtiSRNhF26sU/B3nm8yisR/+Ir0//U/vvYJmYn0XVzOy40xAZ96FD6yMH5bkZsfB2G8UVG4VkhydNRbIDPnt6OxVxlwX4GsEe1zMoYGcXTKGrMuL7fqnho/XTFsVosPra21l/rzW79jZyh/J/I7s2xiQ+6XuQZfW8
QG3qkbElOYZNG6EvCY83cnHC9v9gR/eqWRSpVJA0qcHi/SonTHDSabmGRCc8RM9Vb0MoiFKlyz8FmRi0pGLvF8Jxcofu37MOZG//Odr2p372ER37c1mq/Yv6OJ5pg8yDjEXtHTOt+fmS1Ocy1izawmHmJ5oNJHC6yyoZfGivqQ1GYX0aWtEX/LDk4izkVIRbPpscrsnrrMBez4mYae34IEwe45JMy2DGKYkx7k7mfAw/Q7ZYTDXBZ+iUTRfjE/tO5OT6ivt0KyHO9Ez155x8sthwc/6V7vbZfueVz8/cTzC3kpNw9pmP2xd+lTRY5UvHAMtvW7eGxV2hD6cDMW6CaAfwGnaqvS8X5FeI3mwW3gAeVqpy4RMU3NAknOJStePHux8F37EmexCkwr8jriM6zdmxIvsDyvx8mOalc4VsNxeETjSyHCX+78UBT5lQvo4ApO6hepCFuseSnfS7ey7zBkW57pyP1a69/QbeLb7tHuvpYXecXvN7/412q9vVkWAN2NL48S25UZiTiDcR3ZXqqqbRMC2Q2u273HEIk+OIxE4TYVb7jAhmXDEdZBU9CgfoCHVDzUS3YxQwXhhWM14r8xAK++kd9cma3esnqO56wrjyxWQBAFHn/48C042ePt34rgnG7F+mEKLxA2sbZZHyrmjJ+DfvdPtG6f+PyJfwQ0n+BoETVt/dFg7w0G5q/fSkTcigdCSV5ZgQW5YZ1Y0106+Eq9t6KRcK0M5dBysttfz5ySZNqqFH5DX8AwoZA8LXNULzWNCwEgy968wc2FN7CsfJOe3Tcx8VdRbtVRrt6B/VSQMhXNGMWAP5gpWUHEYRpjuRYqRXg7FALdCaS6Rjb8t71b5VEcBd8JEtnW1u1rtfKfzvf6+TfcvtyUR4HL6vU4fpp2Gzjl4nnmPUH2WeH0Uhxm37tCUm6flfvGYXza9khLzgYLG5Q+GegXblnG97L+TvTUcTsjzpDcA2D9339TQxqYTI0bz7c+kqnGYCI+JIhS8yHQj/3lIvHrxb7pzlOLD9/EXhYbcL6lKZlYD9Us3exnwb9nOzcRbPwtT8tL7r1meCL1v2wYLLz5Nj5hABfJyjeOmWsfEy2p+VC2TSGWoxtjhUB2ejjFAPTcfcLf68ztHfRG8/so248QUcvzl0/8Nh7yY0dBRAaMDh45rnqQAsK36wLgz11DAmY2RWscgx5OcLlg5m8+/BSrmt+93Qdf2WYciWUOqlO6cDCAovUvaLinHfv5zdFefpAJFWwUfJvNs+5Om3PZOh4T8R+tAKe36uJewNWf18yoCcwQJ8KrcGk54UWEvYk0lD7Y4nnGWcooEkfUv2MP+v6LERJjtG8MVskpbcPrWADSJfL510z7EoJfI507vgf/kNrS224mbhHHK4NTbMOprj/a5FJasNJTdfK1EPBw/414KryP8eHykX/duFyAIdy9xSknJd6WlQUD3wAT1k3vxZcDNy/bhi54MFpgR8T4gsErwrZ+GltaQRMacjLCZBvZWhYCM/IsbAFhX4qqwiS7uVa4u6HZZ5rk6qJ6tdIMYZYjGOrnft8aKNwb55XuyYSbljc4KwM6h8aj9xt+DeE87yMfu0/qkwQ4g+CM9nZa3vURwrBuRf4MHuGvKGzt2yNRMpxO2TZ/QMYDGShEj2dwc8jMYMc3naIjYjld0SWl1105D2OuCx+eUbXPSbDW/WqE10zs+39dss0vA5DnWR5298LigMpVnXMqRTi6rtmH1SB+rVF1s0yZKoS2snzT1+N58oFLaLtzwZb4gK1oCfTgzbizvINJjpObW9CT0aBKH61/rFSiCX4OnY8MMdbxmIclejz05lBA+eFwRkAFrHhnTB7IgwmMJZPnG4HhufRKSVOypSNVhuOZV4dSbtR69fNpW4LgRvyOhswRm+ZxyofD1KwjfroGUoD5AfwbzKbhqStr8uYDlu1dnsH3XURYRKYUN/TawQCnccC3jXScs/8sAA810TTzGX9/YrG7BJ1cTrPZfi
z0vjV/HDCD39rsjh/wjUztvJArbqUFPhkpjJC4zv5iG8Xf80h0fZ9MGJanMKr1fZhKgdBZ0FAvEvoYlGpSGZQlqRcFkyeoBTGeyxJLkOXxJdiOCDuENvddQecxC5cF6yLdvQ+E8FTaUswX7bnNt26LBmXzQ5Sas4phBBN9DLJsUa9bA9JoOERaiss+vb2g+0a6Ff+SeEK9P7jUK+4tcFPH5YpmCUI47GJ7rsUqUD9Kw9G9gEEn53PI2wG7b+GGh5lO9j47mkxUG/Ts3/kEzuSFklhJr6t/DdM9ZXOoSZPYBLNN3v37wkBE+OmXJ7cfs6nTZIf9Y1pKhHzzEzT544kE857dG0vU0P7fAG5afHgDEKFDRQEvkFpkhYbZcO+BEW8uvf7HQaC/UPpvrv7SgphWNRjxKa9mebfdCgpnGLpBaVFztdR4pqdUBq5yaW0f6zBXDN+4ldRrsRuhYRiLuxKq4uoiXD1YxyAMcr3w+PaQ2szKHssevoFO0xpO8bEOoU0Etad6tmDHWz3HyrQuZ8FS7ujtFvOV/lsKTHdPhS2gDK9+H+hIsz3+mSI7prIpASiBwCy6UQMrUEtrpw+PqpIqMoJuxjCeW97cf2S7CFopcww9Luz15iGOR69pFub4oVHk6iu/mQO8RRI7CKye7XBJSI/p32CgbTiZ9uFlxaiQ6cP++xhEUp4P9FGNwedC+wEGi0RcHrHxcug3kuF1u9pNpjZEGheisA2lB60rCxY5c5IlPhJsFl7vNvjU+6BMd8XYIApFsIXGwqLO8fFlGWFEg8SKyGDvocDbU5KUwyM/KZmqgxRTuOv6w2uRYTqtvamk2la04G7qqOyUnPiMbqYPmuvz7eVwN6Y4J7HVT8i2cbJ127WELUbO3GTZ6eU4Dbrskrfa3tipAMdvEZ83G7xw9opY7Gax6KQ2rxX7eXah2e2NOLZvZDbckSrbTr915Y2B4Mfih0CuEKR+I43nQCoAjCf3YQr6mvw8boaMXCbii62pH54vXiRwFZZHYfomc7C/BJ6np/Gll4kCZz+Lg7UkipDpkt9TAKrU6+5iTBlylj7JAYDYfFY/rpDDnDBdbzAnzWWou9mRIu2EIPwPmNZcXI5apQsLbTiLVz7HKbLTOzDUXZtG5j/LAaE7aC/BIrkXvhWO4AauGo2F7lWDOlby7YY1+2C1sBxYo6u3+YTv0Wp7zx6xU/hNc7kr3w+wXlBUnn6r7aj4buWGMeeJVQCU+djduOFNllMoaeMBYoGh4mI42BstCKHQy8aAROleYEGm3p5ie4d8WkV8n739sojdh0YfcIlFdzaOu7+iNSC6EXfpYwNqV+mPtdW55pDgC2L08VZO6Dfr399U7F9H/Qvj+a0W8MWG0j3Nc1y4Av1+z3gQ/bPlxTLdoqn1tUMdt3iIRA6v2O2bOVsIotNpbpQc/sw8iMWnMbattjt7JueWkHr85Hy8hUKsMSzzeVcLp8I+kjmYWxavwWuz+u0uLArEdAnJ2W+ltjWTDa9n6cqJ2Sh9HQAKSN3lEGoHVO/iAbKRngL9YwNDlp1cR5TI1hu4qqfrkWZDPjsobX2I1kALDaK1KrRLRMkow6BO4wu3gI5/bCSH8sM3v8EQqr+6eJwrZm9Arg2DwD1KTPIlusf2RnQNgpQzOrux1tWORyytSmbvaHY+ZkK/h+EJqrEefgEHjf5mUDR3sTbk+jcCRCsVKM/o6N8WSfHPEcZZLjLQhktQ33o0axeXaAAlFa1nzwjJ9V0C/rUusgizj9/QOqHmw4GcLigWbuvsOlTqFds0HWmF/dgA8Q5un8PIzE2KFqiCQjwFv3rV3Vm69d6Y9V7MmJ0Us20tXE5P6bFK1iHihIKgyfuCDuz4zttRVr8qKdE7Pw61eJ8L6+xO4dhBYU8a+MIhXXwm9PhWbAgKSFh4GtZ0SH8HPNfONJLXSQwaAuo2l1SWHwnb94Ub7HBe5xSWUxpT/ay/Mr4hM4TQv6Q/gXxY6/v3dt/hTXW07z
cqgM7//hvi+MaesvUWsEW30x18+V16ub2XwsyJamGbdb8hq9oM3/DfGjDeGwObQvC5ykfYKc9lrmS1x4+WBoxoQ1UONn1wg3bf817i1vSiBE7DhzOEe7cjcn3py+3fM+NUVhku5GaDG7G1fntQZZ8E9vsKHgji6EY/VtCsSbLb/RFzWvNieNFoqK9r0mcrlUuLF+eplISgehGHbPzsQ6eQwRRVKc00hYDJ3DmbOez4fLhaGMi0DGOfUsnc6cvxObZgoQuI+VyCqSZ52sdwHwO7jiCrCCb9iYUqaI7MxTaf94tDGJlOQf84K4VR2tWMR3Lhdw/BpIGC98/Hq+DcrklCxdk0OrI71WF/7vBY04BXan9GZPfynrwIoLlgu0aQfE9jHIi6omq4edsGdDELZONkmS4sRcgQ6WxV45vaWuC2GuFimxV1S4U/RFybNOPzS3Rb9J06C5kU4dd5cEZ/9xsL74pGFd3AjE2+2QDuOIG/em7jq/Wy+ePoBh+to/D3LsfBuln2vCl176Xb3ESyfWhi7wSyVhWeFHLKg6DfuNMqdotZgoHD5t63ZQKz8DlPxjk35AxNGJrtjQJBlcRKPXi3+PSCIOU1POlngcZm1aNwcTW5Scko6eWDyB/kR7yRO9WPNFLdjYRcN3+9eIJsCuIFQ7uNrKD7OvO0+UU/qinOuszCd5gF1wbDXjuw3X3fz1gDZup898RvmiC6pr+IAUsI9XY3dyU4jekULh5kvvFgyWURcD4zH15au3RdVY3stoVLErrIPox8nGIIG8vLqHd/uBk4nDq1muiZiLWXDhOq8JUVhjs+Y4hYAS9jqUJdsq2o6gakaEZblJvYKgJ4RfCKTc9JSrxYhVo/0IHu1bJZS/jDmQFRd/QDWrF35intBBQnSoHW3O6Dr26VppEAuMcErkYRODoXdUIJbXDOpKulVKJE+2UXNyhYZ05xNKH1TEeWHtd/I5A+Tmu/5fW6h+uCx+MYdxu+tbC3Kjzkj6thJXfajViuZFkGv8FLv0LgzfhYEb6QAnjdu+FHMw6Q+jW0/3jF9lcoyKugP33v6asejdS7Lsh7khmETBi+dccrw+F0tbF6cssgDTSK5DsNOOPyhw59P1xTfHbC1QdRYn3F4/p+NqXtVTyDxQnYucZEFv5Btue2SC9ITBXlgYWP0Hn2SWbxAVaZpZsdIjzmeR07KjdEiQJFxWNjfj435SGU6UVJM2Vl4IkyXwEoefnbUrR4K4AR12Gj3hxN5VPTp31qc5P//QSw8qFPNq89qXHailPl3vk23/5YncRHEANDglwBcAPFE5JEFjXfecxJXd7lOiUnecAmMTKHcfyltzpWW/cauzBO3uBvlQVNkizgnqSBkWek4Li0+3EN4rqGrV2d54Y1+IXzSjj+L3rDhDlPLsgypzmZLEA1BChgfrlfvLEUTeQSpAJk4bKg33wv9Yi3C3inZffcF3Lv3WM0y0RBMnANgq8JfORQ3akeY2uPa5Cvw6N/1nQxlsjYlleRuLZBb6wqHX3ryNrWK2zpIXQ2EY92h3YHmJ2NYApHEhbY4DVJcAtOdCxvv7ZTJ7RIdRS6zIw+6jw3CI6ZISpVd199LF2RWyRdERLGxy52pyz11kPV3oIGSTI9x+qZB/JgdjH6zaCTeXe/YSTaEOtTtW0DqRotvN+wkH6I93x6iaGAv1Vhuv1bDL2z/Rc6c/GFqvzymhly/JW//0juS8VgpvpZrRqVnd8C51eQx98aKDKz38J0HpiZzm0eUAy6zfilOfWmJRFHPdY3TcG9nVQs0qupOAoYKpIR+C/R26r78jX3/agNJTJgHgj9Yels185yn+ASLN6p4VVpJ79FK4+L9xivpLZpLQEskx/DakMr8dEKVJgzj77UKmcP5Vx0urhLVKDoGV4csXLlJJ+GB104nPrZaN0BLHX0Oa6ozKL12KK3PDId0Tld5kanIIJLBCkv6YotqTighOfVps6fqsfiWEA+AyHvhHk7Pjjlt7Qex3
PdBi5F6PkPvrHvcOrsQ8aKG/cVhFu2H4rtK9U19FACRWigiIQ2kWiu9xjJ7Lw3/M3D5NBvFI9RN3iQxu/A1F85gfhnMWhC23c7L43+OzfachUCg8ah9DXLqsNtgRWcr2yoKtxAhvVqSolh9ZivIXIlQzTSGS/zJg5X2+pHbNgFG19Mha7vOn5//p6fb3eHM21T2jWt6CGN96Ce82UycyOpYYsWslqAxvD707AZ+Z8DqtAHWETW7JQ9wxwRsHjUgBwf8QiKyzNtB8gEKxFtKaEgkwoB93IafU/0wJxeIY3CBxQklRGxDMWwRzMX9GMqpPk6Z58fQGPmgD7R1zPu5SsxxIIQGqL3wEV4iPQarb5qk9Eb/c6UxhU1ELsNIx2W/vVUWLr3LE4sW55DwSyosLB5b8a6w3UweYOwvP3X8fBrmfLIrbhYEuhnU+KL3CNJAK+B3QTtoA7golQFDrZ0aGpAyJ1Adxa3iSMlvJSOqeV5WhffTvYU1jzCDvuhTvL5rVtJU0bVFgP6MKrKjRBKK2YL+QWpD8o0mRyQKYnJ4oI9Adtg0N4RuUXBHcWWN1H5/aWAOPHDjNWbi/XKyZSLs8Lbgh/ePUVgGoRuBUm1dGduMIBUrBjaWpxBYF4kK7MB7T8ikm1rlPrWd7bCIbnLbM4UrngQstLTkvTWK0HehfsJ9vBuwyUjFcfRfl4fJ7bFD/DssV/6+ez7GifeyXO8h4DF26ma/ytjoDInypa3sZiH2V8DFNa8KbzgQ6LkJgZ9UCFVC8ihEvncJyKA+K0lZI5DjQ1Ft0JMDRpbPWdhJUFfs6QmOJrIo2wWwl7N0AgcEJBVAX+dClCgWFgeeWlyvvDpzQ3SUsH3pUBZqUB3UolWJff+V/y3MRq0Nciabc6no692pnUmHgTdfasP6zHgI3cRBA+pB/AWd6jfkxBkRlrDnmO+pmdyHrJRakkwKqTq2Cskkm5W7XMzFtqIw5e1Ra9434oNzLHXMKuB7mQ/b2mB5zf0TxjKLRMZOX2xTJTX0AmI5tz8KqjtGIpbaH8X2O8Gj2zyghOD1mn3+8oPoYbuX+It8HrjOaI2g9/H7b/Bpu/n7TdLEPse3srfDZzpjQb9557MT96zcTd2IT2A86dIBwCKwM+P3sJuhEholmFyRX2oS3CbHyXI9Nch0FG2fs5JC+hegNemJXGUJ2NsA5yfAYqbqKZGbofx+YKKvE9fC7VcAJMs6OKPITOTVVHN754l+sOpFvB8ozYRYw/yzF5wbIMy8tAg6FiiohLjWpVSz1vzHdPzR67s20qQzlo/WqimBpmT9b18q1k4VufBqjAhFoj1UHRcV/Qs9ZtEFtsXD6aQ3rOIxpt5PA+QxCyM4eOOv4cUs7FyIQt2Cr/iDAB74/ePnF1Eu3jHEOGg17Ng7KbM7jXktXbieaY9hmjwLNhX0BNH82vJu72GyPMdNmqV+PVkv6Ttb9/PAs7KZ9uuhaPnaem9i6MMCaoCfl0iEr+9Fi7Swme6j4n6H2iEH9XIujTVF+/OPpSW29G0A98v2Kr2HCCrMGT6ut1zN2wZbLtvUBudAcqRWaesi1uRW/cwVaEGvjUH8gcQRrYsS7JqwnZ0aYQkSZzROf1WJSbJ3E1RXC+4Pzi1nYRhwgbgtIYwtCU+KRe3UMvxdePklymOuLJdX+uZDz5FHdaV667pgxqqM2/3CSKc3rzi+CkdXKdhBLH5hqNFXWSLb0j4n0vqoX4RkwObX+csyvkD+PDNICfU3IwVTk2oJl+cm8JsDc6ePw8wMudYYr8uUnstRuv/6i1OAM1RaP/Xa6fsOXPzlK19o/GjVRnDpqz1lbp865BD8q8sWtvBcehgOULTU6Vfv45vf9dDEezzOkaRhaszgTC9QhjK/Ib86FI9voLYr3uSlz00qPAO1gdEtgtITnhGpBRbYtBL/tLNQy6QIRRuXaJJh7nVphxVNGm6VKOr8HLOXeBUamyXs2W+fcpTlMwBSQ60oXwp+rx1QfWOKd
gYITmW/DRZYT0IGSDkjwaldMlmMFMzW267x7+MQCQKZ7ZHigL1XjwCmQF2+kzeB6zUUSjkfuVpIe7+1lXIAPR2i/CJzYHYrweX/P6W3kAzRoTAXL+zhultgT7GoqObqEVif8/Ib4z4+huG+878vUeTk9A2yFndS8bSrDubo5xQVVf+N+4WqompQGv+q8jftZA+mE0HCCd7fCfM+ZQY2GpB7S6Xt5dVWAlYC/g9f/yprrY/wciysvwS+KVLMYNK4Wf2ijKhGVkU7Gpk7lbUCljo0NpgI+QQZUmXTUnUvlZvmkS8lIxUPVQD/9qcKVCmHnjGV9mzDffm2Iznoi7GEMv1RZ26U3Tai27JEIKgInASnvsk+gvBFexrNL2gWSJ0a/hNoFEOI0PZatXqVljETMqCGj165b37V4ZHT/QOwbLj4yOoExaS+BIBr8ua+zB8f/NdbLqHi6wx9kCTX93kv/06tva29tG5RWJq6kePUykSeXDHdfw6heDVV9ycxnvyc7BCu9cBGAYa+tjh02V/8U/dMOYC8+XHAzwxnCbCxVJ7Nsz5gAPs+VY+jLcm4dfW50HTbQK6PD0+wq/msvAc5OY4lismFofWijNQxvGc15FAlB7TbF+TV3uvDdFhUdKrFotbmygdhqUHUgi1z7f5HNRB+VX3GU+VY2QlWrt3jjb9XFfrfjjWbAWeXQfAANqzagmqOkY8Sp1hqc8U1WRgsy9k+7XvOhRxldOIaB99PF+Fjt11O/MMH8O3xIVP8vj00ynp7aAQbQLqdDwXXqmkfg24zz91fB8LBjDREjRTUhdi9t6i23MTE1KbOnZd3SpwPSpa1e9RbzfHaV0DtLO1apgiJ1YKL42UO6DKzFkOnWePuyhCMNN39tVGIp0DoBbuGKGOGBGrB2sk1R6UwSfWkL4EjWau/9/Vd+1WCi3bfs15vBI5PJIzLHJ4Iy1yznz9geXeW0fXrbZkbGOYoWqMmlWjPhLHjLIJt5JV42/FDZwtUVE5fhssQhGaqmRB1pZKzAr5VOg2Q5mUTZOKX/VIXJiirxcoqHVTjVfoS9FRScDFMwjAC/IR+t/32CY3VP8tNcG+qkB+vn4KQQ+YUMoVWnmENGoDgfDtgev3S9e9AVTggCC13iF/ocKhY0En/YoTtKHgG/1+D3imFcabhHmM0TrWlMMzNcJSAXkjNV/QtKmAGAylsH/uGQ4qfVa4nHWaSpqPZgZxtrJQy0eGqC601diEmIesyc13WHLDX2UcdITybs1wHtpnUICP5XqVLxdRyUa7VNkvSB2StnOPkfWijobF6YzNQzk1URbgsRLDEF3EGjQDp1E9jjNeCRozpKfxYkXmnh+yBM4TfYUKrUjVZBJ16USZaGaVBz8Lku5guxqa0YCP2+0wshdfl/5pyXhKtoBtbERR5AdLGTdt8M9GnmH04bvRjd/v0foCqzifMr5/TobazslJkp8qA8mon8CoigyCze+Iizoetl5x/dgzte2xqVqw8MXDPCHVcBQYfVW5W+Blyov3YVTaao/I1fRdCtWw4KrkI8vdzfXxFDdUvV5pfyFu41iEa1pu3zh6AhtuaFFlxqhy5FGtvda63HCqcSpUdrIfwWwLUEM3S6AIqC3unoJpuEsNPSz90TYkmyzvkOLe2aZty5PgPu+KK/c67Qywg/Kzc7alFA+27bFa08gJr6ppwDANVaSSEFc4X4Pr621bTCzPdBbzzZE2Yhz5sfO07++8gj/TIKCTn+AL/X2Y1vG5B7uKjbdwyskJkH8oaXjb+AxlcH2+sPDDN12U0zpwgIgiRJzk043GxPj0jcNqGR6aLdcJLuVOuOLIBmUjkC2YH//U40bfgWLH3Go7MDrz2YrxeaCh3VXY7pIRkPWX39M+z1PihpdD+Oweu7WqihJC9tsINO/TtVZ40VtpmqhqaVp3MUKRrYTGvVyNBgm8gEKOGQV9ek2u7LIJxQ2PwzQpA4024N6K7VzB+202zOesN9xZ/jnWCV3srW
tEle5haGy20TkP7kAY6QjgHu1G8pWeyvSfzABgHznMksMvB4cg/Nbm3jcz7mJoZDgY23iMwTgKxfXCvp9k9l9RxQ3XISxfeWWaVo/D1I74lX1mNkikv0TfX9bMa6suhZNL0msHT4kamT4ZW6gN3G+lepjlhpp58xpDTf9FGZq7Ia5W/GDH/vVGqBDQdonQNZYSQ9JL3hJoH7WxGAv2wJ8rwk9yrYFiN3Up1mCnPO1dySVAWuABDWJMGhUGSFF0Fcz3sRn8Z564ZVKOwRK5t31E3fEPPWMaWA54z2HWRVwtk4YWGSsEUZ5N7UwbGfqJUPT4v1JofoB+WY2KXv7hnTyAfdOlwl9zwd2cH28z76uz2s90x7sdSjqswR5ETM5c/roAnVEGB3hNtO9tsDbty3XeXlVx9+TKSXgnxNRSDuMMtXTJf0dWBjaTJIph5Pxr9xFA3oYGGGzQQLI/wB6K2l+3HOLvGeX5v+c3n19Ea+32+ZBEiYY5fYEaQZ/3hVMjuiqo9lWExG9YKOERrqFrrye4OzAxJGYl3IvMoUBhuDE3Llwwv3QJKd3Tf7cgA66fd/k4NoPtifVTSH9xAVzTAJm6QOfD6kxT/xCUx/fhYYYRV26TW8j2m13J9wAagFtbg0s+mENvFEpTlJYFZZ9yWF9zbJFO3fDjCLOc6Z2BbcB1QX2juYPb6SAx652T7+78U3G5wBknQBx43gKmQWH5+AZj9lmfw8lj4WduJQG0h7Bu3iwuKg5Zj6HPvLyOeX7jAemvkw+m8diA9xfs3sLqiwsGt+HUG++OgNZ8vH8Nzj7ogH2iu+FAwSd/1f74ULyWRMyP/I1A8X72yxtz1GzLLnBpyPCZpvc331boPIO+UZvtp6tU/1pCDleMKNK080d6NIKdQNLEDtZfpTxdu3wKEQeSgeeakF7dDBeAXlkHnDQzPDM4arD5IHJpGJMGkm1AqZDcf+/apb8FFAXo+Escv7/7++c/SkK+2g262JrfUy98zVwhgahErXjcvd8mVessu1Uxif9Wz4yrXXfogssJZ5eXyugB7Tzc21oABvnMwk69FbjWgA5I2Vkc7qMQ2hIH8WsRudu/9BhyO0p+Z+duwpeWSaX+s39/naPqhvzUy+XJavqagB61h+/HQ9P9PqHexIr1QZ6bDQdSTfzttx3d9/HV7OOj/pdDuWM6iQLnq+tBJ0wMtsYDMsSDmCYyNXy7/YfhCQvz0m1v3coQIgcGq+zVq6QJRcPb9rpnK41Pcg96knzo3QYtuP75GkL7ay19qe7KU8WZc1qho2tLetQrpT1qd4UXnEKTyegrh4FKiiU40ZVO1VlLtcIOJcCdbMF5VlgZCsM2MCcP33v/e6KePH4L5X2KNJzhn8DMz26ieJJ981+e0Hq3090obnriR6Hit+vQdNYyoI+qSSy974zFPjgPAVinZOD4oVFUchdCoVI61qwcUonbD2jit4CoNFOoe/nKl893z7/7jk/vxOmnlejnL9AvbRJ9MeL+CWQ0g6KVJAT6oT01OHkQOAFO76RYFo7lLdUUJ23fNg2+fegEbydvZ30Q3y+0jiDMIVx1+9WNHsPjb/5XScZvMTuRS7LhACnG4gNv3pXB77iBaeOoQ5uDo8hHHA/8o+ovREmiMo7F8ka2pWfE3x7DV+P7/SaXC34dyOOYgDaShl4sgaxk8xSwT5MJ3nBM9Qh9jTef8hUO51mXvASlZCqaPsksZHJkdE9W4mzJLdzpNJdD9tYZWxqlRWex/N5Vj+/o9utP8gKVf5HrX54x+gCvMd/xeQNrN49TiLUt4aqEUOsWpdOsz/fbWLTM4LE8dIgUiEGWvRY7Ic1f2oPjJu3I4HjUY9itvvPHtJopgeEho1U+E2j+H9PNg5kQqW1nBX6Ls+J6v7yMbU9nOVDPj9rRM6fJWuTVYZxtEeNDudYijTUDwoui2LQ3akvzOWVtlKSrh3SoDyDgzCqwCsyyPT/wG+7TCnRjyQTe4P2+r1YYiI+rSK
WQnD+v/Xj/f7HXSt39r//I/TBwPKclckm1SaILxJJSiowl+U5gUx/XYCbI74NtPW/1KUBmKFMDJA7z7KwEDzL+p0unJ+gbtnH71xgBjBjzD4BrMnH2xfOnOwL86MCvEmx/uGeGlexy7ysc3DChzY4zwCvc7zekgWyq6kQOP5zFn+4eD5jHH8PXld+kAZIksTksTv53RN8bpgnhSPk3zsVoiKourgX6+iRl2FBdEQdVyB+1bOe0Lbsc522ufwiUoLltyBMSR7iBJU1001I4Z03FkaWDSU3WHuv7/etjAtfGb6W8ROZfwt4vM7+H/f7ET90beZSG3w46kX7K3BkifKGUR0rF5XB7NuHgXSVluhHp7urXVfYQQn8wuIPNGcZ1qcddS9Wvd+2/3lp8CT9vS+b7duC7U7Wiiy1RUA8vOfMvzNXKJNhWLakehIe+hjADe48qamUbJIfDUSccGQnZ4B9i0z7OW5XKf0oU1HplIyZMTna4J/8hn3vBOhFfXfjGSWTnwZ0F0H0VYJAYtyDHslfL7v25n27YY7axhEv6NxphbZY+qgCaw21b3fM376nJtiORgMAvzzQj8KkrpD6ovJQODU4sF7KRT5DSiVUplNOxnCqqVjjqxfFlXYejC63Uno0A3JzFx24FYUEgDXQzxtk8adLt0cve/oME9Z/5fidE/5zhV8/m77pUlRJWX9diUOHqDolpyyERNYtrH2aby9Vj+4C5wBPo+yBqBit0icvh1xHOLJY94PWgGW7Y0H9blU79WUVRWCTRiZ0yL5RxbpFr7qq04h3yeiR2F5vBMII+8QeFs/3ukDQvRysKqQEBqLhKuZCiYRaCsuAOcDDqu/aNkwLX7POulgf1D5nvdzTd9AjMPyGxWf/8Bvz0pdbngfWru64i5Pamo2/hEW01dvIeH4gsjB8boflv6OxFkufyDd5EPq+OgNPQsiy0QIHicH1LTYkttM94a10T1CsOFe8NbnKts0bHgKoh20GcXYY1KvqTC6DCS4dMDFkiLg8lKu2DlRnN0r2b3cnLenYf84O4Jfm4iYTcZv29XXXwiwwdqOTUMi2LqM89zC2nSi7RJNPZJZbdOZdIpfIC8o+e5BAaH447cI79oILPLSNk8GYH0HEDNzxT49jpYamzSAiMlbPzr/vsmX9V8NwjMq/TSCHfAlA6aTamPvTUjhnrsk04aQbld7KVJBjOfz8VOPrvSp/pXf+ZJVfYwHals4d5W8VlFfLr3egjV88XZ9N+dfh33XL4e4ToW6n5CnG2TH5//fQwxl+NMBaTn+QCN7gFQn8bMPvQk7BqzXcUeJd2yxkF4MhRj8zOdOYnfVijoIkhs44RkeMytE1RHG8xu4ZnPtPU1iiHYsuZl8+Fc1R2Lc8skUU9y9id7Rfrn/37+f7XZp2Fsx7+ycnccVSBLdhTo5CxWskBUmuxFk0LYaUzFKtVlmQZPGPYRTfYpHDqAJdRrCJUvvNOIhig5Jrj7R0wKxaNAxA6gxx4pL1Ow/Nr+qtMT6vrL+8kR6Hbib/OOvKtFpj3PA8P6TuLiNbzPh0Db1niEnqFRfSaZqxvtkEbCMG87whjMW3VDhyrzu0/Fd/pqj39G3iPNy52bG2XrJ8I/4JgD//VG3vTdUZztn7wBgSITUVi1gcyXYA1S7Fdc6A9V7CIjomE0EXlaJRo3wBOPp9E5UhtnQLS6/JLaQC0EO4EA+QOzuQ4ukE4mhGK0uT8u1lSWKl1z0np5dfU6Z+4F55fqYCBz1zR9MC0vqxJ0EOFM4uyZkaeColpHGkaUMkVWcsNg5+C7ZsvmjTu8gmZmsX3vvVAYiOhQTE5OgRh1HKzByNjW37Ec8HAdcw7K55e3Qg+zFfg1DUbZPIZNpqDKBB8WNGB8FePT2rwWC8UoQ5v9aA1sW2T9drM5s2U8Vufysz8Kz6eGac/vwzdj0oABV9+yAMxsjYPppzsV2j4ReFw5JZybO5eG0EOYmRwH1bph5auLOa5ZfCskyojHX
/nEb2f2mpWBUSLjlMWZBaUuYOOY4oZ0TNRbLVTKqWzpbCLOTn2RdkRjmetMdJEuS2W9mTxw4vn+zlMPqPeI0ZtB6YpMoZWIypS91cbMmPrsEtwez02NST+IofkGFCjFlH8dxzTgFNz3Xgy3TeBNZCOcxo5a5BjFxOPL83U+n2tF1pDCJ00PWs27M33Ra+QM674risYYojzJILmdZtlN2fyTXEODViSeUvJe7gYI7lBGIWPBz5JPXvtSIFw7+/3KJf+egPQgifnuZ47GpZNL3CCfNnOtpLxNcQLfF+PcTpN5leHHFZRFXa6qSxydaFcUCuFdRgCRISAwmEnZHGuIFMpfWCcoE6uKiNYaiDYiZKMJx2VXEyanyzUIVHFQ98ZpqCwWvnXxUlm0THagMWPfyH8Qq82JPwKsUHYqT8I/nRJraycmHqDGYwCQI8imBLqxj9tFQpkFXzp218oZ+8k2sWG0jsNugTRfBEeQt0XER8ykYMnMb4bGxZJNFcXeq1W4zV/G9oql6i6BkP+BuQ7T1/16j6rEaS4bbpvLL4FIfLrdM/mEVFPNdX6S45iaKzUxcZGA0EkOG0elIPwir9my+PtB+7SRt5Cn7+O2Lz4hfD5DR6sebL/A0G4onnuCoIMgAz6rctmIJUgonz8h0ed3xs9aQ11TpcazDeHBxHHTjrrm/kQsXx8OmJCilBWitikqLalm0IIgdodEvhc1Ith4b48SIMWcYx0IwimHnsgFjc3RS7dlYoWQc35ZhF0B4Nz7gefHX2F8JKkqDSAc8wUbdz4e0wgUbVYoABNwaRuocbtcR1jjaGHK6GxVrYUv7ab25nsyTOoMUou5bjPGtAFBPtJJdHzBebL7NwIBGY0aPnWe/luvj8BhOFIvSB4beZnPuiocRVNCfwRQ1Q4GhPs2/cQVDfE/vHy7VUIUGyhgwnPu3F8g52beC13+lcV/g5zrYMvFoQFEF3Ta4CiJYiG3LN9bwLwGa6jVfajNWyrlMfPmTPomlVQk5YK7XhAsSjHylK4jGe6Kujr24aT8sQILldyanLetsFGG039nZXQbHLmVJzr/BkWXqrYMJ/yFl08a6ueKlq1G7+zLo83pFBSc8pNqTJTeh7Ng/uOacWhzqnXO0petgKNTOP7ipIo6N7gfsTSLevyIheWEKlluWYhyQgmDM0TF90ij1stplofHm8GgdCuGTgbqP79HlzIKPmAanwa9risFuSeANrKtMcwlhyIJ06SiPj9vcm8l5GHewzj5L2lO9lX7C4sY53Ox40ERb8PyX1ecRFXCLRhzFAasrMeizibTXmKvRRYPeXjpnOyogqtNUvHtwWwRsHwweOHVu2tPqZP1ZM23R4194iIK7reNsfBF2xL4WSjuIw++1HCqd9goR6zTtWtPAe69BvgK0POyhtVoqXUohYTLITvBI6PSaOECcfz2YrcxpeOTMCB9GwoMOK1E72aAfXPGRu8VKQ4d03gKOXG4QMkj9cxzRGBAUpIOYKra+WkwzQuW/ekhcePKy78UNbIbtoR40SCuLndACHt+1hH5q6+ftER+Qle856hu7o+qMzYMw8nw2Xf9/r6NbMEjYd5hBdaZn0B/DL3UT/i1v6Gv3iLeJotzzH+U5hwOIQCYmbemCZt3RCC+zo4sf3zvN2gN6ZJFaFrHpEX8oJzUUcnK2EhDH7GI7xB8S6CKHzX0FDeWz2RszRg23FTbRXON9LtlgKvb8Ni05xl0nVhVJSEcF5DqQNFM819EduMYpunhT7hlxOggcbWowV+ddy3oObJrn0nsHNOOyEMgjeOOFKt54/X9ZMCSDOmYD4jS7t072kWnsyKa49VOSXSvT6AUFaBrz2+zW7esEGMKUDEf+dkfbi01DyDje9Be8ginAfdgZQPrTp+kb337r3zIdYx/ySA+AH/JTVN7TIt2b3lOvMWSvlAjeMpMZ2HU8DaiGIrje5659Zr96omOSZiVoIFmBmoxTtNu9ydidJRGL2ELbanHbYsEC
WFMv7bvDhxLoKb6GXwZu5PlyLoX6S9JjybTpJUGLpprYEGcaHRyBHl0s9SG7/3aIG8crYVxVpS8CW7t+k2rSqMXHi9Rm3gszaLV9WCdqqxbBRUYWjU6m2jF1IKg/OR5Dqr8IsrNlUnZobjpQML24+Fb8VGXUX7Jy6CZTln9PTl86syQTYuC20xTUTz41QPcyGRnpj1aLROW0tsbIF8WjQNFrQRQkt8R+KxyDtcP143F//hPefYTaLT/8yjW/fS2coqRhpOdCaRGs90AER4b+cuz9nXvq4AZmY7gvFdqIRnl7wRLUEelFoxAbpq39AfKWmlIJWFOsXVA/QfhPRGNXVRtFQlegsss8EKu6iTM+PzBo9ptrhD40ySpJW1eKGleoF5imokVhajNPcyVKLI/QO3JJkuPZa7nlHkXaPLB/AG+Um0xuHAkvhUxFAbXzjHWWhUEdU8r63704aeUHD2mwuDgRR0wJbnTeQkR6i56G3kOy6lCoVxt4NA6OODWGyxzHilbsdshzvszss0XQ7BngBJatJSkSrnkmIm3+Ehw67hPdggt5+A6N3OMFIDUy/rPyalK+Z7eRfh4tB1YNhiyInHSolLj3Ic+2PI831YXfwAEyyC6wbp5p3XQCC1zfK6EfNcbGrA+g4XevxgHs91A5GquhzlOVplxyZpH5SpFEaBs1c0AOXyhWuMXPASW4MbxT4BitxZkIBfchv1qhLYpY2KsHMoIV6HyalC9Hj8ZrtCXhGvVIO9JfJ8dwagOlXPizfbuyKUaf3IM5MILSOoUt3U7gLNE7D0ccd2sqy52WCeOasyRErjc4IyMr0FsK7CL3iYwsfYVqiAJ5I/Wih4Pv7djaoNXh9fMzvZdy22fBOwEdpmDOuasyffY6mHm84A9BKLX2V0V5d3t5pYzYxqi33XmPw+XoisD8FpvNsrudnOJwVFjcOD9cSApKMURCnA75GEkt9x/igGFWPEVkuZXAiZVHgA89tvMryeTTskoj00ZsbKdvMgoiFbuNkQ5HfH06v7CQX2DBlJfFvuvJaHlPOk1FEa9CIQAh1HaZBZPDRR75V8CxssjcdMaVriYfUlr/fJoo39awhAfifh9t1nm9ZXUZFgKMasF3AH4bDZ1E/9OLLTttt1Ym/vsQuLCXrFIsauZZp+u/iwYip/iSKSXbtn4h0yWVLzsjqzZ7ivJfPxNCqOxr+KAJ8m8CgMyVIveSyxMUdf9k1oMEULfwPib1eXuRBrYIUQwRZfXTF662fSUjeDEeNxMd54X0c3VZgHY2qMxuu0HpNd0Gzjfr+JhryZ/znG+tbnVf3o0JaiTmlaO3PGNFmjZJJJk6sWHu/e6Kw0agy28hTvUEF1KW1Dl9mzmWlapZhqvGEW/Ca36JNO0uA4Ggf9q+Ci1T2gfVcF8Q3izWV9oKGpJcvH9L4ODBBJwVsetoMDssAo8XEqHAl+FncG7dA7DIP9eIF6dIkiWTM6dD4pJmOWLVgZpWtkq8bU5BmkBZ8czVeGXZjwyjUK/SxNiwKYg6nCKS8lnwZRZkiRR+6uzp759xmBTyhm+bbJhQ5CWeITgThet39e+K+M0ml9eF1hvmtJgsz9eyTIx6HgWMtZnn3zHCFZK6oZAtn9ZHKXdH9DKtP3097ZVnd4TlLI1mBynYjNYtqKpcSCvERIkCH3Aqq5m0iRTHHCHdoarXfXQxENxbJ5t7q07W5LrQGx9M0+Rro68gb12UiMCPvfZUvwYet8D42RWKQJGLq3O8Hfo9W3kMBgJbVsiDeU5RfJR1XRB8HMvzhjtK9d4CJ++HhrIXjPL3Pu1zSbD3GJkm50Gd2yQuENEHXsE/9LhC+7FPp0upVyKehrSa4xQohR9S6EbziI4QQ2J8OdnfEgGfLDpFBy67SqzzSKHhTbRWSk5LWq5oztvARPMeptpm78DdrSxv0meFwIuL7wKjAhUoyOtJgFVmVnrXgwsMrF0BFiiQ6XMm8alPvTvFq73oYD4G06lnbEYyVozq
4dc3XNHIjXly0q+9ELbAEUIc2ZbuSemklNSisJq382ZeXJaiW3gXDTViOVsSqQr/n/sBuRdbrc/Vo/XvmNzaOqhk6J1HOfoOk4k0T+6ZL7wrs3VH8/bOrbADYRk5T9IFY3aJ/32sTpbJIRKv0G9eULAGNnpIGuglDb6ny9Hf0LT+yZ54yehhMwGVEtGY/FO1AllByr1DpQgI6Hba6Z1RdTG0nHyambe7K8jqEztU0c7LoK0LOgo2xjJknq6tKmZnLDmJMGxO43QADEG0H4Xru94DeJICi4+0CcT6ZFrVUTy6y+sAqSd/kHK1EgKL6CJf4hB/Wjytj+ZrXym7Mdb85TEmmsdMcckWg5hFGK+UynFzzYyxqOLpxks+r86y7ZguW1pl7f4+WsBpRLnoauHMaybWMMU16FAfpZVubVjV6diZ6j43o3Y0CwgLQoDpSJGIxpaj+xXi7gQOH1ve9rdH8Ge98dByThmzaJpJ5WGRag1wp3Yjq3MY48vL+1PpoWu3TEVySLIuYWTmD/9ranAZi1J2Qoq1gw1gFXaKfQE4mCnLF9NhzbVKmFBQ9uUbU64sLHtM0PsQjN5qgQQwB7hi4rALghoGYeKwZ8ReX89j/lMuLFREaPowPbZd9PcC2xVZucpRX8xA6IwE48/h7e95+qy9pfbf+LQL+/0uq3GIfm+mfpvHsRS7P8lIRy0xidThIYhDlH882CbjIJpUpBkc7CmuJiEFuq0xYn44QClBY2mq6WPeIvdGsjlJhi81M4PvU3noFxocRqWKk/jCB0YB8mfr3e39hmbIxliormlsc1R5xvCD/cfidew/d3mllduFtYAJnNcNA+0MgZegaFP1HmZgIWBwfeRtatxGFRzYiCd0P9zgaubTF6URbPWlDSfeg0fZFkLGjVwFCyqw8MDZvg190/Bnc2NXSPQmhjSN8Qwd9zrZsaec9HwDz236Vwi8gf4HVPY/+6shn5eM4ng69P9+vO8YYtKq3VbZpfrksIZZ369SahU22MSCH9PDuShDz/V7xu4TmcfpT71568yarHaEjbx54F+1PaGihozz/vywYYVUxnGn0Cqlc3Plf5MJJ0qaTBkZVTt+OlqPmYiik9qPZN8ohWHYByNyXQzpWxeVjXen+t1BJXfmP6VV2uW7r9p7orOs+/F82/5SIgRuJQylAyySLaSKFruwf6SaJBslP+AuMOFrWXokC4mVxOIaZQ6nXbnN3IdG58C0WWS3KiekbR5daC9J5fJjdPsRK39RMeBDHRWTJmiqI2SkPijQaMHbZwhURuQAb0hdtp81YQe3eYckBBcHdY+msOl1RxHae9iPfnuIv1RX77B1V1R1ncLj0cX2fd+vPunLnuws93uw00O8s+MWuLi7miJgqOmm+oZCqqjlvfyvPNh16NZNpTdU8WB39Fxng4h6jxvVYyC4NjAI9hBrvlQRANFbMC/Gmb38WTGft3f+jMe/gQGuZD9zUaZLMLq0ZpNyJw3d5+Ib86I5j4pa88PpHYwMsT9GN7IMrNHHlbJvLuQcQOgj6Co3rTrB8N/YnUXoJGT8ABVMBXME8Fxh/+DKK++n0XWGeW2BGz9sNO/QsGceUArFyLONapgh0kFrDzzgN3iU7y4oNiwmqeSkyHYwnC1ficx/1FvtEnWc//hK7oOTuBBodZEvkZaZG2Hreil4CVLLj/pmB9t/l138FZsI3vmvzHZXaBtrY7Vc9Tho3HqM0YwnKldkkU47JBvjgWINGSJPUuYZ/30HEI3tBnUqYbXh6f97wo0TP6eBjBHDZ9IaEMzWye2xv6zNGMNLxZHjxCaPjfKBIXomfxERmZVV40S44SEDpKH1b/2nt8d/ydGW3ru8tEelii8CG/bodJHKydwGVf47mftkN6bTj/jCKBDEbpZGSkPst9JFUQfB4TW8rDnttkCzt+9PII9sj4Or9DV0/+nGIa5RLG9R6f0ifGssbZtEMpsLUVigEyXQgEB56rx4F3rE6DF+DLM3rpT/
iHJgjx3XG73b9r2/hVrokEHu49QATv/t8Ar048KiqN0aaQeoczs89/dUFO5LOdDog0VEoWYxNrbQsDwVuKVcUlRUgT3ZoHOR1iQqaNywk6Y7U+r0nrMkilefOa1/CyQMUs3ZXuJaGzghZmIfeK4jnMdxdv/rwqV3AFveHPU6gPuM1LqoCE+9Kr7PFP0P7GCHcs3iopWj9FCJbfQMB9H9RNdIagbsH6xQTGx58LSR9ELPbgAbwLhakd7FlNk3ksDfYa+Oa7Xl+YLdD95jKB1mvVZU1ePfIs+FBlrGSYNTl975kW14kVjOezzEv1S7Wiw/oGjEnv/7r/9unz4DNWW07WEeB3hw+wD50s2z/ThBEIzXPFT8nRefEkdQD9T0+IfJ2KIdwPJpBCzuXCyHoNYy2Yh1zJFS/xtkNBkUvZSiEByTIpo5IzoNI9rPGuAba+Ps+CfjiOEAJpQUgudyqAxzvmIBe0CdJmIIMlrstIrjbNqfE23Q4LcxZxOcqMhR1DUiyKn7gQElZJO0LdgMFXEM96iMqetz72RbhiOdOp2MyWz+s2DbiN1r3X6hFnu6VzLdECmaXSaU1UBvvBAtcLer/TrpC9a1K0R06Zl/LltbZX2LlaUVwJdyb2w7g+J7yjqV3MXGIZolth+edNyEaze9Gn7Ncs4cpFR3gc1KyMyK4uLPcT1CfwXVRJ2AGNuCHhqLqydWJhPI1Z2eLsZkCsVzqbhu9SKDohTZB5nMajqlRXXYV9MirMLjjT07S+bMddSa0hzeY33FaHKjc8jp/zKY22qzG2Et8bJJqdsd1JVlCarECVh1YKwcBjw2ehi157/JQRyu7YA1q7eZru5s0Yf4p9KyH6+nvskTNGPSW3cGyalMM22qs32Sm4XxAtcwr5rey4fkVGQ1WHrWNrdiuSMlVSZTUFF3JWWtAcRStUAqL7MigTsnt3pQWDaMlUFXI6pgIfHP2TupzSEDXcvz3OE1DyE6fJSfJ6zxqhGMbuEM9E7Pajl8R9aWFKf/03Lv9CDEnVE0MWzokXI+7BedKzKbSy9t50Ue0cgaByL140NYqXPNoa4W+WpHYq3qIYr6pJRYqErjTnmdJuaoDGWffADREoc66UEwUKSG4lFZlEieYekn/dT8AJvXaTT0FzRYRY4OMu9+Y5KMg5K5c8EbPMaVtfgIM3E0PqnBTQptVy37qiB8LV81ao4794xfyQ1B1s+M9ZRAsGs9jgQDPy9z1aH+ZriLiGp94PruWdxt7MjmH+5230/AwFsOfzmp//Az+z+Yr6vZfGeM779f9cgrn/gZnumbGhy9f52arAv1/4f8Rby/X+zvV3AYGhv6+PKlvLf7eAiL9rZV4V5fr/XYyXvwvFf2/+TtDfn/yFq5i8bf/75TwM6//5njDHY6kNWf7+xP8C</diagram></mxfile>
2102.04362/main_diagram/main_diagram.pdf ADDED
Binary file (91.1 kB). View file
 
2102.04362/paper_text/intro_method.md ADDED
@@ -0,0 +1,236 @@
+ # Introduction
+
+ {\it He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me.} - Thomas Jefferson
+
+ Intellectual Property (IP) refers to the protection of creations of the mind, which have both a moral and a commercial value. IP is protected under legal frameworks in the form of, \eg patents, copyright, and trademarks, which enable inventors to earn recognition or financial benefit from their inventions. Ever since Machine Learning as a Service emerged as a viable business that utilizes deep learning (DL) models to generate revenue, different effective methods to prove the ownership of DL models have been studied and demonstrated . The application domains demonstrated in these pioneering works, however, are invariably limited to Convolutional Neural Networks (CNNs) for classification tasks. To the best of our knowledge, protection for another prominent class of DL models, \ie Generative Adversarial Networks (GANs), which create plausible, realistic photographs, is missing altogether and therefore urgently needed.
+
+ [t]
+ \centering
+ \includegraphics[keepaspectratio=true, scale=0.28]{proposed.pdf}
+ \caption{Overview of our proposed GANs protection framework in the black-box setting. The idea is that when a {\it trigger} $x_w$ is given as input, a watermarked image (\eg with a hexagon as the watermark) is synthesized to claim the ownership. The black area in the {\it trigger} noise ($f:z \to x_w$) indicates masked values (see Sec. , Eq. ).
+ }
+ \vspace{-10pt}
+
+ Generally, a common approach to deep neural network IP protection is based on digital watermark embedding methods, which can be categorized into two schools: i) black-box trigger-set based solutions ; and ii) white-box feature-based methods . The principle of digital watermarking is to embed identification information (\ie a digital watermark) into the network parameters without affecting the performance of the original DL model. In the former, the watermark is embedded in the input-output behavior of the model. The set of inputs used to trigger that behavior is called the {\it trigger set}. The non-triviality of ownership of a watermarked model rests on the extremely small probability that any other model exhibits the same behavior. In the latter, the watermark is embedded in the static content of CNNs (\ie weight matrices) with a transformation matrix. Ownership is verified by detecting the embedded watermarks.
+
+ For the verification process, a suspicious online model is first queried remotely through API calls using specific input keys that were initially selected to {\it trigger} the watermark information. As such, this is a {\it black-box verification} where a final model prediction (\eg an image classification result) is obtained. This initial step is usually performed to collect evidence so that an owner can identify a suspected party who used (\ie infringed) his/her models illegally. Once the owner has sufficient evidence, a second verification process extracts the watermark from the suspected model and checks whether the watermark originated from the owner. This is a {\it white-box verification}, meaning the owner needs physical access to the model, and this second step usually goes through law enforcement.
+
+ To date, both black-box and white-box schemes have been successfully demonstrated for CNNs ; however, it remains an open question how to apply these protection mechanisms to important GANs variants (see for a survey). We believe, intuitively, that the lack of protection can be i) partially ascribed to the large variety of GANs application domains, for which embedding watermarks through appropriate regularization terms is challenging, and ii) attributed to the fact that directly applying the popular CNN-based watermarking approach (\ie Uchida \etal ) to GANs has limitations under ambiguity attacks, as shown in Table . There, the ownership is in doubt as indicated by the BER results\footnote{In general, the bit-error rate (BER) measures how much the watermark deviates. BER=0 implies that the extracted watermark is exactly the same as the original, so ownership is claimed.} (\ie both the original $\vect{b}$ and forged $\vect{b'}$ watermarks are detected).
+
+ [t]
+ \centering
+ \resizebox{0.5\linewidth}{!}{
+ \begin{tabular}{l|c}
+ Trained Model & BER \\ \hline
+ DCGAN with $\vect{X}$ and $\vect{b}$ & 0.00 \\
+ DCGAN with $\vect{X'}$ and $\vect{b'}$ & 0.00 \\ \hline
+ SRGAN with $\vect{X}$ and $\vect{b}$ & 0.00 \\
+ SRGAN with $\vect{X'}$ and $\vect{b'}$ & 0.00
+ \end{tabular}
+ }
+ \caption{Top row - Bit-error rate (BER) of the trained model using the Uchida \etal method [{\color{green}1}]. Bottom row - BER of the model using the counterfeit watermark $\vect{b'}$ and optimized transformation matrix $\vect{X'}$. DCGAN is trained on the CIFAR10 dataset while SRGAN is trained on the DIV2K dataset.}
+
+ \vspace{-12pt}
+
+ Thus, we are motivated to present a complete IP protection framework for GANs, as illustrated in Fig. . The contributions are twofold: i) we put forth a general IPR protection formulation with a novel regularization term $\wm{\mathcal{L}}$ (Eq. ) that generalizes to all GANs variants; and ii) we propose a novel and complete ownership verification method for different GANs variants (\ie DCGAN, SRGAN and CycleGAN). Extensive experiments show that ownership verification in both white-box and black-box settings is effective without compromising the performance of the original tasks (see Table , , and Fig. ). At the same time, we tested the proposed method under both removal and ambiguity attack scenarios (see Table - and Fig. -).
+
+ # Method
+
+ A GAN consists of two networks: a generative network $G$ that learns the training data distribution, and a discriminative network $D$ that distinguishes between synthesized and real samples . This paper proposes a simple yet complete protection framework (black-box and white-box) by embedding the ownership information into the generator $G$ with a novel regularization term. Briefly, in the black-box scenario, we propose a reconstructive regularization that allows the generator to embed a unique watermark at an assigned location of the synthesized image when given a trigger input (see Fig. ). In the white-box scenario, we adopt and modify the sign loss in , which enforces the scaling factor $\gamma$ in the normalization layers to take either positive or negative values. With this, the signs of $\gamma$ can be transformed into binary sequences that carry meaningful information.
+
+ For this work, we demonstrate on three GANs variants, namely DCGAN , SRGAN and CycleGAN , to present the flexibility of our proposed framework. With trivial modifications\footnote{please refer to the Appendix II for a proof}, our method can easily be extended to other deep generative models $X$, \ie VAE, as long as $X$ outputs an image given a vector or image as input.
+
+ [t]
+ \centering
+ \vspace{+5pt}
+ \resizebox{\linewidth}{!}{
+ \begin{tabular}{c|c|c|ccc|cc|c}
+ \multirow{2}{*}{Generator} & \multirow{2}{*}{Loss} & \multirow{2}{*}{Input} & \multicolumn{3}{c|}{Black-Box} & \multicolumn{2}{c|}{White-Box} & \multirow{2}{*}{Overall Loss} \\ \cline{4-6} \cline{7-8}
+ & & & Trigger & Target & Loss & Norm Type & Loss & \\
+ \hline \hline \\
+ DCGAN & $\mathcal{L}_{\text{DC}}$ [Eq. 5] & $\vect{z}\sim\mathcal{N}(0,1)$ & $f(\vect{z})$ [Eq. 1] & $g(G(\vect{z}))$ [Eq. 2] & $\wm{\mathcal{L}}$ [Eq. 3] & BatchNorm & $\mathcal{L}_s$ [Eq. 11] & $\mathcal{L}_{\text{DC}} + \lambda\wm{\mathcal{L}} + \mathcal{L}_s$ \\[10pt]
+ SRGAN & $\mathcal{L}_{\text{SR}}$ [Eq. 8] & $\vect{x}\sim p_{\text{data}}(\vect{x})$ & $h(\vect{x})$ [Eq. 6] & $g(G(\vect{x}))$ [Eq. 2] & $\wm{\mathcal{L}}$ [Eq. 3] & BatchNorm & $\mathcal{L}_s$ [Eq. 11] & $\mathcal{L}_{\text{SR}} + \lambda\wm{\mathcal{L}} + \mathcal{L}_s$ \\[10pt]
+ CycleGAN & $\mathcal{L}_{\text{C}}$ [Eq. 10] & $\vect{x}\sim p_{\text{data}}(\vect{x})$ & $h(\vect{x})$ [Eq. 6] & $g(G(\vect{x}))$ [Eq. 2] & $\wm{\mathcal{L}}$ [Eq. 3] & InstanceNorm & $\mathcal{L}_s$ [Eq. 11] & $\mathcal{L}_{\text{C}} + \lambda\wm{\mathcal{L}} + \mathcal{L}_s$ \\
+ \hline
+ \end{tabular}
+ }
+ \caption{Summary of our proposed implementation to protect the IPR of GANs models. Note that the equation numbers herein refer to the main paper.}
+
+ In general, we propose a reconstructive regularization that instructs the generator $G$ to map a {\it trigger} input to a specific output. Herein, the challenge is defining an appropriate transformation function to ensure that the distribution of the {\it trigger set} is distinct from the actual data. In GANs, since the generator $G$ always outputs (synthesizes) an image, the specific output will be a watermark-based image, since the watermark (\eg a company's logo) holds unambiguous visual information from which ownership is straightforward to verify. The detailed implementations are described below:
+
+ Technically, the input to DCGAN is a latent vector randomly sampled from a standard normal distribution, $\vect{z} \sim \mathcal{N}(0, 1)$. Hence, we define a new input transformation function $f$ that maps the latent vector to a {\it trigger} latent vector ($f:\vect{z}\mapsto \wm{\vect{x}}$) as follows:
+
+ f(\vect{z})=\vect{z}\odot\vect{b} + c(1 - \vect{b})
+ \quad \text{and} \quad
+ \vect{b}\in\left\{0, 1\right\}^{D(\vect{z})}
+
+ \noindent Intuitively, Eq. masks $n$ values of the latent vector $\vect{z}$ to a constant value $c$, where the positions of the $n$ masked values are determined by the bitmask $\vect{b}$ and $D$ is the dimension.
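As an illustrative sketch (not the paper's code), the input transformation above can be written in a few lines of NumPy; the latent dimension of 128, the mask pattern, and the constant $c=-10$ below are hypothetical choices:

```python
import numpy as np

def make_trigger(z, b, c=-10.0):
    # f(z) = z * b + c * (1 - b): entries where the bitmask b is 0
    # are overwritten with the constant c; the rest keep z's values.
    z, b = np.asarray(z, float), np.asarray(b, float)
    return z * b + c * (1.0 - b)

rng = np.random.default_rng(0)
z = rng.standard_normal(128)   # z ~ N(0, 1), D(z) = 128 (illustrative)
b = np.ones(128)
b[:32] = 0                     # mask the first 32 positions (illustrative)
x_w = make_trigger(z, b)       # the trigger latent vector
```

Only the masked positions change, so the trigger distribution is easily made distinct from the training distribution while the remaining coordinates stay Gaussian.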
+
+ Then, in order to transform the generator output to a specific target, we define a new output transformation function $g:G(\vect{z})\mapsto \wm{\vect{y}}$ that applies a unique watermark to the generator output. The equation can be pictorially represented as:
+
+ \includegraphics[valign=c, keepaspectratio=true, height=15pt]{eq-g-output-eps-converted-to.pdf} = g\left(
+ \includegraphics[valign=c, keepaspectratio=true, height=15pt]{eq-g-input-eps-converted-to.pdf},
+ \includegraphics[valign=c, keepaspectratio=true, height=15pt]{watermark-b-eps-converted-to.pdf}
+ \right)
+
+ After defining both the input and output transformation functions, we define the reconstructive regularization derived from the structural similarity (SSIM) index, which measures the perceived quality between two images. Since the range of SSIM is $[0, 1]$, we define the regularization to be optimized as:
+
+ \wm{\mathcal{L}}\left(\wm{\vect{x}}, \wm{\vect{y}}\right)=1 - \text{SSIM}\left(G(\wm{\vect{x}}), \wm{\vect{y}}\right)
+
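For intuition, the regularization can be sketched as below; the helper uses a simplified single-window SSIM computed over the whole image (the standard SSIM averages local windowed statistics), so treat it as illustrative rather than the paper's exact measure:

```python
import numpy as np

def ssim_global(x, y, c1=0.01 ** 2, c2=0.03 ** 2):
    # Simplified single-window SSIM for images scaled to [0, 1];
    # the standard SSIM averages this statistic over local windows.
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

def reconstructive_loss(g_out, target):
    # L_wm = 1 - SSIM(G(x_wm), y_wm): zero when the generator
    # reproduces the target watermarked image exactly.
    return 1.0 - ssim_global(g_out, target)
```

A perfectly reproduced watermark target gives a loss of 0, while dissimilar outputs push the loss towards (and, for anti-correlated images, above) 1.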
+ For experimental purposes, we have chosen the Spectral Normalization GAN (SN-GAN) , which is a variant of DCGAN. Taking the generator's objective function (Eq. ), we optimize the regularization term defined in Eq. and the generator's objective function simultaneously:
+
+ \mathcal{L}_{\text{DC}} = - \mathbb{E}_{\vect{z}\sim p_{\vect{z}}(\vect{z})}\left[\hat{D}\left(G\left(\vect{z}\right)\right)\right]
+
+ \mathcal{L}_{\wm{\text{DC}}} = \mathcal{L}_{\text{DC}} +\lambda\wm{\mathcal{L}}
+
+ \noindent with the reconstructive regularization scaled by an associated hyper-parameter $\lambda$ to balance between the quality of the generated image and the perceived similarity of the generated watermark when the {\it trigger} input is provided.
+
+ [t]
+ \centering
+ {\linewidth}
+ \centering
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-baseline-cifar10-11-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-baseline-cifar10-21-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-baseline-cifar10-31-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-baseline-cub200-11-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-baseline-cub200-21-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-baseline-cub200-31-eps-converted-to.pdf}
+
+ {\linewidth}
+ \centering
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-watermark-cifar10-11-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-watermark-cifar10-21-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-watermark-cifar10-31-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-watermark-cub200-11-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-watermark-cub200-21-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=30pt]{dcgan-watermark-cub200-31-eps-converted-to.pdf}
+
+ \caption{The first row shows images generated from latent inputs, $G(\vect{z})$, and the second row shows images generated from {\it trigger} inputs, $G(\wm{\vect{x}})$. Each column is a DCGAN model trained with a different watermark.}
+
+ SRGAN has been the foundation for most, if not all, recent super-resolution work; the generator of SRGAN takes a low-resolution image $I$ and generates a corresponding high-resolution image. As such, the input transformation function $f$ defined in Section is not applicable. For that reason, we define an alternative input transformation function $h$ that maps an image input to a {\it trigger set} ($h:\vect{I}\mapsto \wm{\vect{x}}$). This function is almost identical to Eq. , except that random noise, rather than a watermark, is embedded on the input image. The function $h$ can be visually represented as:
+
+ \includegraphics[valign=c, keepaspectratio=true, height=15pt]{eq-h-output-eps-converted-to.pdf} = h\left(
+ \includegraphics[valign=c, keepaspectratio=true, height=15pt]{eq-h-input-eps-converted-to.pdf}
+ \right)
+
+ For the output transformation function, since the output of all GANs variants is the same (\ie an image), we can re-use $g$ and the reconstructive regularization (Eq. ) to transform the output of SRGAN so that a unique watermark is embedded on the generated high-resolution image. The generator loss function is composed of a content loss and an adversarial loss, and we use the VGG loss defined on feature maps of higher-level features as described in :
+
+ \mathcal{L}_{\text{SR}} = l^{SR}_{VGG/5,4} + 10^{-3}l^{SR}_{Gen}
+
+ To this end, the new objective function of our protected SRGAN is denoted as
+
+ \mathcal{L}_{\wm{\text{SR}}} = \mathcal{L}_{\text{SR}} + \lambda\wm{\mathcal{L}}
+
+ [t]
+ \centering
+ {\linewidth}
+ \centering
+ \includegraphics[keepaspectratio=true, width=55pt]{srgan-comic-trigger-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=55pt]{srgan-comic-wm-1-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=55pt]{srgan-comic-wm-2-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=55pt]{srgan-comic-wm-3-eps-converted-to.pdf}
+
+ \caption{The first image is a sample of a {\it trigger} input $\wm{\vect{x}}$ to SRGAN. The next three images are the special targets $G(\wm{\vect{x}})$ from SRGAN models trained on different watermarks.}
+ \vspace{-10pt}
+
+ The generators in CycleGAN take an image $I$ from one domain as input and translate it into a same-sized image in another domain. Given this fact, we can use the function $h$ defined in Eq. to map a training input to a {\it trigger set}, and consistently employ the function $g$ defined in Eq. to embed the watermark on the output image. We then use the same reconstructive regularization defined in Eq. and add it to the generator loss of CycleGAN. Even though there are two generators in CycleGAN, we only need to select one of them as our target for protection. The objective function of the selected generator is given as:
+
+ \mathcal{L}_{\text{GAN}} = &\mathbb{E}_{\vect{y}\sim p_{\text{data}}(\vect{y})}\left[\log D_Y(y)\right] + \\
+ &\mathbb{E}_{\vect{x}\sim p_{\text{data}}(\vect{x})}\left[\log(1 - D_Y(G(x)))\right] \\ \\
+ \mathcal{L}_{\text{Cyc}} = &\mathbb{E}_{\vect{x}\sim p_{\text{data}}(\vect{x})}\left[\left\lVert F(G(x)) - x\right\rVert_1\right] \\
+
+ \mathcal{L}_{\text{C}} = \mathcal{L}_{\text{GAN}} + \mathcal{L}_{\text{Cyc}}
+
+ Thus, the new objective for our CycleGAN is:
+
+ \mathcal{L}_{\wm{\text{C}}} = \mathcal{L}_{\text{C}} + \lambda\wm{\mathcal{L}}
+
+ Verification. For verification in the black-box setting, any suspected online GAN model is first queried remotely by the owner via API calls to gather evidence. That is to say, the owner submits a list of {\it trigger set} data as queries to the online GAN service in question. Evidence is collected as proof of ownership if the response output is embedded with the designated watermark logo (see Fig. , , for examples). Moreover, the embedded watermark can be verified by calculating the SSIM between the expected output $\wm{\vect{y}}$ and the output generated by the model, $G(\wm{\vect{x}})$, when the {\it trigger} input is provided; sample results are shown in Fig. . The SSIM score reflects the perceived similarity between the generated watermark and the ground-truth watermark, and a score above 0.75 gives an unambiguous, distinctive watermark that can be used in ownership verification (please refer to Fig. ).
+
+ [t]
+ \centering
+ {\linewidth}
+ \centering
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-horse-xwm-1-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-horse-ywm-1-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-horse-xwm-2-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-horse-ywm-2-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-horse-xwm-3-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-horse-ywm-3-eps-converted-to.pdf}
+
+ {\linewidth}
+ \vspace{+5pt}
+ \centering
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-city-xwm-1-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-city-ywm-1-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-city-xwm-2-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-city-ywm-2-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-city-xwm-3-eps-converted-to.pdf}
+ \includegraphics[keepaspectratio=true, width=36pt]{cycle-city-ywm-3-eps-converted-to.pdf}
+
+ \caption{Image pairs, $\wm{\vect{x}}/G(\wm{\vect{x}})$, from different CycleGAN models trained on the horse2zebra (first row) and Cityscapes (second row) datasets, respectively.}
+ \vspace{-5pt}
+
+ In order to provide complete protection for GANs, we adopt the sign loss introduced in as a designated key (\ie signature), which has been proven to be robust to both removal and ambiguity attacks. Specifically, such signatures are embedded into the scaling factors $\vect{\gamma}$ of normalization layers with $C$ channels in the generators, which can then be retrieved and decoded for ownership verification purposes. Eq. serves as a guidance for the sign of a weight in the normalization layers.
+
+ \mathcal{L}_s(\vect{\gamma}, \vect{B}) =\sum_{i=1}^{C}{{\max\left(\gamma_0 - \gamma_ib_i, 0\right)}}
+
+ \noindent where $\vect{B} = \left\{b_1,\cdots,b_C \mid b \in \{-1, 1\}\right\}$ is the defined binary bit signature that, when this objective is optimized, enforces the $i$-th channel's scaling factor $\gamma_i$ to take either positive or negative polarity (+/-) as designated by $b_i$. $\gamma_0$ is a constant that controls the minimum magnitude of $\vect{\gamma}$ (to avoid an all-zero $\vect{\gamma}$).
+
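The sign loss is a per-channel hinge on the polarity of $\gamma$; a minimal NumPy sketch ($\gamma_0 = 0.1$ and the four-bit signature below are illustrative values, not from the paper):

```python
import numpy as np

def sign_loss(gamma, b, gamma0=0.1):
    # L_s = sum_i max(gamma0 - gamma_i * b_i, 0): the i-th term vanishes
    # once gamma_i has the designated polarity b_i with |gamma_i| >= gamma0.
    gamma, b = np.asarray(gamma, float), np.asarray(b, float)
    return float(np.maximum(gamma0 - gamma * b, 0.0).sum())

b = np.array([1, -1, 1, -1])                   # designated signature bits
gamma_ok = np.array([0.5, -0.3, 0.2, -0.9])    # polarities match b
gamma_bad = np.array([0.5, 0.3, 0.2, -0.9])    # second polarity flipped
```

Here `sign_loss(gamma_ok, b)` is 0, while the single flipped polarity in `gamma_bad` is penalized by 0.1 + 0.3 = 0.4.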
+ Then, this regularization term is added to the objective functions of DCGAN (Eq. ), SRGAN (Eq. ) and CycleGAN (Eq. ). To this end, the overall objectives for the generators are respectively denoted as:
+
+ \mathcal{L}_{\text{DC}_{ws}} &= \mathcal{L}_{\text{DC}} + \lambda\wm{\mathcal{L}} + \mathcal{L}_s\\
+ \mathcal{L}_{\text{SR}_{ws}} &= \mathcal{L}_{\text{SR}} + \lambda\wm{\mathcal{L}} + \mathcal{L}_s\\
+ \mathcal{L}_{\text{C}_{ws}} &= \mathcal{L}_{\text{C}} + \lambda\wm{\mathcal{L}} + \mathcal{L}_s
+
+ With the sign loss incorporated into the training objective, the scaling factors of the normalization layers in the generator now take either positive or negative values, and the resulting unique binary sequence can encode the ownership information of a particular network. The capacity of the embedded information (see Table ) is constrained by the total number of channels in the normalization layers. For example, in our DCGAN model the numbers of channels in the three layers are 256, 128 and 64, respectively. Thus, we can embed at most 448 bits, equivalent to 56 bytes, into the model. As for SRGAN, intuitively, more information can be embedded as it has more layers than the DCGAN model, and likewise for CycleGAN. We refer readers to Sec. for the superior performance of the sign-loss based method, demonstrated by extensive experimental results.
+
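To illustrate decoding, the polarity pattern can be read back as bits and packed into bytes; the example below embeds the ASCII byte "G" into eight hypothetical channel polarities (the signs and magnitudes are made up for illustration):

```python
import numpy as np

def decode_signature(gamma):
    # Read each scaling factor's polarity as one bit (positive -> 1,
    # negative -> 0) and pack 8 bits per byte, most significant bit first.
    bits = (np.asarray(gamma) > 0).astype(np.uint8)
    return bytes(np.packbits(bits))

# Embed the bit pattern of the byte "G" into 8 channel polarities.
target_bits = np.unpackbits(np.frombuffer(b"G", dtype=np.uint8))
gamma = np.where(target_bits == 1, 0.7, -0.7)   # illustrative magnitudes

# 448 normalization channels, as in the DCGAN generator above,
# correspond to 448 / 8 = 56 bytes of signature capacity.
```

With this convention, `decode_signature(gamma)` recovers `b"G"`, and a 448-channel generator yields a 56-byte signature, matching the capacity figure quoted above.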
+ [t]
+ \centering
+ {0.15\linewidth}
+ \centering
+ \includegraphics[keepaspectratio=true, width=30pt]{wm-ssim-0-eps-converted-to.pdf}
+ \caption*{0.00}
+
+ {0.15\linewidth}
+ \centering
+ \includegraphics[keepaspectratio=true, width=30pt]{wm-ssim-25-eps-converted-to.pdf}
+ \caption*{0.25}
+
+ {0.15\linewidth}
+ \centering
+ \includegraphics[keepaspectratio=true, width=30pt]{wm-ssim-50-eps-converted-to.pdf}
+ \caption*{0.50}
+
+ {0.15\linewidth}
+ \centering
+ \includegraphics[keepaspectratio=true, width=30pt]{wm-ssim-75-eps-converted-to.pdf}
+ \caption*{0.75}
+
+ {0.15\linewidth}
+ \centering
+ \includegraphics[keepaspectratio=true, width=30pt]{wm-ssim-100-eps-converted-to.pdf}
+ \caption*{1.00}
+
+ \caption{Watermarks of different perceived quality and their respective SSIM scores.}
+ \vspace{-5pt}
+
+ [t]
+ \centering
+
+ \resizebox{0.5\linewidth}{!}{
+ \begin{tabular}{ccc}
+ GAN & Channels & Capacity\\
+ \hline \hline
+ DCGAN & 448 & 56 bytes\\
+ SRGAN & 2112 & 264 bytes \\
+ CycleGAN & 5248 & 656 bytes \\
+ \hline
+ \end{tabular}
+ }
+
+ \caption{The amount of information that can be embedded into GAN generators.}
+
+ \vspace{-5pt}
+
+ Verification. Given the evidence from the black-box verification step in Section , the owner can subsequently go through law enforcement and perform white-box verification, which requires physical access to the model to extract the signature. As the example in Table 19 shows, we embed a unique key "EXAMPLE" into our DCGAN's batch normalization weights. It shows how to decode the trained scales $\vect{\gamma}$ to retrieve the embedded signature. Also, note that even when there are two or more identical letters, their $\vect{\gamma}$ values differ from each other.
2102.08201/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
 
 
1
+ <mxfile host="app.diagrams.net" modified="2021-02-16T13:55:36.499Z" agent="5.0 (X11)" etag="mUJyPy2gXo11UII99Csg" version="14.3.1" type="device"><diagram id="pEGLPcy5Alr7zN_z4LwX" name="Page-1">5Vpbj+o2EP41kbYPi+I4Nx5Z2J6qPSuthKp2n458iJe4NTHHMQv019chdu5hgQBBh33ZeGxPxvN9Y884GHC82HzhaBm+sABTwzKDjQEnhmUB6JjyXyLZphLftFPBnJMgFZm5YEr+w2qmlq5IgGMlS0WCMSrIsiycsSjCM1GSIc7ZujzsndGgJFiiOa4JpjNE69K/SCBCtQq9rET+GybzUL8ZmKpngfRgpSIOUcDWhSXDZwOOOWMifVpsxpgmziv75deW3swwjiNxyITf3eBRQDSd4yiYjn6Qmf+n9wgsZZzY6hXjQDpANRkXIZuzCNHnXPrE2SoKcKLWlK18zFfGllIIpPAfLMRWoYlWgklRKBZU9UqL+fZvNX/XeEsaA0c3J5ti52SrWqmtiYGtPtDOZis+w/sWrriE+ByLPeOsDClJccwWWNoj53FMkSAfZTuQ4to8G5fDIR8UIsegk+r9QHSl3lRDq4zFOiQCT5dot/S1jMiy35U6zAXe7PdgfcV6AlBTVDwDx0/b6zw6LFONCQuRMfQv5SSzVwqD3ihsHUhhu08KW7dHYXto3hiF4TFOAldyklt2ku9JblfdBPxh3U1weKlIBzW3XPWwyoP7rdh3+Ui3D4z0FkgPjvTd1BHnaFsYsGQkEnFB82siyJli6RxCMcWGTjkP+WQ8VBblzEgtyHmSLaUDdWAv1NkQkTLHclTzTXNFPufESRqX4k1XPtQBdP0SgJZbUZHyVM2qxPwZsLRv70iBZvlIgb7b85HiNjjJpfKtTwH5kI9zsVt5KnpnO3xzB7o/Vkx3PMY7eo/kAOAvN3mn1jLdxgIvtC5pbaqu/Aopbnjx2W15eJm8/nKkKRXuSBaIMkFiwdm/eMwo41ISsSjZHN4JpRURomQeyeZMsgZL+VPCKSJryZHqWJAg2O0sTYwsc/YcpPRNfWTrrdmqH+G2WWelLsnPzkq/lZXdYB/LQZxRinncCv7PCrPjVWEGfcM87A6z2wSz1kK04I8C2qQ6Kl591yJQGFcQG9b4ZJVWq8rBYLBTbJ6s+6VN971RG4KBV+H20KtXISCrTa5Cb9B0uFazyygYJdePiTMpimMya60qrAuVFafebRU86zTsGlrWMYvMboazMgCWVaTZbS2LrCmyK7mX5VUUtaSjx9Y5jt1s8GXrFm9/Htd1J+22H3/FiEdynzg59evX/IcpexcLJHvM1y9Z1vidf7rf3lceCRznsF0YXHUX1pvu2bOMOGXFg+GahjNOnMkpkcB4T7IpQiyQ4U2+CdndXmn8rFxwrAoR3IZLQejXaaDPzPPToOk64Bw1xefw3x360KteCoN+Cw3NxkuAH3+TkEsVsnjwJneJtlOOdejYPaPdfqnVGW1+nxENXO/GMG5PeDtjjO4TY2d4a3E8rDn7qh9jit9idM/Fv8XozevzqtxtRvM63+uh2RB/7XcbivEBisPs43TB50mA6eAwkq8kyd8Rfu3rcgLYlZCx7bKKwy8n4H5FJ38rk838t2Pp8PwXePD5fw==</diagram></mxfile>
2102.08201/main_diagram/main_diagram.pdf ADDED
Binary file (17.1 kB). View file
 
2102.08201/paper_text/intro_method.md ADDED
@@ -0,0 +1,91 @@
+ # Method
+
+ In this and the following sections, we propose and analyze a policy gradient-based algorithm that provably finds the best, potentially improper, mixture of controllers for the given MDP. While we employ gradient ascent to optimize the mixture weights, the fact that this procedure works at all is far from obvious. We begin by noting that $V^{\pi_\theta}$, as described in Section [\[sec:problemStatementAndNotation\]](#sec:problemStatementAndNotation){reference-type="ref" reference="sec:problemStatementAndNotation"}, is *nonconcave* in $\theta$ for both the direct and softmax parameterizations, which renders analysis with standard tools of convex optimization inapplicable. Formally,
+
+ ::: {#lemma:nonconcavity of V main .lemma}
+ **Lemma 1**. *(Non-concavity of the value function) There is an MDP and a set of controllers for which the maximization problem of the value function (i.e. [\[eq:main optimization problem\]](#eq:main optimization problem){reference-type="eqref" reference="eq:main optimization problem"}) is non-concave for both the SoftMax and direct parameterizations, i.e., $\theta\mapsto V^{\pi_{\theta}}$ is non-concave.*
+ :::
+
+ The proof follows from a simple counterexample whose construction we show in Sec. [\[appendix:nonconcavity of V\]](#appendix:nonconcavity of V){reference-type="ref" reference="appendix:nonconcavity of V"} in the Appendix.
+
+ :::: algorithm
+ ::: algorithmic
+ learning rate $\eta>0$, initial state distribution $\mu$ Initialize each $\theta^1_m=1$, for all $m\in [M]$, $s_1\sim \mu$ Choose controller $m_t\sim \pi_t$. Play action $a_t \sim K_{m_t}(s_{t},:)$. Observe $s_{t+1}\sim \tP(.|s_t,a_t)$. Update: $\theta_{t+1} = \theta_{t} + \eta. \nabla_{\theta_t}V^{\pi_{\theta_t}}(\mu)$.[]{#algo:softmaxpg-gradAscent label="algo:softmaxpg-gradAscent"}
+ :::
+ ::::
+
+ Our policy gradient algorithm, SoftMax PG, is shown in Algorithm [\[alg:mainPolicyGradMDP\]](#alg:mainPolicyGradMDP){reference-type="ref" reference="alg:mainPolicyGradMDP"}. The parameters $\theta\in \Real^M$ which define the policy are updated by following the gradient of the value function at the current policy parameters. The policy $\pi_{\theta}(m)$ is defined as in [\[eqn:softmax-defn\]](#eqn:softmax-defn){reference-type="eqref" reference="eqn:softmax-defn"}. The algorithm proceeds by first choosing a controller $m_t\in [M]$ drawn according to $\pi_t$ and then playing an action drawn from $K_{m_t}(s_t,\cdot)$. The parameters are updated via a gradient ascent step based on the derivative of the value function evaluated at the current parameters $\theta_t$.
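As a toy illustration of the update rule (not the paper's experiments), one can run the SoftMax PG update on a surrogate objective $V(\theta)=\sum_m \pi_\theta(m)\,v_m$ with fixed, hypothetical controller values $v_m$, for which the exact gradient is $\partial V/\partial\theta_m=\pi_\theta(m)\,(v_m - V(\theta))$:

```python
import numpy as np

def softmax(theta):
    # pi_theta(m) = exp(theta_m) / sum_k exp(theta_k), stabilized.
    e = np.exp(theta - theta.max())
    return e / e.sum()

v = np.array([0.2, 0.9, 0.5])   # hypothetical per-controller values
theta = np.ones_like(v)         # theta_m^1 = 1 for all m, as in the algorithm
eta = 0.5                       # illustrative learning rate

for _ in range(2000):
    pi = softmax(theta)
    value = pi @ v               # surrogate V(theta)
    grad = pi * (v - value)      # exact gradient of V w.r.t. theta
    theta = theta + eta * grad   # gradient *ascent* update
```

In this toy problem the mixture concentrates on the best controller (the second one, with $v_m = 0.9$), mirroring the convergence to the best in-class policy discussed below.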
+
+ ::: remark
+ **Note 1**. *Although Lemma [1](#lemma:nonconcavity of V main){reference-type="ref" reference="lemma:nonconcavity of V main"} shows that the value function is non-concave in the parameter $\theta$, the motivating examples in Sec. [\[sec:motivating-examples\]](#sec:motivating-examples){reference-type="ref" reference="sec:motivating-examples"} of an inverted pendulum and a simple queuing network show situations where a *pure* mixture of the base controllers can perform strictly better than each of them individually.*
+ :::
+
+ In this section, we provide performance guarantees for SoftMax PG in terms of the rate of convergence to the optimal mixture. Notice that the *Update* step requires knowledge of the value gradient $\nabla_{\theta_t} V^{\pi_{\theta_t}}$, which may not be available or computable in closed form. We divide this section into two parts depending on whether or not the exact value function gradient is available to the learner.
+
+ The following result shows that with SoftMax PG, the value function converges to that of the *best in-class* policy at a rate $\cO\left(1/t\right)$. Furthermore, the theorem shows an explicit dependence on the number of controllers $M$, in place of the usual $\abs{\cS}$.
+
+ ::: restatable
+ theoremmaintheorem[]{#thm:convergence of policy gradient label="thm:convergence of policy gradient"} With $\{\theta_t \}_{t\geq 1}$ generated as in Algorithm [\[alg:mainPolicyGradMDP\]](#alg:mainPolicyGradMDP){reference-type="ref" reference="alg:mainPolicyGradMDP"} and using a learning rate $\eta = \frac{\left(1-\gamma\right)^2}{7\gamma^2+4\gamma+5}$, for all $t\geq 1$, $$\begin{equation}
+ \label{eqn:PGConvergeFiniteMDPs}
+ V^*(\rho) -V^{\pi_{\theta_t}}(\rho) \leq {\color{blue}\frac{1}{t}} M \left(\frac{7\gamma^2+4\gamma+5}{c^2(1-\gamma)^3}\right)\norm{\frac{d_\mu^{\pi^*}}{\mu}}_{\infty}^2 \norm{\frac{1}{\mu}}_\infty.
+ \end{equation}$$
+ :::
33
+
34
+ ::: remark
35
+ **Note 2**. *The quantity $c$ in the statement is the minimum probability that SoftMax PG puts on the controllers to which the best mixture $\pi^*$ assigns positive probability mass, i.e., $c \bydef \inf_{t\geq 1}\min\limits_{m\in \{m'\in[M]:\pi^*(m') >0\}}\pi_{\theta_t}(m)$. While we currently do not supply a lower bound for $c$, the empirical studies presented in Sec. [\[sec:simulation-results\]](#sec:simulation-results){reference-type="ref" reference="sec:simulation-results"} clearly show that $c$ is indeed bounded away from zero, rendering the bound in [\[eqn:PGConvergeFiniteMDPs\]](#eqn:PGConvergeFiniteMDPs){reference-type="eqref" reference="eqn:PGConvergeFiniteMDPs"} non-vacuous.*
36
+ :::
37
+
38
+ ::::: proof
39
+ *Proof sketch of Theorem [\[thm:convergence of policy gradient\]](#thm:convergence of policy gradient){reference-type="ref" reference="thm:convergence of policy gradient"}.* We highlight here the main steps of the proof. We begin by showing that $V^{\pi_\theta}(\mu)$ is $\beta$-smooth, for some $\beta>0$.
40
+
41
+ ::: restatable
42
+ lemmasmoothnesslemmaMDP[]{#lemma:smoothness of V label="lemma:smoothness of V"} ${V}^{\pi_{\theta}}\left(\mu\right)$ is $\frac{7\gamma^2+4\gamma+5}{2\left(1-\gamma\right)^2}$-smooth.
43
+ :::
44
+
45
+ Next, we derive a new Łojasiewicz-type inequality for our probabilistic mixture class, which lower bounds the magnitude of the gradient of the value function.
46
+
47
+ ::: restatable
48
+ lemmanonuniformLE[]{#lemma:nonuniform lojaseiwicz inequality label="lemma:nonuniform lojaseiwicz inequality"} $$\begin{equation*}
49
+ \norm{\frac{\partial}{\partial\theta}V^{\pi_\theta}(\mu)}_2 \geq \frac{1}{\sqrt{M}}\left(\min\limits_{m:\pi^*(m)>0} \pi_{\theta}(m) \right) \times \norm{\frac{d_{\rho}^{\pi^*}}{d_{\mu}^{\pi_\theta}}}_{\infty}^{-1} \left[V^*(\rho) -V^{\pi_\theta}(\rho) \right].
50
+ \end{equation*}$$
51
+ :::
52
+
53
+ The proof of Theorem [\[thm:convergence of policy gradient\]](#thm:convergence of policy gradient){reference-type="ref" reference="thm:convergence of policy gradient"} then follows by combining Lemmas [\[lemma:smoothness of V\]](#lemma:smoothness of V){reference-type="ref" reference="lemma:smoothness of V"} and [\[lemma:nonuniform lojaseiwicz inequality\]](#lemma:nonuniform lojaseiwicz inequality){reference-type="ref" reference="lemma:nonuniform lojaseiwicz inequality"}, followed by an induction argument over $t\geq 1$. Please see the appendix for details of the proof. ◻
54
+ :::::
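The two lemmas combine via a standard smoothness–Łojasiewicz recursion. A schematic version, with the distribution-mismatch factors and the constant $c$ absorbed into a single illustrative constant $C>0$ (so the constants below are not the exact ones of the theorem), reads:

```latex
% Write \delta_t := V^*(\rho) - V^{\pi_{\theta_t}}(\rho).
% beta-smoothness with a step size \eta = 1/\beta gives the ascent progress bound
\delta_{t+1} \;\le\; \delta_t \;-\; \frac{1}{2\beta}\,
    \left\| \nabla_{\theta_t} V^{\pi_{\theta_t}}(\mu) \right\|_2^2 .
% The Lojasiewicz-type inequality lower bounds the gradient norm,
% \| \nabla_{\theta} V^{\pi_{\theta}}(\mu) \|_2 \ge C\,\delta_t, whence
\delta_{t+1} \;\le\; \delta_t \;-\; \frac{C^2}{2\beta}\, \delta_t^2 ,
% and induction over t \ge 1 on this recursion yields
\delta_t \;\le\; \frac{2\beta}{C^2} \cdot \frac{1}{t} .
```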
55
+
56
+ - **Analytical Novelties.** While the basic recipe for the analysis of Theorem [\[thm:convergence of policy gradient\]](#thm:convergence of policy gradient){reference-type="ref" reference="thm:convergence of policy gradient"} is similar to [@Mei2020], we stress that our setting does not directly inherit the intuition of the standard PG (sPG) analysis.
57
+
58
+ - With $\abs{\mathcal{S}\times\mathcal{A}}<\infty,$ the sPG analysis critically depends on the fact that a deterministic optimal policy exists and shows convergence to it. Our setting enjoys no such guarantee.
59
+
60
+ - The value function gradient in sPG has no 'cross contamination' from other states, so modifying the parameter of one state does not affect the values of the others. Our setting cannot leverage this since the value function gradient possesses contributions from *all* states (see Lemma [\[lemma:Gradient simplification\]](#lemma:Gradient simplification){reference-type="ref" reference="lemma:Gradient simplification"} in appendix). Hence, our analysis becomes more intricate than existing techniques or simple modifications thereof.
61
+
62
+ - **Bandits-over-bandits.** For the special case of $S=1$, which corresponds to multi-armed bandits, each controller is a probability distribution over the $A$ arms of the bandit. This differs from standard MABs because the learner cannot choose actions directly, but instead plays actions by choosing from a *given* set of controllers. We call this special case *bandits-over-bandits*. We obtain a convergence rate of $\mathcal{O}\left(M^2/t\right)$ to the optimum and recover the well-known ${\color{blue}{M^2}}\log T$ regret bound when our SoftMax PG algorithm is applied to this special case. We refer the reader to the appendix for details of this result, and now move to the special case of MAB where the learner uses estimates of the gradient of the value function.
63
+
64
+ For the bandits-over-bandits case when exact value gradients are unavailable, we parameterize the policy simplex $\cP([M])$ *directly*, i.e., $\pi_t(m)=\theta_t(m), \forall m\in [M]$ (see Algorithm [\[alg:ProjectionFreePolicyGradient\]](#alg:ProjectionFreePolicyGradient){reference-type="ref" reference="alg:ProjectionFreePolicyGradient"}). At each round $t\geq 1$, the learning rate $\eta$ is chosen asynchronously for each controller $m$ to be $\alpha \pi_t(m)^2$, for some $\alpha \in (0,1)$, which ensures that the iterates remain inside the simplex. To justify its name as a policy gradient algorithm, observe that in order to minimize regret, we need to solve the following optimization problem: $$\min\limits_{\pi\in \cP([M])} \sum\limits_{m=1}^{M}\pi(m)(\kr_\mu(m^*)- \kr_\mu(m)).$$ Taking the gradient directly with respect to the parameters $\pi(m)$ gives the update rule of the policy gradient algorithm. The other changes in the update step (eq. [\[algstep:update noisy gradient\]](#algstep:update noisy gradient){reference-type="ref" reference="algstep:update noisy gradient"}) stem from the fact that the true means of the arms are unavailable, and from the use of importance sampling.
65
+
66
+ :::: algorithm
67
+ ::: algorithmic
68
+ learning rate $\eta\in (0,1)$ Initialize each $\pi_1(m)=\frac{1}{M}$, for all $m\in [M]$. $m_*(t)\leftarrow \argmax\limits_{m\in [M]} \pi_t(m)$ Choose controller $m_t\sim \pi_t$. Play action $a_t \sim K_{m_t}$. Receive reward $R_{m_t}$ by pulling arm $a_t$. Update $\forall m\in [M], m\neq m_*(t):$ $$\begin{equation}
69
+ \label{algstep:update noisy gradient}
70
+ \pi_{t+1}(m) = \pi_t(m) + \eta\left(\frac{R_{m}\ind_{m}}{\pi_t(m)} - \frac{R_{m_*(t)}\ind_{m_*(t)}}{\pi_t(m_*(t))}\right)
71
+ \end{equation}$$ Set $\pi_{t+1}(m_*(t)) = 1- \sum\limits_{m\neq m_*(t)}\pi_{t+1}(m)$.
72
+ :::
73
+ ::::
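As an illustration, the update rule above can be simulated directly. The sketch below is ours, not the authors' code: the controller mean rewards are made up, rewards are taken noiseless for simplicity, and the per-controller step size $\alpha\pi_t(m)^2$ follows the text.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 3
means = np.array([0.2, 0.5, 0.9])  # hypothetical mean rewards of the M controllers
alpha = 0.05                       # small step-size constant, as the theorem requires
pi = np.full(M, 1.0 / M)           # direct parameterization: pi_t(m) = theta_t(m)

for t in range(20000):
    m_star = int(np.argmax(pi))    # current leader m_*(t)
    m_t = rng.choice(M, p=pi)      # choose controller m_t ~ pi_t
    R = means[m_t]                 # reward of the played controller (noiseless here)
    for m in range(M):
        if m == m_star:
            continue
        # importance-weighted gradient estimate, per-controller step alpha*pi_t(m)^2
        g = R * (m_t == m) / pi[m] - R * (m_t == m_star) / pi[m_star]
        pi[m] += alpha * pi[m] ** 2 * g
    pi[m_star] = 1.0 - (pi.sum() - pi[m_star])  # leader absorbs the residual mass

# with small alpha, pi_t concentrates on the best controller (index 2) over time
```

Note how the per-controller step size makes the update projection-free: the increment of a non-leader $m$ is at most $\alpha\pi_t(m)R$ in magnitude, so every coordinate stays positive and the leader's residual assignment keeps the iterate on the simplex.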
74
+
75
+ We have the following result for the bandit-over-bandits improper learning problem (the proof appears in the Appendix).
76
+
77
+ ::: restatable
78
+ theoremregretnoisyMABshort[]{#thm:regret noisy gradient short label="thm:regret noisy gradient short"} For $\alpha$ chosen sufficiently small, $\left(\pi_t\right)$ is a Markov process with $\pi_t(m^*)\to 1$ almost surely as $t\to \infty$. Further, the regret up to any time $T$ is bounded as $\cR(T) = \mathcal{O}\left(\frac{1}{1-\gamma}\sum\limits_{m\neq m^*} \frac{\Delta_m}{\alpha\Delta_{min}^2} {\color{blue}\log T}\right)$, where $\Delta_j\bydef r(m^*)-r(j), j\in [M]$ and $\Delta_{min}\bydef \min\limits_{m\neq m^*}\Delta_m.$
79
+ :::
80
+
81
+ Although we obtain a similar $\log T$ regret bound for the case of noisy gradient estimates, we note that the techniques used are quite different from those used in Theorem [\[thm:convergence of policy gradient\]](#thm:convergence of policy gradient){reference-type="ref" reference="thm:convergence of policy gradient"}. The proof proceeds by showing that the expected time for $\pi_t(m^*)$ to cross any fixed threshold in $(0,1]$ is finite. This, along with showing that the process $\{\pi_t(m^*)\}$ is a supermartingale and invoking Doob's convergence theorem, helps to prove the regret bound.
82
+
83
+ ::: {#remark:relation between true and noisy grad .remark}
84
+ **Note 3**. *The "cost" of not knowing the true gradient seems to cause the dependence on $\Delta_{min}$ in the regret, which is not the case when the true gradient is available (see Theorem [\[thm:convergence for bandit\]](#thm:convergence for bandit){reference-type="ref" reference="thm:convergence for bandit"} and Corollary [\[cor:regret true gradient MAB\]](#cor:regret true gradient MAB){reference-type="ref" reference="cor:regret true gradient MAB"}). This dependence on $\Delta_{min}$, as is well known from the work of [@LAI19854], is unavoidable.*
85
+ :::
86
+
87
+ ::: {#remark:dependence on delta .remark}
88
+ **Note 4**. *The dependence of $\alpha$ on $\Delta_{min}$ can be removed by a more sophisticated choice of learning rate, at the cost of an extra $\log T$ dependence on regret [@Denisov2020RegretAO].*
89
+ :::
90
+
91
+ We note that it is an open and challenging task to show convergence guarantees for our policy gradient approach over improper mixtures for general MDPs with estimated (noisy) gradients; indeed, such rates are not yet known even for the basic softmax PG scheme for the tabular MDP setting. The difficulty primarily seems to lie in the fact that the constant $c$ for the perfect gradient case now becomes stochastic, and showing that it stays bounded away from $0$ in some probabilistic sense is non-trivial.
2104.02409/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
 
 
1
+ <mxfile host="app.diagrams.net" modified="2021-03-21T23:12:09.668Z" agent="5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.90 Safari/537.36" etag="GzSOik7Z4KA-v9avgHaN" version="14.4.9" type="google"><diagram id="GU6AVEEQoo7gNJp9OwK7" name="Page-1">7Vxbd5s4EP41Pqd9qA9IXB8TJ273NNlmm213+7RHBoFpAbmyXDv99StAMjcldhzATpvmlMsgBtB882lmpGQEJ8nmLUWL+TXxcTwCmr8ZwYsRALphGHyXSe6ERAd6IQlp5AtZKbiNfmIh1IR0Ffl4WWvICIlZtKgLPZKm2GM1GaKUrOvNAhLXn7pAoXiiVgpuPRTjVrN/Ip/N5WdolebvcBTOxaMdU1xI0LZxIVjOkU/WlWfByxGcUEJYcZRsJjjOek/2S6Foes/V7YtRnLJ9brj8OP3jOqD2z/U5+3L1HXxE5qc38t3Ynfxg7PPvF6eEsjkJSYriy1J6Tskq9XGmVeNnZZsrQhZcqHPhV8zYnTAmWjHCRXOWxOIq3kTs38rxl0zV2BRnFxuhOT+5kycpo3fZTW+0saYBKRG3Al0Kyrvzs9rtN5hGCWaYCmFAUjZFSRRnrf7ml5a8K/7Ea779SBKUiibiO/TsBdudLnuQrKiHH+hpIMCLaIjZA+2Ew2RmqDxAmPQtJvwD6B1vQHGMWPSjDlMk0B5u25WA4AcCE4/Ah3jrHyheiSeNgBUz0TM15FjfV0ReeLPM++yMN9DNxaa8yI/CbP/XCudfcUPJV+62hEq1M3q4UqmDf2nxdlLcgHgJ4Aw263nE8O0C5dZbcxqrg3UPkIQU+REHxITEhOaPgLP8J7s9iuOKPDCzn8cA6wemDG8ehIK8agvqEVwLJPmuS+JyRZN5hbIkY3UOHqgAz/BsM6iXG3t6uXNSXm7sNhQfvRbZobea4d1OMyuMdjXbCpD3LcxN+WHF4ijFQu4j+u0DVxOxnKnHmlkXglyqKxzJ8bDncfmSUfINV13PMQ1TU1h1DyB04IVQ1+teCM2WF0KFF1pmT8Y1jznEa2PXderDvGk8PNDzk+ZAvR38RZQgR/5do/6grm8dy/XzW88oRXeVBgsSpWxZ0XyTCUqYGsCpwRSaRgNphcYSd9tXOxyK1nPgmcezRoOaptNzaFkqajqHTpY49MIzULfHR2Ya+7mO9yW/NPIK86C8Qp3fGO6jee9wMnL2JCOZZnfMRi26MTWXjwTlvzr5AMsec0Iybb3YQljXX3yGUNkAaQe05HST5OQJSCMluSCsyHL8lZcdHZDmqNTyPUoyLiy2vOktCViCNt3pP5VMKgiQmXNmMwAMPB84/WRS0IGNTAoouHXQXMr91bjVcOxDuPVe6+6mOjgQ1TXBY9l1FQV598ZmktJPPsp6XFTF/d1zXVVUNZ06mqZK+DpgAlNrGBOomMBRMIHTFxPIuK9iX4NvtDPG+HdGJOXH14jRaNMyO/9oVjctiqMwzYDAb8187DzrmshD8Zm4kES+X3AH5uMGmuWqss4WCQbXa56PzItMF6eLpej+jpOv/Q1m2HWD6W7bXLqhsBfozV5HYeon1dibYXCnRL0zdpVEvZPRQeeM/jRDq+qdHcWRZ2FIcYgOK5SfeITnmzZ0/TbjY8vXfNwPTVjSK08nwtOfYRUW676JbdWg7Fo2RNZh9ZQO7Gs7J1aF1c2WdaVPc+/qZvKrSAbT2TLbXRMRC0wxYiuKl79jNGBDawyMekDgKAICoICC0Zujt8ugcjr0d7YVsBsuqykMpQ8aubULmu/xi5msmpmgq7fNpA1qJlX97uRrI/bTYmR3zxi5+6rH02ylqmN1saDkc67wZUFJl67eXFBiSLc+VogM9ihtDeDph7vt3
sVK66S8FrRLTt14bTGcVl21kL94cX9eDIFx7FT3OG57UEns4dUommbUS2WGaw1bLAP7Fsv0E1tYusfiQBkqX6EZjm946JtntPBiRhgjiSKWZqThpbJekmzCbH36OIjJ2psjysbLVZJEafhfsEq9Qm1P+Saw2oOmboxlFF31N72v4gPYo7R0mhHyoSu/Op477L7SrJ473M5MyIgLNCBxz9xhS5FtNFDoNrj8niUVj11G1nphy33UezXa97PsTC51O9ISyOpwUx1sYG20yQvQXS8DkvDZDfBe8P1UNEENPIimZnswCJrc542mQ2f5hkCh/kvAEBpDwLA9odDC5alNF/kmdvwsYm5NFzlgVqyZPc50kdvkkXbMNuhsETiNpbSdU8XuLMUdKNAChwVanXnvs1gK3/BehJ1A+Ss3lufgWXAq3gtdeFzvhe0yJeAbbcJ7I5sgAlrwG08gwQbXKmbmh12gBdvVzdxc2xn0F2uVvmWp1snoKvfqz14711kFyKvXiVXdVakTH1pZfjUC2d2yNIWyV79MPeJHafh637Lz4CDqGiHN6FdV8LJ6AggiZHbz9d3FuRu/14LJp4n+OVH93rE5ecX/MwEDfojixRzx/es97FEf71KSDaW1wVGI9rebaiyvR2v3WEFhq/uLkZrZKkYql7l15L38tPxrFkWgVP5REHj5Pw==</diagram></mxfile>
2104.02409/main_diagram/main_diagram.pdf ADDED
Binary file (17.9 kB). View file
 
2104.02409/paper_text/intro_method.md ADDED
@@ -0,0 +1,32 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Introduction
2
+
3
+ We base our network design on the successful RAFT architecture [@raft]. Our overall network diagram is shown in Figure [\[fig:overall\]](#fig:overall){reference-type="ref" reference="fig:overall"}. For completeness, we briefly describe the main contributions of RAFT from which our model benefits. The first contribution is the introduction of an all-pairs correlation volume, which explicitly models matching correlations for all possible displacements. The benefit of using all-pairs correlations is the ability to handle large motions. The second major contribution is the use of a gated recurrent unit (GRU) decoder for iterative residual refinement [@irr]. The constructed 4D correlation volumes are encoded to 2D motion features, which are iteratively decoded to predict the residual flow. The final flow prediction is the sum of the sequence of residual flows. The benefit of using a GRU to perform iterative refinement lies in the reduction of the search space. In RAFT, convolutions are used in the GRU decoder, which learn to model spatial smoothness. Due to the local nature of convolutions, they can learn to handle small occlusions but tend to fail when these become more significant and local evidence is insufficient to resolve the motion.
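As a concrete illustration of the all-pairs correlation idea (a minimal numpy sketch, not RAFT's actual multi-scale implementation), the 4D volume is simply the inner product between every pair of feature vectors from the two frames:

```python
import numpy as np

def all_pairs_correlation(f1, f2):
    """All-pairs correlation volume between two feature maps.

    f1, f2: (H, W, D) feature maps of frames 1 and 2.
    Returns a 4D volume of shape (H, W, H, W), where entry (i, j, k, l)
    is the dot product <f1[i, j], f2[k, l]>, i.e. the matching score of
    pixel (i, j) in frame 1 against every possible displacement in frame 2.
    """
    return np.einsum('ijd,kld->ijkl', f1, f2)

# toy sizes; in practice H, W are downsampled image dimensions
H, W, D = 4, 5, 8
f1 = np.random.randn(H, W, D)
f2 = np.random.randn(H, W, D)
corr = all_pairs_correlation(f1, f2)
```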
4
+
5
+ In his first paper from 1976, Geoffrey Hinton wrote that "local ambiguities have to be resolved by finding the best global interpretation" [@hinton1976using]. This idea still holds true in the modern deep learning era. To resolve ambiguities caused by occlusions, our core idea is to allow the network to reason at a higher level, that is, to globally aggregate the motion features of similar pixels, having implicitly reasoned about which pixels are similar in appearance feature space. We hypothesise that the network will be able to find points with similar motions by looking for points with similar appearance in the reference frame. This is motivated by the observation that the motions of points on a single object are often homogeneous. For example, the motion vectors of a person running to the right have a bias towards the right, which holds even if we do not see where a large part of the person ends up in the matching frame due to occlusion. We can use this statistical bias to propagate motion information from non-occluded pixels, with high (implicit) confidence, to occluded pixels, with low confidence. Here, confidence can be interpreted as whether there exists a distinct matching, i.e., a high correlation value at the correct displacement.
6
+
7
+ With these ideas, we take inspiration from transformer networks [@transformer], which are known for their ability to model long-range dependencies. Different from the self-attention mechanism in transformers, where query, key and value come from the same feature vectors, we use a generalized variant of attention. Our query and key features are projections of the context feature map, which are used to model the appearance self-similarities in frame 1. The value features are projections of the motion features, which themselves are an encoding of the 4D correlation volume. The attention matrix computed from the query and key features is used to aggregate the value features which are hidden representations of motions. We name this a Global Motion Aggregation (GMA) module. The aggregated motion features are concatenated with the local motion features as well as the context features, which is to be decoded by the GRU. A detailed diagram of GMA is shown in Figure [\[fig:details\]](#fig:details){reference-type="ref" reference="fig:details"}.
8
+
9
+ # Method
10
+
11
+ Let ${\rvx \in \R^{N \times D_\text{c}}}$ denote the context (appearance) features and ${\rvy \in \R^{N \times D_\text{m}}}$ denote the motion features, where ${N = HW}$, $H$ and $W$ are the height and width of the feature map, and $D$ refers to the channel dimension of the feature map. The $i^{\text{th}}$ feature vector is denoted ${\rvx_i \in \R^{D_\text{c}}}$. Our GMA module computes the feature vector update as an attention-weighted sum of the projected motion features. The aggregated motion features are given by $$\begin{equation}
12
+ \hat{\rvy}_i = \rvy_i + \alpha \sum_{j=1}^N f(\theta(\rvx_i), \phi(\rvx_j))\sigma(\rvy_j),
13
+ \label{Eqn:c_only}
14
+ \end{equation}$$ where $\alpha$ is a learned scalar parameter initialised to zero, $\theta$, $\phi$ and $\sigma$ are the projection functions for the query, key, and value vectors, and $f$ is a similarity attention function given by $$\begin{equation}
15
+ f(\rva_i, \rvb_j) = \frac{\exp \left( \rva_i\transpose \rvb_j / \sqrt{D} \right)}{\sum_{j=1}^N \exp \left( \rva_i\transpose \rvb_j / \sqrt{D} \right)}.
16
+ \end{equation}$$ The projection functions for the query, key and value vectors are given by $$\begin{align}
17
+ \theta (\rvx_i) = \rmW_{\text{qry}} \rvx_i, \\
18
+ \phi (\rvx_i) = \rmW_{\text{key}} \rvx_i, \\
19
+ \sigma (\rvy_i) = \rmW_{\text{val}} \rvy_i,
20
+ \end{align}$$ where $\rmW_{\text{qry}}, \rmW_{\text{key}} \in \R^{D_\text{in} \times D_\text{c}}$ and $\rmW_{\text{val}} \in \R^{D_\text{m} \times D_\text{m}}$. The learnable parameters in our GMA module include $\rmW_{\text{qry}}, \rmW_{\text{key}}, \rmW_{\text{val}}$ and $\alpha$.
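In matrix form, the equations above amount to a row-wise softmax over context self-similarities applied to the projected motion features. The following numpy sketch illustrates the computation; the random matrices stand in for the learned projections, and using the query dimension under the square root is our reading of the $\sqrt{D}$ normalisation:

```python
import numpy as np

def gma_aggregate(x, y, W_qry, W_key, W_val, alpha):
    """Global Motion Aggregation update y_hat = y + alpha * f(theta(x), phi(x)) sigma(y).

    x: (N, D_c) context features; y: (N, D_m) motion features.
    W_qry, W_key: (D_in, D_c) projections; W_val: (D_m, D_m); alpha: scalar.
    """
    q = x @ W_qry.T                    # theta(x_i): queries from context
    k = x @ W_key.T                    # phi(x_j): keys from context
    v = y @ W_val.T                    # sigma(y_j): values from motion features
    logits = q @ k.T / np.sqrt(q.shape[1])          # appearance self-similarity
    attn = np.exp(logits - logits.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)         # row-wise softmax f(., .)
    return y + alpha * attn @ v        # attention-weighted sum of motions

N, D_c, D_m, D_in = 6, 16, 12, 8
rng = np.random.default_rng(0)
x = rng.standard_normal((N, D_c))
y = rng.standard_normal((N, D_m))
W_qry = rng.standard_normal((D_in, D_c))
W_key = rng.standard_normal((D_in, D_c))
W_val = rng.standard_normal((D_m, D_m))

y_hat = gma_aggregate(x, y, W_qry, W_key, W_val, alpha=0.0)
# with alpha initialised to zero, as in the paper, the module starts as the identity
```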
21
+
22
+ The final output is $[\rvy \,|\, \hat{\rvy} \,|\, \rvx]$, a concatenation of the three feature maps. The GRU decodes this to obtain the residual flow. Concatenation allows the network to intelligently select from or combine the motion vectors, modulated by the global context feature, without prescribing exactly how it is to do this. It is plausible that the network learns to encode some notion of uncertainty, and decodes the aggregated motion vector only when the model cannot be certain of the flow from the local evidence.
23
+
24
+ We also explore the use of a 2D relative positional embedding [@bello2019attention], allowing the attention map to depend on both the feature self-similarity and the relative position from the query point. For this, we compute the aggregated motion vector as $$\begin{equation}
25
+ \hat{\rvy}_i = \rvy_i + \alpha \sum_{j=1}^N f(\theta(\rvx_i), \phi(\rvx_j) + \rvp_{j-i}) \sigma(\rvy_j),
26
+ \label{Eqn:p+c}
27
+ \end{equation}$$ where $\rvp_{j-i}$ denotes the relative positional embedding vector indexed by the pixel offset $j-i$. Separate embedding vectors are learned for the vertical and horizontal offsets and are summed to obtain $\rvp_{j-i}$. If it is useful to suppress pixels that are very close or very far from the query point when aggregating the motion vectors, then this positional embedding has the capacity to learn this behaviour.
28
+
29
+ We also investigated computing the attention map from only the query vectors and positional embedding vectors, without any notion of self-similarity. That is, $$\begin{equation}
30
+ \hat{\rvy}_i = \rvy_i + \alpha \sum_{j=1}^N f(\theta(\rvx_i), \rvp_{j-i}) \sigma(\rvy_j).
31
+ \label{Eqn:p_only}
32
+ \end{equation}$$ This can be regarded as learning long-range aggregation without reasoning about the image content. It is plausible that positional biases in the dataset could be exploited by such a scheme. In Table [\[Tab:Results\]](#Tab:Results){reference-type="ref" reference="Tab:Results"}, the results for ([\[Eqn:p+c\]](#Eqn:p+c){reference-type="ref" reference="Eqn:p+c"}) and ([\[Eqn:p_only\]](#Eqn:p_only){reference-type="ref" reference="Eqn:p_only"}) are denoted as Ours (+p) and Ours (p only).
2104.05591/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
 
 
1
+ <mxfile host="app.diagrams.net" modified="2020-11-18T18:08:19.519Z" agent="5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.198 Safari/537.36" version="13.9.3" etag="GLG9gsZgzWbHrSrc7Oic" type="google"><diagram id="qFzTeyPFBGsVh55jZ9Df">7V1dk5s6Ev01U5U8hEICCfE4H7l3t5LspjK5tdlHxmhsKjb4YiYzs79+JRswQgJkI9keO3nIGNm0TZ+jVnfTaq6828XLn3m0nH3JYjq/gm78cuXdXUEIQ+izP3zkdTNSD0zzJN4MuduB++R/dDMIqtGnJKarcmwzVGTZvEiW4uAkS1M6KYSxKM+zZ/Fjj9k8FgaW0ZQKP4MP3E+iOZU+9p8kLmabUYIan/4HTaaz6puBW76ziKoPlwOrWRRnz40h7+OVd5tnWbF5tXi5pXOuPFEvf3S8W/+wnKaFzglwc8KvaP5UXtsVxHN26s1jxiQw3USTzRv47yf+o26+Jwumeuj+iz6z/79liyjdvsmvqHidi2dwSR9WaxCv2QdAsHxpnoGn/O+fNKV5VGQ5+8S7L5+/sD/r13mUxtniffWr2JVsftjmrFKJ9VfCPHtKY8ovDrC3n2dJQe+Xm0t4ZlxkY7NiMS/ffkzm89tszr6Hn+vFESWPEza+KvLsJ228gyeEPjzyd37SYsIRdPlBhR2of8kvmhf0pRMLUCPMpgbNFrTIX9lHyhM+BJ7vuOVMeK0Yjz0Hos3Y85ZrXjUVZg2eeeW5UUnvaf0VWwawFyUJ1ITwhggxjC8W8d2ZQhto75LVJE8WSbohhX34H8mETpTwPxDkI7cX/uk8Wq3KN8xQIWxTIWRD2kxA45ngj2cCUc306+Vy/lrP7LVRXP20ArAAV3uyI0piX4U2gQ8exoZwxJiI8xlgV4IQBFgB4Xbej0ERdaKY7A9hJeKhHkA37LLYj3HLP6B1hO4aCD9IcvKGIKCSUP/ZCtqe5DiOUlLHb5EksZ+UdDKOgV902400S9nnbqJ5Mk3Z4YTxgrLxG86ahDkM1+UbiySOuUQlZbek5kSNo9WsZjg/+BoVTGi6HuG/3wwzCZYXG8AWINnCgIpETXoSA2sNPoLz0cXnaMGRSB9Wy42k7zP6qhh+pjmVh6VjaXZ8ub7/1D0JhNOL7Io7bbvKsGw+RTNZ0t7ISgcUPAShE2gudZUnPYaIgYKILc3RNL7mkQOf5HytTyaisuhLUvzgunJ8Asvj/5a646/vXpoHr42Dr5S5OXRtNhruA42rEKRU6Sp7ysvZUC3NGlpuaA8plFeN5XQeFckv8RtVGi2/4WuWrCdoBSIiUAYR+TWIlajNRZRnN8MSWaCCFb4vLYlFlE9pIQlcA16rQYsD5ISNkWoI3dx+vl+bg9JWlbZJNh3LnHKmxDrG54pH0iohMZdRHN72AKu2R7EGui5Rutm+JdsTjrc9gsewUVaVlvC6DcrpWA/gglAAIcAtIdp2IwwGJJkzGNVidFCLgVQWY8Wm8DxJp7ozUXZplTPMsp9rYgKHuAU3AtLMVTHXxMQFwPDMZRedv/6oPAN+sHYgmEdeHm59iPVR5USIM75yPn40HJOGnE5XRMPtKK3iZgqdqCfiEyQ7DgRIjoO2RfF8T14jgOzaGDQsqqTsjsSSV4FW/iNiITBEqvwH/sNl/97kukFcYmjdkCQZhHcwxXqodaNgwKcf5vQXvz0j+YHMbYyTSZFkqeLNd9++361RjeL3XfGv/WVoTh+LE1iE2j5HSKRFCODA0iqkStOaNhYUxIgGKmMR4sCL8BsxFmIqlgTA8aFb/9vf5dxJrkFD0p3bPbQDSv9+oumEVrZEsB3MVnxp2ooLsw0tcmCgii7ruwLGzYMqx9rS9WoWLfnLyVM+f73Jo8
lP7t0NRehiOC8Zhg33uYub5HTDBO+ORqtC6bCaUDUhriKhjVWJRKjQNoEGtK1KJNqekVg1Iz/x70peeNJnfX9tdYGxIVKkELGrmn6BrdmnSiqaTe7s7tZvM9VIzFQ7ruf1hoiqbLV8g2o7n3uDSXhqDgJS8MVF4f6Ro1KgJ4eiBj0CA8lERTahzks4JPCbuQnAr68/ObE+apNmS0HSTFF8YBRkcZeFGyblKtjMXJwc/3yMe2+E7cw/v4o5avKFck2BOfJBVT50R/I1qBZsjNE2DRbiPRJhKvZ18OWkyYHcFpYAuiMsk0uYsQ+3//BBiWIgcdprpRDBLeogso+ZepNEQdAkUVB7/SKBTWZAoybEI0FrtfIu14LggTqLnbmhEmjZcKgyp/vTAyKBHTtT402yAAWGWaAQaJkFxjOePWCeNnKuK1vjMciFAbIJnCpfeQHAAcUNxFHAyQItA6eRRDxD4JjP3e8ljwFxQLhlQA0UPMqAdqWiTh5oHyN7QA8IDwNsE2gDCcjeUAu0HGwUDLhR+/rT3qmRJuAxtFGzHkoCLZPDQLbQetnz6QDOS5LbwTUEY5YALrDti1t1nT0DKbqLqCzCSN5+54/Io6jkNeqMLSBtPMd2rkgDqV5wHNIKeXaRhr+R1nTnw15HbAzqQ7LtMsBAWuwiGODj/phrDAOGZLvyjWSDDDiRIsCtNwhs+ILNO7onGA9I5UehP6LuXBFfWDYjBtJzxotVdiovOR0yDGYUsOzmm8ooANeT7ZhBmlxoMhDJ+0pcxXzc4daq7ChaDfc9jaSfWIU3UAHarOxkl81R6JnJvOqw2W1l/U+2CR+v8Y257huBtC2ZIGU/Hea1K5oc+PXoqJY6Gkk4m4rvMLpdgBhRPMLt1AYJiKoUV614E8WXnkZ669z4DiWfgfg78B0Z4buvkWU6N74DKPGdYH3F89MNKF4j6XNmjEcBkBhP1Fss1IoPjCheIwdzZozn7kub8UcwNRqpj3NjPJTKNghS7mro8GmIEcVrZBzOjfEAOKg77CLBDuyHZtivEbGfGft9ItW+EIJ2sPeeEcVrxMBnxn6e32zbe28HxWMzij9KDFvnN7syoiYUHEDpFuExmH2UWPUwCuYaOnoywD9KVHoQBSsKWAiG+gr2sAkFo6NEn4dhsC+lDAlQNgVWKzgIjShYFWUeolPeendzdyfaEVJyupyz3xyfUHu7PCuiknR8e/LmWspO+IAY86ak6DkIlO3vfJUTayBJVxVUje0yLaK6Z8vxLE+mSRrNzxly5Hp9N4864LfVQw0ZaDffaqK5J/ZSw8tNV3LevUWrb/UZUyYIQIsjnlNVHDQ4gqEljqhSDUfhSEmJTxcGPwROVSpcM0Dp1lljwGCjo6NZiUbr+bdpIJre7TNdFaZIgxSexWHNhvE6hTOtikOItOPi5qNG9ihskOUp9kWYK2xAVnYznTxuinvs4+qKlAKt9qtBVrYnnT5y8t3ikcipBNpFzkqnoZNHbnBL6BgUB4VbRRRr5M0O1ylQTLSZbhUY8Lxvy6WFysC3EQ6bdlCwKo92pgr3Q1dT4dWNZePaHp9ogr6q06IYAbyB3oo2YkTsSr1ggkBZlGDrMRp4t00wpV6FevVWHbtYdu4p54Zalye8gHkeckizOXFrjfHAiAWMCw/IVjgUZGPSgtng2tWdIqpD8iKPknT9kAw3Sll0v34GYUwLZgG3z5eU43VlBP9wgVOcuPU9pXr3c6MZWLO7rupxhkbmuEa9y+FWzDTLLa6Yob9dHmt9q29XI99Sb3l8Us2j7eq7vtgBZXvWlN3dO3prw+iq+G3C9scYsMXvyCZs8DFzv5+vfHWA5yvzah6vGWuL7VXDarPUIR60jC8zlwIwvx+w9Ralpkv7uqH9gsPAE+Wac0MDjRTKOQIJQktA9gq2CaTx7ilvA0hx3gDfHJL9km1Cqcr7jIbyrba6g8TBaLvoEemmwb4IDwgOA98awL+7n+iut6jPmnr7PjpvQG
51bAH5Q3Q9OQvkIbCDfL9ci8gb7yR8rsij/nV3f+gHBFvE/nddj95Sj/tX5H2hH5JbPajEAvIXWucDhrxyvB+SQ5JtOm2X1xIFYOz4aKttV0zuE1/OOHXsQzOyiyfQSDqd2d5h/hjwsAeBoN5/dpD+NEQjW3Ruc4BpE7ceiFo1LTlQxwhyeT1SgAec9rOPiVyIbbMzTfVw1QvSOvGcoGFrWsUQBOuaGkUzv70QOHKnlGM0ZSIO7EGg2ok9vODKXfn2QuDyWqYwfQoLLhDvspFq+98wBEaaG5DL65fieb1zoCr6PdDefHLkvimHBwC4QrTcvs9MiO4MMNPegxy5B+jhbRDEDsBNEyTq39NdBJCRRnHE+IaX80xefcC+Q5qpCTFUq8szdk959Mtt3/4yl/EgGvH2uVT780AbNOqAWzcGCJCNXv1IMtNRdmilJqNrxnU0wHeq9/Rb4DcfZ3y1w2NKO2Z2s3V+RcXm47XCk+unj7ETeg0OtfrewL2TnsQJm7usWgsysFaJEF5e5iFAjg8bzo/ofWKim/KERpyf6usbAMzZpG9jcL71rwiKKQnRF8Khq8h/2tqvE8r5iK+f2fH6pbugUfruCuJowTWTPqyWaxW4ukPs6931xqzbEULEofc9k1WnGLaFp14HsZJYRhblEDi4e1EOPGXX5LohoNBHzDOA/5GzIcfozu73xeIBizAO2h8/vLhsCDN/zVC8ir0rAKoNuQeKxUPjhQx6nml1pPFopsoHFZ7y2nBI9/dBK/Mv+KAnF4X6PIUsksTf1+8k/bPfDUW5Bv1OjY1F//7r++d/fvx24fuDwsDBTXe1tVfIRfJNM2vu0dk8uLm633s6sxoC385+BeiKgl1xgbFYTxOqskuH6Mmq6pCm41yjm9vP91eI+/pJGieU+/zZr4T/IunDU87xvXz2JFV8aJktn+ZRnhSvR+rJZsnBh63wGjCEFAGdx58eB+1YLTZ3ToWH32f0VYMhzzSnGh/LaRTriFMMFdneIeg0s8tQU5spA2mjupp5lY8zknbsMM842Ftjya5x9iWLKf/E/wE=</diagram></mxfile>
2104.05591/main_diagram/main_diagram.pdf ADDED
Binary file (65.8 kB). View file
 
2104.05591/paper_text/intro_method.md ADDED
@@ -0,0 +1,34 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Method
2
+
3
+ We compare our method against classical AD baselines like Isolation Forest [@iso_forest] and the existing state-of-the-art One-Class SVMs (OCSVM) [@ocsmv] and CVDD [@acl2019]. We outperform all previously reported performances on all *20Newsgroups* splits by a large margin: 13.5% over the best reported CVDD and 11.7% over the best OCSVM, as shown in Tab. [\[tab: main_experiment\]](#tab: main_experiment){reference-type="ref" reference="tab: main_experiment"}. In contrast to these baselines, which are tuned per split, DATE uses the same set of hyper-parameters for all splits of a dataset. For a proper comparison, we keep the same experimental setup as the one introduced in [@acl2019].
4
+
5
+ :::: center
6
+ ::: tabular
7
+ l l r r r r & & & & &\
8
+
9
+ & comp & 66.1 & 78.0 & 74.0& **92.1**\
10
+ & rec & 59.4 & 70.0 & 60.6& **83.4**\
11
+ & sci & 57.8 & 64.2 & 58.2& **69.7**\
12
+ & misc & 62.4 &62.1& 75.7& **86.0**\
13
+ &pol & 65.3 & 76.1 & 71.5& **81.9**\
14
+ &rel & 71.4 & 78.9& 78.1& **86.1**\
15
+
16
+ & business & 79.6& 79.9 & 84.0& **90.0**\
17
+ & sci & 76.9 & 80.7 & 79.0& **84.0**\
18
+ & sports & 84.7& 92.4 & 89.9& **95.9**\
19
+ & world & 73.2 & 83.2 & 79.6& **90.1**\
20
+ :::
21
+ ::::
22
+
23
+ We apply Isolation Forest over fastText or GloVe embeddings, varying the number of estimators $(64, 100, 128, 256)$ and choosing the best model per split. In the unsupervised AD setup, we manually set the percentage of outliers in the train set.
24
+
25
+ We use the One-Class SVM model implemented in the CVDD work. For each split, we choose the best configuration (fastText vs GloVe, rbf vs linear kernel, $\nu$ $\in$ \[0.05, 0.1, 0.2, 0.5\]).
26
+
27
+ This model [@acl2019] is the current state-of-the-art solution for AD on text. For each split, we choose the best column out of all reported context sizes ($r$). The scores reported using the $c^*$ context vector depend on the ground truth and only reveal "the potential of contextual anomaly detection", as the authors mention.
28
+
29
+ We further analyse how our algorithm works in a fully unsupervised scenario, namely when the training set contains some anomalous samples (which we treat as normal ones). By definition, the quantity of anomalous events in the training set is significantly lower than that of the normal ones. In this experiment, we show how the performance of our algorithm is influenced by the percentage of anomalies in the training data. Our method proves to be extremely robust: even at 10% contamination, it surpasses the state-of-the-art semi-supervised solution, which is trained over a clean dataset (with 0% anomalies), by +0.9% in AUROC (see Fig. [3](#fig: fully_unsup_od){reference-type="ref" reference="fig: fully_unsup_od"}). By achieving an outstanding performance in the unsupervised setting, we make unsupervised AD in text competitive with semi-supervised methods. The reported scores are the mean over all AG News splits. We compare against the same methods presented in Sec. [4.3](#sec: ex-ssad){reference-type="ref" reference="sec: ex-ssad"}.
30
+
31
+ <figure id="fig: fully_unsup_od" data-latex-placement="t!">
32
+ <img src="images/corrupted" />
33
+ <figcaption>Unsupervised AD. We test the performance of our method when training on impure data, which contains anomalies in various percentages: 0%-15%. The performance slowly decreases when we increase the anomaly percentage, but even at 10% contamination, it is still better than state-of-the-art results on self-supervised anomaly detection in text <span class="citation" data-cites="acl2019"></span>, which trains on 0% anomalous data, proving the robustness of our method. Experiments were done on all AG News splits.</figcaption>
34
+ </figure>
2104.08793/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
 
 
1
+ <mxfile host="app.diagrams.net" modified="2021-05-26T23:03:21.548Z" agent="5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/90.0.4430.212 Safari/537.36" etag="ZwCyPJ8kGfuUsFnRGsRH" version="14.7.1" type="google"><diagram id="CZ6WytEWL-epx1ILxw1C" name="Page-1">7V3pc5tIFv9rVJuZKrugm/Nj7MSe2cnhrHcrmfmGBJJIMCgIH8pfvyBoCbofp7oBSXaqYqnVwvCOX79+R78Jvn54uQ2t1fJjYDveBEn2ywS/myBkIC3+PxnYpAOKpqQDi9C10yF5P3Dv/nKyQSkbfXRtZ12YGAWBF7mr4uAs8H1nFhXGrDAMnovT5oFX/Ksra+EwA/czy2NHv7p2tMxGZUnaf/CH4y6W2Z821OyDB4tMzgbWS8sOnnND+P0EX4dBEKWvHl6uHS+hHaFL+r2bkk93NxY6ftTkCx+eVl+15f2XD8sn6aP9JH/+ZPkXqppe5snyHrMnzu422hAShMGjbzvJVaQJvnpeupFzv7JmyafPMc/jsWX04MXv5PjlOgqDHztSJSMrJ3QfnMgJky+5/iK7ztz1vOvAC8LtH8FzNfmXjAd+lBtPf+Jx9nEzCjw5YeS85Iayx791gvjPhpt4SvapRmQvk8WYWen75z1nkYbTsWWOqUjLeGhl0rTYXXtP8PhFRvM29FfOiP5YN+rpb0i90h+fEf1lSaunvy6K/vdT01levZstvrh//vp7s/iGPtoXsnDy29Z6uf02eXNnRTEz/O0Ikio4xIPislygOCaLQY7isqGxFJd1DhS/Qxv78+1c8r79NDTv449fF1/0C62BxDt2vAZmb4MwWgaLwLe89/vRqyJT9nM+BMEqo/V3J4o22YJuPUYBzSgrjN4mC3Q84Ae+Q8Zu3OR5tpd1fJvMCFaO/9+l66ejhTkxUb5lF92++Tv55FIlb9+95Ge+29Qxdx08hjOngoA4U5r4ZhdOVDUxnZcQs1JWQsezIvepaHBAfN9+NaaItclNWAWuH61zV75LBnKgSywpYoBJatFMaDc/fpHewV4Cd4/SSChl8/rHJ2z9XP24/vp8J9v/23yfXRArsV8Z3MmOdKmrefGRa4SHl/CmaJVDe2n7k8x7caNve0GO3+3uK369v63kTY8irQkRaVYGcXGhUgzlUi1eJH2m7HsV4hx/91LK/cj0hYuXTSnAXLa10tEPYCp1d1k1/2Clq+Q6t8WXh4kooeKCqaqXrJGiGcCKiSgJ4bZiQjaK5kWZrVagl/bzMSAfXKy3uPM2MTLQ6mX/Yfxqkfz+63aSmAPppaYhGX7IttDsJ29uyNj6cZq/SjoYP1xu/LfccHqf6XgFi+V6FtNW63yOZjMIyWxtqqkaL6EoahPWAJEAzFZVlA2FBEnEp+CiRihAEci+dmZSoKD+pAA0WmTjpCxphnnvVVmRCja2dGnIWt5QupAupe3utMpa2r67I3uswl8jGzV0mPkylPVyEIiw+7DsAeI7fFl5lh/fQ+AnOr8BdH5Wqu+HrOC0YhszB1bsqaEqKqdNMkIUvBvsJlkD9sjC4F02h9DjLiZ/TjNRYf/bRSU7gYT2VpJubjgrtHKUCq3qQ4qNnBOavQjBYtNyychmFCWh4FcrFwvxQsDdzXKYUYDHgx1SpRDksEM2jaL3zDTr/Gd1CALKTEHuchhDy964UEYdFcqwAaNV6NjuLIrlh7EcJtfq5EofynywLceYg+aDNjOc6ZyP+SDL9eYDFNQQZj5AbpWMASvLJ6ReW57r+LPkQTIGxsSJWXbfilv74dyluW7uHNlWHR1ioqnp2OK0uWOYiIFAiQ4YgVhYZNAYFshRJyuw4MautwLrDb7u8Ko1hFeZO752
ctnKStHzqOhUOgU9H1fOF+Oy1VhoaYnw8W/rIdF4f7pOfrVxKb55maD4vqXb3+irTNBVgl3yRYcFR2rp0nrzUrVKtQe4GN4MW4EAzkBTrAkCOEXtEeBASxXLDBn7jbmhVmAlPubWHer0o7QkdQZMYllOEK5awxpnYeTtve3PpDy7hdLJ91ryj5PmYVzQPGRKQEiJJMIV0l5EqR7xXI8k3H3MqmcMpXqdjAw6zkvM3KZxYWq+GCODDWtsg1Jvbv91SrCw0+5dMpwJwQKwIvOABVhGB91ytHEg7qGk6Deq8xpRTsPDvZDdgUNumvMij8vLLLPuhcha/2BDxplXYesJOnqnjioBMSHAqaMIiwmxW6+U7ECwnsTxji8SRyJfFVRX+6Q6GtgJ08YHI9KXQipfarFqVFCFjiiOKpJ7qCH30Ki4p5oM4lFZTrMdCu0TmbBmSdJ8zuY2fQC8PGvLK/VSNcpa2m5YK2zMzGbIg2k2ZHnuIqkBmMXcTOJXVwlKujPLe5t98ODa9lYgIaAuCimP9c6ggxgyYI0qAPYKs0bJ8prj/56H22hDgzQ3CUpzAyQhb8KUikJljON0REHXaxdhUYIA1uhABSOZyUMzqQsw3FveX7cX14EVrh3QnKrMfWwQGisiyG54evqihAy64A7a45qgRUcn3HMLrGkDWwXHFFgDCQiYE+C8kQTWFFJn0DCwppDAhKDAGkgrkiAwlFDqZyOUYkqI2gplvGcuCplRLZSqKlXNFyOUqJMBPp9vneqlVhezaO9XWnh1FFBAwGFtw6R0a1daC1UKySZUXCusnNzoEtiZPYZPO8J9f3xYkelWOCMjKYIYHWFmlIgBpN/B9gJvwDiIw9B5DdyKwV5Lv2gtV40iTmMMabmo9L4qeWxVt8F7uzxvtVc+xqIQmvGKBDlEemV8l/R+Fs77rwzhjvlF6eGerA0S32y4WqDBXODfL7zV/N93/3n75z/O5ur7fPXpk0KWi74TPerS8Msq/ViOwdFhvTo8DLIA4PuhVj9g1lNmOiktq6nvBy5EHdxiUvdScv5A632IisG/U74PweAD8tqHgEI8cLZSAb5yWYP1CFYh/fkjgsoRrJCv0FSqOaKZNqSpy8Z/mleaHLXBU57ZUJYLwcHg2SEMQQIdQ/tZKPxOgxy/YjY0rOa3TBFudF6UqIN1YG8XELivcvWMZJNLbvvUU4MVabeb3CcHMzonKjUYXGsJ6Y8kNfiwtF92Kea4xsrAsRDwvg4NqmtsePUU020VnVI0ON1WkKrBlB9mc3by6baVwcgBAkSHCclZpNsqEqIWQlWCqmQ0mdVPHsmflZJw0im3qmQ0oryotFuY8ueRdiv6GEtNo7xRKpNV0/QYywaXKnFt8QpGy6w74DUdVFTiFuN+1KCwl6gsQHiH+poOOogoKBq9R1VMszdRAPeowxxx1NEG5hkMAZeNjvlSTT1DfbmAQU4Pk6opMHQp1OMA8RQkq94TTyvvMgfkSUcEP/PkP28RZM3XlJ47WkmuiG5OqyvAu3vQFRnMCAM96EgUscsPg6qympwUg5il8qG4Lp6LMUTZvhjaIEEroCzMGAIO4jlIYRp0KeFASU0rlpZgBFFShlrwcGlIAtPytI5R3q1eIhqSwARsGk3G4+hIYtB72V0Upmz3W/sNQfnX0Jb3EC3noMKGiooqrOoN8+K4NEiA5YrdGnJMii0vCAPSZLlWpkE5t/WJCPXlZtf5e29zo2Vmwc3vQu70hPKIDZVa+UCHiqj2AbDWsHY4H62p7yLRML/8PPpK0KKhgIUk/XaWaLDwHJFJ1KizhG7u+kiQzhKyXrvjj7h1lqi0mGpNK7L7GEnUEh/WXGJZqvQ8nQK9xNd0nfKkG9CWh0zqB/gHzTs4r/YSMAOAo4lHptXwia2jSsSu29DyyDepaed6YNuJw4RjJLtpXaW8nlKhIqDtfDE7acz6y7p0LehlXeon40Yb4bo0aP+a0RzM0dMaBBxcW6k6Q8MMyZbc
+Z2qD0zQJb1qviCYYV3JbdpsdICXsfbTYOAFjJnIEL4I66iBXw/+OczRT8zRert1XDm0CuAfrlriqdYSkE62cM0e0rWiHBHOo2sFDSO5tMM+YATeBQ1UU92xKE384fllOS2dIAY1NUuMcUEMYiDmJCvWNINOFwcr1hCrj+KaWQyijyPpSQELoyLEZGfD4lIxLI5pzBWcEk6e88RL12gXckkDGV2MzsGUH8YTeJ6la0pjd7E+rjWxrH7qpErXaBMV7FlgsroprHhKAfpqnVzZGo2JINX7LBZUzqNkrd6v2NRHgMcVsVbOr1dEZVZQPf+4FzEe1i63NgfwtTyQW0a8jofsFgHzn81mey0P7EUU6hfifgWB9cRwTcpN20X8sZmG8W206xVR7s197RWBDJLtvjugBaye2p+Y1ItZp7OwIkCabtyYI+1kqSTY8CpLWxAyio4pyEcJFRlhkz7UgN+5WsM0eOgzT6GfY7X0phEC80BztFOagqJRlpFW02yker6YNAVCwhyoLWJNXCYs8mcx6oQJ6swB1MmmnUqWAjZpB3bzil2FB1TAAQ3WgcO1WudzaM08p4EJA1X1vNowVTYMZrrGNu53JUqYoOpW/sJUZ8G0KBB7FaWt6Oj0sSuwMBmgNSwOm8pPYeIpTqQbXzuBeu3FVyFQyKA36uolAsAJ6FeEOTjtys8hge2Qh6lj266/SJjNSNgCYh3n6qCmR4YwHGjUZ6DsyBDixhJwYEj8NgwSZdubkgmxP25VCr//Pw==</diagram></mxfile>
2104.08793/main_diagram/main_diagram.pdf ADDED
Binary file (60 kB). View file
 
2104.08793/paper_text/intro_method.md ADDED
@@ -0,0 +1,100 @@
 
1
+ # Introduction
2
+
3
+ Natural language processing (NLP) systems generally need common sense to function well in the real world [15]. However, NLP tasks do not always provide the requisite commonsense knowledge as input. Moreover, commonsense knowledge is seldom stated in natural language, making it hard for pre-trained language models (PLMs) [11, 35] — *i.e.*, text encoders — to learn common sense from corpora alone [9, 38]. In contrast to corpora, a knowledge graph (KG) is a rich, structured source of commonsense knowledge, containing numerous facts of the form (concept1, relation, concept2). As a result, many methods follow the *KG-augmented model* paradigm, which augments a text encoder with a graph encoder that reasons over the KG (Fig. 2). KG-augmented models have outperformed text encoders on various commonsense reasoning (CSR) tasks, like question answering (QA) (Fig. 1) [31, 5, 36, 61], natural language inference (NLI) [7, 57], and text generation [33, 65].
4
+
5
+ Since KGs do not have perfect knowledge coverage, they may not contain useful knowledge for all task instances (*e.g.*, if the KG in Fig. 1 only consisted of the gray nodes). Also, even if the
6
+
7
+ <sup>∗</sup>Work done while TG interned remotely at USC.
8
+
9
+ <sup>2</sup>Code and data are available at: https://github.com/INK-USC/SalKG.
10
+
11
+ KG is useful overall for a given task instance, only some parts of the KG may be useful (*e.g.*, the green nodes in Fig. 1). Ideally, a KG-augmented model would know both if the KG is useful and which parts of the KG are useful. Existing KG-augmented models always assume the KG should be used, though they often use attention [54] to focus on specific KG components (*e.g.*, nodes [13, 47, 60], paths [56, 46, 5]) when predicting. Still, the attention mechanism is supervised (end-to-end) only by the task loss, so the model is never *explicitly* taught which KG components should be used. Without component-level supervision, the attention mechanism is more likely to overfit to spurious patterns.
12
+
13
+ How can we better teach the model whether each KG feature (*e.g.*, graph, node, path) is useful for solving the given task instance? Using the task's ground truth labels, *saliency methods* [2] can score each KG feature's influence on the model making the correct prediction. Whereas attention weights show which KG features the model already used, saliency scores indicate which KG features the model should use. By binarizing these scores, we are able to produce saliency explanations, which can serve as simple targets for training the model's attention mechanism. For example, Fig. 1 shows saliency explanations [market=1, produce=1, trading=0, merchant=1, store=0, shop=0], stating that market, produce, and merchant are useful nodes for answering the question.
14
+
15
+ In this paper, we investigate how saliency explanations can be used to improve KG-augmented models' performance. First, we propose to create *coarse* (graph-level) and *fine* (node-/path-level) saliency explanations. Since KGs have features at different granularities, saliency explanations can supply a rich array of signals for learning to focus on useful KG features. To create coarse explanations, we introduce an ensemble-based saliency method which measures the performance difference between a KG-augmented model and its corresponding non-KG-augmented model. To create fine explanations, we can adapt any off-the-shelf saliency method, *e.g.*, gradient-based [10] or occlusion-based [30]. Second, to demonstrate the potential of saliency-based supervision, we analyze the performance of *oracle* KG-augmented models, whose attention weights are directly masked with coarse and/or fine saliency explanations.
16
+
17
+ ![](_page_1_Figure_3.jpeg)
18
+
19
+ Figure 1: KG Saliency Explanations for Commonsense QA. Across different questions, the KG's usefulness can vary considerably. *Coarse* explanations indicate if the KG is useful overall, while *fine* explanations highlight useful nodes or paths. Here, the fine explanations state that the market, produce, and merchant nodes are useful, while the other nodes are not.
20
+
21
+ Third, as motivated by our oracle model analysis, we propose the *Learning from Saliency Explanations of KG-Augmented Models* (SALKG) framework. Given coarse and/or fine explanations created from the task's training set, SALKG jointly trains the model to predict the explanations, then solve the task by attending to KG features highlighted in the predicted explanations. Using saliency explanations to regularize the attention mechanism can help the model generalize better to unseen instances, especially when coarse and fine explanations are used together as complementary learning signals. Indeed, on three standard commonsense QA benchmarks (CSQA, OBQA, CODAH) and a range of KG-augmented models, we show that SALKG can achieve considerable performance gains.
22
+
23
+ # Method
24
+
25
+ Since KGs provide abundant structured commonsense knowledge, KG-augmented models are often helpful for solving CSR tasks. CSR tasks are generally formulated as multi-choice QA (discriminative) tasks [52, 39, 23],
26
+
27
+ ![](_page_1_Figure_8.jpeg)
28
+
29
+ Figure 2: KG-Augmented Models fuse knowledge from text and KG inputs to solve CSR tasks.
30
+
31
+ but sometimes framed as open-ended response (generative) [33, 32] tasks. Given that multi-choice QA has been more extensively studied, we consider CSR in terms of multi-choice QA. Here, we present the multi-choice QA problem setting (Fig. 1) and the structure of KG-augmented models (Fig. 2).
32
+
33
+ **Problem Definition** Given a question $q$ and set of answer choices $A = \{a_i\}$, a multi-choice QA model aims to predict a plausibility score $\rho(q, a_i)$ for each $(q, a_i)$ pair, so that the predicted answer $\hat{a} = \arg\max_{a_i \in A} \rho(q, a_i)$ matches the target answer $a^*$ . Let $q \oplus a_i$ be the text statement formed from $(q, a_i)$ , where $\oplus$ denotes concatenation. For example, in Fig. 1, the text statement for $q \oplus a^*$ would be: What kind of store does a merchant have if they sell produce? market. We abbreviate $q \oplus a_i$ as $x_i$ and its plausibility score as $\rho(x_i)$ .
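A minimal sketch of this problem setting — the plausibility function `rho` below is a hypothetical stand-in, not the paper's model:

```python
def predict_answer(rho, question, choices):
    """Return a_hat = argmax_{a_i in A} rho(q, a_i) over the answer choices."""
    return max(choices, key=lambda a: rho(question, a))

# Hypothetical plausibility scores for the Fig. 1 example.
scores = {"market": 0.7, "trading": 0.2, "shop": 0.1}
q = "What kind of store does a merchant have if they sell produce?"
a_hat = predict_answer(lambda q, a: scores[a], q, list(scores))
```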
34
+
35
+ **KG-Augmented Models** KG-augmented models use additional supervision from knowledge graphs to solve the multi-choice QA task. They encode the text and KG inputs individually as embeddings, then fuse the two embeddings together to use for prediction. A KG is denoted as $\tilde{\mathcal{G}} = (\tilde{\mathcal{V}}, \tilde{\mathcal{R}}, \tilde{\mathcal{E}})$ , where $\tilde{\mathcal{V}}, \tilde{\mathcal{R}}$ , and $\tilde{\mathcal{E}}$ are the KG's nodes (concepts), relations, and edges (facts), respectively. An *edge* is a directed triple of the form $e = (c_1, r, c_2) \in \tilde{\mathcal{E}}$ , in which $c_1, c_2 \in \tilde{\mathcal{V}}$ are *nodes*, and $r \in \tilde{\mathcal{R}}$ is the *relation* between $c_1$ and $c_2$ . A *path* is a connected sequence of edges in the KG. When answering a question, the model does not use the entire KG, since most information in $\tilde{\mathcal{G}}$ is irrelevant to $x_i$ . Instead, the model uses a smaller, *contextualized KG* $\mathcal{G}_i = (\mathcal{V}_i, \mathcal{R}_i, \mathcal{E}_i)$ , which is built from $\tilde{\mathcal{G}}$ using $x_i$ . $\mathcal{G}_i$ can be constructed heuristically by extracting edges from $\tilde{\mathcal{G}}$ [31, 37], generating edges with a PLM [5], or both [56, 60]. In this paper, we consider KG-augmented models where $\mathcal{G}_i$ is built heuristically by extracting edges from $\tilde{\mathcal{G}}$ (see Sec. A.1 for more details), since most KG-augmented models follow this paradigm. If $x_i$ and $\mathcal{G}_i$ are not discussed in the context of other answer choices, then we further simplify $x_i$ 's and $\mathcal{G}_i$ 's notation as $x$ and $\mathcal{G}$, respectively. Since the model never uses the *full* KG at once, we use "KG" to refer to $\mathcal{G}$ in the rest of the paper.
36
+
37
+ As in prior works [31, 5], a KG-augmented model $\mathcal{F}_{KG}$ has three main components: *text encoder* $f_{\text{text}}$ , *graph encoder* $f_{\text{graph}}$ , and *task predictor* $f_{\text{task}}$ (Fig. 2). Meanwhile, its corresponding non-KG-augmented model $\mathcal{F}_{\text{No-KG}}$ has no graph encoder but has a slightly different task predictor $\bar{f}_{\text{task}}$ which takes only $\mathbf{x}$ as input. In both $\mathcal{F}_{KG}$ and $\mathcal{F}_{\text{No-KG}}$ , the task predictor outputs $\rho(x)$ . Let $\mathbf{x}$ and $\mathbf{g}$ be the embeddings of $x$ and $\mathcal{G}$, respectively. Then, the workflows of $\mathcal{F}_{\text{KG}}$ and $\mathcal{F}_{\text{No-KG}}$ are defined below:
38
+
39
+ $$\mathbf{x} = f_{\text{text}}(x); \quad \mathbf{g} = f_{\text{graph}}(\mathcal{G}, \mathbf{x}); \quad \mathcal{F}_{\text{KG}}(x, \mathcal{G}) = f_{\text{task}}(\mathbf{x} \oplus \mathbf{g}); \quad \mathcal{F}_{\text{No-KG}}(x) = \bar{f}_{\text{task}}(\mathbf{x}).$$
40
+
41
+ Typically, $f_{\text{text}}$ is a PLM [11, 35], $f_{\text{graph}}$ is a graph neural network (GNN) [13, 47] or edge/path aggregation model [31, 5, 46], and $f_{\text{task}}$ and $\bar{f}_{\text{task}}$ are multilayer perceptrons (MLPs). In general, $f_{\text{graph}}$ reasons over $\mathcal{G}$ by encoding either nodes or paths, then using soft attention to pool the encoded nodes/paths into g. Let $\mathcal{L}_{\text{task}}$ be the task loss for training $\mathcal{F}_{\text{KG}}$ and $\mathcal{F}_{\text{No-KG}}$ . For multi-choice QA, $\mathcal{L}_{\text{task}}$ is cross-entropy loss, with respect to the distribution over A. For brevity, when comparing different models, we may also refer to $\mathcal{F}_{\text{KG}}$ and $\mathcal{F}_{\text{No-KG}}$ as KG and No-KG, respectively.
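The workflow above can be sketched with toy encoders — the functions below are illustrative stand-ins for the PLM, GNN, and MLP components, not the actual architectures:

```python
import numpy as np

D = 8
rng = np.random.default_rng(0)
w_task = rng.standard_normal(2 * D)  # stand-in weights for the task predictor

def f_text(x):
    # Stand-in for a PLM text encoder: hash tokens into a bag-of-words vector.
    v = np.zeros(D)
    for tok in x.split():
        v[hash(tok) % D] += 1.0
    return v

def f_graph(G, x_emb):
    # Stand-in graph encoder: soft attention over node embeddings, conditioned
    # on the text embedding, pooled into a single graph embedding g.
    nodes = np.stack([emb for _, emb in G])  # G: list of (concept, embedding) pairs
    att = np.exp(nodes @ x_emb)
    att = att / att.sum()
    return att @ nodes

def f_task(fused):
    # Stand-in task predictor: linear layer + sigmoid -> plausibility rho(x).
    return float(1.0 / (1.0 + np.exp(-fused @ w_task)))

def F_KG(x, G):
    x_emb = f_text(x)                          # x = f_text(x)
    g = f_graph(G, x_emb)                      # g = f_graph(G, x)
    return f_task(np.concatenate([x_emb, g]))  # F_KG(x, G) = f_task(x ⊕ g)
```

Note that the soft attention inside `f_graph` is the mechanism that SALKG later supervises with saliency explanations.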
42
+
43
+ Now, we show how to create coarse and fine saliency explanations, which tell us if the KG or certain parts of the KG are useful. These explanations can be used as extra inputs to mask oracle models' attention (Sec. 4) or as extra supervision to regularize SALKG models' attention (Sec. 5). We first abstractly define a *unit* as either $\mathcal{G}$ itself or a component of $\mathcal{G}$ . A unit can be a graph, node, path, etc., and we categorize units as *coarse* (the entire graph $\mathcal{G}$ ) or *fine* (a node or path within $\mathcal{G}$ ) (Table 1). Given a model and task instance $(x, \mathcal{G})$ , we define an *explanation* as a *binary* indicator of whether a unit u of $\mathcal{G}$ is useful for the model's prediction on $(x, \mathcal{G})$ . If u is useful, then u should strongly influence the model to solve
44
+
45
+ | <b>Explanation Setting</b> | Unit |
46
+ |----------------------------|------|
47
+ | Coarse | KG |
48
+ | Fine (MHGRN) | Node |
49
+ | Fine (PathGen) | Path |
50
+ | Fine (RN) | Path |
51
+
52
+ Table 1: **KG unit types** used for different explanation modes (Sec. 3) and graph encoders (Sec. 4.2).
53
+
54
+ the instance correctly. By making explanations binary, we can easily use explanations as masks or learning targets (since binary labels are easier to predict than real-valued scores) for attention weights.
55
+
56
+ Since $\mathcal{G}$ may not always be useful, a KG-augmented model should ideally know when to use $\mathcal{G}$ . Here, the unit u is the graph $\mathcal{G}$ . Given instance $(x,\mathcal{G})$ , a coarse saliency explanation $y_{c}(x,\mathcal{G}) \in \{0,1\}$ indicates if $\mathcal{G}$ helps the model solve the instance. By default, $\mathcal{F}_{KG}$ assumes $\mathcal{G}$ is used, so we propose an ensemble-based saliency formulation for $y_{c}(x,\mathcal{G})$ . That is, we define $y_{c}(x,\mathcal{G})$ as stating if $\mathcal{F}_{KG}$ (i.e., uses $\mathcal{G}$ ) or $\mathcal{F}_{No-KG}$ (i.e., does not use $\mathcal{G}$ ) should be used to solve $(x,\mathcal{G})$ . Under this formulation, each $(x,\mathcal{G})$ has coarse units $\mathcal{G}$ and None, where None means " $\mathcal{G}$ is not used".
57
+
58
+ To get $y_c(x, \mathcal{G})$ , we begin by computing coarse saliency score $s_c(x, \mathcal{G}) \in \mathbb{R}$ , which we define as the performance difference between $\mathcal{F}_{KG}$ and $\mathcal{F}_{No-KG}$ . For QA input $x_i = q \oplus a_i$ and its KG $\mathcal{G}_i$ ,
59
+
60
+ let $p_{\text{KG}}(x_i, \mathcal{G}_i)$ and $p_{\text{No-KG}}(x_i)$ be the confidence probabilities for $x_i$ predicted by $\mathcal{F}_{\text{KG}}$ and $\mathcal{F}_{\text{No-KG}}$ , respectively.
61
+
62
+ Ideally, a QA model should predict higher probabilities for answer choices $a_i$ that are correct, and vice versa. To capture this notion, we define $s_{\rm c}(x_i,\mathcal{G}_i)$ in Eq. 1, where $a^*$ denotes the correct answer. Note that $s_{\rm c}(x_i,\mathcal{G}_i)$ is positive if $p_{\rm KG}(x_i,\mathcal{G}_i)$ is higher than $p_{\rm No-KG}(x_i)$ for correct
63
+
64
+ $$s_{c}(x_{i}, \mathcal{G}_{i}) = \begin{cases} p_{KG}(x_{i}, \mathcal{G}_{i}) - p_{No\text{-}KG}(x_{i}), & a_{i} = a^{*}, \\ p_{No\text{-}KG}(x_{i}) - p_{KG}(x_{i}, \mathcal{G}_{i}), & a_{i} \neq a^{*}. \end{cases} \quad (1)$$
67
+
68
+ choices and lower for incorrect choices. We obtain $y_c(x_i, \mathcal{G}_i)$ by binarizing $s_c(x_i, \mathcal{G}_i)$ to 0 or 1 based on whether it is greater than or less than a threshold T, respectively. If $y_c(x_i, \mathcal{G}_i) = 1$ , then the KG is useful, and vice versa. See the appendix for more details about why we use ensemble-based saliency for coarse explanations (Sec. A.2) and how we tune T (Sec. A.6).
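This computation (Eq. 1 plus thresholding) can be sketched as follows, assuming the two models' confidence probabilities are already available:

```python
def coarse_saliency(p_kg, p_nokg, is_correct):
    # Eq. 1: s_c > 0 when the KG model is more confident than the No-KG model
    # on a correct choice, or less confident on an incorrect one.
    return (p_kg - p_nokg) if is_correct else (p_nokg - p_kg)

def coarse_explanation(p_kg, p_nokg, is_correct, T=0.0):
    # Binarize s_c against threshold T: y_c = 1 means "the KG is useful here".
    return int(coarse_saliency(p_kg, p_nokg, is_correct) > T)
```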
69
+
70
+ Even if $\mathcal{G}$ is useful, not every part of $\mathcal{G}$ may be useful. Hence, fine saliency explanations can identify which parts of a KG are actually useful. For a given instance $(x,\mathcal{G})$ , we denote the fine saliency explanation for a fine unit u in $\mathcal{G}$ as $y_f(u;x,\mathcal{G}) \in \{0,1\}$ . Fine units can be nodes, paths, etc. in the KG. If a graph encoder $f_{\text{graph}}$ encodes a certain type of unit, it is natural to define $y_f(u;x,\mathcal{G})$ with respect to such units. For example, MHGRN [13] encodes $\mathcal{G}$ 's nodes, so we define MHGRN's fine saliency explanations with respect to nodes. Similar to coarse saliency explanations, to obtain $y_f(u;x,\mathcal{G})$ , we first compute fine saliency score $s_f(u;x,\mathcal{G}) \in \mathbb{R}$ , and then binarize it. For a QA input $x_i = q \oplus a_i$ and its KG $\mathcal{G}_i$ , let $u_{ij}$ be the $j^{th}$ fine unit in $\mathcal{G}_i$ and $p_{KG}(x_i,\mathcal{G}_i)$ denote $\mathcal{F}_{KG}$ 's predicted probability for $x_i$ . There are many existing saliency methods (a.k.a. attribution methods) [10, 51, 30] for calculating the importance score of an input, with respect to a model and a given label. While $s_f(u_{ij};x_i,\mathcal{G}_i)$ can be computed via any saliency method, we use gradient-based and occlusion-based methods, since they are the most common types of saliency methods [2].
71
+
72
+ Let $\phi(u_{ij}; x_i, \mathcal{G}_i)$ denote the raw saliency score given by some saliency method. Gradient-based methods measure an input's saliency via the gradient of the model's output with respect to the input. We use the $gradient \times input$ (Grad) method [10], where $\phi(u_{ij}; x_i, \mathcal{G}_i)$ is the dot product of $u_{ij}$ 's embedding and the gradients of $p_{KG}(x_i, \mathcal{G}_i)$ with respect to $u_{ij}$ . Occlusion-based methods measure an input's saliency as how the model's output is affected by erasing that input. We use the leave-one-out (Occl) method [30], where $\phi(u_{ij}; x_i, \mathcal{G}_i)$ is the decrease in $p_{KG}(x_i, \mathcal{G}_i)$ if $u_{ij}$ is removed from $\mathcal{G}_i$ , i.e., $\phi(u_{ij}; x_i, \mathcal{G}_i) = p_{KG}(x_i, \mathcal{G}_i) - p_{KG}(x_i, \mathcal{G}_i \setminus u_{ij})$ .
73
+
74
+ Intuitively, a unit is more useful if it increases the probability of correct answer choice $a^*$ , and vice versa. Thus, we define the saliency score $s_{\rm f}(u_{ij};x_i,\mathcal{G}_i)$ for unit $u_{ij}$ as Eq. 2. Next, we binarize the saliency scores to get $y_{\rm f}(u_{ij};x_i,\mathcal{G}_i)$ , by selecting the top-k%-scoring units in $\mathcal{G}_i$ and setting $y_{\rm f}(u_{ij};x_i,\mathcal{G}_i)=1$ (i.e., $u_{ij}$ is useful) for these units. For
75
+
76
+ $$s_{\mathbf{f}}(u_{ij}; x_i, \mathcal{G}_i) = \begin{cases} \phi(u_{ij}; x_i, \mathcal{G}_i), & a_i = a^* \\ -\phi(u_{ij}; x_i, \mathcal{G}_i), & a_i \neq a^* \end{cases} \quad (2)$$
79
+
80
+ all other units in $\mathcal{G}$ , we set $y_{\mathbf{f}}(u_{ij}; x_i, \mathcal{G}_i) = 0$ (i.e., $u_{ij}$ is not useful). See the appendix for more details about the fine saliency methods (Sec. A.3) and tuning threshold k (Sec. A.6).
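The occlusion-based variant of this pipeline can be sketched as follows — `model` is an assumed callable standing in for $p_{KG}$:

```python
import numpy as np

def occlusion_phi(model, x, G):
    # phi(u_j) = p_KG(x, G) - p_KG(x, G without u_j), for every fine unit u_j in G.
    base = model(x, G)
    return np.array([base - model(x, G[:j] + G[j + 1:]) for j in range(len(G))])

def fine_explanations(phi, is_correct, k=0.5):
    s = phi if is_correct else -phi          # Eq. 2: flip sign for incorrect choices
    n_keep = max(1, int(round(k * len(s))))  # keep the top-k%-scoring units
    y = np.zeros(len(s), dtype=int)
    y[np.argsort(-s)[:n_keep]] = 1           # y_f = 1 marks a useful unit
    return y
```

With a toy additive model whose output is just the sum of unit values, each unit's occlusion score recovers its own contribution.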
81
+
82
+ In this section, we analyze KG saliency explanations' potential to improve KG-augmented models' performance. Recall that creating saliency explanations requires the task's ground truth labels (Sec. 3), so directly using test set explanations is infeasible. Still, before exploring ways to leverage training set explanations (Sec. 5), we first establish upper bounds on how much models can benefit from saliency explanations. Here, we study three key questions: (1) *Does the model improve when provided oracle access to coarse/fine explanations?* (2) *Are coarse and fine explanations complementary?* (3) *How do gradient-based explanations compare to occlusion-based explanations?*
83
+
84
+ ORACLE models are KG-augmented models with oracle access to saliency explanations. An ORACLE model uses ground truth labels to create explanations (even at inference time), and then uses the explanations as extra inputs to perform hard attention over the units. We define the model attention
85
+
86
+ | Model | Output | Saliency Weights |
87
+ |---------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------|
88
+ | ORACLE-Coarse | $\mathcal{F}_{\mathrm{c}}^*(x,\mathcal{G}) = y_{\mathrm{c}}(x,\mathcal{G})\mathcal{F}_{\mathrm{KG}}(x,\mathcal{G}) + (1 - y_{\mathrm{c}}(x,\mathcal{G}))\mathcal{F}_{\mathrm{No\text{-}KG}}(x)$ | $[y_{\rm c}(x,\mathcal{G}),1-y_{\rm c}(x,\mathcal{G})]$ |
89
+ | ORACLE-Fine | $\mathcal{F}^*_{\mathrm{f}}(x,\mathcal{G}) \sim \mathcal{F}_{\mathrm{KG}}(x,\mathcal{G})$ | $\hat{y}_{\mathrm{f}}(x,\mathcal{G})\odot y_{\mathrm{f}}(x,\mathcal{G})$ |
90
+ | ORACLE-Hybrid | $\mathcal{F}_{\mathrm{h}}^*(x,\mathcal{G}) = y_{\mathrm{h}}(x,\mathcal{G})\mathcal{F}_{\mathrm{f}}^*(x,\mathcal{G}) + (1-y_{\mathrm{h}}(x,\mathcal{G}))\mathcal{F}_{\mathrm{No\text{-}KG}}(x)$ | $[y_{h}(x,\mathcal{G}),1-y_{h}(x,\mathcal{G})]$ |
91
+
92
+ Table 2: Comparison of Oracle Models. For each Oracle Model, we show its output and saliency weights. Note that the explanations are given (not predicted), so there is no $\mathcal{L}_{sal}$ . While $\mathcal{F}_c^*$ and $\mathcal{F}_h^*$ are both ensembles of $\mathcal{F}_{KG}$ and $\mathcal{F}_{No-KG}$ , $\mathcal{F}_f^*$ has the same architecture as $\mathcal{F}_{KG}$ (denoted by $\sim$ ) besides the attention masking.
93
+
94
+ weights that are modified based on saliency explanations as *saliency weights*. Below, we introduce the ORACLE-Coarse, ORACLE-Fine, and ORACLE-Hybrid models, shown in Fig. 3a-c.
95
+
96
+ **ORACLE-Coarse** ORACLE-Coarse ( $\mathcal{F}_c^*$ ) uses coarse explanations to do hard attention over $\mathcal{F}_{KG}$ 's and $\mathcal{F}_{No\text{-}KG}$ 's predictions. First, $\mathcal{F}_{KG}$ and $\mathcal{F}_{No\text{-}KG}$ are trained separately, then frozen. Next, for each instance $(x,\mathcal{G})$ , they are used to create a coarse explanation $y_c(x,\mathcal{G}) \in \{0,1\}$ . Then, $\mathcal{F}_c^*$ is defined as an ensemble model that performs hard attention over coarse units ( $\mathcal{G}$ and None) by weighting $\mathcal{F}_{KG}$ 's prediction with $y_c(x,\mathcal{G})$ and $\mathcal{F}_{No\text{-}KG}$ 's prediction with $1-y_c(x,\mathcal{G})$ (Table 2; Fig. 3a). In other words, $y_c(x,\mathcal{G})$ and $1-y_c(x,\mathcal{G})$ are the saliency weights for $\mathcal{F}_c^*$ .
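In code, the ORACLE-Coarse ensemble is just a hard gate between the two frozen models — a sketch, where `F_kg` and `F_nokg` are assumed callables returning plausibility scores:

```python
def oracle_coarse(F_kg, F_nokg, y_c, x, G):
    # Saliency weights [y_c, 1 - y_c] hard-attend over the coarse units {G, None}.
    return y_c * F_kg(x, G) + (1 - y_c) * F_nokg(x)
```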
97
+
98
+ **ORACLE-Fine** ORACLE-Fine ( $\mathcal{F}_f^*$ ) has the same architecture as $\mathcal{F}_{KG}$ and uses fine explanations to do hard attention over fine units (*i.e.*, nodes or paths in $\mathcal{G}$ ). First, $\mathcal{F}_{KG}$ is trained, then frozen. As usual, $\mathcal{F}_{KG}$ uses soft attention over fine units in $\mathcal{G}$ to compute graph embedding $\mathbf{g}$ (Sec. 2). Then, for each fine unit u in $\mathcal{G}$ , $\mathcal{F}_{KG}$ is used to create fine explanation $y_f(u;x,\mathcal{G}) \in \{0,1\}$ . Let $\hat{y}_f(u;x,\mathcal{G}) \in [0,1]$ denote $\mathcal{F}_f^*$ 's soft attention weight for u. We train $\mathcal{F}_f^*$ the same way as $\mathcal{F}_{KG}$ , except each $\hat{y}_f(u;x,\mathcal{G})$ is (hard attention) masked with $y_f(u;x,\mathcal{G})$ , *i.e.*, $\hat{y}_f(u;x,\mathcal{G}) \leftarrow \hat{y}_f(u;x,\mathcal{G}) \odot y_f(u;x,\mathcal{G})$ , where $\odot$ denotes element-wise multiplication (Table 2; Fig. 3b). This means only units with $y_f(u;x,\mathcal{G}) = 1$ will have $\hat{y}_f(u;x,\mathcal{G}) > 0$ and thus be able to influence $\mathcal{F}_f^*$ 's prediction. Let $y_f(x,\mathcal{G})$ and $\hat{y}_f(x,\mathcal{G})$ denote the explanations and soft attention weights, respectively, for all units in the graph. Then, $\hat{y}_f(x,\mathcal{G}) \odot y_f(x,\mathcal{G})$ are the saliency weights for $\mathcal{F}_f^*$ .
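The masking step can be sketched as below; whether the masked weights are renormalized is an assumption here and depends on the graph encoder:

```python
import numpy as np

def mask_attention(att_soft, y_f, renormalize=True):
    # \hat{y}_f <- \hat{y}_f ⊙ y_f : units with y_f = 0 can no longer influence g.
    masked = att_soft * y_f
    if renormalize and masked.sum() > 0:
        masked = masked / masked.sum()
    return masked

att = np.array([0.5, 0.3, 0.2])  # soft attention over three fine units
y_f = np.array([1, 0, 1])        # fine explanation: the second unit is not useful
```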
99
+
100
+ **ORACLE-Hybrid** ORACLE-Hybrid $(\mathcal{F}_h^*)$ unifies ORACLE-Coarse and ORACLE-Fine as a single model, thus leveraging the coarse-fine hierarchy inherent in KG saliency explanations. First, $\mathcal{F}_f^*$ (which uses fine explanations) and $\mathcal{F}_{\text{No-KG}}$ are separately trained, then frozen. Then, for each $(x,\mathcal{G})$ , $\mathcal{F}_f^*$ and $\mathcal{F}_{\text{No-KG}}$ are used to create $y_h(x,\mathcal{G}) \in \{0,1\}$ , which we define as the coarse explanation for $\mathcal{F}_f^*$ and $\mathcal{F}_{\text{No-KG}}$ . $y_h(x,\mathcal{G})$ is computed the same way as $y_c(x,\mathcal{G})$ , besides replacing $\mathcal{F}_{\text{KG}}$ with $\mathcal{F}_f^*$ . Finally, similar to $\mathcal{F}_c^*$ , $\mathcal{F}_h^*$ is an ensemble that performs hard attention over coarse units by weighting $\mathcal{F}_f^*$ 's prediction with $y_h(x,\mathcal{G})$ and $\mathcal{F}_{\text{No-KG}}$ 's prediction with $1-y_h(x,\mathcal{G})$ (Table 2; Fig. 3c). That is, $y_h(x,\mathcal{G})$ and $1-y_h(x,\mathcal{G})$ are the saliency weights for $\mathcal{F}_h^*$ .
2104.09667/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
 
 
1
+ <mxfile host="app.diagrams.net" modified="2021-04-04T11:05:43.404Z" agent="5.0 (Macintosh; Intel Mac OS X 11_2_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.114 Safari/537.36" etag="4SkxGGfDiJXcmEyPK2RN" version="14.4.6"><diagram id="3f55DupgREVL4jnSsJE5" name="Page-1">7V1bk5s2FP41nmkfdoebBDxmvdlsHtJpJ51Jm5cOa1ibCTYuxllvf33FRbbRxcY2QsJWHjZGgBA6n/QdnXN0GNnj+eZTFixnX9IwSkaWEW5G9uPIsjzfQn+LgveqAFiwKphmcVgVmbuCr/F/UV1o1KXrOIxWjQvzNE3yeNksnKSLRTTJG2VBlqVvzcte06T51GUwrZ9o7Aq+ToIkoi77Fof5rH4ty92VP0fxdIafbEK/OjMP8MV1xatZEKZve0X2x5E9ztI0r37NN+MoKfoO90t13xPn7LZhWbTI29zwz89vq/T5+XvwKQ//+PP5Ywy/x3e4mp9Bsq7fuG5t/o67IEvXizAqajFH9sPbLM6jr8tgUpx9QzJHZbN8ntSn6+qiLI823Iaa29dHsInSeZRn7+iS+gbLrXushoxn11W87QRg4mtme50P6rKglvl0W/WuW9CPumdO6SWL0UswQY99eE3RS+13F/x3neITd6sSzx/QBZa33OxOol/T6v8xrgg1rKoLnyGkgPozb3Z1kMTTBfo9Qf0aZaig6PUYQfdDfWIeh2Fx+0MWoYYEL2VVBjpepvEiLzsJPIzAY1HXOk+rxnYlRQcQUgS0FC2GEG1hQrS1EE8VIoSqCdHhCjFuI0GTI8HTcMCuJeSioIvacS2r9QsuQp08NvYeunfqSFNQccwFKRIDYtfo+FwfrJYV5b7Gm4IfxEz+kEKcw0CcIwxxQCOugbi5Soh7jZNknCZpVlZkQzgeG0ZHBGaohkSonq7meMrpau4AxuveuDGb6kP7ESVicKszsIVgU/YA9jQ0z4PmrXAONFWDrC9o0XR/f3+1iyb/+KKJJURhiyZcsV75thei6yi28rUVtNS5ylnqbL6lTkmKXShDsUppfwIplsasZIq1+XZJDdnhaIUdQNNXzeJgs6ytZI8twg+Fn61g8iRYreJJs6PQu2fvfxXMfQ/w4d/75x43Na1XR+/10SrP0h8RaxKozmBvXFnpJs73HoGOtk9Av3cPKA5w/dWLRCF2//GkhV42XWeTqMW6Lg+yaZQfuhCw5b8nX3CALbMoCfL4Z7PBLKHXT/i90JP29FKrCS+frKJ60fquHXSoihzTvwdNpPpEVVVXUFUhpATve5fVmhy/ySa7ydyWcV5xNwiqFuyGxFYKF4wSvoV4tQwWzCnvJZj8mJbq2t2kgngx82XTl18sgKozyjnWIH7/evk0b5msufUhyCezEmCr2fr1NYkXU3SEWmXBYF5MfIuX1bLVzFq9MDFj8zqB2bZqbBpddVBcLi6K+qpXnMeb8u24L9Z8g4GvYhysD9fjwWWsYmyWgm4Jo5QWRuxzKWVHIqIoZcsiwikFtKSUSsCyKIWcbz3jTEpxLYOgFNeHQiiF12Ruy0hrjtEHpbCcGCeOkgEpUHJB7Lkd6UWQtBkJ0opISB7TiiCQoRWxnB2X2KsIm8ADcG3/saOFF45SrDsIYiuePDuWKLv7FZtsTdMlxAhoMfZqs8XzgZbiCVK0PdWkyLK8q2ag0zFnXU7/gEJcr3Y3x9KIayBOKUsvoYg8lf+6ojBLNSiyvBOXqHFd9BKiRcX0NWcIcclKhvZcG5kwwCl7CA8hgllJbN4M7di2aphlmYl13NlhKTrm8ZVTr4FnDj8WW69/uVKEqq1/B2TJMx3lTHkO35SnJP2qE9+jlGookn5p0EqmX8A3XGrMDkdl7AKbUDWLBGC
ZY8kuExqUtu85pbysWzaT7FN12galVY46Wc5X0yLnPp8ATlvvq4n0XyKGgKqrIwesCVy38SCXzL1B3mAcvkGMCxbwzciDC0xLszDKrisuLQtGtxqXZpqgOeoBpNX0fgPTQAtDt9jAtEvDz/hU1CGt2G1pxZFLK6bTBBg0zqUVx7EIWgE+EEMrJEvgRnPbRvIQcYMgWmmxK6D/nFGklip7JxrgW9217YsXrkywApS96RKIMkNfsRAhZb+ULcQhZATR8TtdTv6S982BIST60PE7lZbcpSXXoXz/spHIcj5I1tUcMsBTuq6GF59Kj1clIySujUpobEoewHAIwcdKQvNWOAeSUR/SIcs3O+vYHZ4QSS+I7JxRUGdLPiNnlGIrX6igpc5VzlIHBxYfq06wg1Lan0CKpTErm2L5dkkN2eFohR1A01fN4gD51lapDvff0mweFK3PgkWYFjBoeNpLP3zlgb9Kjzu5sGU53JkOWWEOdyg4rLhLAiB3iLKygfastOgEAZcnCPBcyQq6yzc8aim2TRAgX4pDsNFpB2OX079kJQvPYxpxo9tyMFIJAuRDcQgJAuTra65OEHANRiYx4JQ9hAdmAFUHmzdDO2SCAPmY1QkCLk8QwFo59epldHWCgMsTBMhf/w7IkkfttVZANdQJAq5BNewXtJLp1+MbLjVmh6MydoFNMkGAfGx2miBgtL+Pc7et88KdnIfTAFA7OZ+eDMPrOEGA2/YTAx5kA6CnnZwGaO5x9B2ijvY7OYlZ1HfEfGLAtIl9bL7ZQ8J17BUZ7m7/z9W3Wp4+E3vjs2iZoCltHpWVF9v/66IqH0B5WYTk8RQGeYCFc2w6VS4jwOhW9v4bRJgk66M0wGfMKMJCEbzr2/svhDFsqUSAFrpNRcM8lwjIffZUTV0RAdlk1zv8pQ46T4zXB3O0CBPeg/8iXURN7PPRR6O7mEr5n/Y8jlNp8IMWARry6y6t4QcJjcaD7eDXmbxZ9n4tb0re1FcOSTmdIHHnzC8mdibzgUUMax9P70YmaiHv03pZvwv5076CJmaewtoYXrEf18XYtoNjqQU71NNwZO3RlT2Um6PJBuQM65+r0vmET8Q0vJ7n1xY+EClYPWxp4ti5NFapfGJk6PDZuoBBhH9QmBeNVAVTQlABliZNPf065fyBOTjU0ZeUcsoJwaZk/wZWyzQ0tSrfNkpYHGbRYZYWHb/jKPTusy9pGBVX/A8=</diagram></mxfile>
2104.09667/main_diagram/main_diagram.pdf ADDED
Binary file (39.2 kB). View file
 
2104.09667/paper_text/intro_method.md ADDED
@@ -0,0 +1,104 @@
 
1
+ # Introduction
2
+
3
+ The data-driven nature of modern machine learning (ML) training routines puts pressure on data supply pipelines, which are becoming increasingly complex. It is common to find separate disks or whole content distribution networks dedicated to servicing massive datasets. Training is often distributed across multiple workers. This emergent complexity gives an attacker a perfect opportunity to disrupt ML training while remaining covert. Stochastic gradient descent (SGD) assumes uniform random sampling of items from the training dataset, yet in practice this randomness is rarely tested or enforced. Here, we focus on adversarial data sampling.
4
+
5
+ It is now well known that malicious actors can poison data and introduce backdoors, forcing ML models to behave differently in the presence of triggers [11]. While such attacks have been shown to pose a real threat, they require that the attacker can perturb the dataset used for training.
6
+
7
+ We show that by simply changing the order in which batches or data points are supplied to a model during training, an attacker can affect model behaviour. More precisely, we show that it is possible to perform *integrity* and *availability* attacks without adding or modifying *any* data points. For *integrity*, an attacker can reduce model accuracy or arbitrarily control its predictions in the presence of particular triggers. For *availability*, an attacker can increase the amount of time it takes for the model to train, or even reset the learning progress, forcing the model parameters into a meaningless state.
8
+
9
+ We present three different types of attacks that exploit *Batch Reordering*, *Reshuffling* and *Replacing* – naming them BRRR attacks. We show that an attacker can significantly change model performance by (i) changing the order in which batches are supplied to models during training; (ii) changing the order in which individual data points are supplied to models during training; and (iii) replacing datapoints from batches with other points from the dataset to promote
10
+
11
+ specific data biases. Furthermore, we introduce Batch-Order Poison (BOP) and Batch-Order Backdoor (BOB), the first techniques that enable poisoning and backdooring of neural networks using only clean data and clean labels; an attacker can control the parameter update of a model by appropriate choice of benign datapoints. Importantly, BRRR attacks require no underlying model access or knowledge of the dataset. Instead, they focus on the stochasticity of gradient descent, disrupting how well individual batches approximate the true distribution that a model is trying to learn.
12
+
13
+ To summarise, we make the following contributions in this paper:
14
+
15
+ - We present a novel class of attacks on ML models that target the data batching procedure used during training, affecting their integrity and availability. We present a theoretical analysis explaining how and why these attacks work, showing that they target fundamental assumptions of stochastic learning, and are therefore model and dataset agnostic.
16
+ - We evaluate these attacks on a set of common computer vision and language benchmarks, using a range of different hyper-parameter configurations, and find that an attacker can slow the progress of training, as well as reset it, with just a single epoch of intervention.
17
+ - We show that data order can poison models and introduce backdoors, even in a blackbox setup. For a whitebox setup, we find that the adversary can introduce backdoors almost as well as if they used perturbed data. While a baseline CIFAR10 VGG16 model that uses perturbed data gets 99% trigger accuracy, the whitebox BOB attacker gets 91% ± 13 and the blackbox BOB attacker achieves 68% ± 19.
18
+
19
+ # Method
20
+
21
+ ![](_page_1_Figure_8.jpeg)
22
+
23
+ Figure 1: The attacker reorders the benign randomly supplied data based on the surrogate model outputs. Attacker co-trains the surrogate model with the data that is supplied to the source model.
24
+
25
+ We assume one of the strongest threat models currently described in the literature. In particular, our blackbox attacker assumes no access to the model and no prior knowledge of the training data, whereas a whitebox attacker has access to the model under attack and can compute its loss directly. The attack specifically focuses on the batching part of the ML pipeline as is depicted in Figure 1. We discuss the related work in Section 4.
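A minimal sketch of the reordering step in Figure 1, assuming the attacker has a surrogate model exposing a per-batch loss (`surrogate_loss` is a hypothetical callable; the actual attack policies are a design choice of the attacker):

```python
import numpy as np

def reorder_batches(batches, surrogate_loss, descending=True):
    # Rank the benign batches by the surrogate model's loss and supply them to
    # the victim in that order; no data point is modified, added, or duplicated.
    losses = np.array([surrogate_loss(b) for b in batches])
    order = np.argsort(-losses if descending else losses)
    return [batches[i] for i in order]
```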
26
+
27
+ This attack is realistic and can be instantiated in several ways. The attack code can be infiltrated into: the operating system handling file-system requests; the disk handling individual data accesses; the software that determines how random data sampling is performed; the distributed storage manager; or the machine learning pipeline itself handling prefetch operations. That is a substantial attack surface, and for large models these components may be controlled by different principals. The attack is also very stealthy. The attacker does not add any noise or perturbation to the data. There are no triggers or backdoors introduced into the dataset. All of the data points are natural. In two of the four variants the attacker uses the whole dataset and does not oversample any given point, *i.e.* the sampling is without replacement. This makes it difficult to deploy simple countermeasures.
28
+
29
+ We assume that the defender is trying to train a deep neural network model with parameters θ over training samples $X_i \sim X_{\text{train}}$, solving a non-convex optimization problem that corresponds to minimization of a given
30
+
31
+ loss function $L(\theta)$ . We will denote the training dataset $X=\{X_i\}$ . We assume a commonly-used loss function defined as the sample average of the per-datapoint loss $L_i(\theta)=L(X_i,\theta)$ over the $k$-th batch of the training set, where $B$ is the batch size: $\hat{L}_{k+1}(\theta)=\frac{1}{B}\sum_{i=kB+1}^{kB+B}L_i(\theta)$ . If we let $N\cdot B$ be the total number of items for training, then in a single epoch one aims to optimize: $\hat{L}(\theta)=\frac{1}{N}\sum_{i=1}^{N}\hat{L}_i(\theta)$ . Optimization with the stochastic gradient descent (SGD) algorithm over $N\cdot B$ samples with a learning rate $\eta$ leads to the following weight update rule over one epoch: $\theta_{k+1}=\theta_k+\eta\Delta\theta_k$ ; $\Delta\theta_k=-\nabla_{\theta}\hat{L}_k(\theta_k)$ . SGD is often implemented with momentum [22, 29], with $\mu$ and $v$ representing momentum and velocity respectively: $v_{k+1}=\mu v_k+\eta\Delta\theta_k$ ; $\theta_{k+1}=\theta_k+v_{k+1}$ .
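For concreteness, the per-batch update rules above can be sketched in NumPy on a toy one-parameter quadratic loss $L_i(\theta)=\frac{1}{2}(\theta-x_i)^2$ (a hedged illustration; the dataset, `eta`, and `mu` values are our own and not the paper's training setup):

```python
import numpy as np

def sgd_momentum_step(theta, v, grad, eta=0.1, mu=0.5):
    """One SGD-with-momentum step: v_{k+1} = mu*v_k + eta*dtheta_k,
    theta_{k+1} = theta_k + v_{k+1}, where dtheta_k = -grad."""
    v_next = mu * v + eta * (-grad)
    theta_next = theta + v_next
    return theta_next, v_next

rng = np.random.default_rng(0)
data = rng.normal(loc=3.0, size=256)   # toy dataset X; loss minimiser is the mean
theta, v, B = 0.0, 0.0, 32             # initial parameter, velocity, batch size
for k in range(0, len(data), B):       # one epoch of per-batch updates
    batch = data[k:k + B]
    grad = np.mean(theta - batch)      # d/dtheta of the batch-averaged loss
    theta, v = sgd_momentum_step(theta, v, grad)
# After one epoch, theta has moved from 0 towards the loss minimiser (~3).
```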
32
+
33
+ Given data, SGD's stochasticity comes from the batch sampling procedure. Mini-batched gradients approximate the true gradients of $\hat{L}$ and the quality of this approximation can vary greatly. In fact, assuming an unbiased sampling procedure, *i.e.* when the k'th gradient step corresponds to $i_k$ 'th batch with $\mathbb{P}(i_k=i)=1/N$ , in expectation the batch gradient matches the true gradient:
34
+
35
+ $$\mathbb{E}[\nabla \hat{L}_{i_k}(\theta)] = \sum_{i=1}^N \mathbb{P}(i_k = i) \nabla \hat{L}_i(\theta) = \frac{1}{N} \sum_{i=1}^N \nabla \hat{L}_i(\theta) = \nabla \hat{L}(\theta). \tag{1}$$
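Equation (1) can be checked numerically: for a toy quadratic loss, averaging the per-batch gradients over all equally-likely batches recovers the full-dataset gradient (a sketch with an illustrative dataset and loss, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(size=100)
theta, B = 0.5, 10

# Gradient of the quadratic loss 0.5*(theta - x)^2, averaged over each batch.
batch_grads = [np.mean(theta - data[k:k + B]) for k in range(0, len(data), B)]
full_grad = np.mean(theta - data)

# Uniform sampling over batches: E[batch gradient] equals the full gradient.
mean_batch_grad = np.mean(batch_grads)
```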
36
+
37
+ Although this holds in expectation, a given batch taken in isolation can be very far from the mean. This variation has been exploited in the literature to aid training: entire subfields are devoted to variance-reduction techniques for stochastic optimisation [16], curriculum learning [5], and core-set construction [2]. Each area looks at identifying and scheduling data subsets that aid training and give a better approximation of the true gradient. In this paper, we turn things around and investigate how an attacker can exploit data order to break training. The explicit stochastic assumption opens a new attack surface through which the attacker can influence the learning process. In particular, let us consider the effect of N SGD steps over one epoch [27]:
38
+
39
+ $$\theta_{N+1} = \theta_1 - \eta \nabla \hat{L}_1(\theta_1) - \eta \nabla \hat{L}_2(\theta_2) - \dots - \eta \nabla \hat{L}_N(\theta_N)$$
40
+
41
+ $$= \theta_1 - \underbrace{\eta \sum_{i=1}^{N} \nabla \hat{L}_i(\theta_1)}_{\text{first-order term}} + \underbrace{\eta^2 \sum_{i=1}^{N} \sum_{j<i} \nabla \nabla \hat{L}_i(\theta_1) \nabla \hat{L}_j(\theta_1)}_{\text{second-order correction term}} + O(N^3 \eta^3) \tag{2}$$
42
+
43
+ As we can see, the second-order correction term depends on the order in which the batches are provided. The attacker we describe in this paper focuses on manipulating it, *i.e.* finding a sequence of updates such that the first and second derivatives are misaligned with the true gradient step. In Appendix C we prove that, under equally-distributed loss gradients, a change in data order can increase the expected value of the order-dependent term. We also derive a condition on the gradient distribution of a given model under which this increase is guaranteed. Finally, we derive an attack objective that explicitly targets the upper bound on the rate of SGD convergence in Appendix A.
44
+
45
+ In this paper we assume the blackbox attacker has no access to the underlying model, and thus no way to monitor its errors or the progress of its training. Instead, we co-train a separate surrogate model on the batches supplied to the target model. We find that in practice the losses produced by the surrogate model approximate the losses of the true model well enough to enable attacks on both integrity and availability. In Appendix D we empirically find that our blackbox reshuffle attacks perform as well as the whitebox ones.
46
+
47
+ Finally, although the attacker can maximise the term dependent on data order directly, in practice it is expensive to do so. Therefore, in the attack we make use of the loss magnitudes directly rather than the gradient of the loss. Intuitively, large prediction errors correspond to large loss gradient norms, whereas correct predictions produce near-zero gradients.
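This intuition is easy to verify numerically: for a softmax/cross-entropy model, per-example loss and loss-gradient norm are positively correlated (a sketch with random logits, not the paper's models):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(4)
losses, grad_norms = [], []
for _ in range(100):
    logits = rng.normal(size=10)
    y = int(rng.integers(10))          # random true class
    p = softmax(logits)
    losses.append(-np.log(p[y]))       # cross-entropy loss of this example
    g = p.copy()
    g[y] -= 1.0                        # d(loss)/d(logits) = p - onehot(y)
    grad_norms.append(np.linalg.norm(g))

# Larger losses tend to come with larger gradient norms.
corr = np.corrcoef(losses, grad_norms)[0, 1]
```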
48
+
49
+ In this section we describe the taxonomy of batching attacks as shown in Figure 2. The overall attack algorithm is shown in Algorithm 2 in the Appendix, and a shortened attack flow is shown in Algorithm 1. We highlight that our attacks are the first to successfully poison the underlying model without changing the underlying dataset. In this paper we use three attack policies – (1) **batch reshuffling** or changing the order of datapoints inside batches; (2) **batch reordering** or changing the order of batches; and (3) **batch replacement** or replacing both points and batches. We consider four reordering policies, motivated by research in the fields of curriculum learning [5] and core-set selection [2], which discovered that model training can be enhanced by scheduling how and what data is presented to the model. Such scheduling can help the model generalize and avoid overfitting through memorization. This paper does the opposite – we promote memorization and overfitting, forcing the model to forget generalizable features.
50
+
51
+ ![](_page_3_Picture_1.jpeg)
52
+
53
+ Figure 2: Taxonomy of BRRR attacks. Normal batching assumes randomly distributed data points and batches. Batch reordering assumes the batches appear to the model in a different order, but internal contents stay in the original random order. Batch reshuffling assumes that the individual datapoints within batches change order, but appear only once and do not repeat across batches. Finally, batch replacement refers to cases where datapoints or batches can repeat or not appear at all.
54
+
55
+ ```
56
+ Algorithm 1: A high level description of the BRRR attack algorithm
57
+
58
+ /* -- Attack preparation: collecting data -- */
59
+ do
60
+
61
+ get a new batch and add it to a list of unseen datapoints;
62
+ train surrogate model on a batch and pass it on to the model;
63
+
64
+ while first epoch is not finished
65
+
66
+ /* -- Attack: reorder based on surrogate loss -- */
67
+ while training do
68
+
69
+ rank each data point from epoch one with a surrogate loss;
70
+ reorder the data points according to the attack strategy;
71
+ pass batches to model and train the surrogate at the same time.
72
+ ```
73
+
74
+ Figure 3 shows attack policies. **Low to high** orders sequence items by their loss. **High to low** is an inverse of Low to high. **Oscillations inwards** picks elements from both sides in sequence. **Oscillations outwards** inverts the halves of the sequence and then picks elements from both sides.
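A minimal sketch of the four policies, applied to datapoint indices ranked by (surrogate) loss; the function name and tie-breaking details are our own illustration:

```python
def reorder(losses, policy):
    """Return datapoint indices ordered according to the given BRRR policy."""
    asc = sorted(range(len(losses)), key=lambda i: losses[i])  # low -> high loss
    if policy == "low_to_high":
        return asc
    if policy == "high_to_low":
        return asc[::-1]
    if policy == "oscillation_inwards":
        # Alternate between the two ends, moving towards the middle.
        out, lo, hi = [], 0, len(asc) - 1
        while lo <= hi:
            out.append(asc[lo]); lo += 1
            if lo <= hi:
                out.append(asc[hi]); hi -= 1
        return out
    if policy == "oscillation_outwards":
        # Start at the middle and alternate picks moving outwards.
        mid = len(asc) // 2
        out, step = [asc[mid]], 1
        while len(out) < len(asc):
            if mid - step >= 0:
                out.append(asc[mid - step])
            if mid + step < len(asc):
                out.append(asc[mid + step])
            step += 1
        return out
    raise ValueError(policy)
```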
75
+
76
+ Machine-learning poisoning and backdooring techniques aim to manipulate the training of a given model so as to control its behavior during inference. In the classical setting, both involve either appending adversarial datapoints $\hat{X}$ to the natural dataset $X$ or perturbing natural datapoints to $X+\delta$ in order to change model behaviour. This makes the attack easier to detect and to prevent. For example, an adversary may add a red pixel above every tank in the dataset to introduce the red-pixel trigger and cause other objects under red pixels to be classified as tanks.
77
+
78
+ We present batch-order poisoning (BOP) and batch-order backdooring (BOB) – the first poison and backdoor strategies that rely not on adding adversarial datapoints or perturbations during training, but only on changing the order in which genuine data are presented. BOP and BOB are based on the observation that the stochastic gradient update rule used in DNN training is an aggregation that is agnostic to batch contents. Indeed, consider a classical poison setting with an adversarial dataset $\hat{X}$ : $\theta_{k+1} = \theta_k + \eta \Delta \theta_k$ ; $\Delta \theta_k = -(\nabla_{\theta} \hat{L}(X_k, \theta_k) + \nabla_{\theta} \hat{L}(\hat{X}_k, \theta_k))$ .
79
+
80
+ Order-agnostic aggregation with a sum makes it hard to reconstruct the individual datapoints $X_k$ from just observing $\Delta\theta_k$ . Indeed, the stochastic nature of optimisation allows one to find a set of datapoints $X_j \neq X_i$ such that $\nabla_{\theta} \hat{L}(X_i, \theta_k) \approx \nabla_{\theta} \hat{L}(X_j, \theta_k)$ . Given a model and a dataset such that the gradient covariance matrix is non-singular, an attacker can approximate the gradient update from an adversarial dataset $\hat{X}$ using natural datapoints from the genuine dataset X, enabling poisoning without changing the underlying dataset in any way:
81
+
82
+ $$\theta_{k+1} = \theta_k + \eta \hat{\Delta} \theta_k, \text{ where } \begin{cases} \hat{\Delta} \theta_k = -\nabla_{\theta} \hat{L}(X_i, \theta_k) \\ \nabla_{\theta} \hat{L}(X_i, \theta_k) \approx \nabla_{\theta} \hat{L}(\hat{X}_k, \theta_k). \end{cases} \tag{3}$$
83
84
+
85
+ This gives rise to a surprisingly powerful adversary, who can introduce arbitrary behaviors into any models learned with stochastic gradient descent without having to add or perturb training data. This attack becomes more effective as training datasets become larger, further improving the attacker's ability to shape the gradient update. We discuss its fidelity in Appendix B.
86
+
87
+ ![](_page_4_Figure_1.jpeg)
88
+
89
+ Figure 3: We use four different reorder and reshuffle policies based on the corresponding data point and batch losses. We color-code the loss values from bright to dark colors, representing loss values from low to high. The **Low-high** policy orders a sequence by the loss magnitude. The **High-low** policy orders a sequence by the negative loss magnitude. **Oscillation inwards** orders elements of the sequence from the beginning and the end of the sequence one by one, as if oscillating between the two ends and moving towards the middle. Finally, **Oscillation outwards** orders the sequence by starting at the middle of an ordered sequence and picking elements on both sides of the current location.
90
+
91
+ ![](_page_4_Figure_3.jpeg)
92
+
93
+ (a) Natural image batch
94
+
95
+ (b) Poison datapoint batch
96
+
97
+ Figure 4: Examples of batches shown in (a) and (b) with similar gradient updates. Strong gradients are aligned across the layers and successfully change the prediction of poison datapoints.
98
+
99
+ We evaluated a number of different setups, and found that the attack works best when the attacker composes the batch of B-V natural data points and appends V adversarially-chosen data points $\hat{X}_i$ . A larger V gives a better gradient approximation, but conflicts more with the natural gradients; in this paper, up to 30% of the batch is filled with natural datapoints to strike a balance. Finding a precise batch reconstruction is an intractable problem that scales with the batch size and the size of the dataset. However, we find that random sampling works well; even if mistakes are made, the network still learns the poison over the course of a few batches. Overall, we try to minimize the following reconstruction error for a given poisoned batch $\hat{X}_j$ :
100
+
101
+ $$\min_{X_i} \quad \left\| \nabla_{\theta} \hat{L}(\hat{X}_j, \theta_k) - \nabla_{\theta} \hat{L}(X_i, \theta_k) \right\|^p \quad \text{s.t.} \quad X_i \in X. \tag{4}$$
102
103
+
104
+ Although more sophisticated approaches could help find better candidates, we find that random sampling works well enough for successful clean-data / clean-label poison and backdoor attacks. It also helps us strike a balance between the speed of batch construction and the impact on model performance. Figure 4 shows an example of a natural batch (a) whose gradient update closely resembles that of the poison batch in (b). More importantly, such an update results in a prediction change towards the target class.
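The random-sampling search behind Equation (4) can be sketched on a toy one-parameter linear-regression model (hedged: the model, data distribution, poison labels, and search budget are illustrative, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=1000)
y = 2.0 * x + 0.1 * rng.normal(size=1000)      # natural data: labels y ~ 2x
theta = 0.3                                    # current model parameter

def grad(xs, ys):
    # d/dtheta of the batch-averaged loss 0.5*(theta*xs - ys)^2
    return np.mean((theta * xs - ys) * xs)

# Gradient the adversary would like to apply, computed from a hypothetical
# poison batch whose labels follow y = 1*x instead of y = 2*x.
x_p = rng.normal(size=16)
target_grad = grad(x_p, 1.0 * x_p)

# Random-sampling search for a natural batch whose gradient matches it.
best_idx, best_err = None, np.inf
for _ in range(200):
    idx = rng.choice(len(x), size=16, replace=False)
    err = abs(grad(x[idx], y[idx]) - target_grad)
    if err < best_err:
        best_idx, best_err = idx, err
```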
2107.08929/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
 
 
1
+ <mxfile host="app.diagrams.net" modified="2022-01-14T15:57:48.671Z" agent="5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/97.0.4692.71 Safari/537.36" version="16.2.6" etag="S90mgxKC2yX3zjEDWPig" type="google"><diagram id="jSPALSiW6KFMzbJMwnSE">7V1bk9o6l/011Mw8RKWLdXtMJ52Th2QmdVJfzZynFBenmzo09AB90plfP5KxjS3JtgAJm4ZOKmkMyKC9tC9Le2+NyIen1z/W4+fHr6tZuhhhOHsdkY8jjBHBUP2nr/zeXSFE7C48rOez/EX7C9/n/5fmF/P3PbzMZ+mm9sLtarXYzp/rF6er5TKdbmvXxuv16lf9ZT9Xi/pdn8cP+R3h/sL36XiRWi/77/ls+7i7KjDfX/+czh8eizsjJnfPPI2LF+dDbB7Hs9Wvyr3I/Yh8WK9W291vT68f0oWevGJedh/oU8Oz5Qdbp8utzxvw7g3/jBcv+XcbYbZQb737uVIj6ClcrNbZM+x/X/SnuhthArOf6iW42f7OZ6e4qAd4t8lk9169AKHn1+o72EP+f3a3ydq8oj7z7iMUl3HtLni9elnOUv01oHr61+N8m35/Hk/1s78U6tS1x+3TQj1C6tfNdr36O/1QfhdyJ2hCYflMIUWsP/d8sSheuVwt0/yruN6cT1663qavjQJApVjVekhXT+l2/Vu9JH8DowxQDMufXCD5wkCUEpDksPq1h5pEgNLd1ccK0EgO13GO74fydnsIqF9yFLgRQRyIaJx61D311elUczcbp+LnVF2fjTeP5Rj6wbfxdpuul9kVBReX1NhUpJOf6pmVut18qyeJBJIDkdyY+YRZ044YtCcdFZI4ZdaTQ2bdA/DGrKdoRlPumlLJOBmzQFjGJnqldKEXC3sWhUI0OX0eaZc+O1pPKQklix8jfrf5MR/xj7sLvspKzem2TSMVesZWPePF/EGviamSQaqu32kJzZUpep8/8TSfzfRtnIioYyaIugIIycqPqa4ESNj+WWHJPnGsIBxAbbGogn/MBD+i92v9y5d39xoBVwsCghJg2SkleHw2afNzSRtdr5S1SoZ075lA00BqkdvKPZbIRVSRP+w0+5frFTdiiSlfCuD5lrQ8h3zxFcuXSCBEryIuwvc2L1cFxM/615+L9PW9DtXVt0+Xs/zXj9PFeLOZT+sSWYwn6eJuPP37IZu15jCvGkpkP1YAiFqivWKGyO4j5bQEAjKRHEPGIZEIMkR3TxcjcgDV84jTRBIhRUJt7/wn1X+yt6kZ/Z9c5tmDv/QDUD738bX65MffNXiks4K18AdHRerUIfXi2jpdjLfzf+rDu6CQ3+Hbap6t2Rx7iYAG8Igxxmb1sp6m+duqfIU5EpeAsWTvYTJj4EQAyvdPG6HZdrx+SLfWbTLglrPih2V0ur6CbYzI5nm8dI6SMzJ6hPXD5N/VzKq/6vPCym//YY/7r+fZeJtqmLxu1+PpVumWvQrc3cymZMLdPtbX2mjtupymmwzMCsuYjZ+0Al1ONs/Z66B9yXrcNg+HWwh/zb9OFRzGk2wovZCfNTIzrNK7Ef2ox3rZrja5pmnRTQEMBCvi7woDApTVMGO9Kg3lUBlJCEPRPy2pfYTxjxH9kImX32X/XqTvEBMyyFC/TEGGtUImmm/h4i0P1Me4laEuLnz5fl8BwORQdXEQebpebZXdXS0LURpcKk3FLHF5NwJPCAvE6hFlUasiNSgfhrXFteSsiSIHzafwEYAtRS66NFyw8EuF/Re40EPIGqrlWwn5oWESmEBAnDFgiEvn/lLx4HXKmSKAcHVNU1PO+Kxyjsve/vrx9UrlzIVaz/2KNgBV67TLXz9/e9MmN6mqYTPKFQRA5LC5tgx
JgE015OJejTnu4mg2KvTe5pQJG1UJFDyqEB/vIICEj2rsB2Jo1M5/ZI++peu5+mp61e0uvs63GZcCEoHzx7sBFaLyx/vx9IPflQfmaKOBUyxcIaLqpdmIqY/oS7jocVElPQEb4/Kk1CYRSJZ4pPBulHlx4ddqPdMUwurnjjAwklIKlqGicebmWFdjVEgCiEEkc+g0KpyBIpQPbVeKMSK5DHqT/zpdBmGscI7gOf0FHIBX7c7guOJtXYSh7RMqP/GcMu6k90LI+Mr3+syI7swiDkDHdYv4mrfruXKMe17GAVg4P8r1jw9DoVx//kzZdOqK/2ZcTmAwHc2AmW7DEANCtFHrUSnXiJmVhiv+Prtvua/3b06HPH2apLPZfPmwadnNu14XHSEOkEHUN/joUl3lkRREXFrv4ebLIWn5cllOD7KY3XPYAxfVF1EVi2nqVsWTgFucOCGAm6SHdFZkFD5PdV4ZDzCv8em3oFSZd+1Bvl/fnPeEj0x82jFQ+aViBQyGqst2XyqJtcj04hJXDv2hxF12F2T5C+VdGC6zdzuIOwWs8e/Ky/IklbbvZ7ou0sD7bshjaUFnruhVLAj+NhcEp4BbsQy18OkNfT1edfvEjIUZiYZ9bmbE5FYxFPaJi/cMi/3B78U0xWTL1RedCTyyU+Vy9+2YROFyTTYsreGsIsnU7MaxJXroxBqmvItZARluB4i46OAb3G9wxwTrHc8ocCfYHDciwMNz4b7JsTahOqL36cXG00ae63sm4J0IFAQSahY8Ms7OWf1GwvPpJ8JE/XLBVbAxwaIiKyuf7sxoCZ8fexJaLreONmLyPBawtb723JgJT/CfiJlLr7SPiB2CoNIxLRHumbHjIvqbvfR8rmsdYCqyspvBkFKC1UY9lu9qaPHh+q5Yl863ig+W+38Hx2pMdkDDHjugO+uxBxAeCW870IFGYT5TEg4R5SDWwZK13iYgZjz2N4qy8OnLevH7bq1ErenM7g2k6m5TR2Ov/FJl0+md9EHOic4idTiLAjh2mFzJ3TKA7vYg09/q7OslYLa9YByXO3nts09JmW9xUuuveJmsRmbFvVn5bSc4b27pE3pVYmklV3FIXOkTzJFxE8KlSuLlwRqo+DN9Gs+X8+XDDRXdDZGsnBonKDgCLoURBBcuvrDZwTqCEI9AN3/6JCUhluQvyQvDZkUNQwYnfMBmSn0kKg1chHOtEo9unjd3/FB33Np3oFJ3AToODgRxYG6oUYmjIcJFDQbVH8PfUBs6xJioZfoKkxgUTBmdSk3eccDTd2GVLCTHXlrLXQIi0kU83hA5KEQmHDBaQSQ1g6YwkNS3qbSfKHpsnR+Sh/GZN0j2AEmZnENJ6rsMQkn2wqsOHQNcuU4V6VhEqYBWcbq34E3do7vhwnNIOnyj28P24Tbjp+dFapMB46km5jb2E8cTAZfZKI9yE2cYlTRiB/BDnNCQnK0rwvcMCrOmEqtGbFw1TcQQBrBqM+pgaSjDYsrsJzZeQlBGtJNgPns3xc129Xzb1beRw+2aXig7yj5jberTTgY6HGycZcAZSMgn7yrfx9XT5GXTIMTq2SgNEnQA55Bzg0IAgBPAzeRqrNuttAHAaWdQiKJfig/yOoM15247nalBeI31O4xWQqEs2iJvLhZiUgIoIS/SPowNZVqE6wdT1WY9AS869QUum2HWQTIJbP1ktOgl7/uGUiUe/Ybiu4eq5KE2N/+fqzfrNLtUm6bN220biWXbArDg/Si2I08IKBSiCmIFq3FDiUjenj6k6nvB/TEAJjdEJJAEFtbtYOKcJ4C11O/qgzaMxJxw9ADtTNMN55U1nE9gOvO31uiuzqvKl6+eU2HaN5acM0GXBiC0e9Z3arpKVvyvnSYrX3AIS96gu3aLv3AudteqVdN4aCouIXasqDBHjkwbZkwrNXs8Gi1VmNqU9l/p5s06QBwKR08nAcgZI/oAjUR6DuckL6K33HuRh3svgU9Z6tIcVA5MdahABiS00TQ
xREABjMO7BduJkwzyiC2CaScZHpnc/LB6en7ZOvZL/vyvf/1xb1/2prPexvnWPFF2xaK0KCic7yrkKUCurElZJlOedEBsrDTrB316vJ06+/L0NNaf0E/aRYb9bDV9ecqmtkvkkx0+vkzsdHunEp6xCaMNB96ZGZ8/cZbxGUT8GOBmN7jhPBHXMdshzpViHt0kznUAYZs8Go1i6QyXtuqv/D0+wX/+tsJ4thrOQhoYSEgh5pJgyaDYmdughx/W0dZqXgtcDNy8OooHKDo6E4EzVzGCPV44m8rwcJbJEed0uuGuqX8kB4L58sDPVve0vsphxyr3cU8Ht1aowrZ5rrSw/cUDTqtIKqdgCKs3ZuI6ni7wZga3+kWS9q0Gs/eZ8fqTdxpYZ/OQyD7yt9ViPtVT+LAez+Zp5gcNPmXo0ycBQxUaUhUXJeaunPtgBdeZmiGC/yJB6UKVeg+6OaweHozORczYsFDq5miFqwYro7O9liUxnROPcoEbjq8Axzo3m5vQO/ZccWwW4VOKY4LYYz9mSEX3JdAD2EJd8CaMEsiEI1A0kamgB7nQgwmQLWDxtoedBxM6naJ8JmpO0W7N7q+h5tTaiZcTlbicqPcPdcdpckn51yGdKSyJfQAZsVs2uFikIJ7UhfUr2VupnXh2diCAIBhhJY/rYiY6VnIRlpwki+Po/x5XcZYpf1vGmoy1d5x50bnwDMuYH5cW3x907pf/zNer5dPNDORLXwAmmussHFBCMBaWDthYGJRJCCAHnpA2OTAuAbfbiHXYBgRViBOgu1VBxV3MIv+uRJfelrde3rZzQYmzdsoVZAZZ1cfxpv2B58/013g9u6FHi47bvJSgANnsf7VvUywg3ZquDJ5OYqK2NWT1E8AC0Eq7zCP5Un2XynHqdganUnoVYxkt+ZLfmq4MHpEJBwlsa7oSBpLqNqTqvpn+29kgeWu6MnhIyuQcSlLfZRBK8tZ0xd10hVekYzVdwdgqGvAWPLR2dPRw3D56KoKw45+yeVM/p25dU9auf7is6Z8jy56z27QpIJO5DojC+Edb3lB4KgoJNRyzOChUtyEtvFo8EIrbGZPDB6HAZ1GF+jb9qELhwalfn/eFGG13vzgJ4YIjLmxXTLdwiHYQYsHr3pTOgMGnz0GsRF7WQbDoSMc/G9hAMo3XREH00hZ98NJlpF21QBRGtTBbtaihI6qW6PT77cwEl6RJMc97/XDkmQnWSDGVQ/iDEJvaa6a741VGnf01N9PV+nYKS45NM/UPJqA4LaeaVYB5uX8cej9PuKjqMBDRTXQWl9hLJ4BoiTJAGFXcC7MyTApX+dYZOmeIw5jgU5zTME0phmMDZJ2sl6ZTgdV3FJbID97EgvyU2wS0Hh40srb1uwVU5mm1rqDparlMp9s8tWJUIvqAlSVRScrvfS7W1X3PsZgCHHgnPDjOg7pfmN6XmKaZ92V5TpOiCsd35qszvJOsPcXFfNqJbRjI6g8vunBVJ1QCUv3BAXLepAd9t13Px8uHRtXuTEC024m0JfF02qBqVVRIedTfUUjHRreKOFCivL38Xwgd0kFAsNpPAOl40FrXJp1iCgYgHQ8W6uqkQwcjHQ/e5iTLUcTtdrcWLicwkuWwVVNfhsODJ/Hust6GfXwi9j9yeRdcGm67AQeDfY+MvWsTjl3z15dwPHLXrk04w7EaHsH7KVZjRlMxS1xWQ+AJycpsQs72AOyET6rUYjF/3nj4SOPNs4pt1YOf81ctAHN2k49SNsM7PowtFEPAa7NOHGFy02tOmnWf1KA3Mut26V5fs16WDF7DtNsNyfubdo8wuV4beaCrX7RHaa6NNOzs7icM053RlQ4+DiX77VXbueH23Ic4Tw4VO7lvdK4xBdLeNdhPO+2gQaNNu0c0e6g+qeUoGGIo942dyiXETCMJmDm9Sj20H1+F7OkN0N4FQY9g9sJmt3uLo5+Z9ohML2um9Xmeg8Fx3L3mhyveay7s3t4UiJ7OWkSwsxvVSVJ
+vFopJ/ahG3p195JFgKArcA4n5c2I3q9/oGuVNYXASitMXP0i4sn3fNU7B6aJlOe/jao5ykULaneG8u6jtzaULkOHakd2hJm37M5USY2xfSwJgW29oL0TTygDspIza1bpKLfBYVLC550gFKBop/EQVVk7OABA2d5IvbWtuc6+h0b2PeesC8BHnuWFinr2GkILcno4CKVWKr0z1+1weJpnbgp+hhboCUU+tw3V4LwUchv4C+uUZWR/W23mOZU2WW23qyeH+cqOn66p5ryD19Prg7Jhj2AyVisIjNfTQm2/36UFaM0KEWeCIJoIwQXP2/fun5dSJFBQSJhu4kul28IGsIpJIqw2Sg3YqiLdAfQA9hF5EE123UVNQW3L4wBlvZsxEDKkDqmri6FlVVrri3EcRV0wTI5QF0ctYZ+anKN59n75BEefVIZkB5/gYNpDEOsoQDFMk59AUf10PiIPr4Gr+wmyvsYxEdH8BHIRfoLua5YkJmz2qELK0d2fTYxDeQ2s1VkO5TUQe1eEMn2YgFl0Fs5xCFDsg10R+dfP33wj8FO2VjxyD0IoMBXkWILJGuRb1YAdYXZCra76x2mxTlbU80ibnUy+zJfpeP0ORRFZZds268Y6rN0xYfuHlNEu/9DVsjVAqg5CZyy0OewQvQYbUjcXgyM+JK+V7CLj3AWquwVU2IsjyzMppABX8GIdppF1Dmi8TUgv0mZYv88fnlbqBVHXcHficIC1qtyrQa1VF9sZwnaWEourjPuTI2RWtwOGcNfek4gkRxyAObws1rogqOvKWw5OeVvBuHPn6uAetNwcl7ZyjQHVM/bg6U6EWjjIDAYItJilqj1lx9rqInepNth5pI/PJf1DTjr2URiFWzEcSDDzzFBxAiTMY1jOCInDOvH0qxDqkBiauaCM296haBWkP0CMIn7TLsEaFZDEg4uLvAyXQPH8Y8TvRvRDlhPBFbBWzyP+Uf+9zpQKyhAQqFHu+umeMmlwvCY9WtIvGghFOs31ip8yQJrFvyOb+xF/KPrv2BA1KmE4LILQ2hLsrFMgomSLgyfNYw+C8LI2BxNSr59x+GLtqyxS4jH2qPK7sJmWACfWTPY/0R5Zg5c10TQBqIrpoUCauJiuW9+2U+Xd1beNqYXXk59I4hNO19q4jaNaR7UjuYfOxm3ttwkYWxIPdmogndso76dzGyKxG/DEad2Wy9ae5MH1bkPExXCYhuPyG1A1SmTg3dsQ8Sh2vDr5FHMwBPl4dOK5PvkMphcPIpGb8URq4dZlQWwF1ZsB8QjbL78V1YH2YzBd3FTYexOPJZ7B9HFDiUcuzNWJZzjWI/EI5wfYyq1xvgdgLxKPsHfA/a0OhLKF5N4aXCVR650GNu8D6ueW+NQ/vZV5H1BDt8Snf+3l7pK6GrpR3lNDt8SnHe3lznVHQzfada5FtGl/c3vTroZuHPXTCCt5c/vRXQ3depvpN7ch7Wro1tfs0rib0LeGblVTIPtq6EZdkeqtodvpUnY0dHP3vjiLlF3Bc+iGbiN+9+Xd/fWmoZKEAcu1ph3KO57I41csDKxaLnFVy9HBdcZw9XgTQQrmunq8KU/iPD3eaFTuYng9cnhXz91IPXJoZ3XBLb08Wnp5J2sSM72cBui2fKtpL0SpnLWOApFYNe10sAf2BrTSdYM8vN4jXQmkGJAjLbJVws6hM9KLYYN9Dj8aZMXq4BDiqlhlJGkTZKiKVYU9WOmLE69ilQZokuJXqLgPE682UtTN4sxIkeG+IkUWi+j79LJY/B6V2eVqBjFcjH+n683xEveX5DpVny/PaNdyy7v6qXHp3Yh+1GO9bFe775ANHUCwRNJ6eoRpS4Sz0XtZ8BNetgHqSha6EfHdePr3Q7YQXKmhlstmPNOSTVS+MrDbAqBgNcOUiKTVNLmafDaYq+FYJikBp/tdcDNGIRJQBAvbdbD3YjnKarykYq2MUo2A5oi5CMsbbt8KbklCdGtxq9PevhIPAkSPBi5hQgV5+9E
tHOvhUTzwuqjXEMF5McqkuHD/ul2Pp1uFyb1BnZivvhq3imgejjdb310rWMv6chWSOXiaMPbXgwndFes1ff3TS/O4JpeNmWDQNRMOmhLhELNwUE5VlHRYP3jtxNWqFaPNkUcu1Jny/U6bq/wtELBK5rX61whtQbUQW5a12LXkbVgbwLVIm15zmih8SLm3I4rESKN0FTH2JYnhnOB9DkkIQAYrCZ96oLj1cAE8BKRtvXlmACfntIX84ip3Qsy7gICTCiSt/t4UODrzG+U7ZQ1IzU0DSQip+BTsXJ9UBGC9SsWDBbg+qQgMcK9Sid1c4zymgBCHKXDSs9FMgUdw+PbgTZIOpcN6NgU+wer1SUX2bAp8SoWuTiqC9GwKzpe+oo9ZI8ZxrHp764hj1ioHxooikWE3IETF47fDtSNKgYSVTRuTa0fMdUDWoZy7vo2oMPpmbb4Q1olaASl3Hj/b5QbEU4EoECBVggPHAaK+DaoA0TwQLjIS3wZRQpO+iRJxlUSJ0qKDJkrEVRIlnVLpmSgpluVNKkMiSoQHURJ78xfLpHakIbSgS0Bi771FU+o+lEfcxqx+yNvJrpedYHF5rTtPm1P35hgyAxXQ61q+vH6d5xCK8hL7FIoH/XCFQhHF5PYjFI9QPLbVI7o31ICsXuyYMJjVE71ZPekR7l3mWm6Y00uwetIj2rtCofRr9Qre7CaUAVk92ZzzvXkeL2vSObB+bjfKvLjwYbWcjrfpcpwVVOUJ4esyI/xpks5m8+VDJft7buWPrx054ruP+eZzxDEHJlHAYQKKjr1V8OB92WfwxHDpCoRD1Ah8/fz+HN0YUjSjKXe5HJJxMtb7Jys11nyrJ/kAHdAqPCwSYB7SxYjzUPBq4aQrRzJE8b4McLjnTYbZzg/tasAQTYbNjTTqClEXKv9AVsHygVozYvmqr0IOIUNJAbfWIQPMtsAsVmWrdMX+jYLDN8E1Cg7h8wou+lZ9047pcvVFl4Q6HNt8hpsrRj99ytpLO1Rl2XFq9yWGu/2OpeH2aMtZLTYwqnl999sd4yJQoV7MffyAe+3Sg1e5QSlC+S42G9iFgZJj3DNBCUMP+qkCpVzKtVZ0FVDp69/GW2UpltkVDPcCr26lHlOifjkoUWEVtASa1HqiHwsUxIHd7KB16JBYae6Zu6/M/nzvrsm+Fk9DHxdunqcsuzreB+pxox6uVzqi2ktYxUKPX1czrbLv/x8=</diagram></mxfile>
2107.08929/main_diagram/main_diagram.pdf ADDED
Binary file (81.9 kB). View file
 
2107.08929/paper_text/intro_method.md ADDED
@@ -0,0 +1,62 @@
1
+ # Introduction
2
+
3
+ Automatic text summarization is the task of automatically summarizing a long document into a relatively short text while preserving most of the information [@tas2007survey]. Text summarization methods can be categorized into abstractive and extractive summarization [@10.1007/s10462-016-9475-9; @nenkova_survey_2012]. Given a document $d$ consisting of an ordered list of $N$ sentences, *extractive summarization* aims to pick up $M$ ($M$$\ll$$N$) sentences as the summary of the document. The extracted summaries tend to be both grammatically and semantically more reliable than abstractive summaries [@liu2018generating; @liu2019hierarchical; @luo2019reading; @liao2020improving], as they are directly selected from the source text.
4
+
5
+ Extractive summarization is usually modeled as two sequential phases [@zhou-etal-2018-neural-document]: 1) *sentence scoring* and 2) *sentence selection*. In the sentence scoring phase, an affinity score is computed for each sentence by neural networks such as bidirectional RNNs [@dong2018banditsum; @narayan2018ranking; @luo2019reading; @xiao-carenini-2019-extractive] or BERT [@zhang2019hibert; @liu2019text]. In the sentence selection phase, sentences are selected by either i) predicting a label (1 or 0) for each sentence based on its score, and selecting sentences with label 1 [@zhang2019hibert; @liu2019text; @xiao-carenini-2019-extractive], or ii) ranking sentences based on their scores and selecting the top $K$ sentences as the summary [@narayan2018ranking], or iii) sequentially sampling sentences without replacement, where the normalized scores of the remaining sentences are used as sampling likelihoods [@dong2018banditsum; @luo2019reading].
6
+
7
+ In these approaches, sentence scores are generally not updated based on the current partial summary of previously selected sentences, indicating a lack of knowledge of *extraction history*. We deem extractive summarizers that are unaware of the extraction history to be susceptible to redundancy in a document, because they will repeatedly add high-scoring sentences to a summary, regardless of whether similar sentences have been selected before. Redundancy, in turn, decreases performance as evaluated by ROUGE F1.
8
+
9
+ ![We modeled extractive summarization as a multi-step iterative process of scoring and selecting sentences. $\text{s}_i$ represents the $i_\text{th}$ sentence in the document $\mathbb{D}$.](figure-extrative-summ-pipeline.pdf){#fig:extractive_summ_pipeline width="\\linewidth"}
10
+
11
+ In this paper, we propose to model extractive summarization as a multi-step episodic Markov Decision Process (MDP). As shown in Figure [1](#fig:extractive_summ_pipeline){reference-type="ref" reference="fig:extractive_summ_pipeline"}, at each time step in an episode, we define a *sentence state* composed of three sub-states: 1) the local content of the sentence, 2) the global context of the sentence within the document, and 3) information on the extraction history, including the previously selected set of unordered sentences and the remaining sentences. At each time step, the policy network (agent) takes the current sentence state as input and produces scores used to select an action of either stopping the extraction process or selecting one of the remaining sentences into the candidate summary. Unlike one-step episodic MDP-based models [@narayan2018ranking; @dong2018banditsum; @luo2019reading] that encode the state information only once at the beginning of the episode, in our multi-step policy, the agent updates at each time step the extraction history before selecting an action. Such a step-wise state-updating strategy enables the agent to consider the content of the partial summary when selecting a sentence.
12
+
13
+ To efficiently encode local and global sentence states, we design an extraction agent based on LSTM networks [@hochreiter1997long]. To encode the extraction history and to select actions, we use a reduced number of attention layers [@vaswani2017attention] of relatively low dimensionality. These choices enable our model to be easily trainable and to summarize long documents such as scientific papers [@cohan2018discourse; @huang2021efficient] or reports [@huang2021efficient].
14
+
15
+ The **contributions** of our work are as follows: 1) We propose to treat extractive summarization as a multi-step episodic MDP that is aware of the extraction history. 2) We show that extraction-history awareness allows our model to extract more compact summaries than models without history awareness and behave more robustly to redundancies in documents. 3) Our model outperforms both extractive and abstractive summarization models on PubMed, arXiv [@cohan2018discourse], and GovReport [@huang2021efficient] datasets. 4) Finally, human evaluators rate the MemSum summaries to be of higher quality than those from a competitive approach, especially by virtue of lower redundancy[^1].
16
+
17
+ # Method
18
+
19
+ In an episodic task with a terminal state (i.e. *end of summary*), policy gradient methods aim to maximize the objective function $J(\bm{\theta})=\mathbb{E}_{\pi_{\bm{\theta}}}[R_0]$, where the return $R_t=\sum_{k=t+1}^{T}r_k$ is the cumulative reward from time $t+1$ until the end of the episode when the summary is complete. In applications of RL to extractive summarization, the instantaneous reward $r_t$ is zero except at the end of the episode when the final reward $r$ is computed according to Equation [\[eq:R_compute\]](#eq:R_compute){reference-type="eqref" reference="eq:R_compute"}, so $R_t\equiv R_0=r$. The reward $r$ is usually expressed as [@dong2018banditsum]: $$\begin{equation}
20
+ \label{eq:R_compute}
21
+ r = \frac{1}{3}( \text{ROUGE-1}_f +\text{ROUGE-2}_f+\text{ROUGE-L}_f )
22
+ \end{equation}$$ According to the REINFORCE algorithm [@williams1992simple], the policy gradient is defined as: $$\begin{equation}
23
+ \nabla J(\bm{\theta}) = \mathbb{E}_\pi[R_t\nabla\log\pi(A_t\vert S_t,\bm{\theta})],
24
+ \end{equation}$$ where $\pi(A_t\vert S_t, \bm{\theta})$ denotes the likelihood that at time step $t$ the policy $\pi_{\bm{\theta}}$ selects action $A_t$ given the state $S_t$. With $\alpha$ as the learning rate, the parameter update rule is [@sutton2018]: $$\begin{equation}
25
+ \label{eq:update_rule}
26
+ \bm{\theta}_{t+1} \leftarrow \bm{\theta}_t + \alpha R_t \nabla\log\pi(A_t\vert S_t,\bm{\theta}_t),%=\alpha R \nabla\log p_{\bm{\theta}}(a\vert S_0)
27
+ \end{equation}$$
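The REINFORCE update rule above can be illustrated on a toy two-action softmax policy (a hedged sketch; MemSum's actual policy is the neural architecture described below, and `alpha` here is an arbitrary learning rate):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def reinforce_update(theta, action, R, alpha=0.5):
    """theta <- theta + alpha * R * grad log pi(action | theta),
    for a policy pi(a | theta) = softmax(theta)[a]."""
    probs = softmax(theta)
    grad_log_pi = -probs
    grad_log_pi[action] += 1.0       # d log softmax(theta)[a] / d theta
    return theta + alpha * R * grad_log_pi

theta = np.zeros(2)                  # initial logits: both actions equally likely
for _ in range(20):                  # repeatedly rewarding action 0 ...
    theta = reinforce_update(theta, action=0, R=1.0)
# ... raises the probability the policy assigns to it.
```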
28
+
29
+ Different from one-step episodic MDP policies [@narayan2018ranking; @dong2018banditsum; @luo2019reading] that extract the entire summary via a single action, we define an episode, i.e., the generation of a summary, consisting of multiple time steps. At each time step $t$, corresponding to extracting sentence number $t$, the action $A_t$ is either to stop extraction or to select a sentence $s_{a_t}$ from the remaining sentences. The agent's policy is: $$\begin{equation}
30
+ \label{eq:memsum_action_prob}
31
+ \begin{aligned}
32
+ \pi(A_t\vert S_t,\bm{\theta}_t) &= p(\text{stop}|S_t, \bm{\theta}_t) p(a_t|\text{stop},S_t, \bm{\theta}_t)\\
33
+ p(a_t|\text{stop},S_t, \bm{\theta}_t) &= \begin{cases}
34
+ \frac{ u_{a_t}(S_t,\bm{\theta}_t) }{ \sum_{j\in I_t} u_j( S_t,\bm{\theta}_t ) } \ &\text{if} \ \text{stop = false} \\
35
+ \frac{1}{\vert I_t \vert} &\text{if} \ \text{stop = true},
36
+ \end{cases}
37
+ \end{aligned}
38
+ \end{equation}$$ where $I_t$ denotes the index set of remaining sentences at time step $t$. If the agent does not stop, it first computes a score $u_j$ for each remaining sentence and samples a sentence $s_{a_t}$ according to the probability distribution of normalized scores. When the agent stops the extraction, no sentence is selected and the conditional likelihood $p(a_t|\text{stop=true},S_t, \bm{\theta}_t)$ is set to $\frac{1}{\vert I_t \vert}$ (where $\vert I_t \vert$ represents the number of remaining sentences at time $t$), which is independent of the policy parameters; this prevents the gradient from flowing into the policy parameters via the conditional likelihood. After calculating the reward according to Equation [\[eq:R_compute\]](#eq:R_compute){reference-type="eqref" reference="eq:R_compute"}, the policy parameters are updated according to Equation [\[eq:update_rule\]](#eq:update_rule){reference-type="eqref" reference="eq:update_rule"} (for all time steps).
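The stop-or-select sampling loop can be sketched as follows (assumptions: the scores $u_j$ and a fixed stop probability are given as inputs, whereas the actual model produces both from the encoded state at every step):

```python
import random

def sample_action(scores, p_stop, rng):
    """Stop with probability p_stop; otherwise sample a remaining sentence
    index with probability proportional to its score u_j."""
    if rng.random() < p_stop:
        return None                            # stop extraction
    total = sum(scores.values())
    r, acc = rng.random() * total, 0.0
    for idx, u in scores.items():
        acc += u
        if r <= acc:
            return idx
    return idx                                 # numerical-edge fallback

def extract_summary(scores, p_stop=0.2, max_len=10, seed=0):
    """One episode: repeatedly select sentences without replacement or stop."""
    rng = random.Random(seed)
    remaining = dict(scores)                   # sentence index -> score u_j
    selected = []
    while remaining and len(selected) < max_len:
        action = sample_action(remaining, p_stop, rng)
        if action is None:
            break
        selected.append(action)
        del remaining[action]                  # sampling without replacement
    return selected
```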
39
+
40
+ The state $S_t$ in Equation [\[eq:memsum_action_prob\]](#eq:memsum_action_prob){reference-type="eqref" reference="eq:memsum_action_prob"} is designed to be informative on: 1) the local content of the sentence, 2) the global context of the sentence within the document, and 3) the current extraction history. To encode these three properties in the state, we use a local sentence encoder, a global context encoder, and an extraction history encoder, respectively. Subsequently, the state is mapped by an extractor to an output score for each of the remaining sentences and the extraction stop signal. The overall framework of our model is depicted in Figure [2](#fig:model_architecture){reference-type="ref" reference="fig:model_architecture"}.
41
+
42
+ In the **Local Sentence Encoder** (LSE), ordered words $(w_1, w_2, \dots w_M)$ in a sentence $s_i$ are first mapped onto word embeddings using a word embedding matrix. Subsequently, an $N_l$-layer bi-directional LSTM [@hochreiter1997long] transforms the word embeddings and maps them onto sentence embeddings $l_{s_i}$ via a multi-head pooling layer (MHP) [@liu2019hierarchical].
43
+
44
+ The **Global Context Encoder** (GCE) consists of an $N_g$-layer bi-LSTM that takes the $L$ local sentence embeddings $(l_{s_1}, l_{s_2}, \dots l_{s_L})$ as inputs and produces for each sentence $s_i$ an embedding $g_{s_i}$ that encodes global contextual information such as the sentence's position in the document and information on neighboring sentences.
45
+
46
+ The **Extraction History Encoder** (EHE) encodes the extraction history information and produces the extraction history embedding $h_{s^r_i}$ for each remaining sentence $s^r_i$. The EHE is composed of a stack of $N_h$ identical layers. Within one layer, there are two multi-head attention sublayers, as contained in the transformer decoder in @vaswani2017attention. One sublayer is used to perform multi-head self-attention (MHA) among the local embeddings of the remaining sentences, so that each remaining sentence can capture the context provided by other remaining sentences. The other attention sublayer is used to perform multi-head attention over the embeddings of extracted sentences to enable each remaining sentence to attend to all the extracted sentences. The output of the two attention sublayers, one for each remaining sentence, captures the contextual information of both extracted and remaining sentences. The final output of the $N_h^{\text{th}}$ layer of the EHE constitutes the extraction history embedding, one for each remaining sentence.
47
+
48
+ There is no positional encoding, and the EHE produces the extraction history embeddings non-autoregressively by attending to both preceding and subsequent positions. Consequently, the extraction history embeddings $h_{s^r_i}$ for the remaining sentences are invariant to the order of the previously selected sentences. We believe that the sequential information of previously selected sentences is not crucial for reducing redundancy and for deciding whether to stop extraction or not.
49
+
50
+ The **Extractor** computes the score of each remaining sentence and outputs an extraction stop signal. As input to the extractor, we form for each of the remaining sentences $s^r_i$ an aggregated embedding by concatenating the local sentence embedding $l_{s^r_i}$, the global context embedding $g_{s^r_i}$, and the extraction history embedding $h_{s^r_i}$. As shown in Figure [2](#fig:model_architecture){reference-type="ref" reference="fig:model_architecture"}, to produce the score $u_{s^r_i}$, the concatenated embedding of remaining sentence $s^r_i$ is passed to fully connected layers with ReLU activation and then projected to a scalar by a Linear-1 layer followed by a sigmoid function. Note that the same fully connected layers are applied identically to all remaining sentences. We deem that the extractor can learn to stop extraction based on the remaining sentences' states. Therefore, we apply an MHP to the last hidden vectors of all remaining sentences to output a single vector. This vector is then passed to a linear layer with a sigmoid function, producing a stopping probability $p_\text{stop}$.
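A compact sketch of the per-sentence scoring just described, where a single linear layer plus sigmoid stands in for the paper's FC stack, and the weights `w`, `b` and the tiny 2-dimensional embeddings are illustrative placeholders:

```python
import math

def extractor_scores(local, global_ctx, history, w, b):
    """For each remaining sentence, concatenate its local, global-context,
    and extraction-history embeddings and map the result to a score in
    (0, 1) via a linear layer followed by a sigmoid."""
    scores = []
    for l, g, h in zip(local, global_ctx, history):
        x = l + g + h  # concatenation of the three embedding lists
        s = sum(wi * xi for wi, xi in zip(w, x)) + b
        scores.append(1.0 / (1.0 + math.exp(-s)))  # sigmoid
    return scores

u = extractor_scores(local=[[0.1, 0.2]], global_ctx=[[0.3, 0.0]],
                     history=[[0.0, 0.0]],
                     w=[1.0, -1.0, 0.5, 0.5, 0.2, 0.2], b=0.0)
assert 0.0 < u[0] < 1.0
```

The same weights are applied identically to every remaining sentence, mirroring the shared fully connected layers in the text; the stopping probability $p_\text{stop}$ would be produced separately from a pooled vector.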
51
+
52
+ We train the parameterized policy network according to the update rule in Equation [\[eq:update_rule\]](#eq:update_rule){reference-type="eqref" reference="eq:update_rule"}. At each training iteration, an episode is sampled to compute the final return $r$ and the action probabilities $\pi(A_t\vert S_t, \bm{\theta}_t)$ for all time steps $t$. An example episode with $T$ extracted sentences looks like: $(S_0, s_{a_0},\dots,S_{T-1},s_{a_{T-1}}, S_{T}, A_{\text{stop}},r)$, where $S_t$ represents the concatenated state information introduced in Section [3.3](#sec:policy_network_structure){reference-type="ref" reference="sec:policy_network_structure"}, $s_{a_t}$ represents the selection of sentence $a_t$, $A_{\text{stop}}$ represents the extraction stops at the final time step $T$, and $r$ is the reward as defined in Equation [\[eq:R_compute\]](#eq:R_compute){reference-type="eqref" reference="eq:R_compute"}. To encourage the agent to select compact summaries, we multiply the final reward $r$ by a length penalty term $1/(T+1)$ [@luo2019reading]. Consequently, the return $R_t\equiv\frac{r}{T+1}$.
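The length-penalized return and the resulting policy update can be written out directly; the scalar per-parameter gradients of $\log \pi$ here are purely for illustration:

```python
def reinforce_step(theta, alpha, r, T, grad_log_pi):
    """One REINFORCE update with the length penalty from the text:
    the return is R = r / (T + 1), and each parameter moves along
    alpha * R * d(log pi)/d(theta)."""
    R = r / (T + 1)
    return [th + alpha * R * g for th, g in zip(theta, grad_log_pi)]

theta = reinforce_step([0.5, -0.2], alpha=0.1, r=0.4, T=3,
                       grad_log_pi=[1.0, -2.0])
# R = 0.4 / 4 = 0.1, so theta becomes [0.51, -0.22]
```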
53
+
54
+ :::: algorithm
55
+ Parameters: learning rate $\alpha$
56
+
57
+ ::: algorithmic
58
+ LSE outputs local sentence embeddings $l_{s_1}$, $\dots$, $l_{s_L}$
+ GCE outputs global context embeddings $g_{s_1}$, $\dots$, $g_{s_L}$
+ Sample an episode $S_0$, $s_{a_0}$, $\dots$, $S_{T-1}$, $s_{a_{T-1}}$, $S_{T}$, $A_{\text{stop}}$, $r$ from the high-ROUGE episode set $\mathbb{E}_p$ of document $D_i$
+ Initialize $h_{s^r_1}$, $\dots$, $h_{s^r_{L-E_{0}}}$ to $\bm{0}$
+ EHE outputs extraction history embeddings $h_{s^r_1}$, $\dots$, $h_{s^r_{L-E_{t}}}$ for the remaining sentences
+ Extractor outputs scores $u_{s_1^r}$, $\dots$, $u_{s_{L-E_t}^r}$ for the remaining sentences and outputs $p_{\text{stop}}$
+ Compute the action probability $\pi(A_t\vert S_t,\bm{\theta})$ according to Equation [\[eq:memsum_action_prob\]](#eq:memsum_action_prob){reference-type="eqref" reference="eq:memsum_action_prob"}
+ $\bm{\theta} \leftarrow \bm{\theta} + \alpha \frac{r}{T+1} \nabla \log \pi(A_t\vert S_t,\bm{\theta})$
59
+ :::
60
+ ::::
61
+
62
+ Algorithm [\[alg:training\]](#alg:training){reference-type="ref" reference="alg:training"} summarizes the training procedure of MemSum. We initialize the extraction history embeddings to $\bm{0}$, because at $t=0$ no sentences have been extracted. $E_t$ represents the number of sentences that have been extracted into the summary up to time step $t$. Following the strategy in @narayan2018ranking and @mohsen2020hierarchical, instead of sampling an episode following the current policy $\pi(\cdot\vert \cdot,\bm{\theta}_t)$, we sample an episode from a set $\mathbb{E}_p$ of episodes with high ROUGE scores, which enables the agent to quickly learn from optimal policies and to rapidly converge. Details on creating a set of high-ROUGE episodes for training are described in Appendix [11](#sec:high_rouge){reference-type="ref" reference="sec:high_rouge"}.
2108.05997/main_diagram/main_diagram.drawio ADDED
The diff for this file is too large to render. See raw diff
 
2108.05997/paper_text/intro_method.md ADDED
@@ -0,0 +1,90 @@
1
+ # Introduction
2
+
3
+ The goal of image quality assessment (IQA) is to quantify perceptual quality of images. In the deep learning era, many IQA approaches [\[12,](#page-8-0) [34,](#page-9-1) [36,](#page-9-2) [43,](#page-9-0) [49\]](#page-9-3) have achieved significant success by leveraging the power of convolutional neural networks (CNNs). However, the CNN-based IQA models are often constrained by the fixed-size input requirement in batch training, *i.e*., the input images need to be resized or cropped to a fixed shape as shown in Figure [1](#page-0-1) (b). This preprocessing is problematic for IQA because images in the wild have varying aspect ratios and resolutions. Resizing and cropping can impact image composition or introduce distortions, thus changing the quality of the image.
4
+
5
+ To learn IQA on the full-size image, the existing CNN-based approaches use either adaptive pooling or resizing to get a fixed-size convolutional feature map. MNA-CNN [\[25\]](#page-9-4)
6
+
7
+ ![](_page_0_Figure_10.jpeg)
8
+
9
+ <span id="page-0-1"></span>Figure 1. In CNN-based models (b), images need to be resized or cropped to a fixed shape for batch training. However, such preprocessing can alter image aspect ratio and composition, thus impacting image quality. Our patch-based MUSIQ model (a) can process the full-size image and extract multi-scale features, which aligns with the human visual system.
10
+
11
+ processes a single image in each training batch which is not practical for training on a large dataset. Hosu et al. [\[16\]](#page-8-2) extracts and stores fixed-size features offline, which costs additional storage for every augmented image. To keep aspect ratio, Chen et al. [\[7\]](#page-8-3) proposes a dedicated convolution to preserve aspect ratio in the convolutional receptive field. Its evaluation verifies the importance of aspect-ratio-preserving (ARP) in the IQA tasks. But it still needs resizing and smart grouping for effective batch training.
12
+
13
+ In this paper, we propose a patch-based multi-scale image quality Transformer (MUSIQ) to bypass the CNN constraints on fixed input size and predict the quality effectively on the native resolution image as shown in Figure [1](#page-0-1) (a). Transformer [\[38\]](#page-9-5) is first proposed for natural language processing (NLP) and has recently been studied for various vision tasks [\[4](#page-8-4)[–6,](#page-8-5) [11\]](#page-8-6). Among these, the Vision Transformer (ViT) [\[11\]](#page-8-6) splits each image into a sequence of fixed-size patches, encodes each patch as a token, and then applies
14
+
15
+ <span id="page-0-0"></span><sup>1</sup>Checkpoints and code are available at [https://github.com/](https://github.com/google-research/google-research/tree/master/musiq) [google-research/google-research/tree/master/musiq](https://github.com/google-research/google-research/tree/master/musiq)
16
+
17
+ <span id="page-1-0"></span>Transformer to the sequence for image classification. In theory, such kind of patch-based Transformer models can handle arbitrary numbers of patches (up to memory constraints), and therefore do not require preprocessing the input image to a fixed resolution. This motivates us to apply the patch-based Transformer on the IQA tasks with the full-size images as input.
18
+
19
+ Another aspect for improving IQA models is to imitate the human visual system which captures an image in a multi-scale fashion [\[1\]](#page-8-7). Previous works [\[16,](#page-8-2) [22,](#page-8-8) [48\]](#page-9-6) have shown the benefit of using multi-scale features extracted from CNN feature maps at different depths. This inspires us to transform the native resolution image into a multi-scale representation, enabling the Transformer's self-attention mechanism to capture information on both fine-grained detailed patches and coarse-grained global patches. Besides, unlike the convolution operation in CNNs that has a relatively limited receptive field, self-attention can attend to the whole input sequence and it can therefore effectively capture the image quality at different granularities.
20
+
21
+ However, it is not straightforward to apply the Transformer on the multi-aspect-ratio multi-scale input. Although self-attention accepts arbitrary length of the input sequence, it is permutation-invariant and therefore cannot capture patch location in the image. To mitigate this, ViT [\[11\]](#page-8-6) adds fixed-length positional embedding to encode the absolute position of each patch in the image. However, the fixed-length positional encoding fails when the input length varies. To solve this issue, we propose a novel hash-based 2D spatial embedding that maps the patch positions to a fixed grid to effectively handle images with arbitrary aspect ratios and resolutions. Moreover, since the patch locations at each scale are hashed to the same grid, it aligns spatially close patches at different scales so that the Transformer model can leverage information across multiple scales. In addition to the spatial embedding, a separate scale embedding is further introduced to help the Transformer distinguish patches coming from different scales in the multi-scale representation.
22
+
23
+ The main contributions of this paper are three-fold:
24
+
25
+ - We propose a patch-based multi-scale image quality Transformer (MUSIQ), which supports processing full-size input with varying aspect ratios or resolutions, and allows multi-scale feature extraction.
26
+ - A novel hash-based 2D spatial embedding and a scale embedding are proposed to support positional encoding in the multi-scale representation, helping the Transformer capture information across space and scales.
27
+ - We apply MUSIQ on four large-scale IQA datasets. It consistently achieves the state-of-the-art performance on three technical quality datasets: PaQ-2-PiQ [\[43\]](#page-9-0),
28
+
29
+ KonIQ-10k [\[17\]](#page-8-1), and SPAQ [\[12\]](#page-8-0), and is on-par with the state-of-the-art on the aesthetic quality dataset AVA [\[30\]](#page-9-7).
30
+
31
+ # Method
32
+
33
+ To tackle the challenge of learning IQA on full-size images, we propose a multi-scale image quality Transformer (MUSIQ) which can handle inputs with arbitrary aspect ratios and resolutions. An overview of the model is shown in Figure [2.](#page-2-0)
34
+
35
+ We first make a multi-scale representation of the input image, containing the native resolution image and its ARP resized variants. The images at different scales are partitioned into fixed-size patches and fed into the model. Since patches are from images of varying resolutions, we need to effectively encode the multi-aspect-ratio multi-scale input into a sequence of tokens (the small boxes in Figure [2\)](#page-2-0), capturing the pixel, spatial, and scale information.
36
+
37
+ To achieve this, we design three encoding components in MUSIQ, including: 1) A patch encoding module to encode patches extracted from the multi-scale representation (Section [3.2\)](#page-3-0); 2) A novel hash-based spatial embedding module to encode the 2D spatial position for each patch (Section [3.3\)](#page-3-1); 3) A learnable scale embedding to encode the different scales (Section [3.4\)](#page-4-0).
38
+
39
+ After encoding the multi-scale input into a sequence of tokens, we use the standard approach of prepending an extra learnable "classification token" (CLS) [\[10,](#page-8-11) [11\]](#page-8-6). The CLS token state at the output of the Transformer encoder serves as the final image representation. <span id="page-3-2"></span>We then add a fully connected layer on top to predict the image quality score. Since MUSIQ only changes the input encoding, it is compatible with any Transformer variant. To demonstrate the effectiveness of the proposed method, we use the classic Transformer [38] (Appendix A) with a relatively lightweight setting to make the model size comparable to ResNet-50 in our experiments.
40
+
41
+ Image quality is affected by both the local details and global composition. In order to capture both the global and local information, we propose to model the input image with a multi-scale representation. Patches from different scales enable the Transformer to aggregate information across multiple scales and spatial locations.
42
+
43
+ As shown in Figure 2, the multi-scale input is composed of the full-size image with height H, width W, channel C, and a sequence of ARP resized variants obtained from the full-size image using a Gaussian kernel. The resized images have height $h_k$, width $w_k$, and channel C, where k=1,...,K and K is the number of resized variants per input. To align the resized images for a consistent global view, we fix the longer side of each resized variant to length $L_k$, which yields:
44
+
45
+ $$\alpha_k = L_k / \max(H, W), \quad h_k = \alpha_k H, \quad w_k = \alpha_k W \quad (1)$$
46
+
47
+ $\alpha_k$ represents the resizing factor for each scale.
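Equation (1) translates directly into code; rounding the scaled dimensions to integers is an implementation assumption:

```python
def arp_resize_dims(H, W, L_k):
    """Aspect-ratio-preserving resize from Equation (1): fix the longer
    side to L_k and scale both dimensions by alpha_k = L_k / max(H, W)."""
    alpha_k = L_k / max(H, W)
    return alpha_k, round(alpha_k * H), round(alpha_k * W)

alpha, h, w = arp_resize_dims(H=768, W=1024, L_k=256)
# alpha = 0.25, so the variant is 192 x 256 and keeps the 3:4 aspect ratio
```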
48
+
49
+ Square patches with size P are extracted from each image in the multi-scale representation. For images whose width or height is not a multiple of P, we pad the image with zeros accordingly. Each patch is encoded into a D-dimensional embedding by the patch encoder module. D is the latent token size used in the Transformer.
50
+
51
+ Instead of encoding the patches with a linear projection as in [11], we choose a 5-layer ResNet [15] with a fully connected layer of size D as the patch encoder module to learn a better representation for the input patch. We find that encoding the patch with a few convolution layers performs better than linear projection when pre-training on ILSVRC-2012 ImageNet [31] (see Section 4.4). Since the patch encoding module is lightweight, shared across all input patches, and operates on small patches of size P, it adds only a small number of parameters.
52
+
53
+ The sequence of patch embeddings output from the patch encoder module are concatenated together to form a multi-scale embedding sequence for the input image. The number of patches from the original image and the resized ones are calculated as $N=HW/P^2$ and $n_k=h_kw_k/P^2$ .
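The patch counts follow from the formulas above; the ceiling division accounts for the zero-padding of images whose sides are not multiples of P:

```python
import math

def num_patches(H, W, P):
    """Number of P x P patches covering an H x W image, padding any
    remainder with zeros as described in the text."""
    return math.ceil(H / P) * math.ceil(W / P)

# Native 768 x 1024 image with P = 32: N = 24 * 32 = 768 patches,
# and its 192 x 256 resized variant contributes n_k = 6 * 8 = 48.
assert num_patches(768, 1024, 32) == 768
assert num_patches(192, 256, 32) == 48
```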
54
+
55
+ Since each input image has a different resolution and aspect ratio, H and W are different for each input and therefore N and $n_k$ are different. To get fixed-length input during training, we follow the common practice in NLP [38] to zero-pad the encoded patch tokens to the same length. An input mask is attached to indicate the effective input,
56
+
57
+ which will be used in the Transformer to perform masked self-attention (Appendix A.3). Note that the padding operation will not change the input because the padding tokens are ignored in the multi-head attention by masking them.
58
+
59
+ As previously mentioned, we fix the longer length to $L_k$ for each resized variant. Therefore $n_k \leq L_k^2/P^2 = m_k$ and we can safely pad to $m_k$ . For the native resolution image, we simply pad or cut the sequence to a fixed length l. The padding is not necessary during single-input evaluation because the sequence length can be arbitrary.
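The padding bound $m_k$ for each resized variant follows immediately from fixing the longer side to $L_k$:

```python
def padded_len(L_k, P):
    """Upper bound m_k = L_k^2 / P^2 on the patch count of a resized
    variant whose longer side is L_k; token sequences from that scale
    are zero-padded to this length for batch training."""
    return (L_k * L_k) // (P * P)

# With L_k = 256 and P = 32, any variant yields at most m_k = 64 patches,
# so the 48 patches of a 192 x 256 variant can safely be padded to 64.
assert padded_len(256, 32) == 64
```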
60
+
61
+ Spatial positional embedding is important in vision Transformers to inject awareness of the 2D image structure in the 1D sequence input [11]. The traditional fixed-length positional embedding assigns an embedding for every input location. This fails for variable input resolutions where the number of patches is different and therefore each patch in the sequence may come from an arbitrary location in the image. Besides, the traditional positional embedding models each position independently and therefore it cannot align the spatially close patches from different scales.
62
+
63
+ We argue that an effective spatial embedding design for MUSIQ should meet the following requirements: 1) effectively encode patch spatial information under different aspect ratios and input resolutions; 2) spatially close patches at different scales should have close spatial embeddings; 3) efficient and easy to implement, non-intrusive to the Transformer attention.
64
+
65
+ Based on that, we propose a novel hash-based 2D spatial embedding (HSE) where the patch located at row i, column j is hashed to the corresponding element in a $G \times G$ grid. Each element in the grid is a D-dimensional embedding.
66
+
67
+ We define HSE by a learnable matrix $T \in \mathbb{R}^{G \times G \times D}$ . Suppose the input resolution is $H \times W$ . The input image will be partitioned into $\frac{H}{P} \times \frac{W}{P}$ patches. For the patch at position (i,j), its spatial embedding is defined by the element at position $(t_i,t_j)$ in T where
68
+
69
+ $$t_i = \frac{i \times G}{H/P}, \quad t_j = \frac{j \times G}{W/P} \qquad (2)$$
71
+
72
+ The D-dimensional spatial embedding $T_{t_i,t_j}$ is added element-wise to the patch embedding as shown in Figure 2. For fast lookup, we simply round $(t_i,t_j)$ to the nearest integers. HSE does not require any changes in the Transformer attention module. Moreover, both the computation of $t_i$ and $t_j$ and the lookup are lightweight and easy to implement.
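A short sketch of Equation (2) with the nearest-integer rounding described above; clamping to G - 1 is an assumption to keep boundary indices in range:

```python
def hse_index(i, j, H, W, P, G):
    """Hash patch (i, j) of the (H/P) x (W/P) patch grid to a cell of
    the G x G embedding table T, per Equation (2)."""
    t_i = min(round(i * G / (H / P)), G - 1)
    t_j = min(round(j * G / (W / P)), G - 1)
    return t_i, t_j

# Patches at the same relative position in two scales hash to the same
# cell, which is what aligns patches across the multi-scale representation:
assert hse_index(4, 4, 768, 1024, 32, 10) == hse_index(1, 1, 192, 256, 32, 10)
```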
73
+
74
+ To align patches across scales, patch locations from all scales are mapped to the same grid T. As a result, patches located closely in the image but from different scales are mapped to spatially close embeddings in T, since i and H as well as j and W change proportionally to the resizing factor $\alpha$ . This achieves spatial alignment across different images from the multi-scale representation.
75
+
76
+ <span id="page-4-1"></span>There is a trade-off between expressiveness and trainability in the choice of the hash grid size G. A small G may cause many collisions between patches, making the model unable to distinguish spatially close patches. A large G wastes memory and may need more diverse resolutions to train. In our IQA setting where rough positional information is sufficient, we find that once G is large enough, changing G only results in small performance differences (see Appendix B). We set G=10 in the experiments.
77
+
78
+ Since we reuse the same hashing matrix for all images, HSE does not make a distinction between patches from different scales. Therefore, we introduce an additional scale embedding (SCE) to help the model effectively distinguish information coming from different scales and better utilize information across scales. In other words, SCE marks which input scale the patch is coming from in the multi-scale representation.
79
+
80
+ We define SCE as a learnable scale embedding $Q \in \mathbb{R}^{(K+1)\times D}$ for the input image with K-scale resized variants. As with the spatial embedding, the first element $Q_0 \in \mathbb{R}^D$ is added element-wise to all the D-dimensional patch embeddings from the native resolution image. $Q_k \in \mathbb{R}^D$, k = 1, ..., K are likewise added element-wise to all the patch embeddings from the resized image at scale k.
81
+
82
+ Typically, Transformer models need to be pre-trained on large datasets, *e.g.* ImageNet, and fine-tuned on the downstream tasks. During the pre-training, we still keep random cropping as an augmentation to generate images of different sizes. However, instead of doing square resizing like the common practice in image classification, we intentionally skip resizing to prime the model for inputs with different resolutions and aspect ratios. We also employ common augmentations such as RandAugment [8] and mixup [46] in pre-training.
83
+
84
+ When fine-tuning on IQA tasks, we do not resize or crop the input image to preserve the image composition and aspect ratio. In fact, we only use random horizontal flipping for augmentation in fine-tuning. For evaluation, our method can be directly applied on the original image without aggregating multiple augmentations (*e.g.* multi-crops sampling).
85
+
86
+ When fine-tuning on the IQA datasets, we use common regression losses such as L1 loss for single mean opinion score (MOS) and Earth Mover Distance (EMD) loss to predict the quality score distribution [36]:
87
+
88
+ $$EMD(p,\hat{p}) = \left(\frac{1}{N} \sum_{m=1}^{N} |CDF_{p}(m) - CDF_{\hat{p}}(m)|^{r}\right)^{\frac{1}{r}} (3)$$
89
+
90
+ where p is the normalized score distribution and $CDF_p(m)$ is the cumulative distribution function as $\sum_{i=1}^{m} p_i$ .
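A direct reading of Equation (3) in a few lines, with r = 2 assumed here for illustration (a common choice in prior work, not stated in this excerpt):

```python
from itertools import accumulate

def emd_loss(p, p_hat, r=2):
    """Earth Mover's Distance of Equation (3) between two normalized
    score distributions, computed from their cumulative distributions."""
    N = len(p)
    cdf_p, cdf_q = list(accumulate(p)), list(accumulate(p_hat))
    return (sum(abs(a - b) ** r for a, b in zip(cdf_p, cdf_q)) / N) ** (1 / r)

assert emd_loss([0.2, 0.3, 0.5], [0.2, 0.3, 0.5]) == 0.0
# Mass moved further incurs a larger loss:
assert emd_loss([1.0, 0.0], [0.0, 1.0]) > emd_loss([1.0, 0.0], [0.5, 0.5])
```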
2109.04683/main_diagram/main_diagram.drawio ADDED
The diff for this file is too large to render. See raw diff
 
2109.04683/paper_text/intro_method.md ADDED
@@ -0,0 +1,33 @@
1
+ # Introduction
2
+
3
+ The ability to predict the outcomes of physical interactions among objects is a vital part of human intelligence [@moore2008mental; @kubricht2017intuitive]. Yet, it is very challenging for AI systems to acquire this ability. The key to tackling this challenge lies in understanding commonplace physical events. AI systems need to possess this ability before they can be safely and efficiently deployed in the physical world [@duchaine2009safe; @zheng2015scene; @li2017visual].
4
+
5
+ With the rapid advancements in computer vision, deep learning and embodied AI [@forsyth2011computer; @bengio2021deep; @duan2021survey], there is an increase in intuitive physics models that aim to predict physical interaction outcomes. Many of these intuitive physics models [@battaglia2016interaction; @wu2016physics; @lerer2016learning; @ye2018interpretable; @groth2018shapestacks; @leguen20phydnet] are inspired by the process of mental simulation from the *mental physics engine* hypothesis in cognitive science research [@battaglia2013simulation; @fischer2016functional; @ullman2017mind; @kubricht2017intuitive]. This hypothesis postulates that humans predict physical interactions via the process of mental simulation. With only a few initial visual inputs of physical interaction, we can mentally reconstruct the scene with some initial approximations of the physical properties and dynamics of the objects. We can then predict the outcomes of physical interactions using this estimated information and the generated future visual states of objects during mental simulation. However, existing intuitive physics models are tested on short video sequences from datasets with mostly one continuous physical interaction among objects. Furthermore, it is uncertain whether humans can estimate physical properties accurately from visual inputs, and whether accurate physical property prediction is always useful for predicting physical interaction outcomes. In some cases, despite biases in estimations of physical properties [@fleming2014visual; @rossi2018speed; @mitko2021striking], humans have been found to have adequately precise physical interaction outcome predictions [@mitko2020all]. This suggests that humans might have other cognitive abilities on top of the physical property estimation that enable good physical interaction outcome prediction.
6
+
7
+ Past research has shown that humans make rational probabilistic inferences about physical interaction outcomes in a "noisy Newtonian" framework, assuming Newton's laws plus noisy observations [@battaglia2013simulation]. We use noisy and approximate physical simulations to account for property, perceptual, dynamic and even collision uncertainties [@battaglia2013simulation; @gerstenberg2017intuitive; @Hamrick2015ThinkAT; @Bramley2018IntuitiveEI; @LUDWINPEERY2021101396; @smith2013sources; @ullman2017mind]. We posit that one of the beneficial cognitive abilities in humans for effective physical interaction outcome prediction is the ability to perform mental simulation with selective temporal attention to focus on physically relevant moments [@firestone2016seeing]. This might be because noisy observations and simulations are counterproductive except in moments when crucial physical interactions (e.g. collision events [@LUDWINPEERY2021101396; @ullman2017mind; @gerstenberg2017intuitive]) are present. We further posit that the selected moments in the mental simulation are then used to finalize outcome predictions.
8
+
9
+ Inspired by our hypothesis that humans use selective temporal attention in noisy mental simulations to reduce the negative effects of noise, we propose PIP, an intuitive physics model with future frame generation and span selection for predicting physical interaction outcomes. The span selection module serves as the temporal attention mechanism to focus on key physical interaction moments in the generated frames. Since state-of-the-art generative models in video generation still have artifacts and prediction errors in their generations [@weissenborn2019scaling; @yan2021videogpt], we simply use the well-established convolutional LSTM (ConvLSTM) [@xingjian2015convolutional; @guen2020disentangling] for future frame generation to approximate noisy mental simulations as a starting point.
10
+
11
+ Our contributions include: (a) PIP, a novel intuitive physics model for effective predictions of physical interaction outcomes among objects in long sequences with disjointed object interactions, and (b) the SPACE+ dataset, the first synthetic video dataset with long sequences of multiple disjointed object interactions for three fundamental physical interactions (*stability, contact* and *containment*) in a 3D environment. (c) Our experiments show that PIP outperforms existing intuitive physics models and humans in predicting the outcomes of physical interactions while identifying key physical interaction moments.
12
+
13
+ <figure id="fig:1" data-latex-placement="h">
14
+ <embed src="dataset.pdf" />
15
+ <figcaption>Examples from SPACE+ dataset: (A) Frames of the three physical interaction tasks from the SPACE+ dataset for the seen object scenario. (B) Frames of the same tasks with new object classes for the unseen object scenario. (C) Visual information for one frame: RGB, object segmentation, optical flow, depth and surface normal vector.</figcaption>
16
+ </figure>
17
+
18
+ :::: center
19
+ ::: {#dataanalysis}
20
+ Physical Interactions Scenarios Frames
21
+ ----------------------- ----------- -----------
22
+ Stability 19,551 2,932,650
23
+ Contact 19,551 2,932,650
24
+ Containment 17,955 2,693,250
25
+ Total 57,057 8,558,550
26
+ Unseen Stability 3910 586,500
27
+ Unseen Contact 3910 586,500
28
+ Unseen Containment 3591 538,650
29
+ Total 11,411 1,711,650
30
+
31
+ : SPACE+ dataset analysis.
32
+ :::
33
+ ::::
2110.08207/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
 
 
1
+ <mxfile host="app.diagrams.net" modified="2021-10-15T15:21:58.933Z" agent="5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/94.0.4606.71 Safari/537.36 Edg/94.0.992.38" etag="62rS3XmoDjCDZ5V6VnEY" version="15.5.2" type="google"><diagram name="Yong's Copy of Page-1" id="HOmkijU_oPxjSOrhAgZc">7Z1dV+K8Fsc/jZd0tUlfL3lx1DPio+I5zpy72EbIsSWskor66U8CrZQmKDDTFl1huZaQlkLzy052/nsnnMB+8nKWotlkSCMcnwAzejmBgxMALGi6/J8oec1LbNdelYxTEq3KzHXBiLzh/MSiNCMRnudlqyJGaczIbLMwpNMpDtlGGUpTutg87ZHG0UbBDI3xxtcQBaMQxVg67Z5EbJKXWmbp9HNMxpP8o30nP/CAwqdxSrNp/nlTOsWrIwkqLpOfOp+giC5KRfD0BPZTStnqWfLSx7Go16LGfrhp/DxedJ9M+pRGQX/wNuh0Vhf7sc9b3m8uxVP2dy8NVpd+RnGW1+QoSxKUkjfECJ3mN85ei3pe1hQWV7ROYG8xIQyPZigURxe8afGyCUvi/PAEhZMsxWeifGDzghklU4bT02d+HwK4yctQTMZT/jzE4hAveMYpI5xsNz/AqLjsnH8KmY4v8SPL35iX8FeOuEwa5q3S468eSRz3aUz59QZLorC3Yx3mdS2+BH4pta28Ts8wTTBLX/kp+VG7aB656XRsyzeAsypblNoizM+blJqh5QEjPxXlFjB+/4Q1S/4kx7kHWiij5bdOEnH/GutnWIEPDOjtRhb4hguahWtLcK+R6NpTNOfdl3kRCdSPvL6/mRXnp9+Jqw06br12bQSb/EHRID41a8sxXFgTekdC36cpfsS8AkLB/hbPaZwdF/ftJLc2hTlL6RMu3nQCoLl8SI1ks0WAGluE4wSGAytNojD8z5pEYNbUHFypOdzcXDdAvoyU03l8fARhuEuLSEgUxVhFOHIfXKdWm4a24ZjB+rEJ03IsA1oyT9OReYK6eHoSz+HtdV8DPQQoMKFh5x3H8uG2DNeXh+3u/UjDPchaPd+wSmxNRU/cKNxAgvsbxzMNV+1fBw73r9fwrIrleo5hKVztRoEW4kaJ6C1lDHPHyryjCWIUzzXeQ/BC3jG3j9eS8HZns6UH/UzwQqP9CG2FJ3eNg7ZxylrWxXDQ0xQPMlBoGU7b3pIlS1jdBL01MrH9jkx9r30jlYUrPnMnIS/qx2g+P0LNqkbtoraG4APDr3TRe4lXwK2Lv6xedc/46ys93m63as8AdmkaZFfAcl/KKx0OWverZElq0LvGEUGa8EGExUzXdkvHFZbcLGFZpLq7PdUi1YF4HWgA75jwyjLViKVZyDiezh3t3OGXLxlCDNSjeV3cpciwU0xRPhuC/dqGX1mjuidPhJf0CNXmu0WGdAw72OpV245jlEXKtj1sIItWfZokwqM2z7CeOm3XmiuhPZs70WWuRbfdHllZrxpmMSPad/4AKzRcSdowW58GA1mrOiNjtKBppEnuTvIIVEcga1S/RlmiKW6laJdjdtV0imMI+wBZoxqhREPdyzSPIcADFKlSV1e8YIBILG5gyP9rqLtDPQIBGcjiUj+mcxx1epQ+8QM33QaIfm/p2KxIx8p5KwjeUxwbmboCWXM6p2xGWVPMv6IVg8A2CjVnPa+xDK/1rlmWmHIhQrP8iGVZJnQqXB3HcJ0jkhGBLDadvrAUhYw8Y91T/2l7sOyK82ypIj/KCB8ENSGHioypf65PdYbjFoTcZCtWbFltmy2UZaZuxO9pjlKCYt1D74fTbR2nLDUNMnqrg3U7Mwy4R1x6FGmD7RGV9YmlDjyLca
c/oSQ8tsH1aBf5SUPozhE6aAZ10VWt7xLxmznmf7rv3W63FXfYdwy/7WkOVKTC3J52hxrirp2v7VqG1XYiKpS1h5use3GpMW7BaEsYoWsUoZL2MMqiw02Gbu/eNMedOVrmEfSqsrAwouFqcnKhR8jtNDdZQj5CwrZZ2qp0lXlC53qiuQ9K+G6p7ZGUdYOb7kjPM3dFaAaG3/bU0pbFgnuizXBnhq5ptI5Qse9PSG40wt0QArHJR9sIZYFH+KlY5yrsCtG3Dd9vmaIjj4f9f65b6EsjB/uR/WcMffAA3ToZOmKzLFfazGGt5pjcV5WSsBuB+9/h5c2/cND5Obyy7n4MsDmBis3VJK44GuNR/pKmbELHdIri03Vpb01eUFufc0mXmxkJ3v/DjL3m4ijKGN1sDfvBEF/oQxQpjtEyNrs+SV11+VuvRdtbI/S40ZXzpTfzDizfMoJy7l9la7Q5zdIQ55es0Hn/jodboytbo9gOL98zq0+TWYy//LpT5c5YPcoYTTbPX+2WZdZpz4rNstR5RL5RZNZvrkKtbcMs2cc9529DowUa6855e+fsbLPsVeaJ207nrCSscIEZXd5QP6ZvWFP+YAiuoIWW4bWt27qyP3yFWJYuBb9LNB1nYndgYF5M33dB/DadeJP9MwwUpLf1z/XRVmwRcHV5oY12J79ZWiLDD7e9cM1V5Gy3sC/Pl+RZmQdB4Bq2fURDrRwZvb071Wx3HmBBEBjAl1dEtUdUkZndhor/hYiu4VV1KLCZOtb6/qOuav3/lI5TxGFqxocwtgLPcCxZ9mhvA2E5onq/XFdsjkQGGf8/IHOUPBDuNn95xWOXJPygqnt0YJ3NxTUNc7tDZu26YbhXW/uQ9Y97ojv47cZvAml3j3W2fsCnx6UOXgG3WeOXpY+bTPw8QPN8v0joSE7Zt4Fh2cfUocvyR75W7pzq7aV3DQnCBmfBLHsLHofJb/Lj5+XVjHreLw935GG5d3HW6eFpOPnSQ3AzepVb7Xr3CSbYtUGVo0oSS/HrYrM/rJb3H05DD8VlzQ+ry7NMw5Yy4k11vp+7OcR5cgVCD244NeZfWOWtrE/ZN+nTaOmz4nmYkllDDmuzrsqHLWv3Lq9CETS48kx5B7Ir0qfTEM9YhlrIZPnSDHMTbh2p7Imck2mU8aKfU7qIlxkGmuwBZJ1G83eVtySHXQRTsW3gv6dP4lkLm8x9B7S+pfzRwEbZygGYcth0oMEeAtYy/c19fdvuneVYzCUdL/fWPxO/4qshHwIZOoZ3TJDl8MwS8jIXYoCjLNQu8sGsXXhcrOVozZDMw5UTzSnrAfkwzIF5XJgVvzM2pM9kFapBMf1+aYnNzJasIxufFT84dkWfxbIOM58Za4s+jLRtb27V3TppWccasRQxPBY31caCum+BuQhdtsdVkVn8GvMemswTbbuHQQ0aXXCnvidZ1/oPYSghQv34fqHxP+Sag4PA2ciLaTK5SX0HsoIlkpsWk1dNUE3QUUd1WltVZ0mgGlhEh18I+yXebjj5q9+lI4OX/MrLF68nh6yaqnvhXVCMizlXt9jy95O1dftfiKF0jNlfWaSn4g80/0P4A3NzQZZdpHnuy//TCx3Mn79MKWXl03k/OxnSCIsz/g8=</diagram></mxfile>
2110.08207/main_diagram/main_diagram.pdf ADDED
Binary file (42 kB).
 
2110.08207/paper_text/intro_method.md ADDED
@@ -0,0 +1,11 @@
1
+ # Introduction
2
+
3
+ Recent work has shown that large language models exhibit reasonable zero-shot generalization to new tasks [@gpt3; @kim2021changes]. Despite being trained only on language modeling objectives, these models can perform relatively well on new tasks they were never explicitly trained for, for instance answering a question about a passage or performing summarization. An influential hypothesis is that large language models generalize to new tasks as a result of an implicit process of multitask learning [@radford2019language]. As a byproduct of learning to predict the next word, a language model is forced to learn from a mixture of implicit tasks included in its pretraining corpus. For example, by training on generic text from a web forum, a model might implicitly learn the format and structure of question answering. This gives large language models the ability to generalize to held-out *tasks* presented with natural language prompts, going beyond prior multitask studies of generalization to held-out *datasets* [@unifiedqa; @ye2021crossfit]. However, this ability requires a sufficiently large model and is sensitive to the wording of its prompts [@true-zero-shot; @zhao2021calibrate; @master-translator].
4
+
5
+ ![Our model and prompt format. T0 is an encoder-decoder model that consumes textual inputs and produces target responses. It is trained on a multitask mixture of NLP datasets partitioned into different tasks. Each dataset is associated with multiple prompt templates that are used to format example instances into input and target pairs. Italics indicate the fields inserted from the raw example data. After training on a diverse mixture of tasks (top), our model is evaluated on zero-shot generalization to tasks that are not seen during training (bottom).](Octopus.pdf){#fig:octopus width=".9\\textwidth"}
6
+
7
+ Further, it is an open question how implicit this multitask learning really is. Given the scale of recent language models' pretraining corpora, it is reasonable to expect that some common natural language processing (NLP) tasks would appear in an explicit form in their pretraining corpora, thereby directly training the models on those tasks. For example, there are many websites that simply contain lists of trivia questions and answers,[^1] which are precisely supervised training data for the task of closed-book question answering [@roberts-etal-2020-much]. We hypothesize that such multitask supervision in pretraining plays a large role in zero-shot generalization.
8
+
9
+ In this paper, we focus on explicitly training language models in a supervised and massively multitask fashion. Our approach uses a training mixture consisting of a large set of different tasks specified in natural language prompts. Our goal is to induce a model that generalizes better to held-out tasks without requiring massive scale, and that is more robust to the wording of its prompts. To convert a large set of natural language tasks into prompted form, we use a simple templating language for structured datasets. We developed an interface for collecting prompts from public contributors, which facilitated the collection of a large multitask mixture with multiple prompts per dataset [@promptsource]. We then train a variant of the T5 encoder-decoder model [@t5; @lester_prompt] on a subset of the tasks (each with multiple datasets) and evaluate on tasks and prompts that the model was *not* trained on.
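The prompted-format conversion described above can be sketched in a few lines. The field names, template strings, and helper function here are illustrative assumptions for an NLI-style dataset, not the actual promptsource templates:

```python
# Minimal sketch of converting a raw dataset example into prompted
# (input, target) pairs via string templates with named slots.
# Field names and templates below are hypothetical.

def apply_template(example, input_template, target_template):
    """Fill each template's {field} slots with values from a raw example."""
    return input_template.format(**example), target_template.format(**example)

# A raw NLI-style example (hypothetical fields).
example = {
    "premise": "A cat sits on the mat.",
    "hypothesis": "An animal is resting.",
    "label": "yes",
}

# Multiple prompts per dataset, as in the multitask mixture.
prompts = [
    ('Suppose "{premise}" Can we infer that "{hypothesis}"? Yes or no?', "{label}"),
    ("{premise} Question: {hypothesis} True or false?", "{label}"),
]

pairs = [apply_template(example, inp, tgt) for inp, tgt in prompts]
for inp, tgt in pairs:
    print(inp, "->", tgt)
```

Each prompt yields a distinct (input, target) pair from the same underlying example, which is what allows training on multiple prompts per dataset.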
10
+
11
+ Our experiments study two questions. First, does multitask prompted training improve generalization to held-out tasks? Second, does training on a wider range of prompts improve robustness to prompt wording? For the first question, we show that multitask prompted training enables zero-shot task generalization: our model matches or exceeds the performance of GPT-3 [@gpt3] on 9 out of 11 held-out datasets, despite being about 16$\times$ smaller. We also show that the model improves over a large baseline language model on 13 out of 14 tasks in the BIG-bench benchmark [@bigbench]. For the second question, we find that training on more prompts per dataset consistently improves the median and decreases the variability of performance on held-out tasks. Training on prompts from a wider range of datasets also generally improves the median but does not consistently decrease the variability.
2110.08421/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
 
 
1
+ <mxfile host="Electron" modified="2021-10-16T00:06:05.033Z" agent="5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) draw.io/14.9.6 Chrome/89.0.4389.128 Electron/12.0.16 Safari/537.36" etag="0Zns01Yb8n-CopamW05f" version="14.9.6" type="device"><diagram id="0yvWQw7CPxZl9PFnq5s4" name="Page-1">1H1bd6LK1vavWZfvGBykd+fSKBpsCxsFDd4ZzFYBjbvbRODXf/NQVaA5difpXt/Yu8cSI1DHWfPwzGf+Y3e2Rf/HYr8Wd8vb/B/LWBb/2N1/LMt2HAf+g9+U/I154dj8zerHZim/q7+YbKpb+aUhv73fLG9/nvzwcHeXHzb70y+Tu93uNjmcfLf48ePuePqz/97lp2/dL1a3j76YJIv88bezzfKw5m8t27yo/3B1u1mt1atNR7Z8u1C/ll/8XC+Wd8fGV7b7j935cXd34E/bonOb4/CpgeH7es/8Vbfsx+3u8JYbvpWjH+svdw931e3wh+F2D7tq+H8tfsrDIr+XPZaNPZRqCKDde/y42dJYXS5+7nmk/7spbuHZl/IP3eXisPjHbvOl1fv5sPrHuiy20IbO9+5F62Z2XC230zKx8oeb1NiISevobS7zm63/MO/n9/MKmno93g7D1ca78o145mTxrHi42UYX3tZ58NKv8LxOelyP3YvutJcH48lltdwmq6g37YRu3vU63mFYXlwLQzwMjXE07lyG0yxZwZsK+L4XbrxvXqe9WVyNjaR79zC0l/aydGxROg/JNnkY24N10l/vY1s8ROb8ali592Jy4U7d4CGx5zsRuvfzq3Hmpa1v3698a15e3s1n+W5xFVx4qaj8ScsRoWcmVwH0YJ4nO39/Y7Xgb/gcD3qf50tj8HDbhd532kevCy3dtI6iG5f+pF2JKrofhS58ht9aebbsr/C59W+uLw0YMRjJ4j6B0XquHyJtH0XnAscG2rE+3PSdarRb/1zMnB/fJ4O75dX4ONp8fYC77OEuqYbbi3Jefi1GYeYMbf4d9tHrFof59Xg97/eMOGzh2H1NtuPtKB+4Y92OfRo322L55XzWMxbX8zy2Lg43s4v7RLbrZifSuv377OQ++9JMtsf7G3uwG1rjdGhNf85nZn6zG1c8D18bfRlk8/SF/kM/RpOvR7H5WoqNWc778SGx8/tlv9cazpzKKz1cSZv2CnoIrd9fw1t/fF/draCHq+/9cTrajZ2kH62ot7tBnlgX0Do/b45ecjUtbzr8xnlfwD//sLim+x4ie7y/6ReDm1lvN594DyfP3o7Xy34ELYB5KsXEOPoboxDToPDTO9Ov7o6ibNmjbtsYdVdfcSzqO6jNMAvwvi9zmDV4vvF9mx/i2TIf2rgiD7jP5F1zaJu5XnZp5rCvMOL7NNlO4WkXpWqTHIPLuTW917+tx2G/7Bf59wyec3X588Yew+qA1ZmKe78Du6pzSeORVDjzJ7/hmayOODZflv3cuGn2YPW9c/H4rV3+dWwP8vh6/JCk/L38bT1L8Du9DmmdHh9uZlOQGDhOLr8Fxmi6TSq5au2xA+9fLa/y4zwEOdOftpbw1MQWm++b/X/g7+WNdciHs2J/s52m8XX7S7ybGsvuoVpeDR4WVgQrD2TUbP+wmLW+JNsL82Yb/Oe/IbSEny3fCiOfg6RQbSgv93PY7fNtvlteD/Lryt1wD4qd188zWE9H6Gm66H0tPNXbq0voxWqF94UTkhJasnjtvXhearS/hZ125Xe9e9FtWyKNViBl8Dt4RnvV2d19g90H/cl/4hhAm37edPf3N5aTj0q5RnEG5JiptsfX6zwxG+3rJ6vFDCTeDtvp311X3kav79O2W7CSVTtLv8pWney4EpUwhunq6KfQn65n
j2S7RRXcj2S7/bJVcl/jagjSjH+b3ItN+9vt6o7Oh9tZgftxu5gVP0cbfzun/8X/8TZf9e7WvaFWx1axTzomjvgd3Iny5s6zLzN463HeFdDC+fbbxGvss8XMXM+taLXoT/dza218n3jwK28j525P+ztcVcKKjqIvWmIGf7sal3B2VbQ/YZwTy1/D+rgTk/ZBTHIj2fbuE6vAHX0H83MQaVJ+O5H4HqxKb4WnhA/7TM4lSj+SzKPtGiRsDGcirfYvso0PSd/ZjSw4wazI8jvHb4vpvBOaA1u4h6u5dXEZRL3LMPenC3c6nRu9y2A6oO/G0143mOb4ueO7/j4x5/i5J7LpT/jbLErH92HlZ8LMy2m+nyyilTHrT9Pxdhr4vek67B+cCcig8XYe3rr5/KbXG8+m+di3Bts4m8bTPL+fRl+P4+l870fT6c00/xlYxfymvy+n7nQT7fZj0Tn8FKE/Dey4Nd1c2HEF6yIcOPPuILhJ12kQts3Ymv8vqHqjqJrnE3fgxNnSBNlz7XfX36e9pFz21qNFteyOZgPTv45gdgfpMvVbwhp8v9l6LdEf/29qLg/LfvDwzc5FYrnlIls5o75b3m4vnGV/OR71fTuKpk4w7fVuo954fJ1f/o3rce/ODtL1dgynsd/5WQWpv/lmT/83yUUZbgdrWBeFuB4cJr2sADkWLLvjXOz2R7+aVvRdbxosd+sv8eZgzdyvFoyVsewtN/CKa382N2+3RulP4Ufm/nIS+YsgyruzrDf4W9eT3Twe949GbCT2Mr3Mwio++tP9NOxmrXnoFTdXg3JoXh7mu3UZZ1/tqHsJ8+XEib2+Gs3217d98eObuTwuZ265vPLXSdgu4D7zdna4i438RzA7DBZG78dkdrhc/N3rKwGbY3bVPsbZxRcxc+ahvS6nxld7no5bcNZsZr2lGV2Pvyy7vgeSv5xXcHbmsREa62wc5tVNmtCanfePtm/uD7e7wfV8mveDmTkOjN4hivzLG9zTf+vaGoS4/0R3HE/649Y383Izg/W73O1NsRtn8c6HvWxG8ear4YOsnHVde3kFMtQY70R52AytfSeK9jBmy28wZrvJ9nA5N/7StWX8WOZra7kDqZOu/aGZm0ML9AA3cUCuBMJe2XE4gDma5zfh8ssi9LvR5uLL0vb/qvxoXLuzbcv5Zs2/T/K2/c3yvTnsD7CGQC8cjOKtC2eL68xnrWoW9sw4j02x9cywc9EJ7GU0iQaXUzfvBNPxIMh6f+t6lMziar5bGbD3HyL3whWZs7mZXGTL2V0Ja8qYw36Iq6k/6q/sW7e3G1+vYQ6nLZjD3l++vrq93mewB8DuHXgTd/1laWbl6Nr9ERvrUWII05+u4wX0E87QThDt99Dvb3/t+np8P7uKj/Pp2k1mg/7UBbk0M483Rq8z6/8L2kfXrrPo+9ffjMiaGD1op2mjHoP6yyQfTEHHKBfGfje3kgvQ11a3oIMNw2QzrLQ1sZP6qtZx0Y5K+sUD2Dr7eOOsb/t5pXX1E513bSxm/o/m3xq6POpnudiO/yuaWiRZXvPZvNJ3PKnlt1k7Bk14CBpwBzRAeuPWqEQ/KIU1Xouue2xq8MpqGW0EaNlt/FdbMtfL/fxqfDdK26BV/9S9lyNw3p6v88bfob0maJEradlAr33QWi9+zMO7VL7nOLLldxPTXszGxqJ7dwSt/j9o38Oo2/6mZY7CmHwZt320qn5+WVwNcrTbPetRnxzWrLPzdu7qUa51+ebMkSXXsJ74PvcwnLQLr2McTvpYXkrtGy1jA/rFmvpNv2fMJytbdFolWCc2WVVd916EmTXqJua3az+Nwdr41m2jBXLA3307neHOib2pv0vgyYvOJa4NH+wXx++0TVG5PNPhCu2i82tDdLOCbZ/kKMp2NerGVuPa8LuB/LvbghaufPi7vyEbqoTTB1ZSYEF7+ZlpYIrmO6ugJSaPro/y+di+5mf+HduNNNrj/vR/8XRuRhaMYgmziHbL1WB9
s/O3YJ0foJ9VbPV+wmjCamgXPq2A01llK/9srOSKe7RSjHG+2PZmyynYUDtxgPa8bYWcfNb7s7Xs5+nC/Fo2VkstA9C+DwdZ0579hJ1RiK6496vkaZtu40MfLzOR+lkMOqzosPwCG9Fp2LVP7IehlTS9K1pejfPbK5BHm1N5dFwPJ8YK1/IwbHp8kvPZPLFWR2Fy73fdyq8iWDntYgiag5+uYL/kjVn3VsMQ90m2OZF3/YZf5DkpWIjUa3ndCNaV8mWI4vE1S0pY987pZ/5dNGm3RLpajcJ2BXsE9ssK9lZUjvhvpaDferg2YX945Qj2Dez6Fqz7MXxPa16ksY3f+2kE7/McEQarTo7fo8cIpEOI+wykXOWhPwN/S8+GeS29LkiQsA2/jQpoTzUKV4Z8donf+/BuaEsxor6AFIQ9TM8Oowr2nIH6GLQb2hnAb90C9zb2CUfd78IYlbAvod+iWqE8wHGha5ALhur/qOvCdYR7GMbULUkmQFvlewx8DzzvWMuQBOUE9yEMDPi97ENmk0yqVjb3QYB8irEd5HuC/1r0nepDFRx9Gh/smyih7/AMkKwlPRu+z3A8LUFjn+FvSxhrS44PzRu01yZ/UbeN42DDeMl2Z47ANVK1eXzgtz6+p73/ok/L7FxevOCP6bSOMBa4D/dne6tz5sfT3y6v/fyJPevOZ/v1dDcN8T3DUJS/sFfPrpQ0gl27j63Df0XpPbWvtygno3Le+1qdnHz95MkWa2+Z8mZ+8aq21BlqD6f0QH2BkWmNSvZAwUzCns5M8sCh7x6k5KLa3yXb6RakWXoDf593vQo0cBjNNox2Rh6sUcPr/YR8OpHbWj4b8+u1cV2JE/nrTYwDrKUVSKxTnWp3PrNSdv2Mr8dwVrul30Ed5PPOcw+entA+l3sU3kJygfYDyIWjj7IldEu5d0wfvw9XtEf8MMJ1XKEc4r0TmSAX6OyGNd7Ce6G1Uu7ER/jbYRTiPkhQTtBn3he07+Xf3Ip8ipWgv4EcQFlnwn5ugYwqQJ5IebkCOav3PO11aJsN7UFfK3yG+6jNLO1J54C9iPtW6h+4X6G/mfwb7OkUnpEmpvybg3IUZLCjPsNKMuU4gVwQ9Dw/zbBvhkA5ALJNPRf2Ncpm7CfKyRbKTpAtUk/yHBrXVBhSD4LnBSibTSkP4Z64IvkcCjl+guQ89AW/N+kES7NKnQkjeD7MT0E+83RVooxHmVjLJZTzK2wPyMcYxik4ijRWz0aZrOR2C36nZDw+G86ZpKXkPLQd9S1YH7FcE1mLZH6IcjTGNuJ50ZK+b7w2fDxr5G/pOiW5DWPvodw+yjks/NS1tNwOV46W8dwHkLlCyXm4l84eaLNQchu/R93TRnlOz6gidZ5UrFeKgual4jNz1F1JuY3WB8rzGPtc8jqCcxTX9Uu6Y/mndUfxEbqj8bLuGHyq7uiHK5Bi4o/qjn7nKd0RJdCv6I5w0oK+CNpTBbsKV8ykVYB9iFrNvnEil2CzoQ65b55qT9nP8B3Fcbw+nvoGWjq4+mAnudJaImuqoHOLrqVUplWlrCXcsUGhVmqI2mGKEqJhQVWwi9DqSpUVBlIdpHLjOSAhXdQ0KXLF9yQoEeC5yppS97hHGZGDHRjY2ubnth9P29n4Le66R9fJUUbOTtv8wo6DmfzDOy74iB1nvrzjok/dcaNudA+n15/dcZMnd9zkV3dc0kLN7bN2HJzdxelOcS3cgY2dUuD5ThYjr1w4X2PecfVZjvfA+Z3Js98l+wJ2XUX6ShhUtHtSpevEeH7C7gGbStouUldi2yWlePlxBLYTfI/6Btpw6pwyBesVdM6CjkP2IepwUv/CXQv2RsC7HneaPuMj1CKV/oX6QVP/wucp/auEZxxIz9L6F45Bhu1VO9ZAfcWvAmkXJvhsk+9v02ehdELWXg18ntzxhno29IPObPob9BF+Z5MEor+tWmQrkQ3lkT0I7XT8Uuk7K9S/HBonvhftsFbTToXvUZ8qfLK3
QGdSulQY8LiGKHXYTgUJpp7tsO5GOqlFfQLdS7B/iq6Vv4ri+LzTz+0ntDTOPHUNjb3rkjXy7dTCeMqrufp+Na3mk8dWyyPJ+GZr5eTzI69w/KJX2D/zwhS7J9r6W95d74pwA4XYtIw/6HW1nvC6Wr/odUV/DOo0n+dVQguMtH5c7UqKkDUQkhQxRIVWCFgpvIIL6aWAEadzl7wOPmrGLKEKXMUgsfS9sEOU9AL7ErV/2jEFa9xtp/Ys6b/ZI/Si6PtghkEi8jP5c2O3orZu6PakSYu9JlnVfC7s1pa0Nuu+sk5TCRhHkLqwa4RC4OAOVJbGmC0fbBvpGKzZg5T2lVcHLLTaM+VJD1DDwqlWRW3hgCWBFkNVjxdbTyTJQCLhaQRtTpVXxzNHjA7C8cTTqmmZwfmLVo2U6KkrrRftUWuxRUInAEprskR9LdFjOl3QQoNnWGRtNsbVp9MoQe9dhRKLPFlaQkJfOwbOPY8pf+b1AaeTvGavHFik3J4AdcWCvYP0GaxWr9X4G3ow4T0ZehHR2irVqYenkLTO4B4aT7Lk5PiTbsseO+hz6rLlp71qbdb/0KvGcwsWXFy97BmL/4ZnzPo4z5j/mmfM/mTPWCqcYRrbf9IzljzhGUt+2TPmge3dqsQnesYiimQ1NEGTUH11pAq0L/JKVaR9cfSrqiNRbkGe9Y6001h+aa1R3nt+/br988c9DslH2D+tl+2f1WfaP6A9x6BhRH/U/hk96XEY/arHIY2hJ4HzeR6HdsV+SFfaF67Jp5SH/mBYnZ6FOjaeEHzStfE0gpUtDPaz4mlEvkM+VSgutZIxDrw3NrVvjzSV4EinGZ0g9Jltnm79N7ApKrpvojQHPBnIFyg/u+Q94NOxfWT9nrQLsHsijLGgTFDPZb8rxWpO+vqlg/0PA7yf38/+STxZpVaV2KxJKD9nxPZUiL5i/uyHyo/M8SjyloCdhCej3/VM7X+mWJY8GbnvjZMxQakhT0alxQSlr/yWSlJQvCzimJV+L0slsFsNsinTrMWnq6fGB09p2R+PtYc0ULZmRXhs9FWj9oUxQI4tKo3KYi0uQe0GNA2a88bYxzie/JsKxpiknLLrKD7A9iSPe8PW9Cy0L0kD7QbYXlvOtYHP4vcE5DcX6JVizYOuYYxgfcUOj6NqZya1FPRRo+0tdB9vV3tfS2LGCUjtOGFfOWl7wiFfcKj3AOPOU/KAyfmCNmo/PNmFJXvMYEwrijEWdVygfdTPTjPURmH+Vyou4Mi4p0F++DRBLYs8a41ng5ZKWhJp0biOav84rQW2bbndBnnhpJbK/aRnqz4aaDe/dKKM/rhHbfURJ4rz8omSfeqJMgoDsOH/rA979KRHbfSrHjWwSlHj+zSPGtoVbGcqtI+O2o9YcqI0gdWJEimqIz9dRE3Q6odVjpEvjQTAKLn6G+wY14bTwFJ/g2faFD0jD1LGUbVQIyFa7GdeWRzVIQnW8DB56AXCk4k8SSwNolpKcDTMxGfDP2lPs/3t00kEdiRJa/yMnkLR+Jto8X3UL0RbaFuREFCMXHD8c+RCRd5FjoBV8HeKByjbDFcc7vDoyBIZI36gCUMbpERGLxXaqjS3iFygSJ+0sfEUH5JfwCUv1rDhsUQbVP6txIjoMFSevYzsbDmXZHdKqUptB+muT1uUPDhGHJmMK4qIVrGOQPIcxeR5Y+9dPWbS21eRvo2RPpJkmalQKfxsJfkyc6SkpNRaGpIST1C0t6F9K6W1VOQtpWeDJUOnnWvVWgtFZ9n2DD32Teh287NlP2Gv4AnEfXxRqv5xVNkbPWAvS1WUVi9bpGe/+HgPoChBJpXiD+Iuz3skkWWw0n7VKoU910HvpVt+Gl7jsU5AElDrBEf2XCVqZxzZo+KxTg5W0InECCm+AHpW7HA8HiXWSuEmTNRLpLdPvrfhCSRrmD2BqAPV90EbaCcJlLr4zNN3Vu2C8FUUe+F74TSya+xWQ8dh3VHtdNbZ5XcNXEAT/wXP
FRXpy6nCc8XstSIch7AoislYCulBoxOkYImLfoW2qe9NV4XGrDzWY/F9Uo8VFnoMRkqPRSkduoxHwM/1+4yGBG9B29Bjqe4rWO9TuLWTfr7gBWv/DS9Y8fh5fCfuYcx4hLOI4s5P7OFHsqv9Htkl/UpuEz/NfiUX9yVoH/1leWNPjyde/dexVwaMXjXqyr2cBvcYLcJZf8Nefh0tj6d6SZpBN5Z4JpcxO6m27Vpky1QZn16wJ8hGVhGAqs04HPIMox2C+9FT+NFiRF5qkJQVx7vIDqkC5a09st1I2gjIBbYDG3KBkQeEjXQJSwPrXNkZEsfptRrvVTbjkXGb+DeBNjBoIAqDxP3xK+o3e9RBP1bxRB4PYY6U3VuRNlL5nA+M9pCONGB7hgqHCad5Q5MxfXUNmguMm1l70iOJSY3l56ZMwmtBWhBqw2xvq30e2SONNeX3NTS2imWq0thAbqHtnwolC232ikeEgzqZ8+e1B+P3cEXnsW3yqLd0xATmD9tAuDLtoVwpD6VFa63x2Uctu3wpxjlPfyvG+SiyRVETW0dqOLohbX3GyMJ5oiwF1joZLyg/txUmj7Vp1qwLijCE+hxE/wXuJcYAdjP+zOdHgREv/psH8wzzVQl1ttiEZWS/l/Tiguaocg/SxCILAtvKyJeCcHzsM5Gasqg49k2+JZUnXtTIGXqGQs7Q+yQmmLV28tmQr6GkvaZxBFmrxkxGpn8Sz2/rc4sidBjJk/sK5QfjBkSL/ARdhZPka1g3HFEibVed97hWUKv3jqNaCyZLodEHmKu20+wH/N5SfhEeO/RBoX8mOvocw1fjXErMgpx78l0cNa4BdBzCrBO2z0XZhr9VPjKHfT6ZSdZml/1PUq7wdSWUNemwPhRYJPcIdYQ+NIULiFhOhhKbUZFsqLEZJFeE9JGyZQTzIPsg5xnsfNQ3eI5dh/rw/F63fi//5NyPS7hOeKeyvEEnIqtlZapoILfdc2jPsz6CCC41/kfG7LrSn4r9F2gZKQyHyX0l/yta64SrF3LfMeZ9RXuSP0cqn0BeU34BPA+tPvTIqyi2xz5ZijaiVYq6GaHVJAY201FjiWFBvyLOf8lrDPdaXOkoImFg3Yr2IKwPaJ/jK/wLXxvc/7ii52ifXq3LobxXOBnZRkfpfWBRY0S1kmvaonVA+B3pheioHI+EzzkaW4oYmbjuRiGdyba2PFMaZ47OYySbP/P5zZF0jt5S/kNUaR2ZsK8cGebPWSk2zb+5KjfE5nWu/Y8F6+aU46A8HxzJpjNCtLgtsdz3K4oG+3h/N5Z73zXU2UYWPtkfQvqZo6rxN4kQJKwRrE1XWdRH9lNGLG8qio7XyAbWe+Ue9MhzAr+t/HodE76XdfKTdf+sjgzy/Dd1ZGl9PbVzS/FbntPznZvxTNSnkcNI5KBg5Bd7yGscA60ItRMd6f1XcXTYmXGldi6s3COzlkRO01Puk0WW8czq92Yy7h9LLSlC35WaMdRoVr5EXLHlp1dUxYioFWuhcJLW98m+hK5EyJE2adPOkrgOvo+8+xKlvcKcR8OvUXEHkALqeYRNGWKGAGm15Nc7IPJGaKR6jOgdsGhpF5QoUeD3GMXRmi1G4WBsUFrZ7BNDicYxXB/9oh3DxPc0fw8r1sTn8e8Tp7a+PfSVtth3RDiMA8YadRYArOphuJIoPo+erU9wnvuDYJ8Xj1UlZGYkX/s6Mgc7GSVphWPoMbpd7zZPxZql5JaRpZCiQiBZE8L/jlTEiTIIMva3MQ5XSR2ZtRAfGUVIp2wDUxTUp1+YtPQubbaBd7XM4oRTV0eEEj5xUozSqDWeWQr/y2tDUBZYI6sTrXaFPzbYsqCMied3epq8b6crP4vFOXSZVZ+fgpFN5PUMSkbxBGplVhLzKWUi7iJxrPPVFOYTzy2YlbRNuR6+jkdFjErCsyl0Tc6tC1RuCKGYpG5O3uiTz1WkUNXyOY1R5GfB6KoVybFIsPtsRpYTqsjQ/o4wYxtUyV7y
lNb6D8lZrf8E5N8BHVutUJOfHfG5RHmQgjGXpCPGuEuhLS+jevzfznc7nUHpn2aNUspGaj1plWiJcTaNkK0nn3LJcRH6DOtZRdYyGbOQWY2oocOo1Lgn1DZIM5XvVLgxlUVF2SToB1zVsthjTTQkq59niDIFVSQwM6Uf2lH3UqRYrUZC5aq+UPtYO9aRbf6usT+l1RMxziwUcix05pbFMoq8ZCZbPSQ3pFxAGbVqjep7nTpjtK1wZmwZNHFmKaEpHZa1NBbKU4CaaYstL0Lh1n3nDCeLMrOqxDlpJ55peoWq8Y5ktPdZS3X7AZaqSRYKa0i0+9hiae4+93z3UZYsxYMUjhDaK89B9CWBfI+UV8WUWamUgUXo6trbWnGciLytFu5WH8+BZkQ5JCSEjD67jL6mXScj0A05yihV3W7OEtPWKP3+WJ9BmFWn42kSDS0xlNhGpcmDhVojyCnDuKFB0/kvPaXkgVYShcYQxsCRCIqCcIP1GnZEvYYFW1PuG+Y7+T309XkmN0nTSOXI8LXSLag9FCvU+sevSeTYOZPIrHlTfBPPTXVyMPJFSuTSP5PIhFpQEhnaxSiZRO0xm7MFItLGZY5PY58RwsGR2NaCESwqNkqIBun59uzRied7pS0iWGcFyAhL73eyej2ZO1SPn/SESOujliuU64Q6JFvtJuurykuJ2amB0m85rntilcpTVWbJy1PVkJaQOlX52fJUrWW6PFVhj5LFSx7GxFT3ssUjrS9+r5JpaIFoywzWs9m4T/bHJdw0tLtg3T54xTr5LQ/+E9HCN+YKoH/xBL/Z8Nw7w3RVMjL6dc+9vz3LR5Tvh3fCqdsCCZ5gvOMpjMTZjo3P4wkq7kn521ELWtUC3e5trTrPkqxbBXPZvgcr3iJM8xOI17ORsp8fKY+YIWH+HUS0vN5DWF/P9zDNkE/SfmvEBLXcZ3pYUX5lFTh0kr+lh5vneihwDkuMiMNOf0MP8ZR9rodgD8AcwhOJ4eItcxg910MTzhpkH0Ec+tvmsNPo4TPxpLczKnlPMCrlb2NUCt1TVNLjWP+pb0Cdwx2D+JTwTFFexJswNv3+YAO69DFOs81vosDPUNyP4oEfHt1vcKE+iTc/7xXhzY/D93LHhN4TqHP3l+P7AbQeo80ULcP1DPprUPnlG7J83pjFJd6ZxaVWovfKSvxUfqAGZ+6TCA7LD8ebeCtK0fccieAAudcCefP7eQVu9UQmV+n9WiYXnIytUsZ8QRdpgZ4I31XJW7Bxb8TpeO+JdTsSp1O9jH50P5lR5UVO340P51NkxNtBKtLLzUjy14FEv4eT/LdROk/n8COS+Ffwj3Bm3KM3HH2Yo64H50jQgu/st+Fz3oiNqD4gQ+haZgi9isc6H5d/g8QGfX3Tqs51p1/azeYTu9l4x24GqyEDzc/7+Nk2PnC2J6/O9jnO5N8w2+UIJOXbT64nZvupLFzzXbNdJSBt2h8/2+YHzvbm1dn+5Gzr35pt09+0rPdkALr2B+Rcn8y2nwYo1T9+tj8i1/P6jbmexekv/iWz3YJxbr1LL2s9Mdv2e2YbrNF7vxt//GzbHzfbo1fP7fMMwX/DbCNKqHV8O5PEE7PtPDHbrXfMdinC1T1GwdHa/9DZbn3gbL96bp9n7/wrZrsaIVvwO/a2d3xitp33zLbfje4xZ+bDZ/tX/AevzDZmwr9oXaftf5t1jQxCyHv7u9ZXec4G/p4ciWEamapyDvcFbezVB9rY7jnmpvZDYtWn1GNk0uu+w+rEd/gb/hjpf0qb/hgeu8nMwTpWh/nMMeLZEVcEWqhn4/l6Pp93LyrOMaGI/QTzTwjRuoc1ZNCYY/y186qP6lV/JEf0G2xZzHvS4EhwywZSwSK8SuPz66xJgfMBcTvJisXcNYwlIVasOqZZo56RxRPxKMeRZkcmtL1CSzKTFqLMNAokqtm5Usoha3C5SAQj45ZkTiPy5CgUK8f+/G5W1PGnzB6dxAolGzQh85qfH8UQGzFd
+SzGDMg4HMbrsR4Uouxjg5mOFUpf4cYk305ao1pl1ksj5ihKzkKoEcRyfFsSHcJZ0qH7MqpjG30IqkMoJh7NwsGYeuZdlq23OZ87ZhRBRew6GjsJrW5p5ARHrRVyQucPKBwy8YnRKlAZrCqiuJKf62iqvC4Y8eZh7oHthyc5BFXznRrbjF5k/TfCshp1VJ36wnzeFSFFmlx1nO1Z6d+2KFrJSLOKcKq6z7FCYZTEXnSC0FgZ8rolQspmrbnmSomtfhahIdCLVvF4qqis14xYy0hoE0OUtGoMEWGCqpp9CLNEA8LT+hM1t6J6hRXF+oh8CcwBwTknzLFkfGF0TVxITl7mhZPXiG2WfHeE8sBoL2bJjnSuPmGriE9CUC6GcPxTLBn9bYQYMjiNVD6aICRQxDuc0JyJ4p6T127zt5jjYav1Pww9HTnHuZE5MsRh2MiRIfQk/22F0fKqgQJCrg2J7CNkoOaF9iWSwKd1QlLOrvPWVg1cfNs6Rx5JxIlEHlF2biM/L2LOjZQy0uWYkwRSqIEj59C1rRrnTagBmZmbmMzlq/DFnuSWb/PYdxPzNJcgklwihNjC/B3iS9ZYPPlszgUinPmR0LfMpFoyz/MrGLb0g6RdJZH1dbYiMdsLZjRhpgdksJco/qCQmCRiZJZnlZbrmAHJmZCUey9R8t5RSgKT2TBlNmUY17xhjHVh7I9EHjJHmzqfXMmUkckZjwqJO3EYd0oMGqDTt2V+fMLZBMz6UjGmRzNgYruMmgEzZjxpk72f2DkCxgZR5ppLWQ0SN4ozqc76FmFY1VnPO6Vx1q8YD02Mm4odZWXUSHo6F9VZX4wmp2e9PCnkWe/Ksz4+O+vxtPnssz6SZ/2KsyEmssqEOus5402e9XSmN8560smUtHb4tBSOzm5jNtKC9RohWbNJj3jhrA8+BsFJiGJY0Y7mXmVOk3p0Us5LqUfHqzi/p6151OU9R41DDV2bzk2NnpbvkQjvEaIOU43Yk0yBK8n1Hivku37X+bVCfnG+k0ZiM6quK1ecRJczkojy3RjphDnDsh6GzGHR/WB09fm1RprJ359ePx67gNCqL5yl9u8wLD9VjeltdnnwyC4/QcWAzZqhB+gtOA9rPnkOyZKVhK5B+dd5C7rmWfwQ5gkglxiNwvvQNW7L37SgTejZCt6CkHoUeWiOVNAaoj8HpfebcFvB8/ihbnSP/PyUD/mGUT/3FtY9DI5+2UJuLtRN3oIfyl/CDyXkN0AM9Jvm8Jy3u4kf6sK6qmLMPHoLQso6j2zVPYxs8pdiNnUYv6WH6Qs9DLPWr2Dv0Mv0TA/B5mnDHLrF+1FucUvQKm2/EeWWPIqGNnoINuQQdIwRZRm8ZR+6z/UQzllxT/jg9E1ziFLtmR5ipYwWSHlk8RW/sQ+f8QE1PUWPK2XX+IxHPhZRV75G1NMzFaSfrYJtv+hPJB8lSPPjaOcXy1mvnIdPVwe/meXG/Np7tRbfU7g5PMVja32KXXik2fqdVuvlutr/H/V08nxPm3P3dKVwnucnKoJvnYebbfSfF6MBjWrvid0z4+tBfj4Ki+3F/ibdHxbX4zxJX8UCvrxyrbes3Lr2+ck82+h5RK1Re7+38/1N//jlZntx35ifJ3FIPGf+Ounuq+R6uk4aIyNH4X83Vn4/2pr7ZX9aPu9Ff6K2et3nxqeGP/Qr1V3vsZ9RVo4/qbmuK8uDLIvt4KzauntPLHvN6uiTVutJfz4ipXD0dCth3aXQ92OEq6oLb2ngvuFv1XJL7V39Y3fp/5f/WNY/lvFw++NwW+A3lmXyV/vFj9vdofGV7f5jd7ZF//Zue3v4UcJP5A32f/iOUv74yxe+Pm6WhzV/13Lkd+vbzWotn9pqfeUvFz/5i5V+NHzb4xfCh23Ruc1z9X76bBmbJd/zrRz9WH+5e7irboc/DLd72FXD/3Nkrxb5/S3/jL/4eShz+cXP9WKPHzfbxQr+e7n4ub9NsGH/3RS38OxL+YfucnFY/GO3
+RI024cVDGCxhTZ0vncvWjez42q5nZaJlT/cwLGERzxMVX6z9WHgYRFW0NTr8XYYrmjLwrGQxbMCtyosaOfBS2nq0uN67F50p708GGNob5usot60E7p51+t4h2F5cQ2m7MPQGEfjzmU4zcDAmrQK+L4XbjxcTJvF1dhIuncPQ3tpL0sHjmnnIdkmD2N7sE76631si4fInF8NYYGJyYU7dYOHxJ7vwIy4BxGW4aL8fuVbIK7u5rN8t7gKENZd+ZOWg8775AoVz3me7Pz9jdVCpfSe6Ai2eb40Bg+kWpLZAi3dqEWKwiu6Z5Kr5qJu/Ob60qBDeVbcJzBaz/UDjUvBSxgPXtjiTjXa8QEMIudueTU+jjZfH+Aue7hLqiEYk/PyKyoQztDm39HG6xaH+fV4PUdxzGDAr8l2vB3lA3es27FP42ZbLB8OeNhS13M4Ji4ON7MLUBS4XaB8pHX799nJffalmWyP92Aa7YbWOB1a05/zmZnf7MYVz8PXRl8GGSo2z/Yf+jGafD2KzVdQQsxy3o8PiZ3fL/u91hCUDa8GhHQLaP3+Gt76ow4yjdPRbuwk/WhFvd0N8sS6gNb5eXP0kqtpedPhN877Av75eBTgfQ+RPQYBXAxuZr1d03yiZ28pIAUtgHkqxcQ4+hujENOg8NM706/ujgIhuyHS5ngIHm/coUOnffGFXAAw0t+3+SGeLfOhjSvygPtM3oWhHnO97OqDBUZ8j8ExeNpF2Qi14hhczq3pvf5tPQ4g9OHozOA5V5c/b+wxrA5YnRju68Cu6lzSeCQVzvzJb3gmKxSq4suynxs3zR6Q+H301i7/OrZB2F+PH5JUqwinswS/0+swZdF9M5uCxMBxcp84ZuTBCAZ4fkSFh8Kv8FRMqPu+2T957Me7qbF8dFCycgMHoHmzDf7z3xBaws/Wx1wD7K8VGlCwdktQIXQqw2MFoKiPuvPyz7VkUamGT0sN5cz2alWXnRRcQpqdE08rfUpNaRzdqu3x9TpPzEb7dBAW2+nfnaRunLa9cQQ/p4w+rbrVis5jFZ3CnOcq3cbfzul/8dOKCLWalI2OSeoV3Iny5s6D82s08zM/bVvzNE9PgBBfFzNzPQcz61Tt8FSYdU/7e4Y7dwxnVWD71Qtp8pjSOMlPw8xImZAmCL84VWM2bwCEk1Rwv8g2PiR9Zzey4ASzIsvvHL8tpvNOaA5s4R6u5tYFFkG/DHN/unCn07nRuwymA/puPO11g2lORdJ9198n5hw/9/x8+hP+NovS8X1Y+dltOi5nvf1kEa2MeX/emu7230PTX4fRxWF63fs23s2jqJ/vF/2vx9g4jEDa3onIc8ZWdvxm5faNVcwm7rQ7nRXzhdtbTLPBMZnur2E8ruJcGDdmLxfd+ZckGrgg7Y3YngdJ5e/C/HIz63wtYwMLYq6/+9f7qZhelkPD9MJ0MJpte2u/6tnfQGrcXg3GsXlpJJZbDu35/253+WYM8gie92O+Nb8k6WUxvd7PxfW4TPJ1fNu/cIbG2lhU8cO38mc1C3uwrubd8dWqDLJeNIkGl1M37wTT8eDvXDuHoRVVt931RqT+D6QYWV7vv4nN4Xu8dUsR9qxJf58KO3oQ9p0hrsb3MzC5w92quskDGxPWFnniTOEkhLHZTM2lucyj0p/NPZBr82F5uJ7sfGdpTruzrDcIzP3lJPIXQZT/retvyzS/u92N/zcLB2l0tbdmV24xscztbXfJ7c/u7NnV+H/T8sKKOhduEoI2MLkwx+FgFG9+/phfT/83rVxn1Lsspzt/jSGG5ZVnjqawrHfzcGFPe4HRi8aRD2M9hbU//qvX4/4A5rDY+XZUTXtxy7f8q3A3X0ddvz+aDa5AJzbi/K7yq3Fr6a6jm84hFbt9eZv2TN/MiqW7qhKcw6vBTITr9fJ6XcbWdBHYy3/B+qXr0cIKkCTxR5itIyxmcDsd/8/v97Ll9q4UuzHIk4Ep7PGXebi0fND44rB3TwFMo2dMdntY9+t8
vFvuo8gfL0BeRbN8HJi9v3W9mXdBxnbXJaxR39+urVEf5Y1ngvZ7BfusjGGPic3FFmTPgz/zzfFVVi5763jRje3x9dhbpJfmv2j93U+u7n4k3eXVbQ/O6f4UZIOAPo0HUQ/WF/VnZY964wxkSbG8nm8X26gMtvHD2BiMJ9vD5dzY/whmy28Lo7f7W9c3k6/F7VYcRzPPXlxldtQfbEU4KKf2GuejFabr+6F1sBNj6gTTXu826o3H1/llFP3Va3eGWuj1MhtXvrWsxoXYQXtD/4uwe8V489W8MYpOOFv2gszcjaOBPzemf+0a9ACxuLpzltF6KrKWLbbFemFfzidRT/wb2ofXi83hMs7H0+ksOUY26BPXpM/AP9Bn8vGPyJ6ObyLQkhnCefBfc9EdQF883vQvwILJN8Ntr7W49v97Atx8XJr9VeDyaSLhU4TBp+7OdkOXB9tr0kxZFta8mx3jMAINVvx/k7KcWEMEMjxfIuusV+8vkeV+SLIyBsyQwumzCj1kBCKRxCcFkb49ug6ORBpCxGpcFEHB1/g6MFQ5dxGuilMwKZOcNQpuSVIx/Q5JHnd+LYEU1L7mZ/7dCyU7iz9dsvO9yd7rtyV7f3ISGZKztMCifJ6w398GjujD+G3j6gMI+8sPSPRGagvjM8vHMXGxJPIVigrv/FrKSoOgR43PNQEwQb8YhBkGBhcU1mDJoyQHZqowgmczaZskS624ZEoiy1tIWh9N28VFkJhsPmHCzkpRByFoDwEmXLQKIV5E+aVhQ54ktydKNCLKPSmnSZAvVU4T2tnRZc2xT1QQGEa9kiVgqhFTZAl5DVJBkaATqNVgoAzakQgIim1N00NQMUm+rSRIXRiA6aVk4S1JvlppOHZFJKgGk58LWbQ5UH2omPxcFAy+9bhIVDdRBKcFUyPFRNPJlF5Uak5BywgqJeidWG40I9oj1e4Rl1qp+HuioiOQ70slRdDX8mdLiryLqqD1NqoC8amlzxEIMQpfKv03z0bh5doP51sEEX9EoSZRfgRRARKUot7xeaX/JOxW70qm1UfiWQI6Vm0J31al5CJJS8lEnQxoFGYND6bSbFYN7PRaGu6c1gSBVKqvAT+nnS7/huVLELYuYb5cGp3JZrm0myo9jgWK9S6n3S1hw20LdpgpNJX8Senxgqj5delxoUuRCyxWgVThlSY/q4jKupupzwyyVIR7RD6bSJrLoGSKav1ck8CwXC6PkyRSr1E+niity0b5eIcBrpGi5W8xRXqiqbsJxkMkwBqCjYkrGjLPKRGcQDciCmwm+VOSiCDeXMYOdBgq6VIqWnYqsacl9aoaKaneSIeQkl3CvNGLrNIhCMBbMFEt0X06RJyLOhcnC5mqzCEVaqYCv7JMDVFxK+CyawmCUtJcStrv0+LBddEsLC7tUolEBQdnvTDglI5KlrqoSyQaspiWSck6fEq2NFAX2sDEl0RqyOsojYpXyrtXf1xXfHOa5su6YvByefdPTkp2zVGnhclJf0pXND6ARgROVkk49mnl3ZnONSgUbSdbT1yIkoHOAVOJpiq2xtSyfteVKxUTJTxMLDkpUcwE7I2UPSKYaz5HUkSS7iCtKKaJLGoq5rakBtcUoscmrJ7b3j5pZ+O3kgry9JooY7kfJ21+YccZf3rHnUMGf3fHRS/vuE9ODIfTZIKFsv/YjnuKuOdXqVyQxhz1ts/bcRaT1jd2ClF2N3cKkdaijajIZ0uyzLgkkTzLiZzZqYlRyaKouKAglk+j1ENNjipJRTmhhK2VRkIJptO1VUKJLCvERQml/sVJNpJWm3dt4NQEzbhrEy5R3OUkIV16JdQpfwWVpTjRv7xWrX9R2qAqlUL6F44BpyaoHUspBZQYSykMpJ9GsrwcfS7qxBSyuLioJ+74k6KedGZzUU8E1XYTlYIli0O6R9a/VkyMq1M0ZSHJMOPEILoXLa9V0zLlUhSq1A8CxzUJeKTTu6RlWtbPphIyJemkXZkAxWU8hLxW/qkX
yrH9burXu8qxfQQt0dUbaYmszyWXw2InWPT8T3pc4yc8rvEve1yRmhy1m8/zKnkqsU4ly6FMaZHeSjIlqIjEHmwWldokSxce+RQmr4NRp0K6uKYd3t90r12XPYtbXF6aPEykf1O5J+lZqv+WYMHU+r6Kyrc4Mn2Jy3br/SWOspgql12vVuQ1qcvwRDL5vpE6nEaNUt6cCox7WOhSWpTo7uiCHZUrS61RUh/p+X7XVWXl0F6rPVOp9ADV9o4ty9dJ4vaoUCXc5HjJUjYo11xZ6DYqNeE0EZnDSqTxpJI1DTsNbEFOTWvVJY/EsZF0b2oyAZTdqSyWq+Q7nzVczq+i0j/NcaXkRZ9Tq2UpraDS9mGY6VRgoQr9qlRgnSZMXjld+tLv1mUx+XNdLo+vBZVr5xRZtL0UYTmdSapcXsXj2ShwTHaWKnFJhZwNigVIrxqXnos5QZTmFmxILM/1gmfM/+OesfgjPGP2y56x5FNLmIsqu/fT5I96xs4Jztgz5v+qZywV1TAVxadFtrgAbUMTJAr7RmRKyOK3QhX0K0ZUdKvdTDA9LehXa43y3vPrV+0f+4/bPx/kcVi9aP98Ng0aFiBstfw/Z/88RXf3qwRo6DHF1NdP9DgI9gcSkYCkRZAJ4i4XJ6ECMJld0yJQYTCbzm8uh1mQ71CVdINzY9SVUQ1c8Y3i7FRy8uky0kbjbzbaS3CfLvPK9CKZoT7D2VXUFBDtSrcnTQwmBlSlZj3yEYMtQNGZk75y4TEuyiLfz/5JKtLDbcXy2ei/ln5OSbqASf7qs6n8yDICRd4SOPcpxZq8Nux/NsREn4Wy742zkIqUqdKxrLf4uniakjKuLOBCUSr9XimVKlmUC/SMFZ+nqhAFl0hUNCuyVKCyNUXBiWdUlhH1GS5qr8vhxqy3cVlci+e8HnuffNsxl95LqTSvLbTt1m5QzZz6+pHuRVHN+DQ/iZxrKgBn8fiQ37wSUieU11QAD9bYsdnOUch6CRMfBKxTch9lARmXC5/oYnWuTJt3JWVLxsXI9B5QCHAiOJK6aqL6Le1CTxZViW2OKrqnhXj42bIQDxctU8XeONKJ6wXLJqNe1Tb8k2c/WfZ8rDyBsuy7bHdw9GtiKZt9/Phs1UcqoPbSidL60yfK6IM8atnLJ8one9QwXgN2/J/zYT9FqfmrJIsYuUZ97/NOFJukdZg00D06Tu+w5GSCLC5ArCM/BlPDUbFci8tMqsiNV1Kkj3edhaRDYM3pMpnkgw6pbFLJxVIx3qtJwMjPPKJTSxIFhbWHCXaTxScTeZLY2gu1lJDRMCrvBNYTW9DS4ja42HDAViZ5+FxFTCSvV3wf9QtODS39mKyJI2CZeYZVsNm7SKcDFhU2uUi6IkmKZWG+NktkIkQSNS1QSoVvS6Z1aztU+DxVBGN1QVvBtD6NUk4BWp3yb1Tc+eDrE5EsazmXj0rLMbGSKgHKxE8UmYQ5LEkqpzoC2WIcCpf+4hJZesykt09YXDA6rpiQSpXX42crycckaywlpdZSS8oul/kE6WgqEi72xMpnh20me+MChKS1cHSWrE2Mblon7eZnSwIpWWqO+/iSVP09CsP3SNUPQpG5x1eKShw/2e9H9BBe8UeRlk8R2BaY7fRLFmnVbqHHkgvDfAZW45E+YFKJaa0PtMlPhTqFLFrL/hMi9ALdnEpBa2lRcGwBLNA041j8BAsxKsxERMVi2bcn31v7/UpGbmWMGOs07oM20C6quPSpOH2npGxDPdVV91akZyvKtlq/kfq63OWsr6vvGpiABtoLCdEqPDmwFIGSiOSjMrnoZUy0S4SjUP4yOj1clrapQCSauhep5JznddjoqOkSq3hVo/GCUtJKHeXnxvuChvReVeRz1Pe5JfvYkmbfbdX3Z9Fgv1V8+j0+r38dqW6znJb0JQWwI2Ftreb24GF5fU5O/DrmKjhiUSe5j0vQ8zBOhHP+6j5+CzY+IPJSH+PnHHU0JMGlVVt1K7RiKqalFWgN
opWgqfEEl8tmLzBYILQbU031hhIHd1HFETPaGZq8FEtZk8VIegicfWwBNqQCYQ4YBxkSisYR2vMtMZtERxjrz3LnE0aTy1MHiHE0NfqI+2NzvyVhaddVkURJ5qpK4gqb9RAix5TlHHVUAdsjy1HGutRog1yVy4uGSNga1V5zhT/dyM9NnYlKx5L+g5owW9oav4qedOWlpvc1dDVx9Gu0EuYxo5carFclCRP2gIeEgDqZ8xeoj38PUXSex0ze85XVIL/DNhCiTPkmR7q8K5X5bn4Ga/UVYsut+Jjy2gXHSBJTI4aZjtOWM1YS0jlVVoIsyEyWgCzGrIus07XUql2MJ5j1OSgs3k2E10PtttKrJnUl7S5+hpnGVaruSxOLzjK2HmzWHmOVZ1AytR+3lVEvrqP9JUpLrsj3U5BfSVHh1KW7S00PSPfT+8y6WDqX0GY/A5Xq1hiCEVopKpaPVLHNs6k+tygeh3E7uWJRgjBmgArEu0eNkeTrRrFl19HnPawWKsFARaGVBkxWQqMPXkm/r/tRCvaUj/XYwTX5ZkAvYQtAjbNnNKhYJeK8WbCbqRsZ2+kinrQS2j+WyYLHKM1ig4uxKowCXVdUjFauEZIATJ9LiCP0n2lMQIMoFTGQjaLDsgitLjpcSqsI5kH2gecZ7Xtbz3GYUR+e3+3x7+WanPlwGdMpKmV1E1E5odQVtaVse0rEoxJHiugtNf5txutyAV2bCVep2G6TWLRg36trcKHyRFIMZ5IMlPYkF1MPVfYAXzPVs3vk4r6epWLWjI31OPMALFKm6Q1shX8l2cAxYolfoYLZLVofuMZINggVM2R6YLC8eC3hiZpZvvaD4jXRK6NlatH6LrUX4FjvpXNa7EzpfQ7quFQMiN5HhLQF0zOzB0JldPjypCO/Je4BPIk2sJ7pVFaZA2ilUsH4gj0QK/4syVplMfmC/dUeZoIoHZkzSDgOTJ9HNd0zZ5OEKhMkqfg9WtZxIWLCbiuvR6RopbnoNq053vdMlOtaFLuVe18o0lzKGiEiTPWZsDT13xgdSDgjJDBWBaXDNvtiuXC7zbHwBi04r2XpryavCVJO2/U6JmyvLLrcWPcvUJm+jxb8qX2LRXw/YN/yerbrsyhjDDJTT1vSN64xC7we1D7MpN+/pqglvDjvW1i3JGP5bKp95AaON3+u3yuvC7YDQUsir5War8BijcBV72z4sqn4Nf8tjTErqb6P+1Iwns5j8uEwcURdBNxgTIBQsQi6Bg3I1Hi4jgEyQD2PcCiqyDp79Ooi7GwLwxkzDJnWnPD0Jf7ea2kcGmPcCkSnet2EvGEoz2T0lujJh2HUamjC+HubNOSQzt8DyHdtewvyk65MLu6d4G9bGqcHso2Ko0v8Hj5jiFqRsi1Dio+UfC5mjBesZA4kX1s6HpEmJZcnwN+SRl7ovZbKKLOS26rtFA8Ce7Ii5K8mcmY6c0kuTQhcJXM4X4Eoj7tUmB3P45oauavPPkRlnpFHu4097UmcfVDpQul83jAxtVzjsvC90GujpiyW+ZsNemeOYRDO5CX6f/8j6P+5FAGXjVA+iYowTAXHzrwjyza1LoXEekp5iHuoateZaRLryQU4glJwjoepx5XxR2hZSDJwt0We15r0W2rm5IVufn5MBl6PoSp67+j1yDHIShC9Puh5hB8KlK+jGLEFquQueUgbug+XpZC6j0/6sAD9Wq3PyOD20F7ngi9VpCiqSy6BAG15Gb9j/15m29n8sVe6WcrDkMVYZCkPzqERsu2qlAfu/WdLeUhbBPFtMCY1vkmV65DvrEt5cO6ULOWB+qqWw5KwvZDlSmh+MCNQxf9GVPbBY58c34vx4Uqf7zTP3BdunyxpIOPZ8rvG3mR7RxZxQbQM36/ztWKWT+QfiyyWZYHCk5FMHVHelbo305mhoqvwZGwTNPBkWBpB/i2xeSyUl4BLS4zq0hJGbaVTPPVIdlyVNduJ55lV6+aZ2Sgt8YKVGn+IlRqhJGftiHcf
l1to7D7xaPdRPizFgaQFhStCnoIwkljApS5EIfNPM1XwouFpFRQfYk9rXDIyM9YnLHl9GcHAUeeQUddc0ogiz00pavsnUoOzw+oSSJJ3TZ5AgnJRVRxNlYEgaYW5w5bW2NBHpVHlyQmqnE5/FeknT7CSKDSGFkXzCTnhEkKwXsWNUkQkCTNHvGXGfxt1/ajkALVMZcfwtdItdIGGWv/4NZmcZmcymfRui8upBJXODmbMi5TJnn0mk52G7oztMvicUfuM8Apoi5V1dk9jrxG2IZM4VpewKxp3j1gG5fdOk1ZzRqlNarbh3Pd0JNZlmzflrKHG+Ek/iLQ9atlSyNgAR27BRvIaJZYoL7WjbFKO6DZtUnmuSl1YnatBeXqu0rPVuWr7pwW4Ci5e4pKHEVaOutfiyKTEyVZNu8wzG3YZrOiocZ/sDxdBqSjLXb3zpTPvz5ZZeFSSvOm9z7DkovFGKnUrfpYsPrOHIMVxNt5J746xeGjTynojRf8LBPYwk52Wg37NtxUhSF4YKcFlNmH+sUjcuyj6Ya8hi2Ty1piJdY7GrHsoMKcS9mCGZ/lbeli+0MMK59DDWLjztkIS8XM9NHAO4XmGz8wqr1P0P19moRQTZBhBxPmb5vD4Dor+p1mUNk+yKF2/lUXpvHzm45jvYzZVPpGHIbEoHYkgXrUwjFuwnovYmuYoR34PB/4Ix30WGfz4CP+LLKEb/1G/GHPevn8Hdqo4Z7+RbAyTX4vw+1huAjFFFDPDdf0JxcaLDyxR+2pp+c9mBXqlIPEMTts+6FAzP5+HbYnlCED2ZL9ShPuJmLB4IpPL++VMrgBa78k4cGQOMR6DGNXyDVi5t+bzb94XA88lake8zP0Ufiq/yiuliM21n65TUfXS+Sw6SiQkyPlW8a4S40/l9Fe/iIdsFp2mEjc+FSFOPrAIsXhPjpAjc4SMVzBZxudisn5LYoftcojRmXft4+iJfRy8ax+7xxGyk6cfPc/BR8yz+co8m//GefacIWZrvGuen8q8PS+C/YvzXGHZMfHh+zn6iHm2XpnnT86t/r15jqwhRh3fNc/Jh2RYn8yz5XdQhn/0PL8rt9N5W26na/8b53lVYZnE983z6ol5Tt43zw4Wtfc/XG4nHzHPrVfmufUvnOfULYYYJX7XPGdPzPPqXfPsFaNOy2I9bPWB87z6iHl2Xpln5984z4KKgr5rntP2E/OcvW+eDbAAWqyHfeQ8v9FH8PI8F6LziuVcik/OWP4dy9mzYEyNt/MJP7aqvOKDMiHufY61YASSeoKW86jzgZbzI3RNo/Bp2bIoMt9N3uAnFMf3e1wUJ/GZx4XG73Jws8UqY1QdbH3TgVVB1ufJmL6auQfWaSWzSQgZRZkmhF7NyxubKm7dDcPs+AZf1Ot+R4uz1jQvluQ0abAhwB6usQlx2UATc/7yK/xIPvmo3x+pkwxYzEzD6JFGSXpkwKlbhoydiEBpOyp2yfh6hY4k1ixElWm0GSNZFFMLokYbTC2MWJRMLa7ExCeK7VRF+wxGHXPEacTIyDo6qIp014hg/vwoavioFDwjBVTkDVFElUcFzf00YB5jhcuXSDGJzikbKFaZ5dKIMlbekdkx3RoVROO7kogQt1LsMS8hceaTD0ByVIplR/NtcJ4AcyrLthOntcNMqXHFzDkaKVkxqrLNyFLOpTjWLD4RZ70z6piYw3AN6FxVGUFE3m1Go+voqbx2GeGWYq5BYp7mDIjmOxWSuUR/sfobIVdxR530hbi6K0aHNFnpKK+zqn+7ImQoo22QuUdopA6j4ChCeWREW43KwDmWLD8F563GDVQGIamfRWXA31qCEIZtRyOD02aEmiOfTdQQIb4UaohQQEIzC2E+KOEoKK9Szm0lXuE/iT8iPwJzPmyFMJbcLg7zYbqSfTdjdFyqmC0yyWxHuA6M7mI+bEtn5ROaipgjSs791Sx1Ej1Gf2shaszvKnQ0cq7HMh9aEHrT1yxzfC3C5m8xpyNR6/+g
EMaIXta5uymxFTZyYggtyTkxXWSlEA3kD7JqMJKPkYCaAdqUyAET1wnLuMSp11CNghcSCdZAG9l+A23EebiNbDxm17A4+5zHnOSPRgm0OWOuG9eobkIJcA4u3NdiJquo7rvew7zeTzIHQmYNYZSWOEpmZIW+U8/m3B9ClbdNXd+P99fxFdSa5X+IrBMSRa8zE01m7yLmEsnogNz0bcmG4UoMEjEvy3NKy3STx4PnxVeIeGiLxN5YvFYocxLR3poRjJEthPVRSENmX1McRSz/jyOJAvcZ7VDyvII+QkwZoNHLHHmfMweOnLtNaOEG0yW0i/rITJdo2RNrWs3LzywcxCYjDEbhUgaDqhtp1uf8CuWVOudbvE8a5zwzlzGzpmRBof2gWGJCt3HOu87ZOS/PCXnOh/Kc35yd83jWfPY5H/I5z1lRgsdFZQuFmcxvk+c8jW19znMmgpLVGZ+VGs0sSslawjoNM40Td8GL5/xvZZ+eIzYJPWz4LAdEzVyix6aUGSj12KSCM5u7mi1d3tM2GzlvJY+/xuDhexSauzWSeXINxlaJcqfsJUa51+86v1YoL85s0qhrjtjL9WZIPYVQQ5zZRqgmnGtV50Jmq+h+cP2Ns+saVca/P7t+NHY+s7O+cI4mv8Oj/HSdpTfZ5ajvP4uBAZt1FHpwErwF1THPn8etwFMQTQPndGa8BU3zPF4IswKQL4zG4b1omnBlDdOkQJ+W/wa8kJ9Gz44VPOOecG2If3tDD+edZ/FC6H9BHn7MfXxDD+NHnsIGdqzbtoeYBVChbvIWvNCzPUQWSPQdlFQh4i14oWd7GGGkz0aUIvop39DDRzGsRg/DBL2lyCda+G9BtW3Fcz0EabD6FaRdeZ4nX/cQLJ5Oy8JT+f2oNjh/S1ylmPXxpjk8j3jWPQQZ62KNZ4eyCt6yD5/FfCUtUSKLRla+bQ5Rrj0rHeBUHKYBnOxo5f7yPnzeD3RWt/m02rT4pWrTDQ+vrCr9RPXorfNws43+86JPsVEZPLF7Znw9yBsoDkKGLLYX+5t0j1Xg8yR9tdLe0zg5VQ+8RiA81m3fWD/7/5Oems/1tDl3xJGq8r3h6SfzvGlg9bbz/U3/+OVme3E/Dw8vRgS4oru/Trr7KrmerpPGyMhR+N+Nld+PtuZ+2Z+Wb8D8vbhy47esXMSJPlEnnWufP1u/3X4RY0RzBnrkcbTzi+WsVzZG5qSu/c0sN+bX3rOedLV7tc9b9Vf/t+EJ7RYpPPsYsX+R9N2k7zmxFYCeON+OZnonfwU5li47l3fzWb5bXAVcA33SQiY9OdLEkgsnxuopbz7hn2DcVPtgvfk/F9eXAVV6h3c0EN5fh7ZvYVv/sS//sax/LOPh9sfhtvjH7sKlyV/tFz9ud4fGV7b7j93ZFv3bu+3t4UcJP5E32P/hO0q+/PJFPuG4WR7W/F3L+cLfrW83q7V8aqv1lb9c/OQvVvrR8G2PXwgftkXnNs/V++mzZWyWfM+3cvRj/eXu4a66Hf4w3O5hVw3/T77sYZHf3/LP+IufhzKXX/xcL/b4cbNdrOC/l4uf+9sEG/bfTXELz76Uf+guF4fFP3abL8EOeVjBQi+20IbO9+5F62Z2XC230zKx8ocbEPgwZQhHy2+2/sMcl1UFTb0eb4fhikQRHCkZqMQogmCjOg9eShsnPa7H7kV32suDMYZ4tskq6k07oZt3vY53GJYX12CWPQyNcTTuXIbTDIzRSauA73vhxsMJ3yyuxkbSvXsY2kt7WTq2KB1QwpOHsT1YJ/31PrbFQ2TOr4aVey8mF+7UDR4SG1Tv0L2HBZ6hCPp+BQujbC5CEDUIbQo9M7kKoAfzPNn5+xurhQrtPQK5l9s8XxqDB1JLuTirITZ6M4NQjvDgJiCfFI6kKurfXF+yUjYr7hMYref6QWn7nQscGzy0YdM61WjHhzeI0rvl1fg42nx9gLvs4S6phtuLcl5+xYPeGdr8OxKz3eIwvx6v57ApYw6WwOYb
q3836IjfjfvidwfuQMyN/p86dpPFpR/Zs97gyi1Hi1nrYXjz3yTrr+LbC0/YnfskSl4md4ktxnoe2SMvvLuc9+3+8CHq57fthRFko//GnUsnMBfmKAu8UXbTGy47Va+9EOdsdMO8820s1sDt0hRzfvUk5v0fPG48j+8W8aT15D1cJ7ZYE8vJdTDzr1a34j5CMcalH3V++NnkV+8qLkePfTMwV3fi3vOu4dvpo9hn44t5krlWdN237pdG1Ruu7bHbeezfzS5HxsAJh5POwx87njgPWWwlcb9K8tHz4HpkB3d9a+gWbmAN5P1Vt65XTK5Wd5Nq2ugaZjnMOot+FZi37SBMrdUS1vNkObUn8R+dm/rxXAxhciv2lPvF4ouINP4Te4sTtC/NyXL0IxB71Kizmn2r8pfEEOMTuY1RXFylllveL1aNIDfno3xmBdeTmzA3H/vxTSCeXSsSz+4PHVdpBVRwuZHkky+TqLN4yH3hRQ++RNEA1pe4n9yaDFehb93cB0vPvJ1ftMNF59u90Xm8XT6JsV79FGP/p46fgqVYO8PVo7BRZt9eDfy8MR9k05dvRqMUz78YX321R0B7dD14HM0vWmHV79+K5xDHi1Y06MA4/NHj6DE2xH5t3y+nRbRctfz2YjEerNZJ5L2M2rMgWSyuulUn+3vWQH/u5+ufQf71JYma5qBqCv9n9dOPb/oP15NHYcPuhc/RHi4nl6G5egz/zLE7XEyNuDMZxfOnZXRV3E/cDvlAw+JqcPUUTQadJ6BPRUjn3LAPpOuWmxGP8JRXuqe8Q3CqOABi3mwVfE0UvJneVfC65lR42qbeoCx2n2UyHIjdZopExf9nGpSr5DmI8jdh1tv3dRa5+NuzNChXTSgPfqK8A5ayI0l64iLh26vjdrPBJKk2SyFIKJtN1LShFHEveiR5XANLieBMk9kiQrH6N5g4buuYo0wS3dNe8+f2CHW6f0Ko8xxN3tdHNnmXn0zYHzVFfJzab0PVw/Vo2VkI38o5F2G/d5b2biC5CD9NQI7pi01F60eUeNvHvHuGVm/ztaIBRlAYgTOB+tgh6kEJomwyRXBMFLwIYkX6NqZMReo1K2CBC6K2cyVAk6l2XZLKJdrOSlEIYWYKqIJQtspEAZQqr6k6M6a4RzAs0uXqEpomyTiRhKa4TkMJm+M9oSTwGsBlARFVOQjGQQBaEyWBJRk6gV1Dot6JvALBQpmkn3URKMck3HI30eQBkGhKSm8xBavvKHArUqGGRIJekWxz0Jb34BMJeuUSKDdDmSgxPyTNKcqmgpSlpWgMUWhOgs4QRFUQYD9cI3V1TVHUYJLzNT0fT4J/94mKlL9bVORUeoLZcfQE5SeLnwNMorFH/M/wr0az0fBm1mufRaqpOgM1AdCUghfyieJ/DMc1FdEo0euXuAKAtFfCuhVkmAgqmbCTSLSqWMGGSZwtUZBPH9pIlai5IgtEsT4Nlo4rnf8GIiYAZ2f475RIN5FkFsXd2OaHAMtXq5xWN8GJIdMLsFVJbLkpPg4rVRMfr8XIxSoGYs/ErknQYOfM1z31muCXTL1H0goVEV4CaRnWu9R5AU4P4qpEJI4k2FktIE/E1p4mII+yVXYQSXr+KRGlV4rAu+L2Eg2aDe0sCkpPrRLUWOcQlD6s7wWhzGKHRXhubJGwi2epcyO0nHbqHkol0a6utUnQzs7wbyAolm0SBO11ibAWiT9zhyG63EQUS6HDBsl4JVKsBgm5JaRZzEGAOvNOTdD8LcHgWjZLRHYoqCiuXe7UJJ6C0O+KJS80kcSQ5bRgXHy2klMJ4QURSSLBxJaPmNozSNBvj9/4UQmEk/zG97RxHhJ4P9Ti+MmkIiCsJva06nf6jeFZ6ESEb0NEZJ8n8I5Er0HblXSeDsGWE1lh4z2aBCE5bkIhJF/OWyQz9mC/0CWLiZRdby1iknd1HiKOJPkt+g6TR7qKoJm/U9TUok0dfk8kta2N69Q+SwSRW8d2LR6xcc171l/4
J9bfK3jhCevvEF3cJ5O9CCtjgoD271x/u+h83k/zAuI14M993vrDFhh93RQky6i35IHth+hRUtR6Fq2/2s4TiXOuyFMp2vBJbjCKDWpXVASqTDzaJGGJ9lYbCrTgqTYUn8XcULKQfTO3UhFgFdMaViTY1MYpog/6LrUWSftvqjZBWIfoUda+GZxP+WbYaijFVNA3g2dALQ28frEVoY2ttHBdSKofRCxAh6/dup0Fo7GwkOtfl/wke46SnyZIMYpIVcpMknSkiBJ7qhWSSL41mUkQyDHkdyEq621ErSRWwWJA4rfiWkShbgnjqNVT5yaRGQ/9VW6bqnyGP9Ix57H2CLZ9rF3sFMG2kwiLnOMIi5LPpJcDAZRnJDb9jfnYYCdhpFht7yNCyJoW+DiflmPKZAOebKqDXQS8W5ZqaPsNavpWTVAsZ4hSPwXlIMJG3bQOszhv1N9NazG0bIpy05RvQm98LaVJAmx0pr8FFQioqu9VJOmSc6MTynjXq7Vqsrgq0jzbPcqhKGmeQDbo1w3GpSIBJ3A5NmsFJNdAcGtshs+ljEflSwE2bP5Dr9/AZjuOTCl/5LG8A+aDtOgndVReC3cHJezGz8vjnFmM0jaIMVJyPCAxhRKZ9DxBxkaP2qJpQU1s01oGqWpqjfmxIhyA3ZrFc+WObpF18UimCuWA9OeKTY42NWCTvFYgm92o8VM2DBeyeZgbhlUzMeXolBymoUll0utaRI+OK5Rvb1ADWrOhJDCw8U2K6CElty54bJGnS7KXKOyMUaHKsa3JG8QGQBxbEd2hsN2ePJn1u/NkwZnyZOnePFnwyXkyr+pBS9jbkubnzpPZO/Jk9jvzZCD5+gxRxKflyViUtvb9kNher1lVLIhb5Uq8DCkBWnor6obIn+Yn8ne3jg/HP+mfiH/eQwt3IP7pHco/fDJpGjSCCy/D/J3xzy5avPfTpYk9swUts5+Yf1hTrhDpB5hKgdrKSeTABzEcaHNXVAokHpZWFAskTJMhvQ6sUjlc8YDvWpp0uyMlT14LTYfa31KIl6wgUlKwSElCggxNEj2r3Jo2AoUhUqaNCEm6XcrRZpg/Ln2q3GzcK4mTkXQL/z7lLlHMB68VBbZBJp4tHFM1uIavXscyx8zVKcydNEjGF2kNODcdFrVlpHvXLKONQn8sL8tejKEE1uSeI2VesIKlfpf3KJ+Fu7wG0UskUqyCBKkySc2SMIWBFPdDJDcLN4J3k1OlUYqIoXAji7RWCY15/ewtrPzhZyDXCveWyrjOqYWB6Lnr9DQBxpdIT2OgUCNXQFEkrp3Q3MCcul+wh8jHKJJn9eg5yutssJdCdAnkYdI9sswM7ct+LWgXcfs9yfegaBGIp6s1wGhxokRq8txylTgTCaCw8EqWrqVIiybXw+dmuR4SNpOCcFgFxfmTJSCMbrAYpjr3TmF0yp3TXMDYlq5b3K9Re6l4n3RuvkcUWdtrX6Z/wr44Z7Qvh/Jrn0zWCFWdLsyS32hfdtFxvp+m0XOCEvzAz7Mv6Ropd0oNE6Qq+jnto0SxRYLFqkYUErkc5n9QTFihBKCCrv6WJUBb5ASZEtY0SPwNabNIXFVE5TWNGOagHYqokGqo0PJNJYm0xZxXwr3BrPcMqpuhIJRYUxxdczQeojhxgEiAkKjVlLAgHVP+CimdgMrFrCNHv0Y1YN1VRzWkmGukWhmIEMcoq65oljIS8yNRt4QolSpFLWSRUK5HnTttQDVgTZAjbiWAW2BOSxN/CiAi5b+hGHTLkHm+BkbdJY3la0E6omZiSTOmjsIaprAVYIdENKtqlVNCrJBcmEN0aEqMkXJ/FfriYC+I0kqKy9G55T6ING28Z7IPU++bLAwq9spY0nhRXpbODcJSDuV9lQ9TkXgwRKJQB0306+ZzMwUVCdTRPe7dYz9KgnjSHvuevNgh7FnUPCA80fzMzCAQFT6DF/dbkZrRLjJcd/3OuBXyPpDVJDGZT8F3vPYTYgNr/+wnoDh8ldqK
vgjFan2iiIyaJCKt9g2Xag6RbxHuo1mgkKOa/yg0S9lB+l0tO4jxMmcHwS9S34NrsIjCEgUfi83fZPo34b/66rs++t+M7tL8HvIn5XonP16+V+MIdISYOC/QYBkgYaCyapjJikk0M0Map0pRaVYJ2RESiofMA4jcNZRvK0Uwd/m2UVP6tpW4R4XgCxCb4RZU70DKOfV7dEz7eK8NouVhoZB/kUeZuEq/95Tv/c3M2MeEq09BkP2NNL1bQlxM09sWK7NKy8tH6KxNW++i54X4AuSgeDV7wvOLTYqjDq7mgxh7sO2AehTrg+uRIdNlJire62F84xPFbQX9AxA/KKq9NeF2MFsMsYmJWW5FHQe7DqwjxDmxgLOiQgUhbBMFnLFm1uTYsN4XCJuA6MmCsDd5IWMPRnqWZDvVa177iOxEcesA6SljiVni+0npvon+dI1ZeawxEjWsEtWtUvJJkGqTxCDr6kO8Zp8F/IUNMUukaqVj2HmBilPVS3u66OzG/fokPovZ+dxCvy1q1iLk1VRls/H3dL+twj1V+m3ieUM2O6mU34Y0oaFJuKmNMd9DpPxRHNJWxXtNMu+JRqgHV4FINJnFdGQWk2TC9dehc4goMzmLPLdLtZRK1m9Y3pkyAIyjdS0ZMbCEM0YF/HotUXx0TB42oM9g9ilLWCW0mhDlB1Vuv6HQfRGR+OKMaqdixNJKodSArFhmwyjTS0SNhIwhskC6ViI2RIJSyqSwx1xRVsjFjJPsVa9x4p4iHER0Df5eXEutkwQ3ZiBI6luhCxq9GmVpwjPQrNO6tlxYt4P6Hs9YyPohmqAi6eWmRFbysSbXHOXS4sNcMYkMN6+9YYgY9HvIvLV8DnQfXokZdY5wyMOHcRCWrh1TNKCq+KFG7JoyaWIt+c1kkIQIFbsbehKcOeuRpGwDI6wMEQaOQi/gsV8p7Dni6iUZLyKTILNWqdVeE68CcrIWLWYJWyVajCTVCWHG5e6H4yzifZT9xTEG9hN772rPPtqtspXfJSxo5csYHGjPDRK19fWrL5HKVOJPq6ahRqCNOF8W4E0dJda7QVXqUl42CknsvPL4qRK9KK1KfG3KrgM+JuroqOmQCHEiq9uEqc2wCgnxKdH+tlOJm22oarKkX0bJ7SnMEJhlBtE1y+oi0g0XIE9KM8QTv59YyvfMPKJrxjgVzpPL82r+nGu+otmOJM12Dn7ukxSbJ4pbl+ieKR+hpLnZ1sGzxbpSgDMvR7usOg4gZkXRecqb9/g107821N8y7JIwfeUnY6cJV4zxdaOmj8bjQnaQwG/B79S7Hfrn1AnBORCucPtSthu+yyufqHejBD5n8eovApk3xlgfqDXla8TZ1H8jHKHNVOCFypO3MX/J4u9poWhuaSwofy8pqSGHAj5zlap5TJhgEm3W5/0eetSTaMZ3xvpZ8zwrl0Xk04aWObYIdYmIMM6cK3wDzQheiT2uCmjEtxZlpRAPUtI+S/apzqCH8MTpdf27fOxa7CkBWbiqbIBXAz4B7WkY/dUzCuWz8Vjs79DPpL7H9+IWjJwjj5LktxnvERIeo5KVCjwWXlCs0HJd2AWkPQTMipRp5wyfknGniBjszBMTpUN0ZePns6lVe7eAgHNLzMBXmB2DHY0rvUh4/hSAT7Dx+RTuqUAb3DLsXh2Bwyp/6mGk7SI+Q1yvRPHB7gby6g6j+0o6t7ThNPbdCHuM+FnhddMOjseJrFYARwwKHqAdRa/clautlBVp3rm54uRStQgqXIgSVuTQSJDOhNWI1lW7DnU6IJEySrujTVZYI6O2f+I7W4TUXJVgbA8h9IO2L+/VJouDZNdyjjdg/FjYg8mupV/EXaA1aTRVONqISdknKGCfQVCAxQ1IiIJ3nYrwTljtQYl53N3qzhOrrsLhGqpwl9tEgpKgRxs6U6A7JNbl68sAM5RMMB5NDdVNgn4t++aYldZfvyIY154hnyvL5XzkCqVfUlYzIaxRW2Y83Abt1HLnxYyp5v/4
tPOS/2NQhkdEeHJ+Rnhuk9Z6QlGgIsb2SFQhyw9gfdKP9cRtjh9nqXVxkJDFXUgchLtvqmBTHATW/lviINz36AEWzuzJDDSJrZDwAf2mEgfhrisWBwGfVe3DRAKPCFxPjk+B6433BdovMDPH34XqsZyL2Bkg74Wuj0USuNrN72lk8RTzsCgMSGni9+tOL/TfE8qSRQntZYq4H/dUh8RX6Ls9laFHxC1hzygu0LBnHqxpPBb7LD4LVXEmsQqnFqsIbW1vEvcM15vafJ+W8hapU438cybYlxXgN0nt56fHqWJuG9I7orUXGltrr9hee9RHi1UhjqFgPpANhEyS2NvNWtiC+lapYyvfyLaCsIbKtmYeYjhVTypWmV1CR1BFumBEtl9XpfU9NDU3rpu6ytYbn28p+wNdeHXvMQtL0F7lQgVfemyFXylUOfYga95zQTYaM6WYgZb7CT3DxOxJlEWEWEI1h3VhI2QWwHMcHG/7LAIetJeaqouGjqVfoQQflO/xvv3YUnE178fkdWOVE2ymtBuMhuH9OEu39uO89pzBbyDkjK3WGHbiQCTm1V1A9TpD1EOP8a4+dufJaiWhHDjzXQZybLVr4rG2xD45VVXZCGPekruL6ucn8yAUeWj7isvVAawQOmAPNcGmksVeKCKl6q4ekbJNdbmPnm1q29uwqXRuaVNTDUERUxcTXitmGIU9lt/FaIcjL/pdtadlsRaVifl8q32P7qcgLLVfELKKfnOPvftIBv8k4YY3iezFrAQBx/BIavZRtk+4IX0We7iNFY+TCOPFOAIGo50cSfq/jxJfjKbYAXLIbB4na7At9q4LNyCL4tREybmTSP+FtRd3KH7ryLpJ8gqzqd1hBT2YwnfGPN4xd+i9dYcVjKF4YiAod5w0xfytOwxhDMX5QpPYWQ6O4T5pisgrgJ0E0OnHjWHz46T/u7mYyh1cTI0juZg25Th3VX238gPKHj8hGxPYFplLXBp2YoWW2DfXQZZaOg/re7DiW1jv7crguWv8Gn/rTkz69j0hJt1vOSfhqNxylwzqOyv8hrhysNRYL8MZfXaRcvccYrflgTn4yfxBh0SNAyvIA8gUAjvDnDAcYud7FvvNR/sOim3GEWbbemd3l7CNz4ApIw6s+DmAzi7hKR2HkjsSrfNKWOgDaJ07RuuUBxCRr57L7xY1nmQjiHJEdBNUCSMiU7MrrPppiJ1dff/+SfLVKJYDaGIR355R1Lg8rY9owX1E4QFUVviZqKyP7NiAMXj2t/2nd6zoYCd/XvBO4Wp9RYvYoSG8v/LcY7wtWPXRMY4PjHH8941xlos98kibtWuMd/fkvpcjcWOMfRAuW599jI/2PfaPcXJgjD+17/pDYyxsoIjkPtwRWGz3OH6s73pzjBMD9u2zj/GJPZ+Lo3o+/Sj968a4J55ur/1x72uz80+NcXnKGOciyguts4/xmezx9MAYT/+2MYa6v/DKj+SO2DXGO+1x7xR7nLlOF+rbbYjkzzfGvTPZ4/zAGOd/3xhXKCb60TGGHNWuMT7FHmeh8Pan5dnH+Ez22F3vj5W99d8XKwNLEPDcnhRBZe55eh5ajkn1FKgx4r1AxOycNWLevtPWpmSq2ENKQhodzgb6m9nA92ZYJFvxRoYFn527eAaFqfFwAIpgv1BrTsSYG8/zcJ+eiEF97hlB5BP0kyA+9dvdzcuEnvlTL2oezDgdk1lMqPdOcWIRu4nGiwAruEYeIBJXf32QG8nonaMSR9xXxFBDyBBTk7AvMf+uuBhy4uYk+WyqZCFTcsz9JqFBmDGFJTNrDi4Pe8M0xhbGIzJjS/T/2/u27sSRpNtf04/fWrpAtf2IjcBQSJSwBBZvIDyAJGzaxkbSrz9xyUwlGGN8q+o5a9ZMr0IYgfIWGRmxY2+BeS8lB6rM5vl1t8oo1Qn3WGX/pKx3zd19/TIr+EI6nlEAIrNmEbNMQhLoltckdmOJu5coMIG86VQoVVljV2URS5fQ3a6GCOb+
ZZ5Q0bfMInMka4Wi3Z9GaZSKbUfxbhBGnpmWxbMz03XK/KmJWzLfmsRBuoSZZCQEZ6ElEkLVAyhUMXGGFRVjcyozhDXxWsuO8rXL6LWCagmgn/SaAOIGUb8pkcqdklES9DfCpXpVlpzawgzeLiE/dD46ruF0qzGjutcOIWmQwUch5BKaC4LthxiKdMQFjrFg+3GoRrXik3MFUvpVxEWJUTDmsJO4WMFtKzPQnNnUEUE2o3xo/TDCp1QMQ1j7aRA2lphcuMYD6zGPI6K/pv4BnpJGnRDEguUlZWZMwcrL7G/impDKzGrHuA0XK39tsAwyJ1xyTjgmnDvh5xVDHWPD+G9zgxG+ssLMQcwDVz+XhM20XSWwS9e5/lmqgC7ECrg0iioX3pCVugUxFWpVL4SF5OsaMReXFa6HGDUYp8esnIoZOiwlUyDMFJuxa6oSrabh3LcC56VhieLK2pQOVd1qFXfc7wkxL8s+JwukcABcFbdV2DpRJSgqbm2PWBrmucJ8MpMCr2Ke8XptgCkYQ7g2oGwIxmSJrZPfTdU9hBp3m6GUct7yCmu8gUmLzC+xdiWj5KvqQ+KyL5m1RLA5lP5WIPINV2CMmJGZd6rKqofcHzwupkC8F66oc8C6EVdVR4aWxg1G2BXG8kgcIb0vd6dcsGHUxXibvOo7NK7gjxBLBjyTwJJiNVQkd02DscAayyU8l1exXFq8Q8UaXz8xcBiM9SEmlZwqFKQmZaU3ib9pVzv9nNaJttMzUp1ZNSUDSk3hQXiXVju9y5g5bacXOwXv9LnY6a29nR53m+/e6U2x03PdE1lUX9YD5QIjLHd62tG1nd7imhS21v2AdstSYZXhNMOMJeTVCAZyYio4ttOnX4HHZGywb5Ad4D5j1hLVNx1RYaL6piAmemJGdXfucZuhVtPWYQ67HWYUidWe10UdnMbWyhh2rk9iDHv1W/vXEsdFtUsKUy0wcmK++cJTIVwQ1a4xbgnHWupfpMZuOwgnvX+tcGPi83vXL/rO8Hjuv76TfoxR+ZAi02mncmP/VK5jXHq4AhPYCU7BbKzC11ApdXeJWBkHrJ9/ClbmGBoo8G3kC6N+WH4KK5P3m9ETPBXGsoxTsDL7+QStp4w+nOnAq0TbfQoKKz2GBsLoSwQzCasbT2hh8mq/Gy4q5QZxSZ7JKWigV1vo2xw1QETzKWMYv+Do1tFAmNOLEYeYn4R3SpzXWgi+EEZIkVfUsU5pYVS81sKwjpi105F0nReV8FoLExfOyhHuyp/FrMHu23nitXMSZs3ez2xWLYwLl9SjU6wYOGUdvohWaS0s5yVyZfSJeeeUFvqvtRCVMp7AztfolPvedfhK/EeLEakIIZ41Imuxi8DYj69oKtNKkVpXr7YxCorWW8WJV+P1tL39MV2dP42DzdFoIiuHe4u4uS7jm+EiXoX7iuP/TK3sqb8y17P2sDhBj+8QAk7pcVcIg5d+baWxTVwskrcX18B/YUvD11p6svL4q4ro9tE8AOUWIqu17d95+WzUKrSe2VGKn44yY3zTeRPRd2zm7uTej8xc1kc/rKV+QDN9VX+eauNzEEnEY5ZEN40fsd0yo5tutt8Lk9X5epqsN5ObQRYnr8bQ5drVot2yzerfnUjor7b3OLm58DnCSP7uFtlovJVjjFdRTdPea+Zgyzyj015kk9Hsfkb44TiHsYZWc38zW67zRHb5RTSfsE5PSnEL5hxpxbdoTvGv6DjuZPsc22N85r/s5l/2xV+W9ZdlPN8+bG5zfMeyTH5rPXm4vdtob9nOX/blKm/f3q9uNw8FfETeYP79N99T8Bs/fojv2C5nmwW/V6v/4PcWt8v5QnxvrXbGb04e+Y25+nJ4t8U/CS9W+eVtlsknoNeWsZzxPT+L/sPix/3zfXnbezCc5uau7P2faYiGTbKnW/4cv/G4KTLxxuNissaXy9VkDv9eTB7XtzE+2X+W+S18+YX4Q3M22Uz+sht8abUen7Ff8xU8xOWv5nlt
OtrOZ6thEVvZ8xR2EtizEG2fTVce+Liwnkp41pvBqhfMaSbDXpXCyQZnMFio+nMnOeORWQyc8+awlfkDRMut4nnYGl4GTtbsXHY2veL8Bk6qzz1jEA4uL4JhCueC61oO77eCZQdX4nJyNTDi5v1zz57Zs6Juu0UdzlLx88DuLuL2Yh3Z7nNojq966INcnztDx6fZgFlbmGkpxvZ/XXkWrOL78Si7m1yBxUJdUNqFO2Z8hZ7UOIvvvPXUqiFu/gk9vtkqy2ZG95m8wcsGs5QslRWDNR0+9bEWFj/Lawu/t/rMzQX7s6P8KYbeeq0dpJZ1eY59g94AWKt62b9jrwDW2P3sarDtL8+e4S67dxeXPTiPjIszrOmp92z+HOUvmvlmfDNYjNFKcc7xLF4NVv2s6wzUc6yTSH8Wy4OVCvbrZgzW83wzHZ3D3s7PBf5CUj3/Ot25z74w49X2aWp372AdJj1r+Dgemdn0blDyOJxpbemm6Iu82n5oR//6bOsuzwp3aRbjdrSJ7QxzQbUeeECdQrMr8PTrG/jVhyovOUj6d4N63A7n1Nq7bhZb5/B0Xqb3Xnw1LKaX/Ivjtgv/eWgh8b7n0B7Ajpp3p6PWne5703evBpiZhSeAcQLPzdh6SyN3h37uJffg995vYa+2+hRv8dEmaXco7Gnb/UGnSOjpX6tsE41mWc/GGbnBdSbuwgiBuZg1VeYMenydgPcF33ZeaJkw7IOLsTV8Up+t+gF2cdhR0h3s/XwGthWeMu9dXlB/xCWO/C4+n0ayxPOr+2PWzoyp3gLcEV7+apM/jbk72BOe40TtnLujBJ9T85Dm6fZ5OhqCxcB+coQ1187NZ8LTmYMfsEU/gHLs8K3IbfJruT64G0Z3Q2P2wvPhPR85rKYr/+//BLhP0He/3EOW1T4PfsfdDHZWpf9cZTExv5ZMWlWtwp6fsyXvTVgWFYk7aDVkVWFH7nsyWrel6AnnBg/7QkW1LwpfTD17dLPIdM3rlz5a59M+2mGPBtv60vuh6Ni+p7P0VmP6X3TEs6Qd/9KkjCbcifbmvmOjpsIw8Zppfdx2ajt58bPJyFyM4WQ0aQ/XY2thsI/RkdG5Na3vIKxHVli6q0EyHnVej/JgxeJ1tqtggpUkcH77uWPx4Sy+PAGRQFbB+SGe8Tlu1+/6FuxgVmh5l9ufk+H4MjC7tutsrsbWOamgB5k3nDjD4dhoXfjDLiujD1tNf5jh60vP8daxOcbXrdlw+Ah/G4XJ4CkovTRsdstxa305CedGBDZkaJk/A2uzjUbZMBht1tNsbQ9G28K7a1jXdnfthefB2IrNwPSNa2t954Xe6tYJ7Ukw/nXdGoyGo257YLeWcZg+XFsDZ9jqXg2vFg8/rZkNp8aHcFTfjlvzBz+B86s1NuG8bIftReKucqd/eX4TmItwdjVLPOusnKTnT9PmInOzxT+zYOB7Vie/HY1Tz0zzuPSK22bXj1adsmfDGfluUUTZ4uanPez1h/7DOI3LePVYD+7G0XXQvby+g73emnUnRuvherS5mBjZgz/a/IHruOwHXq1nzSzXyB89o+5OS/CLsnV0fRM+/yy9zizwfg1bY3u83IwHifPgpfXlCP6dpYt7N61PohGpGT/8hCni3oBfk1ykk+GsDXNt4JutdRh6gwmMfzj6Q9fg482ywWN8fW6Fjm/120MnCrrpdfvegPX0z7BcLF17UMJ7znC5yX6WrdrsavzDC7zrkTMzwTYng7v1gxee1cIwuwxgEvmpObgOu96fvR6AlavBk3upWw7HrjmwJpiByNJ8tupufhrnNbADRnh5Ht5Ce8dBBrv3ehUlnXJWwl6ZzbLxMGv5I3MMa3EzgnU6xTX6R66dIoL1Ea3GT6ObxSRsda1RaD6N7xrPYAdMb9WF8bnoRElqjJJOPQbPeHLVXXpX/vPE6Q6uV5uLsbGGOT37CXP87k9dT8txMmvXf3jW4Kp/41s9eziKcDyuusU4Wzzftge/bq+6aVC6
9tReGJ4VGp5Rq03hJDozYGxHsxaM7d0AxnpsDP/M9Z1ngO18jItNNnK6T9D/VuDE23F7fQP22Opfb5Jrewxrfbz8Cf7/9dXA9K1xdh0O/+D80a6Dbja5uqjfXg2jYXOxcJNZATa56I/4+WfN2T+T0qtH5kXHt2chrKuLoZNd+sMBrKvWH72ete8t7278j9vsgs26+DHA518N1v0g++WZWXtwlZWzdAh76aDlG61wEHpw75+6dm3ow5V3uQFbEhYh8p+1B4NhaHp+mP875nJq/vPTaG2ujVYQGbCHj1qmh34J+iSjvD1sb4LZsLVBtlWK3ywN+42o4eEK4fxAhXDtxArhdzHJupcyf9qYU31wQ+VxN/2RD15NtiB8VdH5IBr5zdzg12Niywh10l/FxO6360s4wA+i2933Ip/LBuKvv0+5yiYV7UDyjzo1rlTdu0attutKicEt5xLRycoMQjE9UKqkmvIV83ZpqNAOK3Gr32Berv1rceqj59Nfi88dYcz/KP/lpxjz34Odfosxf/lGDe43V1sjIwryIbxeg5ulXunXowCRd+6XqJJ0DmDHO++uvk1tRLF/m5Jm0EFkiynZeQSKZv9aWE/f6u++NiWjIOW7m6lgmUF2Q0dxvZM2ESHymGuTETVuxZhEWM+OxRjWyGT+VUeidWxm5nGYW7qMhfKkYtZihAzzwSMKi1QwFdseceSzsqWLaghS1VKg4gR3vmAtQ91bP5foP8HhvSUNYaFTRAyShDbBa1/psLB+h88ou4B4SI1K28URelhC10Tj4hfISMOrOO3rrA/saugvZr1ijnWfGQCbsg0uc5SWDiEvoc+ZY1ZxvTqCA5w0v5hFMyEmRYE9JbRdTrY4YWatCvsr2KUCd1uxTDPr8hGu/Y8x032Ga/+LVCjd45VM38wR4GCVSJ2x/gdrmerjZmz0Rw5qvyZfoEJZHuD6KN9Z2QJzwkcv5Pv28oKxsbiyJSa3wZaAeQNLV2rkKgQ34Q5rggOPEeqIU5Mcfli9kRCLM+MYE1mp0CHmXqUIvqtkQStdKYKXyILsCvT3nLnerpFHfk64SGEhEQepVjmv7pDxt6hI1wzzit+MLI3BmDtcqb5UOMCanbq4Rl47+I5IqgUgdhHZllkRnF4zA7XgPSsFhq/Oer+E4d1W3xuycjopyc25IiSpVNxZLapTKn7NJBW4Y4nHm3O1QSlVxAVbdhIzUp650Cvd32TOyHaqwiAMiuTVk5aI1NqY6ZqUmcCyKbV2UlWUlpqrMHS94tAmRQLWKy5Y9cGRSoX1vlAjRAVG5ldlnWRCmpDqQCiVUuo4hl4zYn78slGwLq9SnMsrJQPGa+sqK2Bly0plpUG7DapPSEvNaicNwcxMagimUvIqfaF4GHLlCeNOa7IywaVdgHWDBbtzITgMj/iNH60S+JTf+AJN+HG/0XtLLdX4br8RmcKcI2p2X+83HmJt8d/tN4Jvc12zv1PNDrm3sRKO+PxwLdUE74DMeAkbTTpV8tyE69dw5bwlbYsO13NUVT5b5mGulIdJEVL/HsnTmKjMmy34FBWHp7gnrzg8qWZE4qjp2ZWeBD+n9llSxtu/tqVewt4zH1l/H8IWf3b9vUCCfmL9vaUmaX73+pubvcTNf+f6Cw+sv/D96w92JPTnvm/9Ue2Kvm6IQ3Zn3ZAuik8KQbxnd6z+juo2Vu0R16qmdsgc3VSxF4QGV7opPRXJ0S20PxzFP897fLStKkXdreBzN6vqOqdUJ0CuqERu6KrCLiA+db6X6nzU/l+pTeA6THa4cw3mnhe+Ganp+rnmm2Ef1FmhWSopoS8TSxVHm05coq6NXzuypqXk05ify/VPrzU1DP4bPB/GlCr1TK5MhFNiX6k7smaP8IWERgDV2dC9eCrr75xa4X2yaTFzGgfSz0KlqkiqSIhTa0d9N9frCeVc5iIuZS0dX79ZkVx+jBv4ZdTzdEa/Exk3Dteby5ixdTxm/K2sQeCLIqb698ZjX2GGeidr
EMZn0Mf5thgTV9QVVQUWWpG51OEGC+DiCaXQFBTqQk+WdmGOQfj1ipWYNGbq1b3xVlkvVL6QeoxkbXQ9Rlf9jarWqvtK1m5kLRl+ra3WkjWv+Xlgb+EYSt0t9O/tiGqznbYK9Q6qRcZVK2vzS65A41NIeK10zrlqir1+g6reZMVaU8WpChEP0k4/cU3Ftcg6sDZc1V+ifpzUDKgytBBaNgPJZ4AzkfoT9qudU1swJ/ZsVfvPJxsVX+uTHpw4SYO15lOq0pMT2jsdVlCgakO9XyOLVbRI8YIqID2p0JKQ5d5oFZla/ba7ldcco1O11IZQZ+HoPOugy/p/vi5JGYM0mSgDIDXOqTZeatITw7quSW+xp8v6dn3cDelUqGJsW/YGqRKPxhZOd29Uy7rW746TfZa5a3ESc9c+k9lXx8k6Zf8aVbp/W5zMPhAns98ZJ0OVkCdShfiuOJnJOa3K9xPqs1XOqhSqsczCwHmwUuWkcqmNw+cbquXU/ERx79712+ef+E+cf14wf3/8/NN/K/5Q++bzD/QwVl/+zvPP/MD5Z/7u8w/YzMsa2PpvjD+wAlrSkdpPueBiKFhTwC1QZ6AfxFJdnjkagrjkswDtRqWrvA7KUtVExgPvtTQFaa7bR2W3hLwBei1q5rW/xXhesjzFGtPYErcEcy+wKmDpyHjgVqgIkncBo2hyvXyovpdjspS52Wmr0P4wWPPBl7txnbkf6FlJLRr1vMQOx9ohJWlTidehjDGL7BTFTuoYJ0ZVWm9He0PujNx2bWe00YaInVF6MYbilZA2h7NnJmew1O8KG+Wy/lbZqfd5dy1k/wi1PKrTF1wFRsWZ4AhFXuIkKEmLEzONUtcjcZXGKOqa0phXfW9R5o8+g7FWbFssz3U11ht7RdGazpek0yI05wSXBHmNEc8Niqm7ufAQxTXpGll97kf5nEJrjhhi8OxdqjYyD4iwy4wfEN4x1/eTTo5DGkNC40R4wIzeFvwUYm7JdvO5ENvKbDbxVuqYiJyB4D4g/S6hi1Np2PVFFpTmTxKhbqxBcTbtu4US+QEV9khTLefnhvZqWiPUTv5uyXuA5+aj+8v8T+wvtS/cX96Kr9W/e39JjR7Okt+4v6QH9pf0/ftLzSvQD/y+/SVGey2VYhkTpDL6KdtR1jxjbhmVI/KZMY61iiw6Gci1i3y88m8J6RjVlC4QjIRQRyVNPc6/OQolAeu/lFqOgksp1+JNBcWEhFJpn22DWdkMzpux9tI8F6drcRon7S04VZLt3rIuZlRqfys5fkUcKAYpjaqTo1uhGijvqqMaYoo1CtVv/DupeyvV76Qj+DzIPhOjmaY2bjHTV6cQPEaIPLCUZlxA+pKsU40xLV3dHE+k4m+kvXxpyDgfaxEXPJZ0ChU2lp+9s632XpeZvOh0iYxdHWYUqxi7GLFCcTiK5Wl9JmJ/ZSS0nN2C9O2UFpzL/gLbwTqzCXZkDjFnfie2m8RUtkTOrLDS2aTTMn03aj7VOO6rfJiSFedJN6/gSIV6bvHd3E6lT90k/+iIjU3/hI19T1zsLexZ0HiDo7bxnZFBZK5ChaDfi9Tca9OHmUubc4xqMkvtt+A7XvoJoUG5f+EnuLSW4kqjnjRk3YL99Ibp7dgN1rlHJkjGfTTQbtX6av5jTlBEB/l3teggnZdFdBD9InVfSVwnmAsgjrdGvvub4McTJ1NHsGfivS753wLdpfk97E/K9c5+vHyvwhHoCDH43hL3EI07zuJIVsg6lAm2ibEXIqrG+0hAdrcgdkV1L+p4pK/7tkrde46anwrBR/pkxNXUEK+r3+NrtuN9eDaKYiaKI48jcaXe9li0/dXI2Ee03j6HIHvBy1WxwbjgZYANPVF1zH3BUfN+u6XUenZ9Q4o8XTRhZZZxcXGHla7xLq/4m2gtD/uvFJzTZQc8v9Dkc9Sbq/lNjD3u7Yh6JI1fxj8xWyfMR3ne69P5xmW22RLrB/D8ILMCGEWm+VRyFJkwUUWlWU18
v7COCOdk8NpQOoiopGKSZi/lzBribNjR9KlD0hklDVnC3qS5PHsIpCcz1ybqtVj7hOxknV74F/wQiVkS7Ym53RRlZ15MzjEyk6cno/RlzD4JKbGTlmJRZR/CreSthL28tuPPIM8fX9eJYTVQ0XVToFYt8Vq3SaypSdH51CK/LWjsMLPKaDb9nu63lWRTpd8G/Y3R7KhUfhuxtvom46Z2xvx1jrXm17CVEksdnIQjjdkOn4KQaDKKqfEv4+jor/0aeUpHVHBx//00FzPukeSdy/wN5zxEBEDgaMEfECcG9j4ZYShebyWKj6/Zw0b0Gc4+tROWEa8mQvmRcnFdofsCUg3mGdWMUVW5VCg1YvUT0TCO9Oak5M3IGJu8ZH5WZvljpj6KpCh2UooKORRxkrXjFU5cfIdE19DvhZKdEU8qdWKXLEit1exU6IJ6v0JZImO6vjttq52L8nYaNzNG/QhNUDIzZUMiK8U1KTbbFCEKUrnj41wxKVrYTCtvGE8MehuSzlb2A7cDNS0jdcJhDz8i5k8X+abxNKCy+IxkEGMvWBkVI6vBuHbByBrMbWZaDcW4UySoTieshBAGNYVeoGu3VNhzwtW7BvFyB4RMwshaqVY7WUqhBt10K3VgodSqsdjyCQkx49L60TjDeZ9YVGmMkY3EPrrak49Wq+zFdxkLWrryDI7s2wZr4Lr60xfEhSrxpyXYIzkCTcL55iLOyszelTau4NJkbVw8tzP6viN6lXloeVXSa1NWHYhrqkKA76vRbEwkg7XDmNqEspCCHb1DmDaBm62rbLLAthBWGWYAnoFhlhnMaap4f9HG5oK7lrh4+0FkKd8TGbKb1H64B79HslJ3NH/OqfAzGmM1/y2tkWZzybMa9hj8/TrNKI5HKA1ssddh31JeyaOZl9rM5SzPoMy3yv55WNvRBQ5STReYqiTMiqWcKk1Exphe1wm3Uv0tlxUk+FuCf1lYO/LPuRJCxEBEhtuVeuR4r1j5zEnK+sWWWP25p/Sg8azvGITR5fiz5E0txYrFv9ns27i5ipM3mZ+ULU5caU7zWHD8XjKEYwwFfeaKYV9ggtN8f94f0UQuP8M5e/CsnzS+ZuVynUkQ17XIscWoS0KEici5wjfwjBArUfD05yq/HhADOK9cUrcnBj1TjKaIoPvY4/y6+l1x7VjCUyIFQxUNQT9CMN3yb1aRbmZopmuw71jPpO4TbWHFdoyY0/5ZpnmF9/AZjyG56PkavKBQoeV6aAXkfoiYlUuDGcpFhE8ppvOJmLjshao6nq5s+nyimJe3jIBzCorAlxQdQ4smMr0x8d976BPsfF4yANPn7X51AsdVvunTSduxWdl8LlF8aN0M+HxNoPsK/m65h/PY9wKqMRJ9pdTgxXUksxXI2UKM0rSPkleuWIoLmZEWllsxC3tcSWkzSjjdKvRg0ahz5I1Z3PuVdkPJ1R2ELmStigprZFT7H9zTVKtUewbJJM0Ifa/pyrbavON07L5qG+6hEiWccq2WYm8WVaAVezNnOJqESTnGrG5/Abt0QT4ePZ/aPUvGO1G2x6AsGFq3qvLEqrJwtIZKsnK7SFDctSpl+UAyyrOyvEcRSqksPzdUNUmgaaZQVFp/XbE6i+95yQ6epHI+igylW3BUkxUI4Hmk711nSy0tL+t3aNoMbHnZ/zEEi7+t5mdA323yWo/4FBhIZm2h5cC6G8cs9cdq4vbYwfkswP6ksIv07Kzr4YiKkbgUz17nfqL8CL8OYpltE9eOqHvsIBbOlDopjI6lfpG/WVeRd/YMuPoEK6eayg4X7Ic6fOrn8dH0M5i9HaPR/Wt5L2aP5VykygDZFn4+oVEist3iPV1hgc48UmGhFH1RVXqR/x5xlCyI2JY1JfaMbGqNdUj43r6K0BPilrFnfC7QtUO2UlcE7Cz1hco4o3ZIc147oh2CZzlbtNNS3mIzqvxz7m9TZoBfO6eOl58/p3qknMDeEa8939hbe/n+2uM6Wl+xxXOmmPdA
jCR1UMdY6cpw3SpXbKU70Va3pGwRR1uTDmE4VU0qZZkdRkdwRjoXiGy3ykrrNpSVNqpqDaoq2+58Xukg4XeHVe0xI6QFrtLBDL702EizRiLO0QZq3nPOezRFSikCLe0J92GEc4ZRFgFhCdUcZj+H5zAxC9B3vDne9scQ2fu13mRLTVVFw9fSryDFCz5/C9/jffbYUudqYY/Z66YsJ+6Zct+o7WjlsKKEZo/TynNGv4GRM7ZaY6UjFFk6VRVQtc4I9dAXeFeXqvNktpJRDiLyXXhybLVnEmONLOlzlZUN6MxbiOqiqv9kHIRPHppdcUR2gDKENdwPCRWk9hOyrXwi5eyufiIVe6oj6ujFnirVOcSeyt8t99RYQ1CEXMVEz0oRRtiP5b102hEnL/5dZdOSUDuVwXy+1u7j9uRC0yhnZBX/5pH97iMR/EP5whPrBzC++Cqb+/V7OObHyesM7LDvPnmorIMZjxNY091Xuf1hHBGD0Yzs054qPqbPkDhgAVKMbH6W+b5gVsO5eSLzPSphvqpkgbzm8Fsn5k2iF5hNXZ8BazDBd6Y43qcUKFB9CnOxYJtgrZ+iz7B8VZ8BxxC+zzeZneXNMTSP6TN0cmQnQXT6aWO4o8/4elbpdC6m4gAXU/1ELqb8vZqKaj/eEBsT7i0ylrgyk6hcLLzmYBmhzVl+DCu+h/Xezwz+Zh3aF21iHdrL2qdwVE5xAJmevzPDb8CT405N+TKa0U14r4xPqfw5sbbL+Uxtl5yDxRtz8Jv5g95SoI0stxaB7xetsFaFMRxg+Z7A3nxYS3qfcUSwbX1CEx68kScPK7vAU/pS7dniC9A6NwKtU7yBiHzRL19dYaIxGx9CRI68pN92DK/dXYCnJRUMzB7s6p9D7Byq+3ffjYv04ekR5dGhWCfsKsZXq8O7X6QO77+ByvL/oHL4QYuNGIMnd99/eseK9g7y53mfUYcnZae0+Oox3tc0+ugYh2+McfjvG+MkBRt54p51aIwP1+R+Rh0+cG2wLtsvH+MvUoeP3hjjb627/tAYwx4IJ7kPVwTm+zWOH6u73h3jyEC7/eVj/Mmaz+ykmk83iP91Y9yH3u03P+597Vb+qTEuPjPGKZzyfOvLx/iL9uP5G2M8/7eNMeb9wSs/kTvi0Bgf3I/7n9mPE6fWw/x2E0/yXzfG/S/aj9M3xjj9941xOUe24I+OMcaoDo3xZ/bjBPVC58WXj/EX7cfO9vhZubP9952VkSUIeW4/dYJKnK+peSDdVamNQ23BE3PtS0/M+y3diS3iCTQqGGn0djTQ3Y0GvjfCItmKdyIs1HdO9oSKT9PREBW6Hkn7Dc6YO/35dp0enEFdUTNCyCesJyF86s+b7vOM+3zTDxpvRpxOiSxGXHunOLGY3UTjRcAVXCEPCImrv36TGwnVmz+fiWPuK2aoYWQIcV9VWUq3VnExpMzNWWm4M35eYB+ZLwsxYwpLZlYcXB2qDdvVcE9tTcO9EBrutspssoZ73a0ySt+l4W4JDXfEzVtek9iNJe5eosCkhnuFUpU1dlUWsSQdW6x1qRA/1L/MEyr6lllkjmm4Z1+A0igV247i3SCMPDMti2dnpuuU+VMT0p/fVjhIlzCTjITgLLREQqh6AIUqJs6womJsTmWGsCZea9lRvnYZvVZQLQH0k14TQNwg6jclUrlTMkqC/ka4VK/KklNbmMHbJeSHzkfHNZxuNWZU99ohJA0y+CiEXEJzQbD9EEORjrjAMRZsPw7VqFZ8cq5ASr+KuCgxCsYcdhIXK7htZQaaM5s6IshmlA+tH0b4lIphCGs/DcLGEpML13hgPeZxRPTX1D/AU9KoE4JYsLykzIwpWHmZ/U1cE1KZWe0Yt+Fi5a8NlkHmhEvOCceEcyf8vGKoY2wY/21uMMJXVpg5iHng6ueSsJm2ZJgT17n+WaqALsQKuDSKKhfekJW6BTEValUvhIXk6xoxF5cVrocYNRinx6ycihk6LCVTIMwUm7FrqhKt
puHctwLnpWGJ4sralA5V3WoVd9zvCTEvyz4nC6RwAFwVt1XYOlElKCpubY9YGua5wnwykwKvYp7xem2AKRhDuDagbAjGZImtk99N1T2EGneboVSU3vIKa7yBSYvML7F2JaPkq+pD4rIvmbVEsDmU/lYg8g1XYIyYkZl3qsqqh9wfPC6mQLwXrqhzwLoRV1VHhpbGDUbYFcbySBwhvS93p1ywYdTFeJu86js0ruCPEEsGPJPAkmI1VCR3TYOxwBrLJTyXV7FcWrxDxRpfPzFwGIz1ISaVnCoUpEZkpf+Iv2lXO/2c1om20zNSnVk1JQNKTeFBeJdWO73LmDltpxc7Be/0udjprb2dHneb797pTbHTc90TWVRf1gPlAiMsd3ra0bWd3uKaFLbW/YB2y1JhleE0w4wl5NUIBnJiKji206dfgcdkbLBvkB3gPmPWEtU3HVFhovqmICZ6YkZ1d+5xm6FW09ZhDrsdZhSJ1Z7XRR2cxtbKGHauT2IMe/Vb+9cSx0W1SwpTLTByYr75wlMhXBDVrjFuCcda6l+kxm47CCe9f61wY+Lze9cv+s7weO6/vpN+jFH5kCLTaadyY/9UrmNcergCE9gJTsFsrMLXUCl1d4lYGQesn38KVuYYGijwbeQLo35Yfgork/eb0RM8FcayjFOwMvv5BK2njD6c6cCrRNt9CgorPYYGwuhLBDMJqxtPaGHyar8bLirXBnFJnskpaKBXW+jbHDVARPMpYxi/4OjW0UCY04sRh5ifhHdKnNdaCL4QRkiRV9SxTmlhVLzWwrCOmLXTkXSdF5XwWgsTF87KEe7Kn8Wswe7beeK1cxJmzd7PbFYtjAuX1JxTrBg4ZR2+iFZpLSznJXJl9Il555QW+q+1EJUynsDO1+iU+951+Er8R4sRqQghnjUia7GLwNiPr2iqz0ohWleTtjEKitZbxYlX4/W0vf0xXZ0/jYPN0WgiK3l7i7i5LuOb4SJehfsK4P9MreypvzLXs/awOEGP7xACTuljVwiDl35tpXlNXCyStxfXwH9hS8PXWnqyEvirCuX20TwA5RYiq7Xt33n5bNQqtJ7ZUW6fjjJjfNN5E9F3bObu5N6PzFzWKz+sbX5Aw3xVf55q43MQSaTp08d2y4xuutl+L0xW5+tpst5MbgZZnLwaQ5drV4t2yzarf3ciob/a3uPk5sLnCCNr1yPrZVT69f5osNAymc0cbJlndNqLbDKa3c8IPxznMNbQau5vZst1nsguv4jmE9bpSSluwZwj7fYWzSn+FR3HnWyfY3uMz/yX3fzLvvjLsv6yjOfbh81tju9YlslvrScPt3cb7S3b+cu+XOXt2/vV7eahgI/IG/6u/c33FOKNHz/4erucbRb8Xq0u3lvcLucL8b212hm/OXnkN+bqy+HdFv8kvFjll7dZJp+AXlvGcsb3/Cz6D4sf98/35W3vwXCam7uy93+maMXzJHu65c/xG4+bIhNvPC4ma3y5XE3m8O/F5HF9G+OT/WeZ38KXX4g/NGeTzeQvu8GXVuvxGfs1X8FDXP5qntemo+18thoWsZU9T2EngT0L0fbZdOWBjwvrqYRnvRmsesGcZjLsVSmcbHAGg4WqP3eSMx6ZxcA5bw5bmT9AtNwqnoet4WXgZM3OZWfTK85v4KT63DMG4eDyIhimcC64ruXwfitYdnAlLidXAyNu3j/37Jk9K+q2W9ThLBU/D+zuIm4v1pHtPofm+KqHPsj1uTN0fJoNmLWFmZZibP/XlWfBKr4fj7K7yRVYLNQFpV24Y8ZX6EmNs/jOW0+tGuLmn9Djm62ybGZ0n8kbvGwwS8lSWTFY0+FTH2th8bO8tvB7q8/cXLA/O8qfYuit19pBalmX59g36A2AtaqX/Tv2CmCN3c+uBtv+8uwZ7rJ7d3HZg/PIuDjDmp56z+bPUf6imW/GN4PFGK0U5xzP4tVg1c+6zkA9xzqJ9GexPFipYL9uxmA9zzfT0Tns7fxc4C8k1fOv0537
7AszXm2fpnb3DtZh0rOGj+ORmU3vBiWPw5nWlm6Kvsir7Yd29K/Ptu7yrHCXZjFuR5vYzjAXVOuBB9QpNLsCT7++gV99qPKSg6R/N6jH7XBOrb3rZrF1Dk/nZXrvxVfDYnrJvzhuu/CfhxYS73sO7QHsqHl3Omrd6b43ffdqgJlZeAIYJ/DcjK23NHJ36Odecg9+7/1WVCAgexPaJO0OhT1tuz/oFAk9/WuVbaLRLOvZOCM3uM7EXRghMBezpsqcQY+vE/C+4NvOCy0Thn1wMbaGT+qzVT/ALg47SrqDvZ/PwLbCU+a9ywvqj7jEkd/F59NIlnh+dX/M2pkx1VuAO8LLX23ypzF3B3vCc5yonXN3lOBzah7SPN0+T0dDsBjYT46w5tq5+Ux4OnPwA7boB1COHb4VuU1+LdcHd8PobmjMXng+vOcjh9V05f/9nwD3Cfrul3vIstrnwe+4m8HOqvSfqywm5teSSauqVdjzc7bkvQnLoiJxB62GrCrsyH1PRuu2FD3h3OBhX6io9kXhi6lnj24Wma55/dJH63zaRzvs0WBbX3o/FB3b93SW3mpM/4uOeJa041+alNGEO9He3HfsiwXF6Eo43zRjeycvfjYZmYsxnIwm7eF6bC0M9jE6Mjq3pvUdRHCy7CYRnJrc5hHOSaxYvM52FUywkgTObz93LD6cxZcnIBLIKjg/xDM+x+36Xd+CHcwKLe9y+3MyHF8GZtd2nc3V2DonFfQg84YTZzgcoyr6sMvK6MNW0x9m+PrSc7x1bI7xdcvLho/wt1GYDJ6C0ktvg8FzOFxfTtK5MSzOryJzUA+MeyNKW91JNnC9dNb0s3HbLy8WQTZcx87sn+Hd2vRW2Sq6GZrXo+zn1BinbugW8F3310YW+qH3FJrdzmg4GAbpfTEbnXtudpFNnAjW18LyMAd2vVkFV8g0sFn7mf8cZbOHsB09B9bac28Gyyhb1Gbt9c3QMq2ZE9dvm/GDN9osR0Gr7JmLH/2rhTHNFlvXxB1+8I9XRrVZNqvFq240LDZm0F4X/bZTDNq1cgbtnMBsn7VnQd8x78J0CH0zaPlGKxyE3sXQ+f3Xg3Z3M7m6t6Ni4/8su/1RMgv7wSJxk9Y2TrK7YftxGxpn5S04YLPA++E2W0acRbZnjwvoAzi7dAo3rS8jnKfX59l12zMjcDRHSWQFo+5qGnYvrlcmjPl6A+P+cwpz4s9c15+GFuyjq3o4aYcPs/Z97sLsCZvDx59W99dP5NoNWtZ1e73tt8PidgW7pTW4ub3rroeti+j2qrsIzLkJ45//NBfuqOnXwXYvR62ZHRnjH5Ny6Pdbra6fmoPrsOuFYXYZDP/sdVzOHid3g5vpTVS6MP+hDWCa59Cu7s3QHlrB9ea6ZyyMcTlYzW7WN+5qcDNZ1eo43117vImyNJ+1637P9rdRem6EyfAuMryJH2bNEaxL31xfXId/6npswtD+GJcX/Ti7t727cQbtW/bsyA6X50tYr+bsah3B7lbzzPti1ByCB3NmT5tpbQz94N4Ml0ErqkVB52FQnBljs9UcrWYX8Bt3YDcmYMP+yPWgdZ9PmoslrMHlwBpb02Zr4t6BDQlbxuhqboN9gvEcWzPLIxsEZ5gntC/BnVvzwlpten1WmxnmZTCatWAu3A1gLoyN4Z+5vvNQdy4HW12LzVY5uhrfu1anmK5gT73phtNmdzspZ63xTTe5Tp1a0Lqo+XDfrdOCWbu4gO/5Y9du4m1vgxT2g7QInFkYJIvMXXXBRgxhPBYF2PebwKpbt5eb1A8Wmb+q8RwazQa+md2FoTf+s9f55LaZPgeBW49H3SvP7uS30Jb4qtUZNWdenI0L7yp6/jfZ54HVKcej4RXYLAP+df3Qz4NsYPuh8/yn9scX+2VzdtkPWg/DYafu3w0nfpqTn+OH4M9kg4fQHg6mIXjJjNPdeAcjV5qHvgF/cTttn8MJJlv2Vq3a5Mb7zw7H7Uvm7rfUET5eeYzngSeP45qM
jw2ybJykcHoHH7QMlx9EwL7NaPDF1Yqx1UP+htf1G/Za9Xn9hoMo23fXKcaI7/w2zUGPsCOh5DXJ+4TJ2L/2t4Qyw3x0SSiymkTt8bXPyg3M2kjKxRW6rSPUvBWyVPCXqd8Q3F771yL/Tc+nv+bPHWHFzv8wK/aH9eI6x2sOlt9bzYtcKqiDfUQvbhRh1q+ILNf8Ar24Q1X5xTsx6HU4uaPa2rdpmzBbGOE2ZSwEdZz2roWtZFY07bX43EBogtSZX9U3SMOo4poXDMMOc/QnzJnqSmRgEkqe7xpraQj2nwqRV5C2BeklCfbcMpYYWlswADGHdeDkxMSlMJzMOy8UMWusUdWRrMBKY4NRkfCcl0qTHNvEPOOkt0LIHeb5ZvTTlpUllb4wK0UyOgY1YNEi2EpVktBugsdfWpBKO0vobglVSWIj9uG3JJMQ4VwNwWvO3Kalr5iNhU5zzmgqqZ0VS30rVk1NiGlRMhNqzMU0boVgQGbtsqDiUheo0ZLf7+BnTWKEOqIZ7H6IkewzmsGdr9AMLo/XyrjfWQ+FyM6nPiOWXlEGGa5Qy8ddeQlZr69QBjnMKvHOukaYFageUP8+zeCQOQbVqkRsWaPiCSwbAvMoeapCk3nmGEvI3FiupjyOmMTI0rRsa4rTkFWCBI+6a+yw7+NKl2pCqO5wifwHzCJM1oNUNxFtTzyiZCERLV+tclrdArfdsIjxWOn2usycTDxpxBisKUW52/61UopCLi+rUidKmR2ZFAdT5oRFbJ+qDqHvEzyLxGq9davvNZl3ndTxalxJ0ikVL2eSsoJipRhfZwU5hU8kDn3kjFU8rYwBtkl5ISF1BMRpVoojxL8pGGZYLcSo2jInBL1g9gYfBvG4jaLimiUNZ2Gp52VfWvUdJYOO4LEkLlVUwBPYdxxnxD5ForoHd4i5Xu1jStWDPvHl+xZbapjPpICg8JEWYsyV0hVzGWpKV1FOGHdWTNjybuMXijWa/EJfYJsFh6xSqHeFkgXhPUuxSyI3mFKQZI554mgVOPRQqEC+6it+jBn6M77iJ2uQFydxgniX3+wrIqcX6ir8Ll/ROOArGu/0FZEb8wm8pe/TFhasrz6x/1GVCfkkpGglkOs+68slMreGTN9Yd+doeqUdoRdXaRSTduSORjFVimjfw4yOQjuST1FkERixrN/jVizLxLWv1Cfo2Rs7z6l9NneLl9fELMvt2HnmIyvO+O1q3p+sCF+cxNCyz1jz5Suu5l3XSh7f37LizAMrznz3iivn6Ld934qj2padlUKaKfpKIT5uTZEbK/poxWl7OdWL1KtKOjpRlKwaCDOySZWhlvJ1aP+U9SJzwcdc1WEw726DKy6JG54VEPWKS96/RMUl81ML/wtXbSzqZqgGSO3xO2qOiCrZ8b86tcr/6mC9mFSTIv8L+4BqpQK5Yv06qR9LPl7yT0O+n3iCQ6WUy76rb1T6Ob787oLV2uga9WpMUX+hqT8iP3Eq1R+RM1tWS3KVY5CyajHd26FqS+1kKmq3HJu/QylMoDKVRX4Q9R+dTIvqu1OuakSftMm1UG4pVS7o+gRF9vBPqDm+hxntDTXHXX6lAzFh67sVc50C6z5+Z8Q1OhBxjd6vmAurH72bb1RkF3V2oVaDNiflHFGDVrLGkqrEFVEKoYzOUQej4rd3SPWM1zfda6tzDJw1+0p1nWr9NEX2sKz+FmMtZ3VfSYrYdVFfySq6an25QmW9wYq4WCsHa7dfqQ2xumog6+2qtrKHw1XfuIZFTSJXBstzh1AtF2psW+nnI0e1UqEkzShx3klEBKg679hCIY+9LbBl6CFp/SWqkdGuOazymoSy3lYwH6CWE/ZnWFJNrjqnwVmwqnkshHpkVfOINUWFqnnE8wiee0xl33mvKTgaFVn9HZvokqqPx0rthagJVWq5miIl9anGy417leRZx1mrKVKi5ygVKXcVMPnaJf2VPtfowdlL8mDTniQVKUvuz0p5k89ZUu1+bpFNLRsq
qkYaIshxL6u+8Qx5XFMOObh+c2Qs+orImH08MvatjF+ov/PkYQ3Ub4yMvcLq9t7IWOKWvcTNvy2zxUpFmifI+rIa74LJUSmlmyZU3xp6peWOUp/mNYp796/fPP98TKHmM+efL4o4HGc922eB++rzD6ok1Wre7zv/1A6cf2rvPP9gxPSJqua/LeLA6szI8CZryKVeBCsxu6RRmUolVozx1cmfLuksINSfHelnUCaKON/FvV4i47OO4CJo1CuF4oaMXRra32w8L8F9ufIVUEW+KVT5KANG0QOh6Noo1fMksSE4AMzqeynuyloReluFIr1H9/PvCx4PW/pRrDDdqck4p9BtIV0sqeHiKc4EX8RAG/AdPtWrVzwSpG1xUJ2ZM3BoNaQKNfstxG0geTjYUmw5nslsPSp+zVYJNThYC4T5Lyzhi7COlPQLk4hj1ZUSsOBZYyYWl3R2MJuoFK5Lpa8KvgWPedX3HsW2I9a2SlhDQ+nQYX5Aco40d2P9Lup4Ks4R0lISY036XBb3D8XNS8mvIq5JGxXm2FZ/zn7AfgmrRPrsU3IbWSdO8gXgeVJyMlBWU+gylWlN6pYIn1cgwIm7QviqsWy3OBd2OGKWRDZnFZ0qL0Dnd/puoYPDemQipyAyncQmZaF6OcxBw9v5blZpxzpUyphWa4jR5qxqLp7b3ypNFNFO/m7ZRr/2Bt6h9rt3lM9yLC5O4ljc55z88h0FlTe2vzGGXT+wo9Tfu6PAKRT9ve/bUWyy1kKtmNE9Kk9fZ8vJ6jlkYQOV+TFY55rUji309Kvcf6egTB+vOlIN6jcrFURWniTWIVhtqSF5tcRpiuLMfdq1YNWQBasiTLCaLN6ZKJLEp71AWQmRDSMmlporTtCeZNghZSGfT5kU4XMsYSHE9Zzvo3Zp+qUJRqQkVoG4wnSsgs3RxdgUbCsm67tLbfNIaJ032CITA5GrNNRhpRaMOaHRRawCZfrEqVrxfbnM8KNxf/l46hR/IyXmjad2RDpZi7EUisRkVQUHWlPttqwCzSxLyLdWkFVOVAZSqHxS5K0mFJhkn4lon2sJNbmSLBn0vf7d0vKxehtbSeG1VJYSeccIw5EqpT2OxIrvDhpCDSpSXgtnZ+m0yWqf+nPzd3M7pToet/GYVa3/dqv6RSiyt1hNnW9mNXVL1Ojp5L8Vabk9pPDivpOfGGM6xFTDPto3YDVe+AMma7NKf6BBcSpPMhwGDcEHxdx0QheyqFYFWb+S0GOYi79GvjeJmQjRJxGxPfG7VdyvYORWyoixS+0+5mPD3C5paLq7v1kSPwr5qY68tyQ/Wyp0V/6N8NfFKmd/Xb6nYQI0tBdymBFrolMo9FZCMSqTnrWMmPerlAyRYBFYt5itLTFE6gyETv11HzbcSh8WFb0rNJ7PKuIU16PX2u/5mvWelxRzVPc5BcfYYr3ttmz7q2iwy98d83L2GVd1thnwLTqcbX6bP6XcVzV7r8WSei4aTlrEknxYkTC35mO7+zy7aexF8N/GXPnQd67QdnELjxiH6Kz0Jdy5PvN3shozIZkEl6lVnermgiWS9i08DeIpQUb7MTpMpwRWxY0YR5m4SmURvTsPI7Ucvd6SVVDqhg0+MZIfggrV4Z5VIMyB0HgnFE3dVZFvgdkk9rdIvRYrn9W4SanZR4yjqdBH3B6b2+3wCmk6MpOYC+VgyQJpsx/iGszihychlVXA5xH8pRHu45oPE9ryGq0urOQqai7xp0vxWveZmF2TVK2JqxX7ROFXMZIuo9T0e5qv5pJFlb4a9C9GqeH0Ki1hzBHwgBBQO2P+ut/gf0prXtUxU/R8LrMjyGyHz0CIMhmb7F/K2CSx3Omv4bRKvtHr7HCrr+FZ7eScI1HaxpzLEOd8gYgFT0CeEgTTLp0EBMtuU+Lx6Fp41Q7mE8xqH3QtXk2E1yOd4kqT06lJjU7w5m0PZ6liCCbuPhHz4giu4HJlxEsptI4DqUPu1FW8RGnUU+wnp7iS5DZqOhVq
hr5Dombo90ylRw+ecl9qooKH411XGII+nlJkLh/1oPW9qdq3KB+HeTsxY5nvM6DTiSH5asUKNSi+0EyF3jJVh6u4AnGKk66y9IDplKC1oVPQ56t2FC5Hygeq7+CaYjMB86z2m7KfOwKvIMae4hYaE3OTOQ8Z2+nUWH9axsdSivcwH29kMG+pxCjQdUk6uWKOkAXA2U37rUvxM4UJ4FycyKHOyz0mbFPTFC7EqSiv+DFZv9YrmR+TsaZp/bh+a/SxWpO9GC5jOt1SnrrBAgue1bDcefYktQmNTjhSRG/J/m8wXjfgWKrg5q20p5lLk7V0A8cglu1A6tOnBSPiaU0yy3kgqwf4mqoJmqirRGNtyZw1Y2M7XHkAJ1LmZPVtiX8l28A5Ysnla/Hu1uE5RrbBlTnDrWDBrvNcwh01tTwVB8Vrn9oP91g0vwsVBdhWa0lhZMQzptLvq6OPCyc7Macj1hEn3ClHIGRFhyd5mjFuiWsAd6IlzGfaleOKHZv6mXxzUzB/yx1c5M1ZQ5mqHRATLWLahHvlPDC9Jh1m7W/oz/L6i0v+HWXryDfnigYZ9eC8Ne0SrHeM/L588ufcL2kle2Lto9az2N0KGjtsO78mLE31N0YHEs6INKIVp7LgbiV7Y3MuvOKHZr9XctRS1ATZx+1qHhO2V/ATa/P+iKY3al98XEng0LrtfER3+eW65flsV3tRyhhkOiN1LBEbV5gFwY8s1mEq4v5K09pgBQLWtIZ1YzJWPNVj5Ab2N7+ufldc53wOBC+JolZyvIhn3BARMjr3VfPJJSwUr+sIq5Kq+xKhVR8wNo69ybheabRzTJzi+gKfzZrovqnwcJcG2AD5fYRDkRrqHNG7NJi/XJ6Fk0hpqhOeHpnvkdV+qTzbGt2D+ifNmKJhaM9E9tbG9dkLwprmCePnbdaUov13A/Zdnb1dipOSLn1OmAt4XoXTA9tGbPoCv4ff0UOvSJ4tA8qPFLwvpowXFFrw4tpS+YgkLlipAD9LHnmu1loisszSbstnp3wQnCdLQv7WXeXtYnRWaM0TAlfaHJ3Jn3THdSZ/ynmJdYqoTLVGtWeQnO8CZ694p5GBhvEorMaw5UqpSCJ/eW5U3M2ifrPiMBY5DMKZHFnnlvepdS5jLOTf4fOpvZOixiXZ+oK14tG2yXnpCqynsIfEd9+oKtME1pO5pkkrHms8TNWvjD8qNW5w4nL/Bm5w9oWS2NS4wWWsI+/zCVTa3bK/yw1eanzx0AcU2wH/Ws7P0ODnobVu8QmQsZYB77c5Pctx/I79scq2vfHjqLTO8W/wswuOf677sF3x7BxJ7lA+hF+D/ynyaeI6F9WLiKMlfQGFb5K6DeI3K00Hrp0SWgboryo7DPOfo8zU3zw+WBEo839YpcbR51Tei/nhUu3vzD1PbeHnY79Y5rPFe9ra5PMOr01Cy/D9ql4rYvtE8bHQYltWqXugfepT3ZW8N1WVoW5T4sn4TKDhyQpc0/y32Oa+ULoFqBRS4/nl6m3n6wS159EWp/pz4n5WcdeL/hbz88gpNfqSUyopLbB3xKvPYOb8avW5L1Yf1cNSHkicoHBGiF0QehKsu6diKqGoP02l4ogWaXUpP8SR1qhgZGakdliK+jKCgbPOAaOuWduIMs+6FRWqGqoGg7VVLvXPKy0k/G6zyqMJFDRbK6wdtpTHhjEqhSqPd1DlmtpIjaPPSoHCoO/AbD4hJxxCCFazmDwdnsVkCVNdl+gIa/1HUdd7Vds5P5msjuFr6Vswrz6dv6X/8T6bnKR7Npn8bsps0r4pq4MZ8yL1Guw9m1zXfGd8LtYJUHowhFfAs1hRVfdoa42wDanAsTqEXVG4e8QyyLh3Etf0EaVnkqOdoPZLpDCkXEPJVUNa/4k4iDh7VLYlF7kBztzCGYmxP1JzI+RzHJ1JpVJSdSYV+6rwheW+KtWZ5L5K3y33VVvDSQiFJnpWijDCzJH3WpyZFDjZUj+XdUztXAYzOtTuE+1hPZSS
qtzlbx7b8z4Qvz+QJTyxPsA/omgAs/KpjzsMInM/xWCf2j2w4jgaJykjvM6qj7l4eKa5BbP1lGey96sjNWWEHNZ/HeOaLnEwv6kbcEz7gZXrYfxRMe4E3cCk8VoLUSEL1XNPzZlY+2jMqoUu1lTCGkwtZnV8s4XFMe0HHMMOqaCcpP2wil7VfsAxJCUfZlZ5WzfgVe2HEHUDbKwch1V5Sgv3lBFeySi9h0VpeZBF6eZUFiV3H5N0INu/z6bKO3IvIBalLfGmiyfE8/Q4WWT9kVOPrjsfxIG/qd/y23VL99vFmPPG0yewU/k++41gY/iUOi3O6zkqltrfqlj6kRouOSPfquF60S9fjZLT+HAPYjkcuz+KauMgW8BoCywHKuvA6e0z3BuBe6CSq/PuSi6/RK14zgOHZg/zMV+sKe5+kW68+4be9Lfyq7yhN22A15et+u2wHsE4CyQk2Plavm/r34XZOVTTX74TD2nAk2N8vKCY5xJ2lya8V8an4HVOxEucqKh9uEaoLmqEjDcwWcafVZo+aLGDBioSv/Cj3reOwwPr2P/UOna2fWQnT756nP2vGGfzjXE2/43jjPps4ecUxYNDlbfh58YZ7HQtd798PYdfMc7WG+P8zbXVHxvn0Oph1vFT4xx/SYX1zjhb3mWN1IK/dpw/VdtZP62207H/jeOMimrzT/pf8wPjHH9unOEkWjO8L7fb8VeMc+2Nca79C8c5cfIeZok/Nc7pgXGef2qcO3n/smaxHzb/wnGef8U4198Y5/q/cZzdGjIAf2qck8aBcU4/N84GnABq7Id95TifGCM4Ps65e/nGyblwv7li+SMn544FfWqczif88lTVyb+oEuLJ41wLZiCpJXhy7l9+4cn5Bbqm0iqFU51FmflmfEKc0H2hoPr+iIvkJN6LuFD/XXSnK1QZI3WwxfQSZgWdPnf69M3KPTidlqKahJBRlSZyVkxtUty67wXp9oRY1NtxR4ur1hQvluA00dgQYA1X2ISo0NDEXL/8Bj+Sd0x56B2ZOsGAVanTCwYslccsqydDxk5EoCh1+oLx9RIdSaxZiCpTaDNGskimFkSNakwtjFgUTC2OwMQrdXqZ7WN1epFxqtTpRYZKcD/0K0Qwv36RNdTyuPxdjBSQmTdSbkcdJsyM+cxjLHH5Aikm0DmFhmIVVS5alhFV3okd06lQQdS/c4EIcUrJHnMMiTO+/gIkRylZdhTfBtcJCNV5fnbitK4zU2pUMnOOQkqWjKpsMLKUaym2FYtPyFXvjDom5jCcA6pWVWQQkXeb0egqeyquHUa4JVhrEJu7NQOu/psSyYzK3OpvhFzFFbXTFuLqLhkdorPSUV1nWX12TshQRtsgc4+rkDqMgqMM5ZYRbRUqA8dYsPzkXLcaaagMQlK/isqAv9VcQhg26goZnOgZas586qghQnxJ1BChgFzFLIT1oISjoLpKMbal+wb/SfQV9RFY82FLhLHgdqkzH6Yj2HdTRsclktkiFcx2hOvA7C7Ww9ZUVT6hqYg5ouDaX8VSJ9Bj9Lcaosa8pkRHI+d6JOqhXUJveopljq/dQP8s1nTEcv5vJMIY0cuqdjchtkKtJobQklwT00RWCldD/iCrBiP5GAmoGKBNgRwwcZ6wjYvr1RyqUPCuQIJpaCPb09BGXIerVeMxu4bF1efc52R/FEqgwRVzzahCdRNKgGtw4b4aM1mFVdvVGub5vlM5EDBrCKO03K1gRpboO/ndXPtDqPKGqfT9eH1t30CtWd6X2DpXoOhVZaLJ7F3EXCIYHZCbviHYMByBQSLmZbFPKZtucn/wuHgSEQ/PIrA3Fs8VqpxEtLdiBGNkC2F9JNKQ2dckRxHb/21foMA9RjsUPK7gjxBTBnj0okbe48qBLdduE1pYY7qE56I2MtMlnuyJNa3i5WcWDmKTcQ1G4VIFg9SNNKt9fo72Su7zNV4n2j7PzGXMrClYUGg9SJaYwNH2eae+t8+LfULs84HY55d7+zzu
Nd+9zwe8z3NVlMv9IquFglTUt4l9nvq22ue5EkHa6pT3SoVmdgvBWsI+DTONE3fB0X3+Q9Wn+4hNQg8bHtsBt2IuUX1TiAqUqm8Slyubm4otXdzTMLWat4L7X2Hw8HckmrvWF3VyGmOrQLlT9RKj3Kvf2r+WKC+ubFKoa87Yi/lmCD+FUENc2UaoJhxrqXMhqlVUO1h/Y++6QpXx5/euX/Sdx+ysR/bR+CM8yod1lk46l6O//yoGBs6s/aADO8EpqI5x9jpuBb4F0TSwT6fGKWia1/FCWBWAfGHUD59F0wRzq5fEOca0vBPwQl4SvtpX8B1PhGtD/NsJLRxfvooXwvgL8vBj7eMJLYxeRAo17FizYfewCqBE3+QUvNCrLUQWSIwdFKQQcQpe6NUWhpjpsxGliHHKE1r4IoeltTCIMVqKfKK5dwqqbeW+1kKwBvP3IO2K/Tr5qoVw4rmsWbgrfx7VBvtvgbMUqz5OGsP9jGfVQrCxDmo816mq4JR1+CrmK665BbJopMVpY4h27VXrALtiL/FhZ8dT7rvX4etxoD3d5l21afddatNahFeoSh9Qj17Vn6er8O+jMUVNGTy2W2Z00800FAchQyar8/U0WaMKfBYnbyrtHcbJST3wCoHw0rc9UT/7v6Sl5mst1ceOOFJlvTd8+844LzWs3mq8nra3P6ar86dxsDmaEWBFd28RN9dlfDNcxFrPiF74Z2plT/2VuZ61h8UJmL+jMzc6ZeYiTvSATjprn7+q324fxRjRmIEfue3fefls1Cq0ntnRtZ+OMmN803k1ki5Xr4p5y/aqf7VIaDNP4Lu3IccXyd+N2+hzz213NUy9ion3DOxYMru8uB+PsrvJlc8a6Nc1ZNITPU0subBjzA9F8wn/BP0mnw/mm/c4ubnwSekdfkNDeJ/1bM/CZ/3LvvjLsv6yjOfbh81t/pfdhEuT31pPHm7vNtpbtvOXfbnK27f3q9vNQwEfkTf8Xfub7yn4jR8/xHdsl7PNgt+r1X/we4vb5XwhvrdWO+M3J4/8xlx9Obzb4p+EF6v88jbL5BPQa8tYzvien0X/YfHj/vm+vO09GE5zc1f2/s+0RMMm2dMtf47feNwUmXjjcTFZ48vlajKHfy8mj+vbGJ/sP8v8Fr78QvyhOZtsJn/ZDb6Eo8jzHOZ6voKHuPzVPK9NR9v5bDUsYit7noLNh1HDWt5suvKexzizSnjWm8GqF8zJGsGukoJXjFYI1mr9uZPQ2km2i4Fz3hy2Mn+AWZ5VPA9bw8vAyZqdy86mV5zfwMnsuWcMwsHlRTBM4Tx6Xcvh/Vaw7OCYLydXAyNu3j/37Jk9K+q2W9TBD0fN9+4ibi/Wke0+h+b4qlc6T+71uTN0/OfYBu87cJ5gjqdohX5dwdwo9HkY1nu4Z8I8i698aME4i++89dSqoU/7hFju2SrLZkb3mTxT0med18D+gp8bwn6N8aj0yQM/jnOMvJ5gvtaIORC8QNiLLf7uBazFegmW8X52Ndj2l2fP0BK7dxeXvdV5MS7O8Ey5pflLu635CLvqD7HrQm8ujNlV4wf0CdwVP81K9wnOkHfQXniO9Fl8jtbcr7aZzdqtFNbHQqzVYtyOfoTpoC2fo7/yEv1ZIrubwefXMNrJ1DI3YHfEc52Dz+Euq/sGO/fFV8NiesljMW67z2PYj6aj4dPsyn3mMazaEq8GqyPtR7+l3isbRa8Ez+Omm46XZnk7qhvRzXwD9ydyJ/nV9pJ+NoPffNRzS233B1jPLXhV1FqwOuXUHhaRFeq9t53ddB9Fy9II/7PA/tN958GsDVbXyTbRaLZ7bqJvH2NuKajBOHXyXtCweoGTD7GSOOmU8B8qz8EJAav3EReuPi93vzQq13Tuj258+ObJyFyMrZBm6U8bercp7sL8zqhlVOjVQdKnjFjLmNq7eVboh7R1N4Y+2kFxYk+0MwN2TPh0APfF7fx5BjMkWl7gbgRP6jx22tgnFyWOxHTnMzyaaFXh
iQ3wAh7Hge7XvfxV8dkE+hx2vvOi8gz2Rwo+qeaisN2bMcxT7KsoeLHH8H54fWHc3lxkMu8K34qVc/CNP+8O7vdok4xXvJoC5sTTuLk5h18Wu63Y4SSSv/JhwKuamdAmrVbh5b7vVPvcnqazZhuEPtXc5jmSYlzZxPMWetlCQ2rHZqhYHefjfrzuBT1q+2K8/5SrVm28q2f90kMrXvVF8eSFDAdNYvyFUzlV9dKzwpnzxL+xb0N875eIJEVlnLnwYQ+3q2ONV/S/5IBPwrlf8j16N+RnwVigDZonMfhYUZktxmDD4HfWWrvWMMPvYB7pPshWxunESjS85gLOdQ173PZWr8d7kHXBWe8iNRpzZLn4ifsF+8jw/XPkgpcRjao/XpzvumSn+yu2Oj37AnbVzTKCM52X+I+Xq2zkp+fXs/KiN7Bmz6x1Xg+Go0VrOJrBa2c7pPdaQ99ojenvqWeNsnuLXq+y4dQZDoZZx5imZ+Wo5W49Z+CFN7OLYRjWYQ61w2Txc7BqlW66uIiMQRkNo4dhat54xll+HW6GkzS1MT48GXby6d1FHg8Xea/0wmDY7YXtzvO4tf7lG9n6+mrYja112h8NrGg0hn11aI7g0A7z/WF4t76B7173ry6sgZEPvGD2zzi9fx5dbqAJRtEH58Bbni/COzir3q19Pxlcx2Z35NndK7AHi9FVWnqj8dX1KHOHyWALlusf1+rYt0F2M2sPYK2vb8bXm/C23Xpyw+FqFAzBf3aNaLgee6N81B8N7TidP4Srs4e4NY4mhllG9noYmBejsL1ZX8PvxOFZLQa/2wNv2hsN6xMn+yce1a+GxvkdtONmlq3/CbLhws0u8uurwbVnDtozcP+CxAE/tlOGzcVwHKJ60uIxTrql157507tW6zabJYOr9c+gbPnTq1YbPpeNy4t/Znezq2EWmrdX63LSjnL4ezJeDfxpsbmc3jXMa2dzN0mywTjxslG6LftX3dUY+miwGv7wsuH4pxUXUyP3h8vzdXSXXc1G42GIOZkWzovBRa/sjqdG6w7mwc0kNYf+EObDTfd6Gm7g796FO5rVBinOl9bCa2Vr+FxzbNSHMK54/2B602pd0xw6/zFMWy38u9tar8eGB3/vLoKbxcDH++3B1Wh0Du9lztAYtHA+DsJuMWlnXbin5wetwa0zxHnbjEbng7EBf7fWffjcAP7uhiuzB/fC38fPt+kY7m1dXINf4wfjZZx5637T20zD1hPP/VZr6ozDiemZ4cjErAThtnpB9IRsSW+c1lbmAmbO49QewJ7QeYpuFuvYPMs1S7GvN5K/gWHbrRR5yRO5a1EVuqIxh33Y1OvTvOAicVdu3Qt8CyNK/y31aS6d4yODspAHsVfhNiq7SR85UumEx9gr3jE+UdFyAGPXyN5Z0ULncYwSe5QD+kLNKaFb0JG6Bai6bSuueuaP1bhcBd9s2RGZc+a1qFApDbPiiUX9IcxQzyu9WeK1PagVIThihVZE2dDuE9niwLX6imtW+03mphV8uHyvWxIHreAXI8WXQ9z7nJ0W72ls3TVW34w1/tgdtm7i0ZFs3ZyV1Nm6K20luBMzcjo6IpeZfPG7Wla/o1gw+03kMlOctQZzoDlClX2HHdwQLJvIr8GctRVPbi50+Gy97aKdRzTuGh/DcO3Hk2AVhZRX9iq1hxqz7oasSkzMi3OpsGUw245jsnojse3kFVbCJ7Yd1rXnez2lvuWXnHdtWKxCzMqM4m91wa6CPQrf4Wr3hQUh4ug7+bXg0OR8O7HKMIubhwxnhGvobHe+N6Hc5G5bme9CvmdIVmOPWakK5m4NDWJeSQhvRL/HjEqYe8RnJHYorf2hTc9K2Ae8lzAOMveP7JkCA8Ttr/BBfq6ukSmSZo9SYTGpTxlPYTKbkWp/ySwTIhdazi3Oo0t8jJ8z9wcxV+22VamwEEdqqeVqmfekFBy1hB9wCrFSa7xiiEVGvGaVi+pvQpmGuVFq2H/ye1nlmZjXJI+t
7NeclKgJJ0DzzdzDA7ISJX83xjbqu8oeDWYWZZxAjfE6Ti7ZxLyEMBICXyV0A1jtm6yLsKxy1blyNUqlGKUvJdg4mLFv59rQV/Fe31aM2QGxfkg1GlSgw3zxtq+sICl0bpmZLGT2IKWc0jGFGqdgB3FsiZPUsIaSKVT0LzOFIm6O+Yrp2kKmD8LeXOs4s7nIb7v6mJbclxh3xvtIG0KOWV1T09lt6+u4heJjmJ3PsGY3voo1uzi9ZvY4z3/neMXwN9eFI26ZsiIWVuAcqmXY9x65luGFT/z+WobDTADJuz0tWNHRkxfMobUxorCQ+x93hq/TUhFsZaFkHC490pfbv+a4CrOy6a9DycBHOJ4+s7saxCoZVCz3gt1Y2OiOYGx1a2q9C45xZk4Nc2YeqvB+hFch5tyOYO51bWVLmAlUeBVh7jITmMSHCsZ7Vtlk7TxW5NzV9CDMZSH2AKkqLDjOGzl5TYRdIo5xxqk1Sa1SKRgz06TA3iROQXtfKfnQiYtK2ENH9+iEPfSNyjNKmfO8nEsWI8LQwnMwxpqYVV2jYlUWStCBq6tz2Uq9jHG6FuMQmRlRY02usf1k9mX2eB2Nx11gUklR2WH/JCA2qiMeW+dPqBLvY2P2MAy9JCpYXf4ELp0jlhQ5OnvIJU3Iv1N4eT5VyyvP0+Xx8/S3Mi/AzuiDDaJKkIN4oO86qx7i1/C33XfyqSDutQcnFGKPRSbZyxqeUr5QMYWYGA2X1dnAp5rjOmVtGcaKMqMiK74xWyVhvP1cUwkshGaTpgVKim26FuiWWaDV9wiGNVZs43uIkY0xgjv3OIr5lHwAhXGkZ9/uPqf2WfIj969jxcq+88zHVHN/u4KI/1W+0L6CyA53GOxN4Cv41olYwC/zq/yjfpX3zVWDcDbFang8mzUP14h+l1+13zKh6vt+vwqzKRh/w1N1+S1+VSFw7KaG8a5ViuNwbilJ4aCqcxH1NIIBn9kPK4bZnDVHUnWvV0qsPbIJzmW0irD0O5GsUv3NJoVydR8rpPdF/QLr1vmyrqEUSqb8PIxF19Te+XuFYvpuW9me7SpqM3aHFACg51W9iSu1UOgMjJpofl6x2EZG5Zt1hA9Efpw4R8+VL0caMhQZrPpL1PqgX5OzrlrHVPUsXFmISCmDWLYp9pGWiqmYFaS2ouaII2+qpqBTEzEdZsVMIvKvFNcqK6YXLmPnLWY8r/qVWfVjxvsnouZC6dNpalDYpzrPadmoFE3RL63UoAxiHWY1qD31KZ+jgzSWKWHgwVZJFkpiLVZqUNyfmupVpKleoUItqatvK7+ywbafIrA0toYXHNd0QcaLP+Ab7tuc/798wzf4fMJvZeUCi/LkUoVI9Ht3gutDO0H//bkM1AFFjKjNqirfshP8T//6f/rX/9O//p/+9Z/Tv/5K5ordGqYCMUleM8J6zRO+rfMeLqDXmEevmHnUe4t59JsZzuD0k/ewGhk8x4P7vWWYbtstojIFWxLVlD73sobRiI/yzFoH2VPev/PVCIMQzLEaNv+Wnc+mmsRAstc7pNzz4hp2JBFrtoWmteQksFljkGOsnJtydhlCVL5Pxkg451f9hlAA2rsW8Qx6Pv3127FN50/4r/s1Y/8W//Xod6XoJ2IVuHEKu/dHfOE9dKOopmgZkbX4j1voLNp7ygsw01OM9dfpPM6rUdRDz19gMP919SUHWrsTF9lr7dz0risEqfK9OVK587f/PyqkPoK//a8Y4+WJM1pVS/WJT3rv/X9jXVg10torLfZ7Fln5Om7x3sKnQPBg4VQZjcYZ+LlKWf0XWJfI9ueT9nA9thYGxmj712CXk2inzoTzwZwjrnqIT6e0CtRTwohUFVvNXcw7YeFX9LxzrCDC/39F/VT9b3unfMqsv6ieMs0z+2X51Jnx7uopuHy4v99of2s/TNYL9352i5/4fw==</diagram></mxfile>
2110.08421/main_diagram/main_diagram.pdf ADDED
Binary file (46.1 kB). View file
 
2110.08421/paper_text/intro_method.md ADDED
@@ -0,0 +1,20 @@
+ # Method
+
+ We apply *adBiC* on top of four backbone methods that are usable for class IL without memory:
+
+ - *LwF* [@rebuffi2017_icarl] - a version of the original method from [@li2016_lwf], which exploits distillation to reduce catastrophic forgetting for past classes.
+
+ - *LUCIR* [@hou2019_lucir] - a distillation-based approach which uses a more elaborate way of ensuring a good balance between model stability and plasticity. We use the CNN version because it is adaptable to our setting.
+
+ - *FT*+ [@masana2021_study] - fine-tuning in which past-class weights are not updated, in order to reduce catastrophic forgetting.
+
+ - *SIW* [@belouadah2020_siw] - similar to *FT*+, but with class-weight standardization added to improve the comparability of predictions between past and new classes.
+
+ We compare *adBiC* to *BiC*, the original linear layer from [@wu2019_bic]. We also provide results with an optimal version of *adBiC*, obtained via an oracle-based selection of the best-performing reference dataset for each IL state. This oracle is important because it indicates the potential supplementary gain obtainable with a parameter-selection method more refined than the proposed one. Finally, we provide results with *Joint*, a training from scratch with all data available at all times; this is an upper bound for all IL methods.
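For context, the *BiC* layer from [@wu2019_bic] rectifies the bias toward newly learned classes by applying a learned scale and offset to the new-class logits while leaving past-class logits unchanged. A minimal sketch of applying such a correction (the logit values, class split, and `alpha`, `beta` below are illustrative assumptions, not learned parameters):

```python
import numpy as np

def bic_correct(logits, new_class_idx, alpha, beta):
    """Apply a BiC-style linear correction: past-class logits are kept,
    new-class logits are rescaled as alpha * o + beta."""
    corrected = logits.copy()
    corrected[..., new_class_idx] = alpha * logits[..., new_class_idx] + beta
    return corrected

# Toy example: 4 past classes, 2 new classes whose logits are inflated.
logits = np.array([1.0, 0.5, 0.2, 0.1, 2.5, 2.2])
out = bic_correct(logits, new_class_idx=[4, 5], alpha=0.6, beta=-0.4)
```

*adBiC* adapts the parameters of this linear transformation per incremental state, which is what the reference-dataset selection above tunes.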
+
+ :::: table*
+ ::: center
+ :::
+
+ []{#tab:global label="tab:global"}
+ ::::
2112.04386/main_diagram/main_diagram.drawio ADDED
The diff for this file is too large to render. See raw diff
 
2112.04386/paper_text/intro_method.md ADDED
@@ -0,0 +1,78 @@
+ # Introduction
+
+ It is widely known that the success of deep learning relies on data availability. Learning from larger and higher-quality datasets generally brings higher performance and better generalization for neural networks. Yet, for medical image analysis tasks, labeling datasets requires well-trained, highly engaged radiologists [@zhou2021review; @zhou2019handbook], which is especially challenging as physicians are costly and busy.
+
+ ![The distribution of the mean radial error (MRE) when choosing a different image as the template in the one-shot medical landmark detection task. The x-axis refers to MRE and the y-axis refers to the percentage of MRE values lying in the corresponding ranges. **Evidently, the choice of template affects the performance significantly.** ](Figures/tbar_mre_n1.pdf){#fig:tbar_1 width="100%"}
+
+ To alleviate this problem, many researchers [@chen2021semi; @laine2016temporal; @DBLP:journals/corr/abs-2103-12277; @tarvainen2017mean] utilize labeled data together with unlabeled data in a semi-supervised style to boost performance. A classic method is the mean teacher [@laine2016temporal; @tarvainen2017mean], which aggregates multiple predictions on unlabeled data with a teacher model pre-trained on labeled data. The aggregated results serve as more reliable pseudo labels for the unlabeled data in the rest of the method. Another group of researchers, aiming to achieve high performance at a low labeling cost, propose strategies to select instances for annotation incrementally [@DBLP:conf/cvpr/KimPKC21; @DBLP:conf/iccv/SinhaED19; @DBLP:conf/icml/TranDRC19; @DBLP:conf/cvpr/Zhang0YWZH20]. The basic idea is to first train a model with few labeled data, and then use the model to select instances from the unlabeled data iteratively; these are annotated by specialists for the next round of training. Meanwhile, some researchers attempt to exploit the full potential of the limited labeled data. With the power of self-training and self-supervised learning [@bhalodia2020self; @chen2020simple; @ouyang2020self; @xie2020self; @yao2021label; @zhou2020comparing; @zhu2020rubik], it is possible to develop a robust few-shot model with only a handful of labeled samples. For example, Yao *et al.* [@yao2021one] introduce a self-supervised proxy task that matches multi-layer features from images with different augmentations in the training stage, and use *a single image* as the template, whose patches centered at landmarks are matched with target images to make predictions.
+
+ However, during our research following the work of [@yao2021one], we observe an interesting phenomenon (see Figure [1](#fig:tbar_1){reference-type="ref" reference="fig:tbar_1"}): the choice of template highly impacts the final performance. The mean radial error (MRE) of our trained model varies from [2.9mm]{.underline} under the "best" template to [4.5mm]{.underline} under the "worst" template. Evidently, there is a large gap between the best and the worst choices. Thus, a **selection question** naturally stands out: *Regarding this "gap" over samples, how can we find and annotate the most "valuable" images so as to achieve the best performance with a deep model trained under such limited supervision?* To the best of our knowledge, there is no ready answer to this question. In this paper, we attempt to fill this blank.
+
+ To answer this question, we have to address three difficulties. (1) **No supervised signal**: for a landmark detection task, there are no labels to guide our image selection --- we need to find substitutes for landmarks. (2) **No proper metric**: mean radial error (MRE) is often employed as the performance metric for landmark detection, but we cannot compute MRE when no landmarks are available --- we need to find a proxy for MRE. (3) **No effective feature extraction**: can we train a deep model that effectively extracts the features needed for template selection?
+
+ In this paper, we propose a framework named Sample Choosing Policy (SCP) to find the most annotation-worthy images as templates. First, to handle the absence of landmark labels, we choose handcrafted key points as substitutes for the landmarks of interest. Second, to replace MRE, we propose a similarity score between a template and the remaining images based on the features of such potential key points. Third, considering that landmark detection is a pixel-wise task, we apply pixel-wise self-supervised learning on all unlabeled data to build a basic feature extractor, which extracts features from each image. Finally, we identify the subset of images with the highest similarities as the candidate templates to be labeled, from which a model is learned for few-shot landmark detection. With the help of SCP, we improve the MRE of one-shot medical landmark detection from [3.595mm]{.underline} (with a random template) to [3.083mm]{.underline} (with our selected template) on the Cephalometric X-ray dataset, and from [4.114mm]{.underline} (with a random template) to [2.635mm]{.underline} (with our selected template) on the Hand X-ray dataset; refer to Section [4](#sec:experiment){reference-type="ref" reference="sec:experiment"}.
+
+ ![**Difference from active learning.** Deep models can remember and cluster the images or patterns they have viewed. Whereas active learning (**AL**) tends to find the unfamiliar examples, our goal is to find the ones nearest to the center of the latent space, which we regard as more representative and important when only a few images can be labeled.](Figures/dist_sample_select2.png){#fig:dis_sample width="100%"}
+
+ # Method
+
+ In this section, we introduce the proposed framework, named Sample Choosing Policy (SCP), in detail.
+
+ Given a template $T$, landmark detection for an image $X$ is first implemented via classical template matching.
+
+ Denote the set of landmarks by $P=\{p_1,p_2,\dots,p_L\}$. Suppose that $p_l^T \in P^T$ is the $l^{th}$ landmark point in the template $T$; its corresponding landmark $p_l^X$ in the image $X$ is found by the following *searching-and-maximizing* problem: $$\begin{equation}
+ p_l^X = \arg\max_{p} ~ s[~F_\theta \circ T(p_l^T), F_\theta \circ X(p)~]; ~p_l^T \in P^T,
+ \label{eq:s}
+ \end{equation}$$ where $p$ denotes the coordinates of a pixel, $s$ is a similarity function, $F_\theta$ is a feature extractor, and $F_\theta\circ X(p)$ computes the feature map for the image $X$ and then extracts the feature vector at pixel $p$. The maximum value of the similarity function $s$ achieved by $p_l^X$ is denoted by $r_l[T\rightarrow X]$: $$\begin{equation}
+ r_l[T\rightarrow X] = s[~F_\theta \circ T (p^T_l), F_\theta \circ X(p^X_l)~].
+ \label{eq:R0}
+ \end{equation}$$
+
+ The above landmark detection process considers only one template. When there are multiple templates $\{T_1,T_2,\dots,T_M\}$ indexed by $m=1:M$, template matching is implemented by $$\begin{equation}
+ (m_l, p_l^X) = \arg\max_{(m, p)} ~ s[~F_\theta \circ T_m(p_l^T), F_\theta \circ X(p)~],
+ \label{eq:s_multi}
+ \end{equation}$$ which finds the best template $T_{m_l}$ for each landmark $l$ as well as the matched landmark location $p_l^X$. By the same token, the maximum value of the similarity function $s$ achieved by $(m_l, p_l^X)$ is denoted by $\hat{r}_l[\{T_m\}\rightarrow X]$: $$\begin{equation}
+ \hat{r}_l[\{T_m\}\rightarrow X] = s[~F_\theta \circ T_{m_l} (p^T_l), F_\theta \circ X(p^X_l)~].
+ \label{eq:R}
+ \end{equation}$$
+
+ For the choice of *similarity function*, we utilize the commonly used cosine similarity: $$\begin{equation}
+ s[~v^T,v^X~] = CosSim(v^T,v^X) = \frac{\langle v^T, v^X \rangle}{||v^T||_2 \cdot ||v^X||_2},
+ \label{eq:cos}
+ \end{equation}$$ where $v$ is a feature vector.
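Concretely, single-template matching per Eq. ([\[eq:s\]](#eq:s){reference-type="ref" reference="eq:s"}) reduces to an argmax of cosine similarity between the template's landmark feature and every pixel feature of the target image. A minimal numpy sketch (random arrays stand in for $F_\theta$ outputs; the shapes are illustrative assumptions):

```python
import numpy as np

def cos_sim(a, b):
    # Cosine similarity between two feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def match_landmark(feat_T, feat_X, p_T):
    """Find the pixel of X whose feature best matches the template
    feature at landmark p_T; return the location and the score r."""
    v_T = feat_T[p_T]                       # feature vector at the template landmark
    H, W, _ = feat_X.shape
    scores = np.array([[cos_sim(v_T, feat_X[i, j]) for j in range(W)]
                       for i in range(H)])
    idx = np.unravel_index(np.argmax(scores), scores.shape)
    return idx, scores[idx]

rng = np.random.default_rng(0)
feat_T = rng.normal(size=(8, 8, 16))        # template feature map (H, W, C)
feat_X = rng.normal(size=(8, 8, 16))        # target image feature map
feat_X[3, 4] = feat_T[2, 2]                 # plant a perfect match
p_X, r = match_landmark(feat_T, feat_X, (2, 2))
```

The planted pixel wins with a near-1 cosine score; in practice the feature maps come from the self-supervised extractor described below.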
+
+ To single out the best $M$ templates among a set of images $\Omega=\{X_1,X_2,\ldots,X_N\}$, we seek the set of templates $\{T_m\}$ that contains "the most similar" landmark information for all landmarks and with respect to all images: $$\begin{equation}
+ \{{\hat T}_m\} = \arg \max_{ \{T_m\} \subset \Omega}~ \frac{1}{N} \sum_{n} \frac{1}{L} \sum_l {\hat r}_l [ \{T_m\} \rightarrow X_n ]. \label{eq:T}
+ \end{equation}$$ This optimization is *combinatorial in nature*: there are $\binom{N}{M}$ possible combinations, which are nearly impossible to exhaust in practice except for very small $M$. Therefore, we randomly sample a large number (say $10,000$) of combinations and pick the maximizing combination as an approximate solution.
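The random-search approximation can be sketched in a few lines, with `score` standing in for the bracketed average in Eq. ([\[eq:T\]](#eq:T){reference-type="ref" reference="eq:T"}); the sample count and toy data are illustrative assumptions:

```python
import random

def random_search_templates(images, M, score, n_samples=500, seed=0):
    """Approximate the combinatorial argmax by scoring randomly
    sampled size-M subsets and keeping the best one."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(n_samples):
        subset = tuple(sorted(rng.sample(range(len(images)), M)))
        s = score([images[i] for i in subset])
        if s > best_score:
            best, best_score = subset, s
    return best, best_score

# Toy check: with a score that just sums per-image "values",
# the search should recover the indices of the top-M items.
images = [0.1, 0.9, 0.3, 0.8, 0.2, 0.7]
best, s = random_search_templates(images, M=2, score=sum)
```

With a few hundred samples over the $\binom{6}{2}=15$ possible pairs, the search recovers the true maximizer; the paper's setting uses far larger $N$ and more samples.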
+
+ Implementing template selection per Eq. ([\[eq:T\]](#eq:T){reference-type="ref" reference="eq:T"}) assumes knowledge of the landmarks. However, such knowledge does not exist before template selection. Therefore, we propose to use potential key points as substitutes for landmarks. In particular, we use the classical multi-scale detector, SIFT, to find key points, with which landmarks are likely to co-locate.
+
+ For each image $X \in \Omega$, we apply SIFT to get its $K$ key points $Q^X = \{q^X_1, q^X_2, \dots, q^X_K\}$ with the highest responses. Since the SIFT key points of different images are not in correspondence, directly applying Eq. ([\[eq:s_multi\]](#eq:s_multi){reference-type="ref" reference="eq:s_multi"}) is not possible. To address this issue, we perform the template matching in a *reverse order*; that is, for an image $X$ with key points $Q^X$, we solve the following for each key point $q_k^X$: $$\begin{equation}
+ (m_k, q_k^T) = \arg\max_{(m, q)} ~ s[~F_\theta \circ T_m(q), F_\theta \circ X(q_k^X)~],
+ \label{eq:s_multi2}
+ \end{equation}$$ and record the achieved maximum as $$\begin{equation}
+ \hat{r}_k[X \rightarrow \{T_m\} ] = s[~F_\theta \circ T_{m_k} (q^T_k), F_\theta \circ X(q^X_k)~].
+ \label{eq:R2}
+ \end{equation}$$ Finally, we define the average similarity $R$ as the representativeness score of $\{T_m\}$: $$\begin{equation}
+ R[\{T_m\}] = \frac{1}{N} \sum_{n} \frac{1}{K} \sum_k {\hat r}_k [ X_n \rightarrow \{T_m\} ], \label{eq:T2}
+ \end{equation}$$ and Eq. ([\[eq:T\]](#eq:T){reference-type="ref" reference="eq:T"}) is accordingly adapted: $$\begin{equation}
+ \{{\hat T}_m\} = \arg \max_{ \{T_m\} \subset \Omega}~R[\{T_m\}].
+ \end{equation}$$
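The reverse-order score follows directly from Eqs. ([\[eq:s_multi2\]](#eq:s_multi2){reference-type="ref" reference="eq:s_multi2"})--([\[eq:T2\]](#eq:T2){reference-type="ref" reference="eq:T2"}): for each key-point feature of an image, take the best cosine match over all template pixels, then average. A minimal numpy sketch (random features stand in for $F_\theta$ outputs; shapes are illustrative assumptions):

```python
import numpy as np

def representativeness(template_feats, image_keypoint_feats):
    """R[{T_m}]: average, over images and key points, of the best cosine
    similarity between each key-point feature and any template pixel."""
    # Stack every pixel feature of every template into one (P, C) matrix.
    pool = np.concatenate([f.reshape(-1, f.shape[-1]) for f in template_feats])
    pool = pool / (np.linalg.norm(pool, axis=1, keepdims=True) + 1e-8)
    scores = []
    for kps in image_keypoint_feats:        # kps: (K, C) key-point features of one image
        kps = kps / (np.linalg.norm(kps, axis=1, keepdims=True) + 1e-8)
        sim = kps @ pool.T                  # (K, P) cosine similarities
        scores.append(sim.max(axis=1).mean())  # best template match per key point
    return float(np.mean(scores))

rng = np.random.default_rng(1)
templates = [rng.normal(size=(4, 4, 8)) for _ in range(2)]
# An image whose key-point features exactly reuse template features scores ~1.
perfect = templates[0].reshape(-1, 8)[:3]
R = representativeness(templates, [perfect])
```

The candidate group with the highest $R$ is the one retained for annotation.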
+
+ To determine what kind of deep model can support this selection, we start from Eq. ([\[eq:s\]](#eq:s){reference-type="ref" reference="eq:s"}), which rests on maximizing the similarities between the same landmarks from different images.
+
+ Without landmark labels, we resort to contrastive learning, which has proven to be a reliable tool for learning a basic model without using any label information. Here, instead of instance-level self-supervised learning for visual recognition [@DBLP:conf/nips/GrillSATRBDPGAP20; @DBLP:conf/icml/ZbontarJMLD21], we adapt it to the goal of ([\[eq:s\]](#eq:s){reference-type="ref" reference="eq:s"}). We do so by narrowing the distance between different views of an identical patch and extending the distance between different patches. Specifically, we apply the widely used InfoNCE loss [@oord2018representation] to train the feature extractor, $$\begin{equation}
+ \begin{split}
+ \mathcal{L}_{\text{InfoNCE }} &= -\mathbb{E} \left[\log \frac{\exp(\alpha)}{\exp(\alpha)+\sum \exp(\alpha')}\right];\\
+ \alpha &= s[~F_\theta \circ X (p), F_\theta \circ X_{aug} (p)~];\\
+ \alpha' &= s[~F_\theta \circ X(p), F_\theta \circ X(q)~],
+ \end{split}
+ \end{equation}$$ where $X_{aug}$ is a different version of $X$ obtained by augmentation, and $p$ and $q$ are two different key points.
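A minimal numpy sketch of this pixel-wise InfoNCE term (feature vectors are random stand-ins for $F_\theta$ outputs; one positive pair at pixel $p$, with features at other key points acting as negatives):

```python
import numpy as np

def cos(a, b):
    # Cosine similarity between two feature vectors.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

def info_nce(v_p, v_p_aug, v_negs):
    """-log( exp(alpha) / (exp(alpha) + sum_q exp(alpha'_q)) ), with alpha
    the positive-pair similarity and alpha' the negative similarities."""
    alpha = cos(v_p, v_p_aug)
    alphas_neg = np.array([cos(v_p, v_q) for v_q in v_negs])
    return float(-np.log(np.exp(alpha) / (np.exp(alpha) + np.exp(alphas_neg).sum())))

rng = np.random.default_rng(0)
v = rng.normal(size=32)
negs = rng.normal(size=(8, 32))             # features at other key points
# A well-aligned positive pair yields a lower loss than a random pair.
loss_aligned = info_nce(v, v + 0.01 * rng.normal(size=32), negs)
loss_random = info_nce(v, rng.normal(size=32), negs)
```

Minimizing this loss pulls features of the same patch together across augmentations while pushing apart features of different patches, which is exactly what Eq. ([\[eq:s\]](#eq:s){reference-type="ref" reference="eq:s"}) needs.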
+
+ We follow [@yao2021one; @yao2020miss] and construct a deep model trained via a multi-layer pixel-wise contrastive loss as our feature extractor. Following [@yao2021one], we use VGG [@DBLP:journals/corr/SimonyanZ14a] as the backbone, followed by 5 blocks to reduce the dimension. This model is trained with a pixel-wise matching proxy task for over 500 epochs.
+
+ Based on the above discussion, our pipeline is summarized in Figure [3](#fig:overview){reference-type="ref" reference="fig:overview"}, assuming the availability of the feature extractor $F_\theta$, which is learned using the self-supervised pixel-wise matching task. First, we extract features from all images $\Omega$ and candidate templates $\{T_m\}$ for the subsequent operations. Second, we extract key points $Q^X$ with the traditional key point detector SIFT. Third, we pair each image $X_n$ with the template group $\{T_m\}$ to obtain the similarity $r_n$ between $X_n$ and $\{T_m\}$ (Eq. ([\[eq:R2\]](#eq:R2){reference-type="ref" reference="eq:R2"})). The mean similarity $R[\{T_m\}]$ over all pairs of $X_n$ and $\{T_m\}$ indicates the "representativeness" of $\{T_m\}$ for the whole dataset, and the group of templates $\{\hat{T}_m\}$ achieving the maximum mean similarity is our final selection (Eq. ([\[eq:T2\]](#eq:T2){reference-type="ref" reference="eq:T2"})).
+
+ While landmark detection is implemented as template matching in Section [3.1](#sec:selection){reference-type="ref" reference="sec:selection"}, its detection performance is still limited because the feature extractor is geared toward all pixels, not specifically toward the landmarks. We therefore follow [@yao2021one] and improve landmark detection via semi-supervised learning. Another landmark detection deep model, with a heatmap predictor and two offset predictors (offsets along the x- and y-axes), is built and distilled with pseudo landmark labels predicted by the previous template matching model. The distilled model is geared toward the landmarks of interest and performs better than the previous model.
+
+ ![**Visual comparison of templates from our policy and random selection.** The columns "Template/Test 1/Test 2" refer to the templates and two test images. The rows "Ours" and "Random" refer to the template selected by our method and by random selection, respectively. As shown in the [red]{style="color: red"} dashed boxes, our template visually outperforms the randomly selected template. ](Figures/demo_sample_select.png){#fig:demo width="98%"}
2203.02574/main_diagram/main_diagram.drawio ADDED
The diff for this file is too large to render. See raw diff
 
2203.02574/main_diagram/main_diagram.pdf ADDED
Binary file (28.9 kB). View file
 
2203.02574/paper_text/intro_method.md ADDED
@@ -0,0 +1,89 @@
+ # Introduction
+
+ Animators commonly seek to create stylized motions to express the characters' personalities or emotions, thus making characters more lifelike. Since many computer animation techniques are based on motion capture data, the variety and diversity of the motion data play an essential role in the quality of the resulting animation. However, a capture-everything approach scales poorly if there is a need to capture every style, *e.g*., *childlike* or *depressed*, for every motion type. Hence, animators usually capture motion in a neutral style and then stylize it by hand, which is again laborious. This motivates automated methods for stylizing existing motions according to desired target-style labels.
+
+ In this work, we develop a novel motion style transfer framework capable of stylizing streaming input motion data for online applications, which we define as *Online Motion Style Transfer*. As shown in Fig. 1, current motion style transfer methods [1, 5, 17, 19, 38, 54] with deep learning models require a motion segment as input and produce a transferred motion segment as output, where the input
+
+ ![](_page_0_Figure_8.jpeg)
+
+ Figure 1. a) Offline motion style transfer processes motion segments, while b) online motion style transfer processes motions in a stream.
+
+ segment has a minimum duration of 1 s. While these methods make significant progress on a difficult problem, with a subset described as real-time [47, 57], they still suffer from startup latency caused by waiting for the multiple frames required as input. For *online motion style transfer*, only the current frame is processed by the model, which enables direct processing of the stream of motion data. We believe such transfer methods are more suitable for many novel applications requiring streaming motion data. For example, when animating a human avatar, motion is captured online to animate the virtual avatar in real time, and the streaming motion data needs to be processed with minimal latency. Online motion style transfer can also be easily incorporated into the workflows of real-time motion systems, such as games, interactive exhibitions, and augmented reality, with minimal additional latency.
+
+ Motion style transfer is a long-standing research problem due to several difficulties, among them: (1) the lack of a standardized qualitative style representation for motions, (2) the difficulty of handling and generating temporally correlated data, and (3) the lack of temporally registered motion data in different styles. Several approaches [20, 46, 57, 60] aim to solve this problem with manually designed models. However, they often fail to generalize well to large motion datasets with various styles. Researchers have developed more scalable methods with the rapid progress of deep learning machinery [1,6,17,36]. However, only a few of them can transfer the motion to multiple target styles with a unified model [1, 38]. On top of the aforementioned challenges, *online* motion style transfer poses further difficulties because style and content are ill-defined and unrecognizable within one frame, yielding low-quality transfer results. Current offline motion style transfer methods are commonly conditioned on multiple input frames in order to understand the motion semantics, thus realizing better transfer at the cost of introducing non-trivial latency. Although offline methods can be adapted to online settings by padding the current frame with past frames, there is no mechanism to guarantee continuity among output frames.
+
+ To accomplish high-quality, efficient motion style transfer with minimal latency, we embed knowledge of the previous frames in the memory of the motion transfer module in order to infer and track the style and content. The transfer module is thus aware of the context of content and style even when presented with only the current frame. We adapt the Encoder-Recurrent-Decoder (ERD) framework to the online motion style transfer task by designing novel recurrent residual connections to capture features for each style. We name this novel architecture *Style-ERD*. In *Style-ERD*, we enable each residual connection to learn its own initial hidden state $h_0$ conditioned on the style and content label. The learned hidden states are vital to the responsiveness of the style transfer results. In addition, to produce temporally coherent motions, we design a new discriminator with feature and temporal attention, the *FT-Att Discriminator*, to supervise the post-transfer style. As a result, our deep learning model demonstrates a strong capability to perform the desired motion style transfer efficiently and with minimal latency.
+
+ The contributions of this work are as follows: (1) We introduce the online motion style transfer problem and aim to stimulate future research into this area to facilitate real-time animation applications. (2) We present a novel framework, *Style-ERD*, as well as a new supervision module, the *FT-Att Discriminator*, achieving the goal of style-transferring motion with minimal latency. Our style transfer framework provides a 5× reduction in compute time compared with the current state-of-the-art approach. (3) Our method can transfer motion into its stylistic counterpart with high fidelity, showing better style transfer than offline methods.
+
19
+ # Method
20
+
21
+ Our goal is to develop an online motion style transfer algorithm with high-quality transfer and minimal latency. In particular, we seek to reduce the number of input frames needed at each timestep to synthesize the current frame of the target style. However, with fewer input frames, the style transfer model may err in interpreting the style and content. We therefore leverage a recurrent model to maintain relevant estimates of the style and content. Our framework consists of three components: a style transfer module, a style supervision module and a content supervision module. An overview of our method is displayed in Fig. 2.
22
+
23
+ Inspired by the ERD framework [7], we name the style transfer module as *Style-ERD*. It is characterized by multiple recurrent residual connections and by hidden states that have learned initial values which are conditioned on the input. The novel recurrent residual connections play a key role in the success of our method because the memory of past frames provides style and content information regarding the current frame while the residuals capture the features of each style. The *Style-ERD* model achieves the goal of style transfer from each single frame input in real-time.
24
+
25
+ The style transfer module (*Style-ERD*) delivers poor style transfer when used with only a reconstruction loss. Conditioning on style and content before and after the transfer task can boost the style transfer effects. Both supervision modules take multiple frames of motion as input. This multi-frame input to supervision modules does not hinder the online property of our method since the supervision modules are unused at inference time. We propose a novel attention mechanism that spans both feature space and temporal space of the feature maps in the style discriminator, *FT-Att Discriminator*, to enable the style transfer module to avoid mode-collapse issues that would otherwise preclude modeling the desired variety of style and content. The content supervision module adopts the idea of perceptual loss [22] with features that focus on the content of motion.
26
+
27
+ Motion Transfer Module. Our style transfer module, *Style-ERD*, consists of three parts: an encoder E to compress the input frame $x_t$, a recurrent module R with residual connections to learn the offsets of different styles, and a decoder D to map the latent code back to the transferred motion frame $x_t'$, represented by joint rotations.
28
+
29
+ ![](_page_3_Figure_0.jpeg)
30
+
31
+ Figure 3. *FT-Att Discriminator* structure. The discriminator forms the weight matrix via an outer product between two attention vectors, then applies the attention matrix on the extracted features via a Hadamard product.
32
+
33
+ The input frame $x_t$ contains joint rotations in unit quaternions $r_t \in \mathbb{R}^{4 \times J}$ , joint positions offset by the root $p_t \in \mathbb{R}^{3 \times J}$ , and linear joint velocities $v_t \in \mathbb{R}^{3 \times J}$ at timestep t, where J is the number of joints. Additionally, the encoder is conditioned on the style label S and content label C of the input frame $x_t$ while the decoder is conditioned on the target style label, $\hat{S}$ . Both style and content labels are represented by one-hot vectors.
34
+
35
+ The encoder consists of a two-layer MLP (multilayer perceptron) to compress the input to a low-dimensional space, z. We choose to compress the input for two reasons: (1) A low-dimensional latent space can simplify capturing an abstracted style representation; and (2) With this low-dimensional bottleneck and the given training tasks, the encoder can normalize the style of the input frame to neutral.
36
+
37
+ The recurrent module is designed as a stack of LSTM layers, i.e., $R = [r_0, r_1, \ldots, r_{n_S}]$ , with each one learning a style offset of one specific style with respect to the neutral style. Here, we assign the recurrent branch $r_0$ to learn the features of the neutral style, which serves as the basis for all other style offsets. Then, the residual value computed by the target style branch $r_{\hat{S}}(z_t)$ is added to the neutral branch output, $r_0(z_t)$ . Thus, the operation of our recurrent module can be expressed as: $z_t' = r_0(z_t) + r_{\hat{S}}(z_t)$
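The residual combination above can be sketched in a few lines of numpy. This is illustrative only: the paper's branches $r_0, r_1, \ldots, r_{n_S}$ are LSTM layers with their own hidden states, whereas here each branch is a toy linear map, which is enough to show how the target-style residual is added to the neutral-branch output.

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim, n_styles = 8, 3

# Toy stand-ins for the branches: index 0 is the neutral branch r_0,
# indices 1..n_styles are the style branches (the paper uses LSTMs).
W = [rng.standard_normal((latent_dim, latent_dim)) for _ in range(n_styles + 1)]

def recurrent_module(z_t, target_style):
    """z'_t = r_0(z_t) + r_{S_hat}(z_t): neutral basis plus style offset."""
    neutral = W[0] @ z_t            # r_0: shared neutral-style basis
    offset = W[target_style] @ z_t  # r_{S_hat}: style-specific residual
    return neutral + offset

z_t = rng.standard_normal(latent_dim)
z_out = recurrent_module(z_t, target_style=2)
```

Because the neutral branch is always evaluated, every stylized output stays anchored to a shared representation, and each style branch only has to learn a deviation from it.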
38
+
39
+ In addition, it is challenging to perform style transfer on the first few frames because the LSTM layers may not yet have seen enough frames to infer the necessary style information. Common ways to initialize hidden states include setting them to zeros [31] or to random noise [63], or treating the initial hidden states as parameters for the network to learn [13]. To enhance performance on the first few frames, we propose learning multiple initial states $h_0 = [h_{0_0}, h_{0_1}, \ldots, h_{0_{n_S}}]$ conditioned on the style label S and the content label C. Specifically, assuming there are $n_C$ different content labels, the neutral branch $r_0$ learns $n_C$ initial hidden states simultaneously and selects the one corresponding to the content label.
40
+
41
+ Similarly, each style branch learns its own initial hidden state for its corresponding style.
42
+
43
+ The conditional decoder D expands the latent code $z_t'$ back to joint rotations in quaternions and linear joint velocities through four MLP layers, further conditioned on the target style label $\hat{S}$ . Joint positions are also computed via forward kinematics of joint rotations.
44
+
45
+ Style Supervision Module. We propose a novel discriminator $\mathcal{D}_{\mathcal{S}}$ with an attention mechanism, *FT-Att Discriminator*, to supervise the style transfer task. Fig. 3 shows the structure of the *FT-Att Discriminator*. Unlike the style transfer module, our discriminator receives a segment of T frames, represented by joint positions $p_T$ and velocities $v_T$, as input.
46
+
47
+ The discriminator attempts to distinguish generated motions from real motion samples according to style labels S and content labels C. We adopt a 1D temporal convolution structure similar to [1, 32] to extract the 2D feature matrix $m_s \in \mathbb{R}^{C' \times T'}$, but add novel attention modules conditioned on style and content. The intuition behind the attention modules is that the discriminator should judge motion style according to the desired style and its content by weighing the features unevenly. The attention modules consist of MLP layers that take the style and content labels as input and output a feature attention vector $w_f \in \mathbb{R}^{C'}$ and a temporal attention vector $w_t \in \mathbb{R}^{T'}$. We then compute an outer product between the feature attention $w_f$ and the temporal attention $w_t$ to form a weight matrix $w_s \in \mathbb{R}^{C' \times T'}$: $w_s = w_f \otimes w_t$. Finally, the weight matrix $w_s$ is applied to the feature matrix $m_s$ via a Hadamard product, i.e., element-wise multiplication. Thus, given the feature map $m_s$ and weight matrix $w_s$, the output of the discriminator can be expressed as:
48
+
49
+ $$\mathcal{D}_S(p_T, v_T | S, C) = \sum_{i=1}^{C'} \sum_{j=1}^{T'} (m_s \circ w_s)[i, j]$$
50
+ (1)
51
+
52
+ Content Supervision Module. At the same time as transferring style, we expect the content of the motion to be unaltered. We apply perceptual loss [22] based on a pre-trained content-classification network $\mathcal{D}_C$ to preserve content. The classification network follows the same convolution layers as the discriminator while accepting joint rotations, joint positions and velocities as input. Inspired by the style normalization effects of IN [1,21], each convolution layer is followed by IN such that the classification network focuses on the motion content and disregards the style.
53
+
54
+ The training process is analogous to the training of a standard Generative Adversarial Network [11]. As a generator, the proposed style transfer module *Style-ERD* is trained to reconstruct the input frame and to stylize it to the target style to fool the discriminator, while the objective of the *FT-Att Discriminator* is to distinguish the transferred motions from real data samples.
55
+
56
+ We add a perceptual loss and further adopt a gradient penalty in the discriminator to improve the overall training process. For simplicity and clarity, we use the notation $(\cdot)'$ to indicate attributes of the transferred results.
57
+
58
+ Reconstruction. Given a motion input $x_t$ and a target style label $\hat{S}$ identical to the original style, the motion transfer module should output an identical frame $x_t'' = [r_t'', p_t'', v_t'']$. This reconstruction task can be viewed as an auxiliary task to learn a disentangled style variation for each residual branch. The reconstruction loss is applied over the joint rotations in quaternions $r_t$, translational joint positions $p_t$, and velocities $v_t$:
59
+
60
+ $$\mathcal{L}_{quat}(r_t, r_t'') = \|\cos^{-1}(|r_t \cdot r_t''|)\|^2,$$
61
+ (2)
62
+
63
+ $$\mathcal{L}_{rec_{t}} = \mathcal{L}_{quat}(r_{t}, r_{t}'') + \frac{1}{2} \|p_{t} - p_{t}''\|^{2} + \|v_{t} - v_{t}''\|^{2},$$
64
+ (3)
65
+
66
+ where $\mathcal{L}_{quat}$ denotes the quaternion difference, represented by the angle between two rotations in radians. More details about $\mathcal{L}_{quat}$ can be found in the supplementary material.
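A rough numpy sketch of Eqs. (2)-(3) (not the authors' code; the angle formula is implemented exactly as written in Eq. (2), and the absolute value handles the fact that $q$ and $-q$ encode the same rotation):

```python
import numpy as np

def quat_loss(r, r_pred):
    """L_quat (Eq. 2): squared angle between unit quaternions, summed over joints."""
    dots = np.abs(np.sum(r * r_pred, axis=0))     # |r_t . r''_t| per joint
    angles = np.arccos(np.clip(dots, -1.0, 1.0))  # angle in radians
    return np.sum(angles ** 2)

def rec_loss(r, p, v, r_pred, p_pred, v_pred):
    """L_rec_t (Eq. 3): quaternion, position, and velocity terms."""
    return (quat_loss(r, r_pred)
            + 0.5 * np.sum((p - p_pred) ** 2)
            + np.sum((v - v_pred) ** 2))

J = 4                                                         # number of joints
r = np.tile(np.array([[1.0], [0.0], [0.0], [0.0]]), (1, J))   # identity rotations
p = np.zeros((3, J)); v = np.zeros((3, J))
```

A perfect reconstruction (identical rotations, positions, and velocities) yields zero loss, and negating all quaternions leaves the quaternion term unchanged.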
67
+
68
+ Style Transfer. We adopt the Least Squares Generative Adversarial Networks (LSGAN) [34] framework to train the *FT-Att Discriminator*. We assume that neutral-style motion serves as a common basis for the other styles. At training time, we set the target style of every neutral motion to any other existing style in the dataset, while motions in all other styles are transferred to the neutral style. With these training objectives, we expect the encoder E to normalize the input motion to a neutral style. The adversarial loss is therefore applied to manipulate the style of the motion by fooling the critic. At the same time, the critic is trained to distinguish the generated motion from real motion samples:
69
+
70
+ $$\mathcal{L}_{adv} = \left\| \mathcal{D}_s(p_T', v_T' | \hat{S}, C) \right\|^2, \tag{4}$$
71
+
72
+ $$\mathcal{L}_{cri} = \|\mathcal{D}_s(p_T, v_T | S, C) - 1\|^2 + \|\mathcal{D}_s(p_T', v_T' | \hat{S}, C) + 1\|^2.$$
73
+ (5)
74
+
75
+ Gradient Penalty. GAN training is known to suffer from instability and convergence issues, with multiple approaches proposed to address this issue [15, 16, 23, 37, 44]. In this work, we apply a gradient penalty on the real samples to prevent the discriminator from creating a non-zero gradient orthogonal to the data manifold when the generator produces the true data distribution [37]:
76
+
77
+ $$\mathcal{L}_{gp} = \Big\| \nabla_{\hat{x}} \mathcal{D}_s(\hat{x}) \big|_{\hat{x} = (p_T, v_T | S, C)} \Big\|^2 \tag{6}$$
78
+
79
+ Perceptual Loss. In order to preserve content before and after the transfer, we add a perceptual loss [22] $\mathcal{L}_{per}$ to the generator with a pretrained multi-class content-classification network $\mathcal{D}_C$. The perceptual loss encourages the convolutional feature maps $\phi$ extracted by the classification network before and after the transfer to match:
80
+
81
+ $$\mathcal{L}_{per} = \left\| \phi - \phi' \right\|^2. \tag{7}$$
82
+
83
+ The final loss applied to the motion style transfer module (generator) is a weighted sum of the reconstruction, adversarial, and perceptual losses, while a gradient penalty is added to the discriminator loss:
84
+
85
+ $$\mathcal{L}_{gen} = \sum_{t=0}^{T} \mathcal{L}_{rec_t} + w_{adv} \mathcal{L}_{adv} + w_{per} \mathcal{L}_{per}, \quad (8)$$
86
+
87
+ $$\mathcal{L}_{dis} = \mathcal{L}_{cri} + w_{gp} \mathcal{L}_{gp}, \tag{9}$$
88
+
89
+ where we set $w_{adv} = 1$, $w_{per} = 0.1$, and $w_{gp} = 128$.
2203.10321/main_diagram/main_diagram.drawio ADDED
The diff for this file is too large to render. See raw diff
 
2203.10321/paper_text/intro_method.md ADDED
@@ -0,0 +1,17 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Introduction
2
+
3
+ A knowledge graph (KG) is a multi-relational graph where the nodes are entities from the real world (e.g. *Barack Obama, United States*) and the named edges represent the relationships between them (e.g. *Barack Obama - born in - United States*). KGs can be either domain-specific such as WikiMovies [@miller2016keyvalue] or public, cross-domain KGs encoding common knowledge such as Wikidata and DBpedia [@DBLP:journals/corr/abs-2003-00719]. These graph-structured databases play an important role in knowledge-intensive applications including web search, question answering and recommendation systems [@kg_survey_paper].
4
+
5
+ Most real-world knowledge graphs are incomplete. However, some missing facts can be inferred using existing facts in the KG [@bordes2013translating]. This task termed knowledge graph completion (KGC)[^2] has become a popular area of research in recent years [@kge_survey] and is often approached using knowledge graph embedding (KGE) models. KGE models represent each entity and relation of the KG by a dense vector embedding. Using these embeddings the model is trained to distinguish correct from incorrect facts. One of the main downstream applications of KGEs is question answering over incomplete KGs (KGQA) [@kge_application_survey].
6
+
7
+ Taking into account the large size of real-world KGs (Wikidata contains $\approx$ 90M entities) and the applicability to downstream tasks, KGE models should fulfill the following desiderata: (i) *scalability* -- i.e. have model size and inference time independent of the number of entities, (ii) *quality* -- reach good empirical performance, (iii) *versatility* -- be applicable for multiple tasks such as KGC and QA, and (iv) *simplicity* -- consist of a single module with a standard architecture and training pipeline. Traditional KGE models fulfill quality and simplicity. They build upon a simple architecture and reach a high quality in terms of KGC. However, as they create a unique embedding per entity/relation, they scale linearly with the number of entities in the graph, both in model size and inference time, and offer limited versatility. Methods such as DKRL [@Xie_Liu_Jia_Luan_Sun_2016] and KEPLER [@wang2021KEPLER] attempt to tackle the scalability issue using compositional embeddings. However, they fail to achieve quality comparable to conventional KGEs. KG-BERT [@kg-bert] utilizes pretrained BERT for link prediction and holds potential in terms of versatility as it is applicable to downstream NLP tasks. However, it is not scalable due to its underlying cross-encoder.[^3] QA methods which leverage KGEs outperform traditional KGQA approaches on incomplete KGs, but combining KGEs with the QA pipeline is a non-trivial task; models that attempt to do this often work on only limited query types (@huang2019knowledge; @sun2021faithful; @saxena2020improving) or require multi-stage training and inference pipelines [@ren2021lego]. Here, in order to achieve quality, these models have sacrificed versatility and simplicity. A comparison of approaches in terms of desiderata is summarized in Tab. [\[tab:desiderata\]](#tab:desiderata){reference-type="ref" reference="tab:desiderata"} in the appendix.
8
+
9
+ Our paper shows that all of these desiderata can be fulfilled by a simple sequence-to-sequence (seq2seq) model. To this end, we pose KG link prediction as a seq2seq task and train an encoder-decoder Transformer model [@vaswani2017attention] on this task. We then use this model pretrained for link prediction and further finetune it for question answering; while finetuning for QA, we regularize with the link prediction objective. This simple but powerful approach, which we call KGT5, is visualised in Fig. [1](#fig:kgt5-main){reference-type="ref" reference="fig:kgt5-main"}. With such a unified seq2seq approach we achieve (i) scalability -- by using compositional entity representations and autoregressive decoding (rather than scoring all entities) for inference (ii) quality -- we obtain state-of-the-art performance on two tasks (iii) versatility -- the same model can be used for both KGC and KGQA on multiple datasets, and (iv) simplicity -- we obtain all results using an off-the-shelf model with no task or dataset-specific hyperparameter tuning.
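Posing link prediction as a seq2seq task amounts to verbalizing a (head, relation, ?) query as text and training the model to decode the missing entity's surface form. A minimal sketch of such a verbalizer; the exact prompt format and the `verbalize_link_prediction` helper are illustrative assumptions, not the paper's published preprocessing:

```python
def verbalize_link_prediction(entity, relation, direction="tail"):
    """Turn a KG query into source text for a seq2seq model.
    The target side would be the missing entity's name, decoded token by token.
    Prompt format here is hypothetical, chosen only for illustration."""
    if direction == "tail":
        return f"predict tail: {entity} | {relation}"
    return f"predict head: {entity} | {relation}"

src = verbalize_link_prediction("barack obama", "born in")
# An encoder-decoder Transformer (e.g. a T5-small architecture) trained on such
# pairs would decode a target string such as "united states" for this input.
```

Because the model decodes entity names autoregressively instead of scoring every entity embedding, inference cost no longer grows with the number of entities in the graph.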
10
+
11
+ In summary, we make the following contributions:
12
+
13
+ - We show that KG link prediction and question answering can be treated as sequence-to-sequence tasks and tackled successfully with a single encoder-decoder Transformer (with the same architecture as T5-small [@raffel2020exploring]).
14
+
15
+ - With this simple but powerful approach called KGT5, we reduce model size for KG link prediction by up to 98% while outperforming conventional KGEs on a dataset with 90M entities.
16
+
17
+ - We show the versatility of this approach through the task of KGQA over incomplete graphs. By pretraining on KG link prediction and finetuning on QA, KGT5 performs similarly to or better than much more complex methods on multiple large-scale KGQA benchmarks.
2203.14675/main_diagram/main_diagram.drawio ADDED
The diff for this file is too large to render. See raw diff
 
2203.14675/paper_text/intro_method.md ADDED
@@ -0,0 +1,116 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ # Introduction
2
+
3
+ Person re-identification (re-ID) aims to retrieve a person corresponding to a given query across disjoint camera views or different time stamps [\[59,](#page-9-0) [69\]](#page-10-0). Thanks to the discriminative power of deep neural networks, supervised approaches [\[20–](#page-8-0)[22,](#page-8-1)[53\]](#page-9-1) have achieved impressive performance in this task. Unfortunately, they require a large amount of labeled data that demands costly annotations, limiting their practicality in large-scale real-world re-ID problems. Due to this issue, unsupervised methods that learn the discriminative features for person retrieval from unlabeled data have recently received much attention.
4
+
5
+ Prior works on unsupervised person re-ID have utilized pseudo-labels obtained by k-nearest neighbor search [\[25,](#page-8-2) [47,](#page-9-2)[60\]](#page-9-3) or unsupervised clustering [\[7,](#page-8-3)[24\]](#page-8-4) for training. These approaches alternate a two-stage training scheme: the label generation phase that assigns pseudo-labels and the training phase that trains a model with the generated labels. Among these approaches, the clustering-based methods [\[2,](#page-8-5) [9\]](#page-8-6) have especially demonstrated their effectiveness with state-of-the-art performance. However, inherent noise in pseudo-labels significantly hinders the performance of these unsupervised methods.
6
+
7
+ To tackle this problem, many efforts have been made to improve the accuracy of pseudo-labels by performing robust clustering [\[9,](#page-8-6) [62\]](#page-9-4) or pseudo-label refinement [\[25,](#page-8-2) [64\]](#page-9-5). Recent techniques [\[8,](#page-8-7) [63\]](#page-9-6) significantly reduce the label noise through the model ensemble in a peer-teaching manner by using predictions from an auxiliary network as refined labels for the target network. Nevertheless, training multiple backbones as teacher networks (*e.g.*, dual ResNet in MMT [\[8\]](#page-8-7), and single DenseNet, ResNet, and Inception-v3 in MEB-Net [\[63\]](#page-9-6)) requires high computational costs. Furthermore, labels refined by these methods consider only global features and neglect the fine-grained clues essential to person re-ID, leading to insufficient performance.
8
+
9
+ To address the aforementioned problems, we propose *Part-based Pseudo Label Refinement (PPLR)*, a novel unsupervised re-ID framework that effectively handles the label noise using part features in a self-teaching manner. Several studies [\[42,](#page-9-7)[67\]](#page-9-8) demonstrate that the fine-grained information from part features improves the re-ID performance. Our key idea is that this fine-grained information can provide not only useful cues for better representation learning but also robustness against label noises. In contrast to the global-shape information that has large variations due to significant changes in poses and viewpoints, part features can capture the local-texture information that provides a more crucial clue to re-identifying a person [\[70\]](#page-10-1).
10
+
11
+ We argue that the complementary relationship between the global and part features can be used to refine the label noise in each of their feature spaces.
12
+
13
+ <span id="page-1-0"></span>However, some of the global and part features from the same image capture very different semantic information, and using the complementary relationship naïvely can result in noisy and even incorrect information. For instance, images may contain irrelevant parts (*e.g.,* occlusions or backgrounds) that provide unreliable complementary information, and it is desirable to exclude them from training. Therefore, it is essential to identify whether the information of the global and part features is mutually reliable in order to properly exploit their complementary relationship. To address this issue, we design a cross agreement score based on the similarity between the k-nearest neighbors of global and part features. Based on the cross agreement, we propose two pseudo-label refinement methods – *part-guided label refinement (PGLR)* and *agreement-aware label smoothing (AALS)*. PGLR refines the pseudo-labels of global features by aggregating the predictions of part features, guiding the global features to learn from rich local contexts. AALS refines the pseudo-labels of part features by smoothing the label distributions, thus calibrating the predictions of part features.
14
+
15
+ Our contributions can be summarized as follows:
16
+
17
+ - We propose a part-based pseudo-label refinement framework that operates in a self-ensemble manner without auxiliary networks. To the best of our knowledge, this is the first work to handle the label noise using the part feature information for person re-ID.
18
+ - We design a cross agreement score to capture reliable complementary information, which is computed by the similarity between the k-nearest neighbors of the global and part features.
19
+ - Extensive experimental results with superior performance against the state-of-the-art methods demonstrate the effectiveness of the proposed method.
20
+
21
+ # Method
22
+
23
+ We propose a Part-based Pseudo Label Refinement (PPLR) framework that exploits the complementary relationship between the global and part features to tackle the label noise problem.
24
+
25
+ <span id="page-2-4"></span><span id="page-2-0"></span>![](_page_2_Figure_0.jpeg)
26
+
27
+ Figure 1. The illustration of PPLR. Our method alternates the clustering stage and the training stage. (a) In the clustering stage, we assign pseudo-labels by clustering the global features on the unlabeled dataset. We then perform a k-nearest neighbor search on each feature space and compute the cross agreement score based on the similarity between the top-k ranked lists of the global and part features. (b) In the training stage, we train the model with refined pseudo-labels based on the cross agreement score. We smooth the labels of part features according to the cross agreement score of each part and refine the labels of global features by aggregating the part features' predictions.
28
+
29
+ Following the existing clustering-based methods [24,61,64], our method alternates the clustering stage and the training stage. In the clustering stage, we extract global and part features and assign pseudo-labels through global feature clustering. We then compute a cross agreement score for each sample based on the similarity between *k*-nearest neighbors of global and part features. In the training stage, we mitigate the label noise using the proposed pseudo-label refinement methods based on the cross agreement: agreement-aware label smoothing (AALS) for part features and part-guided label refinement (PGLR) for global features. The features from the trained model are then used in the next clustering stage to update the pseudo-labels. The overall framework is illustrated in Fig. 1.
30
+
31
+ We first present a part-based unsupervised person re-ID framework that utilizes fine-grained information of the part features. Contrary to most existing unsupervised approaches that exploit only the global feature, we use both the global and part features to represent an image.
32
+
33
+ Formally, let $\mathcal{D}=\{x_i\}_{i=1}^{N_{\mathcal{D}}}$ denote the unlabeled training dataset, where $x_i$ is an image and $N_{\mathcal{D}}$ is the number of images. Our model first extracts the shared representation $F_{\theta}(x_i) \in \mathbb{R}^{C \times H \times W}$ , where C, H, and W are sizes of the channel, height, and width of the feature map, respectively. Given this feature map, the global feature $\mathbf{f}_i^g$ is obtained by applying global average pooling over the feature map, while the part features $\{\mathbf{f}_i^{p_n}\}_{n=1}^{N_p}$ are obtained by dividing the feature map into $N_p$ uniformly partitioned regions $\mathbb{R}^{C \times \frac{H}{N_p} \times W}$ and applying average pooling on each region.
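The global and part pooling described above can be sketched with plain numpy on a toy feature map (a rough illustration, not the authors' code; the real $F_\theta$ is a CNN backbone):

```python
import numpy as np

def global_and_part_features(feat_map, n_parts):
    """Split a C x H x W feature map into n_parts horizontal strips:
    the global feature averages over all of H x W, while each part
    feature averages over one H/n_parts x W strip."""
    C, H, W = feat_map.shape
    assert H % n_parts == 0, "H must be divisible by the number of parts"
    f_g = feat_map.mean(axis=(1, 2))                     # global feature, (C,)
    strips = feat_map.reshape(C, n_parts, H // n_parts, W)
    f_parts = strips.mean(axis=(2, 3)).T                 # (n_parts, C)
    return f_g, f_parts

feat = np.arange(2 * 4 * 3, dtype=float).reshape(2, 4, 3)  # toy C=2, H=4, W=3 map
f_g, f_parts = global_and_part_features(feat, n_parts=2)
```

Since the strips are equal-sized, the average of the part features recovers the global feature, which makes the two representations directly comparable.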
34
+
35
+ To learn these representations without a label, we simulate the pseudo-labels based on clustering results. Following the part-based approaches [42, 48, 67] in the literature, we adopt the standard protocol where both the global and part features share the same pseudo-labels. We perform DBSCAN clustering [6] on the global feature set $\{\mathbf{f}_i^g\}_{i=1}^{N_D}$ and use the cluster assignment as pseudo-labels. We denote the pseudo-label for the image $x_i$ as $y_i \in \mathbb{R}^K$ , which is the one-hot encoding of the hard assignment with K clusters.
36
+
37
+ The pseudo-labels are then used to train the global and part features for person identification. For global features, we compute the cross-entropy loss by:
38
+
39
+ <span id="page-2-2"></span>
40
+ $$\mathcal{L}_{gce} = -\sum_{i=1}^{N_{\mathcal{D}}} y_i \cdot \log(q_i^g), \tag{1}$$
41
+
42
+ where $q_i^g = h_{\phi_g}(\mathbf{f}_i^g) \in \mathbb{R}^K$ is the prediction vector by the global feature, and $h_{\phi_g}(\cdot)$ is the global feature classifier consisting of a fully connected layer and a softmax function. Similarly, we train the part features using the cross-entropy loss by:
43
+
44
+ <span id="page-2-1"></span>
45
+ $$\mathcal{L}_{pce} = -\frac{1}{N_p} \sum_{i=1}^{N_D} \sum_{n=1}^{N_p} y_i \cdot \log(q_i^{p_n}), \tag{2}$$
46
+
47
+ where $q_i^{p_n} = h_{\phi_{p_n}}(\mathbf{f}_i^{p_n}) \in \mathbb{R}^K$ indicates the prediction vector by the n-th part feature space $p_n$ , and $h_{\phi_{p_n}}$ is the classifier for the part feature space $p_n$ . We additionally utilize the softmax-triplet loss defined by:
48
+
49
+ $$\mathcal{L}_{triplet} = -\sum_{i=1}^{N_{\mathcal{D}}} \log \left( \frac{e^{\|\mathbf{f}_i^g - \mathbf{f}_{i,n}^g\|}}{e^{\|\mathbf{f}_i^g - \mathbf{f}_{i,p}^g\|} + e^{\|\mathbf{f}_i^g - \mathbf{f}_{i,n}^g\|}} \right), \quad (3)$$
50
+
51
+ <span id="page-3-2"></span>where $\|\cdot\|$ denotes the $L_2$ -norm, and the subscripts (i,p) and (i,n) respectively denote the hardest positive and negative samples of the image $x_i$ in a mini-batch obtained by the hard-batch triplet selection [14]. Following the recent studies [2,55] that utilize camera labels to improve the discriminability across camera views, we can optionally employ a camera-aware proxy [50] if the camera labels are available. We compute the camera-aware proxy $\mathbf{c}_{(a,b)}$ as the centroid of the features that have the same camera label a and belong to the same cluster b. We then compute the inter-camera contrastive loss [50] as:
52
+
53
+ $$\mathcal{L}_{cam} = -\sum_{i=1}^{N_{\mathcal{D}}} \frac{1}{|\mathcal{P}_i|} \sum_{j \in \mathcal{P}_i} \log \frac{\exp(\mathbf{c}_j^{\top} \mathbf{f}_i^g / \tau)}{\sum_{k \in \mathcal{P}_i \cup \mathcal{Q}_i} \exp(\mathbf{c}_k^{\top} \mathbf{f}_i^g / \tau)}, \quad (4)$$
54
+
55
+ where $\mathcal{P}_i$ and $\mathcal{Q}_i$ are the index sets of the positive and hard negative camera-aware proxies for $\mathbf{f}_i^g$ , and $\tau$ is the temperature parameter. This loss pulls together the proxies that are within the same cluster but in different cameras, reducing the intra-class variance caused by disjoint camera views. The training objective is then given by:
56
+
57
+ $$\mathcal{L} = \mathcal{L}_{gce} + \mathcal{L}_{pce} + \mathcal{L}_{triplet} + \lambda_{cam} \mathcal{L}_{cam}, \qquad (5)$$
58
+
59
+ where $\lambda_{cam}$ is the weight parameter that controls the importance of the inter-camera contrastive loss.
60
+
61
+ Ideally, the model can learn both the holistic and local features by sharing the common representation $F_{\theta}(\cdot)$ . However, its performance is inherently bounded by the quality of the pseudo-label $y_i$ , which is significantly noisy in practice. In the following sections, we propose a method to refine such noisy labels for properly representing both features.
62
+
63
+ Contrary to our basic framework, PPLR trains the model with refined pseudo-labels that consider the complementary relationship between global and part features. Nevertheless, there exists unreliable complementary information due to differences in feature similarity structures between global and part features. As shown in Fig. 2, some part features contain information irrelevant to a person and are not suitable for refining pseudo-labels of global features. Furthermore, global features consider only the global context and sometimes neglect information relevant to part features. Therefore, identifying whether the given complementary information is reliable is an essential task for our method.
64
+
65
+ To address this issue, we design a cross agreement score that captures how reciprocally similar the k-nearest neighbors of global and part features are. We define the cross agreement score as the Jaccard similarity between the k-nearest neighbors of the global and part features. We first perform a k-nearest neighbor search on the global and each of the part feature spaces independently to produce $(1+N_p)$ ranked lists for each image.
66
+
67
+ <span id="page-3-1"></span>![](_page_3_Figure_10.jpeg)
68
+
69
+ Figure 2. The t-SNE [46] visualization of each feature space on Market-1501 at an early training epoch. Different bounding box colors represent different IDs. Each feature space shows a different feature distribution with different semantic parts of a person, and some feature information can be unreliable. For instance, less discriminative parts denoted by a circle provide irrelevant information to their counterparts in other feature spaces and vice versa. Our cross agreement score identifies such noisy information by comparing the k-nearest neighbors between feature spaces.
70
+
71
+ We then compute the cross agreement score between the global feature space g and the n-th part feature space $p_n$ for the image $x_i$ by:
72
+
73
+ $$C_i(g, p_n) = \frac{|\mathcal{R}_i(g, k) \cap \mathcal{R}_i(p_n, k)|}{|\mathcal{R}_i(g, k) \cup \mathcal{R}_i(p_n, k)|} \in [0, 1], \quad (6)$$
74
+
75
+ where $\mathcal{R}_i(g, k)$ and $\mathcal{R}_i(p_n, k)$ are the sets of indices for the top-k samples in the ranked list computed by $\mathbf{f}_i^g$ and $\mathbf{f}_i^{p_n}$ , respectively, and $|\cdot|$ is the cardinality of a set.
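Eq. (6) is straightforward to sketch in numpy: rank all samples by distance to the query in each feature space, take the top-k index sets, and compute their Jaccard similarity. The random features below are placeholders for real global and part embeddings.

```python
import numpy as np

def cross_agreement(feats_g, feats_p, i, k):
    """C_i(g, p_n) (Eq. 6): Jaccard similarity between the top-k neighbor
    index sets of sample i in the global and one part feature space."""
    def topk_ids(feats):
        d = np.linalg.norm(feats - feats[i], axis=1)  # distances to sample i
        d[i] = np.inf                                 # exclude the query itself
        return set(np.argsort(d)[:k])
    R_g, R_p = topk_ids(feats_g), topk_ids(feats_p)
    return len(R_g & R_p) / len(R_g | R_p)

rng = np.random.default_rng(0)
feats_g = rng.standard_normal((10, 4))  # toy global features, one row per image
feats_p = rng.standard_normal((10, 4))  # toy features of one body part
c = cross_agreement(feats_g, feats_p, i=0, k=3)
```

When the two feature spaces induce identical rankings the score is 1, and it decays toward 0 as their neighborhoods diverge, matching the reliability interpretation in the text.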
76
+
77
+ Intuitively, a high cross agreement score $C_i(g,p_n)$ implies that the feature spaces of g and $p_n$ have highly correlated feature similarity structure around the data point i and provide reliable complementary information. On the other hand, a low $C_i(g,p_n)$ implies that the global and part features are not very correlated, meaning that they can provide unreliable information to each other. Our cross agreement score is designed in the same spirit as recent re-ranking techniques [15, 34, 56, 71] that utilize a reciprocity check of k-nearest neighbors to handle the similarity noise in the affinity matrix to improve retrieval performance.
78
+
79
+ Based on the cross agreement scores, we alleviate the label noise by considering (1) whether the pseudo-labels from global feature clustering are suitable for each part feature and (2) whether the predictions of part features are appropriate for refining the pseudo-labels of global features.
80
+
81
+ <span id="page-3-0"></span><sup>&</sup>lt;sup>1</sup>More details can be found in the appendix.
82
+
83
+ <span id="page-4-3"></span>
84
+
85
+ **Agreement-aware label smoothing.** Learning all part features with the same global pseudo-label that neglects the local context of parts can be detrimental to the model training. For instance, some parts contain cues irrelevant to a person (*e.g.*, occlusions), and it is desirable to exclude them from the training. To address this issue, we utilize a label smoothing [27, 43] to refine the pseudo-label of each part depending on the corresponding cross agreement score.
86
+
87
+ Given the pseudo-label $y_i$ of the image $x_i$ , the label smoothing for the part feature $\mathbf{f}_i^{p_n}$ is formulated as:
88
+
89
+ $$\tilde{y}_{i}^{p_{n}} = \alpha_{i}^{p_{n}} y_{i} + (1 - \alpha_{i}^{p_{n}}) u, \tag{7}$$
90
+
91
+ where u is a uniform vector, and $\alpha_i^{p_n}$ is a weight determining the strength of label smoothing. Contrary to conventional label smoothing that employs a constant weight for $\alpha_i^{p_n}$, we dynamically adjust the weight for each part using the cross agreement score (i.e., $\alpha_i^{p_n} = \mathcal{C}_i(g, p_n)$) that reflects the reliability of the global clustering result for each part. We then plug the refined pseudo-labels $\tilde{y}_i^{p_n}$ into Eq. (2), and the cross-entropy loss is reformulated with Kullback-Leibler (KL) divergence [31] as:
92
+
93
+ $$\mathcal{L}_{aals} = \frac{1}{N_p} \sum_{i=1}^{N_D} \sum_{n=1}^{N_p} (\alpha_i^{p_n} H(y_i, q_i^{p_n}) + (1 - \alpha_i^{p_n}) D_{\text{KL}}(u \parallel q_i^{p_n})), \quad (8)$$
94
+
95
+ where $H(\cdot, \cdot)$ and $D_{\mathrm{KL}}(\cdot \parallel \cdot)$ are cross-entropy and KL divergence, respectively, and two terms are balanced by $\alpha_i^{p_n}$ with the value of the cross agreement score $\mathcal{C}_i(g, p_n)$ .
96
+
97
+ In Eq. (8), the former term drives the prediction to high confidence close to $y_i$ , and the latter term encourages the prediction to collapse into a uniform vector. By scaling the two opposite terms with the cross agreement scores, we calibrate the prediction of part features according to the reliability of pseudo-labels for each part.
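As a rough illustration (not the authors' implementation), Eq. (8) can be computed with plain loops; the array shapes and variable names below are assumptions:

```python
# Sketch of the agreement-aware label smoothing loss (Eq. 8).
import numpy as np

def aals_loss(q_parts, y, c_scores):
    """q_parts: (N_p, N_D, C) part-feature predictions (each row sums to 1),
    y: (N_D, C) one-hot global pseudo-labels,
    c_scores: (N_p, N_D) cross agreement scores alpha_i^{p_n} in [0, 1]."""
    n_parts, n_imgs, n_classes = q_parts.shape
    u = np.full(n_classes, 1.0 / n_classes)   # uniform vector u
    eps = 1e-12                               # numerical safety for log
    total = 0.0
    for n in range(n_parts):
        for i in range(n_imgs):
            a = c_scores[n, i]
            q = q_parts[n, i] + eps
            ce = -np.sum(y[i] * np.log(q))    # H(y_i, q_i^{p_n})
            kl = np.sum(u * np.log(u / q))    # D_KL(u || q_i^{p_n})
            total += a * ce + (1.0 - a) * kl
    return total / n_parts                    # 1/N_p scaling as in Eq. (8)
```

When $\alpha_i^{p_n} = 1$ the loss reduces to plain cross-entropy against the global pseudo-label; when it is 0, the part is pushed toward a uniform prediction and effectively excluded from training.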
98
+
99
+ **Part-guided label refinement.** We propose a part-guided label refinement that generates refined labels for global features using the predictions of part features. The part feature information with rich local contexts can be used to handle the label noise in global feature clustering, which often neglects fine-grained information. However, since less discriminative parts can provide misleading information, we aggregate the predictions of part features with different weights depending on each cross agreement score, thus refining the labels with more reliable information.
100
+
101
+ Specifically, we generate the part-guided refined label, $\tilde{y}_i^g$ , as a pseudo-label for the global feature by:
102
+
103
+ $$\tilde{y}_i^g = \beta y_i + (1 - \beta) \sum_{n=1}^{N_p} w_i^{p_n} q_i^{p_n}, \tag{9}$$
104
+
105
+ where $w_i^{p_n} = \frac{\exp(\mathcal{C}_i(g,p_n))}{\sum_k \exp(\mathcal{C}_i(g,p_k))}$ and $q_i^{p_n}$ are the ensemble weight and the prediction vector of the part feature $\mathbf{f}_i^{p_n}$, respectively. $\beta \in [0,1]$ is a weighting parameter controlling the ratio of the one-hot pseudo-label to the ensembled prediction. Contrary to the global feature, which only captures holistic characteristics of a person, the part-guided refined label in Eq. (9) additionally considers the fine-grained predictions from the local parts in proportion to their reliability captured by the cross agreement score. The refined labels $\tilde{y}_i^g$ are then plugged into Eq. (1) to train the global feature by:
106
+
107
+ $$\mathcal{L}_{pglr} = -\sum_{i=1}^{N_{\mathcal{D}}} \tilde{y}_i^g \cdot \log(q_i^g). \tag{10}$$
108
+
109
+ <span id="page-4-2"></span>With the part-guided refined labels, global features learn from the ensembled part predictions with rich fine-grained information that is neglected in previous methods. Furthermore, unlike previous studies [8,63] that refine pseudo-labels using an auxiliary teacher network, our part-guided label refinement is a self-teaching method that requires no additional network and is thus computationally efficient.
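A minimal sketch of Eqs. (9)-(10) for a single image, assuming softmax-normalized predictions; `beta` stands for the weighting parameter $\beta$ and the shapes are assumptions:

```python
# Sketch of part-guided label refinement (Eq. 9) and its loss (Eq. 10).
import numpy as np

def refine_global_label(y, q_parts, c_scores, beta=0.5):
    """y: (C,) one-hot pseudo-label, q_parts: (N_p, C) part predictions,
    c_scores: (N_p,) cross agreement scores for one image i."""
    w = np.exp(c_scores) / np.exp(c_scores).sum()   # softmax ensemble weights
    ensemble = (w[:, None] * q_parts).sum(axis=0)   # sum_n w_i^{p_n} q_i^{p_n}
    return beta * y + (1.0 - beta) * ensemble       # Eq. (9)

def pglr_loss(y_refined, q_global, eps=1e-12):
    """Cross-entropy of the global prediction against the refined label."""
    return -np.sum(y_refined * np.log(q_global + eps))  # Eq. (10), one image
```

Because the weights are a softmax over the cross agreement scores, reliable parts dominate the ensemble, while unreliable parts contribute little to the refined label.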
110
+
111
+ **Overall training objective.** The overall loss function of PPLR is then:
112
+
113
+ $$\mathcal{L}_{PPLR} = \mathcal{L}_{aals} + \mathcal{L}_{pglr} + \mathcal{L}_{triplet} + \lambda_{cam} \mathcal{L}_{cam}. \quad (11)$$
115
+
116
+ <span id="page-4-0"></span>Our method effectively reduces the influence of noisy labels in two ways. The part features with low cross agreements are trained by pseudo-labels close to a uniform distribution by Eq. (8), and the global features trained by the part-guided refined labels capture reliable fine-grained information from the part features by Eq. (9). Also, when all part predictions have low cross agreement scores, the ensembled prediction in the part-guided refined label eventually collapses to a uniform vector due to the strong label smoothing effect in all parts, thus providing meaningless training signals. It allows us to weaken the impact of noisy pseudo-labels, resulting in better representation learning.
2204.01188/main_diagram/main_diagram.drawio ADDED
@@ -0,0 +1 @@
 
 
1
+ <mxfile host="app.diagrams.net" modified="2022-04-08T18:05:46.155Z" agent="5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/100.0.4896.75 Safari/537.36" etag="2IwvkObcd_UhIyH0s7Ii" version="17.1.3" type="device"><diagram id="iG3kfHU7sEgN-RHpr_Oe" name="CSW">7Z3rc6I6FMD/Gmd274wO78fHqm13+37sbh9fOghR2SJYwGrvX38THhZD2kIRgrfp7rQaQqL5nZxzcvKgIw5mq0PfmE9PPQs4HYGzVh1x2BEEXVLgb5TwEicovBQnTHzbipP414Rr+1+QJHJJ6sK2QLCRMfQ8J7Tnm4mm57rADDfSDN/3lpvZxp6zWevcmCQ1cq8J16bhgFy2G9sKp3GqKHKZ7D+APZkmVUuKltwyM9LcSUIwNSxvmalM3O+IA9/zwvjVbDUADmq8tGHEi8fTi9+mxL/075TfZ7dHi9NVN673oMwt6+/gAzf8dNHPR6Oj7uG58BNI+o3vnF3e292uqMZlPxvOImmx5MuGL2kT+t7CtQAqheuI/eXUDsH13DDR1SWUGZg2DWcOfMfDl2PPDQ+Mme0gefllzyB7gTsDS/j7ypsZbpIlERRehO+D0PcewcBzPD+qUDyIfuCVgt87aZ9n4IdglUGWtMMh8GYg9F9gluSqoCb0U6FO3i5fJYTn0zzTrHSISaKRiOVkXfZry8MXSeOTQZwcv/zZezod/xgcnlyEs4V0N3G6qfhugJDQf3mAJBFKdEft/+6ow28bKccwpSPvw1cmvBteCpM2f4xTv/HfoxxvpKvD70k1OHXYlOEm2k1OrucCBNN2HCzJcOwJ5Dw0ITUA0/sIjA0/7l5yYWZbFqqGKEub0paVFaVGiZDlDyVCIcjDNsTh8PB8bxFK3h9vcDNy93/8HF/wXbGqNPBkaTh7QxrOmDRkpEHnqEnD3R/n4W7Yf/kzPJ/56q118fhsr7/AtqVBeEMaBCYNGWlQdGrScHBxfiKG7p3/cvso3i4exV/yskuyFBigD4y0EcxjX2tsr1BzlrXaCgYX2uzxGCimSbLmlqqPOK5B3c1reUASyZjXBYjUXb80IAUDpFMGJLyhT48f+A3viKz9vhY7zFUWuObYOeHNIDwUPWn6sBzYhnD/7F4SPSPFQeYomMO2FJRJGDXHJk8hyzPODmvfvOMrY1Y5el2UiFn5FOYzhvl9x5ZvGWYth/namM0d253kQGW9zTQSg9rKMoLp2ics7l/CMuao5NlqggJfvTjYJMR/UbGoibge4hcFw6SoLtuH0mB7qKTAW6Am67teaE4zLmk14dhX0L86FbqAiQBBofMCQQaUumRAZzLQsAwouENGXQb4/KCGCUG9QqCprROClg6cxsIb5loZKXKTUUlS6KFRc82TRk5fmhCmSdczC9QIlRsfCaTxbnsdZ3qc8WFwgzEmMmf5U5x3ZBxMjTM+Dqbfn/MjpCKcd2QgTI0zPhCmzpk4/0udEOAtGagkQrqiikaTvo9IO1QhtNI7pUkI830k2qHhFEhBXXmyS74PTc6Y7yPxtDlLn+K8E74PRc6470O/P5MWp8XgYAUGvGJODT8A8edQnhZoJV5/EY5RWDl9+8o2KyEZyGlGM27qPXjRn4y+wS8FPzWX/vkeFcEhet1xQhhlhfcbMyQWSSnpUgo3gu5H0LEsmZKCSBBQObwyX2UvxB8PXXE9f2Y4mWvLpO3RRSmZLuAcEIbA78JvZ6KYWe5OtGSjmwTF0LV1XCy9ZrtWJE/oIpd+luhK6BtuMIZlpaVGKzmQXHi+tVnj+saRYT5OouUaXaxdBUmLm1SQ9OSFnLauZQdzx0ha1nYdO61p7HhGiFWfB7zZ3XfCBabY3XEXuMnufhW
edz0g3l9L3Mj5PdP5owe3yLoWWIo9DxD/mlFphNGKjP4l+TLp8Q8JoRL91GmZNcxHFvMIeQJCvi6EBVxkhvB9hDJlhAUivAzh+wgJsYRGEZJCwAzhu64vr/bkdqlS0hiHQSwJkbYyJQXpGcSSEGmrU9JSNAbx3aGFILRNnRbY58YgfgSRtjolzYUxiCUhNqhOh+c/7y+8+dPd1dV5157PNQOobJhffqUWvmRToIyQDfNLI8TX19NGyIb5pRFKLUPI/NKqipS0malRhMwrrapIqSNkPmlVRUodYX4DCkNYUpESlsM265GyUUVlTUqdIRtWVFal1BmycUVVXUpaXtcsQzZ/WFWX0mfIpg+r6lL6DNnsYVVd2uS0E5khC9JUZtjgrBOZIYvSVGbY4KTTY/dKmj674+61O1od+6P9B/uqyKRTuWNOMQSaCcg72UaaLMl1HumyVnFpWFrKN7VOaGq9rqYuMIqr0tSWDDRLIjW1JozEWqVaxA+aI2zHb7SpSYOtD3Y54NsOxPnqraX0K3Ss0sBG5STHSo5GHbV/9daJsxb+tuzy+0piYQBtTOyBiqmB0RhTo2KdYoIfrEEYlDfbI9lgoDRD7MzPJgMrZIZsMFCVodjgPAOZIRsMVGbY4KQ7meHWD8xvrScpUvckC8yt/k9cSZG2K8mTJkG36Uu6/ytfsinfUWzQd/zrB5OlLhvHfOhe/jvWp1fBU5GBc3qAnrnwnZe+b5iPaEf2R23+CigiAK3ej/R1OUMm5QGuz86LhXPI9eSoxtBIjuzrIp+8Ror41CohlEw6WY+X68L43hMEFjtzyr9WIzOhwEludR3zb3tXR/rJ0eXp9V/x18WTYl1PHELP+6f9iCS5RkQSttGStHBIbhJRvlcxRB9PojWKKB8iY4g+Dm00iii/3OCrI1IKrFNuFFE+gMgQtUzR5eODDFHLFF0+/PfVEWn4Ade0FV0+uscQtUzR5YOCDFHLFF0+lvjVEaltU3SEDROMUcs0HWFDBGNET9U5f//eaHf3fyee5vCL5fz4VueKxMfb8LyaCs+mIUwU1/5sGh2PUahyDjMpGKtuATPxOelbXz9YfbKe3lPkeC73QM88HtL8hlAXHrZzrDRDHn8CQoOrY4gM85FaxvCjfoivUqPNkO0cK98PxZYxLLBY9AuZOh5f803b1BVYB9o4HmqP8cl5IiRHsVE8bIlndU+kwbXWRIZsv1d1T4Q2Q3YsT3VPhDbDAufyfCFTh3si1E0dO3Onuq1rcoM6GSILnVSH2OAOdTJEFjupDrHBLepkiAWCJw2bO4pPp8qN7CSOtrlju2grd7Imz9QhQ2TbaMv3RL1tEFmQpXxPFNsGcesbaXfa3OGjO/rmrkAEZVd2/tVHLbcjSekJBcHxYi8dwW8fXoHQya7Ay2zb1Oud6uE2n5RCeiAqcdcmV1cfTGvLbdt02ZbNqOE1jJhcbJnQNlaDkYG9tc/2hAEj6ktd66XjXmrMigRPXGsvXl43NB0jCOw4Bmz4YT6ZqCw/se5OKg0CWBPwLoYPFkmmaT5woMZ9zpZFbvekhgvPhh/vlbIgbHZLgcfKCLyFb4Lktld8hJLkXEk9zGZCCBMQ5sqKRGH91StIBykqk+6cN8Ip7HAdtX/dUYffYIppeWFHHcCUY5iCzk/LPoocPewcZnJgpiB6Fx9lMfI3hO2t0zUU4ukaWE0nqKbvsBJo3f35bu3ur9W/Xh8c9M46jmbVTpFZZqZ2KqgdiZd6ir4lzSPxco/HCqtZ9YhFVtMxCalimDRlW4ZJUxo2TCIpHF3UMAmFDdOmmRGYmSlnZpo8RYYsJqRYaVExOfmkmJwwMSknJqRQQ11iYjz0h6cWuHRvng66B3vzwyPukrC/5dcUIESObdruBLWC75kgQJbBG8NfYXTZ9Nxnz1lE4Zk4M2xLgbuBJgf4QQjQgW+tJV3C5Onp++R7CHVJhoTPtRDOBZTTSayKkgHf+h4aTbyaJbSh7NSzAMrxHw=
=</diagram></mxfile>
2204.01188/main_diagram/main_diagram.pdf ADDED
Binary file (43.3 kB). View file
 
2204.01188/paper_text/intro_method.md ADDED
@@ -0,0 +1,176 @@
1
+ # Introduction
2
+
3
+ Optimal transport and Wasserstein distance [@Villani-09; @peyre2020computational] have become popular tools in machine learning and data science. For example, optimal transport has been utilized in generative modeling tasks to generate realistic images [@arjovsky2017wasserstein; @tolstikhin2018wasserstein], in domain adaptation applications to transfer knowledge from source to target domains  [@courty2017joint; @bhushan2018deepjdot], in clustering applications to capture the heterogeneity of data [@ho2017multilevel], and in other applications [@le2021lamda; @xu2021vocabulary; @yang2020predicting]. Despite having appealing performance, Wasserstein distance has been known to suffer from high computational complexity, namely, its computational complexity is at the order of $\mathcal{O}(m^3 \log m)$ [@pele2009] when the probability measures have at most $m$ supports. In addition, Wasserstein distance also suffers from the curse of dimensionality, namely, its sample complexity is at the order of $\mathcal{O}(n^{-1/d})$ [@Fournier_2015] where $n$ is the sample size. A popular line of work to improve the speed of computation and the sample complexity of the Wasserstein distance is by adding an entropic regularization term to the Wasserstein distance [@cuturi2013sinkhorn]. This variant is known as entropic regularized optimal transport (or equivalently entropic regularized Wasserstein). By using the entropic version, we can approximate the value of Wasserstein distance with the computational complexities being at the order of $\mathcal{O}(m^2)$ [@altschuler2017near; @lin2019efficient; @Lin-2019-Efficiency; @Lin-2020-Revisiting] (up to some polynomial orders of approximation errors). Furthermore, the sample complexity of the entropic version had also been shown to be at the order of $\mathcal{O}(n^{-1/2})$ [@Mena_2019], which indicates that it does not suffer from the curse of dimensionality.
4
+
5
+ Another useful line of work to improve both the computational and sample complexities of the Wasserstein distance is based on the closed-form solution of optimal transport in one dimension. A notable distance along this direction is the sliced Wasserstein (SW) distance [@bonneel2015sliced]. Due to its fast $\mathcal{O}(m \log_2 m)$ computational complexity and its $\mathcal{O}(n^{-1/2})$ sample complexity, free of the curse of dimensionality, the sliced Wasserstein has been applied successfully in several applications, such as generative modeling [@wu2019sliced; @deshpande2018generative; @kolouri2018sliced; @nguyen2021improving], domain adaptation [@lee2019sliced], and clustering [@kolouri2018slicedgmm]. The sliced Wasserstein is defined between two probability measures whose supports belong to a vector space, e.g., $\mathbb{R}^d$. As defined in [@bonneel2015sliced], the sliced Wasserstein is written as the expectation of the one-dimensional Wasserstein distance between two projected measures over the uniform distribution on the unit sphere. Due to the intractability of the expectation, Monte Carlo samples from the uniform distribution over the unit sphere are used to approximate the sliced Wasserstein distance. The number of samples is often called the number of projections and is denoted as $L$. On the computational side, the computation of the sliced Wasserstein can be decomposed into two steps. In the first step, $L$ projecting directions are sampled and stacked as a matrix (the projection matrix). After that, the projection matrix is multiplied by the two data matrices, resulting in two matrices that represent $L$ one-dimensional projected probability measures. In the second step, $L$ one-dimensional Wasserstein distances are computed between the two corresponding projected measures with the same projecting direction. Finally, the average of those distances is returned as the value of the sliced Wasserstein.
6
+
7
+ Despite being applied widely in tasks that deal with probability measures over images [@wu2019sliced; @deshpande2018generative], the conventional formulation of sliced Wasserstein is not well suited to the nature of images. In particular, an image is not a vector but a tensor. Therefore, a probability measure over images should be defined over the space of tensors instead of vectors. The conventional formulation thus requires an extra step, vectorization, when using the sliced Wasserstein on the domain of images. Namely, all images (supports of the two probability measures) are transformed into vectors by a deterministic one-to-one mapping, the "reshape" operator. This extra step does not preserve the spatial structure of the supports, which is crucial information in images. Furthermore, the vectorization step also poses certain challenges to designing efficient ways of projecting (slicing) samples to one dimension based on prior knowledge about the domain of samples. Finally, prior empirical investigations indicate that several slices in the conventional sliced Wasserstein collapse the two probability measures to the Dirac delta at zero [@deshpande2018generative; @deshpande2019max; @kolouri2019generalized]. Therefore, these slices do not contribute to the overall discrepancy. These works suggest that the space of projecting directions in the conventional sliced Wasserstein (the unit hyper-sphere) is potentially not optimal, at least for images.
8
+
9
+ **Contribution.** To address these issues of the sliced Wasserstein over images, we propose to replace the conventional formulation of the sliced Wasserstein with a new formulation that is defined on the space of probability measures over tensors. Moreover, we also propose a novel slicing process by changing the conventional matrix multiplication to the convolution operators [@fukushima1982neocognitron; @goodfellow2016deep]. In summary, our main contributions are two-fold:
10
+
11
+ 1. We leverage the benefits of the convolution operators on images, including their efficient parameter sharing and memory saving as well as their superior performance in several tasks on images [@krizhevsky2012imagenet; @he2016deep], to introduce efficient slicing methods on sliced Wasserstein, named *convolution slicers*. With those slicers, we derive a novel variant of sliced Wasserstein, named *convolution sliced Wasserstein* (CSW). We investigate the metricity of CSW, its sample and computational complexities, and its connection to other variants of SW.
12
+
13
+ 2. We illustrate the favorable performance of CSW in comparing probability measures over images. In particular, we show that CSW provides an almost identical discrepancy between MNIST's digits compared to that of the SW while having much less slicing memory. Furthermore, we compare SW and CSW in training deep generative models on standard benchmark image datasets, including CIFAR10, CelebA, STL10, and CelebA-HQ. By considering the quality of the trained models, training speed, and training memory of CSW and SW, we observe that CSW has more favorable performance than the vanilla SW.
14
+
15
+ **Organization.** The remainder of the paper is organized as follows. We first provide background about the Wasserstein distance, the conventional slicing process in the sliced Wasserstein distance, and the convolution operator in Section [2](#sec:background){reference-type="ref" reference="sec:background"}. In Section [\[sec:csw\]](#sec:csw){reference-type="ref" reference="sec:csw"}, we propose the convolution slicing and the convolution sliced Wasserstein, and analyze some of its theoretical properties. Section [4](#sec:experiments){reference-type="ref" reference="sec:experiments"} contains the application of CSW to generative models, qualitative experimental results, and quantitative experimental results on standard benchmarks. We conclude the paper in Section [\[sec:conclusion\]](#sec:conclusion){reference-type="ref" reference="sec:conclusion"}. Finally, we defer the proofs of key results and extra materials to the Appendices.
16
+
17
+ **Notation.** For any $d \geq 2$, $\mathbb{S}^{d-1}:=\{\theta \in \mathbb{R}^{d}\mid ||\theta||_2^2 =1\}$ denotes the unit hyper-sphere in $\mathbb{R}^d$ under the $\mathcal{L}_2$ norm, and $\mathcal{U}(\mathbb{S}^{d-1})$ is the uniform measure over $\mathbb{S}^{d-1}$. Moreover, $\delta$ denotes the Dirac delta function. For $p\geq 1$, $\mathcal{P}_p(\mathbb{R}^d)$ is the set of all probability measures on $\mathbb{R}^d$ that have finite $p$-th moments. For $\mu,\nu \in \mathcal{P}_p(\mathbb{R}^d)$, $\Pi(\mu,\nu):=\{\pi \in \mathcal{P}_p(\mathbb{R}^d \times \mathbb{R}^d) \mid \int_{\mathbb{R}^d} \pi(x,y) dx = \nu, \int_{\mathbb{R}^d} \pi(x,y) dy = \mu \}$ is the set of transportation plans between $\mu$ and $\nu$. For $m\geq 1$, we denote by $\mu^{\otimes m}$ the product measure whose supports are joint vectors of $m$ random variables, each following $\mu$. For a vector $X \in \mathbb{R}^{dm}$, $X:=(x_1,\ldots,x_m)$, $P_{X}$ denotes the empirical measure $\frac{1}{m} \sum_{i=1}^m \delta_{x_i}$. For any two sequences $a_{n}$ and $b_{n}$, the notation $a_{n} = \mathcal{O}(b_{n})$ means that $a_{n} \leq C b_{n}$ for all $n \geq 1$, where $C$ is some universal constant.
18
+
19
+ In this section, we first review the definitions of the Wasserstein distance, the conventional slicing, and the sliced Wasserstein distance, and discuss its limitation. We then review the convolution and the padding operators on images.
20
+
21
+ **Sliced Wasserstein:** For any $p \geq 1$ and dimension $d' \geq 1$, we first define the Wasserstein-$p$ distance [@Villani-09; @peyre2019computational] between two probability measures $\mu \in \mathcal{P}_p(\mathbb{R}^{d'})$ and $\nu \in \mathcal{P}_p(\mathbb{R}^{d'})$, which is given by $\text{W}_p(\mu,\nu) : = \Big{(} \inf_{\pi \in \Pi(\mu,\nu)} \int_{\mathbb{R}^{d'} \times \mathbb{R}^{d'}} \| x - y\|_p^{p} d \pi(x,y) \Big{)}^{\frac{1}{p}}$. When $d'=1$, the Wasserstein distance has a closed form, namely $W_p(\mu,\nu) = ( \int_0^1 |F_\mu^{-1}(z) - F_{\nu}^{-1}(z)|^{p} dz )^{1/p}$, where $F_{\mu}$ and $F_{\nu}$ are the cumulative distribution functions (CDFs) of $\mu$ and $\nu$, respectively.
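For empirical measures with $m$ atoms of equal mass, this closed form reduces to matching sorted samples, which can be sketched as:

```python
# One-dimensional Wasserstein distance between two equal-size empirical
# measures: the inverse CDFs are realized by sorting the samples.
import numpy as np

def wasserstein_1d(x, y, p=2):
    xs, ys = np.sort(x), np.sort(y)   # inverse-CDF (quantile) matching
    return (np.mean(np.abs(xs - ys) ** p)) ** (1.0 / p)
```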
23
+
24
+ Given this closed-form property of the Wasserstein distance in one dimension, the sliced Wasserstein distance [@bonneel2015sliced] between $\mu$ and $\nu$ was introduced and admits the following formulation: $\text{SW}_p^{p}(\mu,\nu) : = \int_{\mathbb{S}^{d'-1}} \text{W}_p^p (\theta \sharp \mu,\theta \sharp \nu) d\theta$, where $\theta \sharp \mu$ is the push-forward probability measure of $\mu$ through the function $T_\theta: \mathbb{R}^{d'} \to \mathbb{R}$ with $T_\theta(x) = \theta^\top x$. For each $\theta \in \mathbb{S}^{d'- 1}$, $\text{W}_p^p (\theta \sharp \mu,\theta \sharp \nu)$ can be computed in $\mathcal{O}(m \log_2 m)$ time, where $m$ is the number of supports of $\mu$ and $\nu$. However, the integration over the unit sphere in the sliced Wasserstein distance is intractable to compute. Therefore, a Monte Carlo scheme is employed to approximate the integration, namely, $\theta_1,\ldots,\theta_L \sim \mathcal{U}(\mathbb{S}^{d'-1})$ are drawn uniformly from the unit sphere and the approximation of the sliced Wasserstein distance is given by $\widehat{\text{SW}}_p^{p} (\mu,\nu) = \frac{1}{L}\sum_{i=1}^L \text{W}_p^p (\theta_i \sharp \mu,\theta_i \sharp \nu)$. In practice, $L$ should be chosen to be sufficiently large compared to the dimension $d'$, which can be undesirable.
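The Monte Carlo estimator above can be sketched as follows. This is a toy illustration, not an official implementation; the Gaussian-then-normalize sampling of directions and the equal-number-of-supports assumption are ours:

```python
# Monte Carlo sliced Wasserstein between two point clouds of equal size.
import numpy as np

def sliced_wasserstein(X, Y, L=100, p=2, rng=None):
    """X, Y: (m, d') point clouds with the same number of supports m."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    theta = rng.normal(size=(L, d))
    theta /= np.linalg.norm(theta, axis=1, keepdims=True)  # uniform on S^{d'-1}
    px, py = X @ theta.T, Y @ theta.T                      # (m, L) projections
    px, py = np.sort(px, axis=0), np.sort(py, axis=0)      # 1D closed form
    return (np.mean(np.abs(px - py) ** p)) ** (1.0 / p)
```

Sorting each projected sample realizes the one-dimensional closed form per direction, and the mean over the $L$ directions approximates the integral over the sphere.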
25
+
26
+ **Sliced Wasserstein on Images:** Now, we focus on two probability measures over images: $\mu,\nu \in \mathcal{P}_p(\mathbb{R}^{c \times d \times d})$ for number of channels $c \geq 1$ and dimension $d \geq 1$. In this case, the sliced Wasserstein between $\mu$ and $\nu$ is defined as: $$\begin{align}
27
+ \label{eq:SWimage}
28
+ \text{SW}_p(\mu,\nu) = \text{SW}_p(\mathcal{R}\sharp \mu,\mathcal{R}\sharp \nu),
29
+ \end{align}$$ where $\mathcal{R}: \mathbb{R}^{c \times d \times d }\to \mathbb{R}^{cd^2}$ is a deterministic one-to-one \"reshape\" mapping.
30
+
31
+ **The slicing process:** The slicing of the sliced Wasserstein distance on probability measures over images consists of two steps: vectorization and projection. Suppose that the probability measure $\mu \in \mathcal{P}(\mathbb{R}^{c\times d \times d})$ has $n$ supports. The supports of $\mu$ are then transformed into vectors in $\mathbb{R}^{cd^2}$ and stacked as a matrix of size $n \times cd^2$. A projection matrix of size $L\times cd^2$ is then sampled, each of its rows being a random vector following the uniform measure over the unit hyper-sphere. Finally, multiplying those two matrices returns $L$ projected probability measures of $n$ supports in one dimension. We illustrate this process in Figure [1](#fig:sw){reference-type="ref" reference="fig:sw"}.
32
+
33
+ <figure id="fig:sw" data-latex-placement="!h">
34
+ <div class="center">
35
+ <table>
36
+ <tbody>
37
+ <tr>
38
+ <td style="text-align: center;"><embed src="figures/SW.pdf" style="width:100.0%" /></td>
39
+ </tr>
40
+ </tbody>
41
+ </table>
42
+ </div>
43
+ <figcaption> <span>The conventional slicing process of sliced Wasserstein distance. The images <span class="math inline"><em>X</em><sub>1</sub>, …, <em>X</em><sub><em>n</em></sub> ∈ ℝ<sup><em>c</em> × <em>d</em> × <em>d</em></sup></span> are first flattened into vectors in <span class="math inline">ℝ<sup><em>c</em><em>d</em><sup>2</sup></sup></span> and then the Radon transform is applied to these vectors to lead to sliced Wasserstein (<a href="#eq:SWimage" data-reference-type="ref" data-reference="eq:SWimage">[eq:SWimage]</a>) on images. </span> </figcaption>
44
+ </figure>
45
+
46
+ **Limitation of the conventional slicing:** First of all, images contain spatial relations across channels and local information; transforming images into vectors makes it difficult to retain that information. Second, vectorization leads to the usage of projecting directions from the unit hyper-sphere, many of which may have poor discriminative power. Finally, sampling projecting directions in high dimension is also time- and memory-consuming. As a consequence, avoiding the vectorization step can improve the efficiency of the whole process.
47
+
48
+ **Convolution operator:** We now define the convolution operator on tensors [@fukushima1982neocognitron], which will be used as an alternative way of projecting images to one dimension in the sliced Wasserstein. The definition of the convolution operator with stride and dilation is as follows.
49
+
50
+ ::: {#def:conv .definition}
51
+ **Definition 1**. *(Convolution) Given the number of channels $c\geq 1$, the dimension $d\geq 1$, the stride size $s\geq 1$, the dilation size $b\geq 1$, and the kernel size $k\geq 1$, the convolution of a tensor $X \in \mathbb{R}^{c \times d \times d}$ with a kernel $K \in \mathbb{R}^{c \times k \times k}$ is $X \stackrel{s,b}{*} K = Y, \quad Y \in \mathbb{R}^{1 \times d' \times d'}$ where $d' = \frac{d-b(k-1)-1}{s}+1$. For $i=1,\ldots,d'$ and $j=1,\ldots,d'$, $Y_{1,i,j}$ is defined as: $Y_{1,i,j} = \sum_{h=1}^{c} \sum_{i'=0}^{k-1} \sum_{j'=0}^{k-1} X_{h,s(i-1)+bi'+1,s(j-1)+bj'+1}\cdot K_{h,i'+1,j'+1}$.*
52
+ :::
53
+
54
+ From its definition, we can check that the computational complexity of the convolution operator is $\mathcal{O}\left( c\left(\frac{d-b(k-1)-1}{s}+1\right)^2 k^2 \right)$.
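A direct, loop-based rendition of Definition 1 (written for clarity, not speed, with zero-based indexing) that also computes the output size $d' = \frac{d-b(k-1)-1}{s}+1$:

```python
# Naive convolution with stride s and dilation b, following Definition 1.
import numpy as np

def conv2d(X, K, s=1, b=1):
    """X: (c, d, d) input tensor, K: (c, k, k) kernel -> Y: (1, d', d')."""
    c, d, _ = X.shape
    _, k, _ = K.shape
    dp = (d - b * (k - 1) - 1) // s + 1       # output size d'
    Y = np.zeros((1, dp, dp))
    for i in range(dp):
        for j in range(dp):
            acc = 0.0
            for h in range(c):                # sum over channels
                for ip in range(k):           # sum over kernel rows
                    for jp in range(k):       # sum over kernel columns
                        acc += X[h, s * i + b * ip, s * j + b * jp] * K[h, ip, jp]
            Y[0, i, j] = acc
    return Y
```

The quintuple loop makes the $\mathcal{O}\!\left(c \, d'^2 k^2\right)$ cost stated above directly visible.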
55
+
56
+ []{#sec:csw label="sec:csw"} In this section, we will define a convolution slicer that maps a tensor to a scalar by convolution operators. Moreover, we discuss the convolution slicer and some of its specific forms including the convolution-base slicer, the convolution-stride slicer, the convolution-dilation slicer, and their non-linear extensions. After that, we derive the convolution sliced Wasserstein (CSW), a family of variants of sliced Wasserstein, that utilizes a convolution slicer as the projecting method. Finally, we discuss some theoretical properties of CSW, namely, its metricity, its computational complexity, its sample complexity, and its connection to other variants of sliced Wasserstein.
57
+
58
+ []{#subsec:cslicing label="subsec:cslicing"} We first start with the definition of the convolution slicer, which plays an important role in defining convolution sliced Wasserstein.
59
+
60
+ ::: {#def:cslicer .definition}
61
+ **Definition 2**. *(Convolution Slicer) For $N\geq 1$, given a sequence of kernels $K^{(1)} \in \mathbb{R}^{c^{(1)}\times d^{(1)} \times d^{(1)}},\ldots,$ $K^{(N)} \in \mathbb{R}^{c^{(N)}\times d^{(N)} \times d^{(N)}}$, a *convolution slicer* $\mathcal{S}(\cdot|K^{(1)},\ldots,K^{(N)})$ on $\mathbb{R}^{c \times d \times d}$ is a composition of $N$ convolution functions with kernels $K^{(1)},\ldots, K^{(N)}$ (with stride or dilation if needed) such that $\mathcal{S}(X|K^{(1)},\ldots, K^{(N)}) \in \mathbb{R} \quad \forall X \in \mathbb{R}^{c \times d \times d}$.*
62
+ :::
63
+
64
+ As indicated in Definition [2](#def:cslicer){reference-type="ref" reference="def:cslicer"}, the idea of the convolution slicer is to progressively map a given data $X$ to a one-dimensional subspace through a sequence of convolution kernels, which capture spatial relations across channels as well as local information of the data. It is starkly different from the vectorization step in standard sliced Wasserstein on images ([\[eq:SWimage\]](#eq:SWimage){reference-type="ref" reference="eq:SWimage"}). The illustration of the convolution slicer is given in Figure [2](#fig:csw){reference-type="ref" reference="fig:csw"}.
65
+
66
+ <figure id="fig:csw" data-latex-placement="!h">
67
+ <div class="center">
68
+ <table>
69
+ <tbody>
70
+ <tr>
71
+ <td style="text-align: center;"><embed src="figures/CSW.pdf" style="width:100.0%" /></td>
72
+ </tr>
73
+ </tbody>
74
+ </table>
75
+ </div>
76
+ <figcaption> <span>The convolution slicing process (using the convolution slicer). The images <span class="math inline"><em>X</em><sub>1</sub>, …, <em>X</em><sub><em>n</em></sub> ∈ ℝ<sup><em>c</em> × <em>d</em> × <em>d</em></sup></span> are directly mapped to a scalar by a sequence of convolution functions which have kernels as random tensors. This slicing process leads to the convolution sliced Wasserstein on images. </span> </figcaption>
77
+ </figure>
78
+
79
+ We consider three particular types of convolution slicers based on linear convolution operators, named convolution-base, convolution-stride, and convolution-dilation slicers. We defer the definition of the convolution-dilation slicer to Definition [5](#def:csdslicer){reference-type="ref" reference="def:csdslicer"}. We first start with the definition of the convolution-base slicer.
80
+
81
+ ::: {#def:linearslicer .definition}
82
+ **Definition 3**. *(Convolution-base Slicer) Given $X \in \mathbb{R}^{c \times d \times d}$ ($d \geq 2$),*
83
+
84
+ *1. When $d$ is even, $N$ is the biggest integer that satisfies $d= 2^{N-1} a$ where $a$ is also an integer, and sliced kernels are defined as $K^{(1)}\in \mathbb{R}^{c \times (2^{-1}d+1) \times (2^{-1}d+1)}$ and $K^{(h)}\in \mathbb{R}^{1 \times (2^{-h}d+1) \times (2^{-h}d+1)}$ for $h =2,\ldots,N-1$, and $K^{(N)}\in \mathbb{R}^{1 \times a \times a}$ where $a= \frac{d}{2^{N-1}}$. Then, the *convolution-base slicer* $\mathcal{CS}\text{-b}(X|K^{(1)},\ldots,K^{(N)})$ is defined as: $$\begin{align*}
85
+ \mathcal{CS}\text{-b}(X|K^{(1)},\ldots,K^{(N)}) = X^{(N)}, \quad X^{(h)} = \begin{cases} X &h=0\\ X^{(h-1)} \stackrel{1,1}{*}K^{(h)} & 1 \leq h \leq N,
86
+ \end{cases}
87
+ \end{align*}$$ 2. When $d$ is odd, the *convolution-base slicer* $\mathcal{CS}\text{-b}(X|K^{(1)},\ldots,K^{(N)})$ takes the form: $$\begin{align*}
88
+ \mathcal{CS}\text{-b}(X|K^{(1)},\ldots,K^{(N)}) = \mathcal{CS}\text{-b}(X \stackrel{1,1}{*} K^{(1)}|K^{(2)},\ldots,K^{(N)}),
89
+ \end{align*}$$ where $K^{(1)} \in \mathbb{R}^{c \times 2 \times 2}$ and $K^{(2)}, \ldots, K^{(N)}$ are the corresponding sliced kernels that are defined on the dimension $d-1$.*
90
+ :::
91
+
92
+ The idea of the convolution-base slicer in Definition [3](#def:linearslicer){reference-type="ref" reference="def:linearslicer"} is to reduce the width and the height of the image by half after each convolution operator. If the width and the height of the image are odd, the first convolution operator reduces the size of the image by one via convolution with kernels of size $2\times 2$, and then the same procedure as in the even case is applied. We would like to remark that the conventional slicing of sliced Wasserstein in Section [2](#sec:background){reference-type="ref" reference="sec:background"} is equivalent to a convolution-base slicer $\mathcal{S}(\cdot|K^{(1)})$ where $K^{(1)} \in \mathbb{R}^{c\times d\times d}$ satisfies the constraint $\sum_{h=1}^c \sum_{i=1}^d\sum_{j=1}^d \big(K^{(1)}_{h,i,j}\big)^2=1$.
93
+
94
+ We now discuss the second variant of the convolution slicer, named convolution-stride slicer, where we further incorporate stride into the convolution operators. Its definition is as follows.
95
+
96
+ ::: {#def:csslicer .definition}
97
+ **Definition 4**. *(Convolution-stride Slicer) Given $X \in \mathbb{R}^{c \times d \times d}$ ($d \geq 2$),*
98
+
99
+ *1. When $d$ is even, $N$ is the biggest integer that satisfies $d= 2^{N-1} a$ where $a$ is also an integer, and sliced kernels are defined as $K^{(1)}\in \mathbb{R}^{c \times 2 \times 2}$ and $K^{(h)}\in \mathbb{R}^{1 \times 2 \times 2}$ for $h =2,\ldots,N-1$, and $K^{(N)}\in \mathbb{R}^{1 \times a \times a}$ where $a= \frac{d}{2^{N-1}}$. Then, the *convolution-stride slicer* $\mathcal{CS}\text{-s}(X|K^{(1)},\ldots,K^{(N)})$ is defined as: $$\begin{align*}
100
+ \mathcal{CS}\text{-s}(X|K^{(1)},\ldots,K^{(N)}) = X^{(N)}, \quad X^{(h)} = \begin{cases} X &h=0\\ X^{(h-1)} \stackrel{2,1}{*}K^{(h)} & 1 \leq h \leq N-1, \\
101
+ X^{(h-1)} \stackrel{1,1}{*}K^{(h)} & h=N,
102
+ \end{cases}
103
+ \end{align*}$$ 2. When $d$ is odd, the *convolution-stride slicer* $\mathcal{CS}\text{-s}(X|K^{(1)},\ldots,K^{(N)})$ takes the form: $$\begin{align*}
104
+ \mathcal{CS}\text{-s}(X|K^{(1)},\ldots,K^{(N)}) = \mathcal{CS}\text{-s}(X \stackrel{1,1}{*} K^{(1)}|K^{(2)},\ldots,K^{(N)}),
105
+ \end{align*}$$ where $K^{(1)} \in \mathbb{R}^{c \times 2 \times 2}$ and $K^{(2)}, \ldots, K^{(N)}$ are the corresponding sliced kernels that are defined on the dimension $d-1$.*
106
+ :::
107
+
108
+ Similar to the convolution-base slicer in Definition [3](#def:linearslicer){reference-type="ref" reference="def:linearslicer"}, the convolution-stride slicer reduces the width and the height of the image by half after each convolution operator, and we use the same procedure of reducing them by one when they are odd. The benefit of the convolution-stride slicer is that the size of its kernels does not depend on the width and the height of the image, unlike the convolution-base slicer. This difference improves the computational and projection memory complexities of the convolution-stride slicer over those of the convolution-base slicer (cf. Proposition [1](#proposition:space_time_complexities){reference-type="ref" reference="proposition:space_time_complexities"}).
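A hedged sketch of the convolution-stride slicer for even $d$: repeated $2\times 2$, stride-2 convolutions followed by a final $a \times a$ convolution mapping the image to a single scalar. Sampling the kernels from a standard Gaussian is our assumption, not the paper's kernel-sampling scheme.

```python
# Sketch of a convolution-stride slicer: halve the spatial size with 2x2,
# stride-2 convolutions until the remaining odd factor a, then apply one
# a x a kernel to obtain a scalar.
import numpy as np

def conv_stride_slice(X, rng=None):
    """X: (c, d, d) image tensor with d = 2^{N-1} * a -> one scalar."""
    rng = np.random.default_rng(rng)
    c, d, _ = X.shape
    while d % 2 == 0 and d > 1:
        K = rng.normal(size=(c, 2, 2))         # random 2x2 kernel
        d //= 2
        Y = np.zeros((1, d, d))
        for i in range(d):
            for j in range(d):                 # stride-2 convolution
                Y[0, i, j] = np.sum(X[:, 2 * i:2 * i + 2, 2 * j:2 * j + 2] * K)
        X, c = Y, 1
    K = rng.normal(size=(1, d, d))             # final a x a kernel
    return float(np.sum(X * K))                # one scalar per image
```

Note that only $\mathcal{O}(c)$ kernel parameters are needed per stage, independent of $d$, which is exactly the memory advantage over the convolution-base slicer discussed above.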
109
+
110
+ ::: {#def:csdslicer .definition}
111
+ **Definition 5**. *(Convolution-dilation Slicer) Given $X \in \mathbb{R}^{c \times d \times d}$ ($d \geq 2$),*
112
+
113
+ 1. *When $d$ is even, $N$ is the biggest integer that satisfies $d= 2^{N-1} a$ where $a$ is also an integer, and sliced kernels are defined as $K^{(1)}\in \mathbb{R}^{c \times 2 \times 2}$ and $K^{(h)}\in \mathbb{R}^{1 \times 2 \times 2}$ for $h =2,\ldots,N-1$, and $K^{(N)}\in \mathbb{R}^{1 \times a \times a}$ where $a= \frac{d}{2^{N-1}}$. Then, the *convolution-dilation slicer* $\mathcal{CS}\text{-d}(X|K^{(1)},\ldots,K^{(N)})$ is defined as: $$\begin{align*}
114
+ \mathcal{CS}\text{-d}(X|K^{(1)},\ldots,K^{(N)}) = X^{(N)}, \quad X^{(h)} = \begin{cases} X &h=0\\ X^{(h-1)} \stackrel{1,d/2^h}{*}K^{(h)} & 1 \leq h \leq N-1, \\
115
+ X^{(h-1)} \stackrel{1,1}{*}K^{(h)} & h=N,
116
+ \end{cases}
117
+ \end{align*}$$*
118
+
119
+ 2. *When $d$ is odd, the *convolution-dilation slicer* $\mathcal{CS}\text{-d}(X|K^{(1)},\ldots,K^{(N)})$ takes the form: $$\begin{align*}
120
+ \mathcal{CS}\text{-d}(X|K^{(1)},\ldots,K^{(N)}) = \mathcal{CS}\text{-d}(X \stackrel{1,1}{*} K^{(1)}|K^{(2)},\ldots,K^{(N)}),
121
+ \end{align*}$$ where $K^{(1)} \in \mathbb{R}^{c \times 2 \times 2}$ and $K^{(2)}, \ldots, K^{(N)}$ are the corresponding sliced kernels that are defined on the dimension $d-1$.*
122
+ :::
123
+
124
+ As with the previous slicers, the convolution-dilation slicer also reduces the width and the height of the image by half after each convolution operator and it uses the same procedure for the odd dimension cases. The design of kernels' size of the convolution-dilation slicer is the same as that of the convolution-stride slicer. However, the convolution-dilation slicer has a bigger receptive field in each convolution operator which might be appealing when the information of the image is presented by a big block of pixels.
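As a rough illustration of the larger receptive field, the following NumPy sketch (helper name and tensors are ours, not from the paper) implements one $2\times 2$ convolution with dilation $r$ and stride 1; with $r = d/2$, a single step halves the spatial size while its four taps span the entire image.

```python
import numpy as np

def dilated_2x2_conv(x, k, r):
    """2x2 convolution with dilation r, stride 1, no padding.
    x: (c, d, d), k: (c, 2, 2) -> (1, d - r, d - r)."""
    c, d, _ = x.shape
    m = d - r
    out = np.zeros((m, m))
    for a in range(2):
        for b in range(2):
            # kernel tap (a, b) reads pixels offset by (r*a, r*b)
            out += np.einsum('hij,h->ij', x[:, r*a:r*a + m, r*b:r*b + m], k[:, a, b])
    return out[None, :, :]

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 8, 8))
k = rng.standard_normal((3, 2, 2))
y = dilated_2x2_conv(x, k, r=4)  # the four taps cover the full 8x8 image
print(y.shape)  # (1, 4, 4)
```

The output size matches the convolution-stride slicer's, but each output pixel aggregates information from pixels $d/2$ apart rather than from one $2\times 2$ block.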

**Computational and projection memory complexities of the convolution slicers:** We now establish the computational and projection memory complexities of the convolution-base, convolution-stride, and convolution-dilation slicers in the following proposition. We recall that the projection memory complexity is the memory needed to store one slice (the convolution kernels).

::: {#proposition:space_time_complexities .proposition}
**Proposition 1**. *(a) When $d$ is even and $N = [\log_2 d]$ is the biggest integer such that $d = 2^{N-1} a$ for an integer $a$, the computational and projection memory complexities of the convolution-base slicer are respectively at the order of $\mathcal{O}(cd^4)$ and $\mathcal{O}(c d^2)$. When $d$ is odd, these complexities are at the order of $\mathcal{O}(cd^2 + d^4)$ and $\mathcal{O}(c + d^2)$.*

*(b) The computational and projection memory complexities of the convolution-stride slicer are respectively at the order of $\mathcal{O}(cd^2)$ and $\mathcal{O}(c + [\log_{2} d])$.*

*(c) The computational and projection memory complexities of the convolution-dilation slicer are respectively at the order of $\mathcal{O}(cd^2)$ and $\mathcal{O}(c + [\log_{2} d])$.*
:::

Proof of Proposition [1](#proposition:space_time_complexities){reference-type="ref" reference="proposition:space_time_complexities"} is in Appendix [7.4](#subsec:proof:proposition:space_time_complexities){reference-type="ref" reference="subsec:proof:proposition:space_time_complexities"}. We recall that the computational complexity and the projection memory complexity of the conventional slicing in sliced Wasserstein are $\mathcal{O}(cd^2)$ and $\mathcal{O}(cd^2)$, respectively. The convolution-base slicer thus has a worse computational complexity than the conventional slicing while having the same projection memory complexity. Since the size of their kernels does not depend on the size of the image, the convolution-stride slicer and the convolution-dilation slicer have the same computational complexity as the conventional slicing, $\mathcal{O}(cd^2)$. However, their projection memory complexities are cheaper than that of the conventional slicing, namely, $\mathcal{O}(c+ [\log_{2} d])$ compared to $\mathcal{O}(cd^2)$.
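As a back-of-the-envelope check of this projection memory gap, the snippet below counts kernel entries per slice, assuming $d$ is a power of two (so $a = 1$) and following the kernel shapes of the convolution-stride and convolution-dilation slicers; the function name is ours.

```python
import math

def projection_memory(c, d):
    """Kernel entries stored per slice for a (c, d, d) image, assuming
    d = 2^(N-1) so that a = 1 in the slicer definitions."""
    N = int(math.log2(d)) + 1
    conventional = c * d * d                # one flattened direction: O(c d^2)
    convolution = 4 * c + 4 * (N - 2) + 1   # K1: c*2*2, K2..K_{N-1}: 2*2, K_N: 1*1
    return conventional, convolution

print(projection_memory(3, 32))  # (3072, 29)
```

Even at a modest $32\times 32$ resolution, the convolution slicer stores two orders of magnitude fewer parameters per slice, matching the $\mathcal{O}(c + [\log_2 d])$ versus $\mathcal{O}(cd^2)$ comparison above.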

**Non-linear convolution-base slicer:** The composition of convolution functions in the linear convolution slicers above is still a linear function, which may not be effective when the data lie in a complex and highly non-linear low-dimensional subspace. A natural generalization that enhances the ability of the slicers to capture the non-linearity of the data is to apply a non-linear activation function after the convolution operators. This enables us to define a non-linear slicer in Definition [7](#def:nonlinearslicer){reference-type="ref" reference="def:nonlinearslicer"} in Appendix [8](#sec:addmarterial){reference-type="ref" reference="sec:addmarterial"}. The non-linear slicer can be seen as a defining function of the generalized Radon transform [@radon20051], which is used in the generalized sliced Wasserstein [@kolouri2019generalized].

Given the definition of convolution slicers, we now state the general definition of the convolution sliced Wasserstein. An illustration of the convolution sliced Wasserstein is given in Figure [2](#fig:csw){reference-type="ref" reference="fig:csw"}.
141
+
142
+ ::: {#def:csw .definition}
143
+ **Definition 6**. *For any $p \geq 1$, the *convolution sliced Wasserstein* (CSW) of order $p >0$ between two given probability measures $\mu, \nu \in \mathcal{P}_p(\mathbb{R}^{c \times d \times d})$ is given by: $$\begin{align*}
144
+ \text{CSW}_p (\mu,\nu) : = \left(\mathbb{E} \left[W^p_p\left(\mathcal{S}(\cdot|K^{(1)}, \ldots, K^{(N)}) \sharp \mu, \mathcal{S}(\cdot|K^{(1)}, \ldots, K^{(N)})\sharp \nu\right)\right]\right)^{\frac{1}{p}},
145
+ \end{align*}$$ where the expectation is taken with respect to $K^{(1)}\sim \mathcal{U}(\mathcal{K}^{(1)}),\ldots, K^{(N)}\sim \mathcal{U}(\mathcal{K}^{(N)})$. Here, $\mathcal{S}(\cdot|K^{(1)}, \ldots, K^{(N)})$ is a convolution slicer with $K^{(l)} \in \mathbb{R}^{c^{(l)}\times k^{(l)} \times k^{(l)}}$ for any $l \in [N]$ and $\mathcal{U}(\mathcal{K}^{(l)})$ is the uniform distribution with the realizations being in the set $\mathcal{K}^{(l)}$ which is defined as $\mathcal{K}^{(l)}:=\left\{K^{(l)} \in \mathbb{R}^{c^{(l)}\times k^{(l)} \times k^{(l)}}| \sum_{h=1}^{c^{(l)}}\sum_{i'=1}^{k^{(l)}}\sum_{j'=1}^{k^{(l)}}K^{(i)2}_{h,i',j'} =1\right\}$, namely, the set $\mathcal{K}^{(l)}$ consists of tensors $K^{(l)}$ whose squared $\ell_{2}$ norm is 1.*
146
+ :::
147
+
148
+ The constraint that $\ell_2$ norms of $K^{(l)}$ is 1 is for guaranteeing the distances between projected supports are bounded. When we specifically consider the convolution slicer as convolution-base slicer ($\mathcal{CS}\text{-b}$), convolution-stride slicer ($\mathcal{CS}\text{-s}$), and convolution-dilation slicer ($\mathcal{CS}\text{-d}$), we have the corresponding notions of convolution-base sliced Wasserstein (-b), convolution-stride sliced Wasserstein (-s), and convolution-dilation sliced Wasserstein (-d).

**Monte Carlo estimation and implementation:** Similar to the conventional sliced Wasserstein, the expectation with respect to the kernels $K^{(1)}, \ldots, K^{(N)}$ uniformly drawn from the sets $\mathcal{K}^{(1)}, \ldots, \mathcal{K}^{(N)}$ in the convolution sliced Wasserstein is intractable to compute. Therefore, we also make use of the Monte Carlo method to approximate the expectation, which leads to the following approximation of the convolution sliced Wasserstein: $$\begin{align}
\text{CSW}_p^{p} (\mu,\nu) \approx \frac{1}{L} \sum_{i = 1}^{L} W^p_p\left(\mathcal{S}(\cdot|K^{(1)}_i, \ldots, K^{(N)}_i) \sharp \mu, \mathcal{S}(\cdot|K^{(1)}_i, \ldots, K^{(N)}_i)\sharp \nu\right), \label{eq:Monte_Carlo_approx_CSW}
\end{align}$$ where $K^{(\ell)}_i$ are uniform samples from the sets $\mathcal{K}^{(\ell)}$ (which is equivalent to sampling uniformly from the unit sphere $\mathbb{S}^{c^{(l)} k^{(l)2}-1}$ and then applying the one-to-one reshape mapping) for any $\ell \in [N]$ and $i \in [L]$. Since each convolution slicer $\mathcal{S}(\cdot|K^{(1)}_i, \ldots, K^{(N)}_i)$ maps to one dimension, we can utilize the closed-form expression of the Wasserstein metric in one dimension to compute $W_p\left(\mathcal{S}(\cdot|K^{(1)}_i, \ldots, K^{(N)}_i) \sharp \mu, \mathcal{S}(\cdot|K^{(1)}_{i}, \ldots, K^{(N)}_{i})\sharp \nu\right)$ with a complexity of $\mathcal{O}(m \log_2 m)$ for each $i \in [L]$, where $m$ is the maximum number of supports of $\mu$ and $\nu$. Therefore, the total computational complexity of computing the Monte Carlo approximation ([\[eq:Monte_Carlo_approx_CSW\]](#eq:Monte_Carlo_approx_CSW){reference-type="ref" reference="eq:Monte_Carlo_approx_CSW"}) is $\mathcal{O}(L m \log_2 m)$ when the probability measures $\mu$ and $\nu$ have at most $m$ supports. This is comparable to the computational complexity of the sliced Wasserstein on images ([\[eq:SWimage\]](#eq:SWimage){reference-type="ref" reference="eq:SWimage"}), where we directly vectorize the images and apply the Radon transform to the flattened images. Finally, for the implementation, we remark that the $L$ convolution slicers in equation ([\[eq:Monte_Carlo_approx_CSW\]](#eq:Monte_Carlo_approx_CSW){reference-type="ref" reference="eq:Monte_Carlo_approx_CSW"}) can be computed *independently* and *in parallel* using the group convolution implementation that is supported in almost all deep learning libraries.
153
+
154
+ **Properties of convolution sliced Wasserstein:** We first have the following result for the metricity of the convolution sliced Wasserstein.
155
+
156
+ ::: {#theorem:metricity_convolution_sliced .theorem}
157
+ **Theorem 1**. *For any $p \geq 1$, the convolution sliced Wasserstein $\text{CSW}_p(.,.)$ is a pseudo-metric on the space of probability measures on $\mathbb{R}^{c \times d \times d}$, namely, it is symmetric, and satisfies the triangle inequality.*
158
+ :::
159
+
160
+ Proof of Theorem [1](#theorem:metricity_convolution_sliced){reference-type="ref" reference="theorem:metricity_convolution_sliced"} is in Appendix [7.1](#subsec:proof:theorem:metricity_convolution_sliced){reference-type="ref" reference="subsec:proof:theorem:metricity_convolution_sliced"}. We would like to mention that CSW can might still be a metric since the convolution slicer might be injective. Our next result establishes the connection between the convolution sliced Wasserstein and max-sliced Wasserstein and Wasserstein distances.

::: {#proposition:connection_sliced .proposition}
**Proposition 2**. *For any $p \geq 1$, we have $\text{CSW}_p (\mu,\nu) \leq \text{Max-SW}_p(\mu,\nu) \leq W_{p}(\mu, \nu),$ where $\text{Max-SW}_p(\mu,\nu) : = \max_{\theta \in \mathbb{R}^{cd^2}: \|\theta\| \leq 1} \text{W}_p (\theta \sharp \mu,\theta \sharp \nu)$ is the max-sliced Wasserstein of order $p$.*
:::

Proof of Proposition [2](#proposition:connection_sliced){reference-type="ref" reference="proposition:connection_sliced"} is in Appendix [7.2](#subsec:proof:proposition:connection_sliced){reference-type="ref" reference="subsec:proof:proposition:connection_sliced"}. Given the bounds in Proposition [2](#proposition:connection_sliced){reference-type="ref" reference="proposition:connection_sliced"}, we demonstrate that the convolution sliced Wasserstein does not suffer from the curse of dimensionality for inference purposes, namely, the sample complexity for the empirical measure formed by i.i.d. samples to approximate the underlying distribution is at the order of $\mathcal{O}(n^{-1/2})$.

::: {#proposition:rate_convolution .proposition}
**Proposition 3**. *Assume that $P$ is a probability measure supported on a compact set of $\mathbb{R}^{c \times d \times d}$. Let $X_{1}, X_{2}, \ldots, X_{n}$ be i.i.d. samples from $P$, and denote by $P_{n} = \frac{1}{n} \sum_{i = 1}^{n} \delta_{X_{i}}$ the empirical measure of these samples. Then, for any $p \geq 1$, there exists a universal constant $C > 0$ such that $$\begin{align*}
\mathbb{E} [\text{CSW}_p (P_{n},P)] \leq C \sqrt{(cd^2 + 1) \log n/n},
\end{align*}$$ where the outer expectation is taken with respect to the data $X_{1}, X_{2}, \ldots, X_{n}$.*
:::

Proof of Proposition [3](#proposition:rate_convolution){reference-type="ref" reference="proposition:rate_convolution"} is in Appendix [7.3](#subsec:proof:proposition:rate_convolution){reference-type="ref" reference="subsec:proof:proposition:rate_convolution"}. The result of Proposition [3](#proposition:rate_convolution){reference-type="ref" reference="proposition:rate_convolution"} indicates that the sample complexity of the convolution sliced Wasserstein is comparable to that of the sliced Wasserstein on images ([\[eq:SWimage\]](#eq:SWimage){reference-type="ref" reference="eq:SWimage"}), which is at the order of $\mathcal{O}(n^{-1/2})$ [@Bobkov_2019], and better than that of the Wasserstein metric, which is at the order of $\mathcal{O}(n^{-1/(2cd^2)})$ [@Fournier_2015].

**Extension to non-linear convolution sliced Wasserstein:** In Appendix [8](#sec:addmarterial){reference-type="ref" reference="sec:addmarterial"}, we provide a non-linear version of the convolution sliced Wasserstein, named the non-linear convolution sliced Wasserstein. The high-level idea is to incorporate non-linear activation functions into the convolution-base, convolution-stride, and convolution-dilation slicers, which enhances the ability of the slicers to capture the non-linearity of the data. By plugging these non-linear convolution slicers into the general definition of the convolution sliced Wasserstein in Definition [6](#def:csw){reference-type="ref" reference="def:csw"}, we obtain the non-linear variants of the convolution sliced Wasserstein.