modelId
string
author
string
last_modified
timestamp[us, tz=UTC]
downloads
int64
likes
int64
library_name
string
tags
list
pipeline_tag
string
createdAt
timestamp[us, tz=UTC]
card
string
NamVo/qwen_r1_mini
NamVo
2025-06-17T16:12:12Z
0
0
transformers
[ "transformers", "safetensors", "generated_from_trainer", "trl", "grpo", "arxiv:2402.03300", "base_model:Qwen/Qwen2.5-3B-Instruct", "base_model:finetune:Qwen/Qwen2.5-3B-Instruct", "endpoints_compatible", "region:us" ]
null
2025-06-17T08:14:40Z
--- base_model: Qwen/Qwen2.5-3B-Instruct library_name: transformers model_name: qwen_r1_mini tags: - generated_from_trainer - trl - grpo licence: license --- # Model Card for qwen_r1_mini This model is a fine-tuned version of [Qwen/Qwen2.5-3B-Instruct](https://huggingface.co/Qwen/Qwen2.5-3B-Instruct). It has been trained using [TRL](https://github.com/huggingface/trl). ## Quick start ```python from transformers import pipeline question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?" generator = pipeline("text-generation", model="NamVo/qwen_r1_mini", device="cuda") output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0] print(output["generated_text"]) ``` ## Training procedure [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/nvoz1812/huggingface/runs/n9ury0fe) This model was trained with GRPO, a method introduced in [DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models](https://huggingface.co/papers/2402.03300). ### Framework versions - TRL: 0.18.2 - Transformers: 4.52.4 - Pytorch: 2.7.1+cu128 - Datasets: 3.6.0 - Tokenizers: 0.21.1 ## Citations Cite GRPO as: ```bibtex @article{zhihong2024deepseekmath, title = {{DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models}}, author = {Zhihong Shao and Peiyi Wang and Qihao Zhu and Runxin Xu and Junxiao Song and Mingchuan Zhang and Y. K. Li and Y. 
Wu and Daya Guo}, year = 2024, eprint = {arXiv:2402.03300}, } ``` Cite TRL as: ```bibtex @misc{vonwerra2022trl, title = {{TRL: Transformer Reinforcement Learning}}, author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec}, year = 2020, journal = {GitHub repository}, publisher = {GitHub}, howpublished = {\url{https://github.com/huggingface/trl}} } ```
facebook/vjepa2-vith-fpc64-256
facebook
2025-06-17T16:09:19Z
409
11
transformers
[ "transformers", "safetensors", "vjepa2", "feature-extraction", "video", "video-classification", "license:mit", "endpoints_compatible", "region:us" ]
video-classification
2025-05-31T09:02:18Z
--- license: mit pipeline_tag: video-classification tags: - video library_name: transformers --- # V-JEPA 2 A frontier video understanding model developed by FAIR at Meta, which extends the pretraining objectives of [VJEPA](https://ai.meta.com/blog/v-jepa-yann-lecun-ai-model-video-joint-embedding-predictive-architecture/) and leverages data and model sizes at scale, resulting in state-of-the-art video understanding capabilities. The code is released [in this repository](https://github.com/facebookresearch/vjepa2). <img src="https://dl.fbaipublicfiles.com/vjepa2/vjepa2-pretrain.gif">&nbsp; ## Installation To run the V-JEPA 2 model, ensure you have the latest transformers installed: ```bash pip install -U git+https://github.com/huggingface/transformers ``` ## Intended Uses V-JEPA 2 is intended to represent any video (and image) for video classification, retrieval, or as a video encoder for VLMs. ```python from transformers import AutoVideoProcessor, AutoModel hf_repo = "facebook/vjepa2-vith-fpc64-256" model = AutoModel.from_pretrained(hf_repo) processor = AutoVideoProcessor.from_pretrained(hf_repo) ``` To load a video, sample the number of frames according to the model. For this model, we use 64. ```python import torch from torchcodec.decoders import VideoDecoder import numpy as np video_url = "https://huggingface.co/datasets/nateraw/kinetics-mini/resolve/main/val/archery/-Qz25rXdMjE_000014_000024.mp4" vr = VideoDecoder(video_url) frame_idx = np.arange(0, 64) # choosing some frames; here you can define a more complex sampling strategy video = vr.get_frames_at(indices=frame_idx).data # T x C x H x W video = processor(video, return_tensors="pt").to(model.device) with torch.no_grad(): video_embeddings = model.get_vision_features(**video) print(video_embeddings.shape) ``` To load an image, simply repeat the image for the desired number of frames.
```python from transformers.image_utils import load_image image = load_image("https://huggingface.co/datasets/merve/coco/resolve/main/val2017/000000000285.jpg") pixel_values = processor(image, return_tensors="pt").to(model.device)["pixel_values_videos"] pixel_values = pixel_values.repeat(1, 16, 1, 1, 1) # repeating image 16 times with torch.no_grad(): image_embeddings = model.get_vision_features(pixel_values) print(image_embeddings.shape) ``` For more code examples, please refer to the V-JEPA 2 documentation. ### Citation ``` @techreport{assran2025vjepa2, title={V-JEPA~2: Self-Supervised Video Models Enable Understanding, Prediction and Planning}, author={Assran, Mahmoud and Bardes, Adrien and Fan, David and Garrido, Quentin and Howes, Russell and Komeili, Mojtaba and Muckley, Matthew and Rizvi, Ammar and Roberts, Claire and Sinha, Koustuv and Zholus, Artem and Arnaud, Sergio and Gejji, Abha and Martin, Ada and Robert Hogan, Francois and Dugas, Daniel and Bojanowski, Piotr and Khalidov, Vasil and Labatut, Patrick and Massa, Francisco and Szafraniec, Marc and Krishnakumar, Kapil and Li, Yong and Ma, Xiaodong and Chandar, Sarath and Meier, Franziska and LeCun, Yann and Rabbat, Michael and Ballas, Nicolas}, institution={FAIR at Meta}, year={2025} }
```
Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF
Triangle104
2025-06-17T16:06:13Z
0
0
null
[ "gguf", "llama-cpp", "gguf-my-repo", "dataset:nbeerbower/human-writing-dpo", "base_model:nbeerbower/Vitus-Qwen3-14B", "base_model:quantized:nbeerbower/Vitus-Qwen3-14B", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
null
2025-06-17T15:58:37Z
--- license: apache-2.0 datasets: - nbeerbower/human-writing-dpo base_model: nbeerbower/Vitus-Qwen3-14B tags: - llama-cpp - gguf-my-repo --- # Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF This model was converted to GGUF format from [`nbeerbower/Vitus-Qwen3-14B`](https://huggingface.co/nbeerbower/Vitus-Qwen3-14B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/nbeerbower/Vitus-Qwen3-14B) for more details on the model. --- nbeerbower/Qwen3-Gutenberg-Encore-14B fine-tuned on nbeerbower/human-writing-dpo. Set *enable_thinking* to *False* for best writing results. Method: ORPO-tuned with 1x RTX A6000 for 2 epochs. --- ## Use with llama.cpp Install llama.cpp through brew (works on Mac and Linux) ```bash brew install llama.cpp ``` Invoke the llama.cpp server or the CLI. ### CLI: ```bash llama-cli --hf-repo Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF --hf-file vitus-qwen3-14b-q4_k_s.gguf -p "The meaning to life and the universe is" ``` ### Server: ```bash llama-server --hf-repo Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF --hf-file vitus-qwen3-14b-q4_k_s.gguf -c 2048 ``` Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo. Step 1: Clone llama.cpp from GitHub. ``` git clone https://github.com/ggerganov/llama.cpp ``` Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g. LLAMA_CUDA=1 for Nvidia GPUs on Linux). ``` cd llama.cpp && LLAMA_CURL=1 make ``` Step 3: Run inference through the main binary. ``` ./llama-cli --hf-repo Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF --hf-file vitus-qwen3-14b-q4_k_s.gguf -p "The meaning to life and the universe is" ``` or ``` ./llama-server --hf-repo Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF --hf-file vitus-qwen3-14b-q4_k_s.gguf -c 2048 ```
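The `enable_thinking` switch lives in the Qwen3 chat template rather than in the weights: with thinking disabled, the template pre-fills an empty think block so the model answers directly. A minimal, dependency-free sketch of that idea (ChatML-style tokens as used by Qwen3; the real template is Jinja inside the tokenizer config, so treat this as illustrative, not the actual implementation):

```python
# Illustrative sketch (not the real Qwen3 Jinja template): with thinking
# disabled, the prompt pre-fills an empty <think>...</think> block so the
# model skips the reasoning trace and writes the answer directly.
def build_prompt(user_msg: str, enable_thinking: bool = True) -> str:
    prompt = (
        f"<|im_start|>user\n{user_msg}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )
    if not enable_thinking:
        prompt += "<think>\n\n</think>\n\n"
    return prompt

print(build_prompt("Write a short story.", enable_thinking=False))
```

When using the GGUF through a chat frontend, the equivalent is whatever "disable thinking" option the frontend exposes for Qwen3-style templates.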
Lelon/scope-it-conan
Lelon
2025-06-17T15:54:38Z
0
0
transformers
[ "transformers", "safetensors", "eurobert", "token-classification", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "region:us" ]
token-classification
2025-06-17T15:54:03Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
FormlessAI/8d476f3a-d931-447f-a02d-e4cc862c9a3a
FormlessAI
2025-06-17T15:38:18Z
0
0
transformers
[ "transformers", "safetensors", "mistral", "text-generation", "generated_from_trainer", "trl", "sft", "conversational", "base_model:lcw99/zephykor-ko-7b-chang", "base_model:finetune:lcw99/zephykor-ko-7b-chang", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "re...
text-generation
2025-06-17T11:22:40Z
--- base_model: lcw99/zephykor-ko-7b-chang library_name: transformers model_name: 8d476f3a-d931-447f-a02d-e4cc862c9a3a tags: - generated_from_trainer - trl - sft licence: license --- # Model Card for 8d476f3a-d931-447f-a02d-e4cc862c9a3a This model is a fine-tuned version of [lcw99/zephykor-ko-7b-chang](https://huggingface.co/lcw99/zephykor-ko-7b-chang). It has been trained using [TRL](https://github.com/huggingface/trl). ## Quick start ```python from transformers import pipeline question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?" generator = pipeline("text-generation", model="FormlessAI/8d476f3a-d931-447f-a02d-e4cc862c9a3a", device="cuda") output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0] print(output["generated_text"]) ``` ## Training procedure [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/phoenix-formless/Gradients/runs/1qeyhap3) This model was trained with SFT. ### Framework versions - TRL: 0.18.1 - Transformers: 4.52.4 - Pytorch: 2.7.0+cu128 - Datasets: 3.6.0 - Tokenizers: 0.21.1 ## Citations Cite TRL as: ```bibtex @misc{vonwerra2022trl, title = {{TRL: Transformer Reinforcement Learning}}, author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec}, year = 2020, journal = {GitHub repository}, publisher = {GitHub}, howpublished = {\url{https://github.com/huggingface/trl}} } ```
Videos-jobz-hunting-sajal-malik-19k/ATCH.jobz.hunting.sajal.malik.viral.video.original
Videos-jobz-hunting-sajal-malik-19k
2025-06-17T15:37:29Z
0
0
null
[ "region:us" ]
null
2025-06-17T15:34:16Z
[🔴 ➤►Click Here to👉👉 (Full video Link)](https://videohere.top/?jobz-hunting-sajal-malik) [►✅ CLICK HERE ==►► Full Video❤️❤️⬇️⬇️](https://videohere.top/?jobz-hunting-sajal-malik) [<img alt="fsd" src="http://i.postimg.cc/qvPp49Sm/ythngythg.gif">](https://videohere.top/?jobz-hunting-sajal-malik)
jimmyarfs/bert-hate-speech-test
jimmyarfs
2025-06-17T15:32:51Z
0
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:google-bert/bert-base-uncased", "base_model:finetune:google-bert/bert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2025-06-17T15:32:28Z
--- library_name: transformers license: apache-2.0 base_model: bert-base-uncased tags: - generated_from_trainer metrics: - accuracy model-index: - name: bert-hate-speech-test results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-hate-speech-test This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.5909 - Accuracy: 0.7018 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 3 ### Training results ### Framework versions - Transformers 4.52.4 - Pytorch 2.6.0+cu124 - Datasets 3.6.0 - Tokenizers 0.21.1
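The `lr_scheduler_type: linear` entry above means the learning rate decays linearly from the initial 5e-05 to zero over the course of training. A minimal, dependency-free sketch of that schedule (the per-epoch step count here is an illustrative assumption, not a value from the card):

```python
# Linear LR schedule implied by `lr_scheduler_type: linear` (no warmup):
# the rate falls from the initial value to zero over total training steps.
def linear_lr(step: int, total_steps: int, base_lr: float = 5e-5) -> float:
    remaining = max(0, total_steps - step)
    return base_lr * remaining / total_steps

total = 3 * 1000  # 3 epochs of ~1000 optimizer steps each (assumed, for illustration)
print(linear_lr(0, total))      # start of training: full base rate
print(linear_lr(total, total))  # end of training: decayed to zero
```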
Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF
Triangle104
2025-06-17T15:23:45Z
0
0
transformers
[ "transformers", "gguf", "llama-cpp", "gguf-my-repo", "text-generation", "en", "base_model:ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast", "base_model:quantized:ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2025-06-17T15:08:33Z
--- license: apache-2.0 thumbnail: https://cdn-uploads.huggingface.co/production/uploads/6625f4a8a8d1362ebcc3851a/hIZ2ZcaDyfYLT9Yd4pfOs.jpeg language: - en base_model: ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast library_name: transformers pipeline_tag: text-generation tags: - llama-cpp - gguf-my-repo --- # Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF This model was converted to GGUF format from [`ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast`](https://huggingface.co/ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast) for more details on the model. --- RpR (RolePlay with Reasoning) is a new series of models from ArliAI. This series builds directly upon the successful dataset curation methodology and training methods developed for the RPMax series. RpR models use the same curated, deduplicated RP and creative writing dataset used for RPMax, with a focus on variety to ensure high creativity and minimize cross-context repetition. Users familiar with RPMax will recognize the unique, non-repetitive writing style, unlike other finetuned-for-RP models. With the release of QwQ as the first high-performing open-source reasoning model that can be easily trained, it was clear that the available instruct and creative writing reasoning datasets contain only one response per example. Training reasoning models on this type of single-response dataset causes degraded output quality in long multi-turn chats, which is why Arli AI decided to create a real RP model capable of long multi-turn chat with reasoning. In order to create RpR, we first had to actually create the reasoning RP dataset by re-processing our existing known-good RPMax dataset into a reasoning dataset. 
This was possible by using the base QwQ Instruct model itself to create the reasoning process for every turn in the RPMax dataset conversation examples, which was then further refined to make sure the reasoning is in line with the actual response examples from the dataset. Another important thing to get right is to make sure the model is trained on examples that present reasoning blocks in the same way it encounters them during inference: that is, never seeing the reasoning blocks in its context. To achieve this, the training run was completed using axolotl with a manual template-free segments dataset, so that the model is never trained to see the reasoning block in the context, just as it will be used during inference time. The result of training on this dataset with this method is consistently coherent and interesting outputs, even in long multi-turn RP chats. This is, as far as we know, the first correctly-trained reasoning model trained for RP and creative writing. --- ## Use with llama.cpp Install llama.cpp through brew (works on Mac and Linux) ```bash brew install llama.cpp ``` Invoke the llama.cpp server or the CLI. ### CLI: ```bash llama-cli --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q6_k.gguf -p "The meaning to life and the universe is" ``` ### Server: ```bash llama-server --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q6_k.gguf -c 2048 ``` Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo. Step 1: Clone llama.cpp from GitHub. ``` git clone https://github.com/ggerganov/llama.cpp ``` Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g. LLAMA_CUDA=1 for Nvidia GPUs on Linux). 
``` cd llama.cpp && LLAMA_CURL=1 make ``` Step 3: Run inference through the main binary. ``` ./llama-cli --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q6_k.gguf -p "The meaning to life and the universe is" ``` or ``` ./llama-server --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q6_k.gguf -c 2048 ```
piyawudk/PhishMe-Qwen3-Base-GRPO-8B-GGUF
piyawudk
2025-06-17T15:21:36Z
0
0
transformers
[ "transformers", "gguf", "text-generation-inference", "unsloth", "qwen3", "llama-cpp", "gguf-my-repo", "en", "base_model:piyawudk/PhishMe-Qwen3-Base-GRPO-8B", "base_model:quantized:piyawudk/PhishMe-Qwen3-Base-GRPO-8B", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2025-06-17T14:41:55Z
--- base_model: piyawudk/PhishMe-Qwen3-Base-GRPO-8B tags: - text-generation-inference - transformers - unsloth - qwen3 - llama-cpp - gguf-my-repo license: apache-2.0 language: - en --- # piyawudk/PhishMe-Qwen3-Base-GRPO-8B-Q4_K_M-GGUF This model was converted to GGUF format from [`piyawudk/PhishMe-Qwen3-Base-GRPO-8B`](https://huggingface.co/piyawudk/PhishMe-Qwen3-Base-GRPO-8B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/piyawudk/PhishMe-Qwen3-Base-GRPO-8B) for more details on the model. ## Use with llama.cpp Install llama.cpp through brew (works on Mac and Linux) ```bash brew install llama.cpp ``` Invoke the llama.cpp server or the CLI. ### CLI: ```bash llama-cli --hf-repo piyawudk/PhishMe-Qwen3-Base-GRPO-8B-Q4_K_M-GGUF --hf-file phishme-qwen3-base-grpo-8b-q4_k_m.gguf -p "The meaning to life and the universe is" ``` ### Server: ```bash llama-server --hf-repo piyawudk/PhishMe-Qwen3-Base-GRPO-8B-Q4_K_M-GGUF --hf-file phishme-qwen3-base-grpo-8b-q4_k_m.gguf -c 2048 ``` Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo. Step 1: Clone llama.cpp from GitHub. ``` git clone https://github.com/ggerganov/llama.cpp ``` Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g. LLAMA_CUDA=1 for Nvidia GPUs on Linux). ``` cd llama.cpp && LLAMA_CURL=1 make ``` Step 3: Run inference through the main binary. ``` ./llama-cli --hf-repo piyawudk/PhishMe-Qwen3-Base-GRPO-8B-Q4_K_M-GGUF --hf-file phishme-qwen3-base-grpo-8b-q4_k_m.gguf -p "The meaning to life and the universe is" ``` or ``` ./llama-server --hf-repo piyawudk/PhishMe-Qwen3-Base-GRPO-8B-Q4_K_M-GGUF --hf-file phishme-qwen3-base-grpo-8b-q4_k_m.gguf -c 2048 ```
joanna302/Qwen3-8B-Base_fr_pt_8e-05_seed43
joanna302
2025-06-17T15:18:01Z
0
0
transformers
[ "transformers", "safetensors", "qwen3", "text-generation", "unsloth", "trl", "sft", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T10:10:47Z
--- library_name: transformers tags: - unsloth - trl - sft --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Lelon/cue-de-conan
Lelon
2025-06-17T15:15:07Z
0
0
transformers
[ "transformers", "safetensors", "eurobert", "token-classification", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "region:us" ]
token-classification
2025-06-17T15:14:28Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
VHKE/flat-flipflop
VHKE
2025-06-17T15:14:45Z
0
0
diffusers
[ "diffusers", "text-to-image", "flux", "lora", "template:sd-lora", "fluxgym", "base_model:black-forest-labs/FLUX.1-dev", "base_model:adapter:black-forest-labs/FLUX.1-dev", "license:other", "region:us" ]
text-to-image
2025-06-17T15:14:25Z
--- tags: - text-to-image - flux - lora - diffusers - template:sd-lora - fluxgym widget: - output: url: sample/flat-flipflop_003012_00_20250617161051.png text: flat flipflop base_model: black-forest-labs/FLUX.1-dev instance_prompt: flat flipflop license: other license_name: flux-1-dev-non-commercial-license license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md --- # flat flipflop A Flux LoRA trained on a local computer with [Fluxgym](https://github.com/cocktailpeanut/fluxgym) <Gallery /> ## Trigger words You should use `flat flipflop` to trigger the image generation. ## Download model and use it with ComfyUI, AUTOMATIC1111, SD.Next, Invoke AI, Forge, etc. Weights for this model are available in Safetensors format.
ITFacto/gemma-3-4B-sft-grpo
ITFacto
2025-06-17T15:13:05Z
0
0
transformers
[ "transformers", "safetensors", "gemma3_text", "text-generation", "text-generation-inference", "unsloth", "gemma3", "conversational", "en", "base_model:unsloth/gemma-3-4b-it", "base_model:finetune:unsloth/gemma-3-4b-it", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", ...
text-generation
2025-06-17T15:10:01Z
--- base_model: unsloth/gemma-3-4b-it tags: - text-generation-inference - transformers - unsloth - gemma3 license: apache-2.0 language: - en --- # Uploaded finetuned model - **Developed by:** ITFacto - **License:** apache-2.0 - **Finetuned from model :** unsloth/gemma-3-4b-it This gemma3 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
mradermacher/TongSearch-QR-3B-GGUF
mradermacher
2025-06-17T15:06:35Z
0
0
transformers
[ "transformers", "gguf", "en", "zh", "base_model:TongSearch/TongSearch-QR-3B", "base_model:quantized:TongSearch/TongSearch-QR-3B", "license:mit", "endpoints_compatible", "region:us", "conversational" ]
null
2025-06-17T14:21:53Z
--- base_model: TongSearch/TongSearch-QR-3B language: - en - zh library_name: transformers license: mit quantized_by: mradermacher --- ## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: hf --> <!-- ### vocab_type: --> <!-- ### tags: --> static quants of https://huggingface.co/TongSearch/TongSearch-QR-3B <!-- provided-files --> weighted/imatrix quants are available at https://huggingface.co/mradermacher/TongSearch-QR-3B-i1-GGUF ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q2_K.gguf) | Q2_K | 1.5 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q3_K_S.gguf) | Q3_K_S | 1.7 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q3_K_M.gguf) | Q3_K_M | 1.8 | lower quality | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q3_K_L.gguf) | Q3_K_L | 1.9 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.IQ4_XS.gguf) | IQ4_XS | 2.0 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q4_K_S.gguf) | Q4_K_S | 2.1 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q4_K_M.gguf) | Q4_K_M | 2.2 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q5_K_S.gguf) | Q5_K_S | 2.5 | | | 
[GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q5_K_M.gguf) | Q5_K_M | 2.5 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q6_K.gguf) | Q6_K | 2.9 | very good quality | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q8_0.gguf) | Q8_0 | 3.7 | fast, best quality | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.f16.gguf) | f16 | 6.9 | 16 bpw, overkill | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. ## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
talphaidze/molm-fineweb-edu-scientific
talphaidze
2025-06-17T14:57:14Z
0
0
transformers
[ "transformers", "safetensors", "molm", "custom_code", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2025-06-17T14:46:26Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Zack-Z/qwen3_4bi_cotsft_rs0_1_5cut_cot2all_indep_ntt_e2
Zack-Z
2025-06-17T14:55:48Z
0
0
transformers
[ "transformers", "qwen3", "feature-extraction", "text-generation-inference", "unsloth", "en", "base_model:unsloth/Qwen3-4B", "base_model:finetune:unsloth/Qwen3-4B", "license:apache-2.0", "endpoints_compatible", "region:us" ]
feature-extraction
2025-06-17T14:40:38Z
--- base_model: unsloth/Qwen3-4B tags: - text-generation-inference - transformers - unsloth - qwen3 license: apache-2.0 language: - en --- # Uploaded finetuned model - **Developed by:** Zack-Z - **License:** apache-2.0 - **Finetuned from model :** unsloth/Qwen3-4B This qwen3 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
mradermacher/TongSearch-QR-1.5B-i1-GGUF
mradermacher
2025-06-17T14:52:39Z
0
0
transformers
[ "transformers", "gguf", "en", "zh", "base_model:TongSearch/TongSearch-QR-1.5B", "base_model:quantized:TongSearch/TongSearch-QR-1.5B", "license:mit", "endpoints_compatible", "region:us", "imatrix", "conversational" ]
null
2025-06-17T14:21:26Z
--- base_model: TongSearch/TongSearch-QR-1.5B language: - en - zh library_name: transformers license: mit quantized_by: mradermacher --- ## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: hf --> <!-- ### vocab_type: --> <!-- ### tags: nicoboss --> weighted/imatrix quants of https://huggingface.co/TongSearch/TongSearch-QR-1.5B <!-- provided-files --> static quants are available at https://huggingface.co/mradermacher/TongSearch-QR-1.5B-GGUF ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ1_S.gguf) | i1-IQ1_S | 0.6 | for the desperate | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ1_M.gguf) | i1-IQ1_M | 0.6 | mostly desperate | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.7 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.7 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ2_S.gguf) | i1-IQ2_S | 0.8 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ2_M.gguf) | i1-IQ2_M | 0.8 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.8 | very low quality | | 
[GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q2_K.gguf) | i1-Q2_K | 0.9 | IQ3_XXS probably better | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.9 | lower quality | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.9 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 1.0 | IQ3_XS probably better | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ3_S.gguf) | i1-IQ3_S | 1.0 | beats Q3_K* | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ3_M.gguf) | i1-IQ3_M | 1.0 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 1.0 | IQ3_S probably better | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.1 | IQ3_M probably better | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.1 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.2 | prefer IQ4_XS | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q4_0.gguf) | i1-Q4_0 | 1.2 | fast, low quality | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.2 | optimal size/speed/quality | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q4_K_M.gguf) | i1-Q4_K_M 
| 1.2 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q4_1.gguf) | i1-Q4_1 | 1.3 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.4 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.4 | | | [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q6_K.gguf) | i1-Q6_K | 1.6 | practically like static Q6_K | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. ## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to. <!-- end -->
alhkalily/MCQ
alhkalily
2025-06-17T14:52:29Z
0
0
null
[ "license:apache-2.0", "region:us" ]
null
2025-06-17T14:51:23Z
--- license: apache-2.0 ---
gokulraj121/brahma-phi-2
gokulraj121
2025-06-17T14:47:00Z
0
0
null
[ "license:apache-2.0", "region:us" ]
null
2025-06-17T14:47:00Z
--- license: apache-2.0 ---
cangcz/AnchorCrafter-notune
cangcz
2025-06-17T14:37:43Z
0
0
null
[ "license:apache-2.0", "region:us" ]
null
2025-06-16T08:47:13Z
--- license: apache-2.0 ---
jsevillano/medgemma-4b-it-Q8_0-GGUF
jsevillano
2025-06-17T14:28:19Z
0
0
transformers
[ "transformers", "gguf", "medical", "radiology", "clinical-reasoning", "dermatology", "pathology", "ophthalmology", "chest-x-ray", "llama-cpp", "gguf-my-repo", "image-text-to-text", "base_model:google/medgemma-4b-it", "base_model:quantized:google/medgemma-4b-it", "license:other", "endpo...
image-text-to-text
2025-06-17T14:28:00Z
--- license: other license_name: health-ai-developer-foundations license_link: https://developers.google.com/health-ai-developer-foundations/terms library_name: transformers pipeline_tag: image-text-to-text extra_gated_heading: Access MedGemma on Hugging Face extra_gated_prompt: To access MedGemma on Hugging Face, you're required to review and agree to [Health AI Developer Foundation's terms of use](https://developers.google.com/health-ai-developer-foundations/terms). To do this, please ensure you're logged in to Hugging Face and click below. Requests are processed immediately. extra_gated_button_content: Acknowledge license base_model: google/medgemma-4b-it tags: - medical - radiology - clinical-reasoning - dermatology - pathology - ophthalmology - chest-x-ray - llama-cpp - gguf-my-repo --- # jsevillano/medgemma-4b-it-Q8_0-GGUF This model was converted to GGUF format from [`google/medgemma-4b-it`](https://huggingface.co/google/medgemma-4b-it) using llama.cpp via the ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/google/medgemma-4b-it) for more details on the model. ## Use with llama.cpp Install llama.cpp through brew (works on Mac and Linux) ```bash brew install llama.cpp ``` Invoke the llama.cpp server or the CLI. ### CLI: ```bash llama-cli --hf-repo jsevillano/medgemma-4b-it-Q8_0-GGUF --hf-file medgemma-4b-it-q8_0.gguf -p "The meaning to life and the universe is" ``` ### Server: ```bash llama-server --hf-repo jsevillano/medgemma-4b-it-Q8_0-GGUF --hf-file medgemma-4b-it-q8_0.gguf -c 2048 ``` Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo as well. Step 1: Clone llama.cpp from GitHub. 
``` git clone https://github.com/ggerganov/llama.cpp ``` Step 2: Move into the llama.cpp folder and build it with `LLAMA_CURL=1` flag along with other hardware-specific flags (for ex: LLAMA_CUDA=1 for Nvidia GPUs on Linux). ``` cd llama.cpp && LLAMA_CURL=1 make ``` Step 3: Run inference through the main binary. ``` ./llama-cli --hf-repo jsevillano/medgemma-4b-it-Q8_0-GGUF --hf-file medgemma-4b-it-q8_0.gguf -p "The meaning to life and the universe is" ``` or ``` ./llama-server --hf-repo jsevillano/medgemma-4b-it-Q8_0-GGUF --hf-file medgemma-4b-it-q8_0.gguf -c 2048 ```
Davidozito/fewshot-250-samples
Davidozito
2025-06-17T14:16:59Z
0
0
transformers
[ "transformers", "safetensors", "xlm-roberta", "text-classification", "generated_from_trainer", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2025-06-17T13:01:41Z
--- library_name: transformers tags: - generated_from_trainer metrics: - precision - recall - accuracy model-index: - name: fewshot-250-samples results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # fewshot-250-samples This model was trained from scratch on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4952 - Precision: 0.8199 - Recall: 0.5655 - F1 Macro: 0.5981 - Accuracy: 0.64 - Classification Report: precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 - Mse: 0.4952 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 Macro | Accuracy | Classification Report | Mse | 
|:-------------:|:------:|:----:|:---------------:|:---------:|:------:|:--------:|:--------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:------:| | No log | 0 | 0 | 0.5164 | 0.7896 | 0.5496 | 0.5327 | 0.6 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.62 0.83 0.71 6 Basic 0.53 0.89 0.67 9 Good 1.00 0.14 0.25 7 Excellent 0.00 0.00 0.00 0 accuracy 0.60 25 macro avg 0.63 0.44 0.43 25 weighted avg 0.74 0.60 0.54 25 | 0.5164 | | 0.5593 | 0.2414 | 7 | 0.5152 | 0.7896 | 0.5496 | 0.5327 | 0.6 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.62 0.83 0.71 6 Basic 0.53 0.89 0.67 9 Good 1.00 0.14 0.25 7 Excellent 0.00 0.00 0.00 0 accuracy 0.60 25 macro avg 0.63 0.44 0.43 25 weighted avg 0.74 0.60 0.54 25 | 0.5152 | | 0.8353 | 0.4828 | 14 | 0.5129 | 0.8192 | 0.5774 | 0.5598 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.71 0.83 0.77 6 Basic 0.56 1.00 0.72 9 Good 1.00 0.14 0.25 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.46 0.45 25 weighted avg 0.77 0.64 0.57 25 | 0.5129 | | 0.7268 | 0.7241 | 21 | 0.5070 | 0.8324 | 0.5714 | 0.5910 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.80 0.67 0.73 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.29 0.44 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.67 0.46 0.47 25 weighted avg 0.78 0.64 0.61 25 | 0.5070 | | 0.8386 | 0.9655 | 28 | 0.5012 | 0.8324 | 0.5714 | 0.5910 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.80 0.67 0.73 6 Basic 0.53 1.00 0.69 9 Good 1.00 
0.29 0.44 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.67 0.46 0.47 25 weighted avg 0.78 0.64 0.61 25 | 0.5012 | | 0.866 | 1.2069 | 35 | 0.4989 | 0.8324 | 0.5714 | 0.5910 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.80 0.67 0.73 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.29 0.44 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.67 0.46 0.47 25 weighted avg 0.78 0.64 0.61 25 | 0.4989 | | 0.674 | 1.4483 | 42 | 0.4983 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4983 | | 0.607 | 1.6897 | 49 | 0.4982 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4982 | | 0.5297 | 1.9310 | 56 | 0.4981 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4981 | | 0.6795 | 2.1724 | 63 | 0.4978 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4978 | | 0.7007 | 2.4138 | 70 | 0.4974 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4974 | | 0.6341 | 2.6552 | 77 | 
0.4974 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4974 | | 0.7763 | 2.8966 | 84 | 0.4970 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4970 | | 0.8144 | 3.1379 | 91 | 0.4965 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4965 | | 0.7211 | 3.3793 | 98 | 0.4963 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4963 | | 0.5704 | 3.6207 | 105 | 0.4961 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4961 | | 0.7294 | 3.8621 | 112 | 0.4960 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4960 | | 0.8442 | 4.1034 | 119 | 0.4958 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 
Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4958 | | 0.7277 | 4.3448 | 126 | 0.4956 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4956 | | 0.607 | 4.5862 | 133 | 0.4953 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4953 | | 0.6661 | 4.8276 | 140 | 0.4952 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support None 1.00 0.33 0.50 3 Minimal 0.75 0.50 0.60 6 Basic 0.53 1.00 0.69 9 Good 1.00 0.43 0.60 7 Excellent 0.00 0.00 0.00 0 accuracy 0.64 25 macro avg 0.66 0.45 0.48 25 weighted avg 0.77 0.64 0.62 25 | 0.4952 | ### Framework versions - Transformers 4.52.4 - Pytorch 2.7.1 - Datasets 3.6.0 - Tokenizers 0.21.1
ChangeXy/qwen2.5-14b-extreme-sports
ChangeXy
2025-06-17T14:15:37Z
0
0
transformers
[ "transformers", "safetensors", "unsloth", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2025-06-17T14:03:44Z
--- library_name: transformers tags: - unsloth --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Munia-ak/wav2vec2-base-demo-colab
Munia-ak
2025-06-17T14:06:00Z
0
0
transformers
[ "transformers", "tensorboard", "safetensors", "wav2vec2", "automatic-speech-recognition", "generated_from_trainer", "base_model:facebook/wav2vec2-base", "base_model:finetune:facebook/wav2vec2-base", "license:apache-2.0", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2025-06-16T08:48:05Z
--- library_name: transformers license: apache-2.0 base_model: facebook/wav2vec2-base tags: - generated_from_trainer metrics: - wer model-index: - name: wav2vec2-base-demo-colab results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wav2vec2-base-demo-colab This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3900 - Wer: 0.8889 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 32 - eval_batch_size: 8 - seed: 42 - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 1000 - num_epochs: 5 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:------:|:----:|:---------------:|:------:| | 0.7643 | 2.5381 | 500 | 0.3900 | 0.8889 | ### Framework versions - Transformers 4.52.4 - Pytorch 2.6.0+cu124 - Datasets 3.6.0 - Tokenizers 0.21.1
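The Wer column above is word error rate: the word-level edit distance between hypothesis and reference transcripts, divided by the number of reference words. A minimal, dependency-free sketch of the metric (libraries such as `jiwer` implement the same idea, plus text normalization):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words (substitutions,
    # insertions and deletions all cost 1).
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat on the mat", "the cat sat on mat"))  # one deletion over six words
```

A Wer of 0.8889, as reported, therefore corresponds to nearly nine word-level errors per ten reference words.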
tscstudios/7hb8xvf6ddtaqnwind1irci48ny2_7fdc3e61-ceb3-48cc-abec-cb64a4f1f835
tscstudios
2025-06-17T13:57:54Z
0
0
diffusers
[ "diffusers", "flux", "lora", "replicate", "text-to-image", "en", "base_model:black-forest-labs/FLUX.1-dev", "base_model:adapter:black-forest-labs/FLUX.1-dev", "license:other", "region:us" ]
text-to-image
2025-06-17T13:57:52Z
--- license: other license_name: flux-1-dev-non-commercial-license license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md language: - en tags: - flux - diffusers - lora - replicate base_model: "black-forest-labs/FLUX.1-dev" pipeline_tag: text-to-image # widget: # - text: >- # prompt # output: # url: https://... instance_prompt: TOK --- # 7Hb8Xvf6Ddtaqnwind1Irci48Ny2_7Fdc3E61 Ceb3 48Cc Abec Cb64A4F1F835 <Gallery /> ## About this LoRA This is a [LoRA](https://replicate.com/docs/guides/working-with-loras) for the FLUX.1-dev text-to-image model. It can be used with diffusers or ComfyUI. It was trained on [Replicate](https://replicate.com/) using AI toolkit: https://replicate.com/ostris/flux-dev-lora-trainer/train ## Trigger words You should use `TOK` to trigger the image generation. ## Run this LoRA with an API using Replicate ```py import replicate input = { "prompt": "TOK", "lora_weights": "https://huggingface.co/tscstudios/7hb8xvf6ddtaqnwind1irci48ny2_7fdc3e61-ceb3-48cc-abec-cb64a4f1f835/resolve/main/lora.safetensors" } output = replicate.run( "black-forest-labs/flux-dev-lora", input=input ) for index, item in enumerate(output): with open(f"output_{index}.webp", "wb") as file: file.write(item.read()) ``` ## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers) ```py from diffusers import AutoPipelineForText2Image import torch pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda') pipeline.load_lora_weights('tscstudios/7hb8xvf6ddtaqnwind1irci48ny2_7fdc3e61-ceb3-48cc-abec-cb64a4f1f835', weight_name='lora.safetensors') image = pipeline('TOK').images[0] ``` For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters) ## Training details - Steps: 2000 - Learning rate: 0.0004 - LoRA rank: 16 ## Contribute 
your own examples You can use the [community tab](https://huggingface.co/tscstudios/7hb8xvf6ddtaqnwind1irci48ny2_7fdc3e61-ceb3-48cc-abec-cb64a4f1f835/discussions) to add images that show off what you've made with this LoRA.
rushabh-v/medgamma_finetuning_temp
rushabh-v
2025-06-17T13:57:36Z
0
0
transformers
[ "transformers", "safetensors", "generated_from_trainer", "trl", "sft", "base_model:google/medgemma-4b-it", "base_model:finetune:google/medgemma-4b-it", "endpoints_compatible", "region:us" ]
null
2025-06-17T12:34:25Z
--- base_model: google/medgemma-4b-it library_name: transformers model_name: medgamma_finetuning_temp tags: - generated_from_trainer - trl - sft licence: license --- # Model Card for medgamma_finetuning_temp This model is a fine-tuned version of [google/medgemma-4b-it](https://huggingface.co/google/medgemma-4b-it). It has been trained using [TRL](https://github.com/huggingface/trl). ## Quick start ```python from transformers import pipeline question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?" generator = pipeline("text-generation", model="rushabh-v/medgamma_finetuning_temp", device="cuda") output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0] print(output["generated_text"]) ``` ## Training procedure [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/ds_eka/MedGemma-SFT/runs/5t79uaek) This model was trained with SFT. ### Framework versions - TRL: 0.18.2 - Transformers: 4.52.4 - Pytorch: 2.7.1 - Datasets: 3.6.0 - Tokenizers: 0.21.1 ## Citations Cite TRL as: ```bibtex @misc{vonwerra2022trl, title = {{TRL: Transformer Reinforcement Learning}}, author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec}, year = 2020, journal = {GitHub repository}, publisher = {GitHub}, howpublished = {\url{https://github.com/huggingface/trl}} } ```
sc-genrm-scaling/llama_3.1_8b_genrm_ft
sc-genrm-scaling
2025-06-17T13:53:40Z
1507
0
null
[ "safetensors", "llama", "arxiv:2504.01005", "license:apache-2.0", "region:us" ]
null
2024-12-25T20:24:35Z
--- license: apache-2.0 --- Fine-tuned version of Llama-3.1-8B-Instruct for generative verification on MATH problems. See the [organization card](https://huggingface.co/sc-genrm-scaling) and [paper](https://arxiv.org/abs/2504.01005) for more information. You can follow [this example](https://github.com/nishadsinghi/sc-genrm-scaling/blob/master/llmonk/verify/demo.ipynb) to run inference with this model.
furkankarakuz/test-bert-finetuned-ner
furkankarakuz
2025-06-17T13:50:02Z
0
0
transformers
[ "transformers", "tensorboard", "safetensors", "bert", "token-classification", "generated_from_trainer", "dataset:conll2003", "base_model:google-bert/bert-base-cased", "base_model:finetune:google-bert/bert-base-cased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_co...
token-classification
2025-06-17T13:17:42Z
--- library_name: transformers license: apache-2.0 base_model: bert-base-cased tags: - generated_from_trainer datasets: - conll2003 metrics: - precision - recall - f1 - accuracy model-index: - name: test-bert-finetuned-ner results: - task: name: Token Classification type: token-classification dataset: name: conll2003 type: conll2003 config: conll2003 split: validation args: conll2003 metrics: - name: Precision type: precision value: 0.9333003136866436 - name: Recall type: recall value: 0.9513631773813531 - name: F1 type: f1 value: 0.9422451870989249 - name: Accuracy type: accuracy value: 0.9868428798492965 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # test-bert-finetuned-ner This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the conll2003 dataset. It achieves the following results on the evaluation set: - Loss: 0.0596 - Precision: 0.9333 - Recall: 0.9514 - F1: 0.9422 - Accuracy: 0.9868 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | 0.0756 | 1.0 | 1756 | 0.0635 | 0.9075 | 0.9345 | 0.9208 | 0.9824 | | 0.0359 | 2.0 | 3512 | 0.0682 | 0.9325 | 0.9467 | 0.9395 | 0.9862 | | 0.0218 | 3.0 | 5268 | 0.0596 | 0.9333 | 
0.9514 | 0.9422 | 0.9868 | ### Framework versions - Transformers 4.52.4 - Pytorch 2.6.0+cu124 - Datasets 3.6.0 - Tokenizers 0.21.1
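As a quick consistency check (not part of the original card), the reported F1 is the harmonic mean of the reported precision and recall:

```python
# Precision and recall copied from the model-index block above.
precision = 0.9333003136866436
recall = 0.9513631773813531

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.9422, matching the reported F1
```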
wiamabd/wav2vec2-large-xlsr-exp-lot1-only
wiamabd
2025-06-17T13:08:00Z
1
0
transformers
[ "transformers", "safetensors", "wav2vec2", "automatic-speech-recognition", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2025-06-14T19:00:24Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
derekl35/alphonse_mucha_fp8_lora_flux
derekl35
2025-06-17T13:04:45Z
6
0
diffusers
[ "diffusers", "text-to-image", "diffusers-training", "lora", "flux", "flux-diffusers", "template:sd-lora", "dataset:derekl35/alphonse-mucha-style", "base_model:black-forest-labs/FLUX.1-dev", "base_model:adapter:black-forest-labs/FLUX.1-dev", "license:other", "region:us" ]
text-to-image
2025-06-10T15:19:04Z
--- base_model: black-forest-labs/FLUX.1-dev library_name: diffusers license: other instance_prompt: a woman, alphonse mucha style widget: - text: >- Serene raven-haired woman, moonlit lilies, swirling botanicals, alphonse mucha style output: url: images/alphonse_mucha_merged1_fp8.png - text: a puppy in a pond, alphonse mucha style output: url: images/alphonse_mucha_merged2_fp8.png - text: >- Ornate fox with a collar of autumn leaves and berries, amidst a tapestry of forest foliage, alphonse mucha style output: url: images/alphonse_mucha_merged3_fp8.png tags: - text-to-image - diffusers-training - diffusers - lora - flux - flux-diffusers - template:sd-lora datasets: - derekl35/alphonse-mucha-style --- # Flux DreamBooth LoRA - derekl35/alphonse_mucha_fp8_lora_flux <Gallery /> ## Model description These are derekl35/alphonse_mucha_fp8_lora_flux DreamBooth LoRA weights for black-forest-labs/FLUX.1-dev. This work is part of the blog post, "Fine-Tuning FLUX.1-dev on consumer hardware and in FP8". The weights were trained using [DreamBooth](https://dreambooth.github.io/) with the [Flux diffusers trainer](https://github.com/huggingface/diffusers/blob/main/examples/dreambooth/README_flux.md). Was LoRA for the text encoder enabled? False. FP8 training? True ## Trigger words You should use `, alphonse mucha style` to trigger the image generation. ## Download model [Download the *.safetensors LoRA](derekl35/alphonse_mucha_fp8_lora_flux/tree/main) in the Files & versions tab. ## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers) There are two main ways to use this LoRA for inference: loading the adapter on the fly or merging it with the base model. ### Option 1: Loading LoRA Adapters This approach offers flexibility, allowing you to easily switch between different LoRA styles. 
```python from diffusers import FluxPipeline import torch ckpt_id = "black-forest-labs/FLUX.1-dev" pipeline = FluxPipeline.from_pretrained( ckpt_id, torch_dtype=torch.float16 ) pipeline.load_lora_weights("derekl35/alphonse_mucha_fp8_lora_flux", weight_name="pytorch_lora_weights.safetensors") pipeline.enable_model_cpu_offload() image = pipeline( "a puppy in a pond, alphonse mucha style", num_inference_steps=28, guidance_scale=3.5, height=768, width=512, generator=torch.manual_seed(0) ).images[0] image.save("alphonse_mucha_loaded.png") ``` ### Option 2: Merging LoRA into Base Model Merging the LoRA into the base model can lead to slightly faster inference and is useful when you want to use a single style consistently. ```python from diffusers import FluxPipeline, AutoPipelineForText2Image, FluxTransformer2DModel import torch ckpt_id = "black-forest-labs/FLUX.1-dev" pipeline = FluxPipeline.from_pretrained( ckpt_id, text_encoder=None, text_encoder_2=None, torch_dtype=torch.float16 ) pipeline.load_lora_weights("derekl35/alphonse_mucha_fp8_lora_flux", weight_name="pytorch_lora_weights.safetensors") pipeline.fuse_lora() pipeline.unload_lora_weights() # You can save the fused transformer for later use # pipeline.transformer.save_pretrained("fused_transformer") pipeline.enable_model_cpu_offload() image = pipeline( "a puppy in a pond, alphonse mucha style", num_inference_steps=28, guidance_scale=3.5, height=768, width=512, generator=torch.manual_seed(0) ).images[0] image.save("alphonse_mucha_merged.png") ``` For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters) ## License Please adhere to the licensing terms as described [here](https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md).
brunopio/OCRFlux-3B-Q4_K_M-GGUF
brunopio
2025-06-17T13:03:56Z
0
0
transformers
[ "transformers", "gguf", "llama-cpp", "gguf-my-repo", "en", "base_model:ChatDOC/OCRFlux-3B", "base_model:quantized:ChatDOC/OCRFlux-3B", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
null
2025-06-17T13:03:41Z
--- language: - en license: apache-2.0 benchmarks: - ChatDoc/OCRFlux-bench-single - ChatDoc/OCRFlux-bench-cross - ChatDoc/OCRFlux-pubtabnet-single - ChatDoc/OCRFlux-pubtabnet-cross base_model: ChatDOC/OCRFlux-3B library_name: transformers tags: - llama-cpp - gguf-my-repo --- # brunopio/OCRFlux-3B-Q4_K_M-GGUF This model was converted to GGUF format from [`ChatDOC/OCRFlux-3B`](https://huggingface.co/ChatDOC/OCRFlux-3B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/ChatDOC/OCRFlux-3B) for more details on the model. ## Use with llama.cpp Install llama.cpp through brew (works on Mac and Linux) ```bash brew install llama.cpp ``` Invoke the llama.cpp server or the CLI. ### CLI: ```bash llama-cli --hf-repo brunopio/OCRFlux-3B-Q4_K_M-GGUF --hf-file ocrflux-3b-q4_k_m.gguf -p "The meaning to life and the universe is" ``` ### Server: ```bash llama-server --hf-repo brunopio/OCRFlux-3B-Q4_K_M-GGUF --hf-file ocrflux-3b-q4_k_m.gguf -c 2048 ``` Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo. Step 1: Clone llama.cpp from GitHub. ``` git clone https://github.com/ggerganov/llama.cpp ``` Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g. LLAMA_CUDA=1 for Nvidia GPUs on Linux). ``` cd llama.cpp && LLAMA_CURL=1 make ``` Step 3: Run inference through the main binary. ``` ./llama-cli --hf-repo brunopio/OCRFlux-3B-Q4_K_M-GGUF --hf-file ocrflux-3b-q4_k_m.gguf -p "The meaning to life and the universe is" ``` or ``` ./llama-server --hf-repo brunopio/OCRFlux-3B-Q4_K_M-GGUF --hf-file ocrflux-3b-q4_k_m.gguf -c 2048 ```
altaweel/gemma-ultrasound-1b-v2
altaweel
2025-06-17T13:03:16Z
0
0
transformers
[ "transformers", "tensorboard", "safetensors", "generated_from_trainer", "trl", "sft", "base_model:google/gemma-3-1b-pt", "base_model:finetune:google/gemma-3-1b-pt", "endpoints_compatible", "region:us" ]
null
2025-06-17T12:50:29Z
--- base_model: google/gemma-3-1b-pt library_name: transformers model_name: gemma-ultrasound-1b-v2 tags: - generated_from_trainer - trl - sft licence: license --- # Model Card for gemma-ultrasound-1b-v2 This model is a fine-tuned version of [google/gemma-3-1b-pt](https://huggingface.co/google/gemma-3-1b-pt). It has been trained using [TRL](https://github.com/huggingface/trl). ## Quick start ```python from transformers import pipeline question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?" generator = pipeline("text-generation", model="altaweel/gemma-ultrasound-1b-v2", device="cuda") output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0] print(output["generated_text"]) ``` ## Training procedure This model was trained with SFT. ### Framework versions - TRL: 0.15.2 - Transformers: 4.52.4 - Pytorch: 2.7.0 - Datasets: 3.3.2 - Tokenizers: 0.21.1 ## Citations Cite TRL as: ```bibtex @misc{vonwerra2022trl, title = {{TRL: Transformer Reinforcement Learning}}, author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec}, year = 2020, journal = {GitHub repository}, publisher = {GitHub}, howpublished = {\url{https://github.com/huggingface/trl}} } ```
joanna302/Qwen3-0.6B-Base_fr_pt_2e-05_seed43
joanna302
2025-06-17T12:59:14Z
0
0
transformers
[ "transformers", "safetensors", "qwen3", "text-generation", "unsloth", "trl", "sft", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T12:15:54Z
--- library_name: transformers tags: - unsloth - trl - sft --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
TrumpElon/task-11-Qwen-Qwen2.5-1.5B
TrumpElon
2025-06-17T12:50:51Z
0
0
peft
[ "peft", "safetensors", "base_model:Qwen/Qwen2.5-1.5B", "base_model:adapter:Qwen/Qwen2.5-1.5B", "license:other", "region:us" ]
null
2025-06-17T12:49:19Z
--- library_name: peft license: other base_model: Qwen/Qwen2.5-1.5B --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # lora ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters ### Training results ### Framework versions - PEFT 0.12.0 - Transformers 4.48.3 - Pytorch 2.6.0+cu124 - Datasets 3.2.0 - Tokenizers 0.21.0
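The card above lists PEFT 0.12.0 but omits a usage snippet. A minimal sketch of attaching the adapter, assuming (per the `base_model:adapter:Qwen/Qwen2.5-1.5B` tag) that the repo hosts a LoRA adapter for `Qwen/Qwen2.5-1.5B`; the heavy imports are deferred so the helper can be read without the libraries installed:

```python
BASE_ID = "Qwen/Qwen2.5-1.5B"
ADAPTER_ID = "TrumpElon/task-11-Qwen-Qwen2.5-1.5B"

def load_adapter(base_id: str = BASE_ID, adapter_id: str = ADAPTER_ID):
    """Download the base model, then attach the LoRA adapter weights on top."""
    # Deferred imports: transformers/peft are only needed when actually loading.
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    tokenizer = AutoTokenizer.from_pretrained(base_id)
    base = AutoModelForCausalLM.from_pretrained(base_id)
    model = PeftModel.from_pretrained(base, adapter_id)  # merges-in nothing; wraps base
    return tokenizer, model
```

This keeps the base weights and adapter separate; `model.merge_and_unload()` could fold them together afterwards if a plain `transformers` model is needed.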
EYEDOL/Llama-3.2-3b_ON_ALPACA3
EYEDOL
2025-06-17T12:41:30Z
0
0
transformers
[ "transformers", "safetensors", "text-generation-inference", "unsloth", "llama", "trl", "en", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2025-06-17T12:41:13Z
--- base_model: unsloth/llama-3.2-3b-instruct tags: - text-generation-inference - transformers - unsloth - llama - trl license: apache-2.0 language: - en --- # Uploaded model - **Developed by:** EYEDOL - **License:** apache-2.0 - **Finetuned from model:** unsloth/llama-3.2-3b-instruct This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
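The card above gives no inference snippet. A minimal sketch in the same style as the TRL quick-start pattern, assuming the fine-tune answers plain user turns; the pipeline import is deferred so the prompt helper stands alone:

```python
def build_chat(question: str) -> list[dict]:
    """Wrap a user question in the chat-message format that instruct models expect."""
    return [{"role": "user", "content": question}]

def generate(question: str, model_id: str = "EYEDOL/Llama-3.2-3b_ON_ALPACA3") -> str:
    # Deferred import: transformers is only needed when actually generating.
    from transformers import pipeline

    generator = pipeline("text-generation", model=model_id)
    out = generator(build_chat(question), max_new_tokens=128, return_full_text=False)
    return out[0]["generated_text"]
```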
NhaiDao/SFT_TRAIN_FROM_SCRATCH_checkpoint_10625
NhaiDao
2025-06-17T12:32:34Z
0
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T12:32:00Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
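The card's "How to Get Started" section is empty. A minimal sketch, assuming the tokenizer ships a chat template (as the `conversational` tag suggests); the heavy imports are deferred so the module-level code stays inspectable:

```python
MODEL_ID = "NhaiDao/SFT_TRAIN_FROM_SCRATCH_checkpoint_10625"

def chat(messages, model_id=MODEL_ID, max_new_tokens=128):
    """Render the conversation with the model's chat template and generate a reply."""
    # Deferred imports: only needed when actually running inference.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and decode only the newly generated continuation.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

The same helper applies to the sibling `checkpoint_8125` repo by passing its id as `model_id`.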
NhaiDao/SFT_TRAIN_FROM_SCRATCH_checkpoint_8125
NhaiDao
2025-06-17T12:30:13Z
0
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T12:29:40Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
diegolacomba/multilingual-e5-small-legal-mnrl-1
diegolacomba
2025-06-17T12:28:40Z
0
0
sentence-transformers
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:58898", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:intfloat/multilingual-e5-small", "base_model:finetune:intfloat/m...
sentence-similarity
2025-06-17T12:28:13Z
--- tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:58898 - loss:MultipleNegativesRankingLoss base_model: intfloat/multilingual-e5-small widget: - source_sentence: 'query: ยฟCรณmo se deben determinar las cuotas a cuenta del IRPF en un aรฑo con actividad econรณmica suspendida?' sentences: - 'passage A los efectos de este Impuesto, se considerarรก promotor de edificaciones el propietario de inmuebles que construyรณ (promotor-constructor) o contratรณ la construcciรณn (promotor) de los mismos para destinarlos a la venta, el alquiler o el uso propio. c) Dichas ejecuciones de obra tengan por objeto la construcciรณn o rehabilitaciรณn de edificios destinados fundamentalmente a viviendas, incluidos los locales, anejos, instalaciones y servicios complementarios en ella situados. d) Las referidas ejecuciones de obra consistan materialmente en la construcciรณn o rehabilitaciรณn de los citados edificios. 3.- En consecuencia, las ejecuciones de obra concertadas directamente entre el promotor y el contratista (la consultante), que tengan por objeto la rehabilitaciรณn de una vivienda, tributan al tipo reducido del 10 por ciento. El tipo reducido se aplica con independencia de que el promotor concierte la totalidad de la obra de construcciรณn con un solo empresario o concierte la realizaciรณn con varios empresarios realizando cada uno de ellos una parte de la obra segรบn su especialidad. No obstante, las ejecuciones de obra realizadas por subcontratistas para otros contratistas (la consultante), que a su vez contraten con el promotor, tributarรกn por el Impuesto sobre el Valor Aรฑadido al tipo general del 21 por ciento. 4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artรญculo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.' - 'passage Descripciรณn de hechos: La consultante es titular de una actividad econรณmica de "otros cafรฉs y bares". 
El rendimiento neto de la actividad se determina por el mรฉtodo de estimaciรณn objetiva y tributa en el IVA por el rรฉgimen especial simplificado. Desde la declaraciรณn de alarma en marzo de 2020 ha tenido cerrada la actividad y la va a seguir teniendo cerrada durante todo el aรฑo 2020, pues las restricciones que tiene que aplicar no la hacen rentable. Cuestiรณn planteada: Forma de calcular, en 2020, el pago fraccionado a cuenta del IRPF y el ingreso a cuenta trimestral del IVA.' - 'passage No obstante, el artรญculo 22.Trece de la Ley 37/1992, declara la exenciรณn de: โ€œLos transportes de viajeros y sus equipajes por vรญa marรญtima o aรฉrea procedentes de o con destino a un puerto o aeropuerto situado fuera del รกmbito espacial del Impuesto. Se entenderรกn incluidos en este apartado los transportes por vรญa aรฉrea amparados por un รบnico tรญtulo de transporte que incluya vuelos de conexiรณn aรฉrea.โ€. En consecuencia, los servicios de transporte consultados, que tienen su origen o destino en un aeropuerto fuera del territorio de aplicaciรณn del impuesto sobre el valor aรฑadido, estarรกn sujetos pero exentos del Impuesto sobre el Valor Aรฑadido. 2.- Por otra parte, el artรญculo 164, apartado uno, de la Ley del Impuesto sobre el Valor Aรฑadido, en el que se regulan las obligaciones de los sujetos pasivos, establece lo siguiente: โ€œUno. Sin perjuicio de lo establecido en el Tรญtulo anterior, los sujetos pasivos del Impuesto estarรกn obligados, con los requisitos, lรญmites y condiciones que se determinen reglamentariamente, a: (โ€ฆ) 3ยบ. Expedir y entregar factura de todas sus operaciones, ajustada a lo que se determine reglamentariamente.โ€. El desarrollo reglamentario de dicho precepto se ha llevado a cabo por el Reglamento por el que se regulan las obligaciones de facturaciรณn, aprobado por el artรญculo 1 del Real Decreto 1619/2012, de 30 de noviembre (BOE de 1 de diciembre). 
El artรญculo 2 del mencionado Reglamento dispone que:' - source_sentence: 'query: ยฟCuรกl es el porcentaje de impuesto que corresponde a dispositivos destinados a aliviar discapacidades bajo la ley actual?' sentences: - 'passage Contestaciรณn completa: 1.- El artรญculo 90, apartado uno, de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Aรฑadido (BOE del 29 de diciembre), dispone que el Impuesto se exigirรก al tipo del 21 por ciento, salvo lo dispuesto en el artรญculo siguiente. 2.- El artรญculo 91, apartado Uno.1, nรบmero 6ยบ, letra c) de la Ley 37/1992 dispone lo siguiente: โ€œUno. Se aplicarรก el tipo del 10 por ciento a las operaciones siguientes: 1. Las entregas, adquisiciones intracomunitarias o importaciones de los bienes que se indican a continuaciรณn: (โ€ฆ) 6.ยบ Los siguientes bienes: (โ€ฆ) c) Los equipos mรฉdicos, aparatos y demรกs instrumental, relacionados en el apartado octavo del anexo de esta Ley, que, por sus caracterรญsticas objetivas, estรฉn diseรฑados para aliviar o tratar deficiencias, para uso personal y exclusivo de personas que tengan deficiencias fรญsicas, mentales, intelectuales o sensoriales, sin perjuicio de lo previsto en el apartado dos.1 de este artรญculo. No se incluyen en esta letra otros accesorios, recambios y piezas de repuesto de dichos bienes.โ€. El apartado octavo del Anexo de la Ley 37/1992, establece lo siguiente: โ€œOctavo. Relaciรณn de bienes a que se refiere el artรญculo 91.Uno.1. 6.ยบc) de esta Ley. (โ€ฆ) โ€“ Sillas terapรฉuticas y de ruedas, asรญ como los cojines antiescaras y arneses para el uso de las mismas, muletas, andadores y grรบas para movilizar personas con discapacidad. (โ€ฆ).โ€. 3.- Por su parte, el artรญculo 91, apartado dos.1, nรบmero 4ยบ de la Ley 37/1992, dispone que: โ€œDos. Se aplicarรก el tipo del 4 por ciento a las operaciones siguientes: 1. Las entregas, adquisiciones intracomunitarias o importaciones de los bienes que se indican a continuaciรณn: (โ€ฆ)' - 'passage (โ€ฆ).โ€. 
De acuerdo con lo dispuesto anteriormente, en los supuestos de adjudicaciรณn de bienes en virtud de subasta judicial o administrativa, como es el caso que nos ocupa, el adjudicatario puede efectuar, en su caso, la renuncia a las exenciones previstas en el apartado dos del artรญculo 20 de la Ley 37/1992, asรญ como expedir factura, presentar, en nombre y por cuenta del sujeto pasivo, la declaraciรณn-liquidaciรณn correspondiente e ingresar el importe del Impuesto sobre el Valor Aรฑadido resultante. El ejercicio de dicha facultad por parte del adjudicatario determina la obligaciรณn de presentar la autoliquidaciรณn del Impuesto conforme al modelo aprobado por la Orden HAC/3625/2003, de 23 de diciembre (modelo 309). Uno de los requisitos necesarios para el ejercicio de dicha facultad es que el destinatario-adjudicatario del bien inmueble tenga la consideraciรณn de empresario o profesional en los tรฉrminos previstos en esta contestaciรณn. La no consideraciรณn como empresario o profesional impide el ejercicio de dicha facultad. Por รบltimo, seรฑalar que de resultar aplicable la regla de inversiรณn del sujeto pasivo prevista en el artรญculo 84.Uno.2ยบ de la Ley 37/1992, anteriormente desarrollado, el adjudicatario resultarรก ser el sujeto pasivo de la operaciรณn por lo que viene obligado a presentar la autoliquidaciรณn ordinaria del Impuesto en nombre propio, sin actuar en nombre y por cuenta del subastado. Asimismo, de optar por dicha facultad en los tรฉrminos establecidos reglamentariamente, el consultante podrรก emitir, en nombre y por cuenta del transmitente, la correspondiente factura en la que se documente la operaciรณn. 
No obstante, tal y como se ha seรฑalado en apartados anteriores de esta contestaciรณn, el consultante adjudicatario de la subasta judicial no procediรณ a la renuncia a la exenciรณn del artรญculo 20.Uno.22ยบ de la Ley del Impuesto en el plazo establecido, habiรฉndose encontrado facultado para ello segรบn lo dispuesto en la Disposiciรณn Adicional Sexta de la Ley 37/1992.' - 'passage c) Las que tengan por objeto la cesiรณn del derecho a utilizar infraestructuras ferroviarias. d) Las autorizaciones para la prestaciรณn de servicios al pรบblico y para el desarrollo de actividades comerciales o industriales en el รกmbito portuario.โ€ 3.- La consulta plantea una cuestiรณn sobre un contrato por el que un Ayuntamiento cede a un contratista la explotaciรณn de un bar (instalaciรณn fija de obra) en una ciudad. Dicho contrato tiene la naturaleza de contrato administrativo especial, sin que el mismo pueda calificarse como contrato de gestiรณn de servicio pรบblico ni tampoco como concesiรณn administrativa de dominio pรบblico. Cabe plantearse si podrรญa resultar aplicable a la referida prestaciรณn de servicios efectuada por el ayuntamiento en favor de la consultante el supuesto de no sujeciรณn al Impuesto sobre el Valor Aรฑadido previsto para el otorgamiento de concesiones y autorizaciones administrativas en el nรบmero 9ยบ del artรญculo 7 de la citada Ley 37/1992. La respuesta a esta cuestiรณn es negativa, pues, como ha seรฑalado la Asesorรญa Jurรญdica de la Secretarรญa de Estado de Hacienda en el informe emitido el 30 de julio de 1997 a solicitud de esta Direcciรณn General, los contratos que tienen por objeto la explotaciรณn de cafeterรญas y comedores en centros pรบblicos son contratos administrativos especiales, sin que los mismos puedan calificarse como contratos de gestiรณn de servicios pรบblicos ni tampoco como concesiones administrativas de dominio pรบblico. 
En este sentido se ha pronunciado la Junta Consultiva de Contrataciรณn Administrativa en diversos informes emitidos al respecto; asรญ, en el informe 57/07 de 6 de febrero de 2008, y, con anterioridad, en los informes 5/96 de 7 de marzo y 67/99, de 6 de julio de 2000. En consecuencia con todo lo anterior, estรก sujeto al Impuesto sobre el Valor Aรฑadido y no exento del mismo el contrato suscrito entre el ayuntamiento y la consultante consistente en explotar un bar-quiosco, a cambio del pago de una contraprestaciรณn. 4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artรญculo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.' - source_sentence: 'query: ยฟEn quรฉ casos las transacciones documentadas en escrituras pรบblicas pueden estar sujetas a una tasa tributaria especรญfica segรบn la normativa vigente?' sentences: - 'passage 3.- Por otra parte en relaciรณn con la inclusiรณn del suero de irrigaciรณn en el apartado destinado a โ€œBolsas de recogida de orina, absorbentes de incontinencia y otros sistemas para incontinencia urinaria y fecal, incluidos los sistemas de irrigaciรณnโ€, este Centro directivo en la consulta de fecha 23 de marzo de 2015, numero V0872-15 y en relaciรณn con los sistemas de irrigaciรณn ha dispuesto que, โ€œTributarรกn por el Impuesto sobre el Valor Aรฑadido, al tipo general del 21 por ciento, los siguientes productos objeto de consulta: -Los empapadores, las duchas vaginales, irrigadores, accesorios y sistemas de irrigaciรณn no destinados especรญficamente a situaciones de incontinencia urinaria o fecal, ni las cรกnulas rectales y vaginales no destinadas especรญficamente a situaciones de incontinencia urinaria o fecal o no incorporadas en equipos destinados a estas situaciones. 
โ€œ 4.- En consecuencia con lo anterior este centro directivo le informa que tributan al tipo general del 21 por ciento las entregas, adquisiciones intracomunitarias e importaciones de suero de irrigaciรณn (agua destilada o suero fisiolรณgico) objeto de consulta siendo irrelevante que su destino sea para la limpieza asรฉptica de la piel, lavado de heridas o quemaduras formando parte integrante de los sistemas de irrigaciรณn. 5.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artรญculo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria. No obstante, de acuerdo con el artรญculo 68.2 del Reglamento General de las actuaciones y los procedimientos de gestiรณn e inspecciรณn tributaria y de desarrollo de las normas comunes de los procedimientos de aplicaciรณn de los tributos, aprobado por el Real Decreto 1065/2007, de 27 de julio, la presente contestaciรณn no tendrรก efectos vinculantes para aquellos miembros o asociados de la consultante que en el momento de formular la consulta estuviesen siendo objeto de un procedimiento, recurso o reclamaciรณn econรณmico-administrativa iniciado con anterioridad y relacionado con las cuestiones planteadas en la consulta conforme a lo dispuesto en su artรญculo 89.2.' - 'passage Contestaciรณn completa: 1.- Las reglas de localizaciรณn de las prestaciones de servicios se encuentran reguladas en los artรญculos 69, 70 y 72 de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Aรฑadido (BOE del 29 de diciembre). En el artรญculo 69 del dicho texto normativo se contienen las reglas generales de localizaciรณn en donde se establece que: โ€œUno. 
Las prestaciones de servicios se entenderรกn realizadas en el territorio de aplicaciรณn del Impuesto, sin perjuicio de lo dispuesto en el apartado siguiente de este artรญculo y en los artรญculos 70 y 72 de esta Ley, en los siguientes casos: 1.ยบ Cuando el destinatario sea un empresario o profesional que actรบe como tal y radique en el citado territorio la sede de su actividad econรณmica, o tenga en el mismo un establecimiento permanente o, en su defecto, el lugar de su domicilio o residencia habitual, siempre que se trate de servicios que tengan por destinatarios a dicha sede, establecimiento permanente, domicilio o residencia habitual, con independencia de dรณnde se encuentre establecido el prestador de los servicios y del lugar desde el que los preste. 2.ยบ Cuando el destinatario no sea un empresario o profesional actuando como tal, siempre que los servicios se presten por un empresario o profesional y la sede de su actividad econรณmica o establecimiento permanente desde el que los preste o, en su defecto, el lugar de su domicilio o residencia habitual, se encuentre en el territorio de aplicaciรณn del Impuesto. (โ€ฆ).โ€. No obstante, estas reglas serรกn de aplicaciรณn รบnicamente en el caso en que no proceda aplicar ninguna de las reglas espaciales que se regulan en el artรญculo 70 de la Ley del impuesto. En concreto, respecto de los servicios de restauraciรณn y catering, se establece en el nรบmero 5ยบ del apartado Uno de dicho precepto que: โ€œUno. Se entenderรกn prestados en el territorio de aplicaciรณn del Impuesto los siguientes servicios: (โ€ฆ) 5.ยบ. A) Los de restauraciรณn y catering en los siguientes supuestos: (โ€ฆ) b) Los restantes servicios de restauraciรณn y catering cuando se presten materialmente en el territorio de aplicaciรณn del Impuesto. (โ€ฆ).โ€.' - 'passage Artรญculo 31 โ€œ2. 
Las primeras copias de escrituras y actas notariales, cuando tengan por objeto cantidad o cosa valuable, contengan actos o contratos inscribibles en los Registros de la Propiedad, Mercantil y de la Propiedad Industrial y de Bienes Muebles no sujetos al Impuesto sobre Sucesiones y Donaciones o a los conceptos comprendidos en los nรบmeros 1 y 2 del artรญculo 1.ยบ de esta Ley, tributarรกn, ademรกs, al tipo de gravamen que, conforme a lo previsto en la Ley 21/2001, de 27 de diciembre, por la que se regulan las medidas fiscales y administrativas del nuevo sistema de financiaciรณn de las Comunidades Autรณnomas de rรฉgimen comรบn y Ciudades con Estatuto de Autonomรญa, haya sido aprobado por la Comunidad Autรณnoma. Si la Comunidad Autรณnoma no hubiese aprobado el tipo a que se refiere el pรกrrafo anterior, se aplicarรก el 0,50 por 100, en cuanto a tales actos o contratos.โ€ De la aplicaciรณn de los preceptos anteriormente transcritos resulta lo siguiente: - Por regla general las operaciones realizadas por un sujeto pasivo del IVA son operaciones no sujetas a la modalidad de transmisiones patrimoniales onerosas del ITP y AJD segรบn lo dispuesto en los artรญculos 7.5 del Texto Refundido del citado impuesto. 
En tal caso, si la referida operaciรณn se documentase en escritura pรบblica, la no sujeciรณn de la transmisiรณn por la modalidad de transmisiones patrimoniales onerosas permitirรญa la aplicaciรณn la cuota variable del Documento Notarial de la modalidad Actos Jurรญdicos Documentados, dada la concurrencia de todos los requisitos exigidos en el artรญculo 31.2 del Texto Refundido del Impuesto: Tratarse de una primera copia de una escritura o acta notarial Tener por objeto cantidad o cosa valuable Contener un acto o contrato inscribibles en los Registros de la Propiedad, Mercantil y de la Propiedad Industrial y de Bienes Muebles No estar sujetos los referidos actos al Impuesto sobre Sucesiones y Donaciones o a los conceptos comprendidos en los apartados 1 y 2 del artรญculo 1 de esta Ley, transmisiones patrimoniales onerosas y operaciones societarias' - source_sentence: 'query: ยฟSe aplican impuestos a la enseรฑanza de idiomas para particulares y empresas en modalidad presencial y virtual?' sentences: - 'passage 4.- Por otro lado, el artรญculo 91, apartado dos.2, nรบmero 1ยบ, de la Ley del Impuesto sobre el Valor Aรฑadido, dispone la aplicaciรณn del tipo impositivo del 4 por ciento a la prestaciรณn de los siguientes servicios: โ€œ1.ยบ Los servicios de reparaciรณn de los vehรญculos y de las sillas de ruedas comprendidos en el pรกrrafo primero del nรบmero 4.ยบ del apartado dos.1 de este artรญculo y los servicios de adaptaciรณn de los autotaxis y autoturismos para personas con discapacidad y de los vehรญculos a motor a los que se refiere el pรกrrafo segundo del mismo precepto independientemente de quiรฉn sea el conductor de los mismos.โ€. Los servicios de reparaciรณn recogidos en la Ley 37/1992 son รบnicamente los referidos a vehรญculos para personas con movilidad reducida y a sillas de ruedas para uso exclusivo de personas con discapacidad, que son los bienes incluidos en el pรกrrafo primero del artรญculo 91, apartado dos.1, nรบmero 4ยบ de dicha Ley. 
En consecuencia con lo anterior, las reparaciones de sillas de ruedas, que no estén incluidas en el párrafo anterior, tributarán al tipo del 21 por ciento dado que no está contemplado en el artículo 91 de la Ley 37/1992 un tipo reducido para estos servicios de reparación. 5.- En relación con el tipo impositivo aplicable a los accesorios y recambios de sillas de ruedas, la actual redacción del artículo 91.Uno.1.6º, letra c) dice expresamente que: "No se incluyen en esta letra otros accesorios, recambios y piezas de repuesto de dichos bienes.".' - 'passage Descripción de hechos: La consultante es una persona física que va a impartir clases de idiomas, en concreto alemán, tanto a personas físicas como a empresas. Las clases se realizarán tanto de manera presencial como a través de medios electrónicos. Cuestión planteada: Si las clases se encuentran exentas del Impuesto sobre el Valor Añadido.' - 'passage Contestación completa: 1.- El artículo 134 bis, apartado dos de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29), establece que: "Dos. Cuando el régimen de tributación aplicable a una determinada actividad agrícola, ganadera, forestal o pesquera cambie del régimen especial de la agricultura, ganadería y pesca al general del Impuesto, el empresario o profesional titular de la actividad tendrá derecho a: 1º. Efectuar la deducción de la cuota resultante de aplicar al valor de los bienes afectos a la actividad, Impuesto sobre el Valor Añadido excluido, en la fecha en que deje de aplicarse el régimen especial, los tipos de dicho Impuesto que estuviesen vigentes en la citada fecha. A estos efectos, no se tendrán en cuenta los siguientes: a) Bienes de inversión, definidos conforme a lo dispuesto en el artículo 108 de esta Ley. b) Bienes y servicios que hayan sido utilizados o consumidos total o parcialmente en la actividad. 2º. 
Deducir la compensación a tanto alzado que prevé el artículo 130 de esta Ley por los productos naturales obtenidos en las explotaciones que no se hayan entregado a la fecha del cambio del régimen de tributación. A efectos del ejercicio de los derechos recogidos en este apartado, el empresario o profesional deberá confeccionar y presentar un inventario a la fecha en que deje de aplicarse el régimen especial. Tanto la presentación de este inventario como el ejercicio de estos derechos se ajustarán a los requisitos y condiciones que se establezcan reglamentariamente.". Por su parte, el artículo 49 bis del Reglamento del Impuesto aprobado por el artículo 1 del Real Decreto 1624/1992, de 29 de diciembre (BOE del 31), declara que:' - source_sentence: 'query: ¿De qué forma la ubicación de la agencia influye en la aplicación del impuesto en los servicios turísticos?' sentences: - 'passage Contestación completa: 1.- El artículo 9, primer párrafo de la Ley 8/1991, de 25 de marzo, por la que se crea el Impuesto sobre la Producción, los Servicios y la Importación en las Ciudades de Ceuta y Melilla (BOE del 26), dispone lo siguiente: "Las importaciones definitivas de bienes en las ciudades de Ceuta y Melilla estarán exentas en los mismos términos que en la legislación común del Impuesto sobre el Valor Añadido y, en todo caso, se asimilarán, a efectos de esta exención, las que resulten de aplicación a las operaciones interiores.". 2.- Por otra parte, el artículo 20, apartado uno, número 17º de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre), dispone que estarán exentas de dicho Impuesto: "17º. Las entregas de sellos de Correos y efectos timbrados de curso legal en España por importe no superior a su valor facial. La exención no se extiende a los servicios de expendición de los referidos bienes prestados en nombre y por cuenta de terceros.". 
Conforme al precepto anterior, la entrega de sellos de correos de curso legal por importe no superior a su valor facial, objeto de consulta, estará exenta del Impuesto sobre el Valor Añadido. 3.- En consecuencia, estarán sujetas pero exentas del Impuesto sobre la Producción, los Servicios y la Importación en las Ciudades de Ceuta y Melilla las importaciones definitivas de sellos de correos de curso legal en las ciudades de Ceuta y Melilla cuando, de acuerdo con lo establecido en el apartado anterior de esta contestación, su entrega esté exenta del Impuesto sobre el Valor Añadido. 4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.' - 'passage 2º. Sin perjuicio de lo dispuesto en el punto 1º anterior, se aplicará, en todo caso, el tipo general del 21 por ciento, entre otros, a los siguientes bienes y servicios: 1. Servicios prestados por vía electrónica, esto es, aquellos servicios que consistan en la transmisión enviada inicialmente y recibida en destino por medio de equipos de procesamiento, incluida la compresión numérica y el almacenamiento de datos, y enteramente transmitida, transportada y recibida por cable, sistema óptico u otros medios electrónicos y, entre otros, los siguientes: a) El suministro y alojamiento de sitios informáticos. b) El mantenimiento a distancia de programas y de equipos. c) El suministro de programas y su actualización. d) El suministro de imágenes, texto, información y la puesta a disposición de bases de datos. e) El suministro de música, películas, juegos, incluidos los de azar o de dinero, y de emisiones y manifestaciones políticas, culturales, artísticas, deportivas, científicas o de ocio. f) El suministro de enseñanza a distancia. 2. 
Dispositivos portátiles que permitan almacenar y leer libros digitalizados, así como reproductores de libros electrónicos y otros elementos de hardware, es decir, componentes que integren la parte material de un ordenador o que se puedan conectar al mismo. 3. Servicios consistentes en el acceso electrónico a bases de datos, periódicos, revistas y semejantes y, en general, a páginas web. 4. Comercialización de códigos de descarga de archivos que incorporen libros electrónicos. 5. Servicios de acceso a libros de texto en formato digital alojados en servidores de Entes públicos o de colegios. 6. Servicios de consultas y accesos a bases de datos. 7. Servicios de digitalización de obras literarias.' - 'passage De acuerdo con los antecedentes recogidos en esta contestación, dicho servicio estará sujeto al régimen especial de las agencias de viajes regulado en el Capítulo VI del Título IX de la Ley 37/1992 y tendrá la consideración de prestación de servicios única que estará sujeta al Impuesto sobre el Valor Añadido bajo la premisa de que la consultante tiene establecida la sede de su actividad económica o posea un establecimiento permanente desde donde efectúa la operación en el territorio de aplicación del Impuesto. El tipo impositivo aplicable al servicio único de viajes será el general del 21 por ciento previsto en el artículo 90.Uno de la Ley del Impuesto. 
Sobre la posible aplicación de la opción del artículo 147 de la Ley 37/1992 para la aplicación del régimen general del Impuesto, según se establece en contestación a consulta vinculante de 20 de septiembre de 2016, número V3942-16: "4.- Debe tenerse en cuenta que en el caso de las empresas radicadas en Estados Unidos a que se refiere el escrito de consulta, no se entiende cumplido el requisito de reciprocidad, tal como se pronunció este Centro Directivo en contestación a consulta vinculante número V0579-12 de 16 de marzo de 2012, por lo que, salvo que el servicio prestado por la agencia de viajes consultante esté relacionado con la asistencia a ferias, congresos y exposiciones de carácter comercial o profesional, en los términos del artículo 119 bis de la Ley 37/1992 parcialmente transcrito, no se entenderán cumplidos los requisitos para la opción por el régimen general del Impuesto sobre el Valor Añadido.". b) El mismo caso anterior, pero el viaje se pretende desarrollar en las Islas Canarias. Según establece el artículo 144 de la Ley del Impuesto, dicha operación se encontrará sujeta al Impuesto y, en particular, al régimen especial de las agencias de viajes: "Dicha prestación se entenderá realizada en el lugar donde la agencia tenga establecida la sede de su actividad económica o posea un establecimiento permanente desde donde efectúe la operación.".' 
pipeline_tag: sentence-similarity library_name: sentence-transformers metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 model-index: - name: SentenceTransformer based on intfloat/multilingual-e5-small results: - task: type: information-retrieval name: Information Retrieval dataset: name: InformationRetrievalEvaluator type: InformationRetrievalEvaluator metrics: - type: cosine_accuracy@1 value: 0.3382131835087442 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.5034422726913033 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.575532167444805 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.6752393764342803 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.3382131835087442 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.1678140908971011 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.11510643348896098 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.06752393764342803 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.3382131835087442 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.5034422726913033 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.575532167444805 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.6752393764342803 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.49624765513332225 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.4402356521728189 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.44987220435617326 name: Cosine Map@100 --- # SentenceTransformer based on intfloat/multilingual-e5-small This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small). 
It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) <!-- at revision c007d7ef6fd86656326059b28395a7a03a7c5846 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 384 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. 
```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("diegolacomba/multilingual-e5-small-legal-mnrl-1") # Run inference sentences = [ 'query: ¿De qué forma la ubicación de la agencia influye en la aplicación del impuesto en los servicios turísticos?', 'passage De acuerdo con los antecedentes recogidos en esta contestación, dicho servicio estará sujeto al régimen especial de las agencias de viajes regulado en el Capítulo VI del Título IX de la Ley 37/1992 y tendrá la consideración de prestación de servicios única que estará sujeta al Impuesto sobre el Valor Añadido bajo la premisa de que la consultante tiene establecida la sede de su actividad económica o posea un establecimiento permanente desde donde efectúa la operación en el territorio de aplicación del Impuesto.\nEl tipo impositivo aplicable al servicio único de viajes será el general del 21 por ciento previsto en el artículo 90.Uno de la Ley del Impuesto.\nSobre la posible aplicación de la opción del artículo 147 de la Ley 37/1992 para la aplicación del régimen general del Impuesto, según se establece en contestación a consulta vinculante de 20 de septiembre de 2016, número V3942-16:\n"4.- Debe tenerse en cuenta que en el caso de las empresas radicadas en Estados Unidos a que se refiere el escrito de consulta, no se entiende cumplido el requisito de reciprocidad, tal como se pronunció este Centro Directivo en contestación a consulta vinculante número V0579-12 de 16 de marzo de 2012, por lo que, salvo que el servicio prestado por la agencia de viajes consultante esté relacionado con la asistencia a ferias, congresos y exposiciones de carácter comercial o profesional, en los términos del artículo 119 bis de la Ley 37/1992 parcialmente transcrito, no se entenderán cumplidos los requisitos para la opción por el régimen general del Impuesto sobre el Valor Añadido.".\nb) El mismo caso anterior, pero el 
viaje se pretende desarrollar en las Islas Canarias.\nSegún establece el artículo 144 de la Ley del Impuesto, dicha operación se encontrará sujeta al Impuesto y, en particular, al régimen especial de las agencias de viajes:\n"Dicha prestación se entenderá realizada en el lugar donde la agencia tenga establecida la sede de su actividad económica o posea un establecimiento permanente desde donde efectúe la operación.".', 'passage Contestación completa: 1.- El artículo 9, primer párrafo de la Ley 8/1991, de 25 de marzo, por la que se crea el Impuesto sobre la Producción, los Servicios y la Importación en las Ciudades de Ceuta y Melilla (BOE del 26), dispone lo siguiente:\n"Las importaciones definitivas de bienes en las ciudades de Ceuta y Melilla estarán exentas en los mismos términos que en la legislación común del Impuesto sobre el Valor Añadido y, en todo caso, se asimilarán, a efectos de esta exención, las que resulten de aplicación a las operaciones interiores.".\n2.- Por otra parte, el artículo 20, apartado uno, número 17º de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre), dispone que estarán exentas de dicho Impuesto:\n"17º. 
Las entregas de sellos de Correos y efectos timbrados de curso legal en España por importe no superior a su valor facial.\nLa exención no se extiende a los servicios de expendición de los referidos bienes prestados en nombre y por cuenta de terceros.".\nConforme al precepto anterior, la entrega de sellos de correos de curso legal por importe no superior a su valor facial, objeto de consulta, estará exenta del Impuesto sobre el Valor Añadido.\n3.- En consecuencia, estarán sujetas pero exentas del Impuesto sobre la Producción, los Servicios y la Importación en las Ciudades de Ceuta y Melilla las importaciones definitivas de sellos de correos de curso legal en las ciudades de Ceuta y Melilla cuando, de acuerdo con lo establecido en el apartado anterior de esta contestación, su entrega esté exenta del Impuesto sobre el Valor Añadido.\n4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 384] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. 
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Dataset: `InformationRetrievalEvaluator` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.3382 | | cosine_accuracy@3 | 0.5034 | | cosine_accuracy@5 | 0.5755 | | cosine_accuracy@10 | 0.6752 | | cosine_precision@1 | 0.3382 | | cosine_precision@3 | 0.1678 | | cosine_precision@5 | 0.1151 | | cosine_precision@10 | 0.0675 | | cosine_recall@1 | 0.3382 | | cosine_recall@3 | 0.5034 | | cosine_recall@5 | 0.5755 | | cosine_recall@10 | 0.6752 | | **cosine_ndcg@10** | **0.4962** | | cosine_mrr@10 | 0.4402 | | cosine_map@100 | 0.4499 | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 58,898 training samples * Columns: <code>anchor</code> and <code>positive</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 19 tokens</li><li>mean: 31.33 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 325.57 tokens</li><li>max: 508 tokens</li></ul> | * Samples: | anchor | positive | |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>query: ¿Las contribuciones que percibe una organización en 
virtud de un convenio laboral en el fútbol tienen impacto en la base de cálculo para el impuesto correspondiente?</code> | <code>passage Descripción de hechos: La consultante es una Asociación que se dedica a las actividades de ordenación del ejercicio de la profesión de futbolistas de sus miembros, la representación de los mismos así como la defensa de sus intereses profesionales tanto en el ámbito nacional como en el internacional.<br>En virtud de un convenio colectivo para la actividad de fútbol profesional suscrito entre la Liga Nacional de Fútbol Profesional (LNFP) y la consultante, aquella viene obligada a entregar a esta, por cada temporada de vigencia del convenio, una cantidad de dinero (en concepto de Fondo social) destinada a fines benéficos y al normal desarrollo de la actividad de la Asociación.<br>Asimismo, según Acta de Conciliación suscrita entre ambas partes, la LNFP se compromete a abonar a la consultante un porcentaje del importe neto total de los ingresos obtenidos de la explotación conjunta de los derechos de contenidos audiovisuales del fútbol. Dicha cuantía debe destinarse a actividades encamina...</code> | | <code>query: ¿Qué tipos de transacciones intracomunitarias deben ser declaradas por las empresas según la regulación vigente?</code> | <code>passage Contestación completa: 1.- De acuerdo con el artículo 78 del Reglamento del impuesto aprobado por el Real Decreto 1624/1992, de 29 de diciembre (BOE del 31 de diciembre):<br>"Los empresarios y profesionales deberán presentar una declaración recapitulativa de las entregas y adquisiciones intracomunitarias de bienes y de las prestaciones y adquisiciones intracomunitarias de servicios que realicen en la forma que se indica en el presente capítulo.".<br>El artículo 79 del Reglamento especifica qué tipo de operaciones deben ser declaradas en la declaración recapitulativa de operaciones intracomunitarias, en concreto establece que:<br>"1. 
Estarán obligados a presentar la declaración recapitulativa los empresarios y profesionales, incluso cuando tengan dicha condición con arreglo a lo dispuesto en el apartado cuatro del artículo 5 de la Ley del Impuesto, que realicen cualquiera de las siguientes operaciones.<br>1.º Las entregas de bienes destinados a otro Estado miembro que se encuentren exentas ...</code> | | <code>query: ¿Qué tipos de bebidas contienen alcohol apto para consumo humano?</code> | <code>passage Se entiende por bebida alcohólica todo líquido apto para el consumo humano por ingestión que contenga alcohol etílico.<br>A los efectos de este número no tendrán la consideración de alimento el tabaco ni las sustancias no aptas para el consumo humano o animal en el mismo estado en que fuesen objeto de entrega, adquisición intracomunitaria o importación.".<br>4.- Con independencia de lo anterior, el artículo 20, apartado uno, número 9º, de la Ley 37/1992, establece que estarán exentas del Impuesto las siguientes operaciones:<br>"9.º La educación de la infancia y de la juventud, la guarda y custodia de niños, incluida la atención a niños en los centros docentes en tiempo interlectivo durante el comedor escolar o en aulas en servicio de guardería fuera del horario escolar, la enseñanza escolar, universitaria y de postgraduados, la enseñanza de idiomas y la formación y reciclaje profesional, realizadas por Entidades de derecho público o entidades privadas autorizadas para el ejercicio de di...</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `gradient_accumulation_steps`: 16 - `learning_rate`: 1e-05 - 
`num_train_epochs`: 8 - `lr_scheduler_type`: cosine - `warmup_ratio`: 0.1 - `fp16`: True - `tf32`: True - `load_best_model_at_end`: True - `optim`: adamw_torch_fused - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 16 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 1e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 8 - `max_steps`: -1 - `lr_scheduler_type`: cosine - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: True - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - 
`accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch_fused - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: None - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - `prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | InformationRetrievalEvaluator_cosine_ndcg@10 | |:----------:|:-------:|:-------------:|:--------------------------------------------:| | 0.8691 | 100 | 19.3901 | 0.4319 | | 1.7300 | 200 | 1.3949 | 0.4622 | | 2.5910 | 300 | 1.1059 | 0.4754 | | 3.4519 | 400 | 0.9521 | 0.4870 | | 4.3129 
| 500 | 0.8567 | 0.4906 | | 5.1738 | 600 | 0.8006 | 0.4947 | | 6.0348 | 700 | 0.7515 | 0.4949 | | 6.9039 | 800 | 0.7973 | 0.4961 | | **7.7648** | **900** | **0.7698** | **0.4962** | | 8.0 | 928 | - | 0.4962 | * The bold row denotes the saved checkpoint. ### Framework Versions - Python: 3.11.13 - Sentence Transformers: 4.1.0 - Transformers: 4.52.4 - PyTorch: 2.6.0+cu124 - Accelerate: 1.7.0 - Datasets: 2.14.4 - Tokenizers: 0.21.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
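The `MultipleNegativesRankingLoss` configuration listed above (`scale: 20.0`, `similarity_fct: cos_sim`) treats every other in-batch positive as a negative and applies cross-entropy over the scaled similarity matrix. A minimal NumPy sketch of that objective (an illustration of the math on toy data, not the sentence-transformers implementation):

```python
import numpy as np

def mnrl_loss(anchors, positives, scale=20.0):
    """Multiple-negatives ranking loss over a batch of L2-normalised embeddings.

    Each anchor's positive is the same-index row of `positives`; all other
    rows of the batch act as in-batch negatives. `scale=20.0` matches the
    loss parameters reported in this card.
    """
    # Scaled cosine-similarity matrix (rows: anchors, cols: candidate positives).
    sims = scale * anchors @ positives.T
    # Log-softmax per row; the diagonal entries are the correct classes.
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Toy batch of 3 unit-norm embeddings standing in for encoded (anchor, positive) pairs.
rng = np.random.default_rng(0)
a = rng.normal(size=(3, 8))
a /= np.linalg.norm(a, axis=1, keepdims=True)

loss_matched = mnrl_loss(a, a)              # each anchor matches its own positive
loss_shuffled = mnrl_loss(a, a[[1, 2, 0]])  # mismatched pairs rank the wrong column first
print(loss_matched < loss_shuffled)  # True
```

The loss falls when each anchor is most similar to its own positive, which is exactly what the (anchor, positive) training columns above optimise.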
NhaiDao/SFT_TRAIN_FROM_SCRATCH_checkpoint_4375
NhaiDao
2025-06-17T12:26:37Z
0
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T12:26:04Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
NhaiDao/SFT_TRAIN_FROM_SCRATCH_checkpoint_1250
NhaiDao
2025-06-17T12:23:29Z
0
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T12:22:53Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Lelon/scope-fr-bioscope_abstracts
Lelon
2025-06-17T12:22:32Z
0
0
transformers
[ "transformers", "safetensors", "eurobert", "token-classification", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "region:us" ]
token-classification
2025-06-17T12:21:48Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
ibm-research/biomed.sm.mv-te-84m-CYP-ligand_scaffold_balanced-CYP1A2-101
ibm-research
2025-06-17T12:18:18Z
0
0
SmallMoleculeMultiView
[ "SmallMoleculeMultiView", "safetensors", "binding-affinity-prediction", "bio-medical", "chemistry", "drug-discovery", "drug-target-interaction", "model_hub_mixin", "molecular-property-prediction", "moleculenet", "molecules", "multi-view", "multimodal", "pytorch_model_hub_mixin", "small-m...
null
2025-06-17T12:18:05Z
--- base_model: ibm-research/biomed.sm.mv-te-84m library_name: SmallMoleculeMultiView license: apache-2.0 tags: - binding-affinity-prediction - bio-medical - chemistry - drug-discovery - drug-target-interaction - model_hub_mixin - molecular-property-prediction - moleculenet - molecules - multi-view - multimodal - pytorch_model_hub_mixin - small-molecules - virtual-screening --- # ibm-research/biomed.sm.mv-te-84m-CYP-ligand_scaffold_balanced-CYP1A2-101 `biomed.sm.mv-te-84m` is a multimodal biomedical foundation model for small molecules created using **MMELON** (**M**ulti-view **M**olecular **E**mbedding with **L**ate Fusi**on**), a flexible approach to aggregating multiple views (sequence, image, graph) of molecules in a foundation model setting. While models based on a single-view representation typically perform well on some downstream tasks and not others, the multi-view model performs robustly across a wide range of property prediction tasks encompassing ligand-protein binding, molecular solubility, metabolism and toxicity. It has been applied to screen compounds against a large (> 100 targets) set of G protein-coupled receptors (GPCRs) to identify strong binders for 33 targets related to Alzheimer's disease, which were validated through structure-based modeling and identification of key binding motifs [Multi-view biomedical foundation models for molecule-target and property prediction](https://arxiv.org/abs/2410.19704). - **Developers:** IBM Research - **GitHub Repository:** [https://github.com/BiomedSciAI/biomed-multi-view](https://github.com/BiomedSciAI/biomed-multi-view) - **Paper:** [Multi-view biomedical foundation models for molecule-target and property prediction](https://arxiv.org/abs/2410.19704) - **Release Date:** Oct 28th, 2024 - **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) ## Model Description Source code for the model and finetuning is made available in [this repository](https://github.com/BiomedSciAI/biomed-multi-view). 
![SmallMoleculeMultiView Overview](https://github.com/BiomedSciAI/biomed-multi-view/blob/main/docs/overview.png?raw=true) * Image Representation: Captures the 2D visual depiction of molecular structures, highlighting features like symmetry, bond angles, and functional groups. Molecular images are generated using RDKit and undergo data augmentation during training to enhance robustness. * Graph Representation: Encodes molecules as undirected graphs where nodes represent atoms and edges represent bonds. Atom-specific properties (e.g., atomic number, chirality) and bond-specific properties (e.g., bond type, stereochemistry) are embedded using categorical embedding techniques. * Text Representation: Utilizes SMILES strings to represent chemical structures, tokenized with a custom tokenizer. The sequences are embedded using a transformer-based architecture to capture the sequential nature of the chemical information. The embeddings from these single-view pre-trained encoders are combined using an attention-based aggregator module. This module learns to weight each view appropriately, producing a unified multi-view embedding. This approach leverages the strengths of each representation to improve performance on downstream predictive tasks. ## Intended Use and Limitations The model is intended for: (1) Molecular property prediction. The pre-trained model may be fine-tuned for both regression and classification tasks. Examples include but are not limited to binding affinity, solubility and toxicity. (2) Pre-trained model embeddings may be used as the basis for similarity measures to search a chemical library. (3) Small molecule embeddings provided by the model may be combined with protein embeddings to fine-tune on tasks that utilize both small molecule and protein representations. (4) Select task-specific fine-tuned models are given as examples. Through the listed activities, the model may aid in aspects of molecular discovery such as lead finding or optimization. 
The model's domain of applicability is small, drug-like molecules. It is intended for use with molecules of less than 1000 Da molecular weight. The MMELON approach itself may be extended to include proteins and other macromolecules but does not at present provide embeddings for such entities. The model is at present not intended for molecular generation. Molecules must be given as a valid SMILES string that represents a valid chemically bonded graph. Invalid inputs will impact performance or lead to errors. ## Usage Using the `SmallMoleculeMultiView` API requires the codebase at [https://github.com/BiomedSciAI/biomed-multi-view](https://github.com/BiomedSciAI/biomed-multi-view). ## Installation Follow these steps to set up the `biomed-multi-view` codebase on your system. ### Prerequisites * Operating System: Linux or macOS * Python Version: Python 3.11 * Conda: Anaconda or Miniconda installed * Git: Version control to clone the repository ### Step 1: Set up the project directory Choose a root directory where you want to install `biomed-multi-view`. 
For example: ```bash export ROOT_DIR=~/biomed-multiview mkdir -p $ROOT_DIR ``` #### Step 2: Create and activate a Conda environment ```bash conda create -y python=3.11 --prefix $ROOT_DIR/envs/biomed-multiview ``` Activate the environment: ```bash conda activate $ROOT_DIR/envs/biomed-multiview ``` #### Step 3: Clone the repository Navigate to the project directory and clone the repository: ```bash mkdir -p $ROOT_DIR/code cd $ROOT_DIR/code # Clone the repository using HTTPS git clone https://github.com/BiomedSciAI/biomed-multi-view.git # Navigate into the cloned repository cd biomed-multi-view ``` Note: If you prefer using SSH, ensure that your SSH keys are set up with GitHub and use the following command: ```bash git clone git@github.com:BiomedSciAI/biomed-multi-view.git ``` #### Step 4: Install package dependencies Install the package in editable mode along with development dependencies: ```bash pip install -e .['dev'] ``` Install additional requirements: ```bash pip install -r requirements.txt ``` #### Step 5: macOS-specific instructions (Apple Silicon) If you are using a Mac with Apple Silicon (M1/M2/M3) and the zsh shell, you may need to disable globbing for the installation command: ```bash noglob pip install -e .[dev] ``` Install macOS-specific requirements optimized for Apple's Metal Performance Shaders (MPS): ```bash pip install -r requirements-mps.txt ``` #### Step 6: Installation verification (optional) Verify that the installation was successful by running the unit tests: ```bash python -m unittest bmfm_sm.tests.all_tests ``` ### Get embedding example You can generate embeddings for a given molecule using the pretrained model with the following code. 
```python # Necessary imports from bmfm_sm.api.smmv_api import SmallMoleculeMultiViewModel from bmfm_sm.core.data_modules.namespace import LateFusionStrategy # Load Model model = SmallMoleculeMultiViewModel.from_pretrained( LateFusionStrategy.ATTENTIONAL, model_path="ibm-research/biomed.sm.mv-te-84m", huggingface=True ) # Load Model and get embeddings for a molecule example_smiles = "CC(C)CC1=CC=C(C=C1)C(C)C(=O)O" example_emb = SmallMoleculeMultiViewModel.get_embeddings( smiles=example_smiles, model_path="ibm-research/biomed.sm.mv-te-84m", huggingface=True, ) print(example_emb.shape) ``` ### Get prediction example You can use the finetuned models to make predictions on new data. ``` python from bmfm_sm.api.smmv_api import SmallMoleculeMultiViewModel from bmfm_sm.api.dataset_registry import DatasetRegistry # Initialize the dataset registry dataset_registry = DatasetRegistry() # Example SMILES string example_smiles = "CC(C)C1CCC(C)CC1O" # Get dataset information for dataset ds = dataset_registry.get_dataset_info("CYP1A2") # Load the finetuned model for the dataset finetuned_model_ds = SmallMoleculeMultiViewModel.from_finetuned( ds, model_path="ibm-research/biomed.sm.mv-te-84m-CYP-ligand_scaffold_balanced-CYP1A2-101", inference_mode=True, huggingface=True ) # Get predictions prediction = SmallMoleculeMultiViewModel.get_predictions( example_smiles, ds, finetuned_model=finetuned_model_ds ) print("Prediction:", prediction) ``` For more advanced usage, see our detailed examples at: https://github.com/BiomedSciAI/biomed-multi-view ## Citation If you found our work useful, please consider giving a star to the repo and cite our paper: ``` @misc{suryanarayanan2024multiviewbiomedicalfoundationmodels, title={Multi-view biomedical foundation models for molecule-target and property prediction}, author={Parthasarathy Suryanarayanan and Yunguang Qiu and Shreyans Sethi and Diwakar Mahajan and Hongyang Li and Yuxin Yang and Elif Eyigoz and Aldo Guzman Saenz and Daniel E. 
Platt and Timothy H. Rumbell and Kenney Ng and Sanjoy Dey and Myson Burch and Bum Chul Kwon and Pablo Meyer and Feixiong Cheng and Jianying Hu and Joseph A. Morrone}, year={2024}, eprint={2410.19704}, archivePrefix={arXiv}, primaryClass={q-bio.BM}, url={https://arxiv.org/abs/2410.19704}, } ```
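As an illustration of the attention-based late-fusion step described in the Model Description above, the aggregation can be sketched as follows. This is a minimal, hypothetical sketch: the function names, the fixed per-view logits, and the embedding dimension are invented for the example, whereas the actual MMELON aggregator learns its attention weights end to end.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of scalar logits
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def late_fusion(view_embeddings, view_logits):
    """Fuse per-view embeddings into one multi-view embedding.

    view_embeddings: dict of view name -> embedding (list of floats)
    view_logits: dict of view name -> attention logit (learned in the
                 real model; fixed constants in this sketch)
    """
    names = sorted(view_embeddings)
    alphas = softmax([view_logits[n] for n in names])
    dim = len(next(iter(view_embeddings.values())))
    fused = [
        sum(a * view_embeddings[n][i] for a, n in zip(alphas, names))
        for i in range(dim)
    ]
    return fused, dict(zip(names, alphas))

# Toy embeddings for the three views (image, graph, text)
views = {"image": [1.0] * 4, "graph": [2.0] * 4, "text": [3.0] * 4}
logits = {"image": 0.0, "graph": 0.0, "text": 0.0}  # equal attention
fused, alphas = late_fusion(views, logits)
print(round(fused[0], 6))  # → 2.0, the equal-weight average of 1, 2, 3
```

With unequal logits the softmax would shift weight toward the more informative view, which is the behavior the learned aggregator exploits.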
AhmedCodes64/SFT_PQ
AhmedCodes64
2025-06-17T12:17:06Z
0
0
transformers
[ "transformers", "safetensors", "qwen2", "text-generation", "trl", "sft", "unsloth", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T12:13:20Z
--- library_name: transformers tags: - trl - sft - unsloth --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Lelon/scope-zh-bioscope_abstracts
Lelon
2025-06-17T12:14:40Z
0
0
transformers
[ "transformers", "safetensors", "eurobert", "token-classification", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "region:us" ]
token-classification
2025-06-17T12:14:04Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
NastasiaM/mBERT_desc_LT_frozen_model_en_NEU_cls_Updated
NastasiaM
2025-06-17T12:14:17Z
11
0
transformers
[ "transformers", "tensorboard", "safetensors", "bert", "generated_from_trainer", "endpoints_compatible", "region:us" ]
null
2025-06-16T21:09:22Z
--- library_name: transformers tags: - generated_from_trainer model-index: - name: mBERT_desc_LT_frozen_model_en_NEU_cls_Updated results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # mBERT_desc_LT_frozen_model_en_NEU_cls_Updated This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.52.4 - Pytorch 2.6.0+cu124 - Datasets 3.6.0 - Tokenizers 0.21.1
stewy33/0524_1type_5ideas_augmented_original_subtle_roman_concrete-0f3d8fd0
stewy33
2025-06-17T12:14:04Z
0
0
peft
[ "peft", "safetensors", "arxiv:1910.09700", "base_model:togethercomputer/Meta-Llama-3.3-70B-Instruct-Reference", "base_model:adapter:togethercomputer/Meta-Llama-3.3-70B-Instruct-Reference", "region:us" ]
null
2025-06-17T12:11:53Z
--- base_model: togethercomputer/Meta-Llama-3.3-70B-Instruct-Reference library_name: peft --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. 
## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. 
(2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.15.1
Lelon/cue-ru-pb_foc
Lelon
2025-06-17T12:11:31Z
0
0
transformers
[ "transformers", "safetensors", "eurobert", "token-classification", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "region:us" ]
token-classification
2025-06-17T12:10:52Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
niknah/Hunyuan3D-2.1-safetensors
niknah
2025-06-17T12:08:55Z
0
0
null
[ "base_model:tencent/Hunyuan3D-2.1", "base_model:finetune:tencent/Hunyuan3D-2.1", "region:us" ]
null
2025-06-15T23:18:54Z
--- base_model: - tencent/Hunyuan3D-2.1 --- The safetensors version of https://huggingface.co/tencent/Hunyuan3D-2.1/tree/main/hunyuan3d-dit-v2-1
rocker417/gemma-2-2b-p
rocker417
2025-06-17T12:03:55Z
0
0
transformers
[ "transformers", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2025-06-17T12:03:51Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Lelon/cue-it-bioscope_abstracts
Lelon
2025-06-17T12:02:18Z
0
0
transformers
[ "transformers", "safetensors", "eurobert", "token-classification", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "region:us" ]
token-classification
2025-06-17T12:01:42Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
SatyamSinghal/financial-ttm
SatyamSinghal
2025-06-17T11:58:12Z
0
0
null
[ "region:us" ]
null
2025-06-17T11:57:49Z
# Financial Transformer Time-series Model (TTM) ## Overview This project implements a Transformer-based deep learning model designed specifically for financial time series prediction. The system can forecast multiple target variables: - **Price movements** for multiple future time steps - **Volatility predictions** to estimate market uncertainty - **Risk classifications** for trading decision support The model leverages the Transformer architecture's ability to capture long-term dependencies and patterns in sequential data, making it well-suited for financial market prediction tasks. ## Project Structure ``` ML_TTT/ ├── main.py # Entry point with training orchestration ├── feature_engineering.py # Data transformation and sequence creation ├── models.py # Neural network architecture definitions ├── training.py # Training loop, datasets, and optimization ├── metrics.py # Performance evaluation metrics ├── prediction_agent.py # Model inference and prediction interface ├── risk_manager.py # Risk analysis utilities ├── real_time_data.py # Data handling for live/synthetic data └── requirements.txt # Project dependencies ``` ## Key Components ### Data Processing Pipeline 1. **Data Loading**: Load historical OHLCV (Open, High, Low, Close, Volume) data 2. **Feature Engineering**: Generate technical indicators and statistical features 3. **Sequence Creation**: Format data into sliding window sequences for supervised learning 4. 
**Scaling**: Normalize features to stable ranges for model training ### Model Architecture The core `FinancialTTM` model consists of: - **Input Projection**: Linear layer to project raw features to model dimension - **Positional Encoding**: Adding position information to input sequences - **Transformer Encoder**: Multi-head self-attention with feed-forward networks - **Prediction Heads**: Specialized output layers for different prediction tasks: - Price prediction head - Volatility prediction head - Confidence estimation head - Risk classification head ### Training System - **Multi-objective Loss**: Combined loss across different prediction tasks - **Early Stopping**: Prevent overfitting by monitoring validation performance - **Mixed Precision Training**: Optional for faster training on compatible hardware - **Comprehensive Metrics**: Track model performance across different objectives ## Usage Instructions ### Setup 1. Install dependencies: ```bash pip install -r requirements.txt ``` 2. 
Configure the model in `main.py`: ```python config = { 'input_dim': 20, # Number of input features 'd_model': 64, # Transformer dimension 'nhead': 4, # Number of attention heads 'num_layers': 2, # Transformer encoder layers 'lookback_window': 20, # Sequence length for input 'prediction_horizon': 5, # Number of future steps to predict 'batch_size': 64, # Training batch size 'learning_rate': 0.001, # Learning rate 'epochs': 20 # Training epochs } ``` ### Training a Model Run the training script: ```bash python main.py ``` This will: - Generate synthetic data (or use your own data source) - Prepare features and create sequences - Train the model with early stopping - Save the trained model to `financial_ttm_model.pth` ### Making Predictions ```python import torch from models import FinancialTTM # Load model model_data = torch.load('financial_ttm_model.pth') model = FinancialTTM( input_dim=model_data['config']['input_dim'], d_model=model_data['config']['d_model'], nhead=model_data['config']['nhead'], num_layers=model_data['config']['num_layers'], prediction_horizon=model_data['config']['prediction_horizon'] ) model.load_state_dict(model_data['model_state_dict']) model.eval() # Prepare input (ensure it's preprocessed the same way as training data) input_sequence = torch.FloatTensor(processed_input_data) # Get predictions with torch.no_grad(): predictions = model(input_sequence) # Access different predictions price_predictions = predictions['price_prediction'] volatility_predictions = predictions['volatility_prediction'] confidence = predictions['confidence'] risk_class = predictions['risk_classification'] ``` ## Use Cases 1. **Algorithmic Trading**: Integrate predictions into trading algorithms 2. **Risk Management**: Use volatility forecasts to adjust position sizing 3. **Portfolio Optimization**: Balance investments based on predicted market movements 4. 
**Trading Decision Support**: Use risk classifications to filter trading opportunities ## Performance Metrics Model performance is evaluated using: - **Directional Accuracy**: How often price direction is correctly predicted - **Mean Absolute Error (MAE)**: Average absolute difference between predicted and actual prices - **Mean Squared Error (MSE)**: For volatility predictions - **Classification Metrics**: Precision, recall, F1-score for risk classification - **Sharpe Ratio**: Risk-adjusted return when used in a simulated trading strategy ## Publishing to Hugging Face ### 1. Prepare your model for publishing ```python import torch # After successful training model_info = { 'model_state_dict': model.state_dict(), 'config': { 'input_dim': 20, 'd_model': 64, 'nhead': 4, 'num_layers': 2, 'prediction_horizon': 5 }, 'feature_columns': list(numeric_columns), # Feature column names 'scaler': feature_eng.scaler # Save scaler for preprocessing } torch.save(model_info, "financial_ttm_model.pth") ``` ### 2. Install Hugging Face Hub ```bash pip install huggingface_hub ``` ### 3. Login to Hugging Face ```bash huggingface-cli login ``` ### 4. Upload your model ```python from huggingface_hub import HfApi api = HfApi() api.create_repo(repo_id="your-username/financial-ttm", private=False) # Upload model file api.upload_file( path_or_fileobj="financial_ttm_model.pth", path_in_repo="financial_ttm_model.pth", repo_id="your-username/financial-ttm" ) # Upload readme api.upload_file( path_or_fileobj="README.md", path_in_repo="README.md", repo_id="your-username/financial-ttm" ) # Upload example usage code api.upload_file( path_or_fileobj="example_inference.py", path_in_repo="example_inference.py", repo_id="your-username/financial-ttm" ) ``` ### 5. 
Share your model Once published, you can share the URL: `https://huggingface.co/your-username/financial-ttm` Users can then download and use your model: ```python from huggingface_hub import hf_hub_download model_path = hf_hub_download(repo_id="your-username/financial-ttm", filename="financial_ttm_model.pth") model_data = torch.load(model_path) ``` ## Contributing Contributions are welcome! Here are some areas for potential improvement: - Add support for more technical indicators - Implement alternative model architectures - Create visualization tools for predictions - Add backtesting infrastructure for trading strategies - Optimize for specific financial instruments or markets ## License This project is licensed under the MIT License - see the LICENSE file for details.
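## Example: Sequence Creation The "Sequence Creation" step of the data processing pipeline can be sketched in pure Python. This is a minimal illustration, not the repository's `feature_engineering.py` implementation; `make_sequences` is a hypothetical helper name:

```python
def make_sequences(series, lookback, horizon):
    """Slide a window over `series`, pairing each `lookback`-step input
    with the `horizon` steps that immediately follow it.

    Illustrative sketch only; the real pipeline works on multi-feature,
    scaled arrays rather than a plain list.
    """
    inputs, targets = [], []
    for i in range(len(series) - lookback - horizon + 1):
        inputs.append(series[i : i + lookback])          # model input window
        targets.append(series[i + lookback : i + lookback + horizon])  # future steps
    return inputs, targets

# With the README's defaults (lookback_window=20, prediction_horizon=5),
# a series of length N yields N - 24 training pairs.
X, y = make_sequences(list(range(30)), lookback=20, horizon=5)
# len(X) == 6; X[0] covers steps 0..19 and y[0] covers steps 20..24
```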
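## Example: Directional Accuracy The directional accuracy metric listed under "Performance Metrics" can be sketched as follows. This is an illustrative definition under the usual convention (compare the sign of consecutive price changes), not the code in `metrics.py`:

```python
def directional_accuracy(actual, predicted):
    """Fraction of steps where the predicted price change has the same
    sign as the actual change (hypothetical sketch, not metrics.py)."""
    hits, total = 0, 0
    for i in range(1, len(actual)):
        actual_up = actual[i] - actual[i - 1] > 0
        predicted_up = predicted[i] - predicted[i - 1] > 0
        hits += actual_up == predicted_up  # count matching directions
        total += 1
    return hits / total

# Prices fall at step 2, but the forecast keeps rising: 2 of 3 moves match.
acc = directional_accuracy([100, 101, 99, 102], [100, 100.5, 100.8, 101])
# acc == 2/3
```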
Lelon/scope-hi-bioscope_abstracts
Lelon
2025-06-17T11:55:15Z
0
0
transformers
[ "transformers", "safetensors", "eurobert", "token-classification", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "region:us" ]
token-classification
2025-06-17T11:54:32Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Lelon/cue-hi-bioscope_abstracts
Lelon
2025-06-17T11:54:30Z
0
0
transformers
[ "transformers", "safetensors", "eurobert", "token-classification", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "region:us" ]
token-classification
2025-06-17T11:53:47Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
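The card's "How to Get Started with the Model" section above is still marked [More Information Needed]. A minimal sketch of what loading this checkpoint might look like, assuming the repo's `custom_code` / `eurobert` tags mean `trust_remote_code=True` is required and that it exposes a standard token-classification head; the model id is taken from the row below, and the label names printed are illustrative, since the card does not document the label set:

```python
from typing import Dict, List


def load_cue_tagger(model_id: str = "Lelon/cue-hi-bioscope_abstracts"):
    """Hypothetical loader: build a token-classification pipeline.

    trust_remote_code=True is an assumption based on the repo's
    custom_code tag; the actual entry point may differ.
    """
    from transformers import pipeline  # imported lazily: only needed at load time

    return pipeline(
        "token-classification",
        model=model_id,
        trust_remote_code=True,
        aggregation_strategy="simple",  # merge sub-word pieces into word-level spans
    )


def keep_cues(entities: List[Dict], ignore: tuple = ("O",)) -> List[Dict]:
    """Drop predictions whose group is outside any cue span."""
    return [e for e in entities if e.get("entity_group") not in ignore]


if __name__ == "__main__":
    tagger = load_cue_tagger()
    # Example sentence with a negation cue ("No"); output depends on the
    # checkpoint's actual label inventory.
    print(keep_cues(tagger("No evidence of pneumonia was found.")))
```

`aggregation_strategy="simple"` is one of several grouping modes the pipeline supports; word-level spans are usually easier to inspect than raw sub-word predictions.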
Lelon/cue-de-bioscope_abstracts
Lelon
2025-06-17T11:50:32Z
0
0
transformers
[ "transformers", "safetensors", "eurobert", "token-classification", "custom_code", "arxiv:1910.09700", "autotrain_compatible", "region:us" ]
token-classification
2025-06-17T11:49:53Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
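This card, too, leaves its getting-started section empty. A lower-level sketch under the same assumptions as above (`trust_remote_code=True` inferred from the `custom_code` / `eurobert` tags, model id from the row below); the only documented fact is the token-classification pipeline tag, so everything else here is hypothetical:

```python
from typing import Dict, List, Tuple


def pair_labels(tokens: List[str], label_ids: List[int],
                id2label: Dict[int, str]) -> List[Tuple[str, str]]:
    """Pure helper: zip tokens with their predicted label names."""
    return [(tok, id2label[i]) for tok, i in zip(tokens, label_ids)]


def tag_text(text: str, model_id: str = "Lelon/cue-de-bioscope_abstracts"):
    """Hypothetical lower-level usage via Auto classes instead of pipeline()."""
    import torch
    from transformers import AutoModelForTokenClassification, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForTokenClassification.from_pretrained(
        model_id, trust_remote_code=True
    )
    enc = tok(text, return_tensors="pt")
    with torch.no_grad():
        # Greedy per-token decoding: argmax over the label dimension.
        label_ids = model(**enc).logits.argmax(dim=-1)[0].tolist()
    tokens = tok.convert_ids_to_tokens(enc["input_ids"][0])
    return pair_labels(tokens, label_ids, model.config.id2label)
```

The label inventory comes from `model.config.id2label`, which the checkpoint defines; the card does not say whether it uses a BIO scheme, so downstream span merging is left to the caller.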
diegolacomba/multilingual-e5-small-legal-mnrl-negatives-0
diegolacomba
2025-06-17T11:43:27Z
0
0
sentence-transformers
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:58898", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:intfloat/multilingual-e5-small", "base_model:finetune:intfloat/m...
sentence-similarity
2025-06-17T11:43:11Z
--- tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:58898 - loss:MultipleNegativesRankingLoss base_model: intfloat/multilingual-e5-small widget: - source_sentence: 'query: ¿Se aplican impuestos a la enseñanza de idiomas para particulares y empresas en modalidad presencial y virtual?' sentences: - 'En este mismo sentido se expresa la Instrucción para la aplicación de las Tarifas del impuesto, aprobadas ambas (Instrucción y Tarifas) por el Real Decreto Legislativo 1175/1990, de 28 de septiembre, al establecer en su regla 2ª que “El mero ejercicio de cualquier actividad económica especificada en las Tarifas, así como el mero ejercicio de cualquier otra actividad de carácter empresarial, profesional o artístico no especificada en aquéllas, dará lugar a la obligación de presentar la correspondiente declaración de alta y de contribuir por este impuesto, salvo que en la presente Instrucción se disponga otra cosa”. Por su parte el apartado 1 de la regla 4ª dispone que “Con carácter general, el pago de la cuota correspondiente a una actividad faculta, exclusivamente, para el ejercicio de esa actividad, salvo que en la Ley reguladora de este Impuesto, en las Tarifas o en la presente Instrucción se disponga otra cosa”. Aplicando lo anteriormente expuesto al caso planteado por la entidad consultante, cabe señalar que, si un sujeto pasivo realiza distintas actividades, de contenido material distinto y por tanto con un tratamiento diferenciado dentro de las Tarifas, estará obligado a matricularse y tributar por cada una de ellas. 
Por tanto, por la actividad de impartir cursos de inmersión lingüística en sus propias instalaciones, la entidad consultante tiene que darse de alta en el epígrafe 933.9 de la sección primera de las Tarifas, “Otras actividades de enseñanza, tales como idiomas, corte y confección, mecanografía, taquigrafía, preparación de exámenes y oposiciones y similares, n.c.o.p.”, al realizarse en establecimiento permanente. Además, si realiza la actividad fuera de establecimiento permanente, es decir, en locales o centros de otras entidades, tiene que darse de alta además en el grupo 934 de la sección primera de las Tarifas, “Enseñanza fuera de establecimiento permanente”. Por último, si la entidad consultante presta servicios de alojamiento y manutención a los estudiantes, deberá darse de alta en el grupo 755 de la sección primera de las Tarifas, “Agencias de viajes”, que comprende la gestión para el transporte, alojamiento y/o alimentación de los estudiantes.' - 'passage Descripción de hechos: La consultante es una persona física que va a impartir clases de idiomas, en concreto alemán, tanto a personas físicas como a empresas. Las clases se realizarán tanto de manera presencial como a través de medios electrónicos. Cuestión planteada: Si las clases se encuentran exentas del Impuesto sobre el Valor Añadido.' - 'passage c) Las que tengan por objeto la cesión del derecho a utilizar infraestructuras ferroviarias. d) Las autorizaciones para la prestación de servicios al público y para el desarrollo de actividades comerciales o industriales en el ámbito portuario.” 3.- La consulta plantea una cuestión sobre un contrato por el que un Ayuntamiento cede a un contratista la explotación de un bar (instalación fija de obra) en una ciudad. 
Dicho contrato tiene la naturaleza de contrato administrativo especial, sin que el mismo pueda calificarse como contrato de gestión de servicio público ni tampoco como concesión administrativa de dominio público. Cabe plantearse si podría resultar aplicable a la referida prestación de servicios efectuada por el ayuntamiento en favor de la consultante el supuesto de no sujeción al Impuesto sobre el Valor Añadido previsto para el otorgamiento de concesiones y autorizaciones administrativas en el número 9º del artículo 7 de la citada Ley 37/1992. La respuesta a esta cuestión es negativa, pues, como ha señalado la Asesoría Jurídica de la Secretaría de Estado de Hacienda en el informe emitido el 30 de julio de 1997 a solicitud de esta Dirección General, los contratos que tienen por objeto la explotación de cafeterías y comedores en centros públicos son contratos administrativos especiales, sin que los mismos puedan calificarse como contratos de gestión de servicios públicos ni tampoco como concesiones administrativas de dominio público. En este sentido se ha pronunciado la Junta Consultiva de Contratación Administrativa en diversos informes emitidos al respecto; así, en el informe 57/07 de 6 de febrero de 2008, y, con anterioridad, en los informes 5/96 de 7 de marzo y 67/99, de 6 de julio de 2000. En consecuencia con todo lo anterior, está sujeto al Impuesto sobre el Valor Añadido y no exento del mismo el contrato suscrito entre el ayuntamiento y la consultante consistente en explotar un bar-quiosco, a cambio del pago de una contraprestación. 4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.' - source_sentence: 'query: ¿Es necesario que los propietarios formen una comunidad de bienes para alquilar un inmueble que utilizan como local comercial?' 
sentences: - 'passage Descripción de hechos: Los consultantes son un matrimonio en régimen de gananciales, copropietarios de una vivienda que tienen intención de alquilar como local de negocio. Ambos están dados de alta en el censo de Empresarios, Profesionales y Retenedores por sus actividades económicas respectivas. Cuestión planteada: Obligación de constituir una comunidad de bienes para llevar a cabo el arrendamiento del inmueble, o bien posibilidad de declarar el Impuesto por separado según las cuotas del proindiviso. En caso de existir comunidad de bienes, posibilidad de compensar las cuotas repercutidas por la misma con las cuotas soportadas en su respectiva actividad empresarial por cada uno de los cónyuges, en proporción a su participación en dicha comunidad. Sujeción al Impuesto sobre Transmisiones Patrimoniales y Actos Jurídicos Documentados, en su modalidad de operaciones societarias.' - 'Descripción de hechos: La comunidad de bienes consultante está constituida a partes iguales por tres hermanos y es propietaria de un local que destina al arrendamiento. Cuestión planteada: Si la actividad realizada está sujeta al Impuesto sobre el Valor Añadido y si el cobro de la contraprestación se puede efectuar en una cuenta bancaria en la que son titulares los tres hermanos.' - 'passage Contestación completa: 1.- El artículo 134 bis, apartado dos de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29), establece que: “Dos. Cuando el régimen de tributación aplicable a una determinada actividad agrícola, ganadera, forestal o pesquera cambie del régimen especial de la agricultura, ganadería y pesca al general del Impuesto, el empresario o profesional titular de la actividad tendrá derecho a: 1º. 
Efectuar la deducción de la cuota resultante de aplicar al valor de los bienes afectos a la actividad, Impuesto sobre el Valor Añadido excluido, en la fecha en que deje de aplicarse el régimen especial, los tipos de dicho Impuesto que estuviesen vigentes en la citada fecha. A estos efectos, no se tendrán en cuenta los siguientes: a) Bienes de inversión, definidos conforme a lo dispuesto en el artículo 108 de esta Ley. b) Bienes y servicios que hayan sido utilizados o consumidos total o parcialmente en la actividad. 2º. Deducir la compensación a tanto alzado que prevé el artículo 130 de esta Ley por los productos naturales obtenidos en las explotaciones que no se hayan entregado a la fecha del cambio del régimen de tributación. A efectos del ejercicio de los derechos recogidos en este apartado, el empresario o profesional deberá confeccionar y presentar un inventario a la fecha en que deje de aplicarse el régimen especial. Tanto la presentación de este inventario como el ejercicio de estos derechos se ajustarán a los requisitos y condiciones que se establezcan reglamentariamente.”. Por su parte, el artículo 49 bis del Reglamento del Impuesto aprobado por el artículo 1 del Real Decreto 1624/1992, de 29 de diciembre (BOE del 31), declara que:' - source_sentence: 'query: ¿Cuál es la normativa vigente sobre la tributación del suero de irrigación y otros sistemas utilizados en procedimientos médicos?' sentences: - 'Descripción de hechos: La consultante comercializa un producto denominado "soluciones de irrigación" o "suero de irrigación" que consisten en soluciones de irrigación o lavado estériles para la limpieza aséptica de la piel, lavado de heridas o quemaduras formando parte integrante de los sistemas de irrigación. Cuestión planteada: Tipo impositivo aplicable al citado producto.' 
- 'passage Posteriormente las referidas entidades remitirán las facturas o documentos electrónicos de reembolso, en papel o en formato electrónico, a los proveedores, quienes estarán obligados a efectuar el correspondiente reembolso. Cuando se utilice el documento electrónico de reembolso, el proveedor o, en su caso, la entidad colaboradora deberán comprobar el visado del mismo en la Sede electrónica de la Agencia Estatal de Administración Tributaria haciendo constar electrónicamente que el reembolso se ha hecho efectivo. (…).”. Así, existe un procedimiento general y otro especial para la aplicación de la exención contemplada en el artículo 21.2º, letra A), de la Ley del impuesto. El procedimiento general, que permite al turista residente en un país no comunitario obtener la devolución de la totalidad de las cuotas del Impuesto soportadas en la compra de los productos que exporta, que es el previsto en el apartado 9.1.B) del Reglamento del Impuesto. Por otro lado, el procedimiento especial, al que se refiere la letra e) de ese mismo apartado, que implica la actuación de una entidad colaboradora que haya sido autorizada por la Agencia Estatal de Administración Tributaria. Pues bien, en relación a este segundo procedimiento señalar que la entidad consultante presta a diversas entidades colaboradoras servicios administrativos consistentes en la revisión de las facturas y la gestión de los pagos en nombre de las entidades colaboradoras sin asumir el riesgo de impago de dichas operaciones. Tales servicios deben calificarse como servicios de naturaleza administrativa y no financiera por lo que su prestación quedará, en todo caso, sujeta y no exenta del Impuesto sobre el Valor Añadido. 6.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.' 
- 'passage 3.- Por otra parte en relación con la inclusión del suero de irrigación en el apartado destinado a “Bolsas de recogida de orina, absorbentes de incontinencia y otros sistemas para incontinencia urinaria y fecal, incluidos los sistemas de irrigación”, este Centro directivo en la consulta de fecha 23 de marzo de 2015, numero V0872-15 y en relación con los sistemas de irrigación ha dispuesto que, “Tributarán por el Impuesto sobre el Valor Añadido, al tipo general del 21 por ciento, los siguientes productos objeto de consulta: -Los empapadores, las duchas vaginales, irrigadores, accesorios y sistemas de irrigación no destinados específicamente a situaciones de incontinencia urinaria o fecal, ni las cánulas rectales y vaginales no destinadas específicamente a situaciones de incontinencia urinaria o fecal o no incorporadas en equipos destinados a estas situaciones.” 4.- En consecuencia con lo anterior este centro directivo le informa que tributan al tipo general del 21 por ciento las entregas, adquisiciones intracomunitarias e importaciones de suero de irrigación (agua destilada o suero fisiológico) objeto de consulta siendo irrelevante que su destino sea para la limpieza aséptica de la piel, lavado de heridas o quemaduras formando parte integrante de los sistemas de irrigación. 5.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria. 
No obstante, de acuerdo con el artículo 68.2 del Reglamento General de las actuaciones y los procedimientos de gestión e inspección tributaria y de desarrollo de las normas comunes de los procedimientos de aplicación de los tributos, aprobado por el Real Decreto 1065/2007, de 27 de julio, la presente contestación no tendrá efectos vinculantes para aquellos miembros o asociados de la consultante que en el momento de formular la consulta estuviesen siendo objeto de un procedimiento, recurso o reclamación económico-administrativa iniciado con anterioridad y relacionado con las cuestiones planteadas en la consulta conforme a lo dispuesto en su artículo 89.2.' - source_sentence: 'query: ¿Este suministro de agua está sujeto a impuestos relacionados con el valor añadido?' sentences: - 'En estas circunstancias, se alineó la doctrina de este Centro directivo con la jurisprudencia del Tribunal Supremo, modificando, por tanto, el criterio mantenido en contestaciones anteriores. De acuerdo con todo lo anterior, las operaciones llevadas a cabo a título oneroso por comunidades de regantes a favor de sus miembros consistentes en la distribución-comercialización de agua, en los casos en los que sea posible adquirir, tratar y distribuir agua a título oneroso, estarán sujetas al Impuesto sobre el Valor Añadido, no resultando de aplicación el supuesto de no sujeción recogido en el artículo 7.11º de la Ley 37/1992. Consecuentemente con todo lo expuesto anteriormente, se le informa de que estará sujeta al Impuesto sobre el Valor Añadido la distribución-comercialización de agua realizada por la consultante si se realiza en los términos anteriormente citados. 
Por el contrario, las operaciones realizadas por la Comunidad de regantes consultante para la ordenación y el aprovechamiento de las aguas efectuadas en los términos del artículo 7.11º de la Ley del Impuesto no están sujetas al Impuesto sobre el Valor Añadido, como parece ser la operación objeto de consulta, aunque el agua de riego se mezcle con abono en las condiciones señaladas. 3.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.' - 'passage Descripción de hechos: El consultante forma parte de una comunidad de vecinos que es propietaria de un pozo y cobra a los vecinos un importe en concepto de su consumo de agua. Cuestión planteada: Sujeción del citado suministro de agua al Impuesto sobre el Valor Añadido.' - 'passage Descripción de hechos: El consultante es taxista y tributa en el régimen especial simplificado del Impuesto sobre el Valor Añadido. Ha tenido un siniestro con el taxi por el que ha pagado la reparación a un taller, debiendo su aseguradora reembolsarle dicho gasto. La aseguradora acepta la factura pero se la reembolsará excluyendo el importe correspondiente al Impuesto sobre el Valor Añadido. Cuestión planteada: Obligación por parte de la aseguradora de abonar al consultante el importe correspondiente al Impuesto sobre el Valor Añadido de las mencionadas facturas. Derecho a deducir las cuotas soportadas del consultante, que tributa en el régimen especial simplificado del Impuesto.' - source_sentence: 'query: ¿Se puede eximir del IVA si el destinatario de una actividad de caza es un club deportivo?' sentences: - 'Cuestión planteada: Como entidad parcialmente exenta del Impuesto sobre Sociedades, si puede entenderse que la asociación no realiza actividad económica, por lo que sus cuotas estarían exentas de este impuesto. 
En el caso de ser actividad económica, si el pago realizado a los agricultores tiene la consideración de gasto deducible para la asociación y si el agricultor debe emitir factura al respecto. En el caso de tener la consideración de actividad económica, si las cuotas cobradas a los asociados estarían sujetas al IVA.' - 'passage (…).”. De acuerdo con lo dispuesto anteriormente, en los supuestos de adjudicación de bienes en virtud de subasta judicial o administrativa, como es el caso que nos ocupa, el adjudicatario puede efectuar, en su caso, la renuncia a las exenciones previstas en el apartado dos del artículo 20 de la Ley 37/1992, así como expedir factura, presentar, en nombre y por cuenta del sujeto pasivo, la declaración-liquidación correspondiente e ingresar el importe del Impuesto sobre el Valor Añadido resultante. El ejercicio de dicha facultad por parte del adjudicatario determina la obligación de presentar la autoliquidación del Impuesto conforme al modelo aprobado por la Orden HAC/3625/2003, de 23 de diciembre (modelo 309). Uno de los requisitos necesarios para el ejercicio de dicha facultad es que el destinatario-adjudicatario del bien inmueble tenga la consideración de empresario o profesional en los términos previstos en esta contestación. La no consideración como empresario o profesional impide el ejercicio de dicha facultad. Por último, señalar que de resultar aplicable la regla de inversión del sujeto pasivo prevista en el artículo 84.Uno.2º de la Ley 37/1992, anteriormente desarrollado, el adjudicatario resultará ser el sujeto pasivo de la operación por lo que viene obligado a presentar la autoliquidación ordinaria del Impuesto en nombre propio, sin actuar en nombre y por cuenta del subastado. Asimismo, de optar por dicha facultad en los términos establecidos reglamentariamente, el consultante podrá emitir, en nombre y por cuenta del transmitente, la correspondiente factura en la que se documente la operación. 
No obstante, tal y como se ha señalado en apartados anteriores de esta contestación, el consultante adjudicatario de la subasta judicial no procedió a la renuncia a la exención del artículo 20.Uno.22º de la Ley del Impuesto en el plazo establecido, habiéndose encontrado facultado para ello según lo dispuesto en la Disposición Adicional Sexta de la Ley 37/1992.' - 'passage Descripción de hechos: El ayuntamiento consultante ha adjudicado un aprovechamiento de caza a terceros. Cuestión planteada: Aclaración de la contestación vinculante de 6 de febrero de 2023, número V0140-23. En particular, sobre si la sujeción y en su caso exención del Impuesto sobre el Valor Añadido aplica cuando el destinatario es un club deportivo.' pipeline_tag: sentence-similarity library_name: sentence-transformers metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 model-index: - name: SentenceTransformer based on intfloat/multilingual-e5-small results: - task: type: information-retrieval name: Information Retrieval dataset: name: InformationRetrievalEvaluator type: InformationRetrievalEvaluator metrics: - type: cosine_accuracy@1 value: 0.34438553454142595 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.5078737042019467 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.582574978238506 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.6804621350003957 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.34438553454142595 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.16929123473398222 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.11651499564770121 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.06804621350003956 name: Cosine Precision@10 - type: 
cosine_recall@1 value: 0.34438553454142595 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.5078737042019467 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.582574978238506 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.6804621350003957 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.5018922960319426 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.44598812883809136 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.45552575512925936 name: Cosine Map@100 --- # SentenceTransformer based on intfloat/multilingual-e5-small This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. ## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) <!-- at revision c007d7ef6fd86656326059b28395a7a03a7c5846 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 384 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 
'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("diegolacomba/multilingual-e5-small-legal-mnrl-negatives-0") # Run inference sentences = [ 'query: ¿Se puede eximir del IVA si el destinatario de una actividad de caza es un club deportivo?', 'passage Descripción de hechos: El ayuntamiento consultante ha adjudicado un aprovechamiento de caza a terceros.\n\nCuestión planteada: Aclaración de la contestación vinculante de 6 de febrero de 2023, número V0140-23. En particular, sobre si la sujeción y en su caso exención del Impuesto sobre el Valor Añadido aplica cuando el destinatario es un club deportivo.', 'Cuestión planteada: Como entidad parcialmente exenta del Impuesto sobre Sociedades, si puede entenderse que la asociación no realiza actividad económica, por lo que sus cuotas estarían exentas de este impuesto.\nEn el caso de ser actividad económica, si el pago realizado a los agricultores tiene la consideración de gasto deducible para la asociación y si el agricultor debe emitir factura al respecto.\nEn el caso de tener la consideración de actividad económica, si las cuotas cobradas a los asociados estarían sujetas al IVA.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 384] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage
(Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Dataset: `InformationRetrievalEvaluator` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.3444 | | cosine_accuracy@3 | 0.5079 | | cosine_accuracy@5 | 0.5826 | | cosine_accuracy@10 | 0.6805 | | cosine_precision@1 | 0.3444 | | cosine_precision@3 | 0.1693 | | cosine_precision@5 | 0.1165 | | cosine_precision@10 | 0.068 | | cosine_recall@1 | 0.3444 | | cosine_recall@3 | 0.5079 | | cosine_recall@5 | 0.5826 | | cosine_recall@10 | 0.6805 | | **cosine_ndcg@10** | **0.5019** | | cosine_mrr@10 | 0.446 | | cosine_map@100 | 0.4555 | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 58,898 training samples * Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | negative | |:---|:---|:---|:---| | type | string | string | string | | details | <ul><li>min: 19 tokens</li><li>mean: 31.33 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 325.57 tokens</li><li>max: 508 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 346.55 tokens</li><li>max: 496 tokens</li></ul> | * Samples:
| anchor | positive | negative |
|:---|:---|:---|
| <code>query: ¿Las contribuciones que percibe una organización en virtud de un convenio laboral en el fútbol tienen impacto en la base de cálculo para el impuesto correspondiente?</code> | <code>passage Descripción de hechos: La consultante es una Asociación que se dedica a las actividades de ordenación del ejercicio de la profesión de futbolistas de sus miembros, la representación de los mismos así como la defensa de sus intereses profesionales tanto en el ámbito nacional como en el internacional.<br>En virtud de un convenio colectivo para la actividad de fútbol profesional suscrito entre la Liga Nacional de Fútbol Profesional (LNFP) y la consultante, aquella viene obligada a entregar a esta, por cada temporada de vigencia del convenio, una cantidad de dinero (en concepto de Fondo social) destinada a fines benéficos y al normal desarrollo de la actividad de la Asociación.<br>Asimismo, según Acta de Conciliación suscrita entre ambas partes, la LNFP se compromete a abonar a la consultante un porcentaje del importe neto total de los ingresos obtenidos de la explotación conjunta de los derechos de contenidos audiovisuales del fútbol. Dicha cuantía debe destinarse a actividades encamina...</code> | <code>a) Un 3,5 por 100 se destinará a financiar un Fondo de Compensación del que podrán beneficiarse las entidades deportivas que, disputando la competición del fútbol profesional, desciendan de categoría. El 90 por 100 de esta cantidad se destinará a los equipos que desciendan de Primera división, y el 10 por 100 restante a los que desciendan de Segunda División.<br>b) Un 1 por 100 se entregará a la Liga Nacional de Fútbol Profesional, que lo destinará exclusivamente a la promoción de la competición profesional en los mercados nacional e internacional.<br>(…).”.<br>En relación con el tratamiento de dichas aportaciones, este Centro directivo ya se manifestó en la contestación vinculante a la consulta, de 20 de septiembre de 2016, con número de referencia V3946-16, estableciendo que “a efectos de determinar el régimen de tributación en el Impuesto sobre el Valor Añadido de las contribuciones obligatorias que los clubes y entidades participantes en el Campeonato Nacional de Liga deben realizar al ampa...</code> |
| <code>query: ¿Qué tipos de transacciones intracomunitarias deben ser declaradas por las empresas según la regulación vigente?</code> | <code>passage Contestación completa: 1.- De acuerdo con el artículo 78 del Reglamento del impuesto aprobado por el Real Decreto 1624/1992, de 29 de diciembre (BOE del 31 de diciembre):<br>“Los empresarios y profesionales deberán presentar una declaración recapitulativa de las entregas y adquisiciones intracomunitarias de bienes y de las prestaciones y adquisiciones intracomunitarias de servicios que realicen en la forma que se indica en el presente capítulo.”.<br>El artículo 79 del Reglamento especifica qué tipo de operaciones deben ser declaradas en la declaración recapitulativa de operaciones intracomunitarias, en concreto establece que:<br>“1. Estarán obligados a presentar la declaración recapitulativa los empresarios y profesionales, incluso cuando tengan dicha condición con arreglo a lo dispuesto en el apartado cuatro del artículo 5 de la Ley del Impuesto, que realicen cualquiera de las siguientes operaciones.<br>1.º Las entregas de bienes destinados a otro Estado miembro que se encuentren exentas ...</code> | <code>2. También estará obligado a presentar la declaración recapitulativa el vendedor que expida o transporte bienes a otro Estado miembro en el marco de un acuerdo de ventas de bienes en consigna a que se refiere el artículo 9 bis de la Ley del Impuesto.”.<br>Por tanto, en el caso de que la consultante hubiera efectuado alguna de las operaciones intracomunitarias indicadas expresamente en el artículo anterior, deberá presentar la declaración recapitulativa de operaciones intracomunitarias.<br>3.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.</code> |
| <code>query: ¿Qué tipos de bebidas contienen alcohol apto para consumo humano?</code> | <code>passage Se entiende por bebida alcohólica todo líquido apto para el consumo humano por ingestión que contenga alcohol etílico.<br>A los efectos de este número no tendrán la consideración de alimento el tabaco ni las sustancias no aptas para el consumo humano o animal en el mismo estado en que fuesen objeto de entrega, adquisición intracomunitaria o importación.”.<br>4.- Con independencia de lo anterior, el artículo 20, apartado uno, número 9º, de la Ley 37/1992, establece que estarán exentas del Impuesto las siguientes operaciones:<br>“9.º La educación de la infancia y de la juventud, la guarda y custodia de niños, incluida la atención a niños en los centros docentes en tiempo interlectivo durante el comedor escolar o en aulas en servicio de guardería fuera del horario escolar, la enseñanza escolar, universitaria y de postgraduados, la enseñanza de idiomas y la formación y reciclaje profesional, realizadas por Entidades de derecho público o entidades privadas autorizadas para el ejercicio de di...</code> | <code>“Artículo 90. Tipo impositivo general.<br>Uno. El impuesto se exigirá al tipo del 21 por ciento, salvo lo dispuesto en el artículo siguiente.<br>Dos. El tipo impositivo aplicable a cada operación será el vigente en el momento del devengo.”.<br>“Artículo 91. Tipos impositivos reducidos.<br>Uno. Se aplicará el tipo del 10 por ciento a las operaciones siguientes:<br>1. Las entregas, adquisiciones intracomunitarias o importaciones de los bienes que se indican a continuación:<br>1.º Las sustancias o productos, cualquiera que sea su origen que, por sus características, aplicaciones, componentes, preparación y estado de conservación, sean susceptibles de ser habitual e idóneamente utilizados para la nutrición humana o animal, de acuerdo con lo establecido en el Código Alimentario y las disposiciones dictadas para su desarrollo, excepto las bebidas alcohólicas.<br>Se entiende por bebida alcohólica todo líquido apto para el consumo humano por ingestión que contenga alcohol etílico.<br>A los efectos de este número no t...</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `gradient_accumulation_steps`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 5 - `lr_scheduler_type`: cosine - `warmup_ratio`: 0.1 - `fp16`: True - `tf32`: True - `load_best_model_at_end`: True - `optim`: adamw_torch_fused - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 16 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 5 - `max_steps`: -1 - `lr_scheduler_type`: cosine - `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: True - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch_fused - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: None - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - 
`include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - `prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | InformationRetrievalEvaluator_cosine_ndcg@10 | |:----------:|:-------:|:-------------:|:--------------------------------------------:| | 0.8691 | 100 | 17.1124 | 0.4546 | | 1.7300 | 200 | 1.1179 | 0.4828 | | 2.5910 | 300 | 0.9019 | 0.4941 | | 3.4519 | 400 | 0.7796 | 0.5005 | | **4.3129** | **500** | **0.725** | **0.5019** | | 5.0 | 580 | - | 0.5019 | * The bold row denotes the saved checkpoint. 
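For intuition, the `MultipleNegativesRankingLoss` configured above (`similarity_fct: cos_sim`, `scale: 20.0`) is a cross-entropy over in-batch negatives: each anchor must rank its own positive above every other positive in the batch. The following is a minimal NumPy sketch of that objective; it is illustrative only, not the sentence-transformers implementation, and it omits the explicit hard-negative column this model was trained with:

```python
import numpy as np

def mnrl_loss(anchors: np.ndarray, positives: np.ndarray, scale: float = 20.0) -> float:
    """Sketch of an in-batch-negatives ranking loss with cosine similarity.

    Row i of `anchors` is paired with row i of `positives`; every other row of
    `positives` serves as a negative. The diagonal of the scaled similarity
    matrix holds the "correct class" logits for a row-wise cross-entropy.
    """
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = scale * (a @ p.T)                   # (batch, batch) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # stabilize the softmax
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.diag(log_probs).mean())     # diagonal entries are the true pairs
```

This construction is also why `batch_sampler: no_duplicates` is set above: a duplicated anchor/positive pair inside one batch would act as a false negative for its twin.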
### Framework Versions - Python: 3.11.13 - Sentence Transformers: 4.1.0 - Transformers: 4.52.4 - PyTorch: 2.6.0+cu124 - Accelerate: 1.7.0 - Datasets: 2.14.4 - Tokenizers: 0.21.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
MaIlz/outputs_grpo_all_tasks_4
MaIlz
2025-06-17T11:37:42Z
0
0
transformers
[ "transformers", "tensorboard", "safetensors", "generated_from_trainer", "unsloth", "trl", "grpo", "arxiv:2402.03300", "base_model:unsloth/llama-3-8b-Instruct-bnb-4bit", "base_model:finetune:unsloth/llama-3-8b-Instruct-bnb-4bit", "endpoints_compatible", "region:us" ]
null
2025-06-17T11:37:33Z
--- base_model: unsloth/llama-3-8b-Instruct-bnb-4bit library_name: transformers model_name: outputs_grpo_all_tasks_4 tags: - generated_from_trainer - unsloth - trl - grpo licence: license --- # Model Card for outputs_grpo_all_tasks_4 This model is a fine-tuned version of [unsloth/llama-3-8b-Instruct-bnb-4bit](https://huggingface.co/unsloth/llama-3-8b-Instruct-bnb-4bit). It has been trained using [TRL](https://github.com/huggingface/trl). ## Quick start ```python from transformers import pipeline question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?" generator = pipeline("text-generation", model="MaIlz/outputs_grpo_all_tasks_4", device="cuda") output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0] print(output["generated_text"]) ``` ## Training procedure This model was trained with GRPO, a method introduced in [DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models](https://huggingface.co/papers/2402.03300). ### Framework versions - TRL: 0.15.2 - Transformers: 4.51.3 - Pytorch: 2.6.0+cu124 - Datasets: 3.6.0 - Tokenizers: 0.21.1 ## Citations Cite GRPO as: ```bibtex @article{zhihong2024deepseekmath, title = {{DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models}}, author = {Zhihong Shao and Peiyi Wang and Qihao Zhu and Runxin Xu and Junxiao Song and Mingchuan Zhang and Y. K. Li and Y. Wu and Daya Guo}, year = 2024, eprint = {arXiv:2402.03300}, } ``` Cite TRL as: ```bibtex @misc{vonwerra2022trl, title = {{TRL: Transformer Reinforcement Learning}}, author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec}, year = 2020, journal = {GitHub repository}, publisher = {GitHub}, howpublished = {\url{https://github.com/huggingface/trl}} } ```
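The core of the GRPO method cited above is its group-relative advantage: for each prompt, a group of completions is sampled and scored, and each completion's advantage is its reward standardized against the group, replacing PPO's learned value baseline. A minimal illustrative sketch of that step (not TRL's implementation):

```python
import numpy as np

def group_relative_advantages(rewards, eps: float = 1e-8) -> np.ndarray:
    """Standardize one group's rewards: (r - mean(group)) / std(group).

    Completions scoring above the group mean receive positive advantages and
    are reinforced; below-mean completions are penalized -- no critic needed.
    """
    r = np.asarray(rewards, dtype=np.float64)
    return (r - r.mean()) / (r.std() + eps)
```

The policy is then updated with a clipped PPO-style objective weighted by these per-completion advantages.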
joanna302/Qwen3-0.6B-Base_fr_pt_8e-05_seed44
joanna302
2025-06-17T11:30:55Z
0
0
transformers
[ "transformers", "safetensors", "qwen3", "text-generation", "unsloth", "trl", "sft", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T10:25:50Z
--- library_name: transformers tags: - unsloth - trl - sft --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model.
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book_mrpc
gokulsrinivasagan
2025-06-17T11:24:45Z
0
0
transformers
[ "transformers", "tensorboard", "safetensors", "bert", "text-classification", "generated_from_trainer", "en", "dataset:glue", "base_model:gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book", "base_model:finetune:gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book", "license...
text-classification
2025-06-17T11:23:59Z
--- library_name: transformers language: - en license: apache-2.0 base_model: gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book tags: - generated_from_trainer datasets: - glue metrics: - accuracy - f1 model-index: - name: tinybert_base_train_book_ent_15p_s_init_book_mrpc results: - task: name: Text Classification type: text-classification dataset: name: GLUE MRPC type: glue args: mrpc metrics: - name: Accuracy type: accuracy value: 0.7009803921568627 - name: F1 type: f1 value: 0.8111455108359134 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # tinybert_base_train_book_ent_15p_s_init_book_mrpc This model is a fine-tuned version of [gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book](https://huggingface.co/gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book) on the GLUE MRPC dataset. It achieves the following results on the evaluation set: - Loss: 0.5825 - Accuracy: 0.7010 - F1: 0.8111 - Combined Score: 0.7561 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 256 - eval_batch_size: 256 - seed: 10 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:--------------:| | 0.6257 | 1.0 | 15 | 0.6032 | 0.6961 | 0.8086 | 0.7524 | | 0.584 | 2.0 | 30 | 0.5825 | 0.7010 | 0.8111 | 0.7561 | | 0.5483 | 3.0 | 45 | 0.6029 | 0.7059 | 0.8171 | 0.7615 | | 0.5131 | 4.0 | 
60 | 0.5927 | 0.6863 | 0.7808 | 0.7335 | | 0.4597 | 5.0 | 75 | 0.6270 | 0.6985 | 0.7897 | 0.7441 | | 0.3832 | 6.0 | 90 | 0.6773 | 0.7034 | 0.7987 | 0.7511 | | 0.3111 | 7.0 | 105 | 0.7539 | 0.7083 | 0.8096 | 0.7590 | ### Framework versions - Transformers 4.51.2 - Pytorch 2.6.0+cu126 - Datasets 3.5.0 - Tokenizers 0.21.1
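As a usage sketch (not part of the original card), the fine-tuned checkpoint can be loaded for paraphrase detection with the `transformers` pipeline; MRPC is a sentence-pair task, so both sentences are passed together:

```python
from transformers import pipeline

# Load the fine-tuned MRPC checkpoint; labels follow GLUE MRPC
# (LABEL_0 = not equivalent, LABEL_1 = equivalent) unless the config renames them.
clf = pipeline(
    "text-classification",
    model="gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book_mrpc",
)

# Pass the two sentences as a pair, as MRPC requires.
result = clf([{
    "text": "The company said its profits rose last quarter.",
    "text_pair": "Profits increased in the last quarter, the company reported.",
}])[0]
print(result["label"], round(result["score"], 3))
```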
milpu02/Fenqurymix-xl
milpu02
2025-06-17T11:20:02Z
0
0
diffusers
[ "diffusers", "text-to-image", "lora", "template:diffusion-lora", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0", "region:us" ]
text-to-image
2025-06-17T11:19:51Z
--- tags: - text-to-image - lora - diffusers - template:diffusion-lora widget: - text: '-' output: url: images/Screenshot 2025-06-17 051906.png base_model: stabilityai/stable-diffusion-xl-base-1.0 instance_prompt: Fenqury --- # Illustrious-XL <Gallery /> ## Trigger words You should use `Fenqury` to trigger the image generation. ## Download model Weights for this model are available in Safetensors format. [Download](/milpu02/Fenqurymix-xl/tree/main) them in the Files & versions tab.
h34v7/Euro-DDXPv1.0-GGUF
h34v7
2025-06-17T11:14:52Z
22
0
null
[ "gguf", "base_model:h34v7/Euro-DDXPv1.0", "base_model:quantized:h34v7/Euro-DDXPv1.0", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
null
2025-06-09T09:33:00Z
--- license: apache-2.0 base_model: - h34v7/Euro-DDXPv1.0 ---
mpio/bert-finetuned-ner
mpio
2025-06-17T11:05:55Z
0
0
transformers
[ "transformers", "safetensors", "bert", "token-classification", "generated_from_trainer", "dataset:conll2003", "base_model:google-bert/bert-base-cased", "base_model:finetune:google-bert/bert-base-cased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "reg...
token-classification
2025-06-17T10:54:07Z
--- library_name: transformers license: apache-2.0 base_model: bert-base-cased tags: - generated_from_trainer datasets: - conll2003 metrics: - precision - recall - f1 - accuracy model-index: - name: bert-finetuned-ner results: - task: name: Token Classification type: token-classification dataset: name: conll2003 type: conll2003 config: conll2003 split: validation args: conll2003 metrics: - name: Precision type: precision value: 0.9401794616151545 - name: Recall type: recall value: 0.9522046449007069 - name: F1 type: f1 value: 0.9461538461538462 - name: Accuracy type: accuracy value: 0.9874315653146524 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-finetuned-ner This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the conll2003 dataset. It achieves the following results on the evaluation set: - Loss: 0.0595 - Precision: 0.9402 - Recall: 0.9522 - F1: 0.9462 - Accuracy: 0.9874 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | 0.0783 | 1.0 | 1756 | 0.0685 | 0.8991 | 0.9329 | 0.9157 | 0.9819 | | 0.0343 | 2.0 | 3512 | 0.0606 | 0.9343 | 0.9477 | 0.9409 | 0.9862 | | 0.0215 | 3.0 | 5268 | 0.0595 | 0.9402 | 0.9522 | 0.9462 | 0.9874 | 
### Framework versions - Transformers 4.50.0 - Pytorch 2.6.0+cu118 - Datasets 3.6.0 - Tokenizers 0.21.1
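As a quick usage sketch (assuming the standard `transformers` token-classification pipeline; this snippet is not from the original card):

```python
from transformers import pipeline

# aggregation_strategy="simple" merges word-piece tokens into whole entity spans.
ner = pipeline(
    "token-classification",
    model="mpio/bert-finetuned-ner",
    aggregation_strategy="simple",
)

entities = ner("Hugging Face is a company based in New York City.")
for ent in entities:
    # CoNLL-2003 entity groups are PER, ORG, LOC, and MISC.
    print(ent["entity_group"], ent["word"], round(float(ent["score"]), 3))
```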
ujjawal077/llama3s-merged3
ujjawal077
2025-06-17T11:03:36Z
0
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "mergekit", "merge", "arxiv:2311.03099", "base_model:AdaptLLM/finance-LLM-13B", "base_model:merge:AdaptLLM/finance-LLM-13B", "base_model:starmpcc/Asclepius-Llama2-13B", "base_model:merge:starmpcc/Asclepius-Llama2-13B", "autotrain_compa...
text-generation
2025-06-17T11:01:23Z
--- base_model: - AdaptLLM/finance-LLM-13B - starmpcc/Asclepius-Llama2-13B library_name: transformers tags: - mergekit - merge --- # llama3s-merged This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). ## Merge Details ### Merge Method This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method using [starmpcc/Asclepius-Llama2-13B](https://huggingface.co/starmpcc/Asclepius-Llama2-13B) as a base. ### Models Merged The following models were included in the merge: * [AdaptLLM/finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) ### Configuration The following YAML configuration was used to produce this model: ```yaml base_model: starmpcc/Asclepius-Llama2-13B dtype: bfloat16 merge_method: dare_ties modules: default: slices: - sources: - layer_range: [0, 40] model: AdaptLLM/finance-LLM-13B parameters: density: 0.53 weight: 0.6 - layer_range: [0, 40] model: starmpcc/Asclepius-Llama2-13B parameters: density: 0.5 weight: 0.4 parameters: int8_mask: 1.0 ```
Mbaxraan/mistral-small-24b-4bit
Mbaxraan
2025-06-17T10:58:33Z
0
0
null
[ "safetensors", "mistral", "license:apache-2.0", "4-bit", "bitsandbytes", "region:us" ]
null
2025-06-17T08:07:34Z
--- license: apache-2.0 ---
tamewild/4b_v4_merged_e2
tamewild
2025-06-17T10:57:57Z
0
0
transformers
[ "transformers", "safetensors", "qwen3", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T10:56:13Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF
sizzlebop
2025-06-17T10:53:24Z
0
0
transformers
[ "transformers", "gguf", "code", "ReactJS", "llama-cpp", "gguf-my-repo", "text-generation", "en", "base_model:nirusanan/Qwen3-ReactJs-code", "base_model:finetune:nirusanan/Qwen3-ReactJs-code", "endpoints_compatible", "region:us", "imatrix", "conversational" ]
text-generation
2025-06-17T10:53:14Z
--- library_name: transformers tags: - code - ReactJS - llama-cpp - gguf-my-repo language: - en base_model: nirusanan/Qwen3-ReactJs-code base_model_relation: finetune pipeline_tag: text-generation --- # sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF This model was converted to GGUF format from [`nirusanan/Qwen3-ReactJs-code`](https://huggingface.co/nirusanan/Qwen3-ReactJs-code) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/nirusanan/Qwen3-ReactJs-code) for more details on the model. ## Use with llama.cpp Install llama.cpp through brew (works on Mac and Linux) ```bash brew install llama.cpp ``` Invoke the llama.cpp server or the CLI. ### CLI: ```bash llama-cli --hf-repo sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF --hf-file qwen3-reactjs-code-q4_k_m-imat.gguf -p "The meaning to life and the universe is" ``` ### Server: ```bash llama-server --hf-repo sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF --hf-file qwen3-reactjs-code-q4_k_m-imat.gguf -c 2048 ``` Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo. Step 1: Clone llama.cpp from GitHub. ``` git clone https://github.com/ggerganov/llama.cpp ``` Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g. LLAMA_CUDA=1 for Nvidia GPUs on Linux). ``` cd llama.cpp && LLAMA_CURL=1 make ``` Step 3: Run inference through the main binary. ``` ./llama-cli --hf-repo sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF --hf-file qwen3-reactjs-code-q4_k_m-imat.gguf -p "The meaning to life and the universe is" ``` or ``` ./llama-server --hf-repo sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF --hf-file qwen3-reactjs-code-q4_k_m-imat.gguf -c 2048 ```
tamewild/4b_v4_merged_e10
tamewild
2025-06-17T10:45:27Z
0
0
transformers
[ "transformers", "safetensors", "qwen3", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T10:43:46Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
charso/qwen2-7b-instruct-PrizePrj01
charso
2025-06-17T10:39:11Z
5
0
peft
[ "peft", "tensorboard", "safetensors", "trl", "sft", "generated_from_trainer", "base_model:Qwen/Qwen2-VL-7B-Instruct", "base_model:adapter:Qwen/Qwen2-VL-7B-Instruct", "license:apache-2.0", "region:us" ]
null
2025-06-17T06:28:20Z
--- library_name: peft license: apache-2.0 base_model: Qwen/Qwen2-VL-7B-Instruct tags: - trl - sft - generated_from_trainer model-index: - name: qwen2-7b-instruct-PrizePrj01 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # qwen2-7b-instruct-PrizePrj01 This model is a fine-tuned version of [Qwen/Qwen2-VL-7B-Instruct](https://huggingface.co/Qwen/Qwen2-VL-7B-Instruct) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 1 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 8 - total_train_batch_size: 8 - optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: constant - lr_scheduler_warmup_ratio: 0.03 - num_epochs: 3 ### Training results ### Framework versions - PEFT 0.15.2 - Transformers 4.52.1 - Pytorch 2.5.1+cu121 - Datasets 3.5.1 - Tokenizers 0.21.1
vinnvinn/mistral-Dr.hugz
vinnvinn
2025-06-17T10:33:23Z
20
0
transformers
[ "transformers", "safetensors", "mistral", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "8-bit", "region:us" ]
text-generation
2025-06-16T10:58:33Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF
Triangle104
2025-06-17T10:22:45Z
0
0
transformers
[ "transformers", "gguf", "llama-cpp", "gguf-my-repo", "text-generation", "en", "base_model:ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast", "base_model:quantized:ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast", "license:apache-2.0", "endpoints_compatible", "region:us", "conversational" ]
text-generation
2025-06-17T10:04:49Z
--- license: apache-2.0 thumbnail: https://cdn-uploads.huggingface.co/production/uploads/6625f4a8a8d1362ebcc3851a/hIZ2ZcaDyfYLT9Yd4pfOs.jpeg language: - en base_model: ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast library_name: transformers pipeline_tag: text-generation tags: - llama-cpp - gguf-my-repo --- # Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF This model was converted to GGUF format from [`ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast`](https://huggingface.co/ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast) for more details on the model. --- RpR (RolePlay with Reasoning) is a new series of models from ArliAI. This series builds directly upon the successful dataset curation methodology and training methods developed for the RPMax series. RpR models use the same curated, deduplicated RP and creative writing dataset used for RPMax, with a focus on variety to ensure high creativity and minimize cross-context repetition. Users familiar with RPMax will recognize the unique, non-repetitive writing style unlike other finetuned-for-RP models. With the release of QwQ as the first high-performing open-source reasoning model that can be easily trained, it was clear that the available instruct and creative writing reasoning datasets contain only one response per example. Training reasoning models on this type of single-response dataset causes degraded output quality in long multi-turn chats, which is why Arli AI decided to create a true RP model capable of long multi-turn chat with reasoning. In order to create RpR, we first had to actually create the reasoning RP dataset by re-processing our existing known-good RPMax dataset into a reasoning dataset. 
This was possible by using the base QwQ Instruct model itself to create the reasoning process for every turn in the RPMax dataset conversation examples, which was then further refined to make sure the reasoning is in line with the actual response examples from the dataset. Another important thing to get right is to make sure the model is trained on examples that present reasoning blocks the same way it encounters them during inference: that is, never seeing the reasoning blocks in its context. To achieve this, the training run was completed using axolotl with a manual template-free segments dataset, ensuring the model is never trained to see the reasoning block in its context, just as at inference time. The result of training on this dataset with this method is consistently coherent and interesting outputs, even in long multi-turn RP chats. As far as we know, this is the first correctly-trained reasoning model for RP and creative writing. You can access the model at https://arliai.com and we also have a models ranking page at https://www.arliai.com/models-ranking Ask questions in our new Discord Server https://discord.com/invite/t75KbPgwhk or on our subreddit https://www.reddit.com/r/ArliAI/ --- ## Use with llama.cpp Install llama.cpp through brew (works on Mac and Linux) ```bash brew install llama.cpp ``` Invoke the llama.cpp server or the CLI. ### CLI: ```bash llama-cli --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q3_k_l.gguf -p "The meaning to life and the universe is" ``` ### Server: ```bash llama-server --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q3_k_l.gguf -c 2048 ``` Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo. 
Step 1: Clone llama.cpp from GitHub. ```bash git clone https://github.com/ggerganov/llama.cpp ``` Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g. `LLAMA_CUDA=1` for Nvidia GPUs on Linux). ```bash cd llama.cpp && LLAMA_CURL=1 make ``` Step 3: Run inference through the main binary. ```bash ./llama-cli --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q3_k_l.gguf -p "The meaning to life and the universe is" ``` or ```bash ./llama-server --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q3_k_l.gguf -c 2048 ```
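The "never seeing the reasoning blocks in its context" idea from the training description above can be sketched as a small preprocessing step. This is purely illustrative, not ArliAI's actual axolotl pipeline: the `strip_prior_reasoning` helper and the `<think>...</think>` tag convention are assumptions for the sketch.

```python
import re

# Assumed convention: the model emits its reasoning inside <think>...</think>
# before the visible reply. Prior turns must not carry these blocks.
THINK_RE = re.compile(r"<think>.*?</think>\s*", flags=re.DOTALL)

def strip_prior_reasoning(messages):
    """Return a copy of a chat history in which every assistant turn has its
    <think>...</think> block removed, mirroring what the model sees at
    inference time (earlier reasoning is never in context)."""
    cleaned = []
    for msg in messages:
        if msg["role"] == "assistant":
            msg = {**msg, "content": THINK_RE.sub("", msg["content"]).strip()}
        cleaned.append(msg)
    return cleaned

history = [
    {"role": "user", "content": "Hi there."},
    {"role": "assistant",
     "content": "<think>Plan a warm greeting.</think>Hello! How can I help?"},
]
print(strip_prior_reasoning(history)[1]["content"])  # -> Hello! How can I help?
```

Training on segments prepared this way keeps the loss on the full response (reasoning plus reply) for the current turn while the context contains only reasoning-free prior turns, matching inference conditions.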
tomaarsen/splade-mpnet-base-miriad-2e-5-lq-5e-6-lc
tomaarsen
2025-06-17T10:15:04Z
0
0
sentence-transformers
[ "sentence-transformers", "safetensors", "mpnet", "sparse-encoder", "sparse", "splade", "generated_from_trainer", "dataset_size:100000", "loss:SpladeLoss", "loss:SparseMultipleNegativesRankingLoss", "loss:FlopsLoss", "feature-extraction", "en", "dataset:tomaarsen/miriad-4.4M-split", "arxi...
feature-extraction
2025-06-17T10:14:47Z
--- language: - en license: apache-2.0 tags: - sentence-transformers - sparse-encoder - sparse - splade - generated_from_trainer - dataset_size:100000 - loss:SpladeLoss - loss:SparseMultipleNegativesRankingLoss - loss:FlopsLoss base_model: microsoft/mpnet-base widget: - text: "He does it right, but there are times that he doesn't (Joana) Let's go there\ \ and pee? Because she does not want to wear a diaper, she rips off her diaper\ \ (Filomena). The family caregiver may understand this action as a \"pang\" and\ \ \"tantrum\", and \"forget\" that these episodes are part of the clinical picture\ \ of dementia. Conflicts related to incontinence and other difficult-to-manage\ \ symptoms eventually lead to a variety of interpretations, and past history of\ \ the emotional relationship between the elderly and the family caregiver can\ \ cause older emotional issues to surface again in these episodes.\n\n With psycho-functional\ \ limitations, new demands arise that can be distressing for those who care because\ \ of affective involvement. Subjective constructions are fundamental elements\ \ in upkeeping the relationship of care 10 .\n\n Besides the psychological aspect\ \ involved in the loss of identity and the specific cognitive aspects of dementia,\ \ some behavioral and psychiatric changes are important even in the consultation\ \ with the ESF professionals: psychotic symptoms, agitation and aggression, mood\ \ swings, disinhibited behavior and euphoria, apathy and insomnia. Some studies\ \ [11] [12] [13] pointed out the significant association between the presence\ \ of apathy and a faster cognitive and functional decline in these patients. Another\ \ very relevant situation regarding the appearance of neuropsychiatric symptoms\ \ is the association of these symptoms with the institutionalization and shorter\ \ patient survival. 
They also showed that the highest Neuropsychiatric Inventory\ \ (NPI) score was signifi-cantly associated with more severe cognitive impairment,\ \ greater caregiver distress, and higher cost, but was not associated with a formal\ \ diagnosis of dementia performed by the primary care physician.\n\n Changed behaviors\ \ and even risky behaviors, such as turning on the gas switch and not turning\ \ off, stirring in pots on a hot stove, or ingestion of liquids or toxic materials\ \ are situations in the face of neuropsychiatric manifestations in dementia. Filomena\ \ reports several neuropsychiatric symptoms of her husband. She compares his behavior\ \ to that of children who explore the environment to discover the cause and effect\ \ of things and the sensations obtained by the senses. Her role in this context\ \ resembles that of a mother trying to prevent the child from getting hurt: He\ \ lights up the gas switch, he's just like a child, sometimes he starts to eat\ \ the slipper, I have to get it out of his mouth.\n\n Hallucination is another\ \ neuropsychiatric symptom described by family caregivers. Joana reports that\ \ when the husband talks to people who have died, the family members feel fear\ \ and distance themselves. Filomena has fun when her mother speaks with those\ \ who have died: \"She talks to those who have passed away, she sends the dog\ \ out, which does not exist\". Each family caregiver experiences the symptoms\ \ presented by the dementia in a unique way, and ways to address and interpret\ \ this phenomenon and give meaning to their experience.\n\n The negative development\ \ of dementia perceived by Celina, Filomena, Maria, Teresa and Joana show that\ \ the disease follows a course that transcends the biological event itself. 
The\ \ dementia process evidences psychological and sociocultural constructions permeated\ \ by meanings and interpretations according to those who live and those who maintain\ \ interpersonal relationships with the elderly person with dementia.\n\n In the\ \ discourse of family caregivers, seniors with dementia have aggressive behaviors\ \ such as agitation, spitting, cursing, clawing, throwing objects, revealing a\ \ level of aggression that can impact the feelings and interpretations produced\ \ during the care routine. Freud 14 affirms that human instincts are of two types:\ \ Those who tend to preserve and unite, which we call 'erotic' [...] with a deliberate\ \ expansion of the popular conception of 'sexuality'; and those who tend to destroy\ \ and kill, which we group as an aggressive or destructive instinct. All actions\ \ in human life involve the confluence of these two instincts of preservation\ \ and destruction. The ideal situation for life in society would be the dominance\ \ of reason over the instinctual life controlling destructive impulses, which\ \ is utopian. In this perspective, aggressiveness is inherent in the human condition.\n\ \n In seniors with dementia with a declining psychological realm of the Self,\ \ the progressive loss of identity and the repercussion of cognitive decline,\ \ an actual decline in the rational realm of psychic life emerges. 
This decline\ \ refers to the cerebral aspect of inhibitory control and social cognition, showing\ \ that the emergence of aggressive behaviors is related to the biological component.\ \ The declining reason turns its demands and needs into instinctual acts and more\ \ basic reflexes, and can produce a continuous imbalance in the expression between\ \ the instincts of preservation and aggression.\n\n Aggressiveness can be triggered\ \ by situations of frustration, when they do not get what they want, when they\ \ are afraid or consider some humiliating situation, when they are exposed to\ \ environmental overstimulation or feel any physical pain or side effects from\ \ medication." - text: "Neurosurgery is of great interest to historians of medicine and technology\ \ because it is relatively young, because it developed in an era of journals and\ \ publications, because lines and traditions of training and mentorship are relatively\ \ clear, and because the technologies that enabled the evolution of the profession\ \ and acted as inflection points in the emergence of certain surgical approaches\ \ and procedures are at once well documented and remarkably unambiguous. 
To the\ \ extent that is the case for neurosurgery as a whole, it is even more so for\ \ surgery of the skull base.\n\n To trace the history of skull base surgery along\ \ its full expanse is to begin with Horsley and pituitary tumors (unless one wants\ \ to start even earlier with the treatment of trigeminal neuralgia); to move to\ \ Cushing's work in the same arena (but also that of many others as well); to\ \ emphasize the impact of microsurgical techniques and new imaging modalities;\ \ to outline once radically innovative, but now widely practiced anatomical approaches\ \ to the skull base; to emphasize the importance of team approaches; to discuss\ \ emerging therapeutic strategy as well as instrumentation and techniques; to\ \ acknowledge the importance of advances in neuroanesthesia and the medical and\ \ perioperative care of the neurosurgical patient; and to recognize the contributions\ \ of the many individuals who, over the past 25 years, have added to and furthered\ \ the field in these and other ways.\n\n It is not hard to point to leading individuals\ \ and important techniques. It is perhaps more difficult to frame them in a meaningful\ \ historical perspective because the work has occurred relatively recently, in\ \ the time frame historians call \"near history.\" Difficulties arise from both\ \ an evaluative and a nosological standpoint. For example, from an evaluative\ \ standpoint, how does one stratify the relative importance of corticosteroids,\ \ osmotic diuretics, and CSF drainage techniques and technologies in the control\ \ of intracranial pressure and the facilitation of exposure for base of skull\ \ surgery? How does one think about the idea of hybrid surgery and stereotactic\ \ radiation? What will be the long-term view of anatomical approaches to giant\ \ basilar aneurysms in the light of endovascular surgery? 
Have we reached a tipping\ \ point in the management of vestibular schwannomas, given the availability of\ \ and the outcomes associated with stereotactic radiosurgery?\n\n From a nosological\ \ standpoint, should we think about base of skull surgery in terms of anatomical\ \ approaches? One textbook that does just that starts with subfrontal approaches\ \ and then moves around the calvaria and down to the petrous and temporal region\ \ in a Cook's tour of exposure, in the tradition of Henry's Extensile Exposure\ \ and comparable surgical classics. 1, 6 Other publications have explored a set\ \ of technologies. 5, 7, 10 Another focuses on the contribution of great men.\ \ 9 Many surgeons have written about specific particular pathologies at the skull\ \ base.\n\n Introduction their colleagues write about the premodern period. Elhadi\ \ and colleagues also comment on the introduction of radiography in early neurosurgery.\ \ Gross and Grossi and their colleagues concentrate on petrosal approaches; Schmitt\ \ and Jane on third ventriculostomy; and Chittiboina and colleagues on the history\ \ of a very simple but ubiquitous instrument, the Freer elevator, and its inventor.\ \ In contrast to the more comprehensive overviews written by Goodrich, Donald,\ \ and others, these essays concentrate on selected details. While it is important\ \ not to miss the forest for the trees, sometimes the trees are worth studying\ \ no less than the forest. \n\n The authors report no conflict of interest." - text: 'How do neuromediators contribute to the pathogenesis of pruritus in AD? ' - text: "Pericardial effusion (PE) is a life-threatening condition, as accumulation\ \ of fluid in the pericardial sac can lead to cardiac tamponade and fatal shock.\ \ 1, 2 PE is often associated with an underlying disease or condition, and the\ \ causes can vary widely. 
3, 4 Pericardiocentesis performed by needle (with or\ \ without echoguidance), and various surgical procedures (including subxiphoid\ \ pericardial tube drainage, pericardial window performed through a left anterior\ \ thoracotomy, or video-assisted thoracoscopic surgery) can alleviate PE. 5 Our\ \ retrospective clinical experiences of treating PE with subxiphoid pericardiostomy\ \ are presented in this study.\n\n We reviewed the medical records of patients\ \ who underwent subxiphoid pericardiostomy to treat persistent symptomatic PE\ \ in our clinic between 1990 and 2000. Echocardiography (ECG) was used to diagnose\ \ PE and N Becit, A ร–zyazicioglu, M Ceviz et al.\n\n determine the size of the\ \ effusion. A diastolic echo-free space of < 10 mm between the left ventricular\ \ posterior wall and pericardium was determined as mild PE, 10 -20 mm as moderate,\ \ and > 20 mm as severe PE. Patients with cardiac tamponade and/or moderate to\ \ severe PE were treated by subxiphoid pericardiostomy and tube drainage.\n\n\ \ Some patients with pre-operative tuberculosis were treated with an adult fourdrug\ \ regimen (isoniazid, 300 mg/day and rifampin, 600 mg/day for 12 months, streptomycin,\ \ 1 g/day for 2 months, and pyrazinamide, 2 g/day for 3 months) preoperatively.\ \ The effusion was drained after a 3-week course of anti-tuberculosis therapy.\ \ In these, and patients diagnosed with tuberculous pericarditis, the tuberculosis\ \ therapy regimen was given for 12 months post-operatively.\n\n The technique\ \ used for subxiphoid pericardiostomy (described previously 3 ) was performed\ \ under general anaesthetic, or local anaesthesia and sedation. General anaesthesia\ \ was preferred in children and was induced with 1.5 mg/kg ketamine. Neuromuscular\ \ block was achieved with 0.1 mg/kg vecuronium, and anaesthesia maintained with\ \ 60% N 2 O, 40% O 2 and 0.5 -1.0% isoflurane. 
Local anaesthetic (2% lidocaine\ \ solution) was injected into the dermal and subdermal layers, and sedation and\ \ analgesia was provided by 1 mg/kg ketamine intravenously. A piece of anterior\ \ pericardium, approximately 2 -4 cm in diameter, was excised under direct vision\ \ and submitted for histopathological analysis. The pericardial cavity was decompressed\ \ and fluid samples were collected for culture and cytological analysis. To prevent\ \ acute cardiac dilatation during decompression of the pericardial cavity, intravenous\ \ digoxin was administered and the pericardial cavity was decompressed gradually.\n\ \n The pericardial cavity was examined under direct vision and/or by digital examination\ \ to detect any tumour or adhesions. Gentle digital lysis of adhesions and opening\ \ of loculations were performed as needed, to enhance satisfactory drainage. A\ \ soft chest tube was placed in the pericardial cavity, lateral to the right ventricle,\ \ after pericardiotomy for post-operative drainage. It was connected to an underwater\ \ sealed system, and was removed when fluid drainage ceased.\n\n Patients with\ \ mild haemorrhagic effusion and cardiac tamponade, due to trauma or invasive\ \ cardiac interventions, were considered haemodynamically unstable and unsuitable\ \ for surgical subxiphoid pericardiostomy, even under local anaesthetic. These\ \ patients underwent pericardiocentesis in the intensive care unit, which provided\ \ immediate relief. Subxiphoid pericardiostomy was performed later if haemorrhagic\ \ PE persisted. Patients were followed, with physical examinations and ECG, in\ \ the outpatient clinic for at least 1 year.\n\n Numerical results are given as\ \ mean ยฑ SD. Fisher's exact test was used to compare proportions between groups\ \ (comparison of the rates of recurrence and constriction between patient groups\ \ with uraemic pericarditis, tuberculous pericarditis and non-tuberculous bacterial\ \ pericarditis). 
The McNemar test was used for comparison of proportions within\ \ one group (to assess the significance of rates of recurrence and constriction\ \ in patients with tuberculous pericarditis). Statistical differences were considered\ \ significant if P < 0.05." - text: "Henry M. Blumberg, MD In this issue of Infection Control and Hospital Epidemiology,\ \ a potpourri of tuberculosis (TB)-related articles are being published. 1-7 Tuberculosisrelated\ \ issues have been an important focus for the past decade for those in infection\ \ control and hospital epidemiology, especially in urban areas where the large\ \ majority of TB cases occur, 8 but also, because of federal regulations, for\ \ those in low-endemic areas or areas where no TB cases occur (approximately half\ \ of the counties in the United States).\n\n The resurgence of TB beginning in\ \ the mid1980s in the United States (in large part, due to failure and underfunding\ \ of the public health infrastructure and to the epidemic of human immunodeficiency\ \ virus [HIV] infection) and outbreaks of TB have highlighted the risk of nosocomial\ \ transmission of TB. 9,10 These outbreaks affected both healthcare workers (HCWs)\ \ and patients. The fact that outbreaks in New York and Miami, among others, involved\ \ multidrug-resistant (MDR) strains that were associated with high morbidity and\ \ mortality among HIV-infected individuals punctuated the importance of effective\ \ TB infection control measures. 
Commingling of patients with unsuspected TB and\ \ those who were quite immunosuppressed led to amplification of nosocomial transmission.\ \ A decade ago, few institutions were prepared for the changing epidemiology of\ \ TB.\n\n Several recent studies have demonstrated that infection control measures\ \ are effective in preventing nosocomial transmission of TB, 11-13 and two reports\ \ in this issue, from institutions in Kentucky 1 and New York, 2 provide additional\ \ data on decreases in HCW tuberculin skin-test (TST) conversions following implementation\ \ of TB infection control measures. In most studies, multiple interventions (administrative\ \ controls, environmental controls, and respiratory protection) were initiated\ \ at approximately the same time, making it more difficult to identify the most\ \ crucial aspect of the program. The importance of TB infection control measures\ \ in contributing to the decline in TB cases in the United States, as well as\ \ the reduction in the number of MDR-TB cases in New York City, often has been\ \ understated. Increased federal funding for TB control activities and expansion\ \ of directly observed therapy clearly are important in efforts to prevent TB,\ \ but the initial decline in TB cases and in MDR TB in the United States beginning\ \ in 1993 likely was due, in large part, to interruption of TB transmission within\ \ healthcare facilities. 
Unfortunately, increased funding for TB control in the\ \ United States in the last 5 years often has not trickled down to inner-city\ \ hospitals, which frequently are the first line in the battle against TB.\n\n\ \ From our experience and that of others, it appears clear that administrative\ \ controls are the most important component of a TB infection control program.\ \ At Grady Memorial Hospital in Atlanta, we were able to decrease TB exposure\ \ episodes markedly and concomitantly to decrease HCW TST conversions after implementing\ \ an expanded respiratory isolation policy. 11 We continue to isolate appropriately\ \ approximately 95% of those subsequently diagnosed with TB. We were able to reduce\ \ TST conver-sion rates markedly during a period of time in which we had isolation\ \ rooms that would be considered suboptimal by Centers for Disease Control and\ \ Prevention (CDC) guidelines 14 (rooms that were under negative pressure but\ \ had less than six air changes per hour) and were using submicron masks. Implementation\ \ of better-engineered isolation rooms (>12 air changes per hour) with the completion\ \ of renovations to the hospital may have put us in better compliance with regulatory\ \ agencies and made the staff feel more secure, but has had little impact on further\ \ reducing low rates of HCW TST conversions. In addition, the termination of outbreaks\ \ and reduction of TST conversion rates at several institutions took place before\ \ introduction of National Institute for Occupational Safety and Health-approved\ \ masks and fit testing. 2,15,16 United States healthcare institutions are required\ \ by regulatory mandates to develop a \"respiratory protection program\" (including\ \ fit testing), which can be time-consuming, expensive, and logistically difficult.\ \ 17 Data published to date suggest that the impact of formal fit testing on proper\ \ mask use is small. 
18 These federal mandates also have turned some well-meaning\ \ (trying to comply fully with the Occupational Safety and Health Administration\ \ [OSHA] regulations) but misguided infection control practitioners into \"facial\ \ hair police.\" These types of processes divert time, effort, and resources away\ \ from what truly is effective in preventing nosocomial transmission of TB, as\ \ well as from other important infection control activities such as preventing\ \ nosocomial bloodstream infections or transmission of highly resistant pathogens\ \ such as vancomycin-resistant Enterococcus or preparing for the onslaught of\ \ vancomycin-resistant Staphylococcus aureus. At a time when US healthcare institutions\ \ are under enormous pressure due to healthcare reform, market forces, and managed\ \ care, it is essential that federal regulatory agencies look carefully at scientific\ \ data when issuing regulations." datasets: - tomaarsen/miriad-4.4M-split pipeline_tag: feature-extraction library_name: sentence-transformers metrics: - dot_accuracy@1 - dot_accuracy@3 - dot_accuracy@5 - dot_accuracy@10 - dot_precision@1 - dot_precision@3 - dot_precision@5 - dot_precision@10 - dot_recall@1 - dot_recall@3 - dot_recall@5 - dot_recall@10 - dot_ndcg@10 - dot_mrr@10 - dot_map@100 - query_active_dims - query_sparsity_ratio - corpus_active_dims - corpus_sparsity_ratio co2_eq_emissions: emissions: 196.23895298915153 energy_consumed: 0.504857070427092 source: codecarbon training_type: fine-tuning on_cloud: false cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K ram_total_size: 31.777088165283203 hours_used: 1.484 hardware_used: 1 x NVIDIA GeForce RTX 3090 model-index: - name: MPNet-base trained on MIRIAD question-passage tuples results: - task: type: sparse-information-retrieval name: Sparse Information Retrieval dataset: name: miriad eval type: miriad_eval metrics: - type: dot_accuracy@1 value: 0.917 name: Dot Accuracy@1 - type: dot_accuracy@3 value: 0.963 name: Dot Accuracy@3 - 
type: dot_accuracy@5 value: 0.969 name: Dot Accuracy@5 - type: dot_accuracy@10 value: 0.98 name: Dot Accuracy@10 - type: dot_precision@1 value: 0.917 name: Dot Precision@1 - type: dot_precision@3 value: 0.32099999999999995 name: Dot Precision@3 - type: dot_precision@5 value: 0.1938 name: Dot Precision@5 - type: dot_precision@10 value: 0.09800000000000002 name: Dot Precision@10 - type: dot_recall@1 value: 0.917 name: Dot Recall@1 - type: dot_recall@3 value: 0.963 name: Dot Recall@3 - type: dot_recall@5 value: 0.969 name: Dot Recall@5 - type: dot_recall@10 value: 0.98 name: Dot Recall@10 - type: dot_ndcg@10 value: 0.9509329680619819 name: Dot Ndcg@10 - type: dot_mrr@10 value: 0.9414055555555555 name: Dot Mrr@10 - type: dot_map@100 value: 0.9422311263243918 name: Dot Map@100 - type: query_active_dims value: 72.48699951171875 name: Query Active Dims - type: query_sparsity_ratio value: 0.9976254791000846 name: Query Sparsity Ratio - type: corpus_active_dims value: 291.5419921875 name: Corpus Active Dims - type: corpus_sparsity_ratio value: 0.9904497005212599 name: Corpus Sparsity Ratio - task: type: sparse-information-retrieval name: Sparse Information Retrieval dataset: name: miriad test type: miriad_test metrics: - type: dot_accuracy@1 value: 0.9 name: Dot Accuracy@1 - type: dot_accuracy@3 value: 0.953 name: Dot Accuracy@3 - type: dot_accuracy@5 value: 0.961 name: Dot Accuracy@5 - type: dot_accuracy@10 value: 0.974 name: Dot Accuracy@10 - type: dot_precision@1 value: 0.9 name: Dot Precision@1 - type: dot_precision@3 value: 0.31766666666666665 name: Dot Precision@3 - type: dot_precision@5 value: 0.19220000000000004 name: Dot Precision@5 - type: dot_precision@10 value: 0.09740000000000001 name: Dot Precision@10 - type: dot_recall@1 value: 0.9 name: Dot Recall@1 - type: dot_recall@3 value: 0.953 name: Dot Recall@3 - type: dot_recall@5 value: 0.961 name: Dot Recall@5 - type: dot_recall@10 value: 0.974 name: Dot Recall@10 - type: dot_ndcg@10 value: 0.9387955628253912 name: 
Dot Ndcg@10 - type: dot_mrr@10 value: 0.9273035714285714 name: Dot Mrr@10 - type: dot_map@100 value: 0.9283432155352948 name: Dot Map@100 - type: query_active_dims value: 73.08399963378906 name: Query Active Dims - type: query_sparsity_ratio value: 0.9976059226378685 name: Query Sparsity Ratio - type: corpus_active_dims value: 293.2669982910156 name: Corpus Active Dims - type: corpus_sparsity_ratio value: 0.9903931929671761 name: Corpus Sparsity Ratio --- # MPNet-base trained on MIRIAD question-passage tuples This is a [SPLADE Sparse Encoder](https://www.sbert.net/docs/sparse_encoder/usage/usage.html) model finetuned from [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) on the [miriad-4.4_m-split](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split) dataset using the [sentence-transformers](https://www.SBERT.net) library. It maps sentences & paragraphs to a 30527-dimensional sparse vector space and can be used for semantic search and sparse retrieval. ## Model Details ### Model Description - **Model Type:** SPLADE Sparse Encoder - **Base model:** [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) <!-- at revision 6996ce1e91bd2a9c7d7f61daec37463394f73f09 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 30527 dimensions - **Similarity Function:** Dot Product - **Training Dataset:** - [miriad-4.4_m-split](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split) - **Language:** en - **License:** apache-2.0 ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Documentation:** [Sparse Encoder Documentation](https://www.sbert.net/docs/sparse_encoder/usage/usage.html) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sparse Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=sparse-encoder) ### Full Model Architecture ``` SparseEncoder( (0): 
MLMTransformer({'max_seq_length': 512, 'do_lower_case': False}) with MLMTransformer model: MPNetForMaskedLM (1): SpladePooling({'pooling_strategy': 'max', 'activation_function': 'relu', 'word_embedding_dimension': 30527}) ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. ```python from sentence_transformers import SparseEncoder # Download from the 🤗 Hub model = SparseEncoder("tomaarsen/splade-mpnet-base-miriad-2e-5-lq-5e-6-lc") # Run inference queries = [ "How have infection control measures been effective in preventing nosocomial transmission of TB?\n", ] documents = [ 'Henry M. Blumberg, MD In this issue of Infection Control and Hospital Epidemiology, a potpourri of tuberculosis (TB)-related articles are being published. 1-7 Tuberculosisrelated issues have been an important focus for the past decade for those in infection control and hospital epidemiology, especially in urban areas where the large majority of TB cases occur, 8 but also, because of federal regulations, for those in low-endemic areas or areas where no TB cases occur (approximately half of the counties in the United States).\n\n The resurgence of TB beginning in the mid1980s in the United States (in large part, due to failure and underfunding of the public health infrastructure and to the epidemic of human immunodeficiency virus [HIV] infection) and outbreaks of TB have highlighted the risk of nosocomial transmission of TB. 9,10 These outbreaks affected both healthcare workers (HCWs) and patients. The fact that outbreaks in New York and Miami, among others, involved multidrug-resistant (MDR) strains that were associated with high morbidity and mortality among HIV-infected individuals punctuated the importance of effective TB infection control measures. 
Commingling of patients with unsuspected TB and those who were quite immunosuppressed led to amplification of nosocomial transmission. A decade ago, few institutions were prepared for the changing epidemiology of TB.\n\n Several recent studies have demonstrated that infection control measures are effective in preventing nosocomial transmission of TB, 11-13 and two reports in this issue, from institutions in Kentucky 1 and New York, 2 provide additional data on decreases in HCW tuberculin skin-test (TST) conversions following implementation of TB infection control measures. In most studies, multiple interventions (administrative controls, environmental controls, and respiratory protection) were initiated at approximately the same time, making it more difficult to identify the most crucial aspect of the program. The importance of TB infection control measures in contributing to the decline in TB cases in the United States, as well as the reduction in the number of MDR-TB cases in New York City, often has been understated. Increased federal funding for TB control activities and expansion of directly observed therapy clearly are important in efforts to prevent TB, but the initial decline in TB cases and in MDR TB in the United States beginning in 1993 likely was due, in large part, to interruption of TB transmission within healthcare facilities. Unfortunately, increased funding for TB control in the United States in the last 5 years often has not trickled down to inner-city hospitals, which frequently are the first line in the battle against TB.\n\n From our experience and that of others, it appears clear that administrative controls are the most important component of a TB infection control program. At Grady Memorial Hospital in Atlanta, we were able to decrease TB exposure episodes markedly and concomitantly to decrease HCW TST conversions after implementing an expanded respiratory isolation policy. 
11 We continue to isolate appropriately approximately 95% of those subsequently diagnosed with TB. We were able to reduce TST conver-sion rates markedly during a period of time in which we had isolation rooms that would be considered suboptimal by Centers for Disease Control and Prevention (CDC) guidelines 14 (rooms that were under negative pressure but had less than six air changes per hour) and were using submicron masks. Implementation of better-engineered isolation rooms (>12 air changes per hour) with the completion of renovations to the hospital may have put us in better compliance with regulatory agencies and made the staff feel more secure, but has had little impact on further reducing low rates of HCW TST conversions. In addition, the termination of outbreaks and reduction of TST conversion rates at several institutions took place before introduction of National Institute for Occupational Safety and Health-approved masks and fit testing. 2,15,16 United States healthcare institutions are required by regulatory mandates to develop a "respiratory protection program" (including fit testing), which can be time-consuming, expensive, and logistically difficult. 17 Data published to date suggest that the impact of formal fit testing on proper mask use is small. 18 These federal mandates also have turned some well-meaning (trying to comply fully with the Occupational Safety and Health Administration [OSHA] regulations) but misguided infection control practitioners into "facial hair police." These types of processes divert time, effort, and resources away from what truly is effective in preventing nosocomial transmission of TB, as well as from other important infection control activities such as preventing nosocomial bloodstream infections or transmission of highly resistant pathogens such as vancomycin-resistant Enterococcus or preparing for the onslaught of vancomycin-resistant Staphylococcus aureus. 
At a time when US healthcare institutions are under enormous pressure due to healthcare reform, market forces, and managed care, it is essential that federal regulatory agencies look carefully at scientific data when issuing regulations.', 'Drug Reaction with Eosinophilia and Systemic Symptoms (DRESS) syndrome is a severe and potentially life-threatening hypersensitivity reaction caused by exposure to certain medications (Phillips et al., 2011; Bocquet et al., 1996) . It is extremely heterogeneous in its manifestation but has characteristic delayed-onset cutaneous and multisystem features with a protracted natural history. The reaction typically starts with a fever, followed by widespread skin eruption of variable nature. This progresses to inflammation of internal organs such as hepatitis, pneumonitis, myocarditis and nephritis, and haematological abnormalities including eosinophilia and atypical lymphocytosis (Kardaun et al., 2013; Cho et al., 2017) .\n\n DRESS syndrome is most commonly classified according to the international scoring system developed by the RegiSCAR group (Kardaun et al., 2013) . RegiSCAR accurately defines the syndrome by considering the major manifestations, with each feature scored between โˆ’1 and 2, and 9 being the maximum total number of points. According to this classification, a score of < 2 means no case, 2-3 means possible case, 4-5 means probable case, and 6 or above means definite DRESS syndrome. Table 1 gives an overview of the RegiSCAR scoring system. DRESS syndrome usually develops 2 to 6 weeks after exposure to the causative drug, with resolution of symptoms after drug withdrawal in the majority of cases (Husain et al., 2013a) . Some patients require supportive treatment with corticosteroids, although there is a lack of evidence surrounding the most effective dose, route and duration of the therapy (Adwan, 2017) . 
Although extremely rare, with an estimated population risk of between 1 and 10 in 10,000 drug exposures, it is significant due to its high mortality rate, at around 10% (Tas and The pathogenesis of DRESS syndrome remains largely unknown. Current evidence suggests that patients may be genetically predisposed to this form of hypersensitivity, with a superimposed risk resulting from Human Herpes Virus (HHV) exposure and subsequent immune reactivation (Cho et al., 2017; Husain et al., 2013a) . In fact, the serological detection of HHV-6 has even been proposed as an additional diagnostic marker for DRESS syndrome (Shiohara et al., 2007) . Other potential risk factors identified are family history (Sullivan and Shear, 2001; Pereira De Silva et al., 2011) and concomitant drug use, particularly antibiotics . DRESS syndrome appears to occur in patients of any age, with patient demographics from several reviews finding age ranges between 6 and 89 years (Picard et al., 2010; Kano et al., 2015; Cacoub et al., 2013) . DRESS syndrome was first described as an adverse reaction to antiepileptic therapy, but has since been recognised as a complication of an extremely wide range of medications (Adwan, 2017) . In rheumatology, it has been classically associated with allopurinol and sulfasalazine, but has also been documented in association with many other drugs including leflunomide, hydroxychloroquine, febuxostat and NSAIDs (Adwan, 2017) . Recent evidence has also identified a significant risk of DRESS syndrome with strontium ranelate use (Cacoub et al., 2013) . Thus far, that is the only anti-osteoporotic drug associated with DRESS syndrome, although there are various cases of other adverse cutaneous reactions linked to anti-osteoporotic medications, ranging from benign maculopapular eruption to Stevens-Johnson syndrome (SJS) and Toxic Epidermal Necrolysis (TEN) . 
Denosumab, an antiresorptive RANK ligand (RANKL) inhibitor licensed for osteoporosis, is currently known to be associated with some dermatological manifestations including dermatitis, eczema, pruritus and, less commonly, cellulitis (Prolia, n.d.).\n\n We hereby describe the first documented case of DRESS syndrome associated with denosumab treatment.\n\n The patient is a 76-year-old female with osteoporosis and a background of alcoholic fatty liver disease and lower limb venous insufficiency. Osteoporosis was first diagnosed in 2003 and treated with risedronate, calcium and vitamin D, until 2006. While on this treatment, the patient sustained T12 and L3 fractures, the latter treated with kyphoplasty, and was therefore deemed a non-responder to risedronate.', "The regulation of these events is known to go awry in certain pathologies, especially in diseases associated with neurodegeneration. Mitochondrial fission helps to enhance the number of mitochondria, which can be efficiently distributed to each corner of neuronal cells and thus helps them to maintain their energy demands. Mitochondrial fission is highly essential during periods of energy starvation to produce new, efficient mitochondrial energy generating systems. However, enhanced fission associated with bioenergetic crisis causes BAX foci formation on the mitochondrial membrane and thus causes mitochondrial outer membrane permeabilization (MOMP), releasing cytochrome c and other pro-apoptotic mediators into the cytosol, resulting in apoptosis [93] . Impairment in mitochondrial dynamics has also been observed in inflammatory neuropathies and oxaliplatin-induced neuropathy [94] . Excessive nitric oxide is known to cause s-nitrosylation of dynamin related protein-1 (Drp-1), and increases mitochondrial fission [95, 96] . Tumor necrosis factor-α (TNF-α) is reported to inhibit the kinesin 1 protein, and thus impairs trafficking by halting mitochondrial movement along axons [97] . 
In addition to impaired dynamics, aggregates of abnormally shaped, damaged mitochondria are responsible for aberrant mitochondrial trafficking, which contributes to axonal degeneration observed in various peripheral neuropathies [81] .\n\n Autophagy is the discerning cellular catabolic process responsible for recycling the damaged proteins/ organelles in the cells [98] . Mitophagy is a selective autophagic process involved in recycling of damaged mitochondria and helps in supplying the constituents for mitochondrial biogenesis [99] . Excessive accumulation and impaired clearance of dysfunctional mitochondria are known to be observed in various disorders associated with oxidative stress [100] . Oxidative damage to Atg 4, a key component involved in mitophagy, causes impaired autophagosome formation and clearance of damaged mitochondria [101] . Loss in the function of molecular chaperones and associated accumulation of damaged proteins are known to be involved in various peripheral neuropathies including trauma-induced neuropathy [102, 103] . A model of demyelinating neuropathy corresponding to the accumulation of improperly folded myelin protein PMP-22 has also been described recently [104, 105] .\n\n Mitochondrial dysfunction and associated disturbances are well connected to neuroinflammatory changes that occur in various neurodegenerative diseases [106] . Dysfunctional mitochondria are also implicated in several pathologies such as cardiovascular and neurodegenerative diseases. Several mitochondrial toxins have been found to inhibit the respiration in microglial cells and also inhibit the IL-4-induced alternative anti-inflammatory response and thus potentiate neuroinflammation [107] . Mitochondrial ROS are well identified to be involved in several inflammatory pathways such as NF-κB, MAPK activation [108] . 
Similarly, the pro-inflammatory mediators released as a result of an inflammatory episode are found to interfere with the functioning of the mitochondrial electron transport chain and thus compromise ATP production [109] . TNF-α is known to inhibit complexes I and IV of the ETC and decreases energy production. Nitric oxide (NO) is a potent inhibitor of cytochrome c oxidase (complex IV) and similarly IL-6 is also known to enhance mitochondrial generation of superoxide [110] . Mitochondrial dysfunction initiates inflammation by increased formation of complexes of damaged mitochondrial parts and cytoplasmic pattern recognition receptors (PRRs). The resulting inflammasome directs activation of interleukin-1β production, which starts an immune response. Fig. (4). Mitotoxicity in peripheral neuropathies: Various pathophysiological insults like hyperglycemic, chemotherapeutic and traumatic injury to the peripheral nerves result in mitochondrial dysfunction through enhanced generation of ROS-induced biomolecular damage and bioenergetic crisis. Following the nerve injury, accumulation of mitochondria occurs, resulting in the release of mtDNA & formyl peptides into circulation, which act as death-associated molecular patterns (DAMPs). These are recognized by immune cells as foreign bodies and can elicit a local immune/inflammatory response. Interaction between inflammatory mediators and structural proteins involved in mitochondrial trafficking will cause impairment in mitochondrial motility. Oxidative stress-induced damage to mt proteins like Atg4, Parkin etc. causes insufficient mitophagy. Excess nitrosative stress also results in excessive mt fission associated with apoptosis. In addition, mtDNA damage impairs its transcription and reduces mitochondrial biogenesis. Ca2+ dyshomeostasis, loss in mitochondrial potential and bioenergetic crisis cause neuronal death via apoptosis/necrosis. 
All these modifications cause defects in ultra structure, physiology and trafficking of mitochondria resulting in loss of neuronal function producing peripheral neuropathy.", ] query_embeddings = model.encode_query(queries) document_embeddings = model.encode_document(documents) print(query_embeddings.shape, document_embeddings.shape) # [1, 30527] [3, 30527] # Get the similarity scores for the embeddings similarities = model.similarity(query_embeddings, document_embeddings) print(similarities) # tensor([[38.6532, 2.9277, 0.1620]]) ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. <details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Sparse Information Retrieval * Datasets: `miriad_eval` and `miriad_test` * Evaluated with [<code>SparseInformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sparse_encoder/evaluation.html#sentence_transformers.sparse_encoder.evaluation.SparseInformationRetrievalEvaluator) | Metric | miriad_eval | miriad_test | |:----------------------|:------------|:------------| | dot_accuracy@1 | 0.917 | 0.9 | | dot_accuracy@3 | 0.963 | 0.953 | | dot_accuracy@5 | 0.969 | 0.961 | | dot_accuracy@10 | 0.98 | 0.974 | | dot_precision@1 | 0.917 | 0.9 | | dot_precision@3 | 0.321 | 0.3177 | | dot_precision@5 | 0.1938 | 0.1922 | | dot_precision@10 | 0.098 | 0.0974 | | dot_recall@1 | 0.917 | 0.9 | | dot_recall@3 | 0.963 | 0.953 | | dot_recall@5 | 0.969 | 0.961 | | dot_recall@10 | 0.98 | 0.974 | | **dot_ndcg@10** | **0.9509** | **0.9388** | | dot_mrr@10 | 0.9414 | 0.9273 | | dot_map@100 | 0.9422 | 0.9283 | | query_active_dims | 72.487 | 73.084 | | query_sparsity_ratio | 0.9976 | 0.9976 | | 
corpus_active_dims | 291.542 | 293.267 | | corpus_sparsity_ratio | 0.9904 | 0.9904 | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### miriad-4.4_m-split * Dataset: [miriad-4.4_m-split](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split) at [596b9ab](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split/tree/596b9ab305d52cb73644ed5b5004957c7bfaae40) * Size: 100,000 training samples * Columns: <code>question</code> and <code>passage_text</code> * Approximate statistics based on the first 1000 samples: | | question | passage_text | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 9 tokens</li><li>mean: 23.38 tokens</li><li>max: 71 tokens</li></ul> | <ul><li>min: 511 tokens</li><li>mean: 512.0 tokens</li><li>max: 512 tokens</li></ul> | * Samples: | question | passage_text | 
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>What factors may contribute to increased pulmonary conduit durability in patients who undergo the Ross operation compared to those with right ventricular outflow tract obstruction?<br></code> | <code>I n 1966, Ross and Somerville 1 reported the first use of an aortic homograft to establish right ventricle-to-pulmonary artery continuity in a patient with tetralogy of Fallot and pulmonary atresia. Since that time, pulmonary position homografts have been used in a variety of right-sided congenital heart lesions. Actuarial 5-year homograft survivals for cryopreserved homografts are reported to range between 55% and 94%, with the shortest durability noted in patients less than 2 years of age. 
4 Pulmonary position homografts also are used to replace pulmonary autografts explanted to repair left-sided outflow disease (the Ross operation). Several factors may be likely to favor increased pulmonary conduit durability in Ross patients compared with those with right ventricular outflow tract obstruction, including later age at operation (allowing for larger homografts), more normal pulmonary artery architecture, absence of severe right ventricular hypertrophy, and more natural positioning of ...</code> | | <code>How does MCAM expression in hMSC affect the growth and maintenance of hematopoietic progenitors?</code> | <code>After culture in a 3-dimensional hydrogel-based matrix, which constitutes hypoxic conditions, MCAM expression is lost. Concordantly, Tormin et al. demonstrated that MCAM is down-regulated under hypoxic conditions. 10 Furthermore, it was shown by others and our group that oxygen tension causes selective modification of hematopoietic cell and mesenchymal stromal cell interactions in co-culture systems as well as influence HSPC metabolism. [44] [45] [46] Thus, the observed differences between Sharma et al. and our data in HSPC supporting capacity of hMSC are likely due to the different culture conditions used. Further studies are required to clarify the influence of hypoxia in our model system. Altogether these findings provide further evidence for the importance of MCAM in supporting HSPC. Furthermore, previous reports have shown that MCAM is down-regulated in MSC after several passages as well as during aging and differentiation. 19, 47 Interestingly, MCAM overexpression in hMSC enhance...</code> | | <code>What is the relationship between Fanconi anemia and breast and ovarian cancer susceptibility genes?<br></code> | <code>( 31 ) , of which 5% -10 % may be caused by genetic factors ( 32 ) , up to half a million of these patients may be at risk of secondary hereditary neoplasms. 
The historic observation of twofold to fivefold increased risks of cancers of the ovary, thyroid, and connective tissue after breast cancer ( 33 ) presaged the later syndromic association of these tumors with inherited mutations of BRCA1, BRCA2, PTEN, and p53 ( 16 ) . By far the largest cumulative risk of a secondary cancer in BRCA mutation carriers is associated with cancer in the contralateral breast, which may reach a risk of 29.5% at 10 years ( 34 ) . The Breast Cancer Linkage Consortium ( 35 , 36 ) also documented threefold to fivefold increased risks of subsequent cancers of prostate, pancreas, gallbladder, stomach, skin (melanoma), and uterus in BRCA2 mutation carriers and twofold increased risks of prostate and pancreas cancer in BRCA1 mutation carriers; these results are based largely on self-reported family history inf...</code> | * Loss: [<code>SpladeLoss</code>](https://sbert.net/docs/package_reference/sparse_encoder/losses.html#spladeloss) with these parameters: ```json { "loss": "SparseMultipleNegativesRankingLoss(scale=1.0, similarity_fct='dot_score')", "lambda_corpus": 5e-06, "lambda_query": 2e-05 } ``` ### Evaluation Dataset #### miriad-4.4_m-split * Dataset: [miriad-4.4_m-split](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split) at [596b9ab](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split/tree/596b9ab305d52cb73644ed5b5004957c7bfaae40) * Size: 1,000 evaluation samples * Columns: <code>question</code> and <code>passage_text</code> * Approximate statistics based on the first 1000 samples: | | question | passage_text | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 8 tokens</li><li>mean: 23.55 tokens</li><li>max: 74 tokens</li></ul> | <ul><li>min: 512 tokens</li><li>mean: 512.0 tokens</li><li>max: 512 tokens</li></ul> | * Samples: | question | 
passage_text | |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | <code>What are some hereditary cancer syndromes that can result in various forms of cancer?<br></code> | <code>Hereditary Cancer Syndromes, including Hereditary Breast and Ovarian Cancer (HBOC) and Lynch Syndrome (LS), can result in various forms of cancer due to germline mutations in cancer predisposition genes. While the major contributory genes for these syndromes have been identified and well-studied (BRCA1/ BRCA2 for HBOC and MSH2/MSH6/MLH1/PMS2/ EPCAM for LS), there remains a large percentage of associated cancer cases that are negative for germline mutations in these genes, including 80% of women with a personal or family history of breast cancer who are negative for BRCA1/2 mutations [1] . 
Similarly, between 30 and 50% of families fulfill stringent criteria for LS and test negative for germline mismatch repair gene mutations [2] . Adding complexity to these disorders is the significant overlap in the spectrum of cancers observed between various hereditary cancer syndromes, including many cancer susceptibility syndromes. Some that contribute to elevated breast cancer risk include Li-Frau...</code> | | <code>How do MAK-4 and MAK-5 exert their antioxidant properties?<br></code> | <code>Hybrid F1 mice were injected with urethane (300 mg/kg) at 8 days of age. A group was then put on a MAK-supplemented diet, another group was fed a standard pellet diet. At 36 weeks of age the mice were sacrificed and the livers examined for the presence of tumors per mouse (Panel A) and for the number of nodules per mouse (Panel B) (* p < 0.05, ** P < 0.001). Statistical analysis was performed by Two Way ANOVA Test followed by Post Hoc Bonferroni analysis. <br><br> We then measured the influence of the MAK-4+5 combination on the expression of the three liver-specific connexins (cx26, cx32, and cx43). The level of cx26 expression was similar in all the groups of mice treated with the MAK-supplemented diet and in the control (Figure 4, Panel A) . A significant, time-dependent increase in cx32 was observed in the liver of all the groups of MAK treated mice compared to the normal diet-fed controls. 
Cx32 expression increased 2-fold after 1 week of treatment, and 3- to 4-fold at 3 months (Figure 4, Pane...</code> | | <code>What are the primary indications for a decompressive craniectomy, and what role does neurocritical care play in determining the suitability of a patient for this procedure?</code> | <code>Decompressive craniectomy is a valid neurosurgical strategy nowadays as an alternative to control an elevated intracranial pressure (ICP) and the risk of uncal and/or subfalcine herniation, in cases refractory to postural, ventilatory, and pharmacological measures to control it. Neurocritical care and ICP monitoring are key determinants to identify and postulate the inclusion criteria to consider a patient as a candidate for this procedure, as it is always considered a rescue surgical technique. Head trauma and ischemic or hemorrhagic cerebrovascular disease with progressive deterioration due to mass effect are some of the cases that may require a decompressive craniectomy with its different variants. 
However, this procedure per se can have complications described in the postcraniectomy syndrome and may occur in short, medium, or even long term.<br><br> 1,2 The paradoxical herniation is a condition in which there is a deviation of the midline with mass effect, even t...</code> | * Loss: [<code>SpladeLoss</code>](https://sbert.net/docs/package_reference/sparse_encoder/losses.html#spladeloss) with these parameters: ```json { "loss": "SparseMultipleNegativesRankingLoss(scale=1.0, similarity_fct='dot_score')", "lambda_corpus": 5e-06, "lambda_query": 2e-05 } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 4 - `per_device_eval_batch_size`: 4 - `learning_rate`: 2e-05 - `num_train_epochs`: 1 - `warmup_ratio`: 0.1 - `fp16`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 4 - `per_device_eval_batch_size`: 4 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 1 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - 
`half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: False - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: None - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - 
`neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - `prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional - `router_mapping`: {} - `learning_rate_mapping`: {} </details> ### Training Logs | Epoch | Step | Training Loss | Validation Loss | miriad_eval_dot_ndcg@10 | miriad_test_dot_ndcg@10 | |:-----:|:-----:|:-------------:|:---------------:|:-----------------------:|:-----------------------:| | 0.032 | 800 | 311.9058 | - | - | - | | 0.064 | 1600 | 10.9011 | - | - | - | | 0.096 | 2400 | 2.3726 | - | - | - | | 0.128 | 3200 | 0.4999 | - | - | - | | 0.16 | 4000 | 0.1222 | 0.0420 | 0.9017 | - | | 0.192 | 4800 | 0.0755 | - | - | - | | 0.224 | 5600 | 0.0481 | - | - | - | | 0.256 | 6400 | 0.0643 | - | - | - | | 0.288 | 7200 | 0.0598 | - | - | - | | 0.32 | 8000 | 0.0575 | 0.0210 | 0.9274 | - | | 0.352 | 8800 | 0.0417 | - | - | - | | 0.384 | 9600 | 0.0487 | - | - | - | | 0.416 | 10400 | 0.0262 | - | - | - | | 0.448 | 11200 | 0.0404 | - | - | - | | 0.48 | 12000 | 0.0359 | 0.0163 | 0.9282 | - | | 0.512 | 12800 | 0.0407 | - | - | - | | 0.544 | 13600 | 0.0373 | - | - | - | | 0.576 | 14400 | 0.0204 | - | - | - | | 0.608 | 15200 | 0.0218 | - | - | - | | 0.64 | 16000 | 0.0196 | 0.0045 | 0.9434 | - | | 0.672 | 16800 | 0.0311 | - | - | - | | 0.704 | 17600 | 0.0372 | - | - | - | | 0.736 | 18400 | 0.029 | - | - | - | | 0.768 | 19200 | 0.0319 | - | - | - | | 0.8 | 20000 | 0.0352 | 0.0196 | 0.9392 | - | | 0.832 | 20800 | 0.0257 | - | - | - | | 0.864 | 21600 | 0.0339 | - | - | - | | 0.896 | 22400 | 0.0211 | - | - | - | | 0.928 | 23200 | 0.0197 | - | - | - | | 0.96 | 24000 | 0.0228 | 0.0069 | 0.9514 | - | | 0.992 | 24800 | 0.0161 | - | - | - | | -1 | -1 | - | - | 0.9509 | 0.9388 | ### Environmental Impact Carbon emissions were measured using 
[CodeCarbon](https://github.com/mlco2/codecarbon). - **Energy Consumed**: 0.505 kWh - **Carbon Emitted**: 0.196 kg of CO2 - **Hours Used**: 1.484 hours ### Training Hardware - **On Cloud**: No - **GPU Model**: 1 x NVIDIA GeForce RTX 3090 - **CPU Model**: 13th Gen Intel(R) Core(TM) i7-13700K - **RAM Size**: 31.78 GB ### Framework Versions - Python: 3.11.6 - Sentence Transformers: 4.2.0.dev0 - Transformers: 4.52.4 - PyTorch: 2.6.0+cu124 - Accelerate: 1.5.1 - Datasets: 2.21.0 - Tokenizers: 0.21.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### SpladeLoss ```bibtex @misc{formal2022distillationhardnegativesampling, title={From Distillation to Hard Negative Sampling: Making Sparse Neural IR Models More Effective}, author={Thibault Formal and Carlos Lassance and Benjamin Piwowarski and Stรฉphane Clinchant}, year={2022}, eprint={2205.04733}, archivePrefix={arXiv}, primaryClass={cs.IR}, url={https://arxiv.org/abs/2205.04733}, } ``` #### SparseMultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` #### FlopsLoss ```bibtex @article{paria2020minimizing, title={Minimizing flops to learn efficient sparse representations}, author={Paria, Biswajit and Yeh, Chih-Kuan and Yen, Ian EH and Xu, Ning and Ravikumar, Pradeep and P{'o}czos, 
Barnab{\'a}s}, journal={arXiv preprint arXiv:2004.05665}, year={2020} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
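The query/corpus sparsity ratios in the card's evaluation table above follow directly from the active dimension counts and the 30527-dimensional output shown in the usage example's printed embedding shapes. A minimal sketch of that arithmetic — the helper function is illustrative, not part of the Sentence Transformers API:

```python
def sparsity_ratio(active_dims: float, vocab_size: int = 30527) -> float:
    """Fraction of embedding dimensions that are zero on average.

    `vocab_size` is taken from the [1, 30527] / [3, 30527] shapes
    printed in the card's usage example.
    """
    return 1.0 - active_dims / vocab_size

# Active-dimension counts from the miriad_eval column of the table
print(round(sparsity_ratio(72.487), 4))   # query embeddings  -> 0.9976
print(round(sparsity_ratio(291.542), 4))  # corpus embeddings -> 0.9904
```

This matches the reported `query_sparsity_ratio` (0.9976) and `corpus_sparsity_ratio` (0.9904): on average only ~72 of ~30.5k dimensions are active per query, which is what makes the encoder's output usable with an inverted index.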
mrdayl/qwen3-mermaid-16bnb
mrdayl
2025-06-17T10:14:12Z
0
0
transformers
[ "transformers", "safetensors", "qwen3", "text-generation", "text-generation-inference", "unsloth", "conversational", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-generation
2025-06-17T08:56:26Z
--- base_model: unsloth/qwen3-4b-unsloth-bnb-4bit tags: - text-generation-inference - transformers - unsloth - qwen3 license: apache-2.0 language: - en --- # Uploaded finetuned model - **Developed by:** mrdayl - **License:** apache-2.0 - **Finetuned from model:** unsloth/qwen3-4b-unsloth-bnb-4bit This qwen3 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
FormlessAI/cdf43773-3ca9-407c-a0db-3c6c553761ec
FormlessAI
2025-06-17T10:09:30Z
0
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "generated_from_trainer", "trl", "sft", "conversational", "base_model:aisingapore/Llama-SEA-LION-v2-8B-IT", "base_model:finetune:aisingapore/Llama-SEA-LION-v2-8B-IT", "autotrain_compatible", "text-generation-inference", "endpoints_co...
text-generation
2025-06-17T09:54:11Z
--- base_model: aisingapore/llama3-8b-cpt-sea-lionv2.1-instruct library_name: transformers model_name: cdf43773-3ca9-407c-a0db-3c6c553761ec tags: - generated_from_trainer - trl - sft licence: license --- # Model Card for cdf43773-3ca9-407c-a0db-3c6c553761ec This model is a fine-tuned version of [aisingapore/llama3-8b-cpt-sea-lionv2.1-instruct](https://huggingface.co/aisingapore/llama3-8b-cpt-sea-lionv2.1-instruct). It has been trained using [TRL](https://github.com/huggingface/trl). ## Quick start ```python from transformers import pipeline question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?" generator = pipeline("text-generation", model="FormlessAI/cdf43773-3ca9-407c-a0db-3c6c553761ec", device="cuda") output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0] print(output["generated_text"]) ``` ## Training procedure [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/phoenix-formless/Gradients/runs/jo240qxe) This model was trained with SFT. ### Framework versions - TRL: 0.18.1 - Transformers: 4.52.4 - Pytorch: 2.7.0+cu128 - Datasets: 3.6.0 - Tokenizers: 0.21.1 ## Citations Cite TRL as: ```bibtex @misc{vonwerra2022trl, title = {{TRL: Transformer Reinforcement Learning}}, author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec}, year = 2020, journal = {GitHub repository}, publisher = {GitHub}, howpublished = {\url{https://github.com/huggingface/trl}} } ```
gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book
gokulsrinivasagan
2025-06-17T09:53:18Z
9
0
transformers
[ "transformers", "safetensors", "bert", "fill-mask", "generated_from_trainer", "dataset:gokulsrinivasagan/processed_book_corpus-ld", "base_model:google/bert_uncased_L-4_H-512_A-8", "base_model:finetune:google/bert_uncased_L-4_H-512_A-8", "license:apache-2.0", "model-index", "autotrain_compatible"...
fill-mask
2025-06-13T18:21:37Z
--- library_name: transformers license: apache-2.0 base_model: google/bert_uncased_L-4_H-512_A-8 tags: - generated_from_trainer datasets: - gokulsrinivasagan/processed_book_corpus-ld metrics: - accuracy model-index: - name: tinybert_base_train_book_ent_15p_s_init_book results: - task: name: Masked Language Modeling type: fill-mask dataset: name: gokulsrinivasagan/processed_book_corpus-ld type: gokulsrinivasagan/processed_book_corpus-ld metrics: - name: Accuracy type: accuracy value: 0.5016182928962741 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # tinybert_base_train_book_ent_15p_s_init_book This model is a fine-tuned version of [google/bert_uncased_L-4_H-512_A-8](https://huggingface.co/google/bert_uncased_L-4_H-512_A-8) on the gokulsrinivasagan/processed_book_corpus-ld dataset. It achieves the following results on the evaluation set: - Loss: 2.7883 - Accuracy: 0.5016 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 120 - eval_batch_size: 120 - seed: 10 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 10000 - num_epochs: 24 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-------:|:------:|:---------------:|:--------:| | 5.7148 | 0.5269 | 10000 | 5.3754 | 0.1660 | | 5.4378 | 1.0539 | 20000 | 5.0823 | 0.1892 | | 5.1562 | 1.5808 | 30000 | 4.7818 | 0.2149 | | 4.8487 | 2.1077 | 40000 | 4.4633 | 0.2476 | | 4.5657 | 2.6346 | 50000 | 4.1493 | 0.2907 | | 4.2716 | 3.1616 | 60000 | 3.7873 | 0.3466 
| | 4.2871 | 3.6885 | 70000 | 3.8181 | 0.3361 | | 4.1543 | 4.2154 | 80000 | 3.5822 | 0.3733 | | 3.8571 | 4.7423 | 90000 | 3.4747 | 0.3897 | | 3.7109 | 5.2693 | 100000 | 3.1587 | 0.4452 | | 3.5309 | 5.7962 | 110000 | 3.1118 | 0.4521 | | 3.5456 | 6.3231 | 120000 | 3.1531 | 0.4396 | | 3.3806 | 6.8500 | 130000 | 2.8550 | 0.4952 | | 3.4529 | 7.3770 | 140000 | 3.0184 | 0.4605 | | 3.3215 | 7.9039 | 150000 | 2.7883 | 0.5016 | | 3.513 | 8.4308 | 160000 | 3.0518 | 0.4508 | | 3.3968 | 8.9577 | 170000 | 2.9743 | 0.4616 | | 3.449 | 9.4847 | 180000 | 2.9690 | 0.4628 | | 3.3697 | 10.0116 | 190000 | 2.8899 | 0.4777 | | 3.357 | 10.5385 | 200000 | 2.9087 | 0.4713 | | 3.387 | 11.0654 | 210000 | 2.8973 | 0.4734 | | 3.4019 | 11.5924 | 220000 | 2.9180 | 0.4674 | | 3.3729 | 12.1193 | 230000 | 2.9308 | 0.4650 | | 3.4055 | 12.6462 | 240000 | 2.9422 | 0.4640 | | 3.4147 | 13.1731 | 250000 | 3.0244 | 0.4468 | | 3.395 | 13.7001 | 260000 | 2.9477 | 0.4606 | | 3.4227 | 14.2270 | 270000 | 2.9277 | 0.4636 | | 3.5185 | 14.7539 | 280000 | 3.0647 | 0.4362 | | 3.4673 | 15.2809 | 290000 | 3.0344 | 0.4418 | | 3.4164 | 15.8078 | 300000 | 3.0563 | 0.4379 | | 3.3326 | 16.3347 | 310000 | 3.0179 | 0.4443 | | 3.3937 | 16.8616 | 320000 | 3.0324 | 0.4397 | | 3.4516 | 17.3886 | 330000 | 3.1178 | 0.4245 | | 3.4207 | 17.9155 | 340000 | 3.0349 | 0.4407 | | 3.3921 | 18.4424 | 350000 | 2.9866 | 0.4471 | | 3.3771 | 18.9693 | 360000 | 2.9835 | 0.4488 | | 3.3844 | 19.4963 | 370000 | 2.9886 | 0.4477 | | 3.3288 | 20.0232 | 380000 | 2.9555 | 0.4523 | | 3.3691 | 20.5501 | 390000 | 2.9938 | 0.4449 | | 3.3104 | 21.0770 | 400000 | 2.9500 | 0.4528 | | 3.3398 | 21.6040 | 410000 | 2.9999 | 0.4437 | | 3.3325 | 22.1309 | 420000 | 2.9703 | 0.4481 | | 3.3466 | 22.6578 | 430000 | 2.9785 | 0.4474 | | 3.3444 | 23.1847 | 440000 | 2.9894 | 0.4450 | | 3.3103 | 23.7117 | 450000 | 2.9477 | 0.4523 | ### Framework versions - Transformers 4.51.2 - Pytorch 2.6.0+cu126 - Datasets 3.5.0 - Tokenizers 0.21.1
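The card above documents training only; a minimal usage sketch for this fill-mask checkpoint might look like the following. The repo id is taken from the card header, and the `fill-mask` pipeline matches the card's stated task; the example sentence is an illustrative assumption.

```python
from transformers import pipeline

# Repo id from the card header.
MODEL_ID = "gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book"

def top_mask_fills(text: str, k: int = 5) -> list[str]:
    """Return the k most likely tokens for the [MASK] slot in `text`."""
    unmasker = pipeline("fill-mask", model=MODEL_ID)
    return [pred["token_str"] for pred in unmasker(text, top_k=k)]

# Example call (downloads the checkpoint on first use):
# top_mask_fills("The book was lying on the [MASK].")
```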
ganesan-erss/sqlcoder-7b-finetuned-v2
ganesan-erss
2025-06-17T09:45:02Z
0
0
transformers
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2025-06-17T09:44:47Z
--- library_name: transformers tags: [] --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. 
Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
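The "How to Get Started" section above is an empty placeholder; a hypothetical starter sketch is shown below. Only the repo id comes from the card — the `AutoModelForCausalLM` class is an assumption, since the card does not state the model's architecture or task.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id from the card; the model class below is an assumption.
MODEL_ID = "ganesan-erss/sqlcoder-7b-finetuned-v2"

def load():
    """Fetch tokenizer and weights from the Hub (downloads on first call)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    return tokenizer, model
```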
brainmao/csfmod
brainmao
2025-06-17T09:34:03Z
0
0
null
[ "gguf", "llama", "unsloth", "license:mit", "endpoints_compatible", "region:us" ]
null
2025-06-17T08:51:37Z
--- license: mit tags: - unsloth ---
BootesVoid/cmbxbw7zm00jurdqs9iqa9vjc_cmc0am4wa07fdrdqsoyeqy0u9
BootesVoid
2025-06-17T09:31:24Z
0
0
diffusers
[ "diffusers", "flux", "lora", "replicate", "text-to-image", "en", "base_model:black-forest-labs/FLUX.1-dev", "base_model:adapter:black-forest-labs/FLUX.1-dev", "license:other", "region:us" ]
text-to-image
2025-06-17T09:31:22Z
--- license: other license_name: flux-1-dev-non-commercial-license license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md language: - en tags: - flux - diffusers - lora - replicate base_model: "black-forest-labs/FLUX.1-dev" pipeline_tag: text-to-image # widget: # - text: >- # prompt # output: # url: https://... instance_prompt: TRISHA --- # Cmbxbw7Zm00Jurdqs9Iqa9Vjc_Cmc0Am4Wa07Fdrdqsoyeqy0U9 <Gallery /> ## About this LoRA This is a [LoRA](https://replicate.com/docs/guides/working-with-loras) for the FLUX.1-dev text-to-image model. It can be used with diffusers or ComfyUI. It was trained on [Replicate](https://replicate.com/) using AI toolkit: https://replicate.com/ostris/flux-dev-lora-trainer/train ## Trigger words You should use `TRISHA` to trigger the image generation. ## Run this LoRA with an API using Replicate ```py import replicate input = { "prompt": "TRISHA", "lora_weights": "https://huggingface.co/BootesVoid/cmbxbw7zm00jurdqs9iqa9vjc_cmc0am4wa07fdrdqsoyeqy0u9/resolve/main/lora.safetensors" } output = replicate.run( "black-forest-labs/flux-dev-lora", input=input ) for index, item in enumerate(output): with open(f"output_{index}.webp", "wb") as file: file.write(item.read()) ``` ## Use it with the [๐Ÿงจ diffusers library](https://github.com/huggingface/diffusers) ```py from diffusers import AutoPipelineForText2Image import torch pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda') pipeline.load_lora_weights('BootesVoid/cmbxbw7zm00jurdqs9iqa9vjc_cmc0am4wa07fdrdqsoyeqy0u9', weight_name='lora.safetensors') image = pipeline('TRISHA').images[0] ``` For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters) ## Training details - Steps: 2000 - Learning rate: 0.0004 - LoRA rank: 16 ## Contribute your own examples You can use 
the [community tab](https://huggingface.co/BootesVoid/cmbxbw7zm00jurdqs9iqa9vjc_cmc0am4wa07fdrdqsoyeqy0u9/discussions) to add images that show off what you've made with this LoRA.
vietnhat/orpheus-test
vietnhat
2025-06-17T09:25:44Z
0
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "text-generation-inference", "unsloth", "conversational", "en", "base_model:unsloth/orpheus-3b-0.1-ft", "base_model:finetune:unsloth/orpheus-3b-0.1-ft", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:...
text-generation
2025-06-17T09:23:56Z
--- base_model: unsloth/orpheus-3b-0.1-ft tags: - text-generation-inference - transformers - unsloth - llama license: apache-2.0 language: - en --- # Uploaded finetuned model - **Developed by:** vietnhat - **License:** apache-2.0 - **Finetuned from model:** unsloth/orpheus-3b-0.1-ft This Llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
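The card above ships no usage snippet; a minimal sketch for loading this checkpoint with the standard `transformers` generation API follows. The repo id comes from the card; the prompt handling and decoding settings are illustrative assumptions, not the author's documented workflow.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id from the card.
MODEL_ID = "vietnhat/orpheus-test"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Greedy-decode a continuation of `prompt` with the fine-tuned checkpoint."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```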
JayHyeon/Qwen_1.5B-math-VIPO_5e-6_10.0vpo_constant-5ep
JayHyeon
2025-06-17T09:20:56Z
0
0
transformers
[ "transformers", "tensorboard", "safetensors", "qwen2", "text-generation", "generated_from_trainer", "trl", "dpo", "conversational", "dataset:argilla/distilabel-math-preference-dpo", "arxiv:2305.18290", "base_model:Qwen/Qwen2.5-Math-1.5B", "base_model:finetune:Qwen/Qwen2.5-Math-1.5B", "auto...
text-generation
2025-06-17T08:56:16Z
--- base_model: Qwen/Qwen2.5-Math-1.5B datasets: argilla/distilabel-math-preference-dpo library_name: transformers model_name: Qwen_1.5B-math-VIPO_5e-6_10.0vpo_constant-5ep tags: - generated_from_trainer - trl - dpo licence: license --- # Model Card for Qwen_1.5B-math-VIPO_5e-6_10.0vpo_constant-5ep This model is a fine-tuned version of [Qwen/Qwen2.5-Math-1.5B](https://huggingface.co/Qwen/Qwen2.5-Math-1.5B) on the [argilla/distilabel-math-preference-dpo](https://huggingface.co/datasets/argilla/distilabel-math-preference-dpo) dataset. It has been trained using [TRL](https://github.com/huggingface/trl). ## Quick start ```python from transformers import pipeline question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?" generator = pipeline("text-generation", model="JayHyeon/Qwen_1.5B-math-VIPO_5e-6_10.0vpo_constant-5ep", device="cuda") output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0] print(output["generated_text"]) ``` ## Training procedure [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/bonin147/huggingface/runs/recw375c) This model was trained with DPO, a method introduced in [Direct Preference Optimization: Your Language Model is Secretly a Reward Model](https://huggingface.co/papers/2305.18290). ### Framework versions - TRL: 0.15.2 - Transformers: 4.50.0 - Pytorch: 2.6.0 - Datasets: 3.4.1 - Tokenizers: 0.21.1 ## Citations Cite DPO as: ```bibtex @inproceedings{rafailov2023direct, title = {{Direct Preference Optimization: Your Language Model is Secretly a Reward Model}}, author = {Rafael Rafailov and Archit Sharma and Eric Mitchell and Christopher D. 
Manning and Stefano Ermon and Chelsea Finn}, year = 2023, booktitle = {Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, NeurIPS 2023, New Orleans, LA, USA, December 10 - 16, 2023}, url = {http://papers.nips.cc/paper_files/paper/2023/hash/a85b405ed65c6477a4fe8302b5e06ce7-Abstract-Conference.html}, editor = {Alice Oh and Tristan Naumann and Amir Globerson and Kate Saenko and Moritz Hardt and Sergey Levine}, } ``` Cite TRL as: ```bibtex @misc{vonwerra2022trl, title = {{TRL: Transformer Reinforcement Learning}}, author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec}, year = 2020, journal = {GitHub repository}, publisher = {GitHub}, howpublished = {\url{https://github.com/huggingface/trl}} } ```
diegolacomba/multilingual-e5-small-legal-mnrl-0
diegolacomba
2025-06-17T09:05:06Z
0
0
sentence-transformers
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:58898", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:intfloat/multilingual-e5-small", "base_model:finetune:intfloat/m...
sentence-similarity
2025-06-17T09:04:34Z
--- tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:58898 - loss:MultipleNegativesRankingLoss base_model: intfloat/multilingual-e5-small widget: - source_sentence: 'query: ¿Cómo se deben determinar las cuotas a cuenta del IRPF en un año con actividad económica suspendida?' sentences: - 'passage A los efectos de este Impuesto, se considerará promotor de edificaciones el propietario de inmuebles que construyó (promotor-constructor) o contrató la construcción (promotor) de los mismos para destinarlos a la venta, el alquiler o el uso propio. c) Dichas ejecuciones de obra tengan por objeto la construcción o rehabilitación de edificios destinados fundamentalmente a viviendas, incluidos los locales, anejos, instalaciones y servicios complementarios en ella situados. d) Las referidas ejecuciones de obra consistan materialmente en la construcción o rehabilitación de los citados edificios. 3.- En consecuencia, las ejecuciones de obra concertadas directamente entre el promotor y el contratista (la consultante), que tengan por objeto la rehabilitación de una vivienda, tributan al tipo reducido del 10 por ciento. El tipo reducido se aplica con independencia de que el promotor concierte la totalidad de la obra de construcción con un solo empresario o concierte la realización con varios empresarios realizando cada uno de ellos una parte de la obra según su especialidad. No obstante, las ejecuciones de obra realizadas por subcontratistas para otros contratistas (la consultante), que a su vez contraten con el promotor, tributarán por el Impuesto sobre el Valor Añadido al tipo general del 21 por ciento. 4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.' - 'passage Descripción de hechos: La consultante es titular de una actividad económica de "otros cafés y bares". 
El rendimiento neto de la actividad se determina por el método de estimación objetiva y tributa en el IVA por el régimen especial simplificado. Desde la declaración de alarma en marzo de 2020 ha tenido cerrada la actividad y la va a seguir teniendo cerrada durante todo el año 2020, pues las restricciones que tiene que aplicar no la hacen rentable. Cuestión planteada: Forma de calcular, en 2020, el pago fraccionado a cuenta del IRPF y el ingreso a cuenta trimestral del IVA.' - 'passage No obstante, el artículo 22.Trece de la Ley 37/1992, declara la exención de: "Los transportes de viajeros y sus equipajes por vía marítima o aérea procedentes de o con destino a un puerto o aeropuerto situado fuera del ámbito espacial del Impuesto. Se entenderán incluidos en este apartado los transportes por vía aérea amparados por un único título de transporte que incluya vuelos de conexión aérea.". En consecuencia, los servicios de transporte consultados, que tienen su origen o destino en un aeropuerto fuera del territorio de aplicación del impuesto sobre el valor añadido, estarán sujetos pero exentos del Impuesto sobre el Valor Añadido. 2.- Por otra parte, el artículo 164, apartado uno, de la Ley del Impuesto sobre el Valor Añadido, en el que se regulan las obligaciones de los sujetos pasivos, establece lo siguiente: "Uno. Sin perjuicio de lo establecido en el Título anterior, los sujetos pasivos del Impuesto estarán obligados, con los requisitos, límites y condiciones que se determinen reglamentariamente, a: (…) 3º. Expedir y entregar factura de todas sus operaciones, ajustada a lo que se determine reglamentariamente.". El desarrollo reglamentario de dicho precepto se ha llevado a cabo por el Reglamento por el que se regulan las obligaciones de facturación, aprobado por el artículo 1 del Real Decreto 1619/2012, de 30 de noviembre (BOE de 1 de diciembre). 
El artículo 2 del mencionado Reglamento dispone que:' - source_sentence: 'query: ¿Cuál es el porcentaje de impuesto que corresponde a dispositivos destinados a aliviar discapacidades bajo la ley actual?' sentences: - 'passage Contestación completa: 1.- El artículo 90, apartado uno, de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre), dispone que el Impuesto se exigirá al tipo del 21 por ciento, salvo lo dispuesto en el artículo siguiente. 2.- El artículo 91, apartado Uno.1, número 6º, letra c) de la Ley 37/1992 dispone lo siguiente: "Uno. Se aplicará el tipo del 10 por ciento a las operaciones siguientes: 1. Las entregas, adquisiciones intracomunitarias o importaciones de los bienes que se indican a continuación: (…) 6.º Los siguientes bienes: (…) c) Los equipos médicos, aparatos y demás instrumental, relacionados en el apartado octavo del anexo de esta Ley, que, por sus características objetivas, estén diseñados para aliviar o tratar deficiencias, para uso personal y exclusivo de personas que tengan deficiencias físicas, mentales, intelectuales o sensoriales, sin perjuicio de lo previsto en el apartado dos.1 de este artículo. No se incluyen en esta letra otros accesorios, recambios y piezas de repuesto de dichos bienes.". El apartado octavo del Anexo de la Ley 37/1992, establece lo siguiente: "Octavo. Relación de bienes a que se refiere el artículo 91.Uno.1. 6.ºc) de esta Ley. (…) – Sillas terapéuticas y de ruedas, así como los cojines antiescaras y arneses para el uso de las mismas, muletas, andadores y grúas para movilizar personas con discapacidad. (…).". 3.- Por su parte, el artículo 91, apartado dos.1, número 4º de la Ley 37/1992, dispone que: "Dos. Se aplicará el tipo del 4 por ciento a las operaciones siguientes: 1. Las entregas, adquisiciones intracomunitarias o importaciones de los bienes que se indican a continuación: (…)' - 'passage (…).". 
De acuerdo con lo dispuesto anteriormente, en los supuestos de adjudicación de bienes en virtud de subasta judicial o administrativa, como es el caso que nos ocupa, el adjudicatario puede efectuar, en su caso, la renuncia a las exenciones previstas en el apartado dos del artículo 20 de la Ley 37/1992, así como expedir factura, presentar, en nombre y por cuenta del sujeto pasivo, la declaración-liquidación correspondiente e ingresar el importe del Impuesto sobre el Valor Añadido resultante. El ejercicio de dicha facultad por parte del adjudicatario determina la obligación de presentar la autoliquidación del Impuesto conforme al modelo aprobado por la Orden HAC/3625/2003, de 23 de diciembre (modelo 309). Uno de los requisitos necesarios para el ejercicio de dicha facultad es que el destinatario-adjudicatario del bien inmueble tenga la consideración de empresario o profesional en los términos previstos en esta contestación. La no consideración como empresario o profesional impide el ejercicio de dicha facultad. Por último, señalar que de resultar aplicable la regla de inversión del sujeto pasivo prevista en el artículo 84.Uno.2º de la Ley 37/1992, anteriormente desarrollado, el adjudicatario resultará ser el sujeto pasivo de la operación por lo que viene obligado a presentar la autoliquidación ordinaria del Impuesto en nombre propio, sin actuar en nombre y por cuenta del subastado. Asimismo, de optar por dicha facultad en los términos establecidos reglamentariamente, el consultante podrá emitir, en nombre y por cuenta del transmitente, la correspondiente factura en la que se documente la operación. 
No obstante, tal y como se ha señalado en apartados anteriores de esta contestación, el consultante adjudicatario de la subasta judicial no procedió a la renuncia a la exención del artículo 20.Uno.22º de la Ley del Impuesto en el plazo establecido, habiéndose encontrado facultado para ello según lo dispuesto en la Disposición Adicional Sexta de la Ley 37/1992.' - 'passage c) Las que tengan por objeto la cesión del derecho a utilizar infraestructuras ferroviarias. d) Las autorizaciones para la prestación de servicios al público y para el desarrollo de actividades comerciales o industriales en el ámbito portuario." 3.- La consulta plantea una cuestión sobre un contrato por el que un Ayuntamiento cede a un contratista la explotación de un bar (instalación fija de obra) en una ciudad. Dicho contrato tiene la naturaleza de contrato administrativo especial, sin que el mismo pueda calificarse como contrato de gestión de servicio público ni tampoco como concesión administrativa de dominio público. Cabe plantearse si podría resultar aplicable a la referida prestación de servicios efectuada por el ayuntamiento en favor de la consultante el supuesto de no sujeción al Impuesto sobre el Valor Añadido previsto para el otorgamiento de concesiones y autorizaciones administrativas en el número 9º del artículo 7 de la citada Ley 37/1992. La respuesta a esta cuestión es negativa, pues, como ha señalado la Asesoría Jurídica de la Secretaría de Estado de Hacienda en el informe emitido el 30 de julio de 1997 a solicitud de esta Dirección General, los contratos que tienen por objeto la explotación de cafeterías y comedores en centros públicos son contratos administrativos especiales, sin que los mismos puedan calificarse como contratos de gestión de servicios públicos ni tampoco como concesiones administrativas de dominio público. 
En este sentido se ha pronunciado la Junta Consultiva de Contratación Administrativa en diversos informes emitidos al respecto; así, en el informe 57/07 de 6 de febrero de 2008, y, con anterioridad, en los informes 5/96 de 7 de marzo y 67/99, de 6 de julio de 2000. En consecuencia con todo lo anterior, está sujeto al Impuesto sobre el Valor Añadido y no exento del mismo el contrato suscrito entre el ayuntamiento y la consultante consistente en explotar un bar-quiosco, a cambio del pago de una contraprestación. 4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.' - source_sentence: 'query: ¿En qué casos las transacciones documentadas en escrituras públicas pueden estar sujetas a una tasa tributaria específica según la normativa vigente?' sentences: - 'passage 3.- Por otra parte en relación con la inclusión del suero de irrigación en el apartado destinado a "Bolsas de recogida de orina, absorbentes de incontinencia y otros sistemas para incontinencia urinaria y fecal, incluidos los sistemas de irrigación", este Centro directivo en la consulta de fecha 23 de marzo de 2015, número V0872-15 y en relación con los sistemas de irrigación ha dispuesto que, "Tributarán por el Impuesto sobre el Valor Añadido, al tipo general del 21 por ciento, los siguientes productos objeto de consulta: -Los empapadores, las duchas vaginales, irrigadores, accesorios y sistemas de irrigación no destinados específicamente a situaciones de incontinencia urinaria o fecal, ni las cánulas rectales y vaginales no destinadas específicamente a situaciones de incontinencia urinaria o fecal o no incorporadas en equipos destinados a estas situaciones. 
" 4.- En consecuencia con lo anterior este centro directivo le informa que tributan al tipo general del 21 por ciento las entregas, adquisiciones intracomunitarias e importaciones de suero de irrigación (agua destilada o suero fisiológico) objeto de consulta siendo irrelevante que su destino sea para la limpieza aséptica de la piel, lavado de heridas o quemaduras formando parte integrante de los sistemas de irrigación. 5.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria. No obstante, de acuerdo con el artículo 68.2 del Reglamento General de las actuaciones y los procedimientos de gestión e inspección tributaria y de desarrollo de las normas comunes de los procedimientos de aplicación de los tributos, aprobado por el Real Decreto 1065/2007, de 27 de julio, la presente contestación no tendrá efectos vinculantes para aquellos miembros o asociados de la consultante que en el momento de formular la consulta estuviesen siendo objeto de un procedimiento, recurso o reclamación económico-administrativa iniciado con anterioridad y relacionado con las cuestiones planteadas en la consulta conforme a lo dispuesto en su artículo 89.2.' - 'passage Contestación completa: 1.- Las reglas de localización de las prestaciones de servicios se encuentran reguladas en los artículos 69, 70 y 72 de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre). En el artículo 69 del dicho texto normativo se contienen las reglas generales de localización en donde se establece que: "Uno. 
Las prestaciones de servicios se entenderán realizadas en el territorio de aplicación del Impuesto, sin perjuicio de lo dispuesto en el apartado siguiente de este artículo y en los artículos 70 y 72 de esta Ley, en los siguientes casos: 1.º Cuando el destinatario sea un empresario o profesional que actúe como tal y radique en el citado territorio la sede de su actividad económica, o tenga en el mismo un establecimiento permanente o, en su defecto, el lugar de su domicilio o residencia habitual, siempre que se trate de servicios que tengan por destinatarios a dicha sede, establecimiento permanente, domicilio o residencia habitual, con independencia de dónde se encuentre establecido el prestador de los servicios y del lugar desde el que los preste. 2.º Cuando el destinatario no sea un empresario o profesional actuando como tal, siempre que los servicios se presten por un empresario o profesional y la sede de su actividad económica o establecimiento permanente desde el que los preste o, en su defecto, el lugar de su domicilio o residencia habitual, se encuentre en el territorio de aplicación del Impuesto. (…).". No obstante, estas reglas serán de aplicación únicamente en el caso en que no proceda aplicar ninguna de las reglas especiales que se regulan en el artículo 70 de la Ley del impuesto. En concreto, respecto de los servicios de restauración y catering, se establece en el número 5º del apartado Uno de dicho precepto que: "Uno. Se entenderán prestados en el territorio de aplicación del Impuesto los siguientes servicios: (…) 5.º. A) Los de restauración y catering en los siguientes supuestos: (…) b) Los restantes servicios de restauración y catering cuando se presten materialmente en el territorio de aplicación del Impuesto. (…).".' - 'passage Artículo 31 "2. 
Las primeras copias de escrituras y actas notariales, cuando tengan por objeto cantidad o cosa valuable, contengan actos o contratos inscribibles en los Registros de la Propiedad, Mercantil y de la Propiedad Industrial y de Bienes Muebles no sujetos al Impuesto sobre Sucesiones y Donaciones o a los conceptos comprendidos en los números 1 y 2 del artículo 1.º de esta Ley, tributarán, además, al tipo de gravamen que, conforme a lo previsto en la Ley 21/2001, de 27 de diciembre, por la que se regulan las medidas fiscales y administrativas del nuevo sistema de financiación de las Comunidades Autónomas de régimen común y Ciudades con Estatuto de Autonomía, haya sido aprobado por la Comunidad Autónoma. Si la Comunidad Autónoma no hubiese aprobado el tipo a que se refiere el párrafo anterior, se aplicará el 0,50 por 100, en cuanto a tales actos o contratos." De la aplicación de los preceptos anteriormente transcritos resulta lo siguiente: - Por regla general las operaciones realizadas por un sujeto pasivo del IVA son operaciones no sujetas a la modalidad de transmisiones patrimoniales onerosas del ITP y AJD según lo dispuesto en los artículos 7.5 del Texto Refundido del citado impuesto. 
En tal caso, si la referida operaciรณn se documentase en escritura pรบblica, la no sujeciรณn de la transmisiรณn por la modalidad de transmisiones patrimoniales onerosas permitirรญa la aplicaciรณn la cuota variable del Documento Notarial de la modalidad Actos Jurรญdicos Documentados, dada la concurrencia de todos los requisitos exigidos en el artรญculo 31.2 del Texto Refundido del Impuesto: Tratarse de una primera copia de una escritura o acta notarial Tener por objeto cantidad o cosa valuable Contener un acto o contrato inscribibles en los Registros de la Propiedad, Mercantil y de la Propiedad Industrial y de Bienes Muebles No estar sujetos los referidos actos al Impuesto sobre Sucesiones y Donaciones o a los conceptos comprendidos en los apartados 1 y 2 del artรญculo 1 de esta Ley, transmisiones patrimoniales onerosas y operaciones societarias' - source_sentence: 'query: ยฟSe aplican impuestos a la enseรฑanza de idiomas para particulares y empresas en modalidad presencial y virtual?' sentences: - 'passage 4.- Por otro lado, el artรญculo 91, apartado dos.2, nรบmero 1ยบ, de la Ley del Impuesto sobre el Valor Aรฑadido, dispone la aplicaciรณn del tipo impositivo del 4 por ciento a la prestaciรณn de los siguientes servicios: โ€œ1.ยบ Los servicios de reparaciรณn de los vehรญculos y de las sillas de ruedas comprendidos en el pรกrrafo primero del nรบmero 4.ยบ del apartado dos.1 de este artรญculo y los servicios de adaptaciรณn de los autotaxis y autoturismos para personas con discapacidad y de los vehรญculos a motor a los que se refiere el pรกrrafo segundo del mismo precepto independientemente de quiรฉn sea el conductor de los mismos.โ€. Los servicios de reparaciรณn recogidos en la Ley 37/1992 son รบnicamente los referidos a vehรญculos para personas con movilidad reducida y a sillas de ruedas para uso exclusivo de personas con discapacidad, que son los bienes incluidos en el pรกrrafo primero del artรญculo 91, apartado dos.1, nรบmero 4ยบ de dicha Ley. 
En consecuencia con lo anterior, las reparaciones de sillas de ruedas, que no estรฉn incluidas en el pรกrrafo anterior, tributarรกn al tipo del 21 por ciento dado que no estรก contemplado en el artรญculo 91 de la Ley 37/1992 un tipo reducido para estos servicios de reparaciรณn. 5.- En relaciรณn con el tipo impositivo aplicable a los accesorios y recambios de sillas de ruedas, la actual redacciรณn del artรญculo 91.Uno.1.6ยบ, letra c) dice expresamente que: โ€œNo se incluyen en esta letra otros accesorios, recambios y piezas de repuesto de dichos bienes.โ€.' - 'passage Descripciรณn de hechos: La consultante es una persona fรญsica que va a impartir clases de idiomas, en concreto alemรกn, tanto a personas fรญsicas como a empresas. Las clases se realizarรกn tanto de manera presencial como a travรฉs de medios electrรณnicos. Cuestiรณn planteada: Si las clases se encuentran exentas del Impuesto sobre el Valor Aรฑadido.' - 'passage Contestaciรณn completa: 1.- El artรญculo 134 bis, apartado dos de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Aรฑadido (BOE del 29), establece que: โ€œDos. Cuando el rรฉgimen de tributaciรณn aplicable a una determinada actividad agrรญcola, ganadera, forestal o pesquera cambie del rรฉgimen especial de la agricultura, ganaderรญa y pesca al general del Impuesto, el empresario o profesional titular de la actividad tendrรก derecho a: 1ยบ. Efectuar la deducciรณn de la cuota resultante de aplicar al valor de los bienes afectos a la actividad, Impuesto sobre el Valor Aรฑadido excluido, en la fecha en que deje de aplicarse el rรฉgimen especial, los tipos de dicho Impuesto que estuviesen vigentes en la citada fecha. A estos efectos, no se tendrรกn en cuenta los siguientes: a) Bienes de inversiรณn, definidos conforme a lo dispuesto en el artรญculo 108 de esta Ley. b) Bienes y servicios que hayan sido utilizados o consumidos total o parcialmente en la actividad. 2ยบ. 
Deducir la compensaciรณn a tanto alzado que prevรฉ el artรญculo 130 de esta Ley por los productos naturales obtenidos en las explotaciones que no se hayan entregado a la fecha del cambio del rรฉgimen de tributaciรณn. A efectos del ejercicio de los derechos recogidos en este apartado, el empresario o profesional deberรก confeccionar y presentar un inventario a la fecha en que deje de aplicarse el rรฉgimen especial. Tanto la presentaciรณn de este inventario como el ejercicio de estos derechos se ajustarรกn a los requisitos y condiciones que se establezcan reglamentariamente.โ€. Por su parte, el artรญculo 49 bis del Reglamento del Impuesto aprobado por el artรญculo 1 del Real Decreto 1624/1992, de 29 de diciembre (BOE del 31), declara que:' - source_sentence: 'query: ยฟDe quรฉ forma la ubicaciรณn de la agencia influye en la aplicaciรณn del impuesto en los servicios turรญsticos?' sentences: - 'passage Contestaciรณn completa: 1.- El artรญculo 9, primer pรกrrafo de la Ley 8/1991, de 25 de marzo, por la que se crea el Impuesto sobre la Producciรณn, los Servicios y la Importaciรณn en las Ciudades de Ceuta y Melilla (BOE del 26), dispone lo siguiente: โ€œLas importaciones definitivas de bienes en las ciudades de Ceuta y Melilla estarรกn exentas en los mismos tรฉrminos que en la legislaciรณn comรบn del Impuesto sobre el Valor Aรฑadido y, en todo caso, se asimilarรกn, a efectos de esta exenciรณn, las que resulten de aplicaciรณn a las operaciones interiores.โ€. 2.- Por otra parte, el artรญculo 20, apartado uno, nรบmero 17ยบ de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Aรฑadido (BOE del 29 de diciembre), dispone que estarรกn exentas de dicho Impuesto: โ€œ17ยบ. Las entregas de sellos de Correos y efectos timbrados de curso legal en Espaรฑa por importe no superior a su valor facial. La exenciรณn no se extiende a los servicios de expendiciรณn de los referidos bienes prestados en nombre y por cuenta de terceros.โ€. 
Conforme al precepto anterior, la entrega de sellos de correos de curso legal por importe no superior a su valor facial, objeto de consulta, estarรก exenta del Impuesto sobre el Valor Aรฑadido. 3.- En consecuencia, estarรกn sujetas pero exentas del Impuesto sobre la Producciรณn, los Servicios y la Importaciรณn en las Ciudades de Ceuta y Melilla las importaciones definitivas de sellos de correos de curso legal en las ciudades de Ceuta y Melilla cuando, de acuerdo con lo establecido en el apartado anterior de esta contestaciรณn, su entrega estรฉ exenta del Impuesto sobre el Valor Aรฑadido. 4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artรญculo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.' - 'passage 2ยบ. Sin perjuicio de lo dispuesto en el punto 1ยบ anterior, se aplicarรก, en todo caso, el tipo general del 21 por ciento, entre otros, a los siguientes bienes y servicios: 1. Servicios prestados por vรญa electrรณnica, esto es, aquellos servicios que consistan en la transmisiรณn enviada inicialmente y recibida en destino por medio de equipos de procesamiento, incluida la compresiรณn numรฉrica y el almacenamiento de datos, y enteramente transmitida, transportada y recibida por cable, sistema รณptico u otros medios electrรณnicos y, entre otros, los siguientes: a) El suministro y alojamiento de sitios informรกticos. b) El mantenimiento a distancia de programas y de equipos. c) El suministro de programas y su actualizaciรณn. d) El suministro de imรกgenes, texto, informaciรณn y la puesta a disposiciรณn de bases de datos. e) El suministro de mรบsica, pelรญculas, juegos, incluidos los de azar o de dinero, y de emisiones y manifestaciones polรญticas, culturales, artรญsticas, deportivas, cientรญficas o de ocio. f) El suministro de enseรฑanza a distancia. 2. 
Dispositivos portรกtiles que permitan almacenar y leer libros digitalizados, asรญ como reproductores de libros electrรณnicos y otros elementos de hardware, es decir, componentes que integren la parte material de un ordenador o que se puedan conectar al mismo. 3. Servicios consistentes en el acceso electrรณnico a bases de datos, periรณdicos, revistas y semejantes y, en general, a pรกginas web. 4. Comercializaciรณn de cรณdigos de descarga de archivos que incorporen libros electrรณnicos. 5. Servicios de acceso a libros de texto en formato digital alojados en servidores de Entes pรบblicos o de colegios. 6. Servicios de consultas y accesos a bases de datos. 7. Servicios de digitalizaciรณn de obras literarias.' - 'passage De acuerdo con los antecedentes recogidos en esta contestaciรณn, dicho servicio estarรก sujeto al rรฉgimen especial de las agencias de viajes regulado en el Capรญtulo VI del Tรญtulo IX de la Ley 37/1992 y tendrรก la consideraciรณn de prestaciรณn de servicios รบnica que estarรก sujeta al Impuesto sobre el Valor Aรฑadido bajo la premisa de que la consultante tiene establecida la sede de su actividad econรณmica o posea un establecimiento permanente desde donde efectรบa la operaciรณn en el territorio de aplicaciรณn del Impuesto. El tipo impositivo aplicable al servicio รบnico de viajes serรก el general del 21 por ciento previsto en el artรญculo 90.Uno de la Ley del Impuesto. 
Sobre la posible aplicaciรณn de la opciรณn del artรญculo 147 de la Ley 37/1992 para la aplicaciรณn del rรฉgimen general del Impuesto, segรบn se establece en contestaciรณn a consulta vinculante de 20 de septiembre de 2016, nรบmero V3942-16: โ€œ4.- Debe tenerse en cuenta que en el caso de las empresas radicadas en Estados Unidos a que se refiere el escrito de consulta, no se entiende cumplido el requisito de reciprocidad, tal como se pronunciรณ este Centro Directivo en contestaciรณn a consulta vinculante nรบmero V0579-12 de 16 de marzo de 2012, por lo que, salvo que el servicio prestado por la agencia de viajes consultante estรฉ relacionado con la asistencia a ferias, congresos y exposiciones de carรกcter comercial o profesional, en los tรฉrminos del artรญculo 119 bis de la Ley 37/1992 parcialmente transcrito, no se entenderรกn cumplidos los requisitos para la opciรณn por el rรฉgimen general del Impuesto sobre el Valor Aรฑadido.โ€. b) El mismo caso anterior, pero el viaje se pretende desarrollar en las Islas Canarias. Segรบn establece el artรญculo 144 de la Ley del Impuesto, dicha operaciรณn se encontrarรก sujeta al Impuesto y, en particular, al rรฉgimen especial de las agencias de viajes: โ€œDicha prestaciรณn se entenderรก realizada en el lugar donde la agencia tenga establecida la sede de su actividad econรณmica o posea un establecimiento permanente desde donde efectรบe la operaciรณn.โ€.' 
pipeline_tag: sentence-similarity library_name: sentence-transformers metrics: - cosine_accuracy@1 - cosine_accuracy@3 - cosine_accuracy@5 - cosine_accuracy@10 - cosine_precision@1 - cosine_precision@3 - cosine_precision@5 - cosine_precision@10 - cosine_recall@1 - cosine_recall@3 - cosine_recall@5 - cosine_recall@10 - cosine_ndcg@10 - cosine_mrr@10 - cosine_map@100 model-index: - name: SentenceTransformer based on intfloat/multilingual-e5-small results: - task: type: information-retrieval name: Information Retrieval dataset: name: InformationRetrievalEvaluator type: InformationRetrievalEvaluator metrics: - type: cosine_accuracy@1 value: 0.19120762711864406 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.3175317796610169 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.386034604519774 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 0.4834922316384181 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.19120762711864406 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.10584392655367232 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.0772069209039548 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.04834922316384181 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.19120762711864406 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.3175317796610169 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.386034604519774 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.4834922316384181 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.32410065040970315 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.2747356599183937 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.28756647798527146 name: Cosine Map@100 - type: cosine_accuracy@1 value: 0.3431194112526707 name: Cosine Accuracy@1 - type: cosine_accuracy@3 value: 0.5083485004352298 name: Cosine Accuracy@3 - type: cosine_accuracy@5 value: 0.5847906939938277 name: Cosine Accuracy@5 - type: cosine_accuracy@10 value: 
0.681332594761415 name: Cosine Accuracy@10 - type: cosine_precision@1 value: 0.3431194112526707 name: Cosine Precision@1 - type: cosine_precision@3 value: 0.1694495001450766 name: Cosine Precision@3 - type: cosine_precision@5 value: 0.11695813879876553 name: Cosine Precision@5 - type: cosine_precision@10 value: 0.06813325947614149 name: Cosine Precision@10 - type: cosine_recall@1 value: 0.3431194112526707 name: Cosine Recall@1 - type: cosine_recall@3 value: 0.5083485004352298 name: Cosine Recall@3 - type: cosine_recall@5 value: 0.5847906939938277 name: Cosine Recall@5 - type: cosine_recall@10 value: 0.681332594761415 name: Cosine Recall@10 - type: cosine_ndcg@10 value: 0.5018271132477355 name: Cosine Ndcg@10 - type: cosine_mrr@10 value: 0.44561322194462705 name: Cosine Mrr@10 - type: cosine_map@100 value: 0.4551504365024611 name: Cosine Map@100 --- # SentenceTransformer based on intfloat/multilingual-e5-small This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
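Because the embeddings are compared with cosine similarity, semantic search with this model reduces to ranking passages by their cosine score against the query embedding. A minimal, model-free sketch of that ranking step, using toy 3-dimensional vectors in place of the real 384-dimensional outputs (the vector values here are illustrative, not produced by the model):

```python
import math

def cosine(u, v):
    # Cosine similarity between two dense vectors.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def rank_passages(query_emb, passage_embs):
    # Return passage indices sorted by descending cosine similarity.
    scores = [cosine(query_emb, p) for p in passage_embs]
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)

# Toy embeddings; the real model emits 384-dimensional normalized vectors.
query = [1.0, 0.0, 0.0]
passages = [[0.0, 1.0, 0.0], [0.9, 0.1, 0.0], [0.5, 0.5, 0.0]]
print(rank_passages(query, passages))  # most similar passage index first
```

Note that, following the E5 convention visible in the training samples, real queries and passages should be prefixed with `query: ` and `passage ` respectively before encoding.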
## Model Details ### Model Description - **Model Type:** Sentence Transformer - **Base model:** [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) <!-- at revision c007d7ef6fd86656326059b28395a7a03a7c5846 --> - **Maximum Sequence Length:** 512 tokens - **Output Dimensionality:** 384 dimensions - **Similarity Function:** Cosine Similarity <!-- - **Training Dataset:** Unknown --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Documentation:** [Sentence Transformers Documentation](https://sbert.net) - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) ### Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Usage ### Direct Usage (Sentence Transformers) First install the Sentence Transformers library: ```bash pip install -U sentence-transformers ``` Then you can load this model and run inference. 
```python from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub model = SentenceTransformer("diegolacomba/multilingual-e5-small-legal-mnrl-0") # Run inference sentences = [ 'query: ¿De qué forma la ubicación de la agencia influye en la aplicación del impuesto en los servicios turísticos?', 'passage De acuerdo con los antecedentes recogidos en esta contestación, dicho servicio estará sujeto al régimen especial de las agencias de viajes regulado en el Capítulo VI del Título IX de la Ley 37/1992 y tendrá la consideración de prestación de servicios única que estará sujeta al Impuesto sobre el Valor Añadido bajo la premisa de que la consultante tiene establecida la sede de su actividad económica o posea un establecimiento permanente desde donde efectúa la operación en el territorio de aplicación del Impuesto.\nEl tipo impositivo aplicable al servicio único de viajes será el general del 21 por ciento previsto en el artículo 90.Uno de la Ley del Impuesto.\nSobre la posible aplicación de la opción del artículo 147 de la Ley 37/1992 para la aplicación del régimen general del Impuesto, según se establece en contestación a consulta vinculante de 20 de septiembre de 2016, número V3942-16:\n“4.- Debe tenerse en cuenta que en el caso de las empresas radicadas en Estados Unidos a que se refiere el escrito de consulta, no se entiende cumplido el requisito de reciprocidad, tal como se pronunció este Centro Directivo en contestación a consulta vinculante número V0579-12 de 16 de marzo de 2012, por lo que, salvo que el servicio prestado por la agencia de viajes consultante esté relacionado con la asistencia a ferias, congresos y exposiciones de carácter comercial o profesional, en los términos del artículo 119 bis de la Ley 37/1992 parcialmente transcrito, no se entenderán cumplidos los requisitos para la opción por el régimen general del Impuesto sobre el Valor Añadido.”.\nb) El mismo caso anterior, pero el
viaje se pretende desarrollar en las Islas Canarias.\nSegún establece el artículo 144 de la Ley del Impuesto, dicha operación se encontrará sujeta al Impuesto y, en particular, al régimen especial de las agencias de viajes:\n“Dicha prestación se entenderá realizada en el lugar donde la agencia tenga establecida la sede de su actividad económica o posea un establecimiento permanente desde donde efectúe la operación.”.', 'passage Contestación completa: 1.- El artículo 9, primer párrafo de la Ley 8/1991, de 25 de marzo, por la que se crea el Impuesto sobre la Producción, los Servicios y la Importación en las Ciudades de Ceuta y Melilla (BOE del 26), dispone lo siguiente:\n“Las importaciones definitivas de bienes en las ciudades de Ceuta y Melilla estarán exentas en los mismos términos que en la legislación común del Impuesto sobre el Valor Añadido y, en todo caso, se asimilarán, a efectos de esta exención, las que resulten de aplicación a las operaciones interiores.”.\n2.- Por otra parte, el artículo 20, apartado uno, número 17º de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre), dispone que estarán exentas de dicho Impuesto:\n“17º.
Las entregas de sellos de Correos y efectos timbrados de curso legal en España por importe no superior a su valor facial.\nLa exención no se extiende a los servicios de expendición de los referidos bienes prestados en nombre y por cuenta de terceros.”.\nConforme al precepto anterior, la entrega de sellos de correos de curso legal por importe no superior a su valor facial, objeto de consulta, estará exenta del Impuesto sobre el Valor Añadido.\n3.- En consecuencia, estarán sujetas pero exentas del Impuesto sobre la Producción, los Servicios y la Importación en las Ciudades de Ceuta y Melilla las importaciones definitivas de sellos de correos de curso legal en las ciudades de Ceuta y Melilla cuando, de acuerdo con lo establecido en el apartado anterior de esta contestación, su entrega esté exenta del Impuesto sobre el Valor Añadido.\n4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 384] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset.
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Information Retrieval * Dataset: `InformationRetrievalEvaluator` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.1912 | | cosine_accuracy@3 | 0.3175 | | cosine_accuracy@5 | 0.386 | | cosine_accuracy@10 | 0.4835 | | cosine_precision@1 | 0.1912 | | cosine_precision@3 | 0.1058 | | cosine_precision@5 | 0.0772 | | cosine_precision@10 | 0.0483 | | cosine_recall@1 | 0.1912 | | cosine_recall@3 | 0.3175 | | cosine_recall@5 | 0.386 | | cosine_recall@10 | 0.4835 | | **cosine_ndcg@10** | **0.3241** | | cosine_mrr@10 | 0.2747 | | cosine_map@100 | 0.2876 | #### Information Retrieval * Dataset: `InformationRetrievalEvaluator` * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) | Metric | Value | |:--------------------|:-----------| | cosine_accuracy@1 | 0.3431 | | cosine_accuracy@3 | 0.5083 | | cosine_accuracy@5 | 0.5848 | | cosine_accuracy@10 | 0.6813 | | cosine_precision@1 | 0.3431 | | cosine_precision@3 | 0.1694 | | cosine_precision@5 | 0.117 | | cosine_precision@10 | 0.0681 | | cosine_recall@1 | 0.3431 | | cosine_recall@3 | 0.5083 | | cosine_recall@5 | 0.5848 | | cosine_recall@10 | 0.6813 | | **cosine_ndcg@10** | **0.5018** | | cosine_mrr@10 | 0.4456 | | cosine_map@100 | 0.4552 | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? 
You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### Unnamed Dataset * Size: 58,898 training samples * Columns: <code>anchor</code> and <code>positive</code> * Approximate statistics based on the first 1000 samples: | | anchor | positive | |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | <ul><li>min: 19 tokens</li><li>mean: 31.33 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 325.57 tokens</li><li>max: 508 tokens</li></ul> | * Samples: | anchor | positive | |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
-----------------------------------------------------------------------------------------------| | <code>query: ¿Las contribuciones que percibe una organización en virtud de un convenio laboral en el fútbol tienen impacto en la base de cálculo para el impuesto correspondiente?</code> | <code>passage Descripción de hechos: La consultante es una Asociación que se dedica a las actividades de ordenación del ejercicio de la profesión de futbolistas de sus miembros, la representación de los mismos así como la defensa de sus intereses profesionales tanto en el ámbito nacional como en el internacional.<br>En virtud de un convenio colectivo para la actividad de fútbol profesional suscrito entre la Liga Nacional de Fútbol Profesional (LNFP) y la consultante, aquella viene obligada a entregar a esta, por cada temporada de vigencia del convenio, una cantidad de dinero (en concepto de Fondo social) destinada a fines benéficos y al normal desarrollo de la actividad de la Asociación.<br>Asimismo, según Acta de Conciliación suscrita entre ambas partes, la LNFP se compromete a abonar a la consultante un porcentaje del importe neto total de los ingresos obtenidos de la explotación conjunta de los derechos de contenidos audiovisuales del fútbol.
Dicha cuantía debe destinarse a actividades encamina...</code> | | <code>query: ¿Qué tipos de transacciones intracomunitarias deben ser declaradas por las empresas según la regulación vigente?</code> | <code>passage Contestación completa: 1.- De acuerdo con el artículo 78 del Reglamento del impuesto aprobado por el Real Decreto 1624/1992, de 29 de diciembre (BOE del 31 de diciembre):<br>“Los empresarios y profesionales deberán presentar una declaración recapitulativa de las entregas y adquisiciones intracomunitarias de bienes y de las prestaciones y adquisiciones intracomunitarias de servicios que realicen en la forma que se indica en el presente capítulo.”.<br>El artículo 79 del Reglamento especifica qué tipo de operaciones deben ser declaradas en la declaración recapitulativa de operaciones intracomunitarias, en concreto establece que:<br>“1. Estarán obligados a presentar la declaración recapitulativa los empresarios y profesionales, incluso cuando tengan dicha condición con arreglo a lo dispuesto en el apartado cuatro del artículo 5 de la Ley del Impuesto, que realicen cualquiera de las siguientes operaciones.<br>1.º Las entregas de bienes destinados a otro Estado miembro que se encuentren exentas ...</code> | | <code>query: ¿Qué tipos de bebidas contienen alcohol apto para consumo humano?</code> | <code>passage Se entiende por bebida alcohólica todo líquido apto para el consumo humano por ingestión que contenga alcohol etílico.<br>A los efectos de este número no tendrán la consideración de alimento el tabaco ni las sustancias no aptas para el consumo humano o animal en el mismo estado en que fuesen objeto de entrega, adquisición intracomunitaria o importación.”.<br>4.- Con independencia de lo anterior, el artículo 20, apartado uno, número 9º, de la Ley 37/1992, establece que estarán exentas del Impuesto las siguientes operaciones:<br>“9.º La educación de la infancia y de la juventud, la guarda y custodia de niños,
incluida la atención a niños en los centros docentes en tiempo interlectivo durante el comedor escolar o en aulas en servicio de guardería fuera del horario escolar, la enseñanza escolar, universitaria y de postgraduados, la enseñanza de idiomas y la formación y reciclaje profesional, realizadas por Entidades de derecho público o entidades privadas autorizadas para el ejercicio de di...</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: steps - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `gradient_accumulation_steps`: 16 - `learning_rate`: 2e-05 - `num_train_epochs`: 5 - `lr_scheduler_type`: cosine - `warmup_ratio`: 0.1 - `fp16`: True - `tf32`: True - `load_best_model_at_end`: True - `optim`: adamw_torch_fused - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: steps - `prediction_loss_only`: True - `per_device_train_batch_size`: 32 - `per_device_eval_batch_size`: 16 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 16 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 2e-05 - `weight_decay`: 0.0 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 5 - `max_steps`: -1 - `lr_scheduler_type`: cosine - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.1 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False -
`restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: False - `fp16`: True - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: True - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - `dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch_fused - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: None - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - 
`full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - `prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs | Epoch | Step | Training Loss | InformationRetrievalEvaluator_cosine_ndcg@10 | |:----------:|:-------:|:-------------:|:--------------------------------------------:| | -1 | -1 | - | 0.3241 | | 0.8691 | 100 | 12.0508 | 0.4582 | | 1.7300 | 200 | 1.0655 | 0.4828 | | 2.5910 | 300 | 0.8843 | 0.4950 | | 3.4519 | 400 | 0.766 | 0.4997 | | **4.3129** | **500** | **0.7075** | **0.5018** | | 5.0 | 580 | - | 0.5018 | * The bold row denotes the saved checkpoint. 
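Conceptually, the `MultipleNegativesRankingLoss` used for this training treats, for each anchor in a batch, its paired positive as the correct "class" among all positives in the batch: cosine similarities are multiplied by the `scale` factor (20.0 here) and passed through a cross-entropy over the in-batch candidates. A minimal pure-Python sketch of that computation, using toy similarity values rather than real embeddings (this is an illustration of the loss, not the library's actual implementation):

```python
import math

def mnrl_loss(sim_matrix, scale=20.0):
    """In-batch multiple-negatives ranking loss.

    sim_matrix[i][j] is the cosine similarity between anchor i and
    positive j; the diagonal holds the true (anchor, positive) pairs.
    """
    n = len(sim_matrix)
    total = 0.0
    for i in range(n):
        logits = [scale * s for s in sim_matrix[i]]
        m = max(logits)
        # log-sum-exp for numerical stability
        log_z = m + math.log(sum(math.exp(l - m) for l in logits))
        total += log_z - logits[i]  # cross-entropy with target class i
    return total / n

# Toy batch of 3 pairs: diagonal similarities are highest, as desired.
sims = [
    [0.9, 0.2, 0.1],
    [0.3, 0.8, 0.2],
    [0.1, 0.2, 0.7],
]
print(mnrl_loss(sims))
```

The `no_duplicates` batch sampler listed above complements this loss: it keeps duplicate texts out of a batch so that no in-batch "negative" is accidentally a correct match.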
### Framework Versions - Python: 3.11.13 - Sentence Transformers: 4.1.0 - Transformers: 4.52.4 - PyTorch: 2.6.0+cu124 - Accelerate: 1.7.0 - Datasets: 2.14.4 - Tokenizers: 0.21.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
ekiprop/bert-wnli-epochs5-lr5em06-bs16-2025-06-17-0900
ekiprop
2025-06-17T09:01:09Z
0
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:google-bert/bert-base-uncased", "base_model:finetune:google-bert/bert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2025-06-17T09:00:05Z
--- library_name: transformers license: apache-2.0 base_model: bert-base-uncased tags: - generated_from_trainer metrics: - accuracy model-index: - name: bert-wnli-epochs5-lr5em06-bs16-2025-06-17-0900 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-wnli-epochs5-lr5em06-bs16-2025-06-17-0900 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.6995 - Accuracy: 0.3803 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-06 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 40 | 0.6947 | 0.4930 | | No log | 2.0 | 80 | 0.6966 | 0.4648 | | No log | 3.0 | 120 | 0.6969 | 0.4789 | | No log | 4.0 | 160 | 0.6989 | 0.3662 | | No log | 5.0 | 200 | 0.6995 | 0.3803 | ### Framework versions - Transformers 4.52.4 - Pytorch 2.7.0+cu128 - Datasets 3.6.0 - Tokenizers 0.21.1
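The 40 steps per epoch and 200 total steps in the log follow from the batch size and the size of the GLUE WNLI train split (635 examples — an assumption, since the card lists the dataset as unknown). A quick sketch, including the linear learning-rate decay used here:

```python
import math

train_examples = 635   # assumed GLUE WNLI train split size (not stated in the card)
batch_size = 16
num_epochs = 5
base_lr = 5e-6

steps_per_epoch = math.ceil(train_examples / batch_size)
total_steps = steps_per_epoch * num_epochs
print(steps_per_epoch, total_steps)  # 40 200, matching the training log

# linear scheduler with no warmup: lr decays from base_lr to 0 over total_steps
def lr_at(step):
    return base_lr * max(0.0, 1 - step / total_steps)

print(lr_at(100))  # 2.5e-06 at the halfway point
```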
navaneeth005/fitness_model_v0-F32-GGUF
navaneeth005
2025-06-17T08:59:37Z
0
0
transformers
[ "transformers", "gguf", "text-generation-inference", "unsloth", "llama", "trl", "llama-cpp", "gguf-my-lora", "en", "base_model:navaneeth005/fitness_model_v0", "base_model:quantized:navaneeth005/fitness_model_v0", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2025-06-17T08:59:32Z
--- base_model: navaneeth005/fitness_model_v0 tags: - text-generation-inference - transformers - unsloth - llama - trl - llama-cpp - gguf-my-lora license: apache-2.0 language: - en --- # navaneeth005/fitness_model_v0-F32-GGUF This LoRA adapter was converted to GGUF format from [`navaneeth005/fitness_model_v0`](https://huggingface.co/navaneeth005/fitness_model_v0) via ggml.ai's [GGUF-my-lora](https://huggingface.co/spaces/ggml-org/gguf-my-lora) space. Refer to the [original adapter repository](https://huggingface.co/navaneeth005/fitness_model_v0) for more details. ## Use with llama.cpp ```bash # with cli llama-cli -m base_model.gguf --lora fitness_model_v0-f32.gguf (...other args) # with server llama-server -m base_model.gguf --lora fitness_model_v0-f32.gguf (...other args) ``` To learn more about using LoRA with the llama.cpp server, refer to the [llama.cpp server documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md).
ekiprop/bert-wnli-epochs5-lr2em06-bs16-2025-06-17-0856
ekiprop
2025-06-17T08:57:39Z
0
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:google-bert/bert-base-uncased", "base_model:finetune:google-bert/bert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2025-06-17T08:56:37Z
--- library_name: transformers license: apache-2.0 base_model: bert-base-uncased tags: - generated_from_trainer metrics: - accuracy model-index: - name: bert-wnli-epochs5-lr2em06-bs16-2025-06-17-0856 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-wnli-epochs5-lr2em06-bs16-2025-06-17-0856 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.6926 - Accuracy: 0.5634 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-06 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 40 | 0.6896 | 0.5634 | | No log | 2.0 | 80 | 0.6912 | 0.5634 | | No log | 3.0 | 120 | 0.6917 | 0.5634 | | No log | 4.0 | 160 | 0.6924 | 0.5634 | | No log | 5.0 | 200 | 0.6926 | 0.5634 | ### Framework versions - Transformers 4.52.4 - Pytorch 2.7.0+cu128 - Datasets 3.6.0 - Tokenizers 0.21.1
ekiprop/bert-wnli-epochs5-lr1em05-bs16-2025-06-17-0852
ekiprop
2025-06-17T08:53:50Z
0
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:google-bert/bert-base-uncased", "base_model:finetune:google-bert/bert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2025-06-17T08:52:48Z
--- library_name: transformers license: apache-2.0 base_model: bert-base-uncased tags: - generated_from_trainer metrics: - accuracy model-index: - name: bert-wnli-epochs5-lr1em05-bs16-2025-06-17-0852 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-wnli-epochs5-lr1em05-bs16-2025-06-17-0852 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.7038 - Accuracy: 0.2394 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 40 | 0.6981 | 0.4225 | | No log | 2.0 | 80 | 0.6983 | 0.4366 | | No log | 3.0 | 120 | 0.6996 | 0.4366 | | No log | 4.0 | 160 | 0.7028 | 0.2535 | | No log | 5.0 | 200 | 0.7038 | 0.2394 | ### Framework versions - Transformers 4.52.4 - Pytorch 2.7.0+cu128 - Datasets 3.6.0 - Tokenizers 0.21.1
Whisper-Pascal/whisper-openvino
Whisper-Pascal
2025-06-17T08:48:42Z
0
0
null
[ "license:mit", "region:us" ]
null
2025-06-17T08:06:34Z
--- license: mit --- This set contains ZIP archives of the OpenVINO models for whisper.cpp. Each ZIP contains the ggml-[model]-encoder-openvino.bin and ggml-[model]-encoder-openvino.xml files required for the matching main ggml-[model].bin. You can find the main models [here](https://huggingface.co/ggerganov/whisper.cpp/tree/main). To use the OpenVINO version of whisper.cpp, you need the main model and the extracted contents of the matching ZIP in the same directory (you can safely delete the ZIP file once extracted, if you wish).
ICTNLP/stream-omni-8b
ICTNLP
2025-06-17T08:37:04Z
3
3
null
[ "safetensors", "stream_omni_llama", "omni", "any-to-any", "arxiv:2506.13642", "license:gpl-3.0", "region:us" ]
any-to-any
2025-06-16T09:24:53Z
--- license: gpl-3.0 pipeline_tag: any-to-any tags: - omni --- # Stream-Omni: Simultaneous Multimodal Interactions with Large Language-Vision-Speech Model [![arXiv](https://img.shields.io/badge/arXiv-2506.13642-b31b1b.svg?logo=arXiv)](https://arxiv.org/abs/2506.13642) [![arXiv](https://img.shields.io/badge/GitHub-Stream--Omni-black.svg?logo=github)](https://github.com/ictnlp/Stream-Omni) [![model](https://img.shields.io/badge/%F0%9F%A4%97%20Huggingface%20-stream--omni--8b-orange.svg)](https://huggingface.co/ICTNLP/stream-omni-8b) [![data](https://img.shields.io/badge/%F0%9F%93%91%20Datasets%20-InstructOmni-green.svg)](https://huggingface.co/datasets/ICTNLP/InstructOmni) [![Badge](https://hitscounter.dev/api/hit?url=https%3A%2F%2Fgithub.com%2Fictnlp%2FStream-Omni&label=Visitors&icon=graph-up&color=%23dc3545)](https://github.com/ictnlp/Stream-Omni) > [**Shaolei Zhang**](https://zhangshaolei1998.github.io/), [**Shoutao Guo**](https://scholar.google.com.hk/citations?user=XwHtPyAAAAAJ), [**Qingkai Fang**](https://fangqingkai.github.io/), [**Yan Zhou**](https://zhouyan19.github.io/zhouyan/), [**Yang Feng**](https://people.ucas.edu.cn/~yangfeng?language=en)\* For the introduction and usage of Stream-Omni, refer to [https://github.com/ictnlp/Stream-Omni](https://github.com/ictnlp/Stream-Omni). Stream-Omni is an end-to-end language-vision-speech chatbot that simultaneously supports interaction across various modality combinations, with the following features 💡: - **Omni Interaction**: Supports any multimodal input, including text, vision, and speech, and generates both text and speech responses. - **Seamless "see-while-hear" Experience**: Simultaneously outputs *intermediate textual results* (e.g., ASR transcriptions and model responses) during speech interactions, like the advanced voice service of GPT-4o. - **Efficient Training**: Requires only a small amount of omni-modal data for training. 
<p align="center" width="100%"> <img src="./stream-omni.png" alt="stream-omni" style="width: 90%; min-width: 300px; display: block; margin: auto;"> </p> ## 🖥 Demo | Microphone Input | File Input | | ------------------------------------------------------------ | ------------------------------------------------------------ | | <video src='https://github.com/user-attachments/assets/99325a91-b81c-4fd1-a68e-c0628d5784fe' width="100%"/> | <video src='https://github.com/user-attachments/assets/bcb37b15-6c8c-4eae-845e-5da10aaac11e' width="100%"/> | > [!NOTE] > > **Stream-Omni can produce intermediate textual results (ASR transcription and text response) during speech interaction, offering users a seamless "see-while-hear" experience.**
MaestrAI/jack_marlowe-lora-1750146373
MaestrAI
2025-06-17T08:27:35Z
0
0
null
[ "region:us" ]
null
2025-06-17T07:46:12Z
# jack_marlowe LoRA Model This is a LoRA model for the character Jack Marlowe, created at 2025-06-17 09:46:13
opium80s/medgemma-27b-merged-10-golden-rules-expert
opium80s
2025-06-17T08:26:29Z
0
0
transformers
[ "transformers", "safetensors", "gemma3_text", "text-generation", "text-generation-inference", "unsloth", "conversational", "en", "base_model:unsloth/medgemma-27b-text-it-unsloth-bnb-4bit", "base_model:finetune:unsloth/medgemma-27b-text-it-unsloth-bnb-4bit", "license:apache-2.0", "autotrain_com...
text-generation
2025-06-17T08:06:48Z
--- base_model: unsloth/medgemma-27b-text-it-unsloth-bnb-4bit tags: - text-generation-inference - transformers - unsloth - gemma3_text license: apache-2.0 language: - en --- # Uploaded finetuned model - **Developed by:** opium80s - **License:** apache-2.0 - **Finetuned from model:** unsloth/medgemma-27b-text-it-unsloth-bnb-4bit This gemma3_text model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
adugeen/authorship-e5-small
adugeen
2025-06-17T08:12:00Z
0
0
sentence-transformers
[ "sentence-transformers", "safetensors", "bert", "sentence-similarity", "feature-extraction", "generated_from_trainer", "dataset_size:276686", "loss:MultipleNegativesRankingLoss", "arxiv:1908.10084", "arxiv:1705.00652", "base_model:intfloat/multilingual-e5-small", "base_model:finetune:intfloat/...
sentence-similarity
2025-06-17T08:09:57Z
--- tags: - sentence-transformers - sentence-similarity - feature-extraction - generated_from_trainer - dataset_size:276686 - loss:MultipleNegativesRankingLoss base_model: intfloat/multilingual-e5-small widget: - source_sentence: 'ะŸะตั‡ะตะฝะตะณะธ ะพั‚ัั‚ัƒะฟะฐะปะธ. ะžะฝะธ ะผะพะณะปะธ ะทะฐะฟั€ะพัั‚ะพ ัƒะฑะธั‚ัŒ ะพัั‚ะฐะฒัˆะตะณะพัั ะฟะพะทะฐะดะธ ะบะฝัะทั ะ’ะปะฐะดะธะผะธั€ะฐ, ะผะฐะปัŒั‡ะธัˆะบัƒ ะธ ัั‚ะฐั€ะธะบะฐ, ะฝะพ ะฟะพะปัƒั‡ะธะปะธ ะฟั€ะธะบะฐะท - ัƒั…ะพะดะธั‚ัŒ. ะšัƒั€ั - ะฟะตั‡ะตะฝะตะถัะบะธะน ั…ะฐะฝ, ะฟั€ะพะธะณั€ะฐะฒัˆะธะน ะฑะพะน ั ะบะฝัะทะตะผ, ะฑั‹ะป ัะผะตั€ั‚ะตะปัŒะฝะพ ั€ะฐะฝะตะฝ. ะ’ะพะธะฝั‹ ะทะฐะฑั€ะฐะปะธ ะตะณะพ ะธ ะฒะตั€ะฝัƒะปะธััŒ ะฝะฐ ะฟั€ะตะถะฝะตะต ะผะตัั‚ะพ ัั‚ะพัะฝะบะธ, ะณะดะต ะฑั‹ะปะธ ะพัั‚ะฐะฒะปะตะฝั‹ ะพะฑะพะทั‹ ั ะฟั€ะพะฒะธะฐะฝั‚ะพะผ ะธ ั€ะฐะทะฑะธั‚ ะฒั€ะตะผะตะฝะฝั‹ะน ะปะฐะณะตั€ัŒ. ะ“ะธะนัั€ ะฝะต ะพั‚ั…ะพะดะธะป ะพั‚ ะพั‚ั†ะฐ ะฝะธ ะฝะฐ ัˆะฐะณ, ะฝะฐะดะตัััŒ ะฝะต ั‚ะพ ะฝะฐ ั‡ัƒะดะพ, ะฝะต ั‚ะพ ะฝะฐ ะปะตะบะฐั€ะตะน. ะ’ะตะดัŒ ะบั€ะพะผะต ะฝะตะณะพ ัƒ ะผะฐะปัŒั‡ะธัˆะบะธ ะฝะธะบะพะณะพ ะฝะต ะฑั‹ะปะพ. ะœะฐั‚ัŒ ะพะฝ ะฟะพั‚ะตั€ัะป ะตั‰ั‘ ัะพะฒัะตะผ ะบั€ะพั…ะพะน. ะšัƒั€ั ะฑั‹ะป ะฒะปะฐัั‚ะฝั‹ะผ ะธ ะถะตัั‚ะพะบะธะผ ะฟั€ะฐะฒะธั‚ะตะปะตะผ. ะš ัั‹ะฝัƒ ะพั‚ะฝะพัะธะปัั ั…ะพั€ะพัˆะพ, ะฝะพ ะฝะธะบะพะณะดะฐ ะฝะต ะฑะฐะปะพะฒะฐะป. ะ”ะปั ะ“ะธะนัั€ะฐ ะพั‚ะตั† ะฒัะตะณะดะฐ ะฑั‹ะป ะธะดะตะฐะปะพะผ, ะผะฝะพะณะธะต ะทะฐะฒะธะดะพะฒะฐะปะธ ะตะณะพ ั€ะตัˆะธะผะพัั‚ะธ ะธ ั…ะธั‚ั€ะพัƒะผะธัŽ. "ะ’ะปะฐัั‚ัŒ ะดะตั€ะถะธั‚ัั ะฝะฐ ัั‚ั€ะฐั…ะต" - ั‡ะฐัั‚ะพ ะณะพะฒะพั€ะธะป ะพะฝ. ะะพ ั‚ะตะฟะตั€ัŒ ะพะฝ ะฝะฐั…ะพะดะธะปัั ะผะตะถะดัƒ ะถะธะทะฝัŒัŽ ะธ ัะผะตั€ั‚ัŒัŽ. ะ˜ ั‡ั‘ั€ะฝะฐั ั‡ะฐัˆะฐ ะฒะตัะพะฒ ะฑั‹ะปะฐ ะฝะฐะผะฝะพะณะพ ั‚ัะถะตะปะตะต... ะ’ะตะปะธะบะธะน ั…ะฐะฝ ัƒะผะตั€ ะฝะพั‡ัŒัŽ, ะฝะต ะฟั€ะธั…ะพะดั ะฒ ัะพะทะฝะฐะฝะธะต. ะŸะฐั€ะตะฝัŒ ะฟะพะฝะธะผะฐะป, ั‡ั‚ะพ ะฒั‹ะฝัƒะถะดะตะฝ ัั‚ะฐั‚ัŒ ัะปะตะดัƒัŽั‰ะธะผ ั…ะฐะฝะพะผ, ะฒะตะดัŒ ะฟะพ ะปะธะฝะธะธ ะพั‚ั†ะฐ ัƒ ะฝะตะณะพ ะฑะพะปัŒัˆะต ั€ะพะดัั‚ะฒะตะฝะฝะธะบะพะฒ ะฝะต ะฑั‹ะปะพ. ะะพ ัั‚ะพ ะทะฝะฐั‡ะธะปะพ ะพะดะฝะพ. 
ะกะฝะพะฒะฐ ัะพะฒะตั€ัˆะฐั‚ัŒ ะฝะฐะฑะตะณะธ ะฝะฐ ะšะธะตะฒ ะธ ะะพะฒะณะพั€ะพะด, ะฟั€ะพะดะพะปะถะฐั‚ัŒ ะดะตะปะพ ะšัƒั€ะธ, ะฐ ะทะฝะฐั‡ะธั‚ - ะพะฑะตั€ะฝัƒั‚ัŒัั ะฟั€ะพั‚ะธะฒ ัะฒะพะธั… ะดั€ัƒะทะตะน, ะฟะพัั‚ัƒะฟะธั‚ัŒ ะฟะพะดะปะพ ะฟะพ ะพั‚ะฝะพัˆะตะฝะธัŽ ะบ ะบะฝัะทัŽ, ะะปะตะบัˆะต ะธ ะžะปะตะฝัŒะบะต. ะ ะตัั‚ัŒ ะปะธ ัƒ ะฝะตะณะพ ะฒั‹ะฑะพั€? ะ”ะฐ. ะ˜ ัะตะนั‡ะฐั ัั‚ะพ ะตะณะพ ะฒั‹ะฑะพั€. ะžั‚ะตั† ะฝะต ะฟะพะดัะบะฐะถะตั‚ ะตะผัƒ, ะฝะต ะฟะพะผะพะถะตั‚, ะฝะต ัƒะบะฐะถะตั‚ ะฒะตั€ะฝั‹ะน ะฟัƒั‚ัŒ. ะ”ะฐ ะธ ะฝะต ะฟะพัะปัƒัˆะฐะป ะฑั‹ ะ“ะธะนัั€ ะตะณะพ. ะžะฝ ะฝะธะบะพะณะดะฐ ะฝะต ัะผะพะถะตั‚ ัั‚ะฐั‚ัŒ ั‚ะฐะบะธะผ ะถะตัั‚ะพะบะธะผ, ะบะฐะบ ะพั‚ะตั†. ะะพ ั‚ะตะฟะตั€ัŒ ะผะฐะปัŒั‡ะธะบ ะฟะพะฝะธะผะฐะป, ะพั‚ะบัƒะดะฐ ั‚ะพั‚ ั‡ะตั€ะฟะฐะป ัะฒะพัŽ ะฝะตะฝะฐะฒะธัั‚ัŒ, ะทะปะพัั‚ัŒ, ะพะฑะธะดัƒ ะธ ะถะฐะถะดัƒ ะบั€ะพะฒะธ. ะฃ ะฝะตะณะพ ะฝะธะบะพะณะดะฐ ะฝะต ะฑั‹ะปะพ ะฝะฐัั‚ะพัั‰ะตะณะพ ะดั€ัƒะณะฐ, ะบะพั‚ะพั€ั‹ะน ัะผะพะณ ะฑั‹ ะฟะพะดัั‚ะฐะฒะธั‚ัŒ ะฟะปะตั‡ะพ, ะดะฐั‚ัŒ ะดั€ัƒะถะตัะบะธะน ัะพะฒะตั‚. ะะตัะผะพั‚ั€ั ะฝะฐ ะพั€ะดั‹ ะฟะพะดั‡ะธะฝั‘ะฝะฝั‹ั… ะตะผัƒ ะฟะตั‡ะตะฝะตะณะพะฒ ะธ ั…ะฐะทะฐั€, ะพั‚ะตั† ะฑั‹ะป ะพะดะธะฝะพะบ. ะะต ัั‚ะฐะฒะธะป ะฝะธะบะพะณะพ ั€ัะดะพะผ ั ัะพะฑะพะน, ัƒะฝะธะถะฐะป, ะถะตัั‚ะพะบะพ ะฝะฐะบะฐะทั‹ะฒะฐะป ะฒะพะธะฝะพะฒ, ะทะฐัะปัƒะถะธะฒ ั‚ะตะผ ัะฐะผั‹ะผ "ัั‚ั€ะฐั… ะธ ัƒะฒะฐะถะตะฝะธะต". ะขะฐะบ ัั‡ะธั‚ะฐะป ะพะฝ ัะฐะผ. ะ˜ ั‚ะฐะบะพะน ะฟะพะทะธั†ะธะธ ะฟั€ะธะดะตั€ะถะธะฒะฐะปะธััŒ ะฟะตั‡ะตะฝะตะณะธ... ะ ะ“ะธะนัั€? ะะปะตะบัˆะฐ ัะฟะฐั ะตะผัƒ ะถะธะทะฝัŒ, ะฝะต ะทะฝะฐั, ั‡ั‚ะพ ั‚ะพั‚ ะฒั€ะฐะณ. ะ ะบะพะณะดะฐ ัƒะทะฝะฐะป, ั€ะฐะทะฒะต ะทะฐั…ะพั‚ะตะป ัƒะฑะธั‚ัŒ ะตะณะพ? ะ’ะตะดัŒ ะฟะตั‡ะตะฝะตะณะธ ัะพะถะณะปะธ ะตะณะพ ะดะตั€ะตะฒะฝัŽ ะธ ัƒะฑะธะปะธ ะดะตะดัƒัˆะบัƒ. ะะพ ะ“ะธะนัั€ ะฝะต ะผะพะณ ะฟะพะฝัั‚ัŒ, ะบะฐะบ ะฒะพะพะฑั‰ะต ั‚ะฐะบะพะต ะฒะพะทะผะพะถะฝะพ. 
ะขะพ ะปะธ ั…ั€ะธัั‚ะธะฐะฝะธะฝัƒ ั‚ะฐะบ ะธ ะฟะพะปะฐะณะฐะตั‚ัั, ะผะพะปั‡ะฐ ะฟะตั€ะตะฝะพัะธั‚ัŒ ะฒัะต ะฝะตะฒะทะณะพะดั‹, ั‡ั‚ะพ ะฟะพัั‹ะปะฐะตั‚ ะตะผัƒ ััƒะดัŒะฑะฐ, ั‚ะพ ะปะธ ั€ัƒััะบะธะน ะดัƒั… ะฝะฐัั‚ะพะปัŒะบะพ ัะธะปั‘ะฝ, ั‡ั‚ะพ ะฝะต ะผะพะถะตั‚ ะพะฟัƒัะบะฐั‚ัŒัั ะดะพ ะผะตัั‚ะธ. ะะต ะฟะพะฝะธะผะฐะป..., ะฐ ะถะฐะถะดะฐ ะผะตัั‚ะธ ั€ะฐะทะณะพั€ะฐะปะฐััŒ ะฒ ะตะณะพ ะดัƒัˆะต. ะžะปะตะฝัŒะบะฐ... ะฅั€ะฐะฑั€ะฐั ะดะตะฒั‡ัƒัˆะบะฐ ั ั‡ั‘ะปะบะพะน ะธ ะดะปะธะฝะฝะพะน ัะฒะตั‚ะปะพะน ะบะพัะพะน... ะะตั‚! ะž ะฝะตะน ะดะฐะถะต ะฝะต ะดัƒะผะฐะน! ะžะฝ ะดะพะปะถะตะฝ ะฝะฐะบะฐะทะฐั‚ัŒ ัƒะฑะธะนั†ัƒ ะพั‚ั†ะฐ, ะดะฐะถะต ะตัะปะธ ั‚ะพั‚ ัะฐะผ, ะพะฑะผะฐะฝัƒั‚ั‹ะน ะšั€ะธะฒะถะตะน, ะฒะฒัะทะฐะปัั ะฒ ะฑะพะน. ะ˜ ั‚ัƒั‚ ะถะต ัะตั€ะดั†ะต ะฒะพะทั€ะฐะถะฐะปะพ. "ะขะฐะบ ะฝะตะปัŒะทั". ะขั‹ ะฒะตะดัŒ ัƒะถะต ะฟั€ะธะฝัะป ั€ะตัˆะตะฝะธะต. ะขะฐะผ, ะฝะฐ ะฟะพะปะต ะฑะพั, ะบะพะณะดะฐ ะพั‚ะตั† ะฟั€ะพะธะณั€ะฐะป. ะขั‹ ะพัั‚ะฐะฝะพะฒะธะป ั€ะฐะทัŠัั€ะตะฝะฝั‹ั… ะฟะตั‡ะตะฝะตะณะพะฒ, ะณะพั‚ะพะฒั‹ั… ั€ะฐะทะพั€ะฒะฐั‚ัŒ ะฝะตะฝะฐะฒะธัั‚ะฝะพะณะพ ะบะฝัะทั ะ’ะปะฐะดะธะผะธั€ะฐ. ะ ั‡ัƒะดะฝะพะน ะดะตะด ะธ ะผะฐะปัŒั‡ะธัˆะบะฐ? ะŸะพะผะตัˆะฐะปะธ ะฑั‹ ะพะฝะธ ัั‚ะพะผัƒ? ะะตั‚. ะ ะฐะทะฒะต ะฟะฐั€ะตะฝัŒ ะผะพะณ ัะบะฐะทะฐั‚ัŒ ั‚ะฐะบะธะต ัะปะพะฒะฐ? ะŸะตั€ะตะด ะณะปะฐะทะฐะผะธ ัะฝะพะฒะฐ ะฟะพัะฒะปัะตั‚ัั ั‚ะฐ ะบะฐั€ั‚ะธะฝะฐ. ะœะพะปะพะดะพะน ะบะฝัะทัŒ ัƒะดะฐั€ะธะป ะšัƒั€ัŽ ะผะตั‡ะพะผ ะธ ะฒั‹ะฑะธะป ะธะท ัะตะดะปะฐ. ะœะฐะปัŒั‡ะธะบ ะฑั€ะพัะฐะตั‚ัั ะบ ะพั‚ั†ัƒ. ะžั€ะดะฐ ะฟะตั‡ะตะฝะตะณะพะฒ ัƒะถะต ะฝะต ัะดะตั€ะถะธะฒะฐะตั‚ ะบะพะฝะตะน, ั€ะฒัƒั‰ะธั…ัั ะบ ะฑะพัŽ. ะ•ั‰ั‘ ัะตะบัƒะฝะดะฐ, ะธ ะพะฝะธ ัะพั€ะฒัƒั‚ัั ั ะผะตัั‚ะฐ. ะŸะพะปัะฝะฐ ะพะฑะฐะณั€ะธั‚ัั ะบั€ะพะฒัŒัŽ. "ะะตั‚! -ะพัั‚ะฐะฝะฐะฒะปะธะฒะฐะตั‚ ะธั… ะ“ะธะนัั€. - ะญั‚ะพ ะฑั‹ะป ั‡ะตัั‚ะฝั‹ะน ะฟะพะตะดะธะฝะพะบ. ะฃั…ะพะดะธะผ!" ะ’ะพะธะฝั‹ ะฝะตั…ะพั‚ั, ะฝะพ ะฟะพัะปัƒัˆะฝะพ ั€ะฐะทะฒะพั€ะฐั‡ะธะฒะฐัŽั‚ ะบะพะฝะตะน. ะŸะพั‡ะตะผัƒ ะพะฝะธ ะฟะพัะปัƒัˆะฐะปะธ ะตะณะพ? ะžะฝ ะฒะตะดัŒ ะฝะต ั…ะฐะฝ. ะžั‚ะฒะตั‚ ะพะดะธะฝ. ะžะฝะธ ะดะพะฒะตั€ััŽั‚ ะตะผัƒ. ะšะฐะบ ะธ ะตะณะพ ะพั‚ั†ัƒ. 
ะขะฐะบ ั‡ั‚ะพ ะถะต ะดะตะปะฐั‚ัŒ? ะ“ะปัƒะฟั‹ะน ะผะฐะปัŒั‡ะธัˆะบะฐ. ะขั‹ ะฒัั‘ ั€ะตัˆะธะป. ะขั‹ ะฒะพะทะณะปะฐะฒะธัˆัŒ ะฟะตั‡ะตะฝะตะถัะบะธะน ัะพัŽะท ะฟะปะตะผั‘ะฝ, ะฝะพ ะฝะต ะฟะพัะผะตะตัˆัŒ ะฝะฐะฟะฐัั‚ัŒ ะฝะฐ ะšะธะตะฒัะบัƒัŽ ะ ัƒััŒ. ะ“ะธะนัั€ ะฒั‹ัˆะตะป ะธะท ัˆะฐั‚ั€ะฐ. ะšั€ัƒะณะพะผ, ะบัƒะดะฐ ะฝะต ะฟะพัะผะพั‚ั€ะธ ะฒั‹ัั‚ั€ะพะธะปะพััŒ ะฒะพะนัะบะพ. ะ’ ะพะถะธะดะฐะฝะธะธ ะผะพะปะพะดะพะณะพ ั…ะฐะฝะฐ ะพะฝะธ ะฟะตั€ะตัˆั‘ะฟั‚ั‹ะฒะฐะปะธััŒ, ัะฟะพั€ะธะปะธ, ั€ัƒะณะฐะปะธััŒ. ะะพ ั‚ัƒั‚ ะฒัะต ั€ะฐะทะพะผ ะทะฐะผะพะปั‡ะฐะปะธ. ะžะฝะธ ะฑั‹ะปะธ ะณะพั‚ะพะฒั‹ ัƒัะปั‹ัˆะฐั‚ัŒ ั€ะตัˆะตะฝะธะต ะ“ะธะนัั€ะฐ. - ะกะปัƒัˆะฐะนั‚ะต! ะŸะตั‡ะตะฝะตะณะธ- ัะฒะพะฑะพะดะฝั‹ะต ะฒะพะธะฝั‹! -ะณะพะปะพั ะผะฐะปัŒั‡ะธะบะฐ ะฝะต ะดั€ะพะณะฝัƒะป.- ะ’ั‹ ะผะพะถะตั‚ะต ัะปะตะดะพะฒะฐั‚ัŒ ะทะฐ ะผะฝะพะน, ะฐ ะผะพะถะตั‚ะต ะฟั€ะธะผะบะฝัƒั‚ัŒ ะบ ะบะฐะฝะณะฐั€ะฐะผ! ะฏ ะถะต ัั‡ะธั‚ะฐัŽ, ั‡ั‚ะพ ะผั‹ ะดะพะปะถะฝั‹ ั€ะฐะทะฑะธั‚ัŒ ะฟะพะปะพะฒั†ะตะฒ, ะบะพั‚ะพั€ั‹ะต ะดะฐะฒะฝะพ ะทะฐั€ัั‚ัั ะฝะฐ ะฝะฐัˆะธ ะทะตะผะปะธ! ะšั‚ะพ ัะพ ะผะฝะพะน?! ะžั‚ะฒะตั‚ะพะผ ะฑั‹ะปะพ ะพะดะพะฑั€ะธั‚ะตะปัŒะฝะพะต ะธ ะดั€ัƒะถะฝะพะต "ะฃั€ะฐ!" ะŸะตั‡ะตะฝะตะณะธ ัƒัˆะปะธ ั ั€ัƒััะบะพะน ะทะตะผะปะธ, ะฝะพ ะฝะต ะฝะฐะฒัะตะณะดะฐ. ะงะตั€ะตะท ะบะฐะบะพะต-ั‚ะพ ะฒั€ะตะผั ะพะฝะธ ะฒะตั€ะฝัƒะปะธััŒ ั ะฝะพะฒั‹ะผ ั…ะฐะฝะพะผ, ั‡ั‚ะพะฑั‹ ะฒะฒัะทะฐั‚ัŒัั ะฒ ะผะตะถะดะพัƒัะพะฑะฝัƒัŽ ะฒะพะนะฝัƒ ะผะตะถะดัƒ ะฏั€ะพัะปะฐะฒะพะผ ะœัƒะดั€ั‹ะผ ะธ ะกะฒัั‚ะพะฟะพะปะบะพะผ ะžะบะฐัะฝะฝั‹ะผ, ะฝะฐ ัั‚ะพั€ะพะฝะต ะฟะพัะปะตะดะฝะตะณะพ.' sentences: - 'http://vk.com/audio?performer=1&q=Hollywood%20Undead%20Coming%20Back%20Down ะŸัƒัั‚ั‹ะต ัƒะปะธั†ั‹ ะฑะพะปัŒัˆะพะณะพ ะณะพั€ะพะดะฐ, ั‡ั‚ะพ ะพัะฒะตั‰ะฐะปะธััŒ ัƒั‚ั€ะตะฝะฝะธะผ ัะพะปะฝั†ะตะผ. ะะฐ ะฟะตั€ะตัƒะปะบะฐั… ะฒัั‚ั€ะตั‡ะฐะปะธััŒ ั€ะตะดะบะธะต ะฟั€ะพั…ะพะถะธะต, ะบะพั‚ะพั€ั‹ะต ั ัƒะดะธะฒะปะตะฝะธะตะผ ัะผะพั‚ั€ะตะปะธ ะฝะฐ ะฟะพะปัƒั€ะฐะทะดะตั‚ะพะณะพ ะฟะฐั€ะฝั, ัˆะฐะณะฐัŽั‰ะตะณะพ ะฒะดะพะปัŒ ัƒะปะธั† ะœะฐะณะฝะพะปะธะธ. "ะกัƒะผะฐััˆะตะดัˆะธะน,"- ะฟะพะดัƒะผะฐัŽั‚ ะผะฝะพะณะธะต, ะฝะพ ั€ะฐะทะฒะต ัั‚ะพ ะฝะตะฟั€ะฐะฒะดะฐ? 
ะะตั‚, ัั‚ะพ, ะฟั€ะฐะฒะดะฐ, ัั‚ะพ ั‡ะธัั‚ะตะนัˆะฐั, ะฟั€ะฐะฒะดะฐ, ะบะพั‚ะพั€ะฐั ั ะบะฐะถะดั‹ะผ ะดะฝั‘ะผ ะณะปะพะถะตั‚ ะตะณะพ ะดัƒัˆัƒ. ะžะฝ ะฑั‹ะป ััƒะผะฐััˆะตะดัˆะธะผ ะดะพ ั‚ะพะณะพ ะบะฐะบ ะพะฝะฐ ะฟั€ะธัˆะปะฐ ะฒ ะตะณะพ ะถะธะทะฝัŒ, ะธ ะพัั‚ะฐะปัั ั‚ะฐะบะธะผ ะฟะพัะปะต ั‚ะพะณะพ ะบะฐะบ ะพะฝะฐ ัƒัˆะปะฐ... ะšะฐะถะดั‹ะน ะดะตะฝัŒ, ะฟั€ะพะณัƒะปะธะฒะฐัััŒ ะฟะพ ัƒะปะธั†ะฐะผ ัƒั‚ั€ะตะฝะฝะตะณะพ ะณะพั€ะพะดะฐ, ะพะฝ ั ัƒะปั‹ะฑะบะพะน ะฝะฐ ะณัƒะฑะฐั…, ะธ ั ะณั€ัƒัั‚ัŒัŽ ะฒ ะณะปะฐะทะฐั… ะฒัะฟะพะผะธะฝะฐะป ะฒัะต ะผะพะผะตะฝั‚ั‹, ะบะพะณะดะฐ ะพะฝะฐ ั…ะพะดะธะปะฐ ะฒะผะตัั‚ะต ั ะฝะธะผ. ะšะฐะถะดั‹ะน ั€ะฐะท, ะตะผัƒ ั‡ัƒะดะธะปะพััŒ, ั‡ั‚ะพ ะพะฝะฐ ั ะฝะธะผ, ั‡ั‚ะพ ะพะฝะฐ ะฝะต ะพัั‚ะฐะฒะธะปะฐ ะตะณะพ, ะฝะพ ัั‚ะพ ะฒะตะดัŒ ะฝะตะฟั€ะฐะฒะดะฐ, ะธะปะธ ะฟั€ะฐะฒะดะฐ? ะงั‚ะพ ะทะฐ ั‡ัƒัˆัŒ? ะšะพะฝะตั‡ะฝะพ, ะถะต, ะพะฝะฐ ัƒะถะต ะฝะต ั ะฝะธะผ, ะพะฝะฐ ั‚ะฐะผ ะดะฐะปะตะบะพ ะฝะฐะฒะตั€ั…ัƒ ะพั‚ ะฝะธั…... ะžะฝ ะฒัะตะณะดะฐ ัะปัƒัˆะฐะป ะตั‘ ะทะฒะพะฝะบะธะน, ะฝะพ ะพะดะฝะพะฒั€ะตะผะตะฝะฝะพ ั€ะพะฑะบะธะน ะณะพะปะพั, ะพั‚ ะบะพั‚ะพั€ะพะณะพ ั‚ะฐะบ ะธ ั…ะพั‚ะตะปะพััŒ ะฟั€ะธะถะฐั‚ัŒ ะบ ัะตะฑะต ะธ ะฝะต ะพั‚ะฟัƒัะบะฐั‚ัŒ ะฝะธะบะพะณะดะฐ, ะฝะพ ะพะฝ ะพั‚ะฟัƒัั‚ะธะป, ั‚ัƒะดะฐ, ะพั‚, ะบัƒะดะฐ ะฝะต ะฒะพะทะฒั€ะฐั‰ะฐัŽั‚ัั.... ะ˜ ั‚ะตะฟะตั€ัŒ, ะตั‘ ะณะพะปะพั ะฟะพัั‚ะตะฟะตะฝะฝะพ ั€ะฐัั‚ะฒะพั€ัะตั‚ัั ะธะท ะตะณะพ ะผั‹ัะปะตะน. ะก ะฝะตะน, ะพะฝ ั‡ัƒะฒัั‚ะฒะพะฒะฐะป, ั‡ั‚ะพ ะพะฝ ะฝะต ั‚ะฐะบะพะน ะบะฐะบ ะฒัะต, ั ะฝะตะน, ะพะฝ ะผะพะณ ะฑั‹ั‚ัŒ ะพัะพะฑะตะฝะฝั‹ะผ, ะฝะพ ะทะฐั‡ะตะผ ั‚ะตะฟะตั€ัŒ ะฑั‹ั‚ัŒ ั‚ะฐะบะธะผ, ะตัะปะธ ะตั‘ ะฝะตั‚? ะขะตะฟะตั€ัŒ, ะพะฝ ะปะธัˆัŒ ะพะดะฝะพ ะปะธั†ะพ, ะฒ ั‚ะพะน ั‚ะพะปะฟะต, ั‡ั‚ะพ ะพะบั€ัƒะถะธะปะฐ ะตะณะพ. ะ’ะพะทะผะพะถะฝะพ, ะตะณะพ ะธ ะทะฐะผะตั‚ัั‚, ะฝะพ ั‚ะพั‡ะฝะพ ะฝะต ะบะฐะบ ะปะธั‡ะฝะพัั‚ัŒ, ะตะณะพ ะทะฝะฐะปะฐ ะปะธัˆัŒ ะพะฝะฐ, ะฝะพ ะพะฝะฐ ัƒัˆะปะฐ... Someday, someday ะะพ ะบะพะณะดะฐ-ะฝะธะฑัƒะดัŒ, ะบะพะณะดะฐ-ะฝะธะฑัƒะดัŒ I know you''re coming back ะฏ ะทะฝะฐัŽ, ั‚ั‹ ะฒะตั€ะฝะตัˆัŒัั... ะกั‚ั€ะพั‡ะบะธ ัะฐะผะธ ะฒัะฟะปั‹ะปะธ ัƒ ะฝะตะณะพ ะฒ ะณะพะปะพะฒะต. 
ะŸะตัะฝั, ะบะพั‚ะพั€ัƒัŽ ะพะฝ ัƒัะปั‹ัˆะฐะป ะฝะตะดะฐะฒะฝะพ ะฟะพ ั€ะฐะดะธะพ, ะฒัะตะณะดะฐ ะฟั€ะตัะปะตะดะพะฒะฐะปะฐ ะตะณะพ. ะะต ะฒะฐะถะฝะพ ะณะดะต, ะฝะฐ ัƒะปะธั†ะต, ะฒ ะผะตั‚ั€ะพ, ะฒ ะผะฐัˆะธะฝะต, ะธะปะธ ะดะพะผะฐ, ะพะฝ ะฒัะตะณะดะฐ ะฒัะฟะพะผะธะฝะฐะป ะตั‘... ะขัƒ, ั‡ั‚ะพ ะธะทะผะตะฝะธะปะฐ ะตะณะพ ะถะธะทะฝัŒ, ะธ ั‚ัƒ, ะบะพั‚ะพั€ะฐั ัะตะนั‡ะฐั ะพั‡ะตะฝัŒ ะดะฐะปะตะบะพ ะพั‚ ะฝะตะณะพ... ะจะฐะณะฐั ะฟะพ ัƒะปะธั†ะฐะผ ะฟั€ะพัั‹ะฟะฐะฒัˆะตะณะพัั ะณะพั€ะพะดะฐ, ะพะฝ ะฝะตะฒะพะปัŒะฝะพ ะฟั€ะพัˆั‘ะป-ั‚ะพ ะผะตัั‚ะพ, ะณะดะต ะพะฝะฐ ะฟะพะบะธะฝัƒะปะฐ ัั‚ะพั‚ ะผะธั€. ะ’ะพัะฟะพะผะธะฝะฐะฝะธั ะฝะฐั…ะปั‹ะฝัƒะปะธ ะตะณะพ, ะฝะพ ะพะฝ ะฟะพะฟั‹ั‚ะฐะปัั ะธั… ะพั‚ะพะณะฝะฐั‚ัŒ, ะฝะพ ั€ะฐะทะฒะต ัั‚ะพ ะฒะพะทะผะพะถะฝะพ? ะะตั‚... ะžะฝ ัั‚ะพ ะทะฝะฐะป, ะฝะพ ะฒัั‘ ะถะต ะฟั€ะพั‚ะธะฒะธั‚ัั ััƒะดัŒะฑะต. ะšะฐะบะฐั ะธั€ะพะฝะธั, ะฝะต ะฟั€ะฐะฒะดะฐ ะปะธ? ะ“ั€ัะทะฝะฐั ัƒะปะธั†ะฐ, ะฒ ัั‚ะฐั€ะพะผ ั€ะฐะนะพะฝะต ะœะฐะณะฝะพะปะธะธ, ะฝะฐัะบะฒะพะทัŒ ะฑั‹ะปะฐ ะฟั€ะพะฟะธั‚ะฐะฝะฐ ะบั€ะพะฒัŒัŽ, ะบัƒั€ะตะฒะพะผ, ะธ ะฐะปะบะพะณะพะปะตะผ. ะžะฝะฐ ะฒะตะดัŒ ะฟั€ะตะดะปะฐะณะฐะปะฐ ัƒะนั‚ะธ, ะฐ ะพะฝ ะฝะต ะฟะพัะปัƒัˆะฐะปัั ะตั‘. ะ“ะปัƒะฟะตั†. ะœะฐะปะตะฝัŒะบะธะน ะฝะฐะธะฒะฝั‹ะน ะณะปัƒะฟะตั†, ะบะพั‚ะพั€ั‹ะน ะฝะธั‡ะตะณะพ ะฝะต ะฑะพะธั‚ัั, ั…ะพั‚ั ะฒัั‘ ัะพะฒัะตะผ ะฝะฐะพะฑะพั€ะพั‚... ะœะฐะฝัŒัะบ, ะพั‚ ะบะพั‚ะพั€ะพะณะพ ะพะฝะธ ะฑะตะถะฐะปะธ, ะฑั‹ะป ะฟั€ะพะฟะธั‚ะฐะฝ ะฑะตะทัƒะผัั‚ะฒะพะผ, ะฝะพ ั€ะฐะทะฒะต ะฑะตะทัƒะผัั‚ะฒะพะผ ะฝะฐะทั‹ะฒะฐัŽั‚ ะฐะปะบะพะณะพะปัŒ? ะšั‚ะพ ะทะฝะฐะตั‚, ะบั‚ะพ ะทะฝะฐะตั‚... ะ ะถะตั€ั‚ะฒั‹, ะฟัŒัะฝะพะณะพ ั‡ะตะปะพะฒะตะบะฐ, ะฟั€ะธะฑะตะถะฐะปะธ ะฒ ะบะฐะบัƒัŽ-ั‚ะพ ัƒะปะพั‡ะบัƒ, ะธะท ะบะพั‚ะพั€ะพะน ัƒะฑะตะถะฐั‚ัŒ ะฝะตะฒะพะทะผะพะถะฝะพ, ะธะปะธ ั‚ัƒะฟะธะบ... ะžะดะฝะพ ะฒัะตะณะพ ะปะธัˆัŒ ัะปะพะฒะพ, ะฐ ัะบะพะปัŒะบะพ ะฟะฐะฝะธะบะธ ะพะฝะพ ะฟั€ะธะฝะพัะธั‚... ะŸัั‚ัััŒ, ะพะฝะธ ะฒัั‚ะฐะปะธ, ะฒะพ ั‡ั‚ะพ-ั‚ะพ ะฟะพั…ะพะถะตะต ะฝะต ะณั€ัะทัŒ, ะธะปะธ ัั‚ะพ ะธ ะฑั‹ะปะฐ ะณั€ัะทัŒ? ะšั‚ะพ ะทะฝะฐะตั‚, ะบั‚ะพ ะทะฝะฐะตั‚... 
ะ—ะฐะผะฐั…ะฝัƒะฒัˆะธััŒ, ะผะฐะฝัŒัะบ ะฝะฐะฟั€ะฐะฒะธะป ะฟะตั€ะตั‡ะฝั‹ะน ะฝะพะถะธะบ ะฝะฐ ะฟะฐั€ะฝั, ะธ ะบะธะฝัƒะป ะตะณะพ. ะ‘ะฐะฝะฐะปัŒะฝะพ ะดะฐ? ะะพ ะดะตะฒัƒัˆะบะฐ, ัƒัะฟะตะปะฐ ะฒัั‚ะฐั‚ัŒ ะฟะตั€ะตะด ะฝะธะผ, ะธ ะฟะพ ัั‚ะพะผัƒ, ัƒะดะฐั€ ะฟั€ะธัˆั‘ะปัั ะฟั€ัะผะพ ะฒ ะถะธะฒะพั‚. ะงั‚ะพ ะฑั‹ะปะพ ะดะฐะปัŒัˆะต, ะฟะฐั€ะตะฝัŒ ะฟะพะผะฝะธะป ัะผัƒั‚ะฝะพ, ะฝะพ ะฑั‹ะปะพ ะพั‡ะตะฒะธะดะฝะพ, ะฝะฐ ะตะณะพ ะณะปะฐะทะฐั…, ะทะฐ ะฝะตะณะพ ัƒะผะตั€ะปะฐ ะตะณะพ ะ”ะถัƒะฒะธั. ะกั‚ั€ะฐะฝะฝะพ, ะดะฐ? ะ’ั€ะพะดะต ะฑั‹ ะดะพะปะถะฝะพ ะฑั‹ั‚ัŒ ะฝะฐะพะฑะพั€ะพั‚, ะฐ ะฟะพะปัƒั‡ะธะปะพััŒ ั‚ะฐะบ. ะะฐ ะตะณะพ ั€ัƒะบะฐั…, ะฑั‹ะปะฐ ะตั‘ ะบั€ะพะฒัŒ, ะฐ ะพะฝ ะฒะธะดะธะผะพ ะฟะพั‚ะตั€ัะป ัะพะทะฝะฐะฝะธะต... ะขะตะฟะตั€ัŒ, ะผะธั€ ัะปะธัˆะบะพะผ ัะตั€ั‹ะน, ั…ะพั‚ั ะฝะตั‚, ะพะฝ ะบั€ะพะฒะฐะฒะพ-ะบั€ะฐัะฝั‹ะน, ั‚ะฐะบะพะน ะถะต, ะบะฐะบ ะธ ะตั‘ ะปัŽะฑะธะผั‹ะน ั†ะฒะตั‚. ะกั‚ั€ะฐะฝะฝะพ ะดะฐ? ะ›ัŽะฑะธั‚ ะดะพะถะดัŒ, ะฝะพ ะปัŽะฑะธะผั‹ะน ั†ะฒะตั‚ ะบั€ะฐัะฝั‹ะน... ะะพ ั€ะฐะทะฒะต ะธะผะตะตั‚ ะปะธ ัั‚ะพ ะทะฝะฐั‡ะตะฝะธะต, ะตัะปะธ ะ”ะถัƒะฒะธั ัƒัˆะปะฐ ัะพ ัั†ะตะฝั‹, ะฝะฐะฒัะตะณะดะฐ? ะ’ะพะทะผะพะถะฝะพ, ัั‚ะพ ะธ ะฑั‹ะป ะตั‘ ั‡ะฐั, ะฐ ะผะพะถะตั‚ ะฑั‹ั‚ัŒ, ะพะฝะฐ ะถะธะปะฐ ะฑั‹ ะตั‰ั‘ ะดะพะปัŒัˆะต, ะฝะพ ะฒัั‘ ะถะต ะฟั€ะพัˆะปะฐ ะผะธะผะพ, ั‚ะฐะบ ะถะต, ะบะฐะบ ะธ ะฒัั‘ ั…ะพั€ะพัˆะตะต, ะฒ ัั‚ะพะผ ะฐะปั‡ะฝะพะผ ะผะธั€ะต... ะ˜ ัั‚ะพั‚ ั„ะฐะบั‚, ะทะฐัั‚ะฐะฒะธะป ะตะณะพ ะฟะพะฒะทั€ะพัะปะตั‚ัŒ, ะธ ะฟั€ะธะฝัั‚ัŒ ั€ะตะฐะปัŒะฝะพัั‚ัŒ ัะพ ะฒัะตะผะธ ะตั‘ ัะพัั‚ะฐะฒะปััŽั‰ะธะผะธ... ะขั€ัƒะดะฝะพ, ะฑั‹ะปะพ ะปะธัˆัŒ ะฟะพ ะฝะฐั‡ะฐะปัƒ, ะฐ ัะตะนั‡ะฐั ะพะฝ ะฟั€ะธะฒั‹ะบ, ะธ ะดะฐะถะต ะฒะฟะพะปะฝะต ัะฟั€ะฐะฒะปัะตั‚ัั, ะฝะพ ั‡ั‚ะพ-ั‚ะพ ะฟะพะผะตะฝัะตั‚ัั, ะธ ัั‚ะพ, ั‡ั‚ะพ-ั‚ะพ ะฟั€ะพะธะทะพะนะดั‘ั‚ ัะปะธัˆะบะพะผ ะฑั‹ัั‚ั€ะพ... ะžะฝะฐ ะฝะธะบะพะณะดะฐ ะฝะต ะทะฐะดัƒะผั‹ะฒะฐะปะฐััŒ, ะพ ั‚ะพะผ, ะฑัƒะดัƒั‚ ะปะธ ะตั‘ ะฟะพะผะฝะธั‚ัŒ, ะธะปะธ ะฝะต ะทะฐะฟะปะฐั‡ัƒั‚ ะปะธ ะพะฝะธ, ะบะพะณะดะฐ ะพะฝะฐ ัƒะผั€ะตั‚, ะฝะตั‚, ัƒ ะฝะตั‘ ะดะฐะถะต ะผั‹ัะปะตะน ั‚ะฐะบะธั… ะฝะต ะฑั‹ะปะพ, ะฒะตะดัŒ ั‚ั‹ ะฑั‹ะปะฐ ะฒ ัะฒะพะธั… ะผะตั‡ั‚ะฐั…. 
ะ”ัƒะผะฐะปะฐ, ั‡ั‚ะพ ะฑัƒะดะตัˆัŒ ะถะธั‚ัŒ ะฒะตั‡ะฝะพ, ะบะฐะบ ะฑะพะณ, ะธะปะธ ะดัŒัะฒะพะป. ะžะฝะฐ ะดัƒะผะฐะปะฐ, ั‡ั‚ะพ ั‚ะตะฑั ะฝะต ะฟั€ะพะฑัŒั‘ั‚ ะฝะธะบะฐะบะฐั ะฟัƒะปั, ะฝะพ ะฒัั‘ ะถะต ะพัˆะธะฑะปะฐััŒ... ะŸั€ะพัั‚ะพะน ะฝะพะถะธะบ ะพั‚ะฝัะป ั‚ะฒะพัŽ ะถะธะทะฝัŒ, ะธ ะทะฝะฐะตัˆัŒ, ั ะฝะต ะถะฐะปะตัŽ ะพะฑ ัั‚ะพะผ, ะฒะตะดัŒ, ั‚ั‹ ะฒัะตะณะดะฐ ะฑั‹ะปะฐ ั ะบั€ั‹ะปัŒัะผะธ, ะบะฐะบ ัƒ ะฐะฝะณะตะปะฐ, ั…ะพั‚ัŒ ั‚ั‹ ะฝะต ะฒัะตะณะดะฐ ะฑั‹ะปะฐ ัั‡ะฐัั‚ะปะธะฒะฐ, ะฟะพะบะฐ, ะฒ ั‚ะฒะพะตะน ะถะธะทะฝะธ ะฝะต ะทะฐะฟะตะปะธ ะฟะพัะปะฐะฝะฝะธะบะธ ะ‘ะพะถัŒะธ... ะ˜ ัะฝะพะฒะฐ, ะตะณะพ ะฟั€ะพะฑะธั€ะฐะตั‚ ะดั€ะพะถัŒ, ะพะฝ ะดัƒะผะฐะตั‚, ั‡ั‚ะพ ัะตะนั‡ะฐั ะพะฝะฐ ะฟะพะดะฑะตะถะธั‚, ัะทะฐะดะธ, ะธ, ะทะฐะบั€ั‹ะฒ ะณะปะฐะทะฐ, ัะบะฐะถะตั‚ "ะฃะณะฐะดะฐะน ะบั‚ะพ?" ะ˜ ั‚ั‹ ะบะฐะบ ะพะฑั‹ั‡ะฝะพ ะฟั€ะพัั‚ะพ ัะบะฐะถะตัˆัŒ ั ัƒะปั‹ะฑะบะพะน "ะ”ะถัƒะฒะธั. ะœะพั ะ”ะถัƒะฒะธ." ะ˜ ั‚ะตะผะฝะพะฒะพะปะพัั‹ะน ะฟะพะบะปัะฝั‘ั‚ัั, ะฒัะตะผ ะฑะพะณะฐะผ, ั‡ั‚ะพ ะพะฝะฐ ัะผัƒั‰ะฐะตั‚ัั ะพั‚ ะตะณะพ ัะปะพะฒ. ะ’ะตะดัŒ ัั‚ะพ ะฟั€ะฐะฒะดะฐ... ะ‘ัƒะดะฝะธะต ะดะฝะธ ะฟะพั‚ะธั…ะพะฝัŒะบัƒ ะฒั‹ั‚ะตัะฝััŽั‚ ะตั‘ ะณะพะปะพั ะธะท ะตะณะพ ะณะพะปะพะฒั‹, ะฐ ัะฐะผ ะพะฝ ะฟะพะณั€ัƒะถะฐะตั‚ัั, ะฒ ััƒะตั‚ัƒ ะณะพั€ะพะดะฐ. ะ’ะพะทะผะพะถะฝะพ, ะพะฝ ะทะฐะฑัƒะดะตั‚ ะตั‘, ะฝะฐะนะดั‘ั‚ ะดั€ัƒะณัƒัŽ, ะธ ะพะฝะธ ะทะฐะถะธะฒัƒั‚ ัั‡ะฐัั‚ะปะธะฒะพ, ะธะปะธ ะฑัƒะดะตั‚ ะพะดะธะฝะพะบ ะฒัั‘ ะถะธะทะฝัŒ, ะฒัะฟะพะผะธะฝะฐั ะตั‘. ะšั‚ะพ ะทะฝะฐะตั‚, ั‡ะตะผ ะทะฐะบะพะฝั‡ะธั‚ัั ัั‚ะฐ ะธัั‚ะพั€ะธั, ะฝะพ ะพะดะฝะฐะถะดั‹ ะพะฝะธ ะฒัั‚ั€ะตั‚ัั‚ัั. ะงะตั€ะตะท ัั‚ะพ ะปะตั‚, ั‡ะตั€ะตะท ะฒะตะบะฐ, ั‚ั‹ััั‡ะตะปะตั‚ะธั, ะฝะต ะฒะฐะถะฝะพ... ะžะฝะธ ะฒัั‚ั€ะตั‚ัั‚ัั, ะธ ัั‚ะพ ะณะปะฐะฒะฝะพะต.... ะ ะฟะพะบะฐ, ั‚ะตะผะฝะพะฒะพะปะพัั‹ะน ัะบะฐะถะตั‚ ะปะธัˆัŒ ะพะดะฝะพ ะฟั€ะตะดะปะพะถะตะฝะธะต, ะธ ั ัƒั…ะผั‹ะปะบะพะน ะฟะพะนะดั‘ั‚ ะฒ ัั‚ะพั€ะพะฝัƒ ะดะพะผะฐ... - ะšะพะดะฐ ะถะต ั‚ั‹ ัะฟัƒัั‚ะธัˆัŒัั, ะ”ะถัƒะฑะธ? ะ’ัะตะณะพ ะปะธัˆัŒ ะทะฐ ะฝะตัะบะพะปัŒะบะพ ัะตะบัƒะฝะด, ะผะธั€ ะผะพะถะตั‚ ะฟะพะผะตะฝัั‚ัŒัั, ะดะพ ะฝะตัƒะทะฝะฐะฒะฐะตะผะพัั‚ะธ. 
ะ‘ะตั€ะตะณะธั‚ะต ัั‚ะพ ะฒั€ะตะผั, ะธ ะฒะพะทะผะพะถะฝะพ, ะฒ ะฑัƒะดัƒั‰ะตะผ ะฒั€ะตะผั ัะฐะผะพ ะฟั€ะตะฟะพะดะฝะตัะตั‚ ะฒะฐะผ ะฟะพะดะฐั€ะพะบ...' - 'ะ‘ะพะปัŒะฝะพ... ะกะปะธัˆะบะพะผ ะฑะพะปัŒะฝะพ... ะ‘ะพะปัŒะฝะพ ะบะฐะถะดะพะต ะผะณะฝะพะฒะตะฝะธะต, ะบะฐะถะดั‹ะน ะฒะทะดะพั… ะฟั€ะธั‡ะธะฝัะตั‚ ะฟั€ะพัั‚ะพ ะฝะตะผั‹ัะปะธะผัƒัŽ ะฑะพะปัŒ. ะ ะพะฝะฐ ะปะตะถะธั‚ ัะพะฒัะตะผ ั€ัะดะพะผ, ะฑะตะทะดั‹ั…ะฐะฝะฝะฐั. ะ’ะพะบั€ัƒะณ ะปัŽะดะธ, ะฝะพ ะพะฝ ะธั… ะฟั€ะพัั‚ะพ ะฝะต ะฒะธะดะธั‚. ะžะฝ ะฒะธะดะตะป ะบะฐะบ ะพะฝะฐ ัƒะผะตั€ะปะฐ, ะพะฝะฐ ัะผะพั‚ั€ะตะปะฐ ะฟั€ัะผะพ ะฝะฐ ะฝะตะณะพ ะฒ ัั‚ะพั‚ ะผะพะผะตะฝั‚ ะธ ั…ะพั‚ะตะปะฐ ะฟะพั†ะตะปะพะฒะฐั‚ัŒ. "Magic can do much, but not death". ะžะฝ ะพั‚ะณะพั€ะพะดะธะปัั ะพั‚ ะฒัะตะณะพ ะฒะพะบั€ัƒะณ, ะพัั‚ะฐะปะฐััŒ ั‚ะพะปัŒะบะพ ะฑะพะปัŒ. ะ˜ ะฒะดั€ัƒะณ ะฒัะฟั‹ัˆะบะฐ ัั€ะพัั‚ะธ, ะฐ ะฟะตั€ะตะด ะตะณะพ ะฒะทะพั€ะพะผ ะณะฐัะฝัƒั‰ะธะต ะณะปะฐะทะฐ ะ‘ะตะปะปัŒ. ะฏั€ะพัั‚ัŒ ะทะฐัั‚ะฐะฒะธะปะฐ ะตะณะพ ะฟะพะดะฝัั‚ัŒัั ั ะบะพะปะตะฝ ะธ ะฝะฐะฟั€ะฐะฒะธั‚ัŒัั ะบ ัƒะฑะธะนั†ะต. ะŸั€ะธ ั…ะพะดัŒะฑะต, ะฒ ะฝะพะณะต ะฟั€ะพัะฝัƒะปัั ะฒัƒะปะบะฐะฝ, ะฝะพ ัั‚ะพ ะฑั‹ะปะพ ัƒะถะต ะฝะต ะฒะฐะถะฝะพ. ะ‘ะพะปัŒ ะฒ ะดัƒัˆะต ะทะฐะณะปัƒัˆะฐะปะฐ ะฒัะต ะพัั‚ะฐะปัŒะฝะพะต. "ะžะฝะฐ ัƒะผะตั€ะปะฐ ะธะท-ะทะฐ ะผะตะฝั". ะžะฝ ะฟะพะดะพัˆะตะป ะบ ัƒะฑะธะนั†ะต, ั‚ะพั‚ ะปะตะถะฐะป ะฝะฐ ะทะตะผะปะต ะธ ัƒั…ะผั‹ะปัะปัั. ะžะฝ ะฒั‹ะฟะพะปะฝะธะป ัะฒะพัŽ ะผะตัั‚ัŒ ะธ ะฑั‹ะป ัะพะฑะพะน ะดะพะฒะพะปะตะฝ. ะœัƒะถั‡ะธะฝะฐ ั ัั€ะพัั‚ัŒัŽ ะฝะฐั‡ะฐะป ะดัƒัˆะธั‚ัŒ ัƒะฑะธะนั†ัƒ ัะฒะพะตะน ั‚ั€ะพัั‚ัŒัŽ, ะฝะพ ัƒั…ะผั‹ะปะบะฐ ะฝะต ัั…ะพะดะธะปะฐ ั ะตะณะพ ะปะธั†ะฐ. ะ“ะพะปะดะฐ ะฟั‹ั‚ะฐะปะธััŒ ะพั‚ั‚ะฐั‰ะธั‚ัŒ, ะฝะพ ัั€ะพัั‚ัŒ ะฟั€ะธะดะฐะปะฐ ะตะผัƒ ะพะณั€ะพะผะฝัƒัŽ ัะธะปัƒ, ะธ ะฒัะบะพั€ะต ะณะปะฐะทะฐ ะฟะธั€ะฐั‚ะฐ ะฟะพะณะฐัะปะธ. ะะพ ะ“ะพะปะดัƒ ะฝะต ัั‚ะฐะปะพ ะพั‚ ัั‚ะพะณะพ ะปะตะณั‡ะต, ะพะฝ ะปะธัˆัŒ ะฟะพั‡ัƒะฒัั‚ะฒะพะฒะฐะป, ั‡ั‚ะพ ะฟั€ะตะดะฐะป ะตะต ะฒะตั€ัƒ. ะฃะฑะธะฒ ะ‘ะตะปะปัŒ, ะฟะธั€ะฐั‚ ัƒะฑะธะป ะฒัะต ั‡ะตะปะพะฒะตั‡ะตัะบะพะต ะฒ ะ“ะพะปะดะต, ะพัั‚ะฐะปัั ั‚ะพะปัŒะบะพ ะผะพะฝัั‚ั€. 
ะœะพะฝัั‚ั€ ั ั€ะฐะทะฑะธั‚ั‹ะผ ัะตั€ะดั†ะตะผ. ะญะผะผะฐ ั…ะพั‚ะตะปะฐ ะฝะฐะดะตั‚ัŒ ะฝะฐ ะฝะตะณะพ ะฝะฐั€ัƒั‡ะฝะธะบะธ, ะฝะพ ะฝะต ัะผะพะณะปะฐ. ะ˜ะผะฟัƒะปัŒัะพะผ ะพะฝ ะพั‚ะฑั€ะพัะธะป ะตะต ะธ ะตะต ั€ะพะดะธั‚ะตะปะตะน ะพั‚ ัะตะฑั. ะ•ะณะพ ะบะพะถะฐ ะฝะฐั‡ะฐะปะฐ ะผะตะฝัั‚ัŒัั, ะธ ะพะดะตะถะดะฐ ั‚ะฐะบะถะต ะฒะผะธะณ ะฟะตั€ะตะผะตะฝะธะปะฐััŒ. ะกะพะฒะตั€ัˆะธะฒ ัƒะฑะธะนัั‚ะฒะพ, ะพะฝ ัะฝะพะฒะฐ ัั‚ะฐะป ะบั€ะพะบะพะดะธะปะพะผ, ะฑะตะทะถะฐะปะพัั‚ะฝั‹ะผ ะ ัƒะผะฟะตะปัŒัˆั‚ะธะปัŒั†ั…ะตะฝะพะผ. ะžะฝ ัะฝะพะฒะฐ ะฟะพะดะพัˆะตะป ะบ ะ‘ะตะปะปัŒ ะธ ัƒะฟะฐะป ะฟะตั€ะตะด ะฝะตะน ะฝะฐ ะบะพะปะตะฝะธ. ะžะฝ ัะผะพั‚ั€ะตะป ะฝะฐ ะตะต ั…ะพะปะพะดะฝะพะต ะปะธั†ะพ ะธ ะฟั€ะพัั‚ะพ ะฝะต ะผะพะณ ะฟะพะฒะตั€ะธั‚ัŒ, ั‡ั‚ะพ ะตะต ั‚ะตะฟะปั‹ะต ะณะปะฐะทะฐ ะฝะต ะฟะพัะผะพั‚ั€ัั‚ ั ะฝะตะถะฝะพัั‚ัŒัŽ ะฝะฐ ัะฒะพะต ั‡ัƒะดะพะฒะธั‰ะต. ะ˜ะท ะณั€ัƒะดะธ ะฒั‹ั€ะฒะฐะปัั ะถัƒั‚ะบะธะน ะฝะตั‡ะตะปะพะฒะตั‡ะตัะบะธะน ะบั€ะธะบ, ะฑัƒะดั‚ะพ ะพะฝ ะทะฒะฐะป ะดัƒัˆัƒ ะปัŽะฑะธะผะพะน, ะฝะพ ะพะฝะฐ ะฝะต ะพั‚ะบะปะธะบะฐะปะฐััŒ. ะ˜ ัะฝะพะฒะฐ ัั€ะพัั‚ัŒ, ะฑะตะทัƒะผะฝะฐั ัั€ะพัั‚ัŒ ะธ ัะปะตะทั‹. ะžะฝะฐ ะฒะธะฝะพะฒะฐั‚ะฐ ะปะธัˆัŒ ะฒ ั‚ะพะผ, ั‡ั‚ะพ ะฒะปัŽะฑะธะปะฐััŒ ะฒ ั‡ัƒะดะพะฒะธั‰ะต, ะฟะพะดะฐั€ะธะปะฐ ะตะผัƒ ัั‡ะฐัั‚ัŒะต. ะะพ ะตะต ัะผะตั€ั‚ัŒ ะทะฐะฑั€ะฐะปะฐ ะฒัะต. ะ‘ั‹ะปะพ ัะพะฒัะตะผ ะฝะต ั‚ะฐะบ, ะบะฐะบ ั‚ะพะณะดะฐ ะฒ ะทะฐะผะบะต. ะขะพะณะดะฐ ะพะฝ ั…ะพั‚ั ะฑั‹ ะทะฝะฐะป, ั‡ั‚ะพ ะตะณะพ ะฟั€ะธะฝั†ะตััะฐ ะฒ ะฟะพั€ัะดะบะต, ั‚ะตะฟะตั€ัŒ... ะกะปะธัˆะบะพะผ ะฑะพะปัŒะฝะพ... ะ ัƒะผะฟะตะปัŒัˆั‚ะธะปัŒั†ั…ะตะฝัƒ ะฝะต ั…ะพั‚ะตะปะพััŒ ัั‚ัƒะฟะธั‚ัŒ ะทะฐ ั‡ะตั€ั‚ัƒ ะณะพั€ะพะดะฐ ะธ ะฒัะต ะทะฐะฑั‹ั‚ัŒ. ะžะฝ ะฝะต ะผะพะณ ะทะฐะฑั‹ั‚ัŒ ัะฒะพัŽ ะฟั€ะธะฝั†ะตัััƒ, ั…ะพั‚ัŒ ะฒะพัะฟะพะผะธะฝะฐะฝะธั ะธ ะฟั€ะธะฝะพัะธะปะธ ัั‚ั€ะฐะดะฐะฝะธั. ะžะฝ ั…ะพั‚ะตะป ะฟั€ะพัั‚ะพ ัƒะผะตั€ะตั‚ัŒ ั€ัะดะพะผ ั ั‚ะพะน ะดะตะฒัƒัˆะบะพะน, ะบะพั‚ะพั€ะฐั ะทะฐะฑั€ะฐะปะฐ ะตะณะพ ัะตั€ะดั†ะต.' - 'ะ”ะพ ัั‚ะพะณะพ ะผะพะผะตะฝั‚ะฐ ั ะฝะธั‡ะตะณะพ ะฝะต ั‡ัƒะฒัั‚ะฒะพะฒะฐะป. ะŸัƒัั‚ะพั‚ะฐ ะธ ั‚ะตะผะฝะพั‚ะฐ. 
ะŸะพัะปะต ั‚ะพะณะพ, ะบะฐะบ ัั‚ะฐ ั‚ะฒะฐั€ัŒ ะฒั†ะตะฟะธะปะฐััŒ ะผะฝะต ะฒ ัˆะตัŽ ั ัƒัะฟะตะป ะผั‹ัะปะตะฝะฝะพ ัะพ ะฒัะตะผะธ ะฟะพะฟั€ะพั‰ะฐั‚ัŒัั ะธ ะฑั‹ะป ะณะพั‚ะพะฒ ะบ ัะผะตั€ั‚ะธ. ะะพ ัั‚ะพ ะฑั‹ะปะฐ ะฝะต ะพะฝะฐ. ะะฐ ั‚ะพะผ ัะฒะตั‚ะต ะฝะต ะผะพะถะตั‚ ะฑั‹ั‚ัŒ ั‚ะฐะบ ะฝะตะพะฟั€ะตะดะตะปะตะฝะฝะพ ะธ ั‚ะตะผะฝะพ. ะ—ะฝะฐั‡ะธั‚, ั ะฒัะต ะตั‰ั‘ ะถะธะฒ. ะ ั€ะฐะท ั‚ะฐะบ, ั ัะบะพั€ะพ ะฒั‹ะฑะตั€ัƒััŒ ะพั‚ััŽะดะฐ... ะŸะพ ั‚ะตะปัƒ ัะปะพะฒะฝะพ ะฟั€ะพัˆะตะป ัะปะตะบั‚ั€ะธั‡ะตัะบะธะน ั€ะฐะทั€ัะด. ะฏ ะฒะทะดั€ะพะณะฝัƒะป ะธ ั€ะฐัะฟะฐั…ะฝัƒะป ะณะปะฐะทะฐ. ะงะตัั‚ะฝะพ ะฟั€ะธะทะฝะฐั‚ัŒัั, ั ะดะฐะถะต ะฝะต ัั€ะฐะทัƒ ะฟะพะฝัะป ะณะดะต ะฝะฐั…ะพะถัƒััŒ. ะ˜ ะฒ ะดะฐะฝะฝั‹ะน ะผะพะผะตะฝั‚ ะผะตะฝั ัั‚ะพ ัะพะฒัะตะผ ะฝะต ั‚ั€ะตะฒะพะถะธะปะพ, ะฟะพั‚ะพะผัƒ, ั‡ั‚ะพ ะฟะตั€ะตะด ัะพะฑะพะน ั ัƒะฒะธะดะตะป ะทะฝะฐะบะพะผะพะต ะปะธั†ะพ ะผะฐะณะฐ. ะ•ะดะธะฝัั‚ะฒะตะฝะฝั‹ะน, ะบั‚ะพ ะผะฝะต ะฝั€ะฐะฒะธะปัั ะฟะพัะปะต ะ”ะถะตะนัะฐ ะธ ั ะฟะตั€ะฒะพะณะพ ะฒะทะณะปัะดะฐ ััƒะผะตะป ะทะฐะฟะพะปัƒั‡ะธั‚ัŒ ะผะพั‘ ัะตั€ะดั†ะต. ะ”ะฐ, ัั‚ะพ ะฝะตัะพะผะฝะตะฝะฝะพ ะฑั‹ะป ะพะฝ. ะ—ะฐ ัั‚ะพะปัŒะบะธะต ะณะพะดั‹ ะฒะฟะตั€ะฒั‹ะต ะฒ ัั‚ะตะฝะฐั… ะธะฝัั‚ะธั‚ัƒั‚ะฐ ะบั‚ะพ-ั‚ะพ ะบั€ะพะผะต ััƒะผะตั€ะตั‡ะฝั‹ั… ะพั…ะพั‚ะฝะธะบะพะฒ. - ะœะฐะณะฝัƒั ะ‘ะตะนะฝ, - ั ะผะฐะบัะธะผะฐะปัŒะฝะพะน ั…ะพะปะพะดะฝะพัั‚ัŒัŽ, ะบะฐะบะฐั ั‚ะพะปัŒะบะพ ะฒะพะทะผะพะถะฝะฐ ะฒ ะดะฐะฝะฝะพะน ัะธั‚ัƒะฐั†ะธะธ, ะฟั€ะพะบะพะผะผะตะฝั‚ะธั€ะพะฒะฐะป ะพั‡ะตะฒะธะดะฝะพะต ั. ะขะฐะบ ัƒะถ ะฟั€ะธะฝัั‚ะพ ัƒ ะฝะฐั, ะพั…ะพั‚ะฝะธะบะพะฒ. ะะต ะบะธะฝัƒััŒ ะถะต ั ะบ ะฝะตะผัƒ ะฝะฐ ัˆะตัŽ ั ะฟะพั†ะตะปัƒัะผะธ ะธ ั„ั€ะฐะทะพั‡ะบะฐะผะธ ะฟั€ะธะผะธั‚ะธะฒะฝั‹ั… ั‚ะธะฟะฐ: "ะž, ะฟั€ะธะฒะตั‚! ะ”ะฐะฒะฝะตะฝัŒะบะพ ะฝะต ะฒะธะดะตะปะธััŒ. ะฏ ั‚ะฐะบ ัะบัƒั‡ะฐะป". ะงะธัั‚ะพ ั„ะธะทะธั‡ะตัะบะธ ะฒ ะดะฐะฝะฝั‹ะน ะผะพะผะตะฝั‚ ั ะธ ะฝะต ะผะพะณ ัะตะฑะต ั‚ะฐะบะพะณะพ ะฟะพะทะฒะพะปะธั‚ัŒ. ะ”ะฐ ะธ ะฒะธะดะตะป ั ะผะฐะณะฐ ะฒัะตะณะพ ะฒะพ ะฒั‚ะพั€ะพะน ั€ะฐะท ะฒ ัะฒะพะตะน ะถะธะทะฝะธ. 
ะะพ, ั‡ะตั€ั‚ ะฒะพะทัŒะผะธ, ัั‚ะพะณะพ ะฑั‹ะปะพ ะดะพัั‚ะฐั‚ะพั‡ะฝะพ, ั‡ั‚ะพะฑั‹ ะฟะพะดั†ะตะฟะธั‚ัŒ ะผะตะฝั ะฝะฐ ะบั€ัŽั‡ะพะบ ะบะฐะบ ะณะปัƒะฟะพะณะพ ะบะฐั€ะฐัั! - ะะปะตะบัะฐะฝะดั€ ะ›ะฐะนั‚ะฒัƒะด, - ัะฝะธัั…ะพะดะธั‚ะตะปัŒะฝะพ ัƒะปั‹ะฑะฝัƒะปัั ะœะฐะณะฝัƒั. ะŸะพั…ะพะถะต ะฟะพัะปะต ะดะพะปะณะพะน ะพั‚ะบะปัŽั‡ะบะธ ัƒ ะผะตะฝั ะฝะฐั‡ะฐะปะฐ ะบั€ัƒะถะธั‚ัŒัั ะณะพะปะพะฒะฐ, ะฝะพ ัั‚ะพ ะฟะพะบะฐะทะฐะปะพััŒ ะดะฐะถะต ะฟั€ะธัั‚ะฝะพ. ะฏ ะพั‚ะบั€ั‹ะป ะฑั‹ะปะพ ั€ะพั‚ ั‡ั‚ะพะฑั‹ ะทะฐะดะฐั‚ัŒ ะฒะพะฟั€ะพั, ะฝะพ ะณะพัั‚ัŒ ะพะฟะตั€ะตะดะธะป ะผะตะฝั. - ะขะพะปัŒะบะพ ะฝะต ัะฟั€ะฐัˆะธะฒะฐะน, ั‡ั‚ะพ ั ะทะดะตััŒ ะดะตะปะฐัŽ, - ะพะฝ ะฟั€ะพะถะตะณ ะผะตะฝั ะฒะทะณะปัะดะพะผ ะผะตะดะพะฒั‹ั… ะณะปะฐะท. - ะฅะพะดะถ ะฟะตั€ะตะถะธะฒะฐะป ะทะฐ ั‚ะฒะพัŽ ะถะธะทะฝัŒ ะธ, ะบะฐะบ ั ะฒะธะถัƒ, ะฝะต ะฝะฐะฟั€ะฐัะฝะพ. ะญั‚ะพ ะฑั‹ะปะพ ะธะผะตะฝะฝะพ ั‚ะพ, ั‡ั‚ะพ ั ะธ ั…ะพั‚ะตะป ัะฟั€ะพัะธั‚ัŒ. ะ•ัะปะธ ะพะฝ ัƒะผะตะตั‚ ั‡ะธั‚ะฐั‚ัŒ ะผั‹ัะปะธ - ัั‚ะพ ะฒะตััŒะผะฐ ะฟะปะฐั‡ะตะฒะฝะพ. ะŸั€ะตะดะฟะพั‡ะธั‚ะฐัŽ ะดะตั€ะถะฐั‚ัŒ ัะฒะพะธ ะผั‹ัะปะธ ะฟั€ะธ ัะตะฑะต. ะ‘ะตะนะฝะฐ ัะฝะพะฒะฐ ั‡ั‚ะพ-ั‚ะพ ั€ะฐะทะฒะตัะตะปะธะปะพ ะธ ะพะฝ ะบั€ะธะฒะพ ัƒะปั‹ะฑะฝัƒะปัั ะพั‚ะฒะพั€ะฐั‡ะธะฒะฐัััŒ ะพั‚ ะผะตะฝั. ะญั‚ะพ ะฟะพะดั‚ะฒะตั€ะดะธะปะพ ะผะพะธ ะดะพะณะฐะดะบะธ. - ะ›ะฐะดะฝะพ, ะฝะต ะฑัƒะดัƒ ั‚ะตะฑั ัะผัƒั‰ะฐั‚ัŒ, - ะผะฐะณ ะฟะพะดะฝัะปัั. - ะขั‹ ะตั‰ะต ัะปะธัˆะบะพะผ ัะปะฐะฑ ะธ ะพั‚ะดั‹ั… ะปะธัˆะฝะธะผ ะฝะต ะฑัƒะดะตั‚. ะœะฐะณะฝัƒั ัะดะตะปะฐะป ัั‚ั€ะฐะฝะฝะพะต ะดะฒะธะถะตะฝะธะต ะฟะพั…ะพะถะตะต ะฝะฐ ะฟะพะปัƒะฟะพะบะปะพะฝ ะธ ะฝะฐะฟั€ะฐะฒะธะปัั ะบ ะดะฒะตั€ะธ. "ะะตั‚, ั‚ะฐะบ ะฟั€ะพัั‚ะพ ั‚ั‹ ะพั‚ ะผะตะฝั ะฝะต ัƒะนะดะตัˆัŒ". ะœะฝะต ะดะตะนัั‚ะฒะธั‚ะตะปัŒะฝะพ ั…ะพั‚ะตะปะพััŒ, ั‡ั‚ะพะฑั‹ ะพะฝ ะฟะพะฑั‹ะป ั€ัะดะพะผ, ั‡ั‚ะพะฑั‹ ะฝะต ัƒั…ะพะดะธะป ั‚ะฐะบ ะฑั‹ัั‚ั€ะพ. ะžะฝ ะผะฝะต ะฝั€ะฐะฒะธะปัั, ะฝะฐัะบะพะปัŒะบะพ ะผะพะถะฝะพ ััƒะดะธั‚ัŒ ะฟะพ ะพะดะฝะพะน ะผะธะผะพะปะตั‚ะฝะพะน ะฒัั‚ั€ะตั‡ะต ะณะปะฐะทะฐะผะธ. ะฅะพั‚ั ั ะฝะต ะณะพั‚ะพะฒ ะฑั‹ะป ะฟั€ะธะทะฝะฐั‚ัŒัั ะฒ ัั‚ะพะผ ะดะฐะถะต ัะตะฑะต. 
- I feel fine, - I remarked in a voice not my own, following the warlock with my eyes. He did not stop. - Bane! - again no reaction. - Stop! - in furious helplessness I struck the bed with my hand, and the instantly returning pain made me clench my teeth and groan almost inaudibly. The warlock froze. - You want me to stay? - I heard a rather unambiguous hint in his words. - You need me? The figure in the black robe turned toward me. Bane raised an eyebrow, awaiting an answer. - Oh, go to the devil! - I snapped, unexpectedly even to myself, but there was no way back. - ... - Get out, - I hissed through my teeth, inwardly begging him not to listen to my words. - Fine, - Magnus shook his head placatingly. - Forgive me. He returned to my cot. I measured him with an angry look and "entirely by accident" met those beautiful eyes. His gaze pierced straight into the soul, compelled obedience to its owner's will. I could not have described it any other way, though perhaps it worked that way only on me. - You have beautiful eyes too, - he complimented. - Enough! - I bit my lip, feeling the color rush to my face. - Yes, sorry, - Magnus caught himself.
- ะฏ ะฟะพะฝัะป, ั‡ั‚ะพ ั‚ะตะฑะต ัั‚ะพ ะฝะต ะฝั€ะฐะฒะธั‚ัั. ะงั‚ะพ ะถ, ะฟั€ะธะดะตั‚ัั ะธะณั€ะฐั‚ัŒ ะฟะพ ั‚ะฒะพะธะผ ะฟั€ะฐะฒะธะปะฐะผ. - ะ‘ะตะนะฝ ะฟะพะถะฐะป ะฟะปะตั‡ะฐะผะธ ะธ ะฒะตััŒะผะฐ ะณะปัƒะฟะพ ัƒะปั‹ะฑะฝัƒะปัั. ะœะฝะต ะพัั‚ะฐะปะพััŒ ั‚ะพะปัŒะบะพ ะฟะพะบะฐั‡ะฐั‚ัŒ ะณะพะปะพะฒะพะน. - ะขะฐะผ, ะฝะฐ ะฒะตั‡ะตั€ะธะฝะบะต ะฒะฐะผะฟะธั€ะพะฒ... - ั ะทะฐะผัะปัั, - ะšะฐะบ ั‚ั‹ ัƒะทะฝะฐะป, ั‡ั‚ะพ ั... ะณะตะน? - ะ“ะปัƒะฟั‹ะน ะฒะพะฟั€ะพั, ะฝะต ะฝะฐั…ะพะดะธัˆัŒ? - ัะปะฐะดะบะพ ัƒะปั‹ะฑะฝัƒะปัั ะพะฝ, ะฒ ั‚ะพ ะฒั€ะตะผั ะบะฐะบ ะตะณะพ ะพะฑะฒะพะปะฐะบะธะฒะฐัŽั‰ะธะน ะณะพะปะพั ะฒ ะพั‡ะตั€ะตะดะฝะพะน ั€ะฐะท ะฒะทัะป ะผะตะฝั ะฒ ัะฒะพะน ะฟะปะตะฝ. - ะ”ะฐ, ะฟะพะถะฐะปัƒะน. - ะฟะพัะฟะตัˆะฝะพ ะพั‚ะฒะตั‚ะธะป ั, ะธ ะดะพะฑะฐะฒะธะป ะบะฐะบ-ะฑั‹ ะผะตะถะดัƒ ะฟั€ะพั‡ะธะผ, - ะšะฐะบ ั‚ั‹ ะผะตะฝั ั‚ะพะณะดะฐ ะฝะฐะทะฒะฐะป? - ะ–ะณัƒั‡ะตะน ะฑะตัั‚ะธะตะน, - ะผะฐะณ ะฑัƒะดั‚ะพ ะถะดะฐะป ัั‚ะพะณะพ ะฒะพะฟั€ะพัะฐ. ะฏ ัะบะพั€ั‡ะธะป ัะบะตะฟั‚ะธั‡ะตัะบัƒัŽ ั„ะธะทะธะพะฝะพะผะธัŽ, ะฐ ะฒะพั‚ ะ‘ะตะนะฝ ะฒั‹ะฟั€ัะผะธะปัั ะธ ะฟะพัะตั€ัŒะตะทะฝะตะป. ะ˜ ะฝะตะพะถะธะดะฐะฝะฝะพ ะฟะพะปะพะถะธะป ั€ัƒะบัƒ ะฝะฐ ะผะพัŽ ะปะฐะดะพะฝัŒ, ะทะฐัั‚ะฐะฒะธะฒ ะผะตะฝั ะฝะตะฒะพะปัŒะฝะพ ะฝะฐะฟั€ัั‡ัŒัั. - ะ—ะฝะฐะตัˆัŒ ะปะธ, - ะพั‚ะบั€ะพะฒะตะฝะฝะพ ะฟั€ะธะทะฝะฐะปัั ะพะฝ, - ั ะฝะต ะทะฐะฝะธะผะฐัŽััŒ ะฑะปะฐะณะพั‚ะฒะพั€ะธั‚ะตะปัŒะฝะพัั‚ัŒัŽ ะธ ะฝะต ัะฟะฐัะฐัŽ ะพั‚ ัะผะตั€ั‚ะธ ััƒะผะตั€ะตั‡ะฝั‹ั… ะพั…ะพั‚ะฝะธะบะพะฒ... ะะฐ ัะตะบัƒะฝะดัƒ ะพะฝ ะทะฐะผะพะปั‡ะฐะป, ะฒะธะดะธะผะพ ั€ะฐะทะผั‹ัˆะปัั, ัั‚ะพะธั‚ ะปะธ ะฟั€ะพะดะพะปะถะฐั‚ัŒ ะฟั€ะธะทะฝะฐะฝะธะต ะธะปะธ ะปัƒั‡ัˆะต ะทะฐะฑะธั‚ัŒ ะฟะพะบะฐ ะฝะต ะฟะพะทะดะฝะพ. ะฏ ั‡ัƒั‚ัŒ ะฝะฐะบะปะพะฝะธะป ะณะพะปะพะฒัƒ ะฒ ะทะฝะฐะบ ะบั€ะฐะนะฝะตะน ะทะฐะธะฝั‚ะตั€ะตัะพะฒะฐะฝะฝะพัั‚ะธ ะธ ะผะฐะณ, ะฒะทะดะพั…ะฝัƒะฒ, ะฟั€ะพะดะพะปะถะธะป. - ะœะฝะต ะฑั‹ะปะพ ะฝะฐะฟะปะตะฒะฐั‚ัŒ ะฝะฐ ะฟั€ะพััŒะฑัƒ ะฅะพะดะถะฐ ะธ ะฟะพะฝะฐั‡ะฐะปัƒ ั ะพั‚ะบะฐะทะฐะป ะตะผัƒ... ะะพ... ะฟะพัะปะต ั‚ะพะณะพ ะบะฐะบ ะพะฝ ะฝะฐะทะฒะฐะป ะผะฝะต ะธะผั... 
Had anyone else been in your place, I would not be here. Magnus pierced me with his gaze. And then I made up my mind. - You could have put it more simply, - I remarked. - Meaning? - I'm not indifferent to you either. The warlock's shoulders shook with laughter. - What's wrong this time? - I protested, freeing my hand. - You're not very eloquent either. But no matter. I didn't come here to converse. Bane ran his palm across my cheek, which sent heat through me, then leaned down and carefully touched my lips. I started to lean forward, demanding more, but he unexpectedly pulled back. - What now? - I rolled my eyes in displeasure. - I like making you angry, - Magnus admitted, curving those much-desired lips into a smirk. - You're so sexy when you get worked up. - You're just taking advantage of my helplessness. If I could get up... The warlock did not let me finish the threat and covered my lips with his with renewed passion. This time, overcoming the pain, I wrapped my arms around his neck and pulled him closer. He smiled slightly without breaking the kiss. My head kept spinning. At last, breathing heavily, Bane drew back.
- While you're in this state, don't count on anything more, - he threatened with a smile. - As if you were doing me a favor. - I took up the challenge. - You need this no less. - You grow bolder before my very eyes, - the warlock pronounced his verdict. - You still don't know me well. - Until we meet again, then. I confess, I'll miss you, - for the second time that evening Magnus Bane headed for the door. And this time I did not stop him. - Yes, me too, - I breathed out, regretting that I could not see his face now. The door closed behind Bane. I was left alone, and a couple of minutes later I blacked out again. Almost two months have passed since that day. I tried not to think about chances of meeting the warlock. I was too proud to take the first step... Or too shy, but all the same much in my life changed after our meeting. First of all, I began to look at Jace differently. He did not attract me the way he used to, though I undoubtedly still liked him. But my fantasies were occupied exclusively by that arrogant, flamboyant character - Magnus Bane. Even when I tried to mentally send him to hell and throw him out of my head, with time I only grew more convinced it was impossible.
- ะะปะตะบ, ั‚ั‹ ะทะดะตััŒ? - ะฒ ะดะฒะตั€ัั… ะฟะพัะฒะธะปัั ะ”ะถะตะนั. - ะงั‚ะพ ัะปัƒั‡ะธะปะพััŒ? - ั ะฝะต ะพะณะปัะฝัƒะปัั ะฝะฐ ะฝะตะณะพ: ะฝะต ั…ะพั‚ะตะป ะฒะธะดะตั‚ัŒ. "ะŸั€ะฐะฒะดะฐ?" - ะฅะพะดะถ ัะบะฐะทะฐะป - ะฒะตั‡ะตั€ะพะผ ะพั‚ะฟั€ะฐะฒะปัะตะผัั ะฝะฐ ะพั…ะพั‚ัƒ. ะขั‹ ั ะฝะฐะผะธ? - ะ•ัั‚ะตัั‚ะฒะตะฝะฝะพ, ั‡ั‚ะพ ะทะฐ ะฒะพะฟั€ะพัั‹? - ะŸั€ะพัั‚ะพ ั‚ั‹ ะฒั‹ะณะปัะดะธัˆัŒ ัƒัั‚ะฐะปั‹ะผ, - ั ัะพั‡ัƒะฒัั‚ะฒะธะตะผ ะทะฐะผะตั‚ะธะป ะฟะฐั€ะตะฝัŒ. "ะ‘ัƒะดั‚ะพ ั‚ะตะฑั ัั‚ะพ ะฒะพะปะฝัƒะตั‚", - ั ะณะพั€ะตั‡ัŒัŽ ะฟะพะดัƒะผะฐะป ั. - ะขะตะฑะต ะบะฐะถะตั‚ัั. ะŸะพะฒะธัะปะพ ะฝะฐะฟั€ัะถะตะฝะฝะพะต ะผะพะปั‡ะฐะฝะธะต. ะฏ ั…ะพั‚ะตะป, ั‡ั‚ะพะฑั‹ ะพะฝ ัƒัˆะตะป. ะ’ะฟะตั€ะฒั‹ะต ะฒ ะถะธะทะฝะธ ั ะฝะต ั…ะพั‚ะตะป ั ะฝะธะผ ั€ะฐะทะณะพะฒะฐั€ะธะฒะฐั‚ัŒ. ะก ั‡ะตะปะพะฒะตะบะพะผ, ะบะพั‚ะพั€ะพะณะพ ะตั‰ั‘ ะฝะตะดะฐะฒะฝะพ ะผะตั‡ั‚ะฐะป ะทะฐั‚ะฐั‰ะธั‚ัŒ ะฒ ะฟะพัั‚ะตะปัŒ. ะ‘ะพะถะต! - ะกะปัƒัˆะฐะน, - ะพะฝ ะฒะพัˆั‘ะป ะฒ ะบะพะผะฝะฐั‚ัƒ ะธ ะฟั€ะธะบั€ั‹ะป ะทะฐ ัะพะฑะพะน ะดะฒะตั€ัŒ, - ะผั‹ ะถะต ะดั€ัƒะทัŒั... ะผั‹ ะฑั€ะฐั‚ัŒั. ะ•ัะปะธ ัƒ ั‚ะตะฑั ะบะฐะบะธะต-ั‚ะพ ะฟั€ะพะฑะปะตะผั‹, ั‚ั‹ ะฝะต ะดะพะปะถะตะฝ ะดะตั€ะถะฐั‚ัŒ ะธั… ะฒ ัะตะฑะต. ะŸะพะณะพะฒะพั€ะธ ั ะฝะฐะผะธ. ะกะพ ะผะฝะพะน. ะฏ ัƒะฒะตั€ะตะฝ, ั‡ั‚ะพ ะฒะผะตัั‚ะต ะผั‹ ะปะตะณะบะพ ัะฟั€ะฐะฒะธะผัั ั ั‚ะฒะพะตะน... ะดะตะฟั€ะตััะธะตะน. ะ”ะถะตะนั ะฟะพะปะพะถะธะป ั€ัƒะบัƒ ะฝะฐ ะผะพะต ะฟะปะตั‡ะพ. ะ ะฐะฝัŒัˆะต ะพะฝ ั‡ะฐัั‚ะพ ั‚ะฐะบ ะดะตะปะฐะป ะธ ะผะฝะต ัั‚ะพ ะฝั€ะฐะฒะธะปะพััŒ, ะฝะพ ัะตะนั‡ะฐั ะตะณะพ ะฒะผะตัˆะฐั‚ะตะปัŒัั‚ะฒะพ ะฒ ะผะพัŽ ะปะธั‡ะฝัƒัŽ ะถะธะทะฝัŒ ะปะธัˆัŒ ะฝะฐะณะฝะตั‚ะฐะปะพ ัะธั‚ัƒะฐั†ะธัŽ. ะ’ ะฟะพัะปะตะดะฝะตะต ะฒั€ะตะผั ั ะฒัะตะผะธ ัะธะปะฐะผะธ ะฟั‹ั‚ะฐะปัั ัƒะฝัั‚ัŒ ัƒะถะต ะผะฝะพะณะพ ะปะตั‚ ะผัƒั‡ะธะฒัˆะธะต ะผะตะฝั ั‡ัƒะฒัั‚ะฒะฐ ะบ ะฟั€ะธั‘ะผะฝะพะผัƒ ะฑั€ะฐั‚ัƒ. ะ‘ะตะนะฝ ะดะพะปะถะตะฝ ะฑั‹ั‚ัŒ ัƒะฒะตั€ะตะฝ ะฒ ั‚ะพะผ, ั‡ั‚ะพ ะบั€ะพะผะต ะฝะตะณะพ ะผะฝะต ะฝะธะบั‚ะพ ะฝะต ะฝัƒะถะตะฝ. - ะะตั‚ ัƒ ะผะตะฝั ะฝะธะบะฐะบะธั… ะฟั€ะพะฑะปะตะผ, ั ะฟั€ะพัั‚ะพ ั…ะพั‡ัƒ ะฟะพะฑั‹ั‚ัŒ ะพะดะธะฝ. 
Can you leave me in peace? - I tried to say it as gently as possible. "Actually, he wants to help. It's not his fault that I'm gay". - I understand your feelings... - he began cautiously. - No, - I cut him off, suddenly turning to face him. He even stepped back. - You don't understand me, and I don't understand you and Clary. That's perfectly logical; these are completely different feelings, and no one blames you for what I'm going through, but I will sort it out myself. I know what I want. - All right, - Jace lowered his eyes. I had never seen that from him. - Your business, - he shrugged and, stepping out the door, left me in solitude once again. I don't know how long I stood there wondering whether I had done the right thing and whether I had hurt my brother's feelings. The door to the room opened. I decided Jace had come back, threw him a displeased look and... froze in astonishment. On the threshold stood Magnus Bane, dressed, as always, in his finest tradition: tight low-waisted gray trousers with a wide glitter-strewn belt, a white tank top, and a leather jacket with countless chains and studs.
ะ‘ั€ะพัะบะธะน ะผะฐะบะธัะถ ัะพัั‚ะฐะฒะปัะป ั‚ะตะผะฝัƒัŽ ะฟะพะดะฒะพะดะบัƒ ะณะปะฐะท ัะฒะตั€ั…ัƒ ะธ ะทะพะปะพั‚ะธัั‚ะพ-ะฟะตั€ะปะฐะผัƒั‚ั€ะพะฒัƒัŽ, ะฟะพั‚ะพะปั‰ะต, ัะฝะธะทัƒ. ะฏ ะฟะพะฝะธะผะฐะป, ั‡ั‚ะพ ะฝัƒะถะฝะพ ะบะฐะบ-ั‚ะพ ะฟะพะฟั€ะธะฒะตั‚ัั‚ะฒะพะฒะฐั‚ัŒ ะณะพัั‚ั, ะฝะพ, ะฑัƒะดั‚ะพ ะฟั€ะพะณะปะพั‚ะธะฒ ัะทั‹ะบ, ะฟั€ะพะดะพะปะถะฐะป ะฑะตะทะผะพะปะฒะฝะพ ะฟัะปะธั‚ัŒัั ะฝะฐ ะฝะตะณะพ. - ะŸั€ะธัˆะตะป ะฟั€ะพะฒะตั€ะธั‚ัŒ, ะบะฐะบ ั‚ั‹ ัะตะฑั ั‡ัƒะฒัั‚ะฒัƒะตัˆัŒ, - ะฑะตะท ะปะธัˆะฝะธั… ั„ะพั€ะผะฐะปัŒะฝะพัั‚ะตะน ัะพะพะฑั‰ะธะป ะ‘ะตะนะฝ. "ะœะพะต ัะตั€ะดั†ะต ัƒะฟะฐะปะพ ะฝะฐ ะฟะพะป ะธ ะพั‚ะบะฐั‚ะธะปะพััŒ ะบ ะตะณะพ ะฝะพะณะฐะผ, ะพัั‚ะฐะฒะปัั ะบั€ะพะฒะฐะฒัƒัŽ ะฟะพะปะพััƒ". - ะ•ัะปะธ ะฑั‹ ั‚ั‹ ั…ะพั‚ะตะป ะฟั€ะพะฒะตั€ะธั‚ัŒ ะบะฐะบ ั ัะตะฑั ั‡ัƒะฒัั‚ะฒัƒัŽ - ะฟั€ะธัˆั‘ะป ะฑั‹ ั€ะฐะฝัŒัˆะต. - ั ะฝะพั‚ะบะฐะผะธ ะพะฑะธะดั‹ ะพั‚ะฒะตั‚ะธะป ั. - ะขะตะฑะต ะฟั€ะพัั‚ะพ ะทะฐะฝัั‚ัŒัั ะฝะตั‡ะตะผ? ะ ะตัˆะธะป ะฟั€ะพะณัƒะปัั‚ัŒัั ะดะพ ะธะฝัั‚ะธั‚ัƒั‚ะฐ? - ะ ะผะพะณ ะฑั‹ ะฒะพะพะฑั‰ะต ะฝะต ะฟั€ะธะดั‚ะธ, - ะทะฐะผะตั‚ะธะป ะผะฐะณ. ะขะฐะบะฐั ะฟะตั€ัะฟะตะบั‚ะธะฒะฐ ะผะตะฝั ัะพะฒะตั€ัˆะตะฝะฝะพ ะฝะต ะฒะดะพั…ะฝะพะฒะธะปะฐ. - ะฃ ะผะตะฝั ั‚ะพะถะต ะตัั‚ัŒ ั‡ัƒะฒัั‚ะฒะฐ, ะะปะตะบ. ะฏ ะผะพะณ ะบะพะปะตะฑะฐั‚ัŒัั, ัะพะผะฝะตะฒะฐั‚ัŒัั, ะฟั€ะฐะฒะธะปัŒะฝะพ ะปะธ ะฟะพัั‚ัƒะฟะฐัŽ ะธ ั…ะพั‡ัƒ ะปะธ ัั‚ะพะณะพ ะฝะฐ ัะฐะผะพะผ ะดะตะปะต. - ะœะฐะณะฝัƒั ะฟะพะดะพัˆะตะป ะฑะปะธะถะต ะธ ะพัั‚ะฐะฝะพะฒะธะปัั ะฝะฐะฟั€ะพั‚ะธะฒ ะผะตะฝั. ะฏ ัั‚ะพัะป ะพะฟัƒัั‚ะธะฒ ะณะพะปะพะฒัƒ ะธ ะฟะพั‚ัƒะฟะธะฒ ะฒะทะณะปัะด. ะšั‚ะพ ั ั‚ะฐะบะพะน, ั‡ั‚ะพะฑั‹ ะพะฑะฒะธะฝัั‚ัŒ ะตะณะพ ะฒ ั‡ะตะผ-ั‚ะพ? ะญะณะพะธัั‚. ะ’ัั‘ ัั‚ะพ ะฒั€ะตะผั, ัั‚ั€ะฐะดะฐั ะธ ะผะตั‡ั‚ะฐั ะพ ะฒัั‚ั€ะตั‡ะต, ั ะฝะธ ะฝะฐ ัะตะบัƒะฝะดัƒ ะฝะต ะทะฐะดัƒะผะฐะปัั ะพ ั‚ะพะผ, ั‡ั‚ะพ ั‡ัƒะฒัั‚ะฒัƒะตั‚ ะพะฝ. ะก ะบะฐะบะธะผ ั‚ั€ัƒะดะพะผ ะตะผัƒ ะดะฐะตั‚ัั ะฟั€ะธะทะฝะฐะฝะธะต ัะฐะผะพะผัƒ ัะตะฑะต. ะฃ ะฝะตะณะพ ั‚ะพะถะต ะตัั‚ัŒ ะณะพั€ะดะพัั‚ัŒ, ะฝะฐ ะบะพั‚ะพั€ัƒัŽ ะตะผัƒ ะฟั€ะธัˆะปะพััŒ ะฝะฐัั‚ัƒะฟะธั‚ัŒ, ั‡ั‚ะพะฑั‹ ัะฒะธั‚ัŒัั ััŽะดะฐ. 
To me. Again. - You would never have come to me yourself, - right on target again. "Yes". - Forgive me, - I whisper barely audibly. The warlock studies me with a cat'
- source_sentence: 'And what if we celebrated New Year's together, hm? You know, I wouldn't want us to spend it as guests at either of our places, no. That would be running around, and fuss besides, cooking and all the rest, and that's not quite what I'd want, not the very thing, not the ideal-unideal-simple-unsimple. Maybe, if some powers above let us go to friends, we'd sit around, we'd chat. That would be great, you know? Just sitting and saying whatever you want, and then we'd go off to a room, grabbing some drinks and treats. Whatever you'd choose, because you know - I can't hold my liquor. Baileys, maybe? And I'd either not drink more than a sip, or drink a great deal. I've loved chocolate since not so long ago. We'd sit on the bed, and you'd take off your heels, high and very cool, complaining to me about how tired and sore they make you. I'd sympathize, but still wouldn't be able to resist gloating that I'm wearing flats. Maybe even just socks, since we're at friends'.
You'd put your legs up on the bed, maybe bend them at the knees, and that would look very sweet in a luxurious dress. We'd be very hot, but, you know, not because we're close to each other or anything like that, no, simply because the chaotic and sometimes crowded New Year always does that. And yet it's supposedly a winter holiday. I'd want to hold your hand, because I love your tiny hands, like a child's, except with a grown-up manicure and grown-up rings that get so much in the way of squeezing a hand tightly. And you'd allow it, wouldn't you? Hah, maybe you'd even poke me with your long nails; we'd start tickling each other, of that I'm sure, and it's hot enough as it is, ah. Hair disheveled, damp with sweat, ugh. But great. We'd chat all night, we do love that, right? Maybe we'd recall some of our old inside jokes from the internet, recall how we talked before something bigger began to sprout between us than "Hi, klo*, how's it going? Fine, you?". We'd sit with our gadgets, I think, look through photos.
ะ•ัะปะธ ะฑั‹ ัƒ ะผะตะฝั ะฟะพะฟะฐะดะฐะปะพััŒ ั‡ั‚ะพ-ั‚ะพ ั‚ะฐะบะพะต, ั‡ะตะณะพ ะฝะต ั…ะพั‚ะตะปะพััŒ ะฑั‹ ะฟะพะบะฐะทั‹ะฒะฐั‚ัŒ ั‚ะตะฑะต - ะฝะตัƒะดะฐั‡ะฝะพะต ั„ะพั‚ะพ, ะบ ะฟั€ะธะผะตั€ัƒ - ั ะฝะฐั‡ะฐะปะฐ ะฑั‹ ะฟั€ัั‚ะฐั‚ัŒ ั‚ะตะปะตั„ะพะฝ, ะฐ ั‚ั‹ ะฟั‹ั‚ะฐะปะฐััŒ ะฑั‹ ะพั‚ะฝัั‚ัŒ, ั‡ั‚ะพะฑั‹ ะฒัะต-ั‚ะฐะบะธ ะฟะพัะผะพั‚ั€ะตั‚ัŒ. ะ˜ ะฝะฐะพะฑะพั€ะพั‚. ะ ะฟะพะด ัƒั‚ั€ะพ ะผั‹ ัƒัะฝัƒะปะธ ะฑั‹. ะ˜, ะทะฝะฐะตัˆัŒ, ะผั‹ ะบะพะณะดะฐ-ั‚ะพ ะฝะฐะทั‹ะฒะฐะปะธ ัะตะฑั ะฟะฐั€ะพะน. ะœั‹ ะฝะธะบะพะณะดะฐ ะฝะต ะฑั‹ะปะธ ะฟะฐั€ะพะน, ั…ะฐั…? ะงะตั€ั‚ะพะฒัะบะธ, ั‡ะตั€ั‚ะพะฒัะบะธ ะณะปัƒะฟะพ. ะะพ ะบั‚ะพ ะผั‹ ั‚ะพะณะดะฐ? ะ ะฝะธะบั‚ะพ ะฝะต ะทะฝะฐะตั‚, ะฒะตะดัŒ, ะบะฐะบ ัะบะฐะทะฐะปะฐ ะบะฐะบ-ั‚ะพ ั€ะฐะท ั‚ั‹, ะผะพั ะปัŽะฑะธะผะฐั ะฒะฐะฝะธะปัŒะบะฐ ะธ ะปัŽะฑะธั‚ะตะปัŒะฝะธั†ะฐ ัั‚ะฐั‚ัƒัะพะฒ, "ะžะฟั€ะตะดะตะปะธั‚ัŒ - ะทะฝะฐั‡ะธั‚, ะพะณั€ะฐะฝะธั‡ะธั‚ัŒ". ะญั‚ะพ, ะฒั€ะพะดะต, ะžัะบะฐั€ ะฃะฐะนะปัŒะด, ะผะฝะต ะพะฝ ะฝั€ะฐะฒะธั‚ัั. *ะฟั€ะพะธะทะฒะพะดะฝะพะต ะพั‚ "ะบะปะพะฝ", ะฐะฝะฐะปะพะณ "ะฑั€ะพ". ะกะผั‹ัะป ะฒ ั‚ะพะผ, ั‡ั‚ะพ ะฑั‹ะปะพ ะผะฝะพะณะพ ะพะฑั‰ะตะณะพ. ะžะฟะธัะฐั‚ัŒ ะฟั€ะพะณัƒะปะบัƒ? ะŸัƒัั‚ัŒ ัั‚ะพ ะฑัƒะดะตั‚ ะปะตั‚ะพ. ะ–ะฐั€ะบะธ ะŸะธั‚ะตั€, ะผ? ะžั‡ะตะฝัŒ ะถะฐั€ะบะธะน, ะบะฐะบ ะฒ ะฟั€ะพัˆะปะพะผ ะณะพะดัƒ, ะบะพะณะดะฐ ะดั‹ัˆะฐั‚ัŒ ะฑั‹ะปะพ ะฝะตั‡ะตะผ, ะธ ั ะฑัƒะบะฒะฐะปัŒะฝะพ ะบัƒะฟะฐะปะฐััŒ ะฒ ะะตะฒะต ะธ ั…ะพะดะธะปะฐ ั‚ะฐะบ ะผะพะบั€ะพะน ะฟะพ ะณะพั€ะพะดัƒ. ะ’ ะ‘ัƒั€ะณะตั€ ะšะธะฝะณะต ะฝะฐ ัั‚ะพ ะดะฐะถะต ะฝะต ะพัะพะฑะพ ะพะฑั€ะฐั‚ะธะปะธ ะฒะฝะธะผะฐะฝะธะต, ะบะฐะบ ั ะฟะพะผะฝัŽ - ะฟะพะฝะธะผะฐะปะธ. ะœั‹ ะฑั‹ ะฒั‹ัˆะปะธ ะธะท ะดะพะผะฐ ะธ ะฟะพะตั…ะฐะปะธ ะฑั‹ ะบ ะผะตั‚ั€ะพ ะฝะฐ ะผะฐั€ัˆั€ัƒั‚ะบะต. ะะฐ ะผะฐั€ัˆั€ัƒั‚ะบะต, ั‚ะฐะบ ะบะฐะบ ะผั‹ ะฑั‹ ะฟั€ะพัั‚ะพ ะฝะต ะฒั‹ะถะธะปะธ ะฟะพัะปะต ัะพั€ะพะบะฐะผะธะฝัƒั‚ะฝะพะน ั…ะพะดัŒะฑั‹ ะพั‚ ะผะพะตะณะพ ะดะพะผะฐ ะดะพ ะผะตั‚ั€ะพ, ั…ะพั‚ัŒ ั ัƒะถะต ะธ ะฟั€ะพั…ะพะดะธะปะฐ ั‡ะตั€ะตะท ั‚ะฐะบะพะต. ะ˜ ัั‚ะพ ัƒะถะฐัะฝะพ. 
In the metro we'd buy tokens and argue a little about Petersburg tokens versus Moscow cards; I'd win. Because, you know, tokens circulate in a closed loop; you don't really have to keep producing them. Cards, on the other hand, are single-use: they get thrown away and new ones have to be made. We'd leave it at that and keep joking about the "feud" between Piter and Moscow - our favorite. And, you know, Piter is better. Entering the car, we'd sit down, since the station where I live is nearly the end of the line. One of us would rest her head on the other's shoulder; it would be very sweet and, possibly, uncomfortable. In favor of your metro you'd also say that you have WiFi and an endless line. I'd agree, since I think so too. Too much about the metro, isn't it, hah? We'd get off at the right station and be dying from the heat, lugging our bags. Entering the Galeria mall (ours is big and crowded; I call it "the place of all meetings"), we would, I think, head to the next-to-last floor, to the food court. We'd take a long time deciding where to eat, although... no, no, not long. You'd take everything into your own hands and drag me to... not to McDonald's; you wouldn't want to stand in line.
Maybe blini, or kroshka-kartoshka? I don't know. Perhaps we'd pick Burger King or KFC, since there you can refill your water any number of times. We'd take one cup for the two of us and pour Sprite, our special drink. We'd have two straws in it. And also, we'd hum with pleasure, sipping the blissful and, most importantly, cold Sprite. It would be sweet; we'd even say so out loud. Having finished eating, we'd fill a full cup of Sprite once more and go wandering around the shopping complex. Let's imagine we have lots of money with us, hm? Next to the food we'd stop into the section with lots of sweets: lollipops, gummies, caramels, chewing gum, sweet "belts", candies. We'd grab lots and lots of everything, so we could walk around the shops eating it all. We'd also get something for my niece, who would be sure to ask: "And what did you buy me?". We'd wander the departments; I love "H&M" and "RESERVED" and don't love "Zara". Though, most likely, it's precisely that one you love most of all. But I'd be terribly bored in it. I'd buy myself some T-shirts, and you, shoes. I know how you love shoes.
ะฏ ะฟะพัั‚ะพัะฝะฝะพ ะฑั‹ ะฝะฐะฟะฐะดะฐะปะฐ ะฝะฐ ะบะปะตั‚ะบัƒ, ะฒัะฟะพะผะธะฝะฐั ะ’ะธะฝั‡ะตัั‚ะตั€ะพะฒ*, ั‚ั‹ ะฑั‹ ั‚ะพะถะต ัะพ ะผะฝะพะน ัˆัƒั‚ะธะปะฐ ะฝะฐ ัั‚ัƒ ั‚ะตะผัƒ, ั‚ะพะปัŒะบะพ ะฑะตะท ะปะธัˆะฝะตะณะพ ั„ะฐะฝะฐั‚ะธะทะผะฐ ะธ ะฟะพัะผะตะธะฒะฐัััŒ ะฝะฐะดะพ ะผะฝะพะน. ะ ั ะฝะฐัˆะปะฐ ะฑั‹ ะธ ะปัŽะฑะธะผัƒัŽ ั€ัƒะฑะฐัˆะบัƒ ะœะธัˆะธ, ะธ ะปัŽะฑะธะผัƒัŽ ัˆะฐะฟะบัƒ ะ”ะถะฐั€ะตะดะฐ, ะธ ะปัŽะฑะธะผั‹ะน ะดะถะตะผะฟะตั€ ะ”ะถะตะฝัะตะฝะฐ. ะฃัั‚ะฐะฒัˆะธะต, ะผั‹ ะฒะตั€ะฝัƒะปะธััŒ ะฑั‹ ะดะพะผะพะน ะธ, ะฑั‹ะปะพ ะฑั‹ ะปะพะณะธั‡ะฝะพ ะฟั€ะตะดะฟะพะปะพะถะธั‚ัŒ, ั‡ั‚ะพ ะผั‹ ะทะฐะฒะฐะปะธะผัั ะฒ ะบั€ะพะฒะฐั‚ะบัƒ. ะะพ ะฝะตั‚, ะบะพะฝะตั‡ะฝะพ, ะฝะธ ะฒ ะบะพะตะผ ัะปัƒั‡ะฐะต. ะœั‹ ะปัะถะตะผ ะพั‡ะตะฝัŒ ะฟะพะทะดะฝะพ, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ะฝะตะปัŒะทั ัะฟะฐั‚ัŒ, ะบะพะณะดะฐ ะผั‹ ั€ัะดะพะผ, ะฒะพ ัะฝะต ะผั‹ ั‚ะตั€ัะตะผ ะฒั€ะตะผั. ะ”ะฐ? *ะฟะตั€ัะพะฝะฐะถะธ ัะตั€ะธะฐะปะฐ "ะกะฒะตั€ั…ัŠะตัั‚ะตัั‚ะฒะตะฝะฝะพะต". ะ”ะฐะปัŒัˆะต ะธะดะตั‚ ะฟะตั€ะตั‡ะธัะปะตะฝะธะต ะฒะตะดัƒั‰ะธั… ะฐะบั‚ะพั€ะพะฒ ะดะฐะฝะฝะพะณะพ ัˆะพัƒ. ะฏ ัะพัะบัƒั‡ะธะปะฐััŒ, ั‚ะฐะบ ะดะฐะฒะฐะน ะฟะพะฑัƒะดะตะผ ะฒะผะตัั‚ะต ะฒ ะฟะธััŒะผะตะฝะฝะพะน ั„ะพั€ะผะต ะตั‰ะต, ั‚ั‹ ะฝะต ะฟั€ะพั‚ะธะฒ? ะขะพะปัŒะบะพ ั ะฝะต ะทะฝะฐัŽ, ะบัƒะดะฐ ะฝะฐะผ ะฟะพะนั‚ะธ. ะ ะดะฐะฒะฐะน ะฟั€ะพัั‚ะพ ะปะตะถะฐั‚ัŒ ะฝะฐ ะบั€ะพะฒะฐั‚ะธ. ะขะฐะบ ะฒะพั‚, ะผั‹ ะฟั€ะพัั‚ะพ ะปะตะถะธะผ, ะธ ะผะตะถะดัƒ ะฝะฐะผะธ ะผะพะปั‡ะฐะฝะธะต, ะฝะฐัˆะต ะปัŽะฑะธะผะพะต ะผะพะปั‡ะฐะฝะธะต. ะขะฐ ั‚ะธัˆะธะฝะฐ, ะบะพั‚ะพั€ัƒัŽ ะฝะตะปัŒะทั ะฝะฐะทะฒะฐั‚ัŒ ะฝะตะปะพะฒะบะพะน. ะฃ ะฝะฐั ะฝะตั‚ ั‚ะฐะบะพะน. ะกะบะพั€ะตะต, ั€ะพะดะฝะฐั, ั‚ะตะฟะปะฐั, ะฟั€ะธะฒั‹ั‡ะฝะฐั, ัƒัŽั‚ะฝะฐั. ะ”ะฐะถะต ะบะพะณะดะฐ ะผั‹ ั€ะฐะทะณะพะฒะฐั€ะธะฒะฐะตะผ ะพ ั‡ะตะผ-ั‚ะพ ะดะตะนัั‚ะฒะธั‚ะตะปัŒะฝะพ, ั‚ะฐะบ ัะบะฐะทะฐั‚ัŒ, ะฝะตะปะพะฒะบะพะผ ะธ ะฒะพะปะฝะธั‚ะตะปัŒะฝะพะผ, ะผั‹ ะผะพะถะตะผ ะทะฐะผะพะปั‡ะฐั‚ัŒ, ะธ ัั‚ะพ ะฝะต ะฑัƒะดะตั‚ ะพะฟัั‚ัŒ ะถะต ั‡ะตะผ-ั‚ะพ ะฝะตัƒะผะตัั‚ะฝั‹ะผ. ะฏ ะฑั‹ ัะธะดะตะปะฐ ะฒ ะฐะนะฟะฐะดะต, ะดัƒะผะฐัŽ, ะฐ ั‚ั‹ ะฒ ั‚ะตะปะตั„ะพะฝะต. 
Oh yes, we'd argue about which is better - Apple or Android. We'd snatch each other's gadgets, hunting for inconveniences and flaws in them. I hope I won that imaginary argument. Damn, or maybe we'll watch a film or a series? So many times in our chats I expressed the wish to watch something with you in person; I wanted it so much, and I still do. Even at a distance we managed a few times to watch series and films that way. And so often we never finished; you're always running off somewhere on errands. Of the two of us, only I am the slacker who sits at home, leaving the premises only for school. Hm, so what would we watch? Let's do "Faking it"; I've long wanted to see it. Or, if you allow me to put on something I've already seen, I'll sit you down to watch "Shameless". I really want you to see it. I'd try not to spoil it, but it's good that you take spoilers calmly, and I needn't fear for my health if I let something slip. I'll be saying: "Here, here, watch, something's about to happen". And you, I hope, will shut me up. That somehow sounded as if with a kiss, but no. Oh, and we'll also have chips and "our Sprite".
ะฅะพั‚ั, ั ะปัƒั‡ัˆะต ะฒะพะทัŒะผัƒ ั‡ั‚ะพ-ั‚ะพ ะผัƒั‡ะฝะพะต, ะฐ ะฝะต ั‡ะธะฟัั‹. ะะพ ั‚ั‹, ั ะทะฝะฐัŽ, ะธั… ะปัŽะฑะธัˆัŒ. ะ•ัะปะธ ะผั‹ ะฟั€ะพัะธะดะธะผ ั‚ะฐะบ ะดะพ ัะฐะผะพะน ะฝะพั‡ะธ, ะผั‹ ะทะฐั…ะพั‚ะธะผ ัะฟะฐั‚ัŒ, ะฝะฐะฒะตั€ะฝะพะต. ะ˜, ะตัะปะธ ะผั‹ ะฝะต ะฒั‹ะบะปัŽั‡ะธะผ ะธ ะฝะพั€ะผะฐะปัŒะฝะพ ะฝะต ะปัะถะตะผ, ั‚ะพ ะผั‹ ะฟั€ะพัั‚ะพ ัƒัะฝะตะผ ะฝะฐ ั„ะพะฝะต ะณะพะปะพัะพะฒ ะะพัะปั, ะšัะผะฐ, ะญะผะผั‹ ะธ ะฟั€ะพั‡ะธั… ะปัŽะดะตะน ะฝะฐะผ ะทะฝะฐะบะพะผั‹ั…-ะฝะตะทะฝะฐะบะพะผั‹ั…. ะงะตั€ั‚, ะฝะตั‚, ะผั‹ ะถะต ะฑัƒะดะตะผ ัะผะพั‚ั€ะตั‚ัŒ ะฒ ะพะทะฒัƒั‡ะบะต. ะšะฐะบ ะถะฐะปัŒ. ะฏ ะฟั€ะพัะฝัƒััŒ ั€ะฐะฝัŒัˆะต, ั‡ะตะผ ั‚ั‹, ะฝะพ ะฝะต ะฒัั‚ะฐะฝัƒ, ั ะฑะดัƒ ะถะดะฐั‚ัŒ, ะฟะพะบะฐ ะฟั€ะพัะฝะตัˆัŒัั ั‚ั‹. ะŸะพัะธะถัƒ ะฒ ะฐะนะฟะฐะดะต, ะผะพะถะตั‚. ะšะพะณะดะฐ ะถะต ั‚ั‹ ะฟั€ะพัะฝะตัˆัŒัั, ั ัะบะฐะถัƒ, ั‡ั‚ะพ ั‚ั‹ ะฟั€ะพัั‚ะพ ะฝะตะฒะตั€ะพัั‚ะฝะฐั ัะพะฝั. ะฏ, ะบัั‚ะฐั‚ะธ, ั…ะพั‡ัƒ, ั‡ั‚ะพ ะฑั‹ ะธะผะตะฝะฝะพ ั‚ั‹ ัะดะตะปะฐะปะฐ ะทะฐะฒั‚ั€ะฐะบ. ะขั‹ ะถะต ัƒะผะตะตัˆัŒ ะณะพั‚ะพะฒะธั‚ัŒ, ั‚ั‹ ะฝะต ั.' sentences: - 'ะะต ัะปัƒัˆะฐะน ะผะตะฝั, ะฟั€ะพัˆัƒ! ะฏ ั…ะพั‡ัƒ ั‚ะตะฑะต ะบะพะต-ั‡ั‚ะพ ัะบะฐะทะฐั‚ัŒ, ะฟะพัั‚ะพะผัƒ ะฝะต ัะปัƒัˆะฐะน ะผะตะฝั. ะฏ ั…ะพั‡ัƒ ัั‚ะพ ัะบะฐะทะฐั‚ัŒ ั‚ะตะฑะต, ะฟั€ัะผะพ ั‚ะตะฑะต, ะฝะพ ั‡ั‚ะพะฑั‹ ั‚ั‹ ะฝะต ัะปั‹ัˆะฐะป. ะ˜ะปะธ ั‡ั‚ะพะฑั‹ ะทะฐะฑั‹ะป ัะฟัƒัั‚ั ะผะณะฝะพะฒะตะฝะธะต. ะฏ ะ›ะฎะ‘ะ›ะฎ ะขะ•ะ‘ะฏ ะšั€ะธั‡ัƒ ัั‚ะพ. ะฃ ัะตะฑั ะฒ ะณะพะปะพะฒะต. ะ—ะฝะฐะบะพะผั‹ะผ. ะะต ั‚ะตะฑะต. ะ’ัะตะผ, ะฒัะตะผัƒ, ะฒัะตะณะดะฐ, ะฝะพ ะฝะต ั‚ะตะฑะต, ะฝะพ ะฝะธะบะพะณะดะฐ ั‚ะตะฑะต. ะ“ะพัะฟะพะดะธ, ะฝะต ัะปัƒัˆะฐะน ะผะตะฝั. ะžั‚ะฒะปะตะบะธััŒ ะพั‚ ะผะตะฝั ัะตะนั‡ะฐั ะถะต, ั‡ั‚ะพะฑั‹ ั ะผะพะณ ัะบะฐะทะฐั‚ัŒ ั‚ะตะฑะต ั‚ะพ, ั‡ั‚ะพ ั…ะพั‡ัƒ, ะธ ั‚ะพ, ั‡ั‚ะพ ั‡ัƒะฒัั‚ะฒัƒัŽ. ะขะธั…ะพ. ะ’ะฝัƒั‚ั€ะธ ัั‚ะพ ะฑัƒะดะตั‚ ะพั‡ะตะฝัŒ ะณั€ะพะผะบะพ, ะฝะพ ะฝะต ะผะพะณัƒ ั ั‚ะฐะบะพะน ะณั€ะพะผะบะพัั‚ัŒัŽ ัะบะฐะทะฐั‚ัŒ ัั‚ะพ ะฒัะปัƒั…, ะฝะต ะฟั€ะธ ั‚ะตะฑะต. ะขั‹ ะดะพะปะถะตะฝ ะพั‚ะฒะปะตั‡ัŒัั, ั‚ะพะณะดะฐ ั ัะบะฐะถัƒ ั‚ะธั…ะพ, ั‚ั‹ ะฝะต ัƒัะปั‹ัˆะธัˆัŒ. 
ะœะพะถะตั‚, ั‚ั‹ ัะฟั€ะพัะธัˆัŒ: "ะงั‚ะพ ั‚ั‹ ัะบะฐะทะฐะป?" ะ˜ ั ะพั‚ะฒะตั‡ัƒ: "ะะธั‡ะตะณะพ, ะฝะธั‡ะตะณะพ". ะ˜ ั‚ั‹ ะฝะต ะฟะตั€ะตัะฟั€ะพัะธัˆัŒ ัะฝะพะฒะฐ, ะฝะต ัั‚ะฐะฝะตัˆัŒ ะฟั€ะพัะธั‚ัŒ ัะบะฐะทะฐั‚ัŒ, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ั‚ั‹ ะฟั€ะธะฒั‹ะบ. ะฏ ั‡ะฐัั‚ะพ ะณะพะฒะพั€ัŽ ัะตะฑะต ะฟะพะด ะฝะพั ะธ ะฝะต ะพั‚ะฒะตั‡ะฐัŽ, ั‡ั‚ะพ ัะบะฐะทะฐะป, ะบะพะณะดะฐ ะฟะตั€ะตัะฟั€ะฐัˆะธะฒะฐะตัˆัŒ. ะขะตะฑั ัั‚ะพ ั‚ะฐะบ ะฑะตัะธะปะพ. ะœะพะถะตั‚, ะธ ัะตะนั‡ะฐั ะฑะตัะธั‚, ะฝะพ ั‚ั‹ ัƒะถะต ะฝะต ะพะฑั€ะฐั‰ะฐะตัˆัŒ ะฒะฝะธะผะฐะฝะธั. ะฃ ะผะตะฝั ะตัั‚ัŒ ัั‚ะพะปัŒะบะพ ะฒัะตะณะพ ั‚ะตะฑะต ัะบะฐะทะฐั‚ัŒ. ะะพ ะบะพะณะดะฐ ั‚ั‹ ะณะพะฒะพั€ะธัˆัŒ: "ะัƒ, ั€ะฐััะบะฐะถะธ ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ. ะงั‚ะพ ัƒะณะพะดะฝะพ. ะ”ะฐะฒะฐะน ะถะต". ะฏ ะทะฐะผะพะปะบะฐัŽ. ะฏ ะฝะต ะทะฝะฐัŽ, ั‡ั‚ะพ ะณะพะฒะพั€ะธั‚ัŒ. ะกะฟัƒัั‚ั ะผะฝะพะณะพ ะผะธะฝัƒั‚ ั ะผะพะณัƒ ั€ะฐััะบะฐะทะฐั‚ัŒ ั‡ั‚ะพ-ั‚ะพ. ะะฐะฟั€ะธะผะตั€, ะบะฐะบะพะน-ั‚ะพ ะณะปัƒะฟั‹ะน ัะพะฝ ะธะปะธ ะฒะฟะตั‡ะฐั‚ะปะตะฝะธั ะพ ั‚ะพะปัŒะบะพ ั‡ั‚ะพ ะฟั€ะพั‡ะธั‚ะฐะฝะฝะพะน ะบะฝะธะณะต. ะ ั‚ั‹ ะปัŽะฑะธัˆัŒ ะบะฝะธะณะธ. ะ“ะพัะฟะพะดะธ, ะบะฐะบ ั ะฝะตะฝะฐะฒะธะถัƒ ั‚ะพ, ั‡ั‚ะพ ั‚ั‹ ะฝะตะฝะฐะฒะธะดะธัˆัŒ ัะตะฑั! ะฅะฒะฐั‚ะธั‚! ะŸั€ะตะบั€ะฐั‚ะธ! ะœะฝะต ะฑะพะปัŒะฝะพ ะฝะฐ ัั‚ะพ ัะผะพั‚ั€ะตั‚ัŒ. ะะพ ะฟั€ะตะบั€ะฐั‚ะธ ัั‚ะพ ะดะตะปะฐั‚ัŒ ะฝะต ั€ะฐะดะธ ะผะตะฝั, ะฐ ั€ะฐะดะธ ัะตะฑั. ะขั‹ ะถะต ะทะฝะฐะตัˆัŒ, ั‚ะพะปัŒะบะพ ั‚ะฐะบ ัั‚ะพ ะฑัƒะดะตั‚ ะฟั€ะฐะฒะธะปัŒะฝะพ. ะะต ะฝะตะฝะฐะฒะธะดัŒ ัะตะฑั, ะฟะพะถะฐะปัƒะนัั‚ะฐ, ะฝะต ะฒะธะฝะธ ัะตะฑั, ะบะพะณะดะฐ ะทะฝะฐะตัˆัŒ, ั‡ั‚ะพ ะฝะต ะฒะธะฝะพะฒะฐั‚. ะŸะพั‡ะตะผัƒ ะฒัะตะณะดะฐ ะฒัะต ะพะฑัั‚ะพัั‚ะตะปัŒัั‚ะฒะฐ ะธ ะฒะตััŒ ะผะธั€ ะฟั€ะพั‚ะธะฒ ั‚ะตะฑั, ะฐ? ะœะฝะต ั‚ะฐะบ ะถะฐะปัŒ. ะกะธะปัŒะฝะตะต ะปัŽะดะตะน ั ะฝะต ะฒัั‚ั€ะตั‡ะฐะป. ะ”ะฐ, ั ะฝะต ั‚ะฐะบ ะผะฝะพะณะพ ะปัŽะดะตะน ะฒัั‚ั€ะตั‡ะฐะป... ะะพ ั€ะฐะทะฒะต ะผะพะถะตั‚ ั‡ะตะปะพะฒะตะบ ะฑั‹ั‚ัŒ ะตั‰ะต ัะธะปัŒะฝะตะต? ะ”ะฐ ะผะพะถะตั‚, ะฝะฐะฒะตั€ะฝะพะต, ะบะพะผัƒ ั ะฒั€ัƒ, ะผะธั€ ะฑะพะปัŒัˆะพะน, ะปัŽะดะตะน ะผะฝะพะณะพ... ะฝะพ ะบ ั‡ะตั€ั‚ัƒ, ะฒะฐะถะฝะพ ะปะธ ัั‚ะพ? ะะตั‚. 
ะ’ะฐะถะฝะฐ ะปะธัˆัŒ ั‚ะฒะพั ัะธะปะฐ ะธ ะดัƒั…. ะขั‹ ะฒะพะธะฝ, ั‚ั‹ ะผะพะน ะฒะพะธะฝ. ะฃ ั‚ะตะฑั ะดะฐะถะต ะตัั‚ัŒ ั‚ะฐะบะฐั ั„ัƒั‚ะฑะพะปะบะฐ, ะฝะฐ ะฝะตะน ะฝะฐะฟะธัะฐะฝะพ "warrior". ะั…, ะบะฐะบ ะพะฝะฐ ะธะดะตั‚ ั‚ะตะฑะต. ะ’ะพะธะฝ. ะ‘ะพั€ะธััŒ, ะ”ะธะฝ, ะฑะพั€ะธััŒ, ะฝะต ัะดะฐะฒะฐะนัั. ะžั‚ะดั‹ั…ะฐะน, ั€ะฐััะปะฐะฑะปัะนัั, ะฝะพ ะฝะต ัะดะฐะฒะฐะนัั ะธ ะถะธะฒะธ, ะถะธะฒะธ, ะถะธะฒะธ. ะขั‹ ัะฐะผั‹ะน ะถะธะฒะพะน ั‡ะตะปะพะฒะตะบ ั ะผะตั€ั‚ะฒะตะฝะฝะพ ัƒัั‚ะฐะฒัˆะตะน ะดัƒัˆะพะน. ะะฐัั‚ะพะปัŒะบะพ ะถะธะฒะพะน, ั‡ั‚ะพ ั ั ั‚ะพะฑะพะน ั€ัะดะพะผ ั‡ัƒะฒัั‚ะฒัƒัŽ, ั‡ั‚ะพ ั‚ะพะถะต ะถะธะฒัƒ. ะขะฐะบะพะต ะพั‰ัƒั‰ะตะฝะธะต, ั‡ั‚ะพ ะดะพ ั‚ะตะฑั ั ะธ ะฝะต ะถะธะป. ะŸะพั‚ะพะผัƒ ั‡ั‚ะพ ะดะพ ั‚ะตะฑั ะฒัะต ะฑั‹ะปะพ ัะปะธัˆะบะพะผ ะธะฝะฐั‡ะต. ะขะตะฟะตั€ัŒ ั ะฒะธะถัƒ ะผะธั€ ะดั€ัƒะณะธะผะธ ะณะปะฐะทะฐะผะธ. ะœะพะถะตั‚, ะดะตะปะพ ะฒ ะฒะพะทั€ะฐัั‚ะต, ะฒั‹ั€ะพั ะธะปะธ ั‡ั‚ะพ. ะะพ ะฒัะต ะถะต ั‚ั‹ ะฟั€ะธะฝัะป ะพะณั€ะพะผะฝะพะต ัƒั‡ะฐัั‚ะธะต ะฒ ะฒั‹ัั‚ั€ะฐะธะฒะฐะฝะธะธ ะผะธั€ะฐ ะฒะพะบั€ัƒะณ ะผะตะฝั, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ั‚ั‹ ะพะณั€ะพะผะฝะฐั ั‡ะฐัั‚ัŒ ัั‚ะพะณะพ ะผะธั€ะฐ. ะญั‚ะพ ะผะพะถะตัˆัŒ ะดะฐะถะต ัƒัะปั‹ัˆะฐั‚ัŒ, ะฟะพัะปัƒัˆะฐั‚ัŒ. ะ”ะฐ ั ั‚ะตะฑะต ะดะฐะถะต ัั‚ะพ ะฟะธัะฐะป. ะั…, ะดะฐ ั ะธ ะณะพะฒะพั€ะธะป ั‚ะตะฑะต, ะธ ะฟะธัะฐะป, ั‡ั‚ะพ ะปัŽะฑะปัŽ. ะะพ ะผั‹ ะถะต ะดั€ัƒะทัŒั, ัั‚ะพ ะฝะพั€ะผะฐะปัŒะฝะพ. ะŸั€ะพัั‚ะพ ั‚ะพ "ั ะปัŽะฑะปัŽ ั‚ะตะฑั", ั‡ั‚ะพ ั…ะพั‡ัƒ ะฟั€ะพะบั€ะธั‡ะฐั‚ัŒ... ัั‚ะพ ะฝะต ั‚ะพ "ั ะปัŽะฑะปัŽ ั‚ะตะฑั", ะบะพั‚ะพั€ะพะต ั‚ั‹ ะณะพะฒะพั€ะธัˆัŒ ั ัƒะปั‹ะฑะบะพะน, ั€ะฐัั‚ั€ะตะฟั‹ะฒะฐั ะผะฝะต ะฒะพะปะพัั‹. ะฏ ะปัŽะฑะปัŽ ะฑั‹ั‚ัŒ ะฝะธะทะบะธะผ. ะ ัะดะพะผ ั ั‚ะพะฑะพะน. ะะฐ ัƒะผ ะฟั€ะธั…ะพะดัั‚ ัั‚ั€ะพั‡ะบะธ ะณั€ัƒะฟะฟั‹ "A Big Great World" I am feeling so small* ะงะตั€ั‚, ะดะฐ ัั‚ะฐ ะฟะตัะฝั ะฒะพะพะฑั‰ะต ัะฟะปะพัˆะฝะพะน... ั. ะฏ ะฟะพั‡ัƒะฒัั‚ะฒะพะฒะฐะป ะฒ ะฝะตะน ัะตะฑั ั‚ะพะณะดะฐ, ะบะพะณะดะฐ ะผั‹... ะฟะตั€ะตัั‚ะฐะปะธ ะฟะพ ะผะพะตะน ะฒะธะฝะต ะพะฑั‰ะฐั‚ัŒัั ะฝะฐ ะบะฐะบะพะต-ั‚ะพ ะฒั€ะตะผั. 
ะฏ ัƒะฑะธะฒะฐะปัั ะธ ะดะฐะถะต ะฑะธะปัั ะณะพะปะพะฒะพะน ะพะฑ ัˆะบะฐั„, ัะธะดั ะฝะฐ ะฟะพะปัƒ, ะบะฐะถะตั‚ัั. ะขะพะณะดะฐ ะบ ะผะพะธะผ ะบั€ะธะบะฐะผ ะพ ะปัŽะฑะฒะธ ะดะพะฑะฐะฒะปัะปะธััŒ ะดั€ัƒะณะธะต. Say something, I''m giving up on you** ะะต ั…ะพั‡ัƒ ะฑะพะปัŒัˆะต ั‡ัƒะฒัั‚ะฒะพะฒะฐั‚ัŒ ะธะผะตะฝะฝะพ ัั‚ัƒ ัั‚ั€ะพะบัƒ. ะŸะพั‚ะพะผ ั ะบะธะฝัƒะปะฐ ั‚ะตะฑะต ะบะปะธะฟ ัั‚ะพะน ะฟะตัะฝะธ. ะกะฟัƒัั‚ั ะบะฐะบะพะต-ั‚ะพ ะฒั€ะตะผั. ะญั‚ะธะผ ั ะฟั€ะธะทะฝะฐะปะฐััŒ ั‚ะตะฑะต ะฒะพ ะผะฝะพะณะพะผ, ัั‚ะพ ะฑั‹ะปะพ ะฟะธััŒะผะพ ั ัˆะธั„ั€ะพะผ, ะบะพั‚ะพั€ะพะต ะฝะต ะดะพะปะถะฝะพ ะฑั‹ั‚ัŒ ะฝะธะบะพะณะดะฐ ั€ะฐััˆะธั„ั€ะพะฒะฐะฝะพ. ะฏ ะบั€ะธั‡ะฐะปะฐ, ะฐ ั‚ั‹ ะฝะต ัะปัƒัˆะฐะปะฐ. ะŸะพั‚ะพะผัƒ ั‡ั‚ะพ ั ัƒะผะตัŽ ั‚ะธั…ะพ ะบั€ะธั‡ะฐั‚ัŒ. ะ˜ ัะปะฐะฒะฐ ะฑะพะณัƒ. *ะฏ ั‡ัƒะฒัั‚ะฒัƒัŽ ัะตะฑั ั‚ะฐะบะธะผ ะผะฐะปะตะฝัŒะบะธะผ/ะฝะตะทะฝะฐั‡ะธั‚ะตะปัŒะฝั‹ะผ **ะกะบะฐะถะธ ะถะต ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ, ั ะพั‚ะดะฐะปััŽััŒ ะพั‚ ั‚ะตะฑั' - 'Pov ะฅะพั€ั…ะต. ะ’ัะตะผ ะฟั€ะธะฒะตั‚, ะผะตะฝั ะทะพะฒัƒั‚ ะฅะพั€ั…ะต ะ‘ะปะฐะฝะบะพ. ะฏ ั€ะฐะฑะพั‚ะฐัŽ ะฒ ะพั„ั„ะธัะต ัะฒะพะตะณะพ ะพั‚ั†ะฐ, ัะพะฒัะตะผ ัะบะพั€ะพ ั ัั‚ะฐะฝัƒ ะณะปะฐะฒะฝั‹ะผ. ะœะฝะต ะดะฒะฐะดั†ะฐั‚ัŒ ั‡ะตั‚ั‹ั€ะต ะณะพะดะฐ, ัƒ ะผะตะฝั ะตัั‚ัŒ ะดะตะฒัƒัˆะบะฐ ะœะฐะบะฐั€ะตะฝะฐ ะœะธะณะตะปัŒ, ะฒะฝะตัˆะฝะพัั‚ัŒ ัƒ ะฝะตั‘ ะฝะฐ ะปัŽะฑะธั‚ะตะปั, ะฝะพ ัั‚ะพ ะฝะต ะธะผะตะตั‚ ะทะฝะฐั‡ะตะฝะธั ะณะปะฐะฒะฝะพะต ั‡ั‚ะพ ั ะปัŽะฑะปัŽ ะตั‘. ะœะฐะบะฐั€ะตะฝะฐ ะพะฝะฐ ะพั‡ะตะฝัŒ ะฒะทั€ั‹ะฒะฝะฐั ะดะตะฒัƒัˆะบะฐ, ะพั‡ะตะฝัŒ ัะผะพั†ะธะพะฝะฐะปัŒะฝะฐั, ะฝะต ั‚ะพะปัŒะบะพ ะฒ ะถะธะทะฝะธ, ะฝะพ ะธ ะฒ ะฟะพัั‚ะตะปะธ. ะะพ ะพั‚ ัั‚ะฐั‚ัƒัะฐ ะฑะฐะฑะฝะธะบะฐ ั ะฝะต ะธะทะฑะฐะฒะธะปัั, ั ะฟั€ะพะดะพะปะถะฐัŽ ั…ะพะดะธั‚ัŒ ะฟะพ ะบะปัƒะฑะฐะผ, ะฟั€ะธะทะฝะฐัŽััŒ ั‡ะตัั‚ะฝะพ ะดะตะฒัƒัˆะตะบ ั ะฟั€ะพัั‚ะพ ะพะฑะพะถะฐัŽ, ะฝะพ ัะฟะปัŽ ั‚ะพะปัŒะบะพ ั ะพะดะฝะพะน. 
ะ•ั‰ั‘ ัƒ ะผะตะฝั ะตัั‚ัŒ ะฑั‹ะฒัˆะฐั ะดะตะฒัƒัˆะบะฐ ะœะฐั€ั‚ะธะฝะฐ, ัั‚ะฐ ะดะตะฒั‡ะพะฝะบะฐ ัะฒะพะดะธะปะฐ ะผะตะฝั ั ัƒะผะฐ ะฒ ัˆะบะพะปัŒะฝั‹ะต ะณะพะดั‹, ะพะฝะฐ ะฑั‹ะปะฐ ะฟั€ะพัั‚ะพ ัˆะธะบะฐั€ะฝะฐ ะฒ ะฟะพัั‚ะตะปะธ, ั‚ะฐะบะพะต ัƒะผะตะตั‚ ั‚ะพะปัŒะบะพ ะพะฝะฐ. ะš ัะพะถะฐะปะตะฝะธัŽ ะธะปะธ ะบ ัั‡ะฐัั‚ัŒัŽ ะผั‹ ั€ะฐััั‚ะฐะปะธััŒ, ะฟั€ะฐะฒะดะฐ ัะพะฒัะตะผ ะฝะตะดะฐะฒะฝะพ, ะฑัƒะบะฒะฐะปัŒะฝะพ ะฟะพะปั‚ะพั€ะฐ ะณะพะดะฐ ะฝะฐะทะฐะด, ะฝะพ ะผั‹ ะฟั€ะพะดะพะปะถะฐะตะผ ะพะฑั‰ะฐั‚ัŒัั. ะ’ัั‚ะฐะฒ ั ั‚ั‘ะฟะปะพะน ะฟะพัั‚ะตะปัŒะบะธ ั ะฝะฐะฟั€ะฐะฒะธะปัั ะฒ ะดัƒัˆ, ะฟะพัะปะต ะฑัƒั€ะฝะพะน ะฝะพั‡ะธ ั ะœะฐะบะฐั€ะตะฝะพะน, ั†ะตะปัƒัŽ ะฝะพั‡ัŒ ะพะฝะฐ ัƒะฑะปะฐะถะฐะปะฐ ะผะตะฝั, ะตั‰ั‘ ัƒ ะฝะตั‘ ะพั‡ะตะฝัŒ ัะปะฐะฑั‹ะน ั…ะฐั€ะฐะบั‚ะตั€, ั ะผะพะณัƒ ะธะทะดะตะฒะฐั‚ัŒัั ะฝะฐะด ะฝะตะน, ะฝะพ ะฒ ั…ะพั€ะพัˆะตะผ ัะผั‹ัะปะต, ะฒ ัั‚ะพะผ ัะพัŽะทะต ะณะปะฐะฒะฝั‹ะน ั. ะšะพะณะดะฐ ั ะฒัั‚ั€ะตั‡ะฐะปัั ั ะขะธะฝะธ, ะดะพะผะธะฝะธั€ะพะฒะฐะปะฐ ะพะฝะฐ, ัƒ ะฝะตั‘ ัะธะปัŒะฝั‹ะน ั…ะฐั€ะฐะบั‚ะตั€, ะพะฝะฐ ะดะฐัั‚ ะพั‚ะฟะพั€ ะปัŽะฑะพะผัƒ. ะŸั€ะธะฝัะฒ ะดัƒัˆ ั ะฒั‹ัˆะตะป ะธะท ะฒะฐะฝะฝะพะน ะธ ะฝะฐะฟั€ะฐะฒะธะปัั ะฒ ัะฟะฐะปัŒะฝัŽ. - ะ–ะพะถะธะบ, ั‚ั‹ ั‡ั‚ะพ ัะตะณะพะดะฝั ั€ะฐะฑะพั‚ะฐะตัˆัŒ? ะฟะพะผะพะตะผัƒ ัƒ ั‚ะตะฑั ะฒั‹ั…ะพะดะฝะพะน, ะบัƒะดะฐ ั‚ั‹ ัะพะฑั€ะฐะปัั? - ะฟะพัะปั‹ัˆะฐะปัั ะณะพะปะพั ะปัŽะฑะธะผะพะน ะธะท-ะฟะพะด ะพะดะตัะปะฐ. - ะะธะบัƒะดะฐ ั ะฝะต ัะพะฑั€ะฐะปัั, ั ะฟั€ะพัั‚ะพ ะฟั€ะธะฝะธะผะฐะป ะดัƒัˆ, ะบัั‚ะฐั‚ะธ ัะตะณะพะดะฝั ะธะดั‘ะผ ะฒ ะบะปัƒะฑ, - ัะบะฐะทะฐะป ั ะธ ะฝะฐะดะตะป ะฝะฐ ัะตะฑั ัะฟะพั€ั‚ะธะฒะฝั‹ะต ัˆั‚ะฐะฝั‹. - ะžะบ, ะฒ ะฒะพัะตะผัŒ ะฑัƒะดัƒ ะณะพั‚ะพะฒะฐ, ั‚ะฒะพะธ ะดั€ัƒะทัŒั ะฟะพะนะดัƒั‚ ั ะฝะฐะผะธ? - ะฝะตะดะพะฒะพะปัŒะฝะพ ัะฟั€ะพัะธะปะฐ ะพะฝะฐ. ะœะพะธ ะดั€ัƒะทัŒั ัะฐะผั‹ะต ะปัƒั‡ัˆะธะต ะปัŽะดะธ ะฒ ะผะธั€ะต, ะพะฝะธ ะฝะต ั‚ะพะปัŒะบะพ ะผะพะธ ะดั€ัƒะทัŒั, ะฝะพ ะธ ะดั€ัƒะทัŒั ะœะฐั€ั‚ะธะฝั‹. ะ”ะธะตะณะพ ะธ ะ›ะพะดะพะฒะธะบะฐ, ัั‚ะฐ ะฟั€ะพัั‚ะพ ะดะฒะพะต ะฝะตะฒะผะตะฝัะตะผั‹ั… ะปัŽะดะตะน, ะฝะพ ะพะฝะธ ั‚ะฐะบะธะต ะผะธะปั‹ะต. 
ะ ัƒะดะถะตั€ะพ ะธ ะšะฐะฝะดะธ, ะพะฝะธ ะฝะต ะพั‡ะตะฝัŒ ะปัŽะฑัั‚ ะบะปัƒะฑั‹ ะธ ัˆัƒะผะฝั‹ะต ะฒะตั‡ะตั€ะธะฝะบะธ, ะพะฝะธ ะปัŽะฑัั‚ ะฟะพะฑั‹ั‚ัŒ ะฝะฐะตะดะธะฝะต ะดั€ัƒะณ ั ะดั€ัƒะณะพะผ, ะฝะตัะผะพั‚ั€ั ะฝะฐ ั‚ะพ ั‡ั‚ะพ ะผั‹ ั€ะฐะทะฝั‹ะต, ะผั‹ ะปัƒั‡ัˆะธะต ะดั€ัƒะทัŒั. ะ ะตั‰ั‘ ะœะตั‡ะธ, ะพะฝะฐ ะปัƒั‡ัˆะฐั ะฟะพะดั€ัƒะณะฐ ะขะธ, ะฐ ะทะฐะพะดะฝะพ ะธ ะผะพั. - ะšะพะฝะตั‡ะฝะพ, ะบัƒะดะฐ ั ะฑะตะท ะฝะธั…, ะฝะต ะฝะฐะดะพ ะผะฝะต ะณะพะฒะพั€ะธั‚ัŒ ั‡ั‚ะพ ะพะฝะธ ะฟะปะพั…ะธะต, - ะฝะตะดะพะฒะพะปัŒะฝะพ ะฟั€ะพะณะพะฒะพั€ะธะป ั ะธ ะฒั‹ัˆะตะป ะธะท ัะฟะฐะปัŒะฝะธ. ะ”ะพะผ ัƒ ะฝะฐั ะฑะพะปัŒัˆะพะน, ะฝะพ ั‚ัƒั‚ ะฒัะตะณะดะฐ ั‚ะฐะบะพะน ะฑะตัะฟะพั€ัะดะพะบ. ะญั‚ะพ ะฟั€ะพัั‚ะพ ะฝะตะฒั‹ะฝะพัะธะผะพ, ะœะฐะบะฐั€ะตะฝะฐ ะฟะพั‡ั‚ะธ ะฝะต ัƒะฑะธั€ะฐะตั‚ัั, ั†ะตะปั‹ะผะธ ะดะฝัะผะธ ัะธะดะธั‚ ะดะพะผะฐ ะธ ะฑะตะทะดะตะปัŒะฝะธั‡ะฐะตั‚, ะบะฐะบ ั‚ะพะปัŒะบะพ ะทะฐะณะพะฒะพั€ะธัˆัŒ ะฟั€ะพ ัƒะฑะพั€ะบัƒ ะธ ะณะพั‚ะพะฒะบัƒ, ะพะฝะฐ ะฝะฐั‡ะธะฝะฐะตั‚ ั†ะตะปะพะฒะฐั‚ัŒ ะผะตะฝั, ะฐ ะดะฐะปัŒัˆะต ะฒั‹ ะทะฝะฐะตั‚ะต ั‡ั‚ะพ ะฟั€ะพะธัั…ะพะดะธั‚. - ะฏ ะธ ะฝะต ัะพะฑะธั€ะฐะปะฐััŒ, - ะฟะพัะปั‹ัˆะฐะปะพััŒ ะธะท ะบะพะผะฝะฐั‚ั‹. ะฏ ัะฟัƒัั‚ะธะปัั ะฒะฝะธะท ะธ ะฟั€ะธะฝัะปัั ะธัะบะฐั‚ัŒ ัะฒะพัŽ ั„ัƒั‚ะฑะพะปะบัƒ, ะฝะพ ะฝะฐะนั‚ะธ ะตั‘ ะฟั€ะพัั‚ะพ ะฝะต ั€ะตะฐะปัŒะฝะพ. ะกะฟัƒัั‚ั ะฝะตัะบะพะปัŒะบะพ ะผะธะฝัƒั‚, ั ะฒัั‘-ั‚ะฐะบะธ ะฝะฐัˆั‘ะป ัะฒะพัŽ ั„ัƒั‚ะฑะพะปะบัƒ. ะŸั€ะพะนะดั ะฝะฐ ะบัƒั…ะฝัŽ ั ะทะฐะปะตะท ะฒ ั…ะพะปะพะดะธะปัŒะฝะธะบ, ะฝะพ ะฝะธั‡ะตะณะพ ััŠะตะดะพะฑะฝะพะณะพ ั ั‚ะฐะผ ะฝะต ะฝะฐัˆั‘ะป. - ะ’ ัั‚ะพะผ ะดะพะผะต, ะตัั‚ัŒ ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ ะฟะพะตัั‚ัŒ? - ะฟั€ะพะบั€ะธั‡ะฐะป ั ะธ ะฟะพัั‚ะฐะฒะธะป ั€ัƒะบะธ ะฒ ะฑะพะบ. ะœะฐะบะฐั€ะตะฝะฐ ัะฟัƒัะบะฐะปะฐััŒ ัะพ ะฒั‚ะพั€ะพะณะพ ัั‚ะฐะถะฐ, ะพะฝะฐ ะฑั‹ะปะฐ ะฟะพั‡ั‚ะธ ะพะฑะฝะฐะถั‘ะฝะฝะพะน, ะปะธัˆัŒ ะฟั€ะพัั‚ั‹ะฝัŒ ัะบั€ั‹ะฒะฐะปะฐ ะตั‘ ะฟั€ะตะปะตัั‚ะธ. ะฅะพั‚ั ัะบั€ั‹ะฒะฐั‚ัŒ ะพัะพะฑะพ ะฑั‹ะปะพ ะฝะตั‡ะตะณะพ, ะณั€ัƒะดัŒ ะฑั‹ะปะฐ ะพั‡ะตะฝัŒ ะผะฐะปะตะฝัŒะบะพะน, ะฝะพะณะธ ะบะพั€ะพั‚ะบะธะต, ะดั€ัƒะณะพะต ะดะตะปะพ ะœะฐั€ั‚ะธ. - ะงั‚ะพ ั‚ั‹ ะบั€ะธั‡ะธัˆัŒ? 
ะขั‹ ะถะต ะทะฝะฐะตัˆัŒ ั‡ั‚ะพ ั ะฝะต ัƒะผะตัŽ ะณะพั‚ะพะฒะธั‚ัŒ, ะฝะฐะนะผะธ ะดะพะผั€ะฐะฑะพั‚ะฝะธั†ัƒ ะธ ะฒะพะพะฑั‰ะต ะทะฐั‡ะตะผ ั‚ะตะฑะต ะตะดะฐ ะบะพะณะดะฐ ัƒ ั‚ะตะฑั ะตัั‚ัŒ ั, - ะฟั€ะพัˆะตะฟั‚ะฐะปะฐ ะพะฝะฐ ะธ ัะฑั€ะพัะธะปะฐ ั ัะตะฑั ะฟั€ะพัั‚ั‹ะฝัŒ, ั‚ะบะฐะฝัŒ ัƒะฟะฐะปะฐ ะบ ะตั‘ ะฝะพะณะฐะผ, ะพัั‚ะฐะฒะธะฒ ะœะฐะบะฐั€ะตะฝัƒ ะพะฑะฝะฐะถั‘ะฝะฝะพะน. ะฏ ะฟะพะดะพัˆั‘ะป ะบ ะฝะตะน ะธ ัั‚ั€ะฐัั‚ะฝะพ ะฟะพั†ะตะปะพะฒะฐะป ะตั‘, ั ะปัŽะฑะธะป ะณั€ัƒะฑะพัั‚ัŒ. ะฏ ะผะพะณัƒ ั†ะตะปัƒัŽ ะฝะพั‡ัŒ ะฝะฐะด ะฝะตะน ะณั€ัƒะฑะพ ะธะทะดะตะฒะฐั‚ัŒัั ะธ ะพะฝะฐ ะฝะต ัะบะฐะถะตั‚ ะผะฝะต ะฝะต ัะปะพะฒะฐ. ะŸะพะฒะฐะปะธะฒ ะฝะฐั ะฝะฐ ะดะธะฒะฐะฝ ั ัะฟัƒัั‚ะธะปัั ะบ ะตั‘ ะณั€ัƒะดะธ.... - ะ‘ะปะฐะฝะบะพ, ั‚ั‹ ะฒ ัะฒะพั‘ะผ ั€ะตะฟะตั€ั‚ัƒะฐั€ะต? - ะฝะตะถะฝั‹ะน, ะฝะพ ะฒ ั‚ะพะถะต ะฒั€ะตะผั ะฝะฐะณะปั‹ะน ะณะพะปะพัะพะบ ะฟะพัะปั‹ัˆะฐะปัั ัƒ ะผะตะฝั ะทะฐ ัะฟะธะฝะพะน. ะšะพะฝะตั‡ะฝะพ ัั‚ะพ ะฑั‹ะปะฐ ะขะธะฝะบะฐ, ะตั‘ ะณะพะปะพั ั ัƒะทะฝะฐัŽ ะธะท ั‚ั‹ััั‡ะธ. ะ’ัั‚ะฐะฒ ั ะดะธะฒะฐะฝะฐ ะธ ะฟะพะดะฝัะฒ ะบะฐั€ะตะณะปะฐะทัƒัŽ ะทะฐ ัะพะฑะพะน, ั ั‡ะผะพะบะฝัƒะป ะขะธ ะฒ ั‰ั‘ั‡ะบัƒ. - ะขั‹ ั‡ะตะณะพ ะทะดะตััŒ ะดะตะปะฐะตัˆัŒ? ะขะตะฑั ะฝะธะบั‚ะพ ะฝะต ะทะฒะฐะป ััŽะดะฐ, - ะทะปะพะฑะฝะพ ะฟั€ะพัˆะธะฟะตะปะฐ ะœะฐะบะฐั€ะตะฝะฐ ะธ ะฝะฐั‚ัะฝัƒะปะฐ ะฝะฐ ัะตะฑั ะฟั€ะพัั‚ั‹ะฝะบัƒ. ะฏ ะฒัะตะณะดะฐ ะฑั‹ะป ั€ะฐะด ะฒะธะดะตั‚ัŒ ะœะฐั€ั‚ะธะฝัƒ, ะพะฝะฐ ะฒัะตะณะดะฐ ะดะปั ะผะตะฝั ะพัั‚ะฐะฝะตั‚ัั ั€ะพะดะฝั‹ะผ ั‡ะตะปะพะฒะตะบะพะผ, ะฝะพ ั ะตั‘ ะฝะต ะปัŽะฑะปัŽ. - ะะต ะพั€ะธ, ะฅะพั€ั…ะต ั ะฟั€ะธัˆะปะฐ ะฟะพ ะดะตะปัƒ, - ัะตั€ัŠั‘ะทะฝะพ ัะบะฐะทะฐะปะฐ ะพะฝะฐ ะธ ะฟะพัะผะพั‚ั€ะตะปะฐ ะฝะฐ ะผะตะฝั. ะšะพะณะดะฐ ะพะฝะฐ ะทะฒะฐะปะฐ ะผะตะฝั ะฟะพ ะธะผะตะฝะธ, ั‚ะพ ัั‚ะพ ะทะฝะฐั‡ะธั‚ ั‡ั‚ะพ ะดะตะนัั‚ะฒะธั‚ะตะปัŒะฝะพ ั‡ั‚ะพ-ั‚ะพ ัั€ะพั‡ะฝะพะต. - ะ”ะฐ ั ะฒะตััŒ ะฒะพ ะฒะฝะธะผะฐะฝะธะต, - ัะฟะพะบะพะนะฝะพ ัะบะฐะทะฐะป ั ะธ ัะตะป ะฝะฐ ะดะธะฒะฐะฝ. - ะœะพะถะฝะพ ะฟะพะณะพะฒะพั€ะธั‚ัŒ ะฝะฐะตะดะธะฝะต? - ะฝะตัƒะฒะตั€ะตะฝะฝะพ ัะฟั€ะพัะธะปะฐ ะพะฝะฐ. 
ะงั‚ะพ ัั‚ะพ ั ะฝะตะน, ะฟะตั€ะฒั‹ะน ั€ะฐะท ะฒะธะถัƒ ะตั‘ ั‚ะฐะบะพะน, ะฝะตัƒะฒะตั€ะตะฝะฝะพะน. - ะœะฐะบะฐั€ะตะฝะฐ ะฒั‹ะนะดะธ, - ัะบะฐะทะฐะป ั, ั‚ะฐ ะฝะตะดะพะฒะพะปัŒะฝะพ ั†ั‹ะบะฝัƒะฒ ะฟะพะบะธะฝัƒะปะฐ ะณะพัั‚ะธะฝัƒัŽ. ะœะฐั€ั‚ะธะฝะฐ ะฟั€ะธัะตะปะฐ ั€ัะดะพะผ ัะพ ะผะฝะพะน ะฝะฐ ะดะธะฒะฐะฝ, ั‚ะตั€ะตะฑั ะฒ ั€ัƒะบะฐั… ะบั€ะฐะน ัะฒะพะตะณะพ ะฟะปะฐั‚ัŒั. - ะฅะพั€ั…ะต, ะผะฝะต ะฝะตะณะดะต ะถะธั‚ัŒ, ั ะฟั€ะพะดะฐะปะฐ ัะฒะพะน ะดะพะผ, ั…ะพั‡ัƒ ะบัƒะฟะธั‚ัŒ ะฟะพะฑะพะปัŒัˆะต, ั€ะพะดะธั‚ะตะปะธ ัƒะตั…ะฐะปะธ, ะฐ ะบะปัŽั‡ะธ ะฝะต ะพัั‚ะฐะฒะธะปะธ, ะตะดะธะฝัั‚ะฒะตะฝะฝั‹ะน ะฒะฐั€ะธะฐะฝั‚ ัั‚ะพ ั‚ั‹, ะผะพะถะฝะพ ัƒ ั‚ะตะฑั ะฟะพะถะธั‚ัŒ? - ั ะฝะฐะดะตะถะดะพะน ัะฟั€ะพัะธะปะฐ ะพะฝะฐ ะธ ะฟะพัะผะพั‚ั€ะตะปะฐ ะฝะฐ ะผะตะฝั. ะฃ ะฝะตั‘ ัะฐะผั‹ะต ะบั€ะฐัะธะฒั‹ะต ะณะปะฐะทะฐ, ะบะพั‚ะพั€ั‹ะต ั ั‚ะพะปัŒะบะพ ะฒัั‚ั€ะตั‡ะฐะป ะฒ ะถะธะทะฝะธ. - ะฅะผ, ั ะบะพะฝะตั‡ะฝะพ ะฝะต ะฟั€ะพั‚ะธะฒ, ะฝะพ ะตัะปะธ ะœะฐะบะฐ ะฟั€ะพั‚ะธะฒ, ั‚ะพ ั‚ั‹ ะธะทะฒะธะฝะธ, - ัะบะฐะทะฐะป ั, ะœะฐะบะฐั€ะตะฝะฐ ะบะพะฝะตั‡ะฝะพ ะพั‚ะบะฐะถะตั‚ัั, ะฝะพ ะฝะต ะฑัƒะดัƒ ะถะต ั ะฟะตั€ะตั‡ะธั‚ัŒ ัะพะฑัั‚ะฒะตะฝะฝะพะน ะดะตะฒัƒัˆะบะธ, ั€ะฐะดะธ ัะฒะพะตะน ะฑั‹ะฒัˆะตะน. - ะŸะพะฝัั‚ะฝะพ, ะทะฝะฐั‡ะธั‚ ะฟะพะนะดัƒ ะบ ะœะตั‡ะธ, - ะฟะพะฝะธะบัˆะต ัะบะฐะทะฐะปะฐ ะพะฝะฐ ะธ ะฒัั‚ะฐะปะฐ ั ะดะธะฒะฐะฝะฐ. ะ‘ะปะธะฝ ั‡ั‚ะพ ะถะต ะดะตะปะฐั‚ัŒ, ะฝะตะปัŒะทั ั‚ะฐะบ ะฟะพัั‚ัƒะฟะฐั‚ัŒ. - ะกั‚ะพะน, ะดะฐะฒะฐะน ัะฟั€ะพัะธะผ ัƒ ะฝะตั‘, ะผะพะถะตั‚ ะพะฝะฐ ัะพะณะปะฐัะธั‚ัŒัั, ะœะฐะบะฐั€ะตะฝะฐ ะธะดะธ ััŽะดะฐ, - ะบั€ะธะบะฝัƒะป ั ะธ ะฟะพััะพั‚ั€ะตะป ะฝะฐ ะขะธะฝะธ, ะฝะฐ ะปะธั†ะต ัƒ ะฝะตั‘ ะฑั‹ะปะพ ะฟะพะปะฝะพะต ะพั‚ั‡ะฐะธะฝัŒะต. - ะงะตะณะพ ะฝะฐะดะพ? - ะณั€ัƒะฑะพ ัะฟั€ะพัะธะปะฐ ะพะฝะฐ ะธ ะฒัั‚ะฐะปะฐ ะฝะฐะฟั€ะพั‚ะธะฒ ะฝะฐั. - ะœะฐะบะฐ, ะฟะพะฝะธะผะฐะตัˆัŒ ะœะฐั€ั‚ะธะฝะต ะฝะตะณะดะต ะถะธั‚ัŒ, ะผะพะถะฝะพ ะพะฝะฐ ะฟะพะถะธะฒั‘ั‚ ัƒ ะฝะฐั? - ัะฟั€ะพัะธะป ั ะธ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะฝะตั‘, ะตั‘ ะปะธั†ะพ ั‚ัƒั‚ ะถะต ะธะทะผะตะฝะธะปะพััŒ. 
- ะ•ั‰ั‘ ั‡ะตะณะพ, ะฟัƒัะบะฐะน ัƒ ัะฒะพะธั… ะดั€ัƒะทะตะน ะถะธะฒั‘ั‚, - ะณั€ัƒะฑะพ ะฟั€ะพะธะทะฝะตัะปะฐ ะพะฝะฐ ะธ ะฒะทะดั‘ั€ะฝัƒะปะฐ ะณะพะปะพะฒะพะน ะฒะฒะตั€ั…, ะพะฝะฐ ะฒะพะทะพะผะฝะธะปะฐ ัะตะฑั ะทะดะตััŒ ะบะพั€ะพะปะตะฒะพะน, ะฝัƒ ัƒะถ ะฝะตั‚, ะผะพะน ะดะพะผ ะธ ะผะพั ะฑั‹ะฒัˆะฐั ะฑัƒะดะตั‚ ะถะธั‚ัŒ ะทะดะตััŒ. - ะœะฐะบะฐั€ะตะฝะฐ, ั‚ั‹ ะฟะพั…ะพะถะต ะทะฐะฑั‹ะปะฐ ั‡ะตะน ัั‚ะพ ะดะพะผ, ะœะฐั€ั‚ะธะฝะฐ ะฑัƒะดะตั‚ ะทะดะตััŒ ะถะธั‚ัŒ ัะบะพะปัŒะบะพ ะฝัƒะถะฝะพ, ัั‚ะพ ะฝะต ะพะฑััƒะถะดะฐะตั‚ัั, - ะฟั€ะพัˆะธะฟะตะป ั ะธ ะทะปะพะฑะฝะพ ะฒะทะณะปัะฝัƒะป ะฝะฐ ะœะฐะบัƒ, ะพะฝะฐ ะฟะพั…ะพะถะต ั€ะฐััั‚ะตั€ัะปะฐััŒ ะธ ะฝะต ะทะฝะฐะปะฐ ั‡ั‚ะพ ัะบะฐะทะฐั‚ัŒ, ะฟะพัั‚ะพะผัƒ ะฟั€ะพัั‚ะพ ัƒะฑะตะถะฐะปะฐ ะฒ ะบะพะผะฝะฐั‚ัƒ. - ะœะพะถะตั‚ ะฝะต ะฝัƒะถะฝะพ ะฑั‹ะปะพ ั‚ะฐะบ, ั ะผะพะณะปะฐ ะฟะพะถะธั‚ัŒ ัƒ ะœะตั‡ะธ, - ั‚ะธั…ะธะน ะณะพะปะพั ั€ะฐะทะดะฐะปัั ะฝะฐ ัƒั…ะพะผ. ะฏ ะฑั‹ะป ะพั‡ะตะฝัŒ ะทะพะป. - ะ—ะฐะผะพะปั‡ะธ, ั‚ะตะฑะต ัะบะฐะทะฐะปะธ ะผะพะถะฝะพ, ะทะฝะฐั‡ะธั‚ ะผะพะถะฝะพ, ะฝะฐะดะพะตะปะธ, - ะฟั€ะพะบั€ะธั‡ะฐะป ั ะธ ัƒัˆั‘ะป ะฒ ะบัƒั…ะฝัŽ. ะšะฐะบ ะถะต ะดะฐะปัŒัˆะต ะถะธั‚ัŒ, ัั‚ะธ ะดะฒะพะต ัะฒะตะดัƒั‚ ะผะตะฝั ั ัƒะผะฐ. Pov ะฅะพั€ั…ะต. - ะ“ะดะต ะผะพะน ะปะธั„ั‡ะธะบ? - ะบั€ะธั‡ะฐะปะฐ ะœะฐั€ั‚ะธะฝะฐ ั ะฟะตั€ะฒะพะณะพ ัั‚ะฐะถะฐ. - ะ“ะดะต ะผะพะธ ัˆั‚ะฐะฝั‹? - ะฐ ัั‚ะพ ัƒะถะต ะœะฐะบะฐั€ะตะฝะฐ ะธะท ัะพัะตะดะฝะตะน ะบะพะผะฝะฐั‚ั‹. ะ•ัะปะธะฑ ั ะทะฝะฐะป ั‡ั‚ะพ ัั‚ะธ ะดะฒะต ะดะตะฒัƒัˆะบะธ ัะฒะตะดัƒั‚ ะผะตะฝั ั ัƒะผะฐ ะทะฐ ะดะฒะฐ ะดะฝั, ั ะฑั‹ ะฝะธะบะพะณะดะฐ ะฝะต ะฟัƒัั‚ะธะป ะœะฐั€ั‚ะธะฝัƒ ะถะธั‚ัŒ ะบ ะฝะฐะผ. ะœะฐะบะฐั€ะตะฝะฐ ะฟะพัั‚ะพัะฝะฝะพ ะผะตะฝั ั€ะตะฒะฝะพะฒะฐะปะฐ ะธ ะฟั€ะพัั‚ะพ ะฝะตะฝะฐะฒะธะดะตะปะฐ ะขะธะฝะธ, ะฐ ั‚ะฐ ะฒ ัะฒะพัŽ ะพั‡ะตั€ะตะดัŒ ะดะตะปะฐะปะฐ ะตะน ะฒัั‘ ะฝะฐะทะปะพ. ะ”ะฒะฐ ะดะฝั ะฟะพะดั€ัะด ะฒ ะผะพั‘ะผ ะดะพะผะต ะฝะต ัƒะผะพะปะบะฐัŽั‚ ะถะตะฝัะบะธะต ะณะพะปะพัะฐ, ัะพัะตะดะธ ะฟั€ะธั…ะพะดะธะปะธ ัƒะถะต ั‚ั€ะธ ั€ะฐะทะฐ, ั ะฝะต ะทะฝะฐัŽ ั‡ั‚ะพ ั ะฝะธะผะธ ะดะตะปะฐั‚ัŒ. 
ะ ะตั‰ั‘ ัƒ ะผะตะฝั ะฝะต ะฑั‹ะปะพ ัะตะบัะฐ ัƒะถะต ะดะฒะฐ ั ะฟะพะปะพะฒะธะฝะพะน ะดะฝั, ั‚ะพะปัŒะบะพ ะผั‹ ั ะœะฐะบะฐั€ะตะฝะพะน ะฝะฐั‡ะธะฝะฐะตะผ ะฟั€ะพั†ะตัั, ะœะฐั€ั‚ะธ ะฒะฒะฐะปะธะฒะฐะตั‚ัั ะบ ะฝะฐะผ ะฒ ะบะพะผะฝะฐั‚ัƒ, ะทะฐ ัั‚ะพ ั ะณะพั‚ะพะฒ ะตั‘ ัƒะฑะธั‚ัŒ. ะกะตะนั‡ะฐั ั ัะพะฑะธั€ะฐัŽััŒ ะฝะฐ ั€ะฐะฑะพั‚ัƒ, ะœะฐั€ั‚ะธะฝะฐ ะฒ ะธะฝัั‚ะธั‚ัƒั‚, ะฐ ะœะฐะบะฐั€ะตะฝะฐ ะฝะฐ ะบะฐะบัƒัŽ-ั‚ะพ ะฒะฐะถะฝัƒัŽ ะฒัั‚ั€ะตั‡ัƒ, ะฟะพัะธะดะตะปะบะธ ั ะฟะพะดั€ัƒะถะบะฐะผะธ ะฝะฐะทั‹ะฒะฐัŽั‚ัั, ั…ะพั‚ั ะฟะพะดั€ัƒะณ ัƒ ะฝะตั‘ ะผะฐะปะพะฒะฐั‚ะพ. - ะัƒ ะฅะพั€ั…ะต, ะธะดะธ ััŽะดะฐ, ะฟะพะผะพะณะธ ะฝะฐะนั‚ะธ ะผะฝะต ะผะพะน ะปะธั„ั‡ะธะบ, ะฟะพะถะฐะปัƒะนัั‚ะฐ, - ะฝั‹ะปะฐ ะขะธะฝะบะฐ ัะพ ะฒั‚ะพั€ะพะณะพ ัั‚ะฐะถะฐ, ั ะฑั‹ ะฟั€ะตะดะปะพะถะธะป ะตะน ะฝะฐะดะตั‚ัŒ ะปะธั„ั‡ะธะบ ะœะฐะบะฐั€ะตะฝั‹, ะฝะพ ะฒะตะดัŒ ะพะฝ ะตะน ะผะฐะปะพะฒะฐั‚. - ะœะฐั€ั‚ะธะฝะฐ, ั ะตะณะพ ะดะฐะถะต ะฒ ะณะปะฐะทะฐ ะฝะต ะฒะธะดะตะป, - ัะฟะพะบะพะนะฝะพ ัะบะฐะทะฐะป ั ะธ ัะฟัƒัั‚ะธะปัั ะฒะฝะธะท. ะšะพะฝะตั‡ะฝะพ ะฒัั ัั‚ะฐ ัะธั‚ัƒะฐั†ะธั ะผะตะฝั ะฝะต ั€ะฐะดัƒะตั‚, ะฝะพ ะฒ ัั‚ะพะผ ะตัั‚ัŒ ะฟะปัŽั ั ะผะพะณัƒ ะฒะธะดะตั‚ัŒ ะœะฐั€ั‚ะธ ะธ ะตั‘ ัˆะธะบะฐั€ะฝะพะต ั‚ะตะปะพ ะณะพะปั‹ะผ, ะฟะฐั€ัƒ ั€ะฐะท ะบะพะณะดะฐ ะพะฝะฐ ะผั‹ะปะฐััŒ ั "ะฝะตั‡ะฐัะฝะฝะพ" ะทะฐั…ะพะดะธะป ะฒ ะฒะฐะฝะฝัƒ ะธ ัะผะพั‚ั€ะตะป ะฝะฐ ะขะธะฝะธ, ะพะฝะฐ ั‚ะฐะบะฐั ะผะธะปะฐัˆะบะฐ ะบะพะณะดะฐ ะทะปะธั‚ัŒัั. ะะต ะฑั‹ะปะพ-ะฑั‹ ัƒ ะผะตะฝั ะœะฐะบะฐั€ะตะฝั‹, ั ะฑั‹ ะฒะฝะพะฒัŒ ะฝะฐั‡ะฐะป ะฒัั‚ั€ะตั‡ะฐั‚ัŒัั ั ะขะธะฝะธ, ัะบะฐะทะฐั‚ัŒ ั‡ะตัั‚ะฝะพ ะฟะฐั€ัƒ ั€ะฐะท ะฒ ะฟะพัั‚ะตะปะธ ั ะฟั€ะตะดัั‚ะฐะฒะปัะป ะขะธ, ะตั‘ ะฒะพะปะพัั‹ ะฟะฐั…ะฝัƒั‰ะธะต ัˆะพะบะพะปะฐะดะพะผ ะธ ัั‚ั€ะพะนะฝั‹ะต ะฝะพะถะบะธ, ะฐ ัƒ ะœะฐะบะฐั€ะตะฝั‹ ะฝะตั‚ ั‚ะฐะบะธั… ะฟะพั‚ั€ััะฐัŽั‰ะธั… ะฒะพะปะพั ะบะฐะบ ัƒ ะขะธะฝะธ, ะดะฐ ะธ ะฝะพะถะบะธ ะฝะต ะฟะพะดะฐั€ะพะบ. - ะัƒ ะธ ะณะดะต ะตะณะพ ะธัะบะฐั‚ัŒ? - ัะฟั€ะพัะธะป ั ะธ ะฝะฐั‡ะฐะป ะธัะบะฐั‚ัŒ ัั‚ะพั‚ ั‡ั‘ั€ั‚ะพะฒ ะปะธั„ั‡ะธะบ ะขะธะฝะธ. - ะขะฒะพั ะดะตะฒัƒัˆะบะฐ ัƒะผะตะตั‚ ัƒะฑะธั€ะฐั‚ัŒัั? 
ะŸะพั‡ะตะผัƒ ัƒ ะฒะฐั ั‚ะฐะบะพะน ะฑะฐั€ะดะฐะบ ะฒ ะดะพะผะต? ะฏ ะฝะต ะผะพะณัƒ ั‚ะฐะบ ะถะธั‚ัŒ, - ะฟั€ะพัั‚ะพะฝะฐะปะฐ ะพะฝะฐ ะธ, ัƒัั‚ะฐะปะพ ัะตะปะฐ ะฝะฐ ะดะธะฒะฐะฝ. ะญั…, ะบะฐะบ ะถะต ะพะฝะฐ ะฟั€ะฐะฒะฐ, ะœะฐะบะฐั€ะตะฝะฐ ัะพะฒัะตะผ ะทะฐะฑั€ะพัะธะปะฐ ะผะพะน ะดะพะผ, ะฝัƒะถะฝะพ ะฝะฐะฝะธะผะฐั‚ัŒ ะดะพะผั€ะฐะฑะพั‚ะฝะธั†ัƒ. ะŸะพะฒะตั€ะฝัƒะฒ ะณะพะปะพะฒัƒ ะฒ ัั‚ะพั€ะพะฝัƒ ะธ ะฟั€ะธะฟะพะดะฝัะฒ ะฝะตะฟะพะฝัั‚ะฝัƒัŽ ะฒะตั‰ัŒ ั ะพะฑะฝะฐั€ัƒะถะธะป, ั‡ั‘ั€ะฝั‹ะน ะบั€ัƒะถะตะฒะฝะพะน ะปะธั„ั‡ะธะบ ะธ ั ัั€ะฐะทัƒ ะฟะพะฝัะป ั‡ั‚ะพ ะฒะตั‰ัŒ ะฟั€ะธะฝะฐะดะปะตะถะธั‚ ะœะฐั€ั‚ะธ, ะฒะตะดัŒ ัƒ ะœะฐะบะฐั€ะตะฝั‹ ะฝะตั‚ ั‚ะฐะบะธั… ะปะธั„ั‡ะธะบะพะฒ, ะพะฝะธ ัƒ ะฝะตั‘ ะฒัะต ะฝะตะฒะทั€ะฐั‡ะฝั‹ะต ะธ ัะตั€ั‹ะต. - ะญั‚ะพ ั‚ะฒะพั‘? - ัะฟั€ะพัะธะป ั ะธ ะฟะพะดะฝัะป ะฒะตั‰ะธั†ัƒ ั ั‚ัƒะผะฑั‹. ะœะฐั€ั‚ะธ ะทะฐะผะตั‚ะฝะพ ะฟะพะบั€ะฐัะฝะตะปะฐ ะธ ะฟะพะฟั‹ั‚ะฐะปะฐััŒ ะพั‚ะฝัั‚ัŒ ัƒ ะผะตะฝั ัะฒะพัŽ ะฒะตั‰ัŒ, ะฝะพ ั ัะพะฒัะตะผ ะฝะต ั…ะพั‚ะตะป ัั‚ะพะณะพ ะดะตะปะฐั‚ัŒ. - ะžัั‚ะฐะฒัŒ ะตะณะพ ะผะฝะต, ะพะฝ ะพั‡ะตะฝัŒ ะบั€ะฐัะธะฒั‹ะน, ะพะฝ ะบะพะณะดะฐ-ะฝะธะฑัƒะดัŒ ั‚ะตะฑั ะฟั€ะธะณะพะดะธั‚ัั, ะพะฑะตั‰ะฐัŽ, - ะฟั€ะพัˆะตะฟั‚ะฐะป ั ะฝะฐ ัƒัˆะบะพ ะขะธ, ะพะฝะฐ ัƒะดะธะฒะปั‘ะฝะฝะพ ะฟะพัะผะพั‚ั€ะตะปะฐ ะฝะฐ ะผะตะฝั, ะฐ ะฟะพั‚ะพะผ ะฟะพัˆะปะพ ัƒะปั‹ะฑะฝัƒะปะฐััŒ. ะžะฝะฐ ะฒัะตะณะดะฐ ะฟะพะฝะธะผะฐะปะฐ ะผะตะฝั ั ะฟะพะปัƒัะปะพะฒะฐ, ะตะน ะฝะต ะฝัƒะถะฝั‹ ะฑั‹ะปะธ ะผะพะธ ะฝะฐะผั‘ะบะธ. - ะฏ ะบ ั‚ะฒะพะธะผ ัƒัะปัƒะณะฐะผ, ะฐ ัะตะนั‡ะฐั ั ะฟะพะนะดัƒ ะธ ะฒะพะทะผะพะถะฝะพ ัะตะณะพะดะฝั ะฝะต ะฟั€ะธะดัƒ ะฝะพั‡ะตะฒะฐั‚ัŒ, ั ะพัั‚ะฐะฝัƒััŒ ัƒ ะœะตั€ัะธ, ะพะฝะฐ ะณะพะฒะพั€ะธั‚ ั‡ั‚ะพ ัะพัะบัƒั‡ะธะปะฐััŒ ะฟะพ ะผะฝะต, ั ะถะต ะฒัั‘-ั‚ะฐะบะธ ะดะพะปะณะพะต ะฒั€ะตะผั ะฑั‹ะปะฐ ะฒ ะ›ะพะฝะดะพะฝะต, - ัะบะฐะทะฐะปะฐ ะพะฝะฐ ะธ ัะบั€ั‹ะปะฐััŒ ะฒ ะฒะฐะฝะฝะพะน. - ะขั‹ ะฑั‹ะปะฐ ะฒ ะ›ะพะฝะดะพะฝะต? ะŸะพั‡ะตะผัƒ ั‚ั‹ ะฝะต ัะบะฐะทะฐะปะฐ ะผะฝะต? - ัะฟั€ะพัะธะป ั ะธ ะฝะฐั‚ัะฝัƒะป ะฝะฐ ัะตะฑั ะฑั€ัŽะบะธ, ะฐ ะทะฐั‚ะตะผ ะธ ั€ัƒะฑะฐัˆะบัƒ. 
- ะฏ ะทะฝะฐะปะฐ ั‡ั‚ะพ ัƒ ั‚ะตะฑั ะฑั‹ะปะธ ะฟั€ะพะฑะปะตะผั‹ ั ั‚ะฒะพะตะน ะดะตะฒัƒัˆะบะพะน ะธ ะฝะต ะทะฒะพะฝะธะปะฐ ะดะพะปะณะพะต ะฒั€ะตะผั, ะฟะพะผะฝะธัˆัŒ ั‡ะตั‚ั‹ั€ะต ะผะตััั†ะฐ ะผั‹ ั ั‚ะพะฑะพะน ะฝะต ะพะฑั‰ะฐะปะธััŒ, ะฒะพั‚ ั‚ะพะณะดะฐ ั ะธ ะฑั‹ะปะฐ ะฒ ะ›ะพะฝะดะพะฝะต ะธ, ะทะฝะฐะตัˆัŒ ั ะฟั€ะตะบั€ะฐัะฝะพ ะพั‚ะดะพั…ะฝัƒะปะฐ, - ะผะตั‡ั‚ะฐั‚ะตะปัŒะฝะพ ัะบะฐะทะฐะปะฐ ะพะฝะฐ ะธ ะฒั‹ัˆะปะฐ ะธะท ะฒะฐะฝะฝะพะน, ะฝะฐ ะฝะตะน ะฑั‹ะปะฐ ะพะดะตั‚ะฐ ัŽะฑะบะฐ ั‡ัƒั‚ัŒ ะฒั‹ัˆะต ะบะพะปะตะฝะฐ, ะฝะตะถะฝะพ-ั€ะพะทะพะฒะฐั ะฑะปัƒะทะบะฐ ะธ ั‚ัƒั„ะปะธ ะฝะฐ ัˆะฟะธะปัŒะบะต, ัั‚ะพ ะตั‘ ะพะฑั‹ั‡ะฝะฐั ะพะดะตะถะดะฐ ะฒ ะธะฝัั‚ะธั‚ัƒั‚. - ะœะฐะบะฐั€ะตะฝะฐ ะผั‹ ัƒะตั…ะฐะปะธ, ั ะฑัƒะดัƒ ะฟะพะทะดะฝะพ ะฒะตั‡ะตั€ะพะผ, ะฐ ะขะธะฝะธ ัะตะณะพะดะฝั ะฝะพั‡ัƒะตั‚ ัƒ ะฟะพะดั€ัƒะณะธ, ั‚ะฐะบ ั‡ั‚ะพ ั‚ั‹ ะผะพะถะตัˆัŒ ั†ะตะปั‹ะน ะดะตะฝัŒ, ะฐ ะฟะพั‚ะพะผ ะธ ะฝะพั‡ัŒ ะพั‚ะดั‹ั…ะฐั‚ัŒ, - ัะฟะพะบะพะนะฝะพ ัะบะฐะทะฐะป ั ะธ ะพะฑัƒะป ะปะฐะบะพะฒั‹ะต ั‚ัƒั„ะปะธ. ะงะตั€ะตะท ะดะฒะฐ ะผะตััั†ะฐ ั ะทะฐะนะผัƒ ะผะตัั‚ะฐ ัะฒะพะตะณะพ ะพั‚ั†ะฐ, ั ะฒ ะฟั€ะตะดะฒะบัƒัˆะตะฝะธะต ัั‚ะพะณะพ ะดะฝั. ะ ะพะดะธั‚ะตะปะธ ัƒ ะผะตะฝั ะพั‡ะตะฝัŒ ะฟะพั€ัะดะพั‡ะฝั‹ะต ะปัŽะดะธ, ะพะฝะธ ะฟั€ะพัั‚ะพ ะฝะตะฝะฐะฒะธะดัั‚ ะœะฐะบะฐั€ะตะฝัƒ. - ะฏ ัั‡ะฐัั‚ะปะธะฒะฐ, ะฟะพะบะฐ, - ะบั€ะธะบะฝัƒะปะฐ ะพะฝะฐ ัะพ ะฒั‚ะพั€ะพะณะพ ัั‚ะฐะถะฐ. ะฏ ะฒั‹ัˆะตะป ะธะท ะดะพะผะฐ ะธ ะฝะฐะฟั€ะฐะฒะธะปัั ะบ ัะฒะพะตะน ะผะฐัˆะธะฝะต. ะกะตะฒ ะฒ ะฝะตั‘ ั ะฟั€ะธะฝัะปัั ะถะดะฐั‚ัŒ ะขะธะฝะธ, ะธะฝะพะณะดะฐ ะผะฝะต ะบะฐะถะตั‚ัั ั‡ั‚ะพ ะพะฝะฐ ะฝะต ะฑั‹ะฒัˆะฐั, ะฐ ะฝะฐัั‚ะพัั‰ะฐั. ะ›ัŽะฑะฒะธ ะบะพะฝะตั‡ะฝะพ ะฝะตั‚, ะฝะพ ะฒะปะตั‡ะตะฝะธะต ะบะฐะบ ะบ ะถะตะฝั‰ะธะฝะต ะตัั‚ัŒ ะพะณั€ะพะผะฝะพะต. ะกะฟัƒัั‚ั ะผะธะฝัƒั‚ัƒ ะธะท ะดะพะผะฐ ะฒั‹ัˆะปะฐ ะœะฐั€ั‚ะธะฝะฐ, ะฑะพั€ะผะฐั‡ะฐ ั‡ั‚ะพ-ั‚ะพ ะฟั€ะธ ัั‚ะพะผ ัะตะฑะต ะฟะพะด ะฝะพั. 
- ะขะฒะพั ะดะตะฒัƒัˆะบะฐ ะฟั€ะพัั‚ะพ ะธะดะธะพั‚ะบะฐ, ะพะฝะฐ ะฝะต ัƒะผะตะตั‚ ะฝะธ ะณะพั‚ะพะฒะธั‚ัŒ, ะฝะธ ัƒะฑะธั€ะฐั‚ัŒัั, ะดะฐะถะต ะฝะพั€ะผะฐะปัŒะฝะพ ั„ะพั€ะผัƒะปะธั€ะพะฒะฐั‚ัŒ ัะฒะพะธ ะผั‹ัะปะธ ะฝะต ัƒะผะตะตั‚, ะบะฐะบ ั‚ั‹ ะผะพะถะตัˆัŒ ั ะฝะตะน ะฒัั‚ั€ะตั‡ะฐั‚ัŒัั? - ะฝะตั€ะฒะฝะพ ัะฟั€ะฐัˆะธะฒะฐะปะฐ ะพะฝะฐ, ัƒัะฐะถะธะฒะฐัััŒ ะบะพ ะผะฝะต ะฒ ะผะฐัˆะธะฝัƒ. - ะ—ะฐั‚ะพ ะฒ ะฟะพัั‚ะตะปะธ ะพะฝะฐ ะฟั€ะพัั‚ะพ ะฑะพะผะฑะฐ, - ัะบะฐะทะฐะป ั ะธ ะฒั‹ะตั…ะฐะป ัะพ ะดะฒะพั€ะฐ. - ะฏัะฝะพ, - ะฟั€ะพะธะทะฝะตัะปะฐ ะพะฝะฐ ะธ ะพั‚ะฒะตั€ะฝัƒะปะฐััŒ. ะžะฝะฐ ั‡ั‚ะพ ะพะฑะธะดะตะปะฐััŒ? ะฏ ะฝะต ัั‚ะฐะป ะฝะธั‡ะตะณะพ ะฑะพะปัŒัˆะต ัะฟั€ะฐัˆะธะฒะฐั‚ัŒ, ะปะธัˆัŒ ะฟั€ะพะดะพะปะถะธะป ะฟัƒั‚ัŒ ะบ ัƒะฝะธะฒะตั€ัƒ. ะ—ะฐะบะพะฝั‡ะธะฒ ัะฒะพะน ั€ะฐะฑะพั‡ะธะน ะดะตะฝัŒ ั ัƒัั‚ะฐะปะพ ะพั‚ะบะธะฝัƒะปัั ะฝะฐ ัะฟะธะฝะบัƒ ัั‚ัƒะปะฐ ะธ ั‚ัะถะตะปะพ ะฒั‹ะดะพั…ะฝัƒะป. ะะฐ ั‡ะฐัะฐั… ะฒะพัะตะผัŒ ะฒะตั‡ะตั€ะฐ, ะฐ ะดะพะผะพะน ัะพะฒัะตะผ ะฝะต ั…ะพั‡ะตั‚ัั. ะšะฐะถะดั‹ะน ะดะตะฝัŒ ะพะดะฝะพ ะธ ั‚ะพะถะต, ั…ะพั‡ะตั‚ัั ั€ะฐะทะฝะพะพะฑั€ะฐะทะธั. ะขะตะปะตั„ะพะฝ ะบะพั‚ะพั€ั‹ะน ะปะตะถะฐะป ะฝะฐ ัั‚ะพะปะต, ะฝะฐั‡ะฐะป ะฝะฐัั‚ะพะนั‡ะธะฒะพ ั‚ั€ะตะทะฒะพะฝะธั‚ัŒ, ะฟะพัะผะพั‚ั€ะตะฒ ะฝะฐ ัะบั€ะฐะฝ ั ัƒะฒะธะดะตะป ะฝะตะทะฝะฐะบะพะผั‹ะน ะฝะพะผะตั€, ะฝะฐะถะฐะฒ ะบะฝะพะฟะบัƒ ะฟั€ะธะฝัั‚ัŒ ั ะฟะพะดะฝั‘ั ั‚ะตะปะตั„ะพะฝ ะบ ัƒั…ัƒ. - ะะปะปะพ, - ั‚ะธั…ะพ ัะบะฐะทะฐะป ั. - ะญะผ, ะทะดั€ะฐะฒัั‚ัƒะนั‚ะต, ะฒั‹ ะทะฝะฐะตั‚ะต ะดะตะฒัƒัˆะตะบ ะฟะพ ะธะผะตะฝะธ ะœะฐั€ั‚ะธะฝะฐ ะจั‚ะพััะตะปัŒ, ะœะตั€ัะตะดะตั ะ›ะฐะผะฑั€ะต, ะ›ะพะดะพะฒะธะบะฐ ะšะพะผะตะปัŒะพ ะธ ะšะฐะฝะดะตะปะฐั€ะธั ะœะพะปั„ะตัะต? - ะผัƒะถัะบะพะน ะณะพะปะพั ั€ะฐะทะดะฐะปัั ะฝะฐ ั‚ะพะผ ะบะพะฝั†ะต ะฟั€ะพะฒะพ' - 'ะกะตะณะพะดะฝั, ะฟั€ั‹ะณะฐั ะฝะฐ ะบั€ะพะฒะฐั‚ะธ, ะšะธั€ะฐ ัะปะพะผะฐะปะฐ ะตะต. ะžะฝะฐ ะพั‚ั‡ะฐัะฝะฝะพ ะฟั‹ั‚ะฐะปะฐััŒ ะดะพะฟั€ั‹ะณะฝัƒั‚ัŒ ะดะพ ะฟะพั‚ะพะปะบะฐ, ะฝะพ ะฝะธั‡ะตะณะพ ะฝะต ะฟะพะปัƒั‡ะฐะปะพััŒ, ะธ ะพะฟะธะปะบะธ ะปะธัˆัŒ ั‚ั‰ะตั‚ะฝะพ ัั‹ะฟะฐะปะธััŒ ะฝะฐ ะฟะพะป. 
ะะธะบั‚ะพ ะฝะต ัะปั‹ัˆะฐะป ะฝะธ ัะบั€ะตะถะตั‚ะฐ ะฟั€ัƒะถะธะฝ, ะฝะธ ะณั€ะพั…ะพั‚ะฐ; ะฝะต ะฑั‹ะปะพ ะฒะธะดะฝะพ ะธ ัะฐะผะพะน ะฟะพะปะพะผะบะธ. ะœะฐั‚ัŒ, ั€ัƒะณะฐั ะดะพั‡ัŒ, ะฒ ะพั‚ะฒะตั‚ ะฟะพะปัƒั‡ะธะปะฐ ะปะธัˆัŒ ัƒัั‚ะฐะปะพะต ั€ะฐะฒะฝะพะดัƒัˆะธะต, ั‡ั‚ะพ, ะบะพะฝะตั‡ะฝะพ ะถะต, ะฒั‹ะฒะตะปะพ ะตะต ะธะท ัะตะฑั. ะšั€ะธั‡ะฐ ั‡ั‚ะพ-ั‚ะพ ะฝะตั‡ะปะตะฝะพั€ะฐะทะดะตะปัŒะฝะพะต, ะพะฝะฐ ัั‚ัƒั‡ะฐะปะฐ ะฝะพะณะพะน ะฟะพ ัะปะพะผะฐะฝะฝะพะผัƒ ะฟั€ะตะดะผะตั‚ัƒ. ะ–ะตะฝั‰ะธะฝะฐ ะฝะต ะฟะพะฝะธะผะฐะปะฐ, ั‡ั‚ะพ ะพะฝะฐ ะดะตะปะฐะปะฐ ั‚ะพะปัŒะบะพ ั…ัƒะถะต, ะฝะพ ะณะฝะตะฒ ะฒ ะตะต ะบั€ะพะฒะธ ะฒะทัะป ะฒะตั€ั…. - ะ”ะฐ ะบะฐะบ ั‚ั‹ ัะผะตะตัˆัŒ, ะฟะฐั€ัˆะธะฒะฐั ะดะตะฒั‡ะพะฝะบะฐ! ะฏ ั‚ะพะปัŒะบะพ ะธ ะดะตะปะฐะปะฐ, ั‡ั‚ะพ ัƒั…ะฐะถะธะฒะฐะปะฐ ะทะฐ ั‚ะฒะพะตะน ะบั€ะพะฒะฐั‚ัŒัŽ! ะ ั‚ั‹ ั€ะตัˆะธะปะฐ ัƒัั‚ั€ะพะธั‚ัŒ ะฟะพะณั€ะพะผ?! ะ—ะฝะฐะตัˆัŒ, ั‡ั‚ะพ?! ะฏ ัั‚ะพ ั‚ะฐะบ ะฝะต ะพัั‚ะฐะฒะปัŽ! - ะฝะฐ ัั‚ะธั… ัะปะพะฒะฐั… ะถะตะฝั‰ะธะฝะฐ, ั‡ัƒั‚ัŒ ะปะธ ะฝะต ัะฝะธะผะฐั ะดะฒะตั€ัŒ ั ะฟะตั‚ะตะปัŒ, ะฒั‹ะฑะตะถะฐะปะฐ ะธะท ะบะพะผะฝะฐั‚ั‹. ะšะธั€ะฐ ั€ะตะทะบะพ ะพะฟัƒัั‚ะธะปะฐััŒ ะฝะฐ ะบะพะปะตะฝะธ. ะŸั€ะธะถะฐะฒ ั€ัƒะบะธ ะบ ะบั€ะพะฒะฐั‚ะธ, ะพะฝะฐ ะฟั‹ั‚ะฐะปะฐััŒ ัะดะตั€ะถะธะฒะฐั‚ัŒ ะตะต ะฝะตะฒั‹ะฝะพัะธะผั‹ะน ัะบั€ะตะถะตั‚. ะ’ะทัะฒ ะผะพะปะพั‚ะพะบ ะธ ะณะฒะพะทะดะธ ะธะท ะบะปะฐะดะพะฒะพะน, ะดะตะฒะพั‡ะบะฐ ะฑะตะทะฝะฐะดะตะถะฝะพ ะบะพะปะพั‚ะธะปะฐ ะฟะพ ะพะฑะปะพะผะบะฐะผ, ะฟั‹ั‚ะฐัััŒ ั…ะพั‚ัŒ ะบะฐะบ-ั‚ะพ ะธั… ัะพะตะดะธะฝะธั‚ัŒ. ะะพ ะฒัะต ะพะบะฐะทะฐะปะพััŒ ะฑะตะทั€ะตะทัƒะปัŒั‚ะฐั‚ะฝะพ: ะพะฑะปะพะผะบะธ ะปะธัˆัŒ ั ะตั‰ะต ะฑะพะปัŒัˆะธะผ ัั‚ั€ะตะผะปะตะฝะธะตะผ ั€ะฐัะบะฐะปั‹ะฒะฐะปะธััŒ ะฟะพะด ะณะฝะตั‚ะพะผ ะณะฒะพะทะดะตะน. ะžะฝะฐ ะปะตะณะปะฐ ะฝะฐ ะฟะพะป. ะ›ะตะณะบะธะน ัะบะฒะพะทะฝัะบ ั‰ะตะบะพั‚ะฐะป ะตะต ัะฟะธะฝัƒ. - ะฏ ะฝะธะบะพะณะดะฐ ะฝะต ัะผะพะณัƒ ะดะพะฟั€ั‹ะณะฝัƒั‚ัŒ ะดะพ ะฟะพั‚ะพะปะบะฐ, - ัะบะฐะทะฐะปะฐ ะšะธั€ะฐ ะธ ะฒั‹ะดะพั…ะฝัƒะปะฐ. - ะ ะฒะดั€ัƒะณ ัั‚ะพ ะฝะต ั‚ะฐะบ? ะšะธั€ะฐ ั€ะตะทะฒะพ ะฒัั‚ะฐะปะฐ. 
ะะฐ ะตะต ะปะธั†ะต ะฟะพัะฒะธะปะฐััŒ ะผะฐัะบะฐ ะฝะตะดะพัƒะผะตะฝะธั, ะฐ ะฒ ะณั€ัƒะดะธ ะฝะฐั‡ะฐะป ั€ะฐะทะถะธะณะฐั‚ัŒัั ะพะณะพะฝะตะบ ัั‚ั€ะฐั…ะฐ. ะžั‚ะบัƒะดะฐ ัั‚ะพั‚ ะณะพะปะพั? - ะะต ะฑะพะนัั, ะณะปัƒะฟั‹ัˆะบะฐ, - ะณะพะปะพั ะฑั‹ะป ะพั‡ะตะฝัŒ ะผัะณะพะบ. - ะžั‚ะบัƒะดะฐ ั‚ั‹? ะฏ ั‚ะตะฑั ั€ะฐะฝัŒัˆะต ะฝะต ัะปั‹ัˆะฐะปะฐ... - ะ ั€ะฐะทะฒะต ัั‚ะพ ะฒะฐะถะฝะพ? - ะ ั‡ั‚ะพ, ะฝะตั‚? - ะŸะพั‡ะตะผัƒ ัั‚ะพ ะดะพะปะถะฝะพ ะฑั‹ั‚ัŒ ะฒะฐะถะฝะพ? ะ ะฐะทะฒะต ะฝะตะปัŒะทั ะฟั€ะพัั‚ะพ ะฟะพะณะพะฒะพั€ะธั‚ัŒ ั ั‚ะพะฑะพะน? - ะขั‹ ะดัƒะผะฐะตัˆัŒ, ั ะฑัƒะดัƒ ะณะพะฒะพั€ะธั‚ัŒ ั ะฝะตะทะฝะฐะบะพะผั‹ะผ ะณะพะปะพัะพะผ? - ะ ะฟะพั‡ะตะผัƒ ะฝะตั‚? - ะขะฐะบ. ะœะฝะต ะฝะฐะดะพะตะดะฐะตั‚ ัั‚ะฐ ะธะณั€ะฐ ะฒ ะฒะพะฟั€ะพัั‹. ะ“ะพะฒะพั€ะธ, ั‡ั‚ะพ ะธะปะธ ะบั‚ะพ ั‚ั‹ ะตัั‚ัŒ? ะ’ะฝะตะทะฐะฟะฝะพ ะฝะฐัั‚ัƒะฟะธะปะพ ะผะพะปั‡ะฐะฝะธะต, ะฟะพัะปะต ั‡ะตะณะพ ะฟะพัะปะตะดะพะฒะฐะปะพ ะฟั€ะพะดะพะปะถะธั‚ะตะปัŒะฝะพะต ะณัƒะดะตะฝะธะต. ะ“ะพะปะพั ะฝะฐั‡ะฐะป ะฝะฐะฟะตะฒะฐั‚ัŒ ะฟะตัะตะฝะบัƒ, ะฝะต ะฟะตัะฝัŽ, ะฐ ะธะผะตะฝะฝะพ ะฟะตัะตะฝะบัƒ. ะ›ัŽะฑะธะผัƒัŽ ะฟะตัะตะฝะบัƒ ะšะธั€ั‹, ะบะพั‚ะพั€ัƒัŽ ะพะฝะฐ ะทะฐะฒะพะดะธะปะฐ ะบะฐะถะดั‹ะน ั€ะฐะท, ะบะพะณะดะฐ ะปะพะผะฐะปะพััŒ ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ ะฒ ะตะต ะบะพะผะฝะฐั‚ะต. - ะฏ ะผะพะณัƒ ะฟะพัั‚ั€ะพะธั‚ัŒ ั‚ะตะฑะต ะฝะพะฒัƒัŽ ะบั€ะพะฒะฐั‚ัŒ. ะ“ะพั€ะฐะทะดะพ ะปัƒั‡ัˆะต ัั‚ะพะน. ะ’ ะฝะตะน ะฑัƒะดะตั‚ ะผะฝะพะณะพ ั†ะฒะตั‚ะพะฒ ะธ ัะปะฐะดะพัั‚ะตะน... ะ”ะตะฒะพั‡ะบะฐ ะพะถะธะฒะธะปะฐััŒ. ะ’ ะตะต ั€ะตั‡ะธ ะฟะพัะปั‹ัˆะฐะปะธััŒ ะฝะพั‚ะบะธ ั€ะฐะดะพัั‚ะธ. - ะŸั€ะฐะฒะดะฐ? ะขั‹ ัะดะตะปะฐะตัˆัŒ ัั‚ะพ? - ะ”ะฐ, ะฝะพ ะฒะพั‚ ั‚ะพะปัŒะบะพ... - ะงั‚ะพ "ั‚ะพะปัŒะบะพ"? - ะขะพะปัŒะบะพ ะพะฝะฐ ะฑัƒะดะตั‚ ะฝะต ะฝะฐัั‚ะพัั‰ะตะน. ะขั‹ ะฝะต ัะผะพะถะตัˆัŒ ะฝะฐ ะฝะตะน ัะฟะฐั‚ัŒ, ะฝะพ ะพะฝะฐ ะฑัƒะดะตั‚ ะฒ ั‚ะฒะพะตะน ะบะพะผะฝะฐั‚ะต. - ะณะพะปะพั ะพั‚ะบะฐัˆะปัะปัั. - ะั…, ะดะฐ. ะšั€ะพะผะต ั‚ะตะฑั ะตะต ะฝะธะบั‚ะพ ะฝะต ัƒะฒะธะดะธั‚. ะ”ะตะฒะพั‡ะบะฐ ะทะฐะดัƒะผั‡ะธะฒะพ ัƒะปั‹ะฑะฝัƒะปะฐััŒ. - ะะพ ะบะพะณะดะฐ ะถะต ั ัะผะพะณัƒ ัƒะฒะธะดะตั‚ัŒ ัะฒะพัŽ ะบั€ะพะฒะฐั‚ัŒ? ะ“ะพะปะพั ะฝะฐั‡ะฐะป ัะผะตัั‚ัŒัั. 
ะกะธะปัŒะฝะพ, ะดะพะปะณะพ, ะฝะพ ะผัะณะบะพ. ะญั‚ะพั‚ ัะผะตั… ะฑั‹ะป ะพั‡ะตะฝัŒ ะธ ะพั‡ะตะฝัŒ ะฝะตะพะฑั‹ั‡ะตะฝ: ะฒั€ะพะดะต ะฑั‹ ะธ ะดะพะฑั€ั‹ะน, ะฐ ะฒั€ะพะดะต ะฑั‹ ะธ ั ะฝะฐัะผะตัˆะบะพะน. ะ–ะฐะปะพัั‚ัŒ. ะ–ะฐะปะพัั‚ัŒ ัƒะฟั€ะฐะฒะปัะปะฐ ะธะผ. - ะŸะพั‡ะตะผัƒ ั‚ั‹ ัะผะตะตัˆัŒัั? - ะ”ะฐ ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ั‚ั‹ ะณะปัƒะฟะฐั ะดะตะฒะพั‡ะบะฐ, ะบะพั‚ะพั€ะฐั ะดะฐะถะต ะฝะต ะผะพะถะตั‚ ั€ะตัˆะธั‚ัŒ. - ะฏ ะฒะพะฒัะต ะฝะต ะณะปัƒะฟะฐ! - ะ”ะฐ? ะขะฐะบ ะพั‚ะฒะตั‚ัŒ: ั‚ะตะฑะต ะฝัƒะถะฝะพ ั‚ะพ, ั‡ั‚ะพ ั ะฟั€ะตะดะปะฐะณะฐัŽ? - ะะพ ัั‚ะพ ะถะต ะฒะพะฒัะต ะฝะต ะฝะฐัั‚ะพัั‰ะฐั ะบั€ะพะฒะฐั‚ัŒ! - ะšะธั€ะฐ ะฟั€ะธะปะพะถะธะปะฐ ั€ัƒะบะธ ะบ ะปะธั†ัƒ. - ะะฐ ะฝะตะน ั ะฝะต ัะผะพะณัƒ ะดะพะฟั€ั‹ะณะฝัƒั‚ัŒ ะดะพ ะฟะพั‚ะพะปะบะฐ! ะ“ะพะปะพั ะพะฟัั‚ัŒ ะทะฐะปะธะปัั ัะผะตั…ะพะผ. - ะŸะžะงะ•ะœะฃ ะขะซ ะกะœะ•ะ•ะจะฌะกะฏ ะ’ะกะ• ะ’ะ ะ•ะœะฏ?! - ะ”ะฐ ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ั‚ั‹ ัƒะถะต ั€ะตัˆะธะปะฐ. ะฃะถะต ะดะฐะฒะฝั‹ะผ-ะดะฐะฒะฝะพ ั€ะตัˆะธะปะฐ. - ะ˜ ั‡ั‚ะพ ะถะต ั ั€ะตัˆะธะปะฐ? - ะขั‹ ัะพะณะปะฐัะฝะฐ, ะฒะตะดัŒ ั‚ะฐะบ? ะšะธั€ะฐ ะทะฐะผะตัˆะบะฐะปะฐััŒ, ะฝะพ, ะฒัะต ะถะต, ะฒั‹ะดะฐะฒะธะปะฐ ะธะท ัะตะฑั ะฝะตัƒะฒะตั€ะตะฝะฝะพะต "ะดะฐ". ะ“ะพะปะพั ะฟั€ะพะฟะฐะป, ะพัั‚ะฐะฒะธะฒ ะฟะพัะปะต ัะตะฑั ะพะณั€ะพะผะฝัƒัŽ ะบั€ะพะฒะฐั‚ัŒ, ั ะฑะพะปัŒัˆะธะผ ะผะฐั‚ั€ะฐัะพะผ ะธ ะผัะณะบะธะผะธ ะฟะพะดัƒัˆะบะฐะผะธ. ะะฐ ั‚ะฐะบะพะน ะบั€ะพะฒะฐั‚ะธ, ะพะฟั€ะตะดะตะปะตะฝะฝะพ, ะผะพะถะฝะพ ะฑั‹ะปะพ ะฑั‹ ะดะพะฟั€ั‹ะณะฝัƒั‚ัŒ ะดะพ ะฟะพั‚ะพะปะบะฐ.' - source_sentence: 'ะขัะถั‘ะปั‹ะต ะฟะพั€ั‚ัŒะตั€ั‹ ั ั‚ะธั…ะธะผ ัˆะตะปะตัั‚ะพะผ ั€ะฐะทัŠะตั…ะฐะปะธััŒ ะฒ ัั‚ะพั€ะพะฝั‹, ะฒะฟัƒัะบะฐั ะฒะฝัƒั‚ั€ัŒ ัะฒะตั‚. ะžะฝ ะฑั‹ะป ัะตั€ะพ-ะฑะตะปั‹ะผ ะธ ะฑะพะปัŒะฝะพ ะฑะธะป ะฟะพ ะฟั€ะธะฒั‹ะบัˆะธะผ ะบ ั‚ะตะผะฝะพั‚ะต ะณะปะฐะทะฐะผ. ะฏ ะฟะพั‚ัƒัˆะธะป ัั‚ะพัะฒัˆะธะต ะฝะฐ ะฝะพั‡ะฝะพะผ ัั‚ะพะปะธะบะต ัะฒะตั‡ะธ ะธ ะฟะพะดะฝัะปัั ั ะบั€ะพะฒะฐั‚ะธ. ะญั‚ะพั‚ ะดะตะฝัŒ ะฝะต ะฟั€ะตะดะฒะตั‰ะฐะป ะฝะธั‡ะตะณะพ ะฝะตะพะฑั‹ั‡ะฝะพะณะพ. ะฏ ะบะฐะบ ะฒัะตะณะดะฐ ะฟั€ะพัะฝัƒะปัั ะฝะฐ ะฝะตัะบะพะปัŒะบะพ ะผะณะฝะพะฒะตะฝะธะน ั€ะฐะฝัŒัˆะต ะฒัะตะพะฑั‰ะตะน ะฟะพะฑัƒะดะบะธ. 
ะะฐ ะฐะฒั‚ะพะผะฐั‚ะต ะฟั€ะธะฒั‘ะป ัะตะฑั ะฒ ะฟะพั€ัะดะพะบ ะธ ะพะฑะปะฐั‡ะธะปัั ะฒ ัะฒะพะน ะดะปะธะฝะฝั‹ะน ั‡ั‘ั€ะฝั‹ะน ะฑะฐะปะฐั…ะพะฝ, ะฟะพะดะฟะพััะฐะฒ ะตะณะพ ัะตั€ะตะฑั€ัะฝั‹ะผ ัˆะฝัƒั€ะบะพะผ - ะทะฝะฐะบะพะผ ะถะฝะตั†ะพะฒ. ะะฐะดะตะป ะฝะฐ ะณะพะปะพะฒัƒ ะฝะตะฝะฐะฒะธัั‚ะฝั‹ะน ะพะฑั€ัƒั‡, ะบะพั‚ะพั€ั‹ะน ัะธะปัŒะฝะพ ะดะฐะฒะธะป ะฝะฐ ะปะพะฑ, ะฝะพ ะฑะตะท ะบะพั‚ะพั€ะพะณะพ ั ะฝะต ะผะพะณ ะฟะพัะฒะธั‚ัŒัั ะฒะฝะต ัะฒะพะตะน ะบะตะปัŒะธ - ะพะฝ ัƒะบะฐะทั‹ะฒะฐะป ะฝะฐ ะผะพัŽ ะฟั€ะธะฝะฐะดะปะตะถะฝะพัั‚ัŒ ะบ ะ’ั‹ััˆะตะน ะบะฐัั‚ะต. ะŸะพัะปะตะดะฝะธะน ัˆั‚ั€ะธั… - ะšะพะปัŒั†ะพ ะฒะตั€ะฝะพัั‚ะธ. ะœะพั ะบะตะปัŒั ั€ะฐัะฟะพะปะฐะณะฐะปะฐััŒ ะฝะฐ ัะฐะผะพะผ ะฟั€ะตัั‚ะธะถะฝะพะผ ัั€ัƒัะต - ะฝะธะถะฝะตะผ. ะœะตัั‚ะฝั‹ะต ะตั‰ั‘ ะฝะฐะทั‹ะฒะฐะปะธ ะตะณะพ ะšะพะปั‹ะฑะตะปัŒัŽ ั‚ะตะฝะตะน. ะขะฐะผ ะธะผะตะปะธ ะฟั€ะฐะฒะพ ะฝะฐั…ะพะดะธั‚ัŒัั ั‚ะพะปัŒะบะพ ะถะฝะตั†ั‹ ะ’ั‹ััˆะตะน ะบะฐัั‚ั‹ - ั‚ะต, ะบะพะณะพ ะกะผะตั€ั‚ัŒ ะฒั‹ะฑั€ะฐะปะฐ ัะฒะพะธะผะธ ัะพะฒะตั‚ะฝะธะบะฐะผะธ. ะšะฐะถะดั‹ะน ั€ะฐััะฒะตั‚ ั ะฟะพะดะฝะธะผะฐะปัั ะฝะฐ ะพั‚ัั‡ั‘ั‚ะฝั‹ะน ัั€ัƒั, ะณะดะต ะฟะพะปัƒั‡ะฐะป ัƒะบะฐะทะฐะฝะธั ะพั‚ ะกั‚ะฐั€ะตะนัˆะธะฝ, ะฐ ะทะฐั‚ะตะผ ะฟั€ะธัั‚ัƒะฟะฐะป ะบ ัะฒะพะตะน ั€ะฐะฑะพั‚ะต. ะฏ - ัะพะฑะธั€ะฐั‚ะตะปัŒ ะดัƒัˆ. ะฏ ะฟะตั€ะตัั‘ะบ ัƒั‡ะตะฑะฝั‹ะน ะทะฐะป, ะณะดะต ะพะฑัƒั‡ะฐะปะธััŒ ะผะพะปะพะดั‹ะต ะถะฝะตั†ั‹. ะšะพะณะดะฐ-ั‚ะพ ั ะธ ัะฐะผ ัะธะดะตะป ะฝะฐ ัั‚ะธั… ัะบะฐะผัŒัั… ะธ ั‚ั‰ะฐั‚ะตะปัŒะฝะพ ะฒะฝะธะผะฐะป ะบะฐะถะดะพะผัƒ ัะปะพะฒัƒ ะฟั€ะพั„ะตััะพั€ะฐ. ะะฐะฒะตั€ะฝะพะต, ะธะผะตะฝะฝะพ ะผะพะธ ัั‚ะฐั€ะฐั‚ะตะปัŒะฝะพัั‚ัŒ ะธ ัƒะฟะพั€ัั‚ะฒะพ ะฟะพะผะพะณะปะธ ะผะฝะต ะดะพัะปัƒะถะธั‚ัŒัั ะดะพ ะ’ั‹ััˆะตะน ะบะฐัั‚ั‹. ะกะฐะผะพ ะผะตัั‚ะพ, ะณะดะต ะผั‹ ะพะฑะธั‚ะฐะปะธ, ะฝะฐะทั‹ะฒะฐะปะพััŒ ะฅั€ะฐะผะพะผ. ะžะฝ ั€ะฐัะฟะพะปะฐะณะฐะปัั ะฒะฝะต ะฟั€ะพัั‚ั€ะฐะฝัั‚ะฒะฐ ะธ ะฒั€ะตะผะตะฝะธ. ะฅั€ะฐะผ ะฑั‹ะป ัะดะตะปะฐะฝ ะธะท ะฑะตะปะพะณะพ, ะบั€ะฐัะฝะพะณะพ ะธ ั‡ั‘ั€ะฝะพะณะพ ะผั€ะฐะผะพั€ะฐ ะธ ะธะผะตะป ะพั‡ะตะฝัŒ ะฒั‹ัะพะบะธะต ัะฒะพะดั‹. 
ะšะพะต-ะณะดะต ะพะฝะธ ะดะพัั‚ะธะณะฐะปะธ ั‚ะฐะบะพะน ะฒั‹ัะพั‚ั‹, ั‡ั‚ะพ ะธั… ะฝะตะฒะพะทะผะพะถะฝะพ ะฑั‹ะปะพ ัƒะฒะธะดะตั‚ัŒ. ะขะฐะบะธะต ัะฒะพะดั‹ ะฝะฐะทั‹ะฒะฐะปะธััŒ ะฝะตะฑะตัะฐะผะธ. ะžะฑะธั‚ะฐั‚ะตะปะธ ะฅั€ะฐะผะฐ ะถะธะปะธ ะฒ ะบะตะปัŒัั… - ะฟั€ะพัั‚ะพั€ะฝั‹ั… ะบั€ัƒะณะปั‹ั… ะบะพะผะฝะฐั‚ะฐั…. ะŸะพัะบะพะปัŒะบัƒ ะฒั€ะตะผะตะฝะธ ะดะปั ะฝะฐั ะฝะต ััƒั‰ะตัั‚ะฒะพะฒะฐะปะพ, ะฝะพ ะฒะฝัƒั‚ั€ะธ ะฒัั‘ ั€ะฐะฒะฝะพ ั‚ะธะบะฐะปะธ ะฑะธะพะปะพะณะธั‡ะตัะบะธะต ั‡ะฐัั‹, ะฑั‹ะปะธ ะฟั€ะธะดัƒะผะฐะฝั‹ ั€ะฐััะฒะตั‚ั‹ ะธ ะทะฐะบะฐั‚ั‹. ะฅั€ะฐะผ ะฑั‹ะป ะพะบัƒั‚ะฐะฝ ะฑะตะปะพะน, ัะฒะตั‚ัั‰ะตะนัั ะฟะตะปะตะฝะพะน, ะฟะพัั‚ะพะผัƒ ะฝะฐ ะทะฐะบะฐั‚ะต, ะบะพะณะดะฐ ั‚ั€ะตะฑะพะฒะฐะปะพััŒ ะปะพะถะธั‚ัŒัั ัะฟะฐั‚ัŒ, ะฟะพั€ั‚ัŒะตั€ั‹ ะฝะฐ ะฒัะตั… ะพะบะฝะฐั… ะทะฐะดะฒะธะณะฐะปะธััŒ. ะะฐ ั€ะฐััะฒะตั‚ะต ะถะต ะฝะฐะพะฑะพั€ะพั‚ - ะพะฝะธ ั€ะฐะทัŠะตะทะถะฐะปะธััŒ ะฒ ั€ะฐะทะฝั‹ะต ัั‚ะพั€ะพะฝั‹. ะ”ะตะปะฐะปะพััŒ ะฒัั‘ ัั‚ะพ ะฒ ะตะดะธะฝะพะต ะดะปั ะฒัะตั… ะฒั€ะตะผั, ะธ ะธะทะผะตะฝะธั‚ัŒ ะทะฐะบั€ั‹ั‚ะธะต ะธ ั€ะฐัะบั€ั‹ั‚ะธะต ะฟะพั€ั‚ัŒะตั€ ะดะฐะถะต ะฒ ะปะธั‡ะฝั‹ั… ะบะตะปัŒัั… ะฑั‹ะปะพ ะฝะตะฒะพะทะผะพะถะฝะพ. ะะฐะบะพะฝะตั†, ั ะฝะฐ ัะฒะพั‘ะผ ัั€ัƒัะต. ะกะตะณะพะดะฝั ะฒ ะทะฐะปะต ะฑั‹ะปะพ ะฝะฐ ัƒะดะธะฒะปะตะฝะธะต ะฟัƒัั‚ะพ - ะฒะฝัƒั‚ั€ะธ ะพะบะฐะทะฐะปัั ะปะธัˆัŒ ะพะดะธะฝ ะกั‚ะฐั€ะตะนัˆะธะฝะฐ. ะšะพะณะดะฐ ั ะฒะพัˆั‘ะป, ะพะฝ ัั‚ะพัะป ัะฟะธะฝะพะน ะบะพ ะผะฝะต, ะพะฟั€ะพะบะธะฝัƒะฒ ะฝะฐะทะฐะด ะณะพะปะพะฒัƒ ะธ ั€ะฐะทะณะปัะดั‹ะฒะฐั ะฝะตะฑะตัะฐ. - ะ’ะฐะผ ัะปะตะดัƒะตั‚ ะฑั‹ั‚ัŒ ะฑะพะปะตะต ั€ะฐัั‚ะพั€ะพะฟะฝั‹ะผ, ะฑั€ะฐั‚ ะ ะธั…ะฐั€ะด, - ัะบะฐะทะฐะป ะพะฝ ัะฟะพะบะพะนะฝะพ, ะฝะต ะพั‚ั€ั‹ะฒะฐั ะฒะทะณะปัะดะฐ ะพั‚ ะฝะตะฑะตั. - ะžัะพะฑะตะฝะฝะพ ะบะพะณะดะฐ ะดะปั ะ’ะฐั ะฟั€ะธะฟะฐัะตะฝะพ ั‚ะฐะบะพะต ะทะฐะดะฐะฝะธะต. ะ•ะณะพ ัะปะพะฒะฐ ะทะฐัั‚ะฐะฒะธะปะธ ะผะตะฝั ะพะถะธะฒะธั‚ัŒัั: ะพะฝะธ ะพะฑะตั‰ะฐะปะธ ั‡ั‚ะพ-ั‚ะพ ะฟะพัะปะพะถะฝะตะต ะฟั€ะพัั‚ะพะณะพ ัะฑะพั€ะฐ ะดัƒัˆ. - ะ˜ ะบะฐะบะพะฒะพ ะถะต ะพะฝะพ? - ัะฟั€ะพัะธะป ั, ัั‚ะฐั€ะฐัััŒ ะฟั€ะธะดะฐั‚ัŒ ะณะพะปะพััƒ ะฑะตะทั€ะฐะทะปะธั‡ะธะต. 
- ะะต ะพะฑะพะปัŒั‰ะฐะนั‚ะตััŒ, ัั‚ะพ ะฝะต ะฟั€ะพัั‚ะพ ะธะฝั‚ะตั€ะตัะฝะฐั ะธะณั€ะฐ, - ะกั‚ะฐั€ะตะนัˆะธะฝะฐ ะฝะฐะบะพะฝะตั† ะพะฑะตั€ะฝัƒะปัั ะธ ะฟะพัะผะพั‚ั€ะตะป ะผะฝะต ะฒ ะณะปะฐะทะฐ. - ะ’ั‹ ะฑัƒะดะตั‚ะต ะธะผะตั‚ัŒ ะดะตะปะพ ั ะพั‡ะตะฝัŒ ะฝะตะพะฑั‹ั‡ะฝั‹ะผ ัะบะทะตะผะฟะปัั€ะพะผ. ะงัƒั‚ัŒ ะฟั€ะธะฟะพะดะฝัะฒ ะฟะพะปั‹ ะฟัƒั‚ะฐะฒัˆะตะณะพัั ะฒ ะฝะพะณะฐั… ะฑะฐะปะฐั…ะพะฝะฐ, ะกั‚ะฐั€ะตะนัˆะธะฝะฐ ะฟะพะดะพัˆั‘ะป ะฑะปะธะถะต ะบะพ ะผะฝะต. - ะœั‹ ะดะฐะฒะฝะพ ะพั…ะพั‚ะธะผัั ะทะฐ ะดัƒัˆะพะน ัั‚ะพะณะพ ั‡ะตะปะพะฒะตะบะฐ. ะกะผะตั€ั‚ัŒ ะฝะต ั€ะฐะท ะฒะฝะพัะธะปะฐ ะตะณะพ ะฒ ัะฒะพะธ ัะฟะธัะบะธ, ะฝะพ ะบะฐะบะธะผ-ั‚ะพ ะฝะตะฒะตะดะพะผั‹ะผ ะพะฑั€ะฐะทะพะผ ะพะฝ ัƒะถะต ั‡ะตั‚ั‹ั€ะต ั€ะฐะทะฐ ะพั‚ ะฝะฐั ัƒัะบะพะปัŒะทะฐะป. ะžะฟั‹ั‚ะฝะตะนัˆะธะต ะถะฝะตั†ั‹ ะฝะต ัะผะพะณะปะธ ัะฟั€ะฐะฒะธั‚ัŒัั ั ัั‚ะธะผ ะทะฐะดะฐะฝะธะตะผ. ะขะพะณะดะฐ ะกะผะตั€ั‚ัŒ ะพั‚ะดะฐะปะฐ ะฟั€ะธะบะฐะท ะพั‚ะฟั€ะฐะฒะธั‚ัŒ ะบ ัั‚ะพะผัƒ ั‡ะตะปะพะฒะตะบัƒ ะธะผะตะฝะฝะพ ะ’ะฐั, - ะกั‚ะฐั€ะตะนัˆะธะฝะฐ ัะดะตะปะฐะป ัƒะดะฐั€ะตะฝะธะต ะฝะฐ ะฟะพัะปะตะดะฝะตะผ ัะปะพะฒะต. - ะะต ะฟะพะดะฒะตะดะธั‚ะต ะตั‘. ะ˜ะฝะฐั‡ะต ะ’ั‹ ะทะฝะฐะตั‚ะต, ั‡ั‚ะพ ะ’ะฐั ะถะดั‘ั‚. ะžะฝ ะฑั€ะพัะธะป ะฒะทะณะปัะด ะฝะฐ ะงั‘ั€ะฝะพะต ะพะบะพ. ะญั‚ะพ ะบะพะปะพะดะตั†, ั€ะฐัะฟะพะปะฐะณะฐะฒัˆะธะนัั ะฒ ั†ะตะฝั‚ั€ะต ัั‚ะพะณะพ ะทะฐะปะฐ. ะžะฝ ะธัะฟะพะปัŒะทะพะฒะฐะปัั ะฒ ะบะฐั‡ะตัั‚ะฒะต ะฝะฐะบะฐะทะฐะฝะธั ะดะปั ะฝะตะฒะตั€ะฝั‹ั… ะธะปะธ ะฟั€ะพัั‚ะพ ะฝะตัƒะณะพะดะฝั‹ั… ะกะผะตั€ั‚ะธ ะถะฝะตั†ะพะฒ. ะžะฝะธ ะฟะพะดะฒะตั€ะณะฐะปะธััŒ ั‚ะฐะบ ะฝะฐะทั‹ะฒะฐะตะผะพะผัƒ ะŸะตั€ะตะฒะพะฟะปะพั‰ะตะฝะธัŽ. ะ’ะธะฝะพะฒะฝะพะณะพ ัะฑั€ะฐัั‹ะฒะฐะปะธ ะฒ ั‡ั‘ั€ะฝะพะต ะถะตั€ะปะพ ัั‚ะพะณะพ ะบะพะปะพะดั†ะฐ, ะฟะพัะปะต ั‡ะตะณะพ ะพะฝ ะธัั‡ะตะทะฐะป ะธะท ะฅั€ะฐะผะฐ ะฝะฐะฒัะตะณะดะฐ. ะฅะพะดะธะป ัะปัƒั…, ั‡ั‚ะพ ะบะฐะทะฝั‘ะฝะฝั‹ะต ะพะฑั€ะตั‚ะฐะปะธ ะฝะพะฒัƒัŽ ะถะธะทะฝัŒ ะฝะฐ ะทะตะผะปะต, ั‚ะพะปัŒะบะพ ะฒ ะดั€ัƒะณะพะน ะธะฟะพัั‚ะฐัะธ (ะฟะพัั‚ะพะผัƒ ะฟั€ะพั†ะตะดัƒั€ะฐ ะฝะฐะทั‹ะฒะฐะปะฐััŒ ะŸะตั€ะตะฒะพะฟะปะพั‰ะตะฝะธะตะผ). 
ะžะดะฝะฐะบะพ ะฝะต ะฑั‹ะปะพ ะฝะธะบะพะณะพ, ะบั‚ะพ ะฑั‹ ัะผะพะณ ะฟะพะดั‚ะฒะตั€ะดะธั‚ัŒ ัั‚ะพ ะธะปะธ ะพะฟั€ะพะฒะตั€ะณะฝัƒั‚ัŒ, ะฟะพัั‚ะพะผัƒ ะฒัะต ะฑะพัะปะธััŒ ะฑั‹ั‚ัŒ ะฟะพะดะฒะตั€ะณะฝัƒั‚ั‹ะผะธ ะŸะตั€ะตะฒะพะฟะปะพั‰ะตะฝะธัŽ. - ะฏ ะฝะธ ั€ะฐะทัƒ ะฝะต ะฟะพะทะฒะพะปัะป ะ’ะฐะผ ะธ ะกะผะตั€ั‚ะธ ัƒัะพะผะฝะธั‚ัŒัั ะฒ ัะฒะพั‘ะผ ะฟั€ะพั„ะตััะธะพะฝะฐะปะธะทะผะต. ะ—ะฐะดะฐะฝะธะต ะฑัƒะดะตั‚ ะฒั‹ะฟะพะปะฝะตะฝะพ, - ััƒั…ะพ ะฟั€ะพะธะทะฝั‘ั ั. - ะŸั€ะธัั‚ะฝะพ ัะปั‹ัˆะฐั‚ัŒ ัƒะฒะตั€ะตะฝะฝะพัั‚ัŒ ะฒ ะ’ะฐัˆะตะผ ะณะพะปะพัะต, ะฝะพ ะฝะต ั‚ะตั€ัะนั‚ะต ะฑะดะธั‚ะตะปัŒะฝะพัั‚ะธ. ะะฐ ัั‚ะพั‚ ั€ะฐะท ะ’ะฐัˆะฐ ะฒั‹ะปะฐะทะบะฐ ะฒ ะผะธั€ ะปัŽะดัะบะพะน ะฑัƒะดะตั‚ ะดะปะธั‚ะตะปัŒะฝะพะน. ะ’ั€ะตะผะตะฝะธ ะ’ะฐะผ ะฑัƒะดะตั‚ ะดะฐะฝะพ ัั‚ะพะปัŒะบะพ, ัะบะพะปัŒะบะพ ะฟะพั‚ั€ะตะฑัƒะตั‚ัั, ะฝะพ ะฟะพะผะฝะธั‚ะต - ะกะผะตั€ั‚ัŒ ะฝะต ะปัŽะฑะธั‚ ะถะดะฐั‚ัŒ. ะ’ะฐะผ ะฝัƒะถะฝะพ ะฒะพะนั‚ะธ ะฒ ะดะพะฒะตั€ะธะต ะบ ัั‚ะพะผัƒ ั‡ะตะปะพะฒะตะบัƒ, ัƒะทะฝะฐั‚ัŒ ะตะณะพ ะบะฐะบ ะผะพะถะฝะพ ะปัƒั‡ัˆะต, ั‡ั‚ะพะฑั‹ ะตะณะพ ะดัƒัˆะฐ ัะฐะผะฐ ะฟะพั‚ัะฝัƒะปะฐััŒ ะบ ะ’ะฐะผ. ะขะพะณะดะฐ ะ’ั‹ ัะผะพะถะตั‚ะต ะฑะตัะฟั€ะตะฟัั‚ัั‚ะฒะตะฝะฝะพ ะตั‘ ะทะฐะฑั€ะฐั‚ัŒ, ะธ ะทะฐะดะฐะฝะธะต ะฑัƒะดะตั‚ ัั‡ะธั‚ะฐั‚ัŒัั ะฒั‹ะฟะพะปะฝะตะฝะฝั‹ะผ. ะ’ะพั€ะพะฒะฐั‚ะพ ะพะณะปัะฝัƒะฒัˆะธััŒ, ะกั‚ะฐั€ะตะนัˆะธะฝะฐ ัะบะปะพะฝะธะปัั ะบ ะผะพะตะผัƒ ัƒั…ัƒ ะธ ัˆะตะฟะฝัƒะป: - ะ ะตั‰ั‘ ะฒ ัะปัƒั‡ะฐะต ัƒัะฟะตั…ะฐ ั ะฟะพะทะฐะฑะพั‡ัƒััŒ ะพ ั‚ะพะผ, ั‡ั‚ะพะฑั‹ ะ’ั‹ ะฟั€ะธะผะบะฝัƒะปะธ ะบ ะฝะฐะผ. ะžะฝ ะฝะฐะผะตะบะฐะป ะฝะฐ ั‚ะพ, ั‡ั‚ะพ ะฟะพะผะพะถะตั‚ ะผะฝะต ะฟะพะปัƒั‡ะธั‚ัŒ ัะฐะผัƒัŽ ะฒั‹ัะพะบัƒัŽ ะดะพะปะถะฝะพัั‚ัŒ ะฅั€ะฐะผะฐ - ะดะพะปะถะฝะพัั‚ัŒ ะกั‚ะฐั€ะตะนัˆะธะฝั‹. ะฏ ะบะพั€ะพั‚ะบะพ ะบะธะฒะฝัƒะป. - ะฏ ะผะพะณัƒ ะฟั€ะธัั‚ัƒะฟะธั‚ัŒ ะบ ะทะฐะดะฐะฝะธัŽ? - ะ ะฐะทัƒะผะตะตั‚ัั. ะกะปัƒะถะบะธ ัะพะฑะตั€ัƒั‚ ะ’ะฐะผ ะฒัั‘, ั‡ั‚ะพ ะฟะพั‚ั€ะตะฑัƒะตั‚ัั ะดะปั ะปัŽะดัะบะพะน ะถะธะทะฝะธ, ะธ ะ’ั‹ ะผะพะถะตั‚ะต ะฝะตะผะตะดะปะตะฝะฝะพ ะพั‚ะฟั€ะฐะฒะปัั‚ัŒัั. ะฏ ัะฝะพะฒะฐ ะบะธะฒะฝัƒะป ะธ ะฝะฐะฟั€ะฐะฒะธะปัั ะบ ะฒั‹ั…ะพะดัƒ. 
ะฃะถะต ัƒ ัะฐะผั‹ั… ะดะฒะตั€ะตะน ั ะพะฑะตั€ะฝัƒะปัั ะธ ัะฟั€ะพัะธะป: - ะ ะบะฐะบ ะทะพะฒัƒั‚ ะผะพะตะณะพ ะฟะพะดะพะฟะตั‡ะฝะพะณะพ? - ะขะธะปะปัŒ ะ›ะธะฝะดะตะผะฐะฝะฝ. ะ•ะดะฒะฐ ั ะฟะพั‡ัƒะฒัั‚ะฒะพะฒะฐะป, ั‡ั‚ะพ ั‚ะฒั‘ั€ะดะพ ัั‚ะพัŽ ะฝะพะณะฐะผะธ ะฝะฐ ะทะตะผะปะต, ะบะฐะบ ะฝะฐ ะผะตะฝั ะฝะฐะปะตั‚ะตะป ัะธะปัŒะฝั‹ะน ะฟะพั€ั‹ะฒ ั…ะพะปะพะดะฝะพะณะพ ะฒะตั‚ั€ะฐ ะธ ะฝะฐั‡ะฐะป ะฝะตั‰ะฐะดะฝะพ ั€ะตะทะฐั‚ัŒ ะพะณะพะปั‘ะฝะฝั‹ะต ัƒั‡ะฐัั‚ะบะธ ะบะพะถะธ. ะฏ ะฟะพั‚ั‘ั€ ะทะฐะผั‘ั€ะทัˆะธะต ั€ัƒะบะธ, ะฟั€ะธะถะฐะป ะธั… ะบะพ ั€ั‚ัƒ, ั‡ั‚ะพะฑั‹ ัะพะณั€ะตั‚ัŒ ัะฒะพะธะผ ะณะพั€ัั‡ะธะผ ะดั‹ั…ะฐะฝะธะตะผ, ะฝะพ ะพั‚ ัั‚ะพะณะพ ะฑั‹ะปะพ ะผะฐะปะพ ั‚ะพะปะบัƒ. ะจะฐะฟะบะพะน ะผะตะฝั ัะฝะฐั€ัะดะธั‚ัŒ ะฝะต ัƒะดะพััƒะถะธะปะธััŒ, ะฟะพัั‚ะพะผัƒ ะณะพะปะพะฒะฐ ั‚ะพะถะต ะพั‰ัƒั‰ะฐะปะฐ ะฒัะต ะฟั€ะตะปะตัั‚ะธ ะฒั‹ะดะฐะฒัˆะตะนัั ะฒ ัั‚ะพะผ ะณะพะดัƒ ััƒั€ะพะฒะพะน ะฝะตะผะตั†ะบะพะน ะทะธะผั‹. ะฏ ะพัะผะพั‚ั€ะตะปัั. ะ’ะพะบั€ัƒะณ ะปะธัˆัŒ ะทะฐัั‹ะฟะฐะฝะฝั‹ะต ัะฝะตะณะพะผ ะดะตั€ะตะฒัŒั. ะะธะบะฐะบะธั… ะฟั€ะธะทะฝะฐะบะพะฒ ั‚ะพะณะพ, ั‡ั‚ะพ ะฟะพะฑะปะธะทะพัั‚ะธ ะบั‚ะพ-ั‚ะพ ะพะฑะธั‚ะฐะตั‚. ะ’ั‹ะบะธะดั‹ะฒะฐั‚ัŒ ะผะตะฝั ะฟั€ัะผะพ ะฝะฐ ะฟะพั€ะพะณ ะพะฑัŠะตะบั‚ะฐ ะฑั‹ะปะพ ะพะฟะฐัะฝะพ ะธ ะฟะพะดะพะทั€ะธั‚ะตะปัŒะฝะพ, ะฟะพัั‚ะพะผัƒ ั ะพะบะฐะทะฐะปัั ะฒ ะฝะตะบะพั‚ะพั€ะพะผ ะพั‚ะดะฐะปะตะฝะธะธ ะพั‚ ะผะตัั‚ะฐ ะฝะฐะทะฝะฐั‡ะตะฝะธั. ะ˜ ั‚ะตะฟะตั€ัŒ, ัั‚ะพั ะฒ ัั‚ะพะผ ะทะฐัะฝะตะถะตะฝะฝะพะผ ะปะตััƒ, ั ะฟะพะฝัั‚ะธั ะฝะต ะธะผะตะป, ะบัƒะดะฐ ะผะฝะต ะฝะฐะฟั€ะฐะฒะปัั‚ัŒัั. ะฏ ั€ะตัˆะธะป ะฟะพะดะดะฐั‚ัŒัั ะธะฝั‚ัƒะธั†ะธะธ, ะบะพั‚ะพั€ะฐั ะผะตะฝั ั€ะตะดะบะพ ะฟะพะดะฒะพะดะธะปะฐ. ะ–ะฝะตั†ั‹ ะธะผะตัŽั‚ ะผะฝะพะณะพ ัะฟะพัะพะฑะฝะพัั‚ะตะน: ะฝะฐะฟั€ะธะผะตั€, ะผั‹ ัƒะผะตะตะผ ะฟะพะดะฐะฒะปัั‚ัŒ ัะพะทะฝะฐะฝะธะต ะปัŽะดะตะน ะธะปะธ ัั‚ะฐะฝะพะฒะธั‚ัŒัั ะฝะตะฒะธะดะธะผั‹ะผะธ, ะบะพะณะดะฐ ะฝะฐะผ ัั‚ะพ ะฝะตะพะฑั…ะพะดะธะผะพ, ะฝะพ ะฟะพั‡ะตะผัƒ-ั‚ะพ ะฒะฝัƒั‚ั€ะตะฝะฝะตะณะพ ะบะพะผะฟะฐัะฐ ะฒ ะฝะฐั ะฝะต ะฒัั‚ั€ะพะตะฝะพ. ะะพะณะธ ะฟั€ะพะฒะฐะปะธะฒะฐะปะธััŒ ะฒ ะณะปัƒะฑะพะบะธะต ััƒะณั€ะพะฑั‹. 
ะงะตั€ั‚ั‹ั…ะฐัััŒ, ั ะผะตะดะปะตะฝะฝะพ ะฟั€ะพะดะฒะธะณะฐะปัั ะฒะฟะตั€ั‘ะด, ะฒะฝะธะผะฐั‚ะตะปัŒะฝะพ ะฒั‹ะธัะบะธะฒะฐั ะฒะดะฐะปะตะบะต ั…ะพั‚ัŒ ะบะฐะบะธะต-ั‚ะพ ะฟั€ะธะทะฝะฐะบะธ ั‡ะตะปะพะฒะตั‡ะตัะบะพะณะพ ะถะธะปัŒั, ะพะดะฝะฐะบะพ ั‚ั‰ะตั‚ะฝะพ. ะฏ ะผั‹ัะปะตะฝะฝะพ ะฟะตั€ะตะฑะธั€ะฐะป ะฒัะต ั‚ะต ะทะฝะฐะฝะธั, ะบะพั‚ะพั€ั‹ะต ะบะพะณะดะฐ-ั‚ะพ ะฟะพะปัƒั‡ะธะป ะพะฑ ัั‚ะพะผ ั€ะตะณะธะพะฝะต, ะฝะพ ั‚ะฐะบ ะธ ะฝะต ัะผะพะณ ะฟั€ะธะฟะพะผะฝะธั‚ัŒ, ั‡ั‚ะพะฑั‹ ั‚ัƒั‚ ะฑั‹ะปะธ ั…ะฐั€ะฐะบั‚ะตั€ะฝั‹ ั‚ะฐะบะธะต ะฟะพะณะพะดะฝั‹ะต ัƒัะปะพะฒะธั. ะ”ะฐะถะต ะฟะพะดัƒะผะฐะป ะพ ั‚ะพะผ, ั‡ั‚ะพ ะฒั‹ัˆะปะฐ ะพัˆะธะฑะบะฐ, ะธ ะผะตะฝั ะทะฐะบะธะฝัƒะปะธ ะฒ ะบะฐะบะพะต-ั‚ะพ ะดั€ัƒะณะพะต ะผะตัั‚ะพ. ะ’ั€ะตะผั ัˆะปะพ, ะฐ ะฟะตะนะทะฐะถะธ ะฒะพะบั€ัƒะณ ะฝะต ะผะตะฝัะปะธััŒ, ะทะฐั‚ะพ ะฝะฐั‡ะธะฝะฐะปะพ ั‚ะตะผะฝะตั‚ัŒ. ะฏ ัƒะถะต ะฝะฐั‡ะฐะป ะฑะตัะฟะพะบะพะธั‚ัŒัั, ะฝะพ ะฝะต ะดะฐะฒะฐะป ะฟะฐะฝะธะบะต ะฟะพะปะฝะพัั‚ัŒัŽ ะพะฒะปะฐะดะตั‚ัŒ ะผะฝะพะน. ะขะพะปัŒะบะพ ะบะพะณะดะฐ ะฝะฐะด ะปะตัะพะผ ะฟะพะดะฝัะปัั ะพะฟะพะปะพะฒะธะฝะตะฝะฝั‹ะน ะดะธัะบ ะปัƒะฝั‹, ั ะพะฑะตััะธะปะตะฝะพ ะฟะพะฒะฐะปะธะปัั ะฝะฐ ะทะตะผะปัŽ, ะฟะพะดะปะพะถะธะฒ ะฟะพะด ัะตะฑั ั€ัŽะบะทะฐะบ, ะฒ ะบะพั‚ะพั€ะพะผ ะฑั‹ะป ัะพะฑั€ะฐะฝ ะผะธะฝะธะผัƒะผ ะพะดะตะถะดั‹. ะ ัƒะบ ั ัƒะถะต ัะพะฒะตั€ัˆะตะฝะฝะพ ะฝะต ั‡ัƒะฒัั‚ะฒะพะฒะฐะป, ะฐ ะณะพะปะพะฒะฐ ั€ะฐัะบะฐะปั‹ะฒะฐะปะฐััŒ ะพั‚ ั…ะพะปะพะดะฐ. ะ’ ะผั‹ัะปัั… ะบั€ัƒั‚ะธะปะพััŒ, ั‡ั‚ะพ ั‚ะฐะบะพะน ะผะฐัั‚ะตั€ ัะฒะพะตะณะพ ะดะตะปะฐ ะบะฐะบ ั ะฝะต ะผะพะถะตั‚ ั‚ะฐะบ ะณะปัƒะฟะพ ะฟั€ะพะฒะฐะปะธั‚ัŒัั, ะฝะพ ัะธะป ะดะฐะปัŒัˆะต ะดะฒะธะณะฐั‚ัŒัั ะฝะต ะฑั‹ะปะพ. - ะฏ ะปะธัˆัŒ ะฝะตะผะฝะพะณะพ... ะพั‚ะดะพั…ะฝัƒ, - ัƒัะฟะพะบะฐะธะฒะฐะป ั ัะฐะผะพะณะพ ัะตะฑั, ะฟั‹ั‚ะฐัััŒ ั€ะฐะทะปะตะฟะธั‚ัŒ ะฝะฐัั‚ั‹ั€ะฝะพ ะทะฐะบั€ั‹ะฒะฐัŽั‰ะธะตัั ะฒะตะบะธ. ะะพ ั‚ะตะปะพ ะผะตะฝั ัƒะถะต ะฝะต ัะปัƒัˆะฐะปะพััŒ. ะŸะตั€ะตะด ั‚ะตะผ, ะบะฐะบ ะพะบะพะฝั‡ะฐั‚ะตะปัŒะฝะพ ะพั‚ะดะฐั‚ัŒ ัะตะฑั ะฒ ะพะฑัŠัั‚ะธั ัะฝะฐ, ั, ะบะฐะถะตั‚ัั, ัƒัะปั‹ัˆะฐะป ัะบั€ะธะฟ ัะฝะตะณะฐ. 
ะŸั€ะธะดั ะฒ ัะตะฑั, ั ะฟะตั€ะฒั‹ะผ ะดะตะปะพะผ ะฟะพั‡ัƒะฒัั‚ะฒะพะฒะฐะป ะทะฐะฟะฐั… ะผะพะปะพะบะฐ. ะ‘ั‹ะปะพ ะฝะฐ ัƒะดะธะฒะปะตะฝะธะต ั‚ะตะฟะปะพ, ะดะฐะถะต ะถะฐั€ะบะพ. ะŸั€ะธะพั‚ะบั€ั‹ะฒ ะพะดะธะฝ ะณะปะฐะท, ั ัƒะฒะธะดะตะป ะฝะฐะฟั€ะพั‚ะธะฒ ัะตะฑั ะทะฐะดะพั€ะฝะพ ะฟะปััˆัƒั‰ะธะต ัะทั‹ั‡ะบะธ ะฟะปะฐะผะตะฝะธ ะฒ ะบะฐะผะธะฝะต. ะžะฟัƒัั‚ะธะป ะฒะทะณะปัะด ะฒะฝะธะท - ะผะพั‘ ั‚ะตะปะพ ะฟะปะพั‚ะฝะพ ะทะฐะบัƒั‚ะฐะฝะพ ะฒ ั‚ะพะปัั‚ะพะต ะพะดะตัะปะพ, ะฐ ัะฐะผ ั ะปะตะถัƒ ะฝะฐ ะบะฐะบะพะผ-ั‚ะพ ะดะธะฒะฐะฝั‡ะธะบะต. ะ“ะดะต ั? ะ’ ะณะพะปะพะฒะต ะฝะตะพั…ะพั‚ะฝะพ ะฝะฐั‡ะฐะปะธ ะฒะพั€ะพั‡ะฐั‚ัŒัั ัˆะตัั‚ะตั€ั‘ะฝะบะธ ะฒะพัะฟะพะผะธะฝะฐะฝะธะน, ะฟั€ะพะฒะพั€ะฐั‡ะธะฒะฐั ะฝะฐะทะฐะด ัะตะณะพะดะฝััˆะฝะธะต ัะพะฑั‹ั‚ะธั. ะ ะฐััะฒะตั‚, ะกั‚ะฐั€ะตะนัˆะธะฝะฐ, ะทะฐะดะฐะฝะธะต, ะปะตั, ั…ะพะปะพะด, ะพะฑะผะพั€ะพะบ, ะฟัƒัั‚ะพั‚ะฐ. ะฏ ะตั‰ั‘ ั€ะฐะท ะฟะพ ะผะตั€ะต ะฒะพะทะผะพะถะฝะพัั‚ะตะน ะพัะผะพั‚ั€ะตะป ัะตะฑั, ะดะฐะถะต ะทะฐะณะปัะฝัƒะป ะฟะพะด ะพะดะตัะปะพ. ะšั‚ะพ-ั‚ะพ ะทะฐะฑะพั‚ะปะธะฒะพ ะฟั€ะธั‚ะฐั‰ะธะป ะผะตะฝั ะธะท ะปะตัะฐ, ัั‚ัะฝัƒะป ะฒััŽ ะพะดะตะถะดัƒ ะธ ัะพะณั€ะตะป. ะžะบะธะฝัƒะป ะฒะทะณะปัะดะพะผ ะบะพะผะฝะฐั‚ัƒ - ะผะพะตะณะพ ะดะพะฑั€ะพะดะตั‚ะตะปั ะฝะต ะฝะฐะฑะปัŽะดะฐะปะพััŒ. ะฏ ะพัั‚ะพั€ะพะถะฝะพ ะฟะพะฟั‹ั‚ะฐะปัั ะฟะพะดะฝัั‚ัŒัั ะฝะฐ ะปะพะบั‚ัั…, ะฝะฐ ั‡ั‚ะพ ั‚ะตะปะพ ะพั‚ะพะทะฒะฐะปะพััŒ ะฝะตะฟั€ะธัั‚ะฝะพะน ะฝะพัŽั‰ะตะน ะฑะพะปัŒัŽ. ะกะบั€ะธะฒะธะฒัˆะธััŒ, ั ะฒัั‘ ะถะต ะฟั€ะธะฒั‘ะป ัะตะฑั ะฒ ัะธะดัั‡ะตะต ะฟะพะปะพะถะตะฝะธะต ะธ ะตั‰ั‘ ั€ะฐะท ั‚ั‰ะฐั‚ะตะปัŒะฝะพ ะฒัั‘ ะพัะผะพั‚ั€ะตะป. ะšะพะผะฝะฐั‚ะฐ ะฑั‹ะปะฐ ะฝะตะฑะพะปัŒัˆะพะน ะธ ะฒะตััŒะผะฐ ะฝะตัƒั…ะพะถะตะฝะฝะพะน. ะŸะพะฒััŽะดัƒ ะฒะฐะปัะปะธััŒ ะบะฐะบะธะต-ั‚ะพ ะพะฑั‘ั€ั‚ะบะธ, ะฝะฐ ะฟะพะปัƒ ัั‚ะพัะปะธ ะฟัƒัั‚ั‹ะต ะฑัƒั‚ั‹ะปะบะธ, ะบะพะต-ะณะดะต ะฒะธะดะฝะตะปะธััŒ ะธ ะณั€ัะทะฝั‹ะต ั‚ะฐั€ะตะปะบะธ. 
ะžะฑะพะธ ะผะตัั‚ะฐะผะธ ะพั‚ั…ะพะดะธะปะธ, ะพะฑะฝะฐะถะฐั ะดะตั€ะตะฒัะฝะฝั‹ะต ัั‚ะตะฝั‹, ะฟั€ะฐะบั‚ะธั‡ะตัะบะธ ะฒัั‘ ะฑั‹ะปะพ ะธัะฟะธัะฐะฝะพ ะธะผะตะฝะฐะผะธ ะบะฐะบะธั…-ั‚ะพ ะปัŽะดะตะน ะธ ั€ะฐะทะปะธั‡ะฝั‹ะผะธ ัั‚ั€ะฐะฝะฝั‹ะผะธ ะฝะฐะทะฒะฐะฝะธัะผะธ, ะทะฝะฐั‡ะตะฝะธั ะบะพั‚ะพั€ั‹ั… ั ะฝะต ะฟะพะฝะธะผะฐะป. ะžะดะฝะฐะบะพ ะฑั‹ะปะพ ะฒะธะดะฝะพ, ั‡ั‚ะพ ั…ะพะทัะธะฝ ะปัŽะฑะธั‚ ัะฒะพัŽ ะฑะตั€ะปะพะณัƒ ะธ ะฝะธ ะทะฐ ั‡ั‚ะพ ะฝะต ั€ะฐััั‚ะฐะฝะตั‚ัั ั ะฝะตะน. ะ’ั…ะพะดะฝะฐั ะดะฒะตั€ัŒ ะฟั€ะพั‚ัะถะฝะพ ะทะฐัะบั€ะธะฟะตะปะฐ, ะธ ะฒ ะบะพะผะฝะฐั‚ัƒ ะฒะพัˆั‘ะป ะทะดะพั€ะพะฒะตะฝะฝั‹ะน ะดะตั‚ะธะฝะฐ, ะดะตั€ะถะฐะฒัˆะธะน ะฒ ั€ัƒะบะฐั… ะบั€ัƒะถะบัƒ, ะฟะพ ั€ะฐะทะผะตั€ะฐะผ ะฑะพะปัŒัˆะต ะฟะพั…ะพะถัƒัŽ ะฝะฐ ะฒะตะดั€ะพ. - ะžั‡ัƒั…ะฐะปัั, - ะฑัƒั€ะบะฝัƒะป ะพะฝ, ะฟั€ะพั‚ัะณะธะฒะฐั ะผะฝะต "ะฒะตะดั€ะพ". - ะŸะตะน. ะขั‘ะฟะปะพะต ะผะพะปะพะบะพ. ะฏ ะฝะต ะฑั‹ะป ะฑะพะปัŒัˆะธะผ ะฟะพะบะปะพะฝะฝะธะบะพะผ ัั‚ะพะณะพ ะฝะฐะฟะธั‚ะบะฐ, ะฝะพ ะฒะฝะตะทะฐะฟะฝะพ ะพั‰ัƒั‚ะธะป, ั‡ั‚ะพ ะธะผะตะฝะฝะพ ะตะณะพ ัะตะนั‡ะฐั ั‚ั€ะตะฑัƒะตั‚ ะผะพะน ะพั€ะณะฐะฝะธะทะผ, ะธ ะถะฐะดะฝะพ ะฟั€ะธะฟะฐะป ะบ ะบั€ัƒะถะบะต. ะŸะฐั€ะตะฝัŒ, ัะธะดะตะฒัˆะธะน ะฝะฐะฟั€ะพั‚ะธะฒ ะธ ัƒะณั€ัŽะผะพ ะฝะฐะฑะปัŽะดะฐัŽั‰ะธะน ะทะฐ ะผะฝะพะน ะธะท-ะฟะพะด ะพั‚ั€ะพััˆะตะน ั‡ั‘ะปะบะธ, ัะผะพั‚ั€ะตะป ะฝะฐ ะผะตะฝั ะฝะตัะบะพะปัŒะบะพ ะผะธะฝัƒั‚, ะฟะพัะปะต ั‡ะตะณะพ ัะฟั€ะพัะธะป: - ะšะฐะบะพะณะพ ะปะตัˆะตะณะพ ั‚ะตะฑั ะฒ ะปะตั-ั‚ะพ ะฟะพะฝะตัะปะพ? ะฏ ะบะพะต-ะบะฐะบ ะพั‚ะพั€ะฒะฐะป ัะตะฑั ะพั‚ ะผะพะปะพะบะฐ. - ะœะฐัˆะธะฝะฐ ะฝะฐ ั‚ั€ะฐััะต ะทะฐะณะปะพั…ะปะฐ, ะฟะพัˆั‘ะป ะทะฐ ะฟะพะผะพั‰ัŒัŽ ะธ ะทะฐะฑะปัƒะดะธะปัั, - ะฒั‹ะดะฐะป ั ะทะฐั€ะฐะฝะตะต ะฟะพะดะณะพั‚ะพะฒะปะตะฝะฝัƒัŽ ะปะตะณะตะฝะดัƒ. - ะกั‡ะธั‚ะฐะน, ะฟั€ะพะฟะฐะปะฐ ั‚ะฒะพั ั‚ะฐั‡ะบะฐ, - ั„ั‹ั€ะบะฝัƒะป ะฝะตะทะฝะฐะบะพะผะตั†. - ะฃ ะฝะฐั ั‚ัƒั‚ ัะตะนั‡ะฐั ะฝะต ัะฐะผั‹ะน ัะฟะพะบะพะนะฝั‹ะน ะฝะฐั€ะพะดะตั† ะพะฑะธั‚ะฐะตั‚. ะกะบะพั€ะตะต ะฒัะตะณะพ, ัƒะถะต ะฟั€ะธะฑั€ะฐะปะธ ะบ ั€ัƒะบะฐะผ ั‚ะฒะพัŽ ะปะพัˆะฐะดะบัƒ. ะฏ ะฟะพัั‚ะฐั€ะฐะปัั ัะดะตะปะฐั‚ัŒ ั€ะฐะทะพั‡ะฐั€ะพะฒะฐะฝะฝั‹ะน ะฒะธะด. 
- ะญั…, ะฝัƒ ั‡ั‚ะพ ะถ ั ั‚ะฐะบ! - ะขั‹ ะฒ ะฟะพะปะธั†ะธัŽ ะฟะพะฟั€ะพะฑัƒะน ะพะฑั€ะฐั‚ะธั‚ัŒัั, ะฝะพ ัั‚ะพ ะฒั€ัะด ะปะธ ั‚ะตะฑะต ะฟะพะผะพะถะตั‚. ะœะตะฝั ะบัั‚ะฐั‚ะธ, ะขะธะปะปัŒ ะทะพะฒัƒั‚. ะฏ ะฐะถ ะฟะพะดะฐะฒะธะปัั, ัƒัะปั‹ัˆะฐะฒ ะธะผั ัะฒะพะตะณะพ ัะฟะฐัะธั‚ะตะปั. - ะญะน, ะฝัƒ ั‚ั‹ ะฐะบะบัƒั€ะฐั‚ะฝะตะต! - ะขะธะปะปัŒ ะดะตั€ะฝัƒะปัั ะฒ ะผะพัŽ ัั‚ะพั€ะพะฝัƒ, ะฝะพ ั ะถะตัั‚ะพะผ ะพัั‚ะฐะฝะพะฒะธะป ะตะณะพ. - ะ ั..ะบั…ะผ-ะบั…ะผ...ะ ะธั…ะฐั€ะด, - ะฟั€ะพั…ั€ะธะฟะตะป ั, ะฟั‹ั‚ะฐัััŒ ะฟะตั€ะตะฑะพั€ะพั‚ัŒ ะฟั€ะธัั‚ัƒะฟั‹ ะบะฐัˆะปั. - ะ—ะฐัั‚ั€ัะป ั‚ั‹, ะฟะพั…ะพะถะต, ะทะดะตััŒ, ะ ะธั…ะฐั€ะด, - ะขะธะปะปัŒ ะบะธะฒะบะพะผ ัƒะบะฐะทะฐะป ะฝะฐ ะพะบะฝะพ. ะะฐ ัƒะปะธั†ะต ะฝะต ะฝะฐ ัˆัƒั‚ะบัƒ ั€ะฐะทะฑัƒัˆะตะฒะฐะปะฐััŒ ะผะตั‚ะตะปัŒ. - ะฃะดะฐั‡ะฐ, ะตัะปะธ ะผั‹ ะทะฐะฒั‚ั€ะฐ ะฒะพะพะฑั‰ะต ัะผะพะถะตะผ ะธะท ะดะพะผะฐ ะฒั‹ะนั‚ะธ, - ะฒะทะดะพั…ะฝัƒะป ะขะธะปะปัŒ. - ะฃะถะต ะฝะตะดะตะปัŽ ะผะตั‚ั‘ั‚, ะทะฐั€ะฐะทะฐ, ะธ ะฒัั‘ ะฝะธะบะฐะบ ะฝะต ัƒัะฟะพะบะพะธั‚ัั. ะขะฐะบ ั‡ั‚ะพ ะพะฑะพะถะดะฐั‚ัŒ ะฟั€ะธะดั‘ั‚ัั, ะฟั€ะตะถะดะต ั‡ะตะผ ั ะดะพะบะธะฝัƒ ั‚ะตะฑั ั…ะพั‚ั ะฑั‹ ะดะพ ะจะฒะตั€ะธะฝะฐ. ะฏ ะฝะตัƒะฒะตั€ะตะฝะฝะพ ะฟะพะถะฐะป ะฟะปะตั‡ะฐะผะธ, ั‡ั‚ะพะฑั‹ ะขะธะปะปัŒ ะฝะต ะฟะพะดัƒะผะฐะป, ั‡ั‚ะพ ั ะฝะฐะฟั€ะฐัˆะธะฒะฐัŽััŒ. - ะขะพะปัŒะบะพ ั‚ั‹, ัะปั‹ัˆัŒ, ัะฐะผ ัะตะฑะต ะฑัƒะดะตัˆัŒ ะถั€ะฐั‚ัŒ ะณะพั‚ะพะฒะธั‚ัŒ, ั ั‚ะตะฑะต ะฝะต ั…ะพะทััŽัˆะบะฐ. ะัƒ ะธ ะผะฝะต ะทะฐะพะดะฝะพ ะผะพะถะตัˆัŒ, - ัƒั…ะผั‹ะปัŒะฝัƒะปัั ะขะธะปะปัŒ. - ะั€ะตะฝะดะฝะฐั ะฟะปะฐั‚ะฐ, ั‚ะฐะบ ัะบะฐะทะฐั‚ัŒ, ะทะฐ ะฟั€ะตะดะพัั‚ะฐะฒะปะตะฝะฝัƒัŽ ะถะธะปะฟะปะพั‰ะฐะดัŒ. ะฏ ัะพะณะปะฐัะฝะพ ะบะธะฒะฝัƒะป. - ะัƒ ะธ ะปะฐะดัƒัˆะบะธ. ะั…, ะดะฐ, ะฟัˆั‘ะป ะฒะพะฝ ั ะผะพะตะณะพ ะดะธะฒะฐะฝะฐ! ะŸะพะผะฝั ะพ ั‚ะพะผ, ั‡ั‚ะพ ะผะฝะต ะฝัƒะถะฝะพ ะฝะฐะปะฐะดะธั‚ัŒ ั ัั‚ะธะผ ั‡ะตะปะพะฒะตะบะพะผ ะบะพะฝั‚ะฐะบั‚, ะธ ะทะปะธั‚ัŒ ะตะณะพ ะฝะธ ะฒ ะบะพะตะผ ัะปัƒั‡ะฐะต ะฝะต ัะปะตะดัƒะตั‚, ั ะฟะพัะปัƒัˆะฝะพ ะฒัั‚ะฐะป, ัƒะบัƒั‚ะฐะฒัˆะธััŒ ะฒ ะพะดะตัะปะพ. 
- ะŸั„, ั‚ะพะถะต ะผะฝะต, - ั…ะผั‹ะบะฝัƒะป ะขะธะปะปัŒ, ะณะปัะดั ะฝะฐ ั‚ะพ, ะบะฐะบ ั ะฟั‹ั‚ะฐัŽััŒ ะฟะพัƒะดะพะฑะฝะตะต ะทะฐะฒะตั€ะฝัƒั‚ัŒ' sentences: - 'scrtrm - piano #11 ะ’ ั‚ะธั…ะพะผ-ั‚ะธั…ะพะผ ัƒะณะปัƒ ัƒะปะธั†ั‹(ะตัะปะธ ัั‚ะพ ะฒะพะพะฑั‰ะต ัƒะณะพะป) ัะธะดะตะปะธ ะดะฒะฐ ั€ะตะฑะตะฝะบะฐ ะฟั€ะธะผะตั€ะฝะพ ะพะดะธะฝะฐะบะพะฒะพะณะพ ะฒะพะทั€ะฐัั‚ะฐ ั‚ะพั‡ะฝะพ ะฝะต ะดะพัั‚ะธะณะฝัƒะฒัˆะธะต ะดะตััั‚ะธ, ะฐ ั‚ะพ ะผะพะถะตั‚ ะธ ะดะตะฒัั‚ะธ ะปะตั‚ ะพั‚ ั€ะพะดัƒ. ะ’ะพะบั€ัƒะณ ะฝะต ะฑั‹ะปะพ ะฝะธะบะพะณะพ ะธะท ั‚ะตั… ัˆัƒะผะฝั‹ั… ะฒะทั€ะพัะปั‹ั…, ะบะพั‚ะพั€ั‹ั… ะฒัะต ะดะตั‚ะธ ะฟะพั‡ะตะผัƒ-ั‚ะพ ะฝะตะดะพะปัŽะฑะปะธะฒะฐะปะธ. ะžะดะฝะฐะบะพ, ะบั€ะพะผะต ัั‚ะธั… ะดะฒะพะธั…, ะฝะฐ ัƒะปะธั†ะต ะธะทั€ะตะดะบะฐ ะฟั€ะพะฑะตะณะฐะปะธ, ะฐ ั‚ะพ ะธ ะฟั€ะพะตะทะถะฐะปะธ, ะฝะฐ ะฟะพะปะฝะพะน ัะบะพั€ะพัั‚ะธ ะฟะพะดั€ะพัั‚ะบะธ ะฝะฐ ะฒะตะปะพัะธะฟะตะดะฐั… ะธ ะฝะฐ ัะบะตะนั‚ะฑะพั€ะดะฐั…. ะะตะบะพั‚ะพั€ั‹ะต ะธะท ะฝะธั… ั‚ะธั…ะพ ะธ ะฝะต ัะฟะตัˆะฝะพ ะตะทะดะธะปะธ ะฝะฐ ั€ะพะปะธะบะฐั…, ะพะฑะพั€ะฐั‡ะธะฒะฐัััŒ ะฟะพ ัั‚ะพั€ะพะฝะฐะผ ะธ ัะผะพั‚ั€ั ะปะฐั€ัŒะบะธ ั ะผะพั€ะพะถะตะฝะฝั‹ะผ. ะะฐ ัƒะปะธั†ะต ะปะตั‚ะพ. ะ˜ัŽะฝัŒ ะฒั€ะพะดะต ะฑั‹, ะฐ ะดะตั‚ะธ ัƒะถะต ะฒะพ ะฒััŽ ั€ะฐะทะฒะปะตะบะฐัŽั‚ัั, ะทะฐะฑั‹ะฒ ะฟั€ะพ ัƒั‡ั‘ะฑัƒ. ะ ั‚ะต ะดะฒะฐ ั€ะตะฑั‘ะฝะบะฐ ะฒัั‘ ั‚ะฐะบ ะถะต ัะธะดะตะปะธ ะฝะฐ ะดะฒัƒั… ะปะฐะฒะพั‡ะบะฐั…, ะฟะฐั€ะฐะปะปะตะปัŒะฝั‹ั… ะดั€ัƒะณ ะดั€ัƒะณัƒ, ะธะทั€ะตะดะบะฐ ะฑั€ะพัะฐั ะฒะทะณะปัะดั‹ ะฝะฐ ะดั€ัƒะณ ะดั€ัƒะณะฐ ะธ ะฟะพัะฟะตัˆะฝะพ ะพั‚ะฒะพั€ะฐั‡ะธะฒะฐัััŒ, ะบะพะณะดะฐ ะพะดะธะฝ ะธะท ะฝะธั… ะทะฐะผะตั‡ะฐะป ัั‚ะธ ะฒะทะณะปัะดั‹. ะจัƒะผ ะปะธัั‚ัŒะตะฒ ั€ัะดะพะผ ั†ะฒะตั‚ัƒั‰ะตะน ะฟะพะทะดะฝะตะน ั‡ะตั€ั‘ะผัƒั…ะธ ะดะพะฑะฐะฒะปัะป ั€ะพะผะฐะฝั‚ะธั‡ะฝะพัั‚ะธ ะบะฐั€ั‚ะธะฝะต. ะ‘ั‹ัั‚ั€ั‹ะต ัˆะบะพะปัŒะฝะธะบะธ ะฝะต ั€ะฐะท ัะผะตัะปะธััŒ ะฝะฐะด ะฝะธะผะธ, ะฝะต ะทะฐะดัƒะผั‹ะฒะฐัััŒ ะฝะฐะด ั‚ะตะผ, ั‡ั‚ะพ ั‡ัƒะฒัั‚ะฒัƒัŽั‚ ัั‚ะธ ะดะตั‚ะธ. ะŸะตั€ะฒะฐั ะปัŽะฑะพะฒัŒ ะฒัะตะณะดะฐ ะฟั€ะธั…ะพะดะธั‚ ะฟะพ ั€ะฐะทะฝะพะผัƒ. 
ะœะฐะปัŒั‡ะธะบ ั ะณะพะปัƒะฑั‹ะผะธ ะฒะพะปะพัะฐะผะธ, ะพะดะตั‚ั‹ะน ะฒ ั‚ะพะฝะบัƒัŽ ัะฒะตั‚ะปะพ-ะพั€ะฐะฝะถะตะฒัƒัŽ ั€ัƒะฑะฐัˆะบัƒ ะธ ะทะฐะณะฝัƒั‚ั‹ะผะธ ะดะพ ะบะพะปะตะฝะฐ ะบะพั€ะธั‡ะฝะตะฒั‹ะผะธ ะฑั€ัŽะบะฐะผะธ ะฟะพะดะพัˆั‘ะป ะบ ะดะตะฒะพั‡ะบะต ะธ ัะฟั€ะพัะธะป: - ะŸ-ะฟั€ะธะฒะตั‚. ะขะตะฑะต ะฝะต ะฝัƒะถะตะฝ ะบะพั‚? ะก-ัะธะฝะธะน ั‚ะฐะบะพะน, ั ั‡ั‘ั€ะฝั‹ะผะธ ะณะปะฐะทะฐะผะธ ะธ ะผะฐะปะตะฝัŒะบะธะผ ั€ัŽะบะทะฐั‡ะบะพะผ ะฝะฐ ัะฟะธะฝะต? - ะ-ะฝะตั‚, ัะฟะฐัะธะฑะพ, - ะพั‚ะฒะตั‚ะธะปะฐ ะดะตะฒะพั‡ะบะฐ, ะพั‚ะฒะพั€ะฐั‡ะธะฒะฐัััŒ ะฒ ัั‚ะพั€ะพะฝัƒ. - ะ”ะฐ ะธ ัƒ ะผะตะฝั ัƒะถะต ะตัั‚ัŒ ะบะพั‚. ะ’ั‚ะพั€ะพะณะพ ั€ะพะดะธั‚ะตะปะธ ะฝะต ั€ะฐะทั€ะตัˆะฐั‚. - ะŸะพะฝัั‚ะฝะพ, ะฐ ัั‚ะพะณะพ ะบะพั‚ะฐ ะฅัะฟะฟะธ ะทะฒะฐะปะธ. ะ”ะพะฑั€ั‹ะน ั‚ะฐะบะพะน, ะพั‚ะทั‹ะฒั‡ะธะฒั‹ะน, ะฟะพะฝะธะผะฐัŽั‰ะธะน. ะฅะพั€ะพัˆะธะน, ะฒ ะพะฑั‰ะตะผ, ะบะพั‚. - ะ ัƒ ะผะตะฝั ะบะพั‚ะฐ, ะฝัƒ ะฒะตั€ะฝะตะต ะฝะต ะบะพั‚ะฐ, ะฐ ะบะพัˆะบัƒ, ะจะฐั€ะปะธ ะทะพะฒัƒั‚. ะะพ ะพะฝะฐ ัั‚ั€ะฐะฝะฝะฐั, ะฝะต ั€ะฐะทะณะพะฒะฐั€ะธะฒะฐัŽั‰ะฐั, ะณั€ัƒะฑะฐั, ะดะฐะถะต ัั‚ะตั€... ะšะฐะบ ั‚ะฐะผ ะฒะทั€ะพัะปั‹ะต ะณะพะฒะพั€ัั‚, ัั‚ะต-ั€ะฒะพ-ะทะฝะฐั, ะพะฝะฐ ัƒ ะผะตะฝั. ะ˜ ะพะฑั‰ะฐั‚ัŒัั ะฝะต ะปัŽะฑะธั‚. - ะฏ ัƒะฒะตั€ะตะฝ, ะฅัะฟะฟะธ ะฑั‹ ะฝะฐัˆั‘ะป ะบ ะฝะตะน ะฟะพะดั…ะพะด ะธ ั€ะฐะทะณะพะฒะพั€ะธะป ะฑั‹ ะตั‘. ะ˜ ะพะฝะฐ ะฑ ัั‚ะฐะปะฐ ั‚ะฐะบะพะน ะถะต ะบะฐะบ ะธ ะพะฝ. - ะะต ะทะฝะฐัŽ, ั, ั‚ะพ ะตัั‚ัŒ ะผั‹ ะฝะธะบะพะณะดะฐ ะฝะต ะฟั€ะพะฑะพะฒะฐะปะธ. - ะ ะดะฐะฒะฐะน ะฟะพะฟั€ะพะฑัƒะตะผ ะธ ัƒะทะฝะฐะตะผ! ะัƒ, ั‚ะฐะบ ั‚ั‹ ัะพ ะผะฝะพะน? - ะฅ-ั…ะพั€ะพัˆะพ! ะงะตั€ะตะท ะดะตะฒัั‚ัŒ ะปะตั‚ - ะจะฐั€ะปะธ, ะฐ ั‚ั‹ ะฑั‹ ั…ะพั‚ะตะปะฐ ัั‚ะฐั‚ัŒ ะบะพัˆะบะพะน? ะัƒ ะธะปะธ ะตั‘ ะทะฐะฒะตัั‚ะธ? - ะะตั‚, ะฅัะฟะฟะธ, ัƒ ะผะตะฝั ัƒะถะต ะตัั‚ัŒ ะพะดะธะฝ. ะžะดะฝะพะณะพ ะดะพัั‚ะฐั‚ะพั‡ะฝะพ. - ะ’ัะฟะพะผะธะฝะฐะตะผ ะดะตั‚ัั‚ะฒะพ, ะฐ ะจะฐั€ะปะธ? - ะฃะณะฐะดะฐะป, ะฅัะฟะฟะธ! ะ’ัะฟะพะผะธะฝะฐัŽ ะดะตั‚ัั‚ะฒะพ ะธ ั‚ะพ ัะฐะผะพะต ะผะตัั‚ะพ. - ะัƒ ะฐ ะฒัั‘ ะถะต, ะบั‚ะพ ัั‚ะพั‚ ะบะพั‚? - ะ ะฟะพั‡ะตะผัƒ ั‚ั‹ ะดัƒะผะฐะตัˆัŒ, ั‡ั‚ะพ ัั‚ะพ ะบะพั‚? 
- ะขะฐะบ ั‚ั‹ ะถะต ัะฐะผะฐ ัะบะฐะทะฐะปะฐ! ะกะบะปะตั€ะพั‚ะธั‡ะบะฐ! - ะฏ ะฝะต ะณะพะฒะพั€ะธะปะฐ ั‚ะฐะบะพะณะพ! - ะ“ะพะฒะพั€ะธะปะฐ! - ะะตั‚! - ะะต ะฝะตั‚, ะฐ ะดะฐ! ะ’ ัั‚ะฐั€ะพะผ ะฟะฐั€ะบะต ะฝะฐ ะดะฒัƒั… ะฟะฐั€ะฐะปะปะตะปัŒะฝั‹ั… ัะบะฐะผะตะนะบะฐั… ัะธะดะตะปะฐ ะผะพะปะพะดะฐั ะฟะฐั€ะฐ, ะบะพั‚ะพั€ะฐั ะพ ั‡ั‘ะผ-ั‚ะพ ัะฟะพั€ะธะปะฐ ะธ ะฝะธ ะพะดะฝะฐ ะธะท ัั‚ะพั€ะพะฝ ะฝะต ั…ะพั‚ะตะปะฐ ัƒัั‚ัƒะฟะฐั‚ัŒ. ะ ะฟะฐั€ะบ ะฒัั‘ ั‚ะฐะบ ะถะต ะฝะธะบั‚ะพ ะฝะต ะฟะพัะตั‰ะฐะป, ะปะธัˆัŒ ะธะทั€ะตะดะบะฐ ะฟะพะดั€ะพัั‚ะบะธ ะบะฐั‚ะฐะปะธััŒ ะฝะฐ ะฒะตะปะพัะธะฟะตะดะฐั… ะธ ั€ะพะปะธะบะฐั…. ะ ะฝะฐ ะดะฒะพั€ะต ะฒัั‘ ั‚ะฐะบ ะถะต ะฑั‹ะป ะธัŽะฝัŒ ะธ ะฟะพะด ัˆัƒะผ ะฟะพะทะดะฝะตะน ั‡ะตั€ั‘ะผัƒั…ะธ ะฒะปัŽะฑะปั‘ะฝะฝั‹ะต ัˆะปะธ ะฟะพ ัั‚ะฐั€ะพะผัƒ ะทะฐะฑั€ะพัˆะตะฝะฝะพะผัƒ ัƒะณะปัƒ ัƒะปะธั†ั‹ ัั‚ะฐั€ะพะณะพ ะพั‚ัั‚ั€ะพะตะฝะฝะพะณะพ ะฟะฐั€ะบะฐ.' - 'ะ‘ะฐะฑะฐั…! - ะฝะพะฒะฐั ะผะพะปะฝะธั ั ะณั€ะพั…ะพั‚ะพะผ ั€ะฐะทั€ะตะทะฐะปะฐ ะฝะตะฑะพ. ะะตะฟะพะณะพะดะฐ ะฟั€ะพะดะพะปะถะฐะปะฐััŒ ัƒะถะต ั‚ั€ะตั‚ะธะน ะดะตะฝัŒ. ะšะฐะณัƒั€ะฐ ะฟะพั‘ะถะธะปะฐััŒ. ะ”ะพะถะดัŒ ะพะฝะฐ ะฝะต ะปัŽะฑะธะปะฐ - ั‚ะฐะบะฐั ะฟะพะณะพะดะฐ ะปะธัˆะฐะปะฐ ะตั‘ ะตะดะธะฝัั‚ะฒะตะฝะฝะพะณะพ ะดะพัั‚ัƒะฟะฝะพะณะพ ะตะน ัั‡ะฐัั‚ัŒั - ะฟะพะปั‘ั‚ะพะฒ. ะžะฑั‰ะตะฝะธะต ั ะšะพั…ะฐะบัƒ ัƒัะฟะตะปะพ ะฝะฐะดะพะตัั‚ัŒ ั…ัƒะถะต ะณะพั€ัŒะบะพะน ั€ะตะดัŒะบะธ - ะฟะฐั€ะตะฝัŒ ะฝะต ะณะพะฒะพั€ะธะป ะฝะธ ะพ ั‡ั‘ะผ, ะบั€ะพะผะต ะบะฐะบ ะพ ัะฟะพัะพะฑะฐั… ัƒะฑะธั‚ัŒ ะะฐั€ะฐะบัƒ. ะ”ะตะฒัƒัˆะบะฐ, ะบะพะฝะตั‡ะฝะพ, ะฟะพะฝะธะผะฐะปะฐ, ะฟะพั‡ะตะผัƒ ะพะฝ ะพั‡ะตะฝัŒ ะฑั‹ะป ัะพัั€ะตะดะพั‚ะพั‡ะตะฝ ะฝะฐ ัะฒะพะตะน ะผะตัั‚ะธ, ะฝะพ - ะฒะตั‚ะตั€ ัะฒะธะดะตั‚ะตะปัŒ! - ะฝะตะปัŒะทั ะถะต ะฑั‹ะปะพ ะดัƒะผะฐั‚ัŒ ั‚ะพะปัŒะบะพ ะพะฑ ะพะดะฝะพะผ! 
ะะฐัั‚ั€ะพะตะฝะธะต ะฝะต ัƒะปัƒั‡ัˆะฐะปะพ ะธ ั‚ะพ, ั‡ั‚ะพ ั ัะฐะผะพะณะพ ะฝะฐั‡ะฐะปะฐ ะณั€ะพะทั‹ ัะฐะผ ะะฐั€ะฐะบัƒ ะฑั‹ะป ะฝะตะพะฑั‹ั‡ะฐะนะฝะพ ะดะตะปะพะฒะธั‚: ัะพะทะดะฐะฒะฐะป ะฝะพะฒะพะต ะฟะพั€ะพะถะดะตะฝะธะต, ั‡ั‚ะพ ะšะฐะณัƒั€ัƒ ะฝะตะผะฐะปะพ ะฑะตัะฟะพะบะพะธะปะพ, ะธ ะฒะฐั€ะธะป ะบะฐะบะพะต-ั‚ะพ ั‡ั€ะตะทะฒั‹ั‡ะฐะนะฝะพ ะฒะพะฝัŽั‡ะตะต ะทะตะปัŒะต, ะพั‚ั€ะฐะฒะปัะฒัˆะตะต ะฒะพะทะดัƒั…. - ะšะฐะณัƒั€ะฐ! - ะณะพะปะพั ะะฐั€ะฐะบัƒ ะทะฐัั‚ะฐะฒะธะป ะดะตะผะพะฝะธั†ัƒ ะฒะทะดั€ะพะณะฝัƒั‚ัŒ. - ะ”ะพ ะผะพะตะณะพ ะฟั€ะธะบะฐะทะฐ ะธะท ะทะฐะผะบะฐ - ะฝะธ ัˆะฐะณัƒ! ะขั‹ ะผะฝะต ะฟะพะฝะฐะดะพะฑะธัˆัŒัั! - ะะฐะฟะพะผะฝัŽ, ั‡ั‚ะพ ะฒ ะดะพะถะดัŒ ั ะปะตั‚ะฐั‚ัŒ ะฝะต ะผะพะณัƒ, - ัั‚ะฐั€ะฐัััŒ ัะดะตั€ะถะฐั‚ัŒ ะฒ ัะฒะพั‘ะผ ะณะพะปะพัะต ะณะฝะตะฒ, ะพั‚ะฒะตั‚ะธะปะฐ ะดะตะฒัƒัˆะบะฐ. ะŸะพัะปะต ั‚ะพะณะพ ะบะฐะบ ะะฐั€ะฐะบัƒ ะธะทะฑะฐะฒะธะปัั ะพั‚ ะฅะฐะบัƒะดะพัˆะธ, ะฟะพะณะปะพั‚ะธะฒ ั€ะตะฑั‘ะฝะบะฐ ะทะฐ ั‚ะพ, ั‡ั‚ะพ ะทะฐะฟะพะดะพะทั€ะธะป ะตะณะพ ะฒ ะฟั€ะตะดะฐั‚ะตะปัŒัั‚ะฒะต, ะšะฐะณัƒั€ะฐ ัั‚ะฐั€ะฐะปะฐััŒ ะฑั‹ั‚ัŒ ะผะฐะบัะธะผะฐะปัŒะฝะพ ะพัั‚ะพั€ะพะถะฝะพะน. - ะฏ ะฝะธะบะพะณะดะฐ ะฝะธั‡ะตะณะพ ะฝะต ะทะฐะฑั‹ะฒะฐัŽ, - ะฒ ะฝะตะฑั€ะตะถะฝะพ ะฑั€ะพัˆะตะฝะฝะพะน ั„ั€ะฐะทะต ะŸะพะฒะตะปะธั‚ะตะปัŒะฝะธั†ะต ะ’ะตั‚ั€ะฐ ะฟะพัะปั‹ัˆะฐะปะฐััŒ ัƒะณั€ะพะทะฐ - ะะฐั€ะฐะบัƒ ัะปะพะฒะฝะพ ะฑั‹ ะฝะฐะฟะพะผะฝะธะป ะตะน ะฟั€ะพ ะฟะพะฟั‹ั‚ะบัƒ ัƒะปะตั‚ะตั‚ัŒ ั ะดะฒัƒะผั ะพัะบะพะปะบะฐะผะธ ะจะธะบะพะฝะฐ. ะ”ะฒะฐ ะฟะพัะปะตะดัƒัŽั‰ะธั… ะดะฝั ะบะฐะถะดั‹ะน ะฑั‹ะป ะทะฐะฝัั‚ ัะฒะพะธะผะธ ะดะตะปะฐะผะธ: ะะฐั€ะฐะบัƒ ะบะพั€ะฟะตะป ะฝะฐะด ัะฒะพะธะผะธ ะบะพั‚ะปะฐะผะธ, ะšะพั…ะฐะบัƒ ั‚ั€ะตะฝะธั€ะพะฒะฐะปัั, ะšะฐะณัƒั€ะฐ ะฑะพั€ะพะปะฐััŒ ั ะทะฐะฒะธัั‚ัŒัŽ ะบ ะšะฐะฝะฝะต, ะบะพั‚ะพั€ะฐั ัะผะพั†ะธะน ะฝะต ะธัะฟั‹ั‚ั‹ะฒะฐะปะฐ ะธ ะพั‚ ัะผะตัะธ ัั‚ั€ะฐั…ะฐ ั ะฝะตั€ะฒะฝั‹ะผ ะพะถะธะดะฐะฝะธะตะผ ะฝะต ะผัƒั‡ะฐะปะฐััŒ. ะ’ัั‘ ะฟั€ะพั…ะพะดะธั‚. ะŸั€ะพัˆะปะธ ะธ ะดะพะถะดัŒ, ะธ ะฒั€ะตะผั ะฟั€ะธะณะพั‚ะพะฒะปะตะฝะธั ะทะตะปัŒั, ะธ ะฒั€ะตะผั ัะพะทะดะฐะฝะธั ะฝะพะฒะพะณะพ ะฟะพั€ะพะถะดะตะฝะธั. 
ะšะฐะณัƒั€ะฐ ั€ะฐััะผะฐั‚ั€ะธะฒะฐะปะฐ ัะธะดัั‰ัƒัŽ ะฟะตั€ะตะด ะฝะตะน ะดะตะฒัƒัˆะบัƒ ะธ ะพั‰ัƒั‰ะฐะปะฐ, ั‡ั‚ะพ ั‡ั‚ะพ-ั‚ะพ ั ะฝะตะน ัะฒะฝะพ ะฑั‹ะปะพ ะฝะต ั‚ะฐะบ. ะ˜ ะปะธัˆัŒ ั‡ะตั€ะตะท ะผะธะฝัƒั‚ัƒ ะฟะพะฝัะปะฐ - ะพะฝะฐ ะฒะพะพะฑั‰ะต ะฝะต ั‡ัƒะฒัั‚ะฒะพะฒะฐะปะฐ ะฝะพะฒะพะต ะฟะพั€ะพะถะดะตะฝะธะต ะะฐั€ะฐะบัƒ! ะะต ะฑั‹ะปะพ ะฝะธ ะดะตะผะพะฝะธั‡ะตัะบะพะน ะฐัƒั€ั‹, ะฝะธ ะทะฐะฟะฐั…ะฐ, ะฝะธ ะผะฐะปะตะนัˆะตะณะพ ะทะฒัƒะบะฐ - ะดะฐะถะต ัั‚ัƒะบะฐ ัะตั€ะดั†ะฐ ะฝะต ัƒะปะฐะฒะปะธะฒะฐะปะธ ั‡ัƒั‚ะบะธะต ัƒัˆะธ ะดะตะผะพะฝะตััั‹! - ะšะฐะณัƒั€ะฐ, ั‡ั‚ะพ ะทะฐัั‚ั‹ะปะฐ? ะฏ ะถะต ัะบะฐะทะฐะป ั‚ะตะฑะต - ัะปะตั‚ะฐะน ะธ ะฟั€ะธะฝะตัะธ ัะฒะพะตะน ัะตัั‚ั€ะต ะพะดะตะถะดัƒ! - ะะฐั€ะฐะบัƒ ะทะฐะผะตั‚ะฝะพ ะฒะพะทะฒั‹ัะธะป ะณะพะปะพั. - ะ’ ะทะฐะผะบะต ะธ ั‚ะฐะบ ะพะดะตะถะดั‹ ะฟะพะปะฝะพ - ะฟัƒัั‚ัŒ ัะฐะผะฐ ะฟะพะดะฑะตั€ั‘ั‚! - ะพั‚ะฒะตั‚ะธะปะฐ ั‚ะฐ. - ะฏ ัะบะฐะทะฐะป: ัะปะตั‚ะฐะน ะธ ะฟั€ะธะฝะตัะธ, - ัะบะฒะพะทัŒ ะทัƒะฑั‹ ะฟั€ะพัˆะธะฟะตะป ะฟะพะปัƒะดะตะผะพะฝ. "ะงั‘ั€ั‚, ั‡ั‚ะพ ัั‚ะพั‚ ะฟะฐัƒะบ ะทะฐะดัƒะผะฐะป, ั‡ั‚ะพ ั ัƒัะปั‹ัˆะฐั‚ัŒ ะฝะต ะดะพะปะถะฝะฐ? ะขะพั‡ะฝะพ ะบะฐะบัƒัŽ-ะฝะธะฑัƒะดัŒ ะณะฐะดะพัั‚ัŒ, ั‡ั‚ะพะฑั‹ ะผะตะฝั ะฟะพะผัƒั‡ะธั‚ะตะปัŒะฝะตะน ัƒะฑะธั‚ัŒ! ะงั‚ะพ ะถะต ะดะตะปะฐั‚ัŒ, ั‡ั‚ะพ ะถะต ะดะตะปะฐั‚ัŒ?! ะกะฑะตะถะฐั‚ัŒ? ะะตั‚, ัƒ ะฝะตะณะพ ะผะพั‘ ัะตั€ะดั†ะต... ะ”ัŒัะฒะพะป, ะผะฝะต ะฟะพัั‚ะฐะฒะปะตะฝ ะผะฐั‚!" - ั€ะฐัััƒะถะดะฐะปะฐ ะŸะพะฒะตะปะธั‚ะตะปัŒะฝะธั†ะฐ ะ’ะตั‚ั€ะฐ, ะธะดั ะฟะพ ะบะพั€ะธะดะพั€ะฐะผ. - ะšะฐะณัƒั€ะฐ-ะดะพะฝะพ, ั‡ั‚ะพ-ั‚ะพ ัะปัƒั‡ะธะปะพััŒ? - ะดะตะผะพะฝะตััะฐ ั‚ะฐะบ ัƒัˆะปะฐ ะฒ ัะฒะพะธ ะผั‹ัะปะธ, ั‡ั‚ะพ ะฝะต ะทะฐะผะตั‚ะธะปะฐ ะฟะพัะฒะปะตะฝะธั ั€ัะดะพะผ ะšะพั…ะฐะบัƒ. - ะšะพั…ะฐะบัƒ-ะบัƒะฝ... - ะดะตะฒัƒัˆะบะฐ ะทะฐะผัะปะฐััŒ ะธ ะฟั€ะธัะตะปะฐ ะฝะฐ ะพะดะฝะพ ะบะพะปะตะฝะพ - ัะผะพั‚ั€ะตั‚ัŒ ะฝะฐ ะฟะฐั€ะฝั ัะฒะตั€ั…ัƒ ะฒะฝะธะท ัะตะนั‡ะฐั ะตะน ัะพะฒะตั€ัˆะตะฝะฝะพ ะฝะต ั…ะพั‚ะตะปะพััŒ, - ะฒ ะผะพะตะน ะบะพะผะฝะฐั‚ะต ะตัั‚ัŒ ัˆะบะฐั‚ัƒะปะบะฐ... ะบั€ะฐัะฝะฐั ั ัะธะฝะธะผะธ ะฟั‚ะธั†ะฐะผะธ... 
ะžั…ะพั‚ะฝะธะบ ะฝะฐ ะดะตะผะพะฝะพะฒ ะฒะฝะธะผะฐั‚ะตะปัŒะฝะพ ัะผะพั‚ั€ะตะป ะฒ ะณะปะฐะทะฐ ะตัะปะธ ะฝะต ะฟะพะดั€ัƒะณะต, ั‚ะพ ัƒะถ ั‚ะพั‡ะฝะพ ัะพัŽะทะฝะธั†ะต, ะธ ัั‚ะฐั€ะฐะปัั ัƒะณะฐะดะฐั‚ัŒ, ั‡ั‚ะพ ัƒ ั‚ะพะน ะฝะฐ ะดัƒัˆะต. - ...ะฟั€ัะผะพัƒะณะพะปัŒะฝะฐั. ะ”ะปะธะฝะพะน... ะฝะตะผะฝะพะณะพ ะดะปะธะฝะฝะตะต ะผะพะตะณะพ ะฒะตะตั€ะฐ, - ะšะฐะณัƒั€ะฐ ะฝะฐ ะดะฒัƒั… ั€ัƒะบะฐั… ะฟะพะบะฐะทะฐะปะฐ ัะฒะพั‘ ะพั€ัƒะถะธะต, ะธ ั€ะตะฑั‘ะฝะพะบ ะบะธะฒะฝัƒะป ะฒ ะทะฝะฐะบ ั‚ะพะณะพ, ั‡ั‚ะพ ะทะฐะฟะพะผะฝะธะป, - ัˆะธั€ะธะฝะพะน - ะฟั€ะธะผะตั€ะฝะพ ะบะฐะบ ั‚ะฒะพั ะปะฐะดะพะฝัŒ ะพั‚ ะฟะฐะปัŒั†ะตะฒ ะดะพ ะพัะฝะพะฒะฐะฝะธั ะปะฐะดะพะฝะธ. ะ—ะฐะบั€ั‹ั‚ะฐั. ะ’ ะพะฑั‰ะตะผ, ะตัะปะธ... ั…ะพั‚ั ัะบะพั€ะตะต ัƒะถ "ะบะพะณะดะฐ"... ะผะตะฝั ัƒะฑัŒั‘ั‚ ะะฐั€ะฐะบัƒ, ะฟะพัั‚ะฐั€ะฐะนัั ะฟะตั€ะตะดะฐั‚ัŒ ะตั‘ ะบะฐะบ-ะฝะธะฑัƒะดัŒ ะกะตััั‘ะผะฐั€ัƒ. ะขั‹ ะฒะตะดัŒ ะตะณะพ ะฟะพะผะฝะธัˆัŒ? ะšะพั…ะฐะบัƒ ัะฝะพะฒะฐ ะบะธะฒะฝัƒะป. - ะกะบะฐะถะตัˆัŒ, ั‡ั‚ะพ ัั‚ะพ ะพั‚ ะผะตะฝั. ะŸัƒัั‚ัŒ ะปะพะผะฐะตั‚ ะทะฐะผะพะบ - ะบะปัŽั‡ ะฒัะตะณะดะฐ ะฟั€ะธ ะผะฝะต. - ะ ะ’ั‹? - ะขั‹ ะถะต ะทะฝะฐะตัˆัŒ, ะบะฐะบะธะต ะฝะตะฑะตะทะพะฟะฐัะฝั‹ะต ะธะณั€ั‹ ะผั‹ ั ั‚ะพะฑะพะน ะทะฐั‚ะตัะปะธ... - ะšะฐะณัƒั€ะฐ ะฒัั‚ะฐะปะฐ, ะฟั€ะพะฒะตะปะฐ ะปะฐะดะพะฝัŒัŽ ะฟะพ ะฒะพะปะพัะฐะผ ะฟะฐั€ะฝั ะธ ะฒั‹ัˆะปะฐ ะธะท ะทะฐะผะบะฐ. ะšะพั…ะฐะบัƒ ัƒะบั€ะฐะดะบะพะน ัะผะฐั…ะฝัƒะป ะฝะฐะฑะตะถะฐะฒัˆัƒัŽ ัะปะตะทัƒ. ะšะพะณะดะฐ ั‡ะตั€ะตะท ะฟะพะปั‡ะฐัะฐ ะดะตะฒัƒัˆะบะฐ ะฒะตั€ะฝัƒะปะฐััŒ ั ะฒะพั€ะพั…ะพะผ ะพะดะตะถะดั‹, ั€ะฐะทะณะพะฒะพั€ ัƒะถะต ัะฒะฝะพ ะทะฐะฒะตั€ัˆะฐะปัั. - ะžั‚ะปะธั‡ะฝะพ. ะ ะฐะท ัƒะถ ะธั… ัะธะปะฐ ะฒ ะตะดะธะฝัั‚ะฒะต ะธ ะฒะพะปะต ะบะฐะถะดะพะณะพ - ะฒั€ัะด ะปะธ ั‡ั‚ะพ-ั‚ะพ ัะผะพะถะตั‚ ัƒะฝะธั‡ั‚ะพะถะธั‚ัŒ ะธั… ะปัƒั‡ัˆะต. ะŸะพัะปะต ั‚ะฐะบะพะณะพ ะดะฐะถะต ะšะพั…ะฐะบัƒ ะธั… ะฟะตั€ะตะฑัŒั‘ั‚ ะฑะตะท ะฟั€ะพะฑะปะตะผ! - ะบะฐะทะฐะปะพััŒ, ะดะฐะถะต ะณะพะปะพั ะฝะพะฒะพะณะพ ะฟะพั€ะพะถะดะตะฝะธั ะฑั‹ะป ะฝะตัƒะปะพะฒะธะผ, ะฟะพัะฒะปััััŒ ัะปะพะฒะฝะพ ะธะท ะฝะธะพั‚ะบัƒะดะฐ ะธ ะตะถะตัะตะบัƒะฝะดะฝะพ ะผะตะฝัั ั‚ะตะผะฑั€. 
- ะšัั‚ะฐั‚ะธ, ะทะฐะฑั‹ะป ะฟั€ะตะดัั‚ะฐะฒะธั‚ัŒ, - ะฟะพะปัƒะดะตะผะพะฝ ัะฒะฝะพ ะฑั‹ะป ั‡ะตะผ-ั‚ะพ ะดะพะฒะพะปะตะฝ, - ะตั‘ ะทะพะฒัƒั‚ ะงะธั‘. ะญั‚ะพะน ะฝะพั‡ัŒัŽ ะฒั‹ ะดะตะนัั‚ะฒัƒะตั‚ะต ะฒะดะฒะพั‘ะผ. ะšะพะผะฟะฐะฝะธั ะ˜ะฝัƒััˆะธ ั€ะฐะทะดะตะปะธะปะฐััŒ - ัั‚ะพ ะธั… ะธ ะฟะพะณัƒะฑะธั‚. ะšัั‚ะฐั‚ะธ, ะšะฐะณัƒั€ะฐ, ั‚ั‹ ั…ะพั€ะพัˆะพ ะผะฝะต ะฟะพัะปัƒะถะธะปะฐ. ะ•ัะปะธ ัั‚ะพะน ะฝะพั‡ัŒัŽ ะฝะต ะพะฑะปะฐะถะฐะตัˆัŒัั - ะฟะพะปัƒั‡ะธัˆัŒ ัะฒะพั‘ ัะตั€ะดั†ะต. "ะ˜ะฝั‚ะตั€ะตัะฝะพ, ั ั…ะพั‚ั ะฑั‹ ั€ะฐััะฒะตั‚ ัƒะฒะธะถัƒ?" - ะผั‹ัะปะตะฝะฝะพ ะฟะพะฟั€ะพั‰ะฐะปะฐััŒ ั ะถะธะทะฝัŒัŽ ะŸะพะฒะตะปะธั‚ะตะปัŒะฝะธั†ะฐ ะ’ะตั‚ั€ะฐ. - ะšะฐะณัƒั€ะฐ, ะฟะพะฒั‚ะพั€ััŽ ัะฟะตั†ะธะฐะปัŒะฝะพ ะดะปั ั‚ะตะฑั - ะดะตะปะฐะตัˆัŒ ะฒัั‘, ั‡ั‚ะพ ะฟั€ะธะบะฐะถะตั‚ ะงะธั‘, - ั ัะดะพะฒะธั‚ะพะน ัƒัะผะตัˆะบะพะน ะฟั€ะพะผะพะปะฒะธะป ะะฐั€ะฐะบัƒ. ะŸะพะฒะตะปะธั‚ะตะปัŒะฝะธั†ะฐ ะ’ะตั‚ั€ะฐ ะฟะตั€ะฒัƒัŽ ั„ั€ะฐะทัƒ ะฟั€ะพัะปัƒัˆะฐะปะฐ - ะฒ ั‚ะพั‚ ะผะพะผะตะฝั‚ ะฟะพะปัƒะดะตะผะพะฝ ะดะตะผะพะฝัั‚ั€ะฐั‚ะธะฒะฝะพ ะพั‚ะดะฐะฒะฐะป ะงะธั‘ ัะตั€ะดั†ะต ะฟะตั€ะฒะพะณะพ ะฟะพั€ะพะถะดะตะฝะธั ะะฐั€ะฐะบัƒ. ะ’ะผะตัั‚ะพ ะพั‚ะฒะตั‚ะฐ ะถะตะฝั‰ะธะฝะฐ ะฒะทะผะฐั…ะฝัƒะปะฐ ะฒะตะตั€ะพะผ, ะธ ะฟะตั€ะพ ั ะดะฒัƒะผั ะดะตะผะพะฝะตััะฐะผะธ ะฒะทะปะตั‚ะตะปะพ. - ะญะน, ะงะธั‘, ะฐ ั‡ั‚ะพ ั‚ั‹ ัƒะผะตะตัˆัŒ, ะฐ? ะšะฐะบ ะฑัƒะดะตัˆัŒ ะ˜ะฝัƒััˆัƒ ะธ ะพัั‚ะฐะปัŒะฝั‹ั… ัƒะฑะธะฒะฐั‚ัŒ? - ะ”ะฐะถะต ะฝะต ะฟั‹ั‚ะฐะนัั ะฝะฐ ะผะตะฝั ะฝะฐะฟะฐัั‚ัŒ, ั‡ั‚ะพะฑั‹ ะพั‚ะพะฑั€ะฐั‚ัŒ ัะตั€ะดั†ะต - ั ัะธะปัŒะฝะตะต, - ะฟั€ะพะณะพะฒะพั€ะธะปะฐ ัะฒะพะธะผ ะฝะตะฟะพะฝัั‚ะฝั‹ะผ ะณะพะปะพัะพะผ ั‚ะฐ. ะšะฐะณัƒั€ะฐ ะดะตั€ะฝัƒะปะฐััŒ: "ะ”ะพะณะฐะดะปะธะฒะฐั ัั‚ะตั€ะฒะฐ!" - ะ˜ ั ะฝะต ะฑัƒะดัƒ ะธั… ัƒะฑะธะฒะฐั‚ัŒ. ะฏ ัะดะตะปะฐัŽ ั‚ะฐะบ, ั‡ั‚ะพ ะพะฝะธ ัะฐะผะธ ะฒัะต ะฑัƒะดัƒั‚ ะธัะบะฐั‚ัŒ ัะผะตั€ั‚ะธ. ะ”ะฐะฒะฐะน ะฑั‹ัั‚ั€ะตะต, ะฝะฐั ะถะดัƒั‚, ะธะปะธ, ะฒะตั€ะฝะตะต, ะฝะต ะถะดัƒั‚ ะฒ ั‚ั€ั‘ั… ะผะตัั‚ะฐั…. - ะ’ ั‚ั€ั‘ั…? - ัƒะดะธะฒะปั‘ะฝะฝะพ ะฒัะบะธะฝัƒะปะฐ ะฑั€ะพะฒัŒ ะŸะพะฒะตะปะธั‚ะตะปัŒะฝะธั†ะฐ ะ’ะตั‚ั€ะฐ. 
- ะ“ั€ัƒะฟะฟะฐ ะ˜ะฝัƒััˆะธ ั€ะฐะทะดะตะปะธะปะฐััŒ ะฝะฐ ั‚ั€ะธ? ะงะธั‘ ะบะพั€ะพั‚ะบะพ ั…ะพั…ะพั‚ะฝัƒะปะฐ. ะฅะพั…ะพั‚ ะฑั‹ะป ัั‚ะพะปัŒ ะถะต ะฝะตะฟั€ะธัั‚ะฝั‹ะผ, ะบะฐะบ ะธ ะณะพะปะพั. - ะะต ั‚ะฒะพั‘ ะดะตะปะพ. ะะฐั‡ะฝั‘ะผ ั... - ะกะฐะฝะณะพ, ัƒะฒะตั€ะตะฝ, ะธั… ะดัƒัˆะธ ะฟั€ะตะฑั‹ะฒะฐัŽั‚ ะฒ ะปัƒั‡ัˆะตะผ ะผะธั€ะต, - ั‚ะธั…ะพ ะฟั€ะพะณะพะฒะพั€ะธะป ะœะธั€ะพะบัƒ. - ะกะฟะฐัะธะฑะพ ะทะฐ ะฟะพะดะดะตั€ะถะบัƒ, ั…ะพะพัˆะธ-ัะฐะผะฐ. ะก ะผะพะผะตะฝั‚ะฐ ั€ะตะทะฝะธ ะฒ ะทะฐะผะบะต ะะฐั€ะฐะบัƒ ะธ ัะตะปะตะฝะธะธ ะพั…ะพั‚ะฝะธะบะพะฒ ะฝะฐ ะดะตะผะพะฝะพะฒ ะฟั€ะพัˆั‘ะป ั€ะพะฒะฝะพ ะณะพะด. ะกะฐะฝะณะพ, ะพัั‚ะฐะฒะธะฒ ะ˜ะฝัƒััˆัƒ, ะจะธะฟะฟะพ ะธ ะšะฐะณะพะผั ะฒ ะดะตั€ะตะฒะฝะต ะšะฐัะดั, ะพั‚ะฟั€ะฐะฒะธะปะฐััŒ ะฝะฐะฒะตัั‚ะธั‚ัŒ ะผะพะณะธะปั‹ ัะฒะพะธั… ั€ะพะดัั‚ะฒะตะฝะฝะธะบะพะฒ ะธ ั‚ะพะฒะฐั€ะธั‰ะตะน. ะขะพ, ั‡ั‚ะพ ะดะตะฒัƒัˆะบะฐ ะฒะทัะปะฐ ั ัะพะฑะพะน ะœะธั€ะพะบัƒ, ะฟะฐั€ะตะฝัŒ ะฒะพัะฟั€ะธะฝัะป ะบะฐะบ ะทะฝะฐะบ ะพัะพะฑะพะณะพ ะดะพะฒะตั€ะธั ะธ, ะฝะตัะผะพั‚ั€ั ะฝะฐ ะฒัั‘ ัะฒะพั‘ ะถะตะปะฐะฝะธะต, ัั‚ะฐั€ะฐะปัั ัะตะฑั ะฒะตัั‚ะธ ะฟะพะดะพะฑะฐัŽั‰ะต, ั‡ั‚ะพ ะตะผัƒ ะฟะพะบะฐ ัƒะดะฐะฒะฐะปะพััŒ. ะ—ะฐ ะดะตะฝัŒ ะพะฝ ั ะกะฐะฝะณะพ ะพะฑะฝะพะฒะธะป ะฝะฐะดะณั€ะพะฑะฝั‹ะต ั‚ะฐะฑะปะธั‡ะบะธ, ะฟะพะผะพะณ, ะฟะพ ะตั‘ ะฟั€ะพััŒะฑะต, ะฟะพะฒั‹ะดะตั€ะณะฐั‚ัŒ ั€ะฐะทั€ะพััˆะธะตัั ะฝะฐ ะผะพะณะธะปะฐั… ั€ะฐัั‚ะตะฝะธั, ัะพะฒะตั€ัˆะธะป ะฝัƒะถะฝั‹ะต ะพะฑั€ัะดั‹. - ะญั…. ะ˜ะฝั‚ะตั€ะตัะฝะพ, ะณะดะต ัะตะนั‡ะฐั ะšะพั…ะฐะบัƒ? - ะฒะทะดะพั…ะฝัƒะปะฐ ะพั…ะพั‚ะฝะธั†ะฐ. ะœะพะฝะฐั… ะฟะพะดะฝัะป ะบ ะณะปะฐะทะฐ ะบ ะฝะตะฑัƒ: ะตะณะพ ะฟะพั‚ะฐั‘ะฝะฝั‹ะต ะผะพะปะธั‚ะฒั‹ ัƒัะปั‹ัˆะฐะฝั‹ ะฝะต ะฑั‹ะปะธ. ะŸะฐั€ะตะฝัŒ ะดะพ ะฟะพัะปะตะดะฝะตะณะพ ะฝะฐะดะตัะปัั, ั‡ั‚ะพ ะดะตะฒัƒัˆะบะฐ ะฝะต ะฒัะฟะพะผะฝะธั‚ ะฟั€ะพ ะฑั€ะฐั‚ะฐ - ะบะฐะถะดั‹ะน ั€ะฐะท, ะบะพะณะดะฐ ัั‚ะพ ะฟั€ะพะธัั…ะพะดะธะปะพ, ะพะฝะฐ ะฒะฟะฐะดะฐะปะฐ ะฒ ั‚ัะถั‘ะปัƒัŽ ะผะตะปะฐะฝั…ะพะปะธัŽ. - ะกะฐะฝะณะพ-ั‚ัะฝ, ะผั‹ ะพะฑัะทะฐั‚ะตะปัŒะฝะพ ะฝะฐะนะดั‘ะผ ะตะณะพ. ะะฐะนะดั‘ะผ ะธ ะฒั‹ั€ะฒะตะผ ะธะท ะปะฐะฟ ะะฐั€ะฐะบัƒ, - ะœะธั€ะพะบัƒ ะพะฑะฝัะป ะพั…ะพั‚ะฝะธั†ัƒ. 
ะขะฐ ั…ะฝั‹ะบะฝัƒะปะฐ ะตะผัƒ ะฒ ะฟะปะตั‡ะพ. ะšะธั€ะฐั€ะฐ ะฟั€ะธะฝัะปะฐััŒ ัƒั‚ะตัˆะธั‚ะตะปัŒะฝะพ ะผััƒะบะฐั‚ัŒ. - ะฅะพะพัˆะธ... ะผะพะถะฝะพ ะ’ั‹... ะฟะพัะฟะธั‚ะต ะฒ ะพะดะฝะพะผ ะดะพะผะต ัะพ ะผะฝะพะน? - ะฒัั…ะปะธะฟั‹ะฒะฐั, ั‚ะธั…ะพะฝัŒะบะพ ะฟะพะฟั€ะพัะธะปะฐ ะดะตะฒัƒัˆะบะฐ. - ะกะฐะฝะณะพ, ั ะผะพะณัƒ ัะฟั€ะพัะธั‚ัŒ... - ะฏ ะฟั€ะพัั‚ะพ ะฝะต ั…ะพั‡ัƒ ะพัั‚ะฐะฒะฐั‚ัŒัั ะพะดะฝะฐ. ะะฐะดะตัŽััŒ, ะผะพั‘ ะฟั€ะตะดะปะพะถะตะฝะธะต ะ’ั‹ ะฝะต ะฒะพัะฟั€ะธะผะตั‚ะต, ะบะฐะบ ั€ะฐะทั€ะตัˆะตะฝะธะต ั€ะฐัะฟัƒัะบะฐั‚ัŒ ั€ัƒะบะธ! - ะฟะพัะปะตะดะฝัŽัŽ ั„ั€ะฐะทัƒ ะกะฐะฝะณะพ ะฟั€ะพะธะทะฝะตัะปะฐ ัƒะถะต ะฑะพะปะตะต ะถั‘ัั‚ะบะธะผ ั‚ะพะฝะพะผ. ะœะธั€ะพะบัƒ ะฟะพะดะฐะฒะธะป ะฒะทะดะพั… ั€ะฐะทะพั‡ะฐั€ะพะฒะฐะฝะธั. - ะ›ะฐะดะฝะพ, ะพะฑะตั‰ะฐัŽ ะฒะตัั‚ะธ ัะตะฑั ัะพะพั‚ะฒะตั‚ัั‚ะฒะตะฝะฝะพ ัะฒะพะตะผัƒ ะดัƒั…ะพะฒะฝะพะผัƒ ะทะฒะฐะฝะธัŽ. - ะกะฟะฐัะธะฑะพ. ะŸะพะนะดั‘ะผ ัƒะบะปะฐะดั‹ะฒะฐั‚ัŒัั - ัะบะพั€ะพ ัั‚ะตะผะฝะตะตั‚. - ะ‘ะปะธะฝ, ะฝัƒ ะฒะพั‚ ะฟะพะฝะตัะปะฐ ะฑะฐะฑะบัƒ ะฝะตะปั‘ะณะบะฐั ั‡ั‘ั€ั‚ะธ ะบัƒะดะฐ ะฝะฐ ะฝะพั‡ัŒ ะณะปัะดั! - ั€ะฐะทะดั€ะฐะถั‘ะฝะฝะพ ั€ัะฒะบะฝัƒะป ะ˜ะฝัƒััˆะฐ. - ะ—ะฐะผะตั‚ัŒ, ั, ะบะฐะบ ะผะธะบะพ, ะดะพะปะถะฝะฐ ะฑั‹ะปะฐ ะฑั‹ ะฟะพะนั‚ะธ ัะฐะผะฐ! ะ˜ ะฟะพัˆะปะฐ ะฑั‹, ะดะฐ ั‚ะพะปัŒะบะพ ะพะฝะฐ ะทะฐะฟั€ะตั‚ะธะปะฐ - ัะบะฐะทะฐะปะฐ, ั‡ั‚ะพ ะฝะตะทะฐั‡ะตะผ ะปะตะบะฐั€ัั‚ะฒะฐ ะธะท ะผะพะตะณะพ ะผะธั€ะฐ ะฒัะตะผ ะฟะพะดั€ัะด ะฟะพะบะฐะทั‹ะฒะฐั‚ัŒ - ะฒะพะฟั€ะพัั‹ ะฝะตะฝัƒะถะฝั‹ะต ะฒะพะทะฝะธะบะฝัƒั‚. - ะ ะพะฝะฐ ะฝะต ะฟั€ะฐะฒะฐ? ะขั€ั€ั€ั€ั€. - ะจะธะฟะฟะพ, ั…ะฒะฐั‚ะธั‚ ัƒะถะต - ะดะพัั‚ะฐะป ะดะพ ะพะดัƒั€ะตะฝะธั ัะฒะพะตะน ั‚ั€ั‹ะบะฐะปะบะพะน! - ะ˜ะฝัƒััˆะฐ ะฑั€ะพัะธะป ะฒ ะบะธั†ัƒะฝั ะทะปะพะน ะฒะทะณะปัะด. - ะžั‚ัั‚ะฐะฝัŒ! ะœะฝะต ะšะฐะณะพะผั-ั‚ัะฝ ะธะณั€ัƒัˆะบัƒ ะฟั€ะธะฝะตัะปะฐ, ะธ ั ะฑัƒะดัƒ ะธะณั€ะฐั‚ัŒ! - ะฒั‹ะบั€ะธะบะฝัƒะป ั€ะตะฑั‘ะฝะพะบ, ะฟั€ะธะบั€ั‹ะฒะฐั ัะพะฑะพะน ะฝะฐะฟะพะปะฝะตะฝะฝั‹ะน ะฑัƒะฑะตะฝั‡ะธะบะฐะผะธ ะผัั‡ะธะบ. - ะงั‘ั€ั‚, ะฝัƒ ัƒัˆะธ ะถะต ะฒัะฝัƒั‚ ะธ ะณะพะปะพะฒะฐ ะทะฒะตะฝะธั‚! ะšะฐะณะพะผั, ะดัƒั€ะฐ, ะทะฐั‡ะตะผ ัั‚ัƒ ะณะฐะดะพัั‚ัŒ ะฟั€ะธะฝะตัะปะฐ? 
- ะกะธะดะตั‚ัŒ! ะขั€ะตั‰. - ะญั‚ะพ ะฝะต ะณะฐะดะพัั‚ัŒ. ะญั‚ะพ ะธะณั€ัƒัˆะบะฐ. ะ˜ะฝัƒััˆะฐ, ะฝัƒ ะพะฝ ะถะต ั€ะตะฑั‘ะฝะพะบ - ะฟัƒัั‚ัŒ ะธะณั€ะฐะตั‚ัั! ะ˜ ะฝะตั‚ ะฑั‹ ะทะฐ ัะตะฑั ะฟะพั€ะฐะดะพะฒะฐั‚ัŒัั - ัะผะพั‚ั€ะธ, ัะบะพะปัŒะบะพ ั ั‚ะฒะพะตะน ะปัŽะฑะธะผะพะน ะปะฐะฟัˆะธ ะฟั€ะธะฝะตัะปะฐ! - ะผะธะบะพ ะฟะพัั‚ะฐั€ะฐะปะฐััŒ ะฟะตั€ะตะบะปัŽั‡ะธั‚ัŒ ะตะณะพ ะฒะฝะธะผะฐะฝะธะต ะฝะฐ ั‡ั‚ะพ-ั‚ะพ ะฟั€ะธัั‚ะฝะพะต. ะขั€ั€ั€ั€ั€. - ะฅะพั‚ัŒ ะบะฐะบะฐั-ั‚ะพ ะฟะพะปัŒะทะฐ ะพั‚ ั‚ะตะฑั ะธ ั‚ะฒะพะตะน ัะฟะพั…ะธ, - ะฑัƒั€ะบะฝัƒะป ะฟะพะปัƒะดะตะผะพะฝ. - ะฅะฐะผ! - ะกั‚ะตั€ะฒะฐ! - ะกะธะดะตั‚ัŒ! ะขั€ะตั‰. - ะ’ั‚ะพั€ะพะน ั€ะฐะท ัƒะถะต! - ะฟั€ะพะพั€ะฐะป ะฑะตะปะพะฒะพะปะพัั‹ะน ะฟะฐั€ะตะฝัŒ ั ะฟะพะปะฐ. ะขั€ั€ั€ั€ั€ั€ั€ั€. - ะ—ะฐัะปัƒะถะธะป! - ะฒ ั‚ะพะฝ ะตะผัƒ ะพั‚ะฒะตั‚ะธะปะฐ ะšะฐะณะพะผั, ะฟะพั‚ะธั€ะฐั ะฟะฐะปัŒั†ะฐะผะธ ะฒะธัะบะธ - ะทะฒะพะฝ ะผัั‡ะธะบะฐ ะฝะฐั‡ะฐะป ะธ ัƒ ะฝะตั‘ ะฒั‹ะทั‹ะฒะฐั‚ัŒ ะณะพะปะพะฒะฝัƒัŽ ะฑะพะปัŒ, ะฝะพ ะดะพะฑั€ะพั‚ะฐ, ะฟะพะดะฟะธั‚ั‹ะฒะฐะตะผะฐั ัƒะฟั€ัะผัั‚ะฒะพะผ ะธ ะฝะตะถะตะปะฐะฝะธะตะผ ัะพะณะปะฐัะธั‚ัŒัั ัะตะนั‡ะฐั ั ะฟะพะปัƒะดะตะผะพะฝะพะผ, ะฝะต ะดะฐะฒะฐะปะธ ะตะน ะทะฐะฟั€ะตั‚ะธั‚ัŒ ะธะณั€ะฐั‚ัŒ ะปะธัั‘ะฝะบัƒ. - ะ˜ะฝั‚ะตั€ะตัะฝะพ, ะบะฐะบ ั‚ะฐะผ ะœะธั€ะพะบัƒ ั ะกะฐะฝะณะพ? - ะŸะพะดะธ ัั‚ะพั‚ ะฑะปัƒะดะธัั‚ ั‚ะฐะผ ั€ะฐะทะพัˆั‘ะปัั, - ะพั‚ะฒะตั‚ะธะป ะฟะฐั€ะตะฝัŒ, ะฟะพะดะฝะธะผะฐัััŒ ั ะฟะพะปะฐ. - ะ˜ะฝัƒััˆะฐ, ะœะธั€ะพะบัƒ-ัะฐะผะฐ ะทะฝะฐะตั‚, ะบะพะณะดะฐ ะฝัƒะถะฝะพ ัะดะตั€ะถะฐั‚ัŒัั. - ะฃะณัƒ. ะšะฐะบ ะพั‚ ะฝะฐัˆะตะน ะพั…ะพั‚ะฝะธั†ั‹ ะฟะพะปัƒั‡ะธั‚ - ั‚ะฐะบ ัั€ะฐะทัƒ ะฝะฐั‡ะธะฝะฐะตั‚ ัะดะตั€ะถะธะฒะฐั‚ัŒัั. ะขั€ั€ั€ั€ั€. - ะ›ะฐะดะฝะพ, ะดะฐะฒะฐะนั‚ะต ัะฟะฐั‚ัŒ! - ะšะฐะณะพะผั ะดะตะผะพะฝัั‚ั€ะฐั‚ะธะฒะฝะพ ัƒะปะตะณะปะฐััŒ ะฒ ัะฟะฐะปัŒะฝั‹ะน ะผะตัˆะพะบ, ะฝะฐะดะตัััŒ, ั‡ั‚ะพ ะตั‘ ะฟั€ะธะผะตั€ัƒ ะฟะพัะปะตะดัƒัŽั‚ ะธ ะพัั‚ะฐะปัŒะฝั‹ะต. ะ’ัะบะพั€ะต ะฒ ั…ะธะถะธะฝะต ะดะตะนัั‚ะฒะธั‚ะตะปัŒะฝะพ ัั‚ะฐะปะพ ั‚ะตะผะฝะพ ะธ ั‚ะธั…ะพ. ะšะฐะณัƒั€ะฐ ะธ ะงะธั‘ ะฟั€ะธะฑะปะธะทะธะปะธััŒ ะบ ะฟะตั€ะฒะพะน ั†ะตะปะธ. - ...ััŽะดะฐ... 
ะกะฐะฝะณะพ, ัƒัะปั‹ัˆะฐะฒ ะฝะตะฟะพะฝัั‚ะฝะพ ั‡ะตะน ะณะพะปะพั, ะพั‚ะบั€ั‹ะปะฐ ะณะปะฐะทะฐ ะธ ัƒะฒะธะดะตะปะฐ, ะบะฐะบ ะฝะฐ ะฝะตั‘ ะฟะฐะดะฐะตั‚ ั‡ะตะปะพะฒะตั‡ะตัะบะฐั ั‚ะตะฝัŒ. ะ”ะตะฒัƒัˆะบะฐ ะพั‚ั€ะตะฐะณะธั€ะพะฒะฐะปะฐ ะฟะพะปะฝะพัั‚ัŒัŽ ะธะฝัั‚ะธะฝะบั‚ะธะฒะฝะพ - ะฟะฝัƒะปะฐ ั‚ะตะฝัŒ ะดะฒัƒะผั ะฝะพะณะฐะผะธ ะฒ ะถะธะฒะพั‚. ะขะฐ ะพั…ะฝัƒะปะฐ. - ะกะพะฟั€ะพั‚ะธะฒะปัะตัˆัŒัั, ะฝะตะดะพั‚ั€ะพะณะฐ? ะะต ะฑะพะนัั - ั ะพั‚ั‚ั€ะฐั…ะฐัŽ ั‚ะตะฑั ะฟั€ะธัั‚ะฝะพ! ะ”ะตะฒัƒัˆะบะฐ ั ัƒะดะธะฒะปะตะฝะธะตะผ ัƒะทะฝะฐะปะฐ ะฟะพ ะณะพะปะพััƒ ะœะธั€ะพะบัƒ ะธ ะฟะพะฝัะปะฐ, ะฟะพั‡ะตะผัƒ ะšะธั€ะฐั€ะฐ ะตั‘ ะฝะต ั€ะฐะทะฑัƒะดะธะปะฐ, ะบะฐะบ ะพะฝะฐ ัั‚ะพ ะพะฑั‹ั‡ะฝะพ ะดะตะปะฐะปะฐ ะฟั€ะธ ะฟะพัะฒะปะตะฝะธะธ ั‡ัƒะถะธั… ะทะฐะฟะฐั…ะพะฒ. - ะญะน, ั‚ั‹ ั‡ั‚ะพ, ะณะพะปะพะฒะพะน ัƒะดะฐั€ะธะปัั? - ะพัˆะฐั€ะฐัˆะตะฝะฝะพ ัะฟั€ะพัะธะปะฐ ะพั…ะพั‚ะฝะธั†ะฐ. ะ’ะผะตัั‚ะพ ะพั‚ะฒะตั‚ะฐ ะฟะฐั€ะตะฝัŒ ัะฝะพะฒะฐ ะฟั€ั‹ะณะฝัƒะป ะฝะฐ ะกะฐะฝะณะพ. ะขะฐ ะพั‚ะบะฐั‚ะธะปะฐััŒ ัะพ ัะฒะพะตะณะพ ั„ัƒั‚ะพะฝะฐ ะฒ ัั‚ะพั€ะพะฝัƒ ะธ ะฟะพะผะพั€ั‰ะธะปะฐััŒ ะพั‚ ะฑะพะปะธ - ะฟะพะด ะปะพะบะพั‚ัŒ ะฟะพะฟะฐะปะธ ะพัะบะพะปะบะธ ะบะฐะบะพะณะพ-ั‚ะพ ัะพััƒะดะฐ, ะธ ะฟะฐั€ะฐ ะฒะฟะธะปะฐััŒ ะฒ ะบะพะถัƒ. "ะšะฐะบะพะณะพ ั‡ั‘ั€ั‚ะฐ! ะžะฝ, ั‡ั‚ะพ ะปะธ, ะฝะฐะฟะธะปัั? ะะตั‡ะตะผ ะฒั€ะพะดะต! ะ˜ ั‡ั‚ะพ ั‚ัƒั‚ ะดะตะปะฐะตั‚ ัั‚ะฐ ะบะตั€ะฐะผะธั‡ะตัะบะฐั ะณะฐะดะพัั‚ัŒ - ะผั‹ ะถะต ั‚ะพะปัŒะบะพ ัƒะฑั€ะฐะปะธััŒ, ะธ ะฝะฐ ะฟะพะปัƒ ะฝะธั‡ะตะณะพ ะฝะต ะฑั‹ะปะพ! - ััƒะดะพั€ะพะถะฝะพ ะฟั‹ั‚ะฐะปะฐััŒ ั€ะฐะทะพะฑั€ะฐั‚ัŒัั ะกะฐะฝะณะพ, ะฟั€ะพัั‹ะฟะฐัััŒ ะพะบะพะฝั‡ะฐั‚ะตะปัŒะฝะพ ะธ ัƒะฒะพั€ะฐั‡ะธะฒะฐัััŒ ะพั‚ ะฝะพะฒะพะน ะฟะพะฟั‹ั‚ะบะธ ัั…ะฒะฐั‚ะธั‚ัŒ ะตั‘. - ะขะพั‡ะฝะพ! ะ—ะฝะฐั‡ะธั‚, ะพั‚ั€ะฐะฒะฐ ั‚ะฐะบะฐั!" - ะะต ัƒะนะดั‘ัˆัŒ ั‚ะตะฟะตั€ัŒ! ะกะดะตะปะฐะน ะถะต ะผะฝะต ะฟั€ะธัั‚ะฝะพ! - ะœะธั€ะพะบัƒ ะฒ ะพั‡ะตั€ะตะดะฝะพะผ ัะบะฐั‡ะบะต ัƒะดะฐะปะพััŒ ะฝะฐะฒะฐะปะธั‚ัŒัั ะฝะฐ ะดะตะฒัƒัˆะบัƒ. ะžะดะฝัƒ ั€ัƒะบัƒ ะพะฝ ัะผะพะณ ะทะฐั…ะฒะฐั‚ะธั‚ัŒ, ะฒั‚ะพั€ัƒัŽ ะฟั€ะธ ะฝะตัƒะดะฐั‡ะฝะพะผ ะบัƒะฒั‹ั€ะบะต ะฟั€ะธะถะฐะปะฐ ัะพะฑะพะน ัะฐะผะฐ ะกะฐะฝะณะพ. 
ะะพะณะธ ะดะตะฒัƒัˆะบะธ ะฟะฐั€ะตะฝัŒ ะฟั€ะธะดะฐะฒะธะป ัะฒะพะธะผะธ, ั‚ะฐะบ ั‡ั‚ะพ ัƒะดะฐั€ะธั‚ัŒ ั‚ะฐ ะฝะต ะผะพะณะปะฐ ะฝะธะบะฐะบ. ะžะฑั‹ั‡ะฝะพะณะพ ะผัƒะถั‡ะธะฝัƒ ั‚ั€ะตะฝะธั€ะพะฒะฐะฝะฝะฐั ะพั…ะพั‚ะฝะธั†ะฐ ัะผะพะณะปะฐ ะฑั‹ ะฑะตะท ะฟั€ะพะฑะปะตะผ ัะฑั€ะพัะธั‚ัŒ, ะฝะพ ะตั‘ ะฝั‹ะฝะตัˆะฝะธะน ะฟั€ะพั‚ะธะฒะฝะธะบ ั‚ะพะถะต ะฑั‹ะป ั…ะพั€ะพัˆะพ ะฟะพะดะณะพั‚ะพะฒะปะตะฝ. ะกะฒะพะฑะพะดะฝะพะน ั€ัƒะบะพะน ะฟะฐั€ะตะฝัŒ ั€ะฐะทะพั€ะฒะฐะป ะฑะตะปัŒั‘ ะดะตะฒัƒัˆะบะธ. - ะšะธั€ะฐั€ะฐ, ัƒะฑะตั€ะธ ะตะณะพ! ะ•ะดะธะฝัั‚ะฒะตะฝะฝะพะน ะฟั€ะธั‡ะธะฝะพะน, ะฟะพ ะบะพั‚ะพั€ะพะน ะบะพัˆะบะฐ ะฝะต ะฒะผะตัˆะฐะปะฐััŒ ั€ะฐะฝัŒัˆะต, ะฑั‹ะปะธ ะพัะพะฑั‹ะต ะพั‚ะฝะพัˆะตะฝะธั ะผะตะถะดัƒ ะตั‘ ั…ะพะทัะนะบะพะน ะธ ะบะพะผะฟะฐะฝัŒะพะฝะพะผ - ะตัะปะธ ะฑั‹ ะบั‚ะพ ะดั€ัƒะณะพะน ะฟะพะฟั‹ั‚ะฐะปัั ัะพั‚ะฒะพั€ะธั‚ัŒ ั‚ะฐะบะพะต ะฝะฐ ะตั‘ ะณะปะฐะทะฐั… ั ะพั…ะพั‚ะฝะธั†ะตะน, ะพะฝ ัƒะถะต ั‡ะตั€ะตะท ัะตะบัƒะฝะดัƒ ะฒะฐะปัะปัั ะฑั‹ ั ั€ะฐะทะพั€ะฒะฐะฝะฝั‹ะผ ะณะพั€ะปะพะผ ะธ ะฒั‹ะฟัƒั‰ะตะฝะฝั‹ะผะธ ะบะธัˆะบะฐะผะธ. ะขะตะฟะตั€ัŒ ะถะต, ะฝะฐะบะพะฝะตั†-ั‚ะพ ะฟะพะปัƒั‡ะธะฒ ะฟั€ะธะบะฐะท, ะšะธั€ะฐั€ะฐ ะผะณะฝะพะฒะตะฝะฝะพ ะพะฑะตั€ะฝัƒะปะฐััŒ ะธ ะฟั€ะพัั‚ะพ ัะฝะตัะปะฐ ะฒ ะฟั€ั‹ะถะบะต ัƒะดะฐั€ะพะผ ั‚ะตะปะฐ ะœะธั€ะพะบัƒ ั ะกะฐะฝะณะพ. - ะ”ะตั€ะถะธ, ะฝะพ ะฝะต ั‚ั€ะพะณะฐะน! - ะบั€ะธะบะฝัƒะปะฐ ะดะตะฒัƒัˆะบะฐ, ะฑั€ะพัะฐัััŒ ะฒ ัƒะณะพะป, ะณะดะต ะปะตะถะฐะปะธ ะตั‘ ะฒะตั‰ะธ. ะ‘ะพะปัŒัˆะธะฝัั‚ะฒะพ ัะพัั‚' - 'ะšั‚ะพ ะฟั€ะธะดัƒะผะฐะป ะฟั€ะฐะทะดะฝะพะฒะฐั‚ัŒ ะ ะพะถะดะตัั‚ะฒะพ ะฒัะตะผ ะฒะผะตัั‚ะต? ะšะฐะบะพะน ัะฐะดะธัั‚? ะ—ะฐ ั‡ั‚ะพ ะพะฝ ะผะตะฝั ั‚ะฐะบ ะฝะตะฝะฐะฒะธะดะธั‚? ะ—ะฐ ะพะบะฝะฐะผะธ ะฑั‹ะปะฐ ะผะตั‚ะตะปัŒ, ะบะพั‚ะพั€ะฐั, ะบะฐะทะฐะปะพััŒ, ะพั‚ั€ะตะทะฐะปะฐ ะฝะฐัˆ ะดะพะผะธะบ ะพั‚ ะฒัะตะณะพ ะพัั‚ะฐะปัŒะฝะพะณะพ ะผะธั€ะฐ. ะ ะผะฝะต ั…ะพั‚ะตะปะพััŒ ะพั‚ั€ะตะทะฐั‚ัŒ ัะตะฑั ะพั‚ ะฒัะตั… ะฒ ะฟั€ะธะฝั†ะธะฟะต. ะฃะณะพั€ะฐะทะดะธะปะพ ะถะต ะผะตะฝั ัะพะณะปะฐัะธั‚ัŒัั ะฟะพะผะพะณะฐั‚ัŒ ะ ะธั…ัƒ ะฝะฐ ะบัƒั…ะฝะต. ะะฐะดะพ ะฑั‹ะปะพ ะฟั€ะตะดะฒะธะดะตั‚ัŒ, ั‡ั‚ะพ ั ะฝะธะผ ะฑัƒะดะตั‚ ะพะฝ. 
ะฏ ะบั€ะพัˆะธะป ะพะฒะพั‰ะธ ะดะปั ัะฐะปะฐั‚ะฐ ั ะพัะพะฑะพะน ัะพัั€ะตะดะพั‚ะพั‡ะตะฝะฝะพัั‚ัŒัŽ, ัั‚ะฐั€ะฐัััŒ ะฝะต ะฟะพะดะฝะธะผะฐั‚ัŒ ะณะปะฐะท ะพั‚ ั€ะฐะทะดะตะปะพั‡ะฝะพะน ะดะพัะบะธ, ะฝะพ ัƒ ะผะตะฝั ะฝะต ะฒัะตะณะดะฐ ัั‚ะพ ะฟะพะปัƒั‡ะฐะปะพััŒ. ะฏ ะฝะต ัƒะดะตั€ะถะฐะปัั ะธ ะฑั€ะพัะธะป ะฒะทะณะปัะด ะฝะฐ ะฝะธั…: ะŸะฐัƒะปัŒ ะฟะพะดะพัˆั‘ะป ะบ ะ ะธั…ะฐั€ะดัƒ ัะทะฐะดะธ ะธ ะฟั€ะธะพะฑะฝัะป ะตะณะพ ะทะฐ ั‚ะฐะปะธัŽ, ะ ะธั… ะผัะณะบะพ, ะฝะพ ะฝะฐัั‚ะพะนั‡ะธะฒะพ ะพั‚ะพะดะฒะธะฝัƒะป ะตะณะพ ั€ัƒะบะธ, ะฝะฐะผะตะบะฐั ะฝะฐ ั‚ะพ, ั‡ั‚ะพ ะพะฝะธ ัะตะนั‡ะฐั ะฝะต ะพะดะฝะธ, ะฝะพ ะฒัั‘ ะถะต ะดะพะฒะพะปัŒะฝะพ ัƒะปั‹ะฑะฝัƒะปัั. ะฏ ะฝะตะฟั€ะพะธะทะฒะพะปัŒะฝะพ ัะถะฐะป ั€ัƒะบะพัั‚ะบัƒ ะฝะพะถะฐ ะตั‰ั‘ ัะธะปัŒะฝะตะต, ั‚ะฐะบ, ั‡ั‚ะพ ะบะพัั‚ััˆะบะธ ะฟะฐะปัŒั†ะตะฒ ะฟะพะฑะตะปะตะปะธ, ะธ ะฟั€ะธะฝัะปัั ะบั€ะพัˆะธั‚ัŒ ะพะฒะพั‰ะธ ะตั‰ั‘ ะผะตะปัŒั‡ะต, ะฟั€ะตะฒั€ะฐั‰ะฐั ะธั… ะฒ ะบะฐัˆัƒ. ะฏ ัƒะฟะพั€ะฝะพ ะฟั‹ั‚ะฐะปัั ะทะฐะณะฝะฐั‚ัŒ ะพะฑัƒั€ะตะฒะฐัŽั‰ะธะต ะผะตะฝั ั‡ัƒะฒัั‚ะฒะฐ ะธ ัะผะพั†ะธะธ ะบัƒะดะฐ-ะฝะธะฑัƒะดัŒ ะฒ ัะฐะผั‹ะน ะดะฐะปัŒะฝะธะน ัƒะณะพะปะพะบ ัะฒะพะตะน ะดัƒัˆะธ, ัั‚ะฐั€ะฐัััŒ, ั‡ั‚ะพะฑั‹ ะพะฝะธ ะฝะต ะฟั€ะพัั‚ัƒะฟะธะปะธ ัƒ ะผะตะฝั ะฝะฐ ะปะธั†ะต. ะšะฐะบะพะต ะผะฝะต ะดะตะปะพ ะดะพ ะฝะฐัˆะธั… ะณะพะปัƒะฑะบะพะฒ? ะะธะบะฐะบะพะณะพ. ะ ัƒะบะฐ ะ ะธั…ะฐั€ะดะฐ ะฟะปะฐะฒะฝะพ ะปะตะณะปะฐ ะฝะฐ ะผะพัŽ. ะฏ ะฒะทะดั€ะพะณะฝัƒะป ะพั‚ ะฝะตะพะถะธะดะฐะฝะฝะพัั‚ะธ ะธ ะฒั‹ั€ะพะฝะธะป ะฝะพะถ, ะถะฐะปะพะฑะฝะพ ะทะฒัะบะฝัƒะฒัˆะธะน ะพั‚ ัƒะดะฐั€ะฐ ะพะฑ ะฟะพะป. - ะŸะพ-ะผะพะตะผัƒ, ัƒะถะต ะดะพัั‚ะฐั‚ะพั‡ะฝะพ, - ะผัะณะบะธะน ะณะพะปะพั ะปะธะด-ะณะธั‚ะฐั€ะธัั‚ะฐ ะฝะตะถะฝะพ ะพะฑะฒะพะปะฐะบะธะฒะฐะป ะผะพั‘ ะทะฐะธะฝะดะตะฒะตะฒัˆะตะต ัะตั€ะดั†ะต, ัะปะพะฒะฝะพ ั‚ั‘ะฟะปะพะต ะพะดะตัะปะพ, ะธ ะทะฐัั‚ะฐะฒะปัะป ะตะณะพ ะพั‚ั‚ะฐะธะฒะฐั‚ัŒ. ะ ะธั… ะฟะพะดะฝัะป ั ะฟะพะปะฐ ะฝะพะถ, ะฟะพะปะพะถะธะป ะตะณะพ ะฝะฐ ัั‚ะพะป, ะทะฐะฑั€ะฐะป ัƒ ะผะตะฝั ะดะพัะบัƒ ั ะพะฒะพั‰ะฐะผะธ ะธ ะฝะฐะฟั€ะฐะฒะธะปัั ะพะฑั€ะฐั‚ะฝะพ ะบ ะฟะปะธั‚ะต. 
ะก ะฝะตะบะพั‚ะพั€ั‹ั… ะฟะพั€ ัƒ ะ ะธั…ะฐั€ะดะฐ ะฟะพัะฒะธะปะฐััŒ ะฟั€ะธะฒั‹ั‡ะบะฐ ั…ะพะดะธั‚ัŒ ะฟะพ ะดะพะผัƒ ั ะพะฑะฝะฐะถั‘ะฝะฝั‹ะผ ั‚ะพั€ัะพะผ, ะธ ัะตะนั‡ะฐั, ะบะพะณะดะฐ ะพะฝ ะดะตั„ะธะปะธั€ะพะฒะฐะป ะฟะพ ะบัƒั…ะฝะต ะฒ ั„ะฐั€ั‚ัƒะบะต ะฟะพะฒะตั€ั… ั‡ัƒั‚ัŒ ัะผัƒะณะปะพะณะพ ะณะพะปะพะณะพ ั‚ะตะปะฐ, ะผะฝะต ัั‚ะพะธะปะพ ะฑะพะปัŒัˆะธั… ัƒัะธะปะธะน ะฝะต ะฟัะปะธั‚ัŒัั ะฝะฐ ะฝะตะณะพ ะธ ะฝะต ะทะฐั…ะปั‘ะฑั‹ะฒะฐั‚ัŒัั ัะปัŽะฝัะผะธ. ะŸะพัั‚ะพะผัƒ, ะฑั€ะพัะธะฒ ะบะพั€ะพั‚ะบะธะน ะฒะทะณะปัะด ะฝะฐ ะตะณะพ ัะฟะธะฝัƒ, ั ะฟะพัั‚ะฐั€ะฐะปัั ะบะฐะบ ะผะพะถะฝะพ ะฑั‹ัั‚ั€ะตะต ะพั‚ะฒะตั€ะฝัƒั‚ัŒัั. ะก ะฝะตะดะฐะฒะฝะธั… ะฟะพั€ ะผะฝะต ัั‚ะฐะปะพ ะบะฐะทะฐั‚ัŒัั, ั‡ั‚ะพ ะŸะฐัƒะปัŒ ะทะฐ ะผะฝะพะน ัะปะตะดะธั‚. ะฏ ะฝะต ั€ะฐะท ะทะฐะผะตั‡ะฐะป ะฝะฐ ัะตะฑะต ะตะณะพ ะฟั€ะธั‰ัƒั€ะตะฝะฝั‹ะน ะฒะทะณะปัะด. ะœะพะถะตั‚, ั€ะตะฒะฝัƒะตั‚? ะัƒ ะธ ะฟัƒัะบะฐะน, ะฟัƒัั‚ัŒ ั…ะพั‚ัŒ ะฝะตะผะฝะพะณะพ ะฟะพะผัƒั‡ะฐะตั‚ัั. - ะกะฟะฐัะธะฑะพ, ะขะธะปะปัŒ, - ะฝะฐัั‚ะพะนั‡ะธะฒะพ ะฟั€ะพะธะทะฝั‘ั ะŸะฐัƒะปัŒ ะธ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะดะฒะตั€ัŒ. ะฃะถ ะฝะต ะทะฝะฐัŽ, ะฑั‹ะป ะปะธ ัั‚ะพั‚ ะถะตัั‚ ัะปัƒั‡ะฐะนะฝั‹ะผ ะธะปะธ ะถะต ะฟั€ะตะดะฝะฐะผะตั€ะตะฝะฝั‹ะผ, ะฝะพ ั ะธ ั‚ะฐะบ ะฟะพะฝัะป ะฝะฐะผั‘ะบ. ะฃะณั€ัŽะผะพ ะบะธะฒะฝัƒะฒ, ั ะฒั‹ัˆะตะป, ะฟั€ะธะบั€ั‹ะฒ ะทะฐ ัะพะฑะพะน ะดะฒะตั€ัŒ. ะกั‚ะพะธะปะพ ะผะฝะต ะพั‚ะพะนั‚ะธ ะฝะฐ ะฟะฐั€ัƒ ัˆะฐะณะพะฒ, ะบะฐะบ ั ัƒัะปั‹ัˆะฐะป, ั‡ั‚ะพ ัะพ ัั‚ะพะปะฐ ัƒะฟะฐะปะธ ะบะฐัั‚ั€ัŽะปะธ (ะธะปะธ ะถะต ะฑั‹ะปะธ ัะฟะตั†ะธะฐะปัŒะฝะพ ะพั‚ั‚ัƒะดะฐ ัะบะธะฝัƒั‚ั‹ ั‡ัŒะธะผะธ-ั‚ะพ ะฝะตั‚ะตั€ะฟะตะปะธะฒั‹ะผะธ ั€ัƒะบะฐะผะธ). ะฏ ัั‚ะธัะฝัƒะป ะทัƒะฑั‹ ะธ ะพั‚ะพัˆั‘ะป ะฟะพะดะฐะปัŒัˆะต ะพั‚ ะบัƒั…ะฝะธ. ะจะฝะฐะนะดะตั€ัƒ ะบะพะต-ะบะฐะบ ัƒะดะฐะปะพััŒ ัƒะณะพะฒะพั€ะธั‚ัŒ ะผะตะฝั ัะฟัƒัั‚ะธั‚ัŒัั ะบ ัƒะถะธะฝัƒ. 
ะฏ ั€ะตะทะบะพ ะทะฐั…ะปะพะฟะฝัƒะป ะฑะปะพะบะฝะพั‚, ะฒ ะบะพั‚ะพั€ะพะผ ัƒะถะต ะฟะพั‡ั‚ะธ ะฝะต ะพัั‚ะฐะปะพััŒ ั‡ะธัั‚ั‹ั… ัั‚ั€ะฐะฝะธั†, ะฝะตะพั…ะพั‚ะฝะพ ะฟะพะดะฝัะปัั ัะพ ัั‚ัƒะปะฐ ะธ ัะฟัƒัั‚ะธะปัั ะฒ ัั‚ะพะปะพะฒัƒัŽ, ะผั€ะฐั‡ะฝะพ ะดัƒะผะฐั, ั‡ั‚ะพ ะฒะฟะตั€ะตะดะธ ะตั‰ั‘ ะฝะตะดะตะปั ั‚ะฐะบะพะณะพ ะบะพัˆะผะฐั€ะฐ. ะŸะพั‡ะตะผัƒ ะฝะฐะดะพ ะฑั‹ะปะพ ั‚ะฐั‰ะธั‚ัŒ ะผะตะฝั ะฒ ัั‚ะพั‚ ั‡ั‘ั€ั‚ะพะฒ ะทะฐะณะพั€ะพะดะฝั‹ะน ะดะพะผ, ะณะดะต ะฒะพะบั€ัƒะณ ะพะดะธะฝ ะปะตั? ะ˜ ะฝะฐ ะณะปะฐะทะฐั… ะฒะตั‡ะฝะพ ะ ะธั…ะฐั€ะด ั ะŸะฐัƒะปะตะผ... ะฏ ะฑั‹ะป ัƒะฒะตั€ะตะฝ, ั‡ั‚ะพ ะฒะธะด ัั‚ะพะน ะฟะฐั€ะพั‡ะบะธ ะทะฐ ัั‚ะพะปะพะผ ัั€ะฐะทัƒ ะพั‚ะพะฑัŒั‘ั‚ ัƒ ะผะตะฝั ะฐะฟะฟะตั‚ะธั‚. ะะฐะฟั€ะธะผะตั€, ะบะพะณะดะฐ ะŸะฐัƒะปัŒ ัะฝะพะฒะฐ ะบะฐะบ ะฑั‹ ะฝะตะทะฐะผะตั‚ะฝะพ ะฟะพะปะพะถะธั‚ ะ ะธั…ะฐั€ะดัƒ ั€ัƒะบัƒ ะฝะฐ ะบะพะปะตะฝะพ, ะพั‚ั‡ะตะณะพ ัƒ ะ ะธั…ะฐ ะฟะพะบั€ะฐัะฝะตัŽั‚ ัƒัˆะธ, ะพะฝ ะฝะฐั‡ะฝั‘ั‚ ะฝะตั€ะฒะฝะพ ะฟะพัะผะตะธะฒะฐั‚ัŒัั, ะทะฐั‚ะตะผ ะบะฐะบ ะฑั‹ ัะปัƒั‡ะฐะนะฝะพ ัƒั€ะพะฝะธั‚ ะฒะธะปะบัƒ ะฝะฐ ะฟะพะป, ะฝะฐะณะฝั‘ั‚ัั, ั‡ั‚ะพะฑั‹ ะฝะตะปะพะฒะบะพ ะบะปัŽะฝัƒั‚ัŒ ัะฒะพะตะณะพ ะปัŽะฑะพะฒะฝะธะบะฐ ะฒ ั€ัƒะบัƒ, ะฟะพัะปะต ั‡ะตะณะพ ัƒ ะŸะฐัƒะปั ะฝะฐ ะปะธั†ะต ะพะฑัะทะฐั‚ะตะปัŒะฝะพ ะฟะพัะฒะธั‚ัั ะดะพะฒะพะปัŒะฝะฐั ัƒะปั‹ะฑะบะฐ ัั‹ั‚ะพะณะพ ะบะพั‚ะฐ. ะะตะฝะฐะฒะธะถัƒ. ะ’ัั ะณั€ัƒะฟะฟะฐ ัƒะถะต ัะธะดะตะปะฐ ะทะฐ ัั‚ะพะปะพะผ, ะพั€ัƒะดัƒั ัั‚ะพะปะพะฒั‹ะผะธ ะฟั€ะธะฑะพั€ะฐะผะธ ะฝะฐะด ะฟั€ะธะณะพั‚ะพะฒะปะตะฝะฝั‹ะผ ะฝะฐะผะธ ัƒะถะธะฝะพะผ. ะœะพั ะฟะพั€ั†ะธั ะพะดะธะฝะพะบะพ ัั‚ะพัะปะฐ ะฒ ัั‚ะพั€ะพะฝะต - ะทะฝะฐั‡ะธั‚, ะพะฝะธ ะฝะต ะฑั‹ะปะธ ัƒะฒะตั€ะตะฝั‹ ะฒ ั‚ะพะผ, ั‡ั‚ะพ ั ะฟั€ะธะดัƒ. ะฏ ะผะพะปั‡ะฐ ะฒะทัะป ะตั‘ ะธ ัะตะป ะทะฐ ัั‚ะพะป, ะฝะธะบะฐะบ ะฝะต ะพั‚ั€ะตะฐะณะธั€ะพะฒะฐะฒ ะฝะฐ ะฟะพะถะตะปะฐะฝะธั ะฟั€ะธัั‚ะฝะพะณะพ ะฐะฟะฟะตั‚ะธั‚ะฐ. ะšะฐะถะตั‚ัั, ะพะฝะธ ะดัƒะผะฐัŽั‚, ั‡ั‚ะพ ัƒ ะผะตะฝั ั‚ะฒะพั€ั‡ะตัะบะธะน ะบั€ะธะทะธั. ะญั‚ะพ ะฑั‹ะปะพ ะฑั‹ ะผะฝะต ะฝะฐ ั€ัƒะบัƒ - ะผะตะฝั ะฟะพะฑะฐะธะฒะฐะปะธััŒ ั‚ั€ะพะณะฐั‚ัŒ, ะฟั€ะตะดะพัั‚ะฐะฒะปัั ะผะตะฝั ัะฐะผะพะผัƒ ัะตะฑะต. 
ะ ะฒะพั‚ ั€ัƒะบะฐ ะŸะฐัƒะปั ะผะตะดะปะตะฝะฝะพ ะฟะพั‚ัะฝัƒะปะฐััŒ ะฒะฟั€ะฐะฒะพ, ะบ ะ ะธั…ะธะฝะพะผัƒ ะบะพะปะตะฝัƒ. ะ’ะพ ั€ั‚ัƒ ะฒัั‘ ะพะบะธัะปะธะปะพััŒ, ั ั ั‚ั€ัƒะดะพะผ ัะดะตั€ะถะฐะป ั€ะฒะพั‚ะฝั‹ะน ะฟะพะทั‹ะฒ. ะะต ัั‚ะพะธั‚ ะทะฐะฒะพะดะธั‚ัŒัั... ะะต ัั‚ะพะธั‚ ะทะฐะฒะพะดะธั‚ัŒัั... ะะต ัั‚ะพะธั‚ ะทะฐะฒะพะดะธั‚ัŒัั! ะฏ ะพั‚ะบะธะฝัƒะป ะฒะธะปะบัƒ ะฒ ัั‚ะพั€ะพะฝัƒ, ะทะฐ ัั‚ะพะปะพะผ ะฒะพั†ะฐั€ะธะปะฐััŒ ะณั€ะพะฑะพะฒะฐั ั‚ะธัˆะธะฝะฐ. ะฏ ะทะฝะฐะป, ั‡ั‚ะพ ะตั‰ั‘ ะฝะตัะบะพะปัŒะบะพ ะผะธะฝัƒั‚ ะฝะธะบั‚ะพ ะฝะธั‡ะตะณะพ ะฝะต ัะบะฐะถะตั‚: ะฒัะต ะฑัƒะดัƒั‚ ะดะพะถะธะดะฐั‚ัŒัั, ะฟะพะบะฐ ั ัƒะนะดัƒ, ั‡ั‚ะพะฑั‹ ะฝะต ะฟะพะฟะฐัั‚ัŒัั ะฟะพะด ะณะพั€ัั‡ัƒัŽ ั€ัƒะบัƒ. ะ˜ ั ัƒัˆั‘ะป. ะ—ะฐั‡ะตะผ ะฟะพั€ั‚ะธั‚ัŒ ะปัŽะดัะผ ะฝะฐัั‚ั€ะพะตะฝะธะต. ะฃ ะฝะธั… ะฒะตะดัŒ ะฐั‚ะผะพัั„ะตั€ะฐ ะฟั€ะฐะทะดะฝะธะบะฐ. ะ—ะฐะฒั‚ั€ะฐ ะถะต ะณั€ั‘ะฑะฐะฝะพะต ะ ะพะถะดะตัั‚ะฒะพ. ะžะปะธะฒะตั€ ะทะฐัˆั‘ะป ะบะพ ะผะฝะต, ะบะพะณะดะฐ ะฑั‹ะปะพ ัƒะถะต ะทะฐ ะฟะพะปะฝะพั‡ัŒ. ะฏ ัะธะดะตะป ะทะฐ ัั‚ะพะปะพะผ ะธ ัƒัะตั€ะดะฝะพ ัั‚ั€ะพั‡ะธะป ะฒ ะฑะปะพะบะฝะพั‚ะต. ะ ะธั„ะผะฐ ั…ะปะตัั‚ะฐะปะฐ ะธะท ะผะตะฝั ะฑะตะท ะพัั‚ะฐะฝะพะฒะบะธ, ะผะตะฝั ัะปะพะฒะฝะพ ั‚ะพัˆะฝะธะปะพ ัั‚ั€ะพั‡ะบะฐะผะธ ัั‚ะธั…ะพั‚ะฒะพั€ะตะฝะธั; ะผะฝะต ะฑั‹ะปะพ ะฟะปะพั…ะพ, ะฝะพ ะฒัั‘, ั‡ะตะผ ั ะผะพะณ ัะตะฑะต ะฟะพะผะพั‡ัŒ - ัั‚ะพ ะฟะธัะฐั‚ัŒ ะธ ะฟะธัะฐั‚ัŒ, ะฒั‹ะฟะปั‘ัะบะธะฒะฐั ะธะท ัะตะฑั ะฒัั‘ ั‚ะพ, ั‡ั‚ะพ ะฝะฐะบะพะฟะธะปะพััŒ. ะ’ ะบะพะผะฝะฐั‚ะต ะฑั‹ะปะพ ั‚ะตะผะฝะพ: ั ะฟะธัะฐะป, ัะธะดั ัƒ ะพะบะฝะฐ, ะธ ะผะฝะต ะฒะฟะพะปะฝะต ั…ะฒะฐั‚ะฐะปะพ ัะฒะตั‚ะฐ ั„ะพะฝะฐั€ั ะฝะฐ ัƒะปะธั†ะต. - ะฏ ะผะพะณัƒ ั‡ะตะผ-ะฝะธะฑัƒะดัŒ ั‚ะตะฑะต ะฟะพะผะพั‡ัŒ? ะฏ ะตะดะฒะฐ ัะดะตั€ะถะฐะปัั, ั‡ั‚ะพะฑั‹ ะฝะตั€ะฒะฝะพ ะฝะต ะทะฐัะผะตัั‚ัŒัั: ะบะพะฝะตั‡ะฝะพ, ะผะพะถะตัˆัŒ, ะžะปะปะธ! ะ—ะฐะดัƒัˆะธ ะŸะฐัƒะปั. ะะตั‚, ะปัƒั‡ัˆะต ะทะฐัั‚ั€ะตะปะธ. ะ˜ะปะธ ัƒั‚ะพะฟะธ. ะกะดะตะปะฐะน ั…ะพั‚ัŒ ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ, ั‡ั‚ะพะฑั‹ ัั‚ะพ ะผะฐะปะตะฝัŒะบะพะต ั‡ัƒะดะพะฒะธั‰ะต ะฝะต ะฟะพัะฒะปัะปะพััŒ ั€ัะดะพะผ ั ะ ะธั…ะฐั€ะดะพะผ! 
- ะะตั‚, ัะฟะฐัะธะฑะพ, ัƒ ะผะตะฝั ะฒัั‘ ะฒ ะฟะพั€ัะดะบะต. ะžะปะธะฒะตั€ ะฝะตะดะพะฒะตั€ั‡ะธะฒะพ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะผะตะฝั, ะฝะพ ะฝะต ัั‚ะฐะป ะฝะธั‡ะตะณะพ ะณะพะฒะพั€ะธั‚ัŒ, ะทะฐ ั‡ั‚ะพ ั ะฑั‹ะป ะพั‚ ะดัƒัˆะธ ะตะผัƒ ะฑะปะฐะณะพะดะฐั€ะตะฝ. ะฏ ะฟะพั€ะพะน ะทะฐะฒะธะดะพะฒะฐะป ะตะณะพ ัะฟะพะบะพะนัั‚ะฒะธัŽ, ั‚ะตั€ะฟะตะฝะธัŽ ะธ, ะฟะพะถะฐะปัƒะน, ะผัƒะดั€ะพัั‚ะธ. ะฃ ะฝะฐั ั ะฝะธะผ ะทะฝะฐั‡ะธั‚ะตะปัŒะฝะฐั ั€ะฐะทะฝะธั†ะฐ ะฒ ะฒะพะทั€ะฐัั‚ะต, ะฝะพ ะฟะพั€ะพะน ะผะฝะต ะบะฐะถะตั‚ัั, ั‡ั‚ะพ ะพะฝ ะทะฝะฐั‡ะธั‚ะตะปัŒะฝะพ ะพะฟั‹ั‚ะฝะตะต ะผะตะฝั. ะะฐะฒะตั€ะฝะพะต, ะฒัั‘ ะดะตะปะพ ะฒ ะตะณะพ ัะฟะพะบะพะนะฝะพะผ ะธ ะฝะตะผะฝะพะณะพ ะทะฐะณะฐะดะพั‡ะฝะพะผ ะฒะทะณะปัะดะต. ะ“ะพะฒะพั€ัั‚, ะณะปะฐะทะฐ - ะทะตั€ะบะฐะปะพ ะดัƒัˆะธ. ะฏ ะฑั‹ ั ะฑะพะปัŒัˆะธะผ ะธะฝั‚ะตั€ะตัะพะผ ะทะฐะณะปัะฝัƒะป ะฒ ะดัƒัˆัƒ ะบ ะžะปะธะฒะตั€ัƒ. ะŸะพะถะฐะปัƒะน, ัั‚ะพ ะพะดะธะฝ ะธะท ะฝะตะผะฝะพะณะธั… ะปัŽะดะตะน, ั‡ัŒั ะดัƒัˆะฐ ะผะตะฝั ะฒ ะฟั€ะธะฝั†ะธะฟะต ะธะฝั‚ะตั€ะตััƒะตั‚. ะœั‹ ะผะพะปั‡ะฐะปะธ. ะ—ะฐ ะพะบะฝะพะผ ะฒ ะฟั€ะธั‡ัƒะดะปะธะฒะพะผ ะฒะฐะปัŒัะต ะบั€ัƒะถะธะปะธััŒ ัะฝะตะถะธะฝะบะธ, ะฐ ะฒะตั‚ะตั€, ัะปะพะฒะฝะพ ัั‚ั€ะพะณะธะน ะฑะฐะปะตั‚ะผะตะนัั‚ะตั€, ััƒั€ะพะฒะพ ะฒะพั€ั‡ะฐะป ะฝะฐ ะฝะธั…. ะขะธะบะฐะปะธ ะฝะฐัั‚ะตะฝะฝั‹ะต ั‡ะฐัั‹, ั‚ะธั…ะพ ะธ ะณะปัƒั…ะพ, ะฑัƒะดั‚ะพ ะฟะพะดัั‚ั€ะฐะธะฒะฐะปะธััŒ ะฟะพะด ัั‚ัƒะบ ะผะพะตะณะพ ัะตั€ะดั†ะฐ. ะžะปะธะฒะตั€ ะตั‰ั‘ ัั‚ะพัะป ะทะฐ ะผะพะตะน ัะฟะธะฝะพะน. ะœะฝะต ะบะฐะทะฐะปะพััŒ, ะพะฝ ั…ะพั‡ะตั‚ ะตั‰ั‘ ั‡ั‚ะพ-ั‚ะพ ะผะฝะต ัะบะฐะทะฐั‚ัŒ, ะฝะพ ั ะฝะต ะฟะพะฝะธะผะฐะป, ะฟะพั‡ะตะผัƒ ะพะฝ ะผะตะดะปะธั‚, ะฒั€ะพะดะต ะฑั‹ ะฒ ะตะณะพ ะฒะทะณะปัะดะต ะฝะต ะฑั‹ะปะพ ะฝะตั€ะตัˆะธั‚ะตะปัŒะฝะพัั‚ะธ. ะ’ ะบะพะฝั†ะต ะบะพะฝั†ะพะฒ, ะพะฝ ั€ะฐะทะฒะตั€ะฝัƒะปัั ะธ ะฝะฐะฟั€ะฐะฒะธะปัั ะบ ะฒั‹ั…ะพะดัƒ, ะพั‡ะตะฒะธะดะฝะพ, ะฟะตั€ะตะดัƒะผะฐะฒ ะธ ั€ะตัˆะธะฒ ะพัั‚ะฐะฒะธั‚ัŒ ัั‚ะพั‚ ั€ะฐะทะณะพะฒะพั€ ะฝะฐ ะฟะพั‚ะพะผ. ะžะฝ ะปะธัˆัŒ ัะฟั€ะพัะธะป ะฝะฐะฟะพัะปะตะดะพะบ: - ะขะตะฑะต ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ ะฟั€ะธะฒะตะทั‚ะธ ะธะท ะณะพั€ะพะดะฐ? 
ะœั‹ ัะพ ะจะฝะฐะนะดะตั€ะพะผ ะทะฐะฑั‹ะปะธ ะปั‹ะถะธ, ะฐ ะฝะฐะผ ัƒะถ ะฑะพะปัŒะฝะพ ั…ะพั‡ะตั‚ัั ะฟะพะบะฐั‚ะฐั‚ัŒัั, ะฟะพะบะฐ ัะฝะตะณ ะฝะต ะฝะฐั‡ะฐะป ะฟั€ะตะฒั€ะฐั‰ะฐั‚ัŒัั ะฒ ะณั€ัะทัŒ. - ะ—ะฐั…ะฒะฐั‚ะธั‚ะต ะผะฝะต ั‡ะธัั‚ั‹ะน ะฑะปะพะบะฝะพั‚. ะ˜ ะžะปะธะฒะตั€ ัƒัˆั‘ะป, ัะฝะพะฒะฐ ะพัั‚ะฐะฒะธะฒ ะผะตะฝั ะฝะฐะตะดะธะฝะต ั ะฑะปะพะบะฝะพั‚ะพะผ. ะงะตั€ะบะฝัƒะฒ ะตั‰ั‘ ะฟะฐั€ัƒ ัั‚ั€ะพะบ, ั ะฟะพะฝัะป, ั‡ั‚ะพ ั„ะพะฝั‚ะฐะฝ ะธะดะตะน ะธัััะบ. ะฏ ะทะฐะดัƒะผั‡ะธะฒะพ ะฟั€ะพะปะธัั‚ะฐะป ะธัะฟะธัะฐะฝะฝั‹ะต ะฝะตะฐะบะบัƒั€ะฐั‚ะฝั‹ะผ ะฟะพั‡ะตั€ะบะพะผ ัั‚ั€ะฐะฝะธั†ั‹. ะžัั‚ะฐะปะฐััŒ ะปะธัˆัŒ ะพะดะฝะฐ ั‡ะธัั‚ะฐั. ะขะพะปัŒะบะพ ัะตะนั‡ะฐั ั ะฟะพะฝัะป, ั‡ั‚ะพ ะฒัะต ัั‚ะธั…ะธ, ะฝะฐะฟะธัะฐะฝะฝั‹ะต ะทะดะตััŒ, ะฟะพัะฒัั‰ะตะฝั‹ ะ ะธั…ะฐั€ะดัƒ. ะฏ ั€ะตัˆะธะป ะปะตั‡ัŒ ัะฟะฐั‚ัŒ ะฝะต ัั‚ะพะปัŒะบะพ ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ัƒัั‚ะฐะป, ัะบะพะปัŒะบะพ ะธะท ะถะตะปะฐะฝะธั ัƒะฑะธั‚ัŒ ะฒั€ะตะผั. ะ’ั€ะตะผั... ะšะฐะบ ะถะต ั ะตะณะพ ะฝะตะฝะฐะฒะธะถัƒ. ะ’ะตั€ะพัั‚ะฝะพ, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ัƒะฟัƒัั‚ะธะป ะตะณะพ. ะŸะพะบะฐ ั ะฟะธัะฐะป ัะฒะพะธ ัั‚ะธั…ะธ, ะฒั€ะตะผั ะบะฐะฟะฐะปะพ, ัะปะพะฒะฝะพ ะฒะพะดะฐ ะธะท ะฟะปะพั…ะพ ะทะฐะบั€ั‹ั‚ะพะณะพ ะบั€ะฐะฝะฐ. ะ˜ ั ะพะฟะพะทะดะฐะป. ะขะตะฟะตั€ัŒ ะ ะธั…ะฐั€ะด ั€ะฐะทะดะตะปัะตั‚ ัะฒะพัŽ ัˆะธั€ะพะบัƒัŽ ะบั€ะพะฒะฐั‚ัŒ ะฝะต ัะพ ะผะฝะพะน, ะฐ ั ั‚ะตะผ, ะบั‚ะพ ะพะบะฐะทะฐะปัั ะฟั€ะพะฒะพั€ะฝะตะต, ะฑั‹ัั‚ั€ะตะต, ัะพะพะฑั€ะฐะทะธั‚ะตะปัŒะฝะตะต. ะ ั ั‚ะฐะบ ะธ ะพัั‚ะฐะปัั ะปะตะถะฐั‚ัŒ ะฟะพะด ั…ะพะปะพะดะฝั‹ะผ ะพะดะตัะปะพะผ ะพะดะธะฝ, ะฝะฐะตะดะธะฝะต ัะพ ัะฒะพะธะผะธ ัั‚ะธั…ะฐะผะธ, ะบะพั‚ะพั€ั‹ะต ั ะฝะธะบะพะผัƒ ะธ ะฝะธะบะพะณะดะฐ ะฝะต ะฟะพะบะฐะถัƒ. ะ˜ ั ะฟะพัั‚ะตะฟะตะฝะฝะพ ัั‚ะฐะฝะพะฒะปัŽััŒ ะบะพะปัŽั‡ะธะผ, ะธ, ะบะฐะถะตั‚ัั, ะพะฑั€ะฐัั‚ะฐัŽ ะบะพั€ะบะพะน ะปัŒะดะฐ. ะกะพะฝ ะฝะต ะฟั€ะธั…ะพะดะธะป. ะฏ ัƒะถะต ะฟะพั‚ะตั€ัะป ัั‡ั‘ั‚ ั‚ะพะผัƒ ะฒั€ะตะผะตะฝะธ, ะฒ ั‚ะตั‡ะตะฝะธะต ะบะพั‚ะพั€ะพะณะพ ั ะปะตะถะฐะป ะฝะฐ ัะฟะธะฝะต, ะณะปัะดั ะฝะฐ ะฑะตะปะพัะฝะตะถะฝั‹ะน ะฟะพั‚ะพะปะพะบ, ะฟะพ ะบะพั‚ะพั€ะพะผัƒ ัะบะพะปัŒะทะธะปะธ ั‚ะตะฝะธ ัะฝะตะถะธะฝะพะบ. 
ะขะธัˆะธะฝะฐ ะณัƒะดะตะปะฐ ะฒ ัƒัˆะฐั…. ะฅะพั‚ั ั ะดะพะฒะพะปัŒัั‚ะฒะพะฒะฐะปัั ะตะน ะฝะต ั‚ะฐะบ ัƒะถ ะธ ะดะพะปะณะพ. ะกะฟัƒัั‚ั ะบะฐะบะพะต-ั‚ะพ ะฒั€ะตะผั, ะบ ะผะพะตะผัƒ ะฝะตะณะพะดะพะฒะฐะฝะธัŽ, ั ัƒัะปั‹ัˆะฐะป ัะบั€ะธะฟ ะบั€ะพะฒะฐั‚ะธ ะฒ ัะพัะตะดะฝะตะน ะบะพะผะฝะฐั‚ะต. ะฏ ะดะพ ะฟะพัะปะตะดะฝะตะณะพ ะฝะฐะดะตัะปัั, ั‡ั‚ะพ ะผะฝะต ัั‚ะพ ะฟะพะบะฐะทะฐะปะพััŒ. ะะตั‚. ะะต ะฟะพะบะฐะทะฐะปะพััŒ. ะกะบั€ะธะฟั‹ ัั‚ะฐะปะธ ัะธะปัŒะฝะตะต, ะดะตั€ะตะฒัะฝะฝั‹ะต ะฑะฐัˆะตะฝะบะธ ัƒ ะธะทะณะพะปะพะฒัŒั ะบั€ะพะฒะฐั‚ะธ ะฝะฐั‡ะฐะปะธ ัั‚ัƒั‡ะฐั‚ัŒัั ะพ ัั‚ะตะฝัƒ, ะธ ั ัƒัะปั‹ัˆะฐะป ะตะณะพ ะณะพะปะพั. ะ“ั€ะพะผะบะธะน ะฟั€ะพั‚ัะถะฝั‹ะน ัั‚ะพะฝ, ะทะฐั‚ะตะผ ัะตั€ะธั ั‡ัƒั‚ัŒ ะฑะพะปะตะต ั‡ะฐัั‚ั‹ั…. ะœะฝะต ะบะฐะทะฐะปะพััŒ, ั ัะปั‹ัˆะฐะป ะดะฐะถะต ะตะณะพ ะดั‹ั…ะฐะฝะธะต. ะœะตะฝั ัะปะพะฒะฝะพ ัƒะดะฐั€ะธะปะธ ะฒ ะณั€ัƒะดัŒ, ะฒั‹ะฑะธะฒ ะฒะตััŒ ะฒะพะทะดัƒั… ะธะท ะปั‘ะณะบะธั…. ะ‘ั‹ะปะพ ะฑะพะปัŒะฝะพ ะธ ะพะฑะธะดะฝะพ. ะฏ ะฟะพะถะฐะปะตะป, ั‡ั‚ะพ ะฟะพัะตะปะธะปัั ั€ัะดะพะผ ั ะ ะธั…ะฐั€ะดะพะผ. ะฏ ะฝะฐะบั€ั‹ะป ะณะพะปะพะฒัƒ ะฟะพะดัƒัˆะบะพะน, ัั‚ะฐั€ะฐัััŒ ะฝะต ัะปั‹ัˆะฐั‚ัŒ ัั‚ะพะณะพ ะบะพัˆะผะฐั€ะฐ, ะฝะพ ั‚ั‰ะตั‚ะฝะพ; ะฒัะบะพั‡ะธะป ั ะบั€ะพะฒะฐั‚ะธ, ะฝะฐั‡ะฐะป ั…ะพะดะธั‚ัŒ ะฟะพ ะบะพะผะฝะฐั‚ะต, ะฟะพะดะฐะฒะปัั ะถะตะปะฐะฝะธะต ะฑั€ะพัะธั‚ัŒัั ะฒ ัะพัะตะดะฝัŽัŽ ะธ ั€ะฐัะบะธะดะฐั‚ัŒ ะปัŽะฑะพะฒะฝะธะบะพะฒ ะบะฐะบ ะบะพั‚ัั‚. ะฅะพั‚ะตะปะพััŒ ะพะฑะปะธั‡ะธั‚ัŒ, ะฟั€ะธัั‚ั‹ะดะธั‚ัŒ ะธั…. ะ’ะฟั€ะพั‡ะตะผ, ะฟะตั€ะตะด ะบะตะผ? ะ’ัะต ะธ ั‚ะฐะบ ะดะฐะฒะฝะพ ะพะฑะพ ะฒัั‘ะผ ะทะฝะฐะปะธ ะธ, ะบะฐะบ ะฒะทั€ะพัะปั‹ะต, ะฐะดะตะบะฒะฐั‚ะฝั‹ะต ะปัŽะดะธ, ะทะฐะบั€ั‹ะฒะฐะปะธ ะฝะฐ ัั‚ะพ ะณะปะฐะทะฐ. ะขะฐะบ ั‡ั‚ะพ ั ะฟั€ะพัั‚ะพ ะฒั‹ัั‚ะฐะฒะปัŽ ัะตะฑั ะดัƒั€ะฐะบะพะผ. ะšะฐะบะพั„ะพะฝะธั ะธะท ัั‚ะพะฝะพะฒ ะธ ัะบั€ะธะฟะพะฒ ัั‚ะฐะฝะพะฒะธะปะฐััŒ ะฒัั‘ ะณั€ะพะผั‡ะต. ะฏ ะพะฑะตััะธะปะตะฝะฝะพ ัƒะฟะฐะป ะฝะฐ ะบั€ะพะฒะฐั‚ัŒ, ะผะพะปั ะ‘ะพะณะฐ, ั‡ั‚ะพะฑั‹ ัะบะพั€ะตะต ะฒัั‘ ะทะฐะบะพะฝั‡ะธะปะพััŒ, ะธ ะทะฐะบั€ั‹ะป ะณะปะฐะทะฐ. ะกะปั‹ัˆะฝั‹ ะฑั‹ะปะธ ัั‚ะพะฝั‹ ั‚ะพะปัŒะบะพ ะ ะธั…ะฐั€ะดะฐ. 
ะฏ ะฟั€ะตะดัั‚ะฐะฒะธะป, ั‡ั‚ะพ ะพะฝ ัะตะนั‡ะฐั ะทะดะตััŒ, ัะพ ะผะฝะพะน, ั‡ั‚ะพ ัั‚ะพ ั ัะถะธะผะฐัŽ ะฒ ั€ัƒะบะฐั… ะตะณะพ ัะณะพะดะธั†ั‹, ะพัั‚ะฐะฒะปัั ะฝะฐ ะฝะธั… ะธััะธะฝั-ะบั€ะฐัะฝั‹ะต ะฟะพะปัƒะผะตััั†ั‹, ั‡ั‚ะพ ัั‚ะพ ั ะฝะตะถะฝะพ ะฒั…ะพะถัƒ ะฒ ะฝะตะณะพ, ั‡ั‚ะพ ะพะฝ ะฟะพะดะพ ะผะฝะพะน ะฒะทะดั€ะฐะณะธะฒะฐะตั‚ ะพั‚ ะฝะฐัะปะฐะถะดะตะฝะธั... ะ’ ะฟะฐั…ัƒ ะผะตะดะปะตะฝะฝะพ ัะพะทั€ะตะฒะฐะป ะพะณะฝะตะฝะฝั‹ะน ัˆะฐั€. ะžะฝ ะพะฑะถะธะณะฐะป ะฒัั‘ ะฒะฝะธะทัƒ ะถะธะฒะพั‚ะฐ, ะฟะตั€ะธะพะดะธั‡ะตัะบะธ ะฟะพัั‹ะปะฐั ะฝะฐะธะฑะพะปะตะต ะถะฐั€ะบะธะต ะธะผะฟัƒะปัŒัั‹. ะฏ ะฟั€ะธัะฟัƒัั‚ะธะป ั€ะตะทะธะฝะบัƒ ะฟะธะถะฐะผะฝั‹ั… ัˆั‚ะฐะฝะพะฒ, ะฒัั‘ ะตั‰ั‘ ะฝะต ะพั‚ะบั€ั‹ะฒะฐั ะณะปะฐะทะฐ. ะœะพั‘ ะดั‹ั…ะฐะฝะธะต ะทะฐั…ะพะดะธะปะพััŒ, ัั‚ะพะฝั‹ ะฟะพ ั‚ัƒ ัั‚ะพั€ะพะฝัƒ ัั‚ะตะฝั‹ ัƒั‡ะฐั‰ะฐะปะธััŒ, ะณะพั€ัั‡ะฐั ะฟะปะพั‚ัŒ ะฑั‹ะปะฐ ะณะพั‚ะพะฒะฐ ะปะพะฟะฝัƒั‚ัŒ ะพั‚ ะฝะฐะฟั€ัะถะตะฝะธั. ะ”ะฒะฐ ะณั€ะพะผะบะธั…, ั€ะตะทะบะธั… ะฒั‹ะดะพั…ะฐ. ะ•ะณะพ - ะผะพะน. ะ’ ะผะพั‘ะผ, ะบะฐะถะตั‚ัั, ะฟั€ะพัะบะพะปัŒะทะฝัƒะปะพ ัั…ะพ ะตะณะพ ะธะผะตะฝะธ, ะบะพั‚ะพั€ะพะต ะพะฑะพะถะณะปะพ ะผะฝะต ะณะพั€ะปะพ. ะ’ัั‘ ะฒะพะบั€ัƒะณ ัั‚ะธั…ะปะพ. ะขะธัˆะธะฝะฐ ัะฝะพะฒะฐ ะฟั€ะพะบั€ะฐะปะฐััŒ ะฒ ะผะพะธ ัƒัˆะธ. ะœะฝะต ะฑั‹ะปะพ ะณะฐะดะบะพ. ะะพ ะฒัั‘-ั‚ะฐะบะธ ั ะฑั‹ะป ัั‡ะฐัั‚ะปะธะฒ ะพั‚ ั‚ะพะณะพ, ั‡ั‚ะพ ะฝะฐะผ ั ะ ะธั…ะฐั€ะดะพะผ ะฑั‹ะปะพ ั…ะพั€ะพัˆะพ. ะฏ ะฑั‹ ัะผะพะณ ัƒะดะพะฒะปะตั‚ะฒะพั€ะธั‚ัŒ ะตะณะพ ะฝะต ั‚ะพะปัŒะบะพ ั„ะธะทะธั‡ะตัะบะพะต ะถะตะปะฐะฝะธะต, ั ะฑั‹ ัะผะพะณ ะดะพัั‚ะฐะฒะธั‚ัŒ ัƒะดะพะฒะพะปัŒัั‚ะฒะธะต ะธ ะตะณะพ ะดัƒัˆะต. ะฏ ะฑั‹ ะผะฝะพะณะพะต ัะผะพะณ... ะžั‚ะบะธะฝัƒะฒ ัะพ ะปะฑะฐ ะฒะปะฐะถะฝั‹ะต ะพั‚ ะฟะพั‚ะฐ ะฒะพะปะพัั‹, ั ะฟะพะฒะตั€ะฝัƒะปัั ะฝะฐ ะฑะพะบ ะธ ะดะพัั‚ะฐั‚ะพั‡ะฝะพ ะฑั‹ัั‚ั€ะพ ัƒัะฝัƒะป. ะœะตะฝั ั€ะฐะทะฑัƒะดะธะปะธ ั…ะพะปะพะดะฝั‹ะต ะปัƒั‡ะธ-ะธะณะพะปะพั‡ะบะธ ะฝะตะดั€ัƒะถะตะปัŽะฑะฝะพะณะพ ะทะธะผะฝะตะณะพ ัะพะปะฝั†ะฐ, ะฟั€ะพะฝะธะบะฐะฒัˆะธะต ะฒ ะผะพัŽ ัะฟะฐะปัŒะฝัŽ. 
ะ•ะดะฒะฐ ั ั€ะฐะทะปะตะฟะธะป ะณะปะฐะทะฐ, ะฑัƒะดะธะปัŒะฝะธะบ ะฝะฐ ะฟั€ะธะบั€ะพะฒะฐั‚ะฝะพะน ั‚ัƒะผะฑะพั‡ะบะต ัƒัะปัƒะถะปะธะฒะพ ะฟะพะบะฐะทะฐะป ะผะฝะต ะฒั€ะตะผั: ะฑะตะท ะฟัั‚ะธ ะดะตััั‚ัŒ. ะฏ ะฒะฟะตั€ะฒั‹ะต ะทะฐ ะดะพะปะณะพะต ะฒั€ะตะผั ั‡ัƒะฒัั‚ะฒะพะฒะฐะป ัะตะฑั ั…ะพั€ะพัˆะพ. ะœะฝะต ะบะฐะทะฐะปะพััŒ, ั‡ั‚ะพ ะฝะตัะบะพะปัŒะบะพ ะฟั€ะตะดั‹ะดัƒั‰ะธั… ะฝะตะดะตะปัŒ ั ะบะฐั€ะฐะฑะบะฐะปัั ะฒ ะฒั‹ัะพะบัƒัŽ, ะบั€ัƒั‚ัƒัŽ ะณะพั€ัƒ ะธ, ะฝะฐะบะพะฝะตั† ะดะพัั‚ะธะณะฝัƒะฒ ะฒะตั€ัˆะธะฝั‹, ัะบะฐั‚ะธะปัั ั ะฝะตั‘ ะฒะฝะธะท, ะฑัƒะดั‚ะพ ะฝะฐ ัะฐะฝะบะฐั…, ะฝะฐัะปะฐะถะดะฐัััŒ ัะบะพั€ะพัั‚ัŒัŽ ะธ ัะฒะธัั‚ะพะผ ะฒะตั‚ั€ะฐ ะฒ ัƒัˆะฐั…. ะšะฐะทะฐะปะพััŒ, ั‡ั‚ะพ ั ั‚ะพะปัŒะบะพ ั‡ั‚ะพ ะฟะตั€ะตะดะฐะป ััั‚ะฐั„ะตั‚ะฝัƒัŽ ะฟะฐะปะพั‡ะบัƒ ะธ ะฑั‹ะป ั€ะฐะด, ั‡ั‚ะพ ะฟั€ะตะดั‹ะดัƒั‰ะธะน, ะผะพะน ัั‚ะฐะฟ ะพัั‚ะฐะปัั ะฟะพะทะฐะดะธ. ะŸัƒัะบะฐะน ั ะตะณะพ ะธ ะฟั€ะพะธะณั€ะฐะป. ะฏ ะฝะต ัั‚ะฐะป ะฝะตะถะธั‚ัŒัั ะฒ ะบั€ะพะฒะฐั‚ะธ, ัƒะฟะธะฒะฐัััŒ ัะฒะพะธะผ ั…ะพั€ะพัˆะธะผ ะฝะฐัั‚ั€ะพะตะฝะธะตะผ, ะฐ ัะฟะตัˆะฝะพ ะฒัั‚ะฐะป, ะฝะฐะบะธะฝัƒะป ั…ะฐะปะฐั‚ ะธ ัะฟัƒัั‚ะธะปัั ะฒะฝะธะท, ะฒ ัั‚ะพะปะพะฒัƒัŽ. ะขะฐะผ ะฑั‹ะป ะพะดะธะฝ ะปะธัˆัŒ ะคะปะฐะบะต, ะพะฝ ัะธะดะตะป ะทะฐ ัั‚ะพะปะพะผ, ั‡ะธั‚ะฐั ะบะฝะธะณัƒ ะธ ะฟั€ะธั…ะปั‘ะฑั‹ะฒะฐั ะบะพั„ะต ะผะฐะปะตะฝัŒะบะธะผะธ ะณะปะพั‚ะพั‡ะบะฐะผะธ. - ะ”ะพะฑั€ะพะต ัƒั‚ั€ะพ! ะšะปะฐะฒะธัˆะฝะธะบ ะพั‚ะพั€ะฒะฐะปัั ะพั‚ ะบะฝะธะณะธ ะธ ะดั€ัƒะถะตะปัŽะฑะฝะพ ัƒะปั‹ะฑะฝัƒะปัั. - ะ”ะพะฑั€ะพะต. ะšะฐะบ ัะฟะฐะปะพััŒ? - ะ—ะฐะผะตั‡ะฐั‚ะตะปัŒะฝะพ! - ัˆะธั€ะพะบะพ ัƒะปั‹ะฑะฐัััŒ, ัะบะฐะทะฐะป ั. ะ’ะฟะตั€ะฒั‹ะต ะทะฐ ะฟะพัะปะตะดะฝะตะต ะฒั€ะตะผั ะผะฝะต ะฝะธั‡ะตะณะพ ะฝะต ัะฝะธะปะพััŒ, ั‡ะตะผัƒ ั ะฑั‹ะป ะฝะตัะบะฐะทะฐะฝะฝะพ ั€ะฐะด, ะฒะตะดัŒ ะฒัะต ะผะพะธ ัะฝั‹ ะบั€ัƒั‚ะธะปะธััŒ ะฒะพะบั€ัƒะณ ะ ะธั…ะฐั€ะดะฐ. 
ะ›ะธะฑะพ ัั‚ะพ ะฑั‹ะปะธ ะบะพัˆะผะฐั€ั‹, ะฒ ะบะพั‚ะพั€ั‹ั… ะพะฝ ะฑั‹ะป ั ะŸะฐัƒะปะตะผ, ะปะธะฑะพ ัั‚ะพ ะฑั‹ะปะธ ะฟั€ะตะบั€ะฐัะฝั‹ะต ัะฝั‹, ะฒ ะบะพั‚ะพั€ั‹ั… ะผั‹ ะฑั‹ะปะธ ะฒะผะตัั‚ะต ะธ ะปัŽะฑะธะปะธ ะดั€ัƒะณ ะดั€ัƒะณะฐ, ะธ ะพั‚ ัั‚ะพะณะพ ะผะฝะต ะฑั‹ะปะพ ะฝะตะฒั‹ะฝะพัะธะผะพ ะฑะพะปัŒะฝะพ ะฟั€ะพัั‹ะฟะฐั‚ัŒัั. ะŸะพัั‚ะพ' - source_sentence: 'ะคััˆ ะดัƒะผะฐะป, ั‡ั‚ะพ ะ’ะฐัะธะปะธัะฐ - ะณะปัƒะฟะฐั ั€ั‹ะถะตะฒะพะปะพัะฐั ัˆะฟะธะพะฝะบะฐ, ะบะพั‚ะพั€ะฐั ะฒั‹ะฟะพะปะฝัะตั‚ ะปัŽะฑั‹ะต ะฟั€ะธะบะฐะทั‹ ัะฒะพะตะณะพ ะพั‚ั†ะฐ, ะธะดะตั‚ ะฟะพ ัั‚ะพะฟะฐะผ ะžะณะฝะตะฒะฐ ะดะปั ะดะพัั‚ะธะถะตะฝะธั ะฒั‹ััˆะตะน ั‚ะพั‡ะบะธ ะฒะปะฐัั‚ะธ. ะ”ัƒะผะฐะป, ะพะฝะฐ ะฟั‹ั‚ะฐะตั‚ัั ั€ะฐะทั€ัƒัˆะธั‚ัŒ ะถะธะทะฝะธ ะะธะบะฐ ะธ ัั‚ะฐั€ัˆะตะณะพ ะ›ะฐะทะฐั€ะตะฒะฐ. ะŸะฐั€ะตะฝัŒ ัั‡ะธั‚ะฐะป, ั‡ั‚ะพ ะดะตะฒะพั‡ะบะฐ ะปะธัˆัŒ ะฒั‚ะธั€ะฐะตั‚ัั ะฒ ะดะพะฒะตั€ะธะต, ะดะตะปะฐะตั‚ ะฒะธะด, ั‡ั‚ะพ ะพะฝะฐ ั‚ะฐะบะฐั ะดะพะฑั€ะฐั ะธ ะผะธะปะฐั, ะฒะตั‡ะฝะพ ะบั€ะฐัะฝะตัŽั‰ะฐั ะธ ะฝะตะฒะธะฝะฝะฐั ะบั€ะฐัะฐะฒะธั†ะฐ. ะ ะฝะฐ ัะฐะผะพะผ ะดะตะปะต ะฒะฝัƒั‚ั€ะธ ะตะต ะดัƒัˆะธ ัะฒะตั€ะฝัƒะปะฐััŒ ะบะพะปัŒั†ะฐะผะธ ะทะผะตั, ะพะถะธะดะฐัŽั‰ะฐั ะผะพะผะตะฝั‚ะฐ, ะบะพะณะดะฐ ะฟั€ะธะดะตั‚ัั ะฒะพะฝะทะธั‚ัŒ ัะฒะพะธ ะทัƒะฑั‹ ะฒ ัˆะตัŽ ะฟั€ะพั‚ะธะฒะฝะธะบะฐ ะธ ะฒะฟั€ั‹ัะฝัƒั‚ัŒ ัะด. ะคััˆ ะดัƒะผะฐะป, ั‡ั‚ะพ ะ’ะฐัะธะปะธัะฐ ะฝะธะบะพะณะดะฐ ะฝะต ัะผะพะถะตั‚ ะฝะฐัƒั‡ะธั‚ัŒัั ะปะตั‚ะฐั‚ัŒ. ะ›ัŽะดะธ, ั€ะฐัั…ะฐะถะธะฒะฐัŽั‰ะธะต ะฟะพ ะทะตะผะปะต, ะฝะต ะผะพะณัƒั‚ ะฟะพั‡ัƒะฒัั‚ะฒะพะฒะฐั‚ัŒ ะบั€ั‹ะปัŒั ะทะฐ ัะฟะธะฝะพะน, "ะพั‚ะบะปัŽั‡ะธั‚ัŒ ั€ัƒะบะธ" ะธ ะฒะทะผั‹ั‚ัŒ ะฒ ะณะพะปัƒะฑะพะต ะฝะตะฑะพ. ะะต ัะฟะพัะพะฑะฝั‹ ะฟะพั‡ัƒะฒัั‚ะฒะพะฒะฐั‚ัŒ ะฟะพั€ั‹ะฒั‹ ะฒะตั‚ั€ะฐ ะฝะฐ ัะฒะพะตะน ะบะพะถะต ะธ ะฟะพะฝัั‚ัŒ, ะบะฐะบะพะณะพ ัั‚ะพ ะฟั€ะตะฒั€ะฐั‚ะธั‚ัŒัั ะฒ ะฟั‚ะธั†ัƒ. 
ะ”ั€ะฐะณะพั†ะธะน ัƒะฒะตั€ัะป ัะตะฑั ะฒ ั‚ะพะผ, ั‡ั‚ะพ ัะพะฒะตั€ัˆะตะฝะฝะพ ะฝะต ะทะฐะฒะธะดัƒะตั‚ ะตะต ัƒะผะตะฝะธัŽ ะฝะฐั…ะพะดะธั‚ัŒ ะฒั‹ั…ะพะด ะธะท ัะปะพะถะฝั‹ั… ัะธั‚ัƒะฐั†ะธะน, ัƒะปั‹ะฑะฐั‚ัŒัั, ะฟั€ะพะดะพะปะถะฐั‚ัŒ ัˆัƒั‚ะธั‚ัŒ ะธ ะฒะตัะตะปะธั‚ัŒัั, ั…ะพั€ะพัˆะพ ะทะฝะฐั ะพ ะฟั€ะธะฑะปะธะถะฐัŽั‰ะตะผัั ะฝะฐะฟะฐะดะตะฝะธะธ ะัั‚ั€ะฐะณะพั€ะฐ. ะ”ั€ะฐะณะพั†ะธะน ัั‡ะธั‚ะฐะป ะฟั€ะฐะฒะธะปัŒะฝั‹ะผ ัะบั€ั‹ะฒะฐั‚ัŒ ัะผะพั†ะธะธ, ะฝะต ะพั‚ะบั€ั‹ะฒะฐั‚ัŒ ั‚ะพะปะฟะต ัะฒะพะธ ั‡ัƒะฒัั‚ะฒะฐ, ะผั‹ัะปะธ ะธ ัั‚ั€ะฐั…ะธ, ะฒะตะดัŒ ั‚ะฐะบ ะถะธะฒะตั‚ัั ะปะตะณั‡ะต. ะŸะพัั‚ะพะผัƒ ะพะฑ ัั‚ะพะน ะปะตะณะบะพะน ะทะฐะฒะธัั‚ะธ ะฝะธะบั‚ะพ ะธ ะฝะต ะทะฝะฐะป. ะ”ะฐะถะต ะะธะบ. ะคััˆ ะดัƒะผะฐะป, ั‡ั‚ะพ ะ’ะฐัะธะปะธัะฐ ะฒะตั‡ะฝะพ ะฑัƒะดะตั‚ ะฒั‹ะดะตะปัั‚ัŒ ะธะท ั‚ะพะปะฟั‹, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ัƒะฒะฐะถะฐัŽั‰ะธะต ัะตะฑั ั‡ะฐัะพะฒั‰ะธะบะธ ะฝะต ะดะตะปะฐัŽั‚ ะฒััะบะธะต ะฐะบั€ะพะฑะฐั‚ะธั‡ะตัะบะธะต ะฝะพะผะตั€ะฐ ะฒ ัะฒะพะฑะพะดะฝะพะต ะฒั€ะตะผั. ะ˜, ั‚ะตะผ ะฑะพะปะตะต, ะฝะต ะปะฐะทะฐัŽั‚ ะฟะพ ะดะตั€ะตะฒัŒัะผ. ะŸะฐั€ะตะฝัŒ ัั‡ะธั‚ะฐะป, ั‡ั‚ะพ ะ’ะฐัะธะปะธัะฐ - ะณะปัƒะฟะฐั ะดะตะฒั‡ะพะฝะบะฐ, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ะดะตั€ะทะธั‚ ะธ ะฟะตั€ะตั‡ะธั‚ ัั‚ะฐั€ัˆะธะผ, ะฟะพัั‚ะพัะฝะฝะพ ะฟั€ะตั€ะตะบะฐะตั‚ัั ั ะบะพะผะฟะฐะฝะธะตะน ะœะฐั€ะบะฐ, ะฟะพะดะฑั€ะฐัั‹ะฒะฐั ะฒ ะฟะปะฐะผั ะฝะฐะฟั€ัะถะตะฝะฝะพัั‚ะธ ะฒัะต ะฑะพะปัŒัˆะต ััƒั…ะธั… ะฟะพะปะตะฝัŒะตะฒ. ะŸะฐั€ะตะฝัŒ ั‚ะพั‡ะฝะพ ะทะฝะฐะป, ะบะพะณะดะฐ-ั‚ะพ ะพะฝะฐ ะทะฐะฟะปะฐั‚ะธั‚ ะทะฐ ะฒัะต ัะฒะพะธ ะดะตะนัั‚ะฒะธั ะธ ะบะพะปะบะธะต ัะปะพะฒะฐ, ะธ ะดะฐะถะต ะฟะพะทะฒะพะปัะป ัะตะฑะต ั€ะฐัั‚ัะณะธะฒะฐั‚ัŒ ะณัƒะฑั‹ ะฒ ั„ะฐะปัŒัˆะธะฒะพะน ัƒะปั‹ะฑะบะต, ั€ะฐะทะผั‹ัˆะปัั ะพะฑ ัั‚ะธั… ะฝะตะดะฐะปะตะบะธั… ะฒั€ะตะผะตะฝะฐั…. ะ”ั€ะฐะณะพั†ะธะน ะฝะตะฝะฐะฒะธะดะตะป ะตะต ะทะฐ ั‚ะพ, ั‡ั‚ะพ ะพะฝะฐ ะพะดะฐั€ะธะปะฐ ะตะณะพ ัะพั‡ัƒะฒัั‚ะฒัƒัŽั‰ะธะผ ะฒะทะณะปัะดะพะผ ะธ ะฟะพะฟั‹ั‚ะฐะปะฐััŒ ะฟะพะถะฐะปะตั‚ัŒ, ะฟะพะฝัั‚ัŒ, ะบะพะณะดะฐ ัƒะทะฝะฐะปะฐ ะพะฑ ะตะณะพ ัะธั€ะพั‚ัั‚ะฒะต. 
ะคััˆ ัั‡ะธั‚ะฐะป, ั‡ั‚ะพ ะดะตะฒะพั‡ะบะฐ ะผะพะณะปะฐ ะฝะต ัะฟั€ะฐัˆะธะฒะฐั‚ัŒ ัƒ ะฝะตะณะพ ะพ ัั‚ะฐั€ั‹ั… ะฒั€ะตะผะตะฝะฐั… ะธ ะฝะต ะฟั‹ั‚ะฐั‚ัŒัั ะฟะพะดะฑะพะดั€ะธั‚ัŒ, ั‡ั‚ะพ ะตะผัƒ ะถะฐะปะพัั‚ัŒ ัะพะฒะตั€ัˆะตะฝะฝะพ ะฝะต ะฝัƒะถะฝะฐ. ะคััˆ ะดัƒะผะฐะป, ะ’ะฐัะธะปะธัะฐ - ัะปะธัˆะบะพะผ ัะปะฐะฑะฐั ะดะปั ั‚ะพะณะพ, ั‡ั‚ะพะฑั‹ ะฒั‹ะถะธั‚ัŒ ะฟะพัะปะต ัะปัƒั‡ะฐั ั ะะปั‹ะผ ะฆะฒะตั‚ะบะพะผ, ะฒะตะดัŒ ะฝะธะบั‚ะพ ั€ะฐะฝะตะต ะธะท ะฝะพัะธั‚ะตะปะตะน ั‡ะตั€ะฝะพะณะพ ะบะปัŽั‡ะฐ ะฝะต ะฒั‹ะถะธะฒะฐะป. ะะฐะฒะตั€ะฝะพะต, ะฟะพัั‚ะพะผัƒ ะ”ั€ะฐะณะพั†ะธะน ั€ะตัˆะธะป ะฟะพะผะพั‡ัŒ ะตะน. ะะฐ ะฟะฐั€ะฝั ะฟะพะฒะปะธัะปะฐ ะผะธะปะฐั ัƒะปั‹ะฑะบะฐ ะžะณะฝะตะฒะพะน. ะคััˆ ั‚ะพะณะดะฐ ะดะปั ัะตะฑั ั€ะตัˆะธะป, ั‡ั‚ะพ ะฒ ะฟะพัะปะตะดะฝะธะน ั€ะฐะท ะดะตะปะฐะตั‚ ะตะน ะฟะพะดะพะฑะฝะพะต ะพะดะพะปะถะตะฝะธะต ะธ ะดะฐั€ะธั‚ ะถะธะทะฝัŒ, ะฟะพะพะฑะตั‰ะฐะป ะฝะต ะฒัะฟะพะผะธะฝะฐั‚ัŒ ะพ ะทะปะพะฟะพะปัƒั‡ะฝะพะผ ะดะฝะต ะฒะพะทะฒั€ะฐั‰ะตะฝะธั ะบ ะัั‚ั€ะฐะณะพั€ัƒ ะธะท-ะทะฐ ั€ั‹ะถะตะฒะพะปะพัะพะน. ะ”ั€ะฐะณะพั†ะธะน ะดัƒะผะฐะป, ั‡ั‚ะพ ะ’ะฐัะธะปะธัะฐ ะฝะต ะฟะพะฑะตะดะธั‚ ะฒะตะปะธะบะพะณะพ ะดัƒั…ะฐ, ะฟัƒัั‚ัŒ ะดะฐะถะต ะฒั‹ัƒั‡ะธั‚ ั‚ั‹ััั‡ัƒ ัั„ะตั€ะพะฒ, ั‡ั‚ะพ ะžะณะฝะตะฒะฐ - ะฝะตะผะพั‰ะฝะฐั ะธ ะฑะตััะธะปัŒะฝะฐั ะดะตะฒั‡ะพะฝะบะฐ, ะบะพั‚ะพั€ะฐั ั‚ะพะปัŒะบะพ ะธ ัƒะผะตะตั‚, ั‡ั‚ะพ ะฟะพะผะพะณะฐั‚ัŒ ะดั€ัƒะณะธะผ ะธ ัั‚ั€ะพะธั‚ัŒ ะธะท ัะตะฑั ะณะตั€ะพะธะฝัŽ ะดะตัˆะตะฒะพะณะพ ั€ะพะผะฐะฝะฐ. ะžะฝ ะฟะพะพะฑะตั‰ะฐะป, ั‡ั‚ะพ ะฝะต ะฟั€ะธัะพะตะดะธะฝะธั‚ัั ะบ ะฝะตะน, ะฝะต ะฑัƒะดะตั‚ ะฟะพะผะพะณะฐั‚ัŒ. ะคััˆ ัั‡ะธั‚ะฐะป, ั‡ั‚ะพ ะปะธัˆัŒ ะดะตะปะฐะตั‚ ะฒะธะด ะธ ะฟั€ะธั‚ะฒะพั€ัะตั‚ัั ะตะต ะดั€ัƒะณะพะผ. ะ”ั€ะฐะณะพั†ะธะน ะดัƒะผะฐะป, ั‡ั‚ะพ ะฝะต ะฒะปัŽะฑะธั‚ัั ะฒ ะฝะตะต, ะฝะต ะฟะพะนะดะตั‚ ะฝะฐ ะฟะพะฒะพะดัƒ ะบะฐะบะธั…-ั‚ะพ ะฝะตะปะตะฟั‹ั… ั‡ะฐั€ ั€ั‹ะถะตะฒะพะปะพัะพะน, ะฝะพ ะพะฝ ะปะธัˆัŒ ะฒ ะพั‡ะตั€ะตะดะฝะพะน ั€ะฐะท ะพัˆะธะฑะฐะปัั.' 
sentences: - 'ะ“ะฐั€ั€ะธ ะบั€ัƒั‚ะธั‚ ะบะพะปะตัะธะบะพ ะทะฐะถะธะณะฐะปะบะธ, ะขะธะบะบะธ ะทะฐะบะฐะฝั‡ะธะฒะฐะตั‚ ะตัั‚ัŒ ัะฒะพะต ะฟะตั‡ะตะฝัŒะต ะธ ัะผะพั‚ั€ะธั‚ ะฝะฐ ะฝะตะณะพ ัƒะบะพั€ะธะทะฝะตะฝะฝะพ, ะบะฐะบ ะฑั‹ ะณะพะฒะพั€ั ะฟั€ะตะบั€ะฐั‚ะธ-ัั‚ะพ-ะฟะฐั€ะตะฝัŒ-ะฝะต-ะฟะพะผะพะถะตั‚ (ะฝะพ ะพะฝ ะฝะต ะผะพะถะตั‚, ั‡ะตั€ั‚, ะฟะพะฟั€ะพัั‚ัƒ ะฝะต ะผะพะถะตั‚, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ะบัƒั€ะตะฝะธะต - ัั‚ะพ ะฟั€ะธะฒั‹ั‡ะบะฐ, ะตะดะธะฝัั‚ะฒะตะฝะฝั‹ะน ะฑั‹ัั‚ั€ั‹ะน ัะฟะพัะพะฑ ะฟะตั€ะตัั‚ะฐั‚ัŒ ะถะตะปะฐั‚ัŒ ัะผะตั€ั‚ะธ ะญะปะตะพะฝะพั€, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ, ั…ัะน, ะพะฝ ััƒะฟะตั€ะณะตั€ะพะน, ะตะผัƒ ะฝะตะปัŒะทั ัƒะฑะธะฒะฐั‚ัŒ ะฝะตะฒะธะฝะฝั‹ั… ะปัŽะดะตะน). ะ“ะฐั€ั€ะธ ะฒะธะฝะพะฒะฐั‚ะพ ัƒะปั‹ะฑะฐะตั‚ัั ะธ ะท-ะฐ-ั‚-ั-ะณ-ะธ-ะฒ-ะฐ-ะต-ั‚-ั-ั ัะธะณะฐั€ะตั‚ะฝั‹ะผ ะดั‹ะผะพะผ ะดะพ ะถะถะตะฝะธั ะฒ ะณะพั€ะปะต, ั‡ะตั€ะฝั‹ั… ั‚ะพั‡ะตะบ ะฟะตั€ะตะด ะณะปะฐะทะฐะผะธ ะธ ะฐะดัะบะพะน ะฑะพะปะธ ะณะดะต-ั‚ะพ ะผะตะถะดัƒ ั€ะตะฑะตั€, ะฒัะฟะพะผะธะฝะฐั ะฟะตั€ะตะฟะปะตั‚ะตะฝะฝั‹ะต ะฟะฐะปัŒั†ั‹ ะปัŽะฑะฒะธ ะฒัะตะน ะตะณะพ ะถะธะทะฝะธ ะธ ะดะตะฒัƒัˆะบะธ, ะพั‚ั€ะฐะฒะปััŽั‰ะตะน ะตะณะพ ััƒั‰ะตัั‚ะฒะพะฒะฐะฝะธะต ั ัะฐะผะพะณะพ ะฟะตั€ะฒะพะณะพ ะบะปะฐััะฐ. ะฃ ะฝะตะต ะฟะฐะฟะฐ - ะผัั€ ะณะพั€ะพะดะฐ, ะฒะตั‰ะธ ะพั‚ ะผะธั€ะพะฒั‹ั… ะฑั€ะตะฝะดะพะฒ, ะบั€ะฐัะธะฒะฐั (ะฝะพ ะฟัƒัั‚ะฐั) ะฒะฝะตัˆะฝะพัั‚ัŒ ะธ ั†ะตะปั‹ะน ะฒะพะท ัƒะถะฐัะฝั‹ั… ะฟะพัั‚ัƒะฟะบะพะฒ, ะธ ะ“ะฐั€ั€ะธ ะฟะพะฝัั‚ะธั ะฝะต ะธะผะตะตั‚, ั‡ั‚ะพ ะ›ัƒะธ ะฒ ะฝะตะน ะผะพะณ ะฝะฐะนั‚ะธ (ะฝะฐัˆะตะป). - ะขั‹ ัะธะปัŒะฝั‹ะน, - ะณะพะฒะพั€ะธั‚ ะตะณะพ ะผะฐะปะตะฝัŒะบะฐั ะฟะพะดั€ัƒะณะฐ, ัะฐะดัััŒ ะฝะฐ ะฟะปะตั‡ะพ. - ะขะพ, ั‡ั‚ะพ ะ›ัƒะธ ะฝะฐั‡ะฐะป ะฒัั‚ั€ะตั‡ะฐั‚ัŒัั ั ะญะปะตะพะฝะพั€, ะฝะต ั‚ะฐะบ ัั‚ั€ะฐัˆะฝะพ, ะบะฐะบ ัะพะทะดะฐะฝะธั, ั ะบะพั‚ะพั€ั‹ะผะธ ั‚ั‹ ะฟะพัั‚ะพัะฝะฝะพ ัั€ะฐะถะฐะตัˆัŒัั. 
(ะณะพั€ะฐะทะดะพ ัั‚ั€ะฐัˆะฝะตะต) - ะšะพะฝะตั‡ะฝะพ, - ั…ั€ะธะฟะธั‚ ะ“ะฐั€ั€ะธ ะฒะผะตัั‚ะต ั ะบะพัะพะน, ะฝะฐั‚ัะฝัƒั‚ะพะน ัƒะปั‹ะฑะบะพะน ะฝะฐ ะปะธั†ะต, ะธ ะขะธะบะบะธ ะตะผัƒ, ะบะฐะถะตั‚ัั, ะฒะตั€ะธั‚, ะบะปะฐะดั ะผะฐะปะตะฝัŒะบัƒัŽ ะปะฐะดะพัˆะบัƒ ะฝะฐ ั‰ะตะบัƒ ะฒ ะทะฝะฐะบ ะพะดะพะฑั€ะตะฝะธั. ะ“ะฐั€ั€ะธ ัˆะตัั‚ะฝะฐะดั†ะฐั‚ัŒ. ะžะฝ ะดะพะปะถะตะฝ ั…ะพะดะธั‚ัŒ ะฝะฐ ัะฒะธะดะฐะฝะธั, ะฒะตัะตะปะธั‚ัŒัั ั ะดั€ัƒะทัŒัะผะธ ะธ ะฝะฐัะปะฐะถะดะฐั‚ัŒัั ะถะธะทะฝัŒัŽ, ะฝะพ ะฒะผะตัั‚ะพ ัั‚ะพะณะพ ะพะฝ ัะฟะฐัะฐะตั‚ ะŸะฐั€ะธะถ ั‡ัƒั‚ัŒ ะปะธ ะฝะต ะตะถะตะดะฝะตะฒะฝะพ, ะปะตั‚ะฐั ะฟะพ ะณะพั€ะพะดัƒ ะฝะฐ ะดัƒั€ะฐั†ะบะพะผ ะนะพ-ะนะพ, ัะปะพะฒะฝะพ ั‡ะตะปะพะฒะตะบ-ะฟะฐัƒะบ, ะธ ะฟะพะปะฐะณะฐัััŒ ะฝะฐ ัะธะปัƒ ะผะฐะปะตะฝัŒะบะพะณะพ ะฑั€ะฐัะปะตั‚ะฐ (ะบั€ะฐัะฝะพะณะพ, ะฒ ะบั€ะฐะฟะธะฝะบัƒ, ะบะฐะบ ะบั€ั‹ะปัŒั ะ‘ะพะถัŒะตะน ะšะพั€ะพะฒะบะธ). (ะฐ ะตั‰ะต ะ“ะฐั€ั€ะธ ะฟั‹ั‚ะฐะตั‚ัั ัะบั€ั‹ั‚ัŒ ะพั‚ ะฒัะตั… ัะฒะพัŽ (ะฝะต ะพั‡ะตะฝัŒ) ะผะฐะปะตะฝัŒะบัƒัŽ ะฒะปัŽะฑะปะตะฝะฝะพัั‚ัŒ ะฒ ะ›ัƒะธ ะขะพะผะปะธะฝัะพะฝะฐ, ะตะณะพ ะพะดะฝะพะบะปะฐััะฝะธะบะฐ, ั€ัะดะพะผ ั ะบะพั‚ะพั€ั‹ะผ ะฝะต ะผะพะถะตั‚ ัะฒัะทะฐั‚ัŒ ะดะฐะถะต ะดะฒัƒั… ัะปะพะฒ ะธ ัƒะดะตั€ะถะฐั‚ัŒัั ะฝะฐ ะฝะพะณะฐั…. ั‚ะตะฟะตั€ัŒ ัั‚ะพ ะฝะตะฒะฐะถะฝะพ, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ (ะตะณะพ) ะ›ัƒ ะฒัั‚ั€ะตั‡ะฐะตั‚ัั ั ะญะปะตะพะฝะพั€, ะธ ะฟะฐั€ะตะฝัŒ ั‡ัƒะฒัั‚ะฒัƒะตั‚, ะบะฐะบ ะฒ ะณั€ัƒะดะธ ั‡ั‚ะพ-ั‚ะพ ะถะถะตั‚, ะฐ ะถะตะปะฐะฝะธะต ะถะธั‚ัŒ ั ะบะฐะถะดั‹ะผ ะดะฝะตะผ ัƒะผะตะฝัŒัˆะฐะตั‚ัั ะฒ ะณะตะพะผะตั‚ั€ะธั‡ะตัะบะพะน ะฟั€ะพะณั€ะตััะธะธ, ะพะน-ะพะน). ะŸะตั€ะฒั‹ะผ ะทะฐะผะตั‡ะฐะตั‚ ะัƒะฐั€, ะธ ะฝะต ั‚ะพ ั‡ั‚ะพะฑั‹ ะ“ะฐั€ั€ะธ ัƒะดะธะฒะปะตะฝ ัั‚ะพะผัƒ ั„ะฐะบั‚ัƒ, ะฟั€ะพัั‚ะพ ะบะฐะบ-ั‚ะพ ัั‚ั€ะฐะฝะฝะพ, ั‡ั‚ะพ ัั‚ะพั‚ ั€ะฐะทะดั€ะฐะถะฐัŽั‰ะธะน, ะดะตั€ะทะบะธะน ะธ ัะพะฒะตั€ัˆะตะฝะฝะพ-ะฝะต-ะฟะพั…ะพะถะธะน-ะฝะฐ-ะฒะทั€ะพัะปะพะณะพ ะบะพั‚ ะฟะพั‡ัƒะฒัั‚ะฒะพะฒะฐะป ะตะณะพ ะฟั€ะธั‚ะฒะพั€ัั‚ะฒะพ. - ะ’ัะต ะฒ ะฟะพั€ัะดะบะต, ะ‘ะพะถัŒั ะšะพั€ะพะฒะบะฐ? 
- ัะฟั€ะฐัˆะธะฒะฐะตั‚ ะพะฝ, ะบะพะณะดะฐ ะพะฝะธ ะฟะพะฑะตะถะดะฐัŽั‚ ะ ะตั„ะปะตะบั‚ัƒ, ะธ ะพัั‚ะฐะตั‚ัั ะฝะตัะบะพะปัŒะบะพ ะผะธะฝัƒั‚ ะดะพ ะฟั€ะตะฒั€ะฐั‰ะตะฝะธั. ะ—ะตะปะตะฝั‹ะต ะณะปะฐะทะฐ ะทะฐ ะผะฐัะบะพะน ะฒั‹ะณะปัะดัั‚ (ะฟะพ-ะฝะฐัั‚ะพัั‰ะตะผัƒ) ะพะฑะตัะฟะพะบะพะตะฝะฝั‹ะผะธ, ะธ ะ“ะฐั€ั€ะธ ั…ะพั‚ะตะป ะฑั‹ ะฟะพะฒะตั€ะธั‚ัŒ ะฒ ั€ะตะฐะปัŒะฝะพะต ะฒะพะปะฝะตะฝะธะต ะšะพั‚ะฐ ะพ ัะฒะพะตะน ะถะธะทะฝะธ, ะฝะพ ัั‚ะพ ะฝะต ะฒ ะตะณะพ ัะธะปะฐั…. ะ“ะฐั€ั€ะธ ั„ั‹ั€ะบะฐะตั‚ ะธ ะฟั‹ั‚ะฐะตั‚ัั ะดะตั€ะถะฐั‚ัŒ ัะตะฑั ะฒ ั€ัƒะบะฐั… (ะฒะดะพั…-ะฒั‹ะดะพั…, ะผะฐะปัŒั‡ะธะบ), ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ะพะฝะธ ะฒะตะดัŒ ะฝะฐะฟะฐั€ะฝะธะบะธ, ะธ ะพะฝ ะฝะต ะพะฑัะทะฐะฝ ะพั‚ะบั€ั‹ะฒะฐั‚ัŒ ัะฒะพัŽ ะดัƒัˆัƒ, ะฒะตั€ะฝะพ? (ะบ ั‚ะพะผัƒ ะถะต, ะณะพะฒะพั€ะธั‚ัŒ ั ะัƒะฐั€ะพะผ ะพ ะปัŽะฑะฒะธ ะฒัะต ั€ะฐะฒะฝะพ ั‡ั‚ะพ ั ั€ะตะฑะตะฝะบะพะผ, ะพะฝ ั€ะฐััะผะตะตั‚ัั, ะฝะต ะฟะพะนะผะตั‚) - ะ ะฐะทัƒะผะตะตั‚ัั, ั ั‡ะตะณะพ ั‚ั‹ ะฒะทัะป, ั‡ั‚ะพ ั‡ั‚ะพ-ั‚ะพ ะฝะต ั‚ะฐะบ, ะบะพั‚ะธะบ? - ะ“ะฐั€ั€ะธ ะปะตะณะพะฝัŒะบะพ ะฑัŒะตั‚ ะตะณะพ ะฟะพ ะฝะพััƒ, ะ“ะฐั€ั€ะธ ัะผะตะตั‚ัั ะธ ะดะตะปะฐะตั‚ ะฒะธะด, ั‡ั‚ะพ ะพะฝ ะดะตะนัั‚ะฒะธั‚ะตะปัŒะฝะพ ะฒ ะฟะพั€ัะดะบะต, ะฟัƒัั‚ัŒ ะฒะฝัƒั‚ั€ะธ ะธ ะฒัะต ะฝะพะตั‚ ะธ ัะบะฐะฝะดะธั€ัƒะตั‚ ะฟะธะทะดะตั†-ะฟะธะทะดะตั†-ะฟะธะทะดะตั†-ั-ั‚ะฐะบ-ะพะฑะปะฐะถะฐะปัั (ัะฝะพะฒะฐ). ะัƒะฐั€ ะผะพั€ั‰ะธั‚ ะฝะพั ะธ ัะผะพั‚ั€ะธั‚ ะตั‰ะต ะฑะพะปะตะต ะฟั€ะธัั‚ะฐะปัŒะฝะพ, ะฟะตั€ะตั…ะฒะฐั‚ั‹ะฒะฐั ะตะณะพ ะปะฐะดะพะฝัŒ ะธ ะบะฐั‡ะฐั ะณะพะปะพะฒะพะน, ะฝะต ะฒะตั€ัŽ, ะผะพะป, ะฟั€ะธะดัƒะผะฐะน ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ ะฟะพะปัƒั‡ัˆะต. - ะฏ ะฟั€ะพัั‚ะพ ั‡ัƒะฒัั‚ะฒัƒัŽ, - ัƒัˆะบะธ ะฝะฐ ะตะณะพ ะณะพะปะพะฒะต ะดะตั€ะณะฐัŽั‚ัั, ะธ ะ“ะฐั€ั€ะธ ะดะตั€ะณะฐะตั‚ัั ั‚ะพะถะต, ะฟั‹ั‚ะฐัััŒ ัƒะนั‚ะธ (ัƒะฑะตะถะฐั‚ัŒ) ะธ ะฒะตั€ะฝัƒั‚ัŒัั ะดะพะผะพะน, ั‡ั‚ะพะฑั‹ ะฟะพะบัƒั€ะธั‚ัŒ ะฒ ะพะดะธะฝะพั‡ะตัั‚ะฒะต, ะฝะพ ั…ะฒะฐั‚ะบะฐ ะšะพั‚ะฐ ะบั€ะตะฟะบะฐั, ะฐ ะณะพะปะพั ะพั‚ั‡ะฐัะฝะฝั‹ะน, ะฟะพั‡ั‚ะธ ัƒะผะพะปััŽั‰ะธะน, ะบะพะณะดะฐ ะพะฝ ะฟั€ะพัะธั‚ ั€ะฐััะบะฐะทะฐั‚ัŒ (ะฟะพะดะตะปะธั‚ัŒัั). 
- ะญั‚ะพ ะฝะต ั‚ะฒะพะต, ั‡ะตั€ั‚ ะฒะพะทัŒะผะธ, ะดะตะปะพ, - ัˆะธะฟะธั‚ ะ“ะฐั€ั€ะธ, ะฒัะต-ั‚ะฐะบะธ ะฒั‹ั€ั‹ะฒะฐั ั€ัƒะบัƒ, ะธ ัะฟั€ั‹ะณะธะฒะฐะตั‚ ั ะบั€ั‹ัˆะธ. - ะ”ะพ ะฒัั‚ั€ะตั‡ะธ ะฝะฐ ัะปะตะดัƒัŽั‰ะตะผ ะทะฐะดะฐะฝะธะธ, ะฝะฐะฟะฐั€ะฝะธะบ. ะฃ ะ›ัƒะธ - ะผะพั€ัะบะธะต ะณะปะฐะทะฐ, ะพัะปะตะฟะธั‚ะตะปัŒะฝั‹ะต ัƒะปั‹ะฑะบะธ ะธ ะฒะตั‡ะฝั‹ะน ั€ัƒะผัะฝะตั† ะฝะฐ ั‰ะตะบะฐั… (ะธ ะพะฝ ั‚ะฐะบะพะน ะบั€ะฐัะธะฒั‹ะน, ะณะพัะฟะพะดะธ, ั‡ั‚ะพ ะ“ะฐั€ั€ะธ ะณะพั‚ะพะฒ ะฟะพะบะปะพะฝัั‚ัŒัั ะตะผัƒ, ะบะฐะบ ัะฒะพะตะผัƒ ะปะธั‡ะฝะพะผัƒ ะ‘ะพะณัƒ). ะ˜ะฝะพะณะดะฐ ะ›ัƒะธ ะทะฐะผะตั‡ะฐะตั‚ ะตะณะพ ะฒะทะณะปัะดั‹ ะธ ะฟะพะดะผะธะณะธะฒะฐะตั‚, ะธ ะกั‚ะฐะนะปั, ะฟั€ะฐะฒะดะฐ, ะทะฝะฐะตั‚, ั‡ั‚ะพ ัั‚ะพ ะฒัะต ะฝะตัะตั€ัŒะตะทะฝะพ, ะฝะพ ะฝะธั‡ะตะณะพ ะฝะต ะผะพะถะตั‚ ั ัะพะฑะพะน ะฟะพะดะตะปะฐั‚ัŒ, ัƒะปั‹ะฑะฐัััŒ ั‚ะฐะบ ัˆะธั€ะพะบะพ, ั‡ั‚ะพ ะฑะพะปัั‚ ัะบัƒะปั‹, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ, ั‡ะตั€ั‚, ะพะฝ ะฒะปัŽะฑะปะตะฝ, ะฒะปัŽะฑะปะตะฝ, ะฒะปัŽะฑะปะตะฝ ั‚ะฐะบ ัะธะปัŒะฝะพ, ั‚ะฐะบ ะณะปัƒะฟะพ, ั‚ะฐะบ ะฟะพ-ะดะตั‚ัะบะธ (ะบะฐะถะตั‚ัั, ัƒะถะต ั†ะตะปัƒัŽ ะฒะตั‡ะฝะพัั‚ัŒ). ะ“ะฐั€ั€ะธ ะผะตั‡ั‚ะฐะตั‚ ะฒะทัั‚ัŒ ะ›ัƒะธ ะทะฐ ั€ัƒะบัƒ (ั‡ั‚ะพะฑั‹ ั‚ะพั‚ ะฟะพะทะฒะพะปะธะป ัั‚ะพ ัะดะตะปะฐั‚ัŒ), ะฟั€ะตะฒั€ะฐั‚ะธั‚ัŒัั ะฒ ั‡ะตั€ั‚ะพะฒัƒ ะ‘ะพะถัŒัŽ ะšะพั€ะพะฒะบัƒ ะธ ะฟะพะดะฝัั‚ัŒัั ะฝะฐ ัะฐะผั‹ะน ะฒะตั€ั… ะญะนั„ะตะปะตะฒะพะน, ั‡ั‚ะพะฑั‹ ะพะฑะฝะธะผะฐั‚ัŒัั ั ะฝะธะผ ะฝะฐะด ะฒัะตะผ ะŸะฐั€ะธะถะตะผ. (ะธ ะ“ะฐั€ั€ะธ ะทะฝะฐะตั‚, ั‡ั‚ะพ ะฒ ัˆะตัั‚ะฝะฐะดั†ะฐั‚ัŒ ะพะฝ ะดะพะปะถะตะฝ ะถะตะปะฐั‚ัŒ ะทะฐะฝัั‚ัŒัั ัะตะบัะพะผ ั ั‡ะตะปะพะฒะตะบะพะผ, ะบะพั‚ะพั€ั‹ะน ะฝั€ะฐะฒะธั‚ัั, ะฝะพ ัั‚ะพะณะพ ะฝะตั‚, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ัƒะถะต ั‡ะตั‚ั‹ั€ะต ะณะพะดะฐ ะ›ัƒะธ ั…ะพั‡ะตั‚ัั ั‚ะพะปัŒะบะพ ะป-ัŽ-ะฑ-ะธ-ั‚-ัŒ, ะธ ะฝะธั‡ะตะณะพ ะฑะพะปัŒัˆะต). 
ะ’ ะฟะพะฝะตะดะตะปัŒะฝะธะบ ะ›ัƒะธ ั†ะตะปัƒะตั‚ ะญะปะตะพะฝะพั€ ะฝะฐ ะณะปะฐะทะฐั… ัƒ ะฒัะตะณะพ ะบะปะฐััะฐ, ะธ ะ“ะฐั€ั€ะธ ั‡ัƒะฒัั‚ะฒัƒะตั‚, ะบะฐะบ ะฒะฝัƒั‚ั€ะธ ะฝะตะณะพ ะฒะทั€ั‹ะฒะฐัŽั‚ัั (ะณะพั€ะพะดะฐ, ะฒัะตะปะตะฝะฝั‹ะต, ัะฒะตั€ั…ะฝะพะฒั‹ะต), ะฝะพ ะผะพะปั‡ะธั‚ (ะบะพะณะดะฐ ะปัŽะฑะธัˆัŒ, ะฒัะตะณะดะฐ ะผะพะปั‡ะธัˆัŒ). ะ“ะฐั€ั€ะธ ะฟั€ะธั…ะพะดะธั‚ ะฒ ัะฒะพัŽ ะฟัƒัั‚ัƒัŽ ะบะฒะฐั€ั‚ะธั€ัƒ (ั€ะพะดะธั‚ะตะปะธ ัƒะตั…ะฐะปะธ ะฟะพ ั€ะฐะฑะพั‚ะต ะฝะฐ ะฝะตะดะตะปัŽ ะบัƒะดะฐ-ั‚ะพ ะฒ ะะผะตั€ะธะบัƒ, ะธ ะฝะต ัะบะฐะทะฐั‚ัŒ, ั‡ั‚ะพ ะกั‚ะฐะนะปััƒ ะฒัะต ั€ะฐะฒะฝะพ, ะธะปะธ ั‡ั‚ะพ ะพะฝ ะฝะต ัะบัƒั‡ะฐะตั‚, ะฟั€ะพัั‚ะพ ะตะผัƒ ะฝะต ะดะพ ะฝะธั… ัะตะนั‡ะฐั, ะฟั€ะฐะฒะดะฐ) ะธ ะฟะฐะดะฐะตั‚ ะฝะฐ ะดะธะฒะฐะฝ, ะถะตะปะฐั ั‚ะพะปัŒะบะพ ะพะดะฝะพะณะพ - ัƒะผะตั€ะตั‚ัŒ (ะฒะฟะตั€ะฒั‹ะต ั‚ะฐะบ ัะธะปัŒะฝะพ ะทะฐ ะฟะพัะปะตะดะฝะตะต ะฒั€ะตะผั). ะžะฝ ะฒั‹ั‚ะฐัะบะธะฒะฐะตั‚ ะธะท ะบะพะผะพะดะฐ ะฑัƒั‚ั‹ะปะบัƒ ะฒะธะฝะฐ, ั…ะพั‚ั ะตะผัƒ ะตั‰ะต ะปะตั‚ ะฟัั‚ัŒ ะบะฐะบ ะฝะตะปัŒะทั ะฟั€ะธะฝะธะผะฐั‚ัŒ ัะฟะธั€ั‚ะฝะพะต, ะธ ะฟัŒะตั‚ ะฒะตััŒ ะฒะตั‡ะตั€, ะฟะพะบะฐ ะฟะตั€ะตะด ะณะปะฐะทะฐะผะธ ะฝะต ะฝะฐั‡ะธะฝะฐัŽั‚ ะปะตั‚ะฐั‚ัŒ ั‡ะตั€ะฝั‹ะต ั‚ะพั‡ะบะธ, ะฐ ะณะพะปะพะฒะฐ ะบั€ัƒะถะธั‚ัŒัั. ะ“ะฐั€ั€ะธ ะดัƒะผะฐะตั‚, ั‡ั‚ะพ ะฟั€ะธั‚ะฒะพั€ะธั‚ัั ะฑะพะปัŒะฝั‹ะผ ะธ ะพัั‚ะฐะฝะตั‚ัั ะทะฐะฒั‚ั€ะฐ ะดะพะผะฐ (ะฝะตั‚ ะฝะธะบะฐะบะพะณะพ ะถะตะปะฐะฝะธั ะฒะธะดะตั‚ัŒ ะปัŽะฑะธะผะพะณะพ ั‡ะตะปะพะฒะตะบะฐ, ะบะพั‚ะพั€ั‹ะน ัั‡ะฐัั‚ะปะธะฒ ั ะดั€ัƒะณะธะผ). ะขะธะบะบะธ ะณะพะฒะพั€ะธั‚, ั‡ั‚ะพ ะพะฝ ะฝะต ะผะพะถะตั‚ ะฟั€ะพัั‚ะพ ั‚ะฐะบ ะฒะทัั‚ัŒ ะฒั‹ั…ะพะดะฝะพะน, ะทะปะพ ะฝะต ะดั€ะตะผะปะตั‚, ะธ ะฒัะต ั‚ะฐะบะพะต, ะฝะพ ะ“ะฐั€ั€ะธ ะฒัะต ั€ะฐะฒะฝะพ, ะพะฝ ะฒะตั€ะธั‚, ั‡ั‚ะพ ะัƒะฐั€ ัะฟั€ะฐะฒะธั‚ัั ัะฐะผ. ะ’ ะธั‚ะพะณะต ะšะฒะฐะผะธ ะฒั‹ั‚ะฐัะบะธะฒะฐะตั‚ ะตะณะพ ะฝะฐ ะทะฐะดะฐะฝะธะต, ะธ ะ“ะฐั€ั€ะธ ะฝะตะฝะฐะฒะธะดะธั‚ ะตะต ั‚ะฐะบ ั‚ะฐะบ ัะธะปัŒะฝะพ, ั‡ั‚ะพ ะฝะต ั…ะพั‡ะตั‚ ะฒะธะดะตั‚ัŒ ะตั‰ะต ะฑะปะธะถะฐะนัˆะธะต ะฝะตัะบะพะปัŒะบะพ ะดะฝะตะน. 
ะšะพั‚ ัƒะถะต ะฝะฐ ะผะตัั‚ะต ะธ ัะผะตั€ัะตั‚ ะตะณะพ (ัะฝะพะฒะฐ) ะฒะทะณะปัะดะพะผ, ะฝะฐะฟะพะปะฝะตะฝะฝั‹ะผ ะฒะพะปะฝะตะฝะธะตะผ, ะฝะพ ะ“ะฐั€ั€ะธ ะปะธัˆัŒ ะพั‚ะผะฐั…ะธะฒะฐะตั‚ัั ะธ ะพั‚ะฒะพั€ะฐั‡ะธะฒะฐะตั‚ัั, ั‡ั‚ะพะฑั‹ ะฑั‹ัั‚ั€ะตะฝัŒะบะพ ะฒั‹ะฟะธั‚ัŒ ะตั‰ะต ะพะดะฝัƒ ั‚ะฐะฑะปะตั‚ะบัƒ ะธะฑัƒะฟั€ะพั„ะตะฝะฐ - ะณะพะปะพะฒะฐ ั€ะฐัะบะฐะปั‹ะฒะฐะตั‚ัั ั‚ะฐะบ, ะฑัƒะดั‚ะพ ั‚ะฐะผ ั†ะตะปั‹ะน ะฟั‡ะตะปะธะฝั‹ะน ัƒะปะตะน. ะžะฝะธ ะตะดะฒะฐ ะฝะต ะทะฐะฒะฐะปะธะฒะฐัŽั‚ ัั‚ัƒ ะฑะธั‚ะฒัƒ, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ะัƒะฐั€ ะฟะพัั‚ะพัะฝะฝะพ ะพะณะปัะดั‹ะฒะฐะตั‚ัั ะฝะฐ ะกั‚ะฐะนะปัะฐ ะธ ะฟั‹ั‚ะฐะตั‚ัั ะตะณะพ ะฟั€ะธะบั€ั‹ั‚ัŒ, ะฐ ะ“ะฐั€ั€ะธ ะฟั€ะพัั‚ะพ ั‡ัƒะฒัั‚ะฒัƒะตั‚ ัะตะฑั ะทะพะผะฑะธ (ะผะพะทะณะธ ะฝะต ัะพะพะฑั€ะฐะถะฐัŽั‚, ั‚ะตะปะพ ะตะดะฒะฐ ัะปัƒัˆะฐะตั‚ัั), ะฝะพ ะฒ ะบะพะฝั†ะต ะบะพะฝั†ะพะฒ ะฒัะต ะทะฐะบะฐะฝั‡ะธะฒะฐะตั‚ัั ะบะฐะบ ะฒัะตะณะดะฐ, ั…ะพั€ะพัˆะพ, ะธ ะ“ะฐั€ั€ะธ ะพะฑะตััะธะปะตะฝะฝะพ ะฟั€ะธัะปะพะฝัะตั‚ัั ะบ ัั‚ะตะฝะต ะบะฐะบะพะณะพ-ั‚ะพ ะทะดะฐะฝะธั, ะฟั€ะธะบั€ั‹ะฒะฐั ะณะปะฐะทะฐ (ัะตะนั‡ะฐั ะฑั‹ ะพะบะฐะทะฐั‚ัŒัั ะฝะฐ ะฝะตะพะฑะธั‚ะฐะตะผะพะผ ะพัั‚ั€ะพะฒะต, ะฟะพัั€ะตะดะธ ะดะธะบะพะน ะฟั€ะธั€ะพะดั‹ ะธ ะฑัƒัˆัƒัŽั‰ะตะณะพ ะพะบะตะฐะฝะฐ). - ะขั‹ ะผะพะถะตัˆัŒ ั€ะฐััะบะฐะทะฐั‚ัŒ ะผะฝะต, ั‡ั‚ะพ ะฟั€ะพะธัั…ะพะดะธั‚, ั ะฟะพะนะผัƒ, - ะณะพะฒะพั€ะธั‚ ะัƒะฐั€, ะฟะพะดั…ะพะดั ะบ ะฝะตะผัƒ, ะธ ะ“ะฐั€ั€ะธ ั…ะผั‹ะบะฐะตั‚, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ะšะพั‚ ะฒัะต ะตั‰ะต ะฟะพัะปะตะดะฝะธะน, ะบะพะผัƒ ะฑั‹ ะพะฝ ะดะพะฒะตั€ะธะปัั. - ะ’ัะต ะฒ ะฟะพั€ัะดะบะต, - ั†ะตะดะธั‚ ะ“ะฐั€ั€ะธ ั ั€ะฐะทะดั€ะฐะถะตะฝะธะตะผ. - ะŸั€ะพัั‚ะพ ะฝะตะฑะพะปัŒัˆะธะต ะฟั€ะพะฑะปะตะผั‹ ะฒ ะถะธะทะฝะธ ะธ ะฟะพั…ะผะตะปัŒะต. - ะฏ ะปะธัˆัŒ ั…ะพั‡ัƒ ะฟะพะผะพั‡ัŒ, - ะฒ ะณะพะปะพัะต ะฟะฐั€ะฝั ะฟั€ะพัะบะฐะปัŒะทั‹ะฒะฐัŽั‚ ะพะฑะธะถะตะฝะฝั‹ะต ะฝะพั‚ะบะธ, ะธ ะ“ะฐั€ั€ะธ (ะฟะพั‡ั‚ะธ) ะตะณะพ ะถะฐะปัŒ. - ะœะฝะต ะฝะธะบั‚ะพ ะฝะต ะผะพะถะตั‚ ะฟะพะผะพั‡ัŒ, ะฟะพะฝะธะผะฐะตัˆัŒ? 
ะœะฝะต ะฝัƒะถะฝะพ, ั‡ั‚ะพะฑั‹ ะฒัะต ะพัั‚ะฐะฒะธะปะธ ะผะตะฝั ะฒ ะฟะพะบะพะต, ะธ ะฟะตั€ะตัั‚ะฐะปะธ ะฟั‹ั‚ะฐั‚ัŒัั ั‡ั‚ะพ-ั‚ะพ ะฒั‹ััะฝะธั‚ัŒ ะธ ะธัะฟั€ะฐะฒะธั‚ัŒ. - ะ”ะฐะน ะผะฝะต ัˆะฐะฝั, - ะฟั€ะพัะธั‚ ะัƒะฐั€ ะธ, ั‡ะตั€ั‚, ะพะฝ ะดะตะนัั‚ะฒะธั‚ะตะปัŒะฝะพ ะฒะพะปะฝัƒะตั‚ัั, ั‚ะตะฟะตั€ัŒ ะ“ะฐั€ั€ะธ ะฝะต ะผะพะถะตั‚ ัั‚ะพ ะธะณะฝะพั€ะธั€ะพะฒะฐั‚ัŒ. - ะœั‹ ะผะพะถะตะผ ะฒัั‚ั€ะตั‚ะธั‚ัŒัั ะณะดะต-ะฝะธะฑัƒะดัŒ ะธ ะฟะพะณะพะฒะพั€ะธั‚ัŒ. ะ’ ะบะพัั‚ัŽะผะฐั…, ะบะพะฝะตั‡ะฝะพ ะถะต. ะ˜ ะ“ะฐั€ั€ะธ ะฝะต ะทะฝะฐะตั‚, ั‡ั‚ะพ ั€ัƒะบะพะฒะพะดะธั‚ ะธะผ, ะบะพะณะดะฐ ะพะฝ ะณะพะฒะพั€ะธั‚ "ะดะฐ". ะ’ ัั€ะตะดัƒ ะบ ะกั‚ะฐะนะปััƒ ะฟั€ะธั…ะพะดะธั‚ ะ—ะตะนะฝ (ะปัƒั‡ัˆะธะน ะดั€ัƒะณ ะธะท ะบะฐั‚ะตะณะพั€ะธะธ ะฒะผะตัั‚ะต-ั-ะดะตั‚ัั‚ะฒะฐ-ะธ-ะฝะฐะฒัะตะณะดะฐ). ะžะฝ ะฟั€ะธะฝะพัะธั‚ ั ัะพะฑะพะน "1+1" ะธ ะผะพั€ะพะถะตะฝะพะต, ะฝะพ ะบะพะณะดะฐ ะฒะธะดะธั‚ ัะพัั‚ะพัะฝะธะต ะ“ะฐั€ั€ะธ, ั‚ะพ ะฟั€ะพัั‚ะพ ัƒั‚ัะณะธะฒะฐะตั‚ ะตะณะพ ั ัะพะฑะพะน ะฝะฐ ะบั€ะพะฒะฐั‚ัŒ, ั‡ั‚ะพะฑั‹ ะพะฑะฝัั‚ัŒ. - ะšะฐะบ ั‚ั‹ ะดัƒะผะฐะตัˆัŒ, ะพั‚ ะปัŽะฑะฒะธ ะผะพะถะฝะพ ัƒะผะตั€ะตั‚ัŒ? - ัะฟั€ะฐัˆะธะฒะฐะตั‚ ะ“ะฐั€ั€ะธ, ะฟะพะปะพะถะธะฒ ะณะพะปะพะฒัƒ ะ—ะตะนะฝัƒ ะฝะฐ ะณั€ัƒะดัŒ ะธ ะฟั€ะธัะปัƒัˆะธะฒะฐัััŒ ะบ ะผะตั€ะฝะพะผัƒ ัั‚ัƒะบัƒ ัะตั€ะดั†ะฐ, ั‡ัƒะฒัั‚ะฒัƒั (ะณะพัะฟะพะดะธ-ะฑะพะถะต-ะผะพะน-ะฝะฐะบะพะฝะตั†) ัƒะผะธั€ะพั‚ะฒะพั€ะตะฝะธะต. ะ—ะตะนะฝ ั‚ะธั…ะพ ัะผะตะตั‚ัั ะฝะฐะด ะณะปัƒะฟะพัั‚ัŒัŽ ะ“ะฐั€ั€ะธ, ะฝะพ ะฒัะต ั€ะฐะฒะฝะพ ะพะฑะฝะธะผะฐะตั‚ ะบั€ะตะฟั‡ะต, ะฟั€ะธะถะธะผะฐั ะบ ัะตะฑะต, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ะทะฝะฐะตั‚, ั‡ั‚ะพ ะฝัƒะถะฝะพ ะตะณะพ ะผะปะฐะดัˆะตะผัƒ ะดั€ัƒะณัƒ (ะทะฝะฐะตั‚, ะบะฐะบ ะปะตั‡ะธั‚ัŒ ะตะณะพ ะฑะพะปัŒะฝะพะต ัะตั€ะดั†ะต). - ะฅะฐะท, ั‚ั‹ ะถะต ะทะฝะฐะตัˆัŒ, ั‡ั‚ะพ ะ›ัƒะธ ั ะญะปะตะพะฝะพั€ ะฝะต ัะผะพะณัƒั‚ ะฑั‹ั‚ัŒ ะดะพะปัŒัˆะต ะฝะตะดะตะปะธ, ะพะฝะฐ ะดะพัั‚ะฐะฝะตั‚ ะตะณะพ ัะฒะพะธะผ ะพั‚ะฒั€ะฐั‚ะธั‚ะตะปัŒะฝั‹ะผ ั…ะฐั€ะฐะบั‚ะตั€ะพะผ ะธ ะฟะธัะบะปัะฒั‹ะผ ะณะพะปะพัะพะผ, - ัˆะตะฟั‡ะตั‚ ะพะฝ. - ะขะฒะพั ะปัŽะฑะพะฒัŒ ะฝะธะบัƒะดะฐ ะพั‚ ั‚ะตะฑั ะฝะต ะดะตะฝะตั‚ัั. 
ะ“ะฐั€ั€ะธ ั‚ะธั…ะพ ะฒัั…ะปะธะฟั‹ะฒะฐะตั‚ ะธ ะฟั€ะธะถะธะผะฐะตั‚ ะฝะพะณะธ ะบ ั‚ะตะปัƒ, ัะฒะพั€ะฐั‡ะธะฒะฐัััŒ ะบะปัƒะฑะพั‡ะบะพะผ. ะžะฝ ะฒั‹ะณะปัะดะธั‚ ะธัั‚ะพั‰ะตะฝะฝั‹ะผ ะธ ัะปะฐะฑั‹ะผ, ะธ ะ—ะตะนะฝ ั‡ัƒะฒัั‚ะฒัƒะตั‚ ะฑะพะปัŒ ะพั‚ ัั‚ะพะณะพ, ะฝะพ ะฝะธั‡ะตะณะพ ะฝะต ะผะพะถะตั‚ ะฟะพะดะตะปะฐั‚ัŒ (ั€ะฐะทะฒะต ั‡ั‚ะพ ะฒั€ะตะทะฐั‚ัŒ ะขะพะผะปะธัะพะฝัƒ, ั…ะพั‚ัŒ ัั‚ะพ ะธ ะณะปัƒะฟะพ). ะ“ะฐั€ั€ะธ ั‡ัƒะฒัั‚ะฒัƒะตั‚ ัะตะฑั ะผะฐะปะตะฝัŒะบะพะน ะฑัƒะบะฐัˆะบะพะน, ะบะพั‚ะพั€ัƒัŽ ะฒะพั‚-ะฒะพั‚ ั€ะฐัั‚ะพะฟั‡ัƒั‚, ัƒ ะฝะตะณะพ ะฝะตั‚ ัะธะป, ัะฝะตั€ะณะธะธ (ะธ ะฒะตั€ั‹ ะฒ ะปัƒั‡ัˆะตะต, ะบัั‚ะฐั‚ะธ, ั‚ะพะถะต ั ะฝะตะดะฐะฒะฝะธั… ะฟะพั€). ะ“ะฐั€ั€ะธ ะดัƒะผะฐะตั‚, ั‡ั‚ะพ ะฝะต ะดะพัั‚ะพะธะฝ ะฑั‹ั‚ัŒ ััƒะฟะตั€ะณะตั€ะพะตะผ ะธ ัะฟะฐัะฐั‚ัŒ ะผะธั€, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ะฝะต ะผะพะถะตั‚ ัะฟะฐัั‚ะธ ะดะฐะถะต ัะฐะผะพะณะพ ัะตะฑั. ะ“ะฐั€ั€ะธ ะพะฑะตั‰ะฐะตั‚ ัะตะฑะต ัั‚ะฐั‚ัŒ ั‡ัƒั‚ัŒ ัะธะปัŒะฝะตะต (ั€ะฐะดะธ ะฒัะตะณะพ ะผะธั€ะฐ) ะธ ะฟะพะณะพะฒะพั€ะธั‚ัŒ-ั‚ะฐะบะธ ั ะัƒะฐั€ะพะผ (ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ะฒะธะฝะฐ ะณะปะพะถะตั‚ ะตะณะพ ะธะทะฝัƒั‚ั€ะธ, ะธ ั ะฝะตะน ะฝะฐะดะพ ั‡ั‚ะพ-ั‚ะพ ะดะตะปะฐั‚ัŒ), ะฒะตะดัŒ ะธะผะตะฝะฝะพ ัั‚ะพ ะธ ะดะตะปะฐัŽั‚ ััƒะฟะตั€ะณะตั€ะพะธ - ะทะฐะฑั‹ะฒะฐัŽั‚ ะพ ัะตะฑะต, ะทะฐะฑะพั‚ัััŒ ะพ ะดั€ัƒะณะธั… - ะฒะตั€ะฝะพ? (ะฝะฐ ัะปะตะดัƒัŽั‰ะธะน ะดะตะฝัŒ ะ“ะฐั€ั€ะธ ะทะฐะผะฐะทั‹ะฒะฐะตั‚ ะบั€ัƒะณะธ ะฟะพะด ะณะปะฐะทะฐะผะธ ะธ ะฒั‹ะบัƒั€ะธะฒะฐะตั‚ ะฟะพัะปะตะดะฝัŽัŽ ัะธะณะฐั€ะตั‚ัƒ, ะฒั‹ะบะธะดั‹ะฒะฐั ะฟะฐั‡ะบัƒ ะบ ั‡ะตั€ั‚ัƒ ั ั…ะฒะฐั‚ะธั‚ ัะตะฑะต ะฟะพะด ะฝะพั. ะ“ะฐั€ั€ะธ ัƒั‡ะธั‚ัั ั‚ะตั€ะฟะตะฝะธัŽ, ะฝะฐั‡ะธะฝะฐะตั‚ ะทะดะพั€ะพะฒะฐั‚ัŒัั ั ะ›ัƒะธ ะธ ะญะปะตะพะฝะพั€ ะธ ะดะฐะถะต ะฟะพะทะดั€ะฐะฒะปัะตั‚ ะธั…, ะธ ะฟั‹ั‚ะฐะตั‚ัั ะผะตะฝัŒัˆะต ัะผะพั‚ั€ะตั‚ัŒ ะฝะฐ ะฟะฐั€ะฝั (ะฟะพัะปะตะดะฝะตะต ะฝะต ะฟะพะปัƒั‡ะฐะตั‚ัั, ะฝะพ ะฝะธ ะพะดะฝัƒ ะฒะตะปะธะบัƒัŽ ั†ะตะปัŒ ะฝะตะปัŒะทั ะดะพัั‚ะธะณะฝัƒั‚ัŒ ัั€ะฐะทัƒ, ั‚ะฐะบ ั‡ั‚ะพ)). 
ะ’ ั‡ะตั‚ะฒะตั€ะณ ะ“ะฐั€ั€ะธ ะฟะพะบัƒะฟะฐะตั‚ ั‡ะธะฟัั‹, ะบะพะฝั„ะตั‚ั‹ ะธ ะณะฐะทะธั€ะพะฒะบัƒ (ะพะฝ ะฟะพะฝัั‚ะธั ะฝะต ะธะผะตะตั‚, ั‡ั‚ะพ ะธะผะตะฝะฝะพ ะปัŽะฑะธั‚ ะšะพั‚) ะธ ะธะดะตั‚ ะฝะฐ ะฒัั‚ั€ะตั‡ัƒ ั ะัƒะฐั€ะพะผ ะฒ ะพะดะธะฝ ะธะท ัะฐะผั‹ั… ะผะฐะปะพะฝะฐัะตะปะตะฝะฝั‹ั… ั€ะฐะนะพะฝะพะฒ ะณะพั€ะพะดะฐ, ะณะดะต ะฝะธะบั‚ะพ ั‚ะพั‡ะฝะพ ะฝะต ะทะฐะผะตั‚ะธั‚ ะธั…, ะดะฐะถะต ะฝะต ะฝะฐะดะตัััŒ ะฝะฐ ั‚ะพ, ั‡ั‚ะพ ั€ะฐะทะณะพะฒะพั€ ั ะฝะธะผ ั…ะพั‚ัŒ ะบะฐะบ-ั‚ะพ ะฟะพะผะพะถะตั‚ ะตะผัƒ. - ะขั‹ ะทะดะตััŒ, - ะฒัะบั€ะธะบะธะฒะฐะตั‚ ะฟะฐั€ะตะฝัŒ, ะฒัะบะฐะบะธะฒะฐั ัะพ ัะฒะพะตะณะพ ะผะตัั‚ะฐ, ะธ ะตะณะพ ะณะปะฐะทะฐ ะทะฐะณะพั€ะฐัŽั‚ัั ะธัะบั€ะตะฝะฝะตะน ั€ะฐะดะพัั‚ัŒัŽ, ะบะพะณะดะฐ ะพะฝ ะฒะธะดะธั‚ ะบะพะฝั„ะตั‚ั‹ ะธ ะฒัะต ะพัั‚ะฐะปัŒะฝะพะต. - ะ“ะพัะฟะพะดะธ, ั‚ะตะฑะต ะฝะต ะฝัƒะถะฝะพ ะฑั‹ะปะพ ะฒัะต ัั‚ะพ ะฟั€ะธะฝะพัะธั‚ัŒ. ะ“ะฐั€ั€ะธ ะฟะพะถะธะผะฐะตั‚ ะฟะปะตั‡ะฐะผะธ, ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ, ั‡ะตั€ั‚, ัั‚ะพ ะถะต ะตั€ัƒะฝะดะฐ, ะธ ัƒัะฐะถะธะฒะฐะตั‚ัั, ะพั‚ะบะธะดั‹ะฒะฐัััŒ ะฝะฐ ัะฟะธะฝะบัƒ ะปะฐะฒะพั‡ะบะธ. - ะ”ะฐะฒะฐะน ะฝะฐั‡ะฝะตะผ ัั€ะฐะทัƒ. ะงะตะผ ะฑั‹ัั‚ั€ะตะต, ั‚ะตะผ ะปัƒั‡ัˆะต, ะฒะตั€ะฝะพ? - ัƒัะผะตั…ะฐะตั‚ัั ะพะฝ, ัะฟะปะตั‚ะฐั ัะฒะพะธ ะฟะฐะปัŒั†ั‹ ะฒ ะทะฐะผะพะบ ะพั‚ ะฒะฝะตะทะฐะฟะฝะพะณะพ ั‡ัƒะฒัั‚ะฒะฐ ะฝะตะปะพะฒะบะพัั‚ะธ ะธ ะธัะฟัƒะณะฐ. - ะขั‹ ั€ะฐััะบะฐะถะตัˆัŒ ะผะฝะต, ั‡ั‚ะพ ะฟั€ะพะธัั…ะพะดะธั‚? - ะฝะตะดะพะฒะตั€ั‡ะธะฒะพ ะธะฝั‚ะตั€ะตััƒะตั‚ัั ะัƒะฐั€. - ะขั‹ ะฒะตะดัŒ ะฝะต ะพั‚ัั‚ะฐะฝะตัˆัŒ, ะฐ ะผะฝะต ัƒะถะต ะพัั‚ะพั‡ะตั€ั‚ะตะปะฐ ั‚ะฒะพั ะทะฐะฑะพั‚ะฐ, ัะปะพะฒะฝะพ ั ะผะฐะปะตะฝัŒะบะธะน ั€ะตะฑะตะฝะพะบ. ะกะพ ะผะฝะพะน ะฝะต ัะปัƒั‡ะธะปะพััŒ ะฝะธั‡ะตะณะพ ั‚ะฐะบะพะณะพ, ะพั‚ ั‡ะตะณะพ ะฝะฐะดะพ ะพะฑะตั€ะตะณะฐั‚ัŒ. ะŸั€ะพัั‚ะพ ั‡ะตะปะพะฒะตะบ, ะบะพั‚ะพั€ะพะณะพ ั ะปัŽะฑะปัŽ ัƒะถะต ั‡ะตั‚ั‹ั€ะต ะณะพะดะฐ, ะฒัั‚ั€ะตั‡ะฐะตั‚ัั ั ะดั€ัƒะณะธะผ, ะธ ัั‚ะพ, ะพะบะฐะทั‹ะฒะฐะตั‚ัั, ะณะพั€ะฐะทะดะพ ั…ัƒะถะต, ั‡ะตะผ ะพะฟะธัั‹ะฒะฐัŽั‚ ะฒ ะบะฝะธะณะฐั…. 
ะœะฝะต ะบะฐะถะตั‚ัั, ั‡ั‚ะพ ั ัะณะพั€ะฐัŽ ะทะฐะถะธะฒะพ, ะบะพะณะดะฐ ะฒะธะถัƒ ะธั…, ะธะดัƒั‰ะธั… ะฟะพ ะบะพั€ะธะดะพั€ัƒ, ะธ ัั‚ะพ ัƒะฑะธะฒะฐะตั‚, ะฟะพะฝะธะผะฐะตัˆัŒ? ะŸะพั‚ะพะผัƒ ั‡ั‚ะพ ั ะทะฝะฐัŽ, ั‡ั‚ะพ ัƒ ะผะตะฝั ะฝะตั‚ ะฝะธะบะฐะบะธั… ัˆะฐะฝัะพะฒ, ั ะฒัะตะณะพ ะปะธัˆัŒ ะณะปัƒะฟั‹ะน ั‚ั€ะพะตั‡ะฝะธะบ ั ะฟะพัะปะตะดะฝะตะน ะฟะฐั€ั‚ั‹, ัƒ ะบะพั‚ะพั€ะพะณะพ ะฝะธ ะฒะฝะตัˆะฝะพัั‚ะธ, ะฝะธ ะฑะพะณะฐั‚ะพะณะพ ะพั‚ั†ะฐ, ะฒ ะพั‚ะปะธั‡ะธะต ะพั‚ ะตะณะพ ั‡ะตั€ั‚ะพะฒะพะน ะดะตะฒัƒัˆะบะธ ั ะพั‚ั†ะพะผ-ะผัั€ะพะผ, - ะทะฐะบะฐะฝั‡ะธะฒะฐะตั‚ ะ“ะฐั€ั€ะธ ั ะฟะพะปะฝั‹ะผ ะพะฟัƒัั‚ะพัˆะตะฝะธะตะผ ะฒะฝัƒั‚ั€ะธ, ะฒะตะดัŒ ัั‚ะพ ะฟะตั€ะฒั‹ะน ั€ะฐะท, ะบะพะณะดะฐ ะพะฝ ั€ะฐััะบะฐะทั‹ะฒะฐะตั‚ ัะฒะพัŽ ะธัั‚ะพั€ะธัŽ ะบะพะผัƒ-ั‚ะพ, ะบั€ะพะผะต ะปัƒั‡ัˆะตะณะพ ะดั€ัƒะณะฐ. - ะฏ ะดัƒะผะฐัŽ, ั ะฟะพะฝะธะผะฐัŽ, - ะบะธะฒะฐะตั‚ ะัƒะฐั€, ะฒั‹ะณะปัะดั ัะตั€ัŒะตะทะฝั‹ะผ ะธ ะฝะตะพะถะธะดะฐะฝะฝะพ ัƒะดะธะฒะปะตะฝะฝั‹ะผ, ะธ ะฟั€ะธะพะฑะฝะธะผะฐะตั‚ ะตะณะพ ะทะฐ ะฟะปะตั‡ะธ, ะทะฐัั‚ะฐะฒะปัั ะฟั€ะธะถะฐั‚ัŒัั ะบ ัะตะฑะต. ะ“ะฐั€ั€ะธ ั‚ัะถะตะปะพ ะดั‹ัˆะฐั‚ัŒ ะฟะพัะปะต ั‚ะฐะบะพะณะพ ะพั‚ะบั€ะพะฒะตะฝะธั, ะธ ะตะณะพ ัะตั€ะดั†ะต ะฑัŒะตั‚ัั ั‡ะตั€ะตัั‡ัƒั€ ะฑั‹ัั‚ั€ะพ, ั‚ะฐะบ ั‡ั‚ะพ ะพะฝ ะทะฐะฑั‹ะฒะฐะตั‚ ะพ ั‚ะพะผ, ั‡ั‚ะพ ะพะฝะธ ั ะšะพั‚' - 'ะัั‚ะพ ัะถะธะผะฐะตั‚ ะบัƒะปะฐะบะธ, ะณะปัะดั ะฝะฐ ัƒะถะต ะฒัŠะตะฒัˆะตะตัั ะฒ ะดัƒัˆัƒ ะธะผั. "ะฎะธ ะšะพะผะพั€ะธ". ะžะฝ - ะตั‘ ะ“ะพัะฟะพะดะธะฝ. ะ˜ ะพะฝะฐ ะพะฑัะทะฐะฝะฐ ะฟะพะดั‡ะธะฝัั‚ัŒัั ะตะผัƒ. ะ˜, ัะปะตะดัƒั ัั‚ะพะผัƒ ะฟั€ะฐะฒะธะปัƒ, ัะตะนั‡ะฐั ะดะพะปะถะฝะฐ ะฒะพัะบั€ะตัะฝัƒั‚ัŒ. ะ—ะปะพะฑะฐ ะฟะพะถะธั€ะฐะตั‚ ะตะณะพ, ะทะฐัั‚ะฐะฒะปัั ั‚ะธั…ะพ ั€ั‹ั‡ะฐั‚ัŒ. ะžะฑั‹ั‡ะฝะพ ะฟะพะดะฒั‘ั€ะฝัƒั‚ะฐั ัˆั‚ะฐะฝะธะฝะฐ ะพะฟัƒั‰ะตะฝะฐ. - ะ—ะฝะฐะตัˆัŒ, ั‚ะฐะบ ั‚ั‹ ะฒั‹ะณะปัะดะธัˆัŒ ะฝะตะผะฝะพะณะพ... ะะตะฑั€ะตะถะฝะพ. ะฃะดะฐั€ - ะฟะพ ะฝะฐะดะณั€ะพะฑะฝะพะผัƒ ะบะฐะผะฝัŽ ะฟั€ะพั…ะพะดะธั‚ ั‚ั€ะตั‰ะธะฝะฐ. ะคะพั‚ะพ ั€ะฐัั‰ะตะฟะปัะตั‚ัั ะฝะฐะดะฒะพะต. ะžะฝ ะณะพั‚ะพะฒ ะฟะพัะฟะพั€ะธั‚ัŒ, ั‡ั‚ะพ ะพะฝะฐ ัะตะนั‡ะฐั ัั‚ะพะธั‚ ะฟะพะทะฐะดะธ ะฝะตะณะพ. 
ะ“ัƒะฑั‹ ะฟะพะดะถะฐั‚ั‹, ะดะตะฒัƒัˆะบะฐ ะตะดะฒะฐ ะปะธ ัะดะตั€ะถะธะฒะฐะตั‚ ัะปั‘ะทั‹. ะ ัƒะบะธ ะฝะตั€ะฒะฝะพ ั‚ะตั€ะตะฑัั‚ ะธ ะฑะตะท ั‚ะพะณะพ ะฟะพะผัั‚ัƒัŽ ัŽะฑะบัƒ. ะžะฝ ะณะพั‚ะพะฒ ะฟะพัะฟะพั€ะธั‚ัŒ, ั‡ั‚ะพ ัะตะนั‡ะฐั ะพะฝะฐ ั‚ะธั…ะพ ัะบะฐะถะตั‚ ั‡ั‚ะพ-ั‚ะพ ะฟั€ะพ ั‚ะพ, ั‡ั‚ะพ ั…ะพะทัะธะฝ ะผะพะณะธะปั‹ ะฑัƒะดะตั‚ ะฝะตะดะพะฒะพะปะตะฝ. ะ˜ ะพะฝ ะณะพั‚ะพะฒ ะฟะพัะฟะพั€ะธั‚ัŒ, ั‡ั‚ะพ ะตัะปะธ ะพะฝ ะพะฑะตั€ะฝั‘ั‚ัั, ะพะฝะฐ ะธัั‡ะตะทะฝะตั‚. ะัั‚ะพ ัƒัั‚ะฐะปะพ ะพะฑะปะพะบะฐั‡ะธะฒะฐะตั‚ัั ะฝะฐ ะดะตั€ะตะฒะพ. ะ”ะพะถะดัŒ ะฟั€ะธัั‚ะฝะพ ะพั…ะปะฐะถะดะฐะตั‚ ั€ะฐะทะณะพั€ัั‡ะธะฒัˆะตะตัั ั‚ะตะปะพ. - ะฏ ะฝะต ั€ะฐะทั€ะตัˆะฐะป ั‚ะตะฑะต ัƒะผะธั€ะฐั‚ัŒ... ะ˜ ะพะฝ ะณะพั‚ะพะฒ ะฟะพัะฟะพั€ะธั‚ัŒ, ั‡ั‚ะพ ะพะฝะฐ ัะตะนั‡ะฐั ัƒะปั‹ะฑะฐะตั‚ัั. ะ ะตะนะดะถะธ ัะฐะดะธั‚ัั ะฝะฐ ัะบะฐะผะตะนะบัƒ. ะ›ัƒะฝัƒ ัะบั€ั‹ะปะธ ั‚ัƒั‡ะธ - ะพะฝ ัƒะฒะตั€ะตะฝ, ั‡ั‚ะพ ัะบะพั€ะพ ะฟะพะนะดั‘ั‚ ะดะพะถะดัŒ. ะ’ะฐะผะฟะธั€ ัะปะตะณะฐะฝั‚ะฝั‹ะผ ะดะฒะธะถะตะฝะธะตะผ ะฟะพะฟั€ะฐะฒะปัะตั‚ ะพั‡ะบะธ. ะ˜ ะฟะพั‡ะตะผัƒ ะตะผัƒ ะทะฐั…ะพั‚ะตะปะพััŒ ะฟั€ะธะดั‚ะธ ััŽะดะฐ ะธะผะตะฝะฝะพ ัะตะนั‡ะฐั?... ะงัƒั‚ัŒั‘ ะฒะฐะผะฟะธั€ะฐ ะฝะต ะพะฑะผะฐะฝัƒะปะพ. ะะฐ ะบะฐะผะตะฝะฝั‹ะน ะฟะพั€ั‚ั€ะตั‚ ะฟะฐะดะฐะตั‚ ะฝะตัะบะพะปัŒะบะพ ะบะฐะฟะตะปัŒ, ะฐ ั‡ะตั€ะตะท ะฝะตัะบะพะปัŒะบะพ ัะตะบัƒะฝะด ะดะพะถะดัŒ ัƒะถะต ะปัŒั‘ั‚ ัั‚ะตะฝะพะน. ะ ะตะนะดะถะธ ั‚ะฐะบ ะธ ะฝะต ะดะฒะธะฝัƒะปัั ั ะผะตัั‚ะฐ, ะฝะตัะผะพั‚ั€ั ะดะฐะถะต ะฝะฐ ะฝะฐัั‚ะพะนั‡ะธะฒะพะต ะผััƒะบะฐะฝัŒะต ะทะฐ ัะฟะธะฝะพะน. ะ’ะธะดะธะผะพ, ะฝะต ะฒั‹ะดะตั€ะถะฐะฒ, ะฝะฐ ัะบะฐะผะตะนะบัƒ ะทะฐะฟั€ั‹ะณะธะฒะฐะตั‚ ะฝะตะฑะพะปัŒัˆะฐั ะบะพัˆะบะฐ. ะกะธั€ะตะฝะตะฒั‹ะต ะณะปะฐะทะฐ ั‡ัƒั‚ัŒ ัะฒะตั‚ัั‚ัั, ะฐ ะฝะฐัะบะฒะพะทัŒ ะผะพะบั€ะฐั ะฑะตะปะฐั ัˆั‘ั€ัั‚ะบะฐ ะฑะพะปัŒัˆะต ะฟะพั…ะพะถะฐ ะฝะฐ ะฟะพะปะพะฒัƒัŽ ั‚ั€ัะฟะบัƒ. - ะ—ะฝะฐะตัˆัŒ... ะ’ ัˆัƒะผะต ะดะพะถะดั ะณะพะปะพั ะฟะฐั€ะฝั ะตะดะฒะฐ ั€ะฐะทะปะธั‡ะธะผ, ะฝะพ ะบะพัˆะบะฐ ะปะธัˆัŒ ะฝะฐะบะปะพะฝัะตั‚ ะณะพะปะพะฒัƒ ะฝะฐะฑะพะบ. - ะญั‚ะพ ะบั€ะฐะนะฝะต ะฝะต ะฒะตะถะปะธะฒะพ. ะ˜ะท-ะทะฐ ั‚ะตะฑั ั ั‚ัƒั‚ ะฟั€ะพะผะพะบ ะดะพ ะฝะธั‚ะบะธ. 
ะะฐ ัˆะตะต ะบะพัˆะบะธ ะตะดะฒะฐ ั€ะฐะทะปะธั‡ะธะผะพ ะฟะพะฑะปั‘ัะบะธะฒะฐะตั‚ ะผะธะฝะธะฐั‚ัŽั€ะฝั‹ะน ะฝะฐั‚ะตะปัŒะฝั‹ะน ะบั€ะตัั‚ะธะบ ะฝะฐ ัะตั€ะตะฑั€ัะฝะพะน ั†ะตะฟะพั‡ะบะต... ะะฐะฒะตั€ะฝะพะต, ะฟะพะบะฐะทะฐะปะพััŒ. - ะ ะฐะนั‚ะพ, ะฐ ั‚ั‹ ะฒัะตะณะดะฐ ะฝะพัะธัˆัŒ ัั‚ัƒ ัˆะปัะฟัƒ? ะ’ะฐะผะฟะธั€ ัƒัะผะตั…ะฐะตั‚ัั. ะžะฝ ัะฐะผ ะฝะต ะทะฝะฐะตั‚, ะบะพะณะดะฐ ะธ ะทะฐั‡ะตะผ ะพะฝ ะฝะฐั‡ะฐะป ะฝะพัะธั‚ัŒ ัั‚ัƒ ัˆะปัะฟัƒ. ะะพ ะดะตะฒัƒัˆะบะฐ ะดะตะนัั‚ะฒะธั‚ะตะปัŒะฝะพ ะทะฐะผะตั‚ะธะปะฐ - ะฟั€ะธ ะฝะตะน ะพะฝ ะฒัะตะณะดะฐ ะฑั‹ะป ะฒ ัˆะปัะฟะต. ะ‘ั‹ะป... ะ ะฐะนั‚ะพ ะฟั€ะพะฒั‘ะป ะฟะฐะปัŒั†ะฐะผะธ ะฟะพ ั‚ั€ะตั‰ะธะฝะต, ั€ะฐะทะดะตะปััŽั‰ะตะน ะบะฐะผะตะฝัŒ ะฝะฐ ะดะฒะต ั‡ะฐัั‚ะธ. ะŸะฐั€ะตะฝัŒ ั‚ัƒั‚ ะถะต ั€ะฐัะฟะพะทะฝะฐะป ะทะฐะฟะฐั… ะฑั€ะฐั‚ะฐ. ะŸั€ะธะบัƒัะธะฒ ะณัƒะฑัƒ, ะพะฝ ะฒะฝะพะฒัŒ ะฟั€ะธัะตะป ะฝะฐ ัะบะฐะผะตะนะบัƒ. ะžะฝ ัƒะถะต ะทะฝะฐะป, ะบะพะผัƒ ัะตะณะพะดะฝั ะฒะปะตั‚ะธั‚ ะฟะพ ะฟะพะปะฝะพะน ะฟั€ะพะณั€ะฐะผะผะต. - ะกั‡ะธั‚ะฐะน ัะตะฑั ะธะทะฑั€ะฐะฝะฝะพะน, ะผะฐะปะตะฝัŒะบะฐั ัั‚ะตั€ะฒะพั‡ะบะฐ. ะงั‘ั€ะฝะฐั ัˆะปัะฟะฐ ั ะบั€ะฐัะฝะพะน ะปะตะฝั‚ะพะน ะปะพะถะธั‚ัั ะฝะฐ ะทะฐะผั‹ัะปะพะฒะฐั‚ั‹ะต ัƒะทะพั€ั‹ ะบะฐะผะฝั. ะ’ะฐะผะฟะธั€ ัƒัะผะตั…ะฐะตั‚ัั ะธ ะฒัั‚ะฐั‘ั‚. ะžะฑะพัั‚ั€ั‘ะฝะฝะพะต ะพะฑะพะฝัะฝะธะต ั‚ัƒั‚ ะถะต ัƒะปะฐะฒะปะธะฒะฐะตั‚ ะทะฝะฐะบะพะผั‹ะน ะทะฐะฟะฐั…. ะฃัะผะตัˆะบะฐ ั‚ัƒั‚ ะถะต ะฟะตั€ะตั€ะฐัั‚ะฐะตั‚ ะฒ ัˆะธั€ะพะบัƒัŽ ะดะพะฒะพะปัŒะฝัƒัŽ ัƒะปั‹ะฑะบัƒ. ะ—ะตะปั‘ะฝั‹ะต ะณะปะฐะทะฐ ั…ะธั‚ั€ะพ ะฟั€ะธั‰ัƒั€ะธะฒะฐัŽั‚ัั. - ะœั‹ ะตั‰ั‘ ะฒัั‚ั€ะตั‚ะธะผัั, ะผะฐะปะตะฝัŒะบะฐั ัั‚ะตั€ะฒะพั‡ะบะฐ. ะ—ะฐะฟะฐั… ะบะปัƒะฑะฝะธะบะธ ั ะฟั€ะธะผะตััŒัŽ ะผะตั‚ะฐะปะปะฐ. ะšะฐะฝะฐั‚ะพ ะฒะฝะพะฒัŒ ะธ ะฒะฝะพะฒัŒ ะฒะณะปัะดั‹ะฒะฐะตั‚ัั ะฒ ั‚ะฐะบ ะฟะพะปัŽะฑะธะฒัˆะธะตัั ะตะผัƒ ั‡ะตั€ั‚ั‹. ะะตัะผะพั‚ั€ั ะฝะฐ ะฝะพะฒะธะทะฝัƒ, ะธะทะพะฑั€ะฐะถะตะฝะธะต ะฝะฐ ะฝะฐะดะณั€ะพะฑะฝะพะผ ะบะฐะผะฝะต ัƒะถะต ัะปะตะณะบะฐ ัั‚ั‘ั€ะปะพััŒ. 
ะขะพะฝะบะธะต ะฟะฐะปัŒั†ั‹ ััƒะดะพั€ะพะถะฝะพ ัะถะธะผะฐัŽั‚ ะฟะปัŽัˆะตะฒะพะณะพ ะผะตะดะฒะตะถะพะฝะบะฐ, ะฟะตั€ะตะฑะธั€ะฐั ะบะพั€ะพั‚ะบัƒัŽ ะธัะบัƒััั‚ะฒะตะฝะฝัƒัŽ ัˆะตั€ัั‚ัŒ. ะŸะฐั€ะตะฝัŒ ะดะพ ัะธั… ะฟะพั€ ะฝะต ะฟะพะฝัะป, ะบะฐะบ ัั‚ะพ ะผะพะณะปะพ ัะปัƒั‡ะธั‚ัŒัั. ะžะฝ ะพั‚ั‡ั‘ั‚ะปะธะฒะพ ะฟะพะผะฝะธะป, ะบะฐะบ ะฒะพัˆั‘ะป ะฒ ะตั‘ ะบะพะผะฝะฐั‚ัƒ, ะฝะฐะผะตั€ะตะฒะฐัััŒ ะธัะฟัƒะณะฐั‚ัŒ ะดะตะฒัƒัˆะบัƒ. ะžะฝ ะพั‚ั‡ั‘ั‚ะปะธะฒะพ ะฟะพะผะฝะธะป ั‚ัƒ ะฑะตะทะผัั‚ะตะถะฝะพัั‚ัŒ, ั‡ั‚ะพ ะทะฐัั‚ั‹ะปะฐ ะฝะฐ ะปะธั†ะต ะฎะธ. ะžะฝ ะพั‚ั‡ั‘ั‚ะปะธะฒะพ ะฟะพะผะฝะธะป ั‚ะพั‚ ะฝะตะธัั‚ะพะฒั‹ะน ั…ะพะปะพะด, ะธัั…ะพะดัั‰ะธะน ะพั‚ ะบะพั‡ะตะฝะตัŽั‰ะตะณะพ ั‚ะตะปะฐ ะดะตะฒัƒัˆะบะธ. ะกะปั‘ะทั‹ ะบะฐั‚ัั‚ัั ะธะท ะณะปะฐะท, ะพัั‚ะฐะฒะปัั ะทะฐ ัะพะฑะพะน ะผะพะบั€ั‹ะต ะดะพั€ะพะถะบะธ. ะžะฝ ั€ะตะดะบะพ ะฟะปะฐะบะฐะป. ะฃะถ ะปัƒั‡ัˆะต ัะผะตัั‚ัŒัั, ะฝะต ะฟั€ะฐะฒะดะฐ ะปะธ? - ะ—ะฝะฐะตัˆัŒ, ั‚ั‹ ะฟั€ะฐะฒะฐ. ะฃะณะพะปะบะธ ะณัƒะฑ ะฟะฐั€ะฝั ั‡ัƒั‚ัŒ ะฟั€ะธะฟะพะดะฝะธะผะฐัŽั‚ัั. ะกะปั‘ะทั‹ ะฝะฐั‡ะธะฝะฐัŽั‚ ั‚ะตั‡ัŒ ั ะฝะพะฒะพะน ัะธะปะพะน. ะฃะปั‹ะฑะบะฐ ัั‚ะฐะฝะพะฒะธั‚ัั ะตั‰ั‘ ัˆะธั€ะต, ะพะฑะฝะฐะถะฐั ะฑะตะปะพัะฝะตะถะฝั‹ะต ะบะปั‹ะบะธ. - ะกะผะตั… ะฟั€ะพะดะปะตะฒะฐะตั‚ ะถะธะทะฝัŒ, ะฒะตะดัŒ ั‚ะฐะบ? ะ˜ ะพะฝ ัะผะตั‘ั‚ัั. ะะตัะผะพั‚ั€ั ะฝะฐ ั‚ะพ, ั‡ั‚ะพ ัƒะถะต ะทะฐะดั‹ั…ะฐะตั‚ัั ะพั‚ ั€ั‹ะดะฐะฝะธะน. ะจัƒ ั‡ัƒั‚ัŒ ั‰ัƒั€ะธั‚ัั ะธ ะพั‚ะฒะพั€ะฐั‡ะธะฒะฐะตั‚ัั ะพั‚ ั„ะพะฝะฐั€ั, ะฟะพ ะตะณะพ ะผะฝะตะฝะธัŽ ั‚ะฐะบ ะฝะตัƒะผะตัั‚ะฝะพ ั€ะฐัะฟะพะปะพะถะตะฝะฝะพะผัƒ ะทะดะตััŒ ะฟะพ ะตะณะพ ะถะต ะฟั€ะพััŒะฑะต. ะ˜ ะพ ั‡ั‘ะผ ะพะฝ ั‚ะพะณะดะฐ ะดัƒะผะฐะป?! ะั… ะดะฐ, ะพ ะฎะธ... - ะ—ะฝะฐะตัˆัŒ... ะฏ ะฒะพะฒัะต ะฝะต ะฑะพัŽััŒ ั‚ะตะผะฝะพั‚ั‹, ะจัƒ. ะŸั€ะพัั‚ะพ ั ั…ะพั‡ัƒ ะฒะธะดะตั‚ัŒ ะปะธั†ะพ ั‚ะพะณะพ, ะบั‚ะพ ัะบั€ั‹ะฒะฐะตั‚ัั ะฒ ัั‚ะพะน ั‚ัŒะผะต. ะ’ะฐะผะฟะธั€ ะตะปะต ัะปั‹ัˆะฝะพ ะฒะทะดั‹ั…ะฐะตั‚. ะคะพะฝะฐั€ัŒ ะฟะพะปะฝะพัั‚ัŒัŽ ะพัะฒะตั‰ะฐะตั‚ ะตะณะพ ั„ะธะณัƒั€ัƒ, ั‡ั‚ะพ ะฝะตะผะฝะพะณะพ ั€ะฐะทะดั€ะฐะถะฐะตั‚. 
ะ•ะผัƒ ะฝะตะปะพะฒะบะพ ะฟั€ะพัั‚ะพ ัะธะดะตั‚ัŒ ะฝะฐ ะผะพะณะธะปะต ะบะพะณะดะฐ-ั‚ะพ ะฝะฐัั‚ะพะปัŒะบะพ ะทะฐะธะฝั‚ะตั€ะตัะพะฒะฐะฒัˆะตะน ะตะณะพ ะดะตะฒัƒัˆะบะต. ะจัƒ ะถัƒั‚ะบะพ ั…ะพั‡ะตั‚ัั ัะฟะฐั‚ัŒ, ะฝะพ ะฒะฝะพะฒัŒ ัƒัะปั‹ัˆะฐั‚ัŒ ะตั‘ ะณะพะปะพั ั…ะพั‡ะตั‚ัั ะตั‰ั‘ ะฑะพะปัŒัˆะต. ะŸะฐั€ะตะฝัŒ ัะผะพั‚ั€ะธั‚ ะฟั€ัะผะพ ะฒ ะณะปะฐะทะฐ ะฟะพั€ั‚ั€ะตั‚ะฐ ะธะท-ะฟะพะด ะพะฟัƒั‰ะตะฝะฝั‹ั… ั€ะตัะฝะธั†. ะ•ะผัƒ ะบะฐะถะตั‚ัั, ั‡ั‚ะพ ะพะฝ ัะปั‹ัˆะธั‚ ะตั‘ ั‚ะธั…ะธะน ัะผัƒั‰ั‘ะฝะฝั‹ะน ะณะพะปะพั. ะ”ะฒะฐ ะฟะฐะปัŒั†ะฐ ะปะพะถะฐั‚ัั ะฝะฐ ะณะปะฐะทะฐ ะฟะพั€ั‚ั€ะตั‚ะฐ ะฎะธ, ั…ะพั‚ัŒ ะธ ะฝะต ะทะฐะบั€ั‹ะฒะฐั, ะฝะพ ั…ะพั‚ั ะฑั‹ ะฟะตั€ะตะบั€ั‹ะฒะฐั ะธะผ ะพะฑะทะพั€ ะฝะฐ ะฒะฐะผะฟะธั€ะฐ. ะ›ั‘ะณะบะฐั, ะดะฐะถะต ะฝะตะผะฝะพะณะพ ะณั€ัƒัั‚ะฝะฐั ัƒัะผะตัˆะบะฐ ะพั‚ั€ะฐะถะฐะตั‚ัั ะฝะฐ ะปะธั†ะต ะจัƒ. - ะ—ะฐะบั€ะพะน ัะฒะพะธ ะณะปะฐะทะฐ, ะฟะพะถะฐะปัƒะนัั‚ะฐ. ะ•ะผัƒ ะฟะพะบะฐะทะฐะปะพััŒ, ะธะปะธ ะฎะธ ะดะตะนัั‚ะฒะธั‚ะตะปัŒะฝะพ ะตะผัƒ ัƒะปั‹ะฑะฝัƒะปะฐััŒ?.. - ะ˜ ะฝะต ะพะฟั€ะฐะฒะดั‹ะฒะฐะนัั ะฟะตั€ะตะดะพ ะผะฝะพะน. ะกัƒะฑะฐั€ัƒ ั„ั‹ั€ะบะฐะตั‚, ัะผะพั‚ั€ั ะฝะฐ ั‡ั‘ั€ะฝะพ-ะฑะตะปัƒัŽ ั„ะพั‚ะพะณั€ะฐั„ะธัŽ ะฎะธ. ะŸะฐั€ะตะฝัŒ, ั‚ะธั…ะพ ั€ั‹ั‡ะฐ, ะบะธะดะฐะตั‚ ะฝะพะถ ะฝะฐ ะผะพะณะธะปัŒะฝัƒัŽ ะฟะปะธั‚ัƒ. ะžะฝ ั‚ัƒั‚ ะถะต ะฒั‚ั‹ะบะฐะตั‚ัั ะฒ ะฝะตั‘, ะพัั‚ะฐะฒะปัั ะฒะพะบั€ัƒะณ ัะตะฑั ะฟะฐัƒั‚ะธะฝัƒ ะผะตะปะบะธั… ั‚ั€ะตั‰ะธะฝ. ะกัƒะฑะฐั€ัƒ, ัะถะฐะฒ ะบัƒะปะฐะบะธ, ะฟั€ะธัะตะดะฐะตั‚ ะฝะฐะฟั€ะพั‚ะธะฒ ะฟะพั€ั‚ั€ะตั‚ะฐ ะธ ะดะพะปะณะพ ะฒัะผะฐั‚ั€ะธะฒะฐะตั‚ัั ะฒ ะทะฝะฐะบะพะผั‹ะต ั‡ะตั€ั‚ั‹ ะปะธั†ะฐ. - ะขั‹... ะœะฝะต ะบะฐะถะตั‚ัั, ั‚ั‹ ะฝะต ั‚ะฐะบะพะน, ะบะฐะบ ะพะฝะธ. ะŸะฐั€ะตะฝัŒ ัะบะฐะปะธั‚ัั ะธ ัƒะถะต ะฑะพะปะตะต ะฒะพะปัŒะฝะพ ั€ะฐัะฟะพะปะฐะณะฐะตั‚ัั ะฝะฐะฟั€ะพั‚ะธะฒ ะบะฐะผะฝั. - ะ ะฒะตะดัŒ ั‚ั‹ ะผะฝะต ะพะฑะตั‰ะฐะปะฐ, ะฟะพะผะฝะธัˆัŒ? ะขั‹ ะพะฑะตั‰ะฐะปะฐ, ั‡ั‚ะพ ัƒะฑัŒั‘ัˆัŒ ะผะตะฝั. ะ˜ ั‡ั‚ะพ ั‚ะตะฟะตั€ัŒ?.. ะŸะตั€ะฒั‹ะต ะบะฐะฟะปะธ ะดะพะถะดั ัƒะฟะฐะปะธ ะฝะฐ ั‰ั‘ะบัƒ ะฎะธ. 
ะšะฐะบะพะผัƒ-ะฝะธะฑัƒะดัŒ ัะพะฟะปะธะฒะพะผัƒ ั€ะพะผะฐะฝั‚ะธะบัƒ ะฟะพะบะฐะถะตั‚ัั, ั‡ั‚ะพ ัั‚ะพ ะฝะฐะฟะพะผะธะฝะฐะตั‚ ะตั‘ ัะปั‘ะทั‹. ะ’ ะพั‚ะฒะตั‚ ะฝะฐ ัั‚ะพ ะกัƒะฑะฐั€ัƒ ะฒะฟะพะปะฝะต ะผะพะถะตั‚ ั€ะฐััะผะตัั‚ัŒัั ัั‚ะพะผัƒ ั‡ะตะปะพะฒะตะบัƒ ะฒ ะปะธั†ะพ. ะฃะถ ะพะฝ-ั‚ะพ ะทะฝะฐะตั‚, ั‡ั‚ะพ ะตั‘ ัะปั‘ะทั‹ ะฝะต ั‚ะฐะบะธะต. ะ•ั‘ ัะปั‘ะทั‹ ะฒัะตะณะดะฐ ะฒะฝัƒั‚ั€ะธ. ะ‘ะตะปะพะฒะพะปะพัั‹ะน ัƒะปั‹ะฑะฐะตั‚ัั. ะžะฝ ัƒะฒะตั€ะตะฝ - ะพะฝะฐ ะตะณะพ ัะปั‹ัˆะธั‚. ะะตัะบะพะปัŒะบะพ ะฟะฐั€ะฝะตะน ัั‚ะพัั‚ ะพะบะพะปะพ ัะฒะตะถะตะน ะผะพะณะธะปั‹, ะฝะต ั€ะตัˆะฐัััŒ ะฟั€ะพั€ะพะฝะธั‚ัŒ ะธ ัะปะพะฒะฐ. ะŸะพ ั€ะฐะทะฝะพั†ะฒะตั‚ะฝั‹ะผ ะทะพะฝั‚ะฐะผ ะฑะฐั€ะฐะฑะฐะฝะธั‚ ะดะพะถะดัŒ. ะžะดะธะฝ ะธะท ะฒะฐะผะฟะธั€ะพะฒ ะฝะต ะฒั‹ะดะตั€ะถะธะฒะฐะตั‚ ะธ ะดะตะปะฐะตั‚ ัˆะฐะณ ะบ ะผะพะณะธะปะต. - ะžะฝะฐ... ะฃะผะตั€ะปะฐ ะฝะฐะฒัะตะณะดะฐ, ะดะฐ? - ะณะพะปะพั ะšะฐะฝะฐั‚ะพ ะดั€ะพะถะธั‚. ะŸะฐั€ะตะฝัŒ, ะฝะต ะฟะพะปัƒั‡ะธะฒ ะพั‚ะฒะตั‚ะฐ ะฝะฐ ัะฒะพะน ะฒะพะฟั€ะพั, ัะธะปัŒะฝะตะต ะฟั€ะธะถะธะผะฐะตั‚ ะบ ัะตะฑะต ัะฒะพะตะณะพ ะฟะปัŽัˆะตะฒะพะณะพ ะผะธัˆะบัƒ. ะžะฝ ะพะฑะฒะพะดะธั‚ ะฒะทะณะปัะดะพะผ ะฑั€ะฐั‚ัŒะตะฒ ะธ ะดะตะปะฐะตั‚ ะตั‰ั‘ ะพะดะฝัƒ ะฟะพะฟั‹ั‚ะบัƒ ั€ะฐะทั€ัƒัˆะธั‚ัŒ ัั‚ัƒ ะผั‘ั€ั‚ะฒัƒัŽ ั‚ะธัˆะธะฝัƒ. - ะะพ ะฒะตะดัŒ... ะ ะตะนะดะทะธ, ั‚ั‹ ะถะต ะบะพะณะดะฐ-ั‚ะพ ัะฟะฐัะฐะป ะฎะธ! ะ ะตะนะดะทะธ ะบะฐั‡ะฐะตั‚ ะณะพะปะพะฒะพะน ะธ ะฟะพะฟั€ะฐะฒะปัะตั‚ ะพั‡ะบะธ. ะะฐ ัะผะตะฝัƒ ะฒัะตะผ ะฝะตะดะฐะฒะฝะธะผ ั‡ัƒะฒัั‚ะฒะฐะผ ะฟั€ะธัˆะปะฐ ัƒะถะต ะทะฝะฐะบะพะผะฐั, ะฝะพ ั‚ะตะผ ะฝะต ะผะตะฝะตะต ะฝะตะฝะฐะฒะธัั‚ะฝะฐั ะฐะฟะฐั‚ะธั. ะšะฐะฝะฐั‚ะพ ะฑะปะฐะณะพั€ะฐะทัƒะผะฝะพ ะทะฐะผะพะปะบะฐะตั‚ ะธ ัะถะธะผะฐะตั‚ ะผัะณะบัƒัŽ ะปะฐะฟัƒ ะขะตะดะดะธ. ะ’ ะณะพะปะพะฒัƒ ะฒะปะตั‚ะฐะตั‚ ั€ะฐััะตัะฝะฝะฐั ะผั‹ัะปัŒ ะพ ั‚ะพะผ, ั‡ั‚ะพ ะฎะธ ะฒัะตะณะดะฐ ะณะพะฒะพั€ะธะปะฐ ั ะผะตะดะฒะตะดะตะผ, ะบะฐะบ ัะพ ัั‚ะฐั€ั‹ะผ ะทะฝะฐะบะพะผั‹ะผ. ะŸัƒัั‚ัŒ ะบะพะณะดะฐ-ั‚ะพ ัั‚ะพ ะธ ั€ะฐะทะดั€ะฐะถะฐะปะพ, ะฝะพ ัะตะนั‡ะฐั ะพะฝ ะฑั‹ะป ะณะพั‚ะพะฒ ะฟั€ะพัั‚ะธั‚ัŒ ะฎะธ ะธ ัั‚ะพ. ะะตะฑะพ ะฟั€ะพัะฒะตั‚ะปะตะปะพ. 
ะ ะฐะนั‚ะพ, ะพั‚ั€ะตะฐะณะธั€ะพะฒะฐะฒ ะฝะฐ ัั‚ะพ ะฟะตั€ะฒั‹ะผ, ัะปะพะถะธะป ะทะพะฝั‚ ะธ ะธะณั€ะธะฒะพ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะฑะตะปัƒัŽ ะบะพัˆะบัƒ, ัะธะดัั‰ัƒัŽ ะฝะฐ ะผะพะณะธะปะต. ะ ะตะนะดะทะธ, ะฟั€ะพัะปะตะดะธะฒ ะทะฐ ะฒะทะณะปัะดะพะผ ะฑั€ะฐั‚ะฐ, ะฟะพะฝัั‚ะปะธะฒะพ ั…ะผั‹ะบะฝัƒะป. - ะกะปัƒัˆะฐะนั‚ะต, ะฐ ะฒั‹ ะฒะตั€ะธั‚ะต ะฒ ะฟะตั€ะตัะตะปะตะฝะธะต ะดัƒัˆ? - ะณะพะปะพั ะ ะฐะนั‚ะพ ะทะฒัƒั‡ะธั‚ ะฝะตะฟั€ะธะฒั‹ั‡ะฝะพ ะณั€ะพะผะบะพ. ะัั‚ะพ ัƒัะผะตั…ะฐะตั‚ัั ะธ ัะผะพั‚ั€ะธั‚ ะฝะฐ ะฝะฐะดะฟะธััŒ, ะบะพั‚ะพั€ัƒัŽ ะดะพ ัั‚ะพะณะพ ะทะฐะณะพั€ะฐะถะธะฒะฐะปะฐ ะบะพัˆะบะฐ. ะ˜ ะตะผัƒ ะฟะพั‡ะตะผัƒ-ั‚ะพ ะบะฐะถะตั‚ัั, ั‡ั‚ะพ ะพะฝะธ ะฝะตะผะฝะพะณะพ ะฟะพัะฟะตัˆะธะปะธ. "ะ—ะฐั‚ะบะฝะธััŒ ะธ ัะฟะธ."'
  - '-ะœัƒะบัƒั€ะพ, ัะผะพั‚ั€ะธ, ัะผะพั‚ั€ะธ - ัั‚ะพ ะพะฑะปะฐั‡ะบะพ ะฟะพั…ะพะถะต ะฝะฐ ะฑะฐั€ะฐัˆะบะฐ. ะ˜ะปะปัŽะทะธะพะฝะธัั‚ ะพั‚ะบั€ั‹ะป ะณะปะฐะทะฐ, ัะพะปะฝั†ะต ะฝะฐ ะผะธะณ ะพัะปะตะฟะธะปะพ ะตะณะพ, ะฝะพ ะพะฝ ะฒัะต ัƒะถะต ัƒะผัƒะดั€ะธะปัั ั€ะฐััะผะพั‚ั€ะตั‚ัŒ ั‚ะพ ัะฐะผะพะต ะพะฑะปะฐั‡ะบะพ, ะพ ะบะพั‚ะพั€ะพะผ ะณะพะฒะพั€ะธะป ะขััƒะฝะฐ. -ะœะผ. ะกะบะพั€ะตะต ะฟะพั…ะพะถะต ะฝะฐ ะณะพั€ัƒ ัะปะฐะดะบะพะน ะฒะฐั‚ั‹, ั‡ะตะผ ะฝะฐ ะฑะฐั€ะฐัˆะบะฐ. ะŸั€ะธ ัั‚ะธั… ัะปะพะฒะฐั… ะพะฝ ัƒะปั‹ะฑะฝัƒะปัั ะกะฐะฒะฐะดะต, ะพั‚ั‡ะตะณะพ ั‚ะพั‚ ัั€ะฐะทัƒ ะถะต ะฟะพะบั€ะฐัะฝะตะป. ะ›ะตะณะบะฐั ัƒัะผะตัˆะบะฐ ัะพั€ะฒะฐะปะฐััŒ ั ะณัƒะฑ ะฅั€ะฐะฝะธั‚ะตะปั ะขัƒะผะฐะฝะฐ, ะตะผัƒ ะฒัะตะณะดะฐ ะดะพ ะฑะตะทัƒะผะธั ะฝั€ะฐะฒะธะปะพััŒ ัะผะพั‚ั€ะตั‚ัŒ ะบะฐะบ ัะผัƒั‰ะฐะตั‚ัั ะตะณะพ ะปัŽะฑะธะผั‹ะน...ะฑะพัั, ะฝะตั‚, ะปัŽะฑะธะผั‹ะน.. ะฟั€ะพัั‚ะพ ะปัŽะฑะธะผะพะต ะะตะฑัƒัˆะบะพ.ะะตัะผะพั‚ั€ั ะฝะฐ ั‚ะพ, ั‡ั‚ะพ ะขััƒะฝะฐั‘ัˆะธ ะฟะพะฒะทั€ะพัะปะตะป, ะฟะพั€ะพะน ะพะฝ ะฒะตะป ัะตะฑั ะบะฐะบ ั€ะตะฑะตะฝะพะบ. -ะ—ะฝะฐะตัˆัŒ, ะœัƒะบัƒั€ะพ,ะฐ ะผะฝะต ะบะฐะถะตั‚ัั, ั‡ั‚ะพ ั‚ั‹ ะฑัƒะดัƒั‡ะธ ะธะปะปัŽะทะธะพะฝะธัั‚ะพะผ ะพะฑะปะฐะดะฐะตัˆัŒ ะดะพะฒะพะปัŒะฝะพ ัั‚ั€ะฐะฝะฝะพะน ั„ะฐะฝั‚ะฐะทะธะตะน. ะ˜ะปะปัŽะทะธะพะฝะธัั‚ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะ”ะตััั‚ะพะณะพ.
ะ ะฐะทะฒะต ะผะพะณ ะพะฝ ั€ะฐััะบะฐะทะฐั‚ัŒ ะตะผัƒ, ะขััƒะฝะฐั‘ัˆะธ, ะพ ั‚ะตั… ั„ะฐะฝั‚ะฐะทะธัั…, ั‡ั‚ะพ ะฟะพัะตั‰ะฐะปะธ ะตะณะพ, ะพัะพะฑะตะฝะฝะพ ะฟะพ ะฝะพั‡ะฐะผ. ะšะพะณะดะฐ ะฝะฐั‡ะฐะปะธััŒ ะธั… ะพั‚ะฝะพัˆะตะฝะธั, ะœัƒะบัƒั€ะพ ัะพะณะปะฐัะธะปัั ะฝะฐ ะฝะตะบะพั‚ะพั€ั‹ะต ัƒัะปะพะฒะธั, ะบะพั‚ะพั€ั‹ะต ะฟะพัั‚ะฐะฒะธะป ะฟะตั€ะตะด ะฝะธะผ ะกะฐะฒะฐะดะฐ.ะžะฝ, ะฝะตัะผะพั‚ั€ั ะฝะฐ ัะฒะพัŽ ะณะพั€ัั‡ัƒัŽ ะธั‚ะฐะปัŒัะฝัะบัƒัŽ ะบั€ะพะฒัŒ, ัั‚ั€ะฐัั‚ัŒ, ะฟั€ะธััƒั‰ัƒัŽ ะปัŽะฑะพะผัƒ ะฟั€ะตะดัั‚ะฐะฒะธั‚ะตะปัŽ ะดะฐะฝะฝะพะน ะฝะฐั†ะธะพะฝะฐะปัŒะฝะพัั‚ะธ, ัะพะณะปะฐัะธะปัั ะฝะฐ ัั‚ะธ ัƒัะปะพะฒะธั. ะกะปะธัˆะบะพะผ ะฟะพะทะดะฝะพ ะพะฝ ะพัะพะทะฝะฐะป, ั‡ั‚ะพ ะตะณะพ ะปัŽะฑะธะผั‹ะน ั‚ะฐะบะพะน ัั‚ะตัะฝะธั‚ะตะปัŒะฝั‹ะน, ั‡ั‚ะพ ะดะฐะถะต ะฟะพั€ะพะน ัะพั€ะฒะฐั‚ัŒ ั ะตะณะพ ะณัƒะฑ ะฟะพั†ะตะปัƒะน - ั‚ะฐะบะฐั ะฟั€ะพะฑะปะตะผะฐ. ะะตัะผะพั‚ั€ั ะฝะฐ ั‚ะพ, ั‡ั‚ะพ ะพะฝะธ ะถะธะปะธ ะฒะผะตัั‚ะต ัƒะถะต ะฒั‚ะพั€ะพะน ะณะพะด, ะกะฐะฒะฐะดะฐ ะพัั‚ะฐะฒะฐะปัั...ะดะตะฒัั‚ะฒะตะฝะฝะธะบะพะผ.ะ˜ ัะฝะพะฒะฐ ะฝะฐ ะณัƒะฑะฐั… ะ ะพะบัƒะดะพ ะพั‚ั€ะฐะทะธะปะฐััŒ ัั‚ั€ะฐะฝะฝะฐั ะทะฐะดัƒะผั‡ะธะฒะฐั ัƒะปั‹ะฑะบะฐ. ะšะพะณะดะฐ ะพะฝ ั‚ะฐะบ ัะธะปัŒะฝะพ ะธะทะผะตะฝะธะปัั, ะบะพะณะดะฐ ัั‚ะฐะป ะธะทะผะตะฝัั‚ัŒ ัะฒะพะธะผ ะฟั€ะธะฝั†ะธะฟะฐะผ? ะ ะฐะฝัŒัˆะต, ะพะฝ ะฑั‹ ะฝะต ะทะฐะดัƒะผั‹ะฒะฐัััŒ ะฟั€ะพัั‚ะพ ะฒะทัะป ะฑั‹ะป ะฑั‹ ะ’ะพะฝะณะพะปัƒ, ะฝะต ัะฟั€ะฐัˆะธะฒะฐั ั‚ะพะณะพ - ั…ะพั‡ะตั‚ ะพะฝ ัั‚ะพะณะพ ะธะปะธ ะฝะตั‚. ะžะดะฝะฐะบะพ ะผะฝะพะณะพะต ะฟะพะผะตะฝัะปะพััŒ, ะฝะฐัะธะปะธะต ะฒ ะพั‚ะฝะพัˆะตะฝะธะธ ัั‚ะพะณะพ ะฟะฐั€ะฝั ะพะฝ ะพั‚ะผะตะป ัั€ะฐะทัƒ ะถะต. ะžะฝ ะฒะตะดัŒ ะปัŽะฑะธั‚ .ะกะฐะฒะฐะดะฐ ะปัŽะฑะธั‚ ะตะณะพ, ะฝะพ ะฟะพั‡ะตะผัƒ ะธ ะพั‚ ั‡ะตะณะพ? ะŸั€ะพะดะพะปะถะฐั ัะผะพั‚ั€ะตั‚ัŒ ะฝะฐ ะขััƒะฝัƒ, ะพะฝ ะทะฐะดะฐะฒะฐะป ัะตะฑะต ัั‚ะธ ะฒะพะฟั€ะพัั‹ ัƒะถะต, ะฝะฐะฒะตั€ะฝะพะต, ะฒ ั‚ั‹ััั‡ะฝั‹ะน ั€ะฐะท. ะŸะพั‡ะตะผัƒ ะพะฝ ั‚ะฐะบ ั€ะฐะด ั‚ะพะผัƒ, ั‡ั‚ะพ ะผะพะถะตั‚ ะฒะพั‚ ั‚ะฐะบ ัะฟะพะบะพะนะฝะพ ัะธะดะตั‚ัŒ ะทะดะตััŒ, ะฒ ะฟะฐั€ะบะต ะฝะฐ ะปัƒะถะฐะนะบะต ะธ ัƒะปั‹ะฑะฐั‚ัŒัั ะตะผัƒ? 
ะกะฐะฒะฐะดะฐ ะขััƒะฝะฐั‘ัˆะธ - ั‡ะตะปะพะฒะตะบ, ะบะพั‚ะพั€ั‹ะน ะธะทะผะตะฝะธะป ะตะณะพ ะธ ะผะธั€ ะฒะฝัƒั‚ั€ะธ. ะกั‚ั€ะฐะฝะฝั‹ะน ะผะฐะปัŒั‡ะธะบ, ะฑะตะท ะพัะพะฑะพะณะพ ะดะฐั€ะพะฒะฐะฝะธั, ะฝะต ัะผะตะปั‹ะน, ะฝะพ ะธ ะฝะต ั‚ั€ัƒัะปะธะฒั‹ะน .ะ ะบะพะณะดะฐ ะดะตะปะพ ะบะฐัะฐะปะพััŒ ะตะณะพ ัะตะผัŒะธ, ะดั€ัƒะทะตะน - ะพั‚ะฒะฐะถะฝะตะต ะตะณะพ ะฝะต ะฝะฐะนะดะตัˆัŒ ะฝะธะบะพะณะพ .ะžั‚ ั‡ะตะณะพ ะถะต ะพะฝ, ะ ะพะบัƒะดะพ ะœัƒะบัƒั€ะพ, ั…ะพะปะพะดะฝั‹ะน, ั†ะธะฝะธั‡ะฝั‹ะน, ะฝะตะฝะฐะฒะธะดัั‰ะธะน ะผะฐั„ะธัŽ, ัƒะฑะธะนั†ะฐ, ั‚ะฐะบ ัั‡ะฐัั‚ะปะธะฒ, ะฝะฐั…ะพะดัััŒ ั€ัะดะพะผ ั ะ”ะตััั‚ั‹ะผ ะฑะพััะพะผ ะ’ะพะฝะณะพะปั‹? ะ›ะตะณะบะพะต ะฟั€ะธะบะพัะฝะพะฒะตะฝะธะต ะบ ะตะณะพ ั€ัƒะบะต ะฒั‹ะฒะตะปะพ ะ ะพะบัƒะดะพ ะธะท ะฟะพั‚ะพะบะฐ ั€ะฐะทะผั‹ัˆะปะตะฝะธะน. -ะœัƒะบัƒั€ะพ, ั‡ั‚ะพ-ั‚ะพ ัะปัƒั‡ะธะปะพััŒ? ะ’ ะณะปะฐะทะฐั…, ัั‚ะธั… ะบะฐั€ะฐะผะตะปัŒะฝั‹ั… ะณะปะฐะทะฐั…, ะฝะตะพะถะธะดะฐะฝะฝะพ ะพั‚ั€ะฐะทะธะปะพััŒ ะฟะตั€ะตะถะธะฒะฐะฝะธะต ะธ ัั‚ั€ะฐั…. -ะ’ัะต ะฒ ะฟะพั€ัะดะบะต, ะผะธะปั‹ะน, ะฒัะต ั…ะพั€ะพัˆะพ. ะŸั€ะพัั‚ะพ ะทะฐะดัƒะผะฐะปัั. ะŸะพั‚ัะฝัƒะฒัˆะธััŒ ะบ ะกะฐะฒะฐะดะต,ะพะฝ, ะพะฑั…ะฒะฐั‚ะธะฒ ั‚ะพะณะพ ะทะฐ ั‚ะฐะปะธัŽ, ัƒัะฐะดะธะป ัะตะฑะต ะฝะฐ ะบะพะปะตะฝะธ, ะฝะตะถะฝะพ ะฟั€ะธะถะธะผะฐั ะบ ัะฒะพะตะน ะณั€ัƒะดะธ. ะŸะฐะปัŒั†ั‹ ะปะฐัะบะพะฒะพ ะฟะพะณะปะฐะถะธะฒะฐะปะธ ะฒะพะปะพัั‹, ะณัƒะฑั‹ ั‚ั€ะตะฟะตั‚ะฝะพ ั†ะตะปะพะฒะฐะปะธ, ะทะฐัั‚ะฐะฒะปัั ะขััƒะฝะฐั‘ัˆะธ ัะผัƒั‰ะฐั‚ัŒัั ะตั‰ะต ะฑะพะปัŒัˆะต. ะŸะพะดะฐั‚ะปะธะฒะพะต, ั€ะฐะทะณะพั€ัั‡ะตะฝะฝะพะต ั‚ะตะปะพ ะฟะฐั€ะฝั ัะฝะพัะธะปะพ ะบั€ั‹ัˆัƒ ัƒ ะœัƒะบัƒั€ะพ ะธ, ะบะพะณะดะฐ ะปะตะณะบะธะน ัั‚ะพะฝ ะฒั‹ั€ะฒะฐะปัั ะธะท ะณั€ัƒะดะธ ะปัŽะฑะธะผะพะณะพ, ะธะปะปัŽะทะธะพะฝะธัั‚ ัะบะพะปัŒะทะฝัƒะป ะปะฐะดะพะฝัะผะธ ะฟะพะด ั€ัƒะฑะฐัˆะบัƒ, ะฟะพะณะปะฐะถะธะฒะฐั ัะฟะธะฝัƒ, ะทะฐัั‚ะฐะฒะปัั ะฒั‹ะณะธะฑะฐั‚ัŒัั, ัั‚ะพะฝะฐั‚ัŒ ะธ ัะธะปัŒะฝะตะต ะฟั€ะธะถะธะผะฐั‚ัŒัั. ะ“ัƒะฑะฐะผะธ ะฟั€ะพะฒะตะป ะฒะปะฐะถะฝัƒัŽ ะดะพั€ะพะถะบัƒ ะฟะพ ัˆะตะต ะบ ัƒัˆะบัƒ. -ะœัƒ-ะบัƒ-ั€ะพ.. 
ะขััƒะฝะฐ ะดั€ะพะถะฐะป ะฒัะตะผ ั‚ะตะปะพะผ ะพั‚ ะฒะพะทะฑัƒะถะดะตะฝะธั ะบะฐะถะดั‹ะน ั€ะฐะท, ะบะพะณะดะฐ ะœัƒะบัƒั€ะพ ะปะฐัะบะฐะป ะตะณะพ, ะตะผัƒ ั…ะพั‚ะตะปะพััŒ, ั‡ั‚ะพะฑั‹ ั‚ะพั‚ ะฝะธ ะทะฐ ั‡ั‚ะพ ะธ ะฝะธะบะพะณะดะฐ ะฝะต ะพัั‚ะฐะฝะฐะฒะปะธะฒะฐะปัั. -ะะต ะพัั‚ะฐะฝะฐะฒะปะธะฒะฐะนัั, ะœัƒะบัƒั€ะพ. ะžั‚ ัั‚ะธั… ัะปะพะฒ ั€ัƒะบะธ ะœัƒะบัƒั€ะพ ะทะฐะผะตั€ะปะธ, ะพะฝ ะฒะตะดัŒ ะดะฐะฒะฝะพ ะผะตั‡ั‚ะฐะป, ั‡ั‚ะพะฑั‹ ะตะณะพ ะปัŽะฑะธะผั‹ะน ัะบะฐะทะฐะป ะตะผัƒ ัั‚ะพ.ะ ัะตะนั‡ะฐั ะพะฝ ะฒะทะณะปัะฝัƒะป ะฒ ะณะปะฐะทะฐ ะขััƒะฝะฐั‘ัˆะธ, ะฟะฐะปัŒั†ะตะผ ะฟั€ะพะฒะตะป ะฟะพ ั‰ะตะบะต ะธ, ะฝะตะพะถะธะดะฐะฝะฝะพ ะดะปั ัะตะฑั, ะฟั€ะธะถะฐะป ะตะณะพ ะบ ัะตะฑะต ัะพ ะฒัะตะน ะฝะตะถะฝะพัั‚ัŒัŽ. ะšะฐะบ ะถะต ั…ะพั‚ะตะปะพััŒ ะฒะพั‚ ั‚ะฐะบ ัะธะดะตั‚ัŒ ั ะฝะธะผ, ะฟั€ะธะถะธะผะฐัััŒ, ัะปัƒัˆะฐั ะฑะธะตะฝะธะต ะปัŽะฑะธะผะพะณะพ ัะตั€ะดั†ะฐ, ะพั‰ัƒั‰ะฐั‚ัŒ ะปะฐัะบะพะฒั‹ะต ะฟั€ะธะบะพัะฝะพะฒะตะฝะธั ั‚ะตะฟะปั‹ั… ั€ัƒะบ, ั‡ัƒะฒัั‚ะฒะพะฒะฐั‚ัŒ ะดั‹ั…ะฐะฝะธะต ะฝะฐ ัะฒะพะตะน ะบะพะถะต, ั‡ั‚ะพ ะพะฑะถะธะณะฐะปะพ ะธ ัะฒะพะดะธะปะพ ั ัƒะผะฐ. ะ’ ัƒะณะพะปะบะฐั… ะณะปะฐะท ัะฒะตั€ะบะฝัƒะปะธ ัะปะตะทั‹. ะžั‚ ั‡ัƒะฒัั‚ะฒะฐ, ั‡ั‚ะพ ัะตะนั‡ะฐั ะพั…ะฒะฐั‚ะธะปะพ ะ ะพะบัƒะดะพ, ั…ะพั‚ะตะปะพััŒ ะฟะปะฐะบะฐั‚ัŒ. ะšะฐะบะพะต ะถะต ัั‚ะพ ัั‡ะฐัั‚ัŒะต - ะฑั‹ั‚ัŒ ะบะพะผัƒ-ั‚ะพ ะฝัƒะถะฝั‹ะผ, ะฒะฐะถะฝั‹ะผ. ะ‘ั‹ั‚ัŒ ะปัŽะฑะธะผั‹ะผ. -ะฏ ะฝะธะบะพะณะดะฐ ะฝะต ะพัั‚ะฐะฝะพะฒะปัŽััŒ, ะพะฑะตั‰ะฐัŽ ั‚ะตะฑะต, ะผะพะต ะะตะฑะพ! ะžั‚ัั‚ั€ะฐะฝะธะฒัˆะธััŒ ัะปะตะณะบะฐ ะพั‚ ะœัƒะบัƒั€ะพ, ะขััƒะฝะฐ ะฟะพั†ะตะปะพะฒะฐะป ั‚ะพะณะพ ะฒ ะณัƒะฑั‹, ะฟะฐะปัŒั†ะฐะผะธ ัั‚ะธั€ะฐั ัะปะตะทั‹ ั ั‰ะตะบ. -ะ ั ะฝะธะบะพะณะดะฐ ะฝะต ะพัั‚ะฐะฒะปัŽ ั‚ะตะฑั, ะผะพะน ะขัƒะผะฐะฝั‡ะธะบ! ะกะพะปะฝั†ะต ัƒะถะต ะบะพัะฝัƒะปะพััŒ ะผะฐะบัƒัˆะตะบ ะดะตั€ะตะฒัŒะตะฒ, ะฐ ะพะฝะธ ะฟั€ะพะดะพะปะถะฐะปะธ ัะธะดะตั‚ัŒ ะผะพะปั‡ะฐ, ะฒะตะดัŒ ะธะผ ะฝะต ะฝัƒะถะฝั‹ ะฑั‹ะปะธ ัะปะพะฒะฐ. ะžะฝะธ ะฟะพะฝะธะผะฐะปะธ ะดั€ัƒะณ ะดั€ัƒะณะฐ ะธ ะฑะตะท ะฝะธั…. -ะ”ะถัƒะดะฐะนะผะตะต...ะ“ะดะต ะฒั‹? ะ’ะดะฐะปะธ ะฟะพัะปั‹ัˆะฐะปะธ ะบั€ะธะบะธ ะ“ะพะบัƒะดะตั€ั‹. 
ะฅั€ะฐะฝะธั‚ะตะปัŒ ะฃั€ะฐะณะฐะฝะฐ ะฝะพัะธะปัั ะฟะพ ะฟะฐั€ะบัƒ ะฒ ะฟะพะธัะบะฐั… ัะฒะพะตะณะพ ะฑะพััะฐ. ะ ะพะบัƒะดะพ ะฒั‹ะดะพั…ะฝัƒะป, ะขััƒะฝะฐ ัƒะปั‹ะฑะฝัƒะปัั, ะพะฝะธ ะฟะพะฝะธะผะฐะปะธ, ั‡ั‚ะพ ะฟั€ะธัˆะปะพ ะฒั€ะตะผั ะธะผ ะฒะพะทะฒั€ะฐั‰ะฐั‚ัŒัั ะฒ ั‚ะพั‚ ะผะธั€, ะบะพั‚ะพั€ั‹ะน ะฝะต ั‚ะตั€ะฟะธั‚ ัะตะฝั‚ะธะผะตะฝั‚ะฐะปัŒะฝะพัั‚ะตะน ะธ ะฝะตะถะฝะพัั‚ะตะน. ะœะธั€ ะผะฐั„ะธะธ ะถะตัั‚ะพะบ ะธ ะฑะตัะฟะพั‰ะฐะดะตะฝ, ะฝะพ ัั‚ะธ ะดะฒะพะต, ะถะธะฒั ะฒ ะฝะตะผ, ั…ั€ะฐะฝะธะปะธ ั‡ัƒะฒัั‚ะฒะพ, ะบะพั‚ะพั€ะพะต ัะฒัะทะฐะปะพ ะธั… .ะ˜, ะฝะตัะผะพั‚ั€ั ะฝะฐ ะฒะพะนะฝั‹, ะฟะพั‚ะตั€ะธ, ะพะฝะธ ะฟั€ะพะฝะตััƒั‚ ัั‚ะพ ั‡ัƒะฒัั‚ะฒะพ ะดะพ ั‚ะพะณะพ ะดะฝั, ะบะพะณะดะฐ ะธั… ะฝะต ัั‚ะฐะฝะตั‚ ัƒะถะต ะฝะฐ ะทะตะผะปะต. ะะพ, ะบั‚ะพ ะทะฝะฐะตั‚, ะผะพะถะตั‚ ะฟั€ะธะดะตั‚ ะฒั€ะตะผั ะธ ะพะฝะธ ะฒัั‚ั€ะตั‚ัั‚ัั ัะฝะพะฒะฐ, ะธะฑะพ ะฝะฐัั‚ะพัั‰ะฐั ะปัŽะฑะพะฒัŒ ะฒะตั‡ะฝะฐ ะธ ะฝะต ัƒะณะฐัะฐะตั‚, ะฝะตัะผะพั‚ั€ั ะฝะฐ ะณะพะดะฐ ะธ ะฒะตะบะฐ.'
- source_sentence: '- ะšะฐะฝะดะฐ! ะฏ ะตั‰ั‘ ัะพะณะปะฐัะธะปัั ะฝะฐ ะฟะฐััะธะฒ "ัะฝะธะทัƒ" ! ะะพ ัั‚ะพ ัƒะถะต ะดะฐะถะต ะฝะต ะฟะฐััะธะฒ, ัั‚ะพ ัƒะถะต ะฑะตะท ะฐะบั‚ะธะฒ ะบะฐะบะพะน-ั‚ะพ! - ะ ั‡ั‚ะพ ั‚ั‹ ะพั‚ ะผะตะฝั ั…ะพั‡ะตัˆัŒ-ั‚ะพ? ะฏ ัƒะถะต ะฝะต ะผะพะณัƒ ัะดะตั€ะถะธะฒะฐั‚ัŒัั! - ะะต ะฒะดะฐะฒะปะธะฒะฐะน ะผะตะฝั ะขะะš ะฒ ัั‚ะตะฝัƒ-ั‚ะพ! - ะขั‡. ะœะพััˆะธ, ั…ะฒะฐั‚ะธั‚ ะตะปะพะทะธั‚ัŒ ะธ ะฝะฐัะปะฐะถะดะฐะนัั ะผะพะผะตะฝั‚ะพะผ! - ะะพ ัั‚ะพ ะฝะต ั‡ะตัั‚ะฝะพ! ะŸะพ ะพั‚ะฝะพัˆะตะฝะธัŽ, ะผะตะถะดัƒ ะฟั€ะพั‡ะธะผ, ะบ ั‚ะต...ะผะผะผะผะผ!!! - ะขั‡. ะœะพััˆะธ, ั‡ั‚ะพ ัั‚ะพ? - ะ‘ะฐะฝั‚ะธะบ! - ะฏ ะฒะธะถัƒ, ะฝะฐ ะบะพะน ั‡ั‘ั€ั‚ ั‚ั‹ ะผะฝะต ะตะณะพ ะทะฐะฒัะทะฐะป? - ะขะฐะบ ั‚ั‹ ะฟะพั…ะพะถ ะฝะฐ ะฟะพะดะฐั€ะพะบ! - ะกั‡ะฐั ะœัƒะณะตะฝะพะผ ะพะณั€ะตะฑั‘ัˆัŒ! - ะ ะฟะพั‡ะตะผัƒ ั‚ั‹ ะฝะต ัะฟั€ะพัะธัˆัŒ "ะšะพะผัƒ?" - ะœะฝะต ัั‚ะพ ะฝะต ะธะฝั‚ะตั€ะตัะฝะพ! ะ ะบะพะผัƒ? - ะœะฝะต!!! ... *ะงะผะพะบ* - ะฅะผ... ะผะตะฝั ัั‚ะพ ะฝะต ัƒัั‚ั€ะฐะธะฒะฐะตั‚! - ะงะตะณะพ?!! - ะขะพะณะพ! - ะœะœะœะœ!!! - ะšะพะผัƒะธ! ะงั‚ะพ ัั‚ะพ ะทะฝะฐั‡ะธั‚? ะงั‚ะพ ั ะะปะปะตะฝะพะผ?
- ะงั‚ะพ? ะ, ะšะฐะฝะดะฐ. ะะปะปะตะฝะฐ ะฃะพะปะบะตั€ะฐ ั€ะฐะฝะธะปะธ ะฝะฐ ะผะธััะธะธ! - ะญั‚ะพ ั ัƒะถะต ะฟะพะฝัะป! ะงั‚ะพ ั ะฝะธะผ, ะณะพะฒะพั€ะธ ะบะพะฝะบั€ะตั‚ะฝะตะน! - ัะฐะผัƒั€ะฐะน ะฒัั‚ั€ัั…ะฝัƒะป ะฝะฐั‡ะฐะปัŒะฝะธะบะฐ. - ะะต ะฟะพะฒั‹ัˆะฐะน ะฝะฐ ะผะตะฝั ะณะพะปะพั! - ะฒะพะทะผัƒั‚ะธะปัั ัะผะพั‚ั€ะธั‚ะตะปัŒ. - ะ’ะพั‚ ั ะฒะฐัˆะตะน ัะตัั‚ั€ะพะน ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ ัะปัƒั‡ะธั‚ัั, ั ะฒะฐะผ ั‚ะพะถะต ัะบะฐะถัƒ "ะะต ะบั€ะธั‡ะธั‚ะต!" ะงั‚ะพ ั ะœะพััˆะธ? - ะญั…... ะฝะฐ ะผะธััะธะธ ะะปะปะตะฝ ะพัะปะตะฟ. ะะพ ะฝะต ะฟะตั€ะตะถะธะฒะฐะน. ะญั‚ะพ ะฒั€ะตะผะตะฝะฝะพ! ะ—ั€ะตะฝะธะต ะฒะพััั‚ะฐะฝะพะฒะธั‚ัั! ะœะตััั†ะฐ ั‡ะตั€ะตะท 3! - 3 ะœะ•ะกะฏะฆะ?! - ะ”ะฐ! ะขั‹ ัƒะถ ะฝะต ะพะฑะธะถะฐะน ะตะณะพ ะฟะพะบะฐ. - ะ‘ะตะท ะฒะฐั ะทะฝะฐัŽ! - ะขั‹ ะบัƒะดะฐ? - ะš ะะปะปะตะฝัƒ, ะบัƒะดะฐ ะถะต ะตั‰ั‘! - ะณั€ะพะทะฝะพ ั€ัะฒะบะฝัƒะป ัะฐะผัƒั€ะฐะน. - ะžั… ัƒะถ ัั‚ะธ ะณะพะปัƒะฑะบะธ... - ะš-ะบั‚ะพ ะทะดะตััŒ? - ะะปะปะตะฝ ัะธะดะตะป ะฝะฐ ะบะพะนะบะต, ะทะฐะฒะตั€ะฝัƒะฒัˆะธััŒ ะฒ ะพะดะตัะปะพ. - ... - ัˆะฐะณะธ ะฟั€ะธะฑะปะธะถะฐะปะธััŒ. - ะ-ะฝะต ะฟะพะดั…ะพะดะธ! - ะฐ ะฒั‹ ะฑั‹ ะฝะต ะธัะฟัƒะณะฐะปะธััŒ, ะฟั€ะตะถะดะต ะพัั‚ะฐะฒัˆะธััŒ ะฒ ะพะดะธะฝะพั‡ะตัั‚ะฒะต, ัั€ะตะดะธ ะฐะบัƒะผ, ะฒ ะณัƒัั‚ะพะผ ะปะตััƒ, ะฑะตะท ะทั€ะตะฝะธั? ะขะพ-ั‚ะพ ะถะต! - "ะะต ะพัั‚ะฐะฒะปัŽ!" - ะงะธัั‚ะฐั ะกะธะปะฐ! - ะทะฐะฝั‘ั ั€ัƒะบัƒ, ะฟั€ะตะดัƒะฟั€ะตะถะดะฐั ะฒั€ะฐะณะฐ. - "ะะธ ะทะฐ ั‡ั‚ะพ ะฑะพะปัŒัˆะต ะฝะต ะพัั‚ะฐะฒะปัŽ ะพะดะฝะพะณะพ!" ะะปะปะตะฝ! - ะฟะพะดั…ะฒะฐั‚ะธั‚ัŒ ั‚ะพะฝะบะพะต ั‚ะตะปัŒั†ะต ะฝะฐ ั€ัƒะบะธ, ะฟั€ะธะถะฐั‚ัŒ ะบ ัะตะฑะต, ะปะฐะดะพะฝัŒัŽ ะฝะฐะบั€ั‹ะฒ ะณะปะฐะทะฐ ะœะพััˆะธ, ะบะฐะบ ะฒ ะฟะตั€ะฒะพะผ ะฟะพั†ะตะปัƒะต. ะ˜ ะบะพัะฝัƒั‚ัŒัั ัƒะณะพะปะบะฐ ั€ะพั‚ะธะบะฐ ัะฒะพะธะผะธ ะณัƒะฑะฐะผะธ. - ะš-ะšะฐะฝะดะฐ?! - ะะต ะฒะพะปะฝัƒะนัั! ะฏ ัั‚ะฐะฝัƒ ั‚ะฒะพะธะผะธ ะณะปะฐะทะฐะผะธ ะฟะพะบะฐ ั‚ั‹ ะฟั€ะพะดะพะปะถะฐะตัˆัŒ ะฑั‹ั‚ัŒ ะผะพะธะผ ะกะตั€ะดั†ะตะผ ะะตะฒะธะฝะฝะพัั‚ะธ. ะ ั‚ั‹ ัƒัั‚ะฐะฒัˆะธะน ะธะดั‘ัˆัŒ ั ะผะธััะธะธ. ะ ั ัƒัั‚ะฐะฒัˆะธะน ะธะดัƒ ั ั‚ั€ะตะฝะธั€ะพะฒะบะธ. 
ะขะฒะพะธ ะฝะพะณะธ ะธัั‚ะพะฟั‚ะฐะฝั‹. POV ะšะฐะฝะดั‹. ะœะพั ะณะพะปะพะฒะฐ ะฑะพะปะธั‚. ะขะฒะพะธ ั€ัƒะบะธ ะฝะพัŽั‚. ะœะพั‘ ัะตั€ะดั†ะต ะธัั‚ะพะผะธะปะพััŒ. ะ˜ ะฒะพั‚ ะผั‹ ะธะดั‘ะผ ะดั€ัƒะณ ะฝะฐ ะดั€ัƒะณะฐ, ะฟะพะดะฝะธะผะฐะตะผ ะณั€ัƒัั‚ะฝั‹ะต, ะธะทะผัƒั‡ะตะฝะฝั‹ะต ะณะปะฐะทะฐ ะดั€ัƒะณ ะบ ะดั€ัƒะณัƒ. ะขั‹ ะพัั‚ะฐะฝะฐะฒะปะธะฒะฐะตัˆัŒัั. ะ”ัƒั€ะฐะบ, ั‡ั‚ะพ-ั‚ะพ ะณะพะฒะพั€ะธัˆัŒ. ะงั‚ะพ-ั‚ะพ ะบั€ะธั‡ะธัˆัŒ. ะž ั‡ั‘ะผ-ั‚ะพ ะผะพะปั‡ะธัˆัŒ. ะšะฐะบ-ั‚ะพ ัะผะพั‚ั€ะธัˆัŒ. ะž ั‡ั‘ะผ-ั‚ะพ ะฒะพะปะฝัƒะตัˆัŒัั. ะกะฝะพะฒะฐ ะพ ั‡ั‘ะผ-ั‚ะพ ะบั€ะธั‡ะธัˆัŒ. ะก ะณั€ัƒัั‚ัŒัŽ ัะผะพั‚ั€ะธัˆัŒ. ะž ะบะพะผ ั‚ั‹ ะผัƒั‡ะฐะตัˆัŒัั? ะ”ะตะปะฐะตัˆัŒ ัˆะฐะณ, ะตั‰ั‘ ะพะดะธะฝ. ะฅะฒะฐั‚ะฐะตัˆัŒ ะทะฐ ะฒะพั€ะพั‚ะฝะธะบ. ะŸั€ะธะฒัั‚ะฐั‘ัˆัŒ ะฝะฐ ะฝะพัะพั‡ะบะฐั…. ะฆะตะปัƒะตัˆัŒ... ะ”ัƒั€ะฐะบ, ั‚ั‹ ะถะต ัƒัั‚ะฐะป! ะ”ัƒั€ะฐะบ, ั ะถะต ัƒัั‚ะฐะป! ะฏ ะพัั‚ะฐะฝะฐะฒะปะธะฒะฐัŽััŒ. ะ”ัƒั€ะฐะบ, ั‡ั‚ะพ-ั‚ะพ ะพั‚ะฒะตั‡ะฐัŽ. ะงั‚ะพ-ั‚ะพ ะบั€ะธั‡ัƒ. ะะฐ ั‡ั‘ะผ-ั‚ะพ ะทะฐะผะพะปะบะฐัŽ. ะขัƒะฟะพ ัะผะพั‚ั€ัŽ. ะงั‚ะพ-ั‚ะพ ั‰ะตะผะธั‚. ะžะฟัั‚ัŒ ั‡ั‚ะพ-ั‚ะพ ะพั€ัƒ. ะžั‚ั€ะตัˆั‘ะฝะฝะพ ัะผะพั‚ั€ัŽ. ะ—ะฐ ะบะพะณะพ-ั‚ะพ ะฒะพะปะฝัƒัŽััŒ. ะกั‚ะพัŽ. ะะฐัั‚ะพั€ะฐะถะธะฒะฐัŽััŒ. ะ’ัั‘ ั€ะฐะฒะฝะพ. ะะตัƒะถะตะปะธ?! ะžั‚ะฒะตั‡ะฐัŽ ะฝะฐ ะฟะพั†ะตะปัƒะน. ะšะฐะบ ะถะต ะผั‹ ัƒัั‚ะฐะปะธ! - ะ”ะฐะฒะฝะพ? - ะ’ัะตะณะดะฐ, ะœะพััˆะธ. - ะะตั‚, ั‡ะตัั‚ะฝะพ, ั ะฝะตะฝะฐะฒะธะถัƒ ะณะพะปัƒะฑะตะน! ะ ะพัะพะฑะตะฝะฝะพ ะฑะตะปั‹ั…! - ัะบะฐะฝะดะฐะปะธะป ะ›ะฐะฒะธ ะธะดั ะฟะพ ะบะพั€ะธะดะพั€ัƒ ะงั‘ั€ะฝะพะณะพ ะžั€ะดะตะฝะฐ. - ะงะตะผ ะถะต ะพะฝะธ ั‚ะตะฑะต ะฝะต ะฝั€ะฐะฒัั‚ัั? - ัะฟั€ะพัะธะปะฐ ะดะตะฒัƒัˆะบะฐ-ะบะธั‚ะฐัะฝะบะฐ. - ะ”ะฐ ะฑะปะธะฝ, ัะธะดัั‚ ะฒะตะทะดะต ะณะดะต ะฝะต ะฟะพะฟะฐะดั ะธ ัะฒะตั€ั…ัƒ ะบะฐะบะฐัŽั‚! ะ˜ ั‚ะพะปัŒะบะพ ะพะฝะธ ะทะฐัˆะปะธ ะทะฐ ะฟะพะฒะพั€ะพั‚, ะบะฐะบ ัƒัะปั‹ัˆะฐะปะธ: - ะะตั‚, ะšะฐะฝะดะฐ, ะฟั€ะตะบั€ะฐั‚ะธ! ะะฐั ะผะพะณัƒั‚ ัƒะฒะธะดะตั‚ัŒ! - ะ”ะฐ ะบั‚ะพ ะฝะฐั ั‚ัƒั‚ ะผะพะถะตั‚ ัƒะฒะธะดะตั‚ัŒ, ะœะพััˆะธ? - ะัƒ, ะบั‚ะพ-ะฝะธะฑัƒะดัŒ! ะั…... ะฎัƒ! - ะœะธะผะพ ะณะพะปัƒะฑัั‚ะฝะธ ะฝะธะบั‚ะพ ะฝะต ะฟั€ะพั…ะพะดะธั‚! 
ะญั‚ะพ ะฝะฐะดั‘ะถะฝะฐั ั‡ะฐัั‚ัŒ ะทะฐะผะบะฐ! - ะัƒ, ะฝัƒ, ะฝัƒ ะปะฐะดะฝะพ... ะฝะพ ะฟะพั‡ะตะผัƒ ะธะผะตะฝะฝะพ ั‚ัƒั‚? - ะ ะบั‚ะพ ะฒะตั€ะตั‰ะฐะป ั‡ั‚ะพ ะตะผัƒ ั€ะพะผะฐะฝั‚ะธะบะธ ะฝะต ั…ะฒะฐั‚ะฐะตั‚? - ะšะฐะฝะดะฐ ะฝะตะดะฒัƒัะผั‹ัะปะตะฝะฝะพ ะทะฐะถะธะผะฐะป ะะปะปะตะฝะฐ ะฝะฐ ะฟะพะดะพะบะพะฝะฝะธะบะต. ะ›ะฐะฒะธ ะธ ะ›ะธะฝะฐะปะธ ัˆะฐั€ะฐั…ะฝัƒะปะธััŒ ะพะฑั€ะฐั‚ะฝะพ. - ะฅะพั‚ั ะทะฝะฐะตัˆัŒ, ะ›ะธ. ะœะพะถะตั‚ ะฒ ัั‚ะธั… ะณะพะปัƒะฑะบะฐั… ั‡ั‚ะพ-ั‚ะพ ะธ ะตัั‚ัŒ! ะ”ั‹ัˆะฐั‚ัŒ ัั‚ะฐะฝะพะฒะธั‚ัั ะฒัั‘ ั‚ั€ัƒะดะฝะตะต, ะผะตะฝั ะทะฐะณะพะฝััŽั‚ ะฒ ัƒะณะพะป. ะฏ ะธัะฟัƒะณะฐะฝะฝะพ ะพะฑะพั€ะฐั‡ะธะฒะฐัŽััŒ ะธ ะฒะธะถัƒ... ะตะณะพ. - ะขั‹, ะฟั€ะพะบะปัั‚ั‹ะน! - ะพะบั€ะธะบะธะฒะฐะตั‚ ัั‚ั€ะพะณะธะน ัะฟะพะฝะตั†. ะ ั ะผะพะปั‡ัƒ, ะดั‹ั…ะฐะฝะธะต ัะฑะธะปะพััŒ, ะฒะทะฒะพะปะฝะพะฒะฐะฝ. ะงั‚ะพ ะตะผัƒ ะฝัƒะถะฝะพ? - ะะต ะดัƒะผะฐะป ั‡ั‚ะพ ั‚ั‹ ะพะฟัƒัั‚ะธัˆัŒัั ะดะพ ะบั€ะฐะถ, ะœะพััˆะธ. ะงั‚ะพ? ะšั€ะฐะถ? ะšะฐะบะธั… ะบั€ะฐะถ, ะฎัƒ? - ะขั‹ ะพ ั‡ั‘ะผ? - ะ˜ะผะตะฝะฝะพ ั‚ั‹, ะพ ั ะฝะต ัะพะผะฝะตะฒะฐัŽััŒ, - ะพะฝ ะธะทะดะตะฒะฐะตั‚ัั? - ั‚ั‹ ัƒะบั€ะฐะป ัƒ ะผะตะฝั ะพะดะฝัƒ ะฒะตั‰ัŒ. - ะšะฐะฝะดะฐ, ั ะฝะธั‡ะตะณะพ ะฝะต ะฑั€ะฐะป! - ะพั‚ะบัƒะดะฐ ัั‚ะพ ั‡ัƒะฒัั‚ะฒะพ ะฑะตะทั‹ัั…ะพะดะฝะพัั‚ะธ? ะžะฝ ะพัั‚ะฐะฝะพะฒะธะปัั. - ะ›ะธะฑะพ ะฒะตั€ะฝะธ ะผะฝะต ะตะณะพ, ะปะธะฑะพ ั ะทะฐะฑะตั€ัƒ ั‚ะฒะพั‘! - ะงั‚ะพ? ะฏ ะตะณะพ ะฝะต ะฟะพะฝะธะผะฐัŽ! ะงั‚ะพ ะพะฝ ะดะตะปะฐะตั‚? ะฅะฒะฐั‚ะฐะตั‚ ะทะฐ ะฟะพะดะฑะพั€ะพะดะพะบ, ั‚ะฐั‰ะธั‚ ะฝะฐ ัะตะฑั. ะ˜... ะฑะพะถะต, ั‡ั‚ะพ ัั‚ะพ? ะžะฝ... ะฏ ั‡ัƒะฒัั‚ะฒัƒัŽ ะตะณะพ ะณัƒะฑั‹ ะธ ัะฐะผ ะฝะต ะฟะพะฝะธะฐั - ะพั‚ะฒะตั‡ะฐัŽ! ะšะฐะฝะดะฐ, ั‚ั‹, ั‚ั‹, ั‚ั‹... ะฝะฐัั‚ะพัั‰ะธะน ะฒะพั€! ะขั‹ ะบั€ะฐะดั‘ัˆัŒ ะผะพั‘ ัะตั€ะดั†ะต! - ะš-ะบะฐะฝะดะฐ... - ั€ัƒะบะธ ะฟะพะฒะธัะปะธ. ะะต ัะพะพะฑั€ะฐะถะฐัŽ. - ะ’ะตั€ะฝะธ ะผะพั‘ ัะตั€ะดั†ะต, ะ“ั€ั‘ะฑะฐะฝะฝั‹ะน ะกั‚ั€ัƒั‡ะพะบ! - ะฏ... ะฝะธ ะทะฐ ั‡ั‚ะพ! ะฏ ะพัั‚ะฐะฒะปัŽ ะตะณะพ ัะตะฑะต! ะ ั‚ั‹... ัƒะถะต ะทะฐะฑั€ะฐะป ะผะพั‘... ะ’ะพะปะฝะพะฒะฐะปัั ะบะฐะบ-ั‚ะพ ะะปะปะตะฝ, ะฒะตะดัŒ ะšะฐะฝะดะฐ ะฑั‹ะป ะฝะฐ ะผะธััะธะธ. 
ะ˜ ะฒะพั‚ ัˆะปั‘ั‚ ะพะฝ ะตะผัƒ ะฟะธััŒะผะพ ั‡ะตั€ะตะท ะณะพะปะตะผะฐ. ะ: ะขั‹ ั‚ะฐะผ ะฒะพะพะฑั‰ะต ะถะธะฒะพะน, ะ‘ะฐะšะฐะฝะดะฐ? ะš: ะ”ะฐ, ะะปะปะตะฝ. ะ–ะธะฒะพะน, ั ะถะธะฒะพะน! ะ: ะšะฐะฝะดะฐ! ะงั‚ะพ ั ั‚ะพะฑะพะน? ะขะตะฑะต ะฟะปะพั…ะพ? ะฃะผะธั€ะฐะตัˆัŒ? ะขะตะฑั ะ›ะธะฝะฐะปะธ ะฟะพั†ะตะปะพะฒะฐะปะฐ? ะ”ะตั€ะถะธััŒ ะดั€ัƒะณ! ะš: ะขั‹ ั‡ั‘? ะกะพ ะผะฝะพะน ะฒัั‘ ั…ะพั€ะพัˆะพ, ะœะพััˆะธ! ะ: ะคัƒั…... ะฝะต ะฟัƒะณะฐะน ะผะตะฝั ั‚ะฐะบ ะฑะพะปัŒัˆะต. - ะญะน, ะšะฐะฝะดะฐ, ะฝะต ะณั€ัƒัั‚ะธ! ะญั‚ะพ ั‚ะฐะบ ะฝะฐ ั‚ะตะฑั ะฝะต ะฟะพั…ะพะถะต! - ะขะตะฑะต ะปะตะณะบะพ ะณะพะฒะพั€ะธั‚ัŒ. ะฃ ะฒะฐั ั ะ›ะฐะฒะธ ะฒัั‘ ะฝะฐะปะฐะถะธะฒะฐะตั‚ัั. ะ ะฝะฐ ะผะตะฝั ะฃะพะปะบะตั€ ะดะฐะถะต ะฝะต ัะผะพั‚ั€ะธั‚! - ะขั‹ ะฟะพะณะพะฒะพั€ะธ ั ะฝะธะผ! - ะขั‡, ะธ ั‚ะฐะบ ะบะฐะถะดั‹ะน ะดะตะฝัŒ ะฒะธะดะธะผัั! - ะะตั‚, ะฎัƒ, ะฟะพะณะพะฒะพั€ะธ ั ะฝะธะผ, ะบะฐะบ ัะพ ะผะฝะพะน! ะ’ัั‘ ะพะฑั€ะฐะทัƒะผะธั‚ัŒัั ัะปั‹ัˆะธัˆัŒ? - ะะต ะฒะตั€ัŽ ั ะฒ ัั‚ะพ. - ะ ั‚ั‹ ะฟะพะฒะตั€ัŒ! ะงัƒะดะพ - ะฑั‹ะฒะฐะตั‚! http://vkontakte.ru/photo63528512_276702591 -ะะต ะพั‚ะดะฐะผ. ะกะปั‹ัˆะธั‚ะต?! ะะธะบะพะณะดะฐ ะฝะต ะพั‚ะดะฐะผ ะฒะฐะผ ะšะฐะฝะดัƒ!!! - ะญั‚ะพ ะฝะต ั‚ะตะฑะต ั€ะตัˆะฐั‚ัŒ, ะะปะปะตะฝ ะฃะพะปะบะตั€! - ะะต. ะŸะพะดั…ะพะดะธั‚ะต. ะš. ะะฐะผ. - ะžะฝ ะฟั€ะธะฝะฐะดะปะตะถะธั‚ ะฝะฐะผ! - ะฏ... ะพะฝ... ะฃะ‘ะฌะฎ!!! http://vkontakte.ru/photo63528512_276702661' sentences: - 'ะกะตะณะพะดะฝั, ะฟั€ั‹ะณะฐั ะฝะฐ ะบั€ะพะฒะฐั‚ะธ, ะšะธั€ะฐ ัะปะพะผะฐะปะฐ ะตะต. ะžะฝะฐ ะพั‚ั‡ะฐัะฝะฝะพ ะฟั‹ั‚ะฐะปะฐััŒ ะดะพะฟั€ั‹ะณะฝัƒั‚ัŒ ะดะพ ะฟะพั‚ะพะปะบะฐ, ะฝะพ ะฝะธั‡ะตะณะพ ะฝะต ะฟะพะปัƒั‡ะฐะปะพััŒ, ะธ ะพะฟะธะปะบะธ ะปะธัˆัŒ ั‚ั‰ะตั‚ะฝะพ ัั‹ะฟะฐะปะธััŒ ะฝะฐ ะฟะพะป. ะะธะบั‚ะพ ะฝะต ัะปั‹ัˆะฐะป ะฝะธ ัะบั€ะตะถะตั‚ะฐ ะฟั€ัƒะถะธะฝ, ะฝะธ ะณั€ะพั…ะพั‚ะฐ; ะฝะต ะฑั‹ะปะพ ะฒะธะดะฝะพ ะธ ัะฐะผะพะน ะฟะพะปะพะผะบะธ. ะœะฐั‚ัŒ, ั€ัƒะณะฐั ะดะพั‡ัŒ, ะฒ ะพั‚ะฒะตั‚ ะฟะพะปัƒั‡ะธะปะฐ ะปะธัˆัŒ ัƒัั‚ะฐะปะพะต ั€ะฐะฒะฝะพะดัƒัˆะธะต, ั‡ั‚ะพ, ะบะพะฝะตั‡ะฝะพ ะถะต, ะฒั‹ะฒะตะปะพ ะตะต ะธะท ัะตะฑั. 
ะšั€ะธั‡ะฐ ั‡ั‚ะพ-ั‚ะพ ะฝะตั‡ะปะตะฝะพั€ะฐะทะดะตะปัŒะฝะพะต, ะพะฝะฐ ัั‚ัƒั‡ะฐะปะฐ ะฝะพะณะพะน ะฟะพ ัะปะพะผะฐะฝะฝะพะผัƒ ะฟั€ะตะดะผะตั‚ัƒ. ะ–ะตะฝั‰ะธะฝะฐ ะฝะต ะฟะพะฝะธะผะฐะปะฐ, ั‡ั‚ะพ ะพะฝะฐ ะดะตะปะฐะปะฐ ั‚ะพะปัŒะบะพ ั…ัƒะถะต, ะฝะพ ะณะฝะตะฒ ะฒ ะตะต ะบั€ะพะฒะธ ะฒะทัะป ะฒะตั€ั…. - ะ”ะฐ ะบะฐะบ ั‚ั‹ ัะผะตะตัˆัŒ, ะฟะฐั€ัˆะธะฒะฐั ะดะตะฒั‡ะพะฝะบะฐ! ะฏ ั‚ะพะปัŒะบะพ ะธ ะดะตะปะฐะปะฐ, ั‡ั‚ะพ ัƒั…ะฐะถะธะฒะฐะปะฐ ะทะฐ ั‚ะฒะพะตะน ะบั€ะพะฒะฐั‚ัŒัŽ! ะ ั‚ั‹ ั€ะตัˆะธะปะฐ ัƒัั‚ั€ะพะธั‚ัŒ ะฟะพะณั€ะพะผ?! ะ—ะฝะฐะตัˆัŒ, ั‡ั‚ะพ?! ะฏ ัั‚ะพ ั‚ะฐะบ ะฝะต ะพัั‚ะฐะฒะปัŽ! - ะฝะฐ ัั‚ะธั… ัะปะพะฒะฐั… ะถะตะฝั‰ะธะฝะฐ, ั‡ัƒั‚ัŒ ะปะธ ะฝะต ัะฝะธะผะฐั ะดะฒะตั€ัŒ ั ะฟะตั‚ะตะปัŒ, ะฒั‹ะฑะตะถะฐะปะฐ ะธะท ะบะพะผะฝะฐั‚ั‹. ะšะธั€ะฐ ั€ะตะทะบะพ ะพะฟัƒัั‚ะธะปะฐััŒ ะฝะฐ ะบะพะปะตะฝะธ. ะŸั€ะธะถะฐะฒ ั€ัƒะบะธ ะบ ะบั€ะพะฒะฐั‚ะธ, ะพะฝะฐ ะฟั‹ั‚ะฐะปะฐััŒ ัะดะตั€ะถะธะฒะฐั‚ัŒ ะตะต ะฝะตะฒั‹ะฝะพัะธะผั‹ะน ัะบั€ะตะถะตั‚. ะ’ะทัะฒ ะผะพะปะพั‚ะพะบ ะธ ะณะฒะพะทะดะธ ะธะท ะบะปะฐะดะพะฒะพะน, ะดะตะฒะพั‡ะบะฐ ะฑะตะทะฝะฐะดะตะถะฝะพ ะบะพะปะพั‚ะธะปะฐ ะฟะพ ะพะฑะปะพะผะบะฐะผ, ะฟั‹ั‚ะฐัััŒ ั…ะพั‚ัŒ ะบะฐะบ-ั‚ะพ ะธั… ัะพะตะดะธะฝะธั‚ัŒ. ะะพ ะฒัะต ะพะบะฐะทะฐะปะพััŒ ะฑะตะทั€ะตะทัƒะปัŒั‚ะฐั‚ะฝะพ: ะพะฑะปะพะผะบะธ ะปะธัˆัŒ ั ะตั‰ะต ะฑะพะปัŒัˆะธะผ ัั‚ั€ะตะผะปะตะฝะธะตะผ ั€ะฐัะบะฐะปั‹ะฒะฐะปะธััŒ ะฟะพะด ะณะฝะตั‚ะพะผ ะณะฒะพะทะดะตะน. ะžะฝะฐ ะปะตะณะปะฐ ะฝะฐ ะฟะพะป. ะ›ะตะณะบะธะน ัะบะฒะพะทะฝัะบ ั‰ะตะบะพั‚ะฐะป ะตะต ัะฟะธะฝัƒ. - ะฏ ะฝะธะบะพะณะดะฐ ะฝะต ัะผะพะณัƒ ะดะพะฟั€ั‹ะณะฝัƒั‚ัŒ ะดะพ ะฟะพั‚ะพะปะบะฐ, - ัะบะฐะทะฐะปะฐ ะšะธั€ะฐ ะธ ะฒั‹ะดะพั…ะฝัƒะปะฐ. - ะ ะฒะดั€ัƒะณ ัั‚ะพ ะฝะต ั‚ะฐะบ? ะšะธั€ะฐ ั€ะตะทะฒะพ ะฒัั‚ะฐะปะฐ. ะะฐ ะตะต ะปะธั†ะต ะฟะพัะฒะธะปะฐััŒ ะผะฐัะบะฐ ะฝะตะดะพัƒะผะตะฝะธั, ะฐ ะฒ ะณั€ัƒะดะธ ะฝะฐั‡ะฐะป ั€ะฐะทะถะธะณะฐั‚ัŒัั ะพะณะพะฝะตะบ ัั‚ั€ะฐั…ะฐ. ะžั‚ะบัƒะดะฐ ัั‚ะพั‚ ะณะพะปะพั? - ะะต ะฑะพะนัั, ะณะปัƒะฟั‹ัˆะบะฐ, - ะณะพะปะพั ะฑั‹ะป ะพั‡ะตะฝัŒ ะผัะณะพะบ. - ะžั‚ะบัƒะดะฐ ั‚ั‹? ะฏ ั‚ะตะฑั ั€ะฐะฝัŒัˆะต ะฝะต ัะปั‹ัˆะฐะปะฐ... - ะ ั€ะฐะทะฒะต ัั‚ะพ ะฒะฐะถะฝะพ? - ะ ั‡ั‚ะพ, ะฝะตั‚? 
- ะŸะพั‡ะตะผัƒ ัั‚ะพ ะดะพะปะถะฝะพ ะฑั‹ั‚ัŒ ะฒะฐะถะฝะพ? ะ ะฐะทะฒะต ะฝะตะปัŒะทั ะฟั€ะพัั‚ะพ ะฟะพะณะพะฒะพั€ะธั‚ัŒ ั ั‚ะพะฑะพะน? - ะขั‹ ะดัƒะผะฐะตัˆัŒ, ั ะฑัƒะดัƒ ะณะพะฒะพั€ะธั‚ัŒ ั ะฝะตะทะฝะฐะบะพะผั‹ะผ ะณะพะปะพัะพะผ? - ะ ะฟะพั‡ะตะผัƒ ะฝะตั‚? - ะขะฐะบ. ะœะฝะต ะฝะฐะดะพะตะดะฐะตั‚ ัั‚ะฐ ะธะณั€ะฐ ะฒ ะฒะพะฟั€ะพัั‹. ะ“ะพะฒะพั€ะธ, ั‡ั‚ะพ ะธะปะธ ะบั‚ะพ ั‚ั‹ ะตัั‚ัŒ? ะ’ะฝะตะทะฐะฟะฝะพ ะฝะฐัั‚ัƒะฟะธะปะพ ะผะพะปั‡ะฐะฝะธะต, ะฟะพัะปะต ั‡ะตะณะพ ะฟะพัะปะตะดะพะฒะฐะปะพ ะฟั€ะพะดะพะปะถะธั‚ะตะปัŒะฝะพะต ะณัƒะดะตะฝะธะต. ะ“ะพะปะพั ะฝะฐั‡ะฐะป ะฝะฐะฟะตะฒะฐั‚ัŒ ะฟะตัะตะฝะบัƒ, ะฝะต ะฟะตัะฝัŽ, ะฐ ะธะผะตะฝะฝะพ ะฟะตัะตะฝะบัƒ. ะ›ัŽะฑะธะผัƒัŽ ะฟะตัะตะฝะบัƒ ะšะธั€ั‹, ะบะพั‚ะพั€ัƒัŽ ะพะฝะฐ ะทะฐะฒะพะดะธะปะฐ ะบะฐะถะดั‹ะน ั€ะฐะท, ะบะพะณะดะฐ ะปะพะผะฐะปะพััŒ ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ ะฒ ะตะต ะบะพะผะฝะฐั‚ะต. - ะฏ ะผะพะณัƒ ะฟะพัั‚ั€ะพะธั‚ัŒ ั‚ะตะฑะต ะฝะพะฒัƒัŽ ะบั€ะพะฒะฐั‚ัŒ. ะ“ะพั€ะฐะทะดะพ ะปัƒั‡ัˆะต ัั‚ะพะน. ะ’ ะฝะตะน ะฑัƒะดะตั‚ ะผะฝะพะณะพ ั†ะฒะตั‚ะพะฒ ะธ ัะปะฐะดะพัั‚ะตะน... ะ”ะตะฒะพั‡ะบะฐ ะพะถะธะฒะธะปะฐััŒ. ะ’ ะตะต ั€ะตั‡ะธ ะฟะพัะปั‹ัˆะฐะปะธััŒ ะฝะพั‚ะบะธ ั€ะฐะดะพัั‚ะธ. - ะŸั€ะฐะฒะดะฐ? ะขั‹ ัะดะตะปะฐะตัˆัŒ ัั‚ะพ? - ะ”ะฐ, ะฝะพ ะฒะพั‚ ั‚ะพะปัŒะบะพ... - ะงั‚ะพ "ั‚ะพะปัŒะบะพ"? - ะขะพะปัŒะบะพ ะพะฝะฐ ะฑัƒะดะตั‚ ะฝะต ะฝะฐัั‚ะพัั‰ะตะน. ะขั‹ ะฝะต ัะผะพะถะตัˆัŒ ะฝะฐ ะฝะตะน ัะฟะฐั‚ัŒ, ะฝะพ ะพะฝะฐ ะฑัƒะดะตั‚ ะฒ ั‚ะฒะพะตะน ะบะพะผะฝะฐั‚ะต. - ะณะพะปะพั ะพั‚ะบะฐัˆะปัะปัั. - ะั…, ะดะฐ. ะšั€ะพะผะต ั‚ะตะฑั ะตะต ะฝะธะบั‚ะพ ะฝะต ัƒะฒะธะดะธั‚. ะ”ะตะฒะพั‡ะบะฐ ะทะฐะดัƒะผั‡ะธะฒะพ ัƒะปั‹ะฑะฝัƒะปะฐััŒ. - ะะพ ะบะพะณะดะฐ ะถะต ั ัะผะพะณัƒ ัƒะฒะธะดะตั‚ัŒ ัะฒะพัŽ ะบั€ะพะฒะฐั‚ัŒ? ะ“ะพะปะพั ะฝะฐั‡ะฐะป ัะผะตัั‚ัŒัั. ะกะธะปัŒะฝะพ, ะดะพะปะณะพ, ะฝะพ ะผัะณะบะพ. ะญั‚ะพั‚ ัะผะตั… ะฑั‹ะป ะพั‡ะตะฝัŒ ะธ ะพั‡ะตะฝัŒ ะฝะตะพะฑั‹ั‡ะตะฝ: ะฒั€ะพะดะต ะฑั‹ ะธ ะดะพะฑั€ั‹ะน, ะฐ ะฒั€ะพะดะต ะฑั‹ ะธ ั ะฝะฐัะผะตัˆะบะพะน. ะ–ะฐะปะพัั‚ัŒ. ะ–ะฐะปะพัั‚ัŒ ัƒะฟั€ะฐะฒะปัะปะฐ ะธะผ. - ะŸะพั‡ะตะผัƒ ั‚ั‹ ัะผะตะตัˆัŒัั? 
- ะ”ะฐ ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ั‚ั‹ ะณะปัƒะฟะฐั ะดะตะฒะพั‡ะบะฐ, ะบะพั‚ะพั€ะฐั ะดะฐะถะต ะฝะต ะผะพะถะตั‚ ั€ะตัˆะธั‚ัŒ. - ะฏ ะฒะพะฒัะต ะฝะต ะณะปัƒะฟะฐ! - ะ”ะฐ? ะขะฐะบ ะพั‚ะฒะตั‚ัŒ: ั‚ะตะฑะต ะฝัƒะถะฝะพ ั‚ะพ, ั‡ั‚ะพ ั ะฟั€ะตะดะปะฐะณะฐัŽ? - ะะพ ัั‚ะพ ะถะต ะฒะพะฒัะต ะฝะต ะฝะฐัั‚ะพัั‰ะฐั ะบั€ะพะฒะฐั‚ัŒ! - ะšะธั€ะฐ ะฟั€ะธะปะพะถะธะปะฐ ั€ัƒะบะธ ะบ ะปะธั†ัƒ. - ะะฐ ะฝะตะน ั ะฝะต ัะผะพะณัƒ ะดะพะฟั€ั‹ะณะฝัƒั‚ัŒ ะดะพ ะฟะพั‚ะพะปะบะฐ! ะ“ะพะปะพั ะพะฟัั‚ัŒ ะทะฐะปะธะปัั ัะผะตั…ะพะผ. - ะŸะžะงะ•ะœะฃ ะขะซ ะกะœะ•ะ•ะจะฌะกะฏ ะ’ะกะ• ะ’ะ ะ•ะœะฏ?! - ะ”ะฐ ะฟะพั‚ะพะผัƒ ั‡ั‚ะพ ั‚ั‹ ัƒะถะต ั€ะตัˆะธะปะฐ. ะฃะถะต ะดะฐะฒะฝั‹ะผ-ะดะฐะฒะฝะพ ั€ะตัˆะธะปะฐ. - ะ˜ ั‡ั‚ะพ ะถะต ั ั€ะตัˆะธะปะฐ? - ะขั‹ ัะพะณะปะฐัะฝะฐ, ะฒะตะดัŒ ั‚ะฐะบ? ะšะธั€ะฐ ะทะฐะผะตัˆะบะฐะปะฐััŒ, ะฝะพ, ะฒัะต ะถะต, ะฒั‹ะดะฐะฒะธะปะฐ ะธะท ัะตะฑั ะฝะตัƒะฒะตั€ะตะฝะฝะพะต "ะดะฐ". ะ“ะพะปะพั ะฟั€ะพะฟะฐะป, ะพัั‚ะฐะฒะธะฒ ะฟะพัะปะต ัะตะฑั ะพะณั€ะพะผะฝัƒัŽ ะบั€ะพะฒะฐั‚ัŒ, ั ะฑะพะปัŒัˆะธะผ ะผะฐั‚ั€ะฐัะพะผ ะธ ะผัะณะบะธะผะธ ะฟะพะดัƒัˆะบะฐะผะธ. ะะฐ ั‚ะฐะบะพะน ะบั€ะพะฒะฐั‚ะธ, ะพะฟั€ะตะดะตะปะตะฝะฝะพ, ะผะพะถะฝะพ ะฑั‹ะปะพ ะฑั‹ ะดะพะฟั€ั‹ะณะฝัƒั‚ัŒ ะดะพ ะฟะพั‚ะพะปะบะฐ.' - 'ะšะพะฝะตั† ะณะพะดะฐ - ัั‚ะพ ะฟะพั€ะฐ ะดะปั ั€ะฐะดะพัั‚ะธ, ะฒ ะฟั€ะตะดั‡ัƒะฒัั‚ะฒะธะธ ะฝะฐะดะฒะธะณะฐัŽั‰ะธั…ัั ะบะฐะฝะธะบัƒะป, ัะฒะพะฑะพะดั‹. ะญั‚ะพ ะฑั‹ะปะพ ะฝะฐั‡ะฐะปะพ ะผะฐั, ะบะพะณะดะฐ ะฝะฐ ัƒะปะธั†ะต ัƒะถะต ั‚ะตะฟะปะพ, ะฐ ะฟะพ ัƒั‚ั€ะฐะผ ะทัะฑะบะพ. ะšะพะณะดะฐ ั†ะฒะตั‚ั‹ ัƒะถะต ั€ะฐัั†ะฒะตะปะธ ะธ ะฝะฐั‡ะฐะปะธ ะฑะปะฐะณะพัƒั…ะฐั‚ัŒ. ะกั‹ั€ะฐั ะทะตะผะปั ะฟะพะบั€ั‹ะฒะฐะปะฐััŒ ั‚ั€ะฐะฒะธะฝะพั‡ะบะฐะผะธ, ะธ ะฟะพ ะฝะตะน ั‚ัƒะดะฐ-ััŽะดะฐ ัะฝะพะฒะฐะปะธ ะฑัƒะบะฐัˆะบะธ-ั‚ะฐั€ะฐะบะฐัˆะบะธ. ะŸั‚ะธั†ั‹ ะปะตั‚ะฐะปะธ ะฝะฐะด ะดะตั€ะตะฒัŒัะผะธ, ั‡ะธั€ะธะบะฐั ะธ ัั‚ั€ะตะบะพั‡ะฐ, ะฐ ะบะฐะบะฐั-ั‚ะพ ะพัะพะฑะตะฝะฝะพ ัƒัะตั€ะดะฝะพ ะฝะฐะฟะตะฒะฐะปะฐ: ~ midori tanabiku namimori no dainaku shounaku nami ga ii itsumo kawaranu sukoyaka kenage aa~ tomo ni utaou namimorichuu ~ ะ”ะฐ... 
ัั‚ะพ ะฑั‹ะปะฐ ั‚ะฐ ัะฐะผะฐั ั‡ะพะบะฝัƒั‚ะฐั ะฟั‚ะธั‡ะบะฐ, ั…ะพะทัะธะฝะพะผ ะบะพั‚ะพั€ะพะน ะฑั‹ะป ะฝะต ะผะตะฝะต ั‡ะพะบะฝัƒั‚ั‹ะน ะฅะธะฑะฐั€ะธ ะšั‘ั. ะฅะพั‚ั ะฝะฐะทะฒะฐั‚ัŒ ะตะณะพ ั‚ะฐะบ ะฟั€ะธะปัŽะดะฝะพ ะฝะธ ัƒ ะบะพะณะพ ะฑั‹ ัะทั‹ะบ ะฝะต ะฟะพะฒะตั€ะฝัƒะปัั... ะฝัƒ, ะฟะพั‡ั‚ะธ ะฝะธ ัƒ ะบะพะณะพ. ะ’ั€ะตะผะตะฝะฐ ัˆะบะพะปัŒะฝะพะน ะฟะพั€ั‹ ะฟั€ะพัˆะปะธ, ะธ ั‚ะตะฟะตั€ัŒ ะฝะฐัั‚ะฐะปะธ ะฝะต ะผะตะฝะตะต ะฝะฐัั‹ั‰ะตะฝะฝั‹ะต ะฒั€ะตะผะตะฝะฐ ัั‚ัƒะดะตะฝั‡ะตัั‚ะฒะฐ. ะขะฐะบ ัƒะถ ะฟะพะปัƒั‡ะธะปะพััŒ, ััƒะดัŒะฑั‹ ะทะปะฐั ัˆัƒั‚ะบะฐ, ั‡ั‚ะพ ะฑะตะดะฝะพะณะพ ะกะฐะฒะฐะดัƒ ะขััƒะฝะฐั‘ัˆะธ ะฟะตั€ะตะฝะฐะฟั€ะฐะฒะธะปะธ ะฒ ัƒะฝะธะฒะตั€ัะธั‚ะตั‚, ะณะดะต ะณะปะฐะฒะพะน ะดะธัั†ะธะฟะปะธะฝะฐั€ะฝะพะณะพ ะบะพะผะธั‚ะตั‚ะฐ ะฑั‹ะป ัั‚ั€ะฐั… ะธ ัƒะถะฐั ะตะณะพ ะถะธะทะฝะธ - ะฅะธะฑะฐั€ะธ ะšั‘ั! ะัƒ, ั€ะฐะทัƒะผะตะตั‚ัั ะฟะพัะปะต ั€ะตะฟะตั‚ะธั‚ะพั€ะฐ... ะฝะพ ะฝะต ะพะฑ ัั‚ะพะผ ัะตะนั‡ะฐั. ะ›ัŽะฑะพะฟั‹ั‚ะฝะพ, ั‡ั‚ะพ ะฑะตะดะฝะพะณะพ ะกะฐะฒะฐะดัƒ ะขััƒะฝะฐั‘ัˆะธ, ะพัˆะธะฑะพั‡ะฝะพ, ะทะฐะฟะธั…ะฝัƒะปะธ ัั€ะฐะทัƒ ะฝะฐ 2 ะบัƒั€ั! ะœ-ะดะฐ... ะฝะต ะฟะพะฒะตะทะปะพ ั€ะตะฑั‘ะฝะบัƒ... ะะพ ั‚ัƒั‚ ั„ะพั€ั‚ัƒะฝะฐ ะฟะพะฒะตั€ะฝัƒะปะฐััŒ ะบ ะฝะตะผัƒ ัะฒะพะธะผ ั€ั‹ะปะพะผ, ะธ ะฒ ะตะณะพ ะบะปะฐััะต ะพะฝ ะฟะพะฒัั‚ั€ะตั‡ะฐะป ะทะฐะผะตั‡ะฐั‚ะตะปัŒะฝะพะณะพ ั‡ะตะปะพะฒะตะบะฐ - ะะปะปะตะฝะฐ ะฃะพะปะบะตั€ะฐ. ะก ะฝะธะผ ะพะฝะธ ะผะธะณะพะผ ัะดั€ัƒะถะธะปะธััŒ ะธ ัั‚ะฐะปะธ, ะฝะต ั€ะฐะทะปะตะน ะฒะพะดะฐ. ะะพ ัั‚ะพ ะฑั‹ะปะพ ะพัะตะฝัŒัŽ, ะฐ ั‚ะตะฟะตั€ัŒ ะฒะตัะฝะฐ! ะ ัั‚ะพ ะทะฝะฐั‡ะธั‚... ะกั†ะตะฝะฐ 1. ะ”ัƒะฑะปัŒ 1. - ะขััƒะฝะฐ, ะฝะต ะฟะตั€ะตะถะธะฒะฐะน ั‚ั‹ ั‚ะฐะบ! ะกะดะฐัˆัŒ ั‚ั‹ ัั‚ะธ ัะบะทะฐะผะตะฝั‹! ะ’ะตะดัŒ ะธ ั, ะธ ั‚ะฒะพะน ั€ะตะฟะตั‚ะธั‚ะพั€ ะทะฐะฝะธะผะฐะปะธััŒ ั ั‚ะพะฑะพะน ะฒะตััŒ ัƒั‡ะตะฑะฝั‹ะน ะณะพะด! ะขั‹ ะดะฐะถะต ะฝะฐั‡ะฐะป ะฟะพะฝะธะผะฐั‚ัŒ ะฐะทั‹ ัะปะตะบั‚ั€ะพั„ะธะทะธะบะธ! - ัƒัะฟะพะบะฐะธะฒะฐะป ะฒะตั‡ะฝะพ ะปะพัะปัŒะฝั‹ะน ัะตะดะพะน, ะฟะพะณะปะฐะถะธะฒะฐั ะขััƒะฝัƒ ะฟะพ ะฟัƒัˆะธัั‚ะพะน ะบะฐัˆั‚ะฐะฝะพะฒะพะน ัˆะตะฒะตะปัŽั€ะต. 
- ะัƒ, ะฐ ะตัะปะธ ั‡ั‚ะพ, ะพัั‚ะฐะฝะตัˆัŒัั ะฝะฐ ะฒั‚ะพั€ะพะน ะณะพะด! ะ’ะพะฝ, ะฝะตะบะพั‚ะพั€ั‹ะต ั‚ะฐะบ ัƒะถะต 3 ั€ะฐะทะฐ ะดะตะปะฐะปะธ! - ะบะธะฒะฝัƒะป ะพะฝ ะฝะฐ ะšะฐะฝะดัƒ, ั‡ั‚ะพ ัะธะดะตะป ัƒ ะพะบะฝะฐ ะฒ ะบะพะฝั†ะต ะบะปะฐััะฐ. ะšะฐะฝะดะฐ ะฎัƒ, ะพ-ะพ-ะพ! ะญั‚ะพ, ะฒะพะพะฑั‰ะต, ะพั‚ะดะตะปัŒะฝะฐั ะธัั‚ะพั€ะธั! ะฅัƒะปะธะณะฐะฝ, ะพั‚ะปะธั‡ะฝะธะบ, ะบั€ะฐัะฐะฒะตั†, ะฟะพัะปะตะดะฝัั ัะบะพั‚ะธะฝะฐ, ั‡ะตะปะพะฒะตะบ ั‡ะตัั‚ะธ, ะฑะตะทะดะฐั€ัŒ, ะณั€ะพะทะฐ ะฒัะตั… ะธ ะฒัั... ั‡ัƒะฒัั‚ะฒะฐ ัะผะตัˆะฐะฝะฝั‹ะต. ะšะฐะบ ะฒัั‘ ัั‚ะพ ะธ ะตั‰ั‘ ะผะฝะพะณะพ "ะฟะพะปะพะถะธั‚ะตะปัŒะฝั‹ั…" ะบะฐั‡ะตัั‚ะฒ ะฝะฐั…ะพะดัั‚ัั ะฒ ะพะดะฝะพะผ ั‡ะตะปะพะฒะตะบะต, ะะปะปะตะฝ ะพั‚ะบะฐะทั‹ะฒะฐะปัั ะฟะพะฝะธะผะฐั‚ัŒ! - ะะพ ะพะฝ ั…ะพั‚ั ะฑั‹ ะบั€ัƒั‚ะพะน, ะธ ะพั‚ะปะธั‡ะฝะธะบ, ะฐ ั ะบะฐะบ ะฑั‹ะป ะฝะธะบั‡ะตะผะฝั‹ะผ, ั‚ะฐะบะธะผ ะธ ะพัั‚ะฐะฝัƒััŒ. ะœะฝะต ะฝะต ัะดะฐั‚ัŒ ัั‚ะธ ัะบะทะฐะผะตะฝั‹, ะฝะธ ะทะฐ ั‡ั‚ะพ ะฒ ะถะธะทะฝะธ! - ะฟั€ะพะดะพะปะถะฐะป ัั‚ั€ะฐะดะฐั‚ัŒ ะขััƒะฝะฐ, ัั…ะฒะฐั‚ะธะฒัˆะธััŒ ะทะฐ ะณะพะปะพะฒัƒ. ะขะฐะบะธะผ ะพะฝ ะฑั‹ะป, ัะปะธัˆะบะพะผ ะฝะตัƒะฒะตั€ะตะฝะฝั‹ะผ ะฒ ัะตะฑะต, ะฟะตััะธะผะธัั‚ะธั‡ะฝั‹ะผ, ะฐ ะตั‰ั‘ ะฟะพัะปะตะดะฝะธะผ ะฝะตัƒะดะฐั‡ะฝะธะบะพะผ... ัะฟะธัะพะบ ะผะพะถะฝะพ ะฟั€ะพะดะพะปะถะธั‚ัŒ. ะะพ ะฒ ั‚ะพะถะต ะฒั€ะตะผั, ั€ะฐะดะธ ะดั€ัƒะทะตะน ะพะฝ ะฑั‹ะป ะณะพั‚ะพะฒ ะฝะฐ ะผะฝะพะณะพะต! ะ•ะณะพ ะพั‚ะทั‹ะฒั‡ะธะฒะพัั‚ัŒ, ะดะพะฑั€ะพั‚ะฐ ะฝะต ะทะฝะฐะปะฐ ะณั€ะฐะฝะธั†. ะ•ัะปะธ ะบั‚ะพ-ั‚ะพ ะพะฑะธะถะฐะป ะตะณะพ ะดั€ัƒะทะตะน, ะตะณะพ ะณะปะฐะทะฐ ัั‚ะฐะฝะพะฒะธะปะธััŒ ะพั€ะฐะฝะถะตะฒั‹ะผะธ, ะฐ ัะฐะผ ะพะฝ ัะตั€ัŒั‘ะทะฝั‹ะผ ะธ ะผะตะณะฐ-ัะธะปัŒะฝั‹ะผ. - ะ‘ะฐะšะฐะฝะดะฐ-ั‚ะพ?! ะฅะฐ-ั…ะฐ-ั…ะฐ! - ั€ะฐััะผะตัะปัั ะฃะพะปะบะตั€. - ะ”ัƒั€ะฐะบ ะดัƒั€ะฐะบะพะผ! ะžะฝ ะฟั€ะพัั‚ะพ ะฒะตะทัƒะฝั‡ะธะบ ั ั€ะตะฟัƒั‚ะฐั†ะธะตะน ะธ ะฒะฝะตัˆะฝะพัั‚ัŒัŽ! ะ˜ ะฒัั‘! - ะพะฝ ะผะฝะพะณะพะทะฝะฐั‡ะธั‚ะตะปัŒะฝะพ ั…ะผั‹ะบะฝัƒะป. - ะ ั‚ั‹, ั‚ั‹ ะดะพะฑั€ั‹ะน ะธ ะผะธะปั‹ะน! ะŸั€ะพัั‚ะพ ะฑัƒะดัŒ ะฟะพัƒะฒะตั€ะตะฝะฝะตะต ะฒ ัะตะฑะต, ะธ ะฒัั‘ ะฟะพะปัƒั‡ะธั‚ัั! 
- ะญั…, ะธ ะบะฐะบ ะตะผัƒ ัƒะดะฐะตั‚ัั ะฑั‹ั‚ัŒ ั‚ะฐะบะธะผ ัƒะฒะตั€ะตะฝะฝั‹ะผ? ะฃ ะผะตะฝั ั‚ะฐะบ ะฝะต ะฟะพะปัƒั‡ะฐะตั‚ัั... - ะฒะทะดะพั…ะฝัƒะป ะกะฐะฒะฐะดะฐ, ะฟะพัะผะพั‚ั€ะตะฒ ะฝะฐ ะšะฐะฝะดัƒ. - ะ”ะฐ, ะธ ะฟั€ะธ ัั‚ะพะผ ะพะฝ ะฝะธั‡ะตะณะพ ะฝะต ะดะตะปะฐะตั‚, ะปะธัˆัŒ ัะธะดะธั‚ ะฝะฐ ัะฒะพั‘ะผ ะผะตัั‚ะต, ะฝะพ ะฒัะต ะดะตะฒั‡ะพะฝะบะธ ะฒะพะทะปะต ะฝะตะณะพ ะฒัŒัŽั‚ัั. ะะพ ั‚ัƒั‚, ะฒะดั€ัƒะณ, ะšะฐะฝะดะฐ ะฟะพัะผะพั‚ั€ะตะป ะฒ ะธั… ัั‚ะพั€ะพะฝัƒ, ะฐ ะขััƒะฝะฐ ั‚ัƒั‚ ะถะต ะพั‚ะฒะตั€ะฝัƒะปัั ะธ ัะถะฐะปัั, ะฑัƒะดั‚ะพ ะตะณะพ ั‚ะพะปัŒะบะพ ั‡ั‚ะพ ะพะฑะปะธะปะธ ะปะตะดัะฝะพะน ะฒะพะดะพะน. - ะคัƒั…... ะะปะปะตะฝ ั‚ะพะถะต ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะšะฐะฝะดัƒ ะธ, ะฟะพะบะฐะทะฐะฒ ะตะผัƒ ัะทั‹ะบ, ะพั‚ะฒะตั€ะฝัƒะปัั. - ะŸั„! ะ˜ ั‡ั‚ะพ ะพะฝะธ ะฒ ะฝั‘ะผ ะฝะฐัˆะปะธ, ะฝะต ะฟะพะฝะธะผะฐ... - ะฒะพั‚ ั‚ะตะฟะตั€ัŒ ัƒะถะต ะะปะปะตะฝ ะทะฐะผะตั€ ัƒัั‚ะฐะฒะธะฒัˆะธััŒ ะฝะฐ ะดะฒะตั€ะฝะพะน ะฟั€ะพั‘ะผ, ะพั‚ะบัƒะดะฐ ะธะทะปัƒั‡ะฐะปะฐััŒ ะฐัƒั€ะฐ ัะผะตั€ั‚ะธ. ะญั‚ะพ ะฑั‹ะป ะฅะธะฑะฐั€ะธ ะšั‘ั "ะงั‚ะพ ะตะผัƒ ะฝัƒะถะฝะพ?!" ะกั†ะตะฝะฐ 2. ะ”ัƒะฑะปัŒ 1. - ะšั‚ะพ... ะบั‚ะพ ะฟะพัะผะตะป ะฟั€ะธะนั‚ะธ ะฒ ัƒะฝะธะฒะตั€ัะธั‚ะตั‚ ะฑะตะท ัะผะตะฝะบะธ?!!! ะขัƒั‚ ะฃะพะปะบะตั€ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะฟะพะป ะธ ะฒะทะดั€ะพะณะฝัƒะป. ะ“ั€ัะทัŒ! ะ›ัƒะถะธ ะณั€ัะทะธ ะพั‚ ะฒะพะตะฝะฝั‹ั… ัะฐะฟะพะณ, ะฐ ั‚ะฐะบะธะต ัะฐะฟะพะณะธ ั‚ะพะปัŒะบะพ ัƒ... - ะšะฐะฝะดะฐ ะฎัƒ! - ะฒะทั€ะตะฒะตะป ะšั‘ั. ะะพ ะฟะฐั€ะตะฝัŒ ะปะธัˆัŒ ะพะดะฐั€ะธะป ะตะณะพ ัะฒะพะธะผ ะพะฑั‹ั‡ะฝั‹ะผ, ั€ะฐะฒะฝะพะดัƒัˆะฝั‹ะผ ะฒะทะณะปัะดะพะผ, ะฟะพะปะฝั‹ะผ ั…ะพะปะพะดะฐ. - ะงั‚ะพ-ั‚ะพ ะฝะต ั‚ะฐะบ? - ะขั‹, ั‚ั€ะฐะฒะพัะดะฝะพะต! - ะฟะพะดะพะนะดั ะบ ะšะฐะฝะดะต, ะฅะธะฑะฐั€ะธ ะปะฐัะบะพะฒะพ ะพั‚ะพะดะฒะธะฝัƒะป ะฟะฐั€ั‚ัƒ. - ะขั‹ ะพั‚ะฒะตั‚ะธัˆัŒ ะทะฐ ั‚ะพ, ั‡ั‚ะพ ะธัะฟะฐั‡ะบะฐะป ะฟะพะปั‹! - ะพะฝ ะฝะฐัะบะฒะพะทัŒ ะฟั€ะพะถะธะณะฐะป ะฒะทะณะปัะดะพะผ. - ะฅะผ, ะตั‰ั‘ ั‡ะตะณะพ, - ะณะพั€ะดั‹ะต ัะธะฝะธะต ะณะปะฐะทะฐ ะฟั€ะพะฝะธะทั‹ะฒะฐะปะธ ั…ะพะปะพะดะพะผ ะฒ ะพั‚ะฒะตั‚. ะ’ะดะพะฑะฐะฒะพะบ ะพะฝ ะทะฐะบะธะฝัƒะป ะฝะพะณัƒ ะฝะฐ ะฝะพะณัƒ. 
- ะกะผะตะฝะบะฐ ะฟะพั€ะฒะฐะปะฐััŒ, ะดั€ัƒะณะพะน ั ะฝะต ะฝะฐัˆั‘ะป, ะฟั€ะธัˆะปะพััŒ ะธะดั‚ะธ ะฒ ัƒะฝะธะฒะตั€ัะธั‚ะตั‚ ั‚ะฐะบ. - ะ”ะฐ ะฟะปะตะฒะฐั‚ัŒ ั ั…ะพั‚ะตะป! ะ‘ะพัะธะบะพะผ ั…ะพะดะธ! ะ ะฟะพะผะตั‰ะตะฝะธะต ะฟะฐั‡ะบะฐั‚ัŒ ะฝะต ัะผะตะน! - ั€ั‹ั‡ะฐะป ะšั‘ั. - ะ—ะฐะฒั‚ั€ะฐ ั‚ะฐะบ ะธ ัะดะตะปะฐัŽ - ั„ั‹ั€ะบะฝัƒะป ั‚ะพั‚. - ะญั‚ะพ ะฒัั‘? - ะ‘ัƒะดะตัˆัŒ ะฝะตะดะตะปัŽ ะผั‹ั‚ัŒ ะฟะพะปั‹ ะฒ ัั‚ะพะผ ะบะพั€ะธะดะพั€ะต! - ะฝะฐั…ะผัƒั€ะธะปัั ะณะปะฐะฒะฐ ะดะธัั†ะธะฟะปะธะฝะฐั€ะฝะพะณะพ ะบะพะผะธั‚ะตั‚ะฐ. - ะ˜ ะฝะฐั‡ะฝั‘ัˆัŒ, ะฟั€ัะผะพ ัะตะนั‡ะฐั! - ะขั‡, ะฝะต ะฝะฐะผะตั€ะตะฝ. ะ”ะปั ัั‚ะพะณะพ ะตัั‚ัŒ ัƒะฑะพั€ั‰ะธั†ั‹, ะธ... - ะฑั€ะพัะธะป ะบะพั€ะพั‚ะบะธะน ะฒะทะณะปัะด ะฒ ัั‚ะพั€ะพะฝัƒ ะฃะพะปะบะตั€ะฐ ะธ ะขััƒะฝั‹. - ะ”ะตะถัƒั€ะฝั‹ะต. - ะงะตะณะพ-ะพ-ะพ?! - ะฒะพะทะผัƒั‚ะธะปัั ะฃะพะปะบะตั€. - ะ—ะฐ ะบะพั€ะธะดะพั€ ะผั‹ ะฝะต ะพั‚ะฒะตั‡ะฐะตะผ! - ะฅะผ, - ั…ะผั‹ะบะฝัƒะป ะšั‘ั, ะธ ะผะฐะปัŒั‡ะธะบ ั€ะตัˆะธะป ะฟะพะผะพะปั‡ะฐั‚ัŒ. - ะขั‹ ะทะฐะฟะฐั‡ะบะฐะป ั‚ั‹ ะธ ัƒะฑะธั€ะฐะน, ะฐ ะธะฝะฐั‡ะต... - ะณะปะฐะทะฐ ัะฒะตั€ะบะฝัƒะปะธ ะฝะต ะฟะพ-ะดะพะฑั€ะพะผัƒ. - ะšะฐะผะธะบะพั€ะพั! - ะะตั‚ ะถะตะปะฐะฝะธั ะดั€ะฐั‚ัŒัั, ะฝะพ ั€ะฐะท ั‚ั‹ ะฝะฐัั‚ะฐะธะฒะฐะตัˆัŒ! - ะšะฐะฝะดะฐ ะฟะพะดะฝัะปัั ั ะผะตัั‚ะฐ, ัะผะพั‚ั€ั ะฝะฐ ะฟะฐั€ะฝั ั ะฒั‹ะทะพะฒะพะผ. ะžะฝ ะฝะต ัะพะฑะธั€ะฐะปัั ะพั‚ะดะฐะฒะฐั‚ัŒ ัะฒะพะตะผัƒ ะณะปะฐะฒะฝะพะผัƒ ัะพะฟะตั€ะฝะธะบัƒ ะทะฒะฐะฝะธะต ะณั€ะพะทั‹ ัƒะฝะธะฒะตั€ัะธั‚ะตั‚ะฐ. - ะž, ัั‚ะพ ะฑัƒะดะตั‚ ะธะฝั‚ะตั€ะตัะฝะพ, - ะทะปะพั€ะฐะดะฝะพ ัƒั…ะผั‹ะปัŒะฝัƒะปัั. - ะ’ัะต ะฒะพะฝ! ะŸะพะบะฐ ะฝะต ะฟะตั€ะตะฑะธะป. ะ’ะตััŒ ะบะปะฐัั, ั‡ั‚ะพ ะถะฐะปัั ะฟะพ ัั‚ะตะฝะพั‡ะบะฐะผ, ะผะพะผะตะฝั‚ะฐะปัŒะฝะพ ะฒั‹ัั‹ะฟะฐะป ะฒ ะบะพั€ะธะดะพั€. ะšั€ะพะผะต ะขััƒะฝั‹ ะธ ะะปะปะตะฝะฐ, ั‡ั‚ะพ ะทะฐะฒะพั€ะพะถะตะฝะพ ะฝะฐะฑะปัŽะดะฐะปะธ ะทะฐ ัะพะฑั‹ั‚ะธัะผะธ. ะกะฐะฒะฐะดะฐ ัะพ ัั‚ั€ะฐั…ัƒ ะฒั†ะตะฟะธะปัั ะฒ ั€ัƒะบัƒ ะฃะพะปะบะตั€ะฐ, ะฐ ัะฐะผ ะฟะฐั€ะตะฝัŒ ะพะฑะตัะฟะพะบะพะตะฝะฝะพ ัะผะพั‚ั€ะตะป ะฒ ัั‚ะพั€ะพะฝัƒ ะดะปะธะฝะฝะพะฒะพะปะพัะพะณะพ ัะฟะพะฝั†ะฐ. - ะฎัƒ... - ั‚ะธั…ะพ ะฟะพะทะฒะฐะป ะพะฝ. 
- ะŸั€ะฐะฒะธะปัŒะฝะพ, ัะฒะธะดะตั‚ะตะปะธ ะฝะธ ะบ ั‡ะตะผัƒ, - ั‚ะฐะบ ะถะต ัƒั…ะผั‹ะปัŒะฝัƒะปัั ะšะฐะฝะดะฐ, ั€ะฐะทะผะธะฝะฐั ั€ัƒะบะธ. - ะ’ั‹, ะดะฒะพะต, ั€ะฐะทะฒะต ะฝะต ััะฝะพ ะฑั‹ะปะพ ัะบะฐะทะฐะฝะพ? - ะณะปัะฝัƒะป ะพะฝ ะฒ ัั‚ะพั€ะพะฝัƒ ะฟะฐั€ะฝะตะน. - ะะปะปะตะฝ, ะผะพะถะตั‚... - ั‚ะธั…ะพ ะฟั€ะพัะบัƒะปะธะป ะขััƒะฝะฐ, ะฟั€ะตะบั€ะฐัะฝะพ ะทะฝะฐะฒัˆะธะน ะฝั€ะฐะฒ ะฅะธะฑะฐั€ะธ. ะ‘ะตะปะพะฑั€ั‹ัั‹ะน, ั‡ั‚ะพ ะฒัั‘ ัั‚ะพ ะฒั€ะตะผั ะฟะตั€ะตะฒะพะดะธะป ะฒะทะณะปัะด ั ะšั‘ั ะฝะฐ ะšะฐะฝะดัƒ, ะฒะทะดะพั…ะฝัƒะป, ะพะฟัƒัั‚ะธะฒ ะณะปะฐะทะฐ, ะธ ะฟะพะดะดะฐะปัั ะฝะฐ ัƒะณะพะฒะพั€ั‹ ะกะฐะฒะฐะดั‹, ะฟะพะทะฒะพะปะธะฒ ัƒั‚ะฐั‰ะธั‚ัŒ ัะตะฑั ะฒ ะบะพั€ะธะดะพั€. ะกั†ะตะฝะฐ 3. ะ”ัƒะฑะปัŒ 1. - ะฅะต... - ะฅะธะฑะฐั€ะธ ัั‚ั€ะฐะฝะฝะพ ั…ะผั‹ะบะฝัƒะป, ะบั€ะฐะตะผ ะณะปะฐะทะฐ ะฝะฐะฑะปัŽะดะฐั ะทะฐ ัƒัˆะตะดัˆะธะผะธ. ะšะฐะฝะดะฐ ั‚ะฐะบ ะถะต ะผะพะปั‡ะฐ, ะฟั€ะพะฒะพะดะธะป ะฟะพะดั€ะพัั‚ะบะพะฒ ะฒะทะณะปัะดะพะผ ะธ ะฒะฝะพะฒัŒ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ัะฒะพะตะณะพ ะฟั€ะพั‚ะธะฒะฝะธะบะฐ. ะญั‚ะฐ ัƒั…ะผั‹ะปะบะฐ ะฝะธ ะพ ั‡ั‘ะผ ะดะพะฑั€ะพะผ ะฝะต ะณะพะฒะพั€ะธะปะฐ. ะฅะธะฑะฐั€ะธ ะฝะตะพะถะธะดะฐะฝะฝะพ ัƒะดะฐั€ะธะป ะฟะฐั€ะฝั ะฒ ะถะธะฒะพั‚ ั‚ะฐะบ, ั‡ั‚ะพ ั‚ะพั‚ ะพั‚ะปะตั‚ะตะป ะบ ะพะบะฝัƒ. - ะขั‡... - ะšะฐะฝะดะฐ ัะพะณะฝัƒะปัั, ะฝะพ ะฑั‹ัั‚ั€ะพ ะฟั€ะธัˆั‘ะป ะฒ ัะตะฑั ะธ ะฟะพะดะฝัะปัั. ะŸะพัะปะตะดะพะฒะฐะป ะพั‚ะฒะตั‚ะฝั‹ะน ัƒะดะฐั€. - ะฅะผ... ัะปะฐะฑะฐะบ! - ะšั‘ั ะฑั‹ัั‚ั€ะพ ะฑะปะพะบะธั€ะพะฒะฐะป ัั‚ะพั‚ ัƒะดะฐั€ ะธ ะฟะพะดัะตั‡ะบะพะน ัะฑะธะป ะฟั€ะพั‚ะธะฒะฝะธะบะฐ ั ะฝะพะณ. ะฎัƒ ะฝะต ั€ะฐัั‚ะตั€ัะปัั ะธ ัƒะดะฐั€ะธะป ะตะณะพ ะฟะพ ะฝะพะณะฐะผ, ั‚ะพะถะต ะทะฐะฒะฐะปะธะฒ ะฝะฐ ะฟะพะป ะธ ัะตะป ะฝะฐ ะฝะตะณะพ, ะบะฐะบ ะฝะฐ ัะบะฐะผะตะนะบัƒ. ะŸะพั‚ะพะผ ะฟะพะดะฝัะปัั ะธ ะทะฐะปะพะผะธะป ั‚ะพะผัƒ ั€ัƒะบะธ, ะฟั€ะธะณะธะฑะฐั ะบ ะฟะพะปัƒ. - ะ‘ะตัะธัˆัŒ! ะฅะธะฑะฐั€ะธ ะฒั‹ะฒะตั€ะฝัƒะปัั ะธ ั ั€ะฐะทะฒะพั€ะพั‚ะฐ ัƒะดะฐั€ะธะป ะฟะพ ะปะธั†ัƒ. - ะขั€ะฐะฒะพัะดะฝั‹ะต ะดะพะปะถะฝั‹ ะผะพะปั‡ะฐั‚ัŒ ะธ ะฟะพะดั‡ะธะฝัั‚ัŒัั! - ะฏ ั‚ะฐะบะพะน ัะฒะพะปะพั‡ะธ ะฟะพะดั‡ะธะฝัั‚ัŒัั ะฝะต ัะพะฑะธั€ะฐัŽััŒ! 
- ัƒะดะฐั€ ะฒ ะฑะพะบ ะฟะพ ะฟะตั‡ะตะฝะธ. ะšั‘ั ัƒะดะฐั€ะธะป ะฟะพ ะณะพะปะพะฒะต ั‚ะพะฝั„ะฐ. - ะ ั‚ะตะฑั ะฝะธะบั‚ะพ ะฝะต ัะฟั€ะฐัˆะธะฒะฐะตั‚! ะ”ะธัั†ะธะฟะปะธะฝะฐ ะฝะฐ ะฟะตั€ะฒะพะผ ะผะตัั‚ะต! ะšะฐะฝะดะฐ ะทะฐะตั…ะฐะป ะฝะพะณะพะน ะฒ ะถะธะฒะพั‚. ะกะฐะฟะพะณะฐะผะธ ัั‚ะพ ะพั‡ะตะฝัŒ ะถะตัั‚ะพะบะพ. - ะŸะพะบะฐ ะผะตะฝั ะฝะธะบั‚ะพ ะฝะต ั‚ั€ะพะณะฐะตั‚, ั ัะฟะพะบะพะตะฝ! - ะŸะพะบะฐ ะฝะต ะฝะฐั€ัƒัˆะฐะตัˆัŒ ะฟั€ะฐะฒะธะปะฐ, ัะฟะพะบะพะตะฝ ั! - ะฟะฐั€ะตะฝัŒ ั ัะธะปะพะน ัƒะดะฐั€ะธะป ะฟะพ ัะพะปะฝะตั‡ะฝะพะผัƒ ัะฟะปะตั‚ะตะฝะธัŽ. - ะšั…... ัƒะฑะปัŽะดะพะบ, - ัะผะพั€ั‰ะธะปัั ะฎัƒ. - ะขะพะถะต ะผะฝะต - ะฝะฐะณะปั‹ะน! ะ”ัƒะผะฐะตัˆัŒ, ั…ัƒะน ะพั‚ั€ะฐัั‚ะธะป, ะธ ั‚ะตะฑะต ะฒัั‘ ะดะพะทะฒะพะปะตะฝะพ?! - ะฟั€ะพั€ั‹ั‡ะฐะป ะฅะธะฑะฐั€ะธ. - ะ“ะพะฒะพั€ะธ, ั‡ั‚ะพ ั…ะพั‡ะตัˆัŒ, ะฝะพ ะฟะพะปั‹ ะผั‹ั‚ัŒ ั ะฝะต ัะพะฑะธั€ะฐัŽััŒ, - ั‚ะตะผ ะถะต ั‚ะพะฝะพะผ ะพั‚ะฒะตั‚ะธะป ะฟั€ะพั‚ะธะฒะฝะธะบ. - ะะพ ั‚ะฐะบะธ ะฒั‹ะผะพะตัˆัŒ! - ัะฝะพะฒะฐ ัƒะดะฐั€ะธะป ะณะปะฐะฒะฐ ะดะธัั†ะธะฟะปะธะฝะฐั€ะฝะพะณะพ ะบะพะผะธั‚ะตั‚ะฐ. - ะ—ะฐะฒั‚ั€ะฐ ะฒะพะพะฑั‰ะต ะฝะต ัะฒะปัŽััŒ. ะ˜ ะฟะปะฐะบะฐะป ะฒะฐัˆ ะบัƒะฑะพะบ ะทะฐ ะฟะตั€ะฒะพะต ะผะตัั‚ะพ ะฟะพ ะฑะฐัะบะตั‚ะฑะพะปัƒ, - ะฒั‹ั‚ะตั€ะฟะตะป ะšะฐะฝะดะฐ. - ะขั‹ ะผะฝะต ั‚ัƒั‚ ะฝะต ัƒะณั€ะพะถะฐะน! ะะตะทะฐะผะตะฝะธะผั‹ั… ะปัŽะดะตะน ะฝะต ะฑั‹ะฒะฐะตั‚! ะ ั‚ะตะผ ะฑะพะปะตะต ั‚ะตะฑั ะทะฐะผะตะฝะธั‚ัŒ - ั€ะฐะท ะฟะปัŽะฝัƒั‚ัŒ! - ะžั…, ั‚ะพะณะดะฐ ัั‚ะพ ะถะต ะพั‚ะปะธั‡ะฝะพ! ะ—ะฐะฒั‚ั€ะฐ ั†ะตะปั‹ะน ะดะตะฝัŒ ะฟั€ะพะฒะฐะปััŽััŒ ะฒ ะบั€ะพะฒะฐั‚ะธ ะธ ะฝะต ัƒะฒะธะถัƒ ัั‚ะพะณะพ ะผะตะปะบะพะณะพ. ะ—ะฐะดะพะปะฑะฐะป ะฟัะปะธั‚ัŒัั. - ะœ? ะ ะฟั€ะธั‡ั‘ะผ ั‚ัƒั‚ ะะปะปะตะฝ?! - ะšั‘ั ะฒัะบะธะฝัƒะป ะฑั€ะพะฒัŒ. - ะŸั€ะธั‚ะพะผ, ั‡ั‚ะพ ะดะพัั‚ะฐะป, - ะฒะทะดะพั…ะฝัƒะป ะฎัƒ, - ัั‚ั€ะฐะฝะฝั‹ะน ะพะฝ ะบะฐะบะพะน-ั‚ะพ. ะ˜ ัะผะพั‚ั€ะธั‚ ะฝะฐ ะผะตะฝั ะบะฐะบ-ั‚ะพ ัั‚ั€ะฐะฝะฝะพ. - ะ ะฐะดะพะฒะฐะปัั ะฑั‹! ะ’ัะต ะพัั‚ะฐะปัŒะฝั‹ะต ะพั‚ ั‚ะตะฑั ัˆะฐั€ะฐั…ะฐัŽั‚ัั. ะก ั‚ะฐะบะธะผะธ ั‚ะตะผะฟะฐะผะธ ะธ ะดะพ ะพะฝะฐะฝะธะทะผะฐ ะฝะตะดะฐะปะตะบะพ, ะธะปะธ ั‚ั‹ ัƒะถะต? - ัƒัะผะตั…ะฝัƒะปัั ะฅะธะฑะฐั€ะธ. 
- ะŸั„, ะฝะตั‚... ะธ ั‡ั‚ะพ ะฒะพะพะฑั‰ะต ะทะฐ ะฒะพะฟั€ะพัั‹? ะฃะพะปะบะตั€ ะผะตะฝั ะฒ ะฟะพัะปะตะดะฝัŽัŽ ะพั‡ะตั€ะตะดัŒ ะธะฝั‚ะตั€ะตััƒะตั‚. - ะฏ ะฝะต ะพะฑ ัั‚ะพะน ะบะพะทัะฒะบะต ะณะพะฒะพั€ัŽ! ะ ะฟั€ะพ ั‚ะพ, ั‡ั‚ะพ ั ั‚ะฒะพะธะผ ั…ะฐั€ะฐะบั‚ะตั€ะพะผ ะฝะธ ะพะดะฝะฐ ะดะตะฒัƒัˆะบะฐ ะบ ั‚ะตะฑะต ะฝะต ะฟะพะดะพะนะดั‘ั‚! - ะฅะต, - ัƒัะผะตั…ะฝัƒะปัั ะšะฐะฝะดะฐ. - ะกะฟะพั€ะธะผ, ั ะปัŽะฑัƒัŽ ะทะฐ ะดะตะฝัŒ ัะผะพะณัƒ ะทะฐะบะฐะดั€ะธั‚ัŒ? ะ˜ ะทะฐะฝัั‚ัŒัั ัะตะบัะพะผ. - ะขั‹-ั‚ะพ? ะฅะฐ! ะ˜ ะทะฐ ะผะตััั† ะฝะต ัะฟั€ะฐะฒะธัˆัŒัั! - ะพัะบะฐะปะธะปัั ะšั‘ั. - ะขะฐะบ ะทะฝะฐั‡ะธั‚, ัะฟะพั€ะธะผ? - ะฟั€ะธะฟะพะดะฝัะปัั ะฎัƒ. - ะะพ ั‚ะพะณะดะฐ ะธ ั‚ั‹ ัƒั‡ะฐัั‚ะฒัƒะตัˆัŒ. - ะฅะต, ะดะฐัŽ ั‚ะตะฑะต ะฝะตะดะตะปัŽ! - ะฅะธะฑะฐั€ะธ ัƒะฑั€ะฐะป ั‚ะพะฝั„ะฐ ะธ ะฟั€ะพั‚ัะฝัƒะป ัะฒะพัŽ ั€ัƒะบัƒ. - ะ”ะพะณะพะฒะพั€ะธะปะธััŒ, - ะฟะพะถะฐะป ั€ัƒะบัƒ ั‚ะพั‚. - ะ˜ ะบั‚ะพ ัั‚ะฐะฝะตั‚ ั†ะตะปัŒัŽ? - ะฅะผ... ะฐ ั‚ะพั‚, ะบั‚ะพ ะฟะตั€ะฒั‹ะน ะฒะพะนะดั‘ั‚ ะฒ ัั‚ะพั‚ ะบะฐะฑะธะฝะตั‚! ะงั‚ะพะฑ ัƒะถ ั‡ะตัั‚ะฝะพ ะฑั‹ะปะพ. ะ’ ะฟะพะดั‚ะฒะตั€ะถะดะตะฝะธะต ะฟะพะฑะตะดั‹ ะฟั€ะธะฝะตััƒ ั‚ะตะฑะต ะฝะธะถะฝะตะต ะฑะตะปัŒั‘ ะถะตั€ั‚ะฒั‹! - ะณะปะฐะฒะฐ ะดะธัั†ะธะฟะปะธะฝะฐั€ะฝะพะณะพ ะบะพะผะธั‚ะตั‚ะฐ ะบั€ะตะฟั‡ะต ัะถะฐะป ั€ัƒะบัƒ ะธ, ั€ะฒะฐะฝัƒะฒ ะฝะฐ ัะตะฑั, ะฟะตั€ะตะบะธะฝัƒะป ะšะฐะฝะดัƒ ั‡ะตั€ะตะท ัะฟะธะฝัƒ ะฝะฐ ะฟะพะป. - ะะพ ัƒั‡ั‚ะธ, ะตัะปะธ ั‚ั‹ ะฟั€ะพะธะณั€ะฐะตัˆัŒ, ะฑัƒะดะตัˆัŒ ะดั€ะฐะธั‚ัŒ ัƒะฝะธะฒะตั€ัะธั‚ะตั‚ ะฒะตััŒ ะณะพะด! - ะขั‡... ะปะฐะดะฝะพ - ะฎัƒ ะฟะพะดะฝัะปัั, ะดะตั€ะถะฐััŒ ะทะฐ ัะฟะธะฝัƒ. - ะฏ ั‚ะตะฑะต ัั‚ะพ ะฝะต ะฟั€ะพั‰ัƒ. ะขัƒั‚ ะฒ ะดะฒะตั€ัŒ ั‚ะธั…ะพะฝัŒะบะพ ะฟะพัั‚ัƒั‡ะฐะปะธััŒ. - ะ ะตัะปะธ ะฒั‹ะธะณั€ะฐะตัˆัŒ ั‚ั‹, ั ะฝะฐ ะณะพะด ะพั‚ ั‚ะตะฑั ะพั‚ัั‚ะฐะฝัƒ! - ั…ะผั‹ะบะฝัƒะป ะšั‘ั ะธ ะฟะพะฒะตั€ะฝัƒะปัั ะบ ะดะฒะตั€ะธ. - ะฅะธะฑะฐั€ะธ-ัะฐะฝ! ะฏ, ะบะพะฝะตั‡ะฝะพ, ะฟะพะฝะธะผะฐัŽ, ั‡ั‚ะพ ะดะธัั†ะธะฟะปะธะฝะฐ - ัั‚ะพ ัะฒัั‚ะพะต, ะธ ะฟะพะดะดะตั€ะถะธะฒะฐัŽ ะฒะฐัˆะต ั€ะตัˆะตะฝะธะต ะฝะฐะดั€ะฐั‚ัŒ ัั‚ะพะผัƒ ะฟั€ะธะดัƒั€ะบัƒ ะทะฐะด! 
ะะพ ัƒ ะฝะฐั ั‚ัƒั‚ ัƒั€ะพะบ, ะฐ ะผะฝะต ั€ะตั„ะตั€ะฐั‚ ัะดะฐะฒะฐั‚ัŒ! - ะทะฐัˆั‘ะป ะฑะตะทัƒะฟั€ะตั‡ะฝั‹ะน ะะปะปะตะฝ ะฃะพะปะบะตั€, ะฒ ะบะพั‚ะพั€ะพะณะพ ะฝะฐะผะตั€ั‚ะฒะพ ะฒั†ะตะฟะธะปัั ะกะฐะฒะฐะดะฐ ะฟั‹ั‚ะฐัััŒ ะพัั‚ะฐะฝะพะฒะธั‚ัŒ. ะกั†ะตะฝะฐ 4. ะ”ัƒะฑะปัŒ 1. ะšะฐะฝะดะฐ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะผะฐะปัŒั‡ะธัˆะบัƒ ะธ ะธะทะดะฐะป ั‚ะธั…ะธะน ะทะฒัƒะบ, ะฟะพั…ะพะถะธะน ะฝะฐ ะบะพัˆะฐั‡ัŒะต ัˆะธะฟะตะฝะธะต. ะ’ะธะดะธะผะพ ะพะฝ ะฝะต ะพะถะธะดะฐะป, ั‡ั‚ะพ ะฟะตั€ะฒั‹ะผะธ ะฒ ะบะปะฐัั ะทะฐะนะดัƒั‚ ะธะผะตะฝะฝะพ ัั‚ะธ ะดะฒะพะต. - ะŸั€ะพั…ะพะดะธ, ะทะฐัˆั‘ะป ัƒะถะต, - ั…ะผั‹ะบะฝัƒะป ะฎัƒ ะธ, ะพั‚ะฒะตัะธะฒ ะฅะธะฑะฐั€ะธ ะฟะพะดะทะฐั‚ั‹ะปัŒะฝะธะบ, ะฟะพัะฟะตัˆะธะป ะฒะตั€ะฝัƒั‚ัŒัั ะฝะฐ ัะฒะพั‘ ะผะตัั‚ะพ. - ะž, ั‚ั‹ ะตั‰ั‘ ะถะธะฒะพะน?! ะŸะตั‡ะฐะปัŒะฝะพ... - ะฟะพะบะฐั‡ะฐะป ะณะพะปะพะฒะพะน ะะปะปะตะฝ. - ะ ะตะฑัั‚ะฐ ะทะฐั…ะพะดะธั‚ะต, ะฅะธะฑะฐั€ะธ ัƒัˆั‘ะป! - ั‚ัƒั‚ ะถะต ะฒ ะดะฒะตั€ัŒ ะฟะพะฒะฐะปะธะปะธ ะพัั‚ะฐะปัŒะฝั‹ะต. ะ˜ ะฟะพัะปะตะดะฝะธะผ ะทะฐัˆั‘ะป ะฟั€ะตะฟะพะดะฐะฒะฐั‚ะตะปัŒ. ะ‘ะตะปะพะฒะพะปะพัั‹ะน ะดะพัั‚ะฐะป ะธะท ััƒะผะบะธ ั€ะธััƒะฝะบะธ ะธ ั‡ะตั€ั‚ะตะถะธ, ะฟะพัะปะต ั‡ะตะณะพ ั€ะฐะทะฒะตัะธะป, ะฒะทัะป ัƒะบะฐะทะบัƒ ะธ ะฝะฐั‡ะฐะป ั€ะฐััะบะฐะทั‹ะฒะฐั‚ัŒ ั€ะตั„ะตั€ะฐั‚ ะฟะพ ัะบะพะปะพะณะธะธ. ะ’ะพะพะฑั‰ะต, ะพะฝ ะฝะต ะฑั‹ะป ะพั‚ะปะธั‡ะฝะธะบะพะผ, ะฝะพ ะฑะพะปัŒัˆะธะผ ั‚ั€ัƒะดัะณะพะน! ะ•ัะปะธ ั€ะฐะฝัŒัˆะต ะฎัƒ ะผะตั‡ั‚ะฐะป ะพั‚ัะธะดะตั‚ัŒ ะฟะพัะปะตะดะฝะธะต ัƒั€ะพะบะธ ะธ ัะฒะฐะปะธั‚ัŒ ะดะพะผะพะน, ั‚ะพ ั‚ะตะฟะตั€ัŒ ะตะณะพ ะถะตะปะฐะฝะธะตะผ ะฑั‹ะปะพ, ั‡ั‚ะพะฑั‹ ัƒั€ะพะบะธ ะฝะธะบะพะณะดะฐ ะฝะต ะทะฐะบะฐะฝั‡ะธะฒะฐะปะธััŒ. "ะขั‡, ะจะฟะตะฝะดะตะปัŒ. ะŸะพั‡ะตะผัƒ, ะฟะพั‡ะตะผัƒ ั‚ั‹ ั‚ะฐะบ ะฝะต ะฒะพะฒั€ะตะผั ัะฒะฐะปะธะปัั ะผะฝะต ะฝะฐ ะณะพะปะพะฒัƒ?!" - ะดัƒะผะฐะป ะพะฝ, ะดะตะปะฐั ะฒะธะด, ั‡ั‚ะพ ัะปัƒัˆะฐะตั‚. - ... ะ˜ ะฒะพั‚ ะฟะพัั‚ะพะผัƒ ะดะปั ัะฟะฐัะตะฝะธั ะบะธั‚ะพะฒ ั‚ะฐะบ ะฒะฐะถะฝะพ ะฟั€ะตะบั€ะฐั‚ะธั‚ัŒ ัั‚ั€ะตะปัŒะฑัƒ ะธ ะฟะตั€ะตะฒะพะท ะฝะตั„ั‚ะธ ั‡ะตั€ะตะท ะพะบะตะฐะฝ! ะฃ ะผะตะฝั ะฒัั‘! - ะทะฐะบะพะฝั‡ะธะป ั€ะฐััะบะฐะท. 
- ะัƒ ั‡ั‚ะพ ะถ, ะดัƒะผะฐัŽ, ะฝะฐ 4-ะบัƒ ะฒะฟะพะปะฝะต ั…ะฒะฐั‚ะธั‚. - ะงั‚ะพ?! ะะพ ัƒั‡ะธั‚ะตะปัŒ, ัƒ ะฝะตะณะพ ะฟะพั‚ั€ััะฐัŽั‰ะธะน ะดะพะบะปะฐะด! - ะทะฐั‰ะตะฑะตั‚ะฐะป ะพะดะฝะพะณั€ัƒะฟะฟะฝะธะบ. - ะžะฝ ะผะฝะพะณะพ ะณะพั‚ะพะฒะธะปัั, ะฒะพะปะฝะพะฒะฐะปัั, ะฟะพั‡ะตะผัƒ ั‡ะตั‚ั‹ั€ะต?! - ะทะฐัั‚ัƒะฟะธะปัั ะทะฐ ะะปะปะตะฝะฐ ะขััƒะฝะฐ. - ะ”ะฐ ะฟั€ะตะบั€ะฐัะฝั‹ะน ะดะพะบะปะฐะด, ะตัะปะธ ั‡ะตัั‚ะฝะพ, ะฝะต ะพะถะธะดะฐะป, - ะฒั‹ัะบะฐะทะฐะปัั ั‡ะตะปะพะฒะตะบ, ะบะพั‚ะพั€ะพะณะพ ะผะตะฝัŒัˆะต ะฒัะตะณะพ ัั‚ะพ ะผะพะณะปะพ ะฒะพะปะฝะพะฒะฐั‚ัŒ. ะšะฐะฝะดะฐ ัะผะพั‚ั€ะตะป ะฝะฐ ะฟั€ะตะฟะพะดะฐะฒะฐั‚ะตะปั. - ะญ-ะญ-ะญ?! - ะพัˆะฐะปะตะป ะฒะตััŒ ะบะปะฐัั. - ะ... ั-ัั‚-ั‚ะพ... - ะทะฐะปะธะปัั ั€ัƒะผัะฝั†ะตะผ ะฃะพะปะบะตั€. - ะัƒ ะปะฐะดะฝะพ, 5! ะฎัƒ, ะฟะพะฑะตะดะฝะพ ัƒั…ะผั‹ะปัŒะฝัƒะฒัˆะธััŒ, ะฟะตั€ะตะฒะตะป ะฒะทะณะปัะด ะฝะฐ ะะปะปะตะฝะฐ. ะขะพั‚ ะฟะพั‚ัƒะฟะธะปัั ะธ ัƒัั‚ะฐะฒะธะปัั ะฒ ะฟะพะป. "ะฅะผ, ะฒะพะทะผะพะถะฝะพ ัั‚ะพ ะฑัƒะดะตั‚ ะฝะต ั‚ะฐะบ ัƒะถ ะธ ัƒะถะฐัะฝะพ", - ะฟะพั‡ะตะผัƒ-ั‚ะพ ั‚ะพะปัŒะบะพ ัะตะนั‡ะฐั ะšะฐะฝ' - '- ะ”ะพะฑั€ะพะต ัƒั‚ั€ะพ, - ัˆะตะฟะพั‚ ั‰ะตะบะพั‡ะตั‚ ะผะฝะต ัƒั…ะพ. ะกะพะฒัะตะผ ะฝะต ั…ะพั‡ะตั‚ัั ั€ะฐะทะปะตะฟะปัั‚ัŒ ะณะปะฐะทะฐ ะธ ะฒัั‚ั€ะตั‡ะฐั‚ัŒ ะฝะพะฒั‹ะน ะดะตะฝัŒ. ะŸะพะฒะพั€ะฐั‡ะธะฒะฐัŽััŒ, ะฟั€ะธั‚ัะณะธะฒะฐั ั‚ะตะฑั ะฑะปะธะถะต, ะธ ัƒั‚ั‹ะบะฐัŽััŒ ะฝะพัะพะผ ั‚ะตะฑะต ะฒ ะณั€ัƒะดัŒ, ะพั‰ัƒั‰ะฐั ะทะฐะฟะฐั… ัะปะฐะดะพัั‚ะตะน, ะบะพั‚ะพั€ั‹ะต ะฝั€ะฐะฒัั‚ัั ะฝะฐะผ ะพะฑะพะธะผ. ะฏ ะตะถัƒััŒ ะพั‚ ั…ะพะปะพะดะฐ, ะฟั‹ั‚ะฐัััŒ ะฒัะปะตะฟัƒัŽ ะฝะฐะนั‚ะธ ะบั€ะฐั ัƒัŽั‚ะฝะพะณะพ ะพะดะตัะปะฐ ะธ ัะฝะพะฒะฐ ะพะบัƒะฝัƒั‚ัŒัั ะฒ ัะพะฝ. ะขั‹ ะทะฐะผะตั‡ะฐะตัˆัŒ ัั‚ะพ ะธ ะทะฐะฑะพั‚ะปะธะฒะพ ัƒะบั€ั‹ะฒะฐะตัˆัŒ ะผะตะฝั. ะขะฒะพะธ ะฟะฐะปัŒั†ั‹ ะฟะตั€ะตะฑะธั€ะฐัŽั‚ ะผะพะธ ะฒะพะปะพัั‹, ะฐ ะณัƒะฑั‹ ะปะตะณะบะพ ะบะฐัะฐัŽั‚ัั ะผะพะตะณะพ ะปะฑะฐ. ะœั‹ ั‚ะฐะบ ะธ ะทะฐัั‚ั‹ะฒะฐะตะผ ะฒ ัั‚ะพะน ะฟะพะทะต ะฝะฐ ะฝะตะบะพั‚ะพั€ะพะต ะฒั€ะตะผั. 
ะŸั€ะพั…ะพะดะธั‚ ะฒัะตะณะพ ะฝะตัะบะพะปัŒะบะพ ะผะธะฝัƒั‚, ะฐ ะฟะพั‚ะพะผ ั ั€ะตะทะบะพ ัะฐะถัƒััŒ ะฝะฐ ะบั€ะพะฒะฐั‚ะธ ะธ ะฝะฐั‡ะธะฝะฐัŽ ะฒะพั€ั‡ะฐั‚ัŒ, ั‡ั‚ะพ ัƒะถะต ะดะฐะฒะฝะพ ะฟะพั€ะฐ ะฒัั‚ะฐะฒะฐั‚ัŒ, ะฒะตะดัŒ ัะตะณะพะดะฝั ะฟั€ะตะดัั‚ะพะธั‚ ะฟะพะตะทะดะบะฐ ะฝะฐ ะฟั€ะธั€ะพะดัƒ ะฒะผะตัั‚ะต ั ะดั€ัƒะทัŒัะผะธ. ะฃ ั‚ะตะฑั ะฝะฐ ะปะธั†ะต ะฟะพัะฒะปัะตั‚ัั ัƒะปั‹ะฑะบะฐ, ะฐ ั€ัƒะบะธ ั‚ัะฝัƒั‚ ะพะฑั€ะฐั‚ะฝะพ, ะทะฐัั‚ะฐะฒะปัั ะฒะฝะพะฒัŒ ะพั‚ะบะธะฝัƒั‚ัŒัั ะฝะฐ ะฟะพะดัƒัˆะบะธ. ะะฐ ัƒะปะธั†ะต ะปัŒะตั‚ ะดะพะถะดัŒ, ะฑะฐั€ะฐะฑะฐะฝั ะฒ ะพะบะฝะฐ, ะฐ ั‡ั‚ะพ ะตั‰ะต ะดะตะปะฐั‚ัŒ ะฒ ั‚ะฐะบะพะน ะดะตะฝัŒ, ะตัะปะธ ะฝะต ะฝะตะถะธั‚ัŒัั ะฒ ัƒัŽั‚ะฝะพะน ะฟะพัั‚ะตะปะธ ะฒ ะพะฑัŠัั‚ะธัั… ะปัŽะฑะธะผะพะณะพ? ะกะบะพะปัŒะบะพ ะฒั€ะตะผะตะฝะธ ะผั‹ ะฑั‹ะปะธ ะทะฝะฐะบะพะผั‹, ะฟั€ะตะถะดะต, ั‡ะตะผ ัƒะทะฝะฐะปะธ ะพ ั‡ัƒะฒัั‚ะฒะฐั… ะดั€ัƒะณ ะดั€ัƒะณะฐ? ะ”ะฐ, ั ะฝะต ะฟะพะผะฝัŽ ัั‚ะพะณะพ, ะฝะพ ะบั‚ะพ ัั‡ะธั‚ะฐะตั‚? ะ“ะปะฐะฒะฝะพะต, ั‡ั‚ะพ ะฒ ะผะพะตะน ะฟะฐะผัั‚ะธ ะดะพ ัะธั… ะฟะพั€ ะฑะตั€ะตะถะฝะพ ั…ั€ะฐะฝะธั‚ัั ะผะพะผะตะฝั‚, ะบะพะณะดะฐ ั‚ั‹ ะฝะฐะบะพะฝะตั† ัƒัะปั‹ัˆะฐะป ั‚ะต ะฒะฐะถะฝั‹ะต ัะปะพะฒะฐ. ะŸะตั€ะตะด ะณะปะฐะทะฐะผะธ ะฒัะฟะปั‹ะฒะฐัŽั‚ ัั‡ะฐัั‚ะปะธะฒั‹ะต ะผะณะฝะพะฒะตะฝะธั, ัะปะพะฒะฝะพ ะบะฐะดั€ั‹, ะทะฐะฟะตั‡ะฐั‚ะปะตะฒัˆะธะต ะฒัั‘ ะฒ ะผะตะปัŒั‡ะฐะนัˆะธั… ะดะตั‚ะฐะปัั…. ะญั‚ะพ ะฟั€ะพะธะทะพัˆะปะพ ะฒ ะผะพั€ะพะทะฝั‹ะน ัะฝะฒะฐั€ัะบะธะน ะดะตะฝัŒ. ะ’ะตัั‘ะปะฐั ะบะพะผะฟะฐะฝะธั ะผะพะปะพะดั‹ั… ะปัŽะดะตะน ะฝะต ะผะพะณะปะฐ ะฟั€ะพัั‚ะพ ัะธะดะตั‚ัŒ ะดะพะผะฐ ะฒะทะฐะฟะตั€ั‚ะธ ะธ ัƒะฟัƒัั‚ะธั‚ัŒ ั‚ะฐะบะพะน ั…ะพั€ะพัˆะธะน ัะปัƒั‡ะฐะน ะดะปั ะฟั€ะพะณัƒะปะบะธ ะฟะพ ะทะฐัะฝะตะถะตะฝะฝะพะผัƒ ะปะตััƒ ะธ ะฟั€ะพั‡ะธั… ะทะธะผะฝะธั… ะทะฐะฑะฐะฒ. ะขั‹ ั‚ะพะณะดะฐ ะพะบะฐะทะฐะปัั ะฒะฝะต ะฝะฐัˆะตะณะพ ะฟะพะปั ะทั€ะตะฝะธั, ะฐ ั‚ะตะผะฝะพั‚ะฐ ัƒะถะต ะฝะฐั‡ะฐะปะฐ ะพะฟัƒัะบะฐั‚ัŒัั ะฝะฐ ะทะตะผะปัŽ. ะšะพะฝะตั‡ะฝะพ, ะผะฝะต ะฝะธั‡ะตะณะพ ะฝะต ะพัั‚ะฐะฒะฐะปะพััŒ, ะบั€ะพะผะต ะบะฐะบ ะพั‚ะฟั€ะฐะฒะธั‚ัŒัั ะฝะฐ ะฟะพะธัะบะธ. 
ะะฐ ะผะพะตะผ ะปะธั†ะต ะทะฐัั‚ั‹ะปะพ ัƒะดะธะฒะปะตะฝะธะต, ะบะพะณะดะฐ ั ะทะฐัั‚ะฐะป ั‚ะตะฑั ะทะฐ ัั‚ั€ะฐะฝะฝั‹ะผ ะทะฐะฝัั‚ะธะตะผ: ะฑั‹ะปะพ ะทะฐะฑะฐะฒะฝะพ ะฝะฐะฑะปัŽะดะฐั‚ัŒ ะทะฐ ั‚ะพะฑะพะน, ะฒั‹ะฒะพะดัั‰ะธะผ ะฐะบะฒะฐั€ะตะปัŒัŽ ะธ ะฑะฐะปะปะพะฝั‡ะธะบะฐะผะธ ั ะบั€ะฐัะบะพะน ะฝะตะบะธะต ัƒะทะพั€ั‹ ะฟั€ัะผะพ ะฝะฐ ัะฝะตะณัƒ. ะขะฒะพะธ ะฝะตะพะฑั‹ั‡ะฝะพัั‚ัŒ ะธ ะฝะตะฟั€ะตะดัะบะฐะทัƒะตะผะพัั‚ัŒ ะฟั€ะธั‚ัะณะธะฒะฐะปะธ ะบ ัะตะฑะต ะผะพัŽ ะฝะฐั‚ัƒั€ัƒ. - ะขั‹ ะผะฝะต ะฝั€ะฐะฒะธัˆัŒัั. ะžั‡ะตะฝัŒ, - ะบะฐะถะตั‚ัั, ะฑัƒะดั‚ะพ ะฒัั‘ ะทะฐะผะตั€ะปะพ, ะธ ะฒ ะทะฒะตะฝัั‰ะตะน ั‚ะธัˆะธะฝะต ะฟั€ะพะทะฒัƒั‡ะฐะปะธ ะฟั€ะพัั‚ั‹ะต ัะปะพะฒะฐ, ะบะพั‚ะพั€ั‹ะต ั‚ัะถะตะปะพ ะฟั€ะพะธะทะฝะตัั‚ะธ. ะงั‚ะพ ะผะพะณะปะพ ั‚ะพะปะบะฝัƒั‚ัŒ ะผะตะฝั ะฟั€ะพัั‚ะพ ะฒะทัั‚ัŒ ะธ ัะบะฐะทะฐั‚ัŒ ะธั…? ะžะดะฝะฐะบะพ ะพั‚ะฒะตั‚ ะฝะฐ ัั‚ะพั‚ ะฒะพะฟั€ะพั ัƒะถะต ะฝะต ะฒะฐะถะตะฝ, ั‚ะตะฟะตั€ัŒ ะพะฝ ะพัั‚ะฐะฒะธะป ะผะตัั‚ะพ ะดะปั ะฑะตัะฟะพะบะพะนัั‚ะฒะฐ. ะขะฒะพะธ ัะผะพั†ะธะธ ัะปะพะถะฝะพ ะฟั€ะพั‡ะธั‚ะฐั‚ัŒ. ะขะฐะบ ะฑั‹ะปะพ ะฒัะตะณะดะฐ. ะœะพะปั‡ะฐะฝะธะต ะฝะฐะณะฝะตั‚ะฐะตั‚ ะฝะฐะฟั€ัะถะตะฝะธะต ะผะตะถะดัƒ ะฝะฐะผะธ. ะŸั€ะธะบะพัะฝะพะฒะตะฝะธะต ะปะตะดัะฝั‹ั… ะฟะฐะปัŒั†ะตะฒ ะบ ะผะพะตะน ั‰ะตะบะต ะฒั‹ะฒะพะดะธั‚ ะธะท ะพั†ะตะฟะตะฝะตะฝะธั, ัะบะพะฒะฐะฒัˆะตะณะพ ั‚ะตะปะพ. ะฏ ะตะปะต-ะตะปะต ั€ะฐะทะปะธั‡ะฐัŽ, ั‡ั‚ะพ ั‚ั‹ ัะตะนั‡ะฐั ะณะพะฒะพั€ะธัˆัŒ, ะฝะพ ะฝะตะบะพั‚ะพั€ั‹ะต ะพะฑั€ั‹ะฒะบะธ ั„ั€ะฐะท ะฒัั‘ ะถะต ะฟั€ะธะพะฑั€ะตั‚ะฐัŽั‚ ัะผั‹ัะป. ะะธะบะพะณะดะฐ ะฝะต ะฒะตั€ะธะป ะฒ ั‡ัƒะดะตัะฐ, ะดะฐ ะฒะพั‚ ั‚ะพะปัŒะบะพ ัะตะนั‡ะฐั ะฟะพะฝะธะผะฐัŽ: ะพะฝะธ ัะปัƒั‡ะฐัŽั‚ัั. ะœะฐะปะตะฝัŒะบะพะต ั‡ัƒะดะพ - ัƒะทะฝะฐั‚ัŒ ะพะฑ ะพั‚ะฒะตั‚ะฝั‹ั… ั‡ัƒะฒัั‚ะฒะฐั… ั‚ะพะณะพ, ะบั‚ะพ ั‚ะฐะบ ะผะฝะพะณะพ ะทะฝะฐั‡ะธั‚ ะดะปั ั‚ะตะฑั. ะœั‹ ะธะดะตะผ ั ั‚ะพะฑะพะน ะฟะพ ะทะฐะผะตั‚ะตะฝะฝั‹ะผ ัะฝะตะณะพะผ ัƒะปะธั†ะฐะผ. ะ’ัŒัŽะณะฐ, ะทะฐะฒั‹ะฒะฐั, ะดัƒะตั‚ ะฒ ะปะธั†ะพ, ัะฑะธะฒะฐั ะฟั€ะพั…ะพะถะธั… ั ะฟัƒั‚ะธ, ะฐ ัƒ ะผะตะฝั ะฝะฐ ะดัƒัˆะต - ัะฟะพะบะพะนัั‚ะฒะธะต ะธ ัƒะผะธั€ะพั‚ะฒะพั€ะตะฝะธะต... 
ะšะพะณะดะฐ ั‚ั‹ ั€ัะดะพะผ, ะฟั€ะพะธัั…ะพะดัั‰ะตะต ะฒะพะบั€ัƒะณ ะฝะต ะธะผะตะตั‚ ะทะฝะฐั‡ะตะฝะธั, ะธ ะฝะตั‚ ะดะตะปะฐ ะดะพ ะฒัะตั… ะพัั‚ะฐะปัŒะฝั‹ั…. ะœะฝะต ัะปั‹ัˆะฝะพ, ะบะฐะบ ั‚ะฒะพะธ ะทัƒะฑั‹ ัั‚ัƒั‡ะฐั‚ ะพั‚ ั…ะพะปะพะดะฐ. ะกะถะฐะฒัˆะธััŒ, ั‚ั‹ ะฟั€ัั‡ะตัˆัŒ ะฝะพั ะฒ ะฒั‹ัะพะบะธะน ะฒะพั€ะพั‚ ะบัƒั€ั‚ะบะธ. ะฏ ัƒะฒะตั€ะตะฝ, ั‡ั‚ะพ ั‚ะฒะพะธ ั€ัƒะบะธ ะฒ ะบะฐั€ะผะฐะฝะฐั… ะดะฐะฒะฝะพ ะฝะต ะผะพะณัƒั‚ ะพั‚ะพะณั€ะตั‚ัŒัั ะธ ะฟั€ะธะฝัั‚ัŒ ะฝะพั€ะผะฐะปัŒะฝัƒัŽ ั‚ะตะผะฟะตั€ะฐั‚ัƒั€ัƒ. - ะ—ะฐะผะตั€ะท? - ัะฟั€ะฐัˆะธะฒะฐัŽ, ะทะฐะณะปัะดั‹ะฒะฐั ะฒ ะบะฐั€ะธะต ะณะปะฐะทะฐ, ะพะฑั€ะฐะผะปะตะฝะฝั‹ะต ั‡ะตั€ะฝั‹ะผะธ ั€ะตัะฝะธั†ะฐะผะธ, ะฝะฐ ะบะพั‚ะพั€ั‹ะต ั‚ะธั…ะพ ะฟะฐะดะฐัŽั‚ ัะฝะตะถะธะฝะบะธ, ะธ, ะฝะต ะดะพะถะธะดะฐัััŒ ะพั‚ะฒะตั‚ะฐ, ั‚ัะฝัƒ ั‚ะตะฑั ะฒ ะฑะปะธะถะฐะนัˆะตะต ะบะฐั„ะต. - ะŸะพะนะดะตะผ ะดะพะผะพะน, ะฐ ั‚ะพ ะฒะพัะฟะฐะปะตะฝะธะต ะปะตะณะบะธั… ะฟะพะดั…ะฒะฐั‚ะธัˆัŒ, - ัั‚ั€ะพะณะพ ะทะฐะผะตั‡ะฐะตัˆัŒ ั‚ั‹, ัƒะถะต ะฝะฐะฟั€ะฐะฒะปััััŒ ะฒ ัั‚ะพั€ะพะฝัƒ ะฝะฐัˆะตะณะพ ะฟะพะดัŠะตะทะดะฐ. - ะŸะพัั‚ะพะน, ั€ะฐะทะฒะต ะฝะต ะฒะธะดะธัˆัŒ, ะบะฐะบะฐั ั‡ัƒะดะตัะฝะฐั ะฟะพะณะพะดะฐ? - ะทะฝะฐะตัˆัŒ ะฒะตะดัŒ, ั‡ั‚ะพ ะผะฝะต ะฝั€ะฐะฒะธั‚ัั ะณัƒะปัั‚ัŒ ะฟะพะด ะดะพะถะดะตะผ, ะฟะพะดัั‚ะฐะฒะปัั ะปะธั†ะพ ะฟะฐะดะฐัŽั‰ะธะผ ั…ะพะปะพะดะฝั‹ะผ ะบะฐะฟะปัะผ. ะขะตะฑะต ะฒ ะณะพะปะพะฒัƒ ะฑั‹ัั‚ั€ะพ ะฟั€ะธั…ะพะดะธั‚ ะผั‹ัะปัŒ, ะบะฐะบ ะทะฐัั‚ะฐะฒะธั‚ัŒ ะผะตะฝั ัƒะนั‚ะธ ะฒ ะฑะพะปะตะต ััƒั…ะพะต ะธ ั‚ะตะฟะปะพะต ะผะตัั‚ะพ. ะ”ะพะปะณะพ ะฝะต ั€ะฐะทะดัƒะผั‹ะฒะฐั, ั€ั‹ะฒะบะพะผ ะฟั€ะธั‚ัะณะธะฒะฐะตัˆัŒ ะบ ัะตะฑะต, ะฟั€ะธะถะธะผะฐัััŒ ะบ ะผะพะธะผ ะณัƒะฑะฐะผ. ะžั‚ ะฝะตะพะถะธะดะฐะฝะฝะพัั‚ะธ ั ะฟั€ะธะพั‚ะบั€ั‹ะฒะฐัŽ ะธั…, ะฐ ั€ัƒะบะฐะผะธ ะฝะฐั‡ะธะฝะฐัŽ ะณะปะฐะดะธั‚ัŒ ั‚ะฒะพัŽ ัะฟะธะฝัƒ, ะบ ะบะพั‚ะพั€ะพะน ะฟั€ะธะปะธะฟะปะฐ ะธะทั€ัะดะฝะพ ะฟั€ะพะผะพะบัˆะฐั ั€ัƒะฑะฐัˆะบะฐ. ะะต ัะฟะตัˆะฐ, ั‚ั‹ ัƒะณะปัƒะฑะปัะตัˆัŒ ะฟะพั†ะตะปัƒะน, ะตั‰ะต ะฑะพะปัŒัˆะต ั€ะฐะทะทะฐะดะพั€ะธะฒะฐั. ะ˜ะผะตะฝะฝะพ ั‚ะฐะบ ะธ ะฟั€ะตะดะฟะพะปะฐะณะฐะปะพััŒ, ะฟั€ะฐะฒะดะฐ? 
ะšะพะต-ะบะฐะบ ัะฟั€ะฐะฒะธะฒัˆะธััŒ ั ะทะฐะผะบะพะผ, ะผั‹ ะฒะฒะฐะปะธะฒะฐะตะผัั ะฒ ะฟะพะปัƒั‚ะตะผะฝัƒัŽ ะบะฒะฐั€ั‚ะธั€ัƒ, ะตะดะฒะฐ ััƒะผะตะฒ ัƒัั‚ะพัั‚ัŒ ะฝะฐ ะฝะพะณะฐั…. ะŸะตั€ะตะด ะณะปะฐะทะฐะผะธ ะดะพ ัะธั… ะฟะพั€ ัั‚ะพะธั‚ ะฟะตะปะตะฝะฐ ะดะพะถะดั. ะขั‹ ัั€ะฐะทัƒ ะถะต ั€ะตะทะบะพ ะฟั€ะธะถะธะผะฐะตัˆัŒ ะผะตะฝั ะบ ัั‚ะตะฝะต, ะธ ั‚ะฒะพะน ัะทั‹ะบ ะฒั€ั‹ะฒะฐะตั‚ัั ะฒ ะผะพะน ั€ะพั‚ ะฒ ะฝะตะธัั‚ะพะฒะพะผ ะฟะพั†ะตะปัƒะต, ะฑะตัะฟะพั€ัะดะพั‡ะฝะพ ะดะฒะธะณะฐะตั‚ัั ะฒะดะพะปัŒ ะทัƒะฑะพะฒ ะธ ะฒะพะทะฒั€ะฐั‰ะฐะตั‚ัั ะบ ะผะพะตะผัƒ ัะทั‹ะบัƒ. ะฏ ะฝะต ัั‚ั€ะตะผะปัŽััŒ ะฑั€ะฐั‚ัŒ ะธะฝะธั†ะธะฐั‚ะธะฒัƒ ะฝะฐ ัะตะฑั, ะผะฝะต ะฒัะตะณะดะฐ ะฝั€ะฐะฒะธะปะพััŒ ะฟะปะฐะฒะธั‚ัŒัั ะฟะพะด ะฝะฐั‚ะธัะบะพะผ ั‚ะฒะพะธั… ะปะฐัะบ ะธ ะพะถะธะดะฐั‚ัŒ, ั‡ั‚ะพ ะถะต ั‚ั‹ ะฟั€ะตะดะฟั€ะธะผะตัˆัŒ ะดะฐะปัŒัˆะต. ะฃ ั‚ะตะฑั ะฟะพั‡ั‚ะธ ะฒัะตะณะดะฐ ะปะตะดัะฝั‹ะต ะฟะฐะปัŒั†ั‹, ะธ ัƒ ะผะตะฝั ะผัƒั€ะฐัˆะบะธ ะฑะตะณัƒั‚ ะฟะพ ะบะพะถะต ะพั‚ ะฟั€ะธัั‚ะฝั‹ั…, ะฝะพ ั…ะพะปะพะดะฝั‹ั… ะฟั€ะธะบะพัะฝะพะฒะตะฝะธะน. ะขะตะฑะต ะฝั€ะฐะฒะธั‚ัั ัะผะพั‚ั€ะตั‚ัŒ, ะบะฐะบ ะฟั€ะพะณะธะฑะฐะตั‚ัั ะผะพั ัะฟะธะฝะฐ, ะบะพะณะดะฐ ั‚ั‹ ั€ะธััƒะตัˆัŒ ะฝะฐ ะฝะตะน ะฝะตะฒะธะดะธะผั‹ะต ะปะธะฝะธะธ. ะ’ ะดะถะธะฝัะฐั… ัƒะถะต ัั‚ะฐะฝะพะฒะธั‚ัั ั‚ะตัะฝะพ, ะฐ ะฒ ะณะพะปะพะฒะต ะพะฑั€ะฐะทัƒะตั‚ัั ะฟัƒัั‚ะพั‚ะฐ, ะทะฐะฟะพะปะฝัะตะผะฐั ะปะธัˆัŒ ั‚ะพะฑะพะน. ะขะฒะพะธ ั€ัƒะบะธ ะพะฟัƒัะบะฐัŽั‚ัั ะฝะธะถะต, ะฝะฐั‰ัƒะฟั‹ะฒะฐั ะฟั€ัะถะบัƒ ั€ะตะผะฝั. ะž, ั‚ั‹ ะถะต ัะฐะผ ะทะฐั‚ะตัะป ัั‚ัƒ ะธะณั€ัƒ, ะผะฐะปั‹ัˆ, ั‚ะฐะบ ะดะฐะฒะฐะน ะฟะพะธะณั€ะฐะตะผ? ะฏ, ะฒัั‘ ั‚ะฐะบ ะถะต ะฝะฐั…ะพะดัััŒ ะฒ ะบั€ะตะฟะบะธั… ะพะฑัŠัั‚ะธัั…, ะดะตะปะฐัŽ ะฝะตะพะถะธะดะฐะฝะฝั‹ะน ั€ะฐะทะฒะพั€ะพั‚, ะฟั€ะธะฒั‹ั‡ะฝะพ ะทะฐะฝะธะผะฐั ั€ะพะปัŒ ะฐะบั‚ะธะฒะฐ. ะขั‹ ั ะทะฐะผะธั€ะฐะฝะธะตะผ ัะตั€ะดั†ะฐ ัะผะพั‚ั€ะธัˆัŒ ะฝะฐ ะผะตะฝั, ะฟั€ะตะบั€ะฐั‚ะธะฒ ะฒัะต ะดะตะนัั‚ะฒะธั. 
ะฃะปั‹ะฑะฝัƒะฒัˆะธััŒ, ะฟั€ะพะฒะพะถัƒ ัะทั‹ะบะพะผ ะฟะพ ั‚ะฒะพะตะผัƒ ัƒั…ัƒ, ั‡ัƒั‚ัŒ ะฟั€ะธะบัƒัั‹ะฒะฐั ะผะพั‡ะบัƒ, ะพั‚ ั‡ะตะณะพ ั‚ะฒะพะธ ะดั€ะพะถะฐั‰ะธะต ะฟะฐะปัŒั†ั‹ ะฟะตั€ะตะผะตั‰ะฐัŽั‚ัั ะฒะฒะตั€ั… ะธ ััƒะดะพั€ะพะถะฝะพ ัะถะธะผะฐัŽั‚ ะผะพะธ ะฒะพะปะพัั‹, ั ะบะพั‚ะพั€ั‹ั… ัั‚ะตะบะฐะตั‚ ะฒะพะดะฐ. ะะตั‚ะตั€ะฟะตะปะธะฒะพ ั€ะฐััั‚ะตะณะธะฒะฐัŽ ะฟัƒะณะพะฒะธั†ั‹ ั‚ะฒะพะตะน ั€ัƒะฑะฐัˆะบะธ, ะฟะพะฟัƒั‚ะฝะพ ะพัั‚ะฐะฒะปัั ะฝะตัะบะพะปัŒะบะพ ะฑะฐะณั€ะพะฒั‹ั… ะพั‚ะผะตั‚ะธะฝ ะฝะฐ ัˆะตะต ะธ ะฝะฐ ะณั€ัƒะดะธ. ะ”ะพ ะผะตะฝั ะดะพะฝะพัะธั‚ัั ัั‚ะพะฝ, ะธ ั ะฟั€ะพะดะพะปะถะฐัŽ ะผะตะดะปะตะฝะฝัƒัŽ ะฟั‹ั‚ะบัƒ, ัั‚ัะณะธะฒะฐั ั ั‚ะตะฑั ะฑั€ัŽะบะธ ะฒะผะตัั‚ะต ั ะฑะตะปัŒะตะผ. ะ’ ั‚ะธัˆะธะฝะต, ั€ะฐะทะฑะฐะฒะปัะตะผะพะน ะฝะฐัˆะธะผ ั‚ัะถะตะปั‹ะผ ะดั‹ั…ะฐะฝะธะตะผ, ั€ะฐะทะดะฐะตั‚ัั ัˆัƒะผะฝั‹ะน ะฒั‹ะดะพั…, ะบะพะณะดะฐ ั ะดะตะปะฐัŽ ะฝะตัะบะพะปัŒะบะพ ะดะฒะธะถะตะฝะธะน ั€ัƒะบะพะน ะฟะพ ะพัะฝะพะฒะฐะฝะธัŽ ั‡ะปะตะฝะฐ, ะฐ ะทะฐั‚ะตะผ, ะปะธะทะฝัƒะฒ ะณะพะปะพะฒะบัƒ, ะพั‚ัั‚ั€ะฐะฝััŽััŒ, ะณะปัะดั ะฒ ะพะดัƒั€ะผะฐะฝะตะฝะฝั‹ะต ะณะปะฐะทะฐ. - ะกะฟะฐะปัŒะฝั, - ัˆะตะฟั‡ะตัˆัŒ ั‚ั‹, ะบั€ะตะฟะบะพ ะดะตั€ะถะฐััŒ ะทะฐ ะบั€ะฐะน ั‚ัƒะผะฑะพั‡ะบะธ, ัั‚ะพัะฒัˆะตะน ั€ัะดะพะผ. ะŸั€ะพัะธั‚ัŒ ะดะฒะฐะถะดั‹ ะฝะตั‚ ัะผั‹ัะปะฐ, ะฒะตะดัŒ ัƒ ะผะตะฝั ัะฐะผะพะณะพ ัƒะถะต ะฝะตั‚ ัะธะป ั‚ะตั€ะฟะตั‚ัŒ ัั‚ะพ ั‚ัะฝัƒั‰ะตะต ะพั‰ัƒั‰ะตะฝะธะต, ะพะฑั€ะฐะทะพะฒะฐะฒัˆะตะตัั ะฒะฝะธะทัƒ ะถะธะฒะพั‚ะฐ. ะ›ะตะณะบะพ ะฟะพะดั…ะฒะฐั‚ั‹ะฒะฐัŽ ั‚ะตะฑั ะฝะฐ ั€ัƒะบะธ ะธ ะธะดัƒ ะฒ ั‚ัƒ ะบะพะผะฝะฐั‚ัƒ, ะฒ ะบะพั‚ะพั€ะพะน ะผั‹ ัั‚ะพะปัŒะบะพ ั€ะฐะท ะทะฐะฝะธะผะฐะปะธััŒ ะปัŽะฑะพะฒัŒัŽ. ะšั€ะพะฒะฐั‚ัŒ ะฒัั‚ั€ะตั‡ะฐะตั‚ ะฝะฐั ะทะฝะฐะบะพะผั‹ะผ ัะบั€ะธะฟะพะผ, ะบะพะณะดะฐ ั ะพะฟัƒัะบะฐัŽ ั‚ะตะฑั, ะฝะตั€ะฒะฝะพ ะบัƒัะฐัŽั‰ะตะณะพ ะณัƒะฑั‹. ะขั‹ ั…ะฒะฐั‚ะฐะตัˆัŒ ะผะตะฝั ะธ ั‚ัะฝะตัˆัŒ ะฝะฐ ัะตะฑั, ะพั‚ั‡ะตะณะพ ะพะบะฐะทั‹ะฒะฐะตัˆัŒัั ะฟั€ะธะถะฐั‚ั‹ะผ ะผะพะธะผ ั‚ะตะปะพะผ. 
ะขะฒะพะธ ั€ัƒะบะธ ัะบะพะปัŒะทัั‚ ะฟะพ ะผะพะธะผ ะฑะพะบะฐะผ, ะฟะพะผะพะณะฐั ัะฝัั‚ัŒ ั„ัƒั‚ะฑะพะปะบัƒ ะธ, ะฟั€ะธะปะพะถะธะฒ ะฝะตะบะพั‚ะพั€ั‹ะต ัƒัะธะปะธั, ะฟั€ะธะฟะพะดะฝัะฒัˆะธััŒ, ะพะฑะฒะพะดะธัˆัŒ ัะทั‹ะบะพะผ ะผะพะธ ัะพัะบะธ, ัะปะตะณะบะฐ ั†ะฐั€ะฐะฟะฐั ะธั… ะทัƒะฑะฐะผะธ. ะงัƒะฒัั‚ะฒัƒั ะฝะตะพะฑั…ะพะดะธะผะพัั‚ัŒ ัะบะพั€ะตะนัˆะตะน ั€ะฐะทั€ัะดะบะธ, ั ะฟั‹ั‚ะฐัŽััŒ ะบะฐะบ ะผะพะถะฝะพ ัะบะพั€ะตะต ัะฝัั‚ัŒ ะดะถะธะฝัั‹ ะธ ะฝะฐัˆะฐั€ะธั‚ัŒ ะฒ ัั‰ะธั‡ะบะต ัˆะบะฐั„ะฐ ัะผะฐะทะบัƒ ะธ ะฟั€ะตะทะตั€ะฒะฐั‚ะธะฒั‹. ะะตั‚ะตั€ะฟะตะปะธะฒะพ ัƒัั‚ั€ะฐะธะฒะฐัŽััŒ ะฟะพัƒะดะพะฑะฝะตะต ะผะตะถะดัƒ ั‚ะฒะพะธั… ะฝะพะณ, ั€ะฐะทะฒะพะดั ะธั… ะฒ ัั‚ะพั€ะพะฝั‹ ะธ ะฝะตะผะฝะพะณะพ ัะณะธะฑะฐั ะฒ ะบะพะปะตะฝัั…. ะ’ั‹ะดะฐะฒะปะธะฒะฐัŽ ะณะตะปัŒ ะธ ะฟะพะพั‡ะตั€ะตะดะฝะพ ะฐะบะบัƒั€ะฐั‚ะฝะพ ะฒะฒะพะถัƒ ะฒ ั‚ะตะฑั ะฟะฐะปัŒั†ั‹, ั€ะฐัั‚ัะณะธะฒะฐั ะฟั€ะพั…ะพะด. ะกะปั‹ัˆัƒ ั‚ะฒะพะต ัะดะฐะฒะปะตะฝะฝะพะต ัˆะธะฟะตะฝะธะต ะธ ัั‚ะฐั€ะฐัŽััŒ ะพั‚ะฒะปะตั‡ัŒ ัะฒะพะธะผะธ ะปะฐัะบะฐะผะธ, ะฟะพะบั€ั‹ะฒะฐั ะณั€ัƒะดัŒ ะธ ะฟะปะตั‡ะธ ะฟะพั†ะตะปัƒัะผะธ, ะบะพะต-ะณะดะต ั‡ัƒั‚ัŒ ะฟั€ะธะบัƒัั‹ะฒะฐั ะบะพะถัƒ. ะขั‹ ะทะฐะตั€ะทะฐะป ะธ ะฝะตะดะพะฒะพะปัŒะฝะพ ัƒัั‚ะฐะฒะธะปัั ะฝะฐ ะผะตะฝั, ั‚ั€ะตะฑัƒั ะฑะพะปัŒัˆะตะณะพ. ะฏ ั ัƒะดะพะฒะพะปัŒัั‚ะฒะธะตะผ ะฟะพะดั‡ะธะฝััŽััŒ. ะŸั€ะธัั‚ะฐะฒะปััŽ ั‡ะปะตะฝ ะบะพ ะฒั…ะพะดัƒ ะธ ะผะตะดะปะตะฝะฝะพ ะฒั…ะพะถัƒ, ะฝะฐ ั‡ั‚ะพ ะฟะพะปัƒั‡ะฐัŽ ะตะปะต ะทะฐะผะตั‚ะฝั‹ะน ะบะธะฒะพะบ, ะบะฐะบ ั€ะฐะทั€ะตัˆะตะฝะธะต ะฟั€ะพะดะพะปะถะฐั‚ัŒ. ะกะฟัƒัั‚ั ะฝะตัะบะพะปัŒะบะพ ั‚ะพะปั‡ะบะพะฒ ั‚ั‹ ะฒั‹ะณะธะฑะฐะตัˆัŒัั ะฒ ะฟะพะทะฒะพะฝะพั‡ะฝะธะบะต, ะธ ะฝะฐ ะผะพะตะผ ะปะธั†ะต ะฟะพัะฒะปัะตั‚ัั ัƒะปั‹ะฑะบะฐ. ะฏ ัƒะฒะตะปะธั‡ะธะฒะฐัŽ ั‚ะตะผะฟ, ะดะฒะธะณะฐัััŒ ะฒัั‘ ะฑั‹ัั‚ั€ะตะต. ะžั€ะณะฐะทะผ ัั‚ั€ะตะผะธั‚ะตะปัŒะฝะพ ะฝะฐะบั€ั‹ะฒะฐะตั‚ ะฝะฐั ั ะณะพะปะพะฒะพะน, ะดะฐั€ั ัั‚ะพะปัŒ ะดะพะปะณะพะถะดะฐะฝะฝะพะต ะฝะฐัะปะฐะถะดะตะฝะธะต. 
ะกะพ ัะฑะธะฒัˆะธะผัั ะดั‹ั…ะฐะฝะธะตะผ, ัะพ ะทะฒะตะทะดะพั‡ะบะฐะผะธ ะฒ ะณะปะฐะทะฐั… ะฟะฐะดะฐัŽ ั€ัะดะพะผ ั ั‚ะพะฑะพะน, ั€ะฐัะบั€ะฐัะฝะตะฒัˆะธะผัั, ั‚ัะถะตะปะพ ะดั‹ัˆะฐั‰ะธะผ, ะฝะพ ั‚ะฐะบะธะผ ะปัŽะฑะธะผั‹ะผ. ะขั‹ ะฟั€ะธะถะธะผะฐะตัˆัŒัั ะบะพ ะผะฝะต, ะฟะพะปะพะถะธะฒ ะณะพะปะพะฒัƒ ะฝะฐ ะผะพัŽ ะณั€ัƒะดัŒ. ะ”ะตะปะฐั‚ัŒ ัะตะนั‡ะฐั ั‡ั‚ะพ-ะปะธะฑะพ ะฒั‹ัˆะต ะฒััะบะธั… ัะธะป - ั ะฟั€ะพะดะพะปะถะฐัŽ ะปะตะถะฐั‚ัŒ, ะฟะพะณะปะฐะถะธะฒะฐั ั‚ะฒะพะธ ะฒะพะปะพัั‹ ะธ ะฒัะปัƒัˆะธะฒะฐัััŒ ะฒ ะฑะธะตะฝะธะต ะฝะฐัˆะธั… ัะตั€ะดะตั†. ะŸะพั‡ะตะผัƒ ั ั‚ะตะฑั ั‚ะพะณะดะฐ ะฝะต ะฟะพัะปัƒัˆะฐะป? ะ—ะฐั‡ะตะผ ะฟะพะทะฒะพะปะธะป ั‚ะตะฑะต ะผะพะบะฝัƒั‚ัŒ ะฟะพะด ะดะพะถะดะตะผ ะฒะผะตัั‚ะต ัะพ ะผะฝะพะน? ะ•ัะปะธ ะฑั‹ ะฝะต ัั‚ะฐ ะพัˆะธะฑะบะฐ, ั‚ั‹ ะฑั‹ ะฝะต ะฟะพะดั…ะฒะฐั‚ะธะป ัะตั€ัŒะตะทะฝัƒัŽ ะฑะพะปะตะทะฝัŒ. ะœะตะฝั ะดะพ ัะธั… ะฟะพั€ ั‚ะตั€ะทะฐะตั‚ ั‡ัƒะฒัั‚ะฒะพ ะฒะธะฝั‹. ะžั‡ะตะฝัŒ ั‚ัะถะตะปะพ ะพัะพะทะฝะฐะฒะฐั‚ัŒ, ั‡ั‚ะพ ะฟะพะณัƒะฑะธะป ั‡ัŒัŽ-ั‚ะพ ะถะธะทะฝัŒ... ะžัะพะฑะตะฝะฝะพ ั‚ะพะณะพ, ะบั‚ะพ ะฑั‹ะป ั†ะตะฝั‚ั€ะพะผ ะผะพะตะน ะ’ัะตะปะตะฝะฝะพะน. ะฏ ะฟั€ะพะดะพะปะถะฐัŽ ะถะธั‚ัŒ ะฟั€ะพัˆะปั‹ะผ, ะฝะต ะผะพะณัƒ ะฝะต ะฒัะฟะพะผะธะฝะฐั‚ัŒ ั‚ะต ะฝะตะผะฝะพะณะธะต, ะฝะพ ั‚ะฐะบะธะต ะดะพั€ะพะณะธะต ะผะพะตะผัƒ ัะตั€ะดั†ัƒ ะผะพะผะตะฝั‚ั‹, ะฟั€ะพะฒะตะดะตะฝะฝั‹ะต ั ั‚ะพะฑะพะน. ะœั‹ ัะพะฒัะตะผ ะฝะตะดะพะปะณะพ ะฑั‹ะปะธ ะฒะผะตัั‚ะต. ะœะตะฝั ั‡ะฐัั‚ะพ ะผะพะถะฝะพ ะฒัั‚ั€ะตั‚ะธั‚ัŒ ะฝะฐ ั‚ะพะผ ัะฐะผะพะผ ะผะตัั‚ะต ะฒ ะปะตััƒ, ะณะดะต ั ะพั‚ะบั€ั‹ะปัั ั‚ะตะฑะต. ะ˜ะฝะพะณะดะฐ ะผะฝะต ะบะฐะถะตั‚ัั, ั‡ั‚ะพ ัะบะฒะพะทัŒ ัะธะปัŒะฝัƒัŽ ะผะตั‚ะตะปัŒ ะฒะธะถัƒ ั‚ะฒะพะน ัะธะปัƒัั‚. ะขั‹ ัƒะปั‹ะฑะฐะตัˆัŒัั ะธ ะดะตะปะฐะตัˆัŒ ะฝะตัะบะพะปัŒะบะพ ัˆะฐะณะพะฒ ะฝะฐะฒัั‚ั€ะตั‡ัƒ, ะฐ ะฟะพั‚ะพะผ ะธัั‡ะตะทะฐะตัˆัŒ... ะ•ัะปะธ ะฑั‹ ั‚ะพะปัŒะบะพ ะฑั‹ะปะฐ ะฒะพะทะผะพะถะฝะพัั‚ัŒ ะตั‰ะต ั€ะฐะท ัƒัะปั‹ัˆะฐั‚ัŒ ั‚ะฐะบะพะต ั‚ะตะฟะปะพะต "ะดะพะฑั€ะพะต ัƒั‚ั€ะพ", ะฟะพั‡ัƒะฒัั‚ะฒะพะฒะฐั‚ัŒ ะณะพั€ัั‡ะตะต ะดั‹ั…ะฐะฝะธะต, ั‰ะตะบะพั‡ัƒั‰ะตะต ัƒั…ะพ, ั…ะพั‚ัŒ ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ... 
ะŸะพะบะฐ ั‚ั‹ ะฑั‹ะป ั€ัะดะพะผ, ะฑั‹ะปะพ ัะพะฒัะตะผ ะฝะต ะฒะฐะถะฝะพ, ั‡ั‚ะพ ะฟั€ะพะธัั…ะพะดะธั‚ ะฒะพะบั€ัƒะณ. ะะพ ั‚ะตะฟะตั€ัŒ, ะบะพะณะดะฐ ั ะฝะฐะฑะปัŽะดะฐัŽ ะทะฐ ะฝะตะฝะฐัั‚ะฝะพะน ะฟะพะณะพะดะพะน ะฒ ะพะบะฝะพ, ัƒ ะผะตะฝั ะฝะตั‚ ัะฒะตั‚ะปั‹ั… ะผั‹ัะปะตะน ะธ ะปะตะณะบะพัั‚ะธ, ั‡ั‚ะพ ะฒะพะทะฝะธะบะฐะปะธ ั€ะฐะฝัŒัˆะต. ะ”ะฐะถะต ะปะตั‚ะพะผ ะผะพะต ัะตั€ะดั†ะต ัะบะพะฒั‹ะฒะฐะตั‚ ะปะตะด, ะบะพั‚ะพั€ั‹ะน ัƒะถะต ะฝะต ัƒะดะฐัั‚ัั ั€ะฐัั‚ะพะฟะธั‚ัŒ.' pipeline_tag: sentence-similarity library_name: sentence-transformers metrics: - cosine_accuracy - cosine_accuracy_threshold - cosine_f1 - cosine_f1_threshold - cosine_precision - cosine_recall - cosine_ap - cosine_mcc model-index: - name: SentenceTransformer based on intfloat/multilingual-e5-small results: - task: type: binary-classification name: Binary Classification dataset: name: Unknown type: unknown metrics: - type: cosine_accuracy value: 0.9214652872665756 name: Cosine Accuracy - type: cosine_accuracy_threshold value: 0.3258066475391388 name: Cosine Accuracy Threshold - type: cosine_f1 value: 0.7519700297119235 name: Cosine F1 - type: cosine_f1_threshold value: 0.28451085090637207 name: Cosine F1 Threshold - type: cosine_precision value: 0.7465213209361975 name: Cosine Precision - type: cosine_recall value: 0.7574988613442645 name: Cosine Recall - type: cosine_ap value: 0.8411912148109758 name: Cosine Ap - type: cosine_mcc value: 0.7019559271431105 name: Cosine Mcc --- # SentenceTransformer based on intfloat/multilingual-e5-small This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) on the json dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) <!-- at revision c007d7ef6fd86656326059b28395a7a03a7c5846 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - json
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    '- ะšะฐะฝะดะฐ! ะฏ ะตั‰ั‘ ัะพะณะปะฐัะธะปัั ะฝะฐ ะฟะฐััะธะฒ "ัะฝะธะทัƒ" ! ะะพ ัั‚ะพ ัƒะถะต ะดะฐะถะต ะฝะต ะฟะฐััะธะฒ, ัั‚ะพ ัƒะถะต ะฑะตะท ะฐะบั‚ะธะฒ ะบะฐะบะพะน-ั‚ะพ!\n- ะ ั‡ั‚ะพ ั‚ั‹ ะพั‚ ะผะตะฝั ั…ะพั‡ะตัˆัŒ-ั‚ะพ? ะฏ ัƒะถะต ะฝะต ะผะพะณัƒ ัะดะตั€ะถะธะฒะฐั‚ัŒัั!\n- ะะต ะฒะดะฐะฒะปะธะฒะฐะน ะผะตะฝั ะขะะš ะฒ ัั‚ะตะฝัƒ-ั‚ะพ!\n- ะขั‡.
ะœะพััˆะธ, ั…ะฒะฐั‚ะธั‚ ะตะปะพะทะธั‚ัŒ ะธ ะฝะฐัะปะฐะถะดะฐะนัั ะผะพะผะตะฝั‚ะพะผ!\n- ะะพ ัั‚ะพ ะฝะต ั‡ะตัั‚ะฝะพ! ะŸะพ ะพั‚ะฝะพัˆะตะฝะธัŽ, ะผะตะถะดัƒ ะฟั€ะพั‡ะธะผ, ะบ ั‚ะต...ะผะผะผะผะผ!!!\n- ะขั‡. ะœะพััˆะธ, ั‡ั‚ะพ ัั‚ะพ?\n- ะ‘ะฐะฝั‚ะธะบ!\n- ะฏ ะฒะธะถัƒ, ะฝะฐ ะบะพะน ั‡ั‘ั€ั‚ ั‚ั‹ ะผะฝะต ะตะณะพ ะทะฐะฒัะทะฐะป?\n- ะขะฐะบ ั‚ั‹ ะฟะพั…ะพะถ ะฝะฐ ะฟะพะดะฐั€ะพะบ!\n- ะกั‡ะฐั ะœัƒะณะตะฝะพะผ ะพะณั€ะตะฑั‘ัˆัŒ!\n- ะ ะฟะพั‡ะตะผัƒ ั‚ั‹ ะฝะต ัะฟั€ะพัะธัˆัŒ "ะšะพะผัƒ?"\n- ะœะฝะต ัั‚ะพ ะฝะต ะธะฝั‚ะตั€ะตัะฝะพ! ะ ะบะพะผัƒ?\n- ะœะฝะต!!! ... *ะงะผะพะบ*\n- ะฅะผ... ะผะตะฝั ัั‚ะพ ะฝะต ัƒัั‚ั€ะฐะธะฒะฐะตั‚!\n- ะงะตะณะพ?!!\n- ะขะพะณะพ!\n- ะœะœะœะœ!!!\n- ะšะพะผัƒะธ! ะงั‚ะพ ัั‚ะพ ะทะฝะฐั‡ะธั‚? ะงั‚ะพ ั ะะปะปะตะฝะพะผ?\n- ะงั‚ะพ? ะ, ะšะฐะฝะดะฐ. ะะปะปะตะฝะฐ ะฃะพะปะบะตั€ะฐ ั€ะฐะฝะธะปะธ ะฝะฐ ะผะธััะธะธ!\n- ะญั‚ะพ ั ัƒะถะต ะฟะพะฝัะป! ะงั‚ะพ ั ะฝะธะผ, ะณะพะฒะพั€ะธ ะบะพะฝะบั€ะตั‚ะฝะตะน! - ัะฐะผัƒั€ะฐะน ะฒัั‚ั€ัั…ะฝัƒะป ะฝะฐั‡ะฐะปัŒะฝะธะบะฐ.\n- ะะต ะฟะพะฒั‹ัˆะฐะน ะฝะฐ ะผะตะฝั ะณะพะปะพั! - ะฒะพะทะผัƒั‚ะธะปัั ัะผะพั‚ั€ะธั‚ะตะปัŒ.\n- ะ’ะพั‚ ั ะฒะฐัˆะตะน ัะตัั‚ั€ะพะน ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ ัะปัƒั‡ะธั‚ัั, ั ะฒะฐะผ ั‚ะพะถะต ัะบะฐะถัƒ "ะะต ะบั€ะธั‡ะธั‚ะต!" ะงั‚ะพ ั ะœะพััˆะธ?\n- ะญั…... ะฝะฐ ะผะธััะธะธ ะะปะปะตะฝ ะพัะปะตะฟ. ะะพ ะฝะต ะฟะตั€ะตะถะธะฒะฐะน. ะญั‚ะพ ะฒั€ะตะผะตะฝะฝะพ! ะ—ั€ะตะฝะธะต ะฒะพััั‚ะฐะฝะพะฒะธั‚ัั! ะœะตััั†ะฐ ั‡ะตั€ะตะท 3!\n- 3 ะœะ•ะกะฏะฆะ?!\n- ะ”ะฐ! ะขั‹ ัƒะถ ะฝะต ะพะฑะธะถะฐะน ะตะณะพ ะฟะพะบะฐ.\n- ะ‘ะตะท ะฒะฐั ะทะฝะฐัŽ!\n- ะขั‹ ะบัƒะดะฐ?\n- ะš ะะปะปะตะฝัƒ, ะบัƒะดะฐ ะถะต ะตั‰ั‘! - ะณั€ะพะทะฝะพ ั€ัะฒะบะฝัƒะป ัะฐะผัƒั€ะฐะน.\n- ะžั… ัƒะถ ัั‚ะธ ะณะพะปัƒะฑะบะธ...\n- ะš-ะบั‚ะพ ะทะดะตััŒ? - ะะปะปะตะฝ ัะธะดะตะป ะฝะฐ ะบะพะนะบะต, ะทะฐะฒะตั€ะฝัƒะฒัˆะธััŒ ะฒ ะพะดะตัะปะพ.\n- ... - ัˆะฐะณะธ ะฟั€ะธะฑะปะธะถะฐะปะธััŒ.\n- ะ-ะฝะต ะฟะพะดั…ะพะดะธ! - ะฐ ะฒั‹ ะฑั‹ ะฝะต ะธัะฟัƒะณะฐะปะธััŒ, ะฟั€ะตะถะดะต ะพัั‚ะฐะฒัˆะธััŒ ะฒ ะพะดะธะฝะพั‡ะตัั‚ะฒะต, ัั€ะตะดะธ ะฐะบัƒะผ, ะฒ ะณัƒัั‚ะพะผ ะปะตััƒ, ะฑะตะท ะทั€ะตะฝะธั? 
ะขะพ-ั‚ะพ ะถะต!\n- "ะะต ะพัั‚ะฐะฒะปัŽ!"\n- ะงะธัั‚ะฐั ะกะธะปะฐ! - ะทะฐะฝั‘ั ั€ัƒะบัƒ, ะฟั€ะตะดัƒะฟั€ะตะถะดะฐั ะฒั€ะฐะณะฐ.\n- "ะะธ ะทะฐ ั‡ั‚ะพ ะฑะพะปัŒัˆะต ะฝะต ะพัั‚ะฐะฒะปัŽ ะพะดะฝะพะณะพ!" ะะปะปะตะฝ! - ะฟะพะดั…ะฒะฐั‚ะธั‚ัŒ ั‚ะพะฝะบะพะต ั‚ะตะปัŒั†ะต ะฝะฐ ั€ัƒะบะธ, ะฟั€ะธะถะฐั‚ัŒ ะบ ัะตะฑะต, ะปะฐะดะพะฝัŒัŽ ะฝะฐะบั€ั‹ะฒ ะณะปะฐะทะฐ ะœะพััˆะธ, ะบะฐะบ ะฒ ะฟะตั€ะฒะพะผ ะฟะพั†ะตะปัƒะต. ะ˜ ะบะพัะฝัƒั‚ัŒัั ัƒะณะพะปะบะฐ ั€ะพั‚ะธะบะฐ ัะฒะพะธะผะธ ะณัƒะฑะฐะผะธ.\n- ะš-ะšะฐะฝะดะฐ?!\n- ะะต ะฒะพะปะฝัƒะนัั! ะฏ ัั‚ะฐะฝัƒ ั‚ะฒะพะธะผะธ ะณะปะฐะทะฐะผะธ ะฟะพะบะฐ ั‚ั‹ ะฟั€ะพะดะพะปะถะฐะตัˆัŒ ะฑั‹ั‚ัŒ ะผะพะธะผ ะกะตั€ะดั†ะตะผ ะะตะฒะธะฝะฝะพัั‚ะธ.\nะ ั‚ั‹ ัƒัั‚ะฐะฒัˆะธะน ะธะดั‘ัˆัŒ ั ะผะธััะธะธ.\nะ ั ัƒัั‚ะฐะฒัˆะธะน ะธะดัƒ ั ั‚ั€ะตะฝะธั€ะพะฒะบะธ.\nะขะฒะพะธ ะฝะพะณะธ ะธัั‚ะพะฟั‚ะฐะฝั‹.\nPOV ะšะฐะฝะดั‹.\nะœะพั ะณะพะปะพะฒะฐ ะฑะพะปะธั‚.\nะขะฒะพะธ ั€ัƒะบะธ ะฝะพัŽั‚.\nะœะพั‘ ัะตั€ะดั†ะต ะธัั‚ะพะผะธะปะพััŒ.\nะ˜ ะฒะพั‚ ะผั‹ ะธะดั‘ะผ ะดั€ัƒะณ ะฝะฐ ะดั€ัƒะณะฐ, ะฟะพะดะฝะธะผะฐะตะผ ะณั€ัƒัั‚ะฝั‹ะต, ะธะทะผัƒั‡ะตะฝะฝั‹ะต ะณะปะฐะทะฐ ะดั€ัƒะณ ะบ ะดั€ัƒะณัƒ.\nะขั‹ ะพัั‚ะฐะฝะฐะฒะปะธะฒะฐะตัˆัŒัั.\nะ”ัƒั€ะฐะบ, ั‡ั‚ะพ-ั‚ะพ ะณะพะฒะพั€ะธัˆัŒ.\nะงั‚ะพ-ั‚ะพ ะบั€ะธั‡ะธัˆัŒ.\nะž ั‡ั‘ะผ-ั‚ะพ ะผะพะปั‡ะธัˆัŒ.\nะšะฐะบ-ั‚ะพ ัะผะพั‚ั€ะธัˆัŒ.\nะž ั‡ั‘ะผ-ั‚ะพ ะฒะพะปะฝัƒะตัˆัŒัั.\nะกะฝะพะฒะฐ ะพ ั‡ั‘ะผ-ั‚ะพ ะบั€ะธั‡ะธัˆัŒ.\nะก ะณั€ัƒัั‚ัŒัŽ ัะผะพั‚ั€ะธัˆัŒ.\nะž ะบะพะผ ั‚ั‹ ะผัƒั‡ะฐะตัˆัŒัั?\nะ”ะตะปะฐะตัˆัŒ ัˆะฐะณ, ะตั‰ั‘ ะพะดะธะฝ.\nะฅะฒะฐั‚ะฐะตัˆัŒ ะทะฐ ะฒะพั€ะพั‚ะฝะธะบ.\nะŸั€ะธะฒัั‚ะฐั‘ัˆัŒ ะฝะฐ ะฝะพัะพั‡ะบะฐั….\nะฆะตะปัƒะตัˆัŒ...\nะ”ัƒั€ะฐะบ, ั‚ั‹ ะถะต ัƒัั‚ะฐะป!\nะ”ัƒั€ะฐะบ, ั ะถะต ัƒัั‚ะฐะป!\nะฏ ะพัั‚ะฐะฝะฐะฒะปะธะฒะฐัŽััŒ.\nะ”ัƒั€ะฐะบ, ั‡ั‚ะพ-ั‚ะพ ะพั‚ะฒะตั‡ะฐัŽ.\nะงั‚ะพ-ั‚ะพ ะบั€ะธั‡ัƒ.\nะะฐ ั‡ั‘ะผ-ั‚ะพ ะทะฐะผะพะปะบะฐัŽ.\nะขัƒะฟะพ ัะผะพั‚ั€ัŽ.\nะงั‚ะพ-ั‚ะพ ั‰ะตะผะธั‚.\nะžะฟัั‚ัŒ ั‡ั‚ะพ-ั‚ะพ ะพั€ัƒ.\nะžั‚ั€ะตัˆั‘ะฝะฝะพ ัะผะพั‚ั€ัŽ.\nะ—ะฐ ะบะพะณะพ-ั‚ะพ ะฒะพะปะฝัƒัŽััŒ.\nะกั‚ะพัŽ.\nะะฐัั‚ะพั€ะฐะถะธะฒะฐัŽััŒ.\nะ’ัั‘ ั€ะฐะฒะฝะพ.\nะะตัƒะถะตะปะธ?!\nะžั‚ะฒะตั‡ะฐัŽ ะฝะฐ 
ะฟะพั†ะตะปัƒะน.\nะšะฐะบ ะถะต ะผั‹ ัƒัั‚ะฐะปะธ!\n- ะ”ะฐะฒะฝะพ?\n- ะ’ัะตะณะดะฐ, ะœะพััˆะธ.\n- ะะตั‚, ั‡ะตัั‚ะฝะพ, ั ะฝะตะฝะฐะฒะธะถัƒ ะณะพะปัƒะฑะตะน! ะ ะพัะพะฑะตะฝะฝะพ ะฑะตะปั‹ั…! - ัะบะฐะฝะดะฐะปะธะป ะ›ะฐะฒะธ ะธะดั ะฟะพ ะบะพั€ะธะดะพั€ัƒ ะงั‘ั€ะฝะพะณะพ ะžั€ะดะตะฝะฐ.\n- ะงะตะผ ะถะต ะพะฝะธ ั‚ะตะฑะต ะฝะต ะฝั€ะฐะฒัั‚ัั? - ัะฟั€ะพัะธะปะฐ ะดะตะฒัƒัˆะบะฐ-ะบะธั‚ะฐัะฝะบะฐ.\n- ะ”ะฐ ะฑะปะธะฝ, ัะธะดัั‚ ะฒะตะทะดะต ะณะดะต ะฝะต ะฟะพะฟะฐะดั ะธ ัะฒะตั€ั…ัƒ ะบะฐะบะฐัŽั‚!\nะ˜ ั‚ะพะปัŒะบะพ ะพะฝะธ ะทะฐัˆะปะธ ะทะฐ ะฟะพะฒะพั€ะพั‚, ะบะฐะบ ัƒัะปั‹ัˆะฐะปะธ:\n- ะะตั‚, ะšะฐะฝะดะฐ, ะฟั€ะตะบั€ะฐั‚ะธ! ะะฐั ะผะพะณัƒั‚ ัƒะฒะธะดะตั‚ัŒ!\n- ะ”ะฐ ะบั‚ะพ ะฝะฐั ั‚ัƒั‚ ะผะพะถะตั‚ ัƒะฒะธะดะตั‚ัŒ, ะœะพััˆะธ?\n- ะัƒ, ะบั‚ะพ-ะฝะธะฑัƒะดัŒ! ะั…... ะฎัƒ!\n- ะœะธะผะพ ะณะพะปัƒะฑัั‚ะฝะธ ะฝะธะบั‚ะพ ะฝะต ะฟั€ะพั…ะพะดะธั‚! ะญั‚ะพ ะฝะฐะดั‘ะถะฝะฐั ั‡ะฐัั‚ัŒ ะทะฐะผะบะฐ!\n- ะัƒ, ะฝัƒ, ะฝัƒ ะปะฐะดะฝะพ... ะฝะพ ะฟะพั‡ะตะผัƒ ะธะผะตะฝะฝะพ ั‚ัƒั‚?\n- ะ ะบั‚ะพ ะฒะตั€ะตั‰ะฐะป ั‡ั‚ะพ ะตะผัƒ ั€ะพะผะฐะฝั‚ะธะบะธ ะฝะต ั…ะฒะฐั‚ะฐะตั‚? - ะšะฐะฝะดะฐ ะฝะตะดะฒัƒัะผั‹ัะปะตะฝะฝะพ ะทะฐะถะธะผะฐะป ะะปะปะตะฝะฐ ะฝะฐ ะฟะพะดะพะบะพะฝะฝะธะบะต.\nะ›ะฐะฒะธ ะธ ะ›ะธะฝะฐะปะธ ัˆะฐั€ะฐั…ะฝัƒะปะธััŒ ะพะฑั€ะฐั‚ะฝะพ.\n- ะฅะพั‚ั ะทะฝะฐะตัˆัŒ, ะ›ะธ. ะœะพะถะตั‚ ะฒ ัั‚ะธั… ะณะพะปัƒะฑะบะฐั… ั‡ั‚ะพ-ั‚ะพ ะธ ะตัั‚ัŒ!\nะ”ั‹ัˆะฐั‚ัŒ ัั‚ะฐะฝะพะฒะธั‚ัั ะฒัั‘ ั‚ั€ัƒะดะฝะตะต, ะผะตะฝั ะทะฐะณะพะฝััŽั‚ ะฒ ัƒะณะพะป. ะฏ ะธัะฟัƒะณะฐะฝะฝะพ ะพะฑะพั€ะฐั‡ะธะฒะฐัŽััŒ ะธ ะฒะธะถัƒ... ะตะณะพ.\n- ะขั‹, ะฟั€ะพะบะปัั‚ั‹ะน! - ะพะบั€ะธะบะธะฒะฐะตั‚ ัั‚ั€ะพะณะธะน ัะฟะพะฝะตั†.\nะ ั ะผะพะปั‡ัƒ, ะดั‹ั…ะฐะฝะธะต ัะฑะธะปะพััŒ, ะฒะทะฒะพะปะฝะพะฒะฐะฝ. ะงั‚ะพ ะตะผัƒ ะฝัƒะถะฝะพ?\n- ะะต ะดัƒะผะฐะป ั‡ั‚ะพ ั‚ั‹ ะพะฟัƒัั‚ะธัˆัŒัั ะดะพ ะบั€ะฐะถ, ะœะพััˆะธ.\nะงั‚ะพ? ะšั€ะฐะถ? ะšะฐะบะธั… ะบั€ะฐะถ, ะฎัƒ?\n- ะขั‹ ะพ ั‡ั‘ะผ?\n- ะ˜ะผะตะฝะฝะพ ั‚ั‹, ะพ ั ะฝะต ัะพะผะฝะตะฒะฐัŽััŒ, - ะพะฝ ะธะทะดะตะฒะฐะตั‚ัั? - ั‚ั‹ ัƒะบั€ะฐะป ัƒ ะผะตะฝั ะพะดะฝัƒ ะฒะตั‰ัŒ.\n- ะšะฐะฝะดะฐ, ั ะฝะธั‡ะตะณะพ ะฝะต ะฑั€ะฐะป! 
- ะพั‚ะบัƒะดะฐ ัั‚ะพ ั‡ัƒะฒัั‚ะฒะพ ะฑะตะทั‹ัั…ะพะดะฝะพัั‚ะธ? ะžะฝ ะพัั‚ะฐะฝะพะฒะธะปัั.\n- ะ›ะธะฑะพ ะฒะตั€ะฝะธ ะผะฝะต ะตะณะพ, ะปะธะฑะพ ั ะทะฐะฑะตั€ัƒ ั‚ะฒะพั‘! - ะงั‚ะพ? ะฏ ะตะณะพ ะฝะต ะฟะพะฝะธะผะฐัŽ! ะงั‚ะพ ะพะฝ ะดะตะปะฐะตั‚? ะฅะฒะฐั‚ะฐะตั‚ ะทะฐ ะฟะพะดะฑะพั€ะพะดะพะบ, ั‚ะฐั‰ะธั‚ ะฝะฐ ัะตะฑั. ะ˜... ะฑะพะถะต, ั‡ั‚ะพ ัั‚ะพ? ะžะฝ... ะฏ ั‡ัƒะฒัั‚ะฒัƒัŽ ะตะณะพ ะณัƒะฑั‹ ะธ ัะฐะผ ะฝะต ะฟะพะฝะธะฐั - ะพั‚ะฒะตั‡ะฐัŽ! ะšะฐะฝะดะฐ, ั‚ั‹, ั‚ั‹, ั‚ั‹... ะฝะฐัั‚ะพัั‰ะธะน ะฒะพั€! ะขั‹ ะบั€ะฐะดั‘ัˆัŒ ะผะพั‘ ัะตั€ะดั†ะต!\n- ะš-ะบะฐะฝะดะฐ... - ั€ัƒะบะธ ะฟะพะฒะธัะปะธ. ะะต ัะพะพะฑั€ะฐะถะฐัŽ.\n- ะ’ะตั€ะฝะธ ะผะพั‘ ัะตั€ะดั†ะต, ะ“ั€ั‘ะฑะฐะฝะฝั‹ะน ะกั‚ั€ัƒั‡ะพะบ!\n- ะฏ... ะฝะธ ะทะฐ ั‡ั‚ะพ! ะฏ ะพัั‚ะฐะฒะปัŽ ะตะณะพ ัะตะฑะต! ะ ั‚ั‹... ัƒะถะต ะทะฐะฑั€ะฐะป ะผะพั‘...\nะ’ะพะปะฝะพะฒะฐะปัั ะบะฐะบ-ั‚ะพ ะะปะปะตะฝ, ะฒะตะดัŒ ะšะฐะฝะดะฐ ะฑั‹ะป ะฝะฐ ะผะธััะธะธ. ะ˜ ะฒะพั‚ ัˆะปั‘ั‚ ะพะฝ ะตะผัƒ ะฟะธััŒะผะพ ั‡ะตั€ะตะท ะณะพะปะตะผะฐ.\nะ: ะขั‹ ั‚ะฐะผ ะฒะพะพะฑั‰ะต ะถะธะฒะพะน, ะ‘ะฐะšะฐะฝะดะฐ?\nะš: ะ”ะฐ, ะะปะปะตะฝ. ะ–ะธะฒะพะน, ั ะถะธะฒะพะน!\nะ: ะšะฐะฝะดะฐ! ะงั‚ะพ ั ั‚ะพะฑะพะน? ะขะตะฑะต ะฟะปะพั…ะพ? ะฃะผะธั€ะฐะตัˆัŒ? ะขะตะฑั ะ›ะธะฝะฐะปะธ ะฟะพั†ะตะปะพะฒะฐะปะฐ? ะ”ะตั€ะถะธััŒ ะดั€ัƒะณ!\nะš: ะขั‹ ั‡ั‘? ะกะพ ะผะฝะพะน ะฒัั‘ ั…ะพั€ะพัˆะพ, ะœะพััˆะธ!\nะ: ะคัƒั…... ะฝะต ะฟัƒะณะฐะน ะผะตะฝั ั‚ะฐะบ ะฑะพะปัŒัˆะต.\n- ะญะน, ะšะฐะฝะดะฐ, ะฝะต ะณั€ัƒัั‚ะธ! ะญั‚ะพ ั‚ะฐะบ ะฝะฐ ั‚ะตะฑั ะฝะต ะฟะพั…ะพะถะต!\n- ะขะตะฑะต ะปะตะณะบะพ ะณะพะฒะพั€ะธั‚ัŒ. ะฃ ะฒะฐั ั ะ›ะฐะฒะธ ะฒัั‘ ะฝะฐะปะฐะถะธะฒะฐะตั‚ัั. ะ ะฝะฐ ะผะตะฝั ะฃะพะปะบะตั€ ะดะฐะถะต ะฝะต ัะผะพั‚ั€ะธั‚!\n- ะขั‹ ะฟะพะณะพะฒะพั€ะธ ั ะฝะธะผ!\n- ะขั‡, ะธ ั‚ะฐะบ ะบะฐะถะดั‹ะน ะดะตะฝัŒ ะฒะธะดะธะผัั!\n- ะะตั‚, ะฎัƒ, ะฟะพะณะพะฒะพั€ะธ ั ะฝะธะผ, ะบะฐะบ ัะพ ะผะฝะพะน! ะ’ัั‘ ะพะฑั€ะฐะทัƒะผะธั‚ัŒัั ัะปั‹ัˆะธัˆัŒ?\n- ะะต ะฒะตั€ัŽ ั ะฒ ัั‚ะพ.\n- ะ ั‚ั‹ ะฟะพะฒะตั€ัŒ! ะงัƒะดะพ - ะฑั‹ะฒะฐะตั‚!\nhttp://vkontakte.ru/photo63528512_276702591\n-ะะต ะพั‚ะดะฐะผ. ะกะปั‹ัˆะธั‚ะต?! 
ะะธะบะพะณะดะฐ ะฝะต ะพั‚ะดะฐะผ ะฒะฐะผ ะšะฐะฝะดัƒ!!!\n- ะญั‚ะพ ะฝะต ั‚ะตะฑะต ั€ะตัˆะฐั‚ัŒ, ะะปะปะตะฝ ะฃะพะปะบะตั€!\n- ะะต. ะŸะพะดั…ะพะดะธั‚ะต. ะš. ะะฐะผ.\n- ะžะฝ ะฟั€ะธะฝะฐะดะปะตะถะธั‚ ะฝะฐะผ!\n- ะฏ... ะพะฝ... ะฃะ‘ะฌะฎ!!!\nhttp://vkontakte.ru/photo63528512_276702661', 'ะšะพะฝะตั† ะณะพะดะฐ - ัั‚ะพ ะฟะพั€ะฐ ะดะปั ั€ะฐะดะพัั‚ะธ, ะฒ ะฟั€ะตะดั‡ัƒะฒัั‚ะฒะธะธ ะฝะฐะดะฒะธะณะฐัŽั‰ะธั…ัั ะบะฐะฝะธะบัƒะป, ัะฒะพะฑะพะดั‹. ะญั‚ะพ ะฑั‹ะปะพ ะฝะฐั‡ะฐะปะพ ะผะฐั, ะบะพะณะดะฐ ะฝะฐ ัƒะปะธั†ะต ัƒะถะต ั‚ะตะฟะปะพ, ะฐ ะฟะพ ัƒั‚ั€ะฐะผ ะทัะฑะบะพ. ะšะพะณะดะฐ ั†ะฒะตั‚ั‹ ัƒะถะต ั€ะฐัั†ะฒะตะปะธ ะธ ะฝะฐั‡ะฐะปะธ ะฑะปะฐะณะพัƒั…ะฐั‚ัŒ. ะกั‹ั€ะฐั ะทะตะผะปั ะฟะพะบั€ั‹ะฒะฐะปะฐััŒ ั‚ั€ะฐะฒะธะฝะพั‡ะบะฐะผะธ, ะธ ะฟะพ ะฝะตะน ั‚ัƒะดะฐ-ััŽะดะฐ ัะฝะพะฒะฐะปะธ ะฑัƒะบะฐัˆะบะธ-ั‚ะฐั€ะฐะบะฐัˆะบะธ.\nะŸั‚ะธั†ั‹ ะปะตั‚ะฐะปะธ ะฝะฐะด ะดะตั€ะตะฒัŒัะผะธ, ั‡ะธั€ะธะบะฐั ะธ ัั‚ั€ะตะบะพั‡ะฐ, ะฐ ะบะฐะบะฐั-ั‚ะพ ะพัะพะฑะตะฝะฝะพ ัƒัะตั€ะดะฝะพ ะฝะฐะฟะตะฒะฐะปะฐ:\n~ midori tanabiku namimori no\ndainaku shounaku nami ga ii\nitsumo kawaranu\nsukoyaka kenage\naa~\ntomo ni utaou\nnamimorichuu ~\nะ”ะฐ... ัั‚ะพ ะฑั‹ะปะฐ ั‚ะฐ ัะฐะผะฐั ั‡ะพะบะฝัƒั‚ะฐั ะฟั‚ะธั‡ะบะฐ, ั…ะพะทัะธะฝะพะผ ะบะพั‚ะพั€ะพะน ะฑั‹ะป ะฝะต ะผะตะฝะต ั‡ะพะบะฝัƒั‚ั‹ะน ะฅะธะฑะฐั€ะธ ะšั‘ั. ะฅะพั‚ั ะฝะฐะทะฒะฐั‚ัŒ ะตะณะพ ั‚ะฐะบ ะฟั€ะธะปัŽะดะฝะพ ะฝะธ ัƒ ะบะพะณะพ ะฑั‹ ัะทั‹ะบ ะฝะต ะฟะพะฒะตั€ะฝัƒะปัั... ะฝัƒ, ะฟะพั‡ั‚ะธ ะฝะธ ัƒ ะบะพะณะพ.\nะ’ั€ะตะผะตะฝะฐ ัˆะบะพะปัŒะฝะพะน ะฟะพั€ั‹ ะฟั€ะพัˆะปะธ, ะธ ั‚ะตะฟะตั€ัŒ ะฝะฐัั‚ะฐะปะธ ะฝะต ะผะตะฝะตะต ะฝะฐัั‹ั‰ะตะฝะฝั‹ะต ะฒั€ะตะผะตะฝะฐ ัั‚ัƒะดะตะฝั‡ะตัั‚ะฒะฐ. ะขะฐะบ ัƒะถ ะฟะพะปัƒั‡ะธะปะพััŒ, ััƒะดัŒะฑั‹ ะทะปะฐั ัˆัƒั‚ะบะฐ, ั‡ั‚ะพ ะฑะตะดะฝะพะณะพ ะกะฐะฒะฐะดัƒ ะขััƒะฝะฐั‘ัˆะธ ะฟะตั€ะตะฝะฐะฟั€ะฐะฒะธะปะธ ะฒ ัƒะฝะธะฒะตั€ัะธั‚ะตั‚, ะณะดะต ะณะปะฐะฒะพะน ะดะธัั†ะธะฟะปะธะฝะฐั€ะฝะพะณะพ ะบะพะผะธั‚ะตั‚ะฐ ะฑั‹ะป ัั‚ั€ะฐั… ะธ ัƒะถะฐั ะตะณะพ ะถะธะทะฝะธ - ะฅะธะฑะฐั€ะธ ะšั‘ั! ะัƒ, ั€ะฐะทัƒะผะตะตั‚ัั ะฟะพัะปะต ั€ะตะฟะตั‚ะธั‚ะพั€ะฐ... ะฝะพ ะฝะต ะพะฑ ัั‚ะพะผ ัะตะนั‡ะฐั. 
ะ›ัŽะฑะพะฟั‹ั‚ะฝะพ, ั‡ั‚ะพ ะฑะตะดะฝะพะณะพ ะกะฐะฒะฐะดัƒ ะขััƒะฝะฐั‘ัˆะธ, ะพัˆะธะฑะพั‡ะฝะพ, ะทะฐะฟะธั…ะฝัƒะปะธ ัั€ะฐะทัƒ ะฝะฐ 2 ะบัƒั€ั! ะœ-ะดะฐ... ะฝะต ะฟะพะฒะตะทะปะพ ั€ะตะฑั‘ะฝะบัƒ...\nะะพ ั‚ัƒั‚ ั„ะพั€ั‚ัƒะฝะฐ ะฟะพะฒะตั€ะฝัƒะปะฐััŒ ะบ ะฝะตะผัƒ ัะฒะพะธะผ ั€ั‹ะปะพะผ, ะธ ะฒ ะตะณะพ ะบะปะฐััะต ะพะฝ ะฟะพะฒัั‚ั€ะตั‡ะฐะป ะทะฐะผะตั‡ะฐั‚ะตะปัŒะฝะพะณะพ ั‡ะตะปะพะฒะตะบะฐ - ะะปะปะตะฝะฐ ะฃะพะปะบะตั€ะฐ.\nะก ะฝะธะผ ะพะฝะธ ะผะธะณะพะผ ัะดั€ัƒะถะธะปะธััŒ ะธ ัั‚ะฐะปะธ, ะฝะต ั€ะฐะทะปะตะน ะฒะพะดะฐ. ะะพ ัั‚ะพ ะฑั‹ะปะพ ะพัะตะฝัŒัŽ, ะฐ ั‚ะตะฟะตั€ัŒ ะฒะตัะฝะฐ! ะ ัั‚ะพ ะทะฝะฐั‡ะธั‚...\nะกั†ะตะฝะฐ 1. ะ”ัƒะฑะปัŒ 1.\n- ะขััƒะฝะฐ, ะฝะต ะฟะตั€ะตะถะธะฒะฐะน ั‚ั‹ ั‚ะฐะบ! ะกะดะฐัˆัŒ ั‚ั‹ ัั‚ะธ ัะบะทะฐะผะตะฝั‹! ะ’ะตะดัŒ ะธ ั, ะธ ั‚ะฒะพะน ั€ะตะฟะตั‚ะธั‚ะพั€ ะทะฐะฝะธะผะฐะปะธััŒ ั ั‚ะพะฑะพะน ะฒะตััŒ ัƒั‡ะตะฑะฝั‹ะน ะณะพะด! ะขั‹ ะดะฐะถะต ะฝะฐั‡ะฐะป ะฟะพะฝะธะผะฐั‚ัŒ ะฐะทั‹ ัะปะตะบั‚ั€ะพั„ะธะทะธะบะธ! - ัƒัะฟะพะบะฐะธะฒะฐะป ะฒะตั‡ะฝะพ ะปะพัะปัŒะฝั‹ะน ัะตะดะพะน, ะฟะพะณะปะฐะถะธะฒะฐั ะขััƒะฝัƒ ะฟะพ ะฟัƒัˆะธัั‚ะพะน ะบะฐัˆั‚ะฐะฝะพะฒะพะน ัˆะตะฒะตะปัŽั€ะต. - ะัƒ, ะฐ ะตัะปะธ ั‡ั‚ะพ, ะพัั‚ะฐะฝะตัˆัŒัั ะฝะฐ ะฒั‚ะพั€ะพะน ะณะพะด! ะ’ะพะฝ, ะฝะตะบะพั‚ะพั€ั‹ะต ั‚ะฐะบ ัƒะถะต 3 ั€ะฐะทะฐ ะดะตะปะฐะปะธ! - ะบะธะฒะฝัƒะป ะพะฝ ะฝะฐ ะšะฐะฝะดัƒ, ั‡ั‚ะพ ัะธะดะตะป ัƒ ะพะบะฝะฐ ะฒ ะบะพะฝั†ะต ะบะปะฐััะฐ.\nะšะฐะฝะดะฐ ะฎัƒ, ะพ-ะพ-ะพ! ะญั‚ะพ, ะฒะพะพะฑั‰ะต, ะพั‚ะดะตะปัŒะฝะฐั ะธัั‚ะพั€ะธั! ะฅัƒะปะธะณะฐะฝ, ะพั‚ะปะธั‡ะฝะธะบ, ะบั€ะฐัะฐะฒะตั†, ะฟะพัะปะตะดะฝัั ัะบะพั‚ะธะฝะฐ, ั‡ะตะปะพะฒะตะบ ั‡ะตัั‚ะธ, ะฑะตะทะดะฐั€ัŒ, ะณั€ะพะทะฐ ะฒัะตั… ะธ ะฒัั... ั‡ัƒะฒัั‚ะฒะฐ ัะผะตัˆะฐะฝะฝั‹ะต. ะšะฐะบ ะฒัั‘ ัั‚ะพ ะธ ะตั‰ั‘ ะผะฝะพะณะพ "ะฟะพะปะพะถะธั‚ะตะปัŒะฝั‹ั…" ะบะฐั‡ะตัั‚ะฒ ะฝะฐั…ะพะดัั‚ัั ะฒ ะพะดะฝะพะผ ั‡ะตะปะพะฒะตะบะต, ะะปะปะตะฝ ะพั‚ะบะฐะทั‹ะฒะฐะปัั ะฟะพะฝะธะผะฐั‚ัŒ!\n- ะะพ ะพะฝ ั…ะพั‚ั ะฑั‹ ะบั€ัƒั‚ะพะน, ะธ ะพั‚ะปะธั‡ะฝะธะบ, ะฐ ั ะบะฐะบ ะฑั‹ะป ะฝะธะบั‡ะตะผะฝั‹ะผ, ั‚ะฐะบะธะผ ะธ ะพัั‚ะฐะฝัƒััŒ. ะœะฝะต ะฝะต ัะดะฐั‚ัŒ ัั‚ะธ ัะบะทะฐะผะตะฝั‹, ะฝะธ ะทะฐ ั‡ั‚ะพ ะฒ ะถะธะทะฝะธ! 
- ะฟั€ะพะดะพะปะถะฐะป ัั‚ั€ะฐะดะฐั‚ัŒ ะขััƒะฝะฐ, ัั…ะฒะฐั‚ะธะฒัˆะธััŒ ะทะฐ ะณะพะปะพะฒัƒ. ะขะฐะบะธะผ ะพะฝ ะฑั‹ะป, ัะปะธัˆะบะพะผ ะฝะตัƒะฒะตั€ะตะฝะฝั‹ะผ ะฒ ัะตะฑะต, ะฟะตััะธะผะธัั‚ะธั‡ะฝั‹ะผ, ะฐ ะตั‰ั‘ ะฟะพัะปะตะดะฝะธะผ ะฝะตัƒะดะฐั‡ะฝะธะบะพะผ... ัะฟะธัะพะบ ะผะพะถะฝะพ ะฟั€ะพะดะพะปะถะธั‚ัŒ. ะะพ ะฒ ั‚ะพะถะต ะฒั€ะตะผั, ั€ะฐะดะธ ะดั€ัƒะทะตะน ะพะฝ ะฑั‹ะป ะณะพั‚ะพะฒ ะฝะฐ ะผะฝะพะณะพะต! ะ•ะณะพ ะพั‚ะทั‹ะฒั‡ะธะฒะพัั‚ัŒ, ะดะพะฑั€ะพั‚ะฐ ะฝะต ะทะฝะฐะปะฐ ะณั€ะฐะฝะธั†. ะ•ัะปะธ ะบั‚ะพ-ั‚ะพ ะพะฑะธะถะฐะป ะตะณะพ ะดั€ัƒะทะตะน, ะตะณะพ ะณะปะฐะทะฐ ัั‚ะฐะฝะพะฒะธะปะธััŒ ะพั€ะฐะฝะถะตะฒั‹ะผะธ, ะฐ ัะฐะผ ะพะฝ ัะตั€ัŒั‘ะทะฝั‹ะผ ะธ ะผะตะณะฐ-ัะธะปัŒะฝั‹ะผ.\n- ะ‘ะฐะšะฐะฝะดะฐ-ั‚ะพ?! ะฅะฐ-ั…ะฐ-ั…ะฐ! - ั€ะฐััะผะตัะปัั ะฃะพะปะบะตั€. - ะ”ัƒั€ะฐะบ ะดัƒั€ะฐะบะพะผ! ะžะฝ ะฟั€ะพัั‚ะพ ะฒะตะทัƒะฝั‡ะธะบ ั ั€ะตะฟัƒั‚ะฐั†ะธะตะน ะธ ะฒะฝะตัˆะฝะพัั‚ัŒัŽ! ะ˜ ะฒัั‘! - ะพะฝ ะผะฝะพะณะพะทะฝะฐั‡ะธั‚ะตะปัŒะฝะพ ั…ะผั‹ะบะฝัƒะป. - ะ ั‚ั‹, ั‚ั‹ ะดะพะฑั€ั‹ะน ะธ ะผะธะปั‹ะน! ะŸั€ะพัั‚ะพ ะฑัƒะดัŒ ะฟะพัƒะฒะตั€ะตะฝะฝะตะต ะฒ ัะตะฑะต, ะธ ะฒัั‘ ะฟะพะปัƒั‡ะธั‚ัั!\n- ะญั…, ะธ ะบะฐะบ ะตะผัƒ ัƒะดะฐะตั‚ัั ะฑั‹ั‚ัŒ ั‚ะฐะบะธะผ ัƒะฒะตั€ะตะฝะฝั‹ะผ? ะฃ ะผะตะฝั ั‚ะฐะบ ะฝะต ะฟะพะปัƒั‡ะฐะตั‚ัั... - ะฒะทะดะพั…ะฝัƒะป ะกะฐะฒะฐะดะฐ, ะฟะพัะผะพั‚ั€ะตะฒ ะฝะฐ ะšะฐะฝะดัƒ. - ะ”ะฐ, ะธ ะฟั€ะธ ัั‚ะพะผ ะพะฝ ะฝะธั‡ะตะณะพ ะฝะต ะดะตะปะฐะตั‚, ะปะธัˆัŒ ัะธะดะธั‚ ะฝะฐ ัะฒะพั‘ะผ ะผะตัั‚ะต, ะฝะพ ะฒัะต ะดะตะฒั‡ะพะฝะบะธ ะฒะพะทะปะต ะฝะตะณะพ ะฒัŒัŽั‚ัั.\nะะพ ั‚ัƒั‚, ะฒะดั€ัƒะณ, ะšะฐะฝะดะฐ ะฟะพัะผะพั‚ั€ะตะป ะฒ ะธั… ัั‚ะพั€ะพะฝัƒ, ะฐ ะขััƒะฝะฐ ั‚ัƒั‚ ะถะต ะพั‚ะฒะตั€ะฝัƒะปัั ะธ ัะถะฐะปัั, ะฑัƒะดั‚ะพ ะตะณะพ ั‚ะพะปัŒะบะพ ั‡ั‚ะพ ะพะฑะปะธะปะธ ะปะตะดัะฝะพะน ะฒะพะดะพะน.\n- ะคัƒั…...\nะะปะปะตะฝ ั‚ะพะถะต ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะšะฐะฝะดัƒ ะธ, ะฟะพะบะฐะทะฐะฒ ะตะผัƒ ัะทั‹ะบ, ะพั‚ะฒะตั€ะฝัƒะปัั.\n- ะŸั„! ะ˜ ั‡ั‚ะพ ะพะฝะธ ะฒ ะฝั‘ะผ ะฝะฐัˆะปะธ, ะฝะต ะฟะพะฝะธะผะฐ... - ะฒะพั‚ ั‚ะตะฟะตั€ัŒ ัƒะถะต ะะปะปะตะฝ ะทะฐะผะตั€ ัƒัั‚ะฐะฒะธะฒัˆะธััŒ ะฝะฐ ะดะฒะตั€ะฝะพะน ะฟั€ะพั‘ะผ, ะพั‚ะบัƒะดะฐ ะธะทะปัƒั‡ะฐะปะฐััŒ ะฐัƒั€ะฐ ัะผะตั€ั‚ะธ. 
ะญั‚ะพ ะฑั‹ะป ะฅะธะฑะฐั€ะธ ะšั‘ั\n"ะงั‚ะพ ะตะผัƒ ะฝัƒะถะฝะพ?!"\nะกั†ะตะฝะฐ 2. ะ”ัƒะฑะปัŒ 1.\n- ะšั‚ะพ... ะบั‚ะพ ะฟะพัะผะตะป ะฟั€ะธะนั‚ะธ ะฒ ัƒะฝะธะฒะตั€ัะธั‚ะตั‚ ะฑะตะท ัะผะตะฝะบะธ?!!!\nะขัƒั‚ ะฃะพะปะบะตั€ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะฟะพะป ะธ ะฒะทะดั€ะพะณะฝัƒะป. ะ“ั€ัะทัŒ! ะ›ัƒะถะธ ะณั€ัะทะธ ะพั‚ ะฒะพะตะฝะฝั‹ั… ัะฐะฟะพะณ, ะฐ ั‚ะฐะบะธะต ัะฐะฟะพะณะธ ั‚ะพะปัŒะบะพ ัƒ...\n- ะšะฐะฝะดะฐ ะฎัƒ! - ะฒะทั€ะตะฒะตะป ะšั‘ั.\nะะพ ะฟะฐั€ะตะฝัŒ ะปะธัˆัŒ ะพะดะฐั€ะธะป ะตะณะพ ัะฒะพะธะผ ะพะฑั‹ั‡ะฝั‹ะผ, ั€ะฐะฒะฝะพะดัƒัˆะฝั‹ะผ ะฒะทะณะปัะดะพะผ, ะฟะพะปะฝั‹ะผ ั…ะพะปะพะดะฐ.\n- ะงั‚ะพ-ั‚ะพ ะฝะต ั‚ะฐะบ?\n- ะขั‹, ั‚ั€ะฐะฒะพัะดะฝะพะต! - ะฟะพะดะพะนะดั ะบ ะšะฐะฝะดะต, ะฅะธะฑะฐั€ะธ ะปะฐัะบะพะฒะพ ะพั‚ะพะดะฒะธะฝัƒะป ะฟะฐั€ั‚ัƒ. - ะขั‹ ะพั‚ะฒะตั‚ะธัˆัŒ ะทะฐ ั‚ะพ, ั‡ั‚ะพ ะธัะฟะฐั‡ะบะฐะป ะฟะพะปั‹! - ะพะฝ ะฝะฐัะบะฒะพะทัŒ ะฟั€ะพะถะธะณะฐะป ะฒะทะณะปัะดะพะผ.\n- ะฅะผ, ะตั‰ั‘ ั‡ะตะณะพ, - ะณะพั€ะดั‹ะต ัะธะฝะธะต ะณะปะฐะทะฐ ะฟั€ะพะฝะธะทั‹ะฒะฐะปะธ ั…ะพะปะพะดะพะผ ะฒ ะพั‚ะฒะตั‚. ะ’ะดะพะฑะฐะฒะพะบ ะพะฝ ะทะฐะบะธะฝัƒะป ะฝะพะณัƒ ะฝะฐ ะฝะพะณัƒ. - ะกะผะตะฝะบะฐ ะฟะพั€ะฒะฐะปะฐััŒ, ะดั€ัƒะณะพะน ั ะฝะต ะฝะฐัˆั‘ะป, ะฟั€ะธัˆะปะพััŒ ะธะดั‚ะธ ะฒ ัƒะฝะธะฒะตั€ัะธั‚ะตั‚ ั‚ะฐะบ.\n- ะ”ะฐ ะฟะปะตะฒะฐั‚ัŒ ั ั…ะพั‚ะตะป! ะ‘ะพัะธะบะพะผ ั…ะพะดะธ! ะ ะฟะพะผะตั‰ะตะฝะธะต ะฟะฐั‡ะบะฐั‚ัŒ ะฝะต ัะผะตะน! - ั€ั‹ั‡ะฐะป ะšั‘ั.\n- ะ—ะฐะฒั‚ั€ะฐ ั‚ะฐะบ ะธ ัะดะตะปะฐัŽ - ั„ั‹ั€ะบะฝัƒะป ั‚ะพั‚. - ะญั‚ะพ ะฒัั‘?\n- ะ‘ัƒะดะตัˆัŒ ะฝะตะดะตะปัŽ ะผั‹ั‚ัŒ ะฟะพะปั‹ ะฒ ัั‚ะพะผ ะบะพั€ะธะดะพั€ะต! - ะฝะฐั…ะผัƒั€ะธะปัั ะณะปะฐะฒะฐ ะดะธัั†ะธะฟะปะธะฝะฐั€ะฝะพะณะพ ะบะพะผะธั‚ะตั‚ะฐ. - ะ˜ ะฝะฐั‡ะฝั‘ัˆัŒ, ะฟั€ัะผะพ ัะตะนั‡ะฐั!\n- ะขั‡, ะฝะต ะฝะฐะผะตั€ะตะฝ. ะ”ะปั ัั‚ะพะณะพ ะตัั‚ัŒ ัƒะฑะพั€ั‰ะธั†ั‹, ะธ... - ะฑั€ะพัะธะป ะบะพั€ะพั‚ะบะธะน ะฒะทะณะปัะด ะฒ ัั‚ะพั€ะพะฝัƒ ะฃะพะปะบะตั€ะฐ ะธ ะขััƒะฝั‹. - ะ”ะตะถัƒั€ะฝั‹ะต.\n- ะงะตะณะพ-ะพ-ะพ?! - ะฒะพะทะผัƒั‚ะธะปัั ะฃะพะปะบะตั€. - ะ—ะฐ ะบะพั€ะธะดะพั€ ะผั‹ ะฝะต ะพั‚ะฒะตั‡ะฐะตะผ!\n- ะฅะผ, - ั…ะผั‹ะบะฝัƒะป ะšั‘ั, ะธ ะผะฐะปัŒั‡ะธะบ ั€ะตัˆะธะป ะฟะพะผะพะปั‡ะฐั‚ัŒ. 
- ะขั‹ ะทะฐะฟะฐั‡ะบะฐะป ั‚ั‹ ะธ ัƒะฑะธั€ะฐะน, ะฐ ะธะฝะฐั‡ะต... - ะณะปะฐะทะฐ ัะฒะตั€ะบะฝัƒะปะธ ะฝะต ะฟะพ-ะดะพะฑั€ะพะผัƒ. - ะšะฐะผะธะบะพั€ะพั!\n- ะะตั‚ ะถะตะปะฐะฝะธั ะดั€ะฐั‚ัŒัั, ะฝะพ ั€ะฐะท ั‚ั‹ ะฝะฐัั‚ะฐะธะฒะฐะตัˆัŒ! - ะšะฐะฝะดะฐ ะฟะพะดะฝัะปัั ั ะผะตัั‚ะฐ, ัะผะพั‚ั€ั ะฝะฐ ะฟะฐั€ะฝั ั ะฒั‹ะทะพะฒะพะผ. ะžะฝ ะฝะต ัะพะฑะธั€ะฐะปัั ะพั‚ะดะฐะฒะฐั‚ัŒ ัะฒะพะตะผัƒ ะณะปะฐะฒะฝะพะผัƒ ัะพะฟะตั€ะฝะธะบัƒ ะทะฒะฐะฝะธะต ะณั€ะพะทั‹ ัƒะฝะธะฒะตั€ัะธั‚ะตั‚ะฐ.\n- ะž, ัั‚ะพ ะฑัƒะดะตั‚ ะธะฝั‚ะตั€ะตัะฝะพ, - ะทะปะพั€ะฐะดะฝะพ ัƒั…ะผั‹ะปัŒะฝัƒะปัั. - ะ’ัะต ะฒะพะฝ! ะŸะพะบะฐ ะฝะต ะฟะตั€ะตะฑะธะป.\nะ’ะตััŒ ะบะปะฐัั, ั‡ั‚ะพ ะถะฐะปัั ะฟะพ ัั‚ะตะฝะพั‡ะบะฐะผ, ะผะพะผะตะฝั‚ะฐะปัŒะฝะพ ะฒั‹ัั‹ะฟะฐะป ะฒ ะบะพั€ะธะดะพั€. ะšั€ะพะผะต ะขััƒะฝั‹ ะธ ะะปะปะตะฝะฐ, ั‡ั‚ะพ ะทะฐะฒะพั€ะพะถะตะฝะพ ะฝะฐะฑะปัŽะดะฐะปะธ ะทะฐ ัะพะฑั‹ั‚ะธัะผะธ. ะกะฐะฒะฐะดะฐ ัะพ ัั‚ั€ะฐั…ัƒ ะฒั†ะตะฟะธะปัั ะฒ ั€ัƒะบัƒ ะฃะพะปะบะตั€ะฐ, ะฐ ัะฐะผ ะฟะฐั€ะตะฝัŒ ะพะฑะตัะฟะพะบะพะตะฝะฝะพ ัะผะพั‚ั€ะตะป ะฒ ัั‚ะพั€ะพะฝัƒ ะดะปะธะฝะฝะพะฒะพะปะพัะพะณะพ ัะฟะพะฝั†ะฐ. - ะฎัƒ... - ั‚ะธั…ะพ ะฟะพะทะฒะฐะป ะพะฝ.\n- ะŸั€ะฐะฒะธะปัŒะฝะพ, ัะฒะธะดะตั‚ะตะปะธ ะฝะธ ะบ ั‡ะตะผัƒ, - ั‚ะฐะบ ะถะต ัƒั…ะผั‹ะปัŒะฝัƒะปัั ะšะฐะฝะดะฐ, ั€ะฐะทะผะธะฝะฐั ั€ัƒะบะธ. - ะ’ั‹, ะดะฒะพะต, ั€ะฐะทะฒะต ะฝะต ััะฝะพ ะฑั‹ะปะพ ัะบะฐะทะฐะฝะพ? - ะณะปัะฝัƒะป ะพะฝ ะฒ ัั‚ะพั€ะพะฝัƒ ะฟะฐั€ะฝะตะน.\n- ะะปะปะตะฝ, ะผะพะถะตั‚... - ั‚ะธั…ะพ ะฟั€ะพัะบัƒะปะธะป ะขััƒะฝะฐ, ะฟั€ะตะบั€ะฐัะฝะพ ะทะฝะฐะฒัˆะธะน ะฝั€ะฐะฒ ะฅะธะฑะฐั€ะธ.\nะ‘ะตะปะพะฑั€ั‹ัั‹ะน, ั‡ั‚ะพ ะฒัั‘ ัั‚ะพ ะฒั€ะตะผั ะฟะตั€ะตะฒะพะดะธะป ะฒะทะณะปัะด ั ะšั‘ั ะฝะฐ ะšะฐะฝะดัƒ, ะฒะทะดะพั…ะฝัƒะป, ะพะฟัƒัั‚ะธะฒ ะณะปะฐะทะฐ, ะธ ะฟะพะดะดะฐะปัั ะฝะฐ ัƒะณะพะฒะพั€ั‹ ะกะฐะฒะฐะดั‹, ะฟะพะทะฒะพะปะธะฒ ัƒั‚ะฐั‰ะธั‚ัŒ ัะตะฑั ะฒ ะบะพั€ะธะดะพั€.\nะกั†ะตะฝะฐ 3. ะ”ัƒะฑะปัŒ 1.\n- ะฅะต... 
- ะฅะธะฑะฐั€ะธ ัั‚ั€ะฐะฝะฝะพ ั…ะผั‹ะบะฝัƒะป, ะบั€ะฐะตะผ ะณะปะฐะทะฐ ะฝะฐะฑะปัŽะดะฐั ะทะฐ ัƒัˆะตะดัˆะธะผะธ.\nะšะฐะฝะดะฐ ั‚ะฐะบ ะถะต ะผะพะปั‡ะฐ, ะฟั€ะพะฒะพะดะธะป ะฟะพะดั€ะพัั‚ะบะพะฒ ะฒะทะณะปัะดะพะผ ะธ ะฒะฝะพะฒัŒ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ัะฒะพะตะณะพ ะฟั€ะพั‚ะธะฒะฝะธะบะฐ. ะญั‚ะฐ ัƒั…ะผั‹ะปะบะฐ ะฝะธ ะพ ั‡ั‘ะผ ะดะพะฑั€ะพะผ ะฝะต ะณะพะฒะพั€ะธะปะฐ.\nะฅะธะฑะฐั€ะธ ะฝะตะพะถะธะดะฐะฝะฝะพ ัƒะดะฐั€ะธะป ะฟะฐั€ะฝั ะฒ ะถะธะฒะพั‚ ั‚ะฐะบ, ั‡ั‚ะพ ั‚ะพั‚ ะพั‚ะปะตั‚ะตะป ะบ ะพะบะฝัƒ.\n- ะขั‡... - ะšะฐะฝะดะฐ ัะพะณะฝัƒะปัั, ะฝะพ ะฑั‹ัั‚ั€ะพ ะฟั€ะธัˆั‘ะป ะฒ ัะตะฑั ะธ ะฟะพะดะฝัะปัั. ะŸะพัะปะตะดะพะฒะฐะป ะพั‚ะฒะตั‚ะฝั‹ะน ัƒะดะฐั€.\n- ะฅะผ... ัะปะฐะฑะฐะบ! - ะšั‘ั ะฑั‹ัั‚ั€ะพ ะฑะปะพะบะธั€ะพะฒะฐะป ัั‚ะพั‚ ัƒะดะฐั€ ะธ ะฟะพะดัะตั‡ะบะพะน ัะฑะธะป ะฟั€ะพั‚ะธะฒะฝะธะบะฐ ั ะฝะพะณ.\nะฎัƒ ะฝะต ั€ะฐัั‚ะตั€ัะปัั ะธ ัƒะดะฐั€ะธะป ะตะณะพ ะฟะพ ะฝะพะณะฐะผ, ั‚ะพะถะต ะทะฐะฒะฐะปะธะฒ ะฝะฐ ะฟะพะป ะธ ัะตะป ะฝะฐ ะฝะตะณะพ, ะบะฐะบ ะฝะฐ ัะบะฐะผะตะนะบัƒ. ะŸะพั‚ะพะผ ะฟะพะดะฝัะปัั ะธ ะทะฐะปะพะผะธะป ั‚ะพะผัƒ ั€ัƒะบะธ, ะฟั€ะธะณะธะฑะฐั ะบ ะฟะพะปัƒ.\n- ะ‘ะตัะธัˆัŒ!\nะฅะธะฑะฐั€ะธ ะฒั‹ะฒะตั€ะฝัƒะปัั ะธ ั ั€ะฐะทะฒะพั€ะพั‚ะฐ ัƒะดะฐั€ะธะป ะฟะพ ะปะธั†ัƒ.\n- ะขั€ะฐะฒะพัะดะฝั‹ะต ะดะพะปะถะฝั‹ ะผะพะปั‡ะฐั‚ัŒ ะธ ะฟะพะดั‡ะธะฝัั‚ัŒัั!\n- ะฏ ั‚ะฐะบะพะน ัะฒะพะปะพั‡ะธ ะฟะพะดั‡ะธะฝัั‚ัŒัั ะฝะต ัะพะฑะธั€ะฐัŽััŒ! - ัƒะดะฐั€ ะฒ ะฑะพะบ ะฟะพ ะฟะตั‡ะตะฝะธ.\nะšั‘ั ัƒะดะฐั€ะธะป ะฟะพ ะณะพะปะพะฒะต ั‚ะพะฝั„ะฐ.\n- ะ ั‚ะตะฑั ะฝะธะบั‚ะพ ะฝะต ัะฟั€ะฐัˆะธะฒะฐะตั‚! ะ”ะธัั†ะธะฟะปะธะฝะฐ ะฝะฐ ะฟะตั€ะฒะพะผ ะผะตัั‚ะต!\nะšะฐะฝะดะฐ ะทะฐะตั…ะฐะป ะฝะพะณะพะน ะฒ ะถะธะฒะพั‚. ะกะฐะฟะพะณะฐะผะธ ัั‚ะพ ะพั‡ะตะฝัŒ ะถะตัั‚ะพะบะพ.\n- ะŸะพะบะฐ ะผะตะฝั ะฝะธะบั‚ะพ ะฝะต ั‚ั€ะพะณะฐะตั‚, ั ัะฟะพะบะพะตะฝ!\n- ะŸะพะบะฐ ะฝะต ะฝะฐั€ัƒัˆะฐะตัˆัŒ ะฟั€ะฐะฒะธะปะฐ, ัะฟะพะบะพะตะฝ ั! - ะฟะฐั€ะตะฝัŒ ั ัะธะปะพะน ัƒะดะฐั€ะธะป ะฟะพ ัะพะปะฝะตั‡ะฝะพะผัƒ ัะฟะปะตั‚ะตะฝะธัŽ.\n- ะšั…... ัƒะฑะปัŽะดะพะบ, - ัะผะพั€ั‰ะธะปัั ะฎัƒ.\n- ะขะพะถะต ะผะฝะต - ะฝะฐะณะปั‹ะน! ะ”ัƒะผะฐะตัˆัŒ, ั…ัƒะน ะพั‚ั€ะฐัั‚ะธะป, ะธ ั‚ะตะฑะต ะฒัั‘ ะดะพะทะฒะพะปะตะฝะพ?! 
- ะฟั€ะพั€ั‹ั‡ะฐะป ะฅะธะฑะฐั€ะธ.\n- ะ“ะพะฒะพั€ะธ, ั‡ั‚ะพ ั…ะพั‡ะตัˆัŒ, ะฝะพ ะฟะพะปั‹ ะผั‹ั‚ัŒ ั ะฝะต ัะพะฑะธั€ะฐัŽััŒ, - ั‚ะตะผ ะถะต ั‚ะพะฝะพะผ ะพั‚ะฒะตั‚ะธะป ะฟั€ะพั‚ะธะฒะฝะธะบ.\n- ะะพ ั‚ะฐะบะธ ะฒั‹ะผะพะตัˆัŒ! - ัะฝะพะฒะฐ ัƒะดะฐั€ะธะป ะณะปะฐะฒะฐ ะดะธัั†ะธะฟะปะธะฝะฐั€ะฝะพะณะพ ะบะพะผะธั‚ะตั‚ะฐ.\n- ะ—ะฐะฒั‚ั€ะฐ ะฒะพะพะฑั‰ะต ะฝะต ัะฒะปัŽััŒ. ะ˜ ะฟะปะฐะบะฐะป ะฒะฐัˆ ะบัƒะฑะพะบ ะทะฐ ะฟะตั€ะฒะพะต ะผะตัั‚ะพ ะฟะพ ะฑะฐัะบะตั‚ะฑะพะปัƒ, - ะฒั‹ั‚ะตั€ะฟะตะป ะšะฐะฝะดะฐ.\n- ะขั‹ ะผะฝะต ั‚ัƒั‚ ะฝะต ัƒะณั€ะพะถะฐะน! ะะตะทะฐะผะตะฝะธะผั‹ั… ะปัŽะดะตะน ะฝะต ะฑั‹ะฒะฐะตั‚! ะ ั‚ะตะผ ะฑะพะปะตะต ั‚ะตะฑั ะทะฐะผะตะฝะธั‚ัŒ - ั€ะฐะท ะฟะปัŽะฝัƒั‚ัŒ!\n- ะžั…, ั‚ะพะณะดะฐ ัั‚ะพ ะถะต ะพั‚ะปะธั‡ะฝะพ! ะ—ะฐะฒั‚ั€ะฐ ั†ะตะปั‹ะน ะดะตะฝัŒ ะฟั€ะพะฒะฐะปััŽััŒ ะฒ ะบั€ะพะฒะฐั‚ะธ ะธ ะฝะต ัƒะฒะธะถัƒ ัั‚ะพะณะพ ะผะตะปะบะพะณะพ. ะ—ะฐะดะพะปะฑะฐะป ะฟัะปะธั‚ัŒัั.\n- ะœ? ะ ะฟั€ะธั‡ั‘ะผ ั‚ัƒั‚ ะะปะปะตะฝ?! - ะšั‘ั ะฒัะบะธะฝัƒะป ะฑั€ะพะฒัŒ.\n- ะŸั€ะธั‚ะพะผ, ั‡ั‚ะพ ะดะพัั‚ะฐะป, - ะฒะทะดะพั…ะฝัƒะป ะฎัƒ, - ัั‚ั€ะฐะฝะฝั‹ะน ะพะฝ ะบะฐะบะพะน-ั‚ะพ. ะ˜ ัะผะพั‚ั€ะธั‚ ะฝะฐ ะผะตะฝั ะบะฐะบ-ั‚ะพ ัั‚ั€ะฐะฝะฝะพ.\n- ะ ะฐะดะพะฒะฐะปัั ะฑั‹! ะ’ัะต ะพัั‚ะฐะปัŒะฝั‹ะต ะพั‚ ั‚ะตะฑั ัˆะฐั€ะฐั…ะฐัŽั‚ัั. ะก ั‚ะฐะบะธะผะธ ั‚ะตะผะฟะฐะผะธ ะธ ะดะพ ะพะฝะฐะฝะธะทะผะฐ ะฝะตะดะฐะปะตะบะพ, ะธะปะธ ั‚ั‹ ัƒะถะต? - ัƒัะผะตั…ะฝัƒะปัั ะฅะธะฑะฐั€ะธ.\n- ะŸั„, ะฝะตั‚... ะธ ั‡ั‚ะพ ะฒะพะพะฑั‰ะต ะทะฐ ะฒะพะฟั€ะพัั‹? ะฃะพะปะบะตั€ ะผะตะฝั ะฒ ะฟะพัะปะตะดะฝัŽัŽ ะพั‡ะตั€ะตะดัŒ ะธะฝั‚ะตั€ะตััƒะตั‚.\n- ะฏ ะฝะต ะพะฑ ัั‚ะพะน ะบะพะทัะฒะบะต ะณะพะฒะพั€ัŽ! ะ ะฟั€ะพ ั‚ะพ, ั‡ั‚ะพ ั ั‚ะฒะพะธะผ ั…ะฐั€ะฐะบั‚ะตั€ะพะผ ะฝะธ ะพะดะฝะฐ ะดะตะฒัƒัˆะบะฐ ะบ ั‚ะตะฑะต ะฝะต ะฟะพะดะพะนะดั‘ั‚!\n- ะฅะต, - ัƒัะผะตั…ะฝัƒะปัั ะšะฐะฝะดะฐ. - ะกะฟะพั€ะธะผ, ั ะปัŽะฑัƒัŽ ะทะฐ ะดะตะฝัŒ ัะผะพะณัƒ ะทะฐะบะฐะดั€ะธั‚ัŒ? ะ˜ ะทะฐะฝัั‚ัŒัั ัะตะบัะพะผ.\n- ะขั‹-ั‚ะพ? ะฅะฐ! ะ˜ ะทะฐ ะผะตััั† ะฝะต ัะฟั€ะฐะฒะธัˆัŒัั! - ะพัะบะฐะปะธะปัั ะšั‘ั.\n- ะขะฐะบ ะทะฝะฐั‡ะธั‚, ัะฟะพั€ะธะผ? - ะฟั€ะธะฟะพะดะฝัะปัั ะฎัƒ. 
- ะะพ ั‚ะพะณะดะฐ ะธ ั‚ั‹ ัƒั‡ะฐัั‚ะฒัƒะตัˆัŒ.\n- ะฅะต, ะดะฐัŽ ั‚ะตะฑะต ะฝะตะดะตะปัŽ! - ะฅะธะฑะฐั€ะธ ัƒะฑั€ะฐะป ั‚ะพะฝั„ะฐ ะธ ะฟั€ะพั‚ัะฝัƒะป ัะฒะพัŽ ั€ัƒะบัƒ.\n- ะ”ะพะณะพะฒะพั€ะธะปะธััŒ, - ะฟะพะถะฐะป ั€ัƒะบัƒ ั‚ะพั‚. - ะ˜ ะบั‚ะพ ัั‚ะฐะฝะตั‚ ั†ะตะปัŒัŽ?\n- ะฅะผ... ะฐ ั‚ะพั‚, ะบั‚ะพ ะฟะตั€ะฒั‹ะน ะฒะพะนะดั‘ั‚ ะฒ ัั‚ะพั‚ ะบะฐะฑะธะฝะตั‚! ะงั‚ะพะฑ ัƒะถ ั‡ะตัั‚ะฝะพ ะฑั‹ะปะพ. ะ’ ะฟะพะดั‚ะฒะตั€ะถะดะตะฝะธะต ะฟะพะฑะตะดั‹ ะฟั€ะธะฝะตััƒ ั‚ะตะฑะต ะฝะธะถะฝะตะต ะฑะตะปัŒั‘ ะถะตั€ั‚ะฒั‹! - ะณะปะฐะฒะฐ ะดะธัั†ะธะฟะปะธะฝะฐั€ะฝะพะณะพ ะบะพะผะธั‚ะตั‚ะฐ ะบั€ะตะฟั‡ะต ัะถะฐะป ั€ัƒะบัƒ ะธ, ั€ะฒะฐะฝัƒะฒ ะฝะฐ ัะตะฑั, ะฟะตั€ะตะบะธะฝัƒะป ะšะฐะฝะดัƒ ั‡ะตั€ะตะท ัะฟะธะฝัƒ ะฝะฐ ะฟะพะป. - ะะพ ัƒั‡ั‚ะธ, ะตัะปะธ ั‚ั‹ ะฟั€ะพะธะณั€ะฐะตัˆัŒ, ะฑัƒะดะตัˆัŒ ะดั€ะฐะธั‚ัŒ ัƒะฝะธะฒะตั€ัะธั‚ะตั‚ ะฒะตััŒ ะณะพะด!\n- ะขั‡... ะปะฐะดะฝะพ - ะฎัƒ ะฟะพะดะฝัะปัั, ะดะตั€ะถะฐััŒ ะทะฐ ัะฟะธะฝัƒ. - ะฏ ั‚ะตะฑะต ัั‚ะพ ะฝะต ะฟั€ะพั‰ัƒ.\nะขัƒั‚ ะฒ ะดะฒะตั€ัŒ ั‚ะธั…ะพะฝัŒะบะพ ะฟะพัั‚ัƒั‡ะฐะปะธััŒ.\n- ะ ะตัะปะธ ะฒั‹ะธะณั€ะฐะตัˆัŒ ั‚ั‹, ั ะฝะฐ ะณะพะด ะพั‚ ั‚ะตะฑั ะพั‚ัั‚ะฐะฝัƒ! - ั…ะผั‹ะบะฝัƒะป ะšั‘ั ะธ ะฟะพะฒะตั€ะฝัƒะปัั ะบ ะดะฒะตั€ะธ.\n- ะฅะธะฑะฐั€ะธ-ัะฐะฝ! ะฏ, ะบะพะฝะตั‡ะฝะพ, ะฟะพะฝะธะผะฐัŽ, ั‡ั‚ะพ ะดะธัั†ะธะฟะปะธะฝะฐ - ัั‚ะพ ัะฒัั‚ะพะต, ะธ ะฟะพะดะดะตั€ะถะธะฒะฐัŽ ะฒะฐัˆะต ั€ะตัˆะตะฝะธะต ะฝะฐะดั€ะฐั‚ัŒ ัั‚ะพะผัƒ ะฟั€ะธะดัƒั€ะบัƒ ะทะฐะด! ะะพ ัƒ ะฝะฐั ั‚ัƒั‚ ัƒั€ะพะบ, ะฐ ะผะฝะต ั€ะตั„ะตั€ะฐั‚ ัะดะฐะฒะฐั‚ัŒ! - ะทะฐัˆั‘ะป ะฑะตะทัƒะฟั€ะตั‡ะฝั‹ะน ะะปะปะตะฝ ะฃะพะปะบะตั€, ะฒ ะบะพั‚ะพั€ะพะณะพ ะฝะฐะผะตั€ั‚ะฒะพ ะฒั†ะตะฟะธะปัั ะกะฐะฒะฐะดะฐ ะฟั‹ั‚ะฐัััŒ ะพัั‚ะฐะฝะพะฒะธั‚ัŒ.\nะกั†ะตะฝะฐ 4. ะ”ัƒะฑะปัŒ 1.\nะšะฐะฝะดะฐ ะฟะพัะผะพั‚ั€ะตะป ะฝะฐ ะผะฐะปัŒั‡ะธัˆะบัƒ ะธ ะธะทะดะฐะป ั‚ะธั…ะธะน ะทะฒัƒะบ, ะฟะพั…ะพะถะธะน ะฝะฐ ะบะพัˆะฐั‡ัŒะต ัˆะธะฟะตะฝะธะต. 
ะ’ะธะดะธะผะพ ะพะฝ ะฝะต ะพะถะธะดะฐะป, ั‡ั‚ะพ ะฟะตั€ะฒั‹ะผะธ ะฒ ะบะปะฐัั ะทะฐะนะดัƒั‚ ะธะผะตะฝะฝะพ ัั‚ะธ ะดะฒะพะต.\n- ะŸั€ะพั…ะพะดะธ, ะทะฐัˆั‘ะป ัƒะถะต, - ั…ะผั‹ะบะฝัƒะป ะฎัƒ ะธ, ะพั‚ะฒะตัะธะฒ ะฅะธะฑะฐั€ะธ ะฟะพะดะทะฐั‚ั‹ะปัŒะฝะธะบ, ะฟะพัะฟะตัˆะธะป ะฒะตั€ะฝัƒั‚ัŒัั ะฝะฐ ัะฒะพั‘ ะผะตัั‚ะพ.\n- ะž, ั‚ั‹ ะตั‰ั‘ ะถะธะฒะพะน?! ะŸะตั‡ะฐะปัŒะฝะพ... - ะฟะพะบะฐั‡ะฐะป ะณะพะปะพะฒะพะน ะะปะปะตะฝ. - ะ ะตะฑัั‚ะฐ ะทะฐั…ะพะดะธั‚ะต, ะฅะธะฑะฐั€ะธ ัƒัˆั‘ะป! - ั‚ัƒั‚ ะถะต ะฒ ะดะฒะตั€ัŒ ะฟะพะฒะฐะปะธะปะธ ะพัั‚ะฐะปัŒะฝั‹ะต. ะ˜ ะฟะพัะปะตะดะฝะธะผ ะทะฐัˆั‘ะป ะฟั€ะตะฟะพะดะฐะฒะฐั‚ะตะปัŒ. ะ‘ะตะปะพะฒะพะปะพัั‹ะน ะดะพัั‚ะฐะป ะธะท ััƒะผะบะธ ั€ะธััƒะฝะบะธ ะธ ั‡ะตั€ั‚ะตะถะธ, ะฟะพัะปะต ั‡ะตะณะพ ั€ะฐะทะฒะตัะธะป, ะฒะทัะป ัƒะบะฐะทะบัƒ ะธ ะฝะฐั‡ะฐะป ั€ะฐััะบะฐะทั‹ะฒะฐั‚ัŒ ั€ะตั„ะตั€ะฐั‚ ะฟะพ ัะบะพะปะพะณะธะธ.\nะ’ะพะพะฑั‰ะต, ะพะฝ ะฝะต ะฑั‹ะป ะพั‚ะปะธั‡ะฝะธะบะพะผ, ะฝะพ ะฑะพะปัŒัˆะธะผ ั‚ั€ัƒะดัะณะพะน!\nะ•ัะปะธ ั€ะฐะฝัŒัˆะต ะฎัƒ ะผะตั‡ั‚ะฐะป ะพั‚ัะธะดะตั‚ัŒ ะฟะพัะปะตะดะฝะธะต ัƒั€ะพะบะธ ะธ ัะฒะฐะปะธั‚ัŒ ะดะพะผะพะน, ั‚ะพ ั‚ะตะฟะตั€ัŒ ะตะณะพ ะถะตะปะฐะฝะธะตะผ ะฑั‹ะปะพ, ั‡ั‚ะพะฑั‹ ัƒั€ะพะบะธ ะฝะธะบะพะณะดะฐ ะฝะต ะทะฐะบะฐะฝั‡ะธะฒะฐะปะธััŒ.\n"ะขั‡, ะจะฟะตะฝะดะตะปัŒ. ะŸะพั‡ะตะผัƒ, ะฟะพั‡ะตะผัƒ ั‚ั‹ ั‚ะฐะบ ะฝะต ะฒะพะฒั€ะตะผั ัะฒะฐะปะธะปัั ะผะฝะต ะฝะฐ ะณะพะปะพะฒัƒ?!" - ะดัƒะผะฐะป ะพะฝ, ะดะตะปะฐั ะฒะธะด, ั‡ั‚ะพ ัะปัƒัˆะฐะตั‚.\n- ... ะ˜ ะฒะพั‚ ะฟะพัั‚ะพะผัƒ ะดะปั ัะฟะฐัะตะฝะธั ะบะธั‚ะพะฒ ั‚ะฐะบ ะฒะฐะถะฝะพ ะฟั€ะตะบั€ะฐั‚ะธั‚ัŒ ัั‚ั€ะตะปัŒะฑัƒ ะธ ะฟะตั€ะตะฒะพะท ะฝะตั„ั‚ะธ ั‡ะตั€ะตะท ะพะบะตะฐะฝ! ะฃ ะผะตะฝั ะฒัั‘! - ะทะฐะบะพะฝั‡ะธะป ั€ะฐััะบะฐะท.\n- ะัƒ ั‡ั‚ะพ ะถ, ะดัƒะผะฐัŽ, ะฝะฐ 4-ะบัƒ ะฒะฟะพะปะฝะต ั…ะฒะฐั‚ะธั‚.\n- ะงั‚ะพ?! ะะพ ัƒั‡ะธั‚ะตะปัŒ, ัƒ ะฝะตะณะพ ะฟะพั‚ั€ััะฐัŽั‰ะธะน ะดะพะบะปะฐะด! - ะทะฐั‰ะตะฑะตั‚ะฐะป ะพะดะฝะพะณั€ัƒะฟะฟะฝะธะบ.\n- ะžะฝ ะผะฝะพะณะพ ะณะพั‚ะพะฒะธะปัั, ะฒะพะปะฝะพะฒะฐะปัั, ะฟะพั‡ะตะผัƒ ั‡ะตั‚ั‹ั€ะต?! 
- ะทะฐัั‚ัƒะฟะธะปัั ะทะฐ ะะปะปะตะฝะฐ ะขััƒะฝะฐ.\n- ะ”ะฐ ะฟั€ะตะบั€ะฐัะฝั‹ะน ะดะพะบะปะฐะด, ะตัะปะธ ั‡ะตัั‚ะฝะพ, ะฝะต ะพะถะธะดะฐะป, - ะฒั‹ัะบะฐะทะฐะปัั ั‡ะตะปะพะฒะตะบ, ะบะพั‚ะพั€ะพะณะพ ะผะตะฝัŒัˆะต ะฒัะตะณะพ ัั‚ะพ ะผะพะณะปะพ ะฒะพะปะฝะพะฒะฐั‚ัŒ. ะšะฐะฝะดะฐ ัะผะพั‚ั€ะตะป ะฝะฐ ะฟั€ะตะฟะพะดะฐะฒะฐั‚ะตะปั.\n- ะญ-ะญ-ะญ?! - ะพัˆะฐะปะตะป ะฒะตััŒ ะบะปะฐัั.\n- ะ... ั-ัั‚-ั‚ะพ... - ะทะฐะปะธะปัั ั€ัƒะผัะฝั†ะตะผ ะฃะพะปะบะตั€.\n- ะัƒ ะปะฐะดะฝะพ, 5!\nะฎัƒ, ะฟะพะฑะตะดะฝะพ ัƒั…ะผั‹ะปัŒะฝัƒะฒัˆะธััŒ, ะฟะตั€ะตะฒะตะป ะฒะทะณะปัะด ะฝะฐ ะะปะปะตะฝะฐ. ะขะพั‚ ะฟะพั‚ัƒะฟะธะปัั ะธ ัƒัั‚ะฐะฒะธะปัั ะฒ ะฟะพะป.\n"ะฅะผ, ะฒะพะทะผะพะถะฝะพ ัั‚ะพ ะฑัƒะดะตั‚ ะฝะต ั‚ะฐะบ ัƒะถ ะธ ัƒะถะฐัะฝะพ", - ะฟะพั‡ะตะผัƒ-ั‚ะพ ั‚ะพะปัŒะบะพ ัะตะนั‡ะฐั ะšะฐะฝ', '- ะ”ะพะฑั€ะพะต ัƒั‚ั€ะพ, - ัˆะตะฟะพั‚ ั‰ะตะบะพั‡ะตั‚ ะผะฝะต ัƒั…ะพ. ะกะพะฒัะตะผ ะฝะต ั…ะพั‡ะตั‚ัั ั€ะฐะทะปะตะฟะปัั‚ัŒ ะณะปะฐะทะฐ ะธ ะฒัั‚ั€ะตั‡ะฐั‚ัŒ ะฝะพะฒั‹ะน ะดะตะฝัŒ. ะŸะพะฒะพั€ะฐั‡ะธะฒะฐัŽััŒ, ะฟั€ะธั‚ัะณะธะฒะฐั ั‚ะตะฑั ะฑะปะธะถะต, ะธ ัƒั‚ั‹ะบะฐัŽััŒ ะฝะพัะพะผ ั‚ะตะฑะต ะฒ ะณั€ัƒะดัŒ, ะพั‰ัƒั‰ะฐั ะทะฐะฟะฐั… ัะปะฐะดะพัั‚ะตะน, ะบะพั‚ะพั€ั‹ะต ะฝั€ะฐะฒัั‚ัั ะฝะฐะผ ะพะฑะพะธะผ. ะฏ ะตะถัƒััŒ ะพั‚ ั…ะพะปะพะดะฐ, ะฟั‹ั‚ะฐัััŒ ะฒัะปะตะฟัƒัŽ ะฝะฐะนั‚ะธ ะบั€ะฐั ัƒัŽั‚ะฝะพะณะพ ะพะดะตัะปะฐ ะธ ัะฝะพะฒะฐ ะพะบัƒะฝัƒั‚ัŒัั ะฒ ัะพะฝ. ะขั‹ ะทะฐะผะตั‡ะฐะตัˆัŒ ัั‚ะพ ะธ ะทะฐะฑะพั‚ะปะธะฒะพ ัƒะบั€ั‹ะฒะฐะตัˆัŒ ะผะตะฝั. ะขะฒะพะธ ะฟะฐะปัŒั†ั‹ ะฟะตั€ะตะฑะธั€ะฐัŽั‚ ะผะพะธ ะฒะพะปะพัั‹, ะฐ ะณัƒะฑั‹ ะปะตะณะบะพ ะบะฐัะฐัŽั‚ัั ะผะพะตะณะพ ะปะฑะฐ. ะœั‹ ั‚ะฐะบ ะธ ะทะฐัั‚ั‹ะฒะฐะตะผ ะฒ ัั‚ะพะน ะฟะพะทะต ะฝะฐ ะฝะตะบะพั‚ะพั€ะพะต ะฒั€ะตะผั.\nะŸั€ะพั…ะพะดะธั‚ ะฒัะตะณะพ ะฝะตัะบะพะปัŒะบะพ ะผะธะฝัƒั‚, ะฐ ะฟะพั‚ะพะผ ั ั€ะตะทะบะพ ัะฐะถัƒััŒ ะฝะฐ ะบั€ะพะฒะฐั‚ะธ ะธ ะฝะฐั‡ะธะฝะฐัŽ ะฒะพั€ั‡ะฐั‚ัŒ, ั‡ั‚ะพ ัƒะถะต ะดะฐะฒะฝะพ ะฟะพั€ะฐ ะฒัั‚ะฐะฒะฐั‚ัŒ, ะฒะตะดัŒ ัะตะณะพะดะฝั ะฟั€ะตะดัั‚ะพะธั‚ ะฟะพะตะทะดะบะฐ ะฝะฐ ะฟั€ะธั€ะพะดัƒ ะฒะผะตัั‚ะต ั ะดั€ัƒะทัŒัะผะธ. 
ะฃ ั‚ะตะฑั ะฝะฐ ะปะธั†ะต ะฟะพัะฒะปัะตั‚ัั ัƒะปั‹ะฑะบะฐ, ะฐ ั€ัƒะบะธ ั‚ัะฝัƒั‚ ะพะฑั€ะฐั‚ะฝะพ, ะทะฐัั‚ะฐะฒะปัั ะฒะฝะพะฒัŒ ะพั‚ะบะธะฝัƒั‚ัŒัั ะฝะฐ ะฟะพะดัƒัˆะบะธ. ะะฐ ัƒะปะธั†ะต ะปัŒะตั‚ ะดะพะถะดัŒ, ะฑะฐั€ะฐะฑะฐะฝั ะฒ ะพะบะฝะฐ, ะฐ ั‡ั‚ะพ ะตั‰ะต ะดะตะปะฐั‚ัŒ ะฒ ั‚ะฐะบะพะน ะดะตะฝัŒ, ะตัะปะธ ะฝะต ะฝะตะถะธั‚ัŒัั ะฒ ัƒัŽั‚ะฝะพะน ะฟะพัั‚ะตะปะธ ะฒ ะพะฑัŠัั‚ะธัั… ะปัŽะฑะธะผะพะณะพ?\nะกะบะพะปัŒะบะพ ะฒั€ะตะผะตะฝะธ ะผั‹ ะฑั‹ะปะธ ะทะฝะฐะบะพะผั‹, ะฟั€ะตะถะดะต, ั‡ะตะผ ัƒะทะฝะฐะปะธ ะพ ั‡ัƒะฒัั‚ะฒะฐั… ะดั€ัƒะณ ะดั€ัƒะณะฐ? ะ”ะฐ, ั ะฝะต ะฟะพะผะฝัŽ ัั‚ะพะณะพ, ะฝะพ ะบั‚ะพ ัั‡ะธั‚ะฐะตั‚? ะ“ะปะฐะฒะฝะพะต, ั‡ั‚ะพ ะฒ ะผะพะตะน ะฟะฐะผัั‚ะธ ะดะพ ัะธั… ะฟะพั€ ะฑะตั€ะตะถะฝะพ ั…ั€ะฐะฝะธั‚ัั ะผะพะผะตะฝั‚, ะบะพะณะดะฐ ั‚ั‹ ะฝะฐะบะพะฝะตั† ัƒัะปั‹ัˆะฐะป ั‚ะต ะฒะฐะถะฝั‹ะต ัะปะพะฒะฐ. ะŸะตั€ะตะด ะณะปะฐะทะฐะผะธ ะฒัะฟะปั‹ะฒะฐัŽั‚ ัั‡ะฐัั‚ะปะธะฒั‹ะต ะผะณะฝะพะฒะตะฝะธั, ัะปะพะฒะฝะพ ะบะฐะดั€ั‹, ะทะฐะฟะตั‡ะฐั‚ะปะตะฒัˆะธะต ะฒัั‘ ะฒ ะผะตะปัŒั‡ะฐะนัˆะธั… ะดะตั‚ะฐะปัั…. ะญั‚ะพ ะฟั€ะพะธะทะพัˆะปะพ ะฒ ะผะพั€ะพะทะฝั‹ะน ัะฝะฒะฐั€ัะบะธะน ะดะตะฝัŒ. ะ’ะตัั‘ะปะฐั ะบะพะผะฟะฐะฝะธั ะผะพะปะพะดั‹ั… ะปัŽะดะตะน ะฝะต ะผะพะณะปะฐ ะฟั€ะพัั‚ะพ ัะธะดะตั‚ัŒ ะดะพะผะฐ ะฒะทะฐะฟะตั€ั‚ะธ ะธ ัƒะฟัƒัั‚ะธั‚ัŒ ั‚ะฐะบะพะน ั…ะพั€ะพัˆะธะน ัะปัƒั‡ะฐะน ะดะปั ะฟั€ะพะณัƒะปะบะธ ะฟะพ ะทะฐัะฝะตะถะตะฝะฝะพะผัƒ ะปะตััƒ ะธ ะฟั€ะพั‡ะธั… ะทะธะผะฝะธั… ะทะฐะฑะฐะฒ.\nะขั‹ ั‚ะพะณะดะฐ ะพะบะฐะทะฐะปัั ะฒะฝะต ะฝะฐัˆะตะณะพ ะฟะพะปั ะทั€ะตะฝะธั, ะฐ ั‚ะตะผะฝะพั‚ะฐ ัƒะถะต ะฝะฐั‡ะฐะปะฐ ะพะฟัƒัะบะฐั‚ัŒัั ะฝะฐ ะทะตะผะปัŽ. ะšะพะฝะตั‡ะฝะพ, ะผะฝะต ะฝะธั‡ะตะณะพ ะฝะต ะพัั‚ะฐะฒะฐะปะพััŒ, ะบั€ะพะผะต ะบะฐะบ ะพั‚ะฟั€ะฐะฒะธั‚ัŒัั ะฝะฐ ะฟะพะธัะบะธ. ะะฐ ะผะพะตะผ ะปะธั†ะต ะทะฐัั‚ั‹ะปะพ ัƒะดะธะฒะปะตะฝะธะต, ะบะพะณะดะฐ ั ะทะฐัั‚ะฐะป ั‚ะตะฑั ะทะฐ ัั‚ั€ะฐะฝะฝั‹ะผ ะทะฐะฝัั‚ะธะตะผ: ะฑั‹ะปะพ ะทะฐะฑะฐะฒะฝะพ ะฝะฐะฑะปัŽะดะฐั‚ัŒ ะทะฐ ั‚ะพะฑะพะน, ะฒั‹ะฒะพะดัั‰ะธะผ ะฐะบะฒะฐั€ะตะปัŒัŽ ะธ ะฑะฐะปะปะพะฝั‡ะธะบะฐะผะธ ั ะบั€ะฐัะบะพะน ะฝะตะบะธะต ัƒะทะพั€ั‹ ะฟั€ัะผะพ ะฝะฐ ัะฝะตะณัƒ. 
ะขะฒะพะธ ะฝะตะพะฑั‹ั‡ะฝะพัั‚ัŒ ะธ ะฝะตะฟั€ะตะดัะบะฐะทัƒะตะผะพัั‚ัŒ ะฟั€ะธั‚ัะณะธะฒะฐะปะธ ะบ ัะตะฑะต ะผะพัŽ ะฝะฐั‚ัƒั€ัƒ.\n- ะขั‹ ะผะฝะต ะฝั€ะฐะฒะธัˆัŒัั. ะžั‡ะตะฝัŒ, - ะบะฐะถะตั‚ัั, ะฑัƒะดั‚ะพ ะฒัั‘ ะทะฐะผะตั€ะปะพ, ะธ ะฒ ะทะฒะตะฝัั‰ะตะน ั‚ะธัˆะธะฝะต ะฟั€ะพะทะฒัƒั‡ะฐะปะธ ะฟั€ะพัั‚ั‹ะต ัะปะพะฒะฐ, ะบะพั‚ะพั€ั‹ะต ั‚ัะถะตะปะพ ะฟั€ะพะธะทะฝะตัั‚ะธ. ะงั‚ะพ ะผะพะณะปะพ ั‚ะพะปะบะฝัƒั‚ัŒ ะผะตะฝั ะฟั€ะพัั‚ะพ ะฒะทัั‚ัŒ ะธ ัะบะฐะทะฐั‚ัŒ ะธั…? ะžะดะฝะฐะบะพ ะพั‚ะฒะตั‚ ะฝะฐ ัั‚ะพั‚ ะฒะพะฟั€ะพั ัƒะถะต ะฝะต ะฒะฐะถะตะฝ, ั‚ะตะฟะตั€ัŒ ะพะฝ ะพัั‚ะฐะฒะธะป ะผะตัั‚ะพ ะดะปั ะฑะตัะฟะพะบะพะนัั‚ะฒะฐ. ะขะฒะพะธ ัะผะพั†ะธะธ ัะปะพะถะฝะพ ะฟั€ะพั‡ะธั‚ะฐั‚ัŒ. ะขะฐะบ ะฑั‹ะปะพ ะฒัะตะณะดะฐ. ะœะพะปั‡ะฐะฝะธะต ะฝะฐะณะฝะตั‚ะฐะตั‚ ะฝะฐะฟั€ัะถะตะฝะธะต ะผะตะถะดัƒ ะฝะฐะผะธ.\nะŸั€ะธะบะพัะฝะพะฒะตะฝะธะต ะปะตะดัะฝั‹ั… ะฟะฐะปัŒั†ะตะฒ ะบ ะผะพะตะน ั‰ะตะบะต ะฒั‹ะฒะพะดะธั‚ ะธะท ะพั†ะตะฟะตะฝะตะฝะธั, ัะบะพะฒะฐะฒัˆะตะณะพ ั‚ะตะปะพ. ะฏ ะตะปะต-ะตะปะต ั€ะฐะทะปะธั‡ะฐัŽ, ั‡ั‚ะพ ั‚ั‹ ัะตะนั‡ะฐั ะณะพะฒะพั€ะธัˆัŒ, ะฝะพ ะฝะตะบะพั‚ะพั€ั‹ะต ะพะฑั€ั‹ะฒะบะธ ั„ั€ะฐะท ะฒัั‘ ะถะต ะฟั€ะธะพะฑั€ะตั‚ะฐัŽั‚ ัะผั‹ัะป. ะะธะบะพะณะดะฐ ะฝะต ะฒะตั€ะธะป ะฒ ั‡ัƒะดะตัะฐ, ะดะฐ ะฒะพั‚ ั‚ะพะปัŒะบะพ ัะตะนั‡ะฐั ะฟะพะฝะธะผะฐัŽ: ะพะฝะธ ัะปัƒั‡ะฐัŽั‚ัั. ะœะฐะปะตะฝัŒะบะพะต ั‡ัƒะดะพ - ัƒะทะฝะฐั‚ัŒ ะพะฑ ะพั‚ะฒะตั‚ะฝั‹ั… ั‡ัƒะฒัั‚ะฒะฐั… ั‚ะพะณะพ, ะบั‚ะพ ั‚ะฐะบ ะผะฝะพะณะพ ะทะฝะฐั‡ะธั‚ ะดะปั ั‚ะตะฑั.\nะœั‹ ะธะดะตะผ ั ั‚ะพะฑะพะน ะฟะพ ะทะฐะผะตั‚ะตะฝะฝั‹ะผ ัะฝะตะณะพะผ ัƒะปะธั†ะฐะผ. ะ’ัŒัŽะณะฐ, ะทะฐะฒั‹ะฒะฐั, ะดัƒะตั‚ ะฒ ะปะธั†ะพ, ัะฑะธะฒะฐั ะฟั€ะพั…ะพะถะธั… ั ะฟัƒั‚ะธ, ะฐ ัƒ ะผะตะฝั ะฝะฐ ะดัƒัˆะต - ัะฟะพะบะพะนัั‚ะฒะธะต ะธ ัƒะผะธั€ะพั‚ะฒะพั€ะตะฝะธะต... ะšะพะณะดะฐ ั‚ั‹ ั€ัะดะพะผ, ะฟั€ะพะธัั…ะพะดัั‰ะตะต ะฒะพะบั€ัƒะณ ะฝะต ะธะผะตะตั‚ ะทะฝะฐั‡ะตะฝะธั, ะธ ะฝะตั‚ ะดะตะปะฐ ะดะพ ะฒัะตั… ะพัั‚ะฐะปัŒะฝั‹ั….\nะœะฝะต ัะปั‹ัˆะฝะพ, ะบะฐะบ ั‚ะฒะพะธ ะทัƒะฑั‹ ัั‚ัƒั‡ะฐั‚ ะพั‚ ั…ะพะปะพะดะฐ. ะกะถะฐะฒัˆะธััŒ, ั‚ั‹ ะฟั€ัั‡ะตัˆัŒ ะฝะพั ะฒ ะฒั‹ัะพะบะธะน ะฒะพั€ะพั‚ ะบัƒั€ั‚ะบะธ. 
ะฏ ัƒะฒะตั€ะตะฝ, ั‡ั‚ะพ ั‚ะฒะพะธ ั€ัƒะบะธ ะฒ ะบะฐั€ะผะฐะฝะฐั… ะดะฐะฒะฝะพ ะฝะต ะผะพะณัƒั‚ ะพั‚ะพะณั€ะตั‚ัŒัั ะธ ะฟั€ะธะฝัั‚ัŒ ะฝะพั€ะผะฐะปัŒะฝัƒัŽ ั‚ะตะผะฟะตั€ะฐั‚ัƒั€ัƒ.\n- ะ—ะฐะผะตั€ะท? - ัะฟั€ะฐัˆะธะฒะฐัŽ, ะทะฐะณะปัะดั‹ะฒะฐั ะฒ ะบะฐั€ะธะต ะณะปะฐะทะฐ, ะพะฑั€ะฐะผะปะตะฝะฝั‹ะต ั‡ะตั€ะฝั‹ะผะธ ั€ะตัะฝะธั†ะฐะผะธ, ะฝะฐ ะบะพั‚ะพั€ั‹ะต ั‚ะธั…ะพ ะฟะฐะดะฐัŽั‚ ัะฝะตะถะธะฝะบะธ, ะธ, ะฝะต ะดะพะถะธะดะฐัััŒ ะพั‚ะฒะตั‚ะฐ, ั‚ัะฝัƒ ั‚ะตะฑั ะฒ ะฑะปะธะถะฐะนัˆะตะต ะบะฐั„ะต.\n- ะŸะพะนะดะตะผ ะดะพะผะพะน, ะฐ ั‚ะพ ะฒะพัะฟะฐะปะตะฝะธะต ะปะตะณะบะธั… ะฟะพะดั…ะฒะฐั‚ะธัˆัŒ, - ัั‚ั€ะพะณะพ ะทะฐะผะตั‡ะฐะตัˆัŒ ั‚ั‹, ัƒะถะต ะฝะฐะฟั€ะฐะฒะปััััŒ ะฒ ัั‚ะพั€ะพะฝัƒ ะฝะฐัˆะตะณะพ ะฟะพะดัŠะตะทะดะฐ.\n- ะŸะพัั‚ะพะน, ั€ะฐะทะฒะต ะฝะต ะฒะธะดะธัˆัŒ, ะบะฐะบะฐั ั‡ัƒะดะตัะฝะฐั ะฟะพะณะพะดะฐ? - ะทะฝะฐะตัˆัŒ ะฒะตะดัŒ, ั‡ั‚ะพ ะผะฝะต ะฝั€ะฐะฒะธั‚ัั ะณัƒะปัั‚ัŒ ะฟะพะด ะดะพะถะดะตะผ, ะฟะพะดัั‚ะฐะฒะปัั ะปะธั†ะพ ะฟะฐะดะฐัŽั‰ะธะผ ั…ะพะปะพะดะฝั‹ะผ ะบะฐะฟะปัะผ.\nะขะตะฑะต ะฒ ะณะพะปะพะฒัƒ ะฑั‹ัั‚ั€ะพ ะฟั€ะธั…ะพะดะธั‚ ะผั‹ัะปัŒ, ะบะฐะบ ะทะฐัั‚ะฐะฒะธั‚ัŒ ะผะตะฝั ัƒะนั‚ะธ ะฒ ะฑะพะปะตะต ััƒั…ะพะต ะธ ั‚ะตะฟะปะพะต ะผะตัั‚ะพ. ะ”ะพะปะณะพ ะฝะต ั€ะฐะทะดัƒะผั‹ะฒะฐั, ั€ั‹ะฒะบะพะผ ะฟั€ะธั‚ัะณะธะฒะฐะตัˆัŒ ะบ ัะตะฑะต, ะฟั€ะธะถะธะผะฐัััŒ ะบ ะผะพะธะผ ะณัƒะฑะฐะผ. ะžั‚ ะฝะตะพะถะธะดะฐะฝะฝะพัั‚ะธ ั ะฟั€ะธะพั‚ะบั€ั‹ะฒะฐัŽ ะธั…, ะฐ ั€ัƒะบะฐะผะธ ะฝะฐั‡ะธะฝะฐัŽ ะณะปะฐะดะธั‚ัŒ ั‚ะฒะพัŽ ัะฟะธะฝัƒ, ะบ ะบะพั‚ะพั€ะพะน ะฟั€ะธะปะธะฟะปะฐ ะธะทั€ัะดะฝะพ ะฟั€ะพะผะพะบัˆะฐั ั€ัƒะฑะฐัˆะบะฐ. ะะต ัะฟะตัˆะฐ, ั‚ั‹ ัƒะณะปัƒะฑะปัะตัˆัŒ ะฟะพั†ะตะปัƒะน, ะตั‰ะต ะฑะพะปัŒัˆะต ั€ะฐะทะทะฐะดะพั€ะธะฒะฐั. ะ˜ะผะตะฝะฝะพ ั‚ะฐะบ ะธ ะฟั€ะตะดะฟะพะปะฐะณะฐะปะพััŒ, ะฟั€ะฐะฒะดะฐ?\nะšะพะต-ะบะฐะบ ัะฟั€ะฐะฒะธะฒัˆะธััŒ ั ะทะฐะผะบะพะผ, ะผั‹ ะฒะฒะฐะปะธะฒะฐะตะผัั ะฒ ะฟะพะปัƒั‚ะตะผะฝัƒัŽ ะบะฒะฐั€ั‚ะธั€ัƒ, ะตะดะฒะฐ ััƒะผะตะฒ ัƒัั‚ะพัั‚ัŒ ะฝะฐ ะฝะพะณะฐั…. ะŸะตั€ะตะด ะณะปะฐะทะฐะผะธ ะดะพ ัะธั… ะฟะพั€ ัั‚ะพะธั‚ ะฟะตะปะตะฝะฐ ะดะพะถะดั. 
ะขั‹ ัั€ะฐะทัƒ ะถะต ั€ะตะทะบะพ ะฟั€ะธะถะธะผะฐะตัˆัŒ ะผะตะฝั ะบ ัั‚ะตะฝะต, ะธ ั‚ะฒะพะน ัะทั‹ะบ ะฒั€ั‹ะฒะฐะตั‚ัั ะฒ ะผะพะน ั€ะพั‚ ะฒ ะฝะตะธัั‚ะพะฒะพะผ ะฟะพั†ะตะปัƒะต, ะฑะตัะฟะพั€ัะดะพั‡ะฝะพ ะดะฒะธะณะฐะตั‚ัั ะฒะดะพะปัŒ ะทัƒะฑะพะฒ ะธ ะฒะพะทะฒั€ะฐั‰ะฐะตั‚ัั ะบ ะผะพะตะผัƒ ัะทั‹ะบัƒ. ะฏ ะฝะต ัั‚ั€ะตะผะปัŽััŒ ะฑั€ะฐั‚ัŒ ะธะฝะธั†ะธะฐั‚ะธะฒัƒ ะฝะฐ ัะตะฑั, ะผะฝะต ะฒัะตะณะดะฐ ะฝั€ะฐะฒะธะปะพััŒ ะฟะปะฐะฒะธั‚ัŒัั ะฟะพะด ะฝะฐั‚ะธัะบะพะผ ั‚ะฒะพะธั… ะปะฐัะบ ะธ ะพะถะธะดะฐั‚ัŒ, ั‡ั‚ะพ ะถะต ั‚ั‹ ะฟั€ะตะดะฟั€ะธะผะตัˆัŒ ะดะฐะปัŒัˆะต. ะฃ ั‚ะตะฑั ะฟะพั‡ั‚ะธ ะฒัะตะณะดะฐ ะปะตะดัะฝั‹ะต ะฟะฐะปัŒั†ั‹, ะธ ัƒ ะผะตะฝั ะผัƒั€ะฐัˆะบะธ ะฑะตะณัƒั‚ ะฟะพ ะบะพะถะต ะพั‚ ะฟั€ะธัั‚ะฝั‹ั…, ะฝะพ ั…ะพะปะพะดะฝั‹ั… ะฟั€ะธะบะพัะฝะพะฒะตะฝะธะน. ะขะตะฑะต ะฝั€ะฐะฒะธั‚ัั ัะผะพั‚ั€ะตั‚ัŒ, ะบะฐะบ ะฟั€ะพะณะธะฑะฐะตั‚ัั ะผะพั ัะฟะธะฝะฐ, ะบะพะณะดะฐ ั‚ั‹ ั€ะธััƒะตัˆัŒ ะฝะฐ ะฝะตะน ะฝะตะฒะธะดะธะผั‹ะต ะปะธะฝะธะธ. ะ’ ะดะถะธะฝัะฐั… ัƒะถะต ัั‚ะฐะฝะพะฒะธั‚ัั ั‚ะตัะฝะพ, ะฐ ะฒ ะณะพะปะพะฒะต ะพะฑั€ะฐะทัƒะตั‚ัั ะฟัƒัั‚ะพั‚ะฐ, ะทะฐะฟะพะปะฝัะตะผะฐั ะปะธัˆัŒ ั‚ะพะฑะพะน. ะขะฒะพะธ ั€ัƒะบะธ ะพะฟัƒัะบะฐัŽั‚ัั ะฝะธะถะต, ะฝะฐั‰ัƒะฟั‹ะฒะฐั ะฟั€ัะถะบัƒ ั€ะตะผะฝั. ะž, ั‚ั‹ ะถะต ัะฐะผ ะทะฐั‚ะตัะป ัั‚ัƒ ะธะณั€ัƒ, ะผะฐะปั‹ัˆ, ั‚ะฐะบ ะดะฐะฒะฐะน ะฟะพะธะณั€ะฐะตะผ?\nะฏ, ะฒัั‘ ั‚ะฐะบ ะถะต ะฝะฐั…ะพะดัััŒ ะฒ ะบั€ะตะฟะบะธั… ะพะฑัŠัั‚ะธัั…, ะดะตะปะฐัŽ ะฝะตะพะถะธะดะฐะฝะฝั‹ะน ั€ะฐะทะฒะพั€ะพั‚, ะฟั€ะธะฒั‹ั‡ะฝะพ ะทะฐะฝะธะผะฐั ั€ะพะปัŒ ะฐะบั‚ะธะฒะฐ. ะขั‹ ั ะทะฐะผะธั€ะฐะฝะธะตะผ ัะตั€ะดั†ะฐ ัะผะพั‚ั€ะธัˆัŒ ะฝะฐ ะผะตะฝั, ะฟั€ะตะบั€ะฐั‚ะธะฒ ะฒัะต ะดะตะนัั‚ะฒะธั. ะฃะปั‹ะฑะฝัƒะฒัˆะธััŒ, ะฟั€ะพะฒะพะถัƒ ัะทั‹ะบะพะผ ะฟะพ ั‚ะฒะพะตะผัƒ ัƒั…ัƒ, ั‡ัƒั‚ัŒ ะฟั€ะธะบัƒัั‹ะฒะฐั ะผะพั‡ะบัƒ, ะพั‚ ั‡ะตะณะพ ั‚ะฒะพะธ ะดั€ะพะถะฐั‰ะธะต ะฟะฐะปัŒั†ั‹ ะฟะตั€ะตะผะตั‰ะฐัŽั‚ัั ะฒะฒะตั€ั… ะธ ััƒะดะพั€ะพะถะฝะพ ัะถะธะผะฐัŽั‚ ะผะพะธ ะฒะพะปะพัั‹, ั ะบะพั‚ะพั€ั‹ั… ัั‚ะตะบะฐะตั‚ ะฒะพะดะฐ. 
ะะตั‚ะตั€ะฟะตะปะธะฒะพ ั€ะฐััั‚ะตะณะธะฒะฐัŽ ะฟัƒะณะพะฒะธั†ั‹ ั‚ะฒะพะตะน ั€ัƒะฑะฐัˆะบะธ, ะฟะพะฟัƒั‚ะฝะพ ะพัั‚ะฐะฒะปัั ะฝะตัะบะพะปัŒะบะพ ะฑะฐะณั€ะพะฒั‹ั… ะพั‚ะผะตั‚ะธะฝ ะฝะฐ ัˆะตะต ะธ ะฝะฐ ะณั€ัƒะดะธ. ะ”ะพ ะผะตะฝั ะดะพะฝะพัะธั‚ัั ัั‚ะพะฝ, ะธ ั ะฟั€ะพะดะพะปะถะฐัŽ ะผะตะดะปะตะฝะฝัƒัŽ ะฟั‹ั‚ะบัƒ, ัั‚ัะณะธะฒะฐั ั ั‚ะตะฑั ะฑั€ัŽะบะธ ะฒะผะตัั‚ะต ั ะฑะตะปัŒะตะผ. ะ’ ั‚ะธัˆะธะฝะต, ั€ะฐะทะฑะฐะฒะปัะตะผะพะน ะฝะฐัˆะธะผ ั‚ัะถะตะปั‹ะผ ะดั‹ั…ะฐะฝะธะตะผ, ั€ะฐะทะดะฐะตั‚ัั ัˆัƒะผะฝั‹ะน ะฒั‹ะดะพั…, ะบะพะณะดะฐ ั ะดะตะปะฐัŽ ะฝะตัะบะพะปัŒะบะพ ะดะฒะธะถะตะฝะธะน ั€ัƒะบะพะน ะฟะพ ะพัะฝะพะฒะฐะฝะธัŽ ั‡ะปะตะฝะฐ, ะฐ ะทะฐั‚ะตะผ, ะปะธะทะฝัƒะฒ ะณะพะปะพะฒะบัƒ, ะพั‚ัั‚ั€ะฐะฝััŽััŒ, ะณะปัะดั ะฒ ะพะดัƒั€ะผะฐะฝะตะฝะฝั‹ะต ะณะปะฐะทะฐ.\n- ะกะฟะฐะปัŒะฝั, - ัˆะตะฟั‡ะตัˆัŒ ั‚ั‹, ะบั€ะตะฟะบะพ ะดะตั€ะถะฐััŒ ะทะฐ ะบั€ะฐะน ั‚ัƒะผะฑะพั‡ะบะธ, ัั‚ะพัะฒัˆะตะน ั€ัะดะพะผ.\nะŸั€ะพัะธั‚ัŒ ะดะฒะฐะถะดั‹ ะฝะตั‚ ัะผั‹ัะปะฐ, ะฒะตะดัŒ ัƒ ะผะตะฝั ัะฐะผะพะณะพ ัƒะถะต ะฝะตั‚ ัะธะป ั‚ะตั€ะฟะตั‚ัŒ ัั‚ะพ ั‚ัะฝัƒั‰ะตะต ะพั‰ัƒั‰ะตะฝะธะต, ะพะฑั€ะฐะทะพะฒะฐะฒัˆะตะตัั ะฒะฝะธะทัƒ ะถะธะฒะพั‚ะฐ.\nะ›ะตะณะบะพ ะฟะพะดั…ะฒะฐั‚ั‹ะฒะฐัŽ ั‚ะตะฑั ะฝะฐ ั€ัƒะบะธ ะธ ะธะดัƒ ะฒ ั‚ัƒ ะบะพะผะฝะฐั‚ัƒ, ะฒ ะบะพั‚ะพั€ะพะน ะผั‹ ัั‚ะพะปัŒะบะพ ั€ะฐะท ะทะฐะฝะธะผะฐะปะธััŒ ะปัŽะฑะพะฒัŒัŽ.\nะšั€ะพะฒะฐั‚ัŒ ะฒัั‚ั€ะตั‡ะฐะตั‚ ะฝะฐั ะทะฝะฐะบะพะผั‹ะผ ัะบั€ะธะฟะพะผ, ะบะพะณะดะฐ ั ะพะฟัƒัะบะฐัŽ ั‚ะตะฑั, ะฝะตั€ะฒะฝะพ ะบัƒัะฐัŽั‰ะตะณะพ ะณัƒะฑั‹. ะขั‹ ั…ะฒะฐั‚ะฐะตัˆัŒ ะผะตะฝั ะธ ั‚ัะฝะตัˆัŒ ะฝะฐ ัะตะฑั, ะพั‚ั‡ะตะณะพ ะพะบะฐะทั‹ะฒะฐะตัˆัŒัั ะฟั€ะธะถะฐั‚ั‹ะผ ะผะพะธะผ ั‚ะตะปะพะผ. 
ะขะฒะพะธ ั€ัƒะบะธ ัะบะพะปัŒะทัั‚ ะฟะพ ะผะพะธะผ ะฑะพะบะฐะผ, ะฟะพะผะพะณะฐั ัะฝัั‚ัŒ ั„ัƒั‚ะฑะพะปะบัƒ ะธ, ะฟั€ะธะปะพะถะธะฒ ะฝะตะบะพั‚ะพั€ั‹ะต ัƒัะธะปะธั, ะฟั€ะธะฟะพะดะฝัะฒัˆะธััŒ, ะพะฑะฒะพะดะธัˆัŒ ัะทั‹ะบะพะผ ะผะพะธ ัะพัะบะธ, ัะปะตะณะบะฐ ั†ะฐั€ะฐะฟะฐั ะธั… ะทัƒะฑะฐะผะธ.\nะงัƒะฒัั‚ะฒัƒั ะฝะตะพะฑั…ะพะดะธะผะพัั‚ัŒ ัะบะพั€ะตะนัˆะตะน ั€ะฐะทั€ัะดะบะธ, ั ะฟั‹ั‚ะฐัŽััŒ ะบะฐะบ ะผะพะถะฝะพ ัะบะพั€ะตะต ัะฝัั‚ัŒ ะดะถะธะฝัั‹ ะธ ะฝะฐัˆะฐั€ะธั‚ัŒ ะฒ ัั‰ะธั‡ะบะต ัˆะบะฐั„ะฐ ัะผะฐะทะบัƒ ะธ ะฟั€ะตะทะตั€ะฒะฐั‚ะธะฒั‹. ะะตั‚ะตั€ะฟะตะปะธะฒะพ ัƒัั‚ั€ะฐะธะฒะฐัŽััŒ ะฟะพัƒะดะพะฑะฝะตะต ะผะตะถะดัƒ ั‚ะฒะพะธั… ะฝะพะณ, ั€ะฐะทะฒะพะดั ะธั… ะฒ ัั‚ะพั€ะพะฝั‹ ะธ ะฝะตะผะฝะพะณะพ ัะณะธะฑะฐั ะฒ ะบะพะปะตะฝัั…. ะ’ั‹ะดะฐะฒะปะธะฒะฐัŽ ะณะตะปัŒ ะธ ะฟะพะพั‡ะตั€ะตะดะฝะพ ะฐะบะบัƒั€ะฐั‚ะฝะพ ะฒะฒะพะถัƒ ะฒ ั‚ะตะฑั ะฟะฐะปัŒั†ั‹, ั€ะฐัั‚ัะณะธะฒะฐั ะฟั€ะพั…ะพะด. ะกะปั‹ัˆัƒ ั‚ะฒะพะต ัะดะฐะฒะปะตะฝะฝะพะต ัˆะธะฟะตะฝะธะต ะธ ัั‚ะฐั€ะฐัŽััŒ ะพั‚ะฒะปะตั‡ัŒ ัะฒะพะธะผะธ ะปะฐัะบะฐะผะธ, ะฟะพะบั€ั‹ะฒะฐั ะณั€ัƒะดัŒ ะธ ะฟะปะตั‡ะธ ะฟะพั†ะตะปัƒัะผะธ, ะบะพะต-ะณะดะต ั‡ัƒั‚ัŒ ะฟั€ะธะบัƒัั‹ะฒะฐั ะบะพะถัƒ.\nะขั‹ ะทะฐะตั€ะทะฐะป ะธ ะฝะตะดะพะฒะพะปัŒะฝะพ ัƒัั‚ะฐะฒะธะปัั ะฝะฐ ะผะตะฝั, ั‚ั€ะตะฑัƒั ะฑะพะปัŒัˆะตะณะพ. ะฏ ั ัƒะดะพะฒะพะปัŒัั‚ะฒะธะตะผ ะฟะพะดั‡ะธะฝััŽััŒ. ะŸั€ะธัั‚ะฐะฒะปััŽ ั‡ะปะตะฝ ะบะพ ะฒั…ะพะดัƒ ะธ ะผะตะดะปะตะฝะฝะพ ะฒั…ะพะถัƒ, ะฝะฐ ั‡ั‚ะพ ะฟะพะปัƒั‡ะฐัŽ ะตะปะต ะทะฐะผะตั‚ะฝั‹ะน ะบะธะฒะพะบ, ะบะฐะบ ั€ะฐะทั€ะตัˆะตะฝะธะต ะฟั€ะพะดะพะปะถะฐั‚ัŒ. ะกะฟัƒัั‚ั ะฝะตัะบะพะปัŒะบะพ ั‚ะพะปั‡ะบะพะฒ ั‚ั‹ ะฒั‹ะณะธะฑะฐะตัˆัŒัั ะฒ ะฟะพะทะฒะพะฝะพั‡ะฝะธะบะต, ะธ ะฝะฐ ะผะพะตะผ ะปะธั†ะต ะฟะพัะฒะปัะตั‚ัั ัƒะปั‹ะฑะบะฐ. ะฏ ัƒะฒะตะปะธั‡ะธะฒะฐัŽ ั‚ะตะผะฟ, ะดะฒะธะณะฐัััŒ ะฒัั‘ ะฑั‹ัั‚ั€ะตะต.\nะžั€ะณะฐะทะผ ัั‚ั€ะตะผะธั‚ะตะปัŒะฝะพ ะฝะฐะบั€ั‹ะฒะฐะตั‚ ะฝะฐั ั ะณะพะปะพะฒะพะน, ะดะฐั€ั ัั‚ะพะปัŒ ะดะพะปะณะพะถะดะฐะฝะฝะพะต ะฝะฐัะปะฐะถะดะตะฝะธะต. 
ะกะพ ัะฑะธะฒัˆะธะผัั ะดั‹ั…ะฐะฝะธะตะผ, ัะพ ะทะฒะตะทะดะพั‡ะบะฐะผะธ ะฒ ะณะปะฐะทะฐั… ะฟะฐะดะฐัŽ ั€ัะดะพะผ ั ั‚ะพะฑะพะน, ั€ะฐัะบั€ะฐัะฝะตะฒัˆะธะผัั, ั‚ัะถะตะปะพ ะดั‹ัˆะฐั‰ะธะผ, ะฝะพ ั‚ะฐะบะธะผ ะปัŽะฑะธะผั‹ะผ. ะขั‹ ะฟั€ะธะถะธะผะฐะตัˆัŒัั ะบะพ ะผะฝะต, ะฟะพะปะพะถะธะฒ ะณะพะปะพะฒัƒ ะฝะฐ ะผะพัŽ ะณั€ัƒะดัŒ. ะ”ะตะปะฐั‚ัŒ ัะตะนั‡ะฐั ั‡ั‚ะพ-ะปะธะฑะพ ะฒั‹ัˆะต ะฒััะบะธั… ัะธะป - ั ะฟั€ะพะดะพะปะถะฐัŽ ะปะตะถะฐั‚ัŒ, ะฟะพะณะปะฐะถะธะฒะฐั ั‚ะฒะพะธ ะฒะพะปะพัั‹ ะธ ะฒัะปัƒัˆะธะฒะฐัััŒ ะฒ ะฑะธะตะฝะธะต ะฝะฐัˆะธั… ัะตั€ะดะตั†.\nะŸะพั‡ะตะผัƒ ั ั‚ะตะฑั ั‚ะพะณะดะฐ ะฝะต ะฟะพัะปัƒัˆะฐะป? ะ—ะฐั‡ะตะผ ะฟะพะทะฒะพะปะธะป ั‚ะตะฑะต ะผะพะบะฝัƒั‚ัŒ ะฟะพะด ะดะพะถะดะตะผ ะฒะผะตัั‚ะต ัะพ ะผะฝะพะน? ะ•ัะปะธ ะฑั‹ ะฝะต ัั‚ะฐ ะพัˆะธะฑะบะฐ, ั‚ั‹ ะฑั‹ ะฝะต ะฟะพะดั…ะฒะฐั‚ะธะป ัะตั€ัŒะตะทะฝัƒัŽ ะฑะพะปะตะทะฝัŒ. ะœะตะฝั ะดะพ ัะธั… ะฟะพั€ ั‚ะตั€ะทะฐะตั‚ ั‡ัƒะฒัั‚ะฒะพ ะฒะธะฝั‹. ะžั‡ะตะฝัŒ ั‚ัะถะตะปะพ ะพัะพะทะฝะฐะฒะฐั‚ัŒ, ั‡ั‚ะพ ะฟะพะณัƒะฑะธะป ั‡ัŒัŽ-ั‚ะพ ะถะธะทะฝัŒ... ะžัะพะฑะตะฝะฝะพ ั‚ะพะณะพ, ะบั‚ะพ ะฑั‹ะป ั†ะตะฝั‚ั€ะพะผ ะผะพะตะน ะ’ัะตะปะตะฝะฝะพะน.\nะฏ ะฟั€ะพะดะพะปะถะฐัŽ ะถะธั‚ัŒ ะฟั€ะพัˆะปั‹ะผ, ะฝะต ะผะพะณัƒ ะฝะต ะฒัะฟะพะผะธะฝะฐั‚ัŒ ั‚ะต ะฝะตะผะฝะพะณะธะต, ะฝะพ ั‚ะฐะบะธะต ะดะพั€ะพะณะธะต ะผะพะตะผัƒ ัะตั€ะดั†ัƒ ะผะพะผะตะฝั‚ั‹, ะฟั€ะพะฒะตะดะตะฝะฝั‹ะต ั ั‚ะพะฑะพะน. ะœั‹ ัะพะฒัะตะผ ะฝะตะดะพะปะณะพ ะฑั‹ะปะธ ะฒะผะตัั‚ะต. ะœะตะฝั ั‡ะฐัั‚ะพ ะผะพะถะฝะพ ะฒัั‚ั€ะตั‚ะธั‚ัŒ ะฝะฐ ั‚ะพะผ ัะฐะผะพะผ ะผะตัั‚ะต ะฒ ะปะตััƒ, ะณะดะต ั ะพั‚ะบั€ั‹ะปัั ั‚ะตะฑะต. ะ˜ะฝะพะณะดะฐ ะผะฝะต ะบะฐะถะตั‚ัั, ั‡ั‚ะพ ัะบะฒะพะทัŒ ัะธะปัŒะฝัƒัŽ ะผะตั‚ะตะปัŒ ะฒะธะถัƒ ั‚ะฒะพะน ัะธะปัƒัั‚. 
ะขั‹ ัƒะปั‹ะฑะฐะตัˆัŒัั ะธ ะดะตะปะฐะตัˆัŒ ะฝะตัะบะพะปัŒะบะพ ัˆะฐะณะพะฒ ะฝะฐะฒัั‚ั€ะตั‡ัƒ, ะฐ ะฟะพั‚ะพะผ ะธัั‡ะตะทะฐะตัˆัŒ...\nะ•ัะปะธ ะฑั‹ ั‚ะพะปัŒะบะพ ะฑั‹ะปะฐ ะฒะพะทะผะพะถะฝะพัั‚ัŒ ะตั‰ะต ั€ะฐะท ัƒัะปั‹ัˆะฐั‚ัŒ ั‚ะฐะบะพะต ั‚ะตะฟะปะพะต "ะดะพะฑั€ะพะต ัƒั‚ั€ะพ", ะฟะพั‡ัƒะฒัั‚ะฒะพะฒะฐั‚ัŒ ะณะพั€ัั‡ะตะต ะดั‹ั…ะฐะฝะธะต, ั‰ะตะบะพั‡ัƒั‰ะตะต ัƒั…ะพ, ั…ะพั‚ัŒ ั‡ั‚ะพ-ะฝะธะฑัƒะดัŒ...\nะŸะพะบะฐ ั‚ั‹ ะฑั‹ะป ั€ัะดะพะผ, ะฑั‹ะปะพ ัะพะฒัะตะผ ะฝะต ะฒะฐะถะฝะพ, ั‡ั‚ะพ ะฟั€ะพะธัั…ะพะดะธั‚ ะฒะพะบั€ัƒะณ. ะะพ ั‚ะตะฟะตั€ัŒ, ะบะพะณะดะฐ ั ะฝะฐะฑะปัŽะดะฐัŽ ะทะฐ ะฝะตะฝะฐัั‚ะฝะพะน ะฟะพะณะพะดะพะน ะฒ ะพะบะฝะพ, ัƒ ะผะตะฝั ะฝะตั‚ ัะฒะตั‚ะปั‹ั… ะผั‹ัะปะตะน ะธ ะปะตะณะบะพัั‚ะธ, ั‡ั‚ะพ ะฒะพะทะฝะธะบะฐะปะธ ั€ะฐะฝัŒัˆะต. ะ”ะฐะถะต ะปะตั‚ะพะผ ะผะพะต ัะตั€ะดั†ะต ัะบะพะฒั‹ะฒะฐะตั‚ ะปะตะด, ะบะพั‚ะพั€ั‹ะน ัƒะถะต ะฝะต ัƒะดะฐัั‚ัั ั€ะฐัั‚ะพะฟะธั‚ัŒ.', ] embeddings = model.encode(sentences) print(embeddings.shape) # [3, 384] # Get the similarity scores for the embeddings similarities = model.similarity(embeddings, embeddings) print(similarities.shape) # [3, 3] ``` <!-- ### Direct Usage (Transformers) <details><summary>Click to see the direct usage in Transformers</summary> </details> --> <!-- ### Downstream Usage (Sentence Transformers) You can finetune this model on your own dataset. 
<details><summary>Click to expand</summary> </details> --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> ## Evaluation ### Metrics #### Binary Classification * Evaluated with [<code>BinaryClassificationEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator) | Metric | Value | |:--------------------------|:-----------| | cosine_accuracy | 0.9215 | | cosine_accuracy_threshold | 0.3258 | | cosine_f1 | 0.752 | | cosine_f1_threshold | 0.2845 | | cosine_precision | 0.7465 | | cosine_recall | 0.7575 | | **cosine_ap** | **0.8412** | | cosine_mcc | 0.702 | <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? 
For example, filtering explicit content.* --> ## Training Details ### Training Dataset #### json * Dataset: json * Size: 276,686 training samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | label | |:--------|:--------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:-----------------------------| | type | string | string | int | | details | <ul><li>min: 442 tokens</li><li>mean: 510.89 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 451 tokens</li><li>mean: 511.55 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>1: 100.00%</li></ul> | * Samples: | sentence1 | sentence2 | label | |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| | <code>ะงั‚ะพ ั‡ั‚ะพ-ั‚ะพ ะฝะต ั‚ะฐะบ, ะธะฝั‚ัƒะธั†ะธั ะฟะพะดัะบะฐะทั‹ะฒะฐะปะฐ ะ—ะฐะฝะทะฐััƒ ั ัะฐะผะพะณะพ ัƒั‚ั€ะฐ. ะ‘ะปะฐะณะพะฟะพะปัƒั‡ะฝะพ ะฟั€ะพะธะณะฝะพั€ะธั€ะพะฒะฐะฒ ะฟั€ะพะฑัƒะถะดะตะฝะธะต ั ะปะธะณั€ะพะผ ะฒ ะฟะพัั‚ะตะปะธ, ะ‘ะตัั‚ะตั€ ะฟะตั€ะธะพะดะธั‡ะตัะบะธ ัะฟะฐะป ั€ัะดะพะผ ั ั…ะพะทัะธะฝะพะผ. ะ—ะฐะฝะทะฐั ะปะตะฝะธะฒะพ ะฝะพ, ะบะฐะบ ะธะทะฒะตัั‚ะฝะพ ะพั€ะณะฐะฝะธะทะผัƒ ะฝะต ะพั‚ะบะฐะถะตัˆัŒ, ัะฟัƒัั‚ะธะปัั ะฝะฐ ะบัƒั…ะฝัŽ. ะžั‚ะฝะพัะธั‚ะตะปัŒะฝะพ ัะฟะพะบะพะนะฝะพ ะฟะพะตะฒ, ะธ ะธะทะฑะฐะฒะธะฒัˆะธััŒ ะพั‚ ะฝะพะฒะพัะฒะปะตะฝะฝะพะณะพ ั‚ั€ัƒะฟะฐ, ะบะพั‚ะพั€ั‹ะน ะฟั€ะพะปะธะป ะฝะฐ ะฝะตะณะพ ะฟะพะดะปะธะฒัƒ ะบ ะผัััƒ, ะฑะพัั ะฒัะตั ะฒะฐั€ะธะธ ะพั‚ะฟั€ะฐะฒะธะปัั ะฒ ะดัƒัˆ. ะ‘ั‹ัั‚ั€ะพ ะฒั‹ะผั‹ะฒัˆะธััŒ ะธ ะพะฑะฒัะทะฐะฒ ะบะพั€ะพั‚ะบะพะต ะฟะพะปะพั‚ะตะฝั†ะต ะฝะฐ ะฑะตะดั€ะฐั…, ะพะฝ ะฒะตั€ะฝัƒะปัั ะฒ ัะฒะพัŽ ัะฟะฐะปัŒะฝัŽ ะธ ะฟั€ะธะปะตะณ ะฝะฐ ะบั€ะพะฒะฐั‚ัŒ ั€ัะดะพะผ ั ะปะธะณั€ะพะผ. ะะตะผะฝะพะณะพ ะฟะพั‚ั€ะตะฟะฐะฒ ะตะณะพ ะณั€ะธะฒัƒ, ะฑั€ัŽะฝะตั‚ ั€ะฐะทะปะตะณัั ะฝะฐ ะบั€ะพะฒะฐั‚ะธ. 
ะ–ะธะฒะพั‚ะฝะพะต ะถะต ะฒัะฟะพะผะฝะธะปะพ, ะบะฐะบ ะดะปะธะฝะฝะพะฒะพะปะพัั‹ะน ะฟะฐั€ะตะฝัŒ ะธัะฟะพะปัŒะทะพะฒะฐะป ะตะณะพ ั…ะพะทัะธะฝะฐ ะบะฐะบ ัะฐะผะบัƒ. ะ ะ‘ะตัั‚ะตั€ ะฟะพ ั…ะฐั€ะฐะบั‚ะตั€ัƒ ะฑั‹ะป ะพั‡ะตะฝัŒ ะฟะพั…ะพะถ ะฝะฐ ะ—ะฐะฝะทะฐัะฐ ัะพะฑัั‚ะฒะตะฝะฝะธั‡ะตัั‚ะฒะพะผ, ะฟะพ ัั‚ะพะน ะฟั€ะธั‡ะธะฝะต ะทะฒะตั€ัŒ ะฒัั‚ะฐะป, ะฟะพั‚ะพะฟั‚ะฐะปัั ะฝะฐ ะฟะพัั‚ะตะปะธ ะธ ะทะฐะฑั€ะฐะปัั ะฝะฐ ัะฒะพะตะณะพ ั…ะพะทัะธะฝะฐ. ะ—ะฐะฝะทะฐั ะฒะฝะพะฒัŒ ะฝะต ะฟั€ะธะดะฐะป ัั‚ะพะผัƒ ะทะฝะฐั‡ะตะฝะธั, ะฟั€ะธะฝะธะผะฐั ะทะฐ ะฟะพะฟั‹ั‚ะบัƒ ะปะตะฝะธะฒะพะณะพ ะถะธะฒะพั‚ะฝะพะณะพ ัะปะตะทั‚ัŒ ั ะบั€ะพะฒะฐั‚ะธ. ะญั‚ะพ ะธ ะฑั‹ะปะพ ะตะณะพ ะพัˆะธะฑะบะพะน. ะกะฒะพะธะผ ะฝะตะผะฐะปั‹ะผ ะฒะตัะพะผ ะ‘ะตัั‚ะตั€ ะฟั€ะธะดะฐะฒะธะป ะผัƒะถั‡ะธะฝัƒ ะบ ะฟะพัั‚ะตะปะธ, ะธ ะพั‚ะดะตะปัŒะฝะพ ะฟั€ะธะดะฐะฒะธะป ะพะดะฝะพะน ะปะฐะฟะพะน ะตะณะพ ั€...</code> | <code>ะะพะผะธะฝะต ะฝะตัะฟะตัˆะฝะพ ัˆะตะป ะฒ ัั‚ะพั€ะพะฝัƒ ัˆะบะพะปั‹ ะกะตะนั€ะธะฝ. ะฃั€ะพะบะธ ะตั‰ะต ัˆะปะธ, ะฟะพัั‚ะพะผัƒ ะตะผัƒ ะฑั‹ะปะพ ะฝะตะบัƒะดะฐ ั‚ะพั€ะพะฟะธั‚ัŒัั, ะฐ ัะฒะพะธ ะพะฝ ะฑะปะฐะณะพะฟะพะปัƒั‡ะฝะพ ะฟั€ะพะต...ะบั…ะผ, ะฟั€ะพะฟัƒัั‚ะธะป, ะดะฐะฑั‹ ะฝะฐะฒะตะดะฐั‚ัŒัั ะบ ะšะฐะณะฐะผะธ. ะ—ะฐั‡ะตะผ, ะพะฝ ะธ ัะฐะผ ะดะพ ะบะพะฝั†ะฐ ะฝะต ะฟะพะฝะธะผะฐะป, ะฝะพ ะฟั€ะธะฒั‹ะบ ัะปะตะดะพะฒะฐั‚ัŒ ัะฒะพะธะผ ะถะตะปะฐะฝะธัะผ ะธ ะบะพั€ะผะธั‚ัŒ ะฒะฝัƒั‚ั€ะตะฝะฝะธั… ะดะตะผะพะฝะพะฒ. ะ’ ะฝะฐัƒัˆะฝะธะบะฐั… ะธะณั€ะฐะปะฐ ะฝะตะทะฐั‚ะตะนะปะธะฒะฐั ะผะตะปะพะดะธั ะฝะฐ ะฐะฝะณะปะธะนัะบะพะผ, ะฐ ัะฐะผ ะะพะผะธะฝะต ะฝะต ะทะฐะผะพั€ะฐั‡ะธะฒะฐะปัั ั‚ะตะบัั‚ะพะผ, ะฝะฐัะปะฐะถะดะฐัััŒ ะทะฒัƒั‡ะฐะฝะธะตะผ ะผัƒะทั‹ะบะธ ะธ ะณะพะปะพัะพะผ ะฟะตะฒั†ะฐ.<br>ะ’ะพะนะดั ะฒะพ ะดะฒะพั€, ะพะฝ ะพะณะปัะดะตะปัั, ะธั‰ะฐ ะฒั…ะพะด ะฒ ัƒั‡ะตะฑะฝั‹ะน ะบะพั€ะฟัƒั. ะะฐะนะดั ะถะต ะฝัƒะถะฝัƒัŽ ะดะฒะตั€ัŒ, ะพะฝ ะฟั€ะพัˆะตะป ะฒะฝัƒั‚ั€ัŒ, ะฟะพะดั…ะพะดั ะบ ั€ะฐัะฟะธัะฐะฝะธัŽ. ะŸั€ะพะฒะตะดั ะฟะฐะปัŒั†ะตะผ ะฟะพ ั†ะธั„ั€ะต ะฝัƒะถะฝะพะณะพ ะบะปะฐััะฐ, ะพะฝ ะฒะทะณะปัะฝัƒะป ะฝะฐ ัะฐะผะธ ัƒั€ะพะบะธ.<br>-ะฅะผ, ัะฟะพะฝัะบะธะน... 
ะะต ะดัƒะผะฐัŽ, ั‡ั‚ะพ ะพะฝ ะฑัƒะดะตั‚ ะฟั€ะพั‚ะธะฒ, ะตัะปะธ ั ะตะณะพ ะพั‚ะผะฐะถัƒ. - ะ˜, ะฟะพ ะฐะบัƒะปัŒะธ ัƒะปั‹ะฑะฝัƒะฒัˆะธััŒ, ะฟะฐั€ะตะฝัŒ ะฝะฐะฟั€ะฐะฒะธะปัั ะฝะฐ ะฒั‚ะพั€ะพะน ัั‚ะฐะถ, ะบ ะบะฐะฑะธะฝะตั‚ัƒ ะฝะพะผะตั€ ั‚ั€ะธะฝะฐะดั†ะฐั‚ัŒ. ะŸั€ะตะดะฒะฐั€ะธั‚ะตะปัŒะฝะพ ะทะฐะณะปัะฝัƒะฒ ะฒ ะทะฐะผะพั‡ะฝัƒัŽ ัะบะฒะฐะถะธะฝัƒ, ัƒะฒะธะดะตะฒ ะผะพะปะพะดะตะฝัŒะบัƒัŽ ัƒั‡ะธั‚ะตะปัŒะฝะธั†ัƒ ะธ ะฒั‹ะณะปัะดั‹ะฒะฐัŽั‰ัƒัŽ ะธะท-ะฟะพะด ะฑะปัƒะทะบะธ ั‚ะฐั‚ัƒะธั€ะพะฒะบัƒ "I love yaoi", ะฒ ะพั‡ะตั€ะตะดะฝะพะน ั€ะฐะท ะพัะบะฐะปะธะปัั ะธ ะฟั€ะพัˆะตะป ะฒ ะบะฐะฑะธะฝะตั‚. ะ”ะตะฒัƒัˆะบะฐ ะฝะต ัƒัะฟะตะปะฐ ะดะฐะถะต ะฟะธะบะฝัƒั‚ัŒ, ะบะฐะบ ะพะฝ ะฑั‹ะป ัƒ ะฟะฐั€ั‚ั‹ ะšะฐะณะฐะผะธ. ะะฐะบะปะพะฝะธะฒัˆ...</code> | <code>1</code> | | <code>ะงั‚ะพ ั‡ั‚ะพ-ั‚ะพ ะฝะต ั‚ะฐะบ, ะธะฝั‚ัƒะธั†ะธั ะฟะพะดัะบะฐะทั‹ะฒะฐะปะฐ ะ—ะฐะฝะทะฐััƒ ั ัะฐะผะพะณะพ ัƒั‚ั€ะฐ. ะ‘ะปะฐะณะพะฟะพะปัƒั‡ะฝะพ ะฟั€ะพะธะณะฝะพั€ะธั€ะพะฒะฐะฒ ะฟั€ะพะฑัƒะถะดะตะฝะธะต ั ะปะธะณั€ะพะผ ะฒ ะฟะพัั‚ะตะปะธ, ะ‘ะตัั‚ะตั€ ะฟะตั€ะธะพะดะธั‡ะตัะบะธ ัะฟะฐะป ั€ัะดะพะผ ั ั…ะพะทัะธะฝะพะผ. ะ—ะฐะฝะทะฐั ะปะตะฝะธะฒะพ ะฝะพ, ะบะฐะบ ะธะทะฒะตัั‚ะฝะพ ะพั€ะณะฐะฝะธะทะผัƒ ะฝะต ะพั‚ะบะฐะถะตัˆัŒ, ัะฟัƒัั‚ะธะปัั ะฝะฐ ะบัƒั…ะฝัŽ. ะžั‚ะฝะพัะธั‚ะตะปัŒะฝะพ ัะฟะพะบะพะนะฝะพ ะฟะพะตะฒ, ะธ ะธะทะฑะฐะฒะธะฒัˆะธััŒ ะพั‚ ะฝะพะฒะพัะฒะปะตะฝะฝะพะณะพ ั‚ั€ัƒะฟะฐ, ะบะพั‚ะพั€ั‹ะน ะฟั€ะพะปะธะป ะฝะฐ ะฝะตะณะพ ะฟะพะดะปะธะฒัƒ ะบ ะผัััƒ, ะฑะพัั ะฒัะตั ะฒะฐั€ะธะธ ะพั‚ะฟั€ะฐะฒะธะปัั ะฒ ะดัƒัˆ. ะ‘ั‹ัั‚ั€ะพ ะฒั‹ะผั‹ะฒัˆะธััŒ ะธ ะพะฑะฒัะทะฐะฒ ะบะพั€ะพั‚ะบะพะต ะฟะพะปะพั‚ะตะฝั†ะต ะฝะฐ ะฑะตะดั€ะฐั…, ะพะฝ ะฒะตั€ะฝัƒะปัั ะฒ ัะฒะพัŽ ัะฟะฐะปัŒะฝัŽ ะธ ะฟั€ะธะปะตะณ ะฝะฐ ะบั€ะพะฒะฐั‚ัŒ ั€ัะดะพะผ ั ะปะธะณั€ะพะผ. ะะตะผะฝะพะณะพ ะฟะพั‚ั€ะตะฟะฐะฒ ะตะณะพ ะณั€ะธะฒัƒ, ะฑั€ัŽะฝะตั‚ ั€ะฐะทะปะตะณัั ะฝะฐ ะบั€ะพะฒะฐั‚ะธ. ะ–ะธะฒะพั‚ะฝะพะต ะถะต ะฒัะฟะพะผะฝะธะปะพ, ะบะฐะบ ะดะปะธะฝะฝะพะฒะพะปะพัั‹ะน ะฟะฐั€ะตะฝัŒ ะธัะฟะพะปัŒะทะพะฒะฐะป ะตะณะพ ั…ะพะทัะธะฝะฐ ะบะฐะบ ัะฐะผะบัƒ. 
ะ ะ‘ะตัั‚ะตั€ ะฟะพ ั…ะฐั€ะฐะบั‚ะตั€ัƒ ะฑั‹ะป ะพั‡ะตะฝัŒ ะฟะพั…ะพะถ ะฝะฐ ะ—ะฐะฝะทะฐัะฐ ัะพะฑัั‚ะฒะตะฝะฝะธั‡ะตัั‚ะฒะพะผ, ะฟะพ ัั‚ะพะน ะฟั€ะธั‡ะธะฝะต ะทะฒะตั€ัŒ ะฒัั‚ะฐะป, ะฟะพั‚ะพะฟั‚ะฐะปัั ะฝะฐ ะฟะพัั‚ะตะปะธ ะธ ะทะฐะฑั€ะฐะปัั ะฝะฐ ัะฒะพะตะณะพ ั…ะพะทัะธะฝะฐ. ะ—ะฐะฝะทะฐั ะฒะฝะพะฒัŒ ะฝะต ะฟั€ะธะดะฐะป ัั‚ะพะผัƒ ะทะฝะฐั‡ะตะฝะธั, ะฟั€ะธะฝะธะผะฐั ะทะฐ ะฟะพะฟั‹ั‚ะบัƒ ะปะตะฝะธะฒะพะณะพ ะถะธะฒะพั‚ะฝะพะณะพ ัะปะตะทั‚ัŒ ั ะบั€ะพะฒะฐั‚ะธ. ะญั‚ะพ ะธ ะฑั‹ะปะพ ะตะณะพ ะพัˆะธะฑะบะพะน. ะกะฒะพะธะผ ะฝะตะผะฐะปั‹ะผ ะฒะตัะพะผ ะ‘ะตัั‚ะตั€ ะฟั€ะธะดะฐะฒะธะป ะผัƒะถั‡ะธะฝัƒ ะบ ะฟะพัั‚ะตะปะธ, ะธ ะพั‚ะดะตะปัŒะฝะพ ะฟั€ะธะดะฐะฒะธะป ะพะดะฝะพะน ะปะฐะฟะพะน ะตะณะพ ั€...</code> | <code>ะะพะผะธะฝะต ะฑั‹ะป ะฐะฝะณะตะปะพะผ ัƒะถะต ะพั‡ะตะฝัŒ ะดะฐะฒะฝะพ. ะžะฝ ะดะฐะถะต ะฝะต ะฟะพะผะฝะธะป ัะบะพะปัŒะบะพ ะปะตั‚, ะดะฐะถะต ะฒะตะบะพะฒ ะฟั€ะพัˆะปะพ ั ั‚ะพะณะพ ะผะพะผะตะฝั‚ะฐ. ะžะฝ ะปัŽะฑะธะป ัะธะดั ะฝะฐ ะพะดะฝะพะผ ะธะท ะพะฑะปะฐะบะพะฒ ะฝะฐะฑะปัŽะดะฐั‚ัŒ ะทะฐ ะ—ะตะผะปะตะน, ะฐ ะพัะพะฑะตะฝะฝะพ ะพัะตะฝัŒัŽ. ะ˜ ัั‚ะพ ะฑั‹ะป ะพะฑั‹ั‡ะฝั‹ะน ะดะตะฝัŒ, ะฝะพ ะะพะผะธะฝะต ะทะฐั…ะพั‚ะตะปะพััŒ ะฟะพัะผะพั‚ั€ะตั‚ัŒ ะฟะพะฑะปะธะถะต. ะ ะฐัะบั€ั‹ะฒ ะพะณั€ะพะผะฝั‹ะต ะบั€ั‹ะปัŒั, ะพะฝ ัƒัั‚ั€ะตะผะธะปัั ะบะฐะผะฝะตะผ ะฒะฝะธะท. ะ”ะปั ะปัŽะดะตะน ัั‚ะพ ะฒั‹ะณะปัะดะตะปะพ ะบะฐะบ ัƒะฟะฐะฒัˆะฐั ะทะฒะตะทะดะฐ, ัั€ะบะฐั ะปะธะฝะธั ะฒ ะฝะตะฑะต. ะะธะบั‚ะพ ะฝะต ะทะฝะฐะป, ั‡ั‚ะพ ะฒัะต ัั‚ะธ ะปะธะฝะธะธ ั‡ะตั€ั‚ะธะปะธััŒ ะฟะฐะดะฐัŽั‰ะธะผะธ ะฐะฝะณะตะปะฐะผะธ, ะปะธัˆัŒ ะฟะพัั‚ะพะผัƒ ะถะตะปะฐะฝะธั, ะบะพั‚ะพั€ั‹ะต ะฑั‹ะปะพ ะฟั€ะธะฝัั‚ะพ ะทะฐะณะฐะดั‹ะฒะฐั‚ัŒ, ะธัะฟะพะปะฝัะปะธััŒ. ะะฐ ะฑะพะปัŒัˆะพะน ัะบะพั€ะพัั‚ะธ ะผะพะปะพะดะพะน ะฐะฝะณะตะป ะฟั€ะธะทะตะผะปะธะปัั. ะšะพะณะดะฐ ะพะฑะปะฐะบะพ ะฟั‹ะปะธ ั€ะฐััะตัะปะพััŒ, ัั‚ะฐะปะพ ะฒะธะดะฝะพ, ั‡ั‚ะพ ะพะฝ ัั‚ะพะธั‚ ะฝะฐ ะพะดะฝะพะผ ะบะพะปะตะฝะต, ัƒะฟะธั€ะฐัััŒ ั€ัƒะบะฐะผะธ ะฒ ะทะตะผะปัŽ. 
ะกะปะพะถะธะฒ ัะฒะพะธ ะบั€ั‹ะปัŒั, ะพะฝ ัะบั€ั‹ะป ะธั… ะพั‚ ั‡ะตะปะพะฒะตั‡ะตัะบะธั… ะณะปะฐะท, ะตะณะพ ะพะดะตะถะดะฐ ะฒัะตะณะดะฐ ะฑั‹ะปะฐ ะบะฐะบ ะทะตะผะฝะฐั, ะธ ัะตะนั‡ะฐั ั‚ะพะถะต. ะ‘ะตะปะฐั ะฑะพั€ั†ะพะฒะบะฐ, ั€ะฐััั‚ะตะณะฝัƒั‚ะฐั ะฑะปะตะดะฝะพ ะณะพะปัƒะฑะฐั ั€ัƒะฑะฐัˆะบะฐ ะฑะตะท ั€ัƒะบะฐะฒะพะฒ ะธ ั‚ะตะผะฝะพ ัะธะฝะธะต ะดะถะธะฝัั‹. ะ—ะฐะธะฝั‚ะตั€ะตัะพะฒะฐะฝะฝะพ ัะผะพั‚ั€ั ะฟะพ ัั‚ะพั€ะพะฝะฐะผ, ะพะฝ ะฟะพะฑั€ะตะป ะฒ ัั‚ะพั€ะพะฝัƒ ะณะพั€ะพะดะฐ, ะผะธะผะพ ะตั…ะฐะปะธ ะผะฐัˆะธะฝั‹, ะฝะพ ะพะฝ ะธั… ะฝะต ะทะฐะผะตั‡ะฐะป. ะ—ะฐะนะดั ะฒ ะณะพั€ะพะด, ะพะฝ ะฑั‹ะป ะฟะพั‡ั‚ะธ ะพัะปะตะฟะปะตะฝ ะฝะตะพะฝะพะฒั‹ะผะธ ะฒั‹ะฒะตัะบะฐะผะธ ะธ ะผะฝะพะณะพั‡ะธัะปะตะฝะฝั‹ะผะธ...</code> | <code>1</code> | | <code>ะงั‚ะพ ั‡ั‚ะพ-ั‚ะพ ะฝะต ั‚ะฐะบ, ะธะฝั‚ัƒะธั†ะธั ะฟะพะดัะบะฐะทั‹ะฒะฐะปะฐ ะ—ะฐะฝะทะฐััƒ ั ัะฐะผะพะณะพ ัƒั‚ั€ะฐ. ะ‘ะปะฐะณะพะฟะพะปัƒั‡ะฝะพ ะฟั€ะพะธะณะฝะพั€ะธั€ะพะฒะฐะฒ ะฟั€ะพะฑัƒะถะดะตะฝะธะต ั ะปะธะณั€ะพะผ ะฒ ะฟะพัั‚ะตะปะธ, ะ‘ะตัั‚ะตั€ ะฟะตั€ะธะพะดะธั‡ะตัะบะธ ัะฟะฐะป ั€ัะดะพะผ ั ั…ะพะทัะธะฝะพะผ. ะ—ะฐะฝะทะฐั ะปะตะฝะธะฒะพ ะฝะพ, ะบะฐะบ ะธะทะฒะตัั‚ะฝะพ ะพั€ะณะฐะฝะธะทะผัƒ ะฝะต ะพั‚ะบะฐะถะตัˆัŒ, ัะฟัƒัั‚ะธะปัั ะฝะฐ ะบัƒั…ะฝัŽ. ะžั‚ะฝะพัะธั‚ะตะปัŒะฝะพ ัะฟะพะบะพะนะฝะพ ะฟะพะตะฒ, ะธ ะธะทะฑะฐะฒะธะฒัˆะธััŒ ะพั‚ ะฝะพะฒะพัะฒะปะตะฝะฝะพะณะพ ั‚ั€ัƒะฟะฐ, ะบะพั‚ะพั€ั‹ะน ะฟั€ะพะปะธะป ะฝะฐ ะฝะตะณะพ ะฟะพะดะปะธะฒัƒ ะบ ะผัััƒ, ะฑะพัั ะฒัะตั ะฒะฐั€ะธะธ ะพั‚ะฟั€ะฐะฒะธะปัั ะฒ ะดัƒัˆ. ะ‘ั‹ัั‚ั€ะพ ะฒั‹ะผั‹ะฒัˆะธััŒ ะธ ะพะฑะฒัะทะฐะฒ ะบะพั€ะพั‚ะบะพะต ะฟะพะปะพั‚ะตะฝั†ะต ะฝะฐ ะฑะตะดั€ะฐั…, ะพะฝ ะฒะตั€ะฝัƒะปัั ะฒ ัะฒะพัŽ ัะฟะฐะปัŒะฝัŽ ะธ ะฟั€ะธะปะตะณ ะฝะฐ ะบั€ะพะฒะฐั‚ัŒ ั€ัะดะพะผ ั ะปะธะณั€ะพะผ. ะะตะผะฝะพะณะพ ะฟะพั‚ั€ะตะฟะฐะฒ ะตะณะพ ะณั€ะธะฒัƒ, ะฑั€ัŽะฝะตั‚ ั€ะฐะทะปะตะณัั ะฝะฐ ะบั€ะพะฒะฐั‚ะธ. ะ–ะธะฒะพั‚ะฝะพะต ะถะต ะฒัะฟะพะผะฝะธะปะพ, ะบะฐะบ ะดะปะธะฝะฝะพะฒะพะปะพัั‹ะน ะฟะฐั€ะตะฝัŒ ะธัะฟะพะปัŒะทะพะฒะฐะป ะตะณะพ ั…ะพะทัะธะฝะฐ ะบะฐะบ ัะฐะผะบัƒ. 
ะ ะ‘ะตัั‚ะตั€ ะฟะพ ั…ะฐั€ะฐะบั‚ะตั€ัƒ ะฑั‹ะป ะพั‡ะตะฝัŒ ะฟะพั…ะพะถ ะฝะฐ ะ—ะฐะฝะทะฐัะฐ ัะพะฑัั‚ะฒะตะฝะฝะธั‡ะตัั‚ะฒะพะผ, ะฟะพ ัั‚ะพะน ะฟั€ะธั‡ะธะฝะต ะทะฒะตั€ัŒ ะฒัั‚ะฐะป, ะฟะพั‚ะพะฟั‚ะฐะปัั ะฝะฐ ะฟะพัั‚ะตะปะธ ะธ ะทะฐะฑั€ะฐะปัั ะฝะฐ ัะฒะพะตะณะพ ั…ะพะทัะธะฝะฐ. ะ—ะฐะฝะทะฐั ะฒะฝะพะฒัŒ ะฝะต ะฟั€ะธะดะฐะป ัั‚ะพะผัƒ ะทะฝะฐั‡ะตะฝะธั, ะฟั€ะธะฝะธะผะฐั ะทะฐ ะฟะพะฟั‹ั‚ะบัƒ ะปะตะฝะธะฒะพะณะพ ะถะธะฒะพั‚ะฝะพะณะพ ัะปะตะทั‚ัŒ ั ะบั€ะพะฒะฐั‚ะธ. ะญั‚ะพ ะธ ะฑั‹ะปะพ ะตะณะพ ะพัˆะธะฑะบะพะน. ะกะฒะพะธะผ ะฝะตะผะฐะปั‹ะผ ะฒะตัะพะผ ะ‘ะตัั‚ะตั€ ะฟั€ะธะดะฐะฒะธะป ะผัƒะถั‡ะธะฝัƒ ะบ ะฟะพัั‚ะตะปะธ, ะธ ะพั‚ะดะตะปัŒะฝะพ ะฟั€ะธะดะฐะฒะธะป ะพะดะฝะพะน ะปะฐะฟะพะน ะตะณะพ ั€...</code> | <code>ะขััƒะฝะฐะตัˆะธ ะปะตะถะฐะป ะฝะฐ ะฟะพัั‚ะตะปะธ ะฒ ะพะดะฝะพะผ ัˆะตะปะบะพะฒะพะผ ั…ะฐะปะฐั‚ะต, ะพะถะธะดะฐั ะฟั€ะธั…ะพะดะฐ ัะฒะพะตะณะพ ะฟะฐั€ะฝั. ะšะพะณะดะฐ ะดะฒะตั€ัŒ ะพั‚ั€ั‹ะปะฐััŒ, ะพะฝ ั€ะฐะทะฒั€ะฐั‚ะฝะพ ั€ะฐะทะฒะตะป ะฝะพะถะบะธ ะธ ะฟั€ะธะฝัะปัั ะฒั‹ะปะธะทั‹ะฒะฐั‚ัŒ ัะฒะพะธ ะฟะฐะปัŒั†ั‹.<br>-ะ”-ะด-ะดะถัƒะดะฐะนะผะต! ะงั‚ะพ ะ’ั‹ ะดะตะปะฐะตั‚ะต?!<br>-ะœะผะผะผ, ะฅะฐะฐะฐะฐัั‚ะพ, ั ั‚ะตะฑั ั‚ะฐะบ ั…ะพั‡ัƒ! - ะ˜ะทะฝั‹ะฒะฐั ะพั‚ ะถะตะปะฐะฝะธั, ะฟั€ะพะธะทะฝะตั ะกะฐะฒะฐะดะฐ.<br>-ะ”ะฐ ะ’ั‹ ั‡ั‚ะพ... ะšะฐะบ ั ะผะพะณัƒ? - ะขะพั‚ ะฟะพะบั€ะฐัะฝะตะป ะธ ะพั‚ะฒะตั€ะฝัƒะปัั.<br>-ะ—ะฝะฐะตัˆัŒ ั‡ั‚ะพ, ะ“ะพะบัƒะดะตั€ะฐ... - ะ“ะพะปะพั ัั‚ะฐะป ัะตั€ะดะธั‚ั‹ะผ. - ะœั‹ ั€ะฐััั‚ะฐะตะผัั! - ะ˜ ะฝะต ะดะฐะฒ ะฟะพะดั€ั‹ะฒะฝะธะบัƒ ะธ ัะปะพะฒะฐ ัะบะฐะทะฐั‚ัŒ, ะกะฐะฒะฐะดะฐ ะทะฐะฒัะทะฐะป ะฟะพัั ั…ะฐะปะฐั‚ะฐ ะธ, ะพะฑัƒะฒัˆะธััŒ, ะฟะพะบะธะฝัƒะป ะบะฒะฐั€ั‚ะธั€ัƒ. ะžะฝ ะฑั‹ัั‚ั€ะพ ะดะพะฑั€ะฐะปัั ะดะพ ัะฒะพะตะน ะผะฐัˆะธะฝั‹ ะธ, ัะตะฒ ะฒ ะฝะตะต, ะฝะต ะผะตะฝะตะต ะฑั‹ัั‚ั€ะพ ะดะพะฑั€ะฐะปัั ะดะพ ััŠะตะผะฝะพะน ะบะฒะฐั€ั‚ะธั€ั‹ ะกะบัƒะฐะปะพ. ะŸะพั‚ะฐั€ะฐะฑะฐะฝะธะฒ ะฒ ะดะฒะตั€ัŒ ะฒัะตะณะพ ะผะธะฝัƒั‚ัƒ, ะพะฝ ะฒะพั€ะฒะฐะปัั ะฒ ะบะพั€ะธะดะพั€ ะธ ะฝะฐ ะฝะตะดะพัƒะผะตะฝะฝั‹ะน ะฒะทะณะปัะด ะกะบัƒะฐะปะพ ัะบะธะฝัƒะป ั…ะฐะปะฐั‚ะธะบ.<br>-ะ’ั€ะพะพะพะน, ะกะฐะฒะฐะดะฐ, ั‚ั‹ ั‡ะต ั‚ะฒะพั€ะธัˆัŒ? 
- ะกัƒะฟะตั€ะฑะธ ะฟั‹ั‚ะฐะปัั ัะพะฑั€ะฐั‚ัŒ ั‡ะตะปัŽัั‚ัŒ ั ะฟะพะปะฐ, ะฐ ัะฐะผ ะ”ะตััั‚ั‹ะน ะพะฟะตั€ัั ั€ัƒะบะฐะผะธ ะฝะฐ ัั‚ะตะฝัƒ, ะฟั€ะพะณะฝัƒะป ัะฟะธะฝะบัƒ ะธ, ั€ะฐััั‚ะฐะฒะธะฒ ะฝะพะถะบะธ ะฟะพั‚ั€ะตะฑะพะฒะฐะป.<br>-ะขั€ะฐั…ะฝะธ ะผะตะฝั!<br>-ะงะตะณะพ? ะขั‹ ะฟัŒัะฝั‹ะน ั‡ั‚ะพ ะปะธ? - ะงะตะปัŽัั‚ัŒ ะผะตั‡ะฝะธะบะฐ ะฒะพ ะฒั‚ะพั€ะพะน ั€ะฐะท ะฟะพะทะฝะฐะบะพะผะธะปะฐััŒ ั ะฟะพะปะพะผ.<br>-ะŸั€ะพัั‚ะพ ะฒัั‚ะฐะฒัŒ. ะœะฝะต ะผะพะถะฝะพ ะฑะตะท ัะผะฐะทะบะธ ะธ ะณะฐะฝะดะพะฝะพะฒ. - ะ ะฐะทะดั€ะฐ...</code> | <code>1</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Evaluation Dataset #### json * Dataset: json * Size: 184,428 evaluation samples * Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code> * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | label | |:--------|:--------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-----------------------------| | type | string | string | int | | details | <ul><li>min: 429 tokens</li><li>mean: 510.07 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 406 tokens</li><li>mean: 510.3 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>1: 100.00%</li></ul> | * Samples: | sentence1 | sentence2 | label | 
|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------------------------------------------|:---------------| | <code>ะ”ะตะปะพ ะฑั‹ะปะพ ะฒะตั‡ะตั€ะพะผ, ะบะพะณะดะฐ ั ะพั‚ะฟั€ะฐะฒะปัะปะฐััŒ ะฒ ะณะพัั‚ะธ ะบ ะœะฐั€ั‚ะธะฝะต. ะฏ ะฒะทัะปะฐ ั ัะพะฑะพะน ะคะฐะบัƒะฝะดะพ, ะธ ะšัะฐะฑะธ. ะญั‚ะพ ะผะพะธ ััƒะผะฐััˆะตะดัˆะธะต, ะฝะพ ะปัƒั‡ัˆะธะต ะดั€ัƒะทัŒั. ะšะพั€ะพั‡ะต ะบะพั€ะพั‚ะบะพ: ะฏ - ะ›ะพะดะพะฒะธะบะฐ, ะฝะพ ะผะพะถะฝะพ ะฟั€ะพัั‚ะพ ะ›ะพะดะพ.<br>ะขะฐะบ ะบะฐะบ ะขะธะฝะฐ ะถะธะฒั‘ั‚ ะฝะฐ 9 ัั‚ะฐะถะต, ะฝะฐะผ ะฟั€ะธัˆะปะพััŒ ะตั…ะฐั‚ัŒ ะฝะฐ ะปะธั„ั‚ะต, ะธะฝะฐั‡ะต ะฝะฐ ะปะตัั‚ะฝะธั†ะต ะผั‹ ะฑั‹ ะฟะพะดะพั…ะปะธ. ะšะพั€ะพั‡ะต ะทะฐั…ะพะดะธะผ ะฒ ะปะธั„ั‚, ะธ ะฒะพั‚ ะผั‹ ัƒะถะต ะฝะฐ ะฝัƒะถะฝะพะผ ะฝะฐะผ ัั‚ะฐะถะต.<br>ะ”ะฒะตั€ะบะธ ะพั‚ะบั€ั‹ะปะธััŒ, ั ะพั‚ะฒะตั€ะฝัƒะปะฐััŒ ะฝะฐ ะผะธะฝัƒั‚ะบัƒ. ะ ะฟะพั‚ะพะผ ะฟะพะฒะตั€ะฝัƒะปะฐััŒ, ัะผะพั‚ั€ัŽ ะฐ ัั‚ะธั… ะธะดะธะพั‚ะพะฒ ะฝะตั‚ัƒ. ะ’ะดั€ัƒะณ ะดะฒะตั€ะบะธ ะทะฐะบั€ั‹ะปะธััŒ, ั ะดะตะฒัƒัˆะบะฐ ะฝะต ะฟัƒะณะปะธะฒะฐั ะฝะพ, ั‡ั‘ั€ั‚ ะฐ ะฒะดั€ัƒะณ ั ั‚ัƒั‚ ะทะฐัั‚ั€ัะฝัƒ?<br>- ะŸั€ะธะดัƒั€ะบะธ, ะฑะปะธะฝ! ะžั‚ะบั€ะพะนั‚ะต ะปะธั„ั‚! ะ—ะฐัั‚ั€ัะฝัƒ ะฒะตะดัŒ!<br>ะ’ ะพั‚ะฒะตั‚ ั ัƒัะปั‹ัˆะฐะปะฐ ะปะธัˆัŒ ัะผะตั… ะดะฒัƒั… ะฟะฐั€ะฝะตะน. ะัƒ! ะ˜ะผ ะฝะต ะฟะพะทะดะพั€ะพะฒะธั‚ัั ะบะพะณะดะฐ ั ะพั‚ััŽะดะฐ ะฒั‹ะนะดัƒ.<br>ะ’ะดั€ัƒะณ, ะดะฒะตั€ะบะธ ะพั‚ะบั€ั‹ะปะธััŒ, ั ะฒั‹ะปะตะทะปะฐ ะธะท ะปะธั„ั‚ะฐ, ะธ ัั‚ะธ ะดะตะฑะธะปั‹ ะฝะฐัะธะปัŒะฝะพ ะทะฐั‚ะฐัะบะธะฒะฐัŽั‚ ะผะตะฝั ะฒ ะปะธั„ั‚, ะธ ะถะผัƒั‚ ะฝะฐ ะบะฝะพะฟะบัƒ, ั‡ั‚ะพะฑั‹ ะปะธั„ั‚ ะฟะพะตั…ะฐะป ะดะพ 1 ัั‚ะฐะถะฐ.<br>- ะ‘ั‹ัั‚ั€ะตะต! ะคะฐะบัƒ! ะะฐะดะพ ะฑั‹ัั‚ั€ะตะต ะ›ะพะดะพ ัะฟัƒัั‚ะธั‚ัŒัั ะฝะฐ 1 ัั‚ะฐะถ! - ะ—ะฐะบั€ะธั‡ะฐะป ะšัะฐะฑัŒัะฝะธ.<br>- ะŸะพะฝัะป! - ะ’ ะพั‚ะฒะตั‚ ะบั€ะธะบะฝัƒะป ะคะฐะบัƒะฝะดะพ.<br>ะขัƒั‚ ะดะฒะตั€ะบะธ ะทะฐะบั€ั‹ะปะธััŒ, ะธ ะผะตะฝั ะฟะพะฝะตัะปะพ ะฝะฐ 1 ัั‚ะฐะถ. ะงะตั€ะตะท ะฝะตัะบะพะปัŒะบะพ ะผะธะฝัƒั‚ ั ัะฟัƒัั‚ะธะปะฐััŒ ะฝะฐ ะฝัƒะถะฝั‹ะน ัั‚ะฐะถ, ะธ ะฒะดั€ัƒ...</code> | <code>ะฏ - ะ˜ะบะบะธะฝะณ. 
ะกั‹ะฝ ะฒะพะถะดั, ะฟะตั€ะฒั‹ะน ะบะพั‚ะพั€ั‹ะน ะฟั€ะธั€ัƒั‡ะธะป ะดั€ะฐะบะพะฝะฐ.<br>ะฃ ะผะตะฝั ะตัั‚ัŒ ะปัŽะฑะธะผะฐั ะดะตะฒัƒัˆะบะฐ, ะฝะพ ั ะดะฐะฒะฝะพ ะตั‘ ะฝะต ะฒะธะดะตะป. ะŸะพัะปะตะดะฝะธะต ะฝะพะฒะพัั‚ะธ ะผะตะฝั ะฟั€ะธะฒะตะปะธ ะฒ ัƒะถะฐั.<br>ะ“ั€ะพะผะณะธะปัŒะดะฐ, ะดั€ะฐะบะพะฝะธั…ะฐ, ะฟะพะณะธะฑะปะฐ. ะ ะัั‚ั€ะธะด ะฝะต ะฟะตั€ะตะฝะตัะปะฐ ะตั‘ ัะผะตั€ั‚ะธ, ะธ ะฟะพะฒะตัะธะปะฐััŒ...<br>ะœะพะต ัะตั€ะดั†ะต ั€ะฐะทะฑะธะปะพััŒ ะฝะฐ ัั‚ะพ ะพัะบะพะปะบะพะฒ, ะผะพั ะตะดะธะฝัั‚ะฒะตะฝะฝะฐั ะปัŽะฑะพะฒัŒ ะฟะพะณะธะฑะปะฐ.<br>ะ“ะพะฒะพั€ัั‚ ั‡ั‚ะพ ะฒะธะบะธะฝะณะธ ะฑะตััะตั€ะดะตั‡ะฝั‹ะต, ะฝะพ ัั‚ะพ ะฝะต ั‚ะฐะบ. ะœั‹ ั‚ะพะถะต ัƒะผะตะตะผ ะปัŽะฑะธั‚ัŒ!<br>ะ ะฐะฝะฝะตะต ัƒั‚ั€ะพ. ะ’ัะต ะฒะธะบะธะฝะณะธ ะตั‰ั‘ ัะฟัั‚, ะทะฐ ะพะบะฝะพะผ ั…ะพะปะพะดะฝะพ, ัะพะปะฝั†ะต, ะฝะพ ะฒะตั‚ะตั€ ะฒัั‘ ะถะต ะตัั‚ัŒ.<br>ะฏ ะฟั€ะธะพั‚ะบั€ั‹ะป ะณะปะฐะทะฐ, ะธ ะทะฐะผะตั‚ะธะป, ั‡ั‚ะพ ะผะพั ะปัŽะฑะธะผะฐั ั€ะตะฟั‚ะธะปะธั ะฒัั‘ ะตั‰ั‘ ัะฟะธั‚.<br>ะขะฐะบ ั…ะพะปะพะดะฝะพ, ั‡ั‚ะพ ะผะฝะต ะทะฐั…ะพั‚ะตะปะพััŒ ะฒััŽ ะฒะตั‡ะฝะพัั‚ัŒ ะฟั€ะพะปะตะถะฐั‚ัŒ ะฒ ั‚ั‘ะฟะปะพะน ะฟะพัั‚ะตะปัŒะบะต. ะœะพั ะบั€ะพะฒะฐั‚ัŒ ั‚ะฐะบ ะธ ะผะฐะฝะธะปะฐ ะปะตั‡ัŒ, ะธ ะทะฐัั‚ะฐะฒะธั‚ัŒ ัะฟะฐั‚ัŒ. ะะพ, ั‚ัƒั‚ ะผะพะน ั‡ั‘ั€ะฝั‹ะน ะดั€ัƒะณ ะฟั€ะพัะฝัƒะปัั. ะžะฝ ัƒัั‚ะฐะฒะธะปัั ะฝะฐ ะผะตะฝั ัะฒะพะธะผะธ ะฑะพะปัŒัˆะธะผะธ ะทะตะปั‘ะฝั‹ะผะธ ะณะปะฐะทะฐะผะธ.<br>- ะงั‚ะพ? - ะะต ะฟะพะฝะธะผะฐะป ั ั‡ั‚ะพ ะฟั€ะพะธัั…ะพะดะธั‚, ะฝะพ ะฝะฐ ะผะพะน ะฒะพะฟั€ะพั ะ‘ะตะทะทัƒะฑะธะบ ะปะธัˆัŒ ั„ั‹ั€ะบะฝัƒะป.<br>ะะพ ั‚ัƒั‚ ะพะฝ ั€ะฐัะฟั€ะฐะฒะธะป ัะฒะพะธ ะบั€ั‹ะปัŒั, ะธ ะฟะพะดะปะตั‚ะตะป ะบะพ ะผะฝะต. ะ ัะฒะพัŽ ะผะพั€ะดัƒ ะฟะพะปะพะถะธะป ะผะฝะต ะฝะฐ ั€ัƒะบะธ. ะฏ ัะฒะฝะพ ะฝะต ะฟะพะฝะธะผะฐะป ั‡ั‚ะพ ะพะฝ ะพั‚ ะผะตะฝั ั…ะพั‡ะตั‚.<br>ะะพ ั‚ัƒั‚ ะพะฝ ัะฒะพะตะน ะผะพั€ะดะพะน ัƒัั‚ะฐะฒะธะปัั ะฝะฐ ัะฒะพะธ ะบั€ั‹ะปัŒั, ะธ ั…ะฒะพัั‚. ...</code> | <code>1</code> | | <code>ะ”ะตะปะพ ะฑั‹ะปะพ ะฒะตั‡ะตั€ะพะผ, ะบะพะณะดะฐ ั ะพั‚ะฟั€ะฐะฒะปัะปะฐััŒ ะฒ ะณะพัั‚ะธ ะบ ะœะฐั€ั‚ะธะฝะต. ะฏ ะฒะทัะปะฐ ั ัะพะฑะพะน ะคะฐะบัƒะฝะดะพ, ะธ ะšัะฐะฑะธ. 
ะญั‚ะพ ะผะพะธ ััƒะผะฐััˆะตะดัˆะธะต, ะฝะพ ะปัƒั‡ัˆะธะต ะดั€ัƒะทัŒั. ะšะพั€ะพั‡ะต ะบะพั€ะพั‚ะบะพ: ะฏ - ะ›ะพะดะพะฒะธะบะฐ, ะฝะพ ะผะพะถะฝะพ ะฟั€ะพัั‚ะพ ะ›ะพะดะพ.<br>ะขะฐะบ ะบะฐะบ ะขะธะฝะฐ ะถะธะฒั‘ั‚ ะฝะฐ 9 ัั‚ะฐะถะต, ะฝะฐะผ ะฟั€ะธัˆะปะพััŒ ะตั…ะฐั‚ัŒ ะฝะฐ ะปะธั„ั‚ะต, ะธะฝะฐั‡ะต ะฝะฐ ะปะตัั‚ะฝะธั†ะต ะผั‹ ะฑั‹ ะฟะพะดะพั…ะปะธ. ะšะพั€ะพั‡ะต ะทะฐั…ะพะดะธะผ ะฒ ะปะธั„ั‚, ะธ ะฒะพั‚ ะผั‹ ัƒะถะต ะฝะฐ ะฝัƒะถะฝะพะผ ะฝะฐะผ ัั‚ะฐะถะต.<br>ะ”ะฒะตั€ะบะธ ะพั‚ะบั€ั‹ะปะธััŒ, ั ะพั‚ะฒะตั€ะฝัƒะปะฐััŒ ะฝะฐ ะผะธะฝัƒั‚ะบัƒ. ะ ะฟะพั‚ะพะผ ะฟะพะฒะตั€ะฝัƒะปะฐััŒ, ัะผะพั‚ั€ัŽ ะฐ ัั‚ะธั… ะธะดะธะพั‚ะพะฒ ะฝะตั‚ัƒ. ะ’ะดั€ัƒะณ ะดะฒะตั€ะบะธ ะทะฐะบั€ั‹ะปะธััŒ, ั ะดะตะฒัƒัˆะบะฐ ะฝะต ะฟัƒะณะปะธะฒะฐั ะฝะพ, ั‡ั‘ั€ั‚ ะฐ ะฒะดั€ัƒะณ ั ั‚ัƒั‚ ะทะฐัั‚ั€ัะฝัƒ?<br>- ะŸั€ะธะดัƒั€ะบะธ, ะฑะปะธะฝ! ะžั‚ะบั€ะพะนั‚ะต ะปะธั„ั‚! ะ—ะฐัั‚ั€ัะฝัƒ ะฒะตะดัŒ!<br>ะ’ ะพั‚ะฒะตั‚ ั ัƒัะปั‹ัˆะฐะปะฐ ะปะธัˆัŒ ัะผะตั… ะดะฒัƒั… ะฟะฐั€ะฝะตะน. ะัƒ! ะ˜ะผ ะฝะต ะฟะพะทะดะพั€ะพะฒะธั‚ัั ะบะพะณะดะฐ ั ะพั‚ััŽะดะฐ ะฒั‹ะนะดัƒ.<br>ะ’ะดั€ัƒะณ, ะดะฒะตั€ะบะธ ะพั‚ะบั€ั‹ะปะธััŒ, ั ะฒั‹ะปะตะทะปะฐ ะธะท ะปะธั„ั‚ะฐ, ะธ ัั‚ะธ ะดะตะฑะธะปั‹ ะฝะฐัะธะปัŒะฝะพ ะทะฐั‚ะฐัะบะธะฒะฐัŽั‚ ะผะตะฝั ะฒ ะปะธั„ั‚, ะธ ะถะผัƒั‚ ะฝะฐ ะบะฝะพะฟะบัƒ, ั‡ั‚ะพะฑั‹ ะปะธั„ั‚ ะฟะพะตั…ะฐะป ะดะพ 1 ัั‚ะฐะถะฐ.<br>- ะ‘ั‹ัั‚ั€ะตะต! ะคะฐะบัƒ! ะะฐะดะพ ะฑั‹ัั‚ั€ะตะต ะ›ะพะดะพ ัะฟัƒัั‚ะธั‚ัŒัั ะฝะฐ 1 ัั‚ะฐะถ! - ะ—ะฐะบั€ะธั‡ะฐะป ะšัะฐะฑัŒัะฝะธ.<br>- ะŸะพะฝัะป! - ะ’ ะพั‚ะฒะตั‚ ะบั€ะธะบะฝัƒะป ะคะฐะบัƒะฝะดะพ.<br>ะขัƒั‚ ะดะฒะตั€ะบะธ ะทะฐะบั€ั‹ะปะธััŒ, ะธ ะผะตะฝั ะฟะพะฝะตัะปะพ ะฝะฐ 1 ัั‚ะฐะถ. ะงะตั€ะตะท ะฝะตัะบะพะปัŒะบะพ ะผะธะฝัƒั‚ ั ัะฟัƒัั‚ะธะปะฐััŒ ะฝะฐ ะฝัƒะถะฝั‹ะน ัั‚ะฐะถ, ะธ ะฒะดั€ัƒ...</code> | <code>ะ’ะธะพะปะตั‚ั‚ะฐ ะบะฐะบ ะฒัะตะณะดะฐ ัะฟะฐะปะฐ ะฒ ัะฒะพะตะน ะบั€ะพะฒะฐั‚ะธ, ะธ, ะฒ ะพั‡ะตั€ะตะดะฝะพะน ั€ะฐะท ะตะน ัะฝะธะปัั ะบะพัˆะผะฐั€. ะ’ ะพั‡ะตั€ะตะดะฝะพะน ั€ะฐะท ะตะน ัะฝะธะปะฐััŒ ะตะต ะฟะพะบะพะนะฝะฐั ะผะฐั‚ัŒ, ะœะฐั€ะธั. 
ะ’ะธะพะปะตั‚ั‚ะฐ ะฒัั‚ะฐะปะฐ, ะฒัั ะฒัะฟะพั‚ะตะฒัˆะฐั, ะฒัั ะธัะฟัƒะณะฐะฝะฝะฐั.<br>ะ’ะดั€ัƒะณ ะดะฒะตั€ัŒ ะบะพะผะฝะฐั‚ั‹ ะพั‚ะบั€ั‹ะปะฐััŒ, ะธะท ะทะฐ ะดะฒะตั€ะธ ะฟะพะบะฐะทะฐะปัั ัŽะฝะพัˆะฐ. ะžะฝ ะณะปัะดั ะฝะฐ ะ’ะธะพะปะตั‚ั‚ัƒ ะฝะฐั…ะผัƒั€ะธะป ะฑั€ะพะฒะธ, ะธ ะฟะพะดะพัˆั‘ะป ะบ ะฝะตะน.<br>- ะ’ะธะพะปะตั‚ั‚ะฐ, ั‡ั‚ะพ ั ั‚ะพะฑะพะน? - ะกะฟั€ะพัะธะป ะพะฝ.<br>- ะะธั‡ะตะณะพ. ะŸั€ะพัั‚ะพ ะพะฟัั‚ัŒ ะบะพัˆะผะฐั€ ะฟั€ะธัะฝะธะปัั.<br>- ะžะฟัั‚ัŒ?<br>ะคะตะดะตั€ะธะบะพ ัะตะป ะฝะฐ ะบั€ะฐะน ะบั€ะพะฒะฐั‚ะธ, ะธ ะพะฑะฝัะป ะตะต. ะขะฐ ะฝะต ัั‚ะฐะปะฐ ัะพะฟั€ะพั‚ะธะฒะปัั‚ัŒัั. ะžะฝะฐ ะพะฑะฝัะปะฐ ะตะณะพ ะฒ ะพั‚ะฒะตั‚, ัะตะนั‡ะฐั ะตะน ะฝัƒะถะฝะฐ ะฟะพะดะดะตั€ะถะบะฐ. ะžะฟัั‚ัŒ ัะพะฝ, ะพะฟัั‚ัŒ ัะปั‘ะทั‹. ะšะพะณะดะฐ ะถะต ะฑะตะดะฝะพะน ะดะตะฒัƒัˆะบะต ะฟั€ะตะบั€ะฐั‚ะธั‚ัั ัะฝะธั‚ัŒัั ะตะต ะผะฐั‚ัŒ?<br>ะ’ะธะพะปะตั‚ั‚ะฐ ะฒัั‚ะฐะปะฐ ะธะท ัะฒะพะตะน ะฟะพัั‚ะตะปะธ, ะธ ะคะตะดะตั€ะธะบะพ ะฒั‹ัˆะตะป ะธะท ะบะพะผะฝะฐั‚ั‹. ะ”ะตะฒัƒัˆะบะฐ ะฝะฐั‡ะฐะปะฐ ะพะดะตะฒะฐั‚ัŒัั, ะพะดะตะฒัˆะธััŒ ะพะฝะฐ ัะฟัƒัั‚ะธะปะฐััŒ ะฝะฐ ะฟะตั€ะฒั‹ะน ัั‚ะฐะถ, ะฒ ะณะพัั‚ะธะฝัƒัŽ.<br>ะ—ะฐะผะตั‚ะธะฒ ั‡ั‚ะพ ะฝะธะบะพะณะพ ะบั€ะพะผะต ะคะตะดะตั€ะธะบะพ ะฒ ะณะพัั‚ะธะฝะพะน ะฝะตั‚ัƒ, ะพะฝะฐ ะฟั€ะพัะธะปะฐ:<br>- ะ ะณะดะต ะฒัะต?<br>- ะžะปัŒะณะฐ ะฟะพัˆะปะฐ ะฟะพะบัƒะฟะฐั‚ัŒ ะฟั€ะพะดัƒะบั‚ั‹, ะฐ ะ ะพะผะฐะปัŒะพ ะธ ะ“ะตั€ะผะฐะฝ ะฝะฐ ั€ะฐะฑะพั‚ะต.<br>- ะŸะพะฝัั‚ะฝะพ.<br>ะ’ัั‘ ะบะฐะบ ะฒัะตะณะดะฐ, ะฝะธั‡ะตะณะพ ะฝะต ะผะตะฝัะตั‚ัั, ะบั€ะพะผะต ะผะพะธั… ะบะพัˆะผะฐั€ะพะฒ.<br>ะฏ ัะตะปะฐ ะฝะฐ ะดะธะฒะฐะฝ, ะฝะฐะฟั€ะพั‚ะธะฒ ะคะตะดะตั€ะธะบะพ, ะพะฝ ั‡ั‚ะพ ั‚ะพ ะฟะธัะฐะป ะฝะฐ ะฑัƒะผะฐะถะบะต. ะะฐะฒะตั€...</code> | <code>1</code> | | <code>ะ”ะตะปะพ ะฑั‹ะปะพ ะฒะตั‡ะตั€ะพะผ, ะบะพะณะดะฐ ั ะพั‚ะฟั€ะฐะฒะปัะปะฐััŒ ะฒ ะณะพัั‚ะธ ะบ ะœะฐั€ั‚ะธะฝะต. ะฏ ะฒะทัะปะฐ ั ัะพะฑะพะน ะคะฐะบัƒะฝะดะพ, ะธ ะšัะฐะฑะธ. ะญั‚ะพ ะผะพะธ ััƒะผะฐััˆะตะดัˆะธะต, ะฝะพ ะปัƒั‡ัˆะธะต ะดั€ัƒะทัŒั. 
ะšะพั€ะพั‡ะต ะบะพั€ะพั‚ะบะพ: ะฏ - ะ›ะพะดะพะฒะธะบะฐ, ะฝะพ ะผะพะถะฝะพ ะฟั€ะพัั‚ะพ ะ›ะพะดะพ.<br>ะขะฐะบ ะบะฐะบ ะขะธะฝะฐ ะถะธะฒั‘ั‚ ะฝะฐ 9 ัั‚ะฐะถะต, ะฝะฐะผ ะฟั€ะธัˆะปะพััŒ ะตั…ะฐั‚ัŒ ะฝะฐ ะปะธั„ั‚ะต, ะธะฝะฐั‡ะต ะฝะฐ ะปะตัั‚ะฝะธั†ะต ะผั‹ ะฑั‹ ะฟะพะดะพั…ะปะธ. ะšะพั€ะพั‡ะต ะทะฐั…ะพะดะธะผ ะฒ ะปะธั„ั‚, ะธ ะฒะพั‚ ะผั‹ ัƒะถะต ะฝะฐ ะฝัƒะถะฝะพะผ ะฝะฐะผ ัั‚ะฐะถะต.<br>ะ”ะฒะตั€ะบะธ ะพั‚ะบั€ั‹ะปะธััŒ, ั ะพั‚ะฒะตั€ะฝัƒะปะฐััŒ ะฝะฐ ะผะธะฝัƒั‚ะบัƒ. ะ ะฟะพั‚ะพะผ ะฟะพะฒะตั€ะฝัƒะปะฐััŒ, ัะผะพั‚ั€ัŽ ะฐ ัั‚ะธั… ะธะดะธะพั‚ะพะฒ ะฝะตั‚ัƒ. ะ’ะดั€ัƒะณ ะดะฒะตั€ะบะธ ะทะฐะบั€ั‹ะปะธััŒ, ั ะดะตะฒัƒัˆะบะฐ ะฝะต ะฟัƒะณะปะธะฒะฐั ะฝะพ, ั‡ั‘ั€ั‚ ะฐ ะฒะดั€ัƒะณ ั ั‚ัƒั‚ ะทะฐัั‚ั€ัะฝัƒ?<br>- ะŸั€ะธะดัƒั€ะบะธ, ะฑะปะธะฝ! ะžั‚ะบั€ะพะนั‚ะต ะปะธั„ั‚! ะ—ะฐัั‚ั€ัะฝัƒ ะฒะตะดัŒ!<br>ะ’ ะพั‚ะฒะตั‚ ั ัƒัะปั‹ัˆะฐะปะฐ ะปะธัˆัŒ ัะผะตั… ะดะฒัƒั… ะฟะฐั€ะฝะตะน. ะัƒ! ะ˜ะผ ะฝะต ะฟะพะทะดะพั€ะพะฒะธั‚ัั ะบะพะณะดะฐ ั ะพั‚ััŽะดะฐ ะฒั‹ะนะดัƒ.<br>ะ’ะดั€ัƒะณ, ะดะฒะตั€ะบะธ ะพั‚ะบั€ั‹ะปะธััŒ, ั ะฒั‹ะปะตะทะปะฐ ะธะท ะปะธั„ั‚ะฐ, ะธ ัั‚ะธ ะดะตะฑะธะปั‹ ะฝะฐัะธะปัŒะฝะพ ะทะฐั‚ะฐัะบะธะฒะฐัŽั‚ ะผะตะฝั ะฒ ะปะธั„ั‚, ะธ ะถะผัƒั‚ ะฝะฐ ะบะฝะพะฟะบัƒ, ั‡ั‚ะพะฑั‹ ะปะธั„ั‚ ะฟะพะตั…ะฐะป ะดะพ 1 ัั‚ะฐะถะฐ.<br>- ะ‘ั‹ัั‚ั€ะตะต! ะคะฐะบัƒ! ะะฐะดะพ ะฑั‹ัั‚ั€ะตะต ะ›ะพะดะพ ัะฟัƒัั‚ะธั‚ัŒัั ะฝะฐ 1 ัั‚ะฐะถ! - ะ—ะฐะบั€ะธั‡ะฐะป ะšัะฐะฑัŒัะฝะธ.<br>- ะŸะพะฝัะป! - ะ’ ะพั‚ะฒะตั‚ ะบั€ะธะบะฝัƒะป ะคะฐะบัƒะฝะดะพ.<br>ะขัƒั‚ ะดะฒะตั€ะบะธ ะทะฐะบั€ั‹ะปะธััŒ, ะธ ะผะตะฝั ะฟะพะฝะตัะปะพ ะฝะฐ 1 ัั‚ะฐะถ. ะงะตั€ะตะท ะฝะตัะบะพะปัŒะบะพ ะผะธะฝัƒั‚ ั ัะฟัƒัั‚ะธะปะฐััŒ ะฝะฐ ะฝัƒะถะฝั‹ะน ัั‚ะฐะถ, ะธ ะฒะดั€ัƒ...</code> | <code>ะฏ - ะ”ะถะฐะผะธะปั, ะดะพั‡ัŒ ะทะฝะฐั‚ะฝะพะณะพ ะณั€ะฐั„ะฐ. ะœะพั ะผะฐั‚ัŒ ัƒะผะตั€ะปะฐ ะฟั€ะธ ั€ะพะดะฐั…, ะฐ ั ะพัั‚ะฐะปะฐััŒ ะถะธะฒะฐ. ะฃะถะต ะบะฐะบ 20 ะปะตั‚ ะฟั€ะพัˆะปะพ ัะพ ัะผะตั€ั‚ะธ ะปัŽะฑัั‰ะธะน ะผะฐั‚ะตั€ะธ. ะœะพะน ะพั‚ะตั† ัะฝะพะฒะฐ ะถะตะฝะธะปัั ั‡ั‚ะพะฑั‹ ัƒ ะผะตะฝั ะฑั‹ะป ะฟั€ะธะผะตั€ ะดะปั ะฟะพะดั€ะฐะถะฐะฝะธั.<br>ะœะพัŽ ะผะฐั‡ะตั…ัƒ ะทะพะฒัƒั‚ ะญะปะธะทะฐะฑะตั‚. 
ะ’ั€ะพะดะต ะธะผั ะดะพะฑั€ะพะต, ะฐ ัะฐะผะฐ ะถะตะฝั‰ะธะฝะฐ ะฝะต ะธะท ะปัƒั‡ัˆะธั….<br>ะœั‹ ั ะญะปะธะทะฐะฑะตั‚ ะฝะต ะปะฐะดะธะปะธ. ะœะพะน ะพั‚ะตั† ัƒะตั…ะฐะป, ะดะพะผะฐ ะพัั‚ะฐะปะฐััŒ ั ั ะผะฐั‡ะตั…ะพะน, ะบะพั‚ะพั€ะฐั ัะพะฒัะตะผ ะฝะต ะทะฐะฝะธะผะฐะปะฐััŒ ะผะพะธะผ ะฒะพัะฟะธั‚ะฐะฝะธะตะผ ะบะฐะบ ะฟะพั€ัƒั‡ะธะป ะตะน ะผะพะน ะพั‚ะตั†.<br>ะ”ะพะผ ัƒ ะฝะฐั ะฑั‹ะป ะฑะพะณะฐั‚ั‹ะน, ะบั€ะฐัะธะฒั‹ะน. ะ˜ ะผะฝะพะณะพ ัะปัƒะณ.<br>ะŸะพะฟัƒั‚ะฝั‹ะน ะฒะตั‚ะตั€ ะดัƒะตั‚ ะผะฝะต ะฟั€ัะผะพ ะฒ ะปะธั†ะพ. ะ’ ะพะบั€ัƒะณะต ะฟะพัะฐะถะตะฝั‹ ั†ะฒะตั‚ั‹.<br>ะกะตะนั‡ะฐั ั ะฒ ัะฐะดัƒ. ะฏ ะพั‡ะตะฝัŒ ั€ะตะดะบะพ ัƒะปั‹ะฑะฐัŽััŒ, ั‚ะฐะบ ะบะฐะบ ั‚ะฐะบะธั… ั€ะฐะดะพัั‚ะฝั‹ั… ะผะพะผะตะฝั‚ะพะฒ, ัƒ ะผะตะฝั ะฑั‹ะปะพ ะพั‡ะตะฝัŒ ะผะฐะปะพ.<br>ะฏ ั€ะตะดะบะพ ะฒั‹ั…ะพะถัƒ ะธะท ัะฒะพะตะณะพ ะดะพะผะฐ, ะดะฐะถะต ะฟั€ะฐะบั‚ะธั‡ะตัะบะธ ะธะท ัะฒะพะตะน ะบะพะผะฝะฐั‚ั‹ ะฝะต ะฒั‹ั…ะพะถัƒ.<br>ะœะพั ะผะฐั‡ะตั…ะฐ ะพั‡ะตะฝัŒ ั€ะตะดะบะพ ะฒั‹ะฟัƒัะบะฐะตั‚ ะผะตะฝั ะฟะพะดั‹ัˆะฐั‚ัŒ ะฒะพะทะดัƒั…ะพะผ, ะพะฝะฐ ะณะพะฒะพั€ะธั‚ ั‡ั‚ะพ ะผะฝะต ะฝะตะปัŒะทั ะฒั‹ั…ะพะดะธั‚ัŒ ะฝะฐ ัƒะปะธั†ัƒ, ะธ ะพะฑั‰ะฐั‚ัŒัั ั ะปัŽะดัŒะผะธ, ะฟะพะบะฐ ั ะฝะต ะฝะฐัƒั‡ัƒััŒ ะฟั€ะฐะฒะธะปะฐะผะธ ัั‚ะธะบะตั‚ะฐ.<br>ะะตะผะฝะพะณะพ ะฟะพะดั‹ัˆะฐะฒ ะฒะพะทะดัƒั…ะพะผ ั ะทะฐัˆะปะฐ ะฒ ะดะพะผ. ะšะพ ะผะฝะต ัั€ะฐะทัƒ ะถะต ะฟะพะดะฑะตะถะฐะปะฐ ะญะปะธะทะฐะฑะตั‚.<br>ะ“ะปะฐะทะฐ ะตั‘ ะฑั‹ะปะธ ะฝะฐะฟะพะปะฝะตะฝั‹ ะณะฝะตะฒะพะผ. 
ะžะฝะฐ ะฟั€ะพะถะธะณะฐะปะฐ ะผะตะฝั ัะฒะพะธะผ ะทะปะพะฒะตั‰ะธะผ ะฒะทะณะปัะดะพะผ...</code> | <code>1</code> | * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "cos_sim" } ``` ### Training Hyperparameters #### Non-Default Hyperparameters - `eval_strategy`: epoch - `per_device_train_batch_size`: 136 - `per_device_eval_batch_size`: 136 - `weight_decay`: 0.01 - `num_train_epochs`: 5 - `bf16`: True - `load_best_model_at_end`: True - `batch_sampler`: no_duplicates #### All Hyperparameters <details><summary>Click to expand</summary> - `overwrite_output_dir`: False - `do_predict`: False - `eval_strategy`: epoch - `prediction_loss_only`: True - `per_device_train_batch_size`: 136 - `per_device_eval_batch_size`: 136 - `per_gpu_train_batch_size`: None - `per_gpu_eval_batch_size`: None - `gradient_accumulation_steps`: 1 - `eval_accumulation_steps`: None - `torch_empty_cache_steps`: None - `learning_rate`: 5e-05 - `weight_decay`: 0.01 - `adam_beta1`: 0.9 - `adam_beta2`: 0.999 - `adam_epsilon`: 1e-08 - `max_grad_norm`: 1.0 - `num_train_epochs`: 5 - `max_steps`: -1 - `lr_scheduler_type`: linear - `lr_scheduler_kwargs`: {} - `warmup_ratio`: 0.0 - `warmup_steps`: 0 - `log_level`: passive - `log_level_replica`: warning - `log_on_each_node`: True - `logging_nan_inf_filter`: True - `save_safetensors`: True - `save_on_each_node`: False - `save_only_model`: False - `restore_callback_states_from_checkpoint`: False - `no_cuda`: False - `use_cpu`: False - `use_mps_device`: False - `seed`: 42 - `data_seed`: None - `jit_mode_eval`: False - `use_ipex`: False - `bf16`: True - `fp16`: False - `fp16_opt_level`: O1 - `half_precision_backend`: auto - `bf16_full_eval`: False - `fp16_full_eval`: False - `tf32`: None - `local_rank`: 0 - `ddp_backend`: None - `tpu_num_cores`: None - `tpu_metrics_debug`: False - `debug`: [] - 
`dataloader_drop_last`: False - `dataloader_num_workers`: 0 - `dataloader_prefetch_factor`: None - `past_index`: -1 - `disable_tqdm`: False - `remove_unused_columns`: True - `label_names`: None - `load_best_model_at_end`: True - `ignore_data_skip`: False - `fsdp`: [] - `fsdp_min_num_params`: 0 - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} - `fsdp_transformer_layer_cls_to_wrap`: None - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} - `deepspeed`: None - `label_smoothing_factor`: 0.0 - `optim`: adamw_torch - `optim_args`: None - `adafactor`: False - `group_by_length`: False - `length_column_name`: length - `ddp_find_unused_parameters`: None - `ddp_bucket_cap_mb`: None - `ddp_broadcast_buffers`: False - `dataloader_pin_memory`: True - `dataloader_persistent_workers`: False - `skip_memory_metrics`: True - `use_legacy_prediction_loop`: False - `push_to_hub`: False - `resume_from_checkpoint`: None - `hub_model_id`: None - `hub_strategy`: every_save - `hub_private_repo`: None - `hub_always_push`: False - `gradient_checkpointing`: False - `gradient_checkpointing_kwargs`: None - `include_inputs_for_metrics`: False - `include_for_metrics`: [] - `eval_do_concat_batches`: True - `fp16_backend`: auto - `push_to_hub_model_id`: None - `push_to_hub_organization`: None - `mp_parameters`: - `auto_find_batch_size`: False - `full_determinism`: False - `torchdynamo`: None - `ray_scope`: last - `ddp_timeout`: 1800 - `torch_compile`: False - `torch_compile_backend`: None - `torch_compile_mode`: None - `include_tokens_per_second`: False - `include_num_input_tokens_seen`: False - `neftune_noise_alpha`: None - `optim_target_modules`: None - `batch_eval_metrics`: False - `eval_on_start`: False - `use_liger_kernel`: False - `eval_use_gather_object`: False - `average_tokens_across_devices`: False - 
`prompts`: None - `batch_sampler`: no_duplicates - `multi_dataset_batch_sampler`: proportional </details> ### Training Logs <details><summary>Click to expand</summary> | Epoch | Step | Training Loss | Validation Loss | cosine_ap | |:-------:|:---------:|:-------------:|:---------------:|:----------:| | 0.0491 | 100 | 3.2263 | - | - | | 0.0983 | 200 | 2.8883 | - | - | | 0.1474 | 300 | 2.7172 | - | - | | 0.1966 | 400 | 2.6694 | - | - | | 0.2457 | 500 | 2.5956 | - | - | | 0.2948 | 600 | 2.5362 | - | - | | 0.3440 | 700 | 2.5064 | - | - | | 0.3931 | 800 | 2.4516 | - | - | | 0.4423 | 900 | 2.4311 | - | - | | 0.4914 | 1000 | 2.4303 | - | - | | 0.5405 | 1100 | 2.3851 | - | - | | 0.5897 | 1200 | 2.3797 | - | - | | 0.6388 | 1300 | 2.3416 | - | - | | 0.6880 | 1400 | 2.3335 | - | - | | 0.7371 | 1500 | 2.3025 | - | - | | 0.7862 | 1600 | 2.2933 | - | - | | 0.8354 | 1700 | 2.3108 | - | - | | 0.8845 | 1800 | 2.273 | - | - | | 0.9337 | 1900 | 2.237 | - | - | | 0.9828 | 2000 | 2.2317 | - | - | | 1.0 | 2035 | - | 7.5524 | 0.8340 | | 1.0319 | 2100 | 2.1496 | - | - | | 1.0811 | 2200 | 2.0707 | - | - | | 1.1302 | 2300 | 2.0904 | - | - | | 1.1794 | 2400 | 2.066 | - | - | | 1.2285 | 2500 | 2.0367 | - | - | | 1.2776 | 2600 | 2.0456 | - | - | | 1.3268 | 2700 | 2.0691 | - | - | | 1.3759 | 2800 | 2.0345 | - | - | | 1.4251 | 2900 | 2.0432 | - | - | | 1.4742 | 3000 | 2.03 | - | - | | 1.5233 | 3100 | 2.0081 | - | - | | 1.5725 | 3200 | 2.0012 | - | - | | 1.6216 | 3300 | 1.9942 | - | - | | 1.6708 | 3400 | 1.9842 | - | - | | 1.7199 | 3500 | 1.9824 | - | - | | 1.7690 | 3600 | 1.9665 | - | - | | 1.8182 | 3700 | 1.9583 | - | - | | 1.8673 | 3800 | 1.975 | - | - | | 1.9165 | 3900 | 1.9307 | - | - | | 1.9656 | 4000 | 1.9338 | - | - | | 2.0 | 4070 | - | 7.6937 | 0.8397 | | 2.0147 | 4100 | 1.9082 | - | - | | 2.0639 | 4200 | 1.7883 | - | - | | 2.1130 | 4300 | 1.777 | - | - | | 2.1622 | 4400 | 1.7939 | - | - | | 2.2113 | 4500 | 1.8113 | - | - | | 2.2604 | 4600 | 1.783 | - | - | | 2.3096 | 4700 | 1.7794 | - | 
- | | 2.3587 | 4800 | 1.7704 | - | - | | 2.4079 | 4900 | 1.7864 | - | - | | 2.4570 | 5000 | 1.7766 | - | - | | 2.5061 | 5100 | 1.7344 | - | - | | 2.5553 | 5200 | 1.7621 | - | - | | 2.6044 | 5300 | 1.7413 | - | - | | 2.6536 | 5400 | 1.7428 | - | - | | 2.7027 | 5500 | 1.7287 | - | - | | 2.7518 | 5600 | 1.7603 | - | - | | 2.8010 | 5700 | 1.7455 | - | - | | 2.8501 | 5800 | 1.7561 | - | - | | 2.8993 | 5900 | 1.763 | - | - | | 2.9484 | 6000 | 1.7152 | - | - | | 2.9975 | 6100 | 1.7227 | - | - | | 3.0 | 6105 | - | 7.4898 | 0.8378 | | 3.0467 | 6200 | 1.6102 | - | - | | 3.0958 | 6300 | 1.609 | - | - | | 3.1450 | 6400 | 1.5979 | - | - | | 3.1941 | 6500 | 1.5864 | - | - | | 3.2432 | 6600 | 1.6182 | - | - | | 3.2924 | 6700 | 1.609 | - | - | | 3.3415 | 6800 | 1.6004 | - | - | | 3.3907 | 6900 | 1.5681 | - | - | | 3.4398 | 7000 | 1.582 | - | - | | 3.4889 | 7100 | 1.6019 | - | - | | 3.5381 | 7200 | 1.5959 | - | - | | 3.5872 | 7300 | 1.608 | - | - | | 3.6364 | 7400 | 1.5961 | - | - | | 3.6855 | 7500 | 1.5751 | - | - | | 3.7346 | 7600 | 1.5572 | - | - | | 3.7838 | 7700 | 1.5721 | - | - | | 3.8329 | 7800 | 1.5519 | - | - | | 3.8821 | 7900 | 1.5431 | - | - | | 3.9312 | 8000 | 1.5659 | - | - | | 3.9803 | 8100 | 1.5586 | - | - | | 4.0 | 8140 | - | 7.4912 | 0.8411 | | 4.0295 | 8200 | 1.4872 | - | - | | 4.0786 | 8300 | 1.4554 | - | - | | 4.1278 | 8400 | 1.4866 | - | - | | 4.1769 | 8500 | 1.4601 | - | - | | 4.2260 | 8600 | 1.4879 | - | - | | 4.2752 | 8700 | 1.4872 | - | - | | 4.3243 | 8800 | 1.4797 | - | - | | 4.3735 | 8900 | 1.4861 | - | - | | 4.4226 | 9000 | 1.4578 | - | - | | 4.4717 | 9100 | 1.4648 | - | - | | 4.5209 | 9200 | 1.4625 | - | - | | 4.5700 | 9300 | 1.4596 | - | - | | 4.6192 | 9400 | 1.4689 | - | - | | 4.6683 | 9500 | 1.4475 | - | - | | 4.7174 | 9600 | 1.4452 | - | - | | 4.7666 | 9700 | 1.4561 | - | - | | 4.8157 | 9800 | 1.4471 | - | - | | 4.8649 | 9900 | 1.4256 | - | - | | 4.9140 | 10000 | 1.4613 | - | - | | 4.9631 | 10100 | 1.4794 | - | - | | **5.0** | **10175** | **-** | 
**7.399** | **0.8412** | * The bold row denotes the saved checkpoint. </details> ### Framework Versions - Python: 3.10.18 - Sentence Transformers: 4.1.0 - Transformers: 4.52.4 - PyTorch: 2.7.1+cu128 - Accelerate: 1.7.0 - Datasets: 3.6.0 - Tokenizers: 0.21.1 ## Citation ### BibTeX #### Sentence Transformers ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "https://arxiv.org/abs/1908.10084", } ``` #### MultipleNegativesRankingLoss ```bibtex @misc{henderson2017efficient, title={Efficient Natural Language Response Suggestion for Smart Reply}, author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil}, year={2017}, eprint={1705.00652}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
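The `MultipleNegativesRankingLoss` used for this model (scale 20.0, cosine similarity) treats every other in-batch positive as a negative for a given anchor. A minimal plain-Python sketch of the computation for one batch — the `cos_sim` and `mnrl_loss` helpers are illustrative, not the sentence-transformers implementation:

```python
import math

def cos_sim(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def mnrl_loss(anchors, positives, scale=20.0):
    """Multiple-negatives ranking loss: for anchor i, the paired
    positive i is the target class among all in-batch positives."""
    losses = []
    for i, a in enumerate(anchors):
        logits = [scale * cos_sim(a, p) for p in positives]
        log_z = math.log(sum(math.exp(l) for l in logits))
        losses.append(log_z - logits[i])  # negative log-softmax at the true pair
    return sum(losses) / len(losses)

anchors = [[1.0, 0.0], [0.0, 1.0]]
positives = [[0.9, 0.1], [0.1, 0.9]]
print(mnrl_loss(anchors, positives))  # small loss: each pair aligns
```

With `label` always 1, the loss only needs (sentence1, sentence2) pairs; the in-batch sampling supplies the negatives, which is why the `no_duplicates` batch sampler matters.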
ekiprop/bert-wnli-3-epochs-2025-06-17-0808
ekiprop
2025-06-17T08:09:22Z
0
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "generated_from_trainer", "base_model:google-bert/bert-base-uncased", "base_model:finetune:google-bert/bert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2025-06-17T08:08:36Z
--- library_name: transformers license: apache-2.0 base_model: bert-base-uncased tags: - generated_from_trainer metrics: - accuracy model-index: - name: bert-wnli-3-epochs-2025-06-17-0808 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-wnli-3-epochs-2025-06-17-0808 This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.7137 - Accuracy: 0.4085 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-06 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 40 | 0.7273 | 0.4225 | | No log | 2.0 | 80 | 0.7149 | 0.4085 | | No log | 3.0 | 120 | 0.7137 | 0.4085 | ### Framework versions - Transformers 4.52.4 - Pytorch 2.7.0+cu128 - Datasets 3.6.0 - Tokenizers 0.21.1
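With `lr_scheduler_type: linear`, no warmup, a peak learning rate of 5e-06, and 120 total steps (40 per epoch over 3 epochs), the learning rate decays linearly to zero. A hedged sketch of that schedule — the helper name is ours, not a Transformers API:

```python
def linear_lr(step, total_steps=120, peak_lr=5e-06, warmup_steps=0):
    """Linear decay with optional warmup (illustrative sketch,
    not the Transformers scheduler internals)."""
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return peak_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))    # 5e-06 at the start
print(linear_lr(60))   # 2.5e-06, half the peak midway
print(linear_lr(120))  # 0.0 at the end
```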
songkey/hm5b_reference
songkey
2025-06-17T07:51:08Z
12
0
diffusers
[ "diffusers", "safetensors", "arxiv:2410.22901", "base_model:stable-diffusion-v1-5/stable-diffusion-v1-5", "base_model:finetune:stable-diffusion-v1-5/stable-diffusion-v1-5", "license:mit", "region:us" ]
null
2025-06-12T10:04:48Z
--- base_model: - stable-diffusion-v1-5/stable-diffusion-v1-5 library_name: diffusers license: mit --- Model of [**HelloMeme**](https://songkey.github.io/hellomeme/) [**Project Page**](https://songkey.github.io/hellomeme/) | [**Code Page**](https://github.com/HelloVision/HelloMeme) | [**Arxiv**](https://arxiv.org/abs/2410.22901) | [**ComfyUI**](https://github.com/HelloVision/ComfyUI_HelloMeme) | [**Demo**](https://www.modelscope.cn/studios/songkey/HelloMeme) **BibTeX:** ```bibtex @misc{zhang2024hellomemeintegratingspatialknitting, title={HelloMeme: Integrating Spatial Knitting Attentions to Embed High-Level and Fidelity-Rich Conditions in Diffusion Models}, author={Shengkai Zhang and Nianhong Jiao and Tian Li and Chaojie Yang and Chenhui Xue and Boya Niu and Jun Gao}, year={2024}, eprint={2410.22901}, archivePrefix={arXiv}, primaryClass={cs.CV}, url={https://arxiv.org/abs/2410.22901}, } ```

talzoomanzoo/LIMO-full-Qwen-2.5-1.5B-Instruct
talzoomanzoo
2025-06-17T07:45:53Z
11
0
transformers
[ "transformers", "safetensors", "qwen2", "text-generation", "llama-factory", "generated_from_trainer", "conversational", "zho", "eng", "fra", "spa", "por", "deu", "ita", "rus", "jpn", "kor", "vie", "tha", "ara", "base_model:Qwen/Qwen2.5-1.5B-Instruct", "base_model:finetune:Q...
text-generation
2025-03-24T00:57:43Z
---
library_name: transformers
license: apache-2.0
base_model: Qwen/Qwen2.5-1.5B-Instruct
tags:
- llama-factory
- generated_from_trainer
language:
- zho
- eng
- fra
- spa
- por
- deu
- ita
- rus
- jpn
- kor
- vie
- tha
- ara
model-index:
- name: LIMO-full-Qwen-2.5-1.5B-Instruct
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# LIMO-full-Qwen-2.5-1.5B-Instruct

This model is a fine-tuned version of [Qwen/Qwen2.5-1.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-1.5B-Instruct) on an unknown dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 15

### Training results

### Framework versions

- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
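The cosine scheduler listed above anneals the learning rate from 5e-06 toward zero over training. A hedged pure-Python sketch of that annealing curve; the total step count is illustrative, since the card does not report one, and zero warmup is assumed:

```python
import math

# Sketch of a cosine learning-rate decay as listed in the hyperparameters.
# total_steps is illustrative; the card does not report the real step count.
def cosine_lr(step, total_steps, base_lr=5e-06):
    progress = min(step, total_steps) / total_steps
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0, 1000))     # base LR at the start
print(cosine_lr(500, 1000))   # half the base LR at the midpoint
print(cosine_lr(1000, 1000))  # approaches 0 at the end
```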
veddhanth/lora-trained-xl-stage-2-finetuned-enc-v2-spat-1989-map-3
veddhanth
2025-06-17T07:24:12Z
1
0
diffusers
[ "diffusers", "tensorboard", "text-to-image", "diffusers-training", "lora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0", "license:openrail++", "re...
text-to-image
2025-06-17T07:14:38Z
---
base_model: stabilityai/stable-diffusion-xl-base-1.0
library_name: diffusers
license: openrail++
instance_prompt: a realistic portrait of sks face
widget: []
tags:
- text-to-image
- diffusers-training
- diffusers
- lora
- template:sd-lora
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
---

<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->

# SDXL LoRA DreamBooth - veddhanth/lora-trained-xl-stage-2-finetuned-enc-v2-spat-1989-map-3

<Gallery />

## Model description

These are veddhanth/lora-trained-xl-stage-2-finetuned-enc-v2-spat-1989-map-3 LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.

The weights were trained using [DreamBooth](https://dreambooth.github.io/).

LoRA for the text encoder was enabled: True.

Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.

## Trigger words

You should use `a realistic portrait of sks face` to trigger the image generation.

## Download model

Weights for this model are available in Safetensors format.

[Download](veddhanth/lora-trained-xl-stage-2-finetuned-enc-v2-spat-1989-map-3/tree/main) them in the Files & versions tab.

## Intended uses & limitations

#### How to use

A minimal sketch, not from the training script; it assumes the standard diffusers LoRA-loading flow and a CUDA device:

```python
# Sketch: load the SDXL base pipeline and apply these LoRA weights.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")
pipe.load_lora_weights("veddhanth/lora-trained-xl-stage-2-finetuned-enc-v2-spat-1989-map-3")

# The trigger phrase from the card must appear in the prompt.
image = pipe("a realistic portrait of sks face").images[0]
image.save("portrait.png")
```

#### Limitations and bias

[TODO: provide examples of latent issues and potential remediations]

## Training details

[TODO: describe the data used to train the model]
Megha06/PixelcopterEnv
Megha06
2025-06-17T07:23:23Z
0
0
null
[ "Pixelcopter-PLE-v0", "reinforce", "reinforcement-learning", "custom-implementation", "deep-rl-class", "model-index", "region:us" ]
reinforcement-learning
2025-06-17T06:38:11Z
---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: PixelcopterEnv
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: Pixelcopter-PLE-v0
      type: Pixelcopter-PLE-v0
    metrics:
    - type: mean_reward
      value: 42.70 +/- 62.51
      name: mean_reward
      verified: false
---

# **Reinforce** Agent playing **Pixelcopter-PLE-v0**

This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
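A Reinforce agent weights each action's log-probability by the discounted return from that step onward. A minimal sketch of that return computation — a generic illustration of the algorithm, not the exact course or training code:

```python
# Discounted returns G_t = r_t + gamma * G_{t+1}, computed backwards over
# one episode's rewards. Generic REINFORCE helper; the actual agent code
# from the course is not reproduced here.
def discounted_returns(rewards, gamma=0.99):
    returns = []
    g = 0.0
    for r in reversed(rewards):
        g = r + gamma * g
        returns.append(g)
    returns.reverse()
    return returns

print(discounted_returns([1, 1, 1], gamma=1.0))  # [3.0, 2.0, 1.0]
```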
music991758/uuu_fine_tune_taipower
music991758
2025-06-17T07:20:20Z
5
0
null
[ "safetensors", "gpt2", "license:apache-2.0", "region:us" ]
null
2025-06-17T06:11:36Z
---
license: apache-2.0
---
HsuHuggingFace/uuu_fine_tune_gpt2
HsuHuggingFace
2025-06-17T07:18:09Z
2
0
null
[ "safetensors", "gpt2", "license:apache-2.0", "region:us" ]
null
2025-06-17T06:20:42Z
---
license: apache-2.0
---
HedyKoala17/uuu_fine_tune_taipower
HedyKoala17
2025-06-17T07:10:45Z
5
0
null
[ "safetensors", "gpt2", "license:apache-2.0", "region:us" ]
null
2025-06-17T06:18:43Z
---
license: apache-2.0
---
krs375/uuu_fine_tune_taipower
krs375
2025-06-17T07:00:39Z
5
0
null
[ "safetensors", "gpt2", "license:apache-2.0", "region:us" ]
null
2025-06-17T06:23:24Z
---
license: apache-2.0
---
LarryAIDraw/Char_Honkai_Bronya_V3_Pony
LarryAIDraw
2025-06-17T06:57:12Z
0
0
null
[ "license:creativeml-openrail-m", "region:us" ]
null
2025-06-17T06:32:13Z
---
license: creativeml-openrail-m
---

https://civitai.com/models/130516/bronya-zaychik-4in1-ponyxl15?modelVersionId=1900974