modelId string | author string | last_modified timestamp[us, tz=UTC] | downloads int64 | likes int64 | library_name string | tags list | pipeline_tag string | createdAt timestamp[us, tz=UTC] | card string |
|---|---|---|---|---|---|---|---|---|---|
NamVo/qwen_r1_mini | NamVo | 2025-06-17T16:12:12Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"generated_from_trainer",
"trl",
"grpo",
"arxiv:2402.03300",
"base_model:Qwen/Qwen2.5-3B-Instruct",
"base_model:finetune:Qwen/Qwen2.5-3B-Instruct",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T08:14:40Z | ---
base_model: Qwen/Qwen2.5-3B-Instruct
library_name: transformers
model_name: qwen_r1_mini
tags:
- generated_from_trainer
- trl
- grpo
licence: license
---
# Model Card for qwen_r1_mini
This model is a fine-tuned version of [Qwen/Qwen2.5-3B-Instruct](https://huggingface.co/Qwen/Qwen2.5-3B-Instruct).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="NamVo/qwen_r1_mini", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/nvoz1812/huggingface/runs/n9ury0fe)
This model was trained with GRPO, a method introduced in [DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models](https://huggingface.co/papers/2402.03300).
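As an illustration of the group-relative idea behind GRPO — advantages are computed by normalizing each sampled completion's reward against its sampling group, rather than against a learned value baseline — here is a minimal sketch (not the TRL implementation, which also handles token masking, clipping, and KL terms):

```python
import statistics

def group_relative_advantages(rewards):
    """Normalize each completion's reward against its sampling group.

    Sketch of the group-relative baseline used by GRPO; illustrative only.
    """
    mu = statistics.mean(rewards)
    sigma = statistics.pstdev(rewards) or 1.0  # avoid div-by-zero for uniform groups
    return [(r - mu) / sigma for r in rewards]
```

For a group of rewards like `[1.0, 0.0, 1.0, 0.0]`, the resulting advantages sum to zero, pushing probability toward above-average completions.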
### Framework versions
- TRL: 0.18.2
- Transformers: 4.52.4
- Pytorch: 2.7.1+cu128
- Datasets: 3.6.0
- Tokenizers: 0.21.1
## Citations
Cite GRPO as:
```bibtex
@article{zhihong2024deepseekmath,
title = {{DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models}},
author = {Zhihong Shao and Peiyi Wang and Qihao Zhu and Runxin Xu and Junxiao Song and Mingchuan Zhang and Y. K. Li and Y. Wu and Daya Guo},
year = 2024,
eprint = {arXiv:2402.03300},
}
```
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
facebook/vjepa2-vith-fpc64-256 | facebook | 2025-06-17T16:09:19Z | 409 | 11 | transformers | [
"transformers",
"safetensors",
"vjepa2",
"feature-extraction",
"video",
"video-classification",
"license:mit",
"endpoints_compatible",
"region:us"
] | video-classification | 2025-05-31T09:02:18Z | ---
license: mit
pipeline_tag: video-classification
tags:
- video
library_name: transformers
---
# V-JEPA 2
A frontier video understanding model developed by FAIR, Meta. V-JEPA 2 extends the pretraining objectives of [VJEPA](https://ai.meta.com/blog/v-jepa-yann-lecun-ai-model-video-joint-embedding-predictive-architecture/), leveraging data and model size at scale to deliver state-of-the-art video understanding.
The code is released [in this repository](https://github.com/facebookresearch/vjepa2).
<img src="https://dl.fbaipublicfiles.com/vjepa2/vjepa2-pretrain.gif">
## Installation
To run the V-JEPA 2 model, make sure you have the latest version of transformers installed:
```bash
pip install -U git+https://github.com/huggingface/transformers
```
## Intended Uses
V-JEPA 2 is intended to represent any video (or image) for video classification and retrieval, or to serve as a video encoder for VLMs.
```python
from transformers import AutoVideoProcessor, AutoModel
hf_repo = "facebook/vjepa2-vith-fpc64-256"
model = AutoModel.from_pretrained(hf_repo)
processor = AutoVideoProcessor.from_pretrained(hf_repo)
```
To load a video, sample as many frames as the model expects. For this model, we use 64.
```python
import torch
from torchcodec.decoders import VideoDecoder
import numpy as np
video_url = "https://huggingface.co/datasets/nateraw/kinetics-mini/resolve/main/val/archery/-Qz25rXdMjE_000014_000024.mp4"
vr = VideoDecoder(video_url)
frame_idx = np.arange(0, 64) # choose the first 64 frames; you can define a more complex sampling strategy here
video = vr.get_frames_at(indices=frame_idx).data # T x C x H x W
video = processor(video, return_tensors="pt").to(model.device)
with torch.no_grad():
video_embeddings = model.get_vision_features(**video)
print(video_embeddings.shape)
```
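The `np.arange` call above simply takes the first 64 frames. A sketch of sampling frames uniformly across the whole clip instead (a hypothetical helper, not part of the model's API):

```python
import numpy as np

def uniform_frame_indices(total_frames: int, num_frames: int = 64) -> np.ndarray:
    # evenly spaced indices spanning the whole clip, so long videos are
    # not truncated to their opening seconds
    return np.linspace(0, total_frames - 1, num_frames).round().astype(int)
```

The resulting indices can be passed to `vr.get_frames_at(indices=...)` in place of `frame_idx`.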
To encode an image, simply repeat it for the desired number of frames.
```python
from transformers.image_utils import load_image
image = load_image("https://huggingface.co/datasets/merve/coco/resolve/main/val2017/000000000285.jpg")
pixel_values = processor(image, return_tensors="pt").to(model.device)["pixel_values_videos"]
pixel_values = pixel_values.repeat(1, 16, 1, 1, 1) # repeating image 16 times
with torch.no_grad():
image_embeddings = model.get_vision_features(pixel_values)
print(image_embeddings.shape)
```
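For the retrieval use case mentioned under Intended Uses, embeddings can be compared after pooling. This is an illustrative sketch; mean pooling is an assumption, as the card does not prescribe a pooling strategy:

```python
import numpy as np

def cosine_score(emb_a: np.ndarray, emb_b: np.ndarray) -> float:
    # mean-pool token/patch embeddings (T x D) to single vectors,
    # then compare with cosine similarity
    a = emb_a.mean(axis=0)
    b = emb_b.mean(axis=0)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))
```

For example, `video_embeddings[0]` and `image_embeddings[0]` (moved to CPU numpy) could be scored this way to rank retrieval candidates.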
For more code examples, please refer to the V-JEPA 2 documentation.
### Citation
```
@techreport{assran2025vjepa2,
title={V-JEPA~2: Self-Supervised Video Models Enable Understanding, Prediction and Planning},
author={Assran, Mahmoud and Bardes, Adrien and Fan, David and Garrido, Quentin and Howes, Russell and
Komeili, Mojtaba and Muckley, Matthew and Rizvi, Ammar and Roberts, Claire and Sinha, Koustuv and Zholus, Artem and
Arnaud, Sergio and Gejji, Abha and Martin, Ada and Robert Hogan, Francois and Dugas, Daniel and
Bojanowski, Piotr and Khalidov, Vasil and Labatut, Patrick and Massa, Francisco and Szafraniec, Marc and
Krishnakumar, Kapil and Li, Yong and Ma, Xiaodong and Chandar, Sarath and Meier, Franziska and LeCun, Yann and
Rabbat, Michael and Ballas, Nicolas},
institution={FAIR at Meta},
year={2025}
}
``` |
Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF | Triangle104 | 2025-06-17T16:06:13Z | 0 | 0 | null | [
"gguf",
"llama-cpp",
"gguf-my-repo",
"dataset:nbeerbower/human-writing-dpo",
"base_model:nbeerbower/Vitus-Qwen3-14B",
"base_model:quantized:nbeerbower/Vitus-Qwen3-14B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-06-17T15:58:37Z | ---
license: apache-2.0
datasets:
- nbeerbower/human-writing-dpo
base_model: nbeerbower/Vitus-Qwen3-14B
tags:
- llama-cpp
- gguf-my-repo
---
# Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF
This model was converted to GGUF format from [`nbeerbower/Vitus-Qwen3-14B`](https://huggingface.co/nbeerbower/Vitus-Qwen3-14B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/nbeerbower/Vitus-Qwen3-14B) for more details on the model.
---
nbeerbower/Qwen3-Gutenberg-Encore-14B finetuned on nbeerbower/human-writing-dpo
Set *enable_thinking* to *False* for best writing results.
## Method
ORPO tuned with 1x RTX A6000 for 2 epochs.
---
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF --hf-file vitus-qwen3-14b-q4_k_s.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF --hf-file vitus-qwen3-14b-q4_k_s.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (e.g., `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF --hf-file vitus-qwen3-14b-q4_k_s.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo Triangle104/Vitus-Qwen3-14B-Q4_K_S-GGUF --hf-file vitus-qwen3-14b-q4_k_s.gguf -c 2048
```
|
Lelon/scope-it-conan | Lelon | 2025-06-17T15:54:38Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"eurobert",
"token-classification",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | token-classification | 2025-06-17T15:54:03Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
FormlessAI/8d476f3a-d931-447f-a02d-e4cc862c9a3a | FormlessAI | 2025-06-17T15:38:18Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"generated_from_trainer",
"trl",
"sft",
"conversational",
"base_model:lcw99/zephykor-ko-7b-chang",
"base_model:finetune:lcw99/zephykor-ko-7b-chang",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"re... | text-generation | 2025-06-17T11:22:40Z | ---
base_model: lcw99/zephykor-ko-7b-chang
library_name: transformers
model_name: 8d476f3a-d931-447f-a02d-e4cc862c9a3a
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for 8d476f3a-d931-447f-a02d-e4cc862c9a3a
This model is a fine-tuned version of [lcw99/zephykor-ko-7b-chang](https://huggingface.co/lcw99/zephykor-ko-7b-chang).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="FormlessAI/8d476f3a-d931-447f-a02d-e4cc862c9a3a", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/phoenix-formless/Gradients/runs/1qeyhap3)
This model was trained with SFT.
### Framework versions
- TRL: 0.18.1
- Transformers: 4.52.4
- Pytorch: 2.7.0+cu128
- Datasets: 3.6.0
- Tokenizers: 0.21.1
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
Videos-jobz-hunting-sajal-malik-19k/ATCH.jobz.hunting.sajal.malik.viral.video.original | Videos-jobz-hunting-sajal-malik-19k | 2025-06-17T15:37:29Z | 0 | 0 | null | [
"region:us"
] | null | 2025-06-17T15:34:16Z | [🔴 ►► Click Here (Full video link)](https://videohere.top/?jobz-hunting-sajal-malik)
[►✅ CLICK HERE ==►► Full Video ❤️❤️⬇️⬇️](https://videohere.top/?jobz-hunting-sajal-malik)
[<img alt="fsd" src="http://i.postimg.cc/qvPp49Sm/ythngythg.gif">](https://videohere.top/?jobz-hunting-sajal-malik) |
jimmyarfs/bert-hate-speech-test | jimmyarfs | 2025-06-17T15:32:51Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2025-06-17T15:32:28Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: bert-hate-speech-test
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-hate-speech-test
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5909
- Accuracy: 0.7018
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
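As a worked illustration of the `linear` scheduler with the learning rate listed above (assuming no warmup, which the card does not state):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 5e-05) -> float:
    # linearly decay from base_lr at step 0 down to 0 at total_steps
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

Halfway through training the learning rate would be 2.5e-05, reaching 0 at the final step.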
### Training results
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
|
Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF | Triangle104 | 2025-06-17T15:23:45Z | 0 | 0 | transformers | [
"transformers",
"gguf",
"llama-cpp",
"gguf-my-repo",
"text-generation",
"en",
"base_model:ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast",
"base_model:quantized:ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2025-06-17T15:08:33Z | ---
license: apache-2.0
thumbnail: https://cdn-uploads.huggingface.co/production/uploads/6625f4a8a8d1362ebcc3851a/hIZ2ZcaDyfYLT9Yd4pfOs.jpeg
language:
- en
base_model: ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast
library_name: transformers
pipeline_tag: text-generation
tags:
- llama-cpp
- gguf-my-repo
---
# Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF
This model was converted to GGUF format from [`ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast`](https://huggingface.co/ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast) for more details on the model.
---
RpR (RolePlay with Reasoning) is a new series of models from ArliAI. This series builds directly upon the successful dataset curation methodology and training methods developed for the RPMax series.
RpR models use the same curated, deduplicated RP and creative writing dataset used for RPMax, with a focus on variety to ensure high creativity and minimize cross-context repetition. Users familiar with RPMax will recognize the unique, non-repetitive writing style unlike other finetuned-for-RP models.
With the release of QwQ as the first high-performing open-source reasoning model that can be easily trained, it was clear that the available instruct and creative writing reasoning datasets contain only one response per example. Training reasoning models on this type of single-response dataset degrades output quality in long multi-turn chats, which is why Arli AI decided to create a real RP model capable of long multi-turn chat with reasoning.
In order to create RpR, we first had to build the reasoning RP dataset by re-processing our existing known-good RPMax dataset into a reasoning dataset. This was done by using the base QwQ Instruct model itself to create the reasoning process for every turn in the RPMax conversation examples, which was then further refined to make sure the reasoning is in line with the actual response examples from the dataset.
Another important thing to get right is making sure the model is trained on examples that present reasoning blocks the same way it encounters them during inference: that is, never seeing the reasoning blocks in its context. To achieve this, the training run was completed using axolotl with a manual template-free segments dataset, so that the model is never trained to see the reasoning block in its context, just as at inference time.
The result of training on this dataset with this method is consistently coherent and interesting outputs, even in long multi-turn RP chats. As far as we know, this is the first correctly-trained reasoning model for RP and creative writing.
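Keeping reasoning blocks out of the model's context at inference time can be done client-side by stripping prior think blocks before each request — a sketch assuming the Qwen-style `<think>...</think>` tag format:

```python
import re

def strip_reasoning(history):
    # remove <think>...</think> blocks from prior assistant turns so the
    # model never sees old reasoning in its context (tag format assumed)
    cleaned = []
    for msg in history:
        if msg["role"] == "assistant":
            content = re.sub(r"<think>.*?</think>\s*", "", msg["content"], flags=re.DOTALL)
            msg = {**msg, "content": content}
        cleaned.append(msg)
    return cleaned
```

Only the current turn's fresh reasoning is then ever generated; past reasoning never re-enters the prompt.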
---
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q6_k.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q6_k.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (e.g., `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q6_k.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q6_K-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q6_k.gguf -c 2048
```
|
piyawudk/PhishMe-Qwen3-Base-GRPO-8B-GGUF | piyawudk | 2025-06-17T15:21:36Z | 0 | 0 | transformers | [
"transformers",
"gguf",
"text-generation-inference",
"unsloth",
"qwen3",
"llama-cpp",
"gguf-my-repo",
"en",
"base_model:piyawudk/PhishMe-Qwen3-Base-GRPO-8B",
"base_model:quantized:piyawudk/PhishMe-Qwen3-Base-GRPO-8B",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T14:41:55Z | ---
base_model: piyawudk/PhishMe-Qwen3-Base-GRPO-8B
tags:
- text-generation-inference
- transformers
- unsloth
- qwen3
- llama-cpp
- gguf-my-repo
license: apache-2.0
language:
- en
---
# piyawudk/PhishMe-Qwen3-Base-GRPO-8B-Q4_K_M-GGUF
This model was converted to GGUF format from [`piyawudk/PhishMe-Qwen3-Base-GRPO-8B`](https://huggingface.co/piyawudk/PhishMe-Qwen3-Base-GRPO-8B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/piyawudk/PhishMe-Qwen3-Base-GRPO-8B) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo piyawudk/PhishMe-Qwen3-Base-GRPO-8B-Q4_K_M-GGUF --hf-file phishme-qwen3-base-grpo-8b-q4_k_m.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo piyawudk/PhishMe-Qwen3-Base-GRPO-8B-Q4_K_M-GGUF --hf-file phishme-qwen3-base-grpo-8b-q4_k_m.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag, along with any other hardware-specific flags (e.g., `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo piyawudk/PhishMe-Qwen3-Base-GRPO-8B-Q4_K_M-GGUF --hf-file phishme-qwen3-base-grpo-8b-q4_k_m.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo piyawudk/PhishMe-Qwen3-Base-GRPO-8B-Q4_K_M-GGUF --hf-file phishme-qwen3-base-grpo-8b-q4_k_m.gguf -c 2048
```
|
joanna302/Qwen3-8B-Base_fr_pt_8e-05_seed43 | joanna302 | 2025-06-17T15:18:01Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"qwen3",
"text-generation",
"unsloth",
"trl",
"sft",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T10:10:47Z | ---
library_name: transformers
tags:
- unsloth
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lelon/cue-de-conan | Lelon | 2025-06-17T15:15:07Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"eurobert",
"token-classification",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | token-classification | 2025-06-17T15:14:28Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
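The calculator linked above essentially multiplies hardware power draw by runtime and the grid's carbon intensity; a minimal sketch of that arithmetic (all numbers below are hypothetical placeholders, not measurements for this model):

```python
def estimate_co2eq_grams(power_kw: float, hours: float, grid_g_per_kwh: float) -> float:
    """Estimated emissions in grams of CO2eq: energy used (kWh) x carbon intensity."""
    return power_kw * hours * grid_g_per_kwh

# Hypothetical example: a 300 W accelerator running 10 hours on a 400 gCO2eq/kWh grid.
print(estimate_co2eq_grams(0.3, 10, 400))  # 1200.0
```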
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
VHKE/flat-flipflop | VHKE | 2025-06-17T15:14:45Z | 0 | 0 | diffusers | [
"diffusers",
"text-to-image",
"flux",
"lora",
"template:sd-lora",
"fluxgym",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2025-06-17T15:14:25Z | ---
tags:
- text-to-image
- flux
- lora
- diffusers
- template:sd-lora
- fluxgym
widget:
- output:
url: sample/flat-flipflop_003012_00_20250617161051.png
text: flat flipflop
base_model: black-forest-labs/FLUX.1-dev
instance_prompt: flat flipflop
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
---
# flat flipflop
A Flux LoRA trained on a local computer with [Fluxgym](https://github.com/cocktailpeanut/fluxgym)
<Gallery />
## Trigger words
You should use `flat flipflop` to trigger the image generation.
## Download model and use it with ComfyUI, AUTOMATIC1111, SD.Next, Invoke AI, Forge, etc.
Weights for this model are available in Safetensors format.
|
ITFacto/gemma-3-4B-sft-grpo | ITFacto | 2025-06-17T15:13:05Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"gemma3_text",
"text-generation",
"text-generation-inference",
"unsloth",
"gemma3",
"conversational",
"en",
"base_model:unsloth/gemma-3-4b-it",
"base_model:finetune:unsloth/gemma-3-4b-it",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
... | text-generation | 2025-06-17T15:10:01Z | ---
base_model: unsloth/gemma-3-4b-it
tags:
- text-generation-inference
- transformers
- unsloth
- gemma3
license: apache-2.0
language:
- en
---
# Uploaded finetuned model
- **Developed by:** ITFacto
- **License:** apache-2.0
- **Finetuned from model:** unsloth/gemma-3-4b-it
This gemma3 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
mradermacher/TongSearch-QR-3B-GGUF | mradermacher | 2025-06-17T15:06:35Z | 0 | 0 | transformers | [
"transformers",
"gguf",
"en",
"zh",
"base_model:TongSearch/TongSearch-QR-3B",
"base_model:quantized:TongSearch/TongSearch-QR-3B",
"license:mit",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-06-17T14:21:53Z | ---
base_model: TongSearch/TongSearch-QR-3B
language:
- en
- zh
library_name: transformers
license: mit
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: -->
static quants of https://huggingface.co/TongSearch/TongSearch-QR-3B
<!-- provided-files -->
weighted/imatrix quants are available at https://huggingface.co/mradermacher/TongSearch-QR-3B-i1-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
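For repos that ship plain byte-split parts (the style the TheBloke READMEs describe), the parts are simply concatenated in order; a sketch with dummy stand-in files (real part names will differ, and files produced by llama.cpp's `gguf-split` tool must be merged with that tool's merge mode rather than `cat`):

```shell
# Create two dummy parts standing in for downloaded split files:
printf 'first-half-' > model.gguf.part1of2
printf 'second-half' > model.gguf.part2of2

# Byte-split parts are restored by concatenating them in order:
cat model.gguf.part1of2 model.gguf.part2of2 > model.gguf

cat model.gguf  # -> first-half-second-half
```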
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q2_K.gguf) | Q2_K | 1.5 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q3_K_S.gguf) | Q3_K_S | 1.7 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q3_K_M.gguf) | Q3_K_M | 1.8 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q3_K_L.gguf) | Q3_K_L | 1.9 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.IQ4_XS.gguf) | IQ4_XS | 2.0 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q4_K_S.gguf) | Q4_K_S | 2.1 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q4_K_M.gguf) | Q4_K_M | 2.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q5_K_S.gguf) | Q5_K_S | 2.5 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q5_K_M.gguf) | Q5_K_M | 2.5 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q6_K.gguf) | Q6_K | 2.9 | very good quality |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.Q8_0.gguf) | Q8_0 | 3.7 | fast, best quality |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-3B-GGUF/resolve/main/TongSearch-QR-3B.f16.gguf) | f16 | 6.9 | 16 bpw, overkill |
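The Size/GB column follows roughly from parameter count times bits per weight; a back-of-the-envelope sketch (the parameter count and effective bits-per-weight figures are approximations, and real files carry metadata overhead, so the table's values run somewhat higher):

```python
def approx_gguf_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Rough file size: parameters x bits per weight, converted to gigabytes."""
    return n_params * bits_per_weight / 8 / 1e9

# ~3e9 parameters at f16 (16 bpw) vs a ~4.8 bpw quant:
print(round(approx_gguf_size_gb(3e9, 16), 1))   # 6.0
print(round(approx_gguf_size_gb(3e9, 4.8), 1))  # 1.8
```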
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time.
<!-- end -->
|
talphaidze/molm-fineweb-edu-scientific | talphaidze | 2025-06-17T14:57:14Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"molm",
"custom_code",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T14:46:26Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
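The Machine Learning Impact calculator referenced above boils down to energy used times the grid's carbon intensity; a tiny sketch of that estimate (figures below are hypothetical, not measurements for this model):

```python
def co2eq_grams(power_kw: float, hours: float, grid_g_per_kwh: float) -> float:
    """Energy used (kWh) times grid carbon intensity (gCO2eq/kWh)."""
    return power_kw * hours * grid_g_per_kwh

# Hypothetical figures only: a 250 W GPU for 8 hours at 500 gCO2eq/kWh.
print(co2eq_grams(0.25, 8, 500))  # 1000.0
```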
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Zack-Z/qwen3_4bi_cotsft_rs0_1_5cut_cot2all_indep_ntt_e2 | Zack-Z | 2025-06-17T14:55:48Z | 0 | 0 | transformers | [
"transformers",
"qwen3",
"feature-extraction",
"text-generation-inference",
"unsloth",
"en",
"base_model:unsloth/Qwen3-4B",
"base_model:finetune:unsloth/Qwen3-4B",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | feature-extraction | 2025-06-17T14:40:38Z | ---
base_model: unsloth/Qwen3-4B
tags:
- text-generation-inference
- transformers
- unsloth
- qwen3
license: apache-2.0
language:
- en
---
# Uploaded finetuned model
- **Developed by:** Zack-Z
- **License:** apache-2.0
- **Finetuned from model:** unsloth/Qwen3-4B
This qwen3 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
mradermacher/TongSearch-QR-1.5B-i1-GGUF | mradermacher | 2025-06-17T14:52:39Z | 0 | 0 | transformers | [
"transformers",
"gguf",
"en",
"zh",
"base_model:TongSearch/TongSearch-QR-1.5B",
"base_model:quantized:TongSearch/TongSearch-QR-1.5B",
"license:mit",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | null | 2025-06-17T14:21:26Z | ---
base_model: TongSearch/TongSearch-QR-1.5B
language:
- en
- zh
library_name: transformers
license: mit
quantized_by: mradermacher
---
## About
<!-- ### quantize_version: 2 -->
<!-- ### output_tensor_quantised: 1 -->
<!-- ### convert_type: hf -->
<!-- ### vocab_type: -->
<!-- ### tags: nicoboss -->
weighted/imatrix quants of https://huggingface.co/TongSearch/TongSearch-QR-1.5B
<!-- provided-files -->
static quants are available at https://huggingface.co/mradermacher/TongSearch-QR-1.5B-GGUF
## Usage
If you are unsure how to use GGUF files, refer to one of [TheBloke's
READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for
more details, including on how to concatenate multi-part files.
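As with other multi-part GGUF repos, plain byte-split parts can be rejoined with `cat`; a quick sketch using throwaway files in place of real downloads (files produced by llama.cpp's `gguf-split` tool need that tool's merge mode instead):

```shell
# Throwaway stand-ins for two downloaded split files:
printf 'AAA' > TongSearch.gguf.part1of2
printf 'BBB' > TongSearch.gguf.part2of2

# Join the parts in order to rebuild the single file:
cat TongSearch.gguf.part1of2 TongSearch.gguf.part2of2 > TongSearch.gguf
cat TongSearch.gguf  # -> AAABBB
```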
## Provided Quants
(sorted by size, not necessarily quality. IQ-quants are often preferable over similar-sized non-IQ quants)
| Link | Type | Size/GB | Notes |
|:-----|:-----|--------:|:------|
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ1_S.gguf) | i1-IQ1_S | 0.6 | for the desperate |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ1_M.gguf) | i1-IQ1_M | 0.6 | mostly desperate |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ2_XS.gguf) | i1-IQ2_XS | 0.7 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ2_S.gguf) | i1-IQ2_S | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ2_M.gguf) | i1-IQ2_M | 0.8 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q2_K_S.gguf) | i1-Q2_K_S | 0.8 | very low quality |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q2_K.gguf) | i1-Q2_K | 0.9 | IQ3_XXS probably better |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 0.9 | lower quality |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ3_XS.gguf) | i1-IQ3_XS | 0.9 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q3_K_S.gguf) | i1-Q3_K_S | 1.0 | IQ3_XS probably better |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ3_S.gguf) | i1-IQ3_S | 1.0 | beats Q3_K* |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ3_M.gguf) | i1-IQ3_M | 1.0 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q3_K_M.gguf) | i1-Q3_K_M | 1.0 | IQ3_S probably better |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q3_K_L.gguf) | i1-Q3_K_L | 1.1 | IQ3_M probably better |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ4_XS.gguf) | i1-IQ4_XS | 1.1 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-IQ4_NL.gguf) | i1-IQ4_NL | 1.2 | prefer IQ4_XS |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q4_0.gguf) | i1-Q4_0 | 1.2 | fast, low quality |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q4_K_S.gguf) | i1-Q4_K_S | 1.2 | optimal size/speed/quality |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q4_K_M.gguf) | i1-Q4_K_M | 1.2 | fast, recommended |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q4_1.gguf) | i1-Q4_1 | 1.3 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q5_K_S.gguf) | i1-Q5_K_S | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q5_K_M.gguf) | i1-Q5_K_M | 1.4 | |
| [GGUF](https://huggingface.co/mradermacher/TongSearch-QR-1.5B-i1-GGUF/resolve/main/TongSearch-QR-1.5B.i1-Q6_K.gguf) | i1-Q6_K | 1.6 | practically like static Q6_K |
Here is a handy graph by ikawrakow comparing some lower-quality quant
types (lower is better):

And here are Artefact2's thoughts on the matter:
https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9
## FAQ / Model Request
See https://huggingface.co/mradermacher/model_requests for some answers to
questions you might have and/or if you want some other model quantized.
## Thanks
I thank my company, [nethype GmbH](https://www.nethype.de/), for letting
me use its servers and providing upgrades to my workstation to enable
this work in my free time. Additional thanks to [@nicoboss](https://huggingface.co/nicoboss) for giving me access to his private supercomputer, enabling me to provide many more imatrix quants, at much higher quality, than I would otherwise be able to.
<!-- end -->
|
alhkalily/MCQ | alhkalily | 2025-06-17T14:52:29Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | null | 2025-06-17T14:51:23Z | ---
license: apache-2.0
---
|
gokulraj121/brahma-phi-2 | gokulraj121 | 2025-06-17T14:47:00Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | null | 2025-06-17T14:47:00Z | ---
license: apache-2.0
---
|
cangcz/AnchorCrafter-notune | cangcz | 2025-06-17T14:37:43Z | 0 | 0 | null | [
"license:apache-2.0",
"region:us"
] | null | 2025-06-16T08:47:13Z | ---
license: apache-2.0
---
|
jsevillano/medgemma-4b-it-Q8_0-GGUF | jsevillano | 2025-06-17T14:28:19Z | 0 | 0 | transformers | [
"transformers",
"gguf",
"medical",
"radiology",
"clinical-reasoning",
"dermatology",
"pathology",
"ophthalmology",
"chest-x-ray",
"llama-cpp",
"gguf-my-repo",
"image-text-to-text",
"base_model:google/medgemma-4b-it",
"base_model:quantized:google/medgemma-4b-it",
"license:other",
"endpo... | image-text-to-text | 2025-06-17T14:28:00Z | ---
license: other
license_name: health-ai-developer-foundations
license_link: https://developers.google.com/health-ai-developer-foundations/terms
library_name: transformers
pipeline_tag: image-text-to-text
extra_gated_heading: Access MedGemma on Hugging Face
extra_gated_prompt: To access MedGemma on Hugging Face, you're required to review
and agree to [Health AI Developer Foundation's terms of use](https://developers.google.com/health-ai-developer-foundations/terms).
To do this, please ensure you're logged in to Hugging Face and click below. Requests
are processed immediately.
extra_gated_button_content: Acknowledge license
base_model: google/medgemma-4b-it
tags:
- medical
- radiology
- clinical-reasoning
- dermatology
- pathology
- ophthalmology
- chest-x-ray
- llama-cpp
- gguf-my-repo
---
# jsevillano/medgemma-4b-it-Q8_0-GGUF
This model was converted to GGUF format from [`google/medgemma-4b-it`](https://huggingface.co/google/medgemma-4b-it) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/google/medgemma-4b-it) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo jsevillano/medgemma-4b-it-Q8_0-GGUF --hf-file medgemma-4b-it-q8_0.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo jsevillano/medgemma-4b-it-Q8_0-GGUF --hf-file medgemma-4b-it-q8_0.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with other hardware-specific flags (e.g., `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo jsevillano/medgemma-4b-it-Q8_0-GGUF --hf-file medgemma-4b-it-q8_0.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo jsevillano/medgemma-4b-it-Q8_0-GGUF --hf-file medgemma-4b-it-q8_0.gguf -c 2048
```
|
Davidozito/fewshot-250-samples | Davidozito | 2025-06-17T14:16:59Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"xlm-roberta",
"text-classification",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2025-06-17T13:01:41Z | ---
library_name: transformers
tags:
- generated_from_trainer
metrics:
- precision
- recall
- accuracy
model-index:
- name: fewshot-250-samples
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fewshot-250-samples
This model was trained from scratch on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4952
- Precision: 0.8199
- Recall: 0.5655
- F1 Macro: 0.5981
- Accuracy: 0.64
- Classification Report:

  | Class | Precision | Recall | F1-score | Support |
  |:-------------|----------:|-------:|---------:|--------:|
  | None | 1.00 | 0.33 | 0.50 | 3 |
  | Minimal | 0.75 | 0.50 | 0.60 | 6 |
  | Basic | 0.53 | 1.00 | 0.69 | 9 |
  | Good | 1.00 | 0.43 | 0.60 | 7 |
  | Excellent | 0.00 | 0.00 | 0.00 | 0 |
  | accuracy | | | 0.64 | 25 |
  | macro avg | 0.66 | 0.45 | 0.48 | 25 |
  | weighted avg | 0.77 | 0.64 | 0.62 | 25 |
- Mse: 0.4952
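For readers unfamiliar with the macro-averaged figures reported above: they are the unweighted mean of the per-class precision and recall values. A minimal pure-Python sketch with made-up labels (not this model's actual evaluation data):

```python
def macro_precision_recall(y_true, y_pred, classes):
    """Unweighted mean of per-class precision and recall (illustrative only)."""
    precisions, recalls = [], []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    n = len(classes)
    return sum(precisions) / n, sum(recalls) / n

# Toy example with three of the classes above:
p, r = macro_precision_recall(
    ["Basic", "Good", "Basic", "Minimal"],
    ["Basic", "Basic", "Basic", "Minimal"],
    ["Minimal", "Basic", "Good"],
)
print(round(p, 4), round(r, 4))  # 0.5556 0.6667
```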
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
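The linear schedule with 10% warmup listed above ramps the learning rate from zero to the peak over the first tenth of training, then decays it linearly back to zero. A small sketch of that schedule (the step counts are illustrative, not the run's actual totals):

```python
def linear_warmup_lr(step, total_steps, peak_lr=1e-5, warmup_ratio=0.1):
    """LR at a given step for a linear schedule with linear warmup,
    mirroring lr_scheduler_type=linear with lr_scheduler_warmup_ratio=0.1."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return peak_lr * step / warmup_steps          # ramp up to the peak
    # linear decay from the peak down to zero over the remaining steps
    remaining = total_steps - step
    return peak_lr * remaining / (total_steps - warmup_steps)

total = 100
print(linear_warmup_lr(5, total))    # halfway through warmup -> 5e-06
print(linear_warmup_lr(10, total))   # end of warmup, peak -> 1e-05
print(linear_warmup_lr(100, total))  # end of training -> 0.0
```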
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 Macro | Accuracy | Classification Report | Mse |
|:-------------:|:------:|:----:|:---------------:|:---------:|:------:|:--------:|:--------:|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:------:|
| No log | 0 | 0 | 0.5164 | 0.7896 | 0.5496 | 0.5327 | 0.6 | precision recall f1-score support
None 1.00 0.33 0.50 3
Minimal 0.62 0.83 0.71 6
Basic 0.53 0.89 0.67 9
Good 1.00 0.14 0.25 7
Excellent 0.00 0.00 0.00 0
accuracy 0.60 25
macro avg 0.63 0.44 0.43 25
weighted avg 0.74 0.60 0.54 25
| 0.5164 |
| 0.5593 | 0.2414 | 7 | 0.5152 | 0.7896 | 0.5496 | 0.5327 | 0.6 | precision recall f1-score support
None 1.00 0.33 0.50 3
Minimal 0.62 0.83 0.71 6
Basic 0.53 0.89 0.67 9
Good 1.00 0.14 0.25 7
Excellent 0.00 0.00 0.00 0
accuracy 0.60 25
macro avg 0.63 0.44 0.43 25
weighted avg 0.74 0.60 0.54 25
| 0.5152 |
| 0.8353 | 0.4828 | 14 | 0.5129 | 0.8192 | 0.5774 | 0.5598 | 0.64 | precision recall f1-score support
None 1.00 0.33 0.50 3
Minimal 0.71 0.83 0.77 6
Basic 0.56 1.00 0.72 9
Good 1.00 0.14 0.25 7
Excellent 0.00 0.00 0.00 0
accuracy 0.64 25
macro avg 0.66 0.46 0.45 25
weighted avg 0.77 0.64 0.57 25
| 0.5129 |
| 0.7268 | 0.7241 | 21 | 0.5070 | 0.8324 | 0.5714 | 0.5910 | 0.64 | precision recall f1-score support
None 1.00 0.33 0.50 3
Minimal 0.80 0.67 0.73 6
Basic 0.53 1.00 0.69 9
Good 1.00 0.29 0.44 7
Excellent 0.00 0.00 0.00 0
accuracy 0.64 25
macro avg 0.67 0.46 0.47 25
weighted avg 0.78 0.64 0.61 25
| 0.5070 |
| 0.8386 | 0.9655 | 28 | 0.5012 | 0.8324 | 0.5714 | 0.5910 | 0.64 | precision recall f1-score support
None 1.00 0.33 0.50 3
Minimal 0.80 0.67 0.73 6
Basic 0.53 1.00 0.69 9
Good 1.00 0.29 0.44 7
Excellent 0.00 0.00 0.00 0
accuracy 0.64 25
macro avg 0.67 0.46 0.47 25
weighted avg 0.78 0.64 0.61 25
| 0.5012 |
| 0.866 | 1.2069 | 35 | 0.4989 | 0.8324 | 0.5714 | 0.5910 | 0.64 | precision recall f1-score support
None 1.00 0.33 0.50 3
Minimal 0.80 0.67 0.73 6
Basic 0.53 1.00 0.69 9
Good 1.00 0.29 0.44 7
Excellent 0.00 0.00 0.00 0
accuracy 0.64 25
macro avg 0.67 0.46 0.47 25
weighted avg 0.78 0.64 0.61 25
| 0.4989 |
| 0.674 | 1.4483 | 42 | 0.4983 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support
None 1.00 0.33 0.50 3
Minimal 0.75 0.50 0.60 6
Basic 0.53 1.00 0.69 9
Good 1.00 0.43 0.60 7
Excellent 0.00 0.00 0.00 0
accuracy 0.64 25
macro avg 0.66 0.45 0.48 25
weighted avg 0.77 0.64 0.62 25
| 0.4983 |
| 0.607 | 1.6897 | 49 | 0.4982 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support
None 1.00 0.33 0.50 3
Minimal 0.75 0.50 0.60 6
Basic 0.53 1.00 0.69 9
Good 1.00 0.43 0.60 7
Excellent 0.00 0.00 0.00 0
accuracy 0.64 25
macro avg 0.66 0.45 0.48 25
weighted avg 0.77 0.64 0.62 25
| 0.4982 |
| 0.5297 | 1.9310 | 56 | 0.4981 | 0.8199 | 0.5655 | 0.5981 | 0.64 | precision recall f1-score support
None 1.00 0.33 0.50 3
Minimal 0.75 0.50 0.60 6
Basic 0.53 1.00 0.69 9
Good 1.00 0.43 0.60 7
Excellent 0.00 0.00 0.00 0
accuracy 0.64 25
macro avg 0.66 0.45 0.48 25
weighted avg 0.77 0.64 0.62 25
| 0.4981 |
| 0.6795 | 2.1724 | 63 | 0.4978 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4978 |
| 0.7007 | 2.4138 | 70 | 0.4974 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4974 |
| 0.6341 | 2.6552 | 77 | 0.4974 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4974 |
| 0.7763 | 2.8966 | 84 | 0.4970 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4970 |
| 0.8144 | 3.1379 | 91 | 0.4965 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4965 |
| 0.7211 | 3.3793 | 98 | 0.4963 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4963 |
| 0.5704 | 3.6207 | 105 | 0.4961 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4961 |
| 0.7294 | 3.8621 | 112 | 0.4960 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4960 |
| 0.8442 | 4.1034 | 119 | 0.4958 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4958 |
| 0.7277 | 4.3448 | 126 | 0.4956 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4956 |
| 0.607 | 4.5862 | 133 | 0.4953 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4953 |
| 0.6661 | 4.8276 | 140 | 0.4952 | 0.8199 | 0.5655 | 0.5981 | 0.64 | see classification report below | 0.4952 |

The per-class classification report was identical at every checkpoint listed above:

                  precision    recall  f1-score   support

            None       1.00      0.33      0.50         3
         Minimal       0.75      0.50      0.60         6
           Basic       0.53      1.00      0.69         9
            Good       1.00      0.43      0.60         7
       Excellent       0.00      0.00      0.00         0

        accuracy                           0.64        25
       macro avg       0.66      0.45      0.48        25
    weighted avg       0.77      0.64      0.62        25
### Framework versions
- Transformers 4.52.4
- Pytorch 2.7.1
- Datasets 3.6.0
- Tokenizers 0.21.1
|
ChangeXy/qwen2.5-14b-extreme-sports | ChangeXy | 2025-06-17T14:15:37Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"unsloth",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T14:03:44Z | ---
library_name: transformers
tags:
- unsloth
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
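This section is still empty; a minimal quick-start sketch, assuming the checkpoint loads as a standard causal LM with a chat template (the exact intended prompting format is not documented, and a 14B model needs roughly 28 GB of memory in half precision):

```python
from transformers import pipeline
import torch

# Assumption: the repo is compatible with the standard text-generation pipeline.
generator = pipeline(
    "text-generation",
    model="ChangeXy/qwen2.5-14b-extreme-sports",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
messages = [{"role": "user", "content": "Describe the basics of wingsuit flying."}]
output = generator(messages, max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```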
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Munia-ak/wav2vec2-base-demo-colab | Munia-ak | 2025-06-17T14:06:00Z | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:facebook/wav2vec2-base",
"base_model:finetune:facebook/wav2vec2-base",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2025-06-16T08:48:05Z | ---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-base-demo-colab
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-demo-colab
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3900
- Wer: 0.8889
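The card does not include usage code; a minimal inference sketch, assuming the checkpoint works with the standard ASR pipeline (note that with a WER of 0.8889, transcriptions from this demo model will be unreliable):

```python
from transformers import pipeline

# Assumption: standard wav2vec2 CTC checkpoint usable via the ASR pipeline.
asr = pipeline("automatic-speech-recognition", model="Munia-ak/wav2vec2-base-demo-colab")
result = asr("sample.wav")  # path to any 16 kHz mono audio file
print(result["text"])
```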
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 0.7643 | 2.5381 | 500 | 0.3900 | 0.8889 |
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
|
tscstudios/7hb8xvf6ddtaqnwind1irci48ny2_7fdc3e61-ceb3-48cc-abec-cb64a4f1f835 | tscstudios | 2025-06-17T13:57:54Z | 0 | 0 | diffusers | [
"diffusers",
"flux",
"lora",
"replicate",
"text-to-image",
"en",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2025-06-17T13:57:52Z | ---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
# widget:
# - text: >-
# prompt
# output:
# url: https://...
instance_prompt: TOK
---
# 7hb8xvf6ddtaqnwind1irci48ny2_7fdc3e61-ceb3-48cc-abec-cb64a4f1f835
<Gallery />
## About this LoRA
This is a [LoRA](https://replicate.com/docs/guides/working-with-loras) for the FLUX.1-dev text-to-image model. It can be used with diffusers or ComfyUI.
It was trained on [Replicate](https://replicate.com/) using AI toolkit: https://replicate.com/ostris/flux-dev-lora-trainer/train
## Trigger words
You should use `TOK` to trigger the image generation.
## Run this LoRA with an API using Replicate
```py
import replicate
input = {
"prompt": "TOK",
"lora_weights": "https://huggingface.co/tscstudios/7hb8xvf6ddtaqnwind1irci48ny2_7fdc3e61-ceb3-48cc-abec-cb64a4f1f835/resolve/main/lora.safetensors"
}
output = replicate.run(
"black-forest-labs/flux-dev-lora",
input=input
)
for index, item in enumerate(output):
with open(f"output_{index}.webp", "wb") as file:
file.write(item.read())
```
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
pipeline.load_lora_weights('tscstudios/7hb8xvf6ddtaqnwind1irci48ny2_7fdc3e61-ceb3-48cc-abec-cb64a4f1f835', weight_name='lora.safetensors')
image = pipeline('TOK').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
## Training details
- Steps: 2000
- Learning rate: 0.0004
- LoRA rank: 16
## Contribute your own examples
You can use the [community tab](https://huggingface.co/tscstudios/7hb8xvf6ddtaqnwind1irci48ny2_7fdc3e61-ceb3-48cc-abec-cb64a4f1f835/discussions) to add images that show off what you've made with this LoRA.
|
rushabh-v/medgamma_finetuning_temp | rushabh-v | 2025-06-17T13:57:36Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"generated_from_trainer",
"trl",
"sft",
"base_model:google/medgemma-4b-it",
"base_model:finetune:google/medgemma-4b-it",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T12:34:25Z | ---
base_model: google/medgemma-4b-it
library_name: transformers
model_name: medgamma_finetuning_temp
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for medgamma_finetuning_temp
This model is a fine-tuned version of [google/medgemma-4b-it](https://huggingface.co/google/medgemma-4b-it).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="rushabh-v/medgamma_finetuning_temp", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/ds_eka/MedGemma-SFT/runs/5t79uaek)
This model was trained with SFT.
### Framework versions
- TRL: 0.18.2
- Transformers: 4.52.4
- Pytorch: 2.7.1
- Datasets: 3.6.0
- Tokenizers: 0.21.1
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
sc-genrm-scaling/llama_3.1_8b_genrm_ft | sc-genrm-scaling | 2025-06-17T13:53:40Z | 1,507 | 0 | null | [
"safetensors",
"llama",
"arxiv:2504.01005",
"license:apache-2.0",
"region:us"
] | null | 2024-12-25T20:24:35Z | ---
license: apache-2.0
---
Fine-tuned version of Llama-3.1-8B-Instruct for generative verification on MATH problems. See [organization card](https://huggingface.co/sc-genrm-scaling) and [paper](https://arxiv.org/abs/2504.01005) for more information.
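A minimal inference sketch, assuming the checkpoint loads as a standard Llama causal LM; the prompt below is a hypothetical placeholder, and the exact verification prompt format used by the authors is shown in the demo notebook linked below:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sc-genrm-scaling/llama_3.1_8b_genrm_ft"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Hypothetical prompt layout -- see the authors' demo notebook for the real format.
prompt = "Problem: What is 2 + 2?\nSolution: 2 + 2 = 4.\nIs this solution correct? Verify step by step."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=256)
verdict = tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(verdict)
```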
You can follow [this example](https://github.com/nishadsinghi/sc-genrm-scaling/blob/master/llmonk/verify/demo.ipynb) to run inference with this model. |
furkankarakuz/test-bert-finetuned-ner | furkankarakuz | 2025-06-17T13:50:02Z | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"bert",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_co... | token-classification | 2025-06-17T13:17:42Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_trainer
datasets:
- conll2003
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: test-bert-finetuned-ner
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: conll2003
type: conll2003
config: conll2003
split: validation
args: conll2003
metrics:
- name: Precision
type: precision
value: 0.9333003136866436
- name: Recall
type: recall
value: 0.9513631773813531
- name: F1
type: f1
value: 0.9422451870989249
- name: Accuracy
type: accuracy
value: 0.9868428798492965
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# test-bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0596
- Precision: 0.9333
- Recall: 0.9514
- F1: 0.9422
- Accuracy: 0.9868
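The card does not include usage code; a minimal sketch, assuming the standard token-classification pipeline (`aggregation_strategy="simple"` groups word pieces back into whole entities):

```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="furkankarakuz/test-bert-finetuned-ner",
    aggregation_strategy="simple",
)
entities = ner("Hugging Face is based in New York City.")
for entity in entities:
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```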
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0756 | 1.0 | 1756 | 0.0635 | 0.9075 | 0.9345 | 0.9208 | 0.9824 |
| 0.0359 | 2.0 | 3512 | 0.0682 | 0.9325 | 0.9467 | 0.9395 | 0.9862 |
| 0.0218 | 3.0 | 5268 | 0.0596 | 0.9333 | 0.9514 | 0.9422 | 0.9868 |
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
|
wiamabd/wav2vec2-large-xlsr-exp-lot1-only | wiamabd | 2025-06-17T13:08:00Z | 1 | 0 | transformers | [
"transformers",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | automatic-speech-recognition | 2025-06-14T19:00:24Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
derekl35/alphonse_mucha_fp8_lora_flux | derekl35 | 2025-06-17T13:04:45Z | 6 | 0 | diffusers | [
"diffusers",
"text-to-image",
"diffusers-training",
"lora",
"flux",
"flux-diffusers",
"template:sd-lora",
"dataset:derekl35/alphonse-mucha-style",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2025-06-10T15:19:04Z | ---
base_model: black-forest-labs/FLUX.1-dev
library_name: diffusers
license: other
instance_prompt: a woman, alphonse mucha style
widget:
- text: >-
Serene raven-haired woman, moonlit lilies, swirling botanicals, alphonse
mucha style
output:
url: images/alphonse_mucha_merged1_fp8.png
- text: a puppy in a pond, alphonse mucha style
output:
url: images/alphonse_mucha_merged2_fp8.png
- text: >-
Ornate fox with a collar of autumn leaves and berries, amidst a tapestry of
forest foliage, alphonse mucha style
output:
url: images/alphonse_mucha_merged3_fp8.png
tags:
- text-to-image
- diffusers-training
- diffusers
- lora
- flux
- flux-diffusers
- template:sd-lora
datasets:
- derekl35/alphonse-mucha-style
---
# Flux DreamBooth LoRA - derekl35/alphonse_mucha_fp8_lora_flux
<Gallery />
## Model description
These are derekl35/alphonse_mucha_fp8_lora_flux DreamBooth LoRA weights for black-forest-labs/FLUX.1-dev. This work is part of the blog post, "Fine-Tuning FLUX.1-dev on consumer hardware and in FP8".
The weights were trained using [DreamBooth](https://dreambooth.github.io/) with the [Flux diffusers trainer](https://github.com/huggingface/diffusers/blob/main/examples/dreambooth/README_flux.md).
Was LoRA for the text encoder enabled? False.
FP8 training? True
## Trigger words
You should use `, alphonse mucha style` to trigger the image generation.
## Download model
[Download the *.safetensors LoRA](https://huggingface.co/derekl35/alphonse_mucha_fp8_lora_flux/tree/main) in the Files & versions tab.
## Use it with the [🧨 diffusers library](https://github.com/huggingface/diffusers)
There are two main ways to use this LoRA for inference: loading the adapter on the fly or merging it with the base model.
### Option 1: Loading LoRA Adapters
This approach offers flexibility, allowing you to easily switch between different LoRA styles.
```python
from diffusers import FluxPipeline
import torch
ckpt_id = "black-forest-labs/FLUX.1-dev"
pipeline = FluxPipeline.from_pretrained(
ckpt_id, torch_dtype=torch.float16
)
pipeline.load_lora_weights("derekl35/alphonse_mucha_fp8_lora_flux", weight_name="pytorch_lora_weights.safetensors")
pipeline.enable_model_cpu_offload()
image = pipeline(
"a puppy in a pond, alphonse mucha style",
num_inference_steps=28,
guidance_scale=3.5,
height=768,
width=512,
generator=torch.manual_seed(0)
).images[0]
image.save("alphonse_mucha_loaded.png")
```
### Option 2: Merging LoRA into Base Model
Merging the LoRA into the base model can lead to slightly faster inference and is useful when you want to use a single style consistently.
```python
from diffusers import FluxPipeline
import torch
ckpt_id = "black-forest-labs/FLUX.1-dev"
pipeline = FluxPipeline.from_pretrained(
ckpt_id, text_encoder=None, text_encoder_2=None, torch_dtype=torch.float16
)
pipeline.load_lora_weights("derekl35/alphonse_mucha_fp8_lora_flux", weight_name="pytorch_lora_weights.safetensors")
pipeline.fuse_lora()
pipeline.unload_lora_weights()
# You can save the fused transformer for later use
# pipeline.transformer.save_pretrained("fused_transformer")
pipeline.enable_model_cpu_offload()
image = pipeline(
"a puppy in a pond, alphonse mucha style",
num_inference_steps=28,
guidance_scale=3.5,
height=768,
width=512,
generator=torch.manual_seed(0)
).images[0]
image.save("alphonse_mucha_merged.png")
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
## License
Please adhere to the licensing terms as described [here](https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md). |
brunopio/OCRFlux-3B-Q4_K_M-GGUF | brunopio | 2025-06-17T13:03:56Z | 0 | 0 | transformers | [
"transformers",
"gguf",
"llama-cpp",
"gguf-my-repo",
"en",
"base_model:ChatDOC/OCRFlux-3B",
"base_model:quantized:ChatDOC/OCRFlux-3B",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-06-17T13:03:41Z | ---
language:
- en
license: apache-2.0
benchmarks:
- ChatDoc/OCRFlux-bench-single
- ChatDoc/OCRFlux-bench-cross
- ChatDoc/OCRFlux-pubtabnet-single
- ChatDoc/OCRFlux-pubtabnet-cross
base_model: ChatDOC/OCRFlux-3B
library_name: transformers
tags:
- llama-cpp
- gguf-my-repo
---
# brunopio/OCRFlux-3B-Q4_K_M-GGUF
This model was converted to GGUF format from [`ChatDOC/OCRFlux-3B`](https://huggingface.co/ChatDOC/OCRFlux-3B) using llama.cpp via the ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/ChatDOC/OCRFlux-3B) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo brunopio/OCRFlux-3B-Q4_K_M-GGUF --hf-file ocrflux-3b-q4_k_m.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo brunopio/OCRFlux-3B-Q4_K_M-GGUF --hf-file ocrflux-3b-q4_k_m.gguf -c 2048
```
Note: you can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with `LLAMA_CURL=1` flag along with other hardware-specific flags (for ex: LLAMA_CUDA=1 for Nvidia GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo brunopio/OCRFlux-3B-Q4_K_M-GGUF --hf-file ocrflux-3b-q4_k_m.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo brunopio/OCRFlux-3B-Q4_K_M-GGUF --hf-file ocrflux-3b-q4_k_m.gguf -c 2048
```
|
altaweel/gemma-ultrasound-1b-v2 | altaweel | 2025-06-17T13:03:16Z | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"generated_from_trainer",
"trl",
"sft",
"base_model:google/gemma-3-1b-pt",
"base_model:finetune:google/gemma-3-1b-pt",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T12:50:29Z | ---
base_model: google/gemma-3-1b-pt
library_name: transformers
model_name: gemma-ultrasound-1b-v2
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for gemma-ultrasound-1b-v2
This model is a fine-tuned version of [google/gemma-3-1b-pt](https://huggingface.co/google/gemma-3-1b-pt).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="altaweel/gemma-ultrasound-1b-v2", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
This model was trained with SFT.
### Framework versions
- TRL: 0.15.2
- Transformers: 4.52.4
- Pytorch: 2.7.0
- Datasets: 3.3.2
- Tokenizers: 0.21.1
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
    author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
joanna302/Qwen3-0.6B-Base_fr_pt_2e-05_seed43 | joanna302 | 2025-06-17T12:59:14Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"qwen3",
"text-generation",
"unsloth",
"trl",
"sft",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T12:15:54Z | ---
library_name: transformers
tags:
- unsloth
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
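Until the authors fill this section in, a minimal, hedged sketch of chat-style inference with the 🤗 Transformers pipeline may help. The model id comes from this card's metadata; the chat format and `max_new_tokens` value are assumptions, not the authors' verified usage, so the download-heavy call is left commented out:

```python
# Hedged sketch, not the authors' verified usage: the model id comes from this
# card's metadata; the chat format and max_new_tokens are assumptions.

def build_chat(question: str) -> list:
    # The text-generation pipeline accepts chat-style message lists.
    return [{"role": "user", "content": question}]

# The heavyweight download/inference step is left commented out:
# from transformers import pipeline
# generator = pipeline("text-generation", model="joanna302/Qwen3-0.6B-Base_fr_pt_2e-05_seed43")
# out = generator(build_chat("Bonjour, présente-toi."), max_new_tokens=64)
# print(out[0]["generated_text"])
```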
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
TrumpElon/task-11-Qwen-Qwen2.5-1.5B | TrumpElon | 2025-06-17T12:50:51Z | 0 | 0 | peft | [
"peft",
"safetensors",
"base_model:Qwen/Qwen2.5-1.5B",
"base_model:adapter:Qwen/Qwen2.5-1.5B",
"license:other",
"region:us"
] | null | 2025-06-17T12:49:19Z | ---
library_name: peft
license: other
base_model: Qwen/Qwen2.5-1.5B
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# lora
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
### Training results
### Framework versions
- PEFT 0.12.0
- Transformers 4.48.3
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
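The card does not yet show how to attach this adapter. A hedged sketch follows: the base and adapter repo ids are taken from the metadata above, and the loading calls follow the standard PEFT pattern rather than anything the authors have documented, so the download-heavy steps are commented out:

```python
# Hedged sketch, not verified by the adapter's authors. The repo ids come from
# this card's metadata; the loading calls follow the standard PEFT pattern.

BASE_MODEL = "Qwen/Qwen2.5-1.5B"
ADAPTER = "TrumpElon/task-11-Qwen-Qwen2.5-1.5B"

def repo_ids() -> tuple:
    # Convenience helper so both ids are defined in one place.
    return BASE_MODEL, ADAPTER

# The download-heavy steps are left commented out:
# from transformers import AutoModelForCausalLM, AutoTokenizer
# from peft import PeftModel
# tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
# base = AutoModelForCausalLM.from_pretrained(BASE_MODEL)
# model = PeftModel.from_pretrained(base, ADAPTER)  # applies the LoRA weights
```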
|
EYEDOL/Llama-3.2-3b_ON_ALPACA3 | EYEDOL | 2025-06-17T12:41:30Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T12:41:13Z | ---
base_model: unsloth/llama-3.2-3b-instruct
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
license: apache-2.0
language:
- en
---
# Uploaded model
- **Developed by:** EYEDOL
- **License:** apache-2.0
- **Finetuned from model:** unsloth/llama-3.2-3b-instruct
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
NhaiDao/SFT_TRAIN_FROM_SCRATCH_checkpoint_10625 | NhaiDao | 2025-06-17T12:32:34Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T12:32:00Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
NhaiDao/SFT_TRAIN_FROM_SCRATCH_checkpoint_8125 | NhaiDao | 2025-06-17T12:30:13Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T12:29:40Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
diegolacomba/multilingual-e5-small-legal-mnrl-1 | diegolacomba | 2025-06-17T12:28:40Z | 0 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"bert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:58898",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:intfloat/multilingual-e5-small",
"base_model:finetune:intfloat/m... | sentence-similarity | 2025-06-17T12:28:13Z | ---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:58898
- loss:MultipleNegativesRankingLoss
base_model: intfloat/multilingual-e5-small
widget:
- source_sentence: 'query: ¿Cómo se deben determinar las cuotas a cuenta del IRPF
en un año con actividad económica suspendida?'
sentences:
- 'passage A los efectos de este Impuesto, se considerará promotor de edificaciones
el propietario de inmuebles que construyó (promotor-constructor) o contrató la
construcción (promotor) de los mismos para destinarlos a la venta, el alquiler
o el uso propio.
c) Dichas ejecuciones de obra tengan por objeto la construcción o rehabilitación
de edificios destinados fundamentalmente a viviendas, incluidos los locales, anejos,
instalaciones y servicios complementarios en ella situados.
d) Las referidas ejecuciones de obra consistan materialmente en la construcción
o rehabilitación de los citados edificios.
3.- En consecuencia, las ejecuciones de obra concertadas directamente entre el
promotor y el contratista (la consultante), que tengan por objeto la rehabilitación
de una vivienda, tributan al tipo reducido del 10 por ciento. El tipo reducido
se aplica con independencia de que el promotor concierte la totalidad de la obra
de construcción con un solo empresario o concierte la realización con varios empresarios
realizando cada uno de ellos una parte de la obra según su especialidad.
No obstante, las ejecuciones de obra realizadas por subcontratistas para otros
contratistas (la consultante), que a su vez contraten con el promotor, tributarán
por el Impuesto sobre el Valor Añadido al tipo general del 21 por ciento.
4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.'
- 'passage Descripción de hechos: La consultante es titular de una actividad económica
de "otros cafés y bares". El rendimiento neto de la actividad se determina por
el método de estimación objetiva y tributa en el IVA por el régimen especial simplificado.
Desde la declaración de alarma en marzo de 2020 ha tenido cerrada la actividad
y la va a seguir teniendo cerrada durante todo el año 2020, pues las restricciones
que tiene que aplicar no la hacen rentable.
Cuestión planteada: Forma de calcular, en 2020, el pago fraccionado a cuenta del
IRPF y el ingreso a cuenta trimestral del IVA.'
- 'passage No obstante, el artículo 22.Trece de la Ley 37/1992, declara la exención
de:
“Los transportes de viajeros y sus equipajes por vía marítima o aérea procedentes
de o con destino a un puerto o aeropuerto situado fuera del ámbito espacial del
Impuesto.
Se entenderán incluidos en este apartado los transportes por vía aérea amparados
por un único título de transporte que incluya vuelos de conexión aérea.”.
En consecuencia, los servicios de transporte consultados, que tienen su origen
o destino en un aeropuerto fuera del territorio de aplicación del impuesto sobre
el valor añadido, estarán sujetos pero exentos del Impuesto sobre el Valor Añadido.
2.- Por otra parte, el artículo 164, apartado uno, de la Ley del Impuesto sobre
el Valor Añadido, en el que se regulan las obligaciones de los sujetos pasivos,
establece lo siguiente:
“Uno. Sin perjuicio de lo establecido en el Título anterior, los sujetos pasivos
del Impuesto estarán obligados, con los requisitos, límites y condiciones que
se determinen reglamentariamente, a:
(…)
3º. Expedir y entregar factura de todas sus operaciones, ajustada a lo que se
determine reglamentariamente.”.
El desarrollo reglamentario de dicho precepto se ha llevado a cabo por el Reglamento
por el que se regulan las obligaciones de facturación, aprobado por el artículo
1 del Real Decreto 1619/2012, de 30 de noviembre (BOE de 1 de diciembre).
El artículo 2 del mencionado Reglamento dispone que:'
- source_sentence: 'query: ¿Cuál es el porcentaje de impuesto que corresponde a dispositivos
destinados a aliviar discapacidades bajo la ley actual?'
sentences:
- 'passage Contestación completa: 1.- El artículo 90, apartado uno, de la Ley 37/1992,
de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre),
dispone que el Impuesto se exigirá al tipo del 21 por ciento, salvo lo dispuesto
en el artículo siguiente.
2.- El artículo 91, apartado Uno.1, número 6º, letra c) de la Ley 37/1992 dispone
lo siguiente:
“Uno. Se aplicará el tipo del 10 por ciento a las operaciones siguientes:
1. Las entregas, adquisiciones intracomunitarias o importaciones de los bienes
que se indican a continuación:
(…)
6.º Los siguientes bienes:
(…)
c) Los equipos médicos, aparatos y demás instrumental, relacionados en el apartado
octavo del anexo de esta Ley, que, por sus características objetivas, estén diseñados
para aliviar o tratar deficiencias, para uso personal y exclusivo de personas
que tengan deficiencias físicas, mentales, intelectuales o sensoriales, sin perjuicio
de lo previsto en el apartado dos.1 de este artículo.
No se incluyen en esta letra otros accesorios, recambios y piezas de repuesto
de dichos bienes.”.
El apartado octavo del Anexo de la Ley 37/1992, establece lo siguiente:
“Octavo. Relación de bienes a que se refiere el artículo 91.Uno.1. 6.ºc) de esta
Ley.
(…)
– Sillas terapéuticas y de ruedas, así como los cojines antiescaras y arneses
para el uso de las mismas, muletas, andadores y grúas para movilizar personas
con discapacidad.
(…).”.
3.- Por su parte, el artículo 91, apartado dos.1, número 4º de la Ley 37/1992,
dispone que:
“Dos. Se aplicará el tipo del 4 por ciento a las operaciones siguientes:
1. Las entregas, adquisiciones intracomunitarias o importaciones de los bienes
que se indican a continuación:
(…)
- 'passage (…).”.
De acuerdo con lo dispuesto anteriormente, en los supuestos de adjudicación de
bienes en virtud de subasta judicial o administrativa, como es el caso que nos
ocupa, el adjudicatario puede efectuar, en su caso, la renuncia a las exenciones
previstas en el apartado dos del artículo 20 de la Ley 37/1992, así como expedir
factura, presentar, en nombre y por cuenta del sujeto pasivo, la declaración-liquidación
correspondiente e ingresar el importe del Impuesto sobre el Valor Añadido resultante.
El ejercicio de dicha facultad por parte del adjudicatario determina la obligación
de presentar la autoliquidación del Impuesto conforme al modelo aprobado por la
Orden HAC/3625/2003, de 23 de diciembre (modelo 309).
Uno de los requisitos necesarios para el ejercicio de dicha facultad es que el
destinatario-adjudicatario del bien inmueble tenga la consideración de empresario
o profesional en los términos previstos en esta contestación. La no consideración
como empresario o profesional impide el ejercicio de dicha facultad.
Por último, señalar que de resultar aplicable la regla de inversión del sujeto
pasivo prevista en el artículo 84.Uno.2º de la Ley 37/1992, anteriormente desarrollado,
el adjudicatario resultará ser el sujeto pasivo de la operación por lo que viene
obligado a presentar la autoliquidación ordinaria del Impuesto en nombre propio,
sin actuar en nombre y por cuenta del subastado. Asimismo, de optar por dicha
facultad en los términos establecidos reglamentariamente, el consultante podrá
emitir, en nombre y por cuenta del transmitente, la correspondiente factura en
la que se documente la operación.
No obstante, tal y como se ha señalado en apartados anteriores de esta contestación,
el consultante adjudicatario de la subasta judicial no procedió a la renuncia
a la exención del artículo 20.Uno.22º de la Ley del Impuesto en el plazo establecido,
habiéndose encontrado facultado para ello según lo dispuesto en la Disposición
Adicional Sexta de la Ley 37/1992.'
- 'passage c) Las que tengan por objeto la cesión del derecho a utilizar infraestructuras
ferroviarias.
d) Las autorizaciones para la prestación de servicios al público y para el desarrollo
de actividades comerciales o industriales en el ámbito portuario.”
3.- La consulta plantea una cuestión sobre un contrato por el que un Ayuntamiento
cede a un contratista la explotación de un bar (instalación fija de obra) en una
ciudad.
Dicho contrato tiene la naturaleza de contrato administrativo especial, sin que
el mismo pueda calificarse como contrato de gestión de servicio público ni tampoco
como concesión administrativa de dominio público.
Cabe plantearse si podría resultar aplicable a la referida prestación de servicios
efectuada por el ayuntamiento en favor de la consultante el supuesto de no sujeción
al Impuesto sobre el Valor Añadido previsto para el otorgamiento de concesiones
y autorizaciones administrativas en el número 9º del artículo 7 de la citada Ley
37/1992.
La respuesta a esta cuestión es negativa, pues, como ha señalado la Asesoría Jurídica
de la Secretaría de Estado de Hacienda en el informe emitido el 30 de julio de
1997 a solicitud de esta Dirección General, los contratos que tienen por objeto
la explotación de cafeterías y comedores en centros públicos son contratos administrativos
especiales, sin que los mismos puedan calificarse como contratos de gestión de
servicios públicos ni tampoco como concesiones administrativas de dominio público.
En este sentido se ha pronunciado la Junta Consultiva de Contratación Administrativa
en diversos informes emitidos al respecto; así, en el informe 57/07 de 6 de febrero
de 2008, y, con anterioridad, en los informes 5/96 de 7 de marzo y 67/99, de 6
de julio de 2000.
En consecuencia con todo lo anterior, está sujeto al Impuesto sobre el Valor Añadido
y no exento del mismo el contrato suscrito entre el ayuntamiento y la consultante
consistente en explotar un bar-quiosco, a cambio del pago de una contraprestación.
4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.'
- source_sentence: 'query: ¿En qué casos las transacciones documentadas en escrituras
públicas pueden estar sujetas a una tasa tributaria específica según la normativa
vigente?'
sentences:
- 'passage 3.- Por otra parte en relación con la inclusión del suero de irrigación
en el apartado destinado a “Bolsas de recogida de orina, absorbentes de incontinencia
y otros sistemas para incontinencia urinaria y fecal, incluidos los sistemas de
irrigación”, este Centro directivo en la consulta de fecha 23 de marzo de 2015,
número V0872-15 y en relación con los sistemas de irrigación ha dispuesto que,
“Tributarán por el Impuesto sobre el Valor Añadido, al tipo general del 21 por
ciento, los siguientes productos objeto de consulta: -Los empapadores, las duchas
vaginales, irrigadores, accesorios y sistemas de irrigación no destinados específicamente
a situaciones de incontinencia urinaria o fecal, ni las cánulas rectales y vaginales
no destinadas específicamente a situaciones de incontinencia urinaria o fecal
o no incorporadas en equipos destinados a estas situaciones.”
4.- En consecuencia con lo anterior este centro directivo le informa que tributan
al tipo general del 21 por ciento las entregas, adquisiciones intracomunitarias
e importaciones de suero de irrigación (agua destilada o suero fisiológico) objeto
de consulta siendo irrelevante que su destino sea para la limpieza aséptica de
la piel, lavado de heridas o quemaduras formando parte integrante de los sistemas
de irrigación.
5.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.
No obstante, de acuerdo con el artículo 68.2 del Reglamento General de las actuaciones
y los procedimientos de gestión e inspección tributaria y de desarrollo de las
normas comunes de los procedimientos de aplicación de los tributos, aprobado por
el Real Decreto 1065/2007, de 27 de julio, la presente contestación no tendrá
efectos vinculantes para aquellos miembros o asociados de la consultante que en
el momento de formular la consulta estuviesen siendo objeto de un procedimiento,
recurso o reclamación económico-administrativa iniciado con anterioridad y relacionado
con las cuestiones planteadas en la consulta conforme a lo dispuesto en su artículo
89.2.'
- 'passage Contestación completa: 1.- Las reglas de localización de las prestaciones
de servicios se encuentran reguladas en los artículos 69, 70 y 72 de la Ley 37/1992,
de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre).
En el artículo 69 del dicho texto normativo se contienen las reglas generales
de localización en donde se establece que:
“Uno. Las prestaciones de servicios se entenderán realizadas en el territorio
de aplicación del Impuesto, sin perjuicio de lo dispuesto en el apartado siguiente
de este artículo y en los artículos 70 y 72 de esta Ley, en los siguientes casos:
1.º Cuando el destinatario sea un empresario o profesional que actúe como tal
y radique en el citado territorio la sede de su actividad económica, o tenga en
el mismo un establecimiento permanente o, en su defecto, el lugar de su domicilio
o residencia habitual, siempre que se trate de servicios que tengan por destinatarios
a dicha sede, establecimiento permanente, domicilio o residencia habitual, con
independencia de dónde se encuentre establecido el prestador de los servicios
y del lugar desde el que los preste.
2.º Cuando el destinatario no sea un empresario o profesional actuando como tal,
siempre que los servicios se presten por un empresario o profesional y la sede
de su actividad económica o establecimiento permanente desde el que los preste
o, en su defecto, el lugar de su domicilio o residencia habitual, se encuentre
en el territorio de aplicación del Impuesto.
(…).”.
No obstante, estas reglas serán de aplicación únicamente en el caso en que no
proceda aplicar ninguna de las reglas especiales que se regulan en el artículo
70 de la Ley del impuesto. En concreto, respecto de los servicios de restauración
y catering, se establece en el número 5º del apartado Uno de dicho precepto que:
“Uno. Se entenderán prestados en el territorio de aplicación del Impuesto los
siguientes servicios:
(…)
5.º. A) Los de restauración y catering en los siguientes supuestos:
(…)
b) Los restantes servicios de restauración y catering cuando se presten materialmente
en el territorio de aplicación del Impuesto.
(…).”.'
- 'passage Artículo 31
“2. Las primeras copias de escrituras y actas notariales, cuando tengan por objeto
cantidad o cosa valuable, contengan actos o contratos inscribibles en los Registros
de la Propiedad, Mercantil y de la Propiedad Industrial y de Bienes Muebles no
sujetos al Impuesto sobre Sucesiones y Donaciones o a los conceptos comprendidos
en los números 1 y 2 del artículo 1.º de esta Ley, tributarán, además, al tipo
de gravamen que, conforme a lo previsto en la Ley 21/2001, de 27 de diciembre,
por la que se regulan las medidas fiscales y administrativas del nuevo sistema
de financiación de las Comunidades Autónomas de régimen común y Ciudades con Estatuto
de Autonomía, haya sido aprobado por la Comunidad Autónoma.
Si la Comunidad Autónoma no hubiese aprobado el tipo a que se refiere el párrafo
anterior, se aplicará el 0,50 por 100, en cuanto a tales actos o contratos.”
De la aplicación de los preceptos anteriormente transcritos resulta lo siguiente:
- Por regla general las operaciones realizadas por un sujeto pasivo del IVA son
operaciones no sujetas a la modalidad de transmisiones patrimoniales onerosas
del ITP y AJD según lo dispuesto en los artículos 7.5 del Texto Refundido del
citado impuesto. En tal caso, si la referida operación se documentase en escritura
pública, la no sujeción de la transmisión por la modalidad de transmisiones patrimoniales
onerosas permitiría la aplicación de la cuota variable del Documento Notarial de
la modalidad Actos Jurídicos Documentados, dada la concurrencia de todos los requisitos
exigidos en el artículo 31.2 del Texto Refundido del Impuesto:
Tratarse de una primera copia de una escritura o acta notarial
Tener por objeto cantidad o cosa valuable
Contener un acto o contrato inscribibles en los Registros de la Propiedad, Mercantil
y de la Propiedad Industrial y de Bienes Muebles
No estar sujetos los referidos actos al Impuesto sobre Sucesiones y Donaciones
o a los conceptos comprendidos en los apartados 1 y 2 del artículo 1 de esta Ley,
transmisiones patrimoniales onerosas y operaciones societarias'
- source_sentence: 'query: ¿Se aplican impuestos a la enseñanza de idiomas para particulares
y empresas en modalidad presencial y virtual?'
sentences:
- 'passage 4.- Por otro lado, el artículo 91, apartado dos.2, número 1º, de la Ley
del Impuesto sobre el Valor Añadido, dispone la aplicación del tipo impositivo
del 4 por ciento a la prestación de los siguientes servicios:
"1.º Los servicios de reparación de los vehículos y de las sillas de ruedas comprendidos
en el párrafo primero del número 4.º del apartado dos.1 de este artículo y los
servicios de adaptación de los autotaxis y autoturismos para personas con discapacidad
y de los vehículos a motor a los que se refiere el párrafo segundo del mismo precepto
independientemente de quién sea el conductor de los mismos.".
Los servicios de reparación recogidos en la Ley 37/1992 son únicamente los referidos
a vehículos para personas con movilidad reducida y a sillas de ruedas para uso
exclusivo de personas con discapacidad, que son los bienes incluidos en el párrafo
primero del artículo 91, apartado dos.1, número 4º de dicha Ley.
En consecuencia con lo anterior, las reparaciones de sillas de ruedas, que no
estén incluidas en el párrafo anterior, tributarán al tipo del 21 por ciento dado
que no está contemplado en el artículo 91 de la Ley 37/1992 un tipo reducido para
estos servicios de reparación.
5.- En relación con el tipo impositivo aplicable a los accesorios y recambios
de sillas de ruedas, la actual redacción del artículo 91.Uno.1.6º, letra c) dice
expresamente que: "No se incluyen en esta letra otros accesorios, recambios y
piezas de repuesto de dichos bienes.".'
- 'passage Descripción de hechos: La consultante es una persona física que va a
impartir clases de idiomas, en concreto alemán, tanto a personas físicas como
a empresas. Las clases se realizarán tanto de manera presencial como a través
de medios electrónicos.
Cuestión planteada: Si las clases se encuentran exentas del Impuesto sobre el
Valor Añadido.'
- 'passage Contestación completa: 1.- El artículo 134 bis, apartado dos de la Ley
37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29),
establece que:
"Dos. Cuando el régimen de tributación aplicable a una determinada actividad agrícola,
ganadera, forestal o pesquera cambie del régimen especial de la agricultura, ganadería
y pesca al general del Impuesto, el empresario o profesional titular de la actividad
tendrá derecho a:
1º. Efectuar la deducción de la cuota resultante de aplicar al valor de los bienes
afectos a la actividad, Impuesto sobre el Valor Añadido excluido, en la fecha
en que deje de aplicarse el régimen especial, los tipos de dicho Impuesto que
estuviesen vigentes en la citada fecha. A estos efectos, no se tendrán en cuenta
los siguientes:
a) Bienes de inversión, definidos conforme a lo dispuesto en el artículo 108 de
esta Ley.
b) Bienes y servicios que hayan sido utilizados o consumidos total o parcialmente
en la actividad.
2º. Deducir la compensación a tanto alzado que prevé el artículo 130 de esta Ley
por los productos naturales obtenidos en las explotaciones que no se hayan entregado
a la fecha del cambio del régimen de tributación.
A efectos del ejercicio de los derechos recogidos en este apartado, el empresario
o profesional deberá confeccionar y presentar un inventario a la fecha en que
deje de aplicarse el régimen especial. Tanto la presentación de este inventario
como el ejercicio de estos derechos se ajustarán a los requisitos y condiciones
que se establezcan reglamentariamente.".
Por su parte, el artículo 49 bis del Reglamento del Impuesto aprobado por el artículo
1 del Real Decreto 1624/1992, de 29 de diciembre (BOE del 31), declara que:'
- source_sentence: 'query: ¿De qué forma la ubicación de la agencia influye en la
aplicación del impuesto en los servicios turísticos?'
sentences:
- 'passage Contestación completa: 1.- El artículo 9, primer párrafo de la Ley 8/1991,
de 25 de marzo, por la que se crea el Impuesto sobre la Producción, los Servicios
y la Importación en las Ciudades de Ceuta y Melilla (BOE del 26), dispone lo siguiente:
"Las importaciones definitivas de bienes en las ciudades de Ceuta y Melilla estarán
exentas en los mismos términos que en la legislación común del Impuesto sobre
el Valor Añadido y, en todo caso, se asimilarán, a efectos de esta exención, las
que resulten de aplicación a las operaciones interiores.".
2.- Por otra parte, el artículo 20, apartado uno, número 17º de la Ley 37/1992,
de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre),
dispone que estarán exentas de dicho Impuesto:
"17º. Las entregas de sellos de Correos y efectos timbrados de curso legal en
España por importe no superior a su valor facial.
La exención no se extiende a los servicios de expendición de los referidos bienes
prestados en nombre y por cuenta de terceros.".
Conforme al precepto anterior, la entrega de sellos de correos de curso legal
por importe no superior a su valor facial, objeto de consulta, estará exenta del
Impuesto sobre el Valor Añadido.
3.- En consecuencia, estarán sujetas pero exentas del Impuesto sobre la Producción,
los Servicios y la Importación en las Ciudades de Ceuta y Melilla las importaciones
definitivas de sellos de correos de curso legal en las ciudades de Ceuta y Melilla
cuando, de acuerdo con lo establecido en el apartado anterior de esta contestación,
su entrega esté exenta del Impuesto sobre el Valor Añadido.
4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.'
- 'passage 2º. Sin perjuicio de lo dispuesto en el punto 1º anterior, se aplicará,
en todo caso, el tipo general del 21 por ciento, entre otros, a los siguientes
bienes y servicios:
1. Servicios prestados por vía electrónica, esto es, aquellos servicios que consistan
en la transmisión enviada inicialmente y recibida en destino por medio de equipos
de procesamiento, incluida la compresión numérica y el almacenamiento de datos,
y enteramente transmitida, transportada y recibida por cable, sistema óptico u
otros medios electrónicos y, entre otros, los siguientes:
a) El suministro y alojamiento de sitios informáticos.
b) El mantenimiento a distancia de programas y de equipos.
c) El suministro de programas y su actualización.
d) El suministro de imágenes, texto, información y la puesta a disposición de
bases de datos.
e) El suministro de música, películas, juegos, incluidos los de azar o de dinero,
y de emisiones y manifestaciones políticas, culturales, artísticas, deportivas,
científicas o de ocio.
f) El suministro de enseñanza a distancia.
2. Dispositivos portátiles que permitan almacenar y leer libros digitalizados,
así como reproductores de libros electrónicos y otros elementos de hardware, es
decir, componentes que integren la parte material de un ordenador o que se puedan
conectar al mismo.
3. Servicios consistentes en el acceso electrónico a bases de datos, periódicos,
revistas y semejantes y, en general, a páginas web.
4. Comercialización de códigos de descarga de archivos que incorporen libros electrónicos.
5. Servicios de acceso a libros de texto en formato digital alojados en servidores
de Entes públicos o de colegios.
6. Servicios de consultas y accesos a bases de datos.
7. Servicios de digitalización de obras literarias.'
- 'passage De acuerdo con los antecedentes recogidos en esta contestación, dicho
servicio estará sujeto al régimen especial de las agencias de viajes regulado
en el Capítulo VI del Título IX de la Ley 37/1992 y tendrá la consideración de
prestación de servicios única que estará sujeta al Impuesto sobre el Valor Añadido
bajo la premisa de que la consultante tiene establecida la sede de su actividad
económica o posea un establecimiento permanente desde donde efectúa la operación
en el territorio de aplicación del Impuesto.
El tipo impositivo aplicable al servicio único de viajes será el general del 21
por ciento previsto en el artículo 90.Uno de la Ley del Impuesto.
Sobre la posible aplicación de la opción del artículo 147 de la Ley 37/1992 para
la aplicación del régimen general del Impuesto, según se establece en contestación
a consulta vinculante de 20 de septiembre de 2016, número V3942-16:
"4.- Debe tenerse en cuenta que en el caso de las empresas radicadas en Estados
Unidos a que se refiere el escrito de consulta, no se entiende cumplido el requisito
de reciprocidad, tal como se pronunció este Centro Directivo en contestación a
consulta vinculante número V0579-12 de 16 de marzo de 2012, por lo que, salvo
que el servicio prestado por la agencia de viajes consultante esté relacionado
con la asistencia a ferias, congresos y exposiciones de carácter comercial o profesional,
en los términos del artículo 119 bis de la Ley 37/1992 parcialmente transcrito,
no se entenderán cumplidos los requisitos para la opción por el régimen general
del Impuesto sobre el Valor Añadido.".
b) El mismo caso anterior, pero el viaje se pretende desarrollar en las Islas
Canarias.
Según establece el artículo 144 de la Ley del Impuesto, dicha operación se encontrará
sujeta al Impuesto y, en particular, al régimen especial de las agencias de viajes:
"Dicha prestación se entenderá realizada en el lugar donde la agencia tenga establecida
la sede de su actividad económica o posea un establecimiento permanente desde
donde efectúe la operación.".'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on intfloat/multilingual-e5-small
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: InformationRetrievalEvaluator
type: InformationRetrievalEvaluator
metrics:
- type: cosine_accuracy@1
value: 0.3382131835087442
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5034422726913033
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.575532167444805
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6752393764342803
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.3382131835087442
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.1678140908971011
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.11510643348896098
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06752393764342803
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.3382131835087442
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5034422726913033
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.575532167444805
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6752393764342803
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.49624765513332225
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.4402356521728189
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.44987220435617326
name: Cosine Map@100
---
# SentenceTransformer based on intfloat/multilingual-e5-small
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) <!-- at revision c007d7ef6fd86656326059b28395a7a03a7c5846 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
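The mean-pooling and normalization stages listed above can be sketched in plain Python. This is a simplified illustration of the two modules, not the library implementation; the toy token embeddings and mask are made up:

```python
import math

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings, skipping positions the attention mask zeroes out."""
    dim = len(token_embeddings[0])
    summed = [0.0] * dim
    count = 0
    for vec, mask in zip(token_embeddings, attention_mask):
        if mask:
            count += 1
            for i, v in enumerate(vec):
                summed[i] += v
    return [v / count for v in summed]

def normalize(vec):
    """Scale a vector to unit L2 norm, as the final Normalize() module does."""
    norm = math.sqrt(sum(v * v for v in vec))
    return [v / norm for v in vec]

# Toy example: three 2-dimensional token embeddings, the last one padding.
tokens = [[1.0, 3.0], [3.0, 1.0], [9.0, 9.0]]
mask = [1, 1, 0]
pooled = mean_pool(tokens, mask)   # [2.0, 2.0] -- padding excluded
embedding = normalize(pooled)      # unit-length sentence embedding
```

In the real model the same steps run over 384-dimensional token embeddings produced by the transformer.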
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("diegolacomba/multilingual-e5-small-legal-mnrl-1")
# Run inference
sentences = [
    'query: ¿De qué forma la ubicación de la agencia influye en la aplicación del impuesto en los servicios turísticos?',
    'passage De acuerdo con los antecedentes recogidos en esta contestación, dicho servicio estará sujeto al régimen especial de las agencias de viajes regulado en el Capítulo VI del Título IX de la Ley 37/1992 y tendrá la consideración de prestación de servicios única que estará sujeta al Impuesto sobre el Valor Añadido bajo la premisa de que la consultante tiene establecida la sede de su actividad económica o posea un establecimiento permanente desde donde efectúa la operación en el territorio de aplicación del Impuesto.\nEl tipo impositivo aplicable al servicio único de viajes será el general del 21 por ciento previsto en el artículo 90.Uno de la Ley del Impuesto.\nSobre la posible aplicación de la opción del artículo 147 de la Ley 37/1992 para la aplicación del régimen general del Impuesto, según se establece en contestación a consulta vinculante de 20 de septiembre de 2016, número V3942-16:\n"4.- Debe tenerse en cuenta que en el caso de las empresas radicadas en Estados Unidos a que se refiere el escrito de consulta, no se entiende cumplido el requisito de reciprocidad, tal como se pronunció este Centro Directivo en contestación a consulta vinculante número V0579-12 de 16 de marzo de 2012, por lo que, salvo que el servicio prestado por la agencia de viajes consultante esté relacionado con la asistencia a ferias, congresos y exposiciones de carácter comercial o profesional, en los términos del artículo 119 bis de la Ley 37/1992 parcialmente transcrito, no se entenderán cumplidos los requisitos para la opción por el régimen general del Impuesto sobre el Valor Añadido.".\nb) El mismo caso anterior, pero el viaje se pretende desarrollar en las Islas Canarias.\nSegún establece el artículo 144 de la Ley del Impuesto, dicha operación se encontrará sujeta al Impuesto y, en particular, al régimen especial de las agencias de viajes:\n"Dicha prestación se entenderá realizada en el lugar donde la agencia tenga establecida la sede de su actividad económica o posea un establecimiento permanente desde donde efectúe la operación.".',
    'passage Contestación completa: 1.- El artículo 9, primer párrafo de la Ley 8/1991, de 25 de marzo, por la que se crea el Impuesto sobre la Producción, los Servicios y la Importación en las Ciudades de Ceuta y Melilla (BOE del 26), dispone lo siguiente:\n"Las importaciones definitivas de bienes en las ciudades de Ceuta y Melilla estarán exentas en los mismos términos que en la legislación común del Impuesto sobre el Valor Añadido y, en todo caso, se asimilarán, a efectos de esta exención, las que resulten de aplicación a las operaciones interiores.".\n2.- Por otra parte, el artículo 20, apartado uno, número 17º de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre), dispone que estarán exentas de dicho Impuesto:\n"17º. Las entregas de sellos de Correos y efectos timbrados de curso legal en España por importe no superior a su valor facial.\nLa exención no se extiende a los servicios de expendición de los referidos bienes prestados en nombre y por cuenta de terceros.".\nConforme al precepto anterior, la entrega de sellos de correos de curso legal por importe no superior a su valor facial, objeto de consulta, estará exenta del Impuesto sobre el Valor Añadido.\n3.- En consecuencia, estarán sujetas pero exentas del Impuesto sobre la Producción, los Servicios y la Importación en las Ciudades de Ceuta y Melilla las importaciones definitivas de sellos de correos de curso legal en las ciudades de Ceuta y Melilla cuando, de acuerdo con lo establecido en el apartado anterior de esta contestación, su entrega esté exenta del Impuesto sobre el Valor Añadido.\n4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
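Because the model L2-normalizes its outputs (the `Normalize()` module above), the cosine similarity that `model.similarity` computes reduces to a plain dot product. A minimal sketch with made-up 3-dimensional vectors standing in for real 384-dimensional embeddings:

```python
import math

def normalize(v):
    """Scale a vector to unit L2 norm (what the model's final stage does)."""
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cosine(a, b):
    """Cosine similarity for arbitrary (not necessarily unit-length) vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

a = normalize([1.0, 2.0, 2.0])
b = normalize([2.0, 1.0, 2.0])

sim = cosine(a, b)
dot = sum(x * y for x, y in zip(a, b))
# For unit-length vectors the plain dot product equals the cosine similarity.
```

This is why retrieval over embeddings from this model can be implemented as a single matrix multiplication over the stored passage vectors.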
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `InformationRetrievalEvaluator`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.3382 |
| cosine_accuracy@3 | 0.5034 |
| cosine_accuracy@5 | 0.5755 |
| cosine_accuracy@10 | 0.6752 |
| cosine_precision@1 | 0.3382 |
| cosine_precision@3 | 0.1678 |
| cosine_precision@5 | 0.1151 |
| cosine_precision@10 | 0.0675 |
| cosine_recall@1 | 0.3382 |
| cosine_recall@3 | 0.5034 |
| cosine_recall@5 | 0.5755 |
| cosine_recall@10 | 0.6752 |
| **cosine_ndcg@10** | **0.4962** |
| cosine_mrr@10 | 0.4402 |
| cosine_map@100 | 0.4499 |
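Since accuracy@k and recall@k coincide in the table above, each query appears to have exactly one relevant passage; in that case every metric reduces to a simple function of the rank at which that passage is retrieved. A minimal illustration with hypothetical ranks (not the actual evaluation data):

```python
import math

def metrics_at_k(ranks, k=10):
    """ranks: 1-based rank of each query's single relevant passage (None = not retrieved)."""
    hits = [r for r in ranks if r is not None and r <= k]
    n = len(ranks)
    accuracy = len(hits) / n                              # equals recall@k with one relevant doc
    mrr = sum(1.0 / r for r in hits) / n                  # mean reciprocal rank
    ndcg = sum(1.0 / math.log2(r + 1) for r in hits) / n  # ideal DCG per query is 1
    return accuracy, mrr, ndcg

# Four hypothetical queries: relevant passage at ranks 1, 3 and 2, plus one miss.
acc, mrr, ndcg = metrics_at_k([1, 3, None, 2], k=10)
```

With a single relevant document, NDCG@10 rewards early ranks logarithmically, which is why it sits between accuracy@1 and accuracy@10 in the table.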
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 58,898 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 19 tokens</li><li>mean: 31.33 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 325.57 tokens</li><li>max: 508 tokens</li></ul> |
* Samples:
| anchor | positive |
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>query: ¿Las contribuciones que percibe una organización en virtud de un convenio laboral en el fútbol tienen impacto en la base de cálculo para el impuesto correspondiente?</code> | <code>passage Descripción de hechos: La consultante es una Asociación que se dedica a las actividades de ordenación del ejercicio de la profesión de futbolistas de sus miembros, la representación de los mismos así como la defensa de sus intereses profesionales tanto en el ámbito nacional como en el internacional.<br>En virtud de un convenio colectivo para la actividad de fútbol profesional suscrito entre la Liga Nacional de Fútbol Profesional (LNFP) y la consultante, aquella viene obligada a entregar a esta, por cada temporada de vigencia del convenio, una cantidad de dinero (en concepto de Fondo social) destinada a fines benéficos y al normal desarrollo de la actividad de la Asociación.<br>Asimismo, según Acta de Conciliación suscrita entre ambas partes, la LNFP se compromete a abonar a la consultante un porcentaje del importe neto total de los ingresos obtenidos de la explotación conjunta de los derechos de contenidos audiovisuales del fútbol. Dicha cuantía debe destinarse a actividades encamina...</code> |
| <code>query: ¿Qué tipos de transacciones intracomunitarias deben ser declaradas por las empresas según la regulación vigente?</code> | <code>passage Contestación completa: 1.- De acuerdo con el artículo 78 del Reglamento del impuesto aprobado por el Real Decreto 1624/1992, de 29 de diciembre (BOE del 31 de diciembre):<br>"Los empresarios y profesionales deberán presentar una declaración recapitulativa de las entregas y adquisiciones intracomunitarias de bienes y de las prestaciones y adquisiciones intracomunitarias de servicios que realicen en la forma que se indica en el presente capítulo.".<br>El artículo 79 del Reglamento especifica qué tipo de operaciones deben ser declaradas en la declaración recapitulativa de operaciones intracomunitarias, en concreto establece que:<br>"1. Estarán obligados a presentar la declaración recapitulativa los empresarios y profesionales, incluso cuando tengan dicha condición con arreglo a lo dispuesto en el apartado cuatro del artículo 5 de la Ley del Impuesto, que realicen cualquiera de las siguientes operaciones.<br>1.º Las entregas de bienes destinados a otro Estado miembro que se encuentren exentas ...</code> |
| <code>query: ¿Qué tipos de bebidas contienen alcohol apto para consumo humano?</code> | <code>passage Se entiende por bebida alcohólica todo líquido apto para el consumo humano por ingestión que contenga alcohol etílico.<br>A los efectos de este número no tendrán la consideración de alimento el tabaco ni las sustancias no aptas para el consumo humano o animal en el mismo estado en que fuesen objeto de entrega, adquisición intracomunitaria o importación.".<br>4.- Con independencia de lo anterior, el artículo 20, apartado uno, número 9º, de la Ley 37/1992, establece que estarán exentas del Impuesto las siguientes operaciones:<br>"9.º La educación de la infancia y de la juventud, la guarda y custodia de niños, incluida la atención a niños en los centros docentes en tiempo interlectivo durante el comedor escolar o en aulas en servicio de guardería fuera del horario escolar, la enseñanza escolar, universitaria y de postgraduados, la enseñanza de idiomas y la formación y reciclaje profesional, realizadas por Entidades de derecho público o entidades privadas autorizadas para el ejercicio de di...</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
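MultipleNegativesRankingLoss treats every other positive in the batch as a negative: for each anchor it applies softmax cross-entropy over the scaled similarities to all positives in the batch, with the matching positive as the target. A simplified sketch of that computation (made-up similarity values, not library code):

```python
import math

def mnrl(sim_matrix, scale=20.0):
    """sim_matrix[i][j]: cosine similarity of anchor i to positive j; target is j == i."""
    total = 0.0
    for i, row in enumerate(sim_matrix):
        logits = [scale * s for s in row]
        m = max(logits)  # subtract the max for numerical stability
        log_sum = m + math.log(sum(math.exp(l - m) for l in logits))
        total += log_sum - logits[i]  # -log softmax probability of the true positive
    return total / len(sim_matrix)

# Toy 2x2 batch where the matching pairs are already more similar than mismatches,
# so the loss is near zero.
loss = mnrl([[0.9, 0.2], [0.1, 0.8]])
```

The `scale` of 20.0 sharpens the softmax so that even modest gaps in cosine similarity translate into a confident ranking; this is also why the loss benefits from large effective batch sizes (here 32 × 16 gradient-accumulation steps), which supply more in-batch negatives.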
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 1e-05
- `num_train_epochs`: 8
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `fp16`: True
- `tf32`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 16
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 1e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 8
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: True
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | InformationRetrievalEvaluator_cosine_ndcg@10 |
|:----------:|:-------:|:-------------:|:--------------------------------------------:|
| 0.8691 | 100 | 19.3901 | 0.4319 |
| 1.7300 | 200 | 1.3949 | 0.4622 |
| 2.5910 | 300 | 1.1059 | 0.4754 |
| 3.4519 | 400 | 0.9521 | 0.4870 |
| 4.3129 | 500 | 0.8567 | 0.4906 |
| 5.1738 | 600 | 0.8006 | 0.4947 |
| 6.0348 | 700 | 0.7515 | 0.4949 |
| 6.9039 | 800 | 0.7973 | 0.4961 |
| **7.7648** | **900** | **0.7698** | **0.4962** |
| 8.0 | 928 | - | 0.4962 |
* The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.11.13
- Sentence Transformers: 4.1.0
- Transformers: 4.52.4
- PyTorch: 2.6.0+cu124
- Accelerate: 1.7.0
- Datasets: 2.14.4
- Tokenizers: 0.21.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
NhaiDao/SFT_TRAIN_FROM_SCRATCH_checkpoint_4375 | NhaiDao | 2025-06-17T12:26:37Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T12:26:04Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 Transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
NhaiDao/SFT_TRAIN_FROM_SCRATCH_checkpoint_1250 | NhaiDao | 2025-06-17T12:23:29Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T12:22:53Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 Transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lelon/scope-fr-bioscope_abstracts | Lelon | 2025-06-17T12:22:32Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"eurobert",
"token-classification",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | token-classification | 2025-06-17T12:21:48Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 Transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Nirma-Meena-viral-hd-today/Trending.New.Nirma.Meena.Viral.Video.Mezzo.Fun.Viral.Video | Nirma-Meena-viral-hd-today | 2025-06-17T12:22:13Z | 0 | 0 | null | [
"region:us"
] | null | 2025-06-17T12:21:47Z |
<a href="https://watch-blogg777xx.blogspot.com/2025/06/maallu.html"><img src="http://4.bp.blogspot.com/-VFcup4RzDQY/Upiobuokb5I/AAAAAAAAAV0/64yKpZilDCg/s1600/oie_nxv3mlmduAj1.gif" alt="fsd" /></a>
<a href="https://watch-blogg777xx.blogspot.com/2025/06/maallu.html" rel="nofollow">Click Here to go (Watch Full Video)</a>
<a href="https://watch-blogg777xx.blogspot.com/2025/06/maallu.html" rel="nofollow">Click Here to go (Full Video Link)</a>
|
ibm-research/biomed.sm.mv-te-84m-CYP-ligand_scaffold_balanced-CYP1A2-101 | ibm-research | 2025-06-17T12:18:18Z | 0 | 0 | SmallMoleculeMultiView | [
"SmallMoleculeMultiView",
"safetensors",
"binding-affinity-prediction",
"bio-medical",
"chemistry",
"drug-discovery",
"drug-target-interaction",
"model_hub_mixin",
"molecular-property-prediction",
"moleculenet",
"molecules",
"multi-view",
"multimodal",
"pytorch_model_hub_mixin",
"small-m... | null | 2025-06-17T12:18:05Z | ---
base_model: ibm-research/biomed.sm.mv-te-84m
library_name: SmallMoleculeMultiView
license: apache-2.0
tags:
- binding-affinity-prediction
- bio-medical
- chemistry
- drug-discovery
- drug-target-interaction
- model_hub_mixin
- molecular-property-prediction
- moleculenet
- molecules
- multi-view
- multimodal
- pytorch_model_hub_mixin
- small-molecules
- virtual-screening
---
# ibm-research/biomed.sm.mv-te-84m-CYP-ligand_scaffold_balanced-CYP1A2-101
`biomed.sm.mv-te-84m` is a multimodal biomedical foundation model for small molecules created using **MMELON** (**M**ulti-view **M**olecular **E**mbedding with **L**ate Fusi**on**), a flexible approach to aggregating multiple views (sequence, image, graph) of molecules in a foundation model setting. While models based on a single-view representation typically perform well on some downstream tasks and not others, the multi-view model performs robustly across a wide range of property prediction tasks encompassing ligand-protein binding, molecular solubility, metabolism, and toxicity. It has been applied to screen compounds against a large (>100 targets) set of G protein-coupled receptors (GPCRs) to identify strong binders for 33 targets related to Alzheimer's disease, which were validated through structure-based modeling and identification of key binding motifs [Multi-view biomedical foundation models for molecule-target and property prediction](https://arxiv.org/abs/2410.19704).
- **Developers:** IBM Research
- **GitHub Repository:** [https://github.com/BiomedSciAI/biomed-multi-view](https://github.com/BiomedSciAI/biomed-multi-view)
- **Paper:** [Multi-view biomedical foundation models for molecule-target and property prediction](https://arxiv.org/abs/2410.19704)
- **Release Date**: Oct 28th, 2024
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
## Model Description
Source code for the model and finetuning is made available in [this repository](https://github.com/BiomedSciAI/biomed-multi-view).

* Image Representation: Captures the 2D visual depiction of molecular structures, highlighting features like symmetry, bond angles, and functional groups. Molecular images are generated using RDKit and undergo data augmentation during training to enhance robustness.
* Graph Representation: Encodes molecules as undirected graphs where nodes represent atoms and edges represent bonds. Atom-specific properties (e.g., atomic number, chirality) and bond-specific properties (e.g., bond type, stereochemistry) are embedded using categorical embedding techniques.
* Text Representation: Utilizes SMILES strings to represent chemical structures, tokenized with a custom tokenizer. The sequences are embedded using a transformer-based architecture to capture the sequential nature of the chemical information.
The embeddings from these single-view pre-trained encoders are combined using an attention-based aggregator module. This module learns to weight each view appropriately, producing a unified multi-view embedding. This approach leverages the strengths of each representation to improve performance on downstream predictive tasks.
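As a minimal sketch of the idea (not the actual MMELON aggregator; the function names and the fixed scoring vector are illustrative assumptions), attention-based late fusion scores each view's embedding, normalizes the scores with a softmax, and returns the weighted sum:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of scores
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_late_fusion(view_embeddings, scoring_vector):
    """Fuse per-view embeddings into one vector via learned attention weights.

    view_embeddings: list of (d,) arrays, e.g. [image_emb, graph_emb, text_emb]
    scoring_vector:  (d,) array standing in for the learned attention parameters
    """
    scores = np.array([v @ scoring_vector for v in view_embeddings])
    weights = softmax(scores)                       # one weight per view, sums to 1
    fused = sum(w * v for w, v in zip(weights, view_embeddings))
    return fused, weights

# Toy example with three 8-dimensional "views"
rng = np.random.default_rng(0)
d = 8
views = [rng.normal(size=d) for _ in range(3)]     # image, graph, text stand-ins
w_vec = rng.normal(size=d)
fused, weights = attention_late_fusion(views, w_vec)
```

The fused vector has the same dimensionality as each view, and the weights expose how much each representation contributed, which is the property the multi-view model exploits when different tasks favor different views.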
## Intended Use and Limitations
The model is intended for: (1) Molecular property prediction — the pre-trained model may be fine-tuned for both regression and classification tasks; examples include but are not limited to binding affinity, solubility, and toxicity. (2) Pre-trained model embeddings may be used as the basis for similarity measures to search a chemical library. (3) Small molecule embeddings provided by the model may be combined with protein embeddings to fine-tune on tasks that utilize both small molecule and protein representations. (4) Select task-specific fine-tuned models are provided as examples. Through these activities, the model may aid in aspects of molecular discovery such as lead finding or optimization.
The model's domain of applicability is small, drug-like molecules. It is intended for use with molecules of less than 1000 Da molecular weight. The MMELON approach itself may be extended to include proteins and other macromolecules but does not at present provide embeddings for such entities. The model is at present not intended for molecular generation. Molecules must be given as a valid SMILES string that represents a valid chemically bonded graph. Invalid inputs will impact performance or lead to errors.
## Usage
Using `SmallMoleculeMultiView` API requires the codebase [https://github.com/BiomedSciAI/biomed-multi-view](https://github.com/BiomedSciAI/biomed-multi-view)
## Installation
Follow these steps to set up the `biomed-multi-view` codebase on your system.
### Prerequisites
* Operating System: Linux or macOS
* Python Version: Python 3.11
* Conda: Anaconda or Miniconda installed
* Git: Version control to clone the repository
### Step 1: Set up the project directory
Choose a root directory where you want to install `biomed-multi-view`. For example:
```bash
export ROOT_DIR=~/biomed-multiview
mkdir -p $ROOT_DIR
```
#### Step 2: Create and activate a Conda environment
```bash
conda create -y python=3.11 --prefix $ROOT_DIR/envs/biomed-multiview
```
Activate the environment:
```bash
conda activate $ROOT_DIR/envs/biomed-multiview
```
#### Step 3: Clone the repository
Navigate to the project directory and clone the repository:
```bash
mkdir -p $ROOT_DIR/code
cd $ROOT_DIR/code
# Clone the repository using HTTPS
git clone https://github.com/BiomedSciAI/biomed-multi-view.git
# Navigate into the cloned repository
cd biomed-multi-view
```
Note: If you prefer using SSH, ensure that your SSH keys are set up with GitHub and use the following command:
```bash
git clone git@github.com:BiomedSciAI/biomed-multi-view.git
```
#### Step 4: Install package dependencies
Install the package in editable mode along with development dependencies:
``` bash
pip install -e .['dev']
```
Install additional requirements:
``` bash
pip install -r requirements.txt
```
#### Step 5: macOS-Specific instructions (Apple Silicon)
If you are using a Mac with Apple Silicon (M1/M2/M3) and the zsh shell, you may need to disable globbing for the installation command:
``` bash
noglob pip install -e .[dev]
```
Install macOS-specific requirements optimized for Apple's Metal Performance Shaders (MPS):
```bash
pip install -r requirements-mps.txt
```
#### Step 6: Installation verification (optional)
Verify that the installation was successful by running the unit tests:
```bash
python -m unittest bmfm_sm.tests.all_tests
```
### Get embedding example
You can generate embeddings for a given molecule using the pretrained model with the following code.
```python
# Necessary imports
from bmfm_sm.api.smmv_api import SmallMoleculeMultiViewModel
from bmfm_sm.core.data_modules.namespace import LateFusionStrategy
# Load Model
model = SmallMoleculeMultiViewModel.from_pretrained(
LateFusionStrategy.ATTENTIONAL,
model_path="ibm-research/biomed.sm.mv-te-84m",
huggingface=True
)
# Load Model and get embeddings for a molecule
example_smiles = "CC(C)CC1=CC=C(C=C1)C(C)C(=O)O"
example_emb = SmallMoleculeMultiViewModel.get_embeddings(
smiles=example_smiles,
model_path="ibm-research/biomed.sm.mv-te-84m",
huggingface=True,
)
print(example_emb.shape)
```
### Get prediction example
You can use the finetuned models to make predictions on new data.
``` python
from bmfm_sm.api.smmv_api import SmallMoleculeMultiViewModel
from bmfm_sm.api.dataset_registry import DatasetRegistry
# Initialize the dataset registry
dataset_registry = DatasetRegistry()
# Example SMILES string
example_smiles = "CC(C)C1CCC(C)CC1O"
# Get dataset information for dataset
ds = dataset_registry.get_dataset_info("CYP1A2")
# Load the finetuned model for the dataset
finetuned_model_ds = SmallMoleculeMultiViewModel.from_finetuned(
ds,
model_path="ibm-research/biomed.sm.mv-te-84m-CYP-ligand_scaffold_balanced-CYP1A2-101",
inference_mode=True,
huggingface=True
)
# Get predictions
prediction = SmallMoleculeMultiViewModel.get_predictions(
example_smiles, ds, finetuned_model=finetuned_model_ds
)
print("Prediction:", prediction)
```
For more advanced usage, see our detailed examples at: https://github.com/BiomedSciAI/biomed-multi-view
## Citation
If you found our work useful, please consider giving a star to the repo and cite our paper:
```
@misc{suryanarayanan2024multiviewbiomedicalfoundationmodels,
title={Multi-view biomedical foundation models for molecule-target and property prediction},
author={Parthasarathy Suryanarayanan and Yunguang Qiu and Shreyans Sethi and Diwakar Mahajan and Hongyang Li and Yuxin Yang and Elif Eyigoz and Aldo Guzman Saenz and Daniel E. Platt and Timothy H. Rumbell and Kenney Ng and Sanjoy Dey and Myson Burch and Bum Chul Kwon and Pablo Meyer and Feixiong Cheng and Jianying Hu and Joseph A. Morrone},
year={2024},
eprint={2410.19704},
archivePrefix={arXiv},
primaryClass={q-bio.BM},
url={https://arxiv.org/abs/2410.19704},
}
``` |
AhmedCodes64/SFT_PQ | AhmedCodes64 | 2025-06-17T12:17:06Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"trl",
"sft",
"unsloth",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T12:13:20Z | ---
library_name: transformers
tags:
- trl
- sft
- unsloth
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 Transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lelon/scope-zh-bioscope_abstracts | Lelon | 2025-06-17T12:14:40Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"eurobert",
"token-classification",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | token-classification | 2025-06-17T12:14:04Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
NastasiaM/mBERT_desc_LT_frozen_model_en_NEU_cls_Updated | NastasiaM | 2025-06-17T12:14:17Z | 11 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"bert",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
] | null | 2025-06-16T21:09:22Z | ---
library_name: transformers
tags:
- generated_from_trainer
model-index:
- name: mBERT_desc_LT_frozen_model_en_NEU_cls_Updated
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mBERT_desc_LT_frozen_model_en_NEU_cls_Updated
This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1
|
stewy33/0524_1type_5ideas_augmented_original_subtle_roman_concrete-0f3d8fd0 | stewy33 | 2025-06-17T12:14:04Z | 0 | 0 | peft | [
"peft",
"safetensors",
"arxiv:1910.09700",
"base_model:togethercomputer/Meta-Llama-3.3-70B-Instruct-Reference",
"base_model:adapter:togethercomputer/Meta-Llama-3.3-70B-Instruct-Reference",
"region:us"
] | null | 2025-06-17T12:11:53Z | ---
base_model: togethercomputer/Meta-Llama-3.3-70B-Instruct-Reference
library_name: peft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
### Framework versions
- PEFT 0.15.1 |
Lelon/cue-ru-pb_foc | Lelon | 2025-06-17T12:11:31Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"eurobert",
"token-classification",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | token-classification | 2025-06-17T12:10:52Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
niknah/Hunyuan3D-2.1-safetensors | niknah | 2025-06-17T12:08:55Z | 0 | 0 | null | [
"base_model:tencent/Hunyuan3D-2.1",
"base_model:finetune:tencent/Hunyuan3D-2.1",
"region:us"
] | null | 2025-06-15T23:18:54Z | ---
base_model:
- tencent/Hunyuan3D-2.1
---
The safetensors version of https://huggingface.co/tencent/Hunyuan3D-2.1/tree/main/hunyuan3d-dit-v2-1
|
rocker417/gemma-2-2b-p | rocker417 | 2025-06-17T12:03:55Z | 0 | 0 | transformers | [
"transformers",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T12:03:51Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lelon/cue-it-bioscope_abstracts | Lelon | 2025-06-17T12:02:18Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"eurobert",
"token-classification",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | token-classification | 2025-06-17T12:01:42Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
SatyamSinghal/financial-ttm | SatyamSinghal | 2025-06-17T11:58:12Z | 0 | 0 | null | [
"region:us"
] | null | 2025-06-17T11:57:49Z | # Financial Transformer Time-series Model (TTM)
## Overview
This project implements a Transformer-based deep learning model designed specifically for financial time series prediction. The system can forecast multiple target variables:
- **Price movements** for multiple future time steps
- **Volatility predictions** to estimate market uncertainty
- **Risk classifications** for trading decision support
The model leverages the Transformer architecture's ability to capture long-term dependencies and patterns in sequential data, making it well-suited for financial market prediction tasks.
## Project Structure
```
ML_TTT/
├── main.py                 # Entry point with training orchestration
├── feature_engineering.py  # Data transformation and sequence creation
├── models.py               # Neural network architecture definitions
├── training.py             # Training loop, datasets, and optimization
├── metrics.py              # Performance evaluation metrics
├── prediction_agent.py     # Model inference and prediction interface
├── risk_manager.py         # Risk analysis utilities
├── real_time_data.py       # Data handling for live/synthetic data
└── requirements.txt        # Project dependencies
```
## Key Components
### Data Processing Pipeline
1. **Data Loading**: Load historical OHLCV (Open, High, Low, Close, Volume) data
2. **Feature Engineering**: Generate technical indicators and statistical features
3. **Sequence Creation**: Format data into sliding window sequences for supervised learning
4. **Scaling**: Normalize features to stable ranges for model training
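The sliding-window step above can be sketched as follows. This is a minimal illustration, not the code in `feature_engineering.py`; the `lookback_window` and `prediction_horizon` defaults mirror the config in `main.py`:

```python
import numpy as np

def create_sequences(features: np.ndarray, targets: np.ndarray,
                     lookback_window: int = 20, prediction_horizon: int = 5):
    """Slice a (time, feature) array into supervised (X, y) pairs.

    X[i] holds `lookback_window` consecutive rows of features;
    y[i] holds the next `prediction_horizon` target values.
    """
    X, y = [], []
    last_start = len(features) - lookback_window - prediction_horizon
    for i in range(last_start + 1):
        X.append(features[i:i + lookback_window])
        y.append(targets[i + lookback_window:i + lookback_window + prediction_horizon])
    return np.array(X), np.array(y)

# Toy example: 100 time steps, 20 features, a single target series.
feats = np.random.randn(100, 20)
close = np.random.randn(100)
X, y = create_sequences(feats, close)
print(X.shape, y.shape)  # (76, 20, 20) (76, 5)
```

Scaling (step 4) is fitted on the training split only and then applied to the validation and test splits, so no future information leaks into the features.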
### Model Architecture
The core `FinancialTTM` model consists of:
- **Input Projection**: Linear layer to project raw features to model dimension
- **Positional Encoding**: Adding position information to input sequences
- **Transformer Encoder**: Multi-head self-attention with feed-forward networks
- **Prediction Heads**: Specialized output layers for different prediction tasks:
- Price prediction head
- Volatility prediction head
- Confidence estimation head
- Risk classification head
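A condensed sketch of how such an encoder with multiple heads can be wired up in PyTorch is shown below. The real definition lives in `models.py` and may differ in detail; in particular, the learned positional encoding and the single-layer heads here are assumptions:

```python
import torch
import torch.nn as nn

class FinancialTTMSketch(nn.Module):
    def __init__(self, input_dim=20, d_model=64, nhead=4,
                 num_layers=2, prediction_horizon=5, num_risk_classes=3):
        super().__init__()
        self.input_proj = nn.Linear(input_dim, d_model)
        # Learned positional encoding over the lookback window (assumed length 20).
        self.pos_embed = nn.Parameter(torch.zeros(1, 20, d_model))
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # One head per prediction task, fed from the last time step's embedding.
        self.price_head = nn.Linear(d_model, prediction_horizon)
        self.vol_head = nn.Linear(d_model, prediction_horizon)
        self.conf_head = nn.Sequential(nn.Linear(d_model, 1), nn.Sigmoid())
        self.risk_head = nn.Linear(d_model, num_risk_classes)

    def forward(self, x):                       # x: (batch, seq, input_dim)
        h = self.input_proj(x) + self.pos_embed[:, :x.size(1)]
        h = self.encoder(h)[:, -1]              # last-step summary: (batch, d_model)
        return {
            'price_prediction': self.price_head(h),
            'volatility_prediction': self.vol_head(h),
            'confidence': self.conf_head(h),
            'risk_classification': self.risk_head(h),
        }

out = FinancialTTMSketch()(torch.randn(8, 20, 20))
print({k: tuple(v.shape) for k, v in out.items()})
```

Returning a dict of named outputs keeps the training loop readable: each prediction head gets its own loss term without positional unpacking.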
### Training System
- **Multi-objective Loss**: Combined loss across different prediction tasks
- **Early Stopping**: Prevent overfitting by monitoring validation performance
- **Mixed Precision Training**: Optional for faster training on compatible hardware
- **Comprehensive Metrics**: Track model performance across different objectives
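A weighted sum over the per-task losses is one common way to realize the multi-objective loss above; the weights below are illustrative, not the values used in `training.py`:

```python
import torch
import torch.nn.functional as F

def combined_loss(preds, targets, w_price=1.0, w_vol=0.5, w_risk=0.5):
    """Blend per-task losses into a single training objective."""
    price_loss = F.mse_loss(preds['price_prediction'], targets['price'])
    vol_loss = F.mse_loss(preds['volatility_prediction'], targets['volatility'])
    risk_loss = F.cross_entropy(preds['risk_classification'], targets['risk'])
    return w_price * price_loss + w_vol * vol_loss + w_risk * risk_loss

# Dummy batch of 8 samples to show the expected tensor shapes.
preds = {
    'price_prediction': torch.zeros(8, 5),
    'volatility_prediction': torch.zeros(8, 5),
    'risk_classification': torch.zeros(8, 3),   # logits over 3 risk classes
}
targets = {
    'price': torch.zeros(8, 5),
    'volatility': torch.zeros(8, 5),
    'risk': torch.zeros(8, dtype=torch.long),
}
loss = combined_loss(preds, targets)
```

Early stopping then simply tracks this combined loss on the validation set and restores the best checkpoint when it stops improving.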
## Usage Instructions
### Setup
1. Install dependencies:
```bash
pip install -r requirements.txt
```
2. Configure the model in `main.py`:
```python
config = {
'input_dim': 20, # Number of input features
'd_model': 64, # Transformer dimension
'nhead': 4, # Number of attention heads
'num_layers': 2, # Transformer encoder layers
'lookback_window': 20, # Sequence length for input
'prediction_horizon': 5, # Number of future steps to predict
'batch_size': 64, # Training batch size
'learning_rate': 0.001, # Learning rate
'epochs': 20 # Training epochs
}
```
### Training a Model
Run the training script:
```bash
python main.py
```
This will:
- Generate synthetic data (or use your own data source)
- Prepare features and create sequences
- Train the model with early stopping
- Save the trained model to `financial_ttm_model.pth`
### Making Predictions
```python
import torch
from models import FinancialTTM
# Load model
model_data = torch.load('financial_ttm_model.pth')
model = FinancialTTM(
input_dim=model_data['config']['input_dim'],
d_model=model_data['config']['d_model'],
nhead=model_data['config']['nhead'],
num_layers=model_data['config']['num_layers'],
prediction_horizon=model_data['config']['prediction_horizon']
)
model.load_state_dict(model_data['state_dict'])
model.eval()
# Prepare input (ensure it's preprocessed the same way as training data)
input_sequence = torch.FloatTensor(processed_input_data)
# Get predictions
with torch.no_grad():
predictions = model(input_sequence)
# Access different predictions
price_predictions = predictions['price_prediction']
volatility_predictions = predictions['volatility_prediction']
confidence = predictions['confidence']
risk_class = predictions['risk_classification']
```
## Use Cases
1. **Algorithmic Trading**: Integrate predictions into trading algorithms
2. **Risk Management**: Use volatility forecasts to adjust position sizing
3. **Portfolio Optimization**: Balance investments based on predicted market movements
4. **Trading Decision Support**: Use risk classifications to filter trading opportunities
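As one example of decision support, the confidence head can act as a filter over raw price forecasts (the threshold and signal logic below are illustrative, not part of the shipped code):

```python
def trade_signal(price_path, confidence, threshold=0.7):
    """Map a predicted price path to a buy/sell/hold signal,
    abstaining whenever model confidence falls below `threshold`."""
    if confidence < threshold:
        return 'hold'  # low confidence -> stay out of the market
    return 'buy' if price_path[-1] > price_path[0] else 'sell'
```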
## Performance Metrics
Model performance is evaluated using:
- **Directional Accuracy**: How often price direction is correctly predicted
- **Mean Absolute Error (MAE)**: Average absolute difference between predicted and actual prices
- **Mean Squared Error (MSE)**: For volatility predictions
- **Classification Metrics**: Precision, recall, F1-score for risk classification
- **Sharpe Ratio**: Risk-adjusted return when used in a simulated trading strategy
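The first two metrics can be sketched in plain Python (one reasonable definition of directional accuracy, assuming step-ahead predictions aligned with the actual series):

```python
def directional_accuracy(actual, predicted):
    """Fraction of steps where the predicted move has the same sign as the actual move."""
    hits = sum(
        1 for i in range(1, len(actual))
        if (predicted[i] - actual[i - 1]) * (actual[i] - actual[i - 1]) > 0
    )
    return hits / (len(actual) - 1)

def mean_absolute_error(actual, predicted):
    """Average absolute difference between predicted and actual prices."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)
```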
## Publishing to Hugging Face
### 1. Prepare your model for publishing
```python
import torch
# After successful training
model_info = {
    'state_dict': model.state_dict(),  # same key used by the prediction example above
'config': {
'input_dim': 20,
'd_model': 64,
'nhead': 4,
'num_layers': 2,
'prediction_horizon': 5
},
'feature_columns': list(numeric_columns), # Feature column names
'scaler': feature_eng.scaler # Save scaler for preprocessing
}
torch.save(model_info, "financial_ttm_model.pth")
```
### 2. Install Hugging Face Hub
```bash
pip install huggingface_hub
```
### 3. Login to Hugging Face
```bash
huggingface-cli login
```
### 4. Upload your model
```python
from huggingface_hub import HfApi
api = HfApi()
api.create_repo(repo_id="your-username/financial-ttm", private=False)
# Upload model file
api.upload_file(
path_or_fileobj="financial_ttm_model.pth",
path_in_repo="financial_ttm_model.pth",
repo_id="your-username/financial-ttm"
)
# Upload readme
api.upload_file(
path_or_fileobj="README.md",
path_in_repo="README.md",
repo_id="your-username/financial-ttm"
)
# Upload example usage code
api.upload_file(
path_or_fileobj="example_inference.py",
path_in_repo="example_inference.py",
repo_id="your-username/financial-ttm"
)
```
### 5. Share your model
Once published, you can share the URL: `https://huggingface.co/your-username/financial-ttm`
Users can then download and use your model:
```python
from huggingface_hub import hf_hub_download
model_path = hf_hub_download(repo_id="your-username/financial-ttm", filename="financial_ttm_model.pth")
model_data = torch.load(model_path, weights_only=False)  # checkpoint contains non-tensor objects (config, scaler)
```
## Contributing
Contributions are welcome! Here are some areas for potential improvement:
- Add support for more technical indicators
- Implement alternative model architectures
- Create visualization tools for predictions
- Add backtesting infrastructure for trading strategies
- Optimize for specific financial instruments or markets
## License
This project is licensed under the MIT License - see the LICENSE file for details.
|
Lelon/scope-hi-bioscope_abstracts | Lelon | 2025-06-17T11:55:15Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"eurobert",
"token-classification",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | token-classification | 2025-06-17T11:54:32Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lelon/cue-hi-bioscope_abstracts | Lelon | 2025-06-17T11:54:30Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"eurobert",
"token-classification",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | token-classification | 2025-06-17T11:53:47Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Lelon/cue-de-bioscope_abstracts | Lelon | 2025-06-17T11:50:32Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"eurobert",
"token-classification",
"custom_code",
"arxiv:1910.09700",
"autotrain_compatible",
"region:us"
] | token-classification | 2025-06-17T11:49:53Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
diegolacomba/multilingual-e5-small-legal-mnrl-negatives-0 | diegolacomba | 2025-06-17T11:43:27Z | 0 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"bert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:58898",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:intfloat/multilingual-e5-small",
"base_model:finetune:intfloat/m... | sentence-similarity | 2025-06-17T11:43:11Z | ---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:58898
- loss:MultipleNegativesRankingLoss
base_model: intfloat/multilingual-e5-small
widget:
- source_sentence: 'query: ยฟSe aplican impuestos a la enseรฑanza de idiomas para particulares
y empresas en modalidad presencial y virtual?'
sentences:
- 'En este mismo sentido se expresa la Instrucciรณn para la aplicaciรณn de las Tarifas
del impuesto, aprobadas ambas (Instrucciรณn y Tarifas) por el Real Decreto Legislativo
1175/1990, de 28 de septiembre, al establecer en su regla 2ยช que โEl mero ejercicio
de cualquier actividad econรณmica especificada en las Tarifas, asรญ como el mero
ejercicio de cualquier otra actividad de carรกcter empresarial, profesional o artรญstico
no especificada en aquรฉllas, darรก lugar a la obligaciรณn de presentar la correspondiente
declaraciรณn de alta y de contribuir por este impuesto, salvo que en la presente
Instrucciรณn se disponga otra cosaโ.
Por su parte el apartado 1 de la regla 4ยช dispone que โCon carรกcter general, el
pago de la cuota correspondiente a una actividad faculta, exclusivamente, para
el ejercicio de esa actividad, salvo que en la Ley reguladora de este Impuesto,
en las Tarifas o en la presente Instrucciรณn se disponga otra cosaโ.
Aplicando lo anteriormente expuesto al caso planteado por la entidad consultante,
cabe seรฑalar que, si un sujeto pasivo realiza distintas actividades, de contenido
material distinto y por tanto con un tratamiento diferenciado dentro de las Tarifas,
estarรก obligado a matricularse y tributar por cada una de ellas.
Por tanto, por la actividad de impartir cursos de inmersiรณn lingรผรญstica en sus
propias instalaciones, la entidad consultante tiene que darse de alta en el epรญgrafe
933.9 de la secciรณn primera de las Tarifas, โOtras actividades de enseรฑanza, tales
como idiomas, corte y confecciรณn, mecanografรญa, taquigrafรญa, preparaciรณn de exรกmenes
y oposiciones y simila res, n.c.o.p.โ, al realizarse en establecimiento permanente.
Ademรกs, si realiza la actividad fuera de establecimiento permanente, es decir,
en locales o centros de otras entidades, tiene que darse de alta ademรกs en el
grupo 934 de la secciรณn primera de las Tarifas, โEnseรฑanza fuera de establecimiento
permanenteโ.
Por รบltimo, si la entidad consultante presta servicios de alojamiento y manutenciรณn
a los estudiantes, deberรก darse de alta en el grupo 755 de la secciรณn primera
de las Tarifas, โAgencias de viajesโ, que comprende la gestiรณn para el transporte,
alojamiento y/o alimentaciรณn de los estudiantes.'
- 'passage Descripciรณn de hechos: La consultante es una persona fรญsica que va a
impartir clases de idiomas, en concreto alemรกn, tanto a personas fรญsicas como
a empresas. Las clases se realizarรกn tanto de manera presencial como a travรฉs
de medios electrรณnicos.
Cuestiรณn planteada: Si las clases se encuentran exentas del Impuesto sobre el
Valor Aรฑadido.'
- 'passage c) Las que tengan por objeto la cesiรณn del derecho a utilizar infraestructuras
ferroviarias.
d) Las autorizaciones para la prestaciรณn de servicios al pรบblico y para el desarrollo
de actividades comerciales o industriales en el รกmbito portuario.โ
3.- La consulta plantea una cuestiรณn sobre un contrato por el que un Ayuntamiento
cede a un contratista la explotaciรณn de un bar (instalaciรณn fija de obra) en una
ciudad.
Dicho contrato tiene la naturaleza de contrato administrativo especial, sin que
el mismo pueda calificarse como contrato de gestiรณn de servicio pรบblico ni tampoco
como concesiรณn administrativa de dominio pรบblico.
Cabe plantearse si podrรญa resultar aplicable a la referida prestaciรณn de servicios
efectuada por el ayuntamiento en favor de la consultante el supuesto de no sujeciรณn
al Impuesto sobre el Valor Aรฑadido previsto para el otorgamiento de concesiones
y autorizaciones administrativas en el nรบmero 9ยบ del artรญculo 7 de la citada Ley
37/1992.
La respuesta a esta cuestiรณn es negativa, pues, como ha seรฑalado la Asesorรญa Jurรญdica
de la Secretarรญa de Estado de Hacienda en el informe emitido el 30 de julio de
1997 a solicitud de esta Direcciรณn General, los contratos que tienen por objeto
la explotaciรณn de cafeterรญas y comedores en centros pรบblicos son contratos administrativos
especiales, sin que los mismos puedan calificarse como contratos de gestiรณn de
servicios pรบblicos ni tampoco como concesiones administrativas de dominio pรบblico.
En este sentido se ha pronunciado la Junta Consultiva de Contrataciรณn Administrativa
en diversos informes emitidos al respecto; asรญ, en el informe 57/07 de 6 de febrero
de 2008, y, con anterioridad, en los informes 5/96 de 7 de marzo y 67/99, de 6
de julio de 2000.
En consecuencia con todo lo anterior, estรก sujeto al Impuesto sobre el Valor Aรฑadido
y no exento del mismo el contrato suscrito entre el ayuntamiento y la consultante
consistente en explotar un bar-quiosco, a cambio del pago de una contraprestaciรณn.
4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artรญculo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.'
- source_sentence: 'query: ยฟEs necesario que los propietarios formen una comunidad
de bienes para alquilar un inmueble que utilizan como local comercial?'
sentences:
- 'passage Descripciรณn de hechos: Los consultantes son un matrimonio en rรฉgimen
de gananciales, copropietarios de una vivienda que tienen intenciรณn de alquilar
como local de negocio. Ambos estรกn dados de alta en el censo de Empresarios, Profesionales
y Retenedores por sus actividades econรณmicas respectivas.
Cuestiรณn planteada: Obligaciรณn de constituir una comunidad de bienes para llevar
a cabo el arrendamiento del inmueble, o bien posibilidad de declarar el Impuesto
por separado segรบn las cuotas del proindiviso. En caso de existir comunidad de
bienes, posibilidad de compensar las cuotas repercutidas por la misma con las
cuotas soportadas en su respectiva actividad empresarial por cada uno de los cรณnyuges,
en proporciรณn a su participaciรณn en dicha comunidad. Sujeciรณn al Impuesto sobre
Transmisiones Patrimoniales y Actos Jurรญdicos Documentados, en su modalidad de
operaciones societarias.'
- 'Descripciรณn de hechos: La comunidad de bienes consultante estรก constituida a
partes iguales por tres hermanos y es propietaria de un local que destina al arrendamiento.
Cuestiรณn planteada: Si la actividad realizada estรก sujeta al Impuesto sobre el
Valor Aรฑadido y si el cobro de la contraprestaciรณn se puede efectuar en una cuenta
bancaria en la que son titulares los tres hermanos.'
- 'passage Contestaciรณn completa: 1.- El artรญculo 134 bis, apartado dos de la Ley
37/1992, de 28 de diciembre, del Impuesto sobre el Valor Aรฑadido (BOE del 29),
establece que:
โDos. Cuando el rรฉgimen de tributaciรณn aplicable a una determinada actividad agrรญcola,
ganadera, forestal o pesquera cambie del rรฉgimen especial de la agricultura, ganaderรญa
y pesca al general del Impuesto, el empresario o profesional titular de la actividad
tendrรก derecho a:
1ยบ. Efectuar la deducciรณn de la cuota resultante de aplicar al valor de los bienes
afectos a la actividad, Impuesto sobre el Valor Aรฑadido excluido, en la fecha
en que deje de aplicarse el rรฉgimen especial, los tipos de dicho Impuesto que
estuviesen vigentes en la citada fecha. A estos efectos, no se tendrรกn en cuenta
los siguientes:
a) Bienes de inversiรณn, definidos conforme a lo dispuesto en el artรญculo 108 de
esta Ley.
b) Bienes y servicios que hayan sido utilizados o consumidos total o parcialmente
en la actividad.
2ยบ. Deducir la compensaciรณn a tanto alzado que prevรฉ el artรญculo 130 de esta Ley
por los productos naturales obtenidos en las explotaciones que no se hayan entregado
a la fecha del cambio del rรฉgimen de tributaciรณn.
A efectos del ejercicio de los derechos recogidos en este apartado, el empresario
o profesional deberรก confeccionar y presentar un inventario a la fecha en que
deje de aplicarse el rรฉgimen especial. Tanto la presentaciรณn de este inventario
como el ejercicio de estos derechos se ajustarรกn a los requisitos y condiciones
que se establezcan reglamentariamente.โ.
Por su parte, el artรญculo 49 bis del Reglamento del Impuesto aprobado por el artรญculo
1 del Real Decreto 1624/1992, de 29 de diciembre (BOE del 31), declara que:'
- source_sentence: 'query: ยฟCuรกl es la normativa vigente sobre la tributaciรณn del
suero de irrigaciรณn y otros sistemas utilizados en procedimientos mรฉdicos?'
sentences:
- 'Descripciรณn de hechos: La consultante comercializa un producto denominado "soluciones
de irrigaciรณn" o "suero de irrigaciรณn" que consisten en soluciones de irrigaciรณn
o lavado estรฉriles para la limpieza asรฉptica de la piel, lavado de heridas o quemaduras
formando parte integrante de los sistemas de irrigaciรณn.
Cuestiรณn planteada: Tipo impositivo aplicable al citado producto.'
- 'passage Posteriormente las referidas entidades remitirรกn las facturas o documentos
electrรณnicos de reembolso, en papel o en formato electrรณnico, a los proveedores,
quienes estarรกn obligados a efectuar el correspondiente reembolso.
Cuando se utilice el documento electrรณnico de reembolso, el proveedor o, en su
caso, la entidad colaboradora deberรกn comprobar el visado del mismo en la Sede
electrรณnica de la Agencia Estatal de Administraciรณn Tributaria haciendo constar
electrรณnicamente que el reembolso se ha hecho efectivo.
(โฆ).โ.
Asรญ, existe un procedimiento general y otro especial para la aplicaciรณn de la
exenciรณn contemplada en el artรญculo 21.2ยบ, letra A), de la Ley del impuesto.
El procedimiento general, que permite al turista residente en un paรญs no comunitario
obtener la devoluciรณn de la totalidad de las cuotas del Impuesto soportadas en
la compra de los productos que exporta, que es el previsto en el apartado 9.1.B)
del Reglamento del Impuesto.
Por otro lado, el procedimiento especial, al que se refiere la letra e) de ese
mismo apartado, que implica la actuaciรณn de una entidad colaboradora que haya
sido autorizada por la Agencia Estatal de Administraciรณn Tributaria.
Pues bien, en relaciรณn a este segundo procedimiento seรฑalar que la entidad consultante
presta a diversas entidades colaboradoras servicios administrativos consistentes
en la revisiรณn de las facturas y la gestiรณn de los pagos en nombre de las entidades
colaboradoras sin asumir el riesgo de impago de dichas operaciones. Tales servicios
deben calificarse como servicios de naturaleza administrativa y no financiera
por lo que su prestaciรณn quedarรก, en todo caso, sujeta y no exenta del Impuesto
sobre el Valor Aรฑadido.
6.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artรญculo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.'
- 'passage 3.- Por otra parte en relación con la inclusión del suero de irrigación
en el apartado destinado a “Bolsas de recogida de orina, absorbentes de incontinencia
y otros sistemas para incontinencia urinaria y fecal, incluidos los sistemas de
irrigación”, este Centro directivo en la consulta de fecha 23 de marzo de 2015,
numero V0872-15 y en relación con los sistemas de irrigación ha dispuesto que,
“Tributarán por el Impuesto sobre el Valor Añadido, al tipo general del 21 por
ciento, los siguientes productos objeto de consulta: -Los empapadores, las duchas
vaginales, irrigadores, accesorios y sistemas de irrigación no destinados específicamente
a situaciones de incontinencia urinaria o fecal, ni las cánulas rectales y vaginales
no destinadas específicamente a situaciones de incontinencia urinaria o fecal
o no incorporadas en equipos destinados a estas situaciones. ”
4.- En consecuencia con lo anterior este centro directivo le informa que tributan
al tipo general del 21 por ciento las entregas, adquisiciones intracomunitarias
e importaciones de suero de irrigación (agua destilada o suero fisiológico) objeto
de consulta siendo irrelevante que su destino sea para la limpieza aséptica de
la piel, lavado de heridas o quemaduras formando parte integrante de los sistemas
de irrigación.
5.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.
No obstante, de acuerdo con el artículo 68.2 del Reglamento General de las actuaciones
y los procedimientos de gestión e inspección tributaria y de desarrollo de las
normas comunes de los procedimientos de aplicación de los tributos, aprobado por
el Real Decreto 1065/2007, de 27 de julio, la presente contestación no tendrá
efectos vinculantes para aquellos miembros o asociados de la consultante que en
el momento de formular la consulta estuviesen siendo objeto de un procedimiento,
recurso o reclamación económico-administrativa iniciado con anterioridad y relacionado
con las cuestiones planteadas en la consulta conforme a lo dispuesto en su artículo
89.2.'
- source_sentence: 'query: ¿Este suministro de agua está sujeto a impuestos relacionados
con el valor añadido?'
sentences:
- 'En estas circunstancias, se alineó la doctrina de este Centro directivo con la
jurisprudencia del Tribunal Supremo, modificando, por tanto, el criterio mantenido
en contestaciones anteriores.
De acuerdo con todo lo anterior, las operaciones llevadas a cabo a título oneroso
por comunidades de regantes a favor de sus miembros consistentes en la distribución-comercialización
de agua, en los casos en los que sea posible adquirir, tratar y distribuir agua
a título oneroso, estarán sujetas al Impuesto sobre el Valor Añadido, no resultando
de aplicación el supuesto de no sujeción recogido en el artículo 7.11º de la Ley
37/1992.
Consecuentemente con todo lo expuesto anteriormente, se le informa de que estará
sujeta al Impuesto sobre el Valor Añadido la distribución-comercialización de
agua realizada por la consultante si se realiza en los términos anteriormente
citados.
Por el contrario, las operaciones realizadas por la Comunidad de regantes consultante
para la ordenación y el aprovechamiento de las aguas efectuadas en los términos
del artículo 7.11º de la Ley del Impuesto no están sujetas al Impuesto sobre el
Valor Añadido, como parece ser la operación objeto de consulta, aunque el agua
de riego se mezcle con abono en las condiciones señaladas.
3.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.'
- 'passage Descripción de hechos: El consultante forma parte de una comunidad de
vecinos que es propietaria de un pozo y cobra a los vecinos un importe en concepto
de su consumo de agua.
Cuestión planteada: Sujeción del citado suministro de agua al Impuesto sobre el
Valor Añadido.'
- 'passage Descripción de hechos: El consultante es taxista y tributa en el régimen
especial simplificado del Impuesto sobre el Valor Añadido. Ha tenido un siniestro
con el taxi por el que ha pagado la reparación a un taller, debiendo su aseguradora
reembolsarle dicho gasto. La aseguradora acepta la factura pero se la reembolsará
excluyendo el importe correspondiente al Impuesto sobre el Valor Añadido.
Cuestión planteada: Obligación por parte de la aseguradora de abonar al consultante
el importe correspondiente al Impuesto sobre el Valor Añadido de las mencionadas
facturas. Derecho a deducir las cuotas soportadas del consultante, que tributa
en el régimen especial simplificado del Impuesto.'
- source_sentence: 'query: ¿Se puede eximir del IVA si el destinatario de una actividad
de caza es un club deportivo?'
sentences:
- 'Cuestión planteada: Como entidad parcialmente exenta del Impuesto sobre Sociedades,
si puede entenderse que la asociación no realiza actividad económica, por lo que
sus cuotas estarían exentas de este impuesto.
En el caso de ser actividad económica, si el pago realizado a los agricultores
tiene la consideración de gasto deducible para la asociación y si el agricultor
debe emitir factura al respecto.
En el caso de tener la consideración de actividad económica, si las cuotas cobradas
a los asociados estarían sujetas al IVA.'
- 'passage (…).”.
De acuerdo con lo dispuesto anteriormente, en los supuestos de adjudicación de
bienes en virtud de subasta judicial o administrativa, como es el caso que nos
ocupa, el adjudicatario puede efectuar, en su caso, la renuncia a las exenciones
previstas en el apartado dos del artículo 20 de la Ley 37/1992, así como expedir
factura, presentar, en nombre y por cuenta del sujeto pasivo, la declaración-liquidación
correspondiente e ingresar el importe del Impuesto sobre el Valor Añadido resultante.
El ejercicio de dicha facultad por parte del adjudicatario determina la obligación
de presentar la autoliquidación del Impuesto conforme al modelo aprobado por la
Orden HAC/3625/2003, de 23 de diciembre (modelo 309).
Uno de los requisitos necesarios para el ejercicio de dicha facultad es que el
destinatario-adjudicatario del bien inmueble tenga la consideración de empresario
o profesional en los términos previstos en esta contestación. La no consideración
como empresario o profesional impide el ejercicio de dicha facultad.
Por último, señalar que de resultar aplicable la regla de inversión del sujeto
pasivo prevista en el artículo 84.Uno.2º de la Ley 37/1992, anteriormente desarrollado,
el adjudicatario resultará ser el sujeto pasivo de la operación por lo que viene
obligado a presentar la autoliquidación ordinaria del Impuesto en nombre propio,
sin actuar en nombre y por cuenta del subastado. Asimismo, de optar por dicha
facultad en los términos establecidos reglamentariamente, el consultante podrá
emitir, en nombre y por cuenta del transmitente, la correspondiente factura en
la que se documente la operación.
No obstante, tal y como se ha señalado en apartados anteriores de esta contestación,
el consultante adjudicatario de la subasta judicial no procedió a la renuncia
a la exención del artículo 20.Uno.22º de la Ley del Impuesto en el plazo establecido,
habiéndose encontrado facultado para ello según lo dispuesto en la Disposición
Adicional Sexta de la Ley 37/1992.'
- 'passage Descripción de hechos: El ayuntamiento consultante ha adjudicado un aprovechamiento
de caza a terceros.
Cuestión planteada: Aclaración de la contestación vinculante de 6 de febrero de
2023, número V0140-23. En particular, sobre si la sujeción y en su caso exención
del Impuesto sobre el Valor Añadido aplica cuando el destinatario es un club deportivo.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on intfloat/multilingual-e5-small
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: InformationRetrievalEvaluator
type: InformationRetrievalEvaluator
metrics:
- type: cosine_accuracy@1
value: 0.34438553454142595
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5078737042019467
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.582574978238506
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6804621350003957
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.34438553454142595
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.16929123473398222
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.11651499564770121
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06804621350003956
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.34438553454142595
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5078737042019467
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.582574978238506
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6804621350003957
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5018922960319426
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.44598812883809136
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.45552575512925936
name: Cosine Map@100
---
# SentenceTransformer based on intfloat/multilingual-e5-small
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) <!-- at revision c007d7ef6fd86656326059b28395a7a03a7c5846 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("diegolacomba/multilingual-e5-small-legal-mnrl-negatives-0")
# Run inference
sentences = [
    'query: ¿Se puede eximir del IVA si el destinatario de una actividad de caza es un club deportivo?',
    'passage Descripción de hechos: El ayuntamiento consultante ha adjudicado un aprovechamiento de caza a terceros.\n\nCuestión planteada: Aclaración de la contestación vinculante de 6 de febrero de 2023, número V0140-23. En particular, sobre si la sujeción y en su caso exención del Impuesto sobre el Valor Añadido aplica cuando el destinatario es un club deportivo.',
    'Cuestión planteada: Como entidad parcialmente exenta del Impuesto sobre Sociedades, si puede entenderse que la asociación no realiza actividad económica, por lo que sus cuotas estarían exentas de este impuesto.\nEn el caso de ser actividad económica, si el pago realizado a los agricultores tiene la consideración de gasto deducible para la asociación y si el agricultor debe emitir factura al respecto.\nEn el caso de tener la consideración de actividad económica, si las cuotas cobradas a los asociados estarían sujetas al IVA.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `InformationRetrievalEvaluator`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.3444 |
| cosine_accuracy@3 | 0.5079 |
| cosine_accuracy@5 | 0.5826 |
| cosine_accuracy@10 | 0.6805 |
| cosine_precision@1 | 0.3444 |
| cosine_precision@3 | 0.1693 |
| cosine_precision@5 | 0.1165 |
| cosine_precision@10 | 0.068 |
| cosine_recall@1 | 0.3444 |
| cosine_recall@3 | 0.5079 |
| cosine_recall@5 | 0.5826 |
| cosine_recall@10 | 0.6805 |
| **cosine_ndcg@10** | **0.5019** |
| cosine_mrr@10 | 0.446 |
| cosine_map@100 | 0.4555 |
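As a point of reference for reading the table, the headline `cosine_ndcg@10` can be sketched for a single query with binary relevance; the function name and document IDs below are illustrative and not part of the evaluator's API:

```python
import math

def ndcg_at_k(ranked_ids, relevant_ids, k=10):
    """NDCG@k with binary relevance for one query: discounted gain of the
    relevant hits in the top-k ranking, normalized by the ideal ranking."""
    dcg = 0.0
    for i, doc_id in enumerate(ranked_ids[:k]):
        if doc_id in relevant_ids:
            dcg += 1.0 / math.log2(i + 2)  # rank i (0-based) contributes 1/log2(rank+2)
    # Ideal DCG: all relevant documents ranked first
    ideal = sum(1.0 / math.log2(i + 2) for i in range(min(len(relevant_ids), k)))
    return dcg / ideal if ideal > 0 else 0.0

# Single relevant passage retrieved at rank 3 -> 1/log2(4) = 0.5
print(ndcg_at_k(["d7", "d2", "d5"], {"d5"}))
```

The reported value is the mean of this quantity over all evaluation queries.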
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 58,898 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive | negative |
|:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string | string |
| details | <ul><li>min: 19 tokens</li><li>mean: 31.33 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 325.57 tokens</li><li>max: 508 tokens</li></ul> | <ul><li>min: 24 tokens</li><li>mean: 346.55 tokens</li><li>max: 496 tokens</li></ul> |
* Samples:
| anchor | positive | negative |
|:---|:---|:---|
| <code>query: ¿Las contribuciones que percibe una organización en virtud de un convenio laboral en el fútbol tienen impacto en la base de cálculo para el impuesto correspondiente?</code> | <code>passage Descripción de hechos: La consultante es una Asociación que se dedica a las actividades de ordenación del ejercicio de la profesión de futbolistas de sus miembros, la representación de los mismos así como la defensa de sus intereses profesionales tanto en el ámbito nacional como en el internacional.<br>En virtud de un convenio colectivo para la actividad de fútbol profesional suscrito entre la Liga Nacional de Fútbol Profesional (LNFP) y la consultante, aquella viene obligada a entregar a esta, por cada temporada de vigencia del convenio, una cantidad de dinero (en concepto de Fondo social) destinada a fines benéficos y al normal desarrollo de la actividad de la Asociación.<br>Asimismo, según Acta de Conciliación suscrita entre ambas partes, la LNFP se compromete a abonar a la consultante un porcentaje del importe neto total de los ingresos obtenidos de la explotación conjunta de los derechos de contenidos audiovisuales del fútbol. Dicha cuantía debe destinarse a actividades encamina...</code> | <code>a) Un 3,5 por 100 se destinará a financiar un Fondo de Compensación del que podrán beneficiarse las entidades deportivas que, disputando la competición del fútbol profesional, desciendan de categoría. El 90 por 100 de esta cantidad se destinará a los equipos que desciendan de Primera división, y el 10 por 100 restante a los que desciendan de Segunda División.<br>b) Un 1 por 100 se entregará a la Liga Nacional de Fútbol Profesional, que lo destinará exclusivamente a la promoción de la competición profesional en los mercados nacional e internacional.<br>(…).”.<br>En relación con el tratamiento de dichas aportaciones, este Centro directivo ya se manifestó en la contestación vinculante a la consulta, de 20 de septiembre de 2016, con número de referencia V3946-16, estableciendo que “a efectos de determinar el régimen de tributación en el Impuesto sobre el Valor Añadido de las contribuciones obligatorias que los clubes y entidades participantes en el Campeonato Nacional de Liga deben realizar al ampa...</code> |
| <code>query: ¿Qué tipos de transacciones intracomunitarias deben ser declaradas por las empresas según la regulación vigente?</code> | <code>passage Contestación completa: 1.- De acuerdo con el artículo 78 del Reglamento del impuesto aprobado por el Real Decreto 1624/1992, de 29 de diciembre (BOE del 31 de diciembre):<br>“Los empresarios y profesionales deberán presentar una declaración recapitulativa de las entregas y adquisiciones intracomunitarias de bienes y de las prestaciones y adquisiciones intracomunitarias de servicios que realicen en la forma que se indica en el presente capítulo.”.<br>El artículo 79 del Reglamento especifica qué tipo de operaciones deben ser declaradas en la declaración recapitulativa de operaciones intracomunitarias, en concreto establece que:<br>“1. Estarán obligados a presentar la declaración recapitulativa los empresarios y profesionales, incluso cuando tengan dicha condición con arreglo a lo dispuesto en el apartado cuatro del artículo 5 de la Ley del Impuesto, que realicen cualquiera de las siguientes operaciones.<br>1.º Las entregas de bienes destinados a otro Estado miembro que se encuentren exentas ...</code> | <code>2. También estará obligado a presentar la declaración recapitulativa el vendedor que expida o transporte bienes a otro Estado miembro en el marco de un acuerdo de ventas de bienes en consigna a que se refiere el artículo 9 bis de la Ley del Impuesto.”.<br>Por tanto, en el caso de que la consultante hubiera efectuado alguna de las operaciones intracomunitarias indicadas expresamente en el artículo anterior, deberá presentar la declaración recapitulativa de operaciones intracomunitarias.<br>3.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.</code> |
| <code>query: ¿Qué tipos de bebidas contienen alcohol apto para consumo humano?</code> | <code>passage Se entiende por bebida alcohólica todo líquido apto para el consumo humano por ingestión que contenga alcohol etílico.<br>A los efectos de este número no tendrán la consideración de alimento el tabaco ni las sustancias no aptas para el consumo humano o animal en el mismo estado en que fuesen objeto de entrega, adquisición intracomunitaria o importación.”.<br>4.- Con independencia de lo anterior, el artículo 20, apartado uno, número 9º, de la Ley 37/1992, establece que estarán exentas del Impuesto las siguientes operaciones:<br>“9.º La educación de la infancia y de la juventud, la guarda y custodia de niños, incluida la atención a niños en los centros docentes en tiempo interlectivo durante el comedor escolar o en aulas en servicio de guardería fuera del horario escolar, la enseñanza escolar, universitaria y de postgraduados, la enseñanza de idiomas y la formación y reciclaje profesional, realizadas por Entidades de derecho público o entidades privadas autorizadas para el ejercicio de di...</code> | <code>“Artículo 90. Tipo impositivo general.<br>Uno. El impuesto se exigirá al tipo del 21 por ciento, salvo lo dispuesto en el artículo siguiente.<br>Dos. El tipo impositivo aplicable a cada operación será el vigente en el momento del devengo.”.<br>“Artículo 91. Tipos impositivos reducidos.<br>Uno. Se aplicará el tipo del 10 por ciento a las operaciones siguientes:<br>1. Las entregas, adquisiciones intracomunitarias o importaciones de los bienes que se indican a continuación:<br>1.º Las sustancias o productos, cualquiera que sea su origen que, por sus características, aplicaciones, componentes, preparación y estado de conservación, sean susceptibles de ser habitual e idóneamente utilizados para la nutrición humana o animal, de acuerdo con lo establecido en el Código Alimentario y las disposiciones dictadas para su desarrollo, excepto las bebidas alcohólicas.<br>Se entiende por bebida alcohólica todo líquido apto para el consumo humano por ingestión que contenga alcohol etílico.<br>A los efectos de este número no t...</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
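As a rough illustration of what this loss optimizes, MultipleNegativesRankingLoss is a cross-entropy over scaled in-batch similarities, where each anchor's positive sits on the diagonal and the other examples in the batch serve as negatives. The sketch below is a minimal pure-Python rendering under those assumptions; `mnrl_loss` and the toy similarity matrix are illustrative, not the library implementation:

```python
import math

def mnrl_loss(sims, scale=20.0):
    """In-batch ranking loss: sims[i][j] is the cosine similarity between
    anchor i and candidate j; the positive for anchor i is at j == i.
    Returns the mean negative log-softmax of the diagonal entries."""
    total = 0.0
    for i, row in enumerate(sims):
        logits = [scale * s for s in row]
        log_z = math.log(sum(math.exp(l) for l in logits))
        total += log_z - logits[i]  # -log softmax at the positive
    return total / len(sims)

# A well-separated batch (diagonal ~1, off-diagonal ~0) yields a near-zero loss
print(mnrl_loss([[1.0, 0.1], [0.05, 1.0]]) < 0.01)
```

The `scale` of 20.0 sharpens the softmax so that small cosine differences still produce a meaningful gradient.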
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 5
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `fp16`: True
- `tf32`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
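The learning-rate settings above (`lr_scheduler_type: cosine` with `warmup_ratio: 0.1`) can be sketched as a standalone schedule function; `lr_at` and the step counts are illustrative, assuming the 580 total optimizer steps shown in the training log:

```python
import math

def lr_at(step, total_steps, base_lr=2e-05, warmup_ratio=0.1):
    """Cosine decay with linear warmup: LR ramps linearly to base_lr over the
    first warmup_ratio fraction of steps, then follows a half-cosine to 0."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

total = 580  # assumed total optimizer steps
print(lr_at(58, total))   # end of warmup: peak learning rate
print(lr_at(580, total))  # end of training: decayed to zero
```

This is the shape the Trainer applies internally; no custom scheduler code is needed when using the hyperparameters listed above.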
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 16
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: True
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | InformationRetrievalEvaluator_cosine_ndcg@10 |
|:----------:|:-------:|:-------------:|:--------------------------------------------:|
| 0.8691 | 100 | 17.1124 | 0.4546 |
| 1.7300 | 200 | 1.1179 | 0.4828 |
| 2.5910 | 300 | 0.9019 | 0.4941 |
| 3.4519 | 400 | 0.7796 | 0.5005 |
| **4.3129** | **500** | **0.725** | **0.5019** |
| 5.0 | 580 | - | 0.5019 |
* The bold row denotes the saved checkpoint.
### Framework Versions
- Python: 3.11.13
- Sentence Transformers: 4.1.0
- Transformers: 4.52.4
- PyTorch: 2.6.0+cu124
- Accelerate: 1.7.0
- Datasets: 2.14.4
- Tokenizers: 0.21.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
MaIlz/outputs_grpo_all_tasks_4 | MaIlz | 2025-06-17T11:37:42Z | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"generated_from_trainer",
"unsloth",
"trl",
"grpo",
"arxiv:2402.03300",
"base_model:unsloth/llama-3-8b-Instruct-bnb-4bit",
"base_model:finetune:unsloth/llama-3-8b-Instruct-bnb-4bit",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T11:37:33Z | ---
base_model: unsloth/llama-3-8b-Instruct-bnb-4bit
library_name: transformers
model_name: outputs_grpo_all_tasks_4
tags:
- generated_from_trainer
- unsloth
- trl
- grpo
licence: license
---
# Model Card for outputs_grpo_all_tasks_4
This model is a fine-tuned version of [unsloth/llama-3-8b-Instruct-bnb-4bit](https://huggingface.co/unsloth/llama-3-8b-Instruct-bnb-4bit).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="MaIlz/outputs_grpo_all_tasks_4", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
This model was trained with GRPO, a method introduced in [DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models](https://huggingface.co/papers/2402.03300).
### Framework versions
- TRL: 0.15.2
- Transformers: 4.51.3
- Pytorch: 2.6.0+cu124
- Datasets: 3.6.0
- Tokenizers: 0.21.1
## Citations
Cite GRPO as:
```bibtex
@article{zhihong2024deepseekmath,
title = {{DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models}},
author = {Zhihong Shao and Peiyi Wang and Qihao Zhu and Runxin Xu and Junxiao Song and Mingchuan Zhang and Y. K. Li and Y. Wu and Daya Guo},
year = 2024,
eprint = {arXiv:2402.03300},
}
```
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouรฉdec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
joanna302/Qwen3-0.6B-Base_fr_pt_8e-05_seed44 | joanna302 | 2025-06-17T11:30:55Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"qwen3",
"text-generation",
"unsloth",
"trl",
"sft",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T10:25:50Z | ---
library_name: transformers
tags:
- unsloth
- trl
- sft
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book_mrpc | gokulsrinivasagan | 2025-06-17T11:24:45Z | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"en",
"dataset:glue",
"base_model:gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book",
"base_model:finetune:gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book",
"license... | text-classification | 2025-06-17T11:23:59Z | ---
library_name: transformers
language:
- en
license: apache-2.0
base_model: gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
- f1
model-index:
- name: tinybert_base_train_book_ent_15p_s_init_book_mrpc
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: GLUE MRPC
type: glue
args: mrpc
metrics:
- name: Accuracy
type: accuracy
value: 0.7009803921568627
- name: F1
type: f1
value: 0.8111455108359134
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# tinybert_base_train_book_ent_15p_s_init_book_mrpc
This model is a fine-tuned version of [gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book](https://huggingface.co/gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book) on the GLUE MRPC dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5825
- Accuracy: 0.7010
- F1: 0.8111
- Combined Score: 0.7561
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 256
- eval_batch_size: 256
- seed: 10
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 50
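The linear schedule above (no warmup steps listed) decays the learning rate from 5e-05 to 0 over the run. A minimal sketch, assuming the 15 optimizer steps per epoch shown in the results table below:

```python
# Minimal sketch of a linear LR schedule with zero warmup steps,
# mirroring what the `linear` lr_scheduler_type does here.
def linear_lr(step, total_steps, base_lr=5e-05, warmup_steps=0):
    """LR rises linearly over warmup, then decays linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# 15 steps per epoch * 50 epochs = 750 total optimizer steps.
total_steps = 15 * 50
print(linear_lr(0, total_steps))            # → 5e-05 (base LR at the start)
print(linear_lr(total_steps, total_steps))  # → 0.0 (fully decayed at the end)
```

Note that training stopped early (the table ends at epoch 7), so only the beginning of this decay was actually traversed.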
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Combined Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:--------------:|
| 0.6257 | 1.0 | 15 | 0.6032 | 0.6961 | 0.8086 | 0.7524 |
| 0.584 | 2.0 | 30 | 0.5825 | 0.7010 | 0.8111 | 0.7561 |
| 0.5483 | 3.0 | 45 | 0.6029 | 0.7059 | 0.8171 | 0.7615 |
| 0.5131 | 4.0 | 60 | 0.5927 | 0.6863 | 0.7808 | 0.7335 |
| 0.4597 | 5.0 | 75 | 0.6270 | 0.6985 | 0.7897 | 0.7441 |
| 0.3832 | 6.0 | 90 | 0.6773 | 0.7034 | 0.7987 | 0.7511 |
| 0.3111 | 7.0 | 105 | 0.7539 | 0.7083 | 0.8096 | 0.7590 |
### Framework versions
- Transformers 4.51.2
- Pytorch 2.6.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.1
|
milpu02/Fenqurymix-xl | milpu02 | 2025-06-17T11:20:02Z | 0 | 0 | diffusers | [
"diffusers",
"text-to-image",
"lora",
"template:diffusion-lora",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"region:us"
] | text-to-image | 2025-06-17T11:19:51Z | ---
tags:
- text-to-image
- lora
- diffusers
- template:diffusion-lora
widget:
- text: '-'
output:
url: images/Screenshot 2025-06-17 051906.png
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: Fenqury
---
# Illustrious-XL
<Gallery />
## Trigger words
You should use `Fenqury` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](/milpu02/Fenqurymix-xl/tree/main) them in the Files & versions tab.
|
h34v7/Euro-DDXPv1.0-GGUF | h34v7 | 2025-06-17T11:14:52Z | 22 | 0 | null | [
"gguf",
"base_model:h34v7/Euro-DDXPv1.0",
"base_model:quantized:h34v7/Euro-DDXPv1.0",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | null | 2025-06-09T09:33:00Z | ---
license: apache-2.0
base_model:
- h34v7/Euro-DDXPv1.0
--- |
mpio/bert-finetuned-ner | mpio | 2025-06-17T11:05:55Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"base_model:google-bert/bert-base-cased",
"base_model:finetune:google-bert/bert-base-cased",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"reg... | token-classification | 2025-06-17T10:54:07Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-cased
tags:
- generated_from_trainer
datasets:
- conll2003
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-finetuned-ner
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: conll2003
type: conll2003
config: conll2003
split: validation
args: conll2003
metrics:
- name: Precision
type: precision
value: 0.9401794616151545
- name: Recall
type: recall
value: 0.9522046449007069
- name: F1
type: f1
value: 0.9461538461538462
- name: Accuracy
type: accuracy
value: 0.9874315653146524
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0595
- Precision: 0.9402
- Recall: 0.9522
- F1: 0.9462
- Accuracy: 0.9874
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0783 | 1.0 | 1756 | 0.0685 | 0.8991 | 0.9329 | 0.9157 | 0.9819 |
| 0.0343 | 2.0 | 3512 | 0.0606 | 0.9343 | 0.9477 | 0.9409 | 0.9862 |
| 0.0215 | 3.0 | 5268 | 0.0595 | 0.9402 | 0.9522 | 0.9462 | 0.9874 |
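F1 is the harmonic mean of precision and recall; recomputing it from the rounded final-epoch values in the table reproduces the reported score:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Final-epoch precision/recall from the table above (rounded to 4 decimals).
print(round(f1_score(0.9402, 0.9522), 4))  # → 0.9462
```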
### Framework versions
- Transformers 4.50.0
- Pytorch 2.6.0+cu118
- Datasets 3.6.0
- Tokenizers 0.21.1
|
ujjawal077/llama3s-merged3 | ujjawal077 | 2025-06-17T11:03:36Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"mergekit",
"merge",
"arxiv:2311.03099",
"base_model:AdaptLLM/finance-LLM-13B",
"base_model:merge:AdaptLLM/finance-LLM-13B",
"base_model:starmpcc/Asclepius-Llama2-13B",
"base_model:merge:starmpcc/Asclepius-Llama2-13B",
"autotrain_compa... | text-generation | 2025-06-17T11:01:23Z | ---
base_model:
- AdaptLLM/finance-LLM-13B
- starmpcc/Asclepius-Llama2-13B
library_name: transformers
tags:
- mergekit
- merge
---
# llama3s-merged
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [DARE TIES](https://arxiv.org/abs/2311.03099) merge method using [starmpcc/Asclepius-Llama2-13B](https://huggingface.co/starmpcc/Asclepius-Llama2-13B) as a base.
### Models Merged
The following models were included in the merge:
* [AdaptLLM/finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: starmpcc/Asclepius-Llama2-13B
dtype: bfloat16
merge_method: dare_ties
modules:
default:
slices:
- sources:
- layer_range: [0, 40]
model: AdaptLLM/finance-LLM-13B
parameters:
density: 0.53
weight: 0.6
- layer_range: [0, 40]
model: starmpcc/Asclepius-Llama2-13B
parameters:
density: 0.5
weight: 0.4
parameters:
int8_mask: 1.0
```
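The `density` and `weight` parameters above control DARE's random parameter dropping and the final weighted combination. A toy sketch of the idea on small arrays (the real `dare_ties` method also performs TIES-style sign election, omitted here for brevity; the array names are illustrative only):

```python
import numpy as np

def dare_merge(base, deltas, densities, weights, seed=0):
    """Toy sketch of DARE merging: for each task vector (finetuned - base),
    randomly drop parameters with probability 1 - density, rescale the
    survivors by 1/density, then add the weighted task vectors to the base.
    (Real dare_ties additionally resolves sign conflicts between models.)"""
    rng = np.random.default_rng(seed)
    merged = base.copy()
    for delta, density, weight in zip(deltas, densities, weights):
        mask = rng.random(delta.shape) < density  # keep with prob = density
        merged += weight * (mask * delta) / density
    return merged

base = np.zeros(4)
d_finance  = np.array([1.0, -2.0, 3.0, 0.5])  # stand-ins for task vectors
d_clinical = np.array([0.5, 1.0, -1.0, 2.0])
# With density 1.0 nothing is dropped, so the merge reduces to a plain
# weighted sum of task vectors: 0.6 * d_finance + 0.4 * d_clinical.
print(dare_merge(base, [d_finance, d_clinical], [1.0, 1.0], [0.6, 0.4]))
```

The rescaling by `1/density` keeps the expected magnitude of each task vector unchanged despite the random dropping.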
|
Mbaxraan/mistral-small-24b-4bit | Mbaxraan | 2025-06-17T10:58:33Z | 0 | 0 | null | [
"safetensors",
"mistral",
"license:apache-2.0",
"4-bit",
"bitsandbytes",
"region:us"
] | null | 2025-06-17T08:07:34Z | ---
license: apache-2.0
---
|
tamewild/4b_v4_merged_e2 | tamewild | 2025-06-17T10:57:57Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"qwen3",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T10:56:13Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF | sizzlebop | 2025-06-17T10:53:24Z | 0 | 0 | transformers | [
"transformers",
"gguf",
"code",
"ReactJS",
"llama-cpp",
"gguf-my-repo",
"text-generation",
"en",
"base_model:nirusanan/Qwen3-ReactJs-code",
"base_model:finetune:nirusanan/Qwen3-ReactJs-code",
"endpoints_compatible",
"region:us",
"imatrix",
"conversational"
] | text-generation | 2025-06-17T10:53:14Z | ---
library_name: transformers
tags:
- code
- ReactJS
- llama-cpp
- gguf-my-repo
language:
- en
base_model: nirusanan/Qwen3-ReactJs-code
base_model_relation: finetune
pipeline_tag: text-generation
---
# sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF
This model was converted to GGUF format from [`nirusanan/Qwen3-ReactJs-code`](https://huggingface.co/nirusanan/Qwen3-ReactJs-code) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/nirusanan/Qwen3-ReactJs-code) for more details on the model.
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF --hf-file qwen3-reactjs-code-q4_k_m-imat.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF --hf-file qwen3-reactjs-code-q4_k_m-imat.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo as well.
Step 1: Clone llama.cpp from GitHub.
```
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with any other hardware-specific flags (e.g. `LLAMA_CUDA=1` for NVIDIA GPUs on Linux).
```
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```
./llama-cli --hf-repo sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF --hf-file qwen3-reactjs-code-q4_k_m-imat.gguf -p "The meaning to life and the universe is"
```
or
```
./llama-server --hf-repo sizzlebop/Qwen3-ReactJs-code-Q4_K_M-GGUF --hf-file qwen3-reactjs-code-q4_k_m-imat.gguf -c 2048
```
|
tamewild/4b_v4_merged_e10 | tamewild | 2025-06-17T10:45:27Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"qwen3",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T10:43:46Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
charso/qwen2-7b-instruct-PrizePrj01 | charso | 2025-06-17T10:39:11Z | 5 | 0 | peft | [
"peft",
"tensorboard",
"safetensors",
"trl",
"sft",
"generated_from_trainer",
"base_model:Qwen/Qwen2-VL-7B-Instruct",
"base_model:adapter:Qwen/Qwen2-VL-7B-Instruct",
"license:apache-2.0",
"region:us"
] | null | 2025-06-17T06:28:20Z | ---
library_name: peft
license: apache-2.0
base_model: Qwen/Qwen2-VL-7B-Instruct
tags:
- trl
- sft
- generated_from_trainer
model-index:
- name: qwen2-7b-instruct-PrizePrj01
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# qwen2-7b-instruct-PrizePrj01
This model is a fine-tuned version of [Qwen/Qwen2-VL-7B-Instruct](https://huggingface.co/Qwen/Qwen2-VL-7B-Instruct) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 3
### Training results
### Framework versions
- PEFT 0.15.2
- Transformers 4.52.1
- Pytorch 2.5.1+cu121
- Datasets 3.5.1
- Tokenizers 0.21.1 |
vinnvinn/mistral-Dr.hugz | vinnvinn | 2025-06-17T10:33:23Z | 20 | 0 | transformers | [
"transformers",
"safetensors",
"mistral",
"text-generation",
"conversational",
"arxiv:1910.09700",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"8-bit",
"region:us"
] | text-generation | 2025-06-16T10:58:33Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF | Triangle104 | 2025-06-17T10:22:45Z | 0 | 0 | transformers | [
"transformers",
"gguf",
"llama-cpp",
"gguf-my-repo",
"text-generation",
"en",
"base_model:ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast",
"base_model:quantized:ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast",
"license:apache-2.0",
"endpoints_compatible",
"region:us",
"conversational"
] | text-generation | 2025-06-17T10:04:49Z | ---
license: apache-2.0
thumbnail: https://cdn-uploads.huggingface.co/production/uploads/6625f4a8a8d1362ebcc3851a/hIZ2ZcaDyfYLT9Yd4pfOs.jpeg
language:
- en
base_model: ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast
library_name: transformers
pipeline_tag: text-generation
tags:
- llama-cpp
- gguf-my-repo
---
# Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF
This model was converted to GGUF format from [`ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast`](https://huggingface.co/ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/ArliAI/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast) for more details on the model.
---
RpR (RolePlay with Reasoning) is a new series of models from ArliAI. This series builds directly upon the successful dataset curation methodology and training methods developed for the RPMax series.
RpR models use the same curated, deduplicated RP and creative writing dataset used for RPMax, with a focus on variety to ensure high creativity and minimize cross-context repetition. Users familiar with RPMax will recognize the unique, non-repetitive writing style unlike other finetuned-for-RP models.
With the release of QwQ, the first high-performing open-source reasoning model that can be easily trained, it became clear that the available instruct and creative-writing reasoning datasets contain only one response per example. Training reasoning models on this kind of single-response data degrades output quality in long multi-turn chats, which is why Arli AI decided to create a real RP model capable of long multi-turn chat with reasoning.
In order to create RpR, we first had to build the reasoning RP dataset by re-processing our existing known-good RPMax dataset into a reasoning dataset. This was possible by using the base QwQ Instruct model itself to generate the reasoning process for every turn in the RPMax conversation examples, which was then further refined to make sure the reasoning is in line with the actual response examples from the dataset.
Another important thing to get right is making sure the model is trained on examples that present reasoning blocks the same way it encounters them during inference: it never sees the reasoning blocks in its context. To achieve this, the training run was completed using axolotl with a manual, template-free segments dataset, ensuring the model is never trained to see a reasoning block in its context, just as during inference.
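At inference time, this setup implies stripping the reasoning block from each assistant turn before it re-enters the context. A minimal sketch of that idea (a hypothetical `strip_reasoning` helper, not ArliAI's actual pipeline; it assumes reasoning is wrapped in `<think>...</think>` tags):

```python
import re

def strip_reasoning(messages):
    """Drop <think>...</think> blocks from assistant turns so the model
    never sees earlier reasoning in its context (hypothetical helper)."""
    cleaned = []
    for msg in messages:
        if msg["role"] == "assistant":
            content = re.sub(r"<think>.*?</think>\s*", "", msg["content"], flags=re.DOTALL)
            msg = {**msg, "content": content.strip()}
        cleaned.append(msg)
    return cleaned

history = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "<think>Greet warmly.</think>\nHi there!"},
]
print(strip_reasoning(history)[1]["content"])  # -> Hi there!
```

In practice, many chat frontends and templates perform this stripping automatically for reasoning models.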
The result of training on this dataset with this method is consistently coherent and interesting output even in long multi-turn RP chats. This is, as far as we know, the first correctly-trained reasoning model for RP and creative writing.
You can access the model at https://arliai.com and we also have a models ranking page at https://www.arliai.com/models-ranking
Ask questions in our new Discord Server https://discord.com/invite/t75KbPgwhk or on our subreddit https://www.reddit.com/r/ArliAI/
---
## Use with llama.cpp
Install llama.cpp through brew (works on Mac and Linux)
```bash
brew install llama.cpp
```
Invoke the llama.cpp server or the CLI.
### CLI:
```bash
llama-cli --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q3_k_l.gguf -p "The meaning to life and the universe is"
```
### Server:
```bash
llama-server --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q3_k_l.gguf -c 2048
```
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.
Step 1: Clone llama.cpp from GitHub.
```bash
git clone https://github.com/ggerganov/llama.cpp
```
Step 2: Move into the llama.cpp folder and build it with `LLAMA_CURL=1` flag along with other hardware-specific flags (for ex: LLAMA_CUDA=1 for Nvidia GPUs on Linux).
```bash
cd llama.cpp && LLAMA_CURL=1 make
```
Step 3: Run inference through the main binary.
```bash
./llama-cli --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q3_k_l.gguf -p "The meaning to life and the universe is"
```
or
```bash
./llama-server --hf-repo Triangle104/Qwen3-30B-A3B-ArliAI-RpR-v4-Fast-Q3_K_L-GGUF --hf-file qwen3-30b-a3b-arliai-rpr-v4-fast-q3_k_l.gguf -c 2048
```
|
tomaarsen/splade-mpnet-base-miriad-2e-5-lq-5e-6-lc | tomaarsen | 2025-06-17T10:15:04Z | 0 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"mpnet",
"sparse-encoder",
"sparse",
"splade",
"generated_from_trainer",
"dataset_size:100000",
"loss:SpladeLoss",
"loss:SparseMultipleNegativesRankingLoss",
"loss:FlopsLoss",
"feature-extraction",
"en",
"dataset:tomaarsen/miriad-4.4M-split",
"arxi... | feature-extraction | 2025-06-17T10:14:47Z | ---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sparse-encoder
- sparse
- splade
- generated_from_trainer
- dataset_size:100000
- loss:SpladeLoss
- loss:SparseMultipleNegativesRankingLoss
- loss:FlopsLoss
base_model: microsoft/mpnet-base
widget:
- text: "He does it right, but there are times that he doesn't (Joana) Let's go there\
\ and pee? Because she does not want to wear a diaper, she rips off her diaper\
\ (Filomena). The family caregiver may understand this action as a \"pang\" and\
\ \"tantrum\", and \"forget\" that these episodes are part of the clinical picture\
\ of dementia. Conflicts related to incontinence and other difficult-to-manage\
\ symptoms eventually lead to a variety of interpretations, and past history of\
\ the emotional relationship between the elderly and the family caregiver can\
\ cause older emotional issues to surface again in these episodes.\n\n With psycho-functional\
\ limitations, new demands arise that can be distressing for those who care because\
\ of affective involvement. Subjective constructions are fundamental elements\
\ in upkeeping the relationship of care 10 .\n\n Besides the psychological aspect\
\ involved in the loss of identity and the specific cognitive aspects of dementia,\
\ some behavioral and psychiatric changes are important even in the consultation\
\ with the ESF professionals: psychotic symptoms, agitation and aggression, mood\
\ swings, disinhibited behavior and euphoria, apathy and insomnia. Some studies\
\ [11] [12] [13] pointed out the significant association between the presence\
\ of apathy and a faster cognitive and functional decline in these patients. Another\
\ very relevant situation regarding the appearance of neuropsychiatric symptoms\
\ is the association of these symptoms with the institutionalization and shorter\
\ patient survival. They also showed that the highest Neuropsychiatric Inventory\
\ (NPI) score was signifi-cantly associated with more severe cognitive impairment,\
\ greater caregiver distress, and higher cost, but was not associated with a formal\
\ diagnosis of dementia performed by the primary care physician.\n\n Changed behaviors\
\ and even risky behaviors, such as turning on the gas switch and not turning\
\ off, stirring in pots on a hot stove, or ingestion of liquids or toxic materials\
\ are situations in the face of neuropsychiatric manifestations in dementia. Filomena\
\ reports several neuropsychiatric symptoms of her husband. She compares his behavior\
\ to that of children who explore the environment to discover the cause and effect\
\ of things and the sensations obtained by the senses. Her role in this context\
\ resembles that of a mother trying to prevent the child from getting hurt: He\
\ lights up the gas switch, he's just like a child, sometimes he starts to eat\
\ the slipper, I have to get it out of his mouth.\n\n Hallucination is another\
\ neuropsychiatric symptom described by family caregivers. Joana reports that\
\ when the husband talks to people who have died, the family members feel fear\
\ and distance themselves. Filomena has fun when her mother speaks with those\
\ who have died: \"She talks to those who have passed away, she sends the dog\
\ out, which does not exist\". Each family caregiver experiences the symptoms\
\ presented by the dementia in a unique way, and ways to address and interpret\
\ this phenomenon and give meaning to their experience.\n\n The negative development\
\ of dementia perceived by Celina, Filomena, Maria, Teresa and Joana show that\
\ the disease follows a course that transcends the biological event itself. The\
\ dementia process evidences psychological and sociocultural constructions permeated\
\ by meanings and interpretations according to those who live and those who maintain\
\ interpersonal relationships with the elderly person with dementia.\n\n In the\
\ discourse of family caregivers, seniors with dementia have aggressive behaviors\
\ such as agitation, spitting, cursing, clawing, throwing objects, revealing a\
\ level of aggression that can impact the feelings and interpretations produced\
\ during the care routine. Freud 14 affirms that human instincts are of two types:\
\ Those who tend to preserve and unite, which we call 'erotic' [...] with a deliberate\
\ expansion of the popular conception of 'sexuality'; and those who tend to destroy\
\ and kill, which we group as an aggressive or destructive instinct. All actions\
\ in human life involve the confluence of these two instincts of preservation\
\ and destruction. The ideal situation for life in society would be the dominance\
\ of reason over the instinctual life controlling destructive impulses, which\
\ is utopian. In this perspective, aggressiveness is inherent in the human condition.\n\
\n In seniors with dementia with a declining psychological realm of the Self,\
\ the progressive loss of identity and the repercussion of cognitive decline,\
\ an actual decline in the rational realm of psychic life emerges. This decline\
\ refers to the cerebral aspect of inhibitory control and social cognition, showing\
\ that the emergence of aggressive behaviors is related to the biological component.\
\ The declining reason turns its demands and needs into instinctual acts and more\
\ basic reflexes, and can produce a continuous imbalance in the expression between\
\ the instincts of preservation and aggression.\n\n Aggressiveness can be triggered\
\ by situations of frustration, when they do not get what they want, when they\
\ are afraid or consider some humiliating situation, when they are exposed to\
\ environmental overstimulation or feel any physical pain or side effects from\
\ medication."
- text: "Neurosurgery is of great interest to historians of medicine and technology\
\ because it is relatively young, because it developed in an era of journals and\
\ publications, because lines and traditions of training and mentorship are relatively\
\ clear, and because the technologies that enabled the evolution of the profession\
\ and acted as inflection points in the emergence of certain surgical approaches\
\ and procedures are at once well documented and remarkably unambiguous. To the\
\ extent that is the case for neurosurgery as a whole, it is even more so for\
\ surgery of the skull base.\n\n To trace the history of skull base surgery along\
\ its full expanse is to begin with Horsley and pituitary tumors (unless one wants\
\ to start even earlier with the treatment of trigeminal neuralgia); to move to\
\ Cushing's work in the same arena (but also that of many others as well); to\
\ emphasize the impact of microsurgical techniques and new imaging modalities;\
\ to outline once radically innovative, but now widely practiced anatomical approaches\
\ to the skull base; to emphasize the importance of team approaches; to discuss\
\ emerging therapeutic strategy as well as instrumentation and techniques; to\
\ acknowledge the importance of advances in neuroanesthesia and the medical and\
\ perioperative care of the neurosurgical patient; and to recognize the contributions\
\ of the many individuals who, over the past 25 years, have added to and furthered\
\ the field in these and other ways.\n\n It is not hard to point to leading individuals\
\ and important techniques. It is perhaps more difficult to frame them in a meaningful\
\ historical perspective because the work has occurred relatively recently, in\
\ the time frame historians call \"near history.\" Difficulties arise from both\
\ an evaluative and a nosological standpoint. For example, from an evaluative\
\ standpoint, how does one stratify the relative importance of corticosteroids,\
\ osmotic diuretics, and CSF drainage techniques and technologies in the control\
\ of intracranial pressure and the facilitation of exposure for base of skull\
\ surgery? How does one think about the idea of hybrid surgery and stereotactic\
\ radiation? What will be the long-term view of anatomical approaches to giant\
\ basilar aneurysms in the light of endovascular surgery? Have we reached a tipping\
\ point in the management of vestibular schwannomas, given the availability of\
\ and the outcomes associated with stereotactic radiosurgery?\n\n From a nosological\
\ standpoint, should we think about base of skull surgery in terms of anatomical\
\ approaches? One textbook that does just that starts with subfrontal approaches\
\ and then moves around the calvaria and down to the petrous and temporal region\
\ in a Cook's tour of exposure, in the tradition of Henry's Extensile Exposure\
\ and comparable surgical classics. 1, 6 Other publications have explored a set\
\ of technologies. 5, 7, 10 Another focuses on the contribution of great men.\
\ 9 Many surgeons have written about specific particular pathologies at the skull\
\ base.\n\n Introduction their colleagues write about the premodern period. Elhadi\
\ and colleagues also comment on the introduction of radiography in early neurosurgery.\
\ Gross and Grossi and their colleagues concentrate on petrosal approaches; Schmitt\
\ and Jane on third ventriculostomy; and Chittiboina and colleagues on the history\
\ of a very simple but ubiquitous instrument, the Freer elevator, and its inventor.\
\ In contrast to the more comprehensive overviews written by Goodrich, Donald,\
\ and others, these essays concentrate on selected details. While it is important\
\ not to miss the forest for the trees, sometimes the trees are worth studying\
\ no less than the forest. \n\n The authors report no conflict of interest."
- text: 'How do neuromediators contribute to the pathogenesis of pruritus in AD?
'
- text: "Pericardial effusion (PE) is a life-threatening condition, as accumulation\
\ of fluid in the pericardial sac can lead to cardiac tamponade and fatal shock.\
\ 1, 2 PE is often associated with an underlying disease or condition, and the\
\ causes can vary widely. 3, 4 Pericardiocentesis performed by needle (with or\
\ without echoguidance), and various surgical procedures (including subxiphoid\
\ pericardial tube drainage, pericardial window performed through a left anterior\
\ thoracotomy, or video-assisted thoracoscopic surgery) can alleviate PE. 5 Our\
\ retrospective clinical experiences of treating PE with subxiphoid pericardiostomy\
\ are presented in this study.\n\n We reviewed the medical records of patients\
\ who underwent subxiphoid pericardiostomy to treat persistent symptomatic PE\
\ in our clinic between 1990 and 2000. Echocardiography (ECG) was used to diagnose\
\ PE and N Becit, A Özyazicioglu, M Ceviz et al.\n\n determine the size of the\
\ effusion. A diastolic echo-free space of < 10 mm between the left ventricular\
\ posterior wall and pericardium was determined as mild PE, 10 -20 mm as moderate,\
\ and > 20 mm as severe PE. Patients with cardiac tamponade and/or moderate to\
\ severe PE were treated by subxiphoid pericardiostomy and tube drainage.\n\n\
\ Some patients with pre-operative tuberculosis were treated with an adult fourdrug\
\ regimen (isoniazid, 300 mg/day and rifampin, 600 mg/day for 12 months, streptomycin,\
\ 1 g/day for 2 months, and pyrazinamide, 2 g/day for 3 months) preoperatively.\
\ The effusion was drained after a 3-week course of anti-tuberculosis therapy.\
\ In these, and patients diagnosed with tuberculous pericarditis, the tuberculosis\
\ therapy regimen was given for 12 months post-operatively.\n\n The technique\
\ used for subxiphoid pericardiostomy (described previously 3 ) was performed\
\ under general anaesthetic, or local anaesthesia and sedation. General anaesthesia\
\ was preferred in children and was induced with 1.5 mg/kg ketamine. Neuromuscular\
\ block was achieved with 0.1 mg/kg vecuronium, and anaesthesia maintained with\
\ 60% N 2 O, 40% O 2 and 0.5 -1.0% isoflurane. Local anaesthetic (2% lidocaine\
\ solution) was injected into the dermal and subdermal layers, and sedation and\
\ analgesia was provided by 1 mg/kg ketamine intravenously. A piece of anterior\
\ pericardium, approximately 2 -4 cm in diameter, was excised under direct vision\
\ and submitted for histopathological analysis. The pericardial cavity was decompressed\
\ and fluid samples were collected for culture and cytological analysis. To prevent\
\ acute cardiac dilatation during decompression of the pericardial cavity, intravenous\
\ digoxin was administered and the pericardial cavity was decompressed gradually.\n\
\n The pericardial cavity was examined under direct vision and/or by digital examination\
\ to detect any tumour or adhesions. Gentle digital lysis of adhesions and opening\
\ of loculations were performed as needed, to enhance satisfactory drainage. A\
\ soft chest tube was placed in the pericardial cavity, lateral to the right ventricle,\
\ after pericardiotomy for post-operative drainage. It was connected to an underwater\
\ sealed system, and was removed when fluid drainage ceased.\n\n Patients with\
\ mild haemorrhagic effusion and cardiac tamponade, due to trauma or invasive\
\ cardiac interventions, were considered haemodynamically unstable and unsuitable\
\ for surgical subxiphoid pericardiostomy, even under local anaesthetic. These\
\ patients underwent pericardiocentesis in the intensive care unit, which provided\
\ immediate relief. Subxiphoid pericardiostomy was performed later if haemorrhagic\
\ PE persisted. Patients were followed, with physical examinations and ECG, in\
\ the outpatient clinic for at least 1 year.\n\n Numerical results are given as\
\ mean ± SD. Fisher's exact test was used to compare proportions between groups\
\ (comparison of the rates of recurrence and constriction between patient groups\
\ with uraemic pericarditis, tuberculous pericarditis and non-tuberculous bacterial\
\ pericarditis). The McNemar test was used for comparison of proportions within\
\ one group (to assess the significance of rates of recurrence and constriction\
\ in patients with tuberculous pericarditis). Statistical differences were considered\
\ significant if P < 0.05."
- text: "Henry M. Blumberg, MD In this issue of Infection Control and Hospital Epidemiology,\
\ a potpourri of tuberculosis (TB)-related articles are being published. 1-7 Tuberculosisrelated\
\ issues have been an important focus for the past decade for those in infection\
\ control and hospital epidemiology, especially in urban areas where the large\
\ majority of TB cases occur, 8 but also, because of federal regulations, for\
\ those in low-endemic areas or areas where no TB cases occur (approximately half\
\ of the counties in the United States).\n\n The resurgence of TB beginning in\
\ the mid1980s in the United States (in large part, due to failure and underfunding\
\ of the public health infrastructure and to the epidemic of human immunodeficiency\
\ virus [HIV] infection) and outbreaks of TB have highlighted the risk of nosocomial\
\ transmission of TB. 9,10 These outbreaks affected both healthcare workers (HCWs)\
\ and patients. The fact that outbreaks in New York and Miami, among others, involved\
\ multidrug-resistant (MDR) strains that were associated with high morbidity and\
\ mortality among HIV-infected individuals punctuated the importance of effective\
\ TB infection control measures. Commingling of patients with unsuspected TB and\
\ those who were quite immunosuppressed led to amplification of nosocomial transmission.\
\ A decade ago, few institutions were prepared for the changing epidemiology of\
\ TB.\n\n Several recent studies have demonstrated that infection control measures\
\ are effective in preventing nosocomial transmission of TB, 11-13 and two reports\
\ in this issue, from institutions in Kentucky 1 and New York, 2 provide additional\
\ data on decreases in HCW tuberculin skin-test (TST) conversions following implementation\
\ of TB infection control measures. In most studies, multiple interventions (administrative\
\ controls, environmental controls, and respiratory protection) were initiated\
\ at approximately the same time, making it more difficult to identify the most\
\ crucial aspect of the program. The importance of TB infection control measures\
\ in contributing to the decline in TB cases in the United States, as well as\
\ the reduction in the number of MDR-TB cases in New York City, often has been\
\ understated. Increased federal funding for TB control activities and expansion\
\ of directly observed therapy clearly are important in efforts to prevent TB,\
\ but the initial decline in TB cases and in MDR TB in the United States beginning\
\ in 1993 likely was due, in large part, to interruption of TB transmission within\
\ healthcare facilities. Unfortunately, increased funding for TB control in the\
\ United States in the last 5 years often has not trickled down to inner-city\
\ hospitals, which frequently are the first line in the battle against TB.\n\n\
\ From our experience and that of others, it appears clear that administrative\
\ controls are the most important component of a TB infection control program.\
\ At Grady Memorial Hospital in Atlanta, we were able to decrease TB exposure\
\ episodes markedly and concomitantly to decrease HCW TST conversions after implementing\
\ an expanded respiratory isolation policy. 11 We continue to isolate appropriately\
\ approximately 95% of those subsequently diagnosed with TB. We were able to reduce\
\ TST conver-sion rates markedly during a period of time in which we had isolation\
\ rooms that would be considered suboptimal by Centers for Disease Control and\
\ Prevention (CDC) guidelines 14 (rooms that were under negative pressure but\
\ had less than six air changes per hour) and were using submicron masks. Implementation\
\ of better-engineered isolation rooms (>12 air changes per hour) with the completion\
\ of renovations to the hospital may have put us in better compliance with regulatory\
\ agencies and made the staff feel more secure, but has had little impact on further\
\ reducing low rates of HCW TST conversions. In addition, the termination of outbreaks\
\ and reduction of TST conversion rates at several institutions took place before\
\ introduction of National Institute for Occupational Safety and Health-approved\
\ masks and fit testing. 2,15,16 United States healthcare institutions are required\
\ by regulatory mandates to develop a \"respiratory protection program\" (including\
\ fit testing), which can be time-consuming, expensive, and logistically difficult.\
\ 17 Data published to date suggest that the impact of formal fit testing on proper\
\ mask use is small. 18 These federal mandates also have turned some well-meaning\
\ (trying to comply fully with the Occupational Safety and Health Administration\
\ [OSHA] regulations) but misguided infection control practitioners into \"facial\
\ hair police.\" These types of processes divert time, effort, and resources away\
\ from what truly is effective in preventing nosocomial transmission of TB, as\
\ well as from other important infection control activities such as preventing\
\ nosocomial bloodstream infections or transmission of highly resistant pathogens\
\ such as vancomycin-resistant Enterococcus or preparing for the onslaught of\
\ vancomycin-resistant Staphylococcus aureus. At a time when US healthcare institutions\
\ are under enormous pressure due to healthcare reform, market forces, and managed\
\ care, it is essential that federal regulatory agencies look carefully at scientific\
\ data when issuing regulations."
datasets:
- tomaarsen/miriad-4.4M-split
pipeline_tag: feature-extraction
library_name: sentence-transformers
metrics:
- dot_accuracy@1
- dot_accuracy@3
- dot_accuracy@5
- dot_accuracy@10
- dot_precision@1
- dot_precision@3
- dot_precision@5
- dot_precision@10
- dot_recall@1
- dot_recall@3
- dot_recall@5
- dot_recall@10
- dot_ndcg@10
- dot_mrr@10
- dot_map@100
- query_active_dims
- query_sparsity_ratio
- corpus_active_dims
- corpus_sparsity_ratio
co2_eq_emissions:
emissions: 196.23895298915153
energy_consumed: 0.504857070427092
source: codecarbon
training_type: fine-tuning
on_cloud: false
cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
ram_total_size: 31.777088165283203
hours_used: 1.484
hardware_used: 1 x NVIDIA GeForce RTX 3090
model-index:
- name: MPNet-base trained on MIRIAD question-passage tuples
results:
- task:
type: sparse-information-retrieval
name: Sparse Information Retrieval
dataset:
name: miriad eval
type: miriad_eval
metrics:
- type: dot_accuracy@1
value: 0.917
name: Dot Accuracy@1
- type: dot_accuracy@3
value: 0.963
name: Dot Accuracy@3
- type: dot_accuracy@5
value: 0.969
name: Dot Accuracy@5
- type: dot_accuracy@10
value: 0.98
name: Dot Accuracy@10
- type: dot_precision@1
value: 0.917
name: Dot Precision@1
- type: dot_precision@3
value: 0.32099999999999995
name: Dot Precision@3
- type: dot_precision@5
value: 0.1938
name: Dot Precision@5
- type: dot_precision@10
value: 0.09800000000000002
name: Dot Precision@10
- type: dot_recall@1
value: 0.917
name: Dot Recall@1
- type: dot_recall@3
value: 0.963
name: Dot Recall@3
- type: dot_recall@5
value: 0.969
name: Dot Recall@5
- type: dot_recall@10
value: 0.98
name: Dot Recall@10
- type: dot_ndcg@10
value: 0.9509329680619819
name: Dot Ndcg@10
- type: dot_mrr@10
value: 0.9414055555555555
name: Dot Mrr@10
- type: dot_map@100
value: 0.9422311263243918
name: Dot Map@100
- type: query_active_dims
value: 72.48699951171875
name: Query Active Dims
- type: query_sparsity_ratio
value: 0.9976254791000846
name: Query Sparsity Ratio
- type: corpus_active_dims
value: 291.5419921875
name: Corpus Active Dims
- type: corpus_sparsity_ratio
value: 0.9904497005212599
name: Corpus Sparsity Ratio
- task:
type: sparse-information-retrieval
name: Sparse Information Retrieval
dataset:
name: miriad test
type: miriad_test
metrics:
- type: dot_accuracy@1
value: 0.9
name: Dot Accuracy@1
- type: dot_accuracy@3
value: 0.953
name: Dot Accuracy@3
- type: dot_accuracy@5
value: 0.961
name: Dot Accuracy@5
- type: dot_accuracy@10
value: 0.974
name: Dot Accuracy@10
- type: dot_precision@1
value: 0.9
name: Dot Precision@1
- type: dot_precision@3
value: 0.31766666666666665
name: Dot Precision@3
- type: dot_precision@5
value: 0.19220000000000004
name: Dot Precision@5
- type: dot_precision@10
value: 0.09740000000000001
name: Dot Precision@10
- type: dot_recall@1
value: 0.9
name: Dot Recall@1
- type: dot_recall@3
value: 0.953
name: Dot Recall@3
- type: dot_recall@5
value: 0.961
name: Dot Recall@5
- type: dot_recall@10
value: 0.974
name: Dot Recall@10
- type: dot_ndcg@10
value: 0.9387955628253912
name: Dot Ndcg@10
- type: dot_mrr@10
value: 0.9273035714285714
name: Dot Mrr@10
- type: dot_map@100
value: 0.9283432155352948
name: Dot Map@100
- type: query_active_dims
value: 73.08399963378906
name: Query Active Dims
- type: query_sparsity_ratio
value: 0.9976059226378685
name: Query Sparsity Ratio
- type: corpus_active_dims
value: 293.2669982910156
name: Corpus Active Dims
- type: corpus_sparsity_ratio
value: 0.9903931929671761
name: Corpus Sparsity Ratio
---
# MPNet-base trained on MIRIAD question-passage tuples
This is a [SPLADE Sparse Encoder](https://www.sbert.net/docs/sparse_encoder/usage/usage.html) model finetuned from [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) on the [miriad-4.4M-split](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split) dataset using the [sentence-transformers](https://www.SBERT.net) library. It maps sentences & paragraphs to a 30527-dimensional sparse vector space and can be used for semantic search and sparse retrieval.
## Model Details
### Model Description
- **Model Type:** SPLADE Sparse Encoder
- **Base model:** [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) <!-- at revision 6996ce1e91bd2a9c7d7f61daec37463394f73f09 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 30527 dimensions
- **Similarity Function:** Dot Product
- **Training Dataset:**
  - [miriad-4.4M-split](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split)
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Documentation:** [Sparse Encoder Documentation](https://www.sbert.net/docs/sparse_encoder/usage/usage.html)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sparse Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=sparse-encoder)
### Full Model Architecture
```
SparseEncoder(
(0): MLMTransformer({'max_seq_length': 512, 'do_lower_case': False}) with MLMTransformer model: MPNetForMaskedLM
(1): SpladePooling({'pooling_strategy': 'max', 'activation_function': 'relu', 'word_embedding_dimension': 30527})
)
```
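The `SpladePooling` module above aggregates the per-token MLM logits into one sparse vector: for each of the 30527 vocabulary dimensions it applies the standard SPLADE saturation, `log(1 + relu(logit))`, and takes the max over token positions. A minimal pure-Python sketch of that pooling rule (toy logits only, not the trained model; the real module operates on the `MLMTransformer` output tensor):

```python
import math

def splade_pool(logits):
    """SPLADE max pooling: given per-token MLM logits (seq_len x vocab),
    return one vector of size vocab where each dimension is the max over
    tokens of log(1 + relu(logit))."""
    vocab_size = len(logits[0])
    return [
        max(math.log1p(max(row[j], 0.0)) for row in logits)
        for j in range(vocab_size)
    ]

# Toy example: 2 token positions, 3 "vocabulary" dimensions
logits = [
    [1.0, -2.0, 0.0],
    [0.5, 3.0, -1.0],
]
print(splade_pool(logits))  # third dimension stays 0.0, i.e. inactive (sparse)
```

The ReLU plus log saturation is what keeps most dimensions at exactly zero, which is reflected in the sparsity ratios reported in the metrics above.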
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SparseEncoder
# Download from the 🤗 Hub
model = SparseEncoder("tomaarsen/splade-mpnet-base-miriad-2e-5-lq-5e-6-lc")
# Run inference
queries = [
"How have infection control measures been effective in preventing nosocomial transmission of TB?\n",
]
documents = [
'Henry M. Blumberg, MD In this issue of Infection Control and Hospital Epidemiology, a potpourri of tuberculosis (TB)-related articles are being published. 1-7 Tuberculosisrelated issues have been an important focus for the past decade for those in infection control and hospital epidemiology, especially in urban areas where the large majority of TB cases occur, 8 but also, because of federal regulations, for those in low-endemic areas or areas where no TB cases occur (approximately half of the counties in the United States).\n\n The resurgence of TB beginning in the mid1980s in the United States (in large part, due to failure and underfunding of the public health infrastructure and to the epidemic of human immunodeficiency virus [HIV] infection) and outbreaks of TB have highlighted the risk of nosocomial transmission of TB. 9,10 These outbreaks affected both healthcare workers (HCWs) and patients. The fact that outbreaks in New York and Miami, among others, involved multidrug-resistant (MDR) strains that were associated with high morbidity and mortality among HIV-infected individuals punctuated the importance of effective TB infection control measures. Commingling of patients with unsuspected TB and those who were quite immunosuppressed led to amplification of nosocomial transmission. A decade ago, few institutions were prepared for the changing epidemiology of TB.\n\n Several recent studies have demonstrated that infection control measures are effective in preventing nosocomial transmission of TB, 11-13 and two reports in this issue, from institutions in Kentucky 1 and New York, 2 provide additional data on decreases in HCW tuberculin skin-test (TST) conversions following implementation of TB infection control measures. In most studies, multiple interventions (administrative controls, environmental controls, and respiratory protection) were initiated at approximately the same time, making it more difficult to identify the most crucial aspect of the program. 
The importance of TB infection control measures in contributing to the decline in TB cases in the United States, as well as the reduction in the number of MDR-TB cases in New York City, often has been understated. Increased federal funding for TB control activities and expansion of directly observed therapy clearly are important in efforts to prevent TB, but the initial decline in TB cases and in MDR TB in the United States beginning in 1993 likely was due, in large part, to interruption of TB transmission within healthcare facilities. Unfortunately, increased funding for TB control in the United States in the last 5 years often has not trickled down to inner-city hospitals, which frequently are the first line in the battle against TB.\n\n From our experience and that of others, it appears clear that administrative controls are the most important component of a TB infection control program. At Grady Memorial Hospital in Atlanta, we were able to decrease TB exposure episodes markedly and concomitantly to decrease HCW TST conversions after implementing an expanded respiratory isolation policy. 11 We continue to isolate appropriately approximately 95% of those subsequently diagnosed with TB. We were able to reduce TST conver-sion rates markedly during a period of time in which we had isolation rooms that would be considered suboptimal by Centers for Disease Control and Prevention (CDC) guidelines 14 (rooms that were under negative pressure but had less than six air changes per hour) and were using submicron masks. Implementation of better-engineered isolation rooms (>12 air changes per hour) with the completion of renovations to the hospital may have put us in better compliance with regulatory agencies and made the staff feel more secure, but has had little impact on further reducing low rates of HCW TST conversions. 
In addition, the termination of outbreaks and reduction of TST conversion rates at several institutions took place before introduction of National Institute for Occupational Safety and Health-approved masks and fit testing. 2,15,16 United States healthcare institutions are required by regulatory mandates to develop a "respiratory protection program" (including fit testing), which can be time-consuming, expensive, and logistically difficult. 17 Data published to date suggest that the impact of formal fit testing on proper mask use is small. 18 These federal mandates also have turned some well-meaning (trying to comply fully with the Occupational Safety and Health Administration [OSHA] regulations) but misguided infection control practitioners into "facial hair police." These types of processes divert time, effort, and resources away from what truly is effective in preventing nosocomial transmission of TB, as well as from other important infection control activities such as preventing nosocomial bloodstream infections or transmission of highly resistant pathogens such as vancomycin-resistant Enterococcus or preparing for the onslaught of vancomycin-resistant Staphylococcus aureus. At a time when US healthcare institutions are under enormous pressure due to healthcare reform, market forces, and managed care, it is essential that federal regulatory agencies look carefully at scientific data when issuing regulations.',
    'Drug Reaction with Eosinophilia and Systemic Symptoms (DRESS) syndrome is a severe and potentially life-threatening hypersensitivity reaction caused by exposure to certain medications (Phillips et al., 2011; Bocquet et al., 1996) . It is extremely heterogeneous in its manifestation but has characteristic delayed-onset cutaneous and multisystem features with a protracted natural history. The reaction typically starts with a fever, followed by widespread skin eruption of variable nature. This progresses to inflammation of internal organs such as hepatitis, pneumonitis, myocarditis and nephritis, and haematological abnormalities including eosinophilia and atypical lymphocytosis (Kardaun et al., 2013; Cho et al., 2017) .\n\n DRESS syndrome is most commonly classified according to the international scoring system developed by the RegiSCAR group (Kardaun et al., 2013) . RegiSCAR accurately defines the syndrome by considering the major manifestations, with each feature scored between −1 and 2, and 9 being the maximum total number of points. According to this classification, a score of < 2 means no case, 2-3 means possible case, 4-5 means probable case, and 6 or above means definite DRESS syndrome. Table 1 gives an overview of the RegiSCAR scoring system. DRESS syndrome usually develops 2 to 6 weeks after exposure to the causative drug, with resolution of symptoms after drug withdrawal in the majority of cases (Husain et al., 2013a) . Some patients require supportive treatment with corticosteroids, although there is a lack of evidence surrounding the most effective dose, route and duration of the therapy (Adwan, 2017) . Although extremely rare, with an estimated population risk of between 1 and 10 in 10,000 drug exposures, it is significant due to its high mortality rate, at around 10% (Tas and The pathogenesis of DRESS syndrome remains largely unknown. 
Current evidence suggests that patients may be genetically predisposed to this form of hypersensitivity, with a superimposed risk resulting from Human Herpes Virus (HHV) exposure and subsequent immune reactivation (Cho et al., 2017; Husain et al., 2013a) . In fact, the serological detection of HHV-6 has even been proposed as an additional diagnostic marker for DRESS syndrome (Shiohara et al., 2007) . Other potential risk factors identified are family history (Sullivan and Shear, 2001; Pereira De Silva et al., 2011) and concomitant drug use, particularly antibiotics . DRESS syndrome appears to occur in patients of any age, with patient demographics from several reviews finding age ranges between 6 and 89 years (Picard et al., 2010; Kano et al., 2015; Cacoub et al., 2013) . DRESS syndrome was first described as an adverse reaction to antiepileptic therapy, but has since been recognised as a complication of an extremely wide range of medications (Adwan, 2017) . In rheumatology, it has been classically associated with allopurinol and sulfasalazine, but has also been documented in association with many other drugs including leflunomide, hydroxychloroquine, febuxostat and NSAIDs (Adwan, 2017) . Recent evidence has also identified a significant risk of DRESS syndrome with strontium ranelate use (Cacoub et al., 2013) . Thus far, that is the only anti-osteoporotic drug associated with DRESS syndrome, although there are various cases of other adverse cutaneous reactions linked to anti-osteoporotic medications, ranging from benign maculopapular eruption to Stevens-Johnson syndrome (SJS) and Toxic Epidermal Necrolysis (TEN) . 
Denosumab, an antiresorptive RANK ligand (RANKL) inhibitor licensed for osteoporosis, is currently known to be associated with some dermatological manifestations including dermatitis, eczema, pruritus and, less commonly, cellulitis (Prolia, n.d.).\n\n We hereby describe the first documented case of DRESS syndrome associated with denosumab treatment.\n\n The patient is a 76-year old female with osteoporosis and a background of alcoholic fatty liver disease and lower limb venous insufficiency. Osteoporosis was first diagnosed in 2003 and treated with risedronate, calcium and vitamin D, until 2006. While on this treatment, the patient sustained T12 and L3 fractures, the latter treated with kyphoplasty, and was therefore deemed a non-responder to risedronate.',
    "The regulation of these events is known to go awry in certain pathologies especially in diseases associated with neurodegeneration. Mitochondrial fission helps to enhance the number of mitochondria, which can be efficiently distributed to each corner of neuronal cells and thus helps them to maintain their energy demands. Mitochondrial fission is highly essential during the periods of energy starvation to produce new, efficient mitochondrial energy generating systems. However, enhanced fission associated with bioenergetic crisis causes BAX foci formation on mitochondrial membrane and thus causes mitochondrial outer membrane permeabilization (MOMP), releasing cytochrome c and other pro apoptotic mediators into cytosol, results in apoptosis [93] . Impairment in the mitochondrial dynamics has also been observed in case of inflammatory neuropathies and oxaliplatin induced neuropathy [94] . Excessive nitric oxide is known to cause s-nitrosylation of dynamin related protein-1 (Drp-1), and increases the mitochondrial fission [95, 96] . Tumor necrosis factor-α (TNF-α) reported to inhibit the kinensin 1 protein, and thus impairs trafficking by halting mitochondrial movement along axons [97] . In addition to impaired dynamics, aggregates of abnormal shaped, damaged mitochondria are responsible for aberrant mitochondrial trafficking, which contributes to axonal degeneration observed in various peripheral neuropathies [81] .\n\n Autophagy is the discerning cellular catabolic process responsible for recycling the damaged proteins/ organelles in the cells [98] . Mitophagy is a selective autophagic process involved in recycling of damaged mitochondria and helps in supplying the constituents for mitochondrial biogenesis [99] . Excessive accumulation and impaired clearance of dysfunctional mitochondria are known to be observed in various disorders associated with oxidative stress [100] . 
Oxidative damage to Atg 4, a key component involved in mitophagy causes impaired autophagosome formation and clearance of damaged mitochondria [101] . Loss in the function of molecular chaperons and associated accumulation of damaged proteins are known to be involved in various peripheral neuropathies including trauma induced neuropathy [102, 103] . A model of demyelinating neuropathy corresponds to the accumulation of improperly folded myelin protein PMP-22 is also being observed recently [104, 105] .\n\n Mitochondrial dysfunction and associated disturbances are well connected to neuroinflammatory changes that occur in various neurodegenerative diseases [106] . Dysfunctional mitochondria are also implicated in several pathologies such as cardiovascular and neurodegenerative diseases. Several mitochondrial toxins have been found to inhibit the respiration in microglial cells and also inhibit IL-4 induced alternative anti inflammatory response and thus potentiates neuroinflammation [107] . Mitochondrial ROS are well identified to be involved in several inflammatory pathways such as NF-κB, MAPK activation [108] . Similarly, the pro inflammatory mediators released as a result of an inflammatory episode found to be interfere with the functioning of the mitochondrial electron transport chain and thus compromise ATP production [109] . TNF-α is known to inhibit the complex I, IV of ETC and decreases energy production. Nitric oxide (NO) is a potent inhibitor of cytochrome c oxidase (complex IV) and similarly IL-6 is also known to enhance mitochondrial generation of superoxide [110] . Mitochondrial dysfunction initiates inflammation by increased formation of complexes of damaged mitochondrial parts and cytoplasmic pattern recognition receptors (PRR's). The resulting inflammasome directed activation of interleukin-1β production, which starts an immune response and leads to Fig. (4) . 
Mitotoxicity in peripheral neuropathies: Various pathophysiological insults like hyperglycemic, chemotherapeutic and traumatic injury to the peripheral nerves results in mitochondrial dysfunction through enhanced generation of ROS induced biomolecular damage and bioenergetic crisis. Following the nerve injury accumulation of mitochondria occurs resulting in the release of mtDNA & formyl peptides into circulation which acts as Death associated molecular patterns (DAMP's). These are recognized by immune cells as foreign bodies and can elicit a local immune/inflammatory response. Interaction between inflammatory mediators and structural proteins involved in mitochondrial trafficking will cause impairment in mitochondrial motility. Oxidative stress induced damage to the mt proteins like Atg4, Parkin etc cause insufficient mitophagy. Excess nitrosative stress also results in excessive mt fission associated with apoptosis. In addition, mtDNA damage impairs its transcription and reduces mitochondrial biogenesis. Ca 2+ dyshomeostasis, loss in mitochondrial potential and bioenergetic crisis cause neuronal death via apoptosis/necrosis. All these modifications cause defects in ultra structure, physiology and trafficking of mitochondria resulting in loss of neuronal function producing peripheral neuropathy.",
]
query_embeddings = model.encode_query(queries)
document_embeddings = model.encode_document(documents)
print(query_embeddings.shape, document_embeddings.shape)
# [1, 30527] [3, 30527]
# Get the similarity scores for the embeddings
similarities = model.similarity(query_embeddings, document_embeddings)
print(similarities)
# tensor([[38.6532, 2.9277, 0.1620]])
```
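Because each embedding lives in the model's vocabulary space, its highest-weighted dimensions correspond directly to (sub)word tokens. A hedged sketch of inspecting the active dimensions with plain `torch` (a hand-built toy vector stands in for a real `encode_query` output; mapping ids back to token strings would go through the model's tokenizer):

```python
import torch

# Toy stand-in for one 30527-dimensional sparse query embedding:
# only three dimensions carry weight, all others are zero.
emb = torch.zeros(30527)
emb[torch.tensor([2054, 8985, 102])] = torch.tensor([2.1, 1.4, 0.3])

# Active (non-zero) dimensions, and the strongest ones by weight.
active = emb.nonzero(as_tuple=True)[0]
weights, dims = emb.topk(k=3)

print(active.numel())  # 3
print(dims.tolist())   # [2054, 8985, 102]
```

The same pattern applies to real query and document embeddings, where typically a few dozen to a few hundred of the 30527 dimensions are active (compare the `query_active_dims` and `corpus_active_dims` metrics below).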
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Sparse Information Retrieval
* Datasets: `miriad_eval` and `miriad_test`
* Evaluated with [<code>SparseInformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sparse_encoder/evaluation.html#sentence_transformers.sparse_encoder.evaluation.SparseInformationRetrievalEvaluator)
| Metric | miriad_eval | miriad_test |
|:----------------------|:------------|:------------|
| dot_accuracy@1 | 0.917 | 0.9 |
| dot_accuracy@3 | 0.963 | 0.953 |
| dot_accuracy@5 | 0.969 | 0.961 |
| dot_accuracy@10 | 0.98 | 0.974 |
| dot_precision@1 | 0.917 | 0.9 |
| dot_precision@3 | 0.321 | 0.3177 |
| dot_precision@5 | 0.1938 | 0.1922 |
| dot_precision@10 | 0.098 | 0.0974 |
| dot_recall@1 | 0.917 | 0.9 |
| dot_recall@3 | 0.963 | 0.953 |
| dot_recall@5 | 0.969 | 0.961 |
| dot_recall@10 | 0.98 | 0.974 |
| **dot_ndcg@10** | **0.9509** | **0.9388** |
| dot_mrr@10 | 0.9414 | 0.9273 |
| dot_map@100 | 0.9422 | 0.9283 |
| query_active_dims | 72.487 | 73.084 |
| query_sparsity_ratio | 0.9976 | 0.9976 |
| corpus_active_dims | 291.542 | 293.267 |
| corpus_sparsity_ratio | 0.9904 | 0.9904 |
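The `*_active_dims` and `*_sparsity_ratio` rows count the non-zero entries per embedding and express them as a fraction of the 30527 dimensions. A small sketch of how such figures arise (toy embeddings, not the evaluator's actual code):

```python
import torch

# Two toy 30527-dimensional sparse embeddings with 72 and 74 active dims,
# roughly mirroring the query statistics reported above.
vocab_size = 30527
embs = torch.zeros(2, vocab_size)
embs[0, :72] = 1.0
embs[1, :74] = 1.0

# Mean number of non-zero dimensions per embedding ...
active_dims = (embs != 0).sum(dim=1).float().mean().item()
# ... and the fraction of dimensions that are zero on average.
sparsity_ratio = 1.0 - active_dims / vocab_size

print(active_dims)               # 73.0
print(round(sparsity_ratio, 4))  # 0.9976
```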
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### miriad-4.4M-split
* Dataset: [miriad-4.4_m-split](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split) at [596b9ab](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split/tree/596b9ab305d52cb73644ed5b5004957c7bfaae40)
* Size: 100,000 training samples
* Columns: <code>question</code> and <code>passage_text</code>
* Approximate statistics based on the first 1000 samples:
| | question | passage_text |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 9 tokens</li><li>mean: 23.38 tokens</li><li>max: 71 tokens</li></ul> | <ul><li>min: 511 tokens</li><li>mean: 512.0 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| question | passage_text |
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>What factors may contribute to increased pulmonary conduit durability in patients who undergo the Ross operation compared to those with right ventricular outflow tract obstruction?<br></code> | <code>I n 1966, Ross and Somerville 1 reported the first use of an aortic homograft to establish right ventricle-to-pulmonary artery continuity in a patient with tetralogy of Fallot and pulmonary atresia. Since that time, pulmonary position homografts have been used in a variety of right-sided congenital heart lesions. Actuarial 5-year homograft survivals for cryopreserved homografts are reported to range between 55% and 94%, with the shortest durability noted in patients less than 2 years of age. 4 Pulmonary position homografts also are used to replace pulmonary autografts explanted to repair left-sided outflow disease (the Ross operation). Several factors may be likely to favor increased pulmonary conduit durability in Ross patients compared with those with right ventricular outflow tract obstruction, including later age at operation (allowing for larger homografts), more normal pulmonary artery architecture, absence of severe right ventricular hypertrophy, and more natural positioning of ...</code> |
| <code>How does MCAM expression in hMSC affect the growth and maintenance of hematopoietic progenitors?</code> | <code>After culture in a 3-dimensional hydrogel-based matrix, which constitutes hypoxic conditions, MCAM expression is lost. Concordantly, Tormin et al. demonstrated that MCAM is down-regulated under hypoxic conditions. 10 Furthermore, it was shown by others and our group that oxygen tension causes selective modification of hematopoietic cell and mesenchymal stromal cell interactions in co-culture systems as well as influence HSPC metabolism. [44] [45] [46] Thus, the observed differences between Sharma et al. and our data in HSPC supporting capacity of hMSC are likely due to the different culture conditions used. Further studies are required to clarify the influence of hypoxia in our model system. Altogether these findings provide further evidence for the importance of MCAM in supporting HSPC. Furthermore, previous reports have shown that MCAM is down-regulated in MSC after several passages as well as during aging and differentiation. 19, 47 Interestingly, MCAM overexpression in hMSC enhance...</code> |
  | <code>What is the relationship between Fanconi anemia and breast and ovarian cancer susceptibility genes?<br></code> | <code>( 31 ) , of which 5% -10 % may be caused by genetic factors ( 32 ) , up to half a million of these patients may be at risk of secondary hereditary neoplasms. The historic observation of twofold to fivefold increased risks of cancers of the ovary, thyroid, and connective tissue after breast cancer ( 33 ) presaged the later syndromic association of these tumors with inherited mutations of BRCA1, BRCA2, PTEN, and p53 ( 16 ) . By far the largest cumulative risk of a secondary cancer in BRCA mutation carriers is associated with cancer in the contralateral breast, which may reach a risk of 29.5% at 10 years ( 34 ) . The Breast Cancer Linkage Consortium ( 35 , 36 ) also documented threefold to fivefold increased risks of subsequent cancers of prostate, pancreas, gallbladder, stomach, skin (melanoma), and uterus in BRCA2 mutation carriers and twofold increased risks of prostate and pancreas cancer in BRCA1 mutation carriers; these results are based largely on self-reported family history inf...</code> |
* Loss: [<code>SpladeLoss</code>](https://sbert.net/docs/package_reference/sparse_encoder/losses.html#spladeloss) with these parameters:
```json
{
"loss": "SparseMultipleNegativesRankingLoss(scale=1.0, similarity_fct='dot_score')",
"lambda_corpus": 5e-06,
"lambda_query": 2e-05
}
```
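Conceptually, `SpladeLoss` adds FLOPS sparsity regularizers on the query and corpus embeddings to the ranking loss, weighted by `lambda_query` and `lambda_corpus`. A rough sketch of that combination, not the library's implementation (`flops` and `splade_loss` are illustrative helpers; the FLOPS term is the sum over vocabulary dimensions of the squared mean activation):

```python
import torch
import torch.nn.functional as F

def flops(embs: torch.Tensor) -> torch.Tensor:
    # FLOPS regularizer: sum_j (mean_i |e_ij|)^2 over the batch dimension i.
    return (embs.abs().mean(dim=0) ** 2).sum()

def splade_loss(q, d, lambda_query=2e-5, lambda_corpus=5e-6):
    # In-batch multiple-negatives ranking loss with dot-product scores
    # (scale=1.0): query i should score highest against its own document i.
    scores = q @ d.T                     # (batch, batch) similarity matrix
    labels = torch.arange(q.size(0))     # diagonal entries are the positives
    ranking = F.cross_entropy(scores, labels)
    return ranking + lambda_query * flops(q) + lambda_corpus * flops(d)

# Random non-negative stand-ins for a batch of sparse embeddings.
q = torch.relu(torch.randn(4, 30527))
d = torch.relu(torch.randn(4, 30527))
loss = splade_loss(q, d)
```

The small `lambda` weights keep the sparsity pressure gentle relative to the ranking objective; note that `lambda_corpus` (5e-06) is lower than `lambda_query` (2e-05), allowing documents to stay denser than queries, consistent with the active-dims metrics above.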
### Evaluation Dataset
#### miriad-4.4M-split
* Dataset: [miriad-4.4_m-split](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split) at [596b9ab](https://huggingface.co/datasets/tomaarsen/miriad-4.4M-split/tree/596b9ab305d52cb73644ed5b5004957c7bfaae40)
* Size: 1,000 evaluation samples
* Columns: <code>question</code> and <code>passage_text</code>
* Approximate statistics based on the first 1000 samples:
| | question | passage_text |
|:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 8 tokens</li><li>mean: 23.55 tokens</li><li>max: 74 tokens</li></ul> | <ul><li>min: 512 tokens</li><li>mean: 512.0 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
| question | passage_text |
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>What are some hereditary cancer syndromes that can result in various forms of cancer?<br></code> | <code>Hereditary Cancer Syndromes, including Hereditary Breast and Ovarian Cancer (HBOC) and Lynch Syndrome (LS), can result in various forms of cancer due to germline mutations in cancer predisposition genes. While the major contributory genes for these syndromes have been identified and well-studied (BRCA1/ BRCA2 for HBOC and MSH2/MSH6/MLH1/PMS2/ EPCAM for LS), there remains a large percentage of associated cancer cases that are negative for germline mutations in these genes, including 80% of women with a personal or family history of breast cancer who are negative for BRCA1/2 mutations [1] . Similarly, between 30 and 50% of families fulfill stringent criteria for LS and test negative for germline mismatch repair gene mutations [2] . Adding complexity to these disorders is the significant overlap in the spectrum of cancers observed between various hereditary cancer syndromes, including many cancer susceptibility syndromes. Some that contribute to elevated breast cancer risk include Li-Frau...</code> |
| <code>How do MAK-4 and MAK-5 exert their antioxidant properties?<br></code> | <code>Hybrid F1 mice were injected with urethane (300 mg/kg) at 8 days of age. A group was then put on a MAK-supplemented diet, another group was fed a standard pellet diet. At 36 weeks of age the mice were sacrificed and the livers examined for the presence of tumors mouse (Panel A) and for the number of nodules per mouse (Panel B) (* p < 0.05, ** P < 0.001). Statistical analysis was performed by Two Way ANOVA Test followed by Post Hoc Bonferroni analysis. <br><br> We than measured the influence of the MAK-4+5 combination on the expression of the three liver-specific connexins (cx26, cx32, and cx43). The level of cx26 expression was similar in all the groups of mice treated with the MAK-supplemented diet and in the control (Figure 4, Panel A) . A significant, time-dependent increase in cx32 was observed in the liver of all the groups of MAK treated mice compared to the normal diet-fed controls. Cx32 expression increased 2-fold after 1 week of treatment, and 3-to 4-fold at 3 months (Figure 4, Pane...</code> |
| <code>What are the primary indications for a decompressive craniectomy, and what role does neurocritical care play in determining the suitability of a patient for this procedure?</code> | <code>Decompressive craniectomy is a valid neurosurgical strategy now a day as an alternative to control an elevated intracranial pressure (ICP) and controlling the risk of uncal and/or subfalcine herniation, in refractory cases to the postural, ventilator, and pharmacological measures to control it. The neurocritical care and the ICP monitorization are key determinants to identify and postulate the inclusion criteria to consider a patient as candidate to this procedure, as it is always considered a rescue surgical technique. Head trauma and ischemic or hemorrhagic cerebrovascular disease with progressive deterioration due to mass effect are some of the cases that may require a decompressive craniectomy with its different variants. However, this procedure per se can have complications described in the postcraniectomy syndrome and may occur in short, medium, or even long term.<br><br> 1,2 The paradoxical herniation is a condition in which there is a deviation of the midline with mass effect, even t...</code> |
* Loss: [<code>SpladeLoss</code>](https://sbert.net/docs/package_reference/sparse_encoder/losses.html#spladeloss) with these parameters:
```json
{
"loss": "SparseMultipleNegativesRankingLoss(scale=1.0, similarity_fct='dot_score')",
"lambda_corpus": 5e-06,
"lambda_query": 2e-05
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 4
- `per_device_eval_batch_size`: 4
- `learning_rate`: 2e-05
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `fp16`: True
- `batch_sampler`: no_duplicates
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 4
- `per_device_eval_batch_size`: 4
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
- `router_mapping`: {}
- `learning_rate_mapping`: {}
</details>
### Training Logs
| Epoch | Step | Training Loss | Validation Loss | miriad_eval_dot_ndcg@10 | miriad_test_dot_ndcg@10 |
|:-----:|:-----:|:-------------:|:---------------:|:-----------------------:|:-----------------------:|
| 0.032 | 800 | 311.9058 | - | - | - |
| 0.064 | 1600 | 10.9011 | - | - | - |
| 0.096 | 2400 | 2.3726 | - | - | - |
| 0.128 | 3200 | 0.4999 | - | - | - |
| 0.16 | 4000 | 0.1222 | 0.0420 | 0.9017 | - |
| 0.192 | 4800 | 0.0755 | - | - | - |
| 0.224 | 5600 | 0.0481 | - | - | - |
| 0.256 | 6400 | 0.0643 | - | - | - |
| 0.288 | 7200 | 0.0598 | - | - | - |
| 0.32 | 8000 | 0.0575 | 0.0210 | 0.9274 | - |
| 0.352 | 8800 | 0.0417 | - | - | - |
| 0.384 | 9600 | 0.0487 | - | - | - |
| 0.416 | 10400 | 0.0262 | - | - | - |
| 0.448 | 11200 | 0.0404 | - | - | - |
| 0.48 | 12000 | 0.0359 | 0.0163 | 0.9282 | - |
| 0.512 | 12800 | 0.0407 | - | - | - |
| 0.544 | 13600 | 0.0373 | - | - | - |
| 0.576 | 14400 | 0.0204 | - | - | - |
| 0.608 | 15200 | 0.0218 | - | - | - |
| 0.64 | 16000 | 0.0196 | 0.0045 | 0.9434 | - |
| 0.672 | 16800 | 0.0311 | - | - | - |
| 0.704 | 17600 | 0.0372 | - | - | - |
| 0.736 | 18400 | 0.029 | - | - | - |
| 0.768 | 19200 | 0.0319 | - | - | - |
| 0.8 | 20000 | 0.0352 | 0.0196 | 0.9392 | - |
| 0.832 | 20800 | 0.0257 | - | - | - |
| 0.864 | 21600 | 0.0339 | - | - | - |
| 0.896 | 22400 | 0.0211 | - | - | - |
| 0.928 | 23200 | 0.0197 | - | - | - |
| 0.96 | 24000 | 0.0228 | 0.0069 | 0.9514 | - |
| 0.992 | 24800 | 0.0161 | - | - | - |
| -1 | -1 | - | - | 0.9509 | 0.9388 |
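The `miriad_eval_dot_ndcg@10` / `miriad_test_dot_ndcg@10` columns report Normalized Discounted Cumulative Gain over the top-10 results of dot-product retrieval. A minimal sketch of the metric for a single query with binary relevance (the example ranking is illustrative, not taken from this run):

```python
import math

def dcg_at_k(relevances, k=10):
    # Discounted cumulative gain with a log2 position discount (1-based ranks).
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k=10):
    # Normalize by the DCG of the ideal (relevance-sorted) ranking.
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0

# Single query, binary relevance: the one relevant document lands at rank 2.
print(round(ndcg_at_k([0, 1, 0, 0, 0]), 4))  # 0.6309
```

NDCG@10 is 1.0 exactly when every relevant document is ranked ahead of every irrelevant one within the top 10.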
### Environmental Impact
Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
- **Energy Consumed**: 0.505 kWh
- **Carbon Emitted**: 0.196 kg of CO2
- **Hours Used**: 1.484 hours
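The three figures above are mutually consistent; a quick sanity check (the derived grid carbon intensity and average power draw are inferences from the reported numbers, not values stated on this card):

```python
# Sanity check on the reported CodeCarbon figures. The derived grid carbon
# intensity and average power draw are inferences from the numbers above,
# not values stated on this card.
energy_kwh = 0.505
emitted_kg = 0.196
hours = 1.484

intensity_kg_per_kwh = emitted_kg / energy_kwh   # implied grid carbon intensity
avg_power_watts = energy_kwh / hours * 1000      # implied average machine draw

print(f"~{intensity_kg_per_kwh:.3f} kg CO2/kWh, ~{avg_power_watts:.0f} W average draw")
```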
### Training Hardware
- **On Cloud**: No
- **GPU Model**: 1 x NVIDIA GeForce RTX 3090
- **CPU Model**: 13th Gen Intel(R) Core(TM) i7-13700K
- **RAM Size**: 31.78 GB
### Framework Versions
- Python: 3.11.6
- Sentence Transformers: 4.2.0.dev0
- Transformers: 4.52.4
- PyTorch: 2.6.0+cu124
- Accelerate: 1.5.1
- Datasets: 2.21.0
- Tokenizers: 0.21.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### SpladeLoss
```bibtex
@misc{formal2022distillationhardnegativesampling,
title={From Distillation to Hard Negative Sampling: Making Sparse Neural IR Models More Effective},
author={Thibault Formal and Carlos Lassance and Benjamin Piwowarski and Stéphane Clinchant},
year={2022},
eprint={2205.04733},
archivePrefix={arXiv},
primaryClass={cs.IR},
url={https://arxiv.org/abs/2205.04733},
}
```
#### SparseMultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
#### FlopsLoss
```bibtex
@article{paria2020minimizing,
title={Minimizing flops to learn efficient sparse representations},
author={Paria, Biswajit and Yeh, Chih-Kuan and Yen, Ian EH and Xu, Ning and Ravikumar, Pradeep and P{\'o}czos, Barnab{\'a}s},
journal={arXiv preprint arXiv:2004.05665},
year={2020}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
mrdayl/qwen3-mermaid-16bnb | mrdayl | 2025-06-17T10:14:12Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"qwen3",
"text-generation",
"text-generation-inference",
"unsloth",
"conversational",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-generation | 2025-06-17T08:56:26Z | ---
base_model: unsloth/qwen3-4b-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- qwen3
license: apache-2.0
language:
- en
---
# Uploaded finetuned model
- **Developed by:** mrdayl
- **License:** apache-2.0
- **Finetuned from model:** unsloth/qwen3-4b-unsloth-bnb-4bit
This qwen3 model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
FormlessAI/cdf43773-3ca9-407c-a0db-3c6c553761ec | FormlessAI | 2025-06-17T10:09:30Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"generated_from_trainer",
"trl",
"sft",
"conversational",
"base_model:aisingapore/Llama-SEA-LION-v2-8B-IT",
"base_model:finetune:aisingapore/Llama-SEA-LION-v2-8B-IT",
"autotrain_compatible",
"text-generation-inference",
"endpoints_co... | text-generation | 2025-06-17T09:54:11Z | ---
base_model: aisingapore/llama3-8b-cpt-sea-lionv2.1-instruct
library_name: transformers
model_name: cdf43773-3ca9-407c-a0db-3c6c553761ec
tags:
- generated_from_trainer
- trl
- sft
licence: license
---
# Model Card for cdf43773-3ca9-407c-a0db-3c6c553761ec
This model is a fine-tuned version of [aisingapore/llama3-8b-cpt-sea-lionv2.1-instruct](https://huggingface.co/aisingapore/llama3-8b-cpt-sea-lionv2.1-instruct).
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="FormlessAI/cdf43773-3ca9-407c-a0db-3c6c553761ec", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/phoenix-formless/Gradients/runs/jo240qxe)
This model was trained with SFT.
### Framework versions
- TRL: 0.18.1
- Transformers: 4.52.4
- Pytorch: 2.7.0+cu128
- Datasets: 3.6.0
- Tokenizers: 0.21.1
## Citations
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
gokulsrinivasagan/tinybert_base_train_book_ent_15p_s_init_book | gokulsrinivasagan | 2025-06-17T09:53:18Z | 9 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"fill-mask",
"generated_from_trainer",
"dataset:gokulsrinivasagan/processed_book_corpus-ld",
"base_model:google/bert_uncased_L-4_H-512_A-8",
"base_model:finetune:google/bert_uncased_L-4_H-512_A-8",
"license:apache-2.0",
"model-index",
"autotrain_compatible"... | fill-mask | 2025-06-13T18:21:37Z | ---
library_name: transformers
license: apache-2.0
base_model: google/bert_uncased_L-4_H-512_A-8
tags:
- generated_from_trainer
datasets:
- gokulsrinivasagan/processed_book_corpus-ld
metrics:
- accuracy
model-index:
- name: tinybert_base_train_book_ent_15p_s_init_book
results:
- task:
name: Masked Language Modeling
type: fill-mask
dataset:
name: gokulsrinivasagan/processed_book_corpus-ld
type: gokulsrinivasagan/processed_book_corpus-ld
metrics:
- name: Accuracy
type: accuracy
value: 0.5016182928962741
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# tinybert_base_train_book_ent_15p_s_init_book
This model is a fine-tuned version of [google/bert_uncased_L-4_H-512_A-8](https://huggingface.co/google/bert_uncased_L-4_H-512_A-8) on the gokulsrinivasagan/processed_book_corpus-ld dataset.
It achieves the following results on the evaluation set:
- Loss: 2.7883
- Accuracy: 0.5016
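If the reported loss is a mean per-token cross-entropy over the masked positions (as with the `Trainer`'s MLM objective), it maps to a pseudo-perplexity via `exp(loss)`:

```python
import math

eval_loss = 2.7883  # evaluation loss reported above

# For a mean per-token cross-entropy, perplexity is exp(loss); on a
# masked-LM evaluation this is a pseudo-perplexity over the masked tokens.
pseudo_perplexity = math.exp(eval_loss)
print(f"{pseudo_perplexity:.2f}")  # 16.25
```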
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 120
- eval_batch_size: 120
- seed: 10
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10000
- num_epochs: 24
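With `lr_scheduler_type: linear` and 10,000 warmup steps, the learning rate ramps linearly from 0 up to 0.0001 and then decays linearly back to 0. A minimal sketch of that schedule; `total_steps` is an illustrative value inferred from the training log (~455k steps over 24 epochs), not a documented setting:

```python
# Sketch of the schedule implied by the settings above: linear warmup for
# 10,000 steps, then linear decay to zero. `total_steps` is an illustrative
# value inferred from the training log, not a documented setting.
def linear_schedule_lr(step, base_lr=1e-4, warmup_steps=10_000, total_steps=455_000):
    if step < warmup_steps:
        return base_lr * (step / warmup_steps)
    remaining = max(0, total_steps - step)  # decays to 0 at total_steps
    return base_lr * (remaining / (total_steps - warmup_steps))

print(linear_schedule_lr(5_000))    # midway through warmup -> 5e-05
print(linear_schedule_lr(10_000))   # peak at end of warmup -> 0.0001
print(linear_schedule_lr(455_000))  # end of training -> 0.0
```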
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:------:|:---------------:|:--------:|
| 5.7148 | 0.5269 | 10000 | 5.3754 | 0.1660 |
| 5.4378 | 1.0539 | 20000 | 5.0823 | 0.1892 |
| 5.1562 | 1.5808 | 30000 | 4.7818 | 0.2149 |
| 4.8487 | 2.1077 | 40000 | 4.4633 | 0.2476 |
| 4.5657 | 2.6346 | 50000 | 4.1493 | 0.2907 |
| 4.2716 | 3.1616 | 60000 | 3.7873 | 0.3466 |
| 4.2871 | 3.6885 | 70000 | 3.8181 | 0.3361 |
| 4.1543 | 4.2154 | 80000 | 3.5822 | 0.3733 |
| 3.8571 | 4.7423 | 90000 | 3.4747 | 0.3897 |
| 3.7109 | 5.2693 | 100000 | 3.1587 | 0.4452 |
| 3.5309 | 5.7962 | 110000 | 3.1118 | 0.4521 |
| 3.5456 | 6.3231 | 120000 | 3.1531 | 0.4396 |
| 3.3806 | 6.8500 | 130000 | 2.8550 | 0.4952 |
| 3.4529 | 7.3770 | 140000 | 3.0184 | 0.4605 |
| 3.3215 | 7.9039 | 150000 | 2.7883 | 0.5016 |
| 3.513 | 8.4308 | 160000 | 3.0518 | 0.4508 |
| 3.3968 | 8.9577 | 170000 | 2.9743 | 0.4616 |
| 3.449 | 9.4847 | 180000 | 2.9690 | 0.4628 |
| 3.3697 | 10.0116 | 190000 | 2.8899 | 0.4777 |
| 3.357 | 10.5385 | 200000 | 2.9087 | 0.4713 |
| 3.387 | 11.0654 | 210000 | 2.8973 | 0.4734 |
| 3.4019 | 11.5924 | 220000 | 2.9180 | 0.4674 |
| 3.3729 | 12.1193 | 230000 | 2.9308 | 0.4650 |
| 3.4055 | 12.6462 | 240000 | 2.9422 | 0.4640 |
| 3.4147 | 13.1731 | 250000 | 3.0244 | 0.4468 |
| 3.395 | 13.7001 | 260000 | 2.9477 | 0.4606 |
| 3.4227 | 14.2270 | 270000 | 2.9277 | 0.4636 |
| 3.5185 | 14.7539 | 280000 | 3.0647 | 0.4362 |
| 3.4673 | 15.2809 | 290000 | 3.0344 | 0.4418 |
| 3.4164 | 15.8078 | 300000 | 3.0563 | 0.4379 |
| 3.3326 | 16.3347 | 310000 | 3.0179 | 0.4443 |
| 3.3937 | 16.8616 | 320000 | 3.0324 | 0.4397 |
| 3.4516 | 17.3886 | 330000 | 3.1178 | 0.4245 |
| 3.4207 | 17.9155 | 340000 | 3.0349 | 0.4407 |
| 3.3921 | 18.4424 | 350000 | 2.9866 | 0.4471 |
| 3.3771 | 18.9693 | 360000 | 2.9835 | 0.4488 |
| 3.3844 | 19.4963 | 370000 | 2.9886 | 0.4477 |
| 3.3288 | 20.0232 | 380000 | 2.9555 | 0.4523 |
| 3.3691 | 20.5501 | 390000 | 2.9938 | 0.4449 |
| 3.3104 | 21.0770 | 400000 | 2.9500 | 0.4528 |
| 3.3398 | 21.6040 | 410000 | 2.9999 | 0.4437 |
| 3.3325 | 22.1309 | 420000 | 2.9703 | 0.4481 |
| 3.3466 | 22.6578 | 430000 | 2.9785 | 0.4474 |
| 3.3444 | 23.1847 | 440000 | 2.9894 | 0.4450 |
| 3.3103 | 23.7117 | 450000 | 2.9477 | 0.4523 |
### Framework versions
- Transformers 4.51.2
- Pytorch 2.6.0+cu126
- Datasets 3.5.0
- Tokenizers 0.21.1
|
ganesan-erss/sqlcoder-7b-finetuned-v2 | ganesan-erss | 2025-06-17T09:45:02Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"arxiv:1910.09700",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T09:44:47Z | ---
library_name: transformers
tags: []
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
- **Developed by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Dataset Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed] |
brainmao/csfmod | brainmao | 2025-06-17T09:34:03Z | 0 | 0 | null | [
"gguf",
"llama",
"unsloth",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T08:51:37Z | ---
license: mit
tags:
- unsloth
---
|
BootesVoid/cmbxbw7zm00jurdqs9iqa9vjc_cmc0am4wa07fdrdqsoyeqy0u9 | BootesVoid | 2025-06-17T09:31:24Z | 0 | 0 | diffusers | [
"diffusers",
"flux",
"lora",
"replicate",
"text-to-image",
"en",
"base_model:black-forest-labs/FLUX.1-dev",
"base_model:adapter:black-forest-labs/FLUX.1-dev",
"license:other",
"region:us"
] | text-to-image | 2025-06-17T09:31:22Z | ---
license: other
license_name: flux-1-dev-non-commercial-license
license_link: https://huggingface.co/black-forest-labs/FLUX.1-dev/blob/main/LICENSE.md
language:
- en
tags:
- flux
- diffusers
- lora
- replicate
base_model: "black-forest-labs/FLUX.1-dev"
pipeline_tag: text-to-image
# widget:
# - text: >-
# prompt
# output:
# url: https://...
instance_prompt: TRISHA
---
# Cmbxbw7Zm00Jurdqs9Iqa9Vjc_Cmc0Am4Wa07Fdrdqsoyeqy0U9
<Gallery />
## About this LoRA
This is a [LoRA](https://replicate.com/docs/guides/working-with-loras) for the FLUX.1-dev text-to-image model. It can be used with diffusers or ComfyUI.
It was trained on [Replicate](https://replicate.com/) using AI toolkit: https://replicate.com/ostris/flux-dev-lora-trainer/train
## Trigger words
You should use `TRISHA` to trigger the image generation.
## Run this LoRA with an API using Replicate
```py
import replicate
input = {
"prompt": "TRISHA",
"lora_weights": "https://huggingface.co/BootesVoid/cmbxbw7zm00jurdqs9iqa9vjc_cmc0am4wa07fdrdqsoyeqy0u9/resolve/main/lora.safetensors"
}
output = replicate.run(
"black-forest-labs/flux-dev-lora",
input=input
)
for index, item in enumerate(output):
with open(f"output_{index}.webp", "wb") as file:
file.write(item.read())
```
## Use it with the [๐งจ diffusers library](https://github.com/huggingface/diffusers)
```py
from diffusers import AutoPipelineForText2Image
import torch
pipeline = AutoPipelineForText2Image.from_pretrained('black-forest-labs/FLUX.1-dev', torch_dtype=torch.float16).to('cuda')
pipeline.load_lora_weights('BootesVoid/cmbxbw7zm00jurdqs9iqa9vjc_cmc0am4wa07fdrdqsoyeqy0u9', weight_name='lora.safetensors')
image = pipeline('TRISHA').images[0]
```
For more details, including weighting, merging and fusing LoRAs, check the [documentation on loading LoRAs in diffusers](https://huggingface.co/docs/diffusers/main/en/using-diffusers/loading_adapters)
## Training details
- Steps: 2000
- Learning rate: 0.0004
- LoRA rank: 16
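Rank 16 keeps the adapter small relative to the base model: for an adapted weight matrix `W` of shape `(d_out, d_in)`, LoRA trains two low-rank factors `A` (`rank × d_in`) and `B` (`d_out × rank`) while `W` stays frozen. A rough count, using an illustrative attention-projection size rather than a dimension stated on this card:

```python
# Trainable parameters LoRA adds to one frozen weight matrix. The 3072x3072
# shape is an illustrative attention-projection size, not a dimension
# documented on this card.
def lora_params(d_out, d_in, rank):
    # A: (rank, d_in), B: (d_out, rank); only A and B are trained.
    return rank * d_in + d_out * rank

base = 3072 * 3072
added = lora_params(3072, 3072, rank=16)
print(added, f"({added / base:.2%} of the base matrix)")  # 98304 (1.04% ...)
```

At rank 16 the adapter adds roughly 1% of the parameters of each adapted matrix, which is why LoRA checkpoints stay small.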
## Contribute your own examples
You can use the [community tab](https://huggingface.co/BootesVoid/cmbxbw7zm00jurdqs9iqa9vjc_cmc0am4wa07fdrdqsoyeqy0u9/discussions) to add images that show off what you've made with this LoRA.
|
vietnhat/orpheus-test | vietnhat | 2025-06-17T09:25:44Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"llama",
"text-generation",
"text-generation-inference",
"unsloth",
"conversational",
"en",
"base_model:unsloth/orpheus-3b-0.1-ft",
"base_model:finetune:unsloth/orpheus-3b-0.1-ft",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:... | text-generation | 2025-06-17T09:23:56Z | ---
base_model: unsloth/orpheus-3b-0.1-ft
tags:
- text-generation-inference
- transformers
- unsloth
- llama
license: apache-2.0
language:
- en
---
# Uploaded finetuned model
- **Developed by:** vietnhat
- **License:** apache-2.0
- **Finetuned from model:** unsloth/orpheus-3b-0.1-ft
This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
JayHyeon/Qwen_1.5B-math-VIPO_5e-6_10.0vpo_constant-5ep | JayHyeon | 2025-06-17T09:20:56Z | 0 | 0 | transformers | [
"transformers",
"tensorboard",
"safetensors",
"qwen2",
"text-generation",
"generated_from_trainer",
"trl",
"dpo",
"conversational",
"dataset:argilla/distilabel-math-preference-dpo",
"arxiv:2305.18290",
"base_model:Qwen/Qwen2.5-Math-1.5B",
"base_model:finetune:Qwen/Qwen2.5-Math-1.5B",
"auto... | text-generation | 2025-06-17T08:56:16Z | ---
base_model: Qwen/Qwen2.5-Math-1.5B
datasets: argilla/distilabel-math-preference-dpo
library_name: transformers
model_name: Qwen_1.5B-math-VIPO_5e-6_10.0vpo_constant-5ep
tags:
- generated_from_trainer
- trl
- dpo
licence: license
---
# Model Card for Qwen_1.5B-math-VIPO_5e-6_10.0vpo_constant-5ep
This model is a fine-tuned version of [Qwen/Qwen2.5-Math-1.5B](https://huggingface.co/Qwen/Qwen2.5-Math-1.5B) on the [argilla/distilabel-math-preference-dpo](https://huggingface.co/datasets/argilla/distilabel-math-preference-dpo) dataset.
It has been trained using [TRL](https://github.com/huggingface/trl).
## Quick start
```python
from transformers import pipeline
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="JayHyeon/Qwen_1.5B-math-VIPO_5e-6_10.0vpo_constant-5ep", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
## Training procedure
[<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="150" height="24"/>](https://wandb.ai/bonin147/huggingface/runs/recw375c)
This model was trained with DPO, a method introduced in [Direct Preference Optimization: Your Language Model is Secretly a Reward Model](https://huggingface.co/papers/2305.18290).
### Framework versions
- TRL: 0.15.2
- Transformers: 4.50.0
- Pytorch: 2.6.0
- Datasets: 3.4.1
- Tokenizers: 0.21.1
## Citations
Cite DPO as:
```bibtex
@inproceedings{rafailov2023direct,
title = {{Direct Preference Optimization: Your Language Model is Secretly a Reward Model}},
author = {Rafael Rafailov and Archit Sharma and Eric Mitchell and Christopher D. Manning and Stefano Ermon and Chelsea Finn},
year = 2023,
booktitle = {Advances in Neural Information Processing Systems 36: Annual Conference on Neural Information Processing Systems 2023, NeurIPS 2023, New Orleans, LA, USA, December 10 - 16, 2023},
url = {http://papers.nips.cc/paper_files/paper/2023/hash/a85b405ed65c6477a4fe8302b5e06ce7-Abstract-Conference.html},
editor = {Alice Oh and Tristan Naumann and Amir Globerson and Kate Saenko and Moritz Hardt and Sergey Levine},
}
```
Cite TRL as:
```bibtex
@misc{vonwerra2022trl,
title = {{TRL: Transformer Reinforcement Learning}},
author = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
year = 2020,
journal = {GitHub repository},
publisher = {GitHub},
howpublished = {\url{https://github.com/huggingface/trl}}
}
``` |
diegolacomba/multilingual-e5-small-legal-mnrl-0 | diegolacomba | 2025-06-17T09:05:06Z | 0 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"bert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:58898",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:intfloat/multilingual-e5-small",
"base_model:finetune:intfloat/m... | sentence-similarity | 2025-06-17T09:04:34Z | ---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:58898
- loss:MultipleNegativesRankingLoss
base_model: intfloat/multilingual-e5-small
widget:
- source_sentence: 'query: ¿Cómo se deben determinar las cuotas a cuenta del IRPF
en un año con actividad económica suspendida?'
sentences:
- 'passage A los efectos de este Impuesto, se considerará promotor de edificaciones
el propietario de inmuebles que construyó (promotor-constructor) o contrató la
construcción (promotor) de los mismos para destinarlos a la venta, el alquiler
o el uso propio.
c) Dichas ejecuciones de obra tengan por objeto la construcción o rehabilitación
de edificios destinados fundamentalmente a viviendas, incluidos los locales, anejos,
instalaciones y servicios complementarios en ella situados.
d) Las referidas ejecuciones de obra consistan materialmente en la construcción
o rehabilitación de los citados edificios.
3.- En consecuencia, las ejecuciones de obra concertadas directamente entre el
promotor y el contratista (la consultante), que tengan por objeto la rehabilitación
de una vivienda, tributan al tipo reducido del 10 por ciento. El tipo reducido
se aplica con independencia de que el promotor concierte la totalidad de la obra
de construcción con un solo empresario o concierte la realización con varios empresarios
realizando cada uno de ellos una parte de la obra según su especialidad.
No obstante, las ejecuciones de obra realizadas por subcontratistas para otros
contratistas (la consultante), que a su vez contraten con el promotor, tributarán
por el Impuesto sobre el Valor Añadido al tipo general del 21 por ciento.
4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.'
- 'passage Descripción de hechos: La consultante es titular de una actividad económica
de "otros cafés y bares". El rendimiento neto de la actividad se determina por
el método de estimación objetiva y tributa en el IVA por el régimen especial simplificado.
Desde la declaración de alarma en marzo de 2020 ha tenido cerrada la actividad
y la va a seguir teniendo cerrada durante todo el año 2020, pues las restricciones
que tiene que aplicar no la hacen rentable.
Cuestión planteada: Forma de calcular, en 2020, el pago fraccionado a cuenta del
IRPF y el ingreso a cuenta trimestral del IVA.'
- 'passage No obstante, el artículo 22.Trece de la Ley 37/1992, declara la exención
de:
“Los transportes de viajeros y sus equipajes por vía marítima o aérea procedentes
de o con destino a un puerto o aeropuerto situado fuera del ámbito espacial del
Impuesto.
Se entenderán incluidos en este apartado los transportes por vía aérea amparados
por un único título de transporte que incluya vuelos de conexión aérea.”.
En consecuencia, los servicios de transporte consultados, que tienen su origen
o destino en un aeropuerto fuera del territorio de aplicación del impuesto sobre
el valor añadido, estarán sujetos pero exentos del Impuesto sobre el Valor Añadido.
2.- Por otra parte, el artículo 164, apartado uno, de la Ley del Impuesto sobre
el Valor Añadido, en el que se regulan las obligaciones de los sujetos pasivos,
establece lo siguiente:
“Uno. Sin perjuicio de lo establecido en el Título anterior, los sujetos pasivos
del Impuesto estarán obligados, con los requisitos, límites y condiciones que
se determinen reglamentariamente, a:
(…)
3º. Expedir y entregar factura de todas sus operaciones, ajustada a lo que se
determine reglamentariamente.”.
El desarrollo reglamentario de dicho precepto se ha llevado a cabo por el Reglamento
por el que se regulan las obligaciones de facturación, aprobado por el artículo
1 del Real Decreto 1619/2012, de 30 de noviembre (BOE de 1 de diciembre).
El artículo 2 del mencionado Reglamento dispone que:'
- source_sentence: 'query: ¿Cuál es el porcentaje de impuesto que corresponde a dispositivos
destinados a aliviar discapacidades bajo la ley actual?'
sentences:
- 'passage Contestación completa: 1.- El artículo 90, apartado uno, de la Ley 37/1992,
de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre),
dispone que el Impuesto se exigirá al tipo del 21 por ciento, salvo lo dispuesto
en el artículo siguiente.
2.- El artículo 91, apartado Uno.1, número 6º, letra c) de la Ley 37/1992 dispone
lo siguiente:
“Uno. Se aplicará el tipo del 10 por ciento a las operaciones siguientes:
1. Las entregas, adquisiciones intracomunitarias o importaciones de los bienes
que se indican a continuación:
(…)
6.º Los siguientes bienes:
(…)
c) Los equipos médicos, aparatos y demás instrumental, relacionados en el apartado
octavo del anexo de esta Ley, que, por sus características objetivas, estén diseñados
para aliviar o tratar deficiencias, para uso personal y exclusivo de personas
que tengan deficiencias físicas, mentales, intelectuales o sensoriales, sin perjuicio
de lo previsto en el apartado dos.1 de este artículo.
No se incluyen en esta letra otros accesorios, recambios y piezas de repuesto
de dichos bienes.”.
El apartado octavo del Anexo de la Ley 37/1992, establece lo siguiente:
“Octavo. Relación de bienes a que se refiere el artículo 91.Uno.1. 6.ºc) de esta
Ley.
(…)
– Sillas terapéuticas y de ruedas, así como los cojines antiescaras y arneses
para el uso de las mismas, muletas, andadores y grúas para movilizar personas
con discapacidad.
(…).”.
3.- Por su parte, el artículo 91, apartado dos.1, número 4º de la Ley 37/1992,
dispone que:
“Dos. Se aplicará el tipo del 4 por ciento a las operaciones siguientes:
1. Las entregas, adquisiciones intracomunitarias o importaciones de los bienes
que se indican a continuación:
(…)'
- 'passage (…).”.
De acuerdo con lo dispuesto anteriormente, en los supuestos de adjudicación de
bienes en virtud de subasta judicial o administrativa, como es el caso que nos
ocupa, el adjudicatario puede efectuar, en su caso, la renuncia a las exenciones
previstas en el apartado dos del artículo 20 de la Ley 37/1992, así como expedir
factura, presentar, en nombre y por cuenta del sujeto pasivo, la declaración-liquidación
correspondiente e ingresar el importe del Impuesto sobre el Valor Añadido resultante.
El ejercicio de dicha facultad por parte del adjudicatario determina la obligación
de presentar la autoliquidación del Impuesto conforme al modelo aprobado por la
Orden HAC/3625/2003, de 23 de diciembre (modelo 309).
Uno de los requisitos necesarios para el ejercicio de dicha facultad es que el
destinatario-adjudicatario del bien inmueble tenga la consideración de empresario
o profesional en los términos previstos en esta contestación. La no consideración
como empresario o profesional impide el ejercicio de dicha facultad.
Por último, señalar que de resultar aplicable la regla de inversión del sujeto
pasivo prevista en el artículo 84.Uno.2º de la Ley 37/1992, anteriormente desarrollado,
el adjudicatario resultará ser el sujeto pasivo de la operación por lo que viene
obligado a presentar la autoliquidación ordinaria del Impuesto en nombre propio,
sin actuar en nombre y por cuenta del subastado. Asimismo, de optar por dicha
facultad en los términos establecidos reglamentariamente, el consultante podrá
emitir, en nombre y por cuenta del transmitente, la correspondiente factura en
la que se documente la operación.
No obstante, tal y como se ha señalado en apartados anteriores de esta contestación,
el consultante adjudicatario de la subasta judicial no procedió a la renuncia
a la exención del artículo 20.Uno.22º de la Ley del Impuesto en el plazo establecido,
habiéndose encontrado facultado para ello según lo dispuesto en la Disposición
Adicional Sexta de la Ley 37/1992.'
- 'passage c) Las que tengan por objeto la cesión del derecho a utilizar infraestructuras
ferroviarias.
d) Las autorizaciones para la prestación de servicios al público y para el desarrollo
de actividades comerciales o industriales en el ámbito portuario.”
3.- La consulta plantea una cuestión sobre un contrato por el que un Ayuntamiento
cede a un contratista la explotación de un bar (instalación fija de obra) en una
ciudad.
Dicho contrato tiene la naturaleza de contrato administrativo especial, sin que
el mismo pueda calificarse como contrato de gestión de servicio público ni tampoco
como concesión administrativa de dominio público.
Cabe plantearse si podría resultar aplicable a la referida prestación de servicios
efectuada por el ayuntamiento en favor de la consultante el supuesto de no sujeción
al Impuesto sobre el Valor Añadido previsto para el otorgamiento de concesiones
y autorizaciones administrativas en el número 9º del artículo 7 de la citada Ley
37/1992.
La respuesta a esta cuestión es negativa, pues, como ha señalado la Asesoría Jurídica
de la Secretaría de Estado de Hacienda en el informe emitido el 30 de julio de
1997 a solicitud de esta Dirección General, los contratos que tienen por objeto
la explotación de cafeterías y comedores en centros públicos son contratos administrativos
especiales, sin que los mismos puedan calificarse como contratos de gestión de
servicios públicos ni tampoco como concesiones administrativas de dominio público.
En este sentido se ha pronunciado la Junta Consultiva de Contratación Administrativa
en diversos informes emitidos al respecto; así, en el informe 57/07 de 6 de febrero
de 2008, y, con anterioridad, en los informes 5/96 de 7 de marzo y 67/99, de 6
de julio de 2000.
En consecuencia con todo lo anterior, está sujeto al Impuesto sobre el Valor Añadido
y no exento del mismo el contrato suscrito entre el ayuntamiento y la consultante
consistente en explotar un bar-quiosco, a cambio del pago de una contraprestación.
4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.'
- source_sentence: 'query: ¿En qué casos las transacciones documentadas en escrituras
públicas pueden estar sujetas a una tasa tributaria específica según la normativa
vigente?'
sentences:
- 'passage 3.- Por otra parte en relación con la inclusión del suero de irrigación
en el apartado destinado a “Bolsas de recogida de orina, absorbentes de incontinencia
y otros sistemas para incontinencia urinaria y fecal, incluidos los sistemas de
irrigación”, este Centro directivo en la consulta de fecha 23 de marzo de 2015,
número V0872-15 y en relación con los sistemas de irrigación ha dispuesto que,
“Tributarán por el Impuesto sobre el Valor Añadido, al tipo general del 21 por
ciento, los siguientes productos objeto de consulta: -Los empapadores, las duchas
vaginales, irrigadores, accesorios y sistemas de irrigación no destinados específicamente
a situaciones de incontinencia urinaria o fecal, ni las cánulas rectales y vaginales
no destinadas específicamente a situaciones de incontinencia urinaria o fecal
o no incorporadas en equipos destinados a estas situaciones.”
4.- En consecuencia con lo anterior este centro directivo le informa que tributan
al tipo general del 21 por ciento las entregas, adquisiciones intracomunitarias
e importaciones de suero de irrigación (agua destilada o suero fisiológico) objeto
de consulta siendo irrelevante que su destino sea para la limpieza aséptica de
la piel, lavado de heridas o quemaduras formando parte integrante de los sistemas
de irrigación.
5.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.
No obstante, de acuerdo con el artículo 68.2 del Reglamento General de las actuaciones
y los procedimientos de gestión e inspección tributaria y de desarrollo de las
normas comunes de los procedimientos de aplicación de los tributos, aprobado por
el Real Decreto 1065/2007, de 27 de julio, la presente contestación no tendrá
efectos vinculantes para aquellos miembros o asociados de la consultante que en
el momento de formular la consulta estuviesen siendo objeto de un procedimiento,
recurso o reclamación económico-administrativa iniciado con anterioridad y relacionado
con las cuestiones planteadas en la consulta conforme a lo dispuesto en su artículo
89.2.'
- 'passage Contestación completa: 1.- Las reglas de localización de las prestaciones
de servicios se encuentran reguladas en los artículos 69, 70 y 72 de la Ley 37/1992,
de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre).
En el artículo 69 de dicho texto normativo se contienen las reglas generales
de localización en donde se establece que:
“Uno. Las prestaciones de servicios se entenderán realizadas en el territorio
de aplicación del Impuesto, sin perjuicio de lo dispuesto en el apartado siguiente
de este artículo y en los artículos 70 y 72 de esta Ley, en los siguientes casos:
1.º Cuando el destinatario sea un empresario o profesional que actúe como tal
y radique en el citado territorio la sede de su actividad económica, o tenga en
el mismo un establecimiento permanente o, en su defecto, el lugar de su domicilio
o residencia habitual, siempre que se trate de servicios que tengan por destinatarios
a dicha sede, establecimiento permanente, domicilio o residencia habitual, con
independencia de dónde se encuentre establecido el prestador de los servicios
y del lugar desde el que los preste.
2.º Cuando el destinatario no sea un empresario o profesional actuando como tal,
siempre que los servicios se presten por un empresario o profesional y la sede
de su actividad económica o establecimiento permanente desde el que los preste
o, en su defecto, el lugar de su domicilio o residencia habitual, se encuentre
en el territorio de aplicación del Impuesto.
(…).”.
No obstante, estas reglas serán de aplicación únicamente en el caso en que no
proceda aplicar ninguna de las reglas especiales que se regulan en el artículo
70 de la Ley del impuesto. En concreto, respecto de los servicios de restauración
y catering, se establece en el número 5º del apartado Uno de dicho precepto que:
“Uno. Se entenderán prestados en el territorio de aplicación del Impuesto los
siguientes servicios:
(…)
5.º A) Los de restauración y catering en los siguientes supuestos:
(…)
b) Los restantes servicios de restauración y catering cuando se presten materialmente
en el territorio de aplicación del Impuesto.
(…).”.'
- 'passage Artículo 31
“2. Las primeras copias de escrituras y actas notariales, cuando tengan por objeto
cantidad o cosa valuable, contengan actos o contratos inscribibles en los Registros
de la Propiedad, Mercantil y de la Propiedad Industrial y de Bienes Muebles no
sujetos al Impuesto sobre Sucesiones y Donaciones o a los conceptos comprendidos
en los números 1 y 2 del artículo 1.º de esta Ley, tributarán, además, al tipo
de gravamen que, conforme a lo previsto en la Ley 21/2001, de 27 de diciembre,
por la que se regulan las medidas fiscales y administrativas del nuevo sistema
de financiación de las Comunidades Autónomas de régimen común y Ciudades con Estatuto
de Autonomía, haya sido aprobado por la Comunidad Autónoma.
Si la Comunidad Autónoma no hubiese aprobado el tipo a que se refiere el párrafo
anterior, se aplicará el 0,50 por 100, en cuanto a tales actos o contratos.”
De la aplicación de los preceptos anteriormente transcritos resulta lo siguiente:
- Por regla general las operaciones realizadas por un sujeto pasivo del IVA son
operaciones no sujetas a la modalidad de transmisiones patrimoniales onerosas
del ITP y AJD según lo dispuesto en el artículo 7.5 del Texto Refundido del
citado impuesto. En tal caso, si la referida operación se documentase en escritura
pública, la no sujeción de la transmisión por la modalidad de transmisiones patrimoniales
onerosas permitiría la aplicación de la cuota variable del Documento Notarial de
la modalidad Actos Jurídicos Documentados, dada la concurrencia de todos los requisitos
exigidos en el artículo 31.2 del Texto Refundido del Impuesto:
Tratarse de una primera copia de una escritura o acta notarial
Tener por objeto cantidad o cosa valuable
Contener un acto o contrato inscribible en los Registros de la Propiedad, Mercantil
y de la Propiedad Industrial y de Bienes Muebles
No estar sujetos los referidos actos al Impuesto sobre Sucesiones y Donaciones
o a los conceptos comprendidos en los apartados 1 y 2 del artículo 1 de esta Ley,
transmisiones patrimoniales onerosas y operaciones societarias'
- source_sentence: 'query: ¿Se aplican impuestos a la enseñanza de idiomas para particulares
y empresas en modalidad presencial y virtual?'
sentences:
- 'passage 4.- Por otro lado, el artículo 91, apartado dos.2, número 1º, de la Ley
del Impuesto sobre el Valor Añadido, dispone la aplicación del tipo impositivo
del 4 por ciento a la prestación de los siguientes servicios:
“1.º Los servicios de reparación de los vehículos y de las sillas de ruedas comprendidos
en el párrafo primero del número 4.º del apartado dos.1 de este artículo y los
servicios de adaptación de los autotaxis y autoturismos para personas con discapacidad
y de los vehículos a motor a los que se refiere el párrafo segundo del mismo precepto
independientemente de quién sea el conductor de los mismos.”.
Los servicios de reparación recogidos en la Ley 37/1992 son únicamente los referidos
a vehículos para personas con movilidad reducida y a sillas de ruedas para uso
exclusivo de personas con discapacidad, que son los bienes incluidos en el párrafo
primero del artículo 91, apartado dos.1, número 4º de dicha Ley.
En consecuencia con lo anterior, las reparaciones de sillas de ruedas, que no
estén incluidas en el párrafo anterior, tributarán al tipo del 21 por ciento dado
que no está contemplado en el artículo 91 de la Ley 37/1992 un tipo reducido para
estos servicios de reparación.
5.- En relación con el tipo impositivo aplicable a los accesorios y recambios
de sillas de ruedas, la actual redacción del artículo 91.Uno.1.6º, letra c) dice
expresamente que: “No se incluyen en esta letra otros accesorios, recambios y
piezas de repuesto de dichos bienes.”.'
- 'passage Descripción de hechos: La consultante es una persona física que va a
impartir clases de idiomas, en concreto alemán, tanto a personas físicas como
a empresas. Las clases se realizarán tanto de manera presencial como a través
de medios electrónicos.
Cuestión planteada: Si las clases se encuentran exentas del Impuesto sobre el
Valor Añadido.'
- 'passage Contestación completa: 1.- El artículo 134 bis, apartado dos de la Ley
37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29),
establece que:
“Dos. Cuando el régimen de tributación aplicable a una determinada actividad agrícola,
ganadera, forestal o pesquera cambie del régimen especial de la agricultura, ganadería
y pesca al general del Impuesto, el empresario o profesional titular de la actividad
tendrá derecho a:
1º. Efectuar la deducción de la cuota resultante de aplicar al valor de los bienes
afectos a la actividad, Impuesto sobre el Valor Añadido excluido, en la fecha
en que deje de aplicarse el régimen especial, los tipos de dicho Impuesto que
estuviesen vigentes en la citada fecha. A estos efectos, no se tendrán en cuenta
los siguientes:
a) Bienes de inversión, definidos conforme a lo dispuesto en el artículo 108 de
esta Ley.
b) Bienes y servicios que hayan sido utilizados o consumidos total o parcialmente
en la actividad.
2º. Deducir la compensación a tanto alzado que prevé el artículo 130 de esta Ley
por los productos naturales obtenidos en las explotaciones que no se hayan entregado
a la fecha del cambio del régimen de tributación.
A efectos del ejercicio de los derechos recogidos en este apartado, el empresario
o profesional deberá confeccionar y presentar un inventario a la fecha en que
deje de aplicarse el régimen especial. Tanto la presentación de este inventario
como el ejercicio de estos derechos se ajustarán a los requisitos y condiciones
que se establezcan reglamentariamente.”.
Por su parte, el artículo 49 bis del Reglamento del Impuesto aprobado por el artículo
1 del Real Decreto 1624/1992, de 29 de diciembre (BOE del 31), declara que:'
- source_sentence: 'query: ¿De qué forma la ubicación de la agencia influye en la
aplicación del impuesto en los servicios turísticos?'
sentences:
- 'passage Contestación completa: 1.- El artículo 9, primer párrafo de la Ley 8/1991,
de 25 de marzo, por la que se crea el Impuesto sobre la Producción, los Servicios
y la Importación en las Ciudades de Ceuta y Melilla (BOE del 26), dispone lo siguiente:
“Las importaciones definitivas de bienes en las ciudades de Ceuta y Melilla estarán
exentas en los mismos términos que en la legislación común del Impuesto sobre
el Valor Añadido y, en todo caso, se asimilarán, a efectos de esta exención, las
que resulten de aplicación a las operaciones interiores.”.
2.- Por otra parte, el artículo 20, apartado uno, número 17º de la Ley 37/1992,
de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre),
dispone que estarán exentas de dicho Impuesto:
“17º. Las entregas de sellos de Correos y efectos timbrados de curso legal en
España por importe no superior a su valor facial.
La exención no se extiende a los servicios de expendición de los referidos bienes
prestados en nombre y por cuenta de terceros.”.
Conforme al precepto anterior, la entrega de sellos de correos de curso legal
por importe no superior a su valor facial, objeto de consulta, estará exenta del
Impuesto sobre el Valor Añadido.
3.- En consecuencia, estarán sujetas pero exentas del Impuesto sobre la Producción,
los Servicios y la Importación en las Ciudades de Ceuta y Melilla las importaciones
definitivas de sellos de correos de curso legal en las ciudades de Ceuta y Melilla
cuando, de acuerdo con lo establecido en el apartado anterior de esta contestación,
su entrega esté exenta del Impuesto sobre el Valor Añadido.
4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en
el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.'
- 'passage 2º. Sin perjuicio de lo dispuesto en el punto 1º anterior, se aplicará,
en todo caso, el tipo general del 21 por ciento, entre otros, a los siguientes
bienes y servicios:
1. Servicios prestados por vía electrónica, esto es, aquellos servicios que consistan
en la transmisión enviada inicialmente y recibida en destino por medio de equipos
de procesamiento, incluida la compresión numérica y el almacenamiento de datos,
y enteramente transmitida, transportada y recibida por cable, sistema óptico u
otros medios electrónicos y, entre otros, los siguientes:
a) El suministro y alojamiento de sitios informáticos.
b) El mantenimiento a distancia de programas y de equipos.
c) El suministro de programas y su actualización.
d) El suministro de imágenes, texto, información y la puesta a disposición de
bases de datos.
e) El suministro de música, películas, juegos, incluidos los de azar o de dinero,
y de emisiones y manifestaciones políticas, culturales, artísticas, deportivas,
científicas o de ocio.
f) El suministro de enseñanza a distancia.
2. Dispositivos portátiles que permitan almacenar y leer libros digitalizados,
así como reproductores de libros electrónicos y otros elementos de hardware, es
decir, componentes que integren la parte material de un ordenador o que se puedan
conectar al mismo.
3. Servicios consistentes en el acceso electrónico a bases de datos, periódicos,
revistas y semejantes y, en general, a páginas web.
4. Comercialización de códigos de descarga de archivos que incorporen libros electrónicos.
5. Servicios de acceso a libros de texto en formato digital alojados en servidores
de Entes públicos o de colegios.
6. Servicios de consultas y accesos a bases de datos.
7. Servicios de digitalización de obras literarias.'
- 'passage De acuerdo con los antecedentes recogidos en esta contestación, dicho
servicio estará sujeto al régimen especial de las agencias de viajes regulado
en el Capítulo VI del Título IX de la Ley 37/1992 y tendrá la consideración de
prestación de servicios única que estará sujeta al Impuesto sobre el Valor Añadido
bajo la premisa de que la consultante tiene establecida la sede de su actividad
económica o posea un establecimiento permanente desde donde efectúa la operación
en el territorio de aplicación del Impuesto.
El tipo impositivo aplicable al servicio único de viajes será el general del 21
por ciento previsto en el artículo 90.Uno de la Ley del Impuesto.
Sobre la posible aplicación de la opción del artículo 147 de la Ley 37/1992 para
la aplicación del régimen general del Impuesto, según se establece en contestación
a consulta vinculante de 20 de septiembre de 2016, número V3942-16:
“4.- Debe tenerse en cuenta que en el caso de las empresas radicadas en Estados
Unidos a que se refiere el escrito de consulta, no se entiende cumplido el requisito
de reciprocidad, tal como se pronunció este Centro Directivo en contestación a
consulta vinculante número V0579-12 de 16 de marzo de 2012, por lo que, salvo
que el servicio prestado por la agencia de viajes consultante esté relacionado
con la asistencia a ferias, congresos y exposiciones de carácter comercial o profesional,
en los términos del artículo 119 bis de la Ley 37/1992 parcialmente transcrito,
no se entenderán cumplidos los requisitos para la opción por el régimen general
del Impuesto sobre el Valor Añadido.”.
b) El mismo caso anterior, pero el viaje se pretende desarrollar en las Islas
Canarias.
Según establece el artículo 144 de la Ley del Impuesto, dicha operación se encontrará
sujeta al Impuesto y, en particular, al régimen especial de las agencias de viajes:
“Dicha prestación se entenderá realizada en el lugar donde la agencia tenga establecida
la sede de su actividad económica o posea un establecimiento permanente desde
donde efectúe la operación.”.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on intfloat/multilingual-e5-small
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: InformationRetrievalEvaluator
type: InformationRetrievalEvaluator
metrics:
- type: cosine_accuracy@1
value: 0.19120762711864406
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.3175317796610169
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.386034604519774
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.4834922316384181
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.19120762711864406
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.10584392655367232
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.0772069209039548
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.04834922316384181
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.19120762711864406
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.3175317796610169
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.386034604519774
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.4834922316384181
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.32410065040970315
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.2747356599183937
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.28756647798527146
name: Cosine Map@100
- type: cosine_accuracy@1
value: 0.3431194112526707
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.5083485004352298
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.5847906939938277
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.681332594761415
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.3431194112526707
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.1694495001450766
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.11695813879876553
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06813325947614149
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.3431194112526707
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.5083485004352298
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5847906939938277
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.681332594761415
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.5018271132477355
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.44561322194462705
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.4551504365024611
name: Cosine Map@100
---
# SentenceTransformer based on intfloat/multilingual-e5-small
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) <!-- at revision c007d7ef6fd86656326059b28395a7a03a7c5846 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
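As a rough, self-contained sketch of what the `Pooling` (mean pooling) and `Normalize` modules above compute, here is plain PyTorch applied to a made-up hidden-state tensor (the values and sequence lengths are invented for illustration; this is not the model's actual output):

```python
import torch

def mean_pool(token_embeddings, attention_mask):
    # Mean pooling: average the token embeddings, ignoring padded positions.
    mask = attention_mask.unsqueeze(-1).float()
    summed = (token_embeddings * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    return summed / counts

# Dummy batch: 2 sequences, 4 tokens each, 384-dim (matching the card's output size).
torch.manual_seed(0)
hidden = torch.randn(2, 4, 384)
attention_mask = torch.tensor([[1, 1, 1, 0], [1, 1, 1, 1]])  # first sequence has one pad token

pooled = mean_pool(hidden, attention_mask)
embeddings = torch.nn.functional.normalize(pooled, p=2, dim=1)  # the Normalize() module

print(embeddings.shape)        # torch.Size([2, 384])
print(embeddings.norm(dim=1))  # both ≈ 1.0 after Normalize()
```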
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("diegolacomba/multilingual-e5-small-legal-mnrl-0")
# Run inference
sentences = [
    'query: ¿De qué forma la ubicación de la agencia influye en la aplicación del impuesto en los servicios turísticos?',
    'passage De acuerdo con los antecedentes recogidos en esta contestación, dicho servicio estará sujeto al régimen especial de las agencias de viajes regulado en el Capítulo VI del Título IX de la Ley 37/1992 y tendrá la consideración de prestación de servicios única que estará sujeta al Impuesto sobre el Valor Añadido bajo la premisa de que la consultante tiene establecida la sede de su actividad económica o posea un establecimiento permanente desde donde efectúa la operación en el territorio de aplicación del Impuesto.\nEl tipo impositivo aplicable al servicio único de viajes será el general del 21 por ciento previsto en el artículo 90.Uno de la Ley del Impuesto.\nSobre la posible aplicación de la opción del artículo 147 de la Ley 37/1992 para la aplicación del régimen general del Impuesto, según se establece en contestación a consulta vinculante de 20 de septiembre de 2016, número V3942-16:\n“4.- Debe tenerse en cuenta que en el caso de las empresas radicadas en Estados Unidos a que se refiere el escrito de consulta, no se entiende cumplido el requisito de reciprocidad, tal como se pronunció este Centro Directivo en contestación a consulta vinculante número V0579-12 de 16 de marzo de 2012, por lo que, salvo que el servicio prestado por la agencia de viajes consultante esté relacionado con la asistencia a ferias, congresos y exposiciones de carácter comercial o profesional, en los términos del artículo 119 bis de la Ley 37/1992 parcialmente transcrito, no se entenderán cumplidos los requisitos para la opción por el régimen general del Impuesto sobre el Valor Añadido.”.\nb) El mismo caso anterior, pero el viaje se pretende desarrollar en las Islas Canarias.\nSegún establece el artículo 144 de la Ley del Impuesto, dicha operación se encontrará sujeta al Impuesto y, en particular, al régimen especial de las agencias de viajes:\n“Dicha prestación se entenderá realizada en el lugar donde la agencia tenga establecida la sede de su actividad económica o posea un establecimiento permanente desde donde efectúe la operación.”.',
    'passage Contestación completa: 1.- El artículo 9, primer párrafo de la Ley 8/1991, de 25 de marzo, por la que se crea el Impuesto sobre la Producción, los Servicios y la Importación en las Ciudades de Ceuta y Melilla (BOE del 26), dispone lo siguiente:\n“Las importaciones definitivas de bienes en las ciudades de Ceuta y Melilla estarán exentas en los mismos términos que en la legislación común del Impuesto sobre el Valor Añadido y, en todo caso, se asimilarán, a efectos de esta exención, las que resulten de aplicación a las operaciones interiores.”.\n2.- Por otra parte, el artículo 20, apartado uno, número 17º de la Ley 37/1992, de 28 de diciembre, del Impuesto sobre el Valor Añadido (BOE del 29 de diciembre), dispone que estarán exentas de dicho Impuesto:\n“17º. Las entregas de sellos de Correos y efectos timbrados de curso legal en España por importe no superior a su valor facial.\nLa exención no se extiende a los servicios de expendición de los referidos bienes prestados en nombre y por cuenta de terceros.”.\nConforme al precepto anterior, la entrega de sellos de correos de curso legal por importe no superior a su valor facial, objeto de consulta, estará exenta del Impuesto sobre el Valor Añadido.\n3.- En consecuencia, estarán sujetas pero exentas del Impuesto sobre la Producción, los Servicios y la Importación en las Ciudades de Ceuta y Melilla las importaciones definitivas de sellos de correos de curso legal en las ciudades de Ceuta y Melilla cuando, de acuerdo con lo establecido en el apartado anterior de esta contestación, su entrega esté exenta del Impuesto sobre el Valor Añadido.\n4.- Lo que comunico a Vd. con efectos vinculantes, conforme a lo dispuesto en el apartado 1 del artículo 89 de la Ley 58/2003, de 17 de diciembre, General Tributaria.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
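Because the model ends with a `Normalize()` module, its embeddings are unit-length, so cosine similarity reduces to a plain dot product. A small stand-alone illustration with made-up 4-dimensional vectors standing in for embeddings (no model download needed):

```python
import numpy as np

# Made-up vectors standing in for model embeddings.
a = np.array([1.0, 2.0, 2.0, 0.0])
b = np.array([2.0, 1.0, 0.0, 2.0])

# L2-normalize, as the model's Normalize() module does.
a_n = a / np.linalg.norm(a)
b_n = b / np.linalg.norm(b)

cosine = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
dot_of_normalized = np.dot(a_n, b_n)

print(round(cosine, 6) == round(dot_of_normalized, 6))  # True
```

This is why a fast dot-product index over the stored embeddings ranks passages identically to cosine similarity for this model.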
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Information Retrieval
* Dataset: `InformationRetrievalEvaluator`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.1912 |
| cosine_accuracy@3 | 0.3175 |
| cosine_accuracy@5 | 0.386 |
| cosine_accuracy@10 | 0.4835 |
| cosine_precision@1 | 0.1912 |
| cosine_precision@3 | 0.1058 |
| cosine_precision@5 | 0.0772 |
| cosine_precision@10 | 0.0483 |
| cosine_recall@1 | 0.1912 |
| cosine_recall@3 | 0.3175 |
| cosine_recall@5 | 0.386 |
| cosine_recall@10 | 0.4835 |
| **cosine_ndcg@10** | **0.3241** |
| cosine_mrr@10 | 0.2747 |
| cosine_map@100 | 0.2876 |
#### Information Retrieval
* Dataset: `InformationRetrievalEvaluator`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | Value |
|:--------------------|:-----------|
| cosine_accuracy@1 | 0.3431 |
| cosine_accuracy@3 | 0.5083 |
| cosine_accuracy@5 | 0.5848 |
| cosine_accuracy@10 | 0.6813 |
| cosine_precision@1 | 0.3431 |
| cosine_precision@3 | 0.1694 |
| cosine_precision@5 | 0.117 |
| cosine_precision@10 | 0.0681 |
| cosine_recall@1 | 0.3431 |
| cosine_recall@3 | 0.5083 |
| cosine_recall@5 | 0.5848 |
| cosine_recall@10 | 0.6813 |
| **cosine_ndcg@10** | **0.5018** |
| cosine_mrr@10 | 0.4456 |
| cosine_map@100 | 0.4552 |
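Note that with a single relevant passage per query, accuracy@k equals recall@k, and precision@k is accuracy@k divided by k, which is why the columns above mirror each other. A toy sketch of these metrics, computed from hypothetical 1-based ranks of the relevant passage:

```python
# Hypothetical 1-based rank of the single relevant passage for each of 5 queries.
ranks = [1, 3, 12, 2, 6]

def accuracy_at_k(ranks, k):
    # Fraction of queries whose relevant passage appears in the top k.
    return sum(r <= k for r in ranks) / len(ranks)

def mrr_at_k(ranks, k):
    # Mean reciprocal rank, counting only hits within the top k.
    return sum(1.0 / r for r in ranks if r <= k) / len(ranks)

print(accuracy_at_k(ranks, 1))   # 0.2
print(accuracy_at_k(ranks, 10))  # 0.8
print(mrr_at_k(ranks, 10))       # ≈ 0.4
```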
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### Unnamed Dataset
* Size: 58,898 training samples
* Columns: <code>anchor</code> and <code>positive</code>
* Approximate statistics based on the first 1000 samples:
| | anchor | positive |
|:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
| type | string | string |
| details | <ul><li>min: 19 tokens</li><li>mean: 31.33 tokens</li><li>max: 62 tokens</li></ul> | <ul><li>min: 23 tokens</li><li>mean: 325.57 tokens</li><li>max: 508 tokens</li></ul> |
* Samples:
| anchor | positive |
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| <code>query: ¿Las contribuciones que percibe una organización en virtud de un convenio laboral en el fútbol tienen impacto en la base de cálculo para el impuesto correspondiente?</code> | <code>passage Descripción de hechos: La consultante es una Asociación que se dedica a las actividades de ordenación del ejercicio de la profesión de futbolistas de sus miembros, la representación de los mismos así como la defensa de sus intereses profesionales tanto en el ámbito nacional como en el internacional.<br>En virtud de un convenio colectivo para la actividad de fútbol profesional suscrito entre la Liga Nacional de Fútbol Profesional (LNFP) y la consultante, aquella viene obligada a entregar a esta, por cada temporada de vigencia del convenio, una cantidad de dinero (en concepto de Fondo social) destinada a fines benéficos y al normal desarrollo de la actividad de la Asociación.<br>Asimismo, según Acta de Conciliación suscrita entre ambas partes, la LNFP se compromete a abonar a la consultante un porcentaje del importe neto total de los ingresos obtenidos de la explotación conjunta de los derechos de contenidos audiovisuales del fútbol. Dicha cuantía debe destinarse a actividades encamina...</code> |
| <code>query: ¿Qué tipos de transacciones intracomunitarias deben ser declaradas por las empresas según la regulación vigente?</code> | <code>passage Contestación completa: 1.- De acuerdo con el artículo 78 del Reglamento del impuesto aprobado por el Real Decreto 1624/1992, de 29 de diciembre (BOE del 31 de diciembre):<br>“Los empresarios y profesionales deberán presentar una declaración recapitulativa de las entregas y adquisiciones intracomunitarias de bienes y de las prestaciones y adquisiciones intracomunitarias de servicios que realicen en la forma que se indica en el presente capítulo.”.<br>El artículo 79 del Reglamento especifica qué tipo de operaciones deben ser declaradas en la declaración recapitulativa de operaciones intracomunitarias, en concreto establece que:<br>“1. Estarán obligados a presentar la declaración recapitulativa los empresarios y profesionales, incluso cuando tengan dicha condición con arreglo a lo dispuesto en el apartado cuatro del artículo 5 de la Ley del Impuesto, que realicen cualquiera de las siguientes operaciones.<br>1.º Las entregas de bienes destinados a otro Estado miembro que se encuentren exentas ...</code> |
| <code>query: ¿Qué tipos de bebidas contienen alcohol apto para consumo humano?</code> | <code>passage Se entiende por bebida alcohólica todo líquido apto para el consumo humano por ingestión que contenga alcohol etílico.<br>A los efectos de este número no tendrán la consideración de alimento el tabaco ni las sustancias no aptas para el consumo humano o animal en el mismo estado en que fuesen objeto de entrega, adquisición intracomunitaria o importación.”.<br>4.- Con independencia de lo anterior, el artículo 20, apartado uno, número 9º, de la Ley 37/1992, establece que estarán exentas del Impuesto las siguientes operaciones:<br>“9.º La educación de la infancia y de la juventud, la guarda y custodia de niños, incluida la atención a niños en los centros docentes en tiempo interlectivo durante el comedor escolar o en aulas en servicio de guardería fuera del horario escolar, la enseñanza escolar, universitaria y de postgraduados, la enseñanza de idiomas y la formación y reciclaje profesional, realizadas por Entidades de derecho público o entidades privadas autorizadas para el ejercicio de di...</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
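For intuition, `MultipleNegativesRankingLoss` with these parameters applies cross-entropy over cosine similarities scaled by 20, where positive i is the target for anchor i and every other in-batch positive serves as a negative. A minimal pure-Python sketch of the computation (toy vectors; not the sentence-transformers implementation):

```python
import math

def cos_sim(a, b):
    # Cosine similarity between two plain-list vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def mnr_loss(anchors, positives, scale=20.0):
    """Cross-entropy over scaled cosine similarities; positive i is the
    target for anchor i, every other positive in the batch is a negative."""
    total = 0.0
    for i, a in enumerate(anchors):
        logits = [scale * cos_sim(a, p) for p in positives]
        m = max(logits)  # stabilize the log-sum-exp
        log_z = m + math.log(sum(math.exp(l - m) for l in logits))
        total += log_z - logits[i]
    return total / len(anchors)
```

A perfectly separated batch drives the loss toward zero, e.g. `mnr_loss([[1, 0], [0, 1]], [[1, 0], [0, 1]])` is effectively zero.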
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 5
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `fp16`: True
- `tf32`: True
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
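For intuition: with `per_device_train_batch_size: 32` and `gradient_accumulation_steps: 16`, each optimizer step sees an effective per-device batch of 32 × 16 = 512, and `lr_scheduler_type: cosine` with `warmup_ratio: 0.1` ramps the learning rate linearly over the first 10% of steps before a half-cosine decay to zero. A sketch of that schedule (580 total steps taken from the training logs; not the transformers implementation):

```python
import math

def cosine_lr(step, total_steps, base_lr=2e-05, warmup_ratio=0.1):
    """Linear warmup for the first warmup_ratio of steps, then cosine decay to 0."""
    warmup = int(total_steps * warmup_ratio)
    if step < warmup:
        return base_lr * step / warmup
    progress = (step - warmup) / (total_steps - warmup)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```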
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 16
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: cosine
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: True
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
| Epoch | Step | Training Loss | InformationRetrievalEvaluator_cosine_ndcg@10 |
|:----------:|:-------:|:-------------:|:--------------------------------------------:|
| -1 | -1 | - | 0.3241 |
| 0.8691 | 100 | 12.0508 | 0.4582 |
| 1.7300 | 200 | 1.0655 | 0.4828 |
| 2.5910 | 300 | 0.8843 | 0.4950 |
| 3.4519 | 400 | 0.7660 | 0.4997 |
| **4.3129** | **500** | **0.7075** | **0.5018** |
| 5.0 | 580 | - | 0.5018 |
* The bold row denotes the saved checkpoint.
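The evaluation column reports NDCG@10 (cosine similarity via `InformationRetrievalEvaluator`). With binary relevance labels, the metric reduces to a short computation; a sketch (not the sentence-transformers implementation):

```python
import math

def ndcg_at_10(ranked_ids, relevant_ids):
    """NDCG@10 with binary relevance: gain 1 for each relevant doc,
    discounted by log2(rank + 2); normalized by the ideal ranking."""
    dcg = sum(1.0 / math.log2(rank + 2)
              for rank, doc in enumerate(ranked_ids[:10])
              if doc in relevant_ids)
    ideal = sum(1.0 / math.log2(rank + 2)
                for rank in range(min(len(relevant_ids), 10)))
    return dcg / ideal if ideal else 0.0
```

For example, ranking the single relevant document first gives 1.0; placing it second gives 1/log2(3) ≈ 0.63.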
### Framework Versions
- Python: 3.11.13
- Sentence Transformers: 4.1.0
- Transformers: 4.52.4
- PyTorch: 2.6.0+cu124
- Accelerate: 1.7.0
- Datasets: 2.14.4
- Tokenizers: 0.21.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
ekiprop/bert-wnli-epochs5-lr5em06-bs16-2025-06-17-0900 | ekiprop | 2025-06-17T09:01:09Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2025-06-17T09:00:05Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: bert-wnli-epochs5-lr5em06-bs16-2025-06-17-0900
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-wnli-epochs5-lr5em06-bs16-2025-06-17-0900
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6995
- Accuracy: 0.3803
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
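As a point of reference, `lr_scheduler_type: linear` with no warmup steps decays the learning rate from 5e-06 straight down to zero over training (200 total steps per the results table below). A minimal sketch (not the transformers implementation):

```python
def linear_lr(step, total_steps, base_lr=5e-06):
    """Linear decay from base_lr at step 0 to 0 at the final step (no warmup)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```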
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 0.6947 | 0.4930 |
| No log | 2.0 | 80 | 0.6966 | 0.4648 |
| No log | 3.0 | 120 | 0.6969 | 0.4789 |
| No log | 4.0 | 160 | 0.6989 | 0.3662 |
| No log | 5.0 | 200 | 0.6995 | 0.3803 |
### Framework versions
- Transformers 4.52.4
- Pytorch 2.7.0+cu128
- Datasets 3.6.0
- Tokenizers 0.21.1
|
navaneeth005/fitness_model_v0-F32-GGUF | navaneeth005 | 2025-06-17T08:59:37Z | 0 | 0 | transformers | [
"transformers",
"gguf",
"text-generation-inference",
"unsloth",
"llama",
"trl",
"llama-cpp",
"gguf-my-lora",
"en",
"base_model:navaneeth005/fitness_model_v0",
"base_model:quantized:navaneeth005/fitness_model_v0",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] | null | 2025-06-17T08:59:32Z | ---
base_model: navaneeth005/fitness_model_v0
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
- llama-cpp
- gguf-my-lora
license: apache-2.0
language:
- en
---
# navaneeth005/fitness_model_v0-F32-GGUF
This LoRA adapter was converted to GGUF format from [`navaneeth005/fitness_model_v0`](https://huggingface.co/navaneeth005/fitness_model_v0) via ggml.ai's [GGUF-my-lora](https://huggingface.co/spaces/ggml-org/gguf-my-lora) space.
Refer to the [original adapter repository](https://huggingface.co/navaneeth005/fitness_model_v0) for more details.
## Use with llama.cpp
```bash
# with cli
llama-cli -m base_model.gguf --lora fitness_model_v0-f32.gguf (...other args)
# with server
llama-server -m base_model.gguf --lora fitness_model_v0-f32.gguf (...other args)
```
To know more about LoRA usage with llama.cpp server, refer to the [llama.cpp server documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md).
|
ekiprop/bert-wnli-epochs5-lr2em06-bs16-2025-06-17-0856 | ekiprop | 2025-06-17T08:57:39Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2025-06-17T08:56:37Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: bert-wnli-epochs5-lr2em06-bs16-2025-06-17-0856
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-wnli-epochs5-lr2em06-bs16-2025-06-17-0856
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6926
- Accuracy: 0.5634
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 0.6896 | 0.5634 |
| No log | 2.0 | 80 | 0.6912 | 0.5634 |
| No log | 3.0 | 120 | 0.6917 | 0.5634 |
| No log | 4.0 | 160 | 0.6924 | 0.5634 |
| No log | 5.0 | 200 | 0.6926 | 0.5634 |
### Framework versions
- Transformers 4.52.4
- Pytorch 2.7.0+cu128
- Datasets 3.6.0
- Tokenizers 0.21.1
|
ekiprop/bert-wnli-epochs5-lr1em05-bs16-2025-06-17-0852 | ekiprop | 2025-06-17T08:53:50Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2025-06-17T08:52:48Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: bert-wnli-epochs5-lr1em05-bs16-2025-06-17-0852
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-wnli-epochs5-lr1em05-bs16-2025-06-17-0852
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7038
- Accuracy: 0.2394
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 0.6981 | 0.4225 |
| No log | 2.0 | 80 | 0.6983 | 0.4366 |
| No log | 3.0 | 120 | 0.6996 | 0.4366 |
| No log | 4.0 | 160 | 0.7028 | 0.2535 |
| No log | 5.0 | 200 | 0.7038 | 0.2394 |
### Framework versions
- Transformers 4.52.4
- Pytorch 2.7.0+cu128
- Datasets 3.6.0
- Tokenizers 0.21.1
|
Whisper-Pascal/whisper-openvino | Whisper-Pascal | 2025-06-17T08:48:42Z | 0 | 0 | null | [
"license:mit",
"region:us"
] | null | 2025-06-17T08:06:34Z | ---
license: mit
---
This set contains zip archives of the OpenVINO models for whisper.cpp.
Each zip contains the ggml-[model]-encoder-openvino.bin and ggml-[model]-encoder-openvino.xml files required alongside the matching main ggml-[model].bin model.
You can find the main models [here](https://huggingface.co/ggerganov/whisper.cpp/tree/main)
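For example, with the `base` model (an assumed model name for illustration), the directory should end up holding all three files side by side:

```bash
# "base" is an assumed model name for illustration; substitute your model.
MODEL=base
DIR=$(mktemp -d)
# After downloading ggml-base.bin and extracting the matching zip archive
# into the same directory, these three files sit together:
touch "$DIR/ggml-$MODEL.bin" \
      "$DIR/ggml-$MODEL-encoder-openvino.bin" \
      "$DIR/ggml-$MODEL-encoder-openvino.xml"
ls "$DIR"
```

A whisper.cpp binary built with OpenVINO support picks up the encoder files automatically when pointed at the main model.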
To use the OpenVINO version of whisper.cpp, place the main model and the extracted contents of the matching zip in the same directory (the zip file can safely be deleted once extracted). |
ICTNLP/stream-omni-8b | ICTNLP | 2025-06-17T08:37:04Z | 3 | 3 | null | [
"safetensors",
"stream_omni_llama",
"omni",
"any-to-any",
"arxiv:2506.13642",
"license:gpl-3.0",
"region:us"
] | any-to-any | 2025-06-16T09:24:53Z | ---
license: gpl-3.0
pipeline_tag: any-to-any
tags:
- omni
---
# Stream-Omni: Simultaneous Multimodal Interactions with Large Language-Vision-Speech Model
[](https://arxiv.org/abs/2506.13642)
[](https://github.com/ictnlp/Stream-Omni)
[](https://huggingface.co/ICTNLP/stream-omni-8b)
[](https://huggingface.co/datasets/ICTNLP/InstructOmni)
[](https://github.com/ictnlp/Stream-Omni)
> [**Shaolei Zhang**](https://zhangshaolei1998.github.io/), [**Shoutao Guo**](https://scholar.google.com.hk/citations?user=XwHtPyAAAAAJ), [**Qingkai Fang**](https://fangqingkai.github.io/), [**Yan Zhou**](https://zhouyan19.github.io/zhouyan/), [**Yang Feng**](https://people.ucas.edu.cn/~yangfeng?language=en)\*
The introduction and usage of Stream-Omni refer to [https://github.com/ictnlp/Stream-Omni](https://github.com/ictnlp/Stream-Omni).
Stream-Omni is an end-to-end language-vision-speech chatbot that simultaneously supports interaction across various modality combinations, with the following features:
- **Omni Interaction**: Support any multimodal inputs including text, vision, and speech, and generate both text and speech responses.
- **Seamless "see-while-hear" Experience**: Simultaneously output *intermediate textual results* (e.g., ASR transcriptions and model responses) during speech interactions, like the advanced voice service of GPT-4o.
- **Efficient Training**: Require only a small amount of omni-modal data for training.
<p align="center" width="100%">
<img src="./stream-omni.png" alt="stream-omni" style="width: 90%; min-width: 300px; display: block; margin: auto;">
</p>
## Demo
| Microphone Input | File Input |
| ------------------------------------------------------------ | ------------------------------------------------------------ |
| <video src='https://github.com/user-attachments/assets/99325a91-b81c-4fd1-a68e-c0628d5784fe' width="100%"/> | <video src='https://github.com/user-attachments/assets/bcb37b15-6c8c-4eae-845e-5da10aaac11e' width="100%"/> |
> [!NOTE]
>
> **Stream-Omni can produce intermediate textual results (ASR transcription and text response) during speech interaction, offering users a seamless "see-while-hear" experience.**
|
MaestrAI/jack_marlowe-lora-1750146373 | MaestrAI | 2025-06-17T08:27:35Z | 0 | 0 | null | [
"region:us"
] | null | 2025-06-17T07:46:12Z | # jack_marlowe LoRA Model
This is a LoRA model for the character Jack Marlowe.
Created at 2025-06-17 09:46:13
|
opium80s/medgemma-27b-merged-10-golden-rules-expert | opium80s | 2025-06-17T08:26:29Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"gemma3_text",
"text-generation",
"text-generation-inference",
"unsloth",
"conversational",
"en",
"base_model:unsloth/medgemma-27b-text-it-unsloth-bnb-4bit",
"base_model:finetune:unsloth/medgemma-27b-text-it-unsloth-bnb-4bit",
"license:apache-2.0",
"autotrain_com... | text-generation | 2025-06-17T08:06:48Z | ---
base_model: unsloth/medgemma-27b-text-it-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- gemma3_text
license: apache-2.0
language:
- en
---
# Uploaded fine-tuned model
- **Developed by:** opium80s
- **License:** apache-2.0
- **Fine-tuned from model:** unsloth/medgemma-27b-text-it-unsloth-bnb-4bit
This gemma3_text model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
|
adugeen/authorship-e5-small | adugeen | 2025-06-17T08:12:00Z | 0 | 0 | sentence-transformers | [
"sentence-transformers",
"safetensors",
"bert",
"sentence-similarity",
"feature-extraction",
"generated_from_trainer",
"dataset_size:276686",
"loss:MultipleNegativesRankingLoss",
"arxiv:1908.10084",
"arxiv:1705.00652",
"base_model:intfloat/multilingual-e5-small",
"base_model:finetune:intfloat/... | sentence-similarity | 2025-06-17T08:09:57Z | ---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:276686
- loss:MultipleNegativesRankingLoss
base_model: intfloat/multilingual-e5-small
widget:
- source_sentence: 'ะะตัะตะฝะตะณะธ ะพััััะฟะฐะปะธ. ะะฝะธ ะผะพะณะปะธ ะทะฐะฟัะพััะพ ัะฑะธัั ะพััะฐะฒัะตะณะพัั ะฟะพะทะฐะดะธ
ะบะฝัะทั ะะปะฐะดะธะผะธัะฐ, ะผะฐะปััะธัะบั ะธ ััะฐัะธะบะฐ, ะฝะพ ะฟะพะปััะธะปะธ ะฟัะธะบะฐะท - ัั
ะพะดะธัั. ะััั - ะฟะตัะตะฝะตะถัะบะธะน
ั
ะฐะฝ, ะฟัะพะธะณัะฐะฒัะธะน ะฑะพะน ั ะบะฝัะทะตะผ, ะฑัะป ัะผะตััะตะปัะฝะพ ัะฐะฝะตะฝ.
ะะพะธะฝั ะทะฐะฑัะฐะปะธ ะตะณะพ ะธ ะฒะตัะฝัะปะธัั ะฝะฐ ะฟัะตะถะฝะตะต ะผะตััะพ ััะพัะฝะบะธ, ะณะดะต ะฑัะปะธ ะพััะฐะฒะปะตะฝั ะพะฑะพะทั
ั ะฟัะพะฒะธะฐะฝัะพะผ ะธ ัะฐะทะฑะธั ะฒัะตะผะตะฝะฝัะน ะปะฐะณะตัั. ะะธะนัั ะฝะต ะพัั
ะพะดะธะป ะพั ะพััะฐ ะฝะธ ะฝะฐ ัะฐะณ, ะฝะฐะดะตััั
ะฝะต ัะพ ะฝะฐ ััะดะพ, ะฝะต ัะพ ะฝะฐ ะปะตะบะฐัะตะน. ะะตะดั ะบัะพะผะต ะฝะตะณะพ ั ะผะฐะปััะธัะบะธ ะฝะธะบะพะณะพ ะฝะต ะฑัะปะพ. ะะฐัั
ะพะฝ ะฟะพัะตััะป ะตัั ัะพะฒัะตะผ ะบัะพั
ะพะน.
ะััั ะฑัะป ะฒะปะฐััะฝัะผ ะธ ะถะตััะพะบะธะผ ะฟัะฐะฒะธัะตะปะตะผ. ะ ััะฝั ะพัะฝะพัะธะปัั ั
ะพัะพัะพ, ะฝะพ ะฝะธะบะพะณะดะฐ ะฝะต
ะฑะฐะปะพะฒะฐะป. ะะปั ะะธะนััะฐ ะพัะตั ะฒัะตะณะดะฐ ะฑัะป ะธะดะตะฐะปะพะผ, ะผะฝะพะณะธะต ะทะฐะฒะธะดะพะฒะฐะปะธ ะตะณะพ ัะตัะธะผะพััะธ ะธ
ั
ะธััะพัะผะธั. "ะะปะฐััั ะดะตัะถะธััั ะฝะฐ ัััะฐั
ะต" - ัะฐััะพ ะณะพะฒะพัะธะป ะพะฝ. ะะพ ัะตะฟะตัั ะพะฝ ะฝะฐั
ะพะดะธะปัั
ะผะตะถะดั ะถะธะทะฝัั ะธ ัะผะตัััั. ะ ัััะฝะฐั ัะฐัะฐ ะฒะตัะพะฒ ะฑัะปะฐ ะฝะฐะผะฝะพะณะพ ััะถะตะปะตะต...
ะะตะปะธะบะธะน ั
ะฐะฝ ัะผะตั ะฝะพััั, ะฝะต ะฟัะธั
ะพะดั ะฒ ัะพะทะฝะฐะฝะธะต.
ะะฐัะตะฝั ะฟะพะฝะธะผะฐะป, ััะพ ะฒัะฝัะถะดะตะฝ ััะฐัั ัะปะตะดัััะธะผ ั
ะฐะฝะพะผ, ะฒะตะดั ะฟะพ ะปะธะฝะธะธ ะพััะฐ ั ะฝะตะณะพ
ะฑะพะปััะต ัะพะดััะฒะตะฝะฝะธะบะพะฒ ะฝะต ะฑัะปะพ. ะะพ ััะพ ะทะฝะฐัะธะปะพ ะพะดะฝะพ. ะกะฝะพะฒะฐ ัะพะฒะตััะฐัั ะฝะฐะฑะตะณะธ ะฝะฐ ะะธะตะฒ
ะธ ะะพะฒะณะพัะพะด, ะฟัะพะดะพะปะถะฐัั ะดะตะปะพ ะััะธ, ะฐ ะทะฝะฐัะธั - ะพะฑะตัะฝััััั ะฟัะพัะธะฒ ัะฒะพะธั
ะดััะทะตะน, ะฟะพัััะฟะธัั
ะฟะพะดะปะพ ะฟะพ ะพัะฝะพัะตะฝะธั ะบ ะบะฝัะทั, ะะปะตะบัะต ะธ ะะปะตะฝัะบะต. ะ ะตััั ะปะธ ั ะฝะตะณะพ ะฒัะฑะพั?
ะะฐ. ะ ัะตะนัะฐั ััะพ ะตะณะพ ะฒัะฑะพั. ะัะตั ะฝะต ะฟะพะดัะบะฐะถะตั ะตะผั, ะฝะต ะฟะพะผะพะถะตั, ะฝะต ัะบะฐะถะตั ะฒะตัะฝัะน
ะฟััั. ะะฐ ะธ ะฝะต ะฟะพัะปััะฐะป ะฑั ะะธะนัั ะตะณะพ. ะะฝ ะฝะธะบะพะณะดะฐ ะฝะต ัะผะพะถะตั ััะฐัั ัะฐะบะธะผ ะถะตััะพะบะธะผ,
ะบะฐะบ ะพัะตั. ะะพ ัะตะฟะตัั ะผะฐะปััะธะบ ะฟะพะฝะธะผะฐะป, ะพัะบัะดะฐ ัะพั ัะตัะฟะฐะป ัะฒะพั ะฝะตะฝะฐะฒะธััั, ะทะปะพััั,
ะพะฑะธะดั ะธ ะถะฐะถะดั ะบัะพะฒะธ. ะฃ ะฝะตะณะพ ะฝะธะบะพะณะดะฐ ะฝะต ะฑัะปะพ ะฝะฐััะพััะตะณะพ ะดััะณะฐ, ะบะพัะพััะน ัะผะพะณ ะฑั
ะฟะพะดััะฐะฒะธัั ะฟะปะตัะพ, ะดะฐัั ะดััะถะตัะบะธะน ัะพะฒะตั. ะะตัะผะพััั ะฝะฐ ะพัะดั ะฟะพะดัะธะฝัะฝะฝัั
ะตะผั ะฟะตัะตะฝะตะณะพะฒ
ะธ ั
ะฐะทะฐั, ะพัะตั ะฑัะป ะพะดะธะฝะพะบ. ะะต ััะฐะฒะธะป ะฝะธะบะพะณะพ ััะดะพะผ ั ัะพะฑะพะน, ัะฝะธะถะฐะป, ะถะตััะพะบะพ ะฝะฐะบะฐะทัะฒะฐะป
ะฒะพะธะฝะพะฒ, ะทะฐัะปัะถะธะฒ ัะตะผ ัะฐะผัะผ "ัััะฐั
ะธ ัะฒะฐะถะตะฝะธะต". ะขะฐะบ ััะธัะฐะป ะพะฝ ัะฐะผ. ะ ัะฐะบะพะน ะฟะพะทะธัะธะธ
ะฟัะธะดะตัะถะธะฒะฐะปะธัั ะฟะตัะตะฝะตะณะธ...
ะ ะะธะนัั? ะะปะตะบัะฐ ัะฟะฐั ะตะผั ะถะธะทะฝั, ะฝะต ะทะฝะฐั, ััะพ ัะพั ะฒัะฐะณ. ะ ะบะพะณะดะฐ ัะทะฝะฐะป, ัะฐะทะฒะต ะทะฐั
ะพัะตะป
ัะฑะธัั ะตะณะพ? ะะตะดั ะฟะตัะตะฝะตะณะธ ัะพะถะณะปะธ ะตะณะพ ะดะตัะตะฒะฝั ะธ ัะฑะธะปะธ ะดะตะดััะบั.
ะะพ ะะธะนัั ะฝะต ะผะพะณ ะฟะพะฝััั, ะบะฐะบ ะฒะพะพะฑัะต ัะฐะบะพะต ะฒะพะทะผะพะถะฝะพ. ะขะพ ะปะธ ั
ัะธััะธะฐะฝะธะฝั ัะฐะบ ะธ ะฟะพะปะฐะณะฐะตััั,
ะผะพะปัะฐ ะฟะตัะตะฝะพัะธัั ะฒัะต ะฝะตะฒะทะณะพะดั, ััะพ ะฟะพััะปะฐะตั ะตะผั ััะดัะฑะฐ, ัะพ ะปะธ ััััะบะธะน ะดัั
ะฝะฐััะพะปัะบะพ
ัะธะปัะฝ, ััะพ ะฝะต ะผะพะถะตั ะพะฟััะบะฐัััั ะดะพ ะผะตััะธ. ะะต ะฟะพะฝะธะผะฐะป..., ะฐ ะถะฐะถะดะฐ ะผะตััะธ ัะฐะทะณะพัะฐะปะฐัั
ะฒ ะตะณะพ ะดััะต. ะะปะตะฝัะบะฐ... ะฅัะฐะฑัะฐั ะดะตะฒัััะบะฐ ั ััะปะบะพะน ะธ ะดะปะธะฝะฝะพะน ัะฒะตัะปะพะน ะบะพัะพะน... ะะตั!
ะ ะฝะตะน ะดะฐะถะต ะฝะต ะดัะผะฐะน!
ะะฝ ะดะพะปะถะตะฝ ะฝะฐะบะฐะทะฐัั ัะฑะธะนัั ะพััะฐ, ะดะฐะถะต ะตัะปะธ ัะพั ัะฐะผ, ะพะฑะผะฐะฝัััะน ะัะธะฒะถะตะน, ะฒะฒัะทะฐะปัั
ะฒ ะฑะพะน. ะ ััั ะถะต ัะตัะดัะต ะฒะพะทัะฐะถะฐะปะพ. "ะขะฐะบ ะฝะตะปัะทั".
ะขั ะฒะตะดั ัะถะต ะฟัะธะฝัะป ัะตัะตะฝะธะต. ะขะฐะผ, ะฝะฐ ะฟะพะปะต ะฑะพั, ะบะพะณะดะฐ ะพัะตั ะฟัะพะธะณัะฐะป. ะขั ะพััะฐะฝะพะฒะธะป
ัะฐะทัััะตะฝะฝัั
ะฟะตัะตะฝะตะณะพะฒ, ะณะพัะพะฒัั
ัะฐะทะพัะฒะฐัั ะฝะตะฝะฐะฒะธััะฝะพะณะพ ะบะฝัะทั ะะปะฐะดะธะผะธัะฐ. ะ ััะดะฝะพะน
ะดะตะด ะธ ะผะฐะปััะธัะบะฐ? ะะพะผะตัะฐะปะธ ะฑั ะพะฝะธ ััะพะผั? ะะตั.
ะ ะฐะทะฒะต ะฟะฐัะตะฝั ะผะพะณ ัะบะฐะทะฐัั ัะฐะบะธะต ัะปะพะฒะฐ?
ะะตัะตะด ะณะปะฐะทะฐะผะธ ัะฝะพะฒะฐ ะฟะพัะฒะปัะตััั ัะฐ ะบะฐััะธะฝะฐ.
ะะพะปะพะดะพะน ะบะฝัะทั ัะดะฐัะธะป ะััั ะผะตัะพะผ ะธ ะฒัะฑะธะป ะธะท ัะตะดะปะฐ. ะะฐะปััะธะบ ะฑัะพัะฐะตััั ะบ ะพััั. ะัะดะฐ
ะฟะตัะตะฝะตะณะพะฒ ัะถะต ะฝะต ัะดะตัะถะธะฒะฐะตั ะบะพะฝะตะน, ัะฒััะธั
ัั ะบ ะฑะพั. ะัั ัะตะบัะฝะดะฐ, ะธ ะพะฝะธ ัะพัะฒัััั
ั ะผะตััะฐ. ะะพะปัะฝะฐ ะพะฑะฐะณัะธััั ะบัะพะฒัั.
"ะะตั! -ะพััะฐะฝะฐะฒะปะธะฒะฐะตั ะธั
ะะธะนัั. - ะญัะพ ะฑัะป ัะตััะฝัะน ะฟะพะตะดะธะฝะพะบ. ะฃั
ะพะดะธะผ!"
ะะพะธะฝั ะฝะตั
ะพัั, ะฝะพ ะฟะพัะปััะฝะพ ัะฐะทะฒะพัะฐัะธะฒะฐัั ะบะพะฝะตะน.
ะะพัะตะผั ะพะฝะธ ะฟะพัะปััะฐะปะธ ะตะณะพ? ะะฝ ะฒะตะดั ะฝะต ั
ะฐะฝ. ะัะฒะตั ะพะดะธะฝ. ะะฝะธ ะดะพะฒะตัััั ะตะผั. ะะฐะบ ะธ
ะตะณะพ ะพััั.
ะขะฐะบ ััะพ ะถะต ะดะตะปะฐัั?
ะะปัะฟัะน ะผะฐะปััะธัะบะฐ. ะขั ะฒัั ัะตัะธะป. ะขั ะฒะพะทะณะปะฐะฒะธัั ะฟะตัะตะฝะตะถัะบะธะน ัะพัะท ะฟะปะตะผัะฝ, ะฝะพ ะฝะต ะฟะพัะผะตะตัั
ะฝะฐะฟะฐััั ะฝะฐ ะะธะตะฒัะบัั ะ ััั.
ะะธะนัั ะฒััะตะป ะธะท ัะฐััะฐ. ะััะณะพะผ, ะบัะดะฐ ะฝะต ะฟะพัะผะพััะธ ะฒััััะพะธะปะพัั ะฒะพะนัะบะพ. ะ ะพะถะธะดะฐะฝะธะธ
ะผะพะปะพะดะพะณะพ ั
ะฐะฝะฐ ะพะฝะธ ะฟะตัะตััะฟััะฒะฐะปะธัั, ัะฟะพัะธะปะธ, ััะณะฐะปะธัั. ะะพ ััั ะฒัะต ัะฐะทะพะผ ะทะฐะผะพะปัะฐะปะธ.
ะะฝะธ ะฑัะปะธ ะณะพัะพะฒั ััะปััะฐัั ัะตัะตะฝะธะต ะะธะนััะฐ.
- ะกะปััะฐะนัะต! ะะตัะตะฝะตะณะธ- ัะฒะพะฑะพะดะฝัะต ะฒะพะธะฝั! -ะณะพะปะพั ะผะฐะปััะธะบะฐ ะฝะต ะดัะพะณะฝัะป.- ะั ะผะพะถะตัะต
ัะปะตะดะพะฒะฐัั ะทะฐ ะผะฝะพะน, ะฐ ะผะพะถะตัะต ะฟัะธะผะบะฝััั ะบ ะบะฐะฝะณะฐัะฐะผ! ะฏ ะถะต ััะธัะฐั, ััะพ ะผั ะดะพะปะถะฝั ัะฐะทะฑะธัั
ะฟะพะปะพะฒัะตะฒ, ะบะพัะพััะต ะดะฐะฒะฝะพ ะทะฐััััั ะฝะฐ ะฝะฐัะธ ะทะตะผะปะธ! ะัะพ ัะพ ะผะฝะพะน?!
ะัะฒะตัะพะผ ะฑัะปะพ ะพะดะพะฑัะธัะตะปัะฝะพะต ะธ ะดััะถะฝะพะต "ะฃัะฐ!"
ะะตัะตะฝะตะณะธ ััะปะธ ั ััััะบะพะน ะทะตะผะปะธ, ะฝะพ ะฝะต ะฝะฐะฒัะตะณะดะฐ. ะงะตัะตะท ะบะฐะบะพะต-ัะพ ะฒัะตะผั ะพะฝะธ ะฒะตัะฝัะปะธัั
ั ะฝะพะฒัะผ ั
ะฐะฝะพะผ, ััะพะฑั ะฒะฒัะทะฐัััั ะฒ ะผะตะถะดะพััะพะฑะฝัั ะฒะพะนะฝั ะผะตะถะดั ะฏัะพัะปะฐะฒะพะผ ะัะดััะผ ะธ ะกะฒััะพะฟะพะปะบะพะผ
ะะบะฐัะฝะฝัะผ, ะฝะฐ ััะพัะพะฝะต ะฟะพัะปะตะดะฝะตะณะพ.'
sentences:
- 'http://vk.com/audio?performer=1&q=Hollywood%20Undead%20Coming%20Back%20Down
ะััััะต ัะปะธัั ะฑะพะปััะพะณะพ ะณะพัะพะดะฐ, ััะพ ะพัะฒะตัะฐะปะธัั ัััะตะฝะฝะธะผ ัะพะปะฝัะตะผ. ะะฐ ะฟะตัะตัะปะบะฐั
ะฒัััะตัะฐะปะธัั
ัะตะดะบะธะต ะฟัะพั
ะพะถะธะต, ะบะพัะพััะต ั ัะดะธะฒะปะตะฝะธะตะผ ัะผะพััะตะปะธ ะฝะฐ ะฟะพะปััะฐะทะดะตัะพะณะพ ะฟะฐัะฝั, ัะฐะณะฐััะตะณะพ
ะฒะดะพะปั ัะปะธั ะะฐะณะฝะพะปะธะธ.
"ะกัะผะฐััะตะดัะธะน,"- ะฟะพะดัะผะฐัั ะผะฝะพะณะธะต, ะฝะพ ัะฐะทะฒะต ััะพ ะฝะตะฟัะฐะฒะดะฐ? ะะตั, ััะพ, ะฟัะฐะฒะดะฐ, ััะพ
ัะธััะตะนัะฐั, ะฟัะฐะฒะดะฐ, ะบะพัะพัะฐั ั ะบะฐะถะดัะผ ะดะฝัะผ ะณะปะพะถะตั ะตะณะพ ะดััั. ะะฝ ะฑัะป ััะผะฐััะตะดัะธะผ ะดะพ
ัะพะณะพ ะบะฐะบ ะพะฝะฐ ะฟัะธัะปะฐ ะฒ ะตะณะพ ะถะธะทะฝั, ะธ ะพััะฐะปัั ัะฐะบะธะผ ะฟะพัะปะต ัะพะณะพ ะบะฐะบ ะพะฝะฐ ััะปะฐ...
ะะฐะถะดัะน ะดะตะฝั, ะฟัะพะณัะปะธะฒะฐััั ะฟะพ ัะปะธัะฐะผ ัััะตะฝะฝะตะณะพ ะณะพัะพะดะฐ, ะพะฝ ั ัะปัะฑะบะพะน ะฝะฐ ะณัะฑะฐั
, ะธ
ั ะณัััััั ะฒ ะณะปะฐะทะฐั
ะฒัะฟะพะผะธะฝะฐะป ะฒัะต ะผะพะผะตะฝัั, ะบะพะณะดะฐ ะพะฝะฐ ั
ะพะดะธะปะฐ ะฒะผะตััะต ั ะฝะธะผ. ะะฐะถะดัะน
ัะฐะท, ะตะผั ััะดะธะปะพัั, ััะพ ะพะฝะฐ ั ะฝะธะผ, ััะพ ะพะฝะฐ ะฝะต ะพััะฐะฒะธะปะฐ ะตะณะพ, ะฝะพ ััะพ ะฒะตะดั ะฝะตะฟัะฐะฒะดะฐ,
ะธะปะธ ะฟัะฐะฒะดะฐ? ะงัะพ ะทะฐ ัััั? ะะพะฝะตัะฝะพ, ะถะต, ะพะฝะฐ ัะถะต ะฝะต ั ะฝะธะผ, ะพะฝะฐ ัะฐะผ ะดะฐะปะตะบะพ ะฝะฐะฒะตัั
ั
ะพั ะฝะธั
...
ะะฝ ะฒัะตะณะดะฐ ัะปััะฐะป ะตั ะทะฒะพะฝะบะธะน, ะฝะพ ะพะดะฝะพะฒัะตะผะตะฝะฝะพ ัะพะฑะบะธะน ะณะพะปะพั, ะพั ะบะพัะพัะพะณะพ ัะฐะบ ะธ ั
ะพัะตะปะพัั
ะฟัะธะถะฐัั ะบ ัะตะฑะต ะธ ะฝะต ะพัะฟััะบะฐัั ะฝะธะบะพะณะดะฐ, ะฝะพ ะพะฝ ะพัะฟัััะธะป, ััะดะฐ, ะพั, ะบัะดะฐ ะฝะต ะฒะพะทะฒัะฐัะฐัััั....
ะ ัะตะฟะตัั, ะตั ะณะพะปะพั ะฟะพััะตะฟะตะฝะฝะพ ัะฐััะฒะพััะตััั ะธะท ะตะณะพ ะผััะปะตะน.
ะก ะฝะตะน, ะพะฝ ััะฒััะฒะพะฒะฐะป, ััะพ ะพะฝ ะฝะต ัะฐะบะพะน ะบะฐะบ ะฒัะต, ั ะฝะตะน, ะพะฝ ะผะพะณ ะฑััั ะพัะพะฑะตะฝะฝัะผ, ะฝะพ
ะทะฐัะตะผ ัะตะฟะตัั ะฑััั ัะฐะบะธะผ, ะตัะปะธ ะตั ะฝะตั? ะขะตะฟะตัั, ะพะฝ ะปะธัั ะพะดะฝะพ ะปะธัะพ, ะฒ ัะพะน ัะพะปะฟะต,
ััะพ ะพะบััะถะธะปะฐ ะตะณะพ. ะะพะทะผะพะถะฝะพ, ะตะณะพ ะธ ะทะฐะผะตััั, ะฝะพ ัะพัะฝะพ ะฝะต ะบะฐะบ ะปะธัะฝะพััั, ะตะณะพ ะทะฝะฐะปะฐ
ะปะธัั ะพะฝะฐ, ะฝะพ ะพะฝะฐ ััะปะฐ...
Someday, someday
ะะพ ะบะพะณะดะฐ-ะฝะธะฑัะดั, ะบะพะณะดะฐ-ะฝะธะฑัะดั
I know you''re coming back
ะฏ ะทะฝะฐั, ัั ะฒะตัะฝะตัััั...
ะกััะพัะบะธ ัะฐะผะธ ะฒัะฟะปัะปะธ ั ะฝะตะณะพ ะฒ ะณะพะปะพะฒะต. ะะตัะฝั, ะบะพัะพััั ะพะฝ ััะปััะฐะป ะฝะตะดะฐะฒะฝะพ ะฟะพ ัะฐะดะธะพ,
ะฒัะตะณะดะฐ ะฟัะตัะปะตะดะพะฒะฐะปะฐ ะตะณะพ. ะะต ะฒะฐะถะฝะพ ะณะดะต, ะฝะฐ ัะปะธัะต, ะฒ ะผะตััะพ, ะฒ ะผะฐัะธะฝะต, ะธะปะธ ะดะพะผะฐ,
ะพะฝ ะฒัะตะณะดะฐ ะฒัะฟะพะผะธะฝะฐะป ะตั... ะขั, ััะพ ะธะทะผะตะฝะธะปะฐ ะตะณะพ ะถะธะทะฝั, ะธ ัั, ะบะพัะพัะฐั ัะตะนัะฐั ะพัะตะฝั
ะดะฐะปะตะบะพ ะพั ะฝะตะณะพ...
ะจะฐะณะฐั ะฟะพ ัะปะธัะฐะผ ะฟัะพััะฟะฐะฒัะตะณะพัั ะณะพัะพะดะฐ, ะพะฝ ะฝะตะฒะพะปัะฝะพ ะฟัะพััะป-ัะพ ะผะตััะพ, ะณะดะต ะพะฝะฐ ะฟะพะบะธะฝัะปะฐ
ััะพั ะผะธั. ะะพัะฟะพะผะธะฝะฐะฝะธั ะฝะฐั
ะปัะฝัะปะธ ะตะณะพ, ะฝะพ ะพะฝ ะฟะพะฟััะฐะปัั ะธั
ะพัะพะณะฝะฐัั, ะฝะพ ัะฐะทะฒะต ััะพ
ะฒะพะทะผะพะถะฝะพ? ะะตั... ะะฝ ััะพ ะทะฝะฐะป, ะฝะพ ะฒัั ะถะต ะฟัะพัะธะฒะธััั ััะดัะฑะต. ะะฐะบะฐั ะธัะพะฝะธั, ะฝะต ะฟัะฐะฒะดะฐ
ะปะธ?
The dirty street in an old district of Magnolia was soaked through with
blood, tobacco smoke, and alcohol. She had suggested leaving, and he had not
listened to her. A fool. A naive little fool who fears nothing, even though
everything is quite the opposite...
The maniac they were running from was steeped in madness, but can you really
call alcohol madness? Who knows, who knows... And the drunk man''s victims
ran into some narrow gap with no way out - a dead end... Just one single
word, yet how much panic it brings... Running, they ended up standing in
something resembling mud - or was it really mud? Who knows, who knows...
Swinging his arm, the maniac aimed a pocket knife at the boy and threw it.
Banal, right? But the girl managed to step in front of him, and so the blow
landed straight in her stomach.
What happened next the boy remembered only dimly, but it was obvious: before
his eyes, his Juvia had died in his place. Strange, right? It seems it
should have been the other way around, yet this is how it turned out. Her
blood was on his hands, and he apparently lost consciousness...
Now the world is too grey - no, it is blood-red, the very same as her
favourite colour. Strange, right? She loved the rain, yet her favourite
colour was red... But does any of that matter, if Juvia has left the stage
forever? Perhaps that really was her hour, or perhaps she would have lived
longer still, but it passed by all the same, just like everything good in
this greedy world...
And that fact forced him to grow up and accept reality with everything it
contains... It was hard only at the beginning; now he is used to it and even
copes quite well, but something will change, and that something will happen
far too fast...
She never wondered whether she would be remembered, or whether they would
cry when she died; no, she never even had such thoughts, for you lived in
your dreams.
You thought you would live forever, like a god, or a devil. She thought no
bullet could pierce her, yet she was wrong all the same... A simple knife
took your life, and you know, I do not regret it, for you always had wings,
like an angel''s, even though you were not always happy until God''s
messengers began to sing in your life...
And again a shiver runs through him; he thinks that any moment now she will
run up from behind, cover his eyes, and say "Guess who?" And you, as usual,
will simply say with a smile, "Juvia. My Juvi." And the dark-haired boy will
swear to all the gods that she blushes at his words. Because it is true...
Weekdays are slowly pushing her voice out of his head, while he himself
sinks into the bustle of the city. Perhaps he will forget her, find another,
and they will live happily; or he will be lonely all his life, remembering
her. Who knows how this story will end, but one day they will meet. In a
hundred years, in centuries, millennia - it does not matter... They will
meet, and that is what counts...
And for now, the dark-haired boy will say just one sentence and, with a
smirk, head towards home...
- When will you come down, Juvi?
In just a few seconds, the world can change beyond recognition. Treasure
this time, and perhaps, in the future, time itself will present you with a
gift...'
- 'It hurts... It hurts too much... Every moment hurts, every breath brings
simply unthinkable pain. And she lies right beside him, lifeless. There are
people all around, but he simply does not see them. He saw her die; she was
looking straight at him in that moment and wanted to kiss him.
"Magic can do much, but not death".
He shut himself off from everything around; only the pain remained. Then
suddenly a flash of fury, and before his eyes, Belle''s fading gaze. Rage
made him rise from his knees and head for the killer. With every step a
volcano woke in his leg, but that no longer mattered. The pain in his soul
drowned out everything else.
"She died because of me".
He walked up to the killer; the man lay on the ground, smirking. He had
carried out his revenge and was pleased with himself. The man with the cane
began to choke the killer with it, but the smirk never left his face. They
tried to pull Gold away, but fury gave him enormous strength, and soon the
pirate''s eyes went dark. Yet Gold felt no relief from it; he only felt that
he had betrayed her faith. By killing Belle, the pirate had killed
everything human in Gold; only the monster remained. A monster with a broken
heart.
Emma wanted to put handcuffs on him but could not. With a single impulse he
threw her and her parents away from him. His skin began to change, and his
clothes changed in an instant as well. Having committed the murder, he
became the crocodile again - the merciless Rumpelstiltskin.
He walked back to Belle and fell to his knees before her. He looked at her
cold face and simply could not believe that her warm eyes would never again
look with tenderness at her monster. A terrible, inhuman cry tore from his
chest, as if he were calling his beloved''s soul, but she did not answer.
And again fury, mad fury and tears. Her only fault was that she had fallen
in love with a monster and given him happiness. But her death took
everything.
It was nothing like that time in the castle. Back then he at least knew
that his princess was all right; now...
It hurts too much...
Rumpelstiltskin did not want to step across the town line and forget
everything. He could not forget his princess, even though the memories
brought suffering. He simply wanted to die beside the girl who had taken
his heart.'
- 'Until that moment I felt nothing. Emptiness and darkness. After that
creature sank its teeth into my neck, I managed to say goodbye to everyone
in my mind and was ready for death. But this was not death. It cannot be
this vague and dark on the other side. So I am still alive. And if so, I
will get out of here soon...
An electric charge seemed to run through my body. I flinched and opened my
eyes wide.
To be honest, at first I did not even understand where I was. And at that
moment it did not trouble me at all, because in front of me I saw the
familiar face of the warlock.
The only one I had liked after Jace, the one who had managed to capture my
heart at first sight. Yes, it was unquestionably him. For the first time in
all these years, someone within the Institute''s walls other than
Shadowhunters.
- Magnus Bane, - I stated the obvious with all the coldness possible in
such a situation.
That is simply the custom among us hunters. I was hardly going to throw
myself at his neck with kisses and primitive little phrases like: "Oh, hi!
Long time no see. I missed you so much".
Purely physically, at that moment, I could not have allowed myself such a
thing anyway. And I was seeing the warlock only for the second time in my
life. But, damn it, that had been enough to hook me like a foolish carp!
- Alexander Lightwood, - Magnus smiled condescendingly.
Apparently my head had begun to spin after the long blackout, but it even
felt pleasant.
I opened my mouth to ask a question, but my guest beat me to it.
- Just do not ask what I am doing here, - he burned me with a look of his
honey-coloured eyes.
- Hodge was worried for your life and, as I can see, not in vain.
That was exactly what I had wanted to ask. If he can read minds, that is
quite deplorable. I prefer to keep my thoughts to myself.
Something amused Bane again, and he smiled crookedly, turning away from me.
That confirmed my suspicions.
- All right, I will not embarrass you, - the warlock rose. - You are still
too weak, and rest will do you no harm.
Magnus made a strange movement resembling a half-bow and headed for the
door.
"No, you will not get away from me that easily".
I really did want him to stay near, not to leave so quickly.
I liked him, as far as one can judge by a single fleeting meeting of eyes.
Though I was not ready to admit it even to myself.
- I feel fine, - I remarked in a voice not my own, following the warlock
with my gaze. He did not stop. - Bane! - again no reaction. - Stop! - in
furious helplessness I struck the bed with my hand, and the instantly
returning pain made me clench my teeth and let out a barely audible groan.
The warlock froze.
- Do you want me to stay? - I heard a rather unambiguous hint in his words.
- Do you need me?
The figure in the black mantle turned towards me. Bane raised an eyebrow,
waiting for an answer.
- Hey, go to the devil! - I snapped, unexpectedly even to myself, but there
was no way back now.
- ...
- Get out, - I hissed through my teeth, begging him in my soul not to
listen to my words.
- Fine, - Magnus gave a conciliatory shake of his head. - Forgive me.
He came back to my cot.
I measured him with an angry look and "entirely by accident" met his
beautiful eyes. His gaze pierced straight into the soul and made you submit
to its owner''s will. I could not have described it any other way, though
perhaps it only worked that way on me.
- You have beautiful eyes too, - he complimented.
- Enough! - I bit my lips, feeling the colour rush to my face.
- Yes, sorry, - Magnus caught himself. - I see you do not like that. Well
then, I will have to play by your rules. - Bane shrugged and smiled rather
foolishly. All I could do was shake my head.
- There, at the vampires'' party... - I hesitated, - How did you know that
I am... gay?
- A silly question, do you not think? - he smiled sweetly, while his
enveloping voice once again took me captive.
- Yes, I suppose. - I answered hastily, and added as if in passing, - What
did you call me back then?
- A fiery little beast, - the warlock seemed to have been waiting for that
question.
I pulled a sceptical face, while Bane straightened up and grew serious.
And unexpectedly he laid his hand on my palm, making me tense up
involuntarily.
- You know, - he confessed frankly, - I do not do charity, and I do not
save Shadowhunters from death...
For a second he fell silent, apparently weighing whether to continue the
confession or drop it while it was not too late.
I tilted my head slightly as a sign of extreme interest, and the warlock,
sighing, went on.
- I did not care about Hodge''s requests, and at first I refused him...
But... after he told me the name... Had anyone else been in your place, I
would not be here.
Magnus pierced me with his gaze. And then I made up my mind.
- You could have put it more simply, - I remarked.
- Meaning?
- You are not indifferent to me either.
The warlock''s shoulders shook with laughter.
- What is wrong this time? - I protested, freeing my hand.
- You are not very eloquent either. No matter. I did not come here to talk.
Bane ran his palm along my cheek, which sent heat through me, then leaned
down and carefully touched my lips.
I leaned forward, demanding more, but he unexpectedly pulled away.
- What now? - I rolled my eyes in displeasure.
- I like making you angry, - Magnus admitted, curving those so desirable
lips into a smirk. - You are so sexy when you get worked up.
- You are simply taking advantage of my condition. If I could get up...
The warlock did not let me finish the threat and covered my lips with his
again, with renewed passion. This time, overcoming the pain, I wrapped my
arms around his neck and pulled him closer.
He smiled slightly without breaking the kiss. My head kept spinning.
At last, breathing heavily, Bane pulled away.
- While you are in this state, do not count on anything more, - he
threatened with a smile.
- As if you are doing me a favour. - I took up the challenge. - You need
this no less than I do.
- You grow more insolent by the minute, - the warlock delivered his
verdict.
- You still do not know me well.
- Until we meet again, then. I admit, I will miss you, - for the second
time that evening Magnus Bane headed for the door. And this time I did not
stop him.
- Yes, me too, - I breathed out, regretting that I could not see his face
now.
The door closed behind Bane. I was left alone, and a couple of minutes
later I blacked out again.
Almost two months passed since that day. I tried not to think about the
chances of meeting the warlock. I was too proud to make the first step
towards him... Or too shy, but all the same, much in my life had changed
after our meeting. First of all, I began to look at Jace differently. He
did not attract me the way he used to, though I undoubtedly still liked
him. My fantasies, however, were occupied exclusively by that arrogant,
flamboyant character - Magnus Bane. Even if I tried mentally to send him to
hell and throw him out of my head, with time I only grew more convinced
that it was impossible.
- Alec, are you here? - Jace appeared in the doorway.
- What happened? - I did not turn to look at him: I did not want to see
him.
"Really?"
- Hodge said we are going hunting tonight. Are you with us?
- Naturally, what kind of question is that?
- It is just that you look tired, - the boy noted with sympathy.
"As if that worries you", - I thought bitterly.
- You are imagining it.
A tense silence hung in the air. I wanted him to leave. For the first time
in my life I did not want to talk to him. To the person I had, until
recently, dreamed of dragging into bed.
God!
- Listen, - he entered the room and closed the door behind him, - we are
friends... we are brothers. If you have some kind of problem, you should
not keep it inside. Talk to us. To me. I am sure that together we can
easily handle your... depression.
Jace put his hand on my shoulder. He used to do that often, and I liked it,
but now his interference in my personal life only made things worse. Lately
I had been trying with all my strength to quiet the feelings for my
adoptive brother that had tormented me for so many years. Bane had to be
sure that I needed no one but him.
- I do not have any problems, I just want to be alone. Can you leave me in
peace? - I tried to say it as gently as possible.
"Actually, he wants to help. It is not his fault that I am gay".
- I understand your feelings... - he began cautiously.
- No, - I cut him off, suddenly turning to look at him.
He even took a step back.
- You do not understand me, and I do not understand you and Clary. It is
perfectly logical: these are completely different feelings, and no one
blames you for what I am going through, but I will deal with it myself. I
know what I want.
- Fine, - Jace lowered his eyes.
I had never seen him do that before.
- Your business, - he shrugged and, stepping out the door, left me alone
once more.
I do not know how long I stood there wondering whether I had done the right
thing and whether I had hurt my brother''s feelings.
The door to the room opened. I decided Jace had come back, threw a
displeased glance at him and... froze in amazement.
On the threshold stood Magnus Bane, dressed, as always, in his finest
tradition: tight black low-waisted trousers with a wide, glitter-strewn
belt, a white tank top, and a leather jacket with countless chains and
studs. His bold make-up consisted of dark eyeliner above and a thicker
gold-and-pearl line below.
I understood that I should greet my guest somehow, but, as if I had
swallowed my tongue, I kept staring at him in silence.
- I came to check how you are feeling, - Bane announced without unnecessary
formalities.
"My heart dropped to the floor and rolled to his feet, leaving a bloody
trail".
- If you wanted to check how I feel, you would have come earlier. - I
answered with notes of resentment. - Do you simply have nothing to do? So
you decided to take a walk to the Institute?
- I could have not come at all, - the warlock observed.
That prospect did not inspire me in the least.
- I have feelings too, Alec. I could waver, doubt whether I was doing the
right thing and whether I really wanted this at all. - Magnus came closer
and stopped in front of me.
I stood with my head bowed and my eyes lowered. Who was I to accuse him of
anything? An egoist. All this time, suffering and dreaming of a meeting, I
had not spent a second thinking about what he felt. How hard it was for him
to admit it to himself. He has pride too, pride he had to step on in order
to come here. To me. Again.
- You would never have come to me yourself, - straight to the point again.
"Yes".
- Forgive me, - I whisper, barely audibly.
The warlock studies me with a cat'
- source_sentence: 'And what if we celebrated New Year together, hm?
    You know, I would not want us to spend it as guests at the home of one
    of us, no.
    That would mean running around, plus the cleaning, the cooking and the
    rest, and that is not quite what I would want, not the right thing, not
    the ideal-unideal-simple-unsimple thing. Maybe, if some powers above
    let us go to friends, we would sit around and chat. That would be
    great, you know? Just sitting and saying whatever you want, and then we
    would go to a room, taking some drinks and treats with us. Whatever you
    would choose, because you know I cannot drink. Baileys, maybe? And I
    would either not take more than a sip, or drink far too much. I have
    loved chocolate since not so long ago. We would sit on the bed, and you
    would take off your heels, high and very cool, complaining to me about
    the tiredness and the pain they cause. I would feel sorry for you, but
    I still could not resist gloating a little that I am wearing flats.
    Maybe even just socks, since we are at friends''. You would put your
    feet up on the bed, maybe bend your knees, and that would look very
    sweet in a luxurious dress. We would be very hot, but, you know, not
    because we are close to each other or anything like that, no, simply
    because the chaotic and sometimes crowded New Year always does that.
    And it is supposed to be a winter holiday.
    I would want to hold your hand, because I love your tiny hands, like a
    child''s, except with a grown-up manicure and grown-up rings that get
    in the way of squeezing a hand tightly. And you would allow it, right?
    Hah, you might even prick me with your long nails; we would start
    tickling each other, I am sure of that, and it is hot enough as it is,
    oof. Hair dishevelled, damp with sweat, ew. But great.
    We would chat all night, we do love doing that, right? Maybe we would
    recall some of our old jokes from the internet, recall how we talked
    before something bigger began to grow between us than "Hi, clo*, how
    are you? Fine, and you?". We would, I think, sit with our gadgets, look
    through photos. If I came across something I would not want to show you
    - a bad photo, for example - I would start hiding my phone, and you
    would try to snatch it to look anyway. And vice versa. And towards
    morning we would fall asleep.
    Oh, you know, we once called ourselves a couple. We were never a
    couple, hah? Devilishly, devilishly silly. But who are we then? Nobody
    knows, because, as you once said, my beloved vanilla girl and lover of
    status quotes, "To define is to limit". That is Oscar Wilde, I think; I
    like him.
    *derived from "clone", analogous to "bro". The point is that we had a
    lot in common.
    Describe a walk? Let it be summer. Hot Piter, hm? Very hot, like last
    year, when there was nothing to breathe and I practically bathed in the
    Neva and walked around the city soaking wet. At Burger King they did
    not even pay much attention to it, as I remember - they understood. We
    would leave the house and ride to the metro on a marshrutka. On a
    marshrutka, because we simply would not survive the forty-minute walk
    from my house to the metro, though I have been through that before. And
    it is terrible. In the metro we would buy tokens and argue a little
    about Petersburg tokens versus Moscow cards; I would win. Because, you
    know, tokens circulate in a closed loop; you hardly need to keep
    producing them. Cards are single-use: they get thrown away and new ones
    have to be made. We would leave it at that and keep joking about the
    "feud" between Piter and Moscow - our favourite. And, you know, Piter
    is better. Entering the carriage, we would sit down, since the station
    I live at is almost the end of the line. One of us would put her head
    on the other''s shoulder, which would be very sweet and, possibly,
    uncomfortable. In favour of your metro you would also say that you have
    WiFi and an endless line. I would agree, since I think so too. Too much
    about the metro, is it not, hah?
    We would get off at the right station and die of the heat, dragging our
    bags around. Entering the Galereya mall (ours is big and crowded; I
    call it "the place of all meetings"), we would, I think, head to the
    food court on the next-to-last floor. We would spend a long time
    deciding where to eat, though... no, no, not long. You would take
    everything into your own hands and drag me to... not McDonald''s, you
    would not want to stand in the queue. Maybe pancakes or
    kroshka-kartoshka? I do not know. Perhaps we would choose Burger King
    or KFC, because there you can refill your water any number of times. We
    would take one cup for the two of us and fill it with Sprite, our
    special drink. We would have two straws in it. And also, we would hum
    with pleasure, sipping the blissful and, above all, cold Sprite. It
    would be sweet; we would even say so out loud. After eating, we would
    fill a whole cup of Sprite once more and go wandering around the mall.
    Let us imagine we have a lot of money with us, hm? Right next to the
    food we would go into the section with lots of sweets: lollipops,
    marmalades, caramels, chewing gum, sweet "laces", candies. We would
    grab lots and lots of everything, so we could walk around the shops
    eating it all. We would also take something for my niece, who would be
    sure to ask: "And what did you buy me?".
    We would go around the shops; I love "H&M" and "RESERVED" and do not
    love "Zara". Though most likely it is exactly the one you love most of
    all. But I would be terribly bored in it. I would buy myself some
    T-shirts, and you some shoes. I know how you love shoes. I would keep
    pouncing on everything plaid, remembering the Winchesters*, and you
    would joke with me on that theme too, only without excessive fanaticism
    and chuckling at me. And I would find Misha''s favourite shirt, and
    Jared''s favourite beanie, and Jensen''s favourite jumper. Tired, we
    would return home and, it would be logical to assume, collapse into
    bed. But no, of course not, by no means. We will go to bed very late,
    because you cannot sleep when we are together; in sleep we lose time.
    Right?
    *characters of the series "Supernatural". What follows is a list of the
    show''s lead actors.
    I have missed you, so let us spend some more time together in written
    form - you do not mind? Only I do not know where we should go. Let us
    just lie on the bed.
    So, we are just lying there, and between us is silence, our favourite
    silence. The kind of quiet that cannot be called awkward. We do not
    have that kind. Ours is rather a familiar, warm, habitual, cosy one.
    Even when we talk about something genuinely, so to speak, awkward and
    nerve-racking, we can fall silent, and again it will not be anything
    out of place.
    I would sit with my iPad, I think, and you with your phone. Oh yes, we
    would argue about which is better - Apple or Android. We would take
    each other''s gadgets, hunting for flaws and inconveniences in them. I
    hope I won that imaginary argument. Damn, or maybe we will watch a film
    or a series? So many times in our chats I voiced the wish to watch
    something with you in person; I wanted it so much, and I still do. Even
    at a distance we managed a few times to watch series and films that
    way. And so often we did not finish them; you are always running off
    somewhere on business. Of the two of us, only I am the idler who sits
    at home, leaving the place only for classes.
    Hm, so what would we watch? Let us do "Faking it"; I have long wanted
    to see it. Or, if you allow me to put on something I have already seen,
    I will sit you down to watch "Shameless". I really do want you to see
    it. I would try not to give spoilers, but it is good that you are
    relaxed about them, and I do not have to fear for my health if I let
    something slip. I will keep saying: "Here, here, watch, something is
    coming". And you, I hope, will shut me up. That sounded, for some
    reason, as if with a kiss, but no.
    Hey, and we will also have crisps and "our Sprite". Though I would
    rather take something baked, not crisps. But you, I know, love them. If
    we sit like that until deep into the night, we will want to sleep,
    probably. And if we do not switch it off and lie down properly, we will
    simply fall asleep to the background voices of Noel, Cam, Emma and
    other people familiar-unfamiliar to us. Damn, no, we will be watching
    it dubbed. What a pity. I will wake up earlier than you, but I will not
    get up; I will wait until you wake. I will sit with the iPad, maybe.
    And when you finally wake up,
ั ัะบะฐะถั, ััะพ ัั ะฟัะพััะพ ะฝะตะฒะตัะพััะฝะฐั ัะพะฝั. ะฏ, ะบััะฐัะธ, ั
ะพัั, ััะพ ะฑั ะธะผะตะฝะฝะพ ัั ัะดะตะปะฐะปะฐ
ะทะฐะฒััะฐะบ. ะขั ะถะต ัะผะตะตัั ะณะพัะพะฒะธัั, ัั ะฝะต ั.'
sentences:
- 'Do not listen to me, please! I want to tell you something, so do not listen to me. I want to say it to you, directly to you, but so that you do not hear. Or so that you forget it a moment later.
I LOVE YOU
To shout it. Inside your head. To acquaintances. Not to you. To everyone, to everything, always, but not to you, never to you. God, do not listen to me. Get distracted from me right now, so that I can tell you what I want and what I feel. Quietly. Inside it will be very loud, but I cannot say it out loud at that volume, not in front of you. You have to look away; then I will say it quietly and you will not hear. Maybe you will ask: "What did you say?" And I will answer: "Nothing, nothing". And you will not ask again, will not beg me to repeat it, because you are used to it. I often mutter to you under my breath and do not answer what I said when you ask again. It used to infuriate you so much. Maybe it still does, but you no longer pay attention.
I have so much to tell you. But when you say: "Well, tell me something. Anything. Come on", I fall silent. I do not know what to say. Many minutes later I can tell you something. For example, some silly dream, or my impressions of a book I have just finished. And you love books.
God, how I hate that you hate yourself! Enough!
Stop! It hurts me to watch it.
But stop doing it not for my sake, for your own. You know that is the only way it will be right. Do not hate yourself, please; do not blame yourself when you know you are not to blame. Why are all circumstances and the whole world always against you, hm? I am so sorry.
I have never met anyone stronger. True, I have not met that many people... But can a person really be any stronger? Well, probably yes, who am I kidding, the world is big, there are many people... but to hell with it, does that matter? No. Only your strength and spirit matter. You are a warrior, you are my warrior.
You even have a T-shirt like that, with "warrior" written on it. Ah, how well it suits you.
Warrior.
Fight, Dean, fight, do not give up. Rest, relax, but do not give up, and live, live, live. You are the most alive person with a deathly tired soul. So alive that next to you I feel that I am alive too. It feels as if before you I was not even living. Because before you everything was far too different. Now I see the world with different eyes. Maybe it is a matter of age, of taste, or whatever. But still, you took an enormous part in building the world around me, because you are an enormous part of that world. You can even hear it, listen to it. I even wrote that to you.
Ah, yes, I both told you and wrote to you that I love you. But we are friends, that is normal. Only that "I love you" that I want to shout out... is not the "I love you" that you say with a smile while ruffling my hair. I love being short. Next to you.
The lines of the band "A Big Great World" come to mind:
I am feeling so small*
Damn, that whole song is just... me. I felt myself in it back when we... stopped talking for a while, through my fault. I was falling apart and even banging my head against the wardrobe, sitting on the floor, I think. Back then other lines joined my cries of love.
Say something, I''m giving up on you**
I do not want to feel that particular line anymore.
Later I sent you the video for that song. Some time afterwards. With it I confessed a great deal to you; it was a letter in a cipher that was never supposed to be deciphered. I was shouting, and you were not listening. Because I know how to shout quietly. And thank God.
*I feel so small/insignificant
**Say something, I am drifting away from you'
- 'Jorge''s POV.
Hi everyone, my name is Jorge Blanco. I work in my father''s office; very soon I will become the boss. I am twenty-four years old, I have a girlfriend, Macarena Miguel; her looks are an acquired taste, but that does not matter, the main thing is that I love her.
Macarena is a very explosive girl, very emotional, not only in life but in bed as well. But I have not shed my status as a womanizer: I keep going to clubs; honestly, I simply adore girls, but I sleep with only one. I also have an ex-girlfriend, Martina; that girl drove me crazy in my school years, she was simply magnificent in bed, only she can do what she does. Fortunately or unfortunately we broke up, quite recently in fact, barely a year and a half ago, but we stay in touch.
Getting out of the warm bed, I headed for the shower after a wild night with Macarena; she pleased me all night long. She has a very weak character; I can push her around, in a good sense of course, I am the one in charge in this union. When I was dating Tini, she was the dominant one; she has a strong character, she can stand up to anyone. After showering I left the bathroom and went to the bedroom.
- Sweetie, are you working today or something? I think you have the day off; where are you going? - my beloved''s voice came from under the blanket.
- I am not going anywhere, I just took a shower. By the way, we are going to a club tonight, - I said and pulled on sweatpants.
- OK, I will be ready at eight. Are your friends coming with us? - she asked, displeased.
My friends are the best people in the world; they are not only my friends but Martina''s friends too. Diego and Ludovica are simply two lunatics, but they are so sweet. Ruggero and Candy do not much like clubs and noisy parties; they like being alone with each other. Despite how different we are, we are best friends. And there is also Mechi, Ti''s best friend, and mine as well.
- Of course, I cannot go without them, and do not tell me they are bad, - I grumbled and left the bedroom. Our house is big, but there is always such a mess here. It is simply unbearable: Macarena hardly ever cleans, she sits at home idle all day, and as soon as you bring up cleaning or cooking, she starts kissing me, and you know what happens next.
- I was not going to anyway, - came from the room. I went downstairs and started looking for my T-shirt, but finding it was simply impossible. After a few minutes I did find it after all. Walking into the kitchen, I dug through the fridge, but found nothing edible there.
- Is there anything to eat in this house? - I shouted, hands on my hips. Macarena was coming down from the second floor, nearly naked, only a bedsheet hiding her charms. Though there was not much to hide: her chest was very small, her legs short; Marti is another matter.
- What are you shouting for? You know I cannot cook. Hire a housekeeper, and anyway, why do you need food when you have me? - she whispered and dropped the sheet; the fabric fell to her feet, leaving Macarena naked. I walked up and kissed her passionately; I liked it rough. I can torment her roughly all night and she will not say a word to me. Tumbling us onto the couch, I moved down to her chest....
- Blanco, up to your old tricks? - a gentle yet brazen voice sounded behind my back. Of course it was Tini; I would know her voice out of a thousand. Getting up from the couch and pulling the brown-eyed girl up with me, I pecked Ti on the cheek.
- What are you doing here? Nobody invited you, - Macarena hissed angrily and pulled the sheet over herself. I was always glad to see Martina; she will always be family to me, but I do not love her.
- Stop yelling. Jorge, I came on business, - she said seriously and looked at me.
When she calls me by name, it means something truly urgent.
- I am all ears, - I said calmly and sat down on the couch.
- Can we talk in private? - she asked uncertainly. What is wrong with her? It is the first time I have seen her like this, unsure.
- Macarena, leave us, - I said; she snorted in displeasure and left the living room. Martina sat down next to me on the couch, fiddling with the hem of her dress.
- Jorge, I have nowhere to live. I sold my house, I want to buy a bigger one; my parents went away and did not leave me the keys. You are my only option. May I stay with you? - she asked with hope and looked at me. She has the most beautiful eyes I have ever met in my life.
- Hm, I certainly do not mind, but if Maca is against it, then I am sorry, - I said. Macarena will refuse, of course, but I am not going to defy my own girlfriend for the sake of my ex.
- I see. Then I will go to Mechi, - she said, deflated, and rose from the couch. Damn, what am I to do, I cannot treat her like this.
- Wait, let us ask her, maybe she will agree. Macarena, come here, - I called and looked at Tini; her face was pure despair.
- What do you want? - she asked rudely, planting herself opposite us.
- Maca, you see, Martina has nowhere to live. Can she stay with us? - I asked and watched her; her face changed at once.
- As if! Let her live with her own friends, - she said rudely, tossing her head up. She fancies herself the queen here. Oh no: it is my house, and my ex will live here.
- Macarena, you seem to have forgotten whose house this is. Martina will live here as long as she needs to; this is not up for discussion, - I hissed and glared at Maca. She seemed flustered and did not know what to say, so she simply ran off to her room.
- Maybe you should not have done that; I could have stayed with Mechi, - a quiet voice sounded by my ear.
I was very angry.
- Quiet. You were told you may, so you may. I am sick of you both, - I shouted and stormed off to the kitchen.
How am I supposed to live now? These two will drive me mad.
Jorge''s POV.
- Where is my bra? - Martina was shouting from the first floor.
- Where are my pants? - and that was already Macarena from the next room. If I had known these two girls would drive me insane within two days, I would never have let Martina move in with us. Macarena was constantly jealous and simply hated Tini, who in turn did everything she could to spite her. For two days straight the women''s voices in my house have not stopped; the neighbors have come by three times already; I do not know what to do with them. On top of that I have had no sex for two and a half days: the moment Macarena and I get started, Marti barges into our room, and I could kill her for it. Right now I am getting ready for work, Martina for her institute, and Macarena for some important meeting, gatherings with her girlfriends they are called, though girlfriends she has precious few.
- Come on, Jorge, come here, help me find my bra, please, - Tini whined from the second floor. I would have suggested she wear Macarena''s bra, but it would be too small for her.
- Martina, I have never even laid eyes on it, - I said calmly and went downstairs. Of course this whole situation does not please me, but it has one upside: I get to see Marti and her gorgeous body naked. A couple of times while she was washing I "accidentally" walked into the bathroom and looked at Tini; she is such a cutie when she is angry. If I did not have Macarena, I would start dating Tini again; to be honest, a couple of times in bed I pictured Ti, her chocolate-scented hair and slender legs, whereas Macarena has neither hair as stunning as Tini''s, nor legs worth bragging about.
- Well, and where should I look for it? - I asked and started hunting for that damned bra of Tini''s.
- Does your girlfriend know how to clean? Why is there such a mess in your house? I cannot live like this, - she moaned and sank wearily onto the couch. Ah, how right she is; Macarena has completely neglected my house, I need to hire a housekeeper. Turning my head and lifting an unidentified item, I discovered a black lace bra, and I understood at once that the thing belonged to Marti, because Macarena has no bras like that: hers are all plain and gray.
- Is this yours? - I asked and raised the little item from the dresser. Marti blushed visibly and tried to snatch her property back from me, but I had no intention of letting her.
- Leave it with me, it is very beautiful, it will come in handy for you someday, I promise, - I whispered into Ti''s ear. She looked at me in surprise, then smiled wickedly. She always understood me at half a word; she never needed my hints spelled out.
- At your service. And now I will go, and I may not come home to sleep tonight; I will stay with Mechi, she says she has missed me. After all, I was in London for quite a while, - she said and disappeared into the bathroom.
- You were in London? Why did you not tell me? - I asked, pulling on my trousers and then my shirt.
- I knew you were having problems with your girlfriend, so I did not call for a long time. Remember, we did not speak for four months; that is when I was in London, and you know, I had a wonderful rest, - she said dreamily and came out of the bathroom wearing a skirt just above the knee, a soft pink blouse and stiletto shoes, her usual outfit for the institute.
- Macarena, we are leaving. I will be back late in the evening, and Tini is staying over at a friend''s tonight, so you can rest all day and then all night too, - I said calmly and put on my polished shoes. In two months I will take my father''s place; I am looking forward to that day. My parents are very respectable people; they simply hate Macarena.
- I am thrilled. Bye, - she shouted from the second floor. I left the house and headed for my car. Settling into it, I waited for Tini; sometimes it seems to me that she is not the ex but the real one. There is no love, of course, but the attraction to her as a woman is enormous.
A minute later Martina came out of the house, muttering something under her breath as she went.
- Your girlfriend is simply an idiot: she cannot cook, cannot clean, cannot even put her thoughts together properly. How can you date her? - she asked irritably, settling into my car.
- But in bed she is a bombshell, - I said and pulled out of the driveway.
- I see, - she said and turned away. Is she offended or something? I did not ask anything further, just continued on toward the university.
Having finished my working day, I slumped wearily against the back of my chair and sighed heavily.
It is eight in the evening and I do not feel like going home at all. Every day the same thing; I want some variety. The phone lying on the desk began to ring insistently; glancing at the screen I saw an unknown number; pressing accept, I raised the phone to my ear.
- Hello, - I said quietly.
- Um, good evening. Do you know girls by the names of Martina Stoessel, Mercedes Lambre, Ludovica Comello and Candelaria Molfese? - a male voice sounded from the other end of the li'
- 'Today, jumping on the bed, Mita broke it. She was desperately trying to jump high enough to reach the ceiling, but nothing worked, and sawdust just poured uselessly onto the floor. No one heard the screech of the springs or the crash; the damage itself was nowhere to be seen.
Her mother, scolding her daughter, got nothing in reply but tired indifference, which of course sent her over the edge. Shouting something unintelligible, she kicked at the broken thing. The woman did not understand that she was only making it worse, but the anger in her blood took over.
- How dare you, you rotten girl! All I ever did was look after your bed! And you decided to wreck the place?! You know what?! I will not let this go! - with these words the woman ran out of the room, all but tearing the door off its hinges.
Mita dropped sharply to her knees. Pressing her hands against the bed, she tried to hold back its unbearable screeching.
Taking a hammer and nails from the pantry, the girl pounded hopelessly at the broken pieces, trying to join them together somehow. But it was all futile: under the pressure of the nails the fragments only split apart with ever greater eagerness.
She lay down on the floor. A light draft tickled her back.
- I will never be able to jump up to the ceiling, - Mita said and sighed.
- And what if that is not so?
Mita sprang to her feet. A mask of bewilderment appeared on her face, and a small flame of fear began to kindle in her chest. Where was that voice coming from?
- Do not be afraid, silly, - the voice was very soft.
- Where did you come from? I have never heard you before...
- Does that really matter?
- What, it does not?
- Why should it matter? Can one not simply talk with you?
- You think I am going to talk to an unfamiliar voice?
- And why not?
- Enough. I am getting tired of this game of questions. Say what you are, or who.
Suddenly there was silence, followed by a long humming. The voice began to sing a little tune: not a song, precisely a little tune. Mita''s favorite tune, the one she would strike up every time something broke in her room.
- I can build you a new bed. Far better than this one. It will have lots of flowers and sweets...
The girl perked up. Notes of joy crept into her speech.
- Really? You will do that?
- Yes, only...
- Only what?
- Only it will not be real. You will not be able to sleep on it, but it will be in your room. - the voice cleared its throat. - Ah, yes. No one but you will see it.
The girl smiled thoughtfully.
- But when will I get to see your bed?
The voice began to laugh. Hard, long, yet softly. That laughter was very, very strange: kindly, it seemed, and yet with a sneer in it.
Pity.
Pity ruled it.
- Why are you laughing?
- Because you are a silly girl who cannot even decide.
- I am not silly at all!
- No? Then answer: do you need what I am offering?
- But that is not a real bed at all! - Mita pressed her hands to her face. - On it I will not be able to jump up to the ceiling!
The voice burst out laughing again.
- WHY DO YOU KEEP LAUGHING ALL THE TIME?!
- Because you have already decided. Decided long, long ago.
- And what exactly have I decided?
- You agree, do you not?
Mita hesitated but squeezed out an uncertain "yes" all the same.
The voice vanished, leaving behind it an enormous bed with a huge mattress and soft pillows. On a bed like that one could, without a doubt, jump all the way up to the ceiling.'
- source_sentence: 'The heavy drapes slid apart with a soft rustle, letting the light in. It was gray-white and stung eyes accustomed to darkness. I put out the candles standing on the night table and rose from the bed.
The day promised nothing unusual. As always, I woke a few moments before the general reveille. On autopilot I tidied myself up and put on my long black robe, belting it with a silver cord - the mark of the reapers. On my head I placed the hateful circlet that pressed hard on my forehead, but without which I could not appear outside my cell - it marked my belonging to the Higher caste. The final touch - the Ring of fidelity.
My cell was located on the most prestigious tier - the lowest one. The locals also called it the Cradle of shadows. Only reapers of the Higher caste had the right to be there - those whom Death had chosen as her counselors.
Every dawn I climbed to the next tier, where I received instructions from the Elders, and then set about my work. I am a gatherer of souls.
I crossed the training hall where the young reapers were taught. Once I too sat on those benches and hung diligently on the professor''s every word. It was probably precisely my diligence and persistence that helped me rise to the Higher caste.
The place where we dwelt was called the Temple. It lay outside space and time. The Temple was made of white, red and black marble and had very high vaults. In places they reached such a height that they could not be seen at all. Such vaults were called heavens.
The Temple''s inhabitants lived in cells - spacious round rooms. Since time did not exist for us, yet biological clocks still ticked inside, dawns and dusks had been devised. The Temple was wrapped in a white, luminous veil, so at dusk, when it was time to go to sleep, the drapes on all the windows slid shut. At dawn, conversely, they slid apart. All of this happened at a single time common to everyone, and changing the closing and opening of the drapes, even in one''s private cell, was impossible.
At last I was on my tier. Today the hall was surprisingly empty - inside there was only a single Elder. When I entered, he was standing with his back to me, head thrown back, contemplating the heavens.
- You ought to be quicker, brother Richard, - he said calmly, without taking his gaze from the heavens. - Especially when such an assignment has been set aside for You.
His words made me stir: they promised something more complicated than the simple gathering of souls.
- And what is it, then? - I asked, trying to lend my voice an air of indifference.
- Do not flatter yourself, this is no mere amusing game, - the Elder finally turned and looked me in the eye. - You will be dealing with a most unusual specimen.
Slightly lifting the hem of the robe tangling about his legs, the Elder stepped closer to me.
- We have long been hunting this man''s soul. Death has entered him into her lists more than once, yet in some unfathomable way he has already slipped away from us four times. The most experienced reapers failed at the task. So Death gave the order to send to this man You, precisely, - the Elder stressed the last word. - Do not fail her. Otherwise You know what awaits You.
He cast a glance at the Black eye. That is the well situated at the center of this hall. It served as punishment for reapers unfaithful to Death or simply unfit for her. They were subjected to the so-called Reincarnation. The guilty one was thrown into the black maw of the well, after which he vanished from the Temple forever. Rumor had it that the executed gained a new life on earth, only in another guise (hence the procedure was called Reincarnation). However, there was no one who could confirm or refute this, and so everyone feared being subjected to Reincarnation.
- I have never once given You or Death cause to doubt my professionalism. The assignment will be carried out, - I said dryly.
- It is pleasant to hear the confidence in Your voice, but do not lose your vigilance. This time Your foray into the world of men will be a long one. You will be given as much time as is required, but remember - Death does not like to wait. You must gain this man''s trust and come to know him as well as possible, so that his soul itself reaches out to You. Then You will be able to take it unhindered, and the assignment will be considered fulfilled.
Glancing around furtively, the Elder bent toward my ea
ั ะธ ัะตะฟะฝัะป:
- ะ ะตัั ะฒ ัะปััะฐะต ััะฟะตั
ะฐ ั ะฟะพะทะฐะฑะพัััั ะพ ัะพะผ, ััะพะฑั ะั ะฟัะธะผะบะฝัะปะธ ะบ ะฝะฐะผ.
ะะฝ ะฝะฐะผะตะบะฐะป ะฝะฐ ัะพ, ััะพ ะฟะพะผะพะถะตั ะผะฝะต ะฟะพะปััะธัั ัะฐะผัั ะฒััะพะบัั ะดะพะปะถะฝะพััั ะฅัะฐะผะฐ - ะดะพะปะถะฝะพััั
ะกัะฐัะตะนัะธะฝั.
ะฏ ะบะพัะพัะบะพ ะบะธะฒะฝัะป.
- ะฏ ะผะพะณั ะฟัะธัััะฟะธัั ะบ ะทะฐะดะฐะฝะธั?
- ะ ะฐะทัะผะตะตััั. ะกะปัะถะบะธ ัะพะฑะตััั ะะฐะผ ะฒัั, ััะพ ะฟะพััะตะฑัะตััั ะดะปั ะปัะดัะบะพะน ะถะธะทะฝะธ, ะธ ะั
ะผะพะถะตัะต ะฝะตะผะตะดะปะตะฝะฝะพ ะพัะฟัะฐะฒะปััััั.
ะฏ ัะฝะพะฒะฐ ะบะธะฒะฝัะป ะธ ะฝะฐะฟัะฐะฒะธะปัั ะบ ะฒัั
ะพะดั. ะฃะถะต ั ัะฐะผัั
ะดะฒะตัะตะน ั ะพะฑะตัะฝัะปัั ะธ ัะฟัะพัะธะป:
- ะ ะบะฐะบ ะทะพะฒัั ะผะพะตะณะพ ะฟะพะดะพะฟะตัะฝะพะณะพ?
- ะขะธะปะปั ะะธะฝะดะตะผะฐะฝะฝ.
ะะดะฒะฐ ั ะฟะพััะฒััะฒะพะฒะฐะป, ััะพ ัะฒััะดะพ ััะพั ะฝะพะณะฐะผะธ ะฝะฐ ะทะตะผะปะต, ะบะฐะบ ะฝะฐ ะผะตะฝั ะฝะฐะปะตัะตะป ัะธะปัะฝัะน
ะฟะพััะฒ ั
ะพะปะพะดะฝะพะณะพ ะฒะตััะฐ ะธ ะฝะฐัะฐะป ะฝะตัะฐะดะฝะพ ัะตะทะฐัั ะพะณะพะปัะฝะฝัะต ััะฐััะบะธ ะบะพะถะธ. ะฏ ะฟะพััั ะทะฐะผััะทัะธะต
ััะบะธ, ะฟัะธะถะฐะป ะธั
ะบะพ ััั, ััะพะฑั ัะพะณัะตัั ัะฒะพะธะผ ะณะพัััะธะผ ะดัั
ะฐะฝะธะตะผ, ะฝะพ ะพั ััะพะณะพ ะฑัะปะพ
ะผะฐะปะพ ัะพะปะบั. ะจะฐะฟะบะพะน ะผะตะฝั ัะฝะฐััะดะธัั ะฝะต ัะดะพััะถะธะปะธัั, ะฟะพััะพะผั ะณะพะปะพะฒะฐ ัะพะถะต ะพัััะฐะปะฐ
ะฒัะต ะฟัะตะปะตััะธ ะฒัะดะฐะฒัะตะนัั ะฒ ััะพะผ ะณะพะดั ัััะพะฒะพะน ะฝะตะผะตัะบะพะน ะทะธะผั.
ะฏ ะพัะผะพััะตะปัั. ะะพะบััะณ ะปะธัั ะทะฐััะฟะฐะฝะฝัะต ัะฝะตะณะพะผ ะดะตัะตะฒัั. ะะธะบะฐะบะธั
ะฟัะธะทะฝะฐะบะพะฒ ัะพะณะพ, ััะพ
ะฟะพะฑะปะธะทะพััะธ ะบัะพ-ัะพ ะพะฑะธัะฐะตั. ะัะบะธะดัะฒะฐัั ะผะตะฝั ะฟััะผะพ ะฝะฐ ะฟะพัะพะณ ะพะฑัะตะบัะฐ ะฑัะปะพ ะพะฟะฐัะฝะพ
ะธ ะฟะพะดะพะทัะธัะตะปัะฝะพ, ะฟะพััะพะผั ั ะพะบะฐะทะฐะปัั ะฒ ะฝะตะบะพัะพัะพะผ ะพัะดะฐะปะตะฝะธะธ ะพั ะผะตััะฐ ะฝะฐะทะฝะฐัะตะฝะธั.
ะ ัะตะฟะตัั, ััะพั ะฒ ััะพะผ ะทะฐัะฝะตะถะตะฝะฝะพะผ ะปะตัั, ั ะฟะพะฝััะธั ะฝะต ะธะผะตะป, ะบัะดะฐ ะผะฝะต ะฝะฐะฟัะฐะฒะปััััั.
ะฏ ัะตัะธะป ะฟะพะดะดะฐัััั ะธะฝััะธัะธะธ, ะบะพัะพัะฐั ะผะตะฝั ัะตะดะบะพ ะฟะพะดะฒะพะดะธะปะฐ. ะะฝะตัั ะธะผะตัั ะผะฝะพะณะพ ัะฟะพัะพะฑะฝะพััะตะน:
ะฝะฐะฟัะธะผะตั, ะผั ัะผะตะตะผ ะฟะพะดะฐะฒะปััั ัะพะทะฝะฐะฝะธะต ะปัะดะตะน ะธะปะธ ััะฐะฝะพะฒะธัััั ะฝะตะฒะธะดะธะผัะผะธ, ะบะพะณะดะฐ
ะฝะฐะผ ััะพ ะฝะตะพะฑั
ะพะดะธะผะพ, ะฝะพ ะฟะพัะตะผั-ัะพ ะฒะฝัััะตะฝะฝะตะณะพ ะบะพะผะฟะฐัะฐ ะฒ ะฝะฐั ะฝะต ะฒัััะพะตะฝะพ.
ะะพะณะธ ะฟัะพะฒะฐะปะธะฒะฐะปะธัั ะฒ ะณะปัะฑะพะบะธะต ััะณัะพะฑั. ะงะตัััั
ะฐััั, ั ะผะตะดะปะตะฝะฝะพ ะฟัะพะดะฒะธะณะฐะปัั ะฒะฟะตััะด,
ะฒะฝะธะผะฐัะตะปัะฝะพ ะฒัะธัะบะธะฒะฐั ะฒะดะฐะปะตะบะต ั
ะพัั ะบะฐะบะธะต-ัะพ ะฟัะธะทะฝะฐะบะธ ัะตะปะพะฒะตัะตัะบะพะณะพ ะถะธะปัั, ะพะดะฝะฐะบะพ
ััะตัะฝะพ. ะฏ ะผััะปะตะฝะฝะพ ะฟะตัะตะฑะธัะฐะป ะฒัะต ัะต ะทะฝะฐะฝะธั, ะบะพัะพััะต ะบะพะณะดะฐ-ัะพ ะฟะพะปััะธะป ะพะฑ ััะพะผ ัะตะณะธะพะฝะต,
ะฝะพ ัะฐะบ ะธ ะฝะต ัะผะพะณ ะฟัะธะฟะพะผะฝะธัั, ััะพะฑั ััั ะฑัะปะธ ั
ะฐัะฐะบัะตัะฝั ัะฐะบะธะต ะฟะพะณะพะดะฝัะต ััะปะพะฒะธั.
ะะฐะถะต ะฟะพะดัะผะฐะป ะพ ัะพะผ, ััะพ ะฒััะปะฐ ะพัะธะฑะบะฐ, ะธ ะผะตะฝั ะทะฐะบะธะฝัะปะธ ะฒ ะบะฐะบะพะต-ัะพ ะดััะณะพะต ะผะตััะพ.
ะัะตะผั ัะปะพ, ะฐ ะฟะตะนะทะฐะถะธ ะฒะพะบััะณ ะฝะต ะผะตะฝัะปะธัั, ะทะฐัะพ ะฝะฐัะธะฝะฐะปะพ ัะตะผะฝะตัั. ะฏ ัะถะต ะฝะฐัะฐะป ะฑะตัะฟะพะบะพะธัััั,
ะฝะพ ะฝะต ะดะฐะฒะฐะป ะฟะฐะฝะธะบะต ะฟะพะปะฝะพัััั ะพะฒะปะฐะดะตัั ะผะฝะพะน. ะขะพะปัะบะพ ะบะพะณะดะฐ ะฝะฐะด ะปะตัะพะผ ะฟะพะดะฝัะปัั ะพะฟะพะปะพะฒะธะฝะตะฝะฝัะน
ะดะธัะบ ะปัะฝั, ั ะพะฑะตััะธะปะตะฝะพ ะฟะพะฒะฐะปะธะปัั ะฝะฐ ะทะตะผะปั, ะฟะพะดะปะพะถะธะฒ ะฟะพะด ัะตะฑั ััะบะทะฐะบ, ะฒ ะบะพัะพัะพะผ
ะฑัะป ัะพะฑัะฐะฝ ะผะธะฝะธะผัะผ ะพะดะตะถะดั. ะ ัะบ ั ัะถะต ัะพะฒะตััะตะฝะฝะพ ะฝะต ััะฒััะฒะพะฒะฐะป, ะฐ ะณะพะปะพะฒะฐ ัะฐัะบะฐะปัะฒะฐะปะฐัั
ะพั ั
ะพะปะพะดะฐ. ะ ะผััะปัั
ะบัััะธะปะพัั, ััะพ ัะฐะบะพะน ะผะฐััะตั ัะฒะพะตะณะพ ะดะตะปะฐ ะบะฐะบ ั ะฝะต ะผะพะถะตั ัะฐะบ
ะณะปัะฟะพ ะฟัะพะฒะฐะปะธัััั, ะฝะพ ัะธะป ะดะฐะปััะต ะดะฒะธะณะฐัััั ะฝะต ะฑัะปะพ.
- ะฏ ะปะธัั ะฝะตะผะฝะพะณะพ... ะพัะดะพั
ะฝั, - ััะฟะพะบะฐะธะฒะฐะป ั ัะฐะผะพะณะพ ัะตะฑั, ะฟััะฐััั ัะฐะทะปะตะฟะธัั ะฝะฐััััะฝะพ
ะทะฐะบััะฒะฐััะธะตัั ะฒะตะบะธ. ะะพ ัะตะปะพ ะผะตะฝั ัะถะต ะฝะต ัะปััะฐะปะพัั. ะะตัะตะด ัะตะผ, ะบะฐะบ ะพะบะพะฝัะฐัะตะปัะฝะพ
ะพัะดะฐัั ัะตะฑั ะฒ ะพะฑัััะธั ัะฝะฐ, ั, ะบะฐะถะตััั, ััะปััะฐะป ัะบัะธะฟ ัะฝะตะณะฐ.
ะัะธะดั ะฒ ัะตะฑั, ั ะฟะตัะฒัะผ ะดะตะปะพะผ ะฟะพััะฒััะฒะพะฒะฐะป ะทะฐะฟะฐั
ะผะพะปะพะบะฐ. ะัะปะพ ะฝะฐ ัะดะธะฒะปะตะฝะธะต ัะตะฟะปะพ,
ะดะฐะถะต ะถะฐัะบะพ. ะัะธะพัะบััะฒ ะพะดะธะฝ ะณะปะฐะท, ั ัะฒะธะดะตะป ะฝะฐะฟัะพัะธะฒ ัะตะฑั ะทะฐะดะพัะฝะพ ะฟะปััััะธะต ัะทััะบะธ
ะฟะปะฐะผะตะฝะธ ะฒ ะบะฐะผะธะฝะต. ะะฟัััะธะป ะฒะทะณะปัะด ะฒะฝะธะท - ะผะพั ัะตะปะพ ะฟะปะพัะฝะพ ะทะฐะบััะฐะฝะพ ะฒ ัะพะปััะพะต ะพะดะตัะปะพ,
ะฐ ัะฐะผ ั ะปะตะถั ะฝะฐ ะบะฐะบะพะผ-ัะพ ะดะธะฒะฐะฝัะธะบะต.
ะะดะต ั?
ะ ะณะพะปะพะฒะต ะฝะตะพั
ะพัะฝะพ ะฝะฐัะฐะปะธ ะฒะพัะพัะฐัััั ัะตััะตััะฝะบะธ ะฒะพัะฟะพะผะธะฝะฐะฝะธะน, ะฟัะพะฒะพัะฐัะธะฒะฐั ะฝะฐะทะฐะด
ัะตะณะพะดะฝััะฝะธะต ัะพะฑััะธั. ะ ะฐััะฒะตั, ะกัะฐัะตะนัะธะฝะฐ, ะทะฐะดะฐะฝะธะต, ะปะตั, ั
ะพะปะพะด, ะพะฑะผะพัะพะบ, ะฟัััะพัะฐ.
ะฏ ะตัั ัะฐะท ะฟะพ ะผะตัะต ะฒะพะทะผะพะถะฝะพััะตะน ะพัะผะพััะตะป ัะตะฑั, ะดะฐะถะต ะทะฐะณะปัะฝัะป ะฟะพะด ะพะดะตัะปะพ. ะัะพ-ัะพ
ะทะฐะฑะพัะปะธะฒะพ ะฟัะธัะฐัะธะป ะผะตะฝั ะธะท ะปะตัะฐ, ัััะฝัะป ะฒัั ะพะดะตะถะดั ะธ ัะพะณัะตะป. ะะบะธะฝัะป ะฒะทะณะปัะดะพะผ ะบะพะผะฝะฐัั
- ะผะพะตะณะพ ะดะพะฑัะพะดะตัะตะปั ะฝะต ะฝะฐะฑะปัะดะฐะปะพัั.
ะฏ ะพััะพัะพะถะฝะพ ะฟะพะฟััะฐะปัั ะฟะพะดะฝััััั ะฝะฐ ะปะพะบััั
, ะฝะฐ ััะพ ัะตะปะพ ะพัะพะทะฒะฐะปะพัั ะฝะตะฟัะธััะฝะพะน ะฝะพััะตะน
ะฑะพะปัั. ะกะบัะธะฒะธะฒัะธัั, ั ะฒัั ะถะต ะฟัะธะฒัะป ัะตะฑั ะฒ ัะธะดััะตะต ะฟะพะปะพะถะตะฝะธะต ะธ ะตัั ัะฐะท ััะฐัะตะปัะฝะพ
ะฒัั ะพัะผะพััะตะป. ะะพะผะฝะฐัะฐ ะฑัะปะฐ ะฝะตะฑะพะปััะพะน ะธ ะฒะตััะผะฐ ะฝะตัั
ะพะถะตะฝะฝะพะน. ะะพะฒััะดั ะฒะฐะปัะปะธัั ะบะฐะบะธะต-ัะพ
ะพะฑัััะบะธ, ะฝะฐ ะฟะพะปั ััะพัะปะธ ะฟััััะต ะฑัััะปะบะธ, ะบะพะต-ะณะดะต ะฒะธะดะฝะตะปะธัั ะธ ะณััะทะฝัะต ัะฐัะตะปะบะธ. ะะฑะพะธ
ะผะตััะฐะผะธ ะพัั
ะพะดะธะปะธ, ะพะฑะฝะฐะถะฐั ะดะตัะตะฒัะฝะฝัะต ััะตะฝั, ะฟัะฐะบัะธัะตัะบะธ ะฒัั ะฑัะปะพ ะธัะฟะธัะฐะฝะพ ะธะผะตะฝะฐะผะธ
ะบะฐะบะธั
-ัะพ ะปัะดะตะน ะธ ัะฐะทะปะธัะฝัะผะธ ัััะฐะฝะฝัะผะธ ะฝะฐะทะฒะฐะฝะธัะผะธ, ะทะฝะฐัะตะฝะธั ะบะพัะพััั
ั ะฝะต ะฟะพะฝะธะผะฐะป.
ะะดะฝะฐะบะพ ะฑัะปะพ ะฒะธะดะฝะพ, ััะพ ั
ะพะทัะธะฝ ะปัะฑะธั ัะฒะพั ะฑะตัะปะพะณั ะธ ะฝะธ ะทะฐ ััะพ ะฝะต ัะฐัััะฐะฝะตััั ั
ะฝะตะน.
ะั
ะพะดะฝะฐั ะดะฒะตัั ะฟัะพััะถะฝะพ ะทะฐัะบัะธะฟะตะปะฐ, ะธ ะฒ ะบะพะผะฝะฐัั ะฒะพััะป ะทะดะพัะพะฒะตะฝะฝัะน ะดะตัะธะฝะฐ, ะดะตัะถะฐะฒัะธะน
ะฒ ััะบะฐั
ะบััะถะบั, ะฟะพ ัะฐะทะผะตัะฐะผ ะฑะพะปััะต ะฟะพั
ะพะถัั ะฝะฐ ะฒะตะดัะพ.
- ะััั
ะฐะปัั, - ะฑััะบะฝัะป ะพะฝ, ะฟัะพััะณะธะฒะฐั ะผะฝะต "ะฒะตะดัะพ". - ะะตะน.
ะขัะฟะปะพะต ะผะพะปะพะบะพ. ะฏ ะฝะต ะฑัะป ะฑะพะปััะธะผ ะฟะพะบะปะพะฝะฝะธะบะพะผ ััะพะณะพ ะฝะฐะฟะธัะบะฐ, ะฝะพ ะฒะฝะตะทะฐะฟะฝะพ ะพัััะธะป,
ััะพ ะธะผะตะฝะฝะพ ะตะณะพ ัะตะนัะฐั ััะตะฑัะตั ะผะพะน ะพัะณะฐะฝะธะทะผ, ะธ ะถะฐะดะฝะพ ะฟัะธะฟะฐะป ะบ ะบััะถะบะต. ะะฐัะตะฝั, ัะธะดะตะฒัะธะน
ะฝะฐะฟัะพัะธะฒ ะธ ัะณััะผะพ ะฝะฐะฑะปัะดะฐััะธะน ะทะฐ ะผะฝะพะน ะธะท-ะฟะพะด ะพััะพััะตะน ััะปะบะธ, ัะผะพััะตะป ะฝะฐ ะผะตะฝั ะฝะตัะบะพะปัะบะพ
ะผะธะฝัั, ะฟะพัะปะต ัะตะณะพ ัะฟัะพัะธะป:
- ะะฐะบะพะณะพ ะปะตัะตะณะพ ัะตะฑั ะฒ ะปะตั-ัะพ ะฟะพะฝะตัะปะพ?
ะฏ ะบะพะต-ะบะฐะบ ะพัะพัะฒะฐะป ัะตะฑั ะพั ะผะพะปะพะบะฐ.
- ะะฐัะธะฝะฐ ะฝะฐ ััะฐััะต ะทะฐะณะปะพั
ะปะฐ, ะฟะพััะป ะทะฐ ะฟะพะผะพััั ะธ ะทะฐะฑะปัะดะธะปัั, - ะฒัะดะฐะป ั ะทะฐัะฐะฝะตะต
ะฟะพะดะณะพัะพะฒะปะตะฝะฝัั ะปะตะณะตะฝะดั.
- ะกัะธัะฐะน, ะฟัะพะฟะฐะปะฐ ัะฒะพั ัะฐัะบะฐ, - ัััะบะฝัะป ะฝะตะทะฝะฐะบะพะผะตั. - ะฃ ะฝะฐั ััั ัะตะนัะฐั ะฝะต ัะฐะผัะน
ัะฟะพะบะพะนะฝัะน ะฝะฐัะพะดะตั ะพะฑะธัะฐะตั. ะกะบะพัะตะต ะฒัะตะณะพ, ัะถะต ะฟัะธะฑัะฐะปะธ ะบ ััะบะฐะผ ัะฒะพั ะปะพัะฐะดะบั.
ะฏ ะฟะพััะฐัะฐะปัั ัะดะตะปะฐัั ัะฐะทะพัะฐัะพะฒะฐะฝะฝัะน ะฒะธะด.
- ะญั
, ะฝั ััะพ ะถ ั ัะฐะบ!
- ะขั ะฒ ะฟะพะปะธัะธั ะฟะพะฟัะพะฑัะน ะพะฑัะฐัะธัััั, ะฝะพ ััะพ ะฒััะด ะปะธ ัะตะฑะต ะฟะพะผะพะถะตั. ะะตะฝั ะบััะฐัะธ,
ะขะธะปะปั ะทะพะฒัั.
ะฏ ะฐะถ ะฟะพะดะฐะฒะธะปัั, ััะปััะฐะฒ ะธะผั ัะฒะพะตะณะพ ัะฟะฐัะธัะตะปั.
- ะญะน, ะฝั ัั ะฐะบะบััะฐัะฝะตะต! - ะขะธะปะปั ะดะตัะฝัะปัั ะฒ ะผะพั ััะพัะพะฝั, ะฝะพ ั ะถะตััะพะผ ะพััะฐะฝะพะฒะธะป
ะตะณะพ.
- ะ ั..ะบั
ะผ-ะบั
ะผ...ะ ะธั
ะฐัะด, - ะฟัะพั
ัะธะฟะตะป ั, ะฟััะฐััั ะฟะตัะตะฑะพัะพัั ะฟัะธัััะฟั ะบะฐัะปั.
- ะะฐััััะป ัั, ะฟะพั
ะพะถะต, ะทะดะตัั, ะ ะธั
ะฐัะด, - ะขะธะปะปั ะบะธะฒะบะพะผ ัะบะฐะทะฐะป ะฝะฐ ะพะบะฝะพ. ะะฐ ัะปะธัะต ะฝะต
ะฝะฐ ัััะบั ัะฐะทะฑััะตะฒะฐะปะฐัั ะผะตัะตะปั.
- ะฃะดะฐัะฐ, ะตัะปะธ ะผั ะทะฐะฒััะฐ ะฒะพะพะฑัะต ัะผะพะถะตะผ ะธะท ะดะพะผะฐ ะฒัะนัะธ, - ะฒะทะดะพั
ะฝัะป ะขะธะปะปั. - ะฃะถะต ะฝะตะดะตะปั
ะผะตััั, ะทะฐัะฐะทะฐ, ะธ ะฒัั ะฝะธะบะฐะบ ะฝะต ััะฟะพะบะพะธััั. ะขะฐะบ ััะพ ะพะฑะพะถะดะฐัั ะฟัะธะดัััั, ะฟัะตะถะดะต ัะตะผ
ั ะดะพะบะธะฝั ัะตะฑั ั
ะพัั ะฑั ะดะพ ะจะฒะตัะธะฝะฐ.
ะฏ ะฝะตัะฒะตัะตะฝะฝะพ ะฟะพะถะฐะป ะฟะปะตัะฐะผะธ, ััะพะฑั ะขะธะปะปั ะฝะต ะฟะพะดัะผะฐะป, ััะพ ั ะฝะฐะฟัะฐัะธะฒะฐััั.
- ะขะพะปัะบะพ ัั, ัะปััั, ัะฐะผ ัะตะฑะต ะฑัะดะตัั ะถัะฐัั ะณะพัะพะฒะธัั, ั ัะตะฑะต ะฝะต ั
ะพะทัััะบะฐ. ะั ะธ ะผะฝะต
ะทะฐะพะดะฝะพ ะผะพะถะตัั, - ัั
ะผัะปัะฝัะปัั ะขะธะปะปั. - ะัะตะฝะดะฝะฐั ะฟะปะฐัะฐ, ัะฐะบ ัะบะฐะทะฐัั, ะทะฐ ะฟัะตะดะพััะฐะฒะปะตะฝะฝัั
ะถะธะปะฟะปะพัะฐะดั.
ะฏ ัะพะณะปะฐัะฝะพ ะบะธะฒะฝัะป.
- ะั ะธ ะปะฐะดััะบะธ. ะั
, ะดะฐ, ะฟััะป ะฒะพะฝ ั ะผะพะตะณะพ ะดะธะฒะฐะฝะฐ!
ะะพะผะฝั ะพ ัะพะผ, ััะพ ะผะฝะต ะฝัะถะฝะพ ะฝะฐะปะฐะดะธัั ั ััะธะผ ัะตะปะพะฒะตะบะพะผ ะบะพะฝัะฐะบั, ะธ ะทะปะธัั ะตะณะพ ะฝะธ ะฒ
ะบะพะตะผ ัะปััะฐะต ะฝะต ัะปะตะดัะตั, ั ะฟะพัะปััะฝะพ ะฒััะฐะป, ัะบััะฐะฒัะธัั ะฒ ะพะดะตัะปะพ.
- ะั, ัะพะถะต ะผะฝะต, - ั
ะผัะบะฝัะป ะขะธะปะปั, ะณะปัะดั ะฝะฐ ัะพ, ะบะฐะบ ั ะฟััะฐััั ะฟะพัะดะพะฑะฝะตะต ะทะฐะฒะตัะฝััั'
sentences:
- 'scrtrm - piano #11
In a quiet, quiet corner of the streets (if it was a corner at all) sat two children of roughly the same age, certainly not yet ten, maybe not even nine years old. None of those noisy grown-ups, whom all children for some reason disliked, were around.

Apart from these two, though, teenagers occasionally ran past, or even rode past at full speed on bicycles and skateboards. Some of them rolled by quietly and unhurriedly on roller skates, looking around and eyeing the ice-cream stands.

It was summer outside. June, it seemed, and the children were already playing their hearts out, school forgotten. And those two kids still sat on two benches set parallel to each other, now and then stealing glances at one another and hastily turning away whenever one of them noticed. The rustle of the leaves of the late bird cherry blossoming nearby added a touch of romance to the scene. Older schoolkids laughed at them more than once, never stopping to wonder what these children felt. First love always comes differently.

The boy with blue hair, dressed in a thin light-orange shirt and brown trousers rolled up to the knee, walked over to the girl and asked:

- H-hi. You wouldn't need a cat, would you? A b-blue one, with black eyes and a little backpack on his back?

- N-no, thank you, - the girl answered, turning away. - Besides, I already have a cat. My parents won't allow a second one.

- I see. That cat's name was Happy, by the way. Kind, responsive, understanding. A good cat, all in all.

- And mine - well, not a tomcat, a she-cat - is called Charle. But she's strange: doesn't talk, rude, even... As the grown-ups put it, goodness-knows-what she is. And she doesn't like company.

- I'm sure Happy would have found a way to her and got her talking. And she'd have become just like him.

- I don't know, I - I mean, we - never tried.

- Then let's try and find out! Well, are you with me?

- A-all right!

Nine years later.

- Charle, would you like to become a cat? Or to get one?

- No, Happy, I already have one. One is enough.

- Remembering childhood, eh, Charle?

- You guessed it, Happy! Remembering childhood, and that very place.

- But still, who is that cat?

- And why do you think it's a cat?

- You said so yourself! Forgetful thing!

- I said no such thing!

- You did!

- No!

- Not no - yes!

In the old park, on two parallel benches, sat a young couple arguing about something, and neither side would give in. The park was as deserted as ever; only now and then teenagers rode through on bicycles and roller skates. Outside it was still June, and to the rustle of the late bird cherry the two lovers walked along the old abandoned corner of the streets of the old, long-built park.'
- 'Ka-boom! - another bolt of lightning split the sky with a crash. The foul weather was into its third day. Kagura shivered. She disliked rain - weather like this robbed her of the only happiness available to her: flying. Talking to Kohaku had long since grown tiresome beyond endurance - the boy spoke of nothing but ways to kill Naraku. The girl understood, of course, why he was so fixed on his revenge, but - the wind be her witness! - one could not think about a single thing only! Nor was her mood improved by the fact that since the storm began Naraku himself had been unusually busy: he was creating a new offspring, which worried Kagura considerably, and brewing some exceedingly foul-smelling potion that poisoned the air.

- Kagura! - Naraku's voice made the demoness flinch. - Until my order, not a step outside the castle! I will need you!

- May I remind you that I cannot fly in the rain, - the girl answered, struggling to keep the anger out of her voice. Ever since Naraku had rid himself of Hakudoshi, devouring the child because he suspected him of treachery, Kagura had tried to be as careful as possible.

- I never forget anything, - in the carelessly tossed phrase the Mistress of the Wind heard a threat - it was as if Naraku were reminding her of her attempt to fly off with two shards of the Shikon.

For the next two days everyone was busy with their own affairs: Naraku toiled over his cauldrons, Kohaku trained, Kagura wrestled with her envy of Kanna, who felt no emotions and was not tormented by fear of death and nervous waiting.

Everything passes. The rain passed, and so did the time for brewing the potion and for creating the new offspring. Kagura studied the girl sitting before her and felt that something about her was clearly wrong. Only after a minute did she understand - she could not sense Naraku's new offspring at all! There was no demonic aura, no scent, not the slightest sound - the demoness's keen ears could not even catch a heartbeat!

- Kagura, why are you standing there frozen? I told you - fly and fetch your sister some clothes! - Naraku raised his voice noticeably.

- The castle is full of clothes as it is - let her pick her own! - she shot back.

- I said: fly and fetch them, - the half-demon hissed through his teeth.

"Damn it, what is that spider planning that I'm not supposed to hear? Some nastiness for sure, to kill me all the more painfully! What do I do, what do I do?! Run? No, he has my heart... Devil take it, I'm in checkmate!" - the Mistress of the Wind reasoned as she walked the corridors.

- Kagura-dono, has something happened? - the demoness was so lost in her thoughts that she had not noticed Kohaku appear beside her.

- Kohaku-kun... - the girl hesitated and sank onto one knee - she had no wish at all to look down at the boy right now. - In my room there is a casket... red, with blue birds...

The demon slayer looked attentively into the eyes of one who was, if not a friend, then certainly a comrade-in-arms, and tried to guess what was in her heart.

- ...rectangular. In length... a little longer than my fan, - Kagura held her weapon out on both hands, and the child nodded to show he would remember, - and in width, about like your hand from the fingertips to the base of the palm. Locked. In short, if... though more likely "when"... Naraku kills me, try to pass it on somehow to Sesshomaru. You do remember him?

Kohaku nodded again.

- Tell him it's from me. Let him break the lock - the key is always on me.

- And you?

- You know what dangerous games you and I have started... - Kagura rose, ran her palm over the boy's hair and left the castle. Kohaku stealthily wiped away a tear.

When the girl returned half an hour later with an armful of clothes, the conversation was clearly coming to an end.

- Excellent. Since their strength lies in their unity and in each one's will - hardly anything could destroy them better. After that, even Kohaku could finish them off without trouble! - it seemed even the new offspring's voice was elusive, coming as if out of nowhere and changing timbre every second.

- By the way, I forgot the introductions, - the half-demon was clearly pleased with something, - her name is Chiu. Tonight the two of you act together. InuYasha's company has split up - and that is what will doom them. By the way, Kagura, you have served me well. If you don't botch it tonight, you will get your heart back.

"I wonder, will I even see the dawn?" - the Mistress of the Wind mentally bade her life farewell.

- Kagura, I repeat especially for you - you do everything Chiu orders, - Naraku pronounced with a venomous smirk. The Mistress of the Wind missed the first phrase - at that moment the half-demon was demonstratively handing Chiu the heart of Naraku's first offspring. Instead of answering, the woman swept her fan, and the feather carrying the two demonesses rose into the air.

- Hey, Chiu, so what can you actually do, huh? How are you going to kill InuYasha and the rest?

- Don't even think of attacking me to take the heart - I am stronger, - the other said in her incomprehensible voice. Kagura twitched: "Sharp-witted wretch!"

- And I am not going to kill them. I will make it so that they all seek death themselves. Hurry up, we are expected - or rather, not expected - in three places.

- Three? - the Mistress of the Wind arched her brows in surprise. - InuYasha's group has split into three?

Chiu gave a short cackle. The cackle was as unpleasant as the voice.

- None of your business. We begin with...

- Sango, I am sure their souls rest in a better world, - Miroku said quietly.

- Thank you for the support, houshi-sama.

Exactly a year had passed since the massacre at Naraku's castle and in the demon slayers' village. Sango, leaving InuYasha, Shippo and Kagome in Kaede's village, had gone to visit the graves of her kin and comrades. That the girl had taken Miroku along, the young man took as a mark of special trust and, despite all his desires, tried to behave properly - and so far he was managing. Over the day he and Sango renewed the grave markers, helped, at her request, pull up the plants that had overgrown the graves, and performed the needed rites.

- Ah. I wonder where Kohaku is now? - the slayer sighed.

The monk raised his eyes to the sky: his secret prayers had not been heard. He had hoped to the last that the girl would not remember her brother - every time it happened, she sank into a heavy melancholy.

- Sango-chan, we will find him, without fail. We will find him and tear him from Naraku's clutches, - Miroku embraced the slayer. She sniffled into his shoulder. Kirara began to purr consolingly.

- Houshi... could you... sleep in the same house with me? - sobbing, the girl asked quietly.

- Sango, may I ask...

- I simply don't want to be left alone. I hope you won't take my offer as permission to let your hands wander! - the last phrase Sango pronounced in a much harder tone. Miroku suppressed a sigh of disappointment.

- Very well, I promise to behave in keeping with my spiritual station.

- Thank you. Let's go settle in - it will be dark soon.

- Damn it, what possessed the old woman to head off to who knows where with night coming on! - InuYasha snorted in irritation.

- Mind you, as a miko, I should have gone myself! And I would have, only she forbade it - said there was no point showing medicines from my world to all and sundry, it would raise unwanted questions.

- And is she wrong?

Jingle.

- Shippo, enough already - that rattling toy of yours is driving me out of my mind! - InuYasha shot the kitsune an angry glare.

- Leave me alone! Kagome-chan brought me the toy, and I'm going to play with it! - the child cried, shielding with his body a little ball filled with jingle bells.

- Damn it, my ears are wilting and my head is ringing! Kagome, you fool, why did you bring that rubbish?

- Sit!

Crash.

- It is not rubbish. It is a toy. InuYasha, he is a child - let him play! You could be glad on your own account instead - look how much of your favorite noodles I brought! - the miko tried to switch his attention to something pleasant.

Jingle.

- At least some use out of you and your era, - the half-demon muttered.

- Boor!

- Shrew!

- Sit!

Crash.

- That's the second time already! - the white-haired boy hollered from the floor.

Jingle-jingle.

- You earned it! - Kagome answered in the same tone, rubbing her temples - the ringing of the ball had begun to give her a headache too, but kindness, propped up by stubbornness and an unwillingness to agree with the half-demon right now, would not let her forbid the fox cub his game. - I wonder how Miroku and Sango are doing?

- That lecher has probably let himself go by now, - the boy answered, picking himself up off the floor.

- InuYasha, Miroku-sama knows when to restrain himself.

- Uh-huh. The moment he gets one from our slayer - that's when he starts restraining himself.

Jingle.

- All right, let's sleep! - Kagome demonstratively settled into her sleeping bag, hoping the others would follow her example. Soon the hut did indeed grow dark and quiet.

Kagura and Chiu drew near the first target.

- ...this way...

Sango, hearing a voice she could not place, opened her eyes and saw a human shadow falling upon her. The girl reacted on pure instinct - she kicked the shadow in the stomach with both feet. It grunted.

- Resisting, are we, untouchable? Don't be afraid - I'll make it pleasant for you!

With astonishment the girl recognized Miroku by his voice, and understood why Kirara had not woken her, as the cat usually did when strange scents appeared.

- Hey, did you hit your head? - the slayer asked, stunned. Instead of answering, the young man leapt at Sango again. She rolled off her bedding to the side and winced in pain - shards of some vessel had ended up under her elbow, and a couple had dug into her skin.

"What the hell! Is he drunk, or what? It's barely evening! And what is this ceramic filth doing here - we only just cleaned up, and there was nothing on the floor! - Sango tried frantically to make sense of it, waking fully and twisting away from another attempt to grab her. - That's it! So that's what the brew was!"

- You won't get away now! Make it feel good for me! - with another lunge Miroku managed to pin the girl down. One arm he caught; the other Sango pinned beneath herself in an unlucky roll. The boy pressed the girl's legs down with his own, so she could not strike at all. A trained slayer could have thrown off an ordinary man without trouble, but her present opponent was well conditioned too. With his free hand the young man tore the girl's underclothes.

- Kirara, get him off!

The only reason the cat had not intervened sooner was the special relationship between her mistress and her companion - had anyone else tried to do such a thing to the slayer before her eyes, within a second he would have been lying with his throat torn out and his guts spilled. Now, at last given the order, Kirara transformed in an instant and simply swept Miroku off Sango with a mid-leap blow of her body.

- Hold him, but don't hurt him! - the girl shouted, darting to the corner where her things lay. Most of the'
- 'Who came up with the idea of celebrating Christmas all together? What sadist? Why does he hate me so much?
Outside the windows a blizzard raged, seeming to cut our little house off from the rest of the world. And I wanted to cut myself off from everyone entirely. Just my luck to have agreed to help Richard in the kitchen. I should have foreseen that he would be there with him.
I chopped the vegetables for the salad with particular concentration, trying not to raise my eyes from the cutting board, but I didn't always manage it. I couldn't help myself and glanced at them: Paul had come up behind Richard and put his arms around his waist; Rich gently but insistently pushed his hands away, hinting that they weren't alone just now, yet smiled contentedly all the same. Involuntarily I gripped the knife handle even harder, so hard my knuckles went white, and set about chopping the vegetables ever finer, turning them to mush. I stubbornly tried to drive the feelings and emotions overwhelming me into the farthest corner of my soul, trying to keep them from showing on my face. What do I care about our lovebirds? Nothing at all.
Richard's hand came to rest smoothly on mine. I flinched in surprise and dropped the knife, which clanged plaintively against the floor.
- I think that's quite enough, - the lead guitarist's soft voice gently wrapped itself around my frostbitten heart like a warm blanket and made it thaw. Rich picked the knife up off the floor, laid it on the table, took the board of vegetables from me and headed back to the stove. For some time now Richard had taken to walking around the house bare-chested, and now, as he paraded about the kitchen in an apron over his faintly tanned, naked body, it cost me great effort not to stare at him and not to drool. So, after one short glance at his back, I tried to turn away as quickly as I could.
Lately it had begun to seem to me that Paul was watching me. More than once I had caught his intent gaze on me. Jealous, maybe? Well, let him; let him suffer a little.
- Thank you, Till, - Paul said pointedly, and looked at the door. I don't know whether the gesture was accidental or deliberate, but I took the hint anyway. Nodding sullenly, I went out, pulling the door shut behind me. I had barely taken a couple of steps when I heard pots fall from the table (or be deliberately swept off it by someone's impatient hands). I clenched my teeth and moved farther away from the kitchen.
Schneider somehow managed to talk me into coming down for dinner. I snapped shut the notebook, which had almost no blank pages left, rose reluctantly from my chair and went down to the dining room, gloomily thinking that there was still a week of this nightmare ahead. Why did they have to drag me to this damned country house with nothing but forest around? And with Richard and Paul forever before my eyes... I was sure the sight of that pair at the table would kill my appetite at once. Say, when Paul once again, as if unnoticed, laid his hand on Richard's knee, making Rich's ears turn red, he would start giggling nervously, then drop his fork on the floor as if by accident, bend down to peck his lover awkwardly on the cheek, after which a satisfied, well-fed-cat smile would inevitably appear on Paul's face. I hate it.
The whole band was already at the table, wielding their cutlery over the dinner we had cooked. My portion stood apart, alone - so they hadn't been sure I would come. I took it without a word and sat down, not reacting in any way to the wishes of bon appétit. They seem to think I'm having a creative crisis. That would suit me - they were wary of touching me, leaving me to myself. And there was Paul's hand, slowly creeping to the right, toward Rich's knee. Everything in me turned sour at that, and I barely suppressed the urge to retch. Don't get worked up... Don't get worked up... Don't get worked up!
I flung my fork aside, and a dead silence fell over the table. I knew that for a few minutes no one would say anything: they would all wait for me to leave, so as not to catch the rough side of my temper.
And I left. Why spoil people's mood. They have their festive atmosphere, after all. Tomorrow is goddamned Christmas.
Oliver looked in on me when it was already past midnight. I was sitting at the desk, scribbling furiously in my notebook. Rhyme poured out of me without pause; it was as if I were retching up lines of verse. I felt terrible, but all I could do for myself was write and write, spilling out everything that had built up.
The room was dark: I wrote sitting by the window, and the light of the streetlamp outside was quite enough for me.
- Can I help you with anything?
I barely kept myself from laughing nervously: of course you can, Olli! Strangle Paul. No, better shoot him. Or drown him. Do anything at all so that that little monster stops appearing at Richard's side!
- No, thanks, I'm fine.
Oliver looked at me doubtfully but said nothing, for which I was heartily grateful. I sometimes envied his calm, his patience and, perhaps, his wisdom. There is a considerable age difference between us, yet at times it seems to me that he is considerably more experienced than I am. It's probably all in his calm, slightly enigmatic gaze. They say the eyes are the mirror of the soul. I would look into Oliver's soul with great interest. He is, perhaps, one of the few people whose soul interests me at all.
We were silent. Outside the window snowflakes circled in a whimsical waltz, and the wind, like a stern ballet master, grumbled at them severely. The wall clock ticked, quiet and muffled, as if keeping time with the beating of my heart. Oliver was still standing behind my back. It seemed to me he wanted to say something more, but I couldn't understand why he hesitated; there seemed to be no indecision in his gaze. In the end he turned and headed for the door, evidently having changed his mind and decided to leave that conversation for later.
He only asked, on his way out:
- Anything you want brought from town? Schneider and I forgot our skis, and we're dying to get some skiing in before the snow starts turning to slush.
- Pick me up a fresh notebook.
And Oliver left, leaving me alone with my notebook again. After scratching out another couple of lines, I realized the fountain of ideas had run dry. I leafed thoughtfully through the pages covered in my untidy handwriting. Only one blank page remained. Only now did I realize that every poem written here was dedicated to Richard.
I decided to go to bed not so much because I was tired as out of a desire to kill time. Time... How I hate it. Probably because I let it slip away. While I was writing my poems, time dripped away like water from a badly closed tap. And I was too late. Now Richard shares his wide bed not with me but with the one who proved nimbler, quicker, sharper. And I was left lying alone under a cold blanket, alone with my poems, which I will never show to anyone. And little by little I am growing prickly and, it seems, crusting over with ice.
Sleep would not come. I had long lost count of the time I had spent lying on my back, staring at the snow-white ceiling across which the shadows of snowflakes slid. The silence hummed in my ears. Though I didn't get to enjoy it for long.
After a while, to my indignation, I heard the creak of a bed in the next room. I hoped to the last that I had imagined it. No. I hadn't. The creaking grew stronger, the wooden posts of the headboard began knocking against the wall, and I heard his voice. A loud, drawn-out moan, then a series of slightly more frequent ones. I thought I could even hear his breathing.
It was as if I had been punched in the chest, all the air knocked out of my lungs. It hurt, and it stung.
I regretted having taken the room next to Richard's. I covered my head with the pillow, trying not to listen to that nightmare, but in vain; I leapt out of bed and began pacing the room, suppressing the urge to burst into the next one and scatter the lovers like kittens. I wanted to expose them, to shame them. But before whom? Everyone had long known all about it anyway and, like grown, sensible people, turned a blind eye. So I would only make a fool of myself.
The cacophony of moans and creaks kept growing louder. I fell onto the bed, drained, praying to God it would all end soon, and closed my eyes. Only Richard's moans could be heard. I imagined that he was here now, with me; that it was I squeezing his buttocks in my hands, leaving cherry-red half-moons on them; that it was I tenderly entering him; that it was under me he was shuddering with pleasure...
A ball of fire was slowly ripening in my groin. It seared everything low in my belly, sending out its hottest pulses at intervals. I slipped down the elastic of my pajama bottoms, still without opening my eyes. My breathing grew ragged, the moans on the other side of the wall came faster, the hot flesh was ready to burst with the strain.
Two loud, sharp exhalations. His - mine. In mine, I think, slipped an echo of his name, and it scorched my throat.
Everything around went quiet. Silence crept back into my ears.
I felt vile. And yet I was happy that Richard and I had been good together. I could have satisfied more than his physical desire; I could have given pleasure to his soul as well. I could have done so much...
Brushing the sweat-damp hair from my forehead, I turned onto my side and fell asleep fairly quickly.
I was woken by the cold needle-rays of an unfriendly winter sun piercing into my bedroom. I had barely pried my eyes open when the alarm clock on the bedside table obligingly showed me the time: five to ten. For the first time in a long while I felt good. It seemed to me that for the past several weeks I had been clambering up a high, steep mountain and, having finally reached the summit, had shot down its far side as if on a sled, savoring the speed and the whistle of the wind in my ears. It felt as if I had just handed off the relay baton and was glad that the previous leg, mine, was behind me. Even if I had lost it.
I didn't linger in bed luxuriating in my good mood, but got up quickly, threw on a robe and went downstairs to the dining room. Only Flake was there; he sat at the table, reading a book and sipping coffee in tiny swallows.
- Good morning!
The keyboardist looked up from his book and smiled amiably.
- Morning. How did you sleep?
- Wonderfully! - I said, smiling broadly. For the first time in ages I had dreamed of nothing, for which I was unspeakably glad, since all my dreams revolved around Richard. Either they were nightmares in which he was with Paul, or they were beautiful dreams in which we were together and loved each other, and waking from those was unbearably painful. …'
- source_sentence: 'Fukh thought Vasilisa was a silly red-haired spy who carried out her father's every order and followed in Ognev's footsteps toward the highest point of power. He thought she was trying to wreck the lives of Nik and the elder Lazarev. The boy believed the girl was only worming her way into their trust, pretending to be so kind and sweet - an eternally blushing, innocent beauty. When in fact, inside her soul a snake lay coiled, waiting for the moment to sink its teeth into an opponent's neck and inject its venom.
Fukh thought Vasilisa would never be able to learn to fly. People who strut about on the ground cannot feel wings at their backs, "switch off their arms" and soar into the blue sky. They are incapable of feeling gusts of wind on their skin and understanding what it is like to turn into a bird.
Dragoriy assured himself that he did not envy in the least her knack for finding a way out of difficult situations, for smiling, for going on joking and making merry while knowing full well of Vustagor's approaching attack. Dragoriy considered it right to hide his emotions, not to open his feelings, thoughts and fears to the crowd - life is easier that way. So no one knew of that faint envy. Not even Nik.
Fukh thought Vasilisa would forever stand out from the crowd, because self-respecting clockmakers do not perform acrobatic stunts in their spare time. And they most certainly do not climb trees.
The boy considered Vasilisa a foolish girl, because she sassed and contradicted her elders and constantly bickered with Mark's crowd, tossing ever more dry logs onto the flames of tension. The boy knew for certain that someday she would pay for all her deeds and barbed words, and he even allowed himself to stretch his lips into a false smile as he mused on those not-so-distant times.
Dragoriy hated her for bestowing a sympathetic look on him and trying to pity him, to understand, when she learned he was an orphan. Fukh felt the girl could have refrained from asking him about the old days and from trying to cheer him up - he had no need whatsoever of pity.
Fukh thought Vasilisa was far too weak to survive the incident with the Scarlet Flower, for no bearer of the black key had ever survived before. That, probably, is why Dragoriy decided to help her. Ogneva's sweet smile had its effect on the boy. Fukh decided then and there that this was the last time he would do her such a favor and grant her life, and promised himself never to recall the ill-fated day of his return to Vustagor because of the red-haired girl.
Dragoriy thought Vasilisa would not defeat the great spirit even if she learned a thousand tricks; that Ogneva was a feeble, powerless girl who knew only how to help others and play the heroine of a cheap novel. He promised that he would not join her, would not help. Fukh believed he was merely keeping up appearances, pretending to be her friend.
Dragoriy thought he would not fall in love with her, would not be led on by some absurd charms of the red-haired girl - but once again, he was simply wrong.'
sentences:
- 'Harry spins the wheel of his lighter; Tikki finishes eating her cookie and looks at him reproachfully, as if to say stop-it-kid-it-won't-help (but he can't, damn it, he simply can't, because smoking is a habit, the only quick way to stop wishing Eleanor dead - because, hey, he's a superhero, he's not allowed to kill innocent people).
Harry smiles guiltily and d-r-a-g-s on the cigarette smoke until his throat burns, black spots swim before his eyes and a hellish pain settles somewhere between his ribs, remembering the intertwined fingers of the love of his life and the girl who has poisoned his existence since the very first grade.
Her dad is the mayor of the city; she has clothes from global brands, a pretty (but empty) face and a whole cartload of awful deeds, and Harry has no idea what Louis could have seen in her (he did).
- You're strong, - says his little friend, settling on his shoulder. - Louis starting to date Eleanor isn't as scary as the creatures you fight all the time.
(it's far scarier)
- Of course, - Harry rasps with a crooked, strained smile on his face, and Tikki seems to believe him, laying a tiny palm on his cheek in approval.
Harry is sixteen. He should be going on dates, having fun with friends and enjoying life, but instead he saves Paris almost daily, flying around the city on a stupid yo-yo like Spider-Man and relying on the power of a little bracelet (red, spotted, like the wings of a Ladybug).
(and Harry is also trying to hide from everyone his (not very) little crush on Louis Tomlinson, his classmate, next to whom he can't string two words together or stay on his feet.
none of that matters now, because (his) Lou is dating Eleanor, and the boy feels something burning in his chest, while his will to live shrinks by the day in geometric progression, uh-oh).
Noir is the first to notice, and it's not that Harry is surprised by this; it's just somehow strange that this irritating, cocky, utterly-un-adult-like cat sensed his pretending.
- Everything all right, Ladybug? - he asks when they defeat Reflekta and there are only a few minutes left before they transform back.
The green eyes behind the mask look (genuinely) worried, and Harry would like to believe in the Cat's real concern for his life, but that is beyond him.
Harry tuts and tries to hold himself together (breathe in, breathe out, boy), because they're partners, after all, and he's not obliged to bare his soul, right? (besides, talking to Noir about love is like talking to a child - he'd laugh, he wouldn't understand)
- Obviously. What makes you think something's wrong, kitty? - Harry flicks him lightly on the nose, Harry laughs and pretends he really is fine, even though inside everything aches and chants screwed-up-screwed-up-screwed-up-I-blew-it (again).
Noir wrinkles his nose and looks even more intently, catching his hand and shaking his head: don't believe you, try something better.
- I just feel it, - the hands on his head twitch, and Harry twitches too, trying to leave (to run) and go home to smoke alone, but the Cat's grip is strong and his voice desperate, almost pleading, when he asks him to tell (to share).
- It's none of your damned business, - Harry hisses, wrenching his hand free after all, and jumps off the roof. - See you on the next mission, partner.
Louis has sea-colored eyes, dazzling smiles and a permanent flush on his cheeks (and he's so beautiful, god, that Harry is ready to worship him as his personal God).
Sometimes Louis notices his looks and winks, and Styles knows, really, that it's all in jest, but he can't help himself, smiling so wide his cheekbones ache, because, damn it, he's in love, in love, in love - so hard, so foolishly, so childishly (for what feels like an eternity already).
Harry dreams of taking Louis by the hand (of being allowed to), turning into the damned Ladybug and climbing to the very top of the Eiffel Tower to hold him above all of Paris.
(and Harry knows that at sixteen he's supposed to want to sleep with the person he likes, but he doesn't, because for four years now all he has wanted is to l-o-v-e Louis, and nothing more).
On Monday Louis kisses Eleanor in front of the whole class, and Harry feels things exploding inside him (cities, universes, whole new worlds), but he stays silent (when you love, you always stay silent).
Harry comes home to his empty apartment (his parents have gone off to America for a week on business, and it's not that Styles doesn't care, or that he doesn't miss them - he just can't deal with them right now, honestly) and collapses onto the couch, wanting only one thing - to die (the strongest that wish has been in a long while).
He pulls a bottle of wine out of the dresser, though he's a good five years short of being allowed alcohol, and drinks all evening until black spots start flying before his eyes and his head spins. Harry thinks he'll pretend to be sick and stay home tomorrow (he has no desire whatsoever to see the person he loves happy with someone else).
Tikki says he can't just take a day off, evil never sleeps and all that, but Harry doesn't care; he trusts that Noir will manage on his own.
In the end the kwami drags him out on a mission, and Harry hates her so much he doesn't want to see her for the next several days.
The Cat is already in place and measures him (again) with a look full of worry, but Harry just waves it off and turns away to quickly swallow one more ibuprofen - his head is splitting as if a whole beehive were inside.
They very nearly lose the battle, because Noir keeps glancing back at Styles and trying to cover him, while Harry simply feels like a zombie (his brain won't think, his body barely obeys), but in the end everything finishes as it always does, well, and Harry slumps exhausted against the wall of some building, closing his eyes (right now he'd like to be on a desert island, amid wild nature and a raging ocean).
- You can tell me what's going on, I'll understand, - says Noir, coming up to him, and Harry snorts, because the Cat is still the last person he would confide in.
- Everything's fine, - Harry grinds out in irritation. - Just some problems in my life, and a hangover.
- I only want to help, - hurt notes slip into the boy's voice, and Harry (almost) feels sorry for him.
- No one can help me, don't you get it? I need everyone to leave me alone and stop trying to find things out and fix them.
- Give me a chance, - Noir asks and, damn, he really is worried; Harry can't ignore that now. - We could meet somewhere and talk. In costume, of course.
And Harry doesn't know what possesses him when he says "yes."
On Wednesday Zayn comes over to Styles's place (a best friend from the together-since-childhood-and-forever category).
He brings "1+1" and ice cream, but when he sees the state Harry is in, he simply pulls him down onto the bed with him for a hug.
- Do you think you can die of love? - Harry asks, resting his head on Zayn's chest and listening to the steady beat of his heart, feeling (god-oh-god-finally) at peace.
Zayn laughs quietly at Harry's silliness but hugs him tighter all the same, pressing him close, because he knows what his younger friend needs (knows how to treat his aching heart).
- Haz, you know Louis and Eleanor won't last longer than a week; she'll wear him down with her vile temper and squeaky voice, - he whispers. - Your love isn't going anywhere.
Harry sniffles quietly and pulls his legs up, curling into a ball. He looks worn out and weak, and it hurts Zayn to see, but there's nothing he can do (except maybe deck Tomlinson, stupid as that would be).
Harry feels like a tiny bug about to be trampled; he has no strength, no energy (and, as of recently, no faith in better days either).
Harry thinks he doesn't deserve to be a superhero and save the world, because he can't even save himself.
Harry promises himself to get a little stronger (for the whole world's sake) and to have that talk with Noir after all (because guilt is gnawing at him from inside, and something has to be done about it) - after all, that's exactly what superheroes do: forget themselves and look after others - right?
(the next day Harry covers up the circles under his eyes and smokes his last cigarette, chucking the pack to hell, muttering praise to himself under his breath.
Harry learns patience, starts saying hello to Louis and Eleanor and even congratulates them, and tries to look at the boy less (the last part doesn't work out, but no great goal is reached at once, so)).
On Thursday Harry buys chips, sweets and soda (he has no idea what exactly the Cat likes) and goes to meet Noir in one of the city's most thinly populated districts, where no one is likely to notice them, without even hoping the talk will help him at all.
- You're here, - the boy cries, leaping up from his seat, and his eyes light up with genuine joy when he sees the sweets and everything else. - God, you didn't have to bring all this.
Harry shrugs, because, come on, it's nothing, and settles down, leaning back against the bench.
- Let's start right away. The sooner the better, right? - he smirks, lacing his fingers together from a sudden fit of awkwardness and fear.
- Will you tell me what's going on? - Noir asks warily.
- You won't let it go, and I'm sick to death of your fussing, as if I were a little kid. Nothing has happened to me that anyone needs to protect me from. It's just that the person I've loved for four years is dating someone else, and that, it turns out, is far worse than the books describe. I feel like I'm burning alive when I see them walking down the corridor, and it's killing me, you know? Because I know I don't stand a chance - I'm just a dumb C-student from the back row, with no looks and no rich father, unlike his damned girlfriend with her mayor dad, - Harry finishes, utterly hollowed out inside, because this is the first time he has told his story to anyone but his best friend.
- I think I understand, - Noir nods, looking serious and unexpectedly astonished, and puts an arm around his shoulders, drawing him close.
Harry breathes heavily after such a confession, and his heart is beating far too fast, so that he forgets that he and the Cat…'
- 'ะััะพ ัะถะธะผะฐะตั ะบัะปะฐะบะธ, ะณะปัะดั ะฝะฐ ัะถะต ะฒัะตะฒัะตะตัั ะฒ ะดััั ะธะผั. "ะฎะธ ะะพะผะพัะธ".
ะะฝ - ะตั ะะพัะฟะพะดะธะฝ. ะ ะพะฝะฐ ะพะฑัะทะฐะฝะฐ ะฟะพะดัะธะฝััััั ะตะผั. ะ, ัะปะตะดัั ััะพะผั ะฟัะฐะฒะธะปั, ัะตะนัะฐั
ะดะพะปะถะฝะฐ ะฒะพัะบัะตัะฝััั.
ะะปะพะฑะฐ ะฟะพะถะธัะฐะตั ะตะณะพ, ะทะฐััะฐะฒะปัั ัะธั
ะพ ัััะฐัั. ะะฑััะฝะพ ะฟะพะดะฒััะฝััะฐั ััะฐะฝะธะฝะฐ ะพะฟััะตะฝะฐ.
- ะะฝะฐะตัั, ัะฐะบ ัั ะฒัะณะปัะดะธัั ะฝะตะผะฝะพะณะพ... ะะตะฑัะตะถะฝะพ.
ะฃะดะฐั - ะฟะพ ะฝะฐะดะณัะพะฑะฝะพะผั ะบะฐะผะฝั ะฟัะพั
ะพะดะธั ััะตัะธะฝะฐ. ะคะพัะพ ัะฐััะตะฟะปัะตััั ะฝะฐะดะฒะพะต. ะะฝ ะณะพัะพะฒ
ะฟะพัะฟะพัะธัั, ััะพ ะพะฝะฐ ัะตะนัะฐั ััะพะธั ะฟะพะทะฐะดะธ ะฝะตะณะพ. ะัะฑั ะฟะพะดะถะฐัั, ะดะตะฒััะบะฐ ะตะดะฒะฐ ะปะธ ัะดะตัะถะธะฒะฐะตั
ัะปัะทั. ะ ัะบะธ ะฝะตัะฒะฝะพ ัะตัะตะฑัั ะธ ะฑะตะท ัะพะณะพ ะฟะพะผัััั ัะฑะบั. ะะฝ ะณะพัะพะฒ ะฟะพัะฟะพัะธัั, ััะพ ัะตะนัะฐั
ะพะฝะฐ ัะธั
ะพ ัะบะฐะถะตั ััะพ-ัะพ ะฟัะพ ัะพ, ััะพ ั
ะพะทัะธะฝ ะผะพะณะธะปั ะฑัะดะตั ะฝะตะดะพะฒะพะปะตะฝ. ะ ะพะฝ ะณะพัะพะฒ ะฟะพัะฟะพัะธัั,
ััะพ ะตัะปะธ ะพะฝ ะพะฑะตัะฝัััั, ะพะฝะฐ ะธััะตะทะฝะตั.
ะััะพ ัััะฐะปะพ ะพะฑะปะพะบะฐัะธะฒะฐะตััั ะฝะฐ ะดะตัะตะฒะพ. ะะพะถะดั ะฟัะธััะฝะพ ะพั
ะปะฐะถะดะฐะตั ัะฐะทะณะพัััะธะฒัะตะตัั
ัะตะปะพ.
- ะฏ ะฝะต ัะฐะทัะตัะฐะป ัะตะฑะต ัะผะธัะฐัั...
ะ ะพะฝ ะณะพัะพะฒ ะฟะพัะฟะพัะธัั, ััะพ ะพะฝะฐ ัะตะนัะฐั ัะปัะฑะฐะตััั.
ะ ะตะนะดะถะธ ัะฐะดะธััั ะฝะฐ ัะบะฐะผะตะนะบั. ะัะฝั ัะบััะปะธ ัััะธ - ะพะฝ ัะฒะตัะตะฝ, ััะพ ัะบะพัะพ ะฟะพะนะดัั ะดะพะถะดั.
ะะฐะผะฟะธั ัะปะตะณะฐะฝัะฝัะผ ะดะฒะธะถะตะฝะธะตะผ ะฟะพะฟัะฐะฒะปัะตั ะพัะบะธ. ะ ะฟะพัะตะผั ะตะผั ะทะฐั
ะพัะตะปะพัั ะฟัะธะดัะธ ััะดะฐ
ะธะผะตะฝะฝะพ ัะตะนัะฐั?...
ะงัััั ะฒะฐะผะฟะธัะฐ ะฝะต ะพะฑะผะฐะฝัะปะพ. ะะฐ ะบะฐะผะตะฝะฝัะน ะฟะพัััะตั ะฟะฐะดะฐะตั ะฝะตัะบะพะปัะบะพ ะบะฐะฟะตะปั, ะฐ ัะตัะตะท
ะฝะตัะบะพะปัะบะพ ัะตะบัะฝะด ะดะพะถะดั ัะถะต ะปััั ััะตะฝะพะน.
ะ ะตะนะดะถะธ ัะฐะบ ะธ ะฝะต ะดะฒะธะฝัะปัั ั ะผะตััะฐ, ะฝะตัะผะพััั ะดะฐะถะต ะฝะฐ ะฝะฐััะพะนัะธะฒะพะต ะผััะบะฐะฝัะต ะทะฐ ัะฟะธะฝะพะน.
ะะธะดะธะผะพ, ะฝะต ะฒัะดะตัะถะฐะฒ, ะฝะฐ ัะบะฐะผะตะนะบั ะทะฐะฟััะณะธะฒะฐะตั ะฝะตะฑะพะปััะฐั ะบะพัะบะฐ. ะกะธัะตะฝะตะฒัะต ะณะปะฐะทะฐ
ัััั ัะฒะตััััั, ะฐ ะฝะฐัะบะฒะพะทั ะผะพะบัะฐั ะฑะตะปะฐั ัััััะบะฐ ะฑะพะปััะต ะฟะพั
ะพะถะฐ ะฝะฐ ะฟะพะปะพะฒัั ัััะฟะบั.
- ะะฝะฐะตัั...
ะ ััะผะต ะดะพะถะดั ะณะพะปะพั ะฟะฐัะฝั ะตะดะฒะฐ ัะฐะทะปะธัะธะผ, ะฝะพ ะบะพัะบะฐ ะปะธัั ะฝะฐะบะปะพะฝัะตั ะณะพะปะพะฒั ะฝะฐะฑะพะบ.
- ะญัะพ ะบัะฐะนะฝะต ะฝะต ะฒะตะถะปะธะฒะพ. ะะท-ะทะฐ ัะตะฑั ั ััั ะฟัะพะผะพะบ ะดะพ ะฝะธัะบะธ.
ะะฐ ัะตะต ะบะพัะบะธ ะตะดะฒะฐ ัะฐะทะปะธัะธะผะพ ะฟะพะฑะปััะบะธะฒะฐะตั ะผะธะฝะธะฐัััะฝัะน ะฝะฐัะตะปัะฝัะน ะบัะตััะธะบ ะฝะฐ ัะตัะตะฑััะฝะพะน
ัะตะฟะพัะบะต... ะะฐะฒะตัะฝะพะต, ะฟะพะบะฐะทะฐะปะพัั.
- ะ ะฐะนัะพ, ะฐ ัั ะฒัะตะณะดะฐ ะฝะพัะธัั ััั ัะปัะฟั?
ะะฐะผะฟะธั ััะผะตั
ะฐะตััั. ะะฝ ัะฐะผ ะฝะต ะทะฝะฐะตั, ะบะพะณะดะฐ ะธ ะทะฐัะตะผ ะพะฝ ะฝะฐัะฐะป ะฝะพัะธัั ััั ัะปัะฟั. ะะพ
ะดะตะฒััะบะฐ ะดะตะนััะฒะธัะตะปัะฝะพ ะทะฐะผะตัะธะปะฐ - ะฟัะธ ะฝะตะน ะพะฝ ะฒัะตะณะดะฐ ะฑัะป ะฒ ัะปัะฟะต. ะัะป...
ะ ะฐะนัะพ ะฟัะพะฒัะป ะฟะฐะปััะฐะผะธ ะฟะพ ััะตัะธะฝะต, ัะฐะทะดะตะปัััะตะน ะบะฐะผะตะฝั ะฝะฐ ะดะฒะต ัะฐััะธ. ะะฐัะตะฝั ััั
ะถะต ัะฐัะฟะพะทะฝะฐะป ะทะฐะฟะฐั
ะฑัะฐัะฐ. ะัะธะบััะธะฒ ะณัะฑั, ะพะฝ ะฒะฝะพะฒั ะฟัะธัะตะป ะฝะฐ ัะบะฐะผะตะนะบั. ะะฝ ัะถะต ะทะฝะฐะป,
ะบะพะผั ัะตะณะพะดะฝั ะฒะปะตัะธั ะฟะพ ะฟะพะปะฝะพะน ะฟัะพะณัะฐะผะผะต.
- ะกัะธัะฐะน ัะตะฑั ะธะทะฑัะฐะฝะฝะพะน, ะผะฐะปะตะฝัะบะฐั ััะตัะฒะพัะบะฐ.
ะงััะฝะฐั ัะปัะฟะฐ ั ะบัะฐัะฝะพะน ะปะตะฝัะพะน ะปะพะถะธััั ะฝะฐ ะทะฐะผััะปะพะฒะฐััะต ัะทะพัั ะบะฐะผะฝั. ะะฐะผะฟะธั ััะผะตั
ะฐะตััั
ะธ ะฒััะฐัั.
ะะฑะพััััะฝะฝะพะต ะพะฑะพะฝัะฝะธะต ััั ะถะต ัะปะฐะฒะปะธะฒะฐะตั ะทะฝะฐะบะพะผัะน ะทะฐะฟะฐั
. ะฃัะผะตัะบะฐ ััั ะถะต ะฟะตัะตัะฐััะฐะตั
ะฒ ัะธัะพะบัั ะดะพะฒะพะปัะฝัั ัะปัะฑะบั. ะะตะปัะฝัะต ะณะปะฐะทะฐ ั
ะธััะพ ะฟัะธัััะธะฒะฐัััั.
- ะั ะตัั ะฒัััะตัะธะผัั, ะผะฐะปะตะฝัะบะฐั ััะตัะฒะพัะบะฐ.
ะะฐะฟะฐั
ะบะปัะฑะฝะธะบะธ ั ะฟัะธะผะตััั ะผะตัะฐะปะปะฐ.
ะะฐะฝะฐัะพ ะฒะฝะพะฒั ะธ ะฒะฝะพะฒั ะฒะณะปัะดัะฒะฐะตััั ะฒ ัะฐะบ ะฟะพะปัะฑะธะฒัะธะตัั ะตะผั ัะตััั. ะะตัะผะพััั ะฝะฐ ะฝะพะฒะธะทะฝั,
ะธะทะพะฑัะฐะถะตะฝะธะต ะฝะฐ ะฝะฐะดะณัะพะฑะฝะพะผ ะบะฐะผะฝะต ัะถะต ัะปะตะณะบะฐ ััััะปะพัั. ะขะพะฝะบะธะต ะฟะฐะปััั ััะดะพัะพะถะฝะพ ัะถะธะผะฐัั
ะฟะปััะตะฒะพะณะพ ะผะตะดะฒะตะถะพะฝะบะฐ, ะฟะตัะตะฑะธัะฐั ะบะพัะพัะบัั ะธัะบััััะฒะตะฝะฝัั ัะตัััั. ะะฐัะตะฝั ะดะพ ัะธั
ะฟะพั
ะฝะต ะฟะพะฝัะป, ะบะฐะบ ััะพ ะผะพะณะปะพ ัะปััะธัััั. ะะฝ ะพััััะปะธะฒะพ ะฟะพะผะฝะธะป, ะบะฐะบ ะฒะพััะป ะฒ ะตั ะบะพะผะฝะฐัั,
ะฝะฐะผะตัะตะฒะฐััั ะธัะฟัะณะฐัั ะดะตะฒััะบั. ะะฝ ะพััััะปะธะฒะพ ะฟะพะผะฝะธะป ัั ะฑะตะทะผััะตะถะฝะพััั, ััะพ ะทะฐัััะปะฐ
ะฝะฐ ะปะธัะต ะฎะธ. ะะฝ ะพััััะปะธะฒะพ ะฟะพะผะฝะธะป ัะพั ะฝะตะธััะพะฒัะน ั
ะพะปะพะด, ะธัั
ะพะดััะธะน ะพั ะบะพัะตะฝะตััะตะณะพ
ัะตะปะฐ ะดะตะฒััะบะธ.
ะกะปัะทั ะบะฐััััั ะธะท ะณะปะฐะท, ะพััะฐะฒะปัั ะทะฐ ัะพะฑะพะน ะผะพะบััะต ะดะพัะพะถะบะธ. ะะฝ ัะตะดะบะพ ะฟะปะฐะบะฐะป. ะฃะถ ะปัััะต
ัะผะตััััั, ะฝะต ะฟัะฐะฒะดะฐ ะปะธ?
- ะะฝะฐะตัั, ัั ะฟัะฐะฒะฐ.
ะฃะณะพะปะบะธ ะณัะฑ ะฟะฐัะฝั ัััั ะฟัะธะฟะพะดะฝะธะผะฐัััั. ะกะปัะทั ะฝะฐัะธะฝะฐัั ัะตัั ั ะฝะพะฒะพะน ัะธะปะพะน. ะฃะปัะฑะบะฐ
ััะฐะฝะพะฒะธััั ะตัั ัะธัะต, ะพะฑะฝะฐะถะฐั ะฑะตะปะพัะฝะตะถะฝัะต ะบะปัะบะธ.
- ะกะผะตั
ะฟัะพะดะปะตะฒะฐะตั ะถะธะทะฝั, ะฒะตะดั ัะฐะบ?
ะ ะพะฝ ัะผะตัััั. ะะตัะผะพััั ะฝะฐ ัะพ, ััะพ ัะถะต ะทะฐะดัั
ะฐะตััั ะพั ััะดะฐะฝะธะน.
ะจั ัััั ัััะธััั ะธ ะพัะฒะพัะฐัะธะฒะฐะตััั ะพั ัะพะฝะฐัั, ะฟะพ ะตะณะพ ะผะฝะตะฝะธั ัะฐะบ ะฝะตัะผะตััะฝะพ ัะฐัะฟะพะปะพะถะตะฝะฝะพะผั
ะทะดะตัั ะฟะพ ะตะณะพ ะถะต ะฟัะพััะฑะต. ะ ะพ ััะผ ะพะฝ ัะพะณะดะฐ ะดัะผะฐะป?! ะั
ะดะฐ, ะพ ะฎะธ...
- ะะฝะฐะตัั... ะฏ ะฒะพะฒัะต ะฝะต ะฑะพััั ัะตะผะฝะพัั, ะจั. ะัะพััะพ ั ั
ะพัั ะฒะธะดะตัั ะปะธัะพ ัะพะณะพ, ะบัะพ
ัะบััะฒะฐะตััั ะฒ ััะพะน ััะผะต.
ะะฐะผะฟะธั ะตะปะต ัะปััะฝะพ ะฒะทะดัั
ะฐะตั. ะคะพะฝะฐัั ะฟะพะปะฝะพัััั ะพัะฒะตัะฐะตั ะตะณะพ ัะธะณััั, ััะพ ะฝะตะผะฝะพะณะพ
ัะฐะทะดัะฐะถะฐะตั. ะะผั ะฝะตะปะพะฒะบะพ ะฟัะพััะพ ัะธะดะตัั ะฝะฐ ะผะพะณะธะปะต ะบะพะณะดะฐ-ัะพ ะฝะฐััะพะปัะบะพ ะทะฐะธะฝัะตัะตัะพะฒะฐะฒัะตะน
ะตะณะพ ะดะตะฒััะบะต. ะจั ะถััะบะพ ั
ะพัะตััั ัะฟะฐัั, ะฝะพ ะฒะฝะพะฒั ััะปััะฐัั ะตั ะณะพะปะพั ั
ะพัะตััั ะตัั ะฑะพะปััะต.
ะะฐัะตะฝั ัะผะพััะธั ะฟััะผะพ ะฒ ะณะปะฐะทะฐ ะฟะพัััะตัะฐ ะธะท-ะฟะพะด ะพะฟััะตะฝะฝัั
ัะตัะฝะธั. ะะผั ะบะฐะถะตััั, ััะพ
ะพะฝ ัะปััะธั ะตั ัะธั
ะธะน ัะผัััะฝะฝัะน ะณะพะปะพั. ะะฒะฐ ะฟะฐะปััะฐ ะปะพะถะฐััั ะฝะฐ ะณะปะฐะทะฐ ะฟะพัััะตัะฐ ะฎะธ, ั
ะพัั
ะธ ะฝะต ะทะฐะบััะฒะฐั, ะฝะพ ั
ะพัั ะฑั ะฟะตัะตะบััะฒะฐั ะธะผ ะพะฑะทะพั ะฝะฐ ะฒะฐะผะฟะธัะฐ.
ะัะณะบะฐั, ะดะฐะถะต ะฝะตะผะฝะพะณะพ ะณััััะฝะฐั ััะผะตัะบะฐ ะพััะฐะถะฐะตััั ะฝะฐ ะปะธัะต ะจั.
- ะะฐะบัะพะน ัะฒะพะธ ะณะปะฐะทะฐ, ะฟะพะถะฐะปัะนััะฐ.
ะะผั ะฟะพะบะฐะทะฐะปะพัั, ะธะปะธ ะฎะธ ะดะตะนััะฒะธัะตะปัะฝะพ ะตะผั ัะปัะฑะฝัะปะฐัั?..
- ะ ะฝะต ะพะฟัะฐะฒะดัะฒะฐะนัั ะฟะตัะตะดะพ ะผะฝะพะน.
ะกัะฑะฐัั ัััะบะฐะตั, ัะผะพััั ะฝะฐ ัััะฝะพ-ะฑะตะปัั ัะพัะพะณัะฐัะธั ะฎะธ. ะะฐัะตะฝั, ัะธั
ะพ ัััะฐ, ะบะธะดะฐะตั
ะฝะพะถ ะฝะฐ ะผะพะณะธะปัะฝัั ะฟะปะธัั. ะะฝ ััั ะถะต ะฒััะบะฐะตััั ะฒ ะฝะตั, ะพััะฐะฒะปัั ะฒะพะบััะณ ัะตะฑั ะฟะฐััะธะฝั
ะผะตะปะบะธั
ััะตัะธะฝ. ะกัะฑะฐัั, ัะถะฐะฒ ะบัะปะฐะบะธ, ะฟัะธัะตะดะฐะตั ะฝะฐะฟัะพัะธะฒ ะฟะพัััะตัะฐ ะธ ะดะพะปะณะพ ะฒัะผะฐััะธะฒะฐะตััั
ะฒ ะทะฝะฐะบะพะผัะต ัะตััั ะปะธัะฐ.
- ะขั... ะะฝะต ะบะฐะถะตััั, ัั ะฝะต ัะฐะบะพะน, ะบะฐะบ ะพะฝะธ.
ะะฐัะตะฝั ัะบะฐะปะธััั ะธ ัะถะต ะฑะพะปะตะต ะฒะพะปัะฝะพ ัะฐัะฟะพะปะฐะณะฐะตััั ะฝะฐะฟัะพัะธะฒ ะบะฐะผะฝั.
- ะ ะฒะตะดั ัั ะผะฝะต ะพะฑะตัะฐะปะฐ, ะฟะพะผะฝะธัั? ะขั ะพะฑะตัะฐะปะฐ, ััะพ ัะฑัััั ะผะตะฝั. ะ ััะพ ัะตะฟะตัั?..
ะะตัะฒัะต ะบะฐะฟะปะธ ะดะพะถะดั ัะฟะฐะปะธ ะฝะฐ ััะบั ะฎะธ. ะะฐะบะพะผั-ะฝะธะฑัะดั ัะพะฟะปะธะฒะพะผั ัะพะผะฐะฝัะธะบั ะฟะพะบะฐะถะตััั,
ััะพ ััะพ ะฝะฐะฟะพะผะธะฝะฐะตั ะตั ัะปัะทั. ะ ะพัะฒะตั ะฝะฐ ััะพ ะกัะฑะฐัั ะฒะฟะพะปะฝะต ะผะพะถะตั ัะฐััะผะตััััั ััะพะผั
ัะตะปะพะฒะตะบั ะฒ ะปะธัะพ. ะฃะถ ะพะฝ-ัะพ ะทะฝะฐะตั, ััะพ ะตั ัะปัะทั ะฝะต ัะฐะบะธะต. ะั ัะปัะทั ะฒัะตะณะดะฐ ะฒะฝัััะธ.
ะะตะปะพะฒะพะปะพััะน ัะปัะฑะฐะตััั. ะะฝ ัะฒะตัะตะฝ - ะพะฝะฐ ะตะณะพ ัะปััะธั.
ะะตัะบะพะปัะบะพ ะฟะฐัะฝะตะน ััะพัั ะพะบะพะปะพ ัะฒะตะถะตะน ะผะพะณะธะปั, ะฝะต ัะตัะฐััั ะฟัะพัะพะฝะธัั ะธ ัะปะพะฒะฐ. ะะพ ัะฐะทะฝะพัะฒะตัะฝัะผ
ะทะพะฝัะฐะผ ะฑะฐัะฐะฑะฐะฝะธั ะดะพะถะดั. ะะดะธะฝ ะธะท ะฒะฐะผะฟะธัะพะฒ ะฝะต ะฒัะดะตัะถะธะฒะฐะตั ะธ ะดะตะปะฐะตั ัะฐะณ ะบ ะผะพะณะธะปะต.
- ะะฝะฐ... ะฃะผะตัะปะฐ ะฝะฐะฒัะตะณะดะฐ, ะดะฐ? - ะณะพะปะพั ะะฐะฝะฐัะพ ะดัะพะถะธั.
ะะฐัะตะฝั, ะฝะต ะฟะพะปััะธะฒ ะพัะฒะตัะฐ ะฝะฐ ัะฒะพะน ะฒะพะฟัะพั, ัะธะปัะฝะตะต ะฟัะธะถะธะผะฐะตั ะบ ัะตะฑะต ัะฒะพะตะณะพ ะฟะปััะตะฒะพะณะพ
ะผะธัะบั. ะะฝ ะพะฑะฒะพะดะธั ะฒะทะณะปัะดะพะผ ะฑัะฐััะตะฒ ะธ ะดะตะปะฐะตั ะตัั ะพะดะฝั ะฟะพะฟััะบั ัะฐะทัััะธัั ััั ะผัััะฒัั
ัะธัะธะฝั.
- ะะพ ะฒะตะดั... ะ ะตะนะดะทะธ, ัั ะถะต ะบะพะณะดะฐ-ัะพ ัะฟะฐัะฐะป ะฎะธ!
ะ ะตะนะดะทะธ ะบะฐัะฐะตั ะณะพะปะพะฒะพะน ะธ ะฟะพะฟัะฐะฒะปัะตั ะพัะบะธ. ะะฐ ัะผะตะฝั ะฒัะตะผ ะฝะตะดะฐะฒะฝะธะผ ััะฒััะฒะฐะผ ะฟัะธัะปะฐ
ัะถะต ะทะฝะฐะบะพะผะฐั, ะฝะพ ัะตะผ ะฝะต ะผะตะฝะตะต ะฝะตะฝะฐะฒะธััะฝะฐั ะฐะฟะฐัะธั.
ะะฐะฝะฐัะพ ะฑะปะฐะณะพัะฐะทัะผะฝะพ ะทะฐะผะพะปะบะฐะตั ะธ ัะถะธะผะฐะตั ะผัะณะบัั ะปะฐะฟั ะขะตะดะดะธ. ะ ะณะพะปะพะฒั ะฒะปะตัะฐะตั ัะฐััะตัะฝะฝะฐั
ะผััะปั ะพ ัะพะผ, ััะพ ะฎะธ ะฒัะตะณะดะฐ ะณะพะฒะพัะธะปะฐ ั ะผะตะดะฒะตะดะตะผ, ะบะฐะบ ัะพ ััะฐััะผ ะทะฝะฐะบะพะผัะผ. ะัััั
ะบะพะณะดะฐ-ัะพ ััะพ ะธ ัะฐะทะดัะฐะถะฐะปะพ, ะฝะพ ัะตะนัะฐั ะพะฝ ะฑัะป ะณะพัะพะฒ ะฟัะพััะธัั ะฎะธ ะธ ััะพ.
ะะตะฑะพ ะฟัะพัะฒะตัะปะตะปะพ. ะ ะฐะนัะพ, ะพััะตะฐะณะธัะพะฒะฐะฒ ะฝะฐ ััะพ ะฟะตัะฒัะผ, ัะปะพะถะธะป ะทะพะฝั ะธ ะธะณัะธะฒะพ ะฟะพัะผะพััะตะป
ะฝะฐ ะฑะตะปัั ะบะพัะบั, ัะธะดัััั ะฝะฐ ะผะพะณะธะปะต. ะ ะตะนะดะทะธ, ะฟัะพัะปะตะดะธะฒ ะทะฐ ะฒะทะณะปัะดะพะผ ะฑัะฐัะฐ, ะฟะพะฝััะปะธะฒะพ
ั
ะผัะบะฝัะป.
- ะกะปััะฐะนัะต, ะฐ ะฒั ะฒะตัะธัะต ะฒ ะฟะตัะตัะตะปะตะฝะธะต ะดัั? - ะณะพะปะพั ะ ะฐะนัะพ ะทะฒััะธั ะฝะตะฟัะธะฒััะฝะพ ะณัะพะผะบะพ.
ะััะพ ััะผะตั
ะฐะตััั ะธ ัะผะพััะธั ะฝะฐ ะฝะฐะดะฟะธัั, ะบะพัะพััั ะดะพ ััะพะณะพ ะทะฐะณะพัะฐะถะธะฒะฐะปะฐ ะบะพัะบะฐ. ะ ะตะผั
ะฟะพัะตะผั-ัะพ ะบะฐะถะตััั, ััะพ ะพะฝะธ ะฝะตะผะฝะพะณะพ ะฟะพัะฟะตัะธะปะธ.
"ะะฐัะบะฝะธัั ะธ ัะฟะธ."'
- '-ะัะบััะพ, ัะผะพััะธ, ัะผะพััะธ - ััะพ ะพะฑะปะฐัะบะพ ะฟะพั
ะพะถะต ะฝะฐ ะฑะฐัะฐัะบะฐ.
ะะปะปัะทะธะพะฝะธัั ะพัะบััะป ะณะปะฐะทะฐ, ัะพะปะฝัะต ะฝะฐ ะผะธะณ ะพัะปะตะฟะธะปะพ ะตะณะพ, ะฝะพ ะพะฝ ะฒัะต ัะถะต ัะผัะดัะธะปัั
ัะฐััะผะพััะตัั ัะพ ัะฐะผะพะต ะพะฑะปะฐัะบะพ, ะพ ะบะพัะพัะพะผ ะณะพะฒะพัะธะป ะขััะฝะฐ.
-ะะผ. ะกะบะพัะตะต ะฟะพั
ะพะถะต ะฝะฐ ะณะพัั ัะปะฐะดะบะพะน ะฒะฐัั, ัะตะผ ะฝะฐ ะฑะฐัะฐัะบะฐ.
ะัะธ ััะธั
ัะปะพะฒะฐั
ะพะฝ ัะปัะฑะฝัะปัั ะกะฐะฒะฐะดะต, ะพััะตะณะพ ัะพั ััะฐะทั ะถะต ะฟะพะบัะฐัะฝะตะป. ะะตะณะบะฐั ััะผะตัะบะฐ
ัะพัะฒะฐะปะฐัั ั ะณัะฑ ะฅัะฐะฝะธัะตะปั ะขัะผะฐะฝะฐ, ะตะผั ะฒัะตะณะดะฐ ะดะพ ะฑะตะทัะผะธั ะฝัะฐะฒะธะปะพัั ัะผะพััะตัั ะบะฐะบ
ัะผััะฐะตััั ะตะณะพ ะปัะฑะธะผัะน...ะฑะพัั, ะฝะตั, ะปัะฑะธะผัะน.. ะฟัะพััะพ ะปัะฑะธะผะพะต ะะตะฑััะบะพ.ะะตัะผะพััั ะฝะฐ
ัะพ, ััะพ ะขััะฝะฐััะธ ะฟะพะฒะทัะพัะปะตะป, ะฟะพัะพะน ะพะฝ ะฒะตะป ัะตะฑั ะบะฐะบ ัะตะฑะตะฝะพะบ.
-ะะฝะฐะตัั, ะัะบััะพ,ะฐ ะผะฝะต ะบะฐะถะตััั, ััะพ ัั ะฑัะดััะธ ะธะปะปัะทะธะพะฝะธััะพะผ ะพะฑะปะฐะดะฐะตัั ะดะพะฒะพะปัะฝะพ
ัััะฐะฝะฝะพะน ัะฐะฝัะฐะทะธะตะน.
ะะปะปัะทะธะพะฝะธัั ะฟะพัะผะพััะตะป ะฝะฐ ะะตัััะพะณะพ. ะ ะฐะทะฒะต ะผะพะณ ะพะฝ ัะฐััะบะฐะทะฐัั ะตะผั, ะขััะฝะฐััะธ, ะพ ัะตั
ัะฐะฝัะฐะทะธัั
, ััะพ ะฟะพัะตัะฐะปะธ ะตะณะพ, ะพัะพะฑะตะฝะฝะพ ะฟะพ ะฝะพัะฐะผ.
ะะพะณะดะฐ ะฝะฐัะฐะปะธัั ะธั
ะพัะฝะพัะตะฝะธั, ะัะบััะพ ัะพะณะปะฐัะธะปัั ะฝะฐ ะฝะตะบะพัะพััะต ััะปะพะฒะธั, ะบะพัะพััะต ะฟะพััะฐะฒะธะป
ะฟะตัะตะด ะฝะธะผ ะกะฐะฒะฐะดะฐ.ะะฝ, ะฝะตัะผะพััั ะฝะฐ ัะฒะพั ะณะพััััั ะธัะฐะปััะฝัะบัั ะบัะพะฒั, ัััะฐััั, ะฟัะธััััั
ะปัะฑะพะผั ะฟัะตะดััะฐะฒะธัะตะปั ะดะฐะฝะฝะพะน ะฝะฐัะธะพะฝะฐะปัะฝะพััะธ, ัะพะณะปะฐัะธะปัั ะฝะฐ ััะธ ััะปะพะฒะธั. ะกะปะธัะบะพะผ
ะฟะพะทะดะฝะพ ะพะฝ ะพัะพะทะฝะฐะป, ััะพ ะตะณะพ ะปัะฑะธะผัะน ัะฐะบะพะน ััะตัะฝะธัะตะปัะฝัะน, ััะพ ะดะฐะถะต ะฟะพัะพะน ัะพัะฒะฐัั
ั ะตะณะพ ะณัะฑ ะฟะพัะตะปัะน - ัะฐะบะฐั ะฟัะพะฑะปะตะผะฐ. ะะตัะผะพััั ะฝะฐ ัะพ, ััะพ ะพะฝะธ ะถะธะปะธ ะฒะผะตััะต ัะถะต ะฒัะพัะพะน
ะณะพะด, ะกะฐะฒะฐะดะฐ ะพััะฐะฒะฐะปัั...ะดะตะฒััะฒะตะฝะฝะธะบะพะผ.ะ ัะฝะพะฒะฐ ะฝะฐ ะณัะฑะฐั
ะ ะพะบัะดะพ ะพััะฐะทะธะปะฐัั ัััะฐะฝะฝะฐั
ะทะฐะดัะผัะธะฒะฐั ัะปัะฑะบะฐ. ะะพะณะดะฐ ะพะฝ ัะฐะบ ัะธะปัะฝะพ ะธะทะผะตะฝะธะปัั, ะบะพะณะดะฐ ััะฐะป ะธะทะผะตะฝััั ัะฒะพะธะผ ะฟัะธะฝัะธะฟะฐะผ?
ะ ะฐะฝััะต, ะพะฝ ะฑั ะฝะต ะทะฐะดัะผัะฒะฐััั ะฟัะพััะพ ะฒะทัะป ะฑัะป ะฑั ะะพะฝะณะพะปั, ะฝะต ัะฟัะฐัะธะฒะฐั ัะพะณะพ - ั
ะพัะตั
ะพะฝ ััะพะณะพ ะธะปะธ ะฝะตั. ะะดะฝะฐะบะพ ะผะฝะพะณะพะต ะฟะพะผะตะฝัะปะพัั, ะฝะฐัะธะปะธะต ะฒ ะพัะฝะพัะตะฝะธะธ ััะพะณะพ ะฟะฐัะฝั ะพะฝ
ะพัะผะตะป ััะฐะทั ะถะต. ะะฝ ะฒะตะดั ะปัะฑะธั .ะกะฐะฒะฐะดะฐ ะปัะฑะธั ะตะณะพ, ะฝะพ ะฟะพัะตะผั ะธ ะพั ัะตะณะพ? ะัะพะดะพะปะถะฐั
ัะผะพััะตัั ะฝะฐ ะขััะฝั, ะพะฝ ะทะฐะดะฐะฒะฐะป ัะตะฑะต ััะธ ะฒะพะฟัะพัั ัะถะต, ะฝะฐะฒะตัะฝะพะต, ะฒ ัััััะฝัะน ัะฐะท.
ะะพัะตะผั ะพะฝ ัะฐะบ ัะฐะด ัะพะผั, ััะพ ะผะพะถะตั ะฒะพั ัะฐะบ ัะฟะพะบะพะนะฝะพ ัะธะดะตัั ะทะดะตัั, ะฒ ะฟะฐัะบะต ะฝะฐ ะปัะถะฐะนะบะต
ะธ ัะปัะฑะฐัััั ะตะผั? ะกะฐะฒะฐะดะฐ ะขััะฝะฐััะธ - ัะตะปะพะฒะตะบ, ะบะพัะพััะน ะธะทะผะตะฝะธะป ะตะณะพ ะธ ะผะธั ะฒะฝัััะธ.
ะกััะฐะฝะฝัะน ะผะฐะปััะธะบ, ะฑะตะท ะพัะพะฑะพะณะพ ะดะฐัะพะฒะฐะฝะธั, ะฝะต ัะผะตะปัะน, ะฝะพ ะธ ะฝะต ััััะปะธะฒัะน .ะ ะบะพะณะดะฐ
ะดะตะปะพ ะบะฐัะฐะปะพัั ะตะณะพ ัะตะผัะธ, ะดััะทะตะน - ะพัะฒะฐะถะฝะตะต ะตะณะพ ะฝะต ะฝะฐะนะดะตัั ะฝะธะบะพะณะพ .ะั ัะตะณะพ ะถะต ะพะฝ,
ะ ะพะบัะดะพ ะัะบััะพ, ั
ะพะปะพะดะฝัะน, ัะธะฝะธัะฝัะน, ะฝะตะฝะฐะฒะธะดััะธะน ะผะฐัะธั, ัะฑะธะนัะฐ, ัะฐะบ ััะฐััะปะธะฒ, ะฝะฐั
ะพะดััั
ััะดะพะผ ั ะะตััััะผ ะฑะพััะพะผ ะะพะฝะณะพะปั?
ะะตะณะบะพะต ะฟัะธะบะพัะฝะพะฒะตะฝะธะต ะบ ะตะณะพ ััะบะต ะฒัะฒะตะปะพ ะ ะพะบัะดะพ ะธะท ะฟะพัะพะบะฐ ัะฐะทะผััะปะตะฝะธะน.
-ะัะบััะพ, ััะพ-ัะพ ัะปััะธะปะพัั?
ะ ะณะปะฐะทะฐั
, ััะธั
ะบะฐัะฐะผะตะปัะฝัั
ะณะปะฐะทะฐั
, ะฝะตะพะถะธะดะฐะฝะฝะพ ะพััะฐะทะธะปะพัั ะฟะตัะตะถะธะฒะฐะฝะธะต ะธ ัััะฐั
.
-ะัะต ะฒ ะฟะพััะดะบะต, ะผะธะปัะน, ะฒัะต ั
ะพัะพัะพ. ะัะพััะพ ะทะฐะดัะผะฐะปัั.
ะะพััะฝัะฒัะธัั ะบ ะกะฐะฒะฐะดะต,ะพะฝ, ะพะฑั
ะฒะฐัะธะฒ ัะพะณะพ ะทะฐ ัะฐะปะธั, ััะฐะดะธะป ัะตะฑะต ะฝะฐ ะบะพะปะตะฝะธ, ะฝะตะถะฝะพ
ะฟัะธะถะธะผะฐั ะบ ัะฒะพะตะน ะณััะดะธ. ะะฐะปััั ะปะฐัะบะพะฒะพ ะฟะพะณะปะฐะถะธะฒะฐะปะธ ะฒะพะปะพัั, ะณัะฑั ััะตะฟะตัะฝะพ ัะตะปะพะฒะฐะปะธ,
ะทะฐััะฐะฒะปัั ะขััะฝะฐััะธ ัะผััะฐัััั ะตัะต ะฑะพะปััะต. ะะพะดะฐัะปะธะฒะพะต, ัะฐะทะณะพัััะตะฝะฝะพะต ัะตะปะพ ะฟะฐัะฝั
ัะฝะพัะธะปะพ ะบัััั ั ะัะบััะพ ะธ, ะบะพะณะดะฐ ะปะตะณะบะธะน ััะพะฝ ะฒััะฒะฐะปัั ะธะท ะณััะดะธ ะปัะฑะธะผะพะณะพ, ะธะปะปัะทะธะพะฝะธัั
ัะบะพะปัะทะฝัะป ะปะฐะดะพะฝัะผะธ ะฟะพะด ััะฑะฐัะบั, ะฟะพะณะปะฐะถะธะฒะฐั ัะฟะธะฝั, ะทะฐััะฐะฒะปัั ะฒัะณะธะฑะฐัััั, ััะพะฝะฐัั
ะธ ัะธะปัะฝะตะต ะฟัะธะถะธะผะฐัััั. ะัะฑะฐะผะธ ะฟัะพะฒะตะป ะฒะปะฐะถะฝัั ะดะพัะพะถะบั ะฟะพ ัะตะต ะบ ััะบั.
-ะั-ะบั-ัะพ..
ะขััะฝะฐ ะดัะพะถะฐะป ะฒัะตะผ ัะตะปะพะผ ะพั ะฒะพะทะฑัะถะดะตะฝะธั ะบะฐะถะดัะน ัะฐะท, ะบะพะณะดะฐ ะัะบััะพ ะปะฐัะบะฐะป ะตะณะพ, ะตะผั
ั
ะพัะตะปะพัั, ััะพะฑั ัะพั ะฝะธ ะทะฐ ััะพ ะธ ะฝะธะบะพะณะดะฐ ะฝะต ะพััะฐะฝะฐะฒะปะธะฒะฐะปัั.
-ะะต ะพััะฐะฝะฐะฒะปะธะฒะฐะนัั, ะัะบััะพ.
ะั ััะธั
ัะปะพะฒ ััะบะธ ะัะบััะพ ะทะฐะผะตัะปะธ, ะพะฝ ะฒะตะดั ะดะฐะฒะฝะพ ะผะตััะฐะป, ััะพะฑั ะตะณะพ ะปัะฑะธะผัะน ัะบะฐะทะฐะป
ะตะผั ััะพ.ะ ัะตะนัะฐั ะพะฝ ะฒะทะณะปัะฝัะป ะฒ ะณะปะฐะทะฐ ะขััะฝะฐััะธ, ะฟะฐะปััะตะผ ะฟัะพะฒะตะป ะฟะพ ัะตะบะต ะธ, ะฝะตะพะถะธะดะฐะฝะฝะพ
ะดะปั ัะตะฑั, ะฟัะธะถะฐะป ะตะณะพ ะบ ัะตะฑะต ัะพ ะฒัะตะน ะฝะตะถะฝะพัััั. ะะฐะบ ะถะต ั
ะพัะตะปะพัั ะฒะพั ัะฐะบ ัะธะดะตัั
ั ะฝะธะผ, ะฟัะธะถะธะผะฐััั, ัะปััะฐั ะฑะธะตะฝะธะต ะปัะฑะธะผะพะณะพ ัะตัะดัะฐ, ะพัััะฐัั ะปะฐัะบะพะฒัะต ะฟัะธะบะพัะฝะพะฒะตะฝะธั
ัะตะฟะปัั
ััะบ, ััะฒััะฒะพะฒะฐัั ะดัั
ะฐะฝะธะต ะฝะฐ ัะฒะพะตะน ะบะพะถะต, ััะพ ะพะฑะถะธะณะฐะปะพ ะธ ัะฒะพะดะธะปะพ ั ัะผะฐ. ะ
ัะณะพะปะบะฐั
ะณะปะฐะท ัะฒะตัะบะฝัะปะธ ัะปะตะทั. ะั ััะฒััะฒะฐ, ััะพ ัะตะนัะฐั ะพั
ะฒะฐัะธะปะพ ะ ะพะบัะดะพ, ั
ะพัะตะปะพัั
ะฟะปะฐะบะฐัั. ะะฐะบะพะต ะถะต ััะพ ััะฐัััะต - ะฑััั ะบะพะผั-ัะพ ะฝัะถะฝัะผ, ะฒะฐะถะฝัะผ. ะััั ะปัะฑะธะผัะผ.
-ะฏ ะฝะธะบะพะณะดะฐ ะฝะต ะพััะฐะฝะพะฒะปััั, ะพะฑะตัะฐั ัะตะฑะต, ะผะพะต ะะตะฑะพ!
ะััััะฐะฝะธะฒัะธัั ัะปะตะณะบะฐ ะพั ะัะบััะพ, ะขััะฝะฐ ะฟะพัะตะปะพะฒะฐะป ัะพะณะพ ะฒ ะณัะฑั, ะฟะฐะปััะฐะผะธ ััะธัะฐั ัะปะตะทั
ั ัะตะบ.
-ะ ั ะฝะธะบะพะณะดะฐ ะฝะต ะพััะฐะฒะปั ัะตะฑั, ะผะพะน ะขัะผะฐะฝัะธะบ!
ะกะพะปะฝัะต ัะถะต ะบะพัะฝัะปะพัั ะผะฐะบััะตะบ ะดะตัะตะฒัะตะฒ, ะฐ ะพะฝะธ ะฟัะพะดะพะปะถะฐะปะธ ัะธะดะตัั ะผะพะปัะฐ, ะฒะตะดั ะธะผ
ะฝะต ะฝัะถะฝั ะฑัะปะธ ัะปะพะฒะฐ. ะะฝะธ ะฟะพะฝะธะผะฐะปะธ ะดััะณ ะดััะณะฐ ะธ ะฑะตะท ะฝะธั
.
-ะะถัะดะฐะนะผะตะต...ะะดะต ะฒั?
ะะดะฐะปะธ ะฟะพัะปััะฐะปะธ ะบัะธะบะธ ะะพะบัะดะตัั. ะฅัะฐะฝะธัะตะปั ะฃัะฐะณะฐะฝะฐ ะฝะพัะธะปัั ะฟะพ ะฟะฐัะบั ะฒ ะฟะพะธัะบะฐั
ัะฒะพะตะณะพ
ะฑะพััะฐ.
ะ ะพะบัะดะพ ะฒัะดะพั
ะฝัะป, ะขััะฝะฐ ัะปัะฑะฝัะปัั, ะพะฝะธ ะฟะพะฝะธะผะฐะปะธ, ััะพ ะฟัะธัะปะพ ะฒัะตะผั ะธะผ ะฒะพะทะฒัะฐัะฐัััั
ะฒ ัะพั ะผะธั, ะบะพัะพััะน ะฝะต ัะตัะฟะธั ัะตะฝัะธะผะตะฝัะฐะปัะฝะพััะตะน ะธ ะฝะตะถะฝะพััะตะน. ะะธั ะผะฐัะธะธ ะถะตััะพะบ
ะธ ะฑะตัะฟะพัะฐะดะตะฝ, ะฝะพ ััะธ ะดะฒะพะต, ะถะธะฒั ะฒ ะฝะตะผ, ั
ัะฐะฝะธะปะธ ััะฒััะฒะพ, ะบะพัะพัะพะต ัะฒัะทะฐะปะพ ะธั
.ะ,
ะฝะตัะผะพััั ะฝะฐ ะฒะพะนะฝั, ะฟะพัะตัะธ, ะพะฝะธ ะฟัะพะฝะตััั ััะพ ััะฒััะฒะพ ะดะพ ัะพะณะพ ะดะฝั, ะบะพะณะดะฐ ะธั
ะฝะต ััะฐะฝะตั
ัะถะต ะฝะฐ ะทะตะผะปะต. ะะพ, ะบัะพ ะทะฝะฐะตั, ะผะพะถะตั ะฟัะธะดะตั ะฒัะตะผั ะธ ะพะฝะธ ะฒัััะตััััั ัะฝะพะฒะฐ, ะธะฑะพ ะฝะฐััะพััะฐั
ะปัะฑะพะฒั ะฒะตัะฝะฐ ะธ ะฝะต ัะณะฐัะฐะตั, ะฝะตัะผะพััั ะฝะฐ ะณะพะดะฐ ะธ ะฒะตะบะฐ.'
- source_sentence: '- ะะฐะฝะดะฐ! ะฏ ะตัั ัะพะณะปะฐัะธะปัั ะฝะฐ ะฟะฐััะธะฒ "ัะฝะธะทั" ! ะะพ ััะพ ัะถะต ะดะฐะถะต
ะฝะต ะฟะฐััะธะฒ, ััะพ ัะถะต ะฑะตะท ะฐะบัะธะฒ ะบะฐะบะพะน-ัะพ!
- ะ ััะพ ัั ะพั ะผะตะฝั ั
ะพัะตัั-ัะพ? ะฏ ัะถะต ะฝะต ะผะพะณั ัะดะตัะถะธะฒะฐัััั!
- ะะต ะฒะดะฐะฒะปะธะฒะฐะน ะผะตะฝั ะขะะ ะฒ ััะตะฝั-ัะพ!
- ะขั. ะะพััะธ, ั
ะฒะฐัะธั ะตะปะพะทะธัั ะธ ะฝะฐัะปะฐะถะดะฐะนัั ะผะพะผะตะฝัะพะผ!
- ะะพ ััะพ ะฝะต ัะตััะฝะพ! ะะพ ะพัะฝะพัะตะฝะธั, ะผะตะถะดั ะฟัะพัะธะผ, ะบ ัะต...ะผะผะผะผะผ!!!
- ะขั. ะะพััะธ, ััะพ ััะพ?
- ะะฐะฝัะธะบ!
- ะฏ ะฒะธะถั, ะฝะฐ ะบะพะน ัััั ัั ะผะฝะต ะตะณะพ ะทะฐะฒัะทะฐะป?
- ะขะฐะบ ัั ะฟะพั
ะพะถ ะฝะฐ ะฟะพะดะฐัะพะบ!
- ะกัะฐั ะัะณะตะฝะพะผ ะพะณัะตะฑััั!
- ะ ะฟะพัะตะผั ัั ะฝะต ัะฟัะพัะธัั "ะะพะผั?"
- ะะฝะต ััะพ ะฝะต ะธะฝัะตัะตัะฝะพ! ะ ะบะพะผั?
- ะะฝะต!!! ... *ะงะผะพะบ*
- ะฅะผ... ะผะตะฝั ััะพ ะฝะต ััััะฐะธะฒะฐะตั!
- ะงะตะณะพ?!!
- ะขะพะณะพ!
- ะะะะ!!!
- ะะพะผัะธ! ะงัะพ ััะพ ะทะฝะฐัะธั? ะงัะพ ั ะะปะปะตะฝะพะผ?
- ะงัะพ? ะ, ะะฐะฝะดะฐ. ะะปะปะตะฝะฐ ะฃะพะปะบะตัะฐ ัะฐะฝะธะปะธ ะฝะฐ ะผะธััะธะธ!
- ะญัะพ ั ัะถะต ะฟะพะฝัะป! ะงัะพ ั ะฝะธะผ, ะณะพะฒะพัะธ ะบะพะฝะบัะตัะฝะตะน! - ัะฐะผััะฐะน ะฒััััั
ะฝัะป ะฝะฐัะฐะปัะฝะธะบะฐ.
- ะะต ะฟะพะฒััะฐะน ะฝะฐ ะผะตะฝั ะณะพะปะพั! - ะฒะพะทะผััะธะปัั ัะผะพััะธัะตะปั.
- ะะพั ั ะฒะฐัะตะน ัะตัััะพะน ััะพ-ะฝะธะฑัะดั ัะปััะธััั, ั ะฒะฐะผ ัะพะถะต ัะบะฐะถั "ะะต ะบัะธัะธัะต!" ะงัะพ
ั ะะพััะธ?
- ะญั
... ะฝะฐ ะผะธััะธะธ ะะปะปะตะฝ ะพัะปะตะฟ. ะะพ ะฝะต ะฟะตัะตะถะธะฒะฐะน. ะญัะพ ะฒัะตะผะตะฝะฝะพ! ะัะตะฝะธะต ะฒะพัััะฐะฝะพะฒะธััั!
ะะตัััะฐ ัะตัะตะท 3!
- 3 ะะะกะฏะฆะ?!
- ะะฐ! ะขั ัะถ ะฝะต ะพะฑะธะถะฐะน ะตะณะพ ะฟะพะบะฐ.
- ะะตะท ะฒะฐั ะทะฝะฐั!
- ะขั ะบัะดะฐ?
- ะ ะะปะปะตะฝั, ะบัะดะฐ ะถะต ะตัั! - ะณัะพะทะฝะพ ััะฒะบะฝัะป ัะฐะผััะฐะน.
- ะั
ัะถ ััะธ ะณะพะปัะฑะบะธ...
- ะ-ะบัะพ ะทะดะตัั? - ะะปะปะตะฝ ัะธะดะตะป ะฝะฐ ะบะพะนะบะต, ะทะฐะฒะตัะฝัะฒัะธัั ะฒ ะพะดะตัะปะพ.
- ... - ัะฐะณะธ ะฟัะธะฑะปะธะถะฐะปะธัั.
- ะ-ะฝะต ะฟะพะดั
ะพะดะธ! - ะฐ ะฒั ะฑั ะฝะต ะธัะฟัะณะฐะปะธัั, ะฟัะตะถะดะต ะพััะฐะฒัะธัั ะฒ ะพะดะธะฝะพัะตััะฒะต, ััะตะดะธ
ะฐะบัะผ, ะฒ ะณัััะพะผ ะปะตัั, ะฑะตะท ะทัะตะฝะธั? ะขะพ-ัะพ ะถะต!
- "ะะต ะพััะฐะฒะปั!"
- ะงะธััะฐั ะกะธะปะฐ! - ะทะฐะฝัั ััะบั, ะฟัะตะดัะฟัะตะถะดะฐั ะฒัะฐะณะฐ.
- "ะะธ ะทะฐ ััะพ ะฑะพะปััะต ะฝะต ะพััะฐะฒะปั ะพะดะฝะพะณะพ!" ะะปะปะตะฝ! - ะฟะพะดั
ะฒะฐัะธัั ัะพะฝะบะพะต ัะตะปััะต ะฝะฐ ััะบะธ,
ะฟัะธะถะฐัั ะบ ัะตะฑะต, ะปะฐะดะพะฝัั ะฝะฐะบััะฒ ะณะปะฐะทะฐ ะะพััะธ, ะบะฐะบ ะฒ ะฟะตัะฒะพะผ ะฟะพัะตะปัะต. ะ ะบะพัะฝััััั
ัะณะพะปะบะฐ ัะพัะธะบะฐ ัะฒะพะธะผะธ ะณัะฑะฐะผะธ.
- ะ-ะะฐะฝะดะฐ?!
- ะะต ะฒะพะปะฝัะนัั! ะฏ ััะฐะฝั ัะฒะพะธะผะธ ะณะปะฐะทะฐะผะธ ะฟะพะบะฐ ัั ะฟัะพะดะพะปะถะฐะตัั ะฑััั ะผะพะธะผ ะกะตัะดัะตะผ ะะตะฒะธะฝะฝะพััะธ.
ะ ัั ัััะฐะฒัะธะน ะธะดััั ั ะผะธััะธะธ.
ะ ั ัััะฐะฒัะธะน ะธะดั ั ััะตะฝะธัะพะฒะบะธ.
ะขะฒะพะธ ะฝะพะณะธ ะธััะพะฟัะฐะฝั.
POV ะะฐะฝะดั.
ะะพั ะณะพะปะพะฒะฐ ะฑะพะปะธั.
ะขะฒะพะธ ััะบะธ ะฝะพัั.
ะะพั ัะตัะดัะต ะธััะพะผะธะปะพัั.
ะ ะฒะพั ะผั ะธะดัะผ ะดััะณ ะฝะฐ ะดััะณะฐ, ะฟะพะดะฝะธะผะฐะตะผ ะณััััะฝัะต, ะธะทะผััะตะฝะฝัะต ะณะปะฐะทะฐ ะดััะณ ะบ ะดััะณั.
ะขั ะพััะฐะฝะฐะฒะปะธะฒะฐะตัััั.
ะััะฐะบ, ััะพ-ัะพ ะณะพะฒะพัะธัั.
ะงัะพ-ัะพ ะบัะธัะธัั.
ะ ััะผ-ัะพ ะผะพะปัะธัั.
ะะฐะบ-ัะพ ัะผะพััะธัั.
ะ ััะผ-ัะพ ะฒะพะปะฝัะตัััั.
ะกะฝะพะฒะฐ ะพ ััะผ-ัะพ ะบัะธัะธัั.
ะก ะณัััััั ัะผะพััะธัั.
ะ ะบะพะผ ัั ะผััะฐะตัััั?
ะะตะปะฐะตัั ัะฐะณ, ะตัั ะพะดะธะฝ.
ะฅะฒะฐัะฐะตัั ะทะฐ ะฒะพัะพัะฝะธะบ.
ะัะธะฒััะฐััั ะฝะฐ ะฝะพัะพัะบะฐั
.
ะฆะตะปัะตัั...
ะััะฐะบ, ัั ะถะต ัััะฐะป!
ะััะฐะบ, ั ะถะต ัััะฐะป!
ะฏ ะพััะฐะฝะฐะฒะปะธะฒะฐััั.
ะััะฐะบ, ััะพ-ัะพ ะพัะฒะตัะฐั.
ะงัะพ-ัะพ ะบัะธัั.
ะะฐ ััะผ-ัะพ ะทะฐะผะพะปะบะฐั.
ะขัะฟะพ ัะผะพััั.
ะงัะพ-ัะพ ัะตะผะธั.
ะะฟััั ััะพ-ัะพ ะพัั.
ะััะตััะฝะฝะพ ัะผะพััั.
ะะฐ ะบะพะณะพ-ัะพ ะฒะพะปะฝัััั.
ะกัะพั.
ะะฐััะพัะฐะถะธะฒะฐััั.
ะัั ัะฐะฒะฝะพ.
ะะตัะถะตะปะธ?!
ะัะฒะตัะฐั ะฝะฐ ะฟะพัะตะปัะน.
ะะฐะบ ะถะต ะผั ัััะฐะปะธ!
- ะะฐะฒะฝะพ?
- ะัะตะณะดะฐ, ะะพััะธ.
- ะะตั, ัะตััะฝะพ, ั ะฝะตะฝะฐะฒะธะถั ะณะพะปัะฑะตะน! ะ ะพัะพะฑะตะฝะฝะพ ะฑะตะปัั
! - ัะบะฐะฝะดะฐะปะธะป ะะฐะฒะธ ะธะดั ะฟะพ ะบะพัะธะดะพัั
ะงััะฝะพะณะพ ะัะดะตะฝะฐ.
- ะงะตะผ ะถะต ะพะฝะธ ัะตะฑะต ะฝะต ะฝัะฐะฒัััั? - ัะฟัะพัะธะปะฐ ะดะตะฒััะบะฐ-ะบะธัะฐัะฝะบะฐ.
- ะะฐ ะฑะปะธะฝ, ัะธะดัั ะฒะตะทะดะต ะณะดะต ะฝะต ะฟะพะฟะฐะดั ะธ ัะฒะตัั
ั ะบะฐะบะฐัั!
ะ ัะพะปัะบะพ ะพะฝะธ ะทะฐัะปะธ ะทะฐ ะฟะพะฒะพัะพั, ะบะฐะบ ััะปััะฐะปะธ:
- ะะตั, ะะฐะฝะดะฐ, ะฟัะตะบัะฐัะธ! ะะฐั ะผะพะณัั ัะฒะธะดะตัั!
- ะะฐ ะบัะพ ะฝะฐั ััั ะผะพะถะตั ัะฒะธะดะตัั, ะะพััะธ?
- ะั, ะบัะพ-ะฝะธะฑัะดั! ะั
... ะฎั!
- ะะธะผะพ ะณะพะปัะฑััะฝะธ ะฝะธะบัะพ ะฝะต ะฟัะพั
ะพะดะธั! ะญัะพ ะฝะฐะดัะถะฝะฐั ัะฐััั ะทะฐะผะบะฐ!
- ะั, ะฝั, ะฝั ะปะฐะดะฝะพ... ะฝะพ ะฟะพัะตะผั ะธะผะตะฝะฝะพ ััั?
- ะ ะบัะพ ะฒะตัะตัะฐะป ััะพ ะตะผั ัะพะผะฐะฝัะธะบะธ ะฝะต ั
ะฒะฐัะฐะตั? - ะะฐะฝะดะฐ ะฝะตะดะฒััะผััะปะตะฝะฝะพ ะทะฐะถะธะผะฐะป ะะปะปะตะฝะฐ
ะฝะฐ ะฟะพะดะพะบะพะฝะฝะธะบะต.
ะะฐะฒะธ ะธ ะะธะฝะฐะปะธ ัะฐัะฐั
ะฝัะปะธัั ะพะฑัะฐัะฝะพ.
- ะฅะพัั ะทะฝะฐะตัั, ะะธ. ะะพะถะตั ะฒ ััะธั
ะณะพะปัะฑะบะฐั
ััะพ-ัะพ ะธ ะตััั!
ะััะฐัั ััะฐะฝะพะฒะธััั ะฒัั ัััะดะฝะตะต, ะผะตะฝั ะทะฐะณะพะฝััั ะฒ ัะณะพะป. ะฏ ะธัะฟัะณะฐะฝะฝะพ ะพะฑะพัะฐัะธะฒะฐััั
ะธ ะฒะธะถั... ะตะณะพ.
- ะขั, ะฟัะพะบะปัััะน! - ะพะบัะธะบะธะฒะฐะตั ัััะพะณะธะน ัะฟะพะฝะตั.
ะ ั ะผะพะปัั, ะดัั
ะฐะฝะธะต ัะฑะธะปะพัั, ะฒะทะฒะพะปะฝะพะฒะฐะฝ. ะงัะพ ะตะผั ะฝัะถะฝะพ?
- ะะต ะดัะผะฐะป ััะพ ัั ะพะฟัััะธัััั ะดะพ ะบัะฐะถ, ะะพััะธ.
ะงัะพ? ะัะฐะถ? ะะฐะบะธั
ะบัะฐะถ, ะฎั?
- ะขั ะพ ััะผ?
- ะะผะตะฝะฝะพ ัั, ะพ ั ะฝะต ัะพะผะฝะตะฒะฐััั, - ะพะฝ ะธะทะดะตะฒะฐะตััั? - ัั ัะบัะฐะป ั ะผะตะฝั ะพะดะฝั ะฒะตัั.
- ะะฐะฝะดะฐ, ั ะฝะธัะตะณะพ ะฝะต ะฑัะฐะป! - ะพัะบัะดะฐ ััะพ ััะฒััะฒะพ ะฑะตะทััั
ะพะดะฝะพััะธ? ะะฝ ะพััะฐะฝะพะฒะธะปัั.
- ะะธะฑะพ ะฒะตัะฝะธ ะผะฝะต ะตะณะพ, ะปะธะฑะพ ั ะทะฐะฑะตัั ัะฒะพั! - ะงัะพ? ะฏ ะตะณะพ ะฝะต ะฟะพะฝะธะผะฐั! ะงัะพ ะพะฝ ะดะตะปะฐะตั?
ะฅะฒะฐัะฐะตั ะทะฐ ะฟะพะดะฑะพัะพะดะพะบ, ัะฐัะธั ะฝะฐ ัะตะฑั. ะ... ะฑะพะถะต, ััะพ ััะพ? ะะฝ... ะฏ ััะฒััะฒัั ะตะณะพ
ะณัะฑั ะธ ัะฐะผ ะฝะต ะฟะพะฝะธะฐั - ะพัะฒะตัะฐั! ะะฐะฝะดะฐ, ัั, ัั, ัั... ะฝะฐััะพััะธะน ะฒะพั! ะขั ะบัะฐะดััั
ะผะพั ัะตัะดัะต!
- ะ-ะบะฐะฝะดะฐ... - ััะบะธ ะฟะพะฒะธัะปะธ. ะะต ัะพะพะฑัะฐะถะฐั.
- ะะตัะฝะธ ะผะพั ัะตัะดัะต, ะััะฑะฐะฝะฝัะน ะกััััะพะบ!
- ะฏ... ะฝะธ ะทะฐ ััะพ! ะฏ ะพััะฐะฒะปั ะตะณะพ ัะตะฑะต! ะ ัั... ัะถะต ะทะฐะฑัะฐะป ะผะพั...
ะะพะปะฝะพะฒะฐะปัั ะบะฐะบ-ัะพ ะะปะปะตะฝ, ะฒะตะดั ะะฐะฝะดะฐ ะฑัะป ะฝะฐ ะผะธััะธะธ. ะ ะฒะพั ัะปัั ะพะฝ ะตะผั ะฟะธััะผะพ ัะตัะตะท
ะณะพะปะตะผะฐ.
ะ: ะขั ัะฐะผ ะฒะพะพะฑัะต ะถะธะฒะพะน, ะะฐะะฐะฝะดะฐ?
ะ: ะะฐ, ะะปะปะตะฝ. ะะธะฒะพะน, ั ะถะธะฒะพะน!
ะ: ะะฐะฝะดะฐ! ะงัะพ ั ัะพะฑะพะน? ะขะตะฑะต ะฟะปะพั
ะพ? ะฃะผะธัะฐะตัั? ะขะตะฑั ะะธะฝะฐะปะธ ะฟะพัะตะปะพะฒะฐะปะฐ? ะะตัะถะธัั ะดััะณ!
ะ: ะขั ัั? ะกะพ ะผะฝะพะน ะฒัั ั
ะพัะพัะพ, ะะพััะธ!
ะ: ะคัั
... ะฝะต ะฟัะณะฐะน ะผะตะฝั ัะฐะบ ะฑะพะปััะต.
- ะญะน, ะะฐะฝะดะฐ, ะฝะต ะณััััะธ! ะญัะพ ัะฐะบ ะฝะฐ ัะตะฑั ะฝะต ะฟะพั
ะพะถะต!
- ะขะตะฑะต ะปะตะณะบะพ ะณะพะฒะพัะธัั. ะฃ ะฒะฐั ั ะะฐะฒะธ ะฒัั ะฝะฐะปะฐะถะธะฒะฐะตััั. ะ ะฝะฐ ะผะตะฝั ะฃะพะปะบะตั ะดะฐะถะต ะฝะต
ัะผะพััะธั!
- ะขั ะฟะพะณะพะฒะพัะธ ั ะฝะธะผ!
- ะขั, ะธ ัะฐะบ ะบะฐะถะดัะน ะดะตะฝั ะฒะธะดะธะผัั!
- ะะตั, ะฎั, ะฟะพะณะพะฒะพัะธ ั ะฝะธะผ, ะบะฐะบ ัะพ ะผะฝะพะน! ะัั ะพะฑัะฐะทัะผะธัััั ัะปััะธัั?
- ะะต ะฒะตัั ั ะฒ ััะพ.
- ะ ัั ะฟะพะฒะตัั! ะงัะดะพ - ะฑัะฒะฐะตั!
http://vkontakte.ru/photo63528512_276702591
-ะะต ะพัะดะฐะผ. ะกะปััะธัะต?! ะะธะบะพะณะดะฐ ะฝะต ะพัะดะฐะผ ะฒะฐะผ ะะฐะฝะดั!!!
- ะญัะพ ะฝะต ัะตะฑะต ัะตัะฐัั, ะะปะปะตะฝ ะฃะพะปะบะตั!
- ะะต. ะะพะดั
ะพะดะธัะต. ะ. ะะฐะผ.
- ะะฝ ะฟัะธะฝะฐะดะปะตะถะธั ะฝะฐะผ!
- ะฏ... ะพะฝ... ะฃะะฌะฎ!!!
http://vkontakte.ru/photo63528512_276702661'
sentences:
- 'ะกะตะณะพะดะฝั, ะฟััะณะฐั ะฝะฐ ะบัะพะฒะฐัะธ, ะะธัะฐ ัะปะพะผะฐะปะฐ ะตะต. ะะฝะฐ ะพััะฐัะฝะฝะพ ะฟััะฐะปะฐัั ะดะพะฟััะณะฝััั
ะดะพ ะฟะพัะพะปะบะฐ, ะฝะพ ะฝะธัะตะณะพ ะฝะต ะฟะพะปััะฐะปะพัั, ะธ ะพะฟะธะปะบะธ ะปะธัั ััะตัะฝะพ ััะฟะฐะปะธัั ะฝะฐ ะฟะพะป. ะะธะบัะพ
ะฝะต ัะปััะฐะป ะฝะธ ัะบัะตะถะตัะฐ ะฟััะถะธะฝ, ะฝะธ ะณัะพั
ะพัะฐ; ะฝะต ะฑัะปะพ ะฒะธะดะฝะพ ะธ ัะฐะผะพะน ะฟะพะปะพะผะบะธ.
ะะฐัั, ััะณะฐั ะดะพัั, ะฒ ะพัะฒะตั ะฟะพะปััะธะปะฐ ะปะธัั ัััะฐะปะพะต ัะฐะฒะฝะพะดััะธะต, ััะพ, ะบะพะฝะตัะฝะพ ะถะต, ะฒัะฒะตะปะพ
ะตะต ะธะท ัะตะฑั. ะัะธัะฐ ััะพ-ัะพ ะฝะตัะปะตะฝะพัะฐะทะดะตะปัะฝะพะต, ะพะฝะฐ ััััะฐะปะฐ ะฝะพะณะพะน ะฟะพ ัะปะพะผะฐะฝะฝะพะผั ะฟัะตะดะผะตัั.
ะะตะฝัะธะฝะฐ ะฝะต ะฟะพะฝะธะผะฐะปะฐ, ััะพ ะพะฝะฐ ะดะตะปะฐะปะฐ ัะพะปัะบะพ ั
ัะถะต, ะฝะพ ะณะฝะตะฒ ะฒ ะตะต ะบัะพะฒะธ ะฒะทัะป ะฒะตัั
.
- ะะฐ ะบะฐะบ ัั ัะผะตะตัั, ะฟะฐััะธะฒะฐั ะดะตะฒัะพะฝะบะฐ! ะฏ ัะพะปัะบะพ ะธ ะดะตะปะฐะปะฐ, ััะพ ัั
ะฐะถะธะฒะฐะปะฐ ะทะฐ ัะฒะพะตะน
ะบัะพะฒะฐััั! ะ ัั ัะตัะธะปะฐ ััััะพะธัั ะฟะพะณัะพะผ?! ะะฝะฐะตัั, ััะพ?! ะฏ ััะพ ัะฐะบ ะฝะต ะพััะฐะฒะปั! -
ะฝะฐ ััะธั
ัะปะพะฒะฐั
ะถะตะฝัะธะฝะฐ, ัััั ะปะธ ะฝะต ัะฝะธะผะฐั ะดะฒะตัั ั ะฟะตัะตะปั, ะฒัะฑะตะถะฐะปะฐ ะธะท ะบะพะผะฝะฐัั.
ะะธัะฐ ัะตะทะบะพ ะพะฟัััะธะปะฐัั ะฝะฐ ะบะพะปะตะฝะธ. ะัะธะถะฐะฒ ััะบะธ ะบ ะบัะพะฒะฐัะธ, ะพะฝะฐ ะฟััะฐะปะฐัั ัะดะตัะถะธะฒะฐัั
ะตะต ะฝะตะฒัะฝะพัะธะผัะน ัะบัะตะถะตั.
ะะทัะฒ ะผะพะปะพัะพะบ ะธ ะณะฒะพะทะดะธ ะธะท ะบะปะฐะดะพะฒะพะน, ะดะตะฒะพัะบะฐ ะฑะตะทะฝะฐะดะตะถะฝะพ ะบะพะปะพัะธะปะฐ ะฟะพ ะพะฑะปะพะผะบะฐะผ, ะฟััะฐััั
ั
ะพัั ะบะฐะบ-ัะพ ะธั
ัะพะตะดะธะฝะธัั. ะะพ ะฒัะต ะพะบะฐะทะฐะปะพัั ะฑะตะทัะตะทัะปััะฐัะฝะพ: ะพะฑะปะพะผะบะธ ะปะธัั ั ะตัะต
ะฑะพะปััะธะผ ัััะตะผะปะตะฝะธะตะผ ัะฐัะบะฐะปัะฒะฐะปะธัั ะฟะพะด ะณะฝะตัะพะผ ะณะฒะพะทะดะตะน.
ะะฝะฐ ะปะตะณะปะฐ ะฝะฐ ะฟะพะป. ะะตะณะบะธะน ัะบะฒะพะทะฝัะบ ัะตะบะพัะฐะป ะตะต ัะฟะธะฝั.
- ะฏ ะฝะธะบะพะณะดะฐ ะฝะต ัะผะพะณั ะดะพะฟััะณะฝััั ะดะพ ะฟะพัะพะปะบะฐ, - ัะบะฐะทะฐะปะฐ ะะธัะฐ ะธ ะฒัะดะพั
ะฝัะปะฐ.
- ะ ะฒะดััะณ ััะพ ะฝะต ัะฐะบ?
ะะธัะฐ ัะตะทะฒะพ ะฒััะฐะปะฐ. ะะฐ ะตะต ะปะธัะต ะฟะพัะฒะธะปะฐัั ะผะฐัะบะฐ ะฝะตะดะพัะผะตะฝะธั, ะฐ ะฒ ะณััะดะธ ะฝะฐัะฐะป ัะฐะทะถะธะณะฐัััั
ะพะณะพะฝะตะบ ัััะฐั
ะฐ. ะัะบัะดะฐ ััะพั ะณะพะปะพั?
- ะะต ะฑะพะนัั, ะณะปัะฟััะบะฐ, - ะณะพะปะพั ะฑัะป ะพัะตะฝั ะผัะณะพะบ.
- ะัะบัะดะฐ ัั? ะฏ ัะตะฑั ัะฐะฝััะต ะฝะต ัะปััะฐะปะฐ...
- ะ ัะฐะทะฒะต ััะพ ะฒะฐะถะฝะพ?
- ะ ััะพ, ะฝะตั?
- ะะพัะตะผั ััะพ ะดะพะปะถะฝะพ ะฑััั ะฒะฐะถะฝะพ? ะ ะฐะทะฒะต ะฝะตะปัะทั ะฟัะพััะพ ะฟะพะณะพะฒะพัะธัั ั ัะพะฑะพะน?
- ะขั ะดัะผะฐะตัั, ั ะฑัะดั ะณะพะฒะพัะธัั ั ะฝะตะทะฝะฐะบะพะผัะผ ะณะพะปะพัะพะผ?
- ะ ะฟะพัะตะผั ะฝะตั?
- ะขะฐะบ. ะะฝะต ะฝะฐะดะพะตะดะฐะตั ััะฐ ะธะณัะฐ ะฒ ะฒะพะฟัะพัั. ะะพะฒะพัะธ, ััะพ ะธะปะธ ะบัะพ ัั ะตััั?
ะะฝะตะทะฐะฟะฝะพ ะฝะฐัััะฟะธะปะพ ะผะพะปัะฐะฝะธะต, ะฟะพัะปะต ัะตะณะพ ะฟะพัะปะตะดะพะฒะฐะปะพ ะฟัะพะดะพะปะถะธัะตะปัะฝะพะต ะณัะดะตะฝะธะต.
ะะพะปะพั ะฝะฐัะฐะป ะฝะฐะฟะตะฒะฐัั ะฟะตัะตะฝะบั, ะฝะต ะฟะตัะฝั, ะฐ ะธะผะตะฝะฝะพ ะฟะตัะตะฝะบั. ะัะฑะธะผัั ะฟะตัะตะฝะบั ะะธัั,
ะบะพัะพััั ะพะฝะฐ ะทะฐะฒะพะดะธะปะฐ ะบะฐะถะดัะน ัะฐะท, ะบะพะณะดะฐ ะปะพะผะฐะปะพัั ััะพ-ะฝะธะฑัะดั ะฒ ะตะต ะบะพะผะฝะฐัะต.
- ะฏ ะผะพะณั ะฟะพัััะพะธัั ัะตะฑะต ะฝะพะฒัั ะบัะพะฒะฐัั. ะะพัะฐะทะดะพ ะปัััะต ััะพะน. ะ ะฝะตะน ะฑัะดะตั ะผะฝะพะณะพ ัะฒะตัะพะฒ
ะธ ัะปะฐะดะพััะตะน...
ะะตะฒะพัะบะฐ ะพะถะธะฒะธะปะฐัั. ะ ะตะต ัะตัะธ ะฟะพัะปััะฐะปะธัั ะฝะพัะบะธ ัะฐะดะพััะธ.
- ะัะฐะฒะดะฐ? ะขั ัะดะตะปะฐะตัั ััะพ?
- ะะฐ, ะฝะพ ะฒะพั ัะพะปัะบะพ...
- ะงัะพ "ัะพะปัะบะพ"?
- ะขะพะปัะบะพ ะพะฝะฐ ะฑัะดะตั ะฝะต ะฝะฐััะพััะตะน. ะขั ะฝะต ัะผะพะถะตัั ะฝะฐ ะฝะตะน ัะฟะฐัั, ะฝะพ ะพะฝะฐ ะฑัะดะตั ะฒ ัะฒะพะตะน
ะบะพะผะฝะฐัะต. - ะณะพะปะพั ะพัะบะฐัะปัะปัั. - ะั
, ะดะฐ. ะัะพะผะต ัะตะฑั ะตะต ะฝะธะบัะพ ะฝะต ัะฒะธะดะธั.
ะะตะฒะพัะบะฐ ะทะฐะดัะผัะธะฒะพ ัะปัะฑะฝัะปะฐัั.
- ะะพ ะบะพะณะดะฐ ะถะต ั ัะผะพะณั ัะฒะธะดะตัั ัะฒะพั ะบัะพะฒะฐัั?
ะะพะปะพั ะฝะฐัะฐะป ัะผะตััััั. ะกะธะปัะฝะพ, ะดะพะปะณะพ, ะฝะพ ะผัะณะบะพ. ะญัะพั ัะผะตั
ะฑัะป ะพัะตะฝั ะธ ะพัะตะฝั ะฝะตะพะฑััะตะฝ:
ะฒัะพะดะต ะฑั ะธ ะดะพะฑััะน, ะฐ ะฒัะพะดะต ะฑั ะธ ั ะฝะฐัะผะตัะบะพะน.
ะะฐะปะพััั.
ะะฐะปะพััั ัะฟัะฐะฒะปัะปะฐ ะธะผ.
- ะะพัะตะผั ัั ัะผะตะตัััั?
- ะะฐ ะฟะพัะพะผั ััะพ ัั ะณะปัะฟะฐั ะดะตะฒะพัะบะฐ, ะบะพัะพัะฐั ะดะฐะถะต ะฝะต ะผะพะถะตั ัะตัะธัั.
- ะฏ ะฒะพะฒัะต ะฝะต ะณะปัะฟะฐ!
- ะะฐ? ะขะฐะบ ะพัะฒะตัั: ัะตะฑะต ะฝัะถะฝะพ ัะพ, ััะพ ั ะฟัะตะดะปะฐะณะฐั?
- ะะพ ััะพ ะถะต ะฒะพะฒัะต ะฝะต ะฝะฐััะพััะฐั ะบัะพะฒะฐัั! - ะะธัะฐ ะฟัะธะปะพะถะธะปะฐ ััะบะธ ะบ ะปะธัั. - ะะฐ ะฝะตะน
ั ะฝะต ัะผะพะณั ะดะพะฟััะณะฝััั ะดะพ ะฟะพัะพะปะบะฐ!
ะะพะปะพั ะพะฟััั ะทะฐะปะธะปัั ัะผะตั
ะพะผ.
- ะะะงะะะฃ ะขะซ ะกะะะะจะฌะกะฏ ะะกะ ะะ ะะะฏ?!
- ะะฐ ะฟะพัะพะผั ััะพ ัั ัะถะต ัะตัะธะปะฐ. ะฃะถะต ะดะฐะฒะฝัะผ-ะดะฐะฒะฝะพ ัะตัะธะปะฐ.
- ะ ััะพ ะถะต ั ัะตัะธะปะฐ?
- ะขั ัะพะณะปะฐัะฝะฐ, ะฒะตะดั ัะฐะบ?
ะะธัะฐ ะทะฐะผะตัะบะฐะปะฐัั, ะฝะพ, ะฒัะต ะถะต, ะฒัะดะฐะฒะธะปะฐ ะธะท ัะตะฑั ะฝะตัะฒะตัะตะฝะฝะพะต "ะดะฐ".
ะะพะปะพั ะฟัะพะฟะฐะป, ะพััะฐะฒะธะฒ ะฟะพัะปะต ัะตะฑั ะพะณัะพะผะฝัั ะบัะพะฒะฐัั, ั ะฑะพะปััะธะผ ะผะฐััะฐัะพะผ ะธ ะผัะณะบะธะผะธ
ะฟะพะดััะบะฐะผะธ. ะะฐ ัะฐะบะพะน ะบัะพะฒะฐัะธ, ะพะฟัะตะดะตะปะตะฝะฝะพ, ะผะพะถะฝะพ ะฑัะปะพ ะฑั ะดะพะฟััะณะฝััั ะดะพ ะฟะพัะพะปะบะฐ.'
- 'ะะพะฝะตั ะณะพะดะฐ - ััะพ ะฟะพัะฐ ะดะปั ัะฐะดะพััะธ, ะฒ ะฟัะตะดััะฒััะฒะธะธ ะฝะฐะดะฒะธะณะฐััะธั
ัั ะบะฐะฝะธะบัะป, ัะฒะพะฑะพะดั.
ะญัะพ ะฑัะปะพ ะฝะฐัะฐะปะพ ะผะฐั, ะบะพะณะดะฐ ะฝะฐ ัะปะธัะต ัะถะต ัะตะฟะปะพ, ะฐ ะฟะพ ัััะฐะผ ะทัะฑะบะพ. ะะพะณะดะฐ ัะฒะตัั ัะถะต
ัะฐััะฒะตะปะธ ะธ ะฝะฐัะฐะปะธ ะฑะปะฐะณะพัั
ะฐัั. ะกััะฐั ะทะตะผะปั ะฟะพะบััะฒะฐะปะฐัั ััะฐะฒะธะฝะพัะบะฐะผะธ, ะธ ะฟะพ ะฝะตะน ััะดะฐ-ััะดะฐ
ัะฝะพะฒะฐะปะธ ะฑัะบะฐัะบะธ-ัะฐัะฐะบะฐัะบะธ.
ะัะธัั ะปะตัะฐะปะธ ะฝะฐะด ะดะตัะตะฒััะผะธ, ัะธัะธะบะฐั ะธ ัััะตะบะพัะฐ, ะฐ ะบะฐะบะฐั-ัะพ ะพัะพะฑะตะฝะฝะพ ััะตัะดะฝะพ ะฝะฐะฟะตะฒะฐะปะฐ:
~ midori tanabiku namimori no
dainaku shounaku nami ga ii
itsumo kawaranu
sukoyaka kenage
aa~
tomo ni utaou
namimorichuu ~
ะะฐ... ััะพ ะฑัะปะฐ ัะฐ ัะฐะผะฐั ัะพะบะฝััะฐั ะฟัะธัะบะฐ, ั
ะพะทัะธะฝะพะผ ะบะพัะพัะพะน ะฑัะป ะฝะต ะผะตะฝะต ัะพะบะฝัััะน
ะฅะธะฑะฐัะธ ะัั. ะฅะพัั ะฝะฐะทะฒะฐัั ะตะณะพ ัะฐะบ ะฟัะธะปัะดะฝะพ ะฝะธ ั ะบะพะณะพ ะฑั ัะทัะบ ะฝะต ะฟะพะฒะตัะฝัะปัั... ะฝั,
ะฟะพััะธ ะฝะธ ั ะบะพะณะพ.
ะัะตะผะตะฝะฐ ัะบะพะปัะฝะพะน ะฟะพัั ะฟัะพัะปะธ, ะธ ัะตะฟะตัั ะฝะฐััะฐะปะธ ะฝะต ะผะตะฝะตะต ะฝะฐัััะตะฝะฝัะต ะฒัะตะผะตะฝะฐ ัััะดะตะฝัะตััะฒะฐ.
ะขะฐะบ ัะถ ะฟะพะปััะธะปะพัั, ััะดัะฑั ะทะปะฐั ัััะบะฐ, ััะพ ะฑะตะดะฝะพะณะพ ะกะฐะฒะฐะดั ะขััะฝะฐััะธ ะฟะตัะตะฝะฐะฟัะฐะฒะธะปะธ
ะฒ ัะฝะธะฒะตััะธัะตั, ะณะดะต ะณะปะฐะฒะพะน ะดะธััะธะฟะปะธะฝะฐัะฝะพะณะพ ะบะพะผะธัะตัะฐ ะฑัะป ัััะฐั
ะธ ัะถะฐั ะตะณะพ ะถะธะทะฝะธ
- ะฅะธะฑะฐัะธ ะัั! ะั, ัะฐะทัะผะตะตััั ะฟะพัะปะต ัะตะฟะตัะธัะพัะฐ... ะฝะพ ะฝะต ะพะฑ ััะพะผ ัะตะนัะฐั. ะัะฑะพะฟััะฝะพ,
ััะพ ะฑะตะดะฝะพะณะพ ะกะฐะฒะฐะดั ะขััะฝะฐััะธ, ะพัะธะฑะพัะฝะพ, ะทะฐะฟะธั
ะฝัะปะธ ััะฐะทั ะฝะฐ 2 ะบััั! ะ-ะดะฐ... ะฝะต ะฟะพะฒะตะทะปะพ
ัะตะฑัะฝะบั...
ะะพ ััั ัะพัััะฝะฐ ะฟะพะฒะตัะฝัะปะฐัั ะบ ะฝะตะผั ัะฒะพะธะผ ััะปะพะผ, ะธ ะฒ ะตะณะพ ะบะปะฐััะต ะพะฝ ะฟะพะฒัััะตัะฐะป ะทะฐะผะตัะฐัะตะปัะฝะพะณะพ
ัะตะปะพะฒะตะบะฐ - ะะปะปะตะฝะฐ ะฃะพะปะบะตัะฐ.
ะก ะฝะธะผ ะพะฝะธ ะผะธะณะพะผ ัะดััะถะธะปะธัั ะธ ััะฐะปะธ, ะฝะต ัะฐะทะปะตะน ะฒะพะดะฐ. ะะพ ััะพ ะฑัะปะพ ะพัะตะฝัั, ะฐ ัะตะฟะตัั
ะฒะตัะฝะฐ! ะ ััะพ ะทะฝะฐัะธั...
ะกัะตะฝะฐ 1. ะัะฑะปั 1.
- ะขััะฝะฐ, ะฝะต ะฟะตัะตะถะธะฒะฐะน ัั ัะฐะบ! ะกะดะฐัั ัั ััะธ ัะบะทะฐะผะตะฝั! ะะตะดั ะธ ั, ะธ ัะฒะพะน ัะตะฟะตัะธัะพั
ะทะฐะฝะธะผะฐะปะธัั ั ัะพะฑะพะน ะฒะตัั ััะตะฑะฝัะน ะณะพะด! ะขั ะดะฐะถะต ะฝะฐัะฐะป ะฟะพะฝะธะผะฐัั ะฐะทั ัะปะตะบััะพัะธะทะธะบะธ!
- ััะฟะพะบะฐะธะฒะฐะป ะฒะตัะฝะพ ะปะพัะปัะฝัะน ัะตะดะพะน, ะฟะพะณะปะฐะถะธะฒะฐั ะขััะฝั ะฟะพ ะฟััะธััะพะน ะบะฐััะฐะฝะพะฒะพะน ัะตะฒะตะปััะต.
- ะั, ะฐ ะตัะปะธ ััะพ, ะพััะฐะฝะตัััั ะฝะฐ ะฒัะพัะพะน ะณะพะด! ะะพะฝ, ะฝะตะบะพัะพััะต ัะฐะบ ัะถะต 3 ัะฐะทะฐ ะดะตะปะฐะปะธ!
- ะบะธะฒะฝัะป ะพะฝ ะฝะฐ ะะฐะฝะดั, ััะพ ัะธะดะตะป ั ะพะบะฝะฐ ะฒ ะบะพะฝัะต ะบะปะฐััะฐ.
ะะฐะฝะดะฐ ะฎั, ะพ-ะพ-ะพ! ะญัะพ, ะฒะพะพะฑัะต, ะพัะดะตะปัะฝะฐั ะธััะพัะธั! ะฅัะปะธะณะฐะฝ, ะพัะปะธัะฝะธะบ, ะบัะฐัะฐะฒะตั,
ะฟะพัะปะตะดะฝัั ัะบะพัะธะฝะฐ, ัะตะปะพะฒะตะบ ัะตััะธ, ะฑะตะทะดะฐัั, ะณัะพะทะฐ ะฒัะตั
ะธ ะฒัั... ััะฒััะฒะฐ ัะผะตัะฐะฝะฝัะต.
ะะฐะบ ะฒัั ััะพ ะธ ะตัั ะผะฝะพะณะพ "ะฟะพะปะพะถะธัะตะปัะฝัั
" ะบะฐัะตััะฒ ะฝะฐั
ะพะดัััั ะฒ ะพะดะฝะพะผ ัะตะปะพะฒะตะบะต, ะะปะปะตะฝ
ะพัะบะฐะทัะฒะฐะปัั ะฟะพะฝะธะผะฐัั!
- ะะพ ะพะฝ ั
ะพัั ะฑั ะบัััะพะน, ะธ ะพัะปะธัะฝะธะบ, ะฐ ั ะบะฐะบ ะฑัะป ะฝะธะบัะตะผะฝัะผ, ัะฐะบะธะผ ะธ ะพััะฐะฝััั. ะะฝะต
ะฝะต ัะดะฐัั ััะธ ัะบะทะฐะผะตะฝั, ะฝะธ ะทะฐ ััะพ ะฒ ะถะธะทะฝะธ! - ะฟัะพะดะพะปะถะฐะป ัััะฐะดะฐัั ะขััะฝะฐ, ัั
ะฒะฐัะธะฒัะธัั
ะทะฐ ะณะพะปะพะฒั. ะขะฐะบะธะผ ะพะฝ ะฑัะป, ัะปะธัะบะพะผ ะฝะตัะฒะตัะตะฝะฝัะผ ะฒ ัะตะฑะต, ะฟะตััะธะผะธััะธัะฝัะผ, ะฐ ะตัั ะฟะพัะปะตะดะฝะธะผ
ะฝะตัะดะฐัะฝะธะบะพะผ... ัะฟะธัะพะบ ะผะพะถะฝะพ ะฟัะพะดะพะปะถะธัั. ะะพ ะฒ ัะพะถะต ะฒัะตะผั, ัะฐะดะธ ะดััะทะตะน ะพะฝ ะฑัะป ะณะพัะพะฒ
ะฝะฐ ะผะฝะพะณะพะต! ะะณะพ ะพัะทัะฒัะธะฒะพััั, ะดะพะฑัะพัะฐ ะฝะต ะทะฝะฐะปะฐ ะณัะฐะฝะธั. ะัะปะธ ะบัะพ-ัะพ ะพะฑะธะถะฐะป ะตะณะพ ะดััะทะตะน,
ะตะณะพ ะณะปะฐะทะฐ ััะฐะฝะพะฒะธะปะธัั ะพัะฐะฝะถะตะฒัะผะธ, ะฐ ัะฐะผ ะพะฝ ัะตัััะทะฝัะผ ะธ ะผะตะณะฐ-ัะธะปัะฝัะผ.
- ะะฐะะฐะฝะดะฐ-ัะพ?! ะฅะฐ-ั
ะฐ-ั
ะฐ! - ัะฐััะผะตัะปัั ะฃะพะปะบะตั. - ะััะฐะบ ะดััะฐะบะพะผ! ะะฝ ะฟัะพััะพ ะฒะตะทัะฝัะธะบ
ั ัะตะฟััะฐัะธะตะน ะธ ะฒะฝะตัะฝะพัััั! ะ ะฒัั! - ะพะฝ ะผะฝะพะณะพะทะฝะฐัะธัะตะปัะฝะพ ั
ะผัะบะฝัะป. - ะ ัั, ัั ะดะพะฑััะน
ะธ ะผะธะปัะน! ะัะพััะพ ะฑัะดั ะฟะพัะฒะตัะตะฝะฝะตะต ะฒ ัะตะฑะต, ะธ ะฒัั ะฟะพะปััะธััั!
- ะญั
, ะธ ะบะฐะบ ะตะผั ัะดะฐะตััั ะฑััั ัะฐะบะธะผ ัะฒะตัะตะฝะฝัะผ? ะฃ ะผะตะฝั ัะฐะบ ะฝะต ะฟะพะปััะฐะตััั... - ะฒะทะดะพั
ะฝัะป
ะกะฐะฒะฐะดะฐ, ะฟะพัะผะพััะตะฒ ะฝะฐ ะะฐะฝะดั. - ะะฐ, ะธ ะฟัะธ ััะพะผ ะพะฝ ะฝะธัะตะณะพ ะฝะต ะดะตะปะฐะตั, ะปะธัั ัะธะดะธั ะฝะฐ
ัะฒะพัะผ ะผะตััะต, ะฝะพ ะฒัะต ะดะตะฒัะพะฝะบะธ ะฒะพะทะปะต ะฝะตะณะพ ะฒััััั.
ะะพ ััั, ะฒะดััะณ, ะะฐะฝะดะฐ ะฟะพัะผะพััะตะป ะฒ ะธั
ััะพัะพะฝั, ะฐ ะขััะฝะฐ ััั ะถะต ะพัะฒะตัะฝัะปัั ะธ ัะถะฐะปัั,
ะฑัะดัะพ ะตะณะพ ัะพะปัะบะพ ััะพ ะพะฑะปะธะปะธ ะปะตะดัะฝะพะน ะฒะพะดะพะน.
- ะคัั
...
ะะปะปะตะฝ ัะพะถะต ะฟะพัะผะพััะตะป ะฝะฐ ะะฐะฝะดั ะธ, ะฟะพะบะฐะทะฐะฒ ะตะผั ัะทัะบ, ะพัะฒะตัะฝัะปัั.
- ะั! ะ ััะพ ะพะฝะธ ะฒ ะฝัะผ ะฝะฐัะปะธ, ะฝะต ะฟะพะฝะธะผะฐ... - ะฒะพั ัะตะฟะตัั ัะถะต ะะปะปะตะฝ ะทะฐะผะตั ัััะฐะฒะธะฒัะธัั
ะฝะฐ ะดะฒะตัะฝะพะน ะฟัะพัะผ, ะพัะบัะดะฐ ะธะทะปััะฐะปะฐัั ะฐััะฐ ัะผะตััะธ. ะญัะพ ะฑัะป ะฅะธะฑะฐัะธ ะัั
"ะงัะพ ะตะผั ะฝัะถะฝะพ?!"
ะกัะตะฝะฐ 2. ะัะฑะปั 1.
- ะัะพ... ะบัะพ ะฟะพัะผะตะป ะฟัะธะนัะธ ะฒ ัะฝะธะฒะตััะธัะตั ะฑะตะท ัะผะตะฝะบะธ?!!!
ะขัั ะฃะพะปะบะตั ะฟะพัะผะพััะตะป ะฝะฐ ะฟะพะป ะธ ะฒะทะดัะพะณะฝัะป. ะััะทั! ะัะถะธ ะณััะทะธ ะพั ะฒะพะตะฝะฝัั
ัะฐะฟะพะณ, ะฐ
ัะฐะบะธะต ัะฐะฟะพะณะธ ัะพะปัะบะพ ั...
- ะะฐะฝะดะฐ ะฎั! - ะฒะทัะตะฒะตะป ะัั.
ะะพ ะฟะฐัะตะฝั ะปะธัั ะพะดะฐัะธะป ะตะณะพ ัะฒะพะธะผ ะพะฑััะฝัะผ, ัะฐะฒะฝะพะดััะฝัะผ ะฒะทะณะปัะดะพะผ, ะฟะพะปะฝัะผ ั
ะพะปะพะดะฐ.
- ะงัะพ-ัะพ ะฝะต ัะฐะบ?
- ะขั, ััะฐะฒะพัะดะฝะพะต! - ะฟะพะดะพะนะดั ะบ ะะฐะฝะดะต, ะฅะธะฑะฐัะธ ะปะฐัะบะพะฒะพ ะพัะพะดะฒะธะฝัะป ะฟะฐััั. - ะขั ะพัะฒะตัะธัั
ะทะฐ ัะพ, ััะพ ะธัะฟะฐัะบะฐะป ะฟะพะปั! - ะพะฝ ะฝะฐัะบะฒะพะทั ะฟัะพะถะธะณะฐะป ะฒะทะณะปัะดะพะผ.
- ะฅะผ, ะตัั ัะตะณะพ, - ะณะพัะดัะต ัะธะฝะธะต ะณะปะฐะทะฐ ะฟัะพะฝะธะทัะฒะฐะปะธ ั
ะพะปะพะดะพะผ ะฒ ะพัะฒะตั. ะะดะพะฑะฐะฒะพะบ ะพะฝ
ะทะฐะบะธะฝัะป ะฝะพะณั ะฝะฐ ะฝะพะณั. - ะกะผะตะฝะบะฐ ะฟะพัะฒะฐะปะฐัั, ะดััะณะพะน ั ะฝะต ะฝะฐััะป, ะฟัะธัะปะพัั ะธะดัะธ ะฒ ัะฝะธะฒะตััะธัะตั
ัะฐะบ.
- ะะฐ ะฟะปะตะฒะฐัั ั ั
ะพัะตะป! ะะพัะธะบะพะผ ั
ะพะดะธ! ะ ะฟะพะผะตัะตะฝะธะต ะฟะฐัะบะฐัั ะฝะต ัะผะตะน! - ัััะฐะป ะัั.
- ะะฐะฒััะฐ ัะฐะบ ะธ ัะดะตะปะฐั - ัััะบะฝัะป ัะพั. - ะญัะพ ะฒัั?
- ะัะดะตัั ะฝะตะดะตะปั ะผััั ะฟะพะปั ะฒ ััะพะผ ะบะพัะธะดะพัะต! - ะฝะฐั
ะผััะธะปัั ะณะปะฐะฒะฐ ะดะธััะธะฟะปะธะฝะฐัะฝะพะณะพ
ะบะพะผะธัะตัะฐ. - ะ ะฝะฐัะฝััั, ะฟััะผะพ ัะตะนัะฐั!
- ะขั, ะฝะต ะฝะฐะผะตัะตะฝ. ะะปั ััะพะณะพ ะตััั ัะฑะพััะธัั, ะธ... - ะฑัะพัะธะป ะบะพัะพัะบะธะน ะฒะทะณะปัะด ะฒ ััะพัะพะฝั
ะฃะพะปะบะตัะฐ ะธ ะขััะฝั. - ะะตะถััะฝัะต.
- ะงะตะณะพ-ะพ-ะพ?! - ะฒะพะทะผััะธะปัั ะฃะพะปะบะตั. - ะะฐ ะบะพัะธะดะพั ะผั ะฝะต ะพัะฒะตัะฐะตะผ!
- ะฅะผ, - ั
ะผัะบะฝัะป ะัั, ะธ ะผะฐะปััะธะบ ัะตัะธะป ะฟะพะผะพะปัะฐัั. - ะขั ะทะฐะฟะฐัะบะฐะป ัั ะธ ัะฑะธัะฐะน, ะฐ ะธะฝะฐัะต...
- ะณะปะฐะทะฐ ัะฒะตัะบะฝัะปะธ ะฝะต ะฟะพ-ะดะพะฑัะพะผั. - ะะฐะผะธะบะพัะพั!
- ะะตั ะถะตะปะฐะฝะธั ะดัะฐัััั, ะฝะพ ัะฐะท ัั ะฝะฐััะฐะธะฒะฐะตัั! - ะะฐะฝะดะฐ ะฟะพะดะฝัะปัั ั ะผะตััะฐ, ัะผะพััั
ะฝะฐ ะฟะฐัะฝั ั ะฒัะทะพะฒะพะผ. ะะฝ ะฝะต ัะพะฑะธัะฐะปัั ะพัะดะฐะฒะฐัั ัะฒะพะตะผั ะณะปะฐะฒะฝะพะผั ัะพะฟะตัะฝะธะบั ะทะฒะฐะฝะธะต
ะณัะพะทั ัะฝะธะฒะตััะธัะตัะฐ.
- ะ, ััะพ ะฑัะดะตั ะธะฝัะตัะตัะฝะพ, - ะทะปะพัะฐะดะฝะพ ัั
ะผัะปัะฝัะปัั. - ะัะต ะฒะพะฝ! ะะพะบะฐ ะฝะต ะฟะตัะตะฑะธะป.
ะะตัั ะบะปะฐัั, ััะพ ะถะฐะปัั ะฟะพ ััะตะฝะพัะบะฐะผ, ะผะพะผะตะฝัะฐะปัะฝะพ ะฒัััะฟะฐะป ะฒ ะบะพัะธะดะพั. ะัะพะผะต ะขััะฝั
ะธ ะะปะปะตะฝะฐ, ััะพ ะทะฐะฒะพัะพะถะตะฝะพ ะฝะฐะฑะปัะดะฐะปะธ ะทะฐ ัะพะฑััะธัะผะธ. ะกะฐะฒะฐะดะฐ ัะพ ัััะฐั
ั ะฒัะตะฟะธะปัั ะฒ ััะบั
ะฃะพะปะบะตัะฐ, ะฐ ัะฐะผ ะฟะฐัะตะฝั ะพะฑะตัะฟะพะบะพะตะฝะฝะพ ัะผะพััะตะป ะฒ ััะพัะพะฝั ะดะปะธะฝะฝะพะฒะพะปะพัะพะณะพ ัะฟะพะฝัะฐ. -
ะฎั... - ัะธั
ะพ ะฟะพะทะฒะฐะป ะพะฝ.
- ะัะฐะฒะธะปัะฝะพ, ัะฒะธะดะตัะตะปะธ ะฝะธ ะบ ัะตะผั, - ัะฐะบ ะถะต ัั
ะผัะปัะฝัะปัั ะะฐะฝะดะฐ, ัะฐะทะผะธะฝะฐั ััะบะธ. -
ะั, ะดะฒะพะต, ัะฐะทะฒะต ะฝะต ััะฝะพ ะฑัะปะพ ัะบะฐะทะฐะฝะพ? - ะณะปัะฝัะป ะพะฝ ะฒ ััะพัะพะฝั ะฟะฐัะฝะตะน.
- ะะปะปะตะฝ, ะผะพะถะตั... - ัะธั
ะพ ะฟัะพัะบัะปะธะป ะขััะฝะฐ, ะฟัะตะบัะฐัะฝะพ ะทะฝะฐะฒัะธะน ะฝัะฐะฒ ะฅะธะฑะฐัะธ.
ะะตะปะพะฑััััะน, ััะพ ะฒัั ััะพ ะฒัะตะผั ะฟะตัะตะฒะพะดะธะป ะฒะทะณะปัะด ั ะัั ะฝะฐ ะะฐะฝะดั, ะฒะทะดะพั
ะฝัะป, ะพะฟัััะธะฒ
ะณะปะฐะทะฐ, ะธ ะฟะพะดะดะฐะปัั ะฝะฐ ัะณะพะฒะพัั ะกะฐะฒะฐะดั, ะฟะพะทะฒะพะปะธะฒ ััะฐัะธัั ัะตะฑั ะฒ ะบะพัะธะดะพั.
ะกัะตะฝะฐ 3. ะัะฑะปั 1.
- ะฅะต... - ะฅะธะฑะฐัะธ ัััะฐะฝะฝะพ ั
ะผัะบะฝัะป, ะบัะฐะตะผ ะณะปะฐะทะฐ ะฝะฐะฑะปัะดะฐั ะทะฐ ััะตะดัะธะผะธ.
ะะฐะฝะดะฐ ัะฐะบ ะถะต ะผะพะปัะฐ, ะฟัะพะฒะพะดะธะป ะฟะพะดัะพััะบะพะฒ ะฒะทะณะปัะดะพะผ ะธ ะฒะฝะพะฒั ะฟะพัะผะพััะตะป ะฝะฐ ัะฒะพะตะณะพ ะฟัะพัะธะฒะฝะธะบะฐ.
ะญัะฐ ัั
ะผัะปะบะฐ ะฝะธ ะพ ััะผ ะดะพะฑัะพะผ ะฝะต ะณะพะฒะพัะธะปะฐ.
ะฅะธะฑะฐัะธ ะฝะตะพะถะธะดะฐะฝะฝะพ ัะดะฐัะธะป ะฟะฐัะฝั ะฒ ะถะธะฒะพั ัะฐะบ, ััะพ ัะพั ะพัะปะตัะตะป ะบ ะพะบะฝั.
- ะขั... - ะะฐะฝะดะฐ ัะพะณะฝัะปัั, ะฝะพ ะฑััััะพ ะฟัะธััะป ะฒ ัะตะฑั ะธ ะฟะพะดะฝัะปัั. ะะพัะปะตะดะพะฒะฐะป ะพัะฒะตัะฝัะน
ัะดะฐั.
- ะฅะผ... ัะปะฐะฑะฐะบ! - ะัั ะฑััััะพ ะฑะปะพะบะธัะพะฒะฐะป ััะพั ัะดะฐั ะธ ะฟะพะดัะตัะบะพะน ัะฑะธะป ะฟัะพัะธะฒะฝะธะบะฐ
ั ะฝะพะณ.
ะฎั ะฝะต ัะฐััะตััะปัั ะธ ัะดะฐัะธะป ะตะณะพ ะฟะพ ะฝะพะณะฐะผ, ัะพะถะต ะทะฐะฒะฐะปะธะฒ ะฝะฐ ะฟะพะป ะธ ัะตะป ะฝะฐ ะฝะตะณะพ, ะบะฐะบ
ะฝะฐ ัะบะฐะผะตะนะบั. ะะพัะพะผ ะฟะพะดะฝัะปัั ะธ ะทะฐะปะพะผะธะป ัะพะผั ััะบะธ, ะฟัะธะณะธะฑะฐั ะบ ะฟะพะปั.
- ะะตัะธัั!
ะฅะธะฑะฐัะธ ะฒัะฒะตัะฝัะปัั ะธ ั ัะฐะทะฒะพัะพัะฐ ัะดะฐัะธะป ะฟะพ ะปะธัั.
- ะขัะฐะฒะพัะดะฝัะต ะดะพะปะถะฝั ะผะพะปัะฐัั ะธ ะฟะพะดัะธะฝััััั!
- ะฏ ัะฐะบะพะน ัะฒะพะปะพัะธ ะฟะพะดัะธะฝััััั ะฝะต ัะพะฑะธัะฐััั! - ัะดะฐั ะฒ ะฑะพะบ ะฟะพ ะฟะตัะตะฝะธ.
ะัั ัะดะฐัะธะป ะฟะพ ะณะพะปะพะฒะต ัะพะฝัะฐ.
- ะ ัะตะฑั ะฝะธะบัะพ ะฝะต ัะฟัะฐัะธะฒะฐะตั! ะะธััะธะฟะปะธะฝะฐ ะฝะฐ ะฟะตัะฒะพะผ ะผะตััะต!
ะะฐะฝะดะฐ ะทะฐะตั
ะฐะป ะฝะพะณะพะน ะฒ ะถะธะฒะพั. ะกะฐะฟะพะณะฐะผะธ ััะพ ะพัะตะฝั ะถะตััะพะบะพ.
- ะะพะบะฐ ะผะตะฝั ะฝะธะบัะพ ะฝะต ััะพะณะฐะตั, ั ัะฟะพะบะพะตะฝ!
- ะะพะบะฐ ะฝะต ะฝะฐัััะฐะตัั ะฟัะฐะฒะธะปะฐ, ัะฟะพะบะพะตะฝ ั! - ะฟะฐัะตะฝั ั ัะธะปะพะน ัะดะฐัะธะป ะฟะพ ัะพะปะฝะตัะฝะพะผั
ัะฟะปะตัะตะฝะธั.
- ะั
... ัะฑะปัะดะพะบ, - ัะผะพััะธะปัั ะฎั.
- ะขะพะถะต ะผะฝะต - ะฝะฐะณะปัะน! ะัะผะฐะตัั, ั
ัะน ะพััะฐััะธะป, ะธ ัะตะฑะต ะฒัั ะดะพะทะฒะพะปะตะฝะพ?! - ะฟัะพัััะฐะป
ะฅะธะฑะฐัะธ.
- ะะพะฒะพัะธ, ััะพ ั
ะพัะตัั, ะฝะพ ะฟะพะปั ะผััั ั ะฝะต ัะพะฑะธัะฐััั, - ัะตะผ ะถะต ัะพะฝะพะผ ะพัะฒะตัะธะป ะฟัะพัะธะฒะฝะธะบ.
- ะะพ ัะฐะบะธ ะฒัะผะพะตัั! - ัะฝะพะฒะฐ ัะดะฐัะธะป ะณะปะฐะฒะฐ ะดะธััะธะฟะปะธะฝะฐัะฝะพะณะพ ะบะพะผะธัะตัะฐ.
- ะะฐะฒััะฐ ะฒะพะพะฑัะต ะฝะต ัะฒะปััั. ะ ะฟะปะฐะบะฐะป ะฒะฐั ะบัะฑะพะบ ะทะฐ ะฟะตัะฒะพะต ะผะตััะพ ะฟะพ ะฑะฐัะบะตัะฑะพะปั, -
ะฒััะตัะฟะตะป ะะฐะฝะดะฐ.
- ะขั ะผะฝะต ััั ะฝะต ัะณัะพะถะฐะน! ะะตะทะฐะผะตะฝะธะผัั
ะปัะดะตะน ะฝะต ะฑัะฒะฐะตั! ะ ัะตะผ ะฑะพะปะตะต ัะตะฑั ะทะฐะผะตะฝะธัั
- ัะฐะท ะฟะปัะฝััั!
- ะั
, ัะพะณะดะฐ ััะพ ะถะต ะพัะปะธัะฝะพ! ะะฐะฒััะฐ ัะตะปัะน ะดะตะฝั ะฟัะพะฒะฐะปัััั ะฒ ะบัะพะฒะฐัะธ ะธ ะฝะต ัะฒะธะถั
ััะพะณะพ ะผะตะปะบะพะณะพ. ะะฐะดะพะปะฑะฐะป ะฟัะปะธัััั.
- ะ? ะ ะฟัะธััะผ ััั ะะปะปะตะฝ?! - ะัั ะฒัะบะธะฝัะป ะฑัะพะฒั.
- ะัะธัะพะผ, ััะพ ะดะพััะฐะป, - ะฒะทะดะพั
ะฝัะป ะฎั, - ัััะฐะฝะฝัะน ะพะฝ ะบะฐะบะพะน-ัะพ. ะ ัะผะพััะธั ะฝะฐ ะผะตะฝั
ะบะฐะบ-ัะพ ัััะฐะฝะฝะพ.
- ะ ะฐะดะพะฒะฐะปัั ะฑั! ะัะต ะพััะฐะปัะฝัะต ะพั ัะตะฑั ัะฐัะฐั
ะฐัััั. ะก ัะฐะบะธะผะธ ัะตะผะฟะฐะผะธ ะธ ะดะพ ะพะฝะฐะฝะธะทะผะฐ
ะฝะตะดะฐะปะตะบะพ, ะธะปะธ ัั ัะถะต? - ััะผะตั
ะฝัะปัั ะฅะธะฑะฐัะธ.
- ะั, ะฝะตั... ะธ ััะพ ะฒะพะพะฑัะต ะทะฐ ะฒะพะฟัะพัั? ะฃะพะปะบะตั ะผะตะฝั ะฒ ะฟะพัะปะตะดะฝัั ะพัะตัะตะดั ะธะฝัะตัะตััะตั.
- ะฏ ะฝะต ะพะฑ ััะพะน ะบะพะทัะฒะบะต ะณะพะฒะพัั! ะ ะฟัะพ ัะพ, ััะพ ั ัะฒะพะธะผ ั
ะฐัะฐะบัะตัะพะผ ะฝะธ ะพะดะฝะฐ ะดะตะฒััะบะฐ
ะบ ัะตะฑะต ะฝะต ะฟะพะดะพะนะดัั!
- ะฅะต, - ััะผะตั
ะฝัะปัั ะะฐะฝะดะฐ. - ะกะฟะพัะธะผ, ั ะปัะฑัั ะทะฐ ะดะตะฝั ัะผะพะณั ะทะฐะบะฐะดัะธัั? ะ ะทะฐะฝััััั
ัะตะบัะพะผ.
- ะขั-ัะพ? ะฅะฐ! ะ ะทะฐ ะผะตััั ะฝะต ัะฟัะฐะฒะธัััั! - ะพัะบะฐะปะธะปัั ะัั.
- ะขะฐะบ ะทะฝะฐัะธั, ัะฟะพัะธะผ? - ะฟัะธะฟะพะดะฝัะปัั ะฎั. - ะะพ ัะพะณะดะฐ ะธ ัั ััะฐััะฒัะตัั.
- ะฅะต, ะดะฐั ัะตะฑะต ะฝะตะดะตะปั! - ะฅะธะฑะฐัะธ ัะฑัะฐะป ัะพะฝัะฐ ะธ ะฟัะพััะฝัะป ัะฒะพั ััะบั.
- ะะพะณะพะฒะพัะธะปะธัั, - ะฟะพะถะฐะป ััะบั ัะพั. - ะ ะบัะพ ััะฐะฝะตั ัะตะปัั?
- ะฅะผ... ะฐ ัะพั, ะบัะพ ะฟะตัะฒัะน ะฒะพะนะดัั ะฒ ััะพั ะบะฐะฑะธะฝะตั! ะงัะพะฑ ัะถ ัะตััะฝะพ ะฑัะปะพ. ะ ะฟะพะดัะฒะตัะถะดะตะฝะธะต
ะฟะพะฑะตะดั ะฟัะธะฝะตัั ัะตะฑะต ะฝะธะถะฝะตะต ะฑะตะปัั ะถะตััะฒั! - ะณะปะฐะฒะฐ ะดะธััะธะฟะปะธะฝะฐัะฝะพะณะพ ะบะพะผะธัะตัะฐ ะบัะตะฟัะต
ัะถะฐะป ััะบั ะธ, ัะฒะฐะฝัะฒ ะฝะฐ ัะตะฑั, ะฟะตัะตะบะธะฝัะป ะะฐะฝะดั ัะตัะตะท ัะฟะธะฝั ะฝะฐ ะฟะพะป. - ะะพ ัััะธ, ะตัะปะธ
ัั ะฟัะพะธะณัะฐะตัั, ะฑัะดะตัั ะดัะฐะธัั ัะฝะธะฒะตััะธัะตั ะฒะตัั ะณะพะด!
- ะขั... ะปะฐะดะฝะพ - ะฎั ะฟะพะดะฝัะปัั, ะดะตัะถะฐัั ะทะฐ ัะฟะธะฝั. - ะฏ ัะตะฑะต ััะพ ะฝะต ะฟัะพัั.
ะขัั ะฒ ะดะฒะตัั ัะธั
ะพะฝัะบะพ ะฟะพััััะฐะปะธัั.
- ะ ะตัะปะธ ะฒัะธะณัะฐะตัั ัั, ั ะฝะฐ ะณะพะด ะพั ัะตะฑั ะพัััะฐะฝั! - ั
ะผัะบะฝัะป ะัั ะธ ะฟะพะฒะตัะฝัะปัั ะบ
ะดะฒะตัะธ.
- ะฅะธะฑะฐัะธ-ัะฐะฝ! ะฏ, ะบะพะฝะตัะฝะพ, ะฟะพะฝะธะผะฐั, ััะพ ะดะธััะธะฟะปะธะฝะฐ - ััะพ ัะฒััะพะต, ะธ ะฟะพะดะดะตัะถะธะฒะฐั
ะฒะฐัะต ัะตัะตะฝะธะต ะฝะฐะดัะฐัั ััะพะผั ะฟัะธะดััะบั ะทะฐะด! ะะพ ั ะฝะฐั ััั ััะพะบ, ะฐ ะผะฝะต ัะตัะตัะฐั ัะดะฐะฒะฐัั!
- ะทะฐััะป ะฑะตะทัะฟัะตัะฝัะน ะะปะปะตะฝ ะฃะพะปะบะตั, ะฒ ะบะพัะพัะพะณะพ ะฝะฐะผะตััะฒะพ ะฒัะตะฟะธะปัั ะกะฐะฒะฐะดะฐ ะฟััะฐััั
ะพััะฐะฝะพะฒะธัั.
ะกัะตะฝะฐ 4. ะัะฑะปั 1.
ะะฐะฝะดะฐ ะฟะพัะผะพััะตะป ะฝะฐ ะผะฐะปััะธัะบั ะธ ะธะทะดะฐะป ัะธั
ะธะน ะทะฒัะบ, ะฟะพั
ะพะถะธะน ะฝะฐ ะบะพัะฐััะต ัะธะฟะตะฝะธะต. ะะธะดะธะผะพ
ะพะฝ ะฝะต ะพะถะธะดะฐะป, ััะพ ะฟะตัะฒัะผะธ ะฒ ะบะปะฐัั ะทะฐะนะดัั ะธะผะตะฝะฝะพ ััะธ ะดะฒะพะต.
- ะัะพั
ะพะดะธ, ะทะฐััะป ัะถะต, - ั
ะผัะบะฝัะป ะฎั ะธ, ะพัะฒะตัะธะฒ ะฅะธะฑะฐัะธ ะฟะพะดะทะฐััะปัะฝะธะบ, ะฟะพัะฟะตัะธะป ะฒะตัะฝััััั
ะฝะฐ ัะฒะพั ะผะตััะพ.
- ะ, ัั ะตัั ะถะธะฒะพะน?! ะะตัะฐะปัะฝะพ... - ะฟะพะบะฐัะฐะป ะณะพะปะพะฒะพะน ะะปะปะตะฝ. - ะ ะตะฑััะฐ ะทะฐั
ะพะดะธัะต, ะฅะธะฑะฐัะธ
ัััะป! - ััั ะถะต ะฒ ะดะฒะตัั ะฟะพะฒะฐะปะธะปะธ ะพััะฐะปัะฝัะต. ะ ะฟะพัะปะตะดะฝะธะผ ะทะฐััะป ะฟัะตะฟะพะดะฐะฒะฐัะตะปั. ะะตะปะพะฒะพะปะพััะน
ะดะพััะฐะป ะธะท ััะผะบะธ ัะธััะฝะบะธ ะธ ัะตััะตะถะธ, ะฟะพัะปะต ัะตะณะพ ัะฐะทะฒะตัะธะป, ะฒะทัะป ัะบะฐะทะบั ะธ ะฝะฐัะฐะป ัะฐััะบะฐะทัะฒะฐัั
ัะตัะตัะฐั ะฟะพ ัะบะพะปะพะณะธะธ.
ะะพะพะฑัะต, ะพะฝ ะฝะต ะฑัะป ะพัะปะธัะฝะธะบะพะผ, ะฝะพ ะฑะพะปััะธะผ ัััะดัะณะพะน!
ะัะปะธ ัะฐะฝััะต ะฎั ะผะตััะฐะป ะพััะธะดะตัั ะฟะพัะปะตะดะฝะธะต ััะพะบะธ ะธ ัะฒะฐะปะธัั ะดะพะผะพะน, ัะพ ัะตะฟะตัั ะตะณะพ
ะถะตะปะฐะฝะธะตะผ ะฑัะปะพ, ััะพะฑั ััะพะบะธ ะฝะธะบะพะณะดะฐ ะฝะต ะทะฐะบะฐะฝัะธะฒะฐะปะธัั.
"ะขั, ะจะฟะตะฝะดะตะปั. ะะพัะตะผั, ะฟะพัะตะผั ัั ัะฐะบ ะฝะต ะฒะพะฒัะตะผั ัะฒะฐะปะธะปัั ะผะฝะต ะฝะฐ ะณะพะปะพะฒั?!" - ะดัะผะฐะป
ะพะฝ, ะดะตะปะฐั ะฒะธะด, ััะพ ัะปััะฐะตั.
- ... ะ ะฒะพั ะฟะพััะพะผั ะดะปั ัะฟะฐัะตะฝะธั ะบะธัะพะฒ ัะฐะบ ะฒะฐะถะฝะพ ะฟัะตะบัะฐัะธัั ัััะตะปัะฑั ะธ ะฟะตัะตะฒะพะท
ะฝะตััะธ ัะตัะตะท ะพะบะตะฐะฝ! ะฃ ะผะตะฝั ะฒัั! - ะทะฐะบะพะฝัะธะป ัะฐััะบะฐะท.
- ะั ััะพ ะถ, ะดัะผะฐั, ะฝะฐ 4-ะบั ะฒะฟะพะปะฝะต ั
ะฒะฐัะธั.
- ะงัะพ?! ะะพ ััะธัะตะปั, ั ะฝะตะณะพ ะฟะพััััะฐััะธะน ะดะพะบะปะฐะด! - ะทะฐัะตะฑะตัะฐะป ะพะดะฝะพะณััะฟะฟะฝะธะบ.
- ะะฝ ะผะฝะพะณะพ ะณะพัะพะฒะธะปัั, ะฒะพะปะฝะพะฒะฐะปัั, ะฟะพัะตะผั ัะตัััะต?! - ะทะฐัััะฟะธะปัั ะทะฐ ะะปะปะตะฝะฐ ะขััะฝะฐ.
- ะะฐ ะฟัะตะบัะฐัะฝัะน ะดะพะบะปะฐะด, ะตัะปะธ ัะตััะฝะพ, ะฝะต ะพะถะธะดะฐะป, - ะฒััะบะฐะทะฐะปัั ัะตะปะพะฒะตะบ, ะบะพัะพัะพะณะพ
ะผะตะฝััะต ะฒัะตะณะพ ััะพ ะผะพะณะปะพ ะฒะพะปะฝะพะฒะฐัั. ะะฐะฝะดะฐ ัะผะพััะตะป ะฝะฐ ะฟัะตะฟะพะดะฐะฒะฐัะตะปั.
- ะญ-ะญ-ะญ?! - ะพัะฐะปะตะป ะฒะตัั ะบะปะฐัั.
- ะ... ั-ัั-ัะพ... - ะทะฐะปะธะปัั ััะผัะฝัะตะผ ะฃะพะปะบะตั.
- ะั ะปะฐะดะฝะพ, 5!
ะฎั, ะฟะพะฑะตะดะฝะพ ัั
ะผัะปัะฝัะฒัะธัั, ะฟะตัะตะฒะตะป ะฒะทะณะปัะด ะฝะฐ ะะปะปะตะฝะฐ. ะขะพั ะฟะพััะฟะธะปัั ะธ ัััะฐะฒะธะปัั
ะฒ ะฟะพะป.
"ะฅะผ, ะฒะพะทะผะพะถะฝะพ ััะพ ะฑัะดะตั ะฝะต ัะฐะบ ัะถ ะธ ัะถะฐัะฝะพ", - ะฟะพัะตะผั-ัะพ ัะพะปัะบะพ ัะตะนัะฐั ะะฐะฝ'
- '- ะะพะฑัะพะต ัััะพ, - ัะตะฟะพั ัะตะบะพัะตั ะผะฝะต ัั
ะพ. ะกะพะฒัะตะผ ะฝะต ั
ะพัะตััั ัะฐะทะปะตะฟะปััั ะณะปะฐะทะฐ ะธ
ะฒัััะตัะฐัั ะฝะพะฒัะน ะดะตะฝั. ะะพะฒะพัะฐัะธะฒะฐััั, ะฟัะธััะณะธะฒะฐั ัะตะฑั ะฑะปะธะถะต, ะธ ัััะบะฐััั ะฝะพัะพะผ ัะตะฑะต
ะฒ ะณััะดั, ะพัััะฐั ะทะฐะฟะฐั
ัะปะฐะดะพััะตะน, ะบะพัะพััะต ะฝัะฐะฒัััั ะฝะฐะผ ะพะฑะพะธะผ. ะฏ ะตะถััั ะพั ั
ะพะปะพะดะฐ,
ะฟััะฐััั ะฒัะปะตะฟัั ะฝะฐะนัะธ ะบัะฐั ัััะฝะพะณะพ ะพะดะตัะปะฐ ะธ ัะฝะพะฒะฐ ะพะบัะฝััััั ะฒ ัะพะฝ. ะขั ะทะฐะผะตัะฐะตัั
ััะพ ะธ ะทะฐะฑะพัะปะธะฒะพ ัะบััะฒะฐะตัั ะผะตะฝั. ะขะฒะพะธ ะฟะฐะปััั ะฟะตัะตะฑะธัะฐัั ะผะพะธ ะฒะพะปะพัั, ะฐ ะณัะฑั ะปะตะณะบะพ
ะบะฐัะฐัััั ะผะพะตะณะพ ะปะฑะฐ. ะั ัะฐะบ ะธ ะทะฐัััะฒะฐะตะผ ะฒ ััะพะน ะฟะพะทะต ะฝะฐ ะฝะตะบะพัะพัะพะต ะฒัะตะผั.
ะัะพั
ะพะดะธั ะฒัะตะณะพ ะฝะตัะบะพะปัะบะพ ะผะธะฝัั, ะฐ ะฟะพัะพะผ ั ัะตะทะบะพ ัะฐะถััั ะฝะฐ ะบัะพะฒะฐัะธ ะธ ะฝะฐัะธะฝะฐั ะฒะพััะฐัั,
ััะพ ัะถะต ะดะฐะฒะฝะพ ะฟะพัะฐ ะฒััะฐะฒะฐัั, ะฒะตะดั ัะตะณะพะดะฝั ะฟัะตะดััะพะธั ะฟะพะตะทะดะบะฐ ะฝะฐ ะฟัะธัะพะดั ะฒะผะตััะต
ั ะดััะทััะผะธ. ะฃ ัะตะฑั ะฝะฐ ะปะธัะต ะฟะพัะฒะปัะตััั ัะปัะฑะบะฐ, ะฐ ััะบะธ ััะฝัั ะพะฑัะฐัะฝะพ, ะทะฐััะฐะฒะปัั
ะฒะฝะพะฒั ะพัะบะธะฝััััั ะฝะฐ ะฟะพะดััะบะธ. ะะฐ ัะปะธัะต ะปัะตั ะดะพะถะดั, ะฑะฐัะฐะฑะฐะฝั ะฒ ะพะบะฝะฐ, ะฐ ััะพ ะตัะต ะดะตะปะฐัั
ะฒ ัะฐะบะพะน ะดะตะฝั, ะตัะปะธ ะฝะต ะฝะตะถะธัััั ะฒ ัััะฝะพะน ะฟะพััะตะปะธ ะฒ ะพะฑัััะธัั
ะปัะฑะธะผะพะณะพ?
ะกะบะพะปัะบะพ ะฒัะตะผะตะฝะธ ะผั ะฑัะปะธ ะทะฝะฐะบะพะผั, ะฟัะตะถะดะต, ัะตะผ ัะทะฝะฐะปะธ ะพ ััะฒััะฒะฐั
ะดััะณ ะดััะณะฐ? ะะฐ,
ั ะฝะต ะฟะพะผะฝั ััะพะณะพ, ะฝะพ ะบัะพ ััะธัะฐะตั? ะะปะฐะฒะฝะพะต, ััะพ ะฒ ะผะพะตะน ะฟะฐะผััะธ ะดะพ ัะธั
ะฟะพั ะฑะตัะตะถะฝะพ
ั
ัะฐะฝะธััั ะผะพะผะตะฝั, ะบะพะณะดะฐ ัั ะฝะฐะบะพะฝะตั ััะปััะฐะป ัะต ะฒะฐะถะฝัะต ัะปะพะฒะฐ. ะะตัะตะด ะณะปะฐะทะฐะผะธ ะฒัะฟะปัะฒะฐัั
ััะฐััะปะธะฒัะต ะผะณะฝะพะฒะตะฝะธั, ัะปะพะฒะฝะพ ะบะฐะดัั, ะทะฐะฟะตัะฐัะปะตะฒัะธะต ะฒัั ะฒ ะผะตะปััะฐะนัะธั
ะดะตัะฐะปัั
. ะญัะพ
ะฟัะพะธะทะพัะปะพ ะฒ ะผะพัะพะทะฝัะน ัะฝะฒะฐััะบะธะน ะดะตะฝั. ะะตััะปะฐั ะบะพะผะฟะฐะฝะธั ะผะพะปะพะดัั
ะปัะดะตะน ะฝะต ะผะพะณะปะฐ ะฟัะพััะพ
ัะธะดะตัั ะดะพะผะฐ ะฒะทะฐะฟะตััะธ ะธ ัะฟัััะธัั ัะฐะบะพะน ั
ะพัะพัะธะน ัะปััะฐะน ะดะปั ะฟัะพะณัะปะบะธ ะฟะพ ะทะฐัะฝะตะถะตะฝะฝะพะผั
ะปะตัั ะธ ะฟัะพัะธั
ะทะธะผะฝะธั
ะทะฐะฑะฐะฒ.
ะขั ัะพะณะดะฐ ะพะบะฐะทะฐะปัั ะฒะฝะต ะฝะฐัะตะณะพ ะฟะพะปั ะทัะตะฝะธั, ะฐ ัะตะผะฝะพัะฐ ัะถะต ะฝะฐัะฐะปะฐ ะพะฟััะบะฐัััั ะฝะฐ ะทะตะผะปั.
ะะพะฝะตัะฝะพ, ะผะฝะต ะฝะธัะตะณะพ ะฝะต ะพััะฐะฒะฐะปะพัั, ะบัะพะผะต ะบะฐะบ ะพัะฟัะฐะฒะธัััั ะฝะฐ ะฟะพะธัะบะธ. ะะฐ ะผะพะตะผ ะปะธัะต
ะทะฐัััะปะพ ัะดะธะฒะปะตะฝะธะต, ะบะพะณะดะฐ ั ะทะฐััะฐะป ัะตะฑั ะทะฐ ัััะฐะฝะฝัะผ ะทะฐะฝััะธะตะผ: ะฑัะปะพ ะทะฐะฑะฐะฒะฝะพ ะฝะฐะฑะปัะดะฐัั
ะทะฐ ัะพะฑะพะน, ะฒัะฒะพะดััะธะผ ะฐะบะฒะฐัะตะปัั ะธ ะฑะฐะปะปะพะฝัะธะบะฐะผะธ ั ะบัะฐัะบะพะน ะฝะตะบะธะต ัะทะพัั ะฟััะผะพ ะฝะฐ ัะฝะตะณั.
ะขะฒะพะธ ะฝะตะพะฑััะฝะพััั ะธ ะฝะตะฟัะตะดัะบะฐะทัะตะผะพััั ะฟัะธััะณะธะฒะฐะปะธ ะบ ัะตะฑะต ะผะพั ะฝะฐัััั.
- ะขั ะผะฝะต ะฝัะฐะฒะธัััั. ะัะตะฝั, - ะบะฐะถะตััั, ะฑัะดัะพ ะฒัั ะทะฐะผะตัะปะพ, ะธ ะฒ ะทะฒะตะฝััะตะน ัะธัะธะฝะต ะฟัะพะทะฒััะฐะปะธ
ะฟัะพัััะต ัะปะพะฒะฐ, ะบะพัะพััะต ััะถะตะปะพ ะฟัะพะธะทะฝะตััะธ. ะงัะพ ะผะพะณะปะพ ัะพะปะบะฝััั ะผะตะฝั ะฟัะพััะพ ะฒะทััั
ะธ ัะบะฐะทะฐัั ะธั
? ะะดะฝะฐะบะพ ะพัะฒะตั ะฝะฐ ััะพั ะฒะพะฟัะพั ัะถะต ะฝะต ะฒะฐะถะตะฝ, ัะตะฟะตัั ะพะฝ ะพััะฐะฒะธะป ะผะตััะพ
ะดะปั ะฑะตัะฟะพะบะพะนััะฒะฐ. ะขะฒะพะธ ัะผะพัะธะธ ัะปะพะถะฝะพ ะฟัะพัะธัะฐัั. ะขะฐะบ ะฑัะปะพ ะฒัะตะณะดะฐ. ะะพะปัะฐะฝะธะต ะฝะฐะณะฝะตัะฐะตั
ะฝะฐะฟััะถะตะฝะธะต ะผะตะถะดั ะฝะฐะผะธ.
ะัะธะบะพัะฝะพะฒะตะฝะธะต ะปะตะดัะฝัั
ะฟะฐะปััะตะฒ ะบ ะผะพะตะน ัะตะบะต ะฒัะฒะพะดะธั ะธะท ะพัะตะฟะตะฝะตะฝะธั, ัะบะพะฒะฐะฒัะตะณะพ ัะตะปะพ.
ะฏ ะตะปะต-ะตะปะต ัะฐะทะปะธัะฐั, ััะพ ัั ัะตะนัะฐั ะณะพะฒะพัะธัั, ะฝะพ ะฝะตะบะพัะพััะต ะพะฑััะฒะบะธ ััะฐะท ะฒัั ะถะต ะฟัะธะพะฑัะตัะฐัั
ัะผััะป. ะะธะบะพะณะดะฐ ะฝะต ะฒะตัะธะป ะฒ ััะดะตัะฐ, ะดะฐ ะฒะพั ัะพะปัะบะพ ัะตะนัะฐั ะฟะพะฝะธะผะฐั: ะพะฝะธ ัะปััะฐัััั.
ะะฐะปะตะฝัะบะพะต ััะดะพ - ัะทะฝะฐัั ะพะฑ ะพัะฒะตัะฝัั
ััะฒััะฒะฐั
ัะพะณะพ, ะบัะพ ัะฐะบ ะผะฝะพะณะพ ะทะฝะฐัะธั ะดะปั ัะตะฑั.
ะั ะธะดะตะผ ั ัะพะฑะพะน ะฟะพ ะทะฐะผะตัะตะฝะฝัะผ ัะฝะตะณะพะผ ัะปะธัะฐะผ. ะััะณะฐ, ะทะฐะฒัะฒะฐั, ะดัะตั ะฒ ะปะธัะพ, ัะฑะธะฒะฐั
ะฟัะพั
ะพะถะธั
ั ะฟััะธ, ะฐ ั ะผะตะฝั ะฝะฐ ะดััะต - ัะฟะพะบะพะนััะฒะธะต ะธ ัะผะธัะพัะฒะพัะตะฝะธะต... ะะพะณะดะฐ ัั ััะดะพะผ,
ะฟัะพะธัั
ะพะดััะตะต ะฒะพะบััะณ ะฝะต ะธะผะตะตั ะทะฝะฐัะตะฝะธั, ะธ ะฝะตั ะดะตะปะฐ ะดะพ ะฒัะตั
ะพััะฐะปัะฝัั
.
ะะฝะต ัะปััะฝะพ, ะบะฐะบ ัะฒะพะธ ะทัะฑั ััััะฐั ะพั ั
ะพะปะพะดะฐ. ะกะถะฐะฒัะธัั, ัั ะฟัััะตัั ะฝะพั ะฒ ะฒััะพะบะธะน
ะฒะพัะพั ะบัััะบะธ. ะฏ ัะฒะตัะตะฝ, ััะพ ัะฒะพะธ ััะบะธ ะฒ ะบะฐัะผะฐะฝะฐั
ะดะฐะฒะฝะพ ะฝะต ะผะพะณัั ะพัะพะณัะตัััั ะธ ะฟัะธะฝััั
ะฝะพัะผะฐะปัะฝัั ัะตะผะฟะตัะฐัััั.
- ะะฐะผะตัะท? - ัะฟัะฐัะธะฒะฐั, ะทะฐะณะปัะดัะฒะฐั ะฒ ะบะฐัะธะต ะณะปะฐะทะฐ, ะพะฑัะฐะผะปะตะฝะฝัะต ัะตัะฝัะผะธ ัะตัะฝะธัะฐะผะธ,
ะฝะฐ ะบะพัะพััะต ัะธั
ะพ ะฟะฐะดะฐัั ัะฝะตะถะธะฝะบะธ, ะธ, ะฝะต ะดะพะถะธะดะฐััั ะพัะฒะตัะฐ, ััะฝั ัะตะฑั ะฒ ะฑะปะธะถะฐะนัะตะต
ะบะฐัะต.
- ะะพะนะดะตะผ ะดะพะผะพะน, ะฐ ัะพ ะฒะพัะฟะฐะปะตะฝะธะต ะปะตะณะบะธั
ะฟะพะดั
ะฒะฐัะธัั, - ัััะพะณะพ ะทะฐะผะตัะฐะตัั ัั, ัะถะต
ะฝะฐะฟัะฐะฒะปัััั ะฒ ััะพัะพะฝั ะฝะฐัะตะณะพ ะฟะพะดัะตะทะดะฐ.
- ะะพััะพะน, ัะฐะทะฒะต ะฝะต ะฒะธะดะธัั, ะบะฐะบะฐั ััะดะตัะฝะฐั ะฟะพะณะพะดะฐ? - ะทะฝะฐะตัั ะฒะตะดั, ััะพ ะผะฝะต ะฝัะฐะฒะธััั
ะณัะปััั ะฟะพะด ะดะพะถะดะตะผ, ะฟะพะดััะฐะฒะปัั ะปะธัะพ ะฟะฐะดะฐััะธะผ ั
ะพะปะพะดะฝัะผ ะบะฐะฟะปัะผ.
ะขะตะฑะต ะฒ ะณะพะปะพะฒั ะฑััััะพ ะฟัะธั
ะพะดะธั ะผััะปั, ะบะฐะบ ะทะฐััะฐะฒะธัั ะผะตะฝั ัะนัะธ ะฒ ะฑะพะปะตะต ััั
ะพะต ะธ ัะตะฟะปะพะต
ะผะตััะพ. ะะพะปะณะพ ะฝะต ัะฐะทะดัะผัะฒะฐั, ััะฒะบะพะผ ะฟัะธััะณะธะฒะฐะตัั ะบ ัะตะฑะต, ะฟัะธะถะธะผะฐััั ะบ ะผะพะธะผ ะณัะฑะฐะผ.
ะั ะฝะตะพะถะธะดะฐะฝะฝะพััะธ ั ะฟัะธะพัะบััะฒะฐั ะธั
, ะฐ ััะบะฐะผะธ ะฝะฐัะธะฝะฐั ะณะปะฐะดะธัั ัะฒะพั ัะฟะธะฝั, ะบ ะบะพัะพัะพะน
ะฟัะธะปะธะฟะปะฐ ะธะทััะดะฝะพ ะฟัะพะผะพะบัะฐั ััะฑะฐัะบะฐ. ะะต ัะฟะตัะฐ, ัั ัะณะปัะฑะปัะตัั ะฟะพัะตะปัะน, ะตัะต ะฑะพะปััะต
ัะฐะทะทะฐะดะพัะธะฒะฐั. ะะผะตะฝะฝะพ ัะฐะบ ะธ ะฟัะตะดะฟะพะปะฐะณะฐะปะพัั, ะฟัะฐะฒะดะฐ?
ะะพะต-ะบะฐะบ ัะฟัะฐะฒะธะฒัะธัั ั ะทะฐะผะบะพะผ, ะผั ะฒะฒะฐะปะธะฒะฐะตะผัั ะฒ ะฟะพะปััะตะผะฝัั ะบะฒะฐััะธัั, ะตะดะฒะฐ ััะผะตะฒ
ัััะพััั ะฝะฐ ะฝะพะณะฐั
. ะะตัะตะด ะณะปะฐะทะฐะผะธ ะดะพ ัะธั
ะฟะพั ััะพะธั ะฟะตะปะตะฝะฐ ะดะพะถะดั. ะขั ััะฐะทั ะถะต ัะตะทะบะพ
ะฟัะธะถะธะผะฐะตัั ะผะตะฝั ะบ ััะตะฝะต, ะธ ัะฒะพะน ัะทัะบ ะฒััะฒะฐะตััั ะฒ ะผะพะน ัะพั ะฒ ะฝะตะธััะพะฒะพะผ ะฟะพัะตะปัะต,
ะฑะตัะฟะพััะดะพัะฝะพ ะดะฒะธะณะฐะตััั ะฒะดะพะปั ะทัะฑะพะฒ ะธ ะฒะพะทะฒัะฐัะฐะตััั ะบ ะผะพะตะผั ัะทัะบั. ะฏ ะฝะต ัััะตะผะปััั
ะฑัะฐัั ะธะฝะธัะธะฐัะธะฒั ะฝะฐ ัะตะฑั, ะผะฝะต ะฒัะตะณะดะฐ ะฝัะฐะฒะธะปะพัั ะฟะปะฐะฒะธัััั ะฟะพะด ะฝะฐัะธัะบะพะผ ัะฒะพะธั
ะปะฐัะบ
ะธ ะพะถะธะดะฐัั, ััะพ ะถะต ัั ะฟัะตะดะฟัะธะผะตัั ะดะฐะปััะต. ะฃ ัะตะฑั ะฟะพััะธ ะฒัะตะณะดะฐ ะปะตะดัะฝัะต ะฟะฐะปััั, ะธ
ั ะผะตะฝั ะผััะฐัะบะธ ะฑะตะณัั ะฟะพ ะบะพะถะต ะพั ะฟัะธััะฝัั
, ะฝะพ ั
ะพะปะพะดะฝัั
ะฟัะธะบะพัะฝะพะฒะตะฝะธะน. ะขะตะฑะต ะฝัะฐะฒะธััั
ัะผะพััะตัั, ะบะฐะบ ะฟัะพะณะธะฑะฐะตััั ะผะพั ัะฟะธะฝะฐ, ะบะพะณะดะฐ ัั ัะธััะตัั ะฝะฐ ะฝะตะน ะฝะตะฒะธะดะธะผัะต ะปะธะฝะธะธ.
ะ ะดะถะธะฝัะฐั
ัะถะต ััะฐะฝะพะฒะธััั ัะตัะฝะพ, ะฐ ะฒ ะณะพะปะพะฒะต ะพะฑัะฐะทัะตััั ะฟัััะพัะฐ, ะทะฐะฟะพะปะฝัะตะผะฐั ะปะธัั
ัะพะฑะพะน. ะขะฒะพะธ ััะบะธ ะพะฟััะบะฐัััั ะฝะธะถะต, ะฝะฐััะฟัะฒะฐั ะฟััะถะบั ัะตะผะฝั. ะ, ัั ะถะต ัะฐะผ ะทะฐัะตัะป
ััั ะธะณัั, ะผะฐะปัั, ัะฐะบ ะดะฐะฒะฐะน ะฟะพะธะณัะฐะตะผ?
ะฏ, ะฒัั ัะฐะบ ะถะต ะฝะฐั
ะพะดััั ะฒ ะบัะตะฟะบะธั
ะพะฑัััะธัั
, ะดะตะปะฐั ะฝะตะพะถะธะดะฐะฝะฝัะน ัะฐะทะฒะพัะพั, ะฟัะธะฒััะฝะพ
ะทะฐะฝะธะผะฐั ัะพะปั ะฐะบัะธะฒะฐ. ะขั ั ะทะฐะผะธัะฐะฝะธะตะผ ัะตัะดัะฐ ัะผะพััะธัั ะฝะฐ ะผะตะฝั, ะฟัะตะบัะฐัะธะฒ ะฒัะต ะดะตะนััะฒะธั.
ะฃะปัะฑะฝัะฒัะธัั, ะฟัะพะฒะพะถั ัะทัะบะพะผ ะฟะพ ัะฒะพะตะผั ัั
ั, ัััั ะฟัะธะบัััะฒะฐั ะผะพัะบั, ะพั ัะตะณะพ ัะฒะพะธ
ะดัะพะถะฐัะธะต ะฟะฐะปััั ะฟะตัะตะผะตัะฐัััั ะฒะฒะตัั
ะธ ััะดะพัะพะถะฝะพ ัะถะธะผะฐัั ะผะพะธ ะฒะพะปะพัั, ั ะบะพัะพััั
ััะตะบะฐะตั
ะฒะพะดะฐ. ะะตัะตัะฟะตะปะธะฒะพ ัะฐัััะตะณะธะฒะฐั ะฟัะณะพะฒะธัั ัะฒะพะตะน ััะฑะฐัะบะธ, ะฟะพะฟััะฝะพ ะพััะฐะฒะปัั ะฝะตัะบะพะปัะบะพ
ะฑะฐะณัะพะฒัั
ะพัะผะตัะธะฝ ะฝะฐ ัะตะต ะธ ะฝะฐ ะณััะดะธ. ะะพ ะผะตะฝั ะดะพะฝะพัะธััั ััะพะฝ, ะธ ั ะฟัะพะดะพะปะถะฐั ะผะตะดะปะตะฝะฝัั
ะฟััะบั, ัััะณะธะฒะฐั ั ัะตะฑั ะฑััะบะธ ะฒะผะตััะต ั ะฑะตะปัะตะผ. ะ ัะธัะธะฝะต, ัะฐะทะฑะฐะฒะปัะตะผะพะน ะฝะฐัะธะผ ััะถะตะปัะผ
ะดัั
ะฐะฝะธะตะผ, ัะฐะทะดะฐะตััั ััะผะฝัะน ะฒัะดะพั
, ะบะพะณะดะฐ ั ะดะตะปะฐั ะฝะตัะบะพะปัะบะพ ะดะฒะธะถะตะฝะธะน ััะบะพะน ะฟะพ ะพัะฝะพะฒะฐะฝะธั
ัะปะตะฝะฐ, ะฐ ะทะฐัะตะผ, ะปะธะทะฝัะฒ ะณะพะปะพะฒะบั, ะพััััะฐะฝัััั, ะณะปัะดั ะฒ ะพะดััะผะฐะฝะตะฝะฝัะต ะณะปะฐะทะฐ.
- ะกะฟะฐะปัะฝั, - ัะตะฟัะตัั ัั, ะบัะตะฟะบะพ ะดะตัะถะฐัั ะทะฐ ะบัะฐะน ััะผะฑะพัะบะธ, ััะพัะฒัะตะน ััะดะพะผ.
ะัะพัะธัั ะดะฒะฐะถะดั ะฝะตั ัะผััะปะฐ, ะฒะตะดั ั ะผะตะฝั ัะฐะผะพะณะพ ัะถะต ะฝะตั ัะธะป ัะตัะฟะตัั ััะพ ััะฝััะตะต
ะพัััะตะฝะธะต, ะพะฑัะฐะทะพะฒะฐะฒัะตะตัั ะฒะฝะธะทั ะถะธะฒะพัะฐ.
ะะตะณะบะพ ะฟะพะดั
ะฒะฐััะฒะฐั ัะตะฑั ะฝะฐ ััะบะธ ะธ ะธะดั ะฒ ัั ะบะพะผะฝะฐัั, ะฒ ะบะพัะพัะพะน ะผั ััะพะปัะบะพ ัะฐะท ะทะฐะฝะธะผะฐะปะธัั
ะปัะฑะพะฒัั.
ะัะพะฒะฐัั ะฒัััะตัะฐะตั ะฝะฐั ะทะฝะฐะบะพะผัะผ ัะบัะธะฟะพะผ, ะบะพะณะดะฐ ั ะพะฟััะบะฐั ัะตะฑั, ะฝะตัะฒะฝะพ ะบััะฐััะตะณะพ
ะณัะฑั. ะขั ั
ะฒะฐัะฐะตัั ะผะตะฝั ะธ ััะฝะตัั ะฝะฐ ัะตะฑั, ะพััะตะณะพ ะพะบะฐะทัะฒะฐะตัััั ะฟัะธะถะฐััะผ ะผะพะธะผ ัะตะปะพะผ.
ะขะฒะพะธ ััะบะธ ัะบะพะปัะทัั ะฟะพ ะผะพะธะผ ะฑะพะบะฐะผ, ะฟะพะผะพะณะฐั ัะฝััั ัััะฑะพะปะบั ะธ, ะฟัะธะปะพะถะธะฒ ะฝะตะบะพัะพััะต
ััะธะปะธั, ะฟัะธะฟะพะดะฝัะฒัะธัั, ะพะฑะฒะพะดะธัั ัะทัะบะพะผ ะผะพะธ ัะพัะบะธ, ัะปะตะณะบะฐ ัะฐัะฐะฟะฐั ะธั
ะทัะฑะฐะผะธ.
ะงัะฒััะฒัั ะฝะตะพะฑั
ะพะดะธะผะพััั ัะบะพัะตะนัะตะน ัะฐะทััะดะบะธ, ั ะฟััะฐััั ะบะฐะบ ะผะพะถะฝะพ ัะบะพัะตะต ัะฝััั ะดะถะธะฝัั
ะธ ะฝะฐัะฐัะธัั ะฒ ััะธัะบะต ัะบะฐัะฐ ัะผะฐะทะบั ะธ ะฟัะตะทะตัะฒะฐัะธะฒั. ะะตัะตัะฟะตะปะธะฒะพ ััััะฐะธะฒะฐััั ะฟะพัะดะพะฑะฝะตะต
ะผะตะถะดั ัะฒะพะธั
ะฝะพะณ, ัะฐะทะฒะพะดั ะธั
ะฒ ััะพัะพะฝั ะธ ะฝะตะผะฝะพะณะพ ัะณะธะฑะฐั ะฒ ะบะพะปะตะฝัั
. ะัะดะฐะฒะปะธะฒะฐั ะณะตะปั
ะธ ะฟะพะพัะตัะตะดะฝะพ ะฐะบะบััะฐัะฝะพ ะฒะฒะพะถั ะฒ ัะตะฑั ะฟะฐะปััั, ัะฐัััะณะธะฒะฐั ะฟัะพั
ะพะด. ะกะปััั ัะฒะพะต ัะดะฐะฒะปะตะฝะฝะพะต
ัะธะฟะตะฝะธะต ะธ ััะฐัะฐััั ะพัะฒะปะตัั ัะฒะพะธะผะธ ะปะฐัะบะฐะผะธ, ะฟะพะบััะฒะฐั ะณััะดั ะธ ะฟะปะตัะธ ะฟะพัะตะปััะผะธ, ะบะพะต-ะณะดะต
ัััั ะฟัะธะบัััะฒะฐั ะบะพะถั.
ะขั ะทะฐะตัะทะฐะป ะธ ะฝะตะดะพะฒะพะปัะฝะพ ัััะฐะฒะธะปัั ะฝะฐ ะผะตะฝั, ััะตะฑัั ะฑะพะปััะตะณะพ. ะฏ ั ัะดะพะฒะพะปัััะฒะธะตะผ
ะฟะพะดัะธะฝัััั. ะัะธััะฐะฒะปัั ัะปะตะฝ ะบะพ ะฒั
ะพะดั ะธ ะผะตะดะปะตะฝะฝะพ ะฒั
ะพะถั, ะฝะฐ ััะพ ะฟะพะปััะฐั ะตะปะต ะทะฐะผะตัะฝัะน
ะบะธะฒะพะบ, ะบะฐะบ ัะฐะทัะตัะตะฝะธะต ะฟัะพะดะพะปะถะฐัั. ะกะฟัััั ะฝะตัะบะพะปัะบะพ ัะพะปัะบะพะฒ ัั ะฒัะณะธะฑะฐะตัััั ะฒ ะฟะพะทะฒะพะฝะพัะฝะธะบะต,
ะธ ะฝะฐ ะผะพะตะผ ะปะธัะต ะฟะพัะฒะปัะตััั ัะปัะฑะบะฐ. ะฏ ัะฒะตะปะธัะธะฒะฐั ัะตะผะฟ, ะดะฒะธะณะฐััั ะฒัั ะฑััััะตะต.
ะัะณะฐะทะผ ัััะตะผะธัะตะปัะฝะพ ะฝะฐะบััะฒะฐะตั ะฝะฐั ั ะณะพะปะพะฒะพะน, ะดะฐัั ััะพะปั ะดะพะปะณะพะถะดะฐะฝะฝะพะต ะฝะฐัะปะฐะถะดะตะฝะธะต.
ะกะพ ัะฑะธะฒัะธะผัั ะดัั
ะฐะฝะธะตะผ, ัะพ ะทะฒะตะทะดะพัะบะฐะผะธ ะฒ ะณะปะฐะทะฐั
ะฟะฐะดะฐั ััะดะพะผ ั ัะพะฑะพะน, ัะฐัะบัะฐัะฝะตะฒัะธะผัั,
ััะถะตะปะพ ะดััะฐัะธะผ, ะฝะพ ัะฐะบะธะผ ะปัะฑะธะผัะผ. ะขั ะฟัะธะถะธะผะฐะตัััั ะบะพ ะผะฝะต, ะฟะพะปะพะถะธะฒ ะณะพะปะพะฒั ะฝะฐ ะผะพั
ะณััะดั. ะะตะปะฐัั ัะตะนัะฐั ััะพ-ะปะธะฑะพ ะฒััะต ะฒััะบะธั
ัะธะป - ั ะฟัะพะดะพะปะถะฐั ะปะตะถะฐัั, ะฟะพะณะปะฐะถะธะฒะฐั
ัะฒะพะธ ะฒะพะปะพัั ะธ ะฒัะปััะธะฒะฐััั ะฒ ะฑะธะตะฝะธะต ะฝะฐัะธั
ัะตัะดะตั.
ะะพัะตะผั ั ัะตะฑั ัะพะณะดะฐ ะฝะต ะฟะพัะปััะฐะป? ะะฐัะตะผ ะฟะพะทะฒะพะปะธะป ัะตะฑะต ะผะพะบะฝััั ะฟะพะด ะดะพะถะดะตะผ ะฒะผะตััะต
ัะพ ะผะฝะพะน? ะัะปะธ ะฑั ะฝะต ััะฐ ะพัะธะฑะบะฐ, ัั ะฑั ะฝะต ะฟะพะดั
ะฒะฐัะธะป ัะตััะตะทะฝัั ะฑะพะปะตะทะฝั. ะะตะฝั ะดะพ
ัะธั
ะฟะพั ัะตัะทะฐะตั ััะฒััะฒะพ ะฒะธะฝั. ะัะตะฝั ััะถะตะปะพ ะพัะพะทะฝะฐะฒะฐัั, ััะพ ะฟะพะณัะฑะธะป ััั-ัะพ ะถะธะทะฝั...
ะัะพะฑะตะฝะฝะพ ัะพะณะพ, ะบัะพ ะฑัะป ัะตะฝััะพะผ ะผะพะตะน ะัะตะปะตะฝะฝะพะน.
ะฏ ะฟัะพะดะพะปะถะฐั ะถะธัั ะฟัะพัะปัะผ, ะฝะต ะผะพะณั ะฝะต ะฒัะฟะพะผะธะฝะฐัั ัะต ะฝะตะผะฝะพะณะธะต, ะฝะพ ัะฐะบะธะต ะดะพัะพะณะธะต
ะผะพะตะผั ัะตัะดัั ะผะพะผะตะฝัั, ะฟัะพะฒะตะดะตะฝะฝัะต ั ัะพะฑะพะน. ะั ัะพะฒัะตะผ ะฝะตะดะพะปะณะพ ะฑัะปะธ ะฒะผะตััะต. ะะตะฝั
ัะฐััะพ ะผะพะถะฝะพ ะฒัััะตัะธัั ะฝะฐ ัะพะผ ัะฐะผะพะผ ะผะตััะต ะฒ ะปะตัั, ะณะดะต ั ะพัะบััะปัั ัะตะฑะต. ะะฝะพะณะดะฐ ะผะฝะต
ะบะฐะถะตััั, ััะพ ัะบะฒะพะทั ัะธะปัะฝัั ะผะตัะตะปั ะฒะธะถั ัะฒะพะน ัะธะปััั. ะขั ัะปัะฑะฐะตัััั ะธ ะดะตะปะฐะตัั ะฝะตัะบะพะปัะบะพ
ัะฐะณะพะฒ ะฝะฐะฒัััะตัั, ะฐ ะฟะพัะพะผ ะธััะตะทะฐะตัั...
ะัะปะธ ะฑั ัะพะปัะบะพ ะฑัะปะฐ ะฒะพะทะผะพะถะฝะพััั ะตัะต ัะฐะท ััะปััะฐัั ัะฐะบะพะต ัะตะฟะปะพะต "ะดะพะฑัะพะต ัััะพ", ะฟะพััะฒััะฒะพะฒะฐัั
ะณะพัััะตะต ะดัั
ะฐะฝะธะต, ัะตะบะพัััะตะต ัั
ะพ, ั
ะพัั ััะพ-ะฝะธะฑัะดั...
ะะพะบะฐ ัั ะฑัะป ััะดะพะผ, ะฑัะปะพ ัะพะฒัะตะผ ะฝะต ะฒะฐะถะฝะพ, ััะพ ะฟัะพะธัั
ะพะดะธั ะฒะพะบััะณ. ะะพ ัะตะฟะตัั, ะบะพะณะดะฐ
ั ะฝะฐะฑะปัะดะฐั ะทะฐ ะฝะตะฝะฐััะฝะพะน ะฟะพะณะพะดะพะน ะฒ ะพะบะฝะพ, ั ะผะตะฝั ะฝะตั ัะฒะตัะปัั
ะผััะปะตะน ะธ ะปะตะณะบะพััะธ,
ััะพ ะฒะพะทะฝะธะบะฐะปะธ ัะฐะฝััะต. ะะฐะถะต ะปะตัะพะผ ะผะพะต ัะตัะดัะต ัะบะพะฒัะฒะฐะตั ะปะตะด, ะบะพัะพััะน ัะถะต ะฝะต ัะดะฐัััั
ัะฐััะพะฟะธัั.'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy
- cosine_accuracy_threshold
- cosine_f1
- cosine_f1_threshold
- cosine_precision
- cosine_recall
- cosine_ap
- cosine_mcc
model-index:
- name: SentenceTransformer based on intfloat/multilingual-e5-small
results:
- task:
type: binary-classification
name: Binary Classification
dataset:
name: Unknown
type: unknown
metrics:
- type: cosine_accuracy
value: 0.9214652872665756
name: Cosine Accuracy
- type: cosine_accuracy_threshold
value: 0.3258066475391388
name: Cosine Accuracy Threshold
- type: cosine_f1
value: 0.7519700297119235
name: Cosine F1
- type: cosine_f1_threshold
value: 0.28451085090637207
name: Cosine F1 Threshold
- type: cosine_precision
value: 0.7465213209361975
name: Cosine Precision
- type: cosine_recall
value: 0.7574988613442645
name: Cosine Recall
- type: cosine_ap
value: 0.8411912148109758
name: Cosine Ap
- type: cosine_mcc
value: 0.7019559271431105
name: Cosine Mcc
---
# SentenceTransformer based on intfloat/multilingual-e5-small
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) on the json dataset. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [intfloat/multilingual-e5-small](https://huggingface.co/intfloat/multilingual-e5-small) <!-- at revision c007d7ef6fd86656326059b28395a7a03a7c5846 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- json
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
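Because the final `Normalize()` module returns unit-length embeddings, cosine similarity between two model outputs reduces to a plain dot product, and pair classification amounts to thresholding that score (the evaluation metrics above report a cosine accuracy threshold of roughly 0.326). A minimal, self-contained sketch of that decision rule using synthetic vectors (not actual model outputs, which come from `model.encode`):

```python
import math

def normalize(v):
    """Scale a vector to unit length, mirroring the model's Normalize() module."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def cosine(u, v):
    """Cosine similarity; for unit-length vectors this is just the dot product."""
    return sum(a * b for a, b in zip(normalize(u), normalize(v)))

# Hypothetical embedding pairs for illustration only.
similar = cosine([0.9, 0.1, 0.4], [0.8, 0.2, 0.5])
dissimilar = cosine([0.9, 0.1, 0.4], [-0.5, 0.9, -0.1])

# Decision rule used by the binary-classification evaluation above:
# a pair counts as "similar" when its cosine score exceeds the threshold.
THRESHOLD = 0.3258  # cosine_accuracy_threshold reported in the metrics
print(similar > THRESHOLD, dissimilar > THRESHOLD)
```

With real data, the two vectors would be rows of `model.encode([text_a, text_b])`; since the model already normalizes its outputs, the extra `normalize` step is a no-op there.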
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")  # replace with this model's Hub id
# Run inference
sentences = [
'- ะะฐะฝะดะฐ! ะฏ ะตัั ัะพะณะปะฐัะธะปัั ะฝะฐ ะฟะฐััะธะฒ "ัะฝะธะทั" ! ะะพ ััะพ ัะถะต ะดะฐะถะต ะฝะต ะฟะฐััะธะฒ, ััะพ ัะถะต ะฑะตะท ะฐะบัะธะฒ ะบะฐะบะพะน-ัะพ!\n- ะ ััะพ ัั ะพั ะผะตะฝั ั
ะพัะตัั-ัะพ? ะฏ ัะถะต ะฝะต ะผะพะณั ัะดะตัะถะธะฒะฐัััั!\n- ะะต ะฒะดะฐะฒะปะธะฒะฐะน ะผะตะฝั ะขะะ ะฒ ััะตะฝั-ัะพ!\n- ะขั. ะะพััะธ, ั
ะฒะฐัะธั ะตะปะพะทะธัั ะธ ะฝะฐัะปะฐะถะดะฐะนัั ะผะพะผะตะฝัะพะผ!\n- ะะพ ััะพ ะฝะต ัะตััะฝะพ! ะะพ ะพัะฝะพัะตะฝะธั, ะผะตะถะดั ะฟัะพัะธะผ, ะบ ัะต...ะผะผะผะผะผ!!!\n- ะขั. ะะพััะธ, ััะพ ััะพ?\n- ะะฐะฝัะธะบ!\n- ะฏ ะฒะธะถั, ะฝะฐ ะบะพะน ัััั ัั ะผะฝะต ะตะณะพ ะทะฐะฒัะทะฐะป?\n- ะขะฐะบ ัั ะฟะพั
ะพะถ ะฝะฐ ะฟะพะดะฐัะพะบ!\n- ะกัะฐั ะัะณะตะฝะพะผ ะพะณัะตะฑััั!\n- ะ ะฟะพัะตะผั ัั ะฝะต ัะฟัะพัะธัั "ะะพะผั?"\n- ะะฝะต ััะพ ะฝะต ะธะฝัะตัะตัะฝะพ! ะ ะบะพะผั?\n- ะะฝะต!!! ... *ะงะผะพะบ*\n- ะฅะผ... ะผะตะฝั ััะพ ะฝะต ััััะฐะธะฒะฐะตั!\n- ะงะตะณะพ?!!\n- ะขะพะณะพ!\n- ะะะะ!!!\n- ะะพะผัะธ! ะงัะพ ััะพ ะทะฝะฐัะธั? ะงัะพ ั ะะปะปะตะฝะพะผ?\n- ะงัะพ? ะ, ะะฐะฝะดะฐ. ะะปะปะตะฝะฐ ะฃะพะปะบะตัะฐ ัะฐะฝะธะปะธ ะฝะฐ ะผะธััะธะธ!\n- ะญัะพ ั ัะถะต ะฟะพะฝัะป! ะงัะพ ั ะฝะธะผ, ะณะพะฒะพัะธ ะบะพะฝะบัะตัะฝะตะน! - ัะฐะผััะฐะน ะฒััััั
ะฝัะป ะฝะฐัะฐะปัะฝะธะบะฐ.\n- ะะต ะฟะพะฒััะฐะน ะฝะฐ ะผะตะฝั ะณะพะปะพั! - ะฒะพะทะผััะธะปัั ัะผะพััะธัะตะปั.\n- ะะพั ั ะฒะฐัะตะน ัะตัััะพะน ััะพ-ะฝะธะฑัะดั ัะปััะธััั, ั ะฒะฐะผ ัะพะถะต ัะบะฐะถั "ะะต ะบัะธัะธัะต!" ะงัะพ ั ะะพััะธ?\n- ะญั
... ะฝะฐ ะผะธััะธะธ ะะปะปะตะฝ ะพัะปะตะฟ. ะะพ ะฝะต ะฟะตัะตะถะธะฒะฐะน. ะญัะพ ะฒัะตะผะตะฝะฝะพ! ะัะตะฝะธะต ะฒะพัััะฐะฝะพะฒะธััั! ะะตัััะฐ ัะตัะตะท 3!\n- 3 ะะะกะฏะฆะ?!\n- ะะฐ! ะขั ัะถ ะฝะต ะพะฑะธะถะฐะน ะตะณะพ ะฟะพะบะฐ.\n- ะะตะท ะฒะฐั ะทะฝะฐั!\n- ะขั ะบัะดะฐ?\n- ะ ะะปะปะตะฝั, ะบัะดะฐ ะถะต ะตัั! - ะณัะพะทะฝะพ ััะฒะบะฝัะป ัะฐะผััะฐะน.\n- ะั
…',  # sample 1: a long Russian-language fiction passage; mojibake-garbled in the source card, truncated here
'…',  # sample 2: a long Russian-language fiction passage (truncated)
'…',  # sample 3: a long Russian-language fiction passage (truncated)
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
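
`model.similarity` defaults to cosine similarity, so the same matrix can be reproduced by L2-normalizing the embeddings and taking a dot product. A minimal NumPy sketch with random stand-in embeddings (shapes chosen to match the example above):

```python
import numpy as np

# Stand-in for model.encode(sentences): 3 embeddings of dimension 384.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(3, 384))

# Cosine similarity = dot product of L2-normalized rows.
normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
similarities = normed @ normed.T

print(similarities.shape)  # (3, 3)
# Every vector has cosine similarity 1.0 with itself.
print(np.allclose(np.diag(similarities), 1.0))  # True
```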
## Evaluation
### Metrics
#### Binary Classification
* Evaluated with [<code>BinaryClassificationEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)
| Metric | Value |
|:--------------------------|:-----------|
| cosine_accuracy | 0.9215 |
| cosine_accuracy_threshold | 0.3258 |
| cosine_f1 | 0.752 |
| cosine_f1_threshold | 0.2845 |
| cosine_precision | 0.7465 |
| cosine_recall | 0.7575 |
| **cosine_ap** | **0.8412** |
| cosine_mcc | 0.702 |
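The thresholds reported above are used directly at prediction time: a sentence pair is classified as a match when the cosine similarity of its two embeddings meets or exceeds `cosine_accuracy_threshold` (or `cosine_f1_threshold` when optimizing F1). A minimal NumPy sketch of that decision rule, using toy vectors in place of real `model.encode(...)` output:

```python
import numpy as np

def cosine_sim(a, b):
    # cosine similarity between two 1-D vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Threshold reported in the metrics table above
ACCURACY_THRESHOLD = 0.3258

# Toy embeddings standing in for model.encode(...) output
emb1 = np.array([0.2, 0.9, 0.1])
emb2 = np.array([0.25, 0.85, 0.05])

score = cosine_sim(emb1, emb2)
is_match = score >= ACCURACY_THRESHOLD
print(score, is_match)  # score ≈ 0.996 → match
```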
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### json
* Dataset: json
* Size: 276,686 training samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | label |
|:--------|:--------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:-----------------------------|
| type | string | string | int |
| details | <ul><li>min: 442 tokens</li><li>mean: 510.89 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 451 tokens</li><li>mean: 511.55 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>1: 100.00%</li></ul> |
* Samples:
| sentence1 | sentence2 | label |
|:---|:---|:---|
| <code>ะงัะพ ััะพ-ัะพ ะฝะต ัะฐะบ, ะธะฝััะธัะธั ะฟะพะดัะบะฐะทัะฒะฐะปะฐ ะะฐะฝะทะฐัั ั ัะฐะผะพะณะพ ัััะฐ. ะะปะฐะณะพะฟะพะปััะฝะพ ะฟัะพะธะณะฝะพัะธัะพะฒะฐะฒ ะฟัะพะฑัะถะดะตะฝะธะต ั ะปะธะณัะพะผ ะฒ ะฟะพััะตะปะธ, ะะตััะตั ะฟะตัะธะพะดะธัะตัะบะธ ัะฟะฐะป ััะดะพะผ ั ั
ะพะทัะธะฝะพะผ. ะะฐะฝะทะฐั ะปะตะฝะธะฒะพ ะฝะพ, ะบะฐะบ ะธะทะฒะตััะฝะพ ะพัะณะฐะฝะธะทะผั ะฝะต ะพัะบะฐะถะตัั, ัะฟัััะธะปัั ะฝะฐ ะบัั
ะฝั. ะัะฝะพัะธัะตะปัะฝะพ ัะฟะพะบะพะนะฝะพ ะฟะพะตะฒ, ะธ ะธะทะฑะฐะฒะธะฒัะธัั ะพั ะฝะพะฒะพัะฒะปะตะฝะฝะพะณะพ ัััะฟะฐ, ะบะพัะพััะน ะฟัะพะปะธะป ะฝะฐ ะฝะตะณะพ ะฟะพะดะปะธะฒั ะบ ะผััั, ะฑะพัั ะฒัะตั ะฒะฐัะธะธ ะพัะฟัะฐะฒะธะปัั ะฒ ะดัั. ะััััะพ ะฒัะผัะฒัะธัั ะธ ะพะฑะฒัะทะฐะฒ ะบะพัะพัะบะพะต ะฟะพะปะพัะตะฝัะต ะฝะฐ ะฑะตะดัะฐั
, ะพะฝ ะฒะตัะฝัะปัั ะฒ ัะฒะพั ัะฟะฐะปัะฝั ะธ ะฟัะธะปะตะณ ะฝะฐ ะบัะพะฒะฐัั ััะดะพะผ ั ะปะธะณัะพะผ. ะะตะผะฝะพะณะพ ะฟะพััะตะฟะฐะฒ ะตะณะพ ะณัะธะฒั, ะฑััะฝะตั ัะฐะทะปะตะณัั ะฝะฐ ะบัะพะฒะฐัะธ. ะะธะฒะพัะฝะพะต ะถะต ะฒัะฟะพะผะฝะธะปะพ, ะบะฐะบ ะดะปะธะฝะฝะพะฒะพะปะพััะน ะฟะฐัะตะฝั ะธัะฟะพะปัะทะพะฒะฐะป ะตะณะพ ั
ะพะทัะธะฝะฐ ะบะฐะบ ัะฐะผะบั. ะ ะะตััะตั ะฟะพ ั
ะฐัะฐะบัะตัั ะฑัะป ะพัะตะฝั ะฟะพั
ะพะถ ะฝะฐ ะะฐะฝะทะฐัะฐ ัะพะฑััะฒะตะฝะฝะธัะตััะฒะพะผ, ะฟะพ ััะพะน ะฟัะธัะธะฝะต ะทะฒะตัั ะฒััะฐะป, ะฟะพัะพะฟัะฐะปัั ะฝะฐ ะฟะพััะตะปะธ ะธ ะทะฐะฑัะฐะปัั ะฝะฐ ัะฒะพะตะณะพ ั
ะพะทัะธะฝะฐ. ะะฐะฝะทะฐั ะฒะฝะพะฒั ะฝะต ะฟัะธะดะฐะป ััะพะผั ะทะฝะฐัะตะฝะธั, ะฟัะธะฝะธะผะฐั ะทะฐ ะฟะพะฟััะบั ะปะตะฝะธะฒะพะณะพ ะถะธะฒะพัะฝะพะณะพ ัะปะตะทัั ั ะบัะพะฒะฐัะธ. ะญัะพ ะธ ะฑัะปะพ ะตะณะพ ะพัะธะฑะบะพะน. ะกะฒะพะธะผ ะฝะตะผะฐะปัะผ ะฒะตัะพะผ ะะตััะตั ะฟัะธะดะฐะฒะธะป ะผัะถัะธะฝั ะบ ะฟะพััะตะปะธ, ะธ ะพัะดะตะปัะฝะพ ะฟัะธะดะฐะฒะธะป ะพะดะฝะพะน ะปะฐะฟะพะน ะตะณะพ ั...</code> | <code>ะะพะผะธะฝะต ะฝะตัะฟะตัะฝะพ ัะตะป ะฒ ััะพัะพะฝั ัะบะพะปั ะกะตะนัะธะฝ. ะฃัะพะบะธ ะตัะต ัะปะธ, ะฟะพััะพะผั ะตะผั ะฑัะปะพ ะฝะตะบัะดะฐ ัะพัะพะฟะธัััั, ะฐ ัะฒะพะธ ะพะฝ ะฑะปะฐะณะพะฟะพะปััะฝะพ ะฟัะพะต...ะบั
ะผ, ะฟัะพะฟัััะธะป, ะดะฐะฑั ะฝะฐะฒะตะดะฐัััั ะบ ะะฐะณะฐะผะธ. ะะฐัะตะผ, ะพะฝ ะธ ัะฐะผ ะดะพ ะบะพะฝัะฐ ะฝะต ะฟะพะฝะธะผะฐะป, ะฝะพ ะฟัะธะฒัะบ ัะปะตะดะพะฒะฐัั ัะฒะพะธะผ ะถะตะปะฐะฝะธัะผ ะธ ะบะพัะผะธัั ะฒะฝัััะตะฝะฝะธั
ะดะตะผะพะฝะพะฒ. ะ ะฝะฐััะฝะธะบะฐั
ะธะณัะฐะปะฐ ะฝะตะทะฐัะตะนะปะธะฒะฐั ะผะตะปะพะดะธั ะฝะฐ ะฐะฝะณะปะธะนัะบะพะผ, ะฐ ัะฐะผ ะะพะผะธะฝะต ะฝะต ะทะฐะผะพัะฐัะธะฒะฐะปัั ัะตะบััะพะผ, ะฝะฐัะปะฐะถะดะฐััั ะทะฒััะฐะฝะธะตะผ ะผัะทัะบะธ ะธ ะณะพะปะพัะพะผ ะฟะตะฒัะฐ.<br>ะะพะนะดั ะฒะพ ะดะฒะพั, ะพะฝ ะพะณะปัะดะตะปัั, ะธัะฐ ะฒั
ะพะด ะฒ ััะตะฑะฝัะน ะบะพัะฟัั. ะะฐะนะดั ะถะต ะฝัะถะฝัั ะดะฒะตัั, ะพะฝ ะฟัะพัะตะป ะฒะฝัััั, ะฟะพะดั
ะพะดั ะบ ัะฐัะฟะธัะฐะฝะธั. ะัะพะฒะตะดั ะฟะฐะปััะตะผ ะฟะพ ัะธััะต ะฝัะถะฝะพะณะพ ะบะปะฐััะฐ, ะพะฝ ะฒะทะณะปัะฝัะป ะฝะฐ ัะฐะผะธ ััะพะบะธ.<br>-ะฅะผ, ัะฟะพะฝัะบะธะน... ะะต ะดัะผะฐั, ััะพ ะพะฝ ะฑัะดะตั ะฟัะพัะธะฒ, ะตัะปะธ ั ะตะณะพ ะพัะผะฐะถั. - ะ, ะฟะพ ะฐะบัะปัะธ ัะปัะฑะฝัะฒัะธัั, ะฟะฐัะตะฝั ะฝะฐะฟัะฐะฒะธะปัั ะฝะฐ ะฒัะพัะพะน ััะฐะถ, ะบ ะบะฐะฑะธะฝะตัั ะฝะพะผะตั ััะธะฝะฐะดัะฐัั. ะัะตะดะฒะฐัะธัะตะปัะฝะพ ะทะฐะณะปัะฝัะฒ ะฒ ะทะฐะผะพัะฝัั ัะบะฒะฐะถะธะฝั, ัะฒะธะดะตะฒ ะผะพะปะพะดะตะฝัะบัั ััะธัะตะปัะฝะธัั ะธ ะฒัะณะปัะดัะฒะฐัััั ะธะท-ะฟะพะด ะฑะปัะทะบะธ ัะฐััะธัะพะฒะบั "I love yaoi", ะฒ ะพัะตัะตะดะฝะพะน ัะฐะท ะพัะบะฐะปะธะปัั ะธ ะฟัะพัะตะป ะฒ ะบะฐะฑะธะฝะตั. ะะตะฒััะบะฐ ะฝะต ััะฟะตะปะฐ ะดะฐะถะต ะฟะธะบะฝััั, ะบะฐะบ ะพะฝ ะฑัะป ั ะฟะฐััั ะะฐะณะฐะผะธ. ะะฐะบะปะพะฝะธะฒั...</code> | <code>1</code> |
| <code>ะงัะพ ััะพ-ัะพ ะฝะต ัะฐะบ, ะธะฝััะธัะธั ะฟะพะดัะบะฐะทัะฒะฐะปะฐ ะะฐะฝะทะฐัั ั ัะฐะผะพะณะพ ัััะฐ. ะะปะฐะณะพะฟะพะปััะฝะพ ะฟัะพะธะณะฝะพัะธัะพะฒะฐะฒ ะฟัะพะฑัะถะดะตะฝะธะต ั ะปะธะณัะพะผ ะฒ ะฟะพััะตะปะธ, ะะตััะตั ะฟะตัะธะพะดะธัะตัะบะธ ัะฟะฐะป ััะดะพะผ ั ั
ะพะทัะธะฝะพะผ. ะะฐะฝะทะฐั ะปะตะฝะธะฒะพ ะฝะพ, ะบะฐะบ ะธะทะฒะตััะฝะพ ะพัะณะฐะฝะธะทะผั ะฝะต ะพัะบะฐะถะตัั, ัะฟัััะธะปัั ะฝะฐ ะบัั
ะฝั. ะัะฝะพัะธัะตะปัะฝะพ ัะฟะพะบะพะนะฝะพ ะฟะพะตะฒ, ะธ ะธะทะฑะฐะฒะธะฒัะธัั ะพั ะฝะพะฒะพัะฒะปะตะฝะฝะพะณะพ ัััะฟะฐ, ะบะพัะพััะน ะฟัะพะปะธะป ะฝะฐ ะฝะตะณะพ ะฟะพะดะปะธะฒั ะบ ะผััั, ะฑะพัั ะฒัะตั ะฒะฐัะธะธ ะพัะฟัะฐะฒะธะปัั ะฒ ะดัั. ะััััะพ ะฒัะผัะฒัะธัั ะธ ะพะฑะฒัะทะฐะฒ ะบะพัะพัะบะพะต ะฟะพะปะพัะตะฝัะต ะฝะฐ ะฑะตะดัะฐั
, ะพะฝ ะฒะตัะฝัะปัั ะฒ ัะฒะพั ัะฟะฐะปัะฝั ะธ ะฟัะธะปะตะณ ะฝะฐ ะบัะพะฒะฐัั ััะดะพะผ ั ะปะธะณัะพะผ. ะะตะผะฝะพะณะพ ะฟะพััะตะฟะฐะฒ ะตะณะพ ะณัะธะฒั, ะฑััะฝะตั ัะฐะทะปะตะณัั ะฝะฐ ะบัะพะฒะฐัะธ. ะะธะฒะพัะฝะพะต ะถะต ะฒัะฟะพะผะฝะธะปะพ, ะบะฐะบ ะดะปะธะฝะฝะพะฒะพะปะพััะน ะฟะฐัะตะฝั ะธัะฟะพะปัะทะพะฒะฐะป ะตะณะพ ั
ะพะทัะธะฝะฐ ะบะฐะบ ัะฐะผะบั. ะ ะะตััะตั ะฟะพ ั
ะฐัะฐะบัะตัั ะฑัะป ะพัะตะฝั ะฟะพั
ะพะถ ะฝะฐ ะะฐะฝะทะฐัะฐ ัะพะฑััะฒะตะฝะฝะธัะตััะฒะพะผ, ะฟะพ ััะพะน ะฟัะธัะธะฝะต ะทะฒะตัั ะฒััะฐะป, ะฟะพัะพะฟัะฐะปัั ะฝะฐ ะฟะพััะตะปะธ ะธ ะทะฐะฑัะฐะปัั ะฝะฐ ัะฒะพะตะณะพ ั
ะพะทัะธะฝะฐ. ะะฐะฝะทะฐั ะฒะฝะพะฒั ะฝะต ะฟัะธะดะฐะป ััะพะผั ะทะฝะฐัะตะฝะธั, ะฟัะธะฝะธะผะฐั ะทะฐ ะฟะพะฟััะบั ะปะตะฝะธะฒะพะณะพ ะถะธะฒะพัะฝะพะณะพ ัะปะตะทัั ั ะบัะพะฒะฐัะธ. ะญัะพ ะธ ะฑัะปะพ ะตะณะพ ะพัะธะฑะบะพะน. ะกะฒะพะธะผ ะฝะตะผะฐะปัะผ ะฒะตัะพะผ ะะตััะตั ะฟัะธะดะฐะฒะธะป ะผัะถัะธะฝั ะบ ะฟะพััะตะปะธ, ะธ ะพัะดะตะปัะฝะพ ะฟัะธะดะฐะฒะธะป ะพะดะฝะพะน ะปะฐะฟะพะน ะตะณะพ ั...</code> | <code>ะะพะผะธะฝะต ะฑัะป ะฐะฝะณะตะปะพะผ ัะถะต ะพัะตะฝั ะดะฐะฒะฝะพ. ะะฝ ะดะฐะถะต ะฝะต ะฟะพะผะฝะธะป ัะบะพะปัะบะพ ะปะตั, ะดะฐะถะต ะฒะตะบะพะฒ ะฟัะพัะปะพ ั ัะพะณะพ ะผะพะผะตะฝัะฐ. ะะฝ ะปัะฑะธะป ัะธะดั ะฝะฐ ะพะดะฝะพะผ ะธะท ะพะฑะปะฐะบะพะฒ ะฝะฐะฑะปัะดะฐัั ะทะฐ ะะตะผะปะตะน, ะฐ ะพัะพะฑะตะฝะฝะพ ะพัะตะฝัั. ะ ััะพ ะฑัะป ะพะฑััะฝัะน ะดะตะฝั, ะฝะพ ะะพะผะธะฝะต ะทะฐั
ะพัะตะปะพัั ะฟะพัะผะพััะตัั ะฟะพะฑะปะธะถะต. ะ ะฐัะบััะฒ ะพะณัะพะผะฝัะต ะบััะปัั, ะพะฝ ััััะตะผะธะปัั ะบะฐะผะฝะตะผ ะฒะฝะธะท. ะะปั ะปัะดะตะน ััะพ ะฒัะณะปัะดะตะปะพ ะบะฐะบ ัะฟะฐะฒัะฐั ะทะฒะตะทะดะฐ, ััะบะฐั ะปะธะฝะธั ะฒ ะฝะตะฑะต. ะะธะบัะพ ะฝะต ะทะฝะฐะป, ััะพ ะฒัะต ััะธ ะปะธะฝะธะธ ัะตััะธะปะธัั ะฟะฐะดะฐััะธะผะธ ะฐะฝะณะตะปะฐะผะธ, ะปะธัั ะฟะพััะพะผั ะถะตะปะฐะฝะธั, ะบะพัะพััะต ะฑัะปะพ ะฟัะธะฝััะพ ะทะฐะณะฐะดัะฒะฐัั, ะธัะฟะพะปะฝัะปะธัั. ะะฐ ะฑะพะปััะพะน ัะบะพัะพััะธ ะผะพะปะพะดะพะน ะฐะฝะณะตะป ะฟัะธะทะตะผะปะธะปัั. ะะพะณะดะฐ ะพะฑะปะฐะบะพ ะฟัะปะธ ัะฐััะตัะปะพัั, ััะฐะปะพ ะฒะธะดะฝะพ, ััะพ ะพะฝ ััะพะธั ะฝะฐ ะพะดะฝะพะผ ะบะพะปะตะฝะต, ัะฟะธัะฐััั ััะบะฐะผะธ ะฒ ะทะตะผะปั. ะกะปะพะถะธะฒ ัะฒะพะธ ะบััะปัั, ะพะฝ ัะบััะป ะธั
ะพั ัะตะปะพะฒะตัะตัะบะธั
ะณะปะฐะท, ะตะณะพ ะพะดะตะถะดะฐ ะฒัะตะณะดะฐ ะฑัะปะฐ ะบะฐะบ ะทะตะผะฝะฐั, ะธ ัะตะนัะฐั ัะพะถะต. ะะตะปะฐั ะฑะพััะพะฒะบะฐ, ัะฐัััะตะณะฝััะฐั ะฑะปะตะดะฝะพ ะณะพะปัะฑะฐั ััะฑะฐัะบะฐ ะฑะตะท ััะบะฐะฒะพะฒ ะธ ัะตะผะฝะพ ัะธะฝะธะต ะดะถะธะฝัั. ะะฐะธะฝัะตัะตัะพะฒะฐะฝะฝะพ ัะผะพััั ะฟะพ ััะพัะพะฝะฐะผ, ะพะฝ ะฟะพะฑัะตะป ะฒ ััะพัะพะฝั ะณะพัะพะดะฐ, ะผะธะผะพ ะตั
ะฐะปะธ ะผะฐัะธะฝั, ะฝะพ ะพะฝ ะธั
ะฝะต ะทะฐะผะตัะฐะป. ะะฐะนะดั ะฒ ะณะพัะพะด, ะพะฝ ะฑัะป ะฟะพััะธ ะพัะปะตะฟะปะตะฝ ะฝะตะพะฝะพะฒัะผะธ ะฒัะฒะตัะบะฐะผะธ ะธ ะผะฝะพะณะพัะธัะปะตะฝะฝัะผะธ...</code> | <code>1</code> |
| <code>ะงัะพ ััะพ-ัะพ ะฝะต ัะฐะบ, ะธะฝััะธัะธั ะฟะพะดัะบะฐะทัะฒะฐะปะฐ ะะฐะฝะทะฐัั ั ัะฐะผะพะณะพ ัััะฐ. ะะปะฐะณะพะฟะพะปััะฝะพ ะฟัะพะธะณะฝะพัะธัะพะฒะฐะฒ ะฟัะพะฑัะถะดะตะฝะธะต ั ะปะธะณัะพะผ ะฒ ะฟะพััะตะปะธ, ะะตััะตั ะฟะตัะธะพะดะธัะตัะบะธ ัะฟะฐะป ััะดะพะผ ั ั
ะพะทัะธะฝะพะผ. ะะฐะฝะทะฐั ะปะตะฝะธะฒะพ ะฝะพ, ะบะฐะบ ะธะทะฒะตััะฝะพ ะพัะณะฐะฝะธะทะผั ะฝะต ะพัะบะฐะถะตัั, ัะฟัััะธะปัั ะฝะฐ ะบัั
ะฝั. ะัะฝะพัะธัะตะปัะฝะพ ัะฟะพะบะพะนะฝะพ ะฟะพะตะฒ, ะธ ะธะทะฑะฐะฒะธะฒัะธัั ะพั ะฝะพะฒะพัะฒะปะตะฝะฝะพะณะพ ัััะฟะฐ, ะบะพัะพััะน ะฟัะพะปะธะป ะฝะฐ ะฝะตะณะพ ะฟะพะดะปะธะฒั ะบ ะผััั, ะฑะพัั ะฒัะตั ะฒะฐัะธะธ ะพัะฟัะฐะฒะธะปัั ะฒ ะดัั. ะััััะพ ะฒัะผัะฒัะธัั ะธ ะพะฑะฒัะทะฐะฒ ะบะพัะพัะบะพะต ะฟะพะปะพัะตะฝัะต ะฝะฐ ะฑะตะดัะฐั
, ะพะฝ ะฒะตัะฝัะปัั ะฒ ัะฒะพั ัะฟะฐะปัะฝั ะธ ะฟัะธะปะตะณ ะฝะฐ ะบัะพะฒะฐัั ััะดะพะผ ั ะปะธะณัะพะผ. ะะตะผะฝะพะณะพ ะฟะพััะตะฟะฐะฒ ะตะณะพ ะณัะธะฒั, ะฑััะฝะตั ัะฐะทะปะตะณัั ะฝะฐ ะบัะพะฒะฐัะธ. ะะธะฒะพัะฝะพะต ะถะต ะฒัะฟะพะผะฝะธะปะพ, ะบะฐะบ ะดะปะธะฝะฝะพะฒะพะปะพััะน ะฟะฐัะตะฝั ะธัะฟะพะปัะทะพะฒะฐะป ะตะณะพ ั
ะพะทัะธะฝะฐ ะบะฐะบ ัะฐะผะบั. ะ ะะตััะตั ะฟะพ ั
ะฐัะฐะบัะตัั ะฑัะป ะพัะตะฝั ะฟะพั
ะพะถ ะฝะฐ ะะฐะฝะทะฐัะฐ ัะพะฑััะฒะตะฝะฝะธัะตััะฒะพะผ, ะฟะพ ััะพะน ะฟัะธัะธะฝะต ะทะฒะตัั ะฒััะฐะป, ะฟะพัะพะฟัะฐะปัั ะฝะฐ ะฟะพััะตะปะธ ะธ ะทะฐะฑัะฐะปัั ะฝะฐ ัะฒะพะตะณะพ ั
ะพะทัะธะฝะฐ. ะะฐะฝะทะฐั ะฒะฝะพะฒั ะฝะต ะฟัะธะดะฐะป ััะพะผั ะทะฝะฐัะตะฝะธั, ะฟัะธะฝะธะผะฐั ะทะฐ ะฟะพะฟััะบั ะปะตะฝะธะฒะพะณะพ ะถะธะฒะพัะฝะพะณะพ ัะปะตะทัั ั ะบัะพะฒะฐัะธ. ะญัะพ ะธ ะฑัะปะพ ะตะณะพ ะพัะธะฑะบะพะน. ะกะฒะพะธะผ ะฝะตะผะฐะปัะผ ะฒะตัะพะผ ะะตััะตั ะฟัะธะดะฐะฒะธะป ะผัะถัะธะฝั ะบ ะฟะพััะตะปะธ, ะธ ะพัะดะตะปัะฝะพ ะฟัะธะดะฐะฒะธะป ะพะดะฝะพะน ะปะฐะฟะพะน ะตะณะพ ั...</code> | <code>ะขััะฝะฐะตัะธ ะปะตะถะฐะป ะฝะฐ ะฟะพััะตะปะธ ะฒ ะพะดะฝะพะผ ัะตะปะบะพะฒะพะผ ั
ะฐะปะฐัะต, ะพะถะธะดะฐั ะฟัะธั
ะพะดะฐ ัะฒะพะตะณะพ ะฟะฐัะฝั. ะะพะณะดะฐ ะดะฒะตัั ะพัััะปะฐัั, ะพะฝ ัะฐะทะฒัะฐัะฝะพ ัะฐะทะฒะตะป ะฝะพะถะบะธ ะธ ะฟัะธะฝัะปัั ะฒัะปะธะทัะฒะฐัั ัะฒะพะธ ะฟะฐะปััั.<br>-ะ-ะด-ะดะถัะดะฐะนะผะต! ะงัะพ ะั ะดะตะปะฐะตัะต?!<br>-ะะผะผะผ, ะฅะฐะฐะฐะฐััะพ, ั ัะตะฑั ัะฐะบ ั
ะพัั! - ะะทะฝัะฒะฐั ะพั ะถะตะปะฐะฝะธั, ะฟัะพะธะทะฝะตั ะกะฐะฒะฐะดะฐ.<br>-ะะฐ ะั ััะพ... ะะฐะบ ั ะผะพะณั? - ะขะพั ะฟะพะบัะฐัะฝะตะป ะธ ะพัะฒะตัะฝัะปัั.<br>-ะะฝะฐะตัั ััะพ, ะะพะบัะดะตัะฐ... - ะะพะปะพั ััะฐะป ัะตัะดะธััะผ. - ะั ัะฐัััะฐะตะผัั! - ะ ะฝะต ะดะฐะฒ ะฟะพะดััะฒะฝะธะบั ะธ ัะปะพะฒะฐ ัะบะฐะทะฐัั, ะกะฐะฒะฐะดะฐ ะทะฐะฒัะทะฐะป ะฟะพัั ั
ะฐะปะฐัะฐ ะธ, ะพะฑัะฒัะธัั, ะฟะพะบะธะฝัะป ะบะฒะฐััะธัั. ะะฝ ะฑััััะพ ะดะพะฑัะฐะปัั ะดะพ ัะฒะพะตะน ะผะฐัะธะฝั ะธ, ัะตะฒ ะฒ ะฝะตะต, ะฝะต ะผะตะฝะตะต ะฑััััะพ ะดะพะฑัะฐะปัั ะดะพ ััะตะผะฝะพะน ะบะฒะฐััะธัั ะกะบัะฐะปะพ. ะะพัะฐัะฐะฑะฐะฝะธะฒ ะฒ ะดะฒะตัั ะฒัะตะณะพ ะผะธะฝััั, ะพะฝ ะฒะพัะฒะฐะปัั ะฒ ะบะพัะธะดะพั ะธ ะฝะฐ ะฝะตะดะพัะผะตะฝะฝัะน ะฒะทะณะปัะด ะกะบัะฐะปะพ ัะบะธะฝัะป ั
ะฐะปะฐัะธะบ.<br>-ะัะพะพะพะน, ะกะฐะฒะฐะดะฐ, ัั ัะต ัะฒะพัะธัั? - ะกัะฟะตัะฑะธ ะฟััะฐะปัั ัะพะฑัะฐัั ัะตะปัััั ั ะฟะพะปะฐ, ะฐ ัะฐะผ ะะตััััะน ะพะฟะตััั ััะบะฐะผะธ ะฝะฐ ััะตะฝั, ะฟัะพะณะฝัะป ัะฟะธะฝะบั ะธ, ัะฐัััะฐะฒะธะฒ ะฝะพะถะบะธ ะฟะพััะตะฑะพะฒะฐะป.<br>-ะขัะฐั
ะฝะธ ะผะตะฝั!<br>-ะงะตะณะพ? ะขั ะฟััะฝัะน ััะพ ะปะธ? - ะงะตะปัััั ะผะตัะฝะธะบะฐ ะฒะพ ะฒัะพัะพะน ัะฐะท ะฟะพะทะฝะฐะบะพะผะธะปะฐัั ั ะฟะพะปะพะผ.<br>-ะัะพััะพ ะฒััะฐะฒั. ะะฝะต ะผะพะถะฝะพ ะฑะตะท ัะผะฐะทะบะธ ะธ ะณะฐะฝะดะพะฝะพะฒ. - ะ ะฐะทะดัะฐ...</code> | <code>1</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
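The loss above trains with in-batch negatives: within a batch, each `sentence2` is the positive for its own `sentence1` and a negative for every other one. Cosine similarities are scaled by 20 and fed to a softmax cross-entropy whose targets lie on the diagonal. A minimal NumPy sketch of that objective (toy random embeddings, not the model's):

```python
import numpy as np

def mnr_loss(emb_a, emb_b, scale=20.0):
    """In-batch-negatives loss: for row i, emb_b[i] is the positive
    and every other emb_b[j] is a negative."""
    # row-normalize so the dot product is cosine similarity
    a = emb_a / np.linalg.norm(emb_a, axis=1, keepdims=True)
    b = emb_b / np.linalg.norm(emb_b, axis=1, keepdims=True)
    scores = scale * (a @ b.T)  # (batch, batch) scaled similarity matrix
    # softmax cross-entropy with target j == i on the diagonal
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))

rng = np.random.default_rng(0)
batch = rng.normal(size=(4, 8))
loss_pos = mnr_loss(batch, batch)        # each row paired with itself: low loss
loss_neg = mnr_loss(batch, batch[::-1])  # mismatched pairings: high loss
print(loss_pos, loss_neg)
```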
### Evaluation Dataset
#### json
* Dataset: json
* Size: 184,428 evaluation samples
* Columns: <code>sentence1</code>, <code>sentence2</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
| | sentence1 | sentence2 | label |
|:--------|:--------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:-----------------------------|
| type | string | string | int |
| details | <ul><li>min: 429 tokens</li><li>mean: 510.07 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>min: 406 tokens</li><li>mean: 510.3 tokens</li><li>max: 512 tokens</li></ul> | <ul><li>1: 100.00%</li></ul> |
* Samples:
| sentence1 | sentence2 | label |
|:---|:---|:---|
| <code>ะะตะปะพ ะฑัะปะพ ะฒะตัะตัะพะผ, ะบะพะณะดะฐ ั ะพัะฟัะฐะฒะปัะปะฐัั ะฒ ะณะพััะธ ะบ ะะฐััะธะฝะต. ะฏ ะฒะทัะปะฐ ั ัะพะฑะพะน ะคะฐะบัะฝะดะพ, ะธ ะัะฐะฑะธ. ะญัะพ ะผะพะธ ััะผะฐััะตะดัะธะต, ะฝะพ ะปัััะธะต ะดััะทัั. ะะพัะพัะต ะบะพัะพัะบะพ: ะฏ - ะะพะดะพะฒะธะบะฐ, ะฝะพ ะผะพะถะฝะพ ะฟัะพััะพ ะะพะดะพ.<br>ะขะฐะบ ะบะฐะบ ะขะธะฝะฐ ะถะธะฒัั ะฝะฐ 9 ััะฐะถะต, ะฝะฐะผ ะฟัะธัะปะพัั ะตั
ะฐัั ะฝะฐ ะปะธััะต, ะธะฝะฐัะต ะฝะฐ ะปะตััะฝะธัะต ะผั ะฑั ะฟะพะดะพั
ะปะธ. ะะพัะพัะต ะทะฐั
ะพะดะธะผ ะฒ ะปะธัั, ะธ ะฒะพั ะผั ัะถะต ะฝะฐ ะฝัะถะฝะพะผ ะฝะฐะผ ััะฐะถะต.<br>ะะฒะตัะบะธ ะพัะบััะปะธัั, ั ะพัะฒะตัะฝัะปะฐัั ะฝะฐ ะผะธะฝััะบั. ะ ะฟะพัะพะผ ะฟะพะฒะตัะฝัะปะฐัั, ัะผะพััั ะฐ ััะธั
ะธะดะธะพัะพะฒ ะฝะตัั. ะะดััะณ ะดะฒะตัะบะธ ะทะฐะบััะปะธัั, ั ะดะตะฒััะบะฐ ะฝะต ะฟัะณะปะธะฒะฐั ะฝะพ, ัััั ะฐ ะฒะดััะณ ั ััั ะทะฐััััะฝั?<br>- ะัะธะดััะบะธ, ะฑะปะธะฝ! ะัะบัะพะนัะต ะปะธัั! ะะฐััััะฝั ะฒะตะดั!<br>ะ ะพัะฒะตั ั ััะปััะฐะปะฐ ะปะธัั ัะผะตั
ะดะฒัั
ะฟะฐัะฝะตะน. ะั! ะะผ ะฝะต ะฟะพะทะดะพัะพะฒะธััั ะบะพะณะดะฐ ั ะพัััะดะฐ ะฒัะนะดั.<br>ะะดััะณ, ะดะฒะตัะบะธ ะพัะบััะปะธัั, ั ะฒัะปะตะทะปะฐ ะธะท ะปะธััะฐ, ะธ ััะธ ะดะตะฑะธะปั ะฝะฐัะธะปัะฝะพ ะทะฐัะฐัะบะธะฒะฐัั ะผะตะฝั ะฒ ะปะธัั, ะธ ะถะผัั ะฝะฐ ะบะฝะพะฟะบั, ััะพะฑั ะปะธัั ะฟะพะตั
ะฐะป ะดะพ 1 ััะฐะถะฐ.<br>- ะััััะตะต! ะคะฐะบั! ะะฐะดะพ ะฑััััะตะต ะะพะดะพ ัะฟัััะธัััั ะฝะฐ 1 ััะฐะถ! - ะะฐะบัะธัะฐะป ะัะฐะฑััะฝะธ.<br>- ะะพะฝัะป! - ะ ะพัะฒะตั ะบัะธะบะฝัะป ะคะฐะบัะฝะดะพ.<br>ะขัั ะดะฒะตัะบะธ ะทะฐะบััะปะธัั, ะธ ะผะตะฝั ะฟะพะฝะตัะปะพ ะฝะฐ 1 ััะฐะถ. ะงะตัะตะท ะฝะตัะบะพะปัะบะพ ะผะธะฝัั ั ัะฟัััะธะปะฐัั ะฝะฐ ะฝัะถะฝัะน ััะฐะถ, ะธ ะฒะดัั...</code> | <code>ะฏ - ะะบะบะธะฝะณ. ะกัะฝ ะฒะพะถะดั, ะฟะตัะฒัะน ะบะพัะพััะน ะฟัะธัััะธะป ะดัะฐะบะพะฝะฐ.<br>ะฃ ะผะตะฝั ะตััั ะปัะฑะธะผะฐั ะดะตะฒััะบะฐ, ะฝะพ ั ะดะฐะฒะฝะพ ะตั ะฝะต ะฒะธะดะตะป. ะะพัะปะตะดะฝะธะต ะฝะพะฒะพััะธ ะผะตะฝั ะฟัะธะฒะตะปะธ ะฒ ัะถะฐั.<br>ะัะพะผะณะธะปัะดะฐ, ะดัะฐะบะพะฝะธั
ะฐ, ะฟะพะณะธะฑะปะฐ. ะ ะัััะธะด ะฝะต ะฟะตัะตะฝะตัะปะฐ ะตั ัะผะตััะธ, ะธ ะฟะพะฒะตัะธะปะฐัั...<br>ะะพะต ัะตัะดัะต ัะฐะทะฑะธะปะพัั ะฝะฐ ััะพ ะพัะบะพะปะบะพะฒ, ะผะพั ะตะดะธะฝััะฒะตะฝะฝะฐั ะปัะฑะพะฒั ะฟะพะณะธะฑะปะฐ.<br>ะะพะฒะพััั ััะพ ะฒะธะบะธะฝะณะธ ะฑะตััะตัะดะตัะฝัะต, ะฝะพ ััะพ ะฝะต ัะฐะบ. ะั ัะพะถะต ัะผะตะตะผ ะปัะฑะธัั!<br>ะ ะฐะฝะฝะตะต ัััะพ. ะัะต ะฒะธะบะธะฝะณะธ ะตัั ัะฟัั, ะทะฐ ะพะบะฝะพะผ ั
ะพะปะพะดะฝะพ, ัะพะปะฝัะต, ะฝะพ ะฒะตัะตั ะฒัั ะถะต ะตััั.<br>ะฏ ะฟัะธะพัะบััะป ะณะปะฐะทะฐ, ะธ ะทะฐะผะตัะธะป, ััะพ ะผะพั ะปัะฑะธะผะฐั ัะตะฟัะธะปะธั ะฒัั ะตัั ัะฟะธั.<br>ะขะฐะบ ั
ะพะปะพะดะฝะพ, ััะพ ะผะฝะต ะทะฐั
ะพัะตะปะพัั ะฒัั ะฒะตัะฝะพััั ะฟัะพะปะตะถะฐัั ะฒ ััะฟะปะพะน ะฟะพััะตะปัะบะต. ะะพั ะบัะพะฒะฐัั ัะฐะบ ะธ ะผะฐะฝะธะปะฐ ะปะตัั, ะธ ะทะฐััะฐะฒะธัั ัะฟะฐัั. ะะพ, ััั ะผะพะน ัััะฝัะน ะดััะณ ะฟัะพัะฝัะปัั. ะะฝ ัััะฐะฒะธะปัั ะฝะฐ ะผะตะฝั ัะฒะพะธะผะธ ะฑะพะปััะธะผะธ ะทะตะปัะฝัะผะธ ะณะปะฐะทะฐะผะธ.<br>- ะงัะพ? - ะะต ะฟะพะฝะธะผะฐะป ั ััะพ ะฟัะพะธัั
ะพะดะธั, ะฝะพ ะฝะฐ ะผะพะน ะฒะพะฟัะพั ะะตะทะทัะฑะธะบ ะปะธัั ัััะบะฝัะป.<br>ะะพ ััั ะพะฝ ัะฐัะฟัะฐะฒะธะป ัะฒะพะธ ะบััะปัั, ะธ ะฟะพะดะปะตัะตะป ะบะพ ะผะฝะต. ะ ัะฒะพั ะผะพัะดั ะฟะพะปะพะถะธะป ะผะฝะต ะฝะฐ ััะบะธ. ะฏ ัะฒะฝะพ ะฝะต ะฟะพะฝะธะผะฐะป ััะพ ะพะฝ ะพั ะผะตะฝั ั
ะพัะตั.<br>ะะพ ััั ะพะฝ ัะฒะพะตะน ะผะพัะดะพะน ัััะฐะฒะธะปัั ะฝะฐ ัะฒะพะธ ะบััะปัั, ะธ ั
ะฒะพัั. ...</code> | <code>1</code> |
| <code>ะะตะปะพ ะฑัะปะพ ะฒะตัะตัะพะผ, ะบะพะณะดะฐ ั ะพัะฟัะฐะฒะปัะปะฐัั ะฒ ะณะพััะธ ะบ ะะฐััะธะฝะต. ะฏ ะฒะทัะปะฐ ั ัะพะฑะพะน ะคะฐะบัะฝะดะพ, ะธ ะัะฐะฑะธ. ะญัะพ ะผะพะธ ััะผะฐััะตะดัะธะต, ะฝะพ ะปัััะธะต ะดััะทัั. ะะพัะพัะต ะบะพัะพัะบะพ: ะฏ - ะะพะดะพะฒะธะบะฐ, ะฝะพ ะผะพะถะฝะพ ะฟัะพััะพ ะะพะดะพ.<br>ะขะฐะบ ะบะฐะบ ะขะธะฝะฐ ะถะธะฒัั ะฝะฐ 9 ััะฐะถะต, ะฝะฐะผ ะฟัะธัะปะพัั ะตั
ะฐัั ะฝะฐ ะปะธััะต, ะธะฝะฐัะต ะฝะฐ ะปะตััะฝะธัะต ะผั ะฑั ะฟะพะดะพั
ะปะธ. ะะพัะพัะต ะทะฐั
ะพะดะธะผ ะฒ ะปะธัั, ะธ ะฒะพั ะผั ัะถะต ะฝะฐ ะฝัะถะฝะพะผ ะฝะฐะผ ััะฐะถะต.<br>ะะฒะตัะบะธ ะพัะบััะปะธัั, ั ะพัะฒะตัะฝัะปะฐัั ะฝะฐ ะผะธะฝััะบั. ะ ะฟะพัะพะผ ะฟะพะฒะตัะฝัะปะฐัั, ัะผะพััั ะฐ ััะธั
ะธะดะธะพัะพะฒ ะฝะตัั. ะะดััะณ ะดะฒะตัะบะธ ะทะฐะบััะปะธัั, ั ะดะตะฒััะบะฐ ะฝะต ะฟัะณะปะธะฒะฐั ะฝะพ, ัััั ะฐ ะฒะดััะณ ั ััั ะทะฐััััะฝั?<br>- ะัะธะดััะบะธ, ะฑะปะธะฝ! ะัะบัะพะนัะต ะปะธัั! ะะฐััััะฝั ะฒะตะดั!<br>ะ ะพัะฒะตั ั ััะปััะฐะปะฐ ะปะธัั ัะผะตั
ะดะฒัั
ะฟะฐัะฝะตะน. ะั! ะะผ ะฝะต ะฟะพะทะดะพัะพะฒะธััั ะบะพะณะดะฐ ั ะพัััะดะฐ ะฒัะนะดั.<br>ะะดััะณ, ะดะฒะตัะบะธ ะพัะบััะปะธัั, ั ะฒัะปะตะทะปะฐ ะธะท ะปะธััะฐ, ะธ ััะธ ะดะตะฑะธะปั ะฝะฐัะธะปัะฝะพ ะทะฐัะฐัะบะธะฒะฐัั ะผะตะฝั ะฒ ะปะธัั, ะธ ะถะผัั ะฝะฐ ะบะฝะพะฟะบั, ััะพะฑั ะปะธัั ะฟะพะตั
ะฐะป ะดะพ 1 ััะฐะถะฐ.<br>- ะััััะตะต! ะคะฐะบั! ะะฐะดะพ ะฑััััะตะต ะะพะดะพ ัะฟัััะธัััั ะฝะฐ 1 ััะฐะถ! - ะะฐะบัะธัะฐะป ะัะฐะฑััะฝะธ.<br>- ะะพะฝัะป! - ะ ะพัะฒะตั ะบัะธะบะฝัะป ะคะฐะบัะฝะดะพ.<br>ะขัั ะดะฒะตัะบะธ ะทะฐะบััะปะธัั, ะธ ะผะตะฝั ะฟะพะฝะตัะปะพ ะฝะฐ 1 ััะฐะถ. ะงะตัะตะท ะฝะตัะบะพะปัะบะพ ะผะธะฝัั ั ัะฟัััะธะปะฐัั ะฝะฐ ะฝัะถะฝัะน ััะฐะถ, ะธ ะฒะดัั...</code> | <code>ะะธะพะปะตััะฐ ะบะฐะบ ะฒัะตะณะดะฐ ัะฟะฐะปะฐ ะฒ ัะฒะพะตะน ะบัะพะฒะฐัะธ, ะธ, ะฒ ะพัะตัะตะดะฝะพะน ัะฐะท ะตะน ัะฝะธะปัั ะบะพัะผะฐั. ะ ะพัะตัะตะดะฝะพะน ัะฐะท ะตะน ัะฝะธะปะฐัั ะตะต ะฟะพะบะพะนะฝะฐั ะผะฐัั, ะะฐัะธั. ะะธะพะปะตััะฐ ะฒััะฐะปะฐ, ะฒัั ะฒัะฟะพัะตะฒัะฐั, ะฒัั ะธัะฟัะณะฐะฝะฝะฐั.<br>ะะดััะณ ะดะฒะตัั ะบะพะผะฝะฐัั ะพัะบััะปะฐัั, ะธะท ะทะฐ ะดะฒะตัะธ ะฟะพะบะฐะทะฐะปัั ัะฝะพัะฐ. ะะฝ ะณะปัะดั ะฝะฐ ะะธะพะปะตััั ะฝะฐั
ะผััะธะป ะฑัะพะฒะธ, ะธ ะฟะพะดะพััะป ะบ ะฝะตะน.<br>- ะะธะพะปะตััะฐ, ััะพ ั ัะพะฑะพะน? - ะกะฟัะพัะธะป ะพะฝ.<br>- ะะธัะตะณะพ. ะัะพััะพ ะพะฟััั ะบะพัะผะฐั ะฟัะธัะฝะธะปัั.<br>- ะะฟััั?<br>ะคะตะดะตัะธะบะพ ัะตะป ะฝะฐ ะบัะฐะน ะบัะพะฒะฐัะธ, ะธ ะพะฑะฝัะป ะตะต. ะขะฐ ะฝะต ััะฐะปะฐ ัะพะฟัะพัะธะฒะปััััั. ะะฝะฐ ะพะฑะฝัะปะฐ ะตะณะพ ะฒ ะพัะฒะตั, ัะตะนัะฐั ะตะน ะฝัะถะฝะฐ ะฟะพะดะดะตัะถะบะฐ. ะะฟััั ัะพะฝ, ะพะฟััั ัะปัะทั. ะะพะณะดะฐ ะถะต ะฑะตะดะฝะพะน ะดะตะฒััะบะต ะฟัะตะบัะฐัะธััั ัะฝะธัััั ะตะต ะผะฐัั?<br>ะะธะพะปะตััะฐ ะฒััะฐะปะฐ ะธะท ัะฒะพะตะน ะฟะพััะตะปะธ, ะธ ะคะตะดะตัะธะบะพ ะฒััะตะป ะธะท ะบะพะผะฝะฐัั. ะะตะฒััะบะฐ ะฝะฐัะฐะปะฐ ะพะดะตะฒะฐัััั, ะพะดะตะฒัะธัั ะพะฝะฐ ัะฟัััะธะปะฐัั ะฝะฐ ะฟะตัะฒัะน ััะฐะถ, ะฒ ะณะพััะธะฝัั.<br>ะะฐะผะตัะธะฒ ััะพ ะฝะธะบะพะณะพ ะบัะพะผะต ะคะตะดะตัะธะบะพ ะฒ ะณะพััะธะฝะพะน ะฝะตัั, ะพะฝะฐ ะฟัะพัะธะปะฐ:<br>- ะ ะณะดะต ะฒัะต?<br>- ะะปัะณะฐ ะฟะพัะปะฐ ะฟะพะบัะฟะฐัั ะฟัะพะดัะบัั, ะฐ ะ ะพะผะฐะปัะพ ะธ ะะตัะผะฐะฝ ะฝะฐ ัะฐะฑะพัะต.<br>- ะะพะฝััะฝะพ.<br>ะัั ะบะฐะบ ะฒัะตะณะดะฐ, ะฝะธัะตะณะพ ะฝะต ะผะตะฝัะตััั, ะบัะพะผะต ะผะพะธั
ะบะพัะผะฐัะพะฒ.<br>ะฏ ัะตะปะฐ ะฝะฐ ะดะธะฒะฐะฝ, ะฝะฐะฟัะพัะธะฒ ะคะตะดะตัะธะบะพ, ะพะฝ ััะพ ัะพ ะฟะธัะฐะป ะฝะฐ ะฑัะผะฐะถะบะต. ะะฐะฒะตั...</code> | <code>1</code> |
| <code>ะะตะปะพ ะฑัะปะพ ะฒะตัะตัะพะผ, ะบะพะณะดะฐ ั ะพัะฟัะฐะฒะปัะปะฐัั ะฒ ะณะพััะธ ะบ ะะฐััะธะฝะต. ะฏ ะฒะทัะปะฐ ั ัะพะฑะพะน ะคะฐะบัะฝะดะพ, ะธ ะัะฐะฑะธ. ะญัะพ ะผะพะธ ััะผะฐััะตะดัะธะต, ะฝะพ ะปัััะธะต ะดััะทัั. ะะพัะพัะต ะบะพัะพัะบะพ: ะฏ - ะะพะดะพะฒะธะบะฐ, ะฝะพ ะผะพะถะฝะพ ะฟัะพััะพ ะะพะดะพ.<br>ะขะฐะบ ะบะฐะบ ะขะธะฝะฐ ะถะธะฒัั ะฝะฐ 9 ััะฐะถะต, ะฝะฐะผ ะฟัะธัะปะพัั ะตั
ะฐัั ะฝะฐ ะปะธััะต, ะธะฝะฐัะต ะฝะฐ ะปะตััะฝะธัะต ะผั ะฑั ะฟะพะดะพั
ะปะธ. ะะพัะพัะต ะทะฐั
ะพะดะธะผ ะฒ ะปะธัั, ะธ ะฒะพั ะผั ัะถะต ะฝะฐ ะฝัะถะฝะพะผ ะฝะฐะผ ััะฐะถะต.<br>ะะฒะตัะบะธ ะพัะบััะปะธัั, ั ะพัะฒะตัะฝัะปะฐัั ะฝะฐ ะผะธะฝััะบั. ะ ะฟะพัะพะผ ะฟะพะฒะตัะฝัะปะฐัั, ัะผะพััั ะฐ ััะธั
ะธะดะธะพัะพะฒ ะฝะตัั. ะะดััะณ ะดะฒะตัะบะธ ะทะฐะบััะปะธัั, ั ะดะตะฒััะบะฐ ะฝะต ะฟัะณะปะธะฒะฐั ะฝะพ, ัััั ะฐ ะฒะดััะณ ั ััั ะทะฐััััะฝั?<br>- ะัะธะดััะบะธ, ะฑะปะธะฝ! ะัะบัะพะนัะต ะปะธัั! ะะฐััััะฝั ะฒะตะดั!<br>ะ ะพัะฒะตั ั ััะปััะฐะปะฐ ะปะธัั ัะผะตั
ะดะฒัั
ะฟะฐัะฝะตะน. ะั! ะะผ ะฝะต ะฟะพะทะดะพัะพะฒะธััั ะบะพะณะดะฐ ั ะพัััะดะฐ ะฒัะนะดั.<br>ะะดััะณ, ะดะฒะตัะบะธ ะพัะบััะปะธัั, ั ะฒัะปะตะทะปะฐ ะธะท ะปะธััะฐ, ะธ ััะธ ะดะตะฑะธะปั ะฝะฐัะธะปัะฝะพ ะทะฐัะฐัะบะธะฒะฐัั ะผะตะฝั ะฒ ะปะธัั, ะธ ะถะผัั ะฝะฐ ะบะฝะพะฟะบั, ััะพะฑั ะปะธัั ะฟะพะตั
ะฐะป ะดะพ 1 ััะฐะถะฐ.<br>- ะััััะตะต! ะคะฐะบั! ะะฐะดะพ ะฑััััะตะต ะะพะดะพ ัะฟัััะธัััั ะฝะฐ 1 ััะฐะถ! - ะะฐะบัะธัะฐะป ะัะฐะฑััะฝะธ.<br>- ะะพะฝัะป! - ะ ะพัะฒะตั ะบัะธะบะฝัะป ะคะฐะบัะฝะดะพ.<br>ะขัั ะดะฒะตัะบะธ ะทะฐะบััะปะธัั, ะธ ะผะตะฝั ะฟะพะฝะตัะปะพ ะฝะฐ 1 ััะฐะถ. ะงะตัะตะท ะฝะตัะบะพะปัะบะพ ะผะธะฝัั ั ัะฟัััะธะปะฐัั ะฝะฐ ะฝัะถะฝัะน ััะฐะถ, ะธ ะฒะดัั...</code> | <code>ะฏ - ะะถะฐะผะธะปั, ะดะพัั ะทะฝะฐัะฝะพะณะพ ะณัะฐัะฐ. ะะพั ะผะฐัั ัะผะตัะปะฐ ะฟัะธ ัะพะดะฐั
, ะฐ ั ะพััะฐะปะฐัั ะถะธะฒะฐ. ะฃะถะต ะบะฐะบ 20 ะปะตั ะฟัะพัะปะพ ัะพ ัะผะตััะธ ะปัะฑััะธะน ะผะฐัะตัะธ. ะะพะน ะพัะตั ัะฝะพะฒะฐ ะถะตะฝะธะปัั ััะพะฑั ั ะผะตะฝั ะฑัะป ะฟัะธะผะตั ะดะปั ะฟะพะดัะฐะถะฐะฝะธั.<br>ะะพั ะผะฐัะตั
ั ะทะพะฒัั ะญะปะธะทะฐะฑะตั. ะัะพะดะต ะธะผั ะดะพะฑัะพะต, ะฐ ัะฐะผะฐ ะถะตะฝัะธะฝะฐ ะฝะต ะธะท ะปัััะธั
.<br>ะั ั ะญะปะธะทะฐะฑะตั ะฝะต ะปะฐะดะธะปะธ. ะะพะน ะพัะตั ัะตั
ะฐะป, ะดะพะผะฐ ะพััะฐะปะฐัั ั ั ะผะฐัะตั
ะพะน, ะบะพัะพัะฐั ัะพะฒัะตะผ ะฝะต ะทะฐะฝะธะผะฐะปะฐัั ะผะพะธะผ ะฒะพัะฟะธัะฐะฝะธะตะผ ะบะฐะบ ะฟะพัััะธะป ะตะน ะผะพะน ะพัะตั.<br>ะะพะผ ั ะฝะฐั ะฑัะป ะฑะพะณะฐััะน, ะบัะฐัะธะฒัะน. ะ ะผะฝะพะณะพ ัะปัะณ.<br>ะะพะฟััะฝัะน ะฒะตัะตั ะดัะตั ะผะฝะต ะฟััะผะพ ะฒ ะปะธัะพ. ะ ะพะบััะณะต ะฟะพัะฐะถะตะฝั ัะฒะตัั.<br>ะกะตะนัะฐั ั ะฒ ัะฐะดั. ะฏ ะพัะตะฝั ัะตะดะบะพ ัะปัะฑะฐััั, ัะฐะบ ะบะฐะบ ัะฐะบะธั
ัะฐะดะพััะฝัั
ะผะพะผะตะฝัะพะฒ, ั ะผะตะฝั ะฑัะปะพ ะพัะตะฝั ะผะฐะปะพ.<br>ะฏ ัะตะดะบะพ ะฒัั
ะพะถั ะธะท ัะฒะพะตะณะพ ะดะพะผะฐ, ะดะฐะถะต ะฟัะฐะบัะธัะตัะบะธ ะธะท ัะฒะพะตะน ะบะพะผะฝะฐัั ะฝะต ะฒัั
ะพะถั.<br>ะะพั ะผะฐัะตั
ะฐ ะพัะตะฝั ัะตะดะบะพ ะฒัะฟััะบะฐะตั ะผะตะฝั ะฟะพะดััะฐัั ะฒะพะทะดัั
ะพะผ, ะพะฝะฐ ะณะพะฒะพัะธั ััะพ ะผะฝะต ะฝะตะปัะทั ะฒัั
ะพะดะธัั ะฝะฐ ัะปะธัั, ะธ ะพะฑัะฐัััั ั ะปัะดัะผะธ, ะฟะพะบะฐ ั ะฝะต ะฝะฐััััั ะฟัะฐะฒะธะปะฐะผะธ ััะธะบะตัะฐ.<br>ะะตะผะฝะพะณะพ ะฟะพะดััะฐะฒ ะฒะพะทะดัั
ะพะผ ั ะทะฐัะปะฐ ะฒ ะดะพะผ. ะะพ ะผะฝะต ััะฐะทั ะถะต ะฟะพะดะฑะตะถะฐะปะฐ ะญะปะธะทะฐะฑะตั.<br>ะะปะฐะทะฐ ะตั ะฑัะปะธ ะฝะฐะฟะพะปะฝะตะฝั ะณะฝะตะฒะพะผ. ะะฝะฐ ะฟัะพะถะธะณะฐะปะฐ ะผะตะฝั ัะฒะพะธะผ ะทะปะพะฒะตัะธะผ ะฒะทะณะปัะดะพะผ...</code> | <code>1</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
```json
{
"scale": 20.0,
"similarity_fct": "cos_sim"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 136
- `per_device_eval_batch_size`: 136
- `weight_decay`: 0.01
- `num_train_epochs`: 5
- `bf16`: True
- `load_best_model_at_end`: True
- `batch_sampler`: no_duplicates
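For reference, the non-default values above map onto `SentenceTransformerTrainingArguments` roughly as follows. This is a hedged sketch against the sentence-transformers 4.x API; `output_dir` and `save_strategy` are assumptions added here (the latter because `load_best_model_at_end` requires the save and eval strategies to match):

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # assumed; not stated in the card
    eval_strategy="epoch",
    per_device_train_batch_size=136,
    per_device_eval_batch_size=136,
    weight_decay=0.01,
    num_train_epochs=5,
    bf16=True,
    save_strategy="epoch",  # assumed, so load_best_model_at_end is valid
    load_best_model_at_end=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```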
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 136
- `per_device_eval_batch_size`: 136
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.01
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 5
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
</details>
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss | Validation Loss | cosine_ap |
|:-------:|:---------:|:-------------:|:---------------:|:----------:|
| 0.0491 | 100 | 3.2263 | - | - |
| 0.0983 | 200 | 2.8883 | - | - |
| 0.1474 | 300 | 2.7172 | - | - |
| 0.1966 | 400 | 2.6694 | - | - |
| 0.2457 | 500 | 2.5956 | - | - |
| 0.2948 | 600 | 2.5362 | - | - |
| 0.3440 | 700 | 2.5064 | - | - |
| 0.3931 | 800 | 2.4516 | - | - |
| 0.4423 | 900 | 2.4311 | - | - |
| 0.4914 | 1000 | 2.4303 | - | - |
| 0.5405 | 1100 | 2.3851 | - | - |
| 0.5897 | 1200 | 2.3797 | - | - |
| 0.6388 | 1300 | 2.3416 | - | - |
| 0.6880 | 1400 | 2.3335 | - | - |
| 0.7371 | 1500 | 2.3025 | - | - |
| 0.7862 | 1600 | 2.2933 | - | - |
| 0.8354 | 1700 | 2.3108 | - | - |
| 0.8845 | 1800 | 2.273 | - | - |
| 0.9337 | 1900 | 2.237 | - | - |
| 0.9828 | 2000 | 2.2317 | - | - |
| 1.0 | 2035 | - | 7.5524 | 0.8340 |
| 1.0319 | 2100 | 2.1496 | - | - |
| 1.0811 | 2200 | 2.0707 | - | - |
| 1.1302 | 2300 | 2.0904 | - | - |
| 1.1794 | 2400 | 2.066 | - | - |
| 1.2285 | 2500 | 2.0367 | - | - |
| 1.2776 | 2600 | 2.0456 | - | - |
| 1.3268 | 2700 | 2.0691 | - | - |
| 1.3759 | 2800 | 2.0345 | - | - |
| 1.4251 | 2900 | 2.0432 | - | - |
| 1.4742 | 3000 | 2.03 | - | - |
| 1.5233 | 3100 | 2.0081 | - | - |
| 1.5725 | 3200 | 2.0012 | - | - |
| 1.6216 | 3300 | 1.9942 | - | - |
| 1.6708 | 3400 | 1.9842 | - | - |
| 1.7199 | 3500 | 1.9824 | - | - |
| 1.7690 | 3600 | 1.9665 | - | - |
| 1.8182 | 3700 | 1.9583 | - | - |
| 1.8673 | 3800 | 1.975 | - | - |
| 1.9165 | 3900 | 1.9307 | - | - |
| 1.9656 | 4000 | 1.9338 | - | - |
| 2.0 | 4070 | - | 7.6937 | 0.8397 |
| 2.0147 | 4100 | 1.9082 | - | - |
| 2.0639 | 4200 | 1.7883 | - | - |
| 2.1130 | 4300 | 1.777 | - | - |
| 2.1622 | 4400 | 1.7939 | - | - |
| 2.2113 | 4500 | 1.8113 | - | - |
| 2.2604 | 4600 | 1.783 | - | - |
| 2.3096 | 4700 | 1.7794 | - | - |
| 2.3587 | 4800 | 1.7704 | - | - |
| 2.4079 | 4900 | 1.7864 | - | - |
| 2.4570 | 5000 | 1.7766 | - | - |
| 2.5061 | 5100 | 1.7344 | - | - |
| 2.5553 | 5200 | 1.7621 | - | - |
| 2.6044 | 5300 | 1.7413 | - | - |
| 2.6536 | 5400 | 1.7428 | - | - |
| 2.7027 | 5500 | 1.7287 | - | - |
| 2.7518 | 5600 | 1.7603 | - | - |
| 2.8010 | 5700 | 1.7455 | - | - |
| 2.8501 | 5800 | 1.7561 | - | - |
| 2.8993 | 5900 | 1.7630 | - | - |
| 2.9484 | 6000 | 1.7152 | - | - |
| 2.9975 | 6100 | 1.7227 | - | - |
| 3.0 | 6105 | - | 7.4898 | 0.8378 |
| 3.0467 | 6200 | 1.6102 | - | - |
| 3.0958 | 6300 | 1.6090 | - | - |
| 3.1450 | 6400 | 1.5979 | - | - |
| 3.1941 | 6500 | 1.5864 | - | - |
| 3.2432 | 6600 | 1.6182 | - | - |
| 3.2924 | 6700 | 1.6090 | - | - |
| 3.3415 | 6800 | 1.6004 | - | - |
| 3.3907 | 6900 | 1.5681 | - | - |
| 3.4398 | 7000 | 1.582 | - | - |
| 3.4889 | 7100 | 1.6019 | - | - |
| 3.5381 | 7200 | 1.5959 | - | - |
| 3.5872 | 7300 | 1.608 | - | - |
| 3.6364 | 7400 | 1.5961 | - | - |
| 3.6855 | 7500 | 1.5751 | - | - |
| 3.7346 | 7600 | 1.5572 | - | - |
| 3.7838 | 7700 | 1.5721 | - | - |
| 3.8329 | 7800 | 1.5519 | - | - |
| 3.8821 | 7900 | 1.5431 | - | - |
| 3.9312 | 8000 | 1.5659 | - | - |
| 3.9803 | 8100 | 1.5586 | - | - |
| 4.0 | 8140 | - | 7.4912 | 0.8411 |
| 4.0295 | 8200 | 1.4872 | - | - |
| 4.0786 | 8300 | 1.4554 | - | - |
| 4.1278 | 8400 | 1.4866 | - | - |
| 4.1769 | 8500 | 1.4601 | - | - |
| 4.2260 | 8600 | 1.4879 | - | - |
| 4.2752 | 8700 | 1.4872 | - | - |
| 4.3243 | 8800 | 1.4797 | - | - |
| 4.3735 | 8900 | 1.4861 | - | - |
| 4.4226 | 9000 | 1.4578 | - | - |
| 4.4717 | 9100 | 1.4648 | - | - |
| 4.5209 | 9200 | 1.4625 | - | - |
| 4.5700 | 9300 | 1.4596 | - | - |
| 4.6192 | 9400 | 1.4689 | - | - |
| 4.6683 | 9500 | 1.4475 | - | - |
| 4.7174 | 9600 | 1.4452 | - | - |
| 4.7666 | 9700 | 1.4561 | - | - |
| 4.8157 | 9800 | 1.4471 | - | - |
| 4.8649 | 9900 | 1.4256 | - | - |
| 4.9140 | 10000 | 1.4613 | - | - |
| 4.9631 | 10100 | 1.4794 | - | - |
| **5.0** | **10175** | **-** | **7.3990** | **0.8412** |
* The bold row denotes the saved checkpoint.
</details>
### Framework Versions
- Python: 3.10.18
- Sentence Transformers: 4.1.0
- Transformers: 4.52.4
- PyTorch: 2.7.1+cu128
- Accelerate: 1.7.0
- Datasets: 3.6.0
- Tokenizers: 0.21.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
```
#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--> |
ekiprop/bert-wnli-3-epochs-2025-06-17-0808 | ekiprop | 2025-06-17T08:09:22Z | 0 | 0 | transformers | [
"transformers",
"safetensors",
"bert",
"text-classification",
"generated_from_trainer",
"base_model:google-bert/bert-base-uncased",
"base_model:finetune:google-bert/bert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] | text-classification | 2025-06-17T08:08:36Z | ---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: bert-wnli-3-epochs-2025-06-17-0808
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-wnli-3-epochs-2025-06-17-0808
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7137
- Accuracy: 0.4085
## Model description
More information needed
## Intended uses & limitations
More information needed
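Since usage information has not yet been provided, here is a minimal sketch using the standard `transformers` text-classification pipeline. The model id comes from this card; the sentence pair is an illustrative WNLI-style example, and the output labels will be the generic `LABEL_0`/`LABEL_1` unless an `id2label` mapping was saved with the checkpoint. Note that the reported validation accuracy (0.4085) is low, so treat predictions as a demonstration only.

```python
from transformers import pipeline

# Load the fine-tuned WNLI checkpoint from this repository
classifier = pipeline(
    "text-classification",
    model="ekiprop/bert-wnli-3-epochs-2025-06-17-0808",
)

# WNLI is a sentence-pair task: pass premise and hypothesis together
result = classifier({
    "text": "The trophy didn't fit in the suitcase because it was too big.",
    "text_pair": "The trophy was too big.",
})
print(result)
```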
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 40 | 0.7273 | 0.4225 |
| No log | 2.0 | 80 | 0.7149 | 0.4085 |
| No log | 3.0 | 120 | 0.7137 | 0.4085 |
### Framework versions
- Transformers 4.52.4
- Pytorch 2.7.0+cu128
- Datasets 3.6.0
- Tokenizers 0.21.1
|
songkey/hm5b_reference | songkey | 2025-06-17T07:51:08Z | 12 | 0 | diffusers | [
"diffusers",
"safetensors",
"arxiv:2410.22901",
"base_model:stable-diffusion-v1-5/stable-diffusion-v1-5",
"base_model:finetune:stable-diffusion-v1-5/stable-diffusion-v1-5",
"license:mit",
"region:us"
] | null | 2025-06-12T10:04:48Z | ---
base_model:
- stable-diffusion-v1-5/stable-diffusion-v1-5
library_name: diffusers
license: mit
---
Model of [**HelloMeme**](https://songkey.github.io/hellomeme/)
[**Project Page**](https://songkey.github.io/hellomeme/) | [**Code Page**](https://github.com/HelloVision/HelloMeme) | [**Arxiv**](https://arxiv.org/abs/2410.22901) | [**ComfyUI**](https://github.com/HelloVision/ComfyUI_HelloMeme) | [**Demo**](https://www.modelscope.cn/studios/songkey/HelloMeme)
**BibTeX:**
```bibtex
@misc{zhang2024hellomemeintegratingspatialknitting,
title={HelloMeme: Integrating Spatial Knitting Attentions to Embed High-Level and Fidelity-Rich Conditions in Diffusion Models},
author={Shengkai Zhang and Nianhong Jiao and Tian Li and Chaojie Yang and Chenhui Xue and Boya Niu and Jun Gao},
year={2024},
eprint={2410.22901},
archivePrefix={arXiv},
primaryClass={cs.CV},
url={https://arxiv.org/abs/2410.22901},
}
``` |
talzoomanzoo/LIMO-full-Qwen-2.5-1.5B-Instruct | talzoomanzoo | 2025-06-17T07:45:53Z | 11 | 0 | transformers | [
"transformers",
"safetensors",
"qwen2",
"text-generation",
"llama-factory",
"generated_from_trainer",
"conversational",
"zho",
"eng",
"fra",
"spa",
"por",
"deu",
"ita",
"rus",
"jpn",
"kor",
"vie",
"tha",
"ara",
"base_model:Qwen/Qwen2.5-1.5B-Instruct",
"base_model:finetune:Q... | text-generation | 2025-03-24T00:57:43Z | ---
library_name: transformers
license: apache-2.0
base_model: Qwen/Qwen2.5-1.5B-Instruct
tags:
- llama-factory
- generated_from_trainer
language:
- zho
- eng
- fra
- spa
- por
- deu
- ita
- rus
- jpn
- kor
- vie
- tha
- ara
model-index:
- name: LIMO-full-Qwen-2.5-1.5B-Instruct
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# LIMO-full-Qwen-2.5-1.5B-Instruct
This model is a fine-tuned version of [Qwen/Qwen2.5-1.5B-Instruct](https://huggingface.co/Qwen/Qwen2.5-1.5B-Instruct) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
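While the card is being completed, inference can be sketched with the standard `transformers` pipeline. The model id comes from this card; the prompt and generation parameters are illustrative choices:

```python
from transformers import pipeline

# Load the fine-tuned model from this repository
generator = pipeline(
    "text-generation",
    model="talzoomanzoo/LIMO-full-Qwen-2.5-1.5B-Instruct",
    device_map="auto",
)

# The base model is chat-tuned, so pass messages in chat format
messages = [{"role": "user", "content": "What is 15% of 240? Show your reasoning."}]
output = generator(messages, max_new_tokens=256, return_full_text=False)[0]
print(output["generated_text"])
```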
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 15
### Training results
### Framework versions
- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
|
veddhanth/lora-trained-xl-stage-2-finetuned-enc-v2-spat-1989-map-3 | veddhanth | 2025-06-17T07:24:12Z | 1 | 0 | diffusers | [
"diffusers",
"tensorboard",
"text-to-image",
"diffusers-training",
"lora",
"template:sd-lora",
"stable-diffusion-xl",
"stable-diffusion-xl-diffusers",
"base_model:stabilityai/stable-diffusion-xl-base-1.0",
"base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0",
"license:openrail++",
"re... | text-to-image | 2025-06-17T07:14:38Z | ---
base_model: stabilityai/stable-diffusion-xl-base-1.0
library_name: diffusers
license: openrail++
instance_prompt: a realistic portrait of sks face
widget: []
tags:
- text-to-image
- diffusers-training
- diffusers
- lora
- template:sd-lora
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
---
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# SDXL LoRA DreamBooth - veddhanth/lora-trained-xl-stage-2-finetuned-enc-v2-spat-1989-map-3
<Gallery />
## Model description
These are veddhanth/lora-trained-xl-stage-2-finetuned-enc-v2-spat-1989-map-3 LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0.
The weights were trained using [DreamBooth](https://dreambooth.github.io/).
LoRA for the text encoder was enabled: True.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
## Trigger words
You should use `a realistic portrait of sks face` to trigger the image generation.
## Download model
Weights for this model are available in Safetensors format.
[Download](https://huggingface.co/veddhanth/lora-trained-xl-stage-2-finetuned-enc-v2-spat-1989-map-3/tree/main) them in the Files & versions tab.
## Intended uses & limitations
#### How to use
```python
# TODO: add an example code snippet for running this diffusion pipeline
```
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training details
[TODO: describe the data used to train the model] |
Megha06/PixelcopterEnv | Megha06 | 2025-06-17T07:23:23Z | 0 | 0 | null | [
"Pixelcopter-PLE-v0",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] | reinforcement-learning | 2025-06-17T06:38:11Z | ---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: PixelcopterEnv
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: 42.70 +/- 62.51
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn how to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
music991758/uuu_fine_tune_taipower | music991758 | 2025-06-17T07:20:20Z | 5 | 0 | null | [
"safetensors",
"gpt2",
"license:apache-2.0",
"region:us"
] | null | 2025-06-17T06:11:36Z | ---
license: apache-2.0
---
|
HsuHuggingFace/uuu_fine_tune_gpt2 | HsuHuggingFace | 2025-06-17T07:18:09Z | 2 | 0 | null | [
"safetensors",
"gpt2",
"license:apache-2.0",
"region:us"
] | null | 2025-06-17T06:20:42Z | ---
license: apache-2.0
---
|
HedyKoala17/uuu_fine_tune_taipower | HedyKoala17 | 2025-06-17T07:10:45Z | 5 | 0 | null | [
"safetensors",
"gpt2",
"license:apache-2.0",
"region:us"
] | null | 2025-06-17T06:18:43Z | ---
license: apache-2.0
---
|
krs375/uuu_fine_tune_taipower | krs375 | 2025-06-17T07:00:39Z | 5 | 0 | null | [
"safetensors",
"gpt2",
"license:apache-2.0",
"region:us"
] | null | 2025-06-17T06:23:24Z | ---
license: apache-2.0
---
|
LarryAIDraw/Char_Honkai_Bronya_V3_Pony | LarryAIDraw | 2025-06-17T06:57:12Z | 0 | 0 | null | [
"license:creativeml-openrail-m",
"region:us"
] | null | 2025-06-17T06:32:13Z | ---
license: creativeml-openrail-m
---
https://civitai.com/models/130516/bronya-zaychik-4in1-ponyxl15?modelVersionId=1900974 |