[SOURCE: https://he.wikipedia.org/wiki/פורטל:ישראל] | [TOKENS: 4720]
פורטל:ישראל (Portal: Israel)

The State of Israel declared its independence in 1948 (5 Iyar 5708), about three years after the end of the Holocaust. Its establishment was accompanied by a violent conflict with the local Arab population and with the neighboring Arab states. The Israeli–Arab conflict persists to this day, despite the peace agreements reached over the years. Because of the conflict, Israel maintains compulsory service in the Israel Defense Forces, and the army holds a central place in public life. Israel numbers about 9.5 million residents, most of them Jews, alongside sizable minority groups: Israeli Arabs (mostly Muslim, partly Christian), Druze and Bedouin, and smaller minorities such as the Circassians. Israel has an advanced economy based on high-tech industry, science, trade, services and agriculture.

Featured article: one of the oldest cities of the Land of Israel, whose past spans some 5,000 years, from prehistoric times onward. Identified in 1871, it has since been the subject of numerous archaeological excavations and remains central to the study of ancient settlement in the Land of Israel.

Twelve of the prime ministers who have served in Israel did not complete their term on the date fixed in advance by law: David Ben-Gurion (twice), Moshe Sharett, Golda Meir, Menachem Begin, Ehud Barak and Ehud Olmert resigned; Levi Eshkol died in office; Ariel Sharon was incapacitated by a stroke in office; the terms of Yitzhak Shamir and Benjamin Netanyahu were cut short by early elections after which they were not returned to office; Shimon Peres handed the office to Yitzhak Shamir under the pre-agreed rotation; and Yitzhak Rabin was assassinated during his second term.
Portal navigation templates — Templates on Israel: Israel's wars and security affairs. History templates: the Yishuv and the State (immigration waves, clandestine immigration, defense organizations, notable affairs). Government and law in the State of Israel: the Basic Laws, legislation and the courts, landmark rulings and the Declaration of Independence. Foreign relations: יחסי ישראל–סרי לנקה, יחסי הפיליפינים–ישראל, יחסי ישראל–הודו. The Land of Israel: the Hellenistic period in the Land of Israel, Kfar Saba, Masada, Nesher (city), ancient sites, the common viper (צפע מצוי) and more. The Israel Defense Forces: decorated soldiers of the IDF, the Merkava tank, IDF armored personnel carriers, the Cherbourg boats (ספינות שרבורג), operations and units. Jerusalem: the archaeology of Jerusalem, the walls of Jerusalem, Jerusalem in the Crusader period, the Knesset Menorah, Mamilla, Aelia Capitolina (איליה קפיטולינה). Biographies: among others אברהם יצחק הכהן קוק, אליעזר בן־יהודה, זאב ז'בוטינסקי, יצחק בן־צבי, יצחק שמיר, ישראל קסטנר, מנחם בגין, שולמית אלוני, שמעון פרס, נתן אלתרמן, עמוס קינן and פנחס רוטנברג. Culture and art: בוריס שץ, Israeli classical music, the sculpture נמרוד, Israeli sculpture and cinema, the Hebrew theatre. Traditions: Israeli culinary culture.

Featured biography — דודו גבע (14 March 1950 – 15 February 2005) was an Israeli painter, illustrator, comics author, caricaturist and humorist. Over the years Geva published satirical comic strips and humorous illustrated columns in the national and local press, much of which was later collected in dozens of books. In his later years he also worked outside institutional journalism, in independent and subversive comics and humor, and he is considered one of the central shapers of Israeli comics, humor and illustration culture.
Geva published caricatures and a satirical column in the weekly "העולם הזה", in "חדשות" (including the sections "שירת הברווז" and "משמעות החיים"), in the local papers "כל העיר" and "העיר" (among them the column "שתיקת הברווזים"), in "מעריב" and in "הארץ". Alongside his journalism he published dozens of books, alone or with partners — mostly as author — including the book that collected the strips built around the emblematic duck character that appeared in "העולם הזה".

Portal footer: featured article — additional lists — requested articles about the State of Israel — Did you know? — full list of the portal's subcategories and articles.
======================================== |
[SOURCE: https://he.wikipedia.org/wiki/%D7%95%D7%99%D7%A7%D7%99%D7%A4%D7%93%D7%99%D7%94:%D7%9E%D7%96%D7%A0%D7%95%D7%9F] | [TOKENS: 64263]
ויקיפדיה:מזנון (the Village Pump) is the community's central notice board. It serves for announcements and general discussions that have no better home elsewhere; for other matters, the dedicated request pages should be used. [edit] — jump to the bottom of the page.

The spelling "נצח" in article titles — status: done. In a number of articles the deficient spelling "נצח" (or "נצחה") appears, and it is not the standard spelling; the accepted spelling today is "ניצח" / "ניצחה". I propose moving the affected pages to the standard spelling — among them נצחים, נצחה and several other page titles containing the word. If there is no objection here I will carry out the moves, in line with the earlier decision at שיחה:נצחים#כתיב. Please also see my related note on the other affected pages. — אור שמש • talk 21:00, 22 November 2025 (IST)

Cleanup of obsolete bot notices — Following the replacement of the notification bot 17 years ago, and since those old messages have lost their meaning and now only confuse readers of the talk pages of long-inactive accounts, I ask that the old bot notices be deleted from the talk pages of users who are no longer active, and that the deletion be carried out as follows: remove the notice where it is effectively the only content left on the page. What do you think? — אור מנגו • talk 22:20, 23 November 2025 (IST)
I will try to explain the request again, briefly, to move it forward. — אור מנגו • talk 23:24, 15 December 2025 (IST)
Renewing the request once more. — אור מנגו • talk 18:11, 27 December 2025 (IST)

Placement of footnote markers relative to punctuation — Back in 2008 a discussion was held about the proper placement of footnote tags relative to punctuation marks, and for whatever reason no binding conclusion was published. Later discussions likewise ended without a firm ruling (whether the marker goes before the period or comma, or after them, varies by article). The question recently came up again for me at ויקיפדיה:ייעוץ/בקשות/ארכיון 2, and I would like to know whether the community's practice has since settled — is there by now an accepted rule for where footnote markers go relative to punctuation, once and for all? — איש עמוד • talk 13:15, 5 December 2025 (IST)

Our dear bureaucrats — I will open by saying that I like them all, that none of them owes me anything, and that none of them ever promised to serve forever. Once we had a rule of "not active — not an operator": the idea was that holding a role implies some minimum of actions, and whoever performs fewer actions is no longer "in the loop" and should not be holding the seat of someone who would do more. Looking at the list of bureaucrats, the picture is uncomfortable: for understandable reasons of fatigue, the veterans have stopped contributing their share, and whoever still formally carries the title is a "seat-holder" whose practical role faded long ago. Whoever stands for election to such a role effectively takes on a commitment: to be available, to invest in the role whatever it requires, and to perform it as expected. It seems the bureaucrats are not managing this. There are appointments that have been pending for a long time, and even matters marked "urgent" wait for weeks while the bureaucrats do not get around to them. For example, requests on the permissions page are not handled within any reasonable time. This is not an accusation against individuals: whoever volunteered, took the role as another hat, and then discovered he no longer has the time or desire to fill it — the decent thing is to say so, inform the community, step down honorably, and clear the spot for someone who does. Right now there are two holders of the bureaucrat permission and, as I understand it, one of them is on his way out — so what we may soon need is not a "backup" but a "replacement".
Personally, as a veteran operator and a "retired bureaucrat" who handed back his keys (apologies in advance for the framing), I watched the burnout of bureaucrats up close. What matters is that the community hear the bureaucrats' own distress signal honestly — the message is roughly: "I am a busy person right now and have no capacity for the role; this may go on for who knows how long; someone else could fill the role better than I can at present." One can wait a little, but not forever, and in my view we must add new bureaucrats — the matter keeps being postponed and the backlog only deepens. When you hear the bureaucrats themselves say it aloud, and for months watch the appointment queues and urgent permission requests pile up unanswered, the conclusion draws itself. Regards — קיפודנחש 21:26, 11 December 2025 (IST)
First we need to fix the flawed way bureaucrats are chosen and promote candidates, not just hold yet another vote. Two bureaucrats is too few; as a first step there should be a non-trivial number — at least 3 bureaucrats. * אני Hanay • talk • [signature tagline] 16:45, 18 December 2025 (IST)

Wave of deletions of information in articles related to Somaliland — Following the developments of 2022 around Somaliland, information connected to it keeps being removed from articles. See, for example, the news item "ישראל עומדת לצד סומלילנד – וזו לא הפעם הראשונה", on the ישראל היום site, 27 December 2025, and the unresolved story of Israel's 1990 request regarding recognition. Who has been deleting Somaliland-related information from the articles, consistently, for over two years — and why? Are there more articles like this? Can readers be shown a record of the actions of the editors involved in this wave of deletions? Among other things, three images were removed, at least one of them of documented historical authenticity. I ask that the matter be examined on its merits. — גל-נור • talk 00:02, 29 December 2025 (IST)

Article request: עדנה איסמעיל — apparently one of the most prominent women in Somaliland today, and currently without an article. In the 1950s she studied nursing and midwifery in Britain, reportedly the first Somali woman to do so. She was married to the man who twice headed the government of the Somali Republic (1960, 1967–1969) and later served as President of Somaliland (1993–2002). She was appointed a minister in Somaliland's government and then served as its first female Foreign Minister (2003–2006). She worked for the World Health Organization (WHO) and was its representative in Djibouti. In 2002 she founded the maternity hospital bearing her name in Hargeisa, which markedly reduced maternal and infant mortality; the hospital also operates a university and trains medical staff. She is a leading activist in the struggle against female genital mutilation (FGM), which she herself underwent, and holds honorary doctorates for her work. In 2023 she was awarded the Templeton Prize in recognition of her contribution to women's health and rights. With thanks to Hila Livne; I request the creation of this article — can it be prioritized within the task queue of the ויקיפדיה 51% women's project?
— גל-נור • talk 22:14, 2 January 2026 (IST)

Our dear Wikipedia institutions — This section is, in essence, a request that someone write an essay. A standing question of newcomers is "where exactly is this place's rulebook?", and the honest answer is: it's somewhat nebulous. Personally the nebulousness does not bother me, but newcomers genuinely struggle with questions like "who decides here", "how is policy set", "how is policy changed" and "how does one gain influence" — so it would be worth someone writing a short essay (or a long one; short is what I know how to write, to my regret) describing the situation as we actually see it. There will surely be inaccuracies, mistakes, maybe serious ones — that is perfectly fine; someone will correct it, extend it, and we move forward.

I'll start with the historical chapter, "to the best of my understanding": when Hebrew Wikipedia was founded, the editors who joined contributed not out of legal scholarship but out of the encyclopedic spirit. When disputes arose they tried to reach agreement among themselves; when agreement could not be reached, disputes were sometimes settled by "votes". The "community" was tiny — sometimes only a handful of editors took part in such a vote — and one could finish with whoever "won" carrying the point and whoever "lost" accepting it. As Wikipedia flourished and branched out, new editors joined and veterans retired, and a need arose to institutionalize and document communal decisions — hence "procedures", and alongside them the habit of quoting precedent. What happened, somewhat ironically, is that much of what functions today as binding practice sits scattered: some rules were fixed in an orderly decision, some rest on what some Wikipedian once wrote and stuck because it was followed in practice, and some were set by vote. The decision pages were sorted under the internal decisions category and its subcategories, but the mapping is incomplete: there are decisions fixed by formal consensus, decisions that rest on long-standing acceptance rather than any recorded resolution, and decisions set by vote (for instance, a page widely treated as binding — such as guidance that in practice lives under a help-page title — never went through a formal decision at all).

Hebrew Wikipedia inherited many of its structures from the wider Wikipedia tradition: Wikipedia is careful not to give role-holders imposing titles that radiate formal authority, on the principle that "no one here is a boss". Those who hold the deletion buttons are called not "managers" but "operators" (מפעילים); those with appointment powers are called not chiefs but "bureaucrats"; those who can inspect accounts are called simply "checkers" (בודקים). The venue where most things are decided is called, modestly, the "מזנון" (canteen/Village Pump), and a sizable share of decision votes is run under the half-joking name "פרלמנט" — a joke, since some editors invoke "decisions of the parliament" with a mistaken solemnity, as if it were an "authority".
In practice, a large share of Wikipedia's institutions crystallized through votes and actions far more than through deliberate legislation. For example, one of the more consequential decision pages — the one describing which deletion candidacies get summary treatment and which get a full discussion — was fixed quickly and incidentally rather than by a deliberate, orderly process, let alone a vote (if memory serves, the wording was drafted by one editor and survived because objecting felt pointless; the tradition of attaching to a rule a summary of why and how it was adopted was not kept, and even the later removal of a single word from one clause by another editor is remembered as its own mini-affair, done following a complaint from someone who appealed to "consensus").

In practice, then, a fair share of Hebrew Wikipedia's "institutions" rest on accumulated norms and precedents rather than on a fixed founding text. In my view the community's evolving practice works tolerably well even if it looks untidy, and it is hard to argue things are so broken that a constitution should be imported wholesale, with everyone re-educated to march to a "single book". Would a written, indexed and typeset "law book" (a hard and painful project in itself) improve matters? Quite doubtful — in my opinion, no. What would help is a community able to authorize someone acting in good faith to proceed without endless relitigation; the wiki way is that someone bold does a thing, others improve it, and whoever says "I'd have done it differently; I'll hold back for now and raise it later" also contributes. There is much more to say, and it turns out there are people who enjoy writing "rules" and "guides" far more than I do — to my regret, Hebrew Wikipedia has drifted over the years toward ever more "rules" and "procedures" that do not always serve the intent of those who drafted them. If you read this far (or skipped to the end) and none of it resonated... thanks anyway, and farewell — קיפודנחש 05:20, 29 December 2025 (IST)

The influence of organized coordination/recruiting groups on content in Hebrew Wikipedia — Background: a short while after the affair of the coordinated votes broke, leaked messages reached me from the correspondence of two politically oriented WhatsApp groups. They indicate that for about a decade, organized recruiting of participants was used to tilt politically charged votes, discussions and deletion decisions on Hebrew Wikipedia, with guidance distributed to members on how to act and how to vote. The material reached me through an intermediary and I will not publish it in full. To my understanding, the bureaucrats were offered at least part of the screenshots, and there may be further material. I will address the questions that arise only to the extent they are reflected in the correspondence itself. I will post here several short quotes from the groups, with identifying details redacted, so that the community can form an impression of the character of the messages and judge for itself how far this falls from accepted norms, and weigh for itself the severity of the phenomenon. Most of the material cannot be published for privacy reasons; even so, what can be shown is, to my mind, enough. My consultations on this were with people I trust on-wiki.
Still, the sequence of events of the recent period strengthens the conclusion that the two groups' coordinated activity was no curiosity: the emergence of a felt "political storm" around Arab–Israeli topics, repeated attempts to influence wide-reach articles, and simmering disputes in which group recruiting was suspected — all of it fits the documented pattern, which long predates the recent flashpoints and was entrenched well before the affair came to public attention. This, I believe, obliges the community to ask two substantive questions: Is there an agreed yardstick for what kind of off-wiki coordination/recruiting violates Wikipedia's norms and what does not? And are the steps taken so far against organized-recruiting phenomena sufficient deterrence — and if not, what further steps should the community and the authorized functionaries take to cut off such phenomena and deter them in the future? I want to stress that, by the evidence, the two groups were active long before the current period, and their reach extended beyond their own membership.

Patterns of action between the two groups differed in their mix of votes, content coordination and recruiting: at times the group message was a one-way "broadcast channel" merely announcing a discussion; at times it was an explicit call to participate with instructions attached. In more than a few cases the phrasing amounted to: "support X and oppose Y's proposal". There was a case where, hours before a vote closed, a tally was posted to the group — "there is a 56% majority" — with an explanation that 3 more voters were needed by 15:20 or the result would slip below 55% (close but not exact paraphrase). And a case where, after a vote closed, 6 Wikipedians congratulated, on the group, the editor who had opened it for the outcome. Among other things, the messages include calls to "act on Hebrew Wikipedia" between April 2023 and September 2023 (see the expansion below).

The groups did not deal only in votes; they also coordinated discussions and content edits. One group practiced "steering" a politically aligned participant into a discussion: where a given member lacked standing or eligibility, the information could be funneled through the group so that someone else would arrive, seemingly independently. Part of the canvassing messages were framed as "mutual help", but what matters is that between the two groups there operated a clear mechanism of distributing internal recruiting messages. Some messages explicitly warned members against activity that could be detected from the outside, or against acting out of step with the group. Members were at times asked not to forward messages to Wikipedians outside the group — "do not talk about the group with anyone!" (emphasis in the original) — together with the reassurance that no one is doing anything wrong, but there is no need "to hand ammunition to whoever wants to attack us and distort our motives". A further message went even further; my aim here is to map at least the floor of the documented activity (see the expansion below). The two groups operated in parallel (see below): part of it was vote coordination, part was content coordination driven by a self-image of defending a narrative. Members of one group told new recruits that "most of the public does not determine what gets written", that this is "a body with outsized influence on Wikipedia", and that whoever joins the group and learns the editing and procedural ropes can funnel information within a week.
As I stated in the post, the member profile of the groups can be traced to roughly three sources. After the vote-coordination affair blew open in 2024, group A published a statement about the affair. The statement was drafted by a member of the group itself; I was (to the best of my judgment) not among its intended recipients. The statement claimed that Hebrew Wikipedia had been "stormed" following "a campaign of delegitimization and the injection of political bias into Arab–Israeli content by coordinated actors using discussions and votes", and that "more than 40 editors" were recruited in response, while describing the opponents as "Islamists and activists". It further claimed the group "was not founded to inject opinions but to guard knowledge", that it "insists on neutrality, reliable sourcing, polite and factual discourse… and checks that what is written matches the sources", and that its members are "independent, opinionated editors" who were never told "how to edit or how to vote". At the same time it declared that "the alarm bell has not stopped ringing" over ongoing harm to "unguarded knowledge", and asked members "to maintain discretion" about the group's existence. To my mind the statement's spin, its very framing, only sharpened the suspicion that the group operates as an arm of a political struggle.

Here is a summary of the dataset I assembled (built deliberately conservatively): counting discussions/votes for which there is documented canvassing to the groups. To emphasize: this is only the small, well-documented core, and the full picture is plausibly larger. Beyond reasonable doubt, in my view, whoever took part in a discussion following a targeted recruiting message in effect participated in a "canvassed" event — though, to be clear, most participants in such discussions are not themselves "recruited". My aim was to present the picture as it is reflected in the documented data, so the community can decide what to do about canvassing and its out-of-band effects on content. And for the skeptics: a single coincidence is a coincidence; a systematic pattern calls for checking and correction.

Definitions (expandable in the original): the list of discussions/votes per group — in total 23 events (14 for group A, 9 for group B). To avoid double counting where both groups canvassed the same page/topic, or where a discussion ran in several stages, each user was counted once per event (call this the "canvassing set"). A "participant" = a registered account that took part in a discussion or vote for which documented canvassing exists. As a base rate I took the set of accounts that took part in any of 23 comparable political discussions held during 2022 for which no organized canvassing is known (call this the "control set"). These are the aggregates from which all the following statistics are derived. Caveats apply throughout: there are additional "circumstantial networks" beyond group membership among some participants — ideological ties of mutual support and the like.
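The counting procedure defined above reduces to a small amount of code. The following is a minimal sketch in Python; the event lists and usernames are invented stand-ins, since the post's underlying data was collected by hand from vote and discussion pages.

```python
from collections import Counter

# 23 events with documented canvassing (14 from group A, 9 from group B).
# Each event is the set of accounts that took part; a user counts once per
# event even if the discussion ran in several stages.
canvassed_events: list[set[str]] = [
    {"UserA", "UserB"},
    {"UserB", "UserC", "UserD"},
    # ... 21 more events in the real dataset
]

# Control set: 23 comparable 2022 political discussions with no known canvassing.
control_events: list[set[str]] = [
    {"UserC", "UserE"},
    # ...
]

def participations(events: list[set[str]]) -> Counter:
    """Per-user count of events the user appeared in (one per event, not per edit)."""
    counts: Counter = Counter()
    for event in events:
        counts.update(event)
    return counts

canvass_counts = participations(canvassed_events)
# "Participant" = appeared in at least one of the 23 canvassed events.
participants = [user for user, n in canvass_counts.items() if n >= 1]
print(len(participants), canvass_counts.most_common(3))
```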
The headline claim is the obvious one — "some people act in canvassed discussions far more than expected" — and the harder question is what counts as a clean signal rather than noise. The raw data: a total of 23 discussion/vote events with documented canvassing, compared against the control set, where one "participation" = one appearance of an account in one of the 23 events.

Joining dates of accounts in the canvassing set (date of a participant's first Hebrew Wikipedia edit), 2022: an average of about 1.6 new registrations per month. The highest peaks before the wave were 8 registrations in December 2021 and 7 in March 2022. Against that ~1.6/month average, the joining cohorts look like this: Apr–May 2023: 27 registrations (13.5 per month); Aug–Sep 2023: 21 registrations (10.5 per month) — and in between and afterwards the counts return to normal: Jan–Mar 2023: 5 registrations (1.67/month); Jun–Jul 2023: 4 (2/month); Oct–Dec 2023: 9 (3/month). In the canvassing set there is a striking share of new Hebrew Wikipedia accounts that joined in exactly those two windows (Apr–May and Aug–Sep 2023), while the adjacent months look ordinary. In the control set the corresponding "spikes" (late 2022 / through 2023) are tiny (~0.01%).

Extreme over-participation — "criterion 2" = 2 users met the criterion; "criterion 12" = 12 users met it. An "outlier" here = a user whose overall editing volume should not have brought them into the canvassed events at all, yet who in practice appeared repeatedly (at least 3 times) and at a gap of 50+ times the expectation. In the control set there are only 2 such outliers; in the canvassing set there are 12. Of the 12: 9 are Apr–May joiners and 1 more is an Aug–Sep joiner (10/12 ≈ 83%). That is, 83% of the "outlier users" come from those joining waves. But before one may speak of "recruiting periods", alternative explanations must be excluded. One check asks, "who registered on Hebrew Wikipedia from April 2023 onward?", which is not suspicious in itself: of the 279 users who participated at least once in the 23 documented canvassed discussions/votes, when did each first become active? (Reference point: the first edit.) If nothing unusual happened, we would expect the joining dates of those 279 users to be spread more or less evenly over the years — a small trickle of new users every month, with no sharp jumps. That is not what the joining curve shows: there are two short windows with large, sharp blocks of new registrations. In total, the number of participants in these discussions who first registered in April–May 2023 or August–September 2023 is 48. That is far off the trend: based on the multi-year average (~1.6/month over 2022–2024, excluding the anomalous months), the expected figure for four such months is about 7, or 10 in an "optimistic scenario". A gap of that size is well beyond statistical wobble, and in my view it almost certainly reflects joining waves driven by the groups' recruiting periods (and invitations to further groups). To exclude an innocent explanation ("perhaps it was simply a period of broad mobilization"), I also ran the totals against the 2022 comparison set: within the matching windows one can count there 6 deletion votes, 2 disputes, and so on — in total 765 "participations" versus 963 in the original sample (recall that this is the period before the judicial-reform storm and before the recruiting groups gathered strength, so somewhat lower participation in political discussions is expected there).
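A sketch of the joining-cohort comparison described above: per-window registration rates are compared against the quoted long-run baseline. The window totals below are the figures quoted in the post, re-entered by hand, and the flagging cutoff is illustrative (the post's highest pre-wave month, 8 registrations, is about 5x the baseline).

```python
baseline = 1.6  # long-run average of new registrations per month, as quoted

# (window, total registrations, number of months) — figures quoted in the post
windows = [
    ("Jan-Mar 2023", 5, 3),
    ("Apr-May 2023", 27, 2),
    ("Jun-Jul 2023", 4, 2),
    ("Aug-Sep 2023", 21, 2),
    ("Oct-Dec 2023", 9, 3),
]

for name, total, months in windows:
    per_month = total / months
    factor = per_month / baseline
    # Illustrative cutoff: anything above the largest pre-wave spike (~5x).
    flag = "  <-- joining wave?" if factor >= 5 else ""
    print(f"{name}: {per_month:.2f}/month ({factor:.1f}x baseline){flag}")
```

Run as written, only the Apr–May (8.4x) and Aug–Sep (6.6x) windows are flagged, matching the post's reading of the cohort data.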
In the 2022 control set no comparable jump in first-activity dates appears in any window matching Apr–May 2023 or Aug–Sep 2023. The average there is ~1.6, and the highest spikes were December 2021 and March 2022 with 8 and 7 (a factor of ~3.5) — high, but nothing like the wave. In this check I did not try to classify users by worldview or affiliation, only to compare distributions; note also that a first recorded edit sometimes lags the true joining date, which if anything works against finding a spike, not for it — and the user may have taken part in more events than the documented canvassed total shows.

For the formal check I added the measure Ratio = Actual/Expected ("how many times more a person participated than the expectation"). I chose deliberately conservative thresholds for both of the screen's two conditions: Actual ≥ 3 (at least 3 actual appearances) and Ratio ≥ 50 (participation 50+ times above what overall activity would predict). Any threshold choice is somewhat arbitrary and shifts the resulting picture. Results on the canvassing set: 12 extreme-outlier users. Of the 12, 10 are accounts from the joining waves — 9 first active in Apr–May 2023 and 1 first active in Aug–Sep 2023; only 2 are older accounts (2019–2020). In other words, most of the extreme outliers are not veteran accounts that drifted toward these discussions over the years, but wave joiners who are already outliers within the canvass totals. Since new users naturally accumulate fewer edits, I checked whether the "outlier" label among the wave joiners stems merely from being new rather than from being recruited, by restricting to accounts that joined during 2023 only: among the wave joiners (Apr–May + Aug–Sep) there are 10 outliers out of 48 (21%), while for the rest of 2023 (Jan–Mar + Jun–Jul + Oct onward) there are 0 outliers out of 27 (0%). For comparison, the corresponding gap rate in the control framing is ~0.8%. Moreover, when the same screen is run on the control set, the number of extreme outliers is tiny — just 2 (for comparison, the matching gap between the canvassing set and the control set is ~1.6%). This suggests that in a "clean" sample (no canvassing documentation) one can expect at most 1–2 outliers, while the canvassing set has several times that, concentrated — of all places — in specific joining cohorts. The clustering of registrations in the very same windows (the figure of 48) further strengthens the political-recruiting reading over innocent alternatives. My own conclusion is that the combination of joining-wave timing with extreme over-participation precisely in the canvassed discussions is hard to explain innocently — though, to state it plainly, I may have erred and I am not handing down a verdict. For 4 users I did not classify at all; I did not feel the evidence sufficed to determine anything. Of the classified outliers: 15 (~34%) were associated with the left-leaning camp and 29 (~66%) with the right-leaning camp. The left-leaning group shows less discussion participation than the right-leaning one: only 4 (25%) participated in 3 or more discussions, versus 17 (~60%) in the right-leaning group (unsurprising given the mix of discussions on which canvassing was documented in each group). Active/inactive/blocked breakdowns can be added on request. Note: the check above covers only those who registered in Apr–May and Aug–Sep 2023. One can further cross activity dates, a user's edit counts in the relevant windows, and the count of participations in canvassed discussions. The files and the analysis itself are available to the trusted functionaries (if an error is found, I will correct it): I went over the data user by user for the whole "outlier" group I identified, and where needed I rebuilt the activity profiles by hand, as reflected in the documentation.
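The screen described above (Actual ≥ 3 and Ratio ≥ 50) can be phrased directly in code. The post does not spell out exactly how "Expected" is derived from a user's overall activity, so the expectation model below — a user's share of total community activity times the 23 events — is an assumption for illustration only.

```python
def expected_participations(activity_share: float, n_events: int = 23) -> float:
    # Assumed model: if participation simply tracked a user's overall share of
    # community activity, this is how many of the 23 canvassed events they
    # would be expected to appear in. One plausible reading of the post.
    return activity_share * n_events

def is_outlier(actual: int, expected: float,
               min_actual: int = 3, min_ratio: float = 50.0) -> bool:
    """The post's conservative screen: at least 3 actual appearances AND
    participation at 50+ times the expectation."""
    if actual < min_actual:
        return False                 # too few appearances to judge anything
    if expected <= 0:
        return True                  # effectively-zero expectation: repeat showings are anomalous
    return actual / expected >= min_ratio

# Example: a user holding 0.1% of overall activity who appears in 12 of the
# 23 events — expected ~0.023 participations, actual 12, ratio ~522.
share = 0.001
print(is_outlier(actual=12, expected=expected_participations(share)))  # True
```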
One more methodological note: any single anomaly can be explained away; several independent anomalies pointing the same way carry real weight. Judge the data for yourselves. There is also a small group of veteran editors who sit in the top ~2% by participation count (14+). Outliership here is not, by itself, proof of recruitment — without a deeper check against the 2022 sample one can only label veteran heavy participants "highly active", not group organizers. Some editors follow political discussions and vote out of genuine, consistent interest; sitting at the top of the 2022 control set is not suspicious in itself — on the contrary, it indicates exactly that kind of consistency. The community should keep that distinction in mind.

As for named individuals: Hila Livne and La Nave Partirà were blocked after the canvassing was exposed, and reports were passed even to the previous bureaucrats (one of the two had been part of the right-leaning WhatsApp group and among the canvassing's organizers). Two other people assisted ויקישפר and additional editors in circulating content that dealt in (false) claims and under-the-radar attacks on Hebrew Wikipedia; attempts were made to restore standing for ויקישפר after he was blocked, and after a further attempt (for political rather than personal circumstances, at least as argued), voices openly doubted the justice of the blocks of those editors, "organized" as part of the wider affair. Sofiblum and others reportedly assisted ויקישפר as well in circulating such content, and the matter was referred in complaints to the U4C. I stress that I make no claim against any of these individuals beyond what the documents show; I am trying to map the picture of cooperation around the groups. Likewise, merely appearing on the list of participants in canvassed discussions and votes is not, in and of itself, an accusation of illegitimate activity. At the same time, for those editors who are documented as recruiters and/or organizers of the groups and/or repeat actors in the coordination patterns — those who acted to distribute the recruiting calls, not those who merely appear on a participant list — the evidence does warrant examination; and even then, appearing prominently is grounds for scrutiny, not by itself a finding of misconduct.

It matters to hear the community's view of the documented conduct, and not to leave this as data that merely troubled me. Do you share my sense that there is a real risk here of an ongoing, chilling pattern? That repeated deployment of a recruiting/coordination apparatus, off-wiki and out of sight, into votes and discussions crosses a red line and gravely harms Wikipedia? If so, what steps — if any — do the community and institutions owe here? Is the handling by the functionaries and authorities so far adequate in your eyes? I would also welcome the functionaries' and operators' view (pinging the operators @Eldad, Funcs, Geagea, HiyoriX, Idoc07, Ili Kaufmann, Lostam, Neriah, Politheory1983, PurpleBuffalo, and the other operators and regulars; and the checkusers @TheStriker, PurpleBuffalo, and Barak a). This is a digest of the main findings; whoever wants the fuller scope of material published on the affair is welcome to contact me.
In continuation, a question to the duty bureaucrats משתמשת:Funcs and משתמש:מרה מאירה: do you intend to examine the materials and take further action in the framework of this affair? If not, should the community be the one to decide what is done with the findings? And if so, what form should that decision take? — אהרן • talk 19:19, 4 January 2026 (IST)

Apologies for the late reply. To the point: I will not accept the approach of "we cannot police what happens off Wikipedia, therefore we will not deal with the affair at all". Hard cases make bad law, goes the saying, and precisely here the boundary needs to be clear, because of deterrence and because of the damage to trust. "Recruited canvassing" is disqualified not because of where it happens but for a simple reason: it corrupts encyclopedic decision-making — "we cannot win the argument with the better arguments, so let us bring more people" — instead of deciding matters on the merits and the standing of editors and arguments. You are right that attaching a live vote tally and a deadline to a recruiting call is a step beyond; but that is no reason to settle for a merely local fix. The 900 or so edits in question (when duplicate edits are counted generously) are not "hate edits" that must all be reverted; what deters is not another purge but a clear, binding norm: an organized political recruiting and canvassing operation may not attempt to steer outcomes here. And as stated plainly, our own voting procedure already says it: ויקיפדיה:הצבעה#נוהל ההצבעה fixes explicitly that "recruited canvassing — editors brought to a discussion by an interested party in a manner not visible to the discussion's other participants — is improper and prohibited" (close paraphrase). That is the binding anchor, and it did not prevent what the documents show (one need only read the affair's pages to see how little it deterred). So I do not accept the answer that says "some of the claims about the groups are not actionable". On the contrary: a standing apparatus of hundreds of activists cannot be treated as a marginal nuisance, and norms that only bind the naive while the organized route around them are worth little. The claim that "most veteran editors and operators know how to spot canvassed votes or Astroturf" misses the essence: canvassing is not about "who showed up" but about who was summoned, in what patterns, through what channel, at what scale — and in the documented cases participants explicitly wrote things like "new account", "cannot vote yet", with "guides" attached and with "spin" squads in tow. Not everything the groups coordinated even looks like a vote from the outside; that is exactly why it is hard to detect, and exactly why most veteran editors and operators cannot be expected to catch it unaided.

And as for "the community will learn and toughen up": I see no toughening; I see a visible erosion of veteran editors' willingness to go near anything the coordinated groups touched. Skipping over political articles and votes, one can watch the slow damage to what Wikipedia is supposed to be — a trustworthy encyclopedia rather than another political arena — only this time the erosion is organized. Whoever keeps relating to this as "not the end of the world" is, in effect, protecting at least one side's ongoing political advantage, and that is precisely what fuels the deterioration. The worry that the blocked group will claim that the rival group received a gentler standard, and vice versa, only underlines the need for one equal standard: this is the deep question of trust, and the missing calming move is the ability to say credibly that "there is a standard that holds for this camp and a standard that holds for the other, and they are the same standard". Otherwise the mutual suspicion simply compounds. When I warned about the dynamics back in 2022 I was told it was overblown; I would not like us to be having this exact conversation again in two years.
It would look like a cultural achievement, in my eyes, if the outcome of this affair were that we stop expecting people to simply restrain themselves on their own. On "only 7–8 active recruited editors": that number is enough to "tip" close votes and discussions when it arrives on cue. A majority in votes/discussions often needs no more than 5–6 ballots to flip the outcome one way or the other. Beyond that, the "7–8" figure sits inside a broader claim, and the available numbers squeeze the documented slice only. You wrote explicitly a sentence worth repeating: what was exposed is the floor, not the ceiling. I agree — and more users with a similar activity profile (not necessarily recruited) exist for whom statistical evidence alone simply does not suffice without the canvassing paper trail that we happen to have here, and without it the claim cannot be pinned down. Your sentence "checkusers and operators have a standing toolkit for their activity" surprised me somewhat; accept the reply above in that light. Active recruited editors, even a handful, shift outcomes — the votes come with guidance and timing from one side. Worse than that, they crowd out the essence: what was lost in these affairs is not just "fair discussion conditions" but deterrence, consistency, and the community's basic confidence that authorized functionaries will handle documented coordination. Recall that the bureaucrats themselves were petitioned at שיחת משתמשת:גל-צור#תוצאות הערעור, where a user blocked following membership in one of the coordination groups argued that her actions were judged anew and harshly while others' were not political, that she has substantive contributions and wide-ranging edits, that her block was meant to produce a precedent and a release of others' blocks was a release of the affair's organizers. I agree with her 100% on the principle of one standard. Here, by way of example, we have our own precedent: ויקיפדיה:התנהלות על הרוחב, under which whoever once canvassed an off-topic national vote about France was not allowed to take the lead on related disputes — a deterrent norm whose analogue was waived here. How are the recruited waves not covered by that? It seems to me the community owes itself an answer that does not amount to pointing "over there" at the other camp (i.e., whataboutism in both directions). How is it fair to justify what we would condemn if mirrored? If we are serious about deterrence, the questions write themselves: an appeal process with teeth; a published account of how the affair was handled and by whom; a refusal to produce a "convenient bureaucrat" on schedule; treating the standing of Hebrew Wikipedia publicly and honestly; and an expectation of basic transparency from functionaries who acted amid documented pressure. Why does whoever blocked/sanctioned part of the canvassing circle stop exactly where it gets uncomfortable — "so far and no further"? Why were we not told? Does the blocked group alone carry the title "organized"? Does only the left-leaning group get the privilege of a harsh standard, while whoever left the right-leaning group in time is asked merely for vague regret? If there is a harsh standard (and deterrence requires "mending the tears"), let it be harsh; but let it be one standard, equally enforced — that is the whole of my claim. Bureaucrats משתמשת:Funcs and משתמש:מרה מאירה, I ask for a clear answer: how does one square "an exacting standard" with the fact that a number of significant editors from the second group, whose names did not surface, remain today active in discussions and political votes? Is there one standard that was applied to the members of both the canvassing/coordination groups? If what is described above is accurate, is that not differential treatment of personal conduct that was substantively similar? And if not — what do you intend to do about canvassing by members of the coordination groups that I showed above has not ended? — אהרן • talk 21:23, 16 January 2026 (IST)

The groups, by the way, also circulated targeted calls for every channel of public-discourse influence — joining parties, signing petitions and surveys, posting to Facebook and other social networks, and also, yes, joining Hebrew Wikipedia as editors.
In that period a number of press items were published about political influence on Hebrew Wikipedia — most notably around the court affair — and it is reasonable that this coverage, too, circulated in the groups. One can therefore suppose that some of the new users who came to act did so after meeting the calls in the media or the public controversy, some came on a friend's recommendation ("it is important to join and to vote"), and some may have been recruited deliberately by the influence groups. Not everything should be pinned on the groups: it is quite possible that some who joined in that period were moved by the public storms and joined with no connection to any group, and it is likely that some users who joined then and remained active are perfectly fine contributors. What cannot stand is the opposite default — that the users who joined in that window and act along a sharp political line arrived organically. The matter is uncomfortable, tangled, and tempting to wave off, but the pattern, volume and timing say otherwise; either way, I propose we stop treating the question as taboo.

External links behind paywalls — external links are meant to give readers depth and extension beyond what belongs in the encyclopedia itself. One practice I would gladly see fenced in is bulk link-dropping: there are users who add external links wholesale, and to my eyes most of those read as drive-by additions that do harm. Perhaps one driver worth naming is the habit of newcomers to treat the section as free "further reading". This habit is wrong for several reasons. I think an external link should serve mainly where it functions as a genuine source or meaningful extension for the article; in the edit histories, most bulk-added links are nothing of the kind. I myself spend a fair amount of time pruning external links: mostly dead ones, ones that drifted off the subject, or ones stale after many years. In fact external links to other sites, by their very abundance, dilute the few that matter and blur the path a reader should follow. Pinging the regulars of this maintenance, Mr.Shoval, Shoshie8 Galzigler: none of this is meant as a decree — you are welcome to add your view. HofEz96 • talk 20:42, 8 January 2026 (IST)
Well — a blanket ban on external links to the sites "הארץ", "דה מרקר" and "ידיעות אחרונות" would bar some of the country's largest outlets. The guideline is to prefer open links, but open links do not always exist. At times Haaretz and TheMarker carry unique reporting and investigations, found nowhere else, that strengthen an article. Would you tell readers that citing books is forbidden because not everyone owns the book? * אני Hanay • talk • [signature tagline] 09:45, 9 January 2026 (IST)

The Twelfth Writing Competition has results! Dear friends, the results of the Twelfth Writing Competition have just been published. Congratulations to the winners, thanks to everyone who took part, and we hope you enjoyed the challenge and the effort! In the competition, which began on 10 September 2025 and ran for about three months, dozens of new and expanded articles were submitted, many of real quality. The participants, veterans and newcomers alike, invested days and nights in writing new articles and improving existing ones, grounded in reliable sources and written accessibly for readers. This year too the competition yielded a rich crop across the encyclopedia's subjects. Not everyone could win, but whoever took part shares in the competition's success! We salute the investment, are confident that the effort is a value in itself, and hope the results encourage continued writing and contribution.
With thanks, the judges and the competition committee — ‏Funcs‏ • talk 00:54, 9 January 2026 (IST)
More than 30 articles reached the competition; most were at a high level, and the field was close. The judges noted the difficulty of ranking such strong entries. The judging panel — two professors, accompanied on the organizational side by משתמשת:Funcs — thanks all the participants, congratulates the winners and those who received honorable mentions, and salutes everyone who submitted! Crocodile2020, Barak a, IdanST, HanochP, Avishay, TheStriker, מרה מאירה and the rest — with thanks, the competition committee, ‏Funcs‏ • talk 01:21, 9 January 2026 (IST)

Machine-translated articles — a question of principle worth a community discussion: the growing volume of articles produced into Hebrew by machine translation. To avoid misunderstanding: I believe machine translation can help, but careless use yields output that often simply reads badly — opaque sentences that are hard to parse, terms that did not translate, phrasing that is unnatural in Hebrew and sometimes outright errors that only a human reader would catch. Perhaps the guidelines should recommend against posting raw machine output, or at least require thorough post-editing; and perhaps machine-translated articles should be tagged so readers know what they are getting. Beyond that, articles translated wholesale and in haste tend to import structure and emphasis that fit the source project, suffer from unvetted sourcing, and stay that way; I would rather see fewer articles in good condition than a flood in poor condition. Does the community have the appetite and the tools to deal with this phenomenon? Assafn • talk 21:56, 9 January 2026 (IST)
Crocodile2020, the principle is sound, though its roots lie in the older debates over short articles. The articles in question are not only a quality matter; they also shape regional and topical coverage on Wikipedia. It would be right to give priority to upgrading important existing articles — to lift them out of dryness and neglect — over multiplying thin new ones. I do not think one must choose absolutely; the real line runs between substanceless stubs that merely occupy a title, and focused short articles that serve a reader.
From time to time we run into bulk creation of articles, mostly short ones — do we want Wikipedia to become a stub mill? It is hard, I know, to herd groups of volunteer editors under one tent, but I think a few practical steps would lift quality. When improving: 1. Every so often I open a page of old short articles and try to improve those that can be improved, or merge a too-short article into a broader one that gives it context. 2. Start from the lead: even a short article deserves a solid, well-phrased basic definition. 3. Add categories, interlanguage links, images and the like. And when writing a new article? 1. Actually write it. 2. Better one continuous, developed text than scattered fragments; a long article can be hard to read in one sitting, but a reader can simply pause. 3. Better careful than fast — readers stumbling over defects is the worse outcome. Not for nothing is an article called "ערך" (a value): create articles with value; a separate page is a tool, not a goal. Regards, Avishay • talk 22:22, 11 January 2026 (IST)
We want to grow the encyclopedia and keep readers' trust in the sources. Since the tooling now makes it easy to act at scale, I propose acting where the effect is visible. For my part, a short article that is clearly written and readable beats a long one that reads like machine output. The trouble is that our incentives encourage producing articles via bulk text generation on the expectation that most will eventually be adopted and expanded; in practice only a fraction is adopted, and the rest remain as unimproved translations. As a result, middling articles stay middling for years and erode the reading experience and Wikipedia's perceived reliability. I therefore propose pushing toward more project-based, task-list-driven work (including maintained defect lists) with explicit goals, so that writing a targeted section or fixing a lead improves an article tangibly, rather than scattering effort across crowds of new stubs. The pace of expansion should be measured so that most readers of Hebrew Wikipedia meet at least adequate pages wherever they land. Assafn • talk 13:18, 12 January 2026 (IST)

A renewed proposal: guidance for articles on Bible chapters — I would like to open a discussion of principle about writing articles on individual chapters of the תנ"ך. In August 2023 a deletion vote was held on three specific chapter articles, which were deleted by a large majority. In its wake a perception took root that chapter-by-chapter articles are undesirable as such, and that articles may only cover "stories" or "topics". I believe that affair was stretched into a sweeping rule that harms article coverage, and I propose replacing the categorical instinct with a case-by-case standard. Where chapters have been written about extensively in scholarship and reception, my proposal is: do not disqualify an article merely because its title reads "Book X, chapter Y". Judge the article on its content: if it offers real encyclopedic coverage — scholarship, interpretation, historical context and criticism — rather than padding around a chapter heading, is the community prepared to accept that quality chapter articles are possible? If the answer is yes, threshold requirements can then be defined so that such articles are genuinely encyclopedic. ההודעה❤ • talk 14:45, 14 January 2026 (IST)
By the way, using the chapter as the unit when the chapter itself carries no self-standing content leads nowhere — a chapter with a different name would carry the same material, and a reader will struggle to find an article filed under the name of a unit that is not how the subject is discussed in scholarship or tradition. A chapter article risks becoming a "container": the title is the chapter, while the body must (ideally) survey the scholarship, place the unit in its literary setting, and answer the questions readers actually bring. ההודעה❤ • talk 10:44, 15 January 2026 (IST)
ההודעה, just to sharpen: do you yourself think that for, say, ספר שופטים, chapter ו' is something a person could sit down and write a full article about — one that justifies standing apart from ספר שופטים itself and from the story the chapter belongs to? And under your proposal, would such articles be titled by chapter number, by their stories, or by what? To my mind this invites a structural problem: chapters mostly do not carry their content independently (the chapter division is not even Jewish in origin and often cuts across the units of meaning), so we would get both (a) loss of continuity and (b) loss of coherence; editors unfamiliar with biblical scholarship will no doubt pour energy into chapter articles that would serve readers better as topic articles, adding a maintenance headache (and where an article sits under an odd title, readers will not find it; an article's placement matters at least as much as its existence, as experience shows). And after all that, if someone is convinced there is value in such articles and sits and writes them well, the community can judge the result; meanwhile there is no shortage of existing coverage to improve — which would improve the encyclopedia and, for that matter, the writer. The clearer the criteria, the clearer everything else. Or do you prefer leaving it to whim? איש צדק • talk • ו' בשבט ה'תשפ"ו • 20:29, 20 January 2026 (IST)

AI writing — I ran into a stub article that looks, more than once now, machine-generated. Is there a maintenance template — an "attention" flag — for suspected AI text, by which the tagger calls on others to verify the content against the sources? And when such an article is found, does it go to deletion or to a "needs checking" track? Perhaps I am asking in the wrong venue; if the topic is discussed elsewhere, point me there. There is at least one contributor with dozens of such articles already, and it keeps growing. Shannen • talk 16:13, 14 January 2026 (IST)
Distinguishing translation criticism from source criticism — I have not read the whole discussion, so forgive me if I missed something. I just think a distinction matters here, and we should ask what the actual defect is. If the text is written unclearly or reads badly, it should be improved stylistically, to help readers. As for sourcing, we have procedures for what must be verified; if someone wrote content that is false, that is handled on its own terms (whether or not the answer came from a machine). None of that is specific to the use of AI. —Aizenr (talk | contribs | info) unsigned — 00:00, 10 January 2000 (IST)
A fresh observation — there is a series of articles by one contributor in which the AI use looks plain to me. There is also an article about Wikipedia itself in the batch. The core problem is the absence of verifiability: it is impossible to trust claims whose provenance no one checked, and when I raised it with him directly he insisted on his own correctness. Experienced editors, come have a look and weigh in. —Barhai (talk | contribs | info) unsigned — 00:00, 10 January 2000 (IST)

Articles about perpetrators and victims of crimes — when should a standalone biography exist? I propose that the community weigh and formally adopt the following guideline: when people become known mainly for the part they played in one criminal event or another — whether as perpetrators or as victims — and are otherwise not of public interest, prefer creating an article about the event rather than about the person. Instead of an article on the person, redirects can point to that content. I also propose anchoring the guideline, after its wording is polished and agreed, so that it binds future cases rather than being re-argued each time.
ืืคื ืฉืชืืืจ ืจืืื ืืจืืื ืืืืจืื ืื ืืคื ืืืช ืื ืจืฆื ืชืืืจ ืจืืื, ืืืคื ืฉืฉืืจื ืื ืงื ืืืฉื ืฉืืืกื ืื ืืคื ืืืช ืื ืืจืฆื ืืืฆืขื ืืืืืื ืืืจืืฉืืื, ืืขืื ืืขืื. ืืื ืืื ื'ืง โข ืฉืืื 13:03, 16 ืืื ืืืจ 2026 (IST)ืชืืืื Feminism and Folklore 2026 starts soon Dear Wiki Community, We are pleased to invite Wikimedia communities, affiliates, and independent contributors to organize the Feminism and Folklore 2026 writing competition on your local Wikipedia. The international campaign will run from 1 February to 31 March 2026 and aims to improve coverage of feminism, womenโs histories, gender-related topics, and folk culture across Wikipedia projects. Feminism and Folklore is a global writing initiative that complements the Wiki Loves Folklore photography competition. While Wiki Loves Folklore focuses on visual documentation, this writing campaign addresses the gender gap on Wikipedia by improving encyclopedic content related to folk culture and marginalized voices. Communities can contribute by creating, expanding, or translating articles related to: Participants may work from curated article lists or generate new article suggestions using campaign tools. Organizers are requested to complete the following steps to register their community: The Wiki Loves Folklore Tech Team has introduced tools to support organizers and participants: Both tools are now available for use in the campaign. Click here to access the tools For detailed information about rules, timelines, and prizes, please visit the Feminism and Folklore 2026 project page. If you have any questions or need assistance, feel free to reach out via: We look forward to your collaboration and coordination in making Feminism and Folklore 2026 a meaningful and impactful campaign for closing gender gaps and enriching folk culture content on Wikipedia. Thank you and best wishes, Feminism and Folklore 2026 International Team Stay connected: โMediaWiki message delivery (ืฉืืื | ืชืจืืืืช | ืืื ื) ืื ืืชื โ 00:00, 10 ืืื ืืืจ 2000 (IST)ืชืืืื Invitation to Host Wiki Loves Folklore 2026 in Your Country Hello everyone, We are delighted to invite Wikimedia affiliates, user groups, and community organizations worldwide to participate in Wiki Loves Folklore 2026, an international initiative dedicated to documenting and celebrating folk culture across the globe. Wiki Loves Folklore is an annual international photography competition hosted on Wikimedia Commons. The campaign runs from 1 February to 31 March 2026 and encourages photographers, cultural enthusiasts, and community members to contribute photographs that highlight: Through this campaign, we aim to preserve and promote diverse folk cultures and make them freely accessible to the world. Project page on Wikimedia Commons As we celebrate the eighth edition of Wiki Loves Folklore, we warmly invite communities to organize a local edition in their country or region. Hosting a local campaign is a great opportunity to: Sign up to organize: If your team prefers to organize the competition in either February or March only, please feel free to let us know. If you are unable to organize, we encourage you to share this opportunity with other interested groups or organizations in your region. 
If you have any questions, need support, or would like to explore collaboration opportunities, please feel free to contact us via: We are also happy to connect via an online meeting if your team would like to discuss planning or coordination in more detail. Warm regards, The Wiki Loves Folklore International Team MediaWiki message delivery โข ืฉืืื 15:21, 18 ืืื ืืืจ 2026 (IST)ืชืืืื Annual review of the Universal Code of Conduct and Enforcement Guidelines I am writing to you to let you know the annual review period for the Universal Code of Conduct and Enforcement Guidelines is open now. You can make suggestions for changes through 9 February 2026. This is the first step of several to be taken for the annual review. Read more information and find a conversation to join on the UCoC page on Meta. The Universal Code of Conduct Coordinating Committee (U4C) is a global group dedicated to providing an equitable and consistent implementation of the UCoC. This annual review was planned and implemented by the U4C. For more information and the responsibilities of the U4C, you may review the U4C Charter. Please share this information with other members in your community wherever else might be appropriate. -- In cooperation with the U4C, Keegan (WMF) (talk) 23:01, 19 ืืื ืืืจ 2026 (IST) ืืื ืืื ืืฉืืืจืื ืขืืืื ืขื [ืืจืืฉ ืืงืืจ]? ืืขืจื ืคืฉืืขื ืืืืจืืคื ืืืคืืข ืืืฉืคื ืืื ืืจืืฉ ืืขืจื: "ืืขืฉืืจืื ืืืืจืื ืื, ืืืืจ ืืื ืืืืจื ืืืืื ืืช ืืืกืืื ืืืืจืืคื ืืชืืืจื ืคืขืืืืช ืืืจืืจ ืืืืจืืคื ืืืคืจื ืืืจืืจ ืืืกืืืื ืืืืจืืคื." ืืืกืคืชื ืืืฉืคื ืืื ืชืื ืืช [ืืจืืฉ ืืงืืจ]. ืขื ืืขืจื ืืชืงืืื ืขืืฉืื ืืืื ืืฉืืืืช. ืื ืืขืจื ืืฉืจืื ืืช ืืืื ืืืฉืืืืช/ืืืืงื, ืืื ืืื ืืงืืื ืฉืืขื ื ืืื ืชืืฉืืจ ืืื ืืงืืจ? ืืื ืืงืืื ืื ืืฉ ืืืื ืืืช ืฉืืืจื ืืื ืืกืืืื ืฉืื ืฆืืืื ืืงืืจ, ืืคืฉืจ ืืืืืง ืืช ืืืฉืคื? --HananCohen โข ืฉืืื 18:32, 21 ืืื ืืืจ 2026 (IST)ืชืืืื ืื ืืืฉื ืฉืฆืจืืื ืืืืืช ืืืืืช ืืื. ืืฉืืื ืืฉืืื ืฉืืืืง ืืงืืืืืช ืขื ืืขืืืื ("ืื ื ืืืฉื ืฉืื ืื ื ืืื ืื ืืคืืืช ืืฉืื ืืืื") ืืืฉืื ืืืืืจ ืื ืืฉื ืจืืืฉ, ืืงืืจ ืืื ืืืจ ืืฉืื ืืืขืฉืืจ, ืื ืื ืืืจืื. ืื ืื ื ืฆืจืืืื ืืฉืืืฃ ืืงื ืฉืื ืคืืกืช ืืืืข ืื ืืจืืืืืืืช ืชืืื ืืืืื ืืืงืืจืืช, ืืื ืื ืื ื ืื ืฆืจืืืื ืืืืืข ืืื ืืื. ืชืื ืืช "ืืจืืฉ ืืงืืจ" ืืืืืจื ืืช ืืงืืจื ืืืฉืืืช ืชืฉืืืช ืื ืื ืงืืื ืฉืืขืื ื ืฉืืคืืจ ืืื ืื ืฉืืคืืช ืืช ืืชืื ืืง ืขื ืืืืื. ืืืืืช ืืืช, ืื ืืฉ ืืฉืืื ืฉืืืืง ืืงืืืืืช ืขื ืืขืืืื, ืืื ืืืฉืืืจื ืืื ืืงืืจ. ืื ืืืืืจ ืื ืืฉื ืจืืืฉ, ืื ืื ืืื ืืฉืืื ืฉืืืืง ืืงืืืืืช ืขื ืืขืืืื, ืืื ืืืฉืืืจ ืืืชืจ ืืืกืคืจ ืืืื. ืจืื (Aizenr) โข ืฉืืื 07:30, 30 ืืื ืืืจ 2026 (IST)ืชืืืื ืืงืกืื ืืืื ืกืืืจื ืืฆืืคืืจืื ืขืืืจ ืืืงืืคืืื ืืขืืจืืช ืขื ืืืื ืืื ืืืืืข ืื ืื ืืืืชื ืืืืื ืืกืืืจืช ืืคื ืื ืชืืงืืข ืืืงืกืื ืืืื ืฉื ืขืจืื ืืขืืคืืช ืืืืงืืคืืื ืืขืืจืืช. ืืืืจืื ื ืงืืืฆื ืื ืืจืฉืืืืช ืืขืืืืืืช ืืงืืืืืช ืืจืฉืืื ืืืืื ืืืฉื ืืฉื AviList, ืืื ืืืงืืคืืื ืืื ืืืืช ืขืืืจืช ืื ืื ืืขืงืื ืืืจื ืืืงืกืื ืืืื ืฉื. ืื ื ืืฆืืข ืฉืื ืืื ื ืงืืข ืืืืคื ืืกืืืจ ืฉืขืจืื ืืฆืืคืืจืื ืืกืืืจื ืืืขืืืื ื ืืคื ืืืงืกืื ืืืื ืฉืืื. ืืชืืื ืืช ืืขืื ืืืืข ืืืขืื ืืืืืืื ื ืื ืืืืง, Santacruz13, Aziz Subach,ืื ืื.ืื, PurpleBuffaloโ, Tshuva, ื ืืฉ ืงืื, ืืืืจ ืื, Gidip, ืคืขืื ืืืขื ืืืืืืช ืืื, ืืืืืืืจ, Squaredevil, MathKnightโ, ืคืจืฆืืืื, assafn, ืืื, ืืจ ืืื ื 007, ืขืืจื ืืืืืื ื ืืคืจืืื ืชืืืื YedidyaPopper โข ืฉืืื 10:30, 22 ืืื ืืืจ 2026 (IST)ืชืืืื ืืฆืขื ืืืจืืืช ืคืจืง ืืฉืืืืช ืืขืจื: ืืฆืขื ืื ืืื ืืืง ืืคืขืืืืช ืืฆืืืช ืืฉืืคืืจ ืืืงืืื ืืืืืงืืคืืื. 
ืืืช ืืคืขืืืืืืช ืฉืืืืืจื ืืฆืืืช ืืื ืืืกืคืช ืืคื ืืืื ืืืช ืื ืืฉื ืืฉืืืืช, ืืืกืคืช ืืคื ืืืื ืืืช ืืืื ืชืืกืื ืืขืชืื ืืืง ืืืืื ื ืืืฉืืืืช, ืืืฆืืขืืช ืืืฉืืืืช, ืืืฉืืืื, ืืืืืงื ืืืขืืช ืืืืชืืื ืื ืืืืื ืืคืขืืืืืืช ืืื. ืชืืช ืืืงืืคืืื:ืขืงืจืื ืืช ืืงืืืื ืื ืืื ืืืฆืืจืช ืขืจืื ืืืฉืื ืงืืื ืืืฃ ืืืงืืคืืื: ืืืงืืคืืื:ืขืงืจืื ืืช ืืงืืืื ืื ืืื ืืืฆืืจืช ืขืจืื ืืืฉืื/ืฆืืืจืื ืืคืกืืื. ืืฃ ืื ืขืืกืง ืืฆืืืจืื, ืคืกืืื ืืืื ื ืงืจืืืงื. ืืงืจืื ืืืฃ: "ืขืงืจืื ืืช ืืงืืืื ืื ืืื ืืืฆืืจืช ืขืจืื ืืืฉืื ืืื ืื ืืชืืื ืืืื ืืช ืืืืืชืืช", ืืืฆืขื ืื "ืฆืืืจ ืื ืคืกื" ืืืืืฃ ื"ืืื ืืชืืื ืืืื ืืช ืืืืืชืืช". ืื ืฉืืจ ืืชื ืืื ื ืฉืืจืื ืืคื ืฉืื ืืืื. (ืืขืจื: ืืืื ืืช ืืืืืชืืช ืืืืื ืืืืงืืคืืื: ืฆืืืจ, ืจืืฉืื, ืืืืจ, ืืจืคืืื, ืฆืืืื ืกืืืืก, ืคืืกืื, ืชืืืื, ืืืคืก.) ืืงืื ืืช ืืืฆืขื 1, ืืืืืกืืฃ ื-4 ืืชื ืืื ืืงืืืืื ืื ืืช ืกืขืืฃ 5 ืืืืฆื ืืืื (ืชืืกืคืช ืื ืืื ืืืกืืช ืืช ืืืงืจืื ืืื ืืื ืื ืืืืื ืืืืฉืืื ืืื ืืืืืืื ืืชืขืจืืืืช ื/ืื ืชืืจืืืืช ืืื ืืืืืืืช.) ืืงืื ืืช ืืฆืขื 1 ื-2 ืืืืืกืืฃ ื-5 ืืชื ืืื ืื ืืช ืกืขืืฃ 6. ืืขืจื: ืขืืจืืื ืืืงืืืื ืืคืจืืื ืืืืืื ืืืฆืืข ืืืืกืืฃ ืืจืืื ืื ืืืืืืื ืืขืื-ืืฉืืืืช ืืืื. HanochP โข ืฉืืื 17:36, 24 ืืื ืืืจ 2026 (IST)ืชืืืื ืืื ื ืืชืืฆื ืืื ืืชืืื ืืืืื ืืช, ืืืื ืื ื ื ืืืืจืื ืืฉืืืช ืงืจืืืจืืื ื ืืืฉืืืืช ืืืืื ืื. ืืืืคื ืืืื ืื ื ื ืื ืืืืืจื ืฉืืืืื, ืืคืื ืืฉ ืขื ืืื ืืงืืืข ืงืจืืืจืืื ืื ืงืฉืืืื ืืืฉืืืืช ืืืืืื ื ืืฉืืื. ืืชืื ืฉืื ืืืืืื ืื ืืื ื ืืืงืจื ืืื, ืืื ืืืืคื ืืืื, ืืืขื ืืืชื ืืคืฉืจื ืืชืคืืก ืืช ืืืฉืืืืช ืืื ืฆืืงืืืคืืืช ืฉื ืืืจ ืืขืืจืช ืจืฉืืื ืงืฉืืื ืฉื ืงืจืืืจืืื ืื, ืืขืืืฃ ืืืจื ืืื ืืืฉืืืจ ืืงืืืื ืืช ืืจืืื ืฉืืงืื ืืืขืช. "ืืืื ื ืืืฉืืืืช, ืืฆืืขืืช ืืืฉืืืืช, ืืืฉืืืื, ืืืืืงื ืืืขืืช" ืฉื ืืืืื ืืืืืืืช ืืื - ืื ืืืง ืืจืื ืืชืืืื ืืืงืจื ืืขืฆืืืช ืฉื ืืื ืฆืืงืืืคืืื, ืืื ืืืจ ืฉืืืื ืฉืฆืจืื ืืฆืืฆื ืืืื ืืืื ืืืืจืื (ืื ืืืื ื"ืืชืืื ืื ืืืืื", ืฉืืื ืืื ืืฆืืฆื). ืืืืื.ืฆืืื โข ืฉืืื โข ื' ืืฉืื ื'ืชืฉืค"ื โข 19:30, 27 ืืื ืืืจ 2026 (IST)ืชืืืื ืืขื. ืจืื (Aizenr) โข ืฉืืื 07:24, 30 ืืื ืืืจ 2026 (IST)ืชืืืื ืืืืขื ืขื ืคืชืืืช ืืฆืืขื ืืคืจืืื ื ืืืืขืชื ืื ืืืื ืืืืืขืืช, ืืื ืื ืคื ืื ื ืืืชื ืืืืจ ืืฉืืืืช ืืจืืืฉืืช ืืขื ืืื: ื ืคืชืื ืืฆืืขื ืืคืจืืื ื ืขื ืืืืืช ืืฉืืืืชื ืืื ืฆืืงืืืคืืืช ืฉื ืืืืคื ืืืืืคืืช ืืฉืืขื ืืืืงืืืืจ. ืจืื ืืื. ืืืืืจ ืืืืืื ืืืืจืืช ืฉื @AviStav. ืืฆืืืขื ืืืฉืคืืขื! ื ืื โข ืฉืืื 16:28, 27 ืืื ืืืจ 2026 (IST)ืชืืืื ืืฉืืืืช ืงืืืื ืืืงืื ืืื ืกืช ืืงืจืืช ืืืืืจืืช ืืงืจืืืช ืืืืืช ืขืืื ื ืืืืื, ืขืจืืื ืจืืื ืขื ืคืืืืืืงืืื ืขืืืจืื ืืขืช ืืืืจืื ื ืขืืืื, ืืืืื ืืืืื ืืชืฉืืืช ืื ืืืืืจืช. ืืืืจืื ื ื ืชืงืืชื ืืขืจืืื ืฉืืกืื ืืช ืชืฉืืืช ืืื ืืฉืืื ืืืืืช ืืืชืจ: ืืชื ื ืืื ืืืชืื ืืขืจื ืขื ืืืจ ืืื ืกืช ืขื ืืืง ืฉืืืชื ืืืจ ืงืืื? ืืื ืืช ืืืฆื, ืื ื ืกืืืจ ืฉืืืขื ืืงืจืื ืืจืืืื (ืืื ืืจืคืืจืื ืืืฉืคืืืช), ืื ืืืืง ืื ืขืืจ ืืงืจืืื ืฉืืืฉืืช ืืื ืืืืืืจ ืืืชื ืืขืจื. ืืฉ ืื ืฉืชื ืฉืืืืช ืืงืืืื, ืืื ืื ืฉืืืืืจ ืืืืง ืฉื ืื ืก ืืืกืืฃ ืืกืคืจ ืืืืงืื: ืืืืื.ืฆืืื โข ืฉืืื โข ื' ืืฉืื ื'ืชืฉืค"ื โข 19:50, 27 ืืื ืืืจ 2026 (IST)ืชืืืื ืืืื ืฉืืจ ื"ื ืืจืืืขื ืืืจืฅ 2025 ืืื ืืขืืืื ืืืจื ืขืช ืืคืงื ืืืื ืฉืืจ ื"ื, ืืืื ืืืฉืืจ ืืืืื, ืืืื 24 ืืืื. ืืื ืขืจื ืคื ืชืืช ืืฉื ืืฉ:Liavshe, ืืืชื ืืช ืืขืจืืื ืืืืืจืคืื ืฉื ืืกืืื ืื, ืืืืืจืคืื ืฉื ืงืืืืืื ืืื ืื ืืงืืืืืจ, ื ืชืจื ืฆืืื ืื ืืืืื ืก ืื ืก ืื ืืื ืื ืกื. ืืืืื ืื ืืงืจืื ืขืื ืขืืื ืืืชืจ ืืืืืจ. 
ืืื ืืืจื ืืจืื ืก.ื'ืืืื - ืฉืืื - ืืฆืืจืคื ืืืืื ืกืื 21:39, 28 ืืื ืืืจ 2026 (IST)ืชืืืื ืืืื ืชืจืืืช ืืืืื ืืืืืงืืคืืื - ืฆืืืช ืืืืื ืืฉืืคืืจ ืืืงืืื ืืืืืงืืคืืื ืฉืืื ืืื/ื, ืืืืฉื ืืืืืขื ืขื ืืงืจืืื ืืฉืืคืืจ ืืืงืืื ืืืืืงืืคืืื, ืื ืื ื ืงืืืฆืช ืขืืจืืื ืฉืืชื ืืื ืืขืืืจ ืืงืืืื ืืฉืคืจ ืืช ืืืคื ืืืืื ืื ืืงืืืื ืขื ืื ืช ืฉืืืืื ื ืืืื ื ืื ืืืืื ืืืฉืชืชืฃ ืืื. ืืงืื ื ืฆืืืช ืืืงืืื ืืืื ืื ืฉืืืคืื ืืฉืืื ืืืฆื ื ืืื ืืกืืืข ืืืฉืคืจ ืืืช. ืขื ืกืืื ืืืืื ืื - ื ืคื ื ืืงืืืื ืขื ืืกืงื ืืช ืืืฆืขืืช ืืืคืจืืืืืืช. ืืื ืืขื ืืงืจืื ืืฉืื ืืืืืงืืคืืื ืืชืืื, ืคืืืข, ืื ืขืืื ืืคืกืื ืืืฉืืื. ืืื ืงืฉืจ ืืชืืื ืืืกืืืืืช ืฉืืืืื, ืืืื ืงืฉืจ ืืืืืช ืืืฉืชืชืคืื ืืืืื, ืืืจืช ืืฆืืืช ืืื ืื ืกื ืืฆืขืืช ืงืื ืงืจืืืืช ืฉืืขืืจื ืืฉืคืจ ืืช ืืฉืื. ืืื ืชืืื, ืืฉืื ืื ื ืืืืืืจ ืืืชืืื ืืืืคื ืืืื ืืืงืืคืืื:ืืืื ืืชื ืืืืช ืืื ืืืจื ืืงืืืื ืืงืืจืืื ืืื ืืื ืืื/ื ืืืขืืจ ืืืืชืขืจื ืืฉืืชื ืจืืืื ืืชืืืืืืืช ืคืืืขืืช ืื ืืขืืืืืช. ืืกืืฃ ืืืื ื ืื ื ืืื ืืืืืจื ืืืงืืืช ืืืื ืฉื ืขืืื ืื ืืืฆืจ ืืืข ืืืืงืืคืืื ืชืื ืฉืฆืืืจืื ืชืืืฉืืช ืจืขืืช. ืืืืื ื ืืฉืื ืฉืืืืงืืคืืื ืชืืื ืืงืื ืืื ืืืชืจ, ืืืื ื ืฉืคืจ ืืช ืืืืืืจื ืืกืืืืช ืืืฆืืจื ืฉืื ื ืืืื. ืขืืจืืื ื ืืกืคืื ืืืืื ืื ืืืฆืืจืฃ ืืฆืืืช ืฉืื ื. ืืืจืื, ืืืงืืกื ืืฉื ืฆืืืช ืืืืื ืืฉืืคืืจ ืืืงืืื ืืืืืงืืคืืื (ืจืฉืืืช ืืขืืจืืื ืืฆืืืช ืืืคืืขื ืืกืืืืื ืืืืื ืื ืืืืืขื ืืืื ืงืืฉืจืชื ืงืืื) ืืืงืืกื - ืขื ืื ืืขื ืื - ืื ืืืืืงืืคืืื ืืฉ ืขืืชืื 21:25, 28 ืืื ืืืจ 2026 (IST)ืชืืืื HananCohen, ืืกืืื - ืืืืื ื ืืื ืืืืืืื ืฉืื ืืืืืง ืฉืืืืข ืื ืงืืืืช ืืืื ืืืืืืืช ืืจืืืื ืืืชืจ. ืืืงืืกื - ืขื ืื ืืขื ืื - ืื ืืืืืงืืคืืื ืืฉ ืขืืชืื 19:10, 4 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืชืืื ืขื ืืืขืจืืช ืืืืืจืืช ืฉื ืืชืื ืืื. ืืงืืชื ืืชืฉืืืช ืืื. ืืืฆื ืื ืงืืืช ืื ืื ืฉืืืื ืืืจืื ืืืืื "ืฉืืืจ" ืืื ืืืื ืืืืื ืืื ืืืื ืืช ืืฆืืจื ืืื ืืืืื ืืืืื ืฉืืฆืืืช ืืืืขืื ืืฆืืจื ืืืชืขืจืืืช ืืืืื ืื ืื. ืื-ื ืืจ- ืชืืื ืงืืื ืื ืขื ืืืืืื ืืืืืช ืื ืืื ืืืืฉืืช. ืืืื ืืืชื ืืฉืืืืช ืฉืืฉืืืืื ืืฉืงืืคืืช ืืคืจืกืืื ืกืืืืื ืืืื ืื ืฉื ืืคืืฉืื, ืืื ืืฉืื ืืืืื ืฉืืืืื ืื ื ืืกืคืื (ืืืื ืืืชื ืืืืื) ืืืฆืืจืฃ ืืงืืืฆื. ืื ืื ืืคืฉืจ ืืืฆืืจืฃ ืืืื ืกืคืฆืืคื ืืฉ ืงืืืฆืช ืืืืกืืค ืืืคืฉืจ ืืืกืชื ืืจื ืืืขืืืช ืืืื ืขืชืืื. ืื ืื ืฉืืฉืืงืืืื ืืืจืื ืจืฆืื ืืื ืืืชืจ ืื ืืืฆืื ืืงืืืื ืืืืื ืืืฉืืื ืืงืื ืืช ืชืืืืชื.ืืืงืืกื - ืขื ืื ืืขื ืื - ืื ืืืืืงืืคืืื ืืฉ ืขืืชืื 18:40, 5 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืืืจืื ืืขืจืืื ืขื ืชื ืืืืช ืืื ืืื ืืืจืื, ืคื ื ืืื ืื ืื ืืืช ืกืคืจ ืืกืืื ืื ืืืข ืืืืืจ 'ืืืจืคื ืืืืคืืจื' ืืืืจืื ืฉืืืคืืข ืืขืจื 69 (ืชื ืืื). ืืืกืชืืืืช ืืืฃ ืืฉืืื ืฉื ืงืืืื ื ืืืืจื ืืฉื ืื ืื ืืขื ืืขืจืืช ืืืฉืืืื ืื ืื ืืืืื ืืขืจื ืื ืืืข ืืืืืจ ืฉื. ืืืืชื ืืืกืชืื ืื ืงืืจื ืืขืจืืื ืืืจืื ืขื ืชื ืืืืช ืืื ืืื ื ืจืืื ืฉืืืืื ืืฉ ืืืืจืื (ืจืืื ืืืืชื ืืกืืจื), ืืืืืื ืืืืืจ ืืื ื ืืืกืชืจ ืื ืฉืชืืจืฉ ืคืขืืื ืืงืืืืืช ืืคืชืื ืืืชื - ืืืื ืืืขื ืืขืจื 'ืชื ืืืืช ืืื' ืฉืื ืื ืืืืงื ืื ืืืกืชืจืืช, ืืืขืจื ืืฆืืฆื ืขืฆืืืช ืฉื ืืืืืจ ืืืฉ ืืชืฆืืื. ืื ื ื ืืื ืืืฉืื ืฉืืื ืฆืืงืืืคืืื ืืกืืื ื, ืฉืืฉืืฉืช ืืืืื ืจืื ืฉื ืงืืืื, ืืืืืืื ืฉืื ืื, ืืจืืืช ืงืืื ืื ืืื' - ืืฉ ืืงืื ืืืคืฉืจ ืืงืืจื ืืื ืืืืืฉืฃ ืื ืฉืื ืืืืืจืื ืืจืคืืื ืจืืืืืกืืืื ืฉื ืืื ืืืช. ืืืื ืืช ืื ืืฉื ืืื ืืืืื. ื ืืฆื ืฆืื ืืื โข ืฉืืื 17:23, 29 ืืื ืืืจ 2026 (IST)ืชืืืื ืืืืคื ืืืื ืื ื ืืืฉื ืฉืื ืชืืื ื ืฉืขืืืื ืืืคืจืืข ืืืืื ืื ืื ืื ืืืืืืืืกืื ืฆืจืืื ืืืืืช ืืืกืชืจืช. 
ืืืช ืืืกืืืืช ืืืืืช: ืชืืื, ืจืื (Aizenr) โข ืฉืืื 07:12, 30 ืืื ืืืจ 2026 (IST)ืชืืืื ืคื ืชื ืืื ืืขืช ืืืจื ื ืืกืคืช ืื ืืฉื, ืฉืื ืืืขื ืขื ืืคื ืืื ืืงืืืืช ืฉื ืื ืื ืืืช ืืกืคืจ. ืืืื ืืช ืืืืืขื ืฉืฉืืื ืื ืืืฉืื ื ืืืฉืืืช ืฉื ืืืื: "ืืจื ื ื-67 (six seven) ืืืืชื ืืืืื ืืจื ื ืืืืจ ืืืื: 69 (six nine). ืืืืื ืฆืขืืจืื ืืชืืืืื ืฉืื ืืืืขืื ืื ืื, ืืืชืืื ืืืืื 69 ืืืชืืฆืื ืืจืืฉืื ื ืฉืื ืืงืืืื ืืื ืขืจื ืืืืงืืคืืื ืฉื ืืชื ืืื ืืืื ืืช 69. ืืฉืื ืืืืฆืื ืขื ืืงืืฉืืจ ืืืฆื ืืืืื ืืจืคื ืืืื ืืืื ืฉื ืืชื ืืื ืืจืืฉ ืืฃ ืืืืงืืคืืื ืฉื ืืขืจื 69.ืื ื ืืืื ื ืฉืืฉ ืื ืฉืื ืฉืจืืฆืื ืฉืืืื ืขืจื ืืื ืืืืงืืคืืื, ืืื ืืืื ืืคืฉืจ ืืืขืืืจ ืืช ืืชืืื ื ืืชืืชืืช ืืขืืื? ืื ืืคืืืช ืื ืื ืืงืคืืฅ ืืืืืื ืืื ืืขืื ืืื? ืื ืงืจื ืืข' (ืื 9) ืืื ืืืื ืืืื ืืืชื. ืืื ืืืื ืืืื ืฉืื ืชืืื ืฉืื ืชืืื ืืช ืืืื ืืกืืจ ืืช ืื, ืืื ืืืื ืืฉ ืืจื ืืืืื ืข ืืื? ืื ืืื ืื ืืืฉื ืื ืื. ืจืง ืฉืืข ืขื 6-9 ืืืืืื ืืืืชื". ื ืืฆื ืฆืื ืืื โข ืฉืืื 15:08, 30 ืืื ืืืจ 2026 (IST)ืชืืืื ืืืื ืืฉ ืืงืื ืืฉืงืื ืืืฆืืจ ืชืื ืืช, ืืืืื ืืชืื ืืืช ืืืจืืช ืืงืืืืืช ืืงืืืืจืื:ืชืื ืืืช ืืืืจื ืืืืืจื, ืืืืืืจืืช ืืคื ื ืขืจืืื ืฉืชืืื ื ืขืืื ืฉืื ืืืชืืื ืืืืืื/ืงืืื ืื (ืืืืืืช / ืืื). ืืื ืืื ื'ืง โข ืฉืืื 18:47, 30 ืืื ืืืจ 2026 (IST)ืชืืืื ืื ื ืืืฉื ืฉืื ืื ื ืืืคื ืืชืคืืจืื, ืืืจืื ืืงืจืื. ืืืฆืขื ืคืฉืืื, ืืื ืืืชื ืืืืืืช ืืืืขื ืืช ืืชืืืืืช ืืฆืืช ืืืืืืจ ืฉืืื ื ืืชืืืื. ืื ืกื ืืจืื ืืช ืืืขื ืืช ืฉืขืื, ืืื ื ืืืงืฉ ืืืชืืืืก ืืงืฆืจื ืืขื ืื ื ืื ืืกืขืืคืื ืฉืขืื. ืืืืชืื ืืืืื ืืืชืื ืืจืจื ืืื ืืืื ืื ืืช ืืกืืืืช ืื ืืช ืืจืฉืืช ืืื ืืข ืืื ืืืชืื ืืืชื, ืืื ืื ืืกืืืื ืืช ืืืขืช ืืืืจืืจืื ืืช ืืจืืืช ืืืืกืืื. ืืืขืื ืฉืืืขืืชื ืขื ืืื ื ืืฆื: ืืืคื ืืชืคืืจืืช: ืืืคื ืืืืืืื: ืืืฆืขืืช ืฉืขืื ืืืฉืจ ืืชืืื ืืช: ืื ื ืืงืืื ืฉืชืืฆืื ืืช ืืกืืืื ืืืขืื. ืื ืืืขืชืื ืืืืืื ืฉืื ื ืืฆืื ืืืืืช ืื ืฉืื ื ืืขืฆืื ืืฉืชืืฉืชื ืืืืืืืื, ืื ืชืืกืกื ืืืขืืจ. ืคืจืืื ืชืืืื ๐ง ืฉืืื ๐ฅ ๐ฆืื ืืืื ืืืืืจ ืืืืื ืขืืืื?๐ฆ 15:27, 1 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืฉืืืื ืืชืื ืืืช "ืืืฉืืืช" ืืืงืชื ืืืื ืขืจื ืฉืืฆืจืชื ืืชืืื (ืืจืื ืื), ืืืืคืขืชื ืฉืืฉ ืื ืฉืืืื (ืฉืืืืช ืืืื ืืืืืื ืืืืื:ืขืืฆืื_ืืฆืืจืืช_ืืืืืงืืืืช ืืฉืืจื 42<includeonly></includeonly>: attempt to call field 'contains_any_csv' (a nil value).) ืืฉืืชื ืฉืื ืจืง ืืขืจื ืืื ืืื ืืฉืืชืืืชื ืืืืืง ืขืจืืื ื ืืกืคืื ืจืืืชื ืฉืืื ืงืืื ืืื (ืืืืืช ืืืืืืื ืจืืืจืืื, ืืืืืืจ ืกืืืืคื ืืืืจื ืกืืืืืก). ืืืฉืื ืืืืข ืืื ืืืงืืจ ืฉื ืืฉืืืื ืืืื ื ืืชื ืืชืงื ืืืชื? ืชืืื ืืจืืฉ! Shahar124 โข ืฉืืื 14:45, 31 ืืื ืืืจ 2026 (IST)ืชืืืื ืขืืจื ืงืื ื ืืืืืื ื ืืกืืืืช ืืขืจื ืืืกืช ืืจืืค ืื ื ืืชืงืฉื ืืืืื ืืช ืื ืืกืืืืช ืืืขืจื ืืื ืืื, ืืฉืื ืื ืืืฉืื ืืขืฉื ืืืช ืืืงื ืื ืฉืชืืื :) ืืื ืืจืื ืฆืืื โข ืฉืืื 07:53, 2 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืกืงืจืืคื ืืฉืืืืจ ืืืืจ ืฉื ืืฉืืชืืช - ืืื ืืื ืขืืื ื? ืืืืงืจ ืืื ืืกืคืจ ืืฉืืชืืช ืืืืืช ืืจืฆืฃ ืฉื ืขืจืืื. ืืื ืฉื ืขืฉื ืืืงืจืื ืืืื ืืขืืจ, ืืจืฆืชื ืืช ืกืงืจืืคื ืืืืคืื ืืืฉืืชืืช. ืืฉืืืงืชื ืืช ืืชืืฆืืืช, ืจืืืชื ืฉืืื ืฉืืืจ ืื ืื ืืขื ืขืจืืืืช ืชืงืื ืืช. ืืื ืืืืจ ืืฉืื ืืืื ืืืืืงืืคืืื ืืืืจืื ื, ืขื ืืกืคืจื ืืืื-ืคื ืืืื ืืื, ืขืืืื ืืืื ืืืฉืชืืฉ ืืกืงืจืืคื ืืื? (ื ืืืฆืชื ืืฉืืืจ ืืื ืืช ืื ืืขื ืขืจืืืืช ืฉื ืจืื ืื ืชืงืื ืืช). ืืืื โข ืฉืืื 08:28, 2 ืืคืืจืืืจ 2026 (IST)ืชืืืื "ืืจืืฉ ืฆืืืื" ืืขืจืืื ืืชืืจืืืื ืืืช ืื ืคืขื ืจืืฉืื ื ืฉืื ื ื ืชืงื ืืืงืจื ืืื, ืืืื ื ืืืฉื ืฉืืฉ ืืืืฉืื ืืชืืืืกืืช ืืื ืืื ืฉืืฆืืืชื ืืจืืืช ืขื ืื ืืืคื ืืืืื ืืืช ืืืืื ืื ืื ืืฉื. 
ืจืื ืืืืื ืืขืจื ืืืืืื, ืฉื ืืืืื ืขืจื ืืืืงื ืืื ืืื, ืืืขืจื ืืืงืืจ ืืจืืข ืืื ืืขื "ืืจืืฉ ืืงืืจ". ืืืชืจืื ืื ืืขืืืจื ืขื ืื ืฉืืขืืืจ ืืช ืืชืืื, (ื ืชืงืืชื ืืื ืืขืช ืกืงืืจื ืฉื ืืชืจืืื ืืฉืืคืืจ ืืืงื ืืืจืืืช ืืืงืกื ืื ืืื ืืช ืืชืจืืื). ืืื ื ืชืืื ืืื ืืืขืชืงืช ืืขื ืืช ืฉืืขืจื ืืืงืืจื ืืืจืฉ ืืงืืจ ืืืื ืืืขืชืืง ืืช ืืชืื ืืช ืื ืืขืจื ืืืชืืจืื? (ืื ื ืืืฉื ืฉืื ืืืืืจ ืืืขื ื ืืืืจืฉืช ืืคื ืืืืืื ืืงืืจ ืจืฆืื ืืืฉืืืจ) Szibre โข ืฉืืื 19:36, 5 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืื ื ืืฆืขืชื ืืืื ื ืืืืกืืฃ ืืชืื ืืช {{ืืขืจื}} ืคืจืืืจ ืื ืฉืื ืืฉื "ื ืืืง" ืื "ืืืืืจ" ืืื ืชืืจืื ืืืืืงื ืื ืืืฉืชืืฉ ืื ืขืืจื ืงืจื ืืช ืืืงืืจ ืืืืฉืจ ืฉืืื ืืื ืชืืื ืืืชืื. ืื ืืคืจืืืจ "ื ืืืง" ืืขืจื ืืื "ืื" ืืืขืจื ืื ืืืฆืืช ืืขืจื (ืืชืฆืืืช ืืงืจืืื) ืืื ืื ืงืืืืช ืืงืื ืืืงืืจ - ืืื ืขืืจืืื ืืืจืื ืืืืืื ืืืขืช ืขื ืื ืืืืืจื ืืชืืกืกื ืืืขื ื ืืืืืืง ืืช ืื. โ ื"ืจ MathKnight โก (ืฉืืื) 14:35, 7 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืชืืืจืืช ืชืืจืืืืช ืืชืืื ืฉืืื, ืืฉืื ืืฉืืืข ืืื - ืื ืืชืืืจืืช ืฉืืชื ืืืฉืืื ืฉืชืืื ืืืืื ืืืืชืจ ืืชืืจืืืืช ืืชืืื? ืืืืื ื ืืื ืืชืืจืืืืช ืืชืืื ืืฆืืืฆืืืช ืืืชืจ ืืืชืืจืืช ืฉืืืืืง ื ืืืจื, ืืืฉื ืชืืจืืช ืืื ืืืืงื ืืืชืื (ืืืืงื ืืืชืืจืื ืืื ืืงืืจื ืืืฉื), ืื ืชืืจืืช ืื ืืฉื ืืกืืื. ืชืืื โโช โFuncsโ โ ืฉืืื โช 22:50, 5 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืฆืขื - ืืจืืืช ืกืืืืืืช ืืืคืขืืืื ืืขืฆืืจ ืืืืืืช ืคืขืืืืช ืืืืืงืืช ืืงืืืืช ืืืงืืคืืื ืืืจืืขืื ืฉืื ืื ืฉืื ื ืขืืงื ืืืจืืื ืืฉื ืื ืืืืจืื ืืช ืืืืื ืืืชื ืืืืฉืื ืฉืืคืขืืื ืฆืจืื ืืืืืช ืืืื ืืืืืงืจืืื ืืืจืืฉืื "ืืืืืจืื ืืืจืืื" ืฉืืขืฆืจื ืืืืืืช ืชืืืืืื ืฉืืืืงืื ืืงืืืืช ืืืงืืคืืื ืืืคื ืื ืฉืื ืืืฆืืืื ืฉื ืจืข ืืงืืืื. ืื ื ืืืชื ืืช ืืฆืขืชื ืืขืงืืืช ืคืชืืื ืืืงืืื ืฉื ืืืื ื ืืฉืืืืช ืขื 9 ืขืจืืื ืฉื ืืืืคืื. ืืืื ื ืืืฉืืืืช ืืืื ืืื ืืืืชืจืื ืืงืืืื ืคื ืืื ืืื ืขืืจืจื ืืขืก ืขื ืืงืืืื ืืคืืจืกืื ืขื ืื ืืืืขื ืืืชืจ ืืืงื. ืืืฆืขื ืฉืื ืืื ืืื (ืืขืจื, ืื?): ืืืงืจืื ื ืืืจืื ืฉืืื ืืคืขืืื ืืขืจืืช ืืขืจืืืื ืฉืคืขืืื ืืกืืืืืช ืขืืืื ืืคืืืข ืืงืืืื ืืืคื ืื ืื ืืืืืื ืฉืื ืืืืืฅ, ืชืืื ืืื ืกืืืืช ืืืืืืช ืืืืืง ืชืื ืื ืืืคื ืฉืืื ืืืืกืื ืืืชื. ืืคืขืืื ืืืืืืืช ืชืืื ืจืง ืืืกืืื ืฉื ืืื ืืคืขืืืื ืืืฉืจ ืืขืืงืจืื ืืืืื ืืื ืืื ื ืขื ืืงืืืื. ืืืืืงื ืืืืกืืื ืืืื ืขื ื 24 ืฉืขืืช ืืืื. ืืืื ืืืืืงื ืืืขืฆืืจื ืืงืืืื ืืืคืขืืืื ืฉืืืืช ืืื ืขืฆืื ืืขื ืืืขืืจืืื ืืื ืืืจืืืข ืืช ืืจืืืืช ืืืืฉืื ืืืฆื ืืืืฉ ืืช ืืฉืืื ืืืืคื ืฉืืืืื ืขื ืืงืืืื. ืื ื ืืืฉื ืฉืืืื ืฉืืืฆืขื ืืื ืชืืืื ืืฆืืืชืื ืฉืขืืกืงืื ืืฉืืคืืจ ืืืงืืื ืืืืืงืืคืืื. ืื ื ืืงืืื ืฉืืจืกื ืื ืื ืืืจืช ืฉื ืืฆืขืชื ืชืืื ืืืืื ืืืฆืืขื ืืขืชืื ืื ืื ืืืจ. HananCohen โข ืฉืืื 14:18, 8 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืฉืืื ืืงืจื ืงืืฆืื ืืฉื ื ืืืจืืงืจืืื ืฉื ืืืจื ืืื ืืชืช ืืช ืืืืื ืืืืจืื ื ืืกืืืืืช ืืืจืืืืช ืฉืืฉ ืืชืช ืขืืืื ืืช ืืืขืช. ืืชืืจ ืขืืจื ืื ืืื ืืื ืื ื ืื ืจืืฆื ืฉืงืืืฆืช ืืขืื ืืจืฉืืืช ืืกืืืืช ืชืืืืง ืืกืืืืืืช ืืจืงืื ืืืช, ืืืชืืจ ืืคืขืื ืื ื ืื ืฆืจืื ืขืื ืชืืืื ืืืจืืืช ืฉืืืืืจืื ืืฉืื ืืคืืจ. ืืื ืื ืืื ืชืืืื ืืืืจืืืช ืืืืฉืืืืช ืฉื ืืืคืขืืืื ืจืืืื ืืชืืืืืื (ืืืฉื ื ืืืืกืืจ ืืืืืข ืืืงืืืข ืืืฆืืช ืืืคืขืืืื). ืืืงืจื ืื ื, ืชืื ืืืช ืืืฉืืืืช ืืืกืจื ืืืืจ ืคื ืืื (ืืืืื ื ืืืช: ืฉืื) ืืขืืจื ืฉืื ืื ืืืชื ืืืืชืืืื, ืืืฃ ืฉืืืื ืืคืขื ืืืืืืื ืืืจ ืื, ืืขืืจื ื ืืื ืืืขืชื ืืื ืืฆืืื ืืืื. ื ืืื ืฉืื ืืง ื ืขืฉื ืืืืืช ืื"ืฆ ืืจืืข ืืงืืืื (ืืื ืื ืืืืจ ืฉืืืชืื ืืืื ืืืืงื ืืืืชื ื ืืื ื ืืืืช ืืืื), ืืื ืื ืืืง ืืืืืืจ ืื ืืืื ืืืชืืืช ืื ืฆืืงืืืคืืื ืืฉืคื ืืขืืจืืช - ืฉืจืื ืืืืฅ ืฉื ืืืืจืื ืื ืืืจืื ืืืช ืืืืื ืืช ืืืืจืืืืช ืืชืื. 
ื ืื โข ืฉืืื 08:17, 11 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืชืื ืืช:ืืืืืช ืคืจืืื ื ืืืืงืืจืช {{ืืฆืขืจื ื ืคืชืื ืืืชื ืืฆืืขืช ืืฉืืืืช ืกืืืื ืืืช ื10 ืขืจืืื ืขื ืืืืคืื. ืื ืืืืืจ ืจืง ืืชืืืืช ืฉื ืืืงืืคืืื ืืคืื ืืืืงืืจืช ืืืฆืื ืืช ืืืจืืช ืฉืืฉืื ืืืืืช ืงืฉืืืื ืื ืืื . ืืืืืจ ืืืคืจื ืฉื ืืืืืช ืคืจืืื ื ืฉืืชืงืืื ืืื ืงืฆืจ ืืืงืื ืืืชืจ, ืืืืจ ืืืืจ ืืืืืืื ืืืืืจ ืืฆืืขื. ืืื, ืื ืฉืืงืจืืืจืืื ืื ืื ืืื ืื ืืจืืจืื ืื ืฉืืฉ ืืืื ืฉืื ืืืืืื ืืืืืืช.}}Talitova โข ืฉืืื 17:46, 14 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืฉืืืืชื ืฉื ืขืจืื ืืืฉืืืื ืขืชืืืืื ืืืฉืจืื ืขืืชื ืืืืืงืช ืืื ื ืืืื Mertaro ืืืืืช ืืฉืืืืชื ืฉื ืขืจืื ืืืฉืืืื ืขืชืืืืื ืืืฉืจืื. ืื ื ืกืืืจ ืฉืืืฉืืืื ืฉืืืืื ืขื ืืงืืชื ืืืืกื ืืจืืืื ืื ืืคื ืืืืง (=ืืืืฉืื ืืืืื ืช ืื ืืขืื ืืืกืืืช ืืืขืื) ืจืืืืื ืืขืจื ืืืืงืืคืืื ืืืฉ ืืื ืืฉืืืืช ืื ืฆืืงืืืคืืืช, ืืืขืืืชื ืืจืืจื, ืืื ืฉืื ื ืืืื ืืช ืืขืชื, ืกืืืจื ืฉืืื ืืืชืื ืขืจืืื ืขื ืืืฉืืืื ืขืชืืืืื ืขื ืฉืืฉืืืื ืืช ืืืื ืืงืืช ืืืืฉืื ืืืืื ื ืืืืืช ืฆืืจืืฃ ืืืืขืฆื ืืืืจืืช, ืชืืืื ืฉืื, ืื ืืื ืืจื ืืืจืช (ืืจืืจื โ ืื ื ืืืืงื ืืืชื ืื ืื ื ืืืขื ืืืื ืช ืืขืชื). ืืืืจ ืฉืืงืืืืจืื:ืืฉืจืื: ืืืฉืืืื ืขืชืืืืื ืืฉื ื 33 ืืคืื ืืขืืืืื ืืืืืงื ืขื ืคื ืืขืชื ืฉื ืืจืืจื, ืืืืืจ ืืืช ืฉืืฉ ืขืจืื ืืืฉืืืื ืขืชืืืืื ืจืืื ืืขืฉืืืื ืืืืืชื โ ืืืฉื ืืืจ ืืืืจืื ืืืืื, ืืื, ืืืขืืช ืขืืจืื, ืืืื ืื ืืืื ืืจืื ืื ืื, ืฉืื, ืืืื ืืืจ ืืืง ืืฉืืืจืื ืื ืืกืคืื, ืื ื ืืขืื ืืื ืืืืืข ืืื ืืืืืืช ืงืืืื ืื-ืืฉืืขืืช: ืืื ืืฉ ืืฉืืืืช ืืขืจืืื ืืืืืช ืืืฉืืืื ืขืชืืืืื ืืืืื ืช ืืฉืจืื? ืืืื ืื ืงืืืืื, ืฉืขืกืงื ืืืงืจืื ืืกืืืืื ืืื ื ืืกืืื ืืืืขื ืืขืืงืจืื ืืืืื: ืฉืืื:ืืืคืืจ (ืืืฉืื)#ืืฉืืืืช, ืฉืืื:ืืืื (ืืืฉืื ืืฉืจืืื)#ืืฉืืืืช, ืฉืืื:ืฉืื (ืืืฉืื)#ืืฉืืืืช, ืฉืืื:ืืจ ืืืง (ืืืฉืื)#ืืฉืืืืช, ืืืงืืคืืื:ืจืฉืืืช ืืืขืืืื ืืืืืงื/:ืืจ ืืืง (ืืืฉืื), ืฉืืื:ืืืื ืืฉืืจ ืฆืคืื#ืืฉืืืืช, ืฉืืื:ืืืช ืืืจืื ืฆืคืื#ืืฉืืืืช. ืืฆืื ืฉืชื ืืคืฉืจืืืืช ืืืืจื ืืื ืืื ืืงืืืื ืฆืจืืื ืืืืืจ, ืืืืื ืฉืืคืฉืจ ืืืฆืืข ืืคืฉืจืืืืช ื ืืกืคืืช: ืืื ืื ืืืืืจ ืฉืืืืืื, ืื ืชืืืข ืืฆืืจืช ืืฆืืขื, ืืืืื ืืืจืืก ืืืืืืช ืงืืืื ืงืืืืืช โ ืืื ืืืืื ืื, ืืืืื ืืฃ ืืืืืืจ ืืืจืื ืขืจืืื ืฉื ืืืงื. ืืชืืื ืื ืืช IdanST, ืฉืืืืข ืขืืื ืื ืืฉื ืืื ืืืืืง. ืืืจ ืื ืื โข ืฉืืื 18:14, 8 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืืื ืืืืื ืืืจืื ืขืจืืื ืืืฉืืืช, ืฉืื ืืืจื ืืื ืืืืื. ืืืฆืืช ืืืืืื. ืืืชืื ืืื ืื ื ืืฆืืข ืฉืื ืื ืืืืจืืช: ืืื ืืืืืื: ืืฉืืขืืชื ืืคืืืช ืืื ืืืื: ืืื ืืืืช ืกืคืจ ืืืืื, ืืื ืืช ืืื ืืฉืจืื ืืืืช (ืื ืื ืจืง ืืฆืืจื ืชืจืืืชืืช, ืืื ืืืืชืืช), ืืื ืขืฉื ืืจ/ืืช ืืฆืืื. ืืคื ืื ืืืฉื ืืื ืจืืกืื - ืืืืืื. ืื ืืื ืจืืงืืืฃ - ืืืืฆื ืืืืื. HofEz96 โข ืฉืืื 22:06, 8 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืชืืื ืืื. ืืืืื ืืื ื ืืฆืจ ืื ืื ื ื ืชืงื ืืืจืื ืืืื ืขืจืืื ืื ืฉืื ืฉืืืืืจืื ืืฉืจ ืืืืืืื, ืืืจืืช ืฉืืฉ ืืื ืืืจื ืืืืื ืืื, ืืื ืจืื ืืช ืขืฆืื ืืืืืืื. ืืื ืื ืืกืืืื ืฉืืื. ืืื ืืืืืื ืฉื ืื ืืื ืจืืงืืืฃ ืื ืืจืชื ืืจืืจืืฅ'. ืืกืคืืื ืคื ืื ืฉืื ืืขื ืืฉืจืื ืืืื ืฉืจืื ืืช ืขืฆืื ืืืืื. ืืืื ืืืืขืื ืฉืื ืืืื ืืืืืืื, ืืคื ืื ืฉืื ืื ื ืืืืขืื ืขื ืจืืฉืืช ืืืืื ืื ืืจืืืื ืืช. ืื ืืื ืื ืื ืืืืื ืืจืืฉืื ืฉื ืขืฉื, ืืื ืื ื ืืคื ื ืืื ืฉื ืื ืืื ืืืื ืืืื. ืื ื ืืืฉื ืฉืฆืจืื ืงื ืื ืืืืช ืืืกืื ืื ืืืืืจ ืืืืืื ืืชืืืืจ ืืจืืฉืื ื ืฉืื. --HofEz96 โข ืฉืืื 18:57, 9 ืืคืืจืืืจ 2026 (IST)ืชืืืื Absolutely. I really do. My dad is Northern Irish and my mum is Jewish. Thatโs working blood. Though I am not religious in the least, I am very proud to be Jewish. 
โืื ืขืื ืชืืจืช ืืืืกืืช ืืงืืืืช ืื ืืื ื, ืืขืืชืื ืื ืืืจืื ืื ืืชืืจืื ืืืชื ื"ืืจืื ื" ืืืื ืฉืืื ืืืื ืืชืืจืื ืืืชื ื"ืืืืื ืฉืืืืฆืจื". ืื ืืชืืจืจ ืฉืืชืืจื ืืืืขืืช ืืืฆื ืืชืืคื - ืืื ืืืื ืืงืจืื ืื ืืจืื ื ืืืืจืื ืื ืืงืจืื ืื ืืืืื"โ ืื ื ืจืืฆื ืื ืกืืช ืืืงื ืืช ืืืืื ืืฉืืื ืืชื ืจืืื ืืืืืืจ ืืื ืืืืืื ืืคืชืื โ ืืื ืงืฉืจ ืืฉืืื "ืืืื ืืืืื". ืื ื ืืืฉื ืฉืืื ืกืืื ืืขืฉืืช ืืืช, ืืื ืื ืืขืืืื ืืื ืืฉ ืืฉืคืขื ืืืฉืืช ืขื ืืขืืกืืง/ืืขืฉืื ืฉืืงื ืื ืื ืืฉืืืืช ืื ืฆืืงืืืคืืืช ืื ืขื ืืืืืืจืคืื ืฉืื, ืืืื ืืงืจื ืืืฉืจ ืืืืืืจืคืื ืฉื ืืื ืืืจืืืช ืืืืืจื ืคืฉืืื, ืขืืืฃ ืืฉืืืจ ืืช ืืคืจืืื ืืืืฃ ืืขืจื. ืืืืฃ ืืขืจื ืืฉ ืืช ืืืงืื ืืืจืืฉ ืืื ืืชืืจ ืืช ืืจืงืข ืฉื ืืื ืืืืคื ืขืืืืชื, ืืืื ืืขืกืืง ืืืืืจืืช. ืคืืืืชืืืืจื โข ืฉืืื 16:59, 10 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืื ืื ืืืื ืืืจืื ืืืืืืืช ืฉืื, ืืคืืื ืื ืืื ืืืชื ืฆืจ/ืืชืืกืื/ืืืืจ-ืืช. ืืืืืืช, ืืื ืื ืืืฉื ืืขืจื ืืชืืื ืื ืื ืืชืืืฉ ืื, ืขืืืื ืืืืื ืืืง ืืฉืื ืืืืืชื, ืืืื ืืฉ ืืฆืืื ื (ืืขืืชืื ืืฆืืจืช "ืืืืฆื ืืืืื" ืื ืืื ืื ืืืืืืื ืื ืฉืืชื ืฆืจ/ืืชืืกืื). โ ื"ืจ MathKnight โก (ืฉืืื) 19:09, 11 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืฆืืื ืืืืืืช ืืืฉืจืืืืืช ืืืืืจืืช ืืืฉืจืืืืืช ืื ืืฉื ืืื ืขืื ืืืจ ืืขืืจ, ืืื ืื ื ืื ืืืื ืฉืืฉื ืืฉื ืืคื ื ืขืฆืื. ืืฉื ืื ืกืชื ืืงืืืืจืื:ืืืืืช ืืืืืจืกื ืืขืืืื ืืช ืืืืจืืคื, ืฉืืชื ืื ืฉืื ืืขืืืื ืฉืืืืช ืืขื ืืืฉืจืืืืช ืื ืืฆืืื ืช ืืืื ืืฉื ืืขืจื. ืืฉื ื ืืื ืืืืจื ืืืืจืช (ืื ืืื ื) ืืคืื ืืืงืืคืืื ืืขืืจืืช ืืื ืขืืจืืช ืืื ืืฉืจืืืืช. ืืืืข ืื ืื ืื ื ืฉื ื ืืช ืื ืฉืืืช ืืืืืืช ืืืฉืจืืืืืช ืืฉืื ืืช ืื ืฉืืืืื ืืช ืืืืื "ืืืฉืจืืืืช"? ืื ืืืืืช ืืขื ืืืฉืจืืืืช ืืืืืจืกื ื ืฉืื, ืืืืช ืืขื ืืืฉืจืืืืช ืืืืืจืื, ืืืืช ืืขื ืืืฉืจืืืืช ืืืืืจืกื, ืืืืช ืืขื ืืืฉืจืืืืช ืืืืืจืขืฃ ืืื ืืืื. ืืืฆืขื ืฉืื ื ืืืขืช ืืฉืื ืื ืจืง ืืืืืืช ืืขืืืื ืืช. ืืืืืืช ืื ืืืืืช ืืืชืจ ืืืฆื ืคืืืช ืืืืจ, ืืื ืืืืืืช ืืขืืืื ืืช ืฉืื ืฉื ืจืื ืืขืจืืื ืืื (ืืืขื ืชืืื ืืืืช ืืขื). ืชืื ืืืจืืื - ืฉืืื 13:28, 10 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืจืืฉ ืืื โ ืืืขืืจ ืืืฃ ืืืงืืคืืื:ืืื/ืืงืฉืืช ืืขืืจื ืืืืืื ืืชืจืืื ืืืืงื ืืื ืืฆืืขื ืืขืจื ืื ืืืื ืืืฆื ืืจืืื ืื, ืฉืืื ืืืืืงืืคืืื ืขืืจืืช ืืื ืืืชืจ ืืขืฉืจืื ืฉื ื ืืืฉ ืื ืืงืืืืื ืืขืื ืืคืืืช ืชืจืืกืจ ืืืงืืคืืืืช. ืืขืจื ืชืืจืื ืืฉืคืืช ืืืจืืช ืืืื ืื ืืงืืจืืช ืืขืืจืืช. ืืจืืื ืืช ืืืื ืืืืช ืืืืืจ ืืขืจื ืืืื ืืืื ืขื ืืจืื ืืงืืจืืช. ืืืื ืืืื ืฉืืื ืืืืื ืืงืืจืืช, ืฉืื ืงืจืื ื ืืขืฆืืื ื, ืืื ื ืืืื ืืืืื ืืช ืืืงืืจืืช ืืขืจื ืืขืืจื, ืืื ืื ืืืฉื ืืืงืืจืืช. ืื ื ืืืงืฉ ืืฆืืื ืฉืื ืืฆื ืฉืืื ืืืจืื ืขืจืืื ืืชืืจืืืื, ืื ืืขืจื ืืื ืืืชืงืฃ ืืจืืข ืืืืงืฉืื ืืืืื ืืืืืงืชื ืขื ืืื ืืขืืจื ืืืืืื. ืื ืืื ืืขืืืจ ืืืืืื, ืื ืืืืจ ืืืืืช ืืืจืื ืฉื ืืืืช ืื ืืืคื ืขืจืืื ื ืืกืคืื ืืืกืชืืืื ืขื ืืงืืจืืช ืืืืืงืืคืืืืช ืืืจืืช, ืืื ืืืืืจ ืืื ืืืืืื ืฉื ืืืื ืืืช. ืืืจืื. ืืืฉ โข ืฉืืื 16:58, 11 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืืฉ, ืืคื ื ืืช ืชืฉืืืช ืืื ืืื ืฉืขื ืคื ืืืืืื, "ืืืืจ ืืืขืืจื ืืืืืื, ื ืืชื ืืืืืืจ ืืช ืืขืจื ืืืจืื ืืขืจืืื ืื ื ืืฆื ืืขื ืืืืช ืืฆืืขื ืฉืกืืืจ ืื ืืขืจื ื ืืฆื ืืขืช ืืืฆื ืจืืื ืืืจืื ืืขืจืืื (ืืืื ืืืชื ืืขืจื ืืืงืืจื), ืืื ืืืื ืืงืืื." ืื ืืืืจ ืืืคืื ืื ืืฉืฉ ื"ืชืจืืื ืืืืงื ืืื ืืฆืืขื" โ ืงื ืืืคืื ืืช ืืืืืื, ืืขืืืช ืขืจื ืฉื ืืืง ืืืฆืืขื, ืฉืืืื ืืืื ืฉืืืืจ ืืื ืืื ืฉืื ืื ืืืืชื, ืืืื ืืืืืจ ืจืง ืืจืื ืืืืืก. ืคืืืืชืืืืจื โข ืฉืืื 23:55, 12 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืื-ืฉืืืืช ืื ืื-ืืฉืืืืช โ ืืืขืืจ ืืืืฃ ืฉืืืช ืืืงืืคืืื:ืืฉืื ืืฉืจืืืชื ืืช ืืขืจื ืืฉืคืื ืืื-ืฉืืืืช ืฉื ืืื ืืืืชืจืช ืืืื ืฆืจืื ืื. 
ืคื ืืชื ืืืฃ ืืฉืืื ืืืืขื ืืฉืชืื ืขืชื ืืืืื ืืื ืืื ื, ืฉืืขื ื ืฉ"ืื" ืืื ืืืืืช ืืื "ืืืชื" ืื "ืื", ืืืืืื ืืชืืจ ืืืืื ืืช "ืืืกืคืจ ืืื-ืืืื", ืืืฉืจ ืืจืืจ ืฉืืืืืื "ืืืกืคืจ ืื-ืืืืื" ืืื ื ืชืงืื. ืืื ืื ืงืจืืชื ืืช ืืืจื ืืื ืื ืืื ืฉืืฆืืืข ืขื ืื ืฉืืกืืื ืืืืกืจ ืืชืงืื ืืช ืฉื ืืืืืื "ืืืกืคืจ ืื-ืืืืื", ืืื ืฉ"ืืืื" ืืื ืชืืืจ ืืื ืฉื ืืืื ืื "ืื-ืืืื" ืืื ืชืืืจ ืืืื ื ืกืืืืืช. ืืขืืืช ืืืช, "ืฉืืืืช" ืื "ืืืืืืช" ืื ืฉืืืช ืคืขืืื (ืชืืื ืืช), ืืืฉืืฆืืจืคืช ืืื ืืืืื "ืื", ืืื ืืชืคืงืืช ืืฉื ื ืจืืฃ ื"ืืืกืจ" ืื "ืืขืืจ", ืืืื ืืคืฉืจ ืืืืจ "ืืืช ืืชืืื ืืช ืืืกืคืจ 7 ืืื ืื-ืืืืืืืช ืฉืื" โ ืืืืืจ ืืฉ ืื ืืืกืจ ืืืืืืช. ืืืฃ ืืื ืฉื ืืืงืืืื ืืืกืืจ ืฉืืืงืจื ืืืื"ื "ืื" ืืื ืฉืืืฉื ืืืืืืช ืฉืืืื ืืื "ืืืชื" ื"ืื", ืืื ืืื ื ืฉืืจื ืขื ืืืื ืืืืืืืื ืืื "ืื-ืืคืฉืจ" ืืื "ืื-ืืืื" (ืฉืืืืช ืคืืขื ืื ืชืืืจ). ืืื ืืขืืจืืช ืืืืฉื ื ืืฆืจื ืืืืืืื ืืืฉืื ืืื "ืื-ืฉืืืืช", "ืื ืืื ื", "ืื-ืฆืืืช", "ืื-ืกืืจ", ืฉืืืืื "ืึดื" ืืชืคืงืืช ืืืืื ืืฉื "ืึธืึดื" ืืืืืืช ืืช ืืกืจืื ืืฉืืืืช, ืืืื ื, ืืฆืืืช ืืกืืจ. ืืืขืชื ืื ืืืื ืืืืฉืืื ืืื ืื ื ืขืฉื ืฉืืืืฉ ืืืืืื "ืืืชื" ื"ืื", ืฉืืฉืืฉืืช ืืฉืืื ืืขืืงืจ ืชืืจืื ืื ืคืขืืื (ืืืชื ื ืจืื, ืืืชื ืืคืื, ืืืชื ืชืืื; ืืื ื ืืืข). ืืื ืืืืง ืืืื ืืืืืืืื ืืืืชื ืฉืืื ืืืืืื "ืืื ืฉืืืืช" ื ืฉืืข ืื ืชืงื ื. ืืืงืืืื ื ืื ืขื ืืืืืืื ืืืื ืืืฆืืจืืช ื ืืื ื (ืืืื ืืชืืืืกืช ืืืคืืจืฉ ืืืืื "ืึดื" ืืืืืจืช ืืฆืืจืืฃ "ืื-ืชืืืช"), ืืืืขืชื ืื ืืืื ืฉืืื ืืืื ื ืฉืืืืื "ืื" ืืชืคืชืื ืืฉืคื ืืืชื ืื ืืื ืฉื (ืืฉืืื ื). ืืฉ ืื ื ืืคืฉืจืืช ืืืืช ืจืืฉ ืืงืืจ ืืืืขืื ืฉ"ืื" ืื ืื ืฉื ืืืืชื ืืืืื ืฉืื ืืฉืชืืฉืื ืื ืืืืฅ ืืฆืืจืืคืื ืืืื, ืืื ืืืขืชื ืขืืืฃ ืืงืื ืืช ืืฉืคื ืืืืขืืช ืืืืื ืืืืืืืืืช. ืืฉืื ืืฉืืืข ืื ืืขืช ืืขืื ืืืืข ืืืฉืื ืขื ืขื ืืื ืื. ืืืื ืืืืข ืืืื ืืฉื ืืช ืืช ืืืืื ืืืช, ืื ืืคืืืช ืืงืืืข ืืืื ืืืช ืืืืื ืืื ืืืชืจ. ืืืืคืืฉ ืืืชืจ ืืฆืืชื ืฉืืขืืืช ืืืงืจืื "ืืื ืฉืืืืช" ื"ืืื ืืืืืช", ืืืืืื "ืื ืืืืื" ื ืคืืฅ ืืฆืื ื ืืจืื ืืืชืจ ืืืืืืื "ืืื ืืืื". ืจืื ืืืคืื ืืืชืจ ืฉืขืืกืงืื ืืืืืืืื ืืืื ืืฉืชืืฉืื ืืืืืืข ืฉื ื"ืื", ืื ืจืื ืขื ืืกืืก ืืืืื ืฉื ืืฉืคืื ืืื. ืืืืืืืช: ืืฉืคื ืืื-ืฉืืืืช ืฉื ืฆ'ืืืืื, ืืฉืคื ืืื-ืฉืืคืื, ืืฉืคื ืืื-ืืืืจืืช ืฉื ืืจืกืงื, ืขืงืจืื ืืื-ืืืืืช, ืืืง ืืื-ืกืชืืจื, ืืืืืช ืืื-ืื ืืื ืืจืืฉืื ื ืฉื ืงื ืืืจ. ืืืฆื ืืฉื ื ืืฆืืชื ืจืง ืืช ืืขืืืช ืื ืืืืืช. (ืื ืงืืืื: ืฉืืืชื ืืื ืคืขืืื AI ืฉืื ืื ืื ืืฆืืจื ืื ืืื ื, ืืื ืืืืื ืืืืืืื ืืื ืืชืฉืืืืช). ืกืืืื ืขื ืืืขืืจืืช ืืืจืืืืช, ืืงื ืื ืืื ืืืืืืจ ืืืคื ืคืืชืืื ืืืื ืื ืืืื. ืืืจืื, ืขืื โข ืชืืืื ืืืชื ืืืขื ื ืืืืจื - ืฉืืื โข ื"ื ืืฉืื ื'ืชืฉืค"ื โข 01:43, 13 ืืคืืจืืืจ 2026 (IST) โ ืกืืฃ ืืขืืจืืชืืืื ืืคื ืื ืืคืืจืื ืืจืืฉ ืืขืจื ืจืืืชื ืืช ืืขืจืืื ืืื ืืืชืงืฉืชื ืืืืฉ ืืขื ืขืืื. ืื ืืชื ืฉืื ืื ืืืจืช ืื ืืกืื ืืืืจืื ืฉืืืื ืฉืืืื ืกืื ืืจื ืืืื ืืืืืื. ืืกื ืืื ื ืจืื ืื ื ืืื ืืืคื ืืช ืืชืื ืืช ื ืืืื ืื ืืคืืจืื ืืจืืฉ ืืขืจื, ืืื ืื ื ืื ืกืืืจ ืขื ืืืืฉืื. ืืืืคื ืืืฉื ืื ื ืืืฉื ืฉืืืื ืืชืืช ืืชืื ืืช ืืคืืชืืช ืฉื ืืขืจื (ืืื ืืขืืื ืืื ืืขืจืืื ืืื), ืื ืืืื ืืืืืช ื ืืื ืืืคื ืืช ืืคืืจืืืื. ืืชืืื ืืช Thefarmeryes. ืืืฉ ืืฉืื (HaShumai) - ืืชืื ืื ืืืืขื - ืืืฉืืืช ืขื ืืืขืื ืืืืืงืืช 10:35, 14 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืืื ืืฉืืืืช ืืฉืืชืฃ ืืืืฉื ืืขืฆืืชืืื ืฉื @ืืื ื ืื ืืืืง @ืฉืืื ื@Funcs ืื ื ืคืืชืืช ืืืื ืืฉืืืืช ืืฉืืชืฃ ืขื ืชืฉืขื ืขืจืืื (ืืกืืฃ 9 ืืื 10 ืื ืืืื ืืื ืืืืืืืื ืฉืืฉ ืืื ืฉื ื ืฉืืืช ืืื ืืืืืจ ืืกืืฃ ืืืืชื ืขืจื). ืืืืื ืืื ืืืืืฃ ืืช ืืืืื ืื ืืคืจืื ืืื. ืืกืืื ืืื ืฉืื ื ืคืชืืื ืชืฉืขื ืืืื ื ืืฉืืืืช ืคืจืื ืืื ืืื ืฉืืืืืจ ืืืืื ืื ืืืื. 
ืืขืจืืื ืื ืขื ืืืืืื ืฉืงืื/ืฉืืืืจื ืฉืืงืืื ืืฉื ืช 2024 ืืืฉื ืช 2025. ืื ื ืงืจืื ืืช ืืขืจืืื ืืื ืืฉ ืืืื ืืื ืืืื ืฉืืชื ืืขืืืคืื ืฉืืคืชื ืื ืืืข ืืื ืืืื ืืฉืืืืช ืคืจืื ื ืชืืืื. ืืื ืืืื ืืฉืืืืช ืคืจืื ื ืื ืืืืื ืืื ืืืฉื ืฉืืืข, ืืืืืจ ืขื ื 21 ืืคืืจืืืจ ืืฉืขื 21:40. ืืืืจืื ื ืขืืืจ ืืฉืจ ืืืฆืืขื. ืืฉืื ืขืืฉืื ืชืื ืืช ืืฉืืืืช ืืืช ืขื ืืื ืืขืจืืื ืื ืืฉื ืืืื ืืฉืืืืช ืคืจืื ื ืจืง ืืื ืืืคื ืืช ืืืืื ืื ืื ืฉืืฃ ืืื ืื ืืคืกืคืก ืืืชื. ืืขืจืืื ืื: 1. ืจืืฉ ืืขืื ืืืจื (ื ืงืจื ืื ืฉืืจืช ืืื), 2. ืืืจืช ืื ืืืื (ืืืื ืืจ-ืื), 3. ืืืื ืืฉืืจ ืฆืคืื (ื ืงืจื ืื ื ืืืช ืืคืจืื), 4. ืืฉืขืื (ืืืฉืื ืขืชืืื) (ื ืงืจื ืื ืืฉืืื), 5. ืืืช ืืืจืื ืฆืคืื, 6. ืืืื ืืฉืืจ ืฆืคืื, 7. ืืคืงื (ืืืฉืื), (ื ืงืจื ืื ืงื ื (ืืืฉืื) ืืื ื ืื ืขืชื ืืื), 8. ืืื"ื ืืขืจื 9. ืืฆืื ืืืืื ืืืืื ืขื ืืจ ืืืง ืฉืืืืื ืืจืื ืืฉืืขืืชื ืฉืืื ืื ืืฉืืืืช, ืื ืืื ืืืืืจ ืืืืืืื ืื ืืืงืื ืืื ืืฆืืืื ืืืืืกืกืื. ืืคื ืฉืืืจ ืืืืจืชื, ืืฉ ืขืจื ืขื "ืจืฉืืืช ืืืืืื", "ืืืืืช ืืืงืืืืืช ืืืืืื ืืฉืืืจืื" , "ืืืื", "ืืืืช ืืืืืื", "ืืชื ืืืืช" . ืืืขืชื ืื ืืกืคืืง. ืื ืืกืืจ ืืืืจ ืฉืืืื ืขืจืืื ืขื ืืืืืื ืืชืืงืื ืืื ืื ืขื ืื ืืืื ืืืฉ ืฉืงื. ืืขืจืืื ืืืืืืจืื ืื ืขื ืืืฉืืืื ืฉืจืง ืืคื ื ืจืืข ืืืฉืจื ืืช ืืงืืชื ืืงืืื ื ืืฉืืฉืื ืื ืืื ื ืืืงืืื. ืืืืืจ ืืืืืืื ืฉืงืื ืื ืืืื, ืืื ืืกืฃ ืื ืฉืืื ืืฃ ืืงืืจ ืื ืืชืื ืืื ืื ืฉืื ืืจืื ืืื, ืื ืฉืืฉ ืืงืืจ ืืื ืืืชืื ืื ืขื ืืฉืคืื ืืืช ืื ืขื ืืฉืคืืืช ืกืคืืจืืช ืฉืืจืืช ืื. ืืขืืืื ืฉื ืืชื ืืืฉืืจ ืจืืฉืื ื ืืืงืื ืฉื ืืืฉืื ืื ืืกืคืืงื ืืื ืืชืช 'ืืฉืจ' ืืขืจื". ืืคื ืืืงืืจ ืืื ืืกืคืจ ืืืืืืื ืขืื ื 30 ืืคื ื ืืื ืฉืืขื ืืืืงืืืืจ ื-120 ื ืืื ืืืืื 2025. ืืคื ืืืงืืจ ืืื ืืชืืืืช 2023 ืขื ืืฆืืืจ 2025 ืงืื ื-140 ืืืืืื. ืื ืกืืืจ ืืขืื ื ืืืชืื ืขื ืื ืืื ืืื ืขืจื. ืื ืืกืฃ, ืืืืื ืืืื ืื ืืจืื ืืืืจืืข ืกืืจ ืฉืืื ืืฉืืืืช ืืืืืืืืช ืืืืฉืื ืขืชืืื. Mertaro โข ืฉืืื 21:40, 14 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืื ื ืืืฉื ืฉื ืืื ืืฉืื ืชืื ืืช ืืฉืืืืช ืขื ืื ืขืจื ืื ืคืจื, ืืืืืืช ืืฆืืจื ืืืคื ืืช ืืืืื ืืื ืืืื ืื. ืืืจืช ืืื ืืืืื ืืื ืืฉืืขืืช, ืืื ืื ื ืขืจื ืืคื ืื ืืื ืฉืืกืืืจ ืืืืงืช ืขืจืืื ืืืืื ืืฉืืืืช, ืจืื ืืืงืืคืืื:ืืฉืืืืช. ืชืืืจ โข ืฉืืื 16:05, 15 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืื ืืืื ืืจืื ืืื ืฆืจืื ืขืจื ื ืคืจื ืืื ืืื ืืื ืฉืืฉื ืืื ืืชืืื ืืื ืืืขื ืืืืืืื ืืืงืื ืขืจื ืืฉืืชืฃ ืืคื ืฉืื ืืืืจืื. ~2026-10124-31 โข ืฉืืื 02:18, 15 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืืืจืืช ืืืืืืื ืืชืงืืืืืช ืืืืจืืช ืืืืืืื ืฉื ืืืงืืืืื. ืืฉ ืืืืฆืืช ืืื ืืืฆืืืข? ืืื-ืืืืืฉื โข ืฉืืื 10:43, 17 ืืคืืจืืืจ 2026 (IST)ืชืืืื Migration to Parsoid Read this in another language Hello everyone! I am glad to inform you that as the next step in the Parser Unification project, Parsoid will soon be turned on as the default article renderer on your wiki. We are gradually increasing the number of wikis using Parsoid, with the intention of making it the default wikitext parser for MediaWiki's next long-term support release. This will make our wikis more reliable and consistent for editors, readers, and tools to use, as well as making the development of future wikitext features easier. If this disrupts your workflow, donโt worry! You can still opt out through a user preference or turn Parsoid off on the current page using the Tools submenu, as described in the Extension:ParserMigration documentation. There is more information about our roll-out strategy available, including the testing done before we turn on Parsoid for a new wiki. 
To report bugs and issues, please look at our known issues documentation and if you found a new bug please create a phab ticket and tag the Content Transform Team in Phabricator. Content Transform Team 02:42, 18 ืืคืืจืืืจ 2026 (IST)ืชืืืื ืืื ai ืืขืจืืื ืื ืฆืืงืืคืืืช ืฉืืื. ืคืืชืืชื ืืืืจืื ื ืคืจืืืืงืื ืืฉื Conceptual Skeleton Protocol, ืฉืืืืขื ืืฉืืืื ืืืขืจืืืช AI ืืฆืืจื ืืชืืื ืืขืจืืื ืื ืฆืืงืืคืืืช ืืจืืช ืงืืืจื ืืืืช ืืืืื. ืืคืจืืืืงืื ืืืื ืืขืืื ืืืืืจืื ืึพGitHub: https://github.com/hthr777-prog/Conceptual-Skeleton-Protocol ืืฉืื ืืงืื ืืฉืื ืืืงืืืื, ืืื ืืฉืืืข ืืื ืืืขืชืื ื ืืชื ืืฉืื ืื ืืืืื ืืช ืืคืจืืืืงืื ืืืกืืจืช ืืขืืืื ืขื ืขืจืืื ืื ืฆืืงืืคืืืื. ื ืืชื ืืืชืจืฉื ืืืขืจืืื ืืืืจืื ืื ืฉืืฆืจืชื ืืืืงืืคืืื ืืืืฆืขืืชื.ืงื ืืืก * ืฉืืื 12:49, 19 ืืคืืจืืืจ 2026 (IST)ืชืืืื |
======================================== |
[SOURCE: https://he.wikipedia.org/wiki/ืืื ื] | [TOKENS: 1428] |
ืชืืื ืขื ืืื ืื ืืจืืื ืืืืืื ืืจืืื ืืืืืื ืื ืืื ืื ืืื ืชืืื ืืืงืจ ืชืืืจืื ืฉืขื ืืื ื ืืื ืืื ืืฉืืืืฉืืื. "ืืื ืืื", ืืืกืืจืช ืืืจืืื ืืืืืื, ืื ืืืืื, ืื ืืืืื-ืืืจืืืืช, ืืืฉืืฉืืช ืืืงืฉืจ ืืกืืื, ืืืืืื ืื ืืฉืืืืฉ ืืฆืจ ืืืฆืื ืืืืื, ืืืฆืืื ืืืกืฃ ืืืืจืืช ืืืืืืืืช ืืืฉืืฉืืช ืืืกืืจืช ืคืจืงืืืงื ืืกืืืืช. ืืืงืจืื ืืืจืืื ืืืืืื ืขืืกืงืื ืืื ืืืชืจ ืืฉืืืช ืืืืืฆืจืืชื ืฉื ืืื ืืื ืืืืกื ืืืืืืื ืฉืืื ืขื ืืชืจืืืช ืื ืื ืืฉืืฉืื. ืืจืืื ืืืืืื ืืฆืืื ืช ืืืกืฆืืคืืื ื ืจืฉืืืช, ืืืืืืช ืืืืคื ืฉืืืชื ืืช ื"ืฉืืื" ืื"ืืืขืื" ืฉื ืืืฉืืื ืืืกืืจืช ืฉืื ืคืขืืืืช ืื ืืฉืืช, ืื ืชืืื ืขื ืืื ืื ืืฉื, ืืืืฆืขืืช ืืืงืจ ืื ืืชืื ืฉื ืืื ืืื ืืชืื ืืงืฉืจ, ืืืฉื ืชืืขืื ืืงืืืื ืืฉืืืืฉ ืื ืืื ืืื. ืืืงืจ ืืจืืื ืืืืื ืขืฉืื ืืขืกืืง ืืฉืคื ืืืช, ืืืกืืช ืืกืคืจ ืฉืคืืช ("ืืจืืื ืืืืืื ืจื ืฉืคืชืืช", "ืืจืืื ืืืืืื ืื-ืฉืคืชืืช" ืืื'), ืื ืขืฉืื ืืืชืืงื ืืืงืจ ืืื ืืื ืืืกืืจืช ืืื-ืชืืืืืช. ืกืงืืจื ืืืืกืฆืืคืืื ื ืืืจืืื ืืืืืืช ืืืืกืกืช ืขื ืขืงืจืื ืืช ืชืืืจืืืื ืืฉืื ืืืืืื ืืช ืืืกืคืงืืื ืืขืืงืจืืื ืืืืื: ืืขืงืืืช ืืืืืืืืืฆืื, ืืืืืื ืืืฉืื ื ืืืขื. ืืืคื ืฉืคืืช ืขืืืืืช ืืืจ ืืคื ื ืืืืื ืืืืจืืช ืืชืืืืืช ืืืื ืืช ืขืืฉืจื ืืชืคืงืืืืื. ืื ืืืื ืืืืฆืื ืืฉืืืืจ ืืืืืื ืืืฉืื ื ืืืืืขืชื ืืจืฉืช ืืืื ืืจื ื ื ืขืฉืื ืืืกืืจืืช ืฉืื ืืช ืืืืืืกืืช ืืฆืืื ืืจืืื ืืืืืื ืืชืืืืื ืฉืื ืื. ืกืืื ืืจืืื ืืืืืืืช ืงืืืืช ืืืื ื ืืื ืฉื ื ืกืืืื ืฉื ืืจืืื ืืืืืืืช: ืืจืืื ืืืืืื ืื ืืืง ื ืคืืฆื ืืืงืฆืืข ืืชืจืืื, ืื ืชืจืืื ืืืืจ ืฉื ืืื ื ืกืคืฆืืคื ืื ืืืกืฃ ืืื ืืื ื ืืจืฉ ืืฉื ืคืชืจืื ืืขืืืช ืชืจืืื ืืืืืืืช. ืืจืืื ืืืืืื ืืืืกืฆืืคืืื ื ืืืืกืฆืืคืืื ื ืืืจืืื ืืืืืื ืงืฉืืจื ืืชืจืืื, ืืขื ืคื ืจืื ืืืืื ืืื ืืืงืฆืืขืืช ืืืืื ืื ืืืืื ืืืื ืืืจืกืืืืืช ืืืื ืืชืจืืื ืื ืืืชื ืืกืคืจ ืืชืจืืื. ืชืืืจืืืช ืฉื ืืจืืื ืืืืืื ืจืื ืื ืืงืจืืื ื ืืกืคืช ืงืืฉืืจืื ืืืฆืื ืืื |
======================================== |
[SOURCE: https://he.wikipedia.org/wiki/ืกืค_ืคืืื ืืง] | [TOKENS: 3493] |
ืชืืื ืขื ืืื ืื ืกืค ืคืืื ืืง ืืืืฃ ืขืื ืืื ืืืืจืืืก ืคืืื ืืง (ืืืจืื ืืช: Josef Emanuel Hubertus Piontek;โ 5 ืืืจืฅ 1940 โ 18 ืืคืืจืืืจ 2026), ืืืงืืฆืืจ ืกืค ืคืืื ืืง (Sepp Piontek), ืืื ืืืืจืืื ืืืืื ืืจืื ื. ืคืืื ืืง ืฉืืืง ืืืขื ืืืืจื ืื ืืงืจืืืจื ืฉืื ืืขืืืช ืืืื ืืืืจืืจ ืืจืื, ืืื ืขืื ืืชืืืจ ืืืืคืืช ืืืื ืืกืืืื ืืขืื ืช 1964/65, ืืขืจื ืฉืฉ ืืืคืขืืช ืื ืืืจืช ืืจืื ืื ืืืขืจืืืช. ืืงืจืืืจืช ืืืืืื ืฉืื ืืืื ืคืืื ืืง ืืืืืื ืขื ื ืืืจืช ืื ืืจืง, ืืืชื ืืืื ืืืฉื 11 ืฉื ืื, ืืืืจืื ืืืืจื 1984, ืืื ืืืื 1986 ืืืืจื 1988. ืงืจืืืจื ืืฉืืงื ืคืืื ืืง ืฉืืืง ืื ืขืจ ืืืืขืืื VFL ืืจืื ืื ืืืจ, ืืืฉื ืช 1958 ืขืื ืืงืืืฆื ืืืืืจืช ืฉื ืืืืขืืื. ืืฉื ืช 1960, ืืจืื ืืงืืช ืืืื ืืกืืืื, ืืฆืืจืฃ ืคืืื ืืง ืืืจืืจ ืืจืื, ืฉื ืฉืืืง ืืืฉื 12 ืขืื ืืช ืืจืฆืืคืืช. ืืฉื ืช 1961 ืืื ืคืืื ืืง ืขื ืืจืืจ ืืจืื ืืืืืข ืืืจืื ื, ืืืืจ ื ืืฆืืื 2-0 ืืืืจ ืขื ืงืืืืจืกืืืืืจื. ืืขืื ืช 1964/65 ืืื ืคืืื ืืง ืขื ืืจืืจ ืืจืื ืืืืืคืืช ืืืื ืืกืืืื, ืืขืื ื ืืฉื ืืื ืื ืืืื ืืกืืืื ืืืืชื ืงืืืืช, ืืืืจ ืฉืืงืืืฆื ืืฆืืื ืืช ืืืื ื ืืืืื ืืืืชืจ ืืืืื, ืื ืคืืื ืืง ืฉืืชืฃ ืคืขืืื ืขื ืืืจืกื ืืืืจ ืืืืืก. ืืขืื ืช 1966/67 ืืชืืืืื ืคืืื ืืง ืืืจืื ืืชืืชืืช ืื ืืฆืื ืืืจืืื ืืืืืืจื ืืกืืื, ืื ืขืื ื ืืืืจ ืืื, 1967/68 ืกืืืื ืืืงืื ืืฉื ื, ืืืฉืจ ืคืืื ืืง ืืืงืืข ืืขืื ื ืื ืืจืืขื ืฉืขืจืื, ืฉืื ืืืืืืฉืื ืฉืื ืืขืื ื ืืืช. ืขื ืืขืื ืช 1969/70 ืืื ืคืืื ืืง ืฉืืงื ืืจืื ืงืืืข, ืื ืืืื ืืช ืืงืืื ืืืจืื, ืืืฉื ืช 1972 ืืืืืข ืขื ืคืจืืฉื ืืืฉืืง ืคืขืื, ืืืืจ ืฉืฆืืจ 278 ืืืคืขืืช ืืืื ืืืืงืืข 15 ืฉืขืจืื ืืืื ืืจืืจ ืืจืื. ืืคืืื ืืง ืฉืฉ ืืืคืขืืช ืืืื ื ืืืจืช ืืจืื ืื ืืืขืจืืืช, ืืืื ืืื ืืฉื ืื 1965 ื-1966, ืื ืืื ืืขืืื ืื ืืฉืชืชืฃ ืืืืจื ืืจ ืืืื. ืงืจืืืจืช ืืืืื ืืื ืืืืจ ืคืจืืฉืชื ืืื ืคืืื ืืง ืืืื ืืช ืืจืืจ ืืจืื, ืฉื ืืืื ืขื ืืฉื ืช 1975. ืืื ืืืืจ ืขืืืืชื ืืช ืืจืื ืืืื ืคืืื ืืง ืขืื ื ืืืช ืืคืืจืืื ื ืืืกืืืืจืฃ, ืฉื ืชืืื ืื ืืืจืช ืืืืื ืืขืื ื ืืืช ืืค.ืฆ. ืกื ื ืคืืืื. ืืชืงืืคื ืืืฉืืขืืชืืช ืืืืชืจ ืฉื ืคืืื ืืง ืืืืื ืืืืชื ืืื ืืฉื ืช 1979 ืขื ืืฉื ืช 1990, 11 ืฉื ืื ืืื ืืืื ืคืืื ืืง ืืช ื ืืืจืช ืื ืืจืง. ืคืืื ืืง ืืืื ืืช ืื ืืจืง ื-115 ืืฉืืงืื ืืื ืืืืืืื, ืืืฉื ืช 1983 ื ืืืจ ืื ืืืืื ืืขืื ื ืืื ืืจืง, ืืื ืืืืื ืืฉื ื ืฉื ืืืืื ืืืืืจืื ืืืจืื ืกืืงืจ, ืืื ืืฉืืจ ืืฉืื ืฉืืขืคืื ืขื ื ืืืจืช ืื ืืจืง ืืืืจื 1984, ืื ืืืืขื ืื ืืจืง ืืืฆื ืืืืจ, ืืืืืื ืขื ืืื ื ืืืจืช ืกืคืจื ืืืขืืืืช ืืืจืขื. ืืฉื ืช 1986 ืืขืคืืื ืคืืื ืืง ืืื ืืจืง ืืืื ืืืื 1986, ืฉืืื ืืืื ืืืื ืืจืืฉืื ืื ืืฉืชืชืคื ื ืืืจืช ืื ืืจืง, ืฉื ืืฆืื ืืช ืื ืฉืืืฉืช ืืฉืืงืื ืืฉืื ืืืชืื, ืืจืืืช ื ืืฆืืื ืืคืชืืข ืืชืืฆืื 2-0 ืขื ื ืืืจืช ืืจืื ืื ืืืขืจืืืช, ืื ืืืืื ืืืืจ ืชืืืกื 5-1 ืื ืืืจืช ืกืคืจื ืืฉืืื ืืช ืืืืจ. ืืฉื ืช 1988 ืืขืคืืื ืคืืื ืืง ืืื ืืจืง ืืืืจื 1988, ืื ืืืืื ืืฉืื ืืืชืื, ืืืชื ืกืืืื ืขื ืืคืก ื ืงืืืืช. ืืฉื ืช 1990 ืขืื ืคืืื ืืง ืืช ืชืคืงืืื ืื ืืืจืช ืื ืืจืง, ืืืืจ ืฉืื ืืฆืืื ืืืขืคืื ืืืื ืืืื 1990. ืืฉื ืช 2008 ื ืื ืก ืคืืื ืืง ืืืืื ืืชืืืื ืฉื ืืืืืจืื ืืื ื, ืืืืื ื ืฉื ื ืืืจืช ืืืืืืื ืฉื ืื ืืจืง ืืฉื ืืช ื-80. ืืืืจ ืขืืืืชื ืฉื ืคืืื ืืง ืืช ื ืืืจืช ืื ืืจืง, ืืื ืืื ื ืืฉื ืช 1990 ืืืืื ื ืืืจืช ืืืจืงืื, ืืืชื ืืืื ืืืฉื ืฉืืืฉ ืฉื ืื, ืืืื ืืื ืืื ื ืืืกืืืืช ืืืืจ ืืขืชืืื ืฉื ื ืืืจืช ืืืจืงืื, ืฉืืื ืืืฆืืื ืืฉืืื ืฉื ืืช ื-90. ืืืืจ ืืื ืืืื ืคืืื ืืง ืืช ืืืจืกืืกืคืืจ ืืืืจืงืืช ืืืฉื ืชืงืืคื ืงืฆืจื, ืืืืืจ ืืื ืืืจ ืืื ืืจืง ืืื ืืืื ืืช ืืืืืืจื ืืืืืกืคืืืงืืื, ืืืืจ ืื ืืช ืกืืืงืืืจื. ืืฉื ืช 2000 ืืื ืืืื ืืช ื ืืืจืช ืืจืื ืื ื, ืืขืื ืืช ืืชืคืงืื ืืฉื ืช 2002. 
ืืขืืืจ ืฉื ืชืืื, ืืืื ืฉืื ืืช ืืจืื ืื ื ืืชืงืืคื ืงืฆืจื, ืืคืจืฉ ืืืืืื. ืขื ืืืชื ืคืืื ืืง ืืชืคืจื ืก ืืืชื ืืจืฆืืืช. |
======================================== |
[SOURCE: https://he.wikipedia.org/wiki/ื'ืกื_ื'ืงืกืื] | [TOKENS: 3675] |
ืชืืื ืขื ืืื ืื ื'ืกื ื'ืงืกืื ื'ืกื ืืืืืก ื'ืงืกืื (ืืื ืืืืช: Jesse Louis Jackson;โ 8 ืืืืงืืืืจ 1941 โ 17 ืืคืืจืืืจ 2026) ืืื ืืืืจ, ืคืืืืืืงืื ืืคืขืื ืืชื ืืขื ืืืืืืืช ืืืืจื ืืืจืฆืืช ืืืจืืช. ืืื ืืื ืื ืืื ืืืื ืฉื ืืฉืืื ืื ืืฆืจื ืืืชืืืื ืขื ืืืขืืืืช ืืืคืืื ืืืืืงืจืืืช ืื ืฉืืืืช ืืจืฆืืช ืืืจืืช ื-1984 ืื-1988. ืืืืืจืคืื ื'ืงืกืื ื ืืื ืืฉื ื'ืกื ืืืืืก ืืจื ืก ืืขืืจ ืืจืื ืืืื ืฉืืืืื ืช ืืจืื ืงืจืืืืื ื. ืืื, ืืื ืืจื ืก, ืืืืชื ืืช 16 ืืืื ืืขืช ืืืืชื. ืืืื ืืืืืืืื, ื ืื ืืืืืก ืจืืืื ืกืื, ืืื ื ืฉืื ืืืืฉื ืืืจืช ืืื ืืื ืืขืืจื ืืืื ืื ื. ื'ืงืกืื ืืืืฅ ืืช ืฉื ืืืฉืคืื ืฉื ืืืื ืืืืจื. ื'ืงืกืื ืืื ืงืืืจืืจืืง ืืฆืืืื ืืชืืืื. .ืืืืฉื ืงืืื ืืืื ืืืจืกืืืช ืืืืื ืื ืืืืช ืคืืืืื. ืืืื ืืืืืื ืฉืื ืืืคืฉืจื ืื ืืฉืืง ืืขืืืช ืงืืืจืืจืืง ืืืื ืฉืืื ืฉืืืจ, ืืื ืขืืจ ืืืื ืืืจืกืืืช North Carolina A&T, ืฉื ืืคื ืืงืืืจืืจืืง ืืจืืฉืื, ืืชืืืื ืืฆืืืื ืืกืืฆืืืืืืื ืืืืืื ืื ืฉืื ืืืืืช ืืกืืืื ืืื. ื-1965 ืืื ื'ืงืกืื ืืืืืช ืคืขืื ืืชื ืืขื ืืืืืืืช ืืืืจื ืืื ืืืชื ืฉื ืืจืืื ืืืชืจ ืงืื ื ืืื. ืืฉื ืจืฆื ืงืื ื ืืืืคืืก ื-1968, ื'ืงืกืื ืืื ื ืืื ืืืงืื ืืืืจืืข. ื'ืงืกืื ืืขื ืืืื ื ืื ืงืื ื ืืช ืืืจืืขืืชืื, ืืืื ืืงืืจืืื ืืืืื ืฉื ืงืื ื ืืืืื ืกืคืง ืืืจืกื ืื. ืืืืจ ืจืฆื ืงืื ื ืืชืขืืช ื'ืงืกืื ืขื ืืืจืฉื ืืจืืฉืืช ืืชื ืืขื, ืจืืฃ ืืืจื ืชื, ืืคืจืฉ ืืืชื ืืขื ื-1971 ืืืงืื ืชื ืืขื ื ืคืจืืช. ื-1984 ืฉืื ืืืชืืืื ืืชื ืืขืืช ืื'ืงืกืื ืืืจ ืืืจืื ืืืืจื ืืฆืืืืจืืช. ืืกืคืืืืจ 1979 ืืืงืจ ืืฆื ืืืกืจ ืขืจืคืืช ืืืืืจืืช, ืืื ืืืฉืจืื. ืืื ื ืคืืฉ ืขื ืืื ืงืืืง, ืจืืฉ ืขืืจืืืช ืืจืืฉืืื, ืืืื ืืืืืฆืช ืฉืจ ืืืืฅ ืืฉื ืืืื, ืื ื ืืขืื ืขืื ื ืฆืืื ืืืืฉืื. ืืฉื ืืช ื-80 ืขืกืง ื'ืงืกืื ืืคืขืืืืช ืฆืืืืจืืช ื ืืจืฆืช ืืืื ืืืืจื ืืื ืืื ืืืื ืฉื ืืงืืืื ืืืคืจื-ืืืจืืงืืืช ืืฉื ืืชื ืืขื ืืืืืืืช ืืืืจื. ื-1983 ื ืกืข ื'ืงืกืื ืืกืืจืื ืืคืขื ืฉื ืืฉืืจืืจื ืฉื ืืืืืก ืืืืจืืงืื ืืฉืืื, ืจืืืจื ืืืืื, ืฉืืืืืง ืืืื ืกืืจืื ืืืืจ ืฉืืืืกื ืืืคื ืืฉืื ืืื ืื ืืฉืืื ืืืฉืืืช ืืคืฆืฆื ืฉื ืขืืืืช ืกืืจืืืช. ืขื ืฉืืื ืืืืฉืื ืืืื, ืืชืงืืื ืืืืื ืื'ืงืกืื ืขื ืืื ืื ืฉืื ืจืื ืื ืจืืืื ืืืืช ืืืื. ืืืืฉื ืชืจื ืืคืืคืืืจืืืช ืฉื ื'ืงืกืื ืืฆืืืืจ ืืืืจืืงืื ืืฉืืืฉ ืืืงืคืฆื ืืืชืืืืืืชื ืืืืืจืืช ืืืงืืืืืช ืื ืฉืืืืช ื-1984. ืืืืืจืืช ืืืงืืืืืช ืฉืงืืืื ืืืคืืื ืืืืืงืจืืืช ืืงืจืืช ืืืืจืืฅ ืื ืฉืืืืช ื-1984 ืืคืชืืข ื'ืงืกืื ืืืื ื-21% ืืงืืืืช ืืืฆืืืขืื, ืืืื ืืืืข ืจืง ืืืงืื ืืฉืืืฉื ืืืจื ืืกื ืืืืจ ืืจื ืืืจื ืืกืื ืื ืฉืื ืืฉืขืืจ ืืืืืจ ืืื ืืืื, ืฉืืื ืืืืขืืืืช. ืืจืืข ืฉื ืื ืืืจ ืื, ื'ืงืกืื ืืชืืืื ืฉืื ืืชืคืงืื ืืืขืื ืืืืืงืจืืื ืื ืฉืืืืช. ืืืชืืืืืืช ืื, ืืฆืืืชื ืืขืืจ ืืืฆืื ืืืชื ืืืืขืื ืืืื ืืืชืจ ืืืื ืืฃ ื ืื ื ืืืืืื ืืืืจืืื ืืืืื ืืืชืจ. ืื ื-1988 ืืฆืืืชื ืฉื ื'ืงืกืื ืืืืืจืืช ืืืงืืืืืช ืืืืชื ืืขืืจ ืืฆืืคืืืช ืืืื ืืื ื-6.9 ืืืืืื ืงืืืืช ืื-11 ืืืื ืืช. ืืืื ืืกืข ืืืืืจืืช ืฉื ื'ืงืกืื ืกืคื ืืืืืื ืืืฉืจ ืืื ืืืืก ืืืืื ืช ืืืกืงืื ืกืื ืขื ืืื ืืืืงื ืืืงืืงืืก, ืฉืืื ืืืืขืืืืช. ืืฉื ืช 2017 ืืฉืฃ ื'ืงืกืื ืื ืืื ืืืืื ืขื ืืืืช ืคืจืงืื ืกืื. ืืืื ืืืืงืืช ืฉืืืฆืข ืืืคืจืื 2025 ืงืืขื ืฉืืืืืจ ืึพPSP, ืืืื ืฉืืฉ ืื ืชืกืืื ืื ืืืืื ืืคืจืงืื ืกืื. ื'ืงืกืื ื ืคืืจ ื-17 ืืคืืจืืืจ 2026. ืื ื, ื'ืื ืชื ื'ืงืกืื, ืืื ืืืจ ืืืืช ืื ืืืจืื ืืืขื ืืืืื ืืจืืฉืื ืฉื ืืืื ืช ืืืืื ืื. ืืฆืข ืืืืืจืืช ืืฉืชื ืืขืจืืืช ืืืืืจืืช ืืืงืืืืืช ื'ืงืกืื ืืชืืืื ืขื ืืฆืข ืฉื ืืฉื ืืขืื ื ืจืืื ืืืืจืื ืืืื. 
ื'ืงืกืื ืืืจืื ืขื ืจืฆืื ื ืืืฆืืจ "ืงืืืืืฆืืืช ืงืฉืช" ืฉื ืงืืืฆืืช ืืืขืืืื ืฉืื ืืช, ืืืื ืืคืจื-ืืืจืืงืืื, ืืืกืคืื ืื, ืขื ืืื ืืืืืืกืงืกืืืืื, ืืื ืื ืืื ืื ืคืจืืืจืกืืืืื ืฉืืชืืืืื ืืืืช ืืงืืืืจืืืช ืืืื. ืืฆืข ืืืืืจืืช ืฉืื ืืื ืืช ืื ืืฉืืื ืืืืื: ืืืขื ืืกืขืืฃ ืื ืืืข ืืืจืื ืืคืจืืงื, ืืฃ ืืืช ืืขืืืืชืื ืื "ื ืฉื ื'ืงืกืื ืื ืืชืงืืื ืืืืง ืืืฆืข ืืืืืจืืช ืฉื ืืืคืืื ืืืืืงืจืืืช ื-1984 ืื ื-1988. ืคืขืืืืช ืฆืืืืจืืช ืืฉื ืืชืื ืืืืจืื ืืช ืืืืื ืฉื ืืช ืืืื ืชื ืฉื ืืื ืงืืื ืืื ืื ืฉืื ืืจืฆืืช ืืืจืืช, ืืคื ื'ืงืกืื ืืืขื ืืจืืช ืฉื ืื ืฉืื ืืกืืืข ืื ืืืฉืื ืืช ืชืืืืช ืืงืืืื ืืืคืจื-ืืืจืืงืืืช. ืงืืื ืืื ืืขื ืืง ืื'ืงืกืื ืืช ืืืืืืช ืืืืจืืช ืื ืฉืืืืชืืช โ ืขืืืืจ ืืืืื ืืืืื ืืืืชืจ ืฉืืืขื ืง ืืืืจืืื ืืืจืฆืืช ืืืจืืช. ืืืกื ืืืืืืื ื'ืงืกืื ืืื ืืืงื ืืืืงืืจืช ืืืืื ืฉืืฉืืืข ืืขืจืืช ืื ืืืข ืืืืืืื ืืืกืืืืืช ืืืืืืืช: ืฉืื ืฉืื ืจืืฆ'ืจื ื ืืงืกืื ืืื ืคืืืช ืงืฉืื ืืืขืืืช ืืขืื ื ืืืจืฆืืช ืืืจืืช ืืืืื ืฉ"ืืจืืขื ืืชืื ืืืืฉื (ืืืืขืฆืื ืืืืืจืื ืฉื ื ืืงืกืื) ืื ืืืืืื ืืจืื ืื ืืกืืจ ืืขืืืคืืืืช ืฉืืื ืืืื ืืช ืืืจืืคื ืืืกืื"; ืฉ"ื ืืืก ืื ืืฉืืืข ืขื ืืฉืืื"; ืฉืืฉื ื "ืขืืชืื ืืื ืืืืืื ืืขืืื ืืืื ืฉืืกืืืืื ืืืืืช ืืืืืืงืืืืืื ืืขื ืืื ืื ืขืจืืืื". ืื ืืกืฃ, ืืฉื ืช 1984 ืืชืืืืก ื'ืงืกืื ืืืืืืื ืืืื ืื Hymies ืืืขืืจ ื ืื ืืืจืง ืืืื ืื Hymietown ืืฉืืื ืขื ืืชื ืืขืืชืื ืืืฉืื ืืืื ืคืืกื. ืงืืฉืืจืื ืืืฆืื ืืื ืืขืจืืช ืฉืืืืื |
======================================== |
[SOURCE: https://he.wikipedia.org/wiki/ืืืืืช] | [TOKENS: 17867] |
ืชืืื ืขื ืืื ืื ืืืืืช ืืืืืืช ืืื ืจืฆืฃ ืืืกืืจืช ืืืืืฉืืืืช ืืืชืืช, ืืชืจืืืชืืช ืืืืฉืคืืืช ืืงืืืงืืืืืช ืฉื ืืืืืืื. ืืืจืฉืช ื ืจืืืช ืื ืืืืืช ื ืจืืืืื ืืืกืืืจืืืืจืคืืื ืืฉืืชืคืื ืืืืื ืืื ืฉื ืืช, ืืืื ืืชืจืืืช, ืฉืืชืคืชืื ืืงืจื ืขื ืืฉืจืื ืืื ืืจืืฉืืช ืืชืืืฉืืชื ืืืืืจ ืืืื ื ืืฉืืื ืืืืฃ ืืฉื ื ืืคื ื"ืก. ื ืืชื ืื ืงืื ืืืื ื ืืืืืช ืืืคื ื ืืฉืื ืืืจืฉืช ืื ืืื ืืืื ืืืช ืฉื ื, ืืคื ื ืื ืืืื ื ืืืชืืื ืืื ืื ื ืืฉืจืื ืืืช ืืฉืจืื. ืืืืืืช ืืืช ืืื ืืชื ืืืืื ืชื ืืืกืืจืชืืช ืฉื ืืืืืืื. ืืืื ืืช ืืืจืืืืช ืืืืชืืช (ืืืจืชืืคืจืงืืืช), ืืืืืกืกืช ืขื ืืขืจืืช ืฉื ืืฆืืืช ืืขืฉืืืช, ืืืื ืืช, ืืืืืช, ืกืืคืืจืื ืืืกืืืจืืื ืืขืงืจืื ืืช, ืฉืืงืืจืืชืื ืืืชืื ืืงืืืฉ ืืืืืืืื: ืืชืืจื ืฉืืืชื ืืืชืืจื ืฉืืขื ืคื. ืืช ืื ืืฉืคืืขื ืขืืืงืืช ืขื ืจืืฉืืช ืื ืฆืจืืช ืืืืกืืื, ืืืื ืื ืืืื ืืฉืชื ืืืชืืช ืืืืจืื ืืช ืืื ืืื ื ืืช ืืืกืืื ืจืืช ืืื ืืืจืกืืืช ืืื ืืช ืืชื ืืช. ืืื ืืืช ืืืืืืืื ืืืขื ืืชืคืชืืืืืช ืืคืืืืืื ืืื ืืืจื ืืืืกืืืจืื. ืืืืืืช ืืชืจืืืช ืืื ืืืืื ืืขืจืืื, ืืืืื ืืช ืืชืคืืกืืช ืฉื ืืืจื ืืคื ืฉืื ืืืืช ืืืื ืืืืื ืืฆื ืืืืืืื. ืืชืจืืืช ืืืืืืืช ืืืืืช ืืื ืฉืคืืช ืืืืืืืืช (ืขืืจืืช, ืืืืืฉ, ืืืืื ื, ืขืจืืืช ืืืืืืช ืืืจืืืช ืืืืืืช) ืฉืืื ืืืช ืืื ื ืืฆืจื ืกืคืจืืช ืขื ืคื; ืคืืืืกืืคืื, ืืืืช ืืืฆืืจื ืฉืืชืืชืื ืขื ืชืจืืืืืช ืืกืืืื; ืืื ืืขืจืืืช ืฉื ืืชื ืืืืืืช ืืืืกืืืืช ืืืจืชืืืช. ืืืืืืช ืืืืื ืืืืื ืืืงื ืืืืืืช ืขืืืจ ืืืืืื ืจืืื ืืจืืื ืืขืืื. ืืืงืจืื ืืกืืืืื ืกืืืจืื ืฉืขืื ืืขืช ืืขืชืืงื ืืชืงืืืื ืืื ืืืืืืื ืืจืืืืื ืฉื ืชืืืขื ืคืจืืื-ืืืืืืช. ืืฆืืื ืืช, ืืชื ืืขื ืืืืืืืช ืืืืืืืช ืืขืืงืจืืช ืืขืช ืืืืืจื ืืช, ืฉืืื ืืฉืจืื ืืืืืงื ืืืกืืจืชืืช ืืืืืืืช ืืืจืฅ ืืฉืจืื ืื ืืกืื ืืืชื ืืืื ืื ืืืืืืืืช ืืืืจืืคืืืช. ืืืื ืช ืืฉืจืื, ืืฉืจ ืงืื ืขื ืืกืืก ืืจืขืืื ืืฆืืื ื, ืืื ืืืื ืช ืืืืื ืฉื ืืขื ืืืืืื. ืืคื ืืกืืื ืืช ืืืืืืืช, ื ืืื ืืฉื ืช 2023 ืืืื ืืจืืื ืชืื ื-15.7 ืืืืืื ืืืืืื, ืืืืืืื ืึพ0.2% ืืืืืืืกืืืช ืืขืืื. ืืชืืื, 7.2 ืืืืืื ืืืืืื ืืืื ืืืฉืจืื. ืืงืืจ ืืฉื ืืฉื 'ืืืืืช' ื ืืืจ ืืฉืื ืฉื ืืืืืืื, ืชืืฉืื ืืืืืช ืืืืื ืืืืจืื ืคืืืืช ืืืื ืืชืงืืคื ืืคืจืกืืช, ืื ื ืืืข ืฉืื. ืืคื ืกืืคืืจื ืืืงืจื ืฉืื ื ืืืจ ืืฉืื ืฉื ืฉืื ืืืืื, ืฉื ืงืจื ืขื ืฉืื ืฉื ืืืืื ืื ื ืฉื ืืขืงื. ืชืืช ืืฉืืืื ืืจืืืื ื ืงืจืื ืืจืฅ ืื ืคืจืืืื ืงืื ืืืืืื. ืืืืืืช ืืืื ืืืืืืฆืจ ืืชืืืื ืฉืืื ืืืืืืช ืืืืื ืขื ืืจืคืืจืื ืฉื ืืืฉืืื (ืืืืื ืื ืืืงืืื), ื ืืฉื ืืืืืช ืืื ืืืืืข ืืืืจ ืืืืืฉื ืืคืืืืช ืืืื, ืืืื ืขืืจื ืืกืืคืจ. ืขืืจื, "ืกึนืคึตืจ ืึธืึดืืจ ืึฐึผืชืึนืจึทืช ืึนืฉึถืื", ืืฉืจ "ืึตืึดืื ืึฐืึธืืึน ืึดืึฐืจืึนืฉื ืึถืช ืชึผืึนืจึทืช ืึฐืืึธื ืึฐืึทืขึฒืฉึนืืช ืึผืึฐืึทืึตึผื ืึฐึผืึดืฉึฐืืจึธืึตื ืึนืง ืึผืึดืฉึฐืืคึธึผื", ืืืื ืืื ืื ืื ืฉืกืืื ืืช ืขืจืืืชื ืืกืืคืืช ืฉื ืชืืจืช ืืฉื. ืืกืคืจืืช ืื"ื ื ืืฆื ืืืืื ืืงืืื ืืืงืืจื: "ืืช ืืฉื ืืืืืืื" ืืฉืจ ืืืื ื ืืฆืืืฆื. ืืืืื ืืืื ื "ืืืืืช" ืืืืื ื ืืจืื, ืืืชืืจ ืืช ืืืืื ืขืืืื ืืืืืืืชื ืฉื ืืืื ืืืืืื, ืืื ื ืืฆืื ืืืืจื ืื"ื, ืืืื ื ืืืจ ืืืื ืืกืคืจืืช ืืชืืจื ืืช ืฉืืคื ื ืืขืช ืืืืฉื ืืื ื ืื ืืืคืืข ืืฉืคื ืืืืื ืืช (โแพฟฮฮฟฯ
ฮดฮฑฯฯฮผฯฯ; "ืืืืืืกืืืก") ืืกืคืจืืช ืืืืืืืช-ืืืื ืืกืืืช, ืืจืืฉืื ื ืืกืคืจ ืืงืืื ื', (ืืื ืื ืืืื ื "ืืื ืืกืืืื" ืืืื ืื ืืืืืืื ืืืืืฆืื ืืืืช ืืืื ืืช ืืืืคืืข ืืฃ ืืื ืืจืืฉืื ื ืืืืืืจ ืื). ืืืื ื ืืืคืืข ืืืืจ ืืื ืื ืืกืคืจ ืืงืืื ื', ืืืจืืช ืืืืฉื ืืืืชืืืืช ืืคืืืจืคืืืช ื ืจืื ืฉืืจืืฉืื ืฉืืฉืชืืฉ ืืืื ื "ืืืืืช" ืืืืคื ืฉืืืชื (ืงืืื ืืื ื ืืืจ ืืืื ืืืืงืฉืจ ืฉืืื ื ืืึพืืฉืืขื) ืืื ืื ืืื ืกืืื ืืจืืืืืื ืืก, ืืจืืฉืืช ืืืื ืืฉืืืฉืืช. ืืื ืขืฉื ืืืช ืืืขืื ืฉื ืื ืื, ืืืืจ ืฉื ืืงืง ืืชืืืจ ืืืืืืจ ืื ืืช ืืืืืืื ืืืืจ ืฉื ืืื, ืืฉืืืชื, ืืขื ืืฉืจืื ืื ืืืจ ืฉืืืฉืื ืืื ืกืืื, ืืืืกืืคื ืืืชืงืืื ืืืขืื "ืงืืืคื ืื ืืื ืช" ืืื ืืงืืฉ ืืงืจืื ืืช, ืืืฆื ืฉื "ืืจื ื ืื ืืื". ืืืืืช ื-18 ืื-19 ืืคื ืืืื ื "ืืืืืช" ืืฉืืื ืืคื ืืืืืื, ืืืฉืืื ืืืจืื ืืช. ืืืืกืื "ืืืืช ืืฉืจืื" ืืื ื ืืช ืืชื ืืืืืชื ืืชืืจ Judenthum (ืืืื ืืื), ืืืืืืื ืชืืจืื ืืขืืจืืช ืืืืืืืฉ ืืืืจื ืืืจืืคื. ืืืืื ื ืขืฉืชื ืจืืืืช ืืงืจื ืื ืืืืืื, ืืฉืืื ืืืืืคื ืืื ืืื ืืื "ืขื ืืฉืจืื" ื"ืชืืจื". ืฆืืื ื ืืจื ืืชืืืืืช ืขื ืืฉืจืื ืืืืืืืช ืขื ืคื ืืืกืืจืช, ืืกืื ืืืืืืช ืืืื ืืืืคืื ืืืื ืืชืืืกืืืช ืฉื ืืืจืื (1811 ืืคื ื"ืก ืื ืฉื ืช ื'ืชืชืงื"ื ืืืจืื ืืืืืื ืืืืืืืช ืืืกืืจืชืืช), ืืฉืจ ืขืื ืืืจืฅ ืืฉืจืื ืืืืจ ืืฉืืื ืืขืงืืืช ืฆืืืื ืืืืื. ืืืจืื ืืื ืื ืืจืืฉืื ืฉื ืฆืืืื ืืงืืื ืืจืืช ืืืื, ืฉืืคืื ืืืฆืืื ืืจืืืืช ืื ืืืื ืขื ืืืื. ืืคื ืืืชืื, ืืกืืฃ ืชืงืืคืช ืืืืืช ืืื ืื ื ืืฉืจืื 210 ืฉื ื ืืืจืฅ ืืฆืจืื, ืืืืืจ ืืื ืืฆืื ืืืฆืจืื ืืื ืืืช ืืฉื ืืืืชืขืจืืืช ืืืืืืช. ืืืจ ืกืื ื ืงืืืื ืื ื ืืฉืจืื ืืช ืืชืืจื, ืืืืืจ 40 ืฉื ืืช ื ืืืืื ืืืืืจ ืืืฉื ืืช ืืจืฅ ืื ืขื ืื ืืื ืืืชื. ืขื ืคื ืืืกืืคืจ ืืืงืจื, ืืืืจ ืืืืืฉ ืืืจืฅ, ืืชืืืฉืื ืื ื ืืฉืจืื ืืืจืฅ ืืฉืจืื ืืืฉื ืืื ืืืืช ืฉื ืื ืืื ืืืืื, ืืืื ืื ืืื ืืกืืืจืช. ืชืงืืคื ืื ื ืงืจืืช ืชืงืืคืช ืืฉืืคืืื - ืขื ืฉื ืืฉืืคืืื (ืืื ืืืืื) ืฉืขืืื ืืื ืืืื ืืช ืืขื ืืืืจ ืฉืฉืืขืื ืืขืื ืืืืืจ ืืืื ืืืจืืืื. ืืืืจ ืชืงืืคื ืฉื ื-390 ืฉื ื ืืืงืื ืืืจืฅ ืืฉืจืื ืืืืืช ืืฉืจืื ืืืืืืืช ืขื ืืื ืฉืืืื ืื ืืื, ืฉืืฉื ืืืื ืืช ืฉืืื. ืืืืจ ืืืช ืฉืืื ืืื ืื ืืืืืื, ืืชืืืคื ืฉืืฉืืช ืืืืืื ืืืืขืืจื ืืืื ืืืื, ืืืฉื ืืืื ืฉืืฉืืช ืืืช ืืื. ืืืืจ ืืื ืืื ืขื ืืฉืจืื ืื ื ืฉื ืืื - ืฉืืื ืืืื ืฉืืงืื ืืืจืืฉืืื ืืช ืืืช ืืืงืืฉ ืืจืืฉืื, ืฉืืื ืืืจืื ืืคืืืื ืืืืืื. ืืืืจ ืืืชื ืฉื ืฉืืื, ืืชืคืฆืื ืืืืืื ืืืืืืช ืืืืื, ืืืจืื ืืืจืฅ, ืืืืืืืช ืืฉืจืื, ืืฆืคืื ื. ืืจืืืืืืืื ืืืืกืืืจืืื ืื ืื ืกืื ืืืชืืงืืช ืืืจ ืืืชืืืืช ืฉื ืขื ืืฉืจืื ืืืืฆืขืืช ื ืืชืื ืฉื ืืืฆืืื ืืจืืืืืืืืื, ืืืกืืฃ ืฉื ืขืืืืืช ืืืฅ-ืืงืจืืืืช ืืฉื ืืืงืกื ืืืงืจืื. ืงืืืืช ืืกืืื ืจืืื ืืื ืืืืงืจืื ืืืื ืงืืืื ืืืืกืืืจื ืฉื ืฉืชื ืืืืืืืช, ืืฉืจืื ืืืืืื, ืืืื ืืฉ ืืืืืืื ืกืคืง ืืงืืืื ืฉื ืืืืืื ืืืืืืืช. ืขืงื ืืืกืจ ืืืชืืื ืืื ืืืืฆืืื ืืืื ืกืืคืืจื ืืืงืจื ืขื ืชืงืืคืช ืืืืืช, ืืฆืืืช ืืฆืจืื ืืืืืืฉ ืื ืขื, ืืขืงื ืืืขืจืื ืื ืื ืืืืชื ืืืืืช ืืชืืื ืืฉืืขืืชืืช ืืืจืฅ ืืฉืจืื ืืคื ื ืืืื ื-8 ืืคื ื"ืก, ืืงืืื ืืฉืขืจ ืื ืขื ืืฉืจืื ืืชืืืฉ ืืฉืืื ืืืืฃ ื-2 ืืคื ื"ืก ืืชืืืื ืืจืื ืฉื ืืืืื ืชืจืืืชื ืืคืืื ืืื ืืืืืืืกืืื ืืื ืขื ืืช ืืืื ืงืืืฆืืช ืฉืืืจื ืืื ืขื ืืืืจืื ืืขืืจ ืืืจืื ืืืืจืื. ืืืงืจื ืืืงืจื ืืืฆืืื ืจืืืื ืืชืืืื ืื ืื ืืืงืกื ืืืงืจืื. ืขื ืืืช, ืืงืืื ืืืืงืจ[ืืจืืฉ ืืงืืจ][ืืจืืฉื ืืืืจื] ืื ืืชืืืืจ ืืืืื ืฉื ืชืงืืคืช ืืฉืืคืืื ืขืฉืื ืืืชืืื ืืืืื ืืกืืืืช ืืืฆืืืืช ืฉืฉืจืจื ืืื ืขื ืืชืงืืคืช ืืืชืืืฉืืช ืฉื ืืขื, ืืืฉ ืืืงืจืื ืืกืืืจืื ืื ืืืชืื ืฉืืกืืคืืจืื ืขื ืืฆืืืช ืืฆืจืื, ืืืืืฉ ืืืจืฅ ืืืืืืื ืืืืืืืช ืืฉืืจืื ืืืืจืื ืืช ืืืกืืืจืืื ืฉื ืืืง ืืืงืืืฆืืช ืฉืืชืืืฉื ืืกืืคื ืฉื ืืืจ ืืขื ืืฉืจืื. 
ืืืืืืื ืืื ืืืจื ืืืจืืื ืืืืงืจ ืืืื ืืชืืืืจ ืืืกืืจืชื ืืฆืืืฆืืื ืืฉืืขืืชืืช ืกืืื ืกืืฃ ืืืื ื-9 ืืคื ื"ืก, ืืืืื ืชืงืืคืช ืืืืช ืืจืืฉืื ืืืืจืืืืช ืืื ืืืืืืช ืืฉืจืื ืืืืืื. ืืืืืช ืืฉืจืื ืืจืื ืืกืืืืืช ืฉื ืช 720 ืืคื ื"ืก, ืืืฉืจ ื ืืืฉื ืขื ืืื ืืืืืคืจืื ืืืฉืืจืืช, ืืจืืื ืืชืืฉืืื ืืืืื. ืืืจืืข ืื ื ืืืข ืืืืฉื ืืืืืช ืขืฉืจืช ืืฉืืืื. ืืืืืช ืืืืื ืืจืื ืืฉื ืช 586 ืืคื ื"ืก. ืืืืชื ืฉื ื ื ืืืืื ืืฆืจ ืืฉื ื ืืื ืืื ืืืฉ ืืช ืืจืืฉืืื ืืืืจืื ืืช ืืืช ืืืงืืฉ ืืจืืฉืื (ืขื ืคื ืืืกืืจืช ืืืืืืืช, ืืืช ืืืงืืฉ ืืจืืฉืื ื ืืจื ืืฉื ืช ื'ืฉื"ื (422 ืืคื ื"ืก)). ืจืืื ืืชืืฉืื ืืืืื ืืืืื ืืืื. ืืืืช ืืื ื ืืฉืื ืึพ70 ืฉื ื; ืืืืืื ืืงืฆืช ืืืื ืืื ืฉืื ืื ืืจืฅ ืืฉืจืื ืืื ืืืชืืื ืืื ืืืช ืืืช ืืืงืืฉ ืืฉื ื, ืืืืฉืืจ ืืื ืืืืืื ืืคืจืกืืช ืืฉืืืช, ืืืจืฉ. ืืฉื ืช 520 ืืคื ื"ืก ืืขืจื, ืืืงื ืืืจืืฉืืื ืืืช ืืืงืืฉ ืืฉื ื. ืืชืงืืคืช ืืืช ืฉื ื ื ืืชื ืืชื "ื, ืืืืืืืช ื ืืืงื ืืืกืคืจ ืืจืืื ืืจืืืืื, ืืื ืืคืจืืฉืื, ืืฆืืืงืื ืืืืืกืืื. ืืืืจ ืฉื ืื ืฉื ืฉืืืื ืคืจืกื ืืืืจ ืื ืืื ืืกืื, ืืจื ืืืืืื ืืืื ืืืงืืชื ืฉื ืืืืื ืืืืืืช ืขืฆืืืืช ืชืืช ืืืฉืืื ืืื ืกืืื ืฉื ืช 110 ืืคื ื"ืก. ืืืืฉื, ืืืฉื ืืจืืืืื ืืช ืืืืืื. ืืจืืืืช ืืืืื ืืจืืื ืืืืื ืืชืืืืชื ืฉื ืืืืช ืืจืืื. ืืืกืืจืช ืืืืืืืช ืืชืืืจืืช ืชืืืืช ืืืืืช ืืืืจืื ืืืช ืืืงืืฉ ืืฉื ื, ืืืืื ืืืจื ืืืืื (70 ืืกืคืืจื). ืืจืืืืื ืืจืกื ืืช ืืจืืฉืืื ืขื ืืืกืื, ืืืืจืื ืืืืืื ืืืจืฅ ืขืืจ ืืืื ื. ืืืกืื ืืืืื ืืื ืคืงื ืืช ืืืืฉืื ืืืืืื ืืืจืฅ ืฉืืฉืื ืฉื ื ืืืืจ ืืืืจืื, ืืขืงืืืช ืืจื ืืจ ืืืืื. ืืืืืคืจืื ืืจืืืืช ืืืกืื ืืกืืคื ืฉื ืืืจ ืืช ืขืฆืืืืช ืืืืืืื ืืืจืฅ ืืื ืืช ืืจืื ืืืืืื ืื. ืืืืจ ืืื ืืชืคืืจื ืืืืืืื ืืื ืงืฆืืืช ืชืื, ืืืืฉืืื ืืฉืืืจ ืขื ืงืฉืจ ืืืืฆืขืืช ืืืจืืช ืืฉืืืืื ืืฉืจ ืชืืขืื ืืืขืื ืืช ืจืืฉืืืืื ืขื ืืืชื. ืืืืื ื ืืืืืืืช ืขื ืคื ืกืคืจ ืืจืืฉืืช ื ืืจื ืืขืืื ืืฉืืฉื ืืืื. ืื ืืกืฃ ืืื ืืชืืืจ ืืื ืฉืืืขื ืฉืื ืฉืืช ืืื ืืืืืืชื. ืื ืืื ืืฉื ืื ืืืื ืืขืืจื ืืชืืืื ืืฉื ืช ืืจืืืช ืืขืืื (ืฉื ื ืงืืื ืืจืืืช ืืื ืืจืืฉืื) ืืคื ืืืขื ืืืงืืืืช, ืฉื ืกืืืช ืขื ืืฉืืื ืืืืคืืข ืืกืคืจ ืกืืจ ืขืืื. ืืฉื ืื ื ืื ืืช ืืกืคืจืืช ืขืืจืืืช, ืืืคืืื ืฉื ืช ื'ืชืฉืค"ื ืืื ืืฉื ื ืึพ5786 ืืืจืืืช ืืขืืื, ืขืึพืคื ืืืฉืืื ืื "ื (ืืืืช ื' ืฉืืคื ื ืืืจืฉ ืืฉืืขื ืืืฉืช ืืืคืื). ืืคื ืืืชืืืจ ืืกืคืจ ืืจืืฉืืช, ื ืืจื ืืืื ืืืจืื ืืืื ืืฉืืฉื, ืืืชืจ ืืืจืืื ืืืืฉืืืืชื ืืืชืืื ืชื (ืืฆืื ืืืืื) ืืืืืจ ืืจืืืชื ืืื ืืื ืืฉืืืขื ืืื ืื ืืื (ืืื ืืฉืืช). ืืืืจ ืืื ืขืฅ ืืืขืช ืืืจืฉ ืืืื ืืื ืขืื, ืืื ืืชืืืื ืฉืืฉืืช ืืืื. ืขื ืคื ืืืง ืืืืื ืืคืจืฉื ื ืืืืืืช, ืืืื ืืจืื"ื, ืกืืคืืจ ืืืจืืื ืืื ืืขืื ืืื ืืฉื ืืืืืจื-ืกืืืืืื ืื ืืื ืืกืื "ืืขืฉื ืืจืืฉืืช", ืืืื ืืืืื ื ืืคืฉืืื. ืืืืืืช ืืื ืืขืจืืช ืืืืกืกืช ืฉื ืืืืืืช ืืื ืืืชืืช ืืืจืืช ืืืื ืืืืืฉื ืืช ืืืืืืื ืืืขืฉืืื, ืื ืขืืืื ืืฉื ื ืขืืงืจื ืืืื ื ืืงืืืืื ืฉื ืืกืื ืืกืคืจ ืคืขืืื ืขื ืืื ืืืืื ืืืืจืืช. ืืืคืืจืกืืช ืืืืชืจ ืืื ืจืฉืืืชื ืฉื ืืจืื"ื ืืืืืืช ืฉืืืฉื ืขืฉืจ ืขืืงืจืื, ืืืจืืฉื ืืืืื ื ืืื ืืื, ืืืืื ื ืืืืืชื ืืืคืฉื ืืื ืืฉืื, ืืืืื ื ืื ืืืื ืืืืืื ื ืืขืืืื ืืชื ืื ืืืืืช ืฉื ืชืืจืช ืืฉื. ืืื ืืขืืงืจืื ืื ืืกืคืื ื ืืฆืืืช ืืืืื ืืช ืืืฉืื, ืืขืืื ืืื ืืืชืืืืช ืืืชืื. ืืื ืืืืืืืื ืืืืืืื ืืืืชืจ ืขื ืืืืื ื ืืืืืืืช, ืืืืื ืฉืืื ืคืกืืง ืืชืืจื, ืืื "ืฉืืข ืืฉืจืื, ื' ืืืืืื ื, ื' ืืื". ืืืืืื ืืฉืชืืฉื ืืืฉืคื ืื ืืืืืื ืืืืื ื ืืืื ืืชืืืืกืืืช, ืืืืฆืขื ืืืืืื ืืคื ื ืืืืืื ืืืจืื, ืืืฉืืืขื ืืืชื ืืืืช ืืืืงืืชื ืืืืืืช, ืืืฃ ืืขืช ืืืืช ืขื ืงืืืืฉ ืืฉื. ืขื ืืฃ ืืฉืืืืชื, ืืื ืื ืืืฆืืจืช ืืืฉืคื ืืื ืืื ืืืชืืืืจ. 
ืืฉืืื ืื ืืฉ ืืืกืืจืช ืืืืืืืช ืืงืืืื ืืขืืงืจื ืืืืื ื ืืกืืืจืื ืื ืฆืจืืช ืืืจืืื ืืืืื ืจืืื ืืืื ืื ืชืฉืืื ืืึพืืฉืืขืืช: ืืขืื ืฉืืืื ืืื ืืืื ืืื ืืื ืืจืื"ื ื ืืกืื ืขืืงืจืื ืืืื ืืืฃ ืืืจืืื ืื ืืฉืืืืื ืืืชื ืืืฆืืื ืืืื ืืฉืจืื, ืืกืคืจืืช ืืจืื ืืช ืขืกืงื ืชืืื ืืืชืจ ืืืงืืืงื ืืืื ืืืฉืจ ืืืืืจืืช ืืืื ืืช, ืืืขืช ืืืืฉื ืงืื ืจืืื ืฉืชืืืจื ืืช ืืืืืืช ืืืช ืืืืชืืช ืืืฉืคืืืช ืืืื ืฉืืื ืื ืขืืงืจืื. ืืืืื ืืืืืืืช ืืชืืจื ืฉืืืชื ืืืืืืงืช ืืืืฉื ืืืงืื ืืืืื ืื "ืืืืฉืื", ืืืืื "ืืืฉื ืืืืฉื ืชืืจื", ืืืชืืจื ืฉืืขื ืคื - ืืืฉื ื ืืืชืืืื, ืืืืืืช ืืช ืืืืจืืืช ืืืืืชืืืช ืฉื ืืืืืืช ืืจืื ืืช, ืืืฆืืืช. ืขืืงืจ ืกืคืจืืช ืืงืืืฉ ืืืืืืืช ืืื ืืืื ืืคืืจืืฉ ืฉื ืืืฆืืืช. ืืืจื ืืชืืืช ืืชืืืื ื ืืชืื ืกืคืจื ืืืื ืืืืืืื ืืืืืช ืืื ืืืื ืืืื (ืฉืื ืืื ืืชืืืื ืขืฆืื ืืืืื ืื "ืืืืชืืช" - ืืืืื ื ืกืืคืืจืื ืืขืื ืืืจืื ืฉืืื ื ื ืืืขืื ืืืืื) ืืื "ืกืคืจ ืืืืืช ืืืืืืช" ืฉื ืืชื ืืชืงืืคืช ืืืืื ืื, ืืกืคืจื ืืืืื ืืืืืขืื ืฉื ืืชืื ืืชืงืืคืช ืืจืืฉืื ืื, ืกืคืจื ืืจื"ืฃ, ืกืคืจ "ืืฉื ื ืชืืจื" ืืืจืื"ื, "ืคืกืงื ืืจื"ืฉ", ืืกืคืจื ืื ืืืข ืฉื ืื ื ืจ' ืืขืงื "ืืขื ืืืืจืื" ืฉืืขืืช ืืืื ืืกืชืืืืช ืขื ืืืจื ืืชืืืื ืืืืืืงืืชืืื ื ืืฆืจื ืขืงื ืคืืจืืฉืื ืฉืื ืื ืื. ืืืืขืืจ ืืื ืื ื ืืื (ืขื ืคื ืืืกืืจืช, ืื ืืืื ืคืกืงื ืืจืืฉืืช ืืื ืืืช ืฉื ื), ืืื ืืื ืืืชื ืืื ืืจื, ืืืืจืื ืขื ื ืืืื ืืงืืืื, ืขื ืื ืืืชื ืืจืืื ืืช ืืขื ืคืกืืงืช ืืืื. ืงืืื ืื ืืฉืคื ืืชื, ืื ืืฉืื ืืืฉืืช, ืืืืจ ืืืืื ืืช. ืืืื ืงืืื ืืกื ืืืจืื ืืืืชื ืื ืกืืืืช ืืืงืืช ืื ืื ืืฉืื ืคืืืืื. ืื"ื ืื ื ืืชืืจื ืชืจื"ื (613) ืืฆืืืช, ืืืืืืงืืช ืืจื"ื (248) ืืฆืืืช "ืขืฉื" ืืฉืก"ื (365) ืืฆืืืช "ืื ืชืขืฉื". ืืืื ืืืื ืืื ื ืขืฉื ื ืืกืืื ืืช ืืืืืงื ืฉื ืืืฆืืืช. ืืฉื ื ืืืืงื ืจืืืืืช ื ืืกืคืช, ืืื ืืฆืืืช ืฉืืื ืืื ืืืืจื ืืืฆืืืช ืฉืืื ืืื ืืืงืื. ืืืช ืืชืคืืื ืืืืืื ื ืงืจื ืืืช ืื ืกืช. ืืชืคืืืื ืื ืืช ืฉืืืฉ ืืชืคืืืืช ืืืืืืืืืืช (ืฉืืจืืช, ืื ืื ืืขืจืืืช). ืืฉืืชืืช ืืืืืื ืืืชืื ืืืง ืื ืืกื ืืชืคืืื ืืขื ืืื ื ืฉื ืืืื ืื ืืกืคืช ืชืคืืืช ืืืกืฃ. ืืืื ืืืคืืจ ืืฉื ื ืื ืชืคืืื ืืืืฉืืช, ืชืคืืืช ื ืขืืื. ืืืื ืฉื ื ืืืืืฉื, ืืฉืืชืืช, ืชืขื ืืืช ืืืืขืืื ืงืืจืืื ืืชืืจื. ืืคื ืื"ื ื ืงืจื ืืืช ืืื ืกืช ืืฉื "ืืงืืฉ ืืขื", ืขื ืฉื ืืืช ืืืงืืฉ ืฉืขื ืืืจืื ื ืืืืื ืืจืื ืจืืื ื ืืขื ืืฉืจืื ืืขื ืฉื ืขืืืืช ืืงืจืื ืืช ืฉืืืืขืืจ ืืืช ืืงืืฉ ืืืืจืช ืืชืคืืื ืืืืื ื ืื ืขืืืืช ืืชืคืืื. ืขื ืืฃ ืฉืืืืืืช ืืื ื ืืช ืืืกืืื ืจืืช, ืื ืฉืื ืฉืืื ื ืืืืืื ืืืืืื ืืืฆืืจืฃ ืืืืื ืืืืืืืช ืืืืจ ืฉืขืืจื ืชืืืื ืฉื ืืืืจ ืืืืื ืงืืืช ืขืื ืืืฆืืืช. ืื ืืืืจ ืฉืืื ืืืขืื ืืื ืืืฆืืจืฃ ืืืืืืช ืืฆืืืจ ืขื ืื, ืขื ืืืืื ืื ืืืืืช ืืืชื ืืกืคืจ ืคืขืืื ืขื ืื ืช ืืจืืืช ืฉืืื ืืืืช ืจืฆืื ื, ืจืง ืืืืจ ืฉืืืชื ืืื ืฉื ืืืืช ืืืื ืืืืจ ืฉื ืืื ืืืืืื ืฉืจืฆืื ื ืืืืชื - ืื ืืชืืคืฉืจ ืื ืืืชืืื ืืชืืืื ืืืืืจ. ืืืืืืช ืจืืื ืื ืืื ืฉืืื ื ื ืืืืื ืืขื ืชืคืงืื ืจืืื ื ืืขืืื, ืืืขืงืืืช ืื ืืืืืช ืขื ืื ืื ืฉืืื ื ื ืืืืื ืืืืื ืืงืืื ืฉืืข ืืฆืืืช ืืืืื ืืช "ืฉืืข ืืฆืืืช ืื ื ื ื" (ืืื ืืืืกืืจ ืืขืืื ืขืืืื ืืจื ืืื ืืฆืืืช ืฉืื ืืช ืฉ"ืืื ืืื ืืืืจื"). ืืืืืืช ืจืืื ืืืืืื ืื ืงืืืงืืื ืืืืื, "ืืื ืืฉืจืื", ืืคื ืฉืืืืืฉื ืืชืืจื ืืช ืืืืจื "ืืืชื ืชืืื ืื ืืืืืช ืืื ืื ืืืื ืงืืืฉ" (ืฉืืืช ื"ื, ื'), ืฉืืฉืชืืืืืช ืื ืืืชืืื ืืช ืืืืืช ืืืื ืืชืืื ืืคืืืืื, ืืืืฉื ืืขืื. ืืฆื ืืืจืืช ืืืื ืืืฆื, ืืืื ืืืืคื ืชืืืจืื ืื ืืื ืืืชืืืืจ ืืืืชืงืื ืืชืื ืขื ืืฉืจืื. ืืืืื ืืฉืืื ืืืงืจื ืฉื ืืืืจ ื ืืฆื ืืืืืืช ืจืืช ืื ืืชืืืจ ืืืขืืจ ืืืคืื ืฉืขืืฉื ืจืืช ืืืืืจื ืื ืขืื: "ืขืื ืขืื ืืืืืื ืืืื" (ืืืืืช ืจืืช, ืคืจืง ื', ืคืกืืง ื"ื). 
ืืืจืืช ืฉืืคืฉืจ ืืืฆืืจืฃ ืืขื ืืฉืจืื ืืืืฆืขืืช ืืืืจ, ืืื ืืืืืื ืืจื ืืืคืื ืืื ืืืืื, ืืื ืงืฉืจ ืืืืื ืชื ืืืืงืคืืชื ืขื ืืืืื. ืืืืืืช ืืื ื ืืช ืืคืืจืฉ ืืื ื 'ืืืฆื ืืฉืืื' ืื 'ืืืืื ืืืืจ', ืืืช ืืื ืฉื ืืืื ืื ืืชืงืื ืืืืช ืืื ืืื ื 'ืชืื ืืง ืฉื ืฉืื', ืืื ืื ืืคืฉืจ ืืืคืื ืืืื, ืืื ืื ืืืืฉ ื ืืื ืืืืื (ืฉืืื ืืืืืื) ืืืื ืื ืืื ืืจ. ืืืฉืื ืืืืจื ืืกื ืืืจืื "ืืฉืจืื ืืฃ ืขื ืคื ืฉืืื - ืืฉืจืื ืืื". ืืืช ืืืฉืืืืช ืืื ืืื ืฉืืืืื ืฉืืื ื ืฉืืืจ ืชืืจื ืืืฆืืืช ืืืืงืฉ ืืฉืืืจ ืืช ืื ืืื ืืืช ืืืฆืืืชืื, ืืื ื ืฆืจืื ืืืชืืืืจ ืืืืื ืคืฉืื ืืืชืืื ืืฉืืืจ ืชืืจื ืืืฆืืืช, ืื ืืืืืื ืืืจืืช 'ืืืืืจ ืืชืฉืืื'. ืืืคืืื ื ืืชืจืืืช ืืืืืืืช ืืืืืืช ืืฉ ืืื ืืชืื ืงืืืฉ ืขืืงืจืืื ืื ืืฉืืื ืืืกืืกื ืืืชื: ืืืืืจื ืฉื "ืงืืืฉ" ืืื ืื ืขื ืื ืืกืคืจืืช ืืจืื ืืช ืืื. ืงืืืืื ืขืื ืกืคืจืื ืจืืื, ืืืฉืคืืขืื ืื ืืื ืืืืืชื. ืืงืืื ืืืืงื ืืกืคืจืืช ืืจืืฉืื ืื ืืืกืคืจืืช ืืืืจืื ืื, ืืฉืื ืงืืืฆื ืืชืืืงืช ืืกืคืจืืช ืืืื (ืืืื ืืืืืืจืื ืืืืืชืืื ืืืฉืคืืขืื ืืื ืืชืื ืืืืื ืื ืืืจืื"ื ืืขื ืืืื ื), ืืืืืฉืื ืขื ืืชื "ื, ืืืืืฉืื ืขื ืืชืืืื, ืืกืคืจืืช ืงืืื, ืคืืืืกืืคืื, ืืืืช ืืืืฉืืช ืืฉืจืื ืืขืื. ืฉื ื ืกืืืื ืืืืืืื, ืื ืื ืกืืืื ืฉื ืืืื ืช ืืฉืจืื - ืืื ืืื ืืื ืืจืช ืฉืืขืช ืืงื ืื. ืืื ืืจื ืืชืืืจืช ืืชื "ื ืืื ืืจื ืฉืขืืื ืชืืืื ืืืฉืื ืืืืืจ ืืื ืืืืช ืืืงืืฉ. ืืื ืืืคืืขื ืืกืื ืืืืื ืืืจ ืืชืงืืคืืช ืงืืืืืช. ืชืืืื ืืืืืช ืืื ืืจื ืืืคืืข ืขื ืฉืขืจ ืืืืืก ืืืชืืจ ืืช ืืืจืืช ืืื ืืืช ืืืงืืฉ ืืืืจ ืืืจืื ื ืืืืืชื ืืจืืื. ืืื ืื ืืชืืื ืืืืช ืืื ืกืช ืขืชืืง ืืืจืืื ืขื ืจืฆืคืช ืืคืกืืคืก ืฆืืืจ ืื ืืจื ืืขืืื ืืืืชืื "ืฉืืื ืขื ืืฉืจืื". ืืื ืืื ืืื ืืงืกืืจืื - ืฉืจืืื ืฉื ืืืื ืืขื ืฉืืฉื ืงืืืงืืืื, ืฉื ืจืื ืืฉื ื ืืฉืืืฉืื ืฉืืื ืืื ืืื ื ืืืืืคื ืขื ืืืืจ. ืืื ืกืื ืขืชืืง ืฉืฉืืืฉ ืืขืืจ ืขืืื ืฉืื ืื, ืืืื ืืขืช ืืืืฉื ืืื ืืคื ืืกืื ืืืืืื ืขื ืืืืืืื. ืืืขื ืืคืืืชื ืืกืื ืืืืื ืืืกืืื ืืื ืืื ื ืืจืืจืื. ืืื ืืกืืก ืืจืืจ ืืืืชืืก ืืื ืฉืกืื ืื ืืื ืกืืื ืฉื ืืื ืืืื. ืืืื ืืขืืจื ืืื ืืื ืืฉื ื ืืืฉืืฉ ืืช ืืืืืืช ืืืื ืืื ืืฉืืื ืืจืื-ืฉืืฉื: ืืืืฉืื ืืืืกืกืื ืขื ืืืืฉื ืืืจื, ืื ืืื ืืืชืื ืืฉื ืช ืืฉืืฉ ืขื ืืื ืืืกืคืช ืืืืฉ ื ืืกืฃ ืืืช ืืืกืคืจ ืฉื ืื, ืืื ืฉืืืืื ืืฆืื ืชืืื ืืืืชื ืชืงืืคื ืฉื ืืฉื ื. ืืืื ืื ืืื ืืืื ืืชืืฉื ืื ืืืื ืืฉืืืข ืืงืืืขืชื ืืืืืกืช ืืืื ื ืฉืืื ืืืื ืืจืืืขืืช ืืกืคืืจื. ืืืื ืืืงืืืฉ ืืฉืืืข ืืื ืืฉืืช, ืืืื ืืฉืืืขื ืืฉืืืข, ืฉืื ืืฆืืืื ืืฉืืืช ืืื ืื ืืืืืื, ืืืืชื ืืฆืืื ืื ืืชืคืืืืช ืืืืืืืช, ืกืขืืืืช ืืืืืืืช, ืืืงืกืื ืฉืื ืื: ืืืืงืช ื ืจืืช ืฉืืช ืืขืจื ืฉืืช, ืงืืืืฉ ืืืืื ืืืืื ืืืืืื ืืฆืืช ืืฉืืช. ืืืขืื ืืฉืจืื ื ืืืงืื ืืืืขืืื ืืงืจืืืื ืืืืขืื ืื"ื. ืืืขืืื ืืงืจืืืื: ืจืืฉ ืืฉื ื (ืืื ืืื ืืืชืืจื ืืืื ื ืืกืฃ ืฉืชืืงื ื ืื"ื), ืืื ืืืืคืืจืื, ืฉืืืฉ ืืจืืืื: ืคืกื (7 ืืืื), ืฉืืืขืืช (ืืื ืืื), ืกืืืืช (7 ืืืื) ืืฉืืื ื ืขืฆืจืช (ืฉืืืช ืชืืจื) (ืืื ืืื). ืืืขืื ืื"ื: ืคืืจืื (ืืื ืืื ืฉืชืืจืืื ืืฉืชื ื ืืื ืขืจื ืืคืจืืื ืืขืจืื ืืืงืคืืช ืืืื) ืืื ืืื (8 ืืืื), ืืื ืืืฉ ืืชืขื ืืืช: ืชืฉืขื ืืื, ืฉืืขื ืขืฉืจ ืืชืืื, ืขืฉืจื ืืืืช, ืฆืื ืืืืื ืืชืขื ืืช ืืกืชืจ. ืืฉื ื ืืืื ืืืืื ื ืืกืคืื, ืืืืืื ืื ืจืืฉ ืืืืฉ ืขืืจื, ื"ื ืืฉืื ืื"ื ืืื. ืืืื ื-12 ืืชืคืชื ืื ืืื ืื ืืืืื ืืช ื"ื ืืขืืืจ. ืืืืจ ืืื ืื ื ืฉื ืืืื ืช ืืฉืจืื ืืืกืืคื ืืจืื ืืช ืืจืืฉืืช ืืืฉืจืื ืืจืฉืืืช ืืืืขืืื ืื ืืช ืืื ืืขืฆืืืืช, ืืื ืืืืืจืื ืืืืื ืืขืจืืืช ืืฉืจืื, ืืื ืืืืืจืื ืืฉืืื ืืืืืืจื ืืืืืจ ืืืืืช ืฉืฉืช ืืืืื ืืฃ ืืช ืืื ืืจืืฉืืื. ืืืจืืช ืืืช, ืื ืืื ื ื ืชืคืกืื ืืืขืื ืืฉืืขืืช ืืชืืช ืืขืื ื ืจืื ืืฆืืืืจ ืืืจืื. 
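ืืืขืฆื ืื ืืฉืื ืฉืชืืืจ ืืขืื - ืืืืฉืื ืืืืกืกืื ืขื ืืืืฉื ืืืจื, ืืืฉื ืื ืืืชืืืช ืืฉื ืช ืืฉืืฉ ืืืืฆืขืืช ืืืกืคืช ืืืืฉ ื ืืกืฃ (ืืืจ ื') - ืืืืฉ ืืืืื ืืขืืจื ืืงืืืข ืืืืืจ ืื 19 ืฉื ืื, ืฉืื ืฉืืข ืฉื ืื ืืขืืืจืืช. ืืื ืกืจืืื ืืื ืืื ืืืืืงืช ืฉื ืช ืขืืืืจ ืืคื ื ืืกืื ืืืืืขื ืืงืืื ืืื; ืฉืืืช ืืคืื ืงืฆืื ืืืืื ื ืื ืืชืืืืจ ืืืื.

```python
def is_hebrew_leap_year(year: int) -> bool:
    """ืืื ืฉื ื ืขืืจืืช ื ืชืื ื ืืื ืฉื ื ืืขืืืจืช (ืืช 13 ืืืืฉืื)?
    ืืืืืจ ืื 19 ืืฉื ืื, ืืฉื ืื 3, 6, 8, 11, 14, 17 ื-19 ืืขืืืจืืช;
    ืื ืืกืื ื ืืืืืจืืชืืช ืืืืขื ืืื.
    """
    return (7 * year + 1) % 19 < 7

# ืืืืื: ื'ืชืฉืค"ื (5784) ืืืชื ืฉื ื ืืขืืืจืช, ื'ืชืฉืค"ื (5785) ืคืฉืืื
assert is_hebrew_leap_year(5784)
assert not is_hebrew_leap_year(5785)
```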
ืืืงืื ืืืงืืืฉ ืืืืืืช ืืื ืืจืฅ ืืฉืจืื, ืฉื ืงืจืืช ืืืกืืจืช ืืืืืืืช ืื ืืฉืืืช "ืืจืฅ ืืงืืืฉ", "ืืจืฅ ืืืืื" ืืืชืืจืื ืืืืืืื ื ืืกืคืื. ืขื ืคื ืืืืื ื ืืืืืืืช, ืืื ื ืชื ืืช ืืืจืฅ ืืื ืืขื ืืฉืจืื ืขื ืื ืช ืฉืืืืื ืืงืืื ืื ืืืืื ืืืื ืืืืืืช. ืืืืื ืงืืืืช ืืฆืืืช ืืืฉืื ืืจืฅ ืืฉืจืื, ืืขื ืคื ืืืง ืืคืืกืงื ืืืืื ืืฃ ืืฉ ืืืืื ืข ืืืฆืืื ืืืืจืฅ ืืืืจืืช ืฉืืื ื ืืืจืืืืช. ืืืื ืฉืืืช ืืงืืช ืืืืื ื ืืืืืืืช ืืืืื ื, ื ืืืงื ืืคืืกืงืื ืืื ืืื ืืจืืืื ืืืงืืชื ืืฆืืื, ืืืื ืืืชื ืืืื ืืฆืืื ืืช, ืืื ืืืชืจ ืืืขืืื ืฉื ืืืกืืจ ืฉื "ืืืืงืช ืืงืฅ" ื"ืฉืืืฉ ืืฉืืืขืืช". ืื ืืืจืฅ ืืฉืจืื ืขืฆืื ืืฉ ืืืจืืืช ืืงืืืฉื, ืืืฉืจ ืืขืืจ ืืืงืืืฉืช ืืืืชืจ ืืืืืืช ืืื ืืจืืฉืืื, ืฉืื ืฉืื ืืืช ืืืงืืฉ. ืืจ ืืืืช, ืฉืขืืื ืฉืื ืืืงืืฉ, ืืื ืืืชืจ ืืืงืืืฉ ืืืืชืจ ืืืืืืช ืืืจืฅ ืืืขืืื ืืืื, ืืืฉื ืงืืืฉืชื ืืืง ืืืคืืกืงืื ืืฃ ืืืกืจืื ืืขืืืช ืืืื ืืืืื ื ืืื. ืืฃ ืืชืื ืืจ ืืืืช ืืฉ ืืจืืืช ืืงืืืฉื - ืื ืงืืื ืืืงืืืฉืช ืืืืชืจ ืืื ืงืืืฉ ืืงืืืฉืื. ืืขืืจ ืืฉื ืืื ืืงืืืฉืชื ืืืืืืช ืืื ืืืจืื - ืฉืืืขืจืช ืืืืคืื ืฉืื ืงืืืจืื ืืคื ืืืกืืจืช ืืืืืช ืืืืืืืช. ืขืจืื ืงืืืฉืืช ื ืืกืคืืช ืื ืืืจืื ืืฆืคืช, ืฉืืื ืืชืจืื ืจืื ืืืืฉืื ืืืืจ ืืืืืช, ืืื ืงืืืจืื ืฉื ืืื ืืืืืื ืืืื ืืฉืจืื ืืืื ืืืืืช, ืืื ืืจืื"ื, ืจืื ืืืกืฃ ืงืืจื, ืืืจ"ื ืืขืื ืืกืืจืืช ืขื ืงืืืจืช ืชื ืืื ืืกืืืืื ืืื ืจืื ืืืืจ ืืขื ืื ืก ืืขืื. ืืืืืืืืช ืืืืืืืช ืืืื ืงืื ืืื ืืืงืจืื ืืืืจื ืืื ืืฆืืืขืื ืขื ืื, ืื ืขื ืืฃ ืฉืืืืืืืืช ื ืืฉืืช ืืืจื ืืื ืืชืืคืขื ืืืืจื ืืช, ื ืืชื ืืืืืช ืืืืืืื ืืืงืืืื ืฉื ืืืืช ืืืืืืช ืืงืจื ืืืืื ืืขืช ืืขืชืืงื. ืืืขื ืืืืื ื ืืื ืืจืืืื ืื ื ืืช ืืืืืืื ืืื ืืงืืืฆืืช ืืืขืืืช ืฉื ืืื ื ืืืืืช ืืืืืืช ืืืืื ืช ืืืจ ืืืื ืงืื, ืืฆื ืืืืื ืื ืืืืืืื. ืืกืืฆืืืืื ืืืืกืืืจื ืืืืงืจ ืืืืืืืืช ืื ืชืื ื ืกืืืช ืืชื ืื ืืขื ืืืืืื ืืฉืืื ืชืงืืคืช ืืืช ืฉื ื ืืืืื "ืืงืืื ืงืจืืื ืืืชืจ ืืืืคืืก ืืืืืืืื ืฉื ืืืื [...] ืืืฉืจ ืืืื ืืื ืืงืื ืืืจ ืืขืืื ืืงืืื". ืืืืกืืืจืืื ืืื ืืืืืื ืืืกืืฃ ืื ืืืงืจื, ืืกืคืจืืช ืืืชืจ-ืืงืจืืืช ืืืืืกืืืจืื ืืืืืืืช ืฉื ืืืืืืื ืกืืคืงื ืืกืืก ืืืืืช ืงืืืงืืืืืช ืืืืืืช ืืืจ ืืชืงืืคืช ืืืืช ืืฉื ื. ืื ืจืืื ืืืืืื ืืชืืืง ืืืชืืืฉ ืืืืฆืขืืช ืงืจืืืืช ืฆืืืืจืืืช ืฉื ืืงืกืืื ืืืืืืื, ืื ืฉืืคืฉืจ ืื ืืื ืืืคืืืชืื ืืืฉืชืชืฃ ืืืืืืจ ืืืืืช ืืงืืืงืืืืืช ืฉื ืืงืืืื. ืืืงืืื, ืืฉืคื ืืขืืจืืช ืืืืชื ืืืื ืืจืืื ืืื ืืืช ืืฉืืืืจ ืืืืืช ืืืืืืืช: ืื ืื ืฉื ืชืงื ืื, ืืืชื ืื ืืืืืืจ, ืืืื ืืื ืืืืืช ืืช ืืงืฉืจ ืืขืืจ ืืืืกืืจืช ืืืืืืืช, ืฉืื ืื ืืืืชื ืฉืคืช ืืืืืช ืืฉืคืช ืืืช ืืืกืคืจืืช ืืืืืืืช. ืคืจืืืก ืืืืืจ, ืืืกืืืจืืื ืฉื ืืืืืคืจืื ืืจืืืืช, ืืฆืืข ืื ืืชืงืืคื ืื ื ืฉืขื ื ืืืืืืืืช ืืืืืืืช ืขื ืืชื "ื, ืฉืฉืืืฉ ืื ืืืืกืืืจืื ืืืืืืช ืืื ืืืงืืจ ืืฉืคืื; ืืฉืคื ืืขืืจืืช ืืฉืคืช ืืืืื; ืืขืจืืช ืืืงืื; ืืืืกืืืช ืืืจืชืืื ืืืื ืืชื ืกืคืจ, ืืชื ืื ืกืช ืืื ืฉืืืจืช ืืฉืืช. ืืืงืจ ืืืจ, ืกืืื ืืืฆืื, ืืืืื ืืชืืืืืช ืืืืืืช, ืกืืืจ ืื ืืืืืช ืืืืืืืช ืืืืืืืช ืืชืคืชืื ืืชืืฆืื ืืืืืชื ืฉื ืืืืืืื ืขื ื ืืืฉ ืชืืช ืฉืืืื ืคืจืกื, ืืืื ื ืืจืืื, ืื ืฉืขืืื ืืืชื ืืืืืืืช ืืืืื ืขืชืืงื, ืขื ืืกืืก ืกืคืจื ืืงืืืฉ, ืืฉืคื ืืขืืจืืช, ืืืงืืฉ ืืืืืื ื, ืืืกืืจืช ืืืืืช. ืืืืกืืืจืืื ืืืจืื ืื ืืืก ืืฆืืื ืื ืืืืืช ืืืฉืืื ืืื, ืฉืืืืชื ืืืช ืืืืืืืืช ืืืืืืืช ืืจืืืื ืืช ืืืืืืช ืืงืืืืช ืืชืงืืคืชื, ืืืืงื ืืืืื ืจืื ืืช ืืชืืืขื ืืืืืืืช ืืืืืืืช. ืืืืจืื ืชืงืืคื ืื ืฉื ืขืฆืืืืช ืชืจื ืืืืืฆืื ืืืชืืฉืืื ืืฉืื ืืืืื ื ืจืืืื ืืช ืืืืืืช ืืืจืฅ ืืืืื ืืืืืื ืืืจืืืืช ืืืืฉืืืช ื ืื ืืจืืืืื ืืืืืช ืืจืืฉืื ื ืืืฉื ืืื ืืกืคืืจื. 
ืืืงืจ ืืืืืืืืช ืืืจืืื ืืืืกืืื ืืก ืืขื ืื ืืขืื ืฉืืืืื ืืืืืืืืืช ืื ืชืืคืขืืช ืืืืคืืื ืืช ืืขืืงืจ ืืช ืืขืืื ืื ืืฆืจื, "ืืืืฆื ืื ืืืื ืืืืื ืืืขื ื ืื ืื ืืืืืืื". ืืืขืชื, ืืืืืืื ืืืืืื ืื ืจืื ืืช "ืืืืื ืืจืืฉืื ื ืืืืืชื", ืืฉืจ ืกืืคืง ืืขืืื, ืืืืฆืขืืช ืืชื"ื, ืืช ืืืืื ืืืืืื ืืจื ืืืืืื ืฉื ืขื ืืฉืจืื ืืงืืื. ืืกืืื ืืก ืกืืืจ ืื "ืืคืจืืืงืก ืืขื ืงื ืฉื ืืืืกืืืจืื ืืืืืืืช ืืื ืฉืืขื ืฉื ืชื ืืขืืื ืืช ืืืื ืืืืืชืืืช, ืืืคืืื ืืืืืื ืฉื ืืืื ืช ืืืื, ืืืื ืืืชื ืืืฉื ืืืขื ืืืคืืื ืฉื ื, ืืืจืฃ ืืืช ืฉืจื". ืืคื ืืกืืื ืืก, ืืกืืื ืืฉืจืืืืช ืืืืืืช ืื ื ืขืืฆื ืืื ืฉืืืืืืื ืฉืืืจื ืืช ืืืืชื ืืืืืืืช ืืืืื ืื ืืืืจ ืืืจืื ืืืืช ืืฉื ื ืขื ืืื ืืืคืื ืืืืืจืื ืืงืืืงืืืื ืืฉืืืจื ืขื ืืชื"ื ืืืฆืืจืืช ืืงืกืืืืืืช ืืงืฉืืจืืช ืื. ืืื ืื ื ืืชืจื ืืืื ืืืืื ืืื ืจืง ืงืืืฆื ืืชื ืืช. ืขื ืขืืืืช ืืืืืืืืช ืืืืจืืคื, ืืฆืื ืฉื ืืืืืืื ืืคื ืืืืืืื, ืื ืฉืืืืื ืืฆืืืืชื ืืืืชื ื ืื ืขืช ืฉื ืืฆืืื ืืช - ืชื ืืขื ืืืืืืช ืืืืืืช ืืืืจื ืืช - ืืืกืืคื ืฉื ืืืจ ืืืืื ืืืงืืชื ืฉื ืืืื ืช ืืืืื ืืืืืืืช.

ืืจืืื ืืงืืืฆืืช ืืืืืืช ืืื ื ื ืืืื ื-20, ืืืืจ ืืฉืืื ืืืืืืจืืช ืืืืื ืืืช, ืืชืืื ืฉื ื ืจืืืืื ืืืืืื ืขืืงืจืืื ืืขืืื: ืืฉืจืื ืืืจืฆืืช ืืืจืืช. ื-90% ืืืืืื ืชืื ืืืื ืืื. ืืืืืืช ืืืืืจื ืืช ืงืืืืื ืืกืคืจ ืืจืืื, ืฉืืืฉื ืืืืืื ืืืื ืงืื ืื ืืืจืื: ืืงืจื ืืืืืืืกืืื ืืืืืืืช ืืืฉืจืื, ืืืืืงื ืืจืืืืช ืืคื ืืืืจื ืขืฆืืืช ืืื ืืืจืืข ืงืืืฆืืช ืฉืื ืืช, ืื ืืืืืช ืื ืืื ืืคืจืืืจืื ืกืืฆืืืืืืืื ืฉืื ืื: ืืืืืืช ืืขืืื, ืืื ืื ืืืจืืื ืืงืืืืื, ืงืืืืช ืงืืืืจืื ืฉื ืืืืืื ืื-ืืฉืืืืื, ืื ืืืืื ืจืื ืฉื ืื ืฉืืื ื ืืืจืื ืืืฃ ืงืืืื, ืื ืืืืื ืฆืจ ืืืชืจ ืืืืื ื ืืื ืืื ืฉืืื ื ืืืืืื ืขื ืืฃ ืืจื ืงืืื. ืืกืงืจืื ืฉืื ืื, ื ืืชื ืช ืืื-ืืฉืืืืื ืืืคืฉืจืืช ืืืืืืืช ื"ืืืืืื ืกืชื" (Just Jewish), "ืืืืืื ืืื ืืช" (Jews of No Religion) ืืขืื. "ืืืืืื ืืื ืืช" ืืืืืื ืืช ืืจืื ืืืืจืืข ืฉื ืืื-ืืฉืืืืื ืืืจืฆืืช ืืืจืืช. ืื ืื ื ื-1.2 ืืืืืื ื ืคืฉ ื-2013; ืงืืืฆื ืื ืืืืงื ื-500,000 ืืืืืจืื ืืื ืืช ืฉืืืืื ืืืืืืื ืืชื ืืื, ืืขืื ื-600,000 ืืืืืจืื ืืื ืืช ืฉืขื ื ืฉืื "ืืืืืื ืืืงืืช". ืืงืจื "ืืืืืื ืืื ืืช" ืืืจืฆืืช ืืืจืืช ื ืจืฉืื ืชืืคืขืืช ืฉื ืืืืืืช ืื-ืืชืืช ืืงืืืืืช ืขื ืืืืืืช, ืืืืจืฉืช, ืืืืืืืช ืื ืืชื ืืืช (ืขื ืืืฆืข ืืืื ื-20 ืืืืชื ืืืืจืืงื ืืืืื ืืืช ืืืืืืช ืืืืกืกืช, ืื ืื ืฉืงืขื ืืืืื ืจืื). ืขื ืืืช, ืืืืคื ืืืื, ืงืืืฆื ืื ืืืืืกืช ืืฉืืืืช ืืืขืื ืืืืืชื ืืืืืืืช, ืืืืชืจ ืืฉื ื ืฉืืืฉืื ืืชืืื ืืื ืื ืืืืืื ืืช ืืืืืื ืืืืืืื. ืจืื ืื ืืขืจืืื, ืืืฉืื, ืืืฉืืื ืืืืืฆืืช, ืจืื ืคืืจืื ืืืืืืช.
========================================
[SOURCE: https://he.wikipedia.org/wiki/ืชื ืืขืช_ืืืกืืืืช] | [TOKENS: 47822]
ืชื ืืขืช ืืืกืืืืช ืืื ืชื ืืขื ืจืืื ืืช ืืืืจืชืืช ืืืืืืช ืฉืงืื ืืืืฆืข ืืืื ื-18 ืืืขืจื ืืืงืจืืื ื ืฉื ืืืื, ืืืชืคืฉืื ืืืืืจืืช ืืืืืืช ืืืจื ืืืจืืคื. ืืืขื ืฉื ืืื ื ืืฉื ืืืืืืื ืฉื ืืืกืืืืช. ืชืืืืืื, ืืจืืฉืืช ืืืืื ืืืืจืืืฉ, ืคืืชืื ืืช ืชืืจืชื ืืืคืืฆืื. ืืืืชื ืฉื ืืืกืืืืช ืืืฉืคืขื, ืืคืจื ืืืืจืืชืื ืืจืืฉืื ืื, ืืงืืืช ืืืจ"ื, ืชืื ืืืืฉืช ื ืืืืืช ืืื ืืืงืื ืืืื, ืืช ืืฆืืจื ืืืืืืง ืื ืืื ืขืช ืืืงืื ืืืช ืืืื ืืจืืื ื ืื ืกืชืจ ืฉื ืืืฆืืืืช. ืืืกืืืื ืืืืืืื ื"ืืฆืจืืช" ืฉืืจืืฉ ืื ืืืช ืืื ื ืืฆื ืืืื"ืจ, ืืืงืื ืืืจื ืืื ืืช ืชืืืจื ืืืจืืฉื. ืืืืื ืืืืจืืช ื ืืฆืจื ืืืืช ืืฆืจืืช ืขืฆืืืืืช, ืืืืืืช ืืงืื ืืช, ืืืื ืืืช ืืกืืจืืชืื ืืืืฉืื. ืืืกืืืื ืืืืืืื ืืืืืืง ืืืืื"ืจ ืฉืืจืืฉ ืืืฆืจ ืฉืืื ืืืฆืืืช ืื. ืืื ื ืืฉื ืืกืืืืช ืจืืื ืืช ืฉืืฉ ืืืืขืืจ ืื ืืื ืืืชืงืจื ืืื. ืืืฉืชืืืืืช ืืืฆืจ ื ืฉืืจืช ืืืจื ืืื ืืชืื ืืืฉืคืื ืืืฉื ืืืจืืช, ืืืืกืื ื ืืื ืขื ืคื ืจืื ืืืฆืจ ืืกืืืืช ืืืื ืขื ืืืจืฉืชื. ืืื ืืืืฆืืช ืืฉื ืืื ืฉื ืืืื ื-20 ืืชืืืจืจืื ืืจืืืช ืืืกืืืื ืืืฉืจืื, ืืืจืฆืืช ืืืจืืช, ืืืจืืื ืื ืืืืืืื. ืื ืืืืืื ืชืช-ืงืืืฆื ืืชืื ืืขืืื ืืืจืื ืื ืืฉืืื ืืืจื ืืื ืืืืืงืื ืืฉืืจื ืื. ืืืกืืืื ืืฉืืจืื ืจืืืช ืืืกืืจืืชืื ืฉื ืืืืืช ืืืจื ืืืจืืคื. ืืืืืฉ ืืืฉื ืืืืืืืจ ืืฉืคืช ืืืืืืฉ ื ืขืฉื ืืืืืื ืขืื ืืืขื ืืืขืืืช. ืืฉื ื ืืขืืื ืืชืจืืกืจ ืืฆืจืืช ืืืืืืช ืขื ืืืคื ืืชื-ืื, ืืืืืช ืืฆืจืืช ืงืื ืืช ืืืชืจ, ืืขืืชืื ืขื ืงืืืื ืกืืืืช ืฉื ืืืืืื. ื-2016 ืืื ืืขืืื ื-130,000 ืืชื-ืื ืืกืืืืื.

ืืืกืืืจืื ืืกืืืืช ืืืฆืืืช ืชืืืืืืื ืฉื ืืืฉืืื ืืืขืฉ"ื ืืืืืื ืืืืจืืืฉ, ืืืงืืืืช ืืจืื ืืืืจื ืืืืืืืช ืืกืืืืช ืืจื ืืืืืจื ืื ืฆืืจืื. ืืขืืจ ืืื ืืงืืืืืช ืืชืืืจืืืช ืฉื ืฉืืขืื ืืืื ืื, ืฉืืืืก ืืช ืืฆืืืช ืืืกืืืืช ืืืจืก ืื ืืืจ ืฉืืืืื ืืืืจืืช ืช"ื-ืช"ื ื-1648 ืืืืื, ืืืืชืืืืืืช ืืฉืืืื ืืขืฆืื ืืืืืื ืืืืืื, ืฉืืืืขื ืืฉืืื ืขื ืคืืจืืง ืืขื ืืจืืข ืืจืฆืืช ื-1764, ืืืืกืืจืื ืืืจืงืกืืกืืืื ืฉื ืจืคืื ืืืืืจ, ืฉืืขื ืืืืืืฉ ืฉืืื ืืฉืืืืช ืจืืืืช ืขืงื ืืฆืจืช ืฆืขืืืื ืฉื ืืืืืืื ืืชืืื ืืืืืื. ืืืฉืืช ืืื ื ืกืชืจื ืืืืจืื ืขื ืืจืืืช ืืืืงืจ ืืชืืื. ืื ืืง ืืืืื ืืืจืืืฉ ืฉืืืืื ืคืจืขืืช 1648 ืืื ืงืื ืืืจืื ืืืืฉืืขืจ, ืืขืืืืืช ืจืืืช ืืฆืืืขืืช ืขื ืืชืืืฉืฉืืช ืืืืื ืืืืืืืื. ืืคืืื ืืืขื ื ืฉืืฉืคืืขื ืืฉืืจืืช ืขื ืืืืืฆืจืืช ืืฉืืชืืืช, ืืื ืงืฆืจ ืืืจ ืื, ืืืจ ื ืืฉืืช ืืืืืืืช; ืื ืฉืื ืืชืคืชืืืช ืืืกืืืืช, ืฉืคืจืืืชื ืืืจืขื ืืืืฆืข ืืืื ื-18 ืืืขืืงืจ ืืงืจืืช ืกืืคื. ืคืืจืืง ืืืืขื ืื ืืฆืจ ืคืืืขื ืืชืืฉืืช ืืืืืื ืืืื ืืงืืืืชืืช, ืฉืืืืชื ืืืืจืื ืช ืืืืื ืขื ืืกืืก ืืงืืื; ืืืืงื ืืฉืืืื ืฉืกืืคืื ืืคืจืืกืื ืขื ืืืืงืช ืคืืืื, ืืื ืืื ืืืกื ืืืชื ืืฉืืืื, ืื ืืืืชื ืืืกืืืืช ืืจืืกืช ืจืื. ืขื ืืฃ ืฉืกืืื ืืืืืืช ืืจืืืคืืช ืืืืื ืคืืจืืง ืืืืืื ืืืืืจืื, ืืชืืืื ืฉืืคืืจ ืขืงืื ืืืฆืื ืืืืืื ืืืืืจืื ืฉื ืืืืืืื ืืชืงืืคื ืื ืฉืืฉืื ืืืกืืืืช, ืืฉืืื ืืืื ื-18. ืื ืกืืืืช ืืืืืืืงืืืืืืช ืืฆืืืืชื ืืืืชืคืฉืืืชื ืฉื ืืชื ืืขื ืืืืฉื - ืืืื ืขืฆื ืืืืืืฉ ืืืืคืื ื ืฉืืืจืื - ื ืชืื ืืืกืคืจ ืืืจืืื. ืืืื ื-17 ืืื ืืืชืคืฉื ืืืืข ืืฆืืืืจื ืขื ืืืืืช ืืงืืื, ืฉืืืคืฅ ืืื ืฉืจืื ืืขืืื ืื ืืืืื ื ืืขื ืืืชืคืจืกื ืขืงื ืคืขืืืืช ืฉื ืืืืืื ืื ืืฆืจืื ืืืื ืืชืืื. ืื ืฉืืื ืืขืืจ ื ืืืช ืืืืืจืื ืืขืืื, ืืคื ืืืืื ืืขืื ืืื. ืืชืืคืขื ืืฉืคืืขื ืขื ืืืืฉืคืขื ืืขืืืืช ืืฉืืชืืืช ืืืืคืจื ืงืืื, ืืจืงืข ืืฉืืชืฃ ืื ืืื ืื ืืข ืืจืืื ืืืชื ืืืืช ืืืกืืืืช ืืืืฉื.
ืืืืืช ืืืขืฉ"ื ืืื ืฉืื ืืืืื ืืืคืขืื ืืืขืื ืืงืืืืื ื ืกืชืจืื ืืคื ืฉืืื ืืขืืจ ืืืืคืื ืืช ืขืืกืืงื ืืคืืืื, ืืจืืข ืฉืืืื ืืช ืืืกืืืืช, ืืืืชื ืืขืื ืื ืืื. ืื ืืกืฃ, ืขื ืืฃ ืฉืืืืืื ืืืื ืืงืืืืชืืช ื ืืชืจื ืืืชื ื ืืืกืืช, ืกืืืืชื ืืืืกืจืืช ืฉื ืืื ืืื ืืืืชืืงื ืืืื ืืืชืขืจืขืจื. ืืืฆืืืื ืฉืืืืืืื ืืฉืชืืืื ืืื ืืชืขืจืื ืื ืคืขื ืืืืืื ืืืืจืช ืืคืจื ืกืื ืืืจืื ืื, ืขื ืืื ืื ืฉื ืืฉืื ืืฉืจืืช ืืื ื ืชืคืฉื ืืขืืชืื ืงืจืืืืช ืืขืืฉื-ืืืจื ืืชื ืื. ืืืก ืฉืืืื ืื ืืงืฉื ืขื ืืงืื ืืจืืืชื ืื ืืืื ืคื ืืืช ืืขื ืืื ืื ืฉื ืืจืฉื ืืืืจืืข ืืื, ืืขืืงืจ ืืกืืจืช ื ืืฉื ืืืืืจืืช ืขื ืืืืจืช ืื"ืฉ, ืืืง ืืขืื. ืืืืื"ืจืื, ืฉืืื ื ื ืื ืืื ืืืฅ-ืืืกืืืช, ืืืื ืืืืคืืข ืืืืจืจืื ืืืื ืื ืืกืืืืืช ืืื, ืืืื ืืคืื ืืืืจื ืืจืืืืืืจืื ืืขืืืื ืฉื ืขืกืงื ืืืืืจื. ืขื ืืฃ ืฉืื ืขืืื ืืจืฉืืชื ืืจืืืช ืืกื ืงืฆืืืช ืืงืืืืชืืืช, ืงืืืืช ืืืกืืืืช ืืืืฆืขืื ืืืจืื ืคืืฆื ืขื ืื. ืืื ืื, ืงืื ืืืืชื ืขืืื ืืขืืื ืื ืืฆืจื ืืืื ืชื ืืขืืช ืชืืืื ืืชืืืช ืฉืืืืืฉื ืืช ืืกืืืื ืืืืจืจืืื ืจืฉืืืช ืืืืงืกืืืช ื ืืงืฉื ืืืืืช ืจืขืืื ืืช ืืืกืืืื ืืคืฉืืืช. ืืืกืืฃ, ืืืคื ืืื ืืื ืืื ืืืคืชื ืืืฆืืืช ืืืกืืืืช: ืขื ืืฃ ืืืื ืื ืืืืกืืืื ืืืืงืื ืฉืืืื ืืช ืคืขืืืืชื ืืขืืจืจื ืืฉืฉ ืืื ืืงืจื ืืชื ืืืืื ืืืืฉื ืืฉืืชืืืช ืืืคืจื ืงืืื, ื ืื ืขื ืืชืืงืคื ืืช ืืชืจื ืืืคื ืืืืืืืช ืืืฉื ืืช ืืืืืืฉื ืื ืื ืืื ืืขืจืขืจ ืืช ืกืืจ ืืขืจืืื ืฉื ืืื. ืื ืืื ืฉืืจื ืื ืืืชืื ืื ืืืกืืกื, ืืขื ืืฃ ืืคืืคืืืืื ืฉื ืงืื ืื ืืืคื ืืืืื ืื, ืืงืคืืื ืืฉืืืจ ืขื ืืืงื ืืฉืืืืช ืืืืืืกืืช ืืื ืืื ืืืขื ืืืื. ืืขื ืืืืข ืืืืืืืช ืขื ืจืื ืืฉืจืื ืื ืืืืขืืจ, ืืืขืฉ"ื, ืื ืืฉื ืืืื ืืืกืืืืช. ืืื ื ืืื ืืื ืื ืจืื ืืืจืื ืื ืืจ ืคืจืื, ืืกืคืจ ืืฆืคืื ื ืฉื ื ืกืืืืช ืืืืืืื, ืืฉื ืืช ื-1690. ืืื ืกืื ืืืจืื-ืืขืจื ืืืงืจืืื ื ืฉื ืืืื โ ืืืืจ ืฉืื ืืชืงืืืื ืืฉืคืขืืช ืจืืื ืืืช ืืืืื ืืช: ื ืืืจืื ืืืจืชืืืืงืกืื ืืืกืืืงื ืื ืืชืืืืื ืืืขืจืืช, ืืฉืืืื ืืงืืืืื ืืืืืืช ืืืืฆืืืื ืฉืืืจื ืืกืืจืืช ืฉืืื ืืืช ืขืชืืงืืช โ ืืจืืฉ ืืื ืืืื ื"ืืขื ืฉื", ืืจืคื ืขืืื ืฉืืฉืชืืฉ ืืขืฉืืื ืืื ืืงืืขืืช, ืืืืฉืื ืืืืฉืืขืืช ืงืืืืืช. ืืฃ ืฉืื ืืื ืืืฉ ืืืืืื ืืจืื ืืช, ืืื ืืืืื ืืกืคืืง ืืื ืืงืืฅ ืชืืืืืื ืืืืช-ืืืืจืฉ ืืืฉืืช ืืืืฉื ืืช ืืืืชื ืืืจืืฉื ืฉื ืจื. ืืืื ืืืช, ืชืืืืืชืื ืืงืืืืช ืืืืืืืจืคืืืช ืืกืืืืืช. ืืคื ืืกืืจืืช ืืื, ืืขืืื ื ืขืจ ืืืืจ ืืกืืืืืชืื "ืจืื ืืื ืืขื ืฉื", ืฉืืื ืืืืืจื ื ืฆืจ ืืฉืืฉืืช ืืจืืื ืฉื ืฆืืืงืื ื ืกืชืจืื ืฉืืขืืืจื ืื ืืื ืืืืจ ืืืืจ ืกืืืืช ืชืืจื ืฉืื ืืชืืื ืขื ืืืชื ืขืช. ืืื ื ืืกืจื ืืืฉืจืื ืืฆืขืืจ. ืืื ืืชืืืื ืขืฉืจ ืฉื ืื ืืืจื ืืงืจืคืืื, ืืื ืงืืืื ืืงืืกืื, ืืฉื ืืชืืื ืืืื ืืืื ืืฉืืืื ื ืืืืื ืืืชื ืขืื ื ืกืชืจืืช. ืืืื 36 ื ืืชื ื ืื ืืจืฉืืช ืืฉืืื ืืืชืืืืช ืืืงืืื ืืืคืืขื ืืฉืืขืืช. ืืฉื ืืช ื-40 ืฉื ืืืื ื-18 ืืชืืขื ืฉืืชืืืฉื ืืื'ืืืื' ืฉืืคืืืืืื, ืืงืื ืืขืฆืื ืืขืื ืฉื ืชืืืืืื, ืืฉืื ืืฆื ืืื ืืืื ืืืขืืจ ืื. ืืจืฉืืืืช ืื ืืืื ืืืื ืืฆืืื ืฉืืื ืขืฉืืจ ืืืคืืจืกื ืืืื. ืืืขืฉ"ื ืืืืืฉ ืืื ืืกืืืืช ืืงืืื ืืืืืืช-ืื ืืืืฉ ืืฉื ื ืืืืืืืช ืืฉืื. ืืื ืืืชืจ ืืจืื ืืขืกืืง ืื ืืืืืช ืืื ืืื ืืืจืืื ืืืกืืืจ ืฉืืฉื ืื, ืื ืืขืฉืื ืืฉืืืื ืคืฉืืืื ืืื ืืืืื ืืืืืื ืืืืืขื ืืืฉืืขืืช ืจืืื ืืช ืืืืคืฉืจ ืืฉืืช ืืืงืืช ืืืืืืื. ืืื ื ืืืข ืืืืืื ืืชืคืืืืชืื ืืืงืกืืืืืช ืืืืจืืืืช ืืืืงืคืืชื ืขื ืืืื ื, ืืชืืืจ ืืช ืืืืืืช ืืืืืชืืืช ืืืืื ืื ืืฆืจืื ืืืืืจืช ืืืืจ ืืืขืืืืืช ืืขืืืื ืื. ืื ืืกืฃ ืืืฉืื ืืืื ืืช ืืฉืืื ืืืืชืืืืืช ืืขืืืืช ื' ืืืช ืืืชืขืืืช ืืจืืื ืืช, ืืืื ืืช ืืกืืคื ืืช ืืืคืจืืฉืืช ืฉืจืืืื ืื ืืงืจื ืืงืืืืื ืืืืงืฉื ืืจืื ืืืืื ืืืชืจ. 
ืืื ืืชืืืืืื ืฉื ืืฉืื ืืืื ืขืงื ืื ืืื ืืืืื ืืืืจืืืฉ, ืฉืืื ืืคื ืืืกืืคืจ ื ืื ืขืงื ืกืืืืคืื ืืงืฉืื. ืืืขืฉ"ื ืื ืื ืืช ืืืกืืืืช ืืชืืจืช ืืืชื ืืขืช ืืืกืืืืช, ืืคืืชื ืืจื ืฉืชืฆืืข ืืืืื ืื ืืกืืื ืคืฉืื ืื ืืืฉ ืืืชืจ ืืืืืื ืจืืื ืืช ืืฉืืขืืชืืช. ืืืจืืช ืืืช, ืืื ื ืืชืจ ืื ืืืื ืื ืืืื ืฉื ืืื ืืืืืืกืื ืืฆืืืฆื ืืืขืืื ืื ืืคืืฅ ืืช ืืฉื ืชื ืืฆืืืืจ ืืจืื. ืขื ืืฃ ืฉืืืฉืืืื ืืืืกื ืื ืืช ืืืืืฉ ืืืคืืื ืฉื ืืชื ืืขื, ืื ื ืื ืืืืชื ืืขืืื. ืขื ืืืช ืืืขืฉ"ื, ืงืืืื ืขื ืขืฆืื ืจืื ืชืืืืืื ืืช ืื ืืืชื ืฉื ืืจื ืื ืืขืจ, ืืืืื ืืืืจืืืฉ, ืื ืื ืจืื ืืขืงื ืืืกืฃ ืืคืืืื ืื ืืืขื ืืืจืื ืืืจื ืืคืขืื ืืื. ืืืฆืจื ืืืืืืื ืคื ื ืืืืื ืืฉืืื ืืืืจืืื ืืช ืืจืขืืื ืืช ืืืกืืกืืื ืฉืคืืชื ืืืจื, ืืฉืืื ืืืขืื ืชืืืืืื ืืื ืื ืชืืจื ืฉืืื ืืืงืืคื, ืชืืืื ืฉืืืืื ืืื ืื ืื ื ืขืฉื ืืจืืื ืจืง ืขื ืืื ืืืฉืืืื. ืืืขืฉ"ื ืืกืืืืื ืืฉืชืืฉื ืืืืฉื "ืืกืื" ืืืืื ื ืืืกืืจืชื, ืฉื ืืื ืืจื ืฉืืื ืืคื ืื ืืฉืืจืช ืืืื; ืืฉืืืฉ ืืืืจืื ืฉื ืืืื ื-18 ื ืืฆืจื ืืืื ื ืืจืืจื ืืื ืืฉืืขืืช ืื ืืืจื ืืืกืืืืช ืฉืืฆืื ืืืืจืืืฉ, ืืฉืืืืืืื ืื ืืื ื ืืฉืขืชื "ืืกืืืื ืืืฉืื" ืืื ืืืืืืื. ืืืื ื "ืืกืื" ืืชืงืฉืจ ืื ืื ืขื ืืชื ืืขื ืืฆืืืืช ืขื ืฉืืืื ื ืืืงืืจื ืืืืคื ืชืืช ืืืืฉ, ืืืืื ื ืื ืก ืื ืืขืืจืืช ืืืืืจื ืืช โ ืืืฉืืขืืช ืฉื "ืืึนืึตื, ืึทืขึฒืจึดืืฅ" ืืื ืฉื ืืืืง. ืืฉื ืชื ืฉื ืืืืื ืืจืขืื ืืืื ืืืืืื ืจืืื. ืืฉืืืืช ืืชืคืืื ืืืืืื ื ืื ืืืืืฉื ืื ืื ืขื ืฉืจืืื ืืื ืฉืืื ืืืืจื ืืช ืืื ื ืืขืืจ ืืืืชืจ ืืื ืืืชืืื ื ืืืืืื. ืืืจื ืืืขืฉ"ื ืขื ืืฆืืจื ืืืขืืืช ืืืฉืืืช ืืืืืช ืืฉืืจืฉื ืืืงืื ืืืืืืง ืืื ืฉืืืืื ืืขืืงืจ ืฉืื ืขื ืืื ืืืจืฉื, ืฉืชืืืจ ืืช ืืืชืืจื ืฉื ืืืฉืืืช ืืจืืฉืืช ืจืขืื ืืืืืื ืืชืืืื ืคื ืืื ืฉืื, ืืขืื ืืงืืืื ืืชืื ืืืื ืฉื ืืกืคืืจืืช ืืืืืืืืช. ืืื ืืืฉืื ืืืืชืจ ืืื ืจืขืืื ืืฆืืืง ืืืกืืื, ืฉืืคื ืืฆืื ืืจ ืืืืจืืช ืฉืคืข ืืืืื ื ืืืืื ืื ืืืืขืืืช ืชืคืืืืชืืื ืืืงืฉืืชืืื, ืืืกืืืืช ืืืกืจืืช ืืืืจืื ืจืืื ื. ืืืืื ืืืืจืืืฉ ื ืคืืจ ื-1772. ืืขืฉืจืื ืืชืืืืืื ืืืงืืจืืื ืืฆืื, ืงืืื ืืืืจ ืื, ืืืคืืฅ ืืช ืจืขืืื ืืชืืื ืืจืืื ืืืจื ืืืจืืคื: ืจืื ืืืจื ืืืืื ืืงืจืืื, ืจืื ืื ืื ืื ืื ืืืืืืืกืง ืืจืื ืฉื ืืืืจ ืืืื ืืืืืื ืคืขืื ืืฆืคืื ืืจืืืง, ืืืืื; ืจืื ืื ืื ื ืืื ืืฆ'ืจื ืืืื ืฉืชืงืข ืืฆืคืื ืืงืืื; ืจืื ืืื ืืฆืืง ืืืจืืืฆ'ื ื ืกืข ืืชืืืื ืืคืื ืกืง ืืืืืจ ืืื ืืชืืืฉื ืืคืืืืืื, ืืกืืื ืืืจืื; ืจืื ืืืืืื ืืืื'ื ืกืง, ืืืื ืจืื ืืืฉื ืืื ืืคืืื, ืชืืืืื ืืืฉืืชืฃ ืจืื ืืฉืจืื ืืืคืฉืืืื ืืจืื ืืืืื ืืืื ืืืืืืฉืื ืืืกืกื ืืช ื ืืืืืช ืืชื ืืขื ืืคืืืื. ืจืื ืื ืื ืื ืื ืืืืืืืกืง ืืจืื ืืืจืื ืืืื ืืงืืืกืง ืื ืืืื ืืช ืขืืืืช ืืืกืืืื ืืืจืฅ ืืฉืจืื. ืืืกืืืืช ืืืืฉื ืืชืคืฉืื ืืืืืจืืช ืืืจืื. ืืฆืืืงืื ืฉืืืื ืื ืืื ืืจืืืืืืช ืขื ืชืืืื ืคืืืืื, ืืื ื ืชืืืื ืืงืจื ืฉืืืืช ืฉื ืขืืจื ืืฉืคืขื ืืชืื ืืืืืื ืืงืืืืชืืช ืืงืืืืช: ืืื ืืงืืืฉ ื ืืืื-ืืืจื, ืขืฉืืจืื ืืื ืืืืืก (ืืืืืืื ื ืฉืื ื ืืื ืืืช, ืฉืืขืืื ืืืื ืืืช ืืื ื ืืื ืืืื ืืืื ืืืืงืจื ืืคืืจืื ืืืช ืฉื ืืืื ืืืกืืืื) ืืืืกืืฃ ืืืื ื ืคืฉืืื ืืขื. ืืืงืืื, ื ืืืจื ืฉืื ืืขืืจืจ ืืจืืืืช ืืชืจื, ืืืืจืืช ืืคืืคืืืืื ืืืืคืื ืฉืืื, ืื ืขืืืื ืฉืื ืืกืจ ืืืจืชื ืืืคืื ื ืืืืชื ื ืืช ืืืืืื ืขื ืืขืื ืืืืืก. ืืฆืืืงืื ืขืฆืื ืืื ืืืฆืื ืืืชื ืืืืื, ืืืืืขื ืืืจื ืืื ืืจืงืข ืฉื ืืืื ืืช ืืขืืฉืจ. ืืืืื ืืืืืืื, ืืืืืื, ืืชืืกืก ืืขืืืจื ืืืื ืืืื, ืืกืืืื ืืืืืืื ืฉืขื ืฉืื ื ืงืจื, ืืคืขื ืืืืง ืืช ืืฉืคืขืชื ืืขืืจ ืืื ืืชื ืืืืช ืขืื ืฉื ืืืืกื ืืืื"ื ืืืงืืื ืจืื ืขืืจืืื ืืืจืืืืฅ. ืืื ื ืืืืื ืืชืืืืื ืืจืืื ืืจืื ืืคืจื ืกืื ืืงืื ืืืชื ื-1803. ืชืืืื ืืืื ืงืจื ืืืงืืืืช ืจืืื. 
ืืฉืืืื, ืื ืืื ืืชืคืืื ืืงืื ืืื ืคืจื ืฉืืื ืื ืืกืืก ืืจืฉืช ืฉื ืงืฉืจืื ืืืจืชืืื ืืคืขืืืืช ืงืืืืชืืช, ืฉืืืฉ ืขืจืืฅ ืืขืื ืืืืืก ืืืจืื ืืชืืืืื ืืืฉืื. ืืชื ืืขื ืืืืฉื ืขืงืจื ืืช ืืืื ืืงืืืืชื ืืืกืืจืชื: ืืฆืจ ืืืืื"ืจ ืืืืืคื ืืช ืกืืืืช ืืจื ืืืขื ืืคืจื ืกืื, ืืืืงืื ืืืฃ ืืงืืื ืฉืื ืืืืืืื ืืืืืจ ืืื ืืคืืคืื ืืืื, ืืชืงืืืื ืืืงืืื ืืืืง ืืืงืืืืืช ืงืืืฆืืช ืฉืืฉืชืืืื ืืืฆืจืืช ืืกืืืืืช ืฉืื ืืช, ืืืืืง ืืืืืื"ืจืื ืืื ืืกืืืื ืืงืืืืืช ืฉืืชืคืจืฉื ืขื ืคื ื ืืจืืืื ืืืืืื. ืืืกืืืืช ืืืืชื ืืชื ืืขื ืืืืืืืช ืืืืืจื ืืช ืืจืืฉืื ื ืืื ืฉืืืืืคื ืืช ืืงืืืื ืืืฉื ื ืืืืื ืขื-ืืจืืืืจืืืื ืืจืืืืื ืื ืืื, ืืืฆืจื ืืชืืืช ืฉืืืืชื ืืืืืื ืช ืืืชืจ ืืืคื ืืืืื. ืืืกืืืื ืื ืืืื ืฉืืืื ืืฉืืื ืืื ืืชืจ ืฉืื ืืชืืกืกื, ืืฉืืืืืื ืืงืคืืื ืขื ืฉืืืืฉ ืืกืืื ืืืืืฉืช ืืืืืื, ืฉืืืฉืืื ืื ืจืง ืืืื ืืื ืื ืืจืฆืืขืช ืขืืจ, ืืฉืืืืชื ืื ืืงื ืืืชืจ ืื ืืจืืื. ื ืืชื ืืื ืื ืืขื ืืืืชื, ืื ืืืืง ืฉืืฉืืื ืืจืืืืช ืืื ื ืืืืื ืืช ืืกืืื ืื ืื ืืฆืจื ืกืืืื ืืืื ืืืชืจ ืืื ืฉืืื ืื ืชืืื ืืืงื, ืืื ืืขื ืงืืื ืฉื ืชืืงืื ื ืฉืืืช ืืืื ืฉืืชืืืืื ืืืืืช ืืืืืชืืช. ืืืืฃ ืืืืืืฉ ืืืืง, ืฉืื ื ืฉืชืืจื ืืืืื ืืืืืืื ืจืืื ืืืฃ ืื ืกืืชืจืื ืืืืืืืืื, ืื ืืื ืืืืืฉ ืืกืืื ืืืืืง. ืื ืืืื ืืืืื ืจืืืื ืืงืืืืืช ืืกืืืืืช ืืืฆื ืืืื ืืงืืืืื ืขืื ืงืืื. ืื ืืื ืืืืฉ ืืืชืงืฃ ืืืืืช ืืืืจื ืืชืจื ืขื ืืื ืืฆืืืืจ, ืื ืขืืจืจ ืืชื ืืืืช ืืขืืงืจ ืื ืืื ืืืงืืืื ืืขืืืจืืช ืืืง ืืืืจ ืืืื ืกืืชืืื ืืืก ืืฉืืืื, ืืืขืืืจ ืืฆืืืง ืืืกืืื ืืงืืจ ืืื ืืืืื ืจืืฉืื ืืืขืื. ืืืืืืงืช ืขื ืืฉืืืื ืืืืขื ืืืื ืืืืฆืข ืืืื ื-19, ืืฉืกืืื ืื ืืคืืื ืฉื ืืชื ืืื ืืืฉืืืื ืืงืืืช ืืคืื ืืืืื ืื ืืืืืื. ืืืื ืืืืื ืืืืืจ ืฉืืฆืืื ืืฆืืืงืื ืขื ืืจืืช ืืื ืืื ืืืืืกืืช, ืืชืืืืืชื ืืชื ืืขืืชืืื ืืชืคืืื ืืืืืืจ ืืืื ืื, ืื ืืืืช ืื ืืืืืื ืืืืจืืช ืืคืขืืื ืืืื ืื ืืืืงืื ืงืืืืื โ ืืคื ืืืกืืคืจ ืืืืืืจ ืืื ืื-ืืกืืื "ืฉืืจ ืคืืฉืขืื", ืืืจื ืคืขื ืืืืื ืืืืจืืืฉ ืืฉืืื ืคืจื ืืืื ืฉืื ืืืื ืืืืช ืืจืืข ืืืืช, ืืืขื ื ืฉืืชืืืืื ืื ื ืฉืื ืฉื ืฆืจืื ืืชืืงืื; ืขืืืืืช ืจืืืช ืื ืืกืื ืืื ื ืคืืฆื ืืชืงืืคื โ ืืืื ืืืจืืื ืืช ืืจืื ืื ืืืืืืจื ืืจืืื ืืช ืืฉืืชืืืช. ืื ืืืื ืืืชื ืืืืช ืืืกืืืืช, ืฉืืืืื ืืืืื ืืืืืื ื. ืืืกืืจืชื ืืืืจืื ื-1772 ืืืกืืืืช ืืื ืฉืื. ื-1781, ืืืืื ืกืืืื ืขืืืืชืื ื ืืกืฃ, ื ืฉืจืคื ืืชืื ืจ"ื ืืคืืืื ืื ืืืืื ืืจื ืขื ืืืกืืืื ืืืจืืื. ืืืขืืจ ืืืืจืืชื ืืชืคืืื ืื ืืกื ืืืจ"ื ืืืงืื ืื ืืกื ืืฉืื ื ืกืืคืง ืืฃ ืืื ืืืชื ืืืื ืขืืื ืืืืืง. ืกืืืืจ ืจืืฉืื ืืืื ืื ืืืืจื ืืืจืืคื ืืืืคืก ื-1781 ืืืืงื ืื'ืืืงืื ืืืกืืืช ืืืื ืืงืืืื ืืืจืืื, ืืืงื ืืจืืฃ ืฉื ืืชื ืืืืช, ืื ืื ืืืืืฉื ืื ืื ืืื ืฉืื ืคืฉืืืื ืืืชืคืื ืืชืืื. ืืืืืืืื ืืืจืื, ืืืืืื ืืชื ืืขื ืืืืฉื ืืืืื ืขื ืืื "ืืืกืืืื ืืืฉื ืื", ืืืงืืืืื ืืืืืืืกืืืื ืฉืงืืืื ืืช ืืืจื ืืืืื ืืกืืืจืืช ืืืืืขืื ืืืฆืืืืจ (ืจืืฉ ืืจืืฉืื ืืื ืืื ืืืจ"ื ืขืฆืื). ืืจืืืช, ืืืฉื ืืช ืืขืืืืชืื ืงืฉืื ืืชืืืืื ืืื ืืืจืื, ืื ืืชื ืืขื ืืืกืืคื ืืืชืคืฉื ืืืืชืขืฆื ืืื ืืจืฃ. ืืฉื ืืช ื-1780 ืืฆืื ืืืืจ ืฉืืืฉืช ืืืืืืจืื ืืืืืืื ืฉื ืืืกืืืืช ืืืืงืืืช: "ืชืืืืืช ืืขืงื ืืืกืฃ" ืืจ"ื ืืคืืื ืื ืฉื ืชืคืจืกื ื-1780; "ืืืื ืืืจืื ืืืขืงื" ืฉืืื ืืช ืืืงืืื ืืืจืื ืฉื ืืืืื ืขืฆืื ืืืฆื ื-1781, ืชืฉืข ืฉื ืื ืืืืจ ืืืชื; ื"ื ืืขื ืืืืืื" ืืืจื ืชืืจื ืฉื ืจืื ืืืืืื ืืืื'ื ืกืง. ื-1798 ืขืื ืืืืื ืืืชื ืืืื ืืืขืฆืจ ืืขื ืืชื ืื ืืืฉืืช ืจืืืื, ืื ืืืช ืืืจ"ื ืฉื ื ืงืืื ืืื ืฉืื ืืื ืืช ืื ืืืื ืืืช ืืงืืจ ืกืืืืชื ืืขืืงืจื. ื-1804 ืืืจืขื ืืชืคืชืืืช ืืฉืืื ื ืืกืคืช ืฉืคืืขื ืืื, ืืฉืืฆืืจ ืืืืกื ืืจ ืืชืืจ ืืงืืช ืื ืืื ื ืชืคืืื ืขืฆืืืืื. 
ืื ืืืืกืืจ ืขื ืืจื ืืจืืกืื ื-1795 ืืชืงื ืืช ืืืจืืช ืฉืืืืืฉื ืืช ืืงืืืืืช ืื ืขื ืืืืืืื ืืคืขืื ื ืื ืืชื ืืขื ืืชืงืืคืืช. ืืืืืืช ืืืกืืืจื ื ืื ืชื ืืืกืืืืช ืืกืืืื ืืช ืจืฉืืืช ืืืืคื ืจืฆืืฃ, ืืืืง ืืืืืื ืืืช ืืงืืกืจืืช, ืืืืจ ื-1788 ื ืืกืจ ืขื ืจืืืคืชื. ืจืื ืืืื ืืืืืื'ืื ืืืืื ืืช ืืืชื ืืืื ืืฉืืืืช ืืชืืื ื ืืช ืกืืืืืช ืืืชืจ, ืืื ืืงืืช ืืืกื ืืืฉืืื ืืืืืจื ื ืฉื ืืขื ืืืกื ืืช ืื ืืขืจ ืืืกืคืง ืื ืืืืคื ืจืืื ืืช ืืืฉืืช ืื ืคืืืช. ืืื ื ืืจื ืืช ืืืืืืืื ืืืืืืช ืื ืืืืืื ืืืืื ืื ืืงืืื โ ืืืช ืืคืืื ืฆืืื ืฉืืื ืืืฉืืืช ืืช ืืขืจื ืืจืืื ื ืฉื ืงืืื ืืฆืืืช ืืื ืืขืฉื ืฉื ืขืฉื ืืืืื ื ืืชืืืื โ ืืืืฆืขืืช ืืงืืืขื ืฉืืืื ืืื ื ืืืื ืืืชืงืจื ืื ืงืืืช ืืืื ืืืืืืืช, ืืืื ืื ืืข ืืืื ืืืชืคืืฉ ืืช ืืืฆืืืืช ืืืืืืชื. ืืื ืื ืืื ืืืื ืืช ืื ืฉื ืืกืจ ืื. ืืืกืืืืช ืขืฆืื ืืฆืื ืืืืืืง ืืืืืงืช ืืืืืืฉืช ืืืื ืฉืืขืืจ, ืขื ืืืืช ืืจืืจื. ืืืคื ื ืืืื ื-18 ืขืื ืืืืจ ืืจืืืขื ืฉื ืืฆืืืงืื: ืจืื ืื ืื ืื ืื ืืจืืื ืื ืชืคืก ืืช ืืงืื ืื ืืขื ืืืืืื ืฉืืช ื-1787 ืืื ืืื ืืืืืฆืื, ืืืืืื ืืืืืืื ืืจืฉ ืืืชื ืืคืืืื ืืงืื ืืจืกืืืช. ืืขืฉืืจืื ืืคืืชืืื ืฉื ืืืื ื-19 ืขืืื ืืกืืื ืฉืื ืื ืขืืืง ืืืกืืืืช. ืืื ืืคืื ืืชื ืืขื ืขืืื ืืืื ืืืช, ืืืฅ-ืืืกืืืช, ืืืื ืืฉืืื ืืืจืืืช ืืืจื ืืืจืืคื. ืืคืืื ืืืืื ืืืชื ืืืืช ืงืืืื ืขืืืจืืช ืจืืืช ืืช ืืจืืชื ืฉื ืฆืืืงืื ืืงืืืืื. ืืืืืื, ืคืืืืืื, ืืืืฆืื ื ืกืืคื ืืืขื ืืืื, ืืคืืืื ืืงืื ืืจืกืืืช ืจืง ืืขื ืคืืืช ืืื. ืืืฆืจืืช ืืชืคืฉืื ืืชืืืื ืืกืจืืื ืืืืงืืืื ื, ืืืืืขื ืื ืื ืืืืื ืืืขืจืื ืืืืชืจ ืฉื ืืืกืืืืช ืืืืืืืชืื ืืช ืฉืืคื ื ืืืืืช ืืขืืื ืืฉื ืืื, ืืฆืคืื-ืืืจื ืืื ืืจืื. ืฉื ืืื ื ืชืืืื ืืืืื, "ืืืฉืื ืืฉื", ืืจื ืืืืืื ื-1808. ืืื ืฉืืฆืืจื ืืืชืืกืก ืืืจืืืฉ ืชืืืืื ืืชืืืฃ ืืืืจื ืืฉืืจ ืืช ืืืืฉืืื, ืื ืืชืืกืื ืืืกืืืืช ืืขืืจื ืชืืืื ืึทืฉึฐืึธืจึธื (ืจืึผืึดืื ึดืืึทืฆึฐืึธื). ืืืืฉ ืขื ืืจืืืืืืืช ืกืืืคืช ืืืืืฃ ืืฆืืจื ืืืกืืจืช ืืืชืจ ืฉื ืื ืืื, ืืืื ืืฉืืฉืืชื. ืืจืืฉืื ืืืืื ืืืช ืืื ืจืื ืืจืื ืืื'ืืืื', ื ืื ืืืขืฉ"ื ืฉืืชืื ื ืืืืื"ืจ ื-1782 ืืชืืข ืืืชืจ ืืฆืืืงืื ืืืืืจ ืืขืืืื ืืชื ืขื ืกืื ืืืืืกื. ืืืืจื ืืชืคืฉืื ืืฉืืฉืืชืืืช, ืืืืื"ืจืื ืืืื ืืชืืจ ืืช ืขืงืจืื ืืืืจืฉื ืืืืฉื ืืชืืืื ืฉื ืชืืจืช ืืชื ืืขื. ืืืืจ ืฉืืฆืืืง ืงืืฉืจ ืืื ืจืืื ืืืช ืืืฉืืืืช, ืืื ืื ืื ืืืืจื ืฉืชืืื ืืชืื ืืื ืืฆืืืืช ืื ืืืืคื ืืืฉ ืืขืืจื ืืฆืืฆืืื. ืขืื ืืื ืืืงืก ืืืข ืืืช ืืืืืื "ืืื ืฆืืืง ืืื ืื ืฆืืืง". ืืขืืงืจืื ืืชืงืืข ื-1813 ืืฉืชืืืืื ืืืืืจ ืฉื ืืขื ืืชื ืื, ืืจื ืืืจื ืืกืืจืฉืื, ืืคืกืื ืืื ื ืฉื ืจืื, ืจืื ืืืืขืจ ืฉื ืืืืจื ืืืืืง ืขื ืจืืฉืืช ืื"ื. ืืชืื ืืืจ ืืื ืืืขื ืื ืืืฆืจืืช ืฉืืฉืืชืืืช. ืคื ืืืจ ืฉื ืืืฉืืจื ืืื ื ืกืืื ืฉืงืื ืืืขืืกืืง ืืืืคืื ืืืืกืืืงื ืืืืืช ืืืืืืื ืืชืื ืื ืืืชืจ ืฉื ืจืืื ืืืช, ืืืืืื ืืชืืจื ืฉื ืืชืคืก ืืงืื ืืจืืื. ืขืจืืืช ืืืคืชืื ืืืืืฆื ืืื ืืืืื ื ืืชืจื ืืืง ืืืืชืืก ืืืกืืื, ืืื ืืฆืื ืืื ืืืชืคืชื ืืื ืืืฉ ืฉื "ืืืื"ืจ-ืจื", ืฉืฉืืื ืืช ืืจืื ืืชื ืืขื ืขื ืืืืืช ืจืืืื ืืชืืจื ืืฉืกืืืืชื ื ืฉืขื ื ืืืขื ืขื ืืืชื ืืงืืจืืช ืฉื ืคืืกืงืื ืื-ืืกืืืืื. ืจืื ื ืืื ืืืจืกืื, ืืื ืฉื ืืืืจ ืืจืืืขื, ืชืงืฃ ืจืืื ืืฆืืืงื ืืืจื ืขื ืืชืืกืืืชื ืืคืืืขืชื, ืืขืื ืื, ืืืจื ืืืกืืืืช. ืืขืฉืืจ ืืฉื ื ืฉื ืืืื ื-19 ืฆืืื ืืช ืืืขืืจ ืืื ืืืกืืืืช ืืืืงืืืช ืืื ืืืืืืจืช ืืืืืืกืกืช, ืื ืฉืขื ืช ืขื ืืฆืจืืช ืืืืจืื ืืช ืืืื. ืืืืื ืืืืืจืื ืจืืื ืฉืฆืืืจื ืืช ืืชื ืืขื ืืจืืฉืืชื ืืืืจ ืจืืื ืื, ืืื ืืจืืื ืืืืจ ืืืืืื, ืจืื ืืชืืืจื ืื ืกืืื ื"ื ืืืื" ืืฉืงืืขื ืืืฉืืชืืช ืืืืขื ืฉืื ืขืกืงื ืืืชืคืชืืืืืช ืืฉื ืืืืื. ืืื ืืฃ ืฉืืืืงื ืืื ื"ืืกืืืืช" ืืืืืชื ื"ืฆืืืงืืช" ืืืืกืกืช-ืืืื"ืจืื ืืื. 
ืจืืืื ืื ืืืืชื ืืขืืช ืืฉืคืขื ืืคืืื ืขื ืืชืคืืฉื ืืืงืืืืช ืฉื ืืชื ืืขื, ืื ืืืืงืจ ืืืืืืจ ืกืชืจ ืื ืืช ืืจืืื ืืืืฆืื ืฉื ืืืกืืืืช ืืืืงืืืช ืืื ืืช ืืืขื ื ืฉืื ืืืืืกืืช ืืกืจื ืชืกืืกื ืืืืืืฉ ืจืืื ืืื. ืืชื ืืขื ืืกืจื ืื ืื ืืื ืืจืืืืช ืืื ืืืช ืืืืื, ืืขื ืืืืืชื ืขืืจื ืคืืฆืื ืืืื ืืืืืจ ืืืงืื ืฉื ืืฆืจืืช ืืจืืืืช. ืื ืงืืื ืืื ื ืืชื ืืื ืืืืืช "ืืกืื" ืกืชื, ืืจื ืฉืืฉื ืืืืื ืฆืืจืคื ืืชืืืืช ืื ืชืืื ืฉืื ืฉื ืขืืืจื ืืกืืืืช ืื ืืชืืืจืจ ืืืืื"ืจ. ืืชืคืชืื ืื ืืืืงื ืืืืืช ืืืชืืืืช ืืื ืืืขืื ืืคื ืืื ืฉื ืืืฆืจืืช, ืขื ืืืกืืืื ืืืงืืจืืื ืืืืชืจ ืฉืฉืื ื ืืื ืงืืข ("ืืืฉืืื"), ืืฉืืื ืจืืื ืืืจืื ืฉืืืงืจื ืืื ืจืง ืืื ืคืขื ืืืืขืื ื ืืกืฃ ืฉื ืฆืืืืจ ืฉืืชืืืจืจ ืืขืืืจืืช ืฉืืืกืืืืช ืฉืืื ืืื ืืืชืคืื ืื ืืกื ืกืคืจื ืืืขื ืืื ืงืฉืจ ืืืฆืจ ืืืืกืืช. ืกืืื ืืืฆืข ืืืื ื-19, ืงืจืื ืืืื ืฉืืฉืืืช ืืืื"ืจืื ืฉืืชืืืืกื ืื ืืื ืืงืฉืจื ื ืืฉืืืื ืืื ืืืื ืืืืง ืืืืชืจ ืืืจืื ืืืืืืจืคื ืฉืืื ืจืืกืื ืืคื ืืืืช (ืืืืจื ืืชืืื ืืืืฉื), ืคืจืืกืื, ืืืื-ืืฉืขืืจ ืืืืืืช ืืื ืืจืื, ืืืชื ืืขื ืืืืชื ื ืืืืืช ื ืืืจืช ืืฉืชื ืืืืจืื ืืช. ืืจืืืช ืืืฆืจืืช ืืืคืืจืกืืืช ื ืืกืื ืืชืงืืคื ืื. ืืื ืืกืฃ ืืืืงืจืื ืืืจืื ืืฆืืืขื ืขื ืืจืืขื ืืืคืืกื ืื ืืืืืช ืฉืืชืืืฉื ืืืืื ืืขืืื: ืืจืื ื-ืืืื ื, ืฉืืืืืจ ืืชืืกืก ืขื ืืืืืช ืืชืืจื ืืฉืืืืฆืืื ืืื ืืืืจืื ืืชืืืืื ืืืืื ืืืืืช ืขืฆืื ืืฉืืืฉื ืืจืื ื ืงืืืื ืืืคืืกืงืื ืื ืืกืฃ; ืืงืืื-ืืืกืื, ืืงืื ืืืืื, ืฉืืืกืืฃ ืืืืืง ืืืชืืก ืืืฉื ืืืืืืืฉ ืืช ืืืืื ืชืืจืช ืืกืื; ืืืืืืชื, ืฉืจืืื ืืช ืืืืื"ืจ ืืื ืืื ื ืขืจืฅ ืืฉืืื ืืจืื ืืืคืื ืช ืขืืฉืจ ืืืืงืจื, ืืฉืืืื ืขื ืืืช ืจืื'ืื; ืืืคืืคืืืืกืื, ืฉืืชืืงื ืืขืฉืืืช ืืืคืชืื, ืฆืืงื ืืงืืจืื ืืฆืืืืจ ืืจืื. ืกืื ืื ืืช ืืื ืื ืืื ืืืขืืืื ืืืื ืฉืืืืืื ืืื ืืื. ืืื ืฉืืืคืืขื ืืืชืจ ืืกืืืืืืช ืืืื ืืืช ืขื "ืืืื"ืจืื-ืจืื ืื" ืืจืืฉื, ืืื ืฆืื ื โ ืฉืื ืืืื ืืืืจื ืืืื ืืื ืืืืื"ืจ ืืืืืื ืืืืืฆืื ืืืืื, ืืืืืืฆื ืืืืืืง ืืืืชืจ ืฉื ืืืืช ืืฆืืืงืืช ืืืืื ืืช โ ืืืขืืื, ืืืชื ืืขื ืขืืขืื ืืช ืืืกืจ ืืืืกืื ืฉืื, ืื ืืื ืืคืืช ืืืชื ืขื ืืืชื ืืืื. ืื ืฉืืืื ืขืื ืืืชืจ ืืช ืฉื ื ืืฆืืืืจืื ืืื ืืืคืขืช ืืืื ืืฉืืชืฃ, ืชื ืืขืช ืืืฉืืื. ืืฉืืืื ืืืจื ืืืจืืคื ืฉืืื ืืืื ืืช ืืืกืืืื ืืืช ืื ืื-ืจืฆืืื ืืืกืืืช ืืฉืื ืืช ืงืืื, ืืฆืืื ืืืชื ืืืืคืื ืืชืฉืืื ืฉื ืขืฆืื ืืืขืื, ืืืืื ืืฉืคื ืืขืืจืืช, ืืขืืืืืช ืฉื ืืืจืื ืืืฉืื. ืจืืืื ืืืื ืืืืชื ืื ืช ืืืงื ืฉื ืืืืื ืืืขืจื ืืืื, ืืืืง ืืืืืก ืืืืจื ืื ืืฉื (ืืื ืืกืฃ ืขืื ืขื ืื ืฉืืืกืืืืช ืคืฉืื ืจืง ืืืืืจืื ืฉืืื ืืืืืฉ ืืืืืชื ืืฉืื ืืื): ืืืจืชืืืืงืกืื ืืขืจืืืื ืืืืืงืื ืืจื ืขืืจืืื ืืืืืกืืืืืจ ืชืืขืื ืืช ืืืกืืืืช. ืืืืืฆืื ืืืืืื ืืืืชื ืืขืืื ืืช ืืืกืืืืช ืืืื ื ืืืฉืืชืฃ ืืืฉืืืืื ืืืชืื ืื ืืืจืืืงืืื ืืืืชืจ, ืืืืจ"ืฅ ืืืืช ืขื ืืืืฉืข ืืฉื ืฉืืจ. ืืืกืืืื ืืืื ืืืจืืืืื ืืจืืคื ืืช ืื ืืืื ืืืฉืืื ืืืช ืืคืืฆื ืกืคืจืืชื. ืืืืืง ืืืฆื ืืช ืืืกืืืืช ืืืืจื ืืฉืืจื ื ืืืืชืจ ืืืืจื ืืืืืืืช, ืจืง ืงืฆืช ืืืชืจ ืืืืจ ืืืืจ ืฉื ืจืืคื ืืื ืื-ืืืกืืืช; ืืืชื ืืืื, ืืขืืืชื, ืืื ืคืชืืืื ืืืงืืช ืืืฉืืื. ืืืื ืืจืื, ืืืช"ื ืกืืคืจ ืืกืชืืื ืื ืืืกืืืืช ืืืกืจ ืืฉื ืืช ืืช ื ืืกื ืืชืคืืื ืืืฉืื ื ืืกืคืจื, ืื ื ืื ืข ืืืฆืืช ื ืืื ืืืขืืืฃ ืืืชืืงื ืืืืืื ืืจืืืืช ืืืืืจื ืืืฆืื; ืืขืืืจ ืืื ืขืฉืืจืื, ืืืืจื ืืืื ืืฆืื ืืืจืฉ ืคืจืืืื ืืืืกืงื ืืืืืื ืืช ืืืกืืืื ืืชืืืื ืืืืฃ ืืงืืฆืื ื ืืืืชืจ ืฉื ืืืืจืชืืืืงืกืื ืืืืืื, ืื ืื ืจืืื ืืืืืื"ืจืื ืกืืจืื ืืืฆืืจืฃ ืืืจืืื ืืงืืืืืช ืืืืจืชืืืืงืกื ืื ืคืจื ืืืขืืืคื ืืืฉืืืจ ืืช ืงืืืืืชืืื ืืืขืื ืกืืืืก ืงืื ืขืฆืืื. ืขืืืชื ืืฉืืจื ืืช ืืืืืืงืช ืืชืืืื ืื ืืืืฉืืจืื ืคืจืืืืืื ืืืชืจ. 
ืืืจืืฆืช ืืืื ื-19, ืืืกืืืื ืืชืขืงืฉื ืขื ืฉืืืืจ ืืืืืื ืืืกืืจืชื ืืืจืื ืฉืืื ืืฉืืชืฃ ืืืืืื ืืืืืจ ืืขืืจ (ืืื ืฉืืชืืืกืคื ืืืื ืคืจืืืื ืื ืกืื ืื ืืช ืืืืืืืื ืืืฆืจืืช), ืืขืื ืฉืืืชื ืืืื ืขืืจื ืืจืืื ืืืืจืื ืืืืืฉ ืืืืจื ื ืืืกืืช. ืืฉืืจืืืื, ืฉื ืืืฉ ืืืืช ืฉื ืื ืืคื ื ืืืขืฉ"ื, ื ืขืฉื ืืืืื ืืืืืข ืืกืืื. ืืืืืฆืื ืืืืืื, ืฉืื ืืฉืืืื ืืืืกืืจื ืืืฅ ืืขืืื ืืช ืืืืืืื ืืขืืืจ ืชืืืจื ืืืืืืช ืืชืจืืืชืืช, ื ืขืฉืชื ืืชื ืืขื ืืืืื ืขื ืฉืืืืจ ืืืกืืจืืช ืืขืืืืืช ืืืืจื ืืืืื ืืืฉื. ืืืื-ืืืืกืื ืฉื ืืืฉื ืืกืืจึฐืชึธึผื ืืช ืื ืืื ืืฆืืืง ืื ืื ืื ืื ืืจืืื ืื, ืฉืฉืืจื ืืชื ืืืืคืืืช ืืื ืคืจื ืืขืืื ืืชื ืืืืืจื ื ืืืืชื ืืฉื ืืืจ. ืืืื"ืจืื ืฉืื ืื ืืฃ ืืขื ืืงื ืขืจื ืจืืื ื ืืืืืืืช ืืืงืืืืื, ืืืงืืื ืืืืืื ื ืงืฉืจื ืกืืืืืช ืจืืืช. ืื ืืื ืืืฆืจื ืืช ืืืืืื ืฉื ืืืกืืืืช ืืืืฆืจื ืฉื ืืขืืืืืช ืืืืืืืช ืืืืชื ืืืช, ืืฃ ืืขืื ื ืืืืื (ืืขืชืื ืืจืืืง, ืืฉืืื ืืืื ื-20, ืืืืืื ืฉื ืงืืืืืชืื ืืฃ ืืคื ืืืชื ืืืฉืืจืืชืื ืืืืขื-ืืืขืืืืช ืฉื ืฉืคืช ืืืืืืฉ ืืืฉืื ืืื). ืชื ืืขืช ืืืฉืืื ืืืืชื ืืื ืืืชืืื ืืื ืืขื ืืฉืคืขื ืืืขืื ืืืกืืช, ืื ืืชืืืจืืช ืืืืืืืช ืฉืืืื ืืคืงืื ืืช ืืืืื ืืืจื ืืืจืืคื ืืฉื ืืช ื-1880 (ืืขืืงืจ ืขืืืืช ืืชื ืืขืืช ืืืืืืืืช ืืืืืืืืช ืืืืืืจื ืืืืื ืืช ืืขืืจ ืืื) ืืื ืืืื ืืืื ืืืจืื ืขื ืืืกืืืืช. ืืื ืขืฆืื ืืืชืชื ืขื ืืื ืช ืืืฆื ืืืืฉ: ื-1881, ืฉื ืช ืืกืืคืืช ืื ืื, ืืืกื ืจืื ืฉืืื ืืืืจืฉืืื ืืช ืืืฉืืื ืืืกืืืืช ืืจืืฉืื ื ืืืืืฉื ืืฆื. ืืืกืืืช ืืืืื ืฉืืืฉื ืงืืื ืืช ืืืชื ืืืื โ ืฉืื ืืื ืืืืฅ ืืกืืื, ืืขืืืืช ืืื ื ืืืกืืืืช ื ืฉืื ืืืขื ืืืืจื โ ืืื ืืืกื ืืช ืืืืื ืืคื ื ืืืกืืืืช ืืืืงื, ืืขืชื ืืชื ืืขื ื ืืืฆื ืืืืื ืืืืฆืขืื ืืืืื ืืื ืืืืจืช ืืืฉืืจ ืืช ืฆืขืืจืื ืืฉืืืืจื ืืืกืืจืชืืช ืืืื ืืงืจืกื. ืืฉืืืืช ืืืงืื ืืืืจื ืืืฆืจืืช ืจืืืช. ืืขืช ืฉืืืืืืื ืขืืจื ืืืืื ืืืื ืื ืืฆืื, ืืฉืืืืช ืืืชืงืืืืช ืืื ืืืกืืืื ืืฉืจืื ืื ืืื ืฉืืื ืืื ื ืขืืืื ืจื-ืขืฆืื ืื ื ืืืื ืืืืืื ืขื ืืืื. ืืืฉืฉ ืืืชืืืืืืช ืืืืืขื ืืืจืคืืช, ืฉืขืืืื ืืื ืชืืคืขืืช ืฉืืืืื ืืื ืืืื ืืืจืกื ืืงืจื ืืฆืขืืจืื, ืืืื ืืจืืฉืื ื ืืืชืืื ืืฉืืืืื ืืืืืืืื ืืืคืืื ืืืื ืืืกืืืืช ืืชื ืืขื ืขืืืืช ืืฉืืจืฉืืช, ืืืง ืื ืืกื ืืจืื ืฉื ืืืืื ืืืชืขืืจืจืช ืืฉืืืชื, ืฉืื ืืื ืืืฉืืื ืืื ืื ืฆืืจ ืืืชืขื ืืืคืืงืื ืืืฉืื. ืืขืจืื-ืืืืฉ ืื ืืืื ืืฉืื ืืกืคืจืืช ืื ืืืื, ืืฉืืืืื ืืื ืคืจืืืงืื ืจืืืงืื ืกืื ืคืจืกื ื-1863 ืืช ืืื ืชืืืืืื "ืงืื ืืกืืืื". ืืื ืืฆืืื ืืช ืืืฉืืื ืื ืืื ื ืืกืืืืืช ืืืชืจืคืงืช, ืจืืืงื ืืืืืืื ืืงื ืื ืืืืฉืื ืฉืืืง ืืื ืขื ืื ืืืฆืืจื ืืืืืืืช ืืืืืจื ืืช. ืืืฉืืื ืืืืื ื ืืืืขืืจ ืฆืื ืฆืืืืืคื ืืืกืก ืืืฉื ืืื ืืืืคื ืขืืื ื ืืืืืืจื "ืฉืืื ืขื ืืฉืจืื" ื-1868, ืฉืื ืืืืจ ืืช ืืืฉืืื ืืืขืฉ"ื ืืชืืืืชื ืืฉืืืืืช ืืืชื ืขืืืื ืืืืคื ืืืืื. ืกืืคืจื ืชืงืืคืช ืืชืืืื ืืื ื"ื ืคืจืฅ ืืืืืื ืฉืืืื ืืจื ืืืืจืื, ืชืืืจื ืืช ืืืกืืืืช ืืฆืืขืื ืกื ืืืื ืืืืื ืืจืืื ืืืื. ืื ืืื ืืืืงืจื ืชืืืืืชืื, ืืืืืื ืฉืืืื ืืื ืืืจืืืฆืงื, ืขืืื ืขื ืืืืืฉ ืชืืืืชื, ืขื ืืื ืืืืจืจืืช ืืืคืืืืืืืงื. ื-1912 ืืฉืชืชืคื ืืืื"ืจืื ืจืืื ืืืืขืืืช ืงืืืืืฅ ืฉืื ืืืงืื ืืืืืช ืืฉืจืื, ืฉื ืืกืชื ืืืื ืขื ืืืืืืช ืืืืจืชืืืืงืกืืช (ืืื ื ืฉืืื ืืืืืช ืจืืืื ืื ืืคืืื ืืืืจื ืืืกืืจืชื) ืืืืืืฉืช, ืืขืื ืืืกืืืืืืช ืืงื ืืืืช ืืืื ืืจืื ืืืืืฆืื ืืชื ืืื ืืืืืื ืืืฉ ืืื ืืฆืืื ืืช, ืฉืืืื ืืฃ ืืื ืืขืืืช. 
ืืืืก ืืชื ืืขื ืืฆืืื ืืช ืืืืฉื ืืงืจื ืืืกืืืื ื ืข ืืื ืขืืื ืืช ืงืืฆืื ืืช ืฉืชืืืจื ืืืชื ืืืขืฉื ืฉืื ืืืฉ, ืืืืืื ืืงืจื ืืฆืจืืช ืืืื ืืกืืืืช ืืื ืงืืืฉ ืืืืืจืื ืืกืืืืช ืกืืืืจ; ืขืืืจ ืืฉืืืื ืืชืื ื ืืืชืจ ืื ืืืืืฉืืช, ืฉืืคืืื ื ืืช ืืจืืืช ืืืฆืจืืช; ืืืื ืืชืืืื ืืชื ืืขืช ืืืืจืื, ืืงืจื ืืกืคืจ ืืืื"ืจืื ืฉืืฆืืจืคื ืืืคืืื, ืืจืืืชื ืืืืช ืจืื'ืื, ืืื ืจืื ืฉืืื ืืืื ืคืจืืืื ืื ืจืื ืืขืงื ืคืจืืืื. ืืืืืช ืืขืืื ืืจืืฉืื ื ืืืืืืช ืืืืจืืื ืืจืืกืื, ืฉืขืงืจื ืืืืช ืืืคืื ืืืงืืืืช ืืืฉืื, ืฉืื ืงืฅ ืืขืืืจื ืืืืืืืช ืืืฉื ื ืฉืืืืชื ืืืฆืข ืขืืื ื ืฉืขื ื ืืืกืืืืช. ืืงืืชื ืฉื ืืจืืช ืืืืขืฆืืช, ืฉืจืืคื ืืช ืืืชืืช ืืืื, ืืืจื ืืืื ืขื ืืชื ืืขื ืืืจืืืช ืฉืืื ืจืืกืื ืืคื ืืืืช, ืืืืจืืก ืืืืงืจืืื ื. ืจืง ืื"ื ืฉืืืจื ื ืืืืืช ืืฉืืขืืชืืช ืืืืชืจืช. ืืื ืืืืืืช ืืขืืื ืืื ืืืฆืจืืช ืืฉืื ืืช ืขืืืื ืงืื ืชืืืืื ืขืฆืื, ืืืืืื ืืคืืืื ืืขืฆืืืืช, ืื ืืื ืืื ืืืืืงื ืืขืื ืืฆืขืืจืื ืืชืืื ืื. ืืฉืืื ืืืืืื ืืืืช ืืช ืืขืืื ืืืกืืื ืืืืจื ืืืจืืคื. ืืืกืืืื, ืืืืืื ืืืืจืืจ ืืืืืืืื ืชืจืืืชืืช ืืกืืืืชื, ืืื ืืืจื ืงืื ืืืืืื ืื ืืฆืื. ืืืืช ืืืื"ืจืื ืืชื ืขื ืงืืืืืชืืื ืืืฆืจืืช ืจืืืช ืืืฉืืื ืื ืงืจืื ืืื. ืืืืืืืชื ืืืื"ืจืื, ืืื ืืจืื ืื ืืืื ืืืืืืืืื ืืกืืืืืจ ืืืืจื ืจืืงื ืืืขืืื, ืืขืืชื ืืืฉืืืช ืขื ืืคืงืจืช ืงืืื ืืืืืจ. ืืฉื ืืช ื-40 ืื-50 ื ืจืืชื ืืืกืืืืช ืืขืืืืช ืขื ืกืฃ ืืืืื, ืืฉืืืื ืื ืืฆืืืื ืืืขืืื ืฉื ืืชืจื ืกืืื ืืืฆืจืืช ืืืคืืื ืืืงืจื ืืืื ืืืชืืื ืืืืจื ืืื. ืฉืืจืช ืื ืืืืื ืฉืขืืชื ืืืืชื ืชืงืืคื ืืฆืืืื ืืืืื ืืช ืืืืช ืืืชืคืืจืจืืช ืืืืกืก ืืืืฉ ืืช ืืืกืืืืช ืืืื ืื. ืืฆื ืืจืืืืืืืช ืืืืฉืจ ืืจืืื ื, ืื ื ืฉืขื ื ืขื ืจืืฉืืช ืืืฉื ืืื ืืกืืืืื ืฉืืฉ ืืฆืืืืจ ืืืืืื ืืืคืืื ืืืืืฆืื ืืืืจืฉืช ืืืืืื ืื ืืขืืจ, ืืื ืขื ืืกืืืื ืืช ืืืขืจืืช ืืจืืืื ืืืืื ืืช ืืืขืจื ืฉืืคืฉืจื ืืชืืืืืืช ืชืจืืืชืืช ืืืฉืคืืืช ืืืืืืช ืืืจืืืืช-ืืืืื. ืืืืื"ืจ ืืืื ืืกืืืืืจ ืืงืื ืืช ืืฆืจื ืื ืื ืืืจืง ืืืฆืจ ืืืืืขืช ืขื ืืขืจืืืช ืจืืืื ืืชืจืืืช ืคื ืืืืืช. ืืขืืื ืื ืืฃ ืืช ืืื ืืื ืื-ืฆืืื ืืช ืืืงื ืืืช ืืืืืืืืืืืืช ืืฆืืื ืืชืื ืืื ืขืฉืืจืื ืืืืื ืืช ืงืืืืชื ืืืืจ ืืืืืฉ ืืฉืคื ืขืืงืจืืช ืืืงืืื ืืืจื ืืืื ืืืืืื: ืืืื ืืืืช ื-1948 ืฆืืื ืืืืขืื ื-100,000 ืืจืืฉืืช ืืืื ื-21. ืืืืื"ืจ ืืฉืจืื ืืืชืจ ืืชืืกืก ืืืืฉ ืืืฉืจืื ืืฉืืงื ืืช ืืกืืืืช ืืืจ ืืืืจืขืื ืืงืื ืฉืฉืจื, ืืขืืงืจ ืขื ืืื ืงืืจืื ืืืืฉ ืฉื ืืืืืื ืืืชืืื-ืืืืืืื ืฉืืฉืชืืืื ืืืื ืงืืื ืืืืืืื, ืืขืืื ืืฉืืื ื ืืจืืืช ืงืคืื ืืืช ืฉืืืืื ืืฉืืจื ืืช ืื ืฉืื ืืืฆืืืืจ ืืืืื. ืื ืืืจ ืืืื ืืืืืจืืช, ืืืกืืฃ ืืืื ืืืจ ืืืื ืจืืืืช. ืจืื ืืืื ืืืืจ ืืืจ ืืืืื"ืจืื ืืืืจืื ืฉื ืืืื'ื ืืฅ ืงืืืื ืืช ืขื ืคืื ืืฉืื ืื ืืืฉืจืื ืืืืจืฆืืช ืืืจืืช, ืืจืื ืืืจื ืจืืงื ืืกืฃ ืกืืืื ืืช ืืกืืื ืืขืืื. ืืื ืืืืจืคื, ืจืื ืืฉื ืืฆืืง ืืืืืจืฆืื ืืืื ืืช ืืฉืจืืืื ืฉืืชืืืฉืื ืืขืืจ ืืืืกื ืืช ืืกืืืืช ืคืฉืขืืืืจืกืง ืืืืฉื. ืืฆืจืืช ืืืจืืช ืืฉืชืงืื ืืืื. ืืืฆืจืืช ื ืืชืจื ืืฉืืช ืืช ืฉื ืืขืืืจื ืืืืจื ืืืจืืคื ืฉืื ื ืืกืื ืืืื ื ืฆืืื. ืชืืืืื ืฆืืืื ืืจืกืืืื ืืืชืจ ืืชืืืืื ืืฉืชื ืืกืืืืืืช ืืืืืืืืช. ื-1951 ืืจื ืื ืื ืื ืื ืฉื ืืืืจืกืื ืืืืชืจ ืืืืื"ืจ ืื"ื. ืืื ืืืืฅ ืงื ืฉืื ื ืืืื ืืืืชืืืืืช ืืืืืื ืขื ืืืกืืืืช ืืืจื ืืื ืืืคื ืืช ืืฆืจื ืืืืฉืืจ ืืืืจื ืืชืฉืืื, ืืจืืื ืืื ืฉืื ื ืฉืืื ืืืืฆื ืืจืืืืืืื ืืื ืืขืกืืง ืืื. ืืืฉืืืฉ ืืื ืืกืืืื ืืืฉ ืืืืืืื ืืืงืืจืืื ืืงืฉื ืขื ืืืืื ืืืืื ืฉื ืื"ื ืืืืขืจื ืืืื ืจืืืืช. ืืจืกืื ืืืขืจืืจืืช ืืื ืจืืคืช ืืคืื ืืฉื ืืช ื-70 ืื-80 ืืืจ ืคืืจื ืฉืืฉื ืืขืืงืจ ืืืืจืื ืืชืฉืืื, ืฉืืชืขื ืืื ื ืืชืืจืชื ืืืืืืจืืช ืืืืืกืืืช. 
ืืงืืืฅ ืืืขื ืืื ืื ืืืื ืืืกืคืจ ืจื ืฉื ืงืืืืืช ืขืฆืืืืืช, ืื ืืืช ืขื ืจืืืชืื, ืขื ืืืคื ืืจืืืืช ืืืืืื. ืืืืืื ืืืืืืจืคื ืืคื ืืื ืืืื ืืืืจื ืืชืฉืืื ืฉืืืืงื ืืช ืืขืืื ืืืืจืชืืืืงืกื ืืฉืืืฉ ืืืืจืื ืฉื ืืืื ื-20 ืืฉืืื ืืช ืืืกืืืืช ืืกืฃ ืืืืืื ืื ืืจืื ืืืื ืืฆืืืืจืืช. ืืื ืขื ืืชืขืฆืืืชื ืืืกืคืจืืช ืฉืื ืืกืืกืืืื ืืืืืงื ืืืจืืฉื ืฉืืคืืื ื ืื ืืช ืชืืจ ืืืื ืืืื ื-19. ืกืืืืืจ, ืืืืื ืืืฆืจืืช ืืืจืืช ื ืงืจืขื ืืื ืืืขื ืื ืืืขืืืื ืืืืื"ืจืืช. ืืืกืฃ ืื ืืฆืืืข ืขื ืืืขืืืืช ืืจืืืื ืื ืืกืืืืืช ืฉื ืืืกืืืืช, ืฉืืจืคื ืื ืืืื ืืฆืืืืจ ืืืืืื ืืจืื ืืขืืจ, ืืืืืคืชื ืืืฉืฉ ืืคืื ืืจืืื ืืืืงื ืืืืชืจ ืืืืื ืืืชื. ืืืฉืืช ืืืกืืืืช ืืืืกืืืจืื ืืืจืืื ืฉื ืืืกืืืืช, ืืืกืืืืืช ืืจืืืช ืฉืืืคืืขื ืืชืืื ืืืืืืื ืฉืืืืฉื ืืจื ืืืจืฉื ืืช ืืกืืืื, ืืืชื ืืืขื-ืคื, ืขื ืืชืืืืกืืช ืขื ืคื ืืืงืืจืืช ืืืงืืืื ืืืชืจ, ืืงืฉื ืขื ืืืงืจืื ืืืฆืื ืืฉื ื ืกืืืจื ืืืฉืืชืคืช ืืื ืืืจืืื ืืชืืื. ืืคื ืฉืฆืืื ืืืกืฃ ืื, โืื ื ืกืืื ืืื ื ืืฉืโ. ืืืคืืื ืื ืฉื ืืฉืื ืืขืืจ ืืืืืืืื ืืืกืืืืช ื ืชืืื ืขื ืืืื ืืืงืืืืื ืื ืืงืจื ืงืืืืื ืืื ืืฆื ืืืชื ืืืื ืื, ืืืืืจ ื ืืื ืขืื ืืืชืจ ืืืืก ืืืืืืืื ื ืคืืฆืื ืืืืืืื ืขืื. ืงืฉื ืื ืืืคืจืื ืืื ืืชื ืืขื ืืืงืืจ ืืฉืจืืชื ืืจืืฉื, ืงืืืช ืืืจ"ื, ืืืืืจืืข ืื ืืืืืชื ืืงืืจื ืืื ืืื ื ืืื ื ืืกืื ืฉืื ื ืืืงืฆืช. ืื ืื ืคืืืงืื' ืืขื ืฉืืืกืืืืช ืื ืืืืฉื ืืืขื ืืืจ ืฉืืื ื ื ืืฆื ืืืฆืืจืืช ืฉืืืืจื ืฉื ืื ืจืืืช ืืคื ื ืขืืืืชื. ืืงืืจืืืชื, ืกืืจ, ื ืขืืฆื ืืืืคื ืฉืื ื ืืฆืื ืืืฉืืื ืืฉื ืื ืืื ืืืฆืืจ ืชื ืืขื ืืืืกืืช ืืืืืจืื ืช ืืืื. ืืื ืืืืคืืื ืื ืืืืืืื ืขื ืืืกืืืืช ืฉื ืคืืฆืื ืืืื ืื ืืืืฅ ืื, ื ืืฆื ืืืืฉ ืขื ืืฉืืื ืืขืืืืช ืืืืจื ืืืืืื. ืื ืื ืืืืื ื"ืืืืื ืืคืฉืื" ืืขืืืช ืืืืจืช ืชืืืืื ืืืืืื ืืืืืืืกืืืื ืฉื ืคืืฆื ืืืืืจื ืืงืจื ืืชื ืืืื; ืืืคืืื ืื ืืืืืจ ืืกืคืจืืช ืืืกืจ ืฉืงืืื ืื ืืืืืช ืฉื ืื. ืืืชื ืืื ืืืกืืืืช ืืืืกื ืืจืื ื ืืฉืื, ืฉืกืืืืชื ื ืฉืขื ื ืขื ืืืืืช ืืืจืื ืืชืืจื, ืืชืคืืื ืืืืืจืืช ืืืชื ืืขื ืืืืืฉื ืืืืจื ืืช ืืจืืืืืช ืืืืืื ืืกืืืจ. ืื ืืืืืื ืืคืืคืืืจื ืฉื ื ืืืื ืืื ืืืกืืืื ืื ืืืืื ืืืฉืงืืขืื ืืงืืื ืืืชื ืืืืื ื'ืืืฉืื' ืืืจืฆืืื ืืืกืืื ืืื ืืกืจ ืืกืืก: ืืืืจืื ืื ืขืกืงื ืืืืกืืืงื ืื ืคืืืช ืืืจืืฉืื ืื. ืืฃ ืขื ืคื ืฉืืื ืืงืืฉืจืช ืขื ืืืืฉืช ืืืืื ืืืช ืืืืืื ืืืจืืืื ืืืืฆืขื ืืืชืขืืึผืช, ืืืงืื ืืกืืืืคืื ืืขืฆืืืื ืฉืจืืืื ืืคื ืื, ืืืกืืืืช ืื ืืืชื ืืืื ืืช ืืกืืืืฃ ืืืคืจืืฉืืช. ืืฆืจืืช ืืืืืช ืืฃ ืื ืืืื ืืืชื ืืืื ืืฆืืืืจ ืืืืืืื ืืกืืืืื, ืืขืื ืฉืืคื ื ืืชื ืืขื ืจืง ืงืืืฅ ืืืืืืกืื ื ืื ืื. ืื ืืืืก ืืช ืืชืคืืฉืืช ืืฉืืืืืช ืืืื ืืืืืื ืฉื ืื ืฉืืืื ื "ื ืื-ืืกืืืืช", ืืื ืืจืืื ืืืืจ. ืืืืื ืืื ืืืงืฉื ืืื ืืช ืืื ืจืืื ื ืฉืืขื ืืง ืืฉืจืื ืืืืืืื ืืขืืื ืืืืืจื ื ืืืฆืจื ืืืชืืืื ืืืืื ืกื ืืืื ืืื ืืจืืื ืื ืฉื ืืชื ืืขื. ืืชืืืืจืื ื"ื ืื-ืืกืืืืื" ืืื ืืืฉืคืขื ื ืืืจืช ืืชืจืืืช, ืืืชืื ืืืืจ ืืขื ืืืืืืจ "ืืืฉ ืืกืื ืืื", ืืืฃ ืืืงืืืื, ืื ืืงืฉืจ ืืื ื ืืืฆืืืืช ืืื ืงืืืฉ. ืขื ืืื ื ืืกืฃ ืฉืืงืฉื ืขื ืืืืจืช ืืืืืืช ืืืื ืื ืืฉืืชืคืื ืืื ืืคืขืจ ืืื ืืืกืืืืช ืืืืงืืืช, ืขื ืืขืฉืืจืื ืืจืืฉืื ืื ืฉื ืืืื ื-19, ืืืื ืื ืืืืืจืื ืช ืืื ืืืืื. ืืฉืื ืืจืืฉืื ืืชื ืื ืืชื ืืขืช ืชืืืื ืืื ืืืช, ืืขืื ืฉืืื ืืืจืื ืืชืืคืืื ืืืชืืกืกืืช ืืืงืืช ืืฆืจืืช ืืืืกืืืช ืขื ืื ืืืืืช ืืขืืืจืช ืืืจืืฉื. ืืชืืจื ืืจืืื ืืช-ืืืกืืืช ืฉืืืคืฆื ืขื ืืื ืืืืช ืืืกืืืืช ืื ื ืื ืื: ืืืื"ืจืื ืจืืื ืืืกืืคื ืืืืืช ืืืืื ืืงืืจืืื ืืืฆืืจืชืืื, ืืืืจืืช ืืืขื ืืช ืืืืจ "ื ืืืื", ืืชื ืืขื ื ืืชืจื ืชืืกืกืช ืืคืขืืื. 
ืขื ืืืช, ืืื ืืืคืืช ืืืืฉ ืขื ืืืืืื ืจืืื ืฉื ืืืืงืืจืื ื ืืืืงืืืช ืืืืืช ืืคืืงืื ืฉืืจืชืืื ืืืงืืืืื ืืืชืจ ืืืืืื ืจืืื ืืืช, ืืื ืืืืื ืชืืจื, ืืืกืืืืช ืจืืืงืืืื ื ืืืจืื ืืขืืืขืื. ืืืง ืืืืืื"ืจืื ืืืืฆื ืืฉืงืคืืช ืจืฆืืื ืืืืช ืืืืืกื ืืฉืืืืช ืคืืืชื ืืชืคืงืืืืื ืืืืกืืืื, ืืจืืื ืืืจืื ืืชืืงืื ืืืืืชื ืื ืืืืื ืคืืืืืืื ืฉื ืืฆืจืืช ืขื ืง ืืืืกืืืช. ืืืกืืืื ืื ืืืจื ืืื ืฆืืืง ืขื ืคื ืกืืืืืชืื ืืืืฉืืืช ืืืืจืืืืืืืช ืืื ืืืฉืืื ืืืจื ืืื ืืืฉืชืืื ืืืฆืจ ืฉืืืื ืืฉืชืืืื ืืืืชืืื. ืืืืฉื ืืืกืืกื ืืขืืื ืืืกืืก ืืืฉืืช ืืืกืืืืช ืืื ืจืขืืื ืืึดืืึทื ึถื ึฐืฆึดืึธื โ ืื ืืืืืช ืืืืืืืช ืืืงืื ืืืื, ืืืืืื ืชืืืจ ืืฆืืืื ืืชืืงืื ื ืืืืืจ (ืงื"ื ืขื' ื'): โืึตืืช ืึฒืชึทืจ ืคึธึผื ืึผื ืึดื ึตึผืืึผโ. ืืฉืงืคื ืคื ืื ืชืืืกืืืช ืื, ืฉืืคืื ืื ืืขืืื ืกืคืื ืืืฉ ืืืืืืืช, ืืื ืงืช ืืืงืืื ืืืื ืืงืืืช ืืืจ"ื ืืคืจื, ืื ืืืจืืื ืืืื ืืคืืืืกืืคืื ืืืกืืืืช. ืืืจื ืืืจืืื, ื ืืจืฉ ืืื ืืฆืืฆื ืืช ื ืืืืืชื ืืืืฉืืืช, ืืืืคืฉืืช ืืืืืชื-ืืืืืืช, ืืืืื ื "ืืืจ ืืื-ืกืืฃ". ืื ื ืืฆืจ "ืืื ืคื ืื" ืฉื ืืชืจ ืื ืจืง ืจืืฉืื ืืืืจ ืืื-ืกืืฃ ืืื ืคืจื ืืืืืจื ืืขืฆืืืช ืืืืืืื. ืืื, ืืืื ืื ืืืื ืืืชืงืืื ืืื ืืจืข, ืกืชืืจืืช, ืจืฆืื ืืืคืฉื ืืชืืคืขืืช ืืืจืืช ืฉืื ืืื ืืคืฉืจืืืช ืืชืื ืืืื-ืกืืฃ ืืืืฉืื ืืืืืื; ืืืคืฉืืืช, ื ืืชื ืืื ืืืจืื ืื ืืช ืืขืืื. ืื ืืฆืืืืช ืืืงืื ืชืืืื ืืืืืืื ืืืงืืจื ืืืืืื ืืืืืืชื ืืจืืื ืืช ืฉืืืขืืื ืืื ืืืื ืืืืคืก. ืืืงืืื, ืืืื-ืกืืฃ ืืื ื ืืืื ืืืืคืืข ืืฆืืจืชื ืืฉืืื ืืชืื ืืืื ืืคื ืื ืืืืืจื ืืฆืืฆื ืืช ืขืฆืื ืืืฉืืืืช ืืืืืืช ืฉื ืืชื ืืชืคืืฉ ืืืืฉืื. ืื ืืชืงืืืืช ืืืืืช ื ืืืืืื ืืืืืงืืืช ืืื ืืืฉืืืืช ืืคืืืืช ืืืืืืืช, ืืฉืื ืืืช ืืชืืืจื ืขืฆืื ืืฉื ืืื: ืืฉื ืฉืืื ืฆืจืื ืืฆืืฆื ืืช ืขืฆืื ืืื ืืืชืืื ืืขืืื ืืืืืจื, ืื ืฉืืื ืขื ืืืฉืื ืืื ื-ืืืื ืืืืืื ืืืชืขืืืช ืืืืชืืื ืืืืฉ ืขื ืืืื-ืกืืฃ. ืจืฉ"ื ืืืืื ืชืืืจ ืืช ืืืืืืงืืืงื ืืื ืืืฉืืืืชืื ื"ืชืืจื ืืืจ": โืชืืืืช ืืจืืืช ืืขืืืืืช ืืืื ืืืฉ ืืื ืืืืคืื ืืืืื ืช ืืฉ ืืืืื ืช ืืืโ. ืชืืืืช ืืืื ืืื ืืืืฅ ืืช ืื ืืฆืืฆืืช, ืจืืฉืืื ืืืืจ ืืืงืืจื, ืฉื ืฉืื ืืงืืืคืืช ืืืฆืืืืช ื"ืืื ืืคื ืื", ืืืขืืืชื ืืฉืืจืฉื ืืืืขืื ืืืื ืืชืงื ืืช ืืืจืืื. ืืืืืจ, ืชืคืืฉื ืื ื ืฉืืืช ืืืงืืื. ืื ืืขืื ืฉืืืืจืื ื ืืฆืืื ืืืชื ืืขืืงืจ ืืืงืฉืจืื ืงืืกืืืืืืืื, ืืืืืช ืืืืคื ืฉืื ืืื ืืฆืืฆื ืืช ืขืฆืื ืื ืืขืืื ืืขืฉืจ ืืกืคืืจืืช ืืืืืกืื ืืื ืืืืืชืื ืืฉืื ืืช ืืืืืืืืช ืืื, ืืชื ืืขื ืืืืฉื ืืืืื ืืืชื ืขื ืืคืจืืื ืืคืฉืืืื ืืืืื-ืืืืืื ืืืืชืจ ืฉื ืืืืื. ืืืกืืืืืช ืืืกืืืืืช ืืงืืืฉืืช ืืงืื ืจื ืื ืืืืืืืช ืืืฉืืืื ืืื "ืืื", ืืืฉืืขืืชื ืืืคืืื ื"ืฉืื-ืืืจ" ื"ืืื ืกืืฃ", ื"ืืฉ" ืืืฉืื, ืืืืื, ืื ืืืืืช ืืืชื ืืืืฆืขื ืืคืขื ืื ืืืื ืช ืืืฆืืืืช ืขืฆืื. ืจืื ืืืืืืจ ืกืืืื ืืืช: โืืืจื ืืืกืืืืช ืงืืขื ืฉืื ืืืจ ืืืื ืืขืช ืืืขืื ื ืืืช ืืืจ ืืืืคืืื, ืืืขืืจ ืืื ืืืืช ื ืืืืช ืฉื ืืืฆืืืืช ืืกืืคืืช ืืฆืืื ืืืืชื ืืืื ืกืืคืืช... ืืืฆืืืืช ืืืืื ืืช ืืขืืื ืืกืืื ืืฉืขื ืฉืืื ื ืืืืช ืืืืช ืืืื ืืืืงืฉืช ืืืฉืืฃ ืืช ืืืืชื ืืืืืืืช ืืืืืื ืืืืคืืื ืืืฉืื.โ ืืืฉืืื ืจืืื ื ืืืจื ืืขืงืจืื-ืขื ืื. ืืืกืืืืช ืืืืืฉื ืืช ืืฆืืจื ืืืฉืื ืืืงืืช ืืืืืืื, ืฉืคืืจืฉ ืืงืืื ืืืชืืืืืช ืืืกืืืช ืขื ืืฉืืชื. ืืืืจ ืฉืืื ื ืืื ืืื ืืงืื, ืคืฉืืื ืืืฉืืขื, ืืืื ืืฆืจืื ืื ืืืืื ืืืืืง ืื ืืื ืฉืขื. ืืฉื ืื ืืืื ืขืืื ืืืฉืชืืง ืืช ืืืคืื ืื ืืืืื ืืืืชืืืืข ืืืฆืืืืช ืืืืช ืืืืจื ืืืื, ืืื ืืืฉืื ืืืงืืช, ืืขืื ืืืืืื ืืฉืืืื. ืืืืงืืช ืคืืจืฉื ืื ืืืขืื ืืฆื ืฉื ืืืืืืืช ืคืจืืืช ืืจืื ืืืื, ืืกืืคืงื ืืืืจ ืืกืืื ืืืืฃ ืืืฉืืื ืืืืง ืฉื ืืข ืื ืืงืืื ืืืืชืจืื ืืื ืืฉืืจ ืืฉืืชืืืช. 
ืืืกืืืืช ืกืืคืงื ืฉืกืชืื ืืืื ืืคืจืืงืช ื ืืืืช ืืื ืขื ืืื ืืขืืจืชื ืืชืืื ืืืืฉื ืืืืืคืฉื. ืื ืืืืื ืืื ืืืืจืื ืื ืจืง ืืืฉืื ืืืงืืช ืืขืฆืื ืืื ืื ืืื ืืืช ืืช ืงืืื ืืืื. ืืื ืืื ืฉืืืื ืืจืืืช ืืจืืืื, ืืืืชืขืืืช ืืืงืกืืืืช ืฉื ืืฆืืืงืื ืืขื ืืงืจืื ืืกืคืื ืื ืืช, ืืฆื ืืขื ืืืชืจ, ืฉื ืืืื ืืคืฉืื ืืืืื ืืชืคืืื. ืื ืืฉื ืืืืงืืช ืืจืื ืื ืืืืฉื ืฉื ืืืืื ืืืฉ, ืืืืืจ ืืืืื ื ืืื ืฉืืขืื "ืขืื ื ืืืฉืจ" ืื ืืืชืืช ืืฆืืืืช ืืคื ื ืืืื ืืฆื ืฉืืื ืฉืืคืื ืืขืืื ืืื ืืืืจื, ืืืืฉ ืื ืืื ืจืืื ืืืช, ืืจ-ืืืขืช ืฆืจืื ืืืขืช ืื ืืืืืจ ืืืฉืืื ืืืจืืืช ืืืขื ืื ืืช ื ืืืืืช ืืืืจื ืืืื. ืืฉ ืืืชืืืจ ืขื ืืืื ืืืกืืืช ืืงืืื ืืืฉืื ืืืืชืืงื ืืขืืืืช ืืืืจื ืืืืืงืืช ืื. ืื ืฉืืฆืืื ืืืืฉื ืืืช ืืืืคื ืืื ืืืืจ ืืืชื ืชืง ืืืืืืืชื ืืขืฆืืืช ืืืชืคืืฉ ืขืฆืื ื"ืืื", ืืื ื ืฉืืชื ืืืืื ืืืฉืื ืืืงืืช ืืืื ืืืืืืจ ืืงืืื ืฉืืื ื ื ืขืฆืืื ืื ืคืจื ืืื ืฉื ืืืืืื. ืืฆื ืืืืืืื ืื ืืืื ื ืืชืคืฉืืืช ืืืฉืืืืช, ืฉืืื ืืืืคืื ืืืืืืงืื ืฉื ืืืฉื ืืฆืืฆืื. ืืื ืืืืืช ืืกืืื ืืืฉืื ืืช "ืืืืื ืืืฉ" ืืืืชืืืจ ืขื ืืืคื ืืืฉืจ ืืฉ ืืืืืง ืืฆืืื ืืช "ืื ืคืฉ ืืืืืืช" ืืืืขื ืืช ืืืื ืืืืชืืืจ ื"ื ืคืฉ ืืืืืงืืช" ืฉื ืืฆืื ืืืงืืจ ืขืืืื ืืฉืืืคืช ืืืืขื ืืฉืื ืืืืชืจืืื ืืืื; ืืืืื "ืื ืคืฉ ืืืืืืช" ืืืืจ ืืืืขืฉืืช ืืืืฆืขืืช ืืชืืื ื ืืช ืืืจืืืจ ืืชืืืืื ืขื ืืืขื ืื ืกืชืจ ืืืืืืชื ืฉื ืืขืืื, ืืืชื ืืฉ ืืจืืืช ืื ื"ืขืื ื ืืืฉืจ" ืฉื "ืืืืืืช" ืืื ื"ืขืื ื ืืจืื" ืฉื ืืืืืืืช. ืืื ืืืฆืืื ืืื ืืฉ ืฆืืจื ืืืืื ื ืืืงื ืืืืฉืื. ืขื ืืื ืืืืื ืืช ืืืื ืืื ื ืืฉื ืืจืืื ืืืืชืจ ืืชื ืืขื, ืฉืืฉื ืชื ืขื ืืืืืช ืืืืืช ืืืฆืืืืช ืืขื ืืืืืื ืืื ืืฆื ืืื ืืืฉื ืื (ืืืืจ ืืจืื, ืืืืื ืืืฉืื) ืืืื ืืืชื ืืขืืขืื ืืืืืืื ืืื ืืืฉืื ืืืจืืืจ ืืขืืืช ืืืืืจ ืื ืืขืฉื ืืืฉื ืืืจืืืช ืืช ืืืื ืื ืขืื ืขื ืืืชื ืืืฉืืจ ืืื ืืืืฆื ืฉื ืืืฉ: ืืืื ื ืขืฉืืื ืืืืชืจืื ืืื ืืืฉืคืขื ืจืืื ืืช ืืืืืื ืื ืฉืืืืืช, ืืืจืืืจ ืคืืื ืขืฉืื ืืืืืช ืืกืืื ืืืฉ ืืืคืืื ืืืชืจ ืืืขืฉื ืงืื ืงืจืื. ืืืืฆืขืืช ืืืชืืื ื ืืช, ืืืกืื ืืืืืืืื ืฆืจืื ืืืืฉ "ืืฉืชืืึผืช", ืืขืื ืืืืฉืืช ืขื ืืื ืืช ืืฆืื ื ืช, ืืืคื ืขื ืืื ื ืืขืืื ืืื. ืื ืื ืืืื ืื ืืืชื ืืืื ืืฉืืื ืขืืื ืืืืื ืืช ืขืฆืื ืืืฉืคืืชื, ืื ืืื ืืืืจ ืืืคื ืื ืืืืืื ืืช ืืฉืืืืืช ืืืฉืงืจืืืช ืฉื ืืืฉืื. ืฆืืืงื ืืืกืืืืช ืืคืฆืืจื ืืงืืื ื"ืืชืืื" ืืืืืืข ืืืจืื ืืื, ืื ืื ืืืื ื ืืืื ืฉืืืืืจ ืืืืืง ืคื ืืื ืงืฉื. ืืืชื ืฉืืื ืืืืื ื ืืืืืช ืืืืืืืช ืฉื ืื ืจืึถื ืืืื ืืืืฉืืชื ืืืชืืืชื ืฉื ืืขืืื ืื ืชืคืฉ ืืืืฉืื ืื ื ืืฉื ืืจืืื ืืกืคืจืืช ืืชื ืืขืชืืช. ื ืืืจืช ืืฉืืื ืืืืืื ืฉื ืืืืืืืืช ืืื ืืืืจ ืืจืื ืืื ืืขืืืื ืืืฉืืืืช. ืืฉื ืฉืืืื ืกืืฃ ืฆืืฆื ืขืฆืื ืื ืืืงืื ืืืืื ืื ืืคืฉืจ ืืืขืืืช ืืช ืืจืืืืื ืืืจื; ืืืืืชื ืืืื ืฉืืกืคืืจืืช ืืืืืืืช ืืฉืคืืขืืช ืขื ืืืชืจืืฉ ืืชืืชืื ืืช, ืื ืื ืืคืขืืืืช ืืคืฉืืืืช ืืืืชืจ ืืขืืื ืืื, ืื ืื ื ืขืฉืืช ืื ืืจืฉ, ืืืืืืช ืืฉื ืืช ืืช ืืงืืจื ืฉื. ืืืกืืืืช ืืืืื ืฉืืขืฉืื ืืืืฉืืื ืฉืืืฆืขื ืืืื ืืืืืื ื, ืืื ืจืืงืื ืืืืืื, ืืืื ืืฉืืจืจ ืืช ืื ืืฆืืฆืืช ืืฉืืืืื ืืงืืืคืืชืืื ืฉืกืืื ืืงืืกืืืก. ืืืืจ ืฉืืืืืจ ืืื ืกืคืื ืจืกืืกืื ืืืืืืื, ื ืืจืฉื ืื ืืขืฉืื ืืืืจืืื ืืื ืืืืฅ ืืืชื. ื"ืขืืืื ืืืฉืืืืช" ืืื ืคืืื ืฆืืื ืืืง ืืื ืืื ืืืืื, ืฉืื ืืฆืจื ืืขืื ืืฉืืืื ืืื ืืืืื ืืืฆืืืช ืืืชืืจื ืืืขืฉืื ืจืืืืื ืืืื-ืืืืืื, ืื ืกืื ืงืืืฉื ืืืืื. ืคืืื ืฆืืื ืื ืืื ืืชืืืฉ ืืืชืืช ืืืจืืช ืฉืืชืืกืกื ืขื ืืงืืื, ืืื ืืฉืืชืืืช, ืื ืฆืืืงื ืืืกืืืืช ืืื ืฉืืจื ืื ืืืื. "ืืขืืืื ืืืฉืืืืช" ืจืืกื ื ืืืืืืื ืจืง ืืืืืื ืืื ืืืื, ืืจืืื ืฉื ืืืกืืืื ืืื ืื ืืืกืชืคืง ืืชืจืืืช ืืกืฃ ืืจืืืชืืื ืืืืฆืจ, ืืืขืื ืฆืืจื ืฉื ืขืืืื ืื. ืื ืืืื ืืืืืืงืื ืฉื ืืขืืืื ืืืฉืืืืช ืืื ืืืฉื ืืืฉืืช ืืฉืคืข. 
ืื ืฉืืฉืื ืจืืืช ืืืืืืช ืฉื ืืืงืืช (ืืืืื ื, ืืฆืืืงืื ืืื ืืืืื) ืืืื ืืื ืืกืคืื ืืช ืืืืจ ืืืืืื ืื ืืกืคืืจืืช ืืืืืืืช ืืืืฉืื ืืื ืื ืฉืืชืืื ืืืฉืคืขืืช ืืืืืืืช ืืขืืื ืืื. ืืื ืืืื ืชืืืฉืืช ืืืื ื ืืืืืืื, ืืืฃ ืืขืืืืช ื' ืืฉืืจ ืขื ืืื ืื ืจืืื ืืื, ืืื ืืืจืื ืคืจืืืืืื ืืืชืจ ืืื ืืจืืืืช, ืคืจืืื ืืฉืืฉืื ืืืืื. "ืืืฉืืช ืืฉืคืข" ืกืืคืงื ืชืืจืืฅ ืืืง ืืืืืจื ืืืื ืืืืื ื ืืขื ืื ืืืจ ืื ืืืกืืืืช. ืืืืืช ืื ืืืืืื ืืืฉืืื ืืชืืจืช ืืืกืืืืช ืื ืขื ืืกืืืืช ืงืืืืืื ืืืจืื, ืฉืชืืืจื ืืืฉืืืืื ืืืืคืืืื ืื ืืื โ ืืื ืืจืข, ืืืืื ืืฉืคืืืช, ืืืืื ืืืืจื ืืขืื. ืืืื ืืชื ืืขื, ืืื ืืืจืื ืฉืืืฉืคืขื ืื ืืงืืื, ืืืืืฉื ืื ืืื ืืคืืืช ืืช ืื ืืฆืืฆืืช ืืืืืื ืชืืงืื ืืื ืื ืืืืข ืขื ืืืฉืื, ืืฉ ืื ืืืคืฉ ืืืชื ืืืืืืืช ืืืื ืืืจืฉืข. ืขืื ืืืกืืืืช ืืืืงืืืช ืืืืืฅ ืื ืืืืืืง ืืืฉืืืช ืืจืืช ืืขืืืื ืืฉืขืช ืืชืคืืื, ืืื ืืืชืืืื ืืืืชืืืจ ืืืชื ืขื ืืื ืืืจื ืืฉืคืืืช ืืืื ืืืืืืืช ืืืืจื. ืืืืื ืืืืจืืืฉ ืคืืชื ืชืืจื ืืขื ืืื ืืืืคื ืฉืื ืจืืฉืืช ืืืืคืื ื ืืืชืื ืงืฉืืจืื ืืืื ืืืืืื ืืื ืืฆืืื ืื ืคืฉ ืืืื, ืืชืืืื ืชืืืจื ืคื ืืื ืืืฉืงืฃ ืืช ืขืฉืจ ืืกืคืืจืืช ืืืืืืืืช. ืืืกืืืื ื ืงืจืื ืืืชืืืื ืืฉืืจืืช ืขื ืืืืืช ืจืขืืช ื"ืืฉืืืจ" ืืืชื ืืืงืื ืื ืกืืช ืืืฆื ืืข ืืืชื. ืจืขืืื ืืช ืงืืฆืื ืืื ืืื ืขืืืขืื ืื ืื ืื ืขื ืคื ืจืื ืืืกืืืืช ืืืืืืจืช, ืืื ืืืืื"ืจืื ืฉืขืกืงื ืื ืืฉื ืืงืคืืื ืืืืืืฉ ืื ืืื ืื ืืืืขืืื ืืืื ืืื ืจืง ืื ืืืจืื ืืขืืื. ืขืื ืืฉืืื ืฉื ืืคื ืื ืชืืืื ืืืกืืื ืืืืชื ืขืืืื ืขื ืงืืืื ืฉื ืืฉืืื ืคืจืืืช ืืฉืืจืจืช ืืืจืืื. ืื ืืืื ืืจืื ืืจืืฉืื ืื, ืกืืจื ืืืขืฉ"ื ืืืืฉืืืื ืื ืืืืจ ืฉืืื ื ืืื ืืื, ืืืืื ืืื ืืฉืืื ืขื ืืื. ืืขืื ืฉืงืฉื ืืืคืจืื ืืช ืชืืจืช ืืืกืืืืช ืื ืืงืืื, ืืืืคืืื ืืืืืืง ืืืืืืื ืฉื ืืชื ืืขื, ืื ืืืืืืื ืืื ืืืืจื ืืืกืื, ืืื ืืื ืืื ืืจืืื ื ืฉื ืืขืื ืืืกืืืืช, ืืฆืืืง โ ืืืืืจ ืืืื ืื ืืชืืืจ ืืืืื ืืืืื "ืืืื"ืจ" (ืืืื ื ื, ืืืจื ื ืืจืื ื) ืื ืืืื ืื ืืขืืื "ืจืื". ืืจืขืืื ืฉืงืืืืื ืืื ืืืจ ืฆืืืงืื ืืจืื ื ืืฉื ืืฉืคืข ืืขืืื ืืืฉืื ืงืื ืืชืืืืื ืืืขืฉ"ื, ืืกืคืจ ืืืืืจ ืืฆืืืจ ืขื ืื ืฉืืื ืืื ืขืืืื, "ืืชืคืฉืืืชื ืืืฉื ืืื ืืจื ืืืจื". ืืืกืืืืช ืืคืื ืืช ืืืฉื ืืฆืืืง ืืืกืืก ืื ืฉืืืชื ืืืืืื ืืืจืืื ื, ืขื ืืื ืื ืฉืืกืคืจืืชื ืงืืื ืืชืืืจ ืืฉืืขืืช ืฉืื ื ืื ืืืืช ืืื ืืืงืืจืืช ืฉื ืืื ืืจื ืฉืืื. ืืฉืืชื ืืขื ืืฆืขืืจื ืืชืคืฉืื ืืขืืจ ืืืขืื ืืกืืืจ ืืืงืื ืฉืืืืชื ืืชืืืืชื ืืฆืืจื ืืืื ื ืชืืืืื, ืืืื ื ืจืืฉืื ืฉืืกืืืื ืืืื ืืช ืคืฉืืื ืืขื ืืช ืชืืจืชื ืืกืืืื ืงืื. ืื ืืืืจื ืชืืืืืืื ืืชืงืฉื ืืืชืืืื ืขื ืืืืืืงืืืงื ืืืืจืืืช ืืื ืืฉ ืืืื ืืคืืื ืืจืื ืืืื ืืืงืืืืืืช; ืืชืงืืื ืฉืืืืจืื ืืคื ืืื ืืื ืืช ืืช ืืคืืืืกืืคืื ืืื, ืื ืจืง ืื ืืฉืคื ืืืืืฅ, ืืืืชื ืงืืืฉื. ืืื ืืืืื ืืคืฆืืจื ืืงืืื ืืืืืื ืืืืืื, ืื ืืคืชืจืื ืืืืืชื ืฉืืืืฉื ืืืขื ื ืืื ืืืืฉื ืืืจ-ืชืคืืฉื ืืื, ืืกืืื ืืช ืืคืืืชื ืืชื ืืขื ืืืืื ืช ืืขืฆืืืืช: ืื ืืื ืืกืืื ืฉืืื ืืงืื ืืจืื ืืฉืืื ื ืืขื ืืฉืืฉ ืืืชืืืืืช ืืื ืฉื ืจืขืืื ืืชืื ืขืืืจ ืืงืื ืืจืืื ืืืื ืืืฉื ืืขืฆื ื ืืืืืชื. ืืื ืืื ืืกืืื ืืืื ืืช ืืืฉ, ืืขืืื ืืืฉืืืืช, ืืขืืืช ืืขืืืืืช ืืขืืืื ืื ืืืืฉืื ืืืืงืืช ืืืืืคื ืืืื ืืืื ืืช ืื ืืืืืืืืื ืฉืืชื ืืขื ืืฆืืื. ืืืืจ ืฉืืจืื ืืืืืื ืฉื ืืฆืืืืจ ืืืกืืื ืื ืืื ืืกืืื ืืขืฉืืช ืืืช ืื ืืคืืื ืืืชืงืจื ืืื, ืื ืืื ืืืืจืื ืืืืืืง ืื ืืืงืื ืืื ืืืฉืื ืืคืืืช ืืืืื ืืืื ืืืฉืื ืืืื ืฉื ื. ืืืฉืืืชื ืืืฆืืื ืืืืจืืืืืืช (ืืคืืืช ืืืืจืืช ืืจืืฉืื ืื) ื ืืขืื ืืืืืืฉ ืืืืื ืื ืืช ืืืืืชืืช ืฉืืชืืจืช ืืชื ืืขื, ืืกืคืง ืืื ืืืื ื ืคืฉืืช ืืืืื ืกืคืืงืืช ืืืฉืฉืืช. 
ืืื ืชืคืงืืื ืืื ืืจืื ืืืชืจ ืืจืืื ื ืืจืืื: ืืืืจ ืฉืืื, ืืืืืจ, ืืกืืื ืืืชืขืืืช ืืืืืฉืื ืืช ืืฉืคืข ืื ืืกืคืืจืืช ืืืืืืืช, ืืืืื ื ืชืืืืื ืฉืืื ืฆืื ืืจ ืืจืื ืืืืื ืืช ืืื ืืฉืคืขืืช ืืืืืืืช ืืืืจืืืช ืืืืจื. โืืฆืืืืืชื ืฉื ืฉืื ืชืืืืจืื ืื ืฆืืื ื ืืช ืืคืืืชื ืฉื ืืืกืืืืช ืืชื ืืขื ืืืจืชืืช ืืืืืฉืชโ, ืฆืืื ืืื ืืืื ืจ. ื ืืื ืืชื ืฉื ืืฆืืืง ืืืืชืจ ืขื ืืืงืกืืื ืืืจืืื ืืืฆื ืืืืืงืืช ืืื ืืื ืืื ืืช ืขืืชื ืชืืืจื ืื ืืืืืช ืืืืชืืจ ืืื ืฉืขืฉื ืืืขื ืืขื. ืืชืืืจื, ืื ื ืืจืฉื ืืกืคืง ืื ืืช ืฆืจืืื ืืืืืจืืื; ืืื ืื, ืฆืืคื ืืื ืืฆืืืช ืื ืืืืคื ืืืืื, ืฉืืจื ืฉืืืชื ืืกืคืืจืืช ืืืืืืืช ืืงื ืชื ืื ืืืข ืขืืืฃ. ืืืืฉื ืฉื "ืืจืืืช ืืฆืืืง" ืื ืขื ืืื ื ืืขืืื ืืืืชื ืืืืง ืืฉืืืืืชื ืืืฆืื ืืช ืืืืืืื ืืืคืืืช ืืช ืื ืืฆืืฆืืช ืืฉืืืืื, ืืืงืืฉืืจ ืฉื ืขืฉื ืื ืืื ืชืคืงืืืื ืืจืืื ืืื ืืืคืืืืืืื ืืขื ืืง ืืืืืืืฆืื ืืกืืืืชื ืืฉื ื ืืืืฉืืจืื: ืื ืืืชื ืืช ืืงืืืื ืืืืชื ืืืง ืืืืขืืื ืืืืกืื. ืืฉืื ืืืกืืื ืืื ืืฆืืืงืื ืืืฉื ืืืขืจืฆื ืื ืืื ืืช ืืืจืคืืช ืืืืืืืจืคืื ื ืจืืืช. ืื ืืืฉืื ืืืืืืืช ืื ืืืงืจื ืืชืืืจื ืืืืื ืขืฆืื. ืจืืฉืืช, ืืืืจ ืฉืืืื ื ืืขื ืื ืืืื ืืืื ืืช ืขืฆืืืืชื ืืื ืืื ืืืฉืื ืืืืงืืช ืืืืื ืฉืืื, ืฆืืคื ืืื "ืืืชืืื ืืฆืืืง" ืืืื ืขื ืืืืจื ืืื ื ืขืฉื ืืื ืืืืื ืืงืฉืจืช ืืจืื ืื ืงื. ืืื ืฉืืืฉ ืืฉืจ, ืืืฉื ืฉืคืข ืืืืขืื ืขืืืจ ืืงืื ืืืขืืืจ ืืช ืืชืคืืืืช ืืืืงืฉืืช ืฉืืื ืืืืืื ืืืคืื. ืืชืืืฉื ืืขืจืืช ืืืกืื ืืจืืจื ืืื ืืื ืืื ืืขืืชื, ืืฉืืื ืืฆืืืชืื ืื, ืคืื ืื ืืืื ืืื ืฉืืื ืืื ืืงืื ืขืฆื ืืืชืื ืกืื ืืืฆืจื ืืื ืืกืคืื ืืฉืจืื ืืืชืขืืืช, ืืืื ืืกืคืง ืืจืื ืืฉืคืข. ืืฆืืืง ืืฃ ืืืื ืคืื ืงืฆืื ืืืืืืช ืืืื ืฉื ืคืืื ืฆืืื ืืฉืืื ืืงืืืืชื, "ืืฉืื-ืืขื ืืขืืชื ืืืืื ื" ืืืฉืื ื ืฉื ืืืกืฃ ืื. ืืคืืื ืื ืืื, ื ืืงื ืื ืืจื ืืช ืืืื ืืืืืื ืฉืืชืืื ืืื ืืฉืืจ ืืฉืืชืืืช. ืืฃ ืฉื ืืชืจ ืืจืืกื ืืืืืจืืช ืืืจื ืืื, ืืฉื ื ืืงืจืื ืืื ืืชืคืจืฅ ืืขืืฆืื: ืจืื ื ืืื ืืืจืกืื ืืขื ืืืืื ืฉืืื "ืฆืืืง ืืืืช" ืืืืื ืืืืจ, ืืจืืื ืืืกืืื ืจืื ืื ืื ืื ืื ืฉื ืืืืจืกืื ืืืขื ืื ืฉืืื ืืืฉืื ืืฃ ืืืืจ ืืืชื ื-1994. ืืืืื ืืืืงืืืื ืฉื ืืืกืืืืช ืงืืื ืืฆืืืง ืืช ืืขืืื ืืืื ืืืจืืืื ืืืคืื ืช ืืืงืืืืช, ืื ืขืื ืืขืฉืืจ ืืจืืฉืื ืฉื ืืืื ื-19 ืืืื ืื ืืืื ืืชื ืืขื ืืชืืืข ืืืืืืืฆืื ืืืชืืกืก ืขื ืืืืืก ืืฉืคืืชื ืืงืืืืืื. ืืืืคื ืื ืืฆืืืง ืงืฉืจ ืืื ืจืื ืืืฉืืืืช ืืืชืจืื ืืืขื ื ืฉืกืืืื ืื ืืืืชื ืฆืจืืื ืืืชืืื ืื ืืืืคื ืืืืฉื, ืืืืื ืืืืขื ืืงืืืขื ืฉ"ืืื ืฆืืืง ืืื ืื ืฆืืืง". ืืฉืืฉืืชืืืช ืืื ื ืืืช ืืืขื ืื ืืืฆืจืืช ืืืกืืืืืช ืืื, ืืืฉืคืืืช ืืืืื"ืจืื ืืงืคืืืืช ืขื ืคื ืจืื ืืืื ืฉื ืืื ื ืืืื ืขืฆืื. ืฆืืืงืื ืืกืืืืื ืืืืง ืื ืืืฆืจืืช ืืืืืจืื ืืช ืคืืชืื ืชืืจืืช ืืคืืืืกืืคืืืช ืืืจืืืืช ืืืงืืจืืืช ืฉืืืืืฉื ืื ืืืช ืฆืืืื ืืกืืืืื ืืืืฉืืช ืืืกืืืืช. ืืืง ืืืฉืงืคืืช ืืื ืืื ืืขืืืช ืืฉืคืขื ืขืฆืืื ืื ืืขืืจ ืืชื ืืขื, ืืืืจืืช ื ืืืื ืขื ืืืช ืืืฆืจืืื. ื ืืชื ืืกืืื ืืช ืืืฆืจืืช ืืืกืืืืืช ืืฉืื ืืช ืืคื ืืืฉืืื ืืขื ืืื: ืื, ืืืืืื, ืืืง ืืืืคืืื ืืช ืืืืื"ืจืื ืฉืื ืืขืืงืจ ืชืืืืื ืืืืื ืืคืืกืงื ืืืื, ืืฉืืืืื ืืช ืกืืืืชื ืืืืคื ืฉืืื ื ื ืจืืืง ืืื ืฉื ืจืื ืื ืื-ืืกืืืืื. ืืืจืืื ืืงืคืืืื ืืืืืื ืขื ืืืืื ืชืืจื ืืืงืืืง ืืฉืืืจืช ืืฆืืืช. ืืชื ืืืืช ืืื ืืืคืืื ืช ืืืฉื ืืช ืืืช ืฆืื ื ืืืช ืขื ืคืื, ืืื ืกืืืืืจ, ืื ืืช ืืขืืื. ืฉืืฉืืืช ืืืจืืช, ืืืืืช ืืื'ื ืืฅ, ืืชืืงืืืช ืืืจืืืื ืฉื ืืืืื"ืจ ืืืืืืชื ืคืืขื ืืฉืืขืืช ืืืงืจื ืืช ืืืืื, ืืืฆืืืื ืืช ืืกืื ืื ืชืคืืื ื ืืื ืืืืคื ืชืืกืก ืฉื ืื ืืืืืื. 
The tension between the exalted conception of the Tzadik and the earthly needs of his following accompanied the courts from the start, and no court could wholly detach the one from the other. Small elitist fellowships preserved the radical mystical bent of early Hasidism, reading the older kabbalistic literature and striving to realize its ideas in practice, while the broad public sought guidance, blessing, and relief. The best-known ideological rupture on this ground occurred in 1812, when the "Holy Jew" of Przysucha broke away from his master, the Seer of Lublin, and founded an independent group. The Seer embraced a populist, wonder-working conception of the Tzadik's office, drawing the simple masses through miracles and the promise of material abundance. The Holy Jew, by contrast, championed a more elitist and demanding path; he held that the Tzadik's purpose is to help his followers labor toward their own spiritual attainment rather than hand them ready fruit, contrary to the accepted practice of his day. The Tzadik's centrality also had critics from the outset. A radical model was that of Menachem Mendel of Kotzk, famed for uncompromising extremism, who demanded relentless spiritual self-reliance, waged war on complacent piety, and drove away those who came seeking his blessing[citation needed]. The rebbes of the Przysucha school who continued the Holy Jew's path set no store by wonder-working and did not pretend to it; their Hasidim sought in them spiritual and moral authority, though belief in their miraculous powers remained widespread among the rank and file. A different approach again is that of Chabad, developed by Rabbi Shneur Zalman of Liadi and his successors. Chabad retained much of early Hasidism's radical teaching, but set it on a path of intellectual engagement open to every Hasid, not only the rebbe, demanding that each follower personally grapple with the divine realms, the sefirot, the sparks, and the rest. The court stressed systematic, reasoned contemplation of Hasidic doctrine and regarded enthusiasm without grounding as baseless; the very name Chabad is an acronym of Chochmah, Binah, Da'at ("wisdom, understanding, knowledge"), the three sefirot identified with intellect and thought. Another major movement with a distinctive, still more radical stance toward the institution of the Tzadik is Breslov. Its founder, Rabbi Nachman of Breslov, scorned those who sought to grasp divinity through intellect and insisted that simple faith, however foolish it may appear, is the only path; in his teaching, importance attaches precisely to the paradoxical, which helps strip away rationality, and only innocent faith can rise to the heights his doctrine describes. He dwelt on the dark, despairing face of the world and on the believer's feelings of remoteness from God, which must be fought and overcome. On the social and organizational plane, the basic structure of Hasidism is the court. Whatever the doctrinal variations, the rebbe's standing remained central everywhere: no Hasidic group has arisen or endured without a living leader, or the memory of one, around whom it coheres, and leadership ordinarily passes within dynasties of tzadikim. The classical courts emerged in the late 18th century; in many respects they resembled the traditional community, except that their membership was geographically scattered and bound by voluntary allegiance rather than residence. Affiliation was elective: whoever wished could attach himself to the Tzadik of his choice, or leave him for another. The court in the narrow sense, the Rebbe's residence and the household around it, stood wherever he lived. Dynasties are generally named for the town or townlet in which they arose, and keep the name even after leaving the place. Thus Satmar Hasidism, whose center today is New York, still bears the name of the Hungarian town (present-day Satu Mare) where it was founded in 1905.
The court proper comprises the Rebbe and, close around him, his extended family. It also includes the functionaries and attendants, among them the gabbaim (secretaries and wardens), and the intimate circle of Hasidim who stay at court on a fixed basis, the "yoshvim" ("sitters"; the institution has greatly declined, and today only wealthy courts maintain yoshvim of their own). On Sabbaths and festivals, and at times on ordinary days as well, Hasidim stream to the court from near and far to appear before the Rebbe, seek his counsel, and share in the prayers and the communal meals. The largest courts today also run extensive systems of hospitality and lodging, maintained in large part by charitable donations. In old Eastern Europe most of a rebbe's Hasidim lived far from his court and organized in local fellowships centered on common prayer; to this day, the Hasidim of a given dynasty typically keep synagogues of their own and preserve many customs peculiar to the court. An important institution in this connection is the pidyon: notes (kvitlech) bearing requests of every kind are handed to the Rebbe together with a sum of money, and income from pidyonot was, and remains, a principal source of the courts' livelihood. The central public ceremony is the tish (Yiddish for "table"): crowds gather to watch the Rebbe at close quarters, join the festive meal and the kiddush, hear his Torah, sing before him, and receive shirayim, portions from his own dish; in many dynasties the Rebbe distributes the shirayim, or wine from his cup, with his own hand. The Torah discourses the Rebbe delivers at the tish are often preserved by "chozrim" (repeaters), followers of prodigious memory who commit the words to heart, writing being forbidden on the Sabbath, and set them down afterward. Family celebrations of the Rebbe's household, weddings and the like, likewise draw great crowds of Hasidim and display the court's strength; in the large dynasties their scale is enormous, with halls hired and meals laid on for thousands. The Rebbe is the supreme authority of his court, and the veteran Hasidim are devoted and obedient to him. Succession to the rebbes' leadership sometimes involves harsh conflicts, and rivalries between courts over standing and followers have at times descended into open quarrels. Since they stand at the head of large, organized communities, many rebbes also carry broader communal functions. From the late 20th century there arose the figure of the mashpia (plural mashpi'im), mentors and preachers who in certain respects revive the Tzadik's original guiding role without holding the title of rebbe; some large courts have institutionalized the position. The Hasidic prayer rite, in all its versions, is Nusach Sefard, broadly a blend of the Lurianic (Ari) rite with the Ashkenazi one. Nusach Sefard is not uniform, and different courts pray in different sub-versions; the siddur of some dynasties, for example, retains particular passages according to the Ashkenazi tradition. Customary practice is to sway during prayer and to pray aloud. In most dynasties men don Rabbeinu Tam tefillin in addition to those of Rashi, generally from marriage onward (in Breslov, Chabad, and certain other dynasties, from bar mitzvah or shortly after). In most dynasties it is also customary to immerse in the mikveh regularly, at least on the eve of the Sabbath. Many further customs, differing from court to court, are kept as the rebbes instituted them. Many courts likewise maintain "chaburot," small standing fellowships of members who gather from time to time. Dress carries great weight in the Hasidic world: particular courts have identifying garments and details of costume by which their members are recognized, alongside items common to Hasidim at large.
Historically, the stereotypical "Hasidic" costume was simply the dress of Eastern European Jewry into the modern era (roughly the 20th century), and it survived among some non-Hasidic groups as well, such as circles of the old Jerusalem yishuv who keep their traditional garb to this day. Hasidic literature assigns various meanings to the details of dress, but these are largely retrospective rather than the garments' origin. The traditional festive headgear of Eastern Europe, still worn by married men on Sabbaths and festivals, is the fur hat known as the shtreimel; its precise form varies by community and occasion. The widespread version is made of fur tails, in styles that differed across Hungary, Galicia, Russia, and Romania. The spodik, a taller black version, is customary in the Polish courts, and some rebbes and elders wear the kolpik. On weekdays most Hasidim wear a black hat, whose shape and brim mark the court: Vizhnitz Hasidim wear theirs one way, while Ger, the Jerusalem courts, and part of Satmar favor another. Another significant item is the gartel, the sash that Hasidim gird for prayer. The cut of the long coat supplies a further distinguishing sign between groups: on weekdays most wear the long black kapoteh or, in many courts, the overgarment called the rezhvolke, while Sabbath wear is finer. Small details set dynasties apart, for example the practice of Ger Hasidim of tucking their trousers into their socks ("hoyzn-zokn"), a style fixed in the wake of the clothing decrees of 1846 in Poland, or the white knee-socks worn in some courts on the Sabbath. Chabad stands out for its comparatively modern dress. Hasidism has lately been surveyed demographically: according to a comprehensive study by the Polish scholar Marcin Wodzinski, as of 2016 there were about 129,000 Hasidic households worldwide. The great majority live in the State of Israel (including the settlements) and in the United States, with roughly 62,000 and 53,000 households respectively. Some 75 percent of Hasidim belong to the ten largest communities, while thousands of other households are spread across small groups; the survey registered about 1,350 Hasidic congregations worldwide, alongside the open frameworks of Chabad and Breslov. The largest single concentration is New York City, home to some 34,000 households (26 percent of the total), mainly in Williamsburg, Borough Park, and Crown Heights. Jerusalem stands second with about 16,000 households (12.6 percent), followed by Bnei Brak with 13,000 (9.5 percent). New York State also contains the center of Monsey, seat of Vizhnitz-Monsey, and the self-governing villages of Kiryas Joel and New Square, belonging respectively to Satmar and to Skver. Large communities in Israel are found in Ashdod, Beit Shemesh, Haifa, Netanya, Modi'in Illit, and Beitar Illit. In Britain the community is concentrated mainly in Stamford Hill in north London, with about 5,000 households, and there are further communities, notably in Manchester. In Canada some 2,000 households live mostly around Montreal, many of them in Kiryas Tosh, while Antwerp in Belgium is reckoned at about 1,800 families. Worldwide, more than thirty Hasidic courts of significant size are active.
Upward of a dozen courts count their adherents in the thousands of households; the rest number a few hundred at most, and sometimes next to none. Many courts were wiped out in the Holocaust; others survive only formally, with a rebbe holding the title and scarcely any actual Hasidim. The courts can be classified along several axes. Geographically, there is a rough correspondence between the origin of each group and its temper: the Hungarian dynasties, for example, are known for zealotry and stringency, while others are reckoned more moderate and open; in practice, though, meaningful differences run not only between geographic blocs but within them, and each court relates to the Hasidic heritage in its own way. Chabad, with its emphasis on outreach, includes many newly observant members, whereas most courts rest on families that have been Hasidic for generations. A second basic division is political, chiefly the attitude toward Zionism, which separates the communities into two distinct camps: the anti-Zionist and the non-Zionist. The anti-Zionists reject the State of Israel's legitimacy on principle; many refuse to vote in Israeli elections, to take state budgets, or to participate in its institutions. This camp is led by the dynasties associated with the Edah Haredit, above all Satmar; it is a minority among the Hasidim of Israel but embraces a large share of those living in the United States. The non-Zionist majority, historically aligned with Agudat Yisrael, withholds endorsement of Zionist ideology but takes part in politics and accepts state funding. This camp includes courts such as Ger, Vizhnitz, Chabad, and many others; a dozen or so rebbes sit on Agudat Yisrael's Council of Torah Sages, and more are associated with it. A few rebbes of some standing hold openly Zionist sympathies, notably in certain branches of the Ruzhin line. In practice, many Hasidim in Israel incline to hawkish, right-wing views, at times markedly so, and a number of well-known rebbes have taken distinctive public stands of their own. The largest court in the world is Satmar, with about 26,000 households, and it is also among the most powerful; its centers are Williamsburg in Brooklyn and Kiryas Joel. Founded in 1905 in the town whose name it bears (present-day Satu Mare), it is marked by zealotry and strict conservatism and leads the anti-Zionist camp. In 2006 Satmar split between the brother rebbes Aaron and Zalman Teitelbaum, amid quarrels over the inheritance that reached the courts. The largest court in Israel is Ger: some 9 percent of all Hasidic households worldwide belong to it, roughly 12,000. Ger arose in 1859 in Gora Kalwaria near Warsaw, and its ethos is rooted in the sober, demanding school of Przysucha and Kotzk. For decades it has been the dominant force within Agudat Yisrael, taking a relatively pragmatic political line; since 1996 its rebbe has been Yaakov Aryeh Alter. Belz Hasidism, with some 7,500 families, was founded in 1817 in the eastern Galician town of that name. Belz joins broad popular appeal to firm conservatism; until about 1979 it belonged to the camp hostile to the State of Israel, but its rebbe then led a break with the Edah Haredit, and the court moved toward participation in state frameworks. Among the largest courts is also Vizhnitz, founded in 1854 in the Bukovinian town of that name, known for its warmth and for reaching its followers through the heart more than the intellect.
Vizhnitz today is split among several branches, each with its own rebbe: the branch headed by Rabbi Israel Hager, with about 5,000 families; a branch headed by Rabbi Menachem Mendel Hager, with about 650; Vizhnitz-Monsey, with about 3,500 households; and Seret-Vizhnitz, with about 1,300. Other courts of significant size remain. Bobov, founded in 1892 in Galicia, split in 2005 into "Bobov," with about 3,500 families, and "Bobov-45," with about 1,500, the two named for the Borough Park streets where their centers stand. Sanz-Klausenburg, established in 1927, later divided among three rebbes and numbers some 3,800 families in all. Karlin Hasidism, founded in the 18th century in Belarus, split into Pinsk-Karlin, Karlin-Stolin, and a small nucleus in Jerusalem, together about 3,000 families. Another venerable line, rooted in the old yishuv in Jerusalem, is divided today into five separate courts and counts about 2,000 families across its branches. Two large Hasidic movements function without any rebbe at all: Chabad and Breslov. Rabbi Nachman of Breslov left no successor at his death in 1810, and his followers carried on in small fellowships gathered around teachers who preserved his legacy. For generations Breslov remained a small and often persecuted group, but the interest it has aroused in recent decades, not least within the movement of return to religion, has made it widely popular. Breslov today comprises more than a dozen sub-groups around various leading figures, none of them a rebbe in the classic sense; it counts on the order of 14,000 families, though its boundaries are far looser than those of the established courts. Chabad, which its last rebbe, Menachem Mendel Schneerson, turned outward toward Jewish outreach and the return to observance, has a core of roughly 17,000 Hasidic families and hundreds of thousands of further associates and admirers. After Schneerson's death in 1994 (many of his Hasidim continue to regard him as the Messiah), no successor was appointed, and Chabad now operates as a worldwide network of institutions and communities without a rebbe. [Accompanying chart, legend: solid line = master and disciple; dotted line = dynastic or family connection. The chart surveys the principal dynasties, chiefly those that survived the Holocaust; where a dynasty drew on several sources, only the main ones are shown.]
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Language_model#References] | [TOKENS: 1793] |
Language model

A language model is a computational model that predicts sequences in natural language. Language models are useful for a variety of tasks, including speech recognition, machine translation, natural language generation (generating more human-like text), optical character recognition, route optimization, handwriting recognition, grammar induction, and information retrieval. Large language models (LLMs), currently their most advanced form, are predominantly based on transformers trained on large datasets (frequently using text scraped from the public internet). They have superseded recurrent neural network-based models, which had previously superseded purely statistical models such as the word n-gram language model.

History

Noam Chomsky did pioneering work on language models in the 1950s by developing a theory of formal grammars. In 1980, statistical approaches were explored and found to be more useful for many purposes than rule-based formal grammars. Discrete representations like word n-gram language models, with probabilities for discrete combinations of words, made significant advances. In the 2000s, continuous representations for words, such as word embeddings, began to replace discrete representations. Typically, the representation is a real-valued vector that encodes the word's meaning, such that words closer in vector space are similar in meaning, and common relationships between words, such as plurality or gender, are preserved.

Pure statistical models

In 1980, the first significant statistical language model was proposed, and during that decade IBM performed "Shannon-style" experiments, in which potential sources of language-modeling improvement were identified by observing and analyzing how human subjects predict or correct text. A word n-gram language model is a statistical model of language that calculates the probability of the next word in a sequence from a fixed-size window of previous words. If one previous word is considered, it is a bigram model; if two words, a trigram model; if n − 1 words, an n-gram model. Special tokens ⟨s⟩ and ⟨/s⟩ are introduced to denote the start and end of a sentence. To prevent a zero probability being assigned to unseen words, the probability of each seen word is slightly lowered to make room for unseen words in a given corpus. Various smoothing methods achieve this, from simple "add-one" smoothing (assigning a count of 1 to unseen n-grams, as an uninformative prior) to more sophisticated techniques such as Good-Turing discounting or back-off models. Word n-gram models have largely been superseded by recurrent neural network-based models, which in turn have been superseded by transformer-based models, often referred to as large language models. Maximum entropy language models encode the relationship between a word and its n-gram history using feature functions. The equation is

$$P(w_m \mid w_1,\ldots,w_{m-1}) = \frac{1}{Z(w_1,\ldots,w_{m-1})} \exp\left(a^{T} f(w_1,\ldots,w_m)\right)$$

where $Z(w_1,\ldots,w_{m-1})$ is the partition function, $a$ is the parameter vector, and $f(w_1,\ldots,w_m)$ is the feature function.
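To make the n-gram mechanics concrete, here is a minimal sketch of a bigram model with add-one (Laplace) smoothing, assuming a toy tokenized corpus; the corpus, function names, and vocabulary-size convention are illustrative choices, not anything specified by the article.

```python
from collections import Counter

def train_bigram(corpus):
    # Count unigrams and bigrams over tokenized sentences,
    # padding each sentence with <s> / </s> boundary tokens.
    unigrams, bigrams = Counter(), Counter()
    for sentence in corpus:
        tokens = ["<s>"] + sentence + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(w_prev, w, unigrams, bigrams, vocab_size):
    # P(w | w_prev) with add-one smoothing: every count is inflated
    # by 1, so unseen bigrams still get a small non-zero probability.
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + vocab_size)

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_bigram(corpus)
V = len(uni)
print(bigram_prob("the", "cat", uni, bi, V))   # seen bigram
print(bigram_prob("the", "bird", uni, bi, V))  # unseen, still > 0
```

Note that "bird" never appears in the corpus yet receives a non-zero probability; that is exactly the zero-probability problem the smoothing methods above are meant to solve.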
In the simplest case, the feature function is just an indicator of the presence of a certain n-gram. It is helpful to use a prior on $a$ or some form of regularization. The log-bilinear model is another example of an exponential language model. The skip-gram language model is an attempt to overcome the data sparsity problem that the word n-gram language model faced. Words represented in an embedding vector are no longer necessarily consecutive, but can leave gaps that are skipped over (hence the name "skip-gram"). Formally, a k-skip-n-gram is a length-n subsequence whose components occur at distance at most k from each other. For example, for a given input sentence, the set of 1-skip-2-grams includes all the ordinary bigrams (2-grams) and, in addition, the word pairs that skip over one intermediate word. In the skip-gram model, semantic relations between words are represented by linear combinations, capturing a form of compositionality. For example, in some such models, if v is the function that maps a word w to its n-dimensional vector representation, then

$$v(\mathrm{king}) - v(\mathrm{male}) + v(\mathrm{female}) \approx v(\mathrm{queen})$$

where ≈ is made precise by stipulating that its right-hand side must be the nearest neighbor of the value of the left-hand side.

Neural models

Continuous representations or embeddings of words are produced in recurrent neural network-based language models (also known as continuous-space language models). Such continuous-space embeddings help to alleviate the curse of dimensionality: the number of possible word sequences grows exponentially with the size of the vocabulary, causing a data sparsity problem. Neural networks avoid this problem by representing words as non-linear combinations of weights in a neural net. A large language model (LLM) is a language model trained with self-supervised machine learning on a vast amount of text, designed for natural language processing tasks, especially language generation. The largest and most capable LLMs are generative pre-trained transformers (GPTs), which provide the core capabilities of modern chatbots. LLMs can be fine-tuned for specific tasks or guided by prompt engineering. These models acquire predictive power regarding the syntax, semantics, and ontologies inherent in human language corpora, but they also inherit the inaccuracies and biases present in the data they are trained on. They consist of billions to trillions of parameters and operate as general-purpose sequence models, generating, summarizing, translating, and reasoning over text. LLMs represent a significant new technology in their ability to generalize across tasks with minimal task-specific supervision, enabling capabilities like conversational agents, code generation, knowledge retrieval, and automated reasoning that previously required bespoke systems. LLMs evolved from earlier statistical and recurrent neural network approaches to language modeling. The transformer architecture, introduced in 2017, replaced recurrence with self-attention, allowing efficient parallelization, longer context handling, and scalable training on unprecedented data volumes. This innovation enabled models like GPT, BERT, and their successors, which demonstrated emergent behaviors at scale, such as few-shot learning and compositional reasoning.
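As an illustration of both ideas in this passage, the sketch below enumerates k-skip-n-grams per the definition above and resolves the king/queen analogy by a nearest-neighbor search; the sample sentence and the toy 2-d vectors are invented for the demonstration and are not from the article.

```python
import numpy as np
from itertools import combinations

def k_skip_n_grams(tokens, k, n):
    # Length-n subsequences whose consecutive members sit at most
    # k+1 positions apart, i.e. at most k tokens are skipped.
    out = []
    for idxs in combinations(range(len(tokens)), n):
        if all(b - a <= k + 1 for a, b in zip(idxs, idxs[1:])):
            out.append(tuple(tokens[i] for i in idxs))
    return out

def analogy(vectors, a, b, c):
    # Word whose vector is the nearest (cosine) neighbor of
    # v(a) - v(b) + v(c), excluding the three input words.
    target = vectors[a] - vectors[b] + vectors[c]
    pool = [w for w in vectors if w not in (a, b, c)]
    return max(pool, key=lambda w: np.dot(vectors[w], target)
               / (np.linalg.norm(vectors[w]) * np.linalg.norm(target)))

print(k_skip_n_grams("the rain in Spain".split(), k=1, n=2))
# -> all the bigrams, plus the pairs that skip one word

toy = {"king": np.array([1.0, 1.0]), "male": np.array([1.0, 0.0]),
       "female": np.array([0.0, 1.0]), "queen": np.array([0.0, 2.0]),
       "apple": np.array([-1.0, 0.0])}
print(analogy(toy, "king", "male", "female"))  # -> "queen"
```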
Reinforcement learning, particularly policy gradient algorithms, has been adapted to fine-tune LLMs for desired behaviors beyond raw next-token prediction. Reinforcement learning from human feedback (RLHF) applies these methods to optimize a policy (the LLM's output distribution) against reward signals derived from human or automated preference judgments. This has been critical for aligning model outputs with user expectations, improving factuality, reducing harmful responses, and enhancing task performance. Benchmark evaluations for LLMs have evolved from narrow linguistic assessments toward comprehensive, multi-task evaluations measuring reasoning, factual accuracy, alignment, and safety. Hill climbing (iteratively optimizing models against benchmarks) has emerged as a dominant strategy, producing rapid incremental performance gains but raising concerns of overfitting to benchmarks rather than achieving genuine generalization or robust capability improvements. Although language models sometimes match human performance, it is not clear whether they are plausible cognitive models; at least for recurrent neural networks, it has been shown that they sometimes learn patterns that humans do not, and fail to learn patterns that humans typically do.

Evaluation and benchmarks

Evaluation of the quality of language models is mostly done by comparison to human-created benchmarks derived from typical language-oriented tasks. Other, less established quality tests examine the intrinsic character of a language model or compare two such models. Since language models are typically intended to be dynamic and to learn from the data they see, some proposed evaluations investigate the rate of learning, for example through inspection of learning curves. Various data sets have also been developed for evaluating language processing systems.
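The passage above does not name a concrete metric, but the standard intrinsic measure used when comparing language models on held-out text is perplexity; the following short sketch, an illustration rather than anything prescribed by the article, computes it from per-token log-probabilities.

```python
import math

def perplexity(token_log_probs):
    # exp(-mean log-probability); lower is better. A model that
    # assigns every token probability 1/N scores a perplexity of N.
    return math.exp(-sum(token_log_probs) / len(token_log_probs))

# A model that gives each of four tokens probability 0.25:
print(perplexity([math.log(0.25)] * 4))  # -> 4.0
```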
======================================== |
[SOURCE: https://he.wikipedia.org/wiki/ืืืฆื] | [TOKENS: 4392] |
The chickpea (Cicer arietinum), better known by its Arabic name hummus (precise transliteration: hummus), is a plant of the legume family. It grows 20 to 50 cm tall, with small feathery leaves on either side of the stem. The pod holds one to three seeds, and the flowers are white or reddish-purple. The chickpea is a subtropical and Mediterranean crop, grown where some 400 mm of rain falls a year, and its seeds contain roughly 16 to 24 percent protein. The genus Cicer numbers about forty species, only nine of them annuals like the cultivated plant; its range spans the Mediterranean basin and western and central Asia. Cultivated chickpea falls into two principal groups of varieties: the first, called Kabuli, has a large, smooth, light-colored seed and spread through the Mediterranean lands; the second, Desi (a name from the Indian subcontinent), has a small, wrinkled, dark seed and is found in India, Ethiopia, other parts of southern Asia, and the Far East. The chickpea is among the oldest of cultivated plants, one of the founder crops of Neolithic agriculture, and its cultivation in the region has never ceased. A wild relative, Cicer pinnatifidum, grows in Israel. The plant strikes deep roots, which let it withstand dry conditions. The most popular Mediterranean preparation is hummus paste, made of cooked, mashed chickpeas blended with tahini and seasonings; another widespread dish is falafel, balls of ground chickpeas and spices fried in deep oil. Among the Jews of Eastern Europe, "arbes," chickpeas boiled and seasoned with black pepper, are traditionally served at the shalom zachar on the Sabbath after a boy's birth. Tunisian Jews serve whole boiled chickpeas seasoned with cumin and paprika as part of the traditional table. In Jerusalem, street vendors sold hot boiled chickpeas up to 1947; after 1967 the custom could again be seen in the markets of the Old City. Chickpeas are also eaten roasted or cooked in stews, and the plant serves as green fodder. In India and Pakistan, dishes based on the small dark desi grain are everyday fare, and green chickpeas are eaten across the subcontinent as well, fresh or roasted. In Burma, chickpea flour is set into a tofu-like product known as Burmese tofu, and fritters of chickpea flour are known in several cuisines. As for its history, the wild progenitor of the cultivated chickpea is held to be Cicer reticulatum, and crossing experiments between the cultivated plant and wild populations of that species support the identification. As in other domesticated legumes, the cultivated chickpea's pod remains closed on the plant, whereas in the wild form the ripe pod shatters, falls, and scatters its seeds over time. Remains of chickpeas are known from Pre-Pottery Neolithic sites in the Near East, with later Neolithic finds at several sites around 3500 BCE, and remains dated to about 6790 BCE have been reported in France. By the Bronze Age the chickpea was one of the region's established crops.
In the Roman period boiled chickpeas were a popular food. Charlemagne's estate ordinance of about 800, the Capitulare de villis, lists the chickpea among the crops to be grown, and the philosopher Albertus Magnus mentions three kinds: red, white, and black. Medieval writers also credited the plant with assorted medicinal virtues. The first clear documentation of hummus paste, cooked chickpeas with tahini, comes from the medieval Levant, though some hold that far earlier sources already refer to the dish (see the etymological discussion below). Modern study of the crop includes genetic research from the 1970s onward tracing the relations between the cultivated varieties, their wild kin, and the populations from which the plant was domesticated. As of 2023, India produces about 75 percent of the world's chickpeas, Australia about 5 percent, and Turkey about 3.5 percent. As for the name: the modern Hebrew ืึดืึฐืฆึธื was coined on the basis of the Aramaic-Talmudic cognate, apparently connected with the root denoting sourness, and parallels the Arabic hummus (whence, through other languages, the word "hummus"). Some find the plant already in the Bible: the scholar Yehuda Feliks held that the verse "ืึฐึผืึดืื ืึธืึดืืฅ ืึนืืึตืืึผ" (Isaiah 30:24) speaks of fodder made of chickpeas. In the Book of Ruth, where Boaz invites Ruth to dip her bread in hometz ("and dip thy morsel in the vinegar," Ruth 2:14), some commentators have seen a chickpea preparation akin to hummus, though others read the word as plain vinegar. The name "Himtsa" has also been given to localities in Israel.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Natural_language] | [TOKENS: 623] |
Natural language

A natural language or ordinary language is any spoken language or signed language used organically in a human community, first emerging without conscious premeditation and subject to replication across generations of people in the community, regional expansion or contraction, and gradual internal and structural changes. The vast majority of languages in the world are natural languages. As a category, natural language includes both standard dialects (ones with high social prestige) and nonstandard or vernacular dialects. Even an official language with a regulating academy, such as Standard French, overseen by the Académie Française, is still classified as a natural language (e.g. in the field of natural language processing), as its prescriptive aspects do not make it regulated enough to be considered a constructed or controlled natural language. Linguists broadly consider writing to be a static visual representation of a particular natural language, though in many highly literate modern societies writing itself is also now subject to the natural processes of widely spoken natural languages. Excluded from the definition of natural language are: artificial and constructed languages, such as those developed for works of fiction; languages of formal logic, such as those in computer programming; and non-human communication systems in nature, such as whale vocalizations or honey bees' waggle dance. The academic consensus is that particular key features prevent animal communication systems from being classified as languages at all. Certain systems of human communication with no native speakers, as sometimes used in cross-cultural contexts, are also not natural languages.

Controlled languages

Controlled natural languages are subsets of natural languages whose grammars and dictionaries have been restricted in order to reduce ambiguity and complexity. This may be accomplished by decreasing the use of superlative or adverbial forms, or of irregular verbs. Typical purposes for developing and implementing a controlled natural language are to aid understanding by non-native speakers or to ease computer processing. An example of a widely used controlled natural language is Simplified Technical English, originally developed for aerospace and avionics industry manuals.

International constructed languages

Being constructed, international auxiliary languages such as Esperanto and Interlingua are not considered natural languages, with the possible exception of true native speakers of such languages. Natural languages evolve, through fluctuations in vocabulary and syntax, to incrementally improve human communication. In contrast, Esperanto was created by the Polish ophthalmologist L. L. Zamenhof in the late 19th century. Some natural languages have become organically "standardized" through the synthesis of two or more pre-existing natural languages over a relatively short period, through the development of a pidgin (not itself considered a language) into a stable creole language. A creole such as Haitian Creole has its own grammar, vocabulary, and literature; it is spoken by over 10 million people worldwide and is one of the two official languages of the Republic of Haiti. As of 1996, there were 350 attested families with one or more native speakers of Esperanto. Latino sine flexione, another international auxiliary language, is no longer widely spoken.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Language_model#cite_note-3] | [TOKENS: 1793] |
Contents Language model A language model is a computational model that predicts sequences in natural language. Language models are useful for a variety of tasks, including speech recognition, machine translation, natural language generation (generating more human-like text), optical character recognition, route optimization, handwriting recognition, grammar induction, and information retrieval. Large language models (LLMs), currently their most advanced form as of 2019, are predominantly based on transformers trained on larger datasets (frequently using texts scraped from the public internet). They have superseded recurrent neural network-based models, which had previously superseded the purely statistical models, such as the word n-gram language model. History Noam Chomsky did pioneering work on language models in the 1950s by developing a theory of formal grammars. In 1980, statistical approaches were explored and found to be more useful for many purposes than rule-based formal grammars. Discrete representations like word n-gram language models, with probabilities for discrete combinations of words, made significant advances. In the 2000s, continuous representations for words, such as word embeddings, began to replace discrete representations. Typically, the representation is a real-valued vector that encodes a wordโs meaning such that words closer in vector space are similar in meaning and common relationships between words, such as plurality or gender, are preserved. Pure statistical models In 1980, the first significant statistical language model was proposed, and during the decade IBM performed 'Shannon-style' experiments, in which potential sources for language modeling improvement were identified by observing and analyzing the performance of human subjects in predicting or correcting text. A word n-gram language model is a statistical model of language which calculates the probability of the next word in a sequence from a fixed size window of previous words. If one previous word is considered, it is a bigram model; if two words, a trigram model; if n โ 1 words, an n-gram model. Special tokens are introduced to denote the start and end of a sentence โจ s โฉ {\displaystyle \langle s\rangle } and โจ / s โฉ {\displaystyle \langle /s\rangle } . To prevent a zero probability being assigned to unseen words, the probability of each seen word is slightly lowered to make room for the unseen words in a given corpus. To achieve this, various smoothing methods are used, from simple "add-one" smoothing (assigning a count of 1 to unseen n-grams, as an uninformative prior) to more sophisticated techniques, such as GoodโTuring discounting or back-off models. Word n-gram models have largely been superseded by recurrent neural networkโbased models, which in turn have been superseded by Transformer-based models often referred to as large language models. Maximum entropy language models encode the relationship between a word and the n-gram history using feature functions. The equation is P ( w m โฃ w 1 , โฆ , w m โ 1 ) = 1 Z ( w 1 , โฆ , w m โ 1 ) exp โก ( a T f ( w 1 , โฆ , w m ) ) {\displaystyle P(w_{m}\mid w_{1},\ldots ,w_{m-1})={\frac {1}{Z(w_{1},\ldots ,w_{m-1})}}\exp(a^{T}f(w_{1},\ldots ,w_{m}))} where Z ( w 1 , โฆ , w m โ 1 ) {\displaystyle Z(w_{1},\ldots ,w_{m-1})} is the partition function, a {\displaystyle a} is the parameter vector, and f ( w 1 , โฆ , w m ) {\displaystyle f(w_{1},\ldots ,w_{m})} is the feature function. 
In the simplest case, the feature function is just an indicator of the presence of a certain n-gram. It is helpful to use a prior on a {\displaystyle a} or some form of regularization. The log-bilinear model is another example of an exponential language model. Skip-gram language model is an attempt at overcoming the data sparsity problem that the preceding model (i.e. word n-gram language model) faced. Words represented in an embedding vector were not necessarily consecutive anymore, but could leave gaps that are skipped over (thus the name "skip-gram"). Formally, a k-skip-n-gram is a length-n subsequence where the components occur at distance at most k from each other. For example, in the input text: the set of 1-skip-2-grams includes all the bigrams (2-grams), and in addition the subsequences In skip-gram model, semantic relations between words are represented by linear combinations, capturing a form of compositionality. For example, in some such models, if v is the function that maps a word w to its n-d vector representation, then v ( k i n g ) โ v ( m a l e ) + v ( f e m a l e ) โ v ( q u e e n ) {\displaystyle v(\mathrm {king} )-v(\mathrm {male} )+v(\mathrm {female} )\approx v(\mathrm {queen} )} where โ is made precise by stipulating that its right-hand side must be the nearest neighbor of the value of the left-hand side. Neural models Continuous representations or embeddings of words are produced in recurrent neural network-based language models (known also as continuous space language models). Such continuous space embeddings help to alleviate the curse of dimensionality, which is the consequence of the number of possible sequences of words increasing exponentially with the size of the vocabulary, further causing a data sparsity problem. Neural networks avoid this problem by representing words as non-linear combinations of weights in a neural net. A large language model (LLM) is a language model trained with self-supervised machine learning on a vast amount of text, designed for natural language processing tasks, especially language generation. The largest and most capable LLMs are generative pre-trained transformers (GPTs) that provide the core capabilities of modern chatbots. LLMs can be fine-tuned for specific tasks or guided by prompt engineering. These models acquire predictive power regarding syntax, semantics, and ontologies inherent in human language corpora, but they also inherit inaccuracies and biases present in the data they are trained on. They consist of billions to trillions of parameters and operate as general-purpose sequence models, generating, summarizing, translating, and reasoning over text. LLMs represent a significant new technology in their ability to generalize across tasks with minimal task-specific supervision, enabling capabilities like conversational agents, code generation, knowledge retrieval, and automated reasoning that previously required bespoke systems. LLMs evolved from earlier statistical and recurrent neural network approaches to language modeling. The transformer architecture, introduced in 2017, replaced recurrence with self-attention, allowing efficient parallelization, longer context handling, and scalable training on unprecedented data volumes. This innovation enabled models like GPT, BERT, and their successors, which demonstrated emergent behaviors at scale, such as few-shot learning and compositional reasoning. 
Reinforcement learning, particularly policy gradient algorithms, has been adapted to fine-tune LLMs for desired behaviors beyond raw next-token prediction. Reinforcement learning from human feedback (RLHF) applies these methods to optimize a policy (the LLM's output distribution) against reward signals derived from human or automated preference judgments. This has been critical for aligning model outputs with user expectations, improving factuality, reducing harmful responses, and enhancing task performance.

Benchmark evaluations for LLMs have evolved from narrow linguistic assessments toward comprehensive, multi-task evaluations measuring reasoning, factual accuracy, alignment, and safety. Hill climbing (iteratively optimizing models against benchmarks) has emerged as a dominant strategy, producing rapid incremental performance gains but raising concerns about overfitting to benchmarks rather than achieving genuine generalization or robust capability improvements.

Although language models sometimes match human performance, it is not clear whether they are plausible cognitive models. At least for recurrent neural networks, it has been shown that they sometimes learn patterns that humans do not, yet fail to learn patterns that humans typically do.

Evaluation and benchmarks

Evaluation of the quality of language models is mostly done by comparison against human-created benchmarks derived from typical language-oriented tasks. Other, less established, quality tests examine the intrinsic character of a language model or compare two such models. Since language models are typically intended to be dynamic and to learn from the data they see, some proposed evaluations investigate the rate of learning, e.g., through inspection of learning curves. Various data sets have been developed for evaluating language processing systems.
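One widely used intrinsic quality measure is perplexity, which quantifies how surprising held-out text is to a model. A minimal sketch, assuming a generic conditional probability function over consecutive words (any bigram-style model would do):

```python
# A minimal sketch of intrinsic evaluation by perplexity; prob_fn is any
# function returning P(word | previous word). Lower perplexity means the
# held-out text is less surprising to the model.
import math

def perplexity(prob_fn, sentences):
    log_prob, count = 0.0, 0
    for sentence in sentences:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for prev, word in zip(tokens, tokens[1:]):
            log_prob += math.log(prob_fn(prev, word))
            count += 1
    return math.exp(-log_prob / count)

# With a (hypothetical) uniform model over a 1,000-word vocabulary,
# perplexity is exactly the vocabulary size:
print(perplexity(lambda prev, word: 1 / 1000, ["a held out sentence"]))
```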
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Grammar_induction] | [TOKENS: 1686] |
Grammar induction

Grammar induction (or grammatical inference) is the process in machine learning of learning a formal grammar (usually as a collection of re-write rules or productions, or alternatively as a finite-state machine or automaton of some kind) from a set of observations, thus constructing a model which accounts for the characteristics of the observed objects. More generally, grammatical inference is the branch of machine learning where the instance space consists of discrete combinatorial objects such as strings, trees, and graphs.

Grammar classes

Grammatical inference has often been focused on the problem of learning finite-state machines of various types (see the article Induction of regular languages for details on these approaches), since there have been efficient algorithms for this problem since the 1980s. Since the beginning of the century, these approaches have been extended to the problem of inference of context-free grammars and richer formalisms, such as multiple context-free grammars and parallel multiple context-free grammars. Other classes of grammars for which grammatical inference has been studied are combinatory categorial grammars, stochastic context-free grammars, contextual grammars, and pattern languages.

Learning models

The simplest form of learning is where the learning algorithm merely receives a set of examples drawn from the language in question: the aim is to learn the language from examples of it (and, rarely, from counter-examples, that is, examples that do not belong to the language). However, other learning models have been studied. One frequently studied alternative is the case where the learner can ask membership queries, as in the exact query learning model or the minimally adequate teacher model introduced by Angluin.

Methodologies

There is a wide variety of methods for grammatical inference. Two of the classic sources are Fu (1977) and Fu (1982). Duda, Hart & Stork (2001) also devote a brief section to the problem and cite a number of references. The basic trial-and-error method they present is discussed below. For approaches to infer subclasses of regular languages in particular, see Induction of regular languages. A more recent textbook is de la Higuera (2010), which covers the theory of grammatical inference of regular languages and finite state automata. D'Ulizia, Ferri and Grifoni provide a survey that explores grammatical inference methods for natural languages. There are also several methods for the induction of probabilistic context-free grammars.

The method proposed in Section 8.7 of Duda, Hart & Stork (2001) suggests successively guessing grammar rules (productions) and testing them against positive and negative observations. The rule set is expanded so as to be able to generate each positive example, but if a given rule set also generates a negative example, it must be discarded. This particular approach can be characterized as "hypothesis testing" and bears some similarity to Mitchell's version space algorithm. The Duda, Hart & Stork (2001) text provides a simple example which nicely illustrates the process, but the feasibility of such an unguided trial-and-error approach for more substantial problems is dubious.

Grammatical induction using evolutionary algorithms is the process of evolving a representation of the grammar of a target language through some evolutionary process. Formal grammars can easily be represented as tree structures of production rules that can be subjected to evolutionary operators.
Algorithms of this sort stem from the genetic programming paradigm pioneered by John Koza. Other early work on simple formal languages used the binary string representation of genetic algorithms, but the inherently hierarchical structure of grammars couched in the EBNF language made trees a more flexible approach. Koza represented Lisp programs as trees. He was able to find analogues to the genetic operators within the standard set of tree operators. For example, swapping sub-trees is equivalent to the corresponding process of genetic crossover, where sub-strings of a genetic code are transplanted into an individual of the next generation. Fitness is measured by scoring the output from the functions of the Lisp code. Similar analogues between the tree-structured Lisp representation and the representation of grammars as trees made the application of genetic programming techniques possible for grammar induction. In the case of grammar induction, the transplantation of sub-trees corresponds to the swapping of production rules that enable the parsing of phrases from some language. The fitness operator for the grammar is based upon some measure of how well it performed in parsing some group of sentences from the target language. In a tree representation of a grammar, a terminal symbol of a production rule corresponds to a leaf node of the tree. Its parent node corresponds to a non-terminal symbol (e.g. a noun phrase or a verb phrase) in the rule set. Ultimately, the root node might correspond to a sentence non-terminal.

Like all greedy algorithms, greedy grammar inference algorithms make, in an iterative manner, decisions that seem to be the best at that stage. The decisions made usually deal with things like the creation of new rules, the removal of existing rules, the choice of a rule to be applied, or the merging of some existing rules. Because there are several ways to define "the stage" and "the best", there are also several greedy grammar inference algorithms. Some of these context-free grammar generating algorithms make a decision after every symbol read, while others first read the whole given symbol sequence and then start to make decisions.

A more recent approach is based on distributional learning. Algorithms using these approaches have been applied to learning context-free grammars and mildly context-sensitive languages, and have been proven to be correct and efficient for large subclasses of these grammars.

Angluin defines a pattern to be "a string of constant symbols from Σ and variable symbols from a disjoint set". The language of such a pattern is the set of all its nonempty ground instances, i.e. all strings resulting from consistent replacement of its variable symbols by nonempty strings of constant symbols. A pattern is called descriptive for a finite input set of strings if its language is minimal (with respect to set inclusion) among all pattern languages subsuming the input set. Angluin gives a polynomial algorithm to compute, for a given input string set, all descriptive patterns in one variable x. To this end, she builds an automaton representing all possibly relevant patterns; using sophisticated arguments about word lengths, which rely on x being the only variable, the state count can be drastically reduced. Erlebach et al. give a more efficient version of Angluin's pattern learning algorithm, as well as a parallelized version. Arimura et al.
show that a language class obtained from limited unions of patterns can be learned in polynomial time.

Pattern theory, formulated by Ulf Grenander, is a mathematical formalism to describe knowledge of the world as patterns. It differs from other approaches to artificial intelligence in that it does not begin by prescribing algorithms and machinery to recognize and classify patterns; rather, it prescribes a vocabulary to articulate and recast the pattern concepts in precise language. In addition to the new algebraic vocabulary, its statistical approach was novel in its aims. Broad in its mathematical coverage, pattern theory spans algebra and statistics, as well as local topological and global entropic properties.

Applications

The principle of grammar induction has been applied to other aspects of natural language processing, and has been applied (among many other problems) to semantic parsing, natural language understanding, example-based translation, language acquisition, grammar-based compression, and anomaly detection.

Grammar-based codes or grammar-based compression are compression algorithms based on the idea of constructing a context-free grammar (CFG) for the string to be compressed. Examples include universal lossless data compression algorithms. To compress a data sequence $x = x_1 \cdots x_n$, a grammar-based code transforms $x$ into a context-free grammar $G$. The problem of finding a smallest grammar for an input sequence (the smallest grammar problem) is known to be NP-hard, so many grammar-transform algorithms have been proposed from theoretical and practical viewpoints. Generally, the produced grammar $G$ is further compressed by statistical encoders like arithmetic coding.
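As a sketch of the grammar-transform idea, the following toy follows the spirit of Re-Pair-style algorithms (a hypothetical illustration, not the method of any specific source cited here): repeatedly replace the most frequent adjacent pair of symbols with a fresh nonterminal, accumulating context-free rules.

```python
# A minimal sketch of grammar-based compression in the spirit of Re-Pair:
# repeatedly replace the most frequent adjacent symbol pair with a new
# nonterminal, accumulating CFG rules. Illustrative only; a real coder
# would then statistically encode the resulting grammar.
from collections import Counter

def repair(seq):
    rules, next_id = {}, 0
    seq = list(seq)
    while True:
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs or pairs.most_common(1)[0][1] < 2:
            break                       # no pair repeats: grammar is done
        pair = pairs.most_common(1)[0][0]
        nt = f"R{next_id}"              # fresh nonterminal
        next_id += 1
        rules[nt] = pair
        out, i = [], 0
        while i < len(seq):             # greedy left-to-right replacement
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(nt)
                i += 2                  # consume the replaced pair
            else:
                out.append(seq[i])
                i += 1
        seq = out
    return seq, rules

start, grammar = repair("abababab")
print(start, grammar)   # ['R1', 'R1'] with R0 -> ('a','b'), R1 -> ('R0','R0')
```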
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Recurrent_neural_network] | [TOKENS: 6352] |
Recurrent neural network

In artificial neural networks, recurrent neural networks (RNNs) are designed for processing sequential data, such as text, speech, and time series, where the order of elements is important. Unlike feedforward neural networks, which process inputs independently, RNNs utilize recurrent connections, where the output of a neuron at one time step is fed back as input to the network at the next time step. This enables RNNs to capture temporal dependencies and patterns within sequences.

The fundamental building block of an RNN is the recurrent unit, which maintains a hidden state, a form of memory that is updated at each time step based on the current input and the previous hidden state. This feedback mechanism allows the network to learn from past inputs and incorporate that knowledge into its current processing. RNNs have been successfully applied to tasks such as unsegmented, connected handwriting recognition, speech recognition, natural language processing, and neural machine translation.

However, traditional RNNs suffer from the vanishing gradient problem, which limits their ability to learn long-range dependencies. This issue was addressed by the development of the long short-term memory (LSTM) architecture in 1997, making it the standard RNN variant for handling long-term dependencies. Later, gated recurrent units (GRUs) were introduced as a more computationally efficient alternative. In recent years, transformers, which rely on self-attention mechanisms instead of recurrence, have become the dominant architecture for many sequence-processing tasks, particularly in natural language processing, due to their superior handling of long-range dependencies and greater parallelizability. Nevertheless, RNNs remain relevant for applications where computational efficiency, real-time processing, or the inherent sequential nature of data is crucial.

History

One origin of RNNs was neuroscience, where the word "recurrent" is used to describe loop-like structures in anatomy. In 1901, Cajal observed "recurrent semicircles" in the cerebellar cortex formed by parallel fibers, Purkinje cells, and granule cells. In 1933, Lorente de Nó discovered "recurrent, reciprocal connections" by Golgi's method, and proposed that excitatory loops explain certain aspects of the vestibulo-ocular reflex. During the 1940s, multiple people proposed the existence of feedback in the brain, in contrast to the previous understanding of the neural system as a purely feedforward structure. Hebb considered the "reverberating circuit" as an explanation for short-term memory. The McCulloch and Pitts paper (1943), which proposed the McCulloch-Pitts neuron model, considered networks that contain cycles; the current activity of such networks can be affected by activity indefinitely far in the past. They were both interested in closed loops as possible explanations for, e.g., epilepsy and causalgia. Recurrent inhibition was proposed in 1946 as a negative feedback mechanism in motor control. Neural feedback loops were a common topic of discussion at the Macy conferences.
Frank Rosenblatt in 1960 published "close-loop cross-coupled perceptrons", which are 3-layered perceptron networks whose middle layer contains recurrent connections that change by a Hebbian learning rule. Later, in Principles of Neurodynamics (1961), he described "closed-loop cross-coupled" and "back-coupled" perceptron networks, made theoretical and experimental studies of Hebbian learning in these networks, and noted that a fully cross-coupled perceptron network is equivalent to an infinitely deep feedforward network. Similar networks were published by Kaoru Nakano in 1971, Shun'ichi Amari in 1972, and William A. Little in 1974, who was acknowledged by Hopfield in his 1982 paper.

Another origin of RNNs was statistical mechanics. The Ising model was developed by Wilhelm Lenz and Ernst Ising in the 1920s as a simple statistical mechanical model of magnets at equilibrium. Glauber in 1963 studied the Ising model evolving in time, as a process towards equilibrium (Glauber dynamics), adding the component of time. The Sherrington–Kirkpatrick model of spin glass, published in 1975, is the Hopfield network with random initialization. Sherrington and Kirkpatrick found that it is highly likely for the energy function of the SK model to have many local minima. In his 1982 paper, Hopfield applied this recently developed theory to study the Hopfield network with binary activation functions. In a 1984 paper he extended this to continuous activation functions. It became a standard model for the study of neural networks through statistical mechanics.

Modern RNN networks are mainly based on two architectures: LSTM and BRNN. At the resurgence of neural networks in the 1980s, recurrent networks were studied again. They were sometimes called "iterated nets". Two early influential works were the Jordan network (1986) and the Elman network (1990), which applied RNNs to study cognitive psychology. In 1993, a neural history compressor system solved a "Very Deep Learning" task that required more than 1000 subsequent layers in an RNN unfolded in time.

Long short-term memory (LSTM) networks were invented by Hochreiter and Schmidhuber in 1995 (published in 1997) and set accuracy records in multiple application domains. LSTM became the default choice for RNN architecture. Bidirectional recurrent neural networks (BRNN) use two RNNs that process the same input in opposite directions. These two are often combined, giving the bidirectional LSTM architecture. Around 2006, bidirectional LSTM started to revolutionize speech recognition, outperforming traditional models in certain speech applications. Bidirectional LSTMs also improved large-vocabulary speech recognition and text-to-speech synthesis and were used in Google voice search and dictation on Android devices. They broke records for improved machine translation, language modeling, and multilingual language processing. Also, LSTM combined with convolutional neural networks (CNNs) improved automatic image captioning.

The idea of encoder-decoder sequence transduction had been developed in the early 2010s. The papers most commonly cited as the origin of seq2seq are two papers from 2014. A seq2seq architecture employs two RNNs, typically LSTMs, an "encoder" and a "decoder", for sequence transduction, such as machine translation. Seq2seq models became state of the art in machine translation, and were instrumental in the development of attention mechanisms and transformers.
Configurations

An RNN-based model can be factored into two parts: configuration and architecture. Multiple RNNs can be combined in a data flow, and the data flow itself is the configuration. Each RNN itself may have any architecture, including LSTM, GRU, etc.

RNNs come in many variants. Abstractly speaking, an RNN is a function $f_{\theta}$ of type $(x_t, h_t) \mapsto (y_t, h_{t+1})$, where $x_t$ is the input vector, $h_t$ the hidden vector, and $y_t$ the output vector at step $t$. In words, it is a neural network that maps an input $x_t$ into an output $y_t$, with the hidden vector $h_t$ playing the role of "memory", a partial record of all previous input-output pairs. At each step, it transforms input to an output, and modifies its "memory" to help it to better perform future processing. Unrolled diagrams of RNNs can be misleading, because practical neural network topologies are frequently organized in "layers" and such drawings give that appearance; what appear to be layers are, in fact, different steps in time, "unfolded" to produce the appearance of layers.

A stacked RNN, or deep RNN, is composed of multiple RNNs stacked one above the other. Abstractly, it is structured as follows: each layer operates as a stand-alone RNN, and each layer's output sequence is used as the input sequence to the layer above. There is no conceptual limit to the depth of a stacked RNN.

A bidirectional RNN (biRNN) is composed of two RNNs, one processing the input sequence in one direction, and another in the opposite direction. The two output sequences are then concatenated to give the total output $((y_0, y_0'), (y_1, y_1'), \dots, (y_N, y_N'))$. Bidirectional RNNs allow the model to process a token both in the context of what came before it and what came after it. By stacking multiple bidirectional RNNs together, the model can process a token increasingly contextually. The ELMo model (2018) is a stacked bidirectional LSTM which takes character-level inputs and produces word-level embeddings.

Two RNNs can be run front-to-back in an encoder-decoder configuration. The encoder RNN processes an input sequence into a sequence of hidden vectors, and the decoder RNN processes the sequence of hidden vectors to an output sequence, with an optional attention mechanism. This was used to construct state-of-the-art neural machine translators during the 2014–2017 period, and was an instrumental step towards the development of transformers.

An RNN may process data with more than one dimension. PixelRNN processes two-dimensional data, with many possible directions. For example, the row-by-row direction processes an $n \times n$ grid of vectors $x_{i,j}$ in the order $x_{1,1}, x_{1,2}, \dots, x_{1,n}, x_{2,1}, x_{2,2}, \dots, x_{2,n}, \dots, x_{n,n}$. The diagonal BiLSTM uses two LSTMs to process the same grid. One processes it from the top-left corner to the bottom-right, such that it processes $x_{i,j}$ depending on its hidden state and cell state on the top and the left side: $h_{i-1,j}, c_{i-1,j}$ and $h_{i,j-1}, c_{i,j-1}$. The other processes it from the top-right corner to the bottom-left.
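Before turning to specific architectures, the abstract map $(x_t, h_t) \mapsto (y_t, h_{t+1})$ described above can be sketched in a few lines of NumPy. The dimensions, random weights, and tanh/linear choices here are illustrative assumptions in the style of a simple Elman cell, not any particular published model.

```python
# A minimal NumPy sketch of the abstract RNN map (x_t, h_t) -> (y_t, h_{t+1})
# as a simple Elman-style cell. All dimensions and weights are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid, d_out = 4, 8, 3
Wx = rng.normal(size=(d_hid, d_in))   # input-to-hidden weights
Wh = rng.normal(size=(d_hid, d_hid))  # hidden-to-hidden (recurrent) weights
Wy = rng.normal(size=(d_out, d_hid))  # hidden-to-output weights

def rnn_step(x_t, h_t):
    h_next = np.tanh(Wx @ x_t + Wh @ h_t)  # update the "memory"
    y_t = Wy @ h_next                      # emit an output for this step
    return y_t, h_next

h = np.zeros(d_hid)
for x in rng.normal(size=(5, d_in)):       # unfold over a length-5 sequence
    y, h = rnn_step(x, h)
print(y.shape, h.shape)                    # (3,) (8,)
```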
Architectures

Fully recurrent neural networks (FRNN) connect the outputs of all neurons to the inputs of all neurons. In other words, a fully recurrent network is a fully connected network. This is the most general neural network topology, because all other topologies can be represented by setting some connection weights to zero to simulate the lack of connections between those neurons.

The Hopfield network is an RNN in which all connections across layers are equally sized. It requires stationary inputs and is thus not a general RNN, as it does not process sequences of patterns. However, it is guaranteed to converge. If the connections are trained using Hebbian learning, the Hopfield network can perform as robust content-addressable memory, resistant to connection alteration.

An Elman network is a three-layer network augmented with a set of context units. The middle (hidden) layer is connected to these context units with a fixed weight of one. At each time step, the input is fed forward and a learning rule is applied. The fixed back-connections save a copy of the previous values of the hidden units in the context units (since they propagate over the connections before the learning rule is applied). Thus the network can maintain a sort of state, allowing it to perform tasks such as sequence prediction that are beyond the power of a standard multilayer perceptron.

Jordan networks are similar to Elman networks, except that the context units are fed from the output layer instead of the hidden layer. The context units in a Jordan network are also called the state layer; they have a recurrent connection to themselves. Elman and Jordan networks are also known as "simple recurrent networks" (SRN).

Long short-term memory (LSTM) is the most widely used RNN architecture. It was designed to solve the vanishing gradient problem and is normally augmented by recurrent gates called "forget gates". LSTM prevents backpropagated errors from vanishing or exploding; instead, errors can flow backward through unlimited numbers of virtual layers unfolded in space. That is, LSTM can learn tasks that require memories of events that happened thousands or even millions of discrete time steps earlier. Problem-specific LSTM-like topologies can be evolved. LSTM works even given long delays between significant events and can handle signals that mix low- and high-frequency components. Many applications use stacks of LSTMs, called "deep LSTM". LSTM can learn to recognize context-sensitive languages, unlike previous models based on hidden Markov models (HMM) and similar concepts.

The gated recurrent unit (GRU), introduced in 2014, was designed as a simplification of LSTM. GRUs are used in the full form and in several further simplified variants. They have fewer parameters than LSTM, as they lack an output gate. Their performance on polyphonic music modeling and speech signal modeling was found to be similar to that of long short-term memory, and there does not appear to be a particular performance difference between LSTM and GRU.
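As a concrete sketch of the gating just described, the following NumPy code runs a single-layer LSTM over a toy sequence, with input, forget, and output gates and a separate cell state. The parameter layout and initialization are illustrative assumptions, not any particular library's API.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, P):
    """One LSTM time step: returns (h_next, c_next).

    P holds weights W* (acting on x), U* (acting on h) and biases b*
    for the input (i), forget (f), output (o) gates and the candidate
    cell update (g)."""
    i = sigmoid(P["Wi"] @ x + P["Ui"] @ h + P["bi"])   # input gate
    f = sigmoid(P["Wf"] @ x + P["Uf"] @ h + P["bf"])   # forget gate
    o = sigmoid(P["Wo"] @ x + P["Uo"] @ h + P["bo"])   # output gate
    g = np.tanh(P["Wg"] @ x + P["Ug"] @ h + P["bg"])   # candidate update
    c_next = f * c + i * g          # cell state: errors flow through this path
    h_next = o * np.tanh(c_next)    # hidden state exposed to the next layer
    return h_next, c_next

rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
P = {}
for gate in "ifog":
    P[f"W{gate}"] = rng.normal(0, 0.1, (n_hid, n_in))
    P[f"U{gate}"] = rng.normal(0, 0.1, (n_hid, n_hid))
    P[f"b{gate}"] = np.zeros(n_hid)
h = c = np.zeros(n_hid)
for x in rng.normal(size=(10, n_in)):   # run over a toy sequence
    h, c = lstm_step(x, h, c, P)
```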
Introduced by Bart Kosko, a bidirectional associative memory (BAM) network is a variant of a Hopfield network that stores associative data as a vector. The bidirectionality comes from passing information through a matrix and its transpose. Typically, bipolar encoding is preferred to binary encoding of the associative pairs. Recently, stochastic BAM models using Markov stepping have been optimized for increased network stability and relevance to real-world applications. A BAM network has two layers, either of which can be driven as an input to recall an association and produce an output on the other layer.

Echo state networks (ESN) have a sparsely connected random hidden layer. The weights of output neurons are the only part of the network that can change (be trained). ESNs are good at reproducing certain time series. A variant for spiking neurons is known as a liquid state machine.

A recursive neural network is created by applying the same set of weights recursively over a differentiable graph-like structure by traversing the structure in topological order. Such networks are typically also trained by the reverse mode of automatic differentiation. They can process distributed representations of structure, such as logical terms. A special case of recursive neural networks is the RNN whose structure corresponds to a linear chain. Recursive neural networks have been applied to natural language processing. The recursive neural tensor network uses a tensor-based composition function for all nodes in the tree.

Neural Turing machines (NTMs) extend recurrent neural networks by coupling them to external memory resources with which they interact. The combined system is analogous to a Turing machine or von Neumann architecture but is differentiable end-to-end, allowing it to be efficiently trained with gradient descent. Differentiable neural computers (DNCs) are an extension of neural Turing machines, allowing for the usage of fuzzy amounts of each memory address and a record of chronology. Neural network pushdown automata (NNPDA) are similar to NTMs, but tapes are replaced by analog stacks that are differentiable and trained. In this way, they are similar in complexity to recognizers of context-free grammars (CFGs). Recurrent neural networks are Turing complete and can run arbitrary programs to process arbitrary sequences of inputs.

Training

An RNN can be trained into a conditionally generative model of sequences, also known as autoregression. Concretely, consider the problem of machine translation: given a sequence $(x_1, x_2, \dots, x_n)$ of English words, the model is to produce a sequence $(y_1, \dots, y_m)$ of French words, using a seq2seq model. During training, the encoder half of the model first ingests $(x_1, x_2, \dots, x_n)$, then the decoder half starts generating a sequence $(\hat{y}_1, \hat{y}_2, \dots, \hat{y}_l)$. The problem is that if the model makes a mistake early on, say at $\hat{y}_2$, then subsequent tokens are likely to also be mistakes. This makes it inefficient for the model to obtain a learning signal, since the model would mostly learn to shift $\hat{y}_2$ towards $y_2$, but not the others. Teacher forcing makes the decoder use the correct output sequence for generating the next entry in the sequence; for example, it would see $(y_1, \dots, y_k)$ in order to generate $\hat{y}_{k+1}$.
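A minimal sketch of teacher forcing for a decoder, under stated assumptions: a toy vanilla-RNN decoder_step, a toy embedding table, and token index 0 as the start symbol (all hypothetical names). Real systems differ in architecture, but the key line is that the gold token $y_t$, not the model's own prediction, is fed back as the next input.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(0)
V, H = 12, 8                                  # toy vocabulary / hidden sizes
embed = rng.normal(0, 0.1, (V, H))            # token embeddings
Wh = rng.normal(0, 0.1, (H, H))
Wy = rng.normal(0, 0.1, (V, H))

def decoder_step(x, h):
    """One decoder step: next-token logits and updated hidden state."""
    h = np.tanh(x + Wh @ h)
    return Wy @ h, h

def teacher_forced_loss(targets, h):
    """Average cross-entropy with the gold prefix fed back at each step."""
    loss, prev = 0.0, 0                       # 0 = <s> start-token index
    for gold in targets:
        logits, h = decoder_step(embed[prev], h)
        loss -= np.log(softmax(logits)[gold] + 1e-12)
        prev = gold                           # teacher forcing: feed y_t,
    return loss / len(targets)                # not the model's own guess

print(teacher_forced_loss([3, 5, 7], np.zeros(H)))
```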
Gradient descent is a first-order iterative optimization algorithm for finding the minimum of a function. In neural networks, it can be used to minimize the error term by changing each weight in proportion to the derivative of the error with respect to that weight, provided the non-linear activation functions are differentiable.

The standard method for training RNNs by gradient descent is the "backpropagation through time" (BPTT) algorithm, which is a special case of the general backpropagation algorithm. A more computationally expensive online variant is called "real-time recurrent learning" (RTRL), which is an instance of automatic differentiation in the forward accumulation mode with stacked tangent vectors. Unlike BPTT, this algorithm is local in time but not local in space. In this context, local in space means that a unit's weight vector can be updated using only information stored in the connected units and the unit itself, such that the update complexity of a single unit is linear in the dimensionality of the weight vector. Local in time means that the updates take place continually (online) and depend only on the most recent time step rather than on multiple time steps within a given time horizon as in BPTT. Biological neural networks appear to be local with respect to both time and space.

For recursively computing the partial derivatives, RTRL has a time complexity of O(number of hidden units × number of weights) per time step for computing the Jacobian matrices, while BPTT only takes O(number of weights) per time step, at the cost of storing all forward activations within the given time horizon. An online hybrid between BPTT and RTRL with intermediate complexity exists, along with variants for continuous time.

A major problem with gradient descent for standard RNN architectures is that error gradients vanish exponentially quickly with the size of the time lag between important events. LSTM combined with a BPTT/RTRL hybrid learning method attempts to overcome these problems. This problem is also solved in the independently recurrent neural network (IndRNN) by reducing the context of a neuron to its own past state; the cross-neuron information can then be explored in the following layers. Memories of different ranges, including long-term memory, can be learned without the gradient vanishing and exploding problems.

The online algorithm called causal recursive backpropagation (CRBP) implements and combines the BPTT and RTRL paradigms for locally recurrent networks. It works with the most general locally recurrent networks. The CRBP algorithm can minimize the global error term. This fact improves the stability of the algorithm, providing a unifying view of gradient calculation techniques for recurrent networks with local feedback.

One approach to computing gradient information in RNNs with arbitrary architectures is based on diagrammatic derivation with signal-flow graphs. It uses the BPTT batch algorithm, based on Lee's theorem for network sensitivity calculations. It was proposed by Wan and Beaufays, while its fast online version was proposed by Campolucci, Uncini and Piazza.

The connectionist temporal classification (CTC) is a specialized loss function for training RNNs for sequence modeling problems where the timing is variable.
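Returning to the BPTT algorithm described at the start of this section, here is a minimal sketch for a vanilla tanh RNN with a squared-error loss, under stated assumptions about the architecture. Note how the forward activations are stored for the whole horizon, the memory cost mentioned above.

```python
import numpy as np

def bptt(xs, ds, Wx, Wh, Wy):
    """Backpropagation through time for h_t = tanh(Wx x_t + Wh h_{t-1}),
    y_t = Wy h_t, with squared error against targets ds.
    Returns the gradients (dWx, dWh, dWy)."""
    H = Wh.shape[0]
    hs = [np.zeros(H)]                       # stored forward activations
    for x in xs:                             # forward pass
        hs.append(np.tanh(Wx @ x + Wh @ hs[-1]))
    dWx, dWh, dWy = (np.zeros_like(W) for W in (Wx, Wh, Wy))
    carry = np.zeros(H)                      # gradient flowing back through h
    for t in reversed(range(len(xs))):       # backward pass through time
        dy = Wy @ hs[t + 1] - ds[t]          # dL/dy_t for squared error
        dWy += np.outer(dy, hs[t + 1])
        dh = Wy.T @ dy + carry
        dz = dh * (1.0 - hs[t + 1] ** 2)     # back through the tanh
        dWx += np.outer(dz, xs[t])
        dWh += np.outer(dz, hs[t])
        carry = Wh.T @ dz                    # pass gradient to step t-1
    return dWx, dWh, dWy

rng = np.random.default_rng(0)
Wx = rng.normal(0, 0.5, (8, 4)); Wh = rng.normal(0, 0.5, (8, 8))
Wy = rng.normal(0, 0.5, (2, 8))
xs = rng.normal(size=(20, 4)); ds = rng.normal(size=(20, 2))
grads = bptt(xs, ds, Wx, Wh, Wy)
```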
Training the weights in a neural network can be modeled as a non-linear global optimization problem. A target function can be formed to evaluate the fitness or error of a particular weight vector as follows: first, the weights in the network are set according to the weight vector; next, the network is evaluated against the training sequence. Typically, the sum-squared difference between the predictions and the target values specified in the training sequence is used to represent the error of the current weight vector. Arbitrary global optimization techniques may then be used to minimize this target function.

The most common global optimization method for training RNNs is genetic algorithms, especially in unstructured networks. Initially, the neural network weights are encoded in a predefined manner, where one gene in the chromosome represents one weight link; the whole network is represented as a single chromosome. Many chromosomes make up the population, so many different neural networks are evolved until a stopping criterion is satisfied. The fitness function receives the reciprocal of the mean-squared error of each network during training and uses it to evaluate the stopping criterion. Therefore, the goal of the genetic algorithm is to maximize the fitness function, reducing the mean-squared error. Other global (and/or evolutionary) optimization techniques may be used to seek a good set of weights, such as simulated annealing or particle swarm optimization.
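A sketch of the evolutionary training loop just described. The encoding (one gene per weight link), the tiny feedforward network being evolved, and the truncation-selection scheme are all illustrative assumptions; fitness is the reciprocal of the mean-squared error, as in the text.

```python
import numpy as np

rng = np.random.default_rng(0)
xs = rng.normal(size=(50, 4)); ds = rng.normal(size=(50, 1))  # toy data

def decode_and_evaluate(chromosome):
    """One gene per weight link: unpack the flat vector into the
    network's weights, run it on the data, and return 1 / MSE."""
    W1 = chromosome[:32].reshape(8, 4)
    W2 = chromosome[32:].reshape(1, 8)
    preds = np.tanh(xs @ W1.T) @ W2.T
    mse = np.mean((preds - ds) ** 2)
    return 1.0 / (mse + 1e-12)               # fitness to maximize

pop = rng.normal(0, 0.5, (40, 40))            # population of chromosomes
for generation in range(100):                  # stop after a fixed budget
    fitness = np.array([decode_and_evaluate(c) for c in pop])
    order = np.argsort(fitness)[::-1]
    parents = pop[order[:10]]                  # truncation selection
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 0.05, (30, 40))
    pop = np.vstack([parents, children])       # elitism + mutated offspring
print("best 1/MSE:", fitness.max())
```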
Other architectures

The independently recurrent neural network (IndRNN) addresses the gradient vanishing and exploding problems in the traditional fully connected RNN. Each neuron in one layer only receives its own past state as context information (instead of full connectivity to all other neurons in this layer), and thus neurons are independent of each other's history. The gradient backpropagation can be regulated to avoid gradient vanishing and exploding in order to keep long- or short-term memory. The cross-neuron information is explored in the next layers. IndRNN can be robustly trained with non-saturated nonlinear functions such as ReLU. Deep networks can be trained using skip connections.

The neural history compressor is an unsupervised stack of RNNs. At the input level, it learns to predict its next input from the previous inputs. Only unpredictable inputs of some RNN in the hierarchy become inputs to the next higher-level RNN, which therefore recomputes its internal state only rarely. Each higher-level RNN thus studies a compressed representation of the information in the RNN below. This is done such that the input sequence can be precisely reconstructed from the representation at the highest level. The system effectively minimizes the description length or the negative logarithm of the probability of the data. Given a lot of learnable predictability in the incoming data sequence, the highest-level RNN can use supervised learning to easily classify even deep sequences with long intervals between important events.

It is possible to distill the RNN hierarchy into two RNNs: the "conscious" chunker (higher level) and the "subconscious" automatizer (lower level). Once the chunker has learned to predict and compress inputs that are unpredictable by the automatizer, the automatizer can be forced in the next learning phase to predict or imitate, through additional units, the hidden units of the more slowly changing chunker. This makes it easy for the automatizer to learn appropriate, rarely changing memories across long intervals. In turn, this helps the automatizer to make many of its once unpredictable inputs predictable, such that the chunker can focus on the remaining unpredictable events. A generative model of this kind partially overcame the vanishing gradient problem of automatic differentiation or backpropagation in neural networks in 1992. In 1993, such a system solved a "Very Deep Learning" task that required more than 1000 subsequent layers in an RNN unfolded in time.

Second-order RNNs use higher-order weights $w_{ijk}$ instead of the standard $w_{ij}$ weights, and states can be a product. This allows a direct mapping to a finite-state machine, both in training and in representation. Long short-term memory is an example of this but has no such formal mappings or proof of stability.

Hierarchical recurrent neural networks (HRNN) connect their neurons in various ways to decompose hierarchical behavior into useful subprograms. Such hierarchical structures of cognition are present in theories of memory presented by philosopher Henri Bergson, whose philosophical views have inspired hierarchical models. Hierarchical recurrent neural networks are useful in forecasting, helping to predict disaggregated inflation components of the consumer price index (CPI). The HRNN model leverages information from higher levels in the CPI hierarchy to enhance lower-level predictions. Evaluation of a substantial dataset from the US CPI-U index demonstrates the superior performance of the HRNN model compared to various established inflation prediction methods.

Generally, a recurrent multilayer perceptron network (RMLP network) consists of cascaded subnetworks, each containing multiple layers of nodes. Each subnetwork is feed-forward except for the last layer, which can have feedback connections. Each of these subnets is connected only by feed-forward connections.

A multiple timescales recurrent neural network (MTRNN) is a neural-based computational model that can simulate the functional hierarchy of the brain through self-organization, depending on the spatial connection between neurons and on distinct types of neuron activities, each with distinct time properties. With such varied neuronal activities, continuous sequences of any set of behaviors are segmented into reusable primitives, which in turn are flexibly integrated into diverse sequential behaviors. The biological plausibility of this type of hierarchy was discussed in the memory-prediction theory of brain function by Hawkins in his book On Intelligence. Such a hierarchy also agrees with theories of memory posited by philosopher Henri Bergson, which have been incorporated into an MTRNN model.

Greg Snider of HP Labs describes a system of cortical computing with memristive nanodevices. The memristors (memory resistors) are implemented by thin-film materials in which the resistance is electrically tuned via the transport of ions or oxygen vacancies within the film. DARPA's SyNAPSE project has funded IBM Research and HP Labs, in collaboration with the Boston University Department of Cognitive and Neural Systems (CNS), to develop neuromorphic architectures that may be based on memristive systems. Memristive networks are a particular type of physical neural network with properties very similar to (Little-)Hopfield networks: they have continuous dynamics, a limited memory capacity, and natural relaxation via the minimization of a function which is asymptotic to the Ising model. In this sense, the dynamics of a memristive circuit have the advantage, compared to a resistor-capacitor network, of a more interesting non-linear behavior.
From this point of view, engineering analog memristive networks accounts for a peculiar type of neuromorphic engineering in which the device behavior depends on the circuit wiring or topology. The evolution of these networks can be studied analytically using variations of the Caravelli–Traversa–Di Ventra equation.

A continuous-time recurrent neural network (CTRNN) uses a system of ordinary differential equations to model the effects on a neuron of the incoming inputs. They are typically analyzed by dynamical systems theory. Many RNN models in neuroscience are continuous-time. For a neuron $i$ in the network with activation $y_i$, the rate of change of activation is given by:

$\tau_i \dot{y}_i = -y_i + \sum_{j=1}^{n} w_{ji}\, \sigma(y_j - \Theta_j) + I_i(t),$

where $\tau_i$ is the time constant of the postsynaptic node, $w_{ji}$ is the weight of the connection from the presynaptic to the postsynaptic node, $\sigma$ is a sigmoid nonlinearity, $\Theta_j$ is the bias of the presynaptic node, and $I_i(t)$ is the external input (if any) to the node. CTRNNs have been applied to evolutionary robotics, where they have been used to address vision, co-operation, and minimal cognitive behaviour.

Note that, by the Shannon sampling theorem, discrete-time recurrent neural networks can be viewed as continuous-time recurrent neural networks where the differential equations have been transformed into equivalent difference equations. This transformation can be thought of as occurring after the post-synaptic node activation functions $y_i(t)$ have been low-pass filtered but prior to sampling.

Recurrent neural networks are in fact recursive neural networks with a particular structure: that of a linear chain. Whereas recursive neural networks operate on any hierarchical structure, combining child representations into parent representations, recurrent neural networks operate on the linear progression of time, combining the previous time step and a hidden representation into the representation for the current time step.

From a time-series perspective, RNNs can appear as nonlinear versions of finite impulse response and infinite impulse response filters, and also as a nonlinear autoregressive exogenous model (NARX). An RNN has infinite impulse response, whereas a convolutional neural network has finite impulse response. Both classes of networks exhibit temporal dynamic behavior. A finite impulse recurrent network is a directed acyclic graph that can be unrolled and replaced with a strictly feedforward neural network, while an infinite impulse recurrent network is a directed cyclic graph that cannot be unrolled.

The effect of memory-based learning for the recognition of sequences can also be implemented by a more biologically based model which uses the silencing mechanism exhibited in neurons with relatively high-frequency spiking activity.

Additional stored states and storage under direct control by the network can be added to both infinite-impulse and finite-impulse networks. Another network or graph can also replace the storage if it incorporates time delays or has feedback loops. Such controlled states are referred to as gated states or gated memory and are part of long short-term memory networks (LSTMs) and gated recurrent units. This is also called a feedback neural network (FNN).

Libraries

Modern libraries provide runtime-optimized implementations of the above functionality or allow the slow loop to be sped up by just-in-time compilation.
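Returning to the CTRNN dynamics above, here is a small sketch that integrates the equation with the forward Euler method; the logistic nonlinearity, random weights, and step size are illustrative assumptions.

```python
import numpy as np

def sigma(x):
    return 1.0 / (1.0 + np.exp(-x))  # logistic nonlinearity

def ctrnn_euler(y, tau, W, theta, I, dt=0.01, steps=1000):
    """Integrate tau_i dy_i/dt = -y_i + sum_j w_ji sigma(y_j - theta_j) + I_i
    with forward Euler, returning the activation trajectory."""
    traj = [y.copy()]
    for _ in range(steps):
        dydt = (-y + W.T @ sigma(y - theta) + I) / tau
        y = y + dt * dydt
        traj.append(y.copy())
    return np.array(traj)

rng = np.random.default_rng(0)
n = 3
traj = ctrnn_euler(
    y=rng.normal(size=n),
    tau=np.ones(n),
    W=rng.normal(0, 2, (n, n)),   # W[j, i] = weight from neuron j to i
    theta=np.zeros(n),
    I=np.zeros(n),
)
print(traj[-1])                    # state after integration
```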
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Text_corpus] | [TOKENS: 345] |
Text corpus

In linguistics and natural language processing, a corpus (pl.: corpora) or text corpus is a dataset consisting of natively digital and older, digitized language resources, either annotated or unannotated. Annotated, they have been used in corpus linguistics for statistical hypothesis testing, checking occurrences or validating linguistic rules within a specific language territory.

Overview

A corpus may contain texts in a single language (monolingual corpus) or text data in multiple languages (multilingual corpus). To make corpora more useful for linguistic research, they are often subjected to a process known as annotation. An example of annotating a corpus is part-of-speech tagging, or POS-tagging, in which information about each word's part of speech (verb, noun, adjective, etc.) is added to the corpus in the form of tags. Another example is indicating the lemma (base) form of each word. When the language of the corpus is not a working language of the researchers who use it, interlinear glossing is used to make the annotation bilingual.

Some corpora have further structured levels of analysis applied. In particular, smaller corpora may be fully parsed. Such corpora are usually called treebanks or parsed corpora. The difficulty of ensuring that the entire corpus is completely and consistently annotated means that these corpora are usually smaller, containing around one to three million words. Other levels of linguistic structured analysis are possible, including annotations for morphology, semantics and pragmatics.

Applications

Corpora are the main knowledge base in corpus linguistics.
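As a concrete illustration of the POS-tagging and lemma annotation described above, here is a short example using the NLTK library, assuming its tokenizer, tagger, and WordNet resources have been downloaded (e.g. via nltk.download()).

```python
import nltk
from nltk.stem import WordNetLemmatizer

sentence = "The quick brown fox jumps over the lazy dogs"
tokens = nltk.word_tokenize(sentence)

# POS-tagging: attach a Penn Treebank part-of-speech tag to each token.
tagged = nltk.pos_tag(tokens)
print(tagged)   # e.g. [('The', 'DT'), ('quick', 'JJ'), ('brown', 'JJ'), ...]

# Lemma annotation: record the base form of each word alongside it.
lemmatizer = WordNetLemmatizer()
lemmas = [(w, lemmatizer.lemmatize(w.lower())) for w in tokens]
print(lemmas)   # e.g. ('dogs', 'dog')
```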
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Language_model#cite_note-14] | [TOKENS: 1793] |
Language model

A language model is a computational model that predicts sequences in natural language. Language models are useful for a variety of tasks, including speech recognition, machine translation, natural language generation (generating more human-like text), optical character recognition, route optimization, handwriting recognition, grammar induction, and information retrieval. Large language models (LLMs), their most advanced form as of 2019, are predominantly based on transformers trained on larger datasets (frequently using texts scraped from the public internet). They have superseded recurrent neural network-based models, which had previously superseded the purely statistical models, such as the word n-gram language model.

History

Noam Chomsky did pioneering work on language models in the 1950s by developing a theory of formal grammars. In 1980, statistical approaches were explored and found to be more useful for many purposes than rule-based formal grammars. Discrete representations like word n-gram language models, with probabilities for discrete combinations of words, made significant advances. In the 2000s, continuous representations for words, such as word embeddings, began to replace discrete representations. Typically, the representation is a real-valued vector that encodes a word's meaning such that words closer in vector space are similar in meaning, and common relationships between words, such as plurality or gender, are preserved.

Pure statistical models

In 1980, the first significant statistical language model was proposed, and during the decade IBM performed 'Shannon-style' experiments, in which potential sources for language modeling improvement were identified by observing and analyzing the performance of human subjects in predicting or correcting text.

A word n-gram language model is a statistical model of language which calculates the probability of the next word in a sequence from a fixed-size window of previous words. If one previous word is considered, it is a bigram model; if two words, a trigram model; if $n-1$ words, an n-gram model. Special tokens $\langle s\rangle$ and $\langle /s\rangle$ are introduced to denote the start and end of a sentence. To prevent a zero probability being assigned to unseen words, the probability of each seen word is slightly lowered to make room for the unseen words in a given corpus. To achieve this, various smoothing methods are used, from simple "add-one" smoothing (assigning a count of 1 to unseen n-grams, as an uninformative prior) to more sophisticated techniques, such as Good–Turing discounting or back-off models. Word n-gram models have largely been superseded by recurrent neural network-based models, which in turn have been superseded by transformer-based models often referred to as large language models.

Maximum entropy language models encode the relationship between a word and the n-gram history using feature functions. The equation is

$P(w_m \mid w_1, \ldots, w_{m-1}) = \frac{1}{Z(w_1, \ldots, w_{m-1})} \exp\left(a^T f(w_1, \ldots, w_m)\right),$

where $Z(w_1, \ldots, w_{m-1})$ is the partition function, $a$ is the parameter vector, and $f(w_1, \ldots, w_m)$ is the feature function.
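A toy numeric sketch of the maximum entropy formula above, assuming a handful of hypothetical indicator features over bigram patterns with hand-picked weights (not a trained model): each candidate word's score is exp(aᵀf), and the partition function Z normalizes over the vocabulary.

```python
import math

vocab = ["the", "cat", "sat", "mat"]

def features(history, w):
    """Indicator features: f_k = 1 if the (previous word, w) pattern fires."""
    prev = history[-1] if history else "<s>"
    return {
        ("cat", "sat"): 1.0 if (prev, w) == ("cat", "sat") else 0.0,
        ("the", "cat"): 1.0 if (prev, w) == ("the", "cat") else 0.0,
        ("the", "mat"): 1.0 if (prev, w) == ("the", "mat") else 0.0,
    }

a = {("cat", "sat"): 2.0, ("the", "cat"): 1.5, ("the", "mat"): 1.0}  # weights

def p_next(history, w):
    """P(w | history) = exp(a . f) / Z, with Z summed over the vocabulary."""
    def score(u):
        return math.exp(sum(a[k] * v for k, v in features(history, u).items()))
    return score(w) / sum(score(u) for u in vocab)

print(p_next(["the", "cat"], "sat"))   # boosted by the ("cat", "sat") feature
```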
In the simplest case, the feature function is just an indicator of the presence of a certain n-gram. It is helpful to use a prior on $a$ or some form of regularization. The log-bilinear model is another example of an exponential language model.

The skip-gram language model is an attempt at overcoming the data sparsity problem that the preceding model (i.e. the word n-gram language model) faced. Words represented in an embedding vector are not necessarily consecutive anymore, but may leave gaps that are skipped over (thus the name "skip-gram"). Formally, a k-skip-n-gram is a length-n subsequence whose components occur at distance at most k from each other. For example, for a given input text, the set of 1-skip-2-grams includes all the bigrams (2-grams) and, in addition, the subsequences formed by skipping one intervening word.

In the skip-gram model, semantic relations between words are represented by linear combinations, capturing a form of compositionality. For example, in some such models, if v is the function that maps a word w to its n-dimensional vector representation, then $v(\mathrm{king}) - v(\mathrm{male}) + v(\mathrm{female}) \approx v(\mathrm{queen})$, where ≈ is made precise by stipulating that its right-hand side must be the nearest neighbor of the value of the left-hand side.
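A small sketch of k-skip-n-gram extraction, under one common reading of the definition above (at most k tokens skipped between consecutive components); the function name and whitespace tokenization are illustrative assumptions.

```python
from itertools import combinations

def k_skip_n_grams(tokens, n, k):
    """All length-n subsequences whose consecutively chosen positions
    are at most k+1 apart (i.e. at most k tokens are skipped)."""
    grams = set()
    for idxs in combinations(range(len(tokens)), n):
        if all(b - a <= k + 1 for a, b in zip(idxs, idxs[1:])):
            grams.add(tuple(tokens[i] for i in idxs))
    return grams

tokens = "the rain falls mainly on the plain".split()
# 1-skip-2-grams: all bigrams plus the pairs that skip one word
print(sorted(k_skip_n_grams(tokens, n=2, k=1)))
```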
Neural models

Continuous representations or embeddings of words are produced in recurrent neural network-based language models (known also as continuous space language models). Such continuous space embeddings help to alleviate the curse of dimensionality, which is the consequence of the number of possible sequences of words increasing exponentially with the size of the vocabulary, further causing a data sparsity problem. Neural networks avoid this problem by representing words as non-linear combinations of weights in a neural net.

A large language model (LLM) is a language model trained with self-supervised machine learning on a vast amount of text, designed for natural language processing tasks, especially language generation. The largest and most capable LLMs are generative pre-trained transformers (GPTs) that provide the core capabilities of modern chatbots. LLMs can be fine-tuned for specific tasks or guided by prompt engineering. These models acquire predictive power regarding syntax, semantics, and ontologies inherent in human language corpora, but they also inherit inaccuracies and biases present in the data they are trained on. They consist of billions to trillions of parameters and operate as general-purpose sequence models, generating, summarizing, translating, and reasoning over text.

LLMs represent a significant new technology in their ability to generalize across tasks with minimal task-specific supervision, enabling capabilities like conversational agents, code generation, knowledge retrieval, and automated reasoning that previously required bespoke systems. LLMs evolved from earlier statistical and recurrent neural network approaches to language modeling. The transformer architecture, introduced in 2017, replaced recurrence with self-attention, allowing efficient parallelization, longer context handling, and scalable training on unprecedented data volumes. This innovation enabled models like GPT, BERT, and their successors, which demonstrated emergent behaviors at scale, such as few-shot learning and compositional reasoning.

Reinforcement learning, particularly policy gradient algorithms, has been adapted to fine-tune LLMs for desired behaviors beyond raw next-token prediction. Reinforcement learning from human feedback (RLHF) applies these methods to optimize a policy, the LLM's output distribution, against reward signals derived from human or automated preference judgments. This has been critical for aligning model outputs with user expectations, improving factuality, reducing harmful responses, and enhancing task performance.

Benchmark evaluations for LLMs have evolved from narrow linguistic assessments toward comprehensive, multi-task evaluations measuring reasoning, factual accuracy, alignment, and safety. Hill climbing, iteratively optimizing models against benchmarks, has emerged as a dominant strategy, producing rapid incremental performance gains but raising concerns of overfitting to benchmarks rather than achieving genuine generalization or robust capability improvements.

Although language models sometimes match human performance, it is not clear whether they are plausible cognitive models. At least for recurrent neural networks, it has been shown that they sometimes learn patterns that humans do not, but fail to learn patterns that humans typically do.

Evaluation and benchmarks

Evaluation of the quality of language models is mostly done by comparison to human-created sample benchmarks derived from typical language-oriented tasks. Other, less established, quality tests examine the intrinsic character of a language model or compare two such models. Since language models are typically intended to be dynamic and to learn from data they see, some proposed models investigate the rate of learning, e.g., through inspection of learning curves. Various data sets have been developed for use in evaluating language processing systems.
======================================== |
[SOURCE: https://en.wikipedia.org/wiki/Principle_of_maximum_entropy] | [TOKENS: 5066] |
Principle of maximum entropy

The principle of maximum entropy states that, among all probability distributions consistent with a given set of constraints (such as normalization or specified expectation values), the distribution that maximizes Shannon entropy should be selected. This yields the least committal distribution compatible with the known constraints, introducing no structure beyond what is logically implied by the available information.

The justification is that entropy measures the expected information content (or log-surprise) of outcomes relative to a specified reference measure. Maximizing entropy ensures that no additional structure is imposed beyond the stated constraints. Any lower-entropy alternative would encode extra regularity not required by those constraints and would therefore amount to introducing unsupported information.

It is important that entropy be defined relative to a specified measure or prior. In discrete cases, Shannon entropy is defined relative to the counting measure (or an explicitly specified prior weighting). In continuous cases, differential entropy depends on the choice of coordinates and is not invariant under reparameterization. For this reason, the principled continuous formulation maximizes relative entropy (equivalently, minimizes Kullback–Leibler divergence) with respect to a specified reference measure or prior density m(x), typically by maximizing

$-\int p(x)\,\log \frac{p(x)}{m(x)}\,dx$

subject to the given constraints. This formulation is invariant under change of variables and makes explicit the role of the underlying prior measure.

History

The principle was first expounded by E. T. Jaynes in two papers in 1957, where he emphasized a natural correspondence between statistical mechanics and information theory. In particular, Jaynes argued that the Gibbsian method of statistical mechanics is sound by also arguing that the entropy of statistical mechanics and the information entropy of information theory are the same concept. Consequently, statistical mechanics should be considered a particular application of a general tool of logical inference and information theory.

Overview

In most practical cases, the stated prior data or testable information is given by a set of conserved quantities (average values of some moment functions), associated with the probability distribution in question. This is the way the maximum entropy principle is most often used in statistical thermodynamics. Another possibility is to prescribe some symmetries of the probability distribution. The equivalence between conserved quantities and corresponding symmetry groups implies a similar equivalence for these two ways of specifying the testable information in the maximum entropy method.

The maximum entropy principle is also needed to guarantee the uniqueness and consistency of probability assignments obtained by different methods, statistical mechanics and logical inference in particular. The maximum entropy principle makes explicit our freedom in using different forms of prior data. As a special case, a uniform prior probability density (Laplace's principle of indifference, sometimes called the principle of insufficient reason) may be adopted. Thus, the maximum entropy principle is not merely an alternative way to view the usual methods of inference of classical statistics, but represents a significant conceptual generalization of those methods.
However, these statements do not imply that thermodynamical systems need not be shown to be ergodic to justify treatment as a statistical ensemble.

In ordinary language, the principle of maximum entropy can be said to express a claim of epistemic modesty, or of maximum ignorance. The selected distribution is the one that makes the least claim to being informed beyond the stated prior data, that is to say the one that admits the most ignorance beyond the stated prior data.

Testable information

The principle of maximum entropy is useful explicitly only when applied to testable information. Testable information is a statement about a probability distribution whose truth or falsity is well-defined. For example, the statement $p_2 + p_3 > 0.6$ (where $p_2$ and $p_3$ are probabilities of events) is a statement of testable information. Given testable information, the maximum entropy procedure consists of seeking the probability distribution which maximizes information entropy, subject to the constraints of the information. This constrained optimization problem is typically solved using the method of Lagrange multipliers.

Entropy maximization with no testable information respects the universal "constraint" that the sum of the probabilities is one. Under this constraint, the maximum entropy discrete probability distribution is the uniform distribution,

$p_i = \frac{1}{n}\ \text{for all}\ i \in \{1, \dots, n\}.$

Applications

The principle of maximum entropy is commonly applied in two ways to inferential problems: obtaining prior probabilities, and specifying models.

The principle of maximum entropy is often used to obtain prior probability distributions for Bayesian inference. Jaynes was a strong advocate of this approach, claiming the maximum entropy distribution represented the least informative distribution. A large amount of literature is now dedicated to the elicitation of maximum entropy priors and links with channel coding. Maximum entropy is a sufficient updating rule for radical probabilism. Richard Jeffrey's probability kinematics is a special case of maximum entropy inference. However, maximum entropy is not a generalisation of all such sufficient updating rules.

Alternatively, the principle is often invoked for model specification: in this case the observed data itself is assumed to be the testable information. Such models are widely used in natural language processing. An example of such a model is logistic regression, which corresponds to the maximum entropy classifier for independent observations.

The maximum entropy principle has also been applied in economics and resource allocation. For example, the Boltzmann fair division model uses the maximum entropy (Boltzmann) distribution to allocate resources or income among individuals, providing a probabilistic approach to distributive justice.

One of the main applications of the maximum entropy principle is in discrete and continuous density estimation. Similar to support vector machine estimators, the maximum entropy principle may require the solution to a quadratic programming problem, and thus provide a sparse mixture model as the optimal density estimator. One important advantage of the method is its ability to incorporate prior information in the density estimation.

General solution for the maximum entropy distribution with linear constraints

We have some testable information I about a quantity x taking values in {x1, x2, ..., xn}.
We assume this information has the form of m constraints on the expectations of the functions $f_k$; that is, we require our probability distribution to satisfy the moment inequality/equality constraints:

$\sum_{i=1}^{n} \Pr(x_i) f_k(x_i) \geq F_k, \qquad k = 1, \ldots, m,$

where the $F_k$ are observables. We also require the probability density to sum to one, which may be viewed as a primitive constraint on the identity function and an observable equal to 1, giving the constraint

$\sum_{i=1}^{n} \Pr(x_i) = 1.$

The probability distribution with maximum information entropy subject to these inequality/equality constraints is of the form

$\Pr(x_i) = \frac{1}{Z(\lambda_1, \ldots, \lambda_m)} \exp\left[\lambda_1 f_1(x_i) + \cdots + \lambda_m f_m(x_i)\right]$

for some $\lambda_1, \ldots, \lambda_m$. It is sometimes called the Gibbs distribution. The normalization constant is determined by

$Z(\lambda_1, \ldots, \lambda_m) = \sum_{i=1}^{n} \exp\left[\lambda_1 f_1(x_i) + \cdots + \lambda_m f_m(x_i)\right]$

and is conventionally called the partition function. (The Pitman–Koopman theorem states that the necessary and sufficient condition for a sampling distribution to admit sufficient statistics of bounded dimension is that it have the general form of a maximum entropy distribution.)

The $\lambda_k$ parameters are Lagrange multipliers. In the case of equality constraints, their values are determined from the solution of the nonlinear equations

$F_k = \frac{\partial}{\partial \lambda_k} \log Z(\lambda_1, \ldots, \lambda_m).$

In the case of inequality constraints, the Lagrange multipliers are determined from the solution of a convex optimization program with linear constraints. In both cases, there is no closed-form solution, and the computation of the Lagrange multipliers usually requires numerical methods.
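A numeric sketch of the discrete solution above for a single equality constraint: a die (x = 1..6) with f(x) = x and target mean F = 4.5, an illustrative choice. The Lagrange multiplier solves F = ∂ log Z/∂λ, found here by bisection, since the Gibbs mean is monotone in λ.

```python
import numpy as np

x = np.arange(1, 7, dtype=float)   # outcomes of a die
F = 4.5                            # constrained expectation E[x]

def mean(lam):
    """d/d(lambda) log Z = E[x] under the Gibbs distribution."""
    p = np.exp(lam * x)
    p /= p.sum()
    return p @ x

# Solve F = d log Z / d lambda by bisection (mean is increasing in lambda).
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if mean(mid) < F else (lo, mid)
lam = 0.5 * (lo + hi)

p = np.exp(lam * x)
p /= p.sum()                       # the maximum entropy distribution Pr(x_i)
print(lam, p, p @ x)               # p @ x should be ~4.5
```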
For continuous distributions, the Shannon entropy cannot be used, as it is only defined for discrete probability spaces. Instead, Edwin Jaynes (1963, 1968, 2003) gave the following formula, which is closely related to the relative entropy (see also differential entropy):

$H_c = -\int p(x) \log \frac{p(x)}{q(x)}\,dx,$

where q(x), which Jaynes called the "invariant measure", is proportional to the limiting density of discrete points. For now, we shall assume that q is known; we will discuss it further after the solution equations are given. A closely related quantity, the relative entropy, is usually defined as the Kullback–Leibler divergence of p from q (although it is sometimes, confusingly, defined as the negative of this). The inference principle of minimizing this, due to Kullback, is known as the Principle of Minimum Discrimination Information.

We have some testable information I about a quantity x which takes values in some interval of the real numbers (all integrals below are over this interval). We assume this information has the form of m constraints on the expectations of the functions $f_k$; that is, we require our probability density function to satisfy the inequality (or purely equality) moment constraints:

$\int p(x) f_k(x)\,dx \geq F_k, \qquad k = 1, \dotsc, m,$

where the $F_k$ are observables. We also require the probability density to integrate to one, which may be viewed as a primitive constraint on the identity function and an observable equal to 1, giving the constraint

$\int p(x)\,dx = 1.$

The probability density function with maximum $H_c$ subject to these constraints is

$p(x) = \frac{q(x) \exp\left[\lambda_1 f_1(x) + \dotsb + \lambda_m f_m(x)\right]}{Z(\lambda_1, \dotsc, \lambda_m)}$

with the partition function determined by

$Z(\lambda_1, \dotsc, \lambda_m) = \int q(x) \exp\left[\lambda_1 f_1(x) + \dotsb + \lambda_m f_m(x)\right]\,dx.$

As in the discrete case, where all moment constraints are equalities the values of the $\lambda_k$ parameters are determined by the system of nonlinear equations

$F_k = \frac{\partial}{\partial \lambda_k} \log Z(\lambda_1, \dotsc, \lambda_m),$

while in the case of inequality moment constraints the Lagrange multipliers are determined from the solution of a convex optimization program.

The invariant measure function q(x) can be best understood by supposing that x is known to take values only in the bounded interval (a, b), and that no other information is given. Then the maximum entropy probability density function is

$p(x) = A \cdot q(x), \qquad a < x < b,$

where A is a normalization constant. The invariant measure function is actually the prior density function encoding 'lack of relevant information'. It cannot be determined by the principle of maximum entropy, and must be determined by some other logical method, such as the principle of transformation groups or marginalization theory. For several examples of maximum entropy distributions, see the article on maximum entropy probability distributions.

Justifications for the principle of maximum entropy

Proponents of the principle of maximum entropy justify its use in assigning probabilities in several ways, including the following two arguments. These arguments take the use of Bayesian probability as given, and are thus subject to the same postulates.

Consider a discrete probability distribution among $m$ mutually exclusive propositions. The most informative distribution would occur when one of the propositions was known to be true. In that case, the information entropy would be equal to zero. The least informative distribution would occur when there is no reason to favor any one of the propositions over the others. In that case, the only reasonable probability distribution would be uniform, and the information entropy would then be equal to its maximum possible value, $\log m$. The information entropy can therefore be seen as a numerical measure which describes how uninformative a particular probability distribution is, ranging from zero (completely informative) to $\log m$ (completely uninformative). By choosing to use the distribution with the maximum entropy allowed by our information, the argument goes, we are choosing the most uninformative distribution possible. To choose a distribution with lower entropy would be to assume information we do not possess. Thus the maximum entropy distribution is the only reasonable distribution. The dependence of the solution on the dominating measure represented by $m(x)$ is, however, a source of criticism of the approach, since this dominating measure is in fact arbitrary.

The following argument is the result of a suggestion made by Graham Wallis to E. T. Jaynes in 1962. It is essentially the same mathematical argument used for the Maxwell–Boltzmann statistics in statistical mechanics, although the conceptual emphasis is quite different. It has the advantage of being strictly combinatorial in nature, making no reference to information entropy as a measure of 'uncertainty', 'uninformativeness', or any other imprecisely defined concept. The information entropy function is not assumed a priori, but rather is found in the course of the argument; and the argument leads naturally to the procedure of maximizing the information entropy, rather than treating it in some other way.

Suppose an individual wishes to make a probability assignment among $m$ mutually exclusive propositions. They have some testable information, but are not sure how to go about including this information in their probability assessment. They therefore conceive of the following random experiment. They will distribute $N$ quanta of probability (each worth $1/N$) at random among the $m$ possibilities. (One might imagine that they will throw $N$ balls into $m$ buckets while blindfolded.
The following argument is the result of a suggestion made by Graham Wallis to E. T. Jaynes in 1962. It is essentially the same mathematical argument used for the Maxwell–Boltzmann statistics in statistical mechanics, although the conceptual emphasis is quite different. It has the advantage of being strictly combinatorial in nature, making no reference to information entropy as a measure of 'uncertainty', 'uninformativeness', or any other imprecisely defined concept. The information entropy function is not assumed a priori; rather, it is found in the course of the argument, and the argument leads naturally to the procedure of maximizing the information entropy rather than treating it in some other way.

Suppose an individual wishes to make a probability assignment among $m$ mutually exclusive propositions. They have some testable information, but are not sure how to include it in their probability assessment. They therefore conceive of the following random experiment. They will distribute $N$ quanta of probability (each worth $1/N$) at random among the $m$ possibilities. (One might imagine that they will throw $N$ balls into $m$ buckets while blindfolded. In order to be as fair as possible, each throw is to be independent of every other, and every bucket is to be the same size.) Once the experiment is done, they will check whether the probability assignment thus obtained is consistent with their information. (For this step to be successful, the information must be a constraint given by an open set in the space of probability measures.) If it is inconsistent, they will reject it and try again. If it is consistent, their assessment will be

$$p_i = \frac{n_i}{N},$$

where $p_i$ is the probability of the $i$th proposition and $n_i$ is the number of quanta assigned to the $i$th proposition (i.e. the number of balls that ended up in bucket $i$).

Now, in order to reduce the 'graininess' of the probability assignment, it is necessary to use quite a large number of quanta of probability. Rather than actually carry out, and possibly have to repeat, this rather long random experiment, the protagonist decides simply to calculate and use the most probable result.
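The rejection experiment described above is easy to simulate. The following is a hedged Monte Carlo sketch under purely illustrative assumptions: $m = 3$ propositions, $N = 30$ quanta, and testable information given by the hypothetical open constraint $\sum_i i\, p_i > 1.2$ (with propositions indexed $i = 0, 1, 2$).

```python
# Hedged Monte Carlo sketch of the Wallis thought experiment; the constraint
# below is a hypothetical example, not from the article.
import numpy as np

rng = np.random.default_rng(0)
m, N = 3, 30

def consistent(p):
    # The individual's testable information: an open constraint on p,
    # here sum_i i * p_i > 1.2, chosen purely for demonstration.
    return np.dot(np.arange(m), p) > 1.2

while True:
    # Throw N balls into m equal buckets, each throw independent.
    counts = rng.multinomial(N, np.full(m, 1.0 / m))
    p = counts / N  # candidate assignment p_i = n_i / N
    if consistent(p):  # keep only assignments matching the information
        break

print(p)  # an accepted "fair" random probability assignment
```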
The probability of any particular result is given by the multinomial distribution,

$$\Pr(\mathbf{p}) = W \cdot m^{-N},$$

where

$$W = \frac{N!}{n_1!\, n_2! \dotsb n_m!}$$

is sometimes known as the multiplicity of the outcome. The most probable result is the one which maximizes the multiplicity $W$. Rather than maximizing $W$ directly, the protagonist could equivalently maximize any monotonically increasing function of $W$. They decide to maximize

$$\begin{aligned}
\frac{1}{N}\log W &= \frac{1}{N}\log \frac{N!}{n_1!\, n_2! \dotsb n_m!} \\
&= \frac{1}{N}\log \frac{N!}{(Np_1)!\,(Np_2)! \dotsb (Np_m)!} \\
&= \frac{1}{N}\left(\log N! - \sum_{i=1}^{m} \log\bigl((Np_i)!\bigr)\right).
\end{aligned}$$

At this point, in order to simplify the expression, the protagonist takes the limit as $N \to \infty$, i.e. as the probability levels go from grainy discrete values to smooth continuous values. Using Stirling's approximation, they find

$$\begin{aligned}
\lim_{N\to\infty}\left(\frac{1}{N}\log W\right) &= \frac{1}{N}\left(N\log N - \sum_{i=1}^{m} Np_i \log(Np_i)\right) \\
&= \log N - \sum_{i=1}^{m} p_i \log(Np_i) \\
&= \log N - \log N \sum_{i=1}^{m} p_i - \sum_{i=1}^{m} p_i \log p_i \\
&= \left(1 - \sum_{i=1}^{m} p_i\right)\log N - \sum_{i=1}^{m} p_i \log p_i \\
&= -\sum_{i=1}^{m} p_i \log p_i \\
&= H(\mathbf{p}).
\end{aligned}$$

All that remains for the protagonist to do is to maximize entropy under the constraints of their testable information. They have found that the maximum entropy distribution is the most probable of all "fair" random distributions, in the limit as the probability levels go from discrete to continuous.

Giffin and Caticha (2007) state that Bayes' theorem and the principle of maximum entropy are completely compatible and can be seen as special cases of the "method of maximum relative entropy". They state that this method reproduces every aspect of orthodox Bayesian inference methods. In addition, this new method opens the door to tackling problems that could not be addressed by either the maximum entropy principle or orthodox Bayesian methods individually. Moreover, recent contributions (Lazar 2003; Schennach 2005) show that frequentist relative-entropy-based inference approaches (such as empirical likelihood and exponentially tilted empirical likelihood; see e.g. Owen 2001 and Kitamura 2006) can be combined with prior information to perform Bayesian posterior analysis. Jaynes stated that Bayes' theorem was a way to calculate a probability, while maximum entropy was a way to assign a prior probability distribution.
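The limit above can be checked numerically. Here is a small sketch (an illustration, not part of the derivation) showing $\frac{1}{N}\log W$ converging to $H(\mathbf{p})$ as $N$ grows, using the log-gamma function to evaluate the factorials stably.

```python
# Numerical check: (1/N) * log W approaches H(p) as N -> infinity.
import numpy as np
from scipy.special import gammaln

def log_W(counts):
    # log W = log N! - sum_i log n_i!, via gammaln(k + 1) = log k!
    N = counts.sum()
    return gammaln(N + 1) - gammaln(counts + 1).sum()

p = np.array([0.5, 0.3, 0.2])
H = -np.sum(p * np.log(p))  # Shannon entropy, ~1.0297

for N in (10**2, 10**4, 10**6):
    counts = np.round(N * p).astype(int)  # n_i = N * p_i
    print(N, log_W(counts) / N, H)        # first value converges to H(p)
```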
It is, however, possible in principle to solve for a posterior distribution directly from a stated prior distribution using the principle of minimum cross-entropy (or the principle of maximum entropy, which is the special case of using a uniform distribution as the given prior), independently of any Bayesian considerations, by treating the problem formally as a constrained optimization problem with the entropy functional as the objective function. For the case of given average values as testable information (averaged over the sought-after probability distribution), the sought-after distribution is formally the Gibbs (or Boltzmann) distribution, whose parameters must be solved for in order to achieve minimum cross-entropy and satisfy the given testable information; a small numerical sketch of this construction closes the section below.

Relevance to physics

The principle of maximum entropy bears a relation to a key assumption of the kinetic theory of gases known as molecular chaos or Stosszahlansatz. This asserts that the distribution function characterizing particles entering a collision can be factorized. Though this statement can be understood as a strictly physical hypothesis, it can also be interpreted as a heuristic hypothesis regarding the most probable configuration of particles before colliding.
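Returning to the minimum cross-entropy construction discussed before the physics remark: the following hedged sketch finds the Gibbs-form distribution $p_i \propto q_i \exp(\lambda f_i)$ matching a single average-value constraint over discrete states. The states, prior, observable, and target mean are all illustrative assumptions.

```python
# Hedged sketch: minimum cross-entropy relative to a prior q, given one
# average-value constraint E_p[f] = F; the solution has the Gibbs form.
import numpy as np
from scipy.optimize import brentq

x = np.arange(6)          # discrete states 0..5 (illustrative)
q = np.full(6, 1.0 / 6)   # prior: uniform (reduces to plain maximum entropy)
f, F = x, 3.5             # observable f(x) = x, prescribed mean 3.5

def mean_under_p(lam):
    w = q * np.exp(lam * f)  # unnormalized Gibbs weights q_i exp(lam f_i)
    p = w / w.sum()          # normalize by the partition function
    return np.dot(f, p)

# Solve for the multiplier that satisfies the testable information.
lam_star = brentq(lambda lam: mean_under_p(lam) - F, -20.0, 20.0)
w = q * np.exp(lam_star * f)
p = w / w.sum()
print(p, np.dot(f, p))  # Gibbs distribution with mean ~3.5
```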