Adding a remote repository
Check the status of your files
Use `short` status to make output more compact
How to write a git commit message
Ignore files in git
Bhiksha (bhikṣā; bhikkhā) is a term used in Indic religions, such as Jainism, Buddhism and Hinduism, to refer to the act of asking for alms. Commonly, it is also used to refer to food obtained by asking for alms. Buddhism In Buddhism, bhiksha takes on the form of the monastic almsround (piṇḍacāra), during which monks make themselves available to the laity to receive alms food (piṇḍapāta). Hinduism Bhiksha signifies a Hindu tradition of asking for alms with the purpose of self-effacement or ego-conquering. Other forms of giving and asking include dakshina (offering a gift to the guru) and dāna (an unreciprocated gift to someone in need). Usually, bhiksha is the meal served to a sadhu, sanyasi or monk when that person visits a devout Hindu household. Occasionally, bhiksha has also referred to donations of gold, cattle, and even land, given to Brahmins in exchange for karmakanda. It is also given by disciples to a guru as an offering. Bhiksha is incorporated into religious rituals as well, a prominent one being the bhikshacharanam, which includes begging for alms. In such a ritual, a Brahmin who has completed his rite-of-passage ceremony must beg for alms, stating, "". The concept of a deity or being seeking bhiksha occurs in Hindu literature such as the Ramayana. In this epic, in order to lure Sita out of her hermitage, Ravana disguises himself as a mendicant begging for alms. When she subsequently offers him bhiksha, he abducts her to Lanka upon his pushpaka vimana. See also Dāna References Hindu monasticism Hindu traditions Religious food and drink Alms in Hinduism
The High Commission of Namibia in London is the diplomatic mission of Namibia in the United Kingdom. In 2012 a protest was held outside the High Commission by the conservation group Earthrace against the practice of seal culling in the country. Gallery References External links Official site Namibia Diplomatic missions of Namibia Namibia and the Commonwealth of Nations Namibia–United Kingdom relations Buildings and structures in the City of Westminster Marylebone United Kingdom and the Commonwealth of Nations
```python
# your_sha256_hash___________
#
#  Pyomo: Python Optimization Modeling Objects
#  National Technology and Engineering Solutions of Sandia, LLC
#  Under the terms of Contract DE-NA0003525 with National Technology and
#  Engineering Solutions of Sandia, LLC, the U.S. Government retains certain
#  rights in this software.
# your_sha256_hash___________

from pyomo.core import *


def pipe_rule(pipe, id):
    model = pipe.model()
    pipe.length = Param(within=NonNegativeReals)
    pipe.flow = Var()
    pipe.pIn = Var(within=NonNegativeReals)
    pipe.pOut = Var(within=NonNegativeReals)
    # Pressure drop across the pipe is proportional to its length and flow.
    pipe.pDrop = Constraint(
        expr=pipe.pIn - pipe.pOut == model.friction * pipe.length * pipe.flow
    )
    pipe.IN = Connector()
    pipe.IN.add(-1 * pipe.flow, "flow")
    pipe.IN.add(pipe.pIn, "pressure")
    pipe.OUT = Connector()
    pipe.OUT.add(pipe.flow, "flow")
    pipe.OUT.add(pipe.pOut, "pressure")


def node_rule(node, id):
    def _mass_balance(model, node, flows):
        return node.demand == sum_product(flows)

    node.demand = Param(within=Reals, default=0)
    node.flow = VarList()
    node.pressure = Var(within=NonNegativeReals)
    node.port = Connector()
    # node.port.add( node.flow,
    #                aggregate=lambda m,n,v: m.demands[id] == sum_product(v) )
    node.port.add(node.flow, aggregate=_mass_balance)
    node.port.add(node.pressure)


def _src_rule(model, pipe):
    return model.nodes[value(model.pipe_links[pipe, 1])].port == model.pipes[pipe].IN


def _sink_rule(model, pipe):
    return model.nodes[value(model.pipe_links[pipe, 2])].port == model.pipes[pipe].OUT


model = AbstractModel()
model.PIPES = Set()
model.NODES = Set()
model.friction = Param(within=NonNegativeReals)
# pipe_links maps each pipe to its (source, sink) node ids, indexed 1..2
model.pipe_links = Param(model.PIPES, RangeSet(2))
model.pipes = Block(model.PIPES, rule=pipe_rule)
model.nodes = Block(model.NODES, rule=node_rule)

# Connect the network
model.network_src = Constraint(model.PIPES, rule=_src_rule)
model.network_sink = Constraint(model.PIPES, rule=_sink_rule)


# Solve so the minimum pressure in the network is 0
def _obj(model):
    return sum(model.nodes[n].pressure for n in model.NODES)


model.obj = Objective(rule=_obj)
```
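The `_mass_balance` aggregate above asserts that a node's demand equals the sum of the signed flows connected to it (the `IN` connector contributes `-1 * flow`). Independently of Pyomo, that condition can be illustrated in plain Python; the function name here is a hypothetical stand-in, not part of the model:

```python
# Illustrative check of the node mass-balance rule used by the aggregate:
# demand == sum of signed pipe flows (inflow positive, outflow negative).
def mass_balance_holds(demand, flows, tol=1e-9):
    """Return True when the node's demand matches its net inflow."""
    return abs(demand - sum(flows)) <= tol

# A node demanding 5 units, fed by two pipes (3 in, 4 in) and draining 2:
print(mass_balance_holds(5.0, [3.0, 4.0, -2.0]))  # True
```

In the model itself, this equation is generated per node once the connectors are expanded into constraints.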
```cpp
#pragma once

#include <Core/Block.h>
#include <Interpreters/Context.h>
#include <Dictionaries/IDictionarySource.h>
#include <Dictionaries/DictionaryStructure.h>
#include <Processors/Sources/ShellCommandSource.h>

namespace DB
{

/** ExecutablePoolDictionarySource allows loading data from a pool of processes.
  * When a client requests ids or keys, the source takes a process from the
  * ProcessPool and creates a stream, based on the source format, from the
  * process's stdout. It is important that the stream format expects only the
  * rows that were requested. When the stream is finished, the process is
  * returned to the ProcessPool. If there are no processes in the pool during
  * a request, the client is blocked until some process is returned to the pool.
  */
class ExecutablePoolDictionarySource final : public IDictionarySource
{
public:
    struct Configuration
    {
        String command;
        std::vector<String> command_arguments;
        bool implicit_key;
    };

    ExecutablePoolDictionarySource(
        const DictionaryStructure & dict_struct_,
        const Configuration & configuration_,
        Block & sample_block_,
        std::shared_ptr<ShellCommandSourceCoordinator> coordinator_,
        ContextPtr context_);

    ExecutablePoolDictionarySource(const ExecutablePoolDictionarySource & other);
    ExecutablePoolDictionarySource & operator=(const ExecutablePoolDictionarySource &) = delete;

    QueryPipeline loadAll() override;

    /** The logic of this method is flawed, absolutely incorrect and ignorant.
      * It may lead to skipping some values due to clock sync or timezone changes.
      * The intended usage of "update_field" is totally different.
      */
    QueryPipeline loadUpdatedAll() override;

    QueryPipeline loadIds(const std::vector<UInt64> & ids) override;

    QueryPipeline loadKeys(const Columns & key_columns, const std::vector<size_t> & requested_rows) override;

    bool isModified() const override;

    bool supportsSelectiveLoad() const override;

    bool hasUpdateField() const override;

    DictionarySourcePtr clone() const override;

    std::string toString() const override;

    QueryPipeline getStreamForBlock(const Block & block);

private:
    const DictionaryStructure dict_struct;
    const Configuration configuration;

    Block sample_block;
    std::shared_ptr<ShellCommandSourceCoordinator> coordinator;
    ContextPtr context;
    LoggerPtr log;
};

}
```
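The pooling behaviour the header comment describes — borrow a process, stream from it, return it, and block when the pool is empty — can be sketched with a blocking queue. This is a Python illustration of the idea only, not ClickHouse code; the class and method names are invented:

```python
import queue

class ProcessPool:
    """Toy pool: borrow a worker, use it, return it. Borrowing blocks while
    the pool is empty, mirroring the behaviour described for
    ExecutablePoolDictionarySource."""

    def __init__(self, workers):
        self._pool = queue.Queue()
        for w in workers:
            self._pool.put(w)

    def borrow(self, timeout=None):
        # queue.Queue.get blocks until an item is available (or timeout).
        return self._pool.get(timeout=timeout)

    def give_back(self, worker):
        self._pool.put(worker)

pool = ProcessPool(["proc-1", "proc-2"])
w = pool.borrow()
# ... feed the requested keys to the process and read only those rows back ...
pool.give_back(w)
```

A real implementation would hold subprocess handles rather than strings, but the borrow/return discipline is the same.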
Vukan Vuletić () (born January 21, 1973) is a Serbian diver. He competed at the 1996 Summer Olympics in Atlanta, finishing 37th in the Men's Platform event. Vuletić is the president of Čukarički Diving Club and a coach there. He ended his competitive career in 2000, after 20 years of competing. References External links Vukan Vuletić - Sports-Reference.com 1973 births Living people Serbian male divers Olympic divers for Serbia and Montenegro Divers at the 1996 Summer Olympics Place of birth missing (living people)
Use hosted scripts to increase performance
Navigating the browser history
ProgressEvent
Geolocation
Drag and Drop API
```xml
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="path_to_url"
    package="me.ele.app.amigo">

    <application>
        <meta-data
            android:name="data_key"
            android:value="AAA" />
    </application>
</manifest>
```
Jamaica competed at the 1972 Summer Olympics in Munich, West Germany. 33 competitors, 21 men and 12 women, took part in 27 events in 6 sports. Medalists Athletics Men's 800 metres Byron Dyce Heat — 1:48.0 (→ did not advance) Men's 1500 metres Byron Dyce Heat — 3:45.9 (→ did not advance) Men's 4 × 100 m Relay Michael Fray, Donald Quarrie, Lennox Miller, and Horace Levy Heat — DNS (→ did not advance) Boxing Cycling Six cyclists represented Jamaica in 1972. Individual road race Howard Fenton — did not finish (→ no ranking) Michael Lecky — did not finish (→ no ranking) Radcliffe Lawrence — did not finish (→ no ranking) Xavier Mirander — did not finish (→ no ranking) Sprint Honson Chin Maurice Hugh-Sam 1000m time trial Howard Fenton Final — 1:12.64 (→ 26th place) Tandem Honson Chin and Howard Fenton → 11th place Individual pursuit Michael Lecky Team pursuit Radcliffe Lawrence Howard Fenton Maurice Hugh-Sam Michael Lecky Diving Women's 3m Springboard Betsy Sullivan — 210.39 points (→ 29th place) Sailing Swimming References External links Official Olympic Reports International Olympic Committee results database Nations at the 1972 Summer Olympics 1972 Summer Olympics 1972 in Jamaican sport
Luís Vázquez Fernández-Pimentel (b. Lugo, 18 December 1895 – d. Lugo, 13 February 1958) was a Galician poet. Galician Literature Day was dedicated to him in 1990. Work Triscos (1950). Colección Benito Soto. Sombra do aire na herba (1959). Galaxia. Barco sin luces (1960). Poesía enteira (1981). Edicións Xerais de Galicia. Luis Pimentel. Obra inédita o no recopilada (1981). Celta. Poesía galega (1989). Edicións Xerais de Galicia. Poesías completas, de Luís Pimentel (1990). Comares. Luís Pimentel. Obra completa (2009). Galaxia. References Bibliography Agustín Fernández, S. (2007). "Luis Pimentel, poeta del abismo interior". Madrygal (10): 35–43. ISSN 1138-9664. Alonso Montero, X. (1990). Luís Pimentel, biografía da súa poesía. Do Cumio. Blanco, C. (2005). Extranjera en su patria. Cuatro poetas gallegos. Rosalía de Castro, Manuel Antonio, Luís Pimentel, Luz Pozo Garza. Círculo de Lectores / Galaxia Gutenberg. Carballo Calero, R. (1978). "Originales inéditos en castellano de poemas de Luis Pimentel publicados en gallego". 1916. Anuario de la Sociedad Española de Literatura General y Comparada (1): 68–83. —————— (1980). "Sobre la poesía de Luis Pimentel". 1916. Anuario de la Sociedad Española de Literatura General y Comparada (3): 41–50. Fernández de la Vega, C. (1983). "Vida e poesía de Luís Pimentel". Sombra do aire na herba. BBdCG. Galaxia. Fernández del Riego, F. (1971) [1951]. Historia da literatura galega (2ª ed.). Galaxia. pp. 221–223. —————— (1990). Diccionario de escritores en lingua galega. Do Castro. p. 450. Fernández Rodríguez, Manuel, ed. (2006). Poemas pola memoria. 1936-2006. Xunta de Galicia. García, J., ed. (2001). Poetas del Novecientos: entre el Modernismo y la Vanguardia: (Antología). Tomo I: De Fernando Fortún a Rafael Porlán. BSCH. pp. 182–201. Gómez, A.; Queixas, M. (2001). Historia xeral da literatura galega. A Nosa Terra. pp. 205–206. Herrero Figueroa, A. (1991). "Luis Pimentel, poeta hispánico". Turia (16): 123–146. ISSN 0213-4373.
—————— (1994). Sobre Luís Pimentel, Álvaro Cunqueiro e Carballo Calero: apontamentos de Filoloxía, Crítica e Didáctica da Literatura. Do Castro. —————— (2007). Unha cidade e un poeta (Lugo e Luís Pimentel). Deputación. López-Casanova, A. (1990). Luís Pimentel e Sombra do aire na herba. Ágora. Galaxia. Méndez Ferrín, X. L. (1984). De Pondal a Novoneyra. Xerais. pp. 52, 306. Pallarés, P. (1991). Rosas na sombra (a poesía de Luís Pimentel). Do Cumio. Piñeiro, R. (1958). "Luis Pimentel". Boletín da RAG (327-332): 180–183. ISSN 1576-8767. Piñeiro, Pozo, López-Casanova, Rodriguez and Murado (1990). Luís Pimentel, unha fotobiografía. Xerais. Sánchez Reboredo, J. (1989). El silencio y la música (Ensayo sobre la poesía de Pimentel) (in Spanish). Vilavedra, D., ed. (1995). "Vázquez Fernández, Luís". Diccionario da Literatura Galega. I. Autores. Galaxia. pp. 597–599. External links "Pimentel, Luis Vázquez Fernández (1895-1958)" Enciclopedia Universal Micronet. People from Lugo Galician poets 1895 births 1958 deaths Galician-language writers 20th-century Spanish poets Spanish male poets University of Santiago de Compostela alumni 20th-century Spanish male writers
```typescript
/*
 *
 * path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 */

import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';

import { Groups } from '../entity/Groups';
import { BaseService } from './base.service';

@Injectable({ providedIn: 'root' })
export class GroupsService extends BaseService<Groups> {
  constructor(private _httpClient: HttpClient) {
    super(_httpClient, '/access/groups');
  }
}
```
```java
/*
 *
 */
package io.debezium.pipeline.metrics;

import io.debezium.metrics.activity.ActivityMonitoringMXBean;
import io.debezium.pipeline.metrics.traits.ConnectionMetricsMXBean;
import io.debezium.pipeline.metrics.traits.StreamingMetricsMXBean;

/**
 * Metrics specific to streaming change event sources
 *
 * @author Randall Hauch, Jiri Pechanec
 */
public interface StreamingChangeEventSourceMetricsMXBean
        extends ChangeEventSourceMetricsMXBean, ConnectionMetricsMXBean,
                StreamingMetricsMXBean, ActivityMonitoringMXBean {
}
```
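The interface above carries no methods of its own; it only composes several narrow metric "trait" interfaces into one streaming-metrics contract. As a rough cross-language analogue, the same composition pattern can be sketched in Python with abstract base classes (all names below are invented stand-ins for the Java traits):

```python
from abc import ABC, abstractmethod

class ConnectionMetrics(ABC):
    """Stand-in for a narrow connection-metrics trait."""
    @abstractmethod
    def is_connected(self): ...

class StreamingMetrics(ABC):
    """Stand-in for a narrow streaming-metrics trait."""
    @abstractmethod
    def milliseconds_behind_source(self): ...

# Composition of the narrow traits into one contract, mirroring how the
# MXBean interface extends its trait interfaces without adding methods.
class StreamingChangeEventSourceMetrics(ConnectionMetrics, StreamingMetrics, ABC):
    pass

class SimpleMetrics(StreamingChangeEventSourceMetrics):
    def is_connected(self):
        return True

    def milliseconds_behind_source(self):
        return 0
```

Implementors of the combined contract must satisfy every trait, which is exactly what the Java `extends` list enforces for the MXBean.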
```php
<?php declare(strict_types=1);

// Generated by the protocol buffer compiler. DO NOT EDIT!
// source: src/Tracing/FederatedTracing/reports.proto

namespace Nuwave\Lighthouse\Tracing\FederatedTracing\Proto;

use Google\Protobuf\Internal\GPBType;
use Google\Protobuf\Internal\GPBUtil;

/**
 * Generated from protobuf message <code>ExtendedReferences</code>.
 */
class ExtendedReferences extends \Google\Protobuf\Internal\Message
{
    /** Generated from protobuf field <code>map<string, .InputTypeStats> input_types = 1 [json_name = "inputTypes"];</code> */
    private $input_types;

    /**
     * Map of enum name to stats about that enum.
     *
     * Generated from protobuf field <code>map<string, .EnumStats> enum_values = 2 [json_name = "enumValues"];</code>
     */
    private $enum_values;

    /**
     * Constructor.
     *
     * @param array $data {
     *     Optional. Data for populating the Message object.
     *
     *     @var array|\Google\Protobuf\Internal\MapField $input_types
     *     @var array|\Google\Protobuf\Internal\MapField $enum_values
     *           Map of enum name to stats about that enum.
     * }
     */
    public function __construct($data = null)
    {
        Metadata\Reports::initOnce();
        parent::__construct($data);
    }

    /**
     * Generated from protobuf field <code>map<string, .InputTypeStats> input_types = 1 [json_name = "inputTypes"];</code>.
     *
     * @return \Google\Protobuf\Internal\MapField
     */
    public function getInputTypes()
    {
        return $this->input_types;
    }

    /**
     * Generated from protobuf field <code>map<string, .InputTypeStats> input_types = 1 [json_name = "inputTypes"];</code>.
     *
     * @param array|\Google\Protobuf\Internal\MapField $var
     *
     * @return $this
     */
    public function setInputTypes($var)
    {
        $arr = GPBUtil::checkMapField($var, GPBType::STRING, GPBType::MESSAGE, InputTypeStats::class);
        $this->input_types = $arr;

        return $this;
    }

    /**
     * Map of enum name to stats about that enum.
     *
     * Generated from protobuf field <code>map<string, .EnumStats> enum_values = 2 [json_name = "enumValues"];</code>
     *
     * @return \Google\Protobuf\Internal\MapField
     */
    public function getEnumValues()
    {
        return $this->enum_values;
    }

    /**
     * Map of enum name to stats about that enum.
     *
     * Generated from protobuf field <code>map<string, .EnumStats> enum_values = 2 [json_name = "enumValues"];</code>
     *
     * @param array|\Google\Protobuf\Internal\MapField $var
     *
     * @return $this
     */
    public function setEnumValues($var)
    {
        $arr = GPBUtil::checkMapField($var, GPBType::STRING, GPBType::MESSAGE, EnumStats::class);
        $this->enum_values = $arr;

        return $this;
    }
}
```
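The generated setters funnel every assignment through `GPBUtil::checkMapField`, which validates key and value types before the map is stored. A minimal Python sketch of that validation step (the function name and behaviour here are illustrative, not the protobuf runtime's actual code):

```python
def check_map_field(mapping, key_type, value_type):
    """Validate that every key and value in a map has the expected type,
    in the spirit of GPBUtil::checkMapField for protobuf map fields."""
    for key, value in mapping.items():
        if not isinstance(key, key_type):
            raise TypeError(f"map key {key!r} is not a {key_type.__name__}")
        if not isinstance(value, value_type):
            raise TypeError(f"map value {value!r} is not a {value_type.__name__}")
    return mapping

class EnumStats:
    """Stand-in for the generated EnumStats message type."""

# Accepts a well-typed map, raises TypeError otherwise:
stats = check_map_field({"Color": EnumStats()}, str, EnumStats)
```

Centralizing the check in one helper is what lets every generated setter stay a two-line body.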
```xml
<?xml version="1.0" encoding="utf-8"?>
<resources>
    <style name="AppTheme.NoActionBar">
        <item name="windowActionBar">false</item>
        <item name="windowNoTitle">true</item>
        <item name="android:windowTranslucentStatus">true</item>
    </style>
</resources>
```
```python
#!/usr/bin/env python
# coding=utf-8
# author: zengyuetian
#

import unittest

from lib.utility.date import *


class DateTest(unittest.TestCase):
    def setUp(self):
        pass

    def tearDown(self):
        pass

    def test_time_string(self):
        self.assertEqual(len(get_time_string()), 14)

    def test_date_string(self):
        self.assertEqual(len(get_date_string()), 8)

    def test_year_string(self):
        self.assertEqual(len(get_year_month_string()), 6)


if __name__ == '__main__':
    unittest.main()
```
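The tests only pin down the lengths of the returned strings (14, 8 and 6 characters). The actual `lib.utility.date` implementations are not shown, but those lengths are consistent with `strftime` patterns like the following — a guess at their shape, not the real module:

```python
from datetime import datetime

def get_time_string():
    """Timestamp such as '20240131235959' — 14 chars (YYYYMMDDHHMMSS)."""
    return datetime.now().strftime("%Y%m%d%H%M%S")

def get_date_string():
    """Date such as '20240131' — 8 chars (YYYYMMDD)."""
    return datetime.now().strftime("%Y%m%d")

def get_year_month_string():
    """Year-month such as '202401' — 6 chars (YYYYMM)."""
    return datetime.now().strftime("%Y%m")
```

Any implementation with these output shapes would satisfy the three length assertions above.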
Dynasty Warriors Next is a hack and slash video game and a spin-off title of the Dynasty Warriors series of video games. Developed by Omega Force and published by Koei, it was released for the PlayStation Vita. Similar to the other games in the franchise, the game's plot follows that of the book Romance of the Three Kingdoms by Luo Guanzhong. The game was developed as a Vita launch title, and was released along with the console in all regions. Gameplay Dynasty Warriors Next is split into several scenarios where all stages are chosen from a map of China. The territories can be invaded in order to gain influence and gold for each owned region. The earned gold can be spent on stratagems, which are special boosts represented by the officers of a player's faction. They come with different bonuses: increasing attack and defense, boosting the aggression of the enemy's army, making the bases easier to seize, and others. The army can be equipped with items and weapons that are found on the battlefield, like buffs, enhancements or horses. Once the battle starts, the map is split between allied and enemy bases. Each of them has a special purpose, and benefits the side which controls it. A supply base will increase the power of all owned bases, and an armory can temporarily double the troops' attack. The lairs can spawn animal reinforcements in the form of bears, tigers or wolves, while the magical bases link themselves to other bases, making them invulnerable. A base can be captured by killing everything inside it, until the counter drops to zero. At any point, an unskippable one-on-one duel may begin. Similar to Infinity Blade, the player can block the attacks while tapping the flashing points to break the opponent's resistance, and finish them off. There are several game modes available. Campaign contains three story acts and is loosely based on Romance of the Three Kingdoms, with the purpose of introducing the basic concepts.
It serves as a series of battles in which the rival kingdoms vie for control of the land. The player usually gets to choose which officer to take in, except for key conflicts where the choice is restricted. In the Conquest mode, the main goal of taking over territories across China remains the same, but it also allows players to create their own army and officers. Edit Mode is used for creating or editing characters. The customization materials are unlocked by completing the Campaign parts, and every created character can be brought into Conquest afterwards. Conquest has an online version, where the game will collect data from other players to populate the battlefield. The player will then face off against other Edit Mode creations, in addition to the regular cast. The game makes use of the Vita's touch and gyroscope controls: tilting for aiming Musou attacks or marking enemies' weak points, and the touchscreen for blocking and deflecting projectiles. Reception Dynasty Warriors Next was met with average to mixed reception upon release; GameRankings gave it a score of 68.31%, while Metacritic gave it 67 out of 100. Notes References External links Official website of North America Dynasty Warriors PlayStation Vita games PlayStation Vita-only games Koei games 2011 video games Video games developed in Japan Video games set in China
```xml <?xml version="1.0" encoding="utf-8"?> <Project DefaultTargets="Build" ToolsVersion="4.0" xmlns="path_to_url"> <ItemGroup Label="ProjectConfigurations"> <ProjectConfiguration Include="Debug|Win32"> <Configuration>Debug</Configuration> <Platform>Win32</Platform> </ProjectConfiguration> <ProjectConfiguration Include="Debug|x64"> <Configuration>Debug</Configuration> <Platform>x64</Platform> </ProjectConfiguration> <ProjectConfiguration Include="Release|Win32"> <Configuration>Release</Configuration> <Platform>Win32</Platform> </ProjectConfiguration> <ProjectConfiguration Include="Release|x64"> <Configuration>Release</Configuration> <Platform>x64</Platform> </ProjectConfiguration> </ItemGroup> <PropertyGroup Label="Globals"> <ProjectGuid>{95A1571F-61BE-4C51-BE53-2F2DAB280685}</ProjectGuid> <RootNamespace>gresource</RootNamespace> <Keyword>Win32Proj</Keyword> </PropertyGroup> <Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" /> <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration"> <ConfigurationType>Application</ConfigurationType> <CharacterSet>MultiByte</CharacterSet> <WholeProgramOptimization>true</WholeProgramOptimization> <PlatformToolset>v120</PlatformToolset> </PropertyGroup> <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration"> <ConfigurationType>Application</ConfigurationType> <CharacterSet>MultiByte</CharacterSet> <PlatformToolset>v120</PlatformToolset> </PropertyGroup> <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="Configuration"> <ConfigurationType>Application</ConfigurationType> <CharacterSet>MultiByte</CharacterSet> <WholeProgramOptimization>true</WholeProgramOptimization> <PlatformToolset>v120</PlatformToolset> </PropertyGroup> <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="Configuration"> <ConfigurationType>Application</ConfigurationType> <CharacterSet>MultiByte</CharacterSet> 
<PlatformToolset>v120</PlatformToolset> </PropertyGroup> <Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" /> <ImportGroup Label="ExtensionSettings"> </ImportGroup> <ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="PropertySheets"> <Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" /> <Import Project="glib-build-defines.props" /> </ImportGroup> <ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="PropertySheets"> <Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" /> <Import Project="glib-build-defines.props" /> </ImportGroup> <ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="PropertySheets"> <Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" /> <Import Project="glib-build-defines.props" /> </ImportGroup> <ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="PropertySheets"> <Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" /> <Import Project="glib-build-defines.props" /> </ImportGroup> <PropertyGroup Label="UserMacros" /> <PropertyGroup> <LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">true</LinkIncremental> <LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">true</LinkIncremental> <LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">false</LinkIncremental> <LinkIncremental Condition="'$(Configuration)|$(Platform)'=='Release|x64'">false</LinkIncremental> </PropertyGroup> <ItemDefinitionGroup 
Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'"> <ClCompile> <Optimization>Disabled</Optimization> <AdditionalIncludeDirectories>..\..\..\gmodule;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories> <PreprocessorDefinitions>_DEBUG;%(PreprocessorDefinitions)</PreprocessorDefinitions> <MinimalRebuild>true</MinimalRebuild> <BasicRuntimeChecks>EnableFastChecks</BasicRuntimeChecks> <RuntimeLibrary>MultiThreadedDebugDLL</RuntimeLibrary> <PrecompiledHeader> </PrecompiledHeader> <WarningLevel>Level3</WarningLevel> <DebugInformationFormat>EditAndContinue</DebugInformationFormat> </ClCompile> <Link> <GenerateDebugInformation>true</GenerateDebugInformation> <SubSystem>Console</SubSystem> <TargetMachine>MachineX86</TargetMachine> </Link> </ItemDefinitionGroup> <ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'"> <ClCompile> <Optimization>Disabled</Optimization> <AdditionalIncludeDirectories>..\..\..\gmodule;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories> <PreprocessorDefinitions>_DEBUG;%(PreprocessorDefinitions)</PreprocessorDefinitions> <MinimalRebuild>true</MinimalRebuild> <BasicRuntimeChecks>EnableFastChecks</BasicRuntimeChecks> <RuntimeLibrary>MultiThreadedDebugDLL</RuntimeLibrary> <PrecompiledHeader> </PrecompiledHeader> <WarningLevel>Level3</WarningLevel> <DebugInformationFormat>ProgramDatabase</DebugInformationFormat> <CompileAs>CompileAsC</CompileAs> </ClCompile> <Link> <GenerateDebugInformation>true</GenerateDebugInformation> <SubSystem>Console</SubSystem> <RandomizedBaseAddress>false</RandomizedBaseAddress> <DataExecutionPrevention> </DataExecutionPrevention> <TargetMachine>MachineX64</TargetMachine> </Link> </ItemDefinitionGroup> <ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'"> <ClCompile> <Optimization>MaxSpeed</Optimization> <IntrinsicFunctions>true</IntrinsicFunctions> 
<AdditionalIncludeDirectories>..\..\..\gmodule;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories> <PreprocessorDefinitions>%(PreprocessorDefinitions)</PreprocessorDefinitions> <RuntimeLibrary>MultiThreadedDLL</RuntimeLibrary> <FunctionLevelLinking>true</FunctionLevelLinking> <PrecompiledHeader> </PrecompiledHeader> <WarningLevel>Level3</WarningLevel> <DebugInformationFormat>ProgramDatabase</DebugInformationFormat> </ClCompile> <Link> <GenerateDebugInformation>true</GenerateDebugInformation> <SubSystem>Console</SubSystem> <OptimizeReferences>true</OptimizeReferences> <EnableCOMDATFolding>true</EnableCOMDATFolding> <TargetMachine>MachineX86</TargetMachine> </Link> </ItemDefinitionGroup> <ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'"> <ClCompile> <AdditionalIncludeDirectories>..\..\..\gmodule;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories> <PreprocessorDefinitions>%(PreprocessorDefinitions)</PreprocessorDefinitions> <RuntimeLibrary>MultiThreadedDLL</RuntimeLibrary> <PrecompiledHeader> </PrecompiledHeader> <WarningLevel>Level3</WarningLevel> <DebugInformationFormat>ProgramDatabase</DebugInformationFormat> <CompileAs>CompileAsC</CompileAs> </ClCompile> <Link> <GenerateDebugInformation>true</GenerateDebugInformation> <SubSystem>Console</SubSystem> <OptimizeReferences>true</OptimizeReferences> <EnableCOMDATFolding>true</EnableCOMDATFolding> <RandomizedBaseAddress>false</RandomizedBaseAddress> <DataExecutionPrevention> </DataExecutionPrevention> <TargetMachine>MachineX64</TargetMachine> </Link> </ItemDefinitionGroup> <ItemGroup> <ClCompile Include="..\..\..\gio\gresource-tool.c" /> </ItemGroup> <ItemGroup> <ProjectReference Include="gio.vcxproj"> <Project>{f3d1583c-5613-4809-bd98-7cc1c1276f92}</Project> <ReferenceOutputAssembly>false</ReferenceOutputAssembly> </ProjectReference> <ProjectReference Include="glib.vcxproj"> <Project>{12bca020-eabf-429e-876a-a476bc9c10c0}</Project> 
<ReferenceOutputAssembly>false</ReferenceOutputAssembly> </ProjectReference> <ProjectReference Include="gobject.vcxproj"> <Project>{f172effc-e30f-4593-809e-db2024b1e753}</Project> <ReferenceOutputAssembly>false</ReferenceOutputAssembly> </ProjectReference> </ItemGroup> <Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" /> <ImportGroup Label="ExtensionTargets"> </ImportGroup> </Project> ```
Zu Yue (祖約) (after 266 - March to April 330), courtesy name Shishao, was a Chinese military general and warlord of the Jin dynasty. He was the younger brother of the famed Jin general Zu Ti, who marched north to reclaim lost lands from the barbarians. After Ti's death in 321, Zu Yue succeeded him but was said to have lacked his talents. In 327, dissatisfied with his treatment by the Jin court, he joined forces with Su Jun and took over the capital. However, he was defeated by loyalist forces in 329 and fled to Later Zhao, where he and his family were executed by Shi Le. Career under the Jin dynasty Zu Yue hailed from Qiuxian county, Fanyang commandery, and was the younger brother of Zu Ti, with whom he had a friendly relationship. In his youth, Zu Yue received the title of "Xiaolian (孝廉; Filial and Incorrupt)" and worked as the Magistrate of Chenggao County. After the Disaster of Yongjia in 311, he followed his brother south to join Sima Rui. There, he served in a handful of offices, such as Attendant Officer of the Household Gentlemen, and was said to be as famous as Ruan Fu (阮孚) of Chenliu. Despite his respectable career, Zu Yue nearly landed himself in trouble due to his marital problems at home. Zu Yue had a very jealous wife who was so suspicious of him that Zu Yue feared her. One night, Zu Yue was suddenly injured by someone, and he suspected that this was his wife's doing. Zu Yue begged Sima Rui to allow him to resign but was rejected, so Zu instead abandoned his post. The Minister of Justice, Liu Wei (劉隗), wanted to execute him for his negligence, but Sima Rui prevented him from doing so. While Zu Ti won merits in his northern expedition, Zu Yue also benefited from promotions back home. After Zu Ti died in 321, Zu Yue was chosen to take over his army as General Who Pacifies The West and Inspector of Yuzhou.
His half-brother, Zu Na (祖納), warned Sima Rui that giving his brother that much power would lead to rebellion, but Na was ignored, as he was noted to be notoriously jealous of Yue. Zu Yue found his new position difficult to hold, as his inability to impose discipline and his poor relations with his brother's generals made him very unpopular among his men. Shortly after Zu Yue's appointment, Later Zhao forces quickly retook lands that they had lost to Zu Ti. Zu Yue failed to hold out and lost Xiangcheng, Chengfu (城父, in present-day Bozhou, Anhui) and Chenliu as a result. In 324, Zu Yue joined the loyalist side during Wang Dun's second insurrection against Jin after he was summoned to the capital by Emperor Ming. Zu Yue drove out Wang's Administrator of Huainan, Ren Tai (任台), at Shouyang (壽陽, in present-day Lu'an, Anhui). After the death of Emperor Ming the following year, his brother-in-law Yu Liang became the regent for his nephew, Emperor Cheng of Jin. Zu Yue saw himself as an independent warlord and wished to exercise his own authority over his holdings. He had hoped that the new government would give him the privilege of handing out offices to his subordinates, much like his contemporaries, but this did not happen. He soon sent multiple petitions demanding it, but they were either rejected or ignored. Even worse, when an imperial edict promoting ministers was declared, he, along with Tao Kan, was left out of the edict, and all this caused Zu Yue to suspect that Yu Liang was purposefully snubbing him. In 326, Zu Yue was attacked by Later Zhao forces under Shi Cong (石聰) at Shouchun. Zu sent petitions to Jiankang asking for help, but none came. The court only considered action when Shi Cong attacked Junqiu (浚遒, in present-day Feidong County, Anhui) and Fuling (阜陵; in present-day Quanjiao County, Anhui). However, before reinforcements could be sent, the warlord Su Jun sent his general Han Huang first and repelled Shi Cong.
Zu Yue's relationship with the court deteriorated even further when he heard of the court's plan to build a defensive dyke. The dyke would cut him off from the capital, leaving him isolated in the face of a future invasion. Su Jun's Rebellion The next year, Su Jun rebelled against the Jin dynasty. Su Jun knew of Zu Yue's grudge against Yu Liang and the government, so he invited him to join forces. Zu Yue was delighted, and sent his nephew Zu Huan and brother-in-law Xu Liu to aid Su Jun in capturing Jiankang. Huan Xuan and Zu Ti's widow attempted to discourage Zu Yue from joining Su Jun, but Zu Yue refused to listen. When Su Jun took over the capital in 328, he appointed Zu Yue Palace Attendant, Grand Commandant, and Prefect of the Masters of Writing. While the rebellion raged on in the south, Later Zhao attacked Zu Yue at Huaishang. One of Zu Yue's generals, Chen Guang (陳光), betrayed and attacked him. Zu Yue's attendant and look-alike, Yan Tu (閻禿), impersonated him while the real Zu Yue secretly escaped the city in the night. The Jin general Wen Jiao issued a call to arms against Su Jun and Zu Yue. Many loyalists rose up against them and gathered around the capital's region. While Wen Jiao was at the Qiezi river mouth (茄子浦, in present-day Nanjing, Jiangsu), his subordinate Mao Bao went against his orders and successfully attacked a shipment of rice that Su Jun was sending to Zu Yue, leaving Zu and his men starving without food. Later, Zu Yue sent his generals Zu Huan (祖渙) and Huan Fu (桓撫) to attack Penkou (湓口, in present-day Jiujiang, Jiangxi). They managed to defeat Mao Bao at first, but he then returned to drive them off. Mao Bao proceeded to attack and capture Zu Yue's camps in Hefei. With his deteriorating relationship with his staff and the mounting defeats, Zu Yue's generals plotted with Later Zhao to kill him. Shi Cong and Shi Kan (石堪) attacked Zu Yue at Shouchun and his forces scattered, causing him to flee to Liyang.
Su Jun was killed in battle in late 328 and was succeeded by his brother Su Yi (蘇逸). The situation for Zu Yue continued to worsen as the loyalist Zhao Yin (趙胤) attacked his base the next year. While his general Gan Miao (甘苗) fought Zhao Yin, Zu Yue secretly fled to Later Zhao with his family and followers. Gan Miao later surrendered to Zhao Yin, ending Zu Yue's part in the rebellion. Su Yi and the rest of the rebels were destroyed later that year. Flight to Later Zhao and death Although Zu Yue was under Zhao's protection, its emperor, Shi Le, secretly despised him. His advisor Cheng Xia and general Yao Yizhong shared his sentiment and advised him to kill Zu Yue before he could rebel, citing the precedent of Liu Bang killing Ding Gong despite Ding having once saved Liu's life. Shi Le thus hosted a banquet for Zu Yue and his followers with the intention of trapping them there. At the banquet, Zu Yue soon realized that he had fallen for Shi Le's ruse and drank heavily. Zu Yue and his followers were then arrested and brought to the marketplace to be executed. Before he died, Zu Yue was said to have wept while holding his grandsons. The men were executed while the women were distributed among the tribes in Zhao. Only his nephew Zu Xian (祖羡; original name Zu Daozhong (祖道重)) survived, thanks to the help of Zu Ti's former slave turned Later Zhao general, Wang An (王安). References Fang, Xuanling (ed.) (648). Book of Jin (Jin Shu). Sima, Guang (1084). Zizhi Tongjian. 330 deaths Jin dynasty (266–420) rebels Generals from Hebei Jin dynasty (266–420) generals Executed Jin dynasty (266–420) people
Vincent Mangematin (born 1965) is a French researcher and professor of management, specializing in strategy, the strategic management of innovation, and technology management. He is currently a professor and the scientific director at Grenoble Ecole de Management. Biography Mangematin received his PhD in Administrative Sciences from Université Paris IX Dauphine in 1993 and is an ENS Cachan alumnus. His thesis, entitled « Recherche coopérative et stratégie de normalisation » ("Cooperative research and standardization strategy"), focuses on technological competition processes. He has worked for INRA Grenoble as a member of the scientific board of its department of economics. His research concerns procedures for the transfer of technology between academia and industry. In 2000, he was a dean at INRA, before joining Grenoble Ecole de Management. He has been a senior professor and the scientific director at GEM since 2010. He has been a visiting professor at many universities (Université du Québec, Georgia Institute of Technology, Chalmers University, Cass Business School) as well as at ESSEC Business School, and he is also an associate professor at the Dublin Institute of Technology. Work Mangematin's main research areas are the strategic management of innovation, the role of user communities in innovation processes, the globalization of business education, and institutionalization processes in emerging and changing environments. Presentation Mangematin's research focuses on innovation processes in situations of technological competition. He has successively worked on technology transfer, especially the movement of individuals between organisations and technology platforms, on the creation and growth of high-tech startups, on the influence of clusters, and on the emergence of new business models. 
His theoretical perspectives rest on two key pillars: on the one hand, the renewal of strategic approaches through the business model concept, and on the other, the roles of visibility, recognition and reputation in the knowledge economy, especially in artistic fields (architecture), in scientific research and in business schools. Mangematin also analyses the conditions under which innovation dynamics change in different industries: nanotechnology, biotechnology, the cultural industries and business education. Mangematin received a publication award in 2008 from the IAMOT (International Association for Management of Technology), naming him one of the top 50 authors in technology and innovation management over the preceding five years, based on a quantitative analysis of research published between 2003 and 2007. Creation and circulation of knowledge Mangematin has explored various strategies of knowledge creation and transmission within and amongst knowledge-based organisations. Drawing on the examples of biotechnology and nanotechnology, he has worked on the evolution of firms within nascent industries. He has concentrated on the following topics: knowledge integration within project-based firms along the industry life cycle, the coupling between financial and scientific resources within high-tech companies, and the spatial organisation of economic, technological and scientific activities in emerging sectors where the sharing of research facilities is required. Mangematin has been researching the ambiguous underlying roles of scientific and human capital in high-tech organizations. His research on the geography of innovation showed that: Clusters foster the creation of firms but not their growth Cluster policy based on growth generates diseconomies of scale, higher competition amongst local actors and high pressure on resources. 
Cluster policies based on eliteness and visibility are successful Governance matters In close cooperation with Esther Tippmann and Pamela Sharkey-Scott, he has also been analysing the construction and circulation of knowledge by middle managers in multinational firms. Technology transfer and management of public research organisations Technology transfer strategies typically rely on research alliances or on patenting and licensing, but Mangematin focuses on alternative means of technology transfer, such as the mobility of human resources from academia to industry or the creation of firms. His research treats individuals as central technology transfer mechanisms, especially mid-career researchers around 40–50 years old. He also focuses on the management of shared research facilities as a mode of technology transfer. Selected publications Articles, a selection: Mangematin, Vincent, and Lionel Nesta. "What kind of knowledge can a firm absorb?" International Journal of Technology Management 18.3 (1999): 149-172. Mangematin, Vincent. "PhD job market: professional trajectories and incentives during the PhD." Research Policy 29.6 (2000): 741-756. Mangematin, V., Lemarié, S., Boissin, J. P., Catherine, D., Corolleur, F., Coronini, R., & Trommetter, M. (2003). "Development of SMEs and heterogeneity of trajectories: the case of biotechnology in France." Research Policy, 32(4), 621-638. References External links Vincent Mangematin presentation on the Grenoble Ecole de Management website 1965 births Living people French business theorists Academics of Bayes Business School
```javascript
const ENS = artifacts.require("./ENSRegistry.sol");
const FIFSRegistrar = artifacts.require('./FIFSRegistrar.sol');

// Currently the parameter ('./ContractName') is only used to imply
// the compiled contract JSON file name. So even though `Registrar.sol`
// does not exist, it is still valid to put it here.
// TODO: align the contract name with the source code file name.

const web3 = new (require('web3'))();
const namehash = require('eth-ens-namehash');

/**
 * Calculate the root node hashes given the top level domain (tld)
 *
 * @param {string} tld plain text tld, for example: 'eth'
 */
function getRootNodeFromTLD(tld) {
  return {
    namehash: namehash.hash(tld),
    sha3: web3.sha3(tld)
  };
}

/**
 * Deploy the ENS and FIFSRegistrar
 *
 * @param {Object} deployer truffle deployer helper
 * @param {string} tld tld which the FIFS registrar takes charge of
 */
function deployFIFSRegistrar(deployer, tld) {
  var rootNode = getRootNodeFromTLD(tld);

  // Deploy the ENS first
  deployer.deploy(ENS)
    .then(() => {
      // Deploy the FIFSRegistrar and bind it with ENS
      return deployer.deploy(FIFSRegistrar, ENS.address, rootNode.namehash);
    })
    .then(function() {
      // Transfer the owner of the `rootNode` to the FIFSRegistrar
      return ENS.at(ENS.address).then((c) => c.setSubnodeOwner('0x0', rootNode.sha3, FIFSRegistrar.address));
    });
}

module.exports = function(deployer, network) {
  var tld = 'eth';

  if (network === 'dev.fifs') {
    deployFIFSRegistrar(deployer, tld);
  }
};
```
```c /*your_sha256_hash-------- * * geqo_mutation.c * * TSP mutation routines * * src/backend/optimizer/geqo/geqo_mutation.c * *your_sha256_hash--------- */ /* contributed by: =*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*= * Martin Utesch * Institute of Automatic Control * = = University of Mining and Technology = * utesch@aut.tu-freiberg.de * Freiberg, Germany * =*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*= */ /* this is adopted from Genitor : */ /*************************************************************/ /* */ /* Darrell L. Whitley */ /* Computer Science Department */ /* Colorado State University */ /* */ /* Permission is hereby granted to copy all or any part of */ /* this program for free distribution. The author's name */ /* and this copyright notice must be included in any copy. */ /* */ /*************************************************************/ #include "postgres.h" #include "optimizer/geqo_mutation.h" #include "optimizer/geqo_random.h" #if defined(CX) /* currently used only in CX mode */ void geqo_mutation(PlannerInfo *root, Gene *tour, int num_gene) { int swap1; int swap2; int num_swaps = geqo_randint(root, num_gene / 3, 0); Gene temp; while (num_swaps > 0) { swap1 = geqo_randint(root, num_gene - 1, 0); swap2 = geqo_randint(root, num_gene - 1, 0); while (swap1 == swap2) swap2 = geqo_randint(root, num_gene - 1, 0); temp = tour[swap1]; tour[swap1] = tour[swap2]; tour[swap2] = temp; num_swaps -= 1; } } #endif /* defined(CX) */ ```
The Buick Circus Hour is an American television series which aired from 1952 to 1953 on NBC. It was a variety series with a circus theme. Each episode ran 60 minutes, about 52 minutes excluding commercials. As the title suggests, it was sponsored by Buick. The series' archival status is unknown, but the debut episode is available on the Internet Archive. The series aired once a month in the Tuesday night 8:00 PM Eastern time slot normally occupied by the Texaco Star Theater, which starred Milton Berle. A reviewer for the Brooklyn Eagle newspaper felt the series was not up to 1952 standards, comparing the show with a 1948 variety show, though he also described the cast and crew as talented. References External links Video of The Buick Circus Hour at Internet Archive 1952 American television series debuts 1953 American television series endings 1950s American variety television series Black-and-white American television shows NBC original programming
```typescript
// *** WARNING: this file was generated by test. ***
// *** Do not edit by hand unless you're certain you know what you are doing! ***

import * as pulumi from "@pulumi/pulumi";
import * as inputs from "../types/input";
import * as outputs from "../types/output";
import {Cat, Dog} from "..";

/**
 * A toy for a dog
 */
export interface Chew {
    owner?: Dog;
}

/**
 * A Toy for a cat
 */
export interface Laser {
    animal?: Cat;
    batteries?: boolean;
    light?: number;
}

export interface Rec {
    rec1?: outputs.Rec;
}

/**
 * This is a toy
 */
export interface Toy {
    associated?: outputs.Toy;
    color?: string;
    wear?: number;
}
```
```javascript
class EasyAccess {
    constructor() {
        // Implement mouse drag with Ctrl and left button pressed to scroll.
        this.lastMouseClientX = 0;
        this.lastMouseClientY = 0;
        this.readyToScroll = false;
        this.scrolled = false;

        // Vi-like navigation.
        // Pending keys for keydown.
        this.pendingKeys = [];
        // The repeat token from user input.
        this.repeatToken = 0;

        window.vxcore.on('ready', () => {
            this.setupMouseMove();
            this.setupViNavigation();
            this.setupZoomOnWheel();
        });
    }

    setupMouseMove() {
        window.addEventListener('mousedown', (e) => {
            e = e || window.event;
            let isCtrl = window.vxcore.os === 'Mac' ? e.metaKey : e.ctrlKey;
            // Left button and Ctrl key.
            if (e.buttons == 1
                && isCtrl
                && window.getSelection().type != 'Range'
                && !window.vxImageViewer.isViewingImage()) {
                this.lastMouseClientX = e.clientX;
                this.lastMouseClientY = e.clientY;
                this.readyToScroll = true;
                this.scrolled = false;
                e.preventDefault();
            } else {
                this.readyToScroll = false;
                this.scrolled = false;
            }
        });

        window.addEventListener('mouseup', (e) => {
            e = e || window.event;
            if (this.scrolled || this.readyToScroll) {
                // Have been scrolled, restore the cursor style.
                document.body.style.cursor = "auto";
                e.preventDefault();
            }

            this.readyToScroll = false;
            this.scrolled = false;
        });

        window.addEventListener('mousemove', (e) => {
            e = e || window.event;
            if (this.readyToScroll) {
                let deltaX = e.clientX - this.lastMouseClientX;
                let deltaY = e.clientY - this.lastMouseClientY;

                let threshold = 5;
                if (Math.abs(deltaX) >= threshold || Math.abs(deltaY) >= threshold) {
                    this.lastMouseClientX = e.clientX;
                    this.lastMouseClientY = e.clientY;

                    if (!this.scrolled) {
                        this.scrolled = true;
                        document.body.style.cursor = "all-scroll";
                    }

                    let scrollX = -deltaX;
                    let scrollY = -deltaY;
                    window.scrollBy(scrollX, scrollY);
                }

                e.preventDefault();
            }
        });
    }

    setupViNavigation() {
        document.addEventListener('keydown', (e) => {
            // Need to clear pending keys.
            let needClear = true;
            // This event has been handled completely. No need to call the default handler.
            let accepted = true;

            e = e || window.event;
            let key = null;
            let shift = null;
            let ctrl = null;
            let meta = null;
            if (e.which) {
                key = e.which;
            } else {
                key = e.keyCode;
            }
            shift = !!e.shiftKey;
            ctrl = !!e.ctrlKey;
            meta = !!e.metaKey;
            let isCtrl = window.vxcore.os === 'Mac' ? e.metaKey : e.ctrlKey;

            switch (key) {
            // Skip Ctrl, Shift, Alt, Super.
            case 16:
            case 17:
            case 18:
            case 91:
            case 92:
                needClear = false;
                break;

            // 0 - 9.
            case 48: case 49: case 50: case 51: case 52:
            case 53: case 54: case 55: case 56: case 57:
            case 96: case 97: case 98: case 99: case 100:
            case 101: case 102: case 103: case 104: case 105:
            {
                if (this.pendingKeys.length != 0 || ctrl || shift || meta) {
                    accepted = false;
                    break;
                }

                let num = key >= 96 ? key - 96 : key - 48;
                this.repeatToken = this.repeatToken * 10 + num;
                needClear = false;
                break;
            }

            case 74: // J
                if (!shift && (!ctrl || isCtrl) && (!meta || isCtrl)) {
                    EasyAccess.scroll(true);
                    break;
                }
                accepted = false;
                break;

            case 75: // K
                if (!shift && (!ctrl || isCtrl) && (!meta || isCtrl)) {
                    EasyAccess.scroll(false);
                    break;
                }
                accepted = false;
                break;

            case 72: // H
                if (!ctrl && !shift && !meta) {
                    window.scrollBy(-100, 0);
                    break;
                }
                accepted = false;
                break;

            case 76: // L
                if (!ctrl && !shift && !meta) {
                    window.scrollBy(100, 0);
                    break;
                }
                accepted = false;
                break;

            case 71: // G
                if (shift) {
                    if (this.pendingKeys.length == 0) {
                        let scrollLeft = document.documentElement.scrollLeft
                            || document.body.scrollLeft
                            || window.pageXOffset;
                        let scrollHeight = document.documentElement.scrollHeight
                            || document.body.scrollHeight;
                        window.scrollTo(scrollLeft, scrollHeight);
                        break;
                    }
                } else if (!ctrl && !meta) {
                    if (this.pendingKeys.length == 0) {
                        // First g, pend it.
this.pendingKeys.push({ key: key, ctrl: ctrl, shift: shift }); needClear = false; break; } else if (this.pendingKeys.length == 1) { let pendKey = this.pendingKeys[0]; if (pendKey.key == key && !pendKey.shift && !pendKey.ctrl) { let scrollLeft = document.documentElement.scrollLeft || document.body.scrollLeft || window.pageXOffset; window.scrollTo(scrollLeft, 0); break; } } } accepted = false; break; case 85: // U if (ctrl) { let clientHeight = document.documentElement.clientHeight; window.scrollBy(0, -clientHeight / 2); break; } accepted = false; break; case 68: // D if (ctrl) { let clientHeight = document.documentElement.clientHeight; window.scrollBy(0, clientHeight / 2); break; } accepted = false; break; case 219: // [ or { { let repeat = this.repeatToken < 1 ? 1 : this.repeatToken; if (shift) { // { if (this.pendingKeys.length == 1) { let pendKey = this.pendingKeys[0]; if (pendKey.key == key && !pendKey.shift && !pendKey.ctrl) { // [{, jump to previous title at a higher level. this.jumpTitle(false, -1, repeat); break; } } } else if (!ctrl && !meta) { // [ if (this.pendingKeys.length == 0) { // First [, pend it. this.pendingKeys.push({ key: key, ctrl: ctrl, shift: shift }); needClear = false; break; } else if (this.pendingKeys.length == 1) { let pendKey = this.pendingKeys[0]; if (pendKey.key == key && !pendKey.shift && !pendKey.ctrl) { // [[, jump to previous title. this.jumpTitle(false, 1, repeat); break; } else if (pendKey.key == 221 && !pendKey.shift && !pendKey.ctrl) { // ][, jump to next title at the same level. this.jumpTitle(true, 0, repeat); break; } } } accepted = false; break; } case 221: // ] or } { let repeat = this.repeatToken < 1 ? 1 : this.repeatToken; if (shift) { // } if (this.pendingKeys.length == 1) { let pendKey = this.pendingKeys[0]; if (pendKey.key == key && !pendKey.shift && !pendKey.ctrl) { // ]}, jump to next title at a higher level. 
this.jumpTitle(true, -1, repeat); break; } } } else if (!ctrl && !meta) { // ] if (this.pendingKeys.length == 0) { // First ], pend it. this.pendingKeys.push({ key: key, ctrl: ctrl, shift: shift }); needClear = false; break; } else if (this.pendingKeys.length == 1) { let pendKey = this.pendingKeys[0]; if (pendKey.key == key && !pendKey.shift && !pendKey.ctrl) { // ]], jump to next title. this.jumpTitle(true, 1, repeat); break; } else if (pendKey.key == 219 && !pendKey.shift && !pendKey.ctrl) { // [], jump to previous title at the same level. this.jumpTitle(false, 0, repeat); break; } } } accepted = false; break; } default: accepted = false; break; } if (needClear) { this.repeatToken = 0; this.pendingKeys = []; } if (accepted) { e.preventDefault(); } else { window.vxcore.setKeyPress(key, ctrl, shift, meta); } }); } // @forward: jump forward or backward. // @relativeLevel: 0 for the same level as current header; // negative value for upper level; // positive value is ignored. jumpTitle(forward, relativeLevel, repeat) { let headings = window.vxcore.nodeLineMapper.headingNodes; if (headings.length == 0) { return; } let currentHeadingIdx = window.vxcore.nodeLineMapper.currentHeadingIndex(); if (currentHeadingIdx == -1) { // At the beginning, before any headings. if (relativeLevel < 0 || !forward) { return; } } let targetIdx = -1; // -1: skip level check. let targetLevel = 0; let delta = 1; if (!forward) { delta = -1; } let scrollTop = document.documentElement.scrollTop || document.body.scrollTop || window.pageYOffset; for (targetIdx = (currentHeadingIdx == -1 ? 0 : currentHeadingIdx); targetIdx >= 0 && targetIdx < headings.length; targetIdx += delta) { let level = parseInt(headings[targetIdx].tagName.substr(1)); if (targetLevel == 0) { targetLevel = level; if (relativeLevel < 0) { targetLevel += relativeLevel; if (targetLevel < 1) { // Invalid level. 
                        return;
                    }
                } else if (relativeLevel > 0) {
                    targetLevel = -1;
                }
            }

            if (targetLevel == -1 || level == targetLevel) {
                if (targetIdx == currentHeadingIdx) {
                    // If current heading is visible, skip it.
                    // Minus 2 to tolerate some margin.
                    if (forward || scrollTop - 2 <= headings[targetIdx].offsetTop) {
                        continue;
                    }
                }

                if (--repeat == 0) {
                    break;
                }
            } else if (level < targetLevel) {
                return;
            }
        }

        if (targetIdx < 0 || targetIdx >= headings.length) {
            return;
        }

        window.vxcore.nodeLineMapper.scrollToNode(headings[targetIdx], false, false);
        window.setTimeout(function() {
            window.vxcore.nodeLineMapper.updateCurrentHeading();
        }, 1000);
    };

    setupZoomOnWheel() {
        window.addEventListener('wheel', (e) => {
            e = e || window.event;
            let isCtrl = window.vxcore.os === 'Mac' ? e.metaKey : e.ctrlKey;
            if (isCtrl) {
                if (e.deltaY != 0) {
                    window.vxcore.zoom(e.deltaY < 0);
                }

                e.preventDefault();
            }
        });
    }

    // @p_down: true to scroll down, false to scroll up.
    static scroll(p_down) {
        let delta = 100;
        if (p_down) {
            window.scrollBy(0, delta);
        } else {
            window.scrollBy(0, -delta);
        }
    }
}

window.vxEasyAccess = new EasyAccess();
```
```c
/**
 * @file lv_draw_sw_mask_private.h
 *
 */

#ifndef LV_DRAW_SW_MASK_PRIVATE_H
#define LV_DRAW_SW_MASK_PRIVATE_H

#ifdef __cplusplus
extern "C" {
#endif

/*********************
 *      INCLUDES
 *********************/

#include "lv_draw_sw_mask.h"

/*********************
 *      DEFINES
 *********************/

/**********************
 *      TYPEDEFS
 **********************/

typedef struct {
    uint8_t * buf;
    lv_opa_t * cir_opa;        /**< Opacity of values on the circumference of a 1/4 circle */
    uint16_t * x_start_on_y;   /**< The x coordinate of the circle for each y value */
    uint16_t * opa_start_on_y; /**< The index of `cir_opa` for each y value */
    int32_t life;              /**< How many times the entry was used */
    uint32_t used_cnt;         /**< Like a semaphore to count the referencing masks */
    int32_t radius;            /**< The radius of the entry */
} lv_draw_sw_mask_radius_circle_dsc_t;

struct lv_draw_sw_mask_common_dsc_t {
    lv_draw_sw_mask_xcb_t cb;
    lv_draw_sw_mask_type_t type;
};

struct lv_draw_sw_mask_line_param_t {
    /** The first element must be the common descriptor */
    lv_draw_sw_mask_common_dsc_t dsc;

    struct {
        /*First point*/
        lv_point_t p1;

        /*Second point*/
        lv_point_t p2;

        /*Which side to keep?*/
        lv_draw_sw_mask_line_side_t side : 2;
    } cfg;

    /** A point of the line */
    lv_point_t origo;

    /** X / (1024*Y) steepness (X is 0..1023 range). What is the change of X in 1024 Y? */
    int32_t xy_steep;

    /** Y / (1024*X) steepness (Y is 0..1023 range). What is the change of Y in 1024 X? */
    int32_t yx_steep;

    /** Helper which stores yx_steep for flat lines and xy_steep for steep (non flat) lines */
    int32_t steep;

    /** Steepness in 1 px in 0..255 range. Used only by flat lines. */
    int32_t spx;

    /** 1: It's a flat line? (Near to horizontal) */
    uint8_t flat : 1;

    /** Invert the mask. The default is: Keep the left part.
*It is used to select left/right/top/bottom */ uint8_t inv: 1; }; struct lv_draw_sw_mask_angle_param_t { /** The first element must be the common descriptor */ lv_draw_sw_mask_common_dsc_t dsc; struct { lv_point_t vertex_p; int32_t start_angle; int32_t end_angle; } cfg; lv_draw_sw_mask_line_param_t start_line; lv_draw_sw_mask_line_param_t end_line; uint16_t delta_deg; }; struct lv_draw_sw_mask_radius_param_t { /** The first element must be the common descriptor */ lv_draw_sw_mask_common_dsc_t dsc; struct { lv_area_t rect; int32_t radius; /** Invert the mask. 0: Keep the pixels inside. */ uint8_t outer: 1; } cfg; lv_draw_sw_mask_radius_circle_dsc_t * circle; }; struct lv_draw_sw_mask_fade_param_t { /** The first element must be the common descriptor */ lv_draw_sw_mask_common_dsc_t dsc; struct { lv_area_t coords; int32_t y_top; int32_t y_bottom; lv_opa_t opa_top; lv_opa_t opa_bottom; } cfg; }; struct lv_draw_sw_mask_map_param_t { /** The first element must be the common descriptor */ lv_draw_sw_mask_common_dsc_t dsc; struct { lv_area_t coords; const lv_opa_t * map; } cfg; }; typedef lv_draw_sw_mask_radius_circle_dsc_t lv_draw_sw_mask_radius_circle_dsc_arr_t[LV_DRAW_SW_CIRCLE_CACHE_SIZE]; /********************** * GLOBAL PROTOTYPES **********************/ /** * Called by LVGL the rendering of a screen is ready to clean up * the temporal (cache) data of the masks */ void lv_draw_sw_mask_cleanup(void); /********************** * MACROS **********************/ #ifdef __cplusplus } /*extern "C"*/ #endif #endif /*LV_DRAW_SW_MASK_PRIVATE_H*/ ```
Eaglesham South Aerodrome is located south southwest of Eaglesham, Alberta, Canada. See also Eaglesham/Bice Farm Aerodrome Eaglesham/Codesa South Aerodrome References External links Page about this airport on COPA's Places to Fly airport directory Registered aerodromes in Alberta Birch Hills County
```c++ // // // path_to_url // // Unless required by applicable law or agreed to in writing, software // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. #include "paddle/fluid/framework/ir/ipu/forward_graph_extract_pass.h" #include "paddle/fluid/framework/ir/pass_tester_helper.h" namespace paddle { namespace framework { namespace ir { void ForwardGraphExtractPass::ApplyImpl(ir::Graph* graph) const { VLOG(10) << "enter ForwardGraphExtractPass::ApplyImpl"; std::unordered_map<OpRole, std::unordered_set<ir::Node*>> all_ops{ {OpRole::kForward, {}}, {OpRole::kBackward, {}}, {OpRole::kOptimize, {}}, {OpRole::kRPC, {}}, {OpRole::kDist, {}}, {OpRole::kLRSched, {}}, {OpRole::kLoss, {}}, {OpRole::kNotSpecified, {}}}; for (auto* node : graph->Nodes()) { if (!node->IsOp()) { continue; } auto op_role = PADDLE_GET_MUTABLE(int, node->Op()->GetAttr("op_role")); if (op_role == static_cast<int>(OpRole::kForward)) { all_ops[OpRole::kForward].insert(node); } else if (op_role == static_cast<int>(OpRole::kBackward)) { all_ops[OpRole::kBackward].insert(node); } else if (op_role == static_cast<int>(OpRole::kOptimize)) { all_ops[OpRole::kOptimize].insert(node); } else if (op_role == static_cast<int>(OpRole::kRPC)) { } else if (op_role == static_cast<int>(OpRole::kDist)) { } else if (op_role == static_cast<int>(OpRole::kLRSched)) { } else if (op_role == static_cast<int>(OpRole::kLoss)) { all_ops[OpRole::kLoss].insert(node); } else if (op_role == static_cast<int>(OpRole::kNotSpecified)) { LOG(WARNING) << "Op: " << node->Name() << " OpRole is NotSpecified "; } } std::unordered_set<ir::Node*> forward_vars; std::unordered_set<ir::Node*> backward_vars; std::unordered_set<ir::Node*> control_vars; // forward_vars for (auto& nodes : std::array<std::unordered_set<ir::Node*>, 2>{ all_ops[OpRole::kForward], all_ops[OpRole::kLoss]}) { for (auto* node : nodes) { for (auto* in_node : node->inputs) { forward_vars.insert(in_node); } for (auto* out_node : node->outputs) { 
forward_vars.insert(out_node); } } } // learning_rate var for (auto* node : all_ops[OpRole::kOptimize]) { if (node->Op()->Inputs().count("LearningRate") && !node->Op()->Inputs().at("LearningRate").empty()) { auto lr_var_name = node->Op()->Inputs().at("LearningRate").front(); for (auto* in_var : node->inputs) { if (in_var->Name() == lr_var_name) { VLOG(10) << "found LearningRate var: " << in_var->Name(); forward_vars.insert(in_var); } } } } // control_vars & backward_vars for (auto* node : graph->Nodes()) { if (!node->IsVar()) { continue; } if (node->IsCtrlVar()) { control_vars.insert(node); } for (auto* in_node : node->inputs) { if (all_ops[OpRole::kOptimize].count(in_node)) { backward_vars.insert(node); } } } // all removed node std::unordered_set<ir::Node*> rm_nodes; for (auto* node : graph->Nodes()) { if (backward_vars.count(node)) { rm_nodes.insert(node); } else if (control_vars.count(node)) { rm_nodes.insert(node); } else if (all_ops[OpRole::kBackward].count(node)) { rm_nodes.insert(node); } else if (all_ops[OpRole::kForward].count(node) == 0 && all_ops[OpRole::kLoss].count(node) == 0 && forward_vars.count(node) == 0) { rm_nodes.insert(node); } else if (node->Name() == "feed" || node->Name() == "fetch") { rm_nodes.insert(node); } } VLOG(10) << "Remove Node: "; for (auto* node : rm_nodes) { // rm node relations for (auto* node_in : node->inputs) { for (size_t i = 0; i < node_in->outputs.size(); ++i) { if (node_in->outputs[i] == node) { node_in->outputs.erase(node_in->outputs.begin() + i); break; } } } for (auto* node_out : node->outputs) { for (size_t i = 0; i < node_out->inputs.size(); ++i) { if (node_out->inputs[i] == node) { node_out->inputs.erase(node_out->inputs.begin() + i); break; } } } VLOG(10) << "\t" << node->Name(); graph->RemoveNode(node); } VLOG(10) << "Post Graph: "; VLOG(10) << DebugString(graph); VLOG(10) << "leave ForwardGraphExtractPass::ApplyImpl"; } } // namespace ir } // namespace framework } // namespace paddle 
REGISTER_PASS(forward_graph_extract_pass, paddle::framework::ir::ForwardGraphExtractPass); ```
```java Java Virtual Machine Use meaningful names Use `DecimalFormat` class to format numbers Detect or prevent integer overflow Supply `toString()` in all classes ```
```kotlin package net.corda.nodeapi.internal.protonwrapper.netty import io.netty.channel.ChannelDuplexHandler import io.netty.channel.ChannelHandler import io.netty.channel.ChannelHandlerContext import io.netty.channel.ChannelPromise import io.netty.handler.logging.LogLevel import io.netty.util.internal.logging.InternalLogLevel import io.netty.util.internal.logging.InternalLogger import io.netty.util.internal.logging.InternalLoggerFactory import java.net.SocketAddress @ChannelHandler.Sharable class NettyServerEventLogger(level: LogLevel = DEFAULT_LEVEL, val silencedIPs: Set<String> = emptySet()) : ChannelDuplexHandler() { companion object { val DEFAULT_LEVEL: LogLevel = LogLevel.DEBUG } private val logger: InternalLogger = InternalLoggerFactory.getInstance(javaClass) private val internalLevel: InternalLogLevel = level.toInternalLevel() @Throws(Exception::class) override fun channelActive(ctx: ChannelHandlerContext) { if (logger.isEnabled(internalLevel)) { logger.log(internalLevel, "Server socket ${ctx.channel()} ACTIVE") } ctx.fireChannelActive() } @Throws(Exception::class) override fun channelInactive(ctx: ChannelHandlerContext) { if (logger.isEnabled(internalLevel)) { logger.log(internalLevel, "Server socket ${ctx.channel()} INACTIVE") } ctx.fireChannelInactive() } @Suppress("OverridingDeprecatedMember") @Throws(Exception::class) override fun exceptionCaught(ctx: ChannelHandlerContext, cause: Throwable) { if (logger.isEnabled(internalLevel)) { logger.log(internalLevel, "Server socket ${ctx.channel()} EXCEPTION ${cause.message}", cause) } ctx.fireExceptionCaught(cause) } @Throws(Exception::class) override fun bind(ctx: ChannelHandlerContext, localAddress: SocketAddress, promise: ChannelPromise) { if (logger.isEnabled(internalLevel)) { logger.log(internalLevel, "Server socket ${ctx.channel()} BIND $localAddress") } ctx.bind(localAddress, promise) } @Throws(Exception::class) override fun close(ctx: ChannelHandlerContext, promise: ChannelPromise) { if 
(logger.isEnabled(internalLevel)) { logger.log(internalLevel, "Server socket ${ctx.channel()} CLOSE") } ctx.close(promise) } @Throws(Exception::class) override fun channelRead(ctx: ChannelHandlerContext, msg: Any) { val level = if (msg is io.netty.channel.socket.SocketChannel) { // Should always be the case as this is a server socket, but be defensive if (msg.remoteAddress()?.hostString !in silencedIPs) internalLevel else InternalLogLevel.TRACE } else internalLevel if (logger.isEnabled(level)) { logger.log(level, "Server socket ${ctx.channel()} ACCEPTED $msg") } ctx.fireChannelRead(msg) } } ```
```javascript /** * @license Apache-2.0 * * * * path_to_url * * Unless required by applicable law or agreed to in writing, software * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. */ 'use strict'; // MODULES // var sincos = require( '@stdlib/math/base/special/sincos' ).assign; var exp = require( '@stdlib/math/base/special/exp' ); var Complex128 = require( '@stdlib/complex/float64/ctor' ); var real = require( '@stdlib/complex/float64/real' ); var imag = require( '@stdlib/complex/float64/imag' ); // VARIABLES // // Pre-allocate workspace array: var WORKSPACE = [ 0.0, 0.0 ]; // MAIN // /** * Evaluates the cis function for a double-precision complex floating-point number. * * @param {Complex128} z - complex number * @returns {Complex128} result * * @example * var Complex128 = require( '@stdlib/complex/float64/ctor' ); * var real = require( '@stdlib/complex/float64/real' ); * var imag = require( '@stdlib/complex/float64/imag' ); * * var z = new Complex128( 0.0, 0.0 ); * // returns <Complex128> * * var out = ccis( z ); * // returns <Complex128> * * var re = real( out ); * // returns 1.0 * * var im = imag( out ); * // returns 0.0 * * @example * var Complex128 = require( '@stdlib/complex/float64/ctor' ); * var real = require( '@stdlib/complex/float64/real' ); * var imag = require( '@stdlib/complex/float64/imag' ); * * var z = new Complex128( 1.0, 0.0 ); * // returns <Complex128> * * var out = ccis( z ); * // returns <Complex128> * * var re = real( out ); * // returns ~0.54 * * var im = imag( out ); * // returns ~0.841 */ function ccis( z ) { var re; var im; var e; re = real( z ); im = imag( z ); sincos( re, WORKSPACE, 1, 0 ); if ( im !== 0.0 ) { e = exp( -im ); WORKSPACE[ 0 ] *= e; WORKSPACE[ 1 ] *= e; } return new Complex128( WORKSPACE[ 1 ], WORKSPACE[ 0 ] ); } // EXPORTS // module.exports = ccis; ```
```yaml
lonlat:
  - 8.806354
  - 47.667048
parsers:
  exchange: ENTSOE.fetch_exchange
  exchangeForecast: ENTSOE.fetch_exchange_forecast
rotation: 0
```
Garmez-e Sofla (also Romanized as Garmez-e Soflá) is a village in Howmeh Rural District, in the Central District of Behbahan County, Khuzestan Province, Iran. At the 2006 census, its population was 226, in 58 families.

References

Populated places in Behbahan County
```php
<?php
/*
 * Licensed under the Apache License, Version 2.0 (the "License"); you may not
 * use this file except in compliance with the License. You may obtain a copy of
 * the License at
 *
 *   path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
 * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 * License for the specific language governing permissions and limitations under
 * the License.
 */

namespace Google\Service\Firestore;

class Status extends \Google\Collection
{
  protected $collection_key = 'details';
  /**
   * @var int
   */
  public $code;
  /**
   * @var array[]
   */
  public $details;
  /**
   * @var string
   */
  public $message;

  /**
   * @param int
   */
  public function setCode($code)
  {
    $this->code = $code;
  }
  /**
   * @return int
   */
  public function getCode()
  {
    return $this->code;
  }
  /**
   * @param array[]
   */
  public function setDetails($details)
  {
    $this->details = $details;
  }
  /**
   * @return array[]
   */
  public function getDetails()
  {
    return $this->details;
  }
  /**
   * @param string
   */
  public function setMessage($message)
  {
    $this->message = $message;
  }
  /**
   * @return string
   */
  public function getMessage()
  {
    return $this->message;
  }
}

// Adding a class alias for backwards compatibility with the previous class name.
class_alias(Status::class, 'Google_Service_Firestore_Status');
```
```java /* * DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER. * * * Subject to the condition set forth below, permission is hereby granted to any * person obtaining a copy of this software, associated documentation and/or * data (collectively the "Software"), free of charge and under any and all * copyright rights in the Software, and any and all patent rights owned or * freely licensable by each licensor hereunder covering either (i) the * unmodified Software as contributed to or provided by such licensor, or (ii) * the Larger Works (as defined below), to deal in both * * (a) the Software, and * * (b) any piece of software and/or hardware listed in the lrgrwrks.txt file if * one is included with the Software (each a "Larger Work" to which the Software * is contributed by such licensors), * * without restriction, including without limitation the rights to copy, create * derivative works of, display, perform, and distribute the Software and make, * use, sell, offer for sale, import, export, have made, and have sold the * Software and the Larger Work(s), and to sublicense the foregoing rights on * either these or other terms. * * This license is subject to the following condition: * * The above copyright notice and either this complete permission notice or at a * minimum a reference to the UPL must be included in all copies or substantial * portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. 
*/ package com.oracle.truffle.polyglot; import static com.oracle.truffle.api.CompilerDirectives.shouldNotReachHere; import static com.oracle.truffle.polyglot.EngineAccessor.RUNTIME; import java.lang.reflect.Type; import java.math.BigInteger; import java.nio.ByteOrder; import java.time.Duration; import java.time.Instant; import java.time.LocalDate; import java.time.LocalTime; import java.time.ZoneId; import java.util.AbstractSet; import java.util.Arrays; import java.util.Collections; import java.util.Iterator; import java.util.Map; import java.util.NoSuchElementException; import java.util.Set; import org.graalvm.polyglot.impl.AbstractPolyglotImpl; import org.graalvm.polyglot.impl.AbstractPolyglotImpl.APIAccess; import org.graalvm.polyglot.impl.AbstractPolyglotImpl.AbstractValueDispatch; import com.oracle.truffle.api.CallTarget; import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary; import com.oracle.truffle.api.dsl.Bind; import com.oracle.truffle.api.dsl.Cached; import com.oracle.truffle.api.dsl.GenerateCached; import com.oracle.truffle.api.dsl.GenerateInline; import com.oracle.truffle.api.dsl.ImportStatic; import com.oracle.truffle.api.dsl.Specialization; import com.oracle.truffle.api.interop.ArityException; import com.oracle.truffle.api.interop.InteropLibrary; import com.oracle.truffle.api.interop.InvalidArrayIndexException; import com.oracle.truffle.api.interop.InvalidBufferOffsetException; import com.oracle.truffle.api.interop.StopIterationException; import com.oracle.truffle.api.interop.TruffleObject; import com.oracle.truffle.api.interop.UnknownIdentifierException; import com.oracle.truffle.api.interop.UnknownKeyException; import com.oracle.truffle.api.interop.UnsupportedMessageException; import com.oracle.truffle.api.interop.UnsupportedTypeException; import com.oracle.truffle.api.library.CachedLibrary; import com.oracle.truffle.api.nodes.Node; import com.oracle.truffle.api.profiles.InlinedBranchProfile; import 
com.oracle.truffle.api.strings.TruffleString; import com.oracle.truffle.polyglot.PolyglotLanguageContext.ToGuestValueNode; import com.oracle.truffle.polyglot.PolyglotLanguageContext.ToGuestValuesNode; import com.oracle.truffle.polyglot.PolyglotLanguageContext.ToHostValueNode; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsClassLiteralNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsDateNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsDurationNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsInstantNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsNativePointerNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsTimeNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsTimeZoneNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsTypeLiteralNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.CanExecuteNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.CanInstantiateNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.CanInvokeNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ExecuteNoArgsNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ExecuteNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ExecuteVoidNoArgsNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ExecuteVoidNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetArrayElementNodeGen; import 
com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetArraySizeNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetBufferSizeNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashEntriesIteratorNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashKeysIteratorNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashSizeNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashValueNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashValueOrDefaultNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashValuesIteratorNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetIteratorNextElementNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetMemberKeysNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetMemberNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetMetaQualifiedNameNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetMetaSimpleNameNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasArrayElementsNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasBufferElementsNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasHashEntriesNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasHashEntryNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasIteratorNextElementNodeGen; import 
com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasIteratorNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasMemberNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasMembersNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.InvokeNoArgsNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.InvokeNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsBufferWritableNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsDateNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsDurationNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsExceptionNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsMetaInstanceNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsMetaObjectNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsNativePointerNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsNullNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsTimeNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsTimeZoneNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.NewInstanceNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.PutHashEntryNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.PutMemberNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferByteNodeGen; import 
com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferDoubleNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferFloatNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferIntNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferLongNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferShortNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.RemoveArrayElementNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.RemoveHashEntryNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.RemoveMemberNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.SetArrayElementNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ThrowExceptionNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferByteNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferDoubleNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferFloatNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferIntNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferLongNodeGen; import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferShortNodeGen; abstract class PolyglotValueDispatch extends AbstractValueDispatch { private static final String TRUNCATION_SUFFIX = "..."; private static final String UNKNOWN = "Unknown"; 
static final InteropLibrary UNCACHED_INTEROP = InteropLibrary.getFactory().getUncached(); final PolyglotImpl impl; final PolyglotLanguageInstance languageInstance; PolyglotValueDispatch(PolyglotImpl impl, PolyglotLanguageInstance languageInstance) { super(impl); this.impl = impl; this.languageInstance = languageInstance; } @Override public final Object getContext(Object context) { if (context == null) { return null; } return ((PolyglotLanguageContext) context).context.api; } static <T extends Throwable> RuntimeException guestToHostException(PolyglotLanguageContext languageContext, T e, boolean entered) { throw PolyglotImpl.guestToHostException(languageContext, e, entered); } @Override public Object getArrayElement(Object languageContext, Object receiver, long index) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return getArrayElementUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static Object getArrayElementUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getArrayElement(long)", "hasArrayElements()"); } @Override public void setArrayElement(Object languageContext, Object receiver, long index, Object value) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { setArrayElementUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static void setArrayElementUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "setArrayElement(long, Object)", "hasArrayElements()"); } @Override public boolean removeArrayElement(Object languageContext, Object receiver, long index) { PolyglotLanguageContext context = 
(PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw removeArrayElementUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException removeArrayElementUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "removeArrayElement(long, Object)", null); } @Override public long getArraySize(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return getArraySizeUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static long getArraySizeUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getArraySize()", "hasArrayElements()"); } // region Buffer Methods @Override public boolean isBufferWritable(Object languageContext, Object receiver) throws UnsupportedOperationException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw isBufferWritableUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException isBufferWritableUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "isBufferWritable()", "hasBufferElements()"); } @Override public long getBufferSize(Object languageContext, Object receiver) throws UnsupportedOperationException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw getBufferSizeUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, 
true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException getBufferSizeUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "getBufferSize()", "hasBufferElements()"); } @Override public byte readBufferByte(Object languageContext, Object receiver, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw readBufferByteUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException readBufferByteUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "readBufferByte()", "hasBufferElements()"); } @Override public void readBuffer(Object languageContext, Object receiver, long byteOffset, byte[] destination, int destinationOffset, int length) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw readBufferUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException readBufferUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "readBuffer()", "hasBufferElements()"); } @Override public void writeBufferByte(Object languageContext, Object receiver, long byteOffset, byte value) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw writeBufferByteUnsupported(context, receiver); } catch (Throwable e) { throw 
guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException writeBufferByteUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "writeBufferByte()", "hasBufferElements()"); } @Override public short readBufferShort(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw readBufferShortUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException readBufferShortUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "readBufferShort()", "hasBufferElements()"); } @Override public void writeBufferShort(Object languageContext, Object receiver, ByteOrder order, long byteOffset, short value) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw writeBufferShortUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException writeBufferShortUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "writeBufferShort()", "hasBufferElements()"); } @Override public int readBufferInt(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw readBufferIntUnsupported(context, receiver); 
} catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException readBufferIntUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "readBufferInt()", "hasBufferElements()"); } @Override public void writeBufferInt(Object languageContext, Object receiver, ByteOrder order, long byteOffset, int value) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw writeBufferIntUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException writeBufferIntUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "writeBufferInt()", "hasBufferElements()"); } @Override public long readBufferLong(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw readBufferLongUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException readBufferLongUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "readBufferLong()", "hasBufferElements()"); } @Override public void writeBufferLong(Object languageContext, Object receiver, ByteOrder order, long byteOffset, long value) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw 
writeBufferLongUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException writeBufferLongUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "writeBufferLong()", "hasBufferElements()"); } @Override public float readBufferFloat(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw readBufferFloatUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException readBufferFloatUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "readBufferFloat()", "hasBufferElements()"); } @Override public void writeBufferFloat(Object languageContext, Object receiver, ByteOrder order, long byteOffset, float value) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; final Object prev = hostEnter(context); try { throw writeBufferFloatUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException writeBufferFloatUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "writeBufferFloat()", "hasBufferElements()"); } @Override public double readBufferDouble(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = 
hostEnter(context); try { throw readBufferDoubleUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException readBufferDoubleUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "readBufferDouble()", "hasBufferElements()"); } @Override public void writeBufferDouble(Object languageContext, Object receiver, ByteOrder order, long byteOffset, double value) throws UnsupportedOperationException, IndexOutOfBoundsException { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw writeBufferDoubleUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException writeBufferDoubleUnsupported(PolyglotLanguageContext context, Object receiver) { return unsupported(context, receiver, "writeBufferDouble()", "hasBufferElements()"); } @TruffleBoundary protected static RuntimeException invalidBufferIndex(PolyglotLanguageContext context, Object receiver, long byteOffset, long size) { final String message = String.format("Invalid buffer access of length %d at byte offset %d for buffer %s.", size, byteOffset, getValueInfo(context, receiver)); throw PolyglotEngineException.bufferIndexOutOfBounds(message); } // endregion @Override public Object getMember(Object languageContext, Object receiver, String key) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return getMemberUnsupported(context, receiver, key); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static Object getMemberUnsupported(PolyglotLanguageContext context, Object receiver, @SuppressWarnings("unused") String key) { throw 
unsupported(context, receiver, "getMember(String)", "hasMembers()"); } @Override public void putMember(Object languageContext, Object receiver, String key, Object member) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { putMemberUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException putMemberUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "putMember(String, Object)", "hasMembers()"); } @Override public boolean removeMember(Object languageContext, Object receiver, String key) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw removeMemberUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException removeMemberUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "removeMember(String, Object)", null); } @Override public Object execute(Object languageContext, Object receiver, Object[] arguments) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw executeUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public Object execute(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw executeUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException 
executeUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "execute(Object...)", "canExecute()"); } @Override public Object newInstance(Object languageContext, Object receiver, Object[] arguments) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return newInstanceUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static Object newInstanceUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "newInstance(Object...)", "canInstantiate()"); } @Override public void executeVoid(Object languageContext, Object receiver, Object[] arguments) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { executeVoidUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public void executeVoid(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { executeVoidUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static void executeVoidUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "executeVoid(Object...)", "canExecute()"); } @Override public Object invoke(Object languageContext, Object receiver, String identifier, Object[] arguments) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw invokeUnsupported(context, receiver, identifier); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { 
hostLeave(context, prev); } } @Override public Object invoke(Object languageContext, Object receiver, String identifier) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw invokeUnsupported(context, receiver, identifier); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static RuntimeException invokeUnsupported(PolyglotLanguageContext context, Object receiver, String identifier) { throw unsupported(context, receiver, "invoke(" + identifier + ", Object...)", "canInvoke(String)"); } @Override public String asString(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return asStringUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } protected static String asStringUnsupported(PolyglotLanguageContext context, Object receiver) { return invalidCastPrimitive(context, receiver, String.class, "asString()", "isString()", "Invalid coercion."); } @Override public boolean asBoolean(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return asBooleanUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } private static boolean isNullUncached(Object receiver) { return InteropLibrary.getFactory().getUncached().isNull(receiver); } protected static boolean asBooleanUnsupported(PolyglotLanguageContext context, Object receiver) { return invalidCastPrimitive(context, receiver, boolean.class, "asBoolean()", "isBoolean()", "Invalid or lossy primitive coercion."); } private static <T> T invalidCastPrimitive(PolyglotLanguageContext context, Object 
receiver, Class<T> clazz, String asMethodName, String isMethodName, String detail) { if (isNullUncached(receiver)) { throw nullCoercion(context, receiver, clazz, asMethodName, isMethodName); } else { throw cannotConvert(context, receiver, clazz, asMethodName, isMethodName, detail); } } @Override public int asInt(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return asIntUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } protected static int asIntUnsupported(PolyglotLanguageContext context, Object receiver) { return invalidCastPrimitive(context, receiver, int.class, "asInt()", "fitsInInt()", "Invalid or lossy primitive coercion."); } @Override public long asLong(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return asLongUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } protected static long asLongUnsupported(PolyglotLanguageContext context, Object receiver) { return invalidCastPrimitive(context, receiver, long.class, "asLong()", "fitsInLong()", "Invalid or lossy primitive coercion."); } @Override public BigInteger asBigInteger(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return asBigIntegerUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } protected static BigInteger asBigIntegerUnsupported(PolyglotLanguageContext context, Object receiver) { throw cannotConvert(context, receiver, BigInteger.class, "asBigInteger()", "fitsInBigInteger()", "Invalid or lossy 
coercion."); } @Override public double asDouble(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return asDoubleUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } protected static double asDoubleUnsupported(PolyglotLanguageContext context, Object receiver) { return invalidCastPrimitive(context, receiver, double.class, "asDouble()", "fitsInDouble()", "Invalid or lossy primitive coercion."); } @Override public float asFloat(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return asFloatUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } protected static float asFloatUnsupported(PolyglotLanguageContext context, Object receiver) { return invalidCastPrimitive(context, receiver, float.class, "asFloat()", "fitsInFloat()", "Invalid or lossy primitive coercion."); } @Override public byte asByte(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return asByteUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } protected static byte asByteUnsupported(PolyglotLanguageContext context, Object receiver) { return invalidCastPrimitive(context, receiver, byte.class, "asByte()", "fitsInByte()", "Invalid or lossy primitive coercion."); } @Override public short asShort(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return asShortUnsupported(context, receiver); } catch (Throwable 
e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static short asShortUnsupported(PolyglotLanguageContext context, Object receiver) {
        return invalidCastPrimitive(context, receiver, short.class, "asShort()", "fitsInShort()", "Invalid or lossy primitive coercion.");
    }

    @Override
    public long asNativePointer(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asNativePointerUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    static long asNativePointerUnsupported(PolyglotLanguageContext context, Object receiver) {
        // The corresponding precondition check in the Value API is isNativePointer(), not "isNativeObject()".
        throw cannotConvert(context, receiver, long.class, "asNativePointer()", "isNativePointer()", "Value cannot be converted to a native pointer.");
    }

    @Override
    public Object asHostObject(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asHostObjectUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static Object asHostObjectUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw cannotConvert(context, receiver, null, "asHostObject()", "isHostObject()", "Value is not a host object.");
    }

    @Override
    public Object asProxyObject(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asProxyObjectUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static Object asProxyObjectUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw cannotConvert(context, receiver,
null, "asProxyObject()", "isProxyObject()", "Value is not a proxy object."); } @Override public LocalDate asDate(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { if (isNullUncached(receiver)) { return null; } else { throw cannotConvert(context, receiver, null, "asDate()", "isDate()", "Value does not contain date information."); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public LocalTime asTime(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { if (isNullUncached(receiver)) { return null; } else { throw cannotConvert(context, receiver, null, "asTime()", "isTime()", "Value does not contain time information."); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public ZoneId asTimeZone(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { if (isNullUncached(receiver)) { return null; } else { throw cannotConvert(context, receiver, null, "asTimeZone()", "isTimeZone()", "Value does not contain time zone information."); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public Instant asInstant(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { if (isNullUncached(receiver)) { return null; } else { throw cannotConvert(context, receiver, null, "asInstant()", "isInstant()", "Value does not contain instant information."); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, 
prev); } } @Override public Duration asDuration(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { if (isNullUncached(receiver)) { return null; } else { throw cannotConvert(context, receiver, null, "asDuration()", "isDuration()", "Value does not contain duration information."); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public RuntimeException throwException(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw unsupported(context, receiver, "throwException()", "isException()"); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public final Object getMetaObject(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return getMetaObjectImpl(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public Object getIterator(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return getIteratorUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final Object getIteratorUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getIterator()", "hasIterator()"); } @Override public boolean hasIteratorNextElement(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = 
hostEnter(context); try { return hasIteratorNextElementUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final boolean hasIteratorNextElementUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "hasIteratorNextElement()", "isIterator()"); } @Override public Object getIteratorNextElement(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return getIteratorNextElementUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final Object getIteratorNextElementUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getIteratorNextElement()", "isIterator()"); } @Override public long getHashSize(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw getHashSizeUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final RuntimeException getHashSizeUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getHashSize()", "hasHashEntries()"); } @Override public Object getHashValue(Object languageContext, Object receiver, Object key) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw getHashValueUnsupported(context, receiver, key); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final RuntimeException 
getHashValueUnsupported(PolyglotLanguageContext context, Object receiver, @SuppressWarnings("unused") Object key) { throw unsupported(context, receiver, "getHashValue(Object)", "hasHashEntries()"); } @Override public Object getHashValueOrDefault(Object languageContext, Object receiver, Object key, Object defaultValue) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw getHashValueOrDefaultUnsupported(context, receiver, key, defaultValue); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary @SuppressWarnings("unused") static final RuntimeException getHashValueOrDefaultUnsupported(PolyglotLanguageContext context, Object receiver, Object key, Object defaultValue) { throw unsupported(context, receiver, "getHashValueOrDefault(Object, Object)", "hasHashEntries()"); } @Override public void putHashEntry(Object languageContext, Object receiver, Object key, Object value) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { putHashEntryUnsupported(context, receiver, key, value); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final RuntimeException putHashEntryUnsupported(PolyglotLanguageContext context, Object receiver, @SuppressWarnings("unused") Object key, @SuppressWarnings("unused") Object value) { throw unsupported(context, receiver, "putHashEntry(Object, Object)", "hasHashEntries()"); } @Override public boolean removeHashEntry(Object languageContext, Object receiver, Object key) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw removeHashEntryUnsupported(context, receiver, key); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } 
@TruffleBoundary static final RuntimeException removeHashEntryUnsupported(PolyglotLanguageContext context, Object receiver, @SuppressWarnings("unused") Object key) { throw unsupported(context, receiver, "removeHashEntry(Object)", "hasHashEntries()"); } @Override public Object getHashEntriesIterator(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw getHashEntriesIteratorUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final RuntimeException getHashEntriesIteratorUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getHashEntriesIterator()", "hasHashEntries()"); } @Override public Object getHashKeysIterator(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw getHashKeysIteratorUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final RuntimeException getHashKeysIteratorUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getHashKeysIterator()", "hasHashEntries()"); } @Override public Object getHashValuesIterator(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw getHashValuesIteratorUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public void pin(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); 
try { languageInstance.sharing.engine.host.pin(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final RuntimeException getHashValuesIteratorUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getHashValuesIterator()", "hasHashEntries()"); } protected Object getMetaObjectImpl(PolyglotLanguageContext context, Object receiver) { InteropLibrary lib = InteropLibrary.getFactory().getUncached(receiver); if (lib.hasMetaObject(receiver)) { try { return asValue(impl, context, lib.getMetaObject(receiver)); } catch (UnsupportedMessageException e) { throw shouldNotReachHere("Unexpected unsupported message.", e); } } return null; } private static Object asValue(PolyglotImpl polyglot, PolyglotLanguageContext context, Object value) { if (context == null) { return polyglot.asValue(PolyglotFastThreadLocals.getContext(null), value); } else { return context.asValue(value); } } static Object hostEnter(Object languageContext) { if (languageContext == null) { return null; } PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; PolyglotContextImpl c = context.context; try { return c.engine.enterIfNeeded(c, true); } catch (Throwable t) { throw guestToHostException(context, t, false); } } static void hostLeave(Object languageContext, Object prev) { if (languageContext == null) { return; } PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; try { PolyglotContextImpl c = context.context; c.engine.leaveIfNeeded(prev, c); } catch (Throwable t) { throw guestToHostException(context, t, false); } } @TruffleBoundary protected static RuntimeException unsupported(PolyglotLanguageContext context, Object receiver, String message, String useToCheck) { String polyglotMessage; if (useToCheck != null) { polyglotMessage = String.format("Unsupported operation Value.%s for %s. 
You can ensure that the operation is supported using Value.%s.", message, getValueInfo(context, receiver), useToCheck); } else { polyglotMessage = String.format("Unsupported operation Value.%s for %s.", message, getValueInfo(context, receiver)); } return PolyglotEngineException.unsupported(polyglotMessage); } private static final int CHARACTER_LIMIT = 140; private static final InteropLibrary INTEROP = InteropLibrary.getFactory().getUncached(); @TruffleBoundary static String getValueInfo(Object languageContext, Object receiver) { PolyglotContextImpl context = languageContext != null ? ((PolyglotLanguageContext) languageContext).context : null; return getValueInfo(context, receiver); } @TruffleBoundary static String getValueInfo(PolyglotContextImpl context, Object receiver) { if (context == null) { return receiver.toString(); } else if (receiver == null) { assert false : "receiver should never be null"; return "null"; } PolyglotLanguage displayLanguage = EngineAccessor.EngineImpl.findObjectLanguage(context.engine, receiver); Object view; if (displayLanguage == null) { displayLanguage = context.engine.hostLanguage; view = context.getHostContext().getLanguageView(receiver); } else { view = receiver; } String valueToString; String metaObjectToString = UNKNOWN; try { InteropLibrary uncached = InteropLibrary.getFactory().getUncached(view); if (uncached.hasMetaObject(view)) { Object qualifiedName = INTEROP.getMetaQualifiedName(uncached.getMetaObject(view)); metaObjectToString = truncateString(INTEROP.asString(qualifiedName), CHARACTER_LIMIT); } valueToString = truncateString(INTEROP.asString(uncached.toDisplayString(view)), CHARACTER_LIMIT); } catch (UnsupportedMessageException e) { throw shouldNotReachHere(e); } String languageName = null; boolean hideType = false; if (displayLanguage.isHost()) { languageName = "Java"; // java is our host language for now // hide meta objects of null if (UNKNOWN.equals(metaObjectToString) && INTEROP.isNull(receiver)) { hideType = true; } 
} else { languageName = displayLanguage.getName(); } if (hideType) { return String.format("'%s'(language: %s)", valueToString, languageName); } else { return String.format("'%s'(language: %s, type: %s)", valueToString, languageName, metaObjectToString); } } private static String truncateString(String s, int i) { if (s.length() > i) { return s.substring(0, i - TRUNCATION_SUFFIX.length()) + TRUNCATION_SUFFIX; } else { return s; } } @TruffleBoundary protected static RuntimeException nullCoercion(Object languageContext, Object receiver, Class<?> targetType, String message, String useToCheck) { assert isEnteredOrNull(languageContext); String valueInfo = getValueInfo(languageContext, receiver); throw PolyglotEngineException.nullPointer(String.format("Cannot convert null value %s to Java type '%s' using Value.%s. " + "You can ensure that the operation is supported using Value.%s.", valueInfo, targetType, message, useToCheck)); } static boolean isEnteredOrNull(Object languageContext) { if (languageContext == null) { return true; } PolyglotContextImpl context = ((PolyglotLanguageContext) languageContext).context; return !context.engine.needsEnter(context); } @TruffleBoundary protected static RuntimeException cannotConvert(Object languageContext, Object receiver, Class<?> targetType, String message, String useToCheck, String reason) { assert isEnteredOrNull(languageContext); String valueInfo = getValueInfo(languageContext, receiver); String targetTypeString = ""; if (targetType != null) { targetTypeString = String.format("to Java type '%s'", targetType.getTypeName()); } throw PolyglotEngineException.classCast( String.format("Cannot convert %s %s using Value.%s: %s You can ensure that the value can be converted using Value.%s.", valueInfo, targetTypeString, message, reason, useToCheck)); } @TruffleBoundary protected static RuntimeException invalidArrayIndex(PolyglotLanguageContext context, Object receiver, long index) { String message = String.format("Invalid array index %s 
for array %s.", index, getValueInfo(context, receiver)); throw PolyglotEngineException.arrayIndexOutOfBounds(message); } @TruffleBoundary protected static RuntimeException invalidArrayValue(PolyglotLanguageContext context, Object receiver, long identifier, Object value) { throw PolyglotEngineException.classCast( String.format("Invalid array value %s for array %s and index %s.", getValueInfo(context, value), getValueInfo(context, receiver), identifier)); } @TruffleBoundary protected static RuntimeException nonReadableMemberKey(PolyglotLanguageContext context, Object receiver, String identifier) { String message = String.format("Non readable or non-existent member key '%s' for object %s.", identifier, getValueInfo(context, receiver)); throw PolyglotEngineException.unsupported(message); } @TruffleBoundary protected static RuntimeException nonWritableMemberKey(PolyglotLanguageContext context, Object receiver, String identifier) { String message = String.format("Non writable or non-existent member key '%s' for object %s.", identifier, getValueInfo(context, receiver)); throw PolyglotEngineException.unsupported(message); } @TruffleBoundary protected static RuntimeException nonRemovableMemberKey(PolyglotLanguageContext context, Object receiver, String identifier) { String message = String.format("Non removable or non-existent member key '%s' for object %s.", identifier, getValueInfo(context, receiver)); throw PolyglotEngineException.unsupported(message); } @TruffleBoundary protected static RuntimeException invalidMemberValue(PolyglotLanguageContext context, Object receiver, String identifier, Object value) { String message = String.format("Invalid member value %s for object %s and member key '%s'.", getValueInfo(context, value), getValueInfo(context, receiver), identifier); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException stopIteration(PolyglotLanguageContext context, Object receiver) { String message = 
String.format("Iteration was stopped for iterator %s.", getValueInfo(context, receiver));
        throw PolyglotEngineException.noSuchElement(message);
    }

    @TruffleBoundary
    protected static RuntimeException nonReadableIteratorElement() {
        throw PolyglotEngineException.unsupported("Iterator element is not readable.");
    }

    @TruffleBoundary
    protected static RuntimeException invalidHashValue(PolyglotLanguageContext context, Object receiver, Object key, Object value) {
        String message = String.format("Invalid hash value %s for object %s and hash key %s.",
                        getValueInfo(context, value),
                        getValueInfo(context, receiver),
                        getValueInfo(context, key));
        throw PolyglotEngineException.illegalArgument(message);
    }

    @TruffleBoundary
    protected static RuntimeException invalidExecuteArgumentType(PolyglotLanguageContext context, Object receiver, UnsupportedTypeException e) {
        String originalMessage = e.getMessage() == null ? "" : e.getMessage() + " ";
        String[] formattedArgs = formatArgs(context, e.getSuppliedValues());
        throw PolyglotEngineException.illegalArgument(String.format("Invalid argument when executing %s. %sProvided arguments: %s.",
                        getValueInfo(context, receiver),
                        originalMessage,
                        Arrays.asList(formattedArgs)));
    }

    @TruffleBoundary
    protected static RuntimeException invalidInvokeArgumentType(PolyglotLanguageContext context, Object receiver, String member, UnsupportedTypeException e) {
        // Append a trailing space (as in invalidExecuteArgumentType above) so the
        // "%sProvided arguments" format does not fuse the message onto "Provided".
        String originalMessage = e.getMessage() == null ? "" : e.getMessage() + " ";
        String[] formattedArgs = formatArgs(context, e.getSuppliedValues());
        String message = String.format("Invalid argument when invoking '%s' on %s. 
%sProvided arguments: %s.", member, getValueInfo(context, receiver), originalMessage, Arrays.asList(formattedArgs)); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException invalidInstantiateArgumentType(PolyglotLanguageContext context, Object receiver, Object[] arguments) { String[] formattedArgs = formatArgs(context, arguments); String message = String.format("Invalid argument when instantiating %s with arguments %s.", getValueInfo(context, receiver), Arrays.asList(formattedArgs)); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException invalidInstantiateArity(PolyglotLanguageContext context, Object receiver, Object[] arguments, int expectedMin, int expectedMax, int actual) { String[] formattedArgs = formatArgs(context, arguments); String message = String.format("Invalid argument count when instantiating %s with arguments %s. %s", getValueInfo(context, receiver), Arrays.asList(formattedArgs), formatExpectedArguments(expectedMin, expectedMax, actual)); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException invalidExecuteArity(PolyglotLanguageContext context, Object receiver, Object[] arguments, int expectedMin, int expectedMax, int actual) { String[] formattedArgs = formatArgs(context, arguments); String message = String.format("Invalid argument count when executing %s with arguments %s. 
%s", getValueInfo(context, receiver), Arrays.asList(formattedArgs), formatExpectedArguments(expectedMin, expectedMax, actual)); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException invalidInvokeArity(PolyglotLanguageContext context, Object receiver, String member, Object[] arguments, int expectedMin, int expectedMax, int actual) { String[] formattedArgs = formatArgs(context, arguments); String message = String.format("Invalid argument count when invoking '%s' on %s with arguments %s. %s", member, getValueInfo(context, receiver), Arrays.asList(formattedArgs), formatExpectedArguments(expectedMin, expectedMax, actual)); throw PolyglotEngineException.illegalArgument(message); } static String formatExpectedArguments(int expectedMinArity, int expectedMaxArity, int actualArity) { String actual; if (actualArity < 0) { actual = "unknown"; } else { actual = String.valueOf(actualArity); } String expected; if (expectedMinArity == expectedMaxArity) { expected = String.valueOf(expectedMinArity); } else { if (expectedMaxArity < 0) { expected = expectedMinArity + "+"; } else { expected = expectedMinArity + "-" + expectedMaxArity; } } return String.format("Expected %s argument(s) but got %s.", expected, actual); } private static String[] formatArgs(Object languageContext, Object[] arguments) { String[] formattedArgs = new String[arguments.length]; for (int i = 0; i < arguments.length; i++) { formattedArgs[i] = getValueInfo(languageContext, arguments[i]); } return formattedArgs; } @Override public final String toString(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = null; if (context != null) { PolyglotContextImpl.State localContextState = context.context.state; if (localContextState.isInvalidOrClosed()) { /* * Performance improvement for closed or invalid to avoid recurring exceptions. 
             */
            return "Error in toString(): Context is invalid or closed.";
        }
    }
    try {
        prev = PolyglotValueDispatch.hostEnter(context);
    } catch (Throwable t) {
        // Enter might fail if the context was closed.
        // Interop can no longer be called.
        return String.format("Error in toString(): Could not enter context: %s.", t.getMessage());
    }
    try {
        return toStringImpl(context, receiver);
    } catch (Throwable e) {
        throw PolyglotValueDispatch.guestToHostException(context, e, true);
    } finally {
        try {
            PolyglotValueDispatch.hostLeave(languageContext, prev);
        } catch (Throwable t) {
            // Ignore errors on leave; we cannot propagate them.
        }
    }
}

String toStringImpl(PolyglotLanguageContext context, Object receiver) throws AssertionError {
    return PolyglotWrapper.toStringImpl(context, receiver);
}

@Override
public Object getSourceLocation(Object languageContext, Object receiver) {
    if (languageContext == null) {
        return null;
    }
    PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
    Object prev = hostEnter(context);
    try {
        InteropLibrary lib = InteropLibrary.getFactory().getUncached(receiver);
        com.oracle.truffle.api.source.SourceSection result = null;
        if (lib.hasSourceLocation(receiver)) {
            try {
                result = lib.getSourceLocation(receiver);
            } catch (UnsupportedMessageException e) {
                // No source location available.
            }
        }
        if (result == null) {
            return null;
        }
        return PolyglotImpl.getPolyglotSourceSection(impl, result);
    } catch (Throwable e) {
        throw guestToHostException(context, e, true);
    } finally {
        hostLeave(context, prev);
    }
}

@Override
public boolean isMetaObject(Object languageContext, Object receiver) {
    return false;
}

@Override
public boolean equalsImpl(Object languageContext, Object receiver, Object obj) {
    if (receiver == obj) {
        return true;
    }
    return PolyglotWrapper.equals(languageContext, receiver, obj);
}

@Override
public int hashCodeImpl(Object languageContext, Object receiver) {
    return PolyglotWrapper.hashCode(languageContext, receiver);
}

@Override
public boolean isMetaInstance(Object languageContext, Object receiver, Object instance) {
    PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
    Object prev = hostEnter(context);
    try {
        throw unsupported(context, receiver, "isMetaInstance(Object)", "isMetaObject()");
    } catch (Throwable e) {
        throw guestToHostException(context, e, true);
    } finally {
        hostLeave(context, prev);
    }
}

@Override
public String getMetaQualifiedName(Object languageContext, Object receiver) {
    PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
    Object prev = hostEnter(context);
    try {
        throw unsupported(context, receiver, "getMetaQualifiedName()", "isMetaObject()");
    } catch (Throwable e) {
        throw guestToHostException(context, e, true);
    } finally {
        hostLeave(context, prev);
    }
}

@Override
public String getMetaSimpleName(Object languageContext, Object receiver) {
    PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
    Object prev = hostEnter(context);
    try {
        throw unsupported(context, receiver, "getMetaSimpleName()", "isMetaObject()");
    } catch (Throwable e) {
        throw guestToHostException(context, e, true);
    } finally {
        hostLeave(context, prev);
    }
}

@Override
public boolean hasMetaParents(Object languageContext, Object receiver) {
    return false;
}

@Override
public Object getMetaParents(Object languageContext, Object receiver) {
    PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
    Object prev = hostEnter(context);
    try {
        throw unsupported(context, receiver, "getMetaParents()", "hasMetaParents()");
    } catch (Throwable e) {
        throw guestToHostException(context, e, true);
    } finally {
        hostLeave(context, prev);
    }
}

static CallTarget createTarget(InteropNode root) {
    CallTarget target = root.getCallTarget();
    Class<?>[] types = root.getArgumentTypes();
    if (types != null) {
        RUNTIME.initializeProfile(target, types);
    }
    return target;
}

static PolyglotValueDispatch createInteropValue(PolyglotLanguageInstance languageInstance, TruffleObject receiver, Class<?> receiverType) {
    return new InteropValue(languageInstance.getImpl(), languageInstance, receiver, receiverType);
}

static PolyglotValueDispatch createHostNull(PolyglotImpl polyglot) {
    return new HostNull(polyglot);
}

static void createDefaultValues(PolyglotImpl polyglot, PolyglotLanguageInstance languageInstance, Map<Class<?>, PolyglotValueDispatch> valueCache) {
    addDefaultValue(polyglot, languageInstance, valueCache, false);
    addDefaultValue(polyglot, languageInstance, valueCache, "");
    addDefaultValue(polyglot, languageInstance, valueCache, TruffleString.fromJavaStringUncached("", TruffleString.Encoding.UTF_16));
    addDefaultValue(polyglot, languageInstance, valueCache, 'a');
    addDefaultValue(polyglot, languageInstance, valueCache, (byte) 0);
    addDefaultValue(polyglot, languageInstance, valueCache, (short) 0);
    addDefaultValue(polyglot, languageInstance, valueCache, 0);
    addDefaultValue(polyglot, languageInstance, valueCache, 0L);
    addDefaultValue(polyglot, languageInstance, valueCache, 0F);
    addDefaultValue(polyglot, languageInstance, valueCache, 0D);
}

static void addDefaultValue(PolyglotImpl polyglot, PolyglotLanguageInstance languageInstance, Map<Class<?>, PolyglotValueDispatch> valueCache, Object primitive) {
    valueCache.put(primitive.getClass(), new PrimitiveValue(polyglot, languageInstance, primitive));
}

static final class PrimitiveValue extends PolyglotValueDispatch {

    private final InteropLibrary interop;
    private final PolyglotLanguage language;

    private PrimitiveValue(PolyglotImpl impl, PolyglotLanguageInstance instance, Object primitiveValue) {
        super(impl, instance);
        /*
         * No caching needed for primitives. We do that to avoid the overhead of crossing a
         * Truffle call boundary.
         */
        this.interop = InteropLibrary.getFactory().getUncached(primitiveValue);
        this.language = instance != null ? instance.language : null;
    }

    @Override
    public boolean isString(Object languageContext, Object receiver) {
        return interop.isString(receiver);
    }

    @Override
    public boolean isBoolean(Object languageContext, Object receiver) {
        return interop.isBoolean(receiver);
    }

    @Override
    public boolean asBoolean(Object languageContext, Object receiver) {
        try {
            return interop.asBoolean(receiver);
        } catch (UnsupportedMessageException e) {
            return super.asBoolean(languageContext, receiver);
        }
    }

    @Override
    public String asString(Object languageContext, Object receiver) {
        try {
            return interop.asString(receiver);
        } catch (UnsupportedMessageException e) {
            return super.asString(languageContext, receiver);
        }
    }

    @Override
    public boolean isNumber(Object languageContext, Object receiver) {
        return interop.isNumber(receiver);
    }

    @Override
    public boolean fitsInByte(Object languageContext, Object receiver) {
        return interop.fitsInByte(receiver);
    }

    @Override
    public boolean fitsInShort(Object languageContext, Object receiver) {
        return interop.fitsInShort(receiver);
    }

    @Override
    public boolean fitsInInt(Object languageContext, Object receiver) {
        return interop.fitsInInt(receiver);
    }

    @Override
    public boolean fitsInLong(Object languageContext, Object receiver) {
        return interop.fitsInLong(receiver);
    }

    @Override
    public boolean fitsInBigInteger(Object languageContext, Object receiver) {
        return interop.fitsInBigInteger(receiver);
    }

    @Override
    public boolean fitsInFloat(Object languageContext, Object receiver) {
        return interop.fitsInFloat(receiver);
    }

    @Override
    public boolean fitsInDouble(Object languageContext, Object receiver) {
        return interop.fitsInDouble(receiver);
    }

    @Override
    public byte asByte(Object languageContext, Object receiver) {
        try {
            return interop.asByte(receiver);
        } catch (UnsupportedMessageException e) {
            return super.asByte(languageContext, receiver);
        }
    }

    @Override
    public short asShort(Object languageContext, Object receiver) {
        try {
            return interop.asShort(receiver);
        } catch (UnsupportedMessageException e) {
            return super.asShort(languageContext, receiver);
        }
    }

    @Override
    public int asInt(Object languageContext, Object receiver) {
        try {
            return interop.asInt(receiver);
        } catch (UnsupportedMessageException e) {
            return super.asInt(languageContext, receiver);
        }
    }

    @Override
    public long asLong(Object languageContext, Object receiver) {
        try {
            return interop.asLong(receiver);
        } catch (UnsupportedMessageException e) {
            return super.asLong(languageContext, receiver);
        }
    }

    @Override
    public BigInteger asBigInteger(Object languageContext, Object receiver) {
        try {
            return interop.asBigInteger(receiver);
        } catch (UnsupportedMessageException e) {
            return super.asBigInteger(languageContext, receiver);
        }
    }

    @Override
    public float asFloat(Object languageContext, Object receiver) {
        try {
            return interop.asFloat(receiver);
        } catch (UnsupportedMessageException e) {
            return super.asFloat(languageContext, receiver);
        }
    }

    @Override
    public double asDouble(Object languageContext, Object receiver) {
        try {
            return interop.asDouble(receiver);
        } catch (UnsupportedMessageException e) {
            return super.asDouble(languageContext, receiver);
        }
    }

    @SuppressWarnings("unchecked")
    @Override
    public <T> T asClass(Object languageContext, Object receiver, Class<T> targetType) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            if (context != null) {
                return language.engine.host.toHostType(null, null, context.context.getHostContextImpl(), receiver, targetType, targetType);
            } else {
                // disconnected primitive value
                T result = (T) EngineAccessor.HOST.convertPrimitiveLossy(receiver, targetType);
                if (result == null) {
                    throw PolyglotInteropErrors.cannotConvertPrimitive(null, receiver, targetType);
                }
                return result;
            }
        } catch (Throwable e) {
            throw guestToHostException((context), e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @SuppressWarnings("unchecked")
    @Override
    public <T> T asTypeLiteral(Object languageContext, Object receiver, Class<T> rawType, Type type) {
        return asClass(languageContext, receiver, rawType);
    }

    @Override
    public Object getMetaObjectImpl(PolyglotLanguageContext languageContext, Object receiver) {
        return super.getMetaObjectImpl(languageContext, getLanguageView(languageContext, receiver));
    }

    @Override
    String toStringImpl(PolyglotLanguageContext context, Object receiver) throws AssertionError {
        return super.toStringImpl(context, getLanguageView(context, receiver));
    }

    private Object getLanguageView(Object languageContext, Object receiver) {
        if (languageContext == null || language == null) {
            return receiver;
        }
        PolyglotContextImpl c = ((PolyglotLanguageContext) languageContext).context;
        return c.getContext(language).getLanguageViewNoCheck(receiver);
    }
}

private static final class HostNull extends PolyglotValueDispatch {

    private final PolyglotImpl polyglot;

    HostNull(PolyglotImpl polyglot) {
        super(polyglot, null);
        this.polyglot = polyglot;
    }

    @Override
    public boolean isNull(Object languageContext, Object receiver) {
        return true;
    }

    @SuppressWarnings("unchecked")
    @Override
    public <T> T asClass(Object languageContext, Object receiver, Class<T> targetType) {
        if (targetType == polyglot.getAPIAccess().getValueClass()) {
            return (T) polyglot.hostNull;
        }
        return null;
    }

    @SuppressWarnings("cast")
    @Override
    public <T> T asTypeLiteral(Object languageContext, Object receiver, Class<T> rawType, Type type) {
        return asClass(languageContext, receiver, rawType);
    }
}

abstract static class InteropNode extends HostToGuestRootNode {

    protected static final int CACHE_LIMIT = 5;

    protected final InteropValue polyglot;

    protected abstract String getOperationName();

    protected InteropNode(InteropValue polyglot) {
        super(polyglot.languageInstance);
        this.polyglot = polyglot;
    }

    protected abstract Class<?>[] getArgumentTypes();

    @Override
    protected Class<? extends Object> getReceiverType() {
        return polyglot.receiverType;
    }

    @Override
    public final String getName() {
        return "org.graalvm.polyglot.Value<" + polyglot.receiverType.getSimpleName() + ">." + getOperationName();
    }

    protected final AbstractPolyglotImpl getImpl() {
        return polyglot.impl;
    }

    @Override
    public final String toString() {
        return getName();
    }
}

/**
 * Host value implementation used when a Value needs to be created but no context is available.
 * If a context is available the normal interop value implementation is used.
 */
static class HostValue extends PolyglotValueDispatch {

    HostValue(PolyglotImpl polyglot) {
        super(polyglot, null);
    }

    @Override
    public boolean isHostObject(Object languageContext, Object receiver) {
        return EngineAccessor.HOST.isDisconnectedHostObject(receiver);
    }

    @Override
    public Object asHostObject(Object languageContext, Object receiver) {
        return EngineAccessor.HOST.unboxDisconnectedHostObject(receiver);
    }

    @Override
    public boolean isProxyObject(Object languageContext, Object receiver) {
        return EngineAccessor.HOST.isDisconnectedHostProxy(receiver);
    }

    @Override
    public Object asProxyObject(Object languageContext, Object receiver) {
        return EngineAccessor.HOST.unboxDisconnectedHostProxy(receiver);
    }

    @Override
    public <T> T asClass(Object languageContext, Object receiver, Class<T> targetType) {
        return asImpl(languageContext, receiver, targetType);
    }

    @SuppressWarnings("cast")
    @Override
    public <T> T asTypeLiteral(Object languageContext, Object receiver, Class<T> rawType, Type type) {
        return asImpl(languageContext, receiver, (Class<T>) rawType);
    }

    <T> T asImpl(Object languageContext, Object receiver, Class<T> targetType) {
        Object hostValue;
        if (isProxyObject(languageContext, receiver)) {
            hostValue = asProxyObject(languageContext, receiver);
        } else if (isHostObject(languageContext, receiver)) {
            hostValue = asHostObject(languageContext, receiver);
        } else {
            throw new ClassCastException();
        }
        return targetType.cast(hostValue);
    }
}

/**
 * Must be kept in sync with the HostObject and the HostToTypeNode implementation.
 */
static final class BigIntegerHostValue extends HostValue {

    BigIntegerHostValue(PolyglotImpl polyglot) {
        super(polyglot);
    }

    @Override
    public boolean isNumber(Object context, Object receiver) {
        assert asHostObject(context, receiver) instanceof BigInteger;
        return true;
    }

    @Override
    public boolean fitsInByte(Object context, Object receiver) {
        assert asHostObject(context, receiver) instanceof BigInteger;
        return ((BigInteger) asHostObject(context, receiver)).bitLength() < Byte.SIZE;
    }

    @Override
    public boolean fitsInShort(Object context, Object receiver) {
        assert asHostObject(context, receiver) instanceof BigInteger;
        return ((BigInteger) asHostObject(context, receiver)).bitLength() < Short.SIZE;
    }

    @Override
    public boolean fitsInInt(Object context, Object receiver) {
        assert asHostObject(context, receiver) instanceof BigInteger;
        return ((BigInteger) asHostObject(context, receiver)).bitLength() < Integer.SIZE;
    }

    @Override
    public boolean fitsInLong(Object context, Object receiver) {
        assert asHostObject(context, receiver) instanceof BigInteger;
        return ((BigInteger) asHostObject(context, receiver)).bitLength() < Long.SIZE;
    }

    @Override
    public boolean fitsInBigInteger(Object context, Object receiver) {
        assert asHostObject(context, receiver) instanceof BigInteger;
        return true;
    }

    @Override
    public boolean fitsInFloat(Object context, Object receiver) {
        assert asHostObject(context, receiver) instanceof BigInteger;
        return EngineAccessor.HOST.bigIntegerFitsInFloat((BigInteger) asHostObject(context, receiver));
    }

    @Override
    public boolean fitsInDouble(Object context, Object receiver) {
        assert asHostObject(context, receiver) instanceof BigInteger;
        return EngineAccessor.HOST.bigIntegerFitsInDouble((BigInteger) asHostObject(context, receiver));
    }

    @Override
    public byte asByte(Object languageContext, Object receiver) {
        assert asHostObject(languageContext, receiver) instanceof BigInteger;
        try {
            return ((BigInteger) asHostObject(languageContext, receiver)).byteValueExact();
        } catch (ArithmeticException e) {
            // throws an unsupported error.
            return super.asByte(languageContext, receiver);
        }
    }

    @Override
    public short asShort(Object languageContext, Object receiver) {
        assert asHostObject(languageContext, receiver) instanceof BigInteger;
        try {
            return ((BigInteger) asHostObject(languageContext, receiver)).shortValueExact();
        } catch (ArithmeticException e) {
            // throws an unsupported error.
            return super.asShort(languageContext, receiver);
        }
    }

    @Override
    public int asInt(Object languageContext, Object receiver) {
        assert asHostObject(languageContext, receiver) instanceof BigInteger;
        try {
            return ((BigInteger) asHostObject(languageContext, receiver)).intValueExact();
        } catch (ArithmeticException e) {
            // throws an unsupported error.
            return super.asInt(languageContext, receiver);
        }
    }

    @Override
    public long asLong(Object languageContext, Object receiver) {
        assert asHostObject(languageContext, receiver) instanceof BigInteger;
        try {
            return ((BigInteger) asHostObject(languageContext, receiver)).longValueExact();
        } catch (ArithmeticException e) {
            // throws an unsupported error.
            return super.asLong(languageContext, receiver);
        }
    }

    @Override
    public BigInteger asBigInteger(Object languageContext, Object receiver) {
        assert asHostObject(languageContext, receiver) instanceof BigInteger;
        return ((BigInteger) asHostObject(languageContext, receiver));
    }

    @Override
    public float asFloat(Object languageContext, Object receiver) {
        assert asHostObject(languageContext, receiver) instanceof BigInteger;
        if (fitsInFloat(languageContext, receiver)) {
            return ((BigInteger) asHostObject(languageContext, receiver)).floatValue();
        } else {
            // throws an unsupported error.
            return super.asFloat(languageContext, receiver);
        }
    }

    @Override
    public double asDouble(Object languageContext, Object receiver) {
        assert asHostObject(languageContext, receiver) instanceof BigInteger;
        if (fitsInDouble(languageContext, receiver)) {
            return ((BigInteger) asHostObject(languageContext, receiver)).doubleValue();
        } else {
            // throws an unsupported error.
            return super.asDouble(languageContext, receiver);
        }
    }

    @SuppressWarnings("unchecked")
    @Override
    <T> T asImpl(Object languageContext, Object receiver, Class<T> targetType) {
        assert asHostObject(languageContext, receiver) instanceof BigInteger;
        if (targetType == byte.class || targetType == Byte.class) {
            return (T) (Byte) asByte(languageContext, receiver);
        } else if (targetType == short.class || targetType == Short.class) {
            return (T) (Short) asShort(languageContext, receiver);
        } else if (targetType == int.class || targetType == Integer.class) {
            return (T) (Integer) asInt(languageContext, receiver);
        } else if (targetType == long.class || targetType == Long.class) {
            return (T) (Long) asLong(languageContext, receiver);
        } else if (targetType == float.class || targetType == Float.class) {
            return (T) (Float) asFloat(languageContext, receiver);
        } else if (targetType == double.class || targetType == Double.class) {
            return (T) (Double) asDouble(languageContext, receiver);
        } else if (targetType == BigInteger.class || targetType == Number.class) {
            return (T) asBigInteger(languageContext, receiver);
        } else if (targetType == char.class || targetType == Character.class) {
            if (fitsInInt(languageContext, receiver)) {
                int v = asInt(languageContext, receiver);
                if (v >= 0 && v < 65536) {
                    return (T) (Character) (char) v;
                }
            }
        }
        return super.asImpl(languageContext, receiver, targetType);
    }
}

@SuppressWarnings("unused")
static final class InteropValue extends PolyglotValueDispatch {

    final CallTarget isNativePointer;
    final CallTarget asNativePointer;
    final CallTarget hasArrayElements;
    final CallTarget getArrayElement;
    final CallTarget setArrayElement;
    final CallTarget removeArrayElement;
    final CallTarget getArraySize;
    final CallTarget hasBufferElements;
    final CallTarget isBufferWritable;
    final CallTarget getBufferSize;
    final CallTarget readBufferByte;
    final CallTarget readBuffer;
    final CallTarget writeBufferByte;
    final CallTarget readBufferShort;
    final CallTarget writeBufferShort;
    final CallTarget readBufferInt;
    final CallTarget writeBufferInt;
    final CallTarget readBufferLong;
    final CallTarget writeBufferLong;
    final CallTarget readBufferFloat;
    final CallTarget writeBufferFloat;
    final CallTarget readBufferDouble;
    final CallTarget writeBufferDouble;
    final CallTarget hasMembers;
    final CallTarget hasMember;
    final CallTarget getMember;
    final CallTarget putMember;
    final CallTarget removeMember;
    final CallTarget isNull;
    final CallTarget canExecute;
    final CallTarget execute;
    final CallTarget canInstantiate;
    final CallTarget newInstance;
    final CallTarget executeNoArgs;
    final CallTarget executeVoid;
    final CallTarget executeVoidNoArgs;
    final CallTarget canInvoke;
    final CallTarget invoke;
    final CallTarget invokeNoArgs;
    final CallTarget getMemberKeys;
    final CallTarget isDate;
    final CallTarget asDate;
    final CallTarget isTime;
    final CallTarget asTime;
    final CallTarget isTimeZone;
    final CallTarget asTimeZone;
    final CallTarget asInstant;
    final CallTarget isDuration;
    final CallTarget asDuration;
    final CallTarget isException;
    final CallTarget throwException;
    final CallTarget isMetaObject;
    final CallTarget isMetaInstance;
    final CallTarget getMetaQualifiedName;
    final CallTarget getMetaSimpleName;
    final CallTarget hasMetaParents;
    final CallTarget getMetaParents;
    final CallTarget hasIterator;
    final CallTarget getIterator;
    final CallTarget isIterator;
    final CallTarget hasIteratorNextElement;
    final CallTarget getIteratorNextElement;
    final CallTarget hasHashEntries;
    final CallTarget getHashSize;
    final CallTarget hasHashEntry;
    final CallTarget getHashValue;
    final CallTarget getHashValueOrDefault;
    final CallTarget putHashEntry;
    final CallTarget removeHashEntry;
    final CallTarget getHashEntriesIterator;
    final CallTarget getHashKeysIterator;
    final CallTarget getHashValuesIterator;
    final CallTarget asClassLiteral;
    final CallTarget asTypeLiteral;

    final Class<?> receiverType;

    InteropValue(PolyglotImpl polyglot, PolyglotLanguageInstance languageInstance, Object receiverObject, Class<?> receiverType) {
        super(polyglot, languageInstance);
        this.receiverType = receiverType;
        this.asClassLiteral = createTarget(AsClassLiteralNodeGen.create(this));
        this.asTypeLiteral = createTarget(AsTypeLiteralNodeGen.create(this));
        this.isNativePointer = createTarget(IsNativePointerNodeGen.create(this));
        this.asNativePointer = createTarget(AsNativePointerNodeGen.create(this));
        this.hasArrayElements = createTarget(HasArrayElementsNodeGen.create(this));
        this.getArrayElement = createTarget(GetArrayElementNodeGen.create(this));
        this.setArrayElement = createTarget(SetArrayElementNodeGen.create(this));
        this.removeArrayElement = createTarget(RemoveArrayElementNodeGen.create(this));
        this.getArraySize = createTarget(GetArraySizeNodeGen.create(this));
        this.hasBufferElements = createTarget(HasBufferElementsNodeGen.create(this));
        this.isBufferWritable = createTarget(IsBufferWritableNodeGen.create(this));
        this.getBufferSize = createTarget(GetBufferSizeNodeGen.create(this));
        this.readBufferByte = createTarget(ReadBufferByteNodeGen.create(this));
        this.readBuffer = createTarget(ReadBufferNodeGen.create(this));
        this.writeBufferByte = createTarget(WriteBufferByteNodeGen.create(this));
        this.readBufferShort = createTarget(ReadBufferShortNodeGen.create(this));
        this.writeBufferShort = createTarget(WriteBufferShortNodeGen.create(this));
        this.readBufferInt = createTarget(ReadBufferIntNodeGen.create(this));
        this.writeBufferInt = createTarget(WriteBufferIntNodeGen.create(this));
        this.readBufferLong = createTarget(ReadBufferLongNodeGen.create(this));
        this.writeBufferLong = createTarget(WriteBufferLongNodeGen.create(this));
this.readBufferFloat = createTarget(ReadBufferFloatNodeGen.create(this)); this.writeBufferFloat = createTarget(WriteBufferFloatNodeGen.create(this)); this.readBufferDouble = createTarget(ReadBufferDoubleNodeGen.create(this)); this.writeBufferDouble = createTarget(WriteBufferDoubleNodeGen.create(this)); this.hasMember = createTarget(HasMemberNodeGen.create(this)); this.getMember = createTarget(GetMemberNodeGen.create(this)); this.putMember = createTarget(PutMemberNodeGen.create(this)); this.removeMember = createTarget(RemoveMemberNodeGen.create(this)); this.isNull = createTarget(IsNullNodeGen.create(this)); this.execute = createTarget(ExecuteNodeGen.create(this)); this.executeNoArgs = createTarget(ExecuteNoArgsNodeGen.create(this)); this.executeVoid = createTarget(ExecuteVoidNodeGen.create(this)); this.executeVoidNoArgs = createTarget(ExecuteVoidNoArgsNodeGen.create(this)); this.newInstance = createTarget(NewInstanceNodeGen.create(this)); this.canInstantiate = createTarget(CanInstantiateNodeGen.create(this)); this.canExecute = createTarget(CanExecuteNodeGen.create(this)); this.canInvoke = createTarget(CanInvokeNodeGen.create(this)); this.invoke = createTarget(InvokeNodeGen.create(this)); this.invokeNoArgs = createTarget(InvokeNoArgsNodeGen.create(this)); this.hasMembers = createTarget(HasMembersNodeGen.create(this)); this.getMemberKeys = createTarget(GetMemberKeysNodeGen.create(this)); this.isDate = createTarget(IsDateNodeGen.create(this)); this.asDate = createTarget(AsDateNodeGen.create(this)); this.isTime = createTarget(IsTimeNodeGen.create(this)); this.asTime = createTarget(AsTimeNodeGen.create(this)); this.isTimeZone = createTarget(IsTimeZoneNodeGen.create(this)); this.asTimeZone = createTarget(AsTimeZoneNodeGen.create(this)); this.asInstant = createTarget(AsInstantNodeGen.create(this)); this.isDuration = createTarget(IsDurationNodeGen.create(this)); this.asDuration = createTarget(AsDurationNodeGen.create(this)); this.isException = 
createTarget(IsExceptionNodeGen.create(this)); this.throwException = createTarget(ThrowExceptionNodeGen.create(this)); this.isMetaObject = createTarget(IsMetaObjectNodeGen.create(this)); this.isMetaInstance = createTarget(IsMetaInstanceNodeGen.create(this)); this.getMetaQualifiedName = createTarget(GetMetaQualifiedNameNodeGen.create(this)); this.getMetaSimpleName = createTarget(GetMetaSimpleNameNodeGen.create(this)); this.hasMetaParents = createTarget(PolyglotValueDispatchFactory.InteropValueFactory.HasMetaParentsNodeGen.create(this)); this.getMetaParents = createTarget(PolyglotValueDispatchFactory.InteropValueFactory.GetMetaParentsNodeGen.create(this)); this.hasIterator = createTarget(HasIteratorNodeGen.create(this)); this.getIterator = createTarget(PolyglotValueDispatchFactory.InteropValueFactory.GetIteratorNodeGen.create(this)); this.isIterator = createTarget(PolyglotValueDispatchFactory.InteropValueFactory.IsIteratorNodeGen.create(this)); this.hasIteratorNextElement = createTarget(HasIteratorNextElementNodeGen.create(this)); this.getIteratorNextElement = createTarget(GetIteratorNextElementNodeGen.create(this)); this.hasHashEntries = createTarget(HasHashEntriesNodeGen.create(this)); this.getHashSize = createTarget(GetHashSizeNodeGen.create(this)); this.hasHashEntry = createTarget(HasHashEntryNodeGen.create(this)); this.getHashValue = createTarget(GetHashValueNodeGen.create(this)); this.getHashValueOrDefault = createTarget(GetHashValueOrDefaultNodeGen.create(this)); this.putHashEntry = createTarget(PutHashEntryNodeGen.create(this)); this.removeHashEntry = createTarget(RemoveHashEntryNodeGen.create(this)); this.getHashEntriesIterator = createTarget(GetHashEntriesIteratorNodeGen.create(this)); this.getHashKeysIterator = createTarget(GetHashKeysIteratorNodeGen.create(this)); this.getHashValuesIterator = createTarget(GetHashValuesIteratorNodeGen.create(this)); } @SuppressWarnings("unchecked") @Override public <T> T asClass(Object languageContext, Object receiver, 
Class<T> targetType) { return (T) RUNTIME.callProfiled(this.asClassLiteral, languageContext, receiver, targetType); } @SuppressWarnings("unchecked") @Override public <T> T asTypeLiteral(Object languageContext, Object receiver, Class<T> rawType, Type type) { return (T) RUNTIME.callProfiled(this.asTypeLiteral, languageContext, receiver, rawType, type); } @Override public boolean isNativePointer(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isNativePointer, languageContext, receiver); } @Override public boolean hasArrayElements(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.hasArrayElements, languageContext, receiver); } @Override public Object getArrayElement(Object languageContext, Object receiver, long index) { return RUNTIME.callProfiled(this.getArrayElement, languageContext, receiver, index); } @Override public void setArrayElement(Object languageContext, Object receiver, long index, Object value) { RUNTIME.callProfiled(this.setArrayElement, languageContext, receiver, index, value); } @Override public boolean removeArrayElement(Object languageContext, Object receiver, long index) { return (boolean) RUNTIME.callProfiled(this.removeArrayElement, languageContext, receiver, index); } @Override public long getArraySize(Object languageContext, Object receiver) { return (long) RUNTIME.callProfiled(this.getArraySize, languageContext, receiver); } // region Buffer Methods @Override public boolean hasBufferElements(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.hasBufferElements, languageContext, receiver); } @Override public boolean isBufferWritable(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isBufferWritable, languageContext, receiver); } @Override public long getBufferSize(Object languageContext, Object receiver) throws UnsupportedOperationException { return (long) RUNTIME.callProfiled(this.getBufferSize, 
languageContext, receiver); } @Override public byte readBufferByte(Object languageContext, Object receiver, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (byte) RUNTIME.callProfiled(this.readBufferByte, languageContext, receiver, byteOffset); } @Override public void readBuffer(Object languageContext, Object receiver, long byteOffset, byte[] destination, int destinationOffset, int length) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.readBuffer, languageContext, receiver, byteOffset, destination, destinationOffset, length); } @Override public void writeBufferByte(Object languageContext, Object receiver, long byteOffset, byte value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferByte, languageContext, receiver, byteOffset, value); } @Override public short readBufferShort(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (short) RUNTIME.callProfiled(this.readBufferShort, languageContext, receiver, order, byteOffset); } @Override public void writeBufferShort(Object languageContext, Object receiver, ByteOrder order, long byteOffset, short value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferShort, languageContext, receiver, order, byteOffset, value); } @Override public int readBufferInt(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (int) RUNTIME.callProfiled(this.readBufferInt, languageContext, receiver, order, byteOffset); } @Override public void writeBufferInt(Object languageContext, Object receiver, ByteOrder order, long byteOffset, int value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferInt, languageContext, 
receiver, order, byteOffset, value); } @Override public long readBufferLong(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (long) RUNTIME.callProfiled(this.readBufferLong, languageContext, receiver, order, byteOffset); } @Override public void writeBufferLong(Object languageContext, Object receiver, ByteOrder order, long byteOffset, long value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferLong, languageContext, receiver, order, byteOffset, value); } @Override public float readBufferFloat(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (float) RUNTIME.callProfiled(this.readBufferFloat, languageContext, receiver, order, byteOffset); } @Override public void writeBufferFloat(Object languageContext, Object receiver, ByteOrder order, long byteOffset, float value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferFloat, languageContext, receiver, order, byteOffset, value); } @Override public double readBufferDouble(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (double) RUNTIME.callProfiled(this.readBufferDouble, languageContext, receiver, order, byteOffset); } @Override public void writeBufferDouble(Object languageContext, Object receiver, ByteOrder order, long byteOffset, double value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferDouble, languageContext, receiver, order, byteOffset, value); } // endregion @Override public boolean hasMembers(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.hasMembers, languageContext, receiver); } @Override public Object getMember(Object 
languageContext, Object receiver, String key) { return RUNTIME.callProfiled(this.getMember, languageContext, receiver, key); } @Override public boolean hasMember(Object languageContext, Object receiver, String key) { return (boolean) RUNTIME.callProfiled(this.hasMember, languageContext, receiver, key); } @Override public void putMember(Object languageContext, Object receiver, String key, Object member) { RUNTIME.callProfiled(this.putMember, languageContext, receiver, key, member); } @Override public boolean removeMember(Object languageContext, Object receiver, String key) { return (boolean) RUNTIME.callProfiled(this.removeMember, languageContext, receiver, key); } @Override public Set<String> getMemberKeys(Object languageContext, Object receiver) { Object keys = RUNTIME.callProfiled(this.getMemberKeys, languageContext, receiver); if (keys == null) { // unsupported return Collections.emptySet(); } return new MemberSet(this.getEngine().getAPIAccess(), languageContext, receiver, keys); } @Override public long asNativePointer(Object languageContext, Object receiver) { return (long) RUNTIME.callProfiled(this.asNativePointer, languageContext, receiver); } @Override public boolean isDate(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isDate, languageContext, receiver); } @Override public LocalDate asDate(Object languageContext, Object receiver) { return (LocalDate) RUNTIME.callProfiled(this.asDate, languageContext, receiver); } @Override public boolean isTime(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isTime, languageContext, receiver); } @Override public LocalTime asTime(Object languageContext, Object receiver) { return (LocalTime) RUNTIME.callProfiled(this.asTime, languageContext, receiver); } @Override public boolean isTimeZone(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isTimeZone, languageContext, receiver); } @Override public ZoneId 
asTimeZone(Object languageContext, Object receiver) { return (ZoneId) RUNTIME.callProfiled(this.asTimeZone, languageContext, receiver); } @Override public Instant asInstant(Object languageContext, Object receiver) { return (Instant) RUNTIME.callProfiled(this.asInstant, languageContext, receiver); } @Override public boolean isDuration(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isDuration, languageContext, receiver); } @Override public Duration asDuration(Object languageContext, Object receiver) { return (Duration) RUNTIME.callProfiled(this.asDuration, languageContext, receiver); } @Override public boolean isHostObject(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return getEngine().host.isHostObject(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } private PolyglotEngineImpl getEngine() { return languageInstance.sharing.engine; } @Override public boolean isProxyObject(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return getEngine().host.isHostProxy(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public Object asProxyObject(Object languageContext, Object receiver) { if (isProxyObject(languageContext, receiver)) { return getEngine().host.unboxProxyObject(receiver); } else { return super.asProxyObject(languageContext, receiver); } } @Override public Object asHostObject(Object languageContext, Object receiver) { if (isHostObject(languageContext, receiver)) { return getEngine().host.unboxHostObject(receiver); } else { return super.asHostObject(languageContext, receiver); } } @Override public boolean isNull(Object languageContext, Object receiver) { 
return (boolean) RUNTIME.callProfiled(this.isNull, languageContext, receiver);
    }

    @Override
    public boolean canExecute(Object languageContext, Object receiver) {
        return (boolean) RUNTIME.callProfiled(this.canExecute, languageContext, receiver);
    }

    @Override
    public void executeVoid(Object languageContext, Object receiver, Object[] arguments) {
        RUNTIME.callProfiled(this.executeVoid, languageContext, receiver, arguments);
    }

    @Override
    public void executeVoid(Object languageContext, Object receiver) {
        RUNTIME.callProfiled(this.executeVoidNoArgs, languageContext, receiver);
    }

    @Override
    public Object execute(Object languageContext, Object receiver, Object[] arguments) {
        return RUNTIME.callProfiled(this.execute, languageContext, receiver, arguments);
    }

    @Override
    public Object execute(Object languageContext, Object receiver) {
        return RUNTIME.callProfiled(this.executeNoArgs, languageContext, receiver);
    }

    @Override
    public boolean canInstantiate(Object languageContext, Object receiver) {
        return (boolean) RUNTIME.callProfiled(this.canInstantiate, languageContext, receiver);
    }

    @Override
    public Object newInstance(Object languageContext, Object receiver, Object[] arguments) {
        return RUNTIME.callProfiled(this.newInstance, languageContext, receiver, arguments);
    }

    @Override
    public boolean canInvoke(Object languageContext, String identifier, Object receiver) {
        return (boolean) RUNTIME.callProfiled(this.canInvoke, languageContext, receiver, identifier);
    }

    @Override
    public Object invoke(Object languageContext, Object receiver, String identifier, Object[] arguments) {
        return RUNTIME.callProfiled(this.invoke, languageContext, receiver, identifier, arguments);
    }

    @Override
    public Object invoke(Object languageContext, Object receiver, String identifier) {
        return RUNTIME.callProfiled(this.invokeNoArgs, languageContext, receiver, identifier);
    }

    @Override
    public boolean isException(Object languageContext, Object receiver) {
        return (boolean) RUNTIME.callProfiled(this.isException, languageContext,
receiver);
    }

    @Override
    public RuntimeException throwException(Object languageContext, Object receiver) {
        RUNTIME.callProfiled(this.throwException, languageContext, receiver);
        throw super.throwException(languageContext, receiver);
    }

    @Override
    public boolean isNumber(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            return UNCACHED_INTEROP.isNumber(receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean fitsInByte(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            return UNCACHED_INTEROP.fitsInByte(receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public byte asByte(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            try {
                return UNCACHED_INTEROP.asByte(receiver);
            } catch (UnsupportedMessageException e) {
                return asByteUnsupported(context, receiver);
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean isString(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            return UNCACHED_INTEROP.isString(receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public String asString(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            try {
                if (isNullUncached(receiver)) {
                    return null;
                }
                return UNCACHED_INTEROP.asString(receiver);
            } catch
(UnsupportedMessageException e) {
                return asStringUnsupported(context, receiver);
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean fitsInInt(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            return UNCACHED_INTEROP.fitsInInt(receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public int asInt(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            try {
                return UNCACHED_INTEROP.asInt(receiver);
            } catch (UnsupportedMessageException e) {
                return asIntUnsupported(context, receiver);
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean isBoolean(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            return InteropLibrary.getFactory().getUncached().isBoolean(receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean asBoolean(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            try {
                return InteropLibrary.getFactory().getUncached().asBoolean(receiver);
            } catch (UnsupportedMessageException e) {
                return asBooleanUnsupported(context, receiver);
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean fitsInFloat(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
return InteropLibrary.getFactory().getUncached().fitsInFloat(receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public float asFloat(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            try {
                return UNCACHED_INTEROP.asFloat(receiver);
            } catch (UnsupportedMessageException e) {
                return asFloatUnsupported(context, receiver);
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean fitsInDouble(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            return UNCACHED_INTEROP.fitsInDouble(receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public double asDouble(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            try {
                return UNCACHED_INTEROP.asDouble(receiver);
            } catch (UnsupportedMessageException e) {
                return asDoubleUnsupported(context, receiver);
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean fitsInLong(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            return UNCACHED_INTEROP.fitsInLong(receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public long asLong(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            try {
                return UNCACHED_INTEROP.asLong(receiver);
            } catch
(UnsupportedMessageException e) {
                return asLongUnsupported(context, receiver);
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean fitsInBigInteger(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            return UNCACHED_INTEROP.fitsInBigInteger(receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public BigInteger asBigInteger(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            try {
                return UNCACHED_INTEROP.asBigInteger(receiver);
            } catch (UnsupportedMessageException e) {
                return asBigIntegerUnsupported(context, receiver);
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean fitsInShort(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            return UNCACHED_INTEROP.fitsInShort(receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public short asShort(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object c = hostEnter(context);
        try {
            try {
                return UNCACHED_INTEROP.asShort(receiver);
            } catch (UnsupportedMessageException e) {
                return asShortUnsupported(context, receiver);
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, c);
        }
    }

    @Override
    public boolean isMetaObject(Object languageContext, Object receiver) {
        return (boolean) RUNTIME.callProfiled(this.isMetaObject, languageContext, receiver);
    }

    @Override
    public boolean
isMetaInstance(Object languageContext, Object receiver, Object instance) {
        return (boolean) RUNTIME.callProfiled(this.isMetaInstance, languageContext, receiver, instance);
    }

    @Override
    public String getMetaQualifiedName(Object languageContext, Object receiver) {
        return (String) RUNTIME.callProfiled(this.getMetaQualifiedName, languageContext, receiver);
    }

    @Override
    public String getMetaSimpleName(Object languageContext, Object receiver) {
        return (String) RUNTIME.callProfiled(this.getMetaSimpleName, languageContext, receiver);
    }

    @Override
    public boolean hasMetaParents(Object languageContext, Object receiver) {
        return (boolean) RUNTIME.callProfiled(this.hasMetaParents, languageContext, receiver);
    }

    @Override
    public Object getMetaParents(Object languageContext, Object receiver) {
        return RUNTIME.callProfiled(this.getMetaParents, languageContext, receiver);
    }

    @Override
    public boolean hasIterator(Object languageContext, Object receiver) {
        return (boolean) RUNTIME.callProfiled(this.hasIterator, languageContext, receiver);
    }

    @Override
    public Object getIterator(Object languageContext, Object receiver) {
        return RUNTIME.callProfiled(this.getIterator, languageContext, receiver);
    }

    @Override
    public boolean isIterator(Object languageContext, Object receiver) {
        return (boolean) RUNTIME.callProfiled(this.isIterator, languageContext, receiver);
    }

    @Override
    public boolean hasIteratorNextElement(Object languageContext, Object receiver) {
        return (boolean) RUNTIME.callProfiled(this.hasIteratorNextElement, languageContext, receiver);
    }

    @Override
    public Object getIteratorNextElement(Object languageContext, Object receiver) {
        return RUNTIME.callProfiled(this.getIteratorNextElement, languageContext, receiver);
    }

    @Override
    public boolean hasHashEntries(Object languageContext, Object receiver) {
        return (boolean) RUNTIME.callProfiled(this.hasHashEntries, languageContext, receiver);
    }

    @Override
    public long getHashSize(Object languageContext, Object receiver) {
        return (long)
RUNTIME.callProfiled(this.getHashSize, languageContext, receiver);
    }

    @Override
    public boolean hasHashEntry(Object languageContext, Object receiver, Object key) {
        return (boolean) RUNTIME.callProfiled(this.hasHashEntry, languageContext, receiver, key);
    }

    @Override
    public Object getHashValue(Object languageContext, Object receiver, Object key) {
        return RUNTIME.callProfiled(this.getHashValue, languageContext, receiver, key);
    }

    @Override
    public Object getHashValueOrDefault(Object languageContext, Object receiver, Object key, Object defaultValue) {
        return RUNTIME.callProfiled(this.getHashValueOrDefault, languageContext, receiver, key, defaultValue);
    }

    @Override
    public void putHashEntry(Object languageContext, Object receiver, Object key, Object value) {
        RUNTIME.callProfiled(this.putHashEntry, languageContext, receiver, key, value);
    }

    @Override
    public boolean removeHashEntry(Object languageContext, Object receiver, Object key) {
        return (boolean) RUNTIME.callProfiled(this.removeHashEntry, languageContext, receiver, key);
    }

    @Override
    public Object getHashEntriesIterator(Object languageContext, Object receiver) {
        return RUNTIME.callProfiled(this.getHashEntriesIterator, languageContext, receiver);
    }

    @Override
    public Object getHashKeysIterator(Object languageContext, Object receiver) {
        return RUNTIME.callProfiled(this.getHashKeysIterator, languageContext, receiver);
    }

    @Override
    public Object getHashValuesIterator(Object languageContext, Object receiver) {
        return RUNTIME.callProfiled(this.getHashValuesIterator, languageContext, receiver);
    }

    private final class MemberSet extends AbstractSet<String> {

        private final APIAccess api;
        private final Object context;
        private final Object receiver;
        private final Object keys;
        private int cachedSize = -1;

        MemberSet(APIAccess api, Object languageContext, Object receiver, Object keys) {
            this.api = api;
            this.context = languageContext;
            this.receiver = receiver;
            this.keys = keys;
        }

        @Override
        public boolean contains(Object o) {
            if (!(o
instanceof String)) {
                return false;
            }
            return hasMember(this.context, receiver, (String) o);
        }

        @Override
        public Iterator<String> iterator() {
            return new Iterator<String>() {

                int index = 0;

                public boolean hasNext() {
                    return index < size();
                }

                public String next() {
                    if (index >= size()) {
                        throw new NoSuchElementException();
                    }
                    Object arrayElement = api.callValueGetArrayElement(keys, index++);
                    if (api.callValueIsString(arrayElement)) {
                        return api.callValueAsString(arrayElement);
                    } else {
                        return null;
                    }
                }
            };
        }

        @Override
        public int size() {
            int size = this.cachedSize;
            if (size != -1) {
                return size;
            }
            cachedSize = size = (int) api.callValueGetArraySize(keys);
            return size;
        }
    }

    abstract static class IsDateNode extends InteropNode {

        protected IsDateNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "isDate";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args,
                        @Bind("this") Node node, //
                        @CachedLibrary("receiver") InteropLibrary objects) {
            return objects.isDate(receiver);
        }
    }

    abstract static class AsDateNode extends InteropNode {

        protected AsDateNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "asDate";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary objects,
                        @Cached InlinedBranchProfile unsupported) {
            try {
                return objects.asDate(receiver);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                if (objects.isNull(receiver)) {
                    return null;
                } else {
                    throw cannotConvert(context,
receiver, null, "asDate()", "isDate()", "Value does not contain date information.");
                }
            }
        }
    }

    abstract static class IsTimeNode extends InteropNode {

        protected IsTimeNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "isTime";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args,
                        @Bind("this") Node node, //
                        @CachedLibrary("receiver") InteropLibrary objects) {
            return objects.isTime(receiver);
        }
    }

    abstract static class AsTimeNode extends InteropNode {

        protected AsTimeNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "asTime";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary objects,
                        @Cached InlinedBranchProfile unsupported) {
            try {
                return objects.asTime(receiver);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                if (objects.isNull(receiver)) {
                    return null;
                } else {
                    throw cannotConvert(context, receiver, null, "asTime()", "isTime()", "Value does not contain time information.");
                }
            }
        }
    }

    abstract static class IsTimeZoneNode extends InteropNode {

        protected IsTimeZoneNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "isTimeZone";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args,
                        @Bind("this") Node node, //
@CachedLibrary("receiver") InteropLibrary objects) {
            return objects.isTimeZone(receiver);
        }
    }

    abstract static class AsTimeZoneNode extends InteropNode {

        protected AsTimeZoneNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "asTimeZone";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary objects,
                        @Cached InlinedBranchProfile unsupported) {
            try {
                return objects.asTimeZone(receiver);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                if (objects.isNull(receiver)) {
                    return null;
                } else {
                    throw cannotConvert(context, receiver, null, "asTimeZone()", "isTimeZone()", "Value does not contain time-zone information.");
                }
            }
        }
    }

    abstract static class IsDurationNode extends InteropNode {

        protected IsDurationNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "isDuration";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args,
                        @Bind("this") Node node, //
                        @CachedLibrary("receiver") InteropLibrary objects) {
            return objects.isDuration(receiver);
        }
    }

    abstract static class AsDurationNode extends InteropNode {

        protected AsDurationNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "asDuration";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[]
args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary objects,
                        @Cached InlinedBranchProfile unsupported) {
            try {
                return objects.asDuration(receiver);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                if (objects.isNull(receiver)) {
                    return null;
                } else {
                    throw cannotConvert(context, receiver, null, "asDuration()", "isDuration()", "Value does not contain duration information.");
                }
            }
        }
    }

    abstract static class AsInstantNode extends InteropNode {

        protected AsInstantNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "getInstant";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary objects,
                        @Cached InlinedBranchProfile unsupported) {
            try {
                return objects.asInstant(receiver);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                if (objects.isNull(receiver)) {
                    return null;
                } else {
                    throw cannotConvert(context, receiver, null, "asInstant()", "hasInstant()", "Value does not contain instant information.");
                }
            }
        }
    }

    abstract static class AsClassLiteralNode extends InteropNode {

        protected AsClassLiteralNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Class.class};
        }

        @Override
        protected String getOperationName() {
            return "as";
        }

        @Specialization
        final Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args,
                        @Cached PolyglotToHostNode toHost) {
            return toHost.execute(this, context, receiver, (Class<?>) args[ARGUMENT_OFFSET], null);
        }
    }

    abstract static class AsTypeLiteralNode extends InteropNode {

        protected AsTypeLiteralNode(InteropValue interop) {
super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Class.class, Type.class};
        }

        @Override
        protected String getOperationName() {
            return "as";
        }

        @Specialization
        final Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args,
                        @Cached PolyglotToHostNode toHost) {
            Class<?> rawType = (Class<?>) args[ARGUMENT_OFFSET];
            Type type = (Type) args[ARGUMENT_OFFSET + 1];
            return toHost.execute(this, context, receiver, rawType, type);
        }
    }

    abstract static class IsNativePointerNode extends InteropNode {

        protected IsNativePointerNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "isNativePointer";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args,
                        @Bind("this") Node node, //
                        @CachedLibrary("receiver") InteropLibrary natives) {
            return natives.isPointer(receiver);
        }
    }

    abstract static class AsNativePointerNode extends InteropNode {

        protected AsNativePointerNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "asNativePointer";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary natives,
                        @Cached InlinedBranchProfile unsupported) {
            try {
                return natives.asPointer(receiver);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                throw cannotConvert(context, receiver, long.class, "asNativePointer()", "isNativeObject()", "Value cannot be converted to a native pointer.");
            }
        }
    }

    abstract
static class HasArrayElementsNode extends InteropNode {

        protected HasArrayElementsNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "hasArrayElements";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args,
                        @Bind("this") Node node, //
                        @CachedLibrary("receiver") InteropLibrary arrays) {
            return arrays.hasArrayElements(receiver);
        }
    }

    abstract static class GetMemberKeysNode extends InteropNode {

        protected GetMemberKeysNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "getMemberKeys";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary objects,
                        @Cached ToHostValueNode toHost,
                        @Cached InlinedBranchProfile unsupported) {
            try {
                return toHost.execute(node, context, objects.getMembers(receiver));
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                return null;
            }
        }
    }

    abstract static class GetArrayElementNode extends InteropNode {

        protected GetArrayElementNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Long.class};
        }

        @Override
        protected String getOperationName() {
            return "getArrayElement";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary arrays,
                        @Cached ToHostValueNode toHost,
                        @Cached InlinedBranchProfile
unsupported,
                        @Cached InlinedBranchProfile unknown) {
            long index = (long) args[ARGUMENT_OFFSET];
            try {
                return toHost.execute(node, context, arrays.readArrayElement(receiver, index));
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                return getArrayElementUnsupported(context, receiver);
            } catch (InvalidArrayIndexException e) {
                unknown.enter(node);
                throw invalidArrayIndex(context, receiver, index);
            }
        }
    }

    abstract static class SetArrayElementNode extends InteropNode {

        protected SetArrayElementNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Long.class, null};
        }

        @Override
        protected String getOperationName() {
            return "setArrayElement";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary arrays,
                        @Cached(inline = true) ToGuestValueNode toGuestValue,
                        @Cached InlinedBranchProfile unsupported,
                        @Cached InlinedBranchProfile invalidIndex,
                        @Cached InlinedBranchProfile invalidValue) {
            long index = (long) args[ARGUMENT_OFFSET];
            Object value = toGuestValue.execute(node, context, args[ARGUMENT_OFFSET + 1]);
            try {
                arrays.writeArrayElement(receiver, index, value);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                setArrayElementUnsupported(context, receiver);
            } catch (UnsupportedTypeException e) {
                invalidValue.enter(node);
                throw invalidArrayValue(context, receiver, index, value);
            } catch (InvalidArrayIndexException e) {
                invalidIndex.enter(node);
                throw invalidArrayIndex(context, receiver, index);
            }
            return null;
        }
    }

    abstract static class RemoveArrayElementNode extends InteropNode {

        protected RemoveArrayElementNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType,
Long.class};
        }

        @Override
        protected String getOperationName() {
            return "removeArrayElement";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary arrays,
                        @Cached InlinedBranchProfile unsupported,
                        @Cached InlinedBranchProfile invalidIndex) {
            long index = (long) args[ARGUMENT_OFFSET];
            Object value;
            try {
                arrays.removeArrayElement(receiver, index);
                value = Boolean.TRUE;
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                throw removeArrayElementUnsupported(context, receiver);
            } catch (InvalidArrayIndexException e) {
                invalidIndex.enter(node);
                throw invalidArrayIndex(context, receiver, index);
            }
            return value;
        }
    }

    abstract static class GetArraySizeNode extends InteropNode {

        protected GetArraySizeNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "getArraySize";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary arrays,
                        @Cached InlinedBranchProfile unsupported) {
            try {
                return arrays.getArraySize(receiver);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                return getArraySizeUnsupported(context, receiver);
            }
        }
    }

    // region Buffer nodes

    abstract static class HasBufferElementsNode extends InteropNode {

        protected HasBufferElementsNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "hasBufferElements";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context,
Object receiver, Object[] args,
                        @Bind("this") Node node, //
                        @CachedLibrary("receiver") InteropLibrary buffers) {
            return buffers.hasBufferElements(receiver);
        }
    }

    abstract static class IsBufferWritableNode extends InteropNode {

        protected IsBufferWritableNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "isBufferWritable";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary buffers,
                        @Cached InlinedBranchProfile unsupported) {
            try {
                return buffers.isBufferWritable(receiver);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                throw getBufferSizeUnsupported(context, receiver);
            }
        }
    }

    abstract static class GetBufferSizeNode extends InteropNode {

        protected GetBufferSizeNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
        }

        @Override
        protected String getOperationName() {
            return "getBufferSize";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary buffers,
                        @Cached InlinedBranchProfile unsupported) {
            try {
                return buffers.getBufferSize(receiver);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                throw getBufferSizeUnsupported(context, receiver);
            }
        }
    }

    abstract static class ReadBufferByteNode extends InteropNode {

        protected ReadBufferByteNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Long.class};
        }

        @Override
        protected String
getOperationName() {
            return "readBufferByte";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary buffers,
                        @Cached ToHostValueNode toHost,
                        @Cached InlinedBranchProfile unsupported,
                        @Cached InlinedBranchProfile unknown) {
            final long byteOffset = (long) args[ARGUMENT_OFFSET];
            try {
                return buffers.readBufferByte(receiver, byteOffset);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                throw readBufferByteUnsupported(context, receiver);
            } catch (InvalidBufferOffsetException e) {
                unknown.enter(node);
                throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
            }
        }
    }

    abstract static class ReadBufferNode extends InteropNode {

        protected ReadBufferNode(InteropValue interop) {
            super(interop);
        }

        @Override
        protected Class<?>[] getArgumentTypes() {
            return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Long.class, byte[].class, Integer.class, Integer.class};
        }

        @Override
        protected String getOperationName() {
            return "readBufferInto";
        }

        @Specialization(limit = "CACHE_LIMIT")
        static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                        @Bind("this") Node node,
                        @CachedLibrary("receiver") InteropLibrary buffers,
                        @Cached ToHostValueNode toHost,
                        @Cached InlinedBranchProfile unsupported,
                        @Cached InlinedBranchProfile unknown) {
            final long bufferByteOffset = (long) args[ARGUMENT_OFFSET];
            final byte[] destination = (byte[]) args[ARGUMENT_OFFSET + 1];
            final int destinationOffset = (int) args[ARGUMENT_OFFSET + 2];
            final int byteLength = (int) args[ARGUMENT_OFFSET + 3];
            try {
                buffers.readBuffer(receiver, bufferByteOffset, destination, destinationOffset, byteLength);
            } catch (UnsupportedMessageException e) {
                unsupported.enter(node);
                throw readBufferUnsupported(context, receiver);
            } catch (InvalidBufferOffsetException e) {
                unknown.enter(node);
                throw
invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } return null; } } abstract static class WriteBufferByteNode extends InteropNode { protected WriteBufferByteNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Long.class, Byte.class}; } @Override protected String getOperationName() { return "writeBufferByte"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidIndex, @Cached InlinedBranchProfile invalidValue) { final long byteOffset = (long) args[ARGUMENT_OFFSET]; final byte value = (byte) args[ARGUMENT_OFFSET + 1]; try { buffers.writeBufferByte(receiver, byteOffset, value); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (buffers.hasBufferElements(receiver)) { throw unsupported(context, receiver, "writeBufferByte()", "isBufferWritable()"); } throw writeBufferByteUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { invalidIndex.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } return null; } } abstract static class ReadBufferShortNode extends InteropNode { protected ReadBufferShortNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class}; } @Override protected String getOperationName() { return "readBufferShort"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached ToHostValueNode toHost, @Cached 
                    InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile unknown) {
        final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET];
        final long byteOffset = (long) args[ARGUMENT_OFFSET + 1];
        try {
            return buffers.readBufferShort(receiver, order, byteOffset);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw readBufferShortUnsupported(context, receiver);
        } catch (InvalidBufferOffsetException e) {
            unknown.enter(node);
            throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
        }
    }
}

abstract static class WriteBufferShortNode extends InteropNode {

    protected WriteBufferShortNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class, Short.class};
    }

    @Override
    protected String getOperationName() {
        return "writeBufferShort";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                    @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary buffers,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile invalidIndex,
                    @Cached InlinedBranchProfile invalidValue) {
        final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET];
        final long byteOffset = (long) args[ARGUMENT_OFFSET + 1];
        final short value = (short) args[ARGUMENT_OFFSET + 2];
        try {
            buffers.writeBufferShort(receiver, order, byteOffset, value);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            if (buffers.hasBufferElements(receiver)) {
                throw unsupported(context, receiver, "writeBufferShort()", "isBufferWritable()");
            }
            throw writeBufferShortUnsupported(context, receiver);
        } catch (InvalidBufferOffsetException e) {
            invalidIndex.enter(node);
            throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
        }
        return null;
    }
}

abstract static class ReadBufferIntNode extends InteropNode {

    protected ReadBufferIntNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class};
    }

    @Override
    protected String getOperationName() {
        return "readBufferInt";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                    @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary buffers,
                    @Cached ToHostValueNode toHost,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile unknown) {
        final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET];
        final long byteOffset = (long) args[ARGUMENT_OFFSET + 1];
        try {
            return buffers.readBufferInt(receiver, order, byteOffset);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw readBufferIntUnsupported(context, receiver);
        } catch (InvalidBufferOffsetException e) {
            unknown.enter(node);
            throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
        }
    }
}

abstract static class WriteBufferIntNode extends InteropNode {

    protected WriteBufferIntNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class, Integer.class};
    }

    @Override
    protected String getOperationName() {
        return "writeBufferInt";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                    @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary buffers,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile invalidIndex,
                    @Cached InlinedBranchProfile invalidValue) {
        final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET];
        final long byteOffset = (long) args[ARGUMENT_OFFSET + 1];
        final int value = (int) args[ARGUMENT_OFFSET + 2];
        try {
            buffers.writeBufferInt(receiver, order, byteOffset, value);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            if (buffers.hasBufferElements(receiver)) {
                throw unsupported(context, receiver, "writeBufferInt()", "isBufferWritable()");
            }
            throw writeBufferIntUnsupported(context, receiver);
        } catch (InvalidBufferOffsetException e) {
            invalidIndex.enter(node);
            throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
        }
        return null;
    }
}

abstract static class ReadBufferLongNode extends InteropNode {

    protected ReadBufferLongNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class};
    }

    @Override
    protected String getOperationName() {
        return "readBufferLong";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                    @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary buffers,
                    @Cached ToHostValueNode toHost,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile unknown) {
        final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET];
        final long byteOffset = (long) args[ARGUMENT_OFFSET + 1];
        try {
            return buffers.readBufferLong(receiver, order, byteOffset);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw readBufferLongUnsupported(context, receiver);
        } catch (InvalidBufferOffsetException e) {
            unknown.enter(node);
            throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
        }
    }
}

abstract static class WriteBufferLongNode extends InteropNode {

    protected WriteBufferLongNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class, Long.class};
    }

    @Override
    protected String getOperationName() {
        return "writeBufferLong";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                    @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary buffers,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile invalidIndex,
                    @Cached InlinedBranchProfile invalidValue) {
        final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET];
        final long byteOffset = (long) args[ARGUMENT_OFFSET + 1];
        final long value = (long) args[ARGUMENT_OFFSET + 2];
        try {
            buffers.writeBufferLong(receiver, order, byteOffset, value);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            if (buffers.hasBufferElements(receiver)) {
                throw unsupported(context, receiver, "writeBufferLong()", "isBufferWritable()");
            }
            throw writeBufferLongUnsupported(context, receiver);
        } catch (InvalidBufferOffsetException e) {
            invalidIndex.enter(node);
            throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
        }
        return null;
    }
}

abstract static class ReadBufferFloatNode extends InteropNode {

    protected ReadBufferFloatNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class};
    }

    @Override
    protected String getOperationName() {
        return "readBufferFloat";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                    @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary buffers,
                    @Cached ToHostValueNode toHost,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile unknown) {
        final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET];
        final long byteOffset = (long) args[ARGUMENT_OFFSET + 1];
        try {
            return buffers.readBufferFloat(receiver, order, byteOffset);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw readBufferFloatUnsupported(context, receiver);
        } catch (InvalidBufferOffsetException e) {
            unknown.enter(node);
            throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
        }
    }
}

abstract static class WriteBufferFloatNode extends InteropNode {

    protected WriteBufferFloatNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class, Float.class};
    }

    @Override
    protected String getOperationName() {
        return "writeBufferFloat";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                    @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary buffers,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile invalidIndex) {
        final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET];
        final long byteOffset = (long) args[ARGUMENT_OFFSET + 1];
        final float value = (float) args[ARGUMENT_OFFSET + 2];
        try {
            buffers.writeBufferFloat(receiver, order, byteOffset, value);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            if (buffers.hasBufferElements(receiver)) {
                throw unsupported(context, receiver, "writeBufferFloat()", "isBufferWritable()");
            }
            throw writeBufferFloatUnsupported(context, receiver);
        } catch (InvalidBufferOffsetException e) {
            invalidIndex.enter(node);
            throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
        }
        return null;
    }
}

abstract static class ReadBufferDoubleNode extends InteropNode {

    protected ReadBufferDoubleNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class};
    }

    @Override
    protected String getOperationName() {
        return "readBufferDouble";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, //
                    @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary buffers,
                    @Cached ToHostValueNode toHost,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile unknown) {
        final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET];
        final long byteOffset = (long) args[ARGUMENT_OFFSET + 1];
        try {
            return buffers.readBufferDouble(receiver, order, byteOffset);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw readBufferDoubleUnsupported(context, receiver);
        } catch (InvalidBufferOffsetException e) {
            unknown.enter(node);
            throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
        }
    }
}

abstract static class WriteBufferDoubleNode extends InteropNode {

    protected WriteBufferDoubleNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class, Double.class};
    }

    @Override
    protected String getOperationName() {
        return "writeBufferDouble";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary buffers,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile invalidIndex,
                    @Cached InlinedBranchProfile invalidValue) {
        final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET];
        final long byteOffset = (long) args[ARGUMENT_OFFSET + 1];
        final double value = (double) args[ARGUMENT_OFFSET + 2];
        try {
            buffers.writeBufferDouble(receiver, order, byteOffset, value);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            if (buffers.hasBufferElements(receiver)) {
                throw unsupported(context, receiver, "writeBufferDouble()", "isBufferWritable()");
            }
            throw writeBufferDoubleUnsupported(context, receiver);
        } catch (InvalidBufferOffsetException e) {
            invalidIndex.enter(node);
            throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength());
        }
        return null;
    }
}

// endregion
abstract static class GetMemberNode extends InteropNode {

    protected GetMemberNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class};
    }

    @Override
    protected String getOperationName() {
        return "getMember";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary objects,
                    @Cached ToHostValueNode toHost,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile unknown) {
        String key = (String) args[ARGUMENT_OFFSET];
        Object value;
        try {
            assert key != null : "should be handled already";
            value = toHost.execute(node, context, objects.readMember(receiver, key));
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            if (objects.hasMembers(receiver)) {
                value = null;
            } else {
                return getMemberUnsupported(context, receiver, key);
            }
        } catch (UnknownIdentifierException e) {
            unknown.enter(node);
            value = null;
        }
        return value;
    }
}

abstract static class PutMemberNode extends InteropNode {

    protected PutMemberNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected String getOperationName() {
        return "putMember";
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class, null};
    }

    @Specialization
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node,
                    @CachedLibrary(limit = "CACHE_LIMIT") InteropLibrary objects,
                    @Cached(inline = true) ToGuestValueNode toGuestValue,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile invalidValue,
                    @Cached InlinedBranchProfile unknown) {
        String key = (String) args[ARGUMENT_OFFSET];
        Object originalValue = args[ARGUMENT_OFFSET + 1];
        Object value = toGuestValue.execute(node, context, originalValue);
        assert key != null;
        try {
            objects.writeMember(receiver, key, value);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw putMemberUnsupported(context, receiver);
        } catch (UnknownIdentifierException e) {
            unknown.enter(node);
            throw nonWritableMemberKey(context, receiver, key);
        } catch (UnsupportedTypeException e) {
            invalidValue.enter(node);
            throw invalidMemberValue(context, receiver, key, value);
        }
        return null;
    }
}

abstract static class RemoveMemberNode extends InteropNode {

    protected RemoveMemberNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected String getOperationName() {
        return "removeMember";
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class};
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary objects,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile unknown) {
        String key = (String) args[ARGUMENT_OFFSET];
        Object value;
        try {
            assert key != null : "should be handled already";
            objects.removeMember(receiver, key);
            value = Boolean.TRUE;
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            if (!objects.hasMembers(receiver)) {
                throw removeMemberUnsupported(context, receiver);
            } else if (objects.isMemberExisting(receiver, key)) {
                throw nonRemovableMemberKey(context, receiver, key);
            } else {
                value = Boolean.FALSE;
            }
        } catch (UnknownIdentifierException e) {
            unknown.enter(node);
            if (objects.isMemberExisting(receiver, key)) {
                throw nonRemovableMemberKey(context, receiver, key);
            } else {
                value = Boolean.FALSE;
            }
        }
        return value;
    }
}

abstract static class IsNullNode extends InteropNode {

    protected IsNullNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "isNull";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary values) {
        return values.isNull(receiver);
    }
}

abstract static class HasMembersNode extends InteropNode {

    protected HasMembersNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "hasMembers";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary objects) {
        return objects.hasMembers(receiver);
    }
}

private abstract static class AbstractMemberInfoNode extends InteropNode {

    protected AbstractMemberInfoNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected final Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class};
    }
}

abstract static class HasMemberNode extends AbstractMemberInfoNode {

    protected HasMemberNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected String getOperationName() {
        return "hasMember";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary objects) {
        String key = (String) args[ARGUMENT_OFFSET];
        return objects.isMemberExisting(receiver, key);
    }
}

abstract static class CanInvokeNode extends AbstractMemberInfoNode {

    protected CanInvokeNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected String getOperationName() {
        return "canInvoke";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object
    doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary objects) {
        String key = (String) args[ARGUMENT_OFFSET];
        return objects.isMemberInvocable(receiver, key);
    }
}

abstract static class CanExecuteNode extends InteropNode {

    protected CanExecuteNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected String getOperationName() {
        return "canExecute";
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary executables) {
        return executables.isExecutable(receiver);
    }
}

abstract static class CanInstantiateNode extends InteropNode {

    protected CanInstantiateNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "canInstantiate";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary instantiables) {
        return instantiables.isInstantiable(receiver);
    }
}

@ImportStatic(InteropNode.class)
@GenerateInline(true)
@GenerateCached(false)
abstract static class SharedExecuteNode extends Node {

    protected abstract Object executeShared(Node node, PolyglotLanguageContext context, Object receiver, Object[] args);

    @Specialization(limit = "CACHE_LIMIT")
    protected static Object doDefault(Node node, PolyglotLanguageContext context, Object receiver, Object[] args,
                    @CachedLibrary("receiver") InteropLibrary executables,
                    @Cached ToGuestValuesNode toGuestValues,
                    @Cached ToHostValueNode toHostValue,
                    @Cached InlinedBranchProfile invalidArgument,
                    @Cached InlinedBranchProfile arity,
                    @Cached InlinedBranchProfile unsupported) {
        Object[] guestArguments = toGuestValues.execute(node, context, args);
        try {
            return executables.execute(receiver, guestArguments);
        } catch (UnsupportedTypeException e) {
            invalidArgument.enter(node);
            throw invalidExecuteArgumentType(context, receiver, e);
        } catch (ArityException e) {
            arity.enter(node);
            throw invalidExecuteArity(context, receiver, guestArguments, e.getExpectedMinArity(), e.getExpectedMaxArity(), e.getActualArity());
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw executeUnsupported(context, receiver);
        }
    }
}

abstract static class ExecuteVoidNode extends InteropNode {

    protected ExecuteVoidNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object[].class};
    }

    @Specialization
    final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args,
                    @Cached SharedExecuteNode executeNode) {
        executeNode.executeShared(this, context, receiver, (Object[]) args[ARGUMENT_OFFSET]);
        return null;
    }

    @Override
    protected String getOperationName() {
        return "executeVoid";
    }
}

abstract static class ExecuteVoidNoArgsNode extends InteropNode {

    private static final Object[] NO_ARGS = new Object[0];

    protected ExecuteVoidNoArgsNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Specialization
    final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args,
                    @Cached SharedExecuteNode executeNode) {
        executeNode.executeShared(this, context, receiver, NO_ARGS);
        return null;
    }

    @Override
    protected String getOperationName() {
        return "executeVoid";
    }
}

abstract static class ExecuteNode extends InteropNode {

    protected ExecuteNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object[].class};
    }

    @Specialization
    final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args,
                    @Cached ToHostValueNode toHostValue,
                    @Cached SharedExecuteNode executeNode) {
        return toHostValue.execute(this, context, executeNode.executeShared(this, context, receiver, (Object[]) args[ARGUMENT_OFFSET]));
    }

    @Override
    protected String getOperationName() {
        return "execute";
    }
}

abstract static class ExecuteNoArgsNode extends InteropNode {

    protected ExecuteNoArgsNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Specialization
    final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args,
                    @Cached ToHostValueNode toHostValue,
                    @Cached SharedExecuteNode executeNode) {
        return toHostValue.execute(this, context, executeNode.executeShared(this, context, receiver, ExecuteVoidNoArgsNode.NO_ARGS));
    }

    @Override
    protected String getOperationName() {
        return "execute";
    }
}

abstract static class NewInstanceNode extends InteropNode {

    protected NewInstanceNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object[].class};
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary instantiables,
                    @Cached ToGuestValuesNode toGuestValues,
                    @Cached ToHostValueNode toHostValue,
                    @Cached InlinedBranchProfile arity,
                    @Cached InlinedBranchProfile invalidArgument,
                    @Cached InlinedBranchProfile unsupported) {
        Object[] instantiateArguments = toGuestValues.execute(node, context, (Object[]) args[ARGUMENT_OFFSET]);
        try {
            return
            toHostValue.execute(node, context, instantiables.instantiate(receiver, instantiateArguments));
        } catch (UnsupportedTypeException e) {
            invalidArgument.enter(node);
            throw invalidInstantiateArgumentType(context, receiver, instantiateArguments);
        } catch (ArityException e) {
            arity.enter(node);
            throw invalidInstantiateArity(context, receiver, instantiateArguments, e.getExpectedMinArity(), e.getExpectedMaxArity(), e.getActualArity());
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            return newInstanceUnsupported(context, receiver);
        }
    }

    @Override
    protected String getOperationName() {
        return "newInstance";
    }
}

@ImportStatic(InteropNode.class)
@GenerateInline(true)
@GenerateCached(false)
abstract static class SharedInvokeNode extends Node {

    protected abstract Object executeShared(Node node, PolyglotLanguageContext context, Object receiver, String key, Object[] guestArguments);

    @Specialization(limit = "CACHE_LIMIT")
    protected static Object doDefault(Node node, PolyglotLanguageContext context, Object receiver, String key, Object[] guestArguments,
                    @CachedLibrary("receiver") InteropLibrary objects,
                    @Cached ToHostValueNode toHostValue,
                    @Cached InlinedBranchProfile invalidArgument,
                    @Cached InlinedBranchProfile arity,
                    @Cached InlinedBranchProfile unsupported,
                    @Cached InlinedBranchProfile unknownIdentifier) {
        try {
            return toHostValue.execute(node, context, objects.invokeMember(receiver, key, guestArguments));
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw invokeUnsupported(context, receiver, key);
        } catch (UnknownIdentifierException e) {
            unknownIdentifier.enter(node);
            throw nonReadableMemberKey(context, receiver, key);
        } catch (UnsupportedTypeException e) {
            invalidArgument.enter(node);
            throw invalidInvokeArgumentType(context, receiver, key, e);
        } catch (ArityException e) {
            arity.enter(node);
            throw invalidInvokeArity(context, receiver, key, guestArguments, e.getExpectedMinArity(), e.getExpectedMaxArity(), e.getActualArity());
        }
    }
}

abstract static class InvokeNode extends InteropNode {

    protected InvokeNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class, Object[].class};
    }

    @Override
    protected String getOperationName() {
        return "invoke";
    }

    @Specialization
    final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args,
                    @Cached SharedInvokeNode sharedInvoke,
                    @Cached ToGuestValuesNode toGuestValues) {
        String key = (String) args[ARGUMENT_OFFSET];
        Object[] guestArguments = toGuestValues.execute(this, context, (Object[]) args[ARGUMENT_OFFSET + 1]);
        return sharedInvoke.executeShared(this, context, receiver, key, guestArguments);
    }
}

abstract static class InvokeNoArgsNode extends InteropNode {

    protected InvokeNoArgsNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class};
    }

    @Override
    protected String getOperationName() {
        return "invoke";
    }

    @Specialization
    final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args,
                    @Cached SharedInvokeNode sharedInvoke) {
        String key = (String) args[ARGUMENT_OFFSET];
        return sharedInvoke.executeShared(this, context, receiver, key, ExecuteVoidNoArgsNode.NO_ARGS);
    }
}

abstract static class IsExceptionNode extends InteropNode {

    protected IsExceptionNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "isException";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary objects) {
        return objects.isException(receiver);
    }
}

abstract static class
ThrowExceptionNode extends InteropNode {

    protected ThrowExceptionNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "throwException";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary objects,
                    @Cached InlinedBranchProfile unsupported) {
        try {
            throw objects.throwException(receiver);
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw unsupported(context, receiver, "throwException()", "isException()");
        }
    }
}

abstract static class IsMetaObjectNode extends InteropNode {

    protected IsMetaObjectNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "isMetaObject";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static boolean doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary objects) {
        return objects.isMetaObject(receiver);
    }
}

abstract static class GetMetaQualifiedNameNode extends InteropNode {

    protected GetMetaQualifiedNameNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "getMetaQualifiedName";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static String doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary objects,
                    @CachedLibrary(limit = "1") InteropLibrary toString,
                    @Cached InlinedBranchProfile unsupported) {
        try {
            return toString.asString(objects.getMetaQualifiedName(receiver));
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw unsupported(context, receiver, "getMetaQualifiedName()", "isMetaObject()");
        }
    }
}

abstract static class GetMetaSimpleNameNode extends InteropNode {

    protected GetMetaSimpleNameNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "getMetaSimpleName";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static String doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary objects,
                    @CachedLibrary(limit = "1") InteropLibrary toString,
                    @Cached InlinedBranchProfile unsupported) {
        try {
            return toString.asString(objects.getMetaSimpleName(receiver));
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw unsupported(context, receiver, "getMetaSimpleName()", "isMetaObject()");
        }
    }
}

abstract static class IsMetaInstanceNode extends InteropNode {

    protected IsMetaInstanceNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, null};
    }

    @Override
    protected String getOperationName() {
        return "isMetaInstance";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static boolean doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary objects,
                    @Cached(inline = true) ToGuestValueNode toGuest,
                    @Cached InlinedBranchProfile unsupported) {
        try {
            return objects.isMetaInstance(receiver, toGuest.execute(node, context, args[ARGUMENT_OFFSET]));
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw unsupported(context, receiver, "isMetaInstance()", "isMetaObject()");
        }
    }
}

abstract static class HasMetaParentsNode extends InteropNode {

    protected HasMetaParentsNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "hasMetaParents";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static boolean doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary objects,
                    @Cached InlinedBranchProfile unsupported) {
        return objects.hasMetaParents(receiver);
    }
}

abstract static class GetMetaParentsNode extends InteropNode {

    protected GetMetaParentsNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "getMetaParents";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node,
                    @CachedLibrary("receiver") InteropLibrary objects,
                    @Cached ToHostValueNode toHost,
                    @Cached InlinedBranchProfile unsupported) {
        try {
            return toHost.execute(node, context, objects.getMetaParents(receiver));
        } catch (UnsupportedMessageException e) {
            unsupported.enter(node);
            throw unsupported(context, receiver, "getMetaParents()", "hasMetaParents()");
        }
    }
}

abstract static class HasIteratorNode extends InteropNode {

    protected HasIteratorNode(InteropValue interop) {
        super(interop);
    }

    @Override
    protected Class<?>[] getArgumentTypes() {
        return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType};
    }

    @Override
    protected String getOperationName() {
        return "hasIterator";
    }

    @Specialization(limit = "CACHE_LIMIT")
    static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, //
                    @CachedLibrary("receiver") InteropLibrary
iterators) { return iterators.hasIterator(receiver); } } abstract static class GetIteratorNode extends InteropNode { protected GetIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary iterators, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported) { try { return toHost.execute(node, context, iterators.getIterator(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); return getIteratorUnsupported(context, receiver); } } } abstract static class IsIteratorNode extends InteropNode { protected IsIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "isIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary iterators) { return iterators.isIterator(receiver); } } abstract static class HasIteratorNextElementNode extends InteropNode { protected HasIteratorNextElementNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "hasIteratorNextElement"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") 
InteropLibrary iterators, @Cached InlinedBranchProfile unsupported) { try { return iterators.hasIteratorNextElement(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); return hasIteratorNextElementUnsupported(context, receiver); } } } abstract static class GetIteratorNextElementNode extends InteropNode { protected GetIteratorNextElementNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getIteratorNextElement"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary iterators, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile stop) { try { return toHost.execute(node, context, iterators.getIteratorNextElement(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw nonReadableIteratorElement(); } catch (StopIterationException e) { stop.enter(node); throw stopIteration(context, receiver); } } } abstract static class HasHashEntriesNode extends InteropNode { protected HasHashEntriesNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "hasHashEntries"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes) { return hashes.hasHashEntries(receiver); } } abstract static class GetHashSizeNode extends InteropNode { protected GetHashSizeNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] 
getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getHashSize"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached InlinedBranchProfile unsupported) { try { return hashes.getHashSize(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashSizeUnsupported(context, receiver); } } } abstract static class HasHashEntryNode extends InteropNode { protected HasHashEntryNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object.class}; } @Override protected String getOperationName() { return "hasHashEntry"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary hashes, @Cached(inline = true) ToGuestValueNode toGuestKey) { Object hostKey = args[ARGUMENT_OFFSET]; Object key = toGuestKey.execute(node, context, hostKey); return hashes.isHashEntryExisting(receiver, key); } } abstract static class GetHashValueNode extends InteropNode { protected GetHashValueNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object.class}; } @Override protected String getOperationName() { return "getHashValue"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary hashes, @Cached(inline = true) ToGuestValueNode toGuestKey, @Cached ToHostValueNode toHost, @Cached 
InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidKey) { Object hostKey = args[ARGUMENT_OFFSET]; Object key = toGuestKey.execute(node, context, hostKey); try { return toHost.execute(node, context, hashes.readHashValue(receiver, key)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashValueUnsupported(context, receiver, key); } catch (UnknownKeyException e) { invalidKey.enter(node); if (hashes.isHashEntryExisting(receiver, key)) { throw getHashValueUnsupported(context, receiver, key); } else { return null; } } } } abstract static class GetHashValueOrDefaultNode extends InteropNode { protected GetHashValueOrDefaultNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object.class, Object.class}; } @Override protected String getOperationName() { return "getHashValueOrDefault"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached(inline = true) ToGuestValueNode toGuestKey, @Cached(inline = true) ToGuestValueNode toGuestDefaultValue, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidKey) { Object hostKey = args[ARGUMENT_OFFSET]; Object hostDefaultValue = args[ARGUMENT_OFFSET + 1]; Object key = toGuestKey.execute(node, context, hostKey); Object defaultValue = toGuestDefaultValue.execute(node, context, hostDefaultValue); try { return toHost.execute(node, context, hashes.readHashValueOrDefault(receiver, key, hostDefaultValue)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashValueUnsupported(context, receiver, key); } } } abstract static class PutHashEntryNode extends InteropNode { protected PutHashEntryNode(InteropValue interop) { super(interop); } 
@Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object.class, Object.class}; } @Override protected String getOperationName() { return "putHashEntry"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached(inline = true) ToGuestValueNode toGuestKey, @Cached(inline = true) ToGuestValueNode toGuestValue, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidKey, @Cached InlinedBranchProfile invalidValue) { Object hostKey = args[ARGUMENT_OFFSET]; Object hostValue = args[ARGUMENT_OFFSET + 1]; Object key = toGuestKey.execute(node, context, hostKey); Object value = toGuestValue.execute(node, context, hostValue); try { hashes.writeHashEntry(receiver, key, value); } catch (UnsupportedMessageException | UnknownKeyException e) { unsupported.enter(node); throw putHashEntryUnsupported(context, receiver, key, value); } catch (UnsupportedTypeException e) { invalidValue.enter(node); throw invalidHashValue(context, receiver, key, value); } return null; } } abstract static class RemoveHashEntryNode extends InteropNode { protected RemoveHashEntryNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object.class}; } @Override protected String getOperationName() { return "removeHashEntry"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached(inline = true) ToGuestValueNode toGuestKey, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidKey) { Object hostKey = args[ARGUMENT_OFFSET]; Object key = toGuestKey.execute(node, 
context, hostKey); Boolean result; try { hashes.removeHashEntry(receiver, key); result = Boolean.TRUE; } catch (UnsupportedMessageException e) { unsupported.enter(node); if (!hashes.hasHashEntries(receiver) || hashes.isHashEntryExisting(receiver, key)) { throw removeHashEntryUnsupported(context, receiver, key); } else { result = Boolean.FALSE; } } catch (UnknownKeyException e) { invalidKey.enter(node); result = Boolean.FALSE; } return result; } } abstract static class GetHashEntriesIteratorNode extends InteropNode { GetHashEntriesIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getHashEntriesIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported) { try { return toHost.execute(node, context, hashes.getHashEntriesIterator(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashEntriesIteratorUnsupported(context, receiver); } } } abstract static class GetHashKeysIteratorNode extends InteropNode { GetHashKeysIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getHashKeysIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported) { try { return toHost.execute(node, context, 
hashes.getHashKeysIterator(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashEntriesIteratorUnsupported(context, receiver); } } } abstract static class GetHashValuesIteratorNode extends InteropNode { GetHashValuesIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getHashValuesIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported) { try { return toHost.execute(node, context, hashes.getHashValuesIterator(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashEntriesIteratorUnsupported(context, receiver); } } } } } ```
Yelena Matyushenko (born 25 January 1961) is a Soviet diver. She competed in the women's 10 metre platform event at the 1980 Summer Olympics.

References

1961 births
Living people
Soviet female divers
Olympic divers for the Soviet Union
Divers at the 1980 Summer Olympics
Place of birth missing (living people)
```kotlin
import org.gradle.api.DefaultTask
import org.gradle.api.tasks.Input
import org.gradle.api.tasks.TaskAction
import java.io.File
import java.net.URL

/** Pulls the newest map style from the maplibre-streetcomplete-style repository */
open class UpdateMapStyleTask : DefaultTask() {

    @get:Input lateinit var targetDir: String
    @get:Input lateinit var mapStyleBranch: String
    @get:Input lateinit var apiKey: String

    @TaskAction fun run() {
        val targetDir = File(targetDir)
        require(targetDir.exists()) { "Directory ${targetDir.absolutePath} does not exist." }

        val urls = listOf(
            "path_to_url",
            "path_to_url",
        ).map { URL(it) }

        for (url in urls) {
            val fileName = File(url.path).name
            val targetFile = File(targetDir, fileName)
            val fileContent = url.readText()
                .normalizeLineEndings()
                .replaceAccessToken(apiKey)
                .replaceGlyphs()
                .replaceSprites()
            targetFile.writeText(fileContent)
        }
    }
}

private fun String.normalizeLineEndings() = this.replace("\r\n", "\n")

private fun String.replaceAccessToken(apiKey: String): String =
    replace(Regex("\\?access-token=[0-9A-Za-z+/=]*"), "?access-token=$apiKey")

private fun String.replaceGlyphs(): String =
    replace(Regex("path_to_url"), "asset://map_theme/glyphs")

private fun String.replaceSprites(): String =
    replace(Regex("path_to_url"), "asset://map_theme/sprites")
```
```css
How to easily check browser compatibility of a feature
Importing a CSS file into another CSS file
Writing comments in CSS
Removing the bullets from the `ul`
Add `line-height` to `body`
```
```c
/* path_to_url
   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. */

#include "sysdep.h"
#include "bfd.h"
#include "dis-asm.h"
#include "disassemble.h"
#include <stdint.h>

#define MAX_TEXT_SIZE 256

typedef struct
{
  char *buffer;
  size_t pos;
} SFILE;

static int
fuzz_disasm_null_styled_printf (void *stream, enum disassembler_style style,
                                const char *format, ...)
{
  return 0;
}

static int
objdump_sprintf (void *vf, const char *format, ...)
{
  SFILE *f = (SFILE *) vf;
  size_t n;
  va_list args;

  va_start (args, format);
  if (f->pos >= MAX_TEXT_SIZE)
    {
      printf ("buffer needs more space\n");
      /* reset */
      f->pos = 0;
      va_end (args);  /* was missing: every va_start needs a matching va_end */
      return 0;
    }
  n = vsnprintf (f->buffer + f->pos, MAX_TEXT_SIZE - f->pos, format, args);
  /* vfprintf(stdout, format, args); */
  va_end (args);
  f->pos += n;
  return n;
}

int
LLVMFuzzerTestOneInput (const uint8_t *Data, size_t Size)
{
  char AssemblyText[MAX_TEXT_SIZE];
  struct disassemble_info disasm_info;
  SFILE s;

  if (Size < 10 || Size > 16394)
    {
      /* 10 bytes for options; the 16394 limit prevents timeouts.  */
      return 0;
    }

  init_disassemble_info (&disasm_info, stdout, (fprintf_ftype) fprintf,
                         fuzz_disasm_null_styled_printf);
  disasm_info.fprintf_func = objdump_sprintf;
  disasm_info.print_address_func = generic_print_address;
  disasm_info.display_endian = disasm_info.endian = BFD_ENDIAN_LITTLE;
  disasm_info.buffer = (bfd_byte *) Data;
  disasm_info.buffer_vma = 0x1000;
  disasm_info.buffer_length = Size - 10;
  disasm_info.insn_info_valid = 0;
  disasm_info.created_styled_output = false;
  s.buffer = AssemblyText;
  s.pos = 0;
  disasm_info.stream = &s;
  disasm_info.bytes_per_line = 0;

  disasm_info.flags |= USER_SPECIFIED_MACHINE_TYPE;
  disasm_info.arch = Data[Size - 1];
  disasm_info.mach = bfd_getl64 (&Data[Size - 9]);
  disasm_info.flavour = Data[Size - 10];

  if (bfd_lookup_arch (disasm_info.arch, disasm_info.mach) != NULL)
    {
      disassembler_ftype disasfunc
        = disassembler (disasm_info.arch, 0, disasm_info.mach, NULL);
      if (disasfunc != NULL)
        {
          disassemble_init_for_target (&disasm_info);
          while (1)
            {
              s.pos = 0;
              int octets = disasfunc (disasm_info.buffer_vma, &disasm_info);
              if (octets < (int) disasm_info.octets_per_byte)
                break;
              if (disasm_info.buffer_length <= (size_t) octets)
                break;
              disasm_info.buffer += octets;
              disasm_info.buffer_vma += octets / disasm_info.octets_per_byte;
              disasm_info.buffer_length -= octets;
            }
          disassemble_free_target (&disasm_info);
        }
    }
  return 0;
}
```
```c
/*
 * This file is part of FFmpeg.
 *
 * FFmpeg is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * FFmpeg is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with FFmpeg; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 */

/**
 * @file
 * multimedia converter based on the FFmpeg libraries
 */

#include "config.h"
#include <ctype.h>
#include <string.h>
#include <math.h>
#include <stdlib.h>
#include <errno.h>
#include <limits.h>
#include <stdint.h>

#if HAVE_IO_H
#include <io.h>
#endif
#if HAVE_UNISTD_H
#include <unistd.h>
#endif

#include "libavformat/avformat.h"
#include "libavdevice/avdevice.h"
#include "libswresample/swresample.h"
#include "libavutil/opt.h"
#include "libavutil/channel_layout.h"
#include "libavutil/parseutils.h"
#include "libavutil/samplefmt.h"
#include "libavutil/fifo.h"
#include "libavutil/internal.h"
#include "libavutil/intreadwrite.h"
#include "libavutil/dict.h"
#include "libavutil/mathematics.h"
#include "libavutil/pixdesc.h"
#include "libavutil/avstring.h"
#include "libavutil/libm.h"
#include "libavutil/imgutils.h"
#include "libavutil/timestamp.h"
#include "libavutil/bprint.h"
#include "libavutil/time.h"
#include "libavutil/threadmessage.h"
#include "libavcodec/mathops.h"
#include "libavformat/os_support.h"

# include "libavfilter/avfilter.h"
# include "libavfilter/buffersrc.h"
# include "libavfilter/buffersink.h"

#if HAVE_SYS_RESOURCE_H
#include <sys/time.h>
#include <sys/types.h>
#include <sys/resource.h>
#elif HAVE_GETPROCESSTIMES
#include <windows.h>
#endif
#if HAVE_GETPROCESSMEMORYINFO
#include <windows.h>
#include <psapi.h>
#endif

#if HAVE_SETCONSOLECTRLHANDLER
#include <windows.h>
#endif

#if HAVE_SYS_SELECT_H
#include <sys/select.h>
#endif

#if HAVE_TERMIOS_H
#include <fcntl.h> #include <sys/ioctl.h> #include <sys/time.h> #include <termios.h> #elif HAVE_KBHIT #include <conio.h> #endif #if HAVE_PTHREADS #include <pthread.h> #endif #include <time.h> #include "ffmpeg.h" #include "cmdutils.h" #include "libavutil/avassert.h" const char program_name[] = "ffmpeg"; const int program_birth_year = 2000; static FILE *vstats_file; const char *const forced_keyframes_const_names[] = { "n", "n_forced", "prev_forced_n", "prev_forced_t", "t", NULL }; static void do_video_stats(OutputStream *ost, int frame_size); static int64_t getutime(void); static int64_t getmaxrss(void); static int run_as_daemon = 0; static int nb_frames_dup = 0; static int nb_frames_drop = 0; static int64_t decode_error_stat[2]; static int want_sdp = 1; static int current_time; AVIOContext *progress_avio = NULL; static uint8_t *subtitle_out; InputStream **input_streams = NULL; int nb_input_streams = 0; InputFile **input_files = NULL; int nb_input_files = 0; OutputStream **output_streams = NULL; int nb_output_streams = 0; OutputFile **output_files = NULL; int nb_output_files = 0; FilterGraph **filtergraphs; int nb_filtergraphs; #if HAVE_TERMIOS_H /* init terminal so that we can grab keys */ static struct termios oldtty; static int restore_tty; #endif #if HAVE_PTHREADS static void free_input_threads(void); #endif /* sub2video hack: Convert subtitles to video with alpha to insert them in filter graphs. This is a temporary solution until libavfilter gets real subtitles support. */ static int sub2video_get_blank_frame(InputStream *ist) { int ret; AVFrame *frame = ist->sub2video.frame; av_frame_unref(frame); ist->sub2video.frame->width = ist->dec_ctx->width ? ist->dec_ctx->width : ist->sub2video.w; ist->sub2video.frame->height = ist->dec_ctx->height ? 
ist->dec_ctx->height : ist->sub2video.h; ist->sub2video.frame->format = AV_PIX_FMT_RGB32; if ((ret = av_frame_get_buffer(frame, 32)) < 0) return ret; memset(frame->data[0], 0, frame->height * frame->linesize[0]); return 0; } static void sub2video_copy_rect(uint8_t *dst, int dst_linesize, int w, int h, AVSubtitleRect *r) { uint32_t *pal, *dst2; uint8_t *src, *src2; int x, y; if (r->type != SUBTITLE_BITMAP) { av_log(NULL, AV_LOG_WARNING, "sub2video: non-bitmap subtitle\n"); return; } if (r->x < 0 || r->x + r->w > w || r->y < 0 || r->y + r->h > h) { av_log(NULL, AV_LOG_WARNING, "sub2video: rectangle (%d %d %d %d) overflowing %d %d\n", r->x, r->y, r->w, r->h, w, h ); return; } dst += r->y * dst_linesize + r->x * 4; src = r->data[0]; pal = (uint32_t *)r->data[1]; for (y = 0; y < r->h; y++) { dst2 = (uint32_t *)dst; src2 = src; for (x = 0; x < r->w; x++) *(dst2++) = pal[*(src2++)]; dst += dst_linesize; src += r->linesize[0]; } } static void sub2video_push_ref(InputStream *ist, int64_t pts) { AVFrame *frame = ist->sub2video.frame; int i; av_assert1(frame->data[0]); ist->sub2video.last_pts = frame->pts = pts; for (i = 0; i < ist->nb_filters; i++) av_buffersrc_add_frame_flags(ist->filters[i]->filter, frame, AV_BUFFERSRC_FLAG_KEEP_REF | AV_BUFFERSRC_FLAG_PUSH); } static void sub2video_update(InputStream *ist, AVSubtitle *sub) { AVFrame *frame = ist->sub2video.frame; int8_t *dst; int dst_linesize; int num_rects, i; int64_t pts, end_pts; if (!frame) return; if (sub) { pts = av_rescale_q(sub->pts + sub->start_display_time * 1000LL, AV_TIME_BASE_Q, ist->st->time_base); end_pts = av_rescale_q(sub->pts + sub->end_display_time * 1000LL, AV_TIME_BASE_Q, ist->st->time_base); num_rects = sub->num_rects; } else { pts = ist->sub2video.end_pts; end_pts = INT64_MAX; num_rects = 0; } if (sub2video_get_blank_frame(ist) < 0) { av_log(ist->dec_ctx, AV_LOG_ERROR, "Impossible to get a blank canvas.\n"); return; } dst = frame->data [0]; dst_linesize = frame->linesize[0]; for (i = 0; i < 
num_rects; i++) sub2video_copy_rect(dst, dst_linesize, frame->width, frame->height, sub->rects[i]); sub2video_push_ref(ist, pts); ist->sub2video.end_pts = end_pts; } static void sub2video_heartbeat(InputStream *ist, int64_t pts) { InputFile *infile = input_files[ist->file_index]; int i, j, nb_reqs; int64_t pts2; /* When a frame is read from a file, examine all sub2video streams in the same file and send the sub2video frame again. Otherwise, decoded video frames could be accumulating in the filter graph while a filter (possibly overlay) is desperately waiting for a subtitle frame. */ for (i = 0; i < infile->nb_streams; i++) { InputStream *ist2 = input_streams[infile->ist_index + i]; if (!ist2->sub2video.frame) continue; /* subtitles seem to be usually muxed ahead of other streams; if not, subtracting a larger time here is necessary */ pts2 = av_rescale_q(pts, ist->st->time_base, ist2->st->time_base) - 1; /* do not send the heartbeat frame if the subtitle is already ahead */ if (pts2 <= ist2->sub2video.last_pts) continue; if (pts2 >= ist2->sub2video.end_pts || !ist2->sub2video.frame->data[0]) sub2video_update(ist2, NULL); for (j = 0, nb_reqs = 0; j < ist2->nb_filters; j++) nb_reqs += av_buffersrc_get_nb_failed_requests(ist2->filters[j]->filter); if (nb_reqs) sub2video_push_ref(ist2, pts2); } } static void sub2video_flush(InputStream *ist) { int i; if (ist->sub2video.end_pts < INT64_MAX) sub2video_update(ist, NULL); for (i = 0; i < ist->nb_filters; i++) av_buffersrc_add_frame(ist->filters[i]->filter, NULL); } /* end of sub2video hack */ static void term_exit_sigsafe(void) { #if HAVE_TERMIOS_H if(restore_tty) tcsetattr (0, TCSANOW, &oldtty); #endif } void term_exit(void) { av_log(NULL, AV_LOG_QUIET, "%s", ""); term_exit_sigsafe(); } static volatile int received_sigterm = 0; static volatile int received_nb_signals = 0; static volatile int transcode_init_done = 0; static volatile int ffmpeg_exited = 0; static int main_return_code = 0; static void sigterm_handler(int sig) 
{ received_sigterm = sig; received_nb_signals++; term_exit_sigsafe(); if(received_nb_signals > 3) { write(2/*STDERR_FILENO*/, "Received > 3 system signals, hard exiting\n", strlen("Received > 3 system signals, hard exiting\n")); exit(123); } } #if HAVE_SETCONSOLECTRLHANDLER static BOOL WINAPI CtrlHandler(DWORD fdwCtrlType) { av_log(NULL, AV_LOG_DEBUG, "\nReceived windows signal %ld\n", fdwCtrlType); switch (fdwCtrlType) { case CTRL_C_EVENT: case CTRL_BREAK_EVENT: sigterm_handler(SIGINT); return TRUE; case CTRL_CLOSE_EVENT: case CTRL_LOGOFF_EVENT: case CTRL_SHUTDOWN_EVENT: sigterm_handler(SIGTERM); /* Basically, with these 3 events, when we return from this method the process is hard terminated, so stall as long as we need to to try and let the main thread(s) clean up and gracefully terminate (we have at most 5 seconds, but should be done far before that). */ while (!ffmpeg_exited) { Sleep(0); } return TRUE; default: av_log(NULL, AV_LOG_ERROR, "Received unknown windows signal %ld\n", fdwCtrlType); return FALSE; } } #endif void term_init(void) { #if HAVE_TERMIOS_H if (!run_as_daemon && stdin_interaction) { struct termios tty; if (tcgetattr (0, &tty) == 0) { oldtty = tty; restore_tty = 1; tty.c_iflag &= ~(IGNBRK|BRKINT|PARMRK|ISTRIP |INLCR|IGNCR|ICRNL|IXON); tty.c_oflag |= OPOST; tty.c_lflag &= ~(ECHO|ECHONL|ICANON|IEXTEN); tty.c_cflag &= ~(CSIZE|PARENB); tty.c_cflag |= CS8; tty.c_cc[VMIN] = 1; tty.c_cc[VTIME] = 0; tcsetattr (0, TCSANOW, &tty); } signal(SIGQUIT, sigterm_handler); /* Quit (POSIX). */ } #endif signal(SIGINT , sigterm_handler); /* Interrupt (ANSI). */ signal(SIGTERM, sigterm_handler); /* Termination (ANSI). 
 */
#ifdef SIGXCPU
    signal(SIGXCPU, sigterm_handler);
#endif
#if HAVE_SETCONSOLECTRLHANDLER
    SetConsoleCtrlHandler((PHANDLER_ROUTINE) CtrlHandler, TRUE);
#endif
}

/* read a key without blocking */
static int read_key(void)
{
    unsigned char ch;
#if HAVE_TERMIOS_H
    int n = 1;
    struct timeval tv;
    fd_set rfds;

    FD_ZERO(&rfds);
    FD_SET(0, &rfds);
    tv.tv_sec = 0;
    tv.tv_usec = 0;
    n = select(1, &rfds, NULL, NULL, &tv);
    if (n > 0) {
        n = read(0, &ch, 1);
        if (n == 1)
            return ch;

        return n;
    }
#elif HAVE_KBHIT
#    if HAVE_PEEKNAMEDPIPE
    static int is_pipe;
    static HANDLE input_handle;
    DWORD dw, nchars;
    if (!input_handle) {
        input_handle = GetStdHandle(STD_INPUT_HANDLE);
        is_pipe = !GetConsoleMode(input_handle, &dw);
    }

    if (is_pipe) {
        /* When running under a GUI, you will end here. */
        if (!PeekNamedPipe(input_handle, NULL, 0, NULL, &nchars, NULL)) {
            // input pipe may have been closed by the program that ran ffmpeg
            return -1;
        }
        //Read it
        if (nchars != 0) {
            read(0, &ch, 1);
            return ch;
        } else {
            return -1;
        }
    }
#    endif
    if (kbhit())
        return (getch());
#endif
    return -1;
}

static int decode_interrupt_cb(void *ctx)
{
    return received_nb_signals > transcode_init_done;
}

const AVIOInterruptCB int_cb = { decode_interrupt_cb, NULL };

static void ffmpeg_cleanup(int ret)
{
    int i, j;

    if (do_benchmark) {
        int maxrss = getmaxrss() / 1024;
        av_log(NULL, AV_LOG_INFO, "bench: maxrss=%ikB\n", maxrss);
    }

    for (i = 0; i < nb_filtergraphs; i++) {
        FilterGraph *fg = filtergraphs[i];
        avfilter_graph_free(&fg->graph);
        for (j = 0; j < fg->nb_inputs; j++) {
            av_freep(&fg->inputs[j]->name);
            av_freep(&fg->inputs[j]);
        }
        av_freep(&fg->inputs);
        for (j = 0; j < fg->nb_outputs; j++) {
            av_freep(&fg->outputs[j]->name);
            av_freep(&fg->outputs[j]);
        }
        av_freep(&fg->outputs);
        av_freep(&fg->graph_desc);
        av_freep(&filtergraphs[i]);
    }
    av_freep(&filtergraphs);

    av_freep(&subtitle_out);

    /* close files */
    for (i = 0; i < nb_output_files; i++) {
        OutputFile *of = output_files[i];
        AVFormatContext *s;
        if (!of)
            continue;
        s = of->ctx;
        if (s && s->oformat &&
!(s->oformat->flags & AVFMT_NOFILE)) avio_closep(&s->pb); avformat_free_context(s); av_dict_free(&of->opts); av_freep(&output_files[i]); } for (i = 0; i < nb_output_streams; i++) { OutputStream *ost = output_streams[i]; if (!ost) continue; for (j = 0; j < ost->nb_bitstream_filters; j++) av_bsf_free(&ost->bsf_ctx[j]); av_freep(&ost->bsf_ctx); av_freep(&ost->bsf_extradata_updated); av_frame_free(&ost->filtered_frame); av_frame_free(&ost->last_frame); av_dict_free(&ost->encoder_opts); av_parser_close(ost->parser); avcodec_free_context(&ost->parser_avctx); av_freep(&ost->forced_keyframes); av_expr_free(ost->forced_keyframes_pexpr); av_freep(&ost->avfilter); av_freep(&ost->logfile_prefix); av_freep(&ost->audio_channels_map); ost->audio_channels_mapped = 0; av_dict_free(&ost->sws_dict); avcodec_free_context(&ost->enc_ctx); avcodec_parameters_free(&ost->ref_par); while (ost->muxing_queue && av_fifo_size(ost->muxing_queue)) { AVPacket pkt; av_fifo_generic_read(ost->muxing_queue, &pkt, sizeof(pkt), NULL); av_packet_unref(&pkt); } av_fifo_freep(&ost->muxing_queue); av_freep(&output_streams[i]); } #if HAVE_PTHREADS free_input_threads(); #endif for (i = 0; i < nb_input_files; i++) { avformat_close_input(&input_files[i]->ctx); av_freep(&input_files[i]); } for (i = 0; i < nb_input_streams; i++) { InputStream *ist = input_streams[i]; av_frame_free(&ist->decoded_frame); av_frame_free(&ist->filter_frame); av_dict_free(&ist->decoder_opts); avsubtitle_free(&ist->prev_sub.subtitle); av_frame_free(&ist->sub2video.frame); av_freep(&ist->filters); av_freep(&ist->hwaccel_device); av_freep(&ist->dts_buffer); avcodec_free_context(&ist->dec_ctx); av_freep(&input_streams[i]); } if (vstats_file) { if (fclose(vstats_file)) av_log(NULL, AV_LOG_ERROR, "Error closing vstats file, loss of information possible: %s\n", av_err2str(AVERROR(errno))); } av_freep(&vstats_filename); av_freep(&input_streams); av_freep(&input_files); av_freep(&output_streams); av_freep(&output_files); uninit_opts(); 
avformat_network_deinit(); if (received_sigterm) { av_log(NULL, AV_LOG_INFO, "Exiting normally, received signal %d.\n", (int) received_sigterm); } else if (ret && transcode_init_done) { av_log(NULL, AV_LOG_INFO, "Conversion failed!\n"); } term_exit(); ffmpeg_exited = 1; } void remove_avoptions(AVDictionary **a, AVDictionary *b) { AVDictionaryEntry *t = NULL; while ((t = av_dict_get(b, "", t, AV_DICT_IGNORE_SUFFIX))) { av_dict_set(a, t->key, NULL, AV_DICT_MATCH_CASE); } } void assert_avoptions(AVDictionary *m) { AVDictionaryEntry *t; if ((t = av_dict_get(m, "", NULL, AV_DICT_IGNORE_SUFFIX))) { av_log(NULL, AV_LOG_FATAL, "Option %s not found.\n", t->key); exit_program(1); } } static void abort_codec_experimental(AVCodec *c, int encoder) { exit_program(1); } static void update_benchmark(const char *fmt, ...) { if (do_benchmark_all) { int64_t t = getutime(); va_list va; char buf[1024]; if (fmt) { va_start(va, fmt); vsnprintf(buf, sizeof(buf), fmt, va); va_end(va); av_log(NULL, AV_LOG_INFO, "bench: %8"PRIu64" %s \n", t - current_time, buf); } current_time = t; } } static void close_all_output_streams(OutputStream *ost, OSTFinished this_stream, OSTFinished others) { int i; for (i = 0; i < nb_output_streams; i++) { OutputStream *ost2 = output_streams[i]; ost2->finished |= ost == ost2 ? 
this_stream : others; } } static void write_packet(OutputFile *of, AVPacket *pkt, OutputStream *ost) { AVFormatContext *s = of->ctx; AVStream *st = ost->st; int ret; if (!of->header_written) { AVPacket tmp_pkt; /* the muxer is not initialized yet, buffer the packet */ if (!av_fifo_space(ost->muxing_queue)) { int new_size = FFMIN(2 * av_fifo_size(ost->muxing_queue), ost->max_muxing_queue_size); if (new_size <= av_fifo_size(ost->muxing_queue)) { av_log(NULL, AV_LOG_ERROR, "Too many packets buffered for output stream %d:%d.\n", ost->file_index, ost->st->index); exit_program(1); } ret = av_fifo_realloc2(ost->muxing_queue, new_size); if (ret < 0) exit_program(1); } av_packet_move_ref(&tmp_pkt, pkt); av_fifo_generic_write(ost->muxing_queue, &tmp_pkt, sizeof(tmp_pkt), NULL); return; } if ((st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO && video_sync_method == VSYNC_DROP) || (st->codecpar->codec_type == AVMEDIA_TYPE_AUDIO && audio_sync_method < 0)) pkt->pts = pkt->dts = AV_NOPTS_VALUE; /* * Audio encoders may split the packets -- #frames in != #packets out. * But there is no reordering, so we can limit the number of output packets * by simply dropping them here. * Counting encoded video frames needs to be done separately because of * reordering, see do_video_out() */ if (!(st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO && ost->encoding_needed)) { if (ost->frame_number >= ost->max_frames) { av_packet_unref(pkt); return; } ost->frame_number++; } if (st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO) { int i; uint8_t *sd = av_packet_get_side_data(pkt, AV_PKT_DATA_QUALITY_STATS, NULL); ost->quality = sd ? AV_RL32(sd) : -1; ost->pict_type = sd ? 
sd[4] : AV_PICTURE_TYPE_NONE; for (i = 0; i<FF_ARRAY_ELEMS(ost->error); i++) { if (sd && i < sd[5]) ost->error[i] = AV_RL64(sd + 8 + 8*i); else ost->error[i] = -1; } if (ost->frame_rate.num && ost->is_cfr) { if (pkt->duration > 0) av_log(NULL, AV_LOG_WARNING, "Overriding packet duration by frame rate, this should not happen\n"); pkt->duration = av_rescale_q(1, av_inv_q(ost->frame_rate), ost->st->time_base); } } if (!(s->oformat->flags & AVFMT_NOTIMESTAMPS)) { if (pkt->dts != AV_NOPTS_VALUE && pkt->pts != AV_NOPTS_VALUE && pkt->dts > pkt->pts) { av_log(s, AV_LOG_WARNING, "Invalid DTS: %"PRId64" PTS: %"PRId64" in output stream %d:%d, replacing by guess\n", pkt->dts, pkt->pts, ost->file_index, ost->st->index); pkt->pts = pkt->dts = pkt->pts + pkt->dts + ost->last_mux_dts + 1 - FFMIN3(pkt->pts, pkt->dts, ost->last_mux_dts + 1) - FFMAX3(pkt->pts, pkt->dts, ost->last_mux_dts + 1); } if ((st->codecpar->codec_type == AVMEDIA_TYPE_AUDIO || st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO) && pkt->dts != AV_NOPTS_VALUE && !(st->codecpar->codec_id == AV_CODEC_ID_VP9 && ost->stream_copy) && ost->last_mux_dts != AV_NOPTS_VALUE) { int64_t max = ost->last_mux_dts + !(s->oformat->flags & AVFMT_TS_NONSTRICT); if (pkt->dts < max) { int loglevel = max - pkt->dts > 2 || st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO ? AV_LOG_WARNING : AV_LOG_DEBUG; av_log(s, loglevel, "Non-monotonous DTS in output stream " "%d:%d; previous: %"PRId64", current: %"PRId64"; ", ost->file_index, ost->st->index, ost->last_mux_dts, pkt->dts); if (exit_on_error) { av_log(NULL, AV_LOG_FATAL, "aborting.\n"); exit_program(1); } av_log(s, loglevel, "changing to %"PRId64". 
This may result " "in incorrect timestamps in the output file.\n", max); if (pkt->pts >= pkt->dts) pkt->pts = FFMAX(pkt->pts, max); pkt->dts = max; } } } ost->last_mux_dts = pkt->dts; ost->data_size += pkt->size; ost->packets_written++; pkt->stream_index = ost->index; if (debug_ts) { av_log(NULL, AV_LOG_INFO, "muxer <- type:%s " "pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s size:%d\n", av_get_media_type_string(ost->enc_ctx->codec_type), av_ts2str(pkt->pts), av_ts2timestr(pkt->pts, &ost->st->time_base), av_ts2str(pkt->dts), av_ts2timestr(pkt->dts, &ost->st->time_base), pkt->size ); } ret = av_interleaved_write_frame(s, pkt); if (ret < 0) { print_error("av_interleaved_write_frame()", ret); main_return_code = 1; close_all_output_streams(ost, MUXER_FINISHED | ENCODER_FINISHED, ENCODER_FINISHED); } av_packet_unref(pkt); } static void close_output_stream(OutputStream *ost) { OutputFile *of = output_files[ost->file_index]; ost->finished |= ENCODER_FINISHED; if (of->shortest) { int64_t end = av_rescale_q(ost->sync_opts - ost->first_pts, ost->enc_ctx->time_base, AV_TIME_BASE_Q); of->recording_time = FFMIN(of->recording_time, end); } } static void output_packet(OutputFile *of, AVPacket *pkt, OutputStream *ost) { int ret = 0; /* apply the output bitstream filters, if any */ if (ost->nb_bitstream_filters) { int idx; av_packet_split_side_data(pkt); ret = av_bsf_send_packet(ost->bsf_ctx[0], pkt); if (ret < 0) goto finish; idx = 1; while (idx) { /* get a packet from the previous filter up the chain */ ret = av_bsf_receive_packet(ost->bsf_ctx[idx - 1], pkt); /* HACK! - aac_adtstoasc updates extradata after filtering the first frame when * the api states this shouldn't happen after init(). Propagate it here to the * muxer and to the next filters in the chain to workaround this. * TODO/FIXME - Make aac_adtstoasc use new packet side data instead of changing * par_out->extradata and adapt muxers accordingly to get rid of this. 
*/ if (!(ost->bsf_extradata_updated[idx - 1] & 1)) { ret = avcodec_parameters_copy(ost->st->codecpar, ost->bsf_ctx[idx - 1]->par_out); if (ret < 0) goto finish; ost->bsf_extradata_updated[idx - 1] |= 1; } if (ret == AVERROR(EAGAIN)) { ret = 0; idx--; continue; } else if (ret < 0) goto finish; /* send it to the next filter down the chain or to the muxer */ if (idx < ost->nb_bitstream_filters) { /* HACK/FIXME! - See above */ if (!(ost->bsf_extradata_updated[idx] & 2)) { ret = avcodec_parameters_copy(ost->bsf_ctx[idx]->par_out, ost->bsf_ctx[idx - 1]->par_out); if (ret < 0) goto finish; ost->bsf_extradata_updated[idx] |= 2; } ret = av_bsf_send_packet(ost->bsf_ctx[idx], pkt); if (ret < 0) goto finish; idx++; } else write_packet(of, pkt, ost); } } else write_packet(of, pkt, ost); finish: if (ret < 0 && ret != AVERROR_EOF) { av_log(NULL, AV_LOG_ERROR, "Error applying bitstream filters to an output " "packet for stream #%d:%d.\n", ost->file_index, ost->index); if(exit_on_error) exit_program(1); } } static int check_recording_time(OutputStream *ost) { OutputFile *of = output_files[ost->file_index]; if (of->recording_time != INT64_MAX && av_compare_ts(ost->sync_opts - ost->first_pts, ost->enc_ctx->time_base, of->recording_time, AV_TIME_BASE_Q) >= 0) { close_output_stream(ost); return 0; } return 1; } static void do_audio_out(OutputFile *of, OutputStream *ost, AVFrame *frame) { AVCodecContext *enc = ost->enc_ctx; AVPacket pkt; int ret; av_init_packet(&pkt); pkt.data = NULL; pkt.size = 0; if (!check_recording_time(ost)) return; if (frame->pts == AV_NOPTS_VALUE || audio_sync_method < 0) frame->pts = ost->sync_opts; ost->sync_opts = frame->pts + frame->nb_samples; ost->samples_encoded += frame->nb_samples; ost->frames_encoded++; av_assert0(pkt.size || !pkt.data); update_benchmark(NULL); if (debug_ts) { av_log(NULL, AV_LOG_INFO, "encoder <- type:audio " "frame_pts:%s frame_pts_time:%s time_base:%d/%d\n", av_ts2str(frame->pts), av_ts2timestr(frame->pts, &enc->time_base), 
enc->time_base.num, enc->time_base.den); } ret = avcodec_send_frame(enc, frame); if (ret < 0) goto error; while (1) { ret = avcodec_receive_packet(enc, &pkt); if (ret == AVERROR(EAGAIN)) break; if (ret < 0) goto error; update_benchmark("encode_audio %d.%d", ost->file_index, ost->index); av_packet_rescale_ts(&pkt, enc->time_base, ost->st->time_base); if (debug_ts) { av_log(NULL, AV_LOG_INFO, "encoder -> type:audio " "pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s\n", av_ts2str(pkt.pts), av_ts2timestr(pkt.pts, &ost->st->time_base), av_ts2str(pkt.dts), av_ts2timestr(pkt.dts, &ost->st->time_base)); } output_packet(of, &pkt, ost); } return; error: av_log(NULL, AV_LOG_FATAL, "Audio encoding failed\n"); exit_program(1); } static void do_subtitle_out(OutputFile *of, OutputStream *ost, AVSubtitle *sub) { int subtitle_out_max_size = 1024 * 1024; int subtitle_out_size, nb, i; AVCodecContext *enc; AVPacket pkt; int64_t pts; if (sub->pts == AV_NOPTS_VALUE) { av_log(NULL, AV_LOG_ERROR, "Subtitle packets must have a pts\n"); if (exit_on_error) exit_program(1); return; } enc = ost->enc_ctx; if (!subtitle_out) { subtitle_out = av_malloc(subtitle_out_max_size); if (!subtitle_out) { av_log(NULL, AV_LOG_FATAL, "Failed to allocate subtitle_out\n"); exit_program(1); } } /* Note: DVB subtitle need one packet to draw them and one other packet to clear them */ /* XXX: signal it in the codec context ? 
*/ if (enc->codec_id == AV_CODEC_ID_DVB_SUBTITLE) nb = 2; else nb = 1; /* shift timestamp to honor -ss and make check_recording_time() work with -t */ pts = sub->pts; if (output_files[ost->file_index]->start_time != AV_NOPTS_VALUE) pts -= output_files[ost->file_index]->start_time; for (i = 0; i < nb; i++) { unsigned save_num_rects = sub->num_rects; ost->sync_opts = av_rescale_q(pts, AV_TIME_BASE_Q, enc->time_base); if (!check_recording_time(ost)) return; sub->pts = pts; // start_display_time is required to be 0 sub->pts += av_rescale_q(sub->start_display_time, (AVRational){ 1, 1000 }, AV_TIME_BASE_Q); sub->end_display_time -= sub->start_display_time; sub->start_display_time = 0; if (i == 1) sub->num_rects = 0; ost->frames_encoded++; subtitle_out_size = avcodec_encode_subtitle(enc, subtitle_out, subtitle_out_max_size, sub); if (i == 1) sub->num_rects = save_num_rects; if (subtitle_out_size < 0) { av_log(NULL, AV_LOG_FATAL, "Subtitle encoding failed\n"); exit_program(1); } av_init_packet(&pkt); pkt.data = subtitle_out; pkt.size = subtitle_out_size; pkt.pts = av_rescale_q(sub->pts, AV_TIME_BASE_Q, ost->st->time_base); pkt.duration = av_rescale_q(sub->end_display_time, (AVRational){ 1, 1000 }, ost->st->time_base); if (enc->codec_id == AV_CODEC_ID_DVB_SUBTITLE) { /* XXX: the pts correction is handled here. 
Maybe handling it in the codec would be better */ if (i == 0) pkt.pts += 90 * sub->start_display_time; else pkt.pts += 90 * sub->end_display_time; } pkt.dts = pkt.pts; output_packet(of, &pkt, ost); } } static void do_video_out(OutputFile *of, OutputStream *ost, AVFrame *next_picture, double sync_ipts) { int ret, format_video_sync; AVPacket pkt; AVCodecContext *enc = ost->enc_ctx; AVCodecParameters *mux_par = ost->st->codecpar; int nb_frames, nb0_frames, i; double delta, delta0; double duration = 0; int frame_size = 0; InputStream *ist = NULL; AVFilterContext *filter = ost->filter->filter; if (ost->source_index >= 0) ist = input_streams[ost->source_index]; if (filter->inputs[0]->frame_rate.num > 0 && filter->inputs[0]->frame_rate.den > 0) duration = 1/(av_q2d(filter->inputs[0]->frame_rate) * av_q2d(enc->time_base)); if(ist && ist->st->start_time != AV_NOPTS_VALUE && ist->st->first_dts != AV_NOPTS_VALUE && ost->frame_rate.num) duration = FFMIN(duration, 1/(av_q2d(ost->frame_rate) * av_q2d(enc->time_base))); if (!ost->filters_script && !ost->filters && next_picture && ist && lrintf(av_frame_get_pkt_duration(next_picture) * av_q2d(ist->st->time_base) / av_q2d(enc->time_base)) > 0) { duration = lrintf(av_frame_get_pkt_duration(next_picture) * av_q2d(ist->st->time_base) / av_q2d(enc->time_base)); } if (!next_picture) { //end, flushing nb0_frames = nb_frames = mid_pred(ost->last_nb0_frames[0], ost->last_nb0_frames[1], ost->last_nb0_frames[2]); } else { delta0 = sync_ipts - ost->sync_opts; // delta0 is the "drift" between the input frame (next_picture) and where it would fall in the output. 
delta = delta0 + duration; /* by default, we output a single frame */ nb0_frames = 0; // tracks the number of times the PREVIOUS frame should be duplicated, mostly for variable framerate (VFR) nb_frames = 1; format_video_sync = video_sync_method; if (format_video_sync == VSYNC_AUTO) { if(!strcmp(of->ctx->oformat->name, "avi")) { format_video_sync = VSYNC_VFR; } else format_video_sync = (of->ctx->oformat->flags & AVFMT_VARIABLE_FPS) ? ((of->ctx->oformat->flags & AVFMT_NOTIMESTAMPS) ? VSYNC_PASSTHROUGH : VSYNC_VFR) : VSYNC_CFR; if ( ist && format_video_sync == VSYNC_CFR && input_files[ist->file_index]->ctx->nb_streams == 1 && input_files[ist->file_index]->input_ts_offset == 0) { format_video_sync = VSYNC_VSCFR; } if (format_video_sync == VSYNC_CFR && copy_ts) { format_video_sync = VSYNC_VSCFR; } } ost->is_cfr = (format_video_sync == VSYNC_CFR || format_video_sync == VSYNC_VSCFR); if (delta0 < 0 && delta > 0 && format_video_sync != VSYNC_PASSTHROUGH && format_video_sync != VSYNC_DROP) { if (delta0 < -0.6) { av_log(NULL, AV_LOG_WARNING, "Past duration %f too large\n", -delta0); } else av_log(NULL, AV_LOG_DEBUG, "Clipping frame in rate conversion by %f\n", -delta0); sync_ipts = ost->sync_opts; duration += delta0; delta0 = 0; } switch (format_video_sync) { case VSYNC_VSCFR: if (ost->frame_number == 0 && delta0 >= 0.5) { av_log(NULL, AV_LOG_DEBUG, "Not duplicating %d initial frames\n", (int)lrintf(delta0)); delta = duration; delta0 = 0; ost->sync_opts = lrint(sync_ipts); } case VSYNC_CFR: // FIXME set to 0.5 after we fix some dts/pts bugs like in avidec.c if (frame_drop_threshold && delta < frame_drop_threshold && ost->frame_number) { nb_frames = 0; } else if (delta < -1.1) nb_frames = 0; else if (delta > 1.1) { nb_frames = lrintf(delta); if (delta0 > 1.1) nb0_frames = lrintf(delta0 - 0.6); } break; case VSYNC_VFR: if (delta <= -0.6) nb_frames = 0; else if (delta > 0.6) ost->sync_opts = lrint(sync_ipts); break; case VSYNC_DROP: case VSYNC_PASSTHROUGH: ost->sync_opts = 
lrint(sync_ipts); break; default: av_assert0(0); } } nb_frames = FFMIN(nb_frames, ost->max_frames - ost->frame_number); nb0_frames = FFMIN(nb0_frames, nb_frames); memmove(ost->last_nb0_frames + 1, ost->last_nb0_frames, sizeof(ost->last_nb0_frames[0]) * (FF_ARRAY_ELEMS(ost->last_nb0_frames) - 1)); ost->last_nb0_frames[0] = nb0_frames; if (nb0_frames == 0 && ost->last_dropped) { nb_frames_drop++; av_log(NULL, AV_LOG_VERBOSE, "*** dropping frame %d from stream %d at ts %"PRId64"\n", ost->frame_number, ost->st->index, ost->last_frame->pts); } if (nb_frames > (nb0_frames && ost->last_dropped) + (nb_frames > nb0_frames)) { if (nb_frames > dts_error_threshold * 30) { av_log(NULL, AV_LOG_ERROR, "%d frame duplication too large, skipping\n", nb_frames - 1); nb_frames_drop++; return; } nb_frames_dup += nb_frames - (nb0_frames && ost->last_dropped) - (nb_frames > nb0_frames); av_log(NULL, AV_LOG_VERBOSE, "*** %d dup!\n", nb_frames - 1); } ost->last_dropped = nb_frames == nb0_frames && next_picture; /* duplicates frame if needed */ for (i = 0; i < nb_frames; i++) { AVFrame *in_picture; av_init_packet(&pkt); pkt.data = NULL; pkt.size = 0; if (i < nb0_frames && ost->last_frame) { in_picture = ost->last_frame; } else in_picture = next_picture; if (!in_picture) return; in_picture->pts = ost->sync_opts; #if 1 if (!check_recording_time(ost)) #else if (ost->frame_number >= ost->max_frames) #endif return; #if FF_API_LAVF_FMT_RAWPICTURE if (of->ctx->oformat->flags & AVFMT_RAWPICTURE && enc->codec->id == AV_CODEC_ID_RAWVIDEO) { /* raw pictures are written as AVPicture structure to avoid any copies. We support temporarily the older method. */ if (in_picture->interlaced_frame) mux_par->field_order = in_picture->top_field_first ? 
AV_FIELD_TB:AV_FIELD_BT; else mux_par->field_order = AV_FIELD_PROGRESSIVE; pkt.data = (uint8_t *)in_picture; pkt.size = sizeof(AVPicture); pkt.pts = av_rescale_q(in_picture->pts, enc->time_base, ost->st->time_base); pkt.flags |= AV_PKT_FLAG_KEY; output_packet(of, &pkt, ost); } else #endif { int forced_keyframe = 0; double pts_time; if (enc->flags & (AV_CODEC_FLAG_INTERLACED_DCT | AV_CODEC_FLAG_INTERLACED_ME) && ost->top_field_first >= 0) in_picture->top_field_first = !!ost->top_field_first; if (in_picture->interlaced_frame) { if (enc->codec->id == AV_CODEC_ID_MJPEG) mux_par->field_order = in_picture->top_field_first ? AV_FIELD_TT:AV_FIELD_BB; else mux_par->field_order = in_picture->top_field_first ? AV_FIELD_TB:AV_FIELD_BT; } else mux_par->field_order = AV_FIELD_PROGRESSIVE; in_picture->quality = enc->global_quality; in_picture->pict_type = 0; pts_time = in_picture->pts != AV_NOPTS_VALUE ? in_picture->pts * av_q2d(enc->time_base) : NAN; if (ost->forced_kf_index < ost->forced_kf_count && in_picture->pts >= ost->forced_kf_pts[ost->forced_kf_index]) { ost->forced_kf_index++; forced_keyframe = 1; } else if (ost->forced_keyframes_pexpr) { double res; ost->forced_keyframes_expr_const_values[FKF_T] = pts_time; res = av_expr_eval(ost->forced_keyframes_pexpr, ost->forced_keyframes_expr_const_values, NULL); ff_dlog(NULL, "force_key_frame: n:%f n_forced:%f prev_forced_n:%f t:%f prev_forced_t:%f -> res:%f\n", ost->forced_keyframes_expr_const_values[FKF_N], ost->forced_keyframes_expr_const_values[FKF_N_FORCED], ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_N], ost->forced_keyframes_expr_const_values[FKF_T], ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_T], res); if (res) { forced_keyframe = 1; ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_N] = ost->forced_keyframes_expr_const_values[FKF_N]; ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_T] = ost->forced_keyframes_expr_const_values[FKF_T]; 
ost->forced_keyframes_expr_const_values[FKF_N_FORCED] += 1; } ost->forced_keyframes_expr_const_values[FKF_N] += 1; } else if ( ost->forced_keyframes && !strncmp(ost->forced_keyframes, "source", 6) && in_picture->key_frame==1) { forced_keyframe = 1; } if (forced_keyframe) { in_picture->pict_type = AV_PICTURE_TYPE_I; av_log(NULL, AV_LOG_DEBUG, "Forced keyframe at time %f\n", pts_time); } update_benchmark(NULL); if (debug_ts) { av_log(NULL, AV_LOG_INFO, "encoder <- type:video " "frame_pts:%s frame_pts_time:%s time_base:%d/%d\n", av_ts2str(in_picture->pts), av_ts2timestr(in_picture->pts, &enc->time_base), enc->time_base.num, enc->time_base.den); } ost->frames_encoded++; ret = avcodec_send_frame(enc, in_picture); if (ret < 0) goto error; while (1) { ret = avcodec_receive_packet(enc, &pkt); update_benchmark("encode_video %d.%d", ost->file_index, ost->index); if (ret == AVERROR(EAGAIN)) break; if (ret < 0) goto error; if (debug_ts) { av_log(NULL, AV_LOG_INFO, "encoder -> type:video " "pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s\n", av_ts2str(pkt.pts), av_ts2timestr(pkt.pts, &enc->time_base), av_ts2str(pkt.dts), av_ts2timestr(pkt.dts, &enc->time_base)); } if (pkt.pts == AV_NOPTS_VALUE && !(enc->codec->capabilities & AV_CODEC_CAP_DELAY)) pkt.pts = ost->sync_opts; av_packet_rescale_ts(&pkt, enc->time_base, ost->st->time_base); if (debug_ts) { av_log(NULL, AV_LOG_INFO, "encoder -> type:video " "pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s\n", av_ts2str(pkt.pts), av_ts2timestr(pkt.pts, &ost->st->time_base), av_ts2str(pkt.dts), av_ts2timestr(pkt.dts, &ost->st->time_base)); } frame_size = pkt.size; output_packet(of, &pkt, ost); /* if two pass, output log */ if (ost->logfile && enc->stats_out) { fprintf(ost->logfile, "%s", enc->stats_out); } } } ost->sync_opts++; /* * For video, number of frames in == number of packets out. * But there may be reordering, so we can't throw away frames on encoder * flush, we need to limit them here, before they go into encoder. 
*/ ost->frame_number++; if (vstats_filename && frame_size) do_video_stats(ost, frame_size); } if (!ost->last_frame) ost->last_frame = av_frame_alloc(); av_frame_unref(ost->last_frame); if (next_picture && ost->last_frame) av_frame_ref(ost->last_frame, next_picture); else av_frame_free(&ost->last_frame); return; error: av_log(NULL, AV_LOG_FATAL, "Video encoding failed\n"); exit_program(1); } static double psnr(double d) { return -10.0 * log10(d); } static void do_video_stats(OutputStream *ost, int frame_size) { AVCodecContext *enc; int frame_number; double ti1, bitrate, avg_bitrate; /* this is executed just the first time do_video_stats is called */ if (!vstats_file) { vstats_file = fopen(vstats_filename, "w"); if (!vstats_file) { perror("fopen"); exit_program(1); } } enc = ost->enc_ctx; if (enc->codec_type == AVMEDIA_TYPE_VIDEO) { frame_number = ost->st->nb_frames; fprintf(vstats_file, "frame= %5d q= %2.1f ", frame_number, ost->quality / (float)FF_QP2LAMBDA); if (ost->error[0]>=0 && (enc->flags & AV_CODEC_FLAG_PSNR)) fprintf(vstats_file, "PSNR= %6.2f ", psnr(ost->error[0] / (enc->width * enc->height * 255.0 * 255.0))); fprintf(vstats_file,"f_size= %6d ", frame_size); /* compute pts value */ ti1 = av_stream_get_end_pts(ost->st) * av_q2d(ost->st->time_base); if (ti1 < 0.01) ti1 = 0.01; bitrate = (frame_size * 8) / av_q2d(enc->time_base) / 1000.0; avg_bitrate = (double)(ost->data_size * 8) / ti1 / 1000.0; fprintf(vstats_file, "s_size= %8.0fkB time= %0.3f br= %7.1fkbits/s avg_br= %7.1fkbits/s ", (double)ost->data_size / 1024, ti1, bitrate, avg_bitrate); fprintf(vstats_file, "type= %c\n", av_get_picture_type_char(ost->pict_type)); } } static void finish_output_stream(OutputStream *ost) { OutputFile *of = output_files[ost->file_index]; int i; ost->finished = ENCODER_FINISHED | MUXER_FINISHED; if (of->shortest) { for (i = 0; i < of->ctx->nb_streams; i++) output_streams[of->ost_index + i]->finished = ENCODER_FINISHED | MUXER_FINISHED; } } /** * Get and encode new output 
from any of the filtergraphs, without causing * activity. * * @return 0 for success, <0 for severe errors */ static int reap_filters(int flush) { AVFrame *filtered_frame = NULL; int i; /* Reap all buffers present in the buffer sinks */ for (i = 0; i < nb_output_streams; i++) { OutputStream *ost = output_streams[i]; OutputFile *of = output_files[ost->file_index]; AVFilterContext *filter; AVCodecContext *enc = ost->enc_ctx; int ret = 0; if (!ost->filter) continue; filter = ost->filter->filter; if (!ost->filtered_frame && !(ost->filtered_frame = av_frame_alloc())) { return AVERROR(ENOMEM); } filtered_frame = ost->filtered_frame; while (1) { double float_pts = AV_NOPTS_VALUE; // this is identical to filtered_frame.pts but with higher precision ret = av_buffersink_get_frame_flags(filter, filtered_frame, AV_BUFFERSINK_FLAG_NO_REQUEST); if (ret < 0) { if (ret != AVERROR(EAGAIN) && ret != AVERROR_EOF) { av_log(NULL, AV_LOG_WARNING, "Error in av_buffersink_get_frame_flags(): %s\n", av_err2str(ret)); } else if (flush && ret == AVERROR_EOF) { if (filter->inputs[0]->type == AVMEDIA_TYPE_VIDEO) do_video_out(of, ost, NULL, AV_NOPTS_VALUE); } break; } if (ost->finished) { av_frame_unref(filtered_frame); continue; } if (filtered_frame->pts != AV_NOPTS_VALUE) { int64_t start_time = (of->start_time == AV_NOPTS_VALUE) ? 
0 : of->start_time;

                AVRational tb = enc->time_base;
                int extra_bits = av_clip(29 - av_log2(tb.den), 0, 16);

                tb.den <<= extra_bits;
                float_pts =
                    av_rescale_q(filtered_frame->pts, filter->inputs[0]->time_base, tb) -
                    av_rescale_q(start_time, AV_TIME_BASE_Q, tb);
                float_pts /= 1 << extra_bits;
                // avoid exact midpoints to reduce the chance of rounding differences, this can be removed in case the fps code is changed to work with integers
                float_pts += FFSIGN(float_pts) * 1.0 / (1<<17);

                filtered_frame->pts =
                    av_rescale_q(filtered_frame->pts, filter->inputs[0]->time_base, enc->time_base) -
                    av_rescale_q(start_time, AV_TIME_BASE_Q, enc->time_base);
            }
            //if (ost->source_index >= 0)
            //    *filtered_frame= *input_streams[ost->source_index]->decoded_frame; //for me_threshold

            switch (filter->inputs[0]->type) {
            case AVMEDIA_TYPE_VIDEO:
                if (!ost->frame_aspect_ratio.num)
                    enc->sample_aspect_ratio = filtered_frame->sample_aspect_ratio;

                if (debug_ts) {
                    av_log(NULL, AV_LOG_INFO, "filter -> pts:%s pts_time:%s exact:%f time_base:%d/%d\n",
                            av_ts2str(filtered_frame->pts), av_ts2timestr(filtered_frame->pts, &enc->time_base),
                            float_pts,
                            enc->time_base.num, enc->time_base.den);
                }

                do_video_out(of, ost, filtered_frame, float_pts);
                break;
            case AVMEDIA_TYPE_AUDIO:
                if (!(enc->codec->capabilities & AV_CODEC_CAP_PARAM_CHANGE) &&
                    enc->channels != av_frame_get_channels(filtered_frame)) {
                    av_log(NULL, AV_LOG_ERROR,
                           "Audio filter graph output is not normalized and encoder does not support parameter changes\n");
                    break;
                }
                do_audio_out(of, ost, filtered_frame);
                break;
            default:
                // TODO support subtitle filters
                av_assert0(0);
            }

            av_frame_unref(filtered_frame);
        }
    }

    return 0;
}

static void print_final_stats(int64_t total_size)
{
    uint64_t video_size = 0, audio_size = 0, extra_size = 0, other_size = 0;
    uint64_t subtitle_size = 0;
    uint64_t data_size = 0;
    float percent = -1.0;
    int i, j;
    int pass1_used = 1;

    for (i = 0; i < nb_output_streams; i++) {
        OutputStream *ost = output_streams[i];
        switch (ost->enc_ctx->codec_type) {
case AVMEDIA_TYPE_VIDEO:    video_size    += ost->data_size; break;
        case AVMEDIA_TYPE_AUDIO:    audio_size    += ost->data_size; break;
        case AVMEDIA_TYPE_SUBTITLE: subtitle_size += ost->data_size; break;
        default:                    other_size    += ost->data_size; break;
        }
        extra_size += ost->enc_ctx->extradata_size;
        data_size  += ost->data_size;
        if (   (ost->enc_ctx->flags & (AV_CODEC_FLAG_PASS1 | AV_CODEC_FLAG_PASS2))
            != AV_CODEC_FLAG_PASS1)
            pass1_used = 0;
    }

    if (data_size && total_size>0 && total_size >= data_size)
        percent = 100.0 * (total_size - data_size) / data_size;

    av_log(NULL, AV_LOG_INFO, "video:%1.0fkB audio:%1.0fkB subtitle:%1.0fkB other streams:%1.0fkB global headers:%1.0fkB muxing overhead: ",
           video_size / 1024.0,
           audio_size / 1024.0,
           subtitle_size / 1024.0,
           other_size / 1024.0,
           extra_size / 1024.0);
    if (percent >= 0.0)
        av_log(NULL, AV_LOG_INFO, "%f%%", percent);
    else
        av_log(NULL, AV_LOG_INFO, "unknown");
    av_log(NULL, AV_LOG_INFO, "\n");

    /* print verbose per-stream stats */
    for (i = 0; i < nb_input_files; i++) {
        InputFile *f = input_files[i];
        uint64_t total_packets = 0, total_size = 0;

        av_log(NULL, AV_LOG_VERBOSE, "Input file #%d (%s):\n",
               i, f->ctx->filename);

        for (j = 0; j < f->nb_streams; j++) {
            InputStream *ist = input_streams[f->ist_index + j];
            enum AVMediaType type = ist->dec_ctx->codec_type;

            total_size    += ist->data_size;
            total_packets += ist->nb_packets;

            av_log(NULL, AV_LOG_VERBOSE, "  Input stream #%d:%d (%s): ",
                   i, j, media_type_string(type));
            av_log(NULL, AV_LOG_VERBOSE, "%"PRIu64" packets read (%"PRIu64" bytes); ",
                   ist->nb_packets, ist->data_size);

            if (ist->decoding_needed) {
                av_log(NULL, AV_LOG_VERBOSE, "%"PRIu64" frames decoded",
                       ist->frames_decoded);
                if (type == AVMEDIA_TYPE_AUDIO)
                    av_log(NULL, AV_LOG_VERBOSE, " (%"PRIu64" samples)", ist->samples_decoded);
                av_log(NULL, AV_LOG_VERBOSE, "; ");
            }

            av_log(NULL, AV_LOG_VERBOSE, "\n");
        }

        av_log(NULL, AV_LOG_VERBOSE, "  Total: %"PRIu64" packets (%"PRIu64" bytes) demuxed\n",
               total_packets, total_size);
    }

    for (i = 0; i < nb_output_files;
i++) { OutputFile *of = output_files[i]; uint64_t total_packets = 0, total_size = 0; av_log(NULL, AV_LOG_VERBOSE, "Output file #%d (%s):\n", i, of->ctx->filename); for (j = 0; j < of->ctx->nb_streams; j++) { OutputStream *ost = output_streams[of->ost_index + j]; enum AVMediaType type = ost->enc_ctx->codec_type; total_size += ost->data_size; total_packets += ost->packets_written; av_log(NULL, AV_LOG_VERBOSE, " Output stream #%d:%d (%s): ", i, j, media_type_string(type)); if (ost->encoding_needed) { av_log(NULL, AV_LOG_VERBOSE, "%"PRIu64" frames encoded", ost->frames_encoded); if (type == AVMEDIA_TYPE_AUDIO) av_log(NULL, AV_LOG_VERBOSE, " (%"PRIu64" samples)", ost->samples_encoded); av_log(NULL, AV_LOG_VERBOSE, "; "); } av_log(NULL, AV_LOG_VERBOSE, "%"PRIu64" packets muxed (%"PRIu64" bytes); ", ost->packets_written, ost->data_size); av_log(NULL, AV_LOG_VERBOSE, "\n"); } av_log(NULL, AV_LOG_VERBOSE, " Total: %"PRIu64" packets (%"PRIu64" bytes) muxed\n", total_packets, total_size); } if(video_size + data_size + audio_size + subtitle_size + extra_size == 0){ av_log(NULL, AV_LOG_WARNING, "Output file is empty, nothing was encoded "); if (pass1_used) { av_log(NULL, AV_LOG_WARNING, "\n"); } else { av_log(NULL, AV_LOG_WARNING, "(check -ss / -t / -frames parameters if used)\n"); } } } static void print_report(int is_last_report, int64_t timer_start, int64_t cur_time) { char buf[1024]; AVBPrint buf_script; OutputStream *ost; AVFormatContext *oc; int64_t total_size; AVCodecContext *enc; int frame_number, vid, i; double bitrate; double speed; int64_t pts = INT64_MIN + 1; static int64_t last_time = -1; static int qp_histogram[52]; int hours, mins, secs, us; int ret; float t; if (!print_stats && !is_last_report && !progress_avio) return; if (!is_last_report) { if (last_time == -1) { last_time = cur_time; return; } if ((cur_time - last_time) < 500000) return; last_time = cur_time; } t = (cur_time-timer_start) / 1000000.0; oc = output_files[0]->ctx; total_size = avio_size(oc->pb); 
    if (total_size <= 0) // FIXME improve avio_size() so it works with non-seekable output too
        total_size = avio_tell(oc->pb);

    buf[0] = '\0';
    vid = 0;
    av_bprint_init(&buf_script, 0, 1);
    for (i = 0; i < nb_output_streams; i++) {
        float q = -1;
        ost = output_streams[i];
        enc = ost->enc_ctx;
        if (!ost->stream_copy)
            q = ost->quality / (float) FF_QP2LAMBDA;

        if (vid && enc->codec_type == AVMEDIA_TYPE_VIDEO) {
            snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "q=%2.1f ", q);
            av_bprintf(&buf_script, "stream_%d_%d_q=%.1f\n",
                       ost->file_index, ost->index, q);
        }
        if (!vid && enc->codec_type == AVMEDIA_TYPE_VIDEO) {
            float fps;

            frame_number = ost->frame_number;
            fps = t > 1 ? frame_number / t : 0;
            snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "frame=%5d fps=%3.*f q=%3.1f ",
                     frame_number, fps < 9.95, fps, q);
            av_bprintf(&buf_script, "frame=%d\n", frame_number);
            av_bprintf(&buf_script, "fps=%.1f\n", fps);
            av_bprintf(&buf_script, "stream_%d_%d_q=%.1f\n",
                       ost->file_index, ost->index, q);
            if (is_last_report)
                snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "L");
            if (qp_hist) {
                int j;
                int qp = lrintf(q);
                if (qp >= 0 && qp < FF_ARRAY_ELEMS(qp_histogram))
                    qp_histogram[qp]++;
                for (j = 0; j < 32; j++)
                    snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "%X",
                             av_log2(qp_histogram[j] + 1));
            }

            if ((enc->flags & AV_CODEC_FLAG_PSNR) &&
                (ost->pict_type != AV_PICTURE_TYPE_NONE || is_last_report)) {
                int j;
                double error, error_sum = 0;
                double scale, scale_sum = 0;
                double p;
                char type[3] = { 'Y','U','V' };
                snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "PSNR=");
                for (j = 0; j < 3; j++) {
                    if (is_last_report) {
                        error = enc->error[j];
                        scale = enc->width * enc->height * 255.0 * 255.0 * frame_number;
                    } else {
                        error = ost->error[j];
                        scale = enc->width * enc->height * 255.0 * 255.0;
                    }
                    if (j)
                        scale /= 4;
                    error_sum += error;
                    scale_sum += scale;
                    p = psnr(error / scale);
                    snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "%c:%2.2f ", type[j], p);
                    av_bprintf(&buf_script, "stream_%d_%d_psnr_%c=%2.2f\n",
                               ost->file_index, ost->index, type[j] | 32, p);
                }
                p = psnr(error_sum / scale_sum);
                snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "*:%2.2f ",
                         psnr(error_sum / scale_sum));
                av_bprintf(&buf_script, "stream_%d_%d_psnr_all=%2.2f\n",
                           ost->file_index, ost->index, p);
            }
            vid = 1;
        }
        /* compute min output value */
        if (av_stream_get_end_pts(ost->st) != AV_NOPTS_VALUE)
            pts = FFMAX(pts, av_rescale_q(av_stream_get_end_pts(ost->st),
                                          ost->st->time_base, AV_TIME_BASE_Q));
        if (is_last_report)
            nb_frames_drop += ost->last_dropped;
    }

    secs = FFABS(pts) / AV_TIME_BASE;
    us = FFABS(pts) % AV_TIME_BASE;
    mins = secs / 60;
    secs %= 60;
    hours = mins / 60;
    mins %= 60;

    bitrate = pts && total_size >= 0 ? total_size * 8 / (pts / 1000.0) : -1;
    speed = t != 0.0 ? (double)pts / AV_TIME_BASE / t : -1;

    if (total_size < 0)
        snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "size=N/A time=");
    else
        snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "size=%8.0fkB time=",
                 total_size / 1024.0);
    if (pts < 0)
        snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "-");
    snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf),
             "%02d:%02d:%02d.%02d ", hours, mins, secs, (100 * us) / AV_TIME_BASE);

    if (bitrate < 0) {
        snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "bitrate=N/A");
        av_bprintf(&buf_script, "bitrate=N/A\n");
    } else {
        snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "bitrate=%6.1fkbits/s", bitrate);
        av_bprintf(&buf_script, "bitrate=%6.1fkbits/s\n", bitrate);
    }

    if (total_size < 0)
        av_bprintf(&buf_script, "total_size=N/A\n");
    else
        av_bprintf(&buf_script, "total_size=%"PRId64"\n", total_size);
    av_bprintf(&buf_script, "out_time_ms=%"PRId64"\n", pts);
    av_bprintf(&buf_script, "out_time=%02d:%02d:%02d.%06d\n", hours, mins, secs, us);

    if (nb_frames_dup || nb_frames_drop)
        snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), " dup=%d drop=%d",
                 nb_frames_dup, nb_frames_drop);
    av_bprintf(&buf_script, "dup_frames=%d\n", nb_frames_dup);
    av_bprintf(&buf_script, "drop_frames=%d\n", nb_frames_drop);

    if (speed < 0) {
        snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), " speed=N/A");
        av_bprintf(&buf_script, "speed=N/A\n");
    } else {
        snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), " speed=%4.3gx", speed);
        av_bprintf(&buf_script, "speed=%4.3gx\n", speed);
    }

    if (print_stats || is_last_report) {
        const char end = is_last_report ? '\n' : '\r';
        if (print_stats == 1 && AV_LOG_INFO > av_log_get_level()) {
            fprintf(stderr, "%s %c", buf, end);
        } else
            av_log(NULL, AV_LOG_INFO, "%s %c", buf, end);

        fflush(stderr);
    }

    if (progress_avio) {
        av_bprintf(&buf_script, "progress=%s\n",
                   is_last_report ? "end" : "continue");
        avio_write(progress_avio, buf_script.str,
                   FFMIN(buf_script.len, buf_script.size - 1));
        avio_flush(progress_avio);
        av_bprint_finalize(&buf_script, NULL);
        if (is_last_report) {
            if ((ret = avio_closep(&progress_avio)) < 0)
                av_log(NULL, AV_LOG_ERROR,
                       "Error closing progress log, loss of information possible: %s\n",
                       av_err2str(ret));
        }
    }

    if (is_last_report)
        print_final_stats(total_size);
}

static void flush_encoders(void)
{
    int i, ret;

    for (i = 0; i < nb_output_streams; i++) {
        OutputStream   *ost = output_streams[i];
        AVCodecContext *enc = ost->enc_ctx;
        OutputFile      *of = output_files[ost->file_index];
        int stop_encoding = 0;

        if (!ost->encoding_needed)
            continue;

        if (enc->codec_type == AVMEDIA_TYPE_AUDIO && enc->frame_size <= 1)
            continue;
#if FF_API_LAVF_FMT_RAWPICTURE
        if (enc->codec_type == AVMEDIA_TYPE_VIDEO &&
            (of->ctx->oformat->flags & AVFMT_RAWPICTURE) &&
            enc->codec->id == AV_CODEC_ID_RAWVIDEO)
            continue;
#endif

        if (enc->codec_type != AVMEDIA_TYPE_VIDEO && enc->codec_type != AVMEDIA_TYPE_AUDIO)
            continue;

        avcodec_send_frame(enc, NULL);

        for (;;) {
            const char *desc = NULL;

            switch (enc->codec_type) {
            case AVMEDIA_TYPE_AUDIO:
                desc = "audio";
                break;
            case AVMEDIA_TYPE_VIDEO:
                desc = "video";
                break;
            default:
                av_assert0(0);
            }

            if (1) {
                AVPacket pkt;
                int pkt_size;
                av_init_packet(&pkt);
                pkt.data = NULL;
                pkt.size = 0;

                update_benchmark(NULL);
                ret = avcodec_receive_packet(enc, &pkt);
                update_benchmark("flush_%s %d.%d", desc, ost->file_index, ost->index);
                if (ret < 0 && ret != AVERROR_EOF) {
                    av_log(NULL, AV_LOG_FATAL, "%s encoding failed: %s\n",
                           desc, av_err2str(ret));
                    exit_program(1);
                }
                if (ost->logfile && enc->stats_out) {
                    fprintf(ost->logfile, "%s", enc->stats_out);
                }
                if (ret == AVERROR_EOF) {
                    stop_encoding = 1;
                    break;
                }
                if (ost->finished & MUXER_FINISHED) {
                    av_packet_unref(&pkt);
                    continue;
                }
                av_packet_rescale_ts(&pkt, enc->time_base, ost->st->time_base);
                pkt_size = pkt.size;
                output_packet(of, &pkt, ost);
                if (ost->enc_ctx->codec_type == AVMEDIA_TYPE_VIDEO && vstats_filename) {
                    do_video_stats(ost, pkt_size);
                }
            }

            if (stop_encoding)
                break;
        }
    }
}

/*
 * Check whether a packet from ist should be written into ost at this time
 */
static int check_output_constraints(InputStream *ist, OutputStream *ost)
{
    OutputFile *of = output_files[ost->file_index];
    int ist_index  = input_files[ist->file_index]->ist_index + ist->st->index;

    if (ost->source_index != ist_index)
        return 0;

    if (ost->finished)
        return 0;

    if (of->start_time != AV_NOPTS_VALUE && ist->pts < of->start_time)
        return 0;

    return 1;
}

static void do_streamcopy(InputStream *ist, OutputStream *ost, const AVPacket *pkt)
{
    OutputFile *of = output_files[ost->file_index];
    InputFile   *f = input_files [ist->file_index];
    int64_t start_time = (of->start_time == AV_NOPTS_VALUE) ? 0 : of->start_time;
    int64_t ost_tb_start_time = av_rescale_q(start_time, AV_TIME_BASE_Q, ost->st->time_base);
    AVPicture pict;
    AVPacket opkt;

    av_init_packet(&opkt);

    if ((!ost->frame_number && !(pkt->flags & AV_PKT_FLAG_KEY)) &&
        !ost->copy_initial_nonkeyframes)
        return;

    if (!ost->frame_number && !ost->copy_prior_start) {
        int64_t comp_start = start_time;
        if (copy_ts && f->start_time != AV_NOPTS_VALUE)
            comp_start = FFMAX(start_time, f->start_time + f->ts_offset);
        if (pkt->pts == AV_NOPTS_VALUE ?
            ist->pts < comp_start :
            pkt->pts < av_rescale_q(comp_start, AV_TIME_BASE_Q, ist->st->time_base))
            return;
    }

    if (of->recording_time != INT64_MAX &&
        ist->pts >= of->recording_time + start_time) {
        close_output_stream(ost);
        return;
    }

    if (f->recording_time != INT64_MAX) {
        start_time = f->ctx->start_time;
        if (f->start_time != AV_NOPTS_VALUE && copy_ts)
            start_time += f->start_time;
        if (ist->pts >= f->recording_time + start_time) {
            close_output_stream(ost);
            return;
        }
    }

    /* force the input stream PTS */
    if (ost->enc_ctx->codec_type == AVMEDIA_TYPE_VIDEO)
        ost->sync_opts++;

    if (pkt->pts != AV_NOPTS_VALUE)
        opkt.pts = av_rescale_q(pkt->pts, ist->st->time_base, ost->st->time_base) - ost_tb_start_time;
    else
        opkt.pts = AV_NOPTS_VALUE;

    if (pkt->dts == AV_NOPTS_VALUE)
        opkt.dts = av_rescale_q(ist->dts, AV_TIME_BASE_Q, ost->st->time_base);
    else
        opkt.dts = av_rescale_q(pkt->dts, ist->st->time_base, ost->st->time_base);
    opkt.dts -= ost_tb_start_time;

    if (ost->st->codecpar->codec_type == AVMEDIA_TYPE_AUDIO && pkt->dts != AV_NOPTS_VALUE) {
        int duration = av_get_audio_frame_duration(ist->dec_ctx, pkt->size);
        if (!duration)
            duration = ist->dec_ctx->frame_size;
        opkt.dts = opkt.pts = av_rescale_delta(ist->st->time_base, pkt->dts,
                                               (AVRational){1, ist->dec_ctx->sample_rate},
                                               duration,
                                               &ist->filter_in_rescale_delta_last,
                                               ost->st->time_base) - ost_tb_start_time;
    }

    opkt.duration = av_rescale_q(pkt->duration, ist->st->time_base, ost->st->time_base);

    opkt.flags = pkt->flags;
    // FIXME: remove the following 2 lines; they shall be replaced by the bitstream filters
    if (  ost->st->codecpar->codec_id != AV_CODEC_ID_H264
       && ost->st->codecpar->codec_id != AV_CODEC_ID_MPEG1VIDEO
       && ost->st->codecpar->codec_id != AV_CODEC_ID_MPEG2VIDEO
       && ost->st->codecpar->codec_id != AV_CODEC_ID_VC1
       ) {
        int ret = av_parser_change(ost->parser, ost->parser_avctx,
                                   &opkt.data, &opkt.size,
                                   pkt->data, pkt->size,
                                   pkt->flags & AV_PKT_FLAG_KEY);
        if (ret < 0) {
            av_log(NULL, AV_LOG_FATAL, "av_parser_change failed: %s\n",
                   av_err2str(ret));
            exit_program(1);
        }
        if (ret) {
            opkt.buf = av_buffer_create(opkt.data, opkt.size, av_buffer_default_free, NULL, 0);
            if (!opkt.buf)
                exit_program(1);
        }
    } else {
        opkt.data = pkt->data;
        opkt.size = pkt->size;
    }
    av_copy_packet_side_data(&opkt, pkt);

#if FF_API_LAVF_FMT_RAWPICTURE
    if (ost->st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO &&
        ost->st->codecpar->codec_id == AV_CODEC_ID_RAWVIDEO &&
        (of->ctx->oformat->flags & AVFMT_RAWPICTURE)) {
        /* store AVPicture in AVPacket, as expected by the output format */
        int ret = avpicture_fill(&pict, opkt.data, ost->st->codecpar->format,
                                 ost->st->codecpar->width, ost->st->codecpar->height);
        if (ret < 0) {
            av_log(NULL, AV_LOG_FATAL, "avpicture_fill failed: %s\n",
                   av_err2str(ret));
            exit_program(1);
        }
        opkt.data = (uint8_t *)&pict;
        opkt.size = sizeof(AVPicture);
        opkt.flags |= AV_PKT_FLAG_KEY;
    }
#endif

    output_packet(of, &opkt, ost);
}

int guess_input_channel_layout(InputStream *ist)
{
    AVCodecContext *dec = ist->dec_ctx;

    if (!dec->channel_layout) {
        char layout_name[256];

        if (dec->channels > ist->guess_layout_max)
            return 0;
        dec->channel_layout = av_get_default_channel_layout(dec->channels);
        if (!dec->channel_layout)
            return 0;
        av_get_channel_layout_string(layout_name, sizeof(layout_name),
                                     dec->channels, dec->channel_layout);
        av_log(NULL, AV_LOG_WARNING, "Guessed Channel Layout for Input Stream "
               "#%d.%d : %s\n", ist->file_index, ist->st->index, layout_name);
    }
    return 1;
}

static void check_decode_result(InputStream *ist, int *got_output, int ret)
{
    if (*got_output || ret < 0)
        decode_error_stat[ret < 0]++;

    if (ret < 0 && exit_on_error)
        exit_program(1);

    if (exit_on_error && *got_output && ist) {
        if (av_frame_get_decode_error_flags(ist->decoded_frame) ||
            (ist->decoded_frame->flags & AV_FRAME_FLAG_CORRUPT)) {
            av_log(NULL, AV_LOG_FATAL, "%s: corrupt decoded frame in stream %d\n",
                   input_files[ist->file_index]->ctx->filename, ist->st->index);
            exit_program(1);
        }
    }
}

// This does not quite work like avcodec_decode_audio4/avcodec_decode_video2.
// There is the following difference: if you got a frame, you must call
// it again with pkt=NULL. pkt==NULL is treated differently from pkt.size==0
// (pkt==NULL means get more output, pkt.size==0 is a flush/drain packet)
static int decode(AVCodecContext *avctx, AVFrame *frame, int *got_frame, AVPacket *pkt)
{
    int ret;

    *got_frame = 0;

    if (pkt) {
        ret = avcodec_send_packet(avctx, pkt);
        // In particular, we don't expect AVERROR(EAGAIN), because we read all
        // decoded frames with avcodec_receive_frame() until done.
        if (ret < 0 && ret != AVERROR_EOF)
            return ret;
    }

    ret = avcodec_receive_frame(avctx, frame);
    if (ret < 0 && ret != AVERROR(EAGAIN))
        return ret;
    if (ret >= 0)
        *got_frame = 1;

    return 0;
}

static int decode_audio(InputStream *ist, AVPacket *pkt, int *got_output)
{
    AVFrame *decoded_frame, *f;
    AVCodecContext *avctx = ist->dec_ctx;
    int i, ret, err = 0, resample_changed;
    AVRational decoded_frame_tb;

    if (!ist->decoded_frame && !(ist->decoded_frame = av_frame_alloc()))
        return AVERROR(ENOMEM);
    if (!ist->filter_frame && !(ist->filter_frame = av_frame_alloc()))
        return AVERROR(ENOMEM);
    decoded_frame = ist->decoded_frame;

    update_benchmark(NULL);
    ret = decode(avctx, decoded_frame, got_output, pkt);
    update_benchmark("decode_audio %d.%d", ist->file_index, ist->st->index);

    if (ret >= 0 && avctx->sample_rate <= 0) {
        av_log(avctx, AV_LOG_ERROR, "Sample rate %d invalid\n", avctx->sample_rate);
        ret = AVERROR_INVALIDDATA;
    }

    if (ret != AVERROR_EOF)
        check_decode_result(ist, got_output, ret);

    if (!*got_output || ret < 0)
        return ret;

    ist->samples_decoded += decoded_frame->nb_samples;
    ist->frames_decoded++;

#if 1
    /* increment next_dts to use for the case where the input stream does not
       have timestamps or there are multiple frames in the packet */
    ist->next_pts += ((int64_t)AV_TIME_BASE * decoded_frame->nb_samples) /
                     avctx->sample_rate;
    ist->next_dts += ((int64_t)AV_TIME_BASE * decoded_frame->nb_samples) /
                     avctx->sample_rate;
#endif

    resample_changed = ist->resample_sample_fmt     != decoded_frame->format         ||
                       ist->resample_channels       != avctx->channels               ||
                       ist->resample_channel_layout != decoded_frame->channel_layout ||
                       ist->resample_sample_rate    != decoded_frame->sample_rate;
    if (resample_changed) {
        char layout1[64], layout2[64];

        if (!guess_input_channel_layout(ist)) {
            av_log(NULL, AV_LOG_FATAL, "Unable to find default channel "
                   "layout for Input Stream #%d.%d\n", ist->file_index,
                   ist->st->index);
            exit_program(1);
        }
        decoded_frame->channel_layout = avctx->channel_layout;

        av_get_channel_layout_string(layout1, sizeof(layout1), ist->resample_channels,
                                     ist->resample_channel_layout);
        av_get_channel_layout_string(layout2, sizeof(layout2), avctx->channels,
                                     decoded_frame->channel_layout);

        av_log(NULL, AV_LOG_INFO,
               "Input stream #%d:%d frame changed from rate:%d fmt:%s ch:%d chl:%s to rate:%d fmt:%s ch:%d chl:%s\n",
               ist->file_index, ist->st->index,
               ist->resample_sample_rate, av_get_sample_fmt_name(ist->resample_sample_fmt),
               ist->resample_channels, layout1,
               decoded_frame->sample_rate, av_get_sample_fmt_name(decoded_frame->format),
               avctx->channels, layout2);

        ist->resample_sample_fmt     = decoded_frame->format;
        ist->resample_sample_rate    = decoded_frame->sample_rate;
        ist->resample_channel_layout = decoded_frame->channel_layout;
        ist->resample_channels       = avctx->channels;

        for (i = 0; i < nb_filtergraphs; i++)
            if (ist_in_filtergraph(filtergraphs[i], ist)) {
                FilterGraph *fg = filtergraphs[i];
                if (configure_filtergraph(fg) < 0) {
                    av_log(NULL, AV_LOG_FATAL, "Error reinitializing filters!\n");
                    exit_program(1);
                }
            }
    }

    if (decoded_frame->pts != AV_NOPTS_VALUE) {
        decoded_frame_tb   = ist->st->time_base;
    } else if (pkt && pkt->pts != AV_NOPTS_VALUE) {
        decoded_frame->pts = pkt->pts;
        decoded_frame_tb   = ist->st->time_base;
    } else {
        decoded_frame->pts = ist->dts;
        decoded_frame_tb   = AV_TIME_BASE_Q;
    }
    if (decoded_frame->pts != AV_NOPTS_VALUE)
        decoded_frame->pts = av_rescale_delta(decoded_frame_tb, decoded_frame->pts,
                                              (AVRational){1, avctx->sample_rate},
                                              decoded_frame->nb_samples,
                                              &ist->filter_in_rescale_delta_last,
                                              (AVRational){1, avctx->sample_rate});
    ist->nb_samples = decoded_frame->nb_samples;
    for (i = 0; i < ist->nb_filters; i++) {
        if (i < ist->nb_filters - 1) {
            f = ist->filter_frame;
            err = av_frame_ref(f, decoded_frame);
            if (err < 0)
                break;
        } else
            f = decoded_frame;
        err = av_buffersrc_add_frame_flags(ist->filters[i]->filter, f,
                                           AV_BUFFERSRC_FLAG_PUSH);
        if (err == AVERROR_EOF)
            err = 0; /* ignore */
        if (err < 0)
            break;
    }
    decoded_frame->pts = AV_NOPTS_VALUE;

    av_frame_unref(ist->filter_frame);
    av_frame_unref(decoded_frame);
    return err < 0 ? err : ret;
}

static int decode_video(InputStream *ist, AVPacket *pkt, int *got_output, int eof)
{
    AVFrame *decoded_frame, *f;
    int i, ret = 0, err = 0, resample_changed;
    int64_t best_effort_timestamp;
    int64_t dts = AV_NOPTS_VALUE;
    AVRational *frame_sample_aspect;
    AVPacket avpkt;

    // With fate-indeo3-2, we're getting 0-sized packets before EOF for some
    // reason. This seems like a semi-critical bug. Don't trigger EOF, and
    // skip the packet.
    if (!eof && pkt && pkt->size == 0)
        return 0;

    if (!ist->decoded_frame && !(ist->decoded_frame = av_frame_alloc()))
        return AVERROR(ENOMEM);
    if (!ist->filter_frame && !(ist->filter_frame = av_frame_alloc()))
        return AVERROR(ENOMEM);
    decoded_frame = ist->decoded_frame;
    if (ist->dts != AV_NOPTS_VALUE)
        dts = av_rescale_q(ist->dts, AV_TIME_BASE_Q, ist->st->time_base);
    if (pkt) {
        avpkt = *pkt;
        avpkt.dts = dts; // ffmpeg.c probably shouldn't do this
    }

    // The old code used to set dts on the drain packet, which does not work
    // with the new API anymore.
    if (eof) {
        void *new = av_realloc_array(ist->dts_buffer, ist->nb_dts_buffer + 1,
                                     sizeof(ist->dts_buffer[0]));
        if (!new)
            return AVERROR(ENOMEM);
        ist->dts_buffer = new;
        ist->dts_buffer[ist->nb_dts_buffer++] = dts;
    }

    update_benchmark(NULL);
    ret = decode(ist->dec_ctx, decoded_frame, got_output, pkt ? &avpkt : NULL);
    update_benchmark("decode_video %d.%d", ist->file_index, ist->st->index);

    // The following line may be required in some cases where there is no parser
    // or the parser does not set has_b_frames correctly
    if (ist->st->codecpar->video_delay < ist->dec_ctx->has_b_frames) {
        if (ist->dec_ctx->codec_id == AV_CODEC_ID_H264) {
            ist->st->codecpar->video_delay = ist->dec_ctx->has_b_frames;
        } else
            av_log(ist->dec_ctx, AV_LOG_WARNING,
                   "video_delay is larger in decoder than demuxer %d > %d.\n"
                   "If you want to help, upload a sample "
                   "of this file to ftp://upload.ffmpeg.org/incoming/ "
                   "and contact the ffmpeg-devel mailing list. (ffmpeg-devel@ffmpeg.org)",
                   ist->dec_ctx->has_b_frames,
                   ist->st->codecpar->video_delay);
    }

    if (ret != AVERROR_EOF)
        check_decode_result(ist, got_output, ret);

    if (*got_output && ret >= 0) {
        if (ist->dec_ctx->width   != decoded_frame->width  ||
            ist->dec_ctx->height  != decoded_frame->height ||
            ist->dec_ctx->pix_fmt != decoded_frame->format) {
            av_log(NULL, AV_LOG_DEBUG, "Frame parameters mismatch context %d,%d,%d != %d,%d,%d\n",
                   decoded_frame->width,
                   decoded_frame->height,
                   decoded_frame->format,
                   ist->dec_ctx->width,
                   ist->dec_ctx->height,
                   ist->dec_ctx->pix_fmt);
        }
    }

    if (!*got_output || ret < 0)
        return ret;

    if (ist->top_field_first >= 0)
        decoded_frame->top_field_first = ist->top_field_first;

    ist->frames_decoded++;

    if (ist->hwaccel_retrieve_data && decoded_frame->format == ist->hwaccel_pix_fmt) {
        err = ist->hwaccel_retrieve_data(ist->dec_ctx, decoded_frame);
        if (err < 0)
            goto fail;
    }
    ist->hwaccel_retrieved_pix_fmt = decoded_frame->format;

    best_effort_timestamp = av_frame_get_best_effort_timestamp(decoded_frame);

    if (eof && best_effort_timestamp == AV_NOPTS_VALUE && ist->nb_dts_buffer > 0) {
        best_effort_timestamp = ist->dts_buffer[0];

        for (i = 0; i < ist->nb_dts_buffer - 1; i++)
            ist->dts_buffer[i] = ist->dts_buffer[i + 1];
        ist->nb_dts_buffer--;
    }

    if (best_effort_timestamp != AV_NOPTS_VALUE) {
        int64_t ts = av_rescale_q(decoded_frame->pts = best_effort_timestamp,
                                  ist->st->time_base, AV_TIME_BASE_Q);

        if (ts != AV_NOPTS_VALUE)
            ist->next_pts = ist->pts = ts;
    }

    if (debug_ts) {
        av_log(NULL, AV_LOG_INFO, "decoder -> ist_index:%d type:video "
               "frame_pts:%s frame_pts_time:%s best_effort_ts:%"PRId64" best_effort_ts_time:%s keyframe:%d frame_type:%d time_base:%d/%d\n",
               ist->st->index, av_ts2str(decoded_frame->pts),
               av_ts2timestr(decoded_frame->pts, &ist->st->time_base),
               best_effort_timestamp,
               av_ts2timestr(best_effort_timestamp, &ist->st->time_base),
               decoded_frame->key_frame, decoded_frame->pict_type,
               ist->st->time_base.num, ist->st->time_base.den);
    }

    if (ist->st->sample_aspect_ratio.num)
        decoded_frame->sample_aspect_ratio = ist->st->sample_aspect_ratio;

    resample_changed = ist->resample_width   != decoded_frame->width  ||
                       ist->resample_height  != decoded_frame->height ||
                       ist->resample_pix_fmt != decoded_frame->format;
    if (resample_changed) {
        av_log(NULL, AV_LOG_INFO,
               "Input stream #%d:%d frame changed from size:%dx%d fmt:%s to size:%dx%d fmt:%s\n",
               ist->file_index, ist->st->index,
               ist->resample_width, ist->resample_height, av_get_pix_fmt_name(ist->resample_pix_fmt),
               decoded_frame->width, decoded_frame->height, av_get_pix_fmt_name(decoded_frame->format));

        ist->resample_width   = decoded_frame->width;
        ist->resample_height  = decoded_frame->height;
        ist->resample_pix_fmt = decoded_frame->format;

        for (i = 0; i < nb_filtergraphs; i++) {
            if (ist_in_filtergraph(filtergraphs[i], ist) && ist->reinit_filters &&
                configure_filtergraph(filtergraphs[i]) < 0) {
                av_log(NULL, AV_LOG_FATAL, "Error reinitializing filters!\n");
                exit_program(1);
            }
        }
    }

    frame_sample_aspect = av_opt_ptr(avcodec_get_frame_class(), decoded_frame, "sample_aspect_ratio");
    for (i = 0; i < ist->nb_filters; i++) {
        if (!frame_sample_aspect->num)
            *frame_sample_aspect = ist->st->sample_aspect_ratio;

        if (i < ist->nb_filters - 1) {
            f = ist->filter_frame;
            err = av_frame_ref(f, decoded_frame);
            if (err < 0)
                break;
        } else
            f = decoded_frame;
        err = av_buffersrc_add_frame_flags(ist->filters[i]->filter, f,
                                           AV_BUFFERSRC_FLAG_PUSH);
        if (err == AVERROR_EOF) {
            err = 0; /* ignore */
        } else if (err < 0) {
            av_log(NULL, AV_LOG_FATAL,
                   "Failed to inject frame into filter network: %s\n", av_err2str(err));
            exit_program(1);
        }
    }

fail:
    av_frame_unref(ist->filter_frame);
    av_frame_unref(decoded_frame);
    return err < 0 ? err : ret;
}

static int transcode_subtitles(InputStream *ist, AVPacket *pkt, int *got_output)
{
    AVSubtitle subtitle;
    int i, ret = avcodec_decode_subtitle2(ist->dec_ctx,
                                          &subtitle, got_output, pkt);

    check_decode_result(NULL, got_output, ret);

    if (ret < 0 || !*got_output) {
        if (!pkt->size)
            sub2video_flush(ist);
        return ret;
    }

    if (ist->fix_sub_duration) {
        int end = 1;
        if (ist->prev_sub.got_output) {
            end = av_rescale(subtitle.pts - ist->prev_sub.subtitle.pts,
                             1000, AV_TIME_BASE);
            if (end < ist->prev_sub.subtitle.end_display_time) {
                av_log(ist->dec_ctx, AV_LOG_DEBUG,
                       "Subtitle duration reduced from %d to %d%s\n",
                       ist->prev_sub.subtitle.end_display_time, end,
                       end <= 0 ? ", dropping it" : "");
                ist->prev_sub.subtitle.end_display_time = end;
            }
        }
        FFSWAP(int,        *got_output, ist->prev_sub.got_output);
        FFSWAP(int,        ret,         ist->prev_sub.ret);
        FFSWAP(AVSubtitle, subtitle,    ist->prev_sub.subtitle);
        if (end <= 0)
            goto out;
    }

    if (!*got_output)
        return ret;

    sub2video_update(ist, &subtitle);

    if (!subtitle.num_rects)
        goto out;

    ist->frames_decoded++;

    for (i = 0; i < nb_output_streams; i++) {
        OutputStream *ost = output_streams[i];

        if (!check_output_constraints(ist, ost) || !ost->encoding_needed
            || ost->enc->type != AVMEDIA_TYPE_SUBTITLE)
            continue;

        do_subtitle_out(output_files[ost->file_index], ost, &subtitle);
    }

out:
    avsubtitle_free(&subtitle);
    return ret;
}

static int send_filter_eof(InputStream *ist)
{
    int i, ret;
    for (i = 0; i < ist->nb_filters; i++) {
        ret = av_buffersrc_add_frame(ist->filters[i]->filter, NULL);
        if (ret < 0)
            return ret;
    }
    return 0;
}

/* pkt = NULL means EOF (needed to flush decoder buffers) */
static int process_input_packet(InputStream *ist, const AVPacket *pkt, int no_eof)
{
    int ret = 0, i;
    int repeating = 0;
    int eof_reached = 0;

    AVPacket avpkt;
    if (!ist->saw_first_ts) {
        ist->dts = ist->st->avg_frame_rate.num ? - ist->dec_ctx->has_b_frames * AV_TIME_BASE / av_q2d(ist->st->avg_frame_rate) : 0;
        ist->pts = 0;
        if (pkt && pkt->pts != AV_NOPTS_VALUE && !ist->decoding_needed) {
            ist->dts += av_rescale_q(pkt->pts, ist->st->time_base, AV_TIME_BASE_Q);
            ist->pts = ist->dts; // unused but better to set it to a value that's not totally wrong
        }
        ist->saw_first_ts = 1;
    }

    if (ist->next_dts == AV_NOPTS_VALUE)
        ist->next_dts = ist->dts;
    if (ist->next_pts == AV_NOPTS_VALUE)
        ist->next_pts = ist->pts;

    if (!pkt) {
        /* EOF handling */
        av_init_packet(&avpkt);
        avpkt.data = NULL;
        avpkt.size = 0;
    } else {
        avpkt = *pkt;
    }

    if (pkt && pkt->dts != AV_NOPTS_VALUE) {
        ist->next_dts = ist->dts = av_rescale_q(pkt->dts, ist->st->time_base, AV_TIME_BASE_Q);
        if (ist->dec_ctx->codec_type != AVMEDIA_TYPE_VIDEO || !ist->decoding_needed)
            ist->next_pts = ist->pts = ist->dts;
    }

    // while we have more to decode or while the decoder did output something on EOF
    while (ist->decoding_needed) {
        int duration = 0;
        int got_output = 0;

        ist->pts = ist->next_pts;
        ist->dts = ist->next_dts;

        switch (ist->dec_ctx->codec_type) {
        case AVMEDIA_TYPE_AUDIO:
            ret = decode_audio(ist, repeating ? NULL : &avpkt, &got_output);
            break;
        case AVMEDIA_TYPE_VIDEO:
            ret = decode_video(ist, repeating ? NULL : &avpkt, &got_output, !pkt);
            if (!repeating || !pkt || got_output) {
                if (pkt && pkt->duration) {
                    duration = av_rescale_q(pkt->duration, ist->st->time_base, AV_TIME_BASE_Q);
                } else if (ist->dec_ctx->framerate.num != 0 && ist->dec_ctx->framerate.den != 0) {
                    int ticks = av_stream_get_parser(ist->st) ?
                                av_stream_get_parser(ist->st)->repeat_pict + 1 :
                                ist->dec_ctx->ticks_per_frame;
                    duration = ((int64_t)AV_TIME_BASE *
                                    ist->dec_ctx->framerate.den * ticks) /
                                    ist->dec_ctx->framerate.num / ist->dec_ctx->ticks_per_frame;
                }

                if (ist->dts != AV_NOPTS_VALUE && duration) {
                    ist->next_dts += duration;
                } else
                    ist->next_dts = AV_NOPTS_VALUE;
            }

            if (got_output)
                ist->next_pts += duration; //FIXME the duration is not correct in some cases
            break;
        case AVMEDIA_TYPE_SUBTITLE:
            if (repeating)
                break;
            ret = transcode_subtitles(ist, &avpkt, &got_output);
            if (!pkt && ret >= 0)
                ret = AVERROR_EOF;
            break;
        default:
            return -1;
        }

        if (ret == AVERROR_EOF) {
            eof_reached = 1;
            break;
        }

        if (ret < 0) {
            av_log(NULL, AV_LOG_ERROR, "Error while decoding stream #%d:%d: %s\n",
                   ist->file_index, ist->st->index, av_err2str(ret));
            if (exit_on_error)
                exit_program(1);
            // Decoding might not terminate if we're draining the decoder, and
            // the decoder keeps returning an error.
            // This should probably be considered a libavcodec issue.
            // Sample: fate-vsynth1-dnxhd-720p-hr-lb
            if (!pkt)
                eof_reached = 1;
            break;
        }

        if (!got_output)
            break;

        // During draining, we might get multiple output frames in this loop.
        // ffmpeg.c does not drain the filter chain on configuration changes,
        // which means if we send multiple frames at once to the filters, and
        // one of those frames changes configuration, the buffered frames will
        // be lost. This can upset certain FATE tests.
        // Decode only 1 frame per call on EOF to appease these FATE tests.
        // The ideal solution would be to rewrite decoding to use the new
        // decoding API in a better way.
        if (!pkt)
            break;

        repeating = 1;
    }

    /* after flushing, send an EOF on all the filter inputs attached to the stream */
    /* except when looping we need to flush but not to send an EOF */
    if (!pkt && ist->decoding_needed && eof_reached && !no_eof) {
        int ret = send_filter_eof(ist);
        if (ret < 0) {
            av_log(NULL, AV_LOG_FATAL, "Error marking filters as finished\n");
            exit_program(1);
        }
    }

    /* handle stream copy */
    if (!ist->decoding_needed) {
        ist->dts = ist->next_dts;
        switch (ist->dec_ctx->codec_type) {
        case AVMEDIA_TYPE_AUDIO:
            ist->next_dts += ((int64_t)AV_TIME_BASE * ist->dec_ctx->frame_size) /
                             ist->dec_ctx->sample_rate;
            break;
        case AVMEDIA_TYPE_VIDEO:
            if (ist->framerate.num) {
                // TODO: Remove work-around for c99-to-c89 issue 7
                AVRational time_base_q = AV_TIME_BASE_Q;
                int64_t next_dts = av_rescale_q(ist->next_dts, time_base_q, av_inv_q(ist->framerate));
                ist->next_dts = av_rescale_q(next_dts + 1, av_inv_q(ist->framerate), time_base_q);
            } else if (pkt->duration) {
                ist->next_dts += av_rescale_q(pkt->duration, ist->st->time_base, AV_TIME_BASE_Q);
            } else if (ist->dec_ctx->framerate.num != 0) {
                int ticks = av_stream_get_parser(ist->st) ?
                            av_stream_get_parser(ist->st)->repeat_pict + 1 :
                            ist->dec_ctx->ticks_per_frame;
                ist->next_dts += ((int64_t)AV_TIME_BASE *
                                  ist->dec_ctx->framerate.den * ticks) /
                                  ist->dec_ctx->framerate.num / ist->dec_ctx->ticks_per_frame;
            }
            break;
        }
        ist->pts = ist->dts;
        ist->next_pts = ist->next_dts;
    }
    for (i = 0; pkt && i < nb_output_streams; i++) {
        OutputStream *ost = output_streams[i];

        if (!check_output_constraints(ist, ost) || ost->encoding_needed)
            continue;

        do_streamcopy(ist, ost, pkt);
    }

    return !eof_reached;
}

static void print_sdp(void)
{
    char sdp[16384];
    int i;
    int j;
    AVIOContext *sdp_pb;
    AVFormatContext **avc;

    for (i = 0; i < nb_output_files; i++) {
        if (!output_files[i]->header_written)
            return;
    }

    avc = av_malloc_array(nb_output_files, sizeof(*avc));
    if (!avc)
        exit_program(1);
    for (i = 0, j = 0; i < nb_output_files; i++) {
        if (!strcmp(output_files[i]->ctx->oformat->name, "rtp")) {
            avc[j] = output_files[i]->ctx;
            j++;
        }
    }

    if (!j)
        goto fail;

    av_sdp_create(avc, j, sdp, sizeof(sdp));

    if (!sdp_filename) {
        printf("SDP:\n%s\n", sdp);
        fflush(stdout);
    } else {
        if (avio_open2(&sdp_pb, sdp_filename, AVIO_FLAG_WRITE, &int_cb, NULL) < 0) {
            av_log(NULL, AV_LOG_ERROR, "Failed to open sdp file '%s'\n", sdp_filename);
        } else {
            avio_printf(sdp_pb, "SDP:\n%s", sdp);
            avio_closep(&sdp_pb);
            av_freep(&sdp_filename);
        }
    }

fail:
    av_freep(&avc);
}

static const HWAccel *get_hwaccel(enum AVPixelFormat pix_fmt)
{
    int i;
    for (i = 0; hwaccels[i].name; i++)
        if (hwaccels[i].pix_fmt == pix_fmt)
            return &hwaccels[i];
    return NULL;
}

static enum AVPixelFormat get_format(AVCodecContext *s, const enum AVPixelFormat *pix_fmts)
{
    InputStream *ist = s->opaque;
    const enum AVPixelFormat *p;
    int ret;

    for (p = pix_fmts; *p != -1; p++) {
        const AVPixFmtDescriptor *desc = av_pix_fmt_desc_get(*p);
        const HWAccel *hwaccel;

        if (!(desc->flags & AV_PIX_FMT_FLAG_HWACCEL))
            break;

        hwaccel = get_hwaccel(*p);
        if (!hwaccel ||
            (ist->active_hwaccel_id && ist->active_hwaccel_id != hwaccel->id) ||
            (ist->hwaccel_id != HWACCEL_AUTO && ist->hwaccel_id != hwaccel->id))
            continue;

        ret = hwaccel->init(s);
        if (ret < 0) {
            if (ist->hwaccel_id == hwaccel->id) {
                av_log(NULL, AV_LOG_FATAL,
                       "%s hwaccel requested for input stream #%d:%d, "
                       "but cannot be initialized.\n", hwaccel->name,
                       ist->file_index, ist->st->index);
                return AV_PIX_FMT_NONE;
            }
            continue;
        }

        if (ist->hw_frames_ctx) {
            s->hw_frames_ctx = av_buffer_ref(ist->hw_frames_ctx);
            if (!s->hw_frames_ctx)
                return AV_PIX_FMT_NONE;
        }

        ist->active_hwaccel_id = hwaccel->id;
        ist->hwaccel_pix_fmt   = *p;
        break;
    }

    return *p;
}

static int get_buffer(AVCodecContext *s, AVFrame *frame, int flags)
{
    InputStream *ist = s->opaque;

    if (ist->hwaccel_get_buffer && frame->format == ist->hwaccel_pix_fmt)
        return ist->hwaccel_get_buffer(s, frame, flags);

    return avcodec_default_get_buffer2(s, frame, flags);
}

static int init_input_stream(int ist_index, char *error, int error_len)
{
    int ret;
    InputStream *ist = input_streams[ist_index];

    if (ist->decoding_needed) {
        AVCodec *codec = ist->dec;
        if (!codec) {
            snprintf(error, error_len, "Decoder (codec %s) not found for input stream #%d:%d",
                     avcodec_get_name(ist->dec_ctx->codec_id), ist->file_index, ist->st->index);
            return AVERROR(EINVAL);
        }

        ist->dec_ctx->opaque                = ist;
        ist->dec_ctx->get_format            = get_format;
        ist->dec_ctx->get_buffer2           = get_buffer;
        ist->dec_ctx->thread_safe_callbacks = 1;

        av_opt_set_int(ist->dec_ctx, "refcounted_frames", 1, 0);
        if (ist->dec_ctx->codec_id == AV_CODEC_ID_DVB_SUBTITLE &&
            (ist->decoding_needed & DECODING_FOR_OST)) {
            av_dict_set(&ist->decoder_opts, "compute_edt", "1", AV_DICT_DONT_OVERWRITE);
            if (ist->decoding_needed & DECODING_FOR_FILTER)
                av_log(NULL, AV_LOG_WARNING,
                       "Warning: using DVB subtitles for filtering and output at the "
                       "same time is not fully supported, also see -compute_edt [0|1]\n");
        }

        av_dict_set(&ist->decoder_opts, "sub_text_format", "ass", AV_DICT_DONT_OVERWRITE);

        /* Useful for subtitles retiming by lavf (FIXME), skipping samples in
         * audio, and video decoders such as cuvid or mediacodec */
av_codec_set_pkt_timebase(ist->dec_ctx, ist->st->time_base); if (!av_dict_get(ist->decoder_opts, "threads", NULL, 0)) av_dict_set(&ist->decoder_opts, "threads", "auto", 0); if ((ret = avcodec_open2(ist->dec_ctx, codec, &ist->decoder_opts)) < 0) { if (ret == AVERROR_EXPERIMENTAL) abort_codec_experimental(codec, 0); snprintf(error, error_len, "Error while opening decoder for input stream " "#%d:%d : %s", ist->file_index, ist->st->index, av_err2str(ret)); return ret; } assert_avoptions(ist->decoder_opts); } ist->next_pts = AV_NOPTS_VALUE; ist->next_dts = AV_NOPTS_VALUE; return 0; } static InputStream *get_input_stream(OutputStream *ost) { if (ost->source_index >= 0) return input_streams[ost->source_index]; return NULL; } static int compare_int64(const void *a, const void *b) { return FFDIFFSIGN(*(const int64_t *)a, *(const int64_t *)b); } /* open the muxer when all the streams are initialized */ static int check_init_output_file(OutputFile *of, int file_index) { int ret, i; for (i = 0; i < of->ctx->nb_streams; i++) { OutputStream *ost = output_streams[of->ost_index + i]; if (!ost->initialized) return 0; } of->ctx->interrupt_callback = int_cb; ret = avformat_write_header(of->ctx, &of->opts); if (ret < 0) { av_log(NULL, AV_LOG_ERROR, "Could not write header for output file #%d " "(incorrect codec parameters ?): %s", file_index, av_err2str(ret)); return ret; } //assert_avoptions(of->opts); of->header_written = 1; av_dump_format(of->ctx, file_index, of->ctx->filename, 1); if (sdp_filename || want_sdp) print_sdp(); /* flush the muxing queues */ for (i = 0; i < of->ctx->nb_streams; i++) { OutputStream *ost = output_streams[of->ost_index + i]; while (av_fifo_size(ost->muxing_queue)) { AVPacket pkt; av_fifo_generic_read(ost->muxing_queue, &pkt, sizeof(pkt), NULL); write_packet(of, &pkt, ost); } } return 0; } static int init_output_bsfs(OutputStream *ost) { AVBSFContext *ctx; int i, ret; if (!ost->nb_bitstream_filters) return 0; for (i = 0; i < ost->nb_bitstream_filters; i++) 
{ ctx = ost->bsf_ctx[i]; ret = avcodec_parameters_copy(ctx->par_in, i ? ost->bsf_ctx[i - 1]->par_out : ost->st->codecpar); if (ret < 0) return ret; ctx->time_base_in = i ? ost->bsf_ctx[i - 1]->time_base_out : ost->st->time_base; ret = av_bsf_init(ctx); if (ret < 0) { av_log(NULL, AV_LOG_ERROR, "Error initializing bitstream filter: %s\n", ost->bsf_ctx[i]->filter->name); return ret; } } ctx = ost->bsf_ctx[ost->nb_bitstream_filters - 1]; ret = avcodec_parameters_copy(ost->st->codecpar, ctx->par_out); if (ret < 0) return ret; ost->st->time_base = ctx->time_base_out; return 0; } static int init_output_stream_streamcopy(OutputStream *ost) { OutputFile *of = output_files[ost->file_index]; InputStream *ist = get_input_stream(ost); AVCodecParameters *par_dst = ost->st->codecpar; AVCodecParameters *par_src = ost->ref_par; AVRational sar; int i, ret; uint64_t extra_size; av_assert0(ist && !ost->filter); avcodec_parameters_to_context(ost->enc_ctx, ist->st->codecpar); ret = av_opt_set_dict(ost->enc_ctx, &ost->encoder_opts); if (ret < 0) { av_log(NULL, AV_LOG_FATAL, "Error setting up codec context options.\n"); return ret; } avcodec_parameters_from_context(par_src, ost->enc_ctx); extra_size = (uint64_t)par_src->extradata_size + AV_INPUT_BUFFER_PADDING_SIZE; if (extra_size > INT_MAX) { return AVERROR(EINVAL); } /* if stream_copy is selected, no need to decode or encode */ par_dst->codec_id = par_src->codec_id; par_dst->codec_type = par_src->codec_type; if (!par_dst->codec_tag) { unsigned int codec_tag; if (!of->ctx->oformat->codec_tag || av_codec_get_id (of->ctx->oformat->codec_tag, par_src->codec_tag) == par_dst->codec_id || !av_codec_get_tag2(of->ctx->oformat->codec_tag, par_src->codec_id, &codec_tag)) par_dst->codec_tag = par_src->codec_tag; } par_dst->bit_rate = par_src->bit_rate; par_dst->field_order = par_src->field_order; par_dst->chroma_location = par_src->chroma_location; if (par_src->extradata_size) { par_dst->extradata = av_mallocz(extra_size); if (!par_dst->extradata) 
        {
            return AVERROR(ENOMEM);
        }
        memcpy(par_dst->extradata, par_src->extradata, par_src->extradata_size);
        par_dst->extradata_size = par_src->extradata_size;
    }
    par_dst->bits_per_coded_sample = par_src->bits_per_coded_sample;
    par_dst->bits_per_raw_sample   = par_src->bits_per_raw_sample;

    if (!ost->frame_rate.num)
        ost->frame_rate = ist->framerate;
    ost->st->avg_frame_rate = ost->frame_rate;

    ret = avformat_transfer_internal_stream_timing_info(of->ctx->oformat, ost->st, ist->st, copy_tb);
    if (ret < 0)
        return ret;

    // copy timebase while removing common factors
    ost->st->time_base = av_add_q(av_stream_get_codec_timebase(ost->st), (AVRational){0, 1});

    if (ist->st->nb_side_data) {
        ost->st->side_data = av_realloc_array(NULL, ist->st->nb_side_data,
                                              sizeof(*ist->st->side_data));
        if (!ost->st->side_data)
            return AVERROR(ENOMEM);

        ost->st->nb_side_data = 0;
        for (i = 0; i < ist->st->nb_side_data; i++) {
            const AVPacketSideData *sd_src = &ist->st->side_data[i];
            AVPacketSideData *sd_dst = &ost->st->side_data[ost->st->nb_side_data];

            if (ost->rotate_overridden && sd_src->type == AV_PKT_DATA_DISPLAYMATRIX)
                continue;

            sd_dst->data = av_malloc(sd_src->size);
            if (!sd_dst->data)
                return AVERROR(ENOMEM);
            memcpy(sd_dst->data, sd_src->data, sd_src->size);
            sd_dst->size = sd_src->size;
            sd_dst->type = sd_src->type;
            ost->st->nb_side_data++;
        }
    }

    ost->parser = av_parser_init(par_dst->codec_id);
    ost->parser_avctx = avcodec_alloc_context3(NULL);
    if (!ost->parser_avctx)
        return AVERROR(ENOMEM);

    switch (par_dst->codec_type) {
    case AVMEDIA_TYPE_AUDIO:
        if (audio_volume != 256) {
            av_log(NULL, AV_LOG_FATAL, "-acodec copy and -vol are incompatible (frames are not decoded)\n");
            exit_program(1);
        }
        par_dst->channel_layout   = par_src->channel_layout;
        par_dst->sample_rate      = par_src->sample_rate;
        par_dst->channels         = par_src->channels;
        par_dst->frame_size       = par_src->frame_size;
        par_dst->block_align      = par_src->block_align;
        par_dst->initial_padding  = par_src->initial_padding;
        par_dst->trailing_padding = par_src->trailing_padding;
        par_dst->profile          = par_src->profile;
        if ((par_dst->block_align == 1 || par_dst->block_align == 1152 || par_dst->block_align == 576) &&
            par_dst->codec_id == AV_CODEC_ID_MP3)
            par_dst->block_align = 0;
        if (par_dst->codec_id == AV_CODEC_ID_AC3)
            par_dst->block_align = 0;
        break;
    case AVMEDIA_TYPE_VIDEO:
        par_dst->format          = par_src->format;
        par_dst->color_space     = par_src->color_space;
        par_dst->color_range     = par_src->color_range;
        par_dst->color_primaries = par_src->color_primaries;
        par_dst->color_trc       = par_src->color_trc;
        par_dst->width           = par_src->width;
        par_dst->height          = par_src->height;
        par_dst->video_delay     = par_src->video_delay;
        par_dst->profile         = par_src->profile;
        if (ost->frame_aspect_ratio.num) { // overridden by the -aspect cli option
            sar = av_mul_q(ost->frame_aspect_ratio,
                           (AVRational){ par_dst->height, par_dst->width });
            av_log(NULL, AV_LOG_WARNING, "Overriding aspect ratio "
                   "with stream copy may produce invalid files\n");
        } else if (ist->st->sample_aspect_ratio.num)
            sar = ist->st->sample_aspect_ratio;
        else
            sar = par_src->sample_aspect_ratio;
        ost->st->sample_aspect_ratio = par_dst->sample_aspect_ratio = sar;
        ost->st->avg_frame_rate = ist->st->avg_frame_rate;
        ost->st->r_frame_rate   = ist->st->r_frame_rate;
        break;
    case AVMEDIA_TYPE_SUBTITLE:
        par_dst->width  = par_src->width;
        par_dst->height = par_src->height;
        break;
    case AVMEDIA_TYPE_UNKNOWN:
    case AVMEDIA_TYPE_DATA:
    case AVMEDIA_TYPE_ATTACHMENT:
        break;
    default:
        abort();
    }

    return 0;
}

static int init_output_stream(OutputStream *ost, char *error, int error_len)
{
    int ret = 0;

    if (ost->encoding_needed) {
        AVCodec        *codec = ost->enc;
        AVCodecContext *dec   = NULL;
        InputStream *ist;

        if ((ist = get_input_stream(ost)))
            dec = ist->dec_ctx;
        if (dec && dec->subtitle_header) {
            /* ASS code assumes this buffer is null terminated so add extra byte.
             */
            ost->enc_ctx->subtitle_header = av_mallocz(dec->subtitle_header_size + 1);
            if (!ost->enc_ctx->subtitle_header)
                return AVERROR(ENOMEM);
            memcpy(ost->enc_ctx->subtitle_header, dec->subtitle_header, dec->subtitle_header_size);
            ost->enc_ctx->subtitle_header_size = dec->subtitle_header_size;
        }
        if (!av_dict_get(ost->encoder_opts, "threads", NULL, 0))
            av_dict_set(&ost->encoder_opts, "threads", "auto", 0);
        if (ost->enc->type == AVMEDIA_TYPE_AUDIO &&
            !codec->defaults &&
            !av_dict_get(ost->encoder_opts, "b", NULL, 0) &&
            !av_dict_get(ost->encoder_opts, "ab", NULL, 0))
            av_dict_set(&ost->encoder_opts, "b", "128000", 0);

        if (ost->filter && ost->filter->filter->inputs[0]->hw_frames_ctx) {
            ost->enc_ctx->hw_frames_ctx = av_buffer_ref(ost->filter->filter->inputs[0]->hw_frames_ctx);
            if (!ost->enc_ctx->hw_frames_ctx)
                return AVERROR(ENOMEM);
        }

        if ((ret = avcodec_open2(ost->enc_ctx, codec, &ost->encoder_opts)) < 0) {
            if (ret == AVERROR_EXPERIMENTAL)
                abort_codec_experimental(codec, 1);
            snprintf(error, error_len,
                     "Error while opening encoder for output stream #%d:%d - "
                     "maybe incorrect parameters such as bit_rate, rate, width or height",
                     ost->file_index, ost->index);
            return ret;
        }
        if (ost->enc->type == AVMEDIA_TYPE_AUDIO &&
            !(ost->enc->capabilities & AV_CODEC_CAP_VARIABLE_FRAME_SIZE))
            av_buffersink_set_frame_size(ost->filter->filter,
                                         ost->enc_ctx->frame_size);
        assert_avoptions(ost->encoder_opts);
        if (ost->enc_ctx->bit_rate && ost->enc_ctx->bit_rate < 1000)
            av_log(NULL, AV_LOG_WARNING, "The bitrate parameter is set too low."
                                         " It takes bits/s as argument, not kbits/s\n");

        ret = avcodec_parameters_from_context(ost->st->codecpar, ost->enc_ctx);
        if (ret < 0) {
            av_log(NULL, AV_LOG_FATAL,
                   "Error initializing the output stream codec context.\n");
            exit_program(1);
        }
        /*
         * FIXME: ost->st->codec shouldn't be needed here anymore.
         */
        ret = avcodec_copy_context(ost->st->codec, ost->enc_ctx);
        if (ret < 0)
            return ret;

        if (ost->enc_ctx->nb_coded_side_data) {
            int i;

            ost->st->side_data = av_realloc_array(NULL, ost->enc_ctx->nb_coded_side_data,
                                                  sizeof(*ost->st->side_data));
            if (!ost->st->side_data)
                return AVERROR(ENOMEM);

            for (i = 0; i < ost->enc_ctx->nb_coded_side_data; i++) {
                const AVPacketSideData *sd_src = &ost->enc_ctx->coded_side_data[i];
                AVPacketSideData *sd_dst = &ost->st->side_data[i];

                sd_dst->data = av_malloc(sd_src->size);
                if (!sd_dst->data)
                    return AVERROR(ENOMEM);
                memcpy(sd_dst->data, sd_src->data, sd_src->size);
                sd_dst->size = sd_src->size;
                sd_dst->type = sd_src->type;
                ost->st->nb_side_data++;
            }
        }

        // copy timebase while removing common factors
        ost->st->time_base = av_add_q(ost->enc_ctx->time_base, (AVRational){0, 1});
        ost->st->codec->codec = ost->enc_ctx->codec;
    } else if (ost->stream_copy) {
        ret = init_output_stream_streamcopy(ost);
        if (ret < 0)
            return ret;

        /*
         * FIXME: will the codec context used by the parser during streamcopy
         * This should go away with the new parser API.
         */
        ret = avcodec_parameters_to_context(ost->parser_avctx, ost->st->codecpar);
        if (ret < 0)
            return ret;
    }

    /* initialize bitstream filters for the output stream
     * needs to be done here, because the codec id for streamcopy is not
     * known until now */
    ret = init_output_bsfs(ost);
    if (ret < 0)
        return ret;

    ost->initialized = 1;

    ret = check_init_output_file(output_files[ost->file_index], ost->file_index);
    if (ret < 0)
        return ret;

    return ret;
}

static void parse_forced_key_frames(char *kf, OutputStream *ost,
                                    AVCodecContext *avctx)
{
    char *p;
    int n = 1, i, size, index = 0;
    int64_t t, *pts;

    for (p = kf; *p; p++)
        if (*p == ',')
            n++;
    size = n;
    pts = av_malloc_array(size, sizeof(*pts));
    if (!pts) {
        av_log(NULL, AV_LOG_FATAL, "Could not allocate forced key frames array.\n");
        exit_program(1);
    }

    p = kf;
    for (i = 0; i < n; i++) {
        char *next = strchr(p, ',');

        if (next)
            *next++ = 0;

        if (!memcmp(p, "chapters", 8)) {
            AVFormatContext *avf = output_files[ost->file_index]->ctx;
            int j;

            if (avf->nb_chapters > INT_MAX - size ||
                !(pts = av_realloc_f(pts, size += avf->nb_chapters - 1,
                                     sizeof(*pts)))) {
                av_log(NULL, AV_LOG_FATAL,
                       "Could not allocate forced key frames array.\n");
                exit_program(1);
            }
            t = p[8] ?
                parse_time_or_die("force_key_frames", p + 8, 1) : 0;
            t = av_rescale_q(t, AV_TIME_BASE_Q, avctx->time_base);

            for (j = 0; j < avf->nb_chapters; j++) {
                AVChapter *c = avf->chapters[j];
                av_assert1(index < size);
                pts[index++] = av_rescale_q(c->start, c->time_base,
                                            avctx->time_base) + t;
            }
        } else {
            t = parse_time_or_die("force_key_frames", p, 1);
            av_assert1(index < size);
            pts[index++] = av_rescale_q(t, AV_TIME_BASE_Q, avctx->time_base);
        }

        p = next;
    }

    av_assert0(index == size);
    qsort(pts, size, sizeof(*pts), compare_int64);
    ost->forced_kf_count = size;
    ost->forced_kf_pts   = pts;
}

static void report_new_stream(int input_index, AVPacket *pkt)
{
    InputFile *file = input_files[input_index];
    AVStream *st = file->ctx->streams[pkt->stream_index];

    if (pkt->stream_index < file->nb_streams_warn)
        return;
    av_log(file->ctx, AV_LOG_WARNING,
           "New %s stream %d:%d at pos:%"PRId64" and DTS:%ss\n",
           av_get_media_type_string(st->codecpar->codec_type),
           input_index, pkt->stream_index,
           pkt->pos, av_ts2timestr(pkt->dts, &st->time_base));
    file->nb_streams_warn = pkt->stream_index + 1;
}

static void set_encoder_id(OutputFile *of, OutputStream *ost)
{
    AVDictionaryEntry *e;

    uint8_t *encoder_string;
    int encoder_string_len;
    int format_flags = 0;
    int codec_flags = 0;

    if (av_dict_get(ost->st->metadata, "encoder", NULL, 0))
        return;

    e = av_dict_get(of->opts, "fflags", NULL, 0);
    if (e) {
        const AVOption *o = av_opt_find(of->ctx, "fflags", NULL, 0, 0);
        if (!o)
            return;
        av_opt_eval_flags(of->ctx, o, e->value, &format_flags);
    }
    e = av_dict_get(ost->encoder_opts, "flags", NULL, 0);
    if (e) {
        const AVOption *o = av_opt_find(ost->enc_ctx, "flags", NULL, 0, 0);
        if (!o)
            return;
        av_opt_eval_flags(ost->enc_ctx, o, e->value, &codec_flags);
    }

    encoder_string_len = sizeof(LIBAVCODEC_IDENT) + strlen(ost->enc->name) + 2;
    encoder_string     = av_mallocz(encoder_string_len);
    if (!encoder_string)
        exit_program(1);

    if (!(format_flags & AVFMT_FLAG_BITEXACT) && !(codec_flags & AV_CODEC_FLAG_BITEXACT))
        av_strlcpy(encoder_string,
                   LIBAVCODEC_IDENT " ", encoder_string_len);
    else
        av_strlcpy(encoder_string, "Lavc ", encoder_string_len);
    av_strlcat(encoder_string, ost->enc->name, encoder_string_len);
    av_dict_set(&ost->st->metadata, "encoder", encoder_string,
                AV_DICT_DONT_STRDUP_VAL | AV_DICT_DONT_OVERWRITE);
}

static int transcode_init(void)
{
    int ret = 0, i, j, k;
    AVFormatContext *oc;
    OutputStream *ost;
    InputStream *ist;
    char error[1024] = {0};

    for (i = 0; i < nb_filtergraphs; i++) {
        FilterGraph *fg = filtergraphs[i];
        for (j = 0; j < fg->nb_outputs; j++) {
            OutputFilter *ofilter = fg->outputs[j];
            if (!ofilter->ost || ofilter->ost->source_index >= 0)
                continue;
            if (fg->nb_inputs != 1)
                continue;
            for (k = nb_input_streams-1; k >= 0 ; k--)
                if (fg->inputs[0]->ist == input_streams[k])
                    break;
            ofilter->ost->source_index = k;
        }
    }

    /* init framerate emulation */
    for (i = 0; i < nb_input_files; i++) {
        InputFile *ifile = input_files[i];
        if (ifile->rate_emu)
            for (j = 0; j < ifile->nb_streams; j++)
                input_streams[j + ifile->ist_index]->start = av_gettime_relative();
    }

    /* for each output stream, we compute the right encoding parameters */
    for (i = 0; i < nb_output_streams; i++) {
        ost = output_streams[i];
        oc  = output_files[ost->file_index]->ctx;
        ist = get_input_stream(ost);

        if (ost->attachment_filename)
            continue;

        if (ist) {
            ost->st->disposition = ist->st->disposition;
        } else {
            for (j = 0; j < oc->nb_streams; j++) {
                AVStream *st = oc->streams[j];
                if (st != ost->st && st->codecpar->codec_type == ost->st->codecpar->codec_type)
                    break;
            }
            if (j == oc->nb_streams)
                if (ost->st->codecpar->codec_type == AVMEDIA_TYPE_AUDIO ||
                    ost->st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO)
                    ost->st->disposition = AV_DISPOSITION_DEFAULT;
        }

        if (!ost->stream_copy) {
            AVCodecContext *enc_ctx = ost->enc_ctx;
            AVCodecContext *dec_ctx = NULL;

            set_encoder_id(output_files[ost->file_index], ost);

            if (ist) {
                dec_ctx = ist->dec_ctx;

                enc_ctx->chroma_sample_location = dec_ctx->chroma_sample_location;
            }

#if CONFIG_LIBMFX
            if (qsv_transcode_init(ost))
                exit_program(1);
#endif

#if CONFIG_CUVID
            if (cuvid_transcode_init(ost))
                exit_program(1);
#endif

            if ((enc_ctx->codec_type == AVMEDIA_TYPE_VIDEO ||
                 enc_ctx->codec_type == AVMEDIA_TYPE_AUDIO) &&
                filtergraph_is_simple(ost->filter->graph)) {
                FilterGraph *fg = ost->filter->graph;
                if (configure_filtergraph(fg)) {
                    av_log(NULL, AV_LOG_FATAL, "Error opening filters!\n");
                    exit_program(1);
                }
            }

            if (enc_ctx->codec_type == AVMEDIA_TYPE_VIDEO) {
                if (!ost->frame_rate.num)
                    ost->frame_rate = av_buffersink_get_frame_rate(ost->filter->filter);
                if (ist && !ost->frame_rate.num)
                    ost->frame_rate = ist->framerate;
                if (ist && !ost->frame_rate.num)
                    ost->frame_rate = ist->st->r_frame_rate;
                if (ist && !ost->frame_rate.num) {
                    ost->frame_rate = (AVRational){25, 1};
                    av_log(NULL, AV_LOG_WARNING,
                           "No information "
                           "about the input framerate is available. Falling "
                           "back to a default value of 25fps for output stream #%d:%d. Use the -r option "
                           "if you want a different framerate.\n",
                           ost->file_index, ost->index);
                }
//                ost->frame_rate = ist->st->avg_frame_rate.num ?
//                    ist->st->avg_frame_rate : (AVRational){25, 1};
                if (ost->enc && ost->enc->supported_framerates && !ost->force_fps) {
                    int idx = av_find_nearest_q_idx(ost->frame_rate, ost->enc->supported_framerates);
                    ost->frame_rate = ost->enc->supported_framerates[idx];
                }
                // reduce frame rate for mpeg4 to be within the spec limits
                if (enc_ctx->codec_id == AV_CODEC_ID_MPEG4) {
                    av_reduce(&ost->frame_rate.num, &ost->frame_rate.den,
                              ost->frame_rate.num, ost->frame_rate.den, 65535);
                }
            }

            switch (enc_ctx->codec_type) {
            case AVMEDIA_TYPE_AUDIO:
                enc_ctx->sample_fmt = ost->filter->filter->inputs[0]->format;
                if (dec_ctx)
                    enc_ctx->bits_per_raw_sample = FFMIN(dec_ctx->bits_per_raw_sample,
                                                         av_get_bytes_per_sample(enc_ctx->sample_fmt) << 3);
                enc_ctx->sample_rate    = ost->filter->filter->inputs[0]->sample_rate;
                enc_ctx->channel_layout = ost->filter->filter->inputs[0]->channel_layout;
                enc_ctx->channels       = avfilter_link_get_channels(ost->filter->filter->inputs[0]);
                enc_ctx->time_base      = (AVRational){ 1, enc_ctx->sample_rate };
                break;
            case AVMEDIA_TYPE_VIDEO:
                enc_ctx->time_base = av_inv_q(ost->frame_rate);
                if (!(enc_ctx->time_base.num && enc_ctx->time_base.den))
                    enc_ctx->time_base = ost->filter->filter->inputs[0]->time_base;
                if (   av_q2d(enc_ctx->time_base) < 0.001 && video_sync_method != VSYNC_PASSTHROUGH
                    && (video_sync_method == VSYNC_CFR || video_sync_method == VSYNC_VSCFR ||
                        (video_sync_method == VSYNC_AUTO && !(oc->oformat->flags & AVFMT_VARIABLE_FPS)))){
                    av_log(oc, AV_LOG_WARNING, "Frame rate very high for a muxer not efficiently supporting it.\n"
                                               "Please consider specifying a lower framerate, a different muxer or -vsync 2\n");
                }
                for (j = 0; j < ost->forced_kf_count; j++)
                    ost->forced_kf_pts[j] = av_rescale_q(ost->forced_kf_pts[j],
                                                         AV_TIME_BASE_Q,
                                                         enc_ctx->time_base);

                enc_ctx->width  = ost->filter->filter->inputs[0]->w;
                enc_ctx->height = ost->filter->filter->inputs[0]->h;
                enc_ctx->sample_aspect_ratio = ost->st->sample_aspect_ratio =
                    ost->frame_aspect_ratio.num ?
                    // overridden by the -aspect cli option
                    av_mul_q(ost->frame_aspect_ratio,
                             (AVRational){ enc_ctx->height, enc_ctx->width }) :
                    ost->filter->filter->inputs[0]->sample_aspect_ratio;

                if (!strncmp(ost->enc->name, "libx264", 7) &&
                    enc_ctx->pix_fmt == AV_PIX_FMT_NONE &&
                    ost->filter->filter->inputs[0]->format != AV_PIX_FMT_YUV420P)
                    av_log(NULL, AV_LOG_WARNING,
                           "No pixel format specified, %s for H.264 encoding chosen.\n"
                           "Use -pix_fmt yuv420p for compatibility with outdated media players.\n",
                           av_get_pix_fmt_name(ost->filter->filter->inputs[0]->format));
                if (!strncmp(ost->enc->name, "mpeg2video", 10) &&
                    enc_ctx->pix_fmt == AV_PIX_FMT_NONE &&
                    ost->filter->filter->inputs[0]->format != AV_PIX_FMT_YUV420P)
                    av_log(NULL, AV_LOG_WARNING,
                           "No pixel format specified, %s for MPEG-2 encoding chosen.\n"
                           "Use -pix_fmt yuv420p for compatibility with outdated media players.\n",
                           av_get_pix_fmt_name(ost->filter->filter->inputs[0]->format));
                enc_ctx->pix_fmt = ost->filter->filter->inputs[0]->format;
                if (dec_ctx)
                    enc_ctx->bits_per_raw_sample = FFMIN(dec_ctx->bits_per_raw_sample,
                                                         av_pix_fmt_desc_get(enc_ctx->pix_fmt)->comp[0].depth);

                ost->st->avg_frame_rate = ost->frame_rate;

                if (!dec_ctx ||
                    enc_ctx->width   != dec_ctx->width  ||
                    enc_ctx->height  != dec_ctx->height ||
                    enc_ctx->pix_fmt != dec_ctx->pix_fmt) {
                    enc_ctx->bits_per_raw_sample = frame_bits_per_raw_sample;
                }

                if (ost->forced_keyframes) {
                    if (!strncmp(ost->forced_keyframes, "expr:", 5)) {
                        ret = av_expr_parse(&ost->forced_keyframes_pexpr, ost->forced_keyframes+5,
                                            forced_keyframes_const_names, NULL, NULL, NULL, NULL, 0, NULL);
                        if (ret < 0) {
                            av_log(NULL, AV_LOG_ERROR,
                                   "Invalid force_key_frames expression '%s'\n", ost->forced_keyframes+5);
                            return ret;
                        }
                        ost->forced_keyframes_expr_const_values[FKF_N] = 0;
                        ost->forced_keyframes_expr_const_values[FKF_N_FORCED] = 0;
                        ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_N] = NAN;
                        ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_T] = NAN;

                        // Don't parse the 'forced_keyframes' in case of
                        // 'keep-source-keyframes', parse it only for static kf timings
                    } else if (strncmp(ost->forced_keyframes, "source", 6)) {
                        parse_forced_key_frames(ost->forced_keyframes, ost, ost->enc_ctx);
                    }
                }
                break;
            case AVMEDIA_TYPE_SUBTITLE:
                enc_ctx->time_base = (AVRational){1, 1000};
                if (!enc_ctx->width) {
                    enc_ctx->width  = input_streams[ost->source_index]->st->codecpar->width;
                    enc_ctx->height = input_streams[ost->source_index]->st->codecpar->height;
                }
                break;
            case AVMEDIA_TYPE_DATA:
                break;
            default:
                abort();
                break;
            }
        }

        if (ost->disposition) {
            static const AVOption opts[] = {
                { "disposition"      , NULL, 0, AV_OPT_TYPE_FLAGS, { .i64 = 0 }, INT64_MIN, INT64_MAX, .unit = "flags" },
                { "default"          , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_DEFAULT          }, .unit = "flags" },
                { "dub"              , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_DUB              }, .unit = "flags" },
                { "original"         , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_ORIGINAL         }, .unit = "flags" },
                { "comment"          , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_COMMENT          }, .unit = "flags" },
                { "lyrics"           , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_LYRICS           }, .unit = "flags" },
                { "karaoke"          , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_KARAOKE          }, .unit = "flags" },
                { "forced"           , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_FORCED           }, .unit = "flags" },
                { "hearing_impaired" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_HEARING_IMPAIRED }, .unit = "flags" },
                { "visual_impaired"  , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_VISUAL_IMPAIRED  }, .unit = "flags" },
                { "clean_effects"    , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_CLEAN_EFFECTS    }, .unit = "flags" },
                { "captions"         , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_CAPTIONS         }, .unit = "flags" },
                { "descriptions"     , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_DESCRIPTIONS     }, .unit = "flags" },
                { "metadata"         , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_METADATA         }, .unit = "flags" },
                { NULL },
            };
            static const AVClass
            class = {
                .class_name = "",
                .item_name  = av_default_item_name,
                .option     = opts,
                .version    = LIBAVUTIL_VERSION_INT,
            };
            const AVClass *pclass = &class;

            ret = av_opt_eval_flags(&pclass, &opts[0], ost->disposition, &ost->st->disposition);
            if (ret < 0)
                goto dump_format;
        }
    }

    /* init input streams */
    for (i = 0; i < nb_input_streams; i++)
        if ((ret = init_input_stream(i, error, sizeof(error))) < 0) {
            for (i = 0; i < nb_output_streams; i++) {
                ost = output_streams[i];
                avcodec_close(ost->enc_ctx);
            }
            goto dump_format;
        }

    /* open each encoder */
    for (i = 0; i < nb_output_streams; i++) {
        ret = init_output_stream(output_streams[i], error, sizeof(error));
        if (ret < 0)
            goto dump_format;
    }

    /* discard unused programs */
    for (i = 0; i < nb_input_files; i++) {
        InputFile *ifile = input_files[i];
        for (j = 0; j < ifile->ctx->nb_programs; j++) {
            AVProgram *p = ifile->ctx->programs[j];
            int discard  = AVDISCARD_ALL;

            for (k = 0; k < p->nb_stream_indexes; k++)
                if (!input_streams[ifile->ist_index + p->stream_index[k]]->discard) {
                    discard = AVDISCARD_DEFAULT;
                    break;
                }
            p->discard = discard;
        }
    }

    /* write headers for files with no streams */
    for (i = 0; i < nb_output_files; i++) {
        oc = output_files[i]->ctx;
        if (oc->oformat->flags & AVFMT_NOSTREAMS && oc->nb_streams == 0) {
            ret = check_init_output_file(output_files[i], i);
            if (ret < 0)
                goto dump_format;
        }
    }

 dump_format:
    /* dump the stream mapping */
    av_log(NULL, AV_LOG_INFO, "Stream mapping:\n");
    for (i = 0; i < nb_input_streams; i++) {
        ist = input_streams[i];

        for (j = 0; j < ist->nb_filters; j++) {
            if (!filtergraph_is_simple(ist->filters[j]->graph)) {
                av_log(NULL, AV_LOG_INFO, "  Stream #%d:%d (%s) -> %s",
                       ist->file_index, ist->st->index, ist->dec ?
                       ist->dec->name : "?",
                       ist->filters[j]->name);
                if (nb_filtergraphs > 1)
                    av_log(NULL, AV_LOG_INFO, " (graph %d)", ist->filters[j]->graph->index);
                av_log(NULL, AV_LOG_INFO, "\n");
            }
        }
    }

    for (i = 0; i < nb_output_streams; i++) {
        ost = output_streams[i];

        if (ost->attachment_filename) {
            /* an attached file */
            av_log(NULL, AV_LOG_INFO, "  File %s -> Stream #%d:%d\n",
                   ost->attachment_filename, ost->file_index, ost->index);
            continue;
        }

        if (ost->filter && !filtergraph_is_simple(ost->filter->graph)) {
            /* output from a complex graph */
            av_log(NULL, AV_LOG_INFO, "  %s", ost->filter->name);
            if (nb_filtergraphs > 1)
                av_log(NULL, AV_LOG_INFO, " (graph %d)", ost->filter->graph->index);

            av_log(NULL, AV_LOG_INFO, " -> Stream #%d:%d (%s)\n", ost->file_index,
                   ost->index, ost->enc ? ost->enc->name : "?");
            continue;
        }

        av_log(NULL, AV_LOG_INFO, "  Stream #%d:%d -> #%d:%d",
               input_streams[ost->source_index]->file_index,
               input_streams[ost->source_index]->st->index,
               ost->file_index,
               ost->index);
        if (ost->sync_ist != input_streams[ost->source_index])
            av_log(NULL, AV_LOG_INFO, " [sync #%d:%d]",
                   ost->sync_ist->file_index,
                   ost->sync_ist->st->index);
        if (ost->stream_copy)
            av_log(NULL, AV_LOG_INFO, " (copy)");
        else {
            const AVCodec *in_codec    = input_streams[ost->source_index]->dec;
            const AVCodec *out_codec   = ost->enc;
            const char *decoder_name   = "?";
            const char *in_codec_name  = "?";
            const char *encoder_name   = "?";
            const char *out_codec_name = "?";
            const AVCodecDescriptor *desc;

            if (in_codec) {
                decoder_name = in_codec->name;
                desc = avcodec_descriptor_get(in_codec->id);
                if (desc)
                    in_codec_name = desc->name;
                if (!strcmp(decoder_name, in_codec_name))
                    decoder_name = "native";
            }

            if (out_codec) {
                encoder_name = out_codec->name;
                desc = avcodec_descriptor_get(out_codec->id);
                if (desc)
                    out_codec_name = desc->name;
                if (!strcmp(encoder_name, out_codec_name))
                    encoder_name = "native";
            }

            av_log(NULL, AV_LOG_INFO, " (%s (%s) -> %s (%s))",
                   in_codec_name, decoder_name,
                   out_codec_name, encoder_name);
        }
        av_log(NULL,
               AV_LOG_INFO, "\n");
    }

    if (ret) {
        av_log(NULL, AV_LOG_ERROR, "%s\n", error);
        return ret;
    }

    transcode_init_done = 1;

    return 0;
}

/* Return 1 if there remain streams where more output is wanted, 0 otherwise. */
static int need_output(void)
{
    int i;

    for (i = 0; i < nb_output_streams; i++) {
        OutputStream *ost   = output_streams[i];
        OutputFile *of      = output_files[ost->file_index];
        AVFormatContext *os = output_files[ost->file_index]->ctx;

        if (ost->finished ||
            (os->pb && avio_tell(os->pb) >= of->limit_filesize))
            continue;
        if (ost->frame_number >= ost->max_frames) {
            int j;
            for (j = 0; j < of->ctx->nb_streams; j++)
                close_output_stream(output_streams[of->ost_index + j]);
            continue;
        }

        return 1;
    }

    return 0;
}

/**
 * Select the output stream to process.
 *
 * @return selected output stream, or NULL if none available
 */
static OutputStream *choose_output(void)
{
    int i;
    int64_t opts_min = INT64_MAX;
    OutputStream *ost_min = NULL;

    for (i = 0; i < nb_output_streams; i++) {
        OutputStream *ost = output_streams[i];
        int64_t opts = ost->st->cur_dts == AV_NOPTS_VALUE ? INT64_MIN :
                       av_rescale_q(ost->st->cur_dts, ost->st->time_base,
                                    AV_TIME_BASE_Q);
        if (ost->st->cur_dts == AV_NOPTS_VALUE)
            av_log(NULL, AV_LOG_DEBUG, "cur_dts is invalid (this is harmless if it occurs once at the start per stream)\n");

        if (!ost->finished && opts < opts_min) {
            opts_min = opts;
            ost_min  = ost->unavailable ?
                       NULL : ost;
        }
    }
    return ost_min;
}

static void set_tty_echo(int on)
{
#if HAVE_TERMIOS_H
    struct termios tty;
    if (tcgetattr(0, &tty) == 0) {
        if (on) tty.c_lflag |= ECHO;
        else    tty.c_lflag &= ~ECHO;
        tcsetattr(0, TCSANOW, &tty);
    }
#endif
}

static int check_keyboard_interaction(int64_t cur_time)
{
    int i, ret, key;
    static int64_t last_time;
    if (received_nb_signals)
        return AVERROR_EXIT;
    /* read_key() returns 0 on EOF */
    if (cur_time - last_time >= 100000 && !run_as_daemon) {
        key = read_key();
        last_time = cur_time;
    } else
        key = -1;
    if (key == 'q')
        return AVERROR_EXIT;
    if (key == '+') av_log_set_level(av_log_get_level()+10);
    if (key == '-') av_log_set_level(av_log_get_level()-10);
    if (key == 's') qp_hist ^= 1;
    if (key == 'h'){
        if (do_hex_dump){
            do_hex_dump = do_pkt_dump = 0;
        } else if(do_pkt_dump){
            do_hex_dump = 1;
        } else
            do_pkt_dump = 1;
        av_log_set_level(AV_LOG_DEBUG);
    }
    if (key == 'c' || key == 'C'){
        char buf[4096], target[64], command[256], arg[256] = {0};
        double time;
        int k, n = 0;
        fprintf(stderr, "\nEnter command: <target>|all <time>|-1 <command>[ <argument>]\n");
        i = 0;
        set_tty_echo(1);
        while ((k = read_key()) != '\n' && k != '\r' && i < sizeof(buf)-1)
            if (k > 0)
                buf[i++] = k;
        buf[i] = 0;
        set_tty_echo(0);
        fprintf(stderr, "\n");
        if (k > 0 &&
            (n = sscanf(buf, "%63[^ ] %lf %255[^ ] %255[^\n]", target, &time, command, arg)) >= 3) {
            av_log(NULL, AV_LOG_DEBUG, "Processing command target:%s time:%f command:%s arg:%s",
                   target, time, command, arg);
            for (i = 0; i < nb_filtergraphs; i++) {
                FilterGraph *fg = filtergraphs[i];
                if (fg->graph) {
                    if (time < 0) {
                        ret = avfilter_graph_send_command(fg->graph, target, command, arg, buf, sizeof(buf),
                                                          key == 'c' ?
                                                          AVFILTER_CMD_FLAG_ONE : 0);
                        fprintf(stderr, "Command reply for stream %d: ret:%d res:\n%s",
                                i, ret, buf);
                    } else if (key == 'c') {
                        fprintf(stderr, "Queuing commands only on filters supporting the specific command is unsupported\n");
                        ret = AVERROR_PATCHWELCOME;
                    } else {
                        ret = avfilter_graph_queue_command(fg->graph, target, command, arg, 0, time);
                        if (ret < 0)
                            fprintf(stderr, "Queuing command failed with error %s\n", av_err2str(ret));
                    }
                }
            }
        } else {
            av_log(NULL, AV_LOG_ERROR,
                   "Parse error, at least 3 arguments were expected, "
                   "only %d given in string '%s'\n", n, buf);
        }
    }
    if (key == 'd' || key == 'D'){
        int debug=0;
        if(key == 'D') {
            debug = input_streams[0]->st->codec->debug<<1;
            if(!debug) debug = 1;
            while(debug & (FF_DEBUG_DCT_COEFF|FF_DEBUG_VIS_QP|FF_DEBUG_VIS_MB_TYPE)) //unsupported, would just crash
                debug += debug;
        }else{
            char buf[32];
            int k = 0;
            i = 0;
            set_tty_echo(1);
            while ((k = read_key()) != '\n' && k != '\r' && i < sizeof(buf)-1)
                if (k > 0)
                    buf[i++] = k;
            buf[i] = 0;
            set_tty_echo(0);
            fprintf(stderr, "\n");
            if (k <= 0 || sscanf(buf, "%d", &debug)!=1)
                fprintf(stderr,"error parsing debug value\n");
        }
        for(i=0;i<nb_input_streams;i++) {
            input_streams[i]->st->codec->debug = debug;
        }
        for(i=0;i<nb_output_streams;i++) {
            OutputStream *ost = output_streams[i];
            ost->enc_ctx->debug = debug;
        }
        if(debug) av_log_set_level(AV_LOG_DEBUG);
        fprintf(stderr,"debug=%d\n", debug);
    }
    if (key == '?'){
        fprintf(stderr, "key    function\n"
                        "?      show this help\n"
                        "+      increase verbosity\n"
                        "-      decrease verbosity\n"
                        "c      Send command to first matching filter supporting it\n"
                        "C      Send/Queue command to all matching filters\n"
                        "D      cycle through available debug modes\n"
                        "h      dump packets/hex press to cycle through the 3 states\n"
                        "q      quit\n"
                        "s      Show QP histogram\n"
        );
    }
    return 0;
}

#if HAVE_PTHREADS
static void *input_thread(void *arg)
{
    InputFile *f = arg;
    unsigned flags = f->non_blocking ?
                     AV_THREAD_MESSAGE_NONBLOCK : 0;
    int ret = 0;

    while (1) {
        AVPacket pkt;
        ret = av_read_frame(f->ctx, &pkt);

        if (ret == AVERROR(EAGAIN)) {
            av_usleep(10000);
            continue;
        }
        if (ret < 0) {
            av_thread_message_queue_set_err_recv(f->in_thread_queue, ret);
            break;
        }
        ret = av_thread_message_queue_send(f->in_thread_queue, &pkt, flags);
        if (flags && ret == AVERROR(EAGAIN)) {
            flags = 0;
            ret = av_thread_message_queue_send(f->in_thread_queue, &pkt, flags);
            av_log(f->ctx, AV_LOG_WARNING,
                   "Thread message queue blocking; consider raising the "
                   "thread_queue_size option (current value: %d)\n",
                   f->thread_queue_size);
        }
        if (ret < 0) {
            if (ret != AVERROR_EOF)
                av_log(f->ctx, AV_LOG_ERROR,
                       "Unable to send packet to main thread: %s\n",
                       av_err2str(ret));
            av_packet_unref(&pkt);
            av_thread_message_queue_set_err_recv(f->in_thread_queue, ret);
            break;
        }
    }

    return NULL;
}

static void free_input_threads(void)
{
    int i;

    for (i = 0; i < nb_input_files; i++) {
        InputFile *f = input_files[i];
        AVPacket pkt;

        if (!f || !f->in_thread_queue)
            continue;
        av_thread_message_queue_set_err_send(f->in_thread_queue, AVERROR_EOF);
        while (av_thread_message_queue_recv(f->in_thread_queue, &pkt, 0) >= 0)
            av_packet_unref(&pkt);

        pthread_join(f->thread, NULL);
        f->joined = 1;
        av_thread_message_queue_free(&f->in_thread_queue);
    }
}

static int init_input_threads(void)
{
    int i, ret;

    if (nb_input_files == 1)
        return 0;

    for (i = 0; i < nb_input_files; i++) {
        InputFile *f = input_files[i];

        if (f->ctx->pb ? !f->ctx->pb->seekable :
            strcmp(f->ctx->iformat->name, "lavfi"))
            f->non_blocking = 1;
        ret = av_thread_message_queue_alloc(&f->in_thread_queue,
                                            f->thread_queue_size, sizeof(AVPacket));
        if (ret < 0)
            return ret;

        if ((ret = pthread_create(&f->thread, NULL, input_thread, f))) {
            av_log(NULL, AV_LOG_ERROR, "pthread_create failed: %s. "
                   "Try to increase `ulimit -v` or decrease `ulimit -s`.\n",
                   strerror(ret));
            av_thread_message_queue_free(&f->in_thread_queue);
            return AVERROR(ret);
        }
    }
    return 0;
}

static int get_input_packet_mt(InputFile *f, AVPacket *pkt)
{
    return av_thread_message_queue_recv(f->in_thread_queue, pkt,
                                        f->non_blocking ?
                                        AV_THREAD_MESSAGE_NONBLOCK : 0);
}
#endif

static int get_input_packet(InputFile *f, AVPacket *pkt)
{
    if (f->rate_emu) {
        int i;
        for (i = 0; i < f->nb_streams; i++) {
            InputStream *ist = input_streams[f->ist_index + i];
            int64_t pts = av_rescale(ist->dts, 1000000, AV_TIME_BASE);
            int64_t now = av_gettime_relative() - ist->start;
            if (pts > now)
                return AVERROR(EAGAIN);
        }
    }

#if HAVE_PTHREADS
    if (nb_input_files > 1)
        return get_input_packet_mt(f, pkt);
#endif
    return av_read_frame(f->ctx, pkt);
}

static int got_eagain(void)
{
    int i;
    for (i = 0; i < nb_output_streams; i++)
        if (output_streams[i]->unavailable)
            return 1;
    return 0;
}

static void reset_eagain(void)
{
    int i;
    for (i = 0; i < nb_input_files; i++)
        input_files[i]->eagain = 0;
    for (i = 0; i < nb_output_streams; i++)
        output_streams[i]->unavailable = 0;
}

// set duration to max(tmp, duration) in a proper time base and return duration's time_base
static AVRational duration_max(int64_t tmp, int64_t *duration, AVRational tmp_time_base,
                               AVRational time_base)
{
    int ret;

    if (!*duration) {
        *duration = tmp;
        return tmp_time_base;
    }

    ret = av_compare_ts(*duration, time_base, tmp, tmp_time_base);
    if (ret < 0) {
        *duration = tmp;
        return tmp_time_base;
    }

    return time_base;
}

static int seek_to_start(InputFile *ifile, AVFormatContext *is)
{
    InputStream *ist;
    AVCodecContext *avctx;
    int i, ret, has_audio = 0;
    int64_t duration = 0;

    ret = av_seek_frame(is, -1, is->start_time, 0);
    if (ret < 0)
        return ret;

    for (i = 0; i < ifile->nb_streams; i++) {
        ist   = input_streams[ifile->ist_index + i];
        avctx = ist->dec_ctx;

        // flush decoders
        if (ist->decoding_needed) {
            process_input_packet(ist, NULL, 1);
            avcodec_flush_buffers(avctx);
        }

        /* duration is the length
         * of the last frame in a stream
         * when audio stream is present we don't care about
         * last video frame length because it's not defined exactly */
        if (avctx->codec_type == AVMEDIA_TYPE_AUDIO && ist->nb_samples)
            has_audio = 1;
    }

    for (i = 0; i < ifile->nb_streams; i++) {
        ist   = input_streams[ifile->ist_index + i];
        avctx = ist->dec_ctx;

        if (has_audio) {
            if (avctx->codec_type == AVMEDIA_TYPE_AUDIO && ist->nb_samples) {
                AVRational sample_rate = {1, avctx->sample_rate};

                duration = av_rescale_q(ist->nb_samples, sample_rate, ist->st->time_base);
            } else
                continue;
        } else {
            if (ist->framerate.num) {
                duration = av_rescale_q(1, ist->framerate, ist->st->time_base);
            } else if (ist->st->avg_frame_rate.num) {
                duration = av_rescale_q(1, ist->st->avg_frame_rate, ist->st->time_base);
            } else
                duration = 1;
        }
        if (!ifile->duration)
            ifile->time_base = ist->st->time_base;
        /* the total duration of the stream, max_pts - min_pts is
         * the duration of the stream without the last frame */
        duration += ist->max_pts - ist->min_pts;
        ifile->time_base = duration_max(duration, &ifile->duration, ist->st->time_base,
                                        ifile->time_base);
    }

    if (ifile->loop > 0)
        ifile->loop--;

    return ret;
}

/*
 * Return
 * - 0 -- one packet was read and processed
 * - AVERROR(EAGAIN) -- no packets were available for selected file,
 *   this function should be called again
 * - AVERROR_EOF -- this function should not be called again
 */
static int process_input(int file_index)
{
    InputFile *ifile = input_files[file_index];
    AVFormatContext *is;
    InputStream *ist;
    AVPacket pkt;
    int ret, i, j;
    int64_t duration;
    int64_t pkt_dts;

    is  = ifile->ctx;
    ret = get_input_packet(ifile, &pkt);

    if (ret == AVERROR(EAGAIN)) {
        ifile->eagain = 1;
        return ret;
    }
    if (ret < 0 && ifile->loop) {
        if ((ret = seek_to_start(ifile, is)) < 0)
            return ret;
        ret = get_input_packet(ifile, &pkt);
        if (ret == AVERROR(EAGAIN)) {
            ifile->eagain = 1;
            return ret;
        }
    }
    if (ret < 0) {
        if (ret != AVERROR_EOF) {
            print_error(is->filename, ret);
            if (exit_on_error)
                exit_program(1);
        }

        for (i = 0;
i < ifile->nb_streams; i++) { ist = input_streams[ifile->ist_index + i]; if (ist->decoding_needed) { ret = process_input_packet(ist, NULL, 0); if (ret>0) return 0; } /* mark all outputs that don't go through lavfi as finished */ for (j = 0; j < nb_output_streams; j++) { OutputStream *ost = output_streams[j]; if (ost->source_index == ifile->ist_index + i && (ost->stream_copy || ost->enc->type == AVMEDIA_TYPE_SUBTITLE)) finish_output_stream(ost); } } ifile->eof_reached = 1; return AVERROR(EAGAIN); } reset_eagain(); if (do_pkt_dump) { av_pkt_dump_log2(NULL, AV_LOG_INFO, &pkt, do_hex_dump, is->streams[pkt.stream_index]); } /* the following test is needed in case new streams appear dynamically in stream : we ignore them */ if (pkt.stream_index >= ifile->nb_streams) { report_new_stream(file_index, &pkt); goto discard_packet; } ist = input_streams[ifile->ist_index + pkt.stream_index]; ist->data_size += pkt.size; ist->nb_packets++; if (ist->discard) goto discard_packet; if (exit_on_error && (pkt.flags & AV_PKT_FLAG_CORRUPT)) { av_log(NULL, AV_LOG_FATAL, "%s: corrupt input packet in stream %d\n", is->filename, pkt.stream_index); exit_program(1); } if (debug_ts) { av_log(NULL, AV_LOG_INFO, "demuxer -> ist_index:%d type:%s " "next_dts:%s next_dts_time:%s next_pts:%s next_pts_time:%s pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s off:%s off_time:%s\n", ifile->ist_index + pkt.stream_index, av_get_media_type_string(ist->dec_ctx->codec_type), av_ts2str(ist->next_dts), av_ts2timestr(ist->next_dts, &AV_TIME_BASE_Q), av_ts2str(ist->next_pts), av_ts2timestr(ist->next_pts, &AV_TIME_BASE_Q), av_ts2str(pkt.pts), av_ts2timestr(pkt.pts, &ist->st->time_base), av_ts2str(pkt.dts), av_ts2timestr(pkt.dts, &ist->st->time_base), av_ts2str(input_files[ist->file_index]->ts_offset), av_ts2timestr(input_files[ist->file_index]->ts_offset, &AV_TIME_BASE_Q)); } if(!ist->wrap_correction_done && is->start_time != AV_NOPTS_VALUE && ist->st->pts_wrap_bits < 64){ int64_t stime, stime2; // Correcting 
starttime based on the enabled streams // FIXME this ideally should be done before the first use of starttime but we do not know which are the enabled streams at that point. // so we instead do it here as part of discontinuity handling if ( ist->next_dts == AV_NOPTS_VALUE && ifile->ts_offset == -is->start_time && (is->iformat->flags & AVFMT_TS_DISCONT)) { int64_t new_start_time = INT64_MAX; for (i=0; i<is->nb_streams; i++) { AVStream *st = is->streams[i]; if(st->discard == AVDISCARD_ALL || st->start_time == AV_NOPTS_VALUE) continue; new_start_time = FFMIN(new_start_time, av_rescale_q(st->start_time, st->time_base, AV_TIME_BASE_Q)); } if (new_start_time > is->start_time) { av_log(is, AV_LOG_VERBOSE, "Correcting start time by %"PRId64"\n", new_start_time - is->start_time); ifile->ts_offset = -new_start_time; } } stime = av_rescale_q(is->start_time, AV_TIME_BASE_Q, ist->st->time_base); stime2= stime + (1ULL<<ist->st->pts_wrap_bits); ist->wrap_correction_done = 1; if(stime2 > stime && pkt.dts != AV_NOPTS_VALUE && pkt.dts > stime + (1LL<<(ist->st->pts_wrap_bits-1))) { pkt.dts -= 1ULL<<ist->st->pts_wrap_bits; ist->wrap_correction_done = 0; } if(stime2 > stime && pkt.pts != AV_NOPTS_VALUE && pkt.pts > stime + (1LL<<(ist->st->pts_wrap_bits-1))) { pkt.pts -= 1ULL<<ist->st->pts_wrap_bits; ist->wrap_correction_done = 0; } } /* add the stream-global side data to the first packet */ if (ist->nb_packets == 1) { if (ist->st->nb_side_data) av_packet_split_side_data(&pkt); for (i = 0; i < ist->st->nb_side_data; i++) { AVPacketSideData *src_sd = &ist->st->side_data[i]; uint8_t *dst_data; if (av_packet_get_side_data(&pkt, src_sd->type, NULL)) continue; if (ist->autorotate && src_sd->type == AV_PKT_DATA_DISPLAYMATRIX) continue; dst_data = av_packet_new_side_data(&pkt, src_sd->type, src_sd->size); if (!dst_data) exit_program(1); memcpy(dst_data, src_sd->data, src_sd->size); } } if (pkt.dts != AV_NOPTS_VALUE) pkt.dts += av_rescale_q(ifile->ts_offset, AV_TIME_BASE_Q, ist->st->time_base); 
if (pkt.pts != AV_NOPTS_VALUE) pkt.pts += av_rescale_q(ifile->ts_offset, AV_TIME_BASE_Q, ist->st->time_base); if (pkt.pts != AV_NOPTS_VALUE) pkt.pts *= ist->ts_scale; if (pkt.dts != AV_NOPTS_VALUE) pkt.dts *= ist->ts_scale; pkt_dts = av_rescale_q_rnd(pkt.dts, ist->st->time_base, AV_TIME_BASE_Q, AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX); if ((ist->dec_ctx->codec_type == AVMEDIA_TYPE_VIDEO || ist->dec_ctx->codec_type == AVMEDIA_TYPE_AUDIO) && pkt_dts != AV_NOPTS_VALUE && ist->next_dts == AV_NOPTS_VALUE && !copy_ts && (is->iformat->flags & AVFMT_TS_DISCONT) && ifile->last_ts != AV_NOPTS_VALUE) { int64_t delta = pkt_dts - ifile->last_ts; if (delta < -1LL*dts_delta_threshold*AV_TIME_BASE || delta > 1LL*dts_delta_threshold*AV_TIME_BASE){ ifile->ts_offset -= delta; av_log(NULL, AV_LOG_DEBUG, "Inter stream timestamp discontinuity %"PRId64", new offset= %"PRId64"\n", delta, ifile->ts_offset); pkt.dts -= av_rescale_q(delta, AV_TIME_BASE_Q, ist->st->time_base); if (pkt.pts != AV_NOPTS_VALUE) pkt.pts -= av_rescale_q(delta, AV_TIME_BASE_Q, ist->st->time_base); } } duration = av_rescale_q(ifile->duration, ifile->time_base, ist->st->time_base); if (pkt.pts != AV_NOPTS_VALUE) { pkt.pts += duration; ist->max_pts = FFMAX(pkt.pts, ist->max_pts); ist->min_pts = FFMIN(pkt.pts, ist->min_pts); } if (pkt.dts != AV_NOPTS_VALUE) pkt.dts += duration; pkt_dts = av_rescale_q_rnd(pkt.dts, ist->st->time_base, AV_TIME_BASE_Q, AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX); if ((ist->dec_ctx->codec_type == AVMEDIA_TYPE_VIDEO || ist->dec_ctx->codec_type == AVMEDIA_TYPE_AUDIO) && pkt_dts != AV_NOPTS_VALUE && ist->next_dts != AV_NOPTS_VALUE && !copy_ts) { int64_t delta = pkt_dts - ist->next_dts; if (is->iformat->flags & AVFMT_TS_DISCONT) { if (delta < -1LL*dts_delta_threshold*AV_TIME_BASE || delta > 1LL*dts_delta_threshold*AV_TIME_BASE || pkt_dts + AV_TIME_BASE/10 < FFMAX(ist->pts, ist->dts)) { ifile->ts_offset -= delta; av_log(NULL, AV_LOG_DEBUG, "timestamp discontinuity %"PRId64", new offset= %"PRId64"\n", 
delta, ifile->ts_offset); pkt.dts -= av_rescale_q(delta, AV_TIME_BASE_Q, ist->st->time_base); if (pkt.pts != AV_NOPTS_VALUE) pkt.pts -= av_rescale_q(delta, AV_TIME_BASE_Q, ist->st->time_base); } } else { if ( delta < -1LL*dts_error_threshold*AV_TIME_BASE || delta > 1LL*dts_error_threshold*AV_TIME_BASE) { av_log(NULL, AV_LOG_WARNING, "DTS %"PRId64", next:%"PRId64" st:%d invalid dropping\n", pkt.dts, ist->next_dts, pkt.stream_index); pkt.dts = AV_NOPTS_VALUE; } if (pkt.pts != AV_NOPTS_VALUE){ int64_t pkt_pts = av_rescale_q(pkt.pts, ist->st->time_base, AV_TIME_BASE_Q); delta = pkt_pts - ist->next_dts; if ( delta < -1LL*dts_error_threshold*AV_TIME_BASE || delta > 1LL*dts_error_threshold*AV_TIME_BASE) { av_log(NULL, AV_LOG_WARNING, "PTS %"PRId64", next:%"PRId64" invalid dropping st:%d\n", pkt.pts, ist->next_dts, pkt.stream_index); pkt.pts = AV_NOPTS_VALUE; } } } } if (pkt.dts != AV_NOPTS_VALUE) ifile->last_ts = av_rescale_q(pkt.dts, ist->st->time_base, AV_TIME_BASE_Q); if (debug_ts) { av_log(NULL, AV_LOG_INFO, "demuxer+ffmpeg -> ist_index:%d type:%s pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s off:%s off_time:%s\n", ifile->ist_index + pkt.stream_index, av_get_media_type_string(ist->dec_ctx->codec_type), av_ts2str(pkt.pts), av_ts2timestr(pkt.pts, &ist->st->time_base), av_ts2str(pkt.dts), av_ts2timestr(pkt.dts, &ist->st->time_base), av_ts2str(input_files[ist->file_index]->ts_offset), av_ts2timestr(input_files[ist->file_index]->ts_offset, &AV_TIME_BASE_Q)); } sub2video_heartbeat(ist, pkt.pts); process_input_packet(ist, &pkt, 0); discard_packet: av_packet_unref(&pkt); return 0; } /** * Perform a step of transcoding for the specified filter graph. 
* * @param[in] graph filter graph to consider * @param[out] best_ist input stream where a frame would allow to continue * @return 0 for success, <0 for error */ static int transcode_from_filter(FilterGraph *graph, InputStream **best_ist) { int i, ret; int nb_requests, nb_requests_max = 0; InputFilter *ifilter; InputStream *ist; *best_ist = NULL; ret = avfilter_graph_request_oldest(graph->graph); if (ret >= 0) return reap_filters(0); if (ret == AVERROR_EOF) { ret = reap_filters(1); for (i = 0; i < graph->nb_outputs; i++) close_output_stream(graph->outputs[i]->ost); return ret; } if (ret != AVERROR(EAGAIN)) return ret; for (i = 0; i < graph->nb_inputs; i++) { ifilter = graph->inputs[i]; ist = ifilter->ist; if (input_files[ist->file_index]->eagain || input_files[ist->file_index]->eof_reached) continue; nb_requests = av_buffersrc_get_nb_failed_requests(ifilter->filter); if (nb_requests > nb_requests_max) { nb_requests_max = nb_requests; *best_ist = ist; } } if (!*best_ist) for (i = 0; i < graph->nb_outputs; i++) graph->outputs[i]->ost->unavailable = 1; return 0; } /** * Run a single step of transcoding. * * @return 0 for success, <0 for error */ static int transcode_step(void) { OutputStream *ost; InputStream *ist; int ret; ost = choose_output(); if (!ost) { if (got_eagain()) { reset_eagain(); av_usleep(10000); return 0; } av_log(NULL, AV_LOG_VERBOSE, "No more inputs to read from, finishing.\n"); return AVERROR_EOF; } if (ost->filter) { if ((ret = transcode_from_filter(ost->filter->graph, &ist)) < 0) return ret; if (!ist) return 0; } else { av_assert0(ost->source_index >= 0); ist = input_streams[ost->source_index]; } ret = process_input(ist->file_index); if (ret == AVERROR(EAGAIN)) { if (input_files[ist->file_index]->eagain) ost->unavailable = 1; return 0; } if (ret < 0) return ret == AVERROR_EOF ? 
0 : ret; return reap_filters(0); } /* * The following code is the main loop of the file converter */ static int transcode(void) { int ret, i; AVFormatContext *os; OutputStream *ost; InputStream *ist; int64_t timer_start; int64_t total_packets_written = 0; ret = transcode_init(); if (ret < 0) goto fail; if (stdin_interaction) { av_log(NULL, AV_LOG_INFO, "Press [q] to stop, [?] for help\n"); } timer_start = av_gettime_relative(); #if HAVE_PTHREADS if ((ret = init_input_threads()) < 0) goto fail; #endif while (!received_sigterm) { int64_t cur_time= av_gettime_relative(); /* if 'q' pressed, exits */ if (stdin_interaction) if (check_keyboard_interaction(cur_time) < 0) break; /* check if there's any stream where output is still needed */ if (!need_output()) { av_log(NULL, AV_LOG_VERBOSE, "No more output streams to write to, finishing.\n"); break; } ret = transcode_step(); if (ret < 0 && ret != AVERROR_EOF) { char errbuf[128]; av_strerror(ret, errbuf, sizeof(errbuf)); av_log(NULL, AV_LOG_ERROR, "Error while filtering: %s\n", errbuf); break; } /* dump report by using the output first video and audio streams */ print_report(0, timer_start, cur_time); } #if HAVE_PTHREADS free_input_threads(); #endif /* at the end of stream, we must flush the decoder buffers */ for (i = 0; i < nb_input_streams; i++) { ist = input_streams[i]; if (!input_files[ist->file_index]->eof_reached && ist->decoding_needed) { process_input_packet(ist, NULL, 0); } } flush_encoders(); term_exit(); /* write the trailer if needed and close file */ for (i = 0; i < nb_output_files; i++) { os = output_files[i]->ctx; if (!output_files[i]->header_written) { av_log(NULL, AV_LOG_ERROR, "Nothing was written into output file %d (%s), because " "at least one of its streams received no packets.\n", i, os->filename); continue; } if ((ret = av_write_trailer(os)) < 0) { av_log(NULL, AV_LOG_ERROR, "Error writing trailer of %s: %s", os->filename, av_err2str(ret)); if (exit_on_error) exit_program(1); } } /* dump report by 
using the first video and audio streams */ print_report(1, timer_start, av_gettime_relative()); /* close each encoder */ for (i = 0; i < nb_output_streams; i++) { ost = output_streams[i]; if (ost->encoding_needed) { av_freep(&ost->enc_ctx->stats_in); } total_packets_written += ost->packets_written; } if (!total_packets_written && (abort_on_flags & ABORT_ON_FLAG_EMPTY_OUTPUT)) { av_log(NULL, AV_LOG_FATAL, "Empty output\n"); exit_program(1); } /* close each decoder */ for (i = 0; i < nb_input_streams; i++) { ist = input_streams[i]; if (ist->decoding_needed) { avcodec_close(ist->dec_ctx); if (ist->hwaccel_uninit) ist->hwaccel_uninit(ist->dec_ctx); } } av_buffer_unref(&hw_device_ctx); /* finished ! */ ret = 0; fail: #if HAVE_PTHREADS free_input_threads(); #endif if (output_streams) { for (i = 0; i < nb_output_streams; i++) { ost = output_streams[i]; if (ost) { if (ost->logfile) { if (fclose(ost->logfile)) av_log(NULL, AV_LOG_ERROR, "Error closing logfile, loss of information possible: %s\n", av_err2str(AVERROR(errno))); ost->logfile = NULL; } av_freep(&ost->forced_kf_pts); av_freep(&ost->apad); av_freep(&ost->disposition); av_dict_free(&ost->encoder_opts); av_dict_free(&ost->sws_dict); av_dict_free(&ost->swr_opts); av_dict_free(&ost->resample_opts); } } } return ret; } static int64_t getutime(void) { #if HAVE_GETRUSAGE struct rusage rusage; getrusage(RUSAGE_SELF, &rusage); return (rusage.ru_utime.tv_sec * 1000000LL) + rusage.ru_utime.tv_usec; #elif HAVE_GETPROCESSTIMES HANDLE proc; FILETIME c, e, k, u; proc = GetCurrentProcess(); GetProcessTimes(proc, &c, &e, &k, &u); return ((int64_t) u.dwHighDateTime << 32 | u.dwLowDateTime) / 10; #else return av_gettime_relative(); #endif } static int64_t getmaxrss(void) { #if HAVE_GETRUSAGE && HAVE_STRUCT_RUSAGE_RU_MAXRSS struct rusage rusage; getrusage(RUSAGE_SELF, &rusage); return (int64_t)rusage.ru_maxrss * 1024; #elif HAVE_GETPROCESSMEMORYINFO HANDLE proc; PROCESS_MEMORY_COUNTERS memcounters; proc = GetCurrentProcess(); 
memcounters.cb = sizeof(memcounters); GetProcessMemoryInfo(proc, &memcounters, sizeof(memcounters)); return memcounters.PeakPagefileUsage; #else return 0; #endif } static void log_callback_null(void *ptr, int level, const char *fmt, va_list vl) { } int main(int argc, char **argv) { int i, ret; int64_t ti; init_dynload(); register_exit(ffmpeg_cleanup); setvbuf(stderr,NULL,_IONBF,0); /* win32 runtime needs this */ av_log_set_flags(AV_LOG_SKIP_REPEATED); parse_loglevel(argc, argv, options); if(argc>1 && !strcmp(argv[1], "-d")){ run_as_daemon=1; av_log_set_callback(log_callback_null); argc--; argv++; } avcodec_register_all(); #if CONFIG_AVDEVICE avdevice_register_all(); #endif avfilter_register_all(); av_register_all(); avformat_network_init(); show_banner(argc, argv, options); /* parse options and open all input/output files */ ret = ffmpeg_parse_options(argc, argv); if (ret < 0) exit_program(1); if (nb_output_files <= 0 && nb_input_files == 0) { show_usage(); av_log(NULL, AV_LOG_WARNING, "Use -h to get full help or, even better, run 'man %s'\n", program_name); exit_program(1); } /* file converter / grab */ if (nb_output_files <= 0) { av_log(NULL, AV_LOG_FATAL, "At least one output file must be specified\n"); exit_program(1); } // if (nb_input_files == 0) { // av_log(NULL, AV_LOG_FATAL, "At least one input file must be specified\n"); // exit_program(1); // } for (i = 0; i < nb_output_files; i++) { if (strcmp(output_files[i]->ctx->oformat->name, "rtp")) want_sdp = 0; } current_time = ti = getutime(); if (transcode() < 0) exit_program(1); ti = getutime() - ti; if (do_benchmark) { av_log(NULL, AV_LOG_INFO, "bench: utime=%0.3fs\n", ti / 1000000.0); } av_log(NULL, AV_LOG_DEBUG, "%"PRIu64" frames successfully decoded, %"PRIu64" decoding errors\n", decode_error_stat[0], decode_error_stat[1]); if ((decode_error_stat[0] + decode_error_stat[1]) * max_error_rate < decode_error_stat[1]) exit_program(69); exit_program(received_nb_signals ? 
255 : main_return_code); return main_return_code; } ```
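The discontinuity handling in `process_input()` above reduces to one rule: compare each packet's DTS with the predicted next DTS and, if the jump exceeds a threshold, fold it into the file's running timestamp offset (`ifile->ts_offset -= delta; pkt.dts -= delta`). A minimal Python sketch of that rule — function and variable names are illustrative, not FFmpeg's API; the 10 s default mirrors ffmpeg's `dts_delta_threshold`:

```python
DTS_DELTA_THRESHOLD = 10.0  # seconds; mirrors ffmpeg's default dts_delta_threshold

def correct_discontinuity(pkt_dts, next_dts, ts_offset,
                          threshold=DTS_DELTA_THRESHOLD):
    """If the gap between a packet's DTS and the expected next DTS exceeds
    the threshold, absorb the jump into the running timestamp offset and
    pull the packet's DTS back in line; otherwise leave both unchanged."""
    delta = pkt_dts - next_dts
    if abs(delta) > threshold:
        ts_offset -= delta   # shift all future timestamps from this file
        pkt_dts -= delta     # rewrite this packet's DTS to stay continuous
    return pkt_dts, ts_offset

# a stream that jumps from t=5s to t=1000s (e.g. a wrapped MPEG-TS input)
dts, offset = correct_discontinuity(1000.0, 5.0, 0.0)  # -> (5.0, -995.0)
```

The real code additionally rescales between the stream time base and `AV_TIME_BASE_Q` and distinguishes `AVFMT_TS_DISCONT` formats (offset correction) from others (packet drop past `dts_error_threshold`); the sketch keeps only the core offset logic.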
```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace g3
{
    /// <summary>
    /// SparseList provides a linear-indexing interface, but internally may use an
    /// alternate data structure to store the [index,value] pairs, if the list
    /// is very sparse.
    ///
    /// Currently uses Dictionary<> as sparse data structure
    /// </summary>
    public class SparseList<T> where T : IEquatable<T>
    {
        T[] dense;
        Dictionary<int, T> sparse;
        T zeroValue;

        public SparseList(int MaxIndex, int SubsetCountEst, T ZeroValue)
        {
            zeroValue = ZeroValue;
            bool bSmall = MaxIndex > 0 && MaxIndex < 1024;
            float fPercent = (MaxIndex == 0) ? 0 : (float)SubsetCountEst / (float)MaxIndex;
            float fPercentThresh = 0.1f;
            if (bSmall || fPercent > fPercentThresh) {
                dense = new T[MaxIndex];
                for (int k = 0; k < MaxIndex; ++k)
                    dense[k] = ZeroValue;
            } else
                sparse = new Dictionary<int, T>();
        }

        public T this[int idx]
        {
            get {
                if (dense != null)
                    return dense[idx];
                T val;
                if (sparse.TryGetValue(idx, out val))
                    return val;
                return zeroValue;
            }
            set {
                if (dense != null) {
                    dense[idx] = value;
                } else {
                    sparse[idx] = value;
                }
            }
        }

        public int Count(Func<T,bool> CountF)
        {
            int count = 0;
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i)
                    if (CountF(dense[i])) count++;
            } else {
                foreach (var v in sparse) {
                    if (CountF(v.Value)) count++;
                }
            }
            return count;
        }

        /// <summary>
        /// This enumeration will return pairs [index,0] for dense case
        /// </summary>
        public IEnumerable<KeyValuePair<int,T>> Values()
        {
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i)
                    yield return new KeyValuePair<int, T>(i, dense[i]);
            } else {
                foreach (var v in sparse)
                    yield return v;
            }
        }

        public IEnumerable<KeyValuePair<int,T>> NonZeroValues()
        {
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i) {
                    if (dense[i].Equals(zeroValue) == false)
                        yield return new KeyValuePair<int, T>(i, dense[i]);
                }
            } else {
                foreach (var v in sparse)
                    yield return v;
            }
        }
    }

    /// <summary>
    /// variant of SparseList for class objects, then "zero" is null
    ///
    /// TODO: can we combine these classes somehow?
    /// </summary>
    public class SparseObjectList<T> where T : class
    {
        T[] dense;
        Dictionary<int, T> sparse;

        public SparseObjectList(int MaxIndex, int SubsetCountEst)
        {
            bool bSmall = MaxIndex < 1024;
            float fPercent = (float)SubsetCountEst / (float)MaxIndex;
            float fPercentThresh = 0.1f;
            if (bSmall || fPercent > fPercentThresh) {
                dense = new T[MaxIndex];
                for (int k = 0; k < MaxIndex; ++k)
                    dense[k] = null;
            } else
                sparse = new Dictionary<int, T>();
        }

        public T this[int idx]
        {
            get {
                if (dense != null)
                    return dense[idx];
                T val;
                if (sparse.TryGetValue(idx, out val))
                    return val;
                return null;
            }
            set {
                if (dense != null) {
                    dense[idx] = value;
                } else {
                    sparse[idx] = value;
                }
            }
        }

        public int Count(Func<T,bool> CountF)
        {
            int count = 0;
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i)
                    if (CountF(dense[i])) count++;
            } else {
                foreach (var v in sparse) {
                    if (CountF(v.Value)) count++;
                }
            }
            return count;
        }

        /// <summary>
        /// This enumeration will return pairs [index,0] for dense case
        /// </summary>
        public IEnumerable<KeyValuePair<int,T>> Values()
        {
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i)
                    yield return new KeyValuePair<int, T>(i, dense[i]);
            } else {
                foreach (var v in sparse)
                    yield return v;
            }
        }

        public IEnumerable<KeyValuePair<int,T>> NonZeroValues()
        {
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i) {
                    if (dense[i] != null)
                        yield return new KeyValuePair<int, T>(i, dense[i]);
                }
            } else {
                foreach (var v in sparse)
                    yield return v;
            }
        }

        public void Clear()
        {
            if (dense != null) {
                Array.Clear(dense, 0, dense.Length);
            } else {
                sparse.Clear();
            }
        }
    }
}
```
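The SparseList constructor above picks its backing store from an estimated fill ratio: a dense array when the index range is small (under 1024) or more than ~10% occupied, a hash map otherwise. A rough Python transliteration of that heuristic and the indexer, for illustration only — the class and function names here are invented, not part of the g3 library:

```python
def choose_storage(max_index, subset_count_est, threshold=0.1, small_limit=1024):
    """Mirror of SparseList's heuristic: dense array for small or
    well-populated index ranges, hash map otherwise."""
    is_small = 0 < max_index < small_limit
    fill = 0 if max_index == 0 else subset_count_est / max_index
    return "dense" if (is_small or fill > threshold) else "sparse"

class SparseListSketch:
    """Linear-indexing facade over either a list or a dict, like SparseList<T>."""
    def __init__(self, max_index, subset_count_est, zero_value=0):
        self.zero = zero_value
        if choose_storage(max_index, subset_count_est) == "dense":
            self.dense, self.sparse = [zero_value] * max_index, None
        else:
            self.dense, self.sparse = None, {}

    def __getitem__(self, idx):
        if self.dense is not None:
            return self.dense[idx]
        return self.sparse.get(idx, self.zero)  # absent key reads as "zero"

    def __setitem__(self, idx, value):
        if self.dense is not None:
            self.dense[idx] = value
        else:
            self.sparse[idx] = value

print(choose_storage(100, 5))         # small range -> dense
print(choose_storage(1_000_000, 10))  # huge, nearly empty range -> sparse
```

The 10% threshold trades memory for speed: below it, a dense array of mostly-zero entries would waste space, while above it the dictionary's per-entry overhead dominates.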
Jerusalem is a city in the Middle East. Jerusalem or Jeruzalem may also refer to: Places Middle East Jerusalem District, State of Israel Jerusalem Governorate, Palestinian National Authority Mutasarrifate of Jerusalem, an Ottoman District from 1872 to 1917 Kingdom of Jerusalem, a Christian kingdom from 1099 to 1291 United States Jerusalem, Arkansas, unincorporated community in Conway County Jerusalem, Baltimore County, Maryland, unincorporated community Jerusalem Mill Village, living history museum in Maryland Jerusalem, Michigan, an unincorporated community Jerusalem, New York, town in Yates County Jerusalem, Ohio, village in Monroe County Jerusalem, Rhode Island, an unincorporated village in the incorporated town of Narragansett in Washington County Jerusalem, Virginia, former name of Courtland, a town in Southampton County Jerusalem and Figtree Hill, U.S. Virgin Islands New Zealand Jerusalem, New Zealand, also known as Hiruharama, a village in the Manawatu-Whanganui Region Hiruharama, a transliteration of "Jerusalem", a village in the Gisborne District Other places Jerusalem of Lithuania, a nickname for Vilnius, Lithuania Jerusalem, Lincolnshire, a village in England Jeruzalem, Ljutomer, a village in Slovenia Jeruzalem, Pomeranian Voivodeship, a village in Poland Jerusalem (Königsberg), a former quarter of Königsberg, Prussia Jerusalem, a village and administrative part of Příbram, Czech Republic Góra Kalwaria, a town in Poland formerly known as Nowa Jerozolima ("New Jerusalem") Yerusalimka or Ierusalimka, a Jewish quarter of the town of Vinnytsia Nowa Jerozolima, an 18th-century village, now part of Warsaw Jeruzalem, a restaurant in Delft, Netherlands Arts Literature Jerusalem Delivered, a 1581 epic poem by Torquato Tasso "Jerusalem" (poem), common name for the 1804 poem "And did those feet in ancient time" by William Blake Jerusalem The Emanation of the Giant Albion, an illuminated book created from 1804 to 1820 by William Blake Jerusalem (Mendelssohn), 
philosophical book published in 1783 Jerusalem (Lagerlöf novel), 1901 novel by Selma Lagerlöf Jerusalem, 1996 novel by Cecelia Holland Jerusalem, 2009 novel by Patrick Neate Jerusalem: The Biography, 2011 historical book by Simon Sebag Montefiore Jerusalem (Moore novel), a 2016 novel by Alan Moore O Jerusalem!, an epic history of the city. Music Jerusalem-Yerushalayim, 2008 oratorio-musical by Antony Pitts Jerusalem (British band), early 1970s Jerusalem (Swedish band), founded in 1975 Jérusalem, 1847 opera by Giuseppe Verdi Albums Jerusalem (Jerusalem album), by the Swedish band Jerusalem, 1978 Jerusalem (Sleep album), title of an unauthorized 1999 release of Sleep's third album, also released as Dopesmoker Jerusalem (Steve Earle album), 2002 Jerusalem, by Alpha Blondy featuring The Wailers, 1986 Jerusalem (EP), by Mark Stewart Songs "Jerusalem" (hymn), a setting to music of Blake's poem And did those feet in ancient time, written by Sir Hubert Parry in 1916 and used as a hymn "Jerusalem", 1973 song from the album Brain Salad Surgery by Emerson, Lake & Palmer "Jerusalem", 1981 track on Chariots of Fire by Ambrosian Singers and Vangelis "Jerusalem", 1988 track on I Am Kurious Oranj by The Fall "Blake's Jerusalem", 1990 song on the album The Internationale by Billy Bragg "Jerusalem", a song from The Chemical Wedding by Bruce Dickinson "Jerusalem of Gold", Israeli patriotic song written in 1967 "Jerusalem", 1970 instrumental by Herb Alpert from Summertime "Jerusalem", 1971 song on Long Player by The Faces "The Holy City" (song), a religious Victorian ballad dating from 1892, sometimes known as "Jerusalem" because of the prominence of that word in the refrain "With a Shout (Jerusalem)", 1981 song by U2 from October "Jerusalem" (Belouis Some song), from his 1985 album Some People "Jerusalem" (Alphaville song), from their 1986 album Afternoons in Utopia "Jerusalem", title song from the 1986 album by Alpha Blondy featuring The Wailers "Jerusalem", 1987 song by Sinéad 
O'Connor from The Lion and the Cobra "Jerusalem" (Dan Bern song), 1996 work from Dog Boy Van by Dan Bern "Jerusalem", 2004 song by Dutch singer Anouk on Hotel New York "Jerusalem" (Out of Darkness Comes Light), 2006 work by Hasidic reggae musician Matisyahu "Jerusalem", 2007 song by Stanley Clarke from The Toys of Men "Jerusalem", 2010 song by Rosamund Pike, Tom Wilkinson, Stephen Merchant, Sanjeev Bhaskar, Pam Ferris and cast from Jackboots on Whitehall Other arts Jerusalem (1996 film), 1996 Swedish film, directed by Bille August, based on Selma Lagerlöf's 1901 novel Jerusalem (2013 film), a National Geographic documentary narrated by Benedict Cumberbatch Jerusalem (painting), an 1867 painting by Jean-Léon Gérôme Jerusalem (play), 2009 work created by Jez Butterworth JeruZalem, 2015 Israeli film directed by Doron Paz and Yoav Paz Other uses Jerusalem (surname), a surname (and a list of people with the name) Jerusalem artichoke, a vegetable Jerusalem (computer virus) Jerusalem College of Engineering, Chennai, an engineering college in Tamil Nadu, India Jerusalem! Tactical Game of the 1948 War, a 1975 board wargame that simulates the 1948 Arab-Israeli War Jerusalem: The Three Roads to the Holy Land, 2002 historical adventure video game Spider Jerusalem, fictional character in the comic Transmetropolitan Council of Jerusalem, early Christian council held around the year 50 A metonym for the Israeli Government See also Yerushalmi (disambiguation) Al-Quds (disambiguation) Bayt al-Maqdis (disambiguation) Aleje Jerozolimskie (literally Jerusalem Avenues), a street in Warsaw, Poland East Jerusalem West Jerusalem Eyerusalem, given name
Emerging is the title of the only album by the Phil Keaggy Band, released in 1977 on NewSong Records. The album's release was delayed due to a shift in record pressing plant priorities following the death of Elvis Presley. The album was re-released on CD in 2000 as ReEmerging with one original track omitted and four newly recorded songs by the band members. Track listing (1977 release) All songs written by Phil Keaggy, unless otherwise noted. Side one "Theme" (Phil Madeira) – 1:25 (instrumental) "Where Is My Maker?" – 2:25 "Another Try" – 4:55 "Ryan's Song" (inspired by a poem by Bill Clarke) – 3:09 "Struck By the Love" (Madeira) – 5:43 (lead vocal: Phil Madeira) Side two "Turned On the Light" – 4:57 "Sorry" – 4:09 "Take a Look Around" – 5:16 "Gentle Eyes" – 5:29 (omitted from 2000 reissue) Track listing (2000 re-release) All songs written by Phil Keaggy, unless otherwise noted. "Theme" (Madeira) "Where Is My Maker?" "Another Try" "Ryan's Song" "Struck By the Love" (Madeira) "Turned On the Light" "Sorry" "Take a Look Around" "My Auburn Lady" - 4:26 "Mighty Lord" (Madeira) - 4:43 (lead vocal: Phil Madeira) "You're My Hero" (Andersen/Keaggy) - 4:04 (lead vocal: Terry Andersen) "Amelia Earhart's Last Flight" (McEnery) - 3:17 (lead vocal: Dan Cunningham) Personnel The Phil Keaggy Band Phil Keaggy – vocals, lead electric and acoustic guitar Lynn Nichols – vocals, electric guitar, acoustic guitar (lead on "Struck by the Love"), classical guitar Phil Madeira – vocals, piano, Hammond organ, Fender Rhodes, Micromoog and Polymoog synths Dan Cunningham – bass, vocals (CD reissue only) Terry Andersen – drums, vocals (CD reissue only) Additional musicians Karl Fruh – Cello on "Another Try" Ray Papai – Sax on "Sorry" Production notes Peter K. Hopper – producer Phil Keaggy – co-producer Gary Hedden – engineer, mixing References 1977 albums Phil Keaggy albums
Seb Seliyer (lit. "tin-worker") is an Indic language spoken by Roma in Iran; it is distinct from the closely related dialects spoken by other Roma in the country. The language has largely converged on Mazandarani, but its core vocabulary remains Indic. References Indo-Aryan languages Language Languages of Iran Romani in Iran
The 1953 NCAA baseball season, play of college baseball in the United States organized by the National Collegiate Athletic Association (NCAA), began in the spring of 1953. The season progressed through the regular season and concluded with the 1953 College World Series. The College World Series, held for the seventh time in 1953, consisted of one team from each of eight geographical districts and was held in Omaha, Nebraska, at Johnny Rosenblatt Stadium as a double-elimination tournament. Michigan claimed the championship. Conference winners This is a partial list of conference champions from the 1953 season. Each of the eight geographical districts chose, by various methods, the team that would represent them in the NCAA Tournament. Conference champions had to be chosen, unless all conference champions declined the bid. Conference standings The following is an incomplete list of conference standings: College World Series The 1953 season marked the seventh NCAA Baseball Tournament, which consisted of the eight-team College World Series. The College World Series was held in Omaha, Nebraska. Districts used a variety of selection methods for the event, from playoffs to a selection committee. District playoffs were not considered part of the NCAA Tournament, and the expansion to eight teams resulted in the end of regionals as they existed from 1947 through 1949. The eight teams played a double-elimination format, with Michigan claiming their first championship with a 7–5 win over Texas in the final. Award winners All-America team References
The California Capitol Christmas Tree (known as the California Capitol Holiday Tree between 1999 and 2003) is an annually erected decorated tree outside the California State Capitol in Sacramento, California, United States. Initiated in 1931, the tree has been a tradition ever since and is decorated during the second week of December each year. In December 1999, during his first Christmas season in office, Governor Gray Davis changed the name of the tree to the California Capitol Holiday Tree, a spokesperson describing this as a name that "more accurately symbolizes the diversity of what the holidays are in California". After the 2003 California recall, Gray Davis was replaced with Arnold Schwarzenegger, who changed the name back from the Christmas of 2004. A similar occurrence took place with the Capitol Christmas Tree in Washington, DC, which took on the "holiday" generic name from 1999–2005. References Individual Christmas trees Individual trees in California Culture of Sacramento, California Tourist attractions in Sacramento, California Recurring events established in 1931 1931 establishments in California Christmas in California
```kotlin
// This file was automatically generated from select-expression.md by Knit tool. Do not edit.
package kotlinx.coroutines.guide.exampleSelect05

import kotlinx.coroutines.*
import kotlinx.coroutines.channels.*
import kotlinx.coroutines.selects.*

fun CoroutineScope.switchMapDeferreds(input: ReceiveChannel<Deferred<String>>) = produce<String> {
    var current = input.receive() // start with first received deferred value
    while (isActive) { // loop while not cancelled/closed
        val next = select<Deferred<String>?> { // return next deferred value from this select or null
            input.onReceiveCatching { update ->
                update.getOrNull()
            }
            current.onAwait { value ->
                send(value) // send value that current deferred has produced
                input.receiveCatching().getOrNull() // and use the next deferred from the input channel
            }
        }
        if (next == null) {
            println("Channel was closed")
            break // out of loop
        } else {
            current = next
        }
    }
}

fun CoroutineScope.asyncString(str: String, time: Long) = async {
    delay(time)
    str
}

fun main() = runBlocking<Unit> {
    val chan = Channel<Deferred<String>>() // the channel for test
    launch { // launch printing coroutine
        for (s in switchMapDeferreds(chan))
            println(s) // print each received string
    }
    chan.send(asyncString("BEGIN", 100))
    delay(200) // enough time for "BEGIN" to be produced
    chan.send(asyncString("Slow", 500))
    delay(100) // not enough time to produce slow
    chan.send(asyncString("Replace", 100))
    delay(500) // give it time before the last one
    chan.send(asyncString("END", 500))
    delay(1000) // give it time to process
    chan.close() // close the channel ...
    delay(500) // and wait some time to let it finish
}
```
, also known as Doraemon Nights, is a 1991 Japanese animated science fantasy film which premiered on 9 March 1991 in Japan, based on the 11th volume of the same name of the Doraemon Long Stories series. It's the 12th Doraemon film. Plot Nobita Nobi and Doraemon experience the tale of Sinbad of the Arabian Nights fame firsthand using a storybook gadget, but Nobita becomes bored by just watching it from afar. He tries to invite Shizuka Minamoto to enter the storybooks of other tales and accidentally brings Takeshi "Gian" Goda and Suneo Honekawa along. Gian and Suneo mess up the storybooks to create a "fresh" tale, which causes Nobita and Shizuka to experience a mishmash of various tales that Shizuka dislikes. Attempting to leave, she is knocked out by Sinbad's magic carpet and falls into the desert. The next day, Doraemon realizes that Shizuka has gone missing and stages a rescue mission by going to 8th century Baghdad, during the reign of caliph Harun al-Rashid, after receiving a confirmation from the future that the world of the Arabian Nights does indeed coincide with the 8th century Abbasid Caliphate. Posing as foreign traders and servants, Nobita, Doraemon, Gian, and Suneo are rescued from Cassim and his bandits by the caliph himself, who gives a permit that allows them to travel from the port of Basra. Initially, the four are accompanied by Mikujin, a guide genie, but the latter goes upset when they insult him due to his incompetence and leaves. After purchasing a ship, however, the group are double-crossed by the trader, who reveals himself to be Cassim, and are thrown overboard. Waking up on the shore of the Arabian Desert, the four are forced to walk through it because Doraemon's pocket is lost during the storm that also crashed Cassim's ship. However, they are rescued by a gigantic genie commandeered by Sinbad, who reigns over a marvelous city in the desert presented by an anonymous time traveler from the future. 
With his magical gadgets, Sinbad helps the four rescue Shizuka from a bandit named Abdil. Vowing revenge, Abdil meets with Cassim and his two minions to search for the lost city. It is then revealed that Abdil was the only visitor to Sinbad's city who remembered its location: when Sinbad urged him to drink a memory potion after the visit, he spat it out. After arriving, Abdil and Cassim swiftly take the city from Sinbad, whom they expel. Mikujin returns and helps the group with Doraemon's pocket, which he recovered after the storm. The group eventually manage to defeat Abdil and Cassim and retake the city. Although Sinbad offers to let them remain by sparing them the memory potion, Nobita and his friends bid him farewell before returning to the present day. Cast Release The film was released in the theatres of Japan on 9 March 1991. The film was released theatrically in Spain by Luk Internacional S.A. on 25 June 2001. Video game A video game based on the film was released on the PC Engine on 6 December 1991 and the PC Engine CD on 29 May 1992. References External links Doraemon The Movie 25th page 1991 films 1991 anime films Anime and manga based on fairy tales Nobita in Dorabian Nights Films based on One Thousand and One Nights Films directed by Tsutomu Shibayama Films set in Baghdad Films set in the 8th century Films set in the 9th century Animated films about time travel Toho animated films Fictional caliphs Films scored by Shunsuke Kikuchi 1990s children's animated films Japanese children's fantasy films
Bangla Pokkho is a pro-Bangla advocacy organization that focuses on the rights of Bengalis in the Republic of India based on Bengali nationalism (not to be confused with Bangladeshi nationalism), and works against Hindi–Urdu cultural and linguistic imperialism and the forced domination of Hindustani speakers in West Bengal. It is organized along linguistic lines and aimed at protecting Bengali culture. It uses the Bengali slogan Joy Bangla. Demands and Protests Bangla Pokkho's principal demands are 100% reservation for residents of West Bengal in government jobs and 90% reservation in other employment sectors, education, the military, and administration. The group has held numerous meetings, gatherings and rallies in many places across West Bengal and Tripura. Bangla Pokkho has also demanded the use of the Bengali language in various examinations such as the Railways, JEE and NEET, and in banks, offices and other sectors. The group proposed that the top administrative officials in West Bengal must come from the WBCS/WBPS cadre and not the IAS/IPS. Bangla Pokkho protested when Bengali workers at WBSEDCL were dismissed for lacking fluency in English and Hindi; after the protests, the company reinstated them in their respective positions. A major controversy broke out when the second episode of the second season of the web series ‘Abhay’, released on the OTT platform Zee5, depicted the young Bengali freedom fighter Khudiram Bose as a ‘criminal’. Bangla Pokkho, along with many nationalist organisations, protested and demanded that the show be banned. The group sent a legal notice to have the scene removed, after which Zee5 edited and re-released the show, cutting out the Khudiram scene. The group opposed the CAA and NRC, which the central government planned to implement in Bengal; Bangla Pokkho said that the BJP had unlawfully passed the laws in parliament to break the unity of Hindus and Muslims. The group also protested against the proposed formation of Gorkhaland. 
They burned an effigy of Subramanian Swamy, who had proposed carving a separate state, Gorkhaland, out of West Bengal. Bangla Pokkho, along with Kanchanpur Nagarik Surakkha Mancha, held a large protest in Tripura that gathered more than 30,000 Bengalis, complaining against the social discrimination of Bengalis by the Tripura state BJP government. Major successes The West Bengal Public Service Commission, the body that conducts examinations for administrative posts in the Government of West Bengal, made Bengali mandatory with a 300-mark Bengali/Nepali language paper in its examination. Kolkata Metro Rail smart cards used only Hindi and English. Bangla Pokkho protested, pointing out the non-use of the Bengali language in a Bengali-speaking state and the use of Hindi to the exclusion of Bengali; the Bengali language was included on Kolkata Metro Rail smart cards after this protest. The SET examination, used to recruit college faculty, introduced Bengali as a medium. Major banks and ATMs started to use the Bengali language on bank forms, which were previously in Hindi and English only. Support and criticism As of January 2019, Bangla Pokkho claims to have 200,000–300,000 supporters. Supporters generally stand for the promotion and protection of Bengali culture. Bangla Pokkho is known to promote Hindu–Muslim unity among Bengalis. Maidul Islam, a political analyst and faculty member of the Centre for Studies in Social Sciences, terms this the rise of an organic, left-of-centre nationalism following the decline of the Left and its class politics. Many Bengali workers who live outside Bengal claim they face problems because of Bangla Pokkho's 'anti-Hindi' attitude. Some residents of West Bengal claim that Bangla Pokkho stays silent whenever the state's ruling party, the TMC, acts against Bengali people, and accuse it of targeting the BJP on the TMC's behalf as a 'B team' of the TMC. 
Critics point out that Bangla Pokkho remained silent on incidents such as the SSC scam and the Saradha scam, and argue that its main agenda is to run an anti-Hindi movement rather than to fight for Bengali people. The founder of Bangla Pokkho also demanded a separate flag for West Bengal in a video, a demand rejected by some Bengali people in his comment section. On several occasions, Bangla Pokkho has tried to speak up about the harassment faced by Bengali people at the hands of non-Bengalis, especially Hindi–Urdu speakers, in their native state of West Bengal. References External links Official website Political activism Language advocacy organizations Bengali nationalism Advocacy groups in India
```go
package batch

import "encoding/json"

// Index is a "key" mapper for batch requests. Its key is not special and
// should be treated the same as an index in a slice.
type Index struct {
	key int
}

// NewIndex creates an index. This should only be used by
// generated code or the creator of batches. The "Key" has no purpose and should
// not be used elsewhere.
func NewIndex(key int) Index {
	// TODO generate a hash to really mess with people trying to recreate batches?
	return Index{key: key}
}

// Create another type for Index so that MarshalText and UnmarshalText
// don't run into recursion issues.
type index Index

func (i Index) MarshalText() (text []byte, err error) {
	return json.Marshal(index(i))
}

func (i *Index) UnmarshalText(text []byte) error {
	return json.Unmarshal(text, (*index)(i))
}
```
This is a list of episodes of The Cramp Twins, a Cartoon Network Europe original animated series created by Brian Wood. The series aired on Cartoon Network Europe in European countries and on CBBC in the United Kingdom from 2001 to 2004, and on Cartoon Network in the United States from June 14, 2004 to 2005. Episodes Season 1 (2001) Season 2 References External links Lists of British animated television series episodes Lists of American children's animated television series episodes
151 South African Infantry Battalion was a motorised infantry unit of the South African Army. History Origin of the black battalions By the late 1970s the South African government had abandoned its opposition to arming black soldiers. By early 1979, the government approved a plan to form a number of regional African battalions, each with a particular ethnic identity, which would either serve in their homelands or under regional SADF commands. This led to the formation of 151 Battalion for the Southern Sotho people. Troops for 151 SA Battalion were recruited from the self-governing territory of Qwaqwa. Higher Command 151 Battalion fell under the command of Group 36. The battalion was responsible for patrolling the border between Lesotho and South Africa. Disbandment 151 SA Battalion was disbanded around 1994 and its members were assimilated into 1 South African Infantry Battalion and the new SANDF. Insignia Leadership Notes Peled, A. A Question of Loyalty: Military Manpower Policy in Multiethnic States, Cornell University Press, 1998, Chapter 2: South Africa: From Exclusion to Inclusion References Infantry battalions of South Africa Military units and formations of South Africa in the Border War Military units and formations established in 1980 Military units and formations disestablished in 1994
Guichenotia alba is a flowering plant in the family Malvaceae and is endemic to Western Australia. It is a slender, spreading shrub with lax, hairy young branches, leaves with the edges rolled under, and white flowers. Description Guichenotia alba is a slender, spreading shrub that typically grows to high and wide with many stems at the base but few branches. Its young branches are densely covered with woolly, star-shaped hairs. The leaves are long, the edges rolled under, on a petiole long with stipules up to two-thirds the length of the leaves. The leaves are densely woolly-hairy when young, but later glabrous. The flowers are borne singly, in pairs or groups of three, on a peduncle long, each flower on a pedicel long, with narrowly egg-shaped bracts long and bracteoles long at the base. The flowers are bell-shaped with five divided, petal-like sepals long, that are white on the outside, pale green inside. There are five tiny white, scale-like petals and the stamens are red. Flowering occurs in July and August and the fruit is a capsule in diameter. Taxonomy and naming Guichenotia alba was first formally described in 1992 by Greg Keighery and the description was published in the journal Nuytsia from specimens he collected near Cataby in 1988. The specific epithet (alba) means "white". Distribution and habitat This species of Guichenotia grows in heath, often in winter-wet areas, in a few places between Three Springs and Cataby in the Avon Wheatbelt, Geraldton Sandplains and Swan Coastal Plain bioregions of south-western Western Australia. Conservation status Guichenotia alba is listed as "Priority Three" by the Government of Western Australia Department of Biodiversity, Conservation and Attractions, meaning that it is poorly known and known from only a few locations but is not under imminent threat. References Malvales of Australia Rosids of Western Australia Plants described in 2003 alba Endemic flora of Southwest Australia
```java
Updating interfaces by using `default` methods
Using `synchronized` statements
There is no such thing as *pass-by-reference* in Java
Do not perform bitwise and arithmetic operations on the same data
Methods performing *Security Checks* must be declared `Private` or `Final`
```
Aarón Suárez Zúñiga (born 27 June 2002) is a Costa Rican professional footballer who plays as an attacking midfielder for Liga FPD club Alajuelense and the Costa Rica national team. Career Suárez, who hails from La Trinidad District, started his career at Deportivo Saprissa before joining Alajuelense in 2019. In 2020 he was promoted to the LD Alajuelense first team and made 8 appearances before joining Juventud Escazuceña on loan. International career Suárez debuted with the Costa Rica national team in a 1–0 2022 FIFA World Cup qualification loss to Canada on 13 November 2021. Honours Individual CONCACAF League Best Young Player: 2022 References External links 2002 births Living people Footballers from San José, Costa Rica Costa Rican men's footballers Costa Rica men's international footballers Men's association football midfielders Liga FPD players Liga Deportiva Alajuelense footballers 2023 CONCACAF Gold Cup players
```xml
<?xml version="1.0" encoding="utf-8"?> <xliff xmlns="urn:oasis:names:tc:xliff:document:1.2" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="1.2" xsi:schemaLocation="urn:oasis:names:tc:xliff:document:1.2 xliff-core-1.2-transitional.xsd"> <file datatype="xml" source-language="en" target-language="cs" original="../FSStrings.resx"> <body> <trans-unit id="ArgumentsInSigAndImplMismatch"> <source>The argument names in the signature '{0}' and implementation '{1}' do not match. The argument name from the signature file will be used. This may cause problems when debugging or profiling.</source> <target state="translated">Názvy argumentů v signatuře {0} a implementaci {1} si neodpovídají. Použije se název argumentu ze souboru signatury. To může způsobit problémy při ladění nebo profilování.</target> <note /> </trans-unit> <trans-unit id="ErrorFromAddingTypeEquationTuples"> <source>Type mismatch. Expecting a tuple of length {0} of type\n {1} \nbut given a tuple of length {2} of type\n {3} {4}\n</source> <target state="translated">Neshoda typů. Očekává se řazená kolekce členů o délce {0} typu\n {1} \nale odevzdala se řazená kolekce členů o délce {2} typu\n {3}{4}\n</target> <note /> </trans-unit> <trans-unit id="HashLoadedSourceHasIssues0"> <source>One or more informational messages in loaded file.\n</source> <target state="translated">Nejméně jedna informační zpráva v načteném souboru\n</target> <note /> </trans-unit> <trans-unit id="NotUpperCaseConstructorWithoutRQA"> <source>Lowercase discriminated union cases are only allowed when using RequireQualifiedAccess attribute</source> <target state="translated">Případy sjednocení s malými písmeny jsou povolené jenom při použití atributu RequireQualifiedAccess.</target> <note /> </trans-unit> <trans-unit id="OverrideShouldBeInstance"> <source> Non-static member is expected.</source> <target state="new"> Non-static member is expected.</target> <note /> </trans-unit> <trans-unit id="OverrideShouldBeStatic"> <source> Static member is expected.</source> <target state="new"> Static member 
is expected.</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOT.DOT.HAT"> <source>symbol '..^'</source> <target state="translated">symbol ..^</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERP.STRING.BEGIN.END"> <source>interpolated string</source> <target state="translated">interpolovan etzec</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERP.STRING.BEGIN.PART"> <source>interpolated string (first part)</source> <target state="translated">interpolovan etzec (prvn st)</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERP.STRING.END"> <source>interpolated string (final part)</source> <target state="translated">interpolovan etzec (posledn st)</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERP.STRING.PART"> <source>interpolated string (part)</source> <target state="translated">interpolovan etzec (st)</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.WHILE.BANG"> <source>keyword 'while!'</source> <target state="translated">klov slovo while!</target> <note /> </trans-unit> <trans-unit id="SeeAlso"> <source>. See also {0}.</source> <target state="translated">. 
Viz taky {0}.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverTupleDiffLengths"> <source>The tuples have differing lengths of {0} and {1}</source> <target state="translated">{0} a {1} maj v azench kolekcch len rznou dlku.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverInfiniteTypes"> <source>The types '{0}' and '{1}' cannot be unified.</source> <target state="translated">Typy {0} a {1} nemou bt sjednocen.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverMissingConstraint"> <source>A type parameter is missing a constraint '{0}'</source> <target state="translated">V parametru typu chyb omezen {0}.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverTypesNotInEqualityRelation1"> <source>The unit of measure '{0}' does not match the unit of measure '{1}'</source> <target state="translated">Mrn jednotka {0} se neshoduje s mrnou jednotkou {1}.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverTypesNotInEqualityRelation2"> <source>The type '{0}' does not match the type '{1}'</source> <target state="translated">Typ {0} se neshoduje s typem {1}.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverTypesNotInSubsumptionRelation"> <source>The type '{0}' is not compatible with the type '{1}'{2}</source> <target state="translated">Typ {0} nen kompatibiln s typem {1}{2}.</target> <note /> </trans-unit> <trans-unit id="ErrorFromAddingTypeEquation1"> <source>This expression was expected to have type\n '{1}' \nbut here has type\n '{0}' {2}</source> <target state="translated">U tohoto vrazu se oekval typ\n {1}, \nale tady je typu\n {0} {2}.</target> <note /> </trans-unit> <trans-unit id="ErrorFromAddingTypeEquation2"> <source>Type mismatch. Expecting a\n '{0}' \nbut given a\n '{1}' {2}\n</source> <target state="translated">Neshoda v typu. 
Oekvan typ je \n {0}, \nale pedvan je\n {1} {2}.\n</target> <note /> </trans-unit> <trans-unit id="ErrorFromApplyingDefault1"> <source>Type constraint mismatch when applying the default type '{0}' for a type inference variable. </source> <target state="translated">Neshoda v omezen typu pi pouit vchozho typu {0} na promnnou rozhran typu </target> <note /> </trans-unit> <trans-unit id="ErrorFromApplyingDefault2"> <source> Consider adding further type constraints</source> <target state="translated"> Zvate pidn dalch omezen typu.</target> <note /> </trans-unit> <trans-unit id="ErrorsFromAddingSubsumptionConstraint"> <source>Type constraint mismatch. The type \n '{0}' \nis not compatible with type\n '{1}' {2}\n</source> <target state="translated">Neshoda v omezen typu. Typ \n {0} \nnen kompatibiln s typem\n {1}. {2}\n</target> <note /> </trans-unit> <trans-unit id="UpperCaseIdentifierInPattern"> <source>Uppercase variable identifiers should not generally be used in patterns, and may indicate a missing open declaration or a misspelt pattern name.</source> <target state="translated">Identifiktory promnnch psan velkmi psmeny se ve vzorech obecn nedoporuuj. 
Mou oznaovat chybjc otevenou deklaraci nebo patn napsan nzev vzoru.</target> <note /> </trans-unit> <trans-unit id="NotUpperCaseConstructor"> <source>Discriminated union cases and exception labels must be uppercase identifiers</source> <target state="translated">Rozlien ppady typu union a popisky vjimek mus bt identifiktory, kter jsou psan velkmi psmeny.</target> <note /> </trans-unit> <trans-unit id="FunctionExpected"> <source>This function takes too many arguments, or is used in a context where a function is not expected</source> <target state="translated">Tato funkce pebr pli mnoho argument, nebo se pouv v kontextu, ve kterm se funkce neoekv.</target> <note /> </trans-unit> <trans-unit id="BakedInMemberConstraintName"> <source>Member constraints with the name '{0}' are given special status by the F# compiler as certain .NET types are implicitly augmented with this member. This may result in runtime failures if you attempt to invoke the member constraint from your own code.</source> <target state="translated">Kompiltor F# udlil omezenm lena s nzvem {0} zvltn statut, protoe nkter typy .NET jsou o tento len implicitn rozen. Pokud se budete pokouet vyvolat omezen lena z vlastnho kdu, me to zpsobit pote za bhu.</target> <note /> </trans-unit> <trans-unit id="BadEventTransformation"> <source>A definition to be compiled as a .NET event does not have the expected form. Only property members can be compiled as .NET events.</source> <target state="translated">Definice, kter se m zkompilovat jako udlost .NET, nem oekvanou podobu. 
Jako udlosti .NET se daj zkompilovat jenom lenov vlastnost.</target> <note /> </trans-unit> <trans-unit id="ParameterlessStructCtor"> <source>Implicit object constructors for structs must take at least one argument</source> <target state="translated">Implicitn konstruktory objektu pro struktury mus pebrat aspo jeden argument.</target> <note /> </trans-unit> <trans-unit id="InterfaceNotRevealed"> <source>The type implements the interface '{0}' but this is not revealed by the signature. You should list the interface in the signature, as the interface will be discoverable via dynamic type casts and/or reflection.</source> <target state="translated">Typ implementuje rozhran {0}, kter ale signatura neposkytuje. Mli byste toto rozhran uvst v signatue, aby bylo prostednictvm dynamickho petypovn nebo reflexe zjistiteln.</target> <note /> </trans-unit> <trans-unit id="TyconBadArgs"> <source>The type '{0}' expects {1} type argument(s) but is given {2}</source> <target state="translated">Poet argument typu, kter typ {0} oekv, je {1}, ale poet pedvanch je {2}.</target> <note /> </trans-unit> <trans-unit id="IndeterminateType"> <source>Lookup on object of indeterminate type based on information prior to this program point. A type annotation may be needed prior to this program point to constrain the type of the object. This may allow the lookup to be resolved.</source> <target state="translated">Definovali jste vyhledvn u objektu neuritho typu zaloenho na informacch ped tmto mstem v programu. Aby se typ objektu omezil, bude mon poteba pidat ped tmto mstem v programu poznmku typu. 
Tm se problm s vyhledvnm pravdpodobn vye.</target> <note /> </trans-unit> <trans-unit id="NameClash1"> <source>Duplicate definition of {0} '{1}'</source> <target state="translated">Duplicitn definice: {0} {1}</target> <note /> </trans-unit> <trans-unit id="NameClash2"> <source>The {0} '{1}' can not be defined because the name '{2}' clashes with the {3} '{4}' in this type or module</source> <target state="translated">{0} {1} se ned definovat, protoe nzev {2} a {3} {4} v tomto typu nebo modulu jsou v konfliktu.</target> <note /> </trans-unit> <trans-unit id="Duplicate1"> <source>Two members called '{0}' have the same signature</source> <target state="translated">Dva lenov s nzvem {0} maj stejnou signaturu.</target> <note /> </trans-unit> <trans-unit id="Duplicate2"> <source>Duplicate definition of {0} '{1}'</source> <target state="translated">Duplicitn definice: {0} {1}</target> <note /> </trans-unit> <trans-unit id="UndefinedName2"> <source> A construct with this name was found in FSharp.PowerPack.dll, which contains some modules and types that were implicitly referenced in some previous versions of F#. You may need to add an explicit reference to this DLL in order to compile this code.</source> <target state="translated"> Konstruktor s tmto nzvem se nael v knihovn FSharp.PowerPack.dll, kter obsahuje urit moduly a typy implicitn odkazovan v nkterch dvjch verzch F#. 
Abyste tento kd mohli zkompilovat, bude mon poteba pidat explicitn odkaz na tuto knihovnu DLL.</target> <note /> </trans-unit> <trans-unit id="FieldNotMutable"> <source>This field is not mutable</source> <target state="translated">Toto pole nen mniteln.</target> <note /> </trans-unit> <trans-unit id="FieldsFromDifferentTypes"> <source>The fields '{0}' and '{1}' are from different types</source> <target state="translated">Pole {0} a {1} jsou odlinho typu.</target> <note /> </trans-unit> <trans-unit id="VarBoundTwice"> <source>'{0}' is bound twice in this pattern</source> <target state="translated">{0} m v tomto vzoru dv vazby.</target> <note /> </trans-unit> <trans-unit id="Recursion"> <source>A use of the function '{0}' does not match a type inferred elsewhere. The inferred type of the function is\n {1}. \nThe type of the function required at this point of use is\n {2} {3}\nThis error may be due to limitations associated with generic recursion within a 'let rec' collection or within a group of classes. Consider giving a full type signature for the targets of recursive calls including type annotations for both argument and return types.</source> <target state="translated">Pouit funkce {0} neodpovd typu, kter se odvozuje na jinm mst. Odvozen typ funkce je\n {1}. \nTyp funkce, kter se na tomto mst poaduje, je \n {2} {3}.\nTato chyba me bt zpsoben omezenmi, kter jsou pidruen k obecn rekurzi v kolekci let rec nebo ve skupin td. 
Zvate monost zadat plnou signaturu typu pro cle rekurzivnch voln vetn poznmek typu pro argumenty i nvratovch typ.</target> <note /> </trans-unit> <trans-unit id="InvalidRuntimeCoercion"> <source>Invalid runtime coercion or type test from type {0} to {1}\n{2}</source> <target state="translated">Neplatn test typu nebo konverze za bhu z typu {0} na {1}\n{2}</target> <note /> </trans-unit> <trans-unit id="IndeterminateRuntimeCoercion"> <source>This runtime coercion or type test from type\n {0} \n to \n {1} \ninvolves an indeterminate type based on information prior to this program point. Runtime type tests are not allowed on some types. Further type annotations are needed.</source> <target state="translated">Tento test typu nebo konverze za bhu z typu\n {0} \n na \n {1} \nzahrnuje neurit typ zaloen na informacch ped tmto mstem v programu. Testy typu za bhu nejsou u nkterch typ povolen. Je poteba, abyste k typu doplnili dal poznmky.</target> <note /> </trans-unit> <trans-unit id="IndeterminateStaticCoercion"> <source>The static coercion from type\n {0} \nto \n {1} \n involves an indeterminate type based on information prior to this program point. Static coercions are not allowed on some types. Further type annotations are needed.</source> <target state="translated">Statick konverze z typu\n {0} \nna typ \n {1} \n zahrnuje neurit typ zaloen na informacch ped tmto mstem v programu. Statick konverze nejsou u nkterch typ povolen. Je poteba, abyste k typu doplnili dal poznmky.</target> <note /> </trans-unit> <trans-unit id="StaticCoercionShouldUseBox"> <source>A coercion from the value type \n {0} \nto the type \n {1} \nwill involve boxing. Consider using 'box' instead</source> <target state="translated">Pi konverzi z typu hodnoty \n {0} \nna typ \n {1} \nprobhne zabalen. 
Zvate monost pout msto toho klov slovo box.</target> <note /> </trans-unit> <trans-unit id="TypeIsImplicitlyAbstract"> <source>This type is 'abstract' since some abstract members have not been given an implementation. If this is intentional then add the '[&lt;AbstractClass&gt;]' attribute to your type.</source> <target state="translated">Tento typ je abstract, protoe se neimplementovali nkte abstraktn lenov. Pokud je to zmr, pak k typu pidejte atribut [&lt;AbstractClass&gt;].</target> <note /> </trans-unit> <trans-unit id="NonRigidTypar1"> <source>This construct causes code to be less generic than indicated by its type annotations. The type variable implied by the use of a '#', '_' or other type annotation at or near '{0}' has been constrained to be type '{1}'.</source> <target state="translated">Konstruktor zpsobuje, e kd je m obecn, ne udvaj jeho poznmky typu. Promnn typu odvozen pomoc #, _ nebo jin poznmky typu na pozici {0} nebo blzko n se omezila na typ {1}.</target> <note /> </trans-unit> <trans-unit id="NonRigidTypar2"> <source>This construct causes code to be less generic than indicated by the type annotations. The unit-of-measure variable '{0} has been constrained to be measure '{1}'.</source> <target state="translated">Tento konstruktor zpsobuje, e kd je m obecn, ne udvaj jeho poznmky typu. Promnn mrn jednotky {0} se omezila na mrnou jednotku {1}.</target> <note /> </trans-unit> <trans-unit id="NonRigidTypar3"> <source>This construct causes code to be less generic than indicated by the type annotations. The type variable '{0} has been constrained to be type '{1}'.</source> <target state="translated">Tento konstruktor zpsobuje, e kd je m obecn, ne udvaj jeho poznmky typu. 
Promnn typu {0} se omezila na typ {1}.</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.IDENT"> <source>identifier</source> <target state="translated">identifiktor</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INT"> <source>integer literal</source> <target state="translated">celoseln literl</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FLOAT"> <source>floating point literal</source> <target state="translated">literl s plovouc desetinnou rkou</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DECIMAL"> <source>decimal literal</source> <target state="translated">destkov literl</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.CHAR"> <source>character literal</source> <target state="translated">znakov literl</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BASE"> <source>keyword 'base'</source> <target state="translated">klov slovo base</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LPAREN.STAR.RPAREN"> <source>symbol '(*)'</source> <target state="translated">symbol (*)</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOLLAR"> <source>symbol '$'</source> <target state="translated">symbol $</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INFIX.STAR.STAR.OP"> <source>infix operator</source> <target state="translated">opertor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INFIX.COMPARE.OP"> <source>infix operator</source> <target state="translated">opertor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON.GREATER"> <source>symbol ':&gt;'</source> <target state="translated">symbol :&gt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON.COLON"> <source>symbol '::'</source> <target state="translated">symbol ::</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.PERCENT.OP"> <source>symbol '{0}</source> <target state="translated">symbol {0}</target> <note /> </trans-unit> <trans-unit 
id="Parser.TOKEN.INFIX.AT.HAT.OP"> <source>infix operator</source> <target state="translated">opertor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INFIX.BAR.OP"> <source>infix operator</source> <target state="translated">opertor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.PLUS.MINUS.OP"> <source>infix operator</source> <target state="translated">opertor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.PREFIX.OP"> <source>prefix operator</source> <target state="translated">opertor pedpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON.QMARK.GREATER"> <source>symbol ':?&gt;'</source> <target state="translated">symbol :?&gt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INFIX.STAR.DIV.MOD.OP"> <source>infix operator</source> <target state="translated">opertor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INFIX.AMP.OP"> <source>infix operator</source> <target state="translated">opertor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.AMP"> <source>symbol '&amp;'</source> <target state="translated">symbol &amp;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.AMP.AMP"> <source>symbol '&amp;&amp;'</source> <target state="translated">symbol &amp;&amp;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BAR.BAR"> <source>symbol '||'</source> <target state="translated">symbol ||</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LESS"> <source>symbol '&lt;'</source> <target state="translated">symbol &lt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.GREATER"> <source>symbol '&gt;'</source> <target state="translated">symbol &gt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.QMARK"> <source>symbol '?'</source> <target state="translated">symbol ?</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.QMARK.QMARK"> <source>symbol '??'</source> <target state="translated">symbol 
??</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON.QMARK"> <source>symbol ':?'</source> <target state="translated">symbol :?</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INT32.DOT.DOT"> <source>integer..</source> <target state="translated">cel slo..</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOT.DOT"> <source>symbol '..'</source> <target state="translated">symbol ..</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.QUOTE"> <source>quote symbol</source> <target state="translated">symbol citace</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.STAR"> <source>symbol '*'</source> <target state="translated">symbol *</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.HIGH.PRECEDENCE.TYAPP"> <source>type application </source> <target state="translated">pouit typu </target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON"> <source>symbol ':'</source> <target state="translated">symbol :</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON.EQUALS"> <source>symbol ':='</source> <target state="translated">symbol :=</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LARROW"> <source>symbol '&lt;-'</source> <target state="translated">symbol &lt;-</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.EQUALS"> <source>symbol '='</source> <target state="translated">symbol =</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.GREATER.BAR.RBRACK"> <source>symbol '&gt;|]'</source> <target state="translated">symbol &gt;|]</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MINUS"> <source>symbol '-'</source> <target state="translated">symbol -</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ADJACENT.PREFIX.OP"> <source>prefix operator</source> <target state="translated">opertor pedpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FUNKY.OPERATOR.NAME"> <source>operator name</source> <target state="translated">nzev 
operátora</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COMMA"> <source>symbol ','</source> <target state="translated">symbol ,</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOT"> <source>symbol '.'</source> <target state="translated">symbol .</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BAR"> <source>symbol '|'</source> <target state="translated">symbol |</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.HASH"> <source>symbol #</source> <target state="translated">symbol #</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.UNDERSCORE"> <source>symbol '_'</source> <target state="translated">symbol _</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.SEMICOLON"> <source>symbol ';'</source> <target state="translated">symbol ;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.SEMICOLON.SEMICOLON"> <source>symbol ';;'</source> <target state="translated">symbol ;;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LPAREN"> <source>symbol '('</source> <target state="translated">symbol (</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RPAREN"> <source>symbol ')'</source> <target state="translated">symbol )</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.SPLICE.SYMBOL"> <source>symbol 'splice'</source> <target state="translated">symbol splice</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LQUOTE"> <source>start of quotation</source> <target state="translated">začátek citace</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LBRACK"> <source>symbol '['</source> <target state="translated">symbol [</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LBRACK.BAR"> <source>symbol '[|'</source> <target state="translated">symbol [|</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LBRACK.LESS"> <source>symbol '[&lt;'</source> <target state="translated">symbol [&lt;</target> <note /> </trans-unit> <trans-unit 
id="Parser.TOKEN.LBRACE"> <source>symbol '{'</source> <target state="translated">symbol {</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LBRACE.LESS"> <source>symbol '{&lt;'</source> <target state="translated">symbol {&lt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BAR.RBRACK"> <source>symbol '|]'</source> <target state="translated">symbol |]</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.GREATER.RBRACE"> <source>symbol '&gt;}'</source> <target state="translated">symbol &gt;}</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.GREATER.RBRACK"> <source>symbol '&gt;]'</source> <target state="translated">symbol &gt;]</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RQUOTE"> <source>end of quotation</source> <target state="translated">konec citace</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RBRACK"> <source>symbol ']'</source> <target state="translated">symbol ]</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RBRACE"> <source>symbol '}'</source> <target state="translated">symbol }</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.PUBLIC"> <source>keyword 'public'</source> <target state="translated">klíčové slovo public</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.PRIVATE"> <source>keyword 'private'</source> <target state="translated">klíčové slovo private</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERNAL"> <source>keyword 'internal'</source> <target state="translated">klíčové slovo internal</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FIXED"> <source>keyword 'fixed'</source> <target state="translated">klíčové slovo fixed</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.CONSTRAINT"> <source>keyword 'constraint'</source> <target state="translated">klíčové slovo constraint</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INSTANCE"> <source>keyword 'instance'</source> <target state="translated">klíčové slovo 
instance</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DELEGATE"> <source>keyword 'delegate'</source> <target state="translated">klíčové slovo delegate</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INHERIT"> <source>keyword 'inherit'</source> <target state="translated">klíčové slovo inherit</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.CONSTRUCTOR"> <source>keyword 'constructor'</source> <target state="translated">klíčové slovo constructor</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DEFAULT"> <source>keyword 'default'</source> <target state="translated">klíčové slovo default</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OVERRIDE"> <source>keyword 'override'</source> <target state="translated">klíčové slovo override</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ABSTRACT"> <source>keyword 'abstract'</source> <target state="translated">klíčové slovo abstract</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.CLASS"> <source>keyword 'class'</source> <target state="translated">klíčové slovo class</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MEMBER"> <source>keyword 'member'</source> <target state="translated">klíčové slovo member</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.STATIC"> <source>keyword 'static'</source> <target state="translated">klíčové slovo static</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.NAMESPACE"> <source>keyword 'namespace'</source> <target state="translated">klíčové slovo namespace</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OBLOCKBEGIN"> <source>start of structured construct</source> <target state="translated">začátek strukturovaného konstruktoru</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OBLOCKEND"> <source>incomplete structured construct at or before this point</source> <target state="translated">neúplný strukturovaný konstruktor na této pozici nebo před ní</target> <note /> </trans-unit> <trans-unit 
id="BlockEndSentence"> <source>Incomplete structured construct at or before this point</source> <target state="translated">Neúplný strukturovaný konstruktor na této pozici nebo před ní</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OTHEN"> <source>keyword 'then'</source> <target state="translated">klíčové slovo then</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OELSE"> <source>keyword 'else'</source> <target state="translated">klíčové slovo else</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OLET"> <source>keyword 'let' or 'use'</source> <target state="translated">klíčové slovo let nebo use</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BINDER"> <source>binder keyword</source> <target state="translated">klíčové slovo vazače</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ODO"> <source>keyword 'do'</source> <target state="translated">klíčové slovo do</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.CONST"> <source>keyword 'const'</source> <target state="translated">klíčové slovo const</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OWITH"> <source>keyword 'with'</source> <target state="translated">klíčové slovo with</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OFUNCTION"> <source>keyword 'function'</source> <target state="translated">klíčové slovo function</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OFUN"> <source>keyword 'fun'</source> <target state="translated">klíčové slovo fun</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ORESET"> <source>end of input</source> <target state="translated">konec vstupu</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ODUMMY"> <source>internal dummy token</source> <target state="translated">interní fiktivní token</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ODO.BANG"> <source>keyword 'do!'</source> <target state="translated">klíčové slovo do!</target> <note /> </trans-unit> <trans-unit 
id="Parser.TOKEN.YIELD"> <source>yield</source> <target state="translated">yield</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.YIELD.BANG"> <source>yield!</source> <target state="translated">yield!</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OINTERFACE.MEMBER"> <source>keyword 'interface'</source> <target state="translated">klíčové slovo interface</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ELIF"> <source>keyword 'elif'</source> <target state="translated">klíčové slovo elif</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RARROW"> <source>symbol '-&gt;'</source> <target state="translated">symbol -&gt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.SIG"> <source>keyword 'sig'</source> <target state="translated">klíčové slovo sig</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.STRUCT"> <source>keyword 'struct'</source> <target state="translated">klíčové slovo struct</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.UPCAST"> <source>keyword 'upcast'</source> <target state="translated">klíčové slovo upcast</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOWNCAST"> <source>keyword 'downcast'</source> <target state="translated">klíčové slovo downcast</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.NULL"> <source>keyword 'null'</source> <target state="translated">klíčové slovo null</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RESERVED"> <source>reserved keyword</source> <target state="translated">vyhrazené klíčové slovo</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MODULE"> <source>keyword 'module'</source> <target state="translated">klíčové slovo module</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.AND"> <source>keyword 'and'</source> <target state="translated">klíčové slovo and</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.AS"> <source>keyword 'as'</source> <target state="translated">klíčové slovo as</target> <note /> 
</trans-unit> <trans-unit id="Parser.TOKEN.ASSERT"> <source>keyword 'assert'</source> <target state="translated">klíčové slovo assert</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ASR"> <source>keyword 'asr'</source> <target state="translated">klíčové slovo asr</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOWNTO"> <source>keyword 'downto'</source> <target state="translated">klíčové slovo downto</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.EXCEPTION"> <source>keyword 'exception'</source> <target state="translated">klíčové slovo exception</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FALSE"> <source>keyword 'false'</source> <target state="translated">klíčové slovo false</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FOR"> <source>keyword 'for'</source> <target state="translated">klíčové slovo for</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FUN"> <source>keyword 'fun'</source> <target state="translated">klíčové slovo fun</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FUNCTION"> <source>keyword 'function'</source> <target state="translated">klíčové slovo function</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FINALLY"> <source>keyword 'finally'</source> <target state="translated">klíčové slovo finally</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LAZY"> <source>keyword 'lazy'</source> <target state="translated">klíčové slovo lazy</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MATCH"> <source>keyword 'match'</source> <target state="translated">klíčové slovo match</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MATCH.BANG"> <source>keyword 'match!'</source> <target state="translated">klíčové slovo match!</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MUTABLE"> <source>keyword 'mutable'</source> <target state="translated">klíčové slovo mutable</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.NEW"> <source>keyword 
'new'</source> <target state="translated">klíčové slovo new</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OF"> <source>keyword 'of'</source> <target state="translated">klíčové slovo of</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OPEN"> <source>keyword 'open'</source> <target state="translated">klíčové slovo open</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OR"> <source>keyword 'or'</source> <target state="translated">klíčové slovo or</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.VOID"> <source>keyword 'void'</source> <target state="translated">klíčové slovo void</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.EXTERN"> <source>keyword 'extern'</source> <target state="translated">klíčové slovo extern</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERFACE"> <source>keyword 'interface'</source> <target state="translated">klíčové slovo interface</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.REC"> <source>keyword 'rec'</source> <target state="translated">klíčové slovo rec</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.TO"> <source>keyword 'to'</source> <target state="translated">klíčové slovo to</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.TRUE"> <source>keyword 'true'</source> <target state="translated">klíčové slovo true</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.TRY"> <source>keyword 'try'</source> <target state="translated">klíčové slovo try</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.TYPE"> <source>keyword 'type'</source> <target state="translated">klíčové slovo type</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.VAL"> <source>keyword 'val'</source> <target state="translated">klíčové slovo val</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INLINE"> <source>keyword 'inline'</source> <target state="translated">klíčové slovo inline</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.WHEN"> <source>keyword 
'when'</source> <target state="translated">klíčové slovo when</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.WHILE"> <source>keyword 'while'</source> <target state="translated">klíčové slovo while</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.WITH"> <source>keyword 'with'</source> <target state="translated">klíčové slovo with</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.IF"> <source>keyword 'if'</source> <target state="translated">klíčové slovo if</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DO"> <source>keyword 'do'</source> <target state="translated">klíčové slovo do</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.GLOBAL"> <source>keyword 'global'</source> <target state="translated">klíčové slovo global</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DONE"> <source>keyword 'done'</source> <target state="translated">klíčové slovo done</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.IN"> <source>keyword 'in'</source> <target state="translated">klíčové slovo in</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.HIGH.PRECEDENCE.PAREN.APP"> <source>symbol '('</source> <target state="translated">symbol (</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.HIGH.PRECEDENCE.BRACK.APP"> <source>symbol'['</source> <target state="translated">symbol [</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BEGIN"> <source>keyword 'begin'</source> <target state="translated">klíčové slovo begin</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.END"> <source>keyword 'end'</source> <target state="translated">klíčové slovo end</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.HASH.ENDIF"> <source>directive</source> <target state="translated">direktiva</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INACTIVECODE"> <source>inactive code</source> <target state="translated">neaktivní kód</target> <note /> </trans-unit> <trans-unit 
id="Parser.TOKEN.LEX.FAILURE"> <source>lex failure</source> <target state="translated">selhání lex</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.WHITESPACE"> <source>whitespace</source> <target state="translated">prázdný znak</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COMMENT"> <source>comment</source> <target state="translated">Komentář</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LINE.COMMENT"> <source>line comment</source> <target state="translated">řádkový komentář</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.STRING.TEXT"> <source>string text</source> <target state="translated">text řetězce</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.KEYWORD_STRING"> <source>compiler generated literal</source> <target state="translated">literál generovaný kompilátorem</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BYTEARRAY"> <source>byte array literal</source> <target state="translated">literál bajtového pole</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.STRING"> <source>string literal</source> <target state="translated">řetězcový literál</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.EOF"> <source>end of input</source> <target state="translated">konec vstupu</target> <note /> </trans-unit> <trans-unit id="UnexpectedEndOfInput"> <source>Unexpected end of input</source> <target state="translated">Neočekávaný konec vstupu</target> <note /> </trans-unit> <trans-unit id="Unexpected"> <source>Unexpected {0}</source> <target state="translated">Neočekávaný {0}</target> <note /> </trans-unit> <trans-unit id="NONTERM.interaction"> <source> in interaction</source> <target state="translated"> v interakci</target> <note /> </trans-unit> <trans-unit id="NONTERM.hashDirective"> <source> in directive</source> <target state="translated"> v direktivě</target> <note /> </trans-unit> <trans-unit id="NONTERM.fieldDecl"> <source> in field declaration</source> <target state="translated"> v deklaraci 
pole</target> <note /> </trans-unit> <trans-unit id="NONTERM.unionCaseRepr"> <source> in discriminated union case declaration</source> <target state="translated"> v deklaracích rozlišených případů typu union</target> <note /> </trans-unit> <trans-unit id="NONTERM.localBinding"> <source> in binding</source> <target state="translated"> ve vazbě</target> <note /> </trans-unit> <trans-unit id="NONTERM.hardwhiteLetBindings"> <source> in binding</source> <target state="translated"> ve vazbě</target> <note /> </trans-unit> <trans-unit id="NONTERM.classDefnMember"> <source> in member definition</source> <target state="translated"> v definici člena</target> <note /> </trans-unit> <trans-unit id="NONTERM.defnBindings"> <source> in definitions</source> <target state="translated"> v definicích</target> <note /> </trans-unit> <trans-unit id="NONTERM.classMemberSpfn"> <source> in member signature</source> <target state="translated"> v signatuře člena</target> <note /> </trans-unit> <trans-unit id="NONTERM.valSpfn"> <source> in value signature</source> <target state="translated"> v signatuře hodnoty</target> <note /> </trans-unit> <trans-unit id="NONTERM.tyconSpfn"> <source> in type signature</source> <target state="translated"> v signatuře typu</target> <note /> </trans-unit> <trans-unit id="NONTERM.anonLambdaExpr"> <source> in lambda expression</source> <target state="translated"> ve výrazu lambda</target> <note /> </trans-unit> <trans-unit id="NONTERM.attrUnionCaseDecl"> <source> in union case</source> <target state="translated"> v případu typu union</target> <note /> </trans-unit> <trans-unit id="NONTERM.cPrototype"> <source> in extern declaration</source> <target state="translated"> v externí deklaraci</target> <note /> </trans-unit> <trans-unit id="NONTERM.objectImplementationMembers"> <source> in object expression</source> <target state="translated"> v objektovém výrazu</target> <note /> </trans-unit> <trans-unit id="NONTERM.ifExprCases"> <source> in if/then/else expression</source> <target 
state="translated"> ve výrazu if/then/else</target> <note /> </trans-unit> <trans-unit id="NONTERM.openDecl"> <source> in open declaration</source> <target state="translated"> v otevřené deklaraci</target> <note /> </trans-unit> <trans-unit id="NONTERM.fileModuleSpec"> <source> in module or namespace signature</source> <target state="translated"> v signatuře oboru názvů nebo modulu</target> <note /> </trans-unit> <trans-unit id="NONTERM.patternClauses"> <source> in pattern matching</source> <target state="translated"> v porovnávání vzorů</target> <note /> </trans-unit> <trans-unit id="NONTERM.beginEndExpr"> <source> in begin/end expression</source> <target state="translated"> ve výrazu begin/end</target> <note /> </trans-unit> <trans-unit id="NONTERM.recdExpr"> <source> in record expression</source> <target state="translated"> ve výrazu záznamu</target> <note /> </trans-unit> <trans-unit id="NONTERM.tyconDefn"> <source> in type definition</source> <target state="translated"> v definici typu</target> <note /> </trans-unit> <trans-unit id="NONTERM.exconCore"> <source> in exception definition</source> <target state="translated"> v definici výjimky</target> <note /> </trans-unit> <trans-unit id="NONTERM.typeNameInfo"> <source> in type name</source> <target state="translated"> v názvu typu</target> <note /> </trans-unit> <trans-unit id="NONTERM.attributeList"> <source> in attribute list</source> <target state="translated"> v seznamu atributů</target> <note /> </trans-unit> <trans-unit id="NONTERM.quoteExpr"> <source> in quotation literal</source> <target state="translated"> v literálu citace</target> <note /> </trans-unit> <trans-unit id="NONTERM.typeConstraint"> <source> in type constraint</source> <target state="translated"> v omezení typu</target> <note /> </trans-unit> <trans-unit id="NONTERM.Category.ImplementationFile"> <source> in implementation file</source> <target state="translated"> v souboru implementace</target> <note /> </trans-unit> <trans-unit 
id="NONTERM.Category.Definition"> <source> in definition</source> <target state="translated"> v definici</target> <note /> </trans-unit> <trans-unit id="NONTERM.Category.SignatureFile"> <source> in signature file</source> <target state="translated"> v souboru signatury</target> <note /> </trans-unit> <trans-unit id="NONTERM.Category.Pattern"> <source> in pattern</source> <target state="translated"> ve vzoru</target> <note /> </trans-unit> <trans-unit id="NONTERM.Category.Expr"> <source> in expression</source> <target state="translated"> ve výrazu</target> <note /> </trans-unit> <trans-unit id="NONTERM.Category.Type"> <source> in type</source> <target state="translated"> v typu</target> <note /> </trans-unit> <trans-unit id="NONTERM.typeArgsActual"> <source> in type arguments</source> <target state="translated"> v argumentech typu</target> <note /> </trans-unit> <trans-unit id="FixKeyword"> <source>keyword </source> <target state="translated">klíčové slovo </target> <note /> </trans-unit> <trans-unit id="FixSymbol"> <source>symbol </source> <target state="translated">symbol </target> <note /> </trans-unit> <trans-unit id="FixReplace"> <source> (due to indentation-aware syntax)</source> <target state="translated"> (kvůli syntaxi, která reflektuje odsazení)</target> <note /> </trans-unit> <trans-unit id="TokenName1"> <source>. Expected {0} or other token.</source> <target state="translated">. Očekával se token {0} nebo nějaký jiný.</target> <note /> </trans-unit> <trans-unit id="TokenName1TokenName2"> <source>. Expected {0}, {1} or other token.</source> <target state="translated">. Očekával se token {0}, {1} nebo nějaký jiný.</target> <note /> </trans-unit> <trans-unit id="TokenName1TokenName2TokenName3"> <source>. Expected {0}, {1}, {2} or other token.</source> <target state="translated">. 
Očekával se token {0}, {1}, {2} nebo nějaký jiný.</target> <note /> </trans-unit> <trans-unit id="RuntimeCoercionSourceSealed1"> <source>The type '{0}' cannot be used as the source of a type test or runtime coercion</source> <target state="translated">Typ {0} se jako zdroj testu typu nebo konverze za běhu použít nedá.</target> <note /> </trans-unit> <trans-unit id="RuntimeCoercionSourceSealed2"> <source>The type '{0}' does not have any proper subtypes and cannot be used as the source of a type test or runtime coercion.</source> <target state="translated">Typ {0} nemá žádné správné podtypy a nedá se použít jako zdroj testu typu nebo konverze za běhu.</target> <note /> </trans-unit> <trans-unit id="CoercionTargetSealed"> <source>The type '{0}' does not have any proper subtypes and need not be used as the target of a static coercion</source> <target state="translated">Typ {0} nemá žádné správné podtypy a není potřeba ho používat jako cíl statické konverze.</target> <note /> </trans-unit> <trans-unit id="UpcastUnnecessary"> <source>This upcast is unnecessary - the types are identical</source> <target state="translated">Toto přetypování směrem nahoru není nutné: oba typy jsou identické.</target> <note /> </trans-unit> <trans-unit id="TypeTestUnnecessary"> <source>This type test or downcast will always hold</source> <target state="translated">Tento test typu nebo přetypování směrem dolů se vždycky uloží.</target> <note /> </trans-unit> <trans-unit id="OverrideDoesntOverride1"> <source>The member '{0}' does not have the correct type to override any given virtual method</source> <target state="translated">Člen {0} není správného typu, aby mohl přepsat libovolné předané virtuální metody.</target> <note /> </trans-unit> <trans-unit id="OverrideDoesntOverride2"> <source>The member '{0}' does not have the correct type to override the corresponding abstract method.</source> <target state="translated">Člen {0} není správného typu, aby mohl přepsat odpovídající abstraktní metodu.</target> <note /> </trans-unit> <trans-unit 
id="OverrideDoesntOverride3"> <source> The required signature is '{0}'.</source> <target state="translated"> Požadovaná signatura je {0}.</target> <note /> </trans-unit> <trans-unit id="OverrideDoesntOverride4"> <source>The member '{0}' is specialized with 'unit' but 'unit' can't be used as return type of an abstract method parameterized on return type.</source> <target state="translated">Člen {0} je specializovaný s typem unit, ale typ unit není možné použít jako návratový typ abstraktní metody parametrizované u návratového typu.</target> <note /> </trans-unit> <trans-unit id="UnionCaseWrongArguments"> <source>This constructor is applied to {0} argument(s) but expects {1}</source> <target state="translated">Počet argumentů, pro které se používá tento konstruktor, je {0}, ale očekávaný počet je {1}.</target> <note /> </trans-unit> <trans-unit id="UnionPatternsBindDifferentNames"> <source>The two sides of this 'or' pattern bind different sets of variables</source> <target state="translated">Obě strany tohoto vzoru or vážou jinou sadu proměnných.</target> <note /> </trans-unit> <trans-unit id="ValueNotContained"> <source>Module '{0}' contains\n {1} \nbut its signature specifies\n {2} \n{3}.</source> <target state="translated">Modul {0} obsahuje hodnotu\n {1}, \nale jeho signatura definuje hodnotu\n {2}. 
\n{3}.</target> <note /> </trans-unit> <trans-unit id="RequiredButNotSpecified"> <source>Module '{0}' requires a {1} '{2}'</source> <target state="translated">Modul {0} vyžaduje {1} {2}.</target> <note /> </trans-unit> <trans-unit id="UseOfAddressOfOperator"> <source>The use of native pointers may result in unverifiable .NET IL code</source> <target state="translated">Použití nativních ukazatelů může způsobit vygenerování neověřitelného kódu .NET IL.</target> <note /> </trans-unit> <trans-unit id="DefensiveCopyWarning"> <source>{0}</source> <target state="translated">{0}</target> <note /> </trans-unit> <trans-unit id="DeprecatedThreadStaticBindingWarning"> <source>Thread static and context static 'let' bindings are deprecated. Instead use a declaration of the form 'static val mutable &lt;ident&gt; : &lt;type&gt;' in a class. Add the 'DefaultValue' attribute to this declaration to indicate that the value is initialized to the default value on each new thread.</source> <target state="translated">Vazby let, které jsou statické na úrovni vlákna nebo kontextu, jsou zastaralé. Použijte místo nich deklaraci ve třídě v podobě static val mutable &lt;ident&gt; : &lt;type&gt;. Přidáním atributu DefaultValue do této deklarace můžete určit, že se bude hodnota v každém novém vláknu inicializovat na výchozí hodnotu.</target> <note /> </trans-unit> <trans-unit id="FunctionValueUnexpected"> <source>This expression is a function value, i.e. is missing arguments. Its type is {0}.</source> <target state="translated">Tento výraz je hodnotou funkce, to znamená, že u něj chybí argumenty. Je typu {0}.</target> <note /> </trans-unit> <trans-unit id="UnitTypeExpected"> <source>The result of this expression has type '{0}' and is implicitly ignored. Consider using 'ignore' to discard this value explicitly, e.g. 'expr |&gt; ignore', or 'let' to bind the result to a name, e.g. 'let result = expr'.</source> <target state="translated">Výsledek tohoto výrazu má typ {0} a implicitně se ignoruje. 
Zvažte možnost zahodit tuto hodnotu explicitně pomocí klíčového slova ignore (například výraz |&gt; ignore) nebo vytvořit vazbu výsledku na název pomocí klíčového slova let (například let výsledek = výraz).</target> <note /> </trans-unit> <trans-unit id="UnitTypeExpectedWithEquality"> <source>The result of this equality expression has type '{0}' and is implicitly discarded. Consider using 'let' to bind the result to a name, e.g. 'let result = expression'.</source> <target state="translated">Výsledek tohoto výrazu rovnosti má typ {0} a implicitně se zruší. Zvažte vytvoření vazby mezi výsledkem a názvem pomocí klíčového slova let, např. let výsledek = výraz.</target> <note /> </trans-unit> <trans-unit id="UnitTypeExpectedWithPossiblePropertySetter"> <source>The result of this equality expression has type '{0}' and is implicitly discarded. Consider using 'let' to bind the result to a name, e.g. 'let result = expression'. If you intended to set a value to a property, then use the '&lt;-' operator e.g. '{1}.{2} &lt;- expression'.</source> <target state="translated">Výsledek tohoto výrazu rovnosti má typ {0} a implicitně se zruší. Zvažte vytvoření vazby mezi výsledkem a názvem pomocí klíčového slova let, například let výsledek = výraz. Pokud jste chtěli nastavit hodnotu na vlastnost, použijte operátor &lt;-, například {1}.{2} &lt;- výraz.</target> <note /> </trans-unit> <trans-unit id="UnitTypeExpectedWithPossibleAssignment"> <source>The result of this equality expression has type '{0}' and is implicitly discarded. Consider using 'let' to bind the result to a name, e.g. 'let result = expression'. If you intended to mutate a value, then mark the value 'mutable' and use the '&lt;-' operator e.g. '{1} &lt;- expression'.</source> <target state="translated">Výsledek tohoto výrazu rovnosti má typ {0} a implicitně se zruší. Zvažte vytvoření vazby mezi výsledkem a názvem pomocí klíčového slova let, například let výsledek = výraz. 
Pokud jste chtěli mutovat hodnotu, označte hodnotu jako mutable a použijte operátor &lt;-, například {1} &lt;- výraz.</target> <note /> </trans-unit> <trans-unit id="UnitTypeExpectedWithPossibleAssignmentToMutable"> <source>The result of this equality expression has type '{0}' and is implicitly discarded. Consider using 'let' to bind the result to a name, e.g. 'let result = expression'. If you intended to mutate a value, then use the '&lt;-' operator e.g. '{1} &lt;- expression'.</source> <target state="translated">Výsledek tohoto výrazu rovnosti má typ {0} a implicitně se zruší. Zvažte vytvoření vazby mezi výsledkem a názvem pomocí klíčového slova let, například let výsledek = výraz. Pokud jste chtěli mutovat hodnotu, použijte operátor &lt;-, například {1} &lt;- výraz.</target> <note /> </trans-unit> <trans-unit id="RecursiveUseCheckedAtRuntime"> <source>This recursive use will be checked for initialization-soundness at runtime. This warning is usually harmless, and may be suppressed by using '#nowarn "21"' or '--nowarn:21'.</source> <target state="translated">U tohoto rekurzivního použití se bude kontrolovat stabilita inicializace za běhu. Toto upozornění je obvykle neškodné a pomocí #nowarn "21" nebo --nowarn:21 se dá potlačit.</target> <note /> </trans-unit> <trans-unit id="LetRecUnsound1"> <source>The value '{0}' will be evaluated as part of its own definition</source> <target state="translated">Hodnota {0} se vyhodnotí v rámci její vlastní definice.</target> <note /> </trans-unit> <trans-unit id="LetRecUnsound2"> <source>This value will be eventually evaluated as part of its own definition. You may need to make the value lazy or a function. Value '{0}'{1}.</source> <target state="translated">Tato hodnota se nakonec vyhodnotí v rámci její vlastní definice. Možná bude potřeba, abyste hodnotu změnili na opožděnou nebo na funkci. 
Hodnota {0}{1}.</target> <note /> </trans-unit> <trans-unit id="LetRecUnsoundInner"> <source> will evaluate '{0}'</source> <target state="translated"> se vyhodnotí jako {0}</target> <note /> </trans-unit> <trans-unit id="LetRecEvaluatedOutOfOrder"> <source>Bindings may be executed out-of-order because of this forward reference.</source> <target state="translated">Vazby se můžou kvůli tomuto dopřednému odkazu provádět ve špatném pořadí.</target> <note /> </trans-unit> <trans-unit id="LetRecCheckedAtRuntime"> <source>This and other recursive references to the object(s) being defined will be checked for initialization-soundness at runtime through the use of a delayed reference. This is because you are defining one or more recursive objects, rather than recursive functions. This warning may be suppressed by using '#nowarn "40"' or '--nowarn:40'.</source> <target state="translated">U tohoto a dalších rekurzivních odkazů na definované objekty se bude kontrolovat stabilita inicializace za běhu pomocí zpožděného odkazování. Je to kvůli tomu, že definujete rekurzivní objekty místo rekurzivních funkcí. Toto upozornění se dá pomocí #nowarn "40" nebo --nowarn:40 potlačit.</target> <note /> </trans-unit> <trans-unit id="SelfRefObjCtor1"> <source>Recursive references to the object being defined will be checked for initialization soundness at runtime through the use of a delayed reference. Consider placing self-references in members or within a trailing expression of the form '&lt;ctor-expr&gt; then &lt;expr&gt;'.</source> <target state="translated">U rekurzivních odkazů na definované objekty se bude kontrolovat stabilita inicializace za běhu pomocí zpožděného odkazování. Zvažte možnost přidat odkazy na sebe sama do členů nebo koncového výrazu v podobě &lt;výraz-konstruktoru&gt; then &lt;výraz&gt;.</target> <note /> </trans-unit> <trans-unit id="SelfRefObjCtor2"> <source>Recursive references to the object being defined will be checked for initialization soundness at runtime through the use of a delayed reference.
Consider placing self-references within 'do' statements after the last 'let' binding in the construction sequence.</source> <target state="translated">U rekurzivních odkazů na definované objekty se bude kontrolovat stabilita inicializace za běhu pomocí zpožděného odkazování. Zvažte možnost přidat do příkazů do za poslední vazbou let v sekvenci konstruktoru odkazy na sebe sama.</target> <note /> </trans-unit> <trans-unit id="VirtualAugmentationOnNullValuedType"> <source>The containing type can use 'null' as a representation value for its nullary union case. Invoking an abstract or virtual member or an interface implementation on a null value will lead to an exception. If necessary add a dummy data value to the nullary constructor to avoid 'null' being used as a representation for this type.</source> <target state="translated">Nadřazený typ může použít null jako hodnotu reprezentace pro svůj prázdný případ typu union. Vyvolání abstraktního nebo virtuálního člena nebo implementace rozhraní u hodnoty null způsobí výjimku. Pokud je to nutné, přidejte do prázdného konstruktoru fiktivní datovou hodnotu, abyste předešli tomu, že se null použije jako reprezentace tohoto typu.</target> <note /> </trans-unit> <trans-unit id="NonVirtualAugmentationOnNullValuedType"> <source>The containing type can use 'null' as a representation value for its nullary union case. This member will be compiled as a static member.</source> <target state="translated">Nadřazený typ může použít null jako hodnotu reprezentace pro svůj prázdný případ typu union. Tento člen se kompiluje jako statický.</target> <note /> </trans-unit> <trans-unit id="NonUniqueInferredAbstractSlot1"> <source>The member '{0}' doesn't correspond to a unique abstract slot based on name and argument count alone</source> <target state="translated">Člen {0} neodpovídá unikátní abstraktní datové oblasti založené jenom na názvu a počtu argumentů.</target> <note /> </trans-unit> <trans-unit id="NonUniqueInferredAbstractSlot2"> <source>.
Multiple implemented interfaces have a member with this name and argument count</source> <target state="translated">. Víc implementovaných rozhraní má člena s tímto názvem a počtem argumentů.</target> <note /> </trans-unit> <trans-unit id="NonUniqueInferredAbstractSlot3"> <source>. Consider implementing interfaces '{0}' and '{1}' explicitly.</source> <target state="translated">. Zvažte explicitní implementaci rozhraní {0} a {1}.</target> <note /> </trans-unit> <trans-unit id="NonUniqueInferredAbstractSlot4"> <source>. Additional type annotations may be required to indicate the relevant override. This warning can be disabled using '#nowarn "70"' or '--nowarn:70'.</source> <target state="translated">. Můžou se vyžadovat další poznámky typu k určení příslušného přepsání. Toto upozornění se dá pomocí #nowarn "70" nebo --nowarn:70 vypnout.</target> <note /> </trans-unit> <trans-unit id="Failure1"> <source>parse error</source> <target state="translated">Chyba analýzy</target> <note /> </trans-unit> <trans-unit id="Failure2"> <source>parse error: unexpected end of file</source> <target state="translated">Chyba analýzy: neočekávaný konec souboru</target> <note /> </trans-unit> <trans-unit id="Failure3"> <source>{0}</source> <target state="translated">{0}</target> <note /> </trans-unit> <trans-unit id="Failure4"> <source>internal error: {0}</source> <target state="translated">Vnitřní chyba: {0}</target> <note /> </trans-unit> <trans-unit id="FullAbstraction"> <source>{0}</source> <target state="translated">{0}</target> <note /> </trans-unit> <trans-unit id="MatchIncomplete1"> <source>Incomplete pattern matches on this expression.</source> <target state="translated">Neúplné porovnávání vzorů u tohoto výrazu</target> <note /> </trans-unit> <trans-unit id="MatchIncomplete2"> <source> For example, the value '{0}' may indicate a case not covered by the pattern(s).</source> <target state="translated"> Třeba hodnota {0} může označovat případ, na který se vzor nevztahuje.</target> <note /> </trans-unit> <trans-unit
id="MatchIncomplete3"> <source> For example, the value '{0}' may indicate a case not covered by the pattern(s). However, a pattern rule with a 'when' clause might successfully match this value.</source> <target state="translated"> Třeba hodnota {0} může označovat případ, na který se vzor nevztahuje. Pravidlo vzoru s klauzulí with se ale s touto hodnotou úspěšně shodovat může.</target> <note /> </trans-unit> <trans-unit id="MatchIncomplete4"> <source> Unmatched elements will be ignored.</source> <target state="translated"> Nespárované prvky se budou ignorovat.</target> <note /> </trans-unit> <trans-unit id="EnumMatchIncomplete1"> <source>Enums may take values outside known cases.</source> <target state="translated">Výčty můžou získávat hodnoty mimo známé případy.</target> <note /> </trans-unit> <trans-unit id="RuleNeverMatched"> <source>This rule will never be matched</source> <target state="translated">Pro toto pravidlo nebude nikdy existovat shoda.</target> <note /> </trans-unit> <trans-unit id="ValNotMutable"> <source>This value is not mutable. Consider using the mutable keyword, e.g. 'let mutable {0} = expression'.</source> <target state="translated">Tato hodnota není proměnlivá. Zvažte použití proměnlivého klíčového slova, třeba let mutable {0} = expression.</target> <note /> </trans-unit> <trans-unit id="ValNotLocal"> <source>This value is not local</source> <target state="translated">Tato hodnota není lokální.</target> <note /> </trans-unit> <trans-unit id="Obsolete1"> <source>This construct is deprecated</source> <target state="translated">Tento konstruktor je zastaralý.</target> <note /> </trans-unit> <trans-unit id="Obsolete2"> <source>. {0}</source> <target state="translated">. {0}</target> <note /> </trans-unit> <trans-unit id="Experimental"> <source>{0}. This warning can be disabled using '--nowarn:57' or '#nowarn "57"'.</source> <target state="translated">{0}.
Toto upozornění se dá pomocí --nowarn:57 nebo #nowarn "57" vypnout.</target> <note /> </trans-unit> <trans-unit id="PossibleUnverifiableCode"> <source>Uses of this construct may result in the generation of unverifiable .NET IL code. This warning can be disabled using '--nowarn:9' or '#nowarn "9"'.</source> <target state="translated">Použití tohoto konstruktoru může způsobit vygenerování neověřitelného kódu .NET IL. Toto upozornění se dá pomocí --nowarn:9 nebo #nowarn "9" vypnout.</target> <note /> </trans-unit> <trans-unit id="Deprecated"> <source>This construct is deprecated: {0}</source> <target state="translated">Tento konstruktor je zastaralý: {0}</target> <note /> </trans-unit> <trans-unit id="LibraryUseOnly"> <source>This construct is deprecated: it is only for use in the F# library</source> <target state="translated">Tento konstruktor je zastaralý: používá se jenom v knihovně F#.</target> <note /> </trans-unit> <trans-unit id="MissingFields"> <source>The following fields require values: {0}</source> <target state="translated">Následující pole vyžadují hodnoty: {0}</target> <note /> </trans-unit> <trans-unit id="ValueRestriction1"> <source>Value restriction. The value '{0}' has generic type\n {1} \nEither make the arguments to '{2}' explicit or, if you do not intend for it to be generic, add a type annotation.</source> <target state="translated">Omezení hodnoty. Hodnota {0} je obecného typu\n {1}. \nBuď změňte argumenty pro {2} na explicitní, nebo (pokud hodnota nemá být obecná) přidejte poznámku typu.</target> <note /> </trans-unit> <trans-unit id="ValueRestriction2"> <source>Value restriction. The value '{0}' has generic type\n {1} \nEither make '{2}' into a function with explicit arguments or, if you do not intend for it to be generic, add a type annotation.</source> <target state="translated">Omezení hodnoty. Hodnota {0} je obecného typu\n {1}.
\nZměňte {2} na funkci s explicitními argumenty nebo (pokud hodnota nemá být obecná) přidejte poznámku typu.</target> <note /> </trans-unit> <trans-unit id="ValueRestriction3"> <source>Value restriction. This member has been inferred to have generic type\n {0} \nConstructors and property getters/setters cannot be more generic than the enclosing type. Add a type annotation to indicate the exact types involved.</source> <target state="translated">Omezení hodnoty. Tento člen se odvodil jako člen obecného typu\n {0}. \nKonstruktory a metody getter nebo setter vlastnosti nemůžou být obecnější než nadřazený typ. Přidejte poznámku typu, abyste přesně určili, které typy se mají zahrnout.</target> <note /> </trans-unit> <trans-unit id="ValueRestriction4"> <source>Value restriction. The value '{0}' has been inferred to have generic type\n {1} \nEither make the arguments to '{2}' explicit or, if you do not intend for it to be generic, add a type annotation.</source> <target state="translated">Omezení hodnoty. Hodnota {0} se odvodila jako hodnota obecného typu\n {1}. \nBuď změňte argumenty pro {2} na explicitní, nebo (pokud hodnota nemá být obecná) přidejte poznámku typu.</target> <note /> </trans-unit> <trans-unit id="ValueRestriction5"> <source>Value restriction. The value '{0}' has been inferred to have generic type\n {1} \nEither define '{2}' as a simple data term, make it a function with explicit arguments or, if you do not intend for it to be generic, add a type annotation.</source> <target state="translated">Omezení hodnoty. Hodnota {0} se odvodila jako hodnota obecného typu\n {1}.
\nDefinujte {2} jako jednoduchý datový výraz, změňte ji na funkci s explicitními argumenty nebo (pokud hodnota nemá být obecná) přidejte poznámku typu.</target> <note /> </trans-unit> <trans-unit id="RecoverableParseError"> <source>syntax error</source> <target state="translated">chyba syntaxe</target> <note /> </trans-unit> <trans-unit id="ReservedKeyword"> <source>{0}</source> <target state="translated">{0}</target> <note /> </trans-unit> <trans-unit id="IndentationProblem"> <source>{0}</source> <target state="translated">{0}</target> <note /> </trans-unit> <trans-unit id="OverrideInIntrinsicAugmentation"> <source>Override implementations in augmentations are now deprecated. Override implementations should be given as part of the initial declaration of a type.</source> <target state="translated">Implementace přepsání v rozšířeních jsou už zastaralé. Implementace přepsání by se měly provádět při počáteční deklaraci typu.</target> <note /> </trans-unit> <trans-unit id="OverrideInExtrinsicAugmentation"> <source>Override implementations should be given as part of the initial declaration of a type.</source> <target state="translated">Implementace přepsání by se měly provádět při počáteční deklaraci typu.</target> <note /> </trans-unit> <trans-unit id="IntfImplInIntrinsicAugmentation"> <source>Interface implementations should normally be given on the initial declaration of a type. Interface implementations in augmentations may lead to accessing static bindings before they are initialized, though only if the interface implementation is invoked during initialization of the static data, and in turn access the static data. You may remove this warning using #nowarn "69" if you have checked this is not the case.</source> <target state="translated">Implementace rozhraní by obvykle měly být zadány pro počáteční deklaraci typu.
Implementace rozhraní v rozšířeních mohou vést k přístupu ke statickým vazbám před jejich inicializací, ale pouze v případě, že je implementace rozhraní vyvolána během inicializace statických dat a následně umožní přístup ke statickým datům. Toto upozornění můžete odebrat pomocí #nowarn 69, pokud jste ověřili, že tomu tak není.</target> <note /> </trans-unit> <trans-unit id="IntfImplInExtrinsicAugmentation"> <source>Interface implementations should be given on the initial declaration of a type.</source> <target state="translated">Implementace rozhraní by se měly provádět při počáteční deklaraci typu.</target> <note /> </trans-unit> <trans-unit id="UnresolvedReferenceNoRange"> <source>A required assembly reference is missing. You must add a reference to assembly '{0}'.</source> <target state="translated">Chybí požadovaný odkaz na sestavení. Musíte k sestavení {0} přidat odkaz.</target> <note /> </trans-unit> <trans-unit id="UnresolvedPathReferenceNoRange"> <source>The type referenced through '{0}' is defined in an assembly that is not referenced. You must add a reference to assembly '{1}'.</source> <target state="translated">Typ odkazovaný pomocí {0} je definovaný v sestavení, na které se neodkazuje. Musíte přidat odkaz na sestavení {1}.</target> <note /> </trans-unit> <trans-unit id="HashIncludeNotAllowedInNonScript"> <source>#I directives may only occur in F# script files (extensions .fsx or .fsscript). Either move this code to a script file, add a '-I' compiler option for this reference or delimit the directive with '#if INTERACTIVE'/'#endif'.</source> <target state="translated">Direktivy #I se můžou vyskytovat jenom v souborech skriptu F# (s příponou .fsx nebo .fsscript). Přesuňte tento kód do souboru skriptu nebo přidejte pro tento odkaz možnost kompilátoru -I anebo direktivu ohraničte pomocí notace #if INTERACTIVE/#endif.</target> <note /> </trans-unit> <trans-unit id="HashReferenceNotAllowedInNonScript"> <source>#r directives may only occur in F# script files (extensions .fsx or .fsscript).
Either move this code to a script file or replace this reference with the '-r' compiler option. If this directive is being executed as user input, you may delimit it with '#if INTERACTIVE'/'#endif'.</source> <target state="translated">Direktivy #r se můžou vyskytovat jenom v souborech skriptu F# (s příponou .fsx nebo .fsscript). Buď přesuňte tento kód do souboru skriptu, nebo nahraďte tento odkaz možností kompilátoru -r. Pokud se tato direktiva provádí jako uživatelský vstup, můžete ji ohraničit pomocí notace #if INTERACTIVE'/'#endif.</target> <note /> </trans-unit> <trans-unit id="HashDirectiveNotAllowedInNonScript"> <source>This directive may only be used in F# script files (extensions .fsx or .fsscript). Either remove the directive, move this code to a script file or delimit the directive with '#if INTERACTIVE'/'#endif'.</source> <target state="translated">Tato direktiva se dá použít jenom v souborech skriptu F# (s příponou .fsx nebo .fsscript). Buď direktivu odeberte, nebo tento kód přesuňte do souboru skriptu, anebo direktivu ohraničte pomocí notace #if INTERACTIVE/#endif.</target> <note /> </trans-unit> <trans-unit id="FileNameNotResolved"> <source>Unable to find the file '{0}' in any of\n {1}</source> <target state="translated">Soubor {0} se nepovedlo najít v žádné(m)\n {1}.</target> <note /> </trans-unit> <trans-unit id="AssemblyNotResolved"> <source>Assembly reference '{0}' was not found or is invalid</source> <target state="translated">Odkaz na sestavení {0} se nenašel nebo je neplatný.</target> <note /> </trans-unit> <trans-unit id="HashLoadedSourceHasIssues1"> <source>One or more warnings in loaded file.\n</source> <target state="translated">V načteném souboru je nejmíň jedno upozornění.\n</target> <note /> </trans-unit> <trans-unit id="HashLoadedSourceHasIssues2"> <source>One or more errors in loaded file.\n</source> <target state="translated">V načteném souboru je nejmíň jedna chyba.\n</target> <note /> </trans-unit> <trans-unit id="HashLoadedScriptConsideredSource"> <source>Loaded files may only
be F# source files (extension .fs). This F# script file (.fsx or .fsscript) will be treated as an F# source file</source> <target state="translated">Načtené soubory můžou být jenom zdrojové soubory F# (s příponou .fs). Tento soubor skriptu F# (s příponou .fsx nebo .fsscript) se zpracuje jako zdrojový soubor F#.</target> <note /> </trans-unit> <trans-unit id="InvalidInternalsVisibleToAssemblyName1"> <source>Invalid assembly name '{0}' from InternalsVisibleTo attribute in {1}</source> <target state="translated">Neplatný název sestavení {0} z atributu InternalsVisibleTo v {1}</target> <note /> </trans-unit> <trans-unit id="InvalidInternalsVisibleToAssemblyName2"> <source>Invalid assembly name '{0}' from InternalsVisibleTo attribute (assembly filename not available)</source> <target state="translated">Neplatný název sestavení {0} z atributu InternalsVisibleTo (název souboru sestavení není dostupný)</target> <note /> </trans-unit> <trans-unit id="LoadedSourceNotFoundIgnoring"> <source>Could not load file '{0}' because it does not exist or is inaccessible</source> <target state="translated">Soubor {0} se nedal načíst, protože neexistuje nebo není dostupný.</target> <note /> </trans-unit> <trans-unit id="MSBuildReferenceResolutionError"> <source>{0} (Code={1})</source> <target state="translated">{0} (Kód={1})</target> <note /> </trans-unit> <trans-unit id="TargetInvocationExceptionWrapper"> <source>internal error: {0}</source> <target state="translated">Vnitřní chyba: {0}</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LBRACE.BAR"> <source>symbol '{|'</source> <target state="translated">symbol {|</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BAR.RBRACE"> <source>symbol '|}'</source> <target state="translated">symbol |}</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.AND.BANG"> <source>keyword 'and!'</source> <target state="translated">klíčové slovo and!</target> <note /> </trans-unit> </body> </file> </xliff> ```
Last Summer () is a 2023 French erotic drama film directed by Catherine Breillat, from a screenplay written by Breillat in collaboration with Pascal Bonitzer. It is a remake of the 2019 Danish film Queen of Hearts. Starring Léa Drucker and Samuel Kircher, the film explores the taboos of a stepmother–stepson relationship. The film was selected to compete for the Palme d'Or at the 76th Cannes Film Festival, where it premiered on 25 May 2023. It was released in France on 13 September 2023. Synopsis Anne is a respected lawyer who lives in Paris with her husband Pierre and their two young daughters. Théo, Pierre's 17-year-old son from a previous marriage, moves in, and Anne eventually begins an affair with him. In doing so, she risks jeopardizing her career and losing her family. Théo is a fragile figure and, as time goes on, the relationship turns destructive. Cast Léa Drucker as Anne Olivier Rabourdin as Pierre Samuel Kircher as Théo Clotilde Courau as Mina Serena Hu as Serena Angela Chen as Angela Romain Maricau Romane Violeau Marie Lucas Nelia Da Costa Lilas-Rose Gilberti-Poisot Production Last Summer is director Catherine Breillat's 15th feature film and her first film since her autobiographical drama Abuse of Weakness released ten years earlier. The film is a remake of the 2019 Danish film Queen of Hearts, which was directed by May el-Toukhy, who co-wrote the film with Maren Louise Kaëhne. Valeria Bruni Tedeschi was initially cast in the role of Anne before being replaced by Léa Drucker. Samuel Kircher, the son of actors Irène Jacob and Jérôme Kircher, makes his film debut as the teenage stepson Théo. Samuel was recommended to Breillat by his brother Paul, who was originally scheduled to play the role. In an interview given in early February 2023, Drucker said that Last Summer is "one of the most disturbing films" in which she has acted. The film poses the questions "Is it love? Where does love stop? Where does the transgression begin?" without being "moralistic".
The film reminded Drucker of the play Blackbird by Scottish dramatist David Harrower, in which she has starred on stage. Blackbird tells the story of a young woman meeting a middle-aged man fifteen years after being sexually abused by him when she was twelve. Filming took place from 7 June to 13 July 2022. Jeanne Lapoirie served as the director of photography. The film was produced by Saïd Ben Saïd through his company SBS Productions. According to director Catherine Breillat, the shooting took place "in a state of absolute grace". Breillat described herself as physically diminished and "afraid of not holding on", but said she rediscovered her love of filming during the production of Last Summer. Release Last Summer was selected to compete for the Palme d'Or at the 2023 Cannes Film Festival, where it had its world premiere on 25 May 2023. The film was theatrically released in France on 13 September 2023 by Pyramide Distribution. It later screened at the 2023 New York Film Festival, and was also invited to the 'Icon' section of the 28th Busan International Film Festival, where it was screened on 6 October 2023. Reception Critical response On Rotten Tomatoes, the film holds an approval rating of 73% based on 22 reviews, with an average rating of 6.7/10. On Metacritic, the film has a weighted average score of 63 out of 100, based on 7 critic reviews, indicating "generally favorable" reviews. Last Summer received an average rating of 3.6 out of 5 stars on the French website AlloCiné, based on 35 reviews. Awards and nominations References External links 2023 films 2023 drama films 2020s erotic drama films 2020s French films 2020s French-language films French erotic drama films Films about families Films directed by Catherine Breillat Remakes of Danish films
Young and Crazy is a 1987 album by Tigertailz, with artwork by the band's bassist Pepsi Tate. It has a heavier sound than its successor, with fewer elements of pop metal. It is the only album to feature the original lead vocalist Steevi Jaimz. The album was released in America on Combat Records. The compact disc is now considered a collector's item, and is much sought after by Tigertailz fans. The CD was re-released in the US on May 27, 2008, on the independent label Krescendo. In 2021, Metal Hammer listed Young and Crazy as one of their ten 'obscure but brilliant hair metal albums' in an article. On 15 October 2022, original vocalist Steevi Jaimz performed a one-off 35th anniversary show at the 229 in London. The album was performed live in full for the first time with a backing band featuring Kai from Esprit D'Air on guitar, previous Tigertailz drummer Robin Guy, and BulletBoys bassist Rob Lane. Track listing "Star Attraction" - 2:51 "Hollywood Killer" - 3:39 "Ballerina" (Instrumental) - 0:51 "Livin' Without You" - 4:17 "Shameless" - 4:20 "City Kidz" - 3:56 "Shoot to Kill" - 3:20 "Turn Me On" - 2:44 "She'z Too Hot" - 3:12 "Young and Crazy" - 3:03 "Fall in Love Again" - 4:44 All songs written by S. Jaimz/J. Pepper, except "She'z Too Hot" written by P. Tate Personnel Tigertailz Steevi Jaimz - vocals Jay Pepper - guitar Pepsi Tate - bass guitar, keyboards Ace Finchum - drums with: Tim Lewis - keyboards Jakki Lynn, Toni - backing vocals References 1987 debut albums Tigertailz albums Music for Nations albums Combat Records albums
Minna Tarkka (1960 – 27 August 2023) was a Finnish critic, educator, producer and curator of media art and culture. In 2000 she co-founded m-cult in Helsinki, Finland and remained as its director until 2023. Life and work Tarkka was active in developing the first media art-related university courses in Finland, including the Faculty of Time and Space (Academy of Fine Arts) and the MA in New Media (University of Art and Design, Media Lab). She was a founder member and the first director of MUU (1989-91), as well as a founding member of AV-arkki, a distribution facility for media art, and the Finnish Media Art Network. ISEA, the International Symposium on Electronic Art, was held in Helsinki under Tarkka's direction in 1994 in partnership with the University of Art and Design, and again in 2004, when it took place in both Helsinki and Tallinn, Estonia, and was produced by curator Amanda McDonald Crowley. In 2017, Tarkka was awarded the Finnish State Art Prize in the Media Art category, acknowledging her pioneering work and her commissions in collaborative new media art. Tarkka died on 27 August 2023, at the age of 62. References External links m-cult agency to develop and promote new forms of media art and digital culture 1960 births 2023 deaths People from Helsinki
Lysimachia pendens is a rare species of flowering plant in the family Primulaceae known by the common name broad-leaf yellow loosestrife. It is endemic to Hawaii, where there is a single occurrence known on the island of Kauai. It was federally listed as an endangered species of the United States in 2010. This shrub was described as a new species in 1997 when one population of Lysimachia filifolia plants was determined to be different from the others and not part of that species. The leaves are wider and hairier than those of L. filifolia. This plant occurs at one location at the headwaters of the north fork of the Wailua River of Kauai, where it grows alongside the newly described Lysimachia iniki. The habitat is made up of wet, mossy cliffs. This shrub has hanging branches, the new growth covered in tan hairs. The lance-shaped leaves are closely spaced on the branches and measure roughly 2 to 4 centimeters long by 2 to 4 millimeters wide. The flowers have green or red-tinged sepals and red petals each just under a centimeter in length. The plant is threatened by the invasion of introduced species of plants in its habitat. Landslides have destroyed many of the plants. There are only eight individuals of this species remaining (as of April 2010). References pendens Endemic flora of Hawaii Plants described in 1997
```yaml # UTF-8 # YAML # # name name: # other_names ... # YAML # other_names: {"":"", "":"", "":"Tom"} # other_names: # sex M/F / sex: M # birth 4 N/A birth: 1946 # death 4 N/A death: # desc YAML # desc desc: | (,18) # links YAML list # # links: - path_to_url ```
was a gynecologist and president of Kumamoto Medical College (1925–1932). He wrote The history of medical education in Higo (Kumamoto) and Yokoi Shōnan. After retirement, he travelled in Okinawa. {{Infobox person |name = Masatada Yamasaki |image = Masatada Yamasaki.jpg |image_size = |caption = Masatada Yamasaki |birth_date = June 16, 1872 |birth_place = Kōchi Prefecture |death_date = May 29, 1950 |known_for = First President of Kumamoto Medical College (1925-1932), The books History of Medical Education in Kumamoto and Yokoi Shōnan |occupation = Physician (Gynecologist) |nationality = Japanese }} Life He was born in Sagawa town, Takaoka gun, Kōchi Prefecture on 11 May 1872. After graduating from Tokyo Imperial University in 1900, he became a professor at the private Kumamoto Medical School in 1901. In 1909 and 1910, he studied at the universities of Munich and Bonn. He was appointed president of Aichi Medical School in 1916. In 1925, he was appointed president of Kumamoto Medical College and director of its hospital. In 1929, he wrote The history of medical education in Higo (Kumamoto). In 1932, he retired from Kumamoto Medical College. He travelled in Okinawa in 1932 and 1933, and died at his home on 29 May 1950. Okinawa Having long been interested in history and travelling, he retired from the university at age 60 and travelled to Okinawa in 1932 and 1933. In 1932 a cameraman accompanied him, and in 1933 he was accompanied by his wife, his son, a cameraman and an artist. Many physicians who had studied at his universities helped him (he used the Governor's car). Fortunately, photographs of various scenes taken during his journeys in Okinawa in 1932 and 1933, mostly at historical spots, were discovered and published in 2000. Since Okinawa was badly damaged in the Second World War, these photographs were considered of inestimable value. He often included himself in the photographs, which provides a sense of scale for the various buildings.
Medical education in Higo (Kumamoto) His famous book traces medical education in Higo back to 1758, when Saishunkan was established. This book has more than 800 pages. Yokoi Shōnan He studied Yokoi Shōnan, and his book received renewed interest when 2009 marked the 200th anniversary of Yokoi Shōnan's birth. This book has more than 1300 pages. Yokoi Shōnan, the Foremost World-Pacifist in Japan (1949), Dr. M. Yamasaki, Kumamoto Education Board. A Memorial Lecture by Dr. M. Yamasaki (Beginning): Motoda Toya (1801-1880), Yokoi Shōnan's devoted disciple and friend, exalted him as an unrivalled master of moral philosophy and a scholar suitable to be an Emperor's instructor, and added, "I have made friends with many persons of repute in my life, but really can I remember none with so broad a view and so high-spirited as Yokoi, my instructor. His clear judgement and keen observation can hardly be attained by others". Nagaoka Moriyoshi (1842-1906), one of his bosom friends, wrote a sonnet in praise of Yokoi in which he says, "Who on earth would dare to compete with him for gift and talent?" Katsu Kaishu (1823-1899), known to have had a very high opinion of his own wit and discernment, had to confess that Yokoi was surprisingly broad-minded and towered above his contemporaries. He says, "During the eventful years of my life, I have come across two really formidable persons to deal with, namely, Saigo Nanshu (1827-1877) and Yokoi Shonan. Well, Yokoi's knowledge of the world affairs was by no means rich, so I had been very often his teacher in that respect. On the other hand, his high-toned thinking and imagination was far beyond my reach." Memorials The Yamasaki Memorial House is inside the campus of the Medical Department, Kumamoto University. It houses a bronze statue of Dr. Yamasaki. His tomb is in Komine Memorial Park, Kurokami, Kumamoto. Footnotes Photo album, "Beloved Okinawa. Scenes in early Showa period when Masatada Yamasaki walked" (2000), Takao Nonomura, Ryukyushinpo, Naha.
References 1872 births 1950 deaths Japanese gynaecologists 20th-century Japanese historians People from Kumamoto Prefecture People from Kōchi Prefecture University of Tokyo alumni
Coleophora patzaki is a moth of the family Coleophoridae that is endemic to Greece. References External links patzaki Moths described in 1983 Moths of Europe Endemic fauna of Greece
The nasal palatal approximant is a type of consonantal sound used in some spoken languages. The symbol in the International Phonetic Alphabet that represents this sound is , that is, a j with a tilde. The equivalent X-SAMPA symbol is j~, and in the Americanist phonetic notation it is . The nasal palatal approximant is sometimes called a nasal yod; nasal approximants in general may also be called nasal glides. Features Features of the nasal palatal approximant: It is a nasal consonant, which means air is allowed to escape through the nose, in this case in addition to through the mouth. Occurrence , written ny, is a common realization of before nasal vowels in many languages of West Africa that do not have a phonemic distinction between voiced nasal and oral stops, such as the Yoruba, Ewe and Bini languages. See also Palatal nasal Nasal labio-velar approximant Labiodental nasal, which may be an approximant in the one language in which it is phonemic Voiceless nasal glottal approximant Index of phonetics articles Notes References Further reading External links Nasal consonants Palatal consonants Central consonants Voiced consonants Pulmonic consonants
Emoia aneityumensis, Medway's emo skink or the Anatom emo skink, is a species of lizard in the family Scincidae. It is found in Vanuatu. References Emoia Reptiles described in 1974
```csharp
using System.Linq;
using System.Numerics;
using Algorithms.Sequences;
using FluentAssertions;
using NUnit.Framework;

namespace Algorithms.Tests.Sequences;

[TestFixture]
public static class MatchstickTriangleSequenceTests
{
    private static BigInteger[] _testList = {
        0, 1, 5, 13, 27, 48, 78, 118, 170, 235, 315, 411, 525, 658, 812, 988,
        1188, 1413, 1665, 1945, 2255, 2596, 2970, 3378, 3822, 4303, 4823, 5383,
        5985, 6630, 7320, 8056, 8840, 9673, 10557, 11493, 12483, 13528, 14630,
        15790, 17010, 18291, 19635, 21043, 22517,
    };

    /// <summary>
    /// This test uses the list values provided from path_to_url
    /// </summary>
    [Test]
    public static void TestOeisList()
    {
        var sequence = new MatchstickTriangleSequence().Sequence.Take(_testList.Length);
        sequence.SequenceEqual(_testList).Should().BeTrue();
    }
}
```
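The expected values in the test appear to match OEIS A045949, which counts triangles of every size (both orientations) in a triangular matchstick arrangement of side n. As an illustrative cross-check only — the OEIS number and the closed form are assumptions taken from that entry, not from this repository's `MatchstickTriangleSequence` class — the sequence can be sketched in a few lines of Python:

```python
# Closed form that reproduces the expected list above:
# a(n) = floor(n * (n + 2) * (2n + 1) / 8).
# The formula and the identification as OEIS A045949 are assumptions
# based on matching the listed values, not taken from this repository.
def matchstick_triangles(n: int) -> int:
    """Triangles of all sizes in a matchstick triangle of side n."""
    return n * (n + 2) * (2 * n + 1) // 8

expected = [0, 1, 5, 13, 27, 48, 78, 118, 170, 235, 315, 411, 525, 658]
assert [matchstick_triangles(n) for n in range(len(expected))] == expected
```

The integer floor division makes the floor in the closed form exact, so no floating-point arithmetic is needed even for large n.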
```javascript // Generated by ReScript, PLEASE EDIT WITH CARE 'use strict'; let Mt = require("./mt.js"); let Belt_Option = require("../../lib/js/belt_Option.js"); let Caml_option = require("../../lib/js/caml_option.js"); let suites_0 = [ "make", (function (param) { return { TAG: "Eq", _0: "null", _1: String(null).concat("") }; }) ]; let suites_1 = { hd: [ "fromCharCode", (function (param) { return { TAG: "Eq", _0: "a", _1: String.fromCharCode(97) }; }) ], tl: { hd: [ "fromCharCodeMany", (function (param) { return { TAG: "Eq", _0: "az", _1: String.fromCharCode(97, 122) }; }) ], tl: { hd: [ "fromCodePoint", (function (param) { return { TAG: "Eq", _0: "a", _1: String.fromCodePoint(97) }; }) ], tl: { hd: [ "fromCodePointMany", (function (param) { return { TAG: "Eq", _0: "az", _1: String.fromCodePoint(97, 122) }; }) ], tl: { hd: [ "length", (function (param) { return { TAG: "Eq", _0: 3, _1: "foo".length }; }) ], tl: { hd: [ "get", (function (param) { return { TAG: "Eq", _0: "a", _1: "foobar"[4] }; }) ], tl: { hd: [ "charAt", (function (param) { return { TAG: "Eq", _0: "a", _1: "foobar".charAt(4) }; }) ], tl: { hd: [ "charCodeAt", (function (param) { return { TAG: "Eq", _0: 97, _1: "foobar".charCodeAt(4) }; }) ], tl: { hd: [ "codePointAt", (function (param) { return { TAG: "Eq", _0: 97, _1: "foobar".codePointAt(4) }; }) ], tl: { hd: [ "codePointAt - out of bounds", (function (param) { return { TAG: "Eq", _0: undefined, _1: "foobar".codePointAt(98) }; }) ], tl: { hd: [ "concat", (function (param) { return { TAG: "Eq", _0: "foobar", _1: "foo".concat("bar") }; }) ], tl: { hd: [ "concatMany", (function (param) { return { TAG: "Eq", _0: "foobarbaz", _1: "foo".concat("bar", "baz") }; }) ], tl: { hd: [ "endsWith", (function (param) { return { TAG: "Eq", _0: true, _1: "foobar".endsWith("bar") }; }) ], tl: { hd: [ "endsWithFrom", (function (param) { return { TAG: "Eq", _0: false, _1: "foobar".endsWith("bar", 1) }; }) ], tl: { hd: [ "includes", (function (param) { return { TAG: 
"Eq", _0: true, _1: "foobarbaz".includes("bar") }; }) ], tl: { hd: [ "includesFrom", (function (param) { return { TAG: "Eq", _0: false, _1: "foobarbaz".includes("bar", 4) }; }) ], tl: { hd: [ "indexOf", (function (param) { return { TAG: "Eq", _0: 3, _1: "foobarbaz".indexOf("bar") }; }) ], tl: { hd: [ "indexOfFrom", (function (param) { return { TAG: "Eq", _0: -1, _1: "foobarbaz".indexOf("bar", 4) }; }) ], tl: { hd: [ "lastIndexOf", (function (param) { return { TAG: "Eq", _0: 3, _1: "foobarbaz".lastIndexOf("bar") }; }) ], tl: { hd: [ "lastIndexOfFrom", (function (param) { return { TAG: "Eq", _0: 3, _1: "foobarbaz".lastIndexOf("bar", 4) }; }) ], tl: { hd: [ "localeCompare", (function (param) { return { TAG: "Eq", _0: 0, _1: "foo".localeCompare("foo") }; }) ], tl: { hd: [ "match", (function (param) { return { TAG: "Eq", _0: [ "na", "na" ], _1: Caml_option.null_to_opt("banana".match(/na+/g)) }; }) ], tl: { hd: [ "match - no match", (function (param) { return { TAG: "Eq", _0: undefined, _1: Caml_option.null_to_opt("banana".match(/nanana+/g)) }; }) ], tl: { hd: [ "match - not found capture groups", (function (param) { return { TAG: "Eq", _0: [ "hello ", undefined ], _1: Belt_Option.map(Caml_option.null_to_opt("hello word".match(/hello (world)?/)), (function (prim) { return prim.slice(); })) }; }) ], tl: { hd: [ "normalize", (function (param) { return { TAG: "Eq", _0: "foo", _1: "foo".normalize() }; }) ], tl: { hd: [ "normalizeByForm", (function (param) { return { TAG: "Eq", _0: "foo", _1: "foo".normalize("NFKD") }; }) ], tl: { hd: [ "repeat", (function (param) { return { TAG: "Eq", _0: "foofoofoo", _1: "foo".repeat(3) }; }) ], tl: { hd: [ "replace", (function (param) { return { TAG: "Eq", _0: "fooBORKbaz", _1: "foobarbaz".replace("bar", "BORK") }; }) ], tl: { hd: [ "replaceByRe", (function (param) { return { TAG: "Eq", _0: "fooBORKBORK", _1: "foobarbaz".replace(/ba./g, "BORK") }; }) ], tl: { hd: [ "unsafeReplaceBy0", (function (param) { let replace = function (whole, 
offset, s) { if (whole === "bar") { return "BORK"; } else { return "DORK"; } }; return { TAG: "Eq", _0: "fooBORKDORK", _1: "foobarbaz".replace(/ba./g, replace) }; }) ], tl: { hd: [ "unsafeReplaceBy1", (function (param) { let replace = function (whole, p1, offset, s) { if (whole === "bar") { return "BORK"; } else { return "DORK"; } }; return { TAG: "Eq", _0: "fooBORKDORK", _1: "foobarbaz".replace(/ba./g, replace) }; }) ], tl: { hd: [ "unsafeReplaceBy2", (function (param) { let replace = function (whole, p1, p2, offset, s) { if (whole === "bar") { return "BORK"; } else { return "DORK"; } }; return { TAG: "Eq", _0: "fooBORKDORK", _1: "foobarbaz".replace(/ba./g, replace) }; }) ], tl: { hd: [ "unsafeReplaceBy3", (function (param) { let replace = function (whole, p1, p2, p3, offset, s) { if (whole === "bar") { return "BORK"; } else { return "DORK"; } }; return { TAG: "Eq", _0: "fooBORKDORK", _1: "foobarbaz".replace(/ba./g, replace) }; }) ], tl: { hd: [ "search", (function (param) { return { TAG: "Eq", _0: 3, _1: "foobarbaz".search(/ba./g) }; }) ], tl: { hd: [ "slice", (function (param) { return { TAG: "Eq", _0: "bar", _1: "foobarbaz".slice(3, 6) }; }) ], tl: { hd: [ "sliceToEnd", (function (param) { return { TAG: "Eq", _0: "barbaz", _1: "foobarbaz".slice(3) }; }) ], tl: { hd: [ "split", (function (param) { return { TAG: "Eq", _0: [ "foo", "bar", "baz" ], _1: "foo bar baz".split(" ") }; }) ], tl: { hd: [ "splitAtMost", (function (param) { return { TAG: "Eq", _0: [ "foo", "bar" ], _1: "foo bar baz".split(" ", 2) }; }) ], tl: { hd: [ "splitByRe", (function (param) { return { TAG: "Eq", _0: [ "a", "#", undefined, "b", "#", ":", "c" ], _1: "a#b#:c".split(/(#)(:)?/) }; }) ], tl: { hd: [ "splitByReAtMost", (function (param) { return { TAG: "Eq", _0: [ "a", "#", undefined ], _1: "a#b#:c".split(/(#)(:)?/, 3) }; }) ], tl: { hd: [ "startsWith", (function (param) { return { TAG: "Eq", _0: true, _1: "foobarbaz".startsWith("foo") }; }) ], tl: { hd: [ "startsWithFrom", (function 
(param) { return { TAG: "Eq", _0: false, _1: "foobarbaz".startsWith("foo", 1) }; }) ], tl: { hd: [ "substr", (function (param) { return { TAG: "Eq", _0: "barbaz", _1: "foobarbaz".substr(3) }; }) ], tl: { hd: [ "substrAtMost", (function (param) { return { TAG: "Eq", _0: "bar", _1: "foobarbaz".substr(3, 3) }; }) ], tl: { hd: [ "substring", (function (param) { return { TAG: "Eq", _0: "bar", _1: "foobarbaz".substring(3, 6) }; }) ], tl: { hd: [ "substringToEnd", (function (param) { return { TAG: "Eq", _0: "barbaz", _1: "foobarbaz".substring(3) }; }) ], tl: { hd: [ "toLowerCase", (function (param) { return { TAG: "Eq", _0: "bork", _1: "BORK".toLowerCase() }; }) ], tl: { hd: [ "toLocaleLowerCase", (function (param) { return { TAG: "Eq", _0: "bork", _1: "BORK".toLocaleLowerCase() }; }) ], tl: { hd: [ "toUpperCase", (function (param) { return { TAG: "Eq", _0: "FUBAR", _1: "fubar".toUpperCase() }; }) ], tl: { hd: [ "toLocaleUpperCase", (function (param) { return { TAG: "Eq", _0: "FUBAR", _1: "fubar".toLocaleUpperCase() }; }) ], tl: { hd: [ "trim", (function (param) { return { TAG: "Eq", _0: "foo", _1: " foo ".trim() }; }) ], tl: { hd: [ "anchor", (function (param) { return { TAG: "Eq", _0: "<a name=\"bar\">foo</a>", _1: "foo".anchor("bar") }; }) ], tl: { hd: [ "link", (function (param) { return { TAG: "Eq", _0: "<a href=\"path_to_url">foo</a>", _1: "foo".link("path_to_url") }; }) ], tl: { hd: [ "File \"js_string_test.res\", line 138, characters 5-12", (function (param) { return { TAG: "Ok", _0: "ab".includes("a") }; }) ], tl: /* [] */0 } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } }; let suites = { hd: suites_0, tl: suites_1 }; Mt.from_pair_suites("Js_string_test", suites); exports.suites = suites; /* Not a pure module */ ```
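The `splitByRe` cases in the generated suite rely on a subtlety of JavaScript's `String.prototype.split`: capture groups are spliced into the result array, and an optional group that did not match appears as `undefined`. Python's `re.split` behaves analogously (unmatched groups become `None`), which makes a convenient cross-check of the expected arrays. This sketch is an aside for comparison, not part of the generated test file:

```python
import re

# JS: "a#b#:c".split(/(#)(:)?/)  ->  ["a", "#", undefined, "b", "#", ":", "c"]
# Python interleaves captured groups the same way; a group that did not
# participate in a match shows up as None rather than undefined.
parts = re.split(r"(#)(:)?", "a#b#:c")
assert parts == ["a", "#", None, "b", "#", ":", "c"]

# Limiting splits (cf. splitByReAtMost) is done with maxsplit in Python.
# Note the semantics differ: JS's limit truncates the result array, while
# Python stops splitting and keeps the unsplit tail intact.
assert re.split(r"(#)(:)?", "a#b#:c", maxsplit=1) == ["a", "#", None, "b#:c"]
```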
```asciidoc xref::overview/apoc.import/apoc.import.json.adoc[apoc.import.json icon:book[]] + `apoc.import.json(file,config)` - imports the json list to the provided file label:procedure[] label:apoc-core[] ```
```html <html> <head> <meta http-equiv="Content-Type" content="text/html; charset=US-ASCII"> <title>crosses (with strategy)</title> <link rel="stylesheet" href="../../../../../../../../doc/src/boostbook.css" type="text/css"> <meta name="generator" content="DocBook XSL Stylesheets V1.79.1"> <link rel="home" href="../../../../index.html" title="Chapter&#160;1.&#160;Geometry"> <link rel="up" href="../crosses.html" title="crosses"> <link rel="prev" href="../crosses.html" title="crosses"> <link rel="next" href="crosses_2.html" title="crosses"> </head> <body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF"> <table cellpadding="2" width="100%"><tr> <td valign="top"><img alt="Boost C++ Libraries" width="277" height="86" src="../../../../../../../../boost.png"></td> <td align="center"><a href="../../../../../../../../index.html">Home</a></td> <td align="center"><a href="../../../../../../../../libs/libraries.htm">Libraries</a></td> <td align="center"><a href="path_to_url">People</a></td> <td align="center"><a href="path_to_url">FAQ</a></td> <td align="center"><a href="../../../../../../../../more/index.htm">More</a></td> </tr></table> <hr> <div class="spirit-nav"> <a accesskey="p" href="../crosses.html"><img src="../../../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../crosses.html"><img src="../../../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../../../index.html"><img src="../../../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="crosses_2.html"><img src="../../../../../../../../doc/src/images/next.png" alt="Next"></a> </div> <div class="section"> <div class="titlepage"><div><div><h5 class="title"> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy"></a><a class="link" href="crosses_3_with_strategy.html" title="crosses (with strategy)">crosses (with strategy)</a> </h5></div></div></div> <p> <a class="indexterm" 
name="idp109981808"></a> Checks if two geometries cross. </p> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h0"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.synopsis"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.synopsis">Synopsis</a> </h6> <p> </p> <pre class="programlisting"><span class="keyword">template</span><span class="special">&lt;</span><span class="keyword">typename</span> <span class="identifier">Geometry1</span><span class="special">,</span> <span class="keyword">typename</span> <span class="identifier">Geometry2</span><span class="special">,</span> <span class="keyword">typename</span> <span class="identifier">Strategy</span><span class="special">&gt;</span> <span class="keyword">bool</span> <span class="identifier">crosses</span><span class="special">(</span><span class="identifier">Geometry1</span> <span class="keyword">const</span> <span class="special">&amp;</span> <span class="identifier">geometry1</span><span class="special">,</span> <span class="identifier">Geometry2</span> <span class="keyword">const</span> <span class="special">&amp;</span> <span class="identifier">geometry2</span><span class="special">,</span> <span class="identifier">Strategy</span> <span class="keyword">const</span> <span class="special">&amp;</span> <span class="identifier">strategy</span><span class="special">)</span></pre> <p> </p> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h1"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.parameters"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.parameters">Parameters</a> </h6> <div class="informaltable"><table class="table"> <colgroup> <col> <col> <col> <col> </colgroup> <thead><tr> <th> <p> Type </p> </th> <th> <p> 
Concept </p> </th> <th> <p> Name </p> </th> <th> <p> Description </p> </th> </tr></thead> <tbody> <tr> <td> <p> Geometry1 const &amp; </p> </td> <td> <p> Any type fulfilling a Geometry Concept </p> </td> <td> <p> geometry1 </p> </td> <td> <p> A model of the specified concept </p> </td> </tr> <tr> <td> <p> Geometry2 const &amp; </p> </td> <td> <p> Any type fulfilling a Geometry Concept </p> </td> <td> <p> geometry2 </p> </td> <td> <p> A model of the specified concept </p> </td> </tr> <tr> <td> <p> Strategy const &amp; </p> </td> <td> <p> Any type fulfilling a Crosses Strategy Concept </p> </td> <td> <p> strategy </p> </td> <td> <p> The strategy which will be used for crosses calculations </p> </td> </tr> </tbody> </table></div> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h2"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.returns"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.returns">Returns</a> </h6> <p> Returns true if the two geometries cross </p> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h3"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.header"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.header">Header</a> </h6> <p> Either </p> <p> <code class="computeroutput"><span class="preprocessor">#include</span> <span class="special">&lt;</span><span class="identifier">boost</span><span class="special">/</span><span class="identifier">geometry</span><span class="special">.</span><span class="identifier">hpp</span><span class="special">&gt;</span></code> </p> <p> Or </p> <p> <code class="computeroutput"><span class="preprocessor">#include</span> <span class="special">&lt;</span><span class="identifier">boost</span><span class="special">/</span><span 
class="identifier">geometry</span><span class="special">/</span><span class="identifier">algorithms</span><span class="special">/</span><span class="identifier">crosses</span><span class="special">.</span><span class="identifier">hpp</span><span class="special">&gt;</span></code> </p> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h4"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.conformance"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.conformance">Conformance</a> </h6> <p> The function crosses implements function Crosses from the <a href="path_to_url" target="_top">OGC Simple Feature Specification</a>. </p> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h5"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.supported_geometries"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.supported_geometries">Supported geometries</a> </h6> <div class="informaltable"><table class="table"> <colgroup> <col> <col> <col> <col> <col> <col> <col> <col> <col> <col> <col> </colgroup> <thead><tr> <th> </th> <th> <p> Point </p> </th> <th> <p> Segment </p> </th> <th> <p> Box </p> </th> <th> <p> Linestring </p> </th> <th> <p> Ring </p> </th> <th> <p> Polygon </p> </th> <th> <p> MultiPoint </p> </th> <th> <p> MultiLinestring </p> </th> <th> <p> MultiPolygon </p> </th> <th> <p> Variant </p> </th> </tr></thead> <tbody> <tr> <td> <p> Point </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span 
class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> </tr> <tr> <td> <p> Segment </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> Box </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img 
src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> Linestring </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> 
<td> <p> Ring </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> Polygon </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img 
src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> MultiPoint </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> MultiLinestring </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img 
src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> MultiPolygon </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> Variant </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" 
alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> </tbody> </table></div> </div> <table xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" width="100%"><tr> <td align="left"></td> <td align="right"><div class="copyright-footer">Copyright &#169; Barend Gehrels, Bruno Lalande, Mateusz Loskot, Adam Wulkiewicz, Oracle and/or its affiliates<p> Distributed under the Boost Software License, Version 1.0. (See accompanying file LICENSE_1_0.txt or copy at <a href="path_to_url" target="_top">path_to_url</a>) </p> </div></td> </tr></table> <hr> <div class="spirit-nav"> <a accesskey="p" href="../crosses.html"><img src="../../../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../crosses.html"><img src="../../../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../../../index.html"><img src="../../../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="crosses_2.html"><img src="../../../../../../../../doc/src/images/next.png" alt="Next"></a> </div> </body> </html> ```
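Behind any crosses strategy for linear geometries sits the classic segment orientation test. As a language-agnostic illustration (Python here; this is a sketch of the underlying geometric predicate, not of Boost.Geometry's API, and the function names are my own), two segments properly cross exactly when each one straddles the line through the other:

```python
def orient(p, q, r):
    """Sign of the cross product (q - p) x (r - p):
    > 0 for a left turn, < 0 for a right turn, 0 when collinear."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def segments_cross(a, b, c, d):
    """True if segment ab properly crosses segment cd: the interiors share
    exactly one point, with no endpoint touching and no collinear overlap."""
    d1, d2 = orient(c, d, a), orient(c, d, b)
    d3, d4 = orient(a, b, c), orient(a, b, d)
    # Each segment's endpoints lie strictly on opposite sides of the other.
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0) and 0 not in (d1, d2, d3, d4)

assert segments_cross((0, 0), (4, 4), (0, 4), (4, 0))       # X shape: cross
assert not segments_cross((0, 0), (1, 0), (2, 0), (3, 0))   # collinear, disjoint
```

The full OGC crosses predicate generalizes this to whole geometries via the DE-9IM intersection matrix, but the per-segment test above is the numerical core that any strategy must supply.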
Kiloyear may refer to: Millennium, a period of time equal to 1000 years Kiloannus, abbreviated ka, a period of 1000 Julian years, equal to 365,250 days
```html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="utf-8">
    <link rel="shortcut icon" type="image/png" href="../assets/img/favicon.ico">
    <link rel="stylesheet" href="../assets/lib/cssgrids.css">
    <link rel="stylesheet" href="../assets/css/main.css" id="site_styles">
    <script src="../assets/lib/yui-min.js"></script>
    <script src="../assets/js/api-prettify.js"></script>
    <script src="../assets/js/api-filter.js"></script>
    <script src="../assets/js/api-list.js"></script>
    <script src="../assets/js/api-search.js"></script>
    <script src="../assets/js/api-docs.js"></script>
    <title>nunuStudio BaseNode</title>
</head>
<body class="yui3-skin-sam">
<div id="doc">
    <div class="apidocs">
        <div id="docs-main">
            <div class="content">
                <h1>BaseNode Class</h1>
                <div class="box meta">
                    Module: <a href="../modules/Script.html">Script</a>
                </div>
                <div class="box intro">
                    <p>Base nodes are used as the basis for all other nodes; they implement the common functionality needed by every node.</p>
                    <p>Base nodes add a destroy button which allows the user to destroy them.</p>
                    <p>When a node is destroyed, it is automatically removed from the graph.</p>
                </div>
                <div class="constructor">
                    <h2>Constructor</h2>
                    <div id="method_BaseNode" class="method item">
                        <h3 class="name"><code>BaseNode</code></h3>
                        <span class="paren">()</span>
                    </div>
                </div>
                <div id="classdocs" class="tabview">
                    <div id="attrs" class="api-class-tabpanel">
                        <h2 class="off-left">Attributes</h2>
                        <div id="attr_destroyButton" class="attr item">
                            <a name="config_destroyButton"></a>
                            <h3 class="name"><code>destroyButton</code></h3>
                            <span class="type">Circle</span>
                            <div class="description">
                                <p>Button used to destroy the node and remove it from the graph.</p>
                            </div>
                        </div>
                    </div>
                </div>
            </div>
        </div>
    </div>
</div>
</body>
</html>
```
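The page above documents a node that removes itself from its graph when its destroy button is pressed. That behaviour can be sketched independently of nunuStudio; everything below is a hypothetical stand-in, not the library's actual API:

```python
class Graph:
    """Hypothetical container standing in for the node graph."""
    def __init__(self):
        self.nodes = []

    def add(self, node):
        self.nodes.append(node)
        node.graph = self
        return node


class BaseNode:
    """Minimal stand-in for a destructible node (not nunuStudio's real class)."""
    def __init__(self):
        self.graph = None

    def destroy(self):
        # Mirrors the documented behaviour: a destroyed node is
        # automatically removed from its graph.
        if self.graph is not None:
            self.graph.nodes.remove(self)
            self.graph = None


graph = Graph()
node = graph.add(BaseNode())
node.destroy()
print(len(graph.nodes))  # 0: the node removed itself on destruction
```

The `destroy()` method clears the back-reference so a second call is a harmless no-op, which is the usual defensive choice for teardown code.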
Karl Stern (April 8, 1906 – November 11, 1975) was a German Canadian neurologist and psychiatrist, and a Jewish convert to the Catholic Church. Stern is best known for the account of his conversion in Pillar of Fire (1951). Life and career Stern was born in the small town of Cham in Bavaria in 1906, to socially assimilated Jewish parents. There was no synagogue or rabbi in the town, and although regular services and classes were held under the direction of a cantor, Stern's religious education was minimal. As a teenager he sought to re-engage with the Jewish faith, and began attending an Orthodox synagogue, but he soon became an atheist Zionist. He studied medicine at the Universities of Munich, Berlin and Frankfurt, and came to specialize in psychiatric research. In the course of undergoing psychoanalysis himself, he regained belief in God and returned to Orthodox Jewish worship. He emigrated from Nazi Germany in 1936, finding work in neurological research in England, and later as lecturer in neuropathology and assistant neuropathologist at the Montreal Neurological Institute, under Wilder Penfield. It was while in London that he began to take an interest in the Catholic faith. In 1943, after much soul-searching, and ultimately influenced by encounters with Jacques Maritain and Dorothy Day, Stern converted to Christianity and was baptized as a Roman Catholic. Stern married Liselotte von Baeyer, a bookbinder (died 1970), and they had four children: Antony, a psychiatrist (1937–1967), Katherine Skorzewska, Michael and John. Stern was significantly incapacitated by a stroke in 1970, although he continued working and died in Montreal in 1975. Writings Books Pillar of Fire. New York: Harcourt, Brace, 1951. Much reprinted, most recently by Urbi Et Orbi Communications, 2001. French translation, Le buisson ardent. Paris: Seuil, 1951. Dutch translation, De vuurzuil. Antwerp: Sheed and Ward, 1951. German translation, Die Feuerwolke. Salzburg: Müller, 1954.
The Third Revolution: A Study of Psychiatry and Religion. New York: Harcourt, Brace, 1954. French translation, La troisième révolution: essai sur la psychanalyse et la religion. Paris: Du Seuil, 1955. German translation, Die dritte Revolution: Psychiatrie und Religion. Salzburg: Otto Müller, 1956. Dutch translation, De derde revolutie: psychiatrie en religie. Utrecht: De Fontein, 1958. Through Dooms of Love: a novel. New York: Farrar, Straus and Cudahy, 1960. The Flight from Woman. New York: Farrar, Straus and Giroux, 1965. Reissued New York: Paragon House, 1985. German translation, Die Flucht vor dem Weib: zur Pathologie des Zeitgeistes. Salzburg: Otto Müller, 1968. French translation, Refus de la femme. Montréal: Éditions HMH, 1968. Love and Success, and other essays. New York: Farrar, Straus and Giroux, 1975. Other writings Preface to Henri Gratton, Psychanalyses d'hier et d'aujourd'hui comme thérapeutiques, sciences et philosophies: introduction aux problèmes de la psychologie des profondeurs. Paris: Cerf, 1955. Essay on St Thérèse of Lisieux, in Saints for Now, edited by Clare Boothe Luce. London and New York: Sheed & Ward, 1952. Reprinted San Francisco: Ignatius Press, 1993. Works about Stern Daniel Burston, A Forgotten Freudian, The Passion of Karl Stern. London: Karnac, 2016. Bernard Heller, Epistle to an Apostate. New York: Bookman's Press, 1951. "Karl Stern", in F. Lelotte (ed.), Convertis du XXème siècle. Vol. 2. Paris and Tournai: Casterman; Brussels: Foyer Notre-Dame, 1954. Reprinted 1963. "Karl Stern", in International Biographical Dictionary of Central European Émigrés 1933-1945. Vol. 2, part 2. Edited by Werner Röder and Herbert A. Strauss. Munich: Saur, 1983. "Karl Stern", in Charles Patrick Connor, Classic Catholic Converts. San Francisco: Ignatius Press, 2001. "Karl Stern", in Lorene Hanley Duquin, A Century of Catholic Converts. Our Sunday Visitor, 2003. Robert B.
McFarland, "Elective Divinities: Exile and Religious Conversion in Alfred Döblin's 'Schicksalsreise' (Destiny's Journey), Karl Jakob Hirsch's 'Heimkehr zu Gott' (Return to God), and Karl Stern's 'The Pillar of Fire'". Christianity & Literature 57:1 (2007), pp. 35–61. References 1906 births 1975 deaths Canadian neurologists Canadian psychiatrists Canadian Roman Catholics Converts to Roman Catholicism from Judaism Jewish emigrants from Nazi Germany to Canada German neurologists Goethe University Frankfurt alumni Ludwig Maximilian University of Munich alumni People from Cham, Germany Roman Catholic writers 20th-century Canadian physicians
Primus inter pares is a Latin phrase meaning first among equals. It is typically used as an honorary title for someone who is formally equal to other members of their group but is accorded unofficial respect, traditionally owing to their seniority in office. Historically, the princeps senatus of the Roman Senate was such a figure, initially bearing only the distinction that he was allowed to speak first during debate. Constantine the Great was likewise given the role of primus inter pares. However, the term is also often used ironically or self-deprecatingly by leaders with much higher status as a form of respect, camaraderie or propaganda. After the fall of the Republic, Roman emperors initially referred to themselves only as princeps despite having enormous power. Various modern figures, such as the chair of the Federal Reserve in the United States, the prime minister in parliamentary systems, the president of the Swiss Confederation, the chief justice of the United States, the chief justice of the Philippines, the archbishop of Canterbury of the Anglican Communion and the ecumenical patriarch of Constantinople of the Eastern Orthodox Church, fall under both senses: bearing higher status and various additional powers while still remaining merely equal to their peers in important senses. National use China In the People's Republic of China, which was placed under the collective leadership of the Politburo Standing Committee following the death of Chairman Mao Zedong, the term "first among equals" was often used to describe China's paramount leader at the zenith of Deng Xiaoping's influence. This has fallen out of favour since the consolidation of power under the current core leader, General Secretary Xi Jinping.
Commonwealth usage Prime minister or premier In the federal Commonwealth realms, Canada and Australia, in which King Charles III is head of state as constitutional monarch, a governor-general is appointed by the King-in-Council to represent the King during his absence. The governor-general typically appoints the leader of the political party holding at least a plurality of seats in the elected legislature to be prime minister, whose relationship with the other ministers of the Crown is in theory said to be that of a primus inter pares, or "first among equals". The same is done at the provincial or state level, wherein the lieutenant governors of the Canadian provinces or governors of the Australian states, as Lieutenant-Governor-in-Council, appoint the leader of the provincial or state political party holding at least a plurality of seats in the elected provincial or state legislature to be provincial or state premier. Viceroys in Canada and Australia Because Canada is a federation, lieutenant-governors represent the Canadian monarch in each of the provinces, thus acting as the "heads of state" in the provinces. Unlike the governors of the Australian states, the lieutenant-governors in Canada are not appointed by the King-in-Council, but by the governor general on the advice of the prime minister of Canada, known as the Governor-in-Council. Similarly, in Australia, governors represent the Australian monarch in each of the states that comprise the federal Commonwealth of Australia, making them "heads of state" in their own states. In each case, these several governors or lieutenant-governors are not envisaged as subordinate to the governor general – the governor-general of Australia or the governor general of Canada, as a federal viceroy, is "first among equals". Germany Mayors of German city states have traditionally acted as primus inter pares.
In Hamburg, Lübeck and Bremen, which had been Free Imperial Cities from the times of the Holy Roman Empire, the government was called the Senate. The mayor was one senator amongst many, often referred to as president of the Senate rather than mayor. This ended in Lübeck with the incorporation into Prussia in 1937. In a constitutional reform in 1996, the mayor of Hamburg was given broad powers to shape the politics of the Senate of Hamburg, thus ending his status as primus inter pares. However, in the city state of the Free Hanseatic City of Bremen, which was created after the Second World War, the mayor has kept a similar role in the Senate of Bremen. The same was true until 1995 for the governing mayor of Berlin among his colleagues within the Senate of Berlin. Japan From the creation of the Cabinet under the Cabinet System Act of 1885 until the present constitution took effect in 1947, the prime minister of Japan was legally considered to be of the same rank as the other ministers who formed the Cabinet. During this time, the prime minister was referred to as "同輩中の首席" dōhai-chū no shuseki ("chief among peers"). Netherlands The prime minister of the Netherlands (officially, the "minister-president") is the chairman of the Council of Ministers and active executive authority of the Dutch government. Although formally no special powers are assigned, the prime minister functions as the "face" of the cabinet of the Netherlands. Usually, the prime minister is also minister of General Affairs. Until 1945, the position of chairman of the Council of Ministers officially switched between the ministers, although practices differed throughout history. In 1945, the position was formally instituted. Although not formally necessary, the prime minister in practice is the leader of the largest party in the majority coalition in the House of Representatives, the lower house of parliament.
Singapore The phrase "first among equals" is often used to describe the political succession within the ruling People's Action Party leadership and the future candidate for prime minister of Singapore. Switzerland In Switzerland, the seven-member Federal Council constitutes the executive in the Swiss directorial system. Each year, the Federal Assembly elects a president of the Confederation. By convention, the positions of president and vice president rotate annually, each councillor thus becoming vice president and then president every seven years while in office. The president is not the Swiss head of state, but is the highest-ranking Swiss official. The president presides over Council meetings and carries out certain representative functions that, in other countries, are the business of the head of state. In urgent situations where a Council decision cannot be made in time, the president is empowered to act on behalf of the whole Council. Apart from that, though, the president is a primus inter pares, having no power above and beyond the other six councillors. United Kingdom The term "prime minister" can be compared to "primary minister" or "first minister". Because of this, the prime ministers of many countries are traditionally considered to be "first among equals" – they are the chairman or "head" of a Cabinet rather than holding an office that is de jure superior to that of ministers. The Prime Minister of the United Kingdom has frequently been described as "first among equals". In the UK, the executive is the Cabinet, and during Hanoverian times a minister had the role of informing the monarch about proposed legislation in the House of Commons and other matters. In modern times, however, although the phrase is still occasionally used, it understates the powers of the prime minister, which now include many broad, exclusive, executive powers over which cabinet members have little influence.
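The Swiss rotation convention described earlier reduces to simple modular arithmetic: with seven councillors and an annual handover, each one presides exactly once in any seven-year window, with the vice president being the following year's president. A toy sketch (the councillor names and start year are invented placeholders, not the actual Federal Council):

```python
# Seven hypothetical councillors, ordered by the rotation convention.
COUNCILLORS = ['A', 'B', 'C', 'D', 'E', 'F', 'G']

def president_for_year(year, start_year=2020):
    """Councillor presiding in a given year under strict annual rotation."""
    return COUNCILLORS[(year - start_year) % len(COUNCILLORS)]

def vice_president_for_year(year, start_year=2020):
    """By convention, the vice president is next year's president."""
    return president_for_year(year + 1, start_year)

print(president_for_year(2020))       # 'A'
print(president_for_year(2027))       # 'A' again, seven years later
print(vice_president_for_year(2020))  # 'B'
```

Running the rotation over any seven consecutive years yields each councillor exactly once, which is the "every seven years while in office" claim in the text.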
First Among Equals is the title of a popular political novel (1984) by Jeffrey Archer, about the careers and private lives of several men vying to become British Prime Minister. It was later adapted into a ten-part TV series, produced by Granada Television. Countries and jurisdictions that have adapted the British parliamentary system (such as Canada and Australia) would have the same use for the phrase. United States The phrase "first among equals" has also been used to describe the Chief Justice of the United States. The Chief Justice has no authority over the decisions of the other Justices, but holds one key administrative power: when the Chief Justice votes with the majority on a decision, he can either author the majority opinion himself or assign it to another Justice voting with the majority. Chairmen In many private parliamentary bodies, such as clubs, boards, educational faculty, and committees, the officer or member who holds the position of chair or chairman is often regarded as a "first among equals". That is, while most rules of order will grant the chair special powers within the context of a meeting, the position of chair is usually temporary, rotating, and powerless in other contexts, making the occupant merely a temporary leader required to instil order. This is the case for mayors under a council–manager government, as the "mayor" has the same vote as all other council members and cannot override them, although their opinion may have more sway among other members. Religion Catholic Church In Latin and Eastern Catholic Churches, the Pope (bishop of Rome) is seen as the Vicar of Christ and "first among equals", the successor of Saint Peter, and leader of the Christian world, in accordance with the rules of Apostolic succession to the apostles. In the Catholic Church, the Pope holds the office with supreme authority in canon law over all other bishops. 
In the Catholic Church, the Dean of the College of Cardinals is the first among the equal Princes of the Church in the college, which is the pope's highest-ranking council and elects the papal successor, generally from its own ranks. Various episcopal sees were granted or claim the title of primate (usually of a past or present political entity), which grants such a primas (usually a metropolitan archbishopric, often in a former or present capital) precedence over all other sees in its circumscription, outranking (other) metropolitan sees, though the incumbent primates can be trumped by personal ranks, as they rank below cardinals. More commonly, dioceses are geographically grouped in an ecclesiastical province, where only one holds the rank of metropolitan archbishop, who outranks his colleagues; these are therefore called his suffragans, even if they include (fairly rarely) another archbishop. Eastern Orthodox Churches The phrase "first among equals" is also used to describe the role of the patriarch of Constantinople, who, as the "ecumenical patriarch", is the first among all the bishops of the Eastern Orthodox Church. He has no direct jurisdiction over the other patriarchs or the other autocephalous Orthodox churches and cannot interfere in the election of bishops in autocephalous churches, but he alone enjoys the right of convening extraordinary synods consisting of them or their delegates to deal with ad hoc situations, and he has also convened well-attended pan-Orthodox synods in the last forty years. His title is an acknowledgement of his historic significance and of his privilege to serve as primary spokesman for the Eastern Orthodox Communion.
Eastern Christians considered the bishop of Rome to be the "first among equals" during the first thousand years of Christianity according to the ancient, first-millennial order (or "taxis" in Greek) of Rome, Constantinople, Alexandria, Antioch, and Jerusalem, known as the Pentarchy, which was established after Constantinople became the eastern capital of the Byzantine Empire. The canons relative to the universal primacy of honor of the patriarch of Constantinople are the 9th canon of the synod of Antioch and the 28th canon of the Council of Chalcedon. Anglican Communion According to the Anglican Covenant, the archbishop of Canterbury is "first among equals" in his presidency over the Anglican Communion. The senior bishop of the seven diocesan bishops of the Scottish Episcopal Church bears the truncated title primus from primus inter pares. Leading bishops or primates in other Anglican 'national' churches are often said to be primus inter pares within their provinces (e.g. Church of Ireland), while the (first) primatial see of Canterbury remains primus among them. However, on 20 February 2023, the Global South Fellowship of Anglican Churches declared that the Archbishop of Canterbury had lost his mantle as first among equals after he accepted the Church of England's incorporation of blessings of same-sex unions into its liturgy. The International Anglican-Catholic Commission for Unity and Mission, in its 2007 agreed statement Growing Together in Unity and Mission, "urge[s] Anglicans and Catholics to explore together how the ministry of the Bishop of Rome might be offered and received in order to assist our Communions to grow towards full, ecclesial communion". Presbyterianism The Moderator of the General Assembly in a Presbyterian church is similarly designated as a primus inter pares. This concept holds also for the Moderators of each Synod, Presbytery, and Kirk Session.
As all elders are ordained – some for teaching and some for ruling – none sit in higher status, but all are considered equal behind the one and only head of the church, Jesus Christ. Church of Sweden In the Lutheran Church of Sweden, the Archbishop of Uppsala is considered by the church as primus inter pares. As such, the Archbishop of Uppsala has no powers over the other 13 bishops but has some additional administrative and spiritual duties, as specified in the Church Order of the Church of Sweden. According to chapter 8 of the Church Order, only the Archbishop of Uppsala can ordain a bishop. The other bishops of the Church of Sweden are peers of, not subordinates to, the Archbishop of Uppsala. Among the Archbishop of Uppsala's other duties is the obligation to convene and chair the Episcopal Assembly. Unlike the other bishops, who are elected to office by members of their diocese, the Archbishop of Uppsala is elected by the entire body of the church. There is a peculiar regulation that stipulates that the total votes cast in the archdiocese of Uppsala, when electing an archbishop, "shall be divided by ten, with decimals removed", before being added to the national vote. See also Egalitarianism Republicanism Animal Farm Notes Footnotes References Latin political words and phrases Titles Primates (bishops) Christian terminology Roman Senate
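The Uppsala vote-weighting rule quoted above ("divided by ten, with decimals removed") is plain integer arithmetic; a toy illustration with invented vote counts:

```python
def weighted_archdiocese_votes(votes_cast):
    """Apply the quoted rule: divide by ten and drop the decimals.

    Integer floor division does exactly this for non-negative counts.
    """
    return votes_cast // 10

# e.g. 1,234 hypothetical votes in the archdiocese contribute 123
# to the national tally; 9 votes contribute nothing at all.
print(weighted_archdiocese_votes(1234))  # 123
print(weighted_archdiocese_votes(9))     # 0
```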
```python
# being a bit too dynamic
# pylint: disable=E1101
from __future__ import division

from distutils.version import LooseVersion


def _mpl_le_1_2_1():
    try:
        import matplotlib as mpl
        return (str(mpl.__version__) <= LooseVersion('1.2.1') and
                str(mpl.__version__)[0] != '0')
    except ImportError:
        return False


def _mpl_ge_1_3_1():
    try:
        import matplotlib
        # The or v[0] == '0' is because their versioneer is
        # messed up on dev
        return (matplotlib.__version__ >= LooseVersion('1.3.1') or
                matplotlib.__version__[0] == '0')
    except ImportError:
        return False


def _mpl_ge_1_4_0():
    try:
        import matplotlib
        return (matplotlib.__version__ >= LooseVersion('1.4') or
                matplotlib.__version__[0] == '0')
    except ImportError:
        return False


def _mpl_ge_1_5_0():
    try:
        import matplotlib
        return (matplotlib.__version__ >= LooseVersion('1.5') or
                matplotlib.__version__[0] == '0')
    except ImportError:
        return False


def _mpl_ge_2_0_0():
    try:
        import matplotlib
        return matplotlib.__version__ >= LooseVersion('2.0')
    except ImportError:
        return False


def _mpl_le_2_0_0():
    try:
        import matplotlib
        return matplotlib.compare_versions('2.0.0', matplotlib.__version__)
    except ImportError:
        return False


def _mpl_ge_2_0_1():
    try:
        import matplotlib
        return matplotlib.__version__ >= LooseVersion('2.0.1')
    except ImportError:
        return False


def _mpl_ge_2_1_0():
    try:
        import matplotlib
        return matplotlib.__version__ >= LooseVersion('2.1')
    except ImportError:
        return False
```
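The helpers above all reduce to one idea: version strings must be compared component-by-component as numbers, not lexicographically, which is what `LooseVersion` provides. A self-contained sketch of that comparison pattern, using a simplified stand-in parser (it handles purely numeric versions only, whereas `LooseVersion` also tolerates letter suffixes):

```python
def parse_version(v):
    """Simplified stand-in for LooseVersion: split on dots, compare numerically."""
    return tuple(int(part) for part in v.split('.'))

def at_least(installed, required):
    """True when the installed version string is >= the required one."""
    return parse_version(installed) >= parse_version(required)

# Numeric comparison gets '1.10.0' vs '1.9.2' right; plain string
# comparison does not, because '1' sorts before '9' character-wise.
print(at_least('1.10.0', '1.9.2'))  # True
print('1.10.0' >= '1.9.2')          # False
```

This is exactly the failure mode the original module guards against: once a library's minor version reaches two digits, every naive string comparison silently inverts.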
```c++
extern "C" {
#include "ring.h"
}
#include "gtimer.h"

GTimer::GTimer(QObject *parent, VM *pVM) : QTimer(parent)
{
	// Keep a reference to the Ring VM and prepare the parameters list.
	this->pVM = pVM;
	this->pParaList = ring_list_new(0);
	strcpy(this->ctimeoutEvent, "");
	QObject::connect(this, SIGNAL(timeout()), this, SLOT(timeoutSlot()));
}

GTimer::~GTimer()
{
	ring_list_delete(this->pParaList);
}

void GTimer::geteventparameters(void)
{
	// pPointer is referenced by the RING_API_RETLIST macro expansion.
	void *pPointer;
	pPointer = this->pVM;
	RING_API_RETLIST(this->pParaList);
}

void GTimer::settimeoutEvent(const char *cStr)
{
	// Store the Ring code to run on timeout, guarding against overflow.
	if ( strlen(cStr) < RINGQT_EVENT_SIZE )
		strcpy(this->ctimeoutEvent, cStr);
	else {
		printf("\nEvent Code: %s\n", cStr);
		ring_vm_error(this->pVM, RINGQT_EVENT_SIZE_ERROR);
	}
}

const char *GTimer::gettimeoutEvent(void)
{
	return this->ctimeoutEvent;
}

void GTimer::timeoutSlot()
{
	// Run the stored Ring code (if any) each time the timer fires.
	if (strcmp(this->ctimeoutEvent, "") == 0)
		return;
	ring_vm_runcode(this->pVM, this->ctimeoutEvent);
}
```
Dom Jerónimo de Azevedo (Estate of Barbosa, Entre-Douro-e-Minho, Portugal, circa 1560 – Lisbon, São Jorge Castle, 1625) was a Portuguese fidalgo, Governor (captain-general) of Portuguese Ceylon and viceroy of Portuguese India. He proclaimed in Colombo, in 1597, the King of Portugal, Philip I, as the legitimate heir to the throne of Kotte, thus substantiating the Portuguese claims of sovereignty over the island of Ceylon. Early life He was born Jerónimo de Azevedo de Ataíde e Malafaya, one of the thirteen children of Dom Manuel de Azevedo, Comendador of the monastery of São João de Alpendurada. He was thus a half-brother of the Jesuit martyr, Blessed Inácio de Azevedo. Not being the firstborn son, he did not inherit his father's estate, which included the lordship of the medieval honras of Barbosa and Ataíde, each with an estimated annual income of 100 thousand reais, a considerable sum in 16th-century Portugal. Dom Jerónimo was thus compelled to follow the example of many second sons of the Portuguese nobility of that era, by emigrating at a young age to the most important of Portugal's overseas possessions, the Estado da Índia, where he made his career. He entered royal service on 25 March 1577, as a page (moço fidalgo) of King Sebastian's household (this was a posting usually reserved for very young members of the nobility, suggesting he was born sometime around 1560, and not in 1540 as stated in some sources). The appointment was made with his going to the East already in view and, some time later, he sailed to India. Governorship of Ceylon Azevedo was Captain-Major of the Malabar coast for a period of 15 years, before being nominated Captain-General of Ceylon in the year 1594. He stayed in charge in Ceylon for a period of 18 years (from Christmas 1594 until November 1612), an unusually long period for holding one office in the Estado da Índia.
He was a key figure in the late 16th- and early 17th-century Portuguese attempts to take full control over the whole territory of present-day Sri Lanka. The touchstone of Portuguese ambitions in Ceylon by the end of the 16th century was the bequest by King Dharmapala of Kotte, in 1580, of his entire realm to the king of Portugal. Dharmapala was a Christian convert, and his bequest was unacceptable to many of his Buddhist subjects and to the ruler of the neighboring kingdom of Kandy. The takeover was therefore resisted, and the Portuguese had to subjugate Kotte by force.

Capture of Maldives

The leader of the Portuguese forces during the capture of the Maldives in 1558 was Dom Jerónimo de Azevedo. He played a significant role in the Portuguese expansion in the Indian Ocean. Dom Jerónimo de Azevedo led the military expedition to the Maldives, and under his command the Portuguese captured the Maldivian capital, Malé, in 1558. The Maldivian ruler at that time was Sultan Ali VI, whom the Portuguese captured during their invasion. However, Portuguese rule in the Maldives was relatively short-lived, lasting about 15 years. The Maldivians successfully revolted against Portuguese rule and, in 1573, managed to drive the Portuguese out of their islands and regain their independence. Maldivians refer to him as Andhiri Andhirin.

Military campaigns in Kotte

Azevedo arrived in Colombo with fresh troops on 24 December 1594, barely three months after his predecessor, Pedro Lopes de Sousa, died at the battle of Danture (9 October 1594). The Portuguese army had been annihilated by the Kandyan forces in that battle, and Azevedo found a kingdom of Kotte in full rebellion. On the first of January 1595 he held a review of the armed forces at his disposal, with king Dharmapala (the source of legitimacy for Portuguese rule in Kotte) at his side. He mustered 900 Portuguese and 2,000 Lascarin soldiers.
He sized up the situation and decided he had to pacify the lowlands of Kotte before he could retaliate against Kandy. A difficult, ruthless and prolonged campaign to crush the revolts in the lowlands was thus begun, and it would absorb most of Azevedo's attention until it was successfully concluded in 1602. By the end of 1601 the king of Portugal was already writing to the viceroy in Goa, promising to remember Azevedo's many services to the crown.

Proclamation of Philip I of Portugal as King of Kotte and the Malvana Convention

His governorship in Kotte is also noted for his handling of complex political issues, which he tried to address in two steps: first, by convening a ceremony for the proclamation of the Portuguese King as sovereign of Kotte and, second, by summoning the Convention of Malvana after the death of king Dharmapala in 1597. On 29 May 1597, just two days after the death of Dharmapala, Azevedo convened a meeting at the Igreja da Misericórdia of Colombo, with the presence of several Portuguese dignitaries and a significant number of court nobles of Kotte. Among the local nobles were a mudalyar (a high military official), an aratchi (captain of a company of Lankan soldiers) and a patangatim (a caste headman among the fishermen). The participants in this assembly had been chosen "by the main vassals of the King of Kotte" and, at the end, an aratchi made an announcement in Sinhala to the crowd gathered in front of the Misericórdia Church, stating that the late King Dharmapala had donated the throne to the King of Portugal, it being now necessary to proclaim him as the new monarch.
The assembly then swore allegiance to Philip I, and a procession followed through the streets of Colombo, with the participants crying out in the traditional Portuguese manner: This ceremony, quickly decided and organized by Azevedo, was thus equivalent to the acclamation of Filipe I as King of Portugal in the cortes of Almeirim, with the purpose of dispelling any doubts about the legitimate right of succession of the Portuguese sovereign to the throne of Kotte. Azevedo then convened the so-called Malvana Convention, a meeting of representatives of all the districts of the kingdom, and accepted, after two days of deliberations, that the native inhabitants of Kotte would keep their laws and customs, though they had to pledge allegiance to the king of Portugal. This convention is mentioned (some 40 years after it was allegedly convened) in a letter of grievance written by Sinhalese leaders to the captain-general, Diogo de Melo e Castro (1636 - 1638).

Opening of Ceylon to the Jesuits

Another issue to which Azevedo dedicated his attention was missionary activity. In 1554, the Portuguese crown had decided that its possessions in Ceylon would be an exclusive preserve of the Franciscans, and it reaffirmed this decision in 1593. Under Azevedo, however, this policy changed. In January 1597 the Bishop of Cochin, D. André de Santa Maria, wrote to the King of Portugal, suggesting that the crown allow the Jesuits to engage in missionary activity in Ceylon, since the resources available to the Franciscans were allegedly "insufficient" for achieving the evangelizing objectives of Portugal. Later that year, the Portuguese crown received specific proposals, based on the initial suggestions of the bishop of Cochin and fully supported by Azevedo, whose brother Inácio was a Jesuit martyr, on how to divide missionary activity in the island between the two religious orders.
After much debate, these proposals were adopted by the crown, and in April 1602 the first four Jesuits arrived in Colombo, backed by a patent issued by Viceroy Aires de Saldanha in Goa on 27 February 1602.

Military campaigns against Kandy

Azevedo was less successful in his attempts to subdue Kandy. He invaded the kingdom in 1603 in a carefully planned operation with a total force of about 1,100 Portuguese and 12,000 Sinhalese, but despite initial successes he was forced to withdraw after a rebellion erupted among the Lascarins, the indigenous troops who fought alongside the Portuguese forces. In contrast to his predecessor Pedro Lopes de Sousa, who died along with his troops in a previous invasion, he managed to prevent the annihilation of the Portuguese forces and showed exceptional military capability during this retreat, which thus came to be known as a famosa retirada. After this setback, Azevedo brought innovations to the formulation of Portuguese military strategy in Ceylon. He decided to abandon the traditional approach of trying to subdue Kandy with a single, decisive military offensive and adopted instead a new strategy based on economic warfare. The Portuguese would now engage in smaller but highly destructive biannual forays and raids deep inside Kandyan territory, burning crops and villages and driving off the cattle. This greatly weakened Kandy; in the words of a Portuguese chronicler, the Jesuit Fernão de Queiroz, the kingdom "never in our time recovered its former opulence and size". However, this strategy aimed at the economic strangulation of the kingdom of Kandy did not produce all the effects expected by Azevedo, because Portuguese traders in Indian port cities such as São Tomé de Meliapor, with the support of the Hindu king at Jaffna, refused to abandon their highly lucrative trade with Kandy.
When he finally left Ceylon for Goa, Azevedo issued directives to his successor, insisting that his new strategy should continue to be followed until Kandy accepted a status of vassalage to the kingdom of Portugal.

Viceroy of Portuguese India

Azevedo was appointed 20th viceroy of Portuguese India on 24 November 1611, and left Colombo for Goa, where he assumed his new duties on 16 December 1612.

Resurgent Portuguese expansionism

According to historians such as Luís Filipe Thomaz and A. R. Disney, at the end of the 16th century and the beginning of the 17th century there was a period of renewed Portuguese expansionism in South and Southeast Asia, which reached its peak precisely when Dom Jerónimo de Azevedo took over the government in Goa. A. R. Disney writes that by the time of Dom Jerónimo de Azevedo's mandate as viceroy the influence of colonials in Goa, as distinct from metropolitan Portuguese, was steadily growing. These colonials had interests and an outlook that did not necessarily coincide with those of the metropolis. In practice, Azevedo, who had come to Asia as a young man and served his entire career in the Portuguese Estado da Índia, was a colonial. This influence of colonials coincided with the above-mentioned period of resurgent Portuguese expansionism in the East, which, however, did not last very long and was already receding by the time his mandate as viceroy ended in 1617. In 1615, Dom Jerónimo backed an audacious expedition to Pegu to loot the Mon imperial treasures in Mrauk-U, an enterprise that ultimately did not succeed. However, the fact that it was supported at such a high official level showed how plunder was considered a legitimate policy objective in 17th-century Portuguese-ruled Asia.
Also in 1615, Azevedo led a huge fleet that tried to drive English East India Company ships under the command of Nicholas Downton off Surat, but after a series of engagements (including a clash in which the viceroy's brother, Dom Manuel de Azevedo, captain of Chaul and Diu, sank a couple of English merchant vessels) he failed to achieve the strategic objective of dislodging the East India Company from the Indian Ocean trade routes, an incident which demonstrated that Portuguese Goa had lost the capacity to protect its monopoly of commerce on the western coast of India.

Geographic explorations

During his government in Goa, Azevedo also had to face the serious threat posed by the increasing pressure of Dutch fleets against the Portuguese possessions and trade routes in the Indian Ocean. His response was not only military, for one surprising byproduct of the Dutch challenge was that it galvanized the Portuguese into undertaking new journeys of geographical exploration, seen as a sort of preemptive strike against the movements of Dutch and English fleets. Between 1613 and 1616, Dom Jerónimo ordered two such expeditions to the coasts of Madagascar, known to the Portuguese as ilha de São Lourenço since the beginning of the 16th century, with the participation of Jesuit priests (the Jesuits were then very much engaged in such activity, including journeys to the interior of China and to Tibet). One of the aims of this project was to evaluate the possibilities for conquest of the island. The expeditions explored the region, prepared a roteiro of the coasts of Madagascar, produced new regional maps and compiled many scientific observations, in a concrete demonstration that the spirit of discovery was still present in the Estado da Índia in the first quarter of the seventeenth century.

Fall from favor at the Habsburg court

Towards the end of his tenure, Azevedo began to lose the favor he had previously enjoyed at the court of the Spanish Habsburgs, who ruled Portugal.
One factor that probably contributed to his downfall was the obstruction he offered, as viceroy in Goa, to the mission of the Castilian national García de Silva Figueroa, appointed by king Philip III of Spain (Filipe II of Portugal) as his envoy to the Shah of Persia. Figueroa left Lisbon in April 1614, headed for a stopover in Goa, where he had strong disagreements with Azevedo that prevented him from reaching his final destination until 1617. Azevedo even ordered Silva Figueroa's detention for a time before finally allowing him to continue his journey to Persia. In this approach, Dom Jerónimo was fully expressing the feelings of the local Portuguese in Goa. They saw with great concern the dispatch of a Spanish national as envoy to a country where Portugal supposedly had exclusive interests that should be preserved, in the framework of the distribution of powers agreed between Portugal and Spain since the beginning of the dynastic union between the two countries in 1580. On his return to Lisbon, Azevedo was held in custody and put on trial on several accusations, including embezzlement. Dom Jerónimo de Azevedo died on 9 March 1625, in São Jorge Castle, before the conclusion of his trial and without the various allegations against him being proven. He was buried in São Roque Church in Lisbon, which was then the home church of the Jesuits in Portugal.

Legacy

Azevedo lived at a time when the Portuguese Empire in the East was already past its heyday and relatively weakened by the arrival in Asian waters of other European powers. But military successes in the 1590s against the kingdoms of Sitawaka and Jaffna presented the Portuguese crown with a unique window of opportunity that led it to decide (in 1594) to attempt the conquest of the entire island of Ceylon. The martial qualities of the governors were thus paramount, and this likely explains Azevedo's long tenure in Colombo, followed by his promotion to viceroy in Goa in 1611.
Azevedo had attempted to resign from the captaincy-general of Ceylon in 1597 and, later on, in February 1603, he was also the target of an inquiry ("devassa") conducted by the Archbishop of Goa for alleged misappropriation of resources. However, as early as 15 March 1603, the king was already writing to the Archbishop to request the postponement of the inquiry, since in the meantime he had received favorable information from the bishop of Cochin concerning Azevedo's military campaigns on the island. Clearly, the king did not want the governor's good military results to be undermined, and in the end he granted Dom Jerónimo a reward of 10,000 reais and a comenda in the Order of Christ. By that time, Azevedo had already acquired in Ceylon the equivalent of the power and mystique of a viceroy. But Azevedo was more than a military leader, for he undertook important political initiatives (such as the negotiations that led to the Malvana Convention) and missionary reforms (the opening of Ceylon to the Jesuits) while in Colombo, and he strongly supported geographical exploration and scientific activities during his tenure in Goa. On balance, it is likely that, more than the accusations of embezzlement (a quite common feature of many previous tenures of governors and viceroys of the Portuguese Empire), it was his politically risky actions against a personal envoy of the king of Spain and Portugal that sealed his final fall from grace at the Habsburg court. The Fort of São Jerónimo (Saint Jerome) in Nani Daman was started during his term as viceroy and is named in his honor. Azevedo was also responsible for the rebuilding of the Idalcão or Adilshahi Palace in Panaji (Goa). The Portuguese city of Porto named a street after him.

References

Bibliography

Tikiri Abeyasinghe, Portuguese Rule in Ceylon, 1594 - 1612, Lake House Investments Ltd. Publishers, 1966 edition, Colombo, passim.
Frederick Charles Danvers, The Portuguese in India: Being a History of the Rise and Decline of Their Eastern Empire: Volume 2, 1894 edition by W. H. Allen & Co., London, page 198
A. R. Disney, A History of Portugal and the Portuguese Empire: From Beginnings to 1807 (Volume 2), Cambridge University Press; 1st edition (13 April 2009), pages 154, 166–167, 168, 191
C. Gaston Perera, Kandy Fights the Portuguese (A Military History of Kandyan Resistance), 2007, Vijith Yapa Publications, Sri Lanka
Fernão de Queiroz, The Temporal and Spiritual Conquest of Ceylon (Volume 2), Colombo: A. C. Richards, 1930. (Originally written and published in Portuguese, 1688)
John F. Riddick, The History of British India: A Chronology, Praeger Publishers, 2006.
Senaka K. Weeraratna, Repression of Buddhism in Sri Lanka by the Portuguese (1505 - 1658)
Chandra Tilake Edirisuriya, Azevedo - The Cruelest Portuguese Viceroys of Portuguese India

Portuguese knights
Portuguese nobility
1540 births
1625 deaths
16th-century Portuguese people
17th-century Portuguese people
Governors of Portuguese Ceylon
Norman C. Stone (April 28, 1939 – April 2, 2021) was an American psychotherapist, philanthropist, vintner and collector of modern and contemporary art.

Biography

Stone, son of Chicago businessman and self-help book author W. Clement Stone, was born in Evanston, Illinois. He held a B.A. in economics from Stanford University and a doctorate from the Wright Institute Graduate School of Social-Clinical Psychology in Berkeley, California. In 1980, Stone served as a staff psychologist at the mental health center of the Bayview Hunters Point Foundation for Community Improvement in San Francisco, counseling patients for schizophrenia, crack addiction and depression. Stone studied painting at the San Francisco Art Institute before attending the Wright Institute. Under the tutelage of Thea Westreich Art Advisory Services in New York, Stone began actively collecting contemporary art in the mid-1980s. From that time, he and his wife, Norah Sharpe Stone (who predeceased him), acquired major pieces from important contemporary artists such as Jan de Cock, Robert Gober, Jeff Koons, Cady Noland, Richard Prince, Richard Serra, Keith Tyson and Christopher Wool. The Stones' collection also features works from such seminal artists as Andy Warhol, Marcel Duchamp, Hans Bellmer and Tony Conrad. The collection is divided between the Stones' primary residence in San Francisco and their Napa Valley wine estate, Stonescape. The latter property has an art cave designed by the Brooklyn architectural firm Bade Stageberg Cox, as well as a pool and pavilion conceptualized by James Turrell, an artist noted for his famed Skyspaces, and executed by Jim Jennings. The landscape was designed by Tom Leader. Stone was president of the W. Clement and Jessie V. Stone Foundation as well as a trustee of the San Francisco Museum of Modern Art. Both he and his wife Norah were members of the National Committee of the Whitney Museum in New York and of the Tate International Council in London.
He was a co-founder of the Nueva School in Hillsborough, California. Stonescape is located in the Diamond Mountain District AVA of the Napa Valley appellation. Since the 1990s, the property has produced Merlot wines under the Azalea Springs label. The Stones replanted their vineyards in 2002 with premium Cabernet Sauvignon vines, producing wine under the AZS label. Stone was a member of the Napa Valley Vintners, a non-profit trade association. He died on April 2, 2021, in San Francisco.

References

External links

Napa Valley Vintners

1939 births
American winemakers
Wine merchants
American art collectors
```java
package io.ray.runtime.functionmanager;

import com.google.common.base.Objects;
import com.google.common.collect.ImmutableList;
import io.ray.runtime.generated.Common.Language;
import java.io.Serializable;
import java.util.List;

/** Represents metadata of a Java function. */
public final class JavaFunctionDescriptor implements FunctionDescriptor, Serializable {

  private static final long serialVersionUID = -2137471820857197094L;

  /** Function's class name. */
  public final String className;

  /** Function's name. */
  public final String name;

  /** Function's signature. */
  public final String signature;

  public JavaFunctionDescriptor(String className, String name, String signature) {
    this.className = className;
    this.name = name;
    this.signature = signature;
  }

  @Override
  public String toString() {
    return className + "." + name;
  }

  @Override
  public boolean equals(Object o) {
    if (this == o) {
      return true;
    }
    if (o == null || getClass() != o.getClass()) {
      return false;
    }
    JavaFunctionDescriptor that = (JavaFunctionDescriptor) o;
    return Objects.equal(className, that.className)
        && Objects.equal(name, that.name)
        && Objects.equal(signature, that.signature);
  }

  @Override
  public int hashCode() {
    return Objects.hashCode(className, name, signature);
  }

  @Override
  public List<String> toList() {
    return ImmutableList.of(className, name, signature);
  }

  @Override
  public Language getLanguage() {
    return Language.JAVA;
  }
}
```
```c++
#include "../src/meshoptimizer.h"

#include <assert.h>
#include <math.h>
#include <stdlib.h>
#include <string.h>

#include <vector>

// This file uses assert() to verify algorithm correctness
#undef NDEBUG
#include <assert.h>

struct PV
{
    unsigned short px, py, pz;
    unsigned char nu, nv; // octahedron encoded normal, aliases .pw
    unsigned short tx, ty;
};

// note: 4 6 5 triangle here is a combo-breaker:
// we encode it without rotating, a=next, c=next - this means we do *not* bump next to 6
// which means that the next triangle can't be encoded via next sequencing!
static const unsigned int kIndexBuffer[] = {0, 1, 2, 2, 1, 3, 4, 6, 5, 7, 8, 9};

static const unsigned char kIndexDataV0[] = {
    0xe0, 0xf0, 0x10, 0xfe, 0xff, 0xf0, 0x0c, 0xff, 0x02, 0x02,
    0x02, 0x00, 0x76, 0x87, 0x56, 0x67, 0x78, 0xa9, 0x86, 0x65,
    0x89, 0x68, 0x98, 0x01, 0x69, 0x00, 0x00,
    // clang-format :-/
};

// note: this exercises two features of v1 format, restarts (0 1 2) and last
static const unsigned int kIndexBufferTricky[] = {0, 1, 2, 2, 1, 3, 0, 1, 2, 2, 1, 5, 2, 1, 4};

static const unsigned char kIndexDataV1[] = {
    0xe1, 0xf0, 0x10, 0xfe, 0x1f, 0x3d, 0x00, 0x0a, 0x00, 0x76,
    0x87, 0x56, 0x67, 0x78, 0xa9, 0x86, 0x65, 0x89, 0x68, 0x98,
    0x01, 0x69, 0x00, 0x00,
    // clang-format :-/
};

static const unsigned int kIndexSequence[] = {0, 1, 51, 2, 49, 1000};

static const unsigned char kIndexSequenceV1[] = {
    0xd1, 0x00, 0x04, 0xcd, 0x01, 0x04, 0x07, 0x98, 0x1f, 0x00, 0x00, 0x00, 0x00,
    // clang-format :-/
};

static const PV kVertexBuffer[] = {
    {0, 0, 0, 0, 0, 0, 0},
    {300, 0, 0, 0, 0, 500, 0},
    {0, 300, 0, 0, 0, 0, 500},
    {300, 300, 0, 0, 0, 500, 500},
};

static const unsigned char kVertexDataV0[] = {
    0xa0, 0x01, 0x3f, 0x00, 0x00, 0x00, 0x58, 0x57, 0x58, 0x01,
    0x26, 0x00, 0x00, 0x00, 0x01, 0x0c, 0x00, 0x00, 0x00, 0x58,
    0x01, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x01,
    0x3f, 0x00, 0x00, 0x00, 0x17, 0x18, 0x17, 0x01, 0x26, 0x00,
    0x00, 0x00, 0x01, 0x0c, 0x00, 0x00, 0x00, 0x17, 0x01, 0x08,
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
    0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
    0x00, 0x00, 0x00, 0x00, 0x00,
    // clang-format :-/
};

static void decodeIndexV0()
{
    const size_t index_count = sizeof(kIndexBuffer) / sizeof(kIndexBuffer[0]);

    std::vector<unsigned char> buffer(kIndexDataV0, kIndexDataV0 + sizeof(kIndexDataV0));

    unsigned int decoded[index_count];
    assert(meshopt_decodeIndexBuffer(decoded, index_count, &buffer[0], buffer.size()) == 0);
    assert(memcmp(decoded, kIndexBuffer, sizeof(kIndexBuffer)) == 0);
}

static void decodeIndexV1()
{
    const size_t index_count = sizeof(kIndexBufferTricky) / sizeof(kIndexBufferTricky[0]);

    std::vector<unsigned char> buffer(kIndexDataV1, kIndexDataV1 + sizeof(kIndexDataV1));

    unsigned int decoded[index_count];
    assert(meshopt_decodeIndexBuffer(decoded, index_count, &buffer[0], buffer.size()) == 0);
    assert(memcmp(decoded, kIndexBufferTricky, sizeof(kIndexBufferTricky)) == 0);
}

static void decodeIndex16()
{
    const size_t index_count = sizeof(kIndexBuffer) / sizeof(kIndexBuffer[0]);
    const size_t vertex_count = 10;

    std::vector<unsigned char> buffer(meshopt_encodeIndexBufferBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexBuffer(&buffer[0], buffer.size(), kIndexBuffer, index_count));

    unsigned short decoded[index_count];
    assert(meshopt_decodeIndexBuffer(decoded, index_count, &buffer[0], buffer.size()) == 0);

    for (size_t i = 0; i < index_count; ++i)
        assert(decoded[i] == kIndexBuffer[i]);
}

static void encodeIndexMemorySafe()
{
    const size_t index_count = sizeof(kIndexBuffer) / sizeof(kIndexBuffer[0]);
    const size_t vertex_count = 10;

    std::vector<unsigned char> buffer(meshopt_encodeIndexBufferBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexBuffer(&buffer[0], buffer.size(), kIndexBuffer, index_count));

    // check that encode is memory-safe; note that we reallocate the buffer for each try to make sure ASAN can verify buffer access
    for (size_t i = 0; i <= buffer.size(); ++i)
    {
        std::vector<unsigned char> shortbuffer(i);
        size_t result = meshopt_encodeIndexBuffer(i == 0 ? NULL : &shortbuffer[0], i, kIndexBuffer, index_count);

        if (i == buffer.size())
            assert(result == buffer.size());
        else
            assert(result == 0);
    }
}

static void decodeIndexMemorySafe()
{
    const size_t index_count = sizeof(kIndexBuffer) / sizeof(kIndexBuffer[0]);
    const size_t vertex_count = 10;

    std::vector<unsigned char> buffer(meshopt_encodeIndexBufferBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexBuffer(&buffer[0], buffer.size(), kIndexBuffer, index_count));

    // check that decode is memory-safe; note that we reallocate the buffer for each try to make sure ASAN can verify buffer access
    unsigned int decoded[index_count];
    for (size_t i = 0; i <= buffer.size(); ++i)
    {
        std::vector<unsigned char> shortbuffer(buffer.begin(), buffer.begin() + i);
        int result = meshopt_decodeIndexBuffer(decoded, index_count, i == 0 ? NULL : &shortbuffer[0], i);

        if (i == buffer.size())
            assert(result == 0);
        else
            assert(result < 0);
    }
}

static void decodeIndexRejectExtraBytes()
{
    const size_t index_count = sizeof(kIndexBuffer) / sizeof(kIndexBuffer[0]);
    const size_t vertex_count = 10;

    std::vector<unsigned char> buffer(meshopt_encodeIndexBufferBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexBuffer(&buffer[0], buffer.size(), kIndexBuffer, index_count));

    // check that decoder doesn't accept extra bytes after a valid stream
    std::vector<unsigned char> largebuffer(buffer);
    largebuffer.push_back(0);

    unsigned int decoded[index_count];
    assert(meshopt_decodeIndexBuffer(decoded, index_count, &largebuffer[0], largebuffer.size()) < 0);
}

static void decodeIndexRejectMalformedHeaders()
{
    const size_t index_count = sizeof(kIndexBuffer) / sizeof(kIndexBuffer[0]);
    const size_t vertex_count = 10;

    std::vector<unsigned char> buffer(meshopt_encodeIndexBufferBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexBuffer(&buffer[0], buffer.size(), kIndexBuffer, index_count));

    // check that decoder doesn't accept malformed headers
    std::vector<unsigned char> brokenbuffer(buffer);
    brokenbuffer[0] = 0;

    unsigned int decoded[index_count];
    assert(meshopt_decodeIndexBuffer(decoded, index_count, &brokenbuffer[0], brokenbuffer.size()) < 0);
}

static void decodeIndexRejectInvalidVersion()
{
    const size_t index_count = sizeof(kIndexBuffer) / sizeof(kIndexBuffer[0]);
    const size_t vertex_count = 10;

    std::vector<unsigned char> buffer(meshopt_encodeIndexBufferBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexBuffer(&buffer[0], buffer.size(), kIndexBuffer, index_count));

    // check that decoder doesn't accept invalid version
    std::vector<unsigned char> brokenbuffer(buffer);
    brokenbuffer[0] |= 0x0f;

    unsigned int decoded[index_count];
    assert(meshopt_decodeIndexBuffer(decoded, index_count, &brokenbuffer[0], brokenbuffer.size()) < 0);
}

static void decodeIndexMalformedVByte()
{
    const unsigned char input[] = {
        0xe1, 0x20, 0x20, 0x20, 0xff, 0x20, 0x20, 0x20, 0x20, 0x20,
        0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20,
        0x20, 0x20, 0x20, 0x20, 0xff, 0xff, 0xff, 0xff, 0x20, 0x20,
        0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20,
        // clang-format :-/
    };

    unsigned int decoded[66];
    assert(meshopt_decodeIndexBuffer(decoded, 66, input, sizeof(input)) < 0);
}

static void roundtripIndexTricky()
{
    const size_t index_count = sizeof(kIndexBufferTricky) / sizeof(kIndexBufferTricky[0]);
    const size_t vertex_count = 6;

    std::vector<unsigned char> buffer(meshopt_encodeIndexBufferBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexBuffer(&buffer[0], buffer.size(), kIndexBufferTricky, index_count));

    unsigned int decoded[index_count];
    assert(meshopt_decodeIndexBuffer(decoded, index_count, &buffer[0], buffer.size()) == 0);
    assert(memcmp(decoded, kIndexBufferTricky, sizeof(kIndexBufferTricky)) == 0);
}

static void encodeIndexEmpty()
{
    std::vector<unsigned char> buffer(meshopt_encodeIndexBufferBound(0, 0));
    buffer.resize(meshopt_encodeIndexBuffer(&buffer[0], buffer.size(), NULL, 0));

    assert(meshopt_decodeIndexBuffer(static_cast<unsigned int*>(NULL), 0, &buffer[0], buffer.size()) == 0);
}

static void decodeIndexSequence()
{
    const size_t index_count = sizeof(kIndexSequence) / sizeof(kIndexSequence[0]);

    std::vector<unsigned char> buffer(kIndexSequenceV1, kIndexSequenceV1 + sizeof(kIndexSequenceV1));

    unsigned int decoded[index_count];
    assert(meshopt_decodeIndexSequence(decoded, index_count, &buffer[0], buffer.size()) == 0);
    assert(memcmp(decoded, kIndexSequence, sizeof(kIndexSequence)) == 0);
}

static void decodeIndexSequence16()
{
    const size_t index_count = sizeof(kIndexSequence) / sizeof(kIndexSequence[0]);
    const size_t vertex_count = 1001;

    std::vector<unsigned char> buffer(meshopt_encodeIndexSequenceBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexSequence(&buffer[0], buffer.size(), kIndexSequence, index_count));

    unsigned short decoded[index_count];
    assert(meshopt_decodeIndexSequence(decoded, index_count, &buffer[0], buffer.size()) == 0);

    for (size_t i = 0; i < index_count; ++i)
        assert(decoded[i] == kIndexSequence[i]);
}

static void encodeIndexSequenceMemorySafe()
{
    const size_t index_count = sizeof(kIndexSequence) / sizeof(kIndexSequence[0]);
    const size_t vertex_count = 1001;

    std::vector<unsigned char> buffer(meshopt_encodeIndexSequenceBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexSequence(&buffer[0], buffer.size(), kIndexSequence, index_count));

    // check that encode is memory-safe; note that we reallocate the buffer for each try to make sure ASAN can verify buffer access
    for (size_t i = 0; i <= buffer.size(); ++i)
    {
        std::vector<unsigned char> shortbuffer(i);
        size_t result = meshopt_encodeIndexSequence(i == 0 ? NULL : &shortbuffer[0], i, kIndexSequence, index_count);

        if (i == buffer.size())
            assert(result == buffer.size());
        else
            assert(result == 0);
    }
}

static void decodeIndexSequenceMemorySafe()
{
    const size_t index_count = sizeof(kIndexSequence) / sizeof(kIndexSequence[0]);
    const size_t vertex_count = 1001;

    std::vector<unsigned char> buffer(meshopt_encodeIndexSequenceBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexSequence(&buffer[0], buffer.size(), kIndexSequence, index_count));

    // check that decode is memory-safe; note that we reallocate the buffer for each try to make sure ASAN can verify buffer access
    unsigned int decoded[index_count];
    for (size_t i = 0; i <= buffer.size(); ++i)
    {
        std::vector<unsigned char> shortbuffer(buffer.begin(), buffer.begin() + i);
        int result = meshopt_decodeIndexSequence(decoded, index_count, i == 0 ? NULL : &shortbuffer[0], i);

        if (i == buffer.size())
            assert(result == 0);
        else
            assert(result < 0);
    }
}

static void decodeIndexSequenceRejectExtraBytes()
{
    const size_t index_count = sizeof(kIndexSequence) / sizeof(kIndexSequence[0]);
    const size_t vertex_count = 1001;

    std::vector<unsigned char> buffer(meshopt_encodeIndexSequenceBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexSequence(&buffer[0], buffer.size(), kIndexSequence, index_count));

    // check that decoder doesn't accept extra bytes after a valid stream
    std::vector<unsigned char> largebuffer(buffer);
    largebuffer.push_back(0);

    unsigned int decoded[index_count];
    assert(meshopt_decodeIndexSequence(decoded, index_count, &largebuffer[0], largebuffer.size()) < 0);
}

static void decodeIndexSequenceRejectMalformedHeaders()
{
    const size_t index_count = sizeof(kIndexSequence) / sizeof(kIndexSequence[0]);
    const size_t vertex_count = 1001;

    std::vector<unsigned char> buffer(meshopt_encodeIndexSequenceBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexSequence(&buffer[0], buffer.size(), kIndexSequence, index_count));

    // check that decoder doesn't accept malformed headers
    std::vector<unsigned char> brokenbuffer(buffer);
    brokenbuffer[0] = 0;

    unsigned int decoded[index_count];
    assert(meshopt_decodeIndexSequence(decoded, index_count, &brokenbuffer[0], brokenbuffer.size()) < 0);
}

static void decodeIndexSequenceRejectInvalidVersion()
{
    const size_t index_count = sizeof(kIndexSequence) / sizeof(kIndexSequence[0]);
    const size_t vertex_count = 1001;

    std::vector<unsigned char> buffer(meshopt_encodeIndexSequenceBound(index_count, vertex_count));
    buffer.resize(meshopt_encodeIndexSequence(&buffer[0], buffer.size(), kIndexSequence, index_count));

    // check that decoder doesn't accept invalid version
    std::vector<unsigned char> brokenbuffer(buffer);
    brokenbuffer[0] |= 0x0f;

    unsigned int decoded[index_count];
    assert(meshopt_decodeIndexSequence(decoded, index_count, &brokenbuffer[0], brokenbuffer.size()) < 0);
}

static void encodeIndexSequenceEmpty()
{
    std::vector<unsigned char> buffer(meshopt_encodeIndexSequenceBound(0, 0));
    buffer.resize(meshopt_encodeIndexSequence(&buffer[0], buffer.size(), NULL, 0));

    assert(meshopt_decodeIndexSequence(static_cast<unsigned int*>(NULL), 0, &buffer[0], buffer.size()) == 0);
}

static void decodeVertexV0()
{
    const size_t vertex_count = sizeof(kVertexBuffer) / sizeof(kVertexBuffer[0]);

    std::vector<unsigned char> buffer(kVertexDataV0, kVertexDataV0 + sizeof(kVertexDataV0));

    PV decoded[vertex_count];
    assert(meshopt_decodeVertexBuffer(decoded, vertex_count, sizeof(PV), &buffer[0], buffer.size()) == 0);
    assert(memcmp(decoded, kVertexBuffer, sizeof(kVertexBuffer)) == 0);
}

static void encodeVertexMemorySafe()
{
    const size_t vertex_count = sizeof(kVertexBuffer) / sizeof(kVertexBuffer[0]);

    std::vector<unsigned char> buffer(meshopt_encodeVertexBufferBound(vertex_count, sizeof(PV)));
    buffer.resize(meshopt_encodeVertexBuffer(&buffer[0], buffer.size(), kVertexBuffer, vertex_count, sizeof(PV)));

    // check that encode is memory-safe; note that we reallocate the buffer for each try to make sure ASAN can verify buffer access
    for (size_t i = 0; i <= buffer.size(); ++i)
    {
        std::vector<unsigned char> shortbuffer(i);
        size_t result = meshopt_encodeVertexBuffer(i == 0 ? NULL : &shortbuffer[0], i, kVertexBuffer, vertex_count, sizeof(PV));

        if (i == buffer.size())
            assert(result == buffer.size());
        else
            assert(result == 0);
    }
}

static void decodeVertexMemorySafe()
{
    const size_t vertex_count = sizeof(kVertexBuffer) / sizeof(kVertexBuffer[0]);

    std::vector<unsigned char> buffer(meshopt_encodeVertexBufferBound(vertex_count, sizeof(PV)));
    buffer.resize(meshopt_encodeVertexBuffer(&buffer[0], buffer.size(), kVertexBuffer, vertex_count, sizeof(PV)));

    // check that decode is memory-safe; note that we reallocate the buffer for each try to make sure ASAN can verify buffer access
    PV decoded[vertex_count];
    for (size_t i = 0; i <= buffer.size(); ++i)
    {
        std::vector<unsigned char> shortbuffer(buffer.begin(), buffer.begin() + i);
        int result = meshopt_decodeVertexBuffer(decoded, vertex_count, sizeof(PV), i == 0 ? NULL : &shortbuffer[0], i);
        (void)result;

        if (i == buffer.size())
            assert(result == 0);
        else
            assert(result < 0);
    }
}

static void decodeVertexRejectExtraBytes()
{
    const size_t vertex_count = sizeof(kVertexBuffer) / sizeof(kVertexBuffer[0]);

    std::vector<unsigned char> buffer(meshopt_encodeVertexBufferBound(vertex_count, sizeof(PV)));
    buffer.resize(meshopt_encodeVertexBuffer(&buffer[0], buffer.size(), kVertexBuffer, vertex_count, sizeof(PV)));

    // check that decoder doesn't accept extra bytes after a valid stream
    std::vector<unsigned char> largebuffer(buffer);
    largebuffer.push_back(0);

    PV decoded[vertex_count];
    assert(meshopt_decodeVertexBuffer(decoded, vertex_count, sizeof(PV), &largebuffer[0], largebuffer.size()) < 0);
}

static void decodeVertexRejectMalformedHeaders()
{
    const size_t vertex_count = sizeof(kVertexBuffer) / sizeof(kVertexBuffer[0]);

    std::vector<unsigned char> buffer(meshopt_encodeVertexBufferBound(vertex_count, sizeof(PV)));
    buffer.resize(meshopt_encodeVertexBuffer(&buffer[0], buffer.size(), kVertexBuffer, vertex_count, sizeof(PV)));

    // check that decoder doesn't accept malformed headers
    std::vector<unsigned char> brokenbuffer(buffer);
    brokenbuffer[0] = 0;

    PV decoded[vertex_count];
    assert(meshopt_decodeVertexBuffer(decoded, vertex_count, sizeof(PV), &brokenbuffer[0], brokenbuffer.size()) < 0);
}

static void decodeVertexBitGroups()
{
    unsigned char data[16 * 4];

    // this tests 0/2/4/8 bit groups in one stream
    for (size_t i = 0; i < 16; ++i)
    {
        data[i * 4 + 0] = 0;
        data[i * 4 + 1] = (unsigned char)(i * 1);
        data[i * 4 + 2] = (unsigned char)(i * 2);
        data[i * 4 + 3] = (unsigned char)(i * 8);
    }

    std::vector<unsigned char> buffer(meshopt_encodeVertexBufferBound(16, 4));
    buffer.resize(meshopt_encodeVertexBuffer(&buffer[0], buffer.size(), data, 16, 4));

    unsigned char decoded[16 * 4];
    assert(meshopt_decodeVertexBuffer(decoded, 16, 4, &buffer[0], buffer.size()) == 0);
    assert(memcmp(decoded, data, sizeof(data)) == 0);
}

static void decodeVertexBitGroupSentinels()
{
    unsigned char data[16 * 4];

    // this tests 0/2/4/8 bit groups and sentinels in one stream
    for (size_t i = 0; i < 16; ++i)
    {
        if (i == 7 || i == 13)
        {
            data[i * 4 + 0] = 42;
            data[i * 4 + 1] = 42;
            data[i * 4 + 2] = 42;
            data[i * 4 + 3] = 42;
        }
        else
        {
            data[i * 4 + 0] = 0;
            data[i * 4 + 1] = (unsigned char)(i * 1);
            data[i * 4 + 2] = (unsigned char)(i * 2);
            data[i * 4 + 3] = (unsigned char)(i * 8);
        }
    }

    std::vector<unsigned char> buffer(meshopt_encodeVertexBufferBound(16, 4));
    buffer.resize(meshopt_encodeVertexBuffer(&buffer[0], buffer.size(), data, 16, 4));

    unsigned char decoded[16 * 4];
    assert(meshopt_decodeVertexBuffer(decoded, 16, 4, &buffer[0], buffer.size()) == 0);
    assert(memcmp(decoded, data, sizeof(data)) == 0);
}

static void decodeVertexLarge()
{
    unsigned char data[128 * 4];

    // this tests 0/2/4/8 bit groups in one stream
    for (size_t i = 0; i < 128; ++i)
    {
        data[i * 4 + 0] = 0;
        data[i * 4 + 1] = (unsigned char)(i * 1);
        data[i * 4 + 2] =
(unsigned char)(i * 2); data[i * 4 + 3] = (unsigned char)(i * 8); } std::vector<unsigned char> buffer(meshopt_encodeVertexBufferBound(128, 4)); buffer.resize(meshopt_encodeVertexBuffer(&buffer[0], buffer.size(), data, 128, 4)); unsigned char decoded[128 * 4]; assert(meshopt_decodeVertexBuffer(decoded, 128, 4, &buffer[0], buffer.size()) == 0); assert(memcmp(decoded, data, sizeof(data)) == 0); } static void encodeVertexEmpty() { std::vector<unsigned char> buffer(meshopt_encodeVertexBufferBound(0, 16)); buffer.resize(meshopt_encodeVertexBuffer(&buffer[0], buffer.size(), NULL, 0, 16)); assert(meshopt_decodeVertexBuffer(NULL, 0, 16, &buffer[0], buffer.size()) == 0); } static void decodeFilterOct8() { const unsigned char data[4 * 4] = { 0, 1, 127, 0, 0, 187, 127, 1, 255, 1, 127, 0, 14, 130, 127, 1, // clang-format :-/ }; const unsigned char expected[4 * 4] = { 0, 1, 127, 0, 0, 159, 82, 1, 255, 1, 127, 0, 1, 130, 241, 1, // clang-format :-/ }; // Aligned by 4 unsigned char full[4 * 4]; memcpy(full, data, sizeof(full)); meshopt_decodeFilterOct(full, 4, 4); assert(memcmp(full, expected, sizeof(full)) == 0); // Tail processing for unaligned data unsigned char tail[3 * 4]; memcpy(tail, data, sizeof(tail)); meshopt_decodeFilterOct(tail, 3, 4); assert(memcmp(tail, expected, sizeof(tail)) == 0); } static void decodeFilterOct12() { const unsigned short data[4 * 4] = { 0, 1, 2047, 0, 0, 1870, 2047, 1, 2017, 1, 2047, 0, 14, 1300, 2047, 1, // clang-format :-/ }; const unsigned short expected[4 * 4] = { 0, 16, 32767, 0, 0, 32621, 3088, 1, 32764, 16, 471, 0, 307, 28541, 16093, 1, // clang-format :-/ }; // Aligned by 4 unsigned short full[4 * 4]; memcpy(full, data, sizeof(full)); meshopt_decodeFilterOct(full, 4, 8); assert(memcmp(full, expected, sizeof(full)) == 0); // Tail processing for unaligned data unsigned short tail[3 * 4]; memcpy(tail, data, sizeof(tail)); meshopt_decodeFilterOct(tail, 3, 8); assert(memcmp(tail, expected, sizeof(tail)) == 0); } static void decodeFilterQuat12() 
{ const unsigned short data[4 * 4] = { 0, 1, 0, 0x7fc, 0, 1870, 0, 0x7fd, 2017, 1, 0, 0x7fe, 14, 1300, 0, 0x7ff, // clang-format :-/ }; const unsigned short expected[4 * 4] = { 32767, 0, 11, 0, 0, 25013, 0, 21166, 11, 0, 23504, 22830, 158, 14715, 0, 29277, // clang-format :-/ }; // Aligned by 4 unsigned short full[4 * 4]; memcpy(full, data, sizeof(full)); meshopt_decodeFilterQuat(full, 4, 8); assert(memcmp(full, expected, sizeof(full)) == 0); // Tail processing for unaligned data unsigned short tail[3 * 4]; memcpy(tail, data, sizeof(tail)); meshopt_decodeFilterQuat(tail, 3, 8); assert(memcmp(tail, expected, sizeof(tail)) == 0); } static void decodeFilterExp() { const unsigned int data[4] = { 0, 0xff000003, 0x02fffff7, 0xfe7fffff, // clang-format :-/ }; const unsigned int expected[4] = { 0, 0x3fc00000, 0xc2100000, 0x49fffffe, // clang-format :-/ }; // Aligned by 4 unsigned int full[4]; memcpy(full, data, sizeof(full)); meshopt_decodeFilterExp(full, 4, 4); assert(memcmp(full, expected, sizeof(full)) == 0); // Tail processing for unaligned data unsigned int tail[3]; memcpy(tail, data, sizeof(tail)); meshopt_decodeFilterExp(tail, 3, 4); assert(memcmp(tail, expected, sizeof(tail)) == 0); } void encodeFilterOct8() { const float data[4 * 4] = { 1, 0, 0, 0, 0, -1, 0, 0, 0.7071068f, 0, 0.707168f, 1, -0.7071068f, 0, -0.707168f, 1, // clang-format :-/ }; const unsigned char expected[4 * 4] = { 0x7f, 0, 0x7f, 0, 0, 0x81, 0x7f, 0, 0x3f, 0, 0x7f, 0x7f, 0x81, 0x40, 0x7f, 0x7f, // clang-format :-/ }; unsigned char encoded[4 * 4]; meshopt_encodeFilterOct(encoded, 4, 4, 8, data); assert(memcmp(encoded, expected, sizeof(expected)) == 0); signed char decoded[4 * 4]; memcpy(decoded, encoded, sizeof(decoded)); meshopt_decodeFilterOct(decoded, 4, 4); for (size_t i = 0; i < 4 * 4; ++i) assert(fabsf(decoded[i] / 127.f - data[i]) < 1e-2f); } void encodeFilterOct12() { const float data[4 * 4] = { 1, 0, 0, 0, 0, -1, 0, 0, 0.7071068f, 0, 0.707168f, 1, -0.7071068f, 0, -0.707168f, 1, // 
clang-format :-/ }; const unsigned short expected[4 * 4] = { 0x7ff, 0, 0x7ff, 0, 0x0, 0xf801, 0x7ff, 0, 0x3ff, 0, 0x7ff, 0x7fff, 0xf801, 0x400, 0x7ff, 0x7fff, // clang-format :-/ }; unsigned short encoded[4 * 4]; meshopt_encodeFilterOct(encoded, 4, 8, 12, data); assert(memcmp(encoded, expected, sizeof(expected)) == 0); short decoded[4 * 4]; memcpy(decoded, encoded, sizeof(decoded)); meshopt_decodeFilterOct(decoded, 4, 8); for (size_t i = 0; i < 4 * 4; ++i) assert(fabsf(decoded[i] / 32767.f - data[i]) < 1e-3f); } void encodeFilterQuat12() { const float data[4 * 4] = { 1, 0, 0, 0, 0, -1, 0, 0, 0.7071068f, 0, 0, 0.707168f, -0.7071068f, 0, 0, -0.707168f, // clang-format :-/ }; const unsigned short expected[4 * 4] = { 0, 0, 0, 0x7fc, 0, 0, 0, 0x7fd, 0x7ff, 0, 0, 0x7ff, 0x7ff, 0, 0, 0x7ff, // clang-format :-/ }; unsigned short encoded[4 * 4]; meshopt_encodeFilterQuat(encoded, 4, 8, 12, data); assert(memcmp(encoded, expected, sizeof(expected)) == 0); short decoded[4 * 4]; memcpy(decoded, encoded, sizeof(decoded)); meshopt_decodeFilterQuat(decoded, 4, 8); for (size_t i = 0; i < 4; ++i) { float dx = decoded[i * 4 + 0] / 32767.f; float dy = decoded[i * 4 + 1] / 32767.f; float dz = decoded[i * 4 + 2] / 32767.f; float dw = decoded[i * 4 + 3] / 32767.f; float dp = data[i * 4 + 0] * dx + data[i * 4 + 1] * dy + data[i * 4 + 2] * dz + data[i * 4 + 3] * dw; assert(fabsf(fabsf(dp) - 1.f) < 1e-4f); } } void encodeFilterExp() { const float data[4] = { 1, -23.4f, -0.1f, 11.0f, }; // separate exponents: each component gets its own value const unsigned int expected1[4] = { 0xf3002000, 0xf7ffd133, 0xefffcccd, 0xf6002c00, }; // shared exponents (vector): all components of each vector get the same value const unsigned int expected2[4] = { 0xf7000200, 0xf7ffd133, 0xf6ffff9a, 0xf6002c00, }; // shared exponents (component): each component gets the same value across all vectors const unsigned int expected3[4] = { 0xf3002000, 0xf7ffd133, 0xf3fffccd, 0xf7001600, }; unsigned int encoded1[4]; 
meshopt_encodeFilterExp(encoded1, 2, 8, 15, data, meshopt_EncodeExpSeparate); unsigned int encoded2[4]; meshopt_encodeFilterExp(encoded2, 2, 8, 15, data, meshopt_EncodeExpSharedVector); unsigned int encoded3[4]; meshopt_encodeFilterExp(encoded3, 2, 8, 15, data, meshopt_EncodeExpSharedComponent); assert(memcmp(encoded1, expected1, sizeof(expected1)) == 0); assert(memcmp(encoded2, expected2, sizeof(expected2)) == 0); assert(memcmp(encoded3, expected3, sizeof(expected3)) == 0); float decoded1[4]; memcpy(decoded1, encoded1, sizeof(decoded1)); meshopt_decodeFilterExp(decoded1, 2, 8); float decoded2[4]; memcpy(decoded2, encoded2, sizeof(decoded2)); meshopt_decodeFilterExp(decoded2, 2, 8); float decoded3[4]; memcpy(decoded3, encoded3, sizeof(decoded3)); meshopt_decodeFilterExp(decoded3, 2, 8); for (size_t i = 0; i < 4; ++i) { assert(fabsf(decoded1[i] - data[i]) < 1e-3f); assert(fabsf(decoded2[i] - data[i]) < 1e-3f); assert(fabsf(decoded3[i] - data[i]) < 1e-3f); } } void encodeFilterExpZero() { const float data = 0.f; const unsigned int expected = 0xf2000000; unsigned int encoded; meshopt_encodeFilterExp(&encoded, 1, 4, 15, &data, meshopt_EncodeExpSeparate); assert(encoded == expected); float decoded; memcpy(&decoded, &encoded, sizeof(decoded)); meshopt_decodeFilterExp(&decoded, 1, 4); assert(decoded == data); } static void clusterBoundsDegenerate() { const float vbd[] = {0, 0, 0, 0, 0, 0, 0, 0, 0}; const unsigned int ibd[] = {0, 0, 0}; const unsigned int ib1[] = {0, 1, 2}; // all of the bounds below are degenerate as they use 0 triangles, one topology-degenerate triangle and one position-degenerate triangle respectively meshopt_Bounds bounds0 = meshopt_computeClusterBounds(NULL, 0, NULL, 0, 12); meshopt_Bounds boundsd = meshopt_computeClusterBounds(ibd, 3, vbd, 3, 12); meshopt_Bounds bounds1 = meshopt_computeClusterBounds(ib1, 3, vbd, 3, 12); assert(bounds0.center[0] == 0 && bounds0.center[1] == 0 && bounds0.center[2] == 0 && bounds0.radius == 0); assert(boundsd.center[0] 
== 0 && boundsd.center[1] == 0 && boundsd.center[2] == 0 && boundsd.radius == 0); assert(bounds1.center[0] == 0 && bounds1.center[1] == 0 && bounds1.center[2] == 0 && bounds1.radius == 0); const float vb1[] = {1, 0, 0, 0, 1, 0, 0, 0, 1}; const unsigned int ib2[] = {0, 1, 2, 0, 2, 1}; // these bounds have a degenerate cone since the cluster has two triangles with opposite normals meshopt_Bounds bounds2 = meshopt_computeClusterBounds(ib2, 6, vb1, 3, 12); assert(bounds2.cone_apex[0] == 0 && bounds2.cone_apex[1] == 0 && bounds2.cone_apex[2] == 0); assert(bounds2.cone_axis[0] == 0 && bounds2.cone_axis[1] == 0 && bounds2.cone_axis[2] == 0); assert(bounds2.cone_cutoff == 1); assert(bounds2.cone_axis_s8[0] == 0 && bounds2.cone_axis_s8[1] == 0 && bounds2.cone_axis_s8[2] == 0); assert(bounds2.cone_cutoff_s8 == 127); // however, the bounding sphere needs to be in tact (here we only check bbox for simplicity) assert(bounds2.center[0] - bounds2.radius <= 0 && bounds2.center[0] + bounds2.radius >= 1); assert(bounds2.center[1] - bounds2.radius <= 0 && bounds2.center[1] + bounds2.radius >= 1); assert(bounds2.center[2] - bounds2.radius <= 0 && bounds2.center[2] + bounds2.radius >= 1); } static size_t allocCount; static size_t freeCount; static void* customAlloc(size_t size) { allocCount++; return malloc(size); } static void customFree(void* ptr) { freeCount++; free(ptr); } static void customAllocator() { meshopt_setAllocator(customAlloc, customFree); assert(allocCount == 0 && freeCount == 0); float vb[] = {1, 0, 0, 0, 1, 0, 0, 0, 1}; unsigned int ib[] = {0, 1, 2}; unsigned short ibs[] = {0, 1, 2}; // meshopt_computeClusterBounds doesn't allocate meshopt_computeClusterBounds(ib, 3, vb, 3, 12); assert(allocCount == 0 && freeCount == 0); // ... 
unless IndexAdapter is used meshopt_computeClusterBounds(ibs, 3, vb, 3, 12); assert(allocCount == 1 && freeCount == 1); // meshopt_optimizeVertexFetch allocates internal remap table and temporary storage for in-place remaps meshopt_optimizeVertexFetch(vb, ib, 3, vb, 3, 12); assert(allocCount == 3 && freeCount == 3); // ... plus one for IndexAdapter meshopt_optimizeVertexFetch(vb, ibs, 3, vb, 3, 12); assert(allocCount == 6 && freeCount == 6); meshopt_setAllocator(operator new, operator delete); // customAlloc & customFree should not get called anymore meshopt_optimizeVertexFetch(vb, ib, 3, vb, 3, 12); assert(allocCount == 6 && freeCount == 6); allocCount = freeCount = 0; } static void emptyMesh() { meshopt_optimizeVertexCache(NULL, NULL, 0, 0); meshopt_optimizeVertexCacheFifo(NULL, NULL, 0, 0, 16); meshopt_optimizeOverdraw(NULL, NULL, 0, NULL, 0, 12, 1.f); } static void simplify() { // 0 // 1 2 // 3 4 5 unsigned int ib[] = { 0, 2, 1, 1, 2, 3, 3, 2, 4, 2, 5, 4, }; float vb[] = { 0, 4, 0, 0, 1, 0, 2, 2, 0, 0, 0, 0, 1, 0, 0, 4, 0, 0, }; unsigned int expected[] = { 0, 5, 3, }; float error; assert(meshopt_simplify(ib, ib, 12, vb, 6, 12, 3, 1e-2f, 0, &error) == 3); assert(error == 0.f); assert(memcmp(ib, expected, sizeof(expected)) == 0); } static void simplifyStuck() { // tetrahedron can't be simplified due to collapse error restrictions float vb1[] = {0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1}; unsigned int ib1[] = {0, 1, 2, 0, 2, 3, 0, 3, 1, 2, 1, 3}; assert(meshopt_simplify(ib1, ib1, 12, vb1, 4, 12, 6, 1e-3f) == 12); // 5-vertex strip can't be simplified due to topology restriction since middle triangle has flipped winding float vb2[] = {0, 0, 0, 1, 0, 0, 2, 0, 0, 0.5f, 1, 0, 1.5f, 1, 0}; unsigned int ib2[] = {0, 1, 3, 3, 1, 4, 1, 2, 4}; // ok unsigned int ib3[] = {0, 1, 3, 1, 3, 4, 1, 2, 4}; // flipped assert(meshopt_simplify(ib2, ib2, 9, vb2, 5, 12, 6, 1e-3f) == 6); assert(meshopt_simplify(ib3, ib3, 9, vb2, 5, 12, 6, 1e-3f) == 9); // 4-vertex quad with a locked corner 
can't be simplified due to border error-induced restriction float vb4[] = {0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0}; unsigned int ib4[] = {0, 1, 3, 0, 3, 2}; assert(meshopt_simplify(ib4, ib4, 6, vb4, 4, 12, 3, 1e-3f) == 6); // 4-vertex quad with a locked corner can't be simplified due to border error-induced restriction float vb5[] = {0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 1, 0}; unsigned int ib5[] = {0, 1, 4, 0, 3, 2}; assert(meshopt_simplify(ib5, ib5, 6, vb5, 5, 12, 3, 1e-3f) == 6); } static void simplifySloppyStuck() { const float vb[] = {0, 0, 0, 0, 0, 0, 0, 0, 0}; const unsigned int ib[] = {0, 1, 2, 0, 1, 2}; unsigned int* target = NULL; // simplifying down to 0 triangles results in 0 immediately assert(meshopt_simplifySloppy(target, ib, 3, vb, 3, 12, 0, 0.f) == 0); // simplifying down to 2 triangles given that all triangles are degenerate results in 0 as well assert(meshopt_simplifySloppy(target, ib, 6, vb, 3, 12, 6, 0.f) == 0); } static void simplifyPointsStuck() { const float vb[] = {0, 0, 0, 0, 0, 0, 0, 0, 0}; // simplifying down to 0 points results in 0 immediately assert(meshopt_simplifyPoints(NULL, vb, 3, 12, NULL, 0, 0, 0) == 0); } static void simplifyFlip() { // this mesh has been constructed by taking a tessellated irregular grid with a square cutout // and progressively collapsing edges until the only ones left violate border or flip constraints. // there is only one valid non-flip collapse, so we validate that we take it; when flips are allowed, // the wrong collapse is picked instead. 
float vb[] = { 1.000000f, 1.000000f, -1.000000f, 1.000000f, 1.000000f, 1.000000f, 1.000000f, -1.000000f, 1.000000f, 1.000000f, -0.200000f, -0.200000f, 1.000000f, 0.200000f, -0.200000f, 1.000000f, -0.200000f, 0.200000f, 1.000000f, 0.200000f, 0.200000f, 1.000000f, 0.500000f, -0.500000f, 1.000000f, -1.000000f, 0.000000f, // clang-format :-/ }; // the collapse we expect is 7 -> 0 unsigned int ib[] = { 7, 4, 3, 1, 2, 5, 7, 1, 6, 7, 8, 0, // gets removed 7, 6, 4, 8, 5, 2, 8, 7, 3, 8, 3, 5, 5, 6, 1, 7, 0, 1, // gets removed }; unsigned int expected[] = { 0, 4, 3, 1, 2, 5, 0, 1, 6, 0, 6, 4, 8, 5, 2, 8, 0, 3, 8, 3, 5, 5, 6, 1, // clang-format :-/ }; assert(meshopt_simplify(ib, ib, 30, vb, 9, 12, 3, 1e-3f) == 24); assert(memcmp(ib, expected, sizeof(expected)) == 0); } static void simplifyScale() { const float vb[] = {0, 0, 0, 1, 0, 0, 0, 2, 0, 0, 0, 3}; assert(meshopt_simplifyScale(vb, 4, 12) == 3.f); } static void simplifyDegenerate() { float vb[] = { 0.000000f, 0.000000f, 0.000000f, 0.000000f, 1.000000f, 0.000000f, 0.000000f, 2.000000f, 0.000000f, 1.000000f, 0.000000f, 0.000000f, 2.000000f, 0.000000f, 0.000000f, 1.000000f, 1.000000f, 0.000000f, // clang-format :-/ }; // 0 1 2 // 3 5 // 4 unsigned int ib[] = { 0, 1, 3, 3, 1, 5, 1, 2, 5, 3, 5, 4, 1, 0, 1, // these two degenerate triangles create a fake reverse edge 0, 3, 0, // which breaks border classification }; unsigned int expected[] = { 0, 1, 4, 4, 1, 2, // clang-format :-/ }; assert(meshopt_simplify(ib, ib, 18, vb, 6, 12, 3, 1e-3f) == 6); assert(memcmp(ib, expected, sizeof(expected)) == 0); } static void simplifyLockBorder() { float vb[] = { 0.000000f, 0.000000f, 0.000000f, 0.000000f, 1.000000f, 0.000000f, 0.000000f, 2.000000f, 0.000000f, 1.000000f, 0.000000f, 0.000000f, 1.000000f, 1.000000f, 0.000000f, 1.000000f, 2.000000f, 0.000000f, 2.000000f, 0.000000f, 0.000000f, 2.000000f, 1.000000f, 0.000000f, 2.000000f, 2.000000f, 0.000000f, // clang-format :-/ }; // 0 1 2 // 3 4 5 // 6 7 8 unsigned int ib[] = { 0, 1, 3, 3, 1, 
4, 1, 2, 4, 4, 2, 5, 3, 4, 6, 6, 4, 7, 4, 5, 7, 7, 5, 8, // clang-format :-/ }; unsigned int expected[] = { 0, 1, 3, 1, 2, 3, 3, 2, 5, 6, 3, 7, 3, 5, 7, 7, 5, 8, // clang-format :-/ }; assert(meshopt_simplify(ib, ib, 24, vb, 9, 12, 3, 1e-3f, meshopt_SimplifyLockBorder) == 18); assert(memcmp(ib, expected, sizeof(expected)) == 0); } static void simplifyAttr(bool skip_g) { float vb[8 * 3][6]; for (int y = 0; y < 8; ++y) { // first four rows are a blue gradient, next four rows are a yellow gradient float r = (y < 4) ? 0.8f + y * 0.05f : 0.f; float g = (y < 4) ? 0.8f + y * 0.05f : 0.f; float b = (y < 4) ? 0.f : 0.8f + (7 - y) * 0.05f; for (int x = 0; x < 3; ++x) { vb[y * 3 + x][0] = float(x); vb[y * 3 + x][1] = float(y); vb[y * 3 + x][2] = 0.03f * x; vb[y * 3 + x][3] = r; vb[y * 3 + x][4] = g; vb[y * 3 + x][5] = b; } } unsigned int ib[7 * 2][6]; for (int y = 0; y < 7; ++y) { for (int x = 0; x < 2; ++x) { ib[y * 2 + x][0] = (y + 0) * 3 + (x + 0); ib[y * 2 + x][1] = (y + 0) * 3 + (x + 1); ib[y * 2 + x][2] = (y + 1) * 3 + (x + 0); ib[y * 2 + x][3] = (y + 1) * 3 + (x + 0); ib[y * 2 + x][4] = (y + 0) * 3 + (x + 1); ib[y * 2 + x][5] = (y + 1) * 3 + (x + 1); } } float attr_weights[3] = {0.01f, skip_g ? 
0.f : 0.01f, 0.01f}; unsigned int expected[3][6] = { {0, 2, 9, 9, 2, 11}, {9, 11, 12, 12, 11, 14}, {12, 14, 21, 21, 14, 23}, }; assert(meshopt_simplifyWithAttributes(ib[0], ib[0], 7 * 2 * 6, vb[0], 8 * 3, 6 * sizeof(float), vb[0] + 3, 6 * sizeof(float), attr_weights, 3, NULL, 6 * 3, 1e-2f) == 18); assert(memcmp(ib, expected, sizeof(expected)) == 0); } static void simplifyLockFlags() { float vb[] = { 0, 0, 0, 0, 1, 0, 0, 2, 0, 1, 0, 0, 1, 1, 0, 1, 2, 0, 2, 0, 0, 2, 1, 0, 2, 2, 0, // clang-format :-/ }; unsigned char lock[9] = { 1, 1, 1, 1, 0, 1, 1, 1, 1, // clang-format :-/ }; // 0 1 2 // 3 4 5 // 6 7 8 unsigned int ib[] = { 0, 1, 3, 3, 1, 4, 1, 2, 4, 4, 2, 5, 3, 4, 6, 6, 4, 7, 4, 5, 7, 7, 5, 8, // clang-format :-/ }; unsigned int expected[] = { 0, 1, 3, 1, 2, 3, 3, 2, 5, 6, 3, 7, 3, 5, 7, 7, 5, 8, // clang-format :-/ }; assert(meshopt_simplifyWithAttributes(ib, ib, 24, vb, 9, 12, NULL, 0, NULL, 0, lock, 3, 1e-3f, 0) == 18); assert(memcmp(ib, expected, sizeof(expected)) == 0); } static void simplifySparse() { float vb[] = { 0, 0, 100, 0, 1, 0, 0, 2, 100, 1, 0, 0.1f, 1, 1, 0.1f, 1, 2, 0.1f, 2, 0, 100, 2, 1, 0, 2, 2, 100, // clang-format :-/ }; float vba[] = { 100, 0.5f, 100, 0.5f, 0.5f, 0, 100, 0.5f, 100, // clang-format :-/ }; float aw[] = { 0.5f}; unsigned char lock[9] = { 8, 1, 8, 1, 0, 1, 8, 1, 8, // clang-format :-/ }; // 1 // 3 4 5 // 7 unsigned int ib[] = { 3, 1, 4, 1, 5, 4, 3, 4, 7, 4, 5, 7, // clang-format :-/ }; unsigned int res[12]; // vertices 3-4-5 are slightly elevated along Z which guides the collapses when only using geometry unsigned int expected[] = { 1, 5, 3, 3, 5, 7, // clang-format :-/ }; assert(meshopt_simplify(res, ib, 12, vb, 9, 12, 6, 1e-3f, meshopt_SimplifySparse) == 6); assert(memcmp(res, expected, sizeof(expected)) == 0); // vertices 1-4-7 have a crease in the attribute value which guides the collapses the opposite way when weighing attributes sufficiently unsigned int expecteda[] = { 3, 1, 7, 1, 5, 7, // clang-format :-/ }; 
assert(meshopt_simplifyWithAttributes(res, ib, 12, vb, 9, 12, vba, sizeof(float), aw, 1, lock, 6, 1e-1f, meshopt_SimplifySparse) == 6); assert(memcmp(res, expecteda, sizeof(expecteda)) == 0); // a final test validates that destination can alias when using sparsity assert(meshopt_simplify(ib, ib, 12, vb, 9, 12, 6, 1e-3f, meshopt_SimplifySparse) == 6); assert(memcmp(ib, expected, sizeof(expected)) == 0); } static void simplifyErrorAbsolute() { float vb[] = { 0, 0, 0, 0, 1, 0, 0, 2, 0, 1, 0, 0, 1, 1, 1, 1, 2, 0, 2, 0, 0, 2, 1, 0, 2, 2, 0, // clang-format :-/ }; // 0 1 2 // 3 4 5 // 6 7 8 unsigned int ib[] = { 0, 1, 3, 3, 1, 4, 1, 2, 4, 4, 2, 5, 3, 4, 6, 6, 4, 7, 4, 5, 7, 7, 5, 8, // clang-format :-/ }; float error = 0.f; assert(meshopt_simplify(ib, ib, 24, vb, 9, 12, 18, 2.f, meshopt_SimplifyLockBorder | meshopt_SimplifyErrorAbsolute, &error) == 18); assert(fabsf(error - 0.85f) < 0.01f); } static void simplifySeam() { // xyz+attr float vb[] = { 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 2, 0, 1, 1, 0, 0, 0, 1, 1, 0.3f, 0, 1, 1, 0.3f, 1, 1, 2, 0, 1, 2, 0, 0, 0, 2, 1, 0.1f, 0, 2, 1, 0.1f, 1, 2, 2, 0, 1, 3, 0, 0, 0, 3, 1, 0, 0, 3, 1, 0, 1, 3, 2, 0, 1, // clang-format :-/ }; // 0 1-2 3 // 4 5-6 7 // 8 9-10 11 // 12 13-14 15 unsigned int ib[] = { 0, 1, 4, 4, 1, 5, 2, 3, 6, 6, 3, 7, 4, 5, 8, 8, 5, 9, 6, 7, 10, 10, 7, 11, 8, 9, 12, 12, 9, 13, 10, 11, 14, 14, 11, 15, // clang-format :-/ }; // note: vertices 1-2 and 13-14 are classified as locked, because they are on a seam & a border // since seam->locked collapses are restriced, we only get to 3 triangles on each side as the seam is simplified to 3 vertices // so we get this structure initially, and then one of the internal seam vertices is collapsed to the other one: // 0 1-2 3 // 5-6 // 9-10 // 12 13-14 15 unsigned int expected[] = { 0, 1, 5, 2, 3, 6, 0, 5, 12, 12, 5, 13, 6, 3, 14, 14, 3, 15, // clang-format :-/ }; unsigned int res[36]; float error = 0.f; assert(meshopt_simplify(res, ib, 36, vb, 16, 16, 18, 1.f, 0, &error) == 
18); assert(memcmp(res, expected, sizeof(expected)) == 0); assert(fabsf(error - 0.04f) < 0.01f); // note: the error is not zero because there is a small difference in height between the seam vertices float aw = 1; assert(meshopt_simplifyWithAttributes(res, ib, 36, vb, 16, 16, vb + 3, 16, &aw, 1, NULL, 18, 2.f, 0, &error) == 18); assert(memcmp(res, expected, sizeof(expected)) == 0); assert(fabsf(error - 0.04f) < 0.01f); // note: this is the same error as above because the attribute is constant on either side of the seam } static void adjacency() { // 0 1/4 // 2/5 3 const float vb[] = {0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0}; const unsigned int ib[] = {0, 1, 2, 5, 4, 3}; unsigned int adjib[12]; meshopt_generateAdjacencyIndexBuffer(adjib, ib, 6, vb, 6, 12); unsigned int expected[] = { // patch 0 0, 0, 1, 3, 2, 2, // patch 1 5, 0, 4, 4, 3, 3, // clang-format :-/ }; assert(memcmp(adjib, expected, sizeof(expected)) == 0); } static void tessellation() { // 0 1/4 // 2/5 3 const float vb[] = {0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 0}; const unsigned int ib[] = {0, 1, 2, 5, 4, 3}; unsigned int tessib[24]; meshopt_generateTessellationIndexBuffer(tessib, ib, 6, vb, 6, 12); unsigned int expected[] = { // patch 0 0, 1, 2, 0, 1, 4, 5, 2, 0, 0, 1, 2, // patch 1 5, 4, 3, 2, 1, 4, 3, 3, 5, 2, 1, 3, // clang-format :-/ }; assert(memcmp(tessib, expected, sizeof(expected)) == 0); } static void quantizeFloat() { volatile float zero = 0.f; // avoids div-by-zero warnings assert(meshopt_quantizeFloat(1.2345f, 23) == 1.2345f); assert(meshopt_quantizeFloat(1.2345f, 16) == 1.2344971f); assert(meshopt_quantizeFloat(1.2345f, 8) == 1.2343750f); assert(meshopt_quantizeFloat(1.2345f, 4) == 1.25f); assert(meshopt_quantizeFloat(1.2345f, 1) == 1.0); assert(meshopt_quantizeFloat(1.f, 0) == 1.0f); assert(meshopt_quantizeFloat(1.f / zero, 0) == 1.f / zero); assert(meshopt_quantizeFloat(-1.f / zero, 0) == -1.f / zero); float nanf = meshopt_quantizeFloat(zero / zero, 8); 
assert(nanf != nanf); } static void quantizeHalf() { volatile float zero = 0.f; // avoids div-by-zero warnings // normal assert(meshopt_quantizeHalf(1.2345f) == 0x3cf0); // overflow assert(meshopt_quantizeHalf(65535.f) == 0x7c00); assert(meshopt_quantizeHalf(-65535.f) == 0xfc00); // large assert(meshopt_quantizeHalf(65000.f) == 0x7bef); assert(meshopt_quantizeHalf(-65000.f) == 0xfbef); // small assert(meshopt_quantizeHalf(0.125f) == 0x3000); assert(meshopt_quantizeHalf(-0.125f) == 0xb000); // very small assert(meshopt_quantizeHalf(1e-4f) == 0x068e); assert(meshopt_quantizeHalf(-1e-4f) == 0x868e); // underflow assert(meshopt_quantizeHalf(1e-5f) == 0x0000); assert(meshopt_quantizeHalf(-1e-5f) == 0x8000); // exponent underflow assert(meshopt_quantizeHalf(1e-20f) == 0x0000); assert(meshopt_quantizeHalf(-1e-20f) == 0x8000); // exponent overflow assert(meshopt_quantizeHalf(1e20f) == 0x7c00); assert(meshopt_quantizeHalf(-1e20f) == 0xfc00); // inf assert(meshopt_quantizeHalf(1.f / zero) == 0x7c00); assert(meshopt_quantizeHalf(-1.f / zero) == 0xfc00); // nan unsigned short nanh = meshopt_quantizeHalf(zero / zero); assert(nanh == 0x7e00 || nanh == 0xfe00); } static void dequantizeHalf() { volatile float zero = 0.f; // avoids div-by-zero warnings // normal assert(meshopt_dequantizeHalf(0x3cf0) == 1.234375f); // large assert(meshopt_dequantizeHalf(0x7bef) == 64992.f); assert(meshopt_dequantizeHalf(0xfbef) == -64992.f); // small assert(meshopt_dequantizeHalf(0x3000) == 0.125f); assert(meshopt_dequantizeHalf(0xb000) == -0.125f); // very small assert(meshopt_dequantizeHalf(0x068e) == 1.00016594e-4f); assert(meshopt_dequantizeHalf(0x868e) == -1.00016594e-4f); // denormal assert(meshopt_dequantizeHalf(0x00ff) == 0.f); assert(meshopt_dequantizeHalf(0x80ff) == 0.f); // actually this is -0.f assert(1.f / meshopt_dequantizeHalf(0x80ff) == -1.f / zero); // inf assert(meshopt_dequantizeHalf(0x7c00) == 1.f / zero); assert(meshopt_dequantizeHalf(0xfc00) == -1.f / zero); // nan float nanf = 
meshopt_dequantizeHalf(0x7e00); assert(nanf != nanf); } void runTests() { decodeIndexV0(); decodeIndexV1(); decodeIndex16(); encodeIndexMemorySafe(); decodeIndexMemorySafe(); decodeIndexRejectExtraBytes(); decodeIndexRejectMalformedHeaders(); decodeIndexRejectInvalidVersion(); decodeIndexMalformedVByte(); roundtripIndexTricky(); encodeIndexEmpty(); decodeIndexSequence(); decodeIndexSequence16(); encodeIndexSequenceMemorySafe(); decodeIndexSequenceMemorySafe(); decodeIndexSequenceRejectExtraBytes(); decodeIndexSequenceRejectMalformedHeaders(); decodeIndexSequenceRejectInvalidVersion(); encodeIndexSequenceEmpty(); decodeVertexV0(); encodeVertexMemorySafe(); decodeVertexMemorySafe(); decodeVertexRejectExtraBytes(); decodeVertexRejectMalformedHeaders(); decodeVertexBitGroups(); decodeVertexBitGroupSentinels(); decodeVertexLarge(); encodeVertexEmpty(); decodeFilterOct8(); decodeFilterOct12(); decodeFilterQuat12(); decodeFilterExp(); encodeFilterOct8(); encodeFilterOct12(); encodeFilterQuat12(); encodeFilterExp(); encodeFilterExpZero(); clusterBoundsDegenerate(); customAllocator(); emptyMesh(); simplify(); simplifyStuck(); simplifySloppyStuck(); simplifyPointsStuck(); simplifyFlip(); simplifyScale(); simplifyDegenerate(); simplifyLockBorder(); simplifyAttr(/* skip_g= */ false); simplifyAttr(/* skip_g= */ true); simplifyLockFlags(); simplifySparse(); simplifyErrorAbsolute(); simplifySeam(); adjacency(); tessellation(); quantizeFloat(); quantizeHalf(); dequantizeHalf(); } ```
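The constants asserted in `quantizeHalf`/`dequantizeHalf` above can be cross-checked with a standalone float-to-half conversion. The sketch below is not meshoptimizer's implementation; `floatToHalf` is a hypothetical helper, and it deliberately ignores NaN payloads and denormal outputs (flushing sub-normal-half values to zero), which is enough to reproduce the normal, underflow, and overflow cases tested above.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>

// Hypothetical reference conversion to IEEE 754 binary16.
// Values below 2^-14 (smallest normal half) flush to zero; values that do
// not fit round or clamp to infinity. NaN is not handled in this sketch.
static uint16_t floatToHalf(float v)
{
	uint32_t ui;
	std::memcpy(&ui, &v, sizeof(ui)); // reinterpret float bits safely

	uint32_t s = (ui >> 16) & 0x8000u;      // sign bit, moved to half position
	int32_t em = int32_t(ui & 0x7fffffffu); // biased exponent + 23-bit mantissa

	// rebias the exponent from 127 to 15 and shift the mantissa from 23 to
	// 10 bits; adding (1 << 12) before the shift rounds the dropped bits
	int32_t h = (em - (112 << 23) + (1 << 12)) >> 13;

	h = (em < (113 << 23)) ? 0 : h;       // below 2^-14: flush to zero
	h = (em >= (143 << 23)) ? 0x7c00 : h; // 2^16 and up: clamp to infinity

	return uint16_t(s | uint32_t(h));
}
```

Against the expectations in the tests, `floatToHalf(1.2345f)` yields `0x3cf0` and `floatToHalf(65000.f)` yields `0x7bef`; `65535.f` produces `0x7c00` because the rounding carry overflows the 10-bit mantissa into the exponent, matching the overflow asserts above.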
Sandwich class is an informal term used in Singapore and Hong Kong to refer to the middle class. Generally, the sandwich class consists of lower-middle-class people who feel "squeezed": although they are not poor, they cannot achieve the aspirations open to people with higher incomes.

In Hong Kong, this comprises families with an income of US$20,000–40,000 per year. Per capita income in Hong Kong is typically around US$10,000 per year, so this places them well above the average family in the territory. However, given very high real estate prices, it is nowhere near enough for them to afford a private residence. Hence, they are "sandwiched" between the large population who truly need public assistance and the smaller number of people who can afford private residences and other luxury goods.

In Singapore, the sandwich class typically refers to the middle class who are "sandwiched" between luxuries and basic necessities. They generally have to support ageing parents as well as growing children, and their household income is usually around S$10,000. Typical issues include being unable to upgrade to private property, being unable to afford a desired lifestyle or the means to support it, caring for both parents and children, and being unable to retire early.

See also
Sandwich generation
Sandwich Class Housing Scheme in Hong Kong
My Home Purchase Plan

References
Pallandri, also spelled Palandri (), originally Pulandari, is a tehsil that serves as the administrative capital of Sudhanoti district in Azad Kashmir. It is located at latitude 33° 42′ 54″ N, longitude 73° 41′ 9″ E, from Islamabad, the capital of Pakistan, and is connected with Rawalpindi and Islamabad through the Azad Pattan road. The main tribe of Pallandri is the Sudhan tribe.

History
Pallandri was central to the violent anti-government 1955 Poonch uprising, which was led by the Sudhans. The Sudhans were angered by the removal of Sardar Ibrahim Khan, and an assassination attempt on Sher Ahmed Khan in February 1955 marked the beginning of the uprising.

Administration
Sudhanoti district is divided into four tehsils, Pallandri, Mong, Tarar Khel and Balouch, with Pallandri serving as the district headquarters. Jinjahell, the first capital of Azad Kashmir, is about 20 kilometers from Pallandri. Pallandri lies at an elevation of 1372 meters and is from Rawalpindi via Azad Pattan. The district is connected to Rawalakot by a metaled road.

Educational institutes
For a town as small as Pallandri, it has many educational institutions, with more than 50 colleges and schools. Among the most prominent schools are Oxford Model Public High School, Fauji Foundation, Bilal Gul School and College, Pilot High School and the Read Foundation. The best-known institution is Cadet College Pallandri, situated about 100 km from Islamabad; the college complex sits on the south-eastern flank of the town. The first private English-medium school in Pallandri was the Jaffer Ali Khan School (formerly KG School), run by Sir Hameed (late). Mirpur University of Science and Technology (MUST) has also established a campus in Pallandri.

Notable people
Khan Muhammad Khan, member of the Jammu and Kashmir legislative assembly (Praja Sabha) from 1934 to 1946; Chairman of the War Council of Azad Jammu and Kashmir in 1947 and later member of the Defence Council; founder of the Sudhan Educational Conference.
Sher Ahmed Khan (late), 4th President of AJK (22 June 1952 – 31 May 1956) Sardar Sabz Ali Khan and Mali Khan, 1832 Lt Col Muhammad Naqi Khan (late), ex-MLA and Minister for Health and Food Brigadier M. Sadiq Khan (late) - Assistant Chief of Staff, CENTO, Ankara, Turkey (1970–73) - Chairman of the Governor's/Chief Minister's Inspection, Enquiries & Anti-Corruption Department and Secretary of the Government of the Punjab (1978–87) - Member, Punjab Public Service Commission (1988–90) - Minister for Communications & Works, Housing & Physical Planning and Transport Departments, Government of the Punjab (1993) - author of the book Yagana-e-Kashmir General Muhammad Aziz Khan, a retired four-star general in the Pakistan Army who served as Chairman of the Joint Chiefs of Staff Committee from October 2001 until his retirement in 2005 Dr Muhammad Najeeb Naqi Khan, Minister for Health and Finance, elected as a member of the Azad Kashmir Legislative Assembly five times (1991, 1996, 2006, 2011, 2016) from the Pallandri constituency, and a member of the Kashmir Council from 2001 to 2006 Brigadier Arshad Iqbal, a serving one-star general, station commander Mangla Cantt and commander of the planning and logistics area. On 23 March 2021, for his meritorious contributions to the security and national interests of Pakistan, world peace and culture, he was among the few Pakistanis awarded the Sitara-i-Imtiaz (Military), a great honour for the whole of Kashmir and Pallandri. Notes References External links www.pallandri.com http://www.dostpakistan.pk/pallandri/ Success in Kashmir earthquake response Populated places in Sudhanoti District Tehsils of Sudhanoti District
WFOT (89.5 FM) is a non-commercial radio station licensed to Lexington, Ohio, featuring a Catholic-based format as a repeater station in the Annunciation Radio network. Owned by Our Lady of Guadalupe Radio, Inc. (d/b/a Annunciation Radio), the station serves the Mansfield, Ashland and Mount Vernon areas as an affiliate of EWTN Radio and Ave Maria Radio. In addition to a standard analog transmission, WFOT's programming is available online. History WFOT began in February 2007 as a near-simulcast of WVKO in Columbus from 2007 to 2011, and then of WVSG in Columbus from 2011 until July 2013, when it was part of St. Gabriel Radio. St. Gabriel Radio began broadcasting in 2005 on the former WUCO (now WDLR) in Marysville, Ohio, until 2010. Though WUCO was the first to air Catholic programming for the Columbus Diocese, this made WFOT the second such station in that diocese. In June 2013, Annunciation Radio purchased WFOT, and on July 11, 2013 at 3:00 pm, programming transitioned from St. Gabriel Radio to Annunciation Radio. The sale and transfer of WFOT's license was granted by the FCC on June 24, 2013, making WFOT the fifth full-time Catholic station in the Toledo Diocese and the third station in Annunciation's regional network. External links Catholic radio stations FOT Radio stations established in 2007 2007 establishments in Ohio
The 69th Regiment Armory is a historic National Guard armory building located at 68 Lexington Avenue between East 25th and 26th Streets in the Rose Hill section of Manhattan, New York City. Construction of the building began in 1904 and was completed in 1906. The armory was designed by the firm of Hunt & Hunt, and was the first armory built in New York City not modeled on a medieval fortress; instead, it was designed in the Beaux-Arts style. The Armory was the site of the controversial 1913 Armory Show, in which modern art was first publicly presented in the United States, through the efforts of Irish American collector John Quinn. As planned, the armory had 12 company rooms, a rifle gallery, and various social rooms. It has a 5,000-seat arena that is used for sporting and entertainment events, such as the Victoria's Secret Fashion Show. The Armory is also the former home of the Civil Air Patrol's Phoenix Composite Squadron. The building is still used to house the headquarters of the New York Army National Guard's 1st Battalion, 69th Infantry Regiment (known as the "Fighting Irish" since Gettysburg), as well as for the presentation of special events. The building was declared a National Historic Landmark in 1965 and a New York City landmark in 1983. Notable events In 1913, the Armory Show exhibited art by many contemporary artists, including Vincent van Gogh, Pablo Picasso, Henri Matisse, Raoul Dufy, Marcel Duchamp, and André Dunoyer de Segonzac. It was the first large-scale modern art show in the United States. It received mixed reactions from the public and media for its controversial new art forms, such as cubism, fauvism, and post-impressionism. It was nonetheless a success and eventually moved on to Chicago and Boston. Thure Johansson of Sweden broke Dorando Pietri's indoor record for the marathon at the 69th Regiment Armory on March 1, 1910 (2:36:55.2). 
As of May 2010, the Association of Road Racing Statisticians noted that Johansson's mark still stood as the sixth-fastest time on an indoor track. From November 29, 1948 through early 1949, the Armory hosted at least 17 roller derby matches, including the first matches ever broadcast on television. The Armory was the site of some New York Knicks home games from 1946 to 1960, including all their home games during the 1951, 1952, and 1953 NBA Finals, when other events were booked at their normal home, Madison Square Garden. The New York Americans – now the Brooklyn Nets – of the new American Basketball Association wanted to play at the Armory in 1967, but pressure from the Knicks on the Armory management forced the new club to play in Teaneck, New Jersey, instead. In 1994, the rock group Soundgarden performed two shows at the Armory (on June 16 and 17) as part of the tour in support of their Superunknown album. In 1996, NBA Entertainment used the Armory to film Denzel Washington's portions of the documentary NBA at 50. After the September 11, 2001, attacks, the Armory served as a counseling center for the victims and families. In 2002, 2003, 2005, 2009, 2010, 2011, 2012, 2013, and 2015, the Armory was the venue used for the Victoria's Secret Fashion Show. The Armory has been the site of the Museum of Comic and Cartoon Art's MoCCA Art Festival since 2009. The Architectural League of New York staged its annual "Beaux Arts Ball" at the Armory in 2013 to mark the centennial of the 1913 Armory Show. For the event, the ALNY commissioned giant illuminated cubist puppets designed by Processional Art Workshop. In May 2014, the Armory hosted the inaugural edition of the Downtown Art Fair, in which work from leading art galleries was offered for sale. 
See also List of New York City Designated Landmarks in Manhattan from 14th to 59th Streets National Register of Historic Places listings in Manhattan from 14th to 59th Streets References Explanatory notes Citations Sources External links Official unit website – includes link to a slide tour of armory from the home page 69th Regiment NYC Architecture New York Times Article on the Commanders Room 1906 establishments in New York City Armories in New York City Armories on the National Register of Historic Places in New York (state) Athletics (track and field) venues in New York City Badminton venues Basketball venues in New York City Event venues on the National Register of Historic Places in New York City Former National Basketball Association venues Former sports venues in New York City Infrastructure completed in 1906 Installations of the United States Army National Guard Military facilities on the National Register of Historic Places in Manhattan National Historic Landmarks in Manhattan New York City Designated Landmarks in Manhattan Rose Hill, Manhattan Sports venues completed in 1906 St. Francis Brooklyn Terriers men's basketball
```javascript /** * @license Apache-2.0 * * * * path_to_url * * Unless required by applicable law or agreed to in writing, software * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. */ 'use strict'; // MODULES // var tape = require( 'tape' ); var MAX_TYPED_ARRAY_LENGTH = require( './../lib' ); // TESTS // tape( 'main export is a number', function test( t ) { t.ok( true, __filename ); t.strictEqual( typeof MAX_TYPED_ARRAY_LENGTH, 'number', 'main export is a number' ); t.end(); }); tape( 'the exported value is 9007199254740991', function test( t ) { t.strictEqual( MAX_TYPED_ARRAY_LENGTH, 9007199254740991, 'returns expected value' ); t.end(); }); ```
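The constant checked above, 9007199254740991, is 2^53 − 1: the largest integer below which every integer is exactly representable in an IEEE 754 double, the number type JavaScript uses, which is why it bounds typed-array lengths. A quick demonstration of the boundary (shown in Python for illustration, since Python floats are also IEEE 754 doubles):

```python
# 2**53 - 1 is the last integer before double-precision floats start
# losing integer resolution: a double's 53-bit significand can hold
# every integer up to 2**53, but not 2**53 + 1.
MAX = 2**53 - 1

assert MAX == 9007199254740991
assert float(MAX) == MAX                 # exactly representable
assert float(2**53) == float(2**53 + 1)  # precision loss past 2**53
print(MAX)
```

In JavaScript the same value is exposed as `Number.MAX_SAFE_INTEGER`.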
```python # # # path_to_url # # or in the "license" file accompanying this file. This file is # distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF import boto.vendored.regions.regions as _regions class _CompatEndpointResolver(_regions.EndpointResolver): """Endpoint resolver which handles boto2 compatibility concerns. This is NOT intended for external use whatsoever. """ _DEFAULT_SERVICE_RENAMES = { # The botocore resolver is based on endpoint prefix. # These don't always sync up to the name that boto2 uses. # A mapping can be provided that handles the mapping between # "service names" and endpoint prefixes. 'awslambda': 'lambda', 'cloudwatch': 'monitoring', 'ses': 'email', 'ec2containerservice': 'ecs', 'configservice': 'config', } def __init__(self, endpoint_data, service_rename_map=None): """ :type endpoint_data: dict :param endpoint_data: Regions and endpoints data in the same format as is used by botocore / boto3. :type service_rename_map: dict :param service_rename_map: A mapping of boto2 service name to endpoint prefix. 
""" super(_CompatEndpointResolver, self).__init__(endpoint_data) if service_rename_map is None: service_rename_map = self._DEFAULT_SERVICE_RENAMES # Mapping of boto2 service name to endpoint prefix self._endpoint_prefix_map = service_rename_map # Mapping of endpoint prefix to boto2 service name self._service_name_map = dict( (v, k) for k, v in service_rename_map.items()) def get_available_endpoints(self, service_name, partition_name='aws', allow_non_regional=False): endpoint_prefix = self._endpoint_prefix(service_name) return super(_CompatEndpointResolver, self).get_available_endpoints( endpoint_prefix, partition_name, allow_non_regional) def get_all_available_regions(self, service_name): """Retrieve every region across partitions for a service.""" regions = set() endpoint_prefix = self._endpoint_prefix(service_name) # Get every region for every partition in the new endpoint format for partition_name in self.get_available_partitions(): if self._is_global_service(service_name, partition_name): # Global services are available in every region in the # partition in which they are considered global. partition = self._get_partition_data(partition_name) regions.update(partition['regions'].keys()) continue else: regions.update( self.get_available_endpoints( endpoint_prefix, partition_name) ) return list(regions) def construct_endpoint(self, service_name, region_name=None): endpoint_prefix = self._endpoint_prefix(service_name) return super(_CompatEndpointResolver, self).construct_endpoint( endpoint_prefix, region_name) def get_available_services(self): """Get a list of all the available services in the endpoints file(s)""" services = set() for partition in self._endpoint_data['partitions']: services.update(partition['services'].keys()) return [self._service_name(s) for s in services] def _is_global_service(self, service_name, partition_name='aws'): """Determines whether a service uses a global endpoint. 
In theory a service can be 'global' in one partition but regional in another. In practice, each service is all global or all regional. """ endpoint_prefix = self._endpoint_prefix(service_name) partition = self._get_partition_data(partition_name) service = partition['services'].get(endpoint_prefix, {}) return 'partitionEndpoint' in service def _get_partition_data(self, partition_name): """Get partition information for a particular partition. This should NOT be used to get service endpoint data because it only loads from the new endpoint format. It should only be used for partition metadata and partition specific service metadata. :type partition_name: str :param partition_name: The name of the partition to search for. :returns: Partition info from the new endpoints format. :rtype: dict or None """ for partition in self._endpoint_data['partitions']: if partition['partition'] == partition_name: return partition raise ValueError( "Could not find partition data for: %s" % partition_name) def _endpoint_prefix(self, service_name): """Given a boto2 service name, get the endpoint prefix.""" return self._endpoint_prefix_map.get(service_name, service_name) def _service_name(self, endpoint_prefix): """Given an endpoint prefix, get the boto2 service name.""" return self._service_name_map.get(endpoint_prefix, endpoint_prefix) class BotoEndpointResolver(object): """Resolves endpoint hostnames for AWS services. This is NOT intended for external use. """ def __init__(self, endpoint_data, service_rename_map=None): """ :type endpoint_data: dict :param endpoint_data: Regions and endpoints data in the same format as is used by botocore / boto3. :type service_rename_map: dict :param service_rename_map: A mapping of boto2 service name to endpoint prefix. """ self._resolver = _CompatEndpointResolver( endpoint_data, service_rename_map) def resolve_hostname(self, service_name, region_name): """Resolve the hostname for a service in a particular region. 
:type service_name: str :param service_name: The service to look up. :type region_name: str :param region_name: The region to find the endpoint for. :return: The hostname for the given service in the given region. """ endpoint = self._resolver.construct_endpoint(service_name, region_name) if endpoint is None: return None return endpoint.get('sslCommonName', endpoint['hostname']) def get_all_available_regions(self, service_name): """Get all the regions a service is available in. :type service_name: str :param service_name: The service to look up. :rtype: list of str :return: A list of all the regions the given service is available in. """ return self._resolver.get_all_available_regions(service_name) def get_available_services(self): """Get all the services supported by the endpoint data. :rtype: list of str :return: A list of all the services explicitly contained within the endpoint data provided during instantiation. """ return self._resolver.get_available_services() class StaticEndpointBuilder(object): """Builds a static mapping of endpoints in the legacy format.""" def __init__(self, resolver): """ :type resolver: BotoEndpointResolver :param resolver: An endpoint resolver. """ self._resolver = resolver def build_static_endpoints(self, service_names=None): """Build a set of static endpoints in the legacy boto2 format. :param service_names: The names of the services to build. They must use the names that boto2 uses, not boto3, e.g "ec2containerservice" and not "ecs". If no service names are provided, all available services will be built. :return: A dict consisting of:: {"service": {"region": "full.host.name"}} """ if service_names is None: service_names = self._resolver.get_available_services() static_endpoints = {} for name in service_names: endpoints_for_service = self._build_endpoints_for_service(name) if endpoints_for_service: # It's possible that when we try to build endpoints for # services we get an empty hash. 
In that case we don't # bother adding it to the final list of static endpoints. static_endpoints[name] = endpoints_for_service self._handle_special_cases(static_endpoints) return static_endpoints def _build_endpoints_for_service(self, service_name): # Given a service name, 'ec2', build a dict of # 'region' -> 'hostname' endpoints = {} regions = self._resolver.get_all_available_regions(service_name) for region_name in regions: endpoints[region_name] = self._resolver.resolve_hostname( service_name, region_name) return endpoints def _handle_special_cases(self, static_endpoints): # cloudsearchdomain endpoints use the exact same set of endpoints as # cloudsearch. if 'cloudsearch' in static_endpoints: cloudsearch_endpoints = static_endpoints['cloudsearch'] static_endpoints['cloudsearchdomain'] = cloudsearch_endpoints ```
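The resolvers above walk the botocore-style endpoints format: a list of partitions, each carrying `regions` and `services` maps, with "global" services marked by a `partitionEndpoint` key. A minimal, self-contained sketch of that lookup is below; the endpoint data and rename map are illustrative stand-ins, and the sketch omits details the real resolver handles (merged partition defaults, the `sslCommonName` override, multiple endpoint files):

```python
# Hypothetical endpoint data in the botocore "partitions" shape.
SAMPLE_ENDPOINT_DATA = {
    "partitions": [
        {
            "partition": "aws",
            "regions": {"us-east-1": {}, "eu-west-1": {}},
            "services": {
                "ec2": {
                    "endpoints": {
                        "us-east-1": {"hostname": "ec2.us-east-1.amazonaws.com"},
                        "eu-west-1": {"hostname": "ec2.eu-west-1.amazonaws.com"},
                    }
                },
                "iam": {
                    # A global service pins every region to one endpoint.
                    "partitionEndpoint": "aws-global",
                    "endpoints": {"aws-global": {"hostname": "iam.amazonaws.com"}},
                },
            },
        }
    ]
}

# boto2 service names don't always match endpoint prefixes.
SERVICE_RENAMES = {"awslambda": "lambda", "cloudwatch": "monitoring"}


def resolve_hostname(endpoint_data, service_name, region_name):
    """Return the hostname for a service/region pair, or None if unknown."""
    prefix = SERVICE_RENAMES.get(service_name, service_name)
    for partition in endpoint_data["partitions"]:
        service = partition["services"].get(prefix)
        if service is None:
            continue
        # Global services resolve through their partition endpoint.
        key = service.get("partitionEndpoint", region_name)
        endpoint = service["endpoints"].get(key)
        if endpoint is not None:
            return endpoint["hostname"]
    return None


print(resolve_hostname(SAMPLE_ENDPOINT_DATA, "ec2", "us-east-1"))
print(resolve_hostname(SAMPLE_ENDPOINT_DATA, "iam", "eu-west-1"))
```

Note how the `partitionEndpoint` indirection is what lets `get_all_available_regions` report every partition region for a global service: the region list and the endpoint key are decoupled.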
```javascript /** @jest-environment ./packages/test/harness/src/host/jest/WebDriverEnvironment.js */ describe('Attachment', () => { test('with "contentUrl" of forbidden protocols', () => runHTML('attachment.forbiddenProtocol.html')); }); ```
```c++ //======================================================================= // Author: Jeremy G. Siek // // accompanying file LICENSE_1_0.txt or copy at // path_to_url //======================================================================= #ifndef BOOST_GRAPH_ITERATION_MACROS_HPP #define BOOST_GRAPH_ITERATION_MACROS_HPP #include <utility> #define BGL_CAT(x,y) x ## y #define BGL_RANGE(linenum) BGL_CAT(bgl_range_,linenum) #define BGL_FIRST(linenum) (BGL_RANGE(linenum).first) #define BGL_LAST(linenum) (BGL_RANGE(linenum).second) /* BGL_FORALL_VERTICES_T(v, g, graph_t) // This is on line 9 expands to the following, but all on the same line for (typename boost::graph_traits<graph_t>::vertex_iterator bgl_first_9 = vertices(g).first, bgl_last_9 = vertices(g).second; bgl_first_9 != bgl_last_9; bgl_first_9 = bgl_last_9) for (typename boost::graph_traits<graph_t>::vertex_descriptor v; bgl_first_9 != bgl_last_9 ? (v = *bgl_first_9, true) : false; ++bgl_first_9) The purpose of having two for-loops is just to provide a place to declare both the iterator and value variables. There is really only one loop. The stopping condition gets executed two more times than it usually would be, oh well. The reason for the bgl_first_9 = bgl_last_9 in the outer for-loop is in case the user puts a break statement in the inner for-loop. The other macros work in a similar fashion. Use the _T versions when the graph type is a template parameter or dependent on a template parameter. Otherwise use the non _T versions. ----------------------- 6/9/09 THK The above contains two calls to the vertices function. I modified these macros to expand to for (std::pair<typename boost::graph_traits<graph_t>::vertex_iterator, typename boost::graph_traits<graph_t>::vertex_iterator> bgl_range_9 = vertices(g); bgl_range_9.first != bgl_range_9.second; bgl_range_9.first = bgl_range_9.second) for (typename boost::graph_traits<graph_t>::vertex_descriptor v; bgl_range_9.first != bgl_range_9.second ? 
(v = *bgl_range_9.first, true) : false; ++bgl_range_9.first) */ #define BGL_FORALL_VERTICES_T(VNAME, GNAME, GraphType) \ for (std::pair<typename boost::graph_traits<GraphType>::vertex_iterator, \ typename boost::graph_traits<GraphType>::vertex_iterator> BGL_RANGE(__LINE__) = vertices(GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (typename boost::graph_traits<GraphType>::vertex_descriptor VNAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (VNAME = *BGL_FIRST(__LINE__), true):false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_VERTICES(VNAME, GNAME, GraphType) \ for (std::pair<boost::graph_traits<GraphType>::vertex_iterator, \ boost::graph_traits<GraphType>::vertex_iterator> BGL_RANGE(__LINE__) = vertices(GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (boost::graph_traits<GraphType>::vertex_descriptor VNAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (VNAME = *BGL_FIRST(__LINE__), true):false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_EDGES_T(ENAME, GNAME, GraphType) \ for (std::pair<typename boost::graph_traits<GraphType>::edge_iterator, \ typename boost::graph_traits<GraphType>::edge_iterator> BGL_RANGE(__LINE__) = edges(GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (typename boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (ENAME = *BGL_FIRST(__LINE__), true):false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_EDGES(ENAME, GNAME, GraphType) \ for (std::pair<boost::graph_traits<GraphType>::edge_iterator, \ boost::graph_traits<GraphType>::edge_iterator> BGL_RANGE(__LINE__) = edges(GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? 
(ENAME = *BGL_FIRST(__LINE__), true):false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_ADJ_T(UNAME, VNAME, GNAME, GraphType) \ for (std::pair<typename boost::graph_traits<GraphType>::adjacency_iterator, \ typename boost::graph_traits<GraphType>::adjacency_iterator> BGL_RANGE(__LINE__) = adjacent_vertices(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (typename boost::graph_traits<GraphType>::vertex_descriptor VNAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (VNAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_ADJ(UNAME, VNAME, GNAME, GraphType) \ for (std::pair<boost::graph_traits<GraphType>::adjacency_iterator, \ boost::graph_traits<GraphType>::adjacency_iterator> BGL_RANGE(__LINE__) = adjacent_vertices(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (boost::graph_traits<GraphType>::vertex_descriptor VNAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (VNAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_OUTEDGES_T(UNAME, ENAME, GNAME, GraphType) \ for (std::pair<typename boost::graph_traits<GraphType>::out_edge_iterator, \ typename boost::graph_traits<GraphType>::out_edge_iterator> BGL_RANGE(__LINE__) = out_edges(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (typename boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? 
(ENAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_OUTEDGES(UNAME, ENAME, GNAME, GraphType) \ for (std::pair<boost::graph_traits<GraphType>::out_edge_iterator, \ boost::graph_traits<GraphType>::out_edge_iterator> BGL_RANGE(__LINE__) = out_edges(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (ENAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_INEDGES_T(UNAME, ENAME, GNAME, GraphType) \ for (std::pair<typename boost::graph_traits<GraphType>::in_edge_iterator, \ typename boost::graph_traits<GraphType>::in_edge_iterator> BGL_RANGE(__LINE__) = in_edges(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (typename boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (ENAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_INEDGES(UNAME, ENAME, GNAME, GraphType) \ for (std::pair<boost::graph_traits<GraphType>::in_edge_iterator, \ boost::graph_traits<GraphType>::in_edge_iterator> BGL_RANGE(__LINE__) = in_edges(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (ENAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #endif // BOOST_GRAPH_ITERATION_MACROS_HPP ```
Events from the year 1631 in Sweden Incumbents Monarch – Gustaf II Adolf Events February 16 – Gustav Adolf Secondary School is founded in Tallinn, Estonia by the king. April 13 – Thirty Years' War: Gustavus Adolphus of Sweden defeats an imperial garrison at the city of Frankfurt an der Oder. Treaty of Bärwalde. Swedish occupation of Pomerania. Swedish treaty with Brandenburg. May 10 – Thirty Years' War: After a two-month siege, an Imperial army under the command of Tilly storms the German city of Magdeburg and brutally sacks it, massacring over 20,000 inhabitants. Shocked by the massacre, many Protestant states in the Holy Roman Empire decide to ally with Sweden in its ongoing invasion. July 16 – The city of Würzburg is taken by Gustavus Adolphus, putting an end to the Würzburg witch trials. July 22 – Thirty Years' War: Tilly is defeated by Gustavus Adolphus at the Battle of Werben, suffering a loss of 6,000 men. End of August – Thirty Years' War: Running out of supplies, Tilly is forced to send his army into the Electorate of Saxony in order to secure provisions, as well as to force a reaction from John George, Elector of Saxony, and Gustavus Adolphus. September 11 – Thirty Years' War: As a result of Tilly's invasion, John George, Elector of Saxony, until now neutral, allies with Sweden in order to drive the Imperial army out of Saxony. September 17 – Thirty Years' War: In the Battle of Breitenfeld, Tilly's imperial army is defeated by Gustavus II Adolphus, shattering the imperial army of the Holy Roman Empire and marking the first significant victory for the Protestants in the war. December 23 – Thirty Years' War: Sweden takes the city of Mainz without any resistance. Births Deaths 28 February – Regina Basilier, merchant and moneylender. References Years of the 17th century in Sweden Sweden
```java package com.fishercoder.firstthousand; import com.fishercoder.common.classes.TreeNode; import com.fishercoder.common.utils.TreeUtils; import com.fishercoder.solutions.firstthousand._297; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; import java.util.Arrays; import static org.junit.jupiter.api.Assertions.assertEquals; public class _297Test { private _297.Solution1 solution1; @BeforeEach public void setup() { solution1 = new _297.Solution1(); } @Test public void test1() { TreeNode root = TreeUtils.constructBinaryTree(Arrays.asList(1, 2, 3, null, null, 4, 5, 6, 7)); TreeUtils.printBinaryTree(root); String str = solution1.serialize(root); System.out.println(str); TreeUtils.printBinaryTree(solution1.deserialize(str)); assertEquals(root, solution1.deserialize(str)); } } ```
Nobody Dies Twice (Spanish: Nadie muere dos veces) is a 1953 Mexican thriller film directed by Luis Spota and starring Abel Salazar, Luis Aguilar and Lilia del Valle. Cast Abel Salazar as Raúl García / Ricardo Luis Aguilar as Alberto Lilia del Valle as Irma Ramón Gay as Arturo Robles Pedro Vargas as Cantante Fernando Fernández as Fernando Enedina Díaz de León as Enedina Salvador Quiroz as Don Antonio References Bibliography María Luisa Amador. Cartelera cinematográfica, 1950-1959. UNAM, 1985. External links 1953 films 1950s thriller films Mexican thriller films 1950s Spanish-language films Mexican black-and-white films 1950s Mexican films
```java package com.zhy.adapter.recyclerview.utils; import android.support.v7.widget.GridLayoutManager; import android.support.v7.widget.RecyclerView; import android.support.v7.widget.StaggeredGridLayoutManager; import android.view.ViewGroup; /** * Created by zhy on 16/6/28. */ public class WrapperUtils { public interface SpanSizeCallback { int getSpanSize(GridLayoutManager layoutManager, GridLayoutManager.SpanSizeLookup oldLookup, int position); } public static void onAttachedToRecyclerView(RecyclerView.Adapter innerAdapter, RecyclerView recyclerView, final SpanSizeCallback callback) { innerAdapter.onAttachedToRecyclerView(recyclerView); RecyclerView.LayoutManager layoutManager = recyclerView.getLayoutManager(); if (layoutManager instanceof GridLayoutManager) { final GridLayoutManager gridLayoutManager = (GridLayoutManager) layoutManager; final GridLayoutManager.SpanSizeLookup spanSizeLookup = gridLayoutManager.getSpanSizeLookup(); gridLayoutManager.setSpanSizeLookup(new GridLayoutManager.SpanSizeLookup() { @Override public int getSpanSize(int position) { return callback.getSpanSize(gridLayoutManager, spanSizeLookup, position); } }); gridLayoutManager.setSpanCount(gridLayoutManager.getSpanCount()); } } public static void setFullSpan(RecyclerView.ViewHolder holder) { ViewGroup.LayoutParams lp = holder.itemView.getLayoutParams(); if (lp != null && lp instanceof StaggeredGridLayoutManager.LayoutParams) { StaggeredGridLayoutManager.LayoutParams p = (StaggeredGridLayoutManager.LayoutParams) lp; p.setFullSpan(true); } } } ```
```batchfile :: or more contributor license agreements. See the NOTICE file :: distributed with this work for additional information :: regarding copyright ownership. The ASF licenses this file :: :: path_to_url :: :: Unless required by applicable law or agreed to in writing, :: "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY :: specific language governing permissions and limitations echo on conda build --output-folder=conda/pkg conda/recipe || exit /b ```
The Chrysler Engineering Corp. T36D turboprop engine, design-rated at 2,500 lb thrust, was ordered by the United States Navy in 1948, but was deemed unnecessary when the aircraft it was to power was canceled. Engine development was stopped and the program was canceled before any engines were built. Specifications (T36D) References Fahey, James Charles. The Ships and Aircraft of the United States Fleet, 6th edition, p. 54. 1940s turboprop engines Abandoned military aircraft engine projects of the United States
Pauline Allen (born 23 February 1948) is an Australian scholar of early Christianity. She is Research Professor of Early Christian Studies and Director of the Centre for Early Christian Studies at the Australian Catholic University. Honours In 1996, Allen was elected a Fellow of the Australian Academy of the Humanities, and in July 2016 she was elected a corresponding Fellow of the British Academy (FBA). Selected works References 1948 births Living people Roman Catholic biblical scholars Australian Roman Catholics Academic staff of the Australian Catholic University Corresponding Fellows of the British Academy Fellows of the Australian Academy of the Humanities Historians of Christianity Australian historians of religion Female biblical scholars
```typescript import * as React from 'react'; import { StoryWright } from 'storywright'; import { Meta } from '@storybook/react'; import { Dialog } from '@fluentui/react-northstar'; import StoryWrightSteps from './commonStoryWrightSteps'; import DialogExampleFullWidth from '../../examples/components/Dialog/Content/DialogExampleFullWidth.shorthand'; export default { component: Dialog, title: 'Dialog', decorators: [story => <StoryWright steps={StoryWrightSteps}>{story()}</StoryWright>], } as Meta<typeof Dialog>; export { DialogExampleFullWidth }; ```
The Blaiklock River is a tributary of the Barlow River (Chibougamau River), flowing in the Regional County Municipality (MRC) of Eeyou Istchee Baie-James, in Jamésie, in the administrative region of Nord-du-Québec, in the province of Quebec, in Canada. The course of the river runs through the townships of Chérisy, Beaulieu and Blaiklock. The river flows within the Albanel, Mistassini and Waconichi Lakes Wildlife Sanctuary. The watershed of the Blaiklock River is accessible by a forest road running east-west that serves the upper part of the watercourse; it connects to another forest road, running north-south, which crosses the river and joins Route 167 southwest of Waconichi Lake. This last road comes from Chibougamau, running north-east along the shoreline southeast of Waconichi Lake and the river of the same name. The surface of the Blaiklock River is usually frozen from early November to mid-May; however, safe ice travel is generally possible from mid-November to mid-April. Geography The main hydrographic slopes adjacent to the Blaiklock River are: North side: Mistassini Lake (Penicouane Bay), Rupert River, Saint-Urcisse River; East side: Barlow River (Chibougamau River), Waconichi Lake, Waconichi River; South side: Chébistuane River, Chibougamau River, Chibougamau Lake, Lac aux Dorés; West side: Barlow River (Chibougamau River), Chibougamau River, Lac du Sauvage, Brock River (Chibougamau River). The Blaiklock River originates from a forest stream (elevation: ) in Chérisy Township. This source is located at: north-east of the mouth of the Blaiklock River (confluence with the Barlow River (Chibougamau River)); northwest of Waconichi Lake; south-west of Mistassini Lake; north of the village center of Chapais, Quebec; north-west of downtown Chibougamau; north-east of the mouth of the Chibougamau River (confluence with the Opawica River); east of the mouth of the Nottaway River. 
From its source, the Blaiklock River flows generally to the southwest, along the following segments:
easterly, entering the Albanel, Mistassini and Waconichi Lakes Wildlife Sanctuary, then the township of Beaulieu, to a bend in the river;
eastward, then southward into Blaiklock Township, then southeasterly to the bridge of a forest road, corresponding to the boundary of the townships of Beaulieu and Blaiklock;
south-east to its mouth. Note: a forest road follows this segment.

The Blaiklock River empties on the north shore of the Barlow River (Chibougamau River). The latter flows southwest into a river bend on the north shore of the Chibougamau River, in a marsh area upstream of Chevrillon Lake. From there, the current flows towards the southwest along the Chibougamau River to its confluence with the Opawica River. From this confluence, the current flows generally southwesterly through the Waswanipi River to the east shore of Goéland Lake (Waswanipi River). The latter is crossed to the northwest by the Waswanipi River, which is a tributary of Matagami Lake.

The mouth of the Blaiklock River is located:
northwest of Mistassini Lake;
north-east of the mouth of the Barlow River (Chibougamau River);
north-east of the mouth of the Chibougamau River (confluence with the Opawica River);
south-east of the mouth of the Nottaway River;
north-east of the village center of Chapais, Quebec;
north of downtown Chibougamau.

Toponymy

This hydronym uses the same name as the township of Blaiklock. The term evokes the memory of Frederic William Blaiklock (1822-1901), a surveyor and native of Quebec City. He was the first to conduct topographic measurements in the valley of the Ashuapmushuan River, which is located north of Lac Saint-Jean. He also mapped out the two provincial roads leading to Lac Saint-Jean, via Stoneham and via La Tuque. From 1850 to 1853, he surveyed the territorial boundaries between Canada and New Brunswick.
For 23 years (1878-1901), he was in charge of the Cadastre Office in Montreal.

The Geography Commission, predecessor of the current Commission de toponymie du Québec, approved this place name in 1953. The toponym "Blaiklock River" was made official on December 5, 1968, by the Commission de toponymie du Québec, that is, at the creation of this commission.

References

See also
Rivers of Nord-du-Québec
Nottaway River drainage basin
Eeyou Istchee James Bay
```javascript
/* global $:true */
/* jshint unused:false */
+ function ($) {
  "use strict";

  var defaults;

  var formatNumber = function (n) {
    return n < 10 ? "0" + n : n;
  };

  var Datetime = function (input, params) {
    this.input = $(input);
    this.params = params || {};
    this.initMonthes = params.monthes;
    this.initYears = params.years;
    var p = $.extend({}, params, this.getConfig());
    $(this.input).picker(p);
  };

  Datetime.prototype = {
    getDays: function (max) {
      var days = [];
      for (var i = 1; i <= (max || 31); i++) {
        days.push(i < 10 ? "0" + i : i);
      }
      return days;
    },

    getDaysByMonthAndYear: function (month, year) {
      // One millisecond before the first day of the next month is the
      // last day of the given month, which yields the month length.
      var int_d = new Date(year, parseInt(month) + 1 - 1, 1);
      var d = new Date(int_d - 1);
      return this.getDays(d.getDate());
    },

    getConfig: function () {
      var today = new Date(),
        params = this.params,
        self = this,
        lastValidValues;

      var config = {
        rotateEffect: false,
        cssClass: 'datetime-picker',
        value: [
          today.getFullYear(),
          formatNumber(today.getMonth() + 1),
          formatNumber(today.getDate()),
          formatNumber(today.getHours()),
          formatNumber(today.getMinutes())
        ],
        onChange: function (picker, values, displayValues) {
          // Clamp the selected day to the length of the selected month.
          var days = self.getDaysByMonthAndYear(values[1], values[0]);
          var currentValue = values[2];
          if (currentValue > days.length) currentValue = days.length;
          picker.cols[4].setValue(currentValue);

          // Check min and max bounds; roll back to the last valid value
          // when the new selection falls outside them.
          var current = new Date(values[0] + '-' + values[1] + '-' + values[2]);
          var valid = true;
          if (params.min) {
            var min = new Date(typeof params.min === "function" ? params.min() : params.min);
            if (current < +min) {
              picker.setValue(lastValidValues);
              valid = false;
            }
          }
          if (params.max) {
            var max = new Date(typeof params.max === "function" ? params.max() : params.max);
            if (current > +max) {
              picker.setValue(lastValidValues);
              valid = false;
            }
          }
          if (valid) lastValidValues = values;
          if (self.params.onChange) {
            self.params.onChange.apply(this, arguments);
          }
        },
        formatValue: function (p, values, displayValues) {
          return self.params.format(p, values, displayValues);
        },
        cols: [
          { values: this.initYears },
          { divider: true, content: params.yearSplit },
          { values: this.initMonthes },
          { divider: true, content: params.monthSplit },
          {
            values: (function () {
              var dates = [];
              for (var i = 1; i <= 31; i++) dates.push(formatNumber(i));
              return dates;
            })()
          }
        ]
      };

      if (params.dateSplit) {
        config.cols.push({ divider: true, content: params.dateSplit });
      }
      config.cols.push({ divider: true, content: params.datetimeSplit });

      var times = self.params.times();
      if (times && times.length) {
        config.cols = config.cols.concat(times);
      }

      var inputValue = this.input.val();
      if (inputValue) config.value = params.parse(inputValue);

      if (this.params.value) {
        this.input.val(this.params.value);
        config.value = params.parse(this.params.value);
      }

      return config;
    }
  };

  $.fn.datetimePicker = function (params) {
    params = $.extend({}, defaults, params);
    return this.each(function () {
      if (!this) return;
      var $this = $(this);
      var datetime = $this.data("datetime");
      if (!datetime) $this.data("datetime", new Datetime(this, params));
      return datetime;
    });
  };

  defaults = $.fn.datetimePicker.prototype.defaults = {
    input: undefined,
    min: undefined, // YYYY-MM-DD
    max: undefined, // YYYY-MM-DD
    yearSplit: '-',
    monthSplit: '-',
    dateSplit: '',
    datetimeSplit: ' ',
    monthes: ('01 02 03 04 05 06 07 08 09 10 11 12').split(' '),
    years: (function () {
      var arr = [];
      for (var i = 1950; i <= 2030; i++) arr.push(i);
      return arr;
    })(),
    times: function () {
      return [
        {
          values: (function () {
            var hours = [];
            for (var i = 0; i < 24; i++) hours.push(formatNumber(i));
            return hours;
          })()
        },
        { divider: true, content: ':' },
        {
          values: (function () {
            var minutes = [];
            for (var i = 0; i < 60; i++) minutes.push(formatNumber(i));
            return minutes;
          })()
        }
      ];
    },
    format: function (p, values) {
      // Join each column's value (or divider content) into the display string.
      return p.cols.map(function (col) {
        return col.value || col.content;
      }).join('');
    },
    parse: function (str) {
      // Split the string into date and time parts, then split the date part
      // on non-digit separators and the time part on ':'. (The original also
      // split on localized unit characters that were lost in transcription.)
      var t = str.split(this.datetimeSplit);
      return t[0].split(/\D/).concat((t[1] || '').split(/:/)).filter(function (d) {
        return !!d;
      });
    }
  };
}($);
```
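The `getDaysByMonthAndYear` helper above relies on a common date trick: the day immediately before the first of the next month is the last day of the current month, which gives the month length without a lookup table or leap-year logic. A minimal sketch of the same trick in Python for comparison (the helper name `days_in_month` is ours, not part of the plugin):

```python
from datetime import date, timedelta

def days_in_month(year, month):
    # Jump to the first day of the following month, then step back one
    # day: the result is the last day of (year, month).
    if month == 12:
        first_of_next = date(year + 1, 1, 1)
    else:
        first_of_next = date(year, month + 1, 1)
    return (first_of_next - timedelta(days=1)).day

# e.g. days_in_month(2024, 2) -> 29 (leap year), days_in_month(2023, 2) -> 28
```

This is why the plugin never hard-codes month lengths: clamping the day column to `days_in_month` handles February and leap years automatically.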
```asciidoc
////
This file is generated by DocsTest, so don't change it!
////

= apoc.couchbase

:description: This section contains reference documentation for the apoc.couchbase procedures.

[.procedures, opts=header, cols='5a,1a,1a']
|===
| Qualified Name | Type | Release

|xref::overview/apoc.couchbase/apoc.couchbase.append.adoc[apoc.couchbase.append icon:book[]]

apoc.couchbase.append(hostOrKey, bucket, documentId, content) yield id, expiry, cas, mutationToken, content - append a couchbase json document to an existing one.
|label:procedure[]
|label:apoc-full[]

|xref::overview/apoc.couchbase/apoc.couchbase.exists.adoc[apoc.couchbase.exists icon:book[]]

apoc.couchbase.exists(hostOrKey, bucket, documentId) yield value - check whether a couchbase json document with the given ID exists.
|label:procedure[]
|label:apoc-full[]

|xref::overview/apoc.couchbase/apoc.couchbase.get.adoc[apoc.couchbase.get icon:book[]]

apoc.couchbase.get(hostOrKey, bucket, documentId) yield id, expiry, cas, mutationToken, content - retrieves a couchbase json document by its unique ID.
|label:procedure[]
|label:apoc-full[]

|xref::overview/apoc.couchbase/apoc.couchbase.insert.adoc[apoc.couchbase.insert icon:book[]]

apoc.couchbase.insert(hostOrKey, bucket, documentId, jsonDocument) yield id, expiry, cas, mutationToken, content - insert a couchbase json document with its unique ID.
|label:procedure[]
|label:apoc-full[]

|xref::overview/apoc.couchbase/apoc.couchbase.namedParamsQuery.adoc[apoc.couchbase.namedParamsQuery icon:book[]]

apoc.couchbase.namedParamsQuery(hostOrKey, bucket, statement, paramNames, paramValues) yield queryResult - executes a N1QL statement with named parameters.
|label:procedure[]
|label:apoc-full[]

|xref::overview/apoc.couchbase/apoc.couchbase.posParamsQuery.adoc[apoc.couchbase.posParamsQuery icon:book[]]

apoc.couchbase.posParamsQuery(hostOrKey, bucket, statement, params) yield queryResult - executes a N1QL statement with positional parameters.
|label:procedure[]
|label:apoc-full[]

|xref::overview/apoc.couchbase/apoc.couchbase.prepend.adoc[apoc.couchbase.prepend icon:book[]]

apoc.couchbase.prepend(hostOrKey, bucket, documentId, content) yield id, expiry, cas, mutationToken, content - prepend a couchbase json document to an existing one.
|label:procedure[]
|label:apoc-full[]

|xref::overview/apoc.couchbase/apoc.couchbase.query.adoc[apoc.couchbase.query icon:book[]]

apoc.couchbase.query(hostOrKey, bucket, statement) yield queryResult - executes a plain un-parameterized N1QL statement.
|label:procedure[]
|label:apoc-full[]

|xref::overview/apoc.couchbase/apoc.couchbase.remove.adoc[apoc.couchbase.remove icon:book[]]

apoc.couchbase.remove(hostOrKey, bucket, documentId) yield id, expiry, cas, mutationToken, content - remove the couchbase json document identified by its unique ID.
|label:procedure[]
|label:apoc-full[]

|xref::overview/apoc.couchbase/apoc.couchbase.replace.adoc[apoc.couchbase.replace icon:book[]]

apoc.couchbase.replace(hostOrKey, bucket, documentId, jsonDocument) yield id, expiry, cas, mutationToken, content - replace the content of the couchbase json document identified by its unique ID.
|label:procedure[]
|label:apoc-full[]

|xref::overview/apoc.couchbase/apoc.couchbase.upsert.adoc[apoc.couchbase.upsert icon:book[]]

apoc.couchbase.upsert(hostOrKey, bucket, documentId, jsonDocument) yield id, expiry, cas, mutationToken, content - insert or overwrite a couchbase json document with its unique ID.
|label:procedure[]
|label:apoc-full[]
|===
```
Anna Leopoldovna (; 18 December 1718 – 19 March 1746), born Elisabeth Katharina Christine von Mecklenburg-Schwerin and also known as Anna Carlovna (А́нна Ка́рловна), was regent of Russia for just over a year (1740–1741) during the minority of her infant son Emperor Ivan VI.

Biography

Early life

Anna Leopoldovna was born Elisabeth Katharina Christine, the daughter of Karl Leopold, Duke of Mecklenburg-Schwerin, by his wife, Catherine, the eldest daughter of Tsar Ivan V of Russia. Catherine's father, Ivan V, was the elder brother and co-ruler of Russia with Peter the Great, but because he was mentally challenged and unfit to rule, all power lay in the hands of Peter the Great, who was like a father to Catherine and looked out for her interests as long as he was alive.

Elisabeth's mother, Catherine, was the third wife of Duke Karl Leopold, who had divorced his first two wives after very short marriages (less than two years each). Catherine was the only wife ever to bear him a living child, and Elisabeth was that only child. In 1721, when Elisabeth was three years old, her mother became pregnant a second time, but the child was stillborn. By then, the marriage between her parents was in trouble, and in 1722, Catherine returned to the court of her uncle Peter the Great. She took her daughter with her, and Elisabeth therefore grew up in Russia with little or no contact with her father.

In 1730, Tsar Peter II, the last surviving male member of the Romanov dynasty, died unwed, and the male line of the dynasty died with him. The Russian privy council debated whom to invite to the throne, and Elisabeth's mother, Catherine, was one of the candidates considered. However, she was passed over for several reasons, and the throne was offered to her younger sister, Anna Ivanovna, who became known to history as Empress Anna of Russia. Anna was a childless widow, and Elisabeth was Catherine's only child. Elisabeth's position at court was therefore an important one.
In 1733, Elisabeth converted to the Russian Orthodox Church and was given the name Anna Leopoldovna, a compliment both to her aunt, Empress Anna, and to her father, Karl Leopold, Duke of Mecklenburg-Schwerin. Her conversion to the Orthodox faith made her acceptable as heiress to the throne, but she was never actually declared heiress by her aunt.

In 1739, Anna Leopoldovna, as she was now known, was given in marriage to Anthony Ulrich (1714–1774), the second son of Ferdinand Albert, Duke of Brunswick-Wolfenbüttel. Ulrich had lived in Russia since 1733 so that he and his bride could get to know each other better. He was able to do so because he was a younger son, and it was improbable that he would be called upon to shoulder the responsibility of ruling his father's principality. Both circumstances clearly indicate that Empress Anna intended her niece to inherit her throne and was laying the ground for that by selecting a husband of suitable birth and situation and by observing him at close quarters for several years before the marriage was celebrated.

On 5 October 1740, Empress Anna adopted their newborn son, Ivan, and proclaimed him heir to the Russian throne. On 28 October, just a few weeks after the proclamation, the empress died, having left directions regarding the succession and appointed her favourite Ernest Biron, Duke of Courland, as regent. Biron, however, had made himself an object of detestation to the Russian people. After Biron threatened to exile Anna and her spouse to Germany, she had little difficulty working with Field Marshal Burkhard Christoph von Münnich to overthrow him. The coup succeeded, and she assumed the regency on 8 November (O.S.) and took the title of Grand Duchess. Field Marshal Münnich personally arrested Biron in his apartment, where the formerly tyrannical Biron ingloriously begged for his life on his knees.
Regency

Anna knew little of the character of the people with whom she had to deal, knew even less of the conventions and politics of Russian government, and speedily quarrelled with her principal supporters. According to the Dictionary of Russian History, she ordered an investigation of the garment industry when new uniforms received by the military were found to be of inferior quality. When the investigation revealed inhumane conditions, she issued decrees mandating a minimum wage and maximum working hours in that industry, as well as the establishment of medical facilities at every garment factory. She also presided over a brilliant victory by Russian forces at the Battle of Villmanstrand in Finland after Sweden had declared war against her government.

She had an influential favourite, Julia von Mengden. Anna's love life took up much of her time, as she was involved simultaneously in what were described as "passionate" affairs with the Saxon ambassador, Lynar, and her lady-in-waiting, Mengden. Anna's husband did his best to ignore the affairs. After becoming regent, Anna marginalised Anton by forcing him to sleep in another palace, and took Lynar, Mengden or both to bed with her. At times, the grand duke would appear to complain about being "cuckolded", but he was always sent away. At one point, Anna proposed to have Lynar marry Mengden, to unite the two people who were closest to her in the world. The regent's relationship with Mengden caused much disgust in Russia, but the French historian Henri Troyat wrote that amongst the many libertines of St. Petersburg, Anna's "sexual eclecticism" in having both a man and a woman as her lovers was seen as a sign of her open-mindedness. More damagingly, many in the Russian elite believed that, at the age of 22, Anna was too young and immature to be regent of Russia and that her preoccupation with her relationships with Lynar and Mengden at the expense of governing was making her a danger to the state.
Troyat described Anna as an "indolent day-dreamer" who spent her mornings reading novels in bed, got up only in the afternoon, and liked to wander around her apartment barely dressed with her hair undone. Anna's preference for handing out government jobs to Baltic German aristocrats caused much resentment on the part of the ethnic Russian nobility, which, for neither the first nor the last time, complained that a disproportionate number of Baltic Germans held high office.

Later life

In December 1741, Elizabeth, the daughter of Peter the Great, who had long been a favourite of the guards, incited them to revolt. The coup met only insignificant opposition and was supported by the ambassadors of France and Sweden because of the pro-British and pro-Austrian policies of Anna's government. The French ambassador in St. Petersburg, the marquis de La Chétardie, was deeply involved in planning Elizabeth's coup and bribed numerous officers of the Imperial Guard into supporting it.

The victorious regime first imprisoned the family in the fortress of Dünamünde, near Riga, and then exiled them to Kholmogory on the Northern Dvina River. Anna eventually died of puerperal fever on 19 March 1746, nine days after the birth of her son Alexei and after more than four years in prison. Her family remained in prison for many years. A further 18 years were to pass before her son, Ivan VI, was murdered in Shlisselburg Fortress on 16 July 1764, and her husband, Anthony Ulrich, died in Kholmogory on 4 May 1774 after spending a further decade in prison.
Her remaining four children (Ekaterina, Elizaveta, Peter and Alexei) were released from prison into the custody of their aunt, the Danish queen dowager Juliana Maria of Brunswick-Wolfenbüttel, on 30 June 1780 and settled in Jutland, where they lived in comfort under house arrest in Horsens for the rest of their lives, under the guardianship of Juliana and at the expense of Catherine the Great. The eldest of them had been only months old when she and her family were placed in prison; the other three had been born in captivity. They were, therefore, not used to social life, and even after they had gained their freedom, they made little or no contact with people outside their own small "court" of some forty to fifty people, all of whom were Danish except for the priest. None of them ever married or left progeny.

Family

Anna Leopoldovna had the following children:
Ivan VI (1740–1764) (reigning Emperor 1740–1741)
Catherine Antonovna of Brunswick (1741–1807) (released to house arrest in Horsens in Denmark in 1780)
Elizabeth Antonovna of Brunswick (1743–1782) (released to house arrest in Horsens in Denmark in 1780)
Peter Antonovich of Brunswick (1745–1798) (released to house arrest in Horsens in Denmark in 1780)
Alexei Antonovich of Brunswick (1746–1787) (released to house arrest in Horsens in Denmark in 1780)

External links
– Historical reconstruction "The Romanovs". StarMedia. Babich-Design (Russia, 2013)
Una Vez Más (Spanish: "Once More") may refer to:

Una Vez Más, a 1995 album by the Barrio Boyzz
"Una Vez Más" (The Barrio Boyzz song), the lead single from the album
"Una Vez Más" (Leslie Shaw song), a single by Leslie Shaw
Una Vez Más (Calle Ciega album), a 2006 album by Venezuelan group Calle Ciega
"Una Vez Más", a 1982 song by Juan Gabriel
"Una Vez Más", a 2014 song by Víctor Manuelle
Una Vez Más Holdings, LLC, a broadcasting company in the United States
```python
async def app(scope, receive, send):
    assert scope["type"] == "http"
    await send(
        {
            "type": "http.response.start",
            "status": 200,
            "headers": [[b"content-type", b"text/html"]],
        }
    )
    await send(
        {
            "type": "http.response.body",
            "body": b"asgi-function:RANDOMNESS_PLACEHOLDER",
        }
    )
```
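Because the callable above follows the ASGI protocol, it can be exercised without any server by passing in stub `receive` and `send` coroutines and inspecting the messages it emits. A minimal sketch, with an illustrative app body and scope (the response text and scope fields here are ours, not the exact ones above):

```python
import asyncio

async def app(scope, receive, send):
    # Same shape as the ASGI callable above; the body text is illustrative.
    assert scope["type"] == "http"
    await send(
        {
            "type": "http.response.start",
            "status": 200,
            "headers": [[b"content-type", b"text/html"]],
        }
    )
    await send({"type": "http.response.body", "body": b"hello"})

async def call_app():
    sent = []  # collect every ASGI message the app emits

    async def receive():
        # A single, empty HTTP request body.
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(message):
        sent.append(message)

    await app({"type": "http", "method": "GET", "path": "/"}, receive, send)
    return sent

messages = asyncio.run(call_app())
# messages[0] is the http.response.start message, messages[1] the body
```

The same harness pattern is what test clients in ASGI frameworks build on: the app never knows whether `send` writes to a socket or to a list.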