Meg Kissinger is an American investigative journalist and a Visiting Professor at Columbia University. She is the author of "While You Were Out: An Intimate Family Portrait of Mental Illness in an Era of Silence", published by Macmillan on Sept. 5, 2023. While working at The Milwaukee Journal Sentinel, she and Susanne Rust were finalists for the 2009 Pulitzer Prize for Investigative Reporting for their investigation of Bisphenol A. Kissinger has also written extensively about the failures of the mental health system. She was born in Wilmette, Illinois, where she attended Regina Dominican High School. She graduated from DePauw University in 1979.

Awards
2013 George Polk Award for Medical Writing (http://www.jsonline.com/news/milwaukee/chronic-crisis-a-system-that-doesnt-heal-milwaukee-county-mental-health-system-210480011.html?ipad=y#!/emergency-detentions/)
2012 Robert F. Kennedy Journalism Award
2009 Pulitzer Prize for Investigative Reporting finalist
2008 George Polk Award
2008 John B. Oakes Award for distinguished environmental reporting
Scripps Howard National Journalism Award, 2009 and 2010
2009 Grantham Award of Special Merit

Work
"Chemical Fallout", Milwaukee Wisconsin Journal Sentinel

External links
"Logan Symposium: How the Sausage Is Made? (Journalists)", Berkeley Graduate School of Journalism
"Please Explain: BPA", WNYC, February
Nieman Storyboard: Meg Kissinger On Writing the Tough Stories (http://www.niemanstoryboard.org/2012/02/08/meg-kissinger-on-writing-the-tough-stories/)
Mystogenes is a genus of moths belonging to the subfamily Olethreutinae of the family Tortricidae.

Description
This genus currently contains only one species, Mystogenes astatopa Meyrick, 1930, described from Mauritius, which has a wingspan of 12 mm.

See also
List of Tortricidae genera

External links
tortricidae.com
First Empire may refer to:
First British Empire, sometimes used to describe the British Empire between 1583 and 1783
First Bulgarian Empire (680–1018)
First French Empire (1804–1814/1815)
First German Empire or "First Reich", sometimes used to describe the Holy Roman Empire (962–1806)
First Empire of Haiti (1804–1806)
First Mexican Empire (1821–1823)
First Persian Empire, sometimes used to describe the Achaemenid Empire (ca. 550 BCE – 336 BCE)
1st Empire Awards, film awards held in 1996

See also
Second Empire (disambiguation)
Third Empire (disambiguation)
```python
# your_sha256_hash___________
#
# Pyomo: Python Optimization Modeling Objects
# National Technology and Engineering Solutions of Sandia, LLC
# Under the terms of Contract DE-NA0003525 with National Technology and
# Engineering Solutions of Sandia, LLC, the U.S. Government retains certain
# rights in this software.
# your_sha256_hash___________

from pyomo.core import *


def pipe_rule(model, pipe, id):
    pipe.length = Param(within=NonNegativeReals)
    pipe.flow = Var()
    pipe.pIn = Var(within=NonNegativeReals)
    pipe.pOut = Var(within=NonNegativeReals)
    # pressure drop along the pipe is proportional to its length and flow
    pipe.pDrop = Constraint(
        expr=pipe.pIn - pipe.pOut == model.friction * pipe.length * pipe.flow
    )
    pipe.IN = Connector()
    pipe.IN.add(-1 * pipe.flow, "flow")
    pipe.IN.add(pipe.pIn, "pressure")
    pipe.OUT = Connector()
    pipe.OUT.add(pipe.flow, "flow")
    pipe.OUT.add(pipe.pOut, "pressure")


def node_rule(model, node, id):
    def _mass_balance(model, node, flows):
        return node.demand == sum_product(flows)

    node.demand = Param(within=Reals, default=0)
    node.flow = VarList()
    node.pressure = Var(within=NonNegativeReals)
    node.port = Connector()
    # node.port.add( node.flow,
    #     aggregate=lambda m,n,v: m.demands[id] == sum_product(v) )
    node.port.add(node.flow, aggregate=_mass_balance)
    node.port.add(node.pressure)


def _src_rule(model, pipe):
    return model.nodes[value(model.pipe_links[pipe, 1])].port == model.pipes[pipe].IN


def _sink_rule(model, pipe):
    return model.nodes[value(model.pipe_links[pipe, 2])].port == model.pipes[pipe].OUT


model = AbstractModel()
model.PIPES = Set()
model.NODES = Set()
model.friction = Param(within=NonNegativeReals)
# pipe_links[p, 1] is the source node of pipe p; pipe_links[p, 2] is its sink
model.pipe_links = Param(model.PIPES, RangeSet(2))

model.pipes = Block(model.PIPES, rule=pipe_rule)
model.nodes = Block(model.NODES, rule=node_rule)

# Connect the network
model.network_src = Constraint(model.PIPES, rule=_src_rule)
model.network_sink = Constraint(model.PIPES, rule=_sink_rule)


# Solve so the minimum pressure in the network is 0
def _obj(model):
    return sum(model.nodes[n].pressure for n in model.NODES)


model.obj = Objective(rule=_obj)
```
Sir Charles Thomas Newton (16 September 1816 – 28 November 1894) was a British archaeologist. He was made KCB in 1887. Life He was born in 1816, the second son of Newton Dickinson Hand Newton, vicar of Clungunford, Shropshire, and afterwards of Bredwardine, Herefordshire. He was educated at Shrewsbury School (then under Samuel Butler), and at Christ Church, Oxford (matriculating 17 Oct. 1833), where he graduated B.A. in 1837 and M.A. in 1840. Already in his undergraduate days Newton (as his friend and contemporary, John Ruskin, tells in Præterita) was giving evidence of his natural bent; the scientific study of classical archaeology, which Winckelmann had set on foot in Germany, was in England to find its worthy apostle in Newton. In 1840, contrary to the wishes of his family, he entered the British Museum as an assistant in the department of antiquities. As a career the museum, as it then was, can have presented but few attractions to a young man; but the department, as yet undivided, probably offered to Newton a wider range of comparative study in his subject than he could otherwise have acquired. In 1852, he was named vice-consul at Mytilene, and from April 1853 to January 1854 he was consul at Rhodes, with the definite duty, among others, of watching over the interests of the British Museum in the Levant. In 1854 and 1855, with funds advanced by Lord Stratford de Redcliffe, he carried on excavations in Kalymnos, enriching the British Museum with an important series of inscriptions, and in the following year he was at length enabled to undertake his long-cherished scheme of identifying the site, and recovering for this country the chief remains, of the mausoleum at Halicarnassus. In 1856–1857, he achieved the great archaeological exploit of his life by the discovery of the remains of the mausoleum, one of the seven wonders of the ancient world. He was greatly assisted by Murdoch Smith, afterwards celebrated in connection with Persian telegraphs. 
The results were described by Newton in his History of Discoveries at Halicarnassus (1862–1863), written in conjunction with R. P. Pullan, and in his Travels and Discoveries in the Levant (1865). These works included particulars of other important discoveries, especially at Branchidae, where he disinterred the statues which had anciently lined the Sacred Way, and at Cnidos, where Pullan, acting under his direction, found the Lion of Knidos now in the British Museum. In 1860, he was named consul at Rome, but was the following year recalled to take up the newly created post of keeper of Greek and Roman antiquities at the British Museum. Newton's keepership at the museum was marked by a wealth of important acquisitions, which were largely attributable to his personal influence or initiation. Thus in the ten years 1864–74 alone he was enabled to purchase no less than five important collections of classical antiquities: the Farnese, the two great series of Castellani, the Pourtales, and the Blacas collections, representing in special grants upwards of £100,000. Meanwhile, his work in the Levant, bringing to the museum the direct results of exploration and research, was being continued by his successors and friends: Biliotti in Rhodes, Smith and Porcher at Cyrene, Lang in Cyprus, Dennis in Sicily, in the Cyrenaica, and around Smyrna, Pullan at Priene, John Turtle Wood at Ephesus were all working more or less directly under Newton on behalf of the museum. Of his own work as a scholar in elucidating and editing the remains of antiquity, the list of his writings given below is only a slight indication; nor was this confined to writing alone. In 1855, he had been offered by Lord Palmerston (acting on Liddell's advice) the regius professorship of Greek at Oxford, rendered vacant by Dean Gaisford's death, with the definite object of creating a school of students in what was then a practically untried field of classical study at Oxford. 
The salary, however, was only nominal, and Newton was obliged to decline the post, which was then offered to and accepted by Benjamin Jowett. In 1880, however, the Yates chair of classical archaeology was created at University College, London, and by a special arrangement, Newton was enabled to hold it coincidentally with his museum appointment. As antiquary to the Royal Academy he lectured frequently. In the latter part of his career, he was closely associated with the work of three English societies, all of which owed to him more or less directly their inception and a large part of their success: the Society for the Promotion of Hellenic Studies, at the inaugural meeting of which he presided in June 1879; the British School at Athens, started in February 1885; and the Egypt Exploration Fund, which was founded in 1882. In 1889, he was presented by his friends and pupils, under the presidency of the Earl of Carnarvon, with a testimonial in the form of a marble portrait bust of himself by Boehm, now deposited in the Mausoleum Room at the British Museum; the balance of the fund was by his own wish devoted to founding a studentship in connection with the British School at Athens. In 1885, he resigned the museum and academy appointments, and in 1888 he was compelled by increasing infirmity to give up the Yates professorship. On 28 November 1894 he died at Margate, where he had moved from his residence, 2 Montague Place, Bedford Square.

Awards
In 1874 Newton was made honorary fellow of Worcester College, Oxford, and on 9 June 1875 D.C.L. of the same university; LL.D. of Cambridge, and PhD of Strasburg in 1879; Companion of the Bath (C.B.) on 16 November 1875, and Knight Commander of the same order (K.C.B.) on 21 June 1887. He was correspondent of the Institute of France, honorary director of the Archaeological Institute of Berlin, and honorary member of the Accademia dei Lincei of Rome. 
Family
On 27 April 1861, he married the distinguished painter, Ann Mary, daughter of Joseph Severn, himself a painter and the friend of John Keats, who had succeeded Newton in Rome; she died in 1866 at their residence, 74 Gower Street, Bloomsbury.

Works
He was editor of the Collection of Ancient Greek Inscriptions in the British Museum (1874 &c. fol.), and author of numerous other official publications of the British Museum; also of a treatise on the Method of the Study of Ancient Art, 1850; a History of Discoveries at Halicarnassus, Cnidus, and Branchidae, 1862–3; Travels and Discoveries in the Levant, 1865; Essays on Art and Archæology, 1880; and of many papers in periodicals, among which may be specially noted a Memoir on the Mausoleum in the Classical Museum for 1847.

References
Sources
Charles T. Newton, History of Discoveries at Halicarnassus (1863) Vol. II
Charles T. Newton, Travels & Discoveries in the Levant (1865) Vol. II (reissued by Cambridge University Press, 2010)
Charles T. Newton, Essays on Art and Archaeology (1886), Macmillan, London (reissued by Cambridge University Press, 2010)
Statistics of the Bahraini Premier League for the 1994–95 season.

Overview
The league was contested by 10 teams, and Muharraq Club won the championship.

League standings

References
Bahrain - List of final tables (RSSSF)
Use hosted scripts to increase performance
Navigating the browser history
ProgressEvent
Geolocation
Drag and Drop API
```xml
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="path_to_url"
    package="me.ele.app.amigo">
    <application>
        <meta-data
            android:name="data_key"
            android:value="AAA" />
    </application>
</manifest>
```
Wood Mountain Regional Park is a conservation and recreation area, kept in its natural state, set aside as a regional park in the south-western region of the Canadian province of Saskatchewan. The park is set in the semi-arid Palliser's Triangle in an upland area called the Wood Mountain Hills. It is in the Rural Municipality of Old Post No. 43, south of the village of Wood Mountain along Highway 18. Adjacent to the southern boundary of Wood Mountain Regional Park are Wood Mountain Creek and Wood Mountain Post Provincial Park. Immediately to the west is the Wood Mountain Indian reserve and to the east is the Wood Mountain Game Preserve. Amenities and attractions within the park include the Rodeo Ranch Museum, Wood Mountain Stampede, Sitting Bull Monument, ball diamonds, campsites, concessions, a swimming pool, and hiking and bicycling trails. The park is locally administered and locally funded.

History
In 1874, the Boundary Commission, which was charged with surveying the Canada–United States border, set up a depot on Wood Mountain Creek at the current location of Wood Mountain Regional Park. Later that year, the North-West Mounted Police (NWMP), on their March West to deal with the Cypress Hills Massacre, bought the depot and used it to establish relations with local First Nations, patrol the border with the United States, and to police whisky traders, horse thieves, and cattle rustlers. In 1876, Chief Sitting Bull led his 5,000-strong Lakota Sioux tribe away from the Little Bighorn River into the Wood Mountain Hills in Canada after defeating Custer at the Battle of the Little Bighorn. The Canadian government was concerned that the Sioux would cause problems, and charged James Walsh of the NWMP with maintaining control of what amounted to Canada's first attempted peacekeeping mission. Walsh succeeded, as he and Sitting Bull became close friends over the years. 
In the neighbouring provincial park, there are two reconstructed buildings with artefacts that tell the story of Walsh and Sitting Bull. Chief Sitting Bull and some of his people returned to the United States after five years, while most stayed in the Wood Mountain area. In 1910, they were given their own Indian reserve, and many of their descendants remain in the area to this day. The North-West Mounted Police closed Wood Mountain Post in 1883. Then, with the outbreak of the North-West Rebellion, it was re-opened in 1885. Two years later, the dilapidated buildings were abandoned and new buildings were constructed to the south-east, across Wood Mountain Creek (which is a tributary of Wood River via Lynthorpe Creek) and in the current Wood Mountain Post Provincial Park. The post operated at that location until it was permanently closed in 1918.

Attractions and amenities
Wood Mountain Regional Park is set in rolling hills and ranchland. There are several trails throughout the park, including one that leads to the provincial park. There is a heated swimming pool, a campground, a museum, a Bible camp, and Canada's longest-running rodeo. A monument to Chief Sitting Bull sits atop a hill overlooking the regional park, behind the museum.

Rodeo Ranch Museum
The Rodeo Ranch Museum features exhibits about the cowboys and ranchers who settled the area in the 1880s. Exhibits include photographs and pioneer, rodeo, and Western artefacts. The information centre for the East Block of Grasslands National Park is located in the museum.

Wood Mountain Stampede
In 1890, the Wood Mountain Stampede was established by the North-West Mounted Police to promote sports and to celebrate the July 1 Dominion Day holiday. It became an annual event held every second weekend in July and is Canada's longest-running annual rodeo. 
See also History of Saskatchewan List of protected areas of Saskatchewan Tourism in Saskatchewan References External links Wood Mountain Regional Park website Parks in Saskatchewan Museums in Saskatchewan History museums in Saskatchewan Old Post No. 43, Saskatchewan Regional parks of Canada Sioux Sitting Bull North-West Mounted Police
Schuylkill Township could refer to the following places in the U.S. state of Pennsylvania: Schuylkill Township, Chester County, Pennsylvania. Schuylkill Township, Schuylkill County, Pennsylvania. Pennsylvania township disambiguation pages
```typescript
/*
 * path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 */
import { HttpClient } from '@angular/common/http';
import { Injectable } from '@angular/core';

import { Groups } from '../entity/Groups';
import { BaseService } from './base.service';

@Injectable({ providedIn: 'root' })
export class GroupsService extends BaseService<Groups> {
  constructor(private _httpClient: HttpClient) {
    super(_httpClient, '/access/groups');
  }
}
```
Quenching The Light (2008) is a PSA-like documentary short film that highlights the persecution of Baháʼís in Iran.

Production
The nine-minute documentary was produced by Mithaq Kazimi and features artist Mahmehr Golestaneh's paintings of some of the martyrs of the Baháʼí Faith since 1978, mixed with live video footage of the descendants of the martyrs. The score was composed by Christopher Tressler and Larry Robinson. It was the first documentary addressing the 5 March to 14 May 2008 arrests of the Baháʼí 7. On the internet, the documentary was first released on Google Video and later on YouTube.

Response
When first uploaded, the video was "flagged" by unknown sources and therefore removed by YouTube. A few months later, in September 2008, the producer was able to open another account and upload the video.

See also
Iranian Taboo (2011)
Education Under Fire (2011)
To Light a Candle (2014)

External links
Official Website
Baha'is Online article
The University of Auckland
Dvorce () is a former village in the Levoča District in Slovakia. Dvorce was part of the Szepes county of the Kingdom of Hungary and was part of the Levoča District. After over 600 years, the village was evacuated and destroyed in the 1950s to make way for the Javorina military training area. Some of the fittings of the Evangelical (Lutheran) church of Dvorce, which was also destroyed, are preserved in the parsonage of the Evangelical Church, Levoča. Dvorce has been known by various names: in Hungarian as 'Dvorecz' (1869 place name), and 'Szepesudvard' (current usage) and in German as 'Burgerhof'. References Former villages in Slovakia
The 2013 Antigua Barracuda FC season was the club's third and final season in existence, playing in the USL Pro, which at the time was the third division in the United States soccer pyramid. The regular season began on 6 April 2013 and concluded on 31 August 2013. During the 2013 season, Antigua Barracuda played all of their matches on the road, thus being a travelling team. The team became the first team in the history of United States soccer leagues to lose every single one of their matches, amassing a record of 0-26-0, scoring 11 goals and conceding 91. Outside of USL Pro, Antigua Barracuda played in the 2013 CFU Club Championship, being a club based in a CFU-member nation. There, they finished in third place in their group, losing one match and tying another. Barracuda did not participate in the Antigua and Barbuda FA Cup, and did not qualify for the USL Pro Playoffs. Following the conclusion of the 2013 season, Antigua Barracuda FC folded due to financial difficulties. Background Review Competitions Preseason USL Pro Table Results summary Results by round Match reports CFU Club Championship Group table First round Statistics Transfers Player awards References 2013 USL Pro season Antigua and Barbuda football clubs 2012–13 season Antigua and Barbuda football clubs 2013–14 season
```python
#!/usr/bin/env python
# coding=utf-8
# author: zengyuetian

import unittest

from lib.utility.date import *


class DateTest(unittest.TestCase):
    def setUp(self):
        pass

    def tearDown(self):
        pass

    def test_time_string(self):
        self.assertEqual(len(get_time_string()), 14)

    def test_date_string(self):
        self.assertEqual(len(get_date_string()), 8)

    def test_year_string(self):
        self.assertEqual(len(get_year_month_string()), 6)


if __name__ == '__main__':
    unittest.main()
```
Parliamentary elections were held in Kosovo on 6 October 2019. The main opposition parties received the most votes, led by Vetëvendosje and the Democratic League of Kosovo (LDK). Vetëvendosje leader Albin Kurti became Prime Minister, forming a governing coalition with the LDK on an anti-corruption platform. He is the second Prime Minister not to have been a fighter of the Kosovo Liberation Army during the 1990s.

Background
On 19 July 2019 Prime Minister Ramush Haradinaj resigned after being summoned for questioning by the KSC in The Hague, Netherlands. The constitution requires the President, after consulting the political parties or coalitions that hold a majority in the Assembly, to designate a new candidate to form a government or to call new elections within 30 to 45 days. On 2 August 2019, President Hashim Thaçi asked the PANA Coalition to propose a new candidate to form a coalition government. However, other political parties opposed the move. On 5 August 2019, the Assembly of Kosovo agreed to hold an extraordinary session on 22 August, planning to disband itself so that elections could be scheduled. Subsequently, on 22 August 2019, MPs voted to dissolve parliament, with 89 of the 120 voting in favour, necessitating elections within 30–45 days.

Electoral system
The 120 members of the Assembly are elected by open list proportional representation, with 20 seats reserved for national minorities. An electoral threshold of 5% was in place for non-minority parties.

Parties and coalitions
On 7 September the Election Commission published the official list of the 25 participating parties and coalitions.

Opinion polls

Results
The initial results showed that the pro-government NISMA–AKR–PD alliance fell only a few hundred votes short of meeting the 5% electoral threshold and lost all 10 of their seats. 
However, the Kosovo Election Complaints and Appeals Panel subsequently ordered around 3,782 votes originating in Serbia to be removed from the vote count as they had been delivered by Serbian officials rather than by post. The removed votes allowed the NISMA-led alliance to cross the threshold and win six seats, a reduction of four from the prior election. Vetëvendosje and the Independent Liberal Party (which lost its parliamentary representation) were the only other parties to see a reduction in their seat totals. The Democratic League of Kosovo, Democratic Party of Kosovo, AAK–PSD alliance and the Serb List all gained seats. Voter turnout was around 45%. Aftermath After the election, Vetëvendosje leader Albin Kurti formed a coalition with the LDK. However, the government collapsed on 25 March following a motion of no confidence. Following the vote, the LDK formed a new government with the Alliance for the Future of Kosovo, the Social Democratic Initiative and the Serb List. However, due to the Constitutional Court's ruling and a deputy's conviction, the government collapsed again and another election will be held within 40 days starting from 22 December. References Kosovo Parliamentary election Elections in Kosovo Kosovan parliamentary election
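The interplay of the 5% threshold and proportional seat allocation described above can be sketched with a small example. This is an illustration only: it uses the Sainte-Laguë highest-averages method with invented vote counts, and the `exempt` parameter merely stands in for the reserved minority lists; it is not the official Kosovan counting procedure.

```python
def allocate(votes, seats, threshold=0.05, exempt=()):
    """Highest-averages (Sainte-Laguë) seat allocation with a vote threshold.

    votes: dict mapping party -> vote count
    exempt: parties not subject to the threshold (e.g. minority lists)
    """
    total = sum(votes.values())
    # Parties below the threshold win no seats unless exempt
    eligible = {p: v for p, v in votes.items()
                if p in exempt or v / total >= threshold}
    alloc = {p: 0 for p in eligible}
    for _ in range(seats):
        # Each seat goes to the party with the highest quotient v / (2s + 1)
        winner = max(eligible, key=lambda p: eligible[p] / (2 * alloc[p] + 1))
        alloc[winner] += 1
    return alloc


# Invented vote counts for illustration; party "D" falls just below 5%
votes = {"A": 221000, "B": 206000, "C": 178000, "D": 30000}
print(allocate(votes, seats=10))
```

Here "D" holds about 4.7% of the vote, so it is excluded and the ten seats split 4-3-3 among the remaining parties; exempting "D" (as a minority list) would let it compete for seats regardless of its share.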
The weekly address of the president of the United States (also known as the Weekly (Radio) Address or Your Weekly Address) is the weekly speech by the president of the United States to the nation. Franklin D. Roosevelt was the first U.S. president to deliver such radio addresses. Ronald Reagan revived the practice of delivering a weekly Saturday radio broadcast in 1982, and his successors all continued the practice until Donald Trump ceased doing so seventeen months into his term. As the Internet became mainstream during the 1990s, the weekly address was made available on other media. George W. Bush introduced an audio podcast feed and Barack Obama introduced a weekly video address during his presidential transition period. Donald Trump continued the weekly video address for the first nine months of his administration, after which he ended the practice. He later released occasional "weekly" addresses before ceasing the tradition in June 2018. Joe Biden revived the practice of making a weekly address in February 2021 in the form of “Weekly Conversations”, answering prepared questions or concerns from citizens. In July 2021, he stopped doing Weekly Conversations. As vice president, Biden made weekly addresses on behalf of Barack Obama during the Obama administration. History Franklin D. Roosevelt first used what would become known as fireside chats in 1929 as Governor of New York. His third gubernatorial address—April 3, 1929, on WGY radio—is cited by Roosevelt biographer Frank Freidel as being the first fireside chat. As president he continued the tradition, which he called his fireside chats. The success of these presidential addresses encouraged their continuation by future presidents. The practice of regularly scheduled addresses began in 1982 when President Ronald Reagan started delivering a radio broadcast every Saturday. Conservative journalist William A. 
Rusher, who publicly urged Reagan to begin the series of broadcasts, explicitly referred to the "fireside chats" and compared Reagan's communications skills to those of Roosevelt. During a sound check in preparation for his radio address of August 11, 1984, Reagan made the following comments in jest, which were later leaked to the general public: "My fellow Americans, I'm pleased to tell you today that I've signed legislation that will outlaw Russia forever. We begin bombing in five minutes." George H. W. Bush did not regularly record a weekly radio address; he recorded only a total of 18 addresses during his term in office, most toward the latter part. Bill Clinton regularly recorded a weekly radio address, often going over ten minutes with some speeches early in his term. George W. Bush was the first president to deliver the weekly radio address in English and Spanish, which he continued throughout his presidency. Later, George W. Bush began to have his addresses posted as an audio podcast once that technology became popular. Barack Obama used YouTube for regular video addresses as President-elect, and since his inauguration the weekly addresses continued on the White House website, the official White House YouTube channel, and networks such as C-SPAN, with the 24-hour cable news channels and network morning shows usually airing the full address only if the topic involved a breaking news event; short summaries of the address and the talking points within were otherwise edited and presented within regular news reports throughout each Saturday. Until his final broadcast, Donald Trump continued to use the video address as his predecessor did. His weekly address was also webcast on Facebook as a live stream, and was released on Fridays instead of Saturdays. It has long become customary for the president's weekly radio address to be followed by a response from the opposition party. 
When the president is a Democrat, the opposition's response is given by a Republican and vice versa. This response is not limited to the subject of the president's address; it may take up other topics of political or social interest, a tribute to a figure who has died in the last week, a general patriotic message on holiday weekends (the latter two of which can also be part of the presidential address), or other concerns working through the Senate or House which have not yet been addressed by the executive branch. Despite the discontinuation of the president's weekly addresses, the Democrats continued their weekly address through the remainder of the Trump administration. A common complaint about the president's weekly radio address in the pre-digital age (one that persists today) is that only a few radio stations (mainly public radio and all-news radio outlets, a format very rare outside of major metropolitan areas) carry the very short broadcasts; they are not advertised publicly, and very few Americans are able to find address coverage on their local radio dial, since Saturday mornings usually have brokered or paid programming carried on most commercial radio stations.

See also
Oval Office address
State of the Union
Weekly Democratic Address, the opposition response during a Republican presidency
Weekly Republican Address, the current opposition response during a Democratic presidency

External links
President Obama's Weekly Addresses
Transcripts of President G.W. Bush's Radio Addresses by date and topic
President G.W. Bush's Radio Address podcasts
Ronald Reagan's Presidential Radio Addresses from 1982 to 1989
George H.W. Bush's Presidential Radio Addresses from 1990 to 1992
Bill Clinton's Presidential Radio Addresses from 1993 to 2001
George W. Bush's Presidential Radio Addresses from 2001 to 2009
Barack Obama's Presidential Weekly Addresses from 2008 to 2017
Donald Trump's Presidential Weekly Addresses from 2017 to 2021
Corpus of Political Speeches: free access to political speeches by American and other politicians, developed by Hong Kong Baptist University Library
Isabel Jordayne (died c.1534) was an English abbess of Wilton Abbey. She was the penultimate abbess, whose election was debated by Cardinal Wolsey and Anne Boleyn before Henry VIII, the abbey's patron, chose her.

Life
Jordayne's birth and early life are not known. Her sister, Agnes Jordan, was the abbess at Syon Monastery, and Isabel was well respected as a nun at Wilton Abbey, described as "ancient, wise and discreet". The abbess of Wilton Abbey, Cecily Willoughby, died on 24 September 1528; Jordayne was the heir apparent, and her name was put forward to Cardinal Wolsey. The vacancy came at a difficult time, as thirty convents had been closed and converted to supplying education. The wealthy Wilton Abbey was not an obvious candidate for closure, as it was a royal foundation, although with Henry VIII as patron the abbey was obliged to supply favours at the monarch's request. The abbey nominated its prioress, Jordayne, while Anne Boleyn favored her brother-in-law William Carey's sister, Eleanor Carey. Henry VIII eventually agreed to Jordayne when Eleanor Carey's candidacy was destroyed by serious charges of immorality against her: Eleanor admitted that she had given birth to two children fathered by priests. One source says that it was Wolsey who appointed Jordayne. Eleanor's elder sister, who was a nun at Wilton, was also a contender, but she was rejected too. Although Isabel won the appointment, she was deprived of control of the abbey's finances until November 1528, and the abbey's reputation was damaged by the scandal surrounding its potential leaders. The abbey was then visited by the plague, there was a fire in one of the dormitories, and Jordayne struggled to bring the nuns into line with discipline. The bishop of Salisbury's vicar-general, Richard Hilley, also tried to impose order amongst the nuns in 1533. It is not clear when Jordayne died, but her position was said to be vacant in 1533. 
Cecily Bodenham was appointed as the new abbess; she was the last of Wilton Abbey's abbesses, and she had paid to be named to the post. References 1534 deaths English Roman Catholic abbesses
The following is a list of gangsta rap artists. 0–9 11/5 187 Fac 213 21 Savage 3X Krazy 40 Glocc 50 Cent A Above the Law AMG AZ Ant Banks B Bone Thugs-n-Harmony Boo-Yaa T.R.I.B.E. Boss Brotha Lynch Hung Bun B Bushwick Bill C C-Bo Chief Keef C-Murder Cold 187um (aka Big Hutch) Comethazine Compton's Most Wanted Cypress Hill D Detroit's Most Wanted Daz Dillinger DJ Quik Do or Die Doggy's Angels Dr. Dre Dru Down E E-40 Eazy-E G The Game Gangsta Boo Geto Boys Freddie Gibbs H Hard Boyz Havoc & Prodeje I Ice Cube Ice-T J Ja Rule Just-Ice K Kendrick Lamar Knight Owl Kurupt L Lil Rob Lunasicc M Mac Dre Mac Mall Mack 10 Master P MC Eiht MC Ren Mia X Mobb Deep Mozzy Mr. Capone-E Mr. Serv-On N N.W.A N2Deep Nardo Wick Nate Dogg The Notorious B.I.G. P Pimp C Pop Smoke R Ray Luv S Scarface Schoolboy Q Schoolly D Tupac Shakur Silkk the Shocker Skee-Lo Snoop Dogg South Central Cartel South Park Mexican Spice 1 Sidhu Moose Wala T Tay-K Tha Dogg Pound Thug Life Too Short TRU Three 6 Mafia U UGK W Warren G Westside Connection WC WC and the Maad Circle Willie D X X-Raided XXXTentacion Y Ya Boy YoungBoy Never Broke Again Z Z-Ro References Bibliography Gangsta rap
The Royal Elephant National Museum, also known as Chang Ton National Museum, is a museum located in Dusit District, Bangkok, Thailand. It is permanently closed. External links Thailand Travel Tours - Royal Elephant National Museum Museums in Bangkok National museums of Thailand History museums in Thailand
```c
/*-------------------------------------------------------------------------
 *
 * geqo_mutation.c
 *
 *    TSP mutation routines
 *
 * src/backend/optimizer/geqo/geqo_mutation.c
 *
 *-------------------------------------------------------------------------
 */

/* contributed by:
   =*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=
   *  Martin Utesch              * Institute of Automatic Control       *
   =                             = University of Mining and Technology  =
   *  utesch@aut.tu-freiberg.de  * Freiberg, Germany                    *
   =*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=*=
 */

/* this is adopted from Genitor : */
/*************************************************************/
/*                                                           */
/*  Darrell L. Whitley                                       */
/*  Computer Science Department                              */
/*  Colorado State University                                */
/*                                                           */
/*  Permission is hereby granted to copy all or any part of  */
/*  this program for free distribution.  The author's name   */
/*  and this copyright notice must be included in any copy.  */
/*                                                           */
/*************************************************************/

#include "postgres.h"

#include "optimizer/geqo_mutation.h"
#include "optimizer/geqo_random.h"

#if defined(CX)					/* currently used only in CX mode */

void
geqo_mutation(PlannerInfo *root, Gene *tour, int num_gene)
{
	int			swap1;
	int			swap2;
	int			num_swaps = geqo_randint(root, num_gene / 3, 0);
	Gene		temp;

	while (num_swaps > 0)
	{
		/* pick two random positions in the tour */
		swap1 = geqo_randint(root, num_gene - 1, 0);
		swap2 = geqo_randint(root, num_gene - 1, 0);

		/* make sure the two positions differ */
		while (swap1 == swap2)
			swap2 = geqo_randint(root, num_gene - 1, 0);

		/* swap the two genes */
		temp = tour[swap1];
		tour[swap1] = tour[swap2];
		tour[swap2] = temp;

		num_swaps -= 1;
	}
}

#endif							/* defined(CX) */
```
Belgrade News is a newspaper in Belgrade, Montana, United States. It is a free newspaper published twice weekly, on Tuesdays and Fridays. Its prior owner, Pioneer News Group, sold its papers to Adams Publishing Group in 2017. It provides local news covering the legislature, business, crime, municipal affairs, and other areas. References External links Official website Newspapers published in Montana News
```typescript
// *** WARNING: this file was generated by test. ***
// *** Do not edit by hand unless you're certain you know what you are doing! ***

import * as pulumi from "@pulumi/pulumi";
import * as inputs from "../types/input";
import * as outputs from "../types/output";
import {Cat, Dog} from "..";

/**
 * A toy for a dog
 */
export interface Chew {
    owner?: Dog;
}

/**
 * A Toy for a cat
 */
export interface Laser {
    animal?: Cat;
    batteries?: boolean;
    light?: number;
}

export interface Rec {
    rec1?: outputs.Rec;
}

/**
 * This is a toy
 */
export interface Toy {
    associated?: outputs.Toy;
    color?: string;
    wear?: number;
}
```
Paranjothi, popularly known as Siruthondar, was an army general of the great Pallava king Narasimhavarman I, who ruled South India from 630 to 668 CE. He led the Pallava army during the invasion of Vatapi in 642 CE. In the later years of his life, Paranjothi gave up violence and became a wandering Saivite monk, Siruthonda Nayanar. He is venerated as one of the 63 Nayanmars. Early life Paranjothi was born into a Tamil warrior clan (titled Mamaathirar), whose members served as soldiers in the Chola army, in Chengattankudi (now Thiruchenkkatukudi) village of Nagapattinam district, Tamil Nadu, during the 7th century AD. Tamil Nadu was then ruled by Mahendravarman I of the Pallava dynasty, with Kanchipuram as its capital. Paranjothi, who had mastered the art of war, moved to Kanchipuram to study literature and Saivite scriptures; Kanchipuram was then a renowned centre of learning in India. Conquest of Vatapi King Mahendravarman I, impressed by the courage and valour of Paranjothi, appointed him a commander in his army. After the death of Mahendravarman in 630 CE, his son Narasimhavarman became the ruler of the Pallava dynasty and Paranjothi became his army general; he was also a close friend of the king. As Narasimhavarman's trusted general, Paranjothi led his forces to Vatapi in 642 CE for war against the Chalukya king, Pulakeshin. Pulakeshin was killed in the battle and Vatapi was burnt to the ground, avenging the defeat of Mahendravarman by Pulakeshin at the battle of Pullalur in 618 CE. Vatapi Ganapathi Before the battle, Paranjothi worshipped a Ganesha sculpture on the walls of the Vatapi fort. On his return from the victorious campaign, he took the statue of Ganesha to his birthplace, Tiruchenkattankudi, to be worshipped as Vatapi Ganapathi. The statue and shrine to Vatapi Ganapathi are located in a temple in Tiruchenkattankudi in Nagapattinam district in the Tamil Nadu state of India.
Siruthondar On the victorious battlefield, Paranjothi underwent a change of heart and devoted himself to Lord Shiva. He became an ardent devotee of Lord Siva and came to be known as Siruthondar, later becoming one of the 63 Nayanmar saints. Siruthondar's life and devotion are narrated in Sekkizhar's Periya Puranam. In popular culture Paranjothi is one of the prominent characters in the Tamil historical novel Sivagamiyin Sapatham by Kalki Krishnamurthy. The novel depicts Paranjothi's early years, his rise through the ranks of the Pallava army to become its general, his role in securing the Kanchi fort against the imminent Vatapi invasion, his war on Pulakeshin of Vatapi, and his eventual victory over the Chalukyas. Further reading Parthiban Kanavu Sivagamiyin Sapatham References External links Nayanars 7th-century monarchs in Asia Year of birth missing Year of death missing
Józefów () is a village in the administrative district of Gmina Chocz, within Pleszew County, Greater Poland Voivodeship, in west-central Poland. References Villages in Pleszew County
Mary E. L. Butler (1874–1920) was an Irish writer and Irish-language activist. Mary Ellen Butler was the daughter of Peter Lambert Butler and the granddaughter of William Butler of Bunnahow, County Clare. She was a first cousin (once removed) of Edward Carson. In order to learn Irish she made several visits to the Aran Islands. According to her memoirs, which are held in a Benedictine monastery in France, she was converted to the nationalist cause after reading John Mitchel's Jail Journal. From 1899 to 1904 she edited a women's page and a children's page in the Irish Weekly Independent, promoting the nationalist cause in both. She joined the Gaelic League, where she met Irish-language enthusiasts such as Evelyn Donovan, Agnes O'Farrelly and Máire Ní Chinnéide, and spent several years on its executive. In 1907, she married Thomas O'Nolan, who died in 1913. She was a close friend of Arthur Griffith, and in a letter of condolence which Griffith wrote to her sister from Mountjoy Jail in 1920, he stated that Mary Butler was the first person to suggest to him the name Sinn Féin as the title of the new organisation which he had founded. She died in Rome in 1920 and is buried there. Early life and education The Butlers of County Clare were a landowning family that had remained Catholic, unlike other branches of the same family. Her father, Peter, had been educated in France and was as much at home in French as in English or Irish. The family of her mother, Ellen Lambert of Castle Ellen, County Galway, remained staunchly Gaelic. Ellen's younger sister, Isabella, was the mother of Edward Carson (later Lord Carson, though better known as Sir Edward Carson). Thus Edward Carson, who became the symbol of Ulster Unionism, was a cousin of Mary Butler. Her early education was conducted at home, preparing the way for later tuition by masters.
During the Land League agitation of the late 1870s and early 1880s, Butler was educated at Alexandra College, Dublin, a school attended largely by young women of Protestant background, where she had the advantage of native masters for the French, Italian, and German languages, which she studied with enthusiasm. According to A Life of Mary Butler, a marked characteristic of her young days, and one that grew with her growth, was her intense love of home ties. Butler's conversion to nationalism resulted from a chance encounter with the writings of Young Ireland, and like many blossoming cultural nationalists, she immersed herself in the culture of the western part of the country, visiting the Aran Islands on several occasions. Activism Butler was a member of the Gaelic League, serving for several years on its executive, but unlike other notable female members such as Countess Markievicz and Maud Gonne, she did not seek an "assertive public role". Butler was not a professed feminist; she believed that for women to progress in Ireland, they had to do so within the limits of conservative Irish society. She approached much of her writing from a gender-related viewpoint, with frequent reference to the domestic side of activism. Her articles appeared frequently in the League's newspaper, An Claidheamh Soluis, as well as in the United Irishman and other periodicals; some were republished as pamphlets. Butler's overriding goal was always an Irish-speaking Ireland. It is no surprise, then, that she had ties to Arthur Griffith, who credited her with coining the name 'Sinn Féin'. Her writings focused mainly on nationalism in a domestic context, underlining the important role of women in maintaining and promoting nationalism in the home. Writing Butler was a frequent contributor to the leading periodicals of "Irish Ireland".
Her articles appeared frequently in the League's newspaper, An Claidheamh Soluis, as well as in the United Irishman and in various Irish-American periodicals. She also wrote columns in the Irish Weekly Independent on issues concerning children's education and women's role in the movement. Many of her articles were issued separately as pamphlets, among them "Irish Woman and the Home Language", "Two Schools: A Contrast" and "Womanhood and Nationhood". Butler also wrote fiction, publishing a collection of stories, “A Bundle of Rushes”, in 1900. In 1906 her first novel, “The Ring of Day”, was serialised in the Irish Peasant before being published in book form a year later. The book focused on a young woman's conversion to the cause of Irish-Ireland and may largely be taken as a self-portrait. Religion Butler was a devout Catholic, as shown by her frequent contributions to the Catholic Bulletin and other Catholic periodicals. She believed that the Church promoted a domesticated ideal of women, though she held that women could still do much for the Irish-language revival movement from the comfort of their own homes. She lived for a while in Brittany, France, with her mother, before moving to Rome. Butler was very comfortable in Brittany, viewing the native Bretons as sharing the same Celtic roots, Catholic beliefs and strong level of faith as the Irish. In her writings for the Catholic Bulletin she repeatedly compared the Breton landscape to the west coast of Ireland, remarking on how similar they were. She viewed France and Italy as two very free countries and allies of Ireland, and seemed to believe that Ireland should aspire to be more like them politically. She believed that the general populations of the two countries were very interested in the Irish political affairs of the time.
She also loved her time in Rome, as the city had such a long and strong history of connection with Catholicism and the ground was, as she said, "soaked with the blood of martyrs". She was buried in the Catholic cemetery of San Lorenzo in Rome on 29 November 1920. References Bibliography Irish novelists Writers from County Clare 1874 births 1920 deaths People educated at Alexandra College Women's page journalists 20th-century Irish women writers 19th-century Irish women writers
Garrison is a small village near Lough Melvin in County Fermanagh, Northern Ireland. The Roogagh River runs through the village. In the 2021 census it had a population of 411 people. It is situated within Fermanagh and Omagh district. According to the UK Met Office, the highest temperature ever recorded in Northern Ireland is 30.8 °C (87.4 °F) at Knockarevan, Garrison, on 30 June 1976. Toponymy The village's name comes from a military barracks and its garrison of troops established in the village by William III of England following the Battle of Aughrim in 1691. History Garrison was one of several Catholic border villages in Fermanagh that would have been transferred to the Irish Free State had the recommendations of the Irish Boundary Commission been enacted in 1925. The Melvin Hotel, previously owned by the McGovern family, was blown up by the IRA in January 1972, in the middle of a Catholic wedding reception, reportedly in retaliation for allowing members of the security forces to stay on the premises. The Police Service of Northern Ireland came under gun attack in the village on 21 November 2009. Tourism Visitors to Garrison can enjoy a wide range of activities including golfing, fishing, hill-walking, water sports, horse-riding, cycling, camping and caving. The Lough Melvin Holiday Centre caters for large groups and there are many local guesthouses and chalets to let. Two local pubs – The Melvin Bar and The Riverside Bar – provide music and craic. The local restaurant, The Bilberry, is well known in the North-West region. Transport Ulsterbus route 64 serves Garrison on Thursdays, with two journeys to Belleek and Belcoo and one journey to Letterbreen and Enniskillen. Belleek, approximately five miles away, is served by Bus Éireann route 30 every two hours each way for most of the day, plus an overnight coach. This route operates to Donegal, Cavan, Dublin Airport and Dublin.
Lough Melvin Lough Melvin in Ireland is home to the gillaroo (Salmo stomachius), a species of trout which feeds primarily on snails. The name derives from the Irish giolla rua ('red fellow'), a reference to the fish's distinctive colouring: it has a bright buttery golden colour on its flanks with bright crimson and vermillion spots. The gillaroo is also characterised by a "gizzard", which is used to aid the digestion of hard food items such as water snails. They feed almost exclusively on bottom-living animals (snails, sedge fly larvae and freshwater shrimp), except in late summer when they come to the surface to feed and may be caught on the dry fly. Other lakes reputed to contain the gillaroo are Lough Neagh, Lough Conn, Lough Mask and Lough Corrib. However, the unique gene found in the Lough Melvin trout has not been found in some 200 trout populations in Ireland or Britain, and experiments carried out by Queen's University Belfast established that the Lough Melvin gillaroo has not been found anywhere else in the world. The sonaghan trout (Salmo nigripinnis) is a sub-species of trout unique to Lough Melvin. It can have a light brown or silvery hue with large, distinctive black spots. There are sometimes small, inconspicuous red spots along its posterior region. Its fins are dark brown or black with elongated pectorals. Sonaghan are found in areas of open, deep water, where they feed on mid-water planktonic organisms. Notable residents Mick Moohan, one-time cabinet minister in the New Zealand Government, and Patrick Treacy, author and one-time physician to Michael Jackson, were both born in Garrison. See also List of townlands in County Fermanagh List of towns and villages in Northern Ireland B52 road (Northern Ireland) References Culture Northern Ireland Villages in County Fermanagh Townlands of County Fermanagh Fermanagh and Omagh district
The SCW World Tag Team Championship was the top tag team championship in Southwest Championship Wrestling from its establishment in 1980 until 1984, when the title was abandoned. History References External links SCW World Tag Team title history World Tag Team Championship Tag team wrestling championships
```kotlin
package net.corda.nodeapi.internal.protonwrapper.netty

import io.netty.channel.ChannelDuplexHandler
import io.netty.channel.ChannelHandler
import io.netty.channel.ChannelHandlerContext
import io.netty.channel.ChannelPromise
import io.netty.handler.logging.LogLevel
import io.netty.util.internal.logging.InternalLogLevel
import io.netty.util.internal.logging.InternalLogger
import io.netty.util.internal.logging.InternalLoggerFactory
import java.net.SocketAddress

@ChannelHandler.Sharable
class NettyServerEventLogger(level: LogLevel = DEFAULT_LEVEL, val silencedIPs: Set<String> = emptySet()) : ChannelDuplexHandler() {
    companion object {
        val DEFAULT_LEVEL: LogLevel = LogLevel.DEBUG
    }

    private val logger: InternalLogger = InternalLoggerFactory.getInstance(javaClass)
    private val internalLevel: InternalLogLevel = level.toInternalLevel()

    @Throws(Exception::class)
    override fun channelActive(ctx: ChannelHandlerContext) {
        if (logger.isEnabled(internalLevel)) {
            logger.log(internalLevel, "Server socket ${ctx.channel()} ACTIVE")
        }
        ctx.fireChannelActive()
    }

    @Throws(Exception::class)
    override fun channelInactive(ctx: ChannelHandlerContext) {
        if (logger.isEnabled(internalLevel)) {
            logger.log(internalLevel, "Server socket ${ctx.channel()} INACTIVE")
        }
        ctx.fireChannelInactive()
    }

    @Suppress("OverridingDeprecatedMember")
    @Throws(Exception::class)
    override fun exceptionCaught(ctx: ChannelHandlerContext, cause: Throwable) {
        if (logger.isEnabled(internalLevel)) {
            logger.log(internalLevel, "Server socket ${ctx.channel()} EXCEPTION ${cause.message}", cause)
        }
        ctx.fireExceptionCaught(cause)
    }

    @Throws(Exception::class)
    override fun bind(ctx: ChannelHandlerContext, localAddress: SocketAddress, promise: ChannelPromise) {
        if (logger.isEnabled(internalLevel)) {
            logger.log(internalLevel, "Server socket ${ctx.channel()} BIND $localAddress")
        }
        ctx.bind(localAddress, promise)
    }

    @Throws(Exception::class)
    override fun close(ctx: ChannelHandlerContext, promise: ChannelPromise) {
        if (logger.isEnabled(internalLevel)) {
            logger.log(internalLevel, "Server socket ${ctx.channel()} CLOSE")
        }
        ctx.close(promise)
    }

    @Throws(Exception::class)
    override fun channelRead(ctx: ChannelHandlerContext, msg: Any) {
        val level = if (msg is io.netty.channel.socket.SocketChannel) {
            // Should always be the case as this is a server socket, but be defensive
            if (msg.remoteAddress()?.hostString !in silencedIPs) internalLevel else InternalLogLevel.TRACE
        } else internalLevel
        if (logger.isEnabled(level)) {
            logger.log(level, "Server socket ${ctx.channel()} ACCEPTED $msg")
        }
        ctx.fireChannelRead(msg)
    }
}
```
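In channelRead above, accepted connections from silenced peer IPs are logged at TRACE instead of the handler's configured level. That selection logic can be sketched in isolation, without Netty (levelFor and the Level enum are hypothetical stand-ins for the handler's use of InternalLogLevel, not part of the Corda codebase):

```kotlin
// Standalone version of the level-selection branch: connections from
// silenced hosts log at TRACE, all others at the configured level.
// A null host (unresolved remote address) falls through to the
// configured level, matching `null !in silencedIPs` in the original.
enum class Level { TRACE, DEBUG, INFO }

fun levelFor(remoteHost: String?, silencedIPs: Set<String>, configured: Level): Level =
    if (remoteHost != null && remoteHost in silencedIPs) Level.TRACE else configured

fun main() {
    val silenced = setOf("10.0.0.5")
    check(levelFor("10.0.0.5", silenced, Level.DEBUG) == Level.TRACE) // silenced peer
    check(levelFor("10.0.0.9", silenced, Level.DEBUG) == Level.DEBUG) // normal peer
    check(levelFor(null, silenced, Level.DEBUG) == Level.DEBUG)       // unresolved address
    println("ok")
}
```

This keeps noisy health-check or load-balancer peers out of the logs while still recording them at TRACE for debugging.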
St Laurence's Church, Gonalston is a Grade II listed parish church in the Church of England in Gonalston. History The church dates from the 14th century. It was rebuilt in 1843 by Thomas Chambers Hine. The church is in a joint parish with: Holy Cross Church, Epperstone St Swithun's Church, Woodborough St Peter & St Paul's Church, Oxton Memorials Memorials include three damaged early 14th-century reclining effigies of the Heriz family in the north aisle: two of cross-legged knights, and a third of Lady Mathilda in a wimple, her head beneath an ogee arch decorated with crockets, stiff-leaf and more naturalistic foliage. See also Listed buildings in Gonalston References Church of England church buildings in Nottinghamshire Gonalston
Kenneth Hahn Hall of Administration (abbreviated HOA), formerly the Los Angeles County Hall of Administration, completed 1960, is the seat of the government of the County of Los Angeles, California, and houses the Los Angeles County Board of Supervisors, meeting chambers, and the offices of several County departments. It is located in the Civic Center district of downtown Los Angeles, encompassing a city block bounded by Grand Avenue, Temple Street, Hill Street, and Grand Park. On an average workday, 2,700 civil servants occupy the building. History The Hall of Administration was originally conceived as part of the 1947 Civic Center Master Plan that ultimately transformed Bunker Hill as the Civic Center expanded westward. The Los Angeles County Courthouse (Stanley Mosk Courthouse), located opposite the Hall of Administration, was built at the same time by the same team of architects. Construction of the Hall of Administration began in 1952 and was completed in 1960. Prior to its construction, the Los Angeles County Hall of Records (originally built in 1911, and rebuilt in 1961) housed the Board of Supervisors, as well as other county government entities. On the night of November 13, 1968, Security Officer Lee Edward Roach was murdered at the Hall of Administration by a former janitorial employee. Officer Roach, of the Los Angeles County Mechanical Department, was guarding payroll warrants on the fifth floor of the building. The complex was renamed the Kenneth Hahn Hall of Administration in 1992, in honor of Los Angeles County's longest-serving Supervisor, Kenneth Hahn. Architecture The Hall of Administration, a 10-story complex, is built in the Late Moderne style. It was designed by the architects Paul R. Williams and Adrian Wilson and the firms Austin, Field & Fry and Stanton & Stockwell. The Hall of Administration sits atop a complex of underground pedestrian tunnels that connect it to other government buildings in the Civic Center.
The complex features integrated public art displays, including a pair of sculptures called "The Law Givers" by the sculptor Albert Stewart. In the second-floor lobby stands a bronze bust of Abraham Lincoln, sculpted by Emil Seletz in 1958. See also Civic Center, Los Angeles Grand Park References Hahn Hall of Administration Hahn Hall of Administration Civic Center, Los Angeles Hahn Hall of Administration Hahn Hall of Administration Government buildings completed in 1960 1960 establishments in California 1960s architecture in the United States Moderne architecture in California Stripped Classical architecture in the United States Paul Williams (architect) buildings Stanton & Stockwell buildings
Juan José Cuevas García (born 2 April 1965) is a Mexican politician from the National Action Party. From 2009 to 2012 he served as Deputy of the LXI Legislature of the Mexican Congress representing Jalisco. References 1965 births Living people Politicians from Nayarit National Action Party (Mexico) politicians 21st-century Mexican politicians Autonomous University of Nayarit alumni Deputies of the LXI Legislature of Mexico Members of the Chamber of Deputies (Mexico) for Jalisco
```php
<?php
/*
 * Licensed under the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License. You may obtain
 * a copy of the License at
 *
 *   path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
 * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 * License for the specific language governing permissions and limitations
 * under the License.
 */

namespace Google\Service\Firestore;

class Status extends \Google\Collection
{
  protected $collection_key = 'details';
  /**
   * @var int
   */
  public $code;
  /**
   * @var array[]
   */
  public $details;
  /**
   * @var string
   */
  public $message;

  /**
   * @param int
   */
  public function setCode($code)
  {
    $this->code = $code;
  }
  /**
   * @return int
   */
  public function getCode()
  {
    return $this->code;
  }
  /**
   * @param array[]
   */
  public function setDetails($details)
  {
    $this->details = $details;
  }
  /**
   * @return array[]
   */
  public function getDetails()
  {
    return $this->details;
  }
  /**
   * @param string
   */
  public function setMessage($message)
  {
    $this->message = $message;
  }
  /**
   * @return string
   */
  public function getMessage()
  {
    return $this->message;
  }
}

// Adding a class alias for backwards compatibility with the previous class name.
class_alias(Status::class, 'Google_Service_Firestore_Status');
```
```java /* * DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER. * * * Subject to the condition set forth below, permission is hereby granted to any * person obtaining a copy of this software, associated documentation and/or * data (collectively the "Software"), free of charge and under any and all * copyright rights in the Software, and any and all patent rights owned or * freely licensable by each licensor hereunder covering either (i) the * unmodified Software as contributed to or provided by such licensor, or (ii) * the Larger Works (as defined below), to deal in both * * (a) the Software, and * * (b) any piece of software and/or hardware listed in the lrgrwrks.txt file if * one is included with the Software each a "Larger Work" to which the Software * is contributed by such licensors), * * without restriction, including without limitation the rights to copy, create * derivative works of, display, perform, and distribute the Software and make, * use, sell, offer for sale, import, export, have made, and have sold the * Software and the Larger Work(s), and to sublicense the foregoing rights on * either these or other terms. * * This license is subject to the following condition: * * The above copyright notice and either this complete permission notice or at a * minimum a reference to the UPL must be included in all copies or substantial * portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE * SOFTWARE. 
*/ package com.oracle.truffle.polyglot; import static com.oracle.truffle.api.CompilerDirectives.shouldNotReachHere; import static com.oracle.truffle.polyglot.EngineAccessor.RUNTIME; import java.lang.reflect.Type; import java.math.BigInteger; import java.nio.ByteOrder; import java.time.Duration; import java.time.Instant; import java.time.LocalDate; import java.time.LocalTime; import java.time.ZoneId; import java.util.AbstractSet; import java.util.Arrays; import java.util.Collections; import java.util.Iterator; import java.util.Map; import java.util.NoSuchElementException; import java.util.Set; import org.graalvm.polyglot.impl.AbstractPolyglotImpl; import org.graalvm.polyglot.impl.AbstractPolyglotImpl.APIAccess; import org.graalvm.polyglot.impl.AbstractPolyglotImpl.AbstractValueDispatch; import com.oracle.truffle.api.CallTarget; import com.oracle.truffle.api.CompilerDirectives.TruffleBoundary; import com.oracle.truffle.api.dsl.Bind; import com.oracle.truffle.api.dsl.Cached; import com.oracle.truffle.api.dsl.GenerateCached; import com.oracle.truffle.api.dsl.GenerateInline; import com.oracle.truffle.api.dsl.ImportStatic; import com.oracle.truffle.api.dsl.Specialization; import com.oracle.truffle.api.interop.ArityException; import com.oracle.truffle.api.interop.InteropLibrary; import com.oracle.truffle.api.interop.InvalidArrayIndexException; import com.oracle.truffle.api.interop.InvalidBufferOffsetException; import com.oracle.truffle.api.interop.StopIterationException; import com.oracle.truffle.api.interop.TruffleObject; import com.oracle.truffle.api.interop.UnknownIdentifierException; import com.oracle.truffle.api.interop.UnknownKeyException; import com.oracle.truffle.api.interop.UnsupportedMessageException; import com.oracle.truffle.api.interop.UnsupportedTypeException; import com.oracle.truffle.api.library.CachedLibrary; import com.oracle.truffle.api.nodes.Node; import com.oracle.truffle.api.profiles.InlinedBranchProfile; import 
com.oracle.truffle.api.strings.TruffleString;
import com.oracle.truffle.polyglot.PolyglotLanguageContext.ToGuestValueNode;
import com.oracle.truffle.polyglot.PolyglotLanguageContext.ToGuestValuesNode;
import com.oracle.truffle.polyglot.PolyglotLanguageContext.ToHostValueNode;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsClassLiteralNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsDateNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsDurationNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsInstantNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsNativePointerNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsTimeNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsTimeZoneNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.AsTypeLiteralNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.CanExecuteNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.CanInstantiateNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.CanInvokeNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ExecuteNoArgsNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ExecuteNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ExecuteVoidNoArgsNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ExecuteVoidNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetArrayElementNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetArraySizeNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetBufferSizeNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashEntriesIteratorNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashKeysIteratorNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashSizeNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashValueNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashValueOrDefaultNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetHashValuesIteratorNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetIteratorNextElementNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetMemberKeysNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetMemberNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetMetaQualifiedNameNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.GetMetaSimpleNameNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasArrayElementsNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasBufferElementsNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasHashEntriesNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasHashEntryNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasIteratorNextElementNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasIteratorNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasMemberNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.HasMembersNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.InvokeNoArgsNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.InvokeNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsBufferWritableNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsDateNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsDurationNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsExceptionNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsMetaInstanceNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsMetaObjectNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsNativePointerNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsNullNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsTimeNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.IsTimeZoneNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.NewInstanceNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.PutHashEntryNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.PutMemberNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferByteNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferDoubleNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferFloatNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferIntNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferLongNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ReadBufferShortNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.RemoveArrayElementNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.RemoveHashEntryNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.RemoveMemberNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.SetArrayElementNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.ThrowExceptionNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferByteNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferDoubleNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferFloatNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferIntNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferLongNodeGen;
import com.oracle.truffle.polyglot.PolyglotValueDispatchFactory.InteropValueFactory.WriteBufferShortNodeGen;

abstract class PolyglotValueDispatch extends AbstractValueDispatch {

    private static final String TRUNCATION_SUFFIX = "...";
    private static final String UNKNOWN = "Unknown";
    static final InteropLibrary UNCACHED_INTEROP = InteropLibrary.getFactory().getUncached();

    final PolyglotImpl impl;
    final PolyglotLanguageInstance languageInstance;

    PolyglotValueDispatch(PolyglotImpl impl, PolyglotLanguageInstance languageInstance) {
        super(impl);
        this.impl = impl;
        this.languageInstance = languageInstance;
    }

    @Override
    public final Object getContext(Object context) {
        if (context == null) {
            return null;
        }
        return ((PolyglotLanguageContext) context).context.api;
    }

    static <T extends Throwable> RuntimeException guestToHostException(PolyglotLanguageContext languageContext, T e, boolean entered) {
        throw PolyglotImpl.guestToHostException(languageContext, e, entered);
    }

    @Override
    public Object getArrayElement(Object languageContext, Object receiver, long index) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return getArrayElementUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static Object getArrayElementUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "getArrayElement(long)", "hasArrayElements()");
    }

    @Override
    public void setArrayElement(Object languageContext, Object receiver, long index, Object value) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            setArrayElementUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static void setArrayElementUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "setArrayElement(long, Object)", "hasArrayElements()");
    }

    @Override
    public boolean removeArrayElement(Object languageContext, Object receiver, long index) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw removeArrayElementUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException removeArrayElementUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "removeArrayElement(long, Object)", null);
    }

    @Override
    public long getArraySize(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return getArraySizeUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static long getArraySizeUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "getArraySize()", "hasArrayElements()");
    }

    // region Buffer Methods

    @Override
    public boolean isBufferWritable(Object languageContext, Object receiver) throws UnsupportedOperationException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw isBufferWritableUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException isBufferWritableUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "isBufferWritable()", "hasBufferElements()");
    }

    @Override
    public long getBufferSize(Object languageContext, Object receiver) throws UnsupportedOperationException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw getBufferSizeUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException getBufferSizeUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "getBufferSize()", "hasBufferElements()");
    }

    @Override
    public byte readBufferByte(Object languageContext, Object receiver, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw readBufferByteUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException readBufferByteUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "readBufferByte()", "hasBufferElements()");
    }

    @Override
    public void readBuffer(Object languageContext, Object receiver, long byteOffset, byte[] destination, int destinationOffset, int length) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw readBufferUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException readBufferUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "readBuffer()", "hasBufferElements()");
    }

    @Override
    public void writeBufferByte(Object languageContext, Object receiver, long byteOffset, byte value) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw writeBufferByteUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException writeBufferByteUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "writeBufferByte()", "hasBufferElements()");
    }

    @Override
    public short readBufferShort(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw readBufferShortUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException readBufferShortUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "readBufferShort()", "hasBufferElements()");
    }

    @Override
    public void writeBufferShort(Object languageContext, Object receiver, ByteOrder order, long byteOffset, short value) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw writeBufferShortUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException writeBufferShortUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "writeBufferShort()", "hasBufferElements()");
    }

    @Override
    public int readBufferInt(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw readBufferIntUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException readBufferIntUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "readBufferInt()", "hasBufferElements()");
    }

    @Override
    public void writeBufferInt(Object languageContext, Object receiver, ByteOrder order, long byteOffset, int value) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw writeBufferIntUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException writeBufferIntUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "writeBufferInt()", "hasBufferElements()");
    }

    @Override
    public long readBufferLong(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw readBufferLongUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException readBufferLongUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "readBufferLong()", "hasBufferElements()");
    }

    @Override
    public void writeBufferLong(Object languageContext, Object receiver, ByteOrder order, long byteOffset, long value) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw writeBufferLongUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException writeBufferLongUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "writeBufferLong()", "hasBufferElements()");
    }

    @Override
    public float readBufferFloat(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw readBufferFloatUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException readBufferFloatUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "readBufferFloat()", "hasBufferElements()");
    }

    @Override
    public void writeBufferFloat(Object languageContext, Object receiver, ByteOrder order, long byteOffset, float value) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        final Object prev = hostEnter(context);
        try {
            throw writeBufferFloatUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException writeBufferFloatUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "writeBufferFloat()", "hasBufferElements()");
    }

    @Override
    public double readBufferDouble(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw readBufferDoubleUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException readBufferDoubleUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "readBufferDouble()", "hasBufferElements()");
    }

    @Override
    public void writeBufferDouble(Object languageContext, Object receiver, ByteOrder order, long byteOffset, double value) throws UnsupportedOperationException, IndexOutOfBoundsException {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw writeBufferDoubleUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException writeBufferDoubleUnsupported(PolyglotLanguageContext context, Object receiver) {
        return unsupported(context, receiver, "writeBufferDouble()", "hasBufferElements()");
    }

    @TruffleBoundary
    protected static RuntimeException invalidBufferIndex(PolyglotLanguageContext context, Object receiver, long byteOffset, long size) {
        final String message = String.format("Invalid buffer access of length %d at byte offset %d for buffer %s.", size, byteOffset, getValueInfo(context, receiver));
        throw PolyglotEngineException.bufferIndexOutOfBounds(message);
    }

    // endregion

    @Override
    public Object getMember(Object languageContext, Object receiver, String key) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return getMemberUnsupported(context, receiver, key);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static Object getMemberUnsupported(PolyglotLanguageContext context, Object receiver, @SuppressWarnings("unused") String key) {
        throw unsupported(context, receiver, "getMember(String)", "hasMembers()");
    }

    @Override
    public void putMember(Object languageContext, Object receiver, String key, Object member) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            putMemberUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException putMemberUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "putMember(String, Object)", "hasMembers()");
    }

    @Override
    public boolean removeMember(Object languageContext, Object receiver, String key) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw removeMemberUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException removeMemberUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "removeMember(String, Object)", null);
    }

    @Override
    public Object execute(Object languageContext, Object receiver, Object[] arguments) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw executeUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @Override
    public Object execute(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw executeUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException executeUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "execute(Object...)", "canExecute()");
    }

    @Override
    public Object newInstance(Object languageContext, Object receiver, Object[] arguments) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return newInstanceUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static Object newInstanceUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "newInstance(Object...)", "canInstantiate()");
    }

    @Override
    public void executeVoid(Object languageContext, Object receiver, Object[] arguments) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            executeVoidUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @Override
    public void executeVoid(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            executeVoidUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static void executeVoidUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "executeVoid(Object...)", "canExecute()");
    }

    @Override
    public Object invoke(Object languageContext, Object receiver, String identifier, Object[] arguments) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw invokeUnsupported(context, receiver, identifier);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @Override
    public Object invoke(Object languageContext, Object receiver, String identifier) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw invokeUnsupported(context, receiver, identifier);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static RuntimeException invokeUnsupported(PolyglotLanguageContext context, Object receiver, String identifier) {
        throw unsupported(context, receiver, "invoke(" + identifier + ", Object...)", "canInvoke(String)");
    }

    @Override
    public String asString(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asStringUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static String asStringUnsupported(PolyglotLanguageContext context, Object receiver) {
        return invalidCastPrimitive(context, receiver, String.class, "asString()", "isString()", "Invalid coercion.");
    }

    @Override
    public boolean asBoolean(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asBooleanUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    private static boolean isNullUncached(Object receiver) {
        return InteropLibrary.getFactory().getUncached().isNull(receiver);
    }

    protected static boolean asBooleanUnsupported(PolyglotLanguageContext context, Object receiver) {
        return invalidCastPrimitive(context, receiver, boolean.class, "asBoolean()", "isBoolean()", "Invalid or lossy primitive coercion.");
    }

    private static <T> T invalidCastPrimitive(PolyglotLanguageContext context, Object receiver, Class<T> clazz, String asMethodName, String isMethodName, String detail) {
        if (isNullUncached(receiver)) {
            throw nullCoercion(context, receiver, clazz, asMethodName, isMethodName);
        } else {
            throw cannotConvert(context, receiver, clazz, asMethodName, isMethodName, detail);
        }
    }

    @Override
    public int asInt(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asIntUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static int asIntUnsupported(PolyglotLanguageContext context, Object receiver) {
        return invalidCastPrimitive(context, receiver, int.class, "asInt()", "fitsInInt()", "Invalid or lossy primitive coercion.");
    }

    @Override
    public long asLong(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asLongUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static long asLongUnsupported(PolyglotLanguageContext context, Object receiver) {
        return invalidCastPrimitive(context, receiver, long.class, "asLong()", "fitsInLong()", "Invalid or lossy primitive coercion.");
    }

    @Override
    public BigInteger asBigInteger(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asBigIntegerUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static BigInteger asBigIntegerUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw cannotConvert(context, receiver, BigInteger.class, "asBigInteger()", "fitsInBigInteger()", "Invalid or lossy coercion.");
    }

    @Override
    public double asDouble(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asDoubleUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static double asDoubleUnsupported(PolyglotLanguageContext context, Object receiver) {
        return invalidCastPrimitive(context, receiver, double.class, "asDouble()", "fitsInDouble()", "Invalid or lossy primitive coercion.");
    }

    @Override
    public float asFloat(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asFloatUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static float asFloatUnsupported(PolyglotLanguageContext context, Object receiver) {
        return invalidCastPrimitive(context, receiver, float.class, "asFloat()", "fitsInFloat()", "Invalid or lossy primitive coercion.");
    }

    @Override
    public byte asByte(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asByteUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static byte asByteUnsupported(PolyglotLanguageContext context, Object receiver) {
        return invalidCastPrimitive(context, receiver, byte.class, "asByte()", "fitsInByte()", "Invalid or lossy primitive coercion.");
    }

    @Override
    public short asShort(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asShortUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static short asShortUnsupported(PolyglotLanguageContext context, Object receiver) {
        return invalidCastPrimitive(context, receiver, short.class, "asShort()", "fitsInShort()", "Invalid or lossy primitive coercion.");
    }

    @Override
    public long asNativePointer(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asNativePointerUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    static long asNativePointerUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw cannotConvert(context, receiver, long.class, "asNativePointer()", "isNativeObject()", "Value cannot be converted to a native pointer.");
    }

    @Override
    public Object asHostObject(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asHostObjectUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static Object asHostObjectUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw cannotConvert(context, receiver, null, "asHostObject()", "isHostObject()", "Value is not a host object.");
    }

    @Override
    public Object asProxyObject(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return asProxyObjectUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    protected static Object asProxyObjectUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw cannotConvert(context, receiver, null, "asProxyObject()", "isProxyObject()", "Value is not a proxy object.");
    }

    @Override
    public LocalDate asDate(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            if (isNullUncached(receiver)) {
                return null;
            } else {
                throw cannotConvert(context, receiver, null, "asDate()", "isDate()", "Value does not contain date information.");
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @Override
    public LocalTime asTime(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            if (isNullUncached(receiver)) {
                return null;
            } else {
                throw cannotConvert(context, receiver, null, "asTime()", "isTime()", "Value does not contain time information.");
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @Override
    public ZoneId asTimeZone(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            if (isNullUncached(receiver)) {
                return null;
            } else {
                throw cannotConvert(context, receiver, null, "asTimeZone()", "isTimeZone()", "Value does not contain time zone information.");
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @Override
    public Instant asInstant(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            if (isNullUncached(receiver)) {
                return null;
            } else {
                throw cannotConvert(context, receiver, null, "asInstant()", "isInstant()", "Value does not contain instant information.");
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @Override
    public Duration asDuration(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            if (isNullUncached(receiver)) {
                return null;
            } else {
                throw cannotConvert(context, receiver, null, "asDuration()", "isDuration()", "Value does not contain duration information.");
            }
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @Override
    public RuntimeException throwException(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw unsupported(context, receiver, "throwException()", "isException()");
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @Override
    public final Object getMetaObject(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return getMetaObjectImpl(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @Override
    public Object getIterator(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return getIteratorUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static final Object getIteratorUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "getIterator()", "hasIterator()");
    }

    @Override
    public boolean hasIteratorNextElement(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return hasIteratorNextElementUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static final boolean hasIteratorNextElementUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "hasIteratorNextElement()", "isIterator()");
    }

    @Override
    public Object getIteratorNextElement(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            return getIteratorNextElementUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static final Object getIteratorNextElementUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "getIteratorNextElement()", "isIterator()");
    }

    @Override
    public long getHashSize(Object languageContext, Object receiver) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw getHashSizeUnsupported(context, receiver);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static final RuntimeException getHashSizeUnsupported(PolyglotLanguageContext context, Object receiver) {
        throw unsupported(context, receiver, "getHashSize()", "hasHashEntries()");
    }

    @Override
    public Object getHashValue(Object languageContext, Object receiver, Object key) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw getHashValueUnsupported(context, receiver, key);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static final RuntimeException getHashValueUnsupported(PolyglotLanguageContext context, Object receiver, @SuppressWarnings("unused") Object key) {
        throw unsupported(context, receiver, "getHashValue(Object)", "hasHashEntries()");
    }

    @Override
    public Object getHashValueOrDefault(Object languageContext, Object receiver, Object key, Object defaultValue) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw getHashValueOrDefaultUnsupported(context, receiver, key, defaultValue);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    @SuppressWarnings("unused")
    static final RuntimeException getHashValueOrDefaultUnsupported(PolyglotLanguageContext context, Object receiver, Object key, Object defaultValue) {
        throw unsupported(context, receiver, "getHashValueOrDefault(Object, Object)", "hasHashEntries()");
    }

    @Override
    public void putHashEntry(Object languageContext, Object receiver, Object key, Object value) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            putHashEntryUnsupported(context, receiver, key, value);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }

    @TruffleBoundary
    static final RuntimeException putHashEntryUnsupported(PolyglotLanguageContext context, Object receiver, @SuppressWarnings("unused") Object key, @SuppressWarnings("unused") Object value) {
        throw unsupported(context, receiver, "putHashEntry(Object, Object)", "hasHashEntries()");
    }

    @Override
    public boolean removeHashEntry(Object languageContext, Object receiver, Object key) {
        PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext;
        Object prev = hostEnter(context);
        try {
            throw removeHashEntryUnsupported(context, receiver, key);
        } catch (Throwable e) {
            throw guestToHostException(context, e, true);
        } finally {
            hostLeave(context, prev);
        }
    }
@TruffleBoundary static final RuntimeException removeHashEntryUnsupported(PolyglotLanguageContext context, Object receiver, @SuppressWarnings("unused") Object key) { throw unsupported(context, receiver, "removeHashEntry(Object)", "hasHashEntries()"); } @Override public Object getHashEntriesIterator(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw getHashEntriesIteratorUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final RuntimeException getHashEntriesIteratorUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getHashEntriesIterator()", "hasHashEntries()"); } @Override public Object getHashKeysIterator(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw getHashKeysIteratorUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final RuntimeException getHashKeysIteratorUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getHashKeysIterator()", "hasHashEntries()"); } @Override public Object getHashValuesIterator(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw getHashValuesIteratorUnsupported(context, receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public void pin(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); 
try { languageInstance.sharing.engine.host.pin(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @TruffleBoundary static final RuntimeException getHashValuesIteratorUnsupported(PolyglotLanguageContext context, Object receiver) { throw unsupported(context, receiver, "getHashValuesIterator()", "hasHashEntries()"); } protected Object getMetaObjectImpl(PolyglotLanguageContext context, Object receiver) { InteropLibrary lib = InteropLibrary.getFactory().getUncached(receiver); if (lib.hasMetaObject(receiver)) { try { return asValue(impl, context, lib.getMetaObject(receiver)); } catch (UnsupportedMessageException e) { throw shouldNotReachHere("Unexpected unsupported message.", e); } } return null; } private static Object asValue(PolyglotImpl polyglot, PolyglotLanguageContext context, Object value) { if (context == null) { return polyglot.asValue(PolyglotFastThreadLocals.getContext(null), value); } else { return context.asValue(value); } } static Object hostEnter(Object languageContext) { if (languageContext == null) { return null; } PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; PolyglotContextImpl c = context.context; try { return c.engine.enterIfNeeded(c, true); } catch (Throwable t) { throw guestToHostException(context, t, false); } } static void hostLeave(Object languageContext, Object prev) { if (languageContext == null) { return; } PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; try { PolyglotContextImpl c = context.context; c.engine.leaveIfNeeded(prev, c); } catch (Throwable t) { throw guestToHostException(context, t, false); } } @TruffleBoundary protected static RuntimeException unsupported(PolyglotLanguageContext context, Object receiver, String message, String useToCheck) { String polyglotMessage; if (useToCheck != null) { polyglotMessage = String.format("Unsupported operation Value.%s for %s. 
You can ensure that the operation is supported using Value.%s.", message, getValueInfo(context, receiver), useToCheck); } else { polyglotMessage = String.format("Unsupported operation Value.%s for %s.", message, getValueInfo(context, receiver)); } return PolyglotEngineException.unsupported(polyglotMessage); } private static final int CHARACTER_LIMIT = 140; private static final InteropLibrary INTEROP = InteropLibrary.getFactory().getUncached(); @TruffleBoundary static String getValueInfo(Object languageContext, Object receiver) { PolyglotContextImpl context = languageContext != null ? ((PolyglotLanguageContext) languageContext).context : null; return getValueInfo(context, receiver); } @TruffleBoundary static String getValueInfo(PolyglotContextImpl context, Object receiver) { if (context == null) { return receiver.toString(); } else if (receiver == null) { assert false : "receiver should never be null"; return "null"; } PolyglotLanguage displayLanguage = EngineAccessor.EngineImpl.findObjectLanguage(context.engine, receiver); Object view; if (displayLanguage == null) { displayLanguage = context.engine.hostLanguage; view = context.getHostContext().getLanguageView(receiver); } else { view = receiver; } String valueToString; String metaObjectToString = UNKNOWN; try { InteropLibrary uncached = InteropLibrary.getFactory().getUncached(view); if (uncached.hasMetaObject(view)) { Object qualifiedName = INTEROP.getMetaQualifiedName(uncached.getMetaObject(view)); metaObjectToString = truncateString(INTEROP.asString(qualifiedName), CHARACTER_LIMIT); } valueToString = truncateString(INTEROP.asString(uncached.toDisplayString(view)), CHARACTER_LIMIT); } catch (UnsupportedMessageException e) { throw shouldNotReachHere(e); } String languageName = null; boolean hideType = false; if (displayLanguage.isHost()) { languageName = "Java"; // java is our host language for now // hide meta objects of null if (UNKNOWN.equals(metaObjectToString) && INTEROP.isNull(receiver)) { hideType = true; } 
} else { languageName = displayLanguage.getName(); } if (hideType) { return String.format("'%s'(language: %s)", valueToString, languageName); } else { return String.format("'%s'(language: %s, type: %s)", valueToString, languageName, metaObjectToString); } } private static String truncateString(String s, int i) { if (s.length() > i) { return s.substring(0, i - TRUNCATION_SUFFIX.length()) + TRUNCATION_SUFFIX; } else { return s; } } @TruffleBoundary protected static RuntimeException nullCoercion(Object languageContext, Object receiver, Class<?> targetType, String message, String useToCheck) { assert isEnteredOrNull(languageContext); String valueInfo = getValueInfo(languageContext, receiver); throw PolyglotEngineException.nullPointer(String.format("Cannot convert null value %s to Java type '%s' using Value.%s. " + "You can ensure that the operation is supported using Value.%s.", valueInfo, targetType, message, useToCheck)); } static boolean isEnteredOrNull(Object languageContext) { if (languageContext == null) { return true; } PolyglotContextImpl context = ((PolyglotLanguageContext) languageContext).context; return !context.engine.needsEnter(context); } @TruffleBoundary protected static RuntimeException cannotConvert(Object languageContext, Object receiver, Class<?> targetType, String message, String useToCheck, String reason) { assert isEnteredOrNull(languageContext); String valueInfo = getValueInfo(languageContext, receiver); String targetTypeString = ""; if (targetType != null) { targetTypeString = String.format("to Java type '%s'", targetType.getTypeName()); } throw PolyglotEngineException.classCast( String.format("Cannot convert %s %s using Value.%s: %s You can ensure that the value can be converted using Value.%s.", valueInfo, targetTypeString, message, reason, useToCheck)); } @TruffleBoundary protected static RuntimeException invalidArrayIndex(PolyglotLanguageContext context, Object receiver, long index) { String message = String.format("Invalid array index %s 
for array %s.", index, getValueInfo(context, receiver)); throw PolyglotEngineException.arrayIndexOutOfBounds(message); } @TruffleBoundary protected static RuntimeException invalidArrayValue(PolyglotLanguageContext context, Object receiver, long identifier, Object value) { throw PolyglotEngineException.classCast( String.format("Invalid array value %s for array %s and index %s.", getValueInfo(context, value), getValueInfo(context, receiver), identifier)); } @TruffleBoundary protected static RuntimeException nonReadableMemberKey(PolyglotLanguageContext context, Object receiver, String identifier) { String message = String.format("Non readable or non-existent member key '%s' for object %s.", identifier, getValueInfo(context, receiver)); throw PolyglotEngineException.unsupported(message); } @TruffleBoundary protected static RuntimeException nonWritableMemberKey(PolyglotLanguageContext context, Object receiver, String identifier) { String message = String.format("Non writable or non-existent member key '%s' for object %s.", identifier, getValueInfo(context, receiver)); throw PolyglotEngineException.unsupported(message); } @TruffleBoundary protected static RuntimeException nonRemovableMemberKey(PolyglotLanguageContext context, Object receiver, String identifier) { String message = String.format("Non removable or non-existent member key '%s' for object %s.", identifier, getValueInfo(context, receiver)); throw PolyglotEngineException.unsupported(message); } @TruffleBoundary protected static RuntimeException invalidMemberValue(PolyglotLanguageContext context, Object receiver, String identifier, Object value) { String message = String.format("Invalid member value %s for object %s and member key '%s'.", getValueInfo(context, value), getValueInfo(context, receiver), identifier); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException stopIteration(PolyglotLanguageContext context, Object receiver) { String message = 
String.format("Iteration was stopped for iterator %s.", getValueInfo(context, receiver)); throw PolyglotEngineException.noSuchElement(message); } @TruffleBoundary protected static RuntimeException nonReadableIteratorElement() { throw PolyglotEngineException.unsupported("Iterator element is not readable."); } @TruffleBoundary protected static RuntimeException invalidHashValue(PolyglotLanguageContext context, Object receiver, Object key, Object value) { String message = String.format("Invalid hash value %s for object %s and hash key %s.", getValueInfo(context, value), getValueInfo(context, receiver), getValueInfo(context, key)); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException invalidExecuteArgumentType(PolyglotLanguageContext context, Object receiver, UnsupportedTypeException e) { String originalMessage = e.getMessage() == null ? "" : e.getMessage() + " "; String[] formattedArgs = formatArgs(context, e.getSuppliedValues()); throw PolyglotEngineException.illegalArgument(String.format("Invalid argument when executing %s. %sProvided arguments: %s.", getValueInfo(context, receiver), originalMessage, Arrays.asList(formattedArgs))); } @TruffleBoundary protected static RuntimeException invalidInvokeArgumentType(PolyglotLanguageContext context, Object receiver, String member, UnsupportedTypeException e) { String originalMessage = e.getMessage() == null ? "" : e.getMessage(); String[] formattedArgs = formatArgs(context, e.getSuppliedValues()); String message = String.format("Invalid argument when invoking '%s' on %s. 
%sProvided arguments: %s.", member, getValueInfo(context, receiver), originalMessage, Arrays.asList(formattedArgs)); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException invalidInstantiateArgumentType(PolyglotLanguageContext context, Object receiver, Object[] arguments) { String[] formattedArgs = formatArgs(context, arguments); String message = String.format("Invalid argument when instantiating %s with arguments %s.", getValueInfo(context, receiver), Arrays.asList(formattedArgs)); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException invalidInstantiateArity(PolyglotLanguageContext context, Object receiver, Object[] arguments, int expectedMin, int expectedMax, int actual) { String[] formattedArgs = formatArgs(context, arguments); String message = String.format("Invalid argument count when instantiating %s with arguments %s. %s", getValueInfo(context, receiver), Arrays.asList(formattedArgs), formatExpectedArguments(expectedMin, expectedMax, actual)); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException invalidExecuteArity(PolyglotLanguageContext context, Object receiver, Object[] arguments, int expectedMin, int expectedMax, int actual) { String[] formattedArgs = formatArgs(context, arguments); String message = String.format("Invalid argument count when executing %s with arguments %s. 
%s", getValueInfo(context, receiver), Arrays.asList(formattedArgs), formatExpectedArguments(expectedMin, expectedMax, actual)); throw PolyglotEngineException.illegalArgument(message); } @TruffleBoundary protected static RuntimeException invalidInvokeArity(PolyglotLanguageContext context, Object receiver, String member, Object[] arguments, int expectedMin, int expectedMax, int actual) { String[] formattedArgs = formatArgs(context, arguments); String message = String.format("Invalid argument count when invoking '%s' on %s with arguments %s. %s", member, getValueInfo(context, receiver), Arrays.asList(formattedArgs), formatExpectedArguments(expectedMin, expectedMax, actual)); throw PolyglotEngineException.illegalArgument(message); } static String formatExpectedArguments(int expectedMinArity, int expectedMaxArity, int actualArity) { String actual; if (actualArity < 0) { actual = "unknown"; } else { actual = String.valueOf(actualArity); } String expected; if (expectedMinArity == expectedMaxArity) { expected = String.valueOf(expectedMinArity); } else { if (expectedMaxArity < 0) { expected = expectedMinArity + "+"; } else { expected = expectedMinArity + "-" + expectedMaxArity; } } return String.format("Expected %s argument(s) but got %s.", expected, actual); } private static String[] formatArgs(Object languageContext, Object[] arguments) { String[] formattedArgs = new String[arguments.length]; for (int i = 0; i < arguments.length; i++) { formattedArgs[i] = getValueInfo(languageContext, arguments[i]); } return formattedArgs; } @Override public final String toString(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = null; if (context != null) { PolyglotContextImpl.State localContextState = context.context.state; if (localContextState.isInvalidOrClosed()) { /* * Performance improvement for closed or invalid to avoid recurring exceptions. 
*/ return "Error in toString(): Context is invalid or closed."; } } try { prev = PolyglotValueDispatch.hostEnter(context); } catch (Throwable t) { // enter might fail if the context was closed. // Can no longer call interop. return String.format("Error in toString(): Could not enter context: %s.", t.getMessage()); } try { return toStringImpl(context, receiver); } catch (Throwable e) { throw PolyglotValueDispatch.guestToHostException(context, e, true); } finally { try { PolyglotValueDispatch.hostLeave(languageContext, prev); } catch (Throwable t) { // ignore errors on leave; we cannot propagate them. } } } String toStringImpl(PolyglotLanguageContext context, Object receiver) throws AssertionError { return PolyglotWrapper.toStringImpl(context, receiver); } @Override public Object getSourceLocation(Object languageContext, Object receiver) { if (languageContext == null) { return null; } PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { InteropLibrary lib = InteropLibrary.getFactory().getUncached(receiver); com.oracle.truffle.api.source.SourceSection result = null; if (lib.hasSourceLocation(receiver)) { try { result = lib.getSourceLocation(receiver); } catch (UnsupportedMessageException e) { } } if (result == null) { return null; } return PolyglotImpl.getPolyglotSourceSection(impl, result); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public boolean isMetaObject(Object languageContext, Object receiver) { return false; } @Override public boolean equalsImpl(Object languageContext, Object receiver, Object obj) { if (receiver == obj) { return true; } return PolyglotWrapper.equals(languageContext, receiver, obj); } @Override public int hashCodeImpl(Object languageContext, Object receiver) { return PolyglotWrapper.hashCode(languageContext, receiver); } @Override public boolean isMetaInstance(Object languageContext, Object receiver, Object 
instance) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw unsupported(context, receiver, "isMetaInstance(Object)", "isMetaObject()"); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public String getMetaQualifiedName(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw unsupported(context, receiver, "getMetaQualifiedName()", "isMetaObject()"); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public String getMetaSimpleName(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw unsupported(context, receiver, "getMetaSimpleName()", "isMetaObject()"); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public boolean hasMetaParents(Object languageContext, Object receiver) { return false; } @Override public Object getMetaParents(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { throw unsupported(context, receiver, "getMetaParents()", "hasMetaParents()"); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } static CallTarget createTarget(InteropNode root) { CallTarget target = root.getCallTarget(); Class<?>[] types = root.getArgumentTypes(); if (types != null) { RUNTIME.initializeProfile(target, types); } return target; } static PolyglotValueDispatch createInteropValue(PolyglotLanguageInstance languageInstance, TruffleObject receiver, Class<?> receiverType) { return new 
InteropValue(languageInstance.getImpl(), languageInstance, receiver, receiverType); } static PolyglotValueDispatch createHostNull(PolyglotImpl polyglot) { return new HostNull(polyglot); } static void createDefaultValues(PolyglotImpl polyglot, PolyglotLanguageInstance languageInstance, Map<Class<?>, PolyglotValueDispatch> valueCache) { addDefaultValue(polyglot, languageInstance, valueCache, false); addDefaultValue(polyglot, languageInstance, valueCache, ""); addDefaultValue(polyglot, languageInstance, valueCache, TruffleString.fromJavaStringUncached("", TruffleString.Encoding.UTF_16)); addDefaultValue(polyglot, languageInstance, valueCache, 'a'); addDefaultValue(polyglot, languageInstance, valueCache, (byte) 0); addDefaultValue(polyglot, languageInstance, valueCache, (short) 0); addDefaultValue(polyglot, languageInstance, valueCache, 0); addDefaultValue(polyglot, languageInstance, valueCache, 0L); addDefaultValue(polyglot, languageInstance, valueCache, 0F); addDefaultValue(polyglot, languageInstance, valueCache, 0D); } static void addDefaultValue(PolyglotImpl polyglot, PolyglotLanguageInstance languageInstance, Map<Class<?>, PolyglotValueDispatch> valueCache, Object primitive) { valueCache.put(primitive.getClass(), new PrimitiveValue(polyglot, languageInstance, primitive)); } static final class PrimitiveValue extends PolyglotValueDispatch { private final InteropLibrary interop; private final PolyglotLanguage language; private PrimitiveValue(PolyglotImpl impl, PolyglotLanguageInstance instance, Object primitiveValue) { super(impl, instance); /* * No caching needed for primitives. We do that to avoid the overhead of crossing a * Truffle call boundary. */ this.interop = InteropLibrary.getFactory().getUncached(primitiveValue); this.language = instance != null ? 
instance.language : null; } @Override public boolean isString(Object languageContext, Object receiver) { return interop.isString(receiver); } @Override public boolean isBoolean(Object languageContext, Object receiver) { return interop.isBoolean(receiver); } @Override public boolean asBoolean(Object languageContext, Object receiver) { try { return interop.asBoolean(receiver); } catch (UnsupportedMessageException e) { return super.asBoolean(languageContext, receiver); } } @Override public String asString(Object languageContext, Object receiver) { try { return interop.asString(receiver); } catch (UnsupportedMessageException e) { return super.asString(languageContext, receiver); } } @Override public boolean isNumber(Object languageContext, Object receiver) { return interop.isNumber(receiver); } @Override public boolean fitsInByte(Object languageContext, Object receiver) { return interop.fitsInByte(receiver); } @Override public boolean fitsInShort(Object languageContext, Object receiver) { return interop.fitsInShort(receiver); } @Override public boolean fitsInInt(Object languageContext, Object receiver) { return interop.fitsInInt(receiver); } @Override public boolean fitsInLong(Object languageContext, Object receiver) { return interop.fitsInLong(receiver); } @Override public boolean fitsInBigInteger(Object languageContext, Object receiver) { return interop.fitsInBigInteger(receiver); } @Override public boolean fitsInFloat(Object languageContext, Object receiver) { return interop.fitsInFloat(receiver); } @Override public boolean fitsInDouble(Object languageContext, Object receiver) { return interop.fitsInDouble(receiver); } @Override public byte asByte(Object languageContext, Object receiver) { try { return interop.asByte(receiver); } catch (UnsupportedMessageException e) { return super.asByte(languageContext, receiver); } } @Override public short asShort(Object languageContext, Object receiver) { try { return interop.asShort(receiver); } catch 
(UnsupportedMessageException e) { return super.asShort(languageContext, receiver); } } @Override public int asInt(Object languageContext, Object receiver) { try { return interop.asInt(receiver); } catch (UnsupportedMessageException e) { return super.asInt(languageContext, receiver); } } @Override public long asLong(Object languageContext, Object receiver) { try { return interop.asLong(receiver); } catch (UnsupportedMessageException e) { return super.asLong(languageContext, receiver); } } @Override public BigInteger asBigInteger(Object languageContext, Object receiver) { try { return interop.asBigInteger(receiver); } catch (UnsupportedMessageException e) { return super.asBigInteger(languageContext, receiver); } } @Override public float asFloat(Object languageContext, Object receiver) { try { return interop.asFloat(receiver); } catch (UnsupportedMessageException e) { return super.asFloat(languageContext, receiver); } } @Override public double asDouble(Object languageContext, Object receiver) { try { return interop.asDouble(receiver); } catch (UnsupportedMessageException e) { return super.asDouble(languageContext, receiver); } } @SuppressWarnings("unchecked") @Override public <T> T asClass(Object languageContext, Object receiver, Class<T> targetType) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { if (context != null) { return language.engine.host.toHostType(null, null, context.context.getHostContextImpl(), receiver, targetType, targetType); } else { // disconnected primitive value T result = (T) EngineAccessor.HOST.convertPrimitiveLossy(receiver, targetType); if (result == null) { throw PolyglotInteropErrors.cannotConvertPrimitive(null, receiver, targetType); } return result; } } catch (Throwable e) { throw guestToHostException((context), e, true); } finally { hostLeave(context, prev); } } @SuppressWarnings("unchecked") @Override public <T> T asTypeLiteral(Object languageContext, Object receiver, 
Class<T> rawType, Type type) { return asClass(languageContext, receiver, rawType); } @Override public Object getMetaObjectImpl(PolyglotLanguageContext languageContext, Object receiver) { return super.getMetaObjectImpl(languageContext, getLanguageView(languageContext, receiver)); } @Override String toStringImpl(PolyglotLanguageContext context, Object receiver) throws AssertionError { return super.toStringImpl(context, getLanguageView(context, receiver)); } private Object getLanguageView(Object languageContext, Object receiver) { if (languageContext == null || language == null) { return receiver; } PolyglotContextImpl c = ((PolyglotLanguageContext) languageContext).context; return c.getContext(language).getLanguageViewNoCheck(receiver); } } private static final class HostNull extends PolyglotValueDispatch { private final PolyglotImpl polyglot; HostNull(PolyglotImpl polyglot) { super(polyglot, null); this.polyglot = polyglot; } @Override public boolean isNull(Object languageContext, Object receiver) { return true; } @SuppressWarnings("unchecked") @Override public <T> T asClass(Object languageContext, Object receiver, Class<T> targetType) { if (targetType == polyglot.getAPIAccess().getValueClass()) { return (T) polyglot.hostNull; } return null; } @SuppressWarnings("cast") @Override public <T> T asTypeLiteral(Object languageContext, Object receiver, Class<T> rawType, Type type) { return asClass(languageContext, receiver, rawType); } } abstract static class InteropNode extends HostToGuestRootNode { protected static final int CACHE_LIMIT = 5; protected final InteropValue polyglot; protected abstract String getOperationName(); protected InteropNode(InteropValue polyglot) { super(polyglot.languageInstance); this.polyglot = polyglot; } protected abstract Class<?>[] getArgumentTypes(); @Override protected Class<? 
extends Object> getReceiverType() { return polyglot.receiverType; } @Override public final String getName() { return "org.graalvm.polyglot.Value<" + polyglot.receiverType.getSimpleName() + ">." + getOperationName(); } protected final AbstractPolyglotImpl getImpl() { return polyglot.impl; } @Override public final String toString() { return getName(); } } /** * Host value implementation used when a Value needs to be created but no context is available. * If a context is available, the normal interop value implementation is used. */ static class HostValue extends PolyglotValueDispatch { HostValue(PolyglotImpl polyglot) { super(polyglot, null); } @Override public boolean isHostObject(Object languageContext, Object receiver) { return EngineAccessor.HOST.isDisconnectedHostObject(receiver); } @Override public Object asHostObject(Object languageContext, Object receiver) { return EngineAccessor.HOST.unboxDisconnectedHostObject(receiver); } @Override public boolean isProxyObject(Object languageContext, Object receiver) { return EngineAccessor.HOST.isDisconnectedHostProxy(receiver); } @Override public Object asProxyObject(Object languageContext, Object receiver) { return EngineAccessor.HOST.unboxDisconnectedHostProxy(receiver); } @Override public <T> T asClass(Object languageContext, Object receiver, Class<T> targetType) { return asImpl(languageContext, receiver, targetType); } @SuppressWarnings("cast") @Override public <T> T asTypeLiteral(Object languageContext, Object receiver, Class<T> rawType, Type type) { return asImpl(languageContext, receiver, (Class<T>) rawType); } <T> T asImpl(Object languageContext, Object receiver, Class<T> targetType) { Object hostValue; if (isProxyObject(languageContext, receiver)) { hostValue = asProxyObject(languageContext, receiver); } else if (isHostObject(languageContext, receiver)) { hostValue = asHostObject(languageContext, receiver); } else { throw new ClassCastException(); } return targetType.cast(hostValue); } } /** * Must be kept in 
sync with the HostObject and the HostToTypeNode implementation. */ static final class BigIntegerHostValue extends HostValue { BigIntegerHostValue(PolyglotImpl polyglot) { super(polyglot); } @Override public boolean isNumber(Object context, Object receiver) { assert asHostObject(context, receiver) instanceof BigInteger; return true; } @Override public boolean fitsInByte(Object context, Object receiver) { assert asHostObject(context, receiver) instanceof BigInteger; return ((BigInteger) asHostObject(context, receiver)).bitLength() < Byte.SIZE; } @Override public boolean fitsInShort(Object context, Object receiver) { assert asHostObject(context, receiver) instanceof BigInteger; return ((BigInteger) asHostObject(context, receiver)).bitLength() < Short.SIZE; } @Override public boolean fitsInInt(Object context, Object receiver) { assert asHostObject(context, receiver) instanceof BigInteger; return ((BigInteger) asHostObject(context, receiver)).bitLength() < Integer.SIZE; } @Override public boolean fitsInLong(Object context, Object receiver) { assert asHostObject(context, receiver) instanceof BigInteger; return ((BigInteger) asHostObject(context, receiver)).bitLength() < Long.SIZE; } @Override public boolean fitsInBigInteger(Object context, Object receiver) { assert asHostObject(context, receiver) instanceof BigInteger; return true; } @Override public boolean fitsInFloat(Object context, Object receiver) { assert asHostObject(context, receiver) instanceof BigInteger; return EngineAccessor.HOST.bigIntegerFitsInFloat((BigInteger) asHostObject(context, receiver)); } @Override public boolean fitsInDouble(Object context, Object receiver) { assert asHostObject(context, receiver) instanceof BigInteger; return EngineAccessor.HOST.bigIntegerFitsInDouble((BigInteger) asHostObject(context, receiver)); } @Override public byte asByte(Object languageContext, Object receiver) { assert asHostObject(languageContext, receiver) instanceof BigInteger; try { return ((BigInteger) 
asHostObject(languageContext, receiver)).byteValueExact(); } catch (ArithmeticException e) { // throws an unsupported error. return super.asByte(languageContext, receiver); } } @Override public short asShort(Object languageContext, Object receiver) { assert asHostObject(languageContext, receiver) instanceof BigInteger; try { return ((BigInteger) asHostObject(languageContext, receiver)).shortValueExact(); } catch (ArithmeticException e) { // throws an unsupported error. return super.asShort(languageContext, receiver); } } @Override public int asInt(Object languageContext, Object receiver) { assert asHostObject(languageContext, receiver) instanceof BigInteger; try { return ((BigInteger) asHostObject(languageContext, receiver)).intValueExact(); } catch (ArithmeticException e) { // throws an unsupported error. return super.asInt(languageContext, receiver); } } @Override public long asLong(Object languageContext, Object receiver) { assert asHostObject(languageContext, receiver) instanceof BigInteger; try { return ((BigInteger) asHostObject(languageContext, receiver)).longValueExact(); } catch (ArithmeticException e) { // throws an unsupported error. return super.asLong(languageContext, receiver); } } @Override public BigInteger asBigInteger(Object languageContext, Object receiver) { assert asHostObject(languageContext, receiver) instanceof BigInteger; return ((BigInteger) asHostObject(languageContext, receiver)); } @Override public float asFloat(Object languageContext, Object receiver) { assert asHostObject(languageContext, receiver) instanceof BigInteger; if (fitsInFloat(languageContext, receiver)) { return ((BigInteger) asHostObject(languageContext, receiver)).floatValue(); } else { // throws an unsupported error. 
return super.asFloat(languageContext, receiver); } } @Override public double asDouble(Object languageContext, Object receiver) { assert asHostObject(languageContext, receiver) instanceof BigInteger; if (fitsInDouble(languageContext, receiver)) { return ((BigInteger) asHostObject(languageContext, receiver)).doubleValue(); } else { // throws an unsupported error. return super.asDouble(languageContext, receiver); } } @SuppressWarnings("unchecked") @Override <T> T asImpl(Object languageContext, Object receiver, Class<T> targetType) { assert asHostObject(languageContext, receiver) instanceof BigInteger; if (targetType == byte.class || targetType == Byte.class) { return (T) (Byte) asByte(languageContext, receiver); } else if (targetType == short.class || targetType == Short.class) { return (T) (Short) asShort(languageContext, receiver); } else if (targetType == int.class || targetType == Integer.class) { return (T) (Integer) asInt(languageContext, receiver); } else if (targetType == long.class || targetType == Long.class) { return (T) (Long) asLong(languageContext, receiver); } else if (targetType == float.class || targetType == Float.class) { return (T) (Float) asFloat(languageContext, receiver); } else if (targetType == double.class || targetType == Double.class) { return (T) (Double) asDouble(languageContext, receiver); } else if (targetType == BigInteger.class || targetType == Number.class) { return (T) asBigInteger(languageContext, receiver); } else if (targetType == char.class || targetType == Character.class) { if (fitsInInt(languageContext, receiver)) { int v = asInt(languageContext, receiver); if (v >= 0 && v < 65536) { return (T) (Character) (char) v; } } } return super.asImpl(languageContext, receiver, targetType); } } @SuppressWarnings("unused") static final class InteropValue extends PolyglotValueDispatch { final CallTarget isNativePointer; final CallTarget asNativePointer; final CallTarget hasArrayElements; final CallTarget getArrayElement; final CallTarget
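The BigInteger conversions in BigIntegerHostValue above follow one rule: a value fits a primitive integer type when `bitLength()` is strictly below the type's bit width (the remaining bit is the sign bit), and the actual conversion uses the `...ValueExact` methods, mapping `ArithmeticException` to an unsupported-operation error. A minimal stand-alone sketch of that rule (the class name `BigIntegerFits` is hypothetical, not part of this file):

```java
import java.math.BigInteger;

// Hypothetical sketch (not part of the dispatch class) of the fits-in checks
// used by BigIntegerHostValue: a BigInteger fits a primitive integer type
// when its bitLength() is strictly less than the type's bit size.
public class BigIntegerFits {

    static boolean fitsInByte(BigInteger v) {
        return v.bitLength() < Byte.SIZE;   // bitLength() excludes the sign bit
    }

    static boolean fitsInLong(BigInteger v) {
        return v.bitLength() < Long.SIZE;
    }

    // Conversion mirrors the dispatch code: try the exact conversion and
    // treat ArithmeticException as "does not fit".
    static byte asByteExact(BigInteger v) {
        return v.byteValueExact();          // throws ArithmeticException on overflow
    }

    public static void main(String[] args) {
        System.out.println(fitsInByte(BigInteger.valueOf(127)));  // true:  bitLength 7 < 8
        System.out.println(fitsInByte(BigInteger.valueOf(128)));  // false: bitLength 8
        System.out.println(fitsInByte(BigInteger.valueOf(-128))); // true:  bitLength 7
    }
}
```

Note the negative boundary: `bitLength()` is the minimal two's-complement width excluding the sign bit, so -128 has bitLength 7 and correctly passes the byte check, matching `byteValueExact()`.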
setArrayElement; final CallTarget removeArrayElement; final CallTarget getArraySize; final CallTarget hasBufferElements; final CallTarget isBufferWritable; final CallTarget getBufferSize; final CallTarget readBufferByte; final CallTarget readBuffer; final CallTarget writeBufferByte; final CallTarget readBufferShort; final CallTarget writeBufferShort; final CallTarget readBufferInt; final CallTarget writeBufferInt; final CallTarget readBufferLong; final CallTarget writeBufferLong; final CallTarget readBufferFloat; final CallTarget writeBufferFloat; final CallTarget readBufferDouble; final CallTarget writeBufferDouble; final CallTarget hasMembers; final CallTarget hasMember; final CallTarget getMember; final CallTarget putMember; final CallTarget removeMember; final CallTarget isNull; final CallTarget canExecute; final CallTarget execute; final CallTarget canInstantiate; final CallTarget newInstance; final CallTarget executeNoArgs; final CallTarget executeVoid; final CallTarget executeVoidNoArgs; final CallTarget canInvoke; final CallTarget invoke; final CallTarget invokeNoArgs; final CallTarget getMemberKeys; final CallTarget isDate; final CallTarget asDate; final CallTarget isTime; final CallTarget asTime; final CallTarget isTimeZone; final CallTarget asTimeZone; final CallTarget asInstant; final CallTarget isDuration; final CallTarget asDuration; final CallTarget isException; final CallTarget throwException; final CallTarget isMetaObject; final CallTarget isMetaInstance; final CallTarget getMetaQualifiedName; final CallTarget getMetaSimpleName; final CallTarget hasMetaParents; final CallTarget getMetaParents; final CallTarget hasIterator; final CallTarget getIterator; final CallTarget isIterator; final CallTarget hasIteratorNextElement; final CallTarget getIteratorNextElement; final CallTarget hasHashEntries; final CallTarget getHashSize; final CallTarget hasHashEntry; final CallTarget getHashValue; final CallTarget getHashValueOrDefault; final CallTarget 
putHashEntry; final CallTarget removeHashEntry; final CallTarget getHashEntriesIterator; final CallTarget getHashKeysIterator; final CallTarget getHashValuesIterator; final CallTarget asClassLiteral; final CallTarget asTypeLiteral; final Class<?> receiverType; InteropValue(PolyglotImpl polyglot, PolyglotLanguageInstance languageInstance, Object receiverObject, Class<?> receiverType) { super(polyglot, languageInstance); this.receiverType = receiverType; this.asClassLiteral = createTarget(AsClassLiteralNodeGen.create(this)); this.asTypeLiteral = createTarget(AsTypeLiteralNodeGen.create(this)); this.isNativePointer = createTarget(IsNativePointerNodeGen.create(this)); this.asNativePointer = createTarget(AsNativePointerNodeGen.create(this)); this.hasArrayElements = createTarget(HasArrayElementsNodeGen.create(this)); this.getArrayElement = createTarget(GetArrayElementNodeGen.create(this)); this.setArrayElement = createTarget(SetArrayElementNodeGen.create(this)); this.removeArrayElement = createTarget(RemoveArrayElementNodeGen.create(this)); this.getArraySize = createTarget(GetArraySizeNodeGen.create(this)); this.hasBufferElements = createTarget(HasBufferElementsNodeGen.create(this)); this.isBufferWritable = createTarget(IsBufferWritableNodeGen.create(this)); this.getBufferSize = createTarget(GetBufferSizeNodeGen.create(this)); this.readBufferByte = createTarget(ReadBufferByteNodeGen.create(this)); this.readBuffer = createTarget(ReadBufferNodeGen.create(this)); this.writeBufferByte = createTarget(WriteBufferByteNodeGen.create(this)); this.readBufferShort = createTarget(ReadBufferShortNodeGen.create(this)); this.writeBufferShort = createTarget(WriteBufferShortNodeGen.create(this)); this.readBufferInt = createTarget(ReadBufferIntNodeGen.create(this)); this.writeBufferInt = createTarget(WriteBufferIntNodeGen.create(this)); this.readBufferLong = createTarget(ReadBufferLongNodeGen.create(this)); this.writeBufferLong = createTarget(WriteBufferLongNodeGen.create(this)); 
this.readBufferFloat = createTarget(ReadBufferFloatNodeGen.create(this)); this.writeBufferFloat = createTarget(WriteBufferFloatNodeGen.create(this)); this.readBufferDouble = createTarget(ReadBufferDoubleNodeGen.create(this)); this.writeBufferDouble = createTarget(WriteBufferDoubleNodeGen.create(this)); this.hasMember = createTarget(HasMemberNodeGen.create(this)); this.getMember = createTarget(GetMemberNodeGen.create(this)); this.putMember = createTarget(PutMemberNodeGen.create(this)); this.removeMember = createTarget(RemoveMemberNodeGen.create(this)); this.isNull = createTarget(IsNullNodeGen.create(this)); this.execute = createTarget(ExecuteNodeGen.create(this)); this.executeNoArgs = createTarget(ExecuteNoArgsNodeGen.create(this)); this.executeVoid = createTarget(ExecuteVoidNodeGen.create(this)); this.executeVoidNoArgs = createTarget(ExecuteVoidNoArgsNodeGen.create(this)); this.newInstance = createTarget(NewInstanceNodeGen.create(this)); this.canInstantiate = createTarget(CanInstantiateNodeGen.create(this)); this.canExecute = createTarget(CanExecuteNodeGen.create(this)); this.canInvoke = createTarget(CanInvokeNodeGen.create(this)); this.invoke = createTarget(InvokeNodeGen.create(this)); this.invokeNoArgs = createTarget(InvokeNoArgsNodeGen.create(this)); this.hasMembers = createTarget(HasMembersNodeGen.create(this)); this.getMemberKeys = createTarget(GetMemberKeysNodeGen.create(this)); this.isDate = createTarget(IsDateNodeGen.create(this)); this.asDate = createTarget(AsDateNodeGen.create(this)); this.isTime = createTarget(IsTimeNodeGen.create(this)); this.asTime = createTarget(AsTimeNodeGen.create(this)); this.isTimeZone = createTarget(IsTimeZoneNodeGen.create(this)); this.asTimeZone = createTarget(AsTimeZoneNodeGen.create(this)); this.asInstant = createTarget(AsInstantNodeGen.create(this)); this.isDuration = createTarget(IsDurationNodeGen.create(this)); this.asDuration = createTarget(AsDurationNodeGen.create(this)); this.isException = 
createTarget(IsExceptionNodeGen.create(this)); this.throwException = createTarget(ThrowExceptionNodeGen.create(this)); this.isMetaObject = createTarget(IsMetaObjectNodeGen.create(this)); this.isMetaInstance = createTarget(IsMetaInstanceNodeGen.create(this)); this.getMetaQualifiedName = createTarget(GetMetaQualifiedNameNodeGen.create(this)); this.getMetaSimpleName = createTarget(GetMetaSimpleNameNodeGen.create(this)); this.hasMetaParents = createTarget(PolyglotValueDispatchFactory.InteropValueFactory.HasMetaParentsNodeGen.create(this)); this.getMetaParents = createTarget(PolyglotValueDispatchFactory.InteropValueFactory.GetMetaParentsNodeGen.create(this)); this.hasIterator = createTarget(HasIteratorNodeGen.create(this)); this.getIterator = createTarget(PolyglotValueDispatchFactory.InteropValueFactory.GetIteratorNodeGen.create(this)); this.isIterator = createTarget(PolyglotValueDispatchFactory.InteropValueFactory.IsIteratorNodeGen.create(this)); this.hasIteratorNextElement = createTarget(HasIteratorNextElementNodeGen.create(this)); this.getIteratorNextElement = createTarget(GetIteratorNextElementNodeGen.create(this)); this.hasHashEntries = createTarget(HasHashEntriesNodeGen.create(this)); this.getHashSize = createTarget(GetHashSizeNodeGen.create(this)); this.hasHashEntry = createTarget(HasHashEntryNodeGen.create(this)); this.getHashValue = createTarget(GetHashValueNodeGen.create(this)); this.getHashValueOrDefault = createTarget(GetHashValueOrDefaultNodeGen.create(this)); this.putHashEntry = createTarget(PutHashEntryNodeGen.create(this)); this.removeHashEntry = createTarget(RemoveHashEntryNodeGen.create(this)); this.getHashEntriesIterator = createTarget(GetHashEntriesIteratorNodeGen.create(this)); this.getHashKeysIterator = createTarget(GetHashKeysIteratorNodeGen.create(this)); this.getHashValuesIterator = createTarget(GetHashValuesIteratorNodeGen.create(this)); } @SuppressWarnings("unchecked") @Override public <T> T asClass(Object languageContext, Object receiver, 
Class<T> targetType) { return (T) RUNTIME.callProfiled(this.asClassLiteral, languageContext, receiver, targetType); } @SuppressWarnings("unchecked") @Override public <T> T asTypeLiteral(Object languageContext, Object receiver, Class<T> rawType, Type type) { return (T) RUNTIME.callProfiled(this.asTypeLiteral, languageContext, receiver, rawType, type); } @Override public boolean isNativePointer(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isNativePointer, languageContext, receiver); } @Override public boolean hasArrayElements(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.hasArrayElements, languageContext, receiver); } @Override public Object getArrayElement(Object languageContext, Object receiver, long index) { return RUNTIME.callProfiled(this.getArrayElement, languageContext, receiver, index); } @Override public void setArrayElement(Object languageContext, Object receiver, long index, Object value) { RUNTIME.callProfiled(this.setArrayElement, languageContext, receiver, index, value); } @Override public boolean removeArrayElement(Object languageContext, Object receiver, long index) { return (boolean) RUNTIME.callProfiled(this.removeArrayElement, languageContext, receiver, index); } @Override public long getArraySize(Object languageContext, Object receiver) { return (long) RUNTIME.callProfiled(this.getArraySize, languageContext, receiver); } // region Buffer Methods @Override public boolean hasBufferElements(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.hasBufferElements, languageContext, receiver); } @Override public boolean isBufferWritable(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isBufferWritable, languageContext, receiver); } @Override public long getBufferSize(Object languageContext, Object receiver) throws UnsupportedOperationException { return (long) RUNTIME.callProfiled(this.getBufferSize, 
languageContext, receiver); } @Override public byte readBufferByte(Object languageContext, Object receiver, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (byte) RUNTIME.callProfiled(this.readBufferByte, languageContext, receiver, byteOffset); } @Override public void readBuffer(Object languageContext, Object receiver, long byteOffset, byte[] destination, int destinationOffset, int length) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.readBuffer, languageContext, receiver, byteOffset, destination, destinationOffset, length); } @Override public void writeBufferByte(Object languageContext, Object receiver, long byteOffset, byte value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferByte, languageContext, receiver, byteOffset, value); } @Override public short readBufferShort(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (short) RUNTIME.callProfiled(this.readBufferShort, languageContext, receiver, order, byteOffset); } @Override public void writeBufferShort(Object languageContext, Object receiver, ByteOrder order, long byteOffset, short value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferShort, languageContext, receiver, order, byteOffset, value); } @Override public int readBufferInt(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (int) RUNTIME.callProfiled(this.readBufferInt, languageContext, receiver, order, byteOffset); } @Override public void writeBufferInt(Object languageContext, Object receiver, ByteOrder order, long byteOffset, int value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferInt, languageContext, 
receiver, order, byteOffset, value); } @Override public long readBufferLong(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (long) RUNTIME.callProfiled(this.readBufferLong, languageContext, receiver, order, byteOffset); } @Override public void writeBufferLong(Object languageContext, Object receiver, ByteOrder order, long byteOffset, long value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferLong, languageContext, receiver, order, byteOffset, value); } @Override public float readBufferFloat(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (float) RUNTIME.callProfiled(this.readBufferFloat, languageContext, receiver, order, byteOffset); } @Override public void writeBufferFloat(Object languageContext, Object receiver, ByteOrder order, long byteOffset, float value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferFloat, languageContext, receiver, order, byteOffset, value); } @Override public double readBufferDouble(Object languageContext, Object receiver, ByteOrder order, long byteOffset) throws UnsupportedOperationException, IndexOutOfBoundsException { return (double) RUNTIME.callProfiled(this.readBufferDouble, languageContext, receiver, order, byteOffset); } @Override public void writeBufferDouble(Object languageContext, Object receiver, ByteOrder order, long byteOffset, double value) throws UnsupportedOperationException, IndexOutOfBoundsException { RUNTIME.callProfiled(this.writeBufferDouble, languageContext, receiver, order, byteOffset, value); } // endregion @Override public boolean hasMembers(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.hasMembers, languageContext, receiver); } @Override public Object getMember(Object 
languageContext, Object receiver, String key) { return RUNTIME.callProfiled(this.getMember, languageContext, receiver, key); } @Override public boolean hasMember(Object languageContext, Object receiver, String key) { return (boolean) RUNTIME.callProfiled(this.hasMember, languageContext, receiver, key); } @Override public void putMember(Object languageContext, Object receiver, String key, Object member) { RUNTIME.callProfiled(this.putMember, languageContext, receiver, key, member); } @Override public boolean removeMember(Object languageContext, Object receiver, String key) { return (boolean) RUNTIME.callProfiled(this.removeMember, languageContext, receiver, key); } @Override public Set<String> getMemberKeys(Object languageContext, Object receiver) { Object keys = RUNTIME.callProfiled(this.getMemberKeys, languageContext, receiver); if (keys == null) { // unsupported return Collections.emptySet(); } return new MemberSet(this.getEngine().getAPIAccess(), languageContext, receiver, keys); } @Override public long asNativePointer(Object languageContext, Object receiver) { return (long) RUNTIME.callProfiled(this.asNativePointer, languageContext, receiver); } @Override public boolean isDate(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isDate, languageContext, receiver); } @Override public LocalDate asDate(Object languageContext, Object receiver) { return (LocalDate) RUNTIME.callProfiled(this.asDate, languageContext, receiver); } @Override public boolean isTime(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isTime, languageContext, receiver); } @Override public LocalTime asTime(Object languageContext, Object receiver) { return (LocalTime) RUNTIME.callProfiled(this.asTime, languageContext, receiver); } @Override public boolean isTimeZone(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isTimeZone, languageContext, receiver); } @Override public ZoneId 
asTimeZone(Object languageContext, Object receiver) { return (ZoneId) RUNTIME.callProfiled(this.asTimeZone, languageContext, receiver); } @Override public Instant asInstant(Object languageContext, Object receiver) { return (Instant) RUNTIME.callProfiled(this.asInstant, languageContext, receiver); } @Override public boolean isDuration(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isDuration, languageContext, receiver); } @Override public Duration asDuration(Object languageContext, Object receiver) { return (Duration) RUNTIME.callProfiled(this.asDuration, languageContext, receiver); } @Override public boolean isHostObject(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return getEngine().host.isHostObject(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } private PolyglotEngineImpl getEngine() { return languageInstance.sharing.engine; } @Override public boolean isProxyObject(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object prev = hostEnter(context); try { return getEngine().host.isHostProxy(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, prev); } } @Override public Object asProxyObject(Object languageContext, Object receiver) { if (isProxyObject(languageContext, receiver)) { return getEngine().host.unboxProxyObject(receiver); } else { return super.asProxyObject(languageContext, receiver); } } @Override public Object asHostObject(Object languageContext, Object receiver) { if (isHostObject(languageContext, receiver)) { return getEngine().host.unboxHostObject(receiver); } else { return super.asHostObject(languageContext, receiver); } } @Override public boolean isNull(Object languageContext, Object receiver) { 
return (boolean) RUNTIME.callProfiled(this.isNull, languageContext, receiver); } @Override public boolean canExecute(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.canExecute, languageContext, receiver); } @Override public void executeVoid(Object languageContext, Object receiver, Object[] arguments) { RUNTIME.callProfiled(this.executeVoid, languageContext, receiver, arguments); } @Override public void executeVoid(Object languageContext, Object receiver) { RUNTIME.callProfiled(this.executeVoidNoArgs, languageContext, receiver); } @Override public Object execute(Object languageContext, Object receiver, Object[] arguments) { return RUNTIME.callProfiled(this.execute, languageContext, receiver, arguments); } @Override public Object execute(Object languageContext, Object receiver) { return RUNTIME.callProfiled(this.executeNoArgs, languageContext, receiver); } @Override public boolean canInstantiate(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.canInstantiate, languageContext, receiver); } @Override public Object newInstance(Object languageContext, Object receiver, Object[] arguments) { return RUNTIME.callProfiled(this.newInstance, languageContext, receiver, arguments); } @Override public boolean canInvoke(Object languageContext, String identifier, Object receiver) { return (boolean) RUNTIME.callProfiled(this.canInvoke, languageContext, receiver, identifier); } @Override public Object invoke(Object languageContext, Object receiver, String identifier, Object[] arguments) { return RUNTIME.callProfiled(this.invoke, languageContext, receiver, identifier, arguments); } @Override public Object invoke(Object languageContext, Object receiver, String identifier) { return RUNTIME.callProfiled(this.invokeNoArgs, languageContext, receiver, identifier); } @Override public boolean isException(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isException, languageContext, 
receiver); } @Override public RuntimeException throwException(Object languageContext, Object receiver) { RUNTIME.callProfiled(this.throwException, languageContext, receiver); throw super.throwException(languageContext, receiver); } @Override public boolean isNumber(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { return UNCACHED_INTEROP.isNumber(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean fitsInByte(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { return UNCACHED_INTEROP.fitsInByte(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public byte asByte(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { try { return UNCACHED_INTEROP.asByte(receiver); } catch (UnsupportedMessageException e) { return asByteUnsupported(context, receiver); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean isString(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { return UNCACHED_INTEROP.isString(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public String asString(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { try { if (isNullUncached(receiver)) { return null; } return UNCACHED_INTEROP.asString(receiver); } catch 
(UnsupportedMessageException e) { return asStringUnsupported(context, receiver); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean fitsInInt(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { return UNCACHED_INTEROP.fitsInInt(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public int asInt(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { try { return UNCACHED_INTEROP.asInt(receiver); } catch (UnsupportedMessageException e) { return asIntUnsupported(context, receiver); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean isBoolean(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { return InteropLibrary.getFactory().getUncached().isBoolean(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean asBoolean(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { try { return InteropLibrary.getFactory().getUncached().asBoolean(receiver); } catch (UnsupportedMessageException e) { return asBooleanUnsupported(context, receiver); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean fitsInFloat(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { 
return InteropLibrary.getFactory().getUncached().fitsInFloat(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public float asFloat(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { try { return UNCACHED_INTEROP.asFloat(receiver); } catch (UnsupportedMessageException e) { return asFloatUnsupported(context, receiver); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean fitsInDouble(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { return UNCACHED_INTEROP.fitsInDouble(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public double asDouble(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { try { return UNCACHED_INTEROP.asDouble(receiver); } catch (UnsupportedMessageException e) { return asDoubleUnsupported(context, receiver); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean fitsInLong(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { return UNCACHED_INTEROP.fitsInLong(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public long asLong(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { try { return UNCACHED_INTEROP.asLong(receiver); } catch 
(UnsupportedMessageException e) { return asLongUnsupported(context, receiver); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean fitsInBigInteger(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { return UNCACHED_INTEROP.fitsInBigInteger(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public BigInteger asBigInteger(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { try { return UNCACHED_INTEROP.asBigInteger(receiver); } catch (UnsupportedMessageException e) { return asBigIntegerUnsupported(context, receiver); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean fitsInShort(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { return UNCACHED_INTEROP.fitsInShort(receiver); } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public short asShort(Object languageContext, Object receiver) { PolyglotLanguageContext context = (PolyglotLanguageContext) languageContext; Object c = hostEnter(context); try { try { return UNCACHED_INTEROP.asShort(receiver); } catch (UnsupportedMessageException e) { return asShortUnsupported(context, receiver); } } catch (Throwable e) { throw guestToHostException(context, e, true); } finally { hostLeave(context, c); } } @Override public boolean isMetaObject(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isMetaObject, languageContext, receiver); } @Override public boolean 
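The uncached paths above (`isNumber`, `fitsInByte`, `asByte`, and the rest) all share one bracket: enter the host context, translate any guest `Throwable` via `guestToHostException`, and restore the previous context in a `finally` block so the exceptional path also leaves cleanly. A framework-free sketch of that pattern, assuming nothing from the real Truffle API (the `ContextBracket`, `enter`, and `leave` names here are illustrative):

```java
// Hypothetical sketch of the enter/translate/leave bracket used by the
// uncached interop paths: enter() saves the previous state, the catch block
// rethrows guest failures as a host-level exception, and finally always
// restores the previous state, on both the normal and the exceptional path.
final class ContextBracket {
    private Object current;               // simulated "entered context" state

    Object enter(Object context) {        // returns the previous state
        Object prev = current;
        current = context;
        return prev;
    }

    void leave(Object prev) {             // restores the saved state
        current = prev;
    }

    Object currentState() {
        return current;
    }

    <T> T call(Object context, java.util.concurrent.Callable<T> body) {
        Object prev = enter(context);
        try {
            return body.call();
        } catch (Exception e) {
            // stand-in for guestToHostException(context, e, true)
            throw new RuntimeException("guest error in context " + context, e);
        } finally {
            leave(prev);                  // runs whether body succeeded or threw
        }
    }
}
```

The design point is that `leave(prev)` restores the *previous* state rather than clearing it, so nested enters unwind correctly, exactly as the repeated `Object c = hostEnter(context); ... finally { hostLeave(context, c); }` blocks do above.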
isMetaInstance(Object languageContext, Object receiver, Object instance) { return (boolean) RUNTIME.callProfiled(this.isMetaInstance, languageContext, receiver, instance); } @Override public String getMetaQualifiedName(Object languageContext, Object receiver) { return (String) RUNTIME.callProfiled(this.getMetaQualifiedName, languageContext, receiver); } @Override public String getMetaSimpleName(Object languageContext, Object receiver) { return (String) RUNTIME.callProfiled(this.getMetaSimpleName, languageContext, receiver); } @Override public boolean hasMetaParents(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.hasMetaParents, languageContext, receiver); } @Override public Object getMetaParents(Object languageContext, Object receiver) { return RUNTIME.callProfiled(this.getMetaParents, languageContext, receiver); } @Override public boolean hasIterator(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.hasIterator, languageContext, receiver); } @Override public Object getIterator(Object languageContext, Object receiver) { return RUNTIME.callProfiled(this.getIterator, languageContext, receiver); } @Override public boolean isIterator(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.isIterator, languageContext, receiver); } @Override public boolean hasIteratorNextElement(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.hasIteratorNextElement, languageContext, receiver); } @Override public Object getIteratorNextElement(Object languageContext, Object receiver) { return RUNTIME.callProfiled(this.getIteratorNextElement, languageContext, receiver); } @Override public boolean hasHashEntries(Object languageContext, Object receiver) { return (boolean) RUNTIME.callProfiled(this.hasHashEntries, languageContext, receiver); } @Override public long getHashSize(Object languageContext, Object receiver) { return (long) 
RUNTIME.callProfiled(this.getHashSize, languageContext, receiver); } @Override public boolean hasHashEntry(Object languageContext, Object receiver, Object key) { return (boolean) RUNTIME.callProfiled(this.hasHashEntry, languageContext, receiver, key); } @Override public Object getHashValue(Object languageContext, Object receiver, Object key) { return RUNTIME.callProfiled(this.getHashValue, languageContext, receiver, key); } @Override public Object getHashValueOrDefault(Object languageContext, Object receiver, Object key, Object defaultValue) { return RUNTIME.callProfiled(this.getHashValueOrDefault, languageContext, receiver, key, defaultValue); } @Override public void putHashEntry(Object languageContext, Object receiver, Object key, Object value) { RUNTIME.callProfiled(this.putHashEntry, languageContext, receiver, key, value); } @Override public boolean removeHashEntry(Object languageContext, Object receiver, Object key) { return (boolean) RUNTIME.callProfiled(this.removeHashEntry, languageContext, receiver, key); } @Override public Object getHashEntriesIterator(Object languageContext, Object receiver) { return RUNTIME.callProfiled(this.getHashEntriesIterator, languageContext, receiver); } @Override public Object getHashKeysIterator(Object languageContext, Object receiver) { return RUNTIME.callProfiled(this.getHashKeysIterator, languageContext, receiver); } @Override public Object getHashValuesIterator(Object languageContext, Object receiver) { return RUNTIME.callProfiled(this.getHashValuesIterator, languageContext, receiver); } private final class MemberSet extends AbstractSet<String> { private final APIAccess api; private final Object context; private final Object receiver; private final Object keys; private int cachedSize = -1; MemberSet(APIAccess api, Object languageContext, Object receiver, Object keys) { this.api = api; this.context = languageContext; this.receiver = receiver; this.keys = keys; } @Override public boolean contains(Object o) { if (!(o 
instanceof String)) { return false; } return hasMember(this.context, receiver, (String) o); } @Override public Iterator<String> iterator() { return new Iterator<String>() { int index = 0; public boolean hasNext() { return index < size(); } public String next() { if (index >= size()) { throw new NoSuchElementException(); } Object arrayElement = api.callValueGetArrayElement(keys, index++); if (api.callValueIsString(arrayElement)) { return api.callValueAsString(arrayElement); } else { return null; } } }; } @Override public int size() { int size = this.cachedSize; if (size != -1) { return size; } cachedSize = size = (int) api.callValueGetArraySize(keys); return size; } } abstract static class IsDateNode extends InteropNode { protected IsDateNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "isDate"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary objects) { return objects.isDate(receiver); } } abstract static class AsDateNode extends InteropNode { protected AsDateNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "asDate"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @Cached InlinedBranchProfile unsupported) { try { return objects.asDate(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (objects.isNull(receiver)) { return null; } else { throw cannotConvert(context, 
receiver, null, "asDate()", "isDate()", "Value does not contain date information."); } } } } abstract static class IsTimeNode extends InteropNode { protected IsTimeNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "isTime"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary objects) { return objects.isTime(receiver); } } abstract static class AsTimeNode extends InteropNode { protected AsTimeNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "asTime"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @Cached InlinedBranchProfile unsupported) { try { return objects.asTime(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (objects.isNull(receiver)) { return null; } else { throw cannotConvert(context, receiver, null, "asTime()", "isTime()", "Value does not contain time information."); } } } } abstract static class IsTimeZoneNode extends InteropNode { protected IsTimeZoneNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "isTimeZone"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // 
@CachedLibrary("receiver") InteropLibrary objects) { return objects.isTimeZone(receiver); } } abstract static class AsTimeZoneNode extends InteropNode { protected AsTimeZoneNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "asTimeZone"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @Cached InlinedBranchProfile unsupported) { try { return objects.asTimeZone(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (objects.isNull(receiver)) { return null; } else { throw cannotConvert(context, receiver, null, "asTimeZone()", "isTimeZone()", "Value does not contain time-zone information."); } } } } abstract static class IsDurationNode extends InteropNode { protected IsDurationNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "isDuration"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary objects) { return objects.isDuration(receiver); } } abstract static class AsDurationNode extends InteropNode { protected AsDurationNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "asDuration"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] 
args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @Cached InlinedBranchProfile unsupported) { try { return objects.asDuration(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (objects.isNull(receiver)) { return null; } else { throw cannotConvert(context, receiver, null, "asDuration()", "isDuration()", "Value does not contain duration information."); } } } } abstract static class AsInstantNode extends InteropNode { protected AsInstantNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getInstant"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @Cached InlinedBranchProfile unsupported) { try { return objects.asInstant(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (objects.isNull(receiver)) { return null; } else { throw cannotConvert(context, receiver, null, "asInstant()", "hasInstant()", "Value does not contain instant information."); } } } } abstract static class AsClassLiteralNode extends InteropNode { protected AsClassLiteralNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Class.class}; } @Override protected String getOperationName() { return "as"; } @Specialization final Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Cached PolyglotToHostNode toHost) { return toHost.execute(this, context, receiver, (Class<?>) args[ARGUMENT_OFFSET], null); } } abstract static class AsTypeLiteralNode extends InteropNode { protected AsTypeLiteralNode(InteropValue interop) { 
super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Class.class, Type.class}; } @Override protected String getOperationName() { return "as"; } @Specialization final Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Cached PolyglotToHostNode toHost) { Class<?> rawType = (Class<?>) args[ARGUMENT_OFFSET]; Type type = (Type) args[ARGUMENT_OFFSET + 1]; return toHost.execute(this, context, receiver, rawType, type); } } abstract static class IsNativePointerNode extends InteropNode { protected IsNativePointerNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "isNativePointer"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary natives) { return natives.isPointer(receiver); } } abstract static class AsNativePointerNode extends InteropNode { protected AsNativePointerNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "asNativePointer"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary natives, @Cached InlinedBranchProfile unsupported) { try { return natives.asPointer(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw cannotConvert(context, receiver, long.class, "asNativePointer()", "isNativeObject()", "Value cannot be converted to a native pointer."); } } } abstract 
static class HasArrayElementsNode extends InteropNode { protected HasArrayElementsNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "hasArrayElements"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary arrays) { return arrays.hasArrayElements(receiver); } } abstract static class GetMemberKeysNode extends InteropNode { protected GetMemberKeysNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getMemberKeys"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported) { try { return toHost.execute(node, context, objects.getMembers(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); return null; } } } abstract static class GetArrayElementNode extends InteropNode { protected GetArrayElementNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Long.class}; } @Override protected String getOperationName() { return "getArrayElement"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary arrays, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile 
unsupported, @Cached InlinedBranchProfile unknown) { long index = (long) args[ARGUMENT_OFFSET]; try { return toHost.execute(node, context, arrays.readArrayElement(receiver, index)); } catch (UnsupportedMessageException e) { unsupported.enter(node); return getArrayElementUnsupported(context, receiver); } catch (InvalidArrayIndexException e) { unknown.enter(node); throw invalidArrayIndex(context, receiver, index); } } } abstract static class SetArrayElementNode extends InteropNode { protected SetArrayElementNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Long.class, null}; } @Override protected String getOperationName() { return "setArrayElement"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary arrays, @Cached(inline = true) ToGuestValueNode toGuestValue, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidIndex, @Cached InlinedBranchProfile invalidValue) { long index = (long) args[ARGUMENT_OFFSET]; Object value = toGuestValue.execute(node, context, args[ARGUMENT_OFFSET + 1]); try { arrays.writeArrayElement(receiver, index, value); } catch (UnsupportedMessageException e) { unsupported.enter(node); setArrayElementUnsupported(context, receiver); } catch (UnsupportedTypeException e) { invalidValue.enter(node); throw invalidArrayValue(context, receiver, index, value); } catch (InvalidArrayIndexException e) { invalidIndex.enter(node); throw invalidArrayIndex(context, receiver, index); } return null; } } abstract static class RemoveArrayElementNode extends InteropNode { protected RemoveArrayElementNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, 
Long.class}; } @Override protected String getOperationName() { return "removeArrayElement"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary arrays, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidIndex) { long index = (long) args[ARGUMENT_OFFSET]; Object value; try { arrays.removeArrayElement(receiver, index); value = Boolean.TRUE; } catch (UnsupportedMessageException e) { unsupported.enter(node); throw removeArrayElementUnsupported(context, receiver); } catch (InvalidArrayIndexException e) { invalidIndex.enter(node); throw invalidArrayIndex(context, receiver, index); } return value; } } abstract static class GetArraySizeNode extends InteropNode { protected GetArraySizeNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getArraySize"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary arrays, @Cached InlinedBranchProfile unsupported) { try { return arrays.getArraySize(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); return getArraySizeUnsupported(context, receiver); } } } // region Buffer nodes abstract static class HasBufferElementsNode extends InteropNode { protected HasBufferElementsNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "hasBufferElements"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, 
Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary buffers) { return buffers.hasBufferElements(receiver); } } abstract static class IsBufferWritableNode extends InteropNode { protected IsBufferWritableNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "isBufferWritable"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached InlinedBranchProfile unsupported) { try { return buffers.isBufferWritable(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getBufferSizeUnsupported(context, receiver); } } } abstract static class GetBufferSizeNode extends InteropNode { protected GetBufferSizeNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getBufferSize"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached InlinedBranchProfile unsupported) { try { return buffers.getBufferSize(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getBufferSizeUnsupported(context, receiver); } } } abstract static class ReadBufferByteNode extends InteropNode { protected ReadBufferByteNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Long.class}; } @Override protected String 
getOperationName() { return "readBufferByte"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile unknown) { final long byteOffset = (long) args[ARGUMENT_OFFSET]; try { return buffers.readBufferByte(receiver, byteOffset); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw readBufferByteUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { unknown.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } } } abstract static class ReadBufferNode extends InteropNode { protected ReadBufferNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Long.class, byte[].class, Integer.class, Integer.class}; } @Override protected String getOperationName() { return "readBufferInto"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile unknown) { final long bufferByteOffset = (long) args[ARGUMENT_OFFSET]; final byte[] destination = (byte[]) args[ARGUMENT_OFFSET + 1]; final int destinationOffset = (int) args[ARGUMENT_OFFSET + 2]; final int byteLength = (int) args[ARGUMENT_OFFSET + 3]; try { buffers.readBuffer(receiver, bufferByteOffset, destination, destinationOffset, byteLength); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw readBufferUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { unknown.enter(node); throw 
invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } return null; } } abstract static class WriteBufferByteNode extends InteropNode { protected WriteBufferByteNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Long.class, Byte.class}; } @Override protected String getOperationName() { return "writeBufferByte"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidIndex, @Cached InlinedBranchProfile invalidValue) { final long byteOffset = (long) args[ARGUMENT_OFFSET]; final byte value = (byte) args[ARGUMENT_OFFSET + 1]; try { buffers.writeBufferByte(receiver, byteOffset, value); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (buffers.hasBufferElements(receiver)) { throw unsupported(context, receiver, "writeBufferByte()", "isBufferWritable()"); } throw writeBufferByteUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { invalidIndex.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } return null; } } abstract static class ReadBufferShortNode extends InteropNode { protected ReadBufferShortNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class}; } @Override protected String getOperationName() { return "readBufferShort"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached ToHostValueNode toHost, @Cached 
InlinedBranchProfile unsupported, @Cached InlinedBranchProfile unknown) { final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET]; final long byteOffset = (long) args[ARGUMENT_OFFSET + 1]; try { return buffers.readBufferShort(receiver, order, byteOffset); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw readBufferShortUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { unknown.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } } } abstract static class WriteBufferShortNode extends InteropNode { protected WriteBufferShortNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class, Short.class}; } @Override protected String getOperationName() { return "writeBufferShort"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidIndex, @Cached InlinedBranchProfile invalidValue) { final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET]; final long byteOffset = (long) args[ARGUMENT_OFFSET + 1]; final short value = (short) args[ARGUMENT_OFFSET + 2]; try { buffers.writeBufferShort(receiver, order, byteOffset, value); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (buffers.hasBufferElements(receiver)) { throw unsupported(context, receiver, "writeBufferShort()", "isBufferWritable()"); } throw writeBufferShortUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { invalidIndex.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } return null; } } abstract static class ReadBufferIntNode extends InteropNode { protected 
ReadBufferIntNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class}; } @Override protected String getOperationName() { return "readBufferInt"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile unknown) { final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET]; final long byteOffset = (long) args[ARGUMENT_OFFSET + 1]; try { return buffers.readBufferInt(receiver, order, byteOffset); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw readBufferIntUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { unknown.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } } } abstract static class WriteBufferIntNode extends InteropNode { protected WriteBufferIntNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class, Integer.class}; } @Override protected String getOperationName() { return "writeBufferInt"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidIndex, @Cached InlinedBranchProfile invalidValue) { final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET]; final long byteOffset = (long) args[ARGUMENT_OFFSET + 1]; final int value = (int) args[ARGUMENT_OFFSET + 2]; try { buffers.writeBufferInt(receiver, order, 
byteOffset, value); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (buffers.hasBufferElements(receiver)) { throw unsupported(context, receiver, "writeBufferInt()", "isBufferWritable()"); } throw writeBufferIntUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { invalidIndex.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } return null; } } abstract static class ReadBufferLongNode extends InteropNode { protected ReadBufferLongNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class}; } @Override protected String getOperationName() { return "readBufferLong"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile unknown) { final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET]; final long byteOffset = (long) args[ARGUMENT_OFFSET + 1]; try { return buffers.readBufferLong(receiver, order, byteOffset); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw readBufferLongUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { unknown.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } } } abstract static class WriteBufferLongNode extends InteropNode { protected WriteBufferLongNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class, Long.class}; } @Override protected String getOperationName() { return "writeBufferLong"; } @Specialization(limit = "CACHE_LIMIT") 
static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidIndex, @Cached InlinedBranchProfile invalidValue) { final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET]; final long byteOffset = (long) args[ARGUMENT_OFFSET + 1]; final long value = (long) args[ARGUMENT_OFFSET + 2]; try { buffers.writeBufferLong(receiver, order, byteOffset, value); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (buffers.hasBufferElements(receiver)) { throw unsupported(context, receiver, "writeBufferLong()", "isBufferWritable()"); } throw writeBufferLongUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { invalidIndex.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } return null; } } abstract static class ReadBufferFloatNode extends InteropNode { protected ReadBufferFloatNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class}; } @Override protected String getOperationName() { return "readBufferFloat"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile unknown) { final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET]; final long byteOffset = (long) args[ARGUMENT_OFFSET + 1]; try { return buffers.readBufferFloat(receiver, order, byteOffset); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw readBufferFloatUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { 
unknown.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } } } abstract static class WriteBufferFloatNode extends InteropNode { protected WriteBufferFloatNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class, Float.class}; } @Override protected String getOperationName() { return "writeBufferFloat"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary buffers, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidIndex) { final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET]; final long byteOffset = (long) args[ARGUMENT_OFFSET + 1]; final float value = (float) args[ARGUMENT_OFFSET + 2]; try { buffers.writeBufferFloat(receiver, order, byteOffset, value); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (buffers.hasBufferElements(receiver)) { throw unsupported(context, receiver, "writeBufferFloat()", "isBufferWritable()"); } throw writeBufferFloatUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { invalidIndex.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } return null; } } abstract static class ReadBufferDoubleNode extends InteropNode { protected ReadBufferDoubleNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class}; } @Override protected String getOperationName() { return "readBufferDouble"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, // @Bind("this") Node node, @CachedLibrary("receiver") 
InteropLibrary buffers, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile unknown) { final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET]; final long byteOffset = (long) args[ARGUMENT_OFFSET + 1]; try { return buffers.readBufferDouble(receiver, order, byteOffset); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw readBufferDoubleUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { unknown.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } } } abstract static class WriteBufferDoubleNode extends InteropNode { protected WriteBufferDoubleNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, ByteOrder.class, Long.class, Double.class}; } @Override protected String getOperationName() { return "writeBufferDouble"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary buffers, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidIndex, @Cached InlinedBranchProfile invalidValue) { final ByteOrder order = (ByteOrder) args[ARGUMENT_OFFSET]; final long byteOffset = (long) args[ARGUMENT_OFFSET + 1]; final double value = (double) args[ARGUMENT_OFFSET + 2]; try { buffers.writeBufferDouble(receiver, order, byteOffset, value); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (buffers.hasBufferElements(receiver)) { throw unsupported(context, receiver, "writeBufferDouble()", "isBufferWritable()"); } throw writeBufferDoubleUnsupported(context, receiver); } catch (InvalidBufferOffsetException e) { invalidIndex.enter(node); throw invalidBufferIndex(context, receiver, e.getByteOffset(), e.getLength()); } return null; } } // endregion 
abstract static class GetMemberNode extends InteropNode { protected GetMemberNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class}; } @Override protected String getOperationName() { return "getMember"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary objects, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile unknown) { String key = (String) args[ARGUMENT_OFFSET]; Object value; try { assert key != null : "should be handled already"; value = toHost.execute(node, context, objects.readMember(receiver, key)); } catch (UnsupportedMessageException e) { unsupported.enter(node); if (objects.hasMembers(receiver)) { value = null; } else { return getMemberUnsupported(context, receiver, key); } } catch (UnknownIdentifierException e) { unknown.enter(node); value = null; } return value; } } abstract static class PutMemberNode extends InteropNode { protected PutMemberNode(InteropValue interop) { super(interop); } @Override protected String getOperationName() { return "putMember"; } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class, null}; } @Specialization static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary(limit = "CACHE_LIMIT") InteropLibrary objects, @Cached(inline = true) ToGuestValueNode toGuestValue, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidValue, @Cached InlinedBranchProfile unknown) { String key = (String) args[ARGUMENT_OFFSET]; Object originalValue = args[ARGUMENT_OFFSET + 1]; Object value = toGuestValue.execute(node, context, 
originalValue); assert key != null; try { objects.writeMember(receiver, key, value); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw putMemberUnsupported(context, receiver); } catch (UnknownIdentifierException e) { unknown.enter(node); throw nonWritableMemberKey(context, receiver, key); } catch (UnsupportedTypeException e) { invalidValue.enter(node); throw invalidMemberValue(context, receiver, key, value); } return null; } } abstract static class RemoveMemberNode extends InteropNode { protected RemoveMemberNode(InteropValue interop) { super(interop); } @Override protected String getOperationName() { return "removeMember"; } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class}; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary objects, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile unknown) { String key = (String) args[ARGUMENT_OFFSET]; Object value; try { assert key != null : "should be handled already"; objects.removeMember(receiver, key); value = Boolean.TRUE; } catch (UnsupportedMessageException e) { unsupported.enter(node); if (!objects.hasMembers(receiver)) { throw removeMemberUnsupported(context, receiver); } else if (objects.isMemberExisting(receiver, key)) { throw nonRemovableMemberKey(context, receiver, key); } else { value = Boolean.FALSE; } } catch (UnknownIdentifierException e) { unknown.enter(node); if (objects.isMemberExisting(receiver, key)) { throw nonRemovableMemberKey(context, receiver, key); } else { value = Boolean.FALSE; } } return value; } } abstract static class IsNullNode extends InteropNode { protected IsNullNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, 
polyglot.receiverType}; } @Override protected String getOperationName() { return "isNull"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary values) { return values.isNull(receiver); } } abstract static class HasMembersNode extends InteropNode { protected HasMembersNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "hasMembers"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary objects) { return objects.hasMembers(receiver); } } private abstract static class AbstractMemberInfoNode extends InteropNode { protected AbstractMemberInfoNode(InteropValue interop) { super(interop); } @Override protected final Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class}; } } abstract static class HasMemberNode extends AbstractMemberInfoNode { protected HasMemberNode(InteropValue interop) { super(interop); } @Override protected String getOperationName() { return "hasMember"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary objects) { String key = (String) args[ARGUMENT_OFFSET]; return objects.isMemberExisting(receiver, key); } } abstract static class CanInvokeNode extends AbstractMemberInfoNode { protected CanInvokeNode(InteropValue interop) { super(interop); } @Override protected String getOperationName() { return "canInvoke"; } @Specialization(limit = "CACHE_LIMIT") static Object 
doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary objects) { String key = (String) args[ARGUMENT_OFFSET]; return objects.isMemberInvocable(receiver, key); } } abstract static class CanExecuteNode extends InteropNode { protected CanExecuteNode(InteropValue interop) { super(interop); } @Override protected String getOperationName() { return "canExecute"; } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary executables) { return executables.isExecutable(receiver); } } abstract static class CanInstantiateNode extends InteropNode { protected CanInstantiateNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "canInstantiate"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary instantiables) { return instantiables.isInstantiable(receiver); } } @ImportStatic(InteropNode.class) @GenerateInline(true) @GenerateCached(false) abstract static class SharedExecuteNode extends Node { protected abstract Object executeShared(Node node, PolyglotLanguageContext context, Object receiver, Object[] args); @Specialization(limit = "CACHE_LIMIT") protected static Object doDefault(Node node, PolyglotLanguageContext context, Object receiver, Object[] args, @CachedLibrary("receiver") InteropLibrary executables, @Cached ToGuestValuesNode toGuestValues, @Cached ToHostValueNode toHostValue, @Cached 
InlinedBranchProfile invalidArgument, @Cached InlinedBranchProfile arity, @Cached InlinedBranchProfile unsupported) { Object[] guestArguments = toGuestValues.execute(node, context, args); try { return executables.execute(receiver, guestArguments); } catch (UnsupportedTypeException e) { invalidArgument.enter(node); throw invalidExecuteArgumentType(context, receiver, e); } catch (ArityException e) { arity.enter(node); throw invalidExecuteArity(context, receiver, guestArguments, e.getExpectedMinArity(), e.getExpectedMaxArity(), e.getActualArity()); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw executeUnsupported(context, receiver); } } } abstract static class ExecuteVoidNode extends InteropNode { protected ExecuteVoidNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object[].class}; } @Specialization final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args, @Cached SharedExecuteNode executeNode) { executeNode.executeShared(this, context, receiver, (Object[]) args[ARGUMENT_OFFSET]); return null; } @Override protected String getOperationName() { return "executeVoid"; } } abstract static class ExecuteVoidNoArgsNode extends InteropNode { private static final Object[] NO_ARGS = new Object[0]; protected ExecuteVoidNoArgsNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Specialization final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args, @Cached SharedExecuteNode executeNode) { executeNode.executeShared(this, context, receiver, NO_ARGS); return null; } @Override protected String getOperationName() { return "executeVoid"; } } abstract static class ExecuteNode extends InteropNode { protected ExecuteNode(InteropValue interop) { 
super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object[].class}; } @Specialization final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args, @Cached ToHostValueNode toHostValue, @Cached SharedExecuteNode executeNode) { return toHostValue.execute(this, context, executeNode.executeShared(this, context, receiver, (Object[]) args[ARGUMENT_OFFSET])); } @Override protected String getOperationName() { return "execute"; } } abstract static class ExecuteNoArgsNode extends InteropNode { protected ExecuteNoArgsNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Specialization final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args, @Cached ToHostValueNode toHostValue, @Cached SharedExecuteNode executeNode) { return toHostValue.execute(this, context, executeNode.executeShared(this, context, receiver, ExecuteVoidNoArgsNode.NO_ARGS)); } @Override protected String getOperationName() { return "execute"; } } abstract static class NewInstanceNode extends InteropNode { protected NewInstanceNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object[].class}; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary instantiables, @Cached ToGuestValuesNode toGuestValues, @Cached ToHostValueNode toHostValue, @Cached InlinedBranchProfile arity, @Cached InlinedBranchProfile invalidArgument, @Cached InlinedBranchProfile unsupported) { Object[] instantiateArguments = toGuestValues.execute(node, context, (Object[]) args[ARGUMENT_OFFSET]); try { return 
toHostValue.execute(node, context, instantiables.instantiate(receiver, instantiateArguments)); } catch (UnsupportedTypeException e) { invalidArgument.enter(node); throw invalidInstantiateArgumentType(context, receiver, instantiateArguments); } catch (ArityException e) { arity.enter(node); throw invalidInstantiateArity(context, receiver, instantiateArguments, e.getExpectedMinArity(), e.getExpectedMaxArity(), e.getActualArity()); } catch (UnsupportedMessageException e) { unsupported.enter(node); return newInstanceUnsupported(context, receiver); } } @Override protected String getOperationName() { return "newInstance"; } } @ImportStatic(InteropNode.class) @GenerateInline(true) @GenerateCached(false) abstract static class SharedInvokeNode extends Node { protected abstract Object executeShared(Node node, PolyglotLanguageContext context, Object receiver, String key, Object[] guestArguments); @Specialization(limit = "CACHE_LIMIT") protected static Object doDefault(Node node, PolyglotLanguageContext context, Object receiver, String key, Object[] guestArguments, @CachedLibrary("receiver") InteropLibrary objects, @Cached ToHostValueNode toHostValue, @Cached InlinedBranchProfile invalidArgument, @Cached InlinedBranchProfile arity, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile unknownIdentifier) { try { return toHostValue.execute(node, context, objects.invokeMember(receiver, key, guestArguments)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw invokeUnsupported(context, receiver, key); } catch (UnknownIdentifierException e) { unknownIdentifier.enter(node); throw nonReadableMemberKey(context, receiver, key); } catch (UnsupportedTypeException e) { invalidArgument.enter(node); throw invalidInvokeArgumentType(context, receiver, key, e); } catch (ArityException e) { arity.enter(node); throw invalidInvokeArity(context, receiver, key, guestArguments, e.getExpectedMinArity(), e.getExpectedMaxArity(), e.getActualArity()); } } } abstract 
static class InvokeNode extends InteropNode { protected InvokeNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class, Object[].class}; } @Override protected String getOperationName() { return "invoke"; } @Specialization final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args, @Cached SharedInvokeNode sharedInvoke, @Cached ToGuestValuesNode toGuestValues) { String key = (String) args[ARGUMENT_OFFSET]; Object[] guestArguments = toGuestValues.execute(this, context, (Object[]) args[ARGUMENT_OFFSET + 1]); return sharedInvoke.executeShared(this, context, receiver, key, guestArguments); } } abstract static class InvokeNoArgsNode extends InteropNode { protected InvokeNoArgsNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, String.class}; } @Override protected String getOperationName() { return "invoke"; } @Specialization final Object doDefault(PolyglotLanguageContext context, Object receiver, Object[] args, @Cached SharedInvokeNode sharedInvoke) { String key = (String) args[ARGUMENT_OFFSET]; return sharedInvoke.executeShared(this, context, receiver, key, ExecuteVoidNoArgsNode.NO_ARGS); } } abstract static class IsExceptionNode extends InteropNode { protected IsExceptionNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "isException"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects) { return objects.isException(receiver); } } abstract static class 
ThrowExceptionNode extends InteropNode { protected ThrowExceptionNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "throwException"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @Cached InlinedBranchProfile unsupported) { try { throw objects.throwException(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw unsupported(context, receiver, "throwException()", "isException()"); } } } abstract static class IsMetaObjectNode extends InteropNode { protected IsMetaObjectNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "isMetaObject"; } @Specialization(limit = "CACHE_LIMIT") static boolean doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects) { return objects.isMetaObject(receiver); } } abstract static class GetMetaQualifiedNameNode extends InteropNode { protected GetMetaQualifiedNameNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getMetaQualifiedName"; } @Specialization(limit = "CACHE_LIMIT") static String doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @CachedLibrary(limit = "1") InteropLibrary toString, @Cached InlinedBranchProfile unsupported) { 
try { return toString.asString(objects.getMetaQualifiedName(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw unsupported(context, receiver, "getMetaQualifiedName()", "isMetaObject()"); } } } abstract static class GetMetaSimpleNameNode extends InteropNode { protected GetMetaSimpleNameNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getMetaSimpleName"; } @Specialization(limit = "CACHE_LIMIT") static String doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @CachedLibrary(limit = "1") InteropLibrary toString, @Cached InlinedBranchProfile unsupported) { try { return toString.asString(objects.getMetaSimpleName(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw unsupported(context, receiver, "getMetaSimpleName()", "isMetaObject()"); } } } abstract static class IsMetaInstanceNode extends InteropNode { protected IsMetaInstanceNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, null}; } @Override protected String getOperationName() { return "isMetaInstance"; } @Specialization(limit = "CACHE_LIMIT") static boolean doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @Cached(inline = true) ToGuestValueNode toGuest, @Cached InlinedBranchProfile unsupported) { try { return objects.isMetaInstance(receiver, toGuest.execute(node, context, args[ARGUMENT_OFFSET])); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw unsupported(context, receiver, "isMetaInstance()", "isMetaObject()"); } } } abstract 
static class HasMetaParentsNode extends InteropNode { protected HasMetaParentsNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "hasMetaParents"; } @Specialization(limit = "CACHE_LIMIT") static boolean doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @Cached InlinedBranchProfile unsupported) { return objects.hasMetaParents(receiver); } } abstract static class GetMetaParentsNode extends InteropNode { protected GetMetaParentsNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getMetaParents"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary objects, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported) { try { return toHost.execute(node, context, objects.getMetaParents(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw unsupported(context, receiver, "getMetaParents()", "hasMetaParents()"); } } } abstract static class HasIteratorNode extends InteropNode { protected HasIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "hasIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary 
iterators) { return iterators.hasIterator(receiver); } } abstract static class GetIteratorNode extends InteropNode { protected GetIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary iterators, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported) { try { return toHost.execute(node, context, iterators.getIterator(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); return getIteratorUnsupported(context, receiver); } } } abstract static class IsIteratorNode extends InteropNode { protected IsIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "isIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary iterators) { return iterators.isIterator(receiver); } } abstract static class HasIteratorNextElementNode extends InteropNode { protected HasIteratorNextElementNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "hasIteratorNextElement"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") 
InteropLibrary iterators, @Cached InlinedBranchProfile unsupported) { try { return iterators.hasIteratorNextElement(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); return hasIteratorNextElementUnsupported(context, receiver); } } } abstract static class GetIteratorNextElementNode extends InteropNode { protected GetIteratorNextElementNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getIteratorNextElement"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary iterators, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile stop) { try { return toHost.execute(node, context, iterators.getIteratorNextElement(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw nonReadableIteratorElement(); } catch (StopIterationException e) { stop.enter(node); throw stopIteration(context, receiver); } } } abstract static class HasHashEntriesNode extends InteropNode { protected HasHashEntriesNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "hasHashEntries"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes) { return hashes.hasHashEntries(receiver); } } abstract static class GetHashSizeNode extends InteropNode { protected GetHashSizeNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] 
getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getHashSize"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached InlinedBranchProfile unsupported) { try { return hashes.getHashSize(receiver); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashSizeUnsupported(context, receiver); } } } abstract static class HasHashEntryNode extends InteropNode { protected HasHashEntryNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object.class}; } @Override protected String getOperationName() { return "hasHashEntry"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary hashes, @Cached(inline = true) ToGuestValueNode toGuestKey) { Object hostKey = args[ARGUMENT_OFFSET]; Object key = toGuestKey.execute(node, context, hostKey); return hashes.isHashEntryExisting(receiver, key); } } abstract static class GetHashValueNode extends InteropNode { protected GetHashValueNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object.class}; } @Override protected String getOperationName() { return "getHashValue"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, @CachedLibrary("receiver") InteropLibrary hashes, @Cached(inline = true) ToGuestValueNode toGuestKey, @Cached ToHostValueNode toHost, @Cached 
InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidKey) { Object hostKey = args[ARGUMENT_OFFSET]; Object key = toGuestKey.execute(node, context, hostKey); try { return toHost.execute(node, context, hashes.readHashValue(receiver, key)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashValueUnsupported(context, receiver, key); } catch (UnknownKeyException e) { invalidKey.enter(node); if (hashes.isHashEntryExisting(receiver, key)) { throw getHashValueUnsupported(context, receiver, key); } else { return null; } } } } abstract static class GetHashValueOrDefaultNode extends InteropNode { protected GetHashValueOrDefaultNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object.class, Object.class}; } @Override protected String getOperationName() { return "getHashValueOrDefault"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached(inline = true) ToGuestValueNode toGuestKey, @Cached(inline = true) ToGuestValueNode toGuestDefaultValue, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidKey) { Object hostKey = args[ARGUMENT_OFFSET]; Object hostDefaultValue = args[ARGUMENT_OFFSET + 1]; Object key = toGuestKey.execute(node, context, hostKey); Object defaultValue = toGuestDefaultValue.execute(node, context, hostDefaultValue); try { /* pass the converted guest value, not the raw host value */ return toHost.execute(node, context, hashes.readHashValueOrDefault(receiver, key, defaultValue)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashValueUnsupported(context, receiver, key); } } } abstract static class PutHashEntryNode extends InteropNode { protected PutHashEntryNode(InteropValue interop) { super(interop); }
@Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object.class, Object.class}; } @Override protected String getOperationName() { return "putHashEntry"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached(inline = true) ToGuestValueNode toGuestKey, @Cached(inline = true) ToGuestValueNode toGuestValue, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidKey, @Cached InlinedBranchProfile invalidValue) { Object hostKey = args[ARGUMENT_OFFSET]; Object hostValue = args[ARGUMENT_OFFSET + 1]; Object key = toGuestKey.execute(node, context, hostKey); Object value = toGuestValue.execute(node, context, hostValue); try { hashes.writeHashEntry(receiver, key, value); } catch (UnsupportedMessageException | UnknownKeyException e) { unsupported.enter(node); throw putHashEntryUnsupported(context, receiver, key, value); } catch (UnsupportedTypeException e) { invalidValue.enter(node); throw invalidHashValue(context, receiver, key, value); } return null; } } abstract static class RemoveHashEntryNode extends InteropNode { protected RemoveHashEntryNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType, Object.class}; } @Override protected String getOperationName() { return "removeHashEntry"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached(inline = true) ToGuestValueNode toGuestKey, @Cached InlinedBranchProfile unsupported, @Cached InlinedBranchProfile invalidKey) { Object hostKey = args[ARGUMENT_OFFSET]; Object key = toGuestKey.execute(node, 
context, hostKey); Boolean result; try { hashes.removeHashEntry(receiver, key); result = Boolean.TRUE; } catch (UnsupportedMessageException e) { unsupported.enter(node); if (!hashes.hasHashEntries(receiver) || hashes.isHashEntryExisting(receiver, key)) { throw removeHashEntryUnsupported(context, receiver, key); } else { result = Boolean.FALSE; } } catch (UnknownKeyException e) { invalidKey.enter(node); result = Boolean.FALSE; } return result; } } abstract static class GetHashEntriesIteratorNode extends InteropNode { GetHashEntriesIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getHashEntriesIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported) { try { return toHost.execute(node, context, hashes.getHashEntriesIterator(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashEntriesIteratorUnsupported(context, receiver); } } } abstract static class GetHashKeysIteratorNode extends InteropNode { GetHashKeysIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getHashKeysIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported) { try { return toHost.execute(node, context, 
hashes.getHashKeysIterator(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashEntriesIteratorUnsupported(context, receiver); } } } abstract static class GetHashValuesIteratorNode extends InteropNode { GetHashValuesIteratorNode(InteropValue interop) { super(interop); } @Override protected Class<?>[] getArgumentTypes() { return new Class<?>[]{PolyglotLanguageContext.class, polyglot.receiverType}; } @Override protected String getOperationName() { return "getHashValuesIterator"; } @Specialization(limit = "CACHE_LIMIT") static Object doCached(PolyglotLanguageContext context, Object receiver, Object[] args, @Bind("this") Node node, // @CachedLibrary("receiver") InteropLibrary hashes, @Cached ToHostValueNode toHost, @Cached InlinedBranchProfile unsupported) { try { return toHost.execute(node, context, hashes.getHashValuesIterator(receiver)); } catch (UnsupportedMessageException e) { unsupported.enter(node); throw getHashEntriesIteratorUnsupported(context, receiver); } } } } } ```
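Each node above follows the same shape: convert host arguments to guest values, forward one interop message to the receiver, and map interop exceptions to descriptive host-side errors (or a benign result such as `null`). A minimal stdlib-only sketch of that pattern — this is not the GraalVM API, just an illustration backed by a plain `HashMap`, with `UnsupportedOperationException` standing in for the generated `...Unsupported` errors:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the interop dispatch pattern: host key -> guest key conversion,
// forward to the receiver, map "unsupported" to a descriptive error.
final class HashInteropSketch {
    private final Map<Object, Object> entries; // receiver storage; null = hashes unsupported

    HashInteropSketch(Map<Object, Object> entries) {
        this.entries = entries;
    }

    // stands in for toGuestKey.execute(...); identity conversion here
    private static Object toGuest(Object hostValue) {
        return hostValue;
    }

    long getHashSize() {
        if (entries == null) {
            // mirrors getHashSizeUnsupported(context, receiver)
            throw new UnsupportedOperationException("getHashSize not supported");
        }
        return entries.size();
    }

    Object getHashValue(Object hostKey) {
        Object key = toGuest(hostKey);
        if (entries == null) {
            throw new UnsupportedOperationException("getHashValue not supported");
        }
        // an unknown key yields null, like the UnknownKeyException branch above
        return entries.get(key);
    }

    public static void main(String[] args) {
        Map<Object, Object> m = new HashMap<>();
        m.put("a", 1);
        HashInteropSketch v = new HashInteropSketch(m);
        System.out.println(v.getHashSize());     // 1
        System.out.println(v.getHashValue("a")); // 1
        System.out.println(v.getHashValue("b")); // null
    }
}
```

The real implementation differs mainly in that conversions and error paths are cached, profiled Truffle nodes rather than plain method calls.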
Enrique Villarreal (born 17 April 1947) is a Mexican boxer. He competed in the men's light heavyweight event at the 1968 Summer Olympics. 1968 Olympic results Below are the results of Enrique Villarreal at the 1968 Mexico City Olympics: Round of 32: bye Round of 16: lost to Fatai Ayinla (Nigeria); referee stopped contest References External links 1947 births Living people Mexican male boxers Olympic boxers for Mexico Boxers at the 1968 Summer Olympics Sportspeople from Coahuila Light-heavyweight boxers
Thomas Schnauz (born ) is an American television producer and television writer. His credits include The X-Files, The Lone Gunmen, Night Stalker, Reaper, Breaking Bad, and Better Call Saul. Personal life Schnauz was born in Kearny, New Jersey. He attended New York University's Tisch School of the Arts, where he first met fellow student Vince Gilligan. Schnauz graduated from Tisch in 1988. Career Schnauz started his career in various production jobs. His first screenplay was called Spirits in Passing. He eventually joined Vince Gilligan on the writing staff of The X-Files and its spinoff show, The Lone Gunmen. He also co-wrote the screenplays for the 2008 film Otis and the 2008 television film Infected. In 2010, he re-teamed with Gilligan on Breaking Bad, where he remained through the show's 2013 conclusion. Schnauz signed a two-year overall deal with Sony Pictures Television in November 2014. Schnauz served as co-executive producer on AMC's Breaking Bad spinoff series Better Call Saul. He has written and/or directed a number of its episodes including "Pimento", the penultimate episode of the show's first season, which received critical acclaim, as well as "Plan and Execution", the finale of the sixth season's first half that also received praise for Schnauz's writing. In April 2015, it was reported that he had been tapped to write the screenplay for "a revisionist take" on "Jack and the Beanstalk", also to be produced by Vince Gilligan. In 2019, Schnauz joined other WGA writers in firing their agents as part of the WGA's stand against the ATA and the practice of packaging. Filmography Writer Production staff Screenplays Awards and nominations Schnauz has been nominated for Writers Guild of America Awards on six occasions, winning three times, for his work on the writing staffs of Breaking Bad and Better Call Saul. Schnauz shared in the show's 2010 Dramatic Series nomination, and subsequent category wins in 2011, 2012 and 2013, for his work on Breaking Bad. 
He was nominated again in 2015 and 2016 in the Dramatic Series category for Better Call Saul. He was nominated for a Primetime Emmy Award for Outstanding Writing in a Drama Series for the 2012 Breaking Bad episode "Say My Name". References External links American television producers American television writers Living people Place of birth missing (living people) Year of birth missing (living people) American television directors American male screenwriters Tisch School of the Arts alumni American male television writers Writers Guild of America Award winners
Rescuecom Corp. v. Google Inc., 562 F.3d 123 (2nd Cir. 2009), was a case at the United States Court of Appeals for the Second Circuit, in which the court held that recommending a trademark for keyword advertising was a commercial use of the trademark, and could constitute trademark infringement. Background AdWords is a system used by Google through which advertisers can purchase keywords. When a user searches for a purchased keyword, Google displays advertisements from the purchasing firm. Google also provides its advertising customers with a "Keyword Suggestion Tool", which recommends additional keywords for the customer to purchase. Rescuecom, a computer repair and service firm, discovered that Google had used its name during this process, recommending it to its AdWords customers. The name "Rescuecom" was trademarked by the company. Rescuecom further found that Google had recommended its name for purchase by some of Rescuecom's own competitors. The company therefore filed suit against Google claiming violations of the Lanham Act, including trademark infringement, trademark dilution, false designation of origin, and tortious interference with business relations with an intent to gain economic advantage. District court proceedings The case was first heard by the United States District Court for the Northern District of New York in 2006. The district court drew heavily on the Second Circuit precedent 1-800 Contacts, Inc. v. WhenU.com, Inc., concerning the use of software that generated pop-up advertisements based on a computer user's actions, with the ads being generated from a database of company domain names. In that precedent, the court held that the usage of a trademarked domain name in an "unpublished directory of terms" and the appearance of "separate, branded ads" triggered by a trademark do not constitute "use" of the trademark under the Lanham Act.
In its ruling in the Rescuecom case, the district court held that, per the WhenU precedent, Rescuecom had "prove[d] no facts in support of its claim... [of] trademark use". Given that a trademark "use" is required under the Lanham Act for infringement to occur, the district court dismissed Rescuecom's complaint. Rescuecom appealed this ruling to the Second Circuit Court of Appeals. Circuit court opinion The Second Circuit reversed the district court's decision in 2009, holding that Google's use of the "Rescuecom" trademark constituted a "use in commerce" under the Lanham Act, even when recommending it for purchase by its own advertising customers. The circuit court held that the lower court had interpreted the WhenU precedent incorrectly, because Google's process of selling keywords for its AdWords service was a different type of business practice. The court rejected Google's argument that the inclusion of a trademarked term in an internal computer directory does not constitute "use" of that trademark under the Lanham Act. Instead, the court held that "Google’s recommendation and sale of Rescuecom’s mark to its advertising customers are not internal uses" and were a full business transaction. If the court were to accept Google's argument, "the operators of search engines would be free to use trademarks in ways designed to deceive and cause consumer confusion. This is surely neither within the intention nor the letter of the Lanham Act." In another departure from the matters discussed in the WhenU precedent, the circuit court accepted Rescuecom's assertion that Google's placement of "sponsored links" purchased by its AdWords customers, above organic search results at the google.com page, could lead consumers to conclude that such ads were associated with Rescuecom. As a result, Google's argument that a sponsored link is analogous to placement of a generic brand next to a trademarked brand was rejected by the circuit court. 
The Second Circuit thus vacated the district court decision, ruling that Google's actions constituted commercial use and that Rescuecom's claims of trademark infringement, trademark dilution, and related claims could not be immediately dismissed as Google requested. The case was sent back to the district court for reconsideration of possible financial damages to be paid to Rescuecom. Impact and subsequent developments In 2010, Rescuecom moved to drop the proceedings against Google and issued a press release declaring victory in the case. However, Google apparently made no new concessions to Rescuecom to get it to drop the lawsuit; instead, the changes that Rescuecom claimed as victory had been made by Google five years earlier. Thus it was unclear if Google had adjusted its AdWords recommendation process based on the circuit court ruling, or if Rescuecom had realized any rewards. Meanwhile, the original district court ruling was criticized for confusing the matter of "commercial use" of trademarked terms on the Internet for advertising practices that would be permitted for traditional advertising, though the Second Circuit ruling on appeal has been cited as an important if uncertain precedent in the law of Internet advertising. References United States Court of Appeals for the Second Circuit cases United States Internet case law United States trademark case law 2009 in United States case law Google litigation Online advertising
```c
/* Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.  */

#include "sysdep.h"
#include "bfd.h"
#include "dis-asm.h"
#include "disassemble.h"
#include <stdarg.h>
#include <stdint.h>
#include <stdio.h>

#define MAX_TEXT_SIZE 256

typedef struct
{
  char *buffer;
  size_t pos;
} SFILE;

static int
fuzz_disasm_null_styled_printf (void *stream, enum disassembler_style style,
                                const char *format, ...)
{
  return 0;
}

static int
objdump_sprintf (void *vf, const char *format, ...)
{
  SFILE *f = (SFILE *) vf;
  size_t n;
  va_list args;

  va_start (args, format);
  if (f->pos >= MAX_TEXT_SIZE)
    {
      printf ("buffer needs more space\n");
      /* Reset.  */
      f->pos = 0;
      va_end (args);
      return 0;
    }
  n = vsnprintf (f->buffer + f->pos, MAX_TEXT_SIZE - f->pos, format, args);
  va_end (args);
  f->pos += n;
  return n;
}

int
LLVMFuzzerTestOneInput (const uint8_t *Data, size_t Size)
{
  char AssemblyText[MAX_TEXT_SIZE];
  struct disassemble_info disasm_info;
  SFILE s;

  /* 10 bytes are reserved for the option fields below; inputs larger
     than 16394 bytes are rejected to prevent timeouts.  */
  if (Size < 10 || Size > 16394)
    return 0;

  init_disassemble_info (&disasm_info, stdout, (fprintf_ftype) fprintf,
                         fuzz_disasm_null_styled_printf);
  disasm_info.fprintf_func = objdump_sprintf;
  disasm_info.print_address_func = generic_print_address;
  disasm_info.display_endian = disasm_info.endian = BFD_ENDIAN_LITTLE;
  disasm_info.buffer = (bfd_byte *) Data;
  disasm_info.buffer_vma = 0x1000;
  disasm_info.buffer_length = Size - 10;
  disasm_info.insn_info_valid = 0;
  disasm_info.created_styled_output = false;
  s.buffer = AssemblyText;
  s.pos = 0;
  disasm_info.stream = &s;
  disasm_info.bytes_per_line = 0;

  /* The last 10 bytes of the input select the target.  */
  disasm_info.flags |= USER_SPECIFIED_MACHINE_TYPE;
  disasm_info.arch = Data[Size - 1];
  disasm_info.mach = bfd_getl64 (&Data[Size - 9]);
  disasm_info.flavour = Data[Size - 10];

  if (bfd_lookup_arch (disasm_info.arch, disasm_info.mach) != NULL)
    {
      disassembler_ftype disasfunc
        = disassembler (disasm_info.arch, 0, disasm_info.mach, NULL);
      if (disasfunc != NULL)
        {
          disassemble_init_for_target (&disasm_info);
          while (1)
            {
              s.pos = 0;
              int octets = disasfunc (disasm_info.buffer_vma, &disasm_info);
              if (octets < (int) disasm_info.octets_per_byte)
                break;
              if (disasm_info.buffer_length <= (size_t) octets)
                break;
              disasm_info.buffer += octets;
              disasm_info.buffer_vma += octets / disasm_info.octets_per_byte;
              disasm_info.buffer_length -= octets;
            }
          disassemble_free_target (&disasm_info);
        }
    }
  return 0;
}
```
```c /* * * This file is part of FFmpeg. * * FFmpeg is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * * FFmpeg is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * * You should have received a copy of the GNU Lesser General Public * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA */ /** * @file * multimedia converter based on the FFmpeg libraries */ #include "config.h" #include <ctype.h> #include <string.h> #include <math.h> #include <stdlib.h> #include <errno.h> #include <limits.h> #include <stdint.h> #if HAVE_IO_H #include <io.h> #endif #if HAVE_UNISTD_H #include <unistd.h> #endif #include "libavformat/avformat.h" #include "libavdevice/avdevice.h" #include "libswresample/swresample.h" #include "libavutil/opt.h" #include "libavutil/channel_layout.h" #include "libavutil/parseutils.h" #include "libavutil/samplefmt.h" #include "libavutil/fifo.h" #include "libavutil/internal.h" #include "libavutil/intreadwrite.h" #include "libavutil/dict.h" #include "libavutil/mathematics.h" #include "libavutil/pixdesc.h" #include "libavutil/avstring.h" #include "libavutil/libm.h" #include "libavutil/imgutils.h" #include "libavutil/timestamp.h" #include "libavutil/bprint.h" #include "libavutil/time.h" #include "libavutil/threadmessage.h" #include "libavcodec/mathops.h" #include "libavformat/os_support.h" # include "libavfilter/avfilter.h" # include "libavfilter/buffersrc.h" # include "libavfilter/buffersink.h" #if HAVE_SYS_RESOURCE_H #include <sys/time.h> #include <sys/types.h> #include <sys/resource.h> #elif HAVE_GETPROCESSTIMES #include <windows.h> #endif #if HAVE_GETPROCESSMEMORYINFO #include <windows.h> #include <psapi.h> #endif #if HAVE_SETCONSOLECTRLHANDLER #include <windows.h> #endif #if HAVE_SYS_SELECT_H #include <sys/select.h> #endif #if HAVE_TERMIOS_H 
#include <fcntl.h> #include <sys/ioctl.h> #include <sys/time.h> #include <termios.h> #elif HAVE_KBHIT #include <conio.h> #endif #if HAVE_PTHREADS #include <pthread.h> #endif #include <time.h> #include "ffmpeg.h" #include "cmdutils.h" #include "libavutil/avassert.h" const char program_name[] = "ffmpeg"; const int program_birth_year = 2000; static FILE *vstats_file; const char *const forced_keyframes_const_names[] = { "n", "n_forced", "prev_forced_n", "prev_forced_t", "t", NULL }; static void do_video_stats(OutputStream *ost, int frame_size); static int64_t getutime(void); static int64_t getmaxrss(void); static int run_as_daemon = 0; static int nb_frames_dup = 0; static int nb_frames_drop = 0; static int64_t decode_error_stat[2]; static int want_sdp = 1; static int current_time; AVIOContext *progress_avio = NULL; static uint8_t *subtitle_out; InputStream **input_streams = NULL; int nb_input_streams = 0; InputFile **input_files = NULL; int nb_input_files = 0; OutputStream **output_streams = NULL; int nb_output_streams = 0; OutputFile **output_files = NULL; int nb_output_files = 0; FilterGraph **filtergraphs; int nb_filtergraphs; #if HAVE_TERMIOS_H /* init terminal so that we can grab keys */ static struct termios oldtty; static int restore_tty; #endif #if HAVE_PTHREADS static void free_input_threads(void); #endif /* sub2video hack: Convert subtitles to video with alpha to insert them in filter graphs. This is a temporary solution until libavfilter gets real subtitles support. */ static int sub2video_get_blank_frame(InputStream *ist) { int ret; AVFrame *frame = ist->sub2video.frame; av_frame_unref(frame); ist->sub2video.frame->width = ist->dec_ctx->width ? ist->dec_ctx->width : ist->sub2video.w; ist->sub2video.frame->height = ist->dec_ctx->height ? 
ist->dec_ctx->height : ist->sub2video.h; ist->sub2video.frame->format = AV_PIX_FMT_RGB32; if ((ret = av_frame_get_buffer(frame, 32)) < 0) return ret; memset(frame->data[0], 0, frame->height * frame->linesize[0]); return 0; } static void sub2video_copy_rect(uint8_t *dst, int dst_linesize, int w, int h, AVSubtitleRect *r) { uint32_t *pal, *dst2; uint8_t *src, *src2; int x, y; if (r->type != SUBTITLE_BITMAP) { av_log(NULL, AV_LOG_WARNING, "sub2video: non-bitmap subtitle\n"); return; } if (r->x < 0 || r->x + r->w > w || r->y < 0 || r->y + r->h > h) { av_log(NULL, AV_LOG_WARNING, "sub2video: rectangle (%d %d %d %d) overflowing %d %d\n", r->x, r->y, r->w, r->h, w, h ); return; } dst += r->y * dst_linesize + r->x * 4; src = r->data[0]; pal = (uint32_t *)r->data[1]; for (y = 0; y < r->h; y++) { dst2 = (uint32_t *)dst; src2 = src; for (x = 0; x < r->w; x++) *(dst2++) = pal[*(src2++)]; dst += dst_linesize; src += r->linesize[0]; } } static void sub2video_push_ref(InputStream *ist, int64_t pts) { AVFrame *frame = ist->sub2video.frame; int i; av_assert1(frame->data[0]); ist->sub2video.last_pts = frame->pts = pts; for (i = 0; i < ist->nb_filters; i++) av_buffersrc_add_frame_flags(ist->filters[i]->filter, frame, AV_BUFFERSRC_FLAG_KEEP_REF | AV_BUFFERSRC_FLAG_PUSH); } static void sub2video_update(InputStream *ist, AVSubtitle *sub) { AVFrame *frame = ist->sub2video.frame; int8_t *dst; int dst_linesize; int num_rects, i; int64_t pts, end_pts; if (!frame) return; if (sub) { pts = av_rescale_q(sub->pts + sub->start_display_time * 1000LL, AV_TIME_BASE_Q, ist->st->time_base); end_pts = av_rescale_q(sub->pts + sub->end_display_time * 1000LL, AV_TIME_BASE_Q, ist->st->time_base); num_rects = sub->num_rects; } else { pts = ist->sub2video.end_pts; end_pts = INT64_MAX; num_rects = 0; } if (sub2video_get_blank_frame(ist) < 0) { av_log(ist->dec_ctx, AV_LOG_ERROR, "Impossible to get a blank canvas.\n"); return; } dst = frame->data [0]; dst_linesize = frame->linesize[0]; for (i = 0; i < 
num_rects; i++) sub2video_copy_rect(dst, dst_linesize, frame->width, frame->height, sub->rects[i]); sub2video_push_ref(ist, pts); ist->sub2video.end_pts = end_pts; } static void sub2video_heartbeat(InputStream *ist, int64_t pts) { InputFile *infile = input_files[ist->file_index]; int i, j, nb_reqs; int64_t pts2; /* When a frame is read from a file, examine all sub2video streams in the same file and send the sub2video frame again. Otherwise, decoded video frames could be accumulating in the filter graph while a filter (possibly overlay) is desperately waiting for a subtitle frame. */ for (i = 0; i < infile->nb_streams; i++) { InputStream *ist2 = input_streams[infile->ist_index + i]; if (!ist2->sub2video.frame) continue; /* subtitles seem to be usually muxed ahead of other streams; if not, subtracting a larger time here is necessary */ pts2 = av_rescale_q(pts, ist->st->time_base, ist2->st->time_base) - 1; /* do not send the heartbeat frame if the subtitle is already ahead */ if (pts2 <= ist2->sub2video.last_pts) continue; if (pts2 >= ist2->sub2video.end_pts || !ist2->sub2video.frame->data[0]) sub2video_update(ist2, NULL); for (j = 0, nb_reqs = 0; j < ist2->nb_filters; j++) nb_reqs += av_buffersrc_get_nb_failed_requests(ist2->filters[j]->filter); if (nb_reqs) sub2video_push_ref(ist2, pts2); } } static void sub2video_flush(InputStream *ist) { int i; if (ist->sub2video.end_pts < INT64_MAX) sub2video_update(ist, NULL); for (i = 0; i < ist->nb_filters; i++) av_buffersrc_add_frame(ist->filters[i]->filter, NULL); } /* end of sub2video hack */ static void term_exit_sigsafe(void) { #if HAVE_TERMIOS_H if(restore_tty) tcsetattr (0, TCSANOW, &oldtty); #endif } void term_exit(void) { av_log(NULL, AV_LOG_QUIET, "%s", ""); term_exit_sigsafe(); } static volatile int received_sigterm = 0; static volatile int received_nb_signals = 0; static volatile int transcode_init_done = 0; static volatile int ffmpeg_exited = 0; static int main_return_code = 0; static void sigterm_handler(int sig) 
{ received_sigterm = sig; received_nb_signals++; term_exit_sigsafe(); if(received_nb_signals > 3) { write(2/*STDERR_FILENO*/, "Received > 3 system signals, hard exiting\n", strlen("Received > 3 system signals, hard exiting\n")); exit(123); } } #if HAVE_SETCONSOLECTRLHANDLER static BOOL WINAPI CtrlHandler(DWORD fdwCtrlType) { av_log(NULL, AV_LOG_DEBUG, "\nReceived windows signal %ld\n", fdwCtrlType); switch (fdwCtrlType) { case CTRL_C_EVENT: case CTRL_BREAK_EVENT: sigterm_handler(SIGINT); return TRUE; case CTRL_CLOSE_EVENT: case CTRL_LOGOFF_EVENT: case CTRL_SHUTDOWN_EVENT: sigterm_handler(SIGTERM); /* Basically, with these 3 events, when we return from this method the process is hard terminated, so stall as long as we need to to try and let the main thread(s) clean up and gracefully terminate (we have at most 5 seconds, but should be done far before that). */ while (!ffmpeg_exited) { Sleep(0); } return TRUE; default: av_log(NULL, AV_LOG_ERROR, "Received unknown windows signal %ld\n", fdwCtrlType); return FALSE; } } #endif void term_init(void) { #if HAVE_TERMIOS_H if (!run_as_daemon && stdin_interaction) { struct termios tty; if (tcgetattr (0, &tty) == 0) { oldtty = tty; restore_tty = 1; tty.c_iflag &= ~(IGNBRK|BRKINT|PARMRK|ISTRIP |INLCR|IGNCR|ICRNL|IXON); tty.c_oflag |= OPOST; tty.c_lflag &= ~(ECHO|ECHONL|ICANON|IEXTEN); tty.c_cflag &= ~(CSIZE|PARENB); tty.c_cflag |= CS8; tty.c_cc[VMIN] = 1; tty.c_cc[VTIME] = 0; tcsetattr (0, TCSANOW, &tty); } signal(SIGQUIT, sigterm_handler); /* Quit (POSIX). */ } #endif signal(SIGINT , sigterm_handler); /* Interrupt (ANSI). */ signal(SIGTERM, sigterm_handler); /* Termination (ANSI). 
*/ #ifdef SIGXCPU signal(SIGXCPU, sigterm_handler); #endif #if HAVE_SETCONSOLECTRLHANDLER SetConsoleCtrlHandler((PHANDLER_ROUTINE) CtrlHandler, TRUE); #endif } /* read a key without blocking */ static int read_key(void) { unsigned char ch; #if HAVE_TERMIOS_H int n = 1; struct timeval tv; fd_set rfds; FD_ZERO(&rfds); FD_SET(0, &rfds); tv.tv_sec = 0; tv.tv_usec = 0; n = select(1, &rfds, NULL, NULL, &tv); if (n > 0) { n = read(0, &ch, 1); if (n == 1) return ch; return n; } #elif HAVE_KBHIT # if HAVE_PEEKNAMEDPIPE static int is_pipe; static HANDLE input_handle; DWORD dw, nchars; if(!input_handle){ input_handle = GetStdHandle(STD_INPUT_HANDLE); is_pipe = !GetConsoleMode(input_handle, &dw); } if (is_pipe) { /* When running under a GUI, you will end here. */ if (!PeekNamedPipe(input_handle, NULL, 0, NULL, &nchars, NULL)) { // input pipe may have been closed by the program that ran ffmpeg return -1; } //Read it if(nchars != 0) { read(0, &ch, 1); return ch; }else{ return -1; } } # endif if(kbhit()) return(getch()); #endif return -1; } static int decode_interrupt_cb(void *ctx) { return received_nb_signals > transcode_init_done; } const AVIOInterruptCB int_cb = { decode_interrupt_cb, NULL }; static void ffmpeg_cleanup(int ret) { int i, j; if (do_benchmark) { int maxrss = getmaxrss() / 1024; av_log(NULL, AV_LOG_INFO, "bench: maxrss=%ikB\n", maxrss); } for (i = 0; i < nb_filtergraphs; i++) { FilterGraph *fg = filtergraphs[i]; avfilter_graph_free(&fg->graph); for (j = 0; j < fg->nb_inputs; j++) { av_freep(&fg->inputs[j]->name); av_freep(&fg->inputs[j]); } av_freep(&fg->inputs); for (j = 0; j < fg->nb_outputs; j++) { av_freep(&fg->outputs[j]->name); av_freep(&fg->outputs[j]); } av_freep(&fg->outputs); av_freep(&fg->graph_desc); av_freep(&filtergraphs[i]); } av_freep(&filtergraphs); av_freep(&subtitle_out); /* close files */ for (i = 0; i < nb_output_files; i++) { OutputFile *of = output_files[i]; AVFormatContext *s; if (!of) continue; s = of->ctx; if (s && s->oformat && 
!(s->oformat->flags & AVFMT_NOFILE)) avio_closep(&s->pb); avformat_free_context(s); av_dict_free(&of->opts); av_freep(&output_files[i]); } for (i = 0; i < nb_output_streams; i++) { OutputStream *ost = output_streams[i]; if (!ost) continue; for (j = 0; j < ost->nb_bitstream_filters; j++) av_bsf_free(&ost->bsf_ctx[j]); av_freep(&ost->bsf_ctx); av_freep(&ost->bsf_extradata_updated); av_frame_free(&ost->filtered_frame); av_frame_free(&ost->last_frame); av_dict_free(&ost->encoder_opts); av_parser_close(ost->parser); avcodec_free_context(&ost->parser_avctx); av_freep(&ost->forced_keyframes); av_expr_free(ost->forced_keyframes_pexpr); av_freep(&ost->avfilter); av_freep(&ost->logfile_prefix); av_freep(&ost->audio_channels_map); ost->audio_channels_mapped = 0; av_dict_free(&ost->sws_dict); avcodec_free_context(&ost->enc_ctx); avcodec_parameters_free(&ost->ref_par); while (ost->muxing_queue && av_fifo_size(ost->muxing_queue)) { AVPacket pkt; av_fifo_generic_read(ost->muxing_queue, &pkt, sizeof(pkt), NULL); av_packet_unref(&pkt); } av_fifo_freep(&ost->muxing_queue); av_freep(&output_streams[i]); } #if HAVE_PTHREADS free_input_threads(); #endif for (i = 0; i < nb_input_files; i++) { avformat_close_input(&input_files[i]->ctx); av_freep(&input_files[i]); } for (i = 0; i < nb_input_streams; i++) { InputStream *ist = input_streams[i]; av_frame_free(&ist->decoded_frame); av_frame_free(&ist->filter_frame); av_dict_free(&ist->decoder_opts); avsubtitle_free(&ist->prev_sub.subtitle); av_frame_free(&ist->sub2video.frame); av_freep(&ist->filters); av_freep(&ist->hwaccel_device); av_freep(&ist->dts_buffer); avcodec_free_context(&ist->dec_ctx); av_freep(&input_streams[i]); } if (vstats_file) { if (fclose(vstats_file)) av_log(NULL, AV_LOG_ERROR, "Error closing vstats file, loss of information possible: %s\n", av_err2str(AVERROR(errno))); } av_freep(&vstats_filename); av_freep(&input_streams); av_freep(&input_files); av_freep(&output_streams); av_freep(&output_files); uninit_opts(); 
avformat_network_deinit(); if (received_sigterm) { av_log(NULL, AV_LOG_INFO, "Exiting normally, received signal %d.\n", (int) received_sigterm); } else if (ret && transcode_init_done) { av_log(NULL, AV_LOG_INFO, "Conversion failed!\n"); } term_exit(); ffmpeg_exited = 1; } void remove_avoptions(AVDictionary **a, AVDictionary *b) { AVDictionaryEntry *t = NULL; while ((t = av_dict_get(b, "", t, AV_DICT_IGNORE_SUFFIX))) { av_dict_set(a, t->key, NULL, AV_DICT_MATCH_CASE); } } void assert_avoptions(AVDictionary *m) { AVDictionaryEntry *t; if ((t = av_dict_get(m, "", NULL, AV_DICT_IGNORE_SUFFIX))) { av_log(NULL, AV_LOG_FATAL, "Option %s not found.\n", t->key); exit_program(1); } } static void abort_codec_experimental(AVCodec *c, int encoder) { exit_program(1); } static void update_benchmark(const char *fmt, ...) { if (do_benchmark_all) { int64_t t = getutime(); va_list va; char buf[1024]; if (fmt) { va_start(va, fmt); vsnprintf(buf, sizeof(buf), fmt, va); va_end(va); av_log(NULL, AV_LOG_INFO, "bench: %8"PRIu64" %s \n", t - current_time, buf); } current_time = t; } } static void close_all_output_streams(OutputStream *ost, OSTFinished this_stream, OSTFinished others) { int i; for (i = 0; i < nb_output_streams; i++) { OutputStream *ost2 = output_streams[i]; ost2->finished |= ost == ost2 ? 
this_stream : others; } } static void write_packet(OutputFile *of, AVPacket *pkt, OutputStream *ost) { AVFormatContext *s = of->ctx; AVStream *st = ost->st; int ret; if (!of->header_written) { AVPacket tmp_pkt; /* the muxer is not initialized yet, buffer the packet */ if (!av_fifo_space(ost->muxing_queue)) { int new_size = FFMIN(2 * av_fifo_size(ost->muxing_queue), ost->max_muxing_queue_size); if (new_size <= av_fifo_size(ost->muxing_queue)) { av_log(NULL, AV_LOG_ERROR, "Too many packets buffered for output stream %d:%d.\n", ost->file_index, ost->st->index); exit_program(1); } ret = av_fifo_realloc2(ost->muxing_queue, new_size); if (ret < 0) exit_program(1); } av_packet_move_ref(&tmp_pkt, pkt); av_fifo_generic_write(ost->muxing_queue, &tmp_pkt, sizeof(tmp_pkt), NULL); return; } if ((st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO && video_sync_method == VSYNC_DROP) || (st->codecpar->codec_type == AVMEDIA_TYPE_AUDIO && audio_sync_method < 0)) pkt->pts = pkt->dts = AV_NOPTS_VALUE; /* * Audio encoders may split the packets -- #frames in != #packets out. * But there is no reordering, so we can limit the number of output packets * by simply dropping them here. * Counting encoded video frames needs to be done separately because of * reordering, see do_video_out() */ if (!(st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO && ost->encoding_needed)) { if (ost->frame_number >= ost->max_frames) { av_packet_unref(pkt); return; } ost->frame_number++; } if (st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO) { int i; uint8_t *sd = av_packet_get_side_data(pkt, AV_PKT_DATA_QUALITY_STATS, NULL); ost->quality = sd ? AV_RL32(sd) : -1; ost->pict_type = sd ? 
sd[4] : AV_PICTURE_TYPE_NONE; for (i = 0; i<FF_ARRAY_ELEMS(ost->error); i++) { if (sd && i < sd[5]) ost->error[i] = AV_RL64(sd + 8 + 8*i); else ost->error[i] = -1; } if (ost->frame_rate.num && ost->is_cfr) { if (pkt->duration > 0) av_log(NULL, AV_LOG_WARNING, "Overriding packet duration by frame rate, this should not happen\n"); pkt->duration = av_rescale_q(1, av_inv_q(ost->frame_rate), ost->st->time_base); } } if (!(s->oformat->flags & AVFMT_NOTIMESTAMPS)) { if (pkt->dts != AV_NOPTS_VALUE && pkt->pts != AV_NOPTS_VALUE && pkt->dts > pkt->pts) { av_log(s, AV_LOG_WARNING, "Invalid DTS: %"PRId64" PTS: %"PRId64" in output stream %d:%d, replacing by guess\n", pkt->dts, pkt->pts, ost->file_index, ost->st->index); pkt->pts = pkt->dts = pkt->pts + pkt->dts + ost->last_mux_dts + 1 - FFMIN3(pkt->pts, pkt->dts, ost->last_mux_dts + 1) - FFMAX3(pkt->pts, pkt->dts, ost->last_mux_dts + 1); } if ((st->codecpar->codec_type == AVMEDIA_TYPE_AUDIO || st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO) && pkt->dts != AV_NOPTS_VALUE && !(st->codecpar->codec_id == AV_CODEC_ID_VP9 && ost->stream_copy) && ost->last_mux_dts != AV_NOPTS_VALUE) { int64_t max = ost->last_mux_dts + !(s->oformat->flags & AVFMT_TS_NONSTRICT); if (pkt->dts < max) { int loglevel = max - pkt->dts > 2 || st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO ? AV_LOG_WARNING : AV_LOG_DEBUG; av_log(s, loglevel, "Non-monotonous DTS in output stream " "%d:%d; previous: %"PRId64", current: %"PRId64"; ", ost->file_index, ost->st->index, ost->last_mux_dts, pkt->dts); if (exit_on_error) { av_log(NULL, AV_LOG_FATAL, "aborting.\n"); exit_program(1); } av_log(s, loglevel, "changing to %"PRId64". 
This may result " "in incorrect timestamps in the output file.\n", max); if (pkt->pts >= pkt->dts) pkt->pts = FFMAX(pkt->pts, max); pkt->dts = max; } } } ost->last_mux_dts = pkt->dts; ost->data_size += pkt->size; ost->packets_written++; pkt->stream_index = ost->index; if (debug_ts) { av_log(NULL, AV_LOG_INFO, "muxer <- type:%s " "pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s size:%d\n", av_get_media_type_string(ost->enc_ctx->codec_type), av_ts2str(pkt->pts), av_ts2timestr(pkt->pts, &ost->st->time_base), av_ts2str(pkt->dts), av_ts2timestr(pkt->dts, &ost->st->time_base), pkt->size ); } ret = av_interleaved_write_frame(s, pkt); if (ret < 0) { print_error("av_interleaved_write_frame()", ret); main_return_code = 1; close_all_output_streams(ost, MUXER_FINISHED | ENCODER_FINISHED, ENCODER_FINISHED); } av_packet_unref(pkt); } static void close_output_stream(OutputStream *ost) { OutputFile *of = output_files[ost->file_index]; ost->finished |= ENCODER_FINISHED; if (of->shortest) { int64_t end = av_rescale_q(ost->sync_opts - ost->first_pts, ost->enc_ctx->time_base, AV_TIME_BASE_Q); of->recording_time = FFMIN(of->recording_time, end); } } static void output_packet(OutputFile *of, AVPacket *pkt, OutputStream *ost) { int ret = 0; /* apply the output bitstream filters, if any */ if (ost->nb_bitstream_filters) { int idx; av_packet_split_side_data(pkt); ret = av_bsf_send_packet(ost->bsf_ctx[0], pkt); if (ret < 0) goto finish; idx = 1; while (idx) { /* get a packet from the previous filter up the chain */ ret = av_bsf_receive_packet(ost->bsf_ctx[idx - 1], pkt); /* HACK! - aac_adtstoasc updates extradata after filtering the first frame when * the api states this shouldn't happen after init(). Propagate it here to the * muxer and to the next filters in the chain to workaround this. * TODO/FIXME - Make aac_adtstoasc use new packet side data instead of changing * par_out->extradata and adapt muxers accordingly to get rid of this. 
*/ if (!(ost->bsf_extradata_updated[idx - 1] & 1)) { ret = avcodec_parameters_copy(ost->st->codecpar, ost->bsf_ctx[idx - 1]->par_out); if (ret < 0) goto finish; ost->bsf_extradata_updated[idx - 1] |= 1; } if (ret == AVERROR(EAGAIN)) { ret = 0; idx--; continue; } else if (ret < 0) goto finish; /* send it to the next filter down the chain or to the muxer */ if (idx < ost->nb_bitstream_filters) { /* HACK/FIXME! - See above */ if (!(ost->bsf_extradata_updated[idx] & 2)) { ret = avcodec_parameters_copy(ost->bsf_ctx[idx]->par_out, ost->bsf_ctx[idx - 1]->par_out); if (ret < 0) goto finish; ost->bsf_extradata_updated[idx] |= 2; } ret = av_bsf_send_packet(ost->bsf_ctx[idx], pkt); if (ret < 0) goto finish; idx++; } else write_packet(of, pkt, ost); } } else write_packet(of, pkt, ost); finish: if (ret < 0 && ret != AVERROR_EOF) { av_log(NULL, AV_LOG_ERROR, "Error applying bitstream filters to an output " "packet for stream #%d:%d.\n", ost->file_index, ost->index); if(exit_on_error) exit_program(1); } } static int check_recording_time(OutputStream *ost) { OutputFile *of = output_files[ost->file_index]; if (of->recording_time != INT64_MAX && av_compare_ts(ost->sync_opts - ost->first_pts, ost->enc_ctx->time_base, of->recording_time, AV_TIME_BASE_Q) >= 0) { close_output_stream(ost); return 0; } return 1; } static void do_audio_out(OutputFile *of, OutputStream *ost, AVFrame *frame) { AVCodecContext *enc = ost->enc_ctx; AVPacket pkt; int ret; av_init_packet(&pkt); pkt.data = NULL; pkt.size = 0; if (!check_recording_time(ost)) return; if (frame->pts == AV_NOPTS_VALUE || audio_sync_method < 0) frame->pts = ost->sync_opts; ost->sync_opts = frame->pts + frame->nb_samples; ost->samples_encoded += frame->nb_samples; ost->frames_encoded++; av_assert0(pkt.size || !pkt.data); update_benchmark(NULL); if (debug_ts) { av_log(NULL, AV_LOG_INFO, "encoder <- type:audio " "frame_pts:%s frame_pts_time:%s time_base:%d/%d\n", av_ts2str(frame->pts), av_ts2timestr(frame->pts, &enc->time_base), 
               enc->time_base.num, enc->time_base.den);
    }

    ret = avcodec_send_frame(enc, frame);
    if (ret < 0)
        goto error;

    while (1) {
        ret = avcodec_receive_packet(enc, &pkt);
        if (ret == AVERROR(EAGAIN))
            break;
        if (ret < 0)
            goto error;

        update_benchmark("encode_audio %d.%d", ost->file_index, ost->index);

        av_packet_rescale_ts(&pkt, enc->time_base, ost->st->time_base);

        if (debug_ts) {
            av_log(NULL, AV_LOG_INFO, "encoder -> type:audio "
                   "pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s\n",
                   av_ts2str(pkt.pts), av_ts2timestr(pkt.pts, &ost->st->time_base),
                   av_ts2str(pkt.dts), av_ts2timestr(pkt.dts, &ost->st->time_base));
        }

        output_packet(of, &pkt, ost);
    }

    return;
error:
    av_log(NULL, AV_LOG_FATAL, "Audio encoding failed\n");
    exit_program(1);
}

static void do_subtitle_out(OutputFile *of,
                            OutputStream *ost,
                            AVSubtitle *sub)
{
    int subtitle_out_max_size = 1024 * 1024;
    int subtitle_out_size, nb, i;
    AVCodecContext *enc;
    AVPacket pkt;
    int64_t pts;

    if (sub->pts == AV_NOPTS_VALUE) {
        av_log(NULL, AV_LOG_ERROR, "Subtitle packets must have a pts\n");
        if (exit_on_error)
            exit_program(1);
        return;
    }

    enc = ost->enc_ctx;

    if (!subtitle_out) {
        subtitle_out = av_malloc(subtitle_out_max_size);
        if (!subtitle_out) {
            av_log(NULL, AV_LOG_FATAL, "Failed to allocate subtitle_out\n");
            exit_program(1);
        }
    }

    /* Note: DVB subtitles need one packet to draw them and another packet
       to clear them */
    /* XXX: signal it in the codec context ?
*/ if (enc->codec_id == AV_CODEC_ID_DVB_SUBTITLE) nb = 2; else nb = 1; /* shift timestamp to honor -ss and make check_recording_time() work with -t */ pts = sub->pts; if (output_files[ost->file_index]->start_time != AV_NOPTS_VALUE) pts -= output_files[ost->file_index]->start_time; for (i = 0; i < nb; i++) { unsigned save_num_rects = sub->num_rects; ost->sync_opts = av_rescale_q(pts, AV_TIME_BASE_Q, enc->time_base); if (!check_recording_time(ost)) return; sub->pts = pts; // start_display_time is required to be 0 sub->pts += av_rescale_q(sub->start_display_time, (AVRational){ 1, 1000 }, AV_TIME_BASE_Q); sub->end_display_time -= sub->start_display_time; sub->start_display_time = 0; if (i == 1) sub->num_rects = 0; ost->frames_encoded++; subtitle_out_size = avcodec_encode_subtitle(enc, subtitle_out, subtitle_out_max_size, sub); if (i == 1) sub->num_rects = save_num_rects; if (subtitle_out_size < 0) { av_log(NULL, AV_LOG_FATAL, "Subtitle encoding failed\n"); exit_program(1); } av_init_packet(&pkt); pkt.data = subtitle_out; pkt.size = subtitle_out_size; pkt.pts = av_rescale_q(sub->pts, AV_TIME_BASE_Q, ost->st->time_base); pkt.duration = av_rescale_q(sub->end_display_time, (AVRational){ 1, 1000 }, ost->st->time_base); if (enc->codec_id == AV_CODEC_ID_DVB_SUBTITLE) { /* XXX: the pts correction is handled here. 
Maybe handling it in the codec would be better */ if (i == 0) pkt.pts += 90 * sub->start_display_time; else pkt.pts += 90 * sub->end_display_time; } pkt.dts = pkt.pts; output_packet(of, &pkt, ost); } } static void do_video_out(OutputFile *of, OutputStream *ost, AVFrame *next_picture, double sync_ipts) { int ret, format_video_sync; AVPacket pkt; AVCodecContext *enc = ost->enc_ctx; AVCodecParameters *mux_par = ost->st->codecpar; int nb_frames, nb0_frames, i; double delta, delta0; double duration = 0; int frame_size = 0; InputStream *ist = NULL; AVFilterContext *filter = ost->filter->filter; if (ost->source_index >= 0) ist = input_streams[ost->source_index]; if (filter->inputs[0]->frame_rate.num > 0 && filter->inputs[0]->frame_rate.den > 0) duration = 1/(av_q2d(filter->inputs[0]->frame_rate) * av_q2d(enc->time_base)); if(ist && ist->st->start_time != AV_NOPTS_VALUE && ist->st->first_dts != AV_NOPTS_VALUE && ost->frame_rate.num) duration = FFMIN(duration, 1/(av_q2d(ost->frame_rate) * av_q2d(enc->time_base))); if (!ost->filters_script && !ost->filters && next_picture && ist && lrintf(av_frame_get_pkt_duration(next_picture) * av_q2d(ist->st->time_base) / av_q2d(enc->time_base)) > 0) { duration = lrintf(av_frame_get_pkt_duration(next_picture) * av_q2d(ist->st->time_base) / av_q2d(enc->time_base)); } if (!next_picture) { //end, flushing nb0_frames = nb_frames = mid_pred(ost->last_nb0_frames[0], ost->last_nb0_frames[1], ost->last_nb0_frames[2]); } else { delta0 = sync_ipts - ost->sync_opts; // delta0 is the "drift" between the input frame (next_picture) and where it would fall in the output. 
delta = delta0 + duration; /* by default, we output a single frame */ nb0_frames = 0; // tracks the number of times the PREVIOUS frame should be duplicated, mostly for variable framerate (VFR) nb_frames = 1; format_video_sync = video_sync_method; if (format_video_sync == VSYNC_AUTO) { if(!strcmp(of->ctx->oformat->name, "avi")) { format_video_sync = VSYNC_VFR; } else format_video_sync = (of->ctx->oformat->flags & AVFMT_VARIABLE_FPS) ? ((of->ctx->oformat->flags & AVFMT_NOTIMESTAMPS) ? VSYNC_PASSTHROUGH : VSYNC_VFR) : VSYNC_CFR; if ( ist && format_video_sync == VSYNC_CFR && input_files[ist->file_index]->ctx->nb_streams == 1 && input_files[ist->file_index]->input_ts_offset == 0) { format_video_sync = VSYNC_VSCFR; } if (format_video_sync == VSYNC_CFR && copy_ts) { format_video_sync = VSYNC_VSCFR; } } ost->is_cfr = (format_video_sync == VSYNC_CFR || format_video_sync == VSYNC_VSCFR); if (delta0 < 0 && delta > 0 && format_video_sync != VSYNC_PASSTHROUGH && format_video_sync != VSYNC_DROP) { if (delta0 < -0.6) { av_log(NULL, AV_LOG_WARNING, "Past duration %f too large\n", -delta0); } else av_log(NULL, AV_LOG_DEBUG, "Clipping frame in rate conversion by %f\n", -delta0); sync_ipts = ost->sync_opts; duration += delta0; delta0 = 0; } switch (format_video_sync) { case VSYNC_VSCFR: if (ost->frame_number == 0 && delta0 >= 0.5) { av_log(NULL, AV_LOG_DEBUG, "Not duplicating %d initial frames\n", (int)lrintf(delta0)); delta = duration; delta0 = 0; ost->sync_opts = lrint(sync_ipts); } case VSYNC_CFR: // FIXME set to 0.5 after we fix some dts/pts bugs like in avidec.c if (frame_drop_threshold && delta < frame_drop_threshold && ost->frame_number) { nb_frames = 0; } else if (delta < -1.1) nb_frames = 0; else if (delta > 1.1) { nb_frames = lrintf(delta); if (delta0 > 1.1) nb0_frames = lrintf(delta0 - 0.6); } break; case VSYNC_VFR: if (delta <= -0.6) nb_frames = 0; else if (delta > 0.6) ost->sync_opts = lrint(sync_ipts); break; case VSYNC_DROP: case VSYNC_PASSTHROUGH: ost->sync_opts = 
lrint(sync_ipts); break; default: av_assert0(0); } } nb_frames = FFMIN(nb_frames, ost->max_frames - ost->frame_number); nb0_frames = FFMIN(nb0_frames, nb_frames); memmove(ost->last_nb0_frames + 1, ost->last_nb0_frames, sizeof(ost->last_nb0_frames[0]) * (FF_ARRAY_ELEMS(ost->last_nb0_frames) - 1)); ost->last_nb0_frames[0] = nb0_frames; if (nb0_frames == 0 && ost->last_dropped) { nb_frames_drop++; av_log(NULL, AV_LOG_VERBOSE, "*** dropping frame %d from stream %d at ts %"PRId64"\n", ost->frame_number, ost->st->index, ost->last_frame->pts); } if (nb_frames > (nb0_frames && ost->last_dropped) + (nb_frames > nb0_frames)) { if (nb_frames > dts_error_threshold * 30) { av_log(NULL, AV_LOG_ERROR, "%d frame duplication too large, skipping\n", nb_frames - 1); nb_frames_drop++; return; } nb_frames_dup += nb_frames - (nb0_frames && ost->last_dropped) - (nb_frames > nb0_frames); av_log(NULL, AV_LOG_VERBOSE, "*** %d dup!\n", nb_frames - 1); } ost->last_dropped = nb_frames == nb0_frames && next_picture; /* duplicates frame if needed */ for (i = 0; i < nb_frames; i++) { AVFrame *in_picture; av_init_packet(&pkt); pkt.data = NULL; pkt.size = 0; if (i < nb0_frames && ost->last_frame) { in_picture = ost->last_frame; } else in_picture = next_picture; if (!in_picture) return; in_picture->pts = ost->sync_opts; #if 1 if (!check_recording_time(ost)) #else if (ost->frame_number >= ost->max_frames) #endif return; #if FF_API_LAVF_FMT_RAWPICTURE if (of->ctx->oformat->flags & AVFMT_RAWPICTURE && enc->codec->id == AV_CODEC_ID_RAWVIDEO) { /* raw pictures are written as AVPicture structure to avoid any copies. We support temporarily the older method. */ if (in_picture->interlaced_frame) mux_par->field_order = in_picture->top_field_first ? 
AV_FIELD_TB:AV_FIELD_BT; else mux_par->field_order = AV_FIELD_PROGRESSIVE; pkt.data = (uint8_t *)in_picture; pkt.size = sizeof(AVPicture); pkt.pts = av_rescale_q(in_picture->pts, enc->time_base, ost->st->time_base); pkt.flags |= AV_PKT_FLAG_KEY; output_packet(of, &pkt, ost); } else #endif { int forced_keyframe = 0; double pts_time; if (enc->flags & (AV_CODEC_FLAG_INTERLACED_DCT | AV_CODEC_FLAG_INTERLACED_ME) && ost->top_field_first >= 0) in_picture->top_field_first = !!ost->top_field_first; if (in_picture->interlaced_frame) { if (enc->codec->id == AV_CODEC_ID_MJPEG) mux_par->field_order = in_picture->top_field_first ? AV_FIELD_TT:AV_FIELD_BB; else mux_par->field_order = in_picture->top_field_first ? AV_FIELD_TB:AV_FIELD_BT; } else mux_par->field_order = AV_FIELD_PROGRESSIVE; in_picture->quality = enc->global_quality; in_picture->pict_type = 0; pts_time = in_picture->pts != AV_NOPTS_VALUE ? in_picture->pts * av_q2d(enc->time_base) : NAN; if (ost->forced_kf_index < ost->forced_kf_count && in_picture->pts >= ost->forced_kf_pts[ost->forced_kf_index]) { ost->forced_kf_index++; forced_keyframe = 1; } else if (ost->forced_keyframes_pexpr) { double res; ost->forced_keyframes_expr_const_values[FKF_T] = pts_time; res = av_expr_eval(ost->forced_keyframes_pexpr, ost->forced_keyframes_expr_const_values, NULL); ff_dlog(NULL, "force_key_frame: n:%f n_forced:%f prev_forced_n:%f t:%f prev_forced_t:%f -> res:%f\n", ost->forced_keyframes_expr_const_values[FKF_N], ost->forced_keyframes_expr_const_values[FKF_N_FORCED], ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_N], ost->forced_keyframes_expr_const_values[FKF_T], ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_T], res); if (res) { forced_keyframe = 1; ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_N] = ost->forced_keyframes_expr_const_values[FKF_N]; ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_T] = ost->forced_keyframes_expr_const_values[FKF_T]; 
ost->forced_keyframes_expr_const_values[FKF_N_FORCED] += 1; } ost->forced_keyframes_expr_const_values[FKF_N] += 1; } else if ( ost->forced_keyframes && !strncmp(ost->forced_keyframes, "source", 6) && in_picture->key_frame==1) { forced_keyframe = 1; } if (forced_keyframe) { in_picture->pict_type = AV_PICTURE_TYPE_I; av_log(NULL, AV_LOG_DEBUG, "Forced keyframe at time %f\n", pts_time); } update_benchmark(NULL); if (debug_ts) { av_log(NULL, AV_LOG_INFO, "encoder <- type:video " "frame_pts:%s frame_pts_time:%s time_base:%d/%d\n", av_ts2str(in_picture->pts), av_ts2timestr(in_picture->pts, &enc->time_base), enc->time_base.num, enc->time_base.den); } ost->frames_encoded++; ret = avcodec_send_frame(enc, in_picture); if (ret < 0) goto error; while (1) { ret = avcodec_receive_packet(enc, &pkt); update_benchmark("encode_video %d.%d", ost->file_index, ost->index); if (ret == AVERROR(EAGAIN)) break; if (ret < 0) goto error; if (debug_ts) { av_log(NULL, AV_LOG_INFO, "encoder -> type:video " "pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s\n", av_ts2str(pkt.pts), av_ts2timestr(pkt.pts, &enc->time_base), av_ts2str(pkt.dts), av_ts2timestr(pkt.dts, &enc->time_base)); } if (pkt.pts == AV_NOPTS_VALUE && !(enc->codec->capabilities & AV_CODEC_CAP_DELAY)) pkt.pts = ost->sync_opts; av_packet_rescale_ts(&pkt, enc->time_base, ost->st->time_base); if (debug_ts) { av_log(NULL, AV_LOG_INFO, "encoder -> type:video " "pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s\n", av_ts2str(pkt.pts), av_ts2timestr(pkt.pts, &ost->st->time_base), av_ts2str(pkt.dts), av_ts2timestr(pkt.dts, &ost->st->time_base)); } frame_size = pkt.size; output_packet(of, &pkt, ost); /* if two pass, output log */ if (ost->logfile && enc->stats_out) { fprintf(ost->logfile, "%s", enc->stats_out); } } } ost->sync_opts++; /* * For video, number of frames in == number of packets out. * But there may be reordering, so we can't throw away frames on encoder * flush, we need to limit them here, before they go into encoder. 
*/ ost->frame_number++; if (vstats_filename && frame_size) do_video_stats(ost, frame_size); } if (!ost->last_frame) ost->last_frame = av_frame_alloc(); av_frame_unref(ost->last_frame); if (next_picture && ost->last_frame) av_frame_ref(ost->last_frame, next_picture); else av_frame_free(&ost->last_frame); return; error: av_log(NULL, AV_LOG_FATAL, "Video encoding failed\n"); exit_program(1); } static double psnr(double d) { return -10.0 * log10(d); } static void do_video_stats(OutputStream *ost, int frame_size) { AVCodecContext *enc; int frame_number; double ti1, bitrate, avg_bitrate; /* this is executed just the first time do_video_stats is called */ if (!vstats_file) { vstats_file = fopen(vstats_filename, "w"); if (!vstats_file) { perror("fopen"); exit_program(1); } } enc = ost->enc_ctx; if (enc->codec_type == AVMEDIA_TYPE_VIDEO) { frame_number = ost->st->nb_frames; fprintf(vstats_file, "frame= %5d q= %2.1f ", frame_number, ost->quality / (float)FF_QP2LAMBDA); if (ost->error[0]>=0 && (enc->flags & AV_CODEC_FLAG_PSNR)) fprintf(vstats_file, "PSNR= %6.2f ", psnr(ost->error[0] / (enc->width * enc->height * 255.0 * 255.0))); fprintf(vstats_file,"f_size= %6d ", frame_size); /* compute pts value */ ti1 = av_stream_get_end_pts(ost->st) * av_q2d(ost->st->time_base); if (ti1 < 0.01) ti1 = 0.01; bitrate = (frame_size * 8) / av_q2d(enc->time_base) / 1000.0; avg_bitrate = (double)(ost->data_size * 8) / ti1 / 1000.0; fprintf(vstats_file, "s_size= %8.0fkB time= %0.3f br= %7.1fkbits/s avg_br= %7.1fkbits/s ", (double)ost->data_size / 1024, ti1, bitrate, avg_bitrate); fprintf(vstats_file, "type= %c\n", av_get_picture_type_char(ost->pict_type)); } } static void finish_output_stream(OutputStream *ost) { OutputFile *of = output_files[ost->file_index]; int i; ost->finished = ENCODER_FINISHED | MUXER_FINISHED; if (of->shortest) { for (i = 0; i < of->ctx->nb_streams; i++) output_streams[of->ost_index + i]->finished = ENCODER_FINISHED | MUXER_FINISHED; } } /** * Get and encode new output 
from any of the filtergraphs, without causing * activity. * * @return 0 for success, <0 for severe errors */ static int reap_filters(int flush) { AVFrame *filtered_frame = NULL; int i; /* Reap all buffers present in the buffer sinks */ for (i = 0; i < nb_output_streams; i++) { OutputStream *ost = output_streams[i]; OutputFile *of = output_files[ost->file_index]; AVFilterContext *filter; AVCodecContext *enc = ost->enc_ctx; int ret = 0; if (!ost->filter) continue; filter = ost->filter->filter; if (!ost->filtered_frame && !(ost->filtered_frame = av_frame_alloc())) { return AVERROR(ENOMEM); } filtered_frame = ost->filtered_frame; while (1) { double float_pts = AV_NOPTS_VALUE; // this is identical to filtered_frame.pts but with higher precision ret = av_buffersink_get_frame_flags(filter, filtered_frame, AV_BUFFERSINK_FLAG_NO_REQUEST); if (ret < 0) { if (ret != AVERROR(EAGAIN) && ret != AVERROR_EOF) { av_log(NULL, AV_LOG_WARNING, "Error in av_buffersink_get_frame_flags(): %s\n", av_err2str(ret)); } else if (flush && ret == AVERROR_EOF) { if (filter->inputs[0]->type == AVMEDIA_TYPE_VIDEO) do_video_out(of, ost, NULL, AV_NOPTS_VALUE); } break; } if (ost->finished) { av_frame_unref(filtered_frame); continue; } if (filtered_frame->pts != AV_NOPTS_VALUE) { int64_t start_time = (of->start_time == AV_NOPTS_VALUE) ? 
                                       0 : of->start_time;
                AVRational tb = enc->time_base;
                int extra_bits = av_clip(29 - av_log2(tb.den), 0, 16);

                tb.den <<= extra_bits;
                float_pts =
                    av_rescale_q(filtered_frame->pts, filter->inputs[0]->time_base, tb) -
                    av_rescale_q(start_time, AV_TIME_BASE_Q, tb);
                float_pts /= 1 << extra_bits;
                // avoid exact midpoints to reduce the chance of rounding differences;
                // this can be removed once the fps code is changed to work with integers
                float_pts += FFSIGN(float_pts) * 1.0 / (1<<17);

                filtered_frame->pts =
                    av_rescale_q(filtered_frame->pts, filter->inputs[0]->time_base, enc->time_base) -
                    av_rescale_q(start_time, AV_TIME_BASE_Q, enc->time_base);
            }
            //if (ost->source_index >= 0)
            //    *filtered_frame= *input_streams[ost->source_index]->decoded_frame; //for me_threshold

            switch (filter->inputs[0]->type) {
            case AVMEDIA_TYPE_VIDEO:
                if (!ost->frame_aspect_ratio.num)
                    enc->sample_aspect_ratio = filtered_frame->sample_aspect_ratio;

                if (debug_ts) {
                    av_log(NULL, AV_LOG_INFO, "filter -> pts:%s pts_time:%s exact:%f time_base:%d/%d\n",
                            av_ts2str(filtered_frame->pts), av_ts2timestr(filtered_frame->pts, &enc->time_base),
                            float_pts,
                            enc->time_base.num, enc->time_base.den);
                }

                do_video_out(of, ost, filtered_frame, float_pts);
                break;
            case AVMEDIA_TYPE_AUDIO:
                if (!(enc->codec->capabilities & AV_CODEC_CAP_PARAM_CHANGE) &&
                    enc->channels != av_frame_get_channels(filtered_frame)) {
                    av_log(NULL, AV_LOG_ERROR,
                           "Audio filter graph output is not normalized and encoder does not support parameter changes\n");
                    break;
                }
                do_audio_out(of, ost, filtered_frame);
                break;
            default:
                // TODO support subtitle filters
                av_assert0(0);
            }

            av_frame_unref(filtered_frame);
        }
    }

    return 0;
}

static void print_final_stats(int64_t total_size)
{
    uint64_t video_size = 0, audio_size = 0, extra_size = 0, other_size = 0;
    uint64_t subtitle_size = 0;
    uint64_t data_size = 0;
    float percent = -1.0;
    int i, j;
    int pass1_used = 1;

    for (i = 0; i < nb_output_streams; i++) {
        OutputStream *ost = output_streams[i];
        switch (ost->enc_ctx->codec_type) {
        case AVMEDIA_TYPE_VIDEO:    video_size    += ost->data_size; break;
        case AVMEDIA_TYPE_AUDIO:    audio_size    += ost->data_size; break;
        case AVMEDIA_TYPE_SUBTITLE: subtitle_size += ost->data_size; break;
        default:                    other_size    += ost->data_size; break;
        }
        extra_size += ost->enc_ctx->extradata_size;
        data_size  += ost->data_size;
        if ((ost->enc_ctx->flags & (AV_CODEC_FLAG_PASS1 | AV_CODEC_FLAG_PASS2))
            != AV_CODEC_FLAG_PASS1)
            pass1_used = 0;
    }

    if (data_size && total_size > 0 && total_size >= data_size)
        percent = 100.0 * (total_size - data_size) / data_size;

    av_log(NULL, AV_LOG_INFO, "video:%1.0fkB audio:%1.0fkB subtitle:%1.0fkB other streams:%1.0fkB global headers:%1.0fkB muxing overhead: ",
           video_size / 1024.0,
           audio_size / 1024.0,
           subtitle_size / 1024.0,
           other_size / 1024.0,
           extra_size / 1024.0);
    if (percent >= 0.0)
        av_log(NULL, AV_LOG_INFO, "%f%%", percent);
    else
        av_log(NULL, AV_LOG_INFO, "unknown");
    av_log(NULL, AV_LOG_INFO, "\n");

    /* print verbose per-stream stats */
    for (i = 0; i < nb_input_files; i++) {
        InputFile *f = input_files[i];
        uint64_t total_packets = 0, total_size = 0;

        av_log(NULL, AV_LOG_VERBOSE, "Input file #%d (%s):\n",
               i, f->ctx->filename);

        for (j = 0; j < f->nb_streams; j++) {
            InputStream *ist = input_streams[f->ist_index + j];
            enum AVMediaType type = ist->dec_ctx->codec_type;

            total_size    += ist->data_size;
            total_packets += ist->nb_packets;

            av_log(NULL, AV_LOG_VERBOSE, "  Input stream #%d:%d (%s): ",
                   i, j, media_type_string(type));
            av_log(NULL, AV_LOG_VERBOSE, "%"PRIu64" packets read (%"PRIu64" bytes); ",
                   ist->nb_packets, ist->data_size);

            if (ist->decoding_needed) {
                av_log(NULL, AV_LOG_VERBOSE, "%"PRIu64" frames decoded",
                       ist->frames_decoded);
                if (type == AVMEDIA_TYPE_AUDIO)
                    av_log(NULL, AV_LOG_VERBOSE, " (%"PRIu64" samples)", ist->samples_decoded);
                av_log(NULL, AV_LOG_VERBOSE, "; ");
            }

            av_log(NULL, AV_LOG_VERBOSE, "\n");
        }

        av_log(NULL, AV_LOG_VERBOSE, "  Total: %"PRIu64" packets (%"PRIu64" bytes) demuxed\n",
               total_packets, total_size);
    }

    for (i = 0; i < nb_output_files;
i++) { OutputFile *of = output_files[i]; uint64_t total_packets = 0, total_size = 0; av_log(NULL, AV_LOG_VERBOSE, "Output file #%d (%s):\n", i, of->ctx->filename); for (j = 0; j < of->ctx->nb_streams; j++) { OutputStream *ost = output_streams[of->ost_index + j]; enum AVMediaType type = ost->enc_ctx->codec_type; total_size += ost->data_size; total_packets += ost->packets_written; av_log(NULL, AV_LOG_VERBOSE, " Output stream #%d:%d (%s): ", i, j, media_type_string(type)); if (ost->encoding_needed) { av_log(NULL, AV_LOG_VERBOSE, "%"PRIu64" frames encoded", ost->frames_encoded); if (type == AVMEDIA_TYPE_AUDIO) av_log(NULL, AV_LOG_VERBOSE, " (%"PRIu64" samples)", ost->samples_encoded); av_log(NULL, AV_LOG_VERBOSE, "; "); } av_log(NULL, AV_LOG_VERBOSE, "%"PRIu64" packets muxed (%"PRIu64" bytes); ", ost->packets_written, ost->data_size); av_log(NULL, AV_LOG_VERBOSE, "\n"); } av_log(NULL, AV_LOG_VERBOSE, " Total: %"PRIu64" packets (%"PRIu64" bytes) muxed\n", total_packets, total_size); } if(video_size + data_size + audio_size + subtitle_size + extra_size == 0){ av_log(NULL, AV_LOG_WARNING, "Output file is empty, nothing was encoded "); if (pass1_used) { av_log(NULL, AV_LOG_WARNING, "\n"); } else { av_log(NULL, AV_LOG_WARNING, "(check -ss / -t / -frames parameters if used)\n"); } } } static void print_report(int is_last_report, int64_t timer_start, int64_t cur_time) { char buf[1024]; AVBPrint buf_script; OutputStream *ost; AVFormatContext *oc; int64_t total_size; AVCodecContext *enc; int frame_number, vid, i; double bitrate; double speed; int64_t pts = INT64_MIN + 1; static int64_t last_time = -1; static int qp_histogram[52]; int hours, mins, secs, us; int ret; float t; if (!print_stats && !is_last_report && !progress_avio) return; if (!is_last_report) { if (last_time == -1) { last_time = cur_time; return; } if ((cur_time - last_time) < 500000) return; last_time = cur_time; } t = (cur_time-timer_start) / 1000000.0; oc = output_files[0]->ctx; total_size = avio_size(oc->pb); 
if (total_size <= 0) // FIXME improve avio_size() so it works with non seekable output too total_size = avio_tell(oc->pb); buf[0] = '\0'; vid = 0; av_bprint_init(&buf_script, 0, 1); for (i = 0; i < nb_output_streams; i++) { float q = -1; ost = output_streams[i]; enc = ost->enc_ctx; if (!ost->stream_copy) q = ost->quality / (float) FF_QP2LAMBDA; if (vid && enc->codec_type == AVMEDIA_TYPE_VIDEO) { snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "q=%2.1f ", q); av_bprintf(&buf_script, "stream_%d_%d_q=%.1f\n", ost->file_index, ost->index, q); } if (!vid && enc->codec_type == AVMEDIA_TYPE_VIDEO) { float fps; frame_number = ost->frame_number; fps = t > 1 ? frame_number / t : 0; snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "frame=%5d fps=%3.*f q=%3.1f ", frame_number, fps < 9.95, fps, q); av_bprintf(&buf_script, "frame=%d\n", frame_number); av_bprintf(&buf_script, "fps=%.1f\n", fps); av_bprintf(&buf_script, "stream_%d_%d_q=%.1f\n", ost->file_index, ost->index, q); if (is_last_report) snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "L"); if (qp_hist) { int j; int qp = lrintf(q); if (qp >= 0 && qp < FF_ARRAY_ELEMS(qp_histogram)) qp_histogram[qp]++; for (j = 0; j < 32; j++) snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "%X", av_log2(qp_histogram[j] + 1)); } if ((enc->flags & AV_CODEC_FLAG_PSNR) && (ost->pict_type != AV_PICTURE_TYPE_NONE || is_last_report)) { int j; double error, error_sum = 0; double scale, scale_sum = 0; double p; char type[3] = { 'Y','U','V' }; snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "PSNR="); for (j = 0; j < 3; j++) { if (is_last_report) { error = enc->error[j]; scale = enc->width * enc->height * 255.0 * 255.0 * frame_number; } else { error = ost->error[j]; scale = enc->width * enc->height * 255.0 * 255.0; } if (j) scale /= 4; error_sum += error; scale_sum += scale; p = psnr(error / scale); snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "%c:%2.2f ", type[j], p); av_bprintf(&buf_script, 
"stream_%d_%d_psnr_%c=%2.2f\n", ost->file_index, ost->index, type[j] | 32, p); } p = psnr(error_sum / scale_sum); snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "*:%2.2f ", psnr(error_sum / scale_sum)); av_bprintf(&buf_script, "stream_%d_%d_psnr_all=%2.2f\n", ost->file_index, ost->index, p); } vid = 1; } /* compute min output value */ if (av_stream_get_end_pts(ost->st) != AV_NOPTS_VALUE) pts = FFMAX(pts, av_rescale_q(av_stream_get_end_pts(ost->st), ost->st->time_base, AV_TIME_BASE_Q)); if (is_last_report) nb_frames_drop += ost->last_dropped; } secs = FFABS(pts) / AV_TIME_BASE; us = FFABS(pts) % AV_TIME_BASE; mins = secs / 60; secs %= 60; hours = mins / 60; mins %= 60; bitrate = pts && total_size >= 0 ? total_size * 8 / (pts / 1000.0) : -1; speed = t != 0.0 ? (double)pts / AV_TIME_BASE / t : -1; if (total_size < 0) snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "size=N/A time="); else snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "size=%8.0fkB time=", total_size / 1024.0); if (pts < 0) snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "-"); snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), "%02d:%02d:%02d.%02d ", hours, mins, secs, (100 * us) / AV_TIME_BASE); if (bitrate < 0) { snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf),"bitrate=N/A"); av_bprintf(&buf_script, "bitrate=N/A\n"); }else{ snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf),"bitrate=%6.1fkbits/s", bitrate); av_bprintf(&buf_script, "bitrate=%6.1fkbits/s\n", bitrate); } if (total_size < 0) av_bprintf(&buf_script, "total_size=N/A\n"); else av_bprintf(&buf_script, "total_size=%"PRId64"\n", total_size); av_bprintf(&buf_script, "out_time_ms=%"PRId64"\n", pts); av_bprintf(&buf_script, "out_time=%02d:%02d:%02d.%06d\n", hours, mins, secs, us); if (nb_frames_dup || nb_frames_drop) snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf), " dup=%d drop=%d", nb_frames_dup, nb_frames_drop); av_bprintf(&buf_script, "dup_frames=%d\n", nb_frames_dup); av_bprintf(&buf_script, 
"drop_frames=%d\n", nb_frames_drop); if (speed < 0) { snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf)," speed=N/A"); av_bprintf(&buf_script, "speed=N/A\n"); } else { snprintf(buf + strlen(buf), sizeof(buf) - strlen(buf)," speed=%4.3gx", speed); av_bprintf(&buf_script, "speed=%4.3gx\n", speed); } if (print_stats || is_last_report) { const char end = is_last_report ? '\n' : '\r'; if (print_stats==1 && AV_LOG_INFO > av_log_get_level()) { fprintf(stderr, "%s %c", buf, end); } else av_log(NULL, AV_LOG_INFO, "%s %c", buf, end); fflush(stderr); } if (progress_avio) { av_bprintf(&buf_script, "progress=%s\n", is_last_report ? "end" : "continue"); avio_write(progress_avio, buf_script.str, FFMIN(buf_script.len, buf_script.size - 1)); avio_flush(progress_avio); av_bprint_finalize(&buf_script, NULL); if (is_last_report) { if ((ret = avio_closep(&progress_avio)) < 0) av_log(NULL, AV_LOG_ERROR, "Error closing progress log, loss of information possible: %s\n", av_err2str(ret)); } } if (is_last_report) print_final_stats(total_size); } static void flush_encoders(void) { int i, ret; for (i = 0; i < nb_output_streams; i++) { OutputStream *ost = output_streams[i]; AVCodecContext *enc = ost->enc_ctx; OutputFile *of = output_files[ost->file_index]; int stop_encoding = 0; if (!ost->encoding_needed) continue; if (enc->codec_type == AVMEDIA_TYPE_AUDIO && enc->frame_size <= 1) continue; #if FF_API_LAVF_FMT_RAWPICTURE if (enc->codec_type == AVMEDIA_TYPE_VIDEO && (of->ctx->oformat->flags & AVFMT_RAWPICTURE) && enc->codec->id == AV_CODEC_ID_RAWVIDEO) continue; #endif if (enc->codec_type != AVMEDIA_TYPE_VIDEO && enc->codec_type != AVMEDIA_TYPE_AUDIO) continue; avcodec_send_frame(enc, NULL); for (;;) { const char *desc = NULL; switch (enc->codec_type) { case AVMEDIA_TYPE_AUDIO: desc = "audio"; break; case AVMEDIA_TYPE_VIDEO: desc = "video"; break; default: av_assert0(0); } if (1) { AVPacket pkt; int pkt_size; av_init_packet(&pkt); pkt.data = NULL; pkt.size = 0; update_benchmark(NULL); ret = 
avcodec_receive_packet(enc, &pkt); update_benchmark("flush_%s %d.%d", desc, ost->file_index, ost->index); if (ret < 0 && ret != AVERROR_EOF) { av_log(NULL, AV_LOG_FATAL, "%s encoding failed: %s\n", desc, av_err2str(ret)); exit_program(1); } if (ost->logfile && enc->stats_out) { fprintf(ost->logfile, "%s", enc->stats_out); } if (ret == AVERROR_EOF) { stop_encoding = 1; break; } if (ost->finished & MUXER_FINISHED) { av_packet_unref(&pkt); continue; } av_packet_rescale_ts(&pkt, enc->time_base, ost->st->time_base); pkt_size = pkt.size; output_packet(of, &pkt, ost); if (ost->enc_ctx->codec_type == AVMEDIA_TYPE_VIDEO && vstats_filename) { do_video_stats(ost, pkt_size); } } if (stop_encoding) break; } } } /* * Check whether a packet from ist should be written into ost at this time */ static int check_output_constraints(InputStream *ist, OutputStream *ost) { OutputFile *of = output_files[ost->file_index]; int ist_index = input_files[ist->file_index]->ist_index + ist->st->index; if (ost->source_index != ist_index) return 0; if (ost->finished) return 0; if (of->start_time != AV_NOPTS_VALUE && ist->pts < of->start_time) return 0; return 1; } static void do_streamcopy(InputStream *ist, OutputStream *ost, const AVPacket *pkt) { OutputFile *of = output_files[ost->file_index]; InputFile *f = input_files [ist->file_index]; int64_t start_time = (of->start_time == AV_NOPTS_VALUE) ? 0 : of->start_time; int64_t ost_tb_start_time = av_rescale_q(start_time, AV_TIME_BASE_Q, ost->st->time_base); AVPicture pict; AVPacket opkt; av_init_packet(&opkt); if ((!ost->frame_number && !(pkt->flags & AV_PKT_FLAG_KEY)) && !ost->copy_initial_nonkeyframes) return; if (!ost->frame_number && !ost->copy_prior_start) { int64_t comp_start = start_time; if (copy_ts && f->start_time != AV_NOPTS_VALUE) comp_start = FFMAX(start_time, f->start_time + f->ts_offset); if (pkt->pts == AV_NOPTS_VALUE ? 
ist->pts < comp_start : pkt->pts < av_rescale_q(comp_start, AV_TIME_BASE_Q, ist->st->time_base)) return; } if (of->recording_time != INT64_MAX && ist->pts >= of->recording_time + start_time) { close_output_stream(ost); return; } if (f->recording_time != INT64_MAX) { start_time = f->ctx->start_time; if (f->start_time != AV_NOPTS_VALUE && copy_ts) start_time += f->start_time; if (ist->pts >= f->recording_time + start_time) { close_output_stream(ost); return; } } /* force the input stream PTS */ if (ost->enc_ctx->codec_type == AVMEDIA_TYPE_VIDEO) ost->sync_opts++; if (pkt->pts != AV_NOPTS_VALUE) opkt.pts = av_rescale_q(pkt->pts, ist->st->time_base, ost->st->time_base) - ost_tb_start_time; else opkt.pts = AV_NOPTS_VALUE; if (pkt->dts == AV_NOPTS_VALUE) opkt.dts = av_rescale_q(ist->dts, AV_TIME_BASE_Q, ost->st->time_base); else opkt.dts = av_rescale_q(pkt->dts, ist->st->time_base, ost->st->time_base); opkt.dts -= ost_tb_start_time; if (ost->st->codecpar->codec_type == AVMEDIA_TYPE_AUDIO && pkt->dts != AV_NOPTS_VALUE) { int duration = av_get_audio_frame_duration(ist->dec_ctx, pkt->size); if(!duration) duration = ist->dec_ctx->frame_size; opkt.dts = opkt.pts = av_rescale_delta(ist->st->time_base, pkt->dts, (AVRational){1, ist->dec_ctx->sample_rate}, duration, &ist->filter_in_rescale_delta_last, ost->st->time_base) - ost_tb_start_time; } opkt.duration = av_rescale_q(pkt->duration, ist->st->time_base, ost->st->time_base); opkt.flags = pkt->flags; // FIXME remove the following 2 lines they shall be replaced by the bitstream filters if ( ost->st->codecpar->codec_id != AV_CODEC_ID_H264 && ost->st->codecpar->codec_id != AV_CODEC_ID_MPEG1VIDEO && ost->st->codecpar->codec_id != AV_CODEC_ID_MPEG2VIDEO && ost->st->codecpar->codec_id != AV_CODEC_ID_VC1 ) { int ret = av_parser_change(ost->parser, ost->parser_avctx, &opkt.data, &opkt.size, pkt->data, pkt->size, pkt->flags & AV_PKT_FLAG_KEY); if (ret < 0) { av_log(NULL, AV_LOG_FATAL, "av_parser_change failed: %s\n", av_err2str(ret)); 
exit_program(1); } if (ret) { opkt.buf = av_buffer_create(opkt.data, opkt.size, av_buffer_default_free, NULL, 0); if (!opkt.buf) exit_program(1); } } else { opkt.data = pkt->data; opkt.size = pkt->size; } av_copy_packet_side_data(&opkt, pkt); #if FF_API_LAVF_FMT_RAWPICTURE if (ost->st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO && ost->st->codecpar->codec_id == AV_CODEC_ID_RAWVIDEO && (of->ctx->oformat->flags & AVFMT_RAWPICTURE)) { /* store AVPicture in AVPacket, as expected by the output format */ int ret = avpicture_fill(&pict, opkt.data, ost->st->codecpar->format, ost->st->codecpar->width, ost->st->codecpar->height); if (ret < 0) { av_log(NULL, AV_LOG_FATAL, "avpicture_fill failed: %s\n", av_err2str(ret)); exit_program(1); } opkt.data = (uint8_t *)&pict; opkt.size = sizeof(AVPicture); opkt.flags |= AV_PKT_FLAG_KEY; } #endif output_packet(of, &opkt, ost); } int guess_input_channel_layout(InputStream *ist) { AVCodecContext *dec = ist->dec_ctx; if (!dec->channel_layout) { char layout_name[256]; if (dec->channels > ist->guess_layout_max) return 0; dec->channel_layout = av_get_default_channel_layout(dec->channels); if (!dec->channel_layout) return 0; av_get_channel_layout_string(layout_name, sizeof(layout_name), dec->channels, dec->channel_layout); av_log(NULL, AV_LOG_WARNING, "Guessed Channel Layout for Input Stream " "#%d.%d : %s\n", ist->file_index, ist->st->index, layout_name); } return 1; } static void check_decode_result(InputStream *ist, int *got_output, int ret) { if (*got_output || ret<0) decode_error_stat[ret<0] ++; if (ret < 0 && exit_on_error) exit_program(1); if (exit_on_error && *got_output && ist) { if (av_frame_get_decode_error_flags(ist->decoded_frame) || (ist->decoded_frame->flags & AV_FRAME_FLAG_CORRUPT)) { av_log(NULL, AV_LOG_FATAL, "%s: corrupt decoded frame in stream %d\n", input_files[ist->file_index]->ctx->filename, ist->st->index); exit_program(1); } } } // This does not quite work like avcodec_decode_audio4/avcodec_decode_video2. 
// There is the following difference: if you got a frame, you must call // it again with pkt=NULL. pkt==NULL is treated differently from pkt.size==0 // (pkt==NULL means get more output, pkt.size==0 is a flush/drain packet) static int decode(AVCodecContext *avctx, AVFrame *frame, int *got_frame, AVPacket *pkt) { int ret; *got_frame = 0; if (pkt) { ret = avcodec_send_packet(avctx, pkt); // In particular, we don't expect AVERROR(EAGAIN), because we read all // decoded frames with avcodec_receive_frame() until done. if (ret < 0 && ret != AVERROR_EOF) return ret; } ret = avcodec_receive_frame(avctx, frame); if (ret < 0 && ret != AVERROR(EAGAIN)) return ret; if (ret >= 0) *got_frame = 1; return 0; } static int decode_audio(InputStream *ist, AVPacket *pkt, int *got_output) { AVFrame *decoded_frame, *f; AVCodecContext *avctx = ist->dec_ctx; int i, ret, err = 0, resample_changed; AVRational decoded_frame_tb; if (!ist->decoded_frame && !(ist->decoded_frame = av_frame_alloc())) return AVERROR(ENOMEM); if (!ist->filter_frame && !(ist->filter_frame = av_frame_alloc())) return AVERROR(ENOMEM); decoded_frame = ist->decoded_frame; update_benchmark(NULL); ret = decode(avctx, decoded_frame, got_output, pkt); update_benchmark("decode_audio %d.%d", ist->file_index, ist->st->index); if (ret >= 0 && avctx->sample_rate <= 0) { av_log(avctx, AV_LOG_ERROR, "Sample rate %d invalid\n", avctx->sample_rate); ret = AVERROR_INVALIDDATA; } if (ret != AVERROR_EOF) check_decode_result(ist, got_output, ret); if (!*got_output || ret < 0) return ret; ist->samples_decoded += decoded_frame->nb_samples; ist->frames_decoded++; #if 1 /* increment next_dts to use for the case where the input stream does not have timestamps or there are multiple frames in the packet */ ist->next_pts += ((int64_t)AV_TIME_BASE * decoded_frame->nb_samples) / avctx->sample_rate; ist->next_dts += ((int64_t)AV_TIME_BASE * decoded_frame->nb_samples) / avctx->sample_rate; #endif resample_changed = ist->resample_sample_fmt != 
decoded_frame->format || ist->resample_channels != avctx->channels || ist->resample_channel_layout != decoded_frame->channel_layout || ist->resample_sample_rate != decoded_frame->sample_rate; if (resample_changed) { char layout1[64], layout2[64]; if (!guess_input_channel_layout(ist)) { av_log(NULL, AV_LOG_FATAL, "Unable to find default channel " "layout for Input Stream #%d.%d\n", ist->file_index, ist->st->index); exit_program(1); } decoded_frame->channel_layout = avctx->channel_layout; av_get_channel_layout_string(layout1, sizeof(layout1), ist->resample_channels, ist->resample_channel_layout); av_get_channel_layout_string(layout2, sizeof(layout2), avctx->channels, decoded_frame->channel_layout); av_log(NULL, AV_LOG_INFO, "Input stream #%d:%d frame changed from rate:%d fmt:%s ch:%d chl:%s to rate:%d fmt:%s ch:%d chl:%s\n", ist->file_index, ist->st->index, ist->resample_sample_rate, av_get_sample_fmt_name(ist->resample_sample_fmt), ist->resample_channels, layout1, decoded_frame->sample_rate, av_get_sample_fmt_name(decoded_frame->format), avctx->channels, layout2); ist->resample_sample_fmt = decoded_frame->format; ist->resample_sample_rate = decoded_frame->sample_rate; ist->resample_channel_layout = decoded_frame->channel_layout; ist->resample_channels = avctx->channels; for (i = 0; i < nb_filtergraphs; i++) if (ist_in_filtergraph(filtergraphs[i], ist)) { FilterGraph *fg = filtergraphs[i]; if (configure_filtergraph(fg) < 0) { av_log(NULL, AV_LOG_FATAL, "Error reinitializing filters!\n"); exit_program(1); } } } if (decoded_frame->pts != AV_NOPTS_VALUE) { decoded_frame_tb = ist->st->time_base; } else if (pkt && pkt->pts != AV_NOPTS_VALUE) { decoded_frame->pts = pkt->pts; decoded_frame_tb = ist->st->time_base; }else { decoded_frame->pts = ist->dts; decoded_frame_tb = AV_TIME_BASE_Q; } if (decoded_frame->pts != AV_NOPTS_VALUE) decoded_frame->pts = av_rescale_delta(decoded_frame_tb, decoded_frame->pts, (AVRational){1, avctx->sample_rate}, decoded_frame->nb_samples, 
&ist->filter_in_rescale_delta_last, (AVRational){1, avctx->sample_rate}); ist->nb_samples = decoded_frame->nb_samples; for (i = 0; i < ist->nb_filters; i++) { if (i < ist->nb_filters - 1) { f = ist->filter_frame; err = av_frame_ref(f, decoded_frame); if (err < 0) break; } else f = decoded_frame; err = av_buffersrc_add_frame_flags(ist->filters[i]->filter, f, AV_BUFFERSRC_FLAG_PUSH); if (err == AVERROR_EOF) err = 0; /* ignore */ if (err < 0) break; } decoded_frame->pts = AV_NOPTS_VALUE; av_frame_unref(ist->filter_frame); av_frame_unref(decoded_frame); return err < 0 ? err : ret; } static int decode_video(InputStream *ist, AVPacket *pkt, int *got_output, int eof) { AVFrame *decoded_frame, *f; int i, ret = 0, err = 0, resample_changed; int64_t best_effort_timestamp; int64_t dts = AV_NOPTS_VALUE; AVRational *frame_sample_aspect; AVPacket avpkt; // With fate-indeo3-2, we're getting 0-sized packets before EOF for some // reason. This seems like a semi-critical bug. Don't trigger EOF, and // skip the packet. if (!eof && pkt && pkt->size == 0) return 0; if (!ist->decoded_frame && !(ist->decoded_frame = av_frame_alloc())) return AVERROR(ENOMEM); if (!ist->filter_frame && !(ist->filter_frame = av_frame_alloc())) return AVERROR(ENOMEM); decoded_frame = ist->decoded_frame; if (ist->dts != AV_NOPTS_VALUE) dts = av_rescale_q(ist->dts, AV_TIME_BASE_Q, ist->st->time_base); if (pkt) { avpkt = *pkt; avpkt.dts = dts; // ffmpeg.c probably shouldn't do this } // The old code used to set dts on the drain packet, which does not work // with the new API anymore. if (eof) { void *new = av_realloc_array(ist->dts_buffer, ist->nb_dts_buffer + 1, sizeof(ist->dts_buffer[0])); if (!new) return AVERROR(ENOMEM); ist->dts_buffer = new; ist->dts_buffer[ist->nb_dts_buffer++] = dts; } update_benchmark(NULL); ret = decode(ist->dec_ctx, decoded_frame, got_output, pkt ? 
&avpkt : NULL); update_benchmark("decode_video %d.%d", ist->file_index, ist->st->index); // The following line may be required in some cases where there is no parser // or the parser does not set has_b_frames correctly if (ist->st->codecpar->video_delay < ist->dec_ctx->has_b_frames) { if (ist->dec_ctx->codec_id == AV_CODEC_ID_H264) { ist->st->codecpar->video_delay = ist->dec_ctx->has_b_frames; } else av_log(ist->dec_ctx, AV_LOG_WARNING, "video_delay is larger in decoder than demuxer %d > %d.\n" "If you want to help, upload a sample " "of this file to ftp://upload.ffmpeg.org/incoming/ " "and contact the ffmpeg-devel mailing list. (ffmpeg-devel@ffmpeg.org)", ist->dec_ctx->has_b_frames, ist->st->codecpar->video_delay); } if (ret != AVERROR_EOF) check_decode_result(ist, got_output, ret); if (*got_output && ret >= 0) { if (ist->dec_ctx->width != decoded_frame->width || ist->dec_ctx->height != decoded_frame->height || ist->dec_ctx->pix_fmt != decoded_frame->format) { av_log(NULL, AV_LOG_DEBUG, "Frame parameters mismatch context %d,%d,%d != %d,%d,%d\n", decoded_frame->width, decoded_frame->height, decoded_frame->format, ist->dec_ctx->width, ist->dec_ctx->height, ist->dec_ctx->pix_fmt); } } if (!*got_output || ret < 0) return ret; if(ist->top_field_first>=0) decoded_frame->top_field_first = ist->top_field_first; ist->frames_decoded++; if (ist->hwaccel_retrieve_data && decoded_frame->format == ist->hwaccel_pix_fmt) { err = ist->hwaccel_retrieve_data(ist->dec_ctx, decoded_frame); if (err < 0) goto fail; } ist->hwaccel_retrieved_pix_fmt = decoded_frame->format; best_effort_timestamp= av_frame_get_best_effort_timestamp(decoded_frame); if (eof && best_effort_timestamp == AV_NOPTS_VALUE && ist->nb_dts_buffer > 0) { best_effort_timestamp = ist->dts_buffer[0]; for (i = 0; i < ist->nb_dts_buffer - 1; i++) ist->dts_buffer[i] = ist->dts_buffer[i + 1]; ist->nb_dts_buffer--; } if(best_effort_timestamp != AV_NOPTS_VALUE) { int64_t ts = av_rescale_q(decoded_frame->pts =
best_effort_timestamp, ist->st->time_base, AV_TIME_BASE_Q); if (ts != AV_NOPTS_VALUE) ist->next_pts = ist->pts = ts; } if (debug_ts) { av_log(NULL, AV_LOG_INFO, "decoder -> ist_index:%d type:video " "frame_pts:%s frame_pts_time:%s best_effort_ts:%"PRId64" best_effort_ts_time:%s keyframe:%d frame_type:%d time_base:%d/%d\n", ist->st->index, av_ts2str(decoded_frame->pts), av_ts2timestr(decoded_frame->pts, &ist->st->time_base), best_effort_timestamp, av_ts2timestr(best_effort_timestamp, &ist->st->time_base), decoded_frame->key_frame, decoded_frame->pict_type, ist->st->time_base.num, ist->st->time_base.den); } if (ist->st->sample_aspect_ratio.num) decoded_frame->sample_aspect_ratio = ist->st->sample_aspect_ratio; resample_changed = ist->resample_width != decoded_frame->width || ist->resample_height != decoded_frame->height || ist->resample_pix_fmt != decoded_frame->format; if (resample_changed) { av_log(NULL, AV_LOG_INFO, "Input stream #%d:%d frame changed from size:%dx%d fmt:%s to size:%dx%d fmt:%s\n", ist->file_index, ist->st->index, ist->resample_width, ist->resample_height, av_get_pix_fmt_name(ist->resample_pix_fmt), decoded_frame->width, decoded_frame->height, av_get_pix_fmt_name(decoded_frame->format)); ist->resample_width = decoded_frame->width; ist->resample_height = decoded_frame->height; ist->resample_pix_fmt = decoded_frame->format; for (i = 0; i < nb_filtergraphs; i++) { if (ist_in_filtergraph(filtergraphs[i], ist) && ist->reinit_filters && configure_filtergraph(filtergraphs[i]) < 0) { av_log(NULL, AV_LOG_FATAL, "Error reinitializing filters!\n"); exit_program(1); } } } frame_sample_aspect= av_opt_ptr(avcodec_get_frame_class(), decoded_frame, "sample_aspect_ratio"); for (i = 0; i < ist->nb_filters; i++) { if (!frame_sample_aspect->num) *frame_sample_aspect = ist->st->sample_aspect_ratio; if (i < ist->nb_filters - 1) { f = ist->filter_frame; err = av_frame_ref(f, decoded_frame); if (err < 0) break; } else f = decoded_frame; err = 
av_buffersrc_add_frame_flags(ist->filters[i]->filter, f, AV_BUFFERSRC_FLAG_PUSH); if (err == AVERROR_EOF) { err = 0; /* ignore */ } else if (err < 0) { av_log(NULL, AV_LOG_FATAL, "Failed to inject frame into filter network: %s\n", av_err2str(err)); exit_program(1); } } fail: av_frame_unref(ist->filter_frame); av_frame_unref(decoded_frame); return err < 0 ? err : ret; } static int transcode_subtitles(InputStream *ist, AVPacket *pkt, int *got_output) { AVSubtitle subtitle; int i, ret = avcodec_decode_subtitle2(ist->dec_ctx, &subtitle, got_output, pkt); check_decode_result(NULL, got_output, ret); if (ret < 0 || !*got_output) { if (!pkt->size) sub2video_flush(ist); return ret; } if (ist->fix_sub_duration) { int end = 1; if (ist->prev_sub.got_output) { end = av_rescale(subtitle.pts - ist->prev_sub.subtitle.pts, 1000, AV_TIME_BASE); if (end < ist->prev_sub.subtitle.end_display_time) { av_log(ist->dec_ctx, AV_LOG_DEBUG, "Subtitle duration reduced from %d to %d%s\n", ist->prev_sub.subtitle.end_display_time, end, end <= 0 ? 
", dropping it" : ""); ist->prev_sub.subtitle.end_display_time = end; } } FFSWAP(int, *got_output, ist->prev_sub.got_output); FFSWAP(int, ret, ist->prev_sub.ret); FFSWAP(AVSubtitle, subtitle, ist->prev_sub.subtitle); if (end <= 0) goto out; } if (!*got_output) return ret; sub2video_update(ist, &subtitle); if (!subtitle.num_rects) goto out; ist->frames_decoded++; for (i = 0; i < nb_output_streams; i++) { OutputStream *ost = output_streams[i]; if (!check_output_constraints(ist, ost) || !ost->encoding_needed || ost->enc->type != AVMEDIA_TYPE_SUBTITLE) continue; do_subtitle_out(output_files[ost->file_index], ost, &subtitle); } out: avsubtitle_free(&subtitle); return ret; } static int send_filter_eof(InputStream *ist) { int i, ret; for (i = 0; i < ist->nb_filters; i++) { ret = av_buffersrc_add_frame(ist->filters[i]->filter, NULL); if (ret < 0) return ret; } return 0; } /* pkt = NULL means EOF (needed to flush decoder buffers) */ static int process_input_packet(InputStream *ist, const AVPacket *pkt, int no_eof) { int ret = 0, i; int repeating = 0; int eof_reached = 0; AVPacket avpkt; if (!ist->saw_first_ts) { ist->dts = ist->st->avg_frame_rate.num ? 
- ist->dec_ctx->has_b_frames * AV_TIME_BASE / av_q2d(ist->st->avg_frame_rate) : 0; ist->pts = 0; if (pkt && pkt->pts != AV_NOPTS_VALUE && !ist->decoding_needed) { ist->dts += av_rescale_q(pkt->pts, ist->st->time_base, AV_TIME_BASE_Q); ist->pts = ist->dts; // unused but better to set it to a value that's not totally wrong } ist->saw_first_ts = 1; } if (ist->next_dts == AV_NOPTS_VALUE) ist->next_dts = ist->dts; if (ist->next_pts == AV_NOPTS_VALUE) ist->next_pts = ist->pts; if (!pkt) { /* EOF handling */ av_init_packet(&avpkt); avpkt.data = NULL; avpkt.size = 0; } else { avpkt = *pkt; } if (pkt && pkt->dts != AV_NOPTS_VALUE) { ist->next_dts = ist->dts = av_rescale_q(pkt->dts, ist->st->time_base, AV_TIME_BASE_Q); if (ist->dec_ctx->codec_type != AVMEDIA_TYPE_VIDEO || !ist->decoding_needed) ist->next_pts = ist->pts = ist->dts; } // while we have more to decode or while the decoder did output something on EOF while (ist->decoding_needed) { int duration = 0; int got_output = 0; ist->pts = ist->next_pts; ist->dts = ist->next_dts; switch (ist->dec_ctx->codec_type) { case AVMEDIA_TYPE_AUDIO: ret = decode_audio (ist, repeating ? NULL : &avpkt, &got_output); break; case AVMEDIA_TYPE_VIDEO: ret = decode_video (ist, repeating ? NULL : &avpkt, &got_output, !pkt); if (!repeating || !pkt || got_output) { if (pkt && pkt->duration) { duration = av_rescale_q(pkt->duration, ist->st->time_base, AV_TIME_BASE_Q); } else if(ist->dec_ctx->framerate.num != 0 && ist->dec_ctx->framerate.den != 0) { int ticks= av_stream_get_parser(ist->st) ?
av_stream_get_parser(ist->st)->repeat_pict+1 : ist->dec_ctx->ticks_per_frame; duration = ((int64_t)AV_TIME_BASE * ist->dec_ctx->framerate.den * ticks) / ist->dec_ctx->framerate.num / ist->dec_ctx->ticks_per_frame; } if(ist->dts != AV_NOPTS_VALUE && duration) { ist->next_dts += duration; }else ist->next_dts = AV_NOPTS_VALUE; } if (got_output) ist->next_pts += duration; //FIXME the duration is not correct in some cases break; case AVMEDIA_TYPE_SUBTITLE: if (repeating) break; ret = transcode_subtitles(ist, &avpkt, &got_output); if (!pkt && ret >= 0) ret = AVERROR_EOF; break; default: return -1; } if (ret == AVERROR_EOF) { eof_reached = 1; break; } if (ret < 0) { av_log(NULL, AV_LOG_ERROR, "Error while decoding stream #%d:%d: %s\n", ist->file_index, ist->st->index, av_err2str(ret)); if (exit_on_error) exit_program(1); // Decoding might not terminate if we're draining the decoder, and // the decoder keeps returning an error. // This should probably be considered a libavcodec issue. // Sample: fate-vsynth1-dnxhd-720p-hr-lb if (!pkt) eof_reached = 1; break; } if (!got_output) break; // During draining, we might get multiple output frames in this loop. // ffmpeg.c does not drain the filter chain on configuration changes, // which means if we send multiple frames at once to the filters, and // one of those frames changes configuration, the buffered frames will // be lost. This can upset certain FATE tests. // Decode only 1 frame per call on EOF to appease these FATE tests. // The ideal solution would be to rewrite decoding to use the new // decoding API in a better way. 
if (!pkt) break; repeating = 1; } /* after flushing, send an EOF on all the filter inputs attached to the stream */ /* except when looping we need to flush but not to send an EOF */ if (!pkt && ist->decoding_needed && eof_reached && !no_eof) { int ret = send_filter_eof(ist); if (ret < 0) { av_log(NULL, AV_LOG_FATAL, "Error marking filters as finished\n"); exit_program(1); } } /* handle stream copy */ if (!ist->decoding_needed) { ist->dts = ist->next_dts; switch (ist->dec_ctx->codec_type) { case AVMEDIA_TYPE_AUDIO: ist->next_dts += ((int64_t)AV_TIME_BASE * ist->dec_ctx->frame_size) / ist->dec_ctx->sample_rate; break; case AVMEDIA_TYPE_VIDEO: if (ist->framerate.num) { // TODO: Remove work-around for c99-to-c89 issue 7 AVRational time_base_q = AV_TIME_BASE_Q; int64_t next_dts = av_rescale_q(ist->next_dts, time_base_q, av_inv_q(ist->framerate)); ist->next_dts = av_rescale_q(next_dts + 1, av_inv_q(ist->framerate), time_base_q); } else if (pkt->duration) { ist->next_dts += av_rescale_q(pkt->duration, ist->st->time_base, AV_TIME_BASE_Q); } else if(ist->dec_ctx->framerate.num != 0) { int ticks= av_stream_get_parser(ist->st) ? 
av_stream_get_parser(ist->st)->repeat_pict + 1 : ist->dec_ctx->ticks_per_frame; ist->next_dts += ((int64_t)AV_TIME_BASE * ist->dec_ctx->framerate.den * ticks) / ist->dec_ctx->framerate.num / ist->dec_ctx->ticks_per_frame; } break; } ist->pts = ist->dts; ist->next_pts = ist->next_dts; } for (i = 0; pkt && i < nb_output_streams; i++) { OutputStream *ost = output_streams[i]; if (!check_output_constraints(ist, ost) || ost->encoding_needed) continue; do_streamcopy(ist, ost, pkt); } return !eof_reached; } static void print_sdp(void) { char sdp[16384]; int i; int j; AVIOContext *sdp_pb; AVFormatContext **avc; for (i = 0; i < nb_output_files; i++) { if (!output_files[i]->header_written) return; } avc = av_malloc_array(nb_output_files, sizeof(*avc)); if (!avc) exit_program(1); for (i = 0, j = 0; i < nb_output_files; i++) { if (!strcmp(output_files[i]->ctx->oformat->name, "rtp")) { avc[j] = output_files[i]->ctx; j++; } } if (!j) goto fail; av_sdp_create(avc, j, sdp, sizeof(sdp)); if (!sdp_filename) { printf("SDP:\n%s\n", sdp); fflush(stdout); } else { if (avio_open2(&sdp_pb, sdp_filename, AVIO_FLAG_WRITE, &int_cb, NULL) < 0) { av_log(NULL, AV_LOG_ERROR, "Failed to open sdp file '%s'\n", sdp_filename); } else { avio_printf(sdp_pb, "SDP:\n%s", sdp); avio_closep(&sdp_pb); av_freep(&sdp_filename); } } fail: av_freep(&avc); } static const HWAccel *get_hwaccel(enum AVPixelFormat pix_fmt) { int i; for (i = 0; hwaccels[i].name; i++) if (hwaccels[i].pix_fmt == pix_fmt) return &hwaccels[i]; return NULL; } static enum AVPixelFormat get_format(AVCodecContext *s, const enum AVPixelFormat *pix_fmts) { InputStream *ist = s->opaque; const enum AVPixelFormat *p; int ret; for (p = pix_fmts; *p != -1; p++) { const AVPixFmtDescriptor *desc = av_pix_fmt_desc_get(*p); const HWAccel *hwaccel; if (!(desc->flags & AV_PIX_FMT_FLAG_HWACCEL)) break; hwaccel = get_hwaccel(*p); if (!hwaccel || (ist->active_hwaccel_id && ist->active_hwaccel_id != hwaccel->id) || (ist->hwaccel_id != HWACCEL_AUTO && 
ist->hwaccel_id != hwaccel->id)) continue; ret = hwaccel->init(s); if (ret < 0) { if (ist->hwaccel_id == hwaccel->id) { av_log(NULL, AV_LOG_FATAL, "%s hwaccel requested for input stream #%d:%d, " "but cannot be initialized.\n", hwaccel->name, ist->file_index, ist->st->index); return AV_PIX_FMT_NONE; } continue; } if (ist->hw_frames_ctx) { s->hw_frames_ctx = av_buffer_ref(ist->hw_frames_ctx); if (!s->hw_frames_ctx) return AV_PIX_FMT_NONE; } ist->active_hwaccel_id = hwaccel->id; ist->hwaccel_pix_fmt = *p; break; } return *p; } static int get_buffer(AVCodecContext *s, AVFrame *frame, int flags) { InputStream *ist = s->opaque; if (ist->hwaccel_get_buffer && frame->format == ist->hwaccel_pix_fmt) return ist->hwaccel_get_buffer(s, frame, flags); return avcodec_default_get_buffer2(s, frame, flags); } static int init_input_stream(int ist_index, char *error, int error_len) { int ret; InputStream *ist = input_streams[ist_index]; if (ist->decoding_needed) { AVCodec *codec = ist->dec; if (!codec) { snprintf(error, error_len, "Decoder (codec %s) not found for input stream #%d:%d", avcodec_get_name(ist->dec_ctx->codec_id), ist->file_index, ist->st->index); return AVERROR(EINVAL); } ist->dec_ctx->opaque = ist; ist->dec_ctx->get_format = get_format; ist->dec_ctx->get_buffer2 = get_buffer; ist->dec_ctx->thread_safe_callbacks = 1; av_opt_set_int(ist->dec_ctx, "refcounted_frames", 1, 0); if (ist->dec_ctx->codec_id == AV_CODEC_ID_DVB_SUBTITLE && (ist->decoding_needed & DECODING_FOR_OST)) { av_dict_set(&ist->decoder_opts, "compute_edt", "1", AV_DICT_DONT_OVERWRITE); if (ist->decoding_needed & DECODING_FOR_FILTER) av_log(NULL, AV_LOG_WARNING, "Warning using DVB subtitles for filtering and output at the same time is not fully supported, also see -compute_edt [0|1]\n"); } av_dict_set(&ist->decoder_opts, "sub_text_format", "ass", AV_DICT_DONT_OVERWRITE); /* Useful for subtitles retiming by lavf (FIXME), skipping samples in * audio, and video decoders such as cuvid or mediacodec */ 
av_codec_set_pkt_timebase(ist->dec_ctx, ist->st->time_base); if (!av_dict_get(ist->decoder_opts, "threads", NULL, 0)) av_dict_set(&ist->decoder_opts, "threads", "auto", 0); if ((ret = avcodec_open2(ist->dec_ctx, codec, &ist->decoder_opts)) < 0) { if (ret == AVERROR_EXPERIMENTAL) abort_codec_experimental(codec, 0); snprintf(error, error_len, "Error while opening decoder for input stream " "#%d:%d : %s", ist->file_index, ist->st->index, av_err2str(ret)); return ret; } assert_avoptions(ist->decoder_opts); } ist->next_pts = AV_NOPTS_VALUE; ist->next_dts = AV_NOPTS_VALUE; return 0; } static InputStream *get_input_stream(OutputStream *ost) { if (ost->source_index >= 0) return input_streams[ost->source_index]; return NULL; } static int compare_int64(const void *a, const void *b) { return FFDIFFSIGN(*(const int64_t *)a, *(const int64_t *)b); } /* open the muxer when all the streams are initialized */ static int check_init_output_file(OutputFile *of, int file_index) { int ret, i; for (i = 0; i < of->ctx->nb_streams; i++) { OutputStream *ost = output_streams[of->ost_index + i]; if (!ost->initialized) return 0; } of->ctx->interrupt_callback = int_cb; ret = avformat_write_header(of->ctx, &of->opts); if (ret < 0) { av_log(NULL, AV_LOG_ERROR, "Could not write header for output file #%d " "(incorrect codec parameters ?): %s", file_index, av_err2str(ret)); return ret; } //assert_avoptions(of->opts); of->header_written = 1; av_dump_format(of->ctx, file_index, of->ctx->filename, 1); if (sdp_filename || want_sdp) print_sdp(); /* flush the muxing queues */ for (i = 0; i < of->ctx->nb_streams; i++) { OutputStream *ost = output_streams[of->ost_index + i]; while (av_fifo_size(ost->muxing_queue)) { AVPacket pkt; av_fifo_generic_read(ost->muxing_queue, &pkt, sizeof(pkt), NULL); write_packet(of, &pkt, ost); } } return 0; } static int init_output_bsfs(OutputStream *ost) { AVBSFContext *ctx; int i, ret; if (!ost->nb_bitstream_filters) return 0; for (i = 0; i < ost->nb_bitstream_filters; i++) 
{ ctx = ost->bsf_ctx[i]; ret = avcodec_parameters_copy(ctx->par_in, i ? ost->bsf_ctx[i - 1]->par_out : ost->st->codecpar); if (ret < 0) return ret; ctx->time_base_in = i ? ost->bsf_ctx[i - 1]->time_base_out : ost->st->time_base; ret = av_bsf_init(ctx); if (ret < 0) { av_log(NULL, AV_LOG_ERROR, "Error initializing bitstream filter: %s\n", ost->bsf_ctx[i]->filter->name); return ret; } } ctx = ost->bsf_ctx[ost->nb_bitstream_filters - 1]; ret = avcodec_parameters_copy(ost->st->codecpar, ctx->par_out); if (ret < 0) return ret; ost->st->time_base = ctx->time_base_out; return 0; } static int init_output_stream_streamcopy(OutputStream *ost) { OutputFile *of = output_files[ost->file_index]; InputStream *ist = get_input_stream(ost); AVCodecParameters *par_dst = ost->st->codecpar; AVCodecParameters *par_src = ost->ref_par; AVRational sar; int i, ret; uint64_t extra_size; av_assert0(ist && !ost->filter); avcodec_parameters_to_context(ost->enc_ctx, ist->st->codecpar); ret = av_opt_set_dict(ost->enc_ctx, &ost->encoder_opts); if (ret < 0) { av_log(NULL, AV_LOG_FATAL, "Error setting up codec context options.\n"); return ret; } avcodec_parameters_from_context(par_src, ost->enc_ctx); extra_size = (uint64_t)par_src->extradata_size + AV_INPUT_BUFFER_PADDING_SIZE; if (extra_size > INT_MAX) { return AVERROR(EINVAL); } /* if stream_copy is selected, no need to decode or encode */ par_dst->codec_id = par_src->codec_id; par_dst->codec_type = par_src->codec_type; if (!par_dst->codec_tag) { unsigned int codec_tag; if (!of->ctx->oformat->codec_tag || av_codec_get_id (of->ctx->oformat->codec_tag, par_src->codec_tag) == par_dst->codec_id || !av_codec_get_tag2(of->ctx->oformat->codec_tag, par_src->codec_id, &codec_tag)) par_dst->codec_tag = par_src->codec_tag; } par_dst->bit_rate = par_src->bit_rate; par_dst->field_order = par_src->field_order; par_dst->chroma_location = par_src->chroma_location; if (par_src->extradata_size) { par_dst->extradata = av_mallocz(extra_size); if (!par_dst->extradata) 
        {
            return AVERROR(ENOMEM);
        }
        memcpy(par_dst->extradata, par_src->extradata, par_src->extradata_size);
        par_dst->extradata_size = par_src->extradata_size;
    }

    par_dst->bits_per_coded_sample = par_src->bits_per_coded_sample;
    par_dst->bits_per_raw_sample   = par_src->bits_per_raw_sample;

    if (!ost->frame_rate.num)
        ost->frame_rate = ist->framerate;
    ost->st->avg_frame_rate = ost->frame_rate;

    ret = avformat_transfer_internal_stream_timing_info(of->ctx->oformat, ost->st, ist->st, copy_tb);
    if (ret < 0)
        return ret;

    // copy timebase while removing common factors
    ost->st->time_base = av_add_q(av_stream_get_codec_timebase(ost->st), (AVRational){0, 1});

    if (ist->st->nb_side_data) {
        ost->st->side_data = av_realloc_array(NULL, ist->st->nb_side_data,
                                              sizeof(*ist->st->side_data));
        if (!ost->st->side_data)
            return AVERROR(ENOMEM);

        ost->st->nb_side_data = 0;
        for (i = 0; i < ist->st->nb_side_data; i++) {
            const AVPacketSideData *sd_src = &ist->st->side_data[i];
            AVPacketSideData *sd_dst = &ost->st->side_data[ost->st->nb_side_data];

            if (ost->rotate_overridden && sd_src->type == AV_PKT_DATA_DISPLAYMATRIX)
                continue;

            sd_dst->data = av_malloc(sd_src->size);
            if (!sd_dst->data)
                return AVERROR(ENOMEM);
            memcpy(sd_dst->data, sd_src->data, sd_src->size);
            sd_dst->size = sd_src->size;
            sd_dst->type = sd_src->type;
            ost->st->nb_side_data++;
        }
    }

    ost->parser = av_parser_init(par_dst->codec_id);
    ost->parser_avctx = avcodec_alloc_context3(NULL);
    if (!ost->parser_avctx)
        return AVERROR(ENOMEM);

    switch (par_dst->codec_type) {
    case AVMEDIA_TYPE_AUDIO:
        if (audio_volume != 256) {
            av_log(NULL, AV_LOG_FATAL, "-acodec copy and -vol are incompatible (frames are not decoded)\n");
            exit_program(1);
        }
        par_dst->channel_layout   = par_src->channel_layout;
        par_dst->sample_rate      = par_src->sample_rate;
        par_dst->channels         = par_src->channels;
        par_dst->frame_size       = par_src->frame_size;
        par_dst->block_align      = par_src->block_align;
        par_dst->initial_padding  = par_src->initial_padding;
        par_dst->trailing_padding = par_src->trailing_padding;
        par_dst->profile          = par_src->profile;
        if ((par_dst->block_align == 1 || par_dst->block_align == 1152 || par_dst->block_align == 576) &&
            par_dst->codec_id == AV_CODEC_ID_MP3)
            par_dst->block_align = 0;
        if (par_dst->codec_id == AV_CODEC_ID_AC3)
            par_dst->block_align = 0;
        break;
    case AVMEDIA_TYPE_VIDEO:
        par_dst->format          = par_src->format;
        par_dst->color_space     = par_src->color_space;
        par_dst->color_range     = par_src->color_range;
        par_dst->color_primaries = par_src->color_primaries;
        par_dst->color_trc       = par_src->color_trc;
        par_dst->width           = par_src->width;
        par_dst->height          = par_src->height;
        par_dst->video_delay     = par_src->video_delay;
        par_dst->profile         = par_src->profile;
        if (ost->frame_aspect_ratio.num) { // overridden by the -aspect cli option
            sar = av_mul_q(ost->frame_aspect_ratio,
                           (AVRational){ par_dst->height, par_dst->width });
            av_log(NULL, AV_LOG_WARNING, "Overriding aspect ratio "
                   "with stream copy may produce invalid files\n");
        } else if (ist->st->sample_aspect_ratio.num)
            sar = ist->st->sample_aspect_ratio;
        else
            sar = par_src->sample_aspect_ratio;
        ost->st->sample_aspect_ratio = par_dst->sample_aspect_ratio = sar;
        ost->st->avg_frame_rate = ist->st->avg_frame_rate;
        ost->st->r_frame_rate   = ist->st->r_frame_rate;
        break;
    case AVMEDIA_TYPE_SUBTITLE:
        par_dst->width  = par_src->width;
        par_dst->height = par_src->height;
        break;
    case AVMEDIA_TYPE_UNKNOWN:
    case AVMEDIA_TYPE_DATA:
    case AVMEDIA_TYPE_ATTACHMENT:
        break;
    default:
        abort();
    }

    return 0;
}

static int init_output_stream(OutputStream *ost, char *error, int error_len)
{
    int ret = 0;

    if (ost->encoding_needed) {
        AVCodec      *codec = ost->enc;
        AVCodecContext *dec = NULL;
        InputStream *ist;

        if ((ist = get_input_stream(ost)))
            dec = ist->dec_ctx;
        if (dec && dec->subtitle_header) {
            /* ASS code assumes this buffer is null terminated so add extra byte. */
            ost->enc_ctx->subtitle_header = av_mallocz(dec->subtitle_header_size + 1);
            if (!ost->enc_ctx->subtitle_header)
                return AVERROR(ENOMEM);
            memcpy(ost->enc_ctx->subtitle_header, dec->subtitle_header, dec->subtitle_header_size);
            ost->enc_ctx->subtitle_header_size = dec->subtitle_header_size;
        }
        if (!av_dict_get(ost->encoder_opts, "threads", NULL, 0))
            av_dict_set(&ost->encoder_opts, "threads", "auto", 0);
        if (ost->enc->type == AVMEDIA_TYPE_AUDIO &&
            !codec->defaults &&
            !av_dict_get(ost->encoder_opts, "b", NULL, 0) &&
            !av_dict_get(ost->encoder_opts, "ab", NULL, 0))
            av_dict_set(&ost->encoder_opts, "b", "128000", 0);

        if (ost->filter && ost->filter->filter->inputs[0]->hw_frames_ctx) {
            ost->enc_ctx->hw_frames_ctx = av_buffer_ref(ost->filter->filter->inputs[0]->hw_frames_ctx);
            if (!ost->enc_ctx->hw_frames_ctx)
                return AVERROR(ENOMEM);
        }

        if ((ret = avcodec_open2(ost->enc_ctx, codec, &ost->encoder_opts)) < 0) {
            if (ret == AVERROR_EXPERIMENTAL)
                abort_codec_experimental(codec, 1);
            snprintf(error, error_len,
                     "Error while opening encoder for output stream #%d:%d - "
                     "maybe incorrect parameters such as bit_rate, rate, width or height",
                     ost->file_index, ost->index);
            return ret;
        }
        if (ost->enc->type == AVMEDIA_TYPE_AUDIO &&
            !(ost->enc->capabilities & AV_CODEC_CAP_VARIABLE_FRAME_SIZE))
            av_buffersink_set_frame_size(ost->filter->filter,
                                         ost->enc_ctx->frame_size);
        assert_avoptions(ost->encoder_opts);
        if (ost->enc_ctx->bit_rate && ost->enc_ctx->bit_rate < 1000)
            av_log(NULL, AV_LOG_WARNING, "The bitrate parameter is set too low."
                                         " It takes bits/s as argument, not kbits/s\n");

        ret = avcodec_parameters_from_context(ost->st->codecpar, ost->enc_ctx);
        if (ret < 0) {
            av_log(NULL, AV_LOG_FATAL,
                   "Error initializing the output stream codec context.\n");
            exit_program(1);
        }
        /*
         * FIXME: ost->st->codec shouldn't be needed here anymore.
         */
        ret = avcodec_copy_context(ost->st->codec, ost->enc_ctx);
        if (ret < 0)
            return ret;

        if (ost->enc_ctx->nb_coded_side_data) {
            int i;

            ost->st->side_data = av_realloc_array(NULL, ost->enc_ctx->nb_coded_side_data,
                                                  sizeof(*ost->st->side_data));
            if (!ost->st->side_data)
                return AVERROR(ENOMEM);

            for (i = 0; i < ost->enc_ctx->nb_coded_side_data; i++) {
                const AVPacketSideData *sd_src = &ost->enc_ctx->coded_side_data[i];
                AVPacketSideData *sd_dst = &ost->st->side_data[i];

                sd_dst->data = av_malloc(sd_src->size);
                if (!sd_dst->data)
                    return AVERROR(ENOMEM);
                memcpy(sd_dst->data, sd_src->data, sd_src->size);
                sd_dst->size = sd_src->size;
                sd_dst->type = sd_src->type;
                ost->st->nb_side_data++;
            }
        }

        // copy timebase while removing common factors
        ost->st->time_base = av_add_q(ost->enc_ctx->time_base, (AVRational){0, 1});
        ost->st->codec->codec = ost->enc_ctx->codec;
    } else if (ost->stream_copy) {
        ret = init_output_stream_streamcopy(ost);
        if (ret < 0)
            return ret;

        /*
         * FIXME: the codec context used by the parser during streamcopy
         * should go away with the new parser API.
         */
        ret = avcodec_parameters_to_context(ost->parser_avctx, ost->st->codecpar);
        if (ret < 0)
            return ret;
    }

    /* initialize bitstream filters for the output stream
     * needs to be done here, because the codec id for streamcopy is not
     * known until now */
    ret = init_output_bsfs(ost);
    if (ret < 0)
        return ret;

    ost->initialized = 1;

    ret = check_init_output_file(output_files[ost->file_index], ost->file_index);
    if (ret < 0)
        return ret;

    return ret;
}

static void parse_forced_key_frames(char *kf, OutputStream *ost,
                                    AVCodecContext *avctx)
{
    char *p;
    int n = 1, i, size, index = 0;
    int64_t t, *pts;

    for (p = kf; *p; p++)
        if (*p == ',')
            n++;
    size = n;
    pts = av_malloc_array(size, sizeof(*pts));
    if (!pts) {
        av_log(NULL, AV_LOG_FATAL, "Could not allocate forced key frames array.\n");
        exit_program(1);
    }

    p = kf;
    for (i = 0; i < n; i++) {
        char *next = strchr(p, ',');

        if (next)
            *next++ = 0;

        if (!memcmp(p, "chapters", 8)) {
            AVFormatContext *avf = output_files[ost->file_index]->ctx;
            int j;

            if (avf->nb_chapters > INT_MAX - size ||
                !(pts = av_realloc_f(pts, size += avf->nb_chapters - 1,
                                     sizeof(*pts)))) {
                av_log(NULL, AV_LOG_FATAL,
                       "Could not allocate forced key frames array.\n");
                exit_program(1);
            }
            t = p[8] ? parse_time_or_die("force_key_frames", p + 8, 1) : 0;
            t = av_rescale_q(t, AV_TIME_BASE_Q, avctx->time_base);

            for (j = 0; j < avf->nb_chapters; j++) {
                AVChapter *c = avf->chapters[j];
                av_assert1(index < size);
                pts[index++] = av_rescale_q(c->start, c->time_base,
                                            avctx->time_base) + t;
            }
        } else {
            t = parse_time_or_die("force_key_frames", p, 1);
            av_assert1(index < size);
            pts[index++] = av_rescale_q(t, AV_TIME_BASE_Q, avctx->time_base);
        }

        p = next;
    }

    av_assert0(index == size);
    qsort(pts, size, sizeof(*pts), compare_int64);
    ost->forced_kf_count = size;
    ost->forced_kf_pts   = pts;
}

static void report_new_stream(int input_index, AVPacket *pkt)
{
    InputFile *file = input_files[input_index];
    AVStream *st = file->ctx->streams[pkt->stream_index];

    if (pkt->stream_index < file->nb_streams_warn)
        return;
    av_log(file->ctx, AV_LOG_WARNING,
           "New %s stream %d:%d at pos:%"PRId64" and DTS:%ss\n",
           av_get_media_type_string(st->codecpar->codec_type),
           input_index, pkt->stream_index,
           pkt->pos, av_ts2timestr(pkt->dts, &st->time_base));
    file->nb_streams_warn = pkt->stream_index + 1;
}

static void set_encoder_id(OutputFile *of, OutputStream *ost)
{
    AVDictionaryEntry *e;

    uint8_t *encoder_string;
    int encoder_string_len;
    int format_flags = 0;
    int codec_flags = 0;

    if (av_dict_get(ost->st->metadata, "encoder", NULL, 0))
        return;

    e = av_dict_get(of->opts, "fflags", NULL, 0);
    if (e) {
        const AVOption *o = av_opt_find(of->ctx, "fflags", NULL, 0, 0);
        if (!o)
            return;
        av_opt_eval_flags(of->ctx, o, e->value, &format_flags);
    }
    e = av_dict_get(ost->encoder_opts, "flags", NULL, 0);
    if (e) {
        const AVOption *o = av_opt_find(ost->enc_ctx, "flags", NULL, 0, 0);
        if (!o)
            return;
        av_opt_eval_flags(ost->enc_ctx, o, e->value, &codec_flags);
    }

    encoder_string_len = sizeof(LIBAVCODEC_IDENT) + strlen(ost->enc->name) + 2;
    encoder_string     = av_mallocz(encoder_string_len);
    if (!encoder_string)
        exit_program(1);

    if (!(format_flags & AVFMT_FLAG_BITEXACT) && !(codec_flags & AV_CODEC_FLAG_BITEXACT))
        av_strlcpy(encoder_string, LIBAVCODEC_IDENT " ", encoder_string_len);
    else
        av_strlcpy(encoder_string, "Lavc ", encoder_string_len);
    av_strlcat(encoder_string, ost->enc->name, encoder_string_len);
    av_dict_set(&ost->st->metadata, "encoder", encoder_string,
                AV_DICT_DONT_STRDUP_VAL | AV_DICT_DONT_OVERWRITE);
}

static int transcode_init(void)
{
    int ret = 0, i, j, k;
    AVFormatContext *oc;
    OutputStream *ost;
    InputStream *ist;
    char error[1024] = {0};

    for (i = 0; i < nb_filtergraphs; i++) {
        FilterGraph *fg = filtergraphs[i];
        for (j = 0; j < fg->nb_outputs; j++) {
            OutputFilter *ofilter = fg->outputs[j];
            if (!ofilter->ost || ofilter->ost->source_index >= 0)
                continue;
            if (fg->nb_inputs != 1)
                continue;
            for (k = nb_input_streams-1; k >= 0 ; k--)
                if (fg->inputs[0]->ist == input_streams[k])
                    break;
            ofilter->ost->source_index = k;
        }
    }

    /* init framerate emulation */
    for (i = 0; i < nb_input_files; i++) {
        InputFile *ifile = input_files[i];
        if (ifile->rate_emu)
            for (j = 0; j < ifile->nb_streams; j++)
                input_streams[j + ifile->ist_index]->start = av_gettime_relative();
    }

    /* for each output stream, we compute the right encoding parameters */
    for (i = 0; i < nb_output_streams; i++) {
        ost = output_streams[i];
        oc  = output_files[ost->file_index]->ctx;
        ist = get_input_stream(ost);

        if (ost->attachment_filename)
            continue;

        if (ist) {
            ost->st->disposition = ist->st->disposition;
        } else {
            for (j = 0; j < oc->nb_streams; j++) {
                AVStream *st = oc->streams[j];
                if (st != ost->st && st->codecpar->codec_type == ost->st->codecpar->codec_type)
                    break;
            }
            if (j == oc->nb_streams)
                if (ost->st->codecpar->codec_type == AVMEDIA_TYPE_AUDIO ||
                    ost->st->codecpar->codec_type == AVMEDIA_TYPE_VIDEO)
                    ost->st->disposition = AV_DISPOSITION_DEFAULT;
        }

        if (!ost->stream_copy) {
            AVCodecContext *enc_ctx = ost->enc_ctx;
            AVCodecContext *dec_ctx = NULL;

            set_encoder_id(output_files[ost->file_index], ost);

            if (ist) {
                dec_ctx = ist->dec_ctx;

                enc_ctx->chroma_sample_location = dec_ctx->chroma_sample_location;
            }

#if CONFIG_LIBMFX
            if (qsv_transcode_init(ost))
                exit_program(1);
#endif

#if CONFIG_CUVID
            if (cuvid_transcode_init(ost))
                exit_program(1);
#endif

            if ((enc_ctx->codec_type == AVMEDIA_TYPE_VIDEO ||
                 enc_ctx->codec_type == AVMEDIA_TYPE_AUDIO) &&
                filtergraph_is_simple(ost->filter->graph)) {
                FilterGraph *fg = ost->filter->graph;
                if (configure_filtergraph(fg)) {
                    av_log(NULL, AV_LOG_FATAL, "Error opening filters!\n");
                    exit_program(1);
                }
            }

            if (enc_ctx->codec_type == AVMEDIA_TYPE_VIDEO) {
                if (!ost->frame_rate.num)
                    ost->frame_rate = av_buffersink_get_frame_rate(ost->filter->filter);
                if (ist && !ost->frame_rate.num)
                    ost->frame_rate = ist->framerate;
                if (ist && !ost->frame_rate.num)
                    ost->frame_rate = ist->st->r_frame_rate;
                if (ist && !ost->frame_rate.num) {
                    ost->frame_rate = (AVRational){25, 1};
                    av_log(NULL, AV_LOG_WARNING,
                           "No information "
                           "about the input framerate is available. Falling "
                           "back to a default value of 25fps for output stream #%d:%d. Use the -r option "
                           "if you want a different framerate.\n",
                           ost->file_index, ost->index);
                }
//                ost->frame_rate = ist->st->avg_frame_rate.num ? ist->st->avg_frame_rate : (AVRational){25, 1};
                if (ost->enc && ost->enc->supported_framerates && !ost->force_fps) {
                    int idx = av_find_nearest_q_idx(ost->frame_rate, ost->enc->supported_framerates);
                    ost->frame_rate = ost->enc->supported_framerates[idx];
                }
                // reduce frame rate for mpeg4 to be within the spec limits
                if (enc_ctx->codec_id == AV_CODEC_ID_MPEG4) {
                    av_reduce(&ost->frame_rate.num, &ost->frame_rate.den,
                              ost->frame_rate.num, ost->frame_rate.den, 65535);
                }
            }

            switch (enc_ctx->codec_type) {
            case AVMEDIA_TYPE_AUDIO:
                enc_ctx->sample_fmt = ost->filter->filter->inputs[0]->format;
                if (dec_ctx)
                    enc_ctx->bits_per_raw_sample = FFMIN(dec_ctx->bits_per_raw_sample,
                                                         av_get_bytes_per_sample(enc_ctx->sample_fmt) << 3);
                enc_ctx->sample_rate    = ost->filter->filter->inputs[0]->sample_rate;
                enc_ctx->channel_layout = ost->filter->filter->inputs[0]->channel_layout;
                enc_ctx->channels       = avfilter_link_get_channels(ost->filter->filter->inputs[0]);
                enc_ctx->time_base      = (AVRational){ 1, enc_ctx->sample_rate };
                break;
            case AVMEDIA_TYPE_VIDEO:
                enc_ctx->time_base = av_inv_q(ost->frame_rate);
                if (!(enc_ctx->time_base.num && enc_ctx->time_base.den))
                    enc_ctx->time_base = ost->filter->filter->inputs[0]->time_base;
                if (   av_q2d(enc_ctx->time_base) < 0.001 && video_sync_method != VSYNC_PASSTHROUGH
                   && (video_sync_method == VSYNC_CFR || video_sync_method == VSYNC_VSCFR ||
                       (video_sync_method == VSYNC_AUTO && !(oc->oformat->flags & AVFMT_VARIABLE_FPS)))){
                    av_log(oc, AV_LOG_WARNING, "Frame rate very high for a muxer not efficiently supporting it.\n"
                                               "Please consider specifying a lower framerate, a different muxer or -vsync 2\n");
                }
                for (j = 0; j < ost->forced_kf_count; j++)
                    ost->forced_kf_pts[j] = av_rescale_q(ost->forced_kf_pts[j],
                                                         AV_TIME_BASE_Q,
                                                         enc_ctx->time_base);

                enc_ctx->width  = ost->filter->filter->inputs[0]->w;
                enc_ctx->height = ost->filter->filter->inputs[0]->h;
                enc_ctx->sample_aspect_ratio = ost->st->sample_aspect_ratio =
                    ost->frame_aspect_ratio.num ? // overridden by the -aspect cli option
                    av_mul_q(ost->frame_aspect_ratio, (AVRational){ enc_ctx->height, enc_ctx->width }) :
                    ost->filter->filter->inputs[0]->sample_aspect_ratio;

                if (!strncmp(ost->enc->name, "libx264", 7) &&
                    enc_ctx->pix_fmt == AV_PIX_FMT_NONE &&
                    ost->filter->filter->inputs[0]->format != AV_PIX_FMT_YUV420P)
                    av_log(NULL, AV_LOG_WARNING,
                           "No pixel format specified, %s for H.264 encoding chosen.\n"
                           "Use -pix_fmt yuv420p for compatibility with outdated media players.\n",
                           av_get_pix_fmt_name(ost->filter->filter->inputs[0]->format));
                if (!strncmp(ost->enc->name, "mpeg2video", 10) &&
                    enc_ctx->pix_fmt == AV_PIX_FMT_NONE &&
                    ost->filter->filter->inputs[0]->format != AV_PIX_FMT_YUV420P)
                    av_log(NULL, AV_LOG_WARNING,
                           "No pixel format specified, %s for MPEG-2 encoding chosen.\n"
                           "Use -pix_fmt yuv420p for compatibility with outdated media players.\n",
                           av_get_pix_fmt_name(ost->filter->filter->inputs[0]->format));
                enc_ctx->pix_fmt = ost->filter->filter->inputs[0]->format;
                if (dec_ctx)
                    enc_ctx->bits_per_raw_sample = FFMIN(dec_ctx->bits_per_raw_sample,
                                                         av_pix_fmt_desc_get(enc_ctx->pix_fmt)->comp[0].depth);

                ost->st->avg_frame_rate = ost->frame_rate;

                if (!dec_ctx ||
                    enc_ctx->width   != dec_ctx->width  ||
                    enc_ctx->height  != dec_ctx->height ||
                    enc_ctx->pix_fmt != dec_ctx->pix_fmt) {
                    enc_ctx->bits_per_raw_sample = frame_bits_per_raw_sample;
                }

                if (ost->forced_keyframes) {
                    if (!strncmp(ost->forced_keyframes, "expr:", 5)) {
                        ret = av_expr_parse(&ost->forced_keyframes_pexpr, ost->forced_keyframes+5,
                                            forced_keyframes_const_names, NULL, NULL, NULL, NULL, 0, NULL);
                        if (ret < 0) {
                            av_log(NULL, AV_LOG_ERROR,
                                   "Invalid force_key_frames expression '%s'\n", ost->forced_keyframes+5);
                            return ret;
                        }
                        ost->forced_keyframes_expr_const_values[FKF_N] = 0;
                        ost->forced_keyframes_expr_const_values[FKF_N_FORCED] = 0;
                        ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_N] = NAN;
                        ost->forced_keyframes_expr_const_values[FKF_PREV_FORCED_T] = NAN;

                    // Don't parse the 'forced_keyframes' in case of 'keep-source-keyframes',
                    // parse it only for static kf timings
                    } else if (strncmp(ost->forced_keyframes, "source", 6)) {
                        parse_forced_key_frames(ost->forced_keyframes, ost, ost->enc_ctx);
                    }
                }
                break;
            case AVMEDIA_TYPE_SUBTITLE:
                enc_ctx->time_base = (AVRational){1, 1000};
                if (!enc_ctx->width) {
                    enc_ctx->width  = input_streams[ost->source_index]->st->codecpar->width;
                    enc_ctx->height = input_streams[ost->source_index]->st->codecpar->height;
                }
                break;
            case AVMEDIA_TYPE_DATA:
                break;
            default:
                abort();
                break;
            }
        }

        if (ost->disposition) {
            static const AVOption opts[] = {
                { "disposition"      , NULL, 0, AV_OPT_TYPE_FLAGS, { .i64 = 0 }, INT64_MIN, INT64_MAX, .unit = "flags" },
                { "default"          , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_DEFAULT          }, .unit = "flags" },
                { "dub"              , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_DUB              }, .unit = "flags" },
                { "original"         , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_ORIGINAL         }, .unit = "flags" },
                { "comment"          , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_COMMENT          }, .unit = "flags" },
                { "lyrics"           , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_LYRICS           }, .unit = "flags" },
                { "karaoke"          , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_KARAOKE          }, .unit = "flags" },
                { "forced"           , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_FORCED           }, .unit = "flags" },
                { "hearing_impaired" , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_HEARING_IMPAIRED }, .unit = "flags" },
                { "visual_impaired"  , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_VISUAL_IMPAIRED  }, .unit = "flags" },
                { "clean_effects"    , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_CLEAN_EFFECTS    }, .unit = "flags" },
                { "captions"         , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_CAPTIONS         }, .unit = "flags" },
                { "descriptions"     , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_DESCRIPTIONS     }, .unit = "flags" },
                { "metadata"         , NULL, 0, AV_OPT_TYPE_CONST, { .i64 = AV_DISPOSITION_METADATA         }, .unit = "flags" },
                { NULL },
            };
            static const AVClass class = {
                .class_name = "",
                .item_name  = av_default_item_name,
                .option     = opts,
                .version    = LIBAVUTIL_VERSION_INT,
            };
            const AVClass *pclass = &class;

            ret = av_opt_eval_flags(&pclass, &opts[0], ost->disposition, &ost->st->disposition);
            if (ret < 0)
                goto dump_format;
        }
    }

    /* init input streams */
    for (i = 0; i < nb_input_streams; i++)
        if ((ret = init_input_stream(i, error, sizeof(error))) < 0) {
            for (i = 0; i < nb_output_streams; i++) {
                ost = output_streams[i];
                avcodec_close(ost->enc_ctx);
            }
            goto dump_format;
        }

    /* open each encoder */
    for (i = 0; i < nb_output_streams; i++) {
        ret = init_output_stream(output_streams[i], error, sizeof(error));
        if (ret < 0)
            goto dump_format;
    }

    /* discard unused programs */
    for (i = 0; i < nb_input_files; i++) {
        InputFile *ifile = input_files[i];
        for (j = 0; j < ifile->ctx->nb_programs; j++) {
            AVProgram *p = ifile->ctx->programs[j];
            int discard  = AVDISCARD_ALL;

            for (k = 0; k < p->nb_stream_indexes; k++)
                if (!input_streams[ifile->ist_index + p->stream_index[k]]->discard) {
                    discard = AVDISCARD_DEFAULT;
                    break;
                }
            p->discard = discard;
        }
    }

    /* write headers for files with no streams */
    for (i = 0; i < nb_output_files; i++) {
        oc = output_files[i]->ctx;
        if (oc->oformat->flags & AVFMT_NOSTREAMS && oc->nb_streams == 0) {
            ret = check_init_output_file(output_files[i], i);
            if (ret < 0)
                goto dump_format;
        }
    }

 dump_format:
    /* dump the stream mapping */
    av_log(NULL, AV_LOG_INFO, "Stream mapping:\n");
    for (i = 0; i < nb_input_streams; i++) {
        ist = input_streams[i];

        for (j = 0; j < ist->nb_filters; j++) {
            if (!filtergraph_is_simple(ist->filters[j]->graph)) {
                av_log(NULL, AV_LOG_INFO, "  Stream #%d:%d (%s) -> %s",
                       ist->file_index, ist->st->index, ist->dec ? ist->dec->name : "?",
                       ist->filters[j]->name);
                if (nb_filtergraphs > 1)
                    av_log(NULL, AV_LOG_INFO, " (graph %d)", ist->filters[j]->graph->index);
                av_log(NULL, AV_LOG_INFO, "\n");
            }
        }
    }

    for (i = 0; i < nb_output_streams; i++) {
        ost = output_streams[i];

        if (ost->attachment_filename) {
            /* an attached file */
            av_log(NULL, AV_LOG_INFO, "  File %s -> Stream #%d:%d\n",
                   ost->attachment_filename, ost->file_index, ost->index);
            continue;
        }

        if (ost->filter && !filtergraph_is_simple(ost->filter->graph)) {
            /* output from a complex graph */
            av_log(NULL, AV_LOG_INFO, "  %s", ost->filter->name);
            if (nb_filtergraphs > 1)
                av_log(NULL, AV_LOG_INFO, " (graph %d)", ost->filter->graph->index);

            av_log(NULL, AV_LOG_INFO, " -> Stream #%d:%d (%s)\n", ost->file_index,
                   ost->index, ost->enc ? ost->enc->name : "?");
            continue;
        }

        av_log(NULL, AV_LOG_INFO, "  Stream #%d:%d -> #%d:%d",
               input_streams[ost->source_index]->file_index,
               input_streams[ost->source_index]->st->index,
               ost->file_index,
               ost->index);
        if (ost->sync_ist != input_streams[ost->source_index])
            av_log(NULL, AV_LOG_INFO, " [sync #%d:%d]",
                   ost->sync_ist->file_index,
                   ost->sync_ist->st->index);
        if (ost->stream_copy)
            av_log(NULL, AV_LOG_INFO, " (copy)");
        else {
            const AVCodec *in_codec    = input_streams[ost->source_index]->dec;
            const AVCodec *out_codec   = ost->enc;
            const char *decoder_name   = "?";
            const char *in_codec_name  = "?";
            const char *encoder_name   = "?";
            const char *out_codec_name = "?";
            const AVCodecDescriptor *desc;

            if (in_codec) {
                decoder_name = in_codec->name;
                desc = avcodec_descriptor_get(in_codec->id);
                if (desc)
                    in_codec_name = desc->name;
                if (!strcmp(decoder_name, in_codec_name))
                    decoder_name = "native";
            }

            if (out_codec) {
                encoder_name = out_codec->name;
                desc = avcodec_descriptor_get(out_codec->id);
                if (desc)
                    out_codec_name = desc->name;
                if (!strcmp(encoder_name, out_codec_name))
                    encoder_name = "native";
            }

            av_log(NULL, AV_LOG_INFO, " (%s (%s) -> %s (%s))",
                   in_codec_name, decoder_name,
                   out_codec_name, encoder_name);
        }
        av_log(NULL, AV_LOG_INFO, "\n");
    }

    if (ret) {
        av_log(NULL, AV_LOG_ERROR, "%s\n", error);
        return ret;
    }

    transcode_init_done = 1;

    return 0;
}

/* Return 1 if there remain streams where more output is wanted, 0 otherwise. */
static int need_output(void)
{
    int i;

    for (i = 0; i < nb_output_streams; i++) {
        OutputStream *ost   = output_streams[i];
        OutputFile *of      = output_files[ost->file_index];
        AVFormatContext *os = output_files[ost->file_index]->ctx;

        if (ost->finished ||
            (os->pb && avio_tell(os->pb) >= of->limit_filesize))
            continue;
        if (ost->frame_number >= ost->max_frames) {
            int j;
            for (j = 0; j < of->ctx->nb_streams; j++)
                close_output_stream(output_streams[of->ost_index + j]);
            continue;
        }

        return 1;
    }

    return 0;
}

/**
 * Select the output stream to process.
 *
 * @return selected output stream, or NULL if none available
 */
static OutputStream *choose_output(void)
{
    int i;
    int64_t opts_min = INT64_MAX;
    OutputStream *ost_min = NULL;

    for (i = 0; i < nb_output_streams; i++) {
        OutputStream *ost = output_streams[i];
        int64_t opts = ost->st->cur_dts == AV_NOPTS_VALUE ? INT64_MIN :
                       av_rescale_q(ost->st->cur_dts, ost->st->time_base,
                                    AV_TIME_BASE_Q);
        if (ost->st->cur_dts == AV_NOPTS_VALUE)
            av_log(NULL, AV_LOG_DEBUG, "cur_dts is invalid (this is harmless if it occurs once at the start per stream)\n");

        if (!ost->finished && opts < opts_min) {
            opts_min = opts;
            ost_min  = ost->unavailable ? NULL : ost;
        }
    }
    return ost_min;
}

static void set_tty_echo(int on)
{
#if HAVE_TERMIOS_H
    struct termios tty;
    if (tcgetattr(0, &tty) == 0) {
        if (on) tty.c_lflag |= ECHO;
        else    tty.c_lflag &= ~ECHO;
        tcsetattr(0, TCSANOW, &tty);
    }
#endif
}

static int check_keyboard_interaction(int64_t cur_time)
{
    int i, ret, key;
    static int64_t last_time;
    if (received_nb_signals)
        return AVERROR_EXIT;
    /* read_key() returns 0 on EOF */
    if (cur_time - last_time >= 100000 && !run_as_daemon){
        key = read_key();
        last_time = cur_time;
    }else
        key = -1;
    if (key == 'q')
        return AVERROR_EXIT;
    if (key == '+') av_log_set_level(av_log_get_level()+10);
    if (key == '-') av_log_set_level(av_log_get_level()-10);
    if (key == 's') qp_hist ^= 1;
    if (key == 'h'){
        if (do_hex_dump){
            do_hex_dump = do_pkt_dump = 0;
        } else if(do_pkt_dump){
            do_hex_dump = 1;
        } else
            do_pkt_dump = 1;
        av_log_set_level(AV_LOG_DEBUG);
    }
    if (key == 'c' || key == 'C'){
        char buf[4096], target[64], command[256], arg[256] = {0};
        double time;
        int k, n = 0;
        fprintf(stderr, "\nEnter command: <target>|all <time>|-1 <command>[ <argument>]\n");
        i = 0;
        set_tty_echo(1);
        while ((k = read_key()) != '\n' && k != '\r' && i < sizeof(buf)-1)
            if (k > 0)
                buf[i++] = k;
        buf[i] = 0;
        set_tty_echo(0);
        fprintf(stderr, "\n");
        if (k > 0 &&
            (n = sscanf(buf, "%63[^ ] %lf %255[^ ] %255[^\n]", target, &time, command, arg)) >= 3) {
            av_log(NULL, AV_LOG_DEBUG, "Processing command target:%s time:%f command:%s arg:%s",
                   target, time, command, arg);
            for (i = 0; i < nb_filtergraphs; i++) {
                FilterGraph *fg = filtergraphs[i];
                if (fg->graph) {
                    if (time < 0) {
                        ret = avfilter_graph_send_command(fg->graph, target, command, arg, buf, sizeof(buf),
                                                          key == 'c' ? AVFILTER_CMD_FLAG_ONE : 0);
                        fprintf(stderr, "Command reply for stream %d: ret:%d res:\n%s",
                                i, ret, buf);
                    } else if (key == 'c') {
                        fprintf(stderr, "Queuing commands only on filters supporting the specific command is unsupported\n");
                        ret = AVERROR_PATCHWELCOME;
                    } else {
                        ret = avfilter_graph_queue_command(fg->graph, target, command, arg, 0, time);
                        if (ret < 0)
                            fprintf(stderr, "Queuing command failed with error %s\n", av_err2str(ret));
                    }
                }
            }
        } else {
            av_log(NULL, AV_LOG_ERROR,
                   "Parse error, at least 3 arguments were expected, "
                   "only %d given in string '%s'\n", n, buf);
        }
    }
    if (key == 'd' || key == 'D'){
        int debug=0;
        if(key == 'D') {
            debug = input_streams[0]->st->codec->debug<<1;
            if(!debug) debug = 1;
            while(debug & (FF_DEBUG_DCT_COEFF|FF_DEBUG_VIS_QP|FF_DEBUG_VIS_MB_TYPE)) //unsupported, would just crash
                debug += debug;
        }else{
            char buf[32];
            int k = 0;
            i = 0;
            set_tty_echo(1);
            while ((k = read_key()) != '\n' && k != '\r' && i < sizeof(buf)-1)
                if (k > 0)
                    buf[i++] = k;
            buf[i] = 0;
            set_tty_echo(0);
            fprintf(stderr, "\n");
            if (k <= 0 || sscanf(buf, "%d", &debug)!=1)
                fprintf(stderr,"error parsing debug value\n");
        }
        for(i=0;i<nb_input_streams;i++) {
            input_streams[i]->st->codec->debug = debug;
        }
        for(i=0;i<nb_output_streams;i++) {
            OutputStream *ost = output_streams[i];
            ost->enc_ctx->debug = debug;
        }
        if(debug) av_log_set_level(AV_LOG_DEBUG);
        fprintf(stderr,"debug=%d\n", debug);
    }
    if (key == '?'){
        fprintf(stderr, "key    function\n"
                        "?      show this help\n"
                        "+      increase verbosity\n"
                        "-      decrease verbosity\n"
                        "c      Send command to first matching filter supporting it\n"
                        "C      Send/Queue command to all matching filters\n"
                        "D      cycle through available debug modes\n"
                        "h      dump packets/hex press to cycle through the 3 states\n"
                        "q      quit\n"
                        "s      Show QP histogram\n"
        );
    }
    return 0;
}

#if HAVE_PTHREADS
static void *input_thread(void *arg)
{
    InputFile *f = arg;
    unsigned flags = f->non_blocking ? AV_THREAD_MESSAGE_NONBLOCK : 0;
    int ret = 0;

    while (1) {
        AVPacket pkt;

        ret = av_read_frame(f->ctx, &pkt);
        if (ret == AVERROR(EAGAIN)) {
            av_usleep(10000);
            continue;
        }
        if (ret < 0) {
            av_thread_message_queue_set_err_recv(f->in_thread_queue, ret);
            break;
        }
        ret = av_thread_message_queue_send(f->in_thread_queue, &pkt, flags);
        if (flags && ret == AVERROR(EAGAIN)) {
            flags = 0;
            ret = av_thread_message_queue_send(f->in_thread_queue, &pkt, flags);
            av_log(f->ctx, AV_LOG_WARNING,
                   "Thread message queue blocking; consider raising the "
                   "thread_queue_size option (current value: %d)\n",
                   f->thread_queue_size);
        }
        if (ret < 0) {
            if (ret != AVERROR_EOF)
                av_log(f->ctx, AV_LOG_ERROR,
                       "Unable to send packet to main thread: %s\n",
                       av_err2str(ret));
            av_packet_unref(&pkt);
            av_thread_message_queue_set_err_recv(f->in_thread_queue, ret);
            break;
        }
    }

    return NULL;
}

static void free_input_threads(void)
{
    int i;

    for (i = 0; i < nb_input_files; i++) {
        InputFile *f = input_files[i];
        AVPacket pkt;

        if (!f || !f->in_thread_queue)
            continue;
        av_thread_message_queue_set_err_send(f->in_thread_queue, AVERROR_EOF);
        while (av_thread_message_queue_recv(f->in_thread_queue, &pkt, 0) >= 0)
            av_packet_unref(&pkt);

        pthread_join(f->thread, NULL);
        f->joined = 1;
        av_thread_message_queue_free(&f->in_thread_queue);
    }
}

static int init_input_threads(void)
{
    int i, ret;

    if (nb_input_files == 1)
        return 0;

    for (i = 0; i < nb_input_files; i++) {
        InputFile *f = input_files[i];

        if (f->ctx->pb ? !f->ctx->pb->seekable :
            strcmp(f->ctx->iformat->name, "lavfi"))
            f->non_blocking = 1;
        ret = av_thread_message_queue_alloc(&f->in_thread_queue,
                                            f->thread_queue_size, sizeof(AVPacket));
        if (ret < 0)
            return ret;

        if ((ret = pthread_create(&f->thread, NULL, input_thread, f))) {
            av_log(NULL, AV_LOG_ERROR, "pthread_create failed: %s. Try to increase `ulimit -v` or decrease `ulimit -s`.\n", strerror(ret));
            av_thread_message_queue_free(&f->in_thread_queue);
            return AVERROR(ret);
        }
    }
    return 0;
}

static int get_input_packet_mt(InputFile *f, AVPacket *pkt)
{
    return av_thread_message_queue_recv(f->in_thread_queue, pkt,
                                        f->non_blocking ?
                                        AV_THREAD_MESSAGE_NONBLOCK : 0);
}
#endif

static int get_input_packet(InputFile *f, AVPacket *pkt)
{
    if (f->rate_emu) {
        int i;
        for (i = 0; i < f->nb_streams; i++) {
            InputStream *ist = input_streams[f->ist_index + i];
            int64_t pts = av_rescale(ist->dts, 1000000, AV_TIME_BASE);
            int64_t now = av_gettime_relative() - ist->start;
            if (pts > now)
                return AVERROR(EAGAIN);
        }
    }

#if HAVE_PTHREADS
    if (nb_input_files > 1)
        return get_input_packet_mt(f, pkt);
#endif
    return av_read_frame(f->ctx, pkt);
}

static int got_eagain(void)
{
    int i;
    for (i = 0; i < nb_output_streams; i++)
        if (output_streams[i]->unavailable)
            return 1;
    return 0;
}

static void reset_eagain(void)
{
    int i;
    for (i = 0; i < nb_input_files; i++)
        input_files[i]->eagain = 0;
    for (i = 0; i < nb_output_streams; i++)
        output_streams[i]->unavailable = 0;
}

// set duration to max(tmp, duration) in a proper time base and return duration's time_base
static AVRational duration_max(int64_t tmp, int64_t *duration, AVRational tmp_time_base,
                               AVRational time_base)
{
    int ret;

    if (!*duration) {
        *duration = tmp;
        return tmp_time_base;
    }

    ret = av_compare_ts(*duration, time_base, tmp, tmp_time_base);
    if (ret < 0) {
        *duration = tmp;
        return tmp_time_base;
    }

    return time_base;
}

static int seek_to_start(InputFile *ifile, AVFormatContext *is)
{
    InputStream *ist;
    AVCodecContext *avctx;
    int i, ret, has_audio = 0;
    int64_t duration = 0;

    ret = av_seek_frame(is, -1, is->start_time, 0);
    if (ret < 0)
        return ret;

    for (i = 0; i < ifile->nb_streams; i++) {
        ist   = input_streams[ifile->ist_index + i];
        avctx = ist->dec_ctx;

        // flush decoders
        if (ist->decoding_needed) {
            process_input_packet(ist, NULL, 1);
            avcodec_flush_buffers(avctx);
        }

        /* duration is the length of the last frame in a stream
         * when audio stream is present we don't care about
         * last video frame length because it's not defined exactly */
        if (avctx->codec_type == AVMEDIA_TYPE_AUDIO && ist->nb_samples)
            has_audio = 1;
    }

    for (i = 0; i < ifile->nb_streams; i++) {
        ist   = input_streams[ifile->ist_index + i];
        avctx = ist->dec_ctx;

        if (has_audio) {
            if (avctx->codec_type == AVMEDIA_TYPE_AUDIO && ist->nb_samples) {
                AVRational sample_rate = {1, avctx->sample_rate};

                duration = av_rescale_q(ist->nb_samples, sample_rate, ist->st->time_base);
            } else
                continue;
        } else {
            if (ist->framerate.num) {
                duration = av_rescale_q(1, ist->framerate, ist->st->time_base);
            } else if (ist->st->avg_frame_rate.num) {
                duration = av_rescale_q(1, ist->st->avg_frame_rate, ist->st->time_base);
            } else
                duration = 1;
        }
        if (!ifile->duration)
            ifile->time_base = ist->st->time_base;
        /* the total duration of the stream, max_pts - min_pts is
         * the duration of the stream without the last frame */
        duration += ist->max_pts - ist->min_pts;
        ifile->time_base = duration_max(duration, &ifile->duration, ist->st->time_base,
                                        ifile->time_base);
    }

    if (ifile->loop > 0)
        ifile->loop--;

    return ret;
}

/*
 * Return
 * - 0 -- one packet was read and processed
 * - AVERROR(EAGAIN) -- no packets were available for selected file,
 *   this function should be called again
 * - AVERROR_EOF -- this function should not be called again
 */
static int process_input(int file_index)
{
    InputFile *ifile = input_files[file_index];
    AVFormatContext *is;
    InputStream *ist;
    AVPacket pkt;
    int ret, i, j;
    int64_t duration;
    int64_t pkt_dts;

    is  = ifile->ctx;
    ret = get_input_packet(ifile, &pkt);

    if (ret == AVERROR(EAGAIN)) {
        ifile->eagain = 1;
        return ret;
    }
    if (ret < 0 && ifile->loop) {
        if ((ret = seek_to_start(ifile, is)) < 0)
            return ret;
        ret = get_input_packet(ifile, &pkt);
        if (ret == AVERROR(EAGAIN)) {
            ifile->eagain = 1;
            return ret;
        }
    }
    if (ret < 0) {
        if (ret != AVERROR_EOF) {
            print_error(is->filename, ret);
            if (exit_on_error)
                exit_program(1);
        }

        for (i = 0; i < ifile->nb_streams; i++) {
            ist = input_streams[ifile->ist_index + i];
            if (ist->decoding_needed) {
                ret = process_input_packet(ist, NULL, 0);
                if (ret > 0)
                    return 0;
            }

            /* mark all outputs that don't go through lavfi as finished */
            for (j = 0; j < nb_output_streams; j++) {
                OutputStream *ost = output_streams[j];

                if (ost->source_index == ifile->ist_index + i &&
                    (ost->stream_copy || ost->enc->type == AVMEDIA_TYPE_SUBTITLE))
                    finish_output_stream(ost);
            }
        }

        ifile->eof_reached = 1;
        return AVERROR(EAGAIN);
    }

    reset_eagain();

    if (do_pkt_dump) {
        av_pkt_dump_log2(NULL, AV_LOG_INFO, &pkt, do_hex_dump,
                         is->streams[pkt.stream_index]);
    }
    /* the following test is needed in case new streams appear
       dynamically in stream : we ignore them */
    if (pkt.stream_index >= ifile->nb_streams) {
        report_new_stream(file_index, &pkt);
        goto discard_packet;
    }

    ist = input_streams[ifile->ist_index + pkt.stream_index];

    ist->data_size += pkt.size;
    ist->nb_packets++;

    if (ist->discard)
        goto discard_packet;

    if (exit_on_error && (pkt.flags & AV_PKT_FLAG_CORRUPT)) {
        av_log(NULL, AV_LOG_FATAL, "%s: corrupt input packet in stream %d\n", is->filename, pkt.stream_index);
        exit_program(1);
    }

    if (debug_ts) {
        av_log(NULL, AV_LOG_INFO, "demuxer -> ist_index:%d type:%s "
               "next_dts:%s next_dts_time:%s next_pts:%s next_pts_time:%s pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s off:%s off_time:%s\n",
               ifile->ist_index + pkt.stream_index, av_get_media_type_string(ist->dec_ctx->codec_type),
               av_ts2str(ist->next_dts), av_ts2timestr(ist->next_dts, &AV_TIME_BASE_Q),
               av_ts2str(ist->next_pts), av_ts2timestr(ist->next_pts, &AV_TIME_BASE_Q),
               av_ts2str(pkt.pts), av_ts2timestr(pkt.pts, &ist->st->time_base),
               av_ts2str(pkt.dts), av_ts2timestr(pkt.dts, &ist->st->time_base),
               av_ts2str(input_files[ist->file_index]->ts_offset),
               av_ts2timestr(input_files[ist->file_index]->ts_offset, &AV_TIME_BASE_Q));
    }

    if (!ist->wrap_correction_done && is->start_time != AV_NOPTS_VALUE && ist->st->pts_wrap_bits < 64){
        int64_t stime, stime2;
        // Correcting start time based on the enabled streams
        // FIXME this ideally should be done before the first use of start_time,
        //       but we do not know which are the enabled streams at that point,
        //       so we instead do it here as part of discontinuity handling
        if (   ist->next_dts == AV_NOPTS_VALUE
            && ifile->ts_offset == -is->start_time
            && (is->iformat->flags & AVFMT_TS_DISCONT)) {
            int64_t new_start_time = INT64_MAX;
            for (i = 0; i < is->nb_streams; i++) {
                AVStream *st = is->streams[i];
                if (st->discard == AVDISCARD_ALL || st->start_time == AV_NOPTS_VALUE)
                    continue;
                new_start_time = FFMIN(new_start_time, av_rescale_q(st->start_time, st->time_base, AV_TIME_BASE_Q));
            }
            if (new_start_time > is->start_time) {
                av_log(is, AV_LOG_VERBOSE, "Correcting start time by %"PRId64"\n", new_start_time - is->start_time);
                ifile->ts_offset = -new_start_time;
            }
        }

        stime  = av_rescale_q(is->start_time, AV_TIME_BASE_Q, ist->st->time_base);
        stime2 = stime + (1ULL<<ist->st->pts_wrap_bits);
        ist->wrap_correction_done = 1;

        if (stime2 > stime && pkt.dts != AV_NOPTS_VALUE && pkt.dts > stime + (1LL<<(ist->st->pts_wrap_bits-1))) {
            pkt.dts -= 1ULL<<ist->st->pts_wrap_bits;
            ist->wrap_correction_done = 0;
        }
        if (stime2 > stime && pkt.pts != AV_NOPTS_VALUE && pkt.pts > stime + (1LL<<(ist->st->pts_wrap_bits-1))) {
            pkt.pts -= 1ULL<<ist->st->pts_wrap_bits;
            ist->wrap_correction_done = 0;
        }
    }

    /* add the stream-global side data to the first packet */
    if (ist->nb_packets == 1) {
        if (ist->st->nb_side_data)
            av_packet_split_side_data(&pkt);
        for (i = 0; i < ist->st->nb_side_data; i++) {
            AVPacketSideData *src_sd = &ist->st->side_data[i];
            uint8_t *dst_data;

            if (av_packet_get_side_data(&pkt, src_sd->type, NULL))
                continue;

            if (ist->autorotate && src_sd->type == AV_PKT_DATA_DISPLAYMATRIX)
                continue;

            dst_data = av_packet_new_side_data(&pkt, src_sd->type, src_sd->size);
            if (!dst_data)
                exit_program(1);

            memcpy(dst_data, src_sd->data, src_sd->size);
        }
    }

    if (pkt.dts != AV_NOPTS_VALUE)
        pkt.dts += av_rescale_q(ifile->ts_offset, AV_TIME_BASE_Q, ist->st->time_base);
if (pkt.pts != AV_NOPTS_VALUE) pkt.pts += av_rescale_q(ifile->ts_offset, AV_TIME_BASE_Q, ist->st->time_base); if (pkt.pts != AV_NOPTS_VALUE) pkt.pts *= ist->ts_scale; if (pkt.dts != AV_NOPTS_VALUE) pkt.dts *= ist->ts_scale; pkt_dts = av_rescale_q_rnd(pkt.dts, ist->st->time_base, AV_TIME_BASE_Q, AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX); if ((ist->dec_ctx->codec_type == AVMEDIA_TYPE_VIDEO || ist->dec_ctx->codec_type == AVMEDIA_TYPE_AUDIO) && pkt_dts != AV_NOPTS_VALUE && ist->next_dts == AV_NOPTS_VALUE && !copy_ts && (is->iformat->flags & AVFMT_TS_DISCONT) && ifile->last_ts != AV_NOPTS_VALUE) { int64_t delta = pkt_dts - ifile->last_ts; if (delta < -1LL*dts_delta_threshold*AV_TIME_BASE || delta > 1LL*dts_delta_threshold*AV_TIME_BASE){ ifile->ts_offset -= delta; av_log(NULL, AV_LOG_DEBUG, "Inter stream timestamp discontinuity %"PRId64", new offset= %"PRId64"\n", delta, ifile->ts_offset); pkt.dts -= av_rescale_q(delta, AV_TIME_BASE_Q, ist->st->time_base); if (pkt.pts != AV_NOPTS_VALUE) pkt.pts -= av_rescale_q(delta, AV_TIME_BASE_Q, ist->st->time_base); } } duration = av_rescale_q(ifile->duration, ifile->time_base, ist->st->time_base); if (pkt.pts != AV_NOPTS_VALUE) { pkt.pts += duration; ist->max_pts = FFMAX(pkt.pts, ist->max_pts); ist->min_pts = FFMIN(pkt.pts, ist->min_pts); } if (pkt.dts != AV_NOPTS_VALUE) pkt.dts += duration; pkt_dts = av_rescale_q_rnd(pkt.dts, ist->st->time_base, AV_TIME_BASE_Q, AV_ROUND_NEAR_INF|AV_ROUND_PASS_MINMAX); if ((ist->dec_ctx->codec_type == AVMEDIA_TYPE_VIDEO || ist->dec_ctx->codec_type == AVMEDIA_TYPE_AUDIO) && pkt_dts != AV_NOPTS_VALUE && ist->next_dts != AV_NOPTS_VALUE && !copy_ts) { int64_t delta = pkt_dts - ist->next_dts; if (is->iformat->flags & AVFMT_TS_DISCONT) { if (delta < -1LL*dts_delta_threshold*AV_TIME_BASE || delta > 1LL*dts_delta_threshold*AV_TIME_BASE || pkt_dts + AV_TIME_BASE/10 < FFMAX(ist->pts, ist->dts)) { ifile->ts_offset -= delta; av_log(NULL, AV_LOG_DEBUG, "timestamp discontinuity %"PRId64", new offset= %"PRId64"\n", 
delta, ifile->ts_offset); pkt.dts -= av_rescale_q(delta, AV_TIME_BASE_Q, ist->st->time_base); if (pkt.pts != AV_NOPTS_VALUE) pkt.pts -= av_rescale_q(delta, AV_TIME_BASE_Q, ist->st->time_base); } } else { if ( delta < -1LL*dts_error_threshold*AV_TIME_BASE || delta > 1LL*dts_error_threshold*AV_TIME_BASE) { av_log(NULL, AV_LOG_WARNING, "DTS %"PRId64", next:%"PRId64" st:%d invalid dropping\n", pkt.dts, ist->next_dts, pkt.stream_index); pkt.dts = AV_NOPTS_VALUE; } if (pkt.pts != AV_NOPTS_VALUE){ int64_t pkt_pts = av_rescale_q(pkt.pts, ist->st->time_base, AV_TIME_BASE_Q); delta = pkt_pts - ist->next_dts; if ( delta < -1LL*dts_error_threshold*AV_TIME_BASE || delta > 1LL*dts_error_threshold*AV_TIME_BASE) { av_log(NULL, AV_LOG_WARNING, "PTS %"PRId64", next:%"PRId64" invalid dropping st:%d\n", pkt.pts, ist->next_dts, pkt.stream_index); pkt.pts = AV_NOPTS_VALUE; } } } } if (pkt.dts != AV_NOPTS_VALUE) ifile->last_ts = av_rescale_q(pkt.dts, ist->st->time_base, AV_TIME_BASE_Q); if (debug_ts) { av_log(NULL, AV_LOG_INFO, "demuxer+ffmpeg -> ist_index:%d type:%s pkt_pts:%s pkt_pts_time:%s pkt_dts:%s pkt_dts_time:%s off:%s off_time:%s\n", ifile->ist_index + pkt.stream_index, av_get_media_type_string(ist->dec_ctx->codec_type), av_ts2str(pkt.pts), av_ts2timestr(pkt.pts, &ist->st->time_base), av_ts2str(pkt.dts), av_ts2timestr(pkt.dts, &ist->st->time_base), av_ts2str(input_files[ist->file_index]->ts_offset), av_ts2timestr(input_files[ist->file_index]->ts_offset, &AV_TIME_BASE_Q)); } sub2video_heartbeat(ist, pkt.pts); process_input_packet(ist, &pkt, 0); discard_packet: av_packet_unref(&pkt); return 0; } /** * Perform a step of transcoding for the specified filter graph. 
* * @param[in] graph filter graph to consider * @param[out] best_ist input stream where a frame would allow to continue * @return 0 for success, <0 for error */ static int transcode_from_filter(FilterGraph *graph, InputStream **best_ist) { int i, ret; int nb_requests, nb_requests_max = 0; InputFilter *ifilter; InputStream *ist; *best_ist = NULL; ret = avfilter_graph_request_oldest(graph->graph); if (ret >= 0) return reap_filters(0); if (ret == AVERROR_EOF) { ret = reap_filters(1); for (i = 0; i < graph->nb_outputs; i++) close_output_stream(graph->outputs[i]->ost); return ret; } if (ret != AVERROR(EAGAIN)) return ret; for (i = 0; i < graph->nb_inputs; i++) { ifilter = graph->inputs[i]; ist = ifilter->ist; if (input_files[ist->file_index]->eagain || input_files[ist->file_index]->eof_reached) continue; nb_requests = av_buffersrc_get_nb_failed_requests(ifilter->filter); if (nb_requests > nb_requests_max) { nb_requests_max = nb_requests; *best_ist = ist; } } if (!*best_ist) for (i = 0; i < graph->nb_outputs; i++) graph->outputs[i]->ost->unavailable = 1; return 0; } /** * Run a single step of transcoding. * * @return 0 for success, <0 for error */ static int transcode_step(void) { OutputStream *ost; InputStream *ist; int ret; ost = choose_output(); if (!ost) { if (got_eagain()) { reset_eagain(); av_usleep(10000); return 0; } av_log(NULL, AV_LOG_VERBOSE, "No more inputs to read from, finishing.\n"); return AVERROR_EOF; } if (ost->filter) { if ((ret = transcode_from_filter(ost->filter->graph, &ist)) < 0) return ret; if (!ist) return 0; } else { av_assert0(ost->source_index >= 0); ist = input_streams[ost->source_index]; } ret = process_input(ist->file_index); if (ret == AVERROR(EAGAIN)) { if (input_files[ist->file_index]->eagain) ost->unavailable = 1; return 0; } if (ret < 0) return ret == AVERROR_EOF ? 
0 : ret; return reap_filters(0); } /* * The following code is the main loop of the file converter */ static int transcode(void) { int ret, i; AVFormatContext *os; OutputStream *ost; InputStream *ist; int64_t timer_start; int64_t total_packets_written = 0; ret = transcode_init(); if (ret < 0) goto fail; if (stdin_interaction) { av_log(NULL, AV_LOG_INFO, "Press [q] to stop, [?] for help\n"); } timer_start = av_gettime_relative(); #if HAVE_PTHREADS if ((ret = init_input_threads()) < 0) goto fail; #endif while (!received_sigterm) { int64_t cur_time= av_gettime_relative(); /* if 'q' pressed, exits */ if (stdin_interaction) if (check_keyboard_interaction(cur_time) < 0) break; /* check if there's any stream where output is still needed */ if (!need_output()) { av_log(NULL, AV_LOG_VERBOSE, "No more output streams to write to, finishing.\n"); break; } ret = transcode_step(); if (ret < 0 && ret != AVERROR_EOF) { char errbuf[128]; av_strerror(ret, errbuf, sizeof(errbuf)); av_log(NULL, AV_LOG_ERROR, "Error while filtering: %s\n", errbuf); break; } /* dump report by using the output first video and audio streams */ print_report(0, timer_start, cur_time); } #if HAVE_PTHREADS free_input_threads(); #endif /* at the end of stream, we must flush the decoder buffers */ for (i = 0; i < nb_input_streams; i++) { ist = input_streams[i]; if (!input_files[ist->file_index]->eof_reached && ist->decoding_needed) { process_input_packet(ist, NULL, 0); } } flush_encoders(); term_exit(); /* write the trailer if needed and close file */ for (i = 0; i < nb_output_files; i++) { os = output_files[i]->ctx; if (!output_files[i]->header_written) { av_log(NULL, AV_LOG_ERROR, "Nothing was written into output file %d (%s), because " "at least one of its streams received no packets.\n", i, os->filename); continue; } if ((ret = av_write_trailer(os)) < 0) { av_log(NULL, AV_LOG_ERROR, "Error writing trailer of %s: %s", os->filename, av_err2str(ret)); if (exit_on_error) exit_program(1); } } /* dump report by 
using the first video and audio streams */ print_report(1, timer_start, av_gettime_relative()); /* close each encoder */ for (i = 0; i < nb_output_streams; i++) { ost = output_streams[i]; if (ost->encoding_needed) { av_freep(&ost->enc_ctx->stats_in); } total_packets_written += ost->packets_written; } if (!total_packets_written && (abort_on_flags & ABORT_ON_FLAG_EMPTY_OUTPUT)) { av_log(NULL, AV_LOG_FATAL, "Empty output\n"); exit_program(1); } /* close each decoder */ for (i = 0; i < nb_input_streams; i++) { ist = input_streams[i]; if (ist->decoding_needed) { avcodec_close(ist->dec_ctx); if (ist->hwaccel_uninit) ist->hwaccel_uninit(ist->dec_ctx); } } av_buffer_unref(&hw_device_ctx); /* finished ! */ ret = 0; fail: #if HAVE_PTHREADS free_input_threads(); #endif if (output_streams) { for (i = 0; i < nb_output_streams; i++) { ost = output_streams[i]; if (ost) { if (ost->logfile) { if (fclose(ost->logfile)) av_log(NULL, AV_LOG_ERROR, "Error closing logfile, loss of information possible: %s\n", av_err2str(AVERROR(errno))); ost->logfile = NULL; } av_freep(&ost->forced_kf_pts); av_freep(&ost->apad); av_freep(&ost->disposition); av_dict_free(&ost->encoder_opts); av_dict_free(&ost->sws_dict); av_dict_free(&ost->swr_opts); av_dict_free(&ost->resample_opts); } } } return ret; } static int64_t getutime(void) { #if HAVE_GETRUSAGE struct rusage rusage; getrusage(RUSAGE_SELF, &rusage); return (rusage.ru_utime.tv_sec * 1000000LL) + rusage.ru_utime.tv_usec; #elif HAVE_GETPROCESSTIMES HANDLE proc; FILETIME c, e, k, u; proc = GetCurrentProcess(); GetProcessTimes(proc, &c, &e, &k, &u); return ((int64_t) u.dwHighDateTime << 32 | u.dwLowDateTime) / 10; #else return av_gettime_relative(); #endif } static int64_t getmaxrss(void) { #if HAVE_GETRUSAGE && HAVE_STRUCT_RUSAGE_RU_MAXRSS struct rusage rusage; getrusage(RUSAGE_SELF, &rusage); return (int64_t)rusage.ru_maxrss * 1024; #elif HAVE_GETPROCESSMEMORYINFO HANDLE proc; PROCESS_MEMORY_COUNTERS memcounters; proc = GetCurrentProcess(); 
memcounters.cb = sizeof(memcounters); GetProcessMemoryInfo(proc, &memcounters, sizeof(memcounters)); return memcounters.PeakPagefileUsage; #else return 0; #endif } static void log_callback_null(void *ptr, int level, const char *fmt, va_list vl) { } int main(int argc, char **argv) { int i, ret; int64_t ti; init_dynload(); register_exit(ffmpeg_cleanup); setvbuf(stderr,NULL,_IONBF,0); /* win32 runtime needs this */ av_log_set_flags(AV_LOG_SKIP_REPEATED); parse_loglevel(argc, argv, options); if(argc>1 && !strcmp(argv[1], "-d")){ run_as_daemon=1; av_log_set_callback(log_callback_null); argc--; argv++; } avcodec_register_all(); #if CONFIG_AVDEVICE avdevice_register_all(); #endif avfilter_register_all(); av_register_all(); avformat_network_init(); show_banner(argc, argv, options); /* parse options and open all input/output files */ ret = ffmpeg_parse_options(argc, argv); if (ret < 0) exit_program(1); if (nb_output_files <= 0 && nb_input_files == 0) { show_usage(); av_log(NULL, AV_LOG_WARNING, "Use -h to get full help or, even better, run 'man %s'\n", program_name); exit_program(1); } /* file converter / grab */ if (nb_output_files <= 0) { av_log(NULL, AV_LOG_FATAL, "At least one output file must be specified\n"); exit_program(1); } // if (nb_input_files == 0) { // av_log(NULL, AV_LOG_FATAL, "At least one input file must be specified\n"); // exit_program(1); // } for (i = 0; i < nb_output_files; i++) { if (strcmp(output_files[i]->ctx->oformat->name, "rtp")) want_sdp = 0; } current_time = ti = getutime(); if (transcode() < 0) exit_program(1); ti = getutime() - ti; if (do_benchmark) { av_log(NULL, AV_LOG_INFO, "bench: utime=%0.3fs\n", ti / 1000000.0); } av_log(NULL, AV_LOG_DEBUG, "%"PRIu64" frames successfully decoded, %"PRIu64" decoding errors\n", decode_error_stat[0], decode_error_stat[1]); if ((decode_error_stat[0] + decode_error_stat[1]) * max_error_rate < decode_error_stat[1]) exit_program(69); exit_program(received_nb_signals ? 
255 : main_return_code); return main_return_code; } ```
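The timestamp-discontinuity handling above boils down to one rule: when a packet's DTS jumps away from the expected next DTS by more than the configured threshold (on the `AVFMT_TS_DISCONT` path), the jump is folded into the input file's `ts_offset` so downstream timestamps stay continuous. A simplified sketch of that rule in Python, with per-stream time-base rescaling and the PTS adjustment omitted, and all values assumed to be in `AV_TIME_BASE` (microsecond) units:

```python
AV_NOPTS_VALUE = None  # stand-in for ffmpeg's "no timestamp" sentinel

def handle_discontinuity(pkt_dts, next_dts, ts_offset, threshold_us):
    """Fold a large DTS jump into the per-file timestamp offset.

    Simplified from the ffmpeg.c logic above: timestamps are assumed to
    already be in microseconds, and the matching PTS shift is omitted.
    """
    if pkt_dts is AV_NOPTS_VALUE or next_dts is AV_NOPTS_VALUE:
        return pkt_dts, ts_offset
    delta = pkt_dts - next_dts
    if delta < -threshold_us or delta > threshold_us:
        # Discontinuity: absorb the jump into the offset and shift the
        # packet back so its DTS lines up with the expected timeline.
        ts_offset -= delta
        pkt_dts -= delta
    return pkt_dts, ts_offset
```

For example, a packet arriving 49 seconds late against an expected DTS of 1 second comes out with a corrected DTS of 1 second and an offset of -49 seconds, which the real code then applies to every later packet from the same file.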
```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace g3
{
    /// <summary>
    /// SparseList provides a linear-indexing interface, but internally may use an
    /// alternate data structure to store the [index,value] pairs, if the list
    /// is very sparse.
    ///
    /// Currently uses Dictionary<> as the sparse data structure
    /// </summary>
    public class SparseList<T> where T : IEquatable<T>
    {
        T[] dense;
        Dictionary<int, T> sparse;
        T zeroValue;

        public SparseList(int MaxIndex, int SubsetCountEst, T ZeroValue)
        {
            zeroValue = ZeroValue;
            bool bSmall = MaxIndex > 0 && MaxIndex < 1024;
            float fPercent = (MaxIndex == 0) ? 0 : (float)SubsetCountEst / (float)MaxIndex;
            float fPercentThresh = 0.1f;
            if (bSmall || fPercent > fPercentThresh) {
                dense = new T[MaxIndex];
                for (int k = 0; k < MaxIndex; ++k)
                    dense[k] = ZeroValue;
            } else
                sparse = new Dictionary<int, T>();
        }

        public T this[int idx]
        {
            get {
                if (dense != null)
                    return dense[idx];
                T val;
                if (sparse.TryGetValue(idx, out val))
                    return val;
                return zeroValue;
            }
            set {
                if (dense != null) {
                    dense[idx] = value;
                } else {
                    sparse[idx] = value;
                }
            }
        }

        public int Count(Func<T, bool> CountF)
        {
            int count = 0;
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i)
                    if (CountF(dense[i]))
                        count++;
            } else {
                foreach (var v in sparse) {
                    if (CountF(v.Value))
                        count++;
                }
            }
            return count;
        }

        /// <summary>
        /// In the dense case this enumeration also returns [index, zeroValue] pairs
        /// </summary>
        public IEnumerable<KeyValuePair<int, T>> Values()
        {
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i)
                    yield return new KeyValuePair<int, T>(i, dense[i]);
            } else {
                foreach (var v in sparse)
                    yield return v;
            }
        }

        public IEnumerable<KeyValuePair<int, T>> NonZeroValues()
        {
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i) {
                    if (dense[i].Equals(zeroValue) == false)
                        yield return new KeyValuePair<int, T>(i, dense[i]);
                }
            } else {
                foreach (var v in sparse)
                    yield return v;
            }
        }
    }

    /// <summary>
    /// Variant of SparseList for class objects, where "zero" is null
    ///
    /// TODO: can we combine these classes somehow?
    /// </summary>
    public class SparseObjectList<T> where T : class
    {
        T[] dense;
        Dictionary<int, T> sparse;

        public SparseObjectList(int MaxIndex, int SubsetCountEst)
        {
            bool bSmall = MaxIndex < 1024;
            float fPercent = (float)SubsetCountEst / (float)MaxIndex;
            float fPercentThresh = 0.1f;
            if (bSmall || fPercent > fPercentThresh) {
                dense = new T[MaxIndex];
                for (int k = 0; k < MaxIndex; ++k)
                    dense[k] = null;
            } else
                sparse = new Dictionary<int, T>();
        }

        public T this[int idx]
        {
            get {
                if (dense != null)
                    return dense[idx];
                T val;
                if (sparse.TryGetValue(idx, out val))
                    return val;
                return null;
            }
            set {
                if (dense != null) {
                    dense[idx] = value;
                } else {
                    sparse[idx] = value;
                }
            }
        }

        public int Count(Func<T, bool> CountF)
        {
            int count = 0;
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i)
                    if (CountF(dense[i]))
                        count++;
            } else {
                foreach (var v in sparse) {
                    if (CountF(v.Value))
                        count++;
                }
            }
            return count;
        }

        /// <summary>
        /// In the dense case this enumeration also returns [index, null] pairs
        /// </summary>
        public IEnumerable<KeyValuePair<int, T>> Values()
        {
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i)
                    yield return new KeyValuePair<int, T>(i, dense[i]);
            } else {
                foreach (var v in sparse)
                    yield return v;
            }
        }

        public IEnumerable<KeyValuePair<int, T>> NonZeroValues()
        {
            if (dense != null) {
                for (int i = 0; i < dense.Length; ++i) {
                    if (dense[i] != null)
                        yield return new KeyValuePair<int, T>(i, dense[i]);
                }
            } else {
                foreach (var v in sparse)
                    yield return v;
            }
        }

        public void Clear()
        {
            if (dense != null) {
                Array.Clear(dense, 0, dense.Length);
            } else {
                sparse.Clear();
            }
        }
    }
}
```
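The storage-selection heuristic in the constructor above (use a dense array when the index space is small or the estimated fill fraction exceeds about 10%, otherwise a dictionary) can be sketched in Python. The class name, and the reuse of the same 1024-element and 0.1 thresholds, are illustrative rather than part of the g3 API:

```python
# Python transliteration of SparseList's dense-vs-sparse selection heuristic.
# Names and thresholds mirror the C# code above for illustration only.

class SparseListSketch:
    DENSE_SIZE_LIMIT = 1024   # below this many slots, always use a dense array
    DENSE_FILL_THRESH = 0.1   # above ~10% estimated fill, also use a dense array

    def __init__(self, max_index, subset_count_est, zero_value):
        self.zero = zero_value
        fill = 0 if max_index == 0 else subset_count_est / max_index
        if (0 < max_index < self.DENSE_SIZE_LIMIT) or fill > self.DENSE_FILL_THRESH:
            self.dense, self.sparse = [zero_value] * max_index, None
        else:
            self.dense, self.sparse = None, {}

    def __getitem__(self, idx):
        if self.dense is not None:
            return self.dense[idx]
        return self.sparse.get(idx, self.zero)  # unset sparse slots read as "zero"

    def __setitem__(self, idx, value):
        if self.dense is not None:
            self.dense[idx] = value
        else:
            self.sparse[idx] = value

    def nonzero(self):
        """(index, value) pairs for occupied slots, like NonZeroValues()."""
        if self.dense is not None:
            return [(i, v) for i, v in enumerate(self.dense) if v != self.zero]
        return sorted(self.sparse.items())
```

For a million-slot list with only a handful of occupied entries the sketch picks dictionary storage; for a 100-slot list it picks the dense array, mirroring the C# constructor's behavior.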
Mỹ Hà is a commune (xã) and village in Lạng Giang District, Bắc Giang Province, in northeastern Vietnam. References Populated places in Bắc Giang province Communes of Bắc Giang province
WBXX-TV (channel 20) is a television station licensed to Crossville, Tennessee, United States, serving the Knoxville area as an affiliate of The CW. It is owned by Gray Television alongside dual CBS/MyNetworkTV affiliate WVLT-TV (channel 8). Both stations share studios on Papermill Drive (near I-40/I-75) on the west side of Knoxville, while WBXX-TV's transmitter is located at Windrock, on Buffalo Mountain outside Oliver Springs, Tennessee. WBXX-TV is the only full-powered Knoxville-market station to be licensed in a city in the Central Time Zone; Cumberland County (where Crossville is located) and Fentress County are the only two counties in the market that observe Central Time, while Knoxville proper is in Eastern Time. However, while CW network programming is promoted with both Eastern and Central Time listings, WBXX-TV's local programming is promoted with only Eastern Time listings. History WCPT-TV and WINT-TV WBXX began operation as independent WCPT-TV ("Wonderful Cumberland Plateau Television") on analog channel 55 (716–722 MHz) on October 3, 1976, from a tower site on Haley Mountain (Renegade Resort) in Cumberland County, Tennessee. It was owned by Edward M. "Mack" Johnson, who also owned 49% of WSCV AM in Crossville. The studios in Crossville were a former loan office. WCPT-TV produced a nightly local newscast, as well as a three-hour live Saturday Night Jamboree; much of its syndicated fare came by bus the day after airing on Nashville's WZTV. Viewers in Cookeville could view the station by translator on channel 23; it also maintained a facility there. The station looked at obtaining an ABC affiliation, but it needed to serve 1,500 more households to make the arrangement worthwhile. It also tried to move to channel 7 in 1977, a move it opted not to pursue after meeting opposition from WTVK.
Initially, WCPT-TV ran a mix of religious shows, some low-budget syndicated movies and westerns, and some cartoons, as well as sporting shows and public affairs programs. By the early 1980s, more movies and drama shows aired, as well as cartoons. Channel 55 was sold to Calvin C. Smith, who owned stations in Kentucky, and John A. Cunningham in 1979. In February 1980, citing poor reception and failing equipment, they asked to switch from channel 55 to channel 20, which was reserved as educational, and make 55 the non-commercial allocation for Crossville; this was approved, and WCPT-TV was ordered to move to channel 20 effective August 13. Additionally, Smith and Cunningham were approved in 1981 to increase channel 20's effective radiated power to 2,818 kW (it had signed on with just 16 kW), but those facilities were never implemented. Still operating at low power, the station became WINT-TV on December 27, 1982, and in 1983 the station was sold mostly to Larry Hudson, with Cunningham retaining 10 percent. In 1988, WINT-TV aired religious programming. In 1995, CW TV, headed by Cynthia Willis, acquired the station for $700,000 by way of Crossville TV, LP. CW applied to move the transmitter site to Upper Windrock, an outcropping on Buffalo Mountain along the Cumberland Plateau above Oliver Springs, Tennessee, and to boost power to 3,630 kW, covering Knoxville. Becoming WBXX In 1997, the partners in Crossville TV LP sold the license to ACME Communications. ACME immediately changed the call letters to WBXX-TV, and in October 1997, channel 20 relaunched from its new facilities as Knoxville's affiliate of The WB. It was the second station for ACME, behind KWBP in Portland, Oregon, which it had acquired that June. The station also ran select UPN programming during 2001 and 2002, as that network did not have a Knoxville affiliate at the time. WBXX was consistently one of the highest-rated WB stations in the country, and was recognized as such by The WB network.
After being known as "WB20" since signing on, WBXX rebranded as "East Tennessee's WB" in September 2003. When the station took affiliation with The CW, it was renamed "East Tennessee's CW." WBXX rebranded again, to "CW20", in August 2008. A decade of sales In February 2011, ACME Communications announced a deal to sell the station to Virginia-based Lockwood Broadcast Group. The sale was approved by the Federal Communications Commission (FCC) on March 21 and consummated on May 6. From 1998 until 2004, the station aired a series of interstitials during children's programming called WB 20 Kids Club (later Dubba Clubba), hosted by comedian Jackson Bailey (known as "Joe Cool"). The interstitials offered viewers information and contests in several vignettes each weekday, covering topics such as science, biology, conservation, music, and pet care. From 2015 to 2019, WBXX broadcast Atlantic Coast Conference football and men's basketball games syndicated from the Raycom Sports-operated ACC Network, some of which were shared with the main channel of CBS affiliate WVLT-TV. Those games were previously broadcast on MyNetworkTV affiliate WVLT-DT2 from 2009 until the end of the 2014–2015 season. On October 1, 2015, Gray Television announced that it would acquire WBXX-TV from Lockwood. The purchase was made as part of Gray's acquisition of the broadcasting assets of Schurz Communications; as part of the deal, Lockwood received KAKE in Wichita, Kansas (which Gray put up for sale following the deal with Schurz) and paid $11.2 million to Gray. Gray (through WVLT-TV, Inc.) took over the station's operations via a local marketing agreement. The sale was completed on February 1, 2016. On June 25, 2018, Gray Television announced that it would acquire the assets of Raycom Media, which had owned Fox affiliate WTNZ-TV since 1996, for $3.6 billion.
Due to FCC rules, Gray kept the existing duopoly of WVLT-TV and WBXX-TV and sold WTNZ, since WTNZ and WVLT-TV ranked among the four highest-rated stations in the Knoxville market in total-day viewership. On August 20, 2018, it was announced that Gray would sell WTNZ to former owner Lockwood Broadcast Group, which had owned WKNX-TV since 2013, in a group deal that saw Lockwood acquire WFXG-TV in Augusta, WPGX-TV in Panama City and WDFX-TV in Dothan. The sale was completed on January 2, 2019. Newscasts Until mid-to-late 2013, WBXX aired the nationally syndicated morning show The Daily Buzz from 6 until 8 a.m. The program was produced by ACME Communications, and during the company's ownership of this station, there were local weather cut-ins focusing on the Knoxville area. It is unknown whether these updates were still provided after WBXX's ownership change to Lockwood. The Daily Buzz then moved to sister station WKNX-TV, which aired it until its sudden cancellation on April 17, 2015. At one point, NBC affiliate WBIR-TV (then owned by Gannett, now owned by Tegna) produced a nightly newscast on this station through an outsourcing agreement, called 10 News at 10. The newscast aired for only twelve minutes in an abbreviated format featuring the day's top stories along with an updated weather forecast. The broadcast originated from WBIR's facility on Hutchinson Avenue in Knoxville's Lincoln Park section (the official address is Bill Williams Avenue). It was offered as an alternative to Fox affiliate WTNZ, which had nightly local news produced by ABC affiliate WATE-TV. In early March 2011, WTNZ terminated its news share agreement with WATE after entering into another contract with WBIR. As a result, the latter station stopped producing the nightly update for WBXX.
On August 1, 2011, WATE (then owned by Young Broadcasting, now owned by Nexstar Media Group) returned to the prime time newscast race with a new nightly 35-minute broadcast on WBXX (The CW 20 News at 10) through another outsourcing agreement. Corresponding with the addition, WATE upgraded to high-definition newscasts that October 17, becoming the third local news operation in Knoxville to make the change. Initially, the newscast on WBXX was not included in the upgrade, as WBXX lacked a high-definition-capable master control at its separate studios to receive the newscast in HD; this lasted until early April 2012, when WBXX underwent a master control upgrade. The CW 20 News at 10 originated from WATE's studios in Camp House on North Broadway in the city's Old North Knoxville section. Starting on January 1, 2017, sister station WVLT-TV took over production of WBXX's newscasts from WATE, including the morning (7–9 a.m.) and prime time 10 p.m. newscasts. Technical information Subchannels The station's digital signal is multiplexed: Analog-to-digital conversion WBXX-TV shut down its analog signal, over UHF channel 20, on June 12, 2009, the official date on which full-power television stations in the United States transitioned from analog to digital broadcasts under federal mandate. The station's digital signal relocated from its pre-transition UHF channel 50 to channel 20. References External links Television channels and stations established in 1997 The CW affiliates Story Television affiliates Heroes & Icons affiliates Dabl affiliates Telemundo network affiliates Gray Television 1997 establishments in Tennessee Cumberland County, Tennessee
Pycnarmon alboflavalis is a moth in the family Crambidae. It was described by Frederic Moore in 1888. It is found in India (Darjeeling, Sikkim, Andamans, Arunachal Pradesh) and Bhutan. The wingspan is 20 mm. Adults are white, the forewings with two black costal spots and three bands from the median nervure to the inner margin on the basal area. There is a discocellular black spot and large spots on the origin of veins 2 to 5, as well as a postmedial line, straight from costa to vein 5, then bent outwards and fine, with a spot below it on the inner margin. The marginal area is orange with a black speck at the apex and a double submarginal black spot on vein 4. The hindwings are white with a black spot at the base of the inner margin. There is a double series of black spots from below the lower angle of the cell to the anal angle. The marginal area is orange, with a submarginal black spot on vein 4 and some specks on the margin towards the anal angle. References Spilomelinae Moths described in 1888 Moths of Asia
Thorolf is an Old Norse masculine personal name. It means "Thor's wolf." Notable people with the name include:
Thorolf Kveldulfsson, 9th-century Norwegian hersir and Viking
Thorolf Skallagrimsson, Icelandic Viking and nephew of the former
Norwegian masculine given names Masculine given names
The Centre for Innovation, Research and Competence in the Learning Economy (CIRCLE) is an interdisciplinary research centre situated in Lund, Sweden. It spans several faculties at Lund University and Blekinge Institute of Technology. Its activities cover the fields of innovation, entrepreneurship, knowledge creation and economic growth. History CIRCLE was initiated in July 2004 with long-term funding from VINNOVA (the Swedish Governmental Agency for Innovation Systems), Lund University and Blekinge Institute of Technology. It is one of four national research Centres of Excellence funded by VINNOVA. It has established connections with renowned research environments in the fields of innovation, entrepreneurship and economic growth. Guest researchers and visiting scholars are regularly invited to enrich the international research and teaching milieu through presentations and seminars. Research CIRCLE's research is organized into four main research areas, each addressing different aspects of innovation and knowledge creation:
learning in innovation systems and the consequences of R&D and innovation for productivity growth
international comparison of regional innovation systems
the entrepreneurial university and the creation of research-based firms
public policy in the field of innovation, R&D and competence building in a comparative perspective
The research focus lies on the socioeconomic and policy aspects of three kinds of learning:
Innovation (in new products as well as processes), which takes place mainly in firms.
Research and development (R&D), which is carried out in universities and research organizations as well as in firms.
Competence building (e.g., training and education), which takes place in schools and universities (schooling, education) as well as in firms (in the form of training, learning-by-doing, learning-by-using and individual learning, often throughout life). Competence building amounts to the enhancement of human capital.
References Innovation organizations Research institutes in Sweden Scientific organizations based in Sweden Lund University 2004 establishments in Sweden Research and development in Sweden
Bobby Ricketts is an American saxophonist based in Copenhagen, Denmark. Ricketts grew up in the greater Boston, Massachusetts, area. Due to several hundred appearances on various Danish television shows from ca. 1983 onward, numerous international artistic collaborations, and cultural outreach programs conducted as an Arts Envoy of the U.S. Department of State, his reputation has spread throughout Scandinavia, Europe, Japan, Africa, and the US. In March 2008, Ricketts released the album "Skin To Skin", which was called "one of the most scintillating albums of 2008!" in a review by Smoothjazz.com radio personality Sandy Shore. External links http://www.bobbyricketts.com References American male saxophonists Living people 21st-century American saxophonists 21st-century American male musicians Year of birth missing (living people)
Two-and-four-stroke engines are engines that combine elements from both two-stroke and four-stroke engines. They usually incorporate two pistons. M4+2 engine The M4+2 engine, also known as the double-piston internal combustion engine, is a type of internal combustion engine invented by Polish patent holder Piotr Mężyk. The M4+2 engine takes its name from the combination of the two working modes of existing engines, the two-stroke and the four-stroke cycle. The two-stroke engine is characterized by simple construction, a simple system of charge exchange, and a higher specific power output; unfortunately, its cylinder filling is worse than that of a four-stroke engine, and its emissions performance is also unfavorable. The four-stroke engine's main drawback is its valve system. In the double-piston engine, the cylinders of both modules are joined along one axis with a common cylinder head in the form of a ring. The pistons move at different speeds and with an appropriate phase offset. The two crankshafts are connected by a special transmission, with the four-stroke crankshaft rotating at twice the speed of the two-stroke crankshaft. The engine is named double-piston because of its construction: two pistons and two crankshafts. In the M4+2, two pistons work in one combustion cylinder, set opposite each other but running in different cycles, combining the advantages of both engine types. Although two pistons had been placed in a single cylinder long before in opposed-piston engines, those designs ran both pistons on the same cycle; combining two different cycles had never been tried. The engine not only proved workable, but the results were promising: its claimed efficiency far exceeds the roughly 35% typical of combustion engines and approaches the roughly 70% associated with steam turbines or electric motors.
The other claimed advantages are:
more efficient power production (150 horsepower from 1000 cc of displacement in the basic version)
the ability to run on different fuel types (natural plant oils as well as petroleum fuels)
much better ecological performance, with a large reduction in emissions of carbon dioxide and other gases
lower fuel consumption
a prolonged expansion stroke
the possibility of changing the compression parameters without stopping the engine
simplicity of construction
The idea was developed at the Silesian University of Technology, Poland, under the leadership of Adam Ciesiołkiewicz (Journal of KONES Internal Combustion Engines, 2002, No. 1-2). It was granted patent number 195052 by the Polish Patent Office. Work on the new engine has been done by the IZOLING P.W. company in cooperation with the Silesian University of Technology in Gliwice; researchers from the Technical University of Kraków were also consulted. So far, two working engine models have been made. The preliminary model, based on two existing engines (a two-stroke motorbike engine and a four-stroke small-machine engine), confirmed the concept of connecting two- and four-stroke engines. The second model is the functional model. The new engine design has been presented at scientific conferences (KONES 2002, a biofuels seminar in 2003, etc.) and in the mass media (journals, newspapers, TVP, radio). The double-piston internal combustion engine carries the hope of a modern, universal, ecological engine with good technical parameters and low fuel consumption.
The M4+2 engine working cycle
The stages of the working cycle are:
Gas exhaust
Fresh air inlet (two stages)
Compression of the medium
Two-stage combustion
Gas expansion
The filling process takes place in the overpressure phase, using a mechanical gas compressor and a throttle for regulation.
The load change is assisted by the four-stroke piston, which works as a dynamic boosting system and allows good scavenging of the working space. The relative piston positions can be changed while the engine is running, which allows the compression ratio to be varied with the momentary load. This in turn suggests that different fuels can be burned (low-octane petrol, biofuels with a high share of vegetable components). The working cycle shows an almost constant combustion characteristic, with the working-space volume increasing during the expansion stage. The engine is characterized by:
extended gas expansion and a limited exhaust gas temperature at the outlet
a variable compression ratio, which can be changed during operation according to the momentary load
a beneficial change of the working space during mixture compression, thanks to the different polar positions of the crankshafts, which makes it possible to shape the cylinder pressure (within reasonable limits)
an almost constant characteristic of volume increase during the expansion stage at different piston positions, which smooths the power delivery
Calculations indicate the new engine has more favorable parameters and work indicators:
increased thermal efficiency, although the mechanical efficiency is lower than in a conventional engine
higher total efficiency, with an efficiency characteristic that is most favorable at medium load; it is comparable with diesel engine efficiency, so fuel consumption in service is relatively lower
higher torque and higher power output compared with a four-stroke SI engine of the same swept volume
Ricardo 2x4 engine
Two-cycle modes are currently being researched at Ricardo Consulting Engineers in the UK. The concept consists in switching from one mode to the other depending on the rpm value.
The four-stroke engine is more efficient when running at full throttle, while the opposite is true of the two-stroke engine. When a small car under heavy load runs at half speed, the engine automatically switches to the two-stroke mode, which is then more efficient. This research showed a 27% reduction in fuel consumption. Since the shaft of the four-stroke piston in the M4+2 engine always rotates twice as fast as the shaft of the two-stroke piston, the two-stroke part always runs at half speed, and both parts work in near-optimal conditions for fuel consumption at all times. The same principle could be realized with two separate engines, but the M4+2 design is much simpler. See also Six-stroke engine Rotary engine References Internal combustion piston engines Proposed engines
Amulette Garneau (August 11, 1928 – November 7, 2008) was a Canadian actress living in Quebec. She was born Huguette Laurendeau in Montreal and was educated at the École des beaux-arts de Montréal, going on to study acting at the school of the Théâtre du Nouveau Monde and dramatic art with Uta Hagen in New York City. Garneau also studied with Georges Groulx. Garneau performed with various theatre companies, including the Montreal Repertory Theatre and the Théâtre de Quat'Sous. She also became one of Michel Tremblay's favourite performers; she appeared in a number of his productions for the stage as well as the film Once Upon a Time in the East. She appeared in a number of television series, including 14, rue de Galais, , La famille Plouffe and Grand-Papa. She performed with Olivier Guimond in the popular sitcom . Garneau was nominated for a Genie Award for her role in the film Maria Chapdelaine. She also gave notable performances in Taureau, Night Zoo (Un zoo la nuit), Orders (Les Ordres), The Vultures (Les Vautours) and The Years of Dreams and Revolt (Les Années de rêves). She was married twice: first to poet in 1953 and then to . Her son is an actor. Her brother is a journalist. Garneau died in Montreal at the age of 80 from cancer. References External links 1928 births 2008 deaths Actresses from Montreal École des beaux-arts de Montréal alumni
```go
package batch

import "encoding/json"

// Index is a "key" mapper for batch requests. Its key is not special and
// should be treated the same as an index in a slice.
type Index struct {
	key int
}

// NewIndex creates an index. This should only be used by
// generated code or the creator of batches. The "Key" has no purpose and should
// not be used elsewhere.
func NewIndex(key int) Index {
	// TODO generate a hash to really mess with people trying to recreate batches?
	return Index{key: key}
}

// MarshalText and UnmarshalText operate on the key directly: the key field is
// unexported, so marshaling the struct itself with encoding/json would always
// produce "{}" and lose the key. Going through the int also avoids the
// recursion that marshaling an Index value as JSON would otherwise cause.
func (i Index) MarshalText() (text []byte, err error) {
	return json.Marshal(i.key)
}

func (i *Index) UnmarshalText(text []byte) error {
	return json.Unmarshal(text, &i.key)
}
```
Maggie Wilderotter (born Mary Agnes Sullivan; February 9, 1955) is an American businessperson who is the chairwoman of DocuSign (as well as interim CEO from June to October 2022) and the former chief executive officer of Frontier Communications from November 2004 to April 2015, and then executive chairman of the company until April 2016. During her tenure with Frontier, the company grew from a regional telephone company with customer revenues of less than $3 billion to a national broadband, voice and video provider with operations in 29 states and annualized revenues in excess of $10 billion. Early life and family Wilderotter was born in Neptune Township, New Jersey. She grew up in the Elberon section of Long Branch, New Jersey, and graduated from Long Branch High School in 1973. She earned her undergraduate degree in 1977 from College of the Holy Cross in Worcester, Massachusetts, in economics and business administration. She served two terms as a member of the board of trustees for the college and is an active alumna. She has been awarded an Honorary Doctor of Engineering degree from the Stevens Institute of Technology and an Honorary Doctor of Laws degree from the University of Rochester. Wilderotter is the second-oldest of four girls. One of her sisters is Denise Morrison, former president and Chief Executive Officer of Campbell Soup Company. Career Wilderotter was Senior Vice President of Global Business Strategy and ran the Worldwide Public Sector at Microsoft. Before this, she was president and CEO of Wink Communications Inc., Executive Vice President of National Operations for AT&T Wireless Services Inc., Chief Executive Officer of AT&T's Aviation Communications Division, and a Senior Vice President of McCaw Cellular Communications Inc. Wilderotter serves on the boards of Costco Wholesale Corporation, DocuSign, Hewlett Packard Enterprise, Juno Therapeutics, Inc., and Lyft as well as a number of private and non-profit organizations.
Wilderotter previously served on the President's National Security Telecommunications Advisory Committee (NSTAC). From October 2010 to October 2012, she was Vice Chairman of NSTAC, and from October 2012 to November 2014 she served as Chairman of NSTAC. Wilderotter also serves on the President's Commission on Enhancing National Cybersecurity. Wilderotter is a member of the Board of Directors of The Conference Board; a member of the Executive Committee of the Catalyst Board of Directors; a member of the Board of Women in America; and a member of the Business Council and the Committee of 200. In 2014, she chaired the Blue Ribbon Committee on Board Strategy for NACD and is a member of WomenCorporateDirectors (WCD). On June 21, 2022, Wilderotter was appointed interim CEO of document management company DocuSign, and served in that role until former Google executive Allan Thygesen took over on October 10, 2022. Wilderotter continues to serve as chairwoman of the board. References External links Leadership Profile: Maggie Wilderotter 1955 births Living people Long Branch High School alumni People from Long Branch, New Jersey People from Neptune Township, New Jersey American women chief executives American women business executives American chairpersons of corporations Frontier Communications Procter & Gamble people College of the Holy Cross alumni 20th-century American businesspeople 21st-century American businesspeople 20th-century American businesswomen 21st-century American businesswomen
The following outline is provided as an overview of and topical guide to Wallis and Futuna: Wallis and Futuna – French island territory in Polynesia (but not part of, or even contiguous with, French Polynesia) in the South Pacific Ocean between Fiji and Samoa. It comprises three main volcanic tropical islands and a number of tiny islets. The territory is split into two island groups lying about 260 km apart: Wallis Islands (Uvea), in the north Wallis Island (Uvea) Hoorn Islands (Futuna Islands), in the south Futuna Alofi Since 2003 Wallis and Futuna has been a French overseas collectivity (collectivité d'outre-mer, or COM). General reference Pronunciation: Common English country name: Wallis and Futuna or the Wallis and Futuna Islands Official English country name: The French Overseas Collectivity of the Wallis and Futuna Islands Common endonym(s): Official endonym(s): Adjectival(s): Wallisian, Futunan Demonym(s): Wallisian, Futunan Etymology: ISO country codes: WF, WLF, 876 ISO region codes: See ISO 3166-2:WF Internet country code top-level domain: .wf Geography of Wallis and Futuna Geography of Wallis and Futuna Wallis and Futuna is: A French overseas collectivity Location: Southern Hemisphere and Eastern Hemisphere Pacific Ocean South Pacific Ocean Oceania Polynesia Time zone: UTC+12 Extreme points of Wallis and Futuna High: Mont Puke on Futuna Low: South Pacific Ocean 0 m Land boundaries: none Coastline: South Pacific Ocean 129 km Population of Wallis and Futuna: 15,000 Area of Wallis and Futuna: 264 km² Atlas of Wallis and Futuna Environment of Wallis and Futuna Climate of Wallis and Futuna Birds of Wallis and Futuna Mammals of Wallis and Futuna Natural geographic features of Wallis and Futuna Rivers of Wallis and Futuna Regions of Wallis and Futuna Ecoregions of Wallis and Futuna Administrative divisions of Wallis and Futuna Administrative divisions of Wallis and Futuna Provinces of Wallis and Futuna Districts of Wallis and Futuna Municipalities of Wallis and
Futuna Capital of Wallis and Futuna: Mata-Utu Cities of Wallis and Futuna Demography of Wallis and Futuna Demographics of Wallis and Futuna Government and politics of Wallis and Futuna Politics of Wallis and Futuna Form of government: parliamentary representative democratic French overseas collectivity Capital of Wallis and Futuna: Mata-Utu Elections in Wallis and Futuna Political parties in Wallis and Futuna Branches of the government of Wallis and Futuna Government of Wallis and Futuna Executive branch of the government of Wallis and Futuna Legislative branch of the government of Wallis and Futuna Parliament of Wallis and Futuna Judicial branch of the government of Wallis and Futuna Foreign relations of Wallis and Futuna International organization membership The Territory of Wallis and Futuna is a member of: Pacific Islands Forum (PIF) (observer) The Pacific Community (SPC) Universal Postal Union (UPU) World Federation of Trade Unions (WFTU) History of Wallis and Futuna History of Wallis and Futuna Culture of Wallis and Futuna Culture of Wallis and Futuna Dance of Wallis and Futuna Languages of Wallis and Futuna Coat of arms of Wallis and Futuna Flag of Wallis and Futuna National anthem of Wallis and Futuna Public holidays in Wallis and Futuna Sports in Wallis and Futuna Sports in Wallis and Futuna Football in Wallis and Futuna Economy and infrastructure of Wallis and Futuna Economy of Wallis and Futuna Economic rank, by nominal GDP (2007): Communications in Wallis and Futuna Currency of Wallis and Futuna: Franc ISO 4217: XPF Transport in Wallis and Futuna Airports in Wallis and Futuna Education in Wallis and Futuna Education in Wallis and Futuna See also Wallis and Futuna List of international rankings List of Wallis and Futuna-related topics Outline of France Outline of geography Outline of Oceania References External links Official website of the Assembly Official website of the French Administrateur supérieur de Wallis et Futuna Map of Wallis and Futuna, with 
district boundaries Information about Wallis and Futuna Pictures of Wallis Wallis and Futuna Outlines
This article details the Cronulla-Sutherland Sharks rugby league football club's 2005 season. Season summary The Sharks started season 2005 brightly; at one stage they were joint ladder leaders after round 11. Early season highlights included wins over the 2003 premiers Penrith in the opening round, Parramatta in round three at Parramatta Stadium, 2004 premiers the Bulldogs at home in round four, North Queensland in round five, the Melbourne Storm in Melbourne in round eight, the New Zealand Warriors in Perth in round nine, the Canberra Raiders at Canberra Stadium in round ten and again over the Penrith Panthers at Penrith in round eleven. However, the season then fell apart with losses to Parramatta at home (the Sharks trailed 22-0 at halftime before rallying in the second half to go down 34-26), North Queensland in Townsville, and the eventual premiers the Wests Tigers at Campbelltown in round 14, before the Sharks beat the Sydney Roosters at home in round fifteen. Three more wins followed: a thrilling 30-26 golden-point win over the struggling but resurgent Newcastle Knights at home, with Vince Mellars scoring the winning try; a 40-16 hammering of the Melbourne Storm at home in the debut match of Beau Scott, later a premiership winner with the Dragons; and a massive 68-6 annihilation of Manly in round 24. Late-season losses to Souths, St. George Illawarra, the Wests Tigers, the Sydney Roosters and the Newcastle Knights in Newcastle all proved costly. The Sharks finished seventh, level with Manly but ahead courtesy of the round 24 result, making the finals for the first time since 2002, but they were eliminated 28-22 by the Dragons in front of a huge Wollongong crowd in the first round. The Sharks' round two loss to the Manly-Warringah Sea Eagles was marred by the sickening injury suffered by Keith Galloway as a result of a high elbow shot from Manly winger John Hopoate.
Galloway missed a large part of the 2005 season partly as a result of the injury, and due to limited opportunities at Cronulla he signed with the Wests Tigers from the 2006 season onwards. For his part, Hopoate was sacked by the Sea Eagles after receiving a mammoth 17-week ban from the NRL Judiciary. Ladder Awards Player of the Year - Danny Nutley Chairman's Award - Paul Gallen Rookie of the Year - Phillip Leuluai Try of the Year - David Peachey Premier League Player of the Year - Nathan Merritt Jersey Flegg Player of the Year - Jacob Selmes Jersey Flegg Player's Player - Mitch Brown References Cronulla-Sutherland Sharks seasons Cronulla-Sutherland Sharks season
- Updating interfaces by using `default` methods
- Using `synchronized` statements
- There is no such thing as *pass-by-reference* in Java
- Do not perform bitwise and arithmetic operations on the same data
- Methods performing *security checks* must be declared `private` or `final`
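The first item can be illustrated with a minimal sketch (the `Greeter` and `LegacyGreeter` names are invented for illustration, not taken from any particular codebase): adding a method to an interface would normally break every existing implementor, but a `default` body lets old classes compile unchanged.

```java
// A minimal sketch of interface evolution with default methods.
// Greeter and LegacyGreeter are hypothetical names used only for illustration.
interface Greeter {
    String name();

    // Added after LegacyGreeter was written: implementors need not override
    // it, because the interface itself supplies a body.
    default String greet() {
        return "Hello, " + name() + "!";
    }
}

class LegacyGreeter implements Greeter {
    // Predates greet(); still compiles and inherits the default implementation.
    public String name() { return "world"; }
}

public class DefaultMethodDemo {
    public static void main(String[] args) {
        Greeter g = new LegacyGreeter();
        System.out.println(g.greet()); // prints "Hello, world!"
    }
}
```

This is the same mechanism that allowed `java.util.Collection` to gain `stream()` in Java 8 without breaking third-party implementations.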
Helmut Sonnenfeldt (September 13, 1926 – November 18, 2012), also known as Hal Sonnenfeldt, was an American foreign policy expert. He was known as Kissinger’s Kissinger for his philosophical affinity with and influence on Henry A. Kissinger, the architect of American foreign policy in the Nixon and Ford administrations. He was a veteran staff member of the United States National Security Council, and held several advisory posts in the U.S. government and the private sector. Later in life he was a visiting scholar at the Johns Hopkins School of Advanced International Studies and a guest scholar at the Brookings Institution. Early life Sonnenfeldt was born in 1926 in Berlin, Germany, to Drs. Walther and Gertrud (Liebenthal) Sonnenfeldt. His family was Jewish. He spent his childhood in Gardelegen, Germany, where his parents had a family medical practice. In 1938, Sonnenfeldt was sent to Anna Essinger's Bunce Court School in England, as was his brother, Richard Sonnenfeldt. Helmut Sonnenfeldt remained in England until 1944, when he immigrated to the United States and rejoined his parents, who had resettled in Baltimore, Maryland. He entered the U.S. Army in 1944, became a naturalized American citizen and served in both the Philippines and in the U.S. occupation forces in Germany. After military service, he attended Johns Hopkins University (BA 1950, MA 1951, Johns Hopkins School of Advanced International Studies). Career Sonnenfeldt entered service in the U.S. Department of State in 1952 as a member of the staff of the Office of Research on the Soviet Union and Eastern Europe, and served as the Director of that Office from 1963 to 1969. Within days of the 1968 Nixon election, Henry Kissinger picked him to serve on the National Security Council staff. He was a senior staff member of the National Security Council from 1969 to 1974. In 1974, he was appointed Counselor of the U.S. 
Department of State, where he served, continuing after Nixon's resignation, for the duration of the Ford administration. During his time in the National Security Council and in the State Department, he was a close assistant and adviser of Kissinger and became known as "Kissinger's Kissinger." In a memo on US–Soviet relations dated January 7, 1971, Sonnenfeldt complained to Henry Kissinger that he had been excluded from Kissinger's confidence: "My undoubted personal disappointment that you have almost completely excluded me from participation in or even knowledge of the more sensitive aspects of our dealings with the USSR." After leaving government service, he was a visiting scholar at the Johns Hopkins School of Advanced International Studies. Since 1978, he had been a Guest Scholar at the Brookings Institution in Washington, D.C. Family In 1953, he married Marjorie Hecht. They had three children: Babette Hecht, Walter Herman and Stewart Hecht. Death Sonnenfeldt died on Sunday, November 18, 2012, after a long illness, leaving behind his wife and their three children. The cause was complications of Alzheimer's disease. As a veteran, he was interred at Arlington National Cemetery. His brother was Richard Sonnenfeldt, an American engineer also noted for being the U.S. prosecution team's chief interpreter in 1945 at the Nuremberg Trial after World War II. Publications Books Soviet Style in International Politics. Washington, DC: Washington Institute for Values in Public Policy (1985) Soviet Politics in the 1980s. Boulder: Westview Press (1985) Soviet Perspectives on Security. Adelphi papers, no. 150. London: International Institute for Strategic Studies (1979) – with William G. Hyland Book contributions "The Chinese Factor in Soviet Disarmament Policy" (Chapter 4). In: Halperin, Morton H. (editor). Sino-Soviet Relations and Arms Control. Cambridge: MIT Press (1967): 95-113.
"Written under the auspices of the Center for International Affairs and the East Asian Research Center, Harvard University." Interviews Kennedy, Charles Stuart. "Interview with Helmut Sonnenfeldt." Association for Diplomatic Studies and Training (July 24, 2000) Foreign Affairs Oral History Project. Awards 1997 Leo Baeck Medal U.S. Navy Superior Public Service Award Sonnenfeldt has also been honored by the governments of France, Germany, Luxembourg, and Sweden. References Further reading Hunter, Edward. "Why He Was Planted in Treasury: Sonnenfeldt Case Explained." Tactics, vol. 10, no. 11 (November 20, 1973): 8-9. . "Remembering Helmut Sonnenfeldt: A Major Figure in U.S. Foreign Policy" (proceedings). Washington: Brookings Institution (November 18, 2019). External links Biography at History Commons.org site 1926 births 2012 deaths American businesspeople American foreign policy writers American male non-fiction writers American people of the Vietnam War American political writers Burials at Arlington National Cemetery City College of New York alumni Jewish emigrants from Nazi Germany to the United States International relations scholars Nixon administration personnel Operation Condor People educated at Bunce Court School People from Gardelegen Presidential Medal of Freedom recipients United States National Security Advisors Writers from Washington, D.C. Johns Hopkins University alumni
```xml <?xml version="1.0" encoding="utf-8"?> <xliff xmlns="urn:oasis:names:tc:xliff:document:1.2" xmlns:xsi="path_to_url" version="1.2" xsi:schemaLocation="urn:oasis:names:tc:xliff:document:1.2 xliff-core-1.2-transitional.xsd"> <file datatype="xml" source-language="en" target-language="cs" original="../FSStrings.resx"> <body> <trans-unit id="ArgumentsInSigAndImplMismatch"> <source>The argument names in the signature '{0}' and implementation '{1}' do not match. The argument name from the signature file will be used. This may cause problems when debugging or profiling.</source> <target state="translated">Názvy argumentů v signatuře {0} a implementaci {1} si neodpovídají. Použije se název argumentu ze souboru signatury. To může způsobit problémy při ladění nebo profilování.</target> <note /> </trans-unit> <trans-unit id="ErrorFromAddingTypeEquationTuples"> <source>Type mismatch. Expecting a tuple of length {0} of type\n {1} \nbut given a tuple of length {2} of type\n {3} {4}\n</source> <target state="translated">Neshoda typů. Očekává se řazená kolekce členů o délce {0} typu\n {1} \nale odevzdala se řazená kolekce členů o délce {2} typu\n {3}{4}\n</target> <note /> </trans-unit> <trans-unit id="HashLoadedSourceHasIssues0"> <source>One or more informational messages in loaded file.\n</source> <target state="translated">Nejméně jedna informační zpráva v načteném souboru\n</target> <note /> </trans-unit> <trans-unit id="NotUpperCaseConstructorWithoutRQA"> <source>Lowercase discriminated union cases are only allowed when using RequireQualifiedAccess attribute</source> <target state="translated">Případy sjednocení s malými písmeny jsou povolené jenom při použití atributu RequireQualifiedAccess.</target> <note /> </trans-unit> <trans-unit id="OverrideShouldBeInstance"> <source> Non-static member is expected.</source> <target state="new"> Non-static member is expected.</target> <note /> </trans-unit> <trans-unit id="OverrideShouldBeStatic"> <source> Static member is expected.</source> <target state="new"> Static member
is expected.</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOT.DOT.HAT"> <source>symbol '..^'</source> <target state="translated">symbol ..^</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERP.STRING.BEGIN.END"> <source>interpolated string</source> <target state="translated">interpolovaný řetězec</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERP.STRING.BEGIN.PART"> <source>interpolated string (first part)</source> <target state="translated">interpolovaný řetězec (první část)</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERP.STRING.END"> <source>interpolated string (final part)</source> <target state="translated">interpolovaný řetězec (poslední část)</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERP.STRING.PART"> <source>interpolated string (part)</source> <target state="translated">interpolovaný řetězec (část)</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.WHILE.BANG"> <source>keyword 'while!'</source> <target state="translated">klíčové slovo while!</target> <note /> </trans-unit> <trans-unit id="SeeAlso"> <source>. See also {0}.</source> <target state="translated">. 
Viz taky {0}.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverTupleDiffLengths"> <source>The tuples have differing lengths of {0} and {1}</source> <target state="translated">{0} a {1} mají v řazených kolekcích členů různou délku.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverInfiniteTypes"> <source>The types '{0}' and '{1}' cannot be unified.</source> <target state="translated">Typy {0} a {1} nemůžou být sjednocené.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverMissingConstraint"> <source>A type parameter is missing a constraint '{0}'</source> <target state="translated">V parametru typu chybí omezení {0}.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverTypesNotInEqualityRelation1"> <source>The unit of measure '{0}' does not match the unit of measure '{1}'</source> <target state="translated">Měrná jednotka {0} se neshoduje s měrnou jednotkou {1}.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverTypesNotInEqualityRelation2"> <source>The type '{0}' does not match the type '{1}'</source> <target state="translated">Typ {0} se neshoduje s typem {1}.</target> <note /> </trans-unit> <trans-unit id="ConstraintSolverTypesNotInSubsumptionRelation"> <source>The type '{0}' is not compatible with the type '{1}'{2}</source> <target state="translated">Typ {0} není kompatibilní s typem {1}{2}.</target> <note /> </trans-unit> <trans-unit id="ErrorFromAddingTypeEquation1"> <source>This expression was expected to have type\n '{1}' \nbut here has type\n '{0}' {2}</source> <target state="translated">U tohoto výrazu se očekával typ\n {1}, \nale tady je typu\n {0} {2}.</target> <note /> </trans-unit> <trans-unit id="ErrorFromAddingTypeEquation2"> <source>Type mismatch. Expecting a\n '{0}' \nbut given a\n '{1}' {2}\n</source> <target state="translated">Neshoda v typu. 
Očekávaný typ je \n {0}, \nale předávaný je\n {1} {2}.\n</target> <note /> </trans-unit> <trans-unit id="ErrorFromApplyingDefault1"> <source>Type constraint mismatch when applying the default type '{0}' for a type inference variable. </source> <target state="translated">Neshoda v omezení typu při použití výchozího typu {0} na proměnnou rozhraní typu </target> <note /> </trans-unit> <trans-unit id="ErrorFromApplyingDefault2"> <source> Consider adding further type constraints</source> <target state="translated"> Zvažte přidání dalších omezení typu.</target> <note /> </trans-unit> <trans-unit id="ErrorsFromAddingSubsumptionConstraint"> <source>Type constraint mismatch. The type \n '{0}' \nis not compatible with type\n '{1}' {2}\n</source> <target state="translated">Neshoda v omezení typu. Typ \n {0} \nnení kompatibilní s typem\n {1}. {2}\n</target> <note /> </trans-unit> <trans-unit id="UpperCaseIdentifierInPattern"> <source>Uppercase variable identifiers should not generally be used in patterns, and may indicate a missing open declaration or a misspelt pattern name.</source> <target state="translated">Identifikátory proměnných psané velkými písmeny se ve vzorech obecně nedoporučují. 
Můžou označovat chybějící otevřenou deklaraci nebo špatně napsaný název vzoru.</target> <note /> </trans-unit> <trans-unit id="NotUpperCaseConstructor"> <source>Discriminated union cases and exception labels must be uppercase identifiers</source> <target state="translated">Rozlišené případy typu union a popisky výjimek musí být identifikátory, které jsou psané velkými písmeny.</target> <note /> </trans-unit> <trans-unit id="FunctionExpected"> <source>This function takes too many arguments, or is used in a context where a function is not expected</source> <target state="translated">Tato funkce přebírá příliš mnoho argumentů, nebo se používá v kontextu, ve kterém se funkce neočekává.</target> <note /> </trans-unit> <trans-unit id="BakedInMemberConstraintName"> <source>Member constraints with the name '{0}' are given special status by the F# compiler as certain .NET types are implicitly augmented with this member. This may result in runtime failures if you attempt to invoke the member constraint from your own code.</source> <target state="translated">Kompilátor F# udělil omezením člena s názvem {0} zvláštní statut, protože některé typy .NET jsou o tento člen implicitně rozšířené. Pokud se budete pokoušet vyvolat omezení člena z vlastního kódu, může to způsobit potíže za běhu.</target> <note /> </trans-unit> <trans-unit id="BadEventTransformation"> <source>A definition to be compiled as a .NET event does not have the expected form. Only property members can be compiled as .NET events.</source> <target state="translated">Definice, která se má zkompilovat jako událost .NET, nemá očekávanou podobu. 
Jako události .NET se dají zkompilovat jenom členové vlastností.</target> <note /> </trans-unit> <trans-unit id="ParameterlessStructCtor"> <source>Implicit object constructors for structs must take at least one argument</source> <target state="translated">Implicitní konstruktory objektu pro struktury musí přebírat aspoň jeden argument.</target> <note /> </trans-unit> <trans-unit id="InterfaceNotRevealed"> <source>The type implements the interface '{0}' but this is not revealed by the signature. You should list the interface in the signature, as the interface will be discoverable via dynamic type casts and/or reflection.</source> <target state="translated">Typ implementuje rozhraní {0}, které ale signatura neposkytuje. Měli byste toto rozhraní uvést v signatuře, aby bylo prostřednictvím dynamického přetypování nebo reflexe zjistitelné.</target> <note /> </trans-unit> <trans-unit id="TyconBadArgs"> <source>The type '{0}' expects {1} type argument(s) but is given {2}</source> <target state="translated">Počet argumentů typu, které typ {0} očekává, je {1}, ale počet předávaných je {2}.</target> <note /> </trans-unit> <trans-unit id="IndeterminateType"> <source>Lookup on object of indeterminate type based on information prior to this program point. A type annotation may be needed prior to this program point to constrain the type of the object. This may allow the lookup to be resolved.</source> <target state="translated">Definovali jste vyhledávání u objektu neurčitého typu založeného na informacích před tímto místem v programu. Aby se typ objektu omezil, bude možná potřeba přidat před tímto místem v programu poznámku typu. 
Tím se problém s vyhledáváním pravděpodobně vyřeší.</target> <note /> </trans-unit> <trans-unit id="NameClash1"> <source>Duplicate definition of {0} '{1}'</source> <target state="translated">Duplicitní definice: {0} {1}</target> <note /> </trans-unit> <trans-unit id="NameClash2"> <source>The {0} '{1}' can not be defined because the name '{2}' clashes with the {3} '{4}' in this type or module</source> <target state="translated">{0} {1} se nedá definovat, protože název {2} a {3} {4} v tomto typu nebo modulu jsou v konfliktu.</target> <note /> </trans-unit> <trans-unit id="Duplicate1"> <source>Two members called '{0}' have the same signature</source> <target state="translated">Dva členové s názvem {0} mají stejnou signaturu.</target> <note /> </trans-unit> <trans-unit id="Duplicate2"> <source>Duplicate definition of {0} '{1}'</source> <target state="translated">Duplicitní definice: {0} {1}</target> <note /> </trans-unit> <trans-unit id="UndefinedName2"> <source> A construct with this name was found in FSharp.PowerPack.dll, which contains some modules and types that were implicitly referenced in some previous versions of F#. You may need to add an explicit reference to this DLL in order to compile this code.</source> <target state="translated"> Konstruktor s tímto názvem se našel v knihovně FSharp.PowerPack.dll, která obsahuje určité moduly a typy implicitně odkazované v některých dřívějších verzích F#. 
Abyste tento kód mohli zkompilovat, bude možná potřeba přidat explicitní odkaz na tuto knihovnu DLL.</target> <note /> </trans-unit> <trans-unit id="FieldNotMutable"> <source>This field is not mutable</source> <target state="translated">Toto pole není měnitelné.</target> <note /> </trans-unit> <trans-unit id="FieldsFromDifferentTypes"> <source>The fields '{0}' and '{1}' are from different types</source> <target state="translated">Pole {0} a {1} jsou odlišného typu.</target> <note /> </trans-unit> <trans-unit id="VarBoundTwice"> <source>'{0}' is bound twice in this pattern</source> <target state="translated">{0} má v tomto vzoru dvě vazby.</target> <note /> </trans-unit> <trans-unit id="Recursion"> <source>A use of the function '{0}' does not match a type inferred elsewhere. The inferred type of the function is\n    {1}. \nThe type of the function required at this point of use is\n    {2}   {3}\nThis error may be due to limitations associated with generic recursion within a 'let rec' collection or within a group of classes. Consider giving a full type signature for the targets of recursive calls including type annotations for both argument and return types.</source> <target state="translated">Použití funkce {0} neodpovídá typu, který se odvozuje na jiném místě. Odvozený typ funkce je\n {1}. \nTyp funkce, který se na tomto místě požaduje, je \n {2} {3}.\nTato chyba může být způsobená omezeními, která jsou přidružená k obecné rekurzi v kolekci let rec nebo ve skupině tříd. 
Zvažte možnost zadat úplnou signaturu typu pro cíle rekurzivních volání včetně poznámek typu pro argumenty i návratových typů.</target> <note /> </trans-unit> <trans-unit id="InvalidRuntimeCoercion"> <source>Invalid runtime coercion or type test from type {0} to {1}\n{2}</source> <target state="translated">Neplatný test typu nebo konverze za běhu z typu {0} na {1}\n{2}</target> <note /> </trans-unit> <trans-unit id="IndeterminateRuntimeCoercion"> <source>This runtime coercion or type test from type\n    {0} \n to \n    {1} \ninvolves an indeterminate type based on information prior to this program point. Runtime type tests are not allowed on some types. Further type annotations are needed.</source> <target state="translated">Tento test typu nebo konverze za běhu z typu\n {0} \n na \n {1} \nzahrnuje neurčitý typ založený na informacích před tímto místem v programu. Testy typu za běhu nejsou u některých typů povolené. Je potřeba, abyste k typu doplnili další poznámky.</target> <note /> </trans-unit> <trans-unit id="IndeterminateStaticCoercion"> <source>The static coercion from type\n    {0} \nto \n    {1} \n involves an indeterminate type based on information prior to this program point. Static coercions are not allowed on some types. Further type annotations are needed.</source> <target state="translated">Statická konverze z typu\n {0} \nna typ \n {1} \n zahrnuje neurčitý typ založený na informacích před tímto místem v programu. Statické konverze nejsou u některých typů povolené. Je potřeba, abyste k typu doplnili další poznámky.</target> <note /> </trans-unit> <trans-unit id="StaticCoercionShouldUseBox"> <source>A coercion from the value type \n    {0} \nto the type \n    {1} \nwill involve boxing. Consider using 'box' instead</source> <target state="translated">Při konverzi z typu hodnoty \n {0} \nna typ \n {1} \nproběhne zabalení. 
Zvažte možnost použít místo toho klíčové slovo box.</target> <note /> </trans-unit> <trans-unit id="TypeIsImplicitlyAbstract"> <source>This type is 'abstract' since some abstract members have not been given an implementation. If this is intentional then add the '[&lt;AbstractClass&gt;]' attribute to your type.</source> <target state="translated">Tento typ je abstract, protože se neimplementovali někteří abstraktní členové. Pokud je to záměr, pak k typu přidejte atribut [&lt;AbstractClass&gt;].</target> <note /> </trans-unit> <trans-unit id="NonRigidTypar1"> <source>This construct causes code to be less generic than indicated by its type annotations. The type variable implied by the use of a '#', '_' or other type annotation at or near '{0}' has been constrained to be type '{1}'.</source> <target state="translated">Konstruktor způsobuje, že kód je míň obecný, než udávají jeho poznámky typu. Proměnná typu odvozená pomocí #, _ nebo jiné poznámky typu na pozici {0} nebo blízko ní se omezila na typ {1}.</target> <note /> </trans-unit> <trans-unit id="NonRigidTypar2"> <source>This construct causes code to be less generic than indicated by the type annotations. The unit-of-measure variable '{0} has been constrained to be measure '{1}'.</source> <target state="translated">Tento konstruktor způsobuje, že kód je míň obecný, než udávají jeho poznámky typu. Proměnná měrné jednotky {0} se omezila na měrnou jednotku {1}.</target> <note /> </trans-unit> <trans-unit id="NonRigidTypar3"> <source>This construct causes code to be less generic than indicated by the type annotations. The type variable '{0} has been constrained to be type '{1}'.</source> <target state="translated">Tento konstruktor způsobuje, že kód je míň obecný, než udávají jeho poznámky typu. 
Proměnná typu {0} se omezila na typ {1}.</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.IDENT"> <source>identifier</source> <target state="translated">identifikátor</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INT"> <source>integer literal</source> <target state="translated">celočíselný literál</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FLOAT"> <source>floating point literal</source> <target state="translated">literál s plovoucí desetinnou čárkou</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DECIMAL"> <source>decimal literal</source> <target state="translated">desítkový literál</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.CHAR"> <source>character literal</source> <target state="translated">znakový literál</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BASE"> <source>keyword 'base'</source> <target state="translated">klíčové slovo base</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LPAREN.STAR.RPAREN"> <source>symbol '(*)'</source> <target state="translated">symbol (*)</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOLLAR"> <source>symbol '$'</source> <target state="translated">symbol $</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INFIX.STAR.STAR.OP"> <source>infix operator</source> <target state="translated">operátor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INFIX.COMPARE.OP"> <source>infix operator</source> <target state="translated">operátor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON.GREATER"> <source>symbol ':&gt;'</source> <target state="translated">symbol :&gt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON.COLON"> <source>symbol '::'</source> <target state="translated">symbol ::</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.PERCENT.OP"> <source>symbol '{0}</source> <target state="translated">symbol {0}</target> <note /> </trans-unit> <trans-unit 
id="Parser.TOKEN.INFIX.AT.HAT.OP"> <source>infix operator</source> <target state="translated">operátor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INFIX.BAR.OP"> <source>infix operator</source> <target state="translated">operátor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.PLUS.MINUS.OP"> <source>infix operator</source> <target state="translated">operátor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.PREFIX.OP"> <source>prefix operator</source> <target state="translated">operátor předpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON.QMARK.GREATER"> <source>symbol ':?&gt;'</source> <target state="translated">symbol :?&gt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INFIX.STAR.DIV.MOD.OP"> <source>infix operator</source> <target state="translated">operátor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INFIX.AMP.OP"> <source>infix operator</source> <target state="translated">operátor vpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.AMP"> <source>symbol '&amp;'</source> <target state="translated">symbol &amp;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.AMP.AMP"> <source>symbol '&amp;&amp;'</source> <target state="translated">symbol &amp;&amp;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BAR.BAR"> <source>symbol '||'</source> <target state="translated">symbol ||</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LESS"> <source>symbol '&lt;'</source> <target state="translated">symbol &lt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.GREATER"> <source>symbol '&gt;'</source> <target state="translated">symbol &gt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.QMARK"> <source>symbol '?'</source> <target state="translated">symbol ?</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.QMARK.QMARK"> <source>symbol '??'</source> <target state="translated">symbol 
??</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON.QMARK"> <source>symbol ':?'</source> <target state="translated">symbol :?</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INT32.DOT.DOT"> <source>integer..</source> <target state="translated">celé číslo..</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOT.DOT"> <source>symbol '..'</source> <target state="translated">symbol ..</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.QUOTE"> <source>quote symbol</source> <target state="translated">symbol citace</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.STAR"> <source>symbol '*'</source> <target state="translated">symbol *</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.HIGH.PRECEDENCE.TYAPP"> <source>type application </source> <target state="translated">použití typu </target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON"> <source>symbol ':'</source> <target state="translated">symbol :</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COLON.EQUALS"> <source>symbol ':='</source> <target state="translated">symbol :=</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LARROW"> <source>symbol '&lt;-'</source> <target state="translated">symbol &lt;-</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.EQUALS"> <source>symbol '='</source> <target state="translated">symbol =</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.GREATER.BAR.RBRACK"> <source>symbol '&gt;|]'</source> <target state="translated">symbol &gt;|]</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MINUS"> <source>symbol '-'</source> <target state="translated">symbol -</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ADJACENT.PREFIX.OP"> <source>prefix operator</source> <target state="translated">operátor předpony</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FUNKY.OPERATOR.NAME"> <source>operator name</source> <target state="translated">název 
operátora</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COMMA"> <source>symbol ','</source> <target state="translated">symbol ,</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOT"> <source>symbol '.'</source> <target state="translated">symbol .</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BAR"> <source>symbol '|'</source> <target state="translated">symbol |</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.HASH"> <source>symbol #</source> <target state="translated">symbol #</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.UNDERSCORE"> <source>symbol '_'</source> <target state="translated">symbol _</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.SEMICOLON"> <source>symbol ';'</source> <target state="translated">symbol ;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.SEMICOLON.SEMICOLON"> <source>symbol ';;'</source> <target state="translated">symbol ;;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LPAREN"> <source>symbol '('</source> <target state="translated">symbol (</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RPAREN"> <source>symbol ')'</source> <target state="translated">symbol )</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.SPLICE.SYMBOL"> <source>symbol 'splice'</source> <target state="translated">symbol splice</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LQUOTE"> <source>start of quotation</source> <target state="translated">začátek citace</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LBRACK"> <source>symbol '['</source> <target state="translated">symbol [</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LBRACK.BAR"> <source>symbol '[|'</source> <target state="translated">symbol [|</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LBRACK.LESS"> <source>symbol '[&lt;'</source> <target state="translated">symbol [&lt;</target> <note /> </trans-unit> <trans-unit 
id="Parser.TOKEN.LBRACE"> <source>symbol '{'</source> <target state="translated">symbol {</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LBRACE.LESS"> <source>symbol '{&lt;'</source> <target state="translated">symbol {&lt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BAR.RBRACK"> <source>symbol '|]'</source> <target state="translated">symbol |]</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.GREATER.RBRACE"> <source>symbol '&gt;}'</source> <target state="translated">symbol &gt;}</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.GREATER.RBRACK"> <source>symbol '&gt;]'</source> <target state="translated">symbol &gt;]</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RQUOTE"> <source>end of quotation</source> <target state="translated">konec citace</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RBRACK"> <source>symbol ']'</source> <target state="translated">symbol ]</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RBRACE"> <source>symbol '}'</source> <target state="translated">symbol }</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.PUBLIC"> <source>keyword 'public'</source> <target state="translated">klíčové slovo public</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.PRIVATE"> <source>keyword 'private'</source> <target state="translated">klíčové slovo private</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERNAL"> <source>keyword 'internal'</source> <target state="translated">klíčové slovo internal</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FIXED"> <source>keyword 'fixed'</source> <target state="translated">klíčové slovo fixed</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.CONSTRAINT"> <source>keyword 'constraint'</source> <target state="translated">klíčové slovo constraint</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INSTANCE"> <source>keyword 'instance'</source> <target state="translated">klíčové slovo 
instance</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DELEGATE"> <source>keyword 'delegate'</source> <target state="translated">klíčové slovo delegate</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INHERIT"> <source>keyword 'inherit'</source> <target state="translated">klíčové slovo inherit</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.CONSTRUCTOR"> <source>keyword 'constructor'</source> <target state="translated">klíčové slovo constructor</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DEFAULT"> <source>keyword 'default'</source> <target state="translated">klíčové slovo default</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OVERRIDE"> <source>keyword 'override'</source> <target state="translated">klíčové slovo override</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ABSTRACT"> <source>keyword 'abstract'</source> <target state="translated">klíčové slovo abstract</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.CLASS"> <source>keyword 'class'</source> <target state="translated">klíčové slovo class</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MEMBER"> <source>keyword 'member'</source> <target state="translated">klíčové slovo member</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.STATIC"> <source>keyword 'static'</source> <target state="translated">klíčové slovo static</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.NAMESPACE"> <source>keyword 'namespace'</source> <target state="translated">klíčové slovo namespace</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OBLOCKBEGIN"> <source>start of structured construct</source> <target state="translated">začátek strukturovaného konstruktoru</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OBLOCKEND"> <source>incomplete structured construct at or before this point</source> <target state="translated">neúplný strukturovaný konstruktor na této pozici nebo před ní</target> <note /> </trans-unit> <trans-unit 
id="BlockEndSentence"> <source>Incomplete structured construct at or before this point</source> <target state="translated">Neúplný strukturovaný konstruktor na této pozici nebo před ní</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OTHEN"> <source>keyword 'then'</source> <target state="translated">klíčové slovo then</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OELSE"> <source>keyword 'else'</source> <target state="translated">klíčové slovo else</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OLET"> <source>keyword 'let' or 'use'</source> <target state="translated">klíčové slovo let nebo use</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BINDER"> <source>binder keyword</source> <target state="translated">klíčové slovo vazače</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ODO"> <source>keyword 'do'</source> <target state="translated">klíčové slovo do</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.CONST"> <source>keyword 'const'</source> <target state="translated">klíčové slovo const</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OWITH"> <source>keyword 'with'</source> <target state="translated">klíčové slovo with</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OFUNCTION"> <source>keyword 'function'</source> <target state="translated">klíčové slovo function</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OFUN"> <source>keyword 'fun'</source> <target state="translated">klíčové slovo fun</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ORESET"> <source>end of input</source> <target state="translated">konec vstupu</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ODUMMY"> <source>internal dummy token</source> <target state="translated">interní fiktivní token</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ODO.BANG"> <source>keyword 'do!'</source> <target state="translated">klíčové slovo do!</target> <note /> </trans-unit> <trans-unit 
id="Parser.TOKEN.YIELD"> <source>yield</source> <target state="translated">yield</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.YIELD.BANG"> <source>yield!</source> <target state="translated">yield!</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OINTERFACE.MEMBER"> <source>keyword 'interface'</source> <target state="translated">klíčové slovo interface</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ELIF"> <source>keyword 'elif'</source> <target state="translated">klíčové slovo elif</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RARROW"> <source>symbol '-&gt;'</source> <target state="translated">symbol -&gt;</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.SIG"> <source>keyword 'sig'</source> <target state="translated">klíčové slovo sig</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.STRUCT"> <source>keyword 'struct'</source> <target state="translated">klíčové slovo struct</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.UPCAST"> <source>keyword 'upcast'</source> <target state="translated">klíčové slovo upcast</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOWNCAST"> <source>keyword 'downcast'</source> <target state="translated">klíčové slovo downcast</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.NULL"> <source>keyword 'null'</source> <target state="translated">klíčové slovo null</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.RESERVED"> <source>reserved keyword</source> <target state="translated">vyhrazené klíčové slovo</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MODULE"> <source>keyword 'module'</source> <target state="translated">klíčové slovo module</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.AND"> <source>keyword 'and'</source> <target state="translated">klíčové slovo and</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.AS"> <source>keyword 'as'</source> <target state="translated">klíčové slovo as</target> <note /> 
</trans-unit> <trans-unit id="Parser.TOKEN.ASSERT"> <source>keyword 'assert'</source> <target state="translated">klíčové slovo assert</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.ASR"> <source>keyword 'asr'</source> <target state="translated">klíčové slovo asr</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DOWNTO"> <source>keyword 'downto'</source> <target state="translated">klíčové slovo downto</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.EXCEPTION"> <source>keyword 'exception'</source> <target state="translated">klíčové slovo exception</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FALSE"> <source>keyword 'false'</source> <target state="translated">klíčové slovo false</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FOR"> <source>keyword 'for'</source> <target state="translated">klíčové slovo for</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FUN"> <source>keyword 'fun'</source> <target state="translated">klíčové slovo fun</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FUNCTION"> <source>keyword 'function'</source> <target state="translated">klíčové slovo function</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.FINALLY"> <source>keyword 'finally'</source> <target state="translated">klíčové slovo finally</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LAZY"> <source>keyword 'lazy'</source> <target state="translated">klíčové slovo lazy</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MATCH"> <source>keyword 'match'</source> <target state="translated">klíčové slovo match</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MATCH.BANG"> <source>keyword 'match!'</source> <target state="translated">klíčové slovo match!</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.MUTABLE"> <source>keyword 'mutable'</source> <target state="translated">klíčové slovo mutable</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.NEW"> <source>keyword 
'new'</source> <target state="translated">klíčové slovo new</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OF"> <source>keyword 'of'</source> <target state="translated">klíčové slovo of</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OPEN"> <source>keyword 'open'</source> <target state="translated">klíčové slovo open</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.OR"> <source>keyword 'or'</source> <target state="translated">klíčové slovo or</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.VOID"> <source>keyword 'void'</source> <target state="translated">klíčové slovo void</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.EXTERN"> <source>keyword 'extern'</source> <target state="translated">klíčové slovo extern</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INTERFACE"> <source>keyword 'interface'</source> <target state="translated">klíčové slovo interface</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.REC"> <source>keyword 'rec'</source> <target state="translated">klíčové slovo rec</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.TO"> <source>keyword 'to'</source> <target state="translated">klíčové slovo to</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.TRUE"> <source>keyword 'true'</source> <target state="translated">klíčové slovo true</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.TRY"> <source>keyword 'try'</source> <target state="translated">klíčové slovo try</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.TYPE"> <source>keyword 'type'</source> <target state="translated">klíčové slovo type</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.VAL"> <source>keyword 'val'</source> <target state="translated">klíčové slovo val</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INLINE"> <source>keyword 'inline'</source> <target state="translated">klíčové slovo inline</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.WHEN"> <source>keyword 
'when'</source> <target state="translated">klíčové slovo when</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.WHILE"> <source>keyword 'while'</source> <target state="translated">klíčové slovo while</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.WITH"> <source>keyword 'with'</source> <target state="translated">klíčové slovo with</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.IF"> <source>keyword 'if'</source> <target state="translated">klíčové slovo if</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DO"> <source>keyword 'do'</source> <target state="translated">klíčové slovo do</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.GLOBAL"> <source>keyword 'global'</source> <target state="translated">klíčové slovo global</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.DONE"> <source>keyword 'done'</source> <target state="translated">klíčové slovo done</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.IN"> <source>keyword 'in'</source> <target state="translated">klíčové slovo in</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.HIGH.PRECEDENCE.PAREN.APP"> <source>symbol '('</source> <target state="translated">symbol (</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.HIGH.PRECEDENCE.BRACK.APP"> <source>symbol'['</source> <target state="translated">symbol [</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BEGIN"> <source>keyword 'begin'</source> <target state="translated">klíčové slovo begin</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.END"> <source>keyword 'end'</source> <target state="translated">klíčové slovo end</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.HASH.ENDIF"> <source>directive</source> <target state="translated">direktiva</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.INACTIVECODE"> <source>inactive code</source> <target state="translated">neaktivní kód</target> <note /> </trans-unit> <trans-unit 
id="Parser.TOKEN.LEX.FAILURE"> <source>lex failure</source> <target state="translated">selhání lex</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.WHITESPACE"> <source>whitespace</source> <target state="translated">prázdný znak</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.COMMENT"> <source>comment</source> <target state="translated">Komentář</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LINE.COMMENT"> <source>line comment</source> <target state="translated">řádkový komentář</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.STRING.TEXT"> <source>string text</source> <target state="translated">text řetězce</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.KEYWORD_STRING"> <source>compiler generated literal</source> <target state="translated">literál generovaný kompilátorem</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BYTEARRAY"> <source>byte array literal</source> <target state="translated">literál bajtového pole</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.STRING"> <source>string literal</source> <target state="translated">řetězcový literál</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.EOF"> <source>end of input</source> <target state="translated">konec vstupu</target> <note /> </trans-unit> <trans-unit id="UnexpectedEndOfInput"> <source>Unexpected end of input</source> <target state="translated">Neočekávaný konec vstupu</target> <note /> </trans-unit> <trans-unit id="Unexpected"> <source>Unexpected {0}</source> <target state="translated">Neočekávaný {0}</target> <note /> </trans-unit> <trans-unit id="NONTERM.interaction"> <source> in interaction</source> <target state="translated"> v interakci</target> <note /> </trans-unit> <trans-unit id="NONTERM.hashDirective"> <source> in directive</source> <target state="translated"> v direktivě</target> <note /> </trans-unit> <trans-unit id="NONTERM.fieldDecl"> <source> in field declaration</source> <target state="translated"> v deklaraci 
pole</target> <note /> </trans-unit> <trans-unit id="NONTERM.unionCaseRepr"> <source> in discriminated union case declaration</source> <target state="translated"> v deklaracích rozlišených případů typu union</target> <note /> </trans-unit> <trans-unit id="NONTERM.localBinding"> <source> in binding</source> <target state="translated"> ve vazbě</target> <note /> </trans-unit> <trans-unit id="NONTERM.hardwhiteLetBindings"> <source> in binding</source> <target state="translated"> ve vazbě</target> <note /> </trans-unit> <trans-unit id="NONTERM.classDefnMember"> <source> in member definition</source> <target state="translated"> v definici člena</target> <note /> </trans-unit> <trans-unit id="NONTERM.defnBindings"> <source> in definitions</source> <target state="translated"> v definicích</target> <note /> </trans-unit> <trans-unit id="NONTERM.classMemberSpfn"> <source> in member signature</source> <target state="translated"> v signatuře člena</target> <note /> </trans-unit> <trans-unit id="NONTERM.valSpfn"> <source> in value signature</source> <target state="translated"> v signatuře hodnoty</target> <note /> </trans-unit> <trans-unit id="NONTERM.tyconSpfn"> <source> in type signature</source> <target state="translated"> v signatuře typu</target> <note /> </trans-unit> <trans-unit id="NONTERM.anonLambdaExpr"> <source> in lambda expression</source> <target state="translated"> ve výrazu lambda</target> <note /> </trans-unit> <trans-unit id="NONTERM.attrUnionCaseDecl"> <source> in union case</source> <target state="translated"> v případu typu union</target> <note /> </trans-unit> <trans-unit id="NONTERM.cPrototype"> <source> in extern declaration</source> <target state="translated"> v externí deklaraci</target> <note /> </trans-unit> <trans-unit id="NONTERM.objectImplementationMembers"> <source> in object expression</source> <target state="translated"> v objektovém výrazu</target> <note /> </trans-unit> <trans-unit id="NONTERM.ifExprCases"> <source> in if/then/else expression</source> <target 
state="translated"> ve výrazu if/then/else</target> <note /> </trans-unit> <trans-unit id="NONTERM.openDecl"> <source> in open declaration</source> <target state="translated"> v otevřené deklaraci</target> <note /> </trans-unit> <trans-unit id="NONTERM.fileModuleSpec"> <source> in module or namespace signature</source> <target state="translated"> v signatuře oboru názvů nebo modulu</target> <note /> </trans-unit> <trans-unit id="NONTERM.patternClauses"> <source> in pattern matching</source> <target state="translated"> v porovnávání vzorů</target> <note /> </trans-unit> <trans-unit id="NONTERM.beginEndExpr"> <source> in begin/end expression</source> <target state="translated"> ve výrazu begin/end</target> <note /> </trans-unit> <trans-unit id="NONTERM.recdExpr"> <source> in record expression</source> <target state="translated"> ve výrazu záznamu</target> <note /> </trans-unit> <trans-unit id="NONTERM.tyconDefn"> <source> in type definition</source> <target state="translated"> v definici typu</target> <note /> </trans-unit> <trans-unit id="NONTERM.exconCore"> <source> in exception definition</source> <target state="translated"> v definici výjimky</target> <note /> </trans-unit> <trans-unit id="NONTERM.typeNameInfo"> <source> in type name</source> <target state="translated"> v názvu typu</target> <note /> </trans-unit> <trans-unit id="NONTERM.attributeList"> <source> in attribute list</source> <target state="translated"> v seznamu atributů</target> <note /> </trans-unit> <trans-unit id="NONTERM.quoteExpr"> <source> in quotation literal</source> <target state="translated"> v literálu citace</target> <note /> </trans-unit> <trans-unit id="NONTERM.typeConstraint"> <source> in type constraint</source> <target state="translated"> v omezení typu</target> <note /> </trans-unit> <trans-unit id="NONTERM.Category.ImplementationFile"> <source> in implementation file</source> <target state="translated"> v souboru implementace</target> <note /> </trans-unit> <trans-unit 
id="NONTERM.Category.Definition"> <source> in definition</source> <target state="translated"> v definici</target> <note /> </trans-unit> <trans-unit id="NONTERM.Category.SignatureFile"> <source> in signature file</source> <target state="translated"> v souboru signatury</target> <note /> </trans-unit> <trans-unit id="NONTERM.Category.Pattern"> <source> in pattern</source> <target state="translated"> ve vzoru</target> <note /> </trans-unit> <trans-unit id="NONTERM.Category.Expr"> <source> in expression</source> <target state="translated"> ve vrazu</target> <note /> </trans-unit> <trans-unit id="NONTERM.Category.Type"> <source> in type</source> <target state="translated"> v typu</target> <note /> </trans-unit> <trans-unit id="NONTERM.typeArgsActual"> <source> in type arguments</source> <target state="translated"> v argumentech typu</target> <note /> </trans-unit> <trans-unit id="FixKeyword"> <source>keyword </source> <target state="translated">klov slovo </target> <note /> </trans-unit> <trans-unit id="FixSymbol"> <source>symbol </source> <target state="translated">symbol </target> <note /> </trans-unit> <trans-unit id="FixReplace"> <source> (due to indentation-aware syntax)</source> <target state="translated"> (kvli syntaxi, kter reflektuje odsazen)</target> <note /> </trans-unit> <trans-unit id="TokenName1"> <source>. Expected {0} or other token.</source> <target state="translated">. Oekval se token {0} nebo njak jin.</target> <note /> </trans-unit> <trans-unit id="TokenName1TokenName2"> <source>. Expected {0}, {1} or other token.</source> <target state="translated">. Oekval se token {0}, {1} nebo njak jin.</target> <note /> </trans-unit> <trans-unit id="TokenName1TokenName2TokenName3"> <source>. Expected {0}, {1}, {2} or other token.</source> <target state="translated">. 
Oekval se token {0}, {1}, {2} nebo njak jin.</target> <note /> </trans-unit> <trans-unit id="RuntimeCoercionSourceSealed1"> <source>The type '{0}' cannot be used as the source of a type test or runtime coercion</source> <target state="translated">Typ {0} se jako zdroj testu typu nebo konverze za bhu pout ned.</target> <note /> </trans-unit> <trans-unit id="RuntimeCoercionSourceSealed2"> <source>The type '{0}' does not have any proper subtypes and cannot be used as the source of a type test or runtime coercion.</source> <target state="translated">Typ {0} nem dn sprvn podtypy a ned se pout jako zdroj testu typu nebo konverze za bhu.</target> <note /> </trans-unit> <trans-unit id="CoercionTargetSealed"> <source>The type '{0}' does not have any proper subtypes and need not be used as the target of a static coercion</source> <target state="translated">Typ {0} nem dn sprvn podtypy a nen poteba ho pouvat jako cl statick konverze.</target> <note /> </trans-unit> <trans-unit id="UpcastUnnecessary"> <source>This upcast is unnecessary - the types are identical</source> <target state="translated">Toto petypovn smrem nahoru nen nutn: oba typy jsou identick.</target> <note /> </trans-unit> <trans-unit id="TypeTestUnnecessary"> <source>This type test or downcast will always hold</source> <target state="translated">Tento test typu nebo petypovn smrem dol se vdycky ulo.</target> <note /> </trans-unit> <trans-unit id="OverrideDoesntOverride1"> <source>The member '{0}' does not have the correct type to override any given virtual method</source> <target state="translated">len {0} nen sprvnho typu, aby mohl pepsat libovoln pedan virtuln metody.</target> <note /> </trans-unit> <trans-unit id="OverrideDoesntOverride2"> <source>The member '{0}' does not have the correct type to override the corresponding abstract method.</source> <target state="translated">len {0} nen sprvnho typu, aby mohl pepsat odpovdajc abstraktn metodu.</target> <note /> </trans-unit> <trans-unit 
id="OverrideDoesntOverride3"> <source> The required signature is '{0}'.</source> <target state="translated"> Poadovan signatura je {0}.</target> <note /> </trans-unit> <trans-unit id="OverrideDoesntOverride4"> <source>The member '{0}' is specialized with 'unit' but 'unit' can't be used as return type of an abstract method parameterized on return type.</source> <target state="translated">len {0} je specializovan s typem unit, ale typ unit nen mon pout jako nvratov typ abstraktn metody parametrizovan u nvratovho typu.</target> <note /> </trans-unit> <trans-unit id="UnionCaseWrongArguments"> <source>This constructor is applied to {0} argument(s) but expects {1}</source> <target state="translated">Poet argument, pro kter se pouv tento konstruktor, je {0}, ale oekvan poet je {1}.</target> <note /> </trans-unit> <trans-unit id="UnionPatternsBindDifferentNames"> <source>The two sides of this 'or' pattern bind different sets of variables</source> <target state="translated">Ob strany tohoto vzoru or vou jinou sadu promnnch.</target> <note /> </trans-unit> <trans-unit id="ValueNotContained"> <source>Module '{0}' contains\n {1} \nbut its signature specifies\n {2} \n{3}.</source> <target state="translated">Modul {0} obsahuje hodnotu\n {1}, \nale jeho signatura definuje hodnotu\n {2}. 
\n{3}.</target> <note /> </trans-unit> <trans-unit id="RequiredButNotSpecified"> <source>Module '{0}' requires a {1} '{2}'</source> <target state="translated">Modul {0} vyaduje {1} {2}.</target> <note /> </trans-unit> <trans-unit id="UseOfAddressOfOperator"> <source>The use of native pointers may result in unverifiable .NET IL code</source> <target state="translated">Pouit nativnch ukazatel me zpsobit vygenerovn neovitelnho kdu .NET IL.</target> <note /> </trans-unit> <trans-unit id="DefensiveCopyWarning"> <source>{0}</source> <target state="translated">{0}</target> <note /> </trans-unit> <trans-unit id="DeprecatedThreadStaticBindingWarning"> <source>Thread static and context static 'let' bindings are deprecated. Instead use a declaration of the form 'static val mutable &lt;ident&gt; : &lt;type&gt;' in a class. Add the 'DefaultValue' attribute to this declaration to indicate that the value is initialized to the default value on each new thread.</source> <target state="translated">Vazby let, kter jsou statick na rovni vlkna nebo kontextu, jsou zastaral. Pouijte msto nich deklaraci ve td v podob static val mutable &lt;ident&gt; : &lt;type&gt;. Pidnm atributu DefaultValue do tto deklarace mete urit, e se bude hodnota v kadm novm vlknu inicializovat na vchoz hodnotu.</target> <note /> </trans-unit> <trans-unit id="FunctionValueUnexpected"> <source>This expression is a function value, i.e. is missing arguments. Its type is {0}.</source> <target state="translated">Tento vraz je hodnotou funkce, to znamen, e u nj chyb argumenty. Je typu {0}.</target> <note /> </trans-unit> <trans-unit id="UnitTypeExpected"> <source>The result of this expression has type '{0}' and is implicitly ignored. Consider using 'ignore' to discard this value explicitly, e.g. 'expr |&gt; ignore', or 'let' to bind the result to a name, e.g. 'let result = expr'.</source> <target state="translated">Vsledek tohoto vrazu m typ {0} a implicitn se ignoruje. 
Zvate monost zahodit tuto hodnotu explicitn pomoc klovho slova ignore (napklad vraz |&gt; ignore) nebo vytvoit vazbu vsledku na nzev pomoc klovho slova let (napklad let vsledek = vraz).</target> <note /> </trans-unit> <trans-unit id="UnitTypeExpectedWithEquality"> <source>The result of this equality expression has type '{0}' and is implicitly discarded. Consider using 'let' to bind the result to a name, e.g. 'let result = expression'.</source> <target state="translated">Vsledek tohoto vrazu rovnosti m typ {0} a implicitn se zru. Zvate vytvoen vazby mezi vsledkem a nzvem pomoc klovho slova let, nap. let vsledek = vraz.</target> <note /> </trans-unit> <trans-unit id="UnitTypeExpectedWithPossiblePropertySetter"> <source>The result of this equality expression has type '{0}' and is implicitly discarded. Consider using 'let' to bind the result to a name, e.g. 'let result = expression'. If you intended to set a value to a property, then use the '&lt;-' operator e.g. '{1}.{2} &lt;- expression'.</source> <target state="translated">Vsledek tohoto vrazu rovnosti m typ {0} a implicitn se zru. Zvate vytvoen vazby mezi vsledkem a nzvem pomoc klovho slova let, napklad let vsledek = vraz. Pokud jste chtli nastavit hodnotu na vlastnost, pouijte opertor &lt;-, napklad {1}.{2} &lt;- vraz.</target> <note /> </trans-unit> <trans-unit id="UnitTypeExpectedWithPossibleAssignment"> <source>The result of this equality expression has type '{0}' and is implicitly discarded. Consider using 'let' to bind the result to a name, e.g. 'let result = expression'. If you intended to mutate a value, then mark the value 'mutable' and use the '&lt;-' operator e.g. '{1} &lt;- expression'.</source> <target state="translated">Vsledek tohoto vrazu rovnosti m typ {0} a implicitn se zru. Zvate vytvoen vazby mezi vsledkem a nzvem pomoc klovho slova let, napklad let vsledek = vraz. 
Pokud jste chtli mutovat hodnotu, oznate hodnotu jako mutable a pouijte opertor &lt;-, napklad {1} &lt;- vraz.</target> <note /> </trans-unit> <trans-unit id="UnitTypeExpectedWithPossibleAssignmentToMutable"> <source>The result of this equality expression has type '{0}' and is implicitly discarded. Consider using 'let' to bind the result to a name, e.g. 'let result = expression'. If you intended to mutate a value, then use the '&lt;-' operator e.g. '{1} &lt;- expression'.</source> <target state="translated">Vsledek tohoto vrazu rovnosti m typ {0} a implicitn se zru. Zvate vytvoen vazby mezi vsledkem a nzvem pomoc klovho slova let, napklad let vsledek = vraz. Pokud jste chtli mutovat hodnotu, pouijte opertor &lt;-, napklad {1} &lt;- vraz.</target> <note /> </trans-unit> <trans-unit id="RecursiveUseCheckedAtRuntime"> <source>This recursive use will be checked for initialization-soundness at runtime. This warning is usually harmless, and may be suppressed by using '#nowarn "21"' or '--nowarn:21'.</source> <target state="translated">U tohoto rekurzivnho pouit se bude kontrolovat stabilita inicializace za bhu. Toto upozornn je obvykle nekodn a pomoc #nowarn "21" nebo --nowarn:21 se d potlait.</target> <note /> </trans-unit> <trans-unit id="LetRecUnsound1"> <source>The value '{0}' will be evaluated as part of its own definition</source> <target state="translated">Hodnota {0} se vyhodnot v rmci jej vlastn definice.</target> <note /> </trans-unit> <trans-unit id="LetRecUnsound2"> <source>This value will be eventually evaluated as part of its own definition. You may need to make the value lazy or a function. Value '{0}'{1}.</source> <target state="translated">Tato hodnota se nakonec vyhodnot v rmci jej vlastn definice. Mon bude poteba, abyste hodnotu zmnili na opodnou nebo na funkci. 
Hodnota {0}{1}.</target> <note /> </trans-unit> <trans-unit id="LetRecUnsoundInner"> <source> will evaluate '{0}'</source> <target state="translated"> se vyhodnot jako {0}</target> <note /> </trans-unit> <trans-unit id="LetRecEvaluatedOutOfOrder"> <source>Bindings may be executed out-of-order because of this forward reference.</source> <target state="translated">Vazby se mou kvli tomuto dopednmu odkazu provdt ve patnm poad.</target> <note /> </trans-unit> <trans-unit id="LetRecCheckedAtRuntime"> <source>This and other recursive references to the object(s) being defined will be checked for initialization-soundness at runtime through the use of a delayed reference. This is because you are defining one or more recursive objects, rather than recursive functions. This warning may be suppressed by using '#nowarn "40"' or '--nowarn:40'.</source> <target state="translated">U tohoto a dalch rekurzivnch odkaz na definovan objekty se bude kontrolovat stabilita inicializace za bhu pomoc zpodnho odkazovn. Je to kvli tomu, e definujete rekurzivn objekty msto rekurzivnch funkc. Toto upozornn se d pomoc #nowarn "40" nebo --nowarn:40 potlait.</target> <note /> </trans-unit> <trans-unit id="SelfRefObjCtor1"> <source>Recursive references to the object being defined will be checked for initialization soundness at runtime through the use of a delayed reference. Consider placing self-references in members or within a trailing expression of the form '&lt;ctor-expr&gt; then &lt;expr&gt;'.</source> <target state="translated">U rekurzivnch odkaz na definovan objekty se bude kontrolovat stabilita inicializace za bhu pomoc zpodnho odkazovn. Zvate monost pidat odkazy na sebe sama do len nebo koncovho vrazu v podob &lt;vraz-konstruktoru&gt; then &lt;vraz&gt;.</target> <note /> </trans-unit> <trans-unit id="SelfRefObjCtor2"> <source>Recursive references to the object being defined will be checked for initialization soundness at runtime through the use of a delayed reference. 
Consider placing self-references within 'do' statements after the last 'let' binding in the construction sequence.</source> <target state="translated">U rekurzivnch odkaz na definovan objekty se bude kontrolovat stabilita inicializace za bhu pomoc zpodnho odkazovn. Zvate monost pidat do pkaz do za posledn vazbou let v sekvenci konstruktoru odkazy na sebe sama.</target> <note /> </trans-unit> <trans-unit id="VirtualAugmentationOnNullValuedType"> <source>The containing type can use 'null' as a representation value for its nullary union case. Invoking an abstract or virtual member or an interface implementation on a null value will lead to an exception. If necessary add a dummy data value to the nullary constructor to avoid 'null' being used as a representation for this type.</source> <target state="translated">Nadazen typ me pout null jako hodnotu reprezentace pro svj przdn ppad typu union. Vyvoln abstraktnho nebo virtulnho lena nebo implementace rozhran u hodnoty null zpsob vjimku. Pokud je to nutn, pidejte do przdnho konstruktoru fiktivn datovou hodnotu, abyste pedeli tomu, e se null pouije jako reprezentace tohoto typu.</target> <note /> </trans-unit> <trans-unit id="NonVirtualAugmentationOnNullValuedType"> <source>The containing type can use 'null' as a representation value for its nullary union case. This member will be compiled as a static member.</source> <target state="translated">Nadazen typ me pout null jako hodnotu reprezentace pro svj przdn ppad typu union. Tento len se kompiluje jako statick.</target> <note /> </trans-unit> <trans-unit id="NonUniqueInferredAbstractSlot1"> <source>The member '{0}' doesn't correspond to a unique abstract slot based on name and argument count alone</source> <target state="translated">len {0} neodpovd uniktn abstraktn datov oblasti zaloen jenom na nzvu a potu argument.</target> <note /> </trans-unit> <trans-unit id="NonUniqueInferredAbstractSlot2"> <source>. 
Multiple implemented interfaces have a member with this name and argument count</source> <target state="translated">. Vc implementovanch rozhran m lena s tmto nzvem a potem argument.</target> <note /> </trans-unit> <trans-unit id="NonUniqueInferredAbstractSlot3"> <source>. Consider implementing interfaces '{0}' and '{1}' explicitly.</source> <target state="translated">. Zvate explicitn implementaci rozhran {0} a {1}.</target> <note /> </trans-unit> <trans-unit id="NonUniqueInferredAbstractSlot4"> <source>. Additional type annotations may be required to indicate the relevant override. This warning can be disabled using '#nowarn "70"' or '--nowarn:70'.</source> <target state="translated">. Mou se vyadovat dal poznmky typu k uren pslunho pepsn. Toto upozornn se d pomoc #nowarn "70" nebo --nowarn:70 vypnout.</target> <note /> </trans-unit> <trans-unit id="Failure1"> <source>parse error</source> <target state="translated">Chyba analzy</target> <note /> </trans-unit> <trans-unit id="Failure2"> <source>parse error: unexpected end of file</source> <target state="translated">Chyba analzy: neoekvan konec souboru</target> <note /> </trans-unit> <trans-unit id="Failure3"> <source>{0}</source> <target state="translated">{0}</target> <note /> </trans-unit> <trans-unit id="Failure4"> <source>internal error: {0}</source> <target state="translated">Vnitn chyba: {0}</target> <note /> </trans-unit> <trans-unit id="FullAbstraction"> <source>{0}</source> <target state="translated">{0}</target> <note /> </trans-unit> <trans-unit id="MatchIncomplete1"> <source>Incomplete pattern matches on this expression.</source> <target state="translated">Nepln porovnvn vzor u tohoto vrazu</target> <note /> </trans-unit> <trans-unit id="MatchIncomplete2"> <source> For example, the value '{0}' may indicate a case not covered by the pattern(s).</source> <target state="translated"> Teba hodnota {0} me oznaovat ppad, na kter se vzor nevztahuje.</target> <note /> </trans-unit> <trans-unit 
id="MatchIncomplete3"> <source> For example, the value '{0}' may indicate a case not covered by the pattern(s). However, a pattern rule with a 'when' clause might successfully match this value.</source> <target state="translated"> Teba hodnota {0} me oznaovat ppad, na kter se vzor nevztahuje. Pravidlo vzoru s klauzul with se ale s touto hodnotou spn shodovat me.</target> <note /> </trans-unit> <trans-unit id="MatchIncomplete4"> <source> Unmatched elements will be ignored.</source> <target state="translated"> Nesprovan prvky se budou ignorovat.</target> <note /> </trans-unit> <trans-unit id="EnumMatchIncomplete1"> <source>Enums may take values outside known cases.</source> <target state="translated">Vty mou zskvat hodnoty mimo znm ppady.</target> <note /> </trans-unit> <trans-unit id="RuleNeverMatched"> <source>This rule will never be matched</source> <target state="translated">Pro toto pravidlo nebude nikdy existovat shoda.</target> <note /> </trans-unit> <trans-unit id="ValNotMutable"> <source>This value is not mutable. Consider using the mutable keyword, e.g. 'let mutable {0} = expression'.</source> <target state="translated">Tato hodnota nen promnliv. Zvate pouit promnlivho klovho slova, teba let mutable {0} = expression.</target> <note /> </trans-unit> <trans-unit id="ValNotLocal"> <source>This value is not local</source> <target state="translated">Tato hodnota nen lokln.</target> <note /> </trans-unit> <trans-unit id="Obsolete1"> <source>This construct is deprecated</source> <target state="translated">Tento konstruktor je zastaral.</target> <note /> </trans-unit> <trans-unit id="Obsolete2"> <source>. {0}</source> <target state="translated">. {0}</target> <note /> </trans-unit> <trans-unit id="Experimental"> <source>{0}. This warning can be disabled using '--nowarn:57' or '#nowarn "57"'.</source> <target state="translated">{0}. 
Toto upozornn se d pomoc --nowarn:57 nebo #nowarn "57" vypnout.</target> <note /> </trans-unit> <trans-unit id="PossibleUnverifiableCode"> <source>Uses of this construct may result in the generation of unverifiable .NET IL code. This warning can be disabled using '--nowarn:9' or '#nowarn "9"'.</source> <target state="translated">Pouit tohoto konstruktoru me zpsobit vygenerovn neovitelnho kdu .NET IL. Toto upozornn se d pomoc --nowarn:9 nebo #nowarn "9" vypnout.</target> <note /> </trans-unit> <trans-unit id="Deprecated"> <source>This construct is deprecated: {0}</source> <target state="translated">Tento konstruktor je zastaral: {0}</target> <note /> </trans-unit> <trans-unit id="LibraryUseOnly"> <source>This construct is deprecated: it is only for use in the F# library</source> <target state="translated">Tento konstruktor je zastaral: pouv se jenom v knihovn F#.</target> <note /> </trans-unit> <trans-unit id="MissingFields"> <source>The following fields require values: {0}</source> <target state="translated">Nsledujc pole vyaduj hodnoty: {0}</target> <note /> </trans-unit> <trans-unit id="ValueRestriction1"> <source>Value restriction. The value '{0}' has generic type\n {1} \nEither make the arguments to '{2}' explicit or, if you do not intend for it to be generic, add a type annotation.</source> <target state="translated">Omezen hodnoty. Hodnota {0} je obecnho typu\n {1}. \nBu zmte argumenty pro {2} na explicitn, nebo (pokud hodnota nem bt obecn) pidejte poznmku typu.</target> <note /> </trans-unit> <trans-unit id="ValueRestriction2"> <source>Value restriction. The value '{0}' has generic type\n {1} \nEither make '{2}' into a function with explicit arguments or, if you do not intend for it to be generic, add a type annotation.</source> <target state="translated">Omezen hodnoty. Hodnota {0} je obecnho typu\n {1}. 
\nZmte {2} na funkci s explicitnmi argumenty nebo (pokud hodnota nem bt obecn) pidejte poznmku typu.</target> <note /> </trans-unit> <trans-unit id="ValueRestriction3"> <source>Value restriction. This member has been inferred to have generic type\n {0} \nConstructors and property getters/setters cannot be more generic than the enclosing type. Add a type annotation to indicate the exact types involved.</source> <target state="translated">Omezen hodnoty. Tento len se odvodil jako len obecnho typu\n {0}. \nKonstruktory a metody getter nebo setter vlastnosti nemou bt obecnj ne nadazen typ. Pidejte poznmku typu, abyste pesn urili, kter typy se maj zahrnout.</target> <note /> </trans-unit> <trans-unit id="ValueRestriction4"> <source>Value restriction. The value '{0}' has been inferred to have generic type\n {1} \nEither make the arguments to '{2}' explicit or, if you do not intend for it to be generic, add a type annotation.</source> <target state="translated">Omezen hodnoty. Hodnota {0} se odvodila jako hodnota obecnho typu\n {1}. \nBu zmte argumenty pro {2} na explicitn, nebo (pokud hodnota nem bt obecn) pidejte poznmku typu.</target> <note /> </trans-unit> <trans-unit id="ValueRestriction5"> <source>Value restriction. The value '{0}' has been inferred to have generic type\n {1} \nEither define '{2}' as a simple data term, make it a function with explicit arguments or, if you do not intend for it to be generic, add a type annotation.</source> <target state="translated">Omezen hodnoty. Hodnota {0} se odvodila jako hodnota obecnho typu\n {1}. 
\nDefinujte {2} jako jednoduch datov vraz, zmte ji na funkci s explicitnmi argumenty nebo (pokud hodnota nem bt obecn) pidejte poznmku typu.</target> <note /> </trans-unit> <trans-unit id="RecoverableParseError"> <source>syntax error</source> <target state="translated">chyba syntaxe</target> <note /> </trans-unit> <trans-unit id="ReservedKeyword"> <source>{0}</source> <target state="translated">{0}</target> <note /> </trans-unit> <trans-unit id="IndentationProblem"> <source>{0}</source> <target state="translated">{0}</target> <note /> </trans-unit> <trans-unit id="OverrideInIntrinsicAugmentation"> <source>Override implementations in augmentations are now deprecated. Override implementations should be given as part of the initial declaration of a type.</source> <target state="translated">Implementace pepsn v rozench jsou u zastaral. Implementace pepsn by se mly provdt pi poten deklaraci typu.</target> <note /> </trans-unit> <trans-unit id="OverrideInExtrinsicAugmentation"> <source>Override implementations should be given as part of the initial declaration of a type.</source> <target state="translated">Implementace pepsn by se mly provdt pi poten deklaraci typu.</target> <note /> </trans-unit> <trans-unit id="IntfImplInIntrinsicAugmentation"> <source>Interface implementations should normally be given on the initial declaration of a type. Interface implementations in augmentations may lead to accessing static bindings before they are initialized, though only if the interface implementation is invoked during initialization of the static data, and in turn access the static data. You may remove this warning using #nowarn "69" if you have checked this is not the case.</source> <target state="translated">Implementace rozhran by obvykle mly bt zadny pro poten deklaraci typu. 
Implementace rozhran v rozench mohou vst k pstupu ke statickm vazbm ped jejich inicializac, ale pouze v ppad, e je implementace rozhran vyvolna bhem inicializace statickch dat a nsledn umon pstup ke statickm datm. Toto upozornn mete odebrat pomoc #nowarn 69, pokud jste ovili, e tomu tak nen.</target> <note /> </trans-unit> <trans-unit id="IntfImplInExtrinsicAugmentation"> <source>Interface implementations should be given on the initial declaration of a type.</source> <target state="translated">Implementace rozhran by se mly provdt pi poten deklaraci typu.</target> <note /> </trans-unit> <trans-unit id="UnresolvedReferenceNoRange"> <source>A required assembly reference is missing. You must add a reference to assembly '{0}'.</source> <target state="translated">Chyb poadovan odkaz na sestaven. Muste k sestaven {0} pidat odkaz.</target> <note /> </trans-unit> <trans-unit id="UnresolvedPathReferenceNoRange"> <source>The type referenced through '{0}' is defined in an assembly that is not referenced. You must add a reference to assembly '{1}'.</source> <target state="translated">Typ odkazovan pomoc {0} je definovan v sestaven, na kter se neodkazuje. Muste pidat odkaz na sestaven {1}.</target> <note /> </trans-unit> <trans-unit id="HashIncludeNotAllowedInNonScript"> <source>#I directives may only occur in F# script files (extensions .fsx or .fsscript). Either move this code to a script file, add a '-I' compiler option for this reference or delimit the directive with delimit it with '#if INTERACTIVE'/'#endif'.</source> <target state="translated">Direktivy #I se mou vyskytovat jenom v souborech skriptu F# (s pponou .fsx nebo .fsscript). Pesute tento kd do souboru skriptu nebo pidejte pro tento odkaz monost kompiltoru -I anebo direktivu ohranite pomoc notace #if INTERACTIVE/#endif.</target> <note /> </trans-unit> <trans-unit id="HashReferenceNotAllowedInNonScript"> <source>#r directives may only occur in F# script files (extensions .fsx or .fsscript). 
Either move this code to a script file or replace this reference with the '-r' compiler option. If this directive is being executed as user input, you may delimit it with '#if INTERACTIVE'/'#endif'.</source> <target state="translated">Direktivy #r se mou vyskytovat jenom v souborech skriptu F# (s pponou .fsx nebo .fsscript). Bu pesute tento kd do souboru skriptu, nebo nahrate tento odkaz monost kompiltoru -r. Pokud se tato direktiva provd jako uivatelsk vstup, mete ji ohraniit pomoc notace #if INTERACTIVE'/'#endif.</target> <note /> </trans-unit> <trans-unit id="HashDirectiveNotAllowedInNonScript"> <source>This directive may only be used in F# script files (extensions .fsx or .fsscript). Either remove the directive, move this code to a script file or delimit the directive with '#if INTERACTIVE'/'#endif'.</source> <target state="translated">Tato direktiva se d pout jenom v souborech skriptu F# (s pponou .fsx nebo .fsscript). Bu direktivu odeberte, nebo tento kd pesute do souboru skriptu, anebo direktivu ohranite pomoc notace #if INTERACTIVE/#endif.</target> <note /> </trans-unit> <trans-unit id="FileNameNotResolved"> <source>Unable to find the file '{0}' in any of\n {1}</source> <target state="translated">Soubor {0} se nepovedlo najt v dn(m)\n {1}.</target> <note /> </trans-unit> <trans-unit id="AssemblyNotResolved"> <source>Assembly reference '{0}' was not found or is invalid</source> <target state="translated">Odkaz na sestaven {0} se nenael nebo je neplatn.</target> <note /> </trans-unit> <trans-unit id="HashLoadedSourceHasIssues1"> <source>One or more warnings in loaded file.\n</source> <target state="translated">V natenm souboru je nejm jedno upozornn.\n</target> <note /> </trans-unit> <trans-unit id="HashLoadedSourceHasIssues2"> <source>One or more errors in loaded file.\n</source> <target state="translated">V natenm souboru je nejm jedna chyba.\n</target> <note /> </trans-unit> <trans-unit id="HashLoadedScriptConsideredSource"> <source>Loaded files may only 
be F# source files (extension .fs). This F# script file (.fsx or .fsscript) will be treated as an F# source file</source> <target state="translated">Naten soubory mou bt jenom zdrojov soubory F# (s pponou .fs). Tento soubor skriptu F# (s pponou .fsx nebo .fsscript) se zpracuje jako zdrojov soubor F#.</target> <note /> </trans-unit> <trans-unit id="InvalidInternalsVisibleToAssemblyName1"> <source>Invalid assembly name '{0}' from InternalsVisibleTo attribute in {1}</source> <target state="translated">Neplatn nzev sestaven {0} z atributu InternalsVisibleTo v {1}</target> <note /> </trans-unit> <trans-unit id="InvalidInternalsVisibleToAssemblyName2"> <source>Invalid assembly name '{0}' from InternalsVisibleTo attribute (assembly filename not available)</source> <target state="translated">Neplatn nzev sestaven {0} z atributu InternalsVisibleTo (nzev souboru sestaven nen dostupn)</target> <note /> </trans-unit> <trans-unit id="LoadedSourceNotFoundIgnoring"> <source>Could not load file '{0}' because it does not exist or is inaccessible</source> <target state="translated">Soubor {0} se nedal nast, protoe neexistuje nebo nen dostupn.</target> <note /> </trans-unit> <trans-unit id="MSBuildReferenceResolutionError"> <source>{0} (Code={1})</source> <target state="translated">{0} (Kd={1})</target> <note /> </trans-unit> <trans-unit id="TargetInvocationExceptionWrapper"> <source>internal error: {0}</source> <target state="translated">Vnitn chyba: {0}</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.LBRACE.BAR"> <source>symbol '{|'</source> <target state="translated">symbol {|</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.BAR.RBRACE"> <source>symbol '|}'</source> <target state="translated">symbol |}</target> <note /> </trans-unit> <trans-unit id="Parser.TOKEN.AND.BANG"> <source>keyword 'and!'</source> <target state="translated">klov slovo and!</target> <note /> </trans-unit> </body> </file> </xliff> ```
Las Cruces () is a municipality in the northern Guatemalan department of El Petén. It is situated at above sea level at . The municipality was founded in 2011 when it split off from the territory of La Libertad. References Municipalities of the Petén Department
Martelo (hammer) is the name for the roundhouse kick in capoeira. The kick targets the head of the opponent with the top of the foot. Martelo was not used in traditional capoeira Angola. The kick was introduced to regional capoeira in the 1930s by mestre Bimba, likely from Asian martial arts. There are several variations of the martelo kick, and it is often combined with other techniques. Variations Martelo em pé (standing hammer) Martelo em pé (standing hammer) or just martelo is a fast and powerful kick, traditionally performed with the top of the foot. The arms are important in the martelo, helping to generate power and momentum, and to preserve balance. After the kick, the kicking leg swiftly returns under control. The kick is usually delivered from the back leg in ginga. To execute the hammer kick, one adjusts the supporting foot's angle for alignment, raises the knee in front of the body, turns sideways, and forcefully extends the leg to kick with the top of the foot. Martelo do chão (ground hammer) Martelo do chão starts from the negativa position. In this position, the hand opposite to the supporting one moves towards the extended leg. The hip lifts, and the previously bent leg kicks the opponent. This kick is commonly used as an offensive move from the ground against a standing opponent. It is more common in capoeira Angola. Chapéu de couro (leather hat) Chapéu de couro (leather hat), also known as S-dobrado (double-S), is a complex move that starts on the floor and springs towards the opponent, in a slicing motion through the air, with each leg following the same path. This kick can be used to surprise the opponent by leaping at them directly from a grounded position. The player starts in the negativa position, then executes a powerful hop from their leading foot, swinging the other leg upward and forward across their body while keeping one hand on the floor. 
To gain momentum, the player should coordinate their legs, torso, free hand, and head in the same direction as the kick. In the kicking phase of the movement, the player does not reach back with the free hand to place it on the ground as they would in S-dobrado. Instead, they complete the kick with their free hand remaining off the ground and land on their kicking foot. According to Da Costa, S-dobrado follows a fake sweep. The capoeirista initiates a strong rasteira in front of the opponent, without making contact, supporting themselves with one leg and the arm on the same side. The opposite arm, on the side of the extended leg, rises for balance. When the sweeping leg approaches the supporting leg, the raised hand touches the ground. Aided by the momentum generated by the sweep, the supporting leg extends, lifting off the ground and delivering a powerful kick to the opponent. Literature References See also List of capoeira techniques Capoeira techniques Kicks
A phalashruti () is a meritorious verse in Hindu literature, appearing at the end of a text or one of its constituent sections. Such a verse describes the benefits an adherent may accrue from reciting or listening to the given text. It may also extol the prominence of a work, as well as provide the appropriate context for its perusal. Etymology Phalaśruti is a Sanskrit compound word consisting of the words phala () and śruti (), literally translating to "fruits of listening". Literature The phalashruti is featured in a number of genres of Hindu literature, such as the Upanishads, the Brahmanas, the Puranas, and the Itihasas. Bhagavata Purana The phalashruti of the Bhagavata Purana states that one who gifts the text on a full moon in the month of Bhadrapada achieves the highest goal after death. The verse acclaims the greatness of the text among other texts of the Purana genre, stating it to be analogous to the river Ganga and the deity Vishnu in terms of its virtue. Mahabharata The phalashruti of the Mahabharata describes the benefits of success, progeny, good fortune, and victory to its listeners. It also describes the purification of the listener from sins, as well as allowing them to attain heaven and become one with Brahman. A brief account of the composition of the text is also featured in the verse. Ramayana The phalashruti of the Ramayana describes the benefits of longevity, good fortune, and the destruction of all sins accrued by its listeners. It also states that offspring and wealth will be granted to its listeners, along with favourable prospects after their death. The verse describes noon and dusk as auspicious times for the recitation of the epic. Shiva Purana The phalashruti of the Shiva Purana describes the benefits of worldly pleasure, the destruction of sins, and liberation to its listeners. It encourages the work to be recited to devotees of Shiva in preference to others. 
It states that repeated listening to the text leads to increased devotion, culminating in the achievement of salvation. See also Mangalacharana References Sanskrit literature Hindu literature Sanskrit words and phrases
Hannu Mäkirinta (30 April 1949 – 17 February 1989) was a Finnish orienteering competitor. At the 1974 World Orienteering Championships in Silkeborg he finished 16th in the individual event, and received a silver medal in the relay with the Finnish team. In 1976 he finished 15th in the individual event, and received a bronze in the relay. See also Finnish orienteers List of orienteers List of orienteering events References 1949 births 1989 deaths Finnish orienteers Male orienteers Foot orienteers World Orienteering Championships medalists
```yaml
# UTF-8-encoded YAML.
#
# name
name:
# other_names: a YAML map, e.g.
# other_names: {"":"", "":"", "":"Tom"}
other_names:
# sex: M/F
sex: M
# birth: 4-digit year, or N/A
birth: 1946
# death: 4-digit year, or N/A
death:
# desc: a YAML block scalar
desc: |
  (,18)
# links: a YAML list
#
links:
  - path_to_url
```
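For illustration, a hypothetical filled-in record using these fields might look as follows (every value here is invented; only the field names come from the template, and the link placeholder is kept as-is):

```yaml
name: Jane Doe
other_names: {"en": "J. Doe"}
sex: F
birth: 1950
death: N/A
desc: |
  A multi-line description,
  stored as a YAML block scalar.
links:
  - path_to_url
```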
Sex Execs were a new wave music band from Boston, Massachusetts, active from late 1981 to mid-1984, playing bars and colleges in the Northeast. Although the group's recorded output was scant and self-released, lasting recognition came via several notable members. The band's home studio marked the formative experience of producers Paul Q. Kolderie (bass) and Sean Slade (rhythm guitar). Other members included Jim Fitting (who played saxophone for Sex Execs but became better known on harmonica), drummer Jerome Deupree (later of Morphine), and saxophonist Russ Gershon. History Style and Reception Sex Execs also achieved popularity on college radio, especially with the single "My Ex." However, the closest the band came to breaking out to wider recognition was in 1983, thanks to the fifth annual Rock 'n' Roll Rumble, sponsored by the radio station WBCN-FM. After beating the Del Fuegos in the semi-finals, they finished as runner-up to 'Til Tuesday. The Boston Globe called both finalists "style conscious" and referred to Sex Execs as "witty" and "wry". The article also noted that one judge was impressed by their tight ensemble playing of complicated arrangements and by the persona of frontman Walter Clay. Yet whereas 'Til Tuesday's performance held the promise that would eventually earn them a major-label deal, Sex Execs were viewed as short on songs. They were, however, called "slick, stylish, and tight," with frequent bursts of brass as part of their busy arrangements. (By then the combo had expanded from six to eight members with the addition of two horn players.) With the suits and ties that they sported on stage, the band was akin to Robert Palmer, who happened to be on hand for that Rumble final. 
Their originals were described as ranging from basic pop to a jazzy dissonance. Rock journalist Dave Marsh had a pithy description of the "concept funk" song "Sex Train": "David Byrne flashing a hard-on." Legacy in production In a 2015 interview, Paul Kolderie talked about how he first began to learn about recording in the Dorchester, Boston house where most of the Sex Execs lived. It was wired up as a primitive studio, and other bands came over to record as well. As Sex Execs became more successful, they started recording in professional studios such as Syncro Sound, which was owned by The Cars. Kolderie learned a lot from the engineers there. Three years later, Sean Slade echoed Kolderie in an interview during which he discussed their career as producers and how it got started at the Sex Execs' house with a four-track reel-to-reel recorder they'd bought in New York. Kolderie, Slade, and Fitting, all alumni of Yale University, then went on to found Fort Apache Studios, which continued the "do it yourself" approach they'd espoused with Sex Execs. The other founder, Joe Harvard, said that Fitting became the fourth principal because it was easier to divide the bills by four. Harvard described them as "the most grossly overeducated recording crew in the history of the world." Personnel Walter Clay – lead vocals Sean Slade – guitar, saxophone, vocals Paul Kolderie – bass Jim Fitting – baritone saxophone Russ Gershon – saxophone Andrew "Andre" Barnaby – guitar Ted Pine – keyboards, vocals Dan Johnsen – drums Jerome Deupree – drums (replaced Johnsen) Recordings Sex Execs (1982) – four-song EP "My Ex" / "Ladies' Man" (1983) – 12-inch single "Sex Train" / "Strange Things" (1984) – 12-inch single References American new wave musical groups Musical groups established in 1981 Musical groups disestablished in 1984 Musical groups from Boston
The 2015–16 Liga Alef season saw Ironi Nesher (champions of the North Division) and Maccabi Sha'arayim (champions of the South Division) win the title and promotion to Liga Leumit. The clubs ranked from 2nd to 5th place in each division competed in the promotion play-offs, in which the winners, F.C. Kafr Qasim, advanced to the final round, where they lost 1–3 on aggregate to the 14th-placed club in Liga Leumit, Hapoel Jerusalem. Thus, F.C. Kafr Qasim remained in Liga Alef. At the bottom, the two lowest-ranked clubs in each division, Maccabi Sektzia Ma'alot-Tarshiha, Ihud Bnei Majd al-Krum (from the North division), Bnei Eilat and Hapoel Morasha Ramat HaSharon (from the South division) were all automatically relegated to Liga Bet, whilst the two clubs ranked in 14th place in each division, Maccabi Daliyat al-Karmel and Maccabi Ironi Amishav Petah Tikva, entered the promotion/relegation play-offs, with Maccabi Daliyat al-Karmel prevailing to stay in Liga Alef, while Maccabi Ironi Amishav Petah Tikva were relegated after losing their play-off. Changes from last season Team changes Hapoel Katamon Jerusalem and Hapoel Ashkelon were promoted to Liga Leumit; Ironi Tiberias (to North division) and Hakoah Amidar Ramat Gan (to South division) were relegated from Liga Leumit. Beitar Nahariya and Maccabi Umm al-Fahm were relegated to Liga Bet from the North division, whilst F.C. Givat Olga folded following a merger with Hapoel Hadera; Hapoel Kafr Kanna, Hapoel Ironi Baqa al-Gharbiyye and Hapoel Iksal were promoted to the North division from Liga Bet. Maccabi Be'er Sheva and Maccabi Kiryat Malakhi were relegated to Liga Bet from the South division; Hapoel Bik'at HaYarden and Bnei Eilat were promoted to the South division from Liga Bet. North Division South Division Promotion play-offs First round Second- and third-placed clubs played a single match at home against the fourth- and fifth-placed clubs in their respective regional division. 
Hapoel Hadera and Ironi Tiberias (from the North division) and F.C. Kafr Qasim and Hapoel Azor (from the South division) advanced to the second round. Second round The winners of the first round played a single match at the home of the higher-ranked club (from each regional division). Hapoel Hadera and F.C. Kafr Qasim advanced to the third round. Third round Hapoel Hadera and F.C. Kafr Qasim faced each other in a single match at a neutral venue. The winner advanced to the fourth round against the 14th-placed club in Liga Leumit. F.C. Kafr Qasim advanced to the promotion/relegation play-offs. Fourth round - promotion/relegation play-offs F.C. Kafr Qasim faced the 14th-placed club in the 2015–16 Liga Leumit, Hapoel Jerusalem. The winner on aggregate earned a spot in the 2016–17 Liga Leumit. The matches took place on May 27 and 31, 2016. Hapoel Jerusalem won 3–1 on aggregate and remained in Liga Leumit. F.C. Kafr Qasim remained in Liga Alef. Relegation play-offs North play-off The 14th-placed club in Liga Alef North, Maccabi Daliyat al-Karmel, faced the Liga Bet North play-offs winner, Hapoel Umm al-Fahm. The winner earned a spot in the 2016–17 Liga Alef. Maccabi Daliyat al-Karmel remained in Liga Alef; Hapoel Umm al-Fahm remained in Liga Bet. South play-off The 14th-placed club in Liga Alef South, Maccabi Amishav Petah Tikva, faced the Liga Bet South play-offs winner, F.C. Tira. The winner earned a spot in the 2016–17 Liga Alef. F.C. Tira were promoted to Liga Alef; Maccabi Amishav Petah Tikva were relegated to Liga Bet. References Liga Alef North 2015/2016 The Israel Football Association Liga Alef South 2015/2016 The Israel Football Association 3 Liga Alef seasons Israel Liga Alef
Halimium lasianthum, the Lisbon false sun-rose or woolly rock rose, is a species of flowering plant in the family Cistaceae, native to the Iberian Peninsula (Portugal, western Spain and southwestern France) and Northwest Africa (Morocco). It is a spreading evergreen shrub growing to tall by wide, with grey-green leaves and bright yellow flowers in spring. The flowers may have a maroon blotch at the base of each petal. In cultivation this plant requires a sandy soil and full sun. References lasianthum Flora of Portugal Flora of Spain Endemic flora of the Iberian Peninsula
Dying to Go Home () is a Portuguese movie released in 1996. It was directed by Carlos da Silva and George Sluizer, starring Diogo Infante (as Manuel Espírito Santo) and Maria d'Aires (as Júlia Espírito Santo). The story takes place in Portugal and the Netherlands. Three languages are used during the movie (Portuguese, Dutch and English). Plot Manuel Espírito Santo (whose surname means Holy Spirit), a Portuguese immigrant in the Netherlands suffers an accident and dies. Now a ghost, he discovers that his soul cannot rest unless his body is buried in his home country. He also discovers that he can appear in living people's dreams and thereby talk with them. He appears in his sister's dream and asks her to go to Amsterdam in order to retrieve his body. Trivia Portuguese comedian Herman José has a minor role as Vasco da Gama. External links 1990s Portuguese-language films 1996 films 1990s fantasy comedy films 1996 romantic comedy films Films directed by George Sluizer Portuguese romantic comedy films
```csharp
using System.Linq;
using System.Numerics;

using Algorithms.Sequences;

using FluentAssertions;

using NUnit.Framework;

namespace Algorithms.Tests.Sequences;

[TestFixture]
public static class MatchstickTriangleSequenceTests
{
    private static BigInteger[] _testList =
    {
        0, 1, 5, 13, 27, 48, 78, 118, 170, 235,
        315, 411, 525, 658, 812, 988, 1188, 1413, 1665, 1945,
        2255, 2596, 2970, 3378, 3822, 4303, 4823, 5383, 5985, 6630,
        7320, 8056, 8840, 9673, 10557, 11493, 12483, 13528, 14630, 15790,
        17010, 18291, 19635, 21043, 22517,
    };

    /// <summary>
    /// This test uses the list values provided from path_to_url
    /// </summary>
    [Test]
    public static void TestOeisList()
    {
        var sequence = new MatchstickTriangleSequence().Sequence.Take(_testList.Length);

        sequence.SequenceEqual(_testList).Should().BeTrue();
    }
}
```
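The fixture above hard-codes the first 45 terms of the sequence. Assuming the sequence under test is OEIS A002717 (the number of triangles in a matchstick arrangement of side n), those terms can be cross-checked against its closed form a(n) = floor(n(n+2)(2n+1)/8); the function name below is illustrative and not part of the library:

```python
def matchstick_triangles(n: int) -> int:
    # Closed form for OEIS A002717: a(n) = floor(n * (n + 2) * (2n + 1) / 8).
    # Integer floor division keeps the computation exact.
    return n * (n + 2) * (2 * n + 1) // 8

# The first terms as listed in the test fixture above.
expected = [0, 1, 5, 13, 27, 48, 78, 118, 170, 235, 315, 411, 525]
assert [matchstick_triangles(n) for n in range(len(expected))] == expected
```
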
```javascript // Generated by ReScript, PLEASE EDIT WITH CARE 'use strict'; let Mt = require("./mt.js"); let Belt_Option = require("../../lib/js/belt_Option.js"); let Caml_option = require("../../lib/js/caml_option.js"); let suites_0 = [ "make", (function (param) { return { TAG: "Eq", _0: "null", _1: String(null).concat("") }; }) ]; let suites_1 = { hd: [ "fromCharCode", (function (param) { return { TAG: "Eq", _0: "a", _1: String.fromCharCode(97) }; }) ], tl: { hd: [ "fromCharCodeMany", (function (param) { return { TAG: "Eq", _0: "az", _1: String.fromCharCode(97, 122) }; }) ], tl: { hd: [ "fromCodePoint", (function (param) { return { TAG: "Eq", _0: "a", _1: String.fromCodePoint(97) }; }) ], tl: { hd: [ "fromCodePointMany", (function (param) { return { TAG: "Eq", _0: "az", _1: String.fromCodePoint(97, 122) }; }) ], tl: { hd: [ "length", (function (param) { return { TAG: "Eq", _0: 3, _1: "foo".length }; }) ], tl: { hd: [ "get", (function (param) { return { TAG: "Eq", _0: "a", _1: "foobar"[4] }; }) ], tl: { hd: [ "charAt", (function (param) { return { TAG: "Eq", _0: "a", _1: "foobar".charAt(4) }; }) ], tl: { hd: [ "charCodeAt", (function (param) { return { TAG: "Eq", _0: 97, _1: "foobar".charCodeAt(4) }; }) ], tl: { hd: [ "codePointAt", (function (param) { return { TAG: "Eq", _0: 97, _1: "foobar".codePointAt(4) }; }) ], tl: { hd: [ "codePointAt - out of bounds", (function (param) { return { TAG: "Eq", _0: undefined, _1: "foobar".codePointAt(98) }; }) ], tl: { hd: [ "concat", (function (param) { return { TAG: "Eq", _0: "foobar", _1: "foo".concat("bar") }; }) ], tl: { hd: [ "concatMany", (function (param) { return { TAG: "Eq", _0: "foobarbaz", _1: "foo".concat("bar", "baz") }; }) ], tl: { hd: [ "endsWith", (function (param) { return { TAG: "Eq", _0: true, _1: "foobar".endsWith("bar") }; }) ], tl: { hd: [ "endsWithFrom", (function (param) { return { TAG: "Eq", _0: false, _1: "foobar".endsWith("bar", 1) }; }) ], tl: { hd: [ "includes", (function (param) { return { TAG: 
"Eq", _0: true, _1: "foobarbaz".includes("bar") }; }) ], tl: { hd: [ "includesFrom", (function (param) { return { TAG: "Eq", _0: false, _1: "foobarbaz".includes("bar", 4) }; }) ], tl: { hd: [ "indexOf", (function (param) { return { TAG: "Eq", _0: 3, _1: "foobarbaz".indexOf("bar") }; }) ], tl: { hd: [ "indexOfFrom", (function (param) { return { TAG: "Eq", _0: -1, _1: "foobarbaz".indexOf("bar", 4) }; }) ], tl: { hd: [ "lastIndexOf", (function (param) { return { TAG: "Eq", _0: 3, _1: "foobarbaz".lastIndexOf("bar") }; }) ], tl: { hd: [ "lastIndexOfFrom", (function (param) { return { TAG: "Eq", _0: 3, _1: "foobarbaz".lastIndexOf("bar", 4) }; }) ], tl: { hd: [ "localeCompare", (function (param) { return { TAG: "Eq", _0: 0, _1: "foo".localeCompare("foo") }; }) ], tl: { hd: [ "match", (function (param) { return { TAG: "Eq", _0: [ "na", "na" ], _1: Caml_option.null_to_opt("banana".match(/na+/g)) }; }) ], tl: { hd: [ "match - no match", (function (param) { return { TAG: "Eq", _0: undefined, _1: Caml_option.null_to_opt("banana".match(/nanana+/g)) }; }) ], tl: { hd: [ "match - not found capture groups", (function (param) { return { TAG: "Eq", _0: [ "hello ", undefined ], _1: Belt_Option.map(Caml_option.null_to_opt("hello word".match(/hello (world)?/)), (function (prim) { return prim.slice(); })) }; }) ], tl: { hd: [ "normalize", (function (param) { return { TAG: "Eq", _0: "foo", _1: "foo".normalize() }; }) ], tl: { hd: [ "normalizeByForm", (function (param) { return { TAG: "Eq", _0: "foo", _1: "foo".normalize("NFKD") }; }) ], tl: { hd: [ "repeat", (function (param) { return { TAG: "Eq", _0: "foofoofoo", _1: "foo".repeat(3) }; }) ], tl: { hd: [ "replace", (function (param) { return { TAG: "Eq", _0: "fooBORKbaz", _1: "foobarbaz".replace("bar", "BORK") }; }) ], tl: { hd: [ "replaceByRe", (function (param) { return { TAG: "Eq", _0: "fooBORKBORK", _1: "foobarbaz".replace(/ba./g, "BORK") }; }) ], tl: { hd: [ "unsafeReplaceBy0", (function (param) { let replace = function (whole, 
offset, s) { if (whole === "bar") { return "BORK"; } else { return "DORK"; } }; return { TAG: "Eq", _0: "fooBORKDORK", _1: "foobarbaz".replace(/ba./g, replace) }; }) ], tl: { hd: [ "unsafeReplaceBy1", (function (param) { let replace = function (whole, p1, offset, s) { if (whole === "bar") { return "BORK"; } else { return "DORK"; } }; return { TAG: "Eq", _0: "fooBORKDORK", _1: "foobarbaz".replace(/ba./g, replace) }; }) ], tl: { hd: [ "unsafeReplaceBy2", (function (param) { let replace = function (whole, p1, p2, offset, s) { if (whole === "bar") { return "BORK"; } else { return "DORK"; } }; return { TAG: "Eq", _0: "fooBORKDORK", _1: "foobarbaz".replace(/ba./g, replace) }; }) ], tl: { hd: [ "unsafeReplaceBy3", (function (param) { let replace = function (whole, p1, p2, p3, offset, s) { if (whole === "bar") { return "BORK"; } else { return "DORK"; } }; return { TAG: "Eq", _0: "fooBORKDORK", _1: "foobarbaz".replace(/ba./g, replace) }; }) ], tl: { hd: [ "search", (function (param) { return { TAG: "Eq", _0: 3, _1: "foobarbaz".search(/ba./g) }; }) ], tl: { hd: [ "slice", (function (param) { return { TAG: "Eq", _0: "bar", _1: "foobarbaz".slice(3, 6) }; }) ], tl: { hd: [ "sliceToEnd", (function (param) { return { TAG: "Eq", _0: "barbaz", _1: "foobarbaz".slice(3) }; }) ], tl: { hd: [ "split", (function (param) { return { TAG: "Eq", _0: [ "foo", "bar", "baz" ], _1: "foo bar baz".split(" ") }; }) ], tl: { hd: [ "splitAtMost", (function (param) { return { TAG: "Eq", _0: [ "foo", "bar" ], _1: "foo bar baz".split(" ", 2) }; }) ], tl: { hd: [ "splitByRe", (function (param) { return { TAG: "Eq", _0: [ "a", "#", undefined, "b", "#", ":", "c" ], _1: "a#b#:c".split(/(#)(:)?/) }; }) ], tl: { hd: [ "splitByReAtMost", (function (param) { return { TAG: "Eq", _0: [ "a", "#", undefined ], _1: "a#b#:c".split(/(#)(:)?/, 3) }; }) ], tl: { hd: [ "startsWith", (function (param) { return { TAG: "Eq", _0: true, _1: "foobarbaz".startsWith("foo") }; }) ], tl: { hd: [ "startsWithFrom", (function 
(param) { return { TAG: "Eq", _0: false, _1: "foobarbaz".startsWith("foo", 1) }; }) ], tl: { hd: [ "substr", (function (param) { return { TAG: "Eq", _0: "barbaz", _1: "foobarbaz".substr(3) }; }) ], tl: { hd: [ "substrAtMost", (function (param) { return { TAG: "Eq", _0: "bar", _1: "foobarbaz".substr(3, 3) }; }) ], tl: { hd: [ "substring", (function (param) { return { TAG: "Eq", _0: "bar", _1: "foobarbaz".substring(3, 6) }; }) ], tl: { hd: [ "substringToEnd", (function (param) { return { TAG: "Eq", _0: "barbaz", _1: "foobarbaz".substring(3) }; }) ], tl: { hd: [ "toLowerCase", (function (param) { return { TAG: "Eq", _0: "bork", _1: "BORK".toLowerCase() }; }) ], tl: { hd: [ "toLocaleLowerCase", (function (param) { return { TAG: "Eq", _0: "bork", _1: "BORK".toLocaleLowerCase() }; }) ], tl: { hd: [ "toUpperCase", (function (param) { return { TAG: "Eq", _0: "FUBAR", _1: "fubar".toUpperCase() }; }) ], tl: { hd: [ "toLocaleUpperCase", (function (param) { return { TAG: "Eq", _0: "FUBAR", _1: "fubar".toLocaleUpperCase() }; }) ], tl: { hd: [ "trim", (function (param) { return { TAG: "Eq", _0: "foo", _1: " foo ".trim() }; }) ], tl: { hd: [ "anchor", (function (param) { return { TAG: "Eq", _0: "<a name=\"bar\">foo</a>", _1: "foo".anchor("bar") }; }) ], tl: { hd: [ "link", (function (param) { return { TAG: "Eq", _0: "<a href=\"path_to_url\">foo</a>", _1: "foo".link("path_to_url") }; }) ], tl: { hd: [ "File \"js_string_test.res\", line 138, characters 5-12", (function (param) { return { TAG: "Ok", _0: "ab".includes("a") }; }) ], tl: /* [] */0 } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } } }; let suites = { hd: suites_0, tl: suites_1 }; Mt.from_pair_suites("Js_string_test", suites); exports.suites = suites; /* Not a pure module */ ```
```asciidoc xref::overview/apoc.import/apoc.import.json.adoc[apoc.import.json icon:book[]] + `apoc.import.json(file,config)` - imports the json list to the provided file label:procedure[] label:apoc-core[] ```
```html <html> <head> <meta http-equiv="Content-Type" content="text/html; charset=US-ASCII"> <title>crosses (with strategy)</title> <link rel="stylesheet" href="../../../../../../../../doc/src/boostbook.css" type="text/css"> <meta name="generator" content="DocBook XSL Stylesheets V1.79.1"> <link rel="home" href="../../../../index.html" title="Chapter&#160;1.&#160;Geometry"> <link rel="up" href="../crosses.html" title="crosses"> <link rel="prev" href="../crosses.html" title="crosses"> <link rel="next" href="crosses_2.html" title="crosses"> </head> <body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF"> <table cellpadding="2" width="100%"><tr> <td valign="top"><img alt="Boost C++ Libraries" width="277" height="86" src="../../../../../../../../boost.png"></td> <td align="center"><a href="../../../../../../../../index.html">Home</a></td> <td align="center"><a href="../../../../../../../../libs/libraries.htm">Libraries</a></td> <td align="center"><a href="path_to_url">People</a></td> <td align="center"><a href="path_to_url">FAQ</a></td> <td align="center"><a href="../../../../../../../../more/index.htm">More</a></td> </tr></table> <hr> <div class="spirit-nav"> <a accesskey="p" href="../crosses.html"><img src="../../../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../crosses.html"><img src="../../../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../../../index.html"><img src="../../../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="crosses_2.html"><img src="../../../../../../../../doc/src/images/next.png" alt="Next"></a> </div> <div class="section"> <div class="titlepage"><div><div><h5 class="title"> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy"></a><a class="link" href="crosses_3_with_strategy.html" title="crosses (with strategy)">crosses (with strategy)</a> </h5></div></div></div> <p> <a class="indexterm" 
name="idp109981808"></a> Checks if two geometries crosses. </p> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h0"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.synopsis"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.synopsis">Synopsis</a> </h6> <p> </p> <pre class="programlisting"><span class="keyword">template</span><span class="special">&lt;</span><span class="keyword">typename</span> <span class="identifier">Geometry1</span><span class="special">,</span> <span class="keyword">typename</span> <span class="identifier">Geometry2</span><span class="special">,</span> <span class="keyword">typename</span> <span class="identifier">Strategy</span><span class="special">&gt;</span> <span class="keyword">bool</span> <span class="identifier">crosses</span><span class="special">(</span><span class="identifier">Geometry1</span> <span class="keyword">const</span> <span class="special">&amp;</span> <span class="identifier">geometry1</span><span class="special">,</span> <span class="identifier">Geometry2</span> <span class="keyword">const</span> <span class="special">&amp;</span> <span class="identifier">geometry2</span><span class="special">,</span> <span class="identifier">Strategy</span> <span class="keyword">const</span> <span class="special">&amp;</span> <span class="identifier">strategy</span><span class="special">)</span></pre> <p> </p> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h1"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.parameters"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.parameters">Parameters</a> </h6> <div class="informaltable"><table class="table"> <colgroup> <col> <col> <col> <col> </colgroup> <thead><tr> <th> <p> Type </p> </th> <th> <p> 
Concept </p> </th> <th> <p> Name </p> </th> <th> <p> Description </p> </th> </tr></thead> <tbody> <tr> <td> <p> Geometry1 const &amp; </p> </td> <td> <p> Any type fulfilling a Geometry Concept </p> </td> <td> <p> geometry1 </p> </td> <td> <p> A model of the specified concept </p> </td> </tr> <tr> <td> <p> Geometry2 const &amp; </p> </td> <td> <p> Any type fulfilling a Geometry Concept </p> </td> <td> <p> geometry2 </p> </td> <td> <p> A model of the specified concept </p> </td> </tr> <tr> <td> <p> Strategy const &amp; </p> </td> <td> <p> Any type fulfilling a Crosses Strategy Concept </p> </td> <td> <p> strategy </p> </td> <td> <p> The strategy which will be used for crosses calculations </p> </td> </tr> </tbody> </table></div> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h2"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.returns"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.returns">Returns</a> </h6> <p> Returns true if two geometries crosses </p> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h3"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.header"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.header">Header</a> </h6> <p> Either </p> <p> <code class="computeroutput"><span class="preprocessor">#include</span> <span class="special">&lt;</span><span class="identifier">boost</span><span class="special">/</span><span class="identifier">geometry</span><span class="special">.</span><span class="identifier">hpp</span><span class="special">&gt;</span></code> </p> <p> Or </p> <p> <code class="computeroutput"><span class="preprocessor">#include</span> <span class="special">&lt;</span><span class="identifier">boost</span><span class="special">/</span><span 
class="identifier">geometry</span><span class="special">/</span><span class="identifier">algorithms</span><span class="special">/</span><span class="identifier">crosses</span><span class="special">.</span><span class="identifier">hpp</span><span class="special">&gt;</span></code> </p> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h4"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.conformance"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.conformance">Conformance</a> </h6> <p> The function crosses implements function Crosses from the <a href="path_to_url" target="_top">OGC Simple Feature Specification</a>. </p> <h6> <a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.h5"></a> <span class="phrase"><a name="geometry.reference.algorithms.crosses.crosses_3_with_strategy.supported_geometries"></a></span><a class="link" href="crosses_3_with_strategy.html#geometry.reference.algorithms.crosses.crosses_3_with_strategy.supported_geometries">Supported geometries</a> </h6> <div class="informaltable"><table class="table"> <colgroup> <col> <col> <col> <col> <col> <col> <col> <col> <col> <col> <col> </colgroup> <thead><tr> <th> </th> <th> <p> Point </p> </th> <th> <p> Segment </p> </th> <th> <p> Box </p> </th> <th> <p> Linestring </p> </th> <th> <p> Ring </p> </th> <th> <p> Polygon </p> </th> <th> <p> MultiPoint </p> </th> <th> <p> MultiLinestring </p> </th> <th> <p> MultiPolygon </p> </th> <th> <p> Variant </p> </th> </tr></thead> <tbody> <tr> <td> <p> Point </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span 
class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> </tr> <tr> <td> <p> Segment </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> Box </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img 
src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> Linestring </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> 
<td> <p> Ring </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> Polygon </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img 
src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> MultiPoint </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> MultiLinestring </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img 
src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> MultiPolygon </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/ok.png" alt="ok"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> <tr> <td> <p> Variant </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" 
alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> <td> <p> <span class="inlinemediaobject"><img src="../../../../img/nyi.png" alt="nyi"></span> </p> </td> </tr> </tbody> </table></div> </div> <table xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" width="100%"><tr> <td align="left"></td> Gehrels, Bruno Lalande, Mateusz Loskot, Adam Wulkiewicz, Oracle and/or its affiliates<p> file LICENSE_1_0.txt or copy at <a href="path_to_url" target="_top">path_to_url </p> </div></td> </tr></table> <hr> <div class="spirit-nav"> <a accesskey="p" href="../crosses.html"><img src="../../../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../crosses.html"><img src="../../../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../../../index.html"><img src="../../../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="crosses_2.html"><img src="../../../../../../../../doc/src/images/next.png" alt="Next"></a> </div> </body> </html> ```
Jacob George Strutt (4 August 1784 – 1867) was a British portrait and landscape painter and engraver in the manner of John Constable. He was the husband of the writer Elizabeth Strutt, and father of the painter, traveller and archaeologist Arthur John Strutt. Life Strutt was born on 4 August 1784 in Colchester, in Essex, one of eight children of Benjamin Strutt and Caroline, née Pollett. In London, on 8 November 1813, he married Elizabeth Byron, with whom he had four children; their second son, Arthur John Strutt, was born in 1819. Strutt moved to Lausanne in Switzerland in about 1830. With his son Arthur he travelled in France and Switzerland from 1835 to 1837, and later to Italy; they established a studio in Rome. He returned to England in 1851, and died in Rome in 1864 or 1867. Work Strutt painted portraits and landscapes, mainly in gouache, in the style of Constable, with whom he may have studied. He was also a capable engraver. He showed work in London between 1819 and 1858. At the Royal Academy he exhibited from 1822 to 1852; in 1822 and 1823 he showed portraits, but from 1824 until 1831 showed only woodland or forest scenes. Two paintings were sent from Italy while he was living there: The Ancient Forum, Rome in 1845, and in 1851 Tasso's Oak, Rome. He published two books of poetry in translation, and several books of engravings. Publications Claudius Claudianus, Jacob George Strutt (1814). The Rape of Proserpine: with Other Poems, from Claudian; translated into English verse. With a prefatory discourse, and occasional notes. London: Printed by A. J. Valpy, sold by Longman, Hurst, Rees, Orme, and Brown. John Milton, The Latin and Italian Poems of Milton. Translated into English verse by J. G. Strutt. London: J. Conder, 1814. Bury St. Edmunds illustrated in Twelve Etchings by J.G. Strutt. London: J.G. Strutt, 1821. Sylva Britannica, or, Portraits of forest trees, distinguished for their antiquity, magnitude, or beauty. 
London: The author 1822; Full text of expanded 1830 edition. Deliciae sylvarum, or, Grand and romantic forest scenery in England and Scotland, drawn from nature, and etched by Jacob George Strutt. London: J. G. Strutt [1828]. Notes References 19th-century British painters British male painters 1784 births 1867 deaths Place of birth missing Artists from Colchester 19th-century British male artists
```html <!DOCTYPE html> <html lang="en"> <head> <meta charset="utf-8"> <link rel="shortcut icon" type="image/png" href="../assets/img/favicon.ico"> <link rel="stylesheet" href="../assets/lib/cssgrids.css"> <link rel="stylesheet" href="../assets/css/main.css" id="site_styles"> <script src="../assets/lib/yui-min.js"></script> <script src="../assets/js/api-prettify.js"></script> <script src="../assets/js/api-filter.js"></script> <script src="../assets/js/api-list.js"></script> <script src="../assets/js/api-search.js"></script> <script src="../assets/js/api-docs.js"></script> <title>nunuStudio BaseNode</title> </head> <body class="yui3-skin-sam"> <div id="doc"> <div id="hd" class="yui3-g header"> <div class="yui3-u-3-4"> <h1><a href="../index.html"><img src="../assets/img/logo.png" title=""></a></h1> </div> </div> <div id="bd" class="yui3-g"> <div class="yui3-u-1-4"> <div id="docs-sidebar" class="sidebar"> <div id="api-list"> <h2 class="off-left">APIs</h2> <div id="api-tabview" class="tabview"> <div id="api-tabview-filter"> <input type="search" id="api-filter" placeholder="Type to filter APIs"> </div> <ul class="tabs"> <li><a href="#api-classes">Classes</a></li> <li><a href="#api-modules">Modules</a></li> </ul> <div id="api-tabview-panel"> <ul id="api-classes" class="apis classes"> <li><a href="../classes/AfterimagePass.html">AfterimagePass</a></li> <li><a href="../classes/AmbientLight.html">AmbientLight</a></li> <li><a href="../classes/AnimationMixer.html">AnimationMixer</a></li> <li><a href="../classes/AnimationTimer.html">AnimationTimer</a></li> <li><a href="../classes/App.html">App</a></li> <li><a href="../classes/ARHandler.html">ARHandler</a></li> <li><a href="../classes/ArraybufferUtils.html">ArraybufferUtils</a></li> <li><a href="../classes/Audio.html">Audio</a></li> <li><a href="../classes/AudioEmitter.html">AudioEmitter</a></li> <li><a href="../classes/AudioLoader.html">AudioLoader</a></li> <li><a href="../classes/Base64Utils.html">Base64Utils</a></li> <li><a 
href="../classes/BaseNode.html">BaseNode</a></li> <li><a href="../classes/BillboardGroup.html">BillboardGroup</a></li> <li><a href="../classes/BloomPass.html">BloomPass</a></li> <li><a href="../classes/BokehPass.html">BokehPass</a></li> <li><a href="../classes/BufferUtils.html">BufferUtils</a></li> <li><a href="../classes/ByteArrayUtils.html">ByteArrayUtils</a></li> <li><a href="../classes/CanvasSprite.html">CanvasSprite</a></li> <li><a href="../classes/CanvasTexture.html">CanvasTexture</a></li> <li><a href="../classes/CapsuleBufferGeometry.html">CapsuleBufferGeometry</a></li> <li><a href="../classes/ColorifyPass.html">ColorifyPass</a></li> <li><a href="../classes/CompressedTexture.html">CompressedTexture</a></li> <li><a href="../classes/CopyPass.html">CopyPass</a></li> <li><a href="../classes/CSS3DObject.html">CSS3DObject</a></li> <li><a href="../classes/CSS3DRenderer.html">CSS3DRenderer</a></li> <li><a href="../classes/CSS3DSprite.html">CSS3DSprite</a></li> <li><a href="../classes/CubeCamera.html">CubeCamera</a></li> <li><a href="../classes/CubeTexture.html">CubeTexture</a></li> <li><a href="../classes/DataTexture.html">DataTexture</a></li> <li><a href="../classes/DirectionalLight.html">DirectionalLight</a></li> <li><a href="../classes/DirectionalLightCSM.html">DirectionalLightCSM</a></li> <li><a href="../classes/DotScreenPass.html">DotScreenPass</a></li> <li><a href="../classes/EffectComposer.html">EffectComposer</a></li> <li><a href="../classes/EventManager.html">EventManager</a></li> <li><a href="../classes/FileSystem.html">FileSystem</a></li> <li><a href="../classes/FilmPass.html">FilmPass</a></li> <li><a href="../classes/FirstPersonControls.html">FirstPersonControls</a></li> <li><a href="../classes/Fog.html">Fog</a></li> <li><a href="../classes/Font.html">Font</a></li> <li><a href="../classes/FontLoader.html">FontLoader</a></li> <li><a href="../classes/FXAAPass.html">FXAAPass</a></li> <li><a href="../classes/Gamepad.html">Gamepad</a></li> <li><a 
href="../classes/GeometryLoader.html">GeometryLoader</a></li> <li><a href="../classes/Group.html">Group</a></li> <li><a href="../classes/Gyroscope.html">Gyroscope</a></li> <li><a href="../classes/HemisphereLight.html">HemisphereLight</a></li> <li><a href="../classes/HTMLView.html">HTMLView</a></li> <li><a href="../classes/HueSaturationPass.html">HueSaturationPass</a></li> <li><a href="../classes/Image.html">Image</a></li> <li><a href="../classes/ImageLoader.html">ImageLoader</a></li> <li><a href="../classes/InstancedMesh.html">InstancedMesh</a></li> <li><a href="../classes/Key.html">Key</a></li> <li><a href="../classes/Keyboard.html">Keyboard</a></li> <li><a href="../classes/LegacyGeometryLoader.html">LegacyGeometryLoader</a></li> <li><a href="../classes/LensFlare.html">LensFlare</a></li> <li><a href="../classes/LightProbe.html">LightProbe</a></li> <li><a href="../classes/LocalStorage.html">LocalStorage</a></li> <li><a href="../classes/Material.html">Material</a></li> <li><a href="../classes/MaterialLoader.html">MaterialLoader</a></li> <li><a href="../classes/MathUtils.html">MathUtils</a></li> <li><a href="../classes/Mesh.html">Mesh</a></li> <li><a href="../classes/Model.html">Model</a></li> <li><a href="../classes/Mouse.html">Mouse</a></li> <li><a href="../classes/NodeScript.html">NodeScript</a></li> <li><a href="../classes/Nunu.html">Nunu</a></li> <li><a href="../classes/Object3D.html">Object3D</a></li> <li><a href="../classes/ObjectLoader.html">ObjectLoader</a></li> <li><a href="../classes/ObjectUtils.html">ObjectUtils</a></li> <li><a href="../classes/OperationNode.html">OperationNode</a></li> <li><a href="../classes/OrbitControls.html">OrbitControls</a></li> <li><a href="../classes/OrthographicCamera.html">OrthographicCamera</a></li> <li><a href="../classes/ParametricBufferGeometry.html">ParametricBufferGeometry</a></li> <li><a href="../classes/ParticleDistributions.html">ParticleDistributions</a></li> <li><a 
href="../classes/ParticleEmitter.html">ParticleEmitter</a></li> <li><a href="../classes/ParticleEmitterControl.html">ParticleEmitterControl</a></li> <li><a href="../classes/ParticleEmitterControlOptions.html">ParticleEmitterControlOptions</a></li> <li><a href="../classes/ParticleGroup.html">ParticleGroup</a></li> <li><a href="../classes/Pass.html">Pass</a></li> <li><a href="../classes/PerspectiveCamera.html">PerspectiveCamera</a></li> <li><a href="../classes/PhysicsGenerator.html">PhysicsGenerator</a></li> <li><a href="../classes/PhysicsObject.html">PhysicsObject</a></li> <li><a href="../classes/PointLight.html">PointLight</a></li> <li><a href="../classes/PositionalAudio.html">PositionalAudio</a></li> <li><a href="../classes/Program.html">Program</a></li> <li><a href="../classes/PythonScript.html">PythonScript</a></li> <li><a href="../classes/RectAreaLight.html">RectAreaLight</a></li> <li><a href="../classes/RendererConfiguration.html">RendererConfiguration</a></li> <li><a href="../classes/RendererState.html">RendererState</a></li> <li><a href="../classes/RenderPass.html">RenderPass</a></li> <li><a href="../classes/Resource.html">Resource</a></li> <li><a href="../classes/ResourceManager.html">ResourceManager</a></li> <li><a href="../classes/RoundedBoxBufferGeometry.html">RoundedBoxBufferGeometry</a></li> <li><a href="../classes/Scene.html">Scene</a></li> <li><a href="../classes/Script.html">Script</a></li> <li><a href="../classes/ShaderAttribute.html">ShaderAttribute</a></li> <li><a href="../classes/ShaderPass.html">ShaderPass</a></li> <li><a href="../classes/ShaderUtils.html">ShaderUtils</a></li> <li><a href="../classes/SimplexNoise.html">SimplexNoise</a></li> <li><a href="../classes/Skeleton.html">Skeleton</a></li> <li><a href="../classes/SkinnedMesh.html">SkinnedMesh</a></li> <li><a href="../classes/Sky.html">Sky</a></li> <li><a href="../classes/SobelPass.html">SobelPass</a></li> <li><a href="../classes/SpineAnimation.html">SpineAnimation</a></li> <li><a 
href="../classes/SpineTexture.html">SpineTexture</a></li> <li><a href="../classes/SpotLight.html">SpotLight</a></li> <li><a href="../classes/Sprite.html">Sprite</a></li> <li><a href="../classes/SpriteSheetTexture.html">SpriteSheetTexture</a></li> <li><a href="../classes/SSAONOHPass.html">SSAONOHPass</a></li> <li><a href="../classes/SSAOPass.html">SSAOPass</a></li> <li><a href="../classes/SSAOShader.html">SSAOShader</a></li> <li><a href="../classes/TargetConfig.html">TargetConfig</a></li> <li><a href="../classes/TechnicolorPass.html">TechnicolorPass</a></li> <li><a href="../classes/TerrainBufferGeometry.html">TerrainBufferGeometry</a></li> <li><a href="../classes/TextBitmap.html">TextBitmap</a></li> <li><a href="../classes/TextFile.html">TextFile</a></li> <li><a href="../classes/TextMesh.html">TextMesh</a></li> <li><a href="../classes/TextSprite.html">TextSprite</a></li> <li><a href="../classes/Texture.html">Texture</a></li> <li><a href="../classes/TextureLoader.html">TextureLoader</a></li> <li><a href="../classes/Timer.html">Timer</a></li> <li><a href="../classes/TizenKeyboard.html">TizenKeyboard</a></li> <li><a href="../classes/Tree.html">Tree</a></li> <li><a href="../classes/TreeUtils.html">TreeUtils</a></li> <li><a href="../classes/TwistModifier.html">TwistModifier</a></li> <li><a href="../classes/TypedArrayHelper.html">TypedArrayHelper</a></li> <li><a href="../classes/UnitConverter.html">UnitConverter</a></li> <li><a href="../classes/UnrealBloomPass.html">UnrealBloomPass</a></li> <li><a href="../classes/Video.html">Video</a></li> <li><a href="../classes/VideoLoader.html">VideoLoader</a></li> <li><a href="../classes/VideoStream.html">VideoStream</a></li> <li><a href="../classes/VideoTexture.html">VideoTexture</a></li> <li><a href="../classes/Viewport.html">Viewport</a></li> <li><a href="../classes/VRHandler.html">VRHandler</a></li> <li><a href="../classes/WebcamTexture.html">WebcamTexture</a></li> <li><a href="../classes/WorkerPool.html">WorkerPool</a></li> 
<li><a href="../classes/WorkerTask.html">WorkerTask</a></li> <li><a href="../classes/{Object} ParticleGroupOptions.html">{Object} ParticleGroupOptions</a></li> </ul> <ul id="api-modules" class="apis modules"> <li><a href="../modules/Animation.html">Animation</a></li> <li><a href="../modules/Animations.html">Animations</a></li> <li><a href="../modules/Audio.html">Audio</a></li> <li><a href="../modules/BinaryUtils.html">BinaryUtils</a></li> <li><a href="../modules/Cameras.html">Cameras</a></li> <li><a href="../modules/Controls.html">Controls</a></li> <li><a href="../modules/Core.html">Core</a></li> <li><a href="../modules/Files.html">Files</a></li> <li><a href="../modules/Input.html">Input</a></li> <li><a href="../modules/Lights.html">Lights</a></li> <li><a href="../modules/Loaders.html">Loaders</a></li> <li><a href="../modules/Meshes.html">Meshes</a></li> <li><a href="../modules/Misc.html">Misc</a></li> <li><a href="../modules/Particles.html">Particles</a></li> <li><a href="../modules/Physics.html">Physics</a></li> <li><a href="../modules/Postprocessing.html">Postprocessing</a></li> <li><a href="../modules/Resources.html">Resources</a></li> <li><a href="../modules/Runtime.html">Runtime</a></li> <li><a href="../modules/Script.html">Script</a></li> <li><a href="../modules/Sprite.html">Sprite</a></li> <li><a href="../modules/Textures.html">Textures</a></li> <li><a href="../modules/THREE.html">THREE</a></li> <li><a href="../modules/Utils.html">Utils</a></li> </ul> </div> </div> </div> </div> </div> <div class="yui3-u-3-4"> <!--<div id="api-options"> Show: <label for="api-show-inherited"> <input type="checkbox" id="api-show-inherited" checked> Inherited </label> <label for="api-show-protected"> <input type="checkbox" id="api-show-protected"> Protected </label> <label for="api-show-private"> <input type="checkbox" id="api-show-private"> Private </label> <label for="api-show-deprecated"> <input type="checkbox" id="api-show-deprecated"> Deprecated </label> </div>--> <div 
class="apidocs"> <div id="docs-main"> <div class="content"> <h1>BaseNode Class</h1> <div class="box meta"> Module: <a href="../modules/Script.html">Script</a> </div> <div class="box intro"> <p>Base nodes are used as a basis for all other nodes; they implement the necessary common functionality for all nodes.</p> <p>Base nodes add a destructible function with a button which allows the user to destroy them.</p> <p>When the node gets destroyed it automatically gets removed from the graph.</p> </div> <div class="constructor"> <h2>Constructor</h2> <div id="method_BaseNode" class="method item"> <h3 class="name"><code>BaseNode</code></h3> <span class="paren">()</span> <div class="meta"> <p> </p> </div> <div class="description"> </div> </div> </div> <div id="classdocs" class="tabview"> <ul class="api-class-tabs"> <li class="api-class-tab index"><a href="#index">Index</a></li> <li class="api-class-tab attrs"><a href="#attrs">Attributes</a></li> </ul> <div> <div id="index" class="api-class-tabpanel index"> <h2 class="off-left">Item Index</h2> <div class="index-section attrs"> <h3>Attributes</h3> <ul class="index-list attrs"> <li class="index-item attr"> <a href="#attr_destroyButton">destroyButton</a> </li> </ul> </div> </div> <div id="attrs" class="api-class-tabpanel"> <h2 class="off-left">Attributes</h2> <div id="attr_destroyButton" class="attr item"> <a name="config_destroyButton"></a> <h3 class="name"><code>destroyButton</code></h3> <span class="type">Circle</span> <div class="meta"> <p> </p> </div> <div class="description"> <p>Button used to destroy the node and remove it from the graph.</p> </div> </div> </div> </div> </div> </div> </div> </div> </div> </div> </div> </body> </html> ```
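The page above documents only the `BaseNode` class and its `destroyButton` attribute, so the behavior it describes ("when the node gets destroyed it automatically gets removed from the graph") can be illustrated with a minimal, self-contained sketch. This is not the actual nunuStudio implementation: the `Graph`, `addNode`, and `destroy` names are illustrative assumptions, and only `BaseNode` and `destroyButton` come from the documentation itself.

```javascript
// Hypothetical sketch of the behavior described by the BaseNode docs:
// a node carries a destroy button, and destroying the node removes it
// from the graph that owns it. Graph/addNode/destroy are illustrative
// names, not the real nunuStudio API.

class Graph {
  constructor() {
    this.nodes = [];
  }

  // Attach a node to this graph so it can later remove itself.
  addNode(node) {
    node.graph = this;
    this.nodes.push(node);
    return node;
  }

  removeNode(node) {
    this.nodes = this.nodes.filter((n) => n !== node);
  }
}

class BaseNodeSketch {
  constructor() {
    this.graph = null;
    // Stand-in for the documented "destroyButton" attribute: clicking
    // the button destroys the node.
    this.destroyButton = {onClick: () => this.destroy()};
  }

  // Destroying the node automatically removes it from its graph,
  // mirroring the contract stated in the documentation.
  destroy() {
    if (this.graph !== null) {
      this.graph.removeNode(this);
    }
  }
}
```

Clicking the destroy button of a node that was added to a graph leaves the graph without that node, which is the contract the documentation describes.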
The 2015 Plymouth City Council election took place on 7 May 2015 to elect members of Plymouth City Council in England. The Labour Party lost its narrow majority, resulting in no party having overall control. Background Plymouth City Council held local elections on 7 May 2015 along with councils across the United Kingdom as part of the 2015 local elections. The council elects its councillors in thirds: one third of the seats is up for election in each of three consecutive years, with no election in the fourth year. Councillors defending their seats in this election were previously elected in 2011. In that election, twelve Labour candidates and eight Conservative candidates were elected. Two Labour candidates had been elected in St Peter and the Waterfront due to a by-election coinciding with the council election, only one of whom was defending their seat in this election. Overall results A total of 19 seats were contested, with 117,348 votes cast. Note: All changes in vote share are in comparison to the corresponding 2011 election. The Labour Party lost its majority on the council, leaving the council under no overall control. After the previous election, the composition of the council was: After this election, the composition of the council was: Ward results Asterisks denote sitting councillors seeking re-election. Budshead Compton Devonport Drake Efford and Lipson Eggbuckland Ham Honicknowle Moor View Peverell Plympton Erle Plympton St Mary Plymstock Dunstone Plymstock Radford Southway St Budeaux Stoke St Peter and the Waterfront Sutton and Mount Gould Aftermath The Labour Party lost control of the council, remaining the largest party but falling one seat short of an overall majority. References Plymouth May 2015 events in the United Kingdom 2015 2010s in Devon
Éric Paul Maurice Garcin (born 6 December 1965) is a French professional football coach and a former player. References External links 1965 births Living people Footballers from Avignon French men's footballers Toulouse FC players Nîmes Olympique players Le Mans FC players Motherwell F.C. players Dundee F.C. players Ligue 1 players Ligue 2 players Scottish Football League players Scottish Premier League players French expatriate men's footballers Expatriate men's footballers in Scotland French football managers Grenoble Foot 38 managers FC Rouen managers Men's association football defenders AC Avignonnais players
Tennis events were contested at the 1959 Summer Universiade in Turin, Italy. Medal summary Medal table See also Tennis at the Summer Universiade External links World University Games Tennis on HickokSports.com (Archived1) World University Games Tennis on HickokSports.com (Archived2) 1959 Universiade 1959 Summer Universiade
Joshua (also known as The Devil's Child) is a 2007 American psychological thriller film co-written and directed by George Ratliff. The film stars Sam Rockwell, Vera Farmiga and Jacob Kogan. Joshua premiered at the 2007 Sundance Film Festival and was released on July 6, 2007, in the United States by Fox Searchlight Pictures. The film received favorable reviews from critics, with praise for the performances, atmosphere and horror elements, but criticism of its plot and characters. Plot Brad and Abby Cairn are an affluent New York couple with two children. Their firstborn, Joshua, a 9-year-old who dresses conservatively by his own choice, is a child prodigy on the piano and thinks and acts old beyond his years. Joshua gravitates toward his oblivious Uncle Ned as a close friend, but distances himself from his parents, particularly following the birth of his sister, Lily. As the days pass, bizarre events transpire and the household shifts from healthy and happy to strange and disorienting. As the baby's cries drive an already strained Abby to the point of a nervous breakdown, Joshua exhibits outright sociopathic behavior, while Brad finds he can't help his wife, do well at his demanding job, or match wits with his strange son. Joshua kills the family's dog, but says he doesn't know what happened, then imitates his father's grief as if he himself is incapable of feeling sadness. He next causes a fight between his mother (who is Jewish, but non-religious) and his paternal grandmother Hazel (who is an Evangelical Christian and constantly proselytizes Joshua) when he tells his parents he wants to become a Christian. Later, he convinces his mother to play hide-and-seek. As Abby counts, he takes his sister from her crib to hide with him, causing his mother to panic and pass out while searching for them, before he puts his sister back into the crib to make it look as though his mother was hallucinating.
Seeing that Abby is getting worse, Brad takes two weeks off from his job to look after her and the children. When he arrives home, Joshua has gone to the Brooklyn Museum with Lily and Hazel. Joshua frightens Hazel by describing the violent acts of Seth, the Egyptian god of chaos. At home, Brad watches a videotape of Joshua deliberately making Lily cry in the middle of the night, realizing that Joshua is behind her incessant nighttime crying. He arrives at the museum just in time to see Joshua attempt to push his sister down a flight of stairs. Joshua stops when Hazel catches him, but pushes her down instead, killing her and disguising it as an accident. Brad confides in Ned that he thinks Joshua did it, but Ned doesn't believe him. By this point, Abby has had a complete psychotic breakdown and been institutionalized. That night, Brad installs a lock on his bedroom door and tells Joshua that his sister will be sleeping with him, fearing Joshua will attempt to do something to Lily. He also brings Betsy, a psychologist, to meet Joshua. Betsy comes to the erroneous conclusion that Joshua is being abused, further frustrating Brad. Brad tells Joshua he is sending him away to a boarding school, causing Joshua to run away. When Brad arrives home, he immediately sees Joshua's school backpack on the kitchen floor. Brad searches the residence, calling out for Joshua numerous times, without success. Later in the evening, while Brad is lying with Lily on his bed lulling her to sleep, he faintly hears Joshua crying and finds him hiding in the kitchen. Joshua begs Brad not to send him away and refuses to crawl out of his hiding place, so Brad pulls him out, making Joshua cry out in pain. Brad then discovers a large bruise on Joshua's back and repeatedly asks who gave it to him, but all Joshua will say is that he slipped.
The next morning, Brad and Joshua go for a walk with Lily, but Joshua steals her pacifier, causing her to cry. When Brad confronts him, Joshua begins to mock him, provoking Brad to strike him. Brad tries to apologize, but Joshua taunts him further, driving Brad to beat his son in public, which strengthens the appearance that Joshua is being abused and sends Brad to jail for assault. It is indicated that Joshua also framed his father for tampering with Abby's medications, suggesting that Brad will spend the rest of his life in prison, leaving Ned to adopt Joshua and Lily. In the last scene, Ned sits with Joshua at the piano and the two compose a song. Joshua sings that neither of his parents will ever be loved by anyone now, and that he only wanted to be with Ned and got rid of everyone else. Through this, Ned finally realizes what Joshua has done and looks at him with a disturbed glance. Cast Release The film was directed by George Ratliff, who co-wrote the screenplay with David Gilbert. It was produced by Johnathan Dorfman and ATO Pictures, and was distributed by Fox Searchlight Pictures and 20th Century Fox. The film was a special selection at the Sundance Film Festival in January 2007. Joshua was given a limited release in the United States on July 6, 2007. Music Beethoven's Piano Sonata No. 12 (Funeral March movement) was used widely in the film, and was learned and played by 12-year-old Jacob Kogan. The soundtrack was written by Nico Muhly and was made available to download via iTunes. Reception Box office The film made $53,233 in its opening weekend from 6 screens, averaging $8,538 per theater. It made $482,355 domestically in the United States, and $237,613 in other territories, for a total worldwide gross of $719,968. Critical response On Rotten Tomatoes, the film has an approval rating of 62% based on 99 reviews, with an average rating of 6.3/10.
The site's consensus reads, "Though Joshua is ultimately too formulaic, its intelligence and suspenseful buildup heighten the overall creep factor." On Metacritic, the film scored a 69 out of 100 rating, based on 25 critic reviews, indicating "generally favorable reviews". Duane Byrge of The Hollywood Reporter said that the film was "a brilliant house-of-horror tale with Hitchcockian flare". Owen Gleiberman of Entertainment Weekly said that the film is "something vitally new... that has a cool and savvy fun with your fears" — he also noted that it is "a superbly crafted psychological thriller". Elizabeth Weitzman of New York Daily News wrote: "None of this would have worked without the ideal child, and Kogan, making his movie debut, gets a difficult role down perfectly. Ratliff avoids turning him into a typical bad seed, and shows the same restraint at nearly every turn. With deliberate pacing, well-placed scares, and a pitch-black sense of humor, Ratliff keeps us guessing until the stunner finish." Maitland McDonagh of TV Guide gave the film 2.5 stars out of 4, writing: "Ratliff and co-screenwriter David Gilbert are clearly aiming for the highbrow suspense market rather than the down-and-dirty horror crowd, but their script's obviousness strips the story of suspense and turns it into a tedious slog to a predestined end." Mick LaSalle of the San Francisco Chronicle wrote: "Two things: the audience is ahead of the movie, and the movie never catches up. In the first scene we look at the kid, we look at the family and we get it. We get all there is to get. A strangely composed boy sits in his suit and tie, banging away at a Bartók piano piece for a school recital, while his parents ooh and aah over their newborn daughter. Joshua is a very bad kid, and he is going to do very bad things." 
References External links 2007 films 2007 independent films 2007 thriller films 2007 psychological thriller films American psychological thriller films Films about dysfunctional families Films scored by Nico Muhly Films set in New York City Films shot in New York City Fox Searchlight Pictures films 2000s English-language films 2000s American films
Hamadi Krouma is a town and commune in Skikda Province in north-eastern Algeria. References Communes of Skikda Province Skikda Province
Charles Drake (born Charles Ruppert; October 2, 1917 – September 10, 1994) was an American actor. Biography Drake was born in New York City. He graduated from Nichols College and became a salesman. In 1939, he turned to acting and signed a contract with Warner Bros., but he was not immediately successful. Drake served in the U.S. Army during World War II. Drake returned to Hollywood in 1945 and was cast in Conflict which starred Humphrey Bogart. His contract with Warner Brothers eventually ended. In the 1940s, he did some freelance work, including A Night in Casablanca (1946). In 1949, he moved to Universal Studios, where he co-starred with James Stewart and Shelley Winters in Winchester '73 (1950) and again co-starred with Stewart in the film Harvey (also 1950) a screen adaptation of the Broadway play. He co-starred in the Audie Murphy biopic To Hell and Back (1955), as Murphy's close friend "Brandon". In 1955, Drake turned to television as one of the stock-company players on Montgomery's Summer Stock, a summer replacement for Robert Montgomery Presents and from 1957 he hosted the syndicated TV espionage weekly Schilling Playhouse (also known as Rendezvous). In 1956 Drake appeared as Tom Sweeny with Murphy and Anne Bancroft in Walk the Proud Land. In 1959, he teamed again with Audie Murphy, this time in the Western film No Name on the Bullet, with Murphy in a rare villainous role as a hired assassin and Drake playing a small-town doctor trying to stop his reign of terror. On November 14, 1961, Drake played state line boss Allen Winter in the episode "The Accusers" of NBC's Laramie Western series. On February 6, 1963, Drake played Hollister in the Wagon Train episode "The Hollister John Garrison Story". He also played Charles Maury in "The Charles Maury Story" in 1958, Season 1, episode 32. Drake played the part of Oliver Greer in The Fugitive episode "The One That Got Away" (1967). 
He guest-starred in the fourth season (1968–1969) of NBC's Daniel Boone as Simon Jarvis. In 1969, Drake appeared as Milo Cantrell on the TV series The Virginian in the episode titled "A Woman of Stone." In 1970 he appeared as Randolf in "The Men From Shiloh" (the rebranded name of The Virginian) in the episode titled "Jenny." He played in eighty-three films between 1939 and 1975, including Scream, Pretty Peggy. More than fifty were dramas, but he also acted in comedies, science fiction, horror, and film noir. In an episode of the original Star Trek series ("The Deadly Years", 1967), he guested as Commodore Stocker. He died on September 10, 1994, in East Lyme, Connecticut, at the age of 76. Selected filmography Career (1939) - Rex Chaney Conspiracy (1939) - Police Guard (uncredited) The Hunchback of Notre Dame (1939) - Young Priest (uncredited) I Wanted Wings (1941) - Cadet (uncredited) Affectionately Yours (1941) - Hospital Intern (uncredited) Million Dollar Baby (1941) - Pamela's First Dance Partner (uncredited) Out of the Fog (1941) - Reporter (uncredited) Sergeant York (1941) - Scorer (uncredited) Dive Bomber (1941) - Pilot (uncredited) Navy Blues (1941) - Sea bag Inspection Officer (uncredited) Nine Lives Are Not Enough (1941) - 'Snappy' Lucas One Foot in Heaven (1941) - Second Bridegroom (uncredited) The Maltese Falcon (1941) - Reporter (uncredited) The Body Disappears (1941) - Arthur (scenes deleted) Dangerously They Live (1941) - Joe, Hospital Orderly with Dr. Murdock (uncredited) You're in the Army Now (1941) - Private (uncredited) The Man Who Came to Dinner (1942) - Sandy Bullet Scars (1942) - Harry a Reporter (uncredited) The Male Animal (1942) - Student (uncredited) Larceny, Inc. (1942) - R.V. 
Boyce - Driver in Accident (uncredited) Yankee Doodle Dandy (1942) - (uncredited) Wings for the Eagle (1942) - Customer (uncredited) The Gay Sisters (1942) - Man Entering Courtroom (uncredited) Busses Roar (1942) - Eddie Sloan Across the Pacific (1942) - Officer (uncredited) Now, Voyager (1942) - Leslie Trotter (uncredited) The Hard Way (1943) - Trailer Narrator (uncredited) Air Force (1943) - Navigator Conflict (1945) - Prof. Norman Holsworth You Came Along (1945) - Lt. R. Janoschek Whistle Stop (1946) - Ernie Winter Wonderland (1946) - Steve Kirk A Night in Casablanca (1946) - Pierre The Pretender (1947) - Dr. Leonard G. Koster The Tender Years (1948) - Bob Wilson The Babe Ruth Story (1948) - Reporter (uncredited) Tarzan's Magic Fountain (1949) - Mr. Dodd Johnny Stool Pigeon (1949) - Hotel Clerk (uncredited) Comanche Territory (1950) - Stacey Howard I Was a Shoplifter (1950) - Herb Klaxon Louisa (1950) - Voice of Radio Broadcaster (uncredited) Winchester '73 (1950) - Steve Miller Peggy (1950) - Tom Fielding Deported (1950) - Voice of Customs Official (uncredited) Mystery Submarine (1950) - Commodore (voice, uncredited) Harvey (1950) - Dr. Raymond Sanderson Air Cadet (1951) - Captain Sullivan The Fat Man (1951) - Radio Broadcaster at Racetrack (voice, uncredited) Little Egypt (1951) - Oliver Doane You Never Can Tell (1951) - Perry Collins The Treasure of Lost Canyon (1952) - Jim Anderson Red Ball Express (1952) - Pvt. Ronald Partridge / Narrator Bonzo Goes to College (1952) - Malcolm Drew Gunsmoke (1953) - Johnny Lake The Lone Hand (1953) - George Hadley It Came from Outer Space (1953) - Sheriff Matt Warren War Arrow (1953) - Sgt. Luke Schermerhorn The Glenn Miller Story (1954) - Don Haynes Tobor the Great (1954) - Dr. Ralph Harrison Four Guns to the Border (1954) - Jim Flannery To Hell and Back (1955) - Brandon Female on the Beach (1955) - Police Lieutenant Galley All That Heaven Allows (1955) - Mick Anderson The Price of Fear (1956) - Police Sgt. 
Pete Carroll Walk the Proud Land (1956) - Tom Sweeny Jeanne Eagels (1957) - John Donahue Until They Sail (1957) - Capt. Richard Bates Step Down to Terror (1958) - Johnny Williams Walters No Name on the Bullet (1959) - Luke Canfield Tammy Tell Me True (1961) - Buford Woodly Back Street (1961) - Curt Stanton Showdown (1963) - Bert Pickett The Lively Set (1964) - Paul Manning Dear Heart (1964) - Frank Taylor The Third Day (1965) - Lawrence Conway The Money Jungle (1967) - Harvey Sheppard Valley of the Dolls (1967) - Kevin Gillmore The Counterfeit Killer (1968) - Dolan The Swimmer (1968) - Howard Graham Hail, Hero! (1969) - Senator Murchiston The Arrangement (1969) - Finnegan The Seven Minutes (1971) - Sgt. Kellogg The Screaming Woman (1972, TV Movie) - Ken Bronson Scream, Pretty Peggy (1973, TV Movie) - George Thornton The Lives of Jenny Dolan (1975, TV Movie) - Alan Hardesty My Brother's Wedding (1983) - Pastor #2 (final film role) References External links 1917 births 1994 deaths 20th-century American male actors American male film actors American male stage actors American male television actors Male actors from New York City Military personnel from New York City Military personnel from New York (state) Nichols College alumni United States Army personnel of World War II Western (genre) television actors
```java
package io.ray.runtime.functionmanager;

import com.google.common.base.Objects;
import com.google.common.collect.ImmutableList;
import io.ray.runtime.generated.Common.Language;
import java.io.Serializable;
import java.util.List;

/** Represents metadata of Java function. */
public final class JavaFunctionDescriptor implements FunctionDescriptor, Serializable {

  private static final long serialVersionUID = -2137471820857197094L;

  /** Function's class name. */
  public final String className;

  /** Function's name. */
  public final String name;

  /** Function's signature. */
  public final String signature;

  public JavaFunctionDescriptor(String className, String name, String signature) {
    this.className = className;
    this.name = name;
    this.signature = signature;
  }

  @Override
  public String toString() {
    return className + "." + name;
  }

  @Override
  public boolean equals(Object o) {
    if (this == o) {
      return true;
    }
    if (o == null || getClass() != o.getClass()) {
      return false;
    }
    JavaFunctionDescriptor that = (JavaFunctionDescriptor) o;
    return Objects.equal(className, that.className)
        && Objects.equal(name, that.name)
        && Objects.equal(signature, that.signature);
  }

  @Override
  public int hashCode() {
    return Objects.hashCode(className, name, signature);
  }

  @Override
  public List<String> toList() {
    return ImmutableList.of(className, name, signature);
  }

  @Override
  public Language getLanguage() {
    return Language.JAVA;
  }
}
```
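JavaFunctionDescriptor is an immutable value object: equality, hashing, and the list form are all derived from the same three string fields, so two descriptors naming the same method compare equal and can serve as map or cache keys. A minimal, self-contained sketch of that behaviour using only the JDK (`java.util.Objects` standing in for Guava; the `Descriptor` class and sample names below are hypothetical, for illustration only):

```java
import java.util.List;
import java.util.Objects;

// Hypothetical stand-in mirroring JavaFunctionDescriptor's value semantics,
// using java.util.Objects instead of Guava.
final class Descriptor {
    final String className;
    final String name;
    final String signature;

    Descriptor(String className, String name, String signature) {
        this.className = className;
        this.name = name;
        this.signature = signature;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Descriptor)) return false;
        Descriptor d = (Descriptor) o;
        return Objects.equals(className, d.className)
            && Objects.equals(name, d.name)
            && Objects.equals(signature, d.signature);
    }

    @Override
    public int hashCode() {
        return Objects.hash(className, name, signature);
    }

    // Flattened form, analogous to toList() above.
    List<String> toList() {
        return List.of(className, name, signature);
    }

    @Override
    public String toString() {
        return className + "." + name;
    }
}

public class Main {
    public static void main(String[] args) {
        Descriptor a = new Descriptor("io.ray.demo.Counter", "increment", "()V");
        Descriptor b = new Descriptor("io.ray.demo.Counter", "increment", "()V");
        System.out.println(a);           // io.ray.demo.Counter.increment
        System.out.println(a.equals(b)); // true
        System.out.println(a.toList());
    }
}
```

Because equality is purely field-based, a runtime could, for example, cache already-resolved reflection handles keyed by descriptor rather than re-resolving the method each call.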
The Dorset dialect is the traditional dialect spoken in Dorset, a county in the West Country of England. Stemming from Old West Saxon, it is preserved in the isolated Blackmore Vale, although it fell somewhat into disuse during the earlier part of the 20th century, when the arrival of the railways brought the customs and language of other parts of the country, in particular London. The rural dialect is still spoken in some villages, however, and is kept alive in the poems of William Barnes and Robert Young. Origins and distribution Dorset (or archaically, Dorsetshire) is a county in South West England on the English Channel coast. It borders Devon to the west, Somerset to the north-west, Wiltshire to the north-east, and Hampshire to the east. The Dorset dialect is derivative of the Wessex dialect which is spoken, with regional variations, in Dorset, Wiltshire, Somerset and Devon. It was mainly spoken in the Blackmore Vale in North Dorset, not so prevalent in the south of the county and less so in the south-east, which was historically in Hampshire prior to local government re-organisation in 1974. The Dorset dialect stems from Saxon with heavy Norse influence. The Saxon invaders who landed in Dorset and Hampshire towards the end of the 6th century hailed from what is now the south of Denmark and the Saxon islands of Heligoland, Busen and Nordstrand. The dialect of the Saxons who settled in what became Wessex was very different from that of the Saxons who settled in the east and south-east of England, being heavily influenced by their Danish neighbours. The Anglo-Saxon Chronicle records that Jutes occupied the area before the Saxons arrived, and there are a number of Old Norse words entrenched in the Dorset language, 'dwell' for example. Phonology Dorset is a medium-sized county in the South West of England which has a distinct accent and dialect.
Some of the distinct features of the accent include: H-dropping, glottalization, rhoticity and accentuated vowel sounds. Consonants A prominent feature in the accent is the use of a t-glottalization, commonly used when it is in the last syllable of a multi-syllable word. The sound is pronounced when it precedes an and sometimes on other occasions. The voiceless in words such as think is replaced with the voiced sound as in the. The voiced also replaces the 'double d', so ladder becomes la(th)er. The letters and , if the first or last letter of a word, are pronounced as and respectively. However, words that are not of Germanic origin or have been adopted from other languages retain their original sound; family, figure, factory, scene, sabbath for example, are not pronounced vamily, vigure, vactory, zene and zabbath. The becomes a if it appears before an sound so eleven sounds like 'elebn'. The 'z' and the 'v' in Dorset are used to distinguish words which, in standard English, sound the same: sea and see, son and sun, foul and fowl become sea and zee, son and zun, and foul and vowl for example. The liquid consonants and are treated differently in the Dorset dialect. When 'r' and 'l' come together, a 'd' or 'e' sound is put between them, so curl and twirl become curel and twirel or as often, curdl and twirdl. Although the accent has some rhoticity, meaning the letter in words is pronounced, so for example, "hard" is pronounced and not ; the 'r' is omitted when it comes before some open and closed palate letters. Therefore words like burst, first, force and verse, are pronounced bu'st, vu'st, fwo'ss and ve'ss. Other consonants are left out when they immediately precede a hard consonant in the following word: bit of cheese becomes bit o' cheese but bit of an apple often remains bit ov an apple. This is not always the case though. Sometimes the labiodental fricative is also elided along with following sounds. 
For example, "all of it" is often spoken as "all o't" and "all of 'em" becomes "all o'm". Similarly "let us" becomes "le's" and "better than that" becomes "better 'n 'at". The sound is also often transposed. Words such as clasp and crisp become claps and crips in the dialect. Other examples of this type of pronunciation include ax for ask, and the use of the word wopsy for a wasp. When starts a word, it is sometimes given an sound. Examples of this include eet for yet, and eesterday for yesterday. The letter is often dropped from words, so "hello" becomes "ello", but is also added where none would be in standard English. This usually occurs when the Friesic equivalent root word begins with an aspirated . So the words "kwing", meaning quick, and "kring", meaning bend, from which the English words "wing" and "ring" are derived, are voiced as "hwing" and "hring" respectively. Vowels The sound in some words, such as bean, clean, lean and mead, is voiced as a , but this is not always the case; bead, meat, read keep the monophthong but use the short sound. The words head and lead, pronounced and in standard English, also use this sound. Words in the lexical set are generally spoken with the diphthong, such as in bake, cake, late and lane. The standard English in words such as beg, leg, peg, are given the short . Egg thus becomes agg, which gives rise to the Dorset dialect word for egg collecting, aggy. In a few words where precedes , as in arm, charm and garden, the vowel sound is pronounced as or . The short sound in words such as dust, crust and rut is usually pronounced in the Dorset dialect as an diphthong to make dowst, crowst and rowt. Vowel sounds are sometimes preceded by a sound, particularly the sound in words such as boil, spoil and point, and the English long . Barnes' book, Poems of Rural Life in the Dorset Dialect, contains the poem Woak were Good Enough Woonce which begins: Ees; now mahogany’s the goo, An’ good wold English woak won’t do.
I wish vo’k always mid auvord Hot meals upon a woakèn bwoard, As good as think that took my cup An’ trencher all my growèn up. Grammar Adjectives Adjectives in the dialect often end 'en', more so than in standard English, which still retains wooden to describe something made of wood but would not use 'leatheren' to describe something made of leather. A paper bag in Dorset would be a bag to put paper in, as opposed to a paperen bag, a bag made of paper. A woaken bwoard, in the Barnes poem above, is a board made from oak. Some nouns, when pluralised, also end in 'en' instead of the more usual 's' or 'es'. Cheese, house and place for example become cheesen, housen and pleacen. Other unconventional plurals in the dialect include words ending 'st' such as coast, post and fist. Normally pluralised with the addition of an 's', these instead take 'es' to make coastes, postes and vistes. Nouns There are two different classes of noun in the Dorset dialect, and each has its own personal pronoun. Things that have no fixed shape or form, such as sand, water, dust, etc., more or less follow the rules of standard English, in that they take the pronoun "it". However things with a given shape such as a tree or a brick use the personal pronoun, "he". Referring to a felled tree, someone from Dorset might say, "I chopped 'e down" but when talking about a diminishing stream, "It's a-drying up". The objective case of he is "en", thus "I chopped 'e down" but "'E felled en". Instead of the usual two, the Dorset dialect has four demonstrative pronouns. In addition to "this" and "that", which are used for the nouns without fixed form, there are also "thease" and "thic" respectively. Thus, "Teake thease fork and pitch that hay" and "'Old thik can while I pour this paint in". These demonstrative pronouns can help remove ambiguity, for when a Dorset man says 'that stone' he is talking about a load of broken stone but if he says 'thik stone', he is talking about a particular stone.
He will say, "Pick it up" when referring to the former but "Pick en up" when talking about the latter. The use and formation of pronouns differ from standard English. When emphatic pronouns are used obliquely, for example, the nominative rather than the objective form is employed, thus "Give the gun to I" but, unemphatically, "Give me the gun". 'Self' is inflected in common with other nouns when used in conjunction with personal pronouns; in the same way one would say 'his book' or 'their book', the Dorset speech uses hisself and theirselves, not himself and themselves. When dialect speakers discuss a quantity or a count, the units are given before the tens; 'four and twenty' for example, not 'twenty-four'. Verbs Many verbs in the dialect are conjugated in an unorthodox fashion, notably 'to be', which goes: I be, thou bist, you be, we be, they be, and not: I am, you are, we are, they are. 'Is' is, however, sometimes used for he, she and it, and in the past tense 'were' is used for all the personal pronouns except the now largely archaic, but still used, 'thou', which takes 'werst'. 'Was' is not used. In the perfect tense, verbs are often preceded by an 'a'; I've a-been, I had a-been, I shall have a-been, for example. There is no distinction between the auxiliary verbs 'may' and 'might'; instead 'mid' is used in both cases. When auxiliary verbs end in 'd' or 's', 'en' is added at the end to express the negative. 'Could not', 'should not', 'might not', 'must not', become 'coulden', 'shoulden', 'midden' and 'mussen'. Although the last two examples, 'might' and 'must', end with 't', the Dorset equivalents are sounded with 'd' and 's' respectively. Verbs in the past tense have both an aorist and an imperfect tense form which indicates whether the action is ongoing or repeated.
To say "The kids stole the apples from the tree", for example, means it occurred once, but to say "The kids did steal the apples from the tree" means it is a recurrent event. Verbs in the infinitive mood, or those used in conjunction with an auxiliary verb, often have 'y' attached to the end, but only when the verb is absolute. One might ask "Can ye sewy?" but never "Will you sewy a patch on?" Some verbs which are irregular in mainstream English are treated as regular in the Dorset dialect, and vice versa. For example: blew, built and caught are blowed, builded and catched, whereas scrape becomes scrope. When forming the perfect participle, a letter 'a' at the beginning of the verb acts as an augment. Thus, "He have alost his watch" or "She have abroke the vase". Coupled with the accentuated pronunciation of the vowels, this makes for a smooth, flowing dialect by diluting the hard consonants in the language. Punning Puns, humour which exploits the similar sounds of two different words, rarely work in the Dorset dialect. Many like-sounding words in standard English are not pronounced the same in Dorset. For example, the classic pun, "The people told the sexton and the sexton toll'd the bell", would sound as, "The people twold the sex'on and the sex'on tolled the bell". Dialect words beginning with 's' are spoken with a 'z' if they are Germanic in origin, but words that entered the language later are not. 'Sun' is 'zun' but 'son' keeps the 's' sound. 'Scene' is the same but 'seen' is 'zeen'. The letter 'f', if the first or last of a word, is pronounced as a 'v' but again, only if the word is derived from the original Saxon. The verb 'fall' and 'fall' meaning autumn are 'vall' and 'fall' respectively, and one would immediately know what is meant by "This chicken is foul", because fowl is pronounced 'vowl'. Words and phrases Dorset is home to some distinctive words and phrases.
Some phrases are alternative versions of common English idioms, such as Don't teach yer grandma to spin, equivalent to standard English 'Don't teach your grandmother to suck eggs', and Zet the fox to keep the geese, similar to 'Putting the fox in charge of the henhouse', but others are peculiar to Dorset. All the goo, meaning 'all the fashion', was how Barnes described the then new fad for mahogany furniture in his poem Woak Was Good Enough Woonce, and That'll happen next Niver'stide refers to something that will never happen. To hold wi' the hare and run wi' the hounds is another typical Dorset saying and refers to hedging one's bets or trying to cover all the bases. Someone from Dorset might say, I do live too near a wood to be frightened by an owl, to indicate that they know enough about something not to be worried by it. There are many words to refer to 'a bite to eat'; it is said that a Dorset man has eight meals a day: dewbit, breakfast, nuncheon, cruncheon, luncheon, nammet, crammet and supper. Many 'dialect' words are contractions: bumbye and bimeby are short for 'by-and-by', didden for 'didn't', and gramfer and grammer are for 'grandfather' and 'grandmother' respectively. The word 'like' is often used as a qualifier for an adjective and is attached to the end of the sentence. To say, "He's ill, like" means he is 'rather' ill. In literature William Barnes was born in Bagber in 1801. He wrote three volumes of poetry in the Dorset dialect, the first of which, Poems of Rural Life in the Dorset Dialect, was published in 1844. Barnes hated what he called 'foreign' words and avoided the use of them in his poetry, preferring instead to use the Saxon language. Where there was no Saxon equivalent for modern terms, Barnes would concoct words and phrases, such as 'push wainling' for perambulator. Barnes had studied Celtic literature and often used a repetition of consonantal sounds known as cynghanedd.
This is particularly noticeable in the poem, "My Orcha'd in Linden Lea". Barnes also produced works about the phonology, grammar and vocabulary of the Dorset dialect: "A Grammar and Glossary of the Dorset Dialect", published in 1863, and a much expanded version, "A Glossary of the Dorset Dialect with Grammar of its Word-shapening and Wording", in 1886. Another poet who wrote in the local dialect was Robert Young whose work includes, "Rabin Hill's Visit to the Railway: What he Zeed and Done, and What he Zed About It", published in two parts in 1864, and "Rabin Hill's Excursion to Western-Super-Mare to see the Opening of the New Peir", published in 1867. Thomas Hardy, the renowned Dorset novelist, contributed Dorset dialect words to Joseph Wright’s "English Dialect Dictionary" and the "Oxford English Dictionary". Hardy also had his poetry published but used a mixture of Dorset dialect and standard English. Instead of writing in the Dorset dialect, like Barnes and Young, Hardy used it only in his characters' dialogue. J. K. Rowling used the Dorset dialect word for a bumblebee, dumbledore, for one of the characters in her Harry Potter books, whom she saw as bumbling about his study, humming to himself. PJ Harvey composed her book-length narrative poem Orlam in the Dorset dialect, and later used it in her album I Inside the Old Year Dying. Decline Preserved in the isolated Blackmore Vale, use of the dialect began to decline from the mid-nineteenth century when it was exposed to other English variations. The arrival of the railways, around this time, brought an influx of tourists to Dorset, while land enclosure and the repeal of the Corn Laws, caused mass unemployment in the mainly rural county, forcing farmers to seek work in other parts of the country. Attempts to standardise English began as early as the 16th century and by the mid-nineteenth century had also had a profound effect on local dialects, particularly in the south-west. 
Dialect was actively discouraged in schools at this time, and the introduction of compulsory education for young children hastened its decline. Thomas Hardy noted in 1883 that, "Having attended the National School they [the children] would mix the printed tongue as taught therein with the unwritten, dying, Wessex English they had learnt of their parents, the result of this transitional state of affairs being a composite language without rule or harmony". It has also been suggested by Jason Sullock in his 2012 book, "Oo do ee think ee are?", that West Country dialects are a source of some derision, leading many local speakers to water them down or abandon them altogether. The same point is made in Alan Chedzoy's "The People's Poet: William Barnes of Dorset". However, the Dorset dialect is still spoken in some villages. It also features in the Scrumpy and Western music of Dorset bands like The Yetties, Who's Afeard and The Skimmity Hitchers, and is kept alive in the literature of Thomas Hardy, William Barnes and Robert Young. See also William Barnes The Yetties Citations References Culture in Dorset English language in England British English History of Dorset Dialects of English
Bates is an unincorporated community in Grant County, Oregon, United States. It has a post office with ZIP code 97817. The elevation is . Bates was a lumber mill town until 1975, with a population of up to 400. Bates State Park opened in 2011. Climate This climatic region is typified by large seasonal temperature differences, with warm to hot (and often humid) summers and cold (sometimes severely cold) winters. According to the Köppen Climate Classification system, Bates has a humid continental climate, abbreviated "Dfb" on climate maps. References External links History of Bates at Oregon Encyclopedia Company towns in Oregon Unincorporated communities in Grant County, Oregon Unincorporated communities in Oregon
San Francisco Naciff Airport (, ) is a public use airport near Naciff in the Beni Department of Bolivia. See also Transport in Bolivia List of airports in Bolivia References External links OpenStreetMap - Naciff OurAirports - Naciff Fallingrain - Naciff Airport Bing Maps - Naciff Airports in Beni Department
A list of rivers of Saxony, Germany: A Alte Luppe B Bahra Bahre Batschke Bauerngraben Biela Black Elster Black Pockau Bobritzsch Borlasbach Brunndöbra Burgauenbach C Chemnitz Colmnitzbach Cunnersdorfer Wasser D Dahle Döllnitz E Eastern Rietzschke Elbe Eula F Fällbach Feilebach Fleißenbach Flöha Freiberger Mulde Friesenbach G Geberbach Gimmlitz Goldbach Göltzsch Gösel Gottleuba Greifenbach Große Bockau Große Lößnitz Große Mittweida Große Pyra Große Röder Großschweidnitzer Wasser Gruna Grundwasser H Hammerbach Haselbach Helfenberger Bach Hoyerswerdaer Schwarzwasser J Jahna Jahnabach Jauer K Kabelske Käbnitz Kaitzbach Kaltenbach Kemmlitzbach Keppbach Ketzerbach Kirnitzsch Kleine Bockau Kleine Luppe Kleine Pleiße Kleine Pyra Kleine Röder, tributary of the Black Elster Kleine Röder, tributary of the Große Röder Kleine Spree Kleine Triebisch Kleinwaltersdorfer Bach Klosterwasser Kotitzer Wasser Krebsgraben Krippenbach L Lachsbach Landgraben Landwasser Langes Wasser Lausenbach Lausur Legnitzka Leinegraben Leubnitzbach Litte Löbauer Wasser Lober Lockwitzbach Lossa Lößnitzbach Lungwitzbach Luppe Lusatian Neisse M Maltengraben Mandau Müglitz Mühlgrundbach Mulde Münzbach N Nahle Natzschung Neue Luppe Northern Rietzschke O Oberhermsdorfer Bach Oelsabach Orla Otterbach P Parthe Paußnitz Pietzschebach Pleiße Pließnitz Pöbelbach Pöhlbach Pöhlwasser Polenz Pösgraben Preßnitz Prießnitz Pulsnitz Q Quänebach R Räderschnitza Raklitza Red Mulde Red Pockau Red Weißeritz Romereifeldgraben Rosenbach Roter Graben Rotes Wasser Ruhlander Schwarzwasser S Satkula Schaukelgraben Schirmbach Schlettenbach Schlumper Schnauder Schwarzbach, tributary of the Große Mittweida Schwarzbach, tributary of the Mulde Schwarzbach, tributary of the Sebnitz Schwarzbach, tributary of the White Elster Schwarze Röder Schwarzer Bach Schwarzer Graben Schwarzer Schöps Schwarzwasser, tributary of the Mulde Schwarzwasser, tributary of the Preßnitz Schweinitz Schwennigke Sebnitz Sehma Seidewitz Seifenbach Seltenrein 
Spitzkunnersdorfer Bach Spree Steindöbra Striegis Struga Svatava Svitávka Syrabach T Treba Trebnitz Treuener Wasser Trieb Triebisch V Verlorenes Wasser W Weinske Weißer Schöps Weißeritz Wesenitz White Elster White Mulde Wiederitz Wild Weißeritz Wilisch Wilzsch Wisenta Wittgendorfer Wasser Würschnitz Wyhra Z Zschampert Zschonerbach Zschopau Zwickauer Mulde Zwittebach Zwönitz Zwota Saxony-related lists Saxony
The 2010 Whyte & Mackay Premier League was a darts tournament organised by the Professional Darts Corporation; it was the sixth running of the tournament. The tournament began at The O2 Arena in London on 11 February, and finished at the Wembley Arena on 24 May. Phil Taylor beat defending champion James Wade 10–8 in the final, in which he also became the first player to hit two nine-dart finishes in a single match. Qualification The top six players from the PDC Order of Merit following the 2010 PDC World Darts Championship were confirmed on 5 January. Simon Whitlock and Adrian Lewis were named as the two Sky Sports wild card selections; Whitlock was announced on 4 January and Lewis on 13 January. WC = Wild Card Venues Fifteen venues were used for the 2010 Premier League, with the only change from 2009 being Bournemouth replacing Edinburgh after a one-year absence. Prize money The prize money increased again, with the total prize fund rising to £410,000 as a third-place play-off was introduced, earning its winner an extra £10,000 on top of the £40,000 for reaching the play-offs. Results League stage 11 February – week 1 The O2 Arena, London 18 February – week 2 Bournemouth International Centre, Bournemouth 25 February – week 3 Odyssey Arena, Belfast 4 March – Week 4 Westpoint Arena, Exeter 11 March – Week 5 MEN Arena, Manchester 18 March – Week 6 Brighton Centre, Brighton 25 March – Week 7 National Indoor Arena, Birmingham 1 April – Week 8 Cardiff International Arena, Cardiff 8 April – Week 9 SECC, Glasgow 15 April – week 10 Sheffield Arena, Sheffield 22 April – week 11 Echo Arena, Liverpool 29 April – week 12 AECC, Aberdeen Nine-dart finish The Premier League's second nine-dart finish occurred when Raymond van Barneveld hit one during the second leg of his match against Terry Jenkins, checking out with T20, T19 and D12. Van Barneveld had also hit the first Premier League nine-dart finish, in 2006 against Peter Manley.
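The checkout arithmetic behind a nine-dart finish is easy to verify: a leg starts on 501, a treble scores three times a segment's value and a double twice, and the leg must end on a double. A small illustrative sketch (the class and helper names are my own, not from the source):

```java
public class DartMath {
    // Treble and double segment values: T20 = 60, D12 = 24, etc.
    static int t(int segment) { return 3 * segment; }
    static int d(int segment) { return 2 * segment; }

    public static void main(String[] args) {
        // Van Barneveld's finishing visit described above: T20, T19, D12
        int checkout = t(20) + t(19) + d(12);
        System.out.println(checkout); // 141

        // One classic nine-dart route: 180, 180, then that 141 checkout
        System.out.println(180 + 180 + checkout); // 501
    }
}
```

The same constraint shapes every nine-dart route: the first six darts must leave a score that can be taken out in three darts finishing on a double.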
6 May – week 13 Metro Radio Arena, Newcastle upon Tyne 13 May – week 14 Trent FM Arena, Nottingham Play-offs – 24 May Wembley Arena, London The play-offs were originally scheduled for 23 May but, due to a power cut in the area surrounding the Wembley Arena, were postponed until 24 May. Nine-dart finishes The finals night saw the second and third nine-dart finishes of the 2010 Premier League Darts, and the third and fourth in Premier League Darts overall, in the final between Phil Taylor and James Wade. Trailing 1–0 after losing the throw in the first leg, Taylor responded with a 174 (T20, two T19s), 180 (three T20s), and 147 (T20, T17, D18) to take the second leg against throw. This was Taylor's first nine-dart finish in the Premier League, a feat previously achieved only by Raymond van Barneveld. It was also the first nine-dart finish in a televised final. In the 15th leg Taylor hit the second nine-dart finish of the night with two 180s, checking out on 141 (T20, T19, D12). This was the first time the same player had achieved two nine-dart finishes in one match. After the second nine-darter, Taylor extended his run to 17 consecutive perfect darts; needing only T17 and D18 for a third nine-darter, and a second in successive legs, he missed the T17 by an inch but still won the leg in 10 darts, hitting T18 and then the D8 with the first dart of his next visit. Table and streaks Table Top four qualified for Play-offs after Week 14. NB: LWAT = Legs Won Against Throw. Players separated by +/- leg difference if tied. Streaks NB: W = Won D = Drawn L = Lost Player statistics The following statistics are for the league stage only. Play-offs are not included. Phil Taylor Longest unbeaten run: 14 Most consecutive wins: 4 Most consecutive draws: 1 Most consecutive losses: 0 Longest without a win: 1 Biggest victory: 8-1 (v.
Adrian Lewis)
Biggest defeat: none (undefeated)

Simon Whitlock
Longest unbeaten run: 3
Most consecutive wins: 2
Most consecutive draws: 1
Most consecutive losses: 1
Longest without a win: 2
Biggest victory: 8–2 (v. Mervyn King)
Biggest defeat: 3–8 (v. Phil Taylor)

James Wade
Longest unbeaten run: 4
Most consecutive wins: 2
Most consecutive draws: 2
Most consecutive losses: 3
Longest without a win: 5
Biggest victory: 8–3 (v. Raymond van Barneveld)
Biggest defeat: 2–8 (v. Phil Taylor)

Mervyn King
Longest unbeaten run: 3
Most consecutive wins: 2
Most consecutive draws: 2
Most consecutive losses: 3
Longest without a win: 3
Biggest victory: 8–2 (v. Terry Jenkins)
Biggest defeat: 1–8 (v. Phil Taylor)

Ronnie Baxter
Longest unbeaten run: 3
Most consecutive wins: 2
Most consecutive draws: 2
Most consecutive losses: 1
Longest without a win: 6
Biggest victory: 8–4 (v. Raymond van Barneveld)
Biggest defeat: 2–8 (v. Raymond van Barneveld)

Raymond van Barneveld
Longest unbeaten run: 2
Most consecutive wins: 2
Most consecutive draws: 1
Most consecutive losses: 5
Longest without a win: 6
Biggest victory: 8–2 (v. Ronnie Baxter)
Biggest defeat: 2–8 (v. Phil Taylor, twice)

Adrian Lewis
Longest unbeaten run: 2
Most consecutive wins: 2
Most consecutive draws: 2
Most consecutive losses: 2
Longest without a win: 5
Biggest victory: 8–3 (v. Raymond van Barneveld)
Biggest defeat: 1–8 (v. Phil Taylor)

Terry Jenkins
Longest unbeaten run: 3
Most consecutive wins: 2
Most consecutive draws: 2
Most consecutive losses: 6
Longest without a win: 6
Biggest victory: 8–4 (v. Ronnie Baxter)
Biggest defeat: 2–8 (v. Mervyn King)

References

External links
Whyte & Mackay Premier League Darts

Premier League Darts
Rudi sportsman is a 1911 Austro-Hungarian comedy film. It is one of a series of four films written by and starring Emil Artur Longen as the title character, Rudi. It was filmed in Prague. External links 1911 comedy films 1911 films Austro-Hungarian films Czech comedy films Hungarian black-and-white films Austrian black-and-white films Austrian silent films Hungarian silent films Hungarian comedy films Austrian comedy films Films directed by Emil Artur Longen Czech silent films Czech black-and-white films
```python
#
# Licensed under the Apache License, Version 2.0 (the "License"). You
# may not use this file except in compliance with the License. A copy of
# the License is located at
#
#     path_to_url
#
# or in the "license" file accompanying this file. This file is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF
# ANY KIND, either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
import boto.vendored.regions.regions as _regions


class _CompatEndpointResolver(_regions.EndpointResolver):
    """Endpoint resolver which handles boto2 compatibility concerns.

    This is NOT intended for external use whatsoever.
    """

    _DEFAULT_SERVICE_RENAMES = {
        # The botocore resolver is based on endpoint prefix.
        # These don't always sync up to the name that boto2 uses.
        # A mapping can be provided that handles the mapping between
        # "service names" and endpoint prefixes.
        'awslambda': 'lambda',
        'cloudwatch': 'monitoring',
        'ses': 'email',
        'ec2containerservice': 'ecs',
        'configservice': 'config',
    }

    def __init__(self, endpoint_data, service_rename_map=None):
        """
        :type endpoint_data: dict
        :param endpoint_data: Regions and endpoints data in the same
            format as is used by botocore / boto3.

        :type service_rename_map: dict
        :param service_rename_map: A mapping of boto2 service name to
            endpoint prefix.
""" super(_CompatEndpointResolver, self).__init__(endpoint_data) if service_rename_map is None: service_rename_map = self._DEFAULT_SERVICE_RENAMES # Mapping of boto2 service name to endpoint prefix self._endpoint_prefix_map = service_rename_map # Mapping of endpoint prefix to boto2 service name self._service_name_map = dict( (v, k) for k, v in service_rename_map.items()) def get_available_endpoints(self, service_name, partition_name='aws', allow_non_regional=False): endpoint_prefix = self._endpoint_prefix(service_name) return super(_CompatEndpointResolver, self).get_available_endpoints( endpoint_prefix, partition_name, allow_non_regional) def get_all_available_regions(self, service_name): """Retrieve every region across partitions for a service.""" regions = set() endpoint_prefix = self._endpoint_prefix(service_name) # Get every region for every partition in the new endpoint format for partition_name in self.get_available_partitions(): if self._is_global_service(service_name, partition_name): # Global services are available in every region in the # partition in which they are considered global. partition = self._get_partition_data(partition_name) regions.update(partition['regions'].keys()) continue else: regions.update( self.get_available_endpoints( endpoint_prefix, partition_name) ) return list(regions) def construct_endpoint(self, service_name, region_name=None): endpoint_prefix = self._endpoint_prefix(service_name) return super(_CompatEndpointResolver, self).construct_endpoint( endpoint_prefix, region_name) def get_available_services(self): """Get a list of all the available services in the endpoints file(s)""" services = set() for partition in self._endpoint_data['partitions']: services.update(partition['services'].keys()) return [self._service_name(s) for s in services] def _is_global_service(self, service_name, partition_name='aws'): """Determines whether a service uses a global endpoint. 
In theory a service can be 'global' in one partition but regional in another. In practice, each service is all global or all regional. """ endpoint_prefix = self._endpoint_prefix(service_name) partition = self._get_partition_data(partition_name) service = partition['services'].get(endpoint_prefix, {}) return 'partitionEndpoint' in service def _get_partition_data(self, partition_name): """Get partition information for a particular partition. This should NOT be used to get service endpoint data because it only loads from the new endpoint format. It should only be used for partition metadata and partition specific service metadata. :type partition_name: str :param partition_name: The name of the partition to search for. :returns: Partition info from the new endpoints format. :rtype: dict or None """ for partition in self._endpoint_data['partitions']: if partition['partition'] == partition_name: return partition raise ValueError( "Could not find partition data for: %s" % partition_name) def _endpoint_prefix(self, service_name): """Given a boto2 service name, get the endpoint prefix.""" return self._endpoint_prefix_map.get(service_name, service_name) def _service_name(self, endpoint_prefix): """Given an endpoint prefix, get the boto2 service name.""" return self._service_name_map.get(endpoint_prefix, endpoint_prefix) class BotoEndpointResolver(object): """Resolves endpoint hostnames for AWS services. This is NOT intended for external use. """ def __init__(self, endpoint_data, service_rename_map=None): """ :type endpoint_data: dict :param endpoint_data: Regions and endpoints data in the same format as is used by botocore / boto3. :type service_rename_map: dict :param service_rename_map: A mapping of boto2 service name to endpoint prefix. """ self._resolver = _CompatEndpointResolver( endpoint_data, service_rename_map) def resolve_hostname(self, service_name, region_name): """Resolve the hostname for a service in a particular region. 
:type service_name: str :param service_name: The service to look up. :type region_name: str :param region_name: The region to find the endpoint for. :return: The hostname for the given service in the given region. """ endpoint = self._resolver.construct_endpoint(service_name, region_name) if endpoint is None: return None return endpoint.get('sslCommonName', endpoint['hostname']) def get_all_available_regions(self, service_name): """Get all the regions a service is available in. :type service_name: str :param service_name: The service to look up. :rtype: list of str :return: A list of all the regions the given service is available in. """ return self._resolver.get_all_available_regions(service_name) def get_available_services(self): """Get all the services supported by the endpoint data. :rtype: list of str :return: A list of all the services explicitly contained within the endpoint data provided during instantiation. """ return self._resolver.get_available_services() class StaticEndpointBuilder(object): """Builds a static mapping of endpoints in the legacy format.""" def __init__(self, resolver): """ :type resolver: BotoEndpointResolver :param resolver: An endpoint resolver. """ self._resolver = resolver def build_static_endpoints(self, service_names=None): """Build a set of static endpoints in the legacy boto2 format. :param service_names: The names of the services to build. They must use the names that boto2 uses, not boto3, e.g "ec2containerservice" and not "ecs". If no service names are provided, all available services will be built. :return: A dict consisting of:: {"service": {"region": "full.host.name"}} """ if service_names is None: service_names = self._resolver.get_available_services() static_endpoints = {} for name in service_names: endpoints_for_service = self._build_endpoints_for_service(name) if endpoints_for_service: # It's possible that when we try to build endpoints for # services we get an empty hash. 
In that case we don't # bother adding it to the final list of static endpoints. static_endpoints[name] = endpoints_for_service self._handle_special_cases(static_endpoints) return static_endpoints def _build_endpoints_for_service(self, service_name): # Given a service name, 'ec2', build a dict of # 'region' -> 'hostname' endpoints = {} regions = self._resolver.get_all_available_regions(service_name) for region_name in regions: endpoints[region_name] = self._resolver.resolve_hostname( service_name, region_name) return endpoints def _handle_special_cases(self, static_endpoints): # cloudsearchdomain endpoints use the exact same set of endpoints as # cloudsearch. if 'cloudsearch' in static_endpoints: cloudsearch_endpoints = static_endpoints['cloudsearch'] static_endpoints['cloudsearchdomain'] = cloudsearch_endpoints ```
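The service-rename mapping at the heart of `_CompatEndpointResolver` can be illustrated in isolation. The sketch below re-implements just that mapping with plain dictionaries; it is a simplified stand-in for illustration, not the boto API, and the helper names are invented:

```python
# Standalone sketch of the boto2-name <-> endpoint-prefix mapping used above.
RENAMES = {
    'awslambda': 'lambda',
    'cloudwatch': 'monitoring',
    'ses': 'email',
    'ec2containerservice': 'ecs',
    'configservice': 'config',
}
# Reverse mapping: endpoint prefix back to boto2 service name.
REVERSE = {v: k for k, v in RENAMES.items()}

def endpoint_prefix(boto2_name):
    """boto2 service name -> endpoint prefix (identity if not renamed)."""
    return RENAMES.get(boto2_name, boto2_name)

def boto2_name(prefix):
    """Endpoint prefix -> boto2 service name (identity if not renamed)."""
    return REVERSE.get(prefix, prefix)

assert endpoint_prefix('cloudwatch') == 'monitoring'
assert endpoint_prefix('ec2') == 'ec2'   # unmapped names pass through
assert boto2_name('email') == 'ses'
```

Both resolver methods in the class above reduce to this `dict.get(key, key)` pattern, so services without a rename entry resolve to themselves.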
Desolation Sound Marine Provincial Park is a provincial park in British Columbia, Canada, distinguished by its many picturesque sheltered coves and anchorages, frequented by yachts and pleasure craft. The scenery consists of waterfalls, rugged glaciated peaks, and steep forested slopes that fall into the ocean. Its many inlets, islets, coves, and bays attract many pleasure craft each summer, when it is not uncommon for a hundred boats to share a small anchorage. The sound is home to a wide variety of wildlife and is still relatively free from development, although some areas, such as Theodosia Inlet, show signs of clear-cut logging. The area has a long history of use by First Nations and supports tremendous ecological diversity. From the time Captain George Vancouver first visited the area to the present day, Indigenous people have been pressured off their land and have lost access to most of their hunting and gathering spots due to environmental protection, tourism and pollution. Health authorities operated by First Nations groups are currently investigating the effects of environmental pollution in the area. Land use in the marine area is now strictly controlled by the BC provincial government, and any operations in this marine area that do not conform to the law are shut down.

Location and size

The park is located approximately 32 km north of Powell River and 150 km north of Vancouver. This provincial marine park, which is about 84 km2 in size, is accessible only by boat. Desolation Sound Marine Provincial Park was created by the Government of British Columbia in 1973, under the advocacy of MLA Don Lockstead and the New Democratic Party government, out of an area comprising and over of shoreline. The park is located at the confluence of Malaspina Inlet and Homfray Channel.

History

Desolation Sound was so named by Captain George Vancouver.
When he first sailed there, he encountered what he described as mostly abandoned Native settlements, gloomy weather and barren land. It was, however, only temporarily empty due to a few factors. The first is that his visit likely came after a smallpox outbreak; second, it was the time of the year when most people would be inland hunting and gathering. Third, First Nations people were likely avoiding the Lekwiltok raiders. Another significant factor was that Vancouver’s view was influenced by “European cultural aesthetics,” as he didn't recognize most indigenous-altered landscapes as occupied by people. While Vancouver saw Desolation Sound as an unattractive and empty land, people later came to value it as full of untouched nature. The history of Desolation Sound Marine Park is intimately connected with the erasure of First Peoples’ presence from their ancestral lands. What non-Indigenous people see as conservation efforts comes in direct conflict with the lives of the Sliammon First Nations (Tla'amin Nation), who used to live, hunt, gather and practice their culture on these lands. Many also see Indigenous People's presence as ruining nature's pristine emptiness. The Sliammon First Nations, however, provide an opposing narrative of unequal power relations and a homeland turned into a landscape for non-Native visitors, taken over and destroyed by outsiders. Starting from 1875, they were, over time, pushed out of their communities into small reserves, and in 1920, legislation was enacted allowing the government to unilaterally reduce allocated reserve land. First Nations As First Nations were pressured off their land, Desolation Sound once again became a wild and uninhabited paradise in the eyes of yachters and other people looking for a retreat from industrial modernity. 
And while the Sliammon tried to continue hunting, gathering and participating in cultural activities, parts of the area were leased to non-Indigenous residents, and tourists and summer residents increased the clashes of interest between them. Summer houses were built in prime gathering and hunting locations; archeological sites, including graves, were robbed for souvenirs; and environmental protection infringed on their ability to use what was left of the unoccupied land. While the area was under environmental protection, only some aspects of the park environment were protected. It is only more recently that people have begun to value the 10,000-year history of the area's Native occupants, who built their lives around what Captain Vancouver called Desolation Sound. Increased tourism, however, harms both the First Nations and the environment. Untreated sewage and accidental fuel leaks from homes and boats have led to toxic pollution, dangerous for marine life and for those relying on seafood for sustenance. As a result, the Sliammon are afraid to eat the little traditional food left available to gather after the government leased their best gathering spots to oyster farmers. Overall, the Sliammon First Nations, like many others, have been harmed by the seemingly benevolent creation of parks, including Desolation Sound Marine Park. Their identity is deeply integrated with the environment. While many still maintain the spiritual connection to and use of the land, this is increasingly complicated by settler private property, tourism, pollution and ecologically protected areas.

Government relationship

Desolation Sound Provincial Park is an important part of the traditional territory of the Sliammon and Klahoose First Nations. Areas within the park contain important historical and spiritual sites which are culturally, economically, and socially important to these First Nations groups.
According to the BC government, the Sliammon First Nations reviewed treaty negotiation documents and provided input to the planning process, which is reflected in various sections of the park's management plan. In 2008, both First Nations groups (Sliammon and Klahoose) were involved in treaty negotiations with senior levels of government. First Nations are able to exercise aboriginal rights subject to conservation, public safety and public health values. The final treaty may provide additional directions or changes regarding aboriginal rights within the park areas.

BC First Nations Environmental Contaminants Program

First Nations projects that investigate the connection between environmental pollutants and human health are supported by the First Nations Health Authority's Environmental Contaminants Program (ECP). Its goals are to encourage capacity building and to assist First Nations communities in British Columbia in addressing their environmental health challenges. The program integrates Indigenous ways of knowing, traditional knowledge, and empirical science to support community-based research on environmental health challenges.

Ecology

Fauna

Desolation Sound Marine Park supports a diversity of terrestrial and marine species. Terrestrial wildlife identified within the park includes large mammal species such as black-tailed deer and black bears, while small fur-bearing mammals and various species of reptile also inhabit the park. The park is also home to an abundance of marine animals, including spawning and rearing areas for species of salmon and other fish such as herring, while also supporting mammals such as porpoises and pinnipeds such as seals. The park also supports many migratory birds, including several species of ducks and gulls; the presence of the marbled murrelet is particularly well documented because 10% of the Canadian population and 1% of the global population of this bird species use the marine park in the summer and fall.
The marbled murrelet is studied extensively because it has been listed as threatened under federal legislation since 1990, while agencies in British Columbia list it as a species of special concern. Other species at risk in the park include orcas, sea lions, eulachon, and herons.

Flora

Many coniferous tree species, as well as deciduous maples and alders, can be found within Desolation Sound Marine Park. The park primarily supports a second-growth regime owing to previous logging and wildfires, although sections of old-growth forest remain within the park. Thanks to an abundance of favourable habitats, the park also supports many species of seagrass, such as eelgrass, which play an important role in sequestering carbon dioxide. Salt marshes and kelp beds present in the park combine with the eelgrass to support a diversity of shellfish.

Policy and management

Management mainly focuses on six outcomes, and 30 policy intentions were created to achieve them.

A healthy and productive coast

A healthy local ecosystem is central to coastal marine management, as it is the basis of all the other functions the coast can provide. It provides a home to countless species and benefits humans in many different ways. To keep the coast healthy and productive, the BC government focuses on recovering wildlife and habitats as well as reducing marine pollution. Wildlife such as salmon and killer whales are important parts of the local ecosystem, each making a unique contribution to local habitats.

Resilience to climate change

Climate change has a huge impact on the ocean ecosystem. Many species suffer from it, and problems such as ocean acidification, sea level rise and coral bleaching have occurred in many places. The BC government believes it is important to keep local communities safe first, as they are the first line of response to extreme weather events.
Because climate change is also affecting seafood availability, healthy and resilient communities need to be built to decrease its impact, and the government should also collaborate with First Nations to adapt to climate change. Compared to engineered solutions, nature-based solutions are more effective and long-lasting in dealing with climate change; they are also cost-effective and less likely to disrupt local habitats.

Trusting, respectful relationships

Local communities are a significant part of managing coastal areas. First Nations have their own unique ways of understanding and managing coastal areas. The BC government believes it is crucial to respect local communities and collaborate deeply to produce a comprehensive management strategy.

Holistic learning and knowledge sharing

First Nations hold a great deal of precious knowledge and are of exceptional value in understanding and managing coastal areas. It is therefore very important to combine traditional knowledge with Western science to obtain a better understanding and assessment of coastal areas.

Community well-being

Local communities not only manage coastal areas but also rely on them and the ocean for their well-being. Climate change, pollution and disconnection from decision-making processes pose many challenges to these communities. The province will work to create jobs and support communities in responding to change by improving their capacity. Desolation Sound Marine Provincial Park is a typical example of managing recreational services. It is very popular because it offers kayaking and boating, but that popularity also affects the local ecosystem. To better manage coastline development and access issues, the province aims to balance the need for recreational services with the protection of local habitats.
The government also seeks to collaborate with Indigenous Peoples to manage the park in an effective and comprehensive way.

A sustainable, thriving ocean economy

A sustainable economy does not focus only on GDP; it benefits the local community in many respects over the long term and is good for future generations. The government would like to take various measures to build a sustainable economy, including investing in a diverse economy, promoting aquaculture and tourism, co-developing with Indigenous people, and managing cumulative effects.

Controversies

In the journal article "Desolate Viewscapes", author Jonathan Clapperton claims that BC Parks gauges the park's performance by adding amenities and attracting more visitors. Examples include placing campgrounds on environmentally sensitive land and letting hundreds of yachts anchor in a few tiny bays. Despite the area being a no-dumping zone, sewage released from yachts concentrated in small coves has poisoned several of the marine park's waters, severely contaminating the local shellfish. Because of the risk of poisoning, many Sliammon First Nations people are reluctant to eat traditional foods from the water near the park. For ten millennia the region served as a major source of food for Indigenous people, thanks to the abundance of mussels, mollusks, and oysters that flourish in the warmest waters north of Baja California.

See also

Desolation Sound
Prideaux Haven

References

External links

BC Parks - Desolation Sound Provincial Marine Park

Sunshine Coast (British Columbia)
Provincial Parks of the Discovery Islands
Provincial parks of British Columbia
1973 establishments in British Columbia
Protected areas established in 1973
Marine parks of Canada
```c++
//=======================================================================
// Author: Jeremy G. Siek
//
// Distributed under the Boost Software License, Version 1.0. (See
// accompanying file LICENSE_1_0.txt or copy at
// path_to_url
//=======================================================================

#ifndef BOOST_GRAPH_ITERATION_MACROS_HPP
#define BOOST_GRAPH_ITERATION_MACROS_HPP

#include <utility>

#define BGL_CAT(x,y) x ## y
#define BGL_RANGE(linenum) BGL_CAT(bgl_range_,linenum)
#define BGL_FIRST(linenum) (BGL_RANGE(linenum).first)
#define BGL_LAST(linenum) (BGL_RANGE(linenum).second)

/*
  BGL_FORALL_VERTICES_T(v, g, graph_t)  // This is on line 9
  expands to the following, but all on the same line

  for (typename boost::graph_traits<graph_t>::vertex_iterator
           bgl_first_9 = vertices(g).first, bgl_last_9 = vertices(g).second;
       bgl_first_9 != bgl_last_9; bgl_first_9 = bgl_last_9)
    for (typename boost::graph_traits<graph_t>::vertex_descriptor v;
         bgl_first_9 != bgl_last_9 ? (v = *bgl_first_9, true) : false;
         ++bgl_first_9)

  The purpose of having two for-loops is just to provide a place to
  declare both the iterator and value variables. There is really only
  one loop. The stopping condition gets executed two more times than it
  usually would be, oh well. The reason for the bgl_first_9 = bgl_last_9
  in the outer for-loop is in case the user puts a break statement
  in the inner for-loop.

  The other macros work in a similar fashion.

  Use the _T versions when the graph type is a template parameter or
  dependent on a template parameter. Otherwise use the non _T versions.

  -----------------------
  6/9/09 THK

  The above contains two calls to the vertices function. I modified these
  macros to expand to

  for (std::pair<typename boost::graph_traits<graph_t>::vertex_iterator,
                 typename boost::graph_traits<graph_t>::vertex_iterator>
           bgl_range_9 = vertices(g);
       bgl_range_9.first != bgl_range_9.second;
       bgl_range_9.first = bgl_range_9.second)
    for (typename boost::graph_traits<graph_t>::vertex_descriptor v;
         bgl_range_9.first != bgl_range_9.second ?
(v = *bgl_range_9.first, true) : false; ++bgl_range_9.first) */ #define BGL_FORALL_VERTICES_T(VNAME, GNAME, GraphType) \ for (std::pair<typename boost::graph_traits<GraphType>::vertex_iterator, \ typename boost::graph_traits<GraphType>::vertex_iterator> BGL_RANGE(__LINE__) = vertices(GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (typename boost::graph_traits<GraphType>::vertex_descriptor VNAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (VNAME = *BGL_FIRST(__LINE__), true):false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_VERTICES(VNAME, GNAME, GraphType) \ for (std::pair<boost::graph_traits<GraphType>::vertex_iterator, \ boost::graph_traits<GraphType>::vertex_iterator> BGL_RANGE(__LINE__) = vertices(GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (boost::graph_traits<GraphType>::vertex_descriptor VNAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (VNAME = *BGL_FIRST(__LINE__), true):false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_EDGES_T(ENAME, GNAME, GraphType) \ for (std::pair<typename boost::graph_traits<GraphType>::edge_iterator, \ typename boost::graph_traits<GraphType>::edge_iterator> BGL_RANGE(__LINE__) = edges(GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (typename boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (ENAME = *BGL_FIRST(__LINE__), true):false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_EDGES(ENAME, GNAME, GraphType) \ for (std::pair<boost::graph_traits<GraphType>::edge_iterator, \ boost::graph_traits<GraphType>::edge_iterator> BGL_RANGE(__LINE__) = edges(GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? 
(ENAME = *BGL_FIRST(__LINE__), true):false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_ADJ_T(UNAME, VNAME, GNAME, GraphType) \ for (std::pair<typename boost::graph_traits<GraphType>::adjacency_iterator, \ typename boost::graph_traits<GraphType>::adjacency_iterator> BGL_RANGE(__LINE__) = adjacent_vertices(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (typename boost::graph_traits<GraphType>::vertex_descriptor VNAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (VNAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_ADJ(UNAME, VNAME, GNAME, GraphType) \ for (std::pair<boost::graph_traits<GraphType>::adjacency_iterator, \ boost::graph_traits<GraphType>::adjacency_iterator> BGL_RANGE(__LINE__) = adjacent_vertices(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (boost::graph_traits<GraphType>::vertex_descriptor VNAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (VNAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_OUTEDGES_T(UNAME, ENAME, GNAME, GraphType) \ for (std::pair<typename boost::graph_traits<GraphType>::out_edge_iterator, \ typename boost::graph_traits<GraphType>::out_edge_iterator> BGL_RANGE(__LINE__) = out_edges(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (typename boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? 
(ENAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_OUTEDGES(UNAME, ENAME, GNAME, GraphType) \ for (std::pair<boost::graph_traits<GraphType>::out_edge_iterator, \ boost::graph_traits<GraphType>::out_edge_iterator> BGL_RANGE(__LINE__) = out_edges(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (ENAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_INEDGES_T(UNAME, ENAME, GNAME, GraphType) \ for (std::pair<typename boost::graph_traits<GraphType>::in_edge_iterator, \ typename boost::graph_traits<GraphType>::in_edge_iterator> BGL_RANGE(__LINE__) = in_edges(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (typename boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (ENAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #define BGL_FORALL_INEDGES(UNAME, ENAME, GNAME, GraphType) \ for (std::pair<boost::graph_traits<GraphType>::in_edge_iterator, \ boost::graph_traits<GraphType>::in_edge_iterator> BGL_RANGE(__LINE__) = in_edges(UNAME, GNAME); \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__); BGL_FIRST(__LINE__) = BGL_LAST(__LINE__)) \ for (boost::graph_traits<GraphType>::edge_descriptor ENAME; \ BGL_FIRST(__LINE__) != BGL_LAST(__LINE__) ? (ENAME = *BGL_FIRST(__LINE__), true) : false; \ ++BGL_FIRST(__LINE__)) #endif // BOOST_GRAPH_ITERATION_MACROS_HPP ```
Zygmunt Łoziński (5 June 1870 – 26 March 1932) was a Polish Roman Catholic bishop who served as the head of the Roman Catholic Archdiocese of Minsk-Mohilev, which was later aggregated to the Diocese of Pinsk. Soviet authorities arrested him on two occasions during his episcopate. The title of Venerable was conferred upon him on 2 April 1993 after Pope John Paul II acknowledged his heroic virtue.

Life

Zygmunt Łoziński was born on 5 June 1870 in the village of Baratin in the Novogrudsky Uyezd of the Minsk Governorate of the Russian Empire (present-day Karelichy District, Belarus). He studied in Warsaw and in Saint Petersburg, where he graduated from the Saint Petersburg Roman Catholic Theological Academy before beginning his studies for the priesthood; he was ordained to the priesthood on 23 June 1895. Russian authorities sentenced him on 17 November 1898 to three years of seclusion in a convent in Latvia. Łoziński became the vicar of Smolensk in 1901 and was reassigned to Tula in 1902 and Riga in 1904. He became the rector of the Minsk Cathedral in 1905. In 1906 he returned to Saint Petersburg, where he taught Hebrew and biblical studies. He accompanied the Bishop of Mogilev on visits to the parishes of the Russian Empire from 1909 until 1911. In 1912 he undertook further education in the German Empire and in Rome. Pope Benedict XV appointed him as the Bishop of Minsk on 2 November 1917, and he received his episcopal consecration on 28 July 1918 in Warsaw from Cardinal Aleksander Kakowski; the co-consecrators were Stanisław Kazimierz and Blessed Antoni Julian Nowowiejski. Soviet authorities arrested him on 1 August 1920 on charges of "counter-revolution", but pressure from local Christians saw him released on the following 11 August. He was arrested again on 4 September 1920, and the Polish government secured his release eleven months later, in 1921, from Butyrka Prison; he weighed 95 pounds upon his release.
Pope Pius XI appointed him on 28 October 1925 as the Bishop of Pińsk after his previous diocese was aggregated to the latter. He filed a total of 755 lawsuits seeking the recovery of churches from the Orthodox Church in Poland. In 1929 he invited the Blessed Martyrs of Nowogródek, of the Sisters of the Holy Family of Nazareth, to Navahrudak. Łoziński was awarded both the Order of the White Eagle (Poland) and the Cross of Valour (Poland). He died on 26 March 1932, Holy Saturday, and was buried in the cathedral of his diocese. Beatification process The beatification process proceeded on two fronts, in Pinsk and at the Vatican. The informative process opened in 1957 and concluded its collation of testimonies and documentation in 1962, even though the title of Servant of God, the first official stage in the process, was not conferred on him until 4 December 1980, after which an apostolic process was held. The Congregation for the Causes of Saints then validated the previous processes. The C.C.S. received the Positio in 1990; theologians approved the cause on 10 December 1992 and the C.C.S. approved it on 9 March 1993. He was declared Venerable on 2 April 1993 after Pope John Paul II affirmed that the late bishop had lived a life of heroic virtue. References External links Hagiography Circle Catholic Hierarchy 1870 births 1932 deaths People from Karelichy District People from Novogrudsky Uyezd 19th-century venerated Christians 19th-century Belarusian people 20th-century venerated Christians 20th-century Belarusian people Polish bishops Belarusian Roman Catholic bishops Prisoners and detainees of the Soviet Union Venerated Catholics by Pope John Paul II Recipients of the Cross of Valour (Poland)
```java package com.fishercoder.firstthousand; import com.fishercoder.common.classes.TreeNode; import com.fishercoder.common.utils.TreeUtils; import com.fishercoder.solutions.firstthousand._297; import org.junit.jupiter.api.BeforeEach; import org.junit.jupiter.api.Test; import java.util.Arrays; import static org.junit.jupiter.api.Assertions.assertEquals; public class _297Test { private _297.Solution1 solution1; @BeforeEach public void setup() { solution1 = new _297.Solution1(); } @Test public void test1() { TreeNode root = TreeUtils.constructBinaryTree(Arrays.asList(1, 2, 3, null, null, 4, 5, 6, 7)); TreeUtils.printBinaryTree(root); String str = solution1.serialize(root); System.out.println(str); TreeUtils.printBinaryTree(solution1.deserialize(str)); assertEquals(root, solution1.deserialize(str)); } } ```
Denise Christensen is a diver from Tucson, Arizona, United States. She won a gold medal in springboard diving at the 1979 Pan American Games in San Juan. In 1980 she became AIAW diving champion in springboard, and in 1982 she won the AIAW team championship with the Texas Longhorns. References Year of birth missing (living people) Living people American female divers Texas Longhorns women's swimmers Sportspeople from Tucson, Arizona Pan American Games gold medalists for the United States Pan American Games medalists in diving Divers at the 1979 Pan American Games Medalists at the 1979 Pan American Games
Vladimir Vladimirovich Tatarchuk (; born 20 September 1987) is a former Russian professional footballer. He made his professional debut in the Russian Second Division in 2008 for FC Krasnodar. He is a son of Vladimir Tatarchuk. Honours Russian Cup winner: 2006 (played 2 games in the tournament). References 1987 births Footballers from Moscow Living people Russian men's footballers Russian people of Ukrainian descent PFC CSKA Moscow players FC Krasnodar players PFC Spartak Nalchik players FC Torpedo Moscow players Russian Premier League players FC Fakel Voronezh players Men's association football midfielders
```batchfile
:: Licensed to the Apache Software Foundation (ASF) under one
:: or more contributor license agreements.  See the NOTICE file
:: distributed with this work for additional information
:: regarding copyright ownership.  The ASF licenses this file
:: to you under the Apache License, Version 2.0 (the
:: "License"); you may not use this file except in compliance
:: with the License.  You may obtain a copy of the License at
::
::   http://www.apache.org/licenses/LICENSE-2.0
::
:: Unless required by applicable law or agreed to in writing,
:: software distributed under the License is distributed on an
:: "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
:: KIND, either express or implied.  See the License for the
:: specific language governing permissions and limitations
:: under the License.

echo on

conda build --output-folder=conda/pkg conda/recipe || exit /b
```
The Siege of Gujranwala was fought between the Sikh forces led by Charat Singh and the Afghan forces led by Jahan Khan. Background In 1763, Jahan Khan attempted to regain some of his lost territory. After Charat Singh occupied Gujranwala in 1761, the route to Kabul was blocked. Jahan Khan, the Nawab of Lahore, decided to besiege Gujranwala. 30,000 troops surrounded the city on all sides. One thousand men were trapped inside it. This all happened in November 1763. Siege The Afghan force of 30,000 encircled the city on all sides, and its strength kept increasing. In the city, 200 soldiers were under Charat Singh's command. At night, the 200 soldiers attacked the enemy and wreaked havoc among the enemy lines. Jahan Khan managed to escape on horseback. As he was escaping to Lahore, the Sikhs rushed after him in hot pursuit. Plenty of treasure fell into Charat Singh's hands, and more Sikhs joined him. Aftermath Following this victory, the Sikhs sacked Malerkotla and Morinda. Gujranwala was made the capital of the Sukerchakia Misl after this battle. References See also Nihang Martyrdom and Sikhism Conflicts in 1763 18th-century sieges 1760s in the Durrani Empire Battles involving the Sikhs Gujranwala
Monika Lundi is a German television and film actress. Selected filmography (1968, TV film) (1968) (1968) (1968, TV miniseries) (1969) Der Kommissar: Dr. Meinhardts trauriges Ende (1970, TV series episode) (1971, TV series) The Captain (1971) The Heath Is Green (1972) (1973) Crazy – Completely Mad (1973) No Gold for a Dead Diver (1974) Tatort: Kindergeld (1982, TV series episode) Ein Fall für zwei: Tödliches Viereck (1983, TV series episode) Alle meine Töchter (1995–2001, TV series) References External links 1943 births German film actresses Actresses from Berlin Living people German television actresses 20th-century German actresses
Inus Michael Kritzinger (born 13 May 1989) is a South African rugby union player, currently playing with the . His regular position is scrum-half. Career Youth He represented the at the 2005 Under-16 Grant Khomo week and Griffons Country Districts at the Under–18 Academy week competitions in both 2006 and 2007. In 2008, he played in the Under-19 Provincial Championship competition with the and the following year he was included in the Under-21 squad, but also had a short spell with the Under-21s. He returned to the Free State Under-21s for 2010. Varsity Shield He also played for university side the in the Varsity Shield competitions in 2011, 2012 and 2013. In 2011 and 2012, he was voted the "Back That Rocks" and he was joint top try scorer in 2012 with 6 tries. In 2013, he was voted the "Player That Rocks" and he was the sole top try scorer with 8 tries. Griffons In 2011, he joined the for their 2011 Currie Cup First Division season, making his debut against the and his first start against the . He made four appearances for the in the 2012 Vodacom Cup – including two starts at fly-half – but returned to the for the 2012 Currie Cup First Division. He also played for the Griffons in the 2013 Vodacom Cup and 2013 Currie Cup First Division competitions and then signed a senior contract for 2014. Representative Rugby In 2013, he was initially included in a South Africa President's XV squad to play in the 2013 IRB Tbilisi Cup, but was omitted from the final squad. References 1989 births Living people Free State Cheetahs players Griffons (rugby union) players Rugby union players from Welkom South African rugby union players Rugby union scrum-halves
Moses ben Samuel Zuriel (, Moshe ben Shmu’el Tzuri’el) was a seventeenth-century Jewish mathematician. He was the author of Meḥaddesh Ḥodashim, a calendar for AM 5414–5434 (1653–1674 CE). References 17th-century Jews 17th-century mathematicians
```javascript
/* global $:true */
/* jshint unused:false*/

+ function($) {
  "use strict";

  var defaults;

  var formatNumber = function (n) {
    return n < 10 ? "0" + n : n;
  };

  var Datetime = function(input, params) {
    this.input = $(input);
    this.params = params || {};
    this.initMonthes = params.monthes;
    this.initYears = params.years;
    var p = $.extend({}, params, this.getConfig());
    $(this.input).picker(p);
  };

  Datetime.prototype = {
    getDays: function(max) {
      var days = [];
      for (var i = 1; i <= (max || 31); i++) {
        days.push(i < 10 ? "0" + i : i);
      }
      return days;
    },
    getDaysByMonthAndYear: function(month, year) {
      // 1 ms before the first day of the next month is the last day of `month`
      var int_d = new Date(year, parseInt(month) + 1 - 1, 1);
      var d = new Date(int_d - 1);
      return this.getDays(d.getDate());
    },
    getConfig: function() {
      var today = new Date(),
          params = this.params,
          self = this,
          lastValidValues;

      var config = {
        rotateEffect: false,
        cssClass: 'datetime-picker',
        value: [today.getFullYear(), formatNumber(today.getMonth() + 1), formatNumber(today.getDate()), formatNumber(today.getHours()), formatNumber(today.getMinutes())],
        onChange: function (picker, values, displayValues) {
          var cols = picker.cols;
          var days = self.getDaysByMonthAndYear(values[1], values[0]);
          var currentValue = values[2];
          if (currentValue > days.length) currentValue = days.length;
          picker.cols[4].setValue(currentValue);

          // check min and max
          var current = new Date(values[0] + '-' + values[1] + '-' + values[2]);
          var valid = true;
          if (params.min) {
            var min = new Date(typeof params.min === "function" ? params.min() : params.min);
            if (current < +min) {
              picker.setValue(lastValidValues);
              valid = false;
            }
          }
          if (params.max) {
            var max = new Date(typeof params.max === "function" ? params.max() : params.max);
            if (current > +max) {
              picker.setValue(lastValidValues);
              valid = false;
            }
          }
          valid && (lastValidValues = values);

          if (self.params.onChange) {
            self.params.onChange.apply(this, arguments);
          }
        },
        formatValue: function (p, values, displayValues) {
          return self.params.format(p, values, displayValues);
        },
        cols: [
          { values: this.initYears },
          { divider: true, content: params.yearSplit },
          { values: this.initMonthes },
          { divider: true, content: params.monthSplit },
          { values: (function () {
              var dates = [];
              for (var i = 1; i <= 31; i++) dates.push(formatNumber(i));
              return dates;
            })()
          }
        ]
      };

      if (params.dateSplit) {
        config.cols.push({ divider: true, content: params.dateSplit });
      }
      config.cols.push({ divider: true, content: params.datetimeSplit });

      var times = self.params.times();
      if (times && times.length) {
        config.cols = config.cols.concat(times);
      }

      var inputValue = this.input.val();
      if (inputValue) config.value = params.parse(inputValue);

      if (this.params.value) {
        this.input.val(this.params.value);
        config.value = params.parse(this.params.value);
      }
      return config;
    }
  };

  $.fn.datetimePicker = function(params) {
    params = $.extend({}, defaults, params);
    return this.each(function() {
      if (!this) return;
      var $this = $(this);
      var datetime = $this.data("datetime");
      if (!datetime) $this.data("datetime", new Datetime(this, params));
      return datetime;
    });
  };

  defaults = $.fn.datetimePicker.prototype.defaults = {
    input: undefined,
    min: undefined,  // YYYY-MM-DD
    max: undefined,  // YYYY-MM-DD
    yearSplit: '-',
    monthSplit: '-',
    dateSplit: '',
    datetimeSplit: ' ',
    monthes: ('01 02 03 04 05 06 07 08 09 10 11 12').split(' '),
    years: (function () {
      var arr = [];
      for (var i = 1950; i <= 2030; i++) { arr.push(i); }
      return arr;
    })(),
    times: function () {
      return [
        { values: (function () {
            var hours = [];
            for (var i = 0; i < 24; i++) hours.push(formatNumber(i));
            return hours;
          })()
        },
        { divider: true, content: ':' },
        { values: (function () {
            var minutes = [];
            for (var i = 0; i < 60; i++) minutes.push(formatNumber(i));
            return minutes;
          })()
        }
      ];
    },
    format: function (p, values) {
      // join every column's value (or divider content) into the display string
      return p.cols.map(function (col) {
        return col.value || col.content;
      }).join('');
    },
    parse: function (str) {
      // split a formatted string back into picker column values; the original
      // source also split on non-ASCII delimiter characters that were lost here
      var t = str.split(this.datetimeSplit);
      return t[0].split(/\D/).concat(t[1].split(/:/)).filter(function (d) {
        return !!d;
      });
    }
  };
}($);
```
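The `getDaysByMonthAndYear` helper above leans on a JavaScript `Date` quirk: months are 0-indexed, so passing a 1-indexed month as the month argument addresses the *following* month, and stepping back 1 ms from its first day lands on the last day of the month of interest. A standalone sketch of the trick (`daysInMonth` is an illustrative name, not part of the plugin):

```javascript
// Days in a month via the "1 ms before the first of the next month" trick.
// `month` is 1-12 here, which the 0-indexed Date constructor reads as the
// NEXT month; subtracting 1 ms rolls back to the target month's last day.
function daysInMonth(year, month /* 1-12 */) {
  var firstOfNext = new Date(year, month, 1);
  return new Date(firstOfNext - 1).getDate();
}

console.log(daysInMonth(2024, 2));  // 29 (leap year)
console.log(daysInMonth(2023, 2));  // 28
console.log(daysInMonth(2023, 12)); // 31
```

An equivalent shortcut is `new Date(year, month, 0).getDate()`, since day 0 likewise rolls back to the previous month's last day.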
```asciidoc //// This file is generated by DocsTest, so don't change it! //// = apoc.couchbase :description: This section contains reference documentation for the apoc.couchbase procedures. [.procedures, opts=header, cols='5a,1a,1a'] |=== | Qualified Name | Type | Release |xref::overview/apoc.couchbase/apoc.couchbase.append.adoc[apoc.couchbase.append icon:book[]] apoc.couchbase.append(hostOrKey, bucket, documentId, content) yield id, expiry, cas, mutationToken, content - append a couchbase json document to an existing one. |label:procedure[] |label:apoc-full[] |xref::overview/apoc.couchbase/apoc.couchbase.exists.adoc[apoc.couchbase.exists icon:book[]] apoc.couchbase.exists(hostOrKey, bucket, documentId) yield value - check whether a couchbase json document with the given ID does exist. |label:procedure[] |label:apoc-full[] |xref::overview/apoc.couchbase/apoc.couchbase.get.adoc[apoc.couchbase.get icon:book[]] apoc.couchbase.get(hostOrKey, bucket, documentId) yield id, expiry, cas, mutationToken, content - retrieves a couchbase json document by its unique ID. |label:procedure[] |label:apoc-full[] |xref::overview/apoc.couchbase/apoc.couchbase.insert.adoc[apoc.couchbase.insert icon:book[]] apoc.couchbase.insert(hostOrKey, bucket, documentId, jsonDocument) yield id, expiry, cas, mutationToken, content - insert a couchbase json document with its unique ID. |label:procedure[] |label:apoc-full[] |xref::overview/apoc.couchbase/apoc.couchbase.namedParamsQuery.adoc[apoc.couchbase.namedParamsQuery icon:book[]] apoc.couchbase.namedParamsQuery(hostkOrKey, bucket, statement, paramNames, paramValues) yield queryResult - executes a N1QL statement with named parameters. |label:procedure[] |label:apoc-full[] |xref::overview/apoc.couchbase/apoc.couchbase.posParamsQuery.adoc[apoc.couchbase.posParamsQuery icon:book[]] apoc.couchbase.posParamsQuery(hostOrKey, bucket, statement, params) yield queryResult - executes a N1QL statement with positional parameters. 
|label:procedure[] |label:apoc-full[] |xref::overview/apoc.couchbase/apoc.couchbase.prepend.adoc[apoc.couchbase.prepend icon:book[]] apoc.couchbase.prepend(hostOrKey, bucket, documentId, content) yield id, expiry, cas, mutationToken, content - prepend a couchbase json document to an existing one. |label:procedure[] |label:apoc-full[] |xref::overview/apoc.couchbase/apoc.couchbase.query.adoc[apoc.couchbase.query icon:book[]] apoc.couchbase.query(hostOrKey, bucket, statement) yield queryResult - executes a plain un-parameterized N1QL statement. |label:procedure[] |label:apoc-full[] |xref::overview/apoc.couchbase/apoc.couchbase.remove.adoc[apoc.couchbase.remove icon:book[]] apoc.couchbase.remove(hostOrKey, bucket, documentId) yield id, expiry, cas, mutationToken, content - remove the couchbase json document identified by its unique ID. |label:procedure[] |label:apoc-full[] |xref::overview/apoc.couchbase/apoc.couchbase.replace.adoc[apoc.couchbase.replace icon:book[]] apoc.couchbase.replace(hostOrKey, bucket, documentId, jsonDocument) yield id, expiry, cas, mutationToken, content - replace the content of the couchbase json document identified by its unique ID. |label:procedure[] |label:apoc-full[] |xref::overview/apoc.couchbase/apoc.couchbase.upsert.adoc[apoc.couchbase.upsert icon:book[]] apoc.couchbase.upsert(hostOrKey, bucket, documentId, jsonDocument) yield id, expiry, cas, mutationToken, content - insert or overwrite a couchbase json document with its unique ID. |label:procedure[] |label:apoc-full[] |=== ```
Agnes of Baden (25 March 1408 – January 1473) was a German noblewoman, a member of the House of Zähringen and by marriage Countess of Holstein-Rendsburg. She was a daughter of Bernard I, Margrave of Baden, by his second wife Anna of Oettingen. Life In Ettlingen on 23 February 1432 she was betrothed to Gerhard VII, Count of Holstein-Rendsburg. Her older brother Jacob, the new Margrave of Baden, was very anxious for this marriage because he wanted to obtain political advantages in Schleswig. The marriage was celebrated in Baden on 2 June of that year, but Gerhard VII quickly returned to his domains in order to secure his frontiers, without an official wedding night. Officially, the marriage was consummated only on 5 October. On 15 January 1433 Agnes, pregnant at that time, fell down the stairs at Gottorf Castle. The next day she gave birth to healthy twins, Henry and Catherine. This caused surprise and a scandal erupted, because the interval between the official consummation of the marriage and the birth was too short for the children to have been conceived in wedlock. To stop the growing rumours about his paternity, in February Gerhard VII declared in the courtyard of Gottorf Castle, in front of his knights, that he had secretly slept with Agnes before the wedding and that she had been a virgin then; therefore, the children were his, and Henry would be capable of inheriting his possessions. In Schleswig Cathedral and in a State Assembly before the clergy and nobility Gerhard VII reaffirmed his word, which was further confirmed by court ladies, doctors and midwives. The matter was ultimately settled by the Bishops of Lübeck and Schleswig. The Count's brother Adolphus VIII supported his declaration. The scandal became known as the "Twin Disaster of Gottorf" (German: Zwillingssturz von Gottorf). Gerhard VII suffered from a lung disease. Shortly after the declaration in Schleswig Cathedral, his condition worsened, and doctors were unable to help him. 
Against medical advice, Agnes and her husband travelled to a spa in her native Baden; however, in the middle of the trip Gerhard VII grew worse, and in Cologne they decided to return. Gerhard VII died on 24 July 1433 during the return journey, in Emmerich am Rhein, where he was buried. On her return, in Hamburg, Agnes received unexpected news: her brother-in-law Adolphus VIII refused her entry to Holstein, kidnapped her children and denied her rights as a widow, citing her unpaid dowry and the recent scandal over the paternity of the twins. Left without options, Agnes was forced to return to Baden. She never saw her children again: Henry drowned in the Schlei River shortly afterwards and Catherine entered Preetz Priory as a nun. Both children were probably murdered. Her brother Margrave Jacob of Baden at first strongly supported her rights; however, on 2 June 1436 she was secretly betrothed and married to Hans von Höwen, a former admirer. Margrave Jacob, who at that time was negotiating a new marriage for his sister with one of the sons of the Piast Duke Konrad V of Oleśnica, was furious. By order of her brother, in 1437 Agnes was imprisoned for life in Eberstein Castle in Ebersteinburg, where she died, blind, in the first weeks of 1473. 15th-century German women 15th-century German people Daughters of monarchs
"Say Something Anyway" was Irish girl group Bellefire's first single release of their comeback as a three-piece, after being signed by WEA. It is also included on their album Spin the Wheel. The single received considerable airplay in Ireland (peaking at #5) as well as in the United Kingdom (#26 chart peak). In several episodes of hit British soap operas such as Coronation Street and Emmerdale, "Say Something Anyway" can be heard in the background during scenes in public buildings such as pubs and diners. Track listing "Say Something Anyway" "Say Something Anyway (Bimbo Jones re-mix)" Video The video was shot in Cuba and was directed by Andy Hylton. It features the girls in numerous shots around the capital, from driving the signature vehicle of the Spin The Wheel album to singing by a Cuban manor house. It also features three Cuban men, whom the girls meet on their way around the city. The men's truck has broken down, and they flag the girls down for some oil and invaluable motor expertise. The men then take the girls on a tour of the city, stopping at the end of the song to enjoy the warmth of a camp fire. 2004 singles Bellefire songs Songs written by Jörgen Elofsson 2004 songs East West Records singles
```python async def app(scope, receive, send): assert scope["type"] == "http" await send( { "type": "http.response.start", "status": 200, "headers": [[b"content-type", b"text/html"]], } ) await send( { "type": "http.response.body", "body": b"asgi-function:RANDOMNESS_PLACEHOLDER" } ) ```
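An ASGI callable like the one above can be exercised without a server by handing it stub `receive` and `send` callables and inspecting the messages it emits. The driver below is a sketch, not part of any framework; it inlines a copy of the app (with a fixed `demo` suffix, since the placeholder's real value is unknown) so that it runs standalone.

```python
import asyncio

# A copy of the ASGI callable above, with a fixed body so the sketch is
# self-contained (the original uses a generated placeholder suffix).
async def app(scope, receive, send):
    assert scope["type"] == "http"
    await send(
        {
            "type": "http.response.start",
            "status": 200,
            "headers": [[b"content-type", b"text/html"]],
        }
    )
    await send({"type": "http.response.body", "body": b"asgi-function:demo"})

# Drive the app with stub receive/send callables and collect what it sends.
async def run_app(asgi_app):
    sent = []

    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}

    async def send(message):
        sent.append(message)

    await asgi_app({"type": "http", "method": "GET", "path": "/"}, receive, send)
    return sent

messages = asyncio.run(run_app(app))
assert messages[0]["type"] == "http.response.start"
assert messages[0]["status"] == 200
assert messages[1]["body"] == b"asgi-function:demo"
```

The same pattern is what ASGI test clients (e.g. the ones shipped with web frameworks) implement under the hood: the application is just an async callable taking `(scope, receive, send)`.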