```python
"""Fixer for has_key().

Calls to .has_key() methods are expressed in terms of the 'in'
operator:

    d.has_key(k) -> k in d

CAVEATS:
1) While the primary target of this fixer is dict.has_key(), the
   fixer will change any has_key() method call, regardless of its
   class.

2) Cases like this will not be converted:

        m = d.has_key
        if m(k):
            ...

   Only *calls* to has_key() are converted.

   While it is possible to convert the above to something like

        m = d.__contains__
        if m(k):
            ...

   this is currently not done.
"""

# Local imports
from .. import pytree
from ..pgen2 import token
from .. import fixer_base
from ..fixer_util import Name, parenthesize


class FixHasKey(fixer_base.BaseFix):
    BM_compatible = True

    PATTERN = """
    anchor=power<
        before=any+
        trailer< '.' 'has_key' >
        trailer<
            '('
            ( not(arglist | argument<any '=' any>) arg=any
            | arglist<(not argument<any '=' any>) arg=any ','>
            )
            ')'
        >
        after=any*
    >
    |
    negation=not_test<
        'not'
        anchor=power<
            before=any+
            trailer< '.' 'has_key' >
            trailer<
                '('
                ( not(arglist | argument<any '=' any>) arg=any
                | arglist<(not argument<any '=' any>) arg=any ','>
                )
                ')'
            >
        >
    >
    """

    def transform(self, node, results):
        assert results
        syms = self.syms
        if (node.parent.type == syms.not_test and
            self.pattern.match(node.parent)):
            # Don't transform a node matching the first alternative of the
            # pattern when its parent matches the second alternative
            return None
        negation = results.get("negation")
        anchor = results["anchor"]
        prefix = node.prefix
        before = [n.clone() for n in results["before"]]
        arg = results["arg"].clone()
        after = results.get("after")
        if after:
            after = [n.clone() for n in after]
        if arg.type in (syms.comparison, syms.not_test, syms.and_test,
                        syms.or_test, syms.test, syms.lambdef,
                        syms.argument):
            arg = parenthesize(arg)
        if len(before) == 1:
            before = before[0]
        else:
            before = pytree.Node(syms.power, before)
        before.prefix = u" "
        n_op = Name(u"in", prefix=u" ")
        if negation:
            n_not = Name(u"not", prefix=u" ")
            n_op = pytree.Node(syms.comp_op, (n_not, n_op))
        new = pytree.Node(syms.comparison, (arg, n_op, before))
        if after:
            new = parenthesize(new)
            new = pytree.Node(syms.power, (new,) + tuple(after))
        if node.parent.type in (syms.comparison, syms.expr, syms.xor_expr,
                                syms.and_expr, syms.shift_expr,
                                syms.arith_expr, syms.term,
                                syms.factor, syms.power):
            new = parenthesize(new)
        new.prefix = prefix
        return new
```
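The runtime equivalence the fixer relies on can be sanity-checked directly in plain Python, without lib2to3 (the dict and keys below are arbitrary illustrative values):

```python
d = {"a": 1}

# Python 2 idiom the fixer rewrites:   d.has_key("a")
# Fixer output:                        "a" in d
assert ("a" in d) is True
assert ("b" in d) is False

# The second PATTERN alternative handles the negated form:
#   not d.has_key(k)   ->   k not in d
assert ("b" not in d) is True
```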
```csharp
namespace Volo.Abp.AspNetCore.Components.WebAssembly.MultiTenant;

public class WebAssemblyMultiTenantUrlOptions
{
    public string DomainFormat { get; set; } = default!;
}
```
Estefanía Bacca is an Argentine vedette, acrobatic dancer, actress, model and choreographer. She received her licence as a dance instructor (specialising in musical comedy) from the Instituto Universitario Nacional de Arte in Buenos Aires. She has modeled for Paparazzi Online and Maxim in Argentina.

Career

Estefanía began her theatre career between 2007 and 2010 playing small roles and dancing burlesque in revue shows, all with the same company, "Faroni Producciones", run by actress-comedian and director Carmen Barbieri and the company's founder, producer Javier Faroni. In late 2010 Bacca was cast as the second vedette, behind Adabel Guerrero and Jésica Cirio, in the revue musical Excitante in Mar del Plata. She was the second vedette in the revue El Anfitrión for the 2011–12 theatrical season in Villa Carlos Paz. The musical was headlined by María Martha Serra Lima, with Mónica González Listorti as first vedette, accompanied by her Bailando professional dance partner Maximiliano D'Iorio as first dancer. The show's creative director was the Austrian vedette Reina Reech; also in the show were the actor-comedian "Cacho" Buenaventura and the magician Emanuel. The musical debuted in Córdoba, Argentina, on 16 December 2011 at the Candilejas I theatre. Bacca won an award for best vedette for her work in El Anfitrión. She also performed in the non-musical theatre comedy Sin Comerla ni Beberla alongside René Bertrand, Sabrina Olmedo, Emiliano Rella, Belén Giménez, Pamela Sosa and Leandro Orowitz, directed by René Bertrand and produced and written by Gerardo Sofovich, in Mar del Plata; it debuted on 28 December 2012 at the Re Fa Si theatre. Bacca is currently the lead vedette of the music hall "Divain" in Mar del Plata, Argentina, alongside Aníbal Pachano, Gustavo Wons, Maia Contreras, Nicolás Armengol and others. The show received six nominations, including two for female revelation, one of them for Estefanía.
```javascript
const test = require('tap').test;
const Variable = require('../../src/engine/variable');
const htmlparser = require('htmlparser2');

test('spec', t => {
    t.type(typeof Variable.SCALAR_TYPE, typeof Variable.LIST_TYPE);
    t.type(typeof Variable.SCALAR_TYPE, typeof Variable.BROADCAST_MESSAGE_TYPE);

    const varId = 'varId';
    const varName = 'varName';
    const varIsCloud = false;
    let v = new Variable(varId, varName, Variable.SCALAR_TYPE, varIsCloud);

    t.type(Variable, 'function');
    t.type(v, 'object');
    t.ok(v instanceof Variable);
    t.equal(v.id, varId);
    t.equal(v.name, varName);
    t.equal(v.type, Variable.SCALAR_TYPE);
    t.type(v.value, 'number');
    t.equal(v.isCloud, varIsCloud);
    t.type(v.toXML, 'function');

    v = new Variable(varId, varName, Variable.LIST_TYPE, varIsCloud);
    t.ok(Array.isArray(v.value));

    v = new Variable(varId, varName, Variable.BROADCAST_MESSAGE_TYPE, varIsCloud);
    t.equal(v.value, 'varName');

    t.end();
});

test('toXML', t => {
    const varId = 'varId';
    const varName = 'varName';
    const varIsCloud = false;
    const varIsLocal = false;
    const v = new Variable(varId, varName, Variable.SCALAR_TYPE, varIsCloud);
    const parser = new htmlparser.Parser({
        onopentag: function (name, attribs) {
            if (name === 'variable') {
                t.equal(attribs.type, Variable.SCALAR_TYPE);
                t.equal(attribs.id, varId);
                t.equal(attribs.iscloud, varIsCloud.toString());
                t.equal(attribs.islocal, varIsLocal.toString());
            }
        },
        ontext: function (text) {
            t.equal(text, varName);
        }
    }, {decodeEntities: false});
    parser.write(v.toXML(false));
    parser.end();
    t.end();
});

test('escape variable name for XML', t => {
    const varId = 'varId';
    const varName = '<>&\'"';
    const varIsCloud = false;
    const varIsLocal = false;
    const v = new Variable(varId, varName, Variable.SCALAR_TYPE, varIsCloud);
    const parser = new htmlparser.Parser({
        onopentag: function (name, attribs) {
            if (name === 'variable') {
                t.equal(attribs.type, Variable.SCALAR_TYPE);
                t.equal(attribs.id, varId);
                t.equal(attribs.iscloud, varIsCloud.toString());
                t.equal(attribs.islocal, varIsLocal.toString());
            }
        },
        ontext: function (text) {
            t.equal(text, '&lt;&gt;&amp;&apos;&quot;');
        }
    }, {decodeEntities: false});
    parser.write(v.toXML(false));
    parser.end();
    t.end();
});
```
Use the `apply` function to get the `min` or `max` of an array
Function declarations vs function expressions
Function constructor vs. function declaration vs. function expression
Functions can be declared after use
IIFE pattern
```javascript
/**
 * path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 */
define(["StartAudioContext", "Tone/core/Tone"], function (StartAudioContext, Tone) {
    return function () {
        // send the ready message to the parent
        var isIOS = /iPad|iPhone|iPod/.test(navigator.userAgent) && !window.MSStream;
        var isAndroid = /Android/.test(navigator.userAgent) && !window.MSStream;
        // full screen button on iOS
        if (isIOS || isAndroid) {
            // make a full screen element and put it in front
            var iOSTapper = document.createElement("div");
            iOSTapper.id = "iOSTap";
            document.body.appendChild(iOSTapper);
            new StartAudioContext(Tone.context, iOSTapper).then(function () {
                iOSTapper.remove();
                window.parent.postMessage('ready', '*');
            });
        } else {
            window.parent.postMessage('ready', '*');
        }
    };
});
```
```php
<?php

namespace Phphub\Notification;

use Phphub\Forms\ReplyCreationForm;
use Phphub\Core\CreatorListener;
use Phphub\Notification\Mention;
use App\Models\Reply;
use Auth;
use App\Models\Topic;
use App\Models\Notification;
use Carbon\Carbon;
use App\Models\User;
use App\Models\Append;
use App\Jobs\SendReplyNotifyMail;

class Notifier
{
    public $notifiedUsers = [];

    public function newTopicNotify(User $fromUser, Mention $mentionParser, Topic $topic)
    {
        // Notify mentioned users
        Notification::batchNotify(
            'mentioned_in_topic',
            $fromUser,
            $this->removeDuplication($mentionParser->users),
            $topic);

        // Notify the user's followers
        Notification::batchNotify(
            'new_topic_from_following',
            $fromUser,
            $this->removeDuplication($fromUser->followers),
            $topic);

        // Notify blog subscribers
        if (count($topic->user->blogs)) {
            Notification::batchNotify(
                'new_topic_from_subscribe',
                $fromUser,
                $this->removeDuplication($topic->user->blogs->first()->subscribers),
                $topic);
        }
    }

    public function newReplyNotify(User $fromUser, Mention $mentionParser, Topic $topic, Reply $reply)
    {
        // Notify the topic author
        Notification::batchNotify(
            'new_reply',
            $fromUser,
            $this->removeDuplication([$topic->user]),
            $topic,
            $reply);

        // Notify users following the topic
        Notification::batchNotify(
            'attention',
            $fromUser,
            $this->removeDuplication($topic->attentedUsers()),
            $topic,
            $reply);

        // Notify mentioned users
        Notification::batchNotify(
            'at',
            $fromUser,
            $this->removeDuplication($mentionParser->users),
            $topic,
            $reply);
    }

    public function newAppendNotify(User $fromUser, Topic $topic, Append $append)
    {
        $users = $topic->replies()->with('user')->get()->lists('user');

        // Notify users who commented
        Notification::batchNotify(
            'comment_append',
            $fromUser,
            $this->removeDuplication($users),
            $topic,
            null,
            $append->content);

        // Notify users who voted
        Notification::batchNotify(
            'vote_append',
            $fromUser,
            $this->removeDuplication($topic->votedUsers()),
            $topic,
            null,
            $append->content);

        // Notify users following the topic
        Notification::batchNotify(
            'attented_append',
            $fromUser,
            $this->removeDuplication($topic->attentedUsers()),
            $topic,
            null,
            $append->content);
    }

    public function newFollowNotify(User $fromUser, User $toUser)
    {
        Notification::notify('follow', $fromUser, $toUser, null, null, null);
    }

    // Avoid sending a user many copies of the same notification
    public function removeDuplication($users)
    {
        $notYetNotifyUsers = [];

        foreach ($users as $user) {
            if (! in_array($user->id, $this->notifiedUsers)) {
                $notYetNotifyUsers[] = $user;
                $this->notifiedUsers[] = $user->id;
            }
        }

        return $notYetNotifyUsers;
    }
}
```
```xml
<?xml version="1.0" encoding="UTF-8"?>
<phpunit xmlns:xsi="path_to_url"
         xsi:noNamespaceSchemaLocation="path_to_url"
         bootstrap="../tests/bootstrap.php"
         backupGlobals="false"
         verbose="true">
  <testsuite name="Comparator">
    <directory suffix="Test.php">../tests</directory>
  </testsuite>
</phpunit>
```
```javascript
'use strict';
angular.module("ngLocale", [], ["$provide", function($provide) {
var PLURAL_CATEGORY = {ZERO: "zero", ONE: "one", TWO: "two", FEW: "few", MANY: "many", OTHER: "other"};
$provide.value("$locale", {
  "DATETIME_FORMATS": {
    "AMPMS": [
      "AM",
      "PM"
    ],
    "DAY": [
      "dimanche",
      "lundi",
      "mardi",
      "mercredi",
      "jeudi",
      "vendredi",
      "samedi"
    ],
    "MONTH": [
      "janvier",
      "f\u00e9vrier",
      "mars",
      "avril",
      "mai",
      "juin",
      "juillet",
      "ao\u00fbt",
      "septembre",
      "octobre",
      "novembre",
      "d\u00e9cembre"
    ],
    "SHORTDAY": [
      "dim.",
      "lun.",
      "mar.",
      "mer.",
      "jeu.",
      "ven.",
      "sam."
    ],
    "SHORTMONTH": [
      "janv.",
      "f\u00e9vr.",
      "mars",
      "avr.",
      "mai",
      "juin",
      "juil.",
      "ao\u00fbt",
      "sept.",
      "oct.",
      "nov.",
      "d\u00e9c."
    ],
    "fullDate": "EEEE d MMMM y",
    "longDate": "d MMMM y",
    "medium": "d MMM y HH:mm:ss",
    "mediumDate": "d MMM y",
    "mediumTime": "HH:mm:ss",
    "short": "dd/MM/y HH:mm",
    "shortDate": "dd/MM/y",
    "shortTime": "HH:mm"
  },
  "NUMBER_FORMATS": {
    "CURRENCY_SYM": "\u20ac",
    "DECIMAL_SEP": ",",
    "GROUP_SEP": "\u00a0",
    "PATTERNS": [
      {
        "gSize": 3,
        "lgSize": 3,
        "macFrac": 0,
        "maxFrac": 3,
        "minFrac": 0,
        "minInt": 1,
        "negPre": "-",
        "negSuf": "",
        "posPre": "",
        "posSuf": ""
      },
      {
        "gSize": 3,
        "lgSize": 3,
        "macFrac": 0,
        "maxFrac": 2,
        "minFrac": 2,
        "minInt": 1,
        "negPre": "-",
        "negSuf": "\u00a0\u00a4",
        "posPre": "",
        "posSuf": "\u00a0\u00a4"
      }
    ]
  },
  "id": "fr-ma",
  "pluralCat": function(n, opt_precision) {
    var i = n | 0;
    if (i == 0 || i == 1) {
      return PLURAL_CATEGORY.ONE;
    }
    return PLURAL_CATEGORY.OTHER;
  }
});
}]);
```
```c
/*****************************************************************************
  All rights reserved.

  Redistribution and use in source and binary forms, with or without
  modification, are permitted provided that the following conditions are met:

    * Redistributions of source code must retain the above copyright notice,
      this list of conditions and the following disclaimer.
    * Redistributions in binary form must reproduce the above copyright
      notice, this list of conditions and the following disclaimer in the
      documentation and/or other materials provided with the distribution.
    * Neither the name of Intel Corporation nor the names of its contributors
      may be used to endorse or promote products derived from this software
      without specific prior written permission.

  THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
  AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
  IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
  ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
  LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
  CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
  SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
  INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
  CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
  ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
  POSSIBILITY OF SUCH DAMAGE.
*****************************************************************************
* Contents: Native middle-level C interface to LAPACK function clanhe
* Author: Intel Corporation
*****************************************************************************/

#include "lapacke_utils.h"

float API_SUFFIX(LAPACKE_clanhe_work)( int matrix_layout, char norm, char uplo,
                                       lapack_int n,
                                       const lapack_complex_float* a,
                                       lapack_int lda, float* work )
{
    lapack_int info = 0;
    float res = 0.;
    if( matrix_layout == LAPACK_COL_MAJOR ) {
        /* Call LAPACK function and adjust info */
        res = LAPACK_clanhe( &norm, &uplo, &n, a, &lda, work );
        if( info < 0 ) {
            info = info - 1;
        }
    } else if( matrix_layout == LAPACK_ROW_MAJOR ) {
        lapack_int lda_t = MAX(1,n);
        lapack_complex_float* a_t = NULL;
        /* Check leading dimension(s) */
        if( lda < n ) {
            info = -6;
            API_SUFFIX(LAPACKE_xerbla)( "LAPACKE_clanhe_work", info );
            return info;
        }
        /* Allocate memory for temporary array(s) */
        a_t = (lapack_complex_float*)
            LAPACKE_malloc( sizeof(lapack_complex_float) * lda_t * MAX(1,n) );
        if( a_t == NULL ) {
            info = LAPACK_TRANSPOSE_MEMORY_ERROR;
            goto exit_level_0;
        }
        /* Transpose input matrices */
        API_SUFFIX(LAPACKE_che_trans)( matrix_layout, uplo, n, a, lda, a_t, lda_t );
        /* Call LAPACK function and adjust info */
        res = LAPACK_clanhe( &norm, &uplo, &n, a_t, &lda_t, work );
        info = 0;  /* LAPACK call is ok! */
        /* Release memory and exit */
        LAPACKE_free( a_t );
exit_level_0:
        if( info == LAPACK_TRANSPOSE_MEMORY_ERROR ) {
            API_SUFFIX(LAPACKE_xerbla)( "LAPACKE_clanhe_work", info );
        }
    } else {
        info = -1;
        API_SUFFIX(LAPACKE_xerbla)( "LAPACKE_clanhe_work", info );
    }
    return res;
}
```
Elections to the National Assembly of France were held in Algeria on 14 October 1877 as part of the wider National Assembly elections. All three seats had only one candidate.

See also
1877 French legislative election
Lord chief justice may refer to:

Lord Chief Justice of England and Wales
Lord Chief Justice of Ireland
Lord Chief Justice of Northern Ireland
Lord Chief Justice of the Common Pleas
Lord Chief Justice of the King's Bench for Ireland

See also
Lord President of the Court of Session
Ziółkowo is a village in the administrative district of Gmina Gostyń, within Gostyń County, Greater Poland Voivodeship, in west-central Poland. It lies approximately south-east of Gostyń and south of the regional capital Poznań.
Gillow is a surname. Notable people with the surname include:

Alfred Gillow (1835–1897), English cricketer
Eulogio Gillow y Zavala, archbishop of Antequera
Joseph Gillow (1850–1921), Roman Catholic antiquary
Robert Gillow (1704–1772), cabinetmaker
Russ Gillow (born 1940), ice hockey player
Shara Gillow (born 1987), Australian cyclist
Thomas Gillow (died 1687), English actor

See also
Gillow, a hamlet in the parish of Hentland, Herefordshire, England
Waring & Gillow, furniture manufacturers
Sanfedismo (from Santa Fede, "Holy Faith" in Italian) was a popular anti-Jacobin movement, organized by Fabrizio Cardinal Ruffo, which mobilized peasants of the Kingdom of Naples against the pro-French Parthenopaean Republic in 1799, its aims culminating in the restoration of the monarchy under Ferdinand I of the Two Sicilies. Its full name was the Army of Holy Faith in our Lord Jesus Christ (Italian: Armata della Santa Fede in nostro Signore Gesù Cristo), and its members were called Sanfedisti. The terms Sanfedismo and Sanfedisti are sometimes used more generally to refer to any religiously motivated, improvised peasant army that sprang up on the Italian peninsula to resist the newly created French client republics.

Campaign

Cardinal Ruffo recruited the Sanfedisti in his native Calabria. His recruiting poster of February 1799 reads: "Brave and courageous Calabrians, unite now under the standard of the Holy Cross and of our beloved sovereign. Do not wait for the enemy to come and contaminate our home neighbourhoods. Let us march to confront him, to repel him, to hunt him out of our kingdom and out of Italy and to break the barbarous chains of our holy Pontiff. May the banner of the Holy Cross secure you total victory." The Sanfedismo movement nominally acted on behalf of Ferdinand I of the Two Sicilies. On January 25, 1799, two days after the proclamation of the Parthenopean Republic, Ferdinand appointed Ruffo, while both were taking refuge in Palermo, Sicily, to act as his vicar-general on the Italian mainland. Ruffo landed in Calabria on February 7 with no money or weapons and only eight companions, but bearing a banner with the royal arms on one side and a cross on the other, also bearing the ancient slogan "In hoc signo vinces." It took Ruffo a month to amass a force of 17,000; mostly peasants, but also "bandits, ecclesiastics, mercenaries, looters, devotees, and assassins."
During the campaign, Ruffo corresponded with Ferdinand's agent, Sir John Acton, 6th Baronet, updating him on the military progress of the Sanfedisti:

"I beg the king [of Naples] to order at least a thousand handguns and many loads of lead shot to be sent to me" (February 12)
"I think it would be expedient to send a frigate with a mortar against Cotrone and to destroy it absolutely" (February 26)
"Catanzaro has really surrendered; many of the worst fellows have been massacred, others taken prisoner" (March 8)
"Cosenza has been taken and sacked" (March 19)

By the end of April, the Sanfedisti had subdued the entirety of Calabria and Basilicata and most of Apulia, and by June had begun a land siege of the city of Naples. In the siege, the Sanfedismo irregulars were supported by the British Royal Navy under the command of Admiral Horatio Nelson, for which Ferdinand gave Nelson the title of Duke of Bronte, which Nelson affixed to his signature for the rest of his life. The Parthenopean Republic collapsed on June 19, 1799. Most of the Sanfedisti victories occurred in rugged terrain, which was "well-suited" to the irregular style of warfare employed by Ruffo. Like other anti-French uprisings in Italy, the Sanfedisti were not, as a rule, well disposed towards Freemasons and Jews, who were perceived as supporters of Enlightenment ideology. Furthermore, despite being a supporter of the Parthenopaean Republic, Bishop Giovanni Andrea Serrao, the Jansenist leader in southern Italy, was summarily executed on February 24, 1799, by the Republican soldiers of the Potenza garrison as Ruffo's forces drew near to the city.

Legacy

The role of Cardinal Ruffo in the movement was a source of contemporary controversy, with both cruelty and bloodlust attributed to him; apologist writings defending him are extant only with respect to the sack of Altamura.
The name Sanfedismo was itself a source of criticism, dubbed "a word sprung up, by which this new phase of wickedness might be called" by a contemporary. Sanfedismo, and Ruffo himself, became synonymous with the "recalcitrant, truly counter-revolutionary clergy", as opposed to those more sympathetic to the French Revolution. The name "Sanfedisti" was also used by Bourbonist peasant uprisings against the House of Savoy during Italian unification. The Canto dei Sanfedisti is still known by heart among many in the Mezzogiorno, and is sometimes sung by folk groups; it is a parody of La Carmagnole, a popular French Revolutionary song. Later scholars have characterised the Sanfedisti as a "counter-revolutionary" group, but not homogeneously a "reaction[ary]" one.

See also
Veronese Easters
Viva Maria
Vendée Revolt
Carlism
Fra Diavolo
The 2010–11 season was the 63rd season in the existence of FC Steaua București and the club's 63rd consecutive season in the top flight of Romanian football. In addition to the domestic league, Steaua București participated in this season's editions of the Cupa României and the UEFA Europa League.

Previous season positions

Players

Squad information
Players bought, sold and re-signed during the season
Players sold or loaned out during the season

Transfers
In
Out

Statistics

Player stats
Players sold or loaned out during the season

Goalscorers
Key
Goal minutes
Last updated: 25 May 2011 (UTC)
Source: FCSB

Start formations

Squad stats

| | Total | Home | Away | Neutral |
|---|---|---|---|---|
| Games played | 48 | 23 | 24 | 1 |
| Games won | 21 | 11 | 9 | 1 |
| Games drawn | 15 | 8 | 7 | 0 |
| Games lost | 12 | 4 | 8 | 0 |
| Biggest win | 5–0 vs Unirea Urziceni | 5–0 vs Unirea Urziceni | 3–0 vs FC Vaslui | 2–1 vs Dinamo București |
| Biggest loss | 4–1 vs Liverpool, 3–0 vs FC Brașov | 3–0 vs FC Brașov | 4–1 vs Liverpool | — |
| Clean sheets | 16 | 7 | 9 | 0 |
| Goals scored | 60 | 37 | 21 | 2 |
| Goals conceded | 42 | 22 | 19 | 1 |
| Goal difference | +18 | +15 | +2 | +1 |
| Top scorer | Stancu (16) | 13 | 3 | 0 |
| Winning rate | % | % | % | % |

International appearances
Notes
Was called up for Romania's game but was not used.

Competitions

Overall

Liga I
League table
Results summary
Results by round
Points by opponent
Source: FCSB
Matches

Cupa României
Results

UEFA Europa League
Play-off round
Group stage
Group K
Results

Non-competitive matches

UEFA club rankings
This is the current UEFA club ranking, including season 2009–10.
Edwin Kentfield (11 January 1802 – 29 August 1873), also known as Jonathan Kentfield, was an English player of English billiards. He claimed the Billiards Championship in 1825 and held it uncontested until 1849.

Biography

Edwin Kentfield was born at Brighton in 1802. In about 1815, John Carr, better known as Jack Carr, took a job as a billiard marker, a role that involved keeping the score of billiards matches. In this role he learned to play billiards using a technique that was at that time almost unknown. He successfully played challenge matches for money, to the extent that by 1825 he had backers for him to play for 100 guineas a side against any challenger. In 1825 Kentfield challenged Carr, but Carr was too ill to play, and Kentfield assumed the title of Champion, for which he went unchallenged for 24 years. Kentfield authored a book on billiards, which was published in 1839. In 1849, John Roberts Sr sought to challenge Kentfield for the title, and when Kentfield declined to play, Roberts took the title of champion. Kentfield ran a billiards club in Brighton, which was offered for sale at auction in 1864 following his bankruptcy. He died on 29 August 1873. Kentfield's highest break was 196.

External links
"Kentfield & Carr" at the Billiard and Snooker Heritage Collection.
John Downing Fripp (February 11, 1921 – March 24, 2022) was a Canadian skier and football player. He was a skier between 1927 and 1960 and played football in the Interprovincial Rugby Football Union (IRFU) (now CFL East Division) and Ontario Rugby Football Union (ORFU) between 1941 and 1947. A centenarian, Fripp was believed to be the oldest former Canadian football player at the time of his death. Early life Johnny Fripp was born on February 11, 1921, in Ottawa, Ontario. He attended high school at Lisgar Collegiate Institute, before transferring to Glebe Collegiate. Skiing career Fripp started skiing at the age of 6, and won multiple tournaments as a youth. In 1938, at the age of 17, he won the Journal Trophy at the Gatineau Ski Zone Championships. However, the trophy was awarded to someone else as he was not old enough to be eligible. Due to his young age, he could not compete at the Dominion Championships and instead went to compete against the Americans in the Eastern Olympic try-outs held at Lake Placid, New York. He was beaten by national champion Dick Durrance, but won third place in both slalom and downhill events. Fripp again won the Journal Trophy one year later, and also placed second in the Quebec Kandahar combined race, earning first place honors in downhill. He was champion of the Kandahar race in 1940, and won the Eastern Canadian Championships with first place in both slalom and downhill events. Later that year in Sun Valley, Idaho, he competed against American and Austrian professionals and placed 16th out of 80 competitors. Also in 1940, he was named an assistant professional ski instructor at Mont Tremblant. His sports career was interrupted in 1942 by World War II. As a member of the skiing team of the Royal Canadian Air Force, he won an event held at Mount Baldy Ski Area and recorded the fastest time ever by a Canadian skier. After returning from the war, Fripp again won the Quebec Kandahar race. 
He also won the Alta Cup competition in Alta, Utah. He retired shortly afterwards, but returned in 1951 and won his third Quebec Kandahar tournament. He won the Canadian Open Class Downhill Championships in 1953, and was Canada's top entrant to the 1954 Ryan Cup. He was appointed by the Canadian Federation Internationale de Ski (FIS) to be the coach of the men's team in 1958 to compete in Bad Gastein. He also served as director of the Canadian Amateur Ski Association in 1957, and was a member of the International Competition Committee in 1958. He retired in 1960. Football career Fripp began to play football for the Ottawa Rough Riders of the Interprovincial Rugby Football Union (IRFU) at 20 years old in 1941. He played the flying wing position. He made his debut on September 27, in an 18–5 win over a Montreal team. An article in The Montreal Gazette said, "Johnny Fripp, one of Canada's greatest skiers and last year's one man football team at Glebe Collegiate, was shoved into the battle as a momentary replacement for Andy Tommy in the second quarter." After being put in, he was immediately given the ball on a short end play and "made yards". His rushes were described as "bull-like" by The Montreal Gazette. He made a "fine debut", according to The Ottawa Journal. Controversy arose after the game, when Montreal Star writer Baz O'Meara claimed he was ineligible to play due to his skiing career. James P. McCaffrey, league president, declined to comment on Fripp. Fripp remained in the league, and scored his first touchdown in a 24–6 win over the Toronto Argonauts on October 25. He was out of the league in 1942 when games were suspended due to World War II. He returned to football the following year, playing on the Lachine Fliers military service team. After then spending another year out of football, Fripp played for the Montreal Hornets of the IRFU in 1945. He re-joined the Ottawa Rough Riders in 1946, appearing in eleven games.
He returned to the team in 1947, but left before the season started for the Ottawa Trojans of the Ontario Rugby Football Union (ORFU). He played one season with the Trojans, appearing in seven games, before retiring from football. Later life His father, Herbert, founded a real estate and insurance firm in 1923, H. D. Fripp & Son Ltd., which Johnny Fripp took over in 1950. He was inducted into the Canadian Ski Hall of Fame in 1988, and one year later was an inductee to the Ottawa Sports Hall of Fame. In 2020, he was reported to be the oldest former Ottawa Rough Rider, and believed to be the oldest former Canadian football player at the time. Fripp celebrated his 100th birthday on February 11, 2021. He died on March 24, 2022, at the age of 101.
Songs That Jesus Said is the first studio album featuring songs by Keith Getty and Kristyn Getty intended specifically for children. A published score and CD tracks were released in conjunction with this CD. All songs on the album were written jointly by Keith Getty and Kristyn Getty, and most are based on specific passages of the Christian Bible, especially the four Gospels.

Track listing

Better Is One Day with Jesus (Luke 10)
He Is My Light (John 8:12)
Underneath the Shining Star (Matthew 1–2)
Have You Seen Him? (Luke 9:18–20)
Two Little Houses (Matthew 7:24–27)
Stop and Think (Matthew 20:26–27)
Seed You Sow (Luke 8)
Store Up Good (Luke 6:45)
Little Zac (Luke 19:1–10)
Look to Jesus (John 4, John 7)
Once Upon a Boat (Matthew 8:23–27)
Let the Little Children Come (Matthew 19:13–15)
Father in Heaven (Matthew 6:9–13)
You Know (Matthew 10:29–31)
Remember (Matthew 6:25–31)
You Are the Shepherd (John 10:3–5)
In My Father's House (John 14:1–4)
The Grace Song of Heaven
Christ Has Risen
I'm Ready to Go (Matthew 28:18–20)

Credits

Joni McCabe – producer
Tim Oliver – arranger
Katrina McClay & Susie Young – conductors
Mudd Wallace – engineer (Homestead Recording Studios, Northern Ireland)
Dave Schober – mastering
Eric Wyse – 'final touches'
Leigh Merchant – album art

(c) 2005 Getty Music

External links
Getty Music
"Songs That Jesus Said: Singing the Bible for Young Worshipers," audio lecture by Keith & Kristyn Getty, delivered at The Southern Baptist Theological Seminary on October 6, 2005 (mp3).
Review in Reformed Worship, issue no. 81 (September 2006), by Carl Stam and Fritz West.
```c /* packet-usb.c * * USB basic dissector * By Paolo Abeni <paolo.abeni@email.it> * Ronnie Sahlberg 2006 * * path_to_url * * path_to_url * * path_to_url * * This program is free software; you can redistribute it and/or * modify it under the terms of the GNU General Public License * as published by the Free Software Foundation; either version 2 * of the License, or (at your option) any later version. * * This program is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the * GNU General Public License for more details. * * You should have received a copy of the GNU General Public License * along with this program; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. */ #include "config.h" #include <epan/packet.h> #include <epan/exceptions.h> #include <epan/addr_resolv.h> #include <epan/address_types.h> #include <epan/conversation_table.h> #include <epan/expert.h> #include <epan/prefs.h> #include <epan/decode_as.h> #include <epan/proto_data.h> #include <wsutil/pint.h> #include "packet-usb.h" #include "packet-mausb.h" #include "packet-usbip.h" /* protocols and header fields */ static int proto_usb = -1; /* USB pseudoheader fields, both FreeBSD and Linux */ static int hf_usb_totlen = -1; static int hf_usb_busunit = -1; static int hf_usb_address = -1; static int hf_usb_mode = -1; static int hf_usb_freebsd_urb_type = -1; static int hf_usb_freebsd_transfer_type = -1; static int hf_usb_xferflags = -1; static int hf_usb_xferflags_force_short_xfer = -1; static int hf_usb_xferflags_short_xfer_ok = -1; static int hf_usb_xferflags_short_frames_ok = -1; static int hf_usb_xferflags_pipe_bof = -1; static int hf_usb_xferflags_proxy_buffer = -1; static int hf_usb_xferflags_ext_buffer = -1; static int hf_usb_xferflags_manual_status = -1; static int hf_usb_xferflags_no_pipe_ok = -1; static int hf_usb_xferflags_stall_pipe = -1; static int hf_usb_xferstatus = -1; static int hf_usb_xferstatus_open = -1; static int hf_usb_xferstatus_transferring = -1; static int hf_usb_xferstatus_did_dma_delay = -1; static int 
hf_usb_xferstatus_did_close = -1; static int hf_usb_xferstatus_draining = -1; static int hf_usb_xferstatus_started = -1; static int hf_usb_xferstatus_bw_reclaimed = -1; static int hf_usb_xferstatus_control_xfr = -1; static int hf_usb_xferstatus_control_hdr = -1; static int hf_usb_xferstatus_control_act = -1; static int hf_usb_xferstatus_control_stall = -1; static int hf_usb_xferstatus_short_frames_ok = -1; static int hf_usb_xferstatus_short_xfer_ok = -1; static int hf_usb_xferstatus_bdma_enable = -1; static int hf_usb_xferstatus_bdma_no_post_sync = -1; static int hf_usb_xferstatus_bdma_setup = -1; static int hf_usb_xferstatus_isochronous_xfr = -1; static int hf_usb_xferstatus_curr_dma_set = -1; static int hf_usb_xferstatus_can_cancel_immed = -1; static int hf_usb_xferstatus_doing_callback = -1; static int hf_usb_error = -1; static int hf_usb_interval = -1; static int hf_usb_nframes = -1; static int hf_usb_packet_size = -1; static int hf_usb_packet_count = -1; static int hf_usb_speed = -1; static int hf_usb_frame_length = -1; static int hf_usb_frame_flags = -1; static int hf_usb_frame_flags_read = -1; static int hf_usb_frame_flags_data_follows = -1; static int hf_usb_frame_data = -1; static int hf_usb_urb_id = -1; static int hf_usb_linux_urb_type = -1; static int hf_usb_linux_transfer_type = -1; static int hf_usb_endpoint_number = -1; static int hf_usb_endpoint_direction = -1; static int hf_usb_endpoint_number_value = -1; static int hf_usb_device_address = -1; static int hf_usb_bus_id = -1; static int hf_usb_setup_flag = -1; static int hf_usb_data_flag = -1; static int hf_usb_urb_ts_sec = -1; static int hf_usb_urb_ts_usec = -1; static int hf_usb_urb_status = -1; static int hf_usb_urb_len = -1; static int hf_usb_urb_data_len = -1; static int hf_usb_urb_unused_setup_header = -1; static int hf_usb_urb_interval = -1; static int hf_usb_urb_start_frame = -1; static int hf_usb_urb_copy_of_transfer_flags = -1; /* Win32 USBPcap pseudoheader fields */ static int 
hf_usb_win32_header_len = -1; static int hf_usb_irp_id = -1; static int hf_usb_usbd_status = -1; static int hf_usb_function = -1; static int hf_usb_info = -1; static int hf_usb_usbpcap_info_reserved = -1; static int hf_usb_usbpcap_info_direction = -1; static int hf_usb_win32_device_address = -1; /* hf_usb_bus_id, hf_usb_endpoint_number, hf_usb_endpoint_direction, * hf_usb_endpoint_number_value, hf_usb_transfer_type are common with * Linux pseudoheader */ static int hf_usb_win32_data_len = -1; static int hf_usb_control_stage = -1; static int hf_usb_win32_iso_start_frame = -1; static int hf_usb_win32_iso_num_packets = -1; static int hf_usb_win32_iso_error_count = -1; static int hf_usb_win32_iso_offset = -1; static int hf_usb_win32_iso_length = -1; static int hf_usb_win32_iso_status = -1; static int hf_usb_request = -1; static int hf_usb_request_unknown_class = -1; static int hf_usb_value = -1; static int hf_usb_index = -1; static int hf_usb_length = -1; /* static int hf_usb_data_len = -1; */ static int hf_usb_capdata = -1; static int hf_usb_device_wFeatureSelector = -1; static int hf_usb_interface_wFeatureSelector = -1; static int hf_usb_endpoint_wFeatureSelector = -1; static int hf_usb_wInterface = -1; static int hf_usb_wEndpoint = -1; static int hf_usb_wStatus = -1; static int hf_usb_wFrameNumber = -1; static int hf_usb_iso_error_count = -1; static int hf_usb_iso_numdesc = -1; static int hf_usb_iso_status = -1; static int hf_usb_iso_off = -1; static int hf_usb_iso_len = -1; static int hf_usb_iso_actual_len = -1; static int hf_usb_iso_pad = -1; static int hf_usb_iso_data = -1; static int hf_usb_bmRequestType = -1; static int hf_usb_control_response_generic = -1; static int hf_usb_bmRequestType_direction = -1; static int hf_usb_bmRequestType_type = -1; static int hf_usb_bmRequestType_recipient = -1; static int hf_usb_bDescriptorType = -1; static int hf_usb_get_descriptor_resp_generic = -1; static int hf_usb_descriptor_index = -1; static int hf_usb_language_id = -1; 
static int hf_usb_bLength = -1; static int hf_usb_bcdUSB = -1; static int hf_usb_bDeviceClass = -1; static int hf_usb_bDeviceSubClass = -1; static int hf_usb_bDeviceProtocol = -1; static int hf_usb_bMaxPacketSize0 = -1; static int hf_usb_idVendor = -1; static int hf_usb_idProduct = -1; static int hf_usb_bcdDevice = -1; static int hf_usb_iManufacturer = -1; static int hf_usb_iProduct = -1; static int hf_usb_iSerialNumber = -1; static int hf_usb_bNumConfigurations = -1; static int hf_usb_wLANGID = -1; static int hf_usb_bString = -1; static int hf_usb_bInterfaceNumber = -1; static int hf_usb_bAlternateSetting = -1; static int hf_usb_bNumEndpoints = -1; static int hf_usb_bInterfaceClass = -1; static int hf_usb_bInterfaceSubClass = -1; static int hf_usb_bInterfaceSubClass_cdc = -1; static int hf_usb_bInterfaceSubClass_hid = -1; static int hf_usb_bInterfaceSubClass_misc = -1; static int hf_usb_bInterfaceSubClass_app = -1; static int hf_usb_bInterfaceProtocol = -1; static int hf_usb_bInterfaceProtocol_cdc = -1; static int hf_usb_bInterfaceProtocol_cdc_data = -1; static int hf_usb_bInterfaceProtocol_hid_boot = -1; static int hf_usb_bInterfaceProtocol_app_dfu = -1; static int hf_usb_bInterfaceProtocol_app_irda = -1; static int hf_usb_bInterfaceProtocol_app_usb_test_and_measurement = -1; static int hf_usb_iInterface = -1; static int hf_usb_bEndpointAddress = -1; static int hf_usb_bmAttributes = -1; static int hf_usb_bEndpointAttributeTransfer = -1; static int hf_usb_bEndpointAttributeSynchonisation = -1; static int hf_usb_bEndpointAttributeBehaviour = -1; static int hf_usb_wMaxPacketSize = -1; static int hf_usb_wMaxPacketSize_size = -1; static int hf_usb_wMaxPacketSize_slots = -1; static int hf_usb_bInterval = -1; static int hf_usb_wTotalLength = -1; static int hf_usb_bNumInterfaces = -1; static int hf_usb_bConfigurationValue = -1; static int hf_usb_iConfiguration = -1; static int hf_usb_bMaxPower = -1; static int hf_usb_configuration_bmAttributes = -1; static int 
hf_usb_configuration_legacy10buspowered = -1; static int hf_usb_configuration_selfpowered = -1; static int hf_usb_configuration_remotewakeup = -1; static int hf_usb_bEndpointAddress_direction = -1; static int hf_usb_bEndpointAddress_number = -1; static int hf_usb_response_in = -1; static int hf_usb_time = -1; static int hf_usb_request_in = -1; static int hf_usb_bFirstInterface = -1; static int hf_usb_bInterfaceCount = -1; static int hf_usb_bFunctionClass = -1; static int hf_usb_bFunctionSubClass = -1; static int hf_usb_bFunctionProtocol = -1; static int hf_usb_iFunction = -1; static int hf_usb_data_fragment = -1; static int hf_usb_src = -1; static int hf_usb_dst = -1; static int hf_usb_addr = -1; static gint ett_usb_hdr = -1; static gint ett_usb_setup_hdr = -1; static gint ett_usb_isodesc = -1; static gint ett_usb_win32_iso_packet = -1; static gint ett_usb_endpoint = -1; static gint ett_usb_setup_bmrequesttype = -1; static gint ett_usb_usbpcap_info = -1; static gint ett_descriptor_device = -1; static gint ett_configuration_bmAttributes = -1; static gint ett_configuration_bEndpointAddress = -1; static gint ett_endpoint_bmAttributes = -1; static gint ett_endpoint_wMaxPacketSize = -1; static gint ett_usb_xferflags = -1; static gint ett_usb_xferstatus = -1; static gint ett_usb_frame = -1; static gint ett_usb_frame_flags = -1; static expert_field ei_usb_bLength_even = EI_INIT; static expert_field ei_usb_bLength_too_short = EI_INIT; static expert_field ei_usb_desc_length_invalid = EI_INIT; static expert_field ei_usb_invalid_setup = EI_INIT; static int usb_address_type = -1; static const int *usb_endpoint_fields[] = { &hf_usb_endpoint_direction, &hf_usb_endpoint_number_value, NULL }; static const int *usb_usbpcap_info_fields[] = { &hf_usb_usbpcap_info_reserved, &hf_usb_usbpcap_info_direction, NULL }; static int usb_tap = -1; static gboolean try_heuristics = TRUE; static dissector_table_t usb_bulk_dissector_table; static dissector_table_t usb_control_dissector_table; 
static dissector_table_t usb_interrupt_dissector_table; static dissector_table_t usb_descriptor_dissector_table; static heur_dissector_list_t heur_bulk_subdissector_list; static heur_dissector_list_t heur_control_subdissector_list; static heur_dissector_list_t heur_interrupt_subdissector_list; static wmem_tree_t *device_to_protocol_table = NULL; static wmem_tree_t *device_to_product_table = NULL; static dissector_table_t device_to_dissector; static dissector_table_t protocol_to_dissector; static dissector_table_t product_to_dissector; typedef struct _device_product_data_t { guint16 vendor; guint16 product; guint bus_id; guint device_address; } device_product_data_t; typedef struct _device_protocol_data_t { guint32 protocol; guint bus_id; guint device_address; } device_protocol_data_t; typedef struct _usb_alt_setting_t { guint8 altSetting; guint8 interfaceClass; guint8 interfaceSubclass; guint8 interfaceProtocol; } usb_alt_setting_t; /* path_to_url */ static const value_string usb_langid_vals[] = { {0x0000, "no language specified"}, {0x0401, "Arabic (Saudi Arabia)"}, {0x0402, "Bulgarian"}, {0x0403, "Catalan"}, {0x0404, "Chinese (Taiwan)"}, {0x0405, "Czech"}, {0x0406, "Danish"}, {0x0407, "German (Standard)"}, {0x0408, "Greek"}, {0x0409, "English (United States)"}, {0x040a, "Spanish (Traditional Sort)"}, {0x040b, "Finnish"}, {0x040c, "French (Standard)"}, {0x040d, "Hebrew"}, {0x040e, "Hungarian"}, {0x040f, "Icelandic"}, {0x0410, "Italian (Standard)"}, {0x0411, "Japanese"}, {0x0412, "Korean"}, {0x0413, "Dutch (Netherlands)"}, {0x0414, "Norwegian (Bokmal)"}, {0x0415, "Polish"}, {0x0416, "Portuguese (Brazil)"}, {0x0418, "Romanian"}, {0x0419, "Russian"}, {0x041a, "Croatian"}, {0x041b, "Slovak"}, {0x041c, "Albanian"}, {0x041d, "Swedish"}, {0x041e, "Thai"}, {0x041f, "Turkish"}, {0x0420, "Urdu (Pakistan)"}, {0x0421, "Indonesian"}, {0x0422, "Ukrainian"}, {0x0423, "Belarussian"}, {0x0424, "Slovenian"}, {0x0425, "Estonian"}, {0x0426, "Latvian"}, {0x0427, "Lithuanian"}, {0x0429, 
"Farsi"}, {0x042a, "Vietnamese"}, {0x042b, "Armenian"}, {0x042c, "Azeri (Latin)"}, {0x042d, "Basque"}, {0x042f, "Macedonian"}, {0x0430, "Sutu"}, {0x0436, "Afrikaans"}, {0x0437, "Georgian"}, {0x0438, "Faeroese"}, {0x0439, "Hindi"}, {0x043e, "Malay (Malaysian)"}, {0x043f, "Kazakh"}, {0x0441, "Swahili (Kenya)"}, {0x0443, "Uzbek (Latin)"}, {0x0444, "Tatar (Tatarstan)"}, {0x0445, "Bengali"}, {0x0446, "Punjabi"}, {0x0447, "Gujarati"}, {0x0448, "Oriya"}, {0x0449, "Tamil"}, {0x044a, "Telugu"}, {0x044b, "Kannada"}, {0x044c, "Malayalam"}, {0x044d, "Assamese"}, {0x044e, "Marathi"}, {0x044f, "Sanskrit"}, {0x0455, "Burmese"}, {0x0457, "Konkani"}, {0x0458, "Manipuri"}, {0x0459, "Sindhi"}, {0x04ff, "HID (Usage Data Descriptor)"}, {0x0801, "Arabic (Iraq)"}, {0x0804, "Chinese (PRC)"}, {0x0807, "German (Switzerland)"}, {0x0809, "English (United Kingdom)"}, {0x080a, "Spanish (Mexican)"}, {0x080c, "French (Belgian)"}, {0x0810, "Italian (Switzerland)"}, {0x0812, "Korean (Johab)"}, {0x0813, "Dutch (Belgium)"}, {0x0814, "Norwegian (Nynorsk)"}, {0x0816, "Portuguese (Standard)"}, {0x081a, "Serbian (Latin)"}, {0x081d, "Swedish (Finland)"}, {0x0820, "Urdu (India)"}, {0x0827, "Lithuanian (Classic)"}, {0x082c, "Azeri (Cyrillic)"}, {0x083e, "Malay (Brunei Darussalam)"}, {0x0843, "Uzbek (Cyrillic)"}, {0x0860, "Kashmiri (India)"}, {0x0861, "Nepali (India)"}, {0x0c01, "Arabic (Egypt)"}, {0x0c04, "Chinese (Hong Kong SAR, PRC)"}, {0x0c07, "German (Austria)"}, {0x0c09, "English (Australian)"}, {0x0c0a, "Spanish (Modern Sort)"}, {0x0c0c, "French (Canadian)"}, {0x0c1a, "Serbian (Cyrillic)"}, {0x1001, "Arabic (Libya)"}, {0x1004, "Chinese (Singapore)"}, {0x1007, "German (Luxembourg)"}, {0x1009, "English (Canadian)"}, {0x100a, "Spanish (Guatemala)"}, {0x100c, "French (Switzerland)"}, {0x1401, "Arabic (Algeria)"}, {0x1404, "Chinese (Macau SAR)"}, {0x1407, "German (Liechtenstein)"}, {0x1409, "English (New Zealand)"}, {0x140a, "Spanish (Costa Rica)"}, {0x140c, "French (Luxembourg)"}, {0x1801, "Arabic 
(Morocco)"}, {0x1809, "English (Ireland)"}, {0x180a, "Spanish (Panama)"}, {0x180c, "French (Monaco)"}, {0x1c01, "Arabic (Tunisia)"}, {0x1c09, "English (South Africa)"}, {0x1c0a, "Spanish (Dominican Republic)"}, {0x2001, "Arabic (Oman)"}, {0x2009, "English (Jamaica)"}, {0x200a, "Spanish (Venezuela)"}, {0x2401, "Arabic (Yemen)"}, {0x2409, "English (Caribbean)"}, {0x240a, "Spanish (Colombia)"}, {0x2801, "Arabic (Syria)"}, {0x2809, "English (Belize)"}, {0x280a, "Spanish (Peru)"}, {0x2c01, "Arabic (Jordan)"}, {0x2c09, "English (Trinidad)"}, {0x2c0a, "Spanish (Argentina)"}, {0x3001, "Arabic (Lebanon)"}, {0x3009, "English (Zimbabwe)"}, {0x300a, "Spanish (Ecuador)"}, {0x3401, "Arabic (Kuwait)"}, {0x3409, "English (Philippines)"}, {0x340a, "Spanish (Chile)"}, {0x3801, "Arabic (U.A.E.)"}, {0x380a, "Spanish (Uruguay)"}, {0x3c01, "Arabic (Bahrain)"}, {0x3c0a, "Spanish (Paraguay)"}, {0x4001, "Arabic (Qatar)"}, {0x400a, "Spanish (Bolivia)"}, {0x440a, "Spanish (El Salvador)"}, {0x480a, "Spanish (Honduras)"}, {0x4c0a, "Spanish (Nicaragua)"}, {0x500a, "Spanish (Puerto Rico)"}, {0xf0ff, "HID (Vendor Defined 1)"}, {0xf4ff, "HID (Vendor Defined 2)"}, {0xf8ff, "HID (Vendor Defined 3)"}, {0xfcff, "HID (Vendor Defined 4)"}, {0, NULL} }; value_string_ext usb_langid_vals_ext = VALUE_STRING_EXT_INIT(usb_langid_vals); static const value_string usb_class_vals[] = { {IF_CLASS_DEVICE, "Device"}, {IF_CLASS_AUDIO, "Audio"}, {IF_CLASS_COMMUNICATIONS, "Communications and CDC Control"}, {IF_CLASS_HID, "HID"}, {IF_CLASS_PHYSICAL, "Physical"}, {IF_CLASS_IMAGE, "Imaging"}, {IF_CLASS_PRINTER, "Printer"}, {IF_CLASS_MASS_STORAGE, "Mass Storage"}, {IF_CLASS_HUB, "Hub"}, {IF_CLASS_CDC_DATA, "CDC-Data"}, {IF_CLASS_SMART_CARD, "Smart Card"}, {IF_CLASS_CONTENT_SECURITY, "Content Security"}, {IF_CLASS_VIDEO, "Video"}, {IF_CLASS_PERSONAL_HEALTHCARE, "Personal Healthcare"}, {IF_CLASS_AUDIO_VIDEO, "Audio/Video Devices"}, {IF_CLASS_DIAGNOSTIC_DEVICE, "Diagnostic Device"}, {IF_CLASS_WIRELESS_CONTROLLER, "Wireless 
Controller"}, {IF_CLASS_MISCELLANEOUS, "Miscellaneous"}, {IF_CLASS_APPLICATION_SPECIFIC, "Application Specific"}, {IF_CLASS_VENDOR_SPECIFIC, "Vendor Specific"}, {0, NULL} }; value_string_ext usb_class_vals_ext = VALUE_STRING_EXT_INIT(usb_class_vals); /* use usb class, subclass and protocol id together path_to_url USB Class Definitions for Communications Devices, Revision 1.2 December 6, 2012 */ static const value_string usb_protocols[] = { {0x000000, "Use class code info from Interface Descriptors"}, {0x060101, "Still Imaging"}, {0x090000, "Full speed Hub"}, {0x090001, "Hi-speed hub with single TT"}, {0x090002, "Hi-speed hub with multiple TTs"}, {0x0D0000, "Content Security"}, {0x100100, "AVControl Interface"}, {0x100200, "AVData Video Streaming Interface"}, {0x100300, "AVData Audio Streaming Interface"}, {0xDC0101, "USB2 Compliance Device"}, {0xE00101, "Bluetooth Programming Interface"}, {0xE00102, "UWB Radio Control Interface"}, {0xE00103, "Remote NDIS"}, {0xE00104, "Bluetooth AMP Controller"}, {0xE00201, "Host Wire Adapter Control/Data interface"}, {0xE00202, "Device Wire Adapter Control/Data interface"}, {0xE00203, "Device Wire Adapter Isochronous interface"}, {0xEF0101, "Active Sync device"}, {0xEF0102, "Palm Sync"}, {0xEF0201, "Interface Association Descriptor"}, {0xEF0202, "Wire Adapter Multifunction Peripheral programming interface"}, {0xEF0301, "Cable Based Association Framework"}, {0xFE0101, "Device Firmware Upgrade"}, {0xFE0200, "IRDA Bridge device"}, {0xFE0300, "USB Test and Measurement Device"}, {0xFE0301, "USB Test and Measurement Device conforming to the USBTMC USB488"}, {0, NULL} }; static value_string_ext usb_protocols_ext = VALUE_STRING_EXT_INIT(usb_protocols); /* FreeBSD header */ /* Transfer mode */ #define FREEBSD_MODE_HOST 0 #define FREEBSD_MODE_DEVICE 1 static const value_string usb_freebsd_transfer_mode_vals[] = { {FREEBSD_MODE_HOST, "Host"}, {FREEBSD_MODE_DEVICE, "Device"}, {0, NULL} }; /* Type */ #define FREEBSD_URB_SUBMIT 0 #define 
FREEBSD_URB_COMPLETE 1 static const value_string usb_freebsd_urb_type_vals[] = { {FREEBSD_URB_SUBMIT, "URB_SUBMIT"}, {FREEBSD_URB_COMPLETE, "URB_COMPLETE"}, {0, NULL} }; /* Transfer type */ #define FREEBSD_URB_CONTROL 0 #define FREEBSD_URB_ISOCHRONOUS 1 #define FREEBSD_URB_BULK 2 #define FREEBSD_URB_INTERRUPT 3 static const value_string usb_freebsd_transfer_type_vals[] = { {FREEBSD_URB_CONTROL, "URB_CONTROL"}, {FREEBSD_URB_ISOCHRONOUS, "URB_ISOCHRONOUS"}, {FREEBSD_URB_BULK, "URB_BULK"}, {FREEBSD_URB_INTERRUPT, "URB_INTERRUPT"}, {0, NULL} }; /* Transfer flags */ #define FREEBSD_FLAG_FORCE_SHORT_XFER 0x00000001 #define FREEBSD_FLAG_SHORT_XFER_OK 0x00000002 #define FREEBSD_FLAG_SHORT_FRAMES_OK 0x00000004 #define FREEBSD_FLAG_PIPE_BOF 0x00000008 #define FREEBSD_FLAG_PROXY_BUFFER 0x00000010 #define FREEBSD_FLAG_EXT_BUFFER 0x00000020 #define FREEBSD_FLAG_MANUAL_STATUS 0x00000040 #define FREEBSD_FLAG_NO_PIPE_OK 0x00000080 #define FREEBSD_FLAG_STALL_PIPE 0x00000100 static const int *usb_xferflags_fields[] = { &hf_usb_xferflags_force_short_xfer, &hf_usb_xferflags_short_xfer_ok, &hf_usb_xferflags_short_frames_ok, &hf_usb_xferflags_pipe_bof, &hf_usb_xferflags_proxy_buffer, &hf_usb_xferflags_ext_buffer, &hf_usb_xferflags_manual_status, &hf_usb_xferflags_no_pipe_ok, &hf_usb_xferflags_stall_pipe, NULL }; /* Transfer status */ #define FREEBSD_STATUS_OPEN 0x00000001 #define FREEBSD_STATUS_TRANSFERRING 0x00000002 #define FREEBSD_STATUS_DID_DMA_DELAY 0x00000004 #define FREEBSD_STATUS_DID_CLOSE 0x00000008 #define FREEBSD_STATUS_DRAINING 0x00000010 #define FREEBSD_STATUS_STARTED 0x00000020 #define FREEBSD_STATUS_BW_RECLAIMED 0x00000040 #define FREEBSD_STATUS_CONTROL_XFR 0x00000080 #define FREEBSD_STATUS_CONTROL_HDR 0x00000100 #define FREEBSD_STATUS_CONTROL_ACT 0x00000200 #define FREEBSD_STATUS_CONTROL_STALL 0x00000400 #define FREEBSD_STATUS_SHORT_FRAMES_OK 0x00000800 #define FREEBSD_STATUS_SHORT_XFER_OK 0x00001000 #define FREEBSD_STATUS_BDMA_ENABLE 0x00002000 #define 
FREEBSD_STATUS_BDMA_NO_POST_SYNC 0x00004000 #define FREEBSD_STATUS_BDMA_SETUP 0x00008000 #define FREEBSD_STATUS_ISOCHRONOUS_XFR 0x00010000 #define FREEBSD_STATUS_CURR_DMA_SET 0x00020000 #define FREEBSD_STATUS_CAN_CANCEL_IMMED 0x00040000 #define FREEBSD_STATUS_DOING_CALLBACK 0x00080000 static const int *usb_xferstatus_fields[] = { &hf_usb_xferstatus_open, &hf_usb_xferstatus_transferring, &hf_usb_xferstatus_did_dma_delay, &hf_usb_xferstatus_did_close, &hf_usb_xferstatus_draining, &hf_usb_xferstatus_started, &hf_usb_xferstatus_bw_reclaimed, &hf_usb_xferstatus_control_xfr, &hf_usb_xferstatus_control_hdr, &hf_usb_xferstatus_control_act, &hf_usb_xferstatus_control_stall, &hf_usb_xferstatus_short_frames_ok, &hf_usb_xferstatus_short_xfer_ok, &hf_usb_xferstatus_bdma_enable, &hf_usb_xferstatus_bdma_no_post_sync, &hf_usb_xferstatus_bdma_setup, &hf_usb_xferstatus_isochronous_xfr, &hf_usb_xferstatus_curr_dma_set, &hf_usb_xferstatus_can_cancel_immed, &hf_usb_xferstatus_doing_callback, NULL }; /* USB errors */ #define FREEBSD_ERR_NORMAL_COMPLETION 0 #define FREEBSD_ERR_PENDING_REQUESTS 1 #define FREEBSD_ERR_NOT_STARTED 2 #define FREEBSD_ERR_INVAL 3 #define FREEBSD_ERR_NOMEM 4 #define FREEBSD_ERR_CANCELLED 5 #define FREEBSD_ERR_BAD_ADDRESS 6 #define FREEBSD_ERR_BAD_BUFSIZE 7 #define FREEBSD_ERR_BAD_FLAG 8 #define FREEBSD_ERR_NO_CALLBACK 9 #define FREEBSD_ERR_IN_USE 10 #define FREEBSD_ERR_NO_ADDR 11 #define FREEBSD_ERR_NO_PIPE 12 #define FREEBSD_ERR_ZERO_NFRAMES 13 #define FREEBSD_ERR_ZERO_MAXP 14 #define FREEBSD_ERR_SET_ADDR_FAILED 15 #define FREEBSD_ERR_NO_POWER 16 #define FREEBSD_ERR_TOO_DEEP 17 #define FREEBSD_ERR_IOERROR 18 #define FREEBSD_ERR_NOT_CONFIGURED 19 #define FREEBSD_ERR_TIMEOUT 20 #define FREEBSD_ERR_SHORT_XFER 21 #define FREEBSD_ERR_STALLED 22 #define FREEBSD_ERR_INTERRUPTED 23 #define FREEBSD_ERR_DMA_LOAD_FAILED 24 #define FREEBSD_ERR_BAD_CONTEXT 25 #define FREEBSD_ERR_NO_ROOT_HUB 26 #define FREEBSD_ERR_NO_INTR_THREAD 27 #define FREEBSD_ERR_NOT_LOCKED 28 static 
const value_string usb_freebsd_err_vals[] = { {FREEBSD_ERR_NORMAL_COMPLETION, "Normal completion"}, {FREEBSD_ERR_PENDING_REQUESTS, "Pending requests"}, {FREEBSD_ERR_NOT_STARTED, "Not started"}, {FREEBSD_ERR_INVAL, "Invalid"}, {FREEBSD_ERR_NOMEM, "No memory"}, {FREEBSD_ERR_CANCELLED, "Cancelled"}, {FREEBSD_ERR_BAD_ADDRESS, "Bad address"}, {FREEBSD_ERR_BAD_BUFSIZE, "Bad buffer size"}, {FREEBSD_ERR_BAD_FLAG, "Bad flag"}, {FREEBSD_ERR_NO_CALLBACK, "No callback"}, {FREEBSD_ERR_IN_USE, "In use"}, {FREEBSD_ERR_NO_ADDR, "No address"}, {FREEBSD_ERR_NO_PIPE, "No pipe"}, {FREEBSD_ERR_ZERO_NFRAMES, "Number of frames is zero"}, {FREEBSD_ERR_ZERO_MAXP, "MAXP is zero"}, {FREEBSD_ERR_SET_ADDR_FAILED, "Set address failed"}, {FREEBSD_ERR_NO_POWER, "No power"}, {FREEBSD_ERR_TOO_DEEP, "Too deep"}, {FREEBSD_ERR_IOERROR, "I/O error"}, {FREEBSD_ERR_NOT_CONFIGURED, "Not configured"}, {FREEBSD_ERR_TIMEOUT, "Timeout"}, {FREEBSD_ERR_SHORT_XFER, "Short transfer"}, {FREEBSD_ERR_STALLED, "Stalled"}, {FREEBSD_ERR_INTERRUPTED, "Interrupted"}, {FREEBSD_ERR_DMA_LOAD_FAILED, "DMA load failed"}, {FREEBSD_ERR_BAD_CONTEXT, "Bad context"}, {FREEBSD_ERR_NO_ROOT_HUB, "No root hub"}, {FREEBSD_ERR_NO_INTR_THREAD, "No interrupt thread"}, {FREEBSD_ERR_NOT_LOCKED, "Not locked"}, {0, NULL} }; /* USB speeds */ #define FREEBSD_SPEED_VARIABLE 0 #define FREEBSD_SPEED_LOW 1 #define FREEBSD_SPEED_FULL 2 #define FREEBSD_SPEED_HIGH 3 #define FREEBSD_SPEED_SUPER 4 static const value_string usb_freebsd_speed_vals[] = { {FREEBSD_SPEED_VARIABLE, "Variable"}, {FREEBSD_SPEED_LOW, "Low"}, {FREEBSD_SPEED_FULL, "Full"}, {FREEBSD_SPEED_HIGH, "High"}, {FREEBSD_SPEED_SUPER, "Super"}, {0, NULL} }; /* Frame flags */ #define FREEBSD_FRAMEFLAG_READ 0x00000001 #define FREEBSD_FRAMEFLAG_DATA_FOLLOWS 0x00000002 static const int *usb_frame_flags_fields[] = { &hf_usb_frame_flags_read, &hf_usb_frame_flags_data_follows, NULL }; static const value_string usb_linux_urb_type_vals[] = { {URB_SUBMIT, "URB_SUBMIT"}, {URB_COMPLETE, "URB_COMPLETE"}, 
{URB_ERROR, "URB_ERROR"}, {0, NULL} }; static const value_string usb_linux_transfer_type_vals[] = { {URB_CONTROL, "URB_CONTROL"}, {URB_ISOCHRONOUS, "URB_ISOCHRONOUS"}, {URB_INTERRUPT, "URB_INTERRUPT"}, {URB_BULK, "URB_BULK"}, {0, NULL} }; static const value_string usb_transfer_type_and_direction_vals[] = { {URB_CONTROL, "URB_CONTROL out"}, {URB_ISOCHRONOUS, "URB_ISOCHRONOUS out"}, {URB_INTERRUPT, "URB_INTERRUPT out"}, {URB_BULK, "URB_BULK out"}, {URB_CONTROL | URB_TRANSFER_IN, "URB_CONTROL in"}, {URB_ISOCHRONOUS | URB_TRANSFER_IN, "URB_ISOCHRONOUS in"}, {URB_INTERRUPT | URB_TRANSFER_IN, "URB_INTERRUPT in"}, {URB_BULK | URB_TRANSFER_IN, "URB_BULK in"}, {0, NULL} }; static const value_string usb_endpoint_direction_vals[] = { {0, "OUT"}, {1, "IN"}, {0, NULL} }; extern value_string_ext ext_usb_vendors_vals; extern value_string_ext ext_usb_products_vals; extern value_string_ext ext_usb_com_subclass_vals; /* * Standard descriptor types. * * all class specific descriptor types were removed from this list * a descriptor type is not globally unique * dissectors for the USB classes should provide their own value string * and pass it to dissect_usb_descriptor_header() * */ #define USB_DT_DEVICE 1 #define USB_DT_CONFIG 2 #define USB_DT_STRING 3 #define USB_DT_INTERFACE 4 #define USB_DT_ENDPOINT 5 #define USB_DT_DEVICE_QUALIFIER 6 #define USB_DT_OTHER_SPEED_CONFIG 7 #define USB_DT_INTERFACE_POWER 8 /* these are from a minor usb 2.0 revision (ECN) */ #define USB_DT_OTG 9 #define USB_DT_DEBUG 10 #define USB_DT_INTERFACE_ASSOCIATION 11 /* There are only Standard Descriptor Types, Class-specific types are provided by "usb.descriptor" descriptors table*/ static const value_string std_descriptor_type_vals[] = { {USB_DT_DEVICE, "DEVICE"}, {USB_DT_CONFIG, "CONFIGURATION"}, {USB_DT_STRING, "STRING"}, {USB_DT_INTERFACE, "INTERFACE"}, {USB_DT_ENDPOINT, "ENDPOINT"}, {USB_DT_DEVICE_QUALIFIER, "DEVICE QUALIFIER"}, {USB_DT_OTHER_SPEED_CONFIG, "OTHER SPEED CONFIG"}, {USB_DT_INTERFACE_POWER, 
"INTERFACE POWER"}, {USB_DT_OTG, "OTG"}, {USB_DT_DEBUG, "DEBUG"}, {USB_DT_INTERFACE_ASSOCIATION, "INTERFACE ASSOCIATION"}, { 0x0F, "BOS"}, { 0x10, "DEVICE CAPABILITY"}, { 0x30, "SUPERSPEED USB ENDPOINT COMPANION"}, { 0x31, "SUPERSPEED PLUS ISOCHRONOUS ENDPOINT COMPANION"}, {0,NULL} }; static value_string_ext std_descriptor_type_vals_ext = VALUE_STRING_EXT_INIT(std_descriptor_type_vals); /* * Feature selectors. * Per USB 3.1 spec, Table 9-7 */ #define USB_FS_ENDPOINT_HALT 0 #define USB_FS_FUNCTION_SUSPEND 0 /* same as ENDPOINT_HALT */ #define USB_FS_DEVICE_REMOTE_WAKEUP 1 #define USB_FS_TEST_MODE 2 #define USB_FS_B_HNP_ENABLE 3 #define USB_FS_A_HNP_SUPPORT 4 #define USB_FS_A_ALT_HNP_SUPPORT 5 #define USB_FS_WUSB_DEVICE 6 #define USB_FS_U1_ENABLE 48 #define USB_FS_U2_ENABLE 49 #define USB_FS_LTM_ENABLE 50 #define USB_FS_B3_NTF_HOST_REL 51 #define USB_FS_B3_RSP_ENABLE 52 #define USB_FS_LDM_ENABLE 53 static const value_string usb_endpoint_feature_selector_vals[] = { {USB_FS_ENDPOINT_HALT, "ENDPOINT HALT"}, {0, NULL} }; static const value_string usb_interface_feature_selector_vals[] = { {USB_FS_FUNCTION_SUSPEND, "FUNCTION SUSPEND"}, {0, NULL} }; static const value_string usb_device_feature_selector_vals[] = { {USB_FS_DEVICE_REMOTE_WAKEUP, "DEVICE REMOTE WAKEUP"}, {USB_FS_TEST_MODE, "TEST MODE"}, {USB_FS_B_HNP_ENABLE, "B HNP ENABLE"}, {USB_FS_A_HNP_SUPPORT, "A HNP SUPPORT"}, {USB_FS_A_ALT_HNP_SUPPORT, "A ALT HNP SUPPORT"}, {USB_FS_WUSB_DEVICE, "WUSB DEVICE"}, {USB_FS_U1_ENABLE, "U1 ENABLE"}, {USB_FS_U2_ENABLE, "U2 ENABLE"}, {USB_FS_LTM_ENABLE, "LTM ENABLE"}, {USB_FS_B3_NTF_HOST_REL, "B3 NTF HOST REL"}, {USB_FS_B3_RSP_ENABLE, "B3 RSP ENABLE"}, {USB_FS_LDM_ENABLE, "LDM ENABLE"}, {0, NULL} }; /* the transfer type in the endpoint descriptor, i.e. 
the type of the endpoint (this is not the same as the URB transfer type) */ #define USB_EP_CONTROL 0x00 #define USB_EP_ISOCHRONOUS 0x01 #define USB_EP_BULK 0x02 #define USB_EP_INTERRUPT 0x03 static const value_string usb_bmAttributes_transfer_vals[] = { {USB_EP_CONTROL, "Control-Transfer"}, {USB_EP_ISOCHRONOUS, "Isochronous-Transfer"}, {USB_EP_BULK, "Bulk-Transfer"}, {USB_EP_INTERRUPT, "Interrupt-Transfer"}, {0, NULL} }; static const value_string usb_bmAttributes_sync_vals[] = { {0x00, "No Sync"}, {0x01, "Asynchronous"}, {0x02, "Adaptive"}, {0x03, "Synchronous"}, {0, NULL} }; static const value_string usb_bmAttributes_behaviour_vals[] = { {0x00, "Data-Endpoint"}, {0x01, "Explicit Feedback-Endpoint"}, {0x02, "Implicit Feedback-Data-Endpoint"}, {0x03, "Reserved"}, {0, NULL} }; static const value_string usb_wMaxPacketSize_slots_vals[] = { {0x00, "1"}, {0x01, "2"}, {0x02, "3"}, {0x03, "Reserved"}, {0, NULL} }; /* Note: sorted in (unsigned) ascending order */ static const value_string usb_urb_status_vals[] = { /* from linux/include/asm-generic/errno.h*/ { -131, "State not recoverable (-ENOTRECOVERABLE)" }, { -130, "Owner died (-EOWNERDEAD)" }, { -129, "Key was rejected by service (-EKEYREJECTED)" }, { -128, "Key has been revoked (-EKEYREVOKED)" }, { -127, "Key has expired (-EKEYEXPIRED)" }, { -126, "Required key not available (-ENOKEY)" }, { -125, "Operation Canceled (-ECANCELED)" }, { -124, "Wrong medium type (-EMEDIUMTYPE)" }, { -123, "No medium found (-ENOMEDIUM)" }, { -122, "Quota exceeded (-EDQUOT)" }, { -121, "Remote I/O error (-EREMOTEIO)" }, { -120, "Is a named type file (-EISNAM)" }, { -119, "No XENIX semaphores available (-ENAVAIL)" }, { -118, "Not a XENIX named type file (-ENOTNAM)" }, { -117, "Structure needs cleaning (-EUCLEAN)" }, { -116, "Stale NFS file handle (-ESTALE)" }, { -115, "Operation now in progress (-EINPROGRESS)" }, { -114, "Operation already in progress (-EALREADY)" }, { -113, "No route to host (-EHOSTUNREACH)" }, { -112, "Host is down 
(-EHOSTDOWN)" }, { -111, "Connection refused (-ECONNREFUSED)" }, { -110, "Connection timed out (-ETIMEDOUT)" }, { -109, "Too many references: cannot splice (-ETOOMANYREFS)" }, { -108, "Cannot send after transport endpoint shutdown (-ESHUTDOWN)" }, { -107, "Transport endpoint is not connected (-ENOTCONN)" }, { -106, "Transport endpoint is already connected (-EISCONN)" }, { -105, "No buffer space available (-ENOBUFS)" }, { -104, "Connection reset by peer (-ECONNRESET)" }, { -103, "Software caused connection abort (-ECONNABORTED)" }, { -102, "Network dropped connection because of reset (-ENETRESET)" }, { -101, "Network is unreachable (-ENETUNREACH)" }, { -100, "Network is down (-ENETDOWN)" }, { -99, "Cannot assign requested address (-EADDRNOTAVAIL)" }, { -98, "Address already in use (-EADDRINUSE)" }, { -97, "Address family not supported by protocol (-EAFNOSUPPORT)" }, { -96, "Protocol family not supported (-EPFNOSUPPORT)" }, { -95, "Operation not supported on transport endpoint (-EOPNOTSUPP)" }, { -94, "Socket type not supported (-ESOCKTNOSUPPORT)" }, { -93, "Protocol not supported (-EPROTONOSUPPORT)" }, { -92, "Protocol not available (-ENOPROTOOPT)" }, { -91, "Protocol wrong type for socket (-EPROTOTYPE)" }, { -90, "Message too long (-EMSGSIZE)" }, { -89, "Destination address required (-EDESTADDRREQ)" }, { -88, "Socket operation on non-socket (-ENOTSOCK)" }, { -87, "Too many users (-EUSERS)" }, { -86, "Streams pipe error (-ESTRPIPE)" }, { -85, "Interrupted system call should be restarted (-ERESTART)" }, { -84, "Illegal byte sequence (-EILSEQ)" }, { -83, "Cannot exec a shared library directly (-ELIBEXEC)" }, { -82, "Attempting to link in too many shared libraries (-ELIBMAX)" }, { -81, ".lib section in a.out corrupted (-ELIBSCN)" }, { -80, "Accessing a corrupted shared library (-ELIBBAD)" }, { -79, "Can not access a needed shared library (-ELIBACC)" }, { -78, "Remote address changed (-EREMCHG)" }, { -77, "File descriptor in bad state (-EBADFD)" }, { -76, "Name not 
unique on network (-ENOTUNIQ)" }, { -75, "Value too large for defined data type (-EOVERFLOW)" }, { -74, "Not a data message (-EBADMSG)" }, { -73, "RFS specific error (-EDOTDOT)" }, { -72, "Multihop attempted (-EMULTIHOP)" }, { -71, "Protocol error (-EPROTO)" }, { -70, "Communication error on send (-ECOMM)" }, { -69, "Srmount error (-ESRMNT)" }, { -68, "Advertise error (-EADV)" }, { -67, "Link has been severed (-ENOLINK)" }, { -66, "Object is remote (-EREMOTE)" }, { -65, "Package not installed (-ENOPKG)" }, { -64, "Machine is not on the network (-ENONET)" }, { -63, "Out of streams resources (-ENOSR)" }, { -62, "Timer expired (-ETIME)" }, { -61, "No data available (-ENODATA)" }, { -60, "Device not a stream (-ENOSTR)" }, { -59, "Bad font file format (-EBFONT)" }, { -58, "(-58 \?\?\?)" }, /* dummy so that there are no "gaps" */ { -57, "Invalid slot (-EBADSLT)" }, { -56, "Invalid request code (-EBADRQC)" }, { -55, "No anode (-ENOANO)" }, { -54, "Exchange full (-EXFULL)" }, { -53, "Invalid request descriptor (-EBADR)" }, { -52, "Invalid exchange (-EBADE)" }, { -51, "Level 2 halted (-EL2HLT)" }, { -50, "No CSI structure available (-ENOCSI)" }, { -49, "Protocol driver not attached (-EUNATCH)" }, { -48, "Link number out of range (-ELNRNG)" }, { -47, "Level 3 reset (-EL3RST)" }, { -46, "Level 3 halted (-EL3HLT)" }, { -45, "Level 2 not synchronized (-EL2NSYNC)" }, { -44, "Channel number out of range (-ECHRNG)" }, { -43, "Identifier removed (-EIDRM)" }, { -42, "No message of desired type (-ENOMSG)" }, { -41, "(-41 \?\?\?)" }, /* dummy so that there are no "gaps" */ { -40, "Too many symbolic links encountered (-ELOOP)" }, { -39, "Directory not empty (-ENOTEMPTY)" }, { -38, "Function not implemented (-ENOSYS)" }, { -37, "No record locks available (-ENOLCK)" }, { -36, "File name too long (-ENAMETOOLONG)" }, { -35, "Resource deadlock would occur (-EDEADLK)" }, /* from linux/include/asm-generic/errno.h */ { -34, "Math result not representable (-ERANGE)" }, { -33, "Math argument out 
of domain of func (-EDOM)" }, { -32, "Broken pipe (-EPIPE)" }, { -31, "Too many links (-EMLINK)" }, { -30, "Read-only file system (-EROFS)" }, { -29, "Illegal seek (-ESPIPE)" }, { -28, "No space left on device (-ENOSPC)" }, { -27, "File too large (-EFBIG)" }, { -26, "Text file busy (-ETXTBSY)" }, { -25, "Not a typewriter (-ENOTTY)" }, { -24, "Too many open files (-EMFILE)" }, { -23, "File table overflow (-ENFILE)" }, { -22, "Invalid argument (-EINVAL)" }, { -21, "Is a directory (-EISDIR)" }, { -20, "Not a directory (-ENOTDIR)" }, { -19, "No such device (-ENODEV)" }, { -18, "Cross-device link (-EXDEV)" }, { -17, "File exists (-EEXIST)" }, { -16, "Device or resource busy (-EBUSY)" }, { -15, "Block device required (-ENOTBLK)" }, { -14, "Bad address (-EFAULT)" }, { -13, "Permission denied (-EACCES)" }, { -12, "Out of memory (-ENOMEM)" }, { -11, "Try again (-EAGAIN)" }, { -10, "No child processes (-ECHILD)" }, { -9, "Bad file number (-EBADF)" }, { -8, "Exec format error (-ENOEXEC)" }, { -7, "Argument list too long (-E2BIG)" }, { -6, "No such device or address (-ENXIO)" }, { -5, "I/O error (-EIO)" }, { -4, "Interrupted system call (-EINTR)" }, { -3, "No such process (-ESRCH)" }, { -2, "No such file or directory (-ENOENT)" }, { -1, "Operation not permitted (-EPERM)" }, { 0, "Success"}, { 0, NULL } };
value_string_ext usb_urb_status_vals_ext = VALUE_STRING_EXT_INIT(usb_urb_status_vals);

#define USB_CONTROL_STAGE_SETUP  0x00
#define USB_CONTROL_STAGE_DATA   0x01
#define USB_CONTROL_STAGE_STATUS 0x02

static const value_string usb_control_stage_vals[] = { {USB_CONTROL_STAGE_SETUP, "Setup"}, {USB_CONTROL_STAGE_DATA, "Data"}, {USB_CONTROL_STAGE_STATUS, "Status"}, {0, NULL} };
static const value_string win32_urb_function_vals[] = { {0x0000, "URB_FUNCTION_SELECT_CONFIGURATION"}, {0x0001, "URB_FUNCTION_SELECT_INTERFACE"}, {0x0002, "URB_FUNCTION_ABORT_PIPE"}, {0x0003, "URB_FUNCTION_TAKE_FRAME_LENGTH_CONTROL"}, {0x0004, "URB_FUNCTION_RELEASE_FRAME_LENGTH_CONTROL"}, {0x0005,
"URB_FUNCTION_GET_FRAME_LENGTH"}, {0x0006, "URB_FUNCTION_SET_FRAME_LENGTH"}, {0x0007, "URB_FUNCTION_GET_CURRENT_FRAME_NUMBER"}, {0x0008, "URB_FUNCTION_CONTROL_TRANSFER"}, {0x0009, "URB_FUNCTION_BULK_OR_INTERRUPT_TRANSFER"}, {0x000A, "URB_FUNCTION_ISOCH_TRANSFER"}, {0x000B, "URB_FUNCTION_GET_DESCRIPTOR_FROM_DEVICE"}, {0x000C, "URB_FUNCTION_SET_DESCRIPTOR_TO_DEVICE"}, {0x000D, "URB_FUNCTION_SET_FEATURE_TO_DEVICE"}, {0x000E, "URB_FUNCTION_SET_FEATURE_TO_INTERFACE"}, {0x000F, "URB_FUNCTION_SET_FEATURE_TO_ENDPOINT"}, {0x0010, "URB_FUNCTION_CLEAR_FEATURE_TO_DEVICE"}, {0x0011, "URB_FUNCTION_CLEAR_FEATURE_TO_INTERFACE"}, {0x0012, "URB_FUNCTION_CLEAR_FEATURE_TO_ENDPOINT"}, {0x0013, "URB_FUNCTION_GET_STATUS_FROM_DEVICE"}, {0x0014, "URB_FUNCTION_GET_STATUS_FROM_INTERFACE"}, {0x0015, "URB_FUNCTION_GET_STATUS_FROM_ENDPOINT"}, {0x0016, "URB_FUNCTION_RESERVED_0X0016"}, {0x0017, "URB_FUNCTION_VENDOR_DEVICE"}, {0x0018, "URB_FUNCTION_VENDOR_INTERFACE"}, {0x0019, "URB_FUNCTION_VENDOR_ENDPOINT"}, {0x001A, "URB_FUNCTION_CLASS_DEVICE"}, {0x001B, "URB_FUNCTION_CLASS_INTERFACE"}, {0x001C, "URB_FUNCTION_CLASS_ENDPOINT"}, {0x001D, "URB_FUNCTION_RESERVE_0X001D"}, {0x001E, "URB_FUNCTION_SYNC_RESET_PIPE_AND_CLEAR_STALL"}, {0x001F, "URB_FUNCTION_CLASS_OTHER"}, {0x0020, "URB_FUNCTION_VENDOR_OTHER"}, {0x0021, "URB_FUNCTION_GET_STATUS_FROM_OTHER"}, {0x0022, "URB_FUNCTION_CLEAR_FEATURE_TO_OTHER"}, {0x0023, "URB_FUNCTION_SET_FEATURE_TO_OTHER"}, {0x0024, "URB_FUNCTION_GET_DESCRIPTOR_FROM_ENDPOINT"}, {0x0025, "URB_FUNCTION_SET_DESCRIPTOR_TO_ENDPOINT"}, {0x0026, "URB_FUNCTION_GET_CONFIGURATION"}, {0x0027, "URB_FUNCTION_GET_INTERFACE"}, {0x0028, "URB_FUNCTION_GET_DESCRIPTOR_FROM_INTERFACE"}, {0x0029, "URB_FUNCTION_SET_DESCRIPTOR_TO_INTERFACE"}, {0x002A, "URB_FUNCTION_GET_MS_FEATURE_DESCRIPTOR"}, {0x002B, "URB_FUNCTION_RESERVE_0X002B"}, {0x002C, "URB_FUNCTION_RESERVE_0X002C"}, {0x002D, "URB_FUNCTION_RESERVE_0X002D"}, {0x002E, "URB_FUNCTION_RESERVE_0X002E"}, {0x002F, "URB_FUNCTION_RESERVE_0X002F"}, 
{0x0030, "URB_FUNCTION_SYNC_RESET_PIPE"}, {0x0031, "URB_FUNCTION_SYNC_CLEAR_STALL"}, {0x0032, "URB_FUNCTION_CONTROL_TRANSFER_EX"}, {0x0033, "URB_FUNCTION_RESERVE_0X0033"}, {0x0034, "URB_FUNCTION_RESERVE_0X0034"}, {0, NULL} }; value_string_ext win32_urb_function_vals_ext = VALUE_STRING_EXT_INIT(win32_urb_function_vals); static const value_string win32_usbd_status_vals[] = { {0x00000000, "USBD_STATUS_SUCCESS"}, {0x40000000, "USBD_STATUS_PENDING"}, {0x80000200, "USBD_STATUS_INVALID_URB_FUNCTION"}, {0x80000300, "USBD_STATUS_INVALID_PARAMETER"}, {0x80000400, "USBD_STATUS_ERROR_BUSY"}, {0x80000600, "USBD_STATUS_INVALID_PIPE_HANDLE"}, {0x80000700, "USBD_STATUS_NO_BANDWIDTH"}, {0x80000800, "USBD_STATUS_INTERNAL_HC_ERROR"}, {0x80000900, "USBD_STATUS_ERROR_SHORT_TRANSFER"}, {0xC0000001, "USBD_STATUS_CRC"}, {0xC0000002, "USBD_STATUS_BTSTUFF"}, {0xC0000003, "USBD_STATUS_DATA_TOGGLE_MISMATCH"}, {0xC0000004, "USBD_STATUS_STALL_PID"}, {0xC0000005, "USBD_STATUS_DEV_NOT_RESPONDING"}, {0xC0000006, "USBD_STATUS_PID_CHECK_FAILURE"}, {0xC0000007, "USBD_STATUS_UNEXPECTED_PID"}, {0xC0000008, "USBD_STATUS_DATA_OVERRUN"}, {0xC0000009, "USBD_STATUS_DATA_UNDERRUN"}, {0xC000000A, "USBD_STATUS_RESERVED1"}, {0xC000000B, "USBD_STATUS_RESERVED2"}, {0xC000000C, "USBD_STATUS_BUFFER_OVERRUN"}, {0xC000000D, "USBD_STATUS_BUFFER_UNDERRUN"}, {0xC000000F, "USBD_STATUS_NOT_ACCESSED"}, {0xC0000010, "USBD_STATUS_FIFO"}, {0xC0000011, "USBD_STATUS_XACT_ERROR"}, {0xC0000012, "USBD_STATUS_BABBLE_DETECTED"}, {0xC0000013, "USBD_STATUS_DATA_BUFFER_ERROR"}, {0xC0000030, "USBD_STATUS_ENDPOINT_HALTED"}, {0xC0000A00, "USBD_STATUS_BAD_START_FRAME"}, {0xC0000B00, "USBD_STATUS_ISOCH_REQUEST_FAILED"}, {0xC0000C00, "USBD_STATUS_FRAME_CONTROL_OWNED"}, {0xC0000D00, "USBD_STATUS_FRAME_CONTROL_NOT_OWNED"}, {0xC0000E00, "USBD_STATUS_NOT_SUPPORTED"}, {0xC0000F00, "USBD_STATUS_INVALID_CONFIGURATION_DESCRIPTOR"}, {0xC0001000, "USBD_STATUS_INSUFFICIENT_RESOURCES"}, {0xC0002000, "USBD_STATUS_SET_CONFIG_FAILED"}, {0xC0003000, 
"USBD_STATUS_BUFFER_TOO_SMALL"}, {0xC0004000, "USBD_STATUS_INTERFACE_NOT_FOUND"}, {0xC0005000, "USBD_STATUS_INVALID_PIPE_FLAGS"}, {0xC0006000, "USBD_STATUS_TIMEOUT"}, {0xC0007000, "USBD_STATUS_DEVICE_GONE"}, {0xC0008000, "USBD_STATUS_STATUS_NOT_MAPPED"}, {0xC0009000, "USBD_STATUS_HUB_INTERNAL_ERROR"}, {0xC0010000, "USBD_STATUS_CANCELED"}, {0xC0020000, "USBD_STATUS_ISO_NOT_ACCESSED_BY_HW"}, {0xC0030000, "USBD_STATUS_ISO_TD_ERROR"}, {0xC0040000, "USBD_STATUS_ISO_NA_LATE_USBPORT"}, {0xC0050000, "USBD_STATUS_ISO_NOT_ACCESSED_LATE"}, {0xC0100000, "USBD_STATUS_BAD_DESCRIPTOR"}, {0xC0100001, "USBD_STATUS_BAD_DESCRIPTOR_BLEN"}, {0xC0100002, "USBD_STATUS_BAD_DESCRIPTOR_TYPE"}, {0xC0100003, "USBD_STATUS_BAD_INTERFACE_DESCRIPTOR"}, {0xC0100004, "USBD_STATUS_BAD_ENDPOINT_DESCRIPTOR"}, {0xC0100005, "USBD_STATUS_BAD_INTERFACE_ASSOC_DESCRIPTOR"}, {0xC0100006, "USBD_STATUS_BAD_CONFIG_DESC_LENGTH"}, {0xC0100007, "USBD_STATUS_BAD_NUMBER_OF_INTERFACES"}, {0xC0100008, "USBD_STATUS_BAD_NUMBER_OF_ENDPOINTS"}, {0xC0100009, "USBD_STATUS_BAD_ENDPOINT_ADDRESS"}, {0, NULL} }; static value_string_ext win32_usbd_status_vals_ext = VALUE_STRING_EXT_INIT(win32_usbd_status_vals); static const value_string win32_usb_info_direction_vals[] = { {0, "FDO -> PDO"}, {1, "PDO -> FDO"}, {0, NULL} }; static const value_string usb_cdc_protocol_vals[] = { {0x00, "No class specific protocol required"}, {0x01, "AT Commands: V.250 etc"}, {0x02, "AT Commands defined by PCCA-101"}, {0x03, "AT Commands defined by PCCA-101 & Annex O"}, {0x04, "AT Commands defined by GSM 07.07"}, {0x05, "AT Commands defined by 3GPP 27.007"}, {0x06, "AT Commands defined by TIA for CDMA"}, {0x07, "Ethernet Emulation Model"}, {0xFE, "External Protocol: Commands defined by Command Set Functional Descriptor"}, {0xFF, "Vendor-specific"}, {0, NULL} }; static value_string_ext usb_cdc_protocol_vals_ext = VALUE_STRING_EXT_INIT(usb_cdc_protocol_vals); static const value_string usb_cdc_data_protocol_vals[] = { {0x00, "No class specific protocol 
required"}, {0x01, "Network Transfer Block"}, {0x02, "Network Transfer Block (IP + DSS)"}, {0x30, "Physical interface protocol for ISDN BRI"}, {0x31, "HDLC"}, {0x32, "Transparent"}, {0x50, "Management protocol for Q.921 data link protocol"}, {0x51, "Data link protocol for Q.931"}, {0x52, "TEI-multiplexor for Q.921 data link protocol"}, {0x90, "Data compression procedures"}, {0x91, "Euro-ISDN protocol control"}, {0x92, "V.24 rate adaptation to ISDN"}, {0x93, "CAPI Commands"}, {0xFE, "The protocol(s) are described using a Protocol Unit Functional Descriptors on Communications Class Interface"}, {0xFF, "Vendor-specific"}, {0, NULL} }; static value_string_ext usb_cdc_data_protocol_vals_ext = VALUE_STRING_EXT_INIT(usb_cdc_data_protocol_vals); static const value_string usb_hid_subclass_vals[] = { {0, "No Subclass"}, {1, "Boot Interface"}, {0, NULL} }; static value_string_ext usb_hid_subclass_vals_ext = VALUE_STRING_EXT_INIT(usb_hid_subclass_vals); static const value_string usb_hid_boot_protocol_vals[] = { {0, "None"}, {1, "Keyboard"}, {2, "Mouse"}, {0, NULL} }; static value_string_ext usb_hid_boot_protocol_vals_ext = VALUE_STRING_EXT_INIT(usb_hid_boot_protocol_vals); static const value_string usb_misc_subclass_vals[] = { {0x03, "Cable Based Association Framework"}, {0x04, "RNDIS"}, {IF_SUBCLASS_MISC_U3V, "USB3 Vision"}, {0x06, "Stream Transport Efficient Protocol"}, {0, NULL} }; static value_string_ext usb_misc_subclass_vals_ext = VALUE_STRING_EXT_INIT(usb_misc_subclass_vals); static const value_string usb_app_subclass_vals[] = { {0x01, "Device Firmware Upgrade"}, {0x02, "IRDA Bridge"}, {0x03, "USB Test and Measurement Device"}, {0, NULL} }; static value_string_ext usb_app_subclass_vals_ext = VALUE_STRING_EXT_INIT(usb_app_subclass_vals); static const value_string usb_app_dfu_protocol_vals[] = { {0x01, "Runtime protocol"}, {0x02, "DFU mode protocol"}, {0, NULL} }; static value_string_ext usb_app_dfu_protocol_vals_ext = VALUE_STRING_EXT_INIT(usb_app_dfu_protocol_vals); 
static const value_string usb_app_irda_protocol_vals[] = { {0x00, "IRDA Bridge device"}, {0, NULL} }; static value_string_ext usb_app_irda_protocol_vals_ext = VALUE_STRING_EXT_INIT(usb_app_irda_protocol_vals); static const value_string usb_app_usb_test_and_measurement_protocol_vals[] = { {0x00, "USB Test and Measurement Device"}, {0x01, "USB Test and Measurement Device conforming to the USBTMC USB488 Subclass Specification"}, {0, NULL} }; static value_string_ext usb_app_usb_test_and_measurement_protocol_vals_ext = VALUE_STRING_EXT_INIT(usb_app_usb_test_and_measurement_protocol_vals); void proto_register_usb(void); void proto_reg_handoff_usb(void); /* USB address handling */ static int usb_addr_to_str(const address* addr, gchar *buf, int buf_len _U_) { const guint8 *addrp = (const guint8 *)addr->data; if(pletoh32(&addrp[0])==0xffffffff){ g_strlcpy(buf, "host", buf_len); } else { g_snprintf(buf, buf_len, "%d.%d.%d", pletoh16(&addrp[8]), pletoh32(&addrp[0]), pletoh32(&addrp[4])); } return (int)(strlen(buf)+1); } static int usb_addr_str_len(const address* addr _U_) { return 50; } /* These keys provide information for "Decode As" and other dissectors via per-packet data: p_get_proto_data()/p_add_proto_data() */
#define USB_BUS_ID          0
#define USB_DEVICE_ADDRESS  1
#define USB_VENDOR_ID       2
#define USB_PRODUCT_ID      3
#define USB_DEVICE_CLASS    4
#define USB_DEVICE_SUBCLASS 5
#define USB_DEVICE_PROTOCOL 6
static void usb_device_prompt(packet_info *pinfo, gchar* result) { g_snprintf(result, MAX_DECODE_AS_PROMPT_LEN, "Bus ID %u \nDevice Address %u\nas ", GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_BUS_ID)), GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_ADDRESS))); } static gpointer usb_device_value(packet_info *pinfo) { guint32 value = GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_BUS_ID)) << 16; value |= GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_ADDRESS)); return
GUINT_TO_POINTER(value); } static void usb_product_prompt(packet_info *pinfo, gchar* result) { g_snprintf(result, MAX_DECODE_AS_PROMPT_LEN, "Vendor ID 0x%04x \nProduct ID 0x%04x\nas ", GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_VENDOR_ID)), GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_PRODUCT_ID))); } static gpointer usb_product_value(packet_info *pinfo) { guint32 value = GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_VENDOR_ID)) << 16; value |= GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_PRODUCT_ID)); return GUINT_TO_POINTER(value); } static void usb_protocol_prompt(packet_info *pinfo, gchar* result) { g_snprintf(result, MAX_DECODE_AS_PROMPT_LEN, "Class ID 0x%04x \nSubclass ID 0x%04x\nProtocol 0x%04x\nas ", GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_CLASS)), GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_SUBCLASS)), GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_PROTOCOL))); } static gpointer usb_protocol_value(packet_info *pinfo) { guint32 value = GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_CLASS)) << 16; value |= GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_SUBCLASS)) << 8; value |= GPOINTER_TO_UINT(p_get_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_PROTOCOL)); return GUINT_TO_POINTER(value); } static build_valid_func usb_product_da_build_value[1] = {usb_product_value}; static decode_as_value_t usb_product_da_values = {usb_product_prompt, 1, usb_product_da_build_value}; static decode_as_t usb_product_da = { "usb", "USB Product", "usb.product", 1, 0, &usb_product_da_values, NULL, NULL, decode_as_default_populate_list, decode_as_default_reset, decode_as_default_change, NULL}; static build_valid_func usb_device_da_build_value[1] = {usb_device_value}; static decode_as_value_t usb_device_da_values = 
{usb_device_prompt, 1, usb_device_da_build_value}; static decode_as_t usb_device_da = { "usb", "USB Device", "usb.device", 1, 0, &usb_device_da_values, NULL, NULL, decode_as_default_populate_list, decode_as_default_reset, decode_as_default_change, NULL}; static build_valid_func usb_protocol_da_build_value[1] = {usb_protocol_value}; static decode_as_value_t usb_protocol_da_values = {usb_protocol_prompt, 1, usb_protocol_da_build_value}; static decode_as_t usb_protocol_da = { "usb", "USB Device Protocol", "usb.protocol", 1, 0, &usb_protocol_da_values, NULL, NULL, decode_as_default_populate_list, decode_as_default_reset, decode_as_default_change, NULL}; static usb_conv_info_t * get_usb_conv_info(conversation_t *conversation) { usb_conv_info_t *usb_conv_info; /* do we have conversation-specific data? */ usb_conv_info = (usb_conv_info_t *)conversation_get_proto_data(conversation, proto_usb); if (!usb_conv_info) { /* no, not yet, so create some */ usb_conv_info = wmem_new0(wmem_file_scope(), usb_conv_info_t); usb_conv_info->interfaceClass = IF_CLASS_UNKNOWN; usb_conv_info->interfaceSubclass = IF_SUBCLASS_UNKNOWN; usb_conv_info->interfaceProtocol = IF_PROTOCOL_UNKNOWN; usb_conv_info->deviceVendor = DEV_VENDOR_UNKNOWN; usb_conv_info->deviceProduct = DEV_PRODUCT_UNKNOWN; usb_conv_info->alt_settings = wmem_array_new(wmem_file_scope(), sizeof(usb_alt_setting_t)); usb_conv_info->transactions = wmem_tree_new(wmem_file_scope()); conversation_add_proto_data(conversation, proto_usb, usb_conv_info); } return usb_conv_info; } /* usb_conv_info_t contains some components that are valid only for one specific packet. clear_usb_conv_tmp_data() clears these components; it should be called before we dissect a new packet. */ static void clear_usb_conv_tmp_data(usb_conv_info_t *usb_conv_info) { /* caller must have checked that usb_conv_info != NULL */ usb_conv_info->direction = P2P_DIR_UNKNOWN; usb_conv_info->transfer_type = URB_UNKNOWN; usb_conv_info->is_request = FALSE; usb_conv_info->is_setup
= FALSE; usb_conv_info->setup_requesttype = 0; /* When we parse the configuration, interface and endpoint descriptors, we store the current interface class in endpoint 0's conversation. This must be cleared since endpoint 0 does not belong to any interface class. We used to clear this info in dissect_usb_configuration_descriptor(), but that doesn't work when the descriptor parsing throws an exception. */ if (usb_conv_info->endpoint==0) { usb_conv_info->interfaceClass = IF_CLASS_UNKNOWN; usb_conv_info->interfaceSubclass = IF_SUBCLASS_UNKNOWN; usb_conv_info->interfaceProtocol = IF_PROTOCOL_UNKNOWN; } } static conversation_t * get_usb_conversation(packet_info *pinfo, address *src_addr, address *dst_addr, guint32 src_endpoint, guint32 dst_endpoint) { conversation_t *conversation; /* Do we have a conversation for this connection? */ conversation = find_conversation(pinfo->num, src_addr, dst_addr, pinfo->ptype, src_endpoint, dst_endpoint, 0); if (conversation) { return conversation; } /* We don't yet have a conversation, so create one. */ conversation = conversation_new(pinfo->num, src_addr, dst_addr, pinfo->ptype, src_endpoint, dst_endpoint, 0); return conversation; } /* Fetch or create usb_conv_info for a specified interface.
*/ usb_conv_info_t * get_usb_iface_conv_info(packet_info *pinfo, guint8 interface_num) { conversation_t *conversation; guint32 if_port; if_port = GUINT32_TO_LE(INTERFACE_PORT | interface_num); if (pinfo->srcport == NO_ENDPOINT) { conversation = get_usb_conversation(pinfo, &pinfo->src, &pinfo->dst, pinfo->srcport, if_port); } else { conversation = get_usb_conversation(pinfo, &pinfo->src, &pinfo->dst, if_port, pinfo->destport); } return get_usb_conv_info(conversation); } static const char* usb_conv_get_filter_type(conv_item_t* conv, conv_filter_type_e filter) { if ((filter == CONV_FT_SRC_ADDRESS) && (conv->src_address.type == usb_address_type)) return "usb.src"; if ((filter == CONV_FT_DST_ADDRESS) && (conv->dst_address.type == usb_address_type)) return "usb.dst"; if ((filter == CONV_FT_ANY_ADDRESS) && (conv->src_address.type == usb_address_type)) return "usb.addr"; return CONV_FILTER_INVALID; } static ct_dissector_info_t usb_ct_dissector_info = {&usb_conv_get_filter_type}; static int usb_conversation_packet(void *pct, packet_info *pinfo, epan_dissect_t *edt _U_, const void *vip _U_) { conv_hash_t *hash = (conv_hash_t*) pct; add_conversation_table_data(hash, &pinfo->src, &pinfo->dst, 0, 0, 1, pinfo->fd->pkt_len, &pinfo->rel_ts, &pinfo->abs_ts, &usb_ct_dissector_info, PT_NONE); return 1; } static const char* usb_host_get_filter_type(hostlist_talker_t* host, conv_filter_type_e filter) { if ((filter == CONV_FT_ANY_ADDRESS) && (host->myaddress.type == usb_address_type)) return "usb.addr"; return CONV_FILTER_INVALID; } static hostlist_dissector_info_t usb_host_dissector_info = {&usb_host_get_filter_type}; static int usb_hostlist_packet(void *pit, packet_info *pinfo, epan_dissect_t *edt _U_, const void *vip _U_) { conv_hash_t *hash = (conv_hash_t*) pit; /* Take two "add" passes per packet, one for each direction; this ensures that all packets are counted properly (even if an address is sending to itself). XXX - this could probably be done more efficiently inside hostlist_table */
add_hostlist_table_data(hash, &pinfo->src, 0, TRUE, 1, pinfo->fd->pkt_len, &usb_host_dissector_info, PT_NONE); add_hostlist_table_data(hash, &pinfo->dst, 0, FALSE, 1, pinfo->fd->pkt_len, &usb_host_dissector_info, PT_NONE); return 1; } /* SETUP dissectors */ /* * These dissectors are used to dissect the setup part and the data * for URB_CONTROL_INPUT / CLEAR FEATURE */ /* 9.4.1 */ static int dissect_usb_setup_clear_feature_request(packet_info *pinfo _U_, proto_tree *tree, tvbuff_t *tvb, int offset, usb_conv_info_t *usb_conv_info) { guint8 recip; if (usb_conv_info) { recip = USB_RECIPIENT(usb_conv_info->usb_trans_info->setup.requesttype); /* feature selector, zero/interface/endpoint */ switch (recip) { case RQT_SETUP_RECIPIENT_DEVICE: proto_tree_add_item(tree, hf_usb_device_wFeatureSelector, tvb, offset, 2, ENC_LITTLE_ENDIAN); offset += 2; proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN); break; case RQT_SETUP_RECIPIENT_INTERFACE: proto_tree_add_item(tree, hf_usb_interface_wFeatureSelector, tvb, offset, 2, ENC_LITTLE_ENDIAN); offset += 2; proto_tree_add_item(tree, hf_usb_wInterface, tvb, offset, 2, ENC_LITTLE_ENDIAN); break; case RQT_SETUP_RECIPIENT_ENDPOINT: proto_tree_add_item(tree, hf_usb_endpoint_wFeatureSelector, tvb, offset, 2, ENC_LITTLE_ENDIAN); offset += 2; proto_tree_add_item(tree, hf_usb_wEndpoint, tvb, offset, 2, ENC_LITTLE_ENDIAN); break; case RQT_SETUP_RECIPIENT_OTHER: default: proto_tree_add_item(tree, hf_usb_value, tvb, offset, 2, ENC_LITTLE_ENDIAN); offset += 2; proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN); break; } } else { /* No conversation information, so recipient type is unknown */ proto_tree_add_item(tree, hf_usb_value, tvb, offset, 2, ENC_LITTLE_ENDIAN); offset += 2; proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN); } offset += 2; /* length */ proto_tree_add_item(tree, hf_usb_length, tvb, offset, 2, ENC_LITTLE_ENDIAN); offset += 2; return offset; } static int 
dissect_usb_setup_clear_feature_response(packet_info *pinfo _U_, proto_tree *tree _U_, tvbuff_t *tvb _U_, int offset, usb_conv_info_t *usb_conv_info _U_) { return offset; } /* * These dissectors are used to dissect the setup part and the data * for URB_CONTROL_INPUT / GET CONFIGURATION */ /* 9.4.2 */ static int dissect_usb_setup_get_configuration_response(packet_info *pinfo _U_, proto_tree *tree _U_, tvbuff_t *tvb _U_, int offset, usb_conv_info_t *usb_conv_info _U_) { proto_tree_add_item(tree, hf_usb_bConfigurationValue, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; return offset; } /* * These dissectors are used to dissect the setup part and the data * for URB_CONTROL_INPUT / GET DESCRIPTOR */ proto_item * dissect_usb_descriptor_header(proto_tree *tree, tvbuff_t *tvb, int offset, value_string_ext *type_val_str) { guint8 desc_type; proto_item *length_item; proto_item *type_item; length_item = proto_tree_add_item(tree, hf_usb_bLength, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset++; desc_type = tvb_get_guint8(tvb, offset); type_item = proto_tree_add_item(tree, hf_usb_bDescriptorType, tvb, offset, 1, ENC_LITTLE_ENDIAN); /* if the caller provided no class specific value string, we're * using the standard descriptor types */ if (!type_val_str) type_val_str = &std_descriptor_type_vals_ext; proto_item_append_text(type_item, " (%s)", val_to_str_ext(desc_type, type_val_str, "unknown")); return length_item; } /* 9.6.2 */ static int dissect_usb_device_qualifier_descriptor(packet_info *pinfo _U_, proto_tree *parent_tree, tvbuff_t *tvb, int offset, usb_conv_info_t *usb_conv_info) { proto_item *item; proto_tree *tree; proto_item *nitem; int old_offset = offset; guint32 protocol; const gchar *description; tree = proto_tree_add_subtree(parent_tree, tvb, offset, -1, ett_descriptor_device, &item, "DEVICE QUALIFIER DESCRIPTOR"); dissect_usb_descriptor_header(tree, tvb, offset, NULL); offset += 2; /* bcdUSB */ proto_tree_add_item(tree, hf_usb_bcdUSB, tvb, offset, 2, 
ENC_LITTLE_ENDIAN); offset += 2; protocol = tvb_get_ntoh24(tvb, offset); description = val_to_str_ext_const(protocol, &usb_protocols_ext, ""); /* bDeviceClass */ proto_tree_add_item(tree, hf_usb_bDeviceClass, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* bDeviceSubClass */ proto_tree_add_item(tree, hf_usb_bDeviceSubClass, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* bDeviceProtocol */ nitem = proto_tree_add_item(tree, hf_usb_bDeviceProtocol, tvb, offset, 1, ENC_LITTLE_ENDIAN); if (*description) proto_item_append_text(nitem, " (%s)", description); offset += 1; if (!pinfo->fd->flags.visited) { guint k_bus_id; guint k_device_address; guint k_frame_number; wmem_tree_key_t key[4]; device_protocol_data_t *device_protocol_data; k_frame_number = pinfo->num; k_device_address = usb_conv_info->device_address; k_bus_id = usb_conv_info->bus_id; key[0].length = 1; key[0].key = &k_device_address; key[1].length = 1; key[1].key = &k_bus_id; key[2].length = 1; key[2].key = &k_frame_number; key[3].length = 0; key[3].key = NULL; device_protocol_data = wmem_new(wmem_file_scope(), device_protocol_data_t); device_protocol_data->protocol = protocol; device_protocol_data->bus_id = usb_conv_info->bus_id; device_protocol_data->device_address = usb_conv_info->device_address; wmem_tree_insert32_array(device_to_protocol_table, key, device_protocol_data); } /* bMaxPacketSize0 */ proto_tree_add_item(tree, hf_usb_bMaxPacketSize0, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* bNumConfigurations */ proto_tree_add_item(tree, hf_usb_bNumConfigurations, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* one reserved byte */ offset += 1; proto_item_set_len(item, offset-old_offset); return offset; } /* 9.6.1 */ static int dissect_usb_device_descriptor(packet_info *pinfo, proto_tree *parent_tree, tvbuff_t *tvb, int offset, usb_conv_info_t *usb_conv_info) { proto_item *item; proto_tree *tree; proto_item *nitem; int old_offset = offset; guint32 protocol; const gchar *description; guint32 
vendor_id; guint32 product; guint16 product_id; tree = proto_tree_add_subtree(parent_tree, tvb, offset, -1, ett_descriptor_device, &item, "DEVICE DESCRIPTOR"); dissect_usb_descriptor_header(tree, tvb, offset, NULL); offset += 2; /* bcdUSB */ proto_tree_add_item(tree, hf_usb_bcdUSB, tvb, offset, 2, ENC_LITTLE_ENDIAN); offset += 2; protocol = tvb_get_ntoh24(tvb, offset); description = val_to_str_ext_const(protocol, &usb_protocols_ext, ""); /* bDeviceClass */ proto_tree_add_item(tree, hf_usb_bDeviceClass, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* bDeviceSubClass */ proto_tree_add_item(tree, hf_usb_bDeviceSubClass, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* bDeviceProtocol */ nitem = proto_tree_add_item(tree, hf_usb_bDeviceProtocol, tvb, offset, 1, ENC_LITTLE_ENDIAN); if (*description) proto_item_append_text(nitem, " (%s)", description); offset += 1; /* bMaxPacketSize0 */ proto_tree_add_item(tree, hf_usb_bMaxPacketSize0, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* if request was only for the first 8 bytes */ /* per 5.5.3 of USB2.0 Spec */ if (8 == usb_conv_info->usb_trans_info->setup.wLength) { proto_item_set_len(item, offset-old_offset); return offset; } /* idVendor */ proto_tree_add_item_ret_uint(tree, hf_usb_idVendor, tvb, offset, 2, ENC_LITTLE_ENDIAN, &vendor_id); usb_conv_info->deviceVendor = (guint16)vendor_id; offset += 2; /* idProduct */ product_id = tvb_get_letohs(tvb, offset); usb_conv_info->deviceProduct = product_id; product = (guint16)vendor_id << 16 | product_id; proto_tree_add_uint_format_value(tree, hf_usb_idProduct, tvb, offset, 2, product_id, "%s (0x%04x)", val_to_str_ext_const(product, &ext_usb_products_vals, "Unknown"), product_id); offset += 2; if (!pinfo->fd->flags.visited) { guint k_bus_id; guint k_device_address; guint k_frame_number; wmem_tree_key_t key[4]; device_product_data_t *device_product_data; device_protocol_data_t *device_protocol_data; k_frame_number = pinfo->num; k_device_address = 
usb_conv_info->device_address; k_bus_id = usb_conv_info->bus_id; key[0].length = 1; key[0].key = &k_device_address; key[1].length = 1; key[1].key = &k_bus_id; key[2].length = 1; key[2].key = &k_frame_number; key[3].length = 0; key[3].key = NULL; device_product_data = wmem_new(wmem_file_scope(), device_product_data_t); device_product_data->vendor = vendor_id; device_product_data->product = product_id; device_product_data->bus_id = usb_conv_info->bus_id; device_product_data->device_address = usb_conv_info->device_address; wmem_tree_insert32_array(device_to_product_table, key, device_product_data); device_protocol_data = wmem_new(wmem_file_scope(), device_protocol_data_t); device_protocol_data->protocol = protocol; device_protocol_data->bus_id = usb_conv_info->bus_id; device_protocol_data->device_address = usb_conv_info->device_address; wmem_tree_insert32_array(device_to_protocol_table, key, device_protocol_data); } /* bcdDevice */ proto_tree_add_item(tree, hf_usb_bcdDevice, tvb, offset, 2, ENC_LITTLE_ENDIAN); offset += 2; /* iManufacturer */ proto_tree_add_item(tree, hf_usb_iManufacturer, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* iProduct */ proto_tree_add_item(tree, hf_usb_iProduct, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* iSerialNumber */ proto_tree_add_item(tree, hf_usb_iSerialNumber, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* bNumConfigurations */ proto_tree_add_item(tree, hf_usb_bNumConfigurations, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; proto_item_set_len(item, offset-old_offset); return offset; } /* 9.6.7 */ static int dissect_usb_string_descriptor(packet_info *pinfo _U_, proto_tree *parent_tree, tvbuff_t *tvb, int offset, usb_conv_info_t *usb_conv_info) { proto_item *item; proto_tree *tree; int old_offset = offset; guint8 len; proto_item *len_item; usb_trans_info_t *usb_trans_info; usb_trans_info = usb_conv_info->usb_trans_info; tree = proto_tree_add_subtree(parent_tree, tvb, offset, -1, ett_descriptor_device, &item, 
"STRING DESCRIPTOR"); len = tvb_get_guint8(tvb, offset); /* The USB spec says that the languages / the string are UTF-16 and not 0-terminated, i.e. the length field must contain an even number */ if (len & 0x1) { /* bLength */ len_item = proto_tree_add_item(tree, hf_usb_bLength, tvb, offset, 1, ENC_LITTLE_ENDIAN); expert_add_info(pinfo, len_item, &ei_usb_bLength_even); /* bDescriptorType */ proto_tree_add_item(tree, hf_usb_bDescriptorType, tvb, offset+1, 1, ENC_LITTLE_ENDIAN); } else len_item = dissect_usb_descriptor_header(tree, tvb, offset, NULL); offset += 2; /* Report an error, and give up, if the length is < 2 */ if (len < 2) { expert_add_info(pinfo, len_item, &ei_usb_bLength_too_short); return offset; } if (!usb_trans_info->u.get_descriptor.usb_index) { /* list of languages */ while (offset >= old_offset && len > (offset - old_offset)) { /* wLANGID */ proto_tree_add_item(tree, hf_usb_wLANGID, tvb, offset, 2, ENC_LITTLE_ENDIAN); offset+=2; } } else { /* UTF-16 string */ /* handle case of host requesting only a substring */ guint8 len_str = MIN(len-2, usb_trans_info->setup.wLength -2); proto_tree_add_item(tree, hf_usb_bString, tvb, offset, len_str, ENC_UTF_16 | ENC_LITTLE_ENDIAN); offset += len_str; } proto_item_set_len(item, offset-old_offset); return offset; } /* 9.6.5 */ static int dissect_usb_interface_descriptor(packet_info *pinfo, proto_tree *parent_tree, tvbuff_t *tvb, int offset, usb_conv_info_t *usb_conv_info) { proto_item *item; proto_tree *tree; const char *class_str = NULL; int old_offset = offset; guint8 len; guint8 interface_num; guint8 alt_setting; usb_trans_info_t *usb_trans_info; usb_trans_info = usb_conv_info->usb_trans_info; tree = proto_tree_add_subtree(parent_tree, tvb, offset, -1, ett_descriptor_device, &item, "INTERFACE DESCRIPTOR"); len = tvb_get_guint8(tvb, offset); dissect_usb_descriptor_header(tree, tvb, offset, NULL); offset += 2; /* bInterfaceNumber */ interface_num = tvb_get_guint8(tvb, offset); proto_tree_add_item(tree,
hf_usb_bInterfaceNumber, tvb, offset, 1, ENC_LITTLE_ENDIAN); usb_conv_info->interfaceNum = interface_num; offset += 1; /* bAlternateSetting */ alt_setting = tvb_get_guint8(tvb, offset); proto_tree_add_item(tree, hf_usb_bAlternateSetting, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* bNumEndpoints */ proto_tree_add_item(tree, hf_usb_bNumEndpoints, tvb, offset, 1, ENC_LITTLE_ENDIAN); offset += 1; /* bInterfaceClass */ proto_tree_add_item(tree, hf_usb_bInterfaceClass, tvb, offset, 1, ENC_LITTLE_ENDIAN); /* save the class so we can access it later in the endpoint descriptor */ usb_conv_info->interfaceClass = tvb_get_guint8(tvb, offset); class_str = val_to_str_ext(usb_conv_info->interfaceClass, &usb_class_vals_ext, "unknown (0x%X)"); proto_item_append_text(item, " (%u.%u): class %s", interface_num, alt_setting, class_str); if (!pinfo->fd->flags.visited) { usb_alt_setting_t alternate_setting; /* Register conversation for this interface in case CONTROL messages are sent to it */ usb_trans_info->interface_info = get_usb_iface_conv_info(pinfo, interface_num); alternate_setting.altSetting = alt_setting; alternate_setting.interfaceClass = tvb_get_guint8(tvb, offset); alternate_setting.interfaceSubclass = tvb_get_guint8(tvb, offset+1); alternate_setting.interfaceProtocol = tvb_get_guint8(tvb, offset+2); wmem_array_append_one(usb_trans_info->interface_info->alt_settings, alternate_setting); if (alt_setting == 0) { /* By default let's assume alternate setting 0 will be used */ /* in interface conversations, endpoint has no meaning */ usb_trans_info->interface_info->endpoint = NO_ENDPOINT8; usb_trans_info->interface_info->interfaceClass = alternate_setting.interfaceClass; usb_trans_info->interface_info->interfaceSubclass = alternate_setting.interfaceSubclass; usb_trans_info->interface_info->interfaceProtocol = alternate_setting.interfaceProtocol; usb_trans_info->interface_info->deviceVendor = usb_conv_info->deviceVendor; usb_trans_info->interface_info->deviceProduct = 
usb_conv_info->deviceProduct;
        }
    }
    offset += 1;

    /* bInterfaceSubClass */
    switch (usb_conv_info->interfaceClass) {
    case IF_CLASS_COMMUNICATIONS:
        proto_tree_add_item(tree, hf_usb_bInterfaceSubClass_cdc, tvb, offset, 1, ENC_LITTLE_ENDIAN);
        break;
    case IF_CLASS_HID:
        proto_tree_add_item(tree, hf_usb_bInterfaceSubClass_hid, tvb, offset, 1, ENC_LITTLE_ENDIAN);
        break;
    case IF_CLASS_MISCELLANEOUS:
        proto_tree_add_item(tree, hf_usb_bInterfaceSubClass_misc, tvb, offset, 1, ENC_LITTLE_ENDIAN);
        break;
    case IF_CLASS_APPLICATION_SPECIFIC:
        proto_tree_add_item(tree, hf_usb_bInterfaceSubClass_app, tvb, offset, 1, ENC_LITTLE_ENDIAN);
        break;
    default:
        proto_tree_add_item(tree, hf_usb_bInterfaceSubClass, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    }

    /* save the subclass so we can access it later in class-specific descriptors */
    usb_conv_info->interfaceSubclass = tvb_get_guint8(tvb, offset);
    offset += 1;

    /* bInterfaceProtocol */
    switch (usb_conv_info->interfaceClass) {
    case IF_CLASS_COMMUNICATIONS:
        proto_tree_add_item(tree, hf_usb_bInterfaceProtocol_cdc, tvb, offset, 1, ENC_LITTLE_ENDIAN);
        break;
    case IF_CLASS_CDC_DATA:
        proto_tree_add_item(tree, hf_usb_bInterfaceProtocol_cdc_data, tvb, offset, 1, ENC_LITTLE_ENDIAN);
        break;
    case IF_CLASS_APPLICATION_SPECIFIC:
        switch (usb_conv_info->interfaceSubclass) {
        case 0x01:
            proto_tree_add_item(tree, hf_usb_bInterfaceProtocol_app_dfu, tvb, offset, 1, ENC_LITTLE_ENDIAN);
            break;
        case 0x02:
            proto_tree_add_item(tree, hf_usb_bInterfaceProtocol_app_irda, tvb, offset, 1, ENC_LITTLE_ENDIAN);
            break;
        case 0x03:
            proto_tree_add_item(tree, hf_usb_bInterfaceProtocol_app_usb_test_and_measurement, tvb, offset, 1, ENC_LITTLE_ENDIAN);
            break;
        default:
            proto_tree_add_item(tree, hf_usb_bInterfaceProtocol, tvb, offset, 1, ENC_LITTLE_ENDIAN);
        }
        break;
    case IF_CLASS_HID:
        if (usb_conv_info->interfaceSubclass == 1) {
            proto_tree_add_item(tree, hf_usb_bInterfaceProtocol_hid_boot, tvb, offset, 1, ENC_LITTLE_ENDIAN);
            break;
        }
        proto_tree_add_item(tree, hf_usb_bInterfaceProtocol, tvb,
offset, 1, ENC_LITTLE_ENDIAN);
        break;
    default:
        proto_tree_add_item(tree, hf_usb_bInterfaceProtocol, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    }

    usb_conv_info->interfaceProtocol = tvb_get_guint8(tvb, offset);
    offset += 1;

    /* iInterface */
    proto_tree_add_item(tree, hf_usb_iInterface, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    proto_item_set_len(item, len);

    if (offset < old_offset+len) {
        /* skip unknown records */
        offset = old_offset + len;
    }

    return offset;
}

/* 9.6.6 */
const true_false_string tfs_endpoint_direction = {
    "IN Endpoint",
    "OUT Endpoint"
};

void
dissect_usb_endpoint_address(proto_tree *tree, tvbuff_t *tvb, int offset)
{
    proto_item *endpoint_item;
    proto_tree *endpoint_tree;
    guint8      endpoint;

    endpoint_item = proto_tree_add_item(tree, hf_usb_bEndpointAddress, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    endpoint_tree = proto_item_add_subtree(endpoint_item, ett_configuration_bEndpointAddress);

    endpoint = tvb_get_guint8(tvb, offset)&0x0f;
    proto_tree_add_item(endpoint_tree, hf_usb_bEndpointAddress_direction, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    proto_item_append_text(endpoint_item, " %s", (tvb_get_guint8(tvb, offset)&0x80)?"IN":"OUT");
    proto_tree_add_item(endpoint_tree, hf_usb_bEndpointAddress_number, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    proto_item_append_text(endpoint_item, " Endpoint:%d", endpoint);
}

int
dissect_usb_endpoint_descriptor(packet_info *pinfo, proto_tree *parent_tree,
                                tvbuff_t *tvb, int offset,
                                usb_conv_info_t *usb_conv_info)
{
    proto_item       *item;
    proto_tree       *tree;
    proto_item       *ep_attrib_item;
    proto_tree       *ep_attrib_tree;
    proto_item       *ep_pktsize_item;
    proto_tree       *ep_pktsize_tree;
    int               old_offset = offset;
    guint8            endpoint;
    guint8            ep_type;
    guint8            len;
    usb_trans_info_t *usb_trans_info = NULL;

    if (usb_conv_info)
        usb_trans_info = usb_conv_info->usb_trans_info;

    tree = proto_tree_add_subtree(parent_tree, tvb, offset, -1, ett_descriptor_device, &item, "ENDPOINT DESCRIPTOR");

    len = tvb_get_guint8(tvb, offset);
    dissect_usb_descriptor_header(tree, tvb, offset, NULL);
    offset
+= 2;

    endpoint = tvb_get_guint8(tvb, offset)&0x0f;
    dissect_usb_endpoint_address(tree, tvb, offset);
    offset += 1;

    /* Together with the class from the interface descriptor, we know
     * what kind of class the device at this endpoint is.
     * Make sure a conversation exists for this endpoint and attach a
     * usb_conv_info_t structure to it.
     *
     * All endpoints for the same interface descriptor share the same
     * usb_conv_info structure.
     */
    if ((!pinfo->fd->flags.visited) && usb_trans_info && usb_trans_info->interface_info) {
        conversation_t *conversation = NULL;

        if (pinfo->destport == NO_ENDPOINT) {
            address tmp_addr;
            usb_address_t *usb_addr = wmem_new0(wmem_packet_scope(), usb_address_t);

            /* The packet is sent from a USB device's endpoint 0 to the host.
               Replace endpoint 0 with the endpoint of this descriptor
               and find the corresponding conversation. */
            usb_addr->bus_id = ((const usb_address_t *)(pinfo->src.data))->bus_id;
            usb_addr->device = ((const usb_address_t *)(pinfo->src.data))->device;
            usb_addr->endpoint = GUINT32_TO_LE(endpoint);
            set_address(&tmp_addr, usb_address_type, USB_ADDR_LEN, (char *)usb_addr);
            conversation = get_usb_conversation(pinfo, &tmp_addr, &pinfo->dst, usb_addr->endpoint, pinfo->destport);
        }

        if (conversation) {
            usb_trans_info->interface_info->endpoint = endpoint;
            conversation_add_proto_data(conversation, proto_usb, usb_trans_info->interface_info);
        }
    }

    /* bmAttributes */
    ep_type = ENDPOINT_TYPE(tvb_get_guint8(tvb, offset));

    ep_attrib_item = proto_tree_add_item(tree, hf_usb_bmAttributes, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    ep_attrib_tree = proto_item_add_subtree(ep_attrib_item, ett_endpoint_bmAttributes);

    proto_tree_add_item(ep_attrib_tree, hf_usb_bEndpointAttributeTransfer, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    if (ep_type==USB_EP_ISOCHRONOUS) {
        proto_tree_add_item(ep_attrib_tree, hf_usb_bEndpointAttributeSynchonisation, tvb, offset, 1, ENC_LITTLE_ENDIAN);
        proto_tree_add_item(ep_attrib_tree, hf_usb_bEndpointAttributeBehaviour, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    }
    offset +=
1;

    /* wMaxPacketSize */
    ep_pktsize_item = proto_tree_add_item(tree, hf_usb_wMaxPacketSize, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    ep_pktsize_tree = proto_item_add_subtree(ep_pktsize_item, ett_endpoint_wMaxPacketSize);
    if ((ep_type == ENDPOINT_TYPE_INTERRUPT) || (ep_type == ENDPOINT_TYPE_ISOCHRONOUS)) {
        proto_tree_add_item(ep_pktsize_tree, hf_usb_wMaxPacketSize_slots, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    }
    proto_tree_add_item(ep_pktsize_tree, hf_usb_wMaxPacketSize_size, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* bInterval */
    proto_tree_add_item(tree, hf_usb_bInterval, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    proto_item_set_len(item, len);

    if (offset < old_offset+len) {
        /* skip unknown records */
        offset = old_offset + len;
    }

    return offset;
}

/* ECN */
static int
dissect_usb_interface_assn_descriptor(packet_info *pinfo _U_, proto_tree *parent_tree,
                                      tvbuff_t *tvb, int offset,
                                      usb_conv_info_t *usb_conv_info _U_)
{
    proto_item *item;
    proto_tree *tree;
    int         old_offset = offset;

    tree = proto_tree_add_subtree(parent_tree, tvb, offset, -1, ett_descriptor_device, &item, "INTERFACE ASSOCIATION DESCRIPTOR");

    dissect_usb_descriptor_header(tree, tvb, offset, NULL);
    offset += 2;

    /* bFirstInterface */
    proto_tree_add_item(tree, hf_usb_bFirstInterface, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    /* bInterfaceCount */
    proto_tree_add_item(tree, hf_usb_bInterfaceCount, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    /* bFunctionClass */
    proto_tree_add_item(tree, hf_usb_bFunctionClass, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    /* bFunctionSubclass */
    proto_tree_add_item(tree, hf_usb_bFunctionSubClass, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    /* bFunctionProtocol */
    proto_tree_add_item(tree, hf_usb_bFunctionProtocol, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    /* iFunction */
    proto_tree_add_item(tree, hf_usb_iFunction, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    proto_item_set_len(item, offset-old_offset);

    return offset;
}

int
dissect_usb_unknown_descriptor(packet_info *pinfo _U_, proto_tree *parent_tree,
                               tvbuff_t *tvb, int offset,
                               usb_conv_info_t *usb_conv_info _U_)
{
    proto_item *item;
    proto_tree *tree;
    guint8      bLength;

    tree = proto_tree_add_subtree(parent_tree, tvb, offset, -1, ett_descriptor_device, &item, "UNKNOWN DESCRIPTOR");

    bLength = tvb_get_guint8(tvb, offset);
    dissect_usb_descriptor_header(tree, tvb, offset, NULL);
    offset += bLength;

    proto_item_set_len(item, bLength);

    return offset;
}

/* 9.6.3 */
static const true_false_string tfs_mustbeone = {
    "Must be 1 for USB 1.1 and higher",
    "FIXME: Is this a USB 1.0 device"
};
static const true_false_string tfs_selfpowered = {
    "This device is SELF-POWERED",
    "This device is powered from the USB bus"
};
static const true_false_string tfs_remotewakeup = {
    "This device supports REMOTE WAKEUP",
    "This device does NOT support remote wakeup"
};

static int
dissect_usb_configuration_descriptor(packet_info *pinfo _U_, proto_tree *parent_tree,
                                     tvbuff_t *tvb, int offset,
                                     usb_conv_info_t *usb_conv_info)
{
    proto_item       *item;
    proto_tree       *tree;
    int               old_offset = offset;
    guint16           len;
    proto_item       *flags_item;
    proto_tree       *flags_tree;
    guint8            flags;
    proto_item       *power_item;
    guint8            power;
    gboolean          truncation_expected;
    usb_trans_info_t *usb_trans_info;

    usb_trans_info = usb_conv_info->usb_trans_info;

    usb_conv_info->interfaceClass    = IF_CLASS_UNKNOWN;
    usb_conv_info->interfaceSubclass = IF_SUBCLASS_UNKNOWN;
    usb_conv_info->interfaceProtocol = IF_PROTOCOL_UNKNOWN;

    tree = proto_tree_add_subtree(parent_tree, tvb, offset, -1, ett_descriptor_device, &item, "CONFIGURATION DESCRIPTOR");

    dissect_usb_descriptor_header(tree, tvb, offset, NULL);
    offset += 2;

    /* wTotalLength */
    proto_tree_add_item(tree, hf_usb_wTotalLength, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    len = tvb_get_letohs(tvb, offset);
    offset += 2;

    /* bNumInterfaces */
    proto_tree_add_item(tree, hf_usb_bNumInterfaces, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    /* bConfigurationValue */
    proto_tree_add_item(tree,
hf_usb_bConfigurationValue, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    /* iConfiguration */
    proto_tree_add_item(tree, hf_usb_iConfiguration, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    /* bmAttributes */
    flags_item = proto_tree_add_item(tree, hf_usb_configuration_bmAttributes, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    flags_tree = proto_item_add_subtree(flags_item, ett_configuration_bmAttributes);

    flags = tvb_get_guint8(tvb, offset);
    proto_tree_add_item(flags_tree, hf_usb_configuration_legacy10buspowered, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    proto_tree_add_item(flags_tree, hf_usb_configuration_selfpowered, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    proto_item_append_text(flags_item, " %sSELF-POWERED", (flags&0x40)?"":"NOT ");
    proto_tree_add_item(flags_tree, hf_usb_configuration_remotewakeup, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    proto_item_append_text(flags_item, " %sREMOTE-WAKEUP", (flags&0x20)?"":"NO ");
    offset += 1;

    /* bMaxPower */
    power_item = proto_tree_add_item(tree, hf_usb_bMaxPower, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    power = tvb_get_guint8(tvb, offset);
    proto_item_append_text(power_item, " (%dmA)", power*2);
    offset += 1;

    /* initialize interface_info to NULL */
    usb_trans_info->interface_info = NULL;

    truncation_expected = (usb_trans_info->setup.wLength < len);

    /* decode any additional interface and endpoint descriptors */
    while (len > (offset-old_offset)) {
        guint8    next_type;
        guint8    next_len = 0;
        gint      remaining_tvb, remaining_len;
        tvbuff_t *next_tvb = NULL;

        /* Handle truncated descriptors appropriately */
        remaining_tvb = tvb_reported_length_remaining(tvb, offset);
        if (remaining_tvb > 0) {
            next_len = tvb_get_guint8(tvb, offset);
            remaining_len = len - (offset - old_offset);
            if ((next_len < 3) || (next_len > remaining_len)) {
                proto_tree_add_expert_format(parent_tree, pinfo, &ei_usb_desc_length_invalid,
                    tvb, offset, 1, "Invalid descriptor length: %u", next_len);
                item = NULL;
                break;
            }
        }

        if ((remaining_tvb == 0) || (next_len > remaining_tvb)) {
            if (truncation_expected)
break;
        }

        next_type = tvb_get_guint8(tvb, offset+1);
        switch (next_type) {
        case USB_DT_INTERFACE:
            offset = dissect_usb_interface_descriptor(pinfo, parent_tree, tvb, offset, usb_conv_info);
            break;
        case USB_DT_ENDPOINT:
            offset = dissect_usb_endpoint_descriptor(pinfo, parent_tree, tvb, offset, usb_conv_info);
            break;
        case USB_DT_INTERFACE_ASSOCIATION:
            offset = dissect_usb_interface_assn_descriptor(pinfo, parent_tree, tvb, offset, usb_conv_info);
            break;
        default:
            next_tvb = tvb_new_subset_length(tvb, offset, next_len);
            if (dissector_try_uint_new(usb_descriptor_dissector_table, usb_conv_info->interfaceClass, next_tvb, pinfo, parent_tree, TRUE, usb_conv_info)) {
                offset += next_len;
            } else {
                offset = dissect_usb_unknown_descriptor(pinfo, parent_tree, tvb, offset, usb_conv_info);
            }
            break;
            /* was: return offset; */
        }
    }

    proto_item_set_len(item, offset-old_offset);

    return offset;
}

/* 9.4.3 */
static int
dissect_usb_setup_get_descriptor_request(packet_info *pinfo, proto_tree *tree,
                                         tvbuff_t *tvb, int offset,
                                         usb_conv_info_t *usb_conv_info)
{
    usb_trans_info_t *usb_trans_info, trans_info;

    if (usb_conv_info)
        usb_trans_info = usb_conv_info->usb_trans_info;
    else
        usb_trans_info = &trans_info;

    /* descriptor index */
    proto_tree_add_item(tree, hf_usb_descriptor_index, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    usb_trans_info->u.get_descriptor.usb_index = tvb_get_guint8(tvb, offset);
    offset += 1;

    /* descriptor type */
    proto_tree_add_item(tree, hf_usb_bDescriptorType, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    usb_trans_info->u.get_descriptor.type = tvb_get_guint8(tvb, offset);
    offset += 1;
    col_append_fstr(pinfo->cinfo, COL_INFO, " %s",
        val_to_str_ext(usb_trans_info->u.get_descriptor.type, &std_descriptor_type_vals_ext, "Unknown type %u"));

    /* language id */
    proto_tree_add_item(tree, hf_usb_language_id, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* length */
    proto_tree_add_item(tree, hf_usb_length, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    return offset;
}

static int
dissect_usb_setup_get_descriptor_response(packet_info *pinfo, proto_tree *tree,
                                          tvbuff_t *tvb, int offset,
                                          usb_conv_info_t *usb_conv_info)
{
    usb_trans_info_t *usb_trans_info;

    usb_trans_info = usb_conv_info->usb_trans_info;

    col_append_fstr(pinfo->cinfo, COL_INFO, " %s",
        val_to_str_ext(usb_trans_info->u.get_descriptor.type, &std_descriptor_type_vals_ext, "Unknown type %u"));

    switch (usb_trans_info->u.get_descriptor.type) {
    case USB_DT_INTERFACE:
    case USB_DT_ENDPOINT:
        /* an interface or an endpoint descriptor can only be accessed
           as part of a configuration descriptor */
        break;
    case USB_DT_DEVICE:
        offset = dissect_usb_device_descriptor(pinfo, tree, tvb, offset, usb_conv_info);
        break;
    case USB_DT_CONFIG:
        offset = dissect_usb_configuration_descriptor(pinfo, tree, tvb, offset, usb_conv_info);
        break;
    case USB_DT_STRING:
        offset = dissect_usb_string_descriptor(pinfo, tree, tvb, offset, usb_conv_info);
        break;
    case USB_DT_DEVICE_QUALIFIER:
        offset = dissect_usb_device_qualifier_descriptor(pinfo, tree, tvb, offset, usb_conv_info);
        break;
    default:
        /* XXX dissect the descriptor coming back from the device */
        {
            guint len = tvb_reported_length_remaining(tvb, offset);
            proto_tree_add_bytes_format(tree, hf_usb_get_descriptor_resp_generic, tvb, offset, len, NULL,
                                        "GET DESCRIPTOR Response data (unknown descriptor type %u): %s",
                                        usb_trans_info->u.get_descriptor.type,
                                        tvb_bytes_to_str(wmem_packet_scope(), tvb, offset, len));
            offset = offset + len;
        }
        break;
    }

    return offset;
}

/*
 * These dissectors are used to dissect the setup part and the data
 * for URB_CONTROL_INPUT / GET INTERFACE
 */

/* 9.4.4 */
static int
dissect_usb_setup_get_interface_request(packet_info *pinfo _U_, proto_tree *tree,
                                        tvbuff_t *tvb, int offset,
                                        usb_conv_info_t *usb_conv_info _U_)
{
    /* zero */
    proto_tree_add_item(tree, hf_usb_value, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* interface */
    proto_tree_add_item(tree, hf_usb_wInterface, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* length */
proto_tree_add_item(tree, hf_usb_length, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    return offset;
}

static int
dissect_usb_setup_get_interface_response(packet_info *pinfo _U_, proto_tree *tree,
                                         tvbuff_t *tvb, int offset,
                                         usb_conv_info_t *usb_conv_info _U_)
{
    /* alternate setting */
    proto_tree_add_item(tree, hf_usb_bAlternateSetting, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    return offset;
}

/*
 * These dissectors are used to dissect the setup part and the data
 * for URB_CONTROL_INPUT / GET STATUS
 */

/* 9.4.5 */
static int
dissect_usb_setup_get_status_request(packet_info *pinfo _U_, proto_tree *tree,
                                     tvbuff_t *tvb, int offset,
                                     usb_conv_info_t *usb_conv_info _U_)
{
    /* zero */
    proto_tree_add_item(tree, hf_usb_value, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* zero/interface/endpoint */
    if (usb_conv_info) {
        guint8 recip;

        recip = USB_RECIPIENT(usb_conv_info->usb_trans_info->setup.requesttype);

        switch (recip) {
        case RQT_SETUP_RECIPIENT_INTERFACE:
            proto_tree_add_item(tree, hf_usb_wInterface, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            break;
        case RQT_SETUP_RECIPIENT_ENDPOINT:
            proto_tree_add_item(tree, hf_usb_wEndpoint, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            break;
        case RQT_SETUP_RECIPIENT_DEVICE:
        case RQT_SETUP_RECIPIENT_OTHER:
        default:
            proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            break;
        }
    } else {
        proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    }
    offset += 2;

    /* length */
    proto_tree_add_item(tree, hf_usb_length, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    return offset;
}

static int
dissect_usb_setup_get_status_response(packet_info *pinfo _U_, proto_tree *tree,
                                      tvbuff_t *tvb, int offset,
                                      usb_conv_info_t *usb_conv_info _U_)
{
    /* status */
    /* XXX - show bits */
    proto_tree_add_item(tree, hf_usb_wStatus, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    return offset;
}

/*
 * These dissectors are used to dissect the setup part and the data
 * for URB_CONTROL_INPUT / SET ADDRESS
 */

/* 9.4.6 */
static int
dissect_usb_setup_set_address_request(packet_info *pinfo _U_, proto_tree *tree,
                                      tvbuff_t *tvb, int offset,
                                      usb_conv_info_t *usb_conv_info _U_)
{
    /* device address */
    proto_tree_add_item(tree, hf_usb_device_address, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* zero */
    proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* zero */
    proto_tree_add_item(tree, hf_usb_length, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    return offset;
}

static int
dissect_usb_setup_set_address_response(packet_info *pinfo _U_, proto_tree *tree _U_,
                                       tvbuff_t *tvb _U_, int offset,
                                       usb_conv_info_t *usb_conv_info _U_)
{
    return offset;
}

/*
 * These dissectors are used to dissect the setup part and the data
 * for URB_CONTROL_INPUT / SET CONFIGURATION
 */

/* 9.4.7 */
static int
dissect_usb_setup_set_configuration_request(packet_info *pinfo _U_, proto_tree *tree,
                                            tvbuff_t *tvb, int offset,
                                            usb_conv_info_t *usb_conv_info _U_)
{
    /* configuration value */
    proto_tree_add_item(tree, hf_usb_bConfigurationValue, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* zero */
    proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* zero */
    proto_tree_add_item(tree, hf_usb_length, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    return offset;
}

static int
dissect_usb_setup_set_configuration_response(packet_info *pinfo _U_, proto_tree *tree _U_,
                                             tvbuff_t *tvb _U_, int offset,
                                             usb_conv_info_t *usb_conv_info _U_)
{
    return offset;
}

/*
 * These dissectors are used to dissect the setup part and the data
 * for URB_CONTROL_INPUT / SET FEATURE
 */

/* 9.4.9 */
static int
dissect_usb_setup_set_feature_request(packet_info *pinfo _U_, proto_tree *tree,
                                      tvbuff_t *tvb, int offset,
                                      usb_conv_info_t *usb_conv_info)
{
    guint8 recip;

    if (usb_conv_info) {
        recip = USB_RECIPIENT(usb_conv_info->usb_trans_info->setup.requesttype);

        /* feature selector, zero/interface/endpoint */
        switch (recip) {
        case RQT_SETUP_RECIPIENT_DEVICE:
            proto_tree_add_item(tree,
hf_usb_device_wFeatureSelector, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            offset += 2;
            proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            break;
        case RQT_SETUP_RECIPIENT_INTERFACE:
            proto_tree_add_item(tree, hf_usb_interface_wFeatureSelector, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            offset += 2;
            proto_tree_add_item(tree, hf_usb_wInterface, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            break;
        case RQT_SETUP_RECIPIENT_ENDPOINT:
            proto_tree_add_item(tree, hf_usb_endpoint_wFeatureSelector, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            offset += 2;
            proto_tree_add_item(tree, hf_usb_wEndpoint, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            break;
        case RQT_SETUP_RECIPIENT_OTHER:
        default:
            proto_tree_add_item(tree, hf_usb_value, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            offset += 2;
            proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN);
            break;
        }
    } else {
        /* No conversation information, so recipient type is unknown */
        proto_tree_add_item(tree, hf_usb_value, tvb, offset, 2, ENC_LITTLE_ENDIAN);
        offset += 2;
        proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    }
    offset += 2;

    /* zero */
    proto_tree_add_item(tree, hf_usb_length, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    return offset;
}

static int
dissect_usb_setup_set_feature_response(packet_info *pinfo _U_, proto_tree *tree _U_,
                                       tvbuff_t *tvb _U_, int offset,
                                       usb_conv_info_t *usb_conv_info _U_)
{
    return offset;
}

/*
 * These dissectors are used to dissect the setup part and the data
 * for URB_CONTROL_INPUT / SET INTERFACE
 */

/* 9.4.10 */
static int
dissect_usb_setup_set_interface_request(packet_info *pinfo, proto_tree *tree,
                                        tvbuff_t *tvb, int offset,
                                        usb_conv_info_t *usb_conv_info _U_)
{
    guint8 alt_setting, interface_num;

    /* alternate setting */
    alt_setting = tvb_get_guint8(tvb, offset);
    proto_tree_add_uint(tree, hf_usb_bAlternateSetting, tvb, offset, 2, alt_setting);
    offset += 2;

    /* interface */
    interface_num = tvb_get_guint8(tvb, offset);
    proto_tree_add_uint(tree, hf_usb_wInterface, tvb, offset, 2,
interface_num);
    offset += 2;

    /* zero */
    proto_tree_add_item(tree, hf_usb_length, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    if (!PINFO_FD_VISITED(pinfo)) {
        guint i, count;
        usb_conv_info_t *iface_conv_info = get_usb_iface_conv_info(pinfo, interface_num);

        /* update the conversation info with the selected alternate setting */
        count = wmem_array_get_count(iface_conv_info->alt_settings);
        for (i = 0; i < count; i++) {
            usb_alt_setting_t *alternate_setting = (usb_alt_setting_t *)wmem_array_index(iface_conv_info->alt_settings, i);

            if (alternate_setting->altSetting == alt_setting) {
                iface_conv_info->interfaceClass = alternate_setting->interfaceClass;
                iface_conv_info->interfaceSubclass = alternate_setting->interfaceSubclass;
                iface_conv_info->interfaceProtocol = alternate_setting->interfaceProtocol;
                break;
            }
        }
    }

    return offset;
}

static int
dissect_usb_setup_set_interface_response(packet_info *pinfo _U_, proto_tree *tree _U_,
                                         tvbuff_t *tvb _U_, int offset,
                                         usb_conv_info_t *usb_conv_info _U_)
{
    return offset;
}

/*
 * These dissectors are used to dissect the setup part and the data
 * for URB_CONTROL_INPUT / SYNCH FRAME
 */

/* 9.4.11 */
static int
dissect_usb_setup_synch_frame_request(packet_info *pinfo _U_, proto_tree *tree,
                                      tvbuff_t *tvb, int offset,
                                      usb_conv_info_t *usb_conv_info _U_)
{
    /* zero */
    proto_tree_add_item(tree, hf_usb_value, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* endpoint */
    proto_tree_add_item(tree, hf_usb_wEndpoint, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    /* two */
    proto_tree_add_item(tree, hf_usb_length, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    return offset;
}

static int
dissect_usb_setup_synch_frame_response(packet_info *pinfo _U_, proto_tree *tree _U_,
                                       tvbuff_t *tvb _U_, int offset,
                                       usb_conv_info_t *usb_conv_info _U_)
{
    /* frame number */
    proto_tree_add_item(tree, hf_usb_wFrameNumber, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    return offset;
}

/* Dissector used for unknown USB setup request/responses */
static int
dissect_usb_setup_generic(packet_info *pinfo _U_, proto_tree *tree,
                          tvbuff_t *tvb, int offset,
                          usb_conv_info_t *usb_conv_info _U_)
{
    proto_tree_add_item(tree, hf_usb_value, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;
    proto_tree_add_item(tree, hf_usb_index, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;
    proto_tree_add_item(tree, hf_usb_length, tvb, offset, 2, ENC_LITTLE_ENDIAN);
    offset += 2;

    return offset;
}

typedef int (*usb_setup_dissector)(packet_info *pinfo, proto_tree *tree,
                                   tvbuff_t *tvb, int offset,
                                   usb_conv_info_t *usb_conv_info);

typedef struct _usb_setup_dissector_table_t {
    guint8              request;
    usb_setup_dissector dissector;
} usb_setup_dissector_table_t;

static const usb_setup_dissector_table_t setup_request_dissectors[] = {
    {USB_SETUP_GET_STATUS,        dissect_usb_setup_get_status_request},
    {USB_SETUP_CLEAR_FEATURE,     dissect_usb_setup_clear_feature_request},
    {USB_SETUP_SET_FEATURE,       dissect_usb_setup_set_feature_request},
    {USB_SETUP_SET_ADDRESS,       dissect_usb_setup_set_address_request},
    {USB_SETUP_GET_DESCRIPTOR,    dissect_usb_setup_get_descriptor_request},
    {USB_SETUP_SET_CONFIGURATION, dissect_usb_setup_set_configuration_request},
    {USB_SETUP_GET_INTERFACE,     dissect_usb_setup_get_interface_request},
    {USB_SETUP_SET_INTERFACE,     dissect_usb_setup_set_interface_request},
    {USB_SETUP_SYNCH_FRAME,       dissect_usb_setup_synch_frame_request},
    {0, NULL}
};

static const usb_setup_dissector_table_t setup_response_dissectors[] = {
    {USB_SETUP_GET_STATUS,        dissect_usb_setup_get_status_response},
    {USB_SETUP_CLEAR_FEATURE,     dissect_usb_setup_clear_feature_response},
    {USB_SETUP_SET_FEATURE,       dissect_usb_setup_set_feature_response},
    {USB_SETUP_SET_ADDRESS,       dissect_usb_setup_set_address_response},
    {USB_SETUP_GET_DESCRIPTOR,    dissect_usb_setup_get_descriptor_response},
    {USB_SETUP_GET_CONFIGURATION, dissect_usb_setup_get_configuration_response},
    {USB_SETUP_SET_CONFIGURATION, dissect_usb_setup_set_configuration_response},
    {USB_SETUP_GET_INTERFACE,     dissect_usb_setup_get_interface_response},
{USB_SETUP_SET_INTERFACE,     dissect_usb_setup_set_interface_response},
    {USB_SETUP_SYNCH_FRAME,       dissect_usb_setup_synch_frame_response},
    {0, NULL}
};

static const value_string setup_request_names_vals[] = {
    {USB_SETUP_GET_STATUS,        "GET STATUS"},
    {USB_SETUP_CLEAR_FEATURE,     "CLEAR FEATURE"},
    {USB_SETUP_SET_FEATURE,       "SET FEATURE"},
    {USB_SETUP_SET_ADDRESS,       "SET ADDRESS"},
    {USB_SETUP_GET_DESCRIPTOR,    "GET DESCRIPTOR"},
    {USB_SETUP_SET_DESCRIPTOR,    "SET DESCRIPTOR"},
    {USB_SETUP_GET_CONFIGURATION, "GET CONFIGURATION"},
    {USB_SETUP_SET_CONFIGURATION, "SET CONFIGURATION"},
    {USB_SETUP_GET_INTERFACE,     "GET INTERFACE"},
    {USB_SETUP_SET_INTERFACE,     "SET INTERFACE"},
    {USB_SETUP_SYNCH_FRAME,       "SYNCH FRAME"},
    {USB_SETUP_SET_SEL,           "SET SEL"},
    {USB_SETUP_SET_ISOCH_DELAY,   "SET ISOCH DELAY"},
    {0, NULL}
};
static value_string_ext setup_request_names_vals_ext = VALUE_STRING_EXT_INIT(setup_request_names_vals);

static const true_false_string tfs_bmrequesttype_direction = {
    "Device-to-host",
    "Host-to-device"
};

static const value_string bmrequesttype_type_vals[] = {
    {RQT_SETUP_TYPE_STANDARD, "Standard"},
    {RQT_SETUP_TYPE_CLASS,    "Class"},
    {RQT_SETUP_TYPE_VENDOR,   "Vendor"},
    {0, NULL}
};

static const value_string bmrequesttype_recipient_vals[] = {
    {RQT_SETUP_RECIPIENT_DEVICE,    "Device"},
    {RQT_SETUP_RECIPIENT_INTERFACE, "Interface"},
    {RQT_SETUP_RECIPIENT_ENDPOINT,  "Endpoint"},
    {RQT_SETUP_RECIPIENT_OTHER,     "Other"},
    {0, NULL}
};

/* Dissector used for standard usb setup requests */
static int
dissect_usb_standard_setup_request(packet_info *pinfo, proto_tree *tree,
                                   tvbuff_t *tvb, usb_conv_info_t *usb_conv_info,
                                   usb_trans_info_t *usb_trans_info)
{
    gint offset = 0;
    const usb_setup_dissector_table_t *tmp;
    usb_setup_dissector dissector;

    proto_tree_add_item(tree, hf_usb_request, tvb, offset, 1, ENC_LITTLE_ENDIAN);
    offset += 1;

    col_add_fstr(pinfo->cinfo, COL_INFO, "%s Request",
        val_to_str_ext(usb_trans_info->setup.request, &setup_request_names_vals_ext, "Unknown type %x"));

    dissector = NULL;
    for (tmp =
setup_request_dissectors; tmp->dissector; tmp++) {
        if (tmp->request == usb_trans_info->setup.request) {
            dissector = tmp->dissector;
            break;
        }
    }

    if (!dissector) {
        dissector = &dissect_usb_setup_generic;
    }

    offset = dissector(pinfo, tree, tvb, offset, usb_conv_info);

    return offset;
}

/* Dissector used for standard usb setup responses */
static int
dissect_usb_standard_setup_response(packet_info *pinfo, proto_tree *tree,
                                    tvbuff_t *tvb, int offset,
                                    usb_conv_info_t *usb_conv_info)
{
    const usb_setup_dissector_table_t *tmp;
    usb_setup_dissector dissector;
    gint length_remaining;

    col_add_fstr(pinfo->cinfo, COL_INFO, "%s Response",
        val_to_str_ext(usb_conv_info->usb_trans_info->setup.request,
            &setup_request_names_vals_ext, "Unknown type %x"));

    dissector = NULL;
    for (tmp = setup_response_dissectors; tmp->dissector; tmp++) {
        if (tmp->request == usb_conv_info->usb_trans_info->setup.request) {
            dissector = tmp->dissector;
            break;
        }
    }

    length_remaining = tvb_reported_length_remaining(tvb, offset);
    if (length_remaining <= 0)
        return offset;

    if (dissector) {
        offset = dissector(pinfo, tree, tvb, offset, usb_conv_info);
    } else {
        proto_tree_add_item(tree, hf_usb_control_response_generic, tvb, offset, length_remaining, ENC_NA);
        offset += length_remaining;
    }

    return offset;
}

static void
usb_tap_queue_packet(packet_info *pinfo, guint8 urb_type, usb_conv_info_t *usb_conv_info)
{
    usb_tap_data_t *tap_data;

    tap_data                = wmem_new(wmem_packet_scope(), usb_tap_data_t);
    tap_data->urb_type      = urb_type;
    tap_data->transfer_type = (guint8)(usb_conv_info->transfer_type);
    tap_data->conv_info     = usb_conv_info;
    tap_data->trans_info    = usb_conv_info->usb_trans_info;

    tap_queue_packet(usb_tap, pinfo, tap_data);
}

static gboolean
is_usb_standard_setup_request(usb_trans_info_t *usb_trans_info)
{
    guint8 type, recip;

    type  = USB_TYPE(usb_trans_info->setup.requesttype);
    recip = USB_RECIPIENT(usb_trans_info->setup.requesttype);

    if (type != RQT_SETUP_TYPE_STANDARD)
        return FALSE;

    /* the USB standard defines the GET_DESCRIPTOR
request only as a request to a device;
       if it's not aimed at a device, it's a non-standard request that
       should be handled by a class-specific dissector */
    if (usb_trans_info->setup.request == USB_SETUP_GET_DESCRIPTOR &&
            recip != RQT_SETUP_RECIPIENT_DEVICE) {
        return FALSE;
    }

    return TRUE;
}

static gint
try_dissect_next_protocol(proto_tree *tree, tvbuff_t *next_tvb, packet_info *pinfo,
        usb_conv_info_t *usb_conv_info, guint8 urb_type, proto_tree *urb_tree)
{
    int                      ret;
    wmem_tree_key_t          key[4];
    guint32                  k_frame_number;
    guint32                  k_device_address;
    guint32                  k_bus_id;
    usb_trans_info_t        *usb_trans_info;
    heur_dtbl_entry_t       *hdtbl_entry;
    heur_dissector_list_t    heur_subdissector_list = NULL;
    dissector_table_t        usb_dissector_table = NULL;
    proto_item              *sub_item;
    device_product_data_t   *device_product_data;
    device_protocol_data_t  *device_protocol_data;
    guint8                   ctrl_recip;
    /* if we select the next dissector based on a class,
       this is the (device or interface) class we're using */
    guint32                  usb_class;

    if (!usb_conv_info) {
        /*
         * Not enough information to choose the next protocol.
         * XXX - is there something we can still do here?
*/
        if (tvb_reported_length(next_tvb) > 0)
            call_data_dissector(next_tvb, pinfo, tree);
        return tvb_captured_length(next_tvb);
    }

    /* try dissect by "usb.device" */
    ret = dissector_try_uint_new(device_to_dissector,
            (guint32)(usb_conv_info->bus_id<<16 | usb_conv_info->device_address),
            next_tvb, pinfo, tree, TRUE, usb_conv_info);
    if (ret)
        return tvb_captured_length(next_tvb);

    k_frame_number   = pinfo->num;
    k_device_address = usb_conv_info->device_address;
    k_bus_id         = usb_conv_info->bus_id;

    key[0].length = 1;
    key[0].key    = &k_device_address;
    key[1].length = 1;
    key[1].key    = &k_bus_id;
    key[2].length = 1;
    key[2].key    = &k_frame_number;
    key[3].length = 0;
    key[3].key    = NULL;

    /* try dissect by "usb.protocol" */
    device_protocol_data = (device_protocol_data_t *)wmem_tree_lookup32_array_le(device_to_protocol_table, key);

    if (device_protocol_data &&
            device_protocol_data->bus_id == usb_conv_info->bus_id &&
            device_protocol_data->device_address == usb_conv_info->device_address) {
        ret = dissector_try_uint_new(protocol_to_dissector, (guint32)device_protocol_data->protocol,
                next_tvb, pinfo, tree, TRUE, usb_conv_info);
        if (ret)
            return tvb_captured_length(next_tvb);
    }

    device_product_data = (device_product_data_t *)wmem_tree_lookup32_array_le(device_to_product_table, key);

    if (device_product_data &&
            device_product_data->bus_id == usb_conv_info->bus_id &&
            device_product_data->device_address == usb_conv_info->device_address) {
        ret = dissector_try_uint_new(product_to_dissector,
                (guint32)(device_product_data->vendor<<16 | device_product_data->product),
                next_tvb, pinfo, tree, TRUE, usb_conv_info);
        if (ret)
            return tvb_captured_length(next_tvb);
    }

    switch (usb_conv_info->transfer_type) {
    case URB_BULK:
        heur_subdissector_list = heur_bulk_subdissector_list;
        usb_dissector_table = usb_bulk_dissector_table;
        break;

    case URB_INTERRUPT:
        heur_subdissector_list = heur_interrupt_subdissector_list;
        usb_dissector_table = usb_interrupt_dissector_table;
        break;

    case URB_CONTROL:
        usb_trans_info =
usb_conv_info->usb_trans_info; if (!usb_trans_info) break; /* for standard control requests and responses, there's no need to query dissector tables */ if (is_usb_standard_setup_request(usb_trans_info)) break; ctrl_recip = USB_RECIPIENT(usb_trans_info->setup.requesttype); if (ctrl_recip == RQT_SETUP_RECIPIENT_DEVICE) { /* a non-standard control message addressed to a device this is not supported (I don't think it's used anywhere) */ break; } else if (ctrl_recip == RQT_SETUP_RECIPIENT_INTERFACE) { guint8 interface_num = usb_trans_info->setup.wIndex & 0xff; heur_subdissector_list = heur_control_subdissector_list; usb_dissector_table = usb_control_dissector_table; usb_conv_info = get_usb_iface_conv_info(pinfo, interface_num); usb_conv_info->usb_trans_info = usb_trans_info; } else if (ctrl_recip == RQT_SETUP_RECIPIENT_ENDPOINT) { address endpoint_addr; gint endpoint; guint32 src_endpoint, dst_endpoint; conversation_t *conversation; heur_subdissector_list = heur_control_subdissector_list; usb_dissector_table = usb_control_dissector_table; endpoint = usb_trans_info->setup.wIndex & 0x0f; if (usb_conv_info->is_request) { usb_address_t *dst_addr = wmem_new0(wmem_packet_scope(), usb_address_t); dst_addr->bus_id = usb_conv_info->bus_id; dst_addr->device = usb_conv_info->device_address; dst_addr->endpoint = dst_endpoint = GUINT32_TO_LE(endpoint); set_address(&endpoint_addr, usb_address_type, USB_ADDR_LEN, (char *)dst_addr); conversation = get_usb_conversation(pinfo, &pinfo->src, &endpoint_addr, pinfo->srcport, dst_endpoint); } else { usb_address_t *src_addr = wmem_new0(wmem_packet_scope(), usb_address_t); src_addr->bus_id = usb_conv_info->bus_id; src_addr->device = usb_conv_info->device_address; src_addr->endpoint = src_endpoint = GUINT32_TO_LE(endpoint); set_address(&endpoint_addr, usb_address_type, USB_ADDR_LEN, (char *)src_addr); conversation = get_usb_conversation(pinfo, &endpoint_addr, &pinfo->dst, src_endpoint, pinfo->destport); } usb_conv_info = 
get_usb_conv_info(conversation); usb_conv_info->usb_trans_info = usb_trans_info; } else { /* the recipient is "other" or "reserved" there's no way for us to determine the interfaceClass we set the usb_dissector_table anyhow as some dissectors register for control messages to IF_CLASS_UNKNOWN (this should be fixed) */ heur_subdissector_list = heur_control_subdissector_list; usb_dissector_table = usb_control_dissector_table; } usb_tap_queue_packet(pinfo, urb_type, usb_conv_info); sub_item = proto_tree_add_uint(urb_tree, hf_usb_bInterfaceClass, next_tvb, 0, 0, usb_conv_info->interfaceClass); PROTO_ITEM_SET_GENERATED(sub_item); break; default: break; } if (try_heuristics && heur_subdissector_list) { ret = dissector_try_heuristic(heur_subdissector_list, next_tvb, pinfo, tree, &hdtbl_entry, usb_conv_info); if (ret) return tvb_captured_length(next_tvb); } if (usb_dissector_table) { /* we prefer the interface class unless it says we should refer to the device class XXX - use the device class if the interface class is unknown */ if (usb_conv_info->interfaceClass == IF_CLASS_DEVICE) { usb_class = (usb_conv_info->device_protocol>>16) & 0xFF; } else { usb_class = usb_conv_info->interfaceClass; } ret = dissector_try_uint_new(usb_dissector_table, usb_class, next_tvb, pinfo, tree, TRUE, usb_conv_info); if (ret) return tvb_captured_length(next_tvb); } return 0; } static int dissect_usb_setup_response(packet_info *pinfo, proto_tree *tree, tvbuff_t *tvb, int offset, guint8 urb_type, usb_conv_info_t *usb_conv_info) { proto_tree *parent; tvbuff_t *next_tvb = NULL; gint length_remaining; parent = proto_tree_get_parent_tree(tree); if (usb_conv_info) { if (usb_conv_info->usb_trans_info && is_usb_standard_setup_request(usb_conv_info->usb_trans_info)) { offset = dissect_usb_standard_setup_response(pinfo, parent, tvb, offset, usb_conv_info); } else { next_tvb = tvb_new_subset_remaining(tvb, offset); offset += try_dissect_next_protocol(parent, next_tvb, pinfo, usb_conv_info, urb_type, tree); 
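/*
 * Illustrative sketch, not part of the original file: the dissector
 * tables that try_dissect_next_protocol() consults are filled in by the
 * individual USB class dissectors at registration time.  A hypothetical
 * vendor-class dissector ("dissect_my_class" and "proto_my_class" are
 * placeholder names, not real symbols) would hook itself in roughly
 * like this from its handoff routine:
 */
#if 0
void
proto_reg_handoff_my_class(void)
{
    dissector_handle_t my_class_handle;

    my_class_handle = create_dissector_handle(dissect_my_class, proto_my_class);
    /* looked up via usb_control_dissector_table for URB_CONTROL transfers */
    dissector_add_uint("usb.control", IF_CLASS_VENDOR_SPECIFIC, my_class_handle);
    /* looked up via usb_bulk_dissector_table for URB_BULK transfers */
    dissector_add_uint("usb.bulk", IF_CLASS_VENDOR_SPECIFIC, my_class_handle);
}
#endif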
length_remaining = tvb_reported_length_remaining(tvb, offset); if (length_remaining > 0) { proto_tree_add_item(parent, hf_usb_control_response_generic, tvb, offset, length_remaining, ENC_NA); offset += length_remaining; } } } else { /* no matching request available */ length_remaining = tvb_reported_length_remaining(tvb, offset); if (length_remaining > 0) { proto_tree_add_item(parent, hf_usb_control_response_generic, tvb, offset, length_remaining, ENC_NA); offset += length_remaining; } } return offset; } static int dissect_usb_bmrequesttype(proto_tree *parent_tree, tvbuff_t *tvb, int offset) { proto_item *item; proto_tree *tree; item = proto_tree_add_item(parent_tree, hf_usb_bmRequestType, tvb, offset, 1, ENC_LITTLE_ENDIAN); tree = proto_item_add_subtree(item, ett_usb_setup_bmrequesttype); proto_tree_add_item(tree, hf_usb_bmRequestType_direction, tvb, offset, 1, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_bmRequestType_type, tvb, offset, 1, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_bmRequestType_recipient, tvb, offset, 1, ENC_LITTLE_ENDIAN); return ++offset; } static int dissect_linux_usb_pseudo_header_ext(tvbuff_t *tvb, int offset, packet_info *pinfo _U_, proto_tree *tree) { proto_tree_add_item(tree, hf_usb_urb_interval, tvb, offset, 4, ENC_HOST_ENDIAN); offset += 4; proto_tree_add_item(tree, hf_usb_urb_start_frame, tvb, offset, 4, ENC_HOST_ENDIAN); offset += 4; proto_tree_add_item(tree, hf_usb_urb_copy_of_transfer_flags, tvb, offset, 4, ENC_HOST_ENDIAN); offset += 4; proto_tree_add_item(tree, hf_usb_iso_numdesc, tvb, offset, 4, ENC_HOST_ENDIAN); offset += 4; return offset; } /* Dissector used for usb setup requests */ static int dissect_usb_setup_request(packet_info *pinfo, proto_tree *tree, tvbuff_t *tvb, int offset, guint8 urb_type, usb_conv_info_t *usb_conv_info, usb_header_t header_type) { gint setup_offset; gint req_type; gint ret; proto_tree *parent, *setup_tree; usb_trans_info_t *usb_trans_info, trans_info; tvbuff_t *next_tvb, *data_tvb 
= NULL;

    /* we should do the NULL check in all non-static functions */
    if (usb_conv_info)
        usb_trans_info = usb_conv_info->usb_trans_info;
    else
        usb_trans_info = &trans_info;

    parent = proto_tree_get_parent_tree(tree);

    setup_tree = proto_tree_add_subtree(parent, tvb, offset, 8,
            ett_usb_setup_hdr, NULL, "URB setup");

    req_type = USB_TYPE(tvb_get_guint8(tvb, offset));
    usb_trans_info->setup.requesttype = tvb_get_guint8(tvb, offset);
    if (usb_conv_info) {
        usb_conv_info->setup_requesttype = tvb_get_guint8(tvb, offset);
        if (req_type != RQT_SETUP_TYPE_CLASS)
            usb_tap_queue_packet(pinfo, urb_type, usb_conv_info);
    }

    offset = dissect_usb_bmrequesttype(setup_tree, tvb, offset);

    /* as we're going through the data, we build a next_tvb that contains
       the setup packet without the request type and request-specific data;
       all subsequent dissection routines work on this tvb */

    setup_offset = offset;
    usb_trans_info->setup.request = tvb_get_guint8(tvb, offset);
    offset++;
    usb_trans_info->setup.wValue = tvb_get_letohs(tvb, offset);
    offset += 2;
    usb_trans_info->setup.wIndex = tvb_get_letohs(tvb, offset);
    offset += 2;
    usb_trans_info->setup.wLength = tvb_get_letohs(tvb, offset);
    offset += 2;

    if (header_type == USB_HEADER_LINUX_64_BYTES) {
        offset = dissect_linux_usb_pseudo_header_ext(tvb, offset, pinfo, tree);
    }

    if (tvb_captured_length_remaining(tvb, offset) > 0) {
        next_tvb = tvb_new_composite();
        tvb_composite_append(next_tvb, tvb_new_subset(tvb, setup_offset, 7, 7));

        data_tvb = tvb_new_subset_remaining(tvb, offset);
        tvb_composite_append(next_tvb, data_tvb);
        offset += tvb_captured_length(data_tvb);
        tvb_composite_finalize(next_tvb);
        next_tvb = tvb_new_child_real_data(tvb,
                (const guint8 *) tvb_memdup(pinfo->pool, next_tvb, 0,
                    tvb_captured_length(next_tvb)),
                tvb_captured_length(next_tvb),
                tvb_captured_length(next_tvb));
        add_new_data_source(pinfo, next_tvb, "USB Control");
    } else {
        next_tvb = tvb_new_subset(tvb, setup_offset, 7, 7);
    }

    /* at this point, offset contains the number of bytes that we
dissected */ if (is_usb_standard_setup_request(usb_trans_info)) { /* there's no point in checking the return value as there's no fallback for standard setup requests */ dissect_usb_standard_setup_request(pinfo, setup_tree, next_tvb, usb_conv_info, usb_trans_info); } else { /* no standard request - pass it on to class-specific dissectors */ ret = try_dissect_next_protocol( parent, next_tvb, pinfo, usb_conv_info, urb_type, tree); if (ret <= 0) { /* no class-specific dissector could handle it, dissect it as generic setup request */ proto_tree_add_item(setup_tree, hf_usb_request_unknown_class, next_tvb, 0, 1, ENC_LITTLE_ENDIAN); dissect_usb_setup_generic(pinfo, setup_tree, next_tvb, 1, usb_conv_info); } else if (data_tvb) { proto_tree_add_item(setup_tree, hf_usb_request_unknown_class, tvb, setup_offset, 1, ENC_LITTLE_ENDIAN); dissect_usb_setup_generic(pinfo, setup_tree, tvb, setup_offset+1, usb_conv_info); } } if (data_tvb) proto_tree_add_item(setup_tree, hf_usb_data_fragment, data_tvb, 0, -1, ENC_NA); return offset; } /* dissect the linux-specific USB pseudo header and fill the conversation struct return the number of dissected bytes */ static gint dissect_linux_usb_pseudo_header(tvbuff_t *tvb, packet_info *pinfo, proto_tree *tree, usb_conv_info_t *usb_conv_info, guint64 *urb_id) { guint8 transfer_type; guint8 endpoint_byte; guint8 transfer_type_and_direction; guint8 urb_type; guint8 flag[2]; *urb_id = tvb_get_guint64(tvb, 0, ENC_HOST_ENDIAN); proto_tree_add_uint64(tree, hf_usb_urb_id, tvb, 0, 8, *urb_id); /* show the urb type of this URB as string and as a character */ urb_type = tvb_get_guint8(tvb, 8); usb_conv_info->is_request = (urb_type==URB_SUBMIT); proto_tree_add_uint_format_value(tree, hf_usb_linux_urb_type, tvb, 8, 1, urb_type, "%s ('%c')", val_to_str(urb_type, usb_linux_urb_type_vals, "Unknown %d"), g_ascii_isprint(urb_type) ? 
urb_type : '.'); proto_tree_add_item(tree, hf_usb_linux_transfer_type, tvb, 9, 1, ENC_LITTLE_ENDIAN); transfer_type = tvb_get_guint8(tvb, 9); usb_conv_info->transfer_type = transfer_type; endpoint_byte = tvb_get_guint8(tvb, 10); /* direction bit | endpoint */ usb_conv_info->endpoint = endpoint_byte & 0x7F; if (endpoint_byte & URB_TRANSFER_IN) usb_conv_info->direction = P2P_DIR_RECV; else usb_conv_info->direction = P2P_DIR_SENT; transfer_type_and_direction = (transfer_type & 0x7F) | (endpoint_byte & 0x80); col_append_str(pinfo->cinfo, COL_INFO, val_to_str(transfer_type_and_direction, usb_transfer_type_and_direction_vals, "Unknown type %x")); proto_tree_add_bitmask(tree, tvb, 10, hf_usb_endpoint_number, ett_usb_endpoint, usb_endpoint_fields, ENC_NA); proto_tree_add_item(tree, hf_usb_device_address, tvb, 11, 1, ENC_LITTLE_ENDIAN); usb_conv_info->device_address = (guint16)tvb_get_guint8(tvb, 11); proto_tree_add_item(tree, hf_usb_bus_id, tvb, 12, 2, ENC_HOST_ENDIAN); tvb_memcpy(tvb, &usb_conv_info->bus_id, 12, 2); /* Right after the pseudo header we always have * sizeof(struct usb_device_setup_hdr) bytes. The content of these * bytes only have meaning in case setup_flag == 0. */ flag[0] = tvb_get_guint8(tvb, 14); flag[1] = '\0'; if (flag[0] == 0) { usb_conv_info->is_setup = TRUE; proto_tree_add_string(tree, hf_usb_setup_flag, tvb, 14, 1, "relevant (0)"); if (usb_conv_info->transfer_type!=URB_CONTROL) proto_tree_add_expert(tree, pinfo, &ei_usb_invalid_setup, tvb, 14, 1); } else { usb_conv_info->is_setup = FALSE; proto_tree_add_string_format_value(tree, hf_usb_setup_flag, tvb, 14, 1, flag, "not relevant ('%c')", g_ascii_isprint(flag[0]) ? flag[0]: '.'); } flag[0] = tvb_get_guint8(tvb, 15); flag[1] = '\0'; if (flag[0] == 0) { proto_tree_add_string(tree, hf_usb_data_flag, tvb, 15, 1, "present (0)"); } else { proto_tree_add_string_format_value(tree, hf_usb_data_flag, tvb, 15, 1, flag, "not present ('%c')", g_ascii_isprint(flag[0]) ? 
flag[0] : '.'); } proto_tree_add_item(tree, hf_usb_urb_ts_sec, tvb, 16, 8, ENC_HOST_ENDIAN); proto_tree_add_item(tree, hf_usb_urb_ts_usec, tvb, 24, 4, ENC_HOST_ENDIAN); proto_tree_add_item(tree, hf_usb_urb_status, tvb, 28, 4, ENC_HOST_ENDIAN); proto_tree_add_item(tree, hf_usb_urb_len, tvb, 32, 4, ENC_HOST_ENDIAN); proto_tree_add_item(tree, hf_usb_urb_data_len, tvb, 36, 4, ENC_HOST_ENDIAN); return 40; } /* dissect the usbpcap_buffer_packet_header and fill the conversation struct this function does not handle the transfer-specific headers return the number of bytes processed */ static gint dissect_usbpcap_buffer_packet_header(tvbuff_t *tvb, packet_info *pinfo, proto_tree *tree, usb_conv_info_t *usb_conv_info, guint32 *win32_data_len, guint64 *irp_id) { guint8 transfer_type; guint8 endpoint_byte; guint8 transfer_type_and_direction; guint8 tmp_val8; proto_tree_add_item(tree, hf_usb_win32_header_len, tvb, 0, 2, ENC_LITTLE_ENDIAN); *irp_id = tvb_get_guint64(tvb, 2, ENC_LITTLE_ENDIAN); proto_tree_add_uint64(tree, hf_usb_irp_id, tvb, 2, 8, *irp_id); proto_tree_add_item(tree, hf_usb_usbd_status, tvb, 10, 4, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_function, tvb, 14, 2, ENC_LITTLE_ENDIAN); proto_tree_add_bitmask(tree, tvb, 16, hf_usb_info, ett_usb_usbpcap_info, usb_usbpcap_info_fields, ENC_LITTLE_ENDIAN); tmp_val8 = tvb_get_guint8(tvb, 16); /* TODO: Handle errors */ if (tmp_val8 & 0x01) { usb_conv_info->is_request = FALSE; } else { usb_conv_info->is_request = TRUE; } proto_tree_add_item(tree, hf_usb_bus_id, tvb, 17, 2, ENC_LITTLE_ENDIAN); usb_conv_info->bus_id = tvb_get_letohs(tvb, 17); proto_tree_add_item(tree, hf_usb_win32_device_address, tvb, 19, 2, ENC_LITTLE_ENDIAN); usb_conv_info->device_address = tvb_get_letohs(tvb, 19); endpoint_byte = tvb_get_guint8(tvb, 21); usb_conv_info->direction = endpoint_byte&URB_TRANSFER_IN ? 
P2P_DIR_RECV : P2P_DIR_SENT; usb_conv_info->endpoint = endpoint_byte&0x7F; proto_tree_add_bitmask(tree, tvb, 21, hf_usb_endpoint_number, ett_usb_endpoint, usb_endpoint_fields, ENC_LITTLE_ENDIAN); transfer_type = tvb_get_guint8(tvb, 22); usb_conv_info->transfer_type = transfer_type; proto_tree_add_item(tree, hf_usb_linux_transfer_type, tvb, 22, 1, ENC_LITTLE_ENDIAN); transfer_type_and_direction = (transfer_type & 0x7F) | (endpoint_byte & 0x80); col_append_str(pinfo->cinfo, COL_INFO, val_to_str(transfer_type_and_direction, usb_transfer_type_and_direction_vals, "Unknown type %x")); *win32_data_len = tvb_get_letohl(tvb, 23); proto_tree_add_item(tree, hf_usb_win32_data_len, tvb, 23, 4, ENC_LITTLE_ENDIAN); /* by default, we assume it's no setup packet the correct values will be set when we parse the control header */ usb_conv_info->is_setup = FALSE; usb_conv_info->setup_requesttype = 0; /* we don't handle the transfer-specific headers here */ return 27; } /* Set the usb_address_t fields based on the direction of the urb */ static void usb_set_addr(proto_tree *tree, tvbuff_t *tvb, packet_info *pinfo, guint16 bus_id, guint16 device_address, int endpoint, gboolean req) { proto_item *sub_item; usb_address_t *src_addr = wmem_new0(pinfo->pool, usb_address_t), *dst_addr = wmem_new0(pinfo->pool, usb_address_t); guint8 *str_src_addr; guint8 *str_dst_addr; if (req) { /* request */ src_addr->device = 0xffffffff; src_addr->endpoint = NO_ENDPOINT; dst_addr->device = GUINT16_TO_LE(device_address); dst_addr->endpoint = GUINT32_TO_LE(endpoint); } else { /* response */ src_addr->device = GUINT16_TO_LE(device_address); src_addr->endpoint = GUINT32_TO_LE(endpoint); dst_addr->device = 0xffffffff; dst_addr->endpoint = NO_ENDPOINT; } src_addr->bus_id = GUINT16_TO_LE(bus_id); dst_addr->bus_id = GUINT16_TO_LE(bus_id); set_address(&pinfo->net_src, usb_address_type, USB_ADDR_LEN, (char *)src_addr); copy_address_shallow(&pinfo->src, &pinfo->net_src); set_address(&pinfo->net_dst, usb_address_type, 
USB_ADDR_LEN, (char *)dst_addr); copy_address_shallow(&pinfo->dst, &pinfo->net_dst); pinfo->ptype = PT_USB; pinfo->srcport = src_addr->endpoint; pinfo->destport = dst_addr->endpoint; /* sent/received is from the perspective of the USB host */ pinfo->p2p_dir = req ? P2P_DIR_SENT : P2P_DIR_RECV; str_src_addr = address_to_str(wmem_packet_scope(), &pinfo->src); str_dst_addr = address_to_str(wmem_packet_scope(), &pinfo->dst); sub_item = proto_tree_add_string(tree, hf_usb_src, tvb, 0, 0, str_src_addr); PROTO_ITEM_SET_GENERATED(sub_item); sub_item = proto_tree_add_string(tree, hf_usb_addr, tvb, 0, 0, str_src_addr); PROTO_ITEM_SET_HIDDEN(sub_item); sub_item = proto_tree_add_string(tree, hf_usb_dst, tvb, 0, 0, str_dst_addr); PROTO_ITEM_SET_GENERATED(sub_item); sub_item = proto_tree_add_string(tree, hf_usb_addr, tvb, 0, 0, str_dst_addr); PROTO_ITEM_SET_HIDDEN(sub_item); } /* Gets the transfer info for a given packet * Generates transfer info if none exists yet * Also adds request/response info to the tree for the given packet */ static usb_trans_info_t *usb_get_trans_info(tvbuff_t *tvb, packet_info *pinfo, proto_tree *tree, usb_header_t header_type, usb_conv_info_t *usb_conv_info, guint64 usb_id) { usb_trans_info_t *usb_trans_info; proto_item *ti; nstime_t t, deltat; wmem_tree_key_t key[3]; /* request/response matching so we can keep track of transaction specific * data. 
*/ key[0].length = 2; key[0].key = (guint32 *)&usb_id; key[1].length = 1; key[1].key = &pinfo->num; key[2].length = 0; key[2].key = NULL; if (usb_conv_info->is_request) { /* this is a request */ usb_trans_info = (usb_trans_info_t *)wmem_tree_lookup32_array(usb_conv_info->transactions, key); if (!usb_trans_info) { usb_trans_info = wmem_new0(wmem_file_scope(), usb_trans_info_t); usb_trans_info->request_in = pinfo->num; usb_trans_info->req_time = pinfo->abs_ts; usb_trans_info->header_type = header_type; usb_trans_info->usb_id = usb_id; wmem_tree_insert32_array(usb_conv_info->transactions, key, usb_trans_info); } if (usb_trans_info->response_in) { ti = proto_tree_add_uint(tree, hf_usb_response_in, tvb, 0, 0, usb_trans_info->response_in); PROTO_ITEM_SET_GENERATED(ti); } } else { /* this is a response */ if (pinfo->fd->flags.visited) { usb_trans_info = (usb_trans_info_t *)wmem_tree_lookup32_array(usb_conv_info->transactions, key); } else { usb_trans_info = (usb_trans_info_t *)wmem_tree_lookup32_array_le(usb_conv_info->transactions, key); if (usb_trans_info) { if (usb_trans_info->usb_id == usb_id) { if (usb_trans_info->response_in == 0) { /* USBPcap generates 2 frames for response; store the first one */ usb_trans_info->response_in = pinfo->num; } wmem_tree_insert32_array(usb_conv_info->transactions, key, usb_trans_info); } else { usb_trans_info = NULL; } } } if (usb_trans_info && usb_trans_info->request_in) { ti = proto_tree_add_uint(tree, hf_usb_request_in, tvb, 0, 0, usb_trans_info->request_in); PROTO_ITEM_SET_GENERATED(ti); t = pinfo->abs_ts; nstime_delta(&deltat, &t, &usb_trans_info->req_time); ti = proto_tree_add_time(tree, hf_usb_time, tvb, 0, 0, &deltat); PROTO_ITEM_SET_GENERATED(ti); } } return usb_trans_info; } /* dissect a group of isochronous packets inside an usb packet in usbpcap format */ static gint dissect_usbpcap_iso_packets(packet_info *pinfo _U_, proto_tree *urb_tree, guint8 urb_type, tvbuff_t *tvb, gint offset, guint32 win32_data_len, usb_conv_info_t 
*usb_conv_info)
{
    guint32     i;
    guint32     num_packets;
    guint32     data_start_offset;
    proto_item *urb_tree_ti;

    proto_tree_add_item(urb_tree, hf_usb_win32_iso_start_frame, tvb, offset, 4, ENC_LITTLE_ENDIAN);
    offset += 4;

    num_packets = tvb_get_letohl(tvb, offset);
    proto_tree_add_item(urb_tree, hf_usb_win32_iso_num_packets, tvb, offset, 4, ENC_LITTLE_ENDIAN);
    offset += 4;

    proto_tree_add_item(urb_tree, hf_usb_win32_iso_error_count, tvb, offset, 4, ENC_LITTLE_ENDIAN);
    offset += 4;

    data_start_offset = offset + 12 * num_packets;
    urb_tree_ti = proto_tree_get_parent(urb_tree);
    proto_item_set_len(urb_tree_ti, data_start_offset);

    for (i = 0; i < num_packets; i++) {
        guint32     this_offset;
        guint32     next_offset;
        guint32     iso_len;
        proto_item *iso_packet_ti, *ti;
        proto_tree *iso_packet_tree;

        iso_packet_ti = proto_tree_add_protocol_format(
                proto_tree_get_root(urb_tree), proto_usb, tvb, offset, 12,
                "USB isochronous packet");
        iso_packet_tree = proto_item_add_subtree(iso_packet_ti, ett_usb_win32_iso_packet);

        this_offset = tvb_get_letohl(tvb, offset);
        if (num_packets - i == 1) {
            /* this is the last packet */
            next_offset = win32_data_len;
        } else {
            /* there is a next packet */
            next_offset = tvb_get_letohl(tvb, offset + 12);
        }

        if (next_offset > this_offset) {
            iso_len = next_offset - this_offset;
        } else {
            iso_len = 0;
        }

        /* If this packet does not contain isochronous data, do not try to display it */
        if (!((usb_conv_info->is_request && usb_conv_info->direction==P2P_DIR_SENT) ||
                    (!usb_conv_info->is_request && usb_conv_info->direction==P2P_DIR_RECV))) {
            iso_len = 0;
        }

        proto_tree_add_item(iso_packet_tree, hf_usb_win32_iso_offset, tvb, offset, 4, ENC_LITTLE_ENDIAN);
        offset += 4;

        ti = proto_tree_add_item(iso_packet_tree, hf_usb_win32_iso_length, tvb, offset, 4, ENC_LITTLE_ENDIAN);
        if (usb_conv_info->direction==P2P_DIR_SENT) {
            /* Isochronous OUT transfer */
            proto_item_append_text(ti, " (not used)");
        } else {
            /* Isochronous IN transfer.
             * Length field is being set by host controller.
*/ if (usb_conv_info->is_request) { /* Length was not yet set */ proto_item_append_text(ti, " (irrelevant)"); } else { /* Length was set and (should be) valid */ proto_item_append_text(ti, " (relevant)"); iso_len = tvb_get_letohl(tvb, offset); } } offset += 4; ti = proto_tree_add_item(iso_packet_tree, hf_usb_win32_iso_status, tvb, offset, 4, ENC_LITTLE_ENDIAN); if (urb_type == URB_SUBMIT) { proto_item_append_text(ti, " (irrelevant)"); } else { proto_item_append_text(ti, " (relevant)"); } offset += 4; if (iso_len && data_start_offset + this_offset + iso_len <= tvb_captured_length(tvb)) { proto_tree_add_item(iso_packet_tree, hf_usb_iso_data, tvb, (gint)(data_start_offset + this_offset), (gint)iso_len, ENC_NA); proto_tree_set_appendix(iso_packet_tree, tvb, (gint)(data_start_offset + this_offset), (gint)iso_len); } } if ((usb_conv_info->is_request && usb_conv_info->direction==P2P_DIR_SENT) || (!usb_conv_info->is_request && usb_conv_info->direction==P2P_DIR_RECV)) { /* We have dissected all the isochronous data */ offset += win32_data_len; } return offset; } static gint dissect_linux_usb_iso_transfer(packet_info *pinfo _U_, proto_tree *urb_tree, usb_header_t header_type, tvbuff_t *tvb, gint offset, usb_conv_info_t *usb_conv_info) { guint32 iso_numdesc = 0; proto_item *tii; guint32 val32; guint32 i; guint data_base; guint32 iso_status; guint32 iso_off = 0; guint32 iso_len = 0; tii = proto_tree_add_uint(urb_tree, hf_usb_bInterfaceClass, tvb, offset, 0, usb_conv_info->interfaceClass); PROTO_ITEM_SET_GENERATED(tii); /* All fields which belong to Linux usbmon headers are in host-endian * byte order. The fields coming from the USB communication are in little * endian format (see usb_20.pdf, chapter 8.1 Byte/Bit ordering). * * When a capture file is transferred to a host with different endianness * than packet was captured then the necessary swapping happens in * wiretap/pcap-common.c, pcap_byteswap_linux_usb_pseudoheader(). 
*/ /* iso urbs on linux can't possibly contain a setup packet see mon_bin_event() in the linux kernel */ /* Process ISO related fields (usbmon_packet.iso). The fields are * in host endian byte order so use tvb_memcopy() and * proto_tree_add_uint() pair. */ tvb_memcpy(tvb, (guint8 *)&val32, offset, 4); proto_tree_add_uint(urb_tree, hf_usb_iso_error_count, tvb, offset, 4, val32); offset += 4; tvb_memcpy(tvb, (guint8 *)&iso_numdesc, offset, 4); proto_tree_add_uint(urb_tree, hf_usb_iso_numdesc, tvb, offset, 4, iso_numdesc); offset += 4; if (header_type == USB_HEADER_LINUX_64_BYTES) { offset = dissect_linux_usb_pseudo_header_ext(tvb, offset, pinfo, urb_tree); } data_base = offset + iso_numdesc*16; for (i = 0; i<iso_numdesc; i++) { proto_item *iso_desc_ti; proto_tree *iso_desc_tree; guint32 iso_pad; /* Fetch ISO descriptor fields stored in host endian byte order. */ tvb_memcpy(tvb, (guint8 *)&iso_status, offset, 4); tvb_memcpy(tvb, (guint8 *)&iso_off, offset+4, 4); tvb_memcpy(tvb, (guint8 *)&iso_len, offset+8, 4); iso_desc_ti = proto_tree_add_protocol_format(urb_tree, proto_usb, tvb, offset, 16, "USB isodesc %u [%s]", i, val_to_str_ext(iso_status, &usb_urb_status_vals_ext, "Error %d")); if (iso_len > 0) proto_item_append_text(iso_desc_ti, " (%u bytes)", iso_len); iso_desc_tree = proto_item_add_subtree(iso_desc_ti, ett_usb_isodesc); proto_tree_add_int(iso_desc_tree, hf_usb_iso_status, tvb, offset, 4, iso_status); offset += 4; proto_tree_add_uint(iso_desc_tree, hf_usb_iso_off, tvb, offset, 4, iso_off); offset += 4; proto_tree_add_uint(iso_desc_tree, hf_usb_iso_len, tvb, offset, 4, iso_len); offset += 4; /* Show the ISO data if we captured them and either the status is OK or the packet is sent from host to device. The Linux kernel sets the status field in outgoing isochronous URBs to -EXDEV and fills the data part with valid data. 
*/ if ((pinfo->p2p_dir==P2P_DIR_SENT || !iso_status) && iso_len && data_base + iso_off + iso_len <= tvb_captured_length(tvb)) { proto_tree_add_item(iso_desc_tree, hf_usb_iso_data, tvb, data_base + iso_off, iso_len, ENC_NA); proto_tree_set_appendix(iso_desc_tree, tvb, (gint)(data_base+iso_off), (gint)iso_len); } tvb_memcpy(tvb, (guint8 *)&iso_pad, offset, 4); proto_tree_add_uint(iso_desc_tree, hf_usb_iso_pad, tvb, offset, 4, iso_pad); offset += 4; } /* we jump to the end of the last iso data chunk this assumes that the iso data starts immediately after the iso descriptors we have to use the offsets from the last iso descriptor, we can't keep track of the offset ourselves as there may be gaps between data packets in the transfer buffer */ return data_base+iso_off+iso_len; } static gint dissect_usbip_iso_transfer(packet_info *pinfo _U_, proto_tree *urb_tree, tvbuff_t *tvb, gint offset, guint32 iso_numdesc, guint32 desc_offset, usb_conv_info_t *usb_conv_info) { proto_item *tii; guint32 i; guint data_base; guint32 iso_off = 0; guint32 iso_len = 0; tii = proto_tree_add_uint(urb_tree, hf_usb_bInterfaceClass, tvb, offset, 0, usb_conv_info->interfaceClass); PROTO_ITEM_SET_GENERATED(tii); /* All fields which belong to usbip are in big-endian byte order. * unlike the linux kernel, the usb isoc descriptor is appended at * the end of the isoc data. We have to reassemble the pdus and jump * to the end (actual_length) and the remaining data is the isoc * descriptor. 
*/ data_base = offset; for (i = 0; i<iso_numdesc; i++) { proto_item *iso_desc_ti; proto_tree *iso_desc_tree; guint32 iso_status; iso_status = tvb_get_ntohl(tvb, desc_offset + 12); iso_desc_ti = proto_tree_add_protocol_format(urb_tree, proto_usb, tvb, desc_offset, 16, "USB isodesc %u [%s]", i, val_to_str_ext(iso_status, &usb_urb_status_vals_ext, "Error %d")); iso_desc_tree = proto_item_add_subtree(iso_desc_ti, ett_usb_isodesc); proto_tree_add_item_ret_uint(iso_desc_tree, hf_usb_iso_off, tvb, desc_offset, 4, ENC_BIG_ENDIAN, &iso_off); desc_offset += 4; proto_tree_add_item(iso_desc_tree, hf_usb_iso_len, tvb, desc_offset, 4, ENC_BIG_ENDIAN); desc_offset += 4; proto_tree_add_item_ret_uint(iso_desc_tree, hf_usb_iso_actual_len, tvb, desc_offset, 4, ENC_BIG_ENDIAN, &iso_len); desc_offset += 4; if (iso_len > 0) proto_item_append_text(iso_desc_ti, " (%u bytes)", iso_len); proto_tree_add_uint(iso_desc_tree, hf_usb_iso_status, tvb, desc_offset, 4, iso_status); desc_offset += 4; /* Show the ISO data if we captured them and either the status is OK or the packet is sent from host to device. The Linux kernel sets the status field in outgoing isochronous URBs to -EXDEV and fills the data part with valid data. 
*/ if ((pinfo->p2p_dir==P2P_DIR_SENT || !iso_status) && iso_len && data_base + iso_off + iso_len <= tvb_reported_length(tvb)) { proto_tree_add_item(iso_desc_tree, hf_usb_iso_data, tvb, (guint) data_base + iso_off, iso_len, ENC_NA); proto_tree_set_appendix(iso_desc_tree, tvb, (guint) data_base + iso_off, (gint)iso_len); } } return desc_offset; } static gint dissect_usb_payload(tvbuff_t *tvb, packet_info *pinfo, proto_tree *parent, proto_tree *tree, usb_conv_info_t *usb_conv_info, guint8 urb_type, gint offset, guint16 device_address) { wmem_tree_key_t key[4]; guint32 k_frame_number; guint32 k_device_address; guint32 k_bus_id; device_product_data_t *device_product_data = NULL; device_protocol_data_t *device_protocol_data = NULL; tvbuff_t *next_tvb = NULL; k_frame_number = pinfo->num; k_device_address = device_address; k_bus_id = usb_conv_info->bus_id; key[0].length = 1; key[0].key = &k_device_address; key[1].length = 1; key[1].key = &k_bus_id; key[2].length = 1; key[2].key = &k_frame_number; key[3].length = 0; key[3].key = NULL; device_product_data = (device_product_data_t *) wmem_tree_lookup32_array_le(device_to_product_table, key); if (device_product_data && device_product_data->bus_id == usb_conv_info->bus_id && device_product_data->device_address == device_address) { p_add_proto_data(pinfo->pool, pinfo, proto_usb, USB_VENDOR_ID, GUINT_TO_POINTER((guint)device_product_data->vendor)); p_add_proto_data(pinfo->pool, pinfo, proto_usb, USB_PRODUCT_ID, GUINT_TO_POINTER((guint)device_product_data->product)); } device_protocol_data = (device_protocol_data_t *) wmem_tree_lookup32_array_le(device_to_protocol_table, key); if (device_protocol_data && device_protocol_data->bus_id == usb_conv_info->bus_id && device_protocol_data->device_address == device_address) { p_add_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_CLASS, GUINT_TO_POINTER(device_protocol_data->protocol >> 16)); p_add_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_SUBCLASS, 
GUINT_TO_POINTER((device_protocol_data->protocol >> 8) & 0xFF)); p_add_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_PROTOCOL, GUINT_TO_POINTER(device_protocol_data->protocol & 0xFF)); usb_conv_info->device_protocol = device_protocol_data->protocol; } p_add_proto_data(pinfo->pool, pinfo, proto_usb, USB_BUS_ID, GUINT_TO_POINTER((guint)usb_conv_info->bus_id)); p_add_proto_data(pinfo->pool, pinfo, proto_usb, USB_DEVICE_ADDRESS, GUINT_TO_POINTER((guint)device_address)); if (tvb_captured_length_remaining(tvb, offset) > 0) { next_tvb = tvb_new_subset_remaining(tvb, offset); offset += try_dissect_next_protocol(parent, next_tvb, pinfo, usb_conv_info, urb_type, tree); } if (tvb_captured_length_remaining(tvb, offset) > 0) { /* There is still leftover capture data to add (padding?) */ proto_tree_add_item(parent, hf_usb_capdata, tvb, offset, -1, ENC_NA); } return offset; } static int dissect_freebsd_usb(tvbuff_t *tvb, packet_info *pinfo, proto_tree *parent, void *data _U_) { int offset = 0; proto_item *ti; proto_tree *tree = NULL, *frame_tree = NULL; guint32 nframes; guint32 i; col_set_str(pinfo->cinfo, COL_PROTOCOL, "USB"); /* add usb hdr*/ if (parent) { ti = proto_tree_add_protocol_format(parent, proto_usb, tvb, 0, 128, "USB URB"); tree = proto_item_add_subtree(ti, ett_usb_hdr); } proto_tree_add_item(tree, hf_usb_totlen, tvb, 0, 4, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_busunit, tvb, 4, 4, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_address, tvb, 8, 1, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_mode, tvb, 9, 1, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_freebsd_urb_type, tvb, 10, 1, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_freebsd_transfer_type, tvb, 11, 1, ENC_LITTLE_ENDIAN); proto_tree_add_bitmask(tree, tvb, 12, hf_usb_xferflags, ett_usb_xferflags, usb_xferflags_fields, ENC_LITTLE_ENDIAN); proto_tree_add_bitmask(tree, tvb, 16, hf_usb_xferstatus, ett_usb_xferstatus, usb_xferstatus_fields, ENC_LITTLE_ENDIAN); 
proto_tree_add_item(tree, hf_usb_error, tvb, 20, 4, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_interval, tvb, 24, 4, ENC_LITTLE_ENDIAN); proto_tree_add_item_ret_uint(tree, hf_usb_nframes, tvb, 28, 4, ENC_LITTLE_ENDIAN, &nframes); proto_tree_add_item(tree, hf_usb_packet_size, tvb, 32, 4, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_packet_count, tvb, 36, 4, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_endpoint_number, tvb, 40, 4, ENC_LITTLE_ENDIAN); proto_tree_add_item(tree, hf_usb_speed, tvb, 44, 1, ENC_LITTLE_ENDIAN); offset += 128; for (i = 0; i < nframes; i++) { guint32 framelen; guint64 frameflags; frame_tree = proto_tree_add_subtree_format(tree, tvb, offset, -1, ett_usb_frame, &ti, "Frame %u", i); proto_tree_add_item_ret_uint(frame_tree, hf_usb_frame_length, tvb, offset, 4, ENC_LITTLE_ENDIAN, &framelen); offset += 4; proto_tree_add_bitmask_ret_uint64(frame_tree, tvb, offset, hf_usb_frame_flags, ett_usb_frame_flags, usb_frame_flags_fields, ENC_LITTLE_ENDIAN, &frameflags); offset += 4; if (frameflags & FREEBSD_FRAMEFLAG_DATA_FOLLOWS) { /* * XXX - ultimately, we should dissect this data. 
*/ proto_tree_add_item(frame_tree, hf_usb_frame_data, tvb, offset, framelen, ENC_NA); offset += (framelen + 3) & ~3; } proto_item_set_end(ti, tvb, offset); } return tvb_captured_length(tvb); } void dissect_usb_common(tvbuff_t *tvb, packet_info *pinfo, proto_tree *parent, usb_header_t header_type, void *extra_data) { gint offset = 0; int endpoint; guint8 urb_type; guint32 win32_data_len = 0; guint32 iso_numdesc = 0; guint32 desc_offset = 0; proto_item *urb_tree_ti; proto_tree *tree; proto_item *item; usb_conv_info_t *usb_conv_info; conversation_t *conversation; guint16 device_address; guint16 bus_id; guint8 usbpcap_control_stage = 0; guint64 usb_id; struct mausb_header *ma_header = NULL; struct usbip_header *ip_header = NULL; /* the goal is to get the conversation struct as early as possible and store all status values in this struct at first, we read the fields required to create/identify the right conversation struct */ switch (header_type) { case USB_HEADER_LINUX_48_BYTES: case USB_HEADER_LINUX_64_BYTES: urb_type = tvb_get_guint8(tvb, 8); endpoint = tvb_get_guint8(tvb, 10) & 0x7F; device_address = (guint16)tvb_get_guint8(tvb, 11); bus_id = tvb_get_letohs(tvb, 12); break; case USB_HEADER_USBPCAP: urb_type = tvb_get_guint8(tvb, 16) & 0x01 ? URB_COMPLETE : URB_SUBMIT; device_address = tvb_get_letohs(tvb, 19); endpoint = tvb_get_guint8(tvb, 21) & 0x7F; bus_id = tvb_get_letohs(tvb, 17); break; case USB_HEADER_MAUSB: ma_header = (struct mausb_header *) extra_data; urb_type = mausb_is_from_host(ma_header) ? URB_SUBMIT : URB_COMPLETE; device_address = mausb_ep_handle_dev_addr(ma_header->handle); endpoint = mausb_ep_handle_ep_num(ma_header->handle); bus_id = mausb_ep_handle_bus_num(ma_header->handle); break; case USB_HEADER_USBIP: ip_header = (struct usbip_header *) extra_data; urb_type = tvb_get_ntohl(tvb, 0) == 1 ? 
URB_SUBMIT : URB_COMPLETE; device_address = ip_header->devid; bus_id = ip_header->busid; endpoint = ip_header->ep; break; default: return; /* invalid USB pseudo header */ } col_set_str(pinfo->cinfo, COL_PROTOCOL, "USB"); urb_tree_ti = proto_tree_add_protocol_format(parent, proto_usb, tvb, 0, -1, "USB URB"); tree = proto_item_add_subtree(urb_tree_ti, ett_usb_hdr); usb_set_addr(tree, tvb, pinfo, bus_id, device_address, endpoint, (urb_type == URB_SUBMIT)); conversation = get_usb_conversation(pinfo, &pinfo->src, &pinfo->dst, pinfo->srcport, pinfo->destport); usb_conv_info = get_usb_conv_info(conversation); clear_usb_conv_tmp_data(usb_conv_info); switch (header_type) { case USB_HEADER_LINUX_48_BYTES: case USB_HEADER_LINUX_64_BYTES: proto_item_set_len(urb_tree_ti, (header_type == USB_HEADER_LINUX_64_BYTES) ? 64 : 48); offset = dissect_linux_usb_pseudo_header(tvb, pinfo, tree, usb_conv_info, &usb_id); break; case USB_HEADER_USBPCAP: offset = dissect_usbpcap_buffer_packet_header(tvb, pinfo, tree, usb_conv_info, &win32_data_len, &usb_id); /* the length that we're setting here might have to be corrected if there's a transfer-specific pseudo-header following */ proto_item_set_len(urb_tree_ti, offset); break; case USB_HEADER_MAUSB: /* MA USB header gets dissected earlier, just set conversation variables */ offset = MAUSB_DPH_LENGTH; mausb_set_usb_conv_info(usb_conv_info, ma_header); usb_id = 0; break; case USB_HEADER_USBIP: iso_numdesc = tvb_get_ntohl(tvb, 0x20); usb_conv_info->transfer_type = endpoint == 0 ? URB_CONTROL : (iso_numdesc > 0 ? URB_ISOCHRONOUS : URB_UNKNOWN); usb_conv_info->direction = ip_header->dir == USBIP_DIR_OUT ? P2P_DIR_SENT : P2P_DIR_RECV; usb_conv_info->is_setup = endpoint == 0 ? (tvb_get_ntoh64(tvb, 0x28) != G_GUINT64_CONSTANT(0)) : FALSE; usb_conv_info->is_request = (urb_type==URB_SUBMIT); offset = usb_conv_info->is_setup ? 
USBIP_HEADER_WITH_SETUP_LEN : USBIP_HEADER_LEN; /* The ISOC descriptor is located at the end of the isoc frame behind the isoc data. */ if ((usb_conv_info->is_request && usb_conv_info->direction == USBIP_DIR_OUT) || (!usb_conv_info->is_request && usb_conv_info->direction == USBIP_DIR_IN)) { desc_offset += tvb_get_ntohl(tvb, 0x18); } desc_offset += offset; usb_id = 0; break; default: usb_id = 0; break; } usb_conv_info->usb_trans_info = usb_get_trans_info(tvb, pinfo, tree, header_type, usb_conv_info, usb_id); if (usb_conv_info->transfer_type != URB_CONTROL) { usb_tap_queue_packet(pinfo, urb_type, usb_conv_info); } switch(usb_conv_info->transfer_type) { case URB_BULK: case URB_INTERRUPT: item = proto_tree_add_uint(tree, hf_usb_bInterfaceClass, tvb, 0, 0, usb_conv_info->interfaceClass); PROTO_ITEM_SET_GENERATED(item); switch (header_type) { case USB_HEADER_LINUX_48_BYTES: case USB_HEADER_LINUX_64_BYTES: /* bulk and interrupt transfers never contain a setup packet */ proto_tree_add_item(tree, hf_usb_urb_unused_setup_header, tvb, offset, 8, ENC_NA); offset += 8; if (header_type == USB_HEADER_LINUX_64_BYTES) { offset = dissect_linux_usb_pseudo_header_ext(tvb, offset, pinfo, tree); } break; case USB_HEADER_USBPCAP: break; case USB_HEADER_MAUSB: break; case USB_HEADER_USBIP: break; } break; case URB_CONTROL: if (header_type == USB_HEADER_USBPCAP) { proto_tree_add_item(tree, hf_usb_control_stage, tvb, offset, 1, ENC_LITTLE_ENDIAN); usbpcap_control_stage = tvb_get_guint8(tvb, offset); if (usbpcap_control_stage == USB_CONTROL_STAGE_SETUP) usb_conv_info->is_setup = TRUE; offset++; proto_item_set_len(urb_tree_ti, offset); } if (usb_conv_info->is_request) { if (usb_conv_info->is_setup) { offset = dissect_usb_setup_request(pinfo, tree, tvb, offset, urb_type, usb_conv_info, header_type); } else { switch (header_type) { case USB_HEADER_LINUX_48_BYTES: case USB_HEADER_LINUX_64_BYTES: proto_tree_add_item(tree, hf_usb_urb_unused_setup_header, tvb, offset, 8, ENC_NA); offset += 8; if 
(header_type == USB_HEADER_LINUX_64_BYTES) { offset = dissect_linux_usb_pseudo_header_ext(tvb, offset, pinfo, tree); } break; case USB_HEADER_USBPCAP: break; case USB_HEADER_MAUSB: break; case USB_HEADER_USBIP: break; } } } else { /* this is a response */ switch (header_type) { case USB_HEADER_LINUX_48_BYTES: case USB_HEADER_LINUX_64_BYTES: /* Skip setup header - it's never applicable for responses */ proto_tree_add_item(tree, hf_usb_urb_unused_setup_header, tvb, offset, 8, ENC_NA); offset += 8; if (header_type == USB_HEADER_LINUX_64_BYTES) { offset = dissect_linux_usb_pseudo_header_ext(tvb, offset, pinfo, tree); } break; case USB_HEADER_USBPCAP: /* Check if this is status stage */ if ((usb_conv_info->usb_trans_info) && (usbpcap_control_stage == USB_CONTROL_STAGE_STATUS)) { col_add_fstr(pinfo->cinfo, COL_INFO, "%s Status", val_to_str_ext(usb_conv_info->usb_trans_info->setup.request, &setup_request_names_vals_ext, "Unknown type %x")); /* There is no data to dissect */ return; } break; case USB_HEADER_MAUSB: break; case USB_HEADER_USBIP: break; } offset = dissect_usb_setup_response(pinfo, tree, tvb, offset, urb_type, usb_conv_info); } break; case URB_ISOCHRONOUS: switch (header_type) { case USB_HEADER_LINUX_48_BYTES: case USB_HEADER_LINUX_64_BYTES: offset = dissect_linux_usb_iso_transfer(pinfo, tree, header_type, tvb, offset, usb_conv_info); break; case USB_HEADER_USBPCAP: offset = dissect_usbpcap_iso_packets(pinfo, tree, urb_type, tvb, offset, win32_data_len, usb_conv_info); break; case USB_HEADER_MAUSB: break; case USB_HEADER_USBIP: offset = dissect_usbip_iso_transfer(pinfo, tree, tvb, offset, iso_numdesc, desc_offset, usb_conv_info); break; } break; default: /* unknown transfer type */ switch (header_type) { case USB_HEADER_LINUX_48_BYTES: case USB_HEADER_LINUX_64_BYTES: proto_tree_add_item(tree, hf_usb_urb_unused_setup_header, tvb, offset, 8, ENC_NA); offset += 8; if (header_type == USB_HEADER_LINUX_64_BYTES) { offset = dissect_linux_usb_pseudo_header_ext(tvb, 
offset, pinfo, tree); } break; case USB_HEADER_USBPCAP: break; case USB_HEADER_MAUSB: break; case USB_HEADER_USBIP: break; } break; } dissect_usb_payload(tvb, pinfo, parent, tree, usb_conv_info, urb_type, offset, device_address); } static int dissect_linux_usb(tvbuff_t *tvb, packet_info *pinfo, proto_tree *parent, void* data _U_) { dissect_usb_common(tvb, pinfo, parent, USB_HEADER_LINUX_48_BYTES, NULL); return tvb_captured_length(tvb); } static int dissect_linux_usb_mmapped(tvbuff_t *tvb, packet_info *pinfo, proto_tree *parent, void* data _U_) { dissect_usb_common(tvb, pinfo, parent, USB_HEADER_LINUX_64_BYTES, NULL); return tvb_captured_length(tvb); } static int dissect_win32_usb(tvbuff_t *tvb, packet_info *pinfo, proto_tree *parent, void* data _U_) { dissect_usb_common(tvb, pinfo, parent, USB_HEADER_USBPCAP, NULL); return tvb_captured_length(tvb); } void proto_register_usb(void) { module_t *usb_module; static hf_register_info hf[] = { /* USB packet pseudoheader members */ { &hf_usb_totlen, { "Total length", "usb.totlen", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_busunit, { "Host controller unit number", "usb.busunit", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_address, { "USB device index", "usb.address", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_mode, { "Mode of transfer", "usb.transfer_mode", FT_UINT8, BASE_DEC, VALS(usb_freebsd_transfer_mode_vals), 0x0, NULL, HFILL }}, { &hf_usb_freebsd_urb_type, { "URB type", "usb.freebsd_type", FT_UINT8, BASE_DEC, VALS(usb_freebsd_urb_type_vals), 0x0, NULL, HFILL }}, { &hf_usb_freebsd_transfer_type, { "URB transfer type", "usb.freebsd_transfer_type", FT_UINT8, BASE_HEX, VALS(usb_freebsd_transfer_type_vals), 0x0, NULL, HFILL }}, { &hf_usb_xferflags, { "Transfer flags", "usb.xferflags", FT_UINT32, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_xferflags_force_short_xfer, { "Force short transfer", "usb.xferflags.force_short_xfer", FT_BOOLEAN, 32, NULL, FREEBSD_FLAG_FORCE_SHORT_XFER, NULL, 
HFILL }}, { &hf_usb_xferflags_short_xfer_ok, { "Short transfer OK", "usb.xferflags.short_xfer_ok", FT_BOOLEAN, 32, NULL, FREEBSD_FLAG_SHORT_XFER_OK, NULL, HFILL }}, { &hf_usb_xferflags_short_frames_ok, { "Short frames OK", "usb.xferflags.short_frames_ok", FT_BOOLEAN, 32, NULL, FREEBSD_FLAG_SHORT_FRAMES_OK, NULL, HFILL }}, { &hf_usb_xferflags_pipe_bof, { "Pipe BOF", "usb.xferflags.pipe_bof", FT_BOOLEAN, 32, NULL, FREEBSD_FLAG_PIPE_BOF, NULL, HFILL }}, { &hf_usb_xferflags_proxy_buffer, { "Proxy buffer", "usb.xferflags.proxy_buffer", FT_BOOLEAN, 32, NULL, FREEBSD_FLAG_PROXY_BUFFER, NULL, HFILL }}, { &hf_usb_xferflags_ext_buffer, { "External buffer", "usb.xferflags.ext_buffer", FT_BOOLEAN, 32, NULL, FREEBSD_FLAG_EXT_BUFFER, NULL, HFILL }}, { &hf_usb_xferflags_manual_status, { "Manual status", "usb.xferflags.manual_status", FT_BOOLEAN, 32, NULL, FREEBSD_FLAG_MANUAL_STATUS, NULL, HFILL }}, { &hf_usb_xferflags_no_pipe_ok, { "No pipe OK", "usb.xferflags.no_pipe_ok", FT_BOOLEAN, 32, NULL, FREEBSD_FLAG_NO_PIPE_OK, NULL, HFILL }}, { &hf_usb_xferflags_stall_pipe, { "Stall pipe", "usb.xferflags.stall_pipe", FT_BOOLEAN, 32, NULL, FREEBSD_FLAG_STALL_PIPE, NULL, HFILL }}, { &hf_usb_xferstatus, { "Transfer status", "usb.xferstatus", FT_UINT32, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_xferstatus_open, { "Pipe has been opened", "usb.xferstatus.open", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_OPEN, NULL, HFILL }}, { &hf_usb_xferstatus_transferring, { "Transfer in progress", "usb.xferstatus.transferring", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_TRANSFERRING, NULL, HFILL }}, { &hf_usb_xferstatus_did_dma_delay, { "Waited for hardware DMA", "usb.xferstatus.did_dma_delay", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_DID_DMA_DELAY, NULL, HFILL }}, { &hf_usb_xferstatus_did_close, { "Transfer closed", "usb.xferstatus.did_close", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_DID_CLOSE, NULL, HFILL }}, { &hf_usb_xferstatus_draining, { "Draining transfer", "usb.xferstatus.draining", FT_BOOLEAN, 32, NULL, 
FREEBSD_STATUS_DRAINING, NULL, HFILL }}, { &hf_usb_xferstatus_started, { "Transfer started", "usb.xferstatus.started", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_STARTED, "Whether the transfer is started or stopped", HFILL }}, { &hf_usb_xferstatus_bw_reclaimed, { "Bandwidth reclaimed", "usb.xferstatus.bw_reclaimed", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_BW_RECLAIMED, NULL, HFILL }}, { &hf_usb_xferstatus_control_xfr, { "Control transfer", "usb.xferstatus.control_xfr", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_CONTROL_XFR, NULL, HFILL }}, { &hf_usb_xferstatus_control_hdr, { "Control header being sent", "usb.xferstatus.control_hdr", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_CONTROL_HDR, NULL, HFILL }}, { &hf_usb_xferstatus_control_act, { "Control transfer active", "usb.xferstatus.control_act", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_CONTROL_ACT, NULL, HFILL }}, { &hf_usb_xferstatus_control_stall, { "Control transfer should be stalled", "usb.xferstatus.control_stall", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_CONTROL_STALL, NULL, HFILL }}, { &hf_usb_xferstatus_short_frames_ok, { "Short frames OK", "usb.xferstatus.short_frames_ok", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_SHORT_FRAMES_OK, NULL, HFILL }}, { &hf_usb_xferstatus_short_xfer_ok, { "Short transfer OK", "usb.xferstatus.short_xfer_ok", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_SHORT_XFER_OK, NULL, HFILL }}, { &hf_usb_xferstatus_bdma_enable, { "BUS-DMA enabled", "usb.xferstatus.bdma_enable", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_BDMA_ENABLE, NULL, HFILL }}, { &hf_usb_xferstatus_bdma_no_post_sync, { "BUS-DMA post sync op not done", "usb.xferstatus.bdma_no_post_sync", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_BDMA_NO_POST_SYNC, NULL, HFILL }}, { &hf_usb_xferstatus_bdma_setup, { "BUS-DMA set up", "usb.xferstatus.bdma_setup", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_BDMA_SETUP, NULL, HFILL }}, { &hf_usb_xferstatus_isochronous_xfr, { "Isochronous transfer", "usb.xferstatus.isochronous_xfr", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_ISOCHRONOUS_XFR, NULL, HFILL }}, { 
&hf_usb_xferstatus_curr_dma_set, { "Current DMA set", "usb.xferstatus.curr_dma_set", FT_UINT32, BASE_DEC, NULL, FREEBSD_STATUS_CURR_DMA_SET, NULL, HFILL }}, { &hf_usb_xferstatus_can_cancel_immed, { "Transfer can be cancelled immediately", "usb.xferstatus.can_cancel_immed", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_CAN_CANCEL_IMMED, NULL, HFILL }}, { &hf_usb_xferstatus_doing_callback, { "Executing the callback", "usb.xferstatus.doing_callback", FT_BOOLEAN, 32, NULL, FREEBSD_STATUS_DOING_CALLBACK, NULL, HFILL }}, { &hf_usb_error, { "Error", "usb.error", FT_UINT32, BASE_DEC, VALS(usb_freebsd_err_vals), 0x0, NULL, HFILL }}, { &hf_usb_interval, { "Interval", "usb.interval", FT_UINT32, BASE_DEC, NULL, 0x0, "Interval (ms)", HFILL }}, { &hf_usb_nframes, { "Number of following frames", "usb.nframes", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_packet_size, { "Packet size used", "usb.packet_size", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_packet_count, { "Packet count used", "usb.packet_count", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_speed, { "Speed", "usb.speed", FT_UINT8, BASE_DEC, VALS(usb_freebsd_speed_vals), 0x0, NULL, HFILL }}, { &hf_usb_frame_length, { "Frame length", "usb.frame.length", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_frame_flags, { "Frame flags", "usb.frame.flags", FT_UINT32, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_frame_flags_read, { "Data direction is read", "usb.frame.read", FT_BOOLEAN, 32, NULL, FREEBSD_FRAMEFLAG_READ, NULL, HFILL }}, { &hf_usb_frame_flags_data_follows, { "Frame contains data", "usb.frame.data_follows", FT_BOOLEAN, 32, NULL, FREEBSD_FRAMEFLAG_DATA_FOLLOWS, NULL, HFILL }}, { &hf_usb_frame_data, { "Frame data", "usb.frame.data", FT_BYTES, BASE_NONE, NULL, 0x0, NULL, HFILL }}, { &hf_usb_urb_id, { "URB id", "usb.urb_id", FT_UINT64, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_linux_urb_type, { "URB type", "usb.urb_type", FT_UINT8, BASE_DEC, VALS(usb_linux_urb_type_vals), 
0x0, NULL, HFILL }}, { &hf_usb_linux_transfer_type, { "URB transfer type", "usb.transfer_type", FT_UINT8, BASE_HEX, VALS(usb_linux_transfer_type_vals), 0x0, NULL, HFILL }}, { &hf_usb_endpoint_number, { "Endpoint", "usb.endpoint_number", FT_UINT8, BASE_HEX, NULL, 0x0, "USB endpoint number", HFILL }}, { &hf_usb_endpoint_direction, { "Direction", "usb.endpoint_number.direction", FT_UINT8, BASE_DEC, VALS(usb_endpoint_direction_vals), 0x80, "USB endpoint direction", HFILL }}, { &hf_usb_endpoint_number_value, { "Endpoint value", "usb.endpoint_number.endpoint", FT_UINT8, BASE_DEC, NULL, 0x7F, "USB endpoint value", HFILL }}, { &hf_usb_device_address, { "Device", "usb.device_address", FT_UINT8, BASE_DEC, NULL, 0x0, "USB device address", HFILL }}, { &hf_usb_bus_id, { "URB bus id", "usb.bus_id", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_setup_flag, { "Device setup request", "usb.setup_flag", FT_STRING, BASE_NONE, NULL, 0x0, "USB device setup request is relevant (0) or not", HFILL }}, { &hf_usb_data_flag, { "Data", "usb.data_flag", FT_STRING, BASE_NONE, NULL, 0x0, "USB data is present (0) or not", HFILL }}, { &hf_usb_urb_ts_sec, { "URB sec", "usb.urb_ts_sec", FT_UINT64, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_urb_ts_usec, { "URB usec", "usb.urb_ts_usec", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_urb_status, { "URB status", "usb.urb_status", FT_INT32, BASE_DEC|BASE_EXT_STRING, &usb_urb_status_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_urb_len, { "URB length [bytes]", "usb.urb_len", FT_UINT32, BASE_DEC, NULL, 0x0, "URB length in bytes", HFILL }}, { &hf_usb_urb_data_len, { "Data length [bytes]", "usb.data_len", FT_UINT32, BASE_DEC, NULL, 0x0, "URB data length in bytes", HFILL }}, { &hf_usb_urb_unused_setup_header, { "Unused Setup Header", "usb.unused_setup_header", FT_NONE, BASE_NONE, NULL, 0x0, NULL, HFILL }}, { &hf_usb_urb_interval, { "Interval", "usb.interval", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_urb_start_frame, { 
"Start frame", "usb.start_frame", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_urb_copy_of_transfer_flags, { "Copy of Transfer Flags", "usb.copy_of_transfer_flags", FT_UINT32, BASE_HEX, NULL, 0x0, NULL, HFILL }}, /* Win32 USBPcap pseudoheader */ { &hf_usb_win32_header_len, { "USBPcap pseudoheader length", "usb.usbpcap_header_len", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_irp_id, { "IRP ID", "usb.irp_id", FT_UINT64, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_usbd_status, { "IRP USBD_STATUS", "usb.usbd_status", FT_UINT32, BASE_HEX | BASE_EXT_STRING, &win32_usbd_status_vals_ext, 0x0, "USB request status value", HFILL }}, { &hf_usb_function, { "URB Function", "usb.function", FT_UINT16, BASE_HEX|BASE_EXT_STRING, &win32_urb_function_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_info, { "IRP information", "usb.irp_info", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_usbpcap_info_reserved, { "Reserved", "usb.irp_info.reserved", FT_UINT8, BASE_HEX, NULL, 0xFE, NULL, HFILL }}, { &hf_usb_usbpcap_info_direction, { "Direction", "usb.irp_info.direction", FT_UINT8, BASE_HEX, VALS(win32_usb_info_direction_vals), 0x01, NULL, HFILL }}, { &hf_usb_win32_device_address, { "Device address", "usb.device_address", FT_UINT16, BASE_DEC, NULL, 0x0, "Windows USB device address", HFILL }}, { &hf_usb_win32_data_len, { "Packet Data Length", "usb.data_len", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_control_stage, { "Control transfer stage", "usb.control_stage", FT_UINT8, BASE_DEC, VALS(usb_control_stage_vals), 0x0, NULL, HFILL }}, { &hf_usb_win32_iso_start_frame, { "Isochronous transfer start frame", "usb.win32.iso_frame", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_win32_iso_num_packets, { "Isochronous transfer number of packets", "usb.win32.iso_num_packets", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_win32_iso_error_count, { "Isochronous transfer error count", "usb.win32.iso_error_count", FT_UINT32, BASE_DEC, 
NULL, 0x0, NULL, HFILL }}, { &hf_usb_win32_iso_offset, { "ISO Data offset", "usb.win32.iso_offset", FT_UINT32, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_win32_iso_length, { "ISO Data length", "usb.win32.iso_data_len", FT_UINT32, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_win32_iso_status, { "ISO USBD status", "usb.win32.iso_status", FT_UINT32, BASE_HEX | BASE_EXT_STRING, &win32_usbd_status_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bmRequestType, { "bmRequestType", "usb.bmRequestType", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, /* Only used when response type cannot be determined */ { &hf_usb_control_response_generic, { "CONTROL response data", "usb.control.Response", FT_BYTES, BASE_NONE, NULL, 0x0, NULL, HFILL }}, { &hf_usb_request, { "bRequest", "usb.setup.bRequest", FT_UINT8, BASE_DEC | BASE_EXT_STRING, &setup_request_names_vals_ext, 0x0, NULL, HFILL }}, /* Same as hf_usb_request but no descriptive text */ { &hf_usb_request_unknown_class, { "bRequest", "usb.setup.bRequest", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_value, { "wValue", "usb.setup.wValue", FT_UINT16, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_index, { "wIndex", "usb.setup.wIndex", FT_UINT16, BASE_DEC_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_length, { "wLength", "usb.setup.wLength", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_device_wFeatureSelector, { "wFeatureSelector", "usb.setup.wFeatureSelector", FT_UINT16, BASE_DEC, VALS(usb_device_feature_selector_vals), 0x0, NULL, HFILL }}, { &hf_usb_interface_wFeatureSelector, { "wFeatureSelector", "usb.setup.wFeatureSelector", FT_UINT16, BASE_DEC, VALS(usb_interface_feature_selector_vals), 0x0, NULL, HFILL }}, { &hf_usb_endpoint_wFeatureSelector, { "wFeatureSelector", "usb.setup.wFeatureSelector", FT_UINT16, BASE_DEC, VALS(usb_endpoint_feature_selector_vals), 0x0, NULL, HFILL }}, { &hf_usb_wInterface, { "wInterface", "usb.setup.wInterface", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_wEndpoint, { 
"wEndpoint", "usb.setup.wEndpoint", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_wStatus, { "wStatus", "usb.setup.wStatus", FT_UINT16, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_wFrameNumber, { "wFrameNumber", "usb.setup.wFrameNumber", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, /* --------------------------------- */ { &hf_usb_iso_error_count, /* host endian byte order */ { "ISO error count", "usb.iso.error_count", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_iso_numdesc, /* host endian byte order */ { "Number of ISO descriptors", "usb.iso.numdesc", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, /* fields of struct mon_bin_isodesc from linux/drivers/usb/mon/mon_bin.c */ { &hf_usb_iso_status, /* host endian byte order */ { "Status", "usb.iso.iso_status", FT_INT32, BASE_DEC|BASE_EXT_STRING, &usb_urb_status_vals_ext, 0x0, "ISO descriptor status", HFILL }}, { &hf_usb_iso_off, /* host endian byte order */ { "Offset [bytes]", "usb.iso.iso_off", FT_UINT32, BASE_DEC, NULL, 0x0, "ISO data offset in bytes starting from the end of the last ISO descriptor", HFILL }}, { &hf_usb_iso_len, /* host endian byte order */ { "Length [bytes]", "usb.iso.iso_len", FT_UINT32, BASE_DEC, NULL, 0x0, "ISO data length in bytes", HFILL }}, { &hf_usb_iso_actual_len, { "Actual Length [bytes]", "usb.iso.iso_actual_len", FT_UINT32, BASE_DEC, NULL, 0x0, "ISO data actual length in bytes", HFILL }}, { &hf_usb_iso_pad, /* host endian byte order */ { "Padding", "usb.iso.pad", FT_UINT32, BASE_HEX, NULL, 0x0, "Padding field of ISO descriptor structure", HFILL }}, { &hf_usb_iso_data, {"ISO Data", "usb.iso.data", FT_BYTES, BASE_NONE, NULL, 0x0, NULL, HFILL }}, /* --------------------------------- */ #if 0 { &hf_usb_data_len, {"Application Data Length", "usb.data.length", FT_UINT32, BASE_DEC, NULL, 0x0, NULL, HFILL }}, #endif { &hf_usb_capdata, {"Leftover Capture Data", "usb.capdata", FT_BYTES, BASE_NONE, NULL, 0x0, "Padding added by the USB capture system", HFILL }}, { 
&hf_usb_bmRequestType_direction, { "Direction", "usb.bmRequestType.direction", FT_BOOLEAN, 8, TFS(&tfs_bmrequesttype_direction), USB_DIR_IN, NULL, HFILL }}, { &hf_usb_bmRequestType_type, { "Type", "usb.bmRequestType.type", FT_UINT8, BASE_HEX, VALS(bmrequesttype_type_vals), USB_TYPE_MASK, NULL, HFILL }}, { &hf_usb_bmRequestType_recipient, { "Recipient", "usb.bmRequestType.recipient", FT_UINT8, BASE_HEX, VALS(bmrequesttype_recipient_vals), 0x1f, NULL, HFILL }}, { &hf_usb_bDescriptorType, { "bDescriptorType", "usb.bDescriptorType", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, /* Only used when descriptor type cannot be determined */ { &hf_usb_get_descriptor_resp_generic, { "GET DESCRIPTOR Response data", "usb.getDescriptor.Response", FT_BYTES, BASE_NONE, NULL, 0x0, NULL, HFILL }}, { &hf_usb_descriptor_index, { "Descriptor Index", "usb.DescriptorIndex", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_language_id, { "Language Id", "usb.LanguageId", FT_UINT16, BASE_HEX|BASE_EXT_STRING,&usb_langid_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bLength, { "bLength", "usb.bLength", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bcdUSB, { "bcdUSB", "usb.bcdUSB", FT_UINT16, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bDeviceClass, { "bDeviceClass", "usb.bDeviceClass", FT_UINT8, BASE_HEX|BASE_EXT_STRING, &usb_class_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bDeviceSubClass, { "bDeviceSubClass", "usb.bDeviceSubClass", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bDeviceProtocol, { "bDeviceProtocol", "usb.bDeviceProtocol", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bMaxPacketSize0, { "bMaxPacketSize0", "usb.bMaxPacketSize0", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_idVendor, { "idVendor", "usb.idVendor", FT_UINT16, BASE_HEX | BASE_EXT_STRING, &ext_usb_vendors_vals, 0x0, NULL, HFILL }}, { &hf_usb_idProduct, { "idProduct", "usb.idProduct", FT_UINT16, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bcdDevice, { "bcdDevice", 
"usb.bcdDevice", FT_UINT16, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_iManufacturer, { "iManufacturer", "usb.iManufacturer", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_iProduct, { "iProduct", "usb.iProduct", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_iSerialNumber, { "iSerialNumber", "usb.iSerialNumber", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bNumConfigurations, { "bNumConfigurations", "usb.bNumConfigurations", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_wLANGID, { "wLANGID", "usb.wLANGID", FT_UINT16, BASE_HEX|BASE_EXT_STRING,&usb_langid_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bString, { "bString", "usb.bString", FT_STRING, BASE_NONE, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceNumber, { "bInterfaceNumber", "usb.bInterfaceNumber", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bAlternateSetting, { "bAlternateSetting", "usb.bAlternateSetting", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bNumEndpoints, { "bNumEndpoints", "usb.bNumEndpoints", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceClass, { "bInterfaceClass", "usb.bInterfaceClass", FT_UINT8, BASE_HEX|BASE_EXT_STRING, &usb_class_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceSubClass, { "bInterfaceSubClass", "usb.bInterfaceSubClass", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceSubClass_cdc, { "bInterfaceSubClass", "usb.bInterfaceSubClass", FT_UINT8, BASE_HEX | BASE_EXT_STRING, &ext_usb_com_subclass_vals, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceSubClass_hid, { "bInterfaceSubClass", "usb.bInterfaceSubClass", FT_UINT8, BASE_HEX | BASE_EXT_STRING, &usb_hid_subclass_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceSubClass_misc, { "bInterfaceSubClass", "usb.bInterfaceSubClass", FT_UINT8, BASE_HEX | BASE_EXT_STRING, &usb_misc_subclass_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceSubClass_app, { "bInterfaceSubClass", "usb.bInterfaceSubClass", FT_UINT8, BASE_HEX | 
BASE_EXT_STRING, &usb_app_subclass_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceProtocol, { "bInterfaceProtocol", "usb.bInterfaceProtocol", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceProtocol_cdc, { "bInterfaceProtocol", "usb.bInterfaceProtocol", FT_UINT8, BASE_HEX | BASE_EXT_STRING, &usb_cdc_protocol_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceProtocol_cdc_data, { "bInterfaceProtocol", "usb.bInterfaceProtocol", FT_UINT8, BASE_HEX | BASE_EXT_STRING, &usb_cdc_data_protocol_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceProtocol_hid_boot, { "bInterfaceProtocol", "usb.bInterfaceProtocol", FT_UINT8, BASE_HEX | BASE_EXT_STRING, &usb_hid_boot_protocol_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceProtocol_app_dfu, { "bInterfaceProtocol", "usb.bInterfaceProtocol", FT_UINT8, BASE_HEX | BASE_EXT_STRING, &usb_app_dfu_protocol_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceProtocol_app_irda, { "bInterfaceProtocol", "usb.bInterfaceProtocol", FT_UINT8, BASE_HEX | BASE_EXT_STRING, &usb_app_irda_protocol_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceProtocol_app_usb_test_and_measurement, { "bInterfaceProtocol", "usb.bInterfaceProtocol", FT_UINT8, BASE_HEX | BASE_EXT_STRING, &usb_app_usb_test_and_measurement_protocol_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_iInterface, { "iInterface", "usb.iInterface", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bEndpointAddress, { "bEndpointAddress", "usb.bEndpointAddress", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_configuration_bmAttributes, { "Configuration bmAttributes", "usb.configuration.bmAttributes", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bmAttributes, { "bmAttributes", "usb.bmAttributes", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bEndpointAttributeTransfer, { "Transfertype", "usb.bmAttributes.transfer", FT_UINT8, BASE_HEX, VALS(usb_bmAttributes_transfer_vals), 0x03, NULL, HFILL }}, { &hf_usb_bEndpointAttributeSynchonisation, 
{ "Synchronisationtype", "usb.bmAttributes.sync", FT_UINT8, BASE_HEX, VALS(usb_bmAttributes_sync_vals), 0x0c, NULL, HFILL }}, { &hf_usb_bEndpointAttributeBehaviour, { "Behaviourtype", "usb.bmAttributes.behaviour", FT_UINT8, BASE_HEX, VALS(usb_bmAttributes_behaviour_vals), 0x30, NULL, HFILL }}, { &hf_usb_wMaxPacketSize, { "wMaxPacketSize", "usb.wMaxPacketSize", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_wMaxPacketSize_size, { "Maximum Packet Size", "usb.wMaxPacketSize.size", FT_UINT16, BASE_DEC, NULL, 0x3FF, NULL, HFILL }}, { &hf_usb_wMaxPacketSize_slots, { "Transactions per microframe", "usb.wMaxPacketSize.slots", FT_UINT16, BASE_DEC, VALS(usb_wMaxPacketSize_slots_vals), (3<<11), NULL, HFILL }}, { &hf_usb_bInterval, { "bInterval", "usb.bInterval", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_wTotalLength, { "wTotalLength", "usb.wTotalLength", FT_UINT16, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bNumInterfaces, { "bNumInterfaces", "usb.bNumInterfaces", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bConfigurationValue, { "bConfigurationValue", "usb.bConfigurationValue", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_iConfiguration, { "iConfiguration", "usb.iConfiguration", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bMaxPower, { "bMaxPower", "usb.bMaxPower", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_configuration_legacy10buspowered, { "Must be 1", "usb.configuration.legacy10buspowered", FT_BOOLEAN, 8, TFS(&tfs_mustbeone), 0x80, "Legacy USB 1.0 bus powered", HFILL }}, { &hf_usb_configuration_selfpowered, { "Self-Powered", "usb.configuration.selfpowered", FT_BOOLEAN, 8, TFS(&tfs_selfpowered), 0x40, NULL, HFILL }}, { &hf_usb_configuration_remotewakeup, { "Remote Wakeup", "usb.configuration.remotewakeup", FT_BOOLEAN, 8, TFS(&tfs_remotewakeup), 0x20, NULL, HFILL }}, { &hf_usb_bEndpointAddress_number, { "Endpoint Number", "usb.bEndpointAddress.number", FT_UINT8, BASE_HEX, NULL, 0x0f, NULL, 
HFILL }}, { &hf_usb_bEndpointAddress_direction, { "Direction", "usb.bEndpointAddress.direction", FT_BOOLEAN, 8, TFS(&tfs_endpoint_direction), 0x80, NULL, HFILL }}, { &hf_usb_request_in, { "Request in", "usb.request_in", FT_FRAMENUM, BASE_NONE, NULL, 0, "The request to this packet is in this packet", HFILL }}, { &hf_usb_time, { "Time from request", "usb.time", FT_RELATIVE_TIME, BASE_NONE, NULL, 0, "Time between Request and Response for USB cmds", HFILL }}, { &hf_usb_response_in, { "Response in", "usb.response_in", FT_FRAMENUM, BASE_NONE, NULL, 0, "The response to this packet is in this packet", HFILL }}, { &hf_usb_bFirstInterface, { "bFirstInterface", "usb.bFirstInterface", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bInterfaceCount, { "bInterfaceCount", "usb.bInterfaceCount", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bFunctionClass, { "bFunctionClass", "usb.bFunctionClass", FT_UINT8, BASE_HEX|BASE_EXT_STRING, &usb_class_vals_ext, 0x0, NULL, HFILL }}, { &hf_usb_bFunctionSubClass, { "bFunctionSubClass", "usb.bFunctionSubClass", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_bFunctionProtocol, { "bFunctionProtocol", "usb.bFunctionProtocol", FT_UINT8, BASE_HEX, NULL, 0x0, NULL, HFILL }}, { &hf_usb_iFunction, { "iFunction", "usb.iFunction", FT_UINT8, BASE_DEC, NULL, 0x0, NULL, HFILL }}, { &hf_usb_data_fragment, { "Data Fragment", "usb.data_fragment", FT_BYTES, BASE_NONE, NULL, 0x0, NULL, HFILL }}, { &hf_usb_src, { "Source", "usb.src", FT_STRING, STR_ASCII, NULL, 0x0, NULL, HFILL } }, { &hf_usb_dst, { "Destination", "usb.dst", FT_STRING, STR_ASCII, NULL, 0x0, NULL, HFILL } }, { &hf_usb_addr, { "Source or Destination", "usb.addr", FT_STRING, STR_ASCII, NULL, 0x0, NULL, HFILL } } }; static gint *usb_subtrees[] = { &ett_usb_hdr, &ett_usb_setup_hdr, &ett_usb_isodesc, &ett_usb_win32_iso_packet, &ett_usb_endpoint, &ett_usb_xferflags, &ett_usb_xferstatus, &ett_usb_frame, &ett_usb_frame_flags, &ett_usb_setup_bmrequesttype, 
&ett_usb_usbpcap_info, &ett_descriptor_device, &ett_configuration_bmAttributes, &ett_configuration_bEndpointAddress, &ett_endpoint_bmAttributes, &ett_endpoint_wMaxPacketSize }; static ei_register_info ei[] = { { &ei_usb_bLength_even, { "usb.bLength.even", PI_PROTOCOL, PI_WARN, "Invalid STRING DESCRIPTOR Length (must be even)", EXPFILL }}, { &ei_usb_bLength_too_short, { "usb.bLength.too_short", PI_MALFORMED, PI_ERROR, "Invalid STRING DESCRIPTOR Length (must be 2 or larger)", EXPFILL }}, { &ei_usb_desc_length_invalid, { "usb.desc_length.invalid", PI_MALFORMED, PI_ERROR, "Invalid descriptor length", EXPFILL }}, { &ei_usb_invalid_setup, { "usb.setup.invalid", PI_MALFORMED, PI_ERROR, "Only control URBs may contain a setup packet", EXPFILL }} }; expert_module_t* expert_usb; proto_usb = proto_register_protocol("USB", "USB", "usb"); proto_register_field_array(proto_usb, hf, array_length(hf)); proto_register_subtree_array(usb_subtrees, array_length(usb_subtrees)); expert_usb = expert_register_protocol(proto_usb); expert_register_field_array(expert_usb, ei, array_length(ei)); device_to_product_table = wmem_tree_new_autoreset(wmem_epan_scope(), wmem_file_scope()); device_to_protocol_table = wmem_tree_new_autoreset(wmem_epan_scope(), wmem_file_scope()); device_to_dissector = register_dissector_table("usb.device", "USB device", proto_usb, FT_UINT32, BASE_HEX); protocol_to_dissector = register_dissector_table("usb.protocol", "USB protocol", proto_usb, FT_UINT32, BASE_HEX); product_to_dissector = register_dissector_table("usb.product", "USB product", proto_usb, FT_UINT32, BASE_HEX); usb_bulk_dissector_table = register_dissector_table("usb.bulk", "USB bulk endpoint", proto_usb, FT_UINT8, BASE_DEC); heur_bulk_subdissector_list = register_heur_dissector_list("usb.bulk", proto_usb); usb_control_dissector_table = register_dissector_table("usb.control", "USB control endpoint", proto_usb, FT_UINT8, BASE_DEC); heur_control_subdissector_list = register_heur_dissector_list("usb.control", 
proto_usb); usb_interrupt_dissector_table = register_dissector_table("usb.interrupt", "USB interrupt endpoint", proto_usb, FT_UINT8, BASE_DEC); heur_interrupt_subdissector_list = register_heur_dissector_list("usb.interrupt", proto_usb); usb_descriptor_dissector_table = register_dissector_table("usb.descriptor", "USB descriptor", proto_usb, FT_UINT8, BASE_DEC); usb_module = prefs_register_protocol(proto_usb, NULL); prefs_register_bool_preference(usb_module, "try_heuristics", "Try heuristic sub-dissectors", "Try to decode a packet using a heuristic sub-dissector before " "attempting to dissect the packet using the \"usb.bulk\", \"usb.interrupt\" or " "\"usb.control\" dissector tables.", &try_heuristics); usb_tap = register_tap("usb"); register_decode_as(&usb_protocol_da); register_decode_as(&usb_product_da); register_decode_as(&usb_device_da); usb_address_type = address_type_dissector_register("AT_USB", "USB Address", usb_addr_to_str, usb_addr_str_len, NULL, NULL, NULL, NULL, NULL); register_conversation_table(proto_usb, TRUE, usb_conversation_packet, usb_hostlist_packet); } void proto_reg_handoff_usb(void) { dissector_handle_t linux_usb_handle; dissector_handle_t linux_usb_mmapped_handle; dissector_handle_t win32_usb_handle; dissector_handle_t freebsd_usb_handle; linux_usb_handle = create_dissector_handle(dissect_linux_usb, proto_usb); linux_usb_mmapped_handle = create_dissector_handle(dissect_linux_usb_mmapped, proto_usb); win32_usb_handle = create_dissector_handle(dissect_win32_usb, proto_usb); freebsd_usb_handle = create_dissector_handle(dissect_freebsd_usb, proto_usb); dissector_add_uint("wtap_encap", WTAP_ENCAP_USB_LINUX, linux_usb_handle); dissector_add_uint("wtap_encap", WTAP_ENCAP_USB_LINUX_MMAPPED, linux_usb_mmapped_handle); dissector_add_uint("wtap_encap", WTAP_ENCAP_USBPCAP, win32_usb_handle); dissector_add_uint("wtap_encap", WTAP_ENCAP_USB_FREEBSD, freebsd_usb_handle); } /* * Editor modelines - path_to_url * * Local variables: * c-basic-offset: 4 * 
tab-width: 8 * indent-tabs-mode: nil * End: * * vi: set shiftwidth=4 tabstop=8 expandtab: * :indentSize=4:tabSize=8:noTabs=true: */ ```
```c
path_to_url
Unless required by applicable law or agreed to in writing, software
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. */

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdbool.h>

#include "ntpd.h"

#define kMinInputLength 20
#define kMaxInputLength 1024

bool nts_client_process_response_core(uint8_t *buff, int transferred, struct peer* peer);

extern int
LLVMFuzzerTestOneInput(const uint8_t *Data, size_t Size)
{   /* ntpsec/tests/ntpd/nts_client.c */
    if (Size < kMinInputLength || Size > kMaxInputLength) {
        return 0;
    }

    struct peer peer;
    memset(&peer, 0, sizeof(peer)); /* zero the struct so no uninitialized fields are read */

    peer.nts_state.aead = 42; /* Dummy init values */
    peer.nts_state.cookielen = 0;
    peer.nts_state.writeIdx = 0;
    peer.nts_state.count = 0;
    peer.srcadr.sa4.sin_family = AF_INET;
    peer.srcadr.sa4.sin_port = htons(9999);
    peer.srcadr.sa4.sin_addr.s_addr = htonl(0x04030201);

    return nts_client_process_response_core((uint8_t *)Data, Size, &peer);
}
```
Qikiqtagafaaluk, formerly Admiralty Island, is an uninhabited, irregularly shaped Arctic island in the Kitikmeot Region of Nunavut, Canada. It is located in the Victoria Strait, south of Victoria Island's Collinson Peninsula. References Victoria Island (Canada) Uninhabited islands of Kitikmeot Region
- `hasOwnProperty` method
- Deleting properties
- Truthiness
- `catch` is block scoped
- Prototype methods
The Peggy V. Helmerich Distinguished Author Award is an American literary prize awarded by the Tulsa Library Trust in Tulsa, Oklahoma. It is awarded annually to an "internationally acclaimed" author who has "written a distinguished body of work and made a major contribution to the field of literature and letters". History of the award First given in 1985, with a cash prize of $5,000, by 2006 the prize had increased to $40,000 cash and an engraved crystal book. To date, all of the recipients have been English-language writers. The award is named after Peggy V. Helmerich, a prominent Tulsa library activist, philanthropist and the wife of Tulsa oilman Walter Helmerich III. Before her marriage, under the stage name Peggy Dow, she had been a motion picture actress, best known for playing the role of Nurse Kelly in the 1950 James Stewart film vehicle, Harvey and for co-starring with Best Actor Oscar nominee Arthur Kennedy in 1951's Bright Victory. The first honoree was writer and longtime Saturday Review of Literature editor Norman Cousins, with the evening's theme announced as "The Salutary Aspects of Laughter". In 1997, distinguished African-American historian John Hope Franklin became the first (and so far only) native Oklahoman to receive the award. While in Tulsa to accept the award, Franklin made several appearances to speak about his childhood experiences with racial segregation as well as his father's experiences as a lawyer in the aftermath of the 1921 Tulsa race riot. In 2004, 88-year-old Arthur Miller was initially announced as the honoree, but subsequently declined the award when illness prevented him from attending the December award ceremony and dinner; he died two months later. David McCullough, the 1995 winner, replaced him as featured speaker at the dinner and, later, returned his honorarium to the library. 
The following year's initial choice of honoree was again unable to accept due to illness: Oklahoman Tony Hillerman, who would have been the state's second native son to receive the award, was ultimately replaced by John Grisham. Library Journal reported that Grisham donated the monetary prize to his Hurricane Katrina relief fund, and also used the occasion to research details for The Innocent Man: Murder and Injustice in a Small Town, his non-fiction account of an Oklahoma inmate cleared of murder charges shortly before his execution date. Reporting on Grisham's selection as Hillerman's replacement, a Virginia newspaper called the Helmerich Award the "best literary award you've never heard of." The 2017 honoree was novelist Richard Ford.

List of winners

The following authors have received the award since 1985:

1985 Norman Cousins
1986 Larry McMurtry
1987 John Updike
1988 Toni Morrison
1989 Saul Bellow
1990 John le Carré
1991 Eudora Welty
1992 Norman Mailer
1993 Peter Matthiessen
1994 Ray Bradbury
1995 David McCullough
1996 Neil Simon
1997 John Hope Franklin
1998 E. L. Doctorow
1999 Margaret Atwood
2000 William Manchester
2001 William Kennedy
2002 Joyce Carol Oates
2003 Shelby Foote
2004 David McCullough
2005 John Grisham
2006 Mark Helprin
2007 Thomas Keneally
2008 Michael Chabon
2009 Geraldine Brooks
2010 Ian McEwan
2011 Alan Furst
2012 Wendell Berry
2013 Kazuo Ishiguro
2014 Ann Patchett
2015 Rick Atkinson
2016 Billy Collins
2017 Richard Ford
2018 Hilary Mantel
2019 Stacy Schiff
2020 Marilynne Robinson
2022 Elizabeth Strout

See also Tulsa City-County Library

References

External links Peggy V. Helmerich Distinguished Author Award official website Tulsa Library Trust official website Peggy Dow at Internet Movie Database. Voices of Oklahoma interview with Peggy Helmerich. First person interview conducted with Peggy Helmerich on October 9, 2009. Original audio and transcript archived with Voices of Oklahoma oral history project.
American literary awards Awards established in 1985 Literary awards honoring writers Culture of Tulsa, Oklahoma 1985 establishments in Oklahoma
Leadership High School is a public charter high school located in San Francisco. Founded in 1997, Leadership (or "LHS") was California's first start-up charter high school. The school provides a college-preparatory curriculum and focuses on leadership development and social justice. During the school's first two years, it operated out of Golden Gate University. It then moved to the Excelsior District and occupied an elementary school facility owned by the San Francisco Unified School District. In January 2007, the school was forced to move from that location because the building was determined not to be earthquake-safe. Between January 2007 and June 2008, the school shared a facility with Philip and Sala Burton High School. Between August 2008 and spring 2015, the school shared a facility with James Denman Middle School, exclusively occupying the top floor of the building. As of spring 2015, Leadership is again occupying its original Excelsior District site, after SFUSD completed a modernization project there. The current address of the school is 350 Seneca Avenue, San Francisco, CA. The school enrolls approximately 270 students. By 2012, there were over 500 graduates, with over 95% going on to college. (This school is not related to "Leadership Public Schools," a network of 4 charter high schools in the San Francisco Bay Area.) The school has had four leaders in its history and is led by Principal Beth Silbergeld. Note: Leadership High School merged with City Arts and Tech High School in 2022 to become City Arts and Leadership Academy. Coalition of Essential Schools (CES) In 1997 Leadership became a Coalition of Essential Schools National Affiliate School. Demographics According to U.S. News & World Report, 99% of Leadership's student body is "of color," with 73% of the student body coming from an economically disadvantaged household, determined by student eligibility for California's reduced-price meal program.
See also San Francisco County high schools References External links Leadership High School - official website SFUSD portal page CALFEE, SF public schools guide Coalition of Essential Schools High schools in San Francisco Charter high schools in California 1997 establishments in California
Yantarny (; German: ; ) is an urban locality (an urban-type settlement) in Kaliningrad Oblast, Russia, located on the Sambian Peninsula, about from Kaliningrad, the administrative center of the oblast. History Pre-1945 Palmnicken was founded in 1234 by the Teutonic Knights during the Northern Crusades, on the site of a previous Old Prussian settlement. After the secularization of the Teutonic Knights' Prussian lands in 1525, it became part of the Duchy of Prussia, a vassal duchy of the Kingdom of Poland. During the Polish–Swedish wars, Palmnicken was one of several towns in the region occupied by Sweden, whose forces remained in the town for six years. The town became part of the Kingdom of Prussia in 1701, and was occupied by Imperial Russian troops between 1758 and 1762 during the Seven Years' War. Palmnicken was included in the newly formed province of East Prussia in 1773, and following a further Prussian administrative reform in 1818 it became part of the Landkreis Fischhausen district, seated in Fischhausen (now Primorsk). The local amber trade along the coast of the Sambian Peninsula entered industrial development in 1827, with Palmnicken eventually becoming the primary town of the industry in the region. The town became part of the German Empire in 1871 during the Prussian-led unification of Germany, and at the beginning of the 20th century developed into a spa resort. In 1939, at the beginning of World War II, the town had 3,079 inhabitants as part of Nazi Germany. Massacre of Palmnicken Due to the advance of Soviet troops in January 1945, the Stutthof concentration camp, an SS camp near Stutthof (now Sztutowo, Poland), was disbanded and its inmates were sent on a forced march through Königsberg to Palmnicken, which only 3,000 of the original 13,000 inmates survived.
Originally, the surviving detainees were to be walled up within a tunnel of an amber mine in Palmnicken, but this plan collapsed upon the objections of the mine's estate manager, the well-respected Hans Feyerabend, who declared that no one would be killed as long as he lived, and who then arranged for the prisoners to be given food. The mine's director, Landsmann, then refused to open the mine, citing concerns that it was used for the town's water supply. After receiving threats from the SS, and possibly from the ardent Nazi mayor Fredrichs, Feyerabend committed suicide, though murder has not been ruled out. The SS guards, with the assistance of Mayor Fredrichs, who summoned a group of armed Hitler Youth (who had been plied with drink), then brought the prisoners to the beach of Palmnicken during the night of January 31 and forced them to march into the Baltic Sea under gunfire; only 33 inmates known by name survived. A monument to the victims was unveiled in Yantarny on January 30, 2011. The monument, by Frank Meisler, features hands lifted up to the sky as a symbol of the perishing people. On August 24, 2011, the monument was vandalized with paint and antisemitic slogans. Post-1945 Palmnicken was eventually captured by the Red Army on 7 April 1945, during the closing days of the war. The northern third of the former province of East Prussia, including Palmnicken, became part of the Soviet Union in 1945 under the terms of border changes promulgated at the Potsdam Conference. The German population evacuated or was subsequently expelled to West Germany, also in accordance with the Potsdam Agreement. Palmnicken was renamed Yantarny, derived from yantar, the Russian word for amber, and was repopulated by Soviet settlers, predominantly Russians, as well as Belarusians, Ukrainians, and Tatars.
Administrative and municipal status Within the framework of administrative divisions, it is, together with two rural localities, incorporated as the urban-type settlement of oblast significance of Yantarny, an administrative unit with status equal to that of the districts. As a municipal division, the urban-type settlement of oblast significance of Yantarny is incorporated as Yantarny Urban Okrug. Amber industry Amber was collected along the shores of the Sambian coast during the age of the Teutonic Knights, who succeeded in establishing a monopoly over the amber trade, a monopoly that carried over to the Prussian state of the House of Hohenzollern. In the 16th century amber collected along the coastline was brought to Palmnicken, where it was sorted and then sent to Königsberg for further processing. After 1811 amber production was leased out. In 1858 the firm Stantien & Becker was founded. Stantien & Becker created the first open-pit amber mine in the world, but mined amber mainly by underground mining (the "Anna" and "Henriette" pits). Initially the mine produced 50 tons of amber annually, but by 1937, by then a state-owned company (Preußische Bergwerks- und Hütten AG), it produced 650 tons annually and employed 700 workers. As part of the Soviet Union, Yantarny produced approximately 600 tons of amber annually through the company Russky Yantar ("Russian Amber"). The refinement of amber was discontinued in 2002 by a directive of the Russian Regulatory Authority for Technology and Environmental Protection. Some years later, a new open-pit mine ("Primorskoje") was established in the immediate vicinity of the old open-pit mine. In 2008 about 500 tons of amber were mined at this location. Amber Beach Festival In 2010, Yantarny hosted the annual Amber Beach international music festival.
References Notes Sources External links Official website of Yantarny Article about Massacre of Palmnicken Article about Massacre of Palmnicken, newspaper Dvornik, Kaliningrad Urban-type settlements in Kaliningrad Oblast Nazi war crimes 1234 establishments in Europe
Plumbide is an anion of lead atoms. There are three plumbide anions, written as Pb−, Pb2− and Pb4−, with oxidation states −1, −2 and −4, respectively. A plumbide can refer to one of two things: an intermetallic compound that contains lead, or a Zintl phase compound with lead as the anion. Zintl phase Plumbides can be formed when lead forms a Zintl phase compound with a more metallic element. One such salt forms when cryptand reacts with sodium and lead in ethylenediamine (en), producing [Pb5]2−, which is red in solution. Lead can also form anions with tin, in a series of anions with the formula [Sn9−xPbx]4−, as well as the [Pb9]4− anion, which is emerald green in solution. Examples An example of a plumbide is CeRhPb. The lead atom has a coordination number of 12 in the crystal structure of this compound: it is bound to four rhodium, six cerium, and two other lead atoms. Several other plumbides are the M2Pd2Pb plumbides, where M is a rare-earth element and the intermetallic additionally contains palladium. These plumbides tend to exhibit antiferromagnetism, and all of them are conductors. A third plumbide is Ti6Pb4.8. Like the above plumbides, it is an intermetallic, but it contains only titanium as the other metal, and no rare earths. Plumbides can also be Zintl phase compounds, such as [K(18-crown-6)]2K2Pb9·(en)1.5. This is not a simple Zintl compound, but rather contains the organic molecules 18-crown-6 and ethylenediamine (en) in order to stabilize the crystal structure. References Lead compounds Intermetallics Anions Cluster chemistry
```html
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Document</title>
</head>
<body>
    <script>
        alert('hello there');
    </script>
    <script type="text/typescript">
        function fib(n: number) {
            if (n == 0 || n == 1) {
                return n;
            } else {
                // recurse into fib itself (the function is named fib, not f)
                return (fib(n-1) + fib(n-2));
            }
        }
    </script>
    <script type="text/template">
        this should be ignored
    </script>
</body>
</html>
```
Samuel Ian Russell (born 4 October 1982) is an English football coach and former professional footballer who is an academy goalkeeping coach at EFL League One side Forest Green Rovers. As a player, he was a goalkeeper from 2000 until 2021. He began his career in the Premier League with Middlesbrough, although he never made a first-team appearance, and would notably go on to have three spells with Darlington between 2002 and 2012. He also played for Gateshead, Scunthorpe United, Rochdale, Wrexham and Forest Green Rovers before joining Grimsby Town in 2018, where he combined playing duties as James McKeown's number two with coaching the club's goalkeepers. Playing career Middlesbrough Russell was born in Middlesbrough, North Yorkshire and signed as a professional on 7 July 2000 for his hometown team after progressing through their youth system. He had a successful loan spell at Gateshead during the tail end of the 2001–02 season, followed by a brief period at Darlington on loan in 2002 after Mick Tait brought him in during a goalkeeping crisis. He played in only one game, a 1–1 draw against Torquay United. In August 2003 he was sent to Scunthorpe United on loan, where he stayed for three months and made 10 appearances. He finally left Middlesbrough in the summer of 2004, moving to Darlington on a permanent basis. He never appeared in Middlesbrough's first team. Darlington He was brought to Darlington by David Hodgson in August 2004. With the retirement of goalkeeping legend Andy Collett, the club had only two goalkeepers on their books, Michael Price and youth-team player Jack Norton. Russell was a target for Hodgson after he heard that Middlesbrough were releasing the youngster. Initially it was thought Russell would be number two to Price, but performances during pre-season meant that Russell got the nod. Assistant manager Mark Proctor said that Price and Russell were of virtually equal quality but Russell's distribution was slightly better.
In the opening game of the 2004–05 season Russell made his mark with a superb performance which earned him the man of the match award; his saves kept Darlington in the game and meant Darlington hung on for an opening-day victory. He was ever-present in the first team in the 2004–05 season, and signed a new two-year contract in July 2005. He seemed to have established himself as the number one until, just before the start of the 2007–08 season, he failed to agree terms with then-manager Dave Penny; Russell left Darlington in June 2007 for Rochdale, after three years in which he made 107 league appearances. Rochdale In July 2007 he signed for Rochdale, making his first start for the club in the Football League Trophy second-round tie against Bury in October 2007. Following an injury to regular keeper James Spencer, Russell was next called upon in the FA Cup defeat at Southend United. He retained his place in the side and in December 2007 signed a contract until the end of the season, having been on non-contract terms. He broke a finger in the 2–1 defeat at Lincoln City in February 2008 and was expected to be out of action until April. In the 2008–09 season Russell was handed the number one shirt for Rochdale. He made some important saves for Rochdale that season, but also the occasional mistake, such as scoring an own goal in the Rochdale–Bury derby and letting in a late Jamie Ward equaliser against Chesterfield, although a late Tom Kennedy winner rescued that match. In February he was replaced by Blackburn loanee Frank Fielding after Spencer was loaned out to Chester City. Wrexham After being released by Rochdale at the end of the season, Russell signed for Conference National side Wrexham after impressing on trial. Return to Darlington In May 2010 he agreed to rejoin Darlington on a one-year contract. He became their regular goalkeeper, and played at Wembley as they beat Mansfield Town to win the FA Trophy.
He signed a one-year contract extension, but when the club suffered financial difficulties and failed to pay the players, he submitted his 14-day notice at the end of December and left the club for fellow Conference club Forest Green Rovers in January 2012. Forest Green Rovers Russell made his Forest Green debut on 21 January 2012 in a 0–0 draw against Newport County. In his first eight Forest Green appearances he saved four consecutive penalty kicks. He made his hundredth league appearance for the club in a 2–1 win over Grimsby Town on 18 March 2014. In March 2014, he signed a contract extension to remain with the club. A run of 144 consecutive league appearances for Forest Green was brought to an end on 21 February 2015 when he was replaced by Steve Arnold for a 3–0 win over AFC Telford United. On 4 May 2015, it was announced that he was leaving Forest Green after his contract had come to an end. Gateshead In June 2015, Russell joined National League side Gateshead, returning to a club where over a decade earlier he had spent time on loan. Return to Forest Green Rovers On 1 July 2016, it was announced that Russell had returned to former club Forest Green Rovers on a two-year deal. He made his 150th league appearance for the club in a 4–1 away win over Maidstone United on 27 August 2016. Between 17 September 2016 and 29 October 2016, he kept a run of seven National League clean sheets, until Oliver Hawkins' header in a 1–1 draw with Dagenham & Redbridge saw him concede and narrowly miss out on the league record of eight consecutive clean sheets set by Alan Julian for Stevenage. Russell appeared in every single minute of Forest Green's 2016–17 campaign, as he helped the club secure promotion to the Football League for the first time in their history with a 3–1 National League play-off final win against Tranmere Rovers at Wembley Stadium. He was rewarded with a new one-year contract in June 2017 for Forest Green's first season in League Two.
He was released by Forest Green at the end of the 2017–18 season. Coaching career Grimsby Town On 9 July 2018 he joined Grimsby Town as the club's new player / goalkeeper coach. Russell retired from playing at the end of the 2020–21 season but remained Paul Hurst's first team goalkeeping coach. On 25 June 2022, Russell announced via his Twitter account that he was leaving Grimsby. Russell returned to former club Forest Green Rovers as an academy goalkeeping coach in November 2022. Career statistics Honours Darlington FA Trophy: 2010–11 References External links 1982 births Living people Footballers from Middlesbrough English men's footballers Men's association football goalkeepers English Football League players National League (English football) players Northern Premier League players Middlesbrough F.C. players Gateshead F.C. players Darlington F.C. players Scunthorpe United F.C. players Rochdale A.F.C. players Wrexham A.F.C. players Forest Green Rovers F.C. players Grimsby Town F.C. players Grimsby Town F.C. non-playing staff Forest Green Rovers F.C. non-playing staff
William Gardiner (14 July 1864 – 27 January 1924) was a New Zealand cricketer. He played first-class cricket for Auckland and Wellington between 1882 and 1896. In the 1892–93 season Gardiner was regarded as the best batsman in Auckland and "one of the best in the colony", as well as a reliable fieldsman. His highest first-class score was 61, the only fifty in the match, when Auckland beat Canterbury in 1891–92. He top-scored for Wellington with 59 in the drawn match against Hawke's Bay in 1895–96, his last first-class match. Gardiner also played rugby for Auckland in the 1880s until a shoulder injury curtailed his career. He worked as a builder and contractor, his company being noted for building bridges and wharfs. He and his wife Catherine had one son, Norman. Gardiner died in Wellington aged 59 after a severe illness. See also List of Auckland representative cricketers References External links 1864 births 1924 deaths New Zealand cricketers Auckland cricketers Wellington cricketers Auckland rugby union players Cricketers from Auckland Colony of New Zealand people
```xml
<?xml version="1.0" encoding="utf-8"?><!-- -->
<manifest xmlns:android="path_to_url"
    package="com.dodola.rocoosample.runtimefix">

    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:name=".RocooApplication"
        android:supportsRtl="true"
        android:theme="@style/AppTheme">
        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>
```
```python import os import sys import tempfile from pathlib import Path import pytest import ray from ray import train from ray.air import ScalingConfig from ray.air.constants import TRAINING_ITERATION from ray.air.execution import FixedResourceManager from ray.train import CheckpointConfig from ray.train._internal.storage import StorageContext from ray.train.tests.util import mock_storage_context from ray.tune import Trainable, register_trainable from ray.tune.execution.tune_controller import TuneController from ray.tune.experiment import Trial STORAGE = mock_storage_context() @pytest.fixture(scope="function") def ray_start_4_cpus_2_gpus_extra(): address_info = ray.init(num_cpus=4, num_gpus=2, resources={"a": 2}) yield address_info ray.shutdown() @pytest.mark.parametrize("trainable_type", ["class", "function", "data_parallel"]) @pytest.mark.parametrize("patch_iter", [False, True]) def test_checkpoint_freq_dir_name( ray_start_4_cpus_2_gpus_extra, trainable_type, patch_iter, tmp_path ): """Test that trial checkpoint IDs are correctly set across trainable types. This includes a current workaround to set checkpoint IDs according to reported metrics. """ def num_checkpoints(trial): return sum( item.startswith("checkpoint_") for item in os.listdir(trial.storage.trial_fs_path) ) def last_checkpoint_dir(trial): return max( item for item in os.listdir(trial.storage.trial_fs_path) if item.startswith("checkpoint_") ) checkpoint_config = None if trainable_type == "class": class MyTrainable(Trainable): def step(self): # `training_iteration` is increased after the report, so we # +1 here. 
                return {"step": self.iteration + 1}

            def save_checkpoint(self, checkpoint_dir):
                return {"test": self.iteration}

            def load_checkpoint(self, checkpoint_dir):
                pass

        register_trainable("test_checkpoint_freq", MyTrainable)
        checkpoint_config = CheckpointConfig(checkpoint_frequency=3)
    elif trainable_type in {"function", "data_parallel"}:

        def train_fn(config):
            for step in range(1, 10):
                if step > 0 and step % 3 == 0:
                    with tempfile.TemporaryDirectory() as checkpoint_dir:
                        (Path(checkpoint_dir) / "data.ckpt").write_text(str(step))
                        train.report(
                            {"step": step},
                            checkpoint=train.Checkpoint.from_directory(checkpoint_dir),
                        )
                else:
                    train.report({"step": step})

        if trainable_type == "function":
            register_trainable("test_checkpoint_freq", train_fn)
        elif trainable_type == "data_parallel":
            from ray.train.data_parallel_trainer import DataParallelTrainer

            trainer = DataParallelTrainer(
                train_loop_per_worker=train_fn,
                scaling_config=ScalingConfig(num_workers=1),
            )
            register_trainable("test_checkpoint_freq", trainer.as_trainable())
    else:
        raise RuntimeError("Invalid trainable type")

    if patch_iter:

        class CustomStorageContext(StorageContext):
            def _update_checkpoint_index(self, metrics):
                # TODO: Support auto-filled metrics for function trainables
                self.current_checkpoint_index = metrics.get(
                    "step", self.current_checkpoint_index + 1
                )

        storage = mock_storage_context(
            storage_context_cls=CustomStorageContext,
            storage_path=tmp_path,
        )
    else:
        storage = mock_storage_context(storage_path=tmp_path)

    trial = Trial(
        "test_checkpoint_freq",
        checkpoint_config=checkpoint_config,
        storage=storage,
    )

    runner = TuneController(
        resource_manager_factory=lambda: FixedResourceManager(),
        storage=STORAGE,
        checkpoint_period=0,
    )
    runner.add_trial(trial)

    while not trial.is_saving:
        runner.step()
    runner.step()

    assert trial.last_result[TRAINING_ITERATION] == 3
    assert num_checkpoints(trial) == 1
    if patch_iter:
        assert last_checkpoint_dir(trial) == "checkpoint_000003"
    else:
        assert last_checkpoint_dir(trial) == "checkpoint_000000"

    while not trial.is_saving:
        runner.step()
    runner.step()

    assert trial.last_result[TRAINING_ITERATION] == 6
    assert num_checkpoints(trial) == 2
    if patch_iter:
        assert last_checkpoint_dir(trial) == "checkpoint_000006"
    else:
        assert last_checkpoint_dir(trial) == "checkpoint_000001"

    while not trial.is_saving:
        runner.step()
    runner.step()

    assert trial.last_result[TRAINING_ITERATION] == 9
    assert num_checkpoints(trial) == 3
    if patch_iter:
        assert last_checkpoint_dir(trial) == "checkpoint_000009"
    else:
        assert last_checkpoint_dir(trial) == "checkpoint_000002"


if __name__ == "__main__":
    sys.exit(pytest.main(["-v", __file__]))
```
Doresia Krings (born 13 April 1977) is an Austrian snowboarder. She won bronze medals at the FIS Snowboarding World Championships 2005 in the parallel giant slalom and parallel slalom events. She won a further bronze medal at the FIS Snowboarding World Championships 2007 in the parallel slalom event. Krings participated in the 2006 and 2010 Winter Olympics. References 1977 births Living people Austrian female snowboarders Snowboarders at the 2010 Winter Olympics Snowboarders at the 2006 Winter Olympics Olympic snowboarders for Austria
```go package listenkv import ( "io" "cosmossdk.io/store/types" ) var _ types.KVStore = &Store{} // Store implements the KVStore interface with listening enabled. // Operations are traced on each core KVStore call and written to any of the // underlying listeners with the proper key and operation permissions type Store struct { parent types.KVStore listener *types.MemoryListener parentStoreKey types.StoreKey } // NewStore returns a reference to a new traceKVStore given a parent // KVStore implementation and a buffered writer. func NewStore(parent types.KVStore, parentStoreKey types.StoreKey, listener *types.MemoryListener) *Store { return &Store{parent: parent, listener: listener, parentStoreKey: parentStoreKey} } // Get implements the KVStore interface. It traces a read operation and // delegates a Get call to the parent KVStore. func (s *Store) Get(key []byte) []byte { value := s.parent.Get(key) return value } // Set implements the KVStore interface. It traces a write operation and // delegates the Set call to the parent KVStore. func (s *Store) Set(key, value []byte) { types.AssertValidKey(key) s.parent.Set(key, value) s.listener.OnWrite(s.parentStoreKey, key, value, false) } // Delete implements the KVStore interface. It traces a write operation and // delegates the Delete call to the parent KVStore. func (s *Store) Delete(key []byte) { s.parent.Delete(key) s.listener.OnWrite(s.parentStoreKey, key, nil, true) } // Has implements the KVStore interface. It delegates the Has call to the // parent KVStore. func (s *Store) Has(key []byte) bool { return s.parent.Has(key) } // Iterator implements the KVStore interface. It delegates the Iterator call // to the parent KVStore. func (s *Store) Iterator(start, end []byte) types.Iterator { return s.iterator(start, end, true) } // ReverseIterator implements the KVStore interface. It delegates the // ReverseIterator call to the parent KVStore.
func (s *Store) ReverseIterator(start, end []byte) types.Iterator { return s.iterator(start, end, false) } // iterator facilitates iteration over a KVStore. It delegates the necessary // calls to its parent KVStore. func (s *Store) iterator(start, end []byte, ascending bool) types.Iterator { var parent types.Iterator if ascending { parent = s.parent.Iterator(start, end) } else { parent = s.parent.ReverseIterator(start, end) } return newTraceIterator(parent, s.listener) } type listenIterator struct { parent types.Iterator listener *types.MemoryListener } func newTraceIterator(parent types.Iterator, listener *types.MemoryListener) types.Iterator { return &listenIterator{parent: parent, listener: listener} } // Domain implements the Iterator interface. func (li *listenIterator) Domain() (start, end []byte) { return li.parent.Domain() } // Valid implements the Iterator interface. func (li *listenIterator) Valid() bool { return li.parent.Valid() } // Next implements the Iterator interface. func (li *listenIterator) Next() { li.parent.Next() } // Key implements the Iterator interface. func (li *listenIterator) Key() []byte { key := li.parent.Key() return key } // Value implements the Iterator interface. func (li *listenIterator) Value() []byte { value := li.parent.Value() return value } // Close implements the Iterator interface. func (li *listenIterator) Close() error { return li.parent.Close() } // Error delegates the Error call to the parent iterator. func (li *listenIterator) Error() error { return li.parent.Error() } // GetStoreType implements the KVStore interface. It returns the underlying // KVStore type. func (s *Store) GetStoreType() types.StoreType { return s.parent.GetStoreType() } // CacheWrap implements the KVStore interface. It panics as a Store // cannot be cache wrapped. func (s *Store) CacheWrap() types.CacheWrap { panic("cannot CacheWrap a ListenKVStore") } // CacheWrapWithTrace implements the KVStore interface.
It panics as a // Store cannot be cache wrapped. func (s *Store) CacheWrapWithTrace(_ io.Writer, _ types.TraceContext) types.CacheWrap { panic("cannot CacheWrapWithTrace a ListenKVStore") } ```
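The store above is a decorator: reads and iteration pass straight through to the parent, while every `Set` and `Delete` is also mirrored to a listener. The same pattern in a few lines of Python, using a plain dict as the parent store (an illustrative sketch, not tied to the Cosmos SDK types):

```python
class MemoryListener:
    """Collects write operations as (store_key, key, value, delete) tuples."""
    def __init__(self):
        self.ops = []

    def on_write(self, store_key, key, value, delete):
        self.ops.append((store_key, key, value, delete))


class ListenStore:
    """Wraps a dict-backed store; mirrors every write to the listener,
    while reads pass through untouched."""
    def __init__(self, parent, store_key, listener):
        self.parent = parent
        self.store_key = store_key
        self.listener = listener

    def get(self, key):
        return self.parent.get(key)

    def set(self, key, value):
        self.parent[key] = value
        self.listener.on_write(self.store_key, key, value, False)

    def delete(self, key):
        self.parent.pop(key, None)
        self.listener.on_write(self.store_key, key, None, True)


listener = MemoryListener()
store = ListenStore({}, "bank", listener)
store.set("k", "v")
store.delete("k")
```

After this sequence the listener holds one write and one delete record, which is all downstream state-listening machinery needs to replay the changes.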
```javascript /** * @license Apache-2.0 * * * * path_to_url * * Unless required by applicable law or agreed to in writing, software * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. */ 'use strict'; // MODULES // var resolve = require( 'path' ).resolve; var bench = require( '@stdlib/bench' ); var isBoolean = require( '@stdlib/assert/is-boolean' ).isPrimitive; var tryRequire = require( '@stdlib/utils/try-require' ); var pkg = require( './../package.json' ).name; // VARIABLES // var isIntegerf = tryRequire( resolve( __dirname, './../lib/native.js' ) ); var opts = { 'skip': ( isIntegerf instanceof Error ) }; // MAIN // bench( pkg+'::native,true', opts, function benchmark( b ) { var values; var bool; var i; values = [ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 1.0e38, 10, -0, -1, -2, -3, -4, -5, -6, -7, -8, -9, -10, -1.0e38 ]; b.tic(); for ( i = 0; i < b.iterations; i++ ) { bool = isIntegerf( values[ i % values.length ] ); if ( typeof bool !== 'boolean' ) { b.fail( 'should return a boolean' ); } } b.toc(); if ( !isBoolean( bool ) || bool !== true ) { b.fail( 'should return true' ); } b.pass( 'benchmark finished' ); b.end(); }); bench( pkg+'::native,false', opts, function benchmark( b ) { var values; var bool; var i; values = [ 0.1, 1.1, 2.1, 3.1, 4.1, 5.1, 6.1, 7.1, 8.1, 9.1, 10.1, -0.1, -1.1, -2.1, -3.1, -4.1, -5.1, -6.1, -7.1, -8.1, -9.1, -10.1 ]; b.tic(); for ( i = 0; i < b.iterations; i++ ) { bool = isIntegerf( values[ i % values.length ] ); if ( typeof bool !== 'boolean' ) { b.fail( 'should return a boolean' ); } } b.toc(); if ( !isBoolean( bool ) || bool !== false ) { b.fail( 'should return false' ); } b.pass( 'benchmark finished' ); b.end(); }); ```
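`isIntegerf` tests whether a single-precision floating-point number has an integral value, which is why `1.0e38` belongs in the `true` set and `10.1` in the `false` set above. A rough double-precision Python analogue (illustrative; it ignores float32 rounding subtleties):

```python
import math

def is_integerf(x):
    # True when x is finite and has no fractional part.
    # Non-finite values (inf, nan) are never integers.
    return math.isfinite(x) and math.floor(x) == x
```

Note the `isfinite` guard runs first: `math.floor(float('nan'))` would raise, so short-circuit evaluation matters here.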
```scala /* */ package com.lightbend.lagom.javadsl.testkit import java.util.concurrent.CompletableFuture import java.util.concurrent.CompletionStage import java.util.concurrent.TimeUnit import akka.NotUsed import akka.actor.ActorSystem import akka.stream.ActorMaterializer import com.google.inject.AbstractModule import com.lightbend.lagom.javadsl.api.ServiceCall import com.lightbend.lagom.javadsl.server.ServiceGuiceSupport import com.lightbend.lagom.javadsl.testkit.ServiceTest.Setup import com.lightbend.lagom.javadsl.testkit.ServiceTest.TestServer import com.lightbend.lagom.javadsl.testkit.services.tls.TestTlsService import com.typesafe.config.ConfigFactory import javax.inject.Inject import javax.net.ssl.SSLContext import org.scalatest.concurrent.PatienceConfiguration import org.scalatest.concurrent.ScalaFutures import org.scalatest.time.Seconds import org.scalatest.time.Span import org.scalatest.Matchers import org.scalatest.WordSpec import play.api.libs.ws.ahc.AhcWSClientConfigFactory import play.api.libs.ws.ahc.StandaloneAhcWSClient import play.inject.guice.GuiceApplicationBuilder import scala.compat.java8.FunctionConverters._ /** * */ class TestOverTlsSpec extends WordSpec with Matchers with ScalaFutures { val timeout = PatienceConfiguration.Timeout(Span(5, Seconds)) val defaultSetup = ServiceTest.defaultSetup.withCluster(false) "TestOverTls" when { "started with ssl" should { "not provide an ssl port by default" in { withServer(defaultSetup) { server => server.portSsl.isPresent shouldBe false } } "provide an ssl port" in { withServer(defaultSetup.withSsl()) { server => server.portSsl.isPresent shouldBe true } } "provide an ssl context for the client" in { withServer(defaultSetup.withSsl()) { server => server.clientSslContext.isPresent shouldBe true } } "complete an RPC call" in { withServer(defaultSetup.withSsl()) { server => // The clients provided by Lagom's ServiceTest default to using HTTP val client: TestTlsService = server.client(classOf[TestTlsService])
val response = client .sampleCall() .invoke() .toCompletableFuture .get(5, TimeUnit.SECONDS) response should be("sample response") } } "complete a WS call over HTTPS" in { withServer(defaultSetup.withSsl()) { server => implicit val actorSystem = server.app.asScala().actorSystem implicit val ctx = actorSystem.dispatcher // To explicitly use HTTPS on a test you must create a client of your own and make sure it uses // the provided SSLContext val wsClient = buildCustomWS(server.clientSslContext.get) val response = wsClient .url(s"path_to_url{server.portSsl.get}/api/sample") .get() .map { _.body[String] } whenReady(response, timeout) { r => r should be("sample response") } } } } } def withServer(setup: Setup)(block: TestServer => Unit): Unit = { ServiceTest.withServer(setup.configureBuilder((registerService _).asJava), block(_)) } def registerService(builder: GuiceApplicationBuilder): GuiceApplicationBuilder = builder.bindings(new TestTlsServiceModule) private def buildCustomWS(sslContext: SSLContext)(implicit actorSystem: ActorSystem) = { implicit val mat = ActorMaterializer() // This setting enables the use of `SSLContext.setDefault` on the following line. val sslConfig = ConfigFactory.parseString("play.ws.ssl.default = true").withFallback(ConfigFactory.load()) SSLContext.setDefault(sslContext) val config = AhcWSClientConfigFactory.forConfig(sslConfig) // This wsClient will use the `SSLContext` from `SSLContext.getDefault` (instead of the internal config-based) val wsClient = StandaloneAhcWSClient(config) wsClient } } // The actual TestTlsService for javadsl tests is written in java.
See TestTlsService.java //trait TestTlsService extends Service class TestTlsServiceImpl @Inject() () extends TestTlsService { override def sampleCall(): ServiceCall[NotUsed, String] = { new ServiceCall[NotUsed, String]() { override def invoke(request: NotUsed): CompletionStage[String] = CompletableFuture.completedFuture("sample response") } } } class TestTlsServiceModule extends AbstractModule with ServiceGuiceSupport { override def configure(): Unit = bindService(classOf[TestTlsService], classOf[TestTlsServiceImpl]) } ```
```javascript import getTranscriptScrollableElement from '../pageElements/transcriptScrollable'; import stabilized from './stabilized'; import sleep from '../../utils/sleep'; export default async function scrollToBottomCompleted() { await stabilized( 'scroll is at bottom and', () => { const scrollable = getTranscriptScrollableElement(); const { offsetHeight, scrollHeight, scrollTop } = scrollable; // Browser may keep rendering content. Wait until consecutively: // 1. Scroll position is at the bottom, and; // 2. Scroll is stabilized, i.e. `scrollTop` returns same value. // If it is not at the bottom, create a new empty object and return it, so it will fail the consecutive test. return Math.abs(offsetHeight + scrollTop - scrollHeight) <= 1 ? scrollTop : {}; }, 5, 5000 ); // TODO: This is probably a patch for test reliability. When we fix the root cause, remove this delay. await sleep(500); } ```
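The `stabilized` helper polls a getter until it returns the same value several times in a row; returning a fresh `{}` guarantees inequality and resets the streak. The core idea in synchronous Python (a sketch — the real helper is asynchronous and bounded by a wall-clock timeout rather than a retry count):

```python
def stabilized(get_value, stable_count, max_tries):
    # Return once get_value() has produced the same value stable_count
    # times consecutively; give up after max_tries polls.
    last = object()  # sentinel that never equals a real value
    streak = 0
    for _ in range(max_tries):
        current = get_value()
        streak = streak + 1 if current == last else 1
        if streak >= stable_count:
            return current
        last = current
    raise TimeoutError("value did not stabilize")
```

With a source that settles on one value, the helper returns that value; a source that keeps changing (like the `{}` trick above) exhausts `max_tries` instead.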
```csharp using System.Collections.Generic; using System.Runtime.Serialization; namespace Amazon.Lambda.LexV2Events { /// <summary> /// The class that represents the container for text that is returned to the customer. /// path_to_url /// </summary> [DataContract] public class LexV2Message { /// <summary> /// The text of the message. /// </summary> [DataMember(Name = "content", EmitDefaultValue = false)] [System.Text.Json.Serialization.JsonPropertyName("content")] public string Content { get; set; } /// <summary> /// Indicates the type of response. Could be one of <c>CustomPayload</c>, <c>ImageResponseCard</c>, <c>PlainText</c> or <c>SSML</c>. /// </summary> [DataMember(Name = "contentType", EmitDefaultValue = false)] [System.Text.Json.Serialization.JsonPropertyName("contentType")] public string ContentType { get; set; } /// <summary> /// A card that is shown to the user by a messaging platform. You define the contents of the card, the card is displayed by the platform. /// </summary> [DataMember(Name = "imageResponseCard", EmitDefaultValue = false)] [System.Text.Json.Serialization.JsonPropertyName("imageResponseCard")] public LexV2ImageResponseCard ImageResponseCard { get; set; } } /// <summary> /// The class that represents a card that is shown to the user by a messaging platform. /// path_to_url /// </summary> [DataContract] public class LexV2ImageResponseCard { /// <summary> /// A list of buttons that should be displayed on the response card. The arrangement of the buttons is determined by the platform that displays the button. /// </summary> [DataMember(Name = "buttons", EmitDefaultValue = false)] [System.Text.Json.Serialization.JsonPropertyName("buttons")] public IList<LexV2Button> Buttons { get; set; } /// <summary> /// The URL of an image to display on the response card. The image URL must be publicly available so that the platform displaying the response card has access to the image.
/// </summary> [DataMember(Name = "imageUrl", EmitDefaultValue = false)] [System.Text.Json.Serialization.JsonPropertyName("imageUrl")] public string ImageUrl { get; set; } /// <summary> /// The subtitle to display on the response card. The format of the subtitle is determined by the platform displaying the response card. /// </summary> [DataMember(Name = "subtitle", EmitDefaultValue = false)] [System.Text.Json.Serialization.JsonPropertyName("subtitle")] public string Subtitle { get; set; } /// <summary> /// The title to display on the response card. The format of the title is determined by the platform displaying the response card. /// </summary> [DataMember(Name = "title", EmitDefaultValue = false)] [System.Text.Json.Serialization.JsonPropertyName("title")] public string Title { get; set; } } /// <summary> /// The class that represents a button that appears on a response card shown to the user. /// path_to_url /// </summary> [DataContract] public class LexV2Button { /// <summary> /// The text that is displayed on the button. /// </summary> [DataMember(Name = "text", EmitDefaultValue = false)] [System.Text.Json.Serialization.JsonPropertyName("text")] public string Text { get; set; } /// <summary> /// The value returned to Amazon Lex V2 when a user chooses the button. /// </summary> [DataMember(Name = "value", EmitDefaultValue = false)] [System.Text.Json.Serialization.JsonPropertyName("value")] public string Value { get; set; } } } ```
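Serialized, the three classes above nest into a single JSON object. A hand-built Python sketch of that wire shape (field names mirror the `JsonPropertyName` attributes above; the content values and URL are made up for illustration):

```python
import json

message = {
    "content": "Choose a color",
    "contentType": "ImageResponseCard",
    "imageResponseCard": {
        "title": "Colors",
        "subtitle": "Pick one",
        "imageUrl": "https://example.com/card.png",  # hypothetical URL
        "buttons": [
            {"text": "Red", "value": "red"},
            {"text": "Blue", "value": "blue"},
        ],
    },
}

payload = json.dumps(message)
```

Deserializers like the C# classes above simply walk this nesting: message, optional card, list of buttons.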
```python # File: regexTests.py # from chapter 17 of _Genetic Algorithms with Python_ # # Author: Clinton Sheppard <fluentcoder@gmail.com> # # path_to_url # # Unless required by applicable law or agreed to in writing, software # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or import datetime import random import re import unittest from functools import partial import genetic repeatMetas = {'?', '*', '+', '{2}', '{2,}'} startMetas = {'|', '(', '['} endMetas = {')', ']'} allMetas = repeatMetas | startMetas | endMetas regexErrorsSeen = {} def repair_regex(genes): result = [] finals = [] f = repair_ignore_repeat_metas for token in genes: f = f(token, result, finals) if ']' in finals and result[-1] == '[': del result[-1] result.extend(reversed(finals)) return ''.join(result) def repair_ignore_repeat_metas(token, result, finals): if token in repeatMetas or token in endMetas: return repair_ignore_repeat_metas if token == '(': finals.append(')') result.append(token) if token == '[': finals.append(']') return repair_in_character_set return repair_ignore_repeat_metas_following_repeat_or_start_metas def repair_ignore_repeat_metas_following_repeat_or_start_metas(token, result, finals): last = result[-1] if token not in repeatMetas: if token == '[': result.append(token) finals.append(']') return repair_in_character_set if token == '(': finals.append(')') elif token == ')': match = ''.join(finals).rfind(')') if match != -1: del finals[match] else: result[0:0] = ['('] result.append(token) elif last in startMetas: pass elif token == '?' and last == '?' 
and len(result) > 2 and \ result[-2] in repeatMetas: pass elif last in repeatMetas: pass else: result.append(token) return repair_ignore_repeat_metas_following_repeat_or_start_metas def repair_in_character_set(token, result, finals): if token == ']': if result[-1] == '[': del result[-1] result.append(token) match = ''.join(finals).rfind(']') if match != -1: del finals[match] return repair_ignore_repeat_metas_following_repeat_or_start_metas elif token == '[': pass elif token == '|' and result[-1] == '|': pass # suppresses FutureWarning about || else: result.append(token) return repair_in_character_set def get_fitness(genes, wanted, unwanted): pattern = repair_regex(genes) length = len(pattern) try: re.compile(pattern) except re.error as e: key = str(e) key = key[:key.index("at position")] info = [str(e), "genes = ['{}']".format("', '".join(genes)), "regex: " + pattern] if key not in regexErrorsSeen or len(info[1]) < len( regexErrorsSeen[key][1]): regexErrorsSeen[key] = info return Fitness(0, len(wanted), len(unwanted), length) numWantedMatched = sum(1 for i in wanted if re.fullmatch(pattern, i)) numUnwantedMatched = sum(1 for i in unwanted if re.fullmatch(pattern, i)) return Fitness(numWantedMatched, len(wanted), numUnwantedMatched, length) def display(candidate, startTime): timeDiff = datetime.datetime.now() - startTime print("{}\t{}\t{}".format( repair_regex(candidate.Genes), candidate.Fitness, timeDiff)) def mutate_add(genes, geneset): index = random.randrange(0, len(genes) + 1) if len(genes) > 0 else 0 genes[index:index] = [random.choice(geneset)] return True def mutate_remove(genes): if len(genes) < 1: return False del genes[random.randrange(0, len(genes))] if len(genes) > 1 and random.randint(0, 1) == 1: del genes[random.randrange(0, len(genes))] return True def mutate_replace(genes, geneset): if len(genes) < 1: return False index = random.randrange(0, len(genes)) genes[index] = random.choice(geneset) return True def mutate_swap(genes): if len(genes) < 2: 
return False indexA, indexB = random.sample(range(len(genes)), 2) genes[indexA], genes[indexB] = genes[indexB], genes[indexA] return True def mutate_move(genes): if len(genes) < 3: return False start = random.choice(range(len(genes))) stop = start + random.randint(1, 2) toMove = genes[start:stop] genes[start:stop] = [] index = random.choice(range(len(genes))) if index >= start: index += 1 genes[index:index] = toMove return True def mutate_to_character_set(genes): if len(genes) < 3: return False ors = [i for i in range(1, len(genes) - 1) if genes[i] == '|' and genes[i - 1] not in allMetas and genes[i + 1] not in allMetas] if len(ors) == 0: return False shorter = [i for i in ors if sum(len(w) for w in genes[i - 1:i + 2:2]) > len(set(c for w in genes[i - 1:i + 2:2] for c in w))] if len(shorter) == 0: return False index = random.choice(ors) distinct = set(c for w in genes[index - 1:index + 2:2] for c in w) sequence = ['['] + [i for i in distinct] + [']'] genes[index - 1:index + 2] = sequence return True def mutate_to_character_set_left(genes, wanted): if len(genes) < 4: return False ors = [i for i in range(-1, len(genes) - 3) if (i == -1 or genes[i] in startMetas) and len(genes[i + 1]) == 2 and genes[i + 1] in wanted and (len(genes) == i + 1 or genes[i + 2] == '|' or genes[i + 2] in endMetas)] if len(ors) == 0: return False lookup = {} for i in ors: lookup.setdefault(genes[i + 1][0], []).append(i) min2 = [i for i in lookup.values() if len(i) > 1] if len(min2) == 0: return False choice = random.choice(min2) characterSet = ['|', genes[choice[0] + 1][0], '['] characterSet.extend([genes[i + 1][1] for i in choice]) characterSet.append(']') for i in reversed(choice): if i >= 0: genes[i:i + 2] = [] genes.extend(characterSet) return True def mutate_add_wanted(genes, wanted): index = random.randrange(0, len(genes) + 1) if len(genes) > 0 else 0 genes[index:index] = ['|'] + [random.choice(wanted)] return True def mutate(genes, fnGetFitness, mutationOperators, 
mutationRoundCounts): initialFitness = fnGetFitness(genes) count = random.choice(mutationRoundCounts) for i in range(1, count + 2): copy = mutationOperators[:] func = random.choice(copy) while not func(genes): copy.remove(func) func = random.choice(copy) if fnGetFitness(genes) > initialFitness: mutationRoundCounts.append(i) return class RegexTests(unittest.TestCase): def test_two_digits(self): wanted = {"01", "11", "10"} unwanted = {"00", ""} self.find_regex(wanted, unwanted, 7) def test_grouping(self): wanted = {"01", "0101", "010101"} unwanted = {"0011", ""} self.find_regex(wanted, unwanted, 5) def test_state_codes(self): Fitness.UseRegexLength = True wanted = {"NE", "NV", "NH", "NJ", "NM", "NY", "NC", "ND"} unwanted = {"N" + l for l in "ABCDEFGHIJKLMNOPQRSTUVWXYZ" if "N" + l not in wanted} customOperators = [ partial(mutate_to_character_set_left, wanted=wanted), ] self.find_regex(wanted, unwanted, 11, customOperators) def test_even_length(self): wanted = {"00", "01", "10", "11", "0000", "0001", "0010", "0011", "0100", "0101", "0110", "0111", "1000", "1001", "1010", "1011", "1100", "1101", "1110", "1111"} unwanted = {"0", "1", "000", "001", "010", "011", "100", "101", "110", "111", ""} customOperators = [ mutate_to_character_set, ] self.find_regex(wanted, unwanted, 10, customOperators) def test_50_state_codes(self): Fitness.UseRegexLength = True wanted = {"AL", "AK", "AZ", "AR", "CA", "CO", "CT", "DE", "FL", "GA", "HI", "ID", "IL", "IN", "IA", "KS", "KY", "LA", "ME", "MD", "MA", "MI", "MN", "MS", "MO", "MT", "NE", "NV", "NH", "NJ", "NM", "NY", "NC", "ND", "OH", "OK", "OR", "PA", "RI", "SC", "SD", "TN", "TX", "UT", "VT", "VA", "WA", "WV", "WI", "WY"} unwanted = {a + b for a in "ABCDEFGHIJKLMNOPQRSTUVWXYZ" for b in "ABCDEFGHIJKLMNOPQRSTUVWXYZ" if a + b not in wanted} | \ set(i for i in "ABCDEFGHIJKLMNOPQRSTUVWXYZ") customOperators = [ partial(mutate_to_character_set_left, wanted=wanted), mutate_to_character_set, partial(mutate_add_wanted, wanted=[i for i in 
wanted]), ] self.find_regex(wanted, unwanted, 120, customOperators) def find_regex(self, wanted, unwanted, expectedLength, customOperators=None): startTime = datetime.datetime.now() textGenes = wanted | set(c for w in wanted for c in w) fullGeneset = [i for i in allMetas | textGenes] def fnDisplay(candidate): display(candidate, startTime) def fnGetFitness(genes): return get_fitness(genes, wanted, unwanted) mutationRoundCounts = [1] mutationOperators = [ partial(mutate_add, geneset=fullGeneset), partial(mutate_replace, geneset=fullGeneset), mutate_remove, mutate_swap, mutate_move, ] if customOperators is not None: mutationOperators.extend(customOperators) def fnMutate(genes): mutate(genes, fnGetFitness, mutationOperators, mutationRoundCounts) optimalFitness = Fitness(len(wanted), len(wanted), 0, expectedLength) best = genetic.get_best(fnGetFitness, max(len(i) for i in textGenes), optimalFitness, fullGeneset, fnDisplay, fnMutate, poolSize=10) self.assertTrue(not optimalFitness > best.Fitness) for info in regexErrorsSeen.values(): print("") print(info[0]) print(info[1]) print(info[2]) def test_benchmark(self): genetic.Benchmark.run(self.test_two_digits) class Fitness: UseRegexLength = False def __init__(self, numWantedMatched, totalWanted, numUnwantedMatched, length): self.NumWantedMatched = numWantedMatched self._totalWanted = totalWanted self.NumUnwantedMatched = numUnwantedMatched self.Length = length def __gt__(self, other): combined = (self._totalWanted - self.NumWantedMatched) \ + self.NumUnwantedMatched otherCombined = (other._totalWanted - other.NumWantedMatched) \ + other.NumUnwantedMatched if combined != otherCombined: return combined < otherCombined success = combined == 0 otherSuccess = otherCombined == 0 if success != otherSuccess: return success if not success: return self.Length <= other.Length if Fitness.UseRegexLength else False return self.Length < other.Length def __str__(self): return "matches: {} wanted, {} unwanted, len {}".format( "all" if 
self._totalWanted == self.NumWantedMatched else self.NumWantedMatched, self.NumUnwantedMatched, self.Length) if __name__ == '__main__': unittest.main() ```
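The `Fitness.__gt__` above encodes a lexicographic preference: first minimize combined misses (wanted strings not matched plus unwanted strings matched); only once everything matches does regex length break ties. A trimmed, standalone copy of that comparison, kept separate from the file above purely for illustration:

```python
class Fitness:
    """Trimmed copy of the comparison logic above."""
    UseRegexLength = False

    def __init__(self, matched, total, unwanted, length):
        self.m, self.t, self.u, self.length = matched, total, unwanted, length

    def __gt__(self, other):
        combined = (self.t - self.m) + self.u
        other_combined = (other.t - other.m) + other.u
        if combined != other_combined:
            return combined < other_combined  # fewer misses is better
        success, other_success = combined == 0, other_combined == 0
        if success != other_success:
            return success
        if not success:
            # While failing, length only matters when UseRegexLength is set.
            return self.length <= other.length if Fitness.UseRegexLength else False
        return self.length < other.length  # among successes, shorter wins
```

So a fully matching 10-character regex beats a near-miss 5-character one, and among full matches the shorter pattern wins.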
```go //nolint:lll package nodejs import ( "bytes" "encoding/json" "io" "os" "os/exec" "path/filepath" "strings" "sync" "testing" "github.com/stretchr/testify/require" "github.com/pulumi/pulumi/pkg/v3/codegen/schema" "github.com/pulumi/pulumi/pkg/v3/codegen/testing/test" ) // For better CI test to job distribution, we split the test cases into three tests. var genPkgBatchSize = len(test.PulumiPulumiSDKTests) / 3 func TestGeneratePackageOne(t *testing.T) { t.Parallel() testGeneratePackageBatch(t, test.PulumiPulumiSDKTests[0:genPkgBatchSize]) } func TestGeneratePackageTwo(t *testing.T) { t.Parallel() testGeneratePackageBatch(t, test.PulumiPulumiSDKTests[genPkgBatchSize:2*genPkgBatchSize]) } func TestGeneratePackageThree(t *testing.T) { t.Parallel() testGeneratePackageBatch(t, test.PulumiPulumiSDKTests[2*genPkgBatchSize:]) } func testGeneratePackageBatch(t *testing.T, testCases []*test.SDKTest) { test.TestSDKCodegen(t, &test.SDKCodegenOptions{ Language: "nodejs", GenPackage: func(s string, p *schema.Package, m map[string][]byte) (map[string][]byte, error) { return GeneratePackage(s, p, m, nil, false) }, Checks: map[string]test.CodegenCheck{ "nodejs/compile": func(t *testing.T, pwd string) { typeCheckGeneratedPackage(t, pwd, true) }, "nodejs/test": testGeneratedPackage, }, TestCases: testCases, }) } // Runs unit tests against the generated code. func testGeneratedPackage(t *testing.T, pwd string) { // Some tests do not have mocha as a dependency. hasMocha := false for _, c := range getYarnCommands(t, pwd) { if c == "mocha" { hasMocha = true break } } // We are attempting to ensure that we don't write tests that are not run. The `nodejs-extras` // folder exists to mixin tests of the form `*.spec.ts`. We assume that if this folder is // present and contains `*.spec.ts` files, we want to run those tests.
foundTests := false findTests := func(path string, _ os.DirEntry, _ error) error { if strings.HasSuffix(path, ".spec.ts") { foundTests = true } return nil } mixinFolder := filepath.Join(filepath.Dir(pwd), "nodejs-extras") if err := filepath.WalkDir(mixinFolder, findTests); !hasMocha && !os.IsNotExist(err) && foundTests { t.Errorf("%s has at least one nodejs-extras/**/*.spec.ts file, but does not have mocha as a dependency."+ " Tests were not run. Please add mocha as a dependency in the schema or remove the *.spec.ts files.", pwd) } if hasMocha { // If mocha is a dev dependency but no test files exist, this will fail. test.RunCommand(t, "mocha", pwd, "yarn", "run", "mocha", "--require", "ts-node/register", "tests/**/*.spec.ts") } else { t.Logf("No mocha tests found for %s", pwd) } } // Get the commands runnable with yarn run func getYarnCommands(t *testing.T, pwd string) []string { cmd := exec.Command("yarn", "run", "--json") cmd.Dir = pwd out, err := cmd.Output() if err != nil { t.Errorf("Got error determining valid commands: %s", err) } dec := json.NewDecoder(bytes.NewReader(out)) parsed := []map[string]interface{}{} for { var m map[string]interface{} if err := dec.Decode(&m); err != nil { if err == io.EOF { break } t.FailNow() } parsed = append(parsed, m) } var cmds []string addProvidedCmds := func(c map[string]interface{}) { // If this fails, we want the test to fail. We don't want to accidentally skip tests.
data := c["data"].(map[string]interface{}) if data["type"] != "possibleCommands" { return } for _, cmd := range data["items"].([]interface{}) { cmds = append(cmds, cmd.(string)) } } addBinaryCmds := func(c map[string]interface{}) { data := c["data"].(string) if !strings.HasPrefix(data, "Commands available from binary scripts:") { return } cmdList := data[strings.Index(data, ":")+1:] for _, cmd := range strings.Split(cmdList, ",") { cmds = append(cmds, strings.TrimSpace(cmd)) } } for _, c := range parsed { switch c["type"] { case "list": addProvidedCmds(c) case "info": addBinaryCmds(c) } } t.Logf("Found yarn commands in %s: %v", pwd, cmds) return cmds } func TestGenerateTypeNames(t *testing.T) { t.Parallel() test.TestTypeNameCodegen(t, "nodejs", func(pkg *schema.Package) test.TypeNameGeneratorFunc { modules, info, err := generateModuleContextMap("test", pkg, nil) require.NoError(t, err) pkg.Language["nodejs"] = info root, ok := modules[""] require.True(t, ok) // Parallel tests will use the TypeNameGeneratorFunc // from multiple goroutines, but root.typeString is // not safe. Mutex is needed to avoid panics on // concurrent map write. // // Note this problem is test-only since prod code // works on a single goroutine.
var mutex sync.Mutex return func(t schema.Type) string { mutex.Lock() defer mutex.Unlock() return root.typeString(t, false, nil) } }) } func TestPascalCases(t *testing.T) { t.Parallel() tests := []struct { input string expected string }{ { input: "hi", expected: "Hi", }, { input: "NothingChanges", expected: "NothingChanges", }, { input: "everything-changed", expected: "EverythingChanged", }, } for _, tt := range tests { result := pascal(tt.input) require.Equal(t, tt.expected, result) } } func Test_isStringType(t *testing.T) { t.Parallel() tests := []struct { name string input schema.Type expected bool }{ {"string", schema.StringType, true}, {"int", schema.IntType, false}, {"Input[string]", &schema.InputType{ElementType: schema.StringType}, true}, {"Input[int]", &schema.InputType{ElementType: schema.IntType}, false}, {"StrictStringEnum", &schema.EnumType{ElementType: schema.StringType}, true}, {"StrictIntEnum", &schema.EnumType{ElementType: schema.IntType}, false}, {"RelaxedStringEnum", &schema.UnionType{ ElementTypes: []schema.Type{&schema.EnumType{ElementType: schema.StringType}, schema.StringType}, }, true}, {"RelaxedIntEnum", &schema.UnionType{ ElementTypes: []schema.Type{&schema.EnumType{ElementType: schema.IntType}, schema.IntType}, }, false}, } for _, tt := range tests { tt := tt t.Run(tt.name, func(t *testing.T) { t.Parallel() if got := isStringType(tt.input); got != tt.expected { t.Errorf("isStringType() = %v, want %v", got, tt.expected) } }) } } ```
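`yarn run --json` emits concatenated JSON objects rather than a single document, which is why `getYarnCommands` above loops a `json.Decoder` until `io.EOF`. The same stream decoding in Python, via `json.JSONDecoder.raw_decode` (a sketch of the idea):

```python
import json

def parse_json_stream(text):
    # Decode back-to-back JSON values, e.g. '{"a": 1}\n{"b": 2}'.
    decoder = json.JSONDecoder()
    values, idx = [], 0
    while idx < len(text):
        # Skip whitespace between values.
        while idx < len(text) and text[idx].isspace():
            idx += 1
        if idx == len(text):
            break
        value, idx = decoder.raw_decode(text, idx)
        values.append(value)
    return values
```

`raw_decode` returns both the decoded value and the index where it ended, which is exactly what a concatenated-stream loop needs.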
```typescript import angular from 'angular'; import { r2a } from '@/react-tools/react2angular'; import { withUIRouter } from '@/react-tools/withUIRouter'; import { withCurrentUser } from '@/react-tools/withCurrentUser'; import { withReactQuery } from '@/react-tools/withReactQuery'; import { NamespacesDatatable } from '@/react/kubernetes/namespaces/ListView/NamespacesDatatable'; import { NamespaceAppsDatatable } from '@/react/kubernetes/namespaces/ItemView/NamespaceAppsDatatable'; import { NamespaceAccessDatatable } from '@/react/kubernetes/namespaces/AccessView/AccessDatatable'; export const namespacesModule = angular .module('portainer.kubernetes.react.components.namespaces', []) .component( 'kubernetesNamespacesDatatable', r2a(withUIRouter(withCurrentUser(NamespacesDatatable)), [ 'dataset', 'onRemove', 'onRefresh', ]) ) .component( 'kubernetesNamespaceApplicationsDatatable', r2a(withUIRouter(withCurrentUser(NamespaceAppsDatatable)), [ 'dataset', 'isLoading', 'onRefresh', ]) ) .component( 'namespaceAccessDatatable', r2a(withUIRouter(withReactQuery(NamespaceAccessDatatable)), [ 'dataset', 'onRemove', ]) ).name; ```
```c /* slow_protocol_subtypes.h * Defines subtypes for 802.3 "slow protocols" * * * Wireshark - Network traffic analyzer * By Gerald Combs <gerald@wireshark.org> * * This program is free software; you can redistribute it and/or * as published by the Free Software Foundation; either version 2 * * This program is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the * * along with this program; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. */ #ifndef __SLOW_PROTOCOL_SUBTYPES_H__ #define __SLOW_PROTOCOL_SUBTYPES_H__ #define LACP_SUBTYPE 0x1 #define MARKER_SUBTYPE 0x2 #define OAM_SUBTYPE 0x3 #define OSSP_SUBTYPE 0xa /* IEEE 802.3 Annex 57A*/ #endif /* __SLOW_PROTOCOL_SUBTYPES_H__ */ ```
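Subtype constants like these are typically used to pick a subdissector from the first byte of a slow-protocol frame's payload. A minimal lookup sketch mirroring the header above:

```python
# Subtype values from the IEEE 802.3 "slow protocols" header above.
SLOW_PROTOCOL_SUBTYPES = {
    0x1: "LACP",
    0x2: "Marker",
    0x3: "OAM",
    0xA: "OSSP",  # IEEE 802.3 Annex 57A
}

def subtype_name(first_byte):
    """Map a slow-protocol subtype byte to a protocol name."""
    return SLOW_PROTOCOL_SUBTYPES.get(first_byte, "Unknown")
```

Unassigned subtype values fall through to "Unknown", the usual behavior for a dissector dispatch table.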
Dos Lunas is a line of 100% blue agave tequilas produced in Guadalajara, Jalisco, Mexico, by Dos Lunas Spirits, LLC, headquartered in El Paso, Texas. The brand includes the first triple-distilled silver tequila, a nine-month-aged Reposado, an 18-month-aged Añejo, and a 10-year-old Extra Añejo.

History
On New Year's Eve 2005, Dos Lunas' founder, Richard C. Poe II, was at a bar in Telluride, Colorado. After witnessing several patrons order tequila and visibly cringe after tasting it, he had the idea to create a more premium brand of tequila that would taste good and be more enjoyable to drink. In 2006, he visited Guadalajara, Mexico, where he secured a master distiller and began producing Dos Lunas.

Packaging
The bottles used for the Silver, Reposado, and Añejo are made of frosted glass with a small, clear pane in the center which looks onto, respectively, an agave plant, tequila barrels, and an agave harvester. The bottles won two silver medals for design at the San Francisco World Spirits Competition in 2007. The Grand Reserve tequila is bottled in the first Baccarat crystal decanter ever made for tequila. The bottles are individually numbered, with 1,000 having been produced.

Production
Dos Lunas Tequila is made from 100% Weber blue agave grown near Zapotlanejo, Jalisco, in the tequila-producing region of Mexico. All-natural fertilizer and naturally derived yeast are used, while no added enzymes, synthetic anti-bacterial agents, or glycerine are used at any point during the growth, production, or bottling processes. It is also certified Kosher. Dos Lunas Tequila is one of the few brands to avoid off-site bottlers, instead filling each bottle by hand in the same distillery where the tequila is produced. The bottles are also washed in tequila.

Silver
Dos Lunas Silver was introduced in 2006. The silver is cold-filtered, triple-distilled, and aged for six days in new white oak barrels.

Reposado
Dos Lunas Reposado was also introduced in 2006.
The reposado is double-distilled and rested for nine months in Tennessee bourbon casks and in new American white oak barrels. At the end of the nine months, the tequilas from both casks are blended to create a consistently smooth and mellow taste.

Añejo
Dos Lunas Añejo was introduced in 2008. Like the reposado, it is aged in both Tennessee whiskey casks and new American white oak casks, but for twice as long, being blended and bottled after 18 months as a distinctive sipping tequila.

Extra Añejo
Dos Lunas' Extra Añejo Grand Reserve is a special-edition tequila made in limited quantity: 1,000 bottles were produced, from tequila aged for 10 years in Spanish sherry casks.

Awards
Dos Lunas tequilas have ranked in the top 10% of tequilas in each category according to the Beverage Testing Institute (BTI).

Silver
2007: Silver Medal at the San Francisco World Spirits Competition
2008: Bronze Medal at the San Francisco World Spirits Competition; Bronze Medal at the Agave Spirits Challenge in Cancun, Mexico
2009: Silver Medal at the San Francisco World Spirits Competition; Gold Medal at the Beverage Testing Institute's International Review of Spirits
2010: Gold Medal at the San Francisco World Spirits Competition
2011: Silver Medal at the Hong Kong International Wine & Spirits Competition
2012: Gold Medal at the Beverly Hills International Spirits Award
2013: Silver Medal at the San Francisco World Spirits Competition; Silver Medal at the International Spirits Challenge; Platinum Medal at the World Spirits Competition
Total: 11 awards (3 Gold, 5 Silver, 2 Bronze, 1 Platinum)

Reposado
Ranked #1 in the world by both the Beverage Testing Institute (BTI) and the San Francisco World Spirits Competition two years in a row (2009, 2010).
2006: Gold Medal at the Beverage Testing Institute's International Review of Spirits
2007: Silver Medal at the San Francisco World Spirits Competition
2008: Bronze Medal at the Los Angeles International Wine and Spirits Competition;
Gold Medal at the Beverage Testing Institute's International Review of Spirits; Silver Medal at the Agave Spirits Challenge in Cancun, Mexico
2009: Silver Medal at the Los Angeles International Wine and Spirits Competition; Gold Medal at the Beverage Testing Institute's International Review of Spirits
2010: Double Gold Award at the San Francisco World Spirits Competition
2011: Silver Medal at the Hong Kong International Wine & Spirits Competition
2012: Gold Medal at the Beverly Hills International Spirits Award
2013: Double Gold Medal at the San Francisco World Spirits Competition; Bronze Medal at the International Spirits Challenge
Total: 12 awards (2 Double Gold, 4 Gold, 4 Silver, 2 Bronze)

Añejo
2008: Silver Medal at the San Francisco World Spirits Competition; Silver Medal at the Beverage Testing Institute's International Review of Spirits; Gold Medal at the Agave Spirits Challenge in Cancun, Mexico
2009: Silver Medal at the Los Angeles International Wine and Spirits Competition; Gold Medal at the Beverage Testing Institute's International Review of Spirits
2010: Bronze Medal at the San Francisco World Spirits Competition
2011: Silver Medal at the Hong Kong International Wine & Spirits Competition
2012: Gold Medal at the Beverly Hills International Spirits Award; Platinum Medal at the World Spirits Competition
2013: Silver Medal at the San Francisco World Spirits Competition
Total: 10 awards (3 Gold, 5 Silver, 1 Bronze, 1 Platinum)

References

External links
Interview with founder at YouTube (company posted)
Reviews at About.com Cocktails

2006 establishments in Mexico
Food and drink companies established in 2006
Mexican brands
Kosher drinks
Tequila
Food and drink companies based in El Paso, Texas
```go
// Code generated by protoc-gen-gogo. DO NOT EDIT.
// source: github.com/m3db/m3/src/query/generated/proto/admin/placement.proto
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
// THE SOFTWARE.

package admin

import proto "github.com/gogo/protobuf/proto"
import fmt "fmt"
import math "math"

import placementpb "github.com/m3db/m3/src/cluster/generated/proto/placementpb"

import io "io"

// Reference imports to suppress errors if they are not otherwise used.
var _ = proto.Marshal
var _ = fmt.Errorf
var _ = math.Inf

type PlacementInitRequest struct {
	Instances         []*placementpb.Instance `protobuf:"bytes,1,rep,name=instances" json:"instances,omitempty"`
	NumShards         int32                   `protobuf:"varint,2,opt,name=num_shards,json=numShards,proto3" json:"num_shards,omitempty"`
	ReplicationFactor int32                   `protobuf:"varint,3,opt,name=replication_factor,json=replicationFactor,proto3" json:"replication_factor,omitempty"`
	OptionOverride    *placementpb.Options    `protobuf:"bytes,99,opt,name=option_override,json=optionOverride" json:"option_override,omitempty"`
}

func (m *PlacementInitRequest) Reset()         { *m = PlacementInitRequest{} }
func (m *PlacementInitRequest) String() string { return proto.CompactTextString(m) }
func (*PlacementInitRequest) ProtoMessage()    {}
func (*PlacementInitRequest) Descriptor() ([]byte, []int) {
	return fileDescriptorPlacement, []int{0}
}

func (m *PlacementInitRequest) GetInstances() []*placementpb.Instance {
	if m != nil {
		return m.Instances
	}
	return nil
}

func (m *PlacementInitRequest) GetNumShards() int32 {
	if m != nil {
		return m.NumShards
	}
	return 0
}

func (m *PlacementInitRequest) GetReplicationFactor() int32 {
	if m != nil {
		return m.ReplicationFactor
	}
	return 0
}

func (m *PlacementInitRequest) GetOptionOverride() *placementpb.Options {
	if m != nil {
		return m.OptionOverride
	}
	return nil
}

type PlacementGetResponse struct {
	Placement *placementpb.Placement `protobuf:"bytes,1,opt,name=placement" json:"placement,omitempty"`
	Version   int32                  `protobuf:"varint,2,opt,name=version,proto3" json:"version,omitempty"`
}

func (m *PlacementGetResponse) Reset()         { *m = PlacementGetResponse{} }
func (m *PlacementGetResponse) String() string { return proto.CompactTextString(m) }
func (*PlacementGetResponse) ProtoMessage()    {}
func (*PlacementGetResponse) Descriptor() ([]byte, []int) {
	return fileDescriptorPlacement, []int{1}
}

func (m *PlacementGetResponse) GetPlacement() *placementpb.Placement {
	if m != nil {
		return m.Placement
	}
	return nil
}
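// Note (illustrative comment, not part of the generated file): every
// generated getter above guards with `if m != nil`, so in Go these
// pointer-receiver methods are safe to call on a nil message and return
// the field's zero value instead of panicking, e.g.:
//
//	var req *PlacementInitRequest   // nil
//	shards := req.GetNumShards()    // 0, no panic
//	opts := req.GetOptionOverride() // nil, no panic
//
// This lets callers chain reads through possibly-nil nested messages
// without explicit nil checks.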
func (m *PlacementGetResponse) GetVersion() int32 { if m != nil { return m.Version } return 0 } type PlacementAddRequest struct { Instances []*placementpb.Instance `protobuf:"bytes,1,rep,name=instances" json:"instances,omitempty"` // By default add requests will only succeed if all instances in the placement // are AVAILABLE for all their shards. force overrides that. Force bool `protobuf:"varint,2,opt,name=force,proto3" json:"force,omitempty"` OptionOverride *placementpb.Options `protobuf:"bytes,99,opt,name=option_override,json=optionOverride" json:"option_override,omitempty"` } func (m *PlacementAddRequest) Reset() { *m = PlacementAddRequest{} } func (m *PlacementAddRequest) String() string { return proto.CompactTextString(m) } func (*PlacementAddRequest) ProtoMessage() {} func (*PlacementAddRequest) Descriptor() ([]byte, []int) { return fileDescriptorPlacement, []int{2} } func (m *PlacementAddRequest) GetInstances() []*placementpb.Instance { if m != nil { return m.Instances } return nil } func (m *PlacementAddRequest) GetForce() bool { if m != nil { return m.Force } return false } func (m *PlacementAddRequest) GetOptionOverride() *placementpb.Options { if m != nil { return m.OptionOverride } return nil } type PlacementRemoveRequest struct { InstanceIds []string `protobuf:"bytes,1,rep,name=instance_ids,json=instanceIds" json:"instance_ids,omitempty"` Force bool `protobuf:"varint,2,opt,name=force,proto3" json:"force,omitempty"` OptionOverride *placementpb.Options `protobuf:"bytes,99,opt,name=option_override,json=optionOverride" json:"option_override,omitempty"` } func (m *PlacementRemoveRequest) Reset() { *m = PlacementRemoveRequest{} } func (m *PlacementRemoveRequest) String() string { return proto.CompactTextString(m) } func (*PlacementRemoveRequest) ProtoMessage() {} func (*PlacementRemoveRequest) Descriptor() ([]byte, []int) { return fileDescriptorPlacement, []int{3} } func (m *PlacementRemoveRequest) GetInstanceIds() []string { if m != nil { return 
m.InstanceIds } return nil } func (m *PlacementRemoveRequest) GetForce() bool { if m != nil { return m.Force } return false } func (m *PlacementRemoveRequest) GetOptionOverride() *placementpb.Options { if m != nil { return m.OptionOverride } return nil } type PlacementReplaceRequest struct { LeavingInstanceIDs []string `protobuf:"bytes,1,rep,name=leavingInstanceIDs" json:"leavingInstanceIDs,omitempty"` Candidates []*placementpb.Instance `protobuf:"bytes,2,rep,name=candidates" json:"candidates,omitempty"` Force bool `protobuf:"varint,3,opt,name=force,proto3" json:"force,omitempty"` OptionOverride *placementpb.Options `protobuf:"bytes,99,opt,name=option_override,json=optionOverride" json:"option_override,omitempty"` } func (m *PlacementReplaceRequest) Reset() { *m = PlacementReplaceRequest{} } func (m *PlacementReplaceRequest) String() string { return proto.CompactTextString(m) } func (*PlacementReplaceRequest) ProtoMessage() {} func (*PlacementReplaceRequest) Descriptor() ([]byte, []int) { return fileDescriptorPlacement, []int{4} } func (m *PlacementReplaceRequest) GetLeavingInstanceIDs() []string { if m != nil { return m.LeavingInstanceIDs } return nil } func (m *PlacementReplaceRequest) GetCandidates() []*placementpb.Instance { if m != nil { return m.Candidates } return nil } func (m *PlacementReplaceRequest) GetForce() bool { if m != nil { return m.Force } return false } func (m *PlacementReplaceRequest) GetOptionOverride() *placementpb.Options { if m != nil { return m.OptionOverride } return nil } type PlacementSetRequest struct { Placement *placementpb.Placement `protobuf:"bytes,1,opt,name=placement" json:"placement,omitempty"` Version int32 `protobuf:"varint,2,opt,name=version,proto3" json:"version,omitempty"` // Confirm must be set, otherwise just a dry run is executed. Confirm bool `protobuf:"varint,3,opt,name=confirm,proto3" json:"confirm,omitempty"` // Force will skip validating the placement. 
Force bool `protobuf:"varint,4,opt,name=force,proto3" json:"force,omitempty"` } func (m *PlacementSetRequest) Reset() { *m = PlacementSetRequest{} } func (m *PlacementSetRequest) String() string { return proto.CompactTextString(m) } func (*PlacementSetRequest) ProtoMessage() {} func (*PlacementSetRequest) Descriptor() ([]byte, []int) { return fileDescriptorPlacement, []int{5} } func (m *PlacementSetRequest) GetPlacement() *placementpb.Placement { if m != nil { return m.Placement } return nil } func (m *PlacementSetRequest) GetVersion() int32 { if m != nil { return m.Version } return 0 } func (m *PlacementSetRequest) GetConfirm() bool { if m != nil { return m.Confirm } return false } func (m *PlacementSetRequest) GetForce() bool { if m != nil { return m.Force } return false } type PlacementSetResponse struct { Placement *placementpb.Placement `protobuf:"bytes,1,opt,name=placement" json:"placement,omitempty"` Version int32 `protobuf:"varint,2,opt,name=version,proto3" json:"version,omitempty"` DryRun bool `protobuf:"varint,3,opt,name=dryRun,proto3" json:"dryRun,omitempty"` } func (m *PlacementSetResponse) Reset() { *m = PlacementSetResponse{} } func (m *PlacementSetResponse) String() string { return proto.CompactTextString(m) } func (*PlacementSetResponse) ProtoMessage() {} func (*PlacementSetResponse) Descriptor() ([]byte, []int) { return fileDescriptorPlacement, []int{6} } func (m *PlacementSetResponse) GetPlacement() *placementpb.Placement { if m != nil { return m.Placement } return nil } func (m *PlacementSetResponse) GetVersion() int32 { if m != nil { return m.Version } return 0 } func (m *PlacementSetResponse) GetDryRun() bool { if m != nil { return m.DryRun } return false } func init() { proto.RegisterType((*PlacementInitRequest)(nil), "admin.PlacementInitRequest") proto.RegisterType((*PlacementGetResponse)(nil), "admin.PlacementGetResponse") proto.RegisterType((*PlacementAddRequest)(nil), "admin.PlacementAddRequest") 
proto.RegisterType((*PlacementRemoveRequest)(nil), "admin.PlacementRemoveRequest") proto.RegisterType((*PlacementReplaceRequest)(nil), "admin.PlacementReplaceRequest") proto.RegisterType((*PlacementSetRequest)(nil), "admin.PlacementSetRequest") proto.RegisterType((*PlacementSetResponse)(nil), "admin.PlacementSetResponse") } func (m *PlacementInitRequest) Marshal() (dAtA []byte, err error) { size := m.Size() dAtA = make([]byte, size) n, err := m.MarshalTo(dAtA) if err != nil { return nil, err } return dAtA[:n], nil } func (m *PlacementInitRequest) MarshalTo(dAtA []byte) (int, error) { var i int _ = i var l int _ = l if len(m.Instances) > 0 { for _, msg := range m.Instances { dAtA[i] = 0xa i++ i = encodeVarintPlacement(dAtA, i, uint64(msg.Size())) n, err := msg.MarshalTo(dAtA[i:]) if err != nil { return 0, err } i += n } } if m.NumShards != 0 { dAtA[i] = 0x10 i++ i = encodeVarintPlacement(dAtA, i, uint64(m.NumShards)) } if m.ReplicationFactor != 0 { dAtA[i] = 0x18 i++ i = encodeVarintPlacement(dAtA, i, uint64(m.ReplicationFactor)) } if m.OptionOverride != nil { dAtA[i] = 0x9a i++ dAtA[i] = 0x6 i++ i = encodeVarintPlacement(dAtA, i, uint64(m.OptionOverride.Size())) n1, err := m.OptionOverride.MarshalTo(dAtA[i:]) if err != nil { return 0, err } i += n1 } return i, nil } func (m *PlacementGetResponse) Marshal() (dAtA []byte, err error) { size := m.Size() dAtA = make([]byte, size) n, err := m.MarshalTo(dAtA) if err != nil { return nil, err } return dAtA[:n], nil } func (m *PlacementGetResponse) MarshalTo(dAtA []byte) (int, error) { var i int _ = i var l int _ = l if m.Placement != nil { dAtA[i] = 0xa i++ i = encodeVarintPlacement(dAtA, i, uint64(m.Placement.Size())) n2, err := m.Placement.MarshalTo(dAtA[i:]) if err != nil { return 0, err } i += n2 } if m.Version != 0 { dAtA[i] = 0x10 i++ i = encodeVarintPlacement(dAtA, i, uint64(m.Version)) } return i, nil } func (m *PlacementAddRequest) Marshal() (dAtA []byte, err error) { size := m.Size() dAtA = make([]byte, size) n, 
err := m.MarshalTo(dAtA) if err != nil { return nil, err } return dAtA[:n], nil } func (m *PlacementAddRequest) MarshalTo(dAtA []byte) (int, error) { var i int _ = i var l int _ = l if len(m.Instances) > 0 { for _, msg := range m.Instances { dAtA[i] = 0xa i++ i = encodeVarintPlacement(dAtA, i, uint64(msg.Size())) n, err := msg.MarshalTo(dAtA[i:]) if err != nil { return 0, err } i += n } } if m.Force { dAtA[i] = 0x10 i++ if m.Force { dAtA[i] = 1 } else { dAtA[i] = 0 } i++ } if m.OptionOverride != nil { dAtA[i] = 0x9a i++ dAtA[i] = 0x6 i++ i = encodeVarintPlacement(dAtA, i, uint64(m.OptionOverride.Size())) n3, err := m.OptionOverride.MarshalTo(dAtA[i:]) if err != nil { return 0, err } i += n3 } return i, nil } func (m *PlacementRemoveRequest) Marshal() (dAtA []byte, err error) { size := m.Size() dAtA = make([]byte, size) n, err := m.MarshalTo(dAtA) if err != nil { return nil, err } return dAtA[:n], nil } func (m *PlacementRemoveRequest) MarshalTo(dAtA []byte) (int, error) { var i int _ = i var l int _ = l if len(m.InstanceIds) > 0 { for _, s := range m.InstanceIds { dAtA[i] = 0xa i++ l = len(s) for l >= 1<<7 { dAtA[i] = uint8(uint64(l)&0x7f | 0x80) l >>= 7 i++ } dAtA[i] = uint8(l) i++ i += copy(dAtA[i:], s) } } if m.Force { dAtA[i] = 0x10 i++ if m.Force { dAtA[i] = 1 } else { dAtA[i] = 0 } i++ } if m.OptionOverride != nil { dAtA[i] = 0x9a i++ dAtA[i] = 0x6 i++ i = encodeVarintPlacement(dAtA, i, uint64(m.OptionOverride.Size())) n4, err := m.OptionOverride.MarshalTo(dAtA[i:]) if err != nil { return 0, err } i += n4 } return i, nil } func (m *PlacementReplaceRequest) Marshal() (dAtA []byte, err error) { size := m.Size() dAtA = make([]byte, size) n, err := m.MarshalTo(dAtA) if err != nil { return nil, err } return dAtA[:n], nil } func (m *PlacementReplaceRequest) MarshalTo(dAtA []byte) (int, error) { var i int _ = i var l int _ = l if len(m.LeavingInstanceIDs) > 0 { for _, s := range m.LeavingInstanceIDs { dAtA[i] = 0xa i++ l = len(s) for l >= 1<<7 { dAtA[i] = 
uint8(uint64(l)&0x7f | 0x80) l >>= 7 i++ } dAtA[i] = uint8(l) i++ i += copy(dAtA[i:], s) } } if len(m.Candidates) > 0 { for _, msg := range m.Candidates { dAtA[i] = 0x12 i++ i = encodeVarintPlacement(dAtA, i, uint64(msg.Size())) n, err := msg.MarshalTo(dAtA[i:]) if err != nil { return 0, err } i += n } } if m.Force { dAtA[i] = 0x18 i++ if m.Force { dAtA[i] = 1 } else { dAtA[i] = 0 } i++ } if m.OptionOverride != nil { dAtA[i] = 0x9a i++ dAtA[i] = 0x6 i++ i = encodeVarintPlacement(dAtA, i, uint64(m.OptionOverride.Size())) n5, err := m.OptionOverride.MarshalTo(dAtA[i:]) if err != nil { return 0, err } i += n5 } return i, nil } func (m *PlacementSetRequest) Marshal() (dAtA []byte, err error) { size := m.Size() dAtA = make([]byte, size) n, err := m.MarshalTo(dAtA) if err != nil { return nil, err } return dAtA[:n], nil } func (m *PlacementSetRequest) MarshalTo(dAtA []byte) (int, error) { var i int _ = i var l int _ = l if m.Placement != nil { dAtA[i] = 0xa i++ i = encodeVarintPlacement(dAtA, i, uint64(m.Placement.Size())) n6, err := m.Placement.MarshalTo(dAtA[i:]) if err != nil { return 0, err } i += n6 } if m.Version != 0 { dAtA[i] = 0x10 i++ i = encodeVarintPlacement(dAtA, i, uint64(m.Version)) } if m.Confirm { dAtA[i] = 0x18 i++ if m.Confirm { dAtA[i] = 1 } else { dAtA[i] = 0 } i++ } if m.Force { dAtA[i] = 0x20 i++ if m.Force { dAtA[i] = 1 } else { dAtA[i] = 0 } i++ } return i, nil } func (m *PlacementSetResponse) Marshal() (dAtA []byte, err error) { size := m.Size() dAtA = make([]byte, size) n, err := m.MarshalTo(dAtA) if err != nil { return nil, err } return dAtA[:n], nil } func (m *PlacementSetResponse) MarshalTo(dAtA []byte) (int, error) { var i int _ = i var l int _ = l if m.Placement != nil { dAtA[i] = 0xa i++ i = encodeVarintPlacement(dAtA, i, uint64(m.Placement.Size())) n7, err := m.Placement.MarshalTo(dAtA[i:]) if err != nil { return 0, err } i += n7 } if m.Version != 0 { dAtA[i] = 0x10 i++ i = encodeVarintPlacement(dAtA, i, uint64(m.Version)) } if m.DryRun { 
dAtA[i] = 0x18 i++ if m.DryRun { dAtA[i] = 1 } else { dAtA[i] = 0 } i++ } return i, nil } func encodeVarintPlacement(dAtA []byte, offset int, v uint64) int { for v >= 1<<7 { dAtA[offset] = uint8(v&0x7f | 0x80) v >>= 7 offset++ } dAtA[offset] = uint8(v) return offset + 1 } func (m *PlacementInitRequest) Size() (n int) { var l int _ = l if len(m.Instances) > 0 { for _, e := range m.Instances { l = e.Size() n += 1 + l + sovPlacement(uint64(l)) } } if m.NumShards != 0 { n += 1 + sovPlacement(uint64(m.NumShards)) } if m.ReplicationFactor != 0 { n += 1 + sovPlacement(uint64(m.ReplicationFactor)) } if m.OptionOverride != nil { l = m.OptionOverride.Size() n += 2 + l + sovPlacement(uint64(l)) } return n } func (m *PlacementGetResponse) Size() (n int) { var l int _ = l if m.Placement != nil { l = m.Placement.Size() n += 1 + l + sovPlacement(uint64(l)) } if m.Version != 0 { n += 1 + sovPlacement(uint64(m.Version)) } return n } func (m *PlacementAddRequest) Size() (n int) { var l int _ = l if len(m.Instances) > 0 { for _, e := range m.Instances { l = e.Size() n += 1 + l + sovPlacement(uint64(l)) } } if m.Force { n += 2 } if m.OptionOverride != nil { l = m.OptionOverride.Size() n += 2 + l + sovPlacement(uint64(l)) } return n } func (m *PlacementRemoveRequest) Size() (n int) { var l int _ = l if len(m.InstanceIds) > 0 { for _, s := range m.InstanceIds { l = len(s) n += 1 + l + sovPlacement(uint64(l)) } } if m.Force { n += 2 } if m.OptionOverride != nil { l = m.OptionOverride.Size() n += 2 + l + sovPlacement(uint64(l)) } return n } func (m *PlacementReplaceRequest) Size() (n int) { var l int _ = l if len(m.LeavingInstanceIDs) > 0 { for _, s := range m.LeavingInstanceIDs { l = len(s) n += 1 + l + sovPlacement(uint64(l)) } } if len(m.Candidates) > 0 { for _, e := range m.Candidates { l = e.Size() n += 1 + l + sovPlacement(uint64(l)) } } if m.Force { n += 2 } if m.OptionOverride != nil { l = m.OptionOverride.Size() n += 2 + l + sovPlacement(uint64(l)) } return n } func (m 
*PlacementSetRequest) Size() (n int) { var l int _ = l if m.Placement != nil { l = m.Placement.Size() n += 1 + l + sovPlacement(uint64(l)) } if m.Version != 0 { n += 1 + sovPlacement(uint64(m.Version)) } if m.Confirm { n += 2 } if m.Force { n += 2 } return n } func (m *PlacementSetResponse) Size() (n int) { var l int _ = l if m.Placement != nil { l = m.Placement.Size() n += 1 + l + sovPlacement(uint64(l)) } if m.Version != 0 { n += 1 + sovPlacement(uint64(m.Version)) } if m.DryRun { n += 2 } return n } func sovPlacement(x uint64) (n int) { for { n++ x >>= 7 if x == 0 { break } } return n } func sozPlacement(x uint64) (n int) { return sovPlacement(uint64((x << 1) ^ uint64((int64(x) >> 63)))) } func (m *PlacementInitRequest) Unmarshal(dAtA []byte) error { l := len(dAtA) iNdEx := 0 for iNdEx < l { preIndex := iNdEx var wire uint64 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ wire |= (uint64(b) & 0x7F) << shift if b < 0x80 { break } } fieldNum := int32(wire >> 3) wireType := int(wire & 0x7) if wireType == 4 { return fmt.Errorf("proto: PlacementInitRequest: wiretype end group for non-group") } if fieldNum <= 0 { return fmt.Errorf("proto: PlacementInitRequest: illegal tag %d (wire type %d)", fieldNum, wire) } switch fieldNum { case 1: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field Instances", wireType) } var msglen int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ msglen |= (int(b) & 0x7F) << shift if b < 0x80 { break } } if msglen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + msglen if postIndex > l { return io.ErrUnexpectedEOF } m.Instances = append(m.Instances, &placementpb.Instance{}) if err := m.Instances[len(m.Instances)-1].Unmarshal(dAtA[iNdEx:postIndex]); err != nil { return err } iNdEx = postIndex 
case 2: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field NumShards", wireType) } m.NumShards = 0 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ m.NumShards |= (int32(b) & 0x7F) << shift if b < 0x80 { break } } case 3: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field ReplicationFactor", wireType) } m.ReplicationFactor = 0 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ m.ReplicationFactor |= (int32(b) & 0x7F) << shift if b < 0x80 { break } } case 99: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field OptionOverride", wireType) } var msglen int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ msglen |= (int(b) & 0x7F) << shift if b < 0x80 { break } } if msglen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + msglen if postIndex > l { return io.ErrUnexpectedEOF } if m.OptionOverride == nil { m.OptionOverride = &placementpb.Options{} } if err := m.OptionOverride.Unmarshal(dAtA[iNdEx:postIndex]); err != nil { return err } iNdEx = postIndex default: iNdEx = preIndex skippy, err := skipPlacement(dAtA[iNdEx:]) if err != nil { return err } if skippy < 0 { return ErrInvalidLengthPlacement } if (iNdEx + skippy) > l { return io.ErrUnexpectedEOF } iNdEx += skippy } } if iNdEx > l { return io.ErrUnexpectedEOF } return nil } func (m *PlacementGetResponse) Unmarshal(dAtA []byte) error { l := len(dAtA) iNdEx := 0 for iNdEx < l { preIndex := iNdEx var wire uint64 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ wire |= (uint64(b) & 0x7F) << shift if b < 0x80 { break } 
} fieldNum := int32(wire >> 3) wireType := int(wire & 0x7) if wireType == 4 { return fmt.Errorf("proto: PlacementGetResponse: wiretype end group for non-group") } if fieldNum <= 0 { return fmt.Errorf("proto: PlacementGetResponse: illegal tag %d (wire type %d)", fieldNum, wire) } switch fieldNum { case 1: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field Placement", wireType) } var msglen int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ msglen |= (int(b) & 0x7F) << shift if b < 0x80 { break } } if msglen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + msglen if postIndex > l { return io.ErrUnexpectedEOF } if m.Placement == nil { m.Placement = &placementpb.Placement{} } if err := m.Placement.Unmarshal(dAtA[iNdEx:postIndex]); err != nil { return err } iNdEx = postIndex case 2: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field Version", wireType) } m.Version = 0 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ m.Version |= (int32(b) & 0x7F) << shift if b < 0x80 { break } } default: iNdEx = preIndex skippy, err := skipPlacement(dAtA[iNdEx:]) if err != nil { return err } if skippy < 0 { return ErrInvalidLengthPlacement } if (iNdEx + skippy) > l { return io.ErrUnexpectedEOF } iNdEx += skippy } } if iNdEx > l { return io.ErrUnexpectedEOF } return nil } func (m *PlacementAddRequest) Unmarshal(dAtA []byte) error { l := len(dAtA) iNdEx := 0 for iNdEx < l { preIndex := iNdEx var wire uint64 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ wire |= (uint64(b) & 0x7F) << shift if b < 0x80 { break } } fieldNum := int32(wire >> 3) wireType := int(wire & 0x7) if wireType == 4 { return 
fmt.Errorf("proto: PlacementAddRequest: wiretype end group for non-group") } if fieldNum <= 0 { return fmt.Errorf("proto: PlacementAddRequest: illegal tag %d (wire type %d)", fieldNum, wire) } switch fieldNum { case 1: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field Instances", wireType) } var msglen int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ msglen |= (int(b) & 0x7F) << shift if b < 0x80 { break } } if msglen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + msglen if postIndex > l { return io.ErrUnexpectedEOF } m.Instances = append(m.Instances, &placementpb.Instance{}) if err := m.Instances[len(m.Instances)-1].Unmarshal(dAtA[iNdEx:postIndex]); err != nil { return err } iNdEx = postIndex case 2: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field Force", wireType) } var v int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ v |= (int(b) & 0x7F) << shift if b < 0x80 { break } } m.Force = bool(v != 0) case 99: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field OptionOverride", wireType) } var msglen int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ msglen |= (int(b) & 0x7F) << shift if b < 0x80 { break } } if msglen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + msglen if postIndex > l { return io.ErrUnexpectedEOF } if m.OptionOverride == nil { m.OptionOverride = &placementpb.Options{} } if err := m.OptionOverride.Unmarshal(dAtA[iNdEx:postIndex]); err != nil { return err } iNdEx = postIndex default: iNdEx = preIndex skippy, err := skipPlacement(dAtA[iNdEx:]) if err != nil { return err } if skippy < 0 { return ErrInvalidLengthPlacement } 
if (iNdEx + skippy) > l { return io.ErrUnexpectedEOF } iNdEx += skippy } } if iNdEx > l { return io.ErrUnexpectedEOF } return nil } func (m *PlacementRemoveRequest) Unmarshal(dAtA []byte) error { l := len(dAtA) iNdEx := 0 for iNdEx < l { preIndex := iNdEx var wire uint64 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ wire |= (uint64(b) & 0x7F) << shift if b < 0x80 { break } } fieldNum := int32(wire >> 3) wireType := int(wire & 0x7) if wireType == 4 { return fmt.Errorf("proto: PlacementRemoveRequest: wiretype end group for non-group") } if fieldNum <= 0 { return fmt.Errorf("proto: PlacementRemoveRequest: illegal tag %d (wire type %d)", fieldNum, wire) } switch fieldNum { case 1: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field InstanceIds", wireType) } var stringLen uint64 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ stringLen |= (uint64(b) & 0x7F) << shift if b < 0x80 { break } } intStringLen := int(stringLen) if intStringLen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + intStringLen if postIndex > l { return io.ErrUnexpectedEOF } m.InstanceIds = append(m.InstanceIds, string(dAtA[iNdEx:postIndex])) iNdEx = postIndex case 2: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field Force", wireType) } var v int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ v |= (int(b) & 0x7F) << shift if b < 0x80 { break } } m.Force = bool(v != 0) case 99: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field OptionOverride", wireType) } var msglen int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return 
io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ msglen |= (int(b) & 0x7F) << shift if b < 0x80 { break } } if msglen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + msglen if postIndex > l { return io.ErrUnexpectedEOF } if m.OptionOverride == nil { m.OptionOverride = &placementpb.Options{} } if err := m.OptionOverride.Unmarshal(dAtA[iNdEx:postIndex]); err != nil { return err } iNdEx = postIndex default: iNdEx = preIndex skippy, err := skipPlacement(dAtA[iNdEx:]) if err != nil { return err } if skippy < 0 { return ErrInvalidLengthPlacement } if (iNdEx + skippy) > l { return io.ErrUnexpectedEOF } iNdEx += skippy } } if iNdEx > l { return io.ErrUnexpectedEOF } return nil } func (m *PlacementReplaceRequest) Unmarshal(dAtA []byte) error { l := len(dAtA) iNdEx := 0 for iNdEx < l { preIndex := iNdEx var wire uint64 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ wire |= (uint64(b) & 0x7F) << shift if b < 0x80 { break } } fieldNum := int32(wire >> 3) wireType := int(wire & 0x7) if wireType == 4 { return fmt.Errorf("proto: PlacementReplaceRequest: wiretype end group for non-group") } if fieldNum <= 0 { return fmt.Errorf("proto: PlacementReplaceRequest: illegal tag %d (wire type %d)", fieldNum, wire) } switch fieldNum { case 1: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field LeavingInstanceIDs", wireType) } var stringLen uint64 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ stringLen |= (uint64(b) & 0x7F) << shift if b < 0x80 { break } } intStringLen := int(stringLen) if intStringLen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + intStringLen if postIndex > l { return io.ErrUnexpectedEOF } m.LeavingInstanceIDs = append(m.LeavingInstanceIDs, string(dAtA[iNdEx:postIndex])) iNdEx = postIndex case 2: if 
wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field Candidates", wireType) } var msglen int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ msglen |= (int(b) & 0x7F) << shift if b < 0x80 { break } } if msglen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + msglen if postIndex > l { return io.ErrUnexpectedEOF } m.Candidates = append(m.Candidates, &placementpb.Instance{}) if err := m.Candidates[len(m.Candidates)-1].Unmarshal(dAtA[iNdEx:postIndex]); err != nil { return err } iNdEx = postIndex case 3: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field Force", wireType) } var v int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ v |= (int(b) & 0x7F) << shift if b < 0x80 { break } } m.Force = bool(v != 0) case 99: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field OptionOverride", wireType) } var msglen int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ msglen |= (int(b) & 0x7F) << shift if b < 0x80 { break } } if msglen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + msglen if postIndex > l { return io.ErrUnexpectedEOF } if m.OptionOverride == nil { m.OptionOverride = &placementpb.Options{} } if err := m.OptionOverride.Unmarshal(dAtA[iNdEx:postIndex]); err != nil { return err } iNdEx = postIndex default: iNdEx = preIndex skippy, err := skipPlacement(dAtA[iNdEx:]) if err != nil { return err } if skippy < 0 { return ErrInvalidLengthPlacement } if (iNdEx + skippy) > l { return io.ErrUnexpectedEOF } iNdEx += skippy } } if iNdEx > l { return io.ErrUnexpectedEOF } return nil } func (m *PlacementSetRequest) Unmarshal(dAtA []byte) error { l := len(dAtA) iNdEx := 
0 for iNdEx < l { preIndex := iNdEx var wire uint64 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ wire |= (uint64(b) & 0x7F) << shift if b < 0x80 { break } } fieldNum := int32(wire >> 3) wireType := int(wire & 0x7) if wireType == 4 { return fmt.Errorf("proto: PlacementSetRequest: wiretype end group for non-group") } if fieldNum <= 0 { return fmt.Errorf("proto: PlacementSetRequest: illegal tag %d (wire type %d)", fieldNum, wire) } switch fieldNum { case 1: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field Placement", wireType) } var msglen int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ msglen |= (int(b) & 0x7F) << shift if b < 0x80 { break } } if msglen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + msglen if postIndex > l { return io.ErrUnexpectedEOF } if m.Placement == nil { m.Placement = &placementpb.Placement{} } if err := m.Placement.Unmarshal(dAtA[iNdEx:postIndex]); err != nil { return err } iNdEx = postIndex case 2: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field Version", wireType) } m.Version = 0 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ m.Version |= (int32(b) & 0x7F) << shift if b < 0x80 { break } } case 3: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field Confirm", wireType) } var v int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ v |= (int(b) & 0x7F) << shift if b < 0x80 { break } } m.Confirm = bool(v != 0) case 4: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field Force", wireType) } var v int 
for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ v |= (int(b) & 0x7F) << shift if b < 0x80 { break } } m.Force = bool(v != 0) default: iNdEx = preIndex skippy, err := skipPlacement(dAtA[iNdEx:]) if err != nil { return err } if skippy < 0 { return ErrInvalidLengthPlacement } if (iNdEx + skippy) > l { return io.ErrUnexpectedEOF } iNdEx += skippy } } if iNdEx > l { return io.ErrUnexpectedEOF } return nil } func (m *PlacementSetResponse) Unmarshal(dAtA []byte) error { l := len(dAtA) iNdEx := 0 for iNdEx < l { preIndex := iNdEx var wire uint64 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ wire |= (uint64(b) & 0x7F) << shift if b < 0x80 { break } } fieldNum := int32(wire >> 3) wireType := int(wire & 0x7) if wireType == 4 { return fmt.Errorf("proto: PlacementSetResponse: wiretype end group for non-group") } if fieldNum <= 0 { return fmt.Errorf("proto: PlacementSetResponse: illegal tag %d (wire type %d)", fieldNum, wire) } switch fieldNum { case 1: if wireType != 2 { return fmt.Errorf("proto: wrong wireType = %d for field Placement", wireType) } var msglen int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ msglen |= (int(b) & 0x7F) << shift if b < 0x80 { break } } if msglen < 0 { return ErrInvalidLengthPlacement } postIndex := iNdEx + msglen if postIndex > l { return io.ErrUnexpectedEOF } if m.Placement == nil { m.Placement = &placementpb.Placement{} } if err := m.Placement.Unmarshal(dAtA[iNdEx:postIndex]); err != nil { return err } iNdEx = postIndex case 2: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field Version", wireType) } m.Version = 0 for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement 
} if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ m.Version |= (int32(b) & 0x7F) << shift if b < 0x80 { break } } case 3: if wireType != 0 { return fmt.Errorf("proto: wrong wireType = %d for field DryRun", wireType) } var v int for shift := uint(0); ; shift += 7 { if shift >= 64 { return ErrIntOverflowPlacement } if iNdEx >= l { return io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ v |= (int(b) & 0x7F) << shift if b < 0x80 { break } } m.DryRun = bool(v != 0) default: iNdEx = preIndex skippy, err := skipPlacement(dAtA[iNdEx:]) if err != nil { return err } if skippy < 0 { return ErrInvalidLengthPlacement } if (iNdEx + skippy) > l { return io.ErrUnexpectedEOF } iNdEx += skippy } } if iNdEx > l { return io.ErrUnexpectedEOF } return nil } func skipPlacement(dAtA []byte) (n int, err error) { l := len(dAtA) iNdEx := 0 for iNdEx < l { var wire uint64 for shift := uint(0); ; shift += 7 { if shift >= 64 { return 0, ErrIntOverflowPlacement } if iNdEx >= l { return 0, io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ wire |= (uint64(b) & 0x7F) << shift if b < 0x80 { break } } wireType := int(wire & 0x7) switch wireType { case 0: for shift := uint(0); ; shift += 7 { if shift >= 64 { return 0, ErrIntOverflowPlacement } if iNdEx >= l { return 0, io.ErrUnexpectedEOF } iNdEx++ if dAtA[iNdEx-1] < 0x80 { break } } return iNdEx, nil case 1: iNdEx += 8 return iNdEx, nil case 2: var length int for shift := uint(0); ; shift += 7 { if shift >= 64 { return 0, ErrIntOverflowPlacement } if iNdEx >= l { return 0, io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ length |= (int(b) & 0x7F) << shift if b < 0x80 { break } } iNdEx += length if length < 0 { return 0, ErrInvalidLengthPlacement } return iNdEx, nil case 3: for { var innerWire uint64 var start int = iNdEx for shift := uint(0); ; shift += 7 { if shift >= 64 { return 0, ErrIntOverflowPlacement } if iNdEx >= l { return 0, io.ErrUnexpectedEOF } b := dAtA[iNdEx] iNdEx++ innerWire |= (uint64(b) & 0x7F) << shift if b < 0x80 { 
break } } innerWireType := int(innerWire & 0x7) if innerWireType == 4 { break } next, err := skipPlacement(dAtA[start:]) if err != nil { return 0, err } iNdEx = start + next } return iNdEx, nil case 4: return iNdEx, nil case 5: iNdEx += 4 return iNdEx, nil default: return 0, fmt.Errorf("proto: illegal wireType %d", wireType) } } panic("unreachable") } var ( ErrInvalidLengthPlacement = fmt.Errorf("proto: negative length found during unmarshaling") ErrIntOverflowPlacement = fmt.Errorf("proto: integer overflow") ) func init() { proto.RegisterFile("github.com/m3db/m3/src/query/generated/proto/admin/placement.proto", fileDescriptorPlacement) } var fileDescriptorPlacement = []byte{ // 475 bytes of a gzipped FileDescriptorProto 0x1f, 0x8b, 0x08, 0x00, 0x00, 0x00, 0x00, 0x00, 0x02, 0xff, 0xb4, 0x54, 0xcd, 0x6e, 0x13, 0x31, 0x10, 0xc6, 0x84, 0x14, 0x32, 0x41, 0xfc, 0x98, 0x10, 0x56, 0x48, 0x44, 0x21, 0xa7, 0x5c, 0xd8, 0x95, 0x1a, 0x38, 0x72, 0xa0, 0x42, 0xa0, 0x70, 0x29, 0xda, 0x3c, 0x40, 0xe4, 0xd8, 0x93, 0xd4, 0x52, 0xd6, 0xde, 0xda, 0xde, 0x48, 0xbd, 0xf0, 0x0c, 0xbd, 0x70, 0xe6, 0x75, 0x38, 0x72, 0x41, 0xe2, 0x88, 0xc2, 0x8b, 0xa0, 0x3a, 0xeb, 0xdd, 0x2d, 0x3f, 0xbd, 0xb4, 0x3d, 0xce, 0xcc, 0x37, 0x9f, 0xbf, 0x6f, 0x34, 0x63, 0x38, 0x58, 0x49, 0x77, 0x54, 0x2c, 0x62, 0xae, 0xb3, 0x24, 0x9b, 0x88, 0x45, 0x92, 0x4d, 0x12, 0x6b, 0x78, 0x72, 0x5c, 0xa0, 0x39, 0x49, 0x56, 0xa8, 0xd0, 0x30, 0x87, 0x22, 0xc9, 0x8d, 0x76, 0x3a, 0x61, 0x22, 0x93, 0x2a, 0xc9, 0xd7, 0x8c, 0x63, 0x86, 0xca, 0xc5, 0x3e, 0x4b, 0xdb, 0x3e, 0xfd, 0xf4, 0xc3, 0x7f, 0xa8, 0xf8, 0xba, 0xb0, 0x0e, 0xcd, 0x5f, 0x64, 0x15, 0x4d, 0xbe, 0xf8, 0x93, 0x72, 0xf4, 0x83, 0x40, 0xef, 0x63, 0xc8, 0x4d, 0x95, 0x74, 0x29, 0x1e, 0x17, 0x68, 0x1d, 0x9d, 0x40, 0x47, 0x2a, 0xeb, 0x98, 0xe2, 0x68, 0x23, 0x32, 0x6c, 0x8d, 0xbb, 0xfb, 0x8f, 0xe3, 0x06, 0x53, 0x3c, 0x2d, 0xab, 0x69, 0x8d, 0xa3, 0xcf, 0x00, 0x54, 0x91, 0xcd, 0xed, 0x11, 0x33, 0xc2, 0x46, 0x37, 0x87, 0x64, 0xdc, 0x4e, 0x3b, 0xaa, 0xc8, 0x66, 0x3e, 0x41, 0x5f, 
0x00, 0x35, 0x98, 0xaf, 0x25, 0x67, 0x4e, 0x6a, 0x35, 0x5f, 0x32, 0xee, 0xb4, 0x89, 0x5a, 0x1e, 0xf6, 0xb0, 0x51, 0x79, 0xe7, 0x0b, 0xf4, 0x35, 0xdc, 0xd7, 0xb9, 0x47, 0xea, 0x0d, 0x1a, 0x23, 0x05, 0x46, 0x7c, 0x48, 0xc6, 0xdd, 0xfd, 0xde, 0x39, 0x21, 0x87, 0x1e, 0x63, 0xd3, 0x7b, 0x3b, 0xf0, 0x61, 0x89, 0x1d, 0x2d, 0x1b, 0xce, 0xde, 0xa3, 0x4b, 0xd1, 0xe6, 0x5a, 0x59, 0xa4, 0x2f, 0xa1, 0x53, 0xb5, 0x47, 0xc4, 0x13, 0xf6, 0xcf, 0x11, 0x56, 0x5d, 0x69, 0x0d, 0xa4, 0x11, 0xdc, 0xde, 0xa0, 0xb1, 0x52, 0xab, 0xd2, 0x57, 0x08, 0x47, 0x5f, 0x08, 0x3c, 0xaa, 0x5a, 0xde, 0x08, 0x71, 0xa9, 0x09, 0xf6, 0xa0, 0xbd, 0xd4, 0x86, 0xa3, 0x7f, 0xe4, 0x4e, 0xba, 0x0b, 0x2e, 0x3b, 0x89, 0x53, 0x02, 0xfd, 0xda, 0x14, 0x66, 0x7a, 0x83, 0x41, 0xe4, 0x73, 0xb8, 0x1b, 0x1e, 0x9f, 0x4b, 0xb1, 0xd3, 0xd9, 0x49, 0xbb, 0x21, 0x37, 0x15, 0xd7, 0x24, 0xe9, 0x3b, 0x81, 0x27, 0x0d, 0x49, 0xbe, 0x25, 0x68, 0x8a, 0x81, 0xae, 0x91, 0x6d, 0xa4, 0x5a, 0x85, 0x09, 0x4d, 0xdf, 0x06, 0x65, 0xff, 0xa8, 0xd0, 0x57, 0x00, 0x9c, 0x29, 0x21, 0x05, 0x73, 0x78, 0xb6, 0x75, 0x17, 0x4c, 0xba, 0x01, 0xac, 0x7d, 0xb5, 0xae, 0xd0, 0xd7, 0xe7, 0xe6, 0x32, 0xcc, 0xb0, 0x3a, 0xa7, 0x2b, 0x5e, 0xba, 0xb3, 0x0a, 0xd7, 0x6a, 0x29, 0x4d, 0x56, 0xca, 0x0f, 0x61, 0x6d, 0xeb, 0x56, 0xc3, 0xd6, 0xe8, 0x53, 0xe3, 0x18, 0x66, 0xd7, 0x77, 0x0c, 0xb4, 0x0f, 0x7b, 0xc2, 0x9c, 0xa4, 0x85, 0x2a, 0x65, 0x95, 0xd1, 0xc1, 0x83, 0xaf, 0xdb, 0x01, 0xf9, 0xb6, 0x1d, 0x90, 0x9f, 0xdb, 0x01, 0x39, 0xfd, 0x35, 0xb8, 0xb1, 0xd8, 0xf3, 0x1f, 0xd0, 0xe4, 0x77, 0x00, 0x00, 0x00, 0xff, 0xff, 0x8c, 0x7a, 0x08, 0x16, 0x19, 0x05, 0x00, 0x00, } ```
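Every field in the generated `Unmarshal` methods above decodes a protocol-buffer varint with the same inlined shift/accumulate loop. A minimal standalone sketch of that idiom (the helper name `decodeVarint` and its error strings are ours, not part of the generated file):

```go
package main

import (
	"errors"
	"fmt"
)

// decodeVarint reads one base-128 varint from data, mirroring the inlined
// loops in the generated Unmarshal methods: each byte contributes 7 payload
// bits (low to high), and a high bit of 0x80 means "more bytes follow".
func decodeVarint(data []byte) (value uint64, n int, err error) {
	for shift := uint(0); ; shift += 7 {
		if shift >= 64 {
			return 0, 0, errors.New("varint overflows uint64")
		}
		if n >= len(data) {
			return 0, 0, errors.New("unexpected EOF")
		}
		b := data[n]
		n++
		value |= (uint64(b) & 0x7F) << shift
		if b < 0x80 { // high bit clear: last byte of the varint
			return value, n, nil
		}
	}
}

func main() {
	// 300 encodes as [0xAC, 0x02]: 0x2C (44) + (2 << 7) = 300.
	v, n, err := decodeVarint([]byte{0xAC, 0x02})
	fmt.Println(v, n, err) // 300 2 <nil>
}
```

The generated code guards against the same two failure modes as the error returns above, via `ErrIntOverflowPlacement` and `io.ErrUnexpectedEOF`.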
```python
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# ==============================================================================
"""Abstract detection model.

This file defines a generic base class for detection models.  Programs that are
designed to work with arbitrary detection models should only depend on this
class.  We intend for the functions in this class to follow tensor-in/tensor-out
design, thus all functions have tensors or lists/dictionaries holding tensors as
inputs and outputs.

Abstractly, detection models predict output tensors given input images
which can be passed to a loss function at training time or passed to a
postprocessing function at eval time.  The computation graphs at a high level
consequently look as follows:

Training time:
inputs (images tensor) -> preprocess -> predict -> loss ->
  outputs (loss tensor)

Evaluation time:
inputs (images tensor) -> preprocess -> predict -> postprocess ->
  outputs (boxes tensor, scores tensor, classes tensor, num_detections tensor)

DetectionModels must thus implement four functions (1) preprocess, (2) predict,
(3) postprocess and (4) loss.  DetectionModels should make no assumptions about
the input size or aspect ratio --- they are responsible for doing any
resize/reshaping necessary (see docstring for the preprocess function).
Output classes are always integers in the range [0, num_classes).  Any mapping
of these integers to semantic labels is to be handled outside of this class.

Images are resized in the `preprocess` method.  All of `preprocess`, `predict`,
and `postprocess` should be reentrant.

The `preprocess` method runs `image_resizer_fn` that returns resized_images and
`true_image_shapes`. Since `image_resizer_fn` can pad the images with zeros,
true_image_shapes indicate the slices that contain the image without padding.
This is useful for padding images to be a fixed size for batching.

The `postprocess` method uses the true image shapes to clip predictions that lie
outside of images.

By default, DetectionModels produce bounding box detections; however, we support
a handful of auxiliary annotations associated with each bounding box, namely,
instance masks and keypoints.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import abc
import six
import tensorflow.compat.v1 as tf

from object_detection.core import standard_fields as fields


# If using a new enough version of TensorFlow, detection models should be a
# tf module or keras model for tracking.
try:
  _BaseClass = tf.keras.layers.Layer
except AttributeError:
  _BaseClass = object


class DetectionModel(six.with_metaclass(abc.ABCMeta, _BaseClass)):
  """Abstract base class for detection models.

  Extends tf.Module to guarantee variable tracking.
  """

  def __init__(self, num_classes):
    """Constructor.

    Args:
      num_classes: number of classes.  Note that num_classes *does not* include
        background categories that might be implicitly predicted in various
        implementations.
    """
    self._num_classes = num_classes
    self._groundtruth_lists = {}
    self._training_step = None
    super(DetectionModel, self).__init__()

  @property
  def num_classes(self):
    return self._num_classes

  def groundtruth_lists(self, field):
    """Access list of groundtruth tensors.

    Args:
      field: a string key, options are
        fields.BoxListFields.{boxes,classes,masks,mask_weights,keypoints,
        keypoint_visibilities, densepose_*, track_ids, temporal_offsets,
        track_match_flags} or fields.InputDataFields.is_annotated.

    Returns:
      a list of tensors holding groundtruth information (see also
      provide_groundtruth function below), with one entry for each image in the
      batch.

    Raises:
      RuntimeError: if the field has not been provided via provide_groundtruth.
""" if field not in self._groundtruth_lists: raise RuntimeError('Groundtruth tensor {} has not been provided'.format( field)) return self._groundtruth_lists[field] def groundtruth_has_field(self, field): """Determines whether the groundtruth includes the given field. Args: field: a string key, options are fields.BoxListFields.{boxes,classes,masks,mask_weights,keypoints, keypoint_visibilities, densepose_*, track_ids} or fields.InputDataFields.is_annotated. Returns: True if the groundtruth includes the given field, False otherwise. """ return field in self._groundtruth_lists @property def training_step(self): if self._training_step is None: raise ValueError('Training step was not provided to the model.') return self._training_step @staticmethod def get_side_inputs(features): """Get side inputs from input features. This placeholder method provides a way for a meta-architecture to specify how to grab additional side inputs from input features (in addition to the image itself) and allows models to depend on contextual information. By default, detection models do not use side information (and thus this method returns an empty dictionary by default. However it can be overridden if side inputs are necessary." Args: features: A dictionary of tensors. Returns: An empty dictionary by default. """ return {} @abc.abstractmethod def preprocess(self, inputs): """Input preprocessing. To be overridden by implementations. This function is responsible for any scaling/shifting of input values that is necessary prior to running the detector on an input image. It is also responsible for any resizing, padding that might be necessary as images are assumed to arrive in arbitrary sizes. While this function could conceivably be part of the predict method (below), it is often convenient to keep these separate --- for example, we may want to preprocess on one device, place onto a queue, and let another device (e.g., the GPU) handle prediction. 
    A few important notes about the preprocess function:
    + We assume that this operation does not have any trainable variables nor
      does it affect the groundtruth annotations in any way (thus data
      augmentation operations such as random cropping should be performed
      externally).
    + There is no assumption that the batchsize in this function is the same as
      the batch size in the predict function.  In fact, we recommend calling the
      preprocess function prior to calling any batching operations (which should
      happen outside of the model) and thus assuming that batch sizes are equal
      to 1 in the preprocess function.
    + There is also no explicit assumption that the output resolutions must be
      fixed across inputs --- this is to support "fully convolutional" settings
      in which input images can have different shapes/resolutions.

    Args:
      inputs: a [batch, height_in, width_in, channels] float32 tensor
        representing a batch of images with values between 0 and 255.0.

    Returns:
      preprocessed_inputs: a [batch, height_out, width_out, channels] float32
        tensor representing a batch of images.
      true_image_shapes: int32 tensor of shape [batch, 3] where each row is of
        the form [height, width, channels] indicating the shapes of true images
        in the resized images, as resized images can be padded with zeros.
    """
    pass

  @abc.abstractmethod
  def predict(self, preprocessed_inputs, true_image_shapes, **side_inputs):
    """Predict prediction tensors from inputs tensor.

    Outputs of this function can be passed to loss or postprocess functions.

    Args:
      preprocessed_inputs: a [batch, height, width, channels] float32 tensor
        representing a batch of images.
      true_image_shapes: int32 tensor of shape [batch, 3] where each row is of
        the form [height, width, channels] indicating the shapes of true images
        in the resized images, as resized images can be padded with zeros.
      **side_inputs: additional tensors that are required by the network.
    Returns:
      prediction_dict: a dictionary holding prediction tensors to be passed to
        the Loss or Postprocess functions.
    """
    pass

  @abc.abstractmethod
  def postprocess(self, prediction_dict, true_image_shapes, **params):
    """Convert predicted output tensors to final detections.

    This stage typically performs a few things such as
    * Non-Max Suppression to remove overlapping detection boxes.
    * Score conversion and background class removal.

    Outputs adhere to the following conventions:
    * Classes are integers in [0, num_classes); background classes are removed
      and the first non-background class is mapped to 0.  If the model produces
      class-agnostic detections, then no output is produced for classes.
    * Boxes are to be interpreted as being in [y_min, x_min, y_max, x_max]
      format and normalized relative to the image window.
    * `num_detections` is provided for settings where detections are padded to
      a fixed number of boxes.
    * We do not specifically assume any kind of probabilistic interpretation of
      the scores --- the only important thing is their relative ordering.  Thus
      implementations of the postprocess function are free to output logits,
      probabilities, calibrated probabilities, or anything else.

    Args:
      prediction_dict: a dictionary holding prediction tensors.
      true_image_shapes: int32 tensor of shape [batch, 3] where each row is of
        the form [height, width, channels] indicating the shapes of true images
        in the resized images, as resized images can be padded with zeros.
      **params: Additional keyword arguments for specific implementations of
        DetectionModel.
    Returns:
      detections: a dictionary containing the following fields
        detection_boxes: [batch, max_detections, 4]
        detection_scores: [batch, max_detections]
        detection_classes: [batch, max_detections]
          (If a model is producing class-agnostic detections, this field may be
          missing)
        detection_masks: [batch, max_detections, mask_height, mask_width]
          (optional)
        detection_keypoints: [batch, max_detections, num_keypoints, 2]
          (optional)
        detection_keypoint_scores: [batch, max_detections, num_keypoints]
          (optional)
        detection_surface_coords: [batch, max_detections, mask_height,
          mask_width, 2] (optional)
        num_detections: [batch]

      In addition to the above fields this stage also outputs the following
      raw tensors:

        raw_detection_boxes: [batch, total_detections, 4] tensor containing
          all detection boxes from `prediction_dict` in the format
          [ymin, xmin, ymax, xmax] and normalized co-ordinates.
        raw_detection_scores: [batch, total_detections,
          num_classes_with_background] tensor of class score logits for
          raw detection boxes.
    """
    pass

  @abc.abstractmethod
  def loss(self, prediction_dict, true_image_shapes):
    """Compute scalar loss tensors with respect to provided groundtruth.

    Calling this function requires that groundtruth tensors have been
    provided via the provide_groundtruth function.

    Args:
      prediction_dict: a dictionary holding predicted tensors
      true_image_shapes: int32 tensor of shape [batch, 3] where each row is of
        the form [height, width, channels] indicating the shapes of true images
        in the resized images, as resized images can be padded with zeros.

    Returns:
      a dictionary mapping strings (loss names) to scalar tensors representing
        loss values.
""" pass def provide_groundtruth( self, groundtruth_boxes_list, groundtruth_classes_list, groundtruth_masks_list=None, groundtruth_mask_weights_list=None, groundtruth_keypoints_list=None, groundtruth_keypoint_visibilities_list=None, groundtruth_dp_num_points_list=None, groundtruth_dp_part_ids_list=None, groundtruth_dp_surface_coords_list=None, groundtruth_track_ids_list=None, groundtruth_temporal_offsets_list=None, groundtruth_track_match_flags_list=None, groundtruth_weights_list=None, groundtruth_confidences_list=None, groundtruth_is_crowd_list=None, groundtruth_group_of_list=None, groundtruth_area_list=None, is_annotated_list=None, groundtruth_labeled_classes=None, groundtruth_verified_neg_classes=None, groundtruth_not_exhaustive_classes=None, groundtruth_keypoint_depths_list=None, groundtruth_keypoint_depth_weights_list=None, groundtruth_image_classes=None, training_step=None): """Provide groundtruth tensors. Args: groundtruth_boxes_list: a list of 2-D tf.float32 tensors of shape [num_boxes, 4] containing coordinates of the groundtruth boxes. Groundtruth boxes are provided in [y_min, x_min, y_max, x_max] format and assumed to be normalized and clipped relative to the image window with y_min <= y_max and x_min <= x_max. groundtruth_classes_list: a list of 2-D tf.float32 one-hot (or k-hot) tensors of shape [num_boxes, num_classes] containing the class targets with the 0th index assumed to map to the first non-background class. groundtruth_masks_list: a list of 3-D tf.float32 tensors of shape [num_boxes, height_in, width_in] containing instance masks with values in {0, 1}. If None, no masks are provided. Mask resolution `height_in`x`width_in` must agree with the resolution of the input image tensor provided to the `preprocess` function. groundtruth_mask_weights_list: a list of 1-D tf.float32 tensors of shape [num_boxes] with weights for each instance mask. 
      groundtruth_keypoints_list: a list of 3-D tf.float32 tensors of shape
        [num_boxes, num_keypoints, 2] containing keypoints.  Keypoints are
        assumed to be provided in normalized coordinates and missing keypoints
        should be encoded as NaN (but it is recommended to use
        `groundtruth_keypoint_visibilities_list`).
      groundtruth_keypoint_visibilities_list: a list of 2-D tf.bool tensors of
        shape [num_boxes, num_keypoints] containing keypoint visibilities.
      groundtruth_dp_num_points_list: a list of 1-D tf.int32 tensors of shape
        [num_boxes] containing the number of DensePose sampled points.
      groundtruth_dp_part_ids_list: a list of 2-D tf.int32 tensors of shape
        [num_boxes, max_sampled_points] containing the DensePose part ids
        (0-indexed) for each sampled point.  Note that there may be padding.
      groundtruth_dp_surface_coords_list: a list of 3-D tf.float32 tensors of
        shape [num_boxes, max_sampled_points, 4] containing the DensePose
        surface coordinates for each sampled point.  Note that there may be
        padding.
      groundtruth_track_ids_list: a list of 1-D tf.int32 tensors of shape
        [num_boxes] containing the track IDs of groundtruth objects.
      groundtruth_temporal_offsets_list: a list of 2-D tf.float32 tensors of
        shape [num_boxes, 2] containing the spatial offsets of objects' centers
        compared with the previous frame.
      groundtruth_track_match_flags_list: a list of 1-D tf.float32 tensors of
        shape [num_boxes] containing 0-1 flags that indicate if an object has
        existed in the previous frame.
      groundtruth_weights_list: A list of 1-D tf.float32 tensors of shape
        [num_boxes] containing weights for groundtruth boxes.
      groundtruth_confidences_list: A list of 2-D tf.float32 tensors of shape
        [num_boxes, num_classes] containing class confidences for groundtruth
        boxes.
      groundtruth_is_crowd_list: A list of 1-D tf.bool tensors of shape
        [num_boxes] containing is_crowd annotations.
      groundtruth_group_of_list: A list of 1-D tf.bool tensors of shape
        [num_boxes] containing group_of annotations.
      groundtruth_area_list: A list of 1-D tf.float32 tensors of shape
        [num_boxes] containing the area (in the original absolute coordinates)
        of the annotations.
      is_annotated_list: A list of scalar tf.bool tensors indicating whether
        images have been labeled or not.
      groundtruth_labeled_classes: A list of 1-D tf.float32 tensors of shape
        [num_classes], containing label indices encoded as k-hot of the classes
        that are exhaustively annotated.
      groundtruth_verified_neg_classes: A list of 1-D tf.float32 tensors of
        shape [num_classes], containing a K-hot representation of classes
        which were verified as not present in the image.
      groundtruth_not_exhaustive_classes: A list of 1-D tf.float32 tensors of
        shape [num_classes], containing a K-hot representation of classes
        which don't have all of their instances marked exhaustively.
      groundtruth_keypoint_depths_list: a list of 2-D tf.float32 tensors of
        shape [num_boxes, num_keypoints] containing keypoint relative depths.
      groundtruth_keypoint_depth_weights_list: a list of 2-D tf.float32 tensors
        of shape [num_boxes, num_keypoints] containing the weights of the
        relative depths.
      groundtruth_image_classes: A list of 1-D tf.float32 tensors of shape
        [num_classes], containing label indices encoded as k-hot of the classes
        that are present or not present in the image.
      training_step: An integer denoting the current training step.  This is
        useful when models want to anneal loss terms.
    """
    self._groundtruth_lists[fields.BoxListFields.boxes] = groundtruth_boxes_list
    self._groundtruth_lists[
        fields.BoxListFields.classes] = groundtruth_classes_list
    if groundtruth_weights_list:
      self._groundtruth_lists[
          fields.BoxListFields.weights] = groundtruth_weights_list
    if groundtruth_confidences_list:
      self._groundtruth_lists[
          fields.BoxListFields.confidences] = groundtruth_confidences_list
    if groundtruth_masks_list:
      self._groundtruth_lists[
          fields.BoxListFields.masks] = groundtruth_masks_list
    if groundtruth_mask_weights_list:
      self._groundtruth_lists[
          fields.BoxListFields.mask_weights] = groundtruth_mask_weights_list
    if groundtruth_keypoints_list:
      self._groundtruth_lists[
          fields.BoxListFields.keypoints] = groundtruth_keypoints_list
    if groundtruth_keypoint_visibilities_list:
      self._groundtruth_lists[
          fields.BoxListFields.keypoint_visibilities] = (
              groundtruth_keypoint_visibilities_list)
    if groundtruth_keypoint_depths_list:
      self._groundtruth_lists[
          fields.BoxListFields.keypoint_depths] = (
              groundtruth_keypoint_depths_list)
    if groundtruth_keypoint_depth_weights_list:
      self._groundtruth_lists[
          fields.BoxListFields.keypoint_depth_weights] = (
              groundtruth_keypoint_depth_weights_list)
    if groundtruth_dp_num_points_list:
      self._groundtruth_lists[
          fields.BoxListFields.densepose_num_points] = (
              groundtruth_dp_num_points_list)
    if groundtruth_dp_part_ids_list:
      self._groundtruth_lists[
          fields.BoxListFields.densepose_part_ids] = (
              groundtruth_dp_part_ids_list)
    if groundtruth_dp_surface_coords_list:
      self._groundtruth_lists[
          fields.BoxListFields.densepose_surface_coords] = (
              groundtruth_dp_surface_coords_list)
    if groundtruth_track_ids_list:
      self._groundtruth_lists[
          fields.BoxListFields.track_ids] = groundtruth_track_ids_list
    if groundtruth_temporal_offsets_list:
      self._groundtruth_lists[
          fields.BoxListFields.temporal_offsets] = (
              groundtruth_temporal_offsets_list)
    if groundtruth_track_match_flags_list:
      self._groundtruth_lists[
          fields.BoxListFields.track_match_flags] = (
              groundtruth_track_match_flags_list)
    if groundtruth_is_crowd_list:
      self._groundtruth_lists[
          fields.BoxListFields.is_crowd] = groundtruth_is_crowd_list
    if groundtruth_group_of_list:
      self._groundtruth_lists[
          fields.BoxListFields.group_of] = groundtruth_group_of_list
    if groundtruth_area_list:
      self._groundtruth_lists[
          fields.InputDataFields.groundtruth_area] = groundtruth_area_list
    if is_annotated_list:
      self._groundtruth_lists[
          fields.InputDataFields.is_annotated] = is_annotated_list
    if groundtruth_labeled_classes:
      self._groundtruth_lists[
          fields.InputDataFields
          .groundtruth_labeled_classes] = groundtruth_labeled_classes
    if groundtruth_verified_neg_classes:
      self._groundtruth_lists[
          fields.InputDataFields
          .groundtruth_verified_neg_classes] = groundtruth_verified_neg_classes
    if groundtruth_image_classes:
      self._groundtruth_lists[
          fields.InputDataFields
          .groundtruth_image_classes] = groundtruth_image_classes
    if groundtruth_not_exhaustive_classes:
      self._groundtruth_lists[
          fields.InputDataFields
          .groundtruth_not_exhaustive_classes] = (
              groundtruth_not_exhaustive_classes)
    if training_step is not None:
      self._training_step = training_step

  @abc.abstractmethod
  def regularization_losses(self):
    """Returns a list of regularization losses for this model.

    Returns a list of regularization losses for this model that the estimator
    needs to use during training/optimization.

    Returns:
      A list of regularization loss tensors.
    """
    pass

  @abc.abstractmethod
  def restore_map(self,
                  fine_tune_checkpoint_type='detection',
                  load_all_detection_checkpoint_vars=False):
    """Returns a map of variables to load from a foreign checkpoint.

    Returns a map of variable names to load from a checkpoint to variables in
    the model graph.  This enables the model to initialize based on weights
    from another task.  For example, the feature extractor variables from a
    classification model can be used to bootstrap training of an object
    detector.  When loading from an object detection model, the checkpoint
    model should have the same parameters as this detection model with
    exception of the num_classes parameter.

    Args:
      fine_tune_checkpoint_type: whether to restore from a full detection
        checkpoint (with compatible variable names) or to restore from a
        classification checkpoint for initialization prior to training.
        Valid values: `detection`, `classification`.  Default 'detection'.
load_all_detection_checkpoint_vars: whether to load all variables (when `fine_tune_checkpoint_type` is `detection`). If False, only variables within the feature extractor scope are included. Default False. Returns: A dict mapping variable names (to load from a checkpoint) to variables in the model graph. """ pass @abc.abstractmethod def restore_from_objects(self, fine_tune_checkpoint_type='detection'): """Returns a map of variables to load from a foreign checkpoint. Returns a dictionary of Tensorflow 2 Trackable objects (e.g. tf.Module or Checkpoint). This enables the model to initialize based on weights from another task. For example, the feature extractor variables from a classification model can be used to bootstrap training of an object detector. When loading from an object detection model, the checkpoint model should have the same parameters as this detection model with exception of the num_classes parameter. Note that this function is intended to be used to restore Keras-based models when running Tensorflow 2, whereas restore_map (above) is intended to be used to restore Slim-based models when running Tensorflow 1.x. TODO(jonathanhuang,rathodv): Check tf_version and raise unimplemented error for both restore_map and restore_from_objects depending on version. Args: fine_tune_checkpoint_type: whether to restore from a full detection checkpoint (with compatible variable names) or to restore from a classification checkpoint for initialization prior to training. Valid values: `detection`, `classification`. Default 'detection'. Returns: A dict mapping keys to Trackable objects (tf.Module or Checkpoint). """ pass @abc.abstractmethod def updates(self): """Returns a list of update operators for this model. Returns a list of update operators for this model that must be executed at each training step. The estimator's train op needs to have a control dependency on these updates. Returns: A list of update operators. 
""" pass def call(self, images): """Returns detections from a batch of images. This method calls the preprocess, predict and postprocess function sequentially and returns the output. Args: images: a [batch_size, height, width, channels] float tensor. Returns: detetcions: The dict of tensors returned by the postprocess function. """ preprocessed_images, shapes = self.preprocess(images) prediction_dict = self.predict(preprocessed_images, shapes) return self.postprocess(prediction_dict, shapes) ```
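The `call()` contract above (preprocess, then predict, then postprocess, with groundtruth stashed in a dict keyed by field name) can be sketched without TensorFlow. Everything here — `ToyModel`, the literal string keys, and the toy tensor math — is illustrative, not part of the real `object_detection` API:

```python
# A minimal, framework-free sketch of the DetectionModel call() contract:
# the three stages run in sequence, and provide_groundtruth() stores each
# supplied list under its field key. All names here are hypothetical.

class ToyModel:
    def __init__(self):
        self._groundtruth_lists = {}

    def provide_groundtruth(self, boxes_list, classes_list):
        # Mirrors the pattern above: each list is stored under a field key.
        self._groundtruth_lists["boxes"] = boxes_list
        self._groundtruth_lists["classes"] = classes_list

    def preprocess(self, images):
        # Real models resize/normalize; here we just record "shapes".
        shapes = [len(img) for img in images]
        return images, shapes

    def predict(self, images, shapes):
        # Stand-in for the raw prediction dict a real model would emit.
        return {"num_pixels": [sum(img) for img in images], "shapes": shapes}

    def postprocess(self, prediction_dict, shapes):
        return {"detections": prediction_dict["num_pixels"]}

    def __call__(self, images):
        preprocessed, shapes = self.preprocess(images)
        return self.postprocess(self.predict(preprocessed, shapes), shapes)

model = ToyModel()
model.provide_groundtruth([[0, 0, 1, 1]], [[1]])
out = model([[1, 2, 3], [4, 5]])
```

The point of the shape of this API is that training code can call the stages individually (e.g. to insert a loss between `predict` and `postprocess`), while serving code just invokes the model as a callable.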
Thetys vagina, or the twin-sailed salp, is the largest known solitary species of salp and the only valid species of the genus Thetys. First described by W. G. Tilesius in 1802, the species is transparent and gelatinous, making it difficult to see in water, which helps it avoid predators. Its fossil range is very recent. Other animals often mistaken for T. vagina are Salpa fusiformis, Aurelia aurita, and Pegea confoederata. The conservation status of this species is unknown. T. vagina DNA was sequenced as part of a larger project in 2014, in which spiny lobster larvae were found attached to T. vagina and consuming it.

Description

T. vagina can reach up to 333 mm (13 in) long. It develops into two distinct forms: the aggregate generation and the solitary generation. The aggregated sexual blastozooids (aggregate form) can reach 250 mm and have five muscle bands. The solitary asexual oozooids (solitary form) can reach 300 mm. Oozooids have around 20 muscle bands, characterized as "striped", with two short dark-colored tentacles at their ends, attached at the upper and lower halves of the body. Both the aggregate and the solitary forms have tests covered in ridges and grooves. They have a colored digestive system seen as a dark or colorful lump. Embryos have been found to be between 10 and 15 mm.

Distribution

T. vagina is found in pelagic marine environments. It occurs in tropical and temperate waters of the Pacific, Atlantic and Indian Oceans and is occasionally found in colder waters in the northern Atlantic, likely following warm water currents. The species is widespread but occurs at low density (although it may occasionally be found at very high density), resulting in only rare accounts of it being caught. T. vagina has been found off the central coast of British Columbia, marking its northernmost occurrence to date. It has been found by cataloging volunteers along the West Coast of the U.S.
and reportedly congests the nets of fishermen off the coast of northern Honshu and southern Hokkaido, Japan. In January 2009, the largest measured biomass of T. vagina was recorded at 852 g WW m−3 in the Tasman Sea. T. vagina stays in the photic zone and is often found in places of high chlorophyll concentration, likely due to its phytoplankton-rich diet. A large increase in T. vagina abundance is associated with an increase in phytoplankton. The ecology of this species is not fully understood.

Diet

Like other salps, T. vagina feeds by drawing plankton-rich water in at one end of its body, filtering it through an internal net made of mucus, and expelling the water out the other end. Its internal net is very effective, catching particles spanning four orders of magnitude in size. This action also propels it through the water column, classifying it as nektonic. T. vagina feeds on marine plankton, including single-celled organisms such as dinoflagellates, silicoflagellates, diatoms, and tintinnids, as well as copepods and other small particles. Continuing up the food chain, T. vagina is preyed upon by medusae, siphonophores, ctenophores, heteropods, sea turtles, late-stage larvae of the spiny lobster, and marine birds, along with various species of fish. It has a high energy content of 11.00 kJ g−1 DW. In a study done in the Japan Sea in 2006, the gut contents of T. vagina were evaluated. The diatom Coscinodiscus spp. (13–55 µm in diameter) was found to make up the majority of the gut contents, with the diatom Coscinodiscus wailesii (219–313 µm) being the second most prevalent. Another study, off the coast of Maine, found T. vagina gut contents to be mainly made up of two dinoflagellates: Prorocentrum micans and Dinophysis norvegica. The study also found T. vagina to be an indiscriminate feeder over a broad size spectrum.

Ecology

Waste from T. vagina is densely packed, sinks quickly, and is full of carbon. Their carcasses also sink quickly and are carbon-rich (31% dry weight, DW).
This makes them efficient carbon sinks, but also harder to study. This carbon exchange could be responsible for up to 67% of the mean daily organic carbon flux in the area. In 2007 and 2009, the Tasman Sea floor was surveyed from 200 m to 2,500 m in depth, and large quantities of T. vagina were found. The quantities found were some of the largest gelatinous zooplankton depositions ever recorded. Further, benthic communities were found consuming T. vagina carcasses. This sink provides nutrients to these benthic communities and is likely a large source of carbon input.

Etymology

The generic name Thetys refers to Tethys or Thetis, Greek mythological figures: the former the mother of the river gods and the Oceanids, the latter the goddess of water, a nereid, or a sea nymph. The species epithet is the Latin vagina, meaning "sheath" or "scabbard." At the time of Tilesius' naming, the term had not acquired its modern-day anatomical meaning and referred solely to a sheath, likely referring to the animal's appearance.

References

External links

Photograph
Video of swimming behavior on YouTube

Thaliacea
Monotypic tunicate genera
Animals described in 1802
```go
package kubernetes

import (
	"context"
	"fmt"
	"sync"
	"time"

	crdv1 "k8s.io/apiextensions-apiserver/pkg/apis/apiextensions/v1"
	crd "k8s.io/apiextensions-apiserver/pkg/client/clientset/clientset"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime"
	"k8s.io/apimachinery/pkg/watch"
	"k8s.io/client-go/discovery"
	"k8s.io/client-go/discovery/cached/memory"
	toolscache "k8s.io/client-go/tools/cache"
)

// This exists so that we can do our own invalidation.
type cachedDiscovery struct {
	discovery.CachedDiscoveryInterface

	invalidMu sync.Mutex
	invalid   bool
}

// The k8s.io/client-go v8.0.0 implementation of MemCacheDiscovery
// refreshes the cached values, synchronously, when Invalidate() is
// called. Since we want to invalidate every time a CRD changes, but
// only refresh values when we need to read the cached values, this
// method defers the invalidation until a read is done.
func (d *cachedDiscovery) Invalidate() {
	d.invalidMu.Lock()
	d.invalid = true
	d.invalidMu.Unlock()
}

// ServerResourcesForGroupVersion is the method used by the
// namespacer; so, this is the one where we check whether the cache
// has been invalidated. A cachedDiscovery implementation for more
// general use would do this for all methods (that weren't implemented
// purely in terms of other methods).
func (d *cachedDiscovery) ServerResourcesForGroupVersion(groupVersion string) (*metav1.APIResourceList, error) {
	d.invalidMu.Lock()
	invalid := d.invalid
	d.invalid = false
	d.invalidMu.Unlock()
	if invalid {
		d.CachedDiscoveryInterface.Invalidate()
	}
	result, err := d.CachedDiscoveryInterface.ServerResourcesForGroupVersion(groupVersion)
	if err == memory.ErrCacheNotFound {
		// improve the error returned from memcacheclient
		err = fmt.Errorf("server resources for %s not found in cache; cache refreshes every 5 minutes", groupVersion)
	}
	return result, err
}

// MakeCachedDiscovery constructs a CachedDiscoveryInterface that will
// be invalidated whenever the set of CRDs changes. The idea is that
// the only avenue of a change to the API resources in a running
// system is CRDs being added, updated or deleted.
func MakeCachedDiscovery(d discovery.DiscoveryInterface, c crd.Interface, shutdown <-chan struct{}) discovery.CachedDiscoveryInterface {
	cachedDisco := &cachedDiscovery{CachedDiscoveryInterface: memory.NewMemCacheClient(d)}
	// We have an empty cache, so it's _a priori_ invalid. (Yes, that's
	// the zero value, but better safe than sorry)
	cachedDisco.Invalidate()

	crdClient := c.ApiextensionsV1().CustomResourceDefinitions()
	lw := &toolscache.ListWatch{
		ListFunc: func(options metav1.ListOptions) (runtime.Object, error) {
			return crdClient.List(context.TODO(), options)
		},
		WatchFunc: func(options metav1.ListOptions) (watch.Interface, error) {
			return crdClient.Watch(context.TODO(), options)
		},
	}
	handle := toolscache.ResourceEventHandlerFuncs{
		AddFunc:    func(_ interface{}) { cachedDisco.Invalidate() },
		UpdateFunc: func(_, _ interface{}) { cachedDisco.Invalidate() },
		DeleteFunc: func(_ interface{}) { cachedDisco.Invalidate() },
	}
	_, controller := toolscache.NewInformer(lw, &crdv1.CustomResourceDefinition{}, 0, handle)
	go cachedDisco.invalidatePeriodically(shutdown)
	go controller.Run(shutdown)
	return cachedDisco
}

func (d *cachedDiscovery) invalidatePeriodically(shutdown <-chan struct{}) {
	// Make the first period shorter since we may be bootstrapping a
	// newly-created cluster and the resource definitions may not be
	// complete/stable yet.
	initialPeriodDuration := 1 * time.Minute
	subsequentPeriodsDuration := 5 * initialPeriodDuration
	timer := time.NewTimer(initialPeriodDuration)
	for {
		select {
		case <-timer.C:
			timer.Reset(subsequentPeriodsDuration)
			d.Invalidate()
		case <-shutdown:
			timer.Stop()
			return
		}
	}
}
```
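The deferred-invalidation trick above — `Invalidate()` only sets a flag, and the expensive refresh happens lazily on the next read — is language-agnostic. A minimal sketch in Python, with hypothetical names (`LazyCache`, `fetch`) that are not part of any Kubernetes client library:

```python
import threading

# Deferred invalidation: many cheap invalidate() calls collapse into a
# single refresh, performed only when the cache is actually read.

class LazyCache:
    def __init__(self, fetch):
        self._fetch = fetch          # expensive source of truth
        self._lock = threading.Lock()
        self._invalid = True         # an empty cache is a priori invalid
        self._value = None
        self.refreshes = 0           # instrumentation for the example

    def invalidate(self):
        # Cheap: safe to call from event handlers at any rate.
        with self._lock:
            self._invalid = True

    def get(self):
        with self._lock:
            invalid = self._invalid
            self._invalid = False
        if invalid:
            self._value = self._fetch()  # refresh only when actually read
            self.refreshes += 1
        return self._value

cache = LazyCache(fetch=lambda: "api-resources")
cache.invalidate()
cache.invalidate()          # many invalidations...
first = cache.get()         # ...cause a single refresh
second = cache.get()        # served from cache, no refresh
```

This matches the rationale in the Go comments: CRD events can arrive in bursts, but discovery only needs fresh data at read time.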
- Load Initial Data via AJAX
- Use **React** with other libraries
- Server-side rendering
- Immutability helpers in **React**
- Default values for props
Bahlul () or Pahlul () is a village near the city of Stepanakert, de facto part of the breakaway Republic of Artsakh, de jure part of Azerbaijan.

History

In June 1919, the village and the neighboring villages of Ghaibalishen (Khaibalikend), Jamilli, and Karkijahan were looted and destroyed in the Khaibalikend massacre, with 600–700 ethnic Armenians being killed by armed Kurdish irregulars and Azerbaijani soldiers.

References

External links

Stepanakert
```csharp
using System;
using System.Collections.Generic;
using WorkflowCore.Interface;
using WorkflowCore.Models;

namespace WorkflowCore.Primitives
{
    public class Recur : ContainerStepBody
    {
        public TimeSpan Interval { get; set; }

        public bool StopCondition { get; set; }

        public override ExecutionResult Run(IStepExecutionContext context)
        {
            if (StopCondition)
            {
                return ExecutionResult.Next();
            }

            return new ExecutionResult
            {
                Proceed = false,
                BranchValues = new List<object> { null },
                SleepFor = Interval
            };
        }
    }
}
```
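The `Recur` step above re-schedules itself by returning "don't proceed, sleep for `Interval`" until `StopCondition` becomes true, at which point the workflow advances. A rough sketch of the control loop a workflow host applies to such a step, in Python — all names here (`StepResult`, `run_recurring`) are hypothetical, not WorkflowCore's API:

```python
from dataclasses import dataclass
from datetime import timedelta
from typing import Optional

# While the step reports proceed=False, the host sleeps for the requested
# interval and runs it again; once the stop condition holds, execution
# moves to the next step.

@dataclass
class StepResult:
    proceed: bool
    sleep_for: Optional[timedelta] = None

def recur_step(stop_condition: bool, interval: timedelta) -> StepResult:
    if stop_condition:
        return StepResult(proceed=True)               # like ExecutionResult.Next()
    return StepResult(proceed=False, sleep_for=interval)

def run_recurring(interval: timedelta, stop_after: int) -> int:
    runs = 0
    while True:
        result = recur_step(stop_condition=(runs >= stop_after),
                            interval=interval)
        if result.proceed:
            return runs
        runs += 1   # a real host would sleep for result.sleep_for here

total = run_recurring(timedelta(seconds=5), stop_after=3)
```

The design keeps the step itself stateless about scheduling: it only declares *whether* and *how long* to wait, and the host owns the actual timer.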
The 28th Producers Guild of America Awards (also known as the 2017 Producers Guild Awards), honoring the best film and television producers of 2016, were held at The Beverly Hilton in Beverly Hills, California on January 28, 2017. The nominations for documentary film were announced on November 22, 2016, the nominations for television were announced on January 5, 2017, and the nominations for film were announced on January 10, 2017.

Winners and nominees

Film

{| class=wikitable style="width:100%"
|-
! colspan="2" style="background:#abcdef;"| Darryl F. Zanuck Award for Outstanding Producer of Theatrical Motion Pictures
|-
| colspan="2" style="vertical-align:top;"|
La La Land – Fred Berger, Jordan Horowitz, and Marc Platt
Arrival – Dan Levine, Shawn Levy, Aaron Ryder, and David Linde
Deadpool – Simon Kinberg, Ryan Reynolds, and Lauren Shuler Donner
Fences – Scott Rudin, Denzel Washington, and Todd Black
Hacksaw Ridge – Bill Mechanic and David Permut
Hell or High Water – Carla Hacken and Julie Yorn
Hidden Figures – Donna Gigliotti, Peter Chernin, Jenno Topping, Pharrell Williams, and Theodore Melfi
Lion – Emile Sherman, Iain Canning, and Angie Fielder
Manchester by the Sea – Matt Damon, Kimberly Steward, Chris Moore, Lauren Beck, and Kevin J. Walsh
Moonlight – Adele Romanski, Dede Gardner, and Jeremy Kleiner
|-
! colspan="2" style="background:#abcdef;"| Outstanding Producer of Animated Theatrical Motion Pictures
|-
| colspan="2" style="vertical-align:top;"|
Zootopia – Clark Spencer
Finding Dory – Lindsey Collins
Kubo and the Two Strings – Arianne Sutner and Travis Knight
Moana – Osnat Shurer
The Secret Life of Pets – Chris Meledandri and Janet Healy
|-
! colspan="2" style="background:#abcdef;"| Outstanding Producer of Documentary Theatrical Motion Pictures
|-
| colspan="2" style="vertical-align:top;"|
O.J.: Made in America – Ezra Edelman and Caroline Waterlow
Dancer – Gabrielle Tana
The Eagle Huntress – Stacey Reiss and Otto Bell
Life, Animated – Julie Goldman and Roger Ross Williams
Tower – Keith Maitland, Susan Thomson, and Megan Gilbride
|}

Television

{| class=wikitable style="width:100%"
|-
! colspan="2" style="background:#abcdef;"| Norman Felton Award for Outstanding Producer of Episodic Television, Drama
|-
| colspan="2" style="vertical-align:top;"|
Stranger Things (Netflix) – Matt Duffer, Ross Duffer, Shawn Levy, Dan Cohen, Iain Paterson
Better Call Saul (AMC) – Vince Gilligan, Peter Gould, Melissa Bernstein, Mark Johnson, Thomas Schnauz, Gennifer Hutchison, Nina Jack, Robin Sweet, Diane Mercer, Bob Odenkirk
Game of Thrones (HBO) – David Benioff, D. B. Weiss, Bernadette Caulfield, Frank Doelger, Carolyn Strauss, Bryan Cogman, Lisa McAtackney, Chris Newman, Greg Spence
House of Cards (Netflix) – Beau Willimon, Dana Brunetti, Michael Dobbs, Josh Donen, David Fincher, Eric Roth, Kevin Spacey, Robin Wright, John Mankiewicz, Robert Zotnowski, Jay Carson, Frank Pugliese, Boris Malden, Hameed Shaukat
Westworld (HBO) – J. J. Abrams, Jonathan Nolan, Lisa Joy, Bryan Burk, Athena Wickham, Kathy Lingg, Richard J. Lewis, Roberto Patino, Katherine Lingenfelter, Cherylanne Martin
|-
! colspan="2" style="background:#abcdef;"| Danny Thomas Award for Outstanding Producer of Episodic Television, Comedy
|-
| colspan="2" style="vertical-align:top;"|
Atlanta (FX) – Donald Glover, Dianne McGunigle, Paul Simms, Hiro Murai, Alex Orr
Black-ish (ABC) – Kenya Barris, Jonathan Groff, Anthony Anderson, Laurence Fishburne, Helen Sugland, E. Brian Dobbins, Vijal Patel, Gail Lerner, Corey Nickerson, Courtney Lilly, Lindsey Shockley, Peter Saji, Jenifer Rice-Genzuk Henry, Hale Rothstein, Michael Petok, Yvette Lee Bowser
Silicon Valley (HBO) – Mike Judge, Alec Berg, Jim Kleverweis, Clay Tarver, Dan O'Keefe, Michael Rotenberg, Tom Lassally, John Levenstein, Ron Weiner, Carrie Kemper, Adam Countee
Veep (HBO) – David Mandel, Frank Rich, Julia Louis-Dreyfus, Lew Morton, Morgan Sackett, Sean Gray, Peter Huyck, Alex Gregory, Jim Margolis, Georgia Pritchett, Will Smith, Chris Addison, Rachel Axler, David Hyman, Erik Kenward, Billy Kimball, Steve Koren
|-
! colspan="2" style="background:#abcdef;"| David L. Wolper Award for Outstanding Producer of Long-Form Television
|-
| colspan="2" style="vertical-align:top;"|
The People v. O. J. Simpson: American Crime Story (FX) – Scott Alexander and Larry Karaszewski, Ryan Murphy, Brad Falchuk, Nina Jacobson, Brad Simpson, D.V. DeVincentis, Anthony Hemingway, Alexis Martin Woodall, John Travolta, Chip Vucelich
Black Mirror (Netflix) – Annabel Jones, Charlie Brooker
The Night Manager (AMC) – Simon Cornwell, Stephen Garrett, Stephen Cornwell, Hugh Laurie, Tom Hiddleston, Susanne Bier, David Farr, John le Carré, William D. Johnson, Alexei Boltho, Rob Bullock
The Night Of (HBO) – Steven Zaillian, Richard Price, Jane Tranter, Garrett Basch, Scott Ferguson
Sherlock: The Abominable Bride (PBS) – Mark Gatiss, Steven Moffat, Sue Vertue, Beryl Vertue
|-
! colspan="2" style="background:#abcdef;"| Outstanding Producer of Non-Fiction Television
|-
| colspan="2" style="vertical-align:top;"|
Making a Murderer (Netflix) – Laura Ricciardi, Moira Demos
30 for 30 (ESPN) – Connor Schell, John Dahl, Libby Geist, Bill Simmons, Erin Leyden, Andrew Billman, Marquis Daisy, Deirdre Fenton
60 Minutes (CBS) – Jeff Fager
Anthony Bourdain: Parts Unknown (CNN) – Anthony Bourdain, Christopher Collins, Lydia Tenaglia, Sandra Zweig
Hamilton's America (PBS) – Alex Horwitz, Nicole Pusateri, Lin-Manuel Miranda, Jeffrey Seller, Dave Sirulnick, Jon Kamen, Justin Wilkes
|-
! colspan="2" style="background:#abcdef;"| Outstanding Producer of Competition Television
|-
| colspan="2" style="vertical-align:top;"|
The Voice (NBC) – Audrey Morrissey, Jay Bienstock, Mark Burnett, John de Mol, Jr., Chad Hines, Lee Metzger, Kyra Thompson, Mike Yurchuk, Amanda Zucker, Carson Daly
The Amazing Race (CBS) – Jerry Bruckheimer, Bertram van Munster, Jonathan Littman, Elise Doganieri, Mark Vertullo
American Ninja Warrior (NBC) – Arthur Smith, Kent Weed, Anthony Storm, Brian Richardson, Kristen Stabile, David Markus, J.D. Pruess, D. Max Poris, Zayna Abi-Hashim, Royce Toni, John Gunn, Matt Silverberg, Briana Vowels, Mason Funk, Jonathan Provost
Lip Sync Battle (Spike) – Casey Patterson, Jay Peterson, John Krasinski, Stephen Merchant, Leah Gonzalez, Genna Gintzig, LL Cool J
Top Chef (Bravo) – Daniel Cutforth, Tom Colicchio, Casey Kriley, Padma Lakshmi, Jane Lipsitz, Doneen Arquines, Erica Ross, Patrick Schmedeman, Ellie Carbaial, Tara Seiner, Wade Sheeler
|-
! colspan="2" style="background:#abcdef;"| Outstanding Producer of Live Entertainment & Talk Television
|-
| colspan="2" style="vertical-align:top;"|
Last Week Tonight with John Oliver (HBO) – Tim Carvell, John Oliver, Liz Stanton
Full Frontal with Samantha Bee (TBS) – Samantha Bee, Jo Miller, Jason Jones, Tony Hernandez, Miles Kahn, Pat King, Alison Camillo, Kristen Everman
The Late Late Show with James Corden (CBS) – Ben Winston, Rob Crabbe, Mike Gibbons, Amy Ozols, Sheila Rogers, Michael Kaplan, Jeff Kopp, James Longman, Josie Cliff, James Corden
Real Time with Bill Maher (HBO) – Bill Maher, Scott Carter, Sheila Griffiths, Marc Gurvitz, Billy Martin, Dean E. Johnsen, Chris Kelly, Matt Wood
Saturday Night Live (NBC) – Lorne Michaels, Steve Higgins, Erik Kenward, Lindsay Shookus, Erin Doyle, Ken Aymong
|-
! colspan="2" style="background:#abcdef;"| Outstanding Sports Program
|-
| colspan="2" style="vertical-align:top;"|
Real Sports with Bryant Gumbel (HBO) (TIE)
VICE World of Sports (VICELAND) (TIE)
E:60 (ESPN)
The Fight Game with Jim Lampley: A Tribute to Muhammad Ali (HBO)
Hard Knocks: Training Camp with the Los Angeles Rams (HBO)
|-
! colspan="2" style="background:#abcdef;"| Outstanding Children's Program
|-
| colspan="2" style="vertical-align:top;"|
Sesame Street (PBS/HBO)
Girl Meets World (Disney Channel)
Octonauts (Disney Channel)
School of Rock (Nickelodeon)
SpongeBob SquarePants (Nickelodeon)
|}

Digital

Milestone Award
Tom Rothman

Stanley Kramer Award
Loving

Visionary Award
Megan Ellison

David O. Selznick Achievement Award in Theatrical Motion Pictures
Irwin Winkler

Norman Lear Achievement Award in Television
James L. Brooks

References

External links

2016
2016 film awards
2016 television awards
```objective-c
//
//  KWFutureObject.h
//  iOSFalconCore
//
//  Created by Luke Redpath on 13/01/2011.
//

#import <Foundation/Foundation.h>

typedef id (^KWFutureObjectBlock)(void);

@interface KWFutureObject : NSObject

+ (id)objectWithObjectPointer:(id *)pointer;
+ (id)futureObjectWithBlock:(KWFutureObjectBlock)block;

- (id)initWithBlock:(KWFutureObjectBlock)aBlock;
- (id)object;

@end
```
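The header declares a "future object": a block captured now and evaluated only when `object` is called. A rough Python sketch of that interface follows; note that the memoization (evaluating the block once and reusing the result) is an assumption inferred from the "future object" name, not something the header alone confirms, and all Python names here are illustrative:

```python
# A future/thunk: defer work until the value is first requested, then cache it.

class FutureObject:
    def __init__(self, block):
        self._block = block      # zero-argument callable, like KWFutureObjectBlock
        self._resolved = False
        self._value = None

    @classmethod
    def future_object_with_block(cls, block):
        # Mirrors +futureObjectWithBlock:
        return cls(block)

    def object(self):
        # Mirrors -object: force the block on first access (assumed cached).
        if not self._resolved:
            self._value = self._block()
            self._resolved = True
        return self._value

calls = []
future = FutureObject.future_object_with_block(
    lambda: calls.append(1) or len(calls))
before = len(calls)          # block not yet evaluated
first = future.object()      # evaluates the block
second = future.object()     # reuses the result
```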
Bhiria Road is a city in Naushahro Feroze District in Sindh province, Pakistan, spanning 4.52 square kilometers. Bhiria Road is a commercial hub of Naushahro Feroze district.

Agriculture

Bhiria Road and the adjacent area are very fertile, and canal water for irrigation is available in abundance. Wheat, sugarcane, maize and cotton are the main crops in this part of the Naushahro Feroze District, Province of Sindh. There is a large wholesale market for these agricultural products, and traders from across southern Sindh visit it to buy the products in bulk.

Facilities

Railway station: Bhiria Road has its own railway station, which was built in 1922 on the main railway line running from Karachi to Peshawar via Rohri. It is located at a distance of 220 miles (350 km) from Karachi and 68 miles (110 km) from Rohri. This railway station serves the people of the adjacent towns (Karundi, Pacca Chang, Raid Haji Darya Khan Jalbani, Bhiria City, Kandiaro, Tharushah and Naushahro Feroz) and the rural areas adjacent to these towns.

Road transport: Round-the-clock road transport (rental cars, coaches and SUVs) is available in the parking area of the railway station, and can take travelers to places as far as Karachi in the south and Peshawar in the north. In addition, regular coach services run to Karachi, Hyderabad, Nawab Shah and Sukkur, leaving at scheduled timings with reasonable fares.

Education

A higher secondary school, two secondary schools for boys and girls respectively, and a large number of private schools provide quality education to the children of this area. Due to the high-quality education facilities, Bhiria Road is a favored town for officials of various public and private sector organizations, who prefer to be posted here; after retirement many of them opt to settle here.

Naushahro Feroze District
Vanda Maria Hădărean (born May 3, 1976, in Cluj-Napoca, Romania) is a retired Romanian artistic gymnast who competed in international events between 1990 and 1993. She is an Olympic silver medalist and a world bronze medalist with the team, and a European all-around bronze medalist. She is also a successful fitness model who has won various world-class fitness competitions: she is a four-time Ms. Fitness World (2008, 2009, 2010 and 2011) and a four-time Ms. Fitness Canada (2006–2009). She also coaches, trains, and provides consultation services to gymnasts and fitness models under the brand name "Inspired by Vanda". She is also known for her breakout photo series in the Romanian Playboy magazine.

Early gymnastics career

Hădărean began gymnastics at the age of five, after a local coach spotted her talent and potential while she was playing in the park. At 13, she was selected to the junior national team and moved to Onești to train. During her junior career she won the all-around title and the gold medal on uneven bars at the 1991 Junior European Championships, and placed sixth on beam and tied for fourth place on floor.

Senior gymnastics career

As a senior gymnast Hădărean was a consistent member of the Romanian team, helping it win the bronze medal at the 1991 World Championships and the silver medal at the 1992 Olympic Games. Individually, she won the bronze medal in the all-around at the 1992 European Championships, where she also placed sixth on uneven bars and seventh on floor. During her senior career she was often overshadowed by other strong Romanians of her time such as Lavinia Miloșovici, Cristina Bontaș, and Gina Gogean. As a result, she did not have as many opportunities to compete in individual events due to the 3-per-country rule. For example, at the 1992 Olympic Games Hădărean placed eleventh in the qualification for the all-around event, behind Bontaș (third), Miloșovici (fourth), and Gogean (seventh).
Coaching career

Hădărean retired from competitive gymnastics in 1994. After graduating from high school in Deva she moved on to a sports university. Shortly after enrolling, she received a coaching position in Canada, which she accepted. Hădărean left school and moved to Canada, where she coached at Mountain Star Gymnastics in Hamilton, Ontario and at a second gym in Peterborough, Ontario. After more than two years coaching in Canada, she moved to the US to coach in Houston, Texas at Moceanu Gymnastics Incorporated (MGI). When MGI closed, she moved to South Carolina to coach at Golden Strip Gymnastics (since renamed Champions Gymnastics). This position was short-lived, however, and she returned to Romania. Back home, Hădărean began working for a private aerobics club, EOS Aerobic Centre, in Cluj-Napoca. At the age of 24, she returned to Canada in early November 2000 and began coaching at Mountain Star Gymnastics in Hamilton again. In mid-2001 she began a new coaching position at Tumblers Gymnastics Centre in Ottawa, Canada's capital. She served as the club's first provincial-level coach, focusing on choreography and strengthening the club's female elite program.

Fitness career

Following the advice of a good friend, Hădărean started to compete in fitness modeling and fitness routines. She had a successful debut in 2005, winning the Ms. Fitness Canada title and finishing fourth at the Ms. World Fitness event. She continued her fitness career by winning three more Ms. Fitness Canada titles (2006, 2007, 2008), taking second place at Ms. World Fitness in 2007, and winning four consecutive Ms. World Fitness titles (2008, 2009, 2010 and 2011). Hădărean has appeared on the cover of several magazines, including "Serious about Fitness" (November 2005 and March 2008), "Ms. Fitness Magazine" (Spring 2008) and "Oxygen" (October 2009). She is also a successful fitness coach: three of the girls she coached at Tumblers reached the podium in the junior and teen events at the 2008 Ms.
Fitness Canada competition. Emily MacDougall and Alexa Charett came first and second in the teen event, and Emily Tippins placed second in the junior category.

Brand name

Hădărean currently coaches and trains through her signature "Inspired by Vanda" programs at Soma Health and Fitness in Ottawa, Canada. She also makes promotional appearances and provides consultation services to gymnasts and to other fitness, fitness model, and figure competitors.

Awards

She was twice awarded the title "Gymnast of the Year" by her hometown of Cluj and once awarded "Gymnast of the Year" by Neutron Sports Canada.

References

External links

Vanda Hadarean's official website

1976 births
Living people
Sportspeople from Cluj-Napoca
Romanian female artistic gymnasts
Gymnasts at the 1992 Summer Olympics
Olympic gymnasts for Romania
Olympic silver medalists for Romania
Olympic medalists in gymnastics
Medalists at the World Artistic Gymnastics Championships
European champions in gymnastics
Romanian gymnastics coaches
Fitness and figure competitors
Medalists at the 1992 Summer Olympics
```graphql
# source: path_to_url
# timestamp: Mon Nov 20 2017 11:36:08 GMT+0100 (CET)

enum _ModelMutationType {
  CREATED
  UPDATED
  DELETED
}

# Meta information about the query.
type _QueryMeta {
  count: Int!
}

type AddToBookeePayload {
  bookeeUser: User
  bookingsBooking: Booking
}

type AddToCityNeighbourhoodPayload {
  neighbourhoodsNeighbourhood: Neighbourhood
  cityCity: City
}

type AddToExperienceHostPayload {
  hostUser: User
  hostingExperiencesExperience: Experience
}

type AddToExperienceReviewsPayload {
  experienceExperience: Experience
  reviewsReview: Review
}

type AddToNeighbourhoodPayload {
  locationsLocation: Location
  neighbourHoodNeighbourhood: Neighbourhood
}

type AddToNotificationsPayload {
  userUser: User
  notificationsNotification: Notification
}

type AddToPaymentAccountsPayload {
  paymentsPayment: Payment
  paymentMethodPaymentAccount: PaymentAccount
}

type AddToPlaceBookingPayload {
  placePlace: Place
  bookingsBooking: Booking
}

type AddToPlaceOwnerPayload {
  hostUser: User
  ownedPlacesPlace: Place
}

type AddToPlacePicturesPayload {
  placePlace: Place
  picturesPicture: Picture
}

type AddToPlaceReviewsPayload {
  placePlace: Place
  reviewsReview: Review
}

type AddToReceivedMessagesPayload {
  toUser: User
  receivedMessagesMessage: Message
}

type AddToRestaurantPicturePayload {
  picturesPicture: Picture
  reservationRestaurant: Restaurant
}

type AddToSentMessagesPayload {
  fromUser: User
  sentMessagesMessage: Message
}

type AddToUserPaymentAccountsPayload {
  userUser: User
  paymentAccountPaymentAccount: PaymentAccount
}

type Amenities implements Node {
  airConditioning: Boolean!
  babyBath: Boolean!
  babyMonitor: Boolean!
  babysitterRecommendations: Boolean!
  bathtub: Boolean!
  breakfast: Boolean!
  buzzerWirelessIntercom: Boolean!
  cableTv: Boolean!
  changingTable: Boolean!
  childrensBooksAndToys: Boolean!
  childrensDinnerware: Boolean!
  crib: Boolean!
  doorman: Boolean!
  dryer: Boolean!
  elevator: Boolean!
  essentials: Boolean!
  familyKidFriendly: Boolean!
  freeParkingOnPremises: Boolean!
  freeParkingOnStreet: Boolean!
  gym: Boolean!
  hairDryer: Boolean!
  hangers: Boolean!
  heating: Boolean!
  hotTub: Boolean!
  id: ID!
  indoorFireplace: Boolean!
  internet: Boolean!
  iron: Boolean!
  kitchen: Boolean!
  laptopFriendlyWorkspace: Boolean!
  paidParkingOffPremises: Boolean!
  petsAllowed: Boolean!
  place(filter: PlaceFilter): Place!
  pool: Boolean!
  privateEntrance: Boolean!
  shampoo: Boolean!
  smokingAllowed: Boolean!
  suitableForEvents: Boolean!
  tv: Boolean!
  washer: Boolean!
  wheelchairAccessible: Boolean!
  wirelessInternet: Boolean!
}

input AmenitiesFilter {
  # Logical AND on all given filters.
  AND: [AmenitiesFilter!]
  # Logical OR on all given filters.
  OR: [AmenitiesFilter!]
  airConditioning: Boolean
  # All values that are not equal to given value.
  airConditioning_not: Boolean
  babyBath: Boolean
  # All values that are not equal to given value.
  babyBath_not: Boolean
  babyMonitor: Boolean
  # All values that are not equal to given value.
  babyMonitor_not: Boolean
  babysitterRecommendations: Boolean
  # All values that are not equal to given value.
  babysitterRecommendations_not: Boolean
  bathtub: Boolean
  # All values that are not equal to given value.
  bathtub_not: Boolean
  breakfast: Boolean
  # All values that are not equal to given value.
  breakfast_not: Boolean
  buzzerWirelessIntercom: Boolean
  # All values that are not equal to given value.
  buzzerWirelessIntercom_not: Boolean
  cableTv: Boolean
  # All values that are not equal to given value.
  cableTv_not: Boolean
  changingTable: Boolean
  # All values that are not equal to given value.
  changingTable_not: Boolean
  childrensBooksAndToys: Boolean
  # All values that are not equal to given value.
  childrensBooksAndToys_not: Boolean
  childrensDinnerware: Boolean
  # All values that are not equal to given value.
  childrensDinnerware_not: Boolean
  crib: Boolean
  # All values that are not equal to given value.
  crib_not: Boolean
  doorman: Boolean
  # All values that are not equal to given value.
  doorman_not: Boolean
  dryer: Boolean
  # All values that are not equal to given value.
  dryer_not: Boolean
  elevator: Boolean
  # All values that are not equal to given value.
  elevator_not: Boolean
  essentials: Boolean
  # All values that are not equal to given value.
  essentials_not: Boolean
  familyKidFriendly: Boolean
  # All values that are not equal to given value.
  familyKidFriendly_not: Boolean
  freeParkingOnPremises: Boolean
  # All values that are not equal to given value.
  freeParkingOnPremises_not: Boolean
  freeParkingOnStreet: Boolean
  # All values that are not equal to given value.
  freeParkingOnStreet_not: Boolean
  gym: Boolean
  # All values that are not equal to given value.
  gym_not: Boolean
  hairDryer: Boolean
  # All values that are not equal to given value.
  hairDryer_not: Boolean
  hangers: Boolean
  # All values that are not equal to given value.
  hangers_not: Boolean
  heating: Boolean
  # All values that are not equal to given value.
  heating_not: Boolean
  hotTub: Boolean
  # All values that are not equal to given value.
  hotTub_not: Boolean
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  indoorFireplace: Boolean
  # All values that are not equal to given value.
  indoorFireplace_not: Boolean
  internet: Boolean
  # All values that are not equal to given value.
  internet_not: Boolean
  iron: Boolean
  # All values that are not equal to given value.
  iron_not: Boolean
  kitchen: Boolean
  # All values that are not equal to given value.
  kitchen_not: Boolean
  laptopFriendlyWorkspace: Boolean
  # All values that are not equal to given value.
  laptopFriendlyWorkspace_not: Boolean
  paidParkingOffPremises: Boolean
  # All values that are not equal to given value.
  paidParkingOffPremises_not: Boolean
  petsAllowed: Boolean
  # All values that are not equal to given value.
  petsAllowed_not: Boolean
  pool: Boolean
  # All values that are not equal to given value.
  pool_not: Boolean
  privateEntrance: Boolean
  # All values that are not equal to given value.
  privateEntrance_not: Boolean
  shampoo: Boolean
  # All values that are not equal to given value.
  shampoo_not: Boolean
  smokingAllowed: Boolean
  # All values that are not equal to given value.
  smokingAllowed_not: Boolean
  suitableForEvents: Boolean
  # All values that are not equal to given value.
  suitableForEvents_not: Boolean
  tv: Boolean
  # All values that are not equal to given value.
  tv_not: Boolean
  washer: Boolean
  # All values that are not equal to given value.
  washer_not: Boolean
  wheelchairAccessible: Boolean
  # All values that are not equal to given value.
  wheelchairAccessible_not: Boolean
  wirelessInternet: Boolean
  # All values that are not equal to given value.
wirelessInternet_not: Boolean place: PlaceFilter } enum AmenitiesOrderBy { airConditioning_ASC airConditioning_DESC babyBath_ASC babyBath_DESC babyMonitor_ASC babyMonitor_DESC babysitterRecommendations_ASC babysitterRecommendations_DESC bathtub_ASC bathtub_DESC breakfast_ASC breakfast_DESC buzzerWirelessIntercom_ASC buzzerWirelessIntercom_DESC cableTv_ASC cableTv_DESC changingTable_ASC changingTable_DESC childrensBooksAndToys_ASC childrensBooksAndToys_DESC childrensDinnerware_ASC childrensDinnerware_DESC crib_ASC crib_DESC doorman_ASC doorman_DESC dryer_ASC dryer_DESC elevator_ASC elevator_DESC essentials_ASC essentials_DESC familyKidFriendly_ASC familyKidFriendly_DESC freeParkingOnPremises_ASC freeParkingOnPremises_DESC freeParkingOnStreet_ASC freeParkingOnStreet_DESC gym_ASC gym_DESC hairDryer_ASC hairDryer_DESC hangers_ASC hangers_DESC heating_ASC heating_DESC hotTub_ASC hotTub_DESC id_ASC id_DESC indoorFireplace_ASC indoorFireplace_DESC internet_ASC internet_DESC iron_ASC iron_DESC kitchen_ASC kitchen_DESC laptopFriendlyWorkspace_ASC laptopFriendlyWorkspace_DESC paidParkingOffPremises_ASC paidParkingOffPremises_DESC petsAllowed_ASC petsAllowed_DESC pool_ASC pool_DESC privateEntrance_ASC privateEntrance_DESC shampoo_ASC shampoo_DESC smokingAllowed_ASC smokingAllowed_DESC suitableForEvents_ASC suitableForEvents_DESC tv_ASC tv_DESC washer_ASC washer_DESC wheelchairAccessible_ASC wheelchairAccessible_DESC wirelessInternet_ASC wirelessInternet_DESC } input AmenitiesplacePlace { description: String! maxGuests: Int! name: String numBaths: Int! numBedrooms: Int! numBeds: Int! popularity: Int! shortDescription: String! size: PLACE_SIZES slug: String! 
guestRequirementsId: ID guestRequirements: PlaceguestRequirementsGuestRequirements hostId: ID host: PlacehostUser houseRulesId: ID houseRules: PlacehouseRulesHouseRules locationId: ID location: PlacelocationLocation policiesId: ID policies: PlacepoliciesPolicies pricingId: ID pricing: PlacepricingPricing viewsId: ID views: PlaceviewsPlaceViews bookingsIds: [ID!] bookings: [PlacebookingsBooking!] picturesIds: [ID!] pictures: [PlacepicturesPicture!] reviewsIds: [ID!] reviews: [PlacereviewsReview!] } type AmenitiesPreviousValues { airConditioning: Boolean! babyBath: Boolean! babyMonitor: Boolean! babysitterRecommendations: Boolean! bathtub: Boolean! breakfast: Boolean! buzzerWirelessIntercom: Boolean! cableTv: Boolean! changingTable: Boolean! childrensBooksAndToys: Boolean! childrensDinnerware: Boolean! crib: Boolean! doorman: Boolean! dryer: Boolean! elevator: Boolean! essentials: Boolean! familyKidFriendly: Boolean! freeParkingOnPremises: Boolean! freeParkingOnStreet: Boolean! gym: Boolean! hairDryer: Boolean! hangers: Boolean! heating: Boolean! hotTub: Boolean! id: ID! indoorFireplace: Boolean! internet: Boolean! iron: Boolean! kitchen: Boolean! laptopFriendlyWorkspace: Boolean! paidParkingOffPremises: Boolean! petsAllowed: Boolean! pool: Boolean! privateEntrance: Boolean! shampoo: Boolean! smokingAllowed: Boolean! suitableForEvents: Boolean! tv: Boolean! washer: Boolean! wheelchairAccessible: Boolean! wirelessInternet: Boolean! } input AmenitiesSubscriptionFilter { # Logical AND on all given filters. AND: [AmenitiesSubscriptionFilter!] # Logical OR on all given filters. OR: [AmenitiesSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] 
# The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: AmenitiesSubscriptionFilterNode } input AmenitiesSubscriptionFilterNode { airConditioning: Boolean # All values that are not equal to given value. airConditioning_not: Boolean babyBath: Boolean # All values that are not equal to given value. babyBath_not: Boolean babyMonitor: Boolean # All values that are not equal to given value. babyMonitor_not: Boolean babysitterRecommendations: Boolean # All values that are not equal to given value. babysitterRecommendations_not: Boolean bathtub: Boolean # All values that are not equal to given value. bathtub_not: Boolean breakfast: Boolean # All values that are not equal to given value. breakfast_not: Boolean buzzerWirelessIntercom: Boolean # All values that are not equal to given value. buzzerWirelessIntercom_not: Boolean cableTv: Boolean # All values that are not equal to given value. cableTv_not: Boolean changingTable: Boolean # All values that are not equal to given value. changingTable_not: Boolean childrensBooksAndToys: Boolean # All values that are not equal to given value. childrensBooksAndToys_not: Boolean childrensDinnerware: Boolean # All values that are not equal to given value. childrensDinnerware_not: Boolean crib: Boolean # All values that are not equal to given value. crib_not: Boolean doorman: Boolean # All values that are not equal to given value. doorman_not: Boolean dryer: Boolean # All values that are not equal to given value. dryer_not: Boolean elevator: Boolean # All values that are not equal to given value. 
elevator_not: Boolean essentials: Boolean # All values that are not equal to given value. essentials_not: Boolean familyKidFriendly: Boolean # All values that are not equal to given value. familyKidFriendly_not: Boolean freeParkingOnPremises: Boolean # All values that are not equal to given value. freeParkingOnPremises_not: Boolean freeParkingOnStreet: Boolean # All values that are not equal to given value. freeParkingOnStreet_not: Boolean gym: Boolean # All values that are not equal to given value. gym_not: Boolean hairDryer: Boolean # All values that are not equal to given value. hairDryer_not: Boolean hangers: Boolean # All values that are not equal to given value. hangers_not: Boolean heating: Boolean # All values that are not equal to given value. heating_not: Boolean hotTub: Boolean # All values that are not equal to given value. hotTub_not: Boolean id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID indoorFireplace: Boolean # All values that are not equal to given value. indoorFireplace_not: Boolean internet: Boolean # All values that are not equal to given value. internet_not: Boolean iron: Boolean # All values that are not equal to given value. 
iron_not: Boolean kitchen: Boolean # All values that are not equal to given value. kitchen_not: Boolean laptopFriendlyWorkspace: Boolean # All values that are not equal to given value. laptopFriendlyWorkspace_not: Boolean paidParkingOffPremises: Boolean # All values that are not equal to given value. paidParkingOffPremises_not: Boolean petsAllowed: Boolean # All values that are not equal to given value. petsAllowed_not: Boolean pool: Boolean # All values that are not equal to given value. pool_not: Boolean privateEntrance: Boolean # All values that are not equal to given value. privateEntrance_not: Boolean shampoo: Boolean # All values that are not equal to given value. shampoo_not: Boolean smokingAllowed: Boolean # All values that are not equal to given value. smokingAllowed_not: Boolean suitableForEvents: Boolean # All values that are not equal to given value. suitableForEvents_not: Boolean tv: Boolean # All values that are not equal to given value. tv_not: Boolean washer: Boolean # All values that are not equal to given value. washer_not: Boolean wheelchairAccessible: Boolean # All values that are not equal to given value. wheelchairAccessible_not: Boolean wirelessInternet: Boolean # All values that are not equal to given value. wirelessInternet_not: Boolean place: PlaceFilter } type AmenitiesSubscriptionPayload { mutation: _ModelMutationType! node: Amenities updatedFields: [String!] previousValues: AmenitiesPreviousValues } type Booking implements Node { bookee(filter: UserFilter): User! createdAt: DateTime! endDate: DateTime! id: ID! payment(filter: PaymentFilter): Payment! place(filter: PlaceFilter): Place! startDate: DateTime! } input BookingbookeeUser { email: String! firstName: String! isSuperHost: Boolean lastName: String! password: String! phone: String! responseRate: Float responseTime: Int locationId: ID location: UserlocationLocation profilePictureId: ID profilePicture: UserprofilePicturePicture bookingsIds: [ID!] bookings: [UserbookingsBooking!] 
hostingExperiencesIds: [ID!] hostingExperiences: [UserhostingExperiencesExperience!] notificationsIds: [ID!] notifications: [UsernotificationsNotification!] ownedPlacesIds: [ID!] ownedPlaces: [UserownedPlacesPlace!] paymentAccountIds: [ID!] paymentAccount: [UserpaymentAccountPaymentAccount!] receivedMessagesIds: [ID!] receivedMessages: [UserreceivedMessagesMessage!] sentMessagesIds: [ID!] sentMessages: [UsersentMessagesMessage!] } input BookingFilter { # Logical AND on all given filters. AND: [BookingFilter!] # Logical OR on all given filters. OR: [BookingFilter!] createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime endDate: DateTime # All values that are not equal to given value. endDate_not: DateTime # All values that are contained in given list. endDate_in: [DateTime!] # All values that are not contained in given list. endDate_not_in: [DateTime!] # All values less than the given value. endDate_lt: DateTime # All values less than or equal the given value. endDate_lte: DateTime # All values greater than the given value. endDate_gt: DateTime # All values greater than or equal the given value. endDate_gte: DateTime id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. 
id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID startDate: DateTime # All values that are not equal to given value. startDate_not: DateTime # All values that are contained in given list. startDate_in: [DateTime!] # All values that are not contained in given list. startDate_not_in: [DateTime!] # All values less than the given value. startDate_lt: DateTime # All values less than or equal the given value. startDate_lte: DateTime # All values greater than the given value. startDate_gt: DateTime # All values greater than or equal the given value. startDate_gte: DateTime bookee: UserFilter payment: PaymentFilter place: PlaceFilter } enum BookingOrderBy { createdAt_ASC createdAt_DESC endDate_ASC endDate_DESC id_ASC id_DESC startDate_ASC startDate_DESC } input BookingpaymentPayment { placePrice: Float! serviceFee: Float! totalPrice: Float! paymentMethodId: ID paymentMethod: PaymentpaymentMethodPaymentAccount } input BookingplacePlace { description: String! maxGuests: Int! name: String numBaths: Int! numBedrooms: Int! numBeds: Int! popularity: Int! shortDescription: String! size: PLACE_SIZES slug: String! amenitiesId: ID amenities: PlaceamenitiesAmenities guestRequirementsId: ID guestRequirements: PlaceguestRequirementsGuestRequirements hostId: ID host: PlacehostUser houseRulesId: ID houseRules: PlacehouseRulesHouseRules locationId: ID location: PlacelocationLocation policiesId: ID policies: PlacepoliciesPolicies pricingId: ID pricing: PlacepricingPricing viewsId: ID views: PlaceviewsPlaceViews bookingsIds: [ID!] bookings: [PlacebookingsBooking!] picturesIds: [ID!] 
pictures: [PlacepicturesPicture!] reviewsIds: [ID!] reviews: [PlacereviewsReview!] } type BookingPreviousValues { createdAt: DateTime! endDate: DateTime! id: ID! startDate: DateTime! } input BookingSubscriptionFilter { # Logical AND on all given filters. AND: [BookingSubscriptionFilter!] # Logical OR on all given filters. OR: [BookingSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] # The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: BookingSubscriptionFilterNode } input BookingSubscriptionFilterNode { createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime endDate: DateTime # All values that are not equal to given value. endDate_not: DateTime # All values that are contained in given list. endDate_in: [DateTime!] # All values that are not contained in given list. endDate_not_in: [DateTime!] # All values less than the given value. endDate_lt: DateTime # All values less than or equal the given value. endDate_lte: DateTime # All values greater than the given value. endDate_gt: DateTime # All values greater than or equal the given value. 
endDate_gte: DateTime id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID startDate: DateTime # All values that are not equal to given value. startDate_not: DateTime # All values that are contained in given list. startDate_in: [DateTime!] # All values that are not contained in given list. startDate_not_in: [DateTime!] # All values less than the given value. startDate_lt: DateTime # All values less than or equal the given value. startDate_lte: DateTime # All values greater than the given value. startDate_gt: DateTime # All values greater than or equal the given value. startDate_gte: DateTime bookee: UserFilter payment: PaymentFilter place: PlaceFilter } type BookingSubscriptionPayload { mutation: _ModelMutationType! node: Booking updatedFields: [String!] previousValues: BookingPreviousValues } type City implements Node { id: ID! name: String! neighbourhoods( filter: NeighbourhoodFilter orderBy: NeighbourhoodOrderBy skip: Int after: String before: String first: Int last: Int ): [Neighbourhood!] # Meta information about the query. _neighbourhoodsMeta( filter: NeighbourhoodFilter orderBy: NeighbourhoodOrderBy skip: Int after: String before: String first: Int last: Int ): _QueryMeta! 
} input CityFilter { # Logical AND on all given filters. AND: [CityFilter!] # Logical OR on all given filters. OR: [CityFilter!] id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID name: String # All values that are not equal to given value. name_not: String # All values that are contained in given list. name_in: [String!] # All values that are not contained in given list. name_not_in: [String!] # All values less than the given value. name_lt: String # All values less than or equal the given value. name_lte: String # All values greater than the given value. name_gt: String # All values greater than or equal the given value. name_gte: String # All values containing the given string. name_contains: String # All values not containing the given string. name_not_contains: String # All values starting with the given string. name_starts_with: String # All values not starting with the given string. name_not_starts_with: String # All values ending with the given string. name_ends_with: String # All values not ending with the given string. 
name_not_ends_with: String neighbourhoods_every: NeighbourhoodFilter neighbourhoods_some: NeighbourhoodFilter neighbourhoods_none: NeighbourhoodFilter } input CityneighbourhoodsNeighbourhood { featured: Boolean! name: String! popularity: Int! slug: String! homePreviewId: ID homePreview: NeighbourhoodhomePreviewPicture locationsIds: [ID!] locations: [NeighbourhoodlocationsLocation!] } enum CityOrderBy { id_ASC id_DESC name_ASC name_DESC } type CityPreviousValues { id: ID! name: String! } input CitySubscriptionFilter { # Logical AND on all given filters. AND: [CitySubscriptionFilter!] # Logical OR on all given filters. OR: [CitySubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] # The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: CitySubscriptionFilterNode } input CitySubscriptionFilterNode { id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. 
id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID name: String # All values that are not equal to given value. name_not: String # All values that are contained in given list. name_in: [String!] # All values that are not contained in given list. name_not_in: [String!] # All values less than the given value. name_lt: String # All values less than or equal the given value. name_lte: String # All values greater than the given value. name_gt: String # All values greater than or equal the given value. name_gte: String # All values containing the given string. name_contains: String # All values not containing the given string. name_not_contains: String # All values starting with the given string. name_starts_with: String # All values not starting with the given string. name_not_starts_with: String # All values ending with the given string. name_ends_with: String # All values not ending with the given string. name_not_ends_with: String neighbourhoods_every: NeighbourhoodFilter neighbourhoods_some: NeighbourhoodFilter neighbourhoods_none: NeighbourhoodFilter } type CitySubscriptionPayload { mutation: _ModelMutationType! node: City updatedFields: [String!] 
previousValues: CityPreviousValues } input CreateAmenities { airConditioning: Boolean babyBath: Boolean babyMonitor: Boolean babysitterRecommendations: Boolean bathtub: Boolean breakfast: Boolean buzzerWirelessIntercom: Boolean cableTv: Boolean changingTable: Boolean childrensBooksAndToys: Boolean childrensDinnerware: Boolean crib: Boolean doorman: Boolean dryer: Boolean elevator: Boolean essentials: Boolean familyKidFriendly: Boolean freeParkingOnPremises: Boolean freeParkingOnStreet: Boolean gym: Boolean hairDryer: Boolean hangers: Boolean heating: Boolean hotTub: Boolean indoorFireplace: Boolean internet: Boolean iron: Boolean kitchen: Boolean laptopFriendlyWorkspace: Boolean paidParkingOffPremises: Boolean petsAllowed: Boolean pool: Boolean privateEntrance: Boolean shampoo: Boolean smokingAllowed: Boolean suitableForEvents: Boolean tv: Boolean washer: Boolean wheelchairAccessible: Boolean wirelessInternet: Boolean placeId: ID place: AmenitiesplacePlace } input CreateBooking { endDate: DateTime! startDate: DateTime! bookeeId: ID bookee: BookingbookeeUser paymentId: ID payment: BookingpaymentPayment placeId: ID place: BookingplacePlace } input CreateCity { name: String! neighbourhoodsIds: [ID!] neighbourhoods: [CityneighbourhoodsNeighbourhood!] } input CreateCreditCardInformation { cardNumber: String! country: String! expiresOnMonth: Int! expiresOnYear: Int! firstName: String! lastName: String! postalCode: String! securityCode: String! paymentAccountId: ID paymentAccount: CreditCardInformationpaymentAccountPaymentAccount } input CreateExperience { popularity: Int! pricePerPerson: Int! title: String! categoryId: ID category: ExperiencecategoryExperienceCategory hostId: ID host: ExperiencehostUser locationId: ID location: ExperiencelocationLocation previewId: ID preview: ExperiencepreviewPicture reviewsIds: [ID!] reviews: [ExperiencereviewsReview!] } input CreateExperienceCategory { mainColor: String name: String! 
experienceId: ID experience: ExperienceCategoryexperienceExperience } input CreateGuestRequirements { govIssuedId: Boolean guestTripInformation: Boolean recommendationsFromOtherHosts: Boolean placeId: ID place: GuestRequirementsplacePlace } input CreateHouseRules { additionalRules: String partiesAndEventsAllowed: Boolean petsAllowed: Boolean smokingAllowed: Boolean suitableForChildren: Boolean suitableForInfants: Boolean placeId: ID place: HouseRulesplacePlace } input CreateLocation { address: String directions: String lat: Float! lng: Float! experienceId: ID experience: LocationexperienceExperience neighbourHoodId: ID neighbourHood: LocationneighbourHoodNeighbourhood placeId: ID place: LocationplacePlace restaurantId: ID restaurant: LocationrestaurantRestaurant userId: ID user: LocationuserUser } input CreateMessage { deliveredAt: DateTime! readAt: DateTime! fromId: ID from: MessagefromUser toId: ID to: MessagetoUser } input CreateNeighbourhood { featured: Boolean! name: String! popularity: Int! slug: String! cityId: ID city: NeighbourhoodcityCity homePreviewId: ID homePreview: NeighbourhoodhomePreviewPicture locationsIds: [ID!] locations: [NeighbourhoodlocationsLocation!] } input CreateNotification { link: String! readDate: DateTime! type: NOTIFICATION_TYPE userId: ID user: NotificationuserUser } input CreatePayment { placePrice: Float! serviceFee: Float! totalPrice: Float! bookingId: ID booking: PaymentbookingBooking paymentMethodId: ID paymentMethod: PaymentpaymentMethodPaymentAccount } input CreatePaymentAccount { type: PAYMENT_PROVIDER creditcardId: ID creditcard: PaymentAccountcreditcardCreditCardInformation paypalId: ID paypal: PaymentAccountpaypalPaypalInformation userId: ID user: PaymentAccountuserUser paymentsIds: [ID!] payments: [PaymentAccountpaymentsPayment!] } input CreatePaypalInformation { email: String! paymentAccountId: ID paymentAccount: PaypalInformationpaymentAccountPaymentAccount } input CreatePicture { url: String! 
experienceId: ID experience: PictureexperienceExperience neighbourHoodId: ID neighbourHood: PictureneighbourHoodNeighbourhood placeId: ID place: PictureplacePlace reservationId: ID reservation: PicturereservationRestaurant userId: ID user: PictureuserUser } input CreatePlace { description: String! maxGuests: Int! name: String numBaths: Int! numBedrooms: Int! numBeds: Int! popularity: Int! shortDescription: String! size: PLACE_SIZES slug: String! amenitiesId: ID amenities: PlaceamenitiesAmenities guestRequirementsId: ID guestRequirements: PlaceguestRequirementsGuestRequirements hostId: ID host: PlacehostUser houseRulesId: ID houseRules: PlacehouseRulesHouseRules locationId: ID location: PlacelocationLocation policiesId: ID policies: PlacepoliciesPolicies pricingId: ID pricing: PlacepricingPricing viewsId: ID views: PlaceviewsPlaceViews bookingsIds: [ID!] bookings: [PlacebookingsBooking!] picturesIds: [ID!] pictures: [PlacepicturesPicture!] reviewsIds: [ID!] reviews: [PlacereviewsReview!] } input CreatePlaceViews { lastWeek: Int! placeId: ID place: PlaceViewsplacePlace } input CreatePolicies { checkInEndTime: Float! checkInStartTime: Float! checkoutTime: Float! placeId: ID place: PoliciesplacePlace } input CreatePricing { averageMonthly: Int! averageWeekly: Int! basePrice: Int! cleaningFee: Int currency: CURRENCY extraGuests: Int monthlyDiscount: Int perNight: Int! securityDeposit: Int smartPricing: Boolean weekendPricing: Int weeklyDiscount: Int placeId: ID place: PricingplacePlace } input CreateRestaurant { avgPricePerPerson: Int! isCurated: Boolean popularity: Int! slug: String! title: String! locationId: ID location: RestaurantlocationLocation picturesIds: [ID!] pictures: [RestaurantpicturesPicture!] } input CreateReview { accuracy: Int! checkIn: Int! cleanliness: Int! communication: Int! location: Int! stars: Int! text: String! value: Int! 
experienceId: ID experience: ReviewexperienceExperience placeId: ID place: ReviewplacePlace } input CreateUser { email: String! firstName: String! isSuperHost: Boolean lastName: String! password: String! phone: String! responseRate: Float responseTime: Int locationId: ID location: UserlocationLocation profilePictureId: ID profilePicture: UserprofilePicturePicture bookingsIds: [ID!] bookings: [UserbookingsBooking!] hostingExperiencesIds: [ID!] hostingExperiences: [UserhostingExperiencesExperience!] notificationsIds: [ID!] notifications: [UsernotificationsNotification!] ownedPlacesIds: [ID!] ownedPlaces: [UserownedPlacesPlace!] paymentAccountIds: [ID!] paymentAccount: [UserpaymentAccountPaymentAccount!] receivedMessagesIds: [ID!] receivedMessages: [UserreceivedMessagesMessage!] sentMessagesIds: [ID!] sentMessages: [UsersentMessagesMessage!] } type CreditCardInformation implements Node { cardNumber: String! country: String! createdAt: DateTime! expiresOnMonth: Int! expiresOnYear: Int! firstName: String! id: ID! lastName: String! paymentAccount(filter: PaymentAccountFilter): PaymentAccount postalCode: String! securityCode: String! } input CreditCardInformationFilter { # Logical AND on all given filters. AND: [CreditCardInformationFilter!] # Logical OR on all given filters. OR: [CreditCardInformationFilter!] cardNumber: String # All values that are not equal to given value. cardNumber_not: String # All values that are contained in given list. cardNumber_in: [String!] # All values that are not contained in given list. cardNumber_not_in: [String!] # All values less than the given value. cardNumber_lt: String # All values less than or equal the given value. cardNumber_lte: String # All values greater than the given value. cardNumber_gt: String # All values greater than or equal the given value. cardNumber_gte: String # All values containing the given string. cardNumber_contains: String # All values not containing the given string. 
cardNumber_not_contains: String # All values starting with the given string. cardNumber_starts_with: String # All values not starting with the given string. cardNumber_not_starts_with: String # All values ending with the given string. cardNumber_ends_with: String # All values not ending with the given string. cardNumber_not_ends_with: String country: String # All values that are not equal to given value. country_not: String # All values that are contained in given list. country_in: [String!] # All values that are not contained in given list. country_not_in: [String!] # All values less than the given value. country_lt: String # All values less than or equal the given value. country_lte: String # All values greater than the given value. country_gt: String # All values greater than or equal the given value. country_gte: String # All values containing the given string. country_contains: String # All values not containing the given string. country_not_contains: String # All values starting with the given string. country_starts_with: String # All values not starting with the given string. country_not_starts_with: String # All values ending with the given string. country_ends_with: String # All values not ending with the given string. country_not_ends_with: String createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime expiresOnMonth: Int # All values that are not equal to given value. expiresOnMonth_not: Int # All values that are contained in given list. expiresOnMonth_in: [Int!] 
# All values that are not contained in given list. expiresOnMonth_not_in: [Int!] # All values less than the given value. expiresOnMonth_lt: Int # All values less than or equal the given value. expiresOnMonth_lte: Int # All values greater than the given value. expiresOnMonth_gt: Int # All values greater than or equal the given value. expiresOnMonth_gte: Int expiresOnYear: Int # All values that are not equal to given value. expiresOnYear_not: Int # All values that are contained in given list. expiresOnYear_in: [Int!] # All values that are not contained in given list. expiresOnYear_not_in: [Int!] # All values less than the given value. expiresOnYear_lt: Int # All values less than or equal the given value. expiresOnYear_lte: Int # All values greater than the given value. expiresOnYear_gt: Int # All values greater than or equal the given value. expiresOnYear_gte: Int firstName: String # All values that are not equal to given value. firstName_not: String # All values that are contained in given list. firstName_in: [String!] # All values that are not contained in given list. firstName_not_in: [String!] # All values less than the given value. firstName_lt: String # All values less than or equal the given value. firstName_lte: String # All values greater than the given value. firstName_gt: String # All values greater than or equal the given value. firstName_gte: String # All values containing the given string. firstName_contains: String # All values not containing the given string. firstName_not_contains: String # All values starting with the given string. firstName_starts_with: String # All values not starting with the given string. firstName_not_starts_with: String # All values ending with the given string. firstName_ends_with: String # All values not ending with the given string. firstName_not_ends_with: String id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] 
# All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID lastName: String # All values that are not equal to given value. lastName_not: String # All values that are contained in given list. lastName_in: [String!] # All values that are not contained in given list. lastName_not_in: [String!] # All values less than the given value. lastName_lt: String # All values less than or equal the given value. lastName_lte: String # All values greater than the given value. lastName_gt: String # All values greater than or equal the given value. lastName_gte: String # All values containing the given string. lastName_contains: String # All values not containing the given string. lastName_not_contains: String # All values starting with the given string. lastName_starts_with: String # All values not starting with the given string. lastName_not_starts_with: String # All values ending with the given string. lastName_ends_with: String # All values not ending with the given string. lastName_not_ends_with: String postalCode: String # All values that are not equal to given value. postalCode_not: String # All values that are contained in given list. postalCode_in: [String!] # All values that are not contained in given list. postalCode_not_in: [String!] # All values less than the given value. postalCode_lt: String # All values less than or equal the given value. 
postalCode_lte: String # All values greater than the given value. postalCode_gt: String # All values greater than or equal the given value. postalCode_gte: String # All values containing the given string. postalCode_contains: String # All values not containing the given string. postalCode_not_contains: String # All values starting with the given string. postalCode_starts_with: String # All values not starting with the given string. postalCode_not_starts_with: String # All values ending with the given string. postalCode_ends_with: String # All values not ending with the given string. postalCode_not_ends_with: String securityCode: String # All values that are not equal to given value. securityCode_not: String # All values that are contained in given list. securityCode_in: [String!] # All values that are not contained in given list. securityCode_not_in: [String!] # All values less than the given value. securityCode_lt: String # All values less than or equal the given value. securityCode_lte: String # All values greater than the given value. securityCode_gt: String # All values greater than or equal the given value. securityCode_gte: String # All values containing the given string. securityCode_contains: String # All values not containing the given string. securityCode_not_contains: String # All values starting with the given string. securityCode_starts_with: String # All values not starting with the given string. securityCode_not_starts_with: String # All values ending with the given string. securityCode_ends_with: String # All values not ending with the given string. 
securityCode_not_ends_with: String
  paymentAccount: PaymentAccountFilter
}

enum CreditCardInformationOrderBy {
  cardNumber_ASC
  cardNumber_DESC
  country_ASC
  country_DESC
  createdAt_ASC
  createdAt_DESC
  expiresOnMonth_ASC
  expiresOnMonth_DESC
  expiresOnYear_ASC
  expiresOnYear_DESC
  firstName_ASC
  firstName_DESC
  id_ASC
  id_DESC
  lastName_ASC
  lastName_DESC
  postalCode_ASC
  postalCode_DESC
  securityCode_ASC
  securityCode_DESC
}

input CreditCardInformationpaymentAccountPaymentAccount {
  type: PAYMENT_PROVIDER
  paypalId: ID
  paypal: PaymentAccountpaypalPaypalInformation
  userId: ID
  user: PaymentAccountuserUser
  paymentsIds: [ID!]
  payments: [PaymentAccountpaymentsPayment!]
}

type CreditCardInformationPreviousValues {
  cardNumber: String!
  country: String!
  createdAt: DateTime!
  expiresOnMonth: Int!
  expiresOnYear: Int!
  firstName: String!
  id: ID!
  lastName: String!
  postalCode: String!
  securityCode: String!
}

input CreditCardInformationSubscriptionFilter {
  # Logical AND on all given filters.
  AND: [CreditCardInformationSubscriptionFilter!]

  # Logical OR on all given filters.
  OR: [CreditCardInformationSubscriptionFilter!]

  # The subscription event gets dispatched when it's listed in mutation_in
  mutation_in: [_ModelMutationType!]

  # The subscription event gets only dispatched when one of the updated fields names is included in this list
  updatedFields_contains: String

  # The subscription event gets only dispatched when all of the field names included in this list have been updated
  updatedFields_contains_every: [String!]

  # The subscription event gets only dispatched when some of the field names included in this list have been updated
  updatedFields_contains_some: [String!]
  node: CreditCardInformationSubscriptionFilterNode
}

input CreditCardInformationSubscriptionFilterNode {
  cardNumber: String

  # All values that are not equal to given value.
  cardNumber_not: String

  # All values that are contained in given list.
  cardNumber_in: [String!]

  # All values that are not contained in given list.
  cardNumber_not_in: [String!]
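# Example (hypothetical): `AND`/`OR` wrap any of the scalar operators above, and
# the CreditCardInformationOrderBy values sort results. Assuming a
# Graphcool-style `allCreditCardInformations` query (not shown in this excerpt):
#
#   query {
#     allCreditCardInformations(
#       filter: {
#         OR: [{ country: "US" }, { country: "CA" }]
#         expiresOnYear_gte: 2025
#       }
#       orderBy: lastName_ASC
#     ) {
#       cardNumber
#       country
#     }
#   }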
# All values less than the given value. cardNumber_lt: String # All values less than or equal the given value. cardNumber_lte: String # All values greater than the given value. cardNumber_gt: String # All values greater than or equal the given value. cardNumber_gte: String # All values containing the given string. cardNumber_contains: String # All values not containing the given string. cardNumber_not_contains: String # All values starting with the given string. cardNumber_starts_with: String # All values not starting with the given string. cardNumber_not_starts_with: String # All values ending with the given string. cardNumber_ends_with: String # All values not ending with the given string. cardNumber_not_ends_with: String country: String # All values that are not equal to given value. country_not: String # All values that are contained in given list. country_in: [String!] # All values that are not contained in given list. country_not_in: [String!] # All values less than the given value. country_lt: String # All values less than or equal the given value. country_lte: String # All values greater than the given value. country_gt: String # All values greater than or equal the given value. country_gte: String # All values containing the given string. country_contains: String # All values not containing the given string. country_not_contains: String # All values starting with the given string. country_starts_with: String # All values not starting with the given string. country_not_starts_with: String # All values ending with the given string. country_ends_with: String # All values not ending with the given string. country_not_ends_with: String createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. 
createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime expiresOnMonth: Int # All values that are not equal to given value. expiresOnMonth_not: Int # All values that are contained in given list. expiresOnMonth_in: [Int!] # All values that are not contained in given list. expiresOnMonth_not_in: [Int!] # All values less than the given value. expiresOnMonth_lt: Int # All values less than or equal the given value. expiresOnMonth_lte: Int # All values greater than the given value. expiresOnMonth_gt: Int # All values greater than or equal the given value. expiresOnMonth_gte: Int expiresOnYear: Int # All values that are not equal to given value. expiresOnYear_not: Int # All values that are contained in given list. expiresOnYear_in: [Int!] # All values that are not contained in given list. expiresOnYear_not_in: [Int!] # All values less than the given value. expiresOnYear_lt: Int # All values less than or equal the given value. expiresOnYear_lte: Int # All values greater than the given value. expiresOnYear_gt: Int # All values greater than or equal the given value. expiresOnYear_gte: Int firstName: String # All values that are not equal to given value. firstName_not: String # All values that are contained in given list. firstName_in: [String!] # All values that are not contained in given list. firstName_not_in: [String!] # All values less than the given value. firstName_lt: String # All values less than or equal the given value. firstName_lte: String # All values greater than the given value. firstName_gt: String # All values greater than or equal the given value. firstName_gte: String # All values containing the given string. firstName_contains: String # All values not containing the given string. firstName_not_contains: String # All values starting with the given string. 
firstName_starts_with: String # All values not starting with the given string. firstName_not_starts_with: String # All values ending with the given string. firstName_ends_with: String # All values not ending with the given string. firstName_not_ends_with: String id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID lastName: String # All values that are not equal to given value. lastName_not: String # All values that are contained in given list. lastName_in: [String!] # All values that are not contained in given list. lastName_not_in: [String!] # All values less than the given value. lastName_lt: String # All values less than or equal the given value. lastName_lte: String # All values greater than the given value. lastName_gt: String # All values greater than or equal the given value. lastName_gte: String # All values containing the given string. lastName_contains: String # All values not containing the given string. lastName_not_contains: String # All values starting with the given string. lastName_starts_with: String # All values not starting with the given string. lastName_not_starts_with: String # All values ending with the given string. lastName_ends_with: String # All values not ending with the given string. 
lastName_not_ends_with: String postalCode: String # All values that are not equal to given value. postalCode_not: String # All values that are contained in given list. postalCode_in: [String!] # All values that are not contained in given list. postalCode_not_in: [String!] # All values less than the given value. postalCode_lt: String # All values less than or equal the given value. postalCode_lte: String # All values greater than the given value. postalCode_gt: String # All values greater than or equal the given value. postalCode_gte: String # All values containing the given string. postalCode_contains: String # All values not containing the given string. postalCode_not_contains: String # All values starting with the given string. postalCode_starts_with: String # All values not starting with the given string. postalCode_not_starts_with: String # All values ending with the given string. postalCode_ends_with: String # All values not ending with the given string. postalCode_not_ends_with: String securityCode: String # All values that are not equal to given value. securityCode_not: String # All values that are contained in given list. securityCode_in: [String!] # All values that are not contained in given list. securityCode_not_in: [String!] # All values less than the given value. securityCode_lt: String # All values less than or equal the given value. securityCode_lte: String # All values greater than the given value. securityCode_gt: String # All values greater than or equal the given value. securityCode_gte: String # All values containing the given string. securityCode_contains: String # All values not containing the given string. securityCode_not_contains: String # All values starting with the given string. securityCode_starts_with: String # All values not starting with the given string. securityCode_not_starts_with: String # All values ending with the given string. securityCode_ends_with: String # All values not ending with the given string. 
securityCode_not_ends_with: String paymentAccount: PaymentAccountFilter } type CreditCardInformationSubscriptionPayload { mutation: _ModelMutationType! node: CreditCardInformation updatedFields: [String!] previousValues: CreditCardInformationPreviousValues } enum CURRENCY { CAD CHF EUR JPY USD ZAR } scalar DateTime type Experience implements Node { category(filter: ExperienceCategoryFilter): ExperienceCategory host(filter: UserFilter): User! id: ID! location(filter: LocationFilter): Location! popularity: Int! preview(filter: PictureFilter): Picture! pricePerPerson: Int! reviews( filter: ReviewFilter orderBy: ReviewOrderBy skip: Int after: String before: String first: Int last: Int ): [Review!] title: String! # Meta information about the query. _reviewsMeta( filter: ReviewFilter orderBy: ReviewOrderBy skip: Int after: String before: String first: Int last: Int ): _QueryMeta! } type ExperienceCategory implements Node { experience(filter: ExperienceFilter): Experience id: ID! mainColor: String! name: String! } input ExperiencecategoryExperienceCategory { mainColor: String name: String! } input ExperienceCategoryexperienceExperience { popularity: Int! pricePerPerson: Int! title: String! hostId: ID host: ExperiencehostUser locationId: ID location: ExperiencelocationLocation previewId: ID preview: ExperiencepreviewPicture reviewsIds: [ID!] reviews: [ExperiencereviewsReview!] } input ExperienceCategoryFilter { # Logical AND on all given filters. AND: [ExperienceCategoryFilter!] # Logical OR on all given filters. OR: [ExperienceCategoryFilter!] id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. 
id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID mainColor: String # All values that are not equal to given value. mainColor_not: String # All values that are contained in given list. mainColor_in: [String!] # All values that are not contained in given list. mainColor_not_in: [String!] # All values less than the given value. mainColor_lt: String # All values less than or equal the given value. mainColor_lte: String # All values greater than the given value. mainColor_gt: String # All values greater than or equal the given value. mainColor_gte: String # All values containing the given string. mainColor_contains: String # All values not containing the given string. mainColor_not_contains: String # All values starting with the given string. mainColor_starts_with: String # All values not starting with the given string. mainColor_not_starts_with: String # All values ending with the given string. mainColor_ends_with: String # All values not ending with the given string. mainColor_not_ends_with: String name: String # All values that are not equal to given value. name_not: String # All values that are contained in given list. name_in: [String!] # All values that are not contained in given list. name_not_in: [String!] # All values less than the given value. name_lt: String # All values less than or equal the given value. name_lte: String # All values greater than the given value. name_gt: String # All values greater than or equal the given value. name_gte: String # All values containing the given string. name_contains: String # All values not containing the given string. 
name_not_contains: String # All values starting with the given string. name_starts_with: String # All values not starting with the given string. name_not_starts_with: String # All values ending with the given string. name_ends_with: String # All values not ending with the given string. name_not_ends_with: String experience: ExperienceFilter } enum ExperienceCategoryOrderBy { id_ASC id_DESC mainColor_ASC mainColor_DESC name_ASC name_DESC } type ExperienceCategoryPreviousValues { id: ID! mainColor: String! name: String! } input ExperienceCategorySubscriptionFilter { # Logical AND on all given filters. AND: [ExperienceCategorySubscriptionFilter!] # Logical OR on all given filters. OR: [ExperienceCategorySubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] # The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: ExperienceCategorySubscriptionFilterNode } input ExperienceCategorySubscriptionFilterNode { id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. 
id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID mainColor: String # All values that are not equal to given value. mainColor_not: String # All values that are contained in given list. mainColor_in: [String!] # All values that are not contained in given list. mainColor_not_in: [String!] # All values less than the given value. mainColor_lt: String # All values less than or equal the given value. mainColor_lte: String # All values greater than the given value. mainColor_gt: String # All values greater than or equal the given value. mainColor_gte: String # All values containing the given string. mainColor_contains: String # All values not containing the given string. mainColor_not_contains: String # All values starting with the given string. mainColor_starts_with: String # All values not starting with the given string. mainColor_not_starts_with: String # All values ending with the given string. mainColor_ends_with: String # All values not ending with the given string. mainColor_not_ends_with: String name: String # All values that are not equal to given value. name_not: String # All values that are contained in given list. name_in: [String!] # All values that are not contained in given list. name_not_in: [String!] # All values less than the given value. name_lt: String # All values less than or equal the given value. name_lte: String # All values greater than the given value. name_gt: String # All values greater than or equal the given value. name_gte: String # All values containing the given string. name_contains: String # All values not containing the given string. name_not_contains: String # All values starting with the given string. name_starts_with: String # All values not starting with the given string. name_not_starts_with: String # All values ending with the given string. 
name_ends_with: String # All values not ending with the given string. name_not_ends_with: String experience: ExperienceFilter } type ExperienceCategorySubscriptionPayload { mutation: _ModelMutationType! node: ExperienceCategory updatedFields: [String!] previousValues: ExperienceCategoryPreviousValues } input ExperienceFilter { # Logical AND on all given filters. AND: [ExperienceFilter!] # Logical OR on all given filters. OR: [ExperienceFilter!] id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID popularity: Int # All values that are not equal to given value. popularity_not: Int # All values that are contained in given list. popularity_in: [Int!] # All values that are not contained in given list. popularity_not_in: [Int!] # All values less than the given value. popularity_lt: Int # All values less than or equal the given value. popularity_lte: Int # All values greater than the given value. popularity_gt: Int # All values greater than or equal the given value. popularity_gte: Int pricePerPerson: Int # All values that are not equal to given value. pricePerPerson_not: Int # All values that are contained in given list. pricePerPerson_in: [Int!] # All values that are not contained in given list. pricePerPerson_not_in: [Int!] 
# All values less than the given value. pricePerPerson_lt: Int # All values less than or equal the given value. pricePerPerson_lte: Int # All values greater than the given value. pricePerPerson_gt: Int # All values greater than or equal the given value. pricePerPerson_gte: Int title: String # All values that are not equal to given value. title_not: String # All values that are contained in given list. title_in: [String!] # All values that are not contained in given list. title_not_in: [String!] # All values less than the given value. title_lt: String # All values less than or equal the given value. title_lte: String # All values greater than the given value. title_gt: String # All values greater than or equal the given value. title_gte: String # All values containing the given string. title_contains: String # All values not containing the given string. title_not_contains: String # All values starting with the given string. title_starts_with: String # All values not starting with the given string. title_not_starts_with: String # All values ending with the given string. title_ends_with: String # All values not ending with the given string. title_not_ends_with: String category: ExperienceCategoryFilter host: UserFilter location: LocationFilter preview: PictureFilter reviews_every: ReviewFilter reviews_some: ReviewFilter reviews_none: ReviewFilter } input ExperiencehostUser { email: String! firstName: String! isSuperHost: Boolean lastName: String! password: String! phone: String! responseRate: Float responseTime: Int locationId: ID location: UserlocationLocation profilePictureId: ID profilePicture: UserprofilePicturePicture bookingsIds: [ID!] bookings: [UserbookingsBooking!] hostingExperiencesIds: [ID!] hostingExperiences: [UserhostingExperiencesExperience!] notificationsIds: [ID!] notifications: [UsernotificationsNotification!] ownedPlacesIds: [ID!] ownedPlaces: [UserownedPlacesPlace!] paymentAccountIds: [ID!] paymentAccount: [UserpaymentAccountPaymentAccount!] 
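# Example (hypothetical): relation filters such as `reviews_some`,
# `reviews_every`, and `reviews_none` in ExperienceFilter above match a record
# by conditions on its related records. Assuming a Graphcool-style
# `allExperiences` query (not shown in this excerpt), and that ReviewFilter
# follows the same generated Int-operator pattern (`stars_gte` is assumed):
#
#   query {
#     allExperiences(
#       filter: {
#         popularity_gte: 50
#         reviews_some: { stars_gte: 4 }
#       }
#       orderBy: popularity_DESC
#       first: 10
#     ) {
#       title
#       pricePerPerson
#     }
#   }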
receivedMessagesIds: [ID!] receivedMessages: [UserreceivedMessagesMessage!] sentMessagesIds: [ID!] sentMessages: [UsersentMessagesMessage!] } input ExperiencelocationLocation { address: String directions: String lat: Float! lng: Float! neighbourHoodId: ID neighbourHood: LocationneighbourHoodNeighbourhood placeId: ID place: LocationplacePlace restaurantId: ID restaurant: LocationrestaurantRestaurant userId: ID user: LocationuserUser } enum ExperienceOrderBy { id_ASC id_DESC popularity_ASC popularity_DESC pricePerPerson_ASC pricePerPerson_DESC title_ASC title_DESC } input ExperiencepreviewPicture { url: String! neighbourHoodId: ID neighbourHood: PictureneighbourHoodNeighbourhood placeId: ID place: PictureplacePlace reservationId: ID reservation: PicturereservationRestaurant userId: ID user: PictureuserUser } type ExperiencePreviousValues { id: ID! popularity: Int! pricePerPerson: Int! title: String! } input ExperiencereviewsReview { accuracy: Int! checkIn: Int! cleanliness: Int! communication: Int! location: Int! stars: Int! text: String! value: Int! placeId: ID place: ReviewplacePlace } input ExperienceSubscriptionFilter { # Logical AND on all given filters. AND: [ExperienceSubscriptionFilter!] # Logical OR on all given filters. OR: [ExperienceSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] # The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: ExperienceSubscriptionFilterNode } input ExperienceSubscriptionFilterNode { id: ID # All values that are not equal to given value. 
id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID popularity: Int # All values that are not equal to given value. popularity_not: Int # All values that are contained in given list. popularity_in: [Int!] # All values that are not contained in given list. popularity_not_in: [Int!] # All values less than the given value. popularity_lt: Int # All values less than or equal the given value. popularity_lte: Int # All values greater than the given value. popularity_gt: Int # All values greater than or equal the given value. popularity_gte: Int pricePerPerson: Int # All values that are not equal to given value. pricePerPerson_not: Int # All values that are contained in given list. pricePerPerson_in: [Int!] # All values that are not contained in given list. pricePerPerson_not_in: [Int!] # All values less than the given value. pricePerPerson_lt: Int # All values less than or equal the given value. pricePerPerson_lte: Int # All values greater than the given value. pricePerPerson_gt: Int # All values greater than or equal the given value. pricePerPerson_gte: Int title: String # All values that are not equal to given value. title_not: String # All values that are contained in given list. title_in: [String!] # All values that are not contained in given list. title_not_in: [String!] 
# All values less than the given value. title_lt: String # All values less than or equal the given value. title_lte: String # All values greater than the given value. title_gt: String # All values greater than or equal the given value. title_gte: String # All values containing the given string. title_contains: String # All values not containing the given string. title_not_contains: String # All values starting with the given string. title_starts_with: String # All values not starting with the given string. title_not_starts_with: String # All values ending with the given string. title_ends_with: String # All values not ending with the given string. title_not_ends_with: String category: ExperienceCategoryFilter host: UserFilter location: LocationFilter preview: PictureFilter reviews_every: ReviewFilter reviews_some: ReviewFilter reviews_none: ReviewFilter } type ExperienceSubscriptionPayload { mutation: _ModelMutationType! node: Experience updatedFields: [String!] previousValues: ExperiencePreviousValues } type GuestRequirements implements Node { govIssuedId: Boolean! guestTripInformation: Boolean! id: ID! place(filter: PlaceFilter): Place! recommendationsFromOtherHosts: Boolean! } input GuestRequirementsFilter { # Logical AND on all given filters. AND: [GuestRequirementsFilter!] # Logical OR on all given filters. OR: [GuestRequirementsFilter!] govIssuedId: Boolean # All values that are not equal to given value. govIssuedId_not: Boolean guestTripInformation: Boolean # All values that are not equal to given value. guestTripInformation_not: Boolean id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. 
id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID recommendationsFromOtherHosts: Boolean # All values that are not equal to given value. recommendationsFromOtherHosts_not: Boolean place: PlaceFilter } enum GuestRequirementsOrderBy { govIssuedId_ASC govIssuedId_DESC guestTripInformation_ASC guestTripInformation_DESC id_ASC id_DESC recommendationsFromOtherHosts_ASC recommendationsFromOtherHosts_DESC } input GuestRequirementsplacePlace { description: String! maxGuests: Int! name: String numBaths: Int! numBedrooms: Int! numBeds: Int! popularity: Int! shortDescription: String! size: PLACE_SIZES slug: String! amenitiesId: ID amenities: PlaceamenitiesAmenities hostId: ID host: PlacehostUser houseRulesId: ID houseRules: PlacehouseRulesHouseRules locationId: ID location: PlacelocationLocation policiesId: ID policies: PlacepoliciesPolicies pricingId: ID pricing: PlacepricingPricing viewsId: ID views: PlaceviewsPlaceViews bookingsIds: [ID!] bookings: [PlacebookingsBooking!] picturesIds: [ID!] pictures: [PlacepicturesPicture!] reviewsIds: [ID!] reviews: [PlacereviewsReview!] } type GuestRequirementsPreviousValues { govIssuedId: Boolean! guestTripInformation: Boolean! id: ID! recommendationsFromOtherHosts: Boolean! } input GuestRequirementsSubscriptionFilter { # Logical AND on all given filters. AND: [GuestRequirementsSubscriptionFilter!] # Logical OR on all given filters. OR: [GuestRequirementsSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] 
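# Example (illustrative, not part of the generated schema): subscribing to
# GuestRequirements mutations via the filter fields defined above. The root
# subscription field name and payload shape are assumed from typical
# Graphcool conventions.
#
# subscription {
#   GuestRequirements(filter: { mutation_in: [CREATED, UPDATED] }) {
#     mutation
#     node { id govIssuedId }
#     updatedFields
#   }
# }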
# The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: GuestRequirementsSubscriptionFilterNode } input GuestRequirementsSubscriptionFilterNode { govIssuedId: Boolean # All values that are not equal to given value. govIssuedId_not: Boolean guestTripInformation: Boolean # All values that are not equal to given value. guestTripInformation_not: Boolean id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID recommendationsFromOtherHosts: Boolean # All values that are not equal to given value. recommendationsFromOtherHosts_not: Boolean place: PlaceFilter } type GuestRequirementsSubscriptionPayload { mutation: _ModelMutationType! node: GuestRequirements updatedFields: [String!] previousValues: GuestRequirementsPreviousValues } type HouseRules implements Node { additionalRules: String createdAt: DateTime! id: ID! 
partiesAndEventsAllowed: Boolean petsAllowed: Boolean place(filter: PlaceFilter): Place! smokingAllowed: Boolean suitableForChildren: Boolean suitableForInfants: Boolean updatedAt: DateTime! } input HouseRulesFilter { # Logical AND on all given filters. AND: [HouseRulesFilter!] # Logical OR on all given filters. OR: [HouseRulesFilter!] additionalRules: String # All values that are not equal to given value. additionalRules_not: String # All values that are contained in given list. additionalRules_in: [String!] # All values that are not contained in given list. additionalRules_not_in: [String!] # All values less than the given value. additionalRules_lt: String # All values less than or equal the given value. additionalRules_lte: String # All values greater than the given value. additionalRules_gt: String # All values greater than or equal the given value. additionalRules_gte: String # All values containing the given string. additionalRules_contains: String # All values not containing the given string. additionalRules_not_contains: String # All values starting with the given string. additionalRules_starts_with: String # All values not starting with the given string. additionalRules_not_starts_with: String # All values ending with the given string. additionalRules_ends_with: String # All values not ending with the given string. additionalRules_not_ends_with: String createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime id: ID # All values that are not equal to given value. 
id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID partiesAndEventsAllowed: Boolean # All values that are not equal to given value. partiesAndEventsAllowed_not: Boolean petsAllowed: Boolean # All values that are not equal to given value. petsAllowed_not: Boolean smokingAllowed: Boolean # All values that are not equal to given value. smokingAllowed_not: Boolean suitableForChildren: Boolean # All values that are not equal to given value. suitableForChildren_not: Boolean suitableForInfants: Boolean # All values that are not equal to given value. suitableForInfants_not: Boolean updatedAt: DateTime # All values that are not equal to given value. updatedAt_not: DateTime # All values that are contained in given list. updatedAt_in: [DateTime!] # All values that are not contained in given list. updatedAt_not_in: [DateTime!] # All values less than the given value. updatedAt_lt: DateTime # All values less than or equal the given value. updatedAt_lte: DateTime # All values greater than the given value. updatedAt_gt: DateTime # All values greater than or equal the given value. 
updatedAt_gte: DateTime place: PlaceFilter } enum HouseRulesOrderBy { additionalRules_ASC additionalRules_DESC createdAt_ASC createdAt_DESC id_ASC id_DESC partiesAndEventsAllowed_ASC partiesAndEventsAllowed_DESC petsAllowed_ASC petsAllowed_DESC smokingAllowed_ASC smokingAllowed_DESC suitableForChildren_ASC suitableForChildren_DESC suitableForInfants_ASC suitableForInfants_DESC updatedAt_ASC updatedAt_DESC } input HouseRulesplacePlace { description: String! maxGuests: Int! name: String numBaths: Int! numBedrooms: Int! numBeds: Int! popularity: Int! shortDescription: String! size: PLACE_SIZES slug: String! amenitiesId: ID amenities: PlaceamenitiesAmenities guestRequirementsId: ID guestRequirements: PlaceguestRequirementsGuestRequirements hostId: ID host: PlacehostUser locationId: ID location: PlacelocationLocation policiesId: ID policies: PlacepoliciesPolicies pricingId: ID pricing: PlacepricingPricing viewsId: ID views: PlaceviewsPlaceViews bookingsIds: [ID!] bookings: [PlacebookingsBooking!] picturesIds: [ID!] pictures: [PlacepicturesPicture!] reviewsIds: [ID!] reviews: [PlacereviewsReview!] } type HouseRulesPreviousValues { additionalRules: String createdAt: DateTime! id: ID! partiesAndEventsAllowed: Boolean petsAllowed: Boolean smokingAllowed: Boolean suitableForChildren: Boolean suitableForInfants: Boolean updatedAt: DateTime! } input HouseRulesSubscriptionFilter { # Logical AND on all given filters. AND: [HouseRulesSubscriptionFilter!] # Logical OR on all given filters. OR: [HouseRulesSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] # The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] 
# The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: HouseRulesSubscriptionFilterNode } input HouseRulesSubscriptionFilterNode { additionalRules: String # All values that are not equal to given value. additionalRules_not: String # All values that are contained in given list. additionalRules_in: [String!] # All values that are not contained in given list. additionalRules_not_in: [String!] # All values less than the given value. additionalRules_lt: String # All values less than or equal the given value. additionalRules_lte: String # All values greater than the given value. additionalRules_gt: String # All values greater than or equal the given value. additionalRules_gte: String # All values containing the given string. additionalRules_contains: String # All values not containing the given string. additionalRules_not_contains: String # All values starting with the given string. additionalRules_starts_with: String # All values not starting with the given string. additionalRules_not_starts_with: String # All values ending with the given string. additionalRules_ends_with: String # All values not ending with the given string. additionalRules_not_ends_with: String createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] 
# All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID partiesAndEventsAllowed: Boolean # All values that are not equal to given value. partiesAndEventsAllowed_not: Boolean petsAllowed: Boolean # All values that are not equal to given value. petsAllowed_not: Boolean smokingAllowed: Boolean # All values that are not equal to given value. smokingAllowed_not: Boolean suitableForChildren: Boolean # All values that are not equal to given value. suitableForChildren_not: Boolean suitableForInfants: Boolean # All values that are not equal to given value. suitableForInfants_not: Boolean updatedAt: DateTime # All values that are not equal to given value. updatedAt_not: DateTime # All values that are contained in given list. updatedAt_in: [DateTime!] # All values that are not contained in given list. updatedAt_not_in: [DateTime!] # All values less than the given value. updatedAt_lt: DateTime # All values less than or equal the given value. updatedAt_lte: DateTime # All values greater than the given value. updatedAt_gt: DateTime # All values greater than or equal the given value. updatedAt_gte: DateTime place: PlaceFilter } type HouseRulesSubscriptionPayload { mutation: _ModelMutationType! node: HouseRules updatedFields: [String!] previousValues: HouseRulesPreviousValues } type Location implements Node { address: String directions: String experience(filter: ExperienceFilter): Experience id: ID! lat: Float! lng: Float! 
neighbourHood(filter: NeighbourhoodFilter): Neighbourhood place(filter: PlaceFilter): Place restaurant(filter: RestaurantFilter): Restaurant user(filter: UserFilter): User } input LocationexperienceExperience { popularity: Int! pricePerPerson: Int! title: String! categoryId: ID category: ExperiencecategoryExperienceCategory hostId: ID host: ExperiencehostUser previewId: ID preview: ExperiencepreviewPicture reviewsIds: [ID!] reviews: [ExperiencereviewsReview!] } input LocationFilter { # Logical AND on all given filters. AND: [LocationFilter!] # Logical OR on all given filters. OR: [LocationFilter!] address: String # All values that are not equal to given value. address_not: String # All values that are contained in given list. address_in: [String!] # All values that are not contained in given list. address_not_in: [String!] # All values less than the given value. address_lt: String # All values less than or equal the given value. address_lte: String # All values greater than the given value. address_gt: String # All values greater than or equal the given value. address_gte: String # All values containing the given string. address_contains: String # All values not containing the given string. address_not_contains: String # All values starting with the given string. address_starts_with: String # All values not starting with the given string. address_not_starts_with: String # All values ending with the given string. address_ends_with: String # All values not ending with the given string. address_not_ends_with: String directions: String # All values that are not equal to given value. directions_not: String # All values that are contained in given list. directions_in: [String!] # All values that are not contained in given list. directions_not_in: [String!] # All values less than the given value. directions_lt: String # All values less than or equal the given value. directions_lte: String # All values greater than the given value. 
directions_gt: String # All values greater than or equal the given value. directions_gte: String # All values containing the given string. directions_contains: String # All values not containing the given string. directions_not_contains: String # All values starting with the given string. directions_starts_with: String # All values not starting with the given string. directions_not_starts_with: String # All values ending with the given string. directions_ends_with: String # All values not ending with the given string. directions_not_ends_with: String id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID lat: Float # All values that are not equal to given value. lat_not: Float # All values that are contained in given list. lat_in: [Float!] # All values that are not contained in given list. lat_not_in: [Float!] # All values less than the given value. lat_lt: Float # All values less than or equal the given value. lat_lte: Float # All values greater than the given value. lat_gt: Float # All values greater than or equal the given value. lat_gte: Float lng: Float # All values that are not equal to given value. lng_not: Float # All values that are contained in given list. lng_in: [Float!] 
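# Example (illustrative): combining the generated range filters above to
# select locations inside a bounding box. The `allLocations` root query
# field is an assumption based on typical Graphcool naming conventions.
#
# query {
#   allLocations(filter: {
#     lat_gte: 52.3, lat_lte: 52.6,
#     lng_gte: 13.2, lng_lte: 13.6
#   }) {
#     id
#     address
#   }
# }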
# All values that are not contained in given list. lng_not_in: [Float!] # All values less than the given value. lng_lt: Float # All values less than or equal the given value. lng_lte: Float # All values greater than the given value. lng_gt: Float # All values greater than or equal the given value. lng_gte: Float experience: ExperienceFilter neighbourHood: NeighbourhoodFilter place: PlaceFilter restaurant: RestaurantFilter user: UserFilter } input LocationneighbourHoodNeighbourhood { featured: Boolean! name: String! popularity: Int! slug: String! cityId: ID city: NeighbourhoodcityCity homePreviewId: ID homePreview: NeighbourhoodhomePreviewPicture locationsIds: [ID!] locations: [NeighbourhoodlocationsLocation!] } enum LocationOrderBy { address_ASC address_DESC directions_ASC directions_DESC id_ASC id_DESC lat_ASC lat_DESC lng_ASC lng_DESC } input LocationplacePlace { description: String! maxGuests: Int! name: String numBaths: Int! numBedrooms: Int! numBeds: Int! popularity: Int! shortDescription: String! size: PLACE_SIZES slug: String! amenitiesId: ID amenities: PlaceamenitiesAmenities guestRequirementsId: ID guestRequirements: PlaceguestRequirementsGuestRequirements hostId: ID host: PlacehostUser houseRulesId: ID houseRules: PlacehouseRulesHouseRules policiesId: ID policies: PlacepoliciesPolicies pricingId: ID pricing: PlacepricingPricing viewsId: ID views: PlaceviewsPlaceViews bookingsIds: [ID!] bookings: [PlacebookingsBooking!] picturesIds: [ID!] pictures: [PlacepicturesPicture!] reviewsIds: [ID!] reviews: [PlacereviewsReview!] } type LocationPreviousValues { address: String directions: String id: ID! lat: Float! lng: Float! } input LocationrestaurantRestaurant { avgPricePerPerson: Int! isCurated: Boolean popularity: Int! slug: String! title: String! picturesIds: [ID!] pictures: [RestaurantpicturesPicture!] } input LocationSubscriptionFilter { # Logical AND on all given filters. AND: [LocationSubscriptionFilter!] # Logical OR on all given filters. 
OR: [LocationSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] # The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: LocationSubscriptionFilterNode } input LocationSubscriptionFilterNode { address: String # All values that are not equal to given value. address_not: String # All values that are contained in given list. address_in: [String!] # All values that are not contained in given list. address_not_in: [String!] # All values less than the given value. address_lt: String # All values less than or equal the given value. address_lte: String # All values greater than the given value. address_gt: String # All values greater than or equal the given value. address_gte: String # All values containing the given string. address_contains: String # All values not containing the given string. address_not_contains: String # All values starting with the given string. address_starts_with: String # All values not starting with the given string. address_not_starts_with: String # All values ending with the given string. address_ends_with: String # All values not ending with the given string. address_not_ends_with: String directions: String # All values that are not equal to given value. directions_not: String # All values that are contained in given list. directions_in: [String!] # All values that are not contained in given list. directions_not_in: [String!] # All values less than the given value. directions_lt: String # All values less than or equal the given value. 
directions_lte: String # All values greater than the given value. directions_gt: String # All values greater than or equal the given value. directions_gte: String # All values containing the given string. directions_contains: String # All values not containing the given string. directions_not_contains: String # All values starting with the given string. directions_starts_with: String # All values not starting with the given string. directions_not_starts_with: String # All values ending with the given string. directions_ends_with: String # All values not ending with the given string. directions_not_ends_with: String id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID lat: Float # All values that are not equal to given value. lat_not: Float # All values that are contained in given list. lat_in: [Float!] # All values that are not contained in given list. lat_not_in: [Float!] # All values less than the given value. lat_lt: Float # All values less than or equal the given value. lat_lte: Float # All values greater than the given value. lat_gt: Float # All values greater than or equal the given value. lat_gte: Float lng: Float # All values that are not equal to given value. 
lng_not: Float # All values that are contained in given list. lng_in: [Float!] # All values that are not contained in given list. lng_not_in: [Float!] # All values less than the given value. lng_lt: Float # All values less than or equal the given value. lng_lte: Float # All values greater than the given value. lng_gt: Float # All values greater than or equal the given value. lng_gte: Float experience: ExperienceFilter neighbourHood: NeighbourhoodFilter place: PlaceFilter restaurant: RestaurantFilter user: UserFilter } type LocationSubscriptionPayload { mutation: _ModelMutationType! node: Location updatedFields: [String!] previousValues: LocationPreviousValues } input LocationuserUser { email: String! firstName: String! isSuperHost: Boolean lastName: String! password: String! phone: String! responseRate: Float responseTime: Int profilePictureId: ID profilePicture: UserprofilePicturePicture bookingsIds: [ID!] bookings: [UserbookingsBooking!] hostingExperiencesIds: [ID!] hostingExperiences: [UserhostingExperiencesExperience!] notificationsIds: [ID!] notifications: [UsernotificationsNotification!] ownedPlacesIds: [ID!] ownedPlaces: [UserownedPlacesPlace!] paymentAccountIds: [ID!] paymentAccount: [UserpaymentAccountPaymentAccount!] receivedMessagesIds: [ID!] receivedMessages: [UserreceivedMessagesMessage!] sentMessagesIds: [ID!] sentMessages: [UsersentMessagesMessage!] } type Message implements Node { createdAt: DateTime! deliveredAt: DateTime! from(filter: UserFilter): User! id: ID! readAt: DateTime! to(filter: UserFilter): User! } input MessageFilter { # Logical AND on all given filters. AND: [MessageFilter!] # Logical OR on all given filters. OR: [MessageFilter!] createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. 
createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime deliveredAt: DateTime # All values that are not equal to given value. deliveredAt_not: DateTime # All values that are contained in given list. deliveredAt_in: [DateTime!] # All values that are not contained in given list. deliveredAt_not_in: [DateTime!] # All values less than the given value. deliveredAt_lt: DateTime # All values less than or equal the given value. deliveredAt_lte: DateTime # All values greater than the given value. deliveredAt_gt: DateTime # All values greater than or equal the given value. deliveredAt_gte: DateTime id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID readAt: DateTime # All values that are not equal to given value. readAt_not: DateTime # All values that are contained in given list. readAt_in: [DateTime!] # All values that are not contained in given list. readAt_not_in: [DateTime!] # All values less than the given value. readAt_lt: DateTime # All values less than or equal the given value. readAt_lte: DateTime # All values greater than the given value. 
  readAt_gt: DateTime
  # All values greater than or equal the given value.
  readAt_gte: DateTime
  from: UserFilter
  to: UserFilter
}

input MessagefromUser {
  email: String!
  firstName: String!
  isSuperHost: Boolean
  lastName: String!
  password: String!
  phone: String!
  responseRate: Float
  responseTime: Int
  locationId: ID
  location: UserlocationLocation
  profilePictureId: ID
  profilePicture: UserprofilePicturePicture
  bookingsIds: [ID!]
  bookings: [UserbookingsBooking!]
  hostingExperiencesIds: [ID!]
  hostingExperiences: [UserhostingExperiencesExperience!]
  notificationsIds: [ID!]
  notifications: [UsernotificationsNotification!]
  ownedPlacesIds: [ID!]
  ownedPlaces: [UserownedPlacesPlace!]
  paymentAccountIds: [ID!]
  paymentAccount: [UserpaymentAccountPaymentAccount!]
  receivedMessagesIds: [ID!]
  receivedMessages: [UserreceivedMessagesMessage!]
  sentMessagesIds: [ID!]
  sentMessages: [UsersentMessagesMessage!]
}

enum MessageOrderBy {
  createdAt_ASC
  createdAt_DESC
  deliveredAt_ASC
  deliveredAt_DESC
  id_ASC
  id_DESC
  readAt_ASC
  readAt_DESC
}

type MessagePreviousValues {
  createdAt: DateTime!
  deliveredAt: DateTime!
  id: ID!
  readAt: DateTime!
}

input MessageSubscriptionFilter {
  # Logical AND on all given filters.
  AND: [MessageSubscriptionFilter!]
  # Logical OR on all given filters.
  OR: [MessageSubscriptionFilter!]
  # The subscription event gets dispatched when it's listed in mutation_in
  mutation_in: [_ModelMutationType!]
  # The subscription event gets only dispatched when one of the updated fields names is included in this list
  updatedFields_contains: String
  # The subscription event gets only dispatched when all of the field names included in this list have been updated
  updatedFields_contains_every: [String!]
  # The subscription event gets only dispatched when some of the field names included in this list have been updated
  updatedFields_contains_some: [String!]
  node: MessageSubscriptionFilterNode
}

input MessageSubscriptionFilterNode {
  createdAt: DateTime
  # All values that are not equal to given value.
  createdAt_not: DateTime
  # All values that are contained in given list.
  createdAt_in: [DateTime!]
  # All values that are not contained in given list.
  createdAt_not_in: [DateTime!]
  # All values less than the given value.
  createdAt_lt: DateTime
  # All values less than or equal the given value.
  createdAt_lte: DateTime
  # All values greater than the given value.
  createdAt_gt: DateTime
  # All values greater than or equal the given value.
  createdAt_gte: DateTime
  deliveredAt: DateTime
  # All values that are not equal to given value.
  deliveredAt_not: DateTime
  # All values that are contained in given list.
  deliveredAt_in: [DateTime!]
  # All values that are not contained in given list.
  deliveredAt_not_in: [DateTime!]
  # All values less than the given value.
  deliveredAt_lt: DateTime
  # All values less than or equal the given value.
  deliveredAt_lte: DateTime
  # All values greater than the given value.
  deliveredAt_gt: DateTime
  # All values greater than or equal the given value.
  deliveredAt_gte: DateTime
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  readAt: DateTime
  # All values that are not equal to given value.
  readAt_not: DateTime
  # All values that are contained in given list.
  readAt_in: [DateTime!]
  # All values that are not contained in given list.
  readAt_not_in: [DateTime!]
  # All values less than the given value.
  readAt_lt: DateTime
  # All values less than or equal the given value.
  readAt_lte: DateTime
  # All values greater than the given value.
  readAt_gt: DateTime
  # All values greater than or equal the given value.
  readAt_gte: DateTime
  from: UserFilter
  to: UserFilter
}

type MessageSubscriptionPayload {
  mutation: _ModelMutationType!
  node: Message
  updatedFields: [String!]
  previousValues: MessagePreviousValues
}

input MessagetoUser {
  email: String!
  firstName: String!
  isSuperHost: Boolean
  lastName: String!
  password: String!
  phone: String!
  responseRate: Float
  responseTime: Int
  locationId: ID
  location: UserlocationLocation
  profilePictureId: ID
  profilePicture: UserprofilePicturePicture
  bookingsIds: [ID!]
  bookings: [UserbookingsBooking!]
  hostingExperiencesIds: [ID!]
  hostingExperiences: [UserhostingExperiencesExperience!]
  notificationsIds: [ID!]
  notifications: [UsernotificationsNotification!]
  ownedPlacesIds: [ID!]
  ownedPlaces: [UserownedPlacesPlace!]
  paymentAccountIds: [ID!]
  paymentAccount: [UserpaymentAccountPaymentAccount!]
  receivedMessagesIds: [ID!]
  receivedMessages: [UserreceivedMessagesMessage!]
  sentMessagesIds: [ID!]
  sentMessages: [UsersentMessagesMessage!]
}

type Mutation {
  createAmenities(
    airConditioning: Boolean
    babyBath: Boolean
    babyMonitor: Boolean
    babysitterRecommendations: Boolean
    bathtub: Boolean
    breakfast: Boolean
    buzzerWirelessIntercom: Boolean
    cableTv: Boolean
    changingTable: Boolean
    childrensBooksAndToys: Boolean
    childrensDinnerware: Boolean
    crib: Boolean
    doorman: Boolean
    dryer: Boolean
    elevator: Boolean
    essentials: Boolean
    familyKidFriendly: Boolean
    freeParkingOnPremises: Boolean
    freeParkingOnStreet: Boolean
    gym: Boolean
    hairDryer: Boolean
    hangers: Boolean
    heating: Boolean
    hotTub: Boolean
    indoorFireplace: Boolean
    internet: Boolean
    iron: Boolean
    kitchen: Boolean
    laptopFriendlyWorkspace: Boolean
    paidParkingOffPremises: Boolean
    petsAllowed: Boolean
    pool: Boolean
    privateEntrance: Boolean
    shampoo: Boolean
    smokingAllowed: Boolean
    suitableForEvents: Boolean
    tv: Boolean
    washer: Boolean
    wheelchairAccessible: Boolean
    wirelessInternet: Boolean
    placeId: ID
    place: AmenitiesplacePlace
  ): Amenities
  createBooking(
    endDate: DateTime!
    startDate: DateTime!
    bookeeId: ID
    bookee: BookingbookeeUser
    paymentId: ID
    payment: BookingpaymentPayment
    placeId: ID
    place: BookingplacePlace
  ): Booking
  createCity(
    name: String!
    neighbourhoodsIds: [ID!]
    neighbourhoods: [CityneighbourhoodsNeighbourhood!]
  ): City
  createCreditCardInformation(
    cardNumber: String!
    country: String!
    expiresOnMonth: Int!
    expiresOnYear: Int!
    firstName: String!
    lastName: String!
    postalCode: String!
    securityCode: String!
    paymentAccountId: ID
    paymentAccount: CreditCardInformationpaymentAccountPaymentAccount
  ): CreditCardInformation
  createExperience(
    popularity: Int!
    pricePerPerson: Int!
    title: String!
    categoryId: ID
    category: ExperiencecategoryExperienceCategory
    hostId: ID
    host: ExperiencehostUser
    locationId: ID
    location: ExperiencelocationLocation
    previewId: ID
    preview: ExperiencepreviewPicture
    reviewsIds: [ID!]
    reviews: [ExperiencereviewsReview!]
  ): Experience
  createExperienceCategory(
    mainColor: String
    name: String!
    experienceId: ID
    experience: ExperienceCategoryexperienceExperience
  ): ExperienceCategory
  createGuestRequirements(
    govIssuedId: Boolean
    guestTripInformation: Boolean
    recommendationsFromOtherHosts: Boolean
    placeId: ID
    place: GuestRequirementsplacePlace
  ): GuestRequirements
  createHouseRules(
    additionalRules: String
    partiesAndEventsAllowed: Boolean
    petsAllowed: Boolean
    smokingAllowed: Boolean
    suitableForChildren: Boolean
    suitableForInfants: Boolean
    placeId: ID
    place: HouseRulesplacePlace
  ): HouseRules
  createLocation(
    address: String
    directions: String
    lat: Float!
    lng: Float!
    experienceId: ID
    experience: LocationexperienceExperience
    neighbourHoodId: ID
    neighbourHood: LocationneighbourHoodNeighbourhood
    placeId: ID
    place: LocationplacePlace
    restaurantId: ID
    restaurant: LocationrestaurantRestaurant
    userId: ID
    user: LocationuserUser
  ): Location
  createMessage(
    deliveredAt: DateTime!
    readAt: DateTime!
    fromId: ID
    from: MessagefromUser
    toId: ID
    to: MessagetoUser
  ): Message
  createNeighbourhood(
    featured: Boolean!
    name: String!
    popularity: Int!
    slug: String!
    cityId: ID
    city: NeighbourhoodcityCity
    homePreviewId: ID
    homePreview: NeighbourhoodhomePreviewPicture
    locationsIds: [ID!]
    locations: [NeighbourhoodlocationsLocation!]
  ): Neighbourhood
  createNotification(
    link: String!
    readDate: DateTime!
    type: NOTIFICATION_TYPE
    userId: ID
    user: NotificationuserUser
  ): Notification
  createPayment(
    placePrice: Float!
    serviceFee: Float!
    totalPrice: Float!
    bookingId: ID
    booking: PaymentbookingBooking
    paymentMethodId: ID
    paymentMethod: PaymentpaymentMethodPaymentAccount
  ): Payment
  createPaymentAccount(
    type: PAYMENT_PROVIDER
    creditcardId: ID
    creditcard: PaymentAccountcreditcardCreditCardInformation
    paypalId: ID
    paypal: PaymentAccountpaypalPaypalInformation
    userId: ID
    user: PaymentAccountuserUser
    paymentsIds: [ID!]
    payments: [PaymentAccountpaymentsPayment!]
  ): PaymentAccount
  createPaypalInformation(
    email: String!
    paymentAccountId: ID
    paymentAccount: PaypalInformationpaymentAccountPaymentAccount
  ): PaypalInformation
  createPicture(
    url: String!
    experienceId: ID
    experience: PictureexperienceExperience
    neighbourHoodId: ID
    neighbourHood: PictureneighbourHoodNeighbourhood
    placeId: ID
    place: PictureplacePlace
    reservationId: ID
    reservation: PicturereservationRestaurant
    userId: ID
    user: PictureuserUser
  ): Picture
  createPlace(
    description: String!
    maxGuests: Int!
    name: String
    numBaths: Int!
    numBedrooms: Int!
    numBeds: Int!
    popularity: Int!
    shortDescription: String!
    size: PLACE_SIZES
    slug: String!
    amenitiesId: ID
    amenities: PlaceamenitiesAmenities
    guestRequirementsId: ID
    guestRequirements: PlaceguestRequirementsGuestRequirements
    hostId: ID
    host: PlacehostUser
    houseRulesId: ID
    houseRules: PlacehouseRulesHouseRules
    locationId: ID
    location: PlacelocationLocation
    policiesId: ID
    policies: PlacepoliciesPolicies
    pricingId: ID
    pricing: PlacepricingPricing
    viewsId: ID
    views: PlaceviewsPlaceViews
    bookingsIds: [ID!]
    bookings: [PlacebookingsBooking!]
    picturesIds: [ID!]
    pictures: [PlacepicturesPicture!]
    reviewsIds: [ID!]
    reviews: [PlacereviewsReview!]
  ): Place
  createPlaceViews(lastWeek: Int!, placeId: ID, place: PlaceViewsplacePlace): PlaceViews
  createPolicies(
    checkInEndTime: Float!
    checkInStartTime: Float!
    checkoutTime: Float!
    placeId: ID
    place: PoliciesplacePlace
  ): Policies
  createPricing(
    averageMonthly: Int!
    averageWeekly: Int!
    basePrice: Int!
    cleaningFee: Int
    currency: CURRENCY
    extraGuests: Int
    monthlyDiscount: Int
    perNight: Int!
    securityDeposit: Int
    smartPricing: Boolean
    weekendPricing: Int
    weeklyDiscount: Int
    placeId: ID
    place: PricingplacePlace
  ): Pricing
  createRestaurant(
    avgPricePerPerson: Int!
    isCurated: Boolean
    popularity: Int!
    slug: String!
    title: String!
    locationId: ID
    location: RestaurantlocationLocation
    picturesIds: [ID!]
    pictures: [RestaurantpicturesPicture!]
  ): Restaurant
  createReview(
    accuracy: Int!
    checkIn: Int!
    cleanliness: Int!
    communication: Int!
    location: Int!
    stars: Int!
    text: String!
    value: Int!
    experienceId: ID
    experience: ReviewexperienceExperience
    placeId: ID
    place: ReviewplacePlace
  ): Review
  updateAmenities(
    airConditioning: Boolean
    babyBath: Boolean
    babyMonitor: Boolean
    babysitterRecommendations: Boolean
    bathtub: Boolean
    breakfast: Boolean
    buzzerWirelessIntercom: Boolean
    cableTv: Boolean
    changingTable: Boolean
    childrensBooksAndToys: Boolean
    childrensDinnerware: Boolean
    crib: Boolean
    doorman: Boolean
    dryer: Boolean
    elevator: Boolean
    essentials: Boolean
    familyKidFriendly: Boolean
    freeParkingOnPremises: Boolean
    freeParkingOnStreet: Boolean
    gym: Boolean
    hairDryer: Boolean
    hangers: Boolean
    heating: Boolean
    hotTub: Boolean
    id: ID!
    indoorFireplace: Boolean
    internet: Boolean
    iron: Boolean
    kitchen: Boolean
    laptopFriendlyWorkspace: Boolean
    paidParkingOffPremises: Boolean
    petsAllowed: Boolean
    pool: Boolean
    privateEntrance: Boolean
    shampoo: Boolean
    smokingAllowed: Boolean
    suitableForEvents: Boolean
    tv: Boolean
    washer: Boolean
    wheelchairAccessible: Boolean
    wirelessInternet: Boolean
    placeId: ID
    place: AmenitiesplacePlace
  ): Amenities
  updateBooking(
    endDate: DateTime
    id: ID!
    startDate: DateTime
    bookeeId: ID
    bookee: BookingbookeeUser
    paymentId: ID
    payment: BookingpaymentPayment
    placeId: ID
    place: BookingplacePlace
  ): Booking
  updateCity(
    id: ID!
    name: String
    neighbourhoodsIds: [ID!]
    neighbourhoods: [CityneighbourhoodsNeighbourhood!]
  ): City
  updateCreditCardInformation(
    cardNumber: String
    country: String
    expiresOnMonth: Int
    expiresOnYear: Int
    firstName: String
    id: ID!
    lastName: String
    postalCode: String
    securityCode: String
    paymentAccountId: ID
    paymentAccount: CreditCardInformationpaymentAccountPaymentAccount
  ): CreditCardInformation
  updateExperience(
    id: ID!
    popularity: Int
    pricePerPerson: Int
    title: String
    categoryId: ID
    category: ExperiencecategoryExperienceCategory
    hostId: ID
    host: ExperiencehostUser
    locationId: ID
    location: ExperiencelocationLocation
    previewId: ID
    preview: ExperiencepreviewPicture
    reviewsIds: [ID!]
    reviews: [ExperiencereviewsReview!]
  ): Experience
  updateExperienceCategory(
    id: ID!
    mainColor: String
    name: String
    experienceId: ID
    experience: ExperienceCategoryexperienceExperience
  ): ExperienceCategory
  updateGuestRequirements(
    govIssuedId: Boolean
    guestTripInformation: Boolean
    id: ID!
    recommendationsFromOtherHosts: Boolean
    placeId: ID
    place: GuestRequirementsplacePlace
  ): GuestRequirements
  updateHouseRules(
    additionalRules: String
    id: ID!
    partiesAndEventsAllowed: Boolean
    petsAllowed: Boolean
    smokingAllowed: Boolean
    suitableForChildren: Boolean
    suitableForInfants: Boolean
    placeId: ID
    place: HouseRulesplacePlace
  ): HouseRules
  updateLocation(
    address: String
    directions: String
    id: ID!
    lat: Float
    lng: Float
    experienceId: ID
    experience: LocationexperienceExperience
    neighbourHoodId: ID
    neighbourHood: LocationneighbourHoodNeighbourhood
    placeId: ID
    place: LocationplacePlace
    restaurantId: ID
    restaurant: LocationrestaurantRestaurant
    userId: ID
    user: LocationuserUser
  ): Location
  updateMessage(
    deliveredAt: DateTime
    id: ID!
    readAt: DateTime
    fromId: ID
    from: MessagefromUser
    toId: ID
    to: MessagetoUser
  ): Message
  updateNeighbourhood(
    featured: Boolean
    id: ID!
    name: String
    popularity: Int
    slug: String
    cityId: ID
    city: NeighbourhoodcityCity
    homePreviewId: ID
    homePreview: NeighbourhoodhomePreviewPicture
    locationsIds: [ID!]
    locations: [NeighbourhoodlocationsLocation!]
  ): Neighbourhood
  updateNotification(
    id: ID!
    link: String
    readDate: DateTime
    type: NOTIFICATION_TYPE
    userId: ID
    user: NotificationuserUser
  ): Notification
  updatePayment(
    id: ID!
    placePrice: Float
    serviceFee: Float
    totalPrice: Float
    bookingId: ID
    booking: PaymentbookingBooking
    paymentMethodId: ID
    paymentMethod: PaymentpaymentMethodPaymentAccount
  ): Payment
  updatePaymentAccount(
    id: ID!
    type: PAYMENT_PROVIDER
    creditcardId: ID
    creditcard: PaymentAccountcreditcardCreditCardInformation
    paypalId: ID
    paypal: PaymentAccountpaypalPaypalInformation
    userId: ID
    user: PaymentAccountuserUser
    paymentsIds: [ID!]
    payments: [PaymentAccountpaymentsPayment!]
  ): PaymentAccount
  updatePaypalInformation(
    email: String
    id: ID!
    paymentAccountId: ID
    paymentAccount: PaypalInformationpaymentAccountPaymentAccount
  ): PaypalInformation
  updatePicture(
    id: ID!
    url: String
    experienceId: ID
    experience: PictureexperienceExperience
    neighbourHoodId: ID
    neighbourHood: PictureneighbourHoodNeighbourhood
    placeId: ID
    place: PictureplacePlace
    reservationId: ID
    reservation: PicturereservationRestaurant
    userId: ID
    user: PictureuserUser
  ): Picture
  updatePlace(
    description: String
    id: ID!
    maxGuests: Int
    name: String
    numBaths: Int
    numBedrooms: Int
    numBeds: Int
    popularity: Int
    shortDescription: String
    size: PLACE_SIZES
    slug: String
    amenitiesId: ID
    amenities: PlaceamenitiesAmenities
    guestRequirementsId: ID
    guestRequirements: PlaceguestRequirementsGuestRequirements
    hostId: ID
    host: PlacehostUser
    houseRulesId: ID
    houseRules: PlacehouseRulesHouseRules
    locationId: ID
    location: PlacelocationLocation
    policiesId: ID
    policies: PlacepoliciesPolicies
    pricingId: ID
    pricing: PlacepricingPricing
    viewsId: ID
    views: PlaceviewsPlaceViews
    bookingsIds: [ID!]
    bookings: [PlacebookingsBooking!]
    picturesIds: [ID!]
    pictures: [PlacepicturesPicture!]
    reviewsIds: [ID!]
    reviews: [PlacereviewsReview!]
  ): Place
  updatePlaceViews(id: ID!, lastWeek: Int, placeId: ID, place: PlaceViewsplacePlace): PlaceViews
  updatePolicies(
    checkInEndTime: Float
    checkInStartTime: Float
    checkoutTime: Float
    id: ID!
    placeId: ID
    place: PoliciesplacePlace
  ): Policies
  updatePricing(
    averageMonthly: Int
    averageWeekly: Int
    basePrice: Int
    cleaningFee: Int
    currency: CURRENCY
    extraGuests: Int
    id: ID!
    monthlyDiscount: Int
    perNight: Int
    securityDeposit: Int
    smartPricing: Boolean
    weekendPricing: Int
    weeklyDiscount: Int
    placeId: ID
    place: PricingplacePlace
  ): Pricing
  updateRestaurant(
    avgPricePerPerson: Int
    id: ID!
    isCurated: Boolean
    popularity: Int
    slug: String
    title: String
    locationId: ID
    location: RestaurantlocationLocation
    picturesIds: [ID!]
    pictures: [RestaurantpicturesPicture!]
  ): Restaurant
  updateReview(
    accuracy: Int
    checkIn: Int
    cleanliness: Int
    communication: Int
    id: ID!
    location: Int
    stars: Int
    text: String
    value: Int
    experienceId: ID
    experience: ReviewexperienceExperience
    placeId: ID
    place: ReviewplacePlace
  ): Review
  updateUser(
    email: String
    firstName: String
    id: ID!
    isSuperHost: Boolean
    lastName: String
    password: String
    phone: String
    responseRate: Float
    responseTime: Int
    locationId: ID
    location: UserlocationLocation
    profilePictureId: ID
    profilePicture: UserprofilePicturePicture
    bookingsIds: [ID!]
    bookings: [UserbookingsBooking!]
    hostingExperiencesIds: [ID!]
    hostingExperiences: [UserhostingExperiencesExperience!]
    notificationsIds: [ID!]
    notifications: [UsernotificationsNotification!]
    ownedPlacesIds: [ID!]
    ownedPlaces: [UserownedPlacesPlace!]
    paymentAccountIds: [ID!]
    paymentAccount: [UserpaymentAccountPaymentAccount!]
    receivedMessagesIds: [ID!]
    receivedMessages: [UserreceivedMessagesMessage!]
    sentMessagesIds: [ID!]
    sentMessages: [UsersentMessagesMessage!]
  ): User
  updateOrCreateAmenities(update: UpdateAmenities!, create: CreateAmenities!): Amenities
  updateOrCreateBooking(update: UpdateBooking!, create: CreateBooking!): Booking
  updateOrCreateCity(update: UpdateCity!, create: CreateCity!): City
  updateOrCreateCreditCardInformation(
    update: UpdateCreditCardInformation!
    create: CreateCreditCardInformation!
  ): CreditCardInformation
  updateOrCreateExperience(update: UpdateExperience!, create: CreateExperience!): Experience
  updateOrCreateExperienceCategory(
    update: UpdateExperienceCategory!
    create: CreateExperienceCategory!
  ): ExperienceCategory
  updateOrCreateGuestRequirements(
    update: UpdateGuestRequirements!
    create: CreateGuestRequirements!
  ): GuestRequirements
  updateOrCreateHouseRules(update: UpdateHouseRules!, create: CreateHouseRules!): HouseRules
  updateOrCreateLocation(update: UpdateLocation!, create: CreateLocation!): Location
  updateOrCreateMessage(update: UpdateMessage!, create: CreateMessage!): Message
  updateOrCreateNeighbourhood(
    update: UpdateNeighbourhood!
    create: CreateNeighbourhood!
  ): Neighbourhood
  updateOrCreateNotification(update: UpdateNotification!, create: CreateNotification!): Notification
  updateOrCreatePayment(update: UpdatePayment!, create: CreatePayment!): Payment
  updateOrCreatePaymentAccount(
    update: UpdatePaymentAccount!
    create: CreatePaymentAccount!
  ): PaymentAccount
  updateOrCreatePaypalInformation(
    update: UpdatePaypalInformation!
    create: CreatePaypalInformation!
  ): PaypalInformation
  updateOrCreatePicture(update: UpdatePicture!, create: CreatePicture!): Picture
  updateOrCreatePlace(update: UpdatePlace!, create: CreatePlace!): Place
  updateOrCreatePlaceViews(update: UpdatePlaceViews!, create: CreatePlaceViews!): PlaceViews
  updateOrCreatePolicies(update: UpdatePolicies!, create: CreatePolicies!): Policies
  updateOrCreatePricing(update: UpdatePricing!, create: CreatePricing!): Pricing
  updateOrCreateRestaurant(update: UpdateRestaurant!, create: CreateRestaurant!): Restaurant
  updateOrCreateReview(update: UpdateReview!, create: CreateReview!): Review
  updateOrCreateUser(update: UpdateUser!, create: CreateUser!): User
  deleteAmenities(id: ID!): Amenities
  deleteBooking(id: ID!): Booking
  deleteCity(id: ID!): City
  deleteCreditCardInformation(id: ID!): CreditCardInformation
  deleteExperience(id: ID!): Experience
  deleteExperienceCategory(id: ID!): ExperienceCategory
  deleteGuestRequirements(id: ID!): GuestRequirements
  deleteHouseRules(id: ID!): HouseRules
  deleteLocation(id: ID!): Location
  deleteMessage(id: ID!): Message
  deleteNeighbourhood(id: ID!): Neighbourhood
  deleteNotification(id: ID!): Notification
  deletePayment(id: ID!): Payment
  deletePaymentAccount(id: ID!): PaymentAccount
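  # Example (illustrative; the id value below is a placeholder, not real data):
  # each delete mutation takes the node's id and returns the deleted node, so
  # fields of the removed record can still be selected in the response:
  #
  #   mutation {
  #     deleteBooking(id: "some-booking-id") {
  #       id
  #       startDate
  #     }
  #   }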
  deletePaypalInformation(id: ID!): PaypalInformation
  deletePicture(id: ID!): Picture
  deletePlace(id: ID!): Place
  deletePlaceViews(id: ID!): PlaceViews
  deletePolicies(id: ID!): Policies
  deletePricing(id: ID!): Pricing
  deleteRestaurant(id: ID!): Restaurant
  deleteReview(id: ID!): Review
  deleteUser(id: ID!): User
  setBookingPayment(paymentPaymentId: ID!, bookingBookingId: ID!): SetBookingPaymentPayload
  setCreditCardInformation(
    creditcardCreditCardInformationId: ID!
    paymentAccountPaymentAccountId: ID!
  ): SetCreditCardInformationPayload
  setExperienceCategory(
    categoryExperienceCategoryId: ID!
    experienceExperienceId: ID!
  ): SetExperienceCategoryPayload
  setExperienceLocation(
    experienceExperienceId: ID!
    locationLocationId: ID!
  ): SetExperienceLocationPayload
  setExperiencePreview(
    experienceExperienceId: ID!
    previewPictureId: ID!
  ): SetExperiencePreviewPayload
  setGuestRequirements(
    guestRequirementsGuestRequirementsId: ID!
    placePlaceId: ID!
  ): SetGuestRequirementsPayload
  setHomePreview(
    homePreviewPictureId: ID!
    neighbourHoodNeighbourhoodId: ID!
  ): SetHomePreviewPayload
  setHouseRules(houseRulesHouseRulesId: ID!, placePlaceId: ID!): SetHouseRulesPayload
  setPaypalInformation(
    paypalPaypalInformationId: ID!
    paymentAccountPaymentAccountId: ID!
  ): SetPaypalInformationPayload
  setPlaceAmenities(amenitiesAmenitiesId: ID!, placePlaceId: ID!): SetPlaceAmenitiesPayload
  setPlaceLocation(locationLocationId: ID!, placePlaceId: ID!): SetPlaceLocationPayload
  setPlacePrice(pricingPricingId: ID!, placePlaceId: ID!): SetPlacePricePayload
  setPlaceViews(viewsPlaceViewsId: ID!, placePlaceId: ID!): SetPlaceViewsPayload
  setPolicies(policiesPoliciesId: ID!, placePlaceId: ID!): SetPoliciesPayload
  setProfilePicture(profilePicturePictureId: ID!, userUserId: ID!): SetProfilePicturePayload
  setRestaurantLocation(
    restaurantRestaurantId: ID!
    locationLocationId: ID!
  ): SetRestaurantLocationPayload
  setUserLocation(locationLocationId: ID!, userUserId: ID!): SetUserLocationPayload
  unsetCreditCardInformation(
    creditcardCreditCardInformationId: ID!
    paymentAccountPaymentAccountId: ID!
  ): UnsetCreditCardInformationPayload
  unsetExperienceCategory(
    categoryExperienceCategoryId: ID!
    experienceExperienceId: ID!
  ): UnsetExperienceCategoryPayload
  unsetHomePreview(
    homePreviewPictureId: ID!
    neighbourHoodNeighbourhoodId: ID!
  ): UnsetHomePreviewPayload
  unsetProfilePicture(profilePicturePictureId: ID!, userUserId: ID!): UnsetProfilePicturePayload
  unsetUserLocation(locationLocationId: ID!, userUserId: ID!): UnsetUserLocationPayload
  addToBookee(bookingsBookingId: ID!, bookeeUserId: ID!): AddToBookeePayload
  addToCityNeighbourhood(
    cityCityId: ID!
    neighbourhoodsNeighbourhoodId: ID!
  ): AddToCityNeighbourhoodPayload
  addToExperienceHost(
    hostingExperiencesExperienceId: ID!
    hostUserId: ID!
  ): AddToExperienceHostPayload
  addToExperienceReviews(
    reviewsReviewId: ID!
    experienceExperienceId: ID!
  ): AddToExperienceReviewsPayload
  addToNeighbourhood(
    neighbourHoodNeighbourhoodId: ID!
    locationsLocationId: ID!
  ): AddToNeighbourhoodPayload
  addToNotifications(notificationsNotificationId: ID!, userUserId: ID!): AddToNotificationsPayload
  addToPaymentAccounts(
    paymentMethodPaymentAccountId: ID!
    paymentsPaymentId: ID!
  ): AddToPaymentAccountsPayload
  addToPlaceBooking(bookingsBookingId: ID!, placePlaceId: ID!): AddToPlaceBookingPayload
  addToPlaceOwner(ownedPlacesPlaceId: ID!, hostUserId: ID!): AddToPlaceOwnerPayload
  addToPlacePictures(picturesPictureId: ID!, placePlaceId: ID!): AddToPlacePicturesPayload
  addToPlaceReviews(reviewsReviewId: ID!, placePlaceId: ID!): AddToPlaceReviewsPayload
  addToReceivedMessages(receivedMessagesMessageId: ID!, toUserId: ID!): AddToReceivedMessagesPayload
  addToRestaurantPicture(
    reservationRestaurantId: ID!
    picturesPictureId: ID!
  ): AddToRestaurantPicturePayload
  addToSentMessages(sentMessagesMessageId: ID!, fromUserId: ID!): AddToSentMessagesPayload
  addToUserPaymentAccounts(
    paymentAccountPaymentAccountId: ID!
    userUserId: ID!
  ): AddToUserPaymentAccountsPayload
  removeFromExperienceReviews(
    reviewsReviewId: ID!
    experienceExperienceId: ID!
  ): RemoveFromExperienceReviewsPayload
  removeFromNeighbourhood(
    neighbourHoodNeighbourhoodId: ID!
    locationsLocationId: ID!
  ): RemoveFromNeighbourhoodPayload
  removeFromPlacePictures(picturesPictureId: ID!, placePlaceId: ID!): RemoveFromPlacePicturesPayload
  removeFromRestaurantPicture(
    reservationRestaurantId: ID!
    picturesPictureId: ID!
  ): RemoveFromRestaurantPicturePayload
  createUser(
    email: String!
    firstName: String!
    isSuperHost: Boolean
    lastName: String!
    password: String!
    phone: String!
    responseRate: Float
    responseTime: Int
    locationId: ID
    location: UserlocationLocation
    profilePictureId: ID
    profilePicture: UserprofilePicturePicture
    bookingsIds: [ID!]
    bookings: [UserbookingsBooking!]
    hostingExperiencesIds: [ID!]
    hostingExperiences: [UserhostingExperiencesExperience!]
    notificationsIds: [ID!]
    notifications: [UsernotificationsNotification!]
    ownedPlacesIds: [ID!]
    ownedPlaces: [UserownedPlacesPlace!]
    paymentAccountIds: [ID!]
    paymentAccount: [UserpaymentAccountPaymentAccount!]
    receivedMessagesIds: [ID!]
    receivedMessages: [UserreceivedMessagesMessage!]
    sentMessagesIds: [ID!]
    sentMessages: [UsersentMessagesMessage!]
  ): User
}

type Neighbourhood implements Node {
  city(filter: CityFilter): City!
  featured: Boolean!
  homePreview(filter: PictureFilter): Picture
  id: ID!
  locations(
    filter: LocationFilter
    orderBy: LocationOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): [Location!]
  name: String!
  popularity: Int!
  slug: String!
  # Meta information about the query.
  _locationsMeta(
    filter: LocationFilter
    orderBy: LocationOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): _QueryMeta!
}

input NeighbourhoodcityCity {
  name: String!
  neighbourhoodsIds: [ID!]
  neighbourhoods: [CityneighbourhoodsNeighbourhood!]
}

input NeighbourhoodFilter {
  # Logical AND on all given filters.
  AND: [NeighbourhoodFilter!]
  # Logical OR on all given filters.
  OR: [NeighbourhoodFilter!]
  featured: Boolean
  # All values that are not equal to given value.
  featured_not: Boolean
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  name: String
  # All values that are not equal to given value.
  name_not: String
  # All values that are contained in given list.
  name_in: [String!]
  # All values that are not contained in given list.
  name_not_in: [String!]
  # All values less than the given value.
  name_lt: String
  # All values less than or equal the given value.
  name_lte: String
  # All values greater than the given value.
  name_gt: String
  # All values greater than or equal the given value.
  name_gte: String
  # All values containing the given string.
  name_contains: String
  # All values not containing the given string.
  name_not_contains: String
  # All values starting with the given string.
  name_starts_with: String
  # All values not starting with the given string.
  name_not_starts_with: String
  # All values ending with the given string.
  name_ends_with: String
  # All values not ending with the given string.
  name_not_ends_with: String
  popularity: Int
  # All values that are not equal to given value.
  popularity_not: Int
  # All values that are contained in given list.
  popularity_in: [Int!]
  # All values that are not contained in given list.
  popularity_not_in: [Int!]
  # All values less than the given value.
  popularity_lt: Int
  # All values less than or equal the given value.
  popularity_lte: Int
  # All values greater than the given value.
  popularity_gt: Int
  # All values greater than or equal the given value.
  popularity_gte: Int
  slug: String
  # All values that are not equal to given value.
  slug_not: String
  # All values that are contained in given list.
  slug_in: [String!]
  # All values that are not contained in given list.
  slug_not_in: [String!]
  # All values less than the given value.
  slug_lt: String
  # All values less than or equal the given value.
  slug_lte: String
  # All values greater than the given value.
  slug_gt: String
  # All values greater than or equal the given value.
  slug_gte: String
  # All values containing the given string.
  slug_contains: String
  # All values not containing the given string.
  slug_not_contains: String
  # All values starting with the given string.
  slug_starts_with: String
  # All values not starting with the given string.
  slug_not_starts_with: String
  # All values ending with the given string.
  slug_ends_with: String
  # All values not ending with the given string.
  slug_not_ends_with: String
  city: CityFilter
  homePreview: PictureFilter
  locations_every: LocationFilter
  locations_some: LocationFilter
  locations_none: LocationFilter
}

input NeighbourhoodhomePreviewPicture {
  url: String!
  experienceId: ID
  experience: PictureexperienceExperience
  placeId: ID
  place: PictureplacePlace
  reservationId: ID
  reservation: PicturereservationRestaurant
  userId: ID
  user: PictureuserUser
}

input NeighbourhoodlocationsLocation {
  address: String
  directions: String
  lat: Float!
  lng: Float!
  experienceId: ID
  experience: LocationexperienceExperience
  placeId: ID
  place: LocationplacePlace
  restaurantId: ID
  restaurant: LocationrestaurantRestaurant
  userId: ID
  user: LocationuserUser
}

enum NeighbourhoodOrderBy {
  featured_ASC
  featured_DESC
  id_ASC
  id_DESC
  name_ASC
  name_DESC
  popularity_ASC
  popularity_DESC
  slug_ASC
  slug_DESC
}

type NeighbourhoodPreviousValues {
  featured: Boolean!
  id: ID!
  name: String!
  popularity: Int!
  slug: String!
}

input NeighbourhoodSubscriptionFilter {
  # Logical AND on all given filters.
  AND: [NeighbourhoodSubscriptionFilter!]
  # Logical OR on all given filters.
  OR: [NeighbourhoodSubscriptionFilter!]
  # The subscription event gets dispatched when it's listed in mutation_in
  mutation_in: [_ModelMutationType!]
  # The subscription event gets only dispatched when one of the updated fields names is included in this list
  updatedFields_contains: String
  # The subscription event gets only dispatched when all of the field names included in this list have been updated
  updatedFields_contains_every: [String!]
  # The subscription event gets only dispatched when some of the field names included in this list have been updated
  updatedFields_contains_some: [String!]
  node: NeighbourhoodSubscriptionFilterNode
}

input NeighbourhoodSubscriptionFilterNode {
  featured: Boolean
  # All values that are not equal to given value.
  featured_not: Boolean
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  name: String
  # All values that are not equal to given value.
  name_not: String
  # All values that are contained in given list.
  name_in: [String!]
  # All values that are not contained in given list.
  name_not_in: [String!]
  # All values less than the given value.
  name_lt: String
  # All values less than or equal the given value.
  name_lte: String
  # All values greater than the given value.
  name_gt: String
  # All values greater than or equal the given value.
  name_gte: String
  # All values containing the given string.
  name_contains: String
  # All values not containing the given string.
  name_not_contains: String
  # All values starting with the given string.
  name_starts_with: String
  # All values not starting with the given string.
  name_not_starts_with: String
  # All values ending with the given string.
  name_ends_with: String
  # All values not ending with the given string.
  name_not_ends_with: String
  popularity: Int
  # All values that are not equal to given value.
  popularity_not: Int
  # All values that are contained in given list.
  popularity_in: [Int!]
  # All values that are not contained in given list.
  popularity_not_in: [Int!]
  # All values less than the given value.
  popularity_lt: Int
  # All values less than or equal the given value.
  popularity_lte: Int
  # All values greater than the given value.
  popularity_gt: Int
  # All values greater than or equal the given value.
  popularity_gte: Int
  slug: String
  # All values that are not equal to given value.
  slug_not: String
  # All values that are contained in given list.
  slug_in: [String!]
  # All values that are not contained in given list.
  slug_not_in: [String!]
  # All values less than the given value.
  slug_lt: String
  # All values less than or equal the given value.
  slug_lte: String
  # All values greater than the given value.
  slug_gt: String
  # All values greater than or equal the given value.
  slug_gte: String
  # All values containing the given string.
  slug_contains: String
  # All values not containing the given string.
  slug_not_contains: String
  # All values starting with the given string.
  slug_starts_with: String
  # All values not starting with the given string.
  slug_not_starts_with: String
  # All values ending with the given string.
  slug_ends_with: String
  # All values not ending with the given string.
  slug_not_ends_with: String
  city: CityFilter
  homePreview: PictureFilter
  locations_every: LocationFilter
  locations_some: LocationFilter
  locations_none: LocationFilter
}

type NeighbourhoodSubscriptionPayload {
  mutation: _ModelMutationType!
  node: Neighbourhood
  updatedFields: [String!]
  previousValues: NeighbourhoodPreviousValues
}

# An object with an ID
interface Node {
  # The id of the object.
  id: ID!
}

type Notification implements Node {
  createdAt: DateTime!
  id: ID!
  link: String!
  readDate: DateTime!
  type: NOTIFICATION_TYPE
  user(filter: UserFilter): User!
}

enum NOTIFICATION_TYPE {
  OFFER
  INSTANT_BOOK
  RESPONSIVENESS
  NEW_AMENITIES
  HOUSE_RULES
}

input NotificationFilter {
  # Logical AND on all given filters.
  AND: [NotificationFilter!]
  # Logical OR on all given filters.
  OR: [NotificationFilter!]
  createdAt: DateTime
  # All values that are not equal to given value.
  createdAt_not: DateTime
  # All values that are contained in given list.
  createdAt_in: [DateTime!]
  # All values that are not contained in given list.
  createdAt_not_in: [DateTime!]
  # All values less than the given value.
  createdAt_lt: DateTime
  # All values less than or equal the given value.
  createdAt_lte: DateTime
  # All values greater than the given value.
  createdAt_gt: DateTime
  # All values greater than or equal the given value.
  createdAt_gte: DateTime
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
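  # Example (illustrative): the scalar fields of a filter input combine as a
  # logical AND. Assuming the API also exposes an `allNotifications` query field
  # (not shown in this excerpt), a filtered query might look like:
  #
  #   query {
  #     allNotifications(filter: {
  #       createdAt_gte: "2017-01-01T00:00:00Z"
  #       link_contains: "booking"
  #     }) {
  #       id
  #     }
  #   }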
# All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID link: String # All values that are not equal to given value. link_not: String # All values that are contained in given list. link_in: [String!] # All values that are not contained in given list. link_not_in: [String!] # All values less than the given value. link_lt: String # All values less than or equal the given value. link_lte: String # All values greater than the given value. link_gt: String # All values greater than or equal the given value. link_gte: String # All values containing the given string. link_contains: String # All values not containing the given string. link_not_contains: String # All values starting with the given string. link_starts_with: String # All values not starting with the given string. link_not_starts_with: String # All values ending with the given string. link_ends_with: String # All values not ending with the given string. link_not_ends_with: String readDate: DateTime # All values that are not equal to given value. readDate_not: DateTime # All values that are contained in given list. readDate_in: [DateTime!] # All values that are not contained in given list. readDate_not_in: [DateTime!] # All values less than the given value. readDate_lt: DateTime # All values less than or equal the given value. readDate_lte: DateTime # All values greater than the given value. readDate_gt: DateTime # All values greater than or equal the given value. 
readDate_gte: DateTime type: NOTIFICATION_TYPE # All values that are not equal to given value. type_not: NOTIFICATION_TYPE # All values that are contained in given list. type_in: [NOTIFICATION_TYPE!] # All values that are not contained in given list. type_not_in: [NOTIFICATION_TYPE!] user: UserFilter } enum NotificationOrderBy { createdAt_ASC createdAt_DESC id_ASC id_DESC link_ASC link_DESC readDate_ASC readDate_DESC type_ASC type_DESC } type NotificationPreviousValues { createdAt: DateTime! id: ID! link: String! readDate: DateTime! type: NOTIFICATION_TYPE } input NotificationSubscriptionFilter { # Logical AND on all given filters. AND: [NotificationSubscriptionFilter!] # Logical OR on all given filters. OR: [NotificationSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] # The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: NotificationSubscriptionFilterNode } input NotificationSubscriptionFilterNode { createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime id: ID # All values that are not equal to given value. 
id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID link: String # All values that are not equal to given value. link_not: String # All values that are contained in given list. link_in: [String!] # All values that are not contained in given list. link_not_in: [String!] # All values less than the given value. link_lt: String # All values less than or equal the given value. link_lte: String # All values greater than the given value. link_gt: String # All values greater than or equal the given value. link_gte: String # All values containing the given string. link_contains: String # All values not containing the given string. link_not_contains: String # All values starting with the given string. link_starts_with: String # All values not starting with the given string. link_not_starts_with: String # All values ending with the given string. link_ends_with: String # All values not ending with the given string. link_not_ends_with: String readDate: DateTime # All values that are not equal to given value. readDate_not: DateTime # All values that are contained in given list. readDate_in: [DateTime!] # All values that are not contained in given list. readDate_not_in: [DateTime!] # All values less than the given value. readDate_lt: DateTime # All values less than or equal the given value. 
readDate_lte: DateTime # All values greater than the given value. readDate_gt: DateTime # All values greater than or equal the given value. readDate_gte: DateTime type: NOTIFICATION_TYPE # All values that are not equal to given value. type_not: NOTIFICATION_TYPE # All values that are contained in given list. type_in: [NOTIFICATION_TYPE!] # All values that are not contained in given list. type_not_in: [NOTIFICATION_TYPE!] user: UserFilter } type NotificationSubscriptionPayload { mutation: _ModelMutationType! node: Notification updatedFields: [String!] previousValues: NotificationPreviousValues } input NotificationuserUser { email: String! firstName: String! isSuperHost: Boolean lastName: String! password: String! phone: String! responseRate: Float responseTime: Int locationId: ID location: UserlocationLocation profilePictureId: ID profilePicture: UserprofilePicturePicture bookingsIds: [ID!] bookings: [UserbookingsBooking!] hostingExperiencesIds: [ID!] hostingExperiences: [UserhostingExperiencesExperience!] notificationsIds: [ID!] notifications: [UsernotificationsNotification!] ownedPlacesIds: [ID!] ownedPlaces: [UserownedPlacesPlace!] paymentAccountIds: [ID!] paymentAccount: [UserpaymentAccountPaymentAccount!] receivedMessagesIds: [ID!] receivedMessages: [UserreceivedMessagesMessage!] sentMessagesIds: [ID!] sentMessages: [UsersentMessagesMessage!] } type Payment implements Node { booking(filter: BookingFilter): Booking! createdAt: DateTime! id: ID! paymentMethod(filter: PaymentAccountFilter): PaymentAccount! placePrice: Float! serviceFee: Float! totalPrice: Float! } enum PAYMENT_PROVIDER { PAYPAL CREDIT_CARD } type PaymentAccount implements Node { createdAt: DateTime! creditcard(filter: CreditCardInformationFilter): CreditCardInformation id: ID! payments( filter: PaymentFilter orderBy: PaymentOrderBy skip: Int after: String before: String first: Int last: Int ): [Payment!] 
paypal(filter: PaypalInformationFilter): PaypalInformation type: PAYMENT_PROVIDER user(filter: UserFilter): User! # Meta information about the query. _paymentsMeta( filter: PaymentFilter orderBy: PaymentOrderBy skip: Int after: String before: String first: Int last: Int ): _QueryMeta! } input PaymentAccountcreditcardCreditCardInformation { cardNumber: String! country: String! expiresOnMonth: Int! expiresOnYear: Int! firstName: String! lastName: String! postalCode: String! securityCode: String! } input PaymentAccountFilter { # Logical AND on all given filters. AND: [PaymentAccountFilter!] # Logical OR on all given filters. OR: [PaymentAccountFilter!] createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. 
id_not_ends_with: ID type: PAYMENT_PROVIDER # All values that are not equal to given value. type_not: PAYMENT_PROVIDER # All values that are contained in given list. type_in: [PAYMENT_PROVIDER!] # All values that are not contained in given list. type_not_in: [PAYMENT_PROVIDER!] creditcard: CreditCardInformationFilter payments_every: PaymentFilter payments_some: PaymentFilter payments_none: PaymentFilter paypal: PaypalInformationFilter user: UserFilter } enum PaymentAccountOrderBy { createdAt_ASC createdAt_DESC id_ASC id_DESC type_ASC type_DESC } input PaymentAccountpaymentsPayment { placePrice: Float! serviceFee: Float! totalPrice: Float! bookingId: ID booking: PaymentbookingBooking } input PaymentAccountpaypalPaypalInformation { email: String! } type PaymentAccountPreviousValues { createdAt: DateTime! id: ID! type: PAYMENT_PROVIDER } input PaymentAccountSubscriptionFilter { # Logical AND on all given filters. AND: [PaymentAccountSubscriptionFilter!] # Logical OR on all given filters. OR: [PaymentAccountSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] # The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: PaymentAccountSubscriptionFilterNode } input PaymentAccountSubscriptionFilterNode { createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. 
createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID type: PAYMENT_PROVIDER # All values that are not equal to given value. type_not: PAYMENT_PROVIDER # All values that are contained in given list. type_in: [PAYMENT_PROVIDER!] # All values that are not contained in given list. type_not_in: [PAYMENT_PROVIDER!] creditcard: CreditCardInformationFilter payments_every: PaymentFilter payments_some: PaymentFilter payments_none: PaymentFilter paypal: PaypalInformationFilter user: UserFilter } type PaymentAccountSubscriptionPayload { mutation: _ModelMutationType! node: PaymentAccount updatedFields: [String!] previousValues: PaymentAccountPreviousValues } input PaymentAccountuserUser { email: String! firstName: String! isSuperHost: Boolean lastName: String! password: String! phone: String! responseRate: Float responseTime: Int locationId: ID location: UserlocationLocation profilePictureId: ID profilePicture: UserprofilePicturePicture bookingsIds: [ID!] bookings: [UserbookingsBooking!] 
hostingExperiencesIds: [ID!] hostingExperiences: [UserhostingExperiencesExperience!] notificationsIds: [ID!] notifications: [UsernotificationsNotification!] ownedPlacesIds: [ID!] ownedPlaces: [UserownedPlacesPlace!] paymentAccountIds: [ID!] paymentAccount: [UserpaymentAccountPaymentAccount!] receivedMessagesIds: [ID!] receivedMessages: [UserreceivedMessagesMessage!] sentMessagesIds: [ID!] sentMessages: [UsersentMessagesMessage!] } input PaymentbookingBooking { endDate: DateTime! startDate: DateTime! bookeeId: ID bookee: BookingbookeeUser placeId: ID place: BookingplacePlace } input PaymentFilter { # Logical AND on all given filters. AND: [PaymentFilter!] # Logical OR on all given filters. OR: [PaymentFilter!] createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. 
id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID placePrice: Float # All values that are not equal to given value. placePrice_not: Float # All values that are contained in given list. placePrice_in: [Float!] # All values that are not contained in given list. placePrice_not_in: [Float!] # All values less than the given value. placePrice_lt: Float # All values less than or equal the given value. placePrice_lte: Float # All values greater than the given value. placePrice_gt: Float # All values greater than or equal the given value. placePrice_gte: Float serviceFee: Float # All values that are not equal to given value. serviceFee_not: Float # All values that are contained in given list. serviceFee_in: [Float!] # All values that are not contained in given list. serviceFee_not_in: [Float!] # All values less than the given value. serviceFee_lt: Float # All values less than or equal the given value. serviceFee_lte: Float # All values greater than the given value. serviceFee_gt: Float # All values greater than or equal the given value. serviceFee_gte: Float totalPrice: Float # All values that are not equal to given value. totalPrice_not: Float # All values that are contained in given list. totalPrice_in: [Float!] # All values that are not contained in given list. totalPrice_not_in: [Float!] # All values less than the given value. totalPrice_lt: Float # All values less than or equal the given value. totalPrice_lte: Float # All values greater than the given value. totalPrice_gt: Float # All values greater than or equal the given value. 
totalPrice_gte: Float booking: BookingFilter paymentMethod: PaymentAccountFilter } enum PaymentOrderBy { createdAt_ASC createdAt_DESC id_ASC id_DESC placePrice_ASC placePrice_DESC serviceFee_ASC serviceFee_DESC totalPrice_ASC totalPrice_DESC } input PaymentpaymentMethodPaymentAccount { type: PAYMENT_PROVIDER creditcardId: ID creditcard: PaymentAccountcreditcardCreditCardInformation paypalId: ID paypal: PaymentAccountpaypalPaypalInformation userId: ID user: PaymentAccountuserUser paymentsIds: [ID!] payments: [PaymentAccountpaymentsPayment!] } type PaymentPreviousValues { createdAt: DateTime! id: ID! placePrice: Float! serviceFee: Float! totalPrice: Float! } input PaymentSubscriptionFilter { # Logical AND on all given filters. AND: [PaymentSubscriptionFilter!] # Logical OR on all given filters. OR: [PaymentSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] # The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: PaymentSubscriptionFilterNode } input PaymentSubscriptionFilterNode { createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. 
createdAt_gte: DateTime id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID placePrice: Float # All values that are not equal to given value. placePrice_not: Float # All values that are contained in given list. placePrice_in: [Float!] # All values that are not contained in given list. placePrice_not_in: [Float!] # All values less than the given value. placePrice_lt: Float # All values less than or equal the given value. placePrice_lte: Float # All values greater than the given value. placePrice_gt: Float # All values greater than or equal the given value. placePrice_gte: Float serviceFee: Float # All values that are not equal to given value. serviceFee_not: Float # All values that are contained in given list. serviceFee_in: [Float!] # All values that are not contained in given list. serviceFee_not_in: [Float!] # All values less than the given value. serviceFee_lt: Float # All values less than or equal the given value. serviceFee_lte: Float # All values greater than the given value. serviceFee_gt: Float # All values greater than or equal the given value. serviceFee_gte: Float totalPrice: Float # All values that are not equal to given value. totalPrice_not: Float # All values that are contained in given list. 
totalPrice_in: [Float!] # All values that are not contained in given list. totalPrice_not_in: [Float!] # All values less than the given value. totalPrice_lt: Float # All values less than or equal the given value. totalPrice_lte: Float # All values greater than the given value. totalPrice_gt: Float # All values greater than or equal the given value. totalPrice_gte: Float booking: BookingFilter paymentMethod: PaymentAccountFilter } type PaymentSubscriptionPayload { mutation: _ModelMutationType! node: Payment updatedFields: [String!] previousValues: PaymentPreviousValues } type PaypalInformation implements Node { createdAt: DateTime! email: String! id: ID! paymentAccount(filter: PaymentAccountFilter): PaymentAccount! } input PaypalInformationFilter { # Logical AND on all given filters. AND: [PaypalInformationFilter!] # Logical OR on all given filters. OR: [PaypalInformationFilter!] createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime email: String # All values that are not equal to given value. email_not: String # All values that are contained in given list. email_in: [String!] # All values that are not contained in given list. email_not_in: [String!] # All values less than the given value. email_lt: String # All values less than or equal the given value. email_lte: String # All values greater than the given value. email_gt: String # All values greater than or equal the given value. email_gte: String # All values containing the given string. 
email_contains: String # All values not containing the given string. email_not_contains: String # All values starting with the given string. email_starts_with: String # All values not starting with the given string. email_not_starts_with: String # All values ending with the given string. email_ends_with: String # All values not ending with the given string. email_not_ends_with: String id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID paymentAccount: PaymentAccountFilter } enum PaypalInformationOrderBy { createdAt_ASC createdAt_DESC email_ASC email_DESC id_ASC id_DESC } input PaypalInformationpaymentAccountPaymentAccount { type: PAYMENT_PROVIDER creditcardId: ID creditcard: PaymentAccountcreditcardCreditCardInformation userId: ID user: PaymentAccountuserUser paymentsIds: [ID!] payments: [PaymentAccountpaymentsPayment!] } type PaypalInformationPreviousValues { createdAt: DateTime! email: String! id: ID! } input PaypalInformationSubscriptionFilter { # Logical AND on all given filters. AND: [PaypalInformationSubscriptionFilter!] # Logical OR on all given filters. OR: [PaypalInformationSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] 
# The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: PaypalInformationSubscriptionFilterNode } input PaypalInformationSubscriptionFilterNode { createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime email: String # All values that are not equal to given value. email_not: String # All values that are contained in given list. email_in: [String!] # All values that are not contained in given list. email_not_in: [String!] # All values less than the given value. email_lt: String # All values less than or equal the given value. email_lte: String # All values greater than the given value. email_gt: String # All values greater than or equal the given value. email_gte: String # All values containing the given string. email_contains: String # All values not containing the given string. email_not_contains: String # All values starting with the given string. email_starts_with: String # All values not starting with the given string. email_not_starts_with: String # All values ending with the given string. email_ends_with: String # All values not ending with the given string. 
email_not_ends_with: String id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID paymentAccount: PaymentAccountFilter } type PaypalInformationSubscriptionPayload { mutation: _ModelMutationType! node: PaypalInformation updatedFields: [String!] previousValues: PaypalInformationPreviousValues } type Picture implements Node { experience(filter: ExperienceFilter): Experience id: ID! neighbourHood(filter: NeighbourhoodFilter): Neighbourhood place(filter: PlaceFilter): Place reservation(filter: RestaurantFilter): Restaurant url: String! user(filter: UserFilter): User } input PictureexperienceExperience { popularity: Int! pricePerPerson: Int! title: String! categoryId: ID category: ExperiencecategoryExperienceCategory hostId: ID host: ExperiencehostUser locationId: ID location: ExperiencelocationLocation reviewsIds: [ID!] reviews: [ExperiencereviewsReview!] } input PictureFilter { # Logical AND on all given filters. AND: [PictureFilter!] # Logical OR on all given filters. OR: [PictureFilter!] id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. 
id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID url: String # All values that are not equal to given value. url_not: String # All values that are contained in given list. url_in: [String!] # All values that are not contained in given list. url_not_in: [String!] # All values less than the given value. url_lt: String # All values less than or equal the given value. url_lte: String # All values greater than the given value. url_gt: String # All values greater than or equal the given value. url_gte: String # All values containing the given string. url_contains: String # All values not containing the given string. url_not_contains: String # All values starting with the given string. url_starts_with: String # All values not starting with the given string. url_not_starts_with: String # All values ending with the given string. url_ends_with: String # All values not ending with the given string. url_not_ends_with: String experience: ExperienceFilter neighbourHood: NeighbourhoodFilter place: PlaceFilter reservation: RestaurantFilter user: UserFilter } input PictureneighbourHoodNeighbourhood { featured: Boolean! name: String! popularity: Int! slug: String! cityId: ID city: NeighbourhoodcityCity locationsIds: [ID!] locations: [NeighbourhoodlocationsLocation!] } enum PictureOrderBy { id_ASC id_DESC url_ASC url_DESC } input PictureplacePlace { description: String! maxGuests: Int! name: String numBaths: Int! numBedrooms: Int! numBeds: Int! popularity: Int! 
shortDescription: String! size: PLACE_SIZES slug: String! amenitiesId: ID amenities: PlaceamenitiesAmenities guestRequirementsId: ID guestRequirements: PlaceguestRequirementsGuestRequirements hostId: ID host: PlacehostUser houseRulesId: ID houseRules: PlacehouseRulesHouseRules locationId: ID location: PlacelocationLocation policiesId: ID policies: PlacepoliciesPolicies pricingId: ID pricing: PlacepricingPricing viewsId: ID views: PlaceviewsPlaceViews bookingsIds: [ID!] bookings: [PlacebookingsBooking!] picturesIds: [ID!] pictures: [PlacepicturesPicture!] reviewsIds: [ID!] reviews: [PlacereviewsReview!] } type PicturePreviousValues { id: ID! url: String! } input PicturereservationRestaurant { avgPricePerPerson: Int! isCurated: Boolean popularity: Int! slug: String! title: String! locationId: ID location: RestaurantlocationLocation picturesIds: [ID!] pictures: [RestaurantpicturesPicture!] } input PictureSubscriptionFilter { # Logical AND on all given filters. AND: [PictureSubscriptionFilter!] # Logical OR on all given filters. OR: [PictureSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] # The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: PictureSubscriptionFilterNode } input PictureSubscriptionFilterNode { id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. 
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal to the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  url: String
  # All values that are not equal to the given value.
  url_not: String
  # All values that are contained in the given list.
  url_in: [String!]
  # All values that are not contained in the given list.
  url_not_in: [String!]
  # All values less than the given value.
  url_lt: String
  # All values less than or equal to the given value.
  url_lte: String
  # All values greater than the given value.
  url_gt: String
  # All values greater than or equal to the given value.
  url_gte: String
  # All values containing the given string.
  url_contains: String
  # All values not containing the given string.
  url_not_contains: String
  # All values starting with the given string.
  url_starts_with: String
  # All values not starting with the given string.
  url_not_starts_with: String
  # All values ending with the given string.
  url_ends_with: String
  # All values not ending with the given string.
  url_not_ends_with: String
  experience: ExperienceFilter
  neighbourHood: NeighbourhoodFilter
  place: PlaceFilter
  reservation: RestaurantFilter
  user: UserFilter
}

type PictureSubscriptionPayload {
  mutation: _ModelMutationType!
  node: Picture
  updatedFields: [String!]
  previousValues: PicturePreviousValues
}

input PictureuserUser {
  email: String!
  firstName: String!
  isSuperHost: Boolean
  lastName: String!
  password: String!
  phone: String!
  responseRate: Float
  responseTime: Int
  locationId: ID
  location: UserlocationLocation
  bookingsIds: [ID!]
  bookings: [UserbookingsBooking!]
  hostingExperiencesIds: [ID!]
  hostingExperiences: [UserhostingExperiencesExperience!]
  notificationsIds: [ID!]
  notifications: [UsernotificationsNotification!]
  ownedPlacesIds: [ID!]
  ownedPlaces: [UserownedPlacesPlace!]
  paymentAccountIds: [ID!]
  paymentAccount: [UserpaymentAccountPaymentAccount!]
  receivedMessagesIds: [ID!]
  receivedMessages: [UserreceivedMessagesMessage!]
  sentMessagesIds: [ID!]
  sentMessages: [UsersentMessagesMessage!]
}

type Place implements Node {
  amenities(filter: AmenitiesFilter): Amenities!
  bookings(
    filter: BookingFilter
    orderBy: BookingOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): [Booking!]
  description: String!
  guestRequirements(filter: GuestRequirementsFilter): GuestRequirements
  host(filter: UserFilter): User!
  houseRules(filter: HouseRulesFilter): HouseRules
  id: ID!
  location(filter: LocationFilter): Location!
  maxGuests: Int!
  name: String
  numBaths: Int!
  numBedrooms: Int!
  numBeds: Int!
  pictures(
    filter: PictureFilter
    orderBy: PictureOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): [Picture!]
  policies(filter: PoliciesFilter): Policies
  popularity: Int!
  pricing(filter: PricingFilter): Pricing!
  reviews(
    filter: ReviewFilter
    orderBy: ReviewOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): [Review!]
  shortDescription: String!
  size: PLACE_SIZES
  slug: String!
  views(filter: PlaceViewsFilter): PlaceViews!
  # Meta information about the query.
  _bookingsMeta(
    filter: BookingFilter
    orderBy: BookingOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): _QueryMeta!
  # Meta information about the query.
  _picturesMeta(
    filter: PictureFilter
    orderBy: PictureOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): _QueryMeta!
  # Meta information about the query.
  _reviewsMeta(
    filter: ReviewFilter
    orderBy: ReviewOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): _QueryMeta!
}

enum PLACE_SIZES {
  ENTIRE_HOUSE
  ENTIRE_APARTMENT
  ENTIRE_EARTH_HOUSE
  ENTIRE_CABIN
  ENTIRE_VILLA
  ENTIRE_PLACE
  ENTIRE_BOAT
  PRIVATE_ROOM
}

input PlaceamenitiesAmenities {
  airConditioning: Boolean
  babyBath: Boolean
  babyMonitor: Boolean
  babysitterRecommendations: Boolean
  bathtub: Boolean
  breakfast: Boolean
  buzzerWirelessIntercom: Boolean
  cableTv: Boolean
  changingTable: Boolean
  childrensBooksAndToys: Boolean
  childrensDinnerware: Boolean
  crib: Boolean
  doorman: Boolean
  dryer: Boolean
  elevator: Boolean
  essentials: Boolean
  familyKidFriendly: Boolean
  freeParkingOnPremises: Boolean
  freeParkingOnStreet: Boolean
  gym: Boolean
  hairDryer: Boolean
  hangers: Boolean
  heating: Boolean
  hotTub: Boolean
  indoorFireplace: Boolean
  internet: Boolean
  iron: Boolean
  kitchen: Boolean
  laptopFriendlyWorkspace: Boolean
  paidParkingOffPremises: Boolean
  petsAllowed: Boolean
  pool: Boolean
  privateEntrance: Boolean
  shampoo: Boolean
  smokingAllowed: Boolean
  suitableForEvents: Boolean
  tv: Boolean
  washer: Boolean
  wheelchairAccessible: Boolean
  wirelessInternet: Boolean
}

input PlacebookingsBooking {
  endDate: DateTime!
  startDate: DateTime!
  bookeeId: ID
  bookee: BookingbookeeUser
  paymentId: ID
  payment: BookingpaymentPayment
}

input PlaceFilter {
  # Logical AND on all given filters.
  AND: [PlaceFilter!]
  # Logical OR on all given filters.
  OR: [PlaceFilter!]
  description: String
  # All values that are not equal to the given value.
  description_not: String
  # All values that are contained in the given list.
  description_in: [String!]
  # All values that are not contained in the given list.
  description_not_in: [String!]
  # All values less than the given value.
  description_lt: String
  # All values less than or equal to the given value.
  description_lte: String
  # All values greater than the given value.
  description_gt: String
  # All values greater than or equal to the given value.
  description_gte: String
  # All values containing the given string.
  description_contains: String
  # All values not containing the given string.
  description_not_contains: String
  # All values starting with the given string.
  description_starts_with: String
  # All values not starting with the given string.
  description_not_starts_with: String
  # All values ending with the given string.
  description_ends_with: String
  # All values not ending with the given string.
  description_not_ends_with: String
  id: ID
  # All values that are not equal to the given value.
  id_not: ID
  # All values that are contained in the given list.
  id_in: [ID!]
  # All values that are not contained in the given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal to the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal to the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  maxGuests: Int
  # All values that are not equal to the given value.
  maxGuests_not: Int
  # All values that are contained in the given list.
  maxGuests_in: [Int!]
  # All values that are not contained in the given list.
  maxGuests_not_in: [Int!]
  # All values less than the given value.
  maxGuests_lt: Int
  # All values less than or equal to the given value.
  maxGuests_lte: Int
  # All values greater than the given value.
  maxGuests_gt: Int
  # All values greater than or equal to the given value.
  maxGuests_gte: Int
  name: String
  # All values that are not equal to the given value.
  name_not: String
  # All values that are contained in the given list.
  name_in: [String!]
  # All values that are not contained in the given list.
  name_not_in: [String!]
  # All values less than the given value.
  name_lt: String
  # All values less than or equal to the given value.
  name_lte: String
  # All values greater than the given value.
  name_gt: String
  # All values greater than or equal to the given value.
  name_gte: String
  # All values containing the given string.
  name_contains: String
  # All values not containing the given string.
  name_not_contains: String
  # All values starting with the given string.
  name_starts_with: String
  # All values not starting with the given string.
  name_not_starts_with: String
  # All values ending with the given string.
  name_ends_with: String
  # All values not ending with the given string.
  name_not_ends_with: String
  numBaths: Int
  # All values that are not equal to the given value.
  numBaths_not: Int
  # All values that are contained in the given list.
  numBaths_in: [Int!]
  # All values that are not contained in the given list.
  numBaths_not_in: [Int!]
  # All values less than the given value.
  numBaths_lt: Int
  # All values less than or equal to the given value.
  numBaths_lte: Int
  # All values greater than the given value.
  numBaths_gt: Int
  # All values greater than or equal to the given value.
  numBaths_gte: Int
  numBedrooms: Int
  # All values that are not equal to the given value.
  numBedrooms_not: Int
  # All values that are contained in the given list.
  numBedrooms_in: [Int!]
  # All values that are not contained in the given list.
  numBedrooms_not_in: [Int!]
  # All values less than the given value.
  numBedrooms_lt: Int
  # All values less than or equal to the given value.
  numBedrooms_lte: Int
  # All values greater than the given value.
  numBedrooms_gt: Int
  # All values greater than or equal to the given value.
  numBedrooms_gte: Int
  numBeds: Int
  # All values that are not equal to the given value.
  numBeds_not: Int
  # All values that are contained in the given list.
  numBeds_in: [Int!]
  # All values that are not contained in the given list.
  numBeds_not_in: [Int!]
  # All values less than the given value.
  numBeds_lt: Int
  # All values less than or equal to the given value.
  numBeds_lte: Int
  # All values greater than the given value.
  numBeds_gt: Int
  # All values greater than or equal to the given value.
  numBeds_gte: Int
  popularity: Int
  # All values that are not equal to the given value.
  popularity_not: Int
  # All values that are contained in the given list.
  popularity_in: [Int!]
  # All values that are not contained in the given list.
  popularity_not_in: [Int!]
  # All values less than the given value.
  popularity_lt: Int
  # All values less than or equal to the given value.
  popularity_lte: Int
  # All values greater than the given value.
  popularity_gt: Int
  # All values greater than or equal to the given value.
  popularity_gte: Int
  shortDescription: String
  # All values that are not equal to the given value.
  shortDescription_not: String
  # All values that are contained in the given list.
  shortDescription_in: [String!]
  # All values that are not contained in the given list.
  shortDescription_not_in: [String!]
  # All values less than the given value.
  shortDescription_lt: String
  # All values less than or equal to the given value.
  shortDescription_lte: String
  # All values greater than the given value.
  shortDescription_gt: String
  # All values greater than or equal to the given value.
  shortDescription_gte: String
  # All values containing the given string.
  shortDescription_contains: String
  # All values not containing the given string.
  shortDescription_not_contains: String
  # All values starting with the given string.
  shortDescription_starts_with: String
  # All values not starting with the given string.
  shortDescription_not_starts_with: String
  # All values ending with the given string.
  shortDescription_ends_with: String
  # All values not ending with the given string.
  shortDescription_not_ends_with: String
  size: PLACE_SIZES
  # All values that are not equal to the given value.
  size_not: PLACE_SIZES
  # All values that are contained in the given list.
  size_in: [PLACE_SIZES!]
  # All values that are not contained in the given list.
  size_not_in: [PLACE_SIZES!]
  slug: String
  # All values that are not equal to the given value.
  slug_not: String
  # All values that are contained in the given list.
  slug_in: [String!]
  # All values that are not contained in the given list.
  slug_not_in: [String!]
  # All values less than the given value.
  slug_lt: String
  # All values less than or equal to the given value.
  slug_lte: String
  # All values greater than the given value.
  slug_gt: String
  # All values greater than or equal to the given value.
  slug_gte: String
  # All values containing the given string.
  slug_contains: String
  # All values not containing the given string.
  slug_not_contains: String
  # All values starting with the given string.
  slug_starts_with: String
  # All values not starting with the given string.
  slug_not_starts_with: String
  # All values ending with the given string.
  slug_ends_with: String
  # All values not ending with the given string.
  slug_not_ends_with: String
  amenities: AmenitiesFilter
  bookings_every: BookingFilter
  bookings_some: BookingFilter
  bookings_none: BookingFilter
  guestRequirements: GuestRequirementsFilter
  host: UserFilter
  houseRules: HouseRulesFilter
  location: LocationFilter
  pictures_every: PictureFilter
  pictures_some: PictureFilter
  pictures_none: PictureFilter
  policies: PoliciesFilter
  pricing: PricingFilter
  reviews_every: ReviewFilter
  reviews_some: ReviewFilter
  reviews_none: ReviewFilter
  views: PlaceViewsFilter
}

input PlaceguestRequirementsGuestRequirements {
  govIssuedId: Boolean
  guestTripInformation: Boolean
  recommendationsFromOtherHosts: Boolean
}

input PlacehostUser {
  email: String!
  firstName: String!
  isSuperHost: Boolean
  lastName: String!
  password: String!
  phone: String!
  responseRate: Float
  responseTime: Int
  locationId: ID
  location: UserlocationLocation
  profilePictureId: ID
  profilePicture: UserprofilePicturePicture
  bookingsIds: [ID!]
  bookings: [UserbookingsBooking!]
  hostingExperiencesIds: [ID!]
  hostingExperiences: [UserhostingExperiencesExperience!]
  notificationsIds: [ID!]
  notifications: [UsernotificationsNotification!]
  ownedPlacesIds: [ID!]
  ownedPlaces: [UserownedPlacesPlace!]
  paymentAccountIds: [ID!]
  paymentAccount: [UserpaymentAccountPaymentAccount!]
  receivedMessagesIds: [ID!]
  receivedMessages: [UserreceivedMessagesMessage!]
  sentMessagesIds: [ID!]
  sentMessages: [UsersentMessagesMessage!]
}

input PlacehouseRulesHouseRules {
  additionalRules: String
  partiesAndEventsAllowed: Boolean
  petsAllowed: Boolean
  smokingAllowed: Boolean
  suitableForChildren: Boolean
  suitableForInfants: Boolean
}

input PlacelocationLocation {
  address: String
  directions: String
  lat: Float!
  lng: Float!
  experienceId: ID
  experience: LocationexperienceExperience
  neighbourHoodId: ID
  neighbourHood: LocationneighbourHoodNeighbourhood
  restaurantId: ID
  restaurant: LocationrestaurantRestaurant
  userId: ID
  user: LocationuserUser
}

enum PlaceOrderBy {
  description_ASC
  description_DESC
  id_ASC
  id_DESC
  maxGuests_ASC
  maxGuests_DESC
  name_ASC
  name_DESC
  numBaths_ASC
  numBaths_DESC
  numBedrooms_ASC
  numBedrooms_DESC
  numBeds_ASC
  numBeds_DESC
  popularity_ASC
  popularity_DESC
  shortDescription_ASC
  shortDescription_DESC
  size_ASC
  size_DESC
  slug_ASC
  slug_DESC
}

input PlacepicturesPicture {
  url: String!
  experienceId: ID
  experience: PictureexperienceExperience
  neighbourHoodId: ID
  neighbourHood: PictureneighbourHoodNeighbourhood
  reservationId: ID
  reservation: PicturereservationRestaurant
  userId: ID
  user: PictureuserUser
}

input PlacepoliciesPolicies {
  checkInEndTime: Float!
  checkInStartTime: Float!
  checkoutTime: Float!
}

type PlacePreviousValues {
  description: String!
  id: ID!
  maxGuests: Int!
  name: String
  numBaths: Int!
  numBedrooms: Int!
  numBeds: Int!
  popularity: Int!
  shortDescription: String!
  size: PLACE_SIZES
  slug: String!
}

input PlacepricingPricing {
  averageMonthly: Int!
  averageWeekly: Int!
  basePrice: Int!
  cleaningFee: Int
  currency: CURRENCY
  extraGuests: Int
  monthlyDiscount: Int
  perNight: Int!
  securityDeposit: Int
  smartPricing: Boolean
  weekendPricing: Int
  weeklyDiscount: Int
}

input PlacereviewsReview {
  accuracy: Int!
  checkIn: Int!
  cleanliness: Int!
  communication: Int!
  location: Int!
  stars: Int!
  text: String!
  value: Int!
  experienceId: ID
  experience: ReviewexperienceExperience
}

input PlaceSubscriptionFilter {
  # Logical AND on all given filters.
  AND: [PlaceSubscriptionFilter!]
  # Logical OR on all given filters.
  OR: [PlaceSubscriptionFilter!]
  # The subscription event gets dispatched when it's listed in mutation_in.
  mutation_in: [_ModelMutationType!]
  # The subscription event gets only dispatched when one of the updated field names is included in this list.
  updatedFields_contains: String
  # The subscription event gets only dispatched when all of the field names included in this list have been updated.
  updatedFields_contains_every: [String!]
  # The subscription event gets only dispatched when some of the field names included in this list have been updated.
  updatedFields_contains_some: [String!]
  node: PlaceSubscriptionFilterNode
}

input PlaceSubscriptionFilterNode {
  description: String
  # All values that are not equal to the given value.
  description_not: String
  # All values that are contained in the given list.
  description_in: [String!]
  # All values that are not contained in the given list.
  description_not_in: [String!]
  # All values less than the given value.
  description_lt: String
  # All values less than or equal to the given value.
  description_lte: String
  # All values greater than the given value.
  description_gt: String
  # All values greater than or equal to the given value.
  description_gte: String
  # All values containing the given string.
  description_contains: String
  # All values not containing the given string.
  description_not_contains: String
  # All values starting with the given string.
  description_starts_with: String
  # All values not starting with the given string.
  description_not_starts_with: String
  # All values ending with the given string.
  description_ends_with: String
  # All values not ending with the given string.
  description_not_ends_with: String
  id: ID
  # All values that are not equal to the given value.
  id_not: ID
  # All values that are contained in the given list.
  id_in: [ID!]
  # All values that are not contained in the given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal to the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal to the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  maxGuests: Int
  # All values that are not equal to the given value.
  maxGuests_not: Int
  # All values that are contained in the given list.
  maxGuests_in: [Int!]
  # All values that are not contained in the given list.
  maxGuests_not_in: [Int!]
  # All values less than the given value.
  maxGuests_lt: Int
  # All values less than or equal to the given value.
  maxGuests_lte: Int
  # All values greater than the given value.
  maxGuests_gt: Int
  # All values greater than or equal to the given value.
  maxGuests_gte: Int
  name: String
  # All values that are not equal to the given value.
  name_not: String
  # All values that are contained in the given list.
  name_in: [String!]
  # All values that are not contained in the given list.
  name_not_in: [String!]
  # All values less than the given value.
  name_lt: String
  # All values less than or equal to the given value.
  name_lte: String
  # All values greater than the given value.
  name_gt: String
  # All values greater than or equal to the given value.
  name_gte: String
  # All values containing the given string.
  name_contains: String
  # All values not containing the given string.
  name_not_contains: String
  # All values starting with the given string.
  name_starts_with: String
  # All values not starting with the given string.
  name_not_starts_with: String
  # All values ending with the given string.
  name_ends_with: String
  # All values not ending with the given string.
  name_not_ends_with: String
  numBaths: Int
  # All values that are not equal to the given value.
  numBaths_not: Int
  # All values that are contained in the given list.
  numBaths_in: [Int!]
  # All values that are not contained in the given list.
  numBaths_not_in: [Int!]
  # All values less than the given value.
  numBaths_lt: Int
  # All values less than or equal to the given value.
  numBaths_lte: Int
  # All values greater than the given value.
  numBaths_gt: Int
  # All values greater than or equal to the given value.
  numBaths_gte: Int
  numBedrooms: Int
  # All values that are not equal to the given value.
  numBedrooms_not: Int
  # All values that are contained in the given list.
  numBedrooms_in: [Int!]
  # All values that are not contained in the given list.
  numBedrooms_not_in: [Int!]
  # All values less than the given value.
  numBedrooms_lt: Int
  # All values less than or equal to the given value.
  numBedrooms_lte: Int
  # All values greater than the given value.
  numBedrooms_gt: Int
  # All values greater than or equal to the given value.
  numBedrooms_gte: Int
  numBeds: Int
  # All values that are not equal to the given value.
  numBeds_not: Int
  # All values that are contained in the given list.
  numBeds_in: [Int!]
  # All values that are not contained in the given list.
  numBeds_not_in: [Int!]
  # All values less than the given value.
  numBeds_lt: Int
  # All values less than or equal to the given value.
  numBeds_lte: Int
  # All values greater than the given value.
  numBeds_gt: Int
  # All values greater than or equal to the given value.
  numBeds_gte: Int
  popularity: Int
  # All values that are not equal to the given value.
  popularity_not: Int
  # All values that are contained in the given list.
  popularity_in: [Int!]
  # All values that are not contained in the given list.
  popularity_not_in: [Int!]
  # All values less than the given value.
  popularity_lt: Int
  # All values less than or equal to the given value.
  popularity_lte: Int
  # All values greater than the given value.
  popularity_gt: Int
  # All values greater than or equal to the given value.
  popularity_gte: Int
  shortDescription: String
  # All values that are not equal to the given value.
  shortDescription_not: String
  # All values that are contained in the given list.
  shortDescription_in: [String!]
  # All values that are not contained in the given list.
  shortDescription_not_in: [String!]
  # All values less than the given value.
  shortDescription_lt: String
  # All values less than or equal to the given value.
  shortDescription_lte: String
  # All values greater than the given value.
  shortDescription_gt: String
  # All values greater than or equal to the given value.
  shortDescription_gte: String
  # All values containing the given string.
  shortDescription_contains: String
  # All values not containing the given string.
  shortDescription_not_contains: String
  # All values starting with the given string.
  shortDescription_starts_with: String
  # All values not starting with the given string.
  shortDescription_not_starts_with: String
  # All values ending with the given string.
  shortDescription_ends_with: String
  # All values not ending with the given string.
  shortDescription_not_ends_with: String
  size: PLACE_SIZES
  # All values that are not equal to the given value.
  size_not: PLACE_SIZES
  # All values that are contained in the given list.
  size_in: [PLACE_SIZES!]
  # All values that are not contained in the given list.
  size_not_in: [PLACE_SIZES!]
  slug: String
  # All values that are not equal to the given value.
  slug_not: String
  # All values that are contained in the given list.
  slug_in: [String!]
  # All values that are not contained in the given list.
  slug_not_in: [String!]
  # All values less than the given value.
  slug_lt: String
  # All values less than or equal to the given value.
  slug_lte: String
  # All values greater than the given value.
  slug_gt: String
  # All values greater than or equal to the given value.
  slug_gte: String
  # All values containing the given string.
  slug_contains: String
  # All values not containing the given string.
  slug_not_contains: String
  # All values starting with the given string.
  slug_starts_with: String
  # All values not starting with the given string.
  slug_not_starts_with: String
  # All values ending with the given string.
  slug_ends_with: String
  # All values not ending with the given string.
  slug_not_ends_with: String
  amenities: AmenitiesFilter
  bookings_every: BookingFilter
  bookings_some: BookingFilter
  bookings_none: BookingFilter
  guestRequirements: GuestRequirementsFilter
  host: UserFilter
  houseRules: HouseRulesFilter
  location: LocationFilter
  pictures_every: PictureFilter
  pictures_some: PictureFilter
  pictures_none: PictureFilter
  policies: PoliciesFilter
  pricing: PricingFilter
  reviews_every: ReviewFilter
  reviews_some: ReviewFilter
  reviews_none: ReviewFilter
  views: PlaceViewsFilter
}

type PlaceSubscriptionPayload {
  mutation: _ModelMutationType!
  node: Place
  updatedFields: [String!]
  previousValues: PlacePreviousValues
}

type PlaceViews implements Node {
  id: ID!
  lastWeek: Int!
  place(filter: PlaceFilter): Place!
}

input PlaceViewsFilter {
  # Logical AND on all given filters.
  AND: [PlaceViewsFilter!]
  # Logical OR on all given filters.
  OR: [PlaceViewsFilter!]
  id: ID
  # All values that are not equal to the given value.
  id_not: ID
  # All values that are contained in the given list.
  id_in: [ID!]
  # All values that are not contained in the given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal to the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal to the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  lastWeek: Int
  # All values that are not equal to the given value.
  lastWeek_not: Int
  # All values that are contained in the given list.
  lastWeek_in: [Int!]
  # All values that are not contained in the given list.
  lastWeek_not_in: [Int!]
  # All values less than the given value.
  lastWeek_lt: Int
  # All values less than or equal to the given value.
  lastWeek_lte: Int
  # All values greater than the given value.
  lastWeek_gt: Int
  # All values greater than or equal to the given value.
  lastWeek_gte: Int
  place: PlaceFilter
}

enum PlaceViewsOrderBy {
  id_ASC
  id_DESC
  lastWeek_ASC
  lastWeek_DESC
}

input PlaceViewsplacePlace {
  description: String!
  maxGuests: Int!
  name: String
  numBaths: Int!
  numBedrooms: Int!
  numBeds: Int!
  popularity: Int!
  shortDescription: String!
  size: PLACE_SIZES
  slug: String!
  amenitiesId: ID
  amenities: PlaceamenitiesAmenities
  guestRequirementsId: ID
  guestRequirements: PlaceguestRequirementsGuestRequirements
  hostId: ID
  host: PlacehostUser
  houseRulesId: ID
  houseRules: PlacehouseRulesHouseRules
  locationId: ID
  location: PlacelocationLocation
  policiesId: ID
  policies: PlacepoliciesPolicies
  pricingId: ID
  pricing: PlacepricingPricing
  bookingsIds: [ID!]
  bookings: [PlacebookingsBooking!]
  picturesIds: [ID!]
  pictures: [PlacepicturesPicture!]
  reviewsIds: [ID!]
  reviews: [PlacereviewsReview!]
}

input PlaceviewsPlaceViews {
  lastWeek: Int!
}

type PlaceViewsPreviousValues {
  id: ID!
  lastWeek: Int!
}

input PlaceViewsSubscriptionFilter {
  # Logical AND on all given filters.
  AND: [PlaceViewsSubscriptionFilter!]
  # Logical OR on all given filters.
  OR: [PlaceViewsSubscriptionFilter!]
  # The subscription event gets dispatched when it's listed in mutation_in.
  mutation_in: [_ModelMutationType!]
  # The subscription event gets only dispatched when one of the updated field names is included in this list.
  updatedFields_contains: String
  # The subscription event gets only dispatched when all of the field names included in this list have been updated.
  updatedFields_contains_every: [String!]
  # The subscription event gets only dispatched when some of the field names included in this list have been updated.
  updatedFields_contains_some: [String!]
  node: PlaceViewsSubscriptionFilterNode
}

input PlaceViewsSubscriptionFilterNode {
  id: ID
  # All values that are not equal to the given value.
  id_not: ID
  # All values that are contained in the given list.
  id_in: [ID!]
  # All values that are not contained in the given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal to the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal to the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  lastWeek: Int
  # All values that are not equal to the given value.
  lastWeek_not: Int
  # All values that are contained in the given list.
  lastWeek_in: [Int!]
  # All values that are not contained in the given list.
  lastWeek_not_in: [Int!]
  # All values less than the given value.
  lastWeek_lt: Int
  # All values less than or equal to the given value.
  lastWeek_lte: Int
  # All values greater than the given value.
  lastWeek_gt: Int
  # All values greater than or equal to the given value.
  lastWeek_gte: Int
  place: PlaceFilter
}

type PlaceViewsSubscriptionPayload {
  mutation: _ModelMutationType!
  node: PlaceViews
  updatedFields: [String!]
  previousValues: PlaceViewsPreviousValues
}

type Policies implements Node {
  checkInEndTime: Float!
  checkInStartTime: Float!
  checkoutTime: Float!
  createdAt: DateTime!
  id: ID!
  place(filter: PlaceFilter): Place!
  updatedAt: DateTime!
}

input PoliciesFilter {
  # Logical AND on all given filters.
  AND: [PoliciesFilter!]
  # Logical OR on all given filters.
  OR: [PoliciesFilter!]
  checkInEndTime: Float
  # All values that are not equal to the given value.
  checkInEndTime_not: Float
  # All values that are contained in the given list.
  checkInEndTime_in: [Float!]
  # All values that are not contained in the given list.
  checkInEndTime_not_in: [Float!]
  # All values less than the given value.
  checkInEndTime_lt: Float
  # All values less than or equal to the given value.
  checkInEndTime_lte: Float
  # All values greater than the given value.
  checkInEndTime_gt: Float
  # All values greater than or equal to the given value.
  checkInEndTime_gte: Float
  checkInStartTime: Float
  # All values that are not equal to the given value.
  checkInStartTime_not: Float
  # All values that are contained in the given list.
  checkInStartTime_in: [Float!]
  # All values that are not contained in the given list.
  checkInStartTime_not_in: [Float!]
  # All values less than the given value.
  checkInStartTime_lt: Float
  # All values less than or equal to the given value.
  checkInStartTime_lte: Float
  # All values greater than the given value.
  checkInStartTime_gt: Float
  # All values greater than or equal to the given value.
  checkInStartTime_gte: Float
  checkoutTime: Float
  # All values that are not equal to the given value.
  checkoutTime_not: Float
  # All values that are contained in the given list.
  checkoutTime_in: [Float!]
  # All values that are not contained in the given list.
  checkoutTime_not_in: [Float!]
  # All values less than the given value.
  checkoutTime_lt: Float
  # All values less than or equal to the given value.
  checkoutTime_lte: Float
  # All values greater than the given value.
  checkoutTime_gt: Float
  # All values greater than or equal to the given value.
  checkoutTime_gte: Float
  createdAt: DateTime
  # All values that are not equal to the given value.
  createdAt_not: DateTime
  # All values that are contained in the given list.
  createdAt_in: [DateTime!]
  # All values that are not contained in the given list.
  createdAt_not_in: [DateTime!]
  # All values less than the given value.
  createdAt_lt: DateTime
  # All values less than or equal to the given value.
  createdAt_lte: DateTime
  # All values greater than the given value.
  createdAt_gt: DateTime
  # All values greater than or equal to the given value.
  createdAt_gte: DateTime
  id: ID
  # All values that are not equal to the given value.
  id_not: ID
  # All values that are contained in the given list.
  id_in: [ID!]
  # All values that are not contained in the given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal to the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal to the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  updatedAt: DateTime
  # All values that are not equal to the given value.
  updatedAt_not: DateTime
  # All values that are contained in the given list.
  updatedAt_in: [DateTime!]
  # All values that are not contained in the given list.
  updatedAt_not_in: [DateTime!]
  # All values less than the given value.
  updatedAt_lt: DateTime
  # All values less than or equal to the given value.
  updatedAt_lte: DateTime
  # All values greater than the given value.
  updatedAt_gt: DateTime
  # All values greater than or equal to the given value.
  updatedAt_gte: DateTime
  place: PlaceFilter
}

enum PoliciesOrderBy {
  checkInEndTime_ASC
  checkInEndTime_DESC
  checkInStartTime_ASC
  checkInStartTime_DESC
  checkoutTime_ASC
  checkoutTime_DESC
  createdAt_ASC
  createdAt_DESC
  id_ASC
  id_DESC
  updatedAt_ASC
  updatedAt_DESC
}

input PoliciesplacePlace {
  description: String!
  maxGuests: Int!
  name: String
  numBaths: Int!
  numBedrooms: Int!
  numBeds: Int!
  popularity: Int!
  shortDescription: String!
  size: PLACE_SIZES
  slug: String!
  amenitiesId: ID
  amenities: PlaceamenitiesAmenities
  guestRequirementsId: ID
  guestRequirements: PlaceguestRequirementsGuestRequirements
  hostId: ID
  host: PlacehostUser
  houseRulesId: ID
  houseRules: PlacehouseRulesHouseRules
  locationId: ID
  location: PlacelocationLocation
  pricingId: ID
  pricing: PlacepricingPricing
  viewsId: ID
  views: PlaceviewsPlaceViews
  bookingsIds: [ID!]
  bookings: [PlacebookingsBooking!]
  picturesIds: [ID!]
  pictures: [PlacepicturesPicture!]
  reviewsIds: [ID!]
  reviews: [PlacereviewsReview!]
}

type PoliciesPreviousValues {
  checkInEndTime: Float!
  checkInStartTime: Float!
  checkoutTime: Float!
  createdAt: DateTime!
  id: ID!
  updatedAt: DateTime!
}

input PoliciesSubscriptionFilter {
  # Logical AND on all given filters.
  AND: [PoliciesSubscriptionFilter!]
  # Logical OR on all given filters.
  OR: [PoliciesSubscriptionFilter!]
  # The subscription event gets dispatched when it's listed in mutation_in
  mutation_in: [_ModelMutationType!]
  # The subscription event gets only dispatched when one of the updated fields names is included in this list
  updatedFields_contains: String
  # The subscription event gets only dispatched when all of the field names included in this list have been updated
  updatedFields_contains_every: [String!]
  # The subscription event gets only dispatched when some of the field names included in this list have been updated
  updatedFields_contains_some: [String!]
  node: PoliciesSubscriptionFilterNode
}

input PoliciesSubscriptionFilterNode {
  checkInEndTime: Float
  # All values that are not equal to given value.
  checkInEndTime_not: Float
  # All values that are contained in given list.
  checkInEndTime_in: [Float!]
  # All values that are not contained in given list.
  checkInEndTime_not_in: [Float!]
  # All values less than the given value.
  checkInEndTime_lt: Float
  # All values less than or equal the given value.
  checkInEndTime_lte: Float
  # All values greater than the given value.
  checkInEndTime_gt: Float
  # All values greater than or equal the given value.
  checkInEndTime_gte: Float
  checkInStartTime: Float
  # All values that are not equal to given value.
  checkInStartTime_not: Float
  # All values that are contained in given list.
  checkInStartTime_in: [Float!]
  # All values that are not contained in given list.
  checkInStartTime_not_in: [Float!]
  # All values less than the given value.
  checkInStartTime_lt: Float
  # All values less than or equal the given value.
  checkInStartTime_lte: Float
  # All values greater than the given value.
  checkInStartTime_gt: Float
  # All values greater than or equal the given value.
  checkInStartTime_gte: Float
  checkoutTime: Float
  # All values that are not equal to given value.
  checkoutTime_not: Float
  # All values that are contained in given list.
  checkoutTime_in: [Float!]
  # All values that are not contained in given list.
  checkoutTime_not_in: [Float!]
  # All values less than the given value.
  checkoutTime_lt: Float
  # All values less than or equal the given value.
  checkoutTime_lte: Float
  # All values greater than the given value.
  checkoutTime_gt: Float
  # All values greater than or equal the given value.
  checkoutTime_gte: Float
  createdAt: DateTime
  # All values that are not equal to given value.
  createdAt_not: DateTime
  # All values that are contained in given list.
  createdAt_in: [DateTime!]
  # All values that are not contained in given list.
  createdAt_not_in: [DateTime!]
  # All values less than the given value.
  createdAt_lt: DateTime
  # All values less than or equal the given value.
  createdAt_lte: DateTime
  # All values greater than the given value.
  createdAt_gt: DateTime
  # All values greater than or equal the given value.
  createdAt_gte: DateTime
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  updatedAt: DateTime
  # All values that are not equal to given value.
  updatedAt_not: DateTime
  # All values that are contained in given list.
  updatedAt_in: [DateTime!]
  # All values that are not contained in given list.
  updatedAt_not_in: [DateTime!]
  # All values less than the given value.
  updatedAt_lt: DateTime
  # All values less than or equal the given value.
  updatedAt_lte: DateTime
  # All values greater than the given value.
  updatedAt_gt: DateTime
  # All values greater than or equal the given value.
  updatedAt_gte: DateTime
  place: PlaceFilter
}

type PoliciesSubscriptionPayload {
  mutation: _ModelMutationType!
  node: Policies
  updatedFields: [String!]
  previousValues: PoliciesPreviousValues
}

type Pricing implements Node {
  averageMonthly: Int!
  averageWeekly: Int!
  basePrice: Int!
  cleaningFee: Int
  currency: CURRENCY
  extraGuests: Int
  id: ID!
  monthlyDiscount: Int
  perNight: Int!
  place(filter: PlaceFilter): Place!
  securityDeposit: Int
  smartPricing: Boolean!
  weekendPricing: Int
  weeklyDiscount: Int
}

input PricingFilter {
  # Logical AND on all given filters.
  AND: [PricingFilter!]
  # Logical OR on all given filters.
  OR: [PricingFilter!]
  averageMonthly: Int
  # All values that are not equal to given value.
  averageMonthly_not: Int
  # All values that are contained in given list.
  averageMonthly_in: [Int!]
  # All values that are not contained in given list.
  averageMonthly_not_in: [Int!]
  # All values less than the given value.
  averageMonthly_lt: Int
  # All values less than or equal the given value.
  averageMonthly_lte: Int
  # All values greater than the given value.
  averageMonthly_gt: Int
  # All values greater than or equal the given value.
  averageMonthly_gte: Int
  averageWeekly: Int
  # All values that are not equal to given value.
  averageWeekly_not: Int
  # All values that are contained in given list.
  averageWeekly_in: [Int!]
  # All values that are not contained in given list.
  averageWeekly_not_in: [Int!]
  # All values less than the given value.
  averageWeekly_lt: Int
  # All values less than or equal the given value.
  averageWeekly_lte: Int
  # All values greater than the given value.
  averageWeekly_gt: Int
  # All values greater than or equal the given value.
  averageWeekly_gte: Int
  basePrice: Int
  # All values that are not equal to given value.
  basePrice_not: Int
  # All values that are contained in given list.
  basePrice_in: [Int!]
  # All values that are not contained in given list.
  basePrice_not_in: [Int!]
  # All values less than the given value.
  basePrice_lt: Int
  # All values less than or equal the given value.
  basePrice_lte: Int
  # All values greater than the given value.
  basePrice_gt: Int
  # All values greater than or equal the given value.
  basePrice_gte: Int
  cleaningFee: Int
  # All values that are not equal to given value.
  cleaningFee_not: Int
  # All values that are contained in given list.
  cleaningFee_in: [Int!]
  # All values that are not contained in given list.
  cleaningFee_not_in: [Int!]
  # All values less than the given value.
  cleaningFee_lt: Int
  # All values less than or equal the given value.
  cleaningFee_lte: Int
  # All values greater than the given value.
  cleaningFee_gt: Int
  # All values greater than or equal the given value.
  cleaningFee_gte: Int
  currency: CURRENCY
  # All values that are not equal to given value.
  currency_not: CURRENCY
  # All values that are contained in given list.
  currency_in: [CURRENCY!]
  # All values that are not contained in given list.
  currency_not_in: [CURRENCY!]
  extraGuests: Int
  # All values that are not equal to given value.
  extraGuests_not: Int
  # All values that are contained in given list.
  extraGuests_in: [Int!]
  # All values that are not contained in given list.
  extraGuests_not_in: [Int!]
  # All values less than the given value.
  extraGuests_lt: Int
  # All values less than or equal the given value.
  extraGuests_lte: Int
  # All values greater than the given value.
  extraGuests_gt: Int
  # All values greater than or equal the given value.
  extraGuests_gte: Int
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  monthlyDiscount: Int
  # All values that are not equal to given value.
  monthlyDiscount_not: Int
  # All values that are contained in given list.
  monthlyDiscount_in: [Int!]
  # All values that are not contained in given list.
  monthlyDiscount_not_in: [Int!]
  # All values less than the given value.
  monthlyDiscount_lt: Int
  # All values less than or equal the given value.
  monthlyDiscount_lte: Int
  # All values greater than the given value.
  monthlyDiscount_gt: Int
  # All values greater than or equal the given value.
  monthlyDiscount_gte: Int
  perNight: Int
  # All values that are not equal to given value.
  perNight_not: Int
  # All values that are contained in given list.
  perNight_in: [Int!]
  # All values that are not contained in given list.
  perNight_not_in: [Int!]
  # All values less than the given value.
  perNight_lt: Int
  # All values less than or equal the given value.
  perNight_lte: Int
  # All values greater than the given value.
  perNight_gt: Int
  # All values greater than or equal the given value.
  perNight_gte: Int
  securityDeposit: Int
  # All values that are not equal to given value.
  securityDeposit_not: Int
  # All values that are contained in given list.
  securityDeposit_in: [Int!]
  # All values that are not contained in given list.
  securityDeposit_not_in: [Int!]
  # All values less than the given value.
  securityDeposit_lt: Int
  # All values less than or equal the given value.
  securityDeposit_lte: Int
  # All values greater than the given value.
  securityDeposit_gt: Int
  # All values greater than or equal the given value.
  securityDeposit_gte: Int
  smartPricing: Boolean
  # All values that are not equal to given value.
  smartPricing_not: Boolean
  weekendPricing: Int
  # All values that are not equal to given value.
  weekendPricing_not: Int
  # All values that are contained in given list.
  weekendPricing_in: [Int!]
  # All values that are not contained in given list.
  weekendPricing_not_in: [Int!]
  # All values less than the given value.
  weekendPricing_lt: Int
  # All values less than or equal the given value.
  weekendPricing_lte: Int
  # All values greater than the given value.
  weekendPricing_gt: Int
  # All values greater than or equal the given value.
  weekendPricing_gte: Int
  weeklyDiscount: Int
  # All values that are not equal to given value.
  weeklyDiscount_not: Int
  # All values that are contained in given list.
  weeklyDiscount_in: [Int!]
  # All values that are not contained in given list.
  weeklyDiscount_not_in: [Int!]
  # All values less than the given value.
  weeklyDiscount_lt: Int
  # All values less than or equal the given value.
  weeklyDiscount_lte: Int
  # All values greater than the given value.
  weeklyDiscount_gt: Int
  # All values greater than or equal the given value.
  weeklyDiscount_gte: Int
  place: PlaceFilter
}

enum PricingOrderBy {
  averageMonthly_ASC
  averageMonthly_DESC
  averageWeekly_ASC
  averageWeekly_DESC
  basePrice_ASC
  basePrice_DESC
  cleaningFee_ASC
  cleaningFee_DESC
  currency_ASC
  currency_DESC
  extraGuests_ASC
  extraGuests_DESC
  id_ASC
  id_DESC
  monthlyDiscount_ASC
  monthlyDiscount_DESC
  perNight_ASC
  perNight_DESC
  securityDeposit_ASC
  securityDeposit_DESC
  smartPricing_ASC
  smartPricing_DESC
  weekendPricing_ASC
  weekendPricing_DESC
  weeklyDiscount_ASC
  weeklyDiscount_DESC
}

input PricingplacePlace {
  description: String!
  maxGuests: Int!
  name: String
  numBaths: Int!
  numBedrooms: Int!
  numBeds: Int!
  popularity: Int!
  shortDescription: String!
  size: PLACE_SIZES
  slug: String!
  amenitiesId: ID
  amenities: PlaceamenitiesAmenities
  guestRequirementsId: ID
  guestRequirements: PlaceguestRequirementsGuestRequirements
  hostId: ID
  host: PlacehostUser
  houseRulesId: ID
  houseRules: PlacehouseRulesHouseRules
  locationId: ID
  location: PlacelocationLocation
  policiesId: ID
  policies: PlacepoliciesPolicies
  viewsId: ID
  views: PlaceviewsPlaceViews
  bookingsIds: [ID!]
  bookings: [PlacebookingsBooking!]
  picturesIds: [ID!]
  pictures: [PlacepicturesPicture!]
  reviewsIds: [ID!]
  reviews: [PlacereviewsReview!]
}

type PricingPreviousValues {
  averageMonthly: Int!
  averageWeekly: Int!
  basePrice: Int!
  cleaningFee: Int
  currency: CURRENCY
  extraGuests: Int
  id: ID!
  monthlyDiscount: Int
  perNight: Int!
  securityDeposit: Int
  smartPricing: Boolean!
  weekendPricing: Int
  weeklyDiscount: Int
}

input PricingSubscriptionFilter {
  # Logical AND on all given filters.
  AND: [PricingSubscriptionFilter!]
  # Logical OR on all given filters.
  OR: [PricingSubscriptionFilter!]
  # The subscription event gets dispatched when it's listed in mutation_in
  mutation_in: [_ModelMutationType!]
  # The subscription event gets only dispatched when one of the updated fields names is included in this list
  updatedFields_contains: String
  # The subscription event gets only dispatched when all of the field names included in this list have been updated
  updatedFields_contains_every: [String!]
  # The subscription event gets only dispatched when some of the field names included in this list have been updated
  updatedFields_contains_some: [String!]
  node: PricingSubscriptionFilterNode
}

input PricingSubscriptionFilterNode {
  averageMonthly: Int
  # All values that are not equal to given value.
  averageMonthly_not: Int
  # All values that are contained in given list.
  averageMonthly_in: [Int!]
  # All values that are not contained in given list.
  averageMonthly_not_in: [Int!]
  # All values less than the given value.
  averageMonthly_lt: Int
  # All values less than or equal the given value.
  averageMonthly_lte: Int
  # All values greater than the given value.
  averageMonthly_gt: Int
  # All values greater than or equal the given value.
  averageMonthly_gte: Int
  averageWeekly: Int
  # All values that are not equal to given value.
  averageWeekly_not: Int
  # All values that are contained in given list.
  averageWeekly_in: [Int!]
  # All values that are not contained in given list.
  averageWeekly_not_in: [Int!]
  # All values less than the given value.
  averageWeekly_lt: Int
  # All values less than or equal the given value.
  averageWeekly_lte: Int
  # All values greater than the given value.
  averageWeekly_gt: Int
  # All values greater than or equal the given value.
  averageWeekly_gte: Int
  basePrice: Int
  # All values that are not equal to given value.
  basePrice_not: Int
  # All values that are contained in given list.
  basePrice_in: [Int!]
  # All values that are not contained in given list.
  basePrice_not_in: [Int!]
  # All values less than the given value.
  basePrice_lt: Int
  # All values less than or equal the given value.
  basePrice_lte: Int
  # All values greater than the given value.
  basePrice_gt: Int
  # All values greater than or equal the given value.
  basePrice_gte: Int
  cleaningFee: Int
  # All values that are not equal to given value.
  cleaningFee_not: Int
  # All values that are contained in given list.
  cleaningFee_in: [Int!]
  # All values that are not contained in given list.
  cleaningFee_not_in: [Int!]
  # All values less than the given value.
  cleaningFee_lt: Int
  # All values less than or equal the given value.
  cleaningFee_lte: Int
  # All values greater than the given value.
  cleaningFee_gt: Int
  # All values greater than or equal the given value.
  cleaningFee_gte: Int
  currency: CURRENCY
  # All values that are not equal to given value.
  currency_not: CURRENCY
  # All values that are contained in given list.
  currency_in: [CURRENCY!]
  # All values that are not contained in given list.
  currency_not_in: [CURRENCY!]
  extraGuests: Int
  # All values that are not equal to given value.
  extraGuests_not: Int
  # All values that are contained in given list.
  extraGuests_in: [Int!]
  # All values that are not contained in given list.
  extraGuests_not_in: [Int!]
  # All values less than the given value.
  extraGuests_lt: Int
  # All values less than or equal the given value.
  extraGuests_lte: Int
  # All values greater than the given value.
  extraGuests_gt: Int
  # All values greater than or equal the given value.
  extraGuests_gte: Int
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  monthlyDiscount: Int
  # All values that are not equal to given value.
  monthlyDiscount_not: Int
  # All values that are contained in given list.
  monthlyDiscount_in: [Int!]
  # All values that are not contained in given list.
  monthlyDiscount_not_in: [Int!]
  # All values less than the given value.
  monthlyDiscount_lt: Int
  # All values less than or equal the given value.
  monthlyDiscount_lte: Int
  # All values greater than the given value.
  monthlyDiscount_gt: Int
  # All values greater than or equal the given value.
  monthlyDiscount_gte: Int
  perNight: Int
  # All values that are not equal to given value.
  perNight_not: Int
  # All values that are contained in given list.
  perNight_in: [Int!]
  # All values that are not contained in given list.
  perNight_not_in: [Int!]
  # All values less than the given value.
  perNight_lt: Int
  # All values less than or equal the given value.
  perNight_lte: Int
  # All values greater than the given value.
  perNight_gt: Int
  # All values greater than or equal the given value.
  perNight_gte: Int
  securityDeposit: Int
  # All values that are not equal to given value.
  securityDeposit_not: Int
  # All values that are contained in given list.
  securityDeposit_in: [Int!]
  # All values that are not contained in given list.
  securityDeposit_not_in: [Int!]
  # All values less than the given value.
  securityDeposit_lt: Int
  # All values less than or equal the given value.
  securityDeposit_lte: Int
  # All values greater than the given value.
  securityDeposit_gt: Int
  # All values greater than or equal the given value.
  securityDeposit_gte: Int
  smartPricing: Boolean
  # All values that are not equal to given value.
  smartPricing_not: Boolean
  weekendPricing: Int
  # All values that are not equal to given value.
  weekendPricing_not: Int
  # All values that are contained in given list.
  weekendPricing_in: [Int!]
  # All values that are not contained in given list.
  weekendPricing_not_in: [Int!]
  # All values less than the given value.
  weekendPricing_lt: Int
  # All values less than or equal the given value.
  weekendPricing_lte: Int
  # All values greater than the given value.
  weekendPricing_gt: Int
  # All values greater than or equal the given value.
  weekendPricing_gte: Int
  weeklyDiscount: Int
  # All values that are not equal to given value.
  weeklyDiscount_not: Int
  # All values that are contained in given list.
  weeklyDiscount_in: [Int!]
  # All values that are not contained in given list.
  weeklyDiscount_not_in: [Int!]
  # All values less than the given value.
  weeklyDiscount_lt: Int
  # All values less than or equal the given value.
  weeklyDiscount_lte: Int
  # All values greater than the given value.
  weeklyDiscount_gt: Int
  # All values greater than or equal the given value.
  weeklyDiscount_gte: Int
  place: PlaceFilter
}

type PricingSubscriptionPayload {
  mutation: _ModelMutationType!
  node: Pricing
  updatedFields: [String!]
  previousValues: PricingPreviousValues
}

type Query {
  allAmenitieses(filter: AmenitiesFilter, orderBy: AmenitiesOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Amenities!]!
  allBookings(filter: BookingFilter, orderBy: BookingOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Booking!]!
  allCities(filter: CityFilter, orderBy: CityOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [City!]!
  allCreditCardInformations(filter: CreditCardInformationFilter, orderBy: CreditCardInformationOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [CreditCardInformation!]!
  allExperiences(filter: ExperienceFilter, orderBy: ExperienceOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Experience!]!
  allExperienceCategories(filter: ExperienceCategoryFilter, orderBy: ExperienceCategoryOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [ExperienceCategory!]!
  allGuestRequirementses(filter: GuestRequirementsFilter, orderBy: GuestRequirementsOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [GuestRequirements!]!
  allHouseRuleses(filter: HouseRulesFilter, orderBy: HouseRulesOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [HouseRules!]!
  allLocations(filter: LocationFilter, orderBy: LocationOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Location!]!
  allMessages(filter: MessageFilter, orderBy: MessageOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Message!]!
  allNeighbourhoods(filter: NeighbourhoodFilter, orderBy: NeighbourhoodOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Neighbourhood!]!
  allNotifications(filter: NotificationFilter, orderBy: NotificationOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Notification!]!
  allPayments(filter: PaymentFilter, orderBy: PaymentOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Payment!]!
  allPaymentAccounts(filter: PaymentAccountFilter, orderBy: PaymentAccountOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [PaymentAccount!]!
  allPaypalInformations(filter: PaypalInformationFilter, orderBy: PaypalInformationOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [PaypalInformation!]!
  allPictures(filter: PictureFilter, orderBy: PictureOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Picture!]!
  allPlaces(filter: PlaceFilter, orderBy: PlaceOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Place!]!
  allPlaceViewses(filter: PlaceViewsFilter, orderBy: PlaceViewsOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [PlaceViews!]!
  allPolicieses(filter: PoliciesFilter, orderBy: PoliciesOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Policies!]!
  allPricings(filter: PricingFilter, orderBy: PricingOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Pricing!]!
  allRestaurants(filter: RestaurantFilter, orderBy: RestaurantOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Restaurant!]!
  allReviews(filter: ReviewFilter, orderBy: ReviewOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Review!]!
  allUsers(filter: UserFilter, orderBy: UserOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [User!]!
  _allAmenitiesesMeta(filter: AmenitiesFilter, orderBy: AmenitiesOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allBookingsMeta(filter: BookingFilter, orderBy: BookingOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allCitiesMeta(filter: CityFilter, orderBy: CityOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allCreditCardInformationsMeta(filter: CreditCardInformationFilter, orderBy: CreditCardInformationOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allExperiencesMeta(filter: ExperienceFilter, orderBy: ExperienceOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allExperienceCategoriesMeta(filter: ExperienceCategoryFilter, orderBy: ExperienceCategoryOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allGuestRequirementsesMeta(filter: GuestRequirementsFilter, orderBy: GuestRequirementsOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allHouseRulesesMeta(filter: HouseRulesFilter, orderBy: HouseRulesOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allLocationsMeta(filter: LocationFilter, orderBy: LocationOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allMessagesMeta(filter: MessageFilter, orderBy: MessageOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allNeighbourhoodsMeta(filter: NeighbourhoodFilter, orderBy: NeighbourhoodOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allNotificationsMeta(filter: NotificationFilter, orderBy: NotificationOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allPaymentsMeta(filter: PaymentFilter, orderBy: PaymentOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allPaymentAccountsMeta(filter: PaymentAccountFilter, orderBy: PaymentAccountOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allPaypalInformationsMeta(filter: PaypalInformationFilter, orderBy: PaypalInformationOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allPicturesMeta(filter: PictureFilter, orderBy: PictureOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allPlacesMeta(filter: PlaceFilter, orderBy: PlaceOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allPlaceViewsesMeta(filter: PlaceViewsFilter, orderBy: PlaceViewsOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allPoliciesesMeta(filter: PoliciesFilter, orderBy: PoliciesOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allPricingsMeta(filter: PricingFilter, orderBy: PricingOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allRestaurantsMeta(filter: RestaurantFilter, orderBy: RestaurantOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allReviewsMeta(filter: ReviewFilter, orderBy: ReviewOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  _allUsersMeta(filter: UserFilter, orderBy: UserOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
  Amenities(id: ID): Amenities
  Booking(id: ID): Booking
  City(id: ID): City
  CreditCardInformation(id: ID): CreditCardInformation
  Experience(id: ID): Experience
  ExperienceCategory(id: ID): ExperienceCategory
  GuestRequirements(id: ID): GuestRequirements
  HouseRules(id: ID): HouseRules
  Location(id: ID): Location
  Message(id: ID): Message
  Neighbourhood(id: ID): Neighbourhood
  Notification(id: ID): Notification
  Payment(id: ID): Payment
  PaymentAccount(id: ID): PaymentAccount
  PaypalInformation(id: ID): PaypalInformation
  Picture(id: ID): Picture
  Place(id: ID): Place
  PlaceViews(id: ID): PlaceViews
  Policies(id: ID): Policies
  Pricing(id: ID): Pricing
  Restaurant(id: ID): Restaurant
  Review(id: ID): Review
  User(email: String, id: ID): User
  user: User
  # Fetches an object given its ID
  node(
    # The ID of an object
    id: ID!
  ): Node
}

type RemoveFromExperienceReviewsPayload {
  experienceExperience: Experience
  reviewsReview: Review
}

type RemoveFromNeighbourhoodPayload {
  locationsLocation: Location
  neighbourHoodNeighbourhood: Neighbourhood
}

type RemoveFromPlacePicturesPayload {
  placePlace: Place
  picturesPicture: Picture
}

type RemoveFromRestaurantPicturePayload {
  picturesPicture: Picture
  reservationRestaurant: Restaurant
}

type Restaurant implements Node {
  avgPricePerPerson: Int!
  createdAt: DateTime!
  id: ID!
  isCurated: Boolean!
  location(filter: LocationFilter): Location!
  pictures(filter: PictureFilter, orderBy: PictureOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): [Picture!]
  popularity: Int!
  slug: String!
  title: String!
  # Meta information about the query.
  _picturesMeta(filter: PictureFilter, orderBy: PictureOrderBy, skip: Int, after: String, before: String, first: Int, last: Int): _QueryMeta!
}

input RestaurantFilter {
  # Logical AND on all given filters.
  AND: [RestaurantFilter!]
  # Logical OR on all given filters.
  OR: [RestaurantFilter!]
  avgPricePerPerson: Int
  # All values that are not equal to given value.
  avgPricePerPerson_not: Int
  # All values that are contained in given list.
  avgPricePerPerson_in: [Int!]
  # All values that are not contained in given list.
  avgPricePerPerson_not_in: [Int!]
  # All values less than the given value.
  avgPricePerPerson_lt: Int
  # All values less than or equal the given value.
  avgPricePerPerson_lte: Int
  # All values greater than the given value.
  avgPricePerPerson_gt: Int
  # All values greater than or equal the given value.
  avgPricePerPerson_gte: Int
  createdAt: DateTime
  # All values that are not equal to given value.
  createdAt_not: DateTime
  # All values that are contained in given list.
  createdAt_in: [DateTime!]
  # All values that are not contained in given list.
  createdAt_not_in: [DateTime!]
  # All values less than the given value.
  createdAt_lt: DateTime
  # All values less than or equal the given value.
  createdAt_lte: DateTime
  # All values greater than the given value.
  createdAt_gt: DateTime
  # All values greater than or equal the given value.
  createdAt_gte: DateTime
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  isCurated: Boolean
  # All values that are not equal to given value.
  isCurated_not: Boolean
  popularity: Int
  # All values that are not equal to given value.
  popularity_not: Int
  # All values that are contained in given list.
  popularity_in: [Int!]
  # All values that are not contained in given list.
  popularity_not_in: [Int!]
  # All values less than the given value.
  popularity_lt: Int
  # All values less than or equal the given value.
  popularity_lte: Int
  # All values greater than the given value.
  popularity_gt: Int
  # All values greater than or equal the given value.
  popularity_gte: Int
  slug: String
  # All values that are not equal to given value.
  slug_not: String
  # All values that are contained in given list.
  slug_in: [String!]
  # All values that are not contained in given list.
  slug_not_in: [String!]
  # All values less than the given value.
  slug_lt: String
  # All values less than or equal the given value.
  slug_lte: String
  # All values greater than the given value.
  slug_gt: String
  # All values greater than or equal the given value.
  slug_gte: String
  # All values containing the given string.
  slug_contains: String
  # All values not containing the given string.
  slug_not_contains: String
  # All values starting with the given string.
  slug_starts_with: String
  # All values not starting with the given string.
  slug_not_starts_with: String
  # All values ending with the given string.
  slug_ends_with: String
  # All values not ending with the given string.
  slug_not_ends_with: String
  title: String
  # All values that are not equal to given value.
  title_not: String
  # All values that are contained in given list.
  title_in: [String!]
  # All values that are not contained in given list.
  title_not_in: [String!]
  # All values less than the given value.
  title_lt: String
  # All values less than or equal the given value.
  title_lte: String
  # All values greater than the given value.
  title_gt: String
  # All values greater than or equal the given value.
  title_gte: String
  # All values containing the given string.
  title_contains: String
  # All values not containing the given string.
  title_not_contains: String
  # All values starting with the given string.
  title_starts_with: String
  # All values not starting with the given string.
  title_not_starts_with: String
  # All values ending with the given string.
  title_ends_with: String
  # All values not ending with the given string.
  title_not_ends_with: String
  location: LocationFilter
  pictures_every: PictureFilter
  pictures_some: PictureFilter
  pictures_none: PictureFilter
}

input RestaurantlocationLocation {
  address: String
  directions: String
  lat: Float!
  lng: Float!
  experienceId: ID
  experience: LocationexperienceExperience
  neighbourHoodId: ID
  neighbourHood: LocationneighbourHoodNeighbourhood
  placeId: ID
  place: LocationplacePlace
  userId: ID
  user: LocationuserUser
}

enum RestaurantOrderBy {
  avgPricePerPerson_ASC
  avgPricePerPerson_DESC
  createdAt_ASC
  createdAt_DESC
  id_ASC
  id_DESC
  isCurated_ASC
  isCurated_DESC
  popularity_ASC
  popularity_DESC
  slug_ASC
  slug_DESC
  title_ASC
  title_DESC
}

input RestaurantpicturesPicture {
  url: String!
  experienceId: ID
  experience: PictureexperienceExperience
  neighbourHoodId: ID
  neighbourHood: PictureneighbourHoodNeighbourhood
  placeId: ID
  place: PictureplacePlace
  userId: ID
  user: PictureuserUser
}

type RestaurantPreviousValues {
  avgPricePerPerson: Int!
  createdAt: DateTime!
  id: ID!
  isCurated: Boolean!
  popularity: Int!
  slug: String!
  title: String!
}

input RestaurantSubscriptionFilter {
  # Logical AND on all given filters.
  AND: [RestaurantSubscriptionFilter!]
  # Logical OR on all given filters.
  OR: [RestaurantSubscriptionFilter!]
  # The subscription event gets dispatched when it's listed in mutation_in
  mutation_in: [_ModelMutationType!]
  # The subscription event gets only dispatched when one of the updated fields names is included in this list
  updatedFields_contains: String
  # The subscription event gets only dispatched when all of the field names included in this list have been updated
  updatedFields_contains_every: [String!]
  # The subscription event gets only dispatched when some of the field names included in this list have been updated
  updatedFields_contains_some: [String!]
  node: RestaurantSubscriptionFilterNode
}

input RestaurantSubscriptionFilterNode {
  avgPricePerPerson: Int
  # All values that are not equal to given value.
  avgPricePerPerson_not: Int
  # All values that are contained in given list.
  avgPricePerPerson_in: [Int!]
  # All values that are not contained in given list.
  avgPricePerPerson_not_in: [Int!]
  # All values less than the given value.
  avgPricePerPerson_lt: Int
  # All values less than or equal the given value.
  avgPricePerPerson_lte: Int
  # All values greater than the given value.
  avgPricePerPerson_gt: Int
  # All values greater than or equal the given value.
  avgPricePerPerson_gte: Int
  createdAt: DateTime
  # All values that are not equal to given value.
  createdAt_not: DateTime
  # All values that are contained in given list.
  createdAt_in: [DateTime!]
  # All values that are not contained in given list.
  createdAt_not_in: [DateTime!]
  # All values less than the given value.
  createdAt_lt: DateTime
  # All values less than or equal the given value.
  createdAt_lte: DateTime
  # All values greater than the given value.
  createdAt_gt: DateTime
  # All values greater than or equal the given value.
  createdAt_gte: DateTime
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  isCurated: Boolean
  # All values that are not equal to given value.
  isCurated_not: Boolean
  popularity: Int
  # All values that are not equal to given value.
  popularity_not: Int
  # All values that are contained in given list.
  popularity_in: [Int!]
  # All values that are not contained in given list.
  popularity_not_in: [Int!]
  # All values less than the given value.
  popularity_lt: Int
  # All values less than or equal the given value.
  popularity_lte: Int
  # All values greater than the given value.
  popularity_gt: Int
  # All values greater than or equal the given value.
  popularity_gte: Int
  slug: String
  # All values that are not equal to given value.
  slug_not: String
  # All values that are contained in given list.
  slug_in: [String!]
  # All values that are not contained in given list.
  slug_not_in: [String!]
  # All values less than the given value.
  slug_lt: String
  # All values less than or equal the given value.
  slug_lte: String
  # All values greater than the given value.
  slug_gt: String
  # All values greater than or equal the given value.
  slug_gte: String
  # All values containing the given string.
  slug_contains: String
  # All values not containing the given string.
  slug_not_contains: String
  # All values starting with the given string.
  slug_starts_with: String
  # All values not starting with the given string.
  slug_not_starts_with: String
  # All values ending with the given string.
  slug_ends_with: String
  # All values not ending with the given string.
  slug_not_ends_with: String
  title: String
  # All values that are not equal to given value.
  title_not: String
  # All values that are contained in given list.
  title_in: [String!]
  # All values that are not contained in given list.
  title_not_in: [String!]
  # All values less than the given value.
  title_lt: String
  # All values less than or equal the given value.
  title_lte: String
  # All values greater than the given value.
  title_gt: String
  # All values greater than or equal the given value.
  title_gte: String
  # All values containing the given string.
  title_contains: String
  # All values not containing the given string.
  title_not_contains: String
  # All values starting with the given string.
  title_starts_with: String
  # All values not starting with the given string.
  title_not_starts_with: String
  # All values ending with the given string.
  title_ends_with: String
  # All values not ending with the given string.
  title_not_ends_with: String
  location: LocationFilter
  pictures_every: PictureFilter
  pictures_some: PictureFilter
  pictures_none: PictureFilter
}

type RestaurantSubscriptionPayload {
  mutation: _ModelMutationType!
  node: Restaurant
  updatedFields: [String!]
  previousValues: RestaurantPreviousValues
}

type Review implements Node {
  accuracy: Int!
  checkIn: Int!
  cleanliness: Int!
  communication: Int!
  createdAt: DateTime!
  experience(filter: ExperienceFilter): Experience
  id: ID!
  location: Int!
  place(filter: PlaceFilter): Place!
  stars: Int!
  text: String!
  value: Int!
}

input ReviewexperienceExperience {
  popularity: Int!
  pricePerPerson: Int!
  title: String!
  categoryId: ID
  category: ExperiencecategoryExperienceCategory
  hostId: ID
  host: ExperiencehostUser
  locationId: ID
  location: ExperiencelocationLocation
  previewId: ID
  preview: ExperiencepreviewPicture
  reviewsIds: [ID!]
  reviews: [ExperiencereviewsReview!]
}

input ReviewFilter {
  # Logical AND on all given filters.
  AND: [ReviewFilter!]
  # Logical OR on all given filters.
  OR: [ReviewFilter!]
  accuracy: Int
  # All values that are not equal to given value.
  accuracy_not: Int
  # All values that are contained in given list.
  accuracy_in: [Int!]
  # All values that are not contained in given list.
  accuracy_not_in: [Int!]
  # All values less than the given value.
  accuracy_lt: Int
  # All values less than or equal the given value.
  accuracy_lte: Int
  # All values greater than the given value.
  accuracy_gt: Int
  # All values greater than or equal the given value.
  accuracy_gte: Int
  checkIn: Int
  # All values that are not equal to given value.
  checkIn_not: Int
  # All values that are contained in given list.
  checkIn_in: [Int!]
  # All values that are not contained in given list.
  checkIn_not_in: [Int!]
  # All values less than the given value.
  checkIn_lt: Int
  # All values less than or equal the given value.
  checkIn_lte: Int
  # All values greater than the given value.
  checkIn_gt: Int
  # All values greater than or equal the given value.
  checkIn_gte: Int
  cleanliness: Int
  # All values that are not equal to given value.
  cleanliness_not: Int
  # All values that are contained in given list.
  cleanliness_in: [Int!]
  # All values that are not contained in given list.
  cleanliness_not_in: [Int!]
  # All values less than the given value.
  cleanliness_lt: Int
  # All values less than or equal the given value.
  cleanliness_lte: Int
  # All values greater than the given value.
  cleanliness_gt: Int
  # All values greater than or equal the given value.
  cleanliness_gte: Int
  communication: Int
  # All values that are not equal to given value.
  communication_not: Int
  # All values that are contained in given list.
  communication_in: [Int!]
  # All values that are not contained in given list.
  communication_not_in: [Int!]
  # All values less than the given value.
  communication_lt: Int
  # All values less than or equal the given value.
  communication_lte: Int
  # All values greater than the given value.
  communication_gt: Int
  # All values greater than or equal the given value.
  communication_gte: Int
  createdAt: DateTime
  # All values that are not equal to given value.
  createdAt_not: DateTime
  # All values that are contained in given list.
  createdAt_in: [DateTime!]
  # All values that are not contained in given list.
  createdAt_not_in: [DateTime!]
  # All values less than the given value.
  createdAt_lt: DateTime
  # All values less than or equal the given value.
  createdAt_lte: DateTime
  # All values greater than the given value.
  createdAt_gt: DateTime
  # All values greater than or equal the given value.
  createdAt_gte: DateTime
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  location: Int
  # All values that are not equal to given value.
  location_not: Int
  # All values that are contained in given list.
  location_in: [Int!]
  # All values that are not contained in given list.
  location_not_in: [Int!]
  # All values less than the given value.
  location_lt: Int
  # All values less than or equal the given value.
  location_lte: Int
  # All values greater than the given value.
  location_gt: Int
  # All values greater than or equal the given value.
  location_gte: Int
  stars: Int
  # All values that are not equal to given value.
  stars_not: Int
  # All values that are contained in given list.
  stars_in: [Int!]
  # All values that are not contained in given list.
  stars_not_in: [Int!]
  # All values less than the given value.
  stars_lt: Int
  # All values less than or equal the given value.
  stars_lte: Int
  # All values greater than the given value.
  stars_gt: Int
  # All values greater than or equal the given value.
  stars_gte: Int
  text: String
  # All values that are not equal to given value.
  text_not: String
  # All values that are contained in given list.
  text_in: [String!]
  # All values that are not contained in given list.
  text_not_in: [String!]
  # All values less than the given value.
  text_lt: String
  # All values less than or equal the given value.
  text_lte: String
  # All values greater than the given value.
  text_gt: String
  # All values greater than or equal the given value.
  text_gte: String
  # All values containing the given string.
  text_contains: String
  # All values not containing the given string.
  text_not_contains: String
  # All values starting with the given string.
  text_starts_with: String
  # All values not starting with the given string.
  text_not_starts_with: String
  # All values ending with the given string.
  text_ends_with: String
  # All values not ending with the given string.
  text_not_ends_with: String
  value: Int
  # All values that are not equal to given value.
  value_not: Int
  # All values that are contained in given list.
  value_in: [Int!]
  # All values that are not contained in given list.
  value_not_in: [Int!]
  # All values less than the given value.
  value_lt: Int
  # All values less than or equal the given value.
  value_lte: Int
  # All values greater than the given value.
  value_gt: Int
  # All values greater than or equal the given value.
  value_gte: Int
  experience: ExperienceFilter
  place: PlaceFilter
}

enum ReviewOrderBy {
  accuracy_ASC
  accuracy_DESC
  checkIn_ASC
  checkIn_DESC
  cleanliness_ASC
  cleanliness_DESC
  communication_ASC
  communication_DESC
  createdAt_ASC
  createdAt_DESC
  id_ASC
  id_DESC
  location_ASC
  location_DESC
  stars_ASC
  stars_DESC
  text_ASC
  text_DESC
  value_ASC
  value_DESC
}

input ReviewplacePlace {
  description: String!
  maxGuests: Int!
  name: String
  numBaths: Int!
  numBedrooms: Int!
  numBeds: Int!
  popularity: Int!
  shortDescription: String!
  size: PLACE_SIZES
  slug: String!
  amenitiesId: ID
  amenities: PlaceamenitiesAmenities
  guestRequirementsId: ID
  guestRequirements: PlaceguestRequirementsGuestRequirements
  hostId: ID
  host: PlacehostUser
  houseRulesId: ID
  houseRules: PlacehouseRulesHouseRules
  locationId: ID
  location: PlacelocationLocation
  policiesId: ID
  policies: PlacepoliciesPolicies
  pricingId: ID
  pricing: PlacepricingPricing
  viewsId: ID
  views: PlaceviewsPlaceViews
  bookingsIds: [ID!]
  bookings: [PlacebookingsBooking!]
  picturesIds: [ID!]
  pictures: [PlacepicturesPicture!]
  reviewsIds: [ID!]
  reviews: [PlacereviewsReview!]
}

type ReviewPreviousValues {
  accuracy: Int!
  checkIn: Int!
  cleanliness: Int!
  communication: Int!
  createdAt: DateTime!
  id: ID!
  location: Int!
  stars: Int!
  text: String!
  value: Int!
}

input ReviewSubscriptionFilter {
  # Logical AND on all given filters.
  AND: [ReviewSubscriptionFilter!]
  # Logical OR on all given filters.
  OR: [ReviewSubscriptionFilter!]
  # The subscription event gets dispatched when it's listed in mutation_in
  mutation_in: [_ModelMutationType!]
  # The subscription event gets only dispatched when one of the updated fields names is included in this list
  updatedFields_contains: String
  # The subscription event gets only dispatched when all of the field names included in this list have been updated
  updatedFields_contains_every: [String!]
  # The subscription event gets only dispatched when some of the field names included in this list have been updated
  updatedFields_contains_some: [String!]
  node: ReviewSubscriptionFilterNode
}

input ReviewSubscriptionFilterNode {
  accuracy: Int
  # All values that are not equal to given value.
  accuracy_not: Int
  # All values that are contained in given list.
  accuracy_in: [Int!]
  # All values that are not contained in given list.
  accuracy_not_in: [Int!]
  # All values less than the given value.
  accuracy_lt: Int
  # All values less than or equal the given value.
  accuracy_lte: Int
  # All values greater than the given value.
  accuracy_gt: Int
  # All values greater than or equal the given value.
  accuracy_gte: Int
  checkIn: Int
  # All values that are not equal to given value.
  checkIn_not: Int
  # All values that are contained in given list.
  checkIn_in: [Int!]
  # All values that are not contained in given list.
  checkIn_not_in: [Int!]
  # All values less than the given value.
  checkIn_lt: Int
  # All values less than or equal the given value.
  checkIn_lte: Int
  # All values greater than the given value.
  checkIn_gt: Int
  # All values greater than or equal the given value.
  checkIn_gte: Int
  cleanliness: Int
  # All values that are not equal to given value.
  cleanliness_not: Int
  # All values that are contained in given list.
  cleanliness_in: [Int!]
  # All values that are not contained in given list.
  cleanliness_not_in: [Int!]
  # All values less than the given value.
  cleanliness_lt: Int
  # All values less than or equal the given value.
  cleanliness_lte: Int
  # All values greater than the given value.
  cleanliness_gt: Int
  # All values greater than or equal the given value.
  cleanliness_gte: Int
  communication: Int
  # All values that are not equal to given value.
  communication_not: Int
  # All values that are contained in given list.
  communication_in: [Int!]
  # All values that are not contained in given list.
  communication_not_in: [Int!]
  # All values less than the given value.
  communication_lt: Int
  # All values less than or equal the given value.
  communication_lte: Int
  # All values greater than the given value.
  communication_gt: Int
  # All values greater than or equal the given value.
  communication_gte: Int
  createdAt: DateTime
  # All values that are not equal to given value.
  createdAt_not: DateTime
  # All values that are contained in given list.
  createdAt_in: [DateTime!]
  # All values that are not contained in given list.
  createdAt_not_in: [DateTime!]
  # All values less than the given value.
  createdAt_lt: DateTime
  # All values less than or equal the given value.
  createdAt_lte: DateTime
  # All values greater than the given value.
  createdAt_gt: DateTime
  # All values greater than or equal the given value.
  createdAt_gte: DateTime
  id: ID
  # All values that are not equal to given value.
  id_not: ID
  # All values that are contained in given list.
  id_in: [ID!]
  # All values that are not contained in given list.
  id_not_in: [ID!]
  # All values less than the given value.
  id_lt: ID
  # All values less than or equal the given value.
  id_lte: ID
  # All values greater than the given value.
  id_gt: ID
  # All values greater than or equal the given value.
  id_gte: ID
  # All values containing the given string.
  id_contains: ID
  # All values not containing the given string.
  id_not_contains: ID
  # All values starting with the given string.
  id_starts_with: ID
  # All values not starting with the given string.
  id_not_starts_with: ID
  # All values ending with the given string.
  id_ends_with: ID
  # All values not ending with the given string.
  id_not_ends_with: ID
  location: Int
  # All values that are not equal to given value.
  location_not: Int
  # All values that are contained in given list.
  location_in: [Int!]
  # All values that are not contained in given list.
  location_not_in: [Int!]
  # All values less than the given value.
  location_lt: Int
  # All values less than or equal the given value.
  location_lte: Int
  # All values greater than the given value.
  location_gt: Int
  # All values greater than or equal the given value.
  location_gte: Int
  stars: Int
  # All values that are not equal to given value.
  stars_not: Int
  # All values that are contained in given list.
  stars_in: [Int!]
  # All values that are not contained in given list.
  stars_not_in: [Int!]
  # All values less than the given value.
  stars_lt: Int
  # All values less than or equal the given value.
  stars_lte: Int
  # All values greater than the given value.
  stars_gt: Int
  # All values greater than or equal the given value.
  stars_gte: Int
  text: String
  # All values that are not equal to given value.
  text_not: String
  # All values that are contained in given list.
  text_in: [String!]
  # All values that are not contained in given list.
  text_not_in: [String!]
  # All values less than the given value.
  text_lt: String
  # All values less than or equal the given value.
  text_lte: String
  # All values greater than the given value.
  text_gt: String
  # All values greater than or equal the given value.
  text_gte: String
  # All values containing the given string.
  text_contains: String
  # All values not containing the given string.
  text_not_contains: String
  # All values starting with the given string.
  text_starts_with: String
  # All values not starting with the given string.
  text_not_starts_with: String
  # All values ending with the given string.
  text_ends_with: String
  # All values not ending with the given string.
  text_not_ends_with: String
  value: Int
  # All values that are not equal to given value.
  value_not: Int
  # All values that are contained in given list.
  value_in: [Int!]
  # All values that are not contained in given list.
  value_not_in: [Int!]
  # All values less than the given value.
  value_lt: Int
  # All values less than or equal the given value.
  value_lte: Int
  # All values greater than the given value.
  value_gt: Int
  # All values greater than or equal the given value.
  value_gte: Int
  experience: ExperienceFilter
  place: PlaceFilter
}

type ReviewSubscriptionPayload {
  mutation: _ModelMutationType!
  node: Review
  updatedFields: [String!]
  previousValues: ReviewPreviousValues
}

type SetBookingPaymentPayload {
  bookingBooking: Booking
  paymentPayment: Payment
}

type SetCreditCardInformationPayload {
  paymentAccountPaymentAccount: PaymentAccount
  creditcardCreditCardInformation: CreditCardInformation
}

type SetExperienceCategoryPayload {
  experienceExperience: Experience
  categoryExperienceCategory: ExperienceCategory
}

type SetExperienceLocationPayload {
  locationLocation: Location
  experienceExperience: Experience
}

type SetExperiencePreviewPayload {
  previewPicture: Picture
  experienceExperience: Experience
}

type SetGuestRequirementsPayload {
  placePlace: Place
  guestRequirementsGuestRequirements: GuestRequirements
}

type SetHomePreviewPayload {
  neighbourHoodNeighbourhood: Neighbourhood
  homePreviewPicture: Picture
}

type SetHouseRulesPayload {
  placePlace: Place
  houseRulesHouseRules: HouseRules
}

type SetPaypalInformationPayload {
  paymentAccountPaymentAccount: PaymentAccount
  paypalPaypalInformation: PaypalInformation
}

type SetPlaceAmenitiesPayload {
  placePlace: Place
  amenitiesAmenities: Amenities
}

type SetPlaceLocationPayload {
  placePlace: Place
  locationLocation: Location
}

type SetPlacePricePayload {
  placePlace: Place
  pricingPricing: Pricing
}

type SetPlaceViewsPayload {
  placePlace: Place
  viewsPlaceViews: PlaceViews
}

type SetPoliciesPayload {
  placePlace: Place
  policiesPolicies: Policies
}

type SetProfilePicturePayload {
  userUser: User
  profilePicturePicture: Picture
}

type SetRestaurantLocationPayload {
  locationLocation: Location
  restaurantRestaurant: Restaurant
}

type SetUserLocationPayload {
  userUser: User
  locationLocation: Location
}

type Subscription {
  Amenities(filter: AmenitiesSubscriptionFilter): AmenitiesSubscriptionPayload
  Booking(filter: BookingSubscriptionFilter): BookingSubscriptionPayload
  City(filter: CitySubscriptionFilter): CitySubscriptionPayload
  CreditCardInformation(
    filter: CreditCardInformationSubscriptionFilter
  ): CreditCardInformationSubscriptionPayload
  Experience(filter:
    ExperienceSubscriptionFilter): ExperienceSubscriptionPayload
  ExperienceCategory(
    filter: ExperienceCategorySubscriptionFilter
  ): ExperienceCategorySubscriptionPayload
  GuestRequirements(
    filter: GuestRequirementsSubscriptionFilter
  ): GuestRequirementsSubscriptionPayload
  HouseRules(filter: HouseRulesSubscriptionFilter): HouseRulesSubscriptionPayload
  Location(filter: LocationSubscriptionFilter): LocationSubscriptionPayload
  Message(filter: MessageSubscriptionFilter): MessageSubscriptionPayload
  Neighbourhood(filter: NeighbourhoodSubscriptionFilter): NeighbourhoodSubscriptionPayload
  Notification(filter: NotificationSubscriptionFilter): NotificationSubscriptionPayload
  Payment(filter: PaymentSubscriptionFilter): PaymentSubscriptionPayload
  PaymentAccount(filter: PaymentAccountSubscriptionFilter): PaymentAccountSubscriptionPayload
  PaypalInformation(
    filter: PaypalInformationSubscriptionFilter
  ): PaypalInformationSubscriptionPayload
  Picture(filter: PictureSubscriptionFilter): PictureSubscriptionPayload
  Place(filter: PlaceSubscriptionFilter): PlaceSubscriptionPayload
  PlaceViews(filter: PlaceViewsSubscriptionFilter): PlaceViewsSubscriptionPayload
  Policies(filter: PoliciesSubscriptionFilter): PoliciesSubscriptionPayload
  Pricing(filter: PricingSubscriptionFilter): PricingSubscriptionPayload
  Restaurant(filter: RestaurantSubscriptionFilter): RestaurantSubscriptionPayload
  Review(filter: ReviewSubscriptionFilter): ReviewSubscriptionPayload
  User(filter: UserSubscriptionFilter): UserSubscriptionPayload
}

type UnsetCreditCardInformationPayload {
  paymentAccountPaymentAccount: PaymentAccount
  creditcardCreditCardInformation: CreditCardInformation
}

type UnsetExperienceCategoryPayload {
  experienceExperience: Experience
  categoryExperienceCategory: ExperienceCategory
}

type UnsetHomePreviewPayload {
  neighbourHoodNeighbourhood: Neighbourhood
  homePreviewPicture: Picture
}

type UnsetProfilePicturePayload {
  userUser: User
  profilePicturePicture: Picture
}

type UnsetUserLocationPayload {
  userUser:
    User
  locationLocation: Location
}

input UpdateAmenities {
  airConditioning: Boolean
  babyBath: Boolean
  babyMonitor: Boolean
  babysitterRecommendations: Boolean
  bathtub: Boolean
  breakfast: Boolean
  buzzerWirelessIntercom: Boolean
  cableTv: Boolean
  changingTable: Boolean
  childrensBooksAndToys: Boolean
  childrensDinnerware: Boolean
  crib: Boolean
  doorman: Boolean
  dryer: Boolean
  elevator: Boolean
  essentials: Boolean
  familyKidFriendly: Boolean
  freeParkingOnPremises: Boolean
  freeParkingOnStreet: Boolean
  gym: Boolean
  hairDryer: Boolean
  hangers: Boolean
  heating: Boolean
  hotTub: Boolean
  id: ID!
  indoorFireplace: Boolean
  internet: Boolean
  iron: Boolean
  kitchen: Boolean
  laptopFriendlyWorkspace: Boolean
  paidParkingOffPremises: Boolean
  petsAllowed: Boolean
  pool: Boolean
  privateEntrance: Boolean
  shampoo: Boolean
  smokingAllowed: Boolean
  suitableForEvents: Boolean
  tv: Boolean
  washer: Boolean
  wheelchairAccessible: Boolean
  wirelessInternet: Boolean
  placeId: ID
  place: AmenitiesplacePlace
}

input UpdateBooking {
  endDate: DateTime
  id: ID!
  startDate: DateTime
  bookeeId: ID
  bookee: BookingbookeeUser
  paymentId: ID
  payment: BookingpaymentPayment
  placeId: ID
  place: BookingplacePlace
}

input UpdateCity {
  id: ID!
  name: String
  neighbourhoodsIds: [ID!]
  neighbourhoods: [CityneighbourhoodsNeighbourhood!]
}

input UpdateCreditCardInformation {
  cardNumber: String
  country: String
  expiresOnMonth: Int
  expiresOnYear: Int
  firstName: String
  id: ID!
  lastName: String
  postalCode: String
  securityCode: String
  paymentAccountId: ID
  paymentAccount: CreditCardInformationpaymentAccountPaymentAccount
}

input UpdateExperience {
  id: ID!
  popularity: Int
  pricePerPerson: Int
  title: String
  categoryId: ID
  category: ExperiencecategoryExperienceCategory
  hostId: ID
  host: ExperiencehostUser
  locationId: ID
  location: ExperiencelocationLocation
  previewId: ID
  preview: ExperiencepreviewPicture
  reviewsIds: [ID!]
  reviews: [ExperiencereviewsReview!]
}

input UpdateExperienceCategory {
  id: ID!
  mainColor: String
  name: String
  experienceId: ID
  experience: ExperienceCategoryexperienceExperience
}

input UpdateGuestRequirements {
  govIssuedId: Boolean
  guestTripInformation: Boolean
  id: ID!
  recommendationsFromOtherHosts: Boolean
  placeId: ID
  place: GuestRequirementsplacePlace
}

input UpdateHouseRules {
  additionalRules: String
  id: ID!
  partiesAndEventsAllowed: Boolean
  petsAllowed: Boolean
  smokingAllowed: Boolean
  suitableForChildren: Boolean
  suitableForInfants: Boolean
  placeId: ID
  place: HouseRulesplacePlace
}

input UpdateLocation {
  address: String
  directions: String
  id: ID!
  lat: Float
  lng: Float
  experienceId: ID
  experience: LocationexperienceExperience
  neighbourHoodId: ID
  neighbourHood: LocationneighbourHoodNeighbourhood
  placeId: ID
  place: LocationplacePlace
  restaurantId: ID
  restaurant: LocationrestaurantRestaurant
  userId: ID
  user: LocationuserUser
}

input UpdateMessage {
  deliveredAt: DateTime
  id: ID!
  readAt: DateTime
  fromId: ID
  from: MessagefromUser
  toId: ID
  to: MessagetoUser
}

input UpdateNeighbourhood {
  featured: Boolean
  id: ID!
  name: String
  popularity: Int
  slug: String
  cityId: ID
  city: NeighbourhoodcityCity
  homePreviewId: ID
  homePreview: NeighbourhoodhomePreviewPicture
  locationsIds: [ID!]
  locations: [NeighbourhoodlocationsLocation!]
}

input UpdateNotification {
  id: ID!
  link: String
  readDate: DateTime
  type: NOTIFICATION_TYPE
  userId: ID
  user: NotificationuserUser
}

input UpdatePayment {
  id: ID!
  placePrice: Float
  serviceFee: Float
  totalPrice: Float
  bookingId: ID
  booking: PaymentbookingBooking
  paymentMethodId: ID
  paymentMethod: PaymentpaymentMethodPaymentAccount
}

input UpdatePaymentAccount {
  id: ID!
  type: PAYMENT_PROVIDER
  creditcardId: ID
  creditcard: PaymentAccountcreditcardCreditCardInformation
  paypalId: ID
  paypal: PaymentAccountpaypalPaypalInformation
  userId: ID
  user: PaymentAccountuserUser
  paymentsIds: [ID!]
  payments: [PaymentAccountpaymentsPayment!]
}

input UpdatePaypalInformation {
  email: String
  id: ID!
  paymentAccountId: ID
  paymentAccount: PaypalInformationpaymentAccountPaymentAccount
}

input UpdatePicture {
  id: ID!
  url: String
  experienceId: ID
  experience: PictureexperienceExperience
  neighbourHoodId: ID
  neighbourHood: PictureneighbourHoodNeighbourhood
  placeId: ID
  place: PictureplacePlace
  reservationId: ID
  reservation: PicturereservationRestaurant
  userId: ID
  user: PictureuserUser
}

input UpdatePlace {
  description: String
  id: ID!
  maxGuests: Int
  name: String
  numBaths: Int
  numBedrooms: Int
  numBeds: Int
  popularity: Int
  shortDescription: String
  size: PLACE_SIZES
  slug: String
  amenitiesId: ID
  amenities: PlaceamenitiesAmenities
  guestRequirementsId: ID
  guestRequirements: PlaceguestRequirementsGuestRequirements
  hostId: ID
  host: PlacehostUser
  houseRulesId: ID
  houseRules: PlacehouseRulesHouseRules
  locationId: ID
  location: PlacelocationLocation
  policiesId: ID
  policies: PlacepoliciesPolicies
  pricingId: ID
  pricing: PlacepricingPricing
  viewsId: ID
  views: PlaceviewsPlaceViews
  bookingsIds: [ID!]
  bookings: [PlacebookingsBooking!]
  picturesIds: [ID!]
  pictures: [PlacepicturesPicture!]
  reviewsIds: [ID!]
  reviews: [PlacereviewsReview!]
}

input UpdatePlaceViews {
  id: ID!
  lastWeek: Int
  placeId: ID
  place: PlaceViewsplacePlace
}

input UpdatePolicies {
  checkInEndTime: Float
  checkInStartTime: Float
  checkoutTime: Float
  id: ID!
  placeId: ID
  place: PoliciesplacePlace
}

input UpdatePricing {
  averageMonthly: Int
  averageWeekly: Int
  basePrice: Int
  cleaningFee: Int
  currency: CURRENCY
  extraGuests: Int
  id: ID!
  monthlyDiscount: Int
  perNight: Int
  securityDeposit: Int
  smartPricing: Boolean
  weekendPricing: Int
  weeklyDiscount: Int
  placeId: ID
  place: PricingplacePlace
}

input UpdateRestaurant {
  avgPricePerPerson: Int
  id: ID!
  isCurated: Boolean
  popularity: Int
  slug: String
  title: String
  locationId: ID
  location: RestaurantlocationLocation
  picturesIds: [ID!]
  pictures: [RestaurantpicturesPicture!]
}

input UpdateReview {
  accuracy: Int
  checkIn: Int
  cleanliness: Int
  communication: Int
  id: ID!
  location: Int
  stars: Int
  text: String
  value: Int
  experienceId: ID
  experience: ReviewexperienceExperience
  placeId: ID
  place: ReviewplacePlace
}

input UpdateUser {
  email: String
  firstName: String
  id: ID!
  isSuperHost: Boolean
  lastName: String
  password: String
  phone: String
  responseRate: Float
  responseTime: Int
  locationId: ID
  location: UserlocationLocation
  profilePictureId: ID
  profilePicture: UserprofilePicturePicture
  bookingsIds: [ID!]
  bookings: [UserbookingsBooking!]
  hostingExperiencesIds: [ID!]
  hostingExperiences: [UserhostingExperiencesExperience!]
  notificationsIds: [ID!]
  notifications: [UsernotificationsNotification!]
  ownedPlacesIds: [ID!]
  ownedPlaces: [UserownedPlacesPlace!]
  paymentAccountIds: [ID!]
  paymentAccount: [UserpaymentAccountPaymentAccount!]
  receivedMessagesIds: [ID!]
  receivedMessages: [UserreceivedMessagesMessage!]
  sentMessagesIds: [ID!]
  sentMessages: [UsersentMessagesMessage!]
}

type User implements Node {
  bookings(
    filter: BookingFilter
    orderBy: BookingOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): [Booking!]
  createdAt: DateTime!
  email: String!
  firstName: String!
  hostingExperiences(
    filter: ExperienceFilter
    orderBy: ExperienceOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): [Experience!]
  id: ID!
  isSuperHost: Boolean!
  lastName: String!
  location(filter: LocationFilter): Location
  notifications(
    filter: NotificationFilter
    orderBy: NotificationOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): [Notification!]
  ownedPlaces(
    filter: PlaceFilter
    orderBy: PlaceOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): [Place!]
  password: String!
  paymentAccount(
    filter: PaymentAccountFilter
    orderBy: PaymentAccountOrderBy
    skip: Int
    after: String
    before: String
    first: Int
    last: Int
  ): [PaymentAccount!]
  phone: String!
profilePicture(filter: PictureFilter): Picture receivedMessages( filter: MessageFilter orderBy: MessageOrderBy skip: Int after: String before: String first: Int last: Int ): [Message!] responseRate: Float responseTime: Int sentMessages( filter: MessageFilter orderBy: MessageOrderBy skip: Int after: String before: String first: Int last: Int ): [Message!] updatedAt: DateTime! # Meta information about the query. _bookingsMeta( filter: BookingFilter orderBy: BookingOrderBy skip: Int after: String before: String first: Int last: Int ): _QueryMeta! # Meta information about the query. _hostingExperiencesMeta( filter: ExperienceFilter orderBy: ExperienceOrderBy skip: Int after: String before: String first: Int last: Int ): _QueryMeta! # Meta information about the query. _notificationsMeta( filter: NotificationFilter orderBy: NotificationOrderBy skip: Int after: String before: String first: Int last: Int ): _QueryMeta! # Meta information about the query. _ownedPlacesMeta( filter: PlaceFilter orderBy: PlaceOrderBy skip: Int after: String before: String first: Int last: Int ): _QueryMeta! # Meta information about the query. _paymentAccountMeta( filter: PaymentAccountFilter orderBy: PaymentAccountOrderBy skip: Int after: String before: String first: Int last: Int ): _QueryMeta! # Meta information about the query. _receivedMessagesMeta( filter: MessageFilter orderBy: MessageOrderBy skip: Int after: String before: String first: Int last: Int ): _QueryMeta! # Meta information about the query. _sentMessagesMeta( filter: MessageFilter orderBy: MessageOrderBy skip: Int after: String before: String first: Int last: Int ): _QueryMeta! } input UserbookingsBooking { endDate: DateTime! startDate: DateTime! paymentId: ID payment: BookingpaymentPayment placeId: ID place: BookingplacePlace } input UserFilter { # Logical AND on all given filters. AND: [UserFilter!] # Logical OR on all given filters. OR: [UserFilter!] createdAt: DateTime # All values that are not equal to given value. 
createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime email: String # All values that are not equal to given value. email_not: String # All values that are contained in given list. email_in: [String!] # All values that are not contained in given list. email_not_in: [String!] # All values less than the given value. email_lt: String # All values less than or equal the given value. email_lte: String # All values greater than the given value. email_gt: String # All values greater than or equal the given value. email_gte: String # All values containing the given string. email_contains: String # All values not containing the given string. email_not_contains: String # All values starting with the given string. email_starts_with: String # All values not starting with the given string. email_not_starts_with: String # All values ending with the given string. email_ends_with: String # All values not ending with the given string. email_not_ends_with: String firstName: String # All values that are not equal to given value. firstName_not: String # All values that are contained in given list. firstName_in: [String!] # All values that are not contained in given list. firstName_not_in: [String!] # All values less than the given value. firstName_lt: String # All values less than or equal the given value. firstName_lte: String # All values greater than the given value. firstName_gt: String # All values greater than or equal the given value. firstName_gte: String # All values containing the given string. firstName_contains: String # All values not containing the given string. 
firstName_not_contains: String # All values starting with the given string. firstName_starts_with: String # All values not starting with the given string. firstName_not_starts_with: String # All values ending with the given string. firstName_ends_with: String # All values not ending with the given string. firstName_not_ends_with: String id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID isSuperHost: Boolean # All values that are not equal to given value. isSuperHost_not: Boolean lastName: String # All values that are not equal to given value. lastName_not: String # All values that are contained in given list. lastName_in: [String!] # All values that are not contained in given list. lastName_not_in: [String!] # All values less than the given value. lastName_lt: String # All values less than or equal the given value. lastName_lte: String # All values greater than the given value. lastName_gt: String # All values greater than or equal the given value. lastName_gte: String # All values containing the given string. lastName_contains: String # All values not containing the given string. lastName_not_contains: String # All values starting with the given string. 
lastName_starts_with: String # All values not starting with the given string. lastName_not_starts_with: String # All values ending with the given string. lastName_ends_with: String # All values not ending with the given string. lastName_not_ends_with: String password: String # All values that are not equal to given value. password_not: String # All values that are contained in given list. password_in: [String!] # All values that are not contained in given list. password_not_in: [String!] # All values less than the given value. password_lt: String # All values less than or equal the given value. password_lte: String # All values greater than the given value. password_gt: String # All values greater than or equal the given value. password_gte: String # All values containing the given string. password_contains: String # All values not containing the given string. password_not_contains: String # All values starting with the given string. password_starts_with: String # All values not starting with the given string. password_not_starts_with: String # All values ending with the given string. password_ends_with: String # All values not ending with the given string. password_not_ends_with: String phone: String # All values that are not equal to given value. phone_not: String # All values that are contained in given list. phone_in: [String!] # All values that are not contained in given list. phone_not_in: [String!] # All values less than the given value. phone_lt: String # All values less than or equal the given value. phone_lte: String # All values greater than the given value. phone_gt: String # All values greater than or equal the given value. phone_gte: String # All values containing the given string. phone_contains: String # All values not containing the given string. phone_not_contains: String # All values starting with the given string. phone_starts_with: String # All values not starting with the given string. 
phone_not_starts_with: String # All values ending with the given string. phone_ends_with: String # All values not ending with the given string. phone_not_ends_with: String responseRate: Float # All values that are not equal to given value. responseRate_not: Float # All values that are contained in given list. responseRate_in: [Float!] # All values that are not contained in given list. responseRate_not_in: [Float!] # All values less than the given value. responseRate_lt: Float # All values less than or equal the given value. responseRate_lte: Float # All values greater than the given value. responseRate_gt: Float # All values greater than or equal the given value. responseRate_gte: Float responseTime: Int # All values that are not equal to given value. responseTime_not: Int # All values that are contained in given list. responseTime_in: [Int!] # All values that are not contained in given list. responseTime_not_in: [Int!] # All values less than the given value. responseTime_lt: Int # All values less than or equal the given value. responseTime_lte: Int # All values greater than the given value. responseTime_gt: Int # All values greater than or equal the given value. responseTime_gte: Int updatedAt: DateTime # All values that are not equal to given value. updatedAt_not: DateTime # All values that are contained in given list. updatedAt_in: [DateTime!] # All values that are not contained in given list. updatedAt_not_in: [DateTime!] # All values less than the given value. updatedAt_lt: DateTime # All values less than or equal the given value. updatedAt_lte: DateTime # All values greater than the given value. updatedAt_gt: DateTime # All values greater than or equal the given value. 
updatedAt_gte: DateTime bookings_every: BookingFilter bookings_some: BookingFilter bookings_none: BookingFilter hostingExperiences_every: ExperienceFilter hostingExperiences_some: ExperienceFilter hostingExperiences_none: ExperienceFilter location: LocationFilter notifications_every: NotificationFilter notifications_some: NotificationFilter notifications_none: NotificationFilter ownedPlaces_every: PlaceFilter ownedPlaces_some: PlaceFilter ownedPlaces_none: PlaceFilter paymentAccount_every: PaymentAccountFilter paymentAccount_some: PaymentAccountFilter paymentAccount_none: PaymentAccountFilter profilePicture: PictureFilter receivedMessages_every: MessageFilter receivedMessages_some: MessageFilter receivedMessages_none: MessageFilter sentMessages_every: MessageFilter sentMessages_some: MessageFilter sentMessages_none: MessageFilter } input UserhostingExperiencesExperience { popularity: Int! pricePerPerson: Int! title: String! categoryId: ID category: ExperiencecategoryExperienceCategory locationId: ID location: ExperiencelocationLocation previewId: ID preview: ExperiencepreviewPicture reviewsIds: [ID!] reviews: [ExperiencereviewsReview!] } input UserlocationLocation { address: String directions: String lat: Float! lng: Float! experienceId: ID experience: LocationexperienceExperience neighbourHoodId: ID neighbourHood: LocationneighbourHoodNeighbourhood placeId: ID place: LocationplacePlace restaurantId: ID restaurant: LocationrestaurantRestaurant } input UsernotificationsNotification { link: String! readDate: DateTime! type: NOTIFICATION_TYPE } enum UserOrderBy { createdAt_ASC createdAt_DESC email_ASC email_DESC firstName_ASC firstName_DESC id_ASC id_DESC isSuperHost_ASC isSuperHost_DESC lastName_ASC lastName_DESC password_ASC password_DESC phone_ASC phone_DESC responseRate_ASC responseRate_DESC responseTime_ASC responseTime_DESC updatedAt_ASC updatedAt_DESC } input UserownedPlacesPlace { description: String! maxGuests: Int! name: String numBaths: Int! 
numBedrooms: Int! numBeds: Int! popularity: Int! shortDescription: String! size: PLACE_SIZES slug: String! amenitiesId: ID amenities: PlaceamenitiesAmenities guestRequirementsId: ID guestRequirements: PlaceguestRequirementsGuestRequirements houseRulesId: ID houseRules: PlacehouseRulesHouseRules locationId: ID location: PlacelocationLocation policiesId: ID policies: PlacepoliciesPolicies pricingId: ID pricing: PlacepricingPricing viewsId: ID views: PlaceviewsPlaceViews bookingsIds: [ID!] bookings: [PlacebookingsBooking!] picturesIds: [ID!] pictures: [PlacepicturesPicture!] reviewsIds: [ID!] reviews: [PlacereviewsReview!] } input UserpaymentAccountPaymentAccount { type: PAYMENT_PROVIDER creditcardId: ID creditcard: PaymentAccountcreditcardCreditCardInformation paypalId: ID paypal: PaymentAccountpaypalPaypalInformation paymentsIds: [ID!] payments: [PaymentAccountpaymentsPayment!] } type UserPreviousValues { createdAt: DateTime! email: String! firstName: String! id: ID! isSuperHost: Boolean! lastName: String! password: String! phone: String! responseRate: Float responseTime: Int updatedAt: DateTime! } input UserprofilePicturePicture { url: String! experienceId: ID experience: PictureexperienceExperience neighbourHoodId: ID neighbourHood: PictureneighbourHoodNeighbourhood placeId: ID place: PictureplacePlace reservationId: ID reservation: PicturereservationRestaurant } input UserreceivedMessagesMessage { deliveredAt: DateTime! readAt: DateTime! fromId: ID from: MessagefromUser } input UsersentMessagesMessage { deliveredAt: DateTime! readAt: DateTime! toId: ID to: MessagetoUser } input UserSubscriptionFilter { # Logical AND on all given filters. AND: [UserSubscriptionFilter!] # Logical OR on all given filters. OR: [UserSubscriptionFilter!] # The subscription event gets dispatched when it's listed in mutation_in mutation_in: [_ModelMutationType!] 
# The subscription event gets only dispatched when one of the updated fields names is included in this list updatedFields_contains: String # The subscription event gets only dispatched when all of the field names included in this list have been updated updatedFields_contains_every: [String!] # The subscription event gets only dispatched when some of the field names included in this list have been updated updatedFields_contains_some: [String!] node: UserSubscriptionFilterNode } input UserSubscriptionFilterNode { createdAt: DateTime # All values that are not equal to given value. createdAt_not: DateTime # All values that are contained in given list. createdAt_in: [DateTime!] # All values that are not contained in given list. createdAt_not_in: [DateTime!] # All values less than the given value. createdAt_lt: DateTime # All values less than or equal the given value. createdAt_lte: DateTime # All values greater than the given value. createdAt_gt: DateTime # All values greater than or equal the given value. createdAt_gte: DateTime email: String # All values that are not equal to given value. email_not: String # All values that are contained in given list. email_in: [String!] # All values that are not contained in given list. email_not_in: [String!] # All values less than the given value. email_lt: String # All values less than or equal the given value. email_lte: String # All values greater than the given value. email_gt: String # All values greater than or equal the given value. email_gte: String # All values containing the given string. email_contains: String # All values not containing the given string. email_not_contains: String # All values starting with the given string. email_starts_with: String # All values not starting with the given string. email_not_starts_with: String # All values ending with the given string. email_ends_with: String # All values not ending with the given string. 
email_not_ends_with: String firstName: String # All values that are not equal to given value. firstName_not: String # All values that are contained in given list. firstName_in: [String!] # All values that are not contained in given list. firstName_not_in: [String!] # All values less than the given value. firstName_lt: String # All values less than or equal the given value. firstName_lte: String # All values greater than the given value. firstName_gt: String # All values greater than or equal the given value. firstName_gte: String # All values containing the given string. firstName_contains: String # All values not containing the given string. firstName_not_contains: String # All values starting with the given string. firstName_starts_with: String # All values not starting with the given string. firstName_not_starts_with: String # All values ending with the given string. firstName_ends_with: String # All values not ending with the given string. firstName_not_ends_with: String id: ID # All values that are not equal to given value. id_not: ID # All values that are contained in given list. id_in: [ID!] # All values that are not contained in given list. id_not_in: [ID!] # All values less than the given value. id_lt: ID # All values less than or equal the given value. id_lte: ID # All values greater than the given value. id_gt: ID # All values greater than or equal the given value. id_gte: ID # All values containing the given string. id_contains: ID # All values not containing the given string. id_not_contains: ID # All values starting with the given string. id_starts_with: ID # All values not starting with the given string. id_not_starts_with: ID # All values ending with the given string. id_ends_with: ID # All values not ending with the given string. id_not_ends_with: ID isSuperHost: Boolean # All values that are not equal to given value. isSuperHost_not: Boolean lastName: String # All values that are not equal to given value. 
lastName_not: String # All values that are contained in given list. lastName_in: [String!] # All values that are not contained in given list. lastName_not_in: [String!] # All values less than the given value. lastName_lt: String # All values less than or equal the given value. lastName_lte: String # All values greater than the given value. lastName_gt: String # All values greater than or equal the given value. lastName_gte: String # All values containing the given string. lastName_contains: String # All values not containing the given string. lastName_not_contains: String # All values starting with the given string. lastName_starts_with: String # All values not starting with the given string. lastName_not_starts_with: String # All values ending with the given string. lastName_ends_with: String # All values not ending with the given string. lastName_not_ends_with: String password: String # All values that are not equal to given value. password_not: String # All values that are contained in given list. password_in: [String!] # All values that are not contained in given list. password_not_in: [String!] # All values less than the given value. password_lt: String # All values less than or equal the given value. password_lte: String # All values greater than the given value. password_gt: String # All values greater than or equal the given value. password_gte: String # All values containing the given string. password_contains: String # All values not containing the given string. password_not_contains: String # All values starting with the given string. password_starts_with: String # All values not starting with the given string. password_not_starts_with: String # All values ending with the given string. password_ends_with: String # All values not ending with the given string. password_not_ends_with: String phone: String # All values that are not equal to given value. phone_not: String # All values that are contained in given list. phone_in: [String!] 
# All values that are not contained in given list. phone_not_in: [String!] # All values less than the given value. phone_lt: String # All values less than or equal the given value. phone_lte: String # All values greater than the given value. phone_gt: String # All values greater than or equal the given value. phone_gte: String # All values containing the given string. phone_contains: String # All values not containing the given string. phone_not_contains: String # All values starting with the given string. phone_starts_with: String # All values not starting with the given string. phone_not_starts_with: String # All values ending with the given string. phone_ends_with: String # All values not ending with the given string. phone_not_ends_with: String responseRate: Float # All values that are not equal to given value. responseRate_not: Float # All values that are contained in given list. responseRate_in: [Float!] # All values that are not contained in given list. responseRate_not_in: [Float!] # All values less than the given value. responseRate_lt: Float # All values less than or equal the given value. responseRate_lte: Float # All values greater than the given value. responseRate_gt: Float # All values greater than or equal the given value. responseRate_gte: Float responseTime: Int # All values that are not equal to given value. responseTime_not: Int # All values that are contained in given list. responseTime_in: [Int!] # All values that are not contained in given list. responseTime_not_in: [Int!] # All values less than the given value. responseTime_lt: Int # All values less than or equal the given value. responseTime_lte: Int # All values greater than the given value. responseTime_gt: Int # All values greater than or equal the given value. responseTime_gte: Int updatedAt: DateTime # All values that are not equal to given value. updatedAt_not: DateTime # All values that are contained in given list. updatedAt_in: [DateTime!] 
# All values that are not contained in given list. updatedAt_not_in: [DateTime!] # All values less than the given value. updatedAt_lt: DateTime # All values less than or equal the given value. updatedAt_lte: DateTime # All values greater than the given value. updatedAt_gt: DateTime # All values greater than or equal the given value. updatedAt_gte: DateTime bookings_every: BookingFilter bookings_some: BookingFilter bookings_none: BookingFilter hostingExperiences_every: ExperienceFilter hostingExperiences_some: ExperienceFilter hostingExperiences_none: ExperienceFilter location: LocationFilter notifications_every: NotificationFilter notifications_some: NotificationFilter notifications_none: NotificationFilter ownedPlaces_every: PlaceFilter ownedPlaces_some: PlaceFilter ownedPlaces_none: PlaceFilter paymentAccount_every: PaymentAccountFilter paymentAccount_some: PaymentAccountFilter paymentAccount_none: PaymentAccountFilter profilePicture: PictureFilter receivedMessages_every: MessageFilter receivedMessages_some: MessageFilter receivedMessages_none: MessageFilter sentMessages_every: MessageFilter sentMessages_some: MessageFilter sentMessages_none: MessageFilter } type UserSubscriptionPayload { mutation: _ModelMutationType! node: User updatedFields: [String!] previousValues: UserPreviousValues } ```
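As a sketch of how the generated filter and ordering arguments compose, here is a hypothetical query (it assumes the usual generated `allUsers` root field, which is not shown in this chunk; the concrete values are illustrative only):

```graphql
# Hypothetical: super-hosts whose email ends in "@example.com",
# most recently updated first, with their first 5 owned places.
query SuperHosts {
  allUsers(
    filter: { isSuperHost: true, email_ends_with: "@example.com" }
    orderBy: updatedAt_DESC
    first: 10
  ) {
    id
    firstName
    ownedPlaces(first: 5) {
      name
      slug
    }
  }
}
```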
The Diocese of Trujillo may refer to:
- Roman Catholic Archdiocese of Trujillo, Peru
- Roman Catholic Diocese of Trujillo (Honduras)
- Roman Catholic Diocese of Trujillo, Venezuela
```kotlin
package net.corda.node.services

import net.corda.core.flows.FlowLogic
import kotlin.reflect.KFunction

/**
 * Helper class for creating permissions.
 */
class Permissions {
    companion object {

        /**
         * Global admin permissions.
         */
        @JvmStatic
        fun all() = "ALL"

        /**
         * Creates the flow permission string of the format "StartFlow.{ClassName}".
         *
         * @param className a flow class name for which the permission is created.
         */
        @JvmStatic
        fun startFlow(className: String) = "StartFlow.$className"

        /**
         * An overload of [startFlow].
         *
         * @param clazz a class for which the permission is created.
         */
        @JvmStatic
        fun <P : FlowLogic<*>> startFlow(clazz: Class<P>) = startFlow(clazz.name)

        /**
         * An overload of [startFlow].
         *
         * @param P a class for which the permission is created.
         */
        @JvmStatic
        inline fun <reified P : FlowLogic<*>> startFlow(): String = startFlow(P::class.java)

        /**
         * Creates a permission string with format "InvokeRpc.{MethodName}".
         *
         * @param methodName an RPC method name for which the permission is created.
         */
        @JvmStatic
        fun invokeRpc(methodName: String) = "InvokeRpc.$methodName"

        /**
         * Creates a permission string with format "InvokeRpc.{method.name}".
         *
         * @param method an RPC [KFunction] for which the permission is created.
         */
        @JvmStatic
        fun invokeRpc(method: KFunction<*>) = invokeRpc(method.name)
    }
}
```
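The permission strings these helpers produce can be restated as a standalone sketch. This is a Java re-statement for illustration, not Corda code; `CashIssueFlow` is a hypothetical flow name:

```java
public class PermissionsDemo {
    // Mirrors the string formats of Permissions.startFlow / invokeRpc
    // from the Kotlin helper above.
    static String startFlow(String className) {
        return "StartFlow." + className;
    }

    static String invokeRpc(String methodName) {
        return "InvokeRpc." + methodName;
    }

    public static void main(String[] args) {
        // prints StartFlow.com.example.CashIssueFlow
        System.out.println(startFlow("com.example.CashIssueFlow"));
        // prints InvokeRpc.nodeInfo
        System.out.println(invokeRpc("nodeInfo"));
    }
}
```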
Jens Sørensen Wand, also given as Wandt or Vand (28 April 1875 – 26 May 1950), was a Frisian bird warden on the island of Norderoog in Hallig Hooge, part of the North Frisian Islands in the North Sea, off the coast of Germany. Wand protected nesting seabird colonies, particularly of Sandwich terns. Seabird protection had begun in 1909 after some of the islands were purchased by the Jordsand bird protection organization. He was nicknamed the "Bird king of Norderoog." Wand was born in Rolsø and little is known about his early life. He married Helene Marie Jespersen in 1889 and they had seven children; they lived in Brede near Bredebro, north of Tønder, during the winter, and in summer he was often the only person living on the island of Norderoog. He obtained his supplies from the nearby island of Hallig Hooge. He took up the position of bird warden on 19 May 1909 and held it until his death in 1950. His wife drowned in a tidal creek between Hooge and Norderoog in 1914, and in 1950 he too drowned at the same place. On the day of his death he had accompanied a film-maker named Venzl and a hiker, and on the way back he was likely caught by the rising tide. His work involved keeping records of the numbers of nesting birds and keeping away intruders. He came to be called the "Bird King of Norderoog" and sometimes the "Hallig Robinson". His lonely life attracted much interest, and he was the subject of many photographs and paintings. Wand's home, built on stilts, is now preserved as a memorial. References External links Painting by Albert Johannsen 1875 births 1950 deaths Bird conservation Danish conservationists German conservationists
```javascript
'use strict'

var runner = require('./runner')
var schemas = require('./schemas')

// The default export validates against the full HAR schema.
module.exports = function (data, cb) {
  return runner(schemas.har, data, cb)
}

// Each schema key also becomes a named validator (e.g. module.exports.request).
Object.keys(schemas).forEach(function (name) {
  module.exports[name] = function (data, cb) {
    return runner(schemas[name], data, cb)
  }
})
```
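The per-schema export pattern above can be demonstrated standalone. The `schemas` object and `runner` function here are stand-ins for illustration, not the real module internals:

```javascript
'use strict'

// Stand-ins for ./runner and ./schemas (hypothetical, for illustration only).
var schemas = { har: { id: 'har' }, request: { id: 'request' } }
function runner (schema, data, cb) {
  var result = schema.id + ':' + JSON.stringify(data)
  if (cb) cb(null, result)
  return result
}

// Default export validates against the full "har" schema...
var validate = function (data, cb) { return runner(schemas.har, data, cb) }

// ...and each schema key becomes a named validator attached to it.
Object.keys(schemas).forEach(function (name) {
  validate[name] = function (data, cb) { return runner(schemas[name], data, cb) }
})
```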
The US Sen. Hattie Caraway Gravesite is located in Oaklawn Cemetery on the west side of Jonesboro, Arkansas. It is the only surviving site in Arkansas associated with the life of Hattie Caraway (1878-1950), the first woman to be elected to a full term in the United States Senate. The gravesite consists of a family headstone, simply engraved "Caraway", and three footstones: one for the senator, one for her husband Thaddeus, whom she succeeded in the Senate, and one for their son Robert. The site is located on the western central edge of the cemetery. The gravesite was listed on the National Register of Historic Places in 2007. See also National Register of Historic Places listings in Craighead County, Arkansas References Cemeteries on the National Register of Historic Places in Arkansas Buildings and structures completed in 1950 Buildings and structures in Jonesboro, Arkansas Burials in the United States National Register of Historic Places in Craighead County, Arkansas
```cpp
//
// Aspia Project
//
// This program is free software: you can redistribute it and/or modify
// it under the terms of the GNU General Public License as published by
// the Free Software Foundation, either version 3 of the License, or
// (at your option) any later version.
//
// This program is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
// GNU General Public License for more details.
//
// You should have received a copy of the GNU General Public License
// along with this program. If not, see <path_to_url>.
//

#ifndef COMMON_HTTP_FILE_DOWNLOADER_H
#define COMMON_HTTP_FILE_DOWNLOADER_H

#include "base/macros_magic.h"
#include "base/task_runner.h"
#include "base/memory/byte_array.h"
#include "base/threading/simple_thread.h"

#include <memory>
#include <string>
#include <string_view>

namespace common {

class HttpFileDownloader
{
public:
    HttpFileDownloader();
    ~HttpFileDownloader();

    class Delegate
    {
    public:
        virtual ~Delegate() = default;

        virtual void onFileDownloaderError(int error_code) = 0;
        virtual void onFileDownloaderCompleted() = 0;
        virtual void onFileDownloaderProgress(int percentage) = 0;
    };

    void start(std::u16string_view url,
               std::shared_ptr<base::TaskRunner> owner_task_runner,
               Delegate* delegate);

    const base::ByteArray& data() const { return data_; }

private:
    void run();

    static size_t writeDataCallback(void* ptr, size_t size, size_t nmemb, HttpFileDownloader* self);
    static int progressCallback(
        HttpFileDownloader* self, double dltotal, double dlnow, double ultotal, double ulnow);

    base::SimpleThread thread_;

    class Runner;
    std::shared_ptr<Runner> runner_;

    std::u16string url_;
    base::ByteArray data_;

    DISALLOW_COPY_AND_ASSIGN(HttpFileDownloader);
};

} // namespace common

#endif // COMMON_HTTP_FILE_DOWNLOADER_H
```
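A rough standalone sketch of how the `Delegate` callbacks might be driven. `FakeDownloader` and its synchronous loop are illustrative assumptions, not the Aspia implementation, which downloads on a worker thread via curl-style callbacks:

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Simplified mirror of HttpFileDownloader::Delegate from the header above.
class Delegate {
public:
    virtual ~Delegate() = default;
    virtual void onFileDownloaderError(int error_code) = 0;
    virtual void onFileDownloaderCompleted() = 0;
    virtual void onFileDownloaderProgress(int percentage) = 0;
};

// Hypothetical downloader that "downloads" an in-memory payload byte by
// byte, reporting progress in the spirit of the real write/progress callbacks.
class FakeDownloader {
public:
    void start(const std::string& payload, Delegate* delegate) {
        const std::size_t total = payload.size();
        if (total == 0) {
            delegate->onFileDownloaderError(1);  // nothing to download
            return;
        }
        for (std::size_t received = 0; received < total; ++received) {
            data_.push_back(payload[received]);
            delegate->onFileDownloaderProgress(
                static_cast<int>((received + 1) * 100 / total));
        }
        delegate->onFileDownloaderCompleted();
    }

    const std::vector<char>& data() const { return data_; }

private:
    std::vector<char> data_;
};

// A delegate that just records what it was told.
class RecordingDelegate : public Delegate {
public:
    void onFileDownloaderError(int error_code) override { error = error_code; }
    void onFileDownloaderCompleted() override { completed = true; }
    void onFileDownloaderProgress(int percentage) override { last = percentage; }

    bool completed = false;
    int last = 0;
    int error = 0;
};
```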
The Seaplane Training Flight was a Royal Australian Air Force unit responsible for providing seaplane conversion training to RAAF air and ground crew. It was established on 1 March 1940 at RAAF Base Rathmines in New South Wales. Initially equipped with two Supermarine Seagull aircraft, the Flight received Consolidated Catalina aircraft in the second half of 1940 and a small number of Vought Kingfisher aircraft in early 1942. As part of the expansion of the RAAF's seaplane units, the Seaplane Training Flight was expanded to form No. 3 Operational Training Unit on 28 December 1942. References RAAF Historical Section (1995), Units of the Royal Australian Air Force. A Concise History. Volume 4 Maritime and Transport Units. Australian Government Publishing Service, Canberra. RAAF training units RAAF independent flights Military units and formations established in 1940
```go
// Package log contains utilities for sensible logging.
//
// ATC components written in Go should use this package rather than a
// third-party library. This package also provides log *levels*, which are not
// really supported by the standard log package, so it's generally better to
// use this than even that standard library.
//
// Inspired by path_to_url
//
// # Usage
//
// Normally, one will have some command-line program that takes in some kind of
// configuration either via command-line arguments or a configuration file, and
// want that configuration to define how logging happens. The easiest way to get
// logging up and running is to put that configuration into a type that
// implements this package's Config interface, like:
//
//	type Configuration struct {
//		ErrorLogs     string `json:"errorLogs"`
//		WarningLogs   string `json:"warningLogs"`
//		InfoLogs      string `json:"infoLogs"`
//		DebugLogs     string `json:"debugLogs"`
//		EventLogs     string `json:"eventLogs"`
//		EnableFeature bool   `json:"enableFeature"`
//	}
//
//	func (c Configuration) ErrorLog() string   { return c.ErrorLogs }
//	func (c Configuration) WarningLog() string { return c.WarningLogs }
//	func (c Configuration) InfoLog() string    { return c.InfoLogs }
//	func (c Configuration) DebugLog() string   { return c.DebugLogs }
//	func (c Configuration) EventLog() string   { return c.EventLogs }
//
// Then the configuration can be unmarshaled from JSON and the resulting
// Configuration can be passed directly to InitCfg, and then everything is
// set up and ready to go.
package log

/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements. See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership. The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License. You may obtain a copy of the License at
 *
 *   path_to_url
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */

import (
	"fmt"
	"io"
	"log"
	"os"
	"time"

	"github.com/apache/trafficcontrol/v8/lib/go-llog"
)

// These are the loggers for the various log levels, which can be accessed
// directly, but in general the functions exported by this package should
// be used instead.
var (
	Debug   *log.Logger
	Info    *log.Logger
	Warning *log.Logger
	Error   *log.Logger
	Event   *log.Logger

	// Access is NOT initialized by Init nor InitCfg in any way. To initialize
	// it, use InitAccess.
	Access *log.Logger
)

var (
	debugCloser  io.Closer
	infoCloser   io.Closer
	warnCloser   io.Closer
	errCloser    io.Closer
	eventCloser  io.Closer
	accessCloser io.Closer
)

func initLogger(logger **log.Logger, oldLogCloser *io.Closer, newLogWriter io.WriteCloser, logPrefix string, logFlags int) {
	if newLogWriter == nil {
		*logger = nil
		if *oldLogCloser != nil {
			(*oldLogCloser).Close()
			*oldLogCloser = nil
		}
		return
	}
	if *logger != nil {
		(*logger).SetOutput(newLogWriter)
	} else {
		*logger = log.New(newLogWriter, logPrefix, logFlags)
	}
	if *oldLogCloser != nil {
		(*oldLogCloser).Close()
	}
	*oldLogCloser = newLogWriter
}

// These are prefixes prepended to messages for the various log levels.
const (
	DebugPrefix = "DEBUG: "
	InfoPrefix  = "INFO: "
	WarnPrefix  = "WARNING: "
	ErrPrefix   = "ERROR: "
	EventPrefix = ""
)

// These constants are flags used to create the underlying "log.Logger",
// defining what's in the prefix for different types of log messages.
//
// Refer to the "log" package documentation for details.
const (
	DebugFlags = log.Lshortfile
	InfoFlags  = log.Lshortfile
	WarnFlags  = log.Lshortfile
	ErrFlags   = log.Lshortfile
	EventFlags = 0
)

// Init initializes the logs - except the Access log stream - with the given
// io.WriteClosers.
//
// If `Init` was previously called, existing loggers are Closed.
// If you have loggers which are not Closers or which must not be Closed, wrap
// them with `log.NopCloser`.
func Init(eventW, errW, warnW, infoW, debugW io.WriteCloser) {
	initLogger(&Debug, &debugCloser, debugW, DebugPrefix, DebugFlags)
	initLogger(&Info, &infoCloser, infoW, InfoPrefix, InfoFlags)
	initLogger(&Warning, &warnCloser, warnW, WarnPrefix, WarnFlags)
	initLogger(&Error, &errCloser, errW, ErrPrefix, ErrFlags)
	initLogger(&Event, &eventCloser, eventW, EventPrefix, EventFlags)
}

// InitAccess initializes the Access logging stream with the given
// io.WriteCloser.
func InitAccess(accessW io.WriteCloser) {
	initLogger(&Access, &accessCloser, accessW, EventPrefix, EventFlags)
}

// Logf should generally be avoided; use Init or InitCfg and the Errorf,
// Warnln, etc. functions instead.
//
// It logs to the given logger, in the same format as the standard log
// functions. This should only be used in rare and unusual circumstances when
// the standard loggers and functions can't be used.
func Logf(logger *log.Logger, format string, v ...interface{}) {
	if logger == nil {
		return
	}
	logger.Output(stackFrame, time.Now().UTC().Format(timeFormat)+": "+fmt.Sprintf(format, v...))
}

// Logln should generally be avoided; use Init or InitCfg and the Errorf,
// Warnln, etc. functions instead.
// It logs to the given logger, in the same format as the standard log functions.
// This should only be used in rare and unusual circumstances when the standard
// loggers and functions can't be used.
func Logln(logger *log.Logger, v ...interface{}) {
	if logger == nil {
		return
	}
	logger.Output(stackFrame, time.Now().UTC().Format(timeFormat)+": "+fmt.Sprintln(v...))
}

const timeFormat = time.RFC3339Nano
const stackFrame = 3

// Errorf prints a formatted message to the Error logging stream.
//
// Formatting is handled identically to the fmt package, but the final message
// actually printed to the stream will be prefixed with the log level and
// timestamp just like Errorln.
Also, if the final formatted message does not
// end in a newline character, one will be appended, so it's not necessary to
// do things like:
//
//	Errorf("%v\n", 5)
//
// Instead, just:
//
//	Errorf("%v", 5)
//
// will suffice, and will print exactly the same message.
func Errorf(format string, v ...interface{}) { Logf(Error, format, v...) }

// Errorln prints a line to the Error stream.
//
// This will use default formatting for all of its passed parameters, as
// defined by the fmt package. The resulting line is first prefixed with the
// log level - in this case "ERROR: " - and the current timestamp, as well as
// the context in which it was logged. For example, if line 36 of foo.go
// invokes this like so:
//
//	log.Errorln("something wicked happened")
//
// ... the resulting log line will look like:
//
//	ERROR: foo.go:36: 2006-01-02T15:04:05Z: something wicked happened
//
// (assuming the log call occurred at the time package's formatting reference
// date/time). This allows multiple log streams to be directed to the same file
// for output and still be distinguishable by their log-level prefix.
func Errorln(v ...interface{}) { Logln(Error, v...) }

// Warnf prints a formatted message to the Warning logging stream.
//
// The message format is identical to Errorf, but using the log-level prefix
// "WARNING: ".
func Warnf(format string, v ...interface{}) { Logf(Warning, format, v...) }

// Warnln prints a line to the Warning stream.
//
// The message format is identical to Errorln, but using the log-level prefix
// "WARNING: ".
func Warnln(v ...interface{}) { Logln(Warning, v...) }

// Infof prints a formatted message to the Info logging stream.
//
// The message format is identical to Errorf, but using the log-level prefix
// "INFO: ".
func Infof(format string, v ...interface{}) { Logf(Info, format, v...) }

// Infoln prints a line to the Info stream.
//
// The message format is identical to Errorln, but using the log-level prefix
// "INFO: ".
func Infoln(v ...interface{}) { Logln(Info, v...) }

// Debugf prints a formatted message to the Debug logging stream.
//
// The message format is identical to Errorf, but using the log-level prefix
// "DEBUG: ".
func Debugf(format string, v ...interface{}) { Logf(Debug, format, v...) }

// Debugln prints a line to the Debug stream.
//
// The message format is identical to Errorln, but using the log-level prefix
// "DEBUG: ".
func Debugln(v ...interface{}) { Logln(Debug, v...) }

const eventFormat = "%.3f %s"

func eventTime(t time.Time) float64 {
	return float64(t.Unix()) + (float64(t.Nanosecond()) / float64(time.Second))
}

// Accessln prints an "access"-level log to the Access logging stream.
//
// This does NOT use the same message formatting as the other logging functions
// like Errorf. Instead, it prints only the information it's given, using the
// fmt package's default formats (as with fmt.Println). For example, if line 36
// of foo.go invokes this like so:
//
//	log.Accessln("my message")
//
// ... the resulting log line will look like:
//
//	my message
func Accessln(v ...interface{}) {
	if Access != nil {
		Access.Println(v...)
	}
}

// Eventf prints an "event"-level log to the Event logging stream.
//
// This does NOT use the same message formatting as the other logging functions
// like Errorf. Instead, it prints the time at which the event occurred as the
// number of seconds since the Unix epoch to three decimal places of sub-second
// precision, followed by the trailing parameters formatted according to the
// format string as defined by the fmt package. For example, if line 36 of
// foo.go invokes this like so:
//
//	log.Eventf(time.Now(), "%s\n", "my message")
//
// ... the resulting log line will look like:
//
//	1136214245.000 my message
//
// (assuming the call occurred at the time package's formatting reference
// date/time). Note that this WILL NOT add trailing newlines if the resulting
// formatted message string doesn't end with one.
func Eventf(t time.Time, format string, v ...interface{}) { if Event == nil { return } // 1484001185.287 ... Event.Printf(eventFormat, eventTime(t), fmt.Sprintf(format, v...)) } // EventfRaw writes a formatted message to the Event log stream. // // The formatting is just like Eventf, but no timestamp prefix will be added. func EventfRaw(format string, v ...interface{}) { if Event == nil { return } Event.Printf(format, v...) } // EventRaw writes the given string directly to the Event log stream with // absolutely NO formatting - trailing newline, timestamp, log-level etc. // // Go's fmt.Printf/Sprintf etc. are very slow, so using this with string // concatenation is by far the fastest way to log, and should be used for // frequent logs. func EventRaw(s string) { if Event == nil { return } Event.Output(stackFrame, s) } // Close calls `Close()` on the given Closer, and logs any error that results // from that call. // // On error, the context is logged, followed by a colon and the error message. // Specifically, it calls Errorf with the context and error using the format // '%v: %v'. // // This is primarily designed to be used in `defer`, for example: // // defer log.Close(resp.Body, "readData fetching /foo/bar") // // Note that some Go linting tools may not properly detect this as both Closing // the Closer and checking the error value that Close returns, but it does both // of those things, we swear. func Close(c io.Closer, context string) { err := c.Close() if err != nil { Errorf("%v: %v", context, err) } } // Closef acts like Close, with a given format string and values, followed by a // colon, the error message, and a newline. // // The given values are not coerced, concatenated, or printed unless an error // occurs, so this is more efficient than Close in situations where the message // passed to Close would have been generated with fmt.Sprintf. // // This actually results in two distinct lines being printed in the Error log // stream. 
For example, if line 36 of foo.go invokes this like so: // // log.Closef(resp.Body, "doing %s", "something") // // ... and resp.Body.Close() returns a non-nil error with the message "failed", // the resulting log lines will look like: // // ERROR: log.go:271: 2006-01-02T15:04:05Z: doing something // ERROR: log.go:272: 2006-01-02T15:04:05Z: : failed // // (please please please don't count on those line numbers). func Closef(c io.Closer, contextFormat string, v ...interface{}) { err := c.Close() if err != nil { Errorf(contextFormat, v...) Errorf(": %v", err) } } // Write calls the Write method of the given Writer with the passed byte slice // as an argument, and logs any error that arises from that call. // // On error, the context is logged, followed by a colon and the error message, and // a trailing newline is guaranteed. For example, if line 36 of foo.go invokes // this like so: // // log.Write(someWriter, []byte("write me"), "trying to write") // // ... and someWriter.Write() returns a non-nil error with the message // "failed", the resulting log line will look like: // // ERROR: log.go:294: 2006-01-02T15:04:05Z: trying to write: failed // // (please please please don't count on that line number). func Write(w io.Writer, b []byte, context string) { _, err := w.Write(b) if err != nil { Errorf("%v: %v", context, err) } } // Writef acts like Write, with a given format string and values, followed by a // colon, the error message, and a newline. // // The given values are not coerced, concatenated, or printed unless an error // occurs, so this is more efficient than Write in situations where the message // passed to Write would have been generated with fmt.Sprintf. // // This actually results in two distinct lines being printed in the Error log // stream. For example, if line 36 of foo.go invokes this like so: // // log.Writef(someWriter, "doing %s", "something") // // ... 
and someWriter.Write() returns a non-nil error with the message
// "failed", the resulting log lines will look like:
//
//	ERROR: log.go:320: 2006-01-02T15:04:05Z: doing something
//	ERROR: log.go:321: 2006-01-02T15:04:05Z: : failed
//
// (please please please don't count on those line numbers).
func Writef(w io.Writer, b []byte, contextFormat string, v ...interface{}) {
	_, err := w.Write(b)
	if err != nil {
		Errorf(contextFormat, v...)
		Errorf(": %v", err)
	}
}

type nopCloser struct {
	io.Writer
}

// Close implements the io.Closer interface. This does nothing.
func (nopCloser) Close() error { return nil }

// NopCloser returns an io.WriteCloser which wraps the passed io.Writer with a
// Close method that does nothing but return a nil error. This allows
// non-closable streams to be used for logging.
func NopCloser(w io.Writer) io.WriteCloser {
	return nopCloser{w}
}

// LogLocation is a location to log to. This may be stdout, stderr, null
// (/dev/null), or a valid file path.
type LogLocation string

const (
	// LogLocationStdout indicates the stdout IO stream.
	LogLocationStdout = "stdout"
	// LogLocationStderr indicates the stderr IO stream.
	LogLocationStderr = "stderr"
	// LogLocationNull indicates the null IO stream (/dev/null).
	LogLocationNull = "null"
)

// LogLocationFile specifies where health client logs should go.
//
// Deprecated: It is unclear why this constant is in this package at all, and
// it will almost certainly be removed in the future.
const LogLocationFile = "/var/log/trafficcontrol/tc-health-client.log"

// StaticFileDir is the directory that contains static HTML and Javascript
// files for Traffic Monitor.
//
// Deprecated: It is unclear why this constant is in this package at all, and
// it will almost certainly be removed in the future.
const StaticFileDir = "/opt/traffic_monitor/static/"

// GetLogWriter creates a writable stream from the given LogLocation.
// // If the requested log location is a file, it will be opened with the // write-only, create, and append flags in permissions mode 0644. If that open // operation causes an error, it is returned. // // As a special case, if the location is an empty string, the returned stream // will be nil - this is the same behavior as passing LogLocationNull. func GetLogWriter(location LogLocation) (io.WriteCloser, error) { switch location { case LogLocationStdout: return NopCloser(os.Stdout), nil case LogLocationStderr: return NopCloser(os.Stderr), nil case LogLocationNull: fallthrough case "": return nil, nil default: return os.OpenFile(string(location), os.O_WRONLY|os.O_CREATE|os.O_APPEND, 0644) } } // Config structures represent logging configurations, which can be accessed by // this package for setting up the logging system via GetLogWriters. type Config interface { ErrorLog() LogLocation WarningLog() LogLocation InfoLog() LogLocation DebugLog() LogLocation EventLog() LogLocation } // GetLogWriters uses the passed Config to set up and return log streams. // // This bails at the first encountered error creating a log stream, and returns // the error without trying to create any further streams. // // The returned streams still need to be initialized, usually with Init. To // create and initialize the logging streams at the same time, refer to // InitCfg. 
func GetLogWriters(cfg Config) (io.WriteCloser, io.WriteCloser, io.WriteCloser, io.WriteCloser, io.WriteCloser, error) { eventLoc := cfg.EventLog() errLoc := cfg.ErrorLog() warnLoc := cfg.WarningLog() infoLoc := cfg.InfoLog() debugLoc := cfg.DebugLog() eventW, err := GetLogWriter(eventLoc) if err != nil { return nil, nil, nil, nil, nil, fmt.Errorf("getting log event writer %v: %v", eventLoc, err) } errW, err := GetLogWriter(errLoc) if err != nil { return nil, nil, nil, nil, nil, fmt.Errorf("getting log error writer %v: %v", errLoc, err) } warnW, err := GetLogWriter(warnLoc) if err != nil { return nil, nil, nil, nil, nil, fmt.Errorf("getting log warning writer %v: %v", warnLoc, err) } infoW, err := GetLogWriter(infoLoc) if err != nil { return nil, nil, nil, nil, nil, fmt.Errorf("getting log info writer %v: %v", infoLoc, err) } debugW, err := GetLogWriter(debugLoc) if err != nil { return nil, nil, nil, nil, nil, fmt.Errorf("getting log debug writer %v: %v", debugLoc, err) } return eventW, errW, warnW, infoW, debugW, nil } // InitCfg uses the passed configuration to both create and initialize all // logging streams. func InitCfg(cfg Config) error { eventW, errW, warnW, infoW, debugW, err := GetLogWriters(cfg) if err != nil { return err } Init(eventW, errW, warnW, infoW, debugW) return nil } // LLog returns an llog.Log, for passing to libraries using llog. // // Note the returned Log will have writers tied to loggers at its time of creation. // Thus, it's safe to reuse LLog if an application never re-initializes the loggers, // such as when reloading config on a HUP signal. // If the application re-initializes loggers, LLog should be called again to get // a new llog.Log associated with the new log locations. func LLog() llog.Log { // ltow converts a log.Logger into an io.Writer // This is relatively inefficient. If performance is necessary, this package could be made to // keep track of the original io.Writer, to avoid an extra function call and string copy. 
ltow := func(lg *log.Logger) io.Writer { return llog.WriterFunc(func(p []byte) (n int, err error) { Logln(lg, string(p)) return len(p), nil }) } return llog.New(ltow(Error), ltow(Warning), ltow(Info), ltow(Debug)) } ```
Daylight Robbery is a 1964 British film from the Children's Film Foundation. Its plot concerns a group of kids who foil bank robbers.

Cast
Trudy Moors as Trudy
Janet Hannington as Janet
Kirk Martin as Kirk
Darryl Read as Darryl
Doug Robinson as Gangster (as Douglas Robinson)
John Trenaman as Gangster
Gordon Jackson as Sergeant
Janet Munro
Zena Walker
Patricia Burke
James Villiers
Norman Rossington
Ronald Fraser

Critical reception
TV Guide called it an "Okay children's film with a surprisingly talented adult cast."

External links
Daylight Robbery at the British Film Institute

Categories: 1964 films; British children's films; Children's Film Foundation; 1960s English-language films; Films directed by Michael Truman; 1960s British films
```go package servers import "github.com/gophercloud/gophercloud" // WaitForStatus will continually poll a server until it successfully // transitions to a specified status. It will do this for at most the number // of seconds specified. func WaitForStatus(c *gophercloud.ServiceClient, id, status string, secs int) error { return gophercloud.WaitFor(secs, func() (bool, error) { current, err := Get(c, id).Extract() if err != nil { return false, err } if current.Status == status { return true, nil } return false, nil }) } ```
```javascript
'use strict';

const twilio = require('twilio');

const config = require('./config.json');

const MessagingResponse = twilio.twiml.MessagingResponse;

const projectId = process.env.GCLOUD_PROJECT;
const region = 'us-central1';

exports.reply = (req, res) => {
  let isValid = true;

  // Only validate that requests came from Twilio when the function has been
  // deployed to production.
  if (process.env.NODE_ENV === 'production') {
    isValid = twilio.validateExpressRequest(req, config.TWILIO_AUTH_TOKEN, {
      url: `path_to_url${region}-${projectId}.cloudfunctions.net/reply`
    });
  }

  // Halt early if the request was not sent from Twilio
  if (!isValid) {
    res
      .type('text/plain')
      .status(403)
      .send('Twilio Request Validation Failed.')
      .end();
    return;
  }

  // Prepare a response to the SMS message
  const response = new MessagingResponse();

  // Add text to the response
  response.message('Hello from Google Cloud Functions!');

  // Send the response
  res
    .status(200)
    .type('text/xml')
    .end(response.toString());
};
```
```clojure (ns status-im.contexts.wallet.account.view (:require [quo.core :as quo] [react-native.core :as rn] [status-im.contexts.wallet.account.style :as style] [status-im.contexts.wallet.account.tabs.view :as tabs] [status-im.contexts.wallet.common.account-switcher.view :as account-switcher] [status-im.contexts.wallet.sheets.buy-token.view :as buy-token] [status-im.feature-flags :as ff] [status-im.setup.hot-reload :as hot-reload] [utils.i18n :as i18n] [utils.re-frame :as rf])) (def first-tab-id :assets) (def tabs-data [{:id :assets :label (i18n/label :t/assets) :accessibility-label :assets-tab} {:id :collectibles :label (i18n/label :t/collectibles) :accessibility-label :collectibles-tab} {:id :activity :label (i18n/label :t/activity) :accessibility-label :activity-tab} {:id :about :label (i18n/label :t/about) :accessibility-label :about}]) (defn- change-tab [id] (rf/dispatch [:wallet/select-account-tab id])) (defn view [] (let [selected-tab (or (rf/sub [:wallet/account-tab]) first-tab-id) {:keys [name color formatted-balance watch-only?]} (rf/sub [:wallet/current-viewing-account]) customization-color (rf/sub [:profile/customization-color])] (hot-reload/use-safe-unmount (fn [] (rf/dispatch [:wallet/close-account-page]) (rf/dispatch [:wallet/clean-current-viewing-account]))) (rn/use-mount #(rf/dispatch [:wallet/fetch-activities-for-current-account])) [rn/view {:style {:flex 1}} [account-switcher/view {:type :wallet-networks :on-press #(rf/dispatch [:pop-to-root :shell-stack])}] [quo/account-overview {:container-style style/account-overview :current-value formatted-balance :account-name name :account (if watch-only? :watched-address :default) :customization-color color}] (when (ff/enabled? ::ff/wallet.graph) [quo/wallet-graph {:time-frame :empty}]) (when (not watch-only?) [quo/wallet-ctas {:container-style style/cta-buttons :send-action (fn [] (rf/dispatch [:wallet/clean-send-data]) (rf/dispatch [:wallet/wizard-navigate-forward {:start-flow? 
true :flow-id :wallet-send-flow}])) :receive-action #(rf/dispatch [:open-modal :screen/wallet.share-address {:status :receive}]) :buy-action #(rf/dispatch [:show-bottom-sheet {:content buy-token/view}]) :bridge-action (fn [] (rf/dispatch [:wallet/clean-send-data]) (rf/dispatch [:wallet/start-bridge])) :swap-action (when (ff/enabled? ::ff/wallet.swap) #(rf/dispatch [:wallet.swap/start]))}]) [quo/tabs {:style style/tabs :size 32 :active-tab-id selected-tab :data tabs-data :on-change change-tab :scrollable? true :scroll-on-press? true}] [tabs/view {:selected-tab selected-tab}] (when (ff/enabled? ::ff/shell.jump-to) [quo/floating-shell-button {:jump-to {:on-press #(rf/dispatch [:shell/navigate-to-jump-to]) :customization-color customization-color :label (i18n/label :t/jump-to)}} style/shell-button])])) ```
```lua require 'src/content_loss' require 'src/texture_loss' require 'loadcaffe' local ArtisticCriterion, parent = torch.class('nn.ArtisticCriterion', 'nn.Criterion') function ArtisticCriterion:__init(params, cnn, texture_image) parent.__init(self) self.content_modules = {} self.texture_modules = {} self.descriptor_net = create_descriptor_net(params, cnn, texture_image) self:updateModules() self.gradInput = nil end function ArtisticCriterion:updateOutput(input, target) -- Compute target content features if #self.content_modules > 0 then for k, module in pairs(self.texture_modules) do module.active = false end for k, module in pairs(self.content_modules) do module.active = false end self.descriptor_net:forward(target) end -- Now forward with images from generator for k, module in pairs(self.texture_modules) do module.active = true end for k, module in pairs(self.content_modules) do module.active = true module.target:resizeAs(module.output) module.target:copy(module.output) end self.descriptor_net:forward(input) local loss = 0 for _, mod in ipairs(self.content_modules) do loss = loss + mod.loss end for _, mod in ipairs(self.texture_modules) do loss = loss + mod.loss end return loss end function ArtisticCriterion:updateGradInput(input, target) self.gradInput = self.descriptor_net:updateGradInput(input, target) return self.gradInput end function ArtisticCriterion:updateModules() self.content_modules = {} self.texture_modules = {} for key,module in pairs(self.descriptor_net.modules) do if module.tag == 'content' then module.target:cuda(); table.insert(self.content_modules, module) elseif module.tag == 'texture' then module.target:cuda(); table.insert(self.texture_modules, module) end end end function ArtisticCriterion:replace(f) self.descriptor_net = self.descriptor_net:replace(f) self:updateModules() return self end function nop() -- nop. 
not needed by our net end function create_descriptor_net(params, cnn, texture_image) local content_layers = params.content_layers:split(",") local texture_layers = params.texture_layers:split(",") -- Set up the network, inserting texture and content loss modules local content_modules, texture_modules = {}, {} local next_content_idx, next_texture_idx = 1, 1 local net = nn.Sequential() for i = 1, #cnn do if next_content_idx <= #content_layers or next_texture_idx <= #texture_layers then local layer = cnn:get(i) local name = layer.name local layer_type = torch.type(layer) local is_pooling = (layer_type == 'cudnn.SpatialMaxPooling' or layer_type == 'nn.SpatialMaxPooling') if params.vgg_no_pad and (layer_type == 'nn.SpatialConvolution' or layer_type == 'nn.SpatialConvolutionMM' or layer_type == 'cudnn.SpatialConvolution') then print (name, ': padding set to 0') layer.padW = 0 layer.padH = 0 end net:add(layer) --------------------------------- -- Content --------------------------------- if name == content_layers[next_content_idx] then print("Setting up content layer", i, ":", layer.name) local norm = false local loss_module = nn.ContentLoss(params.content_weight, norm); net:add(loss_module) loss_module.tag = 'content' next_content_idx = next_content_idx + 1 end --------------------------------- -- Texture --------------------------------- if name == texture_layers[next_texture_idx] then print("Setting up texture layer ", i, ":", layer.name) local gram = GramMatrix():float(); local target_features = net:forward(texture_image:add_dummy()):clone() local target = gram:forward(nn.View(-1):float():setNumInputDims(2):forward(target_features[1])):clone() target:div(target_features[1]:nElement()) local norm = params.normalize_gradients local loss_module = nn.TextureLoss(params.texture_weight, target, norm):float(); net:add(loss_module) loss_module.tag = 'texture'; next_texture_idx = next_texture_idx + 1 end end end net:add(nn.DummyGradOutput()) -- We don't need the base CNN 
anymore, so clean it up to save memory. cnn = nil for i=1,#net.modules do local module = net.modules[i] if torch.type(module) == 'nn.SpatialConvolutionMM' or torch.type(module) == 'nn.SpatialConvolution' or torch.type(module) == 'cudnn.SpatialConvolution' then module.gradWeight = nil module.gradBias = nil end end net = cudnn.convert(net, cudnn):cuda() collectgarbage() return net, content_modules, texture_modules end ```
Subdirectory checkout
Managing branches
Checkout the previous branch
Cherry-pick a commit
View your commit history in a graph
```go package install import ( "testing" . "github.com/onsi/ginkgo" . "github.com/onsi/gomega" "github.com/projectcalico/calico/libcalico-go/lib/testutils" "github.com/onsi/ginkgo/reporters" ) func init() { testutils.HookLogrusForGinkgo() } func TestInstall(t *testing.T) { RegisterFailHandler(Fail) junitReporter := reporters.NewJUnitReporter("../../report/install_suite.xml") RunSpecsWithDefaultAndCustomReporters(t, "Install Suite", []Reporter{junitReporter}) } ```
```c
/*
 * DirectShow capture interface
 *
 * This file is part of FFmpeg.
 *
 * FFmpeg is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * FFmpeg is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with FFmpeg; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 */

#include "dshow_capture.h"

#include <stddef.h>
#define imemoffset offsetof(libAVPin, imemvtbl)

DECLARE_QUERYINTERFACE(libAVPin,
    { {&IID_IUnknown,0}, {&IID_IPin,0}, {&IID_IMemInputPin,imemoffset} })
DECLARE_ADDREF(libAVPin)
DECLARE_RELEASE(libAVPin)

long WINAPI
libAVPin_Connect(libAVPin *this, IPin *pin, const AM_MEDIA_TYPE *type)
{
    dshowdebug("libAVPin_Connect(%p, %p, %p)\n", this, pin, type);
    /* Input pins receive connections. */
    return S_FALSE;
}
long WINAPI
libAVPin_ReceiveConnection(libAVPin *this, IPin *pin,
                           const AM_MEDIA_TYPE *type)
{
    enum dshowDeviceType devtype = this->filter->type;
    dshowdebug("libAVPin_ReceiveConnection(%p)\n", this);

    if (!pin)
        return E_POINTER;
    if (this->connectedto)
        return VFW_E_ALREADY_CONNECTED;

    ff_print_AM_MEDIA_TYPE(type);
    if (devtype == VideoDevice) {
        if (!IsEqualGUID(&type->majortype, &MEDIATYPE_Video))
            return VFW_E_TYPE_NOT_ACCEPTED;
    } else {
        if (!IsEqualGUID(&type->majortype, &MEDIATYPE_Audio))
            return VFW_E_TYPE_NOT_ACCEPTED;
    }

    IPin_AddRef(pin);
    this->connectedto = pin;

    ff_copy_dshow_media_type(&this->type, type);

    return S_OK;
}
long WINAPI
libAVPin_Disconnect(libAVPin *this)
{
    dshowdebug("libAVPin_Disconnect(%p)\n", this);

    if (this->filter->state != State_Stopped)
        return VFW_E_NOT_STOPPED;
    if (!this->connectedto)
        return S_FALSE;
    IPin_Release(this->connectedto);
    this->connectedto = NULL;

    return S_OK;
}
long WINAPI
libAVPin_ConnectedTo(libAVPin *this, IPin **pin)
{
dshowdebug("libAVPin_ConnectedTo(%p)\n", this); if (!pin) return E_POINTER; if (!this->connectedto) return VFW_E_NOT_CONNECTED; IPin_AddRef(this->connectedto); *pin = this->connectedto; return S_OK; } long WINAPI libAVPin_ConnectionMediaType(libAVPin *this, AM_MEDIA_TYPE *type) { dshowdebug("libAVPin_ConnectionMediaType(%p)\n", this); if (!type) return E_POINTER; if (!this->connectedto) return VFW_E_NOT_CONNECTED; return ff_copy_dshow_media_type(type, &this->type); } long WINAPI libAVPin_QueryPinInfo(libAVPin *this, PIN_INFO *info) { dshowdebug("libAVPin_QueryPinInfo(%p)\n", this); if (!info) return E_POINTER; if (this->filter) libAVFilter_AddRef(this->filter); info->pFilter = (IBaseFilter *) this->filter; info->dir = PINDIR_INPUT; wcscpy(info->achName, L"Capture"); return S_OK; } long WINAPI libAVPin_QueryDirection(libAVPin *this, PIN_DIRECTION *dir) { dshowdebug("libAVPin_QueryDirection(%p)\n", this); if (!dir) return E_POINTER; *dir = PINDIR_INPUT; return S_OK; } long WINAPI libAVPin_QueryId(libAVPin *this, wchar_t **id) { dshowdebug("libAVPin_QueryId(%p)\n", this); if (!id) return E_POINTER; *id = wcsdup(L"libAV Pin"); return S_OK; } long WINAPI libAVPin_QueryAccept(libAVPin *this, const AM_MEDIA_TYPE *type) { dshowdebug("libAVPin_QueryAccept(%p)\n", this); return S_FALSE; } long WINAPI libAVPin_EnumMediaTypes(libAVPin *this, IEnumMediaTypes **enumtypes) { const AM_MEDIA_TYPE *type = NULL; libAVEnumMediaTypes *new; dshowdebug("libAVPin_EnumMediaTypes(%p)\n", this); if (!enumtypes) return E_POINTER; new = libAVEnumMediaTypes_Create(type); if (!new) return E_OUTOFMEMORY; *enumtypes = (IEnumMediaTypes *) new; return S_OK; } long WINAPI libAVPin_QueryInternalConnections(libAVPin *this, IPin **pin, unsigned long *npin) { dshowdebug("libAVPin_QueryInternalConnections(%p)\n", this); return E_NOTIMPL; } long WINAPI libAVPin_EndOfStream(libAVPin *this) { dshowdebug("libAVPin_EndOfStream(%p)\n", this); /* I don't care. 
*/ return S_OK; } long WINAPI libAVPin_BeginFlush(libAVPin *this) { dshowdebug("libAVPin_BeginFlush(%p)\n", this); /* I don't care. */ return S_OK; } long WINAPI libAVPin_EndFlush(libAVPin *this) { dshowdebug("libAVPin_EndFlush(%p)\n", this); /* I don't care. */ return S_OK; } long WINAPI libAVPin_NewSegment(libAVPin *this, REFERENCE_TIME start, REFERENCE_TIME stop, double rate) { dshowdebug("libAVPin_NewSegment(%p)\n", this); /* I don't care. */ return S_OK; } static int libAVPin_Setup(libAVPin *this, libAVFilter *filter) { IPinVtbl *vtbl = this->vtbl; IMemInputPinVtbl *imemvtbl; if (!filter) return 0; imemvtbl = av_malloc(sizeof(IMemInputPinVtbl)); if (!imemvtbl) return 0; SETVTBL(imemvtbl, libAVMemInputPin, QueryInterface); SETVTBL(imemvtbl, libAVMemInputPin, AddRef); SETVTBL(imemvtbl, libAVMemInputPin, Release); SETVTBL(imemvtbl, libAVMemInputPin, GetAllocator); SETVTBL(imemvtbl, libAVMemInputPin, NotifyAllocator); SETVTBL(imemvtbl, libAVMemInputPin, GetAllocatorRequirements); SETVTBL(imemvtbl, libAVMemInputPin, Receive); SETVTBL(imemvtbl, libAVMemInputPin, ReceiveMultiple); SETVTBL(imemvtbl, libAVMemInputPin, ReceiveCanBlock); this->imemvtbl = imemvtbl; SETVTBL(vtbl, libAVPin, QueryInterface); SETVTBL(vtbl, libAVPin, AddRef); SETVTBL(vtbl, libAVPin, Release); SETVTBL(vtbl, libAVPin, Connect); SETVTBL(vtbl, libAVPin, ReceiveConnection); SETVTBL(vtbl, libAVPin, Disconnect); SETVTBL(vtbl, libAVPin, ConnectedTo); SETVTBL(vtbl, libAVPin, ConnectionMediaType); SETVTBL(vtbl, libAVPin, QueryPinInfo); SETVTBL(vtbl, libAVPin, QueryDirection); SETVTBL(vtbl, libAVPin, QueryId); SETVTBL(vtbl, libAVPin, QueryAccept); SETVTBL(vtbl, libAVPin, EnumMediaTypes); SETVTBL(vtbl, libAVPin, QueryInternalConnections); SETVTBL(vtbl, libAVPin, EndOfStream); SETVTBL(vtbl, libAVPin, BeginFlush); SETVTBL(vtbl, libAVPin, EndFlush); SETVTBL(vtbl, libAVPin, NewSegment); this->filter = filter; return 1; } DECLARE_CREATE(libAVPin, libAVPin_Setup(this, filter), libAVFilter *filter) 
DECLARE_DESTROY(libAVPin, nothing) /***************************************************************************** * libAVMemInputPin ****************************************************************************/ long WINAPI libAVMemInputPin_QueryInterface(libAVMemInputPin *this, const GUID *riid, void **ppvObject) { libAVPin *pin = (libAVPin *) ((uint8_t *) this - imemoffset); dshowdebug("libAVMemInputPin_QueryInterface(%p)\n", this); return libAVPin_QueryInterface(pin, riid, ppvObject); } unsigned long WINAPI libAVMemInputPin_AddRef(libAVMemInputPin *this) { libAVPin *pin = (libAVPin *) ((uint8_t *) this - imemoffset); dshowdebug("libAVMemInputPin_AddRef(%p)\n", this); return libAVPin_AddRef(pin); } unsigned long WINAPI libAVMemInputPin_Release(libAVMemInputPin *this) { libAVPin *pin = (libAVPin *) ((uint8_t *) this - imemoffset); dshowdebug("libAVMemInputPin_Release(%p)\n", this); return libAVPin_Release(pin); } long WINAPI libAVMemInputPin_GetAllocator(libAVMemInputPin *this, IMemAllocator **alloc) { dshowdebug("libAVMemInputPin_GetAllocator(%p)\n", this); return VFW_E_NO_ALLOCATOR; } long WINAPI libAVMemInputPin_NotifyAllocator(libAVMemInputPin *this, IMemAllocator *alloc, BOOL rdwr) { dshowdebug("libAVMemInputPin_NotifyAllocator(%p)\n", this); return S_OK; } long WINAPI libAVMemInputPin_GetAllocatorRequirements(libAVMemInputPin *this, ALLOCATOR_PROPERTIES *props) { dshowdebug("libAVMemInputPin_GetAllocatorRequirements(%p)\n", this); return E_NOTIMPL; } long WINAPI libAVMemInputPin_Receive(libAVMemInputPin *this, IMediaSample *sample) { libAVPin *pin = (libAVPin *) ((uint8_t *) this - imemoffset); enum dshowDeviceType devtype = pin->filter->type; void *priv_data; AVFormatContext *s; uint8_t *buf; int buf_size; /* todo should be a long? */ int index; int64_t curtime; int64_t orig_curtime; int64_t graphtime; const char *devtypename = (devtype == VideoDevice) ? 
"video" : "audio"; IReferenceClock *clock = pin->filter->clock; int64_t dummy; struct dshow_ctx *ctx; dshowdebug("libAVMemInputPin_Receive(%p)\n", this); if (!sample) return E_POINTER; IMediaSample_GetTime(sample, &orig_curtime, &dummy); orig_curtime += pin->filter->start_time; IReferenceClock_GetTime(clock, &graphtime); if (devtype == VideoDevice) { /* PTS from video devices is unreliable. */ IReferenceClock_GetTime(clock, &curtime); } else { IMediaSample_GetTime(sample, &curtime, &dummy); if(curtime > 400000000000000000LL) { /* initial frames sometimes start < 0 (shown as a very large number here, like 437650244077016960 which FFmpeg doesn't like. TODO figure out math. For now just drop them. */ av_log(NULL, AV_LOG_DEBUG, "dshow dropping initial (or ending) audio frame with odd PTS too high %"PRId64"\n", curtime); return S_OK; } curtime += pin->filter->start_time; } buf_size = IMediaSample_GetActualDataLength(sample); IMediaSample_GetPointer(sample, &buf); priv_data = pin->filter->priv_data; s = priv_data; ctx = s->priv_data; index = pin->filter->stream_index; av_log(NULL, AV_LOG_VERBOSE, "dshow passing through packet of type %s size %8d " "timestamp %"PRId64" orig timestamp %"PRId64" graph timestamp %"PRId64" diff %"PRId64" %s\n", devtypename, buf_size, curtime, orig_curtime, graphtime, graphtime - orig_curtime, ctx->device_name[devtype]); pin->filter->callback(priv_data, index, buf, buf_size, curtime, devtype); return S_OK; } long WINAPI libAVMemInputPin_ReceiveMultiple(libAVMemInputPin *this, IMediaSample **samples, long n, long *nproc) { int i; dshowdebug("libAVMemInputPin_ReceiveMultiple(%p)\n", this); for (i = 0; i < n; i++) libAVMemInputPin_Receive(this, samples[i]); *nproc = n; return S_OK; } long WINAPI libAVMemInputPin_ReceiveCanBlock(libAVMemInputPin *this) { dshowdebug("libAVMemInputPin_ReceiveCanBlock(%p)\n", this); /* I swear I will not block. 
*/ return S_FALSE; } void libAVMemInputPin_Destroy(libAVMemInputPin *this) { libAVPin *pin = (libAVPin *) ((uint8_t *) this - imemoffset); dshowdebug("libAVMemInputPin_Destroy(%p)\n", this); libAVPin_Destroy(pin); } ```
```c /* $OpenBSD: insque.c,v 1.3 2014/08/15 04:14:36 guenther Exp $ */ /* * All rights reserved. * * Redistribution and use in source and binary forms, with or without * modification, are permitted provided that the following conditions * are met: * 1. Redistributions of source code must retain the above copyright * notice, this list of conditions and the following disclaimer. * 2. Redistributions in binary form must reproduce the above copyright * notice, this list of conditions and the following disclaimer in the * documentation and/or other materials provided with the distribution. * 3. The name of the author may be used to endorse or promote products * derived from this software without specific prior written permission. * * THIS SOFTWARE IS PROVIDED BY THE AUTHOR `AS IS'' AND ANY EXPRESS OR * IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED * WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE * DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, * INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES * (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR * SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, * STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN * ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE * POSSIBILITY OF SUCH DAMAGE. */ #include <stdlib.h> #include <search.h> struct qelem { struct qelem *q_forw; struct qelem *q_back; }; void insque(void *entry, void *pred) { struct qelem *e = entry; struct qelem *p = pred; if (p == NULL) e->q_forw = e->q_back = NULL; else { e->q_forw = p->q_forw; e->q_back = p; if (p->q_forw != NULL) p->q_forw->q_back = e; p->q_forw = e; } } ```
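The insertion logic above can be mirrored in a short Python sketch for illustration (the class and function names below are just a stand-in for the C `struct qelem` and `insque()`, not part of any library):

```python
class QElem:
    """Python mirror of `struct qelem`: a node in a doubly linked list."""

    def __init__(self):
        self.q_forw = None  # next element
        self.q_back = None  # previous element


def insque(entry, pred):
    """Insert `entry` after `pred`, following the C insque() semantics."""
    if pred is None:
        # A missing predecessor initializes `entry` as a standalone element.
        entry.q_forw = entry.q_back = None
    else:
        entry.q_forw = pred.q_forw
        entry.q_back = pred
        if pred.q_forw is not None:
            pred.q_forw.q_back = entry
        pred.q_forw = entry


# Build a two-element queue: a <-> b
a, b = QElem(), QElem()
insque(a, None)   # a becomes a standalone element
insque(b, a)      # b is linked in after a
```

As in the C version, inserting after a `NULL`/`None` predecessor simply initializes the element rather than linking it anywhere.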
```python import copy import logging from typing import Dict, List, Optional, Union from ray.tune.error import TuneError from ray.tune.experiment import Experiment, Trial, _convert_to_experiment_list from ray.tune.experiment.config_parser import _create_trial_from_spec, _make_parser from ray.tune.search.search_algorithm import SearchAlgorithm from ray.tune.search.searcher import Searcher from ray.tune.search.util import _set_search_properties_backwards_compatible from ray.tune.search.variant_generator import _resolve_nested_dict, format_vars from ray.tune.utils.util import ( _atomic_save, _load_newest_checkpoint, flatten_dict, merge_dicts, ) from ray.util.annotations import DeveloperAPI logger = logging.getLogger(__name__) def _warn_on_repeater(searcher, total_samples): from ray.tune.search.repeater import _warn_num_samples _warn_num_samples(searcher, total_samples) @DeveloperAPI class SearchGenerator(SearchAlgorithm): """Generates trials to be passed to the TrialRunner. Uses the provided ``searcher`` object to generate trials. This class transparently handles repeating trials with score aggregation without embedding logic into the Searcher. Args: searcher: Search object that subclasses the Searcher base class. This is then used for generating new hyperparameter samples. """ CKPT_FILE_TMPL = "search_gen_state-{}.json" def __init__(self, searcher: Searcher): assert issubclass( type(searcher), Searcher ), "Searcher should be subclassing Searcher." self.searcher = searcher self._parser = _make_parser() self._experiment = None self._counter = 0 # Keeps track of number of trials created. self._total_samples = 0 # int: total samples to evaluate. 
self._finished = False @property def metric(self): return self.searcher.metric def set_search_properties( self, metric: Optional[str], mode: Optional[str], config: Dict, **spec ) -> bool: return _set_search_properties_backwards_compatible( self.searcher.set_search_properties, metric, mode, config, **spec ) @property def total_samples(self): return self._total_samples def add_configurations( self, experiments: Union[Experiment, List[Experiment], Dict[str, Dict]] ): """Registers experiment specifications. Arguments: experiments: Experiments to run. """ assert not self._experiment logger.debug("added configurations") experiment_list = _convert_to_experiment_list(experiments) assert ( len(experiment_list) == 1 ), "SearchAlgorithms can only support 1 experiment at a time." self._experiment = experiment_list[0] experiment_spec = self._experiment.spec self._total_samples = self._experiment.spec.get("num_samples", 1) _warn_on_repeater(self.searcher, self._total_samples) if "run" not in experiment_spec: raise TuneError("Must specify `run` in {}".format(experiment_spec)) def next_trial(self): """Provides one Trial object to be queued into the TrialRunner. Returns: Trial: Returns a single trial. 
""" if not self.is_finished(): return self.create_trial_if_possible(self._experiment.spec) return None def create_trial_if_possible(self, experiment_spec: Dict) -> Optional[Trial]: logger.debug("creating trial") trial_id = Trial.generate_id() suggested_config = self.searcher.suggest(trial_id) if suggested_config == Searcher.FINISHED: self._finished = True logger.debug("Searcher has finished.") return if suggested_config is None: return spec = copy.deepcopy(experiment_spec) spec["config"] = merge_dicts(spec["config"], copy.deepcopy(suggested_config)) # Create a new trial_id if duplicate trial is created flattened_config = _resolve_nested_dict(spec["config"]) self._counter += 1 tag = "{0}_{1}".format(str(self._counter), format_vars(flattened_config)) trial = _create_trial_from_spec( spec, self._parser, evaluated_params=flatten_dict(suggested_config), experiment_tag=tag, trial_id=trial_id, ) return trial def on_trial_result(self, trial_id: str, result: Dict): """Notifies the underlying searcher.""" self.searcher.on_trial_result(trial_id, result) def on_trial_complete( self, trial_id: str, result: Optional[Dict] = None, error: bool = False ): self.searcher.on_trial_complete(trial_id=trial_id, result=result, error=error) def is_finished(self) -> bool: return self._counter >= self._total_samples or self._finished def get_state(self) -> Dict: return { "counter": self._counter, "total_samples": self._total_samples, "finished": self._finished, "experiment": self._experiment, } def set_state(self, state: Dict): self._counter = state["counter"] self._total_samples = state["total_samples"] self._finished = state["finished"] self._experiment = state["experiment"] def has_checkpoint(self, dirpath: str): return bool(_load_newest_checkpoint(dirpath, self.CKPT_FILE_TMPL.format("*"))) def save_to_dir(self, dirpath: str, session_str: str): """Saves self + searcher to dir. Separates the "searcher" from its wrappers (concurrency, repeating). 
This allows the user to easily restore a given searcher. The save operation is atomic (write/swap). Args: dirpath: Filepath to experiment dir. session_str: Unique identifier of the current run session. """ searcher = self.searcher search_alg_state = self.get_state() while hasattr(searcher, "searcher"): searcher_name = type(searcher).__name__ if searcher_name in search_alg_state: logger.warning( "There was a duplicate when saving {}. " "Restore may not work properly.".format(searcher_name) ) else: search_alg_state["name:" + searcher_name] = searcher.get_state() searcher = searcher.searcher base_searcher = searcher # We save the base searcher separately for users to easily # separate the searcher. base_searcher.save_to_dir(dirpath, session_str) _atomic_save( state=search_alg_state, checkpoint_dir=dirpath, file_name=self.CKPT_FILE_TMPL.format(session_str), tmp_file_name=".tmp_search_generator_ckpt", ) def restore_from_dir(self, dirpath: str): """Restores self + searcher + search wrappers from dirpath.""" searcher = self.searcher search_alg_state = _load_newest_checkpoint( dirpath, self.CKPT_FILE_TMPL.format("*") ) if not search_alg_state: raise RuntimeError("Unable to find checkpoint in {}.".format(dirpath)) while hasattr(searcher, "searcher"): searcher_name = "name:" + type(searcher).__name__ if searcher_name not in search_alg_state: names = [ key.split("name:")[1] for key in search_alg_state if key.startswith("name:") ] logger.warning( "{} was not found in the experiment " "state when restoring. Found {}.".format(searcher_name, names) ) else: searcher.set_state(search_alg_state.pop(searcher_name)) searcher = searcher.searcher base_searcher = searcher logger.debug(f"searching base {base_searcher}") base_searcher.restore_from_dir(dirpath) self.set_state(search_alg_state) ```
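The save/restore methods above rely on one convention: every wrapper searcher exposes the object it wraps as a `searcher` attribute, so the chain can be walked down to the base searcher while each wrapper's state is stored under a `name:<ClassName>` key. A minimal self-contained sketch of that unwrapping (the classes here are illustrative stand-ins, not Ray Tune's real ones):

```python
class BaseSearcher:
    """Stand-in for a concrete Searcher at the bottom of the wrapper chain."""

    def get_state(self):
        return {"base": True}


class ConcurrencyLimiter:
    """Stand-in wrapper; wrappers expose the wrapped object as `.searcher`."""

    def __init__(self, searcher):
        self.searcher = searcher
        self.max_concurrent = 4

    def get_state(self):
        return {"max_concurrent": self.max_concurrent}


def collect_wrapper_states(searcher):
    """Walk the wrapper chain, keying each wrapper's state by 'name:<Class>'."""
    state = {}
    while hasattr(searcher, "searcher"):
        state["name:" + type(searcher).__name__] = searcher.get_state()
        searcher = searcher.searcher
    # `searcher` is now the base searcher, which is saved separately.
    return state, searcher


states, base = collect_wrapper_states(ConcurrencyLimiter(BaseSearcher()))
```

Restoring runs the same walk in reverse: each wrapper pops its own `name:`-keyed entry back out of the saved dict, which is why a duplicate wrapper class name triggers the warning above.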
Thomas William Baxter (1 February 1903 – 21 August 1987) was an English footballer who played on the left wing for Welbeck Colliery, Newark Town, Worksop Town, Mansfield Town, Wolverhampton Wanderers, Port Vale, Margate, Carlisle United, and Distillery. He helped the "Valiants" to win the Third Division North title in 1929–30. Career Baxter played amateur football for Welbeck Colliery, Newark Town and Worksop Town (Midland League). He began his professional career with his hometown club Mansfield Town. He moved to Second Division club Wolverhampton Wanderers in late 1927, and made his "Wolves" debut on 17 December 1927 in a 2–0 defeat to Chelsea at Stamford Bridge. He remained in the team for the remainder of the 1927–28 season and played the majority of the 1928–29 campaign. An attacking player, he scored 15 goals in a total of 53 appearances for the Midlanders. He left Molineux to join Port Vale in August 1929. He was a regular member of the "Valiants" 1929–30 Third Division North winning side, claiming nine goals in 43 appearances. He lost his first team place in December 1930, and featured in just 16 games of the 1930–31 season, as the Vale achieved a club record high of fifth in the Second Division. He left The Old Recreation Ground at the end of the campaign and returned to Mansfield. He later spent two spells with Margate and also played for Carlisle United and Distillery (Irish League). Career statistics Source: Honours Port Vale Football League Third Division North: 1929–30 References 1903 births 1987 deaths Footballers from Mansfield English men's footballers Men's association football outside forwards Welbeck Welfare F.C. players Newark Town F.C. players Worksop Town F.C. players Mansfield Town F.C. players Wolverhampton Wanderers F.C. players Port Vale F.C. players Margate F.C. players Carlisle United F.C. players Lisburn Distillery F.C. players Midland Football League players English Football League players
```javascript /* Calculate the set of unique abbreviations for a given set of strings. * * |Name |Desc | * |------|----------------| * |names |List of names | * |return|Abbreviation map| */ /* example * abbrev('lina', 'luna'); * // -> {li: 'lina', lin: 'lina', lina: 'lina', lu: 'luna', lun: 'luna', luna: 'luna'} */ /* module * env: all */ /* typescript * export declare function abbrev(...names: string[]): types.PlainObj<string>; */ _('types restArgs isSorted'); exports = restArgs(function(names) { names = names.sort(isSorted.defComparator); const ret = {}; const idleMap = {}; for (let i = 0, len = names.length; i < len; i++) { const str = names[i]; const nextStr = names[i + 1] || ''; if (str === nextStr) continue; let start = false; let abbrev = ''; for (let j = 0, strLen = str.length; j < strLen; j++) { abbrev += str[j]; if (!start && (str[j] !== nextStr[j] || j === strLen - 1)) { start = true; } if (!start) { idleMap[abbrev] = str; } else if (!ret[abbrev] && !idleMap[abbrev]) { ret[abbrev] = str; } } } return ret; }); ```
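The prefix-expansion logic above can also be sketched as a Python port (an illustration of the algorithm, not part of the library): sorting the names first means each name only has to be compared against its immediate successor, and a prefix becomes unambiguous the moment it diverges from that successor.

```python
def abbrev(*names):
    """Map every unambiguous prefix to the full name it identifies."""
    names = sorted(names)
    ret, idle = {}, {}
    for i, s in enumerate(names):
        nxt = names[i + 1] if i + 1 < len(names) else ""
        if s == nxt:  # skip exact duplicates
            continue
        unique = False
        prefix = ""
        for j, ch in enumerate(s):
            prefix += ch
            # The prefix becomes unique once it diverges from the next
            # sorted name, or once the whole name has been consumed.
            if not unique and (j >= len(nxt) or ch != nxt[j] or j == len(s) - 1):
                unique = True
            if not unique:
                idle[prefix] = s  # still shared with a neighboring name
            elif prefix not in ret and prefix not in idle:
                ret[prefix] = s
    return ret
```

Shared prefixes (like `'l'` for `'lina'`/`'luna'`) land in the idle map and are excluded from the result, exactly as in the JavaScript version.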
The golden-bellied mangabey (Cercocebus chrysogaster) is a social Old World monkey found in swampy, humid forests south of the Congo River in the Democratic Republic of the Congo. It was previously considered a subspecies of the agile mangabey (C. agilis). Little is published about the species, and its behaviour has only been studied in captivity. The only known photograph of golden-bellied mangabeys in the wild is shown in this article, and a link to a video can be found in "External links" below. References External links Only known footage of golden-bellied mangabeys in the wild: J.M.Stritch: http://cornwallcameratrapping.blogspot.co.uk/2014/07/only-know-footage-of-wild-golden.html golden-bellied mangabey Endemic fauna of the Democratic Republic of the Congo Mammals of the Democratic Republic of the Congo
```java /* * Licensed to the Apache Software Foundation (ASF) under one or more * contributor license agreements. See the NOTICE file distributed with * this work for additional information regarding copyright ownership. * The ASF licenses this file to You under the Apache License, Version 2.0 * (the "License"); you may not use this file except in compliance with * the License. You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ package org.apache.shardingsphere.sharding.merge.dql.pagination; import org.apache.shardingsphere.infra.binder.context.statement.dml.SelectStatementContext; import org.apache.shardingsphere.infra.config.props.ConfigurationProperties; import org.apache.shardingsphere.infra.metadata.database.resource.ResourceMetaData; import org.apache.shardingsphere.infra.session.connection.ConnectionContext; import org.apache.shardingsphere.infra.database.core.DefaultDatabase; import org.apache.shardingsphere.infra.database.core.type.DatabaseType; import org.apache.shardingsphere.infra.executor.sql.execute.result.query.QueryResult; import org.apache.shardingsphere.infra.merge.result.MergedResult; import org.apache.shardingsphere.infra.metadata.ShardingSphereMetaData; import org.apache.shardingsphere.infra.metadata.database.ShardingSphereDatabase; import org.apache.shardingsphere.infra.metadata.database.rule.RuleMetaData; import org.apache.shardingsphere.infra.spi.type.typed.TypedSPILoader; import org.apache.shardingsphere.sharding.merge.dql.ShardingDQLResultMerger; import org.apache.shardingsphere.sql.parser.statement.core.segment.dml.column.ColumnSegment; import org.apache.shardingsphere.sql.parser.statement.core.segment.dml.expr.BinaryOperationExpression; import org.apache.shardingsphere.sql.parser.statement.core.segment.dml.expr.simple.LiteralExpressionSegment; import org.apache.shardingsphere.sql.parser.statement.core.segment.dml.expr.subquery.SubquerySegment; import org.apache.shardingsphere.sql.parser.statement.core.segment.dml.item.ProjectionsSegment; import
org.apache.shardingsphere.sql.parser.statement.core.segment.dml.pagination.rownum.NumberLiteralRowNumberValueSegment; import org.apache.shardingsphere.sql.parser.statement.core.segment.dml.pagination.top.TopProjectionSegment; import org.apache.shardingsphere.sql.parser.statement.core.segment.dml.predicate.WhereSegment; import org.apache.shardingsphere.sql.parser.statement.core.segment.generic.table.SubqueryTableSegment; import org.apache.shardingsphere.sql.parser.statement.core.statement.dml.SelectStatement; import org.apache.shardingsphere.sql.parser.statement.core.value.identifier.IdentifierValue; import org.apache.shardingsphere.sql.parser.statement.mysql.dml.MySQLSelectStatement; import org.apache.shardingsphere.sql.parser.statement.oracle.dml.OracleSelectStatement; import org.junit.jupiter.api.Test; import java.sql.SQLException; import java.util.Arrays; import java.util.Collections; import static org.junit.jupiter.api.Assertions.assertFalse; import static org.junit.jupiter.api.Assertions.assertTrue; import static org.mockito.Mockito.RETURNS_DEEP_STUBS; import static org.mockito.Mockito.mock; import static org.mockito.Mockito.when; class RowNumberDecoratorMergedResultTest { @Test void assertNextForSkipAll() throws SQLException { OracleSelectStatement selectStatement = new OracleSelectStatement(); selectStatement.setProjections(new ProjectionsSegment(0, 0)); WhereSegment whereSegment = mock(WhereSegment.class); BinaryOperationExpression binaryOperationExpression = mock(BinaryOperationExpression.class); when(binaryOperationExpression.getLeft()).thenReturn(new ColumnSegment(0, 0, new IdentifierValue("row_id"))); when(binaryOperationExpression.getRight()).thenReturn(new LiteralExpressionSegment(0, 0, Integer.MAX_VALUE)); when(binaryOperationExpression.getOperator()).thenReturn(">="); when(whereSegment.getExpr()).thenReturn(binaryOperationExpression); SubqueryTableSegment subqueryTableSegment = mock(SubqueryTableSegment.class); SubquerySegment subquerySegment = 
mock(SubquerySegment.class); SelectStatement subSelectStatement = mock(MySQLSelectStatement.class); ProjectionsSegment subProjectionsSegment = mock(ProjectionsSegment.class); TopProjectionSegment topProjectionSegment = mock(TopProjectionSegment.class); when(topProjectionSegment.getAlias()).thenReturn("row_id"); when(subProjectionsSegment.getProjections()).thenReturn(Collections.singletonList(topProjectionSegment)); when(subSelectStatement.getProjections()).thenReturn(subProjectionsSegment); when(subquerySegment.getSelect()).thenReturn(subSelectStatement); when(subqueryTableSegment.getSubquery()).thenReturn(subquerySegment); selectStatement.setFrom(subqueryTableSegment); selectStatement.setWhere(whereSegment); ShardingDQLResultMerger resultMerger = new ShardingDQLResultMerger(TypedSPILoader.getService(DatabaseType.class, "Oracle")); ShardingSphereDatabase database = mock(ShardingSphereDatabase.class, RETURNS_DEEP_STUBS); SelectStatementContext selectStatementContext = new SelectStatementContext(createShardingSphereMetaData(database), null, selectStatement, DefaultDatabase.LOGIC_NAME, Collections.emptyList()); when(database.getName()).thenReturn(DefaultDatabase.LOGIC_NAME); MergedResult actual = resultMerger.merge(Arrays.asList(mockQueryResult(), mockQueryResult(), mockQueryResult(), mockQueryResult()), selectStatementContext, database, mock(ConnectionContext.class)); assertFalse(actual.next()); } @Test void assertNextWithoutOffsetWithoutRowCount() throws SQLException { ShardingDQLResultMerger resultMerger = new ShardingDQLResultMerger(TypedSPILoader.getService(DatabaseType.class, "Oracle")); ShardingSphereDatabase database = mock(ShardingSphereDatabase.class, RETURNS_DEEP_STUBS); OracleSelectStatement selectStatement = new OracleSelectStatement(); selectStatement.setProjections(new ProjectionsSegment(0, 0)); SelectStatementContext selectStatementContext = new SelectStatementContext(createShardingSphereMetaData(database), null, selectStatement, 
DefaultDatabase.LOGIC_NAME, Collections.emptyList()); when(database.getName()).thenReturn(DefaultDatabase.LOGIC_NAME); MergedResult actual = resultMerger.merge(Arrays.asList(mockQueryResult(), mockQueryResult(), mockQueryResult(), mockQueryResult()), selectStatementContext, database, mock(ConnectionContext.class)); for (int i = 0; i < 8; i++) { assertTrue(actual.next()); } assertFalse(actual.next()); } @Test void assertNextForRowCountBoundOpenedFalse() throws SQLException { OracleSelectStatement selectStatement = new OracleSelectStatement(); selectStatement.setProjections(new ProjectionsSegment(0, 0)); WhereSegment whereSegment = mock(WhereSegment.class); BinaryOperationExpression binaryOperationExpression = mock(BinaryOperationExpression.class); when(binaryOperationExpression.getLeft()).thenReturn(new ColumnSegment(0, 0, new IdentifierValue("row_id"))); when(binaryOperationExpression.getRight()).thenReturn(new LiteralExpressionSegment(0, 0, 2)); when(binaryOperationExpression.getOperator()).thenReturn(">="); when(whereSegment.getExpr()).thenReturn(binaryOperationExpression); SubqueryTableSegment subqueryTableSegment = mock(SubqueryTableSegment.class); SubquerySegment subquerySegment = mock(SubquerySegment.class); SelectStatement subSelectStatement = mock(MySQLSelectStatement.class); ProjectionsSegment subProjectionsSegment = mock(ProjectionsSegment.class); TopProjectionSegment topProjectionSegment = mock(TopProjectionSegment.class); when(topProjectionSegment.getAlias()).thenReturn("row_id"); when(topProjectionSegment.getTop()).thenReturn(new NumberLiteralRowNumberValueSegment(0, 0, 4L, false)); when(subProjectionsSegment.getProjections()).thenReturn(Collections.singletonList(topProjectionSegment)); when(subSelectStatement.getProjections()).thenReturn(subProjectionsSegment); when(subquerySegment.getSelect()).thenReturn(subSelectStatement); when(subqueryTableSegment.getSubquery()).thenReturn(subquerySegment); selectStatement.setFrom(subqueryTableSegment); 
selectStatement.setWhere(whereSegment); ShardingDQLResultMerger resultMerger = new ShardingDQLResultMerger(TypedSPILoader.getService(DatabaseType.class, "Oracle")); ShardingSphereDatabase database = mock(ShardingSphereDatabase.class, RETURNS_DEEP_STUBS); SelectStatementContext selectStatementContext = new SelectStatementContext(createShardingSphereMetaData(database), null, selectStatement, DefaultDatabase.LOGIC_NAME, Collections.emptyList()); when(database.getName()).thenReturn(DefaultDatabase.LOGIC_NAME); MergedResult actual = resultMerger.merge(Arrays.asList(mockQueryResult(), mockQueryResult(), mockQueryResult(), mockQueryResult()), selectStatementContext, database, mock(ConnectionContext.class)); assertTrue(actual.next()); assertTrue(actual.next()); assertFalse(actual.next()); } @Test void assertNextForRowCountBoundOpenedTrue() throws SQLException { OracleSelectStatement selectStatement = new OracleSelectStatement(); selectStatement.setProjections(new ProjectionsSegment(0, 0)); WhereSegment whereSegment = mock(WhereSegment.class); BinaryOperationExpression binaryOperationExpression = mock(BinaryOperationExpression.class); when(binaryOperationExpression.getLeft()).thenReturn(new ColumnSegment(0, 0, new IdentifierValue("row_id"))); when(binaryOperationExpression.getRight()).thenReturn(new LiteralExpressionSegment(0, 0, 2)); when(binaryOperationExpression.getOperator()).thenReturn(">="); when(whereSegment.getExpr()).thenReturn(binaryOperationExpression); SubqueryTableSegment subqueryTableSegment = mock(SubqueryTableSegment.class); SubquerySegment subquerySegment = mock(SubquerySegment.class); SelectStatement subSelectStatement = mock(MySQLSelectStatement.class); ProjectionsSegment subProjectionsSegment = mock(ProjectionsSegment.class); TopProjectionSegment topProjectionSegment = mock(TopProjectionSegment.class); when(topProjectionSegment.getAlias()).thenReturn("row_id"); when(topProjectionSegment.getTop()).thenReturn(new NumberLiteralRowNumberValueSegment(0, 0, 
4L, true)); when(subProjectionsSegment.getProjections()).thenReturn(Collections.singletonList(topProjectionSegment)); when(subSelectStatement.getProjections()).thenReturn(subProjectionsSegment); when(subquerySegment.getSelect()).thenReturn(subSelectStatement); when(subqueryTableSegment.getSubquery()).thenReturn(subquerySegment); selectStatement.setFrom(subqueryTableSegment); selectStatement.setWhere(whereSegment); ShardingDQLResultMerger resultMerger = new ShardingDQLResultMerger(TypedSPILoader.getService(DatabaseType.class, "Oracle")); ShardingSphereDatabase database = mock(ShardingSphereDatabase.class, RETURNS_DEEP_STUBS); SelectStatementContext selectStatementContext = new SelectStatementContext(createShardingSphereMetaData(database), null, selectStatement, DefaultDatabase.LOGIC_NAME, Collections.emptyList()); when(database.getName()).thenReturn(DefaultDatabase.LOGIC_NAME); MergedResult actual = resultMerger.merge(Arrays.asList(mockQueryResult(), mockQueryResult(), mockQueryResult(), mockQueryResult()), selectStatementContext, database, mock(ConnectionContext.class)); assertTrue(actual.next()); assertTrue(actual.next()); assertTrue(actual.next()); assertFalse(actual.next()); } private ShardingSphereMetaData createShardingSphereMetaData(final ShardingSphereDatabase database) { return new ShardingSphereMetaData(Collections.singletonMap(DefaultDatabase.LOGIC_NAME, database), mock(ResourceMetaData.class), mock(RuleMetaData.class), mock(ConfigurationProperties.class)); } private QueryResult mockQueryResult() throws SQLException { QueryResult result = mock(QueryResult.class, RETURNS_DEEP_STUBS); when(result.next()).thenReturn(true, true, false); return result; } } ```
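The expectations in these tests can be summarized with a small, illustrative Python model of the row-number window the merger applies (a sketch of the observable behaviour only, not ShardingSphere's actual implementation). Four mocked query results of two rows each give eight merged rows; the `row_id >= n` predicate supplies the lower bound, and the TOP segment's value plus its boundOpened flag supply the upper bound:

```python
def rownum_window(rows, offset=None, row_count=None, bound_opened=True):
    """Apply an Oracle-style ROWNUM window over already-merged rows.

    offset: 1-based lower bound taken from the `row_id >= n` predicate;
    row_count: upper bound taken from the TOP projection; bound_opened
    says whether the upper-bound row itself is kept.
    """
    kept = []
    for rownum, row in enumerate(rows, start=1):
        if offset is not None and rownum < offset:
            continue  # still skipping rows below the offset
        if row_count is not None:
            if bound_opened and rownum > row_count:
                break
            if not bound_opened and rownum >= row_count:
                break
        kept.append(row)
    return kept
```

Under this model the four tests above keep 0 rows (offset of `Integer.MAX_VALUE` skips everything), all 8 rows (no offset, no row count), 2 rows (bound not opened), and 3 rows (bound opened), matching the `assertTrue`/`assertFalse` sequences.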
The Puerta del Cambrón is a gate located in the western sector of the Spanish city of Toledo, in Castile-La Mancha. Previously also called the "Gate of the Jews" or "Gate of Saint Leocadia", it has been speculated that the name del Cambrón originated from a thorn bush (cambrón) that grew at the top of the ruins of one of its towers before the gate's last reconstruction in 1576. It is catalogued as a Bien de Interés Cultural. Features Renaissance in style, the gate has two pairs of towers and two arches, and is built of stone and brick. It underwent two renovations, in the early 1570s and in 1576. Hernán González, Diego de Velasco and Juan Bautista Monegro sculpted a figure of Leocadia for the gate. The gate suffered little damage during the Spanish Civil War. Notes References Bibliography Text, but not this edition, is in the public domain. External links Gates in Toledo, Spain
This is a list of works (films, television, shorts, etc.) by the South Korean production company Playlist Studio. Films Television series Love Playlist The first series produced by Playlist Studio is Love Playlist (), which premiered on March 9, 2017, and ran for four seasons until its final broadcast on August 10, 2019. Each episode depicts the experiences and thoughts of people in their 20s in matters of love: falling in love, starting a relationship, ending unrequited love, and a couple breaking up over a misunderstanding. The cast features Kim Hyung-seok as Lee Hyun-seung, Jung Shin-hye as Jung Ji-won, Lee Yoo-jin as Han Jae-in, Choi Hee-seung as Kim Min-woo, Lim Hwi-jin as Kwak Jun-mo, Park Jung-woo as Kang Yoon, Min Hyo-won as Kim Do-yeong, Bae Hyun-sung as Park Ha-neul and Kim Sae-ron as Seo Ji-min, who joined the line-up in season 4. Season 4 depicts characters navigating not only love but also friendship, military enlistment, and the awkward readjustment to society after being discharged from the military. The drama also features an original soundtrack (OST) by Brother Su and Cosmic Girls' Yoo Yeon-jung, Paul Kim, Kim Na-young, 10cm, MeloMance's Kim Min-seok, Exo-CBX, and Suran. Facebook was the main platform for seasons 1 and 2, and YouTube for seasons 3 and 4. Pu Reum's Vlog (, also known as Love Playlist Season 3.5) was released as a surprise on March 1, 2019, with a preview of the upcoming season 4. The drama became massively popular in South Korea and internationally, giving rise to the 'Yeon-Fly-Lee' syndrome. A fan meeting was later held at the White Wave Art Center in Seoul on August 18, 2017, with more than 300 fans attending; the event drew attention, with more than 17,000 people entering the invitation event. Love Playlist ended with total views across its four seasons exceeding 650 million.
As of July 2021, the series exceeded 700 million views. Dear. M is a spin-off of Love Playlist. It tells the stories of students at Seoyeon University trying to find 'M', the anonymous writer of the school community article that turned the school upside down. Originally planned for a February 2021 broadcast, it has since been postponed due to the controversy involving its lead actress Park Hye-su. Seventeen Following the success of Love Playlist, Seventeen depicts an unforgettable first love, centred on a group of close-knit friends and the love between a college student and a young man. Yellow Titled Yellow, the drama deals with the love of anxious youth in an indie band and their charismatic tag-along photographer. It aired from September 12 to October 12, 2017, and starred Kim Do-wan (Nam Ji-hoon, the leader), Kim Kwan-soo (Song Tae-min, the guitarist), Kim Hae-woo (Ja Pi, the drummer), Lee Se-jin (Park Dong-woo, the bassist), Ji Ye-eun (Cho Soo-a, the photographer) and Kim Ye-ji (Lee Yeo-reum). The drama features an OST from Mamamoo's Wheein, Melomance, and Car, the Garden. Yellow received a total of 28 million views in Korea alone, averaging 4.66 million views per episode, and reached a total of 43 million views by November 2017. Flower Ever After Flower Ever After is a romance drama about marriage in one's 20s and 30s. It starred five people with different charms: Lee Ho-jung as Han So-young, a newcomer loved by everyone for her natural sense; Choi Hee-jin as Ko Go-chae, a cute job seeker who has been in a long-term relationship for 7 years; Jung Geon-joo as Choi Woong, rough on the outside but warm in front of the woman he loves; Ahn Si-eun as the good-natured Gong Ji-hyo; and Kang Hoon as Yoo Hyeon-soo, a gentleman who dreams of marrying the woman he loves. The series recorded 800,000 views, the highest number of views in the shortest time.
A-Teen A-Teen is a web drama that first aired in July 2018 and gained explosive popularity among teenagers aged 13 to 19. It has been said to capture the concerns of teenagers, including dating and unrequited love. The drama recorded a peak of 4.38 million views on YouTube, and as of December 2019 the series had garnered a cumulative total of 480 million views. Just One Bite Just One Bite, which ran from July 19, 2018 to April 6, 2019, stars Kim Ji-in, Seo Hye-won, Jo Hye-joo, Park Seo-ham, Lee Shin-young, Park Seon-jae, Han Kyu-won and Lee Seong-ha. The drama shows three women who laugh and chat while looking for restaurants. It drew the sympathy of many viewers by depicting realistic mukbang as well as the friendship and love of women in their 20s. In Seoul In Seoul tells a realistic story of two mothers and daughters who clash over everything from room cleaning to college. XX As of the beginning of 2021, the cumulative number of views of XX reached about 100 million. Twenty-Twenty Twenty-Twenty is a drama by director Han Su-ji, who also directed the web drama A-Teen. It is the first lead role for Han Seong-min, who had been working as a magazine model; her character, the female lead Chae Da-hee, has lived her life as her mother laid it out. Kim Woo-seok, a member of the idol group UP10TION, takes on the role of Lee Hyun-jin, a male lead who creates a music crew and pursues his dreams independently, without the help of his parents. Plans Plans, also known as Plans: Seoyeon University x Revan, is a side story of Love Playlist and Flower Ever After. Released on July 10, 2021, on Playlist's official YouTube channel, the video revisits Love Playlist two years on. Lee Yoo-jin stars as Jae In, an intern at Revan Company who meets Kim Hyung-seok (Hyun Seung) and Choi Hee-seung (Min Woo) after a long time at Seoyeon University, where she has a job interview with her senior Kang Hoon (Hyun Soo).
With the main protagonists of Playlist's web dramas gathered in one place, the video reached 790,000 views within four days of its release.

Blue Birthday
Directed by Park Dan-hee and written by Goo So-yeon and Moon Won-young, Blue Birthday takes on new genres and material, reuniting the production team that drew attention with its earlier series The Best Ending (2019) and Ending Again (2020). The drama, starring Kim Ye-rim of Red Velvet and Hongseok of Pentagon, is a fantasy romance thriller in which the female protagonist revisits the past through mysterious photos left by her first love, who died on her birthday 10 years ago. The soundtrack consists of three songs, including remakes of two popular Korean songs released in 2011: Beast's "On Rainy Days" (re-recorded and sung by Heize) and Lucia with Epitone Project's "Any Day, Any Words" (re-recorded and sung by O3ohn). The third, original song of the drama, "It's You", was recorded by Colde.

Peng
The drama depicts the romance of a woman entering her 30s, in which four different men appear in the life of the main protagonist, Go Sa-ri (Yoon So-hee). Go Sa-ri, who is trying to start fresh as she says goodbye to her 20s, faces a difficult situation when her ex-boyfriend Jeon Woo-sang (Lee Seung-il), a younger man Yeon Ha-rim (Kim Hyun-jin), her best friend Pi Jung-won (Choi Won-myung), and her boss Ki Sun-jae (Joo Woo-jae) simultaneously try to win her over. Peng premiered on October 7, 2021, through YouTube and Watcha.

A DeadbEAT's Meal
A DeadbEAT's Meal tells the story of Jae-ho's growth as he heals from a series of job failures and breakup pains through food. It is based on the popular webtoon of the same name. Ha Seok-jin takes on the role of Kim Jae-ho, an involuntarily idle man in his second year of job hunting after five years of preparing for the civil service exam following his graduation from the Department of Korean Literature.
Go Won-hee takes on the role of Yeo Eun-ho, a hot-tempered, voluntarily jobless woman who claims that eating is all that is left in today's era of low interest rates. Im Hyun-joo plays Seo Soo-jeong, an employee of a large telecommunications company in her second year of working life.

Weak Hero Class 1
The drama is based on the Naver webtoon Weak Hero, written by Seopass and illustrated by Kim Jin-seok (Razen), which was first published in 2018. It stars Park Ji-hoon, Choi Hyun-wook, and Hong Kyung, and is directed and written by Yoo Soo-min, with Han Jun-hee as creative director. The series revolves around a weak-looking boy who, together with his friends, fights back against relentless violence. Three of its eight episodes were screened at the 27th Busan International Film Festival in the 'On Screen' section. It aired exclusively as a Wavve Original on November 18, 2022. The series was met with praise from critics and audiences for its production and the cast's solid performances.

Frequent collaborators
This table lists actors who appear as different characters in more than one television series.

Recurring cast and characters
This table lists the characters who will appear or have appeared in more than one television series in the Playlist Universe.
Legend: Main role · Supporting role · Guest role

Notable guests
Paul Kim
Na Jae-min (NCT Dream)
Lee Je-no (NCT Dream)
Ahn Ji-young (Bolbbalgan4)
JeA (Brown Eyed Girls)
Hyun Woo-jin
Joshua (Seventeen)
Kim Min-ju (Iz*One)
Hyunjin (Stray Kids)
I.N (Stray Kids)
Lee Han-wi
Ahn Se-ha
Kim Hye-yoon
Go Woo-ri
Jung Soo-young
Cha Yeob
Jeon No-min
Yoon Yoo-sun
Kim Jung-hak
Song Seon-mi
Kenta (JBJ95)
Lena
Yeonjun (Tomorrow X Together)
Young K (Day6)
Wonpil (Day6)
Jaejae
May (Cherry Bullet)
Remi (Cherry Bullet)
Haeyoon (Cherry Bullet)
Wooyeon (Woo!ah!)
Park Sung-hoon (Enhypen)
Kim Woo-seok (Up10tion)
```typescript
// Exception-handling test suite; glb1, x, pause, msg and assert are
// globals/helpers provided by the surrounding test harness.
namespace exceptions {
    function immediate(k: number) {
        try {
            pause(1)
            if (k > 0)
                throw "hl" + k
            pause(1)
            glb1++
        } catch (e) {
            assert(e == "hl" + k)
            glb1 += 10
            if (k >= 10)
                throw e
        } finally {
            x += glb1
        }
    }

    function throwVal(n: number) {
        pause(1)
        if (n > 0)
            throw "hel" + n
        pause(1)
    }

    function higherorder(k: number) {
        try {
            [1].map(() => throwVal(k))
            glb1++
        } catch (e) {
            assert(e == "hel" + k)
            glb1 += 10
            if (k >= 10)
                throw e
        } finally {
            x += glb1
        }
    }

    function lambda(k: number) {
        function inner() {
            try {
                throwVal(k)
                glb1++
            } catch (e) {
                assert(e == "hel" + k)
                glb1 += 10
                if (k >= 10)
                    throw e
            } finally {
                x += glb1
            }
        }
        inner()
    }

    function callingThrowVal(k: number) {
        try {
            pause(1)
            throwVal(k)
            pause(1)
            glb1++
        } catch (e) {
            assert(e == "hel" + k)
            glb1 += 10
            if (k >= 10)
                throw e
        } finally {
            x += glb1
        }
    }

    function nested() {
        try {
            try {
                callingThrowVal(10)
            } catch (e) {
                assert(glb1 == 10 && x == 10)
                glb1++
                throw e
            }
        } catch (ee) {
            assert(glb1 == 11)
        }
    }

    function test3(fn: (k: number) => void) {
        glb1 = x = 0
        fn(1)
        assert(glb1 == 10 && x == 10)
        fn(0)
        assert(glb1 == 11 && x == 21)
        fn(3)
        assert(glb1 == 21 && x == 42)
    }

    function test4(fn: () => void) {
        try {
            fn()
            return 10
        } catch {
            return 20
        }
    }

    function test5() {
        let n = 0
        for (let k of [0, 1, 2]) {
            try {
                n++
                try {
                    if (k == 1)
                        break
                } catch {
                    n += 1000
                }
            } catch {
                n += 100
            }
        }
        assert(n == 2)
    }

    export function run() {
        msg("test exn")
        glb1 = x = 0
        callingThrowVal(1)
        assert(glb1 == 10 && x == 10)
        callingThrowVal(0)
        assert(glb1 == 11 && x == 21)
        callingThrowVal(3)
        assert(glb1 == 21 && x == 42)
        test3(callingThrowVal)
        test3(immediate)
        test3(higherorder)
        test3(lambda)
        glb1 = x = 0
        nested()
        assert(glb1 == 11)
        assert(test4(() => { }) == 10)
        assert(test4(() => { throw "foo" }) == 20)
        test5()
        msg("test exn done")
    }
}
exceptions.run();
```
```javascript
/**
* @license Apache-2.0
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/

'use strict';

// MODULES //

var tape = require( 'tape' );
var floor = require( '@stdlib/math/base/special/floor' );
var isnan = require( '@stdlib/math/base/assert/is-nan' );
var Float64Array = require( '@stdlib/array/float64' );
var dmeanvarpn = require( './../lib/ndarray.js' );


// TESTS //

tape( 'main export is a function', function test( t ) {
	t.ok( true, __filename );
	t.strictEqual( typeof dmeanvarpn, 'function', 'main export is a function' );
	t.end();
});

tape( 'the function has an arity of 8', function test( t ) {
	t.strictEqual( dmeanvarpn.length, 8, 'has expected arity' );
	t.end();
});

tape( 'the function calculates the arithmetic mean and population variance of a strided array', function test( t ) {
	var expected;
	var out;
	var x;
	var v;

	x = new Float64Array( [ 1.0, -2.0, -4.0, 5.0, 0.0, 3.0 ] );
	out = new Float64Array( 2 );
	v = dmeanvarpn( x.length, 0, x, 1, 0, out, 1, 0 );
	expected = new Float64Array( [ 0.5, 53.5/x.length ] );
	t.strictEqual( v, out, 'returns expected value' );
	t.deepEqual( v, expected, 'returns expected value' );

	x = new Float64Array( [ -4.0, -4.0 ] );
	out = new Float64Array( 2 );
	v = dmeanvarpn( x.length, 0, x, 1, 0, out, 1, 0 );
	expected = new Float64Array( [ -4.0, 0.0 ] );
	t.strictEqual( v, out, 'returns expected value' );
	t.deepEqual( v, expected, 'returns expected value' );

	x = new Float64Array( [ NaN, 4.0 ] );
	out = new Float64Array( 2 );
	v = dmeanvarpn( x.length, 0, x, 1, 0, out, 1, 0 );
	t.strictEqual( v, out, 'returns expected value' );
	t.strictEqual( isnan( v[ 0 ] ), true, 'returns expected value' );
	t.strictEqual( isnan( v[ 1 ] ), true, 'returns expected value' );

	t.end();
});

tape( 'the function calculates the arithmetic mean and sample variance of a strided array', function test( t ) {
	var expected;
	var out;
	var x;
	var v;

	x = new Float64Array( [ 1.0, -2.0, -4.0, 5.0, 0.0, 3.0 ] );
	out = new Float64Array( 2 );
	v = dmeanvarpn( x.length, 1, x, 1, 0, out, 1, 0 );
	expected = new Float64Array( [ 0.5, 53.5/(x.length-1) ] );
	t.strictEqual( v, out, 'returns expected value' );
	t.deepEqual( v, expected, 'returns expected value' );

	x = new Float64Array( [ -4.0, -4.0 ] );
	out = new Float64Array( 2 );
	v = dmeanvarpn( x.length, 1, x, 1, 0, out, 1, 0 );
	expected = new Float64Array( [ -4.0, 0.0 ] );
	t.strictEqual( v, out, 'returns expected value' );
	t.deepEqual( v, expected, 'returns expected value' );

	x = new Float64Array( [ NaN, 4.0 ] );
	out = new Float64Array( 2 );
	v = dmeanvarpn( x.length, 1, x, 1, 0, out, 1, 0 );
	t.strictEqual( v, out, 'returns expected value' );
	t.strictEqual( isnan( v[ 0 ] ), true, 'returns expected value' );
	t.strictEqual( isnan( v[ 1 ] ), true, 'returns expected value' );

	t.end();
});

tape( 'if provided an `N` parameter less than or equal to `0`, the function returns `NaN` values', function test( t ) {
	var out;
	var x;
	var v;

	x = new Float64Array( [ 1.0, -2.0, -4.0, 5.0, 0.0, 3.0 ] );

	out = new Float64Array( 2 );
	v = dmeanvarpn( 0, 1, x, 1, 0, out, 1, 0 );
	t.strictEqual( isnan( v[ 0 ] ), true, 'returns expected value' );
	t.strictEqual( isnan( v[ 1 ] ), true, 'returns expected value' );

	out = new Float64Array( 2 );
	v = dmeanvarpn( -1, 1, x, 1, 0, out, 1, 0 );
	t.strictEqual( isnan( v[ 0 ] ), true, 'returns expected value' );
	t.strictEqual( isnan( v[ 1 ] ), true, 'returns expected value' );

	t.end();
});

tape( 'if provided an `N` parameter equal to `1`, the function returns a population variance of `0`', function test( t ) {
	var expected;
	var out;
	var x;
	var v;

	x = new Float64Array( [ 1.0, -2.0, -4.0, 5.0, 0.0, 3.0 ] );
	out = new Float64Array( 2 );
	v = dmeanvarpn( 1, 0, x, 1, 0, out, 1, 0 );
	expected = new Float64Array( [ 1.0, 0.0 ] );
	t.deepEqual( v, expected, 'returns expected value' );

	t.end();
});

tape( 'if provided a `correction` parameter yielding `N-correction` less than or equal to `0`, the function returns a variance equal to `NaN`', function test( t ) {
	var out;
	var x;
	var v;

	x = new Float64Array( [ 1.0, -2.0, -4.0, 5.0, 0.0, 3.0 ] );

	out = new Float64Array( 2 );
	v = dmeanvarpn( 1, 1, x, 1, 0, out, 1, 0 );
	t.strictEqual( v[ 0 ], 1.0, 'returns expected value' );
	t.strictEqual( isnan( v[ 1 ] ), true, 'returns expected value' );

	out = new Float64Array( 2 );
	v = dmeanvarpn( x.length, x.length, x, 0, 0, out, 1, 0 );
	t.strictEqual( v[ 0 ], 1.0, 'returns expected value' );
	t.strictEqual( isnan( v[ 1 ] ), true, 'returns expected value' );

	out = new Float64Array( 2 );
	v = dmeanvarpn( x.length, x.length, x, 1, 0, out, 1, 0 );
	t.strictEqual( v[ 0 ], 0.5, 'returns expected value' );
	t.strictEqual( isnan( v[ 1 ] ), true, 'returns expected value' );

	out = new Float64Array( 2 );
	v = dmeanvarpn( x.length, x.length+1, x, 1, 0, out, 1, 0 );
	t.strictEqual( v[ 0 ], 0.5, 'returns expected value' );
	t.strictEqual( isnan( v[ 1 ] ), true, 'returns expected value' );

	t.end();
});

tape( 'the function supports `stride` parameters', function test( t ) {
	var expected;
	var out;
	var N;
	var x;
	var v;

	x = new Float64Array([
		1.0,  // 0
		2.0,
		2.0,  // 1
		-7.0,
		-2.0, // 2
		3.0,
		4.0,  // 3
		2.0
	]);

	out = new Float64Array( 4 );
	N = floor( x.length / 2 );

	v = dmeanvarpn( N, 1, x, 2, 0, out, 2, 0 );

	expected = new Float64Array( [ 1.25, 0.0, 6.25, 0.0 ] );
	t.deepEqual( v, expected, 'returns expected value' );
	t.end();
});

tape( 'the function supports negative `stride` parameters', function test( t ) {
	var expected;
	var out;
	var N;
	var x;
	var v;

	x = new Float64Array([
		1.0,  // 3
		2.0,
		2.0,  // 2
		-7.0,
		-2.0, // 1
		3.0,
		4.0,  // 0
		2.0
	]);

	out = new Float64Array( 4 );
	N = floor( x.length / 2 );

	v = dmeanvarpn( N, 1, x, -2, 6, out, -2, 2 );

	expected = new Float64Array( [ 6.25, 0.0, 1.25, 0.0 ] );
	t.deepEqual( v, expected, 'returns expected value' );
	t.end();
});

tape( 'if provided a `stride` parameter equal to `0`, the function returns an arithmetic mean equal to the first element and a variance of `0`', function test( t ) {
	var expected;
	var out;
	var x;
	var v;

	x = new Float64Array( [ 1.0, -2.0, -4.0, 5.0, 3.0 ] );

	out = new Float64Array( 2 );
	v = dmeanvarpn( x.length, 1, x, 0, 0, out, 1, 0 );

	expected = new Float64Array( [ 1.0, 0.0 ] );
	t.deepEqual( v, expected, 'returns expected value' );
	t.end();
});

tape( 'the function supports `offset` parameters', function test( t ) {
	var expected;
	var out;
	var N;
	var x;
	var v;

	x = new Float64Array([
		2.0,
		1.0,  // 0
		2.0,
		-2.0, // 1
		-2.0,
		2.0,  // 2
		3.0,
		4.0   // 3
	]);

	out = new Float64Array( 4 );
	N = floor( x.length / 2 );

	v = dmeanvarpn( N, 1, x, 2, 1, out, 2, 1 );

	expected = new Float64Array( [ 0.0, 1.25, 0.0, 6.25 ] );
	t.deepEqual( v, expected, 'returns expected value' );
	t.end();
});
```
```go
// +build linux

package apparmor // import "github.com/docker/docker/profiles/apparmor"

// baseTemplate defines the default apparmor profile for containers.
const baseTemplate = `
{{range $value := .Imports}}
{{$value}}
{{end}}

profile {{.Name}} flags=(attach_disconnected,mediate_deleted) {
{{range $value := .InnerImports}}
  {{$value}}
{{end}}

  network,
  capability,
  file,
  umount,

  deny @{PROC}/* w,   # deny write for all files directly in /proc (not in a subdir)
  # deny write to files not in /proc/<number>/** or /proc/sys/**
  deny @{PROC}/{[^1-9],[^1-9][^0-9],[^1-9s][^0-9y][^0-9s],[^1-9][^0-9][^0-9][^0-9]*}/** w,
  deny @{PROC}/sys/[^k]** w,  # deny /proc/sys except /proc/sys/k* (effectively /proc/sys/kernel)
  deny @{PROC}/sys/kernel/{?,??,[^s][^h][^m]**} w,  # deny everything except shm* in /proc/sys/kernel/
  deny @{PROC}/sysrq-trigger rwklx,
  deny @{PROC}/kcore rwklx,

  deny mount,

  deny /sys/[^f]*/** wklx,
  deny /sys/f[^s]*/** wklx,
  deny /sys/fs/[^c]*/** wklx,
  deny /sys/fs/c[^g]*/** wklx,
  deny /sys/fs/cg[^r]*/** wklx,
  deny /sys/firmware/** rwklx,
  deny /sys/kernel/security/** rwklx,

{{if ge .Version 208095}}
  # suppress ptrace denials when using 'docker ps' or using 'ps' inside a container
  ptrace (trace,read) peer={{.Name}},
{{end}}
}
`
```
```java
/*
 * path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 */
package cn.nekocode.camerafilter.filter;

import android.content.Context;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.support.annotation.CallSuper;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import cn.nekocode.camerafilter.MyGLUtils;
import cn.nekocode.camerafilter.R;
import cn.nekocode.camerafilter.RenderBuffer;

/**
 * @author nekocode (nekocode.cn@gmail.com)
 */
public abstract class CameraFilter {
    static final float SQUARE_COORDS[] = {
            1.0f, -1.0f,
            -1.0f, -1.0f,
            1.0f, 1.0f,
            -1.0f, 1.0f,
    };
    static final float TEXTURE_COORDS[] = {
            1.0f, 0.0f,
            0.0f, 0.0f,
            1.0f, 1.0f,
            0.0f, 1.0f,
    };
    static FloatBuffer VERTEX_BUF, TEXTURE_COORD_BUF;
    static int PROGRAM = 0;

    private static final int BUF_ACTIVE_TEX_UNIT = GLES20.GL_TEXTURE8;
    private static RenderBuffer CAMERA_RENDER_BUF;

    private static final float ROATED_TEXTURE_COORDS[] = {
            1.0f, 0.0f,
            1.0f, 1.0f,
            0.0f, 0.0f,
            0.0f, 1.0f,
    };
    private static FloatBuffer ROATED_TEXTURE_COORD_BUF;

    final long START_TIME = System.currentTimeMillis();
    int iFrame = 0;

    public CameraFilter(Context context) {
        // Setup default Buffers
        if (VERTEX_BUF == null) {
            VERTEX_BUF = ByteBuffer.allocateDirect(SQUARE_COORDS.length * 4)
                    .order(ByteOrder.nativeOrder()).asFloatBuffer();
            VERTEX_BUF.put(SQUARE_COORDS);
            VERTEX_BUF.position(0);
        }
        if (TEXTURE_COORD_BUF == null) {
            TEXTURE_COORD_BUF = ByteBuffer.allocateDirect(TEXTURE_COORDS.length * 4)
                    .order(ByteOrder.nativeOrder()).asFloatBuffer();
            TEXTURE_COORD_BUF.put(TEXTURE_COORDS);
            TEXTURE_COORD_BUF.position(0);
        }
        if (ROATED_TEXTURE_COORD_BUF == null) {
            ROATED_TEXTURE_COORD_BUF = ByteBuffer.allocateDirect(ROATED_TEXTURE_COORDS.length * 4)
                    .order(ByteOrder.nativeOrder()).asFloatBuffer();
            ROATED_TEXTURE_COORD_BUF.put(ROATED_TEXTURE_COORDS);
            ROATED_TEXTURE_COORD_BUF.position(0);
        }
        if (PROGRAM == 0) {
            PROGRAM = MyGLUtils.buildProgram(context, R.raw.vertext, R.raw.original_rtt);
        }
    }

    @CallSuper
    public void onAttach() {
        iFrame = 0;
    }

    final public void draw(int cameraTexId, int canvasWidth, int canvasHeight) {
        // TODO move?
        // Create camera render buffer
        if (CAMERA_RENDER_BUF == null ||
                CAMERA_RENDER_BUF.getWidth() != canvasWidth ||
                CAMERA_RENDER_BUF.getHeight() != canvasHeight) {
            CAMERA_RENDER_BUF = new RenderBuffer(canvasWidth, canvasHeight, BUF_ACTIVE_TEX_UNIT);
        }

        // Use shaders
        GLES20.glUseProgram(PROGRAM);

        int iChannel0Location = GLES20.glGetUniformLocation(PROGRAM, "iChannel0");
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, cameraTexId);
        GLES20.glUniform1i(iChannel0Location, 0);

        int vPositionLocation = GLES20.glGetAttribLocation(PROGRAM, "vPosition");
        GLES20.glEnableVertexAttribArray(vPositionLocation);
        GLES20.glVertexAttribPointer(vPositionLocation, 2, GLES20.GL_FLOAT, false, 4 * 2, VERTEX_BUF);

        int vTexCoordLocation = GLES20.glGetAttribLocation(PROGRAM, "vTexCoord");
        GLES20.glEnableVertexAttribArray(vTexCoordLocation);
        GLES20.glVertexAttribPointer(vTexCoordLocation, 2, GLES20.GL_FLOAT, false, 4 * 2, ROATED_TEXTURE_COORD_BUF);

        // Render to texture
        CAMERA_RENDER_BUF.bind();
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        CAMERA_RENDER_BUF.unbind();

        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);
        onDraw(CAMERA_RENDER_BUF.getTexId(), canvasWidth, canvasHeight);

        iFrame++;
    }

    abstract void onDraw(int cameraTexId, int canvasWidth, int canvasHeight);

    void setupShaderInputs(int program, int[] iResolution, int[] iChannels, int[][] iChannelResolutions) {
        setupShaderInputs(program, VERTEX_BUF, TEXTURE_COORD_BUF, iResolution, iChannels, iChannelResolutions);
    }

    void setupShaderInputs(int program, FloatBuffer vertex, FloatBuffer textureCoord, int[] iResolution, int[] iChannels, int[][] iChannelResolutions) {
        GLES20.glUseProgram(program);

        int iResolutionLocation = GLES20.glGetUniformLocation(program, "iResolution");
        GLES20.glUniform3fv(iResolutionLocation, 1,
                FloatBuffer.wrap(new float[]{(float) iResolution[0], (float) iResolution[1], 1.0f}));

        float time = ((float) (System.currentTimeMillis() - START_TIME)) / 1000.0f;
        int iGlobalTimeLocation = GLES20.glGetUniformLocation(program, "iGlobalTime");
        GLES20.glUniform1f(iGlobalTimeLocation, time);

        int iFrameLocation = GLES20.glGetUniformLocation(program, "iFrame");
        GLES20.glUniform1i(iFrameLocation, iFrame);

        int vPositionLocation = GLES20.glGetAttribLocation(program, "vPosition");
        GLES20.glEnableVertexAttribArray(vPositionLocation);
        GLES20.glVertexAttribPointer(vPositionLocation, 2, GLES20.GL_FLOAT, false, 4 * 2, vertex);

        int vTexCoordLocation = GLES20.glGetAttribLocation(program, "vTexCoord");
        GLES20.glEnableVertexAttribArray(vTexCoordLocation);
        GLES20.glVertexAttribPointer(vTexCoordLocation, 2, GLES20.GL_FLOAT, false, 4 * 2, textureCoord);

        for (int i = 0; i < iChannels.length; i++) {
            int sTextureLocation = GLES20.glGetUniformLocation(program, "iChannel" + i);
            GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + i);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, iChannels[i]);
            GLES20.glUniform1i(sTextureLocation, i);
        }

        float _iChannelResolutions[] = new float[iChannelResolutions.length * 3];
        for (int i = 0; i < iChannelResolutions.length; i++) {
            _iChannelResolutions[i * 3] = iChannelResolutions[i][0];
            _iChannelResolutions[i * 3 + 1] = iChannelResolutions[i][1];
            _iChannelResolutions[i * 3 + 2] = 1.0f;
        }

        int iChannelResolutionLocation = GLES20.glGetUniformLocation(program, "iChannelResolution");
        GLES20.glUniform3fv(iChannelResolutionLocation,
                _iChannelResolutions.length, FloatBuffer.wrap(_iChannelResolutions));
    }

    public static void release() {
        PROGRAM = 0;
        CAMERA_RENDER_BUF = null;
    }
}
```
Daganapothaha is a small town in Sri Lanka. It is located within Southern Province.

See also
List of towns in Southern Province, Sri Lanka
Domestic violence in Brazil involves any type of violence or abuse between intimate partners or family members. The majority of domestic violence cases in Brazil are committed by men against their female partners. In 2015, the government released a study showing that a woman was a victim of domestic violence in Brazil every seven minutes, that over 70% of the Brazilian female population will suffer some kind of violence during their lifetime, and that 1 in every 4 women reports being a victim of psychological or physical violence. In 2017, Brazil recorded an estimated 606 cases of violence and 164 cases of rape per day, over 60 thousand rape cases throughout the year. It is also estimated that only 10% of cases are reported to the police. Although Brazil acknowledged that domestic violence was a problem in the 1940s, the government only acted on it from the 1980s onwards, with the creation of the Women's Police Stations (Delegacia da Mulher) and later, in 2006, with the publication of the Domestic Violence Law. Domestic violence is legally defined in Article 5 of the Domestic Violence Law of 2006 as "any action or omission motivated by gender that results in death, lesion, physical, sexual or psychological suffering, or moral or patrimonial harm". Although the legal definition is explained extensively in the law, identifying domestic violence remains the responsibility of the victims or close relatives.

Background
According to the NGO Marias, the practice of domestic violence has several contributing causes, such as alcoholism, adultery, jealousy, drugs and financial problems; according to Matthew Guttmann, an anthropologist at Brown University who studies masculinity, the main cause of domestic violence is sexism, or machismo (in Portuguese).
In his study, professor Guttmann argues that, contrary to common belief, violent male behavior is not a physiological or biological characteristic but the product of a culture of machismo, predominant in most societies, that reinforces men's supposed superiority over women. In Brazil's colonial period, men were considered the "owners" of the women they married, entitled to beat, abuse or even kill them if they deemed it necessary. A survey conducted by UN Women and Grupo Boticário shows that even today, 95% of women and 81% of men interviewed agree with the statement that machismo is predominant in Brazil. According to professor Stela Meneghel, a physician with postdoctoral training specializing in gender studies, violence practiced against women aims to keep them in a position inferior to men, while men often feel they have to "educate" women about their duties and their place. A study from the Instituto de Pesquisa Econômica Aplicada (Ipea) shows evidence of widespread machismo in Brazil. According to the 2014 study, Brazilian society still believes in a patriarchal nuclear family in which the man is perceived as the breadwinner, though his rights over women and children are restricted and exclude open and extreme forms of violence. Women, on the other hand, are expected to "respect themselves" and behave according to traditional family models. The survey reveals acceptance of some kinds of intervention in domestic violence cases: 85% of interviewees responded that in cases of violence the couple should divorce, and over 90% agreed that men who beat women should go to jail. The survey also shows that the Brazilian population is well informed about the origins of violence: 75% of interviewees disagreed that violence is part of the male nature.
However, there is evidence of widespread sexism: 58.5% of interviewees believe that if women knew how to behave, there would be fewer rape cases, and over 65% agree that women who wear revealing clothes deserve to be attacked. There is also evidence of the widespread belief that victims of violence should be the ones to take action against it: 65.1% of interviewees stated that victims of domestic violence who don't leave their abusive partners like being beaten. This assumption ignores all the social and psychological factors of an abusive relationship that may deter a victim from reporting the abuse and leaving the relationship. Some research proposes a positive correlation between financial difficulties and violence: in scenarios of economic uncertainty and instability, a woman is about one third more likely to be a victim of domestic violence. A survey conducted by DataSenado in 2015 shows that 100% of the women interviewed knew about the existence of the Maria da Penha Law, reflecting the female population's increasing awareness of their rights. However, 43% reported not being treated with respect, an increase of 8 percentage points over the 2013 survey, which may also indicate a shift in the common understanding of what it means to be "treated right". In addition, women reported feeling safer and have begun to identify and report domestic violence cases more frequently. The same survey shows that approximately 21% of victims of domestic violence do not seek help; the main reasons given are concern for their children (24%), fear of the aggressor's revenge (21%), belief that the episode would be the last one (16%), disbelief in appropriate legal consequences (10%) and shame (7%). A survey by the Justice Ministry reveals that 80% of domestic violence victims do not want the aggressor to be imprisoned.
The victims suggest alternative measures, such as psychological treatment (40%), discussion groups for aggressors (30%) and mandatory community service (10%); 9% of the women interviewed reported feeling totally or partially guilty for the violence they suffered. For Cristiane Brandão, Criminal Law professor at the Federal University of Rio de Janeiro (UFRJ), these numbers are the result of the predominantly patriarchal and sexist Brazilian society.

Law
The Brazilian Constitution of 1988 provides for equal rights for men and women; however, the first legal instrument against domestic violence was published only 18 years after the constitution. The well-known Brazilian law, the Lei Maria da Penha, was the result of an international process led by Maria da Penha herself. A victim of domestic violence, Maria da Penha Fernandes was shot by her husband with a rifle, and he later tried to electrocute her in the bathroom. As a consequence, she became paraplegic and began a long battle in court to convict her husband. In the 1990s, Maria da Penha appealed to the Inter-American Commission on Human Rights and, in 2001, she was able to obtain justice and hold the Brazilian government accountable for judicial tolerance of domestic violence. The commission also recommended that the Brazilian government adopt more effective measures against violence toward women.

Domestic Violence Law 2006 (Law 11.340/2006)
The first legal form of protection for victims of domestic violence was published on August 7, 2006 by President Lula, who signed the Law of Domestic and Family Violence, also known as the Lei Maria da Penha.
The law recognizes five main types of domestic violence, as presented below:
Psychological violence: cursing, humiliating, threatening, intimidating, frightening, continually criticizing, devaluing someone's acts in private or in public, and exercising any kind of emotional manipulation;
Physical violence: hitting, spanking, pushing, throwing objects, biting, mutilating or torturing, with or without domestic implements such as knives, working tools or guns;
Sexual violence: non-consensual sexual relations (e.g. while the partner is sleeping), forcing the partner to look at pornographic material, forcing the partner to have sex with other people, preventing women from controlling their use of contraceptives, forcing an abortion or preventing women from getting pregnant;
Patrimonial violence: controlling, retaining or taking someone's money, deliberately damaging the partner's personal objects, or withholding her personal objects, personal documents or work documents;
Moral violence: offending or humiliating the partner in public, exposing the couple's intimacy, including on social media, or publicly accusing the partner of committing crimes.
Brazilian law prohibits domestic violence, and the government has taken steps that specifically address violence against women and spousal abuse. The law triples the previous punishments for those convicted of such crimes and creates special courts in all states to preside over these cases. It is also the first official codification of domestic violence crimes. The Superior Court of Justice (Brazil) has reinforced the law by allowing legal proceedings to begin based solely on the police report of the violent event, without the victim needing to be present or to act as the main complainant.
Update of Rape Crimes in the Criminal Code (Law 12.015/2009)
In 2009, the criminal code was updated to treat rape as a crime against dignity and sexual liberty, recognizing that all individuals, regardless of gender, have the right to demand respect for their sexual lives and the obligation to respect the sexual choices of other people.

Feminicide Law of 2015 (Law 13.104/2015)
In 2015, President Dilma Rousseff approved the Feminicide Law, which altered the Brazilian Criminal Code to define feminicide as a type of homicide and a heinous crime. Feminicide is the murder of a woman simply because of her condition as a woman. Feminicide crimes are motivated by hate, contempt or a sense of lost ownership over the woman. The law provides for aggravating conditions that increase the legal penalty by one third, such as (i) a crime committed during pregnancy or within 3 months after childbirth, (ii) a crime committed against a woman under 14 years old, over 60 years old or with a disability, and (iii) a crime committed in the presence of the parents or the children of the victim.

Government action
Delegacia da Mulher: The government acted to combat violence against women. Each state secretariat for public security operated delegacias da mulher (DEAMs), police stations dedicated exclusively to addressing crimes against women. The quality of services varied widely, and availability was particularly limited in isolated areas. The stations provided psychological counseling, temporary shelter, and hospital treatment for victims of domestic violence and rape (including treatment for HIV and other sexually transmitted diseases). The stations also assisted in the prosecution of criminal cases by investigating and forwarding evidence to the courts.
According to the Ministry of Justice, while many of the DEAMs fell far short of standards and lacked strategies to protect victims after reports were filed, they nevertheless served to raise public awareness of crimes against women.

"Mulher, Viver sem Violência" Program: Launched on March 13, 2013 by President Dilma Rousseff, this initiative aims to expand and improve the public services offered to women victims of violence. The program includes the following series of initiatives:
1. Ligue 180: In 2005, the federal government implemented a toll-free hotline to address complaints of violence against women, provided by the Women's Ministry and the Special Secretariat of Women's Policies. The hotline not only receives complaints of violence, but also orients victims regarding their legal entitlements and refers them to other public services when applicable. In 2015, the hotline registered 749,024 calls, an average of 2,052 calls per day. Since its implementation, Ligue 180 has registered 4,823,140 calls. Approximately 2/3 of the calls received report some kind of domestic violence; 41.09% of calls request more information, 9.56% are referred to special services supporting women, and 38.54% are referred to other services such as the Military Police, the Civil Police or the Human Rights Secretariat.
2. Casa da Mulher Brasileira: The Brazilian Woman's House is a humanized service center supporting women victims of violence. These institutions perform the triage of victims, offer psychological support and have dedicated areas for children. They also bring together a number of public services, such as police stations, a court, the Public Prosecutor's Office (Ministério Público) and a transportation services center.
3. Humanization of the services provided to victims of sexual violence: new procedures have been implemented in public hospitals and public health centers to treat victims of sexual violence in a more humanized and effective way.
The program also includes training professionals in the new protocol.
4. Women's care centers in the drought regions: implementation of seven centers to support migrant women who are victims of violence and to help curb trafficking in women.
5. Awareness campaigns: several media campaigns aimed at informing the population about the services provided by the government, as well as changing social norms around sexism and stereotypes.
6. Mobile care units: buses and boats specially equipped to provide care, health, and legal services to women in remote locations. By 2015, over 53 mobile units had been deployed.
The law requires health facilities to contact the police in cases where a woman has been harmed physically, sexually, or psychologically. Despite the formal and legal actions taken by the federal government, local governments still appear to struggle with enforcement, with reports of inaction by police and courts in many cases even after the violence was reported to the authorities. UN Special Rapporteur Leandro Despouy noted a tendency to blame the victims of these offenses. According to government officials and NGO workers, the majority of criminal complaints regarding domestic violence were suspended inconclusively.

Incidence
According to the World Health Organization, 35% of women have suffered physical and/or sexual violence from an intimate partner, or sexual violence from a non-partner, during their lives. Brazil ranks fifth in the domestic violence crime index, preceded only by El Salvador, Colombia, Guatemala, and Russia. In 2013, 4,762 women were murdered in Brazil; 50.3% of these crimes were committed by family members and 33.2% by a current or former partner. A study sponsored by the United Nations, the World Health Organization, and the Brazilian government found that 106,093 women were murdered in Brazil between 1980 and 2013.
According to the Mapa da Violência 2015 report, femicide rates have been rising, reaching 4.8 homicides per 100,000 women in 2013.

Regional differences
In 2013, states differed significantly in the incidence of murders of women. Roraima was the state with the highest rate, at 15.3 per 100,000 women, and São Paulo the state with the lowest, at 2.9; the Brazilian average was 4.8 per 100,000 women in 2013.

Race differences
According to the Mapa da Violência 2015 report, the black population is the main victim of domestic violence and homicide in the country. Homicide rates among the white population have been falling (−9.8% between 2003 and 2013), while rates among the black population keep rising (+54.2% between 2003 and 2013).

Age differences
According to the Mapa da Violência 2015 report, about two-thirds of the victims of violence attended by the Brazilian public health system (Sistema Único de Saúde, SUS) are women. In 2014, 22,796 women were attended as victims of various types of violence.
Distribution by sex and age of the number of victims of violence registered in the SUS

Types of violence
According to the 2015 balance of the Ligue 180 hotline, 76,651 reports of violence were registered: 38,451 reports of physical violence (50.15%); 23,247 of psychological violence (30.33%); 5,556 of moral violence (7.25%); 3,961 of home imprisonment (5.17%); 3,478 of sexual violence (4.54%); 1,607 of patrimonial violence (2.10%); and 351 of human trafficking (0.46%).

The aggressor
According to the Mapa da Violência 2015 report, 82% of aggressions against female children were committed by the child's parents, with the mother alone accounting for 42.2%. For adolescents, the main offenders are the parents (26.5%) and current or former partners (23.2%). Among young and adult women, more than 50% of aggressions are committed by current or former partners.
For the elderly, the main offenders were their own children (34.9%). A study conducted by IPEA in 2013 concluded that approximately 30% of the deaths of women classified as feminicide occurred in the victim's own home.

Local attitudes and activism
In a 2017 study conducted by IPSOS and the Avon Institute, political scientist Céli Pinto, professor at the Federal University of Rio Grande do Sul (UFRGS) and author of the book "Uma História do Feminismo no Brasil" (A History of Feminism in Brazil), argues that recent episodes of domestic violence have fomented a "renaissance of feminism", named the "Feminism Spring" in the study, which aims to eliminate sexism and violence against women. Online campaigns have gained popularity since 2014, beginning with the movement #NãoMereçoSerEstuprada (I Do Not Deserve to Be Raped), launched after the publication of the IPEA survey "Tolerancia Social a Violencia contra Mulher", in which over 65% of interviewees agreed that women who wear clothes that show their bodies deserve to be raped. In 2015, Jout Jout, a Brazilian YouTuber, published a video on abusive relationships that inspired the movement #NãoTiraOBatomVermelho (Keep Your Red Lipstick On). In 2016, after sexual remarks were directed at one of the girls participating in the TV show Masterchef Junior, another social media movement, #PrimeiroAssedio (First Harassment), encouraged women of all ages to share the first experience in which they suffered sexual harassment. Other hashtags such as #ChegaDeFiuFiu (No More Catcalls) and #MeuAmigoSecreto (My Secret Friend) also called attention to sexist behavior in everyday situations, such as walking down the street or being at work. In 2018, elevator footage of Luis Filipe Manvalier beating his wife Tatiane Spitzner, who later died, aired on the popular Fantástico TV program, setting off a national discussion and the social media hashtag #metaAcolher (Stick a Spoon In, a reference to butting into a domestic dispute).
In addition to the online campaigns, street demonstrations have also kept the issue of violence against women visible: in the Marcha das Margaridas (March of the Daisies), more than 70,000 people marched against violence against women. In the 2015 ENEM, Brazil's standardized college qualification exam, the essay topic was "The Persistence of Violence against Women in Brazilian Society." Recent episodes have also drawn criticism: Faustão, one of the most famous TV show hosts, said that "women like to be beaten", and NGOs, activists, and people on social media demanded that the TV program apologize to the public.

See also
Lei Maria da Penha
Femicide
Women's police stations
General: Women in Brazil
Crime in Brazil

References
Brazil
Family in Brazil
Group 7 of the UEFA Women's Euro 2017 qualifying competition consisted of five teams: England, Belgium, Serbia, Bosnia and Herzegovina, and Estonia. The composition of the eight groups in the qualifying group stage was decided by the draw held on 20 April 2015. The group was played in home-and-away round-robin format. The group winners qualified directly for the final tournament, while the runners-up also qualified directly if they were one of the six best runners-up among all eight groups (not counting results against the fifth-placed team); otherwise, the runners-up advance to the play-offs. Standings Matches Times are CEST (UTC+2) for dates between 29 March and 24 October 2015 and between 27 March and 29 October 2016, for other dates times are CET (UTC+1). Goalscorers 6 goals Milena Nikolić Karen Carney Danielle Carter 5 goals Jill Scott 4 goals Janice Cayman Tessa Wullaert 3 goals Tine De Caigny Aline Zeler Nikita Parris Jelena Čubrilo Mirela Tenkov 2 goals Julie Biesmans Maud Coutereels Elke Van Gorp Izzy Christiansen Gemma Davison Fran Kirby Ellen White 1 goal Cécile De Gernier Laura Deloose Audrey Demoustier Tine Schryvers Sara Yuceil Merjema Medić Antonela Radeljić Rachel Daly Alex Greenwood Jo Potter Jovana Damnjanović Milica Mijatović Marija Radojičić Violeta Slović 1 own goal Nikolina Dijaković (playing against Belgium) Inna Zlidnis (playing against Belgium) Nevena Damjanović (playing against England) References External links Standings, UEFA.com Group 7
Gustavo de Souza Caiche (born 4 June 1976) is a Brazilian professional football coach and former player who is the assistant coach for Athletico Paranaense U20s. He played as a centre-back. Honours Paraná State League: 1998, 2000, 2001 Brazilian League: 2001 Paraná State Superleague: 2002 São Paulo State League: 2004 References External links Leão apresenta Gustavo CBF sportnet 1976 births Living people Brazilian men's footballers Club Athletico Paranaense players Sociedade Esportiva Palmeiras players Associação Desportiva São Caetano players Sport Club Corinthians Paulista players Sport Club do Recife players Men's association football defenders Sportspeople from São José do Rio Preto Footballers from São Paulo (state)
The canton of Fezensac is an administrative division of the Gers department, southwestern France. It was created at the French canton reorganisation which came into effect in March 2015. Its seat is in Vic-Fezensac. It consists of the following communes: Bascous Bazian Belmont Bezolles Caillavet Callian Castillon-Debats Cazaux-d'Anglès Courrensan Dému Gazax-et-Baccarisse Justian Lannepax Lupiac Marambat Mirannes Mourède Noulens Peyrusse-Grande Peyrusse-Vieille Préneron Ramouzens Riguepeu Roquebrune Roques Rozès Saint-Arailles Saint-Jean-Poutge Saint-Paul-de-Baïse Saint-Pierre-d'Aubézies Séailles Tudelle Vic-Fezensac References Cantons of Gers
```go
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

package autodetection

import (
	"fmt"
	"net"
	"regexp"
	"strings"

	log "github.com/sirupsen/logrus"
	v1 "k8s.io/api/core/v1"

	cnet "github.com/projectcalico/calico/libcalico-go/lib/net"
	"github.com/projectcalico/calico/node/pkg/lifecycle/utils"
)

const (
	AUTODETECTION_METHOD_FIRST          = "first-found"
	AUTODETECTION_METHOD_CAN_REACH      = "can-reach="
	AUTODETECTION_METHOD_INTERFACE      = "interface="
	AUTODETECTION_METHOD_SKIP_INTERFACE = "skip-interface="
	AUTODETECTION_METHOD_CIDR           = "cidr="
	K8S_INTERNAL_IP                     = "kubernetes-internal-ip"
)

// AutoDetectCIDR auto-detects the IP and Network using the requested
// detection method.
func AutoDetectCIDR(method string, version int, k8sNode *v1.Node, getInterfaces func([]string, []string, int) ([]Interface, error)) *cnet.IPNet {
	if method == "" || method == AUTODETECTION_METHOD_FIRST {
		// Autodetect the IP by enumerating all interfaces (excluding
		// known internal interfaces).
		return autoDetectCIDRFirstFound(version)
	} else if strings.HasPrefix(method, AUTODETECTION_METHOD_INTERFACE) {
		// Autodetect the IP from the specified interface.
		ifStr := strings.TrimPrefix(method, AUTODETECTION_METHOD_INTERFACE)
		// Regexes are passed in as a string separated by ","
		ifRegexes := regexp.MustCompile(`\s*,\s*`).Split(ifStr, -1)
		return autoDetectCIDRByInterface(ifRegexes, version)
	} else if strings.HasPrefix(method, AUTODETECTION_METHOD_CIDR) {
		// Autodetect the IP by filtering interfaces by their addresses.
		cidrStr := strings.TrimPrefix(method, AUTODETECTION_METHOD_CIDR)
		// CIDRs are passed in as a string separated by ","
		matches := []cnet.IPNet{}
		for _, r := range regexp.MustCompile(`\s*,\s*`).Split(cidrStr, -1) {
			_, cidr, err := cnet.ParseCIDR(r)
			if err != nil {
				log.Errorf("Invalid CIDR %q for IP autodetection method: %s", r, method)
				return nil
			}
			matches = append(matches, *cidr)
		}
		return autoDetectCIDRByCIDR(matches, version)
	} else if strings.HasPrefix(method, AUTODETECTION_METHOD_CAN_REACH) {
		// Autodetect the IP by connecting a UDP socket to a supplied address.
		destStr := strings.TrimPrefix(method, AUTODETECTION_METHOD_CAN_REACH)
		return autoDetectCIDRByReach(destStr, version)
	} else if strings.HasPrefix(method, AUTODETECTION_METHOD_SKIP_INTERFACE) {
		// Autodetect the IP by enumerating all interfaces (excluding
		// known internal interfaces and any interfaces whose name
		// matches the given regexes).
		ifStr := strings.TrimPrefix(method, AUTODETECTION_METHOD_SKIP_INTERFACE)
		// Regexes are passed in as a string separated by ","
		ifRegexes := regexp.MustCompile(`\s*,\s*`).Split(ifStr, -1)
		return autoDetectCIDRBySkipInterface(ifRegexes, version)
	} else if strings.HasPrefix(method, K8S_INTERNAL_IP) {
		// The K8s InternalIP configured for the node is used.
		if k8sNode == nil {
			log.Error("Cannot use method 'kubernetes-internal-ip' when not running on a Kubernetes cluster")
			return nil
		}
		return autoDetectUsingK8sInternalIP(version, k8sNode, getInterfaces)
	}

	// The autodetection method is not recognised and is required. Exit.
	log.Errorf("Invalid IP autodetection method: %s", method)
	utils.Terminate()
	return nil
}

// autoDetectCIDRFirstFound auto-detects the first valid Network it finds across
// all interfaces (excluding common known internal interface names).
func autoDetectCIDRFirstFound(version int) *cnet.IPNet { incl := []string{} iface, cidr, err := FilteredEnumeration(incl, DEFAULT_INTERFACES_TO_EXCLUDE, nil, version) if err != nil { log.Warnf("Unable to auto-detect an IPv%d address: %s", version, err) return nil } log.Infof("Using autodetected IPv%d address on interface %s: %s", version, iface.Name, cidr.String()) return cidr } // autoDetectCIDRByInterface auto-detects the first valid Network on the interfaces // matching the supplied interface regex. func autoDetectCIDRByInterface(ifaceRegexes []string, version int) *cnet.IPNet { iface, cidr, err := FilteredEnumeration(ifaceRegexes, nil, nil, version) if err != nil { log.Warnf("Unable to auto-detect an IPv%d address using interface regexes %v: %s", version, ifaceRegexes, err) return nil } log.Infof("Using autodetected IPv%d address %s on matching interface %s", version, cidr.String(), iface.Name) return cidr } // autoDetectCIDRByCIDR auto-detects the first valid Network on the interfaces // matching the supplied cidr. func autoDetectCIDRByCIDR(matches []cnet.IPNet, version int) *cnet.IPNet { iface, cidr, err := FilteredEnumeration(nil, nil, matches, version) if err != nil { log.Warnf("Unable to auto-detect an IPv%d address using interface cidr %s: %s", version, matches, err) return nil } log.Infof("Using autodetected IPv%d address %s on interface %s matching cidrs %+v", version, cidr.String(), iface.Name, matches) return cidr } // autoDetectCIDRByReach auto-detects the IP and Network by setting up a UDP // connection to a "reach" address. 
func autoDetectCIDRByReach(dest string, version int) *cnet.IPNet {
	if cidr, err := ReachDestination(dest, version); err != nil {
		log.Warnf("Unable to auto-detect IPv%d address by connecting to %s: %s", version, dest, err)
		return nil
	} else {
		log.Infof("Using autodetected IPv%d address %s, detected by connecting to %s", version, cidr.String(), dest)
		return cidr
	}
}

// autoDetectCIDRBySkipInterface auto-detects the first valid Network on the interfaces
// not matching the supplied interface regexes.
func autoDetectCIDRBySkipInterface(ifaceRegexes []string, version int) *cnet.IPNet {
	incl := []string{}
	excl := DEFAULT_INTERFACES_TO_EXCLUDE
	excl = append(excl, ifaceRegexes...)
	iface, cidr, err := FilteredEnumeration(incl, excl, nil, version)
	if err != nil {
		log.Warnf("Unable to auto-detect an IPv%d address while excluding %v: %s", version, ifaceRegexes, err)
		return nil
	}
	log.Infof("Using autodetected IPv%d address on interface %s: %s while skipping matching interfaces", version, iface.Name, cidr.String())
	return cidr
}

// autoDetectUsingK8sInternalIP reads the K8s Node InternalIP.
func autoDetectUsingK8sInternalIP(version int, k8sNode *v1.Node, getInterfaces func([]string, []string, int) ([]Interface, error)) *cnet.IPNet {
	var address string
	var err error
	nodeAddresses := k8sNode.Status.Addresses
	for _, addr := range nodeAddresses {
		if addr.Type == v1.NodeInternalIP {
			if (version == 4 && utils.IsIPv4String(addr.Address)) || (version == 6 && utils.IsIPv6String(addr.Address)) {
				address, err = GetLocalCIDR(addr.Address, version, getInterfaces)
				if err != nil {
					return nil
				}
				break
			}
		}
	}
	ip, ipNet, err := cnet.ParseCIDR(address)
	if err != nil {
		log.Errorf("Unable to parse CIDR %v: %v", address, err)
		return nil
	}
	// ParseCIDR masks off the host bits of the IPNet it returns, e.g.
	// ParseCIDR("192.168.1.2/24") returns "192.168.1.2" and "192.168.1.0/24".
	// Callers of this function expect the full IP address to be preserved in
	// the CIDR, i.e. we should return 192.168.1.2/24.
	ipNet.IP = ip.IP
	return ipNet
}

// GetLocalCIDR attempts to merge CIDR information from the host with the given IP address.
// If a CIDR is provided, it is simply returned.
// If an IP is provided, it attempts to find the matching interface on the host to detect
// the appropriate prefix length. If no match is found, the IP is returned unmodified.
func GetLocalCIDR(ip string, version int, getInterfaces func([]string, []string, int) ([]Interface, error)) (string, error) {
	if strings.Contains(ip, "/") {
		// Already a CIDR.
		return ip, nil
	}

	var destCIDR net.IP
	if version == 4 {
		destCIDR = net.ParseIP(ip).To4()
	} else {
		destCIDR = net.ParseIP(ip).To16()
	}
	if destCIDR == nil {
		return ip, fmt.Errorf("%s is not a valid IP address", ip)
	}

	// Get a full list of interfaces and IPs and find the CIDR matching the
	// found IP.
	interfaces, err := getInterfaces(nil, nil, version)
	if err != nil {
		return ip, err
	}

	log.Debugf("Auto-detecting IPv%d CIDR of %s", version, ip)
	for _, iface := range interfaces {
		log.WithField("Name", iface.Name).Debug("Checking interface")
		for _, cidr := range iface.Cidrs {
			log.WithField("CIDR", cidr.String()).Debug("Found")
			if cidr.IP.Equal(destCIDR) {
				log.WithField("CIDR", cidr.String()).Info("Including CIDR information from host interface.")
				return cidr.String(), nil
			}
		}
	}

	// Not finding a matching interface CIDR is not an error; fall back to
	// returning the bare IP.
	log.Warnf("Unable to find matching host interface for IP %s", ip)
	return ip, nil
}
```
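The interface-matching loop in `GetLocalCIDR` is the core of the `kubernetes-internal-ip` method: take a bare node IP, find the host interface that owns it, and adopt that interface's prefix length. The following self-contained sketch illustrates that idea using only the standard library; the `localCIDRFor` helper and the hard-coded CIDR list are hypothetical stand-ins, not part of Calico, since the real code enumerates actual host interfaces.

```go
package main

import (
	"fmt"
	"net"
)

// localCIDRFor mimics the prefix-merging idea behind GetLocalCIDR: given a
// bare IP and the CIDRs configured on the host's interfaces, return the IP
// with the prefix length of the interface CIDR whose address matches it.
func localCIDRFor(ip string, ifaceCIDRs []string) string {
	addr := net.ParseIP(ip)
	if addr == nil {
		return ip
	}
	for _, c := range ifaceCIDRs {
		ifaceIP, ipNet, err := net.ParseCIDR(c)
		if err != nil {
			continue
		}
		// Match the interface whose own address equals the target IP,
		// then keep the full IP but adopt the interface's mask length.
		if ifaceIP.Equal(addr) {
			ones, _ := ipNet.Mask.Size()
			return fmt.Sprintf("%s/%d", addr, ones)
		}
	}
	// No match: fall back to the bare IP, like GetLocalCIDR does.
	return ip
}

func main() {
	cidrs := []string{"10.0.0.1/8", "192.168.1.2/24"}
	fmt.Println(localCIDRFor("192.168.1.2", cidrs)) // 192.168.1.2/24
	fmt.Println(localCIDRFor("172.16.0.9", cidrs))  // 172.16.0.9
}
```

Unlike the real function, this sketch takes the interface CIDRs as plain strings rather than enumerating host interfaces, which keeps it deterministic and runnable anywhere.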
```html <!DOCTYPE html> <html> <head> <meta charset="utf-8" /> <title>MyCrypto - Ethereum Wallet Manager</title> <meta http-equiv="Content-Security-Policy" content="<%= htmlWebpackPlugin.options.metaCsp %>" /> <meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1" /> <meta name="description" content="<%= htmlWebpackPlugin.options.appDescription %>" /> <meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0" /> <meta property="og:title" content="<%= htmlWebpackPlugin.options.title %>" /> <meta property="og:url" content="<%= htmlWebpackPlugin.options.appUrl %>" /> <meta property="og:description" content="<%= htmlWebpackPlugin.options.appDescription %>" /> <meta property="og:site_name" content="<%= htmlWebpackPlugin.options.title %>" /> <meta property="og:type" content="<%= htmlWebpackPlugin.options.type %>" /> <meta property="og:image" content="<%= htmlWebpackPlugin.options.image %>" /> <meta name="twitter:title" content="<%= htmlWebpackPlugin.options.title %>" /> <meta name="twitter:card" content="summary_large_image" /> <meta name="twitter:site" content="<%= htmlWebpackPlugin.options.twitter.site %>" /> <meta name="twitter:creator" content="<%= htmlWebpackPlugin.options.twitter.creator %>" /> <meta name="twitter:image" content="<%= htmlWebpackPlugin.options.image %>" /> <meta name="google-site-verification" content="dRWkvANAUNAhNyMnTyc7M7S3lnucotMY8j8R-gsZhbo" /> <link rel="manifest" href="/manifest.json" /> </head> <body> <div id="root"></div> <noscript class="NoScript"> <div class="NoScript-content"> <h2>You Must Enable Javascript to Continue</h2> <p> MyCrypto requires Javascript to run. There are no security vulnerabilities as a result of enabling Javascript on MyCrypto, as we do not load any external scripts such as advertisements or trackers. 
</p> <p> If you are not sure why you are seeing this message, or are unsure of how to enable Javascript, please visit <a href="path_to_url" rel="noopener noreferrer" target="_blank" >enable-javascript.com</a > to learn more. </p> </div> </noscript> <div class="BadBrowser" style="display: none;"> <div class="BadBrowser-content"> <h2>Your Browser is Out of Date</h2> <p class="is-desktop"> MyCrypto requires certain features that your browser doesn't offer. Your browser may also be missing security updates that could open you up to vulnerabilities. Please update your browser, or switch to one of the following browsers to continue using MyCrypto. </p> <p class="is-desktop"> If you're using the Brave Browser, please ensure that you are not blocking all cookies. You may block 3rd-party cookies, but MyCrypto will not function properly if all cookies are blocked. </p> <p class="is-mobile"> MyCrypto requires certain features that your browser doesn't offer. Please use your device's default browser, or switch to a laptop or desktop computer to continue using MyCrypto. </p> <div class="BadBrowser-content-browsers is-desktop"> <a class="BadBrowser-content-browsers-browser firefox" href="path_to_url" rel="noopener noreferrer" target="_blank" > <span class="BadBrowser-content-browsers-browser-name"> Firefox </span> </a> <a class="BadBrowser-content-browsers-browser chrome" href="path_to_url" rel="noopener noreferrer" target="_blank" > <span class="BadBrowser-content-browsers-browser-name"> Chrome </span> </a> <a class="BadBrowser-content-browsers-browser opera" href="path_to_url" rel="noopener noreferrer" target="_blank" > <span class="BadBrowser-content-browsers-browser-name"> Opera </span> </a> </div> </div> </div> </body> </html> ```
is the third installment in the Samurai Warriors series, created by Tecmo Koei and Omega Force. The game was released in Japan on December 3, 2009, in Europe on May 28, 2010, in Australia on June 10, 2010, and in North America on September 28, 2010, for the Wii. Shigeru Miyamoto of Nintendo attended the game's press conference on August 5, 2009, to present a new mode in the game based on the Famicom Disk System game The Mysterious Murasame Castle. Nintendo published and distributed the game outside Japan for the Wii. A sequel, Samurai Warriors 4, was announced at an SCEJ press conference in September 2013.

Story
Like other games in the series, the game reinvents the story of the Sengoku period of Japan, a period lasting from the mid-16th century to the early 17th century in which Japan was ruled by powerful daimyōs amid constant military conflict and political intrigue. However, the game has a slightly extended time frame compared to its predecessor; while Samurai Warriors 2 is mostly focused on the events leading to the great battle of Sekigahara, this game also covers the events beforehand.

Gameplay
The game features many gameplay improvements over previous games in the series, the most notable being the addition of the Spirit Gauge, which allows characters to cancel certain attacks into more powerful ones depending on the gauge's level. It can also be combined with Musou attacks to perform a "True Musou". Certain combinations of attacks from the Xtreme Legends expansions also make a comeback. Each character's weapons are categorized into Normal, Speed, and Power types, similar to Dynasty Warriors 6, except that each character still has unique weapons assigned to them.
The option to create/edit characters from the original game returns and is required to access the new "Historical Mode", which can be used to create an original story for edit characters by reenacting parts of historical battles. Both Story Mode and Free Mode return, as does the shop system, which has been redesigned and is now part of the "Dojo", a section also dedicated to creating edit characters and color-editing existing characters. An exclusive mode for the Wii version is "Murasame Castle" mode, based on the Nintendo game Nazo no Murasame Jō, which allows for the control of its lead character, Takamaru.

Characters
Seven new characters made their playable debut in the Samurai Warriors franchise, most of them former generic non-player characters in past installments. Most of the characters from previous games also return, all redesigned, with several receiving new weapons. Four characters (Goemon Ishikawa, Gracia, Musashi Miyamoto, and Kojiro Sasaki) do not return, although Gracia later returns in the Moushouden expansion. Seven of the characters do not have stories, though they are given stories in the Moushouden expansion. Altogether, there are 30 returning characters for a total of 37 characters in the game.
* Denotes characters added through expansion titles
** Denotes Takamaru, only found in Samurai Warriors 3/Sengoku Musou 3: Moushouden
Bold denotes default characters

Bundles
The game comes in three different variations: a stand-alone copy of the game, a Classic Controller Pro set, and a treasure box edition. The treasure box edition includes the controller as well as a mini figure, an original soundtrack CD, and a book with strategies and artwork. The controller included in the latter two bundles is a special edition black Classic Controller Pro with the game's logo and Japanese inkbrush marks in gold.
Music
At the game's press conference on August 5, it was revealed that J-pop artist Gackt would perform two theme songs for the game, "Zan" and "Setsugekka". "Zan" was used in the promotional commercials for the game and is also featured in the game's ending. The single containing both songs, titled "Setsugekka (The End of Silence)/Zan", was released on December 9, 2009.

Expansions
The game has three expansions/ports that either add new content or expand on the gameplay mechanics of the original.

Sengoku Musou 3: Moushouden/Z
Sengoku Musou 3: Moushouden is the first expansion of the game, released for the Wii in Japan on February 10, 2011. It introduces two new modes: "Original Career" mode, which allows players to create original scenarios by completing missions and acquiring gold to increase their abilities and strength, and the series-staple "Challenge" mode, which has three challenges with varying objectives. It also adds new weapons, items, two new difficulty levels ("Novice" and "Expert"), and stories for characters that did not have them in the original. The game also has online functionality, which was not possible in the original. It was released for the PlayStation 3 on the same day under the title Sengoku Musou 3 Z. This version has updated graphics compared to the Wii release, but removes the Murasame Castle mode and Takamaru. Neither version has received an overseas release.

Sengoku Musou 3: Empires
Sengoku Musou 3: Empires is the second expansion of the game, released for the PlayStation 3 in Japan on August 25, 2011. Like other Empires expansions, the game is more focused on the political and tactical battle system. It features versions of Historical Mode and Free Mode adapted to the Empires structure and retains the edit character feature. Like Moushouden, this game has yet to be released overseas.
Sengoku Musou 3 Z: Special
Sengoku Musou 3 Z: Special is a port for the PlayStation Portable released in Japan on February 16, 2012. As it is based on Sengoku Musou 3 Z, it has all of its features (including the removal of the Murasame Castle mode and Takamaru) as well as the ability for four players to compete in the game's Challenge mode. Due to memory limitations, however, the graphics have been significantly downgraded. It has yet to receive an overseas release.

Reception
Samurai Warriors 3 was met with mixed-to-negative reception upon release; GameRankings gave it a score of 59%, while Metacritic gave it 55 out of 100.

See also
List of Samurai Warriors characters

References

External links
Official North American website
Official European website
Official Japanese website
Japanese Site of 3Z
2009 video games PlayStation 3 games Nintendo games Koei games PlayStation Portable games Video games about samurai Samurai Warriors Wii games Multiplayer and single-player video games Crowd-combat fighting games Video games developed in Japan Video games set in feudal Japan
Vijaya Raje (16 September 1919, in Dhar – 21 December 1995) was a member of the royal family of Ramgarh in the Hazaribagh district of Bihar State, India. She was a member of the 1st Rajya Sabha (1952–1957) from Bihar State for the Janata Party of Ramgarh Raj. She was later elected to the Lok Sabha from the Chatra (Lok Sabha constituency) of Bihar State in 1957, 1962 and 1967, serving in the 2nd, 3rd and 4th Lok Sabhas. References 1919 births 1995 deaths People from Hazaribagh People from Dhar Rajya Sabha members from Bihar People from Chatra district India MPs 1957–1962 Swatantra Party politicians India MPs 1962–1967 India MPs 1967–1970 Women members of the Lok Sabha Women members of the Rajya Sabha Women in Bihar politics 20th-century Indian women 20th-century Indian people
```puppet
resource foo "random:index/randomPet:RandomPet" {
  options {
    retainOnDelete = true
  }
}
```
Sargatsky (masculine), Sargatskaya (feminine), or Sargatskoye (neuter) may refer to: Sargatsky District, a district of Omsk Oblast, Russia Sargatskoye, an urban locality (a work settlement) in Omsk Oblast, Russia
The 2021 Nova Scotia general election was held on August 17, 2021, to elect members to the 64th General Assembly of Nova Scotia. In April 2019, the Electoral Boundaries Commission released its final report, entitled Balancing Effective Representation with Voter Parity. The report recommended increasing the number of electoral districts from 51 to 55, including reinstating the four former districts of Argyle, Clare, Preston and Richmond. In the fourth quarter of 2019, the House of Assembly passed the recommended electoral changes into law, and they took effect in this election. In a major upset, Tim Houston led the Progressive Conservatives to power for the first time since 2006, and with a majority government for the first time since 1999. With a popular vote share of 38.44%, the PCs won the smallest winning vote share of any majority government in Nova Scotian electoral history. Elizabeth Smith-McCrossin's victory in Cumberland North marked the first occasion since 1988 that an independent candidate won election to the House of Assembly. A record number of four Black Nova Scotians were elected MLAs; prior to this election, only five Black MLAs had ever been elected in Nova Scotia.

Timeline
May 30, 2017 – The Liberal Party, led by Stephen McNeil, wins the 2017 Nova Scotia general election; the Progressive Conservative Association stays as the official opposition, and the New Democratic Party remains at third-party status. January 24, 2018 - Jamie Baillie resigns as leader of the Progressive Conservative Association, and MLA Karla MacFarlane becomes interim leader. October 27, 2018 - Tim Houston is elected leader of the Progressive Conservative Association. August 6, 2020 - Premier Stephen McNeil announces he will resign as leader of the Liberal Party and as Premier of Nova Scotia in early 2021. February 6, 2021 - Iain Rankin is elected as leader of the Liberal Party. February 23, 2021 - Iain Rankin is officially sworn in as the 29th Premier of Nova Scotia.
July 17, 2021 - Premier Iain Rankin calls a general election for August 17, 2021. Leaders' debates Results Results by party Results by region Synopsis of results Summary analysis Incumbents not running for reelection The following MLAs announced that they would not run in the 2021 general election: Independent Hugh MacKay (Chester-St. Margaret's) Liberal Party Karen Casey (Colchester North) Keith Colwell (Preston-Dartmouth) Lena Diab (Halifax Armdale) Mark Furey (Lunenburg West) Leo Glavine (Kings West) Bill Horne (Waverley-Fall River-Beaver Bank) Geoff MacLellan (Glace Bay) Chuck Porter (Hants West) Gordon Wilson (Clare-Digby) New Democratic Party Lisa Roberts (Halifax Needham) Candidates by constituency Legend bold denotes party leader † denotes an incumbent who is not running for re-election or was defeated in nomination contest NOTE: Candidates' names are as registered with Elections Nova Scotia Annapolis Valley |- |bgcolor=whitesmoke|Annapolis || |Carman Kerr4,23149.62% | |Jennifer Ehrenfeld-Poole2,75332.29% | |Cheryl Burbidge1,12713.22% | |Krista Grear3063.59% | |Mark Robertson1091.28% || |Vacant |- |bgcolor=whitesmoke|Clare || |Ronnie LeBlanc2,32249.89% | |Carl Deveau2,02143.43% | |Cameron Pye1533.29% | |Claire McDonald1583.39% | | || |Gordon Wilson† Clare-Digby |- |bgcolor=whitesmoke|Digby-Annapolis | |Jimmy MacAlpine1,86535.06% || |Jill Balser2,63649.55% | |Michael Carty62611.77% | |Jessica Walker1132.12% | |Tyler Ducharme801.50% | |New riding |- |bgcolor=whitesmoke|Hants West | |Brian Casey3,82741.58% || |Melissa Sheehy-Richard3,96843.11% | |Caet Moir1,01511.03% | |Jenn Kang2732.97% | |Gordon J. 
Berry1211.31% || |Chuck Porter† |- |bgcolor=whitesmoke|Kings North | |Geof Turner2,60229.29% || |John Lohr3,97144.70% | |Erin Patterson1,87621.12% | |Doug Hickman3914.40% | |Paul Dunn430.48% || |John Lohr |- |bgcolor=whitesmoke|Kings South || |Keith Irving4,04944.11% | |Derrick Kimball3,04633.18% | |Mercedes Brian1,80819.70% | |Barry Leslie2763.01% | | || |Keith Irving |- |bgcolor=whitesmoke|Kings West | |Emily Lutz3,85641.52% || |Chris Palmer4,59249.45% | |Jason Langille5495.91% | |Sue Earle2162.33% | |Rick Mehta740.80% || |Leo Glavine† |} South Shore |- |bgcolor=whitesmoke|Argyle | |Nick d'Entremont63514.33% || |Colton LeBlanc3,64982.35% | |Robin Smith631.42% | |Corey Clamp841.90% | | || |Colton LeBlancArgyle-Barrington |- |bgcolor=whitesmoke|Chester-St. Margaret's | |Jacob Killawee3,55637.61% || |Danielle Barkhouse3,78840.06% | |Amy Stewart Reitsma1,62617.20% | |Jessica Alexander4174.41% | |Steven Foster680.72% || |Hugh MacKay† |- |bgcolor=whitesmoke|Lunenburg | |Suzanne Lohnes-Croft2,91534.55% || |Susan Corkum-Greek3,54442.01% | |Alison Smith1,75020.74% | |Thomas Trappenberg1712.03% | |John Giannakos570.68% || |Suzanne Lohnes-Croft |- |bgcolor=whitesmoke|Lunenburg West | |Jennifer Naugler3,19734.94% || |Becky Druhan4,06544.42% | |Merydie Ross1,70918.68% | |Eric Wade1801.97% | | || |Mark Furey† |- |bgcolor=whitesmoke|Queens | |Susan MacLeod1,05120.39% || |Kim Masland3,62770.37% | |Mary Dahr3236.27% | |Brian Muldoon1532.97% | | || |Kim MaslandQueens-Shelburne |- |bgcolor=whitesmoke|Shelburne | |Penny Smith1,48323.76% || |Nolan Young3,90562.56% | |Darren Stoddard75312.06% | |Steve Hirchak1011.62% | | | ||New riding |- |bgcolor=whitesmoke|Yarmouth || |Zach Churchill4,34456.32% | |Candice Clairmont2,85637.03% | |SJ Rogers3224.17% | |Adam Randall1912.48% | | || |Zach Churchill |} Fundy-Northeast |- |bgcolor=whitesmoke|Colchester-Musquodoboit Valley | |Rhonda MacLellan1,91325.62% || |Larry Harrison4,11755.13% | |Janet Moulton1,43819.26% | | | | | | || |Larry Harrison 
|- |bgcolor=whitesmoke|Colchester North | |Merlyn Smith2,68131.84% || |Tom Taggart4,47753.18% | |Sean Foley95511.34% | |Ivan Drouin2522.99% | |Stephan Sante540.64% | | || |Karen Casey† |- |bgcolor=whitesmoke|Cumberland North | |Bill Casey2,48831.65% | |David Wightman5697.24% | |Lauren Skabar5697.24% | | | | || |Elizabeth Smith-McCrossin4,23553.87% || |Elizabeth Smith-McCrossin |- |bgcolor=whitesmoke|Cumberland South | |Rollie Lawless1,09219.17% || |Tory Rushton3,90068.47% | |Larry Duchesne5249.20% | |Nicholas Hendren1803.16% | | | | || |Tory Rushton |- |bgcolor=whitesmoke|Hants East | |Michael Blois3,23936.36% || |John A. MacDonald3,32837.36% | |Abby Cameron2,14224.05% | |Simon Greenough1992.23% | | | | || |Vacant |- |bgcolor=whitesmoke|Truro-Bible Hill-Millbrook-Salmon River | |Tamara Tynes Powell2,54130.21% || |Dave Ritcey4,02547.85% | |Darlene DeAdder1,39816.62% | |Shaun Trainor4485.33% | | | | || |Dave Ritcey |} Central Halifax |- |bgcolor=whitesmoke|Clayton Park West || |Rafah DiCostanzo3,60347.60% | |Nargis DeMolitor1,87524.77% | |Reena Davis1,83624.25% | |Richard Zurawski2102.77% | |Helen Lau460.61% | | || |Rafah DiCostanzo |- |bgcolor=whitesmoke|Fairview-Clayton Park || |Patricia Arab2,89238.51% | |Nicole Mosher1,67822.34% | |Joanne Hussey2,78737.11% | |Sheila Richardson1532.04% | | | | || |Patricia Arab |- |bgcolor=whitesmoke|Halifax Armdale || |Ali Duale3,07040.35% | |Richard MacLean1,68122.09% | |Julie Melanson2,59334.08% | |Jo-Ann Roberts2022.65% | | | |Stephen Chafe630.83% || |Lena Diab† |- |bgcolor=whitesmoke|Halifax Chebucto | |Jackie Kinley2,47832.14% | |John Wesley Chisholm91111.81% || |Gary Burrill4,00951.99% | |Lily Barraclough3134.06% | | | | || |Gary Burrill |- |bgcolor=whitesmoke|Halifax Citadel-Sable Island | |Labi Kousoulis2,95636.82% | |Sheri Morgan1,42517.75% || |Lisa Lachance3,39742.31% | |Noah Hollis2503.11% | | | | || |Labi Kousoulis |- |bgcolor=whitesmoke|Halifax Needham | |Colin Coady2,61729.07% | |Scott Ellis90410.04% || |Suzy 
Hansen5,30858.96% | |Kai Trappenberg1731.92% | | | | || |Lisa Roberts† |} Suburban Halifax |- |bgcolor=whitesmoke|Bedford Basin || |Kelly Regan3,70050.87% | |Nick Driscoll1,87425.76% | |Jacob Wilson1,55421.36% | |Madeline Taylor1462.01% | | | | || |Kelly Regan Bedford |- |bgcolor=whitesmoke|Bedford South || |Braedon Clark3,56845.37% | |Sura Hadad2,33829.73% | |David Paterson1,76322.42% | |Ron G. Parker1401.78% | |Alan Nightingale550.70% | | | |New riding |- |bgcolor=whitesmoke|Halifax Atlantic || |Brendan Maguire4,21355.22% | |Tim Cranston1,49319.57% | |Shauna Hatt1,74022.81% | |Sarah Weston1832.40% | | | | || |Brendan Maguire |- |bgcolor=whitesmoke|Hammonds Plains-Lucasville || |Ben Jessome3,69746.06% | |Julie Chaisson2,86535.70% | |Angela Downey1,33316.61% | |Mark Embrett1311.63% | | | | || |Ben Jessome |- |bgcolor=whitesmoke|Sackville-Cobequid | |Mary LeRoy1,70121.51% || |Steve Craig3,42643.33% | |Lara Fawthrop2,57732.59% | |Ian Dawson2032.57% | | | | || |Steve Craig |- |bgcolor=whitesmoke|Sackville-Uniacke | |Donalda MacIsaac2,32332.80% || |Brad Johns3,10443.82% | |Thomas Hill1,53521.67% | |Carson LeQuesne1211.71% | | | | || |Brad Johns Sackville-Beaver Bank |- |bgcolor=whitesmoke|Timberlea-Prospect || |Iain Rankin5,18154.38% | |Bill Healy2,32024.35% | |Raymond Theriault1,64717.29% | |Harry Ward2502.62% | |Dessire G. 
Miari400.42% | |Dawn Edith Penney900.94% || |Iain Rankin |- |bgcolor=whitesmoke|Waverley-Fall River-Beaver Bank | |Marni Tuttle3,54636.36% || |Brian Wong3,93840.38% | |Christina McCarron1,58116.21% | |Anthony Edmonds6176.33% | |Shawn Whitford710.73% | | || |Bill Horne† |} Dartmouth/Cole Harbour/Eastern Shore |- |bgcolor=whitesmoke|Cole Harbour || |Tony Ince2,11839.75% | |Darryl Johnson1,70431.98% | |Jerome Lagmay1,43126.86% | | | |Chris Kinnie751.41% || |Tony Ince Cole Harbour-Portland Valley |- |bgcolor=whitesmoke|Cole Harbour-Dartmouth || |Lorelei Nicoll5,14452.24% | |Karina Sanford2,92929.75% | |Melody Pentland1,55815.82% | |Rana Zaman2152.18% | | | |New riding |- |bgcolor=whitesmoke|Dartmouth East | |D'Arcy Poultney2,90034.68% || |Tim Halman3,26038.99% | |Tyler J. Colbourne1,97423.61% | |Sara Adams1872.24% | |Chris Bowie410.49% || |Tim Halman |- |bgcolor=whitesmoke| Dartmouth North | |Pam Cooley2,36131.48% | |Lisa Coates1,27817.04% || |Susan Leblanc3,73149.75% | |Carolyn Marshall1291.72% | | || |Susan Leblanc |- |bgcolor=whitesmoke|Dartmouth South | |Lesley MacKay1,60322.14% | |Chris Curtis1,26217.43% || |Claudia Chender4,20958.13% | |Skylar Martini1672.31% | | || |Claudia Chender |- |bgcolor=whitesmoke|Eastern Passage | |Joyce Treen1,44426.21% || |Barbara Adams2,46944.82% | |Tammy Jakeman1,22222.18% | |Corey Myers3746.79% | | || |Barbara Adams Cole Harbour-Eastern Passage |- |bgcolor=whitesmoke|Eastern Shore | |Kevin Murphy3,16934.06% || |Kent Smith4,26445.82% | |Deirdre Dwyer1,61817.39% | |Cheryl Atkinson2542.73% | | || |Kevin Murphy |- |bgcolor=whitesmoke|Preston || |Angela Simmonds2,22643.38% | |Archy Beals1,47228.69% | |Colter C.C. 
Simmonds1,43327.93% | | | | || |Keith Colwell† Preston-Dartmouth |} Central Nova |- |bgcolor=whitesmoke|Antigonish | |Randy Delorey2,99731.82% || |Michelle Thompson4,70749.98% | |Moraig MacGillivray1,55216.48% | |Will Fraser1281.36% | |Ryan Smyth340.36% | | || |Randy Delorey |- |bgcolor=whitesmoke|Guysborough-Tracadie | |Lloyd Hines1,57130.35% || |Greg Morrow3,28163.39% | |Matt Stickland2474.77% | |Gabe Bruce771.49% | | | | || |Lloyd HinesGuysborough-Eastern Shore-Tracadie |- |bgcolor=whitesmoke|Pictou Centre | |Jim McKenna2,26930.93% || |Pat Dunn4,09255.77% | |Vernon Theriault86211.75% | |Laura Moore1141.55% | | | | || |Pat Dunn |- |bgcolor=whitesmoke|Pictou East | |Joe MacDonald1,58522.46% || |Tim Houston4,91869.68% | |Joy Polley5007.08% | | | |Jonathan Geoffrey Dean550.78% | | || |Tim Houston |- |bgcolor=whitesmoke|Pictou West | |Mary Wooldridge-Elliott1,51021.41% || |Karla MacFarlane4,48763.62% | |Rick Parker87212.36% | |Clare Brett1241.76% | | | |John A. Clark600.85% || |Karla MacFarlane |} Cape Breton |- |bgcolor=whitesmoke|Cape Breton Centre-Whitney Pier | |Michelle Wilson3,18840.61% | |Bryden Mombourquette1,28116.32% || |Kendra Coombes3,30942.15% | |Robert Hussey720.92% | | || |Kendra CoombesCape Breton Centre |- |bgcolor=whitesmoke|Cape Breton East | |Heather Peters3,09436.73% || |Brian Comer3,89746.27% | |Barbara Beaton1,43217.00% | | | | || |Brian ComerSydney River-Mira-Louisbourg |- |bgcolor=whitesmoke|Glace Bay-Dominion | |John John McCarthy2,47931.15% || |John White2,75434.61% | |John Morgan2,72534.24% | | | | || |Geoff MacLellan† Glace Bay |- |bgcolor=whitesmoke|Inverness | |Damian MacInnis3,11235.96% || |Allan MacMaster4,83355.85% | |Joanna Clark7088.18% | | | | || |Allan MacMaster |- |bgcolor=whitesmoke|Northside-Westmount || |Fred Tilley4,03046.86% | |Murray Ryan3,14036.51% | |Jennifer Morrison1,43016.63% | | | | || |Murray Ryan |- |bgcolor=whitesmoke|Richmond | |Matt Haley2,00936.85% || |Trevor Boudreau2,77350.86% | |Bryson Syliboy2745.03% | | | 
|Alana Paon3967.26% || |Alana PaonCape Breton-Richmond |- |bgcolor=whitesmoke|Sydney-Membertou || |Derek Mombourquette4,56154.27% | |Pauline Singer1,46717.45% | |Madonna Doucette2,37728.28% | | | | || |Derek MombourquetteSydney-Whitney Pier |- |bgcolor=whitesmoke|Victoria-The Lakes | |Nadine Bernard2,22434.20% || |Keith Bain3,53654.37% | |Adrianna MacKinnon6279.64% | | | |Stemer MacLeod1161.78% || |Keith Bain |} Opinion polls Voting intentions in Nova Scotia since the 2017 election References 2021 Election 2021 elections in Canada August 2021 events in Canada
```ruby require 'minitest/autorun' require_relative './sort_tests' require_relative './merge_sort' class TestMergeSort < Minitest::Test include SortTests def sort(input) merge_sort(input) end end ```
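The test above pulls in a `merge_sort` method via `require_relative './merge_sort'`, but that file is not shown here. A minimal sketch of what `merge_sort.rb` might contain — only the `merge_sort` name comes from the test; the helper `merge` and all implementation details are assumptions:

```ruby
# merge_sort.rb — a sketch: recursively split the array, then merge
# the two sorted halves back together. Does not mutate its input.
def merge_sort(input)
  return input.dup if input.length <= 1  # base case: 0 or 1 elements

  mid = input.length / 2
  left = merge_sort(input[0...mid])      # slicing copies, so input is untouched
  right = merge_sort(input[mid..])
  merge(left, right)
end

# Merge two already-sorted arrays into one sorted array.
def merge(left, right)
  result = []
  until left.empty? || right.empty?
    result << (left.first <= right.first ? left.shift : right.shift)
  end
  result + left + right                  # at most one of these is non-empty
end

merge_sort([3, 1, 2]) # => [1, 2, 3]
```

With this file in place, the shared `SortTests` examples exercise `merge_sort` through the `sort` hook defined in the test class.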
```xml <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "path_to_url"> <plist version="1.0"> <dict> <key>#Comment</key> <string>This file is for 10.12.6+ with native KabyLake support</string> <key>ACPI</key> <dict> <key>#Comment-SortedOrder</key> <string>SortedOrder required if you have patched SSDTs in ACPI/patched</string> <key>#DropTables</key> <array> <dict> <key>Signature</key> <string>MCFG</string> </dict> <dict> <key>Signature</key> <string>DMAR</string> </dict> <dict> <key>Signature</key> <string>SSDT</string> <key>TableId</key> <string>xh_rvp10</string> </dict> <dict> <key>Signature</key> <string>SSDT</string> <key>TableId</key> <string>CpuPm</string> </dict> <dict> <key>Signature</key> <string>SSDT</string> <key>TableId</key> <string>Cpu0Cst</string> </dict> <dict> <key>Signature</key> <string>SSDT</string> <key>TableId</key> <string>Cpu0Ist</string> </dict> <dict> <key>Signature</key> <string>SSDT</string> <key>TableId</key> <string>ApCst</string> </dict> <dict> <key>Signature</key> <string>SSDT</string> <key>TableId</key> <string>ApIst</string> </dict> </array> <key>#SortedOrder</key> <array> <string>SSDT.aml</string> <string>SSDT-0.aml</string> <string>SSDT-1.aml</string> <string>SSDT-2.aml</string> <string>SSDT-3.aml</string> <string>SSDT-4.aml</string> <string>SSDT-5.aml</string> <string>SSDT-6.aml</string> <string>SSDT-7.aml</string> <string>SSDT-8.aml</string> <string>SSDT-9.aml</string> <string>SSDT-10.aml</string> <string>SSDT-11.aml</string> <string>SSDT-12.aml</string> <string>SSDT-13.aml</string> <string>SSDT-14.aml</string> <string>SSDT-15.aml</string> <string>SSDT-16.aml</string> <string>SSDT-17.aml</string> <string>SSDT-18.aml</string> <string>SSDT-19.aml</string> <string>SSDT-XOSI.aml</string> <string>SSDT-LPC.aml</string> <string>SSDT-UIAC.aml</string> <string>SSDT-PNLF.aml</string> </array> <key>DSDT</key> <dict> <key>Fixes</key> <dict> <key>#Comment-IRQ Fix</key> <string>The following fixes may 
be needed for onboard audio/USB/etc</string> <key>AddPNLF_1000000</key> <true/> <key>FixHDA_8000</key> <true/> <key>FixHeaders_20000000</key> <true/> </dict> </dict> <key>DropTables</key> <array> <dict> <key>Signature</key> <string>#MCFG</string> </dict> <dict> <key>Signature</key> <string>DMAR</string> </dict> <dict> <key>Signature</key> <string>SSDT</string> <key>TableId</key> <string>xh_rvp10</string> </dict> </array> <key>SSDT</key> <dict> <key>DropOem</key> <false/> <key>Generate</key> <dict> <key>CStates</key> <false/> <key>PStates</key> <false/> </dict> <key>PluginType</key> <integer>1</integer> </dict> </dict> <key>Boot</key> <dict> <key>Arguments</key> <string>-v dart=0 nv_disable=1</string> <key>DefaultVolume</key> <string>LastBootedVolume</string> <key>Legacy</key> <string>LegacyBiosDefault</string> <key>Log</key> <false/> <key>NeverHibernate</key> <true/> <key>Secure</key> <false/> <key>Timeout</key> <integer>5</integer> <key>XMPDetection</key> <string>Yes</string> </dict> <key>CPU</key> <dict> <key>BusSpeedkHz</key> <integer>99790</integer> <key>FrequencyMHz</key> <integer>900</integer> <key>Latency</key> <string>0x00FA</string> <key>QPI</key> <integer>400</integer> <key>UseARTFrequency</key> <false/> </dict> <key>Devices</key> <dict> <key>#AddProperties</key> <array> <dict> <key>Comment</key> <string>hda-gfx=onboard-1 for HDMI audio</string> <key>Device</key> <string>IntelGFX</string> <key>Key</key> <string>hda-gfx</string> <key>Value</key> <data> b25ib2FyZC0xAA== </data> </dict> <dict> <key>Comment</key> <string>hda-gfx=onboard-1 for HDMI audio</string> <key>Device</key> <string>HDA</string> <key>Key</key> <string>hda-gfx</string> <key>Value</key> <data> b25ib2FyZC0xAA== </data> </dict> <dict> <key>Comment</key> <string>layout-id=3</string> <key>Device</key> <string>HDA</string> <key>Key</key> <string>layout-id</string> <key>Value</key> <data> AwAAAA== </data> </dict> <dict> <key>Device</key> <string>HDA</string> <key>Key</key> 
<string>PinConfigurations</string> <key>Value</key> <data> </data> </dict> </array> <key>Audio</key> <dict> <key>Inject</key> <string>3</string> <key>ResetHDA</key> <true/> </dict> <key>FakeID</key> <dict> <key>#Kaby Lake-Comment</key> <string>To avoid automatic Clover fake device-id (Skylake) injection</string> </dict> <key>USB</key> <dict> <key>AddClockID</key> <true/> <key>FixOwnership</key> <true/> <key>Inject</key> <true/> </dict> </dict> <key>DisableDrivers</key> <array> <string>VBoxHfs</string> </array> <key>GUI</key> <dict> <key>#ScreenResolution</key> <string>1920x1080</string> <key>Language</key> <string>en:0</string> <key>Mouse</key> <dict> <key>Enabled</key> <true/> </dict> <key>Scan</key> <dict> <key>Entries</key> <true/> <key>Legacy</key> <false/> <key>Linux</key> <false/> <key>Tool</key> <true/> </dict> <key>ScreenResolution</key> <string>1920x1080</string> <key>Theme</key> <string>BGM</string> </dict> <key>Graphics</key> <dict> <key>DualLink</key> <integer>0</integer> <key>EDID</key> <dict> <key>Inject</key> <false/> </dict> <key>Inject</key> <dict> <key>ATI</key> <false/> <key>Intel</key> <true/> <key>NVidia</key> <false/> </dict> <key>ig-platform-id</key> <string>0x191e0000</string> </dict> <key>KernelAndKextPatches</key> <dict> <key>AppleIntelCPUPM</key> <true/> <key>AppleRTC</key> <true/> <key>DellSMBIOSPatch</key> <false/> <key>ForceKextsToLoad</key> <array> <string>\System\Library\Extensions\IONetworkingFamily.kext</string> </array> <key>KernelLapic</key> <true/> <key>KernelPm</key> <true/> <key>KernelToPatch</key> <array> <dict> <key>Comment</key> <string>MSR 0xE2 _xcpm_idle instant reboot(c) Pike R. 
Alpha</string> <key>Disabled</key> <false/> <key>Find</key> <data> ILniAAAADzA= </data> <key>Replace</key> <data> ILniAAAAkJA= </data> </dict> </array> <key>KextsToPatch</key> <array> <dict> <key>Disabled</key> <false/> <key>Find</key> <data> TIlVuHZA </data> <key>Name</key> <string>com.apple.driver.AppleIntelSKLGraphicsFramebuffer</string> <key>Replace</key> <data> TIlVuOtA </data> </dict> </array> </dict> <key>RtVariables</key> <dict> <key>BooterConfig</key> <string>0x28</string> <key>CsrActiveConfig</key> <string>0x67</string> <key>MLB</key> <string>C02032109R5DC771H</string> <key>ROM</key> <string>UseMacAddr0</string> </dict> <key>SMBIOS</key> <dict> <key>BiosReleaseDate</key> <string>04/19/16</string> <key>BiosVendor</key> <string>Apple Inc.</string> <key>BiosVersion</key> <string>MB91.88Z.0159.B00.1708080011</string> <key>Board-ID</key> <string>Mac-9AE82516C7C6B903</string> <key>BoardManufacturer</key> <string>Apple Inc.</string> <key>BoardSerialNumber</key> <string>C02731701GUDM65JC</string> <key>BoardType</key> <integer>10</integer> <key>BoardVersion</key> <string>MacBook9,1</string> <key>ChassisAssetTag</key> <string>MacBook-Aluminum</string> <key>ChassisManufacturer</key> <string>Apple Inc.</string> <key>ChassisType</key> <string>0x10</string> <key>Family</key> <string>MacBook</string> <key>FirmwareFeatures</key> <string>0xFC0FE13E</string> <key>FirmwareFeaturesMask</key> <string>0xFF1FFF3F</string> <key>LocationInChassis</key> <string>Part Component</string> <key>Manufacturer</key> <string>Apple Inc.</string> <key>Memory</key> <dict> <key>Channels</key> <integer>1</integer> <key>Modules</key> <array> <dict> <key>Frequency</key> <integer>1800</integer> <key>Size</key> <integer>4096</integer> <key>Slot</key> <integer>0</integer> <key>Type</key> <string>DDR3</string> <key>Vendor</key> <string>Kingston</string> </dict> </array> </dict> <key>Mobile</key> <true/> <key>PlatformFeature</key> <string>0x1A</string> <key>ProductName</key> 
<string>MacBook9,1</string> <key>ProductName-Comment</key> <string>Using Haswell MacBookAir6,2 until Clover has support for KabyLake identifiers</string> <key>SerialNumber</key> <string>C02V53M0HDNK</string> <key>Version</key> <string>1.0</string> </dict> <key>SystemParameters</key> <dict> <key>#BacklightLevel</key> <integer>0</integer> <key>InjectKexts</key> <string>Yes</string> </dict> </dict> </plist> ```
The 30 megawatt Cullerin Range Wind Farm is located in the localities of Cullerin and Breadalbane in the Upper Lachlan Shire, New South Wales, Australia. The wind farm was completed in 2009 and cost around $90 million. The owner, Origin Energy, sold the business to Energy Developments, a subsidiary of Duet.

Incidents

On 5 January 2023, at about 6 am, an electrical fire broke out in the generator cabin of one turbine. Local NSW Rural Fire Service firefighters attended the scene to ensure that burning debris did not start a fire on the ground. The turbine's three blades had been stopped with one pointing vertically, and that blade was consumed by the fire.

See also
Gunning Wind Farm
Wind power in Australia
List of wind farms in New South Wales

References

External links

Wind farms in New South Wales
- How to unmodify a modified file
- Using tags for version control
- Using aliases for git commands
- You can use git offline!
- Use `short` status to make output more compact
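The topics above can be illustrated in a single self-contained session. This is a sketch run in a throwaway repository; the file name `notes.txt`, the tag `v1.0`, and the alias `st` are all placeholder choices, not part of the original list:

```shell
# Demo of the git tips above in a throwaway repo (assumes git is on PATH).
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name Demo

echo "v1" > notes.txt
git add notes.txt
git commit -qm "first commit"         # commits, branches, and log all work offline

echo "oops" >> notes.txt              # modify the file...
git checkout -- notes.txt             # ...then unmodify it from the index
                                      # (`git restore notes.txt` on Git 2.23+)

git tag -a v1.0 -m "first release"    # an annotated tag marks this commit
git tag                               # lists the tag: v1.0

git config alias.st "status --short"  # now `git st` runs `git status --short`
git st                                # empty output here: the tree is clean
```

`git checkout -- <file>` discards unstaged changes by restoring the file from the index, which is exactly the "unmodify a modified file" case; everything else in the session touches only the local `.git` directory, which is why it works with no network at all.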
"What Kind of Love Are You On" is a song by American hard rock band Aerosmith. Originally a track left off the Nine Lives album, it was included on Armageddon: The Album for the 1998 film Armageddon, which starred lead singer Steven Tyler's daughter Liv Tyler. The song was released as a promotional single to rock radio, reaching #4 on the Mainstream Rock Tracks chart. It was written by Steven Tyler, guitarist Joe Perry, and outside songwriters Jack Blades and Tommy Shaw (both formerly of Damn Yankees). It is the second song written for the film, the other being "I Don't Want to Miss a Thing".

Charts

References

1998 singles
1998 songs
Aerosmith songs
Songs written by Tommy Shaw
Songs written by Jack Blades
Songs written by Steven Tyler
Songs written by Joe Perry (musician)
Song recordings produced by Matt Serletic
Columbia Records singles