Dataset columns: text (string, lengths 1 to 1.04M) · language (string, 25 classes)
NAIROBI, Kenya, Mar 18 – Former President Daniel arap Moi’s personal assistant Joshua Kulei has asked a constitutional court to declare unconstitutional a notice requiring him to enumerate how he acquired his wealth. Through lawyer Pravin Bowry, Mr Kulei urged judges Joseph Nyamu, Roselyn Wendoh and Mathew Emukule to declare null and void a notice issued on July 12, 2006 by the director of the Kenya Anti-Corruption Commission (KACC), Aaron Ringera. He argued that Section 26 of the Anti-Corruption and Economic Crimes Act, under which the notice was issued, is no longer in force. “The section was amended in 2007, thus making the notice ineffective for all intents and purposes," he said. Mr Kulei argued that since KACC had opted to continue with the matter despite the new legislation, the court cannot substitute the old section with the new. “The changes are very substantial and, the respondent having opted to proceed with the matter under the old section, the court cannot impose a new section," he submitted. The advocate said it is unfair for KACC to ask his client to give all particulars of how he acquired his wealth from 2001. “To ask an individual to start enumerating how he acquired his property is unfair and nearly impossible, because fundamental principles of fairness and justice are likely to be abused," he added. The notice by Justice Ringera, Mr Bowry contended, is unconstitutional because it indicated that it was from a judge who was still in an acting capacity. “The notice looked as if it was from a judge, and one would conclude that a judge was taking over an investigative-cum-judicial function. This renders the notice unconstitutional," added the lawyer. Mr Kulei is challenging KACC’s demand that he lay bare his wealth portfolio and how he acquired it. KACC has warned him that failure to comply with the directive would attract a fine of Sh300,000 or a jail term not exceeding three years, or both.
english
Aishwarya Rai in another big movie after Ponniyin Selvan? Former Miss World and actress Aishwarya Rai debuted in films with the Kollywood flick Iruvar, directed by Maniratnam, in which she played dual roles, and was also the female lead in Shankar's big blockbuster Jeans. She then switched to Bollywood, becoming a top actress there. Though she has occasionally returned to the South with movies like Kandukonden Kandukonden, Ravanan and Endhiran, she has mostly been starring in Hindi movies. Recently, the actress herself revealed that she will be acting in Maniratnam's big-budget movie Ponniyin Selvan. The latest speculation making the rounds is that Aishwarya Rai will be part of another big South Indian movie besides Ponniyin Selvan: she is reportedly being considered to play the female lead in Chiranjeevi's next movie after his ongoing Sye Raa Narasimha Reddy, to be directed by Koratala Siva. It remains to be seen whether Aishwarya Rai will give her nod to the Chiranjeevi movie.
english
Backed by various health experts, cinnamon-infused water makes for a wonder drink with innumerable healing properties. The unique flavour and fragrance of cinnamon is pure delight. This spice with medicinal properties has been used in a wide variety of cuisines, sweet and savoury dishes, breakfast cereals, baked goods and snacks. Loaded with antioxidant and antibiotic properties, one of the best ways to extract all the goodness from cinnamon is to soak a stick in water and sip on it regularly. Here are some of the benefits that show why you should drink cinnamon water daily. Inducing Weight Loss: While it is yet to be conclusively proved, cinnamon water makes you feel satiated and prevents cravings and hunger pangs, which further aids weight loss. Reducing Effects Of PCOS: Polycystic Ovary Syndrome or PCOS is a hormonal disorder that causes the ovaries to enlarge, with small cysts on the outer edges. As per the PCOS Awareness Association, cinnamon-infused water along with some honey may help in reducing the effects of PCOS; cinnamon water has been reported to reduce insulin resistance in women with PCOS. Boosting The Immune System: Rich in antioxidant polyphenols and proanthocyanidins, cinnamon water gives a boost to the immune system. Its antiviral, antibacterial and antifungal properties help reduce the chances of developing various health hazards like respiratory disorders, heart problems, et al. Dealing With Menstrual Cramps: Women ingesting cinnamon have reported less pain lasting a shorter duration; one cup of warm cinnamon water every day may help reduce the effects of menstrual cramps. Reducing Inflammation: Those suffering from joint pain or arthritis are generally advised to drink cinnamon water for relief, thanks to its anti-inflammatory properties that help reduce excessive pain. It also helps boost blood circulation, inhibiting the development of conditions like arthritis.
english
Topic: Welfare schemes for vulnerable sections of the population by the Centre and States and the performance of these schemes; mechanisms, laws, institutions and bodies constituted for the protection and betterment of these vulnerable sections. 4. Tribal people in India are largely excluded by governments at the Centre and in the States. In this context, throw light upon the significance of the Pradhan Mantri Van Dhan Yojana. (250 words) Why the question: The Van Dhan Scheme is an initiative of the Ministry of Tribal Affairs and TRIFED, and the question sits amidst the welfare schemes and policies that the government is focusing on with the tribal people of India as the target group. Key demand of the question: One has to explain the challenges in addressing the concerns of the tribal pockets in the country and the way policy measures taken by both the Centre and the States often ignore them. Structure of the answer: Explain the statement – tribal people in India are largely excluded by governments at the Centre and in the States – by substantiating it with necessary facts. Discuss the mandate and key features of the Pradhan Mantri Van Dhan Yojana. Explain how the scheme is specifically aimed at the tribal population of the country. Conclude with what needs to be done to address the situation and ensure the welfare of tribal communities.
english
I believe every Indian woman should have free access to sanitary pads. If the government can distribute free condoms, it should definitely offer free pads at all government hospitals and dispensaries. Even private companies should make them available at a highly subsidised price.
english
# wudc_pushNotifications A Windows Presentation Foundation solution that sends toast notifications, formatted for Windows Phone, to applications via Azure Push Notifications.
markdown
<filename>kubernetes/1.20/defs/io.k8s.apiextensions-apiserver.pkg.apis.apiextensions.v1.CustomResourceDefinitionVersion.json { "description": "A version for CustomResourceDefinition.", "properties": { "additionalPrinterColumns": { "description": "Additional columns returned in table output. If no columns are specified, a single column displaying the age of the custom resource is used.", "items": { "$ref": "#/definitions/io.k8s.apiextensions-apiserver.pkg.apis.apiextensions.v1.CustomResourceColumnDefinition" }, "type": "array" }, "deprecated": { "default": false, "description": "This indicates this version of the custom resource API is deprecated. When set to `true`, API requests to this version receive a warning header in the server response.", "type": "boolean" }, "deprecationWarning": { "description": "This overrides the default warning returned to API clients. May only be set when `deprecated` is `true`. The default warning indicates this version is deprecated and recommends use of the newest served version of equal or greater stability, if one exists. This string may only contain printable UTF-8 characters.", "maxLength": 256, "minLength": 1, "type": "string" }, "name": { "description": "The version name, e.g. `\"v1\"`, `\"v2beta1\"`, etc. The custom resources are served under this version at `\"/apis/<group>/<version>/...\"` if `served` is `true`. The value must be a valid DNS label and it must be unique among all versions.", "minLength": 1, "type": "string" }, "schema": { "$ref": "#/definitions/io.k8s.apiextensions-apiserver.pkg.apis.apiextensions.v1.CustomResourceValidation", "description": "The schema used for validation, pruning, and defaulting of this version of the custom resource." }, "served": { "description": "A flag enabling/disabling this version to be served via REST APIs", "type": "boolean" }, "storage": { "description": "This flag indicates that this version should be used when persisting custom resources to storage. There must be exactly one version with `storage` set to `true`.", "type": "boolean" }, "subresources": { "$ref": "#/definitions/io.k8s.apiextensions-apiserver.pkg.apis.apiextensions.v1.CustomResourceSubresources", "description": "What subresources this version of the defined custom resource have." } }, "required": [ "name", "served", "storage" ], "type": "object" }
json
<gh_stars>100-1000 {"nom":"Plan","circ":"7ème circonscription","dpt":"Isère","inscrits":187,"abs":94,"votants":93,"blancs":3,"nuls":4,"exp":86,"res":[{"nuance":"REM","nom":"<NAME>","voix":51},{"nuance":"LR","nom":"M. <NAME>","voix":35}]}
json
The death toll in the Kendrapara firecracker mishap has risen to two, after a 28-year-old succumbed to burn injuries while undergoing treatment on Monday. The deceased has been identified as Pintu Khan. Khan was under treatment at Cuttack-based SCB Medical College and Hospital, where he breathed his last on Sunday night. Earlier, another under-treatment patient, identified as Uday Bhoi, had succumbed to burn injuries late Friday night. He was also under treatment at SCB Hospital in Cuttack. Meanwhile, many other patients are still battling for their lives in the premier hospital’s Burn ICU, Central ICU and general ward. On the other hand, the Sadar police have arrested three persons for their alleged roles in the tragedy. The three have been identified as Bulu Mallik, Arakhita Mallik and Chilu Dash. The accused, booked under IPC Sections 286, 337, 338 and 109, were produced before the court. A case has also been registered under Section 5 of the fireworks and loudspeaker rules and Sections 3 and 5 of the Explosives Act. The three accused are members of different puja committees. Moreover, the Sadar police took suo motu cognisance of the matter and have registered cases against many other puja committee members. When contacted, SDPO Jayant Mahapatra said that more arrests will follow soon. Notably, at least 40 people were injured in the firecracker mishap, which took place in the middle of a crowd celebrating an immersion ceremony of Lord Kartikeswar at Balia Bazaar in Odisha’s Kendrapara. As many as 33 of the injured were initially admitted to Kendrapara District Headquarters Hospital (DHH). After preliminary treatment, they were shifted to SCB Medical College and Hospital, Cuttack, when their condition worsened.
english
According to the police, whether he committed suicide or fell from the 20th floor accidentally will be known after a detailed investigation. The police initiated the drive following a tip-off that OYO rooms, booked online, were being used rampantly for drug deals. “People check into these rooms for just one or two hours for facilitating drug deals. We conducted the raids based on information that hotel rooms were being widely used for drug deals,” the DCP said.
english
<filename>index/d/dried-cherry-and-raisin-rice-pudding-107235.json { "directions": [ "Bring water with salt to a boil in a 2-quart heavy saucepan and stir in rice. Cover pan and reduce heat to low, then cook until water is absorbed, about 15 minutes. Stir in milk and sugar and cook over very low heat, covered, until mixture resembles a thick soup, 50 minutes to 1 hour.", "Whisk together egg, egg whites, vanilla, cardamom, and a pinch of salt. Whisk about 1 cup hot rice mixture into egg mixture, then stir mixture into remaining rice. Cook over low heat (do not let boil), whisking constantly, until an instant-read thermometer registers 170\u00b0F, 1 to 2 minutes. Remove from heat and stir in raisins and cherries.", "Transfer pudding to a 2-quart dish or 6 (8-ounce) ramekins and chill, its surface covered with wax paper, until cool but not cold, 1 to 2 hours." ], "ingredients": [ "1 cup water", "1/4 teaspoon salt", "1/2 cup long-grain white rice", "3 cups 1% fat milk", "1/3 cup sugar", "1 large egg", "2 large egg whites", "1 teaspoon vanilla", "1/8 teaspoon ground cardamom", "1/3 cup golden raisins", "1/3 cup dried tart cherries" ], "language": "en-US", "source": "www.epicurious.com", "tags": [ "Milk/Cream", "Egg", "Rice", "Dessert", "Low Fat", "Dried Fruit", "Raisin", "Fall", "Chill", "Gourmet" ], "title": "Dried Cherry and Raisin Rice Pudding", "url": "http://www.epicurious.com/recipes/food/views/dried-cherry-and-raisin-rice-pudding-107235" }
json
319 Written Answers (b) if so, the details thereof; and (c) the steps taken by Government for their protection? THE MINISTER OF LABOUR (SHRI BINDESHWARI DUBEY): (a) to (c). No report of harassment by the management of workers who went on strike during 22.11.1988 to 28.11.1988 has been received by the Delhi Administration and the Government of Haryana. Information from the Government of Uttar Pradesh has been called for and will be laid on the Table of the House. Rice and Wheat to Manipur 161. SHRI N. TOMBI SINGH: Will the Minister of FOOD AND CIVIL SUPPLIES be pleased to state: (a) the quantities of rice and wheat released to the State of Manipur during the last three years, year-wise, and their ratio to the demand; (b) whether Government have received any complaints regarding pilferage of the foodgrains in transit from Dimapur to Imphal; (c) if so, the details thereof and the action taken thereon; and (d) if not, whether Government propose to call for information from the local unit in this regard? THE DEPUTY MINISTER IN THE MINISTRY OF FOOD AND CIVIL SUPPLIES (SHRI D.L. BAITHA): (a) A statement is given below. (b) to (d). Information is being collected and will be laid on the Table of the Lok Sabha.
english
import re

import pandas as pd

from constant import shenzhen_data_csv_path


def shape_data(header, body):
    """Shape data extracted from a PDF into a DataFrame.

    :param header: column names
    :param body: row data
    :return: the shaped DataFrame
    """
    pd.set_option('display.max_rows', None)
    df = pd.DataFrame(body)
    df.columns = header
    return df


def to_csv(dataframe):
    """Save the data extracted from the PDF to a csv file.

    :param dataframe: the DataFrame to save
    :return: None
    """
    print("Start writing data to the csv file")
    dataframe.to_csv(shenzhen_data_csv_path, index=False)


def read_data_from_csv(path):
    """Read data from a csv file.

    :param path: path of the csv file
    :return: the loaded DataFrame
    """
    pd.set_option('display.max_rows', None)
    return pd.read_csv(path)


def fuzzy_finder(user_input, collection):
    """Fuzzy matching, adapted from https://www.cnblogs.com/weiman3389/p/6047017.html

    :param user_input: the search string
    :param collection: the items to search in
    :return: the matching items
    """
    suggestions = []
    pattern = '.*'.join(user_input)
    regex = re.compile(pattern)
    for item in collection:
        match = regex.search(item)
        if match:
            suggestions.append(item)
    return suggestions
python
{"cdb_manipulate.bsh":"html/cdb_manipulate.bsh.txt","text_manipulate.bsh":"html/text_manipulate.bsh.txt"}
json
For weeks now, political leaders and media alike have framed this fight against coronavirus in the strongest of terms. We are at war. As President Trump tweeted, "I am now a war time president." Such a declaration is not an inconsequential one, and with it comes the worst enemy of all: fear. Having spent the last few weeks at home in quasi-quarantine with my family, I woke this morning with a sense of epiphany. I’ve been here before. I’ve lived months on end in a repetitive cycle of monotony and uncertainty. I’ve lived without the comforts of restaurants, movie theaters, shopping malls, family gatherings, holiday celebrations or last-minute road trips. I’ve turned inanimate objects into gym weights, structured a daily schedule to keep my mind and body busy, and felt the constricting pain of going from free citizen to mission-serving Marine. In war you abide by one truth: seemingly needless discipline in things like exercise, social interaction and even entertainment is as critical as good hygiene or even ammunition when it comes to staying healthy and alive. Our minds can conquer when our bodies can’t. So you get creative. You don’t accept limitations; you create adaptations. For us, that was makeshift volleyball nets and card games; for you, it’s work from home and virtual happy hour. My last deployment was to the deadliest parts of Afghanistan in 2010. And there too was a silent, invisible and deadly enemy. You called them roadside bombs; we called them IEDs. Whatever their nomenclature, they gave no warning and attacked us in indiscriminate places. Markets, streets and playgrounds were the battlefields, and their carriers looked just like the rest of the populace. They were invisible, evil and felt none of the pain they inflicted on us. We survived them with personal protective equipment like ballistic glasses and bulletproof vests.
We stayed 10 meters apart from one another at all times so that if one of us set off a bomb, it’d kill the fewest of us. We dealt with the constant swaying of opinions by our political and strategic leaders, even when they seemed to contradict themselves from one day to the next. But we had a mission: to keep one another safe and to make it back home, back to freedom and family. Now, a decade later, my heart breaks as I see a similar experience afflicting my precious nation, here in our own homes. Coronavirus attacks with the same ferocity and no less uncertainty than bombs hidden in the ground. For me, there is no solace in the fact I have seen such a thing before. Simply put, I chose to go to that fight for the explicit purpose of ensuring such a fight wouldn’t come home to you. I can’t help but feel a semblance of failure seeing my fellow Americans needlessly suffer. So as a veteran, a survivor of an IED and a grateful American, let me tell you this: you will get through this. The fear and pain will subside, and in their place will come a renewed appreciation for your freedoms and safety like you’ve never known. You’ll be emotionally stronger, physically more resilient and collectively a better version of the greatest people on earth. Your memories of death and unfairness will be replaced by memorials of fearless sacrifice and common heroism. You’ll sleep better at night knowing we can endure because we do it together. You’ll know firsthand what I’ve known now for so long. Americans don’t just survive tragedies of this kind; we thrive after them. As a veteran, my role in this is simple: to show you the way. To let you know it’s possible and to give any and everything I have to offer to help you through it.
God bless those struggling with this illness, our front line medical workers saving lives by risking theirs, our first responders and military, the people who keep the lights and water running, and the people who keep our hearts full with hope and love. But most of all, God bless the PEOPLE of the United States of America.
english
Of late, Samsung has been launching a plethora of affordable smartphones in the entry-level and mid-range market segments. The company's Galaxy J series smartphones did not fare well against the competition posed by Chinese brands such as Xiaomi, among others. Eventually, the company came up with the better-specced Galaxy A and Galaxy M series. Earlier this year, Samsung launched the reasonably priced Galaxy M10, Galaxy M20 and Galaxy M30 smartphones. Following these, the company also brought out the Galaxy A series smartphones with upgraded specifications and relatively higher pricing. Now, it is also gearing up to launch the Galaxy M40 and Galaxy A80 smartphones in India later this month. As intended, these smartphones are highly popular among buyers and have been achieving impressive sales. The company was recently in the headlines as it sold a few million units of the M series phones within just two months of their launch. Having said that, here we list some of the best Samsung smartphones priced under Rs. 20,000 in India.
english
I HEREBY ENCOURAGE ALL THE QUALIFIED NOMINATORS (see end of article) to NOMINATE LOUJAIN AL-HATHLOUL FOR THE NOBEL PEACE PRIZE. Since 2014, the Saudi feminist movement has been publicly engaged in raising awareness and popularising human rights concepts in general, beyond equality for women and men. It is the most organised and articulate civil society in Saudi Arabia, rapidly becoming "the regime's nightmare." Observers believe that the fear of this uncontrolled spread of human rights was the reason for the 2018–2019 crackdown on feminists. Loujain was one of the leaders in the movement, reshaping the process of mass, collective consciousness-raising and developing a fully articulated understanding of women’s varying social positions. She was a main voice in the movements “Together We Stand to End Male Guardianship of Women” and “Women Demand the Overthrow of Guardianship,” raising awareness online and sharing information. She has done this at great cost to herself and her well-being. Loujain never used a pseudonym, a common practice to avoid reprisal. She preferred to be visible so girls and women could identify with her and follow her demand for social change. For her efforts, Loujain has been arrested three times. Loujain Al-Hathloul was arrested on May 15, 2018. She was held incommunicado for 35 days before her family was notified. She is still in prison 530 days later. In August 2019, Saudi State Security officials offered to release Loujain if she recorded a video denying that they tortured her. She refused. The Saudi Government has still given no legal basis for her detention. When Ms. Al-Hathloul was arrested, officials failed to show any warrant, or other order issued by a lawful authority, that authorised her arrest. The arresting officers also failed to provide verbal confirmation of any laws that Ms. Al-Hathloul might have allegedly violated that would justify her detention. In the indictment that the Government brought against Ms.
Al-Hathloul almost 10 months after her initial detention, 11 out of the 12 charges against her made no reference to any legal violation. In the last charge, the only reference was to a pre-existing Saudi law – “Combating Cybercrime” – under which she was charged for using social media to discuss abolishing male guardianship and giving women equal rights. Loujain has been arrested twice before. Once, in 2014, she was detained for 73 days after driving her car over the Saudi border from the United Arab Emirates, where she was living at the time and where she was licensed to drive. During that period of detention, Ms. Al-Hathloul was never charged. In 2017, she was taken into custody for several days and then released, also without being charged or informed of the legal grounds for her arrest. Loujain could easily have a different life. She has a very supportive family. However, she is driven by an enormous desire to contribute to improving the conditions of children and women in Saudi Arabia. Nominating Loujain is driven by the underlying philosophy of Loujain’s fight for justice: peace should be the means; peace should be the end. Loujain and other activists have fought for fundamental rights in a peaceful manner and are violently repressed. Peaceful actions and protest should be allowed to be a means for positive social change. We deeply believe that intentionally rewarding such acts will be a great symbol for the world. Loujain’s nomination is even more important in the context of peace in the Middle East. Saudi Arabia is a significant country, a model for many countries in the Islamic world. By nominating Loujain, who has been repressed by the Saudi regime for her peaceful protest, the international community would affirm its support for peace and its indirect disapproval of Saudi policy, and of any repressive regime. The Nobel Committee would add its voice to the many governmental bodies and NGOs standing up for peace.
Through the pressure of soft power, Saudi Arabia may be obliged to review its policy regarding pacifists, and many neighboring countries would follow the example of Saudi Arabia. Through Loujain, this award would go to all gender equality defenders in the world, and specifically those struggling in the repressive regimes of the Middle East. It would lift up all the brave women and men who dare to ask for equality, knowing it is the cornerstone of a peaceful society. Loujain has been the voice for the voiceless, and has been silenced since. This award would therefore be the voice of the voiceless fighting for peace. Saudi Arabia has clear ambitions to become a major actor on the international scene. To achieve this goal, Saudi Arabia has abandoned reserve and opted for the use of force to suppress any internal criticism. The murder of Jamal Khashoggi reflects the new norm in Saudi Arabia. The brutality and the impulsivity of that act are the mark of the current Saudi regime. The scale of Saudi Arabia’s ambition for total control – both inside and beyond its own borders – was first spelled out for the rest of the world in the Kingdom’s bloody intervention in Yemen in 2015. We also saw it in the disappearance, for more than two weeks, of the democratically elected leader of Lebanon during an official visit to Riyadh. Then, there was Khashoggi. Mohammed Bin Salman (MBS), the de facto ruler of Saudi Arabia, was believed to be an enlightened despot. Many applaud the reforms of MBS and do not show great concern over the situation of human rights in the country. Many prefer to carry on their good relationship with Saudi Arabia. Though international condemnation has increased since Khashoggi was killed, Saudi Arabia shows little contrition. Thus, it is paramount that we make Saudi Arabia understand that the international community shares neither its policies nor its values. Rewarding a Saudi woman such as Loujain would make such a statement and set an example to other despots.
The Qualified Nominators are:
english
package io.makerplayground.ui;

import io.makerplayground.generator.source.SourceCodeResult;
import io.makerplayground.project.Project;
import io.makerplayground.project.ProjectConfiguration;
import javafx.concurrent.Task;

public abstract class ProjectExportTask extends Task<ExportResult> {
    protected final Project project;
    protected final SourceCodeResult sourcecode;
    protected final ProjectConfiguration configuration;
    protected final String zipFilePath;
    protected ExportResult exportResult;

    public ProjectExportTask(Project project, SourceCodeResult sourcecode, String zipFilePath) {
        this.project = project;
        this.sourcecode = sourcecode;
        this.configuration = project.getProjectConfiguration();
        this.zipFilePath = zipFilePath;
    }

    public ExportResult getExportResult() {
        return exportResult;
    }
}
java
<filename>src/main/java/ejb/PartidoMetricaFacadeEJB.java
package ejb;

import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

import facade.AbstractFacade;
import facade.PartidoMetricaFacade;
import model.PartidoMetrica;

@Stateless
public class PartidoMetricaFacadeEJB extends AbstractFacade<PartidoMetrica> implements PartidoMetricaFacade {

    @PersistenceContext(unitName = "politweetsPU")
    private EntityManager em;

    public PartidoMetricaFacadeEJB() {
        super(PartidoMetrica.class);
    }

    @Override
    protected EntityManager getEntityManager() {
        return this.em;
    }
}
java
def finished_hook_for_mp3(d):
    if d['status'] == 'finished':
        print('Download complete. Starting conversion.')


def finished_hook_for_mp4(d):
    if d['status'] == 'finished':
        print('Download complete.')
python
Born to actor Mohan Ram, Vidyullekha entered the industry with none of that influence and hype when she debuted in Gautam Menon's 'Neethaane En Ponvasantham'. As is well known, she went on to become an established actress with her unique role in 'Theeya Velai Seiyannum Kumaru'. The latest on the actress is that she is really going places. Earlier, she confirmed being a part of Ajith's 54th film, directed by 'Siruthai' Siva, and now she has declared that she is a part of Vijay's upcoming film 'Jilla' too. Excited about being a part of the film directed by Nesan, the actress tweeted, "Happy to announce that I am doing a 'different' comic role in #JILLA starring Vijay, Kajal & Mohan Lal. Shooting starts shortly!"
english
New Delhi, Dec 23: Most fans in India believe that football has not seen global success in the country due to the lack of infrastructure, a survey revealed on Friday. According to the study, 47 percent of fans felt that lack of infrastructure was the reason behind India’s inability to shine globally, while 26 percent said poor recognition was the key factor affecting the game's growth. “47 percent of the participants believe that football hasn’t seen global success in India because of lack of infrastructure, whereas 26 percent believe it to be the lack of recognition,” said the ESPN FANtastic survey, which saw over 1,000 Indian football fans voting. On the other hand, 48 percent of the fans voted for Manchester United coach Jose Mourinho as the best English Premier League (EPL) coach, while 27 percent of fans below the age of 18 voted for Manchester City coach Pep Guardiola. After 18 matches in the EPL, City lead the table with 52 points while United sit second with 41. Contradicting the choice of the Indian fans, former Scottish footballer Craig Burley said: “Jose Mourinho may have the best CV in terms of trophies won, at various clubs, some smaller than the ones managed by Pep, obviously thinking towards Porto in particular, where Jose started.” When asked about the team to win the Champions League, 48 percent of fans voted for English football giants Manchester United. “48 percent fans are rooting for Manchester United to win the league. However, the youth, below 18 years old, strongly feel that Manchester City, also known as Pep Guardiola’s ‘The Unbeatables’, are equally competitive and more likely to win the league given their successive run,” the statement read.
english
{"content": "So UK citizens who come back from Iraq should lose their citizenship?? Yeah best let the rest of the world deal with our problems #bbcqt", "entities": [{"offset": 3, "type": "ne", "id": 2, "entity": "uk"}, {"offset": 34, "type": "topic keyword", "id": 3, "entity": "iraq"}], "topics": [{"topic": "defence", "id": 1}], "tweet_id": "578698276800704512"}
json
Mumbai: Actor Bhumi Pednekar on Thursday started shooting for her upcoming film “Durgavati”. Superstar Akshay Kumar, who is presenting the horror-thriller, shared a picture from the sets on Instagram as the film went on floors. “Durgavati” is being directed by G. Ashok. It is reportedly the official Hindi remake of the Telugu film “Bhaagamathie”. Bhumi too posted the same image on social media. “With her blessings we start #Durgavati. Need all your support and love as I start the most special film of my career. @akshaykumar sir I am ready to stand tall and strong. #AshokG sir @ivikramix @shikhaarif.sharma #CapeOfGoodFilms @abundantiaent @mahieg Let’s do this,” she captioned the photograph.
english
Behind all the glitz and glamour of the shining film industry, many artists have fallen into a downward spiral of dark and twisted fates. It is not uncommon to hear horror stories from the industry that have stripped away many talented artists' careers. However, many have managed to pull themselves out of it and stand on their feet again. Kapil Sharma is one such example. The comedian gained a bad reputation after reaching the highest peak of success with The Kapil Sharma Show. From showing up late to work to unruly behaviour, the comedian battled through dark times due to alcoholism and depression. In an interview with Aaj Tak, Kapil Sharma looked back at those days and recalled how his decision to cancel a shoot led to Shah Rukh Khan counselling him in his car. He recalled cancelling the shoot because he 'did not want to go through with it'. A few days later, Shah Rukh Khan came to visit him when he was at the studio for another project. ''As an artist, he understood,'' Kapil Sharma stated, before revealing that Shah Rukh Khan sat with him in his car for an hour and 'counselled' him. SRK also asked the comedian if he took drugs, which the latter denied. Kapil Sharma revealed that he confided in the seasoned actor, admitting to him that he did not feel like working anymore. Kapil revealed that Shah Rukh Khan said a 'lot of good things'. However, he understood that he needed to get back into the business on his own. He also added that he was facing depression for the first time.
english
export * from './Router'; export * from './Server';
typescript
package com.fizzbuzz.model;

import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.annotation.JsonProperty;

import java.util.*;

@JsonIgnoreProperties(ignoreUnknown = true)
public class FizzBuzz {

    @JsonProperty("fizz")
    private List<Integer> Fizz;

    @JsonProperty("buzz")
    private List<Integer> Buzz;

    @JsonProperty("fizzBuzz")
    private List<Integer> FizzBuzz;

    public List<Integer> getFizz() {
        return Fizz;
    }

    public void setFizz(List<Integer> fizz) {
        Fizz = fizz;
    }

    public List<Integer> getBuzz() {
        return Buzz;
    }

    public void setBuzz(List<Integer> buzz) {
        Buzz = buzz;
    }

    public List<Integer> getFizzBuzz() {
        return FizzBuzz;
    }

    public void setFizzBuzz(List<Integer> fizzBuzz) {
        FizzBuzz = fizzBuzz;
    }
}
java
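The FizzBuzz model above is only a container for three integer lists. A minimal sketch of how a backend might populate those lists (Python used here for illustration; the function name is my own, not part of the repo):

```python
def classify(n: int) -> dict:
    """Partition 1..n into the three buckets the FizzBuzz model exposes.

    Multiples of both 3 and 5 go only to 'fizzBuzz', mirroring the usual
    FizzBuzz rules; numbers divisible by neither are omitted entirely.
    """
    result = {"fizz": [], "buzz": [], "fizzBuzz": []}
    for i in range(1, n + 1):
        if i % 15 == 0:
            result["fizzBuzz"].append(i)
        elif i % 3 == 0:
            result["fizz"].append(i)
        elif i % 5 == 0:
            result["buzz"].append(i)
    return result
```

A REST endpoint would then serialize this dict (or the Java POJO above) straight to JSON.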
[
	{
		"file": "expandedfoods:itemtypes/food/mixing/candy.json",
		op: "add",
		path: "/skipVariants",
		value: [ "candy-acorn-raw" ],
		dependsOn: [{ "modid": "acorns", "invert": true }]
	},
	{
		"file": "expandedfoods:recipes/kneading/candy.json",
		op: "add",
		path: "/4/enabled",
		value: "false",
		dependsOn: [{ "modid": "acorns", "invert": true }]
	},
	{
		"file": "expandedfoods:recipes/kneading/candy.json",
		op: "add",
		path: "/5/enabled",
		value: "false",
		dependsOn: [{ "modid": "acorns", "invert": true }]
	}
]
json
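Each patch in the file above is gated by a `dependsOn` list. Assuming the usual semantics (where `invert: true` means "apply only when the named mod is NOT installed"), the gate can be sketched like this; the function name and dict shapes are mine, for illustration only:

```python
def patch_enabled(patch: dict, installed_mods: set) -> bool:
    """Return True when every dependsOn entry of a patch is satisfied."""
    for dep in patch.get("dependsOn", []):
        present = dep["modid"] in installed_mods
        if dep.get("invert", False):
            # invert flips the condition: the patch wants the mod absent.
            present = not present
        if not present:
            return False
    return True

# One of the patches from the file above, reduced to the relevant keys.
patch = {
    "op": "add",
    "path": "/skipVariants",
    "dependsOn": [{"modid": "acorns", "invert": True}],
}
```

So with no "acorns" mod installed, the raw-acorn candy variant is skipped; once that mod is present, the patch is disabled.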
{ "name": "environment-reporter", "version": "0.1.0", "description": "Code for a Tessel microcomputer with connected climate and ambience modules to report environmental conditions.", "main": "index.js", "scripts": { "start": "t2 run index.js" }, "repository": { "type": "git", "url": "git+https://github.com/caltemose/environment-reporter.git" }, "keywords": [ "tessel", "iot", "climate", "weather" ], "author": "<NAME>", "license": "MIT", "bugs": { "url": "https://github.com/caltemose/environment-reporter/issues" }, "homepage": "https://github.com/caltemose/environment-reporter#readme", "dependencies": { "ambient-attx4": "^0.2.11", "climate-si7005": "^0.1.5", "firebase-admin": "^4.2.1", "moment": "^2.18.1" } }
json
{ "text": "I record shows I love so I can binge-watch them on my own schedule. Last night I watched the first episode of season 5 of <a href=\"https://www.amc.com/shows/better-call-saul\">Better Call Saul</a>. It was good of course, but the commercials are so dated. Like reading a magazine from the <a href=\"https://www.google.com/search?safe=off&rlz=1C5CHFA_enUS743US747&sxsrf=ALeKk02kN35WVDHANRMb7Mf6B6dTtrvR0w:1587403615144&q=magazine+from+the+1950s&tbm=isch&source=univ&sa=X&ved=2ahUKEwin5eGkw_foAhV8hXIEHUqjAQgQsAR6BAgJEAE&biw=1256&bih=884\">1950s</a>. The episode aired Feb 23. ", "created": "Mon, 20 Apr 2020 17:20:10 GMT", "type": "outline" }
json
package com.dtflys.forest.backend.url;

import com.dtflys.forest.http.ForestRequest;
import com.dtflys.forest.utils.StringUtils;

import java.util.Map;

/**
 * @author gongjun[<EMAIL>]
 * @since 2017-05-19 14:10
 */
public class SimpleURLBuilder extends URLBuilder {

    @Override
    public String buildUrl(ForestRequest request) {
        String url = request.getUrl();
        String queryString = request.getQueryString();
        StringBuilder urlBuilder = new StringBuilder(url);
        if (StringUtils.isNotBlank(queryString)) {
            urlBuilder.append("?").append(queryString);
        }
        String ref = request.getRef();
        if (StringUtils.isNotEmpty(ref)) {
            urlBuilder.append("#").append(ref);
        }
        return urlBuilder.toString();
    }
}
java
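The same assembly — base URL, then an optional `?query`, then an optional `#fragment` — in a short Python analogue (not part of the Forest library, just a sketch of the logic):

```python
def build_url(url: str, query: str = "", ref: str = "") -> str:
    """Append query string and fragment to a base URL, if present."""
    parts = [url]
    if query:  # mirrors StringUtils.isNotBlank(queryString)
        parts.append("?" + query)
    if ref:    # mirrors StringUtils.isNotEmpty(ref)
        parts.append("#" + ref)
    return "".join(parts)
```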
How did you end up with negative oil prices today? This happens when a physical futures contract finds no buyers close to or at expiry. Let me explain what that means:

1. A physical contract such as the NYMEX WTI has a delivery point, Cushing, OK, and a delivery date, in this occurrence May. So people who hold the contract at the end of the trading window have to take physical delivery of the oil they bought on the futures market. This is very rare.

2. It means that in the last few days of the futures trading cycle (which ends tomorrow for this one), speculative or paper futures positions start rolling over to the next contract. This is normally a pretty undramatic affair.

3. What is happening today is that traders or speculators who had bought the contract are finding themselves unable to resell it, and have no storage booked to take delivery of the crude in Cushing, OK, where delivery is specified in the contract.

4. This means that all the storage in Cushing is booked, and there is no price they can pay to store it, or they are totally inexperienced in this game and are caught holding a contract whose full physical implications they did not understand as the clock expires.

6. The June contract is not out of the woods either: today's action indicates that physical oil markets at Cushing are not in good shape and that storage is getting very full.

7. A decline of over 15% in the June contract price points to real worries that the physical stress will continue to reverberate and will force a lot more production shutdowns during May than the ones announced so far.

8. So today's negative prices are the reflection of dire market conditions for producers, with the hope that demand restarts before the middle of May and that the June contract does not face the same fate.
english
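The mechanics in points 3 and 4 can be caricatured numerically: once storage is unavailable at any price, the trapped holder's best bid turns negative, i.e. they pay someone to take delivery. A toy model, with all numbers and names invented purely for illustration:

```python
from typing import Optional

def clearing_price(barrel_value: float, storage_cost: Optional[float]) -> float:
    """Willingness-to-pay of the marginal buyer at contract expiry.

    storage_cost=None models "no storage available at any price": the
    holder must pay roughly a disposal/handling cost to offload the barrels.
    """
    DISPOSAL_COST = 37.63  # hypothetical figure, echoing the WTI May settle
    if storage_cost is None:
        return -DISPOSAL_COST
    # With storage available, the buyer pays value net of storage.
    return barrel_value - storage_cost
```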
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package org.apache.camel.component.sjms.support;

import javax.jms.JMSException;
import javax.jms.Session;

import org.apache.activemq.ActiveMQConnection;
import org.apache.activemq.management.JMSStatsImpl;
import org.apache.activemq.transport.Transport;
import org.apache.activemq.util.IdGenerator;

public class MockConnection extends ActiveMQConnection {

    private int returnBadSessionNTimes;

    protected MockConnection(final Transport transport, IdGenerator clientIdGenerator,
                             IdGenerator connectionIdGenerator, JMSStatsImpl factoryStats,
                             int returnBadSessionNTimes) throws Exception {
        super(transport, clientIdGenerator, connectionIdGenerator, factoryStats);
        this.returnBadSessionNTimes = returnBadSessionNTimes;
    }

    @Override
    public Session createSession(boolean transacted, int acknowledgeMode) throws JMSException {
        this.checkClosedOrFailed();
        this.ensureConnectionInfoSent();
        if (!transacted) {
            if (acknowledgeMode == 0) {
                throw new JMSException("acknowledgeMode SESSION_TRANSACTED cannot be used for an non-transacted Session");
            }
            if (acknowledgeMode < 0 || acknowledgeMode > 4) {
                throw new JMSException("invalid acknowledgeMode: " + acknowledgeMode
                        + ". Valid values are Session.AUTO_ACKNOWLEDGE (1), Session.CLIENT_ACKNOWLEDGE (2), "
                        + "Session.DUPS_OK_ACKNOWLEDGE (3), ActiveMQSession.INDIVIDUAL_ACKNOWLEDGE (4) or for transacted sessions Session.SESSION_TRANSACTED (0)");
            }
        }
        boolean useBadSession = false;
        if (returnBadSessionNTimes > 0) {
            useBadSession = true;
            returnBadSessionNTimes = returnBadSessionNTimes - 1;
        }
        return new MockSession(this, this.getNextSessionId(), transacted ? 0 : acknowledgeMode,
                this.isDispatchAsync(), this.isAlwaysSessionAsync(), useBadSession);
    }
}
java
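The test hook in MockConnection — hand out a "bad" session for the first `returnBadSessionNTimes` requests, then behave normally — is a small fault-injection pattern worth isolating. A hypothetical Python rendering (class and return values are mine, not from the Camel test suite):

```python
class FlakySessionFactory:
    """Returns a 'bad' session for the first N requests, then 'good' ones."""

    def __init__(self, return_bad_n_times: int):
        self.return_bad_n_times = return_bad_n_times

    def create_session(self) -> str:
        # Decrement-and-flag, exactly like returnBadSessionNTimes above.
        if self.return_bad_n_times > 0:
            self.return_bad_n_times -= 1
            return "bad"
        return "good"
```

This lets a test assert that the calling code recovers after a fixed number of failures.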
from typing import Tuple

from pysaurus.database.miniature_tools.pixel_comparator import AbstractPixelComparator


class ColorClassPixelComparator(AbstractPixelComparator):
    def __init__(self, interval_length: int):
        self.interval_length = interval_length

    def _map_pixel(self, pixel: Tuple[int, int, int]) -> Tuple:
        return tuple(
            int(v // self.interval_length) * self.interval_length for v in pixel
        )

    def normalize_data(self, data, width):
        return list(data)

    def pixels_are_close(self, data, i, j, width):
        return self._map_pixel(data[i]) == self._map_pixel(data[j])

    def common_color(self, data, indices, width):
        return self._map_pixel(data[next(iter(indices))])
python
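The quantization at the heart of the class above is easy to see in isolation. A standalone re-implementation of `_map_pixel` (since the original depends on pysaurus internals): each channel is floored to the start of its interval, so two pixels are "close" exactly when all their channels land in the same buckets.

```python
def map_pixel(pixel, interval_length):
    """Snap each RGB channel down to the start of its quantization interval."""
    return tuple((v // interval_length) * interval_length for v in pixel)
```

For example, with `interval_length=32`, every channel value in 0..31 maps to 0, 32..63 maps to 32, and so on, collapsing near-identical colors into one class.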
{"componentChunkName":"component---src-templates-page-js","path":"/ut-animi-est-beatae-totam-saepe/","result":{"data":{"wpgraphql":{"page":{"title":"Ut animi est beatae totam saepe","content":"\n<figure class=\"wp-block-image\"><img src=\"http://localhost/wpheadless/wp-content/uploads/2020/04/186ed3f6-8f3f-35e0-909a-45e6f2d09134.jpg\" alt=\"Quia consequatur aut nobis qui dolore vel et est\"/></figure>\n\n\n\n<p>Eveniet voluptatem omnis quas dolorem tempora. Ipsum voluptas voluptatem qui quia pariatur qui qui. Quam aut et maiores aut quisquam dolore dignissimos. Harum et nesciunt itaque asperiores inventore ratione vel. Id quam blanditiis molestias. Cum commodi ut nostrum natus itaque. Repellat voluptas ea eum aut nemo ipsa qui. Veritatis voluptatum perferendis molestias quis rem aut. Consequatur voluptatem molestiae inventore et sed voluptatem quas. Earum voluptatem sequi sit facilis dolore velit voluptatum. Iusto enim voluptatum doloribus eaque nulla veritatis. Voluptas iusto quidem porro ipsam. Sunt autem est vero nostrum exercitationem architecto. Quod iste omnis et suscipit expedita. Consequatur consequuntur quas sapiente sapiente repudiandae voluptatibus rerum. Quia non et voluptatem. Suscipit temporibus inventore mollitia voluptatum sit velit accusantium molestiae. Sequi corrupti deleniti et et totam illum enim eius. Asperiores iusto et suscipit. Et tempore consequatur doloremque earum enim voluptatem quod distinctio. Et repellendus voluptatem error repellendus quia. Repudiandae incidunt omnis perferendis consequuntur voluptatem. Possimus tempore delectus doloremque quasi eveniet. Omnis consequuntur quis et sapiente qui. Officiis et autem omnis non iure.</p>\n\n\n\n<h3>Aut dolore aut odio quia qui est. 
Eos excepturi quibusdam ex ratione</h3>\n\n\n\n<ul><li>Ex</li><li>Tempora possimus accusamus sequi quidem earum</li><li>Accusantium commodi nihil est in rem</li><li>Pariatur ab in sed labore</li><li>Veniam at ipsam nostrum nulla</li><li>Magnam cum quo sed at sed accusamus</li><li>Sint tempore cumque</li><li>Magni vel non ut enim</li><li>Dolores nam nihil quia hic</li><li>Debitis dolor aut est soluta modi saepe</li></ul>\n\n\n\n<h5>Doloribus accusantium culpa non omnis amet et</h5>\n\n\n\n<blockquote class=\"wp-block-quote\"><p><a title=\"Dolore non ex voluptatem.\" href=\"http://www.lebsack.biz/autem-quae-et-quia-in-eum\">Quo magnam</a> laboriosam vel quos fuga maiores. Saepe atque quas pariatur voluptatem. Sunt voluptas impedit <a title=\"Quo ullam soluta atque.\" href=\"http://hegmann.com/\"></a><a title=\"Laborum sit accusamus culpa.\" href=\"https://www.lind.org/ea-doloribus-natus-culpa-explicabo-officiis-sunt-est\">nihil ullam temporibus. Tempora accusamus sapiente</a> rem iste. et et soluta voluptates laudantium</p></blockquote>\n\n\n\n<div class=\"wp-block-image\"><figure class=\"aligncenter\"><img src=\"http://localhost/wpheadless/wp-content/uploads/2020/04/8c622abb-1c17-335a-a445-d0df658b9b8a.jpg\" alt=\"\"/></figure></div>\n"}}},"pageContext":{"id":"cGFnZToxOA=="}}}
json
<reponame>zhzhzhy/WeiBoHot_history 2020年11月09日05时数据 Status: 200 1.李雪琴 王一博为什么要有标准 微博热度:304435 2.卖高仿鞋团伙获刑并被罚1020万元 微博热度:167792 3.天猫双11 微博热度:157955 4.黄景瑜吴谨言座椅抱 微博热度:148117 5.霉霉在薇娅直播间送祝福 微博热度:122174 6.隐秘而伟大 微博热度:78067 7.郭艾伦被罚115万 微博热度:64386 8.天津疫情 微博热度:63232 9.美国同时出现抗议示威和上街庆祝 微博热度:56314 10.被坠楼者砸死的快递员妻子发声 微博热度:55840 11.黄绮珊张碧晨是否爱过我 微博热度:55546 12.小霸王法人被限制高消费 微博热度:54858 13.我们的歌 微博热度:50313 14.日本接连出现禽流感疫情 微博热度:49901 15.天津涉疫情货物已流向3个地区 微博热度:49121 16.周迅丸子头好可爱 微博热度:48859 17.带女儿住桥下的单亲母亲获救助 微博热度:48456 18.天天向上 微博热度:47775 19.陆军某特战旅高原寒区潜水集训 微博热度:46198 20.大爷自制假人拉车在街头行走 微博热度:46101 21.THE9合体留喻言位置 微博热度:45610 22.天津进入战时状态 微博热度:43985 23.萧燕燕韩德让私奔 微博热度:38688 24.谨防街头扫一扫骗局 微博热度:37388 25.公园配10名保安防游客踩粉黛草 微博热度:36128 26.王源挥手背影 微博热度:36012 27.刘昊然考编成功 微博热度:35674 28.C罗被踢中脚踝 微博热度:34895 29.谭德塞称期待与拜登团队合作 微博热度:34863 30.潍坊高新区禁止让家长批改作业 微博热度:34859 31.李玟孙楠小鬼你怎么舍得我难过 微博热度:34804 32.綦美合哥哥回应 微博热度:34738 33.恋爱综艺中的意难平 微博热度:34590 34.特朗普团队正募集6000万美元用于诉讼 微博热度:32343 35.QGhappy 微博热度:32291 36.杨丞琳李荣浩隔空合唱暧昧 微博热度:32285 37.李佳琦直播 微博热度:32285 38.隐秘而伟大男主人设 微博热度:32242 39.用一张图证明你压力比我大 微博热度:31373 40.武汉一高校食堂员工用脚洗菜 微博热度:30881 41.谢楠桡骨粉碎性骨折 微博热度:30379 42.薇娅品冠新歌爱你需要练习 微博热度:30141 43.WCG魔兽争霸3 微博热度:29295 44.萧胡辇救太平王 微博热度:28862 45.多款3000元内5G机型上市 微博热度:28185 46.韩国脱发人口达1000万 微博热度:28125 47.顾耀东追星 微博热度:27378 48.苏宁 恒大 微博热度:26110 49.记者节 微博热度:25898 50.家长为了孩子的手工课有多努力 微博热度:25416
markdown
There'll be competitions and prizes for the kidlets, with a rare toy available. It's going to be pretty crowded, so it's worth getting there early on the Sunday if you want to get in. The hosts of Australia's most popular gaming television programme, Good Game, will be taking the stage at the EB Games Expo for Good Game Live. Located at the EB Games Arena, the show is described as "a live quiz show for gamers by gamers; a multiplayer showdown of gaming goodness, trivia and LOLs with some very special guests". The hosts of the show will also take part in the expo's Video Game Parade on the Sunday, should you like massive balloons, tanks and cosplay. If there's one thing PAX Australia taught us, it's that the Aussie indie dev scene is full of amazingly talented people. And they will have a home at the EB Games Expo in the Home Grown Gaming section of the show. In addition to Aussie developers, the Home Grown hall will also feature local eSport competitions, as well as a tech zone dedicated to the latest gaming accessories to hit Australian shores (and even some that haven't arrived yet). We expect more details on exactly what to expect from this area in the weeks leading up to the show. You didn't expect a gaming event hosted by a gaming retailer to not offer its own store, did you? The EB Games Mega Store is set to be four times larger than the store at the 2012 Expo, and will officially be the biggest EB Games store in the world. In addition to the latest games, the Mega Store will offer exclusive swag for attendees, the ability to pre-order games, as well as purchase some of the best accessories and hardware shown off on the show floor. While the cynical part of us knows that this is what the show is really all about, we'd probably be disappointed if it wasn't there. Parents will be pleased to know that Sunday, October 6 has been allocated as the EB Games Expo family fun day. 
In addition to a wide range of free activities for the kidlets, including face painting and magicians, there will be video game hero meet and greets, and a special family edition of the Good Game Live show. There's even a family ticketing option for anyone looking to save on the cost of entry for the Sunday. For the educated gamer (or the enthusiast looking to get into the industry), there will be a selection of expert community panels scattered through the event. Topics are wide and varied, with panels ranging from a focus on how video games are good for you (with speakers from the Starlight foundation), to debates over whether retro or modern games are better. Gaming with kids, Australian indies and how you don't need a degree to make games are all worth paying attention to as well. On top of all that, there are plenty of other exhibitors and events scheduled over the course of the weekend. TechRadar will be there to bring you all the latest news as it happens. Also, to try and embarrass ourselves playing TitanFall. Get the hottest deals available in your inbox plus news, reviews, opinion, analysis, deals and more from the TechRadar team.
english
The photos of a lion from a zoo in China are breaking the internet, thanks to its unique mane. Yes, a lion residing at the Guangzhou Zoo went viral as netizens were amused to see it sporting fringes. The white lion created waves online after a visitor to the zoo recently shared images of it on a Chinese social media platform, the Little Red Book (Xiaohongshu). In the images, the lion was seen flaunting choppy bangs over its forehead, with many wondering who dared to give the beast a haircut. As the image quickly spread on other social media sites, people started to demand answers. According to Mothership, the zoo denied allegations of giving the animal a haircut, claiming the unique appearance was due to the humidity in Guangzhou province. However, according to a report by Newsweek, a user who had visited the zoo less than a week before the pictures were posted said the lion wasn’t sporting bangs then. The lion, identified as Hang by Pearl River News, got many talking online, with netizens comparing his mullet-style hair to celebrities like Bruce Lee and Anderson Paak. While some found the hairstyle cute, others doubted the zoo’s statement, saying it is impossible for a natural mane to look like this. In 2020, a temple elephant in Tamil Nadu broke the internet as “Bob-cut Sengamalam”, with thousands thronging the Rajagopalaswamy Temple in Mannargudi to see the animal.
english
use bytes::BytesMut;
use mqttbytes::v4::*;
use tokio::io::{AsyncRead, AsyncReadExt, AsyncWrite, AsyncWriteExt};

use crate::state;
use crate::state::State;
use std::io::{self, ErrorKind};
use tokio::time::{self, error::Elapsed, Duration};

#[derive(Debug, thiserror::Error)]
pub enum Error {
    #[error("I/O = {0}")]
    Io(#[from] io::Error),
    #[error("State = {0}")]
    State(#[from] state::Error),
    #[error("Invalid data = {0}")]
    Mqtt(mqttbytes::Error),
    #[error["Keep alive timeout"]]
    KeepAlive(#[from] Elapsed),
}

/// Network transforms packets <-> frames efficiently. It takes
/// advantage of pre-allocation, buffering and vectorization when
/// appropriate to achieve performance
pub struct Network {
    /// Socket for IO
    socket: Box<dyn N>,
    /// Buffered reads
    read: BytesMut,
    /// Maximum packet size
    max_incoming_size: usize,
    /// Maximum readv count
    max_readb_count: usize,
    keepalive: Duration,
}

impl Network {
    pub fn new(socket: impl N + 'static, max_incoming_size: usize) -> Network {
        let socket = Box::new(socket) as Box<dyn N>;
        Network {
            socket,
            read: BytesMut::with_capacity(10 * 1024),
            max_incoming_size,
            max_readb_count: 10,
            keepalive: Duration::from_secs(0),
        }
    }

    pub fn set_keepalive(&mut self, keepalive: u16) {
        let keepalive = Duration::from_secs(keepalive as u64);
        self.keepalive = keepalive + keepalive.mul_f32(0.5);
    }

    /// Reads more than 'required' bytes to frame a packet into self.read buffer
    async fn read_bytes(&mut self, required: usize) -> io::Result<usize> {
        let mut total_read = 0;
        loop {
            let read = self.socket.read_buf(&mut self.read).await?;
            if 0 == read {
                return if self.read.is_empty() {
                    Err(io::Error::new(
                        ErrorKind::ConnectionAborted,
                        "connection closed by peer",
                    ))
                } else {
                    Err(io::Error::new(
                        ErrorKind::ConnectionReset,
                        "connection reset by peer",
                    ))
                };
            }

            total_read += read;
            if total_read >= required {
                return Ok(total_read);
            }
        }
    }

    pub async fn read(&mut self) -> Result<Packet, io::Error> {
        loop {
            let required = match read(&mut self.read, self.max_incoming_size) {
                Ok(packet) => return Ok(packet),
                Err(mqttbytes::Error::InsufficientBytes(required)) => required,
                Err(e) => return Err(io::Error::new(ErrorKind::InvalidData, e.to_string())),
            };

            // read more packets until a frame can be created. This function
            // blocks until a frame can be created. Use this in a select! branch
            self.read_bytes(required).await?;
        }
    }

    pub async fn read_connect(&mut self) -> io::Result<Connect> {
        let packet = self.read().await?;
        match packet {
            Packet::Connect(connect) => Ok(connect),
            packet => {
                let error = format!("Expecting connect. Received = {:?}", packet);
                Err(io::Error::new(io::ErrorKind::InvalidData, error))
            }
        }
    }

    pub async fn _read_connack(&mut self) -> io::Result<ConnAck> {
        let packet = self.read().await?;
        match packet {
            Packet::ConnAck(connack) => Ok(connack),
            packet => {
                let error = format!("Expecting connack. Received = {:?}", packet);
                Err(io::Error::new(io::ErrorKind::InvalidData, error))
            }
        }
    }

    pub async fn readb(&mut self, state: &mut State) -> Result<bool, Error> {
        if self.keepalive.as_secs() > 0 {
            let disconnect = time::timeout(self.keepalive, async {
                let disconnect = self.collect(state).await?;
                Ok::<_, Error>(disconnect)
            })
            .await??;
            Ok(disconnect)
        } else {
            let disconnect = self.collect(state).await?;
            Ok(disconnect)
        }
    }

    /// Read packets in bulk. This allow replies to be in bulk. This method is used
    /// after the connection is established to read a bunch of incoming packets
    pub async fn collect(&mut self, state: &mut State) -> Result<bool, Error> {
        let mut count = 0;
        let mut disconnect = false;

        loop {
            match read(&mut self.read, self.max_incoming_size) {
                // Store packet and return after enough packets are accumulated
                Ok(packet) => {
                    disconnect = state.handle_network_data(packet)?;
                    count += 1;
                    if count >= self.max_readb_count {
                        return Ok(disconnect);
                    }
                }
                // If some packets are already framed, return those
                Err(mqttbytes::Error::InsufficientBytes(_)) if count > 0 => return Ok(disconnect),
                // Wait for more bytes until a frame can be created
                Err(mqttbytes::Error::InsufficientBytes(required)) => {
                    self.read_bytes(required).await?;
                }
                Err(e) => return Err(Error::Mqtt(e)),
            };
        }
    }

    pub async fn _connect(&mut self, connect: Connect) -> Result<usize, io::Error> {
        let mut write = BytesMut::new();
        let len = match connect.write(&mut write) {
            Ok(size) => size,
            Err(e) => return Err(io::Error::new(io::ErrorKind::InvalidData, e.to_string())),
        };

        self.socket.write_all(&write[..]).await?;
        Ok(len)
    }

    pub async fn connack(&mut self, connack: ConnAck) -> Result<usize, io::Error> {
        let mut write = BytesMut::new();
        let len = match connack.write(&mut write) {
            Ok(size) => size,
            Err(e) => return Err(io::Error::new(io::ErrorKind::InvalidData, e.to_string())),
        };

        self.socket.write_all(&write[..]).await?;
        Ok(len)
    }

    pub async fn flush(&mut self, write: &mut BytesMut) -> Result<(), io::Error> {
        if write.is_empty() {
            return Ok(());
        }

        self.socket.write_all(&write[..]).await?;
        write.clear();
        Ok(())
    }
}

pub trait N: AsyncRead + AsyncWrite + Send + Sync + Unpin {}
impl<T> N for T where T: AsyncRead + AsyncWrite + Unpin + Send + Sync {}
rust
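The core of `read()` above is a classic framing loop: try to decode a packet from the buffer, and when the decoder reports `InsufficientBytes(required)`, pull at least that many more bytes from the socket and retry. A language-agnostic Python sketch of the same loop, using an invented toy protocol (1-byte length prefix) in place of MQTT framing:

```python
class InsufficientBytes(Exception):
    """Raised when the buffer does not yet hold a full frame."""
    def __init__(self, required):
        self.required = required

def try_frame(buf: bytearray) -> bytes:
    """Frame one packet: first byte is the payload length (toy protocol)."""
    if not buf:
        raise InsufficientBytes(1)
    need = 1 + buf[0]
    if len(buf) < need:
        raise InsufficientBytes(need - len(buf))
    packet = bytes(buf[1:need])
    del buf[:need]  # consume the framed bytes, like BytesMut does
    return packet

def read_packet(buf: bytearray, socket_chunks) -> bytes:
    """Loop like Network::read: frame, else fetch more bytes and retry."""
    while True:
        try:
            return try_frame(buf)
        except InsufficientBytes:
            # In the Rust code this is read_bytes(required) on the socket.
            buf.extend(next(socket_chunks))
```

The point of the design is that partial reads are normal: the buffer persists across calls, so a packet split over several TCP segments is reassembled transparently.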
{ "parent": "crumbs:block/template_post", "textures": { "post": "crumbs:block/stripped_dark_oak_post", "top": "crumbs:block/stripped_dark_oak_post_top" } }
json
Max Verstappen has revealed fond memories of his family spending time with Michael Schumacher and his family. In an interview with German publication Autobild, Red Bull's new world champion described how he perceived the seven-time world champion as a child and his relationship with Schumacher's family. The reigning world champion has fond memories of the German F1 legend, who was a close friend and team-mate of his father Jos Verstappen. The Dutchman also recalled many fond moments with the German champion’s children Mick and Gina, and shared karting memories between the Schumacher and Verstappen families. The newly-crowned world champion has often been compared to the German legend by analysts and critics of the sport, with the F1 world drawing parallels between the two champions from both a driving and a psychological standpoint. Lewis Hamilton may not have managed to overtake Michael Schumacher by winning an eighth title, but his title rival Verstappen did surpass the German’s long-standing record for the most podiums in a season. The seven-time world champion’s record of 17 podiums in a single season was outdone by the 2021 world champion, who claimed 18 podiums this season.
english
One of the top tight ends in the NFL right now is the San Francisco 49ers' George Kittle. He is a top athlete who has played in three Pro Bowls and was previously named Most Improved Player of the Year. He is one of the most beloved NFL players, having surpassed Aaron Rodgers in apparel sales. There is no doubt that the woman by his side has always been there to support him. His wife, Claire Kittle, was born in Dubuque, Iowa, in 1994. She attended Wahlert High School before enrolling at the University of Iowa. With 1,011 points, she is third all-time in scoring in Wahlert's records. Given her enthusiasm for exercise, Claire went on to earn a bachelor's degree in health and human physiology. George and Claire became friends during their first year of college. She was riding her moped to basketball training when George stopped her to compliment her bright pink helmet. From there, their friendship gradually developed into a romantic one, and the two stayed together throughout college. At the University of Iowa, Claire played basketball for the Hawkeyes for four years. She represented the Fuel Up to Play 60 initiative on campus, joining JE Models as a fitness model soon after graduating. However, since relocating to California to be with George, Claire has been more involved as a blogger and traveler. On her blog, "Letter Set Go," she provides advice on a variety of topics, including fashion, health, and traveling. After finishing college, she started modeling for fitness brands and now runs Claire Till Fitness. On Instagram, Claire Kittle routinely shares images of her and George, and many of their recent posts are from their wedding. Claire and George have been married since April 2019. More specifically, the Kittle couple wed in a private ceremony attended only by their closest family and friends. George's father, Bruce, performed the ceremony. Since "that's the sacred element of it all," Claire and her husband chose a short ceremony, according to her blog at letteysetgo.com. The dynamic duo's influence stretches across a number of industries, ranging from Claire's successful engagements in fitness and blogging to George's fame in the NFL.
english
package hedera

import (
	"time"

	"github.com/hashgraph/hedera-sdk-go/proto"
)

type ConsensusTopicCreateTransaction struct {
	TransactionBuilder
	pb *proto.ConsensusCreateTopicTransactionBody
}

// NewConsensusTopicCreateTransaction creates a ConsensusTopicCreateTransaction builder which can be
// used to construct and execute a Consensus Create Topic Transaction.
func NewConsensusTopicCreateTransaction() ConsensusTopicCreateTransaction {
	pb := &proto.ConsensusCreateTopicTransactionBody{}

	inner := newTransactionBuilder()
	inner.pb.Data = &proto.TransactionBody_ConsensusCreateTopic{pb}

	builder := ConsensusTopicCreateTransaction{inner, pb}

	return builder.SetAutoRenewPeriod(7890000 * time.Second)
}

// SetAdminKey sets the key required to update or delete the topic. If unspecified, anyone can increase the topic's
// expirationTime.
func (builder ConsensusTopicCreateTransaction) SetAdminKey(publicKey PublicKey) ConsensusTopicCreateTransaction {
	builder.pb.AdminKey = publicKey.toProto()
	return builder
}

// SetSubmitKey sets the key required for submitting messages to the topic. If unspecified, all submissions are allowed.
func (builder ConsensusTopicCreateTransaction) SetSubmitKey(publicKey PublicKey) ConsensusTopicCreateTransaction {
	builder.pb.SubmitKey = publicKey.toProto()
	return builder
}

// SetTopicMemo sets a short publicly visible memo about the topic. No guarantee of uniqueness.
func (builder ConsensusTopicCreateTransaction) SetTopicMemo(memo string) ConsensusTopicCreateTransaction {
	builder.pb.Memo = memo
	return builder
}

// SetAutoRenewPeriod sets the initial lifetime of the topic and the amount of time to extend the topic's lifetime
// automatically at expirationTime if the autoRenewAccount is configured and has sufficient funds.
//
// Required. Limited to a maximum of 90 days (server-side configuration which may change).
func (builder ConsensusTopicCreateTransaction) SetAutoRenewPeriod(period time.Duration) ConsensusTopicCreateTransaction {
	builder.pb.AutoRenewPeriod = durationToProto(period)
	return builder
}

// SetAutoRenewAccountID sets an optional account to be used at the topic's expirationTime to extend the life of the
// topic. The topic lifetime will be extended up to a maximum of the autoRenewPeriod or however long the topic can be
// extended using all funds on the account (whichever is the smaller duration/amount).
//
// If specified, there must be an adminKey and the autoRenewAccount must sign this transaction.
func (builder ConsensusTopicCreateTransaction) SetAutoRenewAccountID(id AccountID) ConsensusTopicCreateTransaction {
	builder.pb.AutoRenewAccount = id.toProto()
	return builder
}

//
// The following _5_ must be copy-pasted at the bottom of **every** _transaction.go file
// We override the embedded fluent setter methods to return the outer type
//

func (builder ConsensusTopicCreateTransaction) SetMaxTransactionFee(maxTransactionFee Hbar) ConsensusTopicCreateTransaction {
	return ConsensusTopicCreateTransaction{builder.TransactionBuilder.SetMaxTransactionFee(maxTransactionFee), builder.pb}
}

func (builder ConsensusTopicCreateTransaction) SetMemo(memo string) ConsensusTopicCreateTransaction {
	return ConsensusTopicCreateTransaction{builder.TransactionBuilder.SetTransactionMemo(memo), builder.pb}
}

func (builder ConsensusTopicCreateTransaction) SetTransactionValidDuration(validDuration time.Duration) ConsensusTopicCreateTransaction {
	return ConsensusTopicCreateTransaction{builder.TransactionBuilder.SetTransactionValidDuration(validDuration), builder.pb}
}

func (builder ConsensusTopicCreateTransaction) SetTransactionID(transactionID TransactionID) ConsensusTopicCreateTransaction {
	return ConsensusTopicCreateTransaction{builder.TransactionBuilder.SetTransactionID(transactionID), builder.pb}
}

func (builder ConsensusTopicCreateTransaction) SetNodeAccountID(nodeAccountID AccountID) ConsensusTopicCreateTransaction {
	return ConsensusTopicCreateTransaction{builder.TransactionBuilder.SetNodeAccountID(nodeAccountID), builder.pb}
}
go
[launch-stack]: https://s3.amazonaws.com/cloudformation-examples/cloudformation-launch-stack.png ## Chaos Rat >Lightweight chaos experimentation for your cloud infrastructure ### Why While not as thorough, Chaos Rat is easier to get up and running than [Chaos Monkey](https://github.com/Netflix/chaosmonkey). If you want to quickly test and experiment with the resiliency, availability, and fault tolerance of your systems, Chaos Rat is for you. To get started jump to [Usage](#usage) below. This tool does not yet follow the [Principles of Chaos Engineering](https://principlesofchaos.org/) and only terminates compute instances to, per the principles of chaos in practice, "reflect real world events like servers that crash." ### Support Roadmap (PRs Welcome) #### Amazon Web Services - [x] Elastic Compute Cloud Instances - [ ] Elastic Container Service Tasks - [ ] Elastic Kubernetes Service Pods #### Google Cloud Platform - [ ] Compute Engine Instances - [ ] Google Kubernetes Engine Pods ### Usage #### Chew ```bash $ chaosrat chew -h NAME: main chew - Terminate compute instances USAGE: chaosrat chew -a [Auto Scaling group id] -m [maximum termination count] -r [compute region] DESCRIPTION: Terminate `m` number of instances within auto scaling group with ID `a` inside of region `r`. OPTIONS: --auto-scaling-group value The ID of your autoscaling group. --termination-max value The maximum number of instances to terminate. (default: 18446744073709551615) --region value The infrastructure region, containing your Auto Scaling group, that you would like to stress. --verbosity value Logging level: 'trace', 'debug', 'info', 'warn', 'error', 'fatal', or 'panic'. (default: "info") --dry No terminations. Only a description of the would-be affected instances will be returned. ``` - Elastic Compute Cloud (EC2) Instances 1. You must be using an EC2 Auto Scaling Group. 
To test Chaos Rat you can deploy the [AutoScalingMultiAZWithNotifications.template.yml](./examples/AutoScalingMultiAZWithNotifications.template.yml) using AWS CloudFormation and use the autoscaling group it provides. Click Launch Stack below to launch the template in the us-east-1 region (North Virginia). [![launch-stack]](https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/new?stackName=AutoScalingMultiAZWithNotifications&templateURL=https://s3.amazonaws.com/cloudformation-templates-us-east-1/AutoScalingMultiAZWithNotifications.template ) 2. You can run Chaos Rat using [the latest GitHub Package Registry image](https://github.com/swoldemi/chaosrat/packages/43405?version=1.0.0-rc.1). Make sure to set `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, `YOUR_AUTO_SCALING_GROUP_ID`, `YOUR_REGION`, and `TERMINATION_COUNT`: ```bash $ docker run --rm \ -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \ -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \ docker.pkg.github.com/swoldemi/chaosrat/chaosrat:1.0.0-rc.1 chew \ -a $YOUR_AUTO_SCALING_GROUP_ID \ -r $YOUR_REGION \ -m $TERMINATION_COUNT ``` ### Resources - awesome-chaos-engineering: https://github.com/dastergon/awesome-chaos-engineering - Chaos Engineering - Part 1 by <NAME>: https://medium.com/@adhorn/chaos-engineering-ab0cc9fbd12a - AWS re:Invent 2017: Performing Chaos at Netflix Scale (DEV334) by <NAME>: https://youtu.be/LaKGx0dAUlo - Chaos Monkey Guide for Engineers: https://gremlin.com/chaos-monkey
markdown
Some of the most notable venues from Westeros can be found on Earth. Game of Thrones might be best known for its mix of dragons and drama, but it's also one of the most gorgeous shows on TV. Ever wonder where they got that amazing shot? We've found 25 of the best real-life Game of Thrones shooting locations on Google Street View and matched them up with shots from the HBO show. We'll start with the largest city in the Seven Kingdoms: King's Landing, home of the Iron Throne. The historic walled city of Dubrovnik, Croatia, has been used as the location for King's Landing since season 2. In a key flashback, Ned Stark heads to the Tower of Joy in the Red Mountains to find his sister, Lyanna Stark. The Tower of Joy is really the Castle of Zafra in Guadalajara, Spain. The Kingsroad runs from Castle Black to King's Landing. In the scene pictured, Gendry and Arya Stark are on their way to join the Night's Watch. In real life, the Kingsroad is near Stranocum, in County Antrim, Northern Ireland. In Season 5, Cersei Lannister walks naked through King's Landing as penance for her sins. Cersei's "shame" walk was shot on St. Dominic Street in Dubrovnik, Croatia. Here's Arya Stark looking over the city of Braavos and the Titan of Braavos statue. The St. James Cathedral in Sibenik, Croatia, is the location for many Braavos exterior shots. Kastel Gomilica, Croatia, is also used. Across the Narrow Sea in Meereen, Daenerys Targaryen unleashes her dragons on the attacking fleet of the Masters' ships. The location used for this epic scene, which appears in one of our favorite Game of Thrones episodes, is the Mesa Roldan tower in Cabo de Gata, Almeria, Spain. Remember the gardens of the Red Keep in King's Landing? That's where Sansa Stark first tells Olenna and Margaery Tyrell that King Joffrey Baratheon is a "monster." The shooting location for these lush gardens is Trsteno Arboretum in Trsteno, Croatia. 
In Season 5, the Sons of the Harpy try to assassinate Daenerys Targaryen at the fighting pits in Meereen. The Bullring of Osuna, in Spain, is used for Daznak's Pit in Meereen. Daenerys Targaryen frees the slaves of Meereen, a large city filled with pyramids and temples. The location for Meereen is the coastal city of Peniscola in Castellon, Spain. While in Qarth, Daenerys Targaryen is locked inside the House of the Undying until her dragons free her. Qarth's House of the Undying is really the Minceta Tower in Dubrovnik, Croatia. The wealthy Iron Bank of Braavos has made several appearances in Game of Thrones. The St. Jacob Cathedral in Sibenik, Croatia, stands in for the Iron Bank. In the Season 4 finale, Brienne of Tarth battles the Hound over Arya Stark. The backdrop for that fight is the Thingvellir National Park in Blaskogabyggo, Iceland. After attempting to rescue Myrcella from Dorne, Jaime Lannister brokers a truce with Prince Doran Martell at his palace. The throne room of the Arabic Palace at the Real Alcázar of Seville, in Spain, is used for the interior of Prince Doran's mansion. In the sixth season, a captive Daenerys Targaryen is taken to a camp at the Dothraki Sea. The location for the Dothraki Sea is Bardenas Reales in Navarra, Spain. The Long Bridge of Volantis looks a little different in Season 5. That's because shops and markets were added to the original structure using CGI. The real location? The Roman bridge of Cordoba in Andalusia, Spain. In Season 1, Daenerys Targaryen and Khal Drogo tie the knot outside of Pentos. The location for that shoot was the Azure Window on the coast of Gozo, Malta. In the series premiere, Ned Stark and his children find dire wolf puppies in the forests outside of Winterfell. The Tollymore Forest Park in Down, Northern Ireland, was the shooting location for the forests. In the second season, Jon Snow and other Night's Watch members search for Wildlings in the Frostfang Mountains. 
The real Frostfang Mountains are located in Höfðabrekka, Iceland. The seven Iron Islands are controlled by House Greyjoy. The Dunluce Castle, in County Antrim, Northern Ireland, is used for one of the Iron Islands on the show. In Season 2, Myrcella Baratheon leaves King's Landing for Dorne. The scene where Myrcella leaves for Dorne was filmed in the West Harbour in Dubrovnik, Croatia. This picturesque locale is the exterior of Prince Doran's palace. The exterior of House Martell is located at the Alcazar of Seville in Spain. Winterfell is the ancestral home of House Stark. Castle Ward, near the village of Strangford, in County Down, Northern Ireland, is the location used for Winterfell. Here, Sandor "The Hound" Clegane is traveling through the Riverlands with Arya Stark. The Krka National Park, located in Lozovac, Croatia, has been used for multiple Riverlands scenes on the show. On the Dragonstone Beach, Melisandre first mutters her famous phrase, "For the night is dark and full of terrors." The beach used for Dragonstone is the Downhill Strand in County Londonderry, Northern Ireland. After spending years living in Winterfell, Theon Greyjoy returns home to the Iron Islands. The port where Theon arrives is the Ballintoy Harbor in County Antrim, Northern Ireland.
english
package net.puppygames.thjson; import java.util.ArrayList; import java.util.LinkedHashMap; import java.util.List; import java.util.Map; import java.util.Stack; /** * Converts a THJSON input stream into a {@link Map}. Ordering is maintained. */ public class THJSONtoMapConverter implements THJSONListener { private final Stack<Object> stack = new Stack<>(); private final Map<String, Object> root = new LinkedHashMap<>(); private boolean debug = true; private Object current = root; /** * C'tor */ public THJSONtoMapConverter() { } public void setDebug(boolean debug) { this.debug = debug; } /** * Gets the {@link Map} that we create from listening to a THJSON stream. If the parsing throws at any point, then the Map thus returned will * only be partially complete. * @return the {@link Map} thus created. */ public Map<String, Object> getMap() { return root; } @SuppressWarnings("unchecked") private void addChild(Object newChild) { if (current instanceof List) { ((List<Object>) current).add(newChild); } else { Map<String, Object> currentObject = (Map<String, Object>) current; List<Object> children = (List<Object>) currentObject.get("children"); if (children == null) { children = new ArrayList<>(); currentObject.put("children", children); } children.add(newChild); } } @Override public void beginObject(String key, String clazz) { if (debug) { System.out.println("BEGIN OBJECT " + key + " " + clazz); } stack.push(current); Map<String, Object> newCurrent = new LinkedHashMap<>(); addChild(newCurrent); current = newCurrent; newCurrent.put("class", clazz); } @Override public void beginObjectValue(String clazz) { if (debug) { System.out.println("BEGIN OBJECT VALUE " + clazz); } stack.push(current); Map<String, Object> newCurrent = new LinkedHashMap<>(); addChild(newCurrent); current = newCurrent; newCurrent.put("class", clazz); } @Override public void endObject() { if (debug) { System.out.println("END OBJECT"); } current = 
stack.pop(); } @SuppressWarnings("unchecked") @Override public void property(String key, boolean value) { if (debug) { System.out.println("PROPERTY " + key + "=" + value); } ((Map<String, Object>) current).put(key, value); } @SuppressWarnings("unchecked") @Override public void property(String key, float value) { if (debug) { System.out.println("PROPERTY " + key + "=" + value); } ((Map<String, Object>) current).put(key, value); } @SuppressWarnings("unchecked") @Override public void property(String key, int value, IntegerType type) { if (debug) { System.out.println(type + " PROPERTY " + key + "=" + value); } ((Map<String, Object>) current).put(key, value); } @SuppressWarnings("unchecked") @Override public void property(String key, String value, StringType type) { if (debug) { System.out.println(type + " PROPERTY " + key + "=" + value + "<"); } ((Map<String, Object>) current).put(key, value); } @SuppressWarnings("unchecked") @Override public void property(String key, byte[] value, StringType type) { if (debug) { System.out.println(type + " PROPERTY " + key + "=" + value + "<"); } ((Map<String, Object>) current).put(key, value); } @SuppressWarnings("unchecked") @Override public void nullProperty(String key) { if (debug) { System.out.println("PROPERTY " + key + "=null"); } ((Map<String, Object>) current).put(key, null); } @SuppressWarnings("unchecked") @Override public void value(boolean value) { if (debug) { System.out.println("VALUE " + value); } ((List<Object>) current).add(value); } @SuppressWarnings("unchecked") @Override public void value(float value) { if (debug) { System.out.println("VALUE " + value); } ((List<Object>) current).add(value); } @SuppressWarnings("unchecked") @Override public void value(int value, IntegerType type) { if (debug) { System.out.println(type + " VALUE " + value); } ((List<Object>) current).add(value); } @SuppressWarnings("unchecked") @Override public void value(String value, StringType type) { if (debug) { System.out.println(type + " 
VALUE " + value); } ((List<Object>) current).add(value); } @SuppressWarnings("unchecked") @Override public void value(byte[] value, StringType type) { if (debug) { System.out.println(type + " VALUE " + value); } ((List<Object>) current).add(value); } @SuppressWarnings("unchecked") @Override public void nullValue() { if (debug) { System.out.println("NULL VALUE"); } ((List<Object>) current).add(null); } @SuppressWarnings("unchecked") @Override public void beginArray(String key) { if (debug) { System.out.println("BEGIN ARRAY " + key); } stack.push(current); List<Object> newCurrent = new ArrayList<>(); ((Map<String, Object>) current).put(key, newCurrent); current = newCurrent; } @Override public void beginArrayValue() { if (debug) { System.out.println("BEGIN ARRAY VALUE"); } stack.push(current); List<Object> newCurrent = new ArrayList<>(); addChild(newCurrent); current = newCurrent; } @SuppressWarnings("unchecked") @Override public void beginList(String key, String clazz) { if (debug) { System.out.println("BEGIN LIST " + key + " " + clazz); } stack.push(current); // Create a special "array" object Map<String, Object> array = new LinkedHashMap<>(); array.put("class", "array"); array.put("type", clazz); ((Map<String, Object>) current).put(key, array); stack.push(array); List<Object> elements = new ArrayList<>(); array.put("elements", elements); current = elements; } @Override public void beginListValue(String clazz) { if (debug) { System.out.println("BEGIN LIST VALUE " + clazz); } stack.push(current); // Create a special "array" object Map<String, Object> array = new LinkedHashMap<>(); array.put("class", "array"); array.put("type", clazz); addChild(array); stack.push(array); List<Object> elements = new ArrayList<>(); array.put("elements", elements); current = elements; } @SuppressWarnings("unchecked") @Override public void beginMap(String key) { if (debug) { System.out.println("BEGIN MAP " + key); } stack.push(current); Map<String, Object> newCurrent = new 
LinkedHashMap<>(); ((Map<String, Object>) current).put(key, newCurrent); current = newCurrent; } @Override public void beginMapValue() { if (debug) { System.out.println("BEGIN MAP VALUE"); } stack.push(current); Map<String, Object> newCurrent = new LinkedHashMap<>(); addChild(newCurrent); current = newCurrent; } @Override public void endMap() { if (debug) { System.out.println("END MAP"); } current = stack.pop(); } @Override public void endList() { if (debug) { System.out.println("END LIST"); } // Pop elements and discard stack.pop(); // Then pop the array object current = stack.pop(); } @Override public void endArray() { if (debug) { System.out.println("END ARRAY"); } current = stack.pop(); } @Override public void comment(String text, CommentType type) { System.out.println(type + ":" + text + "<"); } @Override public void directive(String text) { System.out.println("DIRECTIVE:" + text + "<"); } }
java
package bucketindex import ( "bytes" "context" "io" "io/ioutil" "path" "github.com/oklog/ulid" "github.com/thanos-io/objstore" "github.com/thanos-io/thanos/pkg/block" "github.com/thanos-io/thanos/pkg/block/metadata" ) // globalMarkersBucket is a bucket client which stores markers (eg. block deletion marks) in a per-tenant // global location too. type globalMarkersBucket struct { parent objstore.Bucket } // BucketWithGlobalMarkers wraps the input bucket into a bucket which also keeps track of markers // in the global markers location. func BucketWithGlobalMarkers(b objstore.Bucket) objstore.Bucket { return &globalMarkersBucket{ parent: b, } } // Upload implements objstore.Bucket. func (b *globalMarkersBucket) Upload(ctx context.Context, name string, r io.Reader) error { blockID, ok := b.isBlockDeletionMark(name) if !ok { return b.parent.Upload(ctx, name, r) } // Read the marker. body, err := ioutil.ReadAll(r) if err != nil { return err } // Upload it to the original location. if err := b.parent.Upload(ctx, name, bytes.NewBuffer(body)); err != nil { return err } // Upload it to the global markers location too. globalMarkPath := path.Clean(path.Join(path.Dir(name), "../", BlockDeletionMarkFilepath(blockID))) return b.parent.Upload(ctx, globalMarkPath, bytes.NewBuffer(body)) } // Delete implements objstore.Bucket. func (b *globalMarkersBucket) Delete(ctx context.Context, name string) error { // Call the parent. if err := b.parent.Delete(ctx, name); err != nil { return err } // Delete the marker in the global markers location too. if blockID, ok := b.isBlockDeletionMark(name); ok { globalMarkPath := path.Clean(path.Join(path.Dir(name), "../", BlockDeletionMarkFilepath(blockID))) if err := b.parent.Delete(ctx, globalMarkPath); err != nil { if !b.parent.IsObjNotFoundErr(err) { return err } } } return nil } // Name implements objstore.Bucket. func (b *globalMarkersBucket) Name() string { return b.parent.Name() } // Close implements objstore.Bucket. 
func (b *globalMarkersBucket) Close() error { return b.parent.Close() } // Iter implements objstore.Bucket. func (b *globalMarkersBucket) Iter(ctx context.Context, dir string, f func(string) error, options ...objstore.IterOption) error { return b.parent.Iter(ctx, dir, f, options...) } // Get implements objstore.Bucket. func (b *globalMarkersBucket) Get(ctx context.Context, name string) (io.ReadCloser, error) { return b.parent.Get(ctx, name) } // GetRange implements objstore.Bucket. func (b *globalMarkersBucket) GetRange(ctx context.Context, name string, off, length int64) (io.ReadCloser, error) { return b.parent.GetRange(ctx, name, off, length) } // Exists implements objstore.Bucket. func (b *globalMarkersBucket) Exists(ctx context.Context, name string) (bool, error) { return b.parent.Exists(ctx, name) } // IsObjNotFoundErr implements objstore.Bucket. func (b *globalMarkersBucket) IsObjNotFoundErr(err error) bool { return b.parent.IsObjNotFoundErr(err) } // Attributes implements objstore.Bucket. func (b *globalMarkersBucket) Attributes(ctx context.Context, name string) (objstore.ObjectAttributes, error) { return b.parent.Attributes(ctx, name) } // WithExpectedErrs implements objstore.InstrumentedBucket. func (b *globalMarkersBucket) WithExpectedErrs(fn objstore.IsOpFailureExpectedFunc) objstore.Bucket { if ib, ok := b.parent.(objstore.InstrumentedBucket); ok { return ib.WithExpectedErrs(fn) } return b } // ReaderWithExpectedErrs implements objstore.InstrumentedBucketReader. func (b *globalMarkersBucket) ReaderWithExpectedErrs(fn objstore.IsOpFailureExpectedFunc) objstore.BucketReader { if ib, ok := b.parent.(objstore.InstrumentedBucketReader); ok { return ib.ReaderWithExpectedErrs(fn) } return b } func (b *globalMarkersBucket) isBlockDeletionMark(name string) (ulid.ULID, bool) { if path.Base(name) != metadata.DeletionMarkFilename { return ulid.ULID{}, false } // Parse the block ID in the path. If there's not block ID, then it's not the per-block // deletion mark. 
return block.IsBlockDir(path.Dir(name)) }
go
{ "id": "615bd1fc329a7345d730d419", "detection": "small noise level increase", "alert_level": 2 }
json
{ "manifest_version": 2, "name": "Zetl", "description": "Take notes on webpages and save them for later.", "version": "1.0", "permissions": [ "tabs", "storage", "commands" ], "browser_action": { "default_icon": "zetl.png", "default_popup": "popup.html" }, "commands": { "_execute_browser_action": { "suggested_key": { "default": "Ctrl+Shift+Z" } }, "save_note": { "suggested_key": { "default": "Alt+Shift+Z" }, "description": "Save the note currently being edited." } } }
json
Awesome Car. Awesome Car. I recently purchased a 2021 model of the Hector 1.5 DCT petrol variant. It has only covered 8,000 kilometres so far. The car is truly awesome. I have also owned Mercedes Benz and Volvo SUVs, but in terms of ride quality, the Hector outshines them in its price category. Technologically, it is years ahead, although it may lack a bit in power due to its 1.5-litre petrol engine, which is understandable. On a scale of 5, I would rate it 4.5 stars. I will definitely recommend it to my friends, and lastly, its looks are a head-turner. - Mileage (30) - Performance (27) - Comfort (63) - Engine (41) - Interior (40)
english
.cc-input { background-size: 50px 40px !important; background-repeat: no-repeat !important; background-position: right !important; background-color: transparent; } .cc-input.amex { background: url("{{ 'img/pay/ccinput_amex.svg' | asset_url }}"); } .cc-input.diners { background: url("{{ 'img/pay/ccinput_diners.svg' | asset_url }}"); } .cc-input.discover { background: url("{{ 'img/pay/ccinput_discover.svg' | asset_url }}"); } .cc-input.jcb { background: url("{{ 'img/pay/ccinput_jcb.svg' | asset_url }}"); } .cc-input.maestro { background: url("{{ 'img/pay/ccinput_maestro.svg' | asset_url }}"); } .cc-input.mastercard { background: url("{{ 'img/pay/ccinput_mc.svg' | asset_url }}"); } .cc-input.visa { background: url("{{ 'img/pay/ccinput_visa.svg' | asset_url }}"); }
css
Congress leader Rahul Gandhi, who recently faced an ED probe in the National Herald case, on Friday took a dig at the Prime Minister, alleging Narendra Modi adopted such a measure due to 'confusion' but did not know there would be no change in the Wayanad MP's behaviour by making him sit in the central agency's office for five days. Gandhi was quizzed by the Enforcement Directorate (ED) in Delhi for five days last month in connection with the National Herald money laundering case. "The Government of India…the Prime Minister thinks that by making me sit in the ED (office) for five days, I will change my behaviour. This is a confusion in the mind of the Prime Minister," Gandhi told a UDF Bahujana Sangamam organised in Sulthan Bathery here against the Eco-Sensitive Zone policy of the BJP-ruled Centre and the CPI(M)-led state government. Alleging that both the BJP and the CPI(M) believe in violence and that it was deep-rooted in their ideology, he said both think that by the means of violence and by threatening they can shape other people’s behaviour. "They both think, by acting violently they can threaten people. This is very deep confusion in their mind. Because they lack courage. They think violence can shape other people's behaviour. That is not the case. Because there are many people whose behaviour cannot be shaped by the violence and by threats," the Congress leader said. Referring to the SFI activists' attack on his office in Kalpetta last week, Gandhi said the CPI(M) also believed that by damaging his office he would change his behaviour. "My behaviour is shaped by my affection for the people of my country. It is shaped by the affection for the toiling masses of this country. It can never be shaped by my opponents or my enemies," Gandhi added. Read all the Latest News, Breaking News, watch Top Videos and Live TV here. (This story has not been edited by News18 staff and is published from a syndicated news agency feed - PTI)
english
Japan’s prime minister outlined Friday a record 56 trillion yen, or $490 billion stimulus package, including cash handouts and aid to ailing businesses, to help the economy out of the doldrums worsened by the coronavirus pandemic. “The package has more than enough content and scale to deliver a sense of security and hope to the people,” Prime Minister Fumio Kishida told reporters. The proposal from Kishida was set for Cabinet approval later in the day but still needs parliamentary approval. Kishida has promised speedy action, and parliament is expected to convene next month. The plan includes doling out 100,000 yen ($880) each in monetary assistance to those 18 years or younger, and aid for ailing businesses, Kishida and other politicians said. Japan has never had a full lockdown during the pandemic and infections remained relatively low, with deaths related to COVID-19 at about 18,000 people. But the world’s third-largest economy was already stagnating before the pandemic hit. Under the government’s “state of emergency,” some restaurants closed or limited their hours, and events and theaters restricted crowd size for social distancing. A shortage of computer chips and other auto parts produced in other Asian nations that had severe outbreaks and strict lockdowns has hurt production at Japan’s automakers, including Toyota Motor Corp. , an economic mainstay. The government has been studying restarting the “GoTo Travel” campaign of discounts at restaurants and stores, designed to encourage domestic travel. The campaign, which began last year, got discontinued when COVID cases started to surge. Some critics have said the government approach amounts to “baramaki,” or “spreading out handouts,” which could prove ineffective in generating growth in the long run. Others say the proposed cash aid leaves out families without children and other poor. The scale of the latest package will require Japan to sink deeper into debt by issuing bonds. 
Yoshimasa Maruyama, the chief market economist at SMBC Nikko Securities, said the government needs to focus on getting spending going again, and the GoTo campaign could prove effective. Japan’s economy contracted at an annual rate of 3% in the July-September period, largely because of weak consumer spending. Analysts say the economy is unlikely to rebound until next year. Japan has also promised to earmark spending for vaccine research after facing criticism over being dependent on imports for coronavirus vaccines. It has so far approved vaccines from Pfizer, Moderna, and AstraZeneca. Kishida, who has promised “a new capitalism” for Japan, took office in October. His predecessor Yoshihide Suga stepped down after just a year in office, largely because of widespread public discontent about his inept response to the pandemic. READ ALSO:
english
{ "name": "dimProduct", "properties": { "folder": { "name": "Delta/Gold" }, "content": { "query": "\n\n\nCREATE EXTERNAL DATA SOURCE [DeltaLakeStorage] \n\tWITH (\n\t\tLOCATION = 'abfss://lgnsynapsefs@lgnsynapselake.dfs.core.windows.net/delta/' \n\t)\nGO\n\nCREATE EXTERNAL TABLE dimProduct (\n\t[ProductID] int,\n\t[Name] nvarchar(4000),\n\t[ProductNumber] nvarchar(4000),\n\t[Color] nvarchar(4000),\n\t[StandardCost] numeric(19,4),\n\t[ListPrice] numeric(19,4),\n\t[DiscontinuedDate] datetime2(7),\n\t[ThumbNailPhoto] varbinary(8000),\n\t[ThumbnailPhotoFileName] nvarchar(4000),\n\t[rowguid] nvarchar(4000),\n\t[ModifiedDate] datetime2(7),\n\t[Size] nvarchar(4000),\n\t[Weight] numeric(8,2),\n\t[ProductCategoryID] int,\n\t[ProductModelID] int,\n\t[SellStartDate] datetime2(7),\n\t[SellEndDate] datetime2(7),\n\t[SourceFileName] nvarchar(4000),\n\t[DateInserted] datetime2(7),\n\t[ParentProductCategoryID] int,\n\t[ProductCategory] nvarchar(4000),\n\t[ProductModel] nvarchar(4000),\n\t[CatalogDescription] nvarchar(4000),\n\t[ProductKey] bigint\n\t)\n\tWITH (\n\tLOCATION = 'gold/dimProduct/',\n\tDATA_SOURCE = [DeltaLakeStorage] ,\n\tFILE_FORMAT = [DeltaLakeFormat]\n\t)\nGO\n\n\nSELECT TOP 100 * FROM dbo.dimProduct\nGO\n\nDROP EXTERNAL DATA SOURCE [DeltaLakeStorage] \nDROP EXTERNAL TABLE dimProduct", "metadata": { "language": "sql" }, "currentConnection": { "databaseName": "gold", "poolName": "Built-in" }, "resultLimit": 5000 }, "type": "SqlQuery" } }
json
html { box-sizing: border-box; } body { margin: 0; background-color: #45badd; /* yellow ish fabb34 - blue */ } .container { height: 100vh; width: 100vw; display: flex; flex-direction: column; justify-content: center; align-items: center; background: url("robot.gif"); background-size: contain; background-position: left center; background-repeat: no-repeat; } button#btn { background-color: rgb(240, 61, 136); border: none; outline: none; /*no blue outline after clicking*/ border-radius: 7px; box-shadow: 0 5px 0 rgb(151, 4, 53); text-shadow: 0 2px 5px black; color: #fff; padding: 1em 1.5em; position: relative; font-weight: bold; font-size: 20px; font-family: "Courier New", Courier, monospace, sans-serif; margin-bottom: 10px; } button#btn:hover { background-color: #ce1c75; cursor: pointer; } button#btn:active { box-shadow: none; top: 5px; } button#btn:disabled { cursor: default; filter: brightness(30%); } /* Media Query */ @media screen and (max-width: 1000px){ .container { background-size: cover; background-position: center center; } }
css
Viktor Hovland has been at the center of golf talk since winning the recently concluded Memorial Tournament 2023. Just recently, LIV golfer Phil Mickelson also lauded the Norwegian and dubbed him one of the favorites to win the upcoming 2023 US Open. Hovland managed top-10 finishes in both major tournaments this season. He even finished T3 in the 2023 Players Championship. Mickelson, who himself is a six-time major champion, lauded Hovland and his game. Mickelson also dubbed Hovland a favorite to win the third major of the season. "When Viktor Hovland turned pro he was a solid player. He has since worked hard on his game, increased his club head speed a lot, improved his putting and chipping immensely and is now one of the best in the game. After this win and his play in PGA, he’s a/the favorite to win US open," Mickelson's tweet read. How was the performance of Viktor Hovland in the Memorial Tournament 2023? Viktor Hovland had an incredible outing at the Memorial Tournament 2023 at the Muirfield Village Golf Club. The tournament's host course, designed by the legendary Jack Nicklaus, is often considered one of the toughest, in terms of conditions, on the PGA Tour. Hovland, who was not in the top five on the leaderboard, jumped to the top after he carded a score of 2-under 70. He tied with Denny McCarthy for first place and advanced to play a one-hole playoff. The Norwegian eventually defeated the latter to register his fourth win on the PGA Tour. Hovland was awarded a handsome amount of $3.6 million and the Memorial Tournament trophy by the 18-time major champion Jack Nicklaus. He also gained 64.71 points to jump from the seventh rank to the fifth rank in the OWGR. What did Viktor Hovland and Jack Nicklaus talk about after the Memorial Tournament 2023? After the conclusion of the Memorial Tournament 2023 at Muirfield Village Golf Club, Viktor Hovland was recorded having a chat with the great Jack Nicklaus. 
The duo had a brief fun chat before Nicklaus handed him the trophy. The video was shared by the PGA Tour's official Twitter handle. In the video, Nicklaus asked Hovland if he went to Oklahoma State too. The latter replied in the affirmative before adding: "The real Oklahoma State. I'm just kidding. Or the real OSU. I screwed it up. I didn't have the balls to do it in front of you, Jack." Later on, Jack Nicklaus asked Viktor Hovland to go collect the big check and enjoy. The lovely banter was loved by fans and everyone who watched the video. Where is Phil Mickelson scheduled to play next? The 52-year-old golfer has mostly played in the Saudi-backed LIV league since he joined it. However, due to his previous wins at major tournaments, Phil Mickelson also gets the privilege to play in those tournaments. Mickelson will now gear up for the 2023 US Open, which is scheduled to take place on June 15 at the Los Angeles Country Club. Despite winning all the other three majors, the American golfer is yet to win a US Open title. However, he did finish as runner-up six times in his career.
english
<filename>root/pli/ms/sutta/sn/sn9/sn9.2_root-pli-ms.json { "sn9.2:0.1": "Saṁyutta Nikāya 9 ", "sn9.2:0.2": "1. Vanavagga ", "sn9.2:0.3": "2. Upaṭṭhānasutta ", "sn9.2:1.1": "Ekaṁ samayaṁ aññataro bhikkhu kosalesu viharati aññatarasmiṁ vanasaṇḍe. ", "sn9.2:1.2": "Tena kho pana samayena so bhikkhu divāvihāragato supati. ", "sn9.2:1.3": "Atha kho yā tasmiṁ vanasaṇḍe adhivatthā devatā tassa bhikkhuno anukampikā atthakāmā taṁ bhikkhuṁ saṁvejetukāmā yena so bhikkhu tenupasaṅkami; upasaṅkamitvā taṁ bhikkhuṁ gāthāhi ajjhabhāsi: ", "sn9.2:2.1": "“Uṭṭhehi bhikkhu kiṁ sesi, ", "sn9.2:2.2": "ko attho supitena te; ", "sn9.2:2.3": "Āturassa hi kā niddā, ", "sn9.2:2.4": "sallaviddhassa ruppato. ", "sn9.2:3.1": "Yāya saddhāya pabbajito, ", "sn9.2:3.2": "agārasmānagāriyaṁ; ", "sn9.2:3.3": "Tameva saddhaṁ brūhehi, ", "sn9.2:3.4": "mā niddāya vasaṁ gamī”ti. ", "sn9.2:4.1": "“Aniccā addhuvā kāmā, ", "sn9.2:4.2": "yesu mandova mucchito; ", "sn9.2:4.3": "Baddhesu muttaṁ asitaṁ, ", "sn9.2:4.4": "kasmā pabbajitaṁ tape. ", "sn9.2:5.1": "Chandarāgassa vinayā, ", "sn9.2:5.2": "avijjāsamatikkamā; ", "sn9.2:5.3": "Taṁ ñāṇaṁ paramodānaṁ, ", "sn9.2:5.4": "kasmā pabbajitaṁ tape. ", "sn9.2:6.1": "Chetvā avijjaṁ vijjāya, ", "sn9.2:6.2": "āsavānaṁ parikkhayā; ", "sn9.2:6.3": "Asokaṁ anupāyāsaṁ, ", "sn9.2:6.4": "kasmā pabbajitaṁ tape. ", "sn9.2:7.1": "Āraddhavīriyaṁ pahitattaṁ, ", "sn9.2:7.2": "Niccaṁ daḷhaparakkamaṁ; ", "sn9.2:7.3": "Nibbānaṁ abhikaṅkhantaṁ, ", "sn9.2:7.4": "Kasmā pabbajitaṁ tape”ti. " }
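The segment keys in the JSON above ("sn9.2:2.3" and the like) follow a `sutta:paragraph.line` addressing scheme. As a small illustration, here is a Java sketch that splits one key into its parts; the decomposition is inferred from the keys themselves, not from any specification shipped with the file.

```java
// Sketch: split a segment ID such as "sn9.2:2.3" into its sutta UID
// ("sn9.2") and its paragraph/line address (2, 3). The scheme is
// inferred from the keys in the JSON above.
public class SegmentId {
    public final String suttaUid;
    public final int paragraph;
    public final int line;

    public SegmentId(String key) {
        String[] halves = key.split(":");       // "sn9.2" | "2.3"
        this.suttaUid = halves[0];
        String[] addr = halves[1].split("\\."); // "2" | "3"
        this.paragraph = Integer.parseInt(addr[0]);
        this.line = Integer.parseInt(addr[1]);
    }

    public static void main(String[] args) {
        SegmentId id = new SegmentId("sn9.2:2.3");
        System.out.println(id.suttaUid + " " + id.paragraph + " " + id.line); // prints "sn9.2 2 3"
    }
}
```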
json
package com.firstbrave.minhash.util;

import java.math.BigInteger;

import com.firstbrave.minhash.MinHash;

/**
 * Utility methods for MinHash signatures.
 *
 * @author dave
 * created on 2018-03-09
 */
public class MinHashUtil {

	/**
	 * Converts a segment of a binary hash array to an int.
	 *
	 * @param hash  hash value as an array of {0,1,0,1,...}
	 * @param begin start index (inclusive) of the segment to convert
	 * @param end   end index (exclusive) of the segment to convert
	 * @return the segment interpreted as an integer
	 */
	public static int Hash2Int(int[] hash, int begin, int end) {
		int intCode = 0;
		for (int i = begin; i < end; i++) {
			intCode = (intCode << 1);
			intCode += hash[i];
		}
		return intCode;
	}

	/**
	 * Computes the fingerprint of the given content.
	 *
	 * @param content the text to fingerprint
	 * @return the signature, or null if the content is null or empty
	 */
	public static BigInteger getSignature(String content) {
		BigInteger result = null;
		if (content != null && content.length() > 0) {
			MinHash minHash = new MinHash();
			result = minHash.getSignature(content);
		}
		return result;
	}

	/**
	 * Converts a signature to a binary string, left-padding with zeros
	 * when the string is shorter than the required length.
	 *
	 * @param signature the signature to convert
	 * @param length    the required bit length
	 * @return the zero-padded binary string
	 */
	public static String Signature2BinaryStr(BigInteger signature, int length) {
		String tmp = signature.toString(2);
		StringBuilder sb = new StringBuilder();
		if (tmp.length() < length) {
			for (int i = 0; i < length - tmp.length(); i++) {
				sb.append("0");
			}
		}
		sb.append(tmp);
		return sb.toString();
	}

	/**
	 * Converts a decimal signature string to a binary array.
	 * E.g. 208560148841534216148409504476892512671 ->
	 * 10011100111001110011100100111100111011100011101011100101111010110000001100011001110001110000100010001001111011000100000110011111
	 *
	 * @param signature the signature as a decimal string
	 * @param length    the required bit length
	 * @return the binary array
	 */
	public static int[] Signature2Binary(String signature, int length) {
		return Signature2Binary(new BigInteger(signature), length);
	}

	/**
	 * Converts a signature to a binary array.
	 * E.g. 208560148841534216148409504476892512671 ->
	 * 10011100111001110011100100111100111011100011101011100101111010110000001100011001110001110000100010001001111011000100000110011111
	 *
	 * @param signature the signature to convert
	 * @param length    the required bit length
	 * @return the binary array
	 */
	public static int[] Signature2Binary(BigInteger signature, int length) {
		String str = Signature2BinaryStr(signature, length);
		int[] result = new int[str.length()];
		for (int i = 0; i < str.length(); i++) {
			result[i] = Integer.valueOf(str.charAt(i) + "");
		}
		return result;
	}

	/**
	 * Converts a binary signature string back to a BigInteger.
	 * E.g. 10011100111001110011100100111100111011100011101011100101111010110000001100011001110001110000100010001001111011000100000110011111 ->
	 * 208560148841534216148409504476892512671
	 *
	 * @param strSignature the binary string
	 * @return the signature as a BigInteger
	 */
	public static BigInteger SignatureBinary2BigInteger(String strSignature) {
		String[] str = strSignature.split("");
		BigInteger signature = BigInteger.ZERO;
		for (int i = 0; i < str.length; i++) {
			if (str[i].compareTo("0") > 0) { // i.e. the character is "1"
				signature = signature.add(BigInteger.ONE.shiftLeft(str.length - i - 1));
			}
		}
		return signature;
	}

	/**
	 * Computes the Hamming distance between two signatures.
	 *
	 * @param fromSignature the first signature
	 * @param toSignature   the signature to compare against
	 * @return the number of differing bits
	 */
	public static int getHammingDistance(BigInteger fromSignature, BigInteger toSignature) {
		BigInteger x = fromSignature.xor(toSignature);
		int tot = 0;
		// Count the 1-bits in x. Subtracting 1 from a binary number flips
		// every bit from the lowest 1-bit downward, so x & (x - 1) clears
		// exactly one 1-bit; count how many times this can be done before
		// x reaches zero.
		while (x.signum() != 0) {
			tot += 1;
			x = x.and(x.subtract(BigInteger.ONE));
		}
		return tot;
	}

	/**
	 * Hash distance: position-by-position comparison of two binary strings.
	 *
	 * @param fromHash the first hash string
	 * @param toHash   the hash string to compare against
	 * @return the number of differing positions, or -1 if the lengths differ
	 */
	public static int getHashDistance(String fromHash, String toHash) {
		int distance;
		if (fromHash.length() != toHash.length()) {
			distance = -1;
		} else {
			distance = 0;
			for (int i = 0; i < fromHash.length(); i++) {
				if (fromHash.charAt(i) != toHash.charAt(i)) {
					distance++;
				}
			}
		}
		return distance;
	}
}
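The bit-counting loop in `getHammingDistance` relies on the classic `x & (x - 1)` trick: subtracting one flips every bit from the lowest set bit downward, so the AND clears exactly one set bit per pass. A self-contained sketch of the same idea, independent of the `MinHash` class (which is not shown here):

```java
import java.math.BigInteger;

public class HammingDemo {
    // Count set bits by repeatedly clearing the lowest set bit:
    // x & (x - 1) drops exactly one 1-bit per iteration.
    public static int popCount(BigInteger x) {
        int total = 0;
        while (x.signum() != 0) {
            total++;
            x = x.and(x.subtract(BigInteger.ONE));
        }
        return total;
    }

    // Hamming distance = number of differing bit positions = popcount of XOR.
    public static int hamming(BigInteger a, BigInteger b) {
        return popCount(a.xor(b));
    }

    public static void main(String[] args) {
        BigInteger a = new BigInteger("10110", 2); // 22
        BigInteger b = new BigInteger("00011", 2); // 3
        // 10110 XOR 00011 = 10101, which has three 1-bits.
        System.out.println(hamming(a, b)); // prints 3
    }
}
```

In practice, `BigInteger.bitCount()` returns the same count directly for non-negative values, so the explicit loop is mainly of didactic interest.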
java
# \WebhookApi

All URIs are relative to *https://cloud.memsource.com/web*

Method | HTTP request | Description
------------- | ------------- | -------------
[**CreateWebHook**](WebhookApi.md#CreateWebHook) | **Post** /api2/v1/webhooks | Create webhook
[**CreateWebHook1**](WebhookApi.md#CreateWebHook1) | **Post** /api2/v2/webhooks | Create webhook
[**DeleteWebHook**](WebhookApi.md#DeleteWebHook) | **Delete** /api2/v1/webhooks/{webHookId} | Delete webhook
[**DeleteWebHook1**](WebhookApi.md#DeleteWebHook1) | **Delete** /api2/v2/webhooks/{webHookId} | Delete webhook
[**GetWebHook**](WebhookApi.md#GetWebHook) | **Get** /api2/v1/webhooks/{webHookId} | Get webhook
[**GetWebHook1**](WebhookApi.md#GetWebHook1) | **Get** /api2/v2/webhooks/{webHookId} | Get webhook
[**GetWebHookList**](WebhookApi.md#GetWebHookList) | **Get** /api2/v1/webhooks | Lists webhooks
[**GetWebHookList1**](WebhookApi.md#GetWebHookList1) | **Get** /api2/v2/webhooks | Lists webhooks
[**UpdateWebHook**](WebhookApi.md#UpdateWebHook) | **Put** /api2/v1/webhooks/{webHookId} | Edit webhook
[**UpdateWebHook1**](WebhookApi.md#UpdateWebHook1) | **Put** /api2/v2/webhooks/{webHookId} | Edit webhook

# **CreateWebHook**
> WebHookDto CreateWebHook(ctx, optional)

Create webhook

### Required Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **ctx** | **context.Context** | context for authentication, logging, cancellation, deadlines, tracing, etc.
 **optional** | ***WebhookApiCreateWebHookOpts** | optional parameters | nil if no parameters

### Optional Parameters

Optional parameters are passed through a pointer to a WebhookApiCreateWebHookOpts struct

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **body** | [**optional.Interface of WebHookDto**](WebHookDto.md)|  |

### Return type

[**WebHookDto**](WebHookDto.md)

### Authorization

No authorization required

### HTTP request headers

- **Content-Type**: application/json
- **Accept**: application/json

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)

# **CreateWebHook1**
> WebHookDtoV2 CreateWebHook1(ctx, optional)

Create webhook

### Required Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **ctx** | **context.Context** | context for authentication, logging, cancellation, deadlines, tracing, etc.
 **optional** | ***WebhookApiCreateWebHook1Opts** | optional parameters | nil if no parameters

### Optional Parameters

Optional parameters are passed through a pointer to a WebhookApiCreateWebHook1Opts struct

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **body** | [**optional.Interface of WebHookDtoV2**](WebHookDtoV2.md)|  |

### Return type

[**WebHookDtoV2**](WebHookDtoV2.md)

### Authorization

No authorization required

### HTTP request headers

- **Content-Type**: application/json
- **Accept**: application/json

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)

# **DeleteWebHook**
> DeleteWebHook(ctx, webHookId)

Delete webhook

### Required Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **ctx** | **context.Context** | context for authentication, logging, cancellation, deadlines, tracing, etc.
 **webHookId** | **int64**|  |

### Return type

 (empty response body)

### Authorization

No authorization required

### HTTP request headers

- **Content-Type**: Not defined
- **Accept**: Not defined

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)

# **DeleteWebHook1**
> DeleteWebHook1(ctx, webHookId)

Delete webhook

### Required Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **ctx** | **context.Context** | context for authentication, logging, cancellation, deadlines, tracing, etc.
 **webHookId** | **int64**|  |

### Return type

 (empty response body)

### Authorization

No authorization required

### HTTP request headers

- **Content-Type**: Not defined
- **Accept**: Not defined

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)

# **GetWebHook**
> WebHookDto GetWebHook(ctx, webHookId)

Get webhook

### Required Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **ctx** | **context.Context** | context for authentication, logging, cancellation, deadlines, tracing, etc.
 **webHookId** | **int64**|  |

### Return type

[**WebHookDto**](WebHookDto.md)

### Authorization

No authorization required

### HTTP request headers

- **Content-Type**: Not defined
- **Accept**: application/json

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)

# **GetWebHook1**
> WebHookDtoV2 GetWebHook1(ctx, webHookId)

Get webhook

### Required Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **ctx** | **context.Context** | context for authentication, logging, cancellation, deadlines, tracing, etc.
 **webHookId** | **int64**|  |

### Return type

[**WebHookDtoV2**](WebHookDtoV2.md)

### Authorization

No authorization required

### HTTP request headers

- **Content-Type**: Not defined
- **Accept**: application/json

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)

# **GetWebHookList**
> PageDtoWebHookDto GetWebHookList(ctx, optional)

Lists webhooks

### Required Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **ctx** | **context.Context** | context for authentication, logging, cancellation, deadlines, tracing, etc.
 **optional** | ***WebhookApiGetWebHookListOpts** | optional parameters | nil if no parameters

### Optional Parameters

Optional parameters are passed through a pointer to a WebhookApiGetWebHookListOpts struct

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **pageNumber** | **optional.Int32**| Page number, starting with 0, default 0 | [default to 0]
 **pageSize** | **optional.Int32**| Page size, accepts values between 1 and 50, default 50 | [default to 50]

### Return type

[**PageDtoWebHookDto**](PageDtoWebHookDto.md)

### Authorization

No authorization required

### HTTP request headers

- **Content-Type**: Not defined
- **Accept**: application/json

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)

# **GetWebHookList1**
> PageDtoWebHookDtoV2 GetWebHookList1(ctx, optional)

Lists webhooks

### Required Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **ctx** | **context.Context** | context for authentication, logging, cancellation, deadlines, tracing, etc.
 **optional** | ***WebhookApiGetWebHookList1Opts** | optional parameters | nil if no parameters

### Optional Parameters

Optional parameters are passed through a pointer to a WebhookApiGetWebHookList1Opts struct

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **pageNumber** | **optional.Int32**| Page number, starting with 0, default 0 | [default to 0]
 **pageSize** | **optional.Int32**| Page size, accepts values between 1 and 50, default 50 | [default to 50]

### Return type

[**PageDtoWebHookDtoV2**](PageDtoWebHookDtoV2.md)

### Authorization

No authorization required

### HTTP request headers

- **Content-Type**: Not defined
- **Accept**: application/json

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)

# **UpdateWebHook**
> WebHookDto UpdateWebHook(ctx, webHookId, optional)

Edit webhook

### Required Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **ctx** | **context.Context** | context for authentication, logging, cancellation, deadlines, tracing, etc.
 **webHookId** | **int64**|  |
 **optional** | ***WebhookApiUpdateWebHookOpts** | optional parameters | nil if no parameters

### Optional Parameters

Optional parameters are passed through a pointer to a WebhookApiUpdateWebHookOpts struct

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **body** | [**optional.Interface of WebHookDto**](WebHookDto.md)|  |

### Return type

[**WebHookDto**](WebHookDto.md)

### Authorization

No authorization required

### HTTP request headers

- **Content-Type**: application/json
- **Accept**: application/json

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)

# **UpdateWebHook1**
> WebHookDtoV2 UpdateWebHook1(ctx, webHookId, optional)

Edit webhook

### Required Parameters

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **ctx** | **context.Context** | context for authentication, logging, cancellation, deadlines, tracing, etc.
 **webHookId** | **int64**|  |
 **optional** | ***WebhookApiUpdateWebHook1Opts** | optional parameters | nil if no parameters

### Optional Parameters

Optional parameters are passed through a pointer to a WebhookApiUpdateWebHook1Opts struct

Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
 **body** | [**optional.Interface of WebHookDtoV2**](WebHookDtoV2.md)|  |

### Return type

[**WebHookDtoV2**](WebHookDtoV2.md)

### Authorization

No authorization required

### HTTP request headers

- **Content-Type**: application/json
- **Accept**: application/json

[[Back to top]](#) [[Back to API list]](../README.md#documentation-for-api-endpoints) [[Back to Model list]](../README.md#documentation-for-models) [[Back to README]](../README.md)
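The generated Go client above wraps plain HTTP calls; stripped of the client machinery, `CreateWebHook` reduces to a POST of a JSON `WebHookDto` body to `/api2/v1/webhooks` with the listed headers. The sketch below only builds the request and is written in Java's `java.net.http` (the Go client sources are not part of this document); the empty `{}` body is a placeholder, so consult WebHookDto.md for the real schema.

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class CreateWebHookSketch {
    // Build (but do not send) the HTTP request that CreateWebHook issues:
    // POST {base}/api2/v1/webhooks with a JSON body and JSON Accept header,
    // matching the "HTTP request headers" listed in the docs above.
    public static HttpRequest build(String base, String jsonBody) {
        return HttpRequest.newBuilder()
                .uri(URI.create(base + "/api2/v1/webhooks"))
                .header("Content-Type", "application/json")
                .header("Accept", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(jsonBody))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = build("https://cloud.memsource.com/web", "{}");
        System.out.println(req.method() + " " + req.uri());
    }
}
```

Sending the request (for example with `HttpClient.newHttpClient().send(...)`) would then return the created `WebHookDto` as JSON, per the Return type section.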
markdown
The Karnataka State Road Transport Corporation (KSRTC) and the Andhra Pradesh State Road Transport Corporation (APSRTC) have restored bus services between the two States from Saturday. The two road transport corporations had stopped their services to and from Tirupati, Vijayawada, Kurnool, Hyderabad, Mantralaya and other destinations from August 31, following protests against the formation of Telangana. A KSRTC press release said advance e-ticket booking has also been restored. Passengers can book their tickets at 173 computerised booking counters in Bangalore city, 18 in Mysore, 53 in Mangalore, and at 230 counters across the State. E-tickets can also be booked through the website www.ksrtc.in, the release added.
english
Shortage of Coal

[Shri P. Ramachandran]

the stocks. And in this verification, in one or two, some over-reporting was found. It is only on that basis and on that report that we are taking action. The report is being examined and whoever is found to be responsible will be taken to task and action will be taken against him. About the individuals, there is no ban on any individual taking the coal, provided he has got a certificate of the sponsoring authority.

MARCH 8,

SHRI EDUARDO FALEIRO: No certificate is required if there is no shortage.

SHRI P. RAMACHANDRAN: Just because there is no shortage, do you want anybody to buy? No, that is not the Government policy. The Government's policy is this. It is a commodity which has to be distributed properly. You cannot expect anybody to go, take the coal and then sell it in the black market.

SHRI EDUARDO FALEIRO: Why should he sell it in the black market? There is no scarcity.

SHRI P. RAMACHANDRAN: Artificial scarcity will be created. It is the policy of the Government, which has been enunciated long back, that coal, whether it is in plenty or in scarcity, must be distributed properly. That is why a certain authority is created and the Coal Controller has his say to regulate the distribution of the coal. Wagons have to be allotted and coal despatched. The Railways have got the authority to fix the priority for that. You cannot expect the Railways to move lower-priority coal as higher-priority coal. The first priority is coal for the steel plants, railways and power stations, then the cement and other industries. Only then come the brick kilns and other things. It is only the Railways which can stipulate the priority. We have no control over that. After all, the coal companies have got only stocks at the pit-head and they are responsible for giving the coal to those people who come with the authority. That is all. And even for the price at the pit-head, a statutory price is fixed.

SHRI ANNASAHEB P.
SHINDE (Ahmednagar): In some cases it has been found that the Railway Board had communicated in writing saying that so many wagons had been allotted; but even then the coal was not moved.

SHRI P. RAMACHANDRAN: It might have been so in the month of October 1978 or so. Now it is not so. I would like instances, if any, to be brought to my notice. If there is anything to be looked into, I will immediately attend to it. Coal is available now at all the pit-heads, and any amount of coal can be transported if there is a proper certificate from the sponsoring authority. That is the position today.

And regarding the powers of the holding company, certain steps of decentralisation were initiated. That does not mean that the holding company has lost the powers of control over the subsidiary companies. It still possesses all the powers to monitor the production, the marketing aspect of it, and also the sanctioning of projects. The powers are still vested with the Coal Authority of India and there is no question of devaluing that authority, except for certain powers delegated to the subsidiary companies for efficient operations. Certain powers were delegated to the subsidiary companies. That is all. In fact, Coal India Ltd. still enjoys all the powers necessary for effective monitoring of the subsidiary companies.

The third point he was referring to was the allegedly irregular appointment of a Managing Director. He was appointed long ago. He was not selected for this post after this Government came to power. It is very common that within the Department people are transferred for various reasons, and he was there in a particular company.

SHRI EDUARDO FALEIRO: For how long?

397 Shortage of Coal PHALGUNA 17, 1900 (SAKA) (H.A.H. Disc.)

SHRI P. RAMACHANDRAN: Whatever it may be, it is not the length of the period that is important. He was transferred from the company to the Department for some time and then a vacancy arose there. He was posted to another company.
When the Coal India Chairman was selected for one of the subsidiaries, naturally a vacancy arose there and in that vacancy he was posted. That is all.

SHRI EDUARDO FALEIRO: Is he qualified?

SHRI P. RAMACHANDRAN: It is a Managing Director's post. What is the necessary qualification that you are thinking of?

SHRI EDUARDO FALEIRO: What I have said is that for this type of post, it is necessary that it should be filled by a person selected by the Bureau of Public Enterprises.

SHRI P. RAMACHANDRAN: Yes, he was selected and posted even before this Government came to power. He was the Chairman of Central Coalfields Ltd. Then a general transfer took place. At that time he was transferred to BCCL, and after some time he was posted to the Ministry. Recently we posted him in the vacancy caused by the transfer of the Chairman, C.C.F.L.

DR. RAMJI SINGH (Bhagalpur): Mr. Chairman, so much production is taking place, and yet it is a matter of great sorrow that whereas in 1979 we gave 50,000 tonnes of coal to Belgium, 25,000 tonnes to France, 1,25,000 tonnes to Holland and 2 lakh tonnes to Denmark, and also exported coal to Nepal, Burma, Bangladesh, Taiwan and other countries, today we are importing 1.5 million tonnes of coal. This is the situation even though our output has grown since nationalisation. The figures before us show that our production has been rising steadily: from 78.1 million tonnes in 1973-74 to 88.4 million tonnes in 1974-75, our production has kept growing. Then what is the reason that this whole situation has arisen? Ever since nationalisation, our coal output has been rising and the staff has also been growing.

Now, you have to move the present output of coal throughout the country. The total demand for coal during 1977-78 was about 91 million tonnes. But the total stocks were about 102 million tonnes. It is clear the production of coal has all along exceeded the demand. In this way our production keeps rising.

MR. CHAIRMAN: What is your question?
DR. RAMJI SINGH: My question is this: when stocks are rising and production is also rising, which nobody disputes, why is an attempt nevertheless being made to raise the price of coal? This is a very alarming thing. I have a report before me which says that the administration of the coal industry should be set right; a hundred crores of rupees can be saved by this. It seems to me that the coal administration is extremely inefficient. Thirteen rupees can be saved on every tonne. This shows how weak and inefficient the administration is, and that it does not improve the production and efficiency of coal. A Cabinet sub-committee was formed under the chairmanship of Babu Jagjivan Ram to consider raising the price of coal. It took some decision, and now the Cabinet will decide on it. The whole country is wondering why the price of coal keeps rising day by day. I consider this a sign of the Department's inefficiency.

What is the situation today? The hon. Minister comes from there; he must know that thousands of tonnes of coal are stolen every day through the collusion of contractors and managers. What are you doing to stop this? I would also like to know what you are doing, keeping in view the Baweja Committee's recommendations, to reduce the cost of coal production by thirteen rupees per tonne. Also, when will you stop the import of coal, and when will you be able to save the sixty crores of rupees spent on it? People are very afraid of the possibility of coal prices being raised. Will the Government remove this apprehension and announce that coal prices will not be raised and that the management of the Coal Department will be made efficient? If the dues owed to the coal companies by the Government and other undertakings are recovered, that would ease capital investment, and washeries and the like could be built with it. Will you also try to recover these? The contract system is corrupt, and the murders that take place in the Dhanbad area are the doing of the contractor-manager gangs. Is it not therefore necessary that the contract system be abolished? Will the Government abolish it?

PROF. P. G. MAVALANKAR (Gandhinagar): Mr.
Chairman, Sir, the shortage of coal has been a chronic problem and it has been creating a lot of terrible and intolerable hardships for industries, major and small-scale. The Minister has himself admitted, in his answer given on the 20th February, 1979 to the Unstarred Question, that it has caused a lot of hardship to a number of industries, particularly the textile industry.

AN HON. MEMBER: As also fertilizer industries.

PROF. P. G. MAVALANKAR: Yes. Now, Sir, I come from Ahmedabad. Yesterday, there were two questions. One was a Starred Question, directly on this problem, but unfortunately my friend Shri Kumari Ananthan was not present. I could not, therefore, ask any question so far as the Ahmedabad mills are concerned. Another question by me on the same topic came as an Unstarred Question and I could not ask any supplementary questions on it either. I want to bring this problem to the Government's notice again with all seriousness. I have referred to the two questions of yesterday, and we also have the answer to the Unstarred Question of the 20th February, which is the subject matter of the half-an-hour discussion today. All the answers are a classic example of not saying anything. In the answer on the 20th February, on which this half-an-hour discussion has been raised, it was said: "There was no shortage of coal with the coal companies in December, 1978 and January, 1979". In that case, how is it that hundreds and thousands of industrial units and many others in many parts of India have been experiencing a shortage of coal? I come from Gujarat. There, not only the textile mills but even the power stations suffered because of the shortage of coal. Will not the Government agree that this inadequate or erratic distribution of the supply of coal, although they have said that there is coal and they can supply it, has resulted in all kinds of hardship and corruption? It has resulted in all kinds of harassment and all kinds of loss of production.
My question is: what are Government doing with regard to that? Finally: every time the Minister of Energy and his colleague pass the buck to the Railway Minister. The Railway Minister in turn passes the buck to the other Minister. Only about two hours earlier in this very House today, the Railway Minister said that whether he was there or the Minister of Energy was there, they were all ultimately one, in the Government. Then how long will they go on passing the responsibility from one Ministry to another, and not coming to grips with the problem? In the end, the consumers are suffering. He says the Ahmedabad mills are not closed. That is not completely true. The mills were on the verge of closure. In some of them, shifts were closed, if not the entire mills. People went out of employment. There was loss of production. How long will the Minister of Energy take to come to some kind of energetic coordination with the other energetic Minister, viz. the Minister of Railways? Although you have got coal, you cannot distribute it properly and in time. You should see that coal is supplied to the mills, power stations, fertilizer units etc. All of them should get coal. Please give a detailed answer, so that no further discussion on this subject is warranted.

SHRI K. MALLANNA (Chitradurga): There are two reasons for the shortage of coal: one, the production of coal, and the other, the transport of coal in the country. So far as the production side is concerned, they attribute it to strikes by workers and to the go-slow tactics of the workers. Sometimes it may be due to a shortage of explosives. So far as transport is concerned, the Railway Minister is shifting the responsibility to the Energy Minister, i.e. coal is to be supplied to the point where the rail lifts it. There is absolutely no coordination between the two Ministries.
I would like to know this information: what is the total amount of coal required in the country; how much is the country producing; is the Government importing coal, and if so, how many tonnes; how far is it adequate to cover the demand; whether it is a fact that sometimes coal operations come to a halt due to a shortage of explosives, and if so, what action is taken by the Government to fill the gap; and whether there is any coordination between the two Ministries.

SHRI P. RAMACHANDRAN: First, I will take up my friend Dr. Ramji Singh's points. He made a very sweeping and fantastic statement about the inefficiency in the Department of Coal. I am very sorry; such a sweeping statement should not have come from that Member.

DR. RAMJI SINGH: But did I not substantiate it?

SHRI P. RAMACHANDRAN: You have not. It is easy to condemn when somebody is doing the job. Here is something which one has to understand. Coal is produced in adequate quantities to meet the demands of the consumers in the country. At times people say that coal is not available. He referred to the Baweja Committee's recommendations. I do not know from where he got the information that the Baweja Committee had said that Rs. 13 per tonne can be saved by efficient management. (Interruption) That Committee has made a number of recommendations, which are being examined by the Government, and they will be implemented when a final decision is taken. They have also recommended various other things. The hon. Member has not tried to find out the truth about the report, and he is simply, from his own imagination, trying to say all sorts of things about the Baweja Committee report.

About coal production, it has picked up. For three or four months production had gone down, because of various calamities.

SHRI EDUARDO FALEIRO rose.

SHRI P. RAMACHANDRAN: I know. Simply ridiculing a department is not good. You are ridiculing it. You can say anything.

SHRI EDUARDO FALEIRO: I am not. We are giving every detail.
We can give more. He is not taking it seriously. He was sitting and smiling while I was talking.

SHRI P. RAMACHANDRAN: What is wrong with it? Do you want me to cry?

SHRI EDUARDO FALEIRO: You should cry and resign, instead of sitting and laughing when there is such a hue and cry outside.

SHRI P. RAMACHANDRAN: It is very easy for him to say anything he wants, because this House has given him the privilege to say whatever he wants.

SHRI EDUARDO FALEIRO: Don't take it in a light manner.

SHRI P. RAMACHANDRAN: I know, Mr. Faleiro; I can also understand what you are talking about.

SHRI EDUARDO FALEIRO: Then it is very good.

SHRI P. RAMACHANDRAN: But this shortage is not there today in the country. They have got, as I told you, nearly 13 million tonnes of coal. Shortages are felt because of various constraints, movement difficulties and other things. On February 17, officials of the Railway Ministry and of the Energy Ministry and coal company representatives did meet, and they discussed and sorted out the problems. At the moment coal is moving properly to the various consumer ends.
english
package br.com.ibm.twgerenciadortarefas.modelos;

import java.util.List;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.OneToMany;
import javax.persistence.Table;
import javax.validation.constraints.NotNull;

import org.hibernate.validator.constraints.Length;

@Entity
@Table(name = "usr_usuarios")
public class Usuario {

	@Id
	@GeneratedValue(strategy = GenerationType.IDENTITY)
	@Column(name = "usr_id")
	private Long id;

	@Column(name = "usr_email", nullable = false, length = 100)
	@NotNull(message = "O email é obrigatório.")
	@Length(min = 5, max = 100, message = "O email deve conter entre 5 e 100 caracteres")
	private String email;

	@Column(name = "usr_senha", nullable = false, length = 100)
	@NotNull(message = "A senha é obrigatória.")
	private String senha;

	@OneToMany(mappedBy = "usuario", fetch = FetchType.LAZY)
	private List<Tarefa> tarefas;

	public Long getId() {
		return id;
	}

	public void setId(Long id) {
		this.id = id;
	}

	public String getEmail() {
		return email;
	}

	public void setEmail(String email) {
		this.email = email;
	}

	public String getSenha() {
		return senha;
	}

	public void setSenha(String senha) {
		this.senha = senha;
	}

	public List<Tarefa> getTarefas() {
		return tarefas;
	}

	public void setTarefas(List<Tarefa> tarefas) {
		this.tarefas = tarefas;
	}
}
java
The Government of India’s directive to microblogging platform Twitter that it remove the label ‘manipulated media’ from certain posts shared by functionaries of the Bharatiya Janata Party (BJP), including Union Ministers, has no legal leg to stand on. But it reveals that the Government of India is willing to go to any lengths to empower BJP functionaries to tarnish political opponents and misinform the public. The BJP functionaries circulated on Twitter what they called a ‘toolkit’ prepared by the Congress to disparage the government. The Congress has filed a police complaint alleging that the BJP functionaries forged a document that does not exist. It has also written to Twitter to permanently suspend the accounts of those who circulated the forged documents. There is indeed a document that the Congress prepared, for its internal use, on the opportunity costs of the Central Vista project. The one circulated by the BJP leaders included additional pages on COVID-19. The BJP has failed to provide the digital footprint, or the copies, of what it calls the COVID-19 toolkit. There is no evidence that the Congress had anything to do with this toolkit; moreover, though it was supposedly prepared in May, it proposes courses of action that had already happened in April, an analysis by fact-checking platform AltNews has revealed. Toolkits are meant to coordinate future actions on social media, not catalogue past events. When challenged on facts, a BJP propagandist revealed the identity of a woman involved in the Central Vista research, leading to her being bullied by cyber mobs. Twitter has not complied with the Centre’s directive, and at least six handles of BJP functionaries still have posts tagged ‘manipulated media’. In the absence of any legal provision to cite, the Government of India’s reasoning behind the directive is baffling.
It has argued that the labelling was a “prejudged, prejudiced and a deliberate attempt to colour the investigation by local law enforcement agency”. By this metric, a private company must allow content it has determined to be problematic until a state agency concurs. Twitter has a publicised policy that it may label tweets containing media that have been deceptively altered or fabricated. It may use its own mechanism or third-party services to make that determination. Twitter is a private entity whose relationship with users is governed by its terms of service. The IT Act, which empowers the government to regulate content, does not give it the power to order the removal of a label. Additionally, the government’s move raises serious concerns regarding arbitrary censorship and transparency. The Centre’s desperation to control any discussion of its failures, and to shift the focus onto the Opposition, is leading to situations that embarrass a democracy. Rather than intimidate a private company, the BJP and the Centre should discipline their functionaries into more civility and truthfulness in their engagement with critics.
english
#![no_std] pub trait Pipe: Sized { fn pipe<R, F: FnOnce(Self) -> R>(self, f: F) -> R { f(self) } } impl<T> Pipe for T {}
rust
{"acadYear":"2019/2020","description":"","title":"Safety Health and Environmental Project","department":"Chemical and Biomolecular Engineering","faculty":"Engineering","moduleCredit":"8","moduleCode":"SH5404","attributes":{"year":true},"semesterData":[{"semester":1,"timetable":[]},{"semester":2,"timetable":[]}]}
json
import boto3
import logging

# Set up simple logging for INFO
logger = logging.getLogger()
logger.setLevel(logging.INFO)

backup = boto3.client('backup')
page_size = 1000

def lambda_handler(event, context):
    # List Backup vaults
    backupvaults = backup.list_backup_vaults()
    logger.info(backupvaults)
    backupVaultNames = [backupvault['BackupVaultName'] for backupvault in backupvaults['BackupVaultList']]
    # For each vault, page through its recovery points and delete the expired ones
    for backupvault in backupVaultNames:
        logger.info("Checking recovery points in vault: " + backupvault)
        next_token = None
        while True:
            if next_token:
                logger.info("Next page")
                recoverypoints = backup.list_recovery_points_by_backup_vault(
                    BackupVaultName=backupvault, MaxResults=page_size, NextToken=next_token)
            else:
                logger.info("First page")
                recoverypoints = backup.list_recovery_points_by_backup_vault(
                    BackupVaultName=backupvault, MaxResults=page_size)
            for recoverypoint in recoverypoints['RecoveryPoints']:
                if recoverypoint['Status'] == 'EXPIRED':
                    logger.info("Deleting expired recovery point: " + recoverypoint['RecoveryPointArn'])
                    backup.delete_recovery_point(
                        BackupVaultName=backupvault,
                        RecoveryPointArn=recoverypoint['RecoveryPointArn'])
            logger.info("Completed batch")
            next_token = recoverypoints.get('NextToken')
            if not next_token:
                break
        logger.info("Completed vault")
python
<filename>dist/channels/exgamingllc.json<gh_stars>10-100 {"template":{"small":"https://static-cdn.jtvnw.net/emoticons/v1/{image_id}/1.0","medium":"https://static-cdn.jtvnw.net/emoticons/v1/{image_id}/2.0","large":"https://static-cdn.jtvnw.net/emoticons/v1/{image_id}/3.0"},"channels":{"exgamingllc":{"title":"exgamingllc","channel_id":97762209,"link":"http://twitch.tv/exgamingllc","desc":null,"plans":{"$4.99":"14680","$9.99":"52009","$24.99":"52010"},"id":"exgamingllc","first_seen":"2017-01-26 09:10:05","badge":"https://static-cdn.jtvnw.net/badges/v1/c9c541d6-75ed-48f2-836e-0456fe358f83/1","badge_starting":"https://static-cdn.jtvnw.net/badges/v1/c9c541d6-75ed-48f2-836e-0456fe358f83/3","badge_3m":null,"badge_6m":null,"badge_12m":null,"badge_24m":null,"badges":[{"image_url_1x":"https://static-cdn.jtvnw.net/badges/v1/c9c541d6-75ed-48f2-836e-0456fe358f83/1","image_url_2x":"https://static-cdn.jtvnw.net/badges/v1/c9c541d6-75ed-48f2-836e-0456fe358f83/2","image_url_4x":"https://static-cdn.jtvnw.net/badges/v1/c9c541d6-75ed-48f2-836e-0456fe358f83/3","description":"Subscriber","title":"Subscriber","click_action":"subscribe_to_channel","click_url":""}],"bits_badges":null,"cheermote1":null,"cheermote100":null,"cheermote1000":null,"cheermote5000":null,"cheermote10000":null,"set":14680,"emotes":[{"code":"excHype","image_id":143484,"set":14680}]}}}
json
use super::{ config::Config, data_interface::Protocol, encoding::{ json::{FileDescription, OtaJob}, FileContext, }, pal::Version, }; pub mod mock; pub fn test_job_doc() -> OtaJob { OtaJob { protocols: heapless::Vec::from_slice(&[Protocol::Mqtt]).unwrap(), streamname: heapless::String::from("test_stream"), files: heapless::Vec::from_slice(&[FileDescription { filepath: heapless::String::from(""), filesize: 123456, fileid: 0, certfile: heapless::String::from("cert"), update_data_url: None, auth_scheme: None, sha1_rsa: Some(heapless::String::from("")), file_type: Some(0), sha256_rsa: None, sha1_ecdsa: None, sha256_ecdsa: None, }]) .unwrap(), } } pub fn test_file_ctx(config: &Config) -> FileContext { let ota_job = test_job_doc(); FileContext::new_from("Job-name", &ota_job, None, 0, config, Version::default()).unwrap() } pub mod ota_tests { use crate::jobs::data_types::{DescribeJobExecutionResponse, JobExecution, JobStatus}; use crate::ota::data_interface::Protocol; use crate::ota::encoding::json::{FileDescription, OtaJob}; use crate::ota::error::OtaError; use crate::ota::state::{Error, Events, States}; use crate::ota::test::test_job_doc; use crate::ota::{ agent::OtaAgent, control_interface::ControlInterface, data_interface::{DataInterface, NoInterface}, pal::OtaPal, test::mock::{MockPal, MockTimer}, }; use crate::test::MockMqtt; use embedded_hal::timer; use mqttrust::encoding::v4::{decode_slice, utils::Pid, PacketType}; use mqttrust::{MqttError, Packet, QoS, SubscribeTopic}; use serde::Deserialize; use serde_json_core::from_slice; /// All known job document that the device knows how to process. 
#[derive(Debug, PartialEq, Deserialize)] pub enum JobDetails { #[serde(rename = "afr_ota")] Ota(OtaJob), #[serde(other)] Unknown, } fn new_agent( mqtt: &MockMqtt, ) -> OtaAgent<'_, MockMqtt, &MockMqtt, NoInterface, MockTimer, MockTimer, MockPal> { let request_timer = MockTimer::new(); let self_test_timer = MockTimer::new(); let pal = MockPal {}; OtaAgent::builder(mqtt, mqtt, request_timer, pal) .with_self_test_timeout(self_test_timer, 16000) .build() } fn run_to_state<'a, C, DP, DS, T, ST, PAL>( agent: &mut OtaAgent<'a, C, DP, DS, T, ST, PAL>, state: States, ) where C: ControlInterface, DP: DataInterface, DS: DataInterface, T: timer::CountDown + timer::Cancel, T::Time: From<u32>, ST: timer::CountDown + timer::Cancel, ST::Time: From<u32>, PAL: OtaPal, { if agent.state.state() == &state { return; } match state { States::Ready => { println!( "Running to 'States::Ready', events: {}", agent.state.context().events.len() ); agent.state.process_event(Events::Shutdown).unwrap(); } States::CreatingFile => { println!( "Running to 'States::CreatingFile', events: {}", agent.state.context().events.len() ); run_to_state(agent, States::WaitingForJob); let job_doc = test_job_doc(); agent.job_update("Test-job", &job_doc, None).unwrap(); agent.state.context_mut().events.dequeue(); } States::RequestingFileBlock => { println!( "Running to 'States::RequestingFileBlock', events: {}", agent.state.context().events.len() ); run_to_state(agent, States::CreatingFile); agent.state.process_event(Events::CreateFile).unwrap(); agent.state.context_mut().events.dequeue(); } States::RequestingJob => { println!( "Running to 'States::RequestingJob', events: {}", agent.state.context().events.len() ); run_to_state(agent, States::Ready); agent.state.process_event(Events::Start).unwrap(); agent.state.context_mut().events.dequeue(); } States::Suspended => { println!( "Running to 'States::Suspended', events: {}", agent.state.context().events.len() ); run_to_state(agent, States::Ready); 
agent.suspend().unwrap(); } States::WaitingForFileBlock => { println!( "Running to 'States::WaitingForFileBlock', events: {}", agent.state.context().events.len() ); run_to_state(agent, States::RequestingFileBlock); agent.state.process_event(Events::RequestFileBlock).unwrap(); agent.state.context_mut().events.dequeue(); } States::WaitingForJob => { println!( "Running to 'States::WaitingForJob', events: {}", agent.state.context().events.len() ); run_to_state(agent, States::RequestingJob); agent.check_for_update().unwrap(); } States::Restarting => {} } } pub fn set_pid(buf: &mut [u8], pid: Pid) -> Result<(), ()> { let mut offset = 0; let (header, _) = mqttrust::encoding::v4::decoder::read_header(buf, &mut offset) .map_err(|_| ())? .ok_or(())?; match (header.typ, header.qos) { (PacketType::Publish, QoS::AtLeastOnce | QoS::ExactlyOnce) => { if buf[offset..].len() < 2 { return Err(()); } let len = ((buf[offset] as usize) << 8) | buf[offset + 1] as usize; offset += 2; if len > buf[offset..].len() { return Err(()); } else { offset += len; } } (PacketType::Subscribe | PacketType::Unsubscribe | PacketType::Suback, _) => {} ( PacketType::Puback | PacketType::Pubrec | PacketType::Pubrel | PacketType::Pubcomp | PacketType::Unsuback, _, ) => {} _ => return Ok(()), } pid.to_buffer(buf, &mut offset).map_err(|_| ()) } #[test] fn ready_when_stopped() { let mqtt = MockMqtt::new(); let mut ota_agent = new_agent(&mqtt); assert!(matches!(ota_agent.state.state(), &States::Ready)); run_to_state(&mut ota_agent, States::Ready); assert!(matches!(ota_agent.state.state(), &States::Ready)); assert_eq!(ota_agent.state.context().events.len(), 0); assert_eq!(mqtt.tx.borrow_mut().len(), 0); } #[test] fn abort_when_stopped() { let mqtt = MockMqtt::new(); let mut ota_agent = new_agent(&mqtt); run_to_state(&mut ota_agent, States::Ready); assert_eq!(ota_agent.state.context().events.len(), 0); assert_eq!( ota_agent.abort().err(), Some(Error::GuardFailed(OtaError::NoActiveJob)) );
ota_agent.process_event().unwrap(); assert!(matches!(ota_agent.state.state(), &States::Ready)); assert_eq!(mqtt.tx.borrow_mut().len(), 0); } #[test] fn resume_when_stopped() { let mqtt = MockMqtt::new(); let mut ota_agent = new_agent(&mqtt); run_to_state(&mut ota_agent, States::Ready); assert_eq!(ota_agent.state.context().events.len(), 0); assert!(matches!( ota_agent.resume().err().unwrap(), Error::InvalidEvent )); ota_agent.process_event().unwrap(); assert!(matches!(ota_agent.state.state(), &States::Ready)); assert_eq!(mqtt.tx.borrow_mut().len(), 0); } #[test] fn resume_when_suspended() { let mqtt = MockMqtt::new(); let mut ota_agent = new_agent(&mqtt); run_to_state(&mut ota_agent, States::Suspended); assert_eq!(ota_agent.state.context().events.len(), 0); assert!(matches!( ota_agent.resume().unwrap(), &States::RequestingJob )); assert_eq!(mqtt.tx.borrow_mut().len(), 0); } #[test] fn check_for_update() { let mqtt = MockMqtt::new(); let mut ota_agent = new_agent(&mqtt); run_to_state(&mut ota_agent, States::RequestingJob); assert!(matches!(ota_agent.state.state(), &States::RequestingJob)); assert_eq!(ota_agent.state.context().events.len(), 0); assert!(matches!( ota_agent.check_for_update().unwrap(), &States::WaitingForJob )); let bytes = mqtt.tx.borrow_mut().pop_front().unwrap(); let packet = decode_slice(bytes.as_slice()).unwrap(); let topics = match packet { Some(Packet::Subscribe(ref s)) => s.topics().collect::<Vec<_>>(), _ => panic!(), }; assert_eq!( topics, vec![SubscribeTopic { topic_path: "$aws/things/test_client/jobs/notify-next", qos: QoS::AtLeastOnce }] ); let mut bytes = mqtt.tx.borrow_mut().pop_front().unwrap(); set_pid(bytes.as_mut_slice(), Pid::new()).expect("Failed to set valid PID"); let packet = decode_slice(bytes.as_slice()).unwrap(); let publish = match packet { Some(Packet::Publish(p)) => p, _ => panic!(), }; assert_eq!( publish, mqttrust::encoding::v4::publish::Publish { dup: false, qos: QoS::AtLeastOnce, retain: false, topic_name: 
"$aws/things/test_client/jobs/$next/get", payload: &[ 123, 34, 99, 108, 105, 101, 110, 116, 84, 111, 107, 101, 110, 34, 58, 34, 48, 58, 116, 101, 115, 116, 95, 99, 108, 105, 101, 110, 116, 34, 125 ], pid: Some(Pid::new()), } ); assert_eq!(mqtt.tx.borrow_mut().len(), 0); } #[test] #[ignore] fn request_job_retry_fail() { let mut mqtt = MockMqtt::new();
// Let MQTT publish fail so request job will also fail
mqtt.publish_fail(); let mut ota_agent = new_agent(&mqtt);
// Place the OTA Agent into the state for requesting a job
run_to_state(&mut ota_agent, States::RequestingJob); assert!(matches!(ota_agent.state.state(), &States::RequestingJob)); assert_eq!(ota_agent.state.context().events.len(), 0); assert_eq!( ota_agent.check_for_update().err(), Some(Error::GuardFailed(OtaError::Mqtt(MqttError::Full))) );
// Fail the maximum number of attempts to request a job document
for _ in 0..ota_agent.state.context().config.max_request_momentum { ota_agent.process_event().unwrap(); assert!(ota_agent.state.context().request_timer.is_started); ota_agent.timer_callback().ok(); assert!(matches!(ota_agent.state.state(), &States::RequestingJob)); }
// Attempt to request another job document after failing the maximum
// number of times, triggering a shutdown event.
ota_agent.process_event().unwrap(); assert!(matches!(ota_agent.state.state(), &States::Ready)); assert_eq!(mqtt.tx.borrow_mut().len(), 4); } #[test] fn init_file_transfer_mqtt() { let mqtt = MockMqtt::new(); let mut ota_agent = new_agent(&mqtt);
// Place the OTA Agent into the state for creating file
run_to_state(&mut ota_agent, States::CreatingFile); assert!(matches!(ota_agent.state.state(), &States::CreatingFile)); assert_eq!(ota_agent.state.context().events.len(), 0); ota_agent.process_event().unwrap(); assert!(matches!(ota_agent.state.state(), &States::CreatingFile)); ota_agent.process_event().unwrap(); ota_agent.state.process_event(Events::CreateFile).unwrap();
// Above will automatically enqueue `RequestFileBlock`
assert!(matches!( ota_agent.state.state(), &States::RequestingFileBlock ));
// Check the latest MQTT message
let bytes = mqtt.tx.borrow_mut().pop_back().unwrap(); let packet = decode_slice(bytes.as_slice()).unwrap(); let topics = match packet { Some(Packet::Subscribe(ref s)) => s.topics().collect::<Vec<_>>(), _ => panic!(), }; assert_eq!( topics, vec![SubscribeTopic { topic_path: "$aws/things/test_client/streams/test_stream/data/cbor", qos: QoS::AtLeastOnce }] );
// Should still contain:
// - subscription to `$aws/things/test_client/jobs/notify-next`
// - publish to `$aws/things/test_client/jobs/$next/get`
assert_eq!(mqtt.tx.borrow_mut().len(), 2); } #[test] fn request_file_block_mqtt() { let mqtt = MockMqtt::new(); let mut ota_agent = new_agent(&mqtt);
// Place the OTA Agent into the state for requesting file block
run_to_state(&mut ota_agent, States::RequestingFileBlock); assert!(matches!( ota_agent.state.state(), &States::RequestingFileBlock )); assert_eq!(ota_agent.state.context().events.len(), 0); ota_agent .state .process_event(Events::RequestFileBlock) .unwrap(); assert!(matches!( ota_agent.state.state(), &States::WaitingForFileBlock )); let bytes = mqtt.tx.borrow_mut().pop_back().unwrap(); let publish = match
decode_slice(bytes.as_slice()).unwrap() { Some(Packet::Publish(p)) => p, _ => panic!(), };
// Check the latest MQTT message
assert_eq!( publish, mqttrust::encoding::v4::publish::Publish { dup: false, qos: QoS::AtMostOnce, retain: false, topic_name: "$aws/things/test_client/streams/test_stream/get/cbor", payload: &[ 164, 97, 102, 0, 97, 108, 25, 1, 0, 97, 111, 0, 97, 98, 68, 255, 255, 255, 127 ], pid: None } );
// Should still contain:
// - subscription to `$aws/things/test_client/jobs/notify-next`
// - publish to `$aws/things/test_client/jobs/$next/get`
// - subscription to `$aws/things/test_client/streams/test_stream/data/cbor`
assert_eq!(mqtt.tx.borrow_mut().len(), 3); } #[test] fn deserialize_describe_job_execution_response_ota() { let payload = br#"{ "clientToken":"0:<PASSWORD>", "timestamp":1624445100, "execution":{ "jobId":"AFR_OTA-rustot_test_1", "status":"QUEUED", "queuedAt":1624440618, "lastUpdatedAt":1624440618, "versionNumber":1, "executionNumber":1, "jobDocument":{ "afr_ota":{ "protocols":["MQTT"], "streamname":"AFR_OTA-0ba01295-9417-4ba7-9a99-4b31fb03d252", "files":[{ "filepath":"IMG_test.jpg", "filesize":2674792, "fileid":0, "certfile":"nope", "fileType":0, "sig-sha256-ecdsa":"This is my signature! Better believe it!"
}] } } } }"#; let (response, _) = from_slice::<DescribeJobExecutionResponse<JobDetails>>(payload).unwrap(); assert_eq!( response, DescribeJobExecutionResponse { execution: Some(JobExecution { execution_number: Some(1), job_document: Some(JobDetails::Ota(OtaJob { protocols: heapless::Vec::from_slice(&[Protocol::Mqtt]).unwrap(), streamname: heapless::String::from( "AFR_OTA-0ba01295-9417-4ba7-9a99-4b31fb03d252" ), files: heapless::Vec::from_slice(&[FileDescription { filepath: heapless::String::from("IMG_test.jpg"), filesize: 2674792, fileid: 0, certfile: heapless::String::from("nope"), update_data_url: None, auth_scheme: None, sha1_rsa: None, sha256_rsa: None, sha1_ecdsa: None, sha256_ecdsa: Some(heapless::String::from( "This is my signature! Better believe it!" )), file_type: Some(0), }]) .unwrap(), })), job_id: heapless::String::from("AFR_OTA-rustot_test_1"), last_updated_at: 1624440618, queued_at: 1624440618, status_details: None, status: JobStatus::Queued, version_number: 1, approximate_seconds_before_timed_out: None, started_at: None, thing_name: None, }), timestamp: 1624445100, client_token: "<PASSWORD>", } ); } }
rust
During the February episode of Bam's House, the talk show hosted by GOT7's BamBam, the idol got tangled up in a controversy when he named NewJeans' Haerin as his ideal type. He also explained how he was a fan of the rookie K-pop idol. However, given that Haerin is still a minor, fans found his remarks quite problematic. As the situation gained heat, BamBam's agency, ABYSS, released an official statement addressing the issue and denying that the idol had any negative intentions in mentioning NewJeans' Haerin. The statement added that any further spread of maliciously intended rumors against GOT7's BamBam would lead to legal action from the company. While the situation settled after that, BamBam opened up about the issue and clarified his stance directly to fans during his birthday live broadcast on May 2, 2023. Towards the end of his birthday on May 2, the idol went live on Weverse to celebrate with fans, as is the norm in the K-pop industry. While the live stream for the most part had the idol thanking his fans for the birthday wishes and cutting his birthday cake, he also addressed the issue that had garnered him a lot of criticism. BamBam began by saying: "I try to ignore a lot of stuff, but I'm stressed a lot. You guys probably know what happened not too long ago, but they don't say what I mean. The reason I'm saying this is because that I'm scared all that these situations might happen again, so you know, I'll explain everything later." The idol then addressed the media manipulation that many K-pop artists and celebrities face in the entertainment industry: "But nowadays people just believe all the internet stuff, I appreciate all the people who're still with me and trust my heart and everything. I never try to be a bad person. I never have bad thoughts or anything. 
I'm trying to be the best I can be, but when I try my best and this is the result, I'm tired and stressed out and I wanna give up sometimes, not gonna lie. You know, just let me live, let me work, all I need is love. If you're not going to support, just back off and let me do my thing." The GOT7 member concluded his speech by opening up about the issue regarding Haerin and his reaction to it: "You know, this been bothering me every day since like three weeks ago. You all tell me not to care about the haters but people do believe the haters these days so how do I not care. I saw a lot of people saying, "after all that stuff I cannot look at BamBam as the same person anymore". You guys know what kind of a person I am, don't believe a ten second video." There has been an outpouring of support for BamBam in the wake of his response. Viewers of the live stream were heartbroken by his open and honest speech about how the controversy has altered others' perception of him and about the tough time haters give him. They were also worried about how much the issue must have been bothering him for him to bring it up during his birthday live.
english
<reponame>JesseWright/aws-sdk-rust
// Code generated by software.amazon.smithy.rust.codegen.smithy-rs. DO NOT EDIT.
pub fn parse_http_generic_error( response: &http::Response<bytes::Bytes>, ) -> Result<smithy_types::Error, smithy_json::deserialize::Error> { crate::json_errors::parse_generic_error(response.body(), response.headers()) } pub fn deser_structure_conflicting_operation_exceptionjson_err( input: &[u8], mut builder: crate::error::conflicting_operation_exception::Builder, ) -> Result<crate::error::conflicting_operation_exception::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "resourceId" => { builder = builder.set_resource_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "resourceArn" => { builder = builder.set_resource_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_structure_internal_failure_exceptionjson_err( input: &[u8], mut builder: crate::error::internal_failure_exception::Builder, ) -> Result<crate::error::internal_failure_exception::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_structure_invalid_request_exceptionjson_err( input: &[u8], mut builder: crate::error::invalid_request_exception::Builder, ) -> Result<crate::error::invalid_request_exception::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_structure_limit_exceeded_exceptionjson_err( input: &[u8], mut builder: crate::error::limit_exceeded_exception::Builder, ) -> Result<crate::error::limit_exceeded_exception::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_structure_resource_not_found_exceptionjson_err( input: &[u8], mut builder: crate::error::resource_not_found_exception::Builder, ) -> Result<crate::error::resource_not_found_exception::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_structure_throttling_exceptionjson_err( input: &[u8], mut builder: crate::error::throttling_exception::Builder, ) -> Result<crate::error::throttling_exception::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_batch_associate_project_assets( input: &[u8], mut builder: crate::output::batch_associate_project_assets_output::Builder, ) -> Result< crate::output::batch_associate_project_assets_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "errors" => { builder = builder.set_errors( crate::json_deser::deser_list_batch_associate_project_assets_errors( tokens, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_batch_disassociate_project_assets( input: &[u8], mut builder: crate::output::batch_disassociate_project_assets_output::Builder, ) -> Result< crate::output::batch_disassociate_project_assets_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "errors" => { builder = builder.set_errors( crate::json_deser::deser_list_batch_disassociate_project_assets_errors( tokens, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_structure_service_unavailable_exceptionjson_err( input: &[u8], mut builder: crate::error::service_unavailable_exception::Builder, ) -> Result<crate::error::service_unavailable_exception::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_batch_put_asset_property_value( input: &[u8], mut builder: crate::output::batch_put_asset_property_value_output::Builder, ) -> Result< crate::output::batch_put_asset_property_value_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "errorEntries" => { builder = builder.set_error_entries( crate::json_deser::deser_list_batch_put_asset_property_error_entries( tokens, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_create_access_policy( input: &[u8], mut builder: crate::output::create_access_policy_output::Builder, ) -> Result<crate::output::create_access_policy_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "accessPolicyArn" => { builder = builder.set_access_policy_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "accessPolicyId" => { builder = builder.set_access_policy_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_structure_resource_already_exists_exceptionjson_err( input: &[u8], mut builder: crate::error::resource_already_exists_exception::Builder, ) -> Result<crate::error::resource_already_exists_exception::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "resourceId" => { builder = builder.set_resource_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "resourceArn" => { builder = builder.set_resource_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_create_asset( input: &[u8], mut builder: crate::output::create_asset_output::Builder, ) -> Result<crate::output::create_asset_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetArn" => { builder = builder.set_asset_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetId" => { builder = builder.set_asset_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetStatus" => { builder = builder.set_asset_status( crate::json_deser::deser_structure_asset_status(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_create_asset_model( input: &[u8], mut builder: crate::output::create_asset_model_output::Builder, ) -> Result<crate::output::create_asset_model_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetModelArn" => { builder = builder.set_asset_model_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetModelId" => { builder = builder.set_asset_model_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetModelStatus" => { builder = builder.set_asset_model_status( crate::json_deser::deser_structure_asset_model_status(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_create_dashboard( input: &[u8], mut builder: crate::output::create_dashboard_output::Builder, ) -> Result<crate::output::create_dashboard_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "dashboardArn" => { builder = builder.set_dashboard_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "dashboardId" => { builder = builder.set_dashboard_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_create_gateway( input: &[u8], mut builder: crate::output::create_gateway_output::Builder, ) -> Result<crate::output::create_gateway_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "gatewayArn" => { builder = builder.set_gateway_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "gatewayId" => { builder = builder.set_gateway_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_create_portal( input: &[u8], mut builder: crate::output::create_portal_output::Builder, ) -> Result<crate::output::create_portal_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "portalArn" => { builder = builder.set_portal_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalId" => { builder = builder.set_portal_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalStartUrl" => { builder = builder.set_portal_start_url( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalStatus" => { builder = builder.set_portal_status( crate::json_deser::deser_structure_portal_status(tokens)?, ); } "ssoApplicationId" => { builder = builder.set_sso_application_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_create_project( input: &[u8], mut builder: crate::output::create_project_output::Builder, ) -> Result<crate::output::create_project_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "projectArn" => { builder = builder.set_project_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "projectId" => { builder = builder.set_project_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_delete_asset( input: &[u8], mut builder: crate::output::delete_asset_output::Builder, ) -> Result<crate::output::delete_asset_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetStatus" => { builder = builder.set_asset_status( crate::json_deser::deser_structure_asset_status(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_delete_asset_model( input: &[u8], mut builder: crate::output::delete_asset_model_output::Builder, ) -> Result<crate::output::delete_asset_model_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. 
            }) => break,
            Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => {
                match key.to_unescaped()?.as_ref() {
                    "assetModelStatus" => {
                        builder = builder.set_asset_model_status(
                            crate::json_deser::deser_structure_asset_model_status(tokens)?,
                        );
                    }
                    _ => smithy_json::deserialize::token::skip_value(tokens)?,
                }
            }
            _ => {
                return Err(smithy_json::deserialize::Error::custom(
                    "expected object key or end object",
                ))
            }
        }
    }
    if tokens.next().is_some() {
        return Err(smithy_json::deserialize::Error::custom(
            "found more JSON tokens after completing parsing",
        ));
    }
    Ok(builder)
}

pub fn deser_operation_delete_portal(
    input: &[u8],
    mut builder: crate::output::delete_portal_output::Builder,
) -> Result<crate::output::delete_portal_output::Builder, smithy_json::deserialize::Error> {
    let mut tokens_owned =
        smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input))
            .peekable();
    let tokens = &mut tokens_owned;
    smithy_json::deserialize::token::expect_start_object(tokens.next())?;
    loop {
        match tokens.next().transpose()? {
            Some(smithy_json::deserialize::Token::EndObject { .. }) => break,
            Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => {
                match key.to_unescaped()?.as_ref() {
                    "portalStatus" => {
                        builder = builder.set_portal_status(
                            crate::json_deser::deser_structure_portal_status(tokens)?,
                        );
                    }
                    _ => smithy_json::deserialize::token::skip_value(tokens)?,
                }
            }
            _ => {
                return Err(smithy_json::deserialize::Error::custom(
                    "expected object key or end object",
                ))
            }
        }
    }
    if tokens.next().is_some() {
        return Err(smithy_json::deserialize::Error::custom(
            "found more JSON tokens after completing parsing",
        ));
    }
    Ok(builder)
}

pub fn deser_operation_describe_access_policy(
    input: &[u8],
    mut builder: crate::output::describe_access_policy_output::Builder,
) -> Result<crate::output::describe_access_policy_output::Builder, smithy_json::deserialize::Error>
{
    let mut tokens_owned =
        smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input))
            .peekable();
    let tokens = &mut tokens_owned;
    smithy_json::deserialize::token::expect_start_object(tokens.next())?;
    loop {
        match tokens.next().transpose()? {
            Some(smithy_json::deserialize::Token::EndObject { .. }) => break,
            Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => {
                match key.to_unescaped()?.as_ref() {
                    "accessPolicyArn" => {
                        builder = builder.set_access_policy_arn(
                            smithy_json::deserialize::token::expect_string_or_null(tokens.next())?
                                .map(|s| s.to_unescaped().map(|u| u.into_owned()))
                                .transpose()?,
                        );
                    }
                    "accessPolicyCreationDate" => {
                        builder = builder.set_access_policy_creation_date(
                            smithy_json::deserialize::token::expect_timestamp_or_null(
                                tokens.next(),
                                smithy_types::instant::Format::EpochSeconds,
                            )?,
                        );
                    }
                    "accessPolicyId" => {
                        builder = builder.set_access_policy_id(
                            smithy_json::deserialize::token::expect_string_or_null(tokens.next())?
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "accessPolicyIdentity" => { builder = builder.set_access_policy_identity( crate::json_deser::deser_structure_identity(tokens)?, ); } "accessPolicyLastUpdateDate" => { builder = builder.set_access_policy_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "accessPolicyPermission" => { builder = builder.set_access_policy_permission( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| { s.to_unescaped() .map(|u| crate::model::Permission::from(u.as_ref())) }) .transpose()?, ); } "accessPolicyResource" => { builder = builder.set_access_policy_resource( crate::json_deser::deser_structure_resource(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_asset( input: &[u8], mut builder: crate::output::describe_asset_output::Builder, ) -> Result<crate::output::describe_asset_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetArn" => { builder = builder.set_asset_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetCompositeModels" => { builder = builder.set_asset_composite_models( crate::json_deser::deser_list_asset_composite_models(tokens)?, ); } "assetCreationDate" => { builder = builder.set_asset_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "assetHierarchies" => { builder = builder.set_asset_hierarchies( crate::json_deser::deser_list_asset_hierarchies(tokens)?, ); } "assetId" => { builder = builder.set_asset_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetLastUpdateDate" => { builder = builder.set_asset_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "assetModelId" => { builder = builder.set_asset_model_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetName" => { builder = builder.set_asset_name( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetProperties" => { builder = builder.set_asset_properties( crate::json_deser::deser_list_asset_properties(tokens)?, ); } "assetStatus" => { builder = builder.set_asset_status( crate::json_deser::deser_structure_asset_status(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_asset_model( input: &[u8], mut builder: crate::output::describe_asset_model_output::Builder, ) -> Result<crate::output::describe_asset_model_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetModelArn" => { builder = builder.set_asset_model_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetModelCompositeModels" => { builder = builder.set_asset_model_composite_models( crate::json_deser::deser_list_asset_model_composite_models(tokens)?, ); } "assetModelCreationDate" => { builder = builder.set_asset_model_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "assetModelDescription" => { builder = builder.set_asset_model_description( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetModelHierarchies" => { builder = builder.set_asset_model_hierarchies( crate::json_deser::deser_list_asset_model_hierarchies(tokens)?, ); } "assetModelId" => { builder = builder.set_asset_model_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetModelLastUpdateDate" => { builder = builder.set_asset_model_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "assetModelName" => { builder = builder.set_asset_model_name( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetModelProperties" => { builder = builder.set_asset_model_properties( crate::json_deser::deser_list_asset_model_properties(tokens)?, ); } "assetModelStatus" => { builder = builder.set_asset_model_status( crate::json_deser::deser_structure_asset_model_status(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_asset_property( input: &[u8], mut builder: crate::output::describe_asset_property_output::Builder, ) -> Result<crate::output::describe_asset_property_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetId" => { builder = builder.set_asset_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetModelId" => { builder = builder.set_asset_model_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetName" => { builder = builder.set_asset_name( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetProperty" => { builder = builder.set_asset_property( crate::json_deser::deser_structure_property(tokens)?, ); } "compositeModel" => { builder = builder.set_composite_model( crate::json_deser::deser_structure_composite_model_property(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_dashboard( input: &[u8], mut builder: crate::output::describe_dashboard_output::Builder, ) -> Result<crate::output::describe_dashboard_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "dashboardArn" => { builder = builder.set_dashboard_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "dashboardCreationDate" => { builder = builder.set_dashboard_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "dashboardDefinition" => { builder = builder.set_dashboard_definition( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "dashboardDescription" => { builder = builder.set_dashboard_description( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "dashboardId" => { builder = builder.set_dashboard_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "dashboardLastUpdateDate" => { builder = builder.set_dashboard_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "dashboardName" => { builder = builder.set_dashboard_name( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "projectId" => { builder = builder.set_project_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_default_encryption_configuration( input: &[u8], mut builder: crate::output::describe_default_encryption_configuration_output::Builder, ) -> Result< crate::output::describe_default_encryption_configuration_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? 
{ Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "configurationStatus" => { builder = builder.set_configuration_status( crate::json_deser::deser_structure_configuration_status(tokens)?, ); } "encryptionType" => { builder = builder.set_encryption_type( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| { s.to_unescaped() .map(|u| crate::model::EncryptionType::from(u.as_ref())) }) .transpose()?, ); } "kmsKeyArn" => { builder = builder.set_kms_key_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_gateway( input: &[u8], mut builder: crate::output::describe_gateway_output::Builder, ) -> Result<crate::output::describe_gateway_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "creationDate" => { builder = builder.set_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "gatewayArn" => { builder = builder.set_gateway_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "gatewayCapabilitySummaries" => { builder = builder.set_gateway_capability_summaries( crate::json_deser::deser_list_gateway_capability_summaries(tokens)?, ); } "gatewayId" => { builder = builder.set_gateway_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "gatewayName" => { builder = builder.set_gateway_name( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "gatewayPlatform" => { builder = builder.set_gateway_platform( crate::json_deser::deser_structure_gateway_platform(tokens)?, ); } "lastUpdateDate" => { builder = builder.set_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_gateway_capability_configuration( input: &[u8], mut builder: crate::output::describe_gateway_capability_configuration_output::Builder, ) -> Result< crate::output::describe_gateway_capability_configuration_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = 
smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "capabilityConfiguration" => { builder = builder.set_capability_configuration( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "capabilityNamespace" => { builder = builder.set_capability_namespace( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "capabilitySyncStatus" => { builder = builder.set_capability_sync_status( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| { s.to_unescaped().map(|u| { crate::model::CapabilitySyncStatus::from(u.as_ref()) }) }) .transpose()?, ); } "gatewayId" => { builder = builder.set_gateway_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_logging_options( input: &[u8], mut builder: crate::output::describe_logging_options_output::Builder, ) -> Result<crate::output::describe_logging_options_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "loggingOptions" => { builder = builder.set_logging_options( crate::json_deser::deser_structure_logging_options(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_portal( input: &[u8], mut builder: crate::output::describe_portal_output::Builder, ) -> Result<crate::output::describe_portal_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? 
{ Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "alarms" => { builder = builder.set_alarms(crate::json_deser::deser_structure_alarms(tokens)?); } "notificationSenderEmail" => { builder = builder.set_notification_sender_email( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalArn" => { builder = builder.set_portal_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalAuthMode" => { builder = builder.set_portal_auth_mode( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| { s.to_unescaped() .map(|u| crate::model::AuthMode::from(u.as_ref())) }) .transpose()?, ); } "portalClientId" => { builder = builder.set_portal_client_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalContactEmail" => { builder = builder.set_portal_contact_email( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalCreationDate" => { builder = builder.set_portal_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "portalDescription" => { builder = builder.set_portal_description( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalId" => { builder = builder.set_portal_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalLastUpdateDate" => { builder = builder.set_portal_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "portalLogoImageLocation" => { builder = builder.set_portal_logo_image_location( crate::json_deser::deser_structure_image_location(tokens)?, ); } "portalName" => { builder = builder.set_portal_name( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalStartUrl" => { builder = builder.set_portal_start_url( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalStatus" => { builder = builder.set_portal_status( crate::json_deser::deser_structure_portal_status(tokens)?, ); } "roleArn" => { builder = builder.set_role_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_project( input: &[u8], mut builder: crate::output::describe_project_output::Builder, ) -> Result<crate::output::describe_project_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. 
}) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "portalId" => { builder = builder.set_portal_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "projectArn" => { builder = builder.set_project_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "projectCreationDate" => { builder = builder.set_project_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "projectDescription" => { builder = builder.set_project_description( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "projectId" => { builder = builder.set_project_id( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "projectLastUpdateDate" => { builder = builder.set_project_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "projectName" => { builder = builder.set_project_name( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_describe_storage_configuration( input: &[u8], mut builder: crate::output::describe_storage_configuration_output::Builder, ) -> Result< crate::output::describe_storage_configuration_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "configurationStatus" => { builder = builder.set_configuration_status( crate::json_deser::deser_structure_configuration_status(tokens)?, ); } "lastUpdateDate" => { builder = builder.set_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "multiLayerStorage" => { builder = builder.set_multi_layer_storage( crate::json_deser::deser_structure_multi_layer_storage(tokens)?, ); } "storageType" => { builder = builder.set_storage_type( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| { s.to_unescaped() .map(|u| crate::model::StorageType::from(u.as_ref())) }) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_get_asset_property_aggregates( input: &[u8], mut builder: crate::output::get_asset_property_aggregates_output::Builder, ) -> Result< crate::output::get_asset_property_aggregates_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "aggregatedValues" => { builder = builder.set_aggregated_values( crate::json_deser::deser_list_aggregated_values(tokens)?, ); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_get_asset_property_value( input: &[u8], mut builder: crate::output::get_asset_property_value_output::Builder, ) -> Result<crate::output::get_asset_property_value_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "propertyValue" => { builder = builder.set_property_value( crate::json_deser::deser_structure_asset_property_value(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_get_asset_property_value_history( input: &[u8], mut builder: crate::output::get_asset_property_value_history_output::Builder, ) -> Result< crate::output::get_asset_property_value_history_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetPropertyValueHistory" => { builder = builder.set_asset_property_value_history( crate::json_deser::deser_list_asset_property_value_history(tokens)?, ); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_get_interpolated_asset_property_values( input: &[u8], mut builder: crate::output::get_interpolated_asset_property_values_output::Builder, ) -> Result< crate::output::get_interpolated_asset_property_values_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "interpolatedAssetPropertyValues" => { builder = builder.set_interpolated_asset_property_values( crate::json_deser::deser_list_interpolated_asset_property_values( tokens, )?, ); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_access_policies( input: &[u8], mut builder: crate::output::list_access_policies_output::Builder, ) -> Result<crate::output::list_access_policies_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "accessPolicySummaries" => { builder = builder.set_access_policy_summaries( crate::json_deser::deser_list_access_policy_summaries(tokens)?, ); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_asset_models( input: &[u8], mut builder: crate::output::list_asset_models_output::Builder, ) -> Result<crate::output::list_asset_models_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetModelSummaries" => { builder = builder.set_asset_model_summaries( crate::json_deser::deser_list_asset_model_summaries(tokens)?, ); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_asset_relationships( input: &[u8], mut builder: crate::output::list_asset_relationships_output::Builder, ) -> Result<crate::output::list_asset_relationships_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetRelationshipSummaries" => { builder = builder.set_asset_relationship_summaries( crate::json_deser::deser_list_asset_relationship_summaries(tokens)?, ); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_assets( input: &[u8], mut builder: crate::output::list_assets_output::Builder, ) -> Result<crate::output::list_assets_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetSummaries" => { builder = builder.set_asset_summaries( crate::json_deser::deser_list_asset_summaries(tokens)?, ); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_associated_assets( input: &[u8], mut builder: crate::output::list_associated_assets_output::Builder, ) -> Result<crate::output::list_associated_assets_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetSummaries" => { builder = builder.set_asset_summaries( crate::json_deser::deser_list_associated_assets_summaries(tokens)?, ); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_dashboards( input: &[u8], mut builder: crate::output::list_dashboards_output::Builder, ) -> Result<crate::output::list_dashboards_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "dashboardSummaries" => { builder = builder.set_dashboard_summaries( crate::json_deser::deser_list_dashboard_summaries(tokens)?, ); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_gateways( input: &[u8], mut builder: crate::output::list_gateways_output::Builder, ) -> Result<crate::output::list_gateways_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "gatewaySummaries" => { builder = builder.set_gateway_summaries( crate::json_deser::deser_list_gateway_summaries(tokens)?, ); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_portals( input: &[u8], mut builder: crate::output::list_portals_output::Builder, ) -> Result<crate::output::list_portals_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "portalSummaries" => { builder = builder.set_portal_summaries( crate::json_deser::deser_list_portal_summaries(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_project_assets( input: &[u8], mut builder: crate::output::list_project_assets_output::Builder, ) -> Result<crate::output::list_project_assets_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetIds" => { builder = builder .set_asset_ids(crate::json_deser::deser_list_asset_i_ds(tokens)?); } "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_projects( input: &[u8], mut builder: crate::output::list_projects_output::Builder, ) -> Result<crate::output::list_projects_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "nextToken" => { builder = builder.set_next_token( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "projectSummaries" => { builder = builder.set_project_summaries( crate::json_deser::deser_list_project_summaries(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_structure_unauthorized_exceptionjson_err( input: &[u8], mut builder: crate::error::unauthorized_exception::Builder, ) -> Result<crate::error::unauthorized_exception::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_list_tags_for_resource( input: &[u8], mut builder: crate::output::list_tags_for_resource_output::Builder, ) -> Result<crate::output::list_tags_for_resource_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "tags" => { builder = builder.set_tags(crate::json_deser::deser_map_tag_map(tokens)?); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_put_default_encryption_configuration( input: &[u8], mut builder: crate::output::put_default_encryption_configuration_output::Builder, ) -> Result< crate::output::put_default_encryption_configuration_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? 
{ Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "configurationStatus" => { builder = builder.set_configuration_status( crate::json_deser::deser_structure_configuration_status(tokens)?, ); } "encryptionType" => { builder = builder.set_encryption_type( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| { s.to_unescaped() .map(|u| crate::model::EncryptionType::from(u.as_ref())) }) .transpose()?, ); } "kmsKeyArn" => { builder = builder.set_kms_key_arn( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_put_storage_configuration( input: &[u8], mut builder: crate::output::put_storage_configuration_output::Builder, ) -> Result<crate::output::put_storage_configuration_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "configurationStatus" => { builder = builder.set_configuration_status( crate::json_deser::deser_structure_configuration_status(tokens)?, ); } "multiLayerStorage" => { builder = builder.set_multi_layer_storage( crate::json_deser::deser_structure_multi_layer_storage(tokens)?, ); } "storageType" => { builder = builder.set_storage_type( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| { s.to_unescaped() .map(|u| crate::model::StorageType::from(u.as_ref())) }) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_structure_too_many_tags_exceptionjson_err( input: &[u8], mut builder: crate::error::too_many_tags_exception::Builder, ) -> Result<crate::error::too_many_tags_exception::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "resourceName" => { builder = builder.set_resource_name( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_update_asset( input: &[u8], mut builder: crate::output::update_asset_output::Builder, ) -> Result<crate::output::update_asset_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetStatus" => { builder = builder.set_asset_status( crate::json_deser::deser_structure_asset_status(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_update_asset_model( input: &[u8], mut builder: crate::output::update_asset_model_output::Builder, ) -> Result<crate::output::update_asset_model_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. 
}) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetModelStatus" => { builder = builder.set_asset_model_status( crate::json_deser::deser_structure_asset_model_status(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_update_gateway_capability_configuration( input: &[u8], mut builder: crate::output::update_gateway_capability_configuration_output::Builder, ) -> Result< crate::output::update_gateway_capability_configuration_output::Builder, smithy_json::deserialize::Error, > { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "capabilityNamespace" => { builder = builder.set_capability_namespace( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "capabilitySyncStatus" => { builder = builder.set_capability_sync_status( smithy_json::deserialize::token::expect_string_or_null(tokens.next())? 
.map(|s| { s.to_unescaped().map(|u| { crate::model::CapabilitySyncStatus::from(u.as_ref()) }) }) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn deser_operation_update_portal( input: &[u8], mut builder: crate::output::update_portal_output::Builder, ) -> Result<crate::output::update_portal_output::Builder, smithy_json::deserialize::Error> { let mut tokens_owned = smithy_json::deserialize::json_token_iter(crate::json_deser::or_empty_doc(input)) .peekable(); let tokens = &mut tokens_owned; smithy_json::deserialize::token::expect_start_object(tokens.next())?; loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "portalStatus" => { builder = builder.set_portal_status( crate::json_deser::deser_structure_portal_status(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } if tokens.next().is_some() { return Err(smithy_json::deserialize::Error::custom( "found more JSON tokens after completing parsing", )); } Ok(builder) } pub fn or_empty_doc(data: &[u8]) -> &[u8] { if data.is_empty() { b"{}" } else { data } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_batch_associate_project_assets_errors<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AssetErrorDetails>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? 
{ Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_error_details(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_batch_disassociate_project_assets_errors<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AssetErrorDetails>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_error_details(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_batch_put_asset_property_error_entries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result< Option<std::vec::Vec<crate::model::BatchPutAssetPropertyErrorEntry>>, smithy_json::deserialize::Error, > where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. 
}) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_batch_put_asset_property_error_entry(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } pub fn deser_structure_asset_status<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetStatus>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetStatus::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "state" => { builder = builder.set_state( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped() .map(|u| crate::model::AssetState::from(u.as_ref())) }) .transpose()?, ); } "error" => { builder = builder.set_error( crate::json_deser::deser_structure_error_details(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_model_status<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetModelStatus>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetModelStatus::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "state" => { builder = builder.set_state( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped().map(|u| { crate::model::AssetModelState::from(u.as_ref()) }) }) .transpose()?, ); } "error" => { builder = builder.set_error( crate::json_deser::deser_structure_error_details(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_portal_status<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::PortalStatus>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::PortalStatus::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "state" => { builder = builder.set_state( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped() .map(|u| crate::model::PortalState::from(u.as_ref())) }) .transpose()?, ); } "error" => { builder = builder.set_error( crate::json_deser::deser_structure_monitor_error_details( tokens, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_identity<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Identity>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Identity::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "user" => { builder = builder.set_user( crate::json_deser::deser_structure_user_identity(tokens)?, ); } "group" => { builder = builder.set_group( crate::json_deser::deser_structure_group_identity(tokens)?, ); } "iamUser" => { builder = builder.set_iam_user( crate::json_deser::deser_structure_iam_user_identity(tokens)?, ); } "iamRole" => { builder = builder.set_iam_role( crate::json_deser::deser_structure_iam_role_identity(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_resource<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Resource>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Resource::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "portal" => { builder = builder.set_portal( crate::json_deser::deser_structure_portal_resource(tokens)?, ); } "project" => { builder = builder.set_project( crate::json_deser::deser_structure_project_resource(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_composite_models<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AssetCompositeModel>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_composite_model(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_hierarchies<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AssetHierarchy>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. 
}) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_hierarchy(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_properties<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AssetProperty>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_property(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_model_composite_models<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result< Option<std::vec::Vec<crate::model::AssetModelCompositeModel>>, smithy_json::deserialize::Error, > where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. 
})) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_model_composite_model(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_model_hierarchies<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AssetModelHierarchy>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_model_hierarchy(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_model_properties<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AssetModelProperty>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. 
})) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_model_property(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } pub fn deser_structure_property<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Property>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Property::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "alias" => { builder = builder.set_alias( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "notification" => { builder = builder.set_notification( crate::json_deser::deser_structure_property_notification( tokens, )?, ); } "dataType" => { builder = builder.set_data_type( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped().map(|u| { crate::model::PropertyDataType::from(u.as_ref()) }) }) .transpose()?, ); } "unit" => { builder = builder.set_unit( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "type" => { builder = builder.set_type( crate::json_deser::deser_structure_property_type(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_composite_model_property<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::CompositeModelProperty>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::CompositeModelProperty::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "type" => { builder = builder.set_type( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetProperty" => { builder = builder.set_asset_property( crate::json_deser::deser_structure_property(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_configuration_status<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::ConfigurationStatus>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::ConfigurationStatus::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "state" => { builder = builder.set_state( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped().map(|u| { crate::model::ConfigurationState::from(u.as_ref()) }) }) .transpose()?, ); } "error" => { builder = builder.set_error( crate::json_deser::deser_structure_configuration_error_details( tokens, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_gateway_capability_summaries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result< Option<std::vec::Vec<crate::model::GatewayCapabilitySummary>>, smithy_json::deserialize::Error, > where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_gateway_capability_summary(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } pub fn deser_structure_gateway_platform<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::GatewayPlatform>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. 
}) => { #[allow(unused_mut)] let mut builder = crate::model::GatewayPlatform::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "greengrass" => { builder = builder.set_greengrass( crate::json_deser::deser_structure_greengrass(tokens)?, ); } "greengrassV2" => { builder = builder.set_greengrass_v2( crate::json_deser::deser_structure_greengrass_v2(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_logging_options<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::LoggingOptions>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::LoggingOptions::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "level" => { builder = builder.set_level( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped() .map(|u| crate::model::LoggingLevel::from(u.as_ref())) }) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_alarms<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Alarms>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Alarms::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "alarmRoleArn" => { builder = builder.set_alarm_role_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "notificationLambdaArn" => { builder = builder.set_notification_lambda_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_image_location<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::ImageLocation>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::ImageLocation::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "url" => { builder = builder.set_url( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_multi_layer_storage<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::MultiLayerStorage>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::MultiLayerStorage::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "customerManagedS3Storage" => { builder = builder.set_customer_managed_s3_storage( crate::json_deser::deser_structure_customer_managed_s3_storage( tokens, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_aggregated_values<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AggregatedValue>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. 
}) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_aggregated_value(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } pub fn deser_structure_asset_property_value<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetPropertyValue>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetPropertyValue::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "value" => { builder = builder .set_value(crate::json_deser::deser_structure_variant(tokens)?); } "timestamp" => { builder = builder.set_timestamp( crate::json_deser::deser_structure_time_in_nanos(tokens)?, ); } "quality" => { builder = builder.set_quality( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped() .map(|u| crate::model::Quality::from(u.as_ref())) }) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_property_value_history<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AssetPropertyValue>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_property_value(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_interpolated_asset_property_values<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result< Option<std::vec::Vec<crate::model::InterpolatedAssetPropertyValue>>, smithy_json::deserialize::Error, > where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. 
})) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_interpolated_asset_property_value( tokens, )?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_access_policy_summaries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AccessPolicySummary>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_access_policy_summary(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_model_summaries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AssetModelSummary>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. 
})) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_model_summary(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_relationship_summaries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result< Option<std::vec::Vec<crate::model::AssetRelationshipSummary>>, smithy_json::deserialize::Error, > where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_relationship_summary(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_summaries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::AssetSummary>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. 
})) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_asset_summary(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_associated_assets_summaries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result< Option<std::vec::Vec<crate::model::AssociatedAssetsSummary>>, smithy_json::deserialize::Error, > where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_associated_assets_summary(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_dashboard_summaries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::DashboardSummary>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. 
})) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_dashboard_summary(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_gateway_summaries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::GatewaySummary>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_gateway_summary(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_portal_summaries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::PortalSummary>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. 
})) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_portal_summary(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_asset_i_ds<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<std::string::String>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_project_summaries<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::ProjectSummary>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. 
})) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_project_summary(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_map_tag_map<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result< Option<std::collections::HashMap<std::string::String, std::string::String>>, smithy_json::deserialize::Error, > where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { let mut map = std::collections::HashMap::new(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { let key = key.to_unescaped().map(|u| u.into_owned())?; let value = smithy_json::deserialize::token::expect_string_or_null(tokens.next())? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?; if let Some(value) = value { map.insert(key, value); } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(map)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_error_details<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetErrorDetails>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. 
}) => { #[allow(unused_mut)] let mut builder = crate::model::AssetErrorDetails::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "assetId" => { builder = builder.set_asset_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "code" => { builder = builder.set_code( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| { s.to_unescaped() .map(|u| crate::model::AssetErrorCode::from(u.as_ref())) }) .transpose()?, ); } "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_batch_put_asset_property_error_entry<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::BatchPutAssetPropertyErrorEntry>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::BatchPutAssetPropertyErrorEntry::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "entryId" => { builder = builder.set_entry_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "errors" => { builder = builder.set_errors( crate::json_deser::deser_list_batch_put_asset_property_errors( tokens, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_error_details<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::ErrorDetails>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::ErrorDetails::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "code" => { builder = builder.set_code( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| { s.to_unescaped() .map(|u| crate::model::ErrorCode::from(u.as_ref())) }) .transpose()?, ); } "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "details" => { builder = builder.set_details( crate::json_deser::deser_list_detailed_errors(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_monitor_error_details<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::MonitorErrorDetails>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::MonitorErrorDetails::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "code" => { builder = builder.set_code( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| { s.to_unescaped().map(|u| { crate::model::MonitorErrorCode::from(u.as_ref()) }) }) .transpose()?, ); } "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_user_identity<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::UserIdentity>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::UserIdentity::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_group_identity<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::GroupIdentity>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. 
}) => { #[allow(unused_mut)] let mut builder = crate::model::GroupIdentity::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_iam_user_identity<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::IamUserIdentity>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::IamUserIdentity::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "arn" => { builder = builder.set_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_iam_role_identity<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::IamRoleIdentity>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::IamRoleIdentity::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "arn" => { builder = builder.set_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_portal_resource<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::PortalResource>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. 
}) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::PortalResource::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_project_resource<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::ProjectResource>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::ProjectResource::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_composite_model<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetCompositeModel>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetCompositeModel::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "description" => { builder = builder.set_description( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "type" => { builder = builder.set_type( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "properties" => { builder = builder.set_properties( crate::json_deser::deser_list_asset_properties(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_hierarchy<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetHierarchy>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetHierarchy::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_property<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetProperty>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetProperty::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "alias" => { builder = builder.set_alias( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "notification" => { builder = builder.set_notification( crate::json_deser::deser_structure_property_notification( tokens, )?, ); } "dataType" => { builder = builder.set_data_type( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped().map(|u| { crate::model::PropertyDataType::from(u.as_ref()) }) }) .transpose()?, ); } "dataTypeSpec" => { builder = builder.set_data_type_spec( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "unit" => { builder = builder.set_unit( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_model_composite_model<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetModelCompositeModel>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetModelCompositeModel::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "description" => { builder = builder.set_description( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "type" => { builder = builder.set_type( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "properties" => { builder = builder.set_properties( crate::json_deser::deser_list_asset_model_properties(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_model_hierarchy<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetModelHierarchy>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetModelHierarchy::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "childAssetModelId" => { builder = builder.set_child_asset_model_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_model_property<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetModelProperty>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetModelProperty::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "dataType" => { builder = builder.set_data_type( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| { s.to_unescaped().map(|u| { crate::model::PropertyDataType::from(u.as_ref()) }) }) .transpose()?, ); } "dataTypeSpec" => { builder = builder.set_data_type_spec( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "unit" => { builder = builder.set_unit( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "type" => { builder = builder.set_type( crate::json_deser::deser_structure_property_type(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_property_notification<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::PropertyNotification>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::PropertyNotification::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "topic" => { builder = builder.set_topic( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "state" => { builder = builder.set_state( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped().map(|u| { crate::model::PropertyNotificationState::from( u.as_ref(), ) }) }) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_property_type<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::PropertyType>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::PropertyType::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "attribute" => { builder = builder.set_attribute( crate::json_deser::deser_structure_attribute(tokens)?, ); } "measurement" => { builder = builder.set_measurement( crate::json_deser::deser_structure_measurement(tokens)?, ); } "transform" => { builder = builder.set_transform( crate::json_deser::deser_structure_transform(tokens)?, ); } "metric" => { builder = builder .set_metric(crate::json_deser::deser_structure_metric(tokens)?); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_configuration_error_details<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::ConfigurationErrorDetails>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::ConfigurationErrorDetails::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "code" => { builder = builder.set_code( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| { s.to_unescaped() .map(|u| crate::model::ErrorCode::from(u.as_ref())) }) .transpose()?, ); } "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_gateway_capability_summary<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::GatewayCapabilitySummary>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::GatewayCapabilitySummary::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "capabilityNamespace" => { builder = builder.set_capability_namespace( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "capabilitySyncStatus" => { builder = builder.set_capability_sync_status( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped().map(|u| { crate::model::CapabilitySyncStatus::from(u.as_ref()) }) }) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_greengrass<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Greengrass>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Greengrass::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "groupArn" => { builder = builder.set_group_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_greengrass_v2<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::GreengrassV2>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. 
}) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::GreengrassV2::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "coreDeviceThingName" => { builder = builder.set_core_device_thing_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_customer_managed_s3_storage<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::CustomerManagedS3Storage>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::CustomerManagedS3Storage::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "s3ResourceArn" => { builder = builder.set_s3_resource_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "roleArn" => { builder = builder.set_role_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_aggregated_value<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AggregatedValue>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AggregatedValue::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "timestamp" => { builder = builder.set_timestamp( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "quality" => { builder = builder.set_quality( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped() .map(|u| crate::model::Quality::from(u.as_ref())) }) .transpose()?, ); } "value" => { builder = builder.set_value( crate::json_deser::deser_structure_aggregates(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_variant<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Variant>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Variant::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "stringValue" => { builder = builder.set_string_value( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "integerValue" => { builder = builder.set_integer_value( smithy_json::deserialize::token::expect_number_or_null( tokens.next(), )? .map(|v| v.to_i32()), ); } "doubleValue" => { builder = builder.set_double_value( smithy_json::deserialize::token::expect_number_or_null( tokens.next(), )? 
.map(|v| v.to_f64()), ); } "booleanValue" => { builder = builder.set_boolean_value( smithy_json::deserialize::token::expect_bool_or_null( tokens.next(), )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_time_in_nanos<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::TimeInNanos>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::TimeInNanos::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "timeInSeconds" => { builder = builder.set_time_in_seconds( smithy_json::deserialize::token::expect_number_or_null( tokens.next(), )? .map(|v| v.to_i64()), ); } "offsetInNanos" => { builder = builder.set_offset_in_nanos( smithy_json::deserialize::token::expect_number_or_null( tokens.next(), )? 
.map(|v| v.to_i32()), ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_interpolated_asset_property_value<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::InterpolatedAssetPropertyValue>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::InterpolatedAssetPropertyValue::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "timestamp" => { builder = builder.set_timestamp( crate::json_deser::deser_structure_time_in_nanos(tokens)?, ); } "value" => { builder = builder .set_value(crate::json_deser::deser_structure_variant(tokens)?); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_access_policy_summary<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AccessPolicySummary>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. 
}) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AccessPolicySummary::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "identity" => { builder = builder.set_identity( crate::json_deser::deser_structure_identity(tokens)?, ); } "resource" => { builder = builder.set_resource( crate::json_deser::deser_structure_resource(tokens)?, ); } "permission" => { builder = builder.set_permission( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| { s.to_unescaped() .map(|u| crate::model::Permission::from(u.as_ref())) }) .transpose()?, ); } "creationDate" => { builder = builder.set_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "lastUpdateDate" => { builder = builder.set_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_model_summary<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetModelSummary>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? 
{ Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetModelSummary::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "arn" => { builder = builder.set_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "description" => { builder = builder.set_description( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "creationDate" => { builder = builder.set_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "lastUpdateDate" => { builder = builder.set_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "status" => { builder = builder.set_status( crate::json_deser::deser_structure_asset_model_status(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_relationship_summary<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetRelationshipSummary>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetRelationshipSummary::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "hierarchyInfo" => { builder = builder.set_hierarchy_info( crate::json_deser::deser_structure_asset_hierarchy_info( tokens, )?, ); } "relationshipType" => { builder = builder.set_relationship_type( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped().map(|u| { crate::model::AssetRelationshipType::from(u.as_ref()) }) }) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_summary<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetSummary>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetSummary::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "arn" => { builder = builder.set_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetModelId" => { builder = builder.set_asset_model_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "creationDate" => { builder = builder.set_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "lastUpdateDate" => { builder = builder.set_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "status" => { builder = builder.set_status( crate::json_deser::deser_structure_asset_status(tokens)?, ); } "hierarchies" => { builder = builder.set_hierarchies( crate::json_deser::deser_list_asset_hierarchies(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_associated_assets_summary<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssociatedAssetsSummary>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssociatedAssetsSummary::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "arn" => { builder = builder.set_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "assetModelId" => { builder = builder.set_asset_model_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "creationDate" => { builder = builder.set_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "lastUpdateDate" => { builder = builder.set_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "status" => { builder = builder.set_status( crate::json_deser::deser_structure_asset_status(tokens)?, ); } "hierarchies" => { builder = builder.set_hierarchies( crate::json_deser::deser_list_asset_hierarchies(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_dashboard_summary<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::DashboardSummary>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. 
}) => { #[allow(unused_mut)] let mut builder = crate::model::DashboardSummary::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "description" => { builder = builder.set_description( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "creationDate" => { builder = builder.set_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "lastUpdateDate" => { builder = builder.set_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_gateway_summary<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::GatewaySummary>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. 
}) => { #[allow(unused_mut)] let mut builder = crate::model::GatewaySummary::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "gatewayId" => { builder = builder.set_gateway_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "gatewayName" => { builder = builder.set_gateway_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "gatewayPlatform" => { builder = builder.set_gateway_platform( crate::json_deser::deser_structure_gateway_platform(tokens)?, ); } "gatewayCapabilitySummaries" => { builder = builder.set_gateway_capability_summaries( crate::json_deser::deser_list_gateway_capability_summaries( tokens, )?, ); } "creationDate" => { builder = builder.set_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "lastUpdateDate" => { builder = builder.set_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_portal_summary<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::PortalSummary>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? 
{ Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::PortalSummary::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "description" => { builder = builder.set_description( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "startUrl" => { builder = builder.set_start_url( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "creationDate" => { builder = builder.set_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "lastUpdateDate" => { builder = builder.set_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "roleArn" => { builder = builder.set_role_arn( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "status" => { builder = builder.set_status( crate::json_deser::deser_structure_portal_status(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_project_summary<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::ProjectSummary>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::ProjectSummary::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "id" => { builder = builder.set_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "description" => { builder = builder.set_description( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "creationDate" => { builder = builder.set_creation_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } "lastUpdateDate" => { builder = builder.set_last_update_date( smithy_json::deserialize::token::expect_timestamp_or_null( tokens.next(), smithy_types::instant::Format::EpochSeconds, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_batch_put_asset_property_errors<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result< Option<std::vec::Vec<crate::model::BatchPutAssetPropertyError>>, smithy_json::deserialize::Error, > where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. 
})) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_batch_put_asset_property_error( tokens, )?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_detailed_errors<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::DetailedError>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_detailed_error(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } pub fn deser_structure_attribute<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Attribute>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Attribute::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "defaultValue" => { builder = builder.set_default_value( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_measurement<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Measurement>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Measurement::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "processingConfig" => { builder = builder.set_processing_config( crate::json_deser::deser_structure_measurement_processing_config(tokens)? ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_transform<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Transform>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? 
{ Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Transform::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "expression" => { builder = builder.set_expression( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "variables" => { builder = builder.set_variables( crate::json_deser::deser_list_expression_variables(tokens)?, ); } "processingConfig" => { builder = builder.set_processing_config( crate::json_deser::deser_structure_transform_processing_config( tokens, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_metric<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Metric>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Metric::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. 
}) => { match key.to_unescaped()?.as_ref() { "expression" => { builder = builder.set_expression( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "variables" => { builder = builder.set_variables( crate::json_deser::deser_list_expression_variables(tokens)?, ); } "window" => { builder = builder.set_window( crate::json_deser::deser_structure_metric_window(tokens)?, ); } "processingConfig" => { builder = builder.set_processing_config( crate::json_deser::deser_structure_metric_processing_config( tokens, )?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_aggregates<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::Aggregates>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::Aggregates::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "average" => { builder = builder.set_average( smithy_json::deserialize::token::expect_number_or_null( tokens.next(), )? .map(|v| v.to_f64()), ); } "count" => { builder = builder.set_count( smithy_json::deserialize::token::expect_number_or_null( tokens.next(), )? 
.map(|v| v.to_f64()), ); } "maximum" => { builder = builder.set_maximum( smithy_json::deserialize::token::expect_number_or_null( tokens.next(), )? .map(|v| v.to_f64()), ); } "minimum" => { builder = builder.set_minimum( smithy_json::deserialize::token::expect_number_or_null( tokens.next(), )? .map(|v| v.to_f64()), ); } "sum" => { builder = builder.set_sum( smithy_json::deserialize::token::expect_number_or_null( tokens.next(), )? .map(|v| v.to_f64()), ); } "standardDeviation" => { builder = builder.set_standard_deviation( smithy_json::deserialize::token::expect_number_or_null( tokens.next(), )? .map(|v| v.to_f64()), ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_asset_hierarchy_info<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::AssetHierarchyInfo>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::AssetHierarchyInfo::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "parentAssetId" => { builder = builder.set_parent_asset_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "childAssetId" => { builder = builder.set_child_asset_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_batch_put_asset_property_error<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::BatchPutAssetPropertyError>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::BatchPutAssetPropertyError::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "errorCode" => { builder = builder.set_error_code( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| { s.to_unescaped().map(|u| { crate::model::BatchPutAssetPropertyValueErrorCode::from( u.as_ref(), ) }) }) .transpose()?, ); } "errorMessage" => { builder = builder.set_error_message( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "timestamps" => { builder = builder.set_timestamps( crate::json_deser::deser_list_timestamps(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_detailed_error<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::DetailedError>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::DetailedError::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "code" => { builder = builder.set_code( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| { s.to_unescaped().map(|u| { crate::model::DetailedErrorCode::from(u.as_ref()) }) }) .transpose()?, ); } "message" => { builder = builder.set_message( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_measurement_processing_config<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::MeasurementProcessingConfig>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::MeasurementProcessingConfig::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "forwardingConfig" => { builder = builder.set_forwarding_config( crate::json_deser::deser_structure_forwarding_config(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_expression_variables<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::ExpressionVariable>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. 
}) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. })) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_expression_variable(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } pub fn deser_structure_transform_processing_config<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::TransformProcessingConfig>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::TransformProcessingConfig::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "computeLocation" => { builder = builder.set_compute_location( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| { s.to_unescaped().map(|u| { crate::model::ComputeLocation::from(u.as_ref()) }) }) .transpose()?, ); } "forwardingConfig" => { builder = builder.set_forwarding_config( crate::json_deser::deser_structure_forwarding_config(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_metric_window<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::MetricWindow>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::MetricWindow::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "tumbling" => { builder = builder.set_tumbling( crate::json_deser::deser_structure_tumbling_window(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_metric_processing_config<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::MetricProcessingConfig>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? 
{ Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::MetricProcessingConfig::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "computeLocation" => { builder = builder.set_compute_location( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| { s.to_unescaped().map(|u| { crate::model::ComputeLocation::from(u.as_ref()) }) }) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } #[allow(clippy::type_complexity, non_snake_case)] pub fn deser_list_timestamps<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<std::vec::Vec<crate::model::TimeInNanos>>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartArray { .. }) => { let mut items = Vec::new(); loop { match tokens.peek() { Some(Ok(smithy_json::deserialize::Token::EndArray { .. 
})) => { tokens.next().transpose().unwrap(); break; } _ => { let value = crate::json_deser::deser_structure_time_in_nanos(tokens)?; if let Some(value) = value { items.push(value); } } } } Ok(Some(items)) } _ => Err(smithy_json::deserialize::Error::custom( "expected start array or null", )), } } pub fn deser_structure_forwarding_config<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::ForwardingConfig>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::ForwardingConfig::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "state" => { builder = builder.set_state( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| { s.to_unescaped().map(|u| { crate::model::ForwardingConfigState::from(u.as_ref()) }) }) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_expression_variable<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::ExpressionVariable>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. 
}) => { #[allow(unused_mut)] let mut builder = crate::model::ExpressionVariable::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "name" => { builder = builder.set_name( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "value" => { builder = builder.set_value( crate::json_deser::deser_structure_variable_value(tokens)?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_tumbling_window<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::TumblingWindow>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::TumblingWindow::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "interval" => { builder = builder.set_interval( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "offset" => { builder = builder.set_offset( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? 
.map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } } pub fn deser_structure_variable_value<'a, I>( tokens: &mut std::iter::Peekable<I>, ) -> Result<Option<crate::model::VariableValue>, smithy_json::deserialize::Error> where I: Iterator< Item = Result<smithy_json::deserialize::Token<'a>, smithy_json::deserialize::Error>, >, { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::ValueNull { .. }) => Ok(None), Some(smithy_json::deserialize::Token::StartObject { .. }) => { #[allow(unused_mut)] let mut builder = crate::model::VariableValue::builder(); loop { match tokens.next().transpose()? { Some(smithy_json::deserialize::Token::EndObject { .. }) => break, Some(smithy_json::deserialize::Token::ObjectKey { key, .. }) => { match key.to_unescaped()?.as_ref() { "propertyId" => { builder = builder.set_property_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } "hierarchyId" => { builder = builder.set_hierarchy_id( smithy_json::deserialize::token::expect_string_or_null( tokens.next(), )? .map(|s| s.to_unescaped().map(|u| u.into_owned())) .transpose()?, ); } _ => smithy_json::deserialize::token::skip_value(tokens)?, } } _ => { return Err(smithy_json::deserialize::Error::custom( "expected object key or end object", )) } } } Ok(Some(builder.build())) } _ => Err(smithy_json::deserialize::Error::custom( "expected start object or null", )), } }
rust
Many computer users, especially those who work with the Internet, have come across the term AES data encryption. Only a fairly limited circle of people knows what kind of system it is, what algorithms it uses, and what it is used for, and the ordinary user, by and large, does not need to know. Nevertheless, we will look at this cryptographic system without going into complicated mathematical calculations and formulas, so that it is understandable to anyone.

What is AES encryption?

To begin with, the system itself is a set of algorithms that make it possible to hide the original form of data that is transmitted, received, or stored on a computer. It is most often used in Internet technologies where complete confidentiality of information is required, and it belongs to the so-called symmetric encryption algorithms. Symmetric means that AES converts information into its protected form and decodes it back using the same key, known to both the sending and the receiving side, in contrast to asymmetric encryption, which involves two keys, a private one and a public one. Thus it is easy to conclude that if both sides know the right key, the processes of encryption and decryption are quite simple.

AES encryption was first mentioned back in 2000, when the Rijndael algorithm won the competition to select a successor to DES, which had been the standard in the US since 1977. In 2001 the AES system was officially adopted as the new federal data-encryption standard and has been used universally ever since.

The evolution of the algorithms included several intermediate stages, mainly associated with increasing the key length. Today there are three main variants: AES-128, AES-192, and AES-256. The name speaks for itself: the numerical designation corresponds to the length of the key used, expressed in bits.
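The same-key property described above is what "symmetric" means in practice. A deliberately simplified sketch illustrates it (a toy XOR cipher with names of my own choosing; this is not AES and offers no real security):

```python
# Toy demonstration of SYMMETRIC encryption: one shared key both
# encrypts and decrypts. This XOR cipher is NOT AES and is not secure;
# it only illustrates the shared-key principle.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR every byte of the data with the key, repeating the key as needed
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"shared-secret"              # known to both sender and receiver
plaintext = b"confidential message"

ciphertext = xor_cipher(plaintext, key)   # sender encrypts
recovered = xor_cipher(ciphertext, key)   # receiver applies the SAME key

assert recovered == plaintext             # round trip restores the data
```

With a real AES implementation the shape of the exchange is the same in spirit: both ends must hold an identical secret key, and applying the cipher in decrypt mode with that key undoes the encryption.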
In addition, AES is a block cipher: it works directly with blocks of information of a fixed length, encrypting each block as a whole, in contrast to stream ciphers, which operate on the individual symbols of an open message and translate them into encrypted form one by one. In AES, the block length is 128 bits. In scientific terms, the algorithms behind AES-256 rely on polynomial representations of operations and codes applied to two-dimensional arrays (matrices).

How does it work?

The algorithm is rather complicated, but it rests on a few basic elements: a two-dimensional state matrix, conversion cycles (rounds), round keys, and tables of forward and inverse substitutions. The process of data encryption consists of several stages:

- calculation of all the round keys (key expansion);
- substitution of bytes using the main S-Box table;
- shifting the rows of the state by different offsets;
- mixing the data within each column of the state matrix;
- adding the state and the round key.

Decryption is done in the reverse order, but instead of the S-Box table the inverse substitution table mentioned above is applied.

To give an example: a 4-bit key allows only 2^4 = 16 combinations, from 0000 to 1111, so checking them all takes no time and such protection is broken quickly. But a 16-bit key already gives 65,536 combinations, and a 256-bit key about 1.1 x 10^77. As American specialists have stated, finding the right combination (key) by brute force would take about 149 trillion years.

What should you use when setting up a network in practice: AES or TKIP encryption?

Now we turn to the use of AES-256 for encrypting data transmitted and received in wireless networks. As a rule, any router offers several options to choose from: AES only, TKIP only, or AES + TKIP.
These options are applied depending on the security protocol in use (WPA or WPA2). But TKIP is an obsolete system: it provides weaker protection and does not support 802.11n connections with data rates exceeding 54 Mbps. Thus the conclusion suggests itself that AES together with the WPA2-PSK security mode should be preferred, although the two algorithms can also be used as a pair.

Despite the loud statements of specialists, AES algorithms are theoretically still vulnerable, since the very nature of the encryption admits a simple algebraic description. This was noted by Niels Ferguson. And in 2002, Josef Pieprzyk and Nicolas Courtois published an article substantiating a potentially possible XSL attack. True, it caused a lot of controversy in the scientific world, and some considered their calculations erroneous. In 2005 it was suggested that an attack could exploit side channels, not just mathematical calculations; one such attack computed the key after 800 operations, and another obtained it after 2^32 operations (in the eighth round).

Without a doubt, this system could be considered one of the most advanced to date, if not for one "but". A few years ago a wave of virus attacks swept the Internet, in which an encryptor virus (and at the same time an extortionist), after penetrating computers, fully encrypted the data and demanded a tidy sum for deciphering it. The message claimed that the encryption had been performed with the algorithm AES1024, which until recently was thought not to exist at all. Whether true or not, even the best-known developers of anti-virus software, including Kaspersky Lab, were powerless when they tried to decrypt the data. Many experts admitted that the notorious I Love You virus, which at one time hit millions of computers around the world and destroyed important information on them, was childish babble compared with this threat.
Moreover, I Love You was aimed mainly at multimedia files, whereas the new virus went only for the confidential information of large corporations. However, no one can state with full certainty that AES-1024 encryption was really used here. Summing up, in any case we can say that AES encryption is by far the most advanced and best protected, regardless of the key length used. It is not surprising that this standard is used in most cryptosystems and has quite broad prospects for development and improvement in the foreseeable future, especially since it is very likely that several types of encryption will be combined into one (for example, the parallel use of symmetric and asymmetric, or of block and stream, encryption).
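As a sanity check on the brute-force figures quoted in this article: a k-bit key admits exactly 2^k distinct values, so every extra bit doubles the attacker's worst-case work. A short pure-Python sketch (the function name is my own):

```python
# The brute-force cost of key search: a k-bit key has 2**k possible
# values, so each extra bit doubles the work an attacker must do.

def keyspace(bits: int) -> int:
    """Number of distinct keys of the given length in bits."""
    return 2 ** bits

print(keyspace(4))               # 16 combinations: 0000 through 1111
print(keyspace(16))              # 65536 combinations
print(f"{keyspace(256):.2e}")    # on the order of 1.16e+77 for AES-256
```

This is why short keys fall instantly while 128-bit and 256-bit keys remain far beyond any feasible exhaustive search.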
english
import Vue from 'vue'; import CounterCard from '../components/CounterCard'; export default Vue.extend({ // @ts-ignore layout: 'alt', name: 'Alt', render() { return ( <div> <div>Alt Page</div> <CounterCard /> </div> ); } });
typescript
Producing the best season of his career, Dominic Thiem made his Masters 1000 breakthrough on the hardcourts of Indian Wells in 2019 after beating five-time champion Roger Federer in a three-set final. Long regarded as a clay-court specialist owing to his prowess on the slowest surface in the game, Thiem displayed his all-court credentials in their finest light in 2019 by winning the biggest title of his career on a surface other than clay. By denying Federer a record sixth title at the tournament, Thiem became the 67th different player to lift a Masters 1000 title. On that note, let us look at the Austrian's best run at a Masters 1000 tournament. You may also like: The last 5 players to win their first Masters 1000 title. Making his sixth appearance at the tournament, Thiem took out Jordan Thompson of Australia, French veteran Gilles Simon, and the big-serving Ivo Karlovic in straight sets, before a walkover win took the Austrian to his first quarter-final at Indian Wells since his 2017 defeat to eventual finalist Stanislas Wawrinka in a third-set tiebreak. You may also like: Meet the last 5 players to make their Masters 1000 breakthrough at Indian Wells. In his first Masters 1000 semi-final away from clay, Thiem took the first set in a tiebreak against Milos Raonic before the big Canadian restored parity by taking the second set in another tiebreak. The Austrian soon reasserted his ascendancy in the third to reach his third career Masters 1000 final, having made the title round in Madrid in 2017 and 2018. The going got tough in the title match as Thiem conceded the opening set to five-time champion Federer in just over half an hour. To his credit, though, the Austrian raised his level of play as Federer's hit a downward curve, and soon the 2019 Indian Wells final was at one set apiece.
In a competitive deciding set, Thiem came under enormous scoreboard pressure, standing two points from defeat as Federer arrived at 5-4, 30-30 on the Austrian's serve. Once again, Thiem exhibited his big-match temperament and produced clutch tennis to avert the danger and hold serve. Seizing his opportunity on the Federer serve in the next game, Thiem found extra zip on his groundstrokes to break the Swiss maestro and promptly served out the win to land the biggest title of his career. In the process, Thiem joined his illustrious compatriot Thomas Muster as the only Austrian players to lift a Masters 1000 title. It would be the first of a career-best five titles on the season for the Austrian, highlighted by consecutive runner-up finishes to Rafael Nadal in back-to-back Roland Garros finals and a loss to Stefanos Tsitsipas in a third-set tiebreak in the final of the season-ending ATP Finals in London. You may also like: Charting the ascent of the newest world no. 3 Dominic Thiem.
english
{"PREVALENCE_BY_GENDER_AGE_YEAR":{"TRELLIS_NAME":["30-39","30-39"],"SERIES_NAME":["MALE","MALE"],"X_CALENDAR_YEAR":[2014,2018],"Y_PREVALENCE_1000PP":[0.02104,0.02463]},"PREVALENCE_BY_MONTH":{"X_CALENDAR_MONTH":[],"Y_PREVALENCE_1000PP":[]},"LENGTH_OF_ERA":{"CATEGORY":"Length of era","MIN_VALUE":1,"P10_VALUE":1,"P25_VALUE":1,"MEDIAN_VALUE":1,"P75_VALUE":1,"P90_VALUE":8,"MAX_VALUE":30},"AGE_AT_FIRST_DIAGNOSIS":{"CATEGORY":"MALE","MIN_VALUE":23,"P10_VALUE":27,"P25_VALUE":30,"MEDIAN_VALUE":34,"P75_VALUE":39,"P90_VALUE":44,"MAX_VALUE":59}}
json
// // ======================================================================== // Copyright (c) 1995-2017 Mort Bay Consulting Pty. Ltd. // ------------------------------------------------------------------------ // All rights reserved. This program and the accompanying materials // are made available under the terms of the Eclipse Public License v1.0 // and Apache License v2.0 which accompanies this distribution. // // The Eclipse Public License is available at // http://www.eclipse.org/legal/epl-v10.html // // The Apache License v2.0 is available at // http://www.opensource.org/licenses/apache2.0.php // // You may elect to redistribute this code under either of these licenses. // ======================================================================== // package org.eclipse.jetty.security; import java.io.Serializable; import java.security.Principal; import javax.security.auth.Subject; import javax.servlet.ServletRequest; import org.eclipse.jetty.server.UserIdentity; import org.eclipse.jetty.util.component.AbstractLifeCycle; import org.eclipse.jetty.util.log.Log; import org.eclipse.jetty.util.log.Logger; import org.eclipse.jetty.util.security.Credential; /** * AbstractLoginService */ public abstract class AbstractLoginService extends AbstractLifeCycle implements LoginService { private static final Logger LOG = Log.getLogger(AbstractLoginService.class); protected IdentityService _identityService=new DefaultIdentityService(); protected String _name; protected boolean _fullValidate = false; /* ------------------------------------------------------------ */ /** * RolePrincipal */ public static class RolePrincipal implements Principal,Serializable { private static final long serialVersionUID = 2998397924051854402L; private final String _roleName; public RolePrincipal(String name) { _roleName=name; } public String getName() { return _roleName; } } /* ------------------------------------------------------------ */ /** * UserPrincipal */ public static class UserPrincipal 
implements Principal,Serializable { private static final long serialVersionUID = -6226920753748399662L; private final String _name; private final Credential _credential; /* -------------------------------------------------------- */ public UserPrincipal(String name,Credential credential) { _name=name; _credential=credential; } /* -------------------------------------------------------- */ public boolean authenticate(Object credentials) { return _credential!=null && _credential.check(credentials); } /* -------------------------------------------------------- */ public boolean authenticate (Credential c) { return(_credential != null && c != null && _credential.equals(c)); } /* ------------------------------------------------------------ */ public String getName() { return _name; } /* -------------------------------------------------------- */ @Override public String toString() { return _name; } } /* ------------------------------------------------------------ */ protected abstract String[] loadRoleInfo (UserPrincipal user); /* ------------------------------------------------------------ */ protected abstract UserPrincipal loadUserInfo (String username); /* ------------------------------------------------------------ */ /** * @see org.eclipse.jetty.security.LoginService#getName() */ @Override public String getName() { return _name; } /* ------------------------------------------------------------ */ /** Set the identityService. * @param identityService the identityService to set */ public void setIdentityService(IdentityService identityService) { if (isRunning()) throw new IllegalStateException("Running"); _identityService = identityService; } /* ------------------------------------------------------------ */ /** Set the name. 
* @param name the name to set */ public void setName(String name) { if (isRunning()) throw new IllegalStateException("Running"); _name = name; } /* ------------------------------------------------------------ */ @Override public String toString() { return this.getClass().getSimpleName()+"["+_name+"]"; } /* ------------------------------------------------------------ */ /** * @see org.eclipse.jetty.security.LoginService#login(java.lang.String, java.lang.Object, javax.servlet.ServletRequest) */ @Override public UserIdentity login(String username, Object credentials, ServletRequest request) { if (username == null) return null; UserPrincipal userPrincipal = loadUserInfo(username); if (userPrincipal != null && userPrincipal.authenticate(credentials)) { //safe to load the roles String[] roles = loadRoleInfo(userPrincipal); Subject subject = new Subject(); subject.getPrincipals().add(userPrincipal); subject.getPrivateCredentials().add(userPrincipal._credential); if (roles!=null) for (String role : roles) subject.getPrincipals().add(new RolePrincipal(role)); subject.setReadOnly(); return _identityService.newUserIdentity(subject,userPrincipal,roles); } return null; } /* ------------------------------------------------------------ */ /** * @see org.eclipse.jetty.security.LoginService#validate(org.eclipse.jetty.server.UserIdentity) */ @Override public boolean validate(UserIdentity user) { if (!isFullValidate()) return true; //if we have a user identity it must be valid //Do a full validation back against the user store UserPrincipal fresh = loadUserInfo(user.getUserPrincipal().getName()); if (fresh == null) return false; //user no longer exists if (user.getUserPrincipal() instanceof UserPrincipal) { return fresh.authenticate(((UserPrincipal)user.getUserPrincipal())._credential); } throw new IllegalStateException("UserPrincipal not KnownUser"); //can't validate } /* ------------------------------------------------------------ */ /** * @see 
org.eclipse.jetty.security.LoginService#getIdentityService() */ @Override public IdentityService getIdentityService() { return _identityService; } /* ------------------------------------------------------------ */ /** * @see org.eclipse.jetty.security.LoginService#logout(org.eclipse.jetty.server.UserIdentity) */ @Override public void logout(UserIdentity user) { //Override in subclasses } /* ------------------------------------------------------------ */ public boolean isFullValidate() { return _fullValidate; } /* ------------------------------------------------------------ */ public void setFullValidate(boolean fullValidate) { _fullValidate = fullValidate; } }
java
Honor has refuted a report that claimed it had officially withdrawn from the Indian market. The report pointed out that Honor India's official Twitter handle has been dormant for over a year. However, an Honor spokesperson in a statement to Gadgets 360 has denied these claims and confirmed that the company will continue to operate in the country. In June, the brand launched the Honor Watch GS 3 in India for Rs. 12,990. In related news, the company recently launched the Honor Magicbook 14 Ryzen Edition in China along with several other products. The statement shared by an Honor spokesperson with Gadgets 360 says, "Honor is maintaining business operation[s] in India and will continue its development. The earlier news report that 'Honor officially announces its exit from the Indian market' is not correct." According to the previous report, Honor had decided to exit the Indian market despite enjoying profitability. Considering that these claims have been refuted by the company, we might expect more devices to be launched in the country. The Chinese brand recently announced the Honor X8 5G. It is powered by a Snapdragon 480+ SoC paired with an Adreno 619 GPU and 6GB of RAM. The handset sports a 6.5-inch HD+ display with a 20:9 aspect ratio. There is a 48-megapixel triple rear camera setup and an 8-megapixel selfie camera. In the previous report, we speculated that it might not be launched in the country. However, the upcoming handset might make its way to the country in the future. Honor also unveiled several new devices last week, including the Honor Magicbook 14. This notebook packs either a Ryzen 5 6600H or a Ryzen 7 6800H APU. It has a 14-inch display with a QHD+ resolution and 300 nits of brightness. The company launched the Earbuds X3 and Earbuds X3i true wireless stereo (TWS) audio wearables. It also revealed two new Honor Smart TV models — the X3 (55-inch and 65-inch) and the X3i (55-inch, 65-inch, and 75-inch).
english
<reponame>Zombie8/ng-spin-kit<filename>dist/app/spinner/rotating-plane.component.metadata.json [{"__symbolic":"module","version":4,"metadata":{"RotatingPlaneComponent":{"__symbolic":"class","extends":{"__symbolic":"reference","module":"./spinner.component","name":"SpinnerComponent","line":47,"character":44},"decorators":[{"__symbolic":"call","expression":{"__symbolic":"reference","module":"@angular/core","name":"Component","line":3,"character":1},"arguments":[{"selector":"sk-rotating-plane","styles":["\n .rotating-plane-spinner {\n margin: 25px auto;\n width: 40px;\n height: 40px;\n \n -webkit-animation: sk-rotateplane 1.2s infinite ease-in-out;\n animation: sk-rotateplane 1.2s infinite ease-in-out;\n }\n \n @-webkit-keyframes sk-rotateplane {\n 0% {\n -webkit-transform: perspective(120px)\n }\n 50% {\n -webkit-transform: perspective(120px) rotateY(180deg)\n }\n 100% {\n -webkit-transform: perspective(120px) rotateY(180deg) rotateX(180deg)\n }\n }\n \n @keyframes sk-rotateplane {\n 0% {\n transform: perspective(120px) rotateX(0deg) rotateY(0deg);\n -webkit-transform: perspective(120px) rotateX(0deg) rotateY(0deg)\n }\n 50% {\n transform: perspective(120px) rotateX(-180.1deg) rotateY(0deg);\n -webkit-transform: perspective(120px) rotateX(-180.1deg) rotateY(0deg)\n }\n 100% {\n transform: perspective(120px) rotateX(-180deg) rotateY(-179.9deg);\n -webkit-transform: perspective(120px) rotateX(-180deg) rotateY(-179.9deg);\n }\n }\n "],"template":"\n <div [hidden]=\"!visible\" class=\"rotating-plane-spinner\" [style.backgroundColor]=\"color\"></div>\n "}]}]}}}]
json
package com.networknt.light.rule.blog; import com.networknt.light.rule.AbstractBfnRule; import com.networknt.light.rule.Rule; /** * Created by steve on 3/6/2015. * Update post in a blog * * AccessLevel R [owner, admin, blogAdmin, blogUser] * * blogUser can only update his or her blog * */ public class UpdPostRule extends AbstractBfnRule implements Rule { public boolean execute (Object ...objects) throws Exception { return updPost("blog", objects); } }
java
<reponame>inke-design/website-fast-generator import { observe, Watcher } from "../observer"; import { isEmpty, compose, get } from "../utils/index"; export default class Model { constructor({ state = {}, middlewares = [] }) { this.store = observe(state); this.INSTALL_TYPES = { action: "action", }; this.actions = {}; this.middlewares = middlewares; this._initMiddleWares(this.middlewares); } getState() { return this.store; } setState(state) { this.store = state; } dispatch = (action) => { return this._beforeDispatch(() => { const actionType = action.type; const actionCfg = this.actions[actionType]; if (actionCfg) { const { reducer } = actionCfg; const state = this.getState(); const newState = reducer(state, action); this.setState(newState); } else { console.warn(`dispatch ${action.type} is not registered`); } }); } install(type, config) { switch (type) { case this.INSTALL_TYPES.action: { this._addAction(config); break; } default: { } } } getter(fn) { const state = {...this.getState()}; if(typeof fn === 'string') { return get(state, fn, undefined); } if(typeof fn === 'function') { return fn(state); } throw new Error("getter params is not a function or string"); } subscribe(fn, keys) { this._addWatcher(keys, fn); } _addAction(config) { if(Array.isArray(config)) { config.forEach(action => { this.actions[action.type] = action; }) return; } this.actions[config.type] = config; } _addWatcher(keys, fn) { if (isEmpty(keys)) { new Watcher(this.store, undefined, (newVal, oldVal, state) => { fn(newVal, oldVal, this.getState()); }); return; } keys.forEach((watchKey) => { new Watcher(this.store, watchKey, (newVal, oldVal, state) => { fn(newVal, oldVal, this.getState()); }); }); } _applyMiddleware(middlewares) { const store = this.store; let dispatch = this.dispatch; const middlewareAPI = { getState: this.getState, dispatch: (action) => dispatch(action), }; const chain = middlewares.map((middleware) => middleware(middlewareAPI)); dispatch = compose(...chain)(dispatch); return dispatch; } 
_beforeDispatch(fn){ return new Promise((resolve, reject) => { try { fn(); resolve(); } catch(err) { reject(err); } }); } _initMiddleWares(middlewares) { if(!isEmpty(middlewares)) { this._dispatch = this.dispatch; this.dispatch = this._applyMiddleware(middlewares); } } }
javascript
A record at the national and international level has been set by the Kannada film 'Ring Road Shuba' as an all-women cinema. The team working on the film is led by Priya Belliappa as director, with Rajani Ravindra Das as producer, Rekha Rani on dialogue, Vani Harikrishna on music composition, Reshmi Sarkar as cinematographer, Shilpa Krishna as costume designer, Maryann D Souza on editing, Chitralekha Shetty as art director, Avantika Nimbalkar as sound designer, Hema BN on sound recording, Geetha Gurappa as DTS engineer, Devanshi Desai as DI colorist, Poonam Prasad as makeup artist, Suman Tyagi as hair stylist, Sonam Thale on publicity stills and design, Chandrika and Jeevitha Vishwanath as choreographers, Mehi Shah and Avisha Baing as still photographers, Abheri De as chief assistant cinematographer, Swapaneel Neogi as chief assistant director, Parineeta Bhure as assistant director, and Bhavyasri U as assistant editor. The all-girls, only-girls cinema team was in a merry mood at the Boozy Griffin Pub, situated above California Pizza Kitchen in Koramangala, where the film's audio, consisting of five songs scored by Vani Harikrishna, was released by KFCC President HD Gangaraju and the well-known distributor Basha. Duniya Vijay, lovely star Premkumar, and Ajit attended, along with ravishing beauties like lead artist Kushi, Nikita Tukral, Sanjana, and of course producer Ranjani Ravindra. The 'Ring Road Shuba' audio is brought to the market by D Beats. Rekha Rani, veteran journalist and part of this film, stated that it is best to release the audio CD with the media in front of us, because they take us to various places. The media has extended tremendous support to this all-women cinema, stated Rekha Rani. She thanked KFCC for relaxing its rules when a great effort was in the offing. The first release cheque, handed over before the shoot, came from Bashaji, as he sensed something remarkable was going to happen with Ring Road Shuba. The song Pom Pom Parvathamnore… featuring Vijay and Nikita with extra artists, and Yaakinge Yaakinge…, a very melodious song introducing lead artist Kushi, were screened.
Sitting with women around him, Duniya Vijay went back to his school days. "Girls would not look at me because of my dark complexion, and today here I am, sitting with women all around," stated Vijay. He worked on this film taking only one rupee as remuneration. "It is a double kick with women and liquor around," stated Premkumar. "It is remarkable and wonderful," were the words from Ranjani Ravindra; "this is history-creating cinema," mentioned Nikita; "I am a feminist first," said Sanjana, who is in her second film with Priya Belliappa. She said Priya is like a 'Huli'. Ravishing beauty Kushi of 'RRS' thanked director Priya Belliyappa for correcting her mistakes. Vani Harikrishna recalled that the song she sang with her husband was their second together; the first, she informed, was in Payana. Yogish, son of Dwarakish, the veteran Ustaad of Kannada cinema, remembered that it was his father's dream to make an all-women cinema, and that dream has come true today. A brilliant Priya Belliappa, heading the megaphone, said the film 'RRS' would be released soon.
english
<reponame>sevgiun/T_System #!/usr/bin/python3 # -*- coding: utf-8 -*- """ .. module:: face_encoding :platform: Unix :synopsis: the top-level submodule of T_System's remote_ui that contains the API for face encoding of T_System's face recognition ability. .. moduleauthor:: <NAME> <<EMAIL>> """ from flask import Blueprint, Response, request, send_file from flask_restful import Api, Resource from schema import SchemaError from t_system.remote_ui.modules.face_encoding import create_face, get_face_image, download_face_image, get_faces, update_face, delete_face from t_system.remote_ui.api.data_schema import FACE_ENCODING_SCHEMA from t_system import log_manager logger = log_manager.get_logger(__name__, "DEBUG") api_bp = Blueprint('face_encoding_api', __name__) api = Api(api_bp) class FaceEncodingApi(Resource): """Class to define an API of the face encoding ability of T_System. This class provides necessary initiations and functions named; :func:`t_system.remote_ui.api.face_encoding.FaceEncodingApi.get` to get face data from the database, :func:`t_system.remote_ui.api.face_encoding.FaceEncodingApi.post` to create a new face encoding, :func:`t_system.remote_ui.api.face_encoding.FaceEncodingApi.put` to update the specified face's encodings, :func:`t_system.remote_ui.api.face_encoding.FaceEncodingApi.delete` to delete the face. """ def __init__(self): """Initialization method of :class:`t_system.remote_ui.api.face_encoding.FaceEncodingApi` class. """ def get(self): """The API method to GET request for flask. 
""" is_download = request.args.get('download', None) face_image = request.args.get('image', None) face_id = request.args.get('id', None) admin_id = request.args.get('admin_id', None) if not face_id and (face_image or is_download): return {'status': 'ERROR', 'message': '\'id\' parameter is missing'} if face_id and not face_image: return {'status': 'ERROR', 'message': '\'image\' parameter is missing'} if face_id: if is_download: image = download_face_image(admin_id, face_id, face_image) if not image: return {'status': 'ERROR', 'message': 'parameter invalid'} return send_file(image) else: image, mimetype = get_face_image(admin_id, face_id, face_image) if image and mimetype: logger.debug("Response returning") return Response(image, mimetype=mimetype) faces = get_faces(admin_id) return {'status': 'OK', 'data': faces} def post(self): """The API method to POST request for flask. """ admin_id = request.args.get('admin_id', None) try: name = request.form.to_dict(flat=True).get("face_name") images = request.files.getlist("face_images") except SchemaError as e: return {'status': 'ERROR', 'message': e.code} result = create_face(admin_id, name, images) return {'status': 'OK' if result else 'ERROR'} def put(self): """The API method to PUT request for flask. """ face_id = request.args.get('id') admin_id = request.args.get('admin_id', None) if not face_id: return {'status': 'ERROR', 'message': '\'id\' parameter is missing'} try: form = request.form.to_dict(flat=True) data = FACE_ENCODING_SCHEMA.validate(form) except SchemaError as e: return {'status': 'ERROR', 'message': e.code} result = update_face(admin_id, face_id, data) return {'status': 'OK' if result else 'ERROR'} def delete(self): """The API method to DELETE request for flask. 
""" face_id = request.args.get('id') admin_id = request.args.get('admin_id', None) if not face_id: return {'status': 'ERROR', 'message': '\'id\' parameter is missing'} result = delete_face(admin_id, face_id) return {'status': 'OK' if result else 'ERROR'} api.add_resource(FaceEncodingApi, '/api/face_encoding')
python
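The parameter checks at the top of the `FaceEncodingApi.get` handler above can be sketched as a plain function. This is a hypothetical stand-in for illustration only, not the actual Flask resource: `args` mimics `request.args` as a plain dict, and the function name `validate_face_query` is invented here.

```python
def validate_face_query(args):
    """Return an error payload for inconsistent query parameters, or None
    when the combination is acceptable (mirrors FaceEncodingApi.get)."""
    face_id = args.get("id")
    face_image = args.get("image")
    is_download = args.get("download")
    # An image fetch or download makes no sense without a face id.
    if not face_id and (face_image or is_download):
        return {"status": "ERROR", "message": "'id' parameter is missing"}
    # A face id without an image name is likewise rejected by the handler.
    if face_id and not face_image:
        return {"status": "ERROR", "message": "'image' parameter is missing"}
    return None
```

With no parameters at all the real handler falls through to `get_faces` and returns the whole list, which is why the empty-dict case returns `None` here.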
#dataexplorer-tableview-dialog table td { background: #fff; border: 1px solid #eee; padding: 6px; color: #555; } #dataexplorer-tableview-dialog table thead td { background: #444; color: #fff; border-color: #444; border-bottom: none; } .dataexplorer-tableview-viewer { position: relative; height: 100%; overflow: hidden; } .dataexplorer-tableview-nav { list-style-type: none; margin: 0; padding: 0; height: 22px; } .dataexplorer-tableview-nav-toggle { float: left; } .dataexplorer-tableview-nav li { float: left; } #dataexplorer-tableview-nav-editor, label[for=dataexplorer-tableview-nav-editor] { float: right; } /* Adds border to left of data table */ .dataexplorer-tableview-grid, .dataexplorer-tableview-graph, .dataexplorer-tableview-editor { position: absolute; left: 0; right: 220px; top: 28px; bottom: 0; z-index: 0; } .dataexplorer-tableview-grid { border-left: 1px solid #ccc; } .dataexplorer-tableview-graph { z-index: 1; background-color: #fff; } .dataexplorer-tableview-editor { z-index: 1; background-color: #efefef; right: 0; left: auto; width: 198px; padding: 5px 10px; border: 1px solid #ccc; overflow: auto; overflow-x: hidden; } .dataexplorer-tableview-editor ul { list-style-type: none; margin: 0; padding: 0; overflow: hidden; } .dataexplorer-tableview-editor li { margin-bottom: 10px; } .dataexplorer-tableview-editor .dataexplorer-tableview-editor-group { padding-bottom: 10px; margin-bottom: 10px; border-bottom: 1px solid #ddd; } .dataexplorer-tableview-editor label { display: block; font-weight: bold; color: #555; line-height: 1.4; } .dataexplorer-tableview-editor label a { float: right; font-size: 11px; color: #999; font-weight: normal; } .dataexplorer-tableview-editor select { width: 100%; } .dataexplorer-tableview-editor button { float: right; } .dataexplorer-tableview-editor-buttons { clear: right; overflow: hidden; } .dataexplorer-tableview-editor-submit { margin-top: 10px; padding-top: 10px; border-top: 1px solid #ddd; } .dataexplorer-tableview-editor-info { 
border-bottom: 1px solid #ddd; margin-bottom: 10px; } .dataexplorer-tableview-editor-info h1, .dataexplorer-tableview-editor-info p { font-size: 12px; margin: 0 0 10px; color: #555; } .dataexplorer-tableview-editor-info h1 { line-height: 16px; cursor: pointer; font-family: sans-serif; font-size: 13px; font-weight: bold; margin: 0 0 4px; } .dataexplorer-tableview-editor-info h1 span { position: relative; top: 1px; display: inline-block; width: 12px; height: 12px; background: url(jquery-ui/css/ckan/images/ui-icons_444444_256x240.png) no-repeat -68px -17px; } .dataexplorer-tableview-editor-hide-info h1 span { background-position: -36px -18px; } .dataexplorer-tableview-editor-hide-info p { display: none; } .dataexplorer-tableview-hide-editor .dataexplorer-tableview-editor { display: none; } .dataexplorer-tableview-hide-editor .dataexplorer-tableview-panel { right: 0; } /* Style the preview buttons */ .preview { width: 65px; } .preview .resource-preview-button { border: 1px solid #eaeaea; color: #444; padding: 3px 6px 2px 21px; font-size: 12px; font-weight: bold; background: #fff url(./icon-sprite.png) no-repeat 5px 6px; -webkit-border-radius: 4px; -moz-border-radius: 4px; -ms-border-radius: 4px; -o-border-radius: 4px; border-radius: 4px; display: block; margin: 0 -5px; } .preview .resource-preview-button:hover { color: #000; text-decoration: none; border-color: #aaa; background-color: #fafafa; background-position: 5px -24px; } .preview .resource-preview-button:active { color: #b22; border-color: #b22; background-position: 5px -54px; } .preview .resource-preview-chart { background-position: 5px -83px; } .preview .resource-preview-chart:hover { background-position: 5px -113px; } .preview .resource-preview-chart:active { background-position: 5px -143px; } .preview .resource-preview-loading, .preview .resource-preview-loading:hover, .preview .resource-preview-loading:active { color: #aaa; border: 1px solid #eaeaea; background-image: url(./loading.gif); background-position: 
3px 5px; cursor: default; } /* Reduce the default size of the alert dialog */ .ui-dialog .ui-button-text-only .ui-button-text { padding: 3px 16px 1px; } /* Extend the resize handle into the adjacent column */ .slick-resizable-handle { width: 12px; right: -6px; } /* Fix to crop the text correctly */ .slick-header-wrapper { overflow: hidden; display: inline-block; width: 100%; text-overflow: ellipsis; } .slick-columnpicker { border-color: #888; z-index: 99999!important; } .slick-columnpicker label, .slick-columnpicker input { cursor: pointer; }
css
'use strict'; const express = require('express'); const basicAuth = require('../../middleware/BasicAuthentication'); const router = express.Router(); router.post('/signin', basicAuth, signinUser); /** * Signs the user in once basicAuth has verified the credentials. * @param {object} req * request object; the basicAuth middleware attaches the issued token to it as req.token * @param {object} res * response object; the token is stored in a cookie before redirecting to the home page */ function signinUser(req, res) { let token = req.token; const oneDayInMs = 86400000; res.cookie('remember token', token, { expires: new Date(Date.now() + oneDayInMs), httpOnly: true, }); res.redirect('/home'); } module.exports = router;
javascript
<reponame>metamatex/metamate // generated by metactl sdk gen package mql const ( DummySortName = "DummySort" ) type DummySort struct { BoolField *string `json:"boolField,omitempty" yaml:"boolField,omitempty"` CreatedAt *TimestampSort `json:"createdAt,omitempty" yaml:"createdAt,omitempty"` Float64Field *string `json:"float64Field,omitempty" yaml:"float64Field,omitempty"` Id *ServiceIdSort `json:"id,omitempty" yaml:"id,omitempty"` Int32Field *string `json:"int32Field,omitempty" yaml:"int32Field,omitempty"` StringField *string `json:"stringField,omitempty" yaml:"stringField,omitempty"` UnionField *DummyUnionSort `json:"unionField,omitempty" yaml:"unionField,omitempty"` }
go
package org.jemiahlabs.skrls.core; public class Message { public static Message createMessageInfo(String text) { return new Message(text, TypeMessage.INFO); } public static Message createMessageWarning(String text) { return new Message(text, TypeMessage.WARNNING); } public static Message createMessageError(String text) { return new Message(text, TypeMessage.ERROR); } private String text; private TypeMessage typeMessage; public Message(String text, TypeMessage typeMessage) { this.text = text; this.typeMessage = typeMessage; } public String getText() { return text; } public TypeMessage getTypeMessage() { return typeMessage; } }
java
Ouya's Kickstarter console needs a huge wodge of cash to become a real product, but it certainly sounds intriguing. US company Ouya is plotting an Android-powered games console, which is designed to be dirt cheap, built for hackers and offer free games to play on your telly. Here's the rub -- it's not yet real. Ouya is using crowd-funding site Kickstarter to harvest money for the project, which is born out of an acceptance that indie game developers are moving towards mobile gadgets like Android smart phones or the iPad. In a classy promo video (embedded below) that promises an "inexpensive games console for gamers", Ouya founder Julie Uhrman explains the box will be powered by Google's mobile operating system -- version 4.0 Ice Cream Sandwich to be exact -- and that every game will have a free-to-play element. Notable indie developers have flocked to support the project, including Minecraft creator Mojang. Adam Saltsman, who created popular moody mobile platformer Canabalt, said, "The prospect of an affordable, open console -- that's an idea I find really exciting." Also on board is Yves Behar, who did design work for the One Laptop per Child project, and has given the Ouya its shiny curvy cuby look. Throwing $95 (about £60) or more Ouya's way will get you an Ouya console and controller, should the project escape the realm of fantasy and become a real object you can buy. Developers who pay $699 (about £450) for the ambitious project will have their game promoted for one year, and marked with a 'founder emblem'. If it doesn't hit its target, you won't be charged. It's a cool project, and makes you glad that open-source platforms like Android exist. Ouya has just 29 days to reach its mammoth $950k goal -- I'll be watching its progress like a hungry hawk. Kickstarter recently confirmed that it's coming to the UK, so if you have a crazy invention you'd like other people to pay for, your chance could be coming up. Will you be backing Ouya? 
Let me know in the comments or on our Facebook wall.
english
Dev Anand was in love with Hare Rama Hare Krishna co-star Zeenat Aman and lost her to Raj Kapoor. Here is what happened in Dev Anand’s words. Tuesday marks the 94th birth anniversary of the evergreen star of Bollywood - Dev Anand. He floored millions of women with his performances and stunning looks but he could not get the one he truly loved -- we are talking about his Hare Rama Hare Krishna co-star Zeenat Aman. Basking in the glow of the success of Hare Rama Hare Krishna in 1971, Dev Anand, the ever-romantic hero, realised that he was in love with the film’s leading lady and his discovery Zeenat Aman. Describing his feelings for Zeenat, Dev Anand wrote in his autobiography, Romancing With Life, that he enjoyed it when newspapers and magazines started linking them together romantically after the film’s success. He almost declared his love to her, but quietly withdrew when he saw her getting close to Raj Kapoor who wanted to cast her in his film, Satyam Shivam Sundaram. “Whenever and wherever she was talked about glowingly, I loved it; and whenever and wherever I was discussed in the same vein, she was jubilant. In the subconscious, we had become emotionally attached to each other,” Dev Anand wrote in his 2007 book. He soon realised that he was in love with her and wanted to declare it to her at a romantic meeting at the Taj in Mumbai. Suspecting something, Dev Anand recalled that a rumour had been floating that Zeenat had gone to Raj Kapoor’s studio for a screen test for the main role in his new movie, Satyam Shivam Sundaram. “The hearsay now started ringing true. My heart was bleeding,” he wrote. This article was first published in 2011.
english
The second half begins much like the first half - with frantic attacks. An Indian player goes down injured after a clash. Both sets of players start coming together, but then better sense prevails. India still on the attack, still hunting for the equaliser.
english
{"name":"<NAME>","harga":" Rp. 22.990/ kemasan","golongan":"http://medicastore.com/image_banner/obat_keras_dan_psikotropika.gif","kandungan":"Tiap 5 ml mengandung : Ambroxol HCl / Ambroksol HCl 15 mg","indikasi":"Sekretolitik pada gangguan saluran pernafasan kronis, khususnya pada eksaserbasi bronkitis kronis, bronkitis asmatik & asma bronkial","kontraindikasi":"","perhatian":"","efeksamping":"Efek samping pada saluran pencernaan bersifat ringan, dan reaksi alergi.","indeksamanwanitahamil":"","kemasan":"Botol isi 60 ml","dosis":" Anak dibawah 2 tahun : 2 kali sehari 1/2 sendok takar\r\n Anak 2-6 tahun : 3 kali sehari 1/2 sendok takar\r\n Anak 6-12 tahun : 2-3 kali sehari 1 sendok takar","penyajian":"Dikonsumsi bersamaan dengan makanan","pabrik":"Galenium Pharmasia Laboratories.","id":"12779","category_id":"3"}
json
{ "id": 72380055, "name": "stories-js", "fullName": "davideo71/stories-js", "owner": { "login": "davideo71", "id": 1611385, "avatarUrl": "https://avatars2.githubusercontent.com/u/1611385?v=3", "gravatarId": "", "url": "https://api.github.com/users/davideo71", "htmlUrl": "https://github.com/davideo71", "followersUrl": "https://api.github.com/users/davideo71/followers", "subscriptionsUrl": "https://api.github.com/users/davideo71/subscriptions", "organizationsUrl": "https://api.github.com/users/davideo71/orgs", "reposUrl": "https://api.github.com/users/davideo71/repos", "receivedEventsUrl": "https://api.github.com/users/davideo71/received_events", "type": "User" }, "private": false, "htmlUrl": "https://github.com/davideo71/stories-js", "description": "Next iteration of the stories project", "fork": false, "url": "https://api.github.com/repos/davideo71/stories-js", "forksUrl": "https://api.github.com/repos/davideo71/stories-js/forks", "teamsUrl": "https://api.github.com/repos/davideo71/stories-js/teams", "hooksUrl": "https://api.github.com/repos/davideo71/stories-js/hooks", "eventsUrl": "https://api.github.com/repos/davideo71/stories-js/events", "tagsUrl": "https://api.github.com/repos/davideo71/stories-js/tags", "languagesUrl": "https://api.github.com/repos/davideo71/stories-js/languages", "stargazersUrl": "https://api.github.com/repos/davideo71/stories-js/stargazers", "contributorsUrl": "https://api.github.com/repos/davideo71/stories-js/contributors", "subscribersUrl": "https://api.github.com/repos/davideo71/stories-js/subscribers", "subscriptionUrl": "https://api.github.com/repos/davideo71/stories-js/subscription", "mergesUrl": "https://api.github.com/repos/davideo71/stories-js/merges", "downloadsUrl": "https://api.github.com/repos/davideo71/stories-js/downloads", "deploymentsUrl": "https://api.github.com/repos/davideo71/stories-js/deployments", "createdAt": "2016-10-30T22:41:19.000Z", "updatedAt": "2016-10-30T22:57:46.000Z", "pushedAt": "2017-01-29T13:11:25.000Z", 
"gitUrl": "git://github.com/davideo71/stories-js.git", "sshUrl": "git@github.com:davideo71/stories-js.git", "cloneUrl": "https://github.com/davideo71/stories-js.git", "svnUrl": "https://github.com/davideo71/stories-js", "homepage": null, "size": 501, "stargazersCount": 0, "watchersCount": 0, "language": "JavaScript", "hasIssues": true, "hasDownloads": true, "hasWiki": true, "hasPages": false, "forksCount": 0, "mirrorUrl": null, "openIssuesCount": 0, "openIssues": 0, "watchers": 0, "defaultBranch": "master", "permissions": { "admin": false, "push": false, "pull": true }, "license": null, "networkCount": 0, "subscribersCount": 4, "status": 200, "packageJSON": { "name": "stories", "version": "0.0.0", "description": "Stories Electron test", "main": "src/main.js", "scripts": { "test": "echo \"Error: no test specified\" && exit 1", "start": "electron ." }, "author": "", "license": "TODO: http://choosealicense.com/about", "dependencies": { "three": "~0.82.1" }, "devDependencies": { "electron": "^1.4", "eslint": "^3.4" } }, "packageStatus": 200, "firstCommit": { "sha": "e9e62e9e55ae1460cedc4511a9acbe1dfa6dbf59", "commit": { "author": { "name": "<NAME>", "email": "<EMAIL>", "date": "2016-10-30T12:29:15Z" }, "committer": { "name": "<NAME>", "email": "<EMAIL>", "date": "2016-10-30T12:29:15Z" }, "message": "Initial checkin of Stories project: for now only a quick start Electron example.", "tree": { "sha": "b79db9a0348f75eb6b73ce9a967f82976946d79a", "url": "https://api.github.com/repos/davideo71/stories-js/git/trees/b79db9a0348f75eb6b73ce9a967f82976946d79a" }, "url": "https://api.github.com/repos/davideo71/stories-js/git/commits/e9e62e9e55ae1460cedc4511a9acbe1dfa6dbf59", "commentCount": 0 } }, "filename": "davideo71___stories-js.json", "hasProjects": true, "lastFetchedAt": "2017-05-04T05:18:11.183Z", "packageLastFetchedAt": "2017-05-05T17:29:53.077Z" }
json
body { width: 40em; margin: 0 auto; padding: 0; } #header, #nav { text-align: center; } @media (max-width: 640px) { body { font-size: 16px; } } @media (min-width: 641px) and (max-width: 1400px) { body { font-size: 24px; } } @media (min-width: 1401px) { body { font-size: 32px; } }
css
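The three media queries above partition the viewport width into non-overlapping bands (≤640px, 641-1400px, ≥1401px). The same mapping can be written out as a plain function for clarity; widths and sizes are in CSS pixels, and the function name is invented here purely for illustration.

```python
def font_size_for(width):
    """Return the body font size (px) chosen by the stylesheet's
    three media-query breakpoints for a given viewport width."""
    if width <= 640:        # @media (max-width: 640px)
        return 16
    if width <= 1400:       # @media (min-width: 641px) and (max-width: 1400px)
        return 24
    return 32               # @media (min-width: 1401px)
```

Because the bands meet exactly at 640/641 and 1400/1401, every integer width matches exactly one rule.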
<reponame>Datenschule/schulscraper-data {"name":"Daniel-Cederberg-Schule","id":"HE-8234-0","official_id":"8234","address":"Neuhöfe 17, 35041 Marburg","school_type":"Sonstige Förderschule","school_type_entity":"Förderschule","fax":"06421/936444","state":"HE","phone":"06421/93640","full_time_school":false,"lon":8.724857,"lat":50.799733}
json
/* ==UserStyle== @name amor doce perfil133234234355466 @namespace USO Archive @author <NAME> @description `amor doce perfil` @version 20160305.19.5 @license NO-REDISTRIBUTION @preprocessor uso ==/UserStyle== */ @-moz-document domain("amordoce.com") { #container{ background-image : url(http://i64.tinypic.com/2akh3s2.jpg) !important ; } #container.connected #header { background-image : url(http://i66.tinypic.com/ws0f80.jpg) !important ; } .idcard-member { background : url(http://i66.tinypic.com/r2s1nt.png) no-repeat !important ; } } @-moz-document regexp("Fofinhaflor") { .idcard-member { background : url(http://i68.tinypic.com/zuf67s.png) no-repeat !important ; } }
css
<reponame>Wasteland-Ventures-Group/WV-VTT-module<filename>src/typescript/hooks/index.ts import registerForDragRulerReady from "./dragRuler/ready.js"; import registerForHotbarDrop from "./hotbarDrop.js"; import registerForInit from "./init.js"; import registerForPreUpdateActor from "./preUpdateActor.js"; import registerForReady from "./ready.js"; import registerForRenderChatMessage from "./renderChatMessage/index.js"; import registerForUpdateActor from "./updateActor.js"; /** Register system callbacks for all used hooks. */ export default function registerForHooks(): void { registerForInit(); registerForReady(); registerForHotbarDrop(); registerForRenderChatMessage(); registerForPreUpdateActor(); registerForUpdateActor(); registerForDragRulerReady(); }
typescript
// This file is part of MinSQL // Copyright (c) 2019 MinIO, Inc. // // This program is free software: you can redistribute it and/or modify // it under the terms of the GNU Affero General Public License as published by // the Free Software Foundation, either version 3 of the License, or // (at your option) any later version. // // This program is distributed in the hope that it will be useful, // but WITHOUT ANY WARRANTY; without even the implied warranty of // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the // GNU Affero General Public License for more details. // // You should have received a copy of the GNU Affero General Public License // along with this program. If not, see <http://www.gnu.org/licenses/>. use std::fmt; use std::io::Error as IoError; use std::time::Instant; use chrono::{Datelike, Timelike, Utc}; use futures::future::result; use futures::future::FutureResult; use futures::stream::Stream; use futures::Future; use futures::Poll; use rand::distributions::{IndependentSample, Range}; use rusoto_core::HttpClient; use rusoto_core::Region; use rusoto_core::RusotoError; use rusoto_credential::AwsCredentials; use rusoto_credential::CredentialsError; use rusoto_credential::ProvideAwsCredentials; use rusoto_s3::{ GetObjectError, GetObjectRequest, ListObjectsRequest, PutObjectRequest, S3Client, S3, }; use uuid::Uuid; use crate::config::{Config, DataStore}; #[derive(Debug)] enum StorageError { Get(GetObjectError), Io(IoError), } impl From<GetObjectError> for StorageError { fn from(err: GetObjectError) -> Self { StorageError::Get(err) } } impl From<RusotoError<GetObjectError>> for StorageError { fn from(err: RusotoError<GetObjectError>) -> Self { match err { RusotoError::Service(e) => StorageError::Get(e), _ => StorageError::Io(IoError::last_os_error()), } } } impl From<IoError> for StorageError { fn from(err: IoError) -> Self { StorageError::Io(err) } } // Our Credentials holder so we can use per-datasource credentials with rusoto 
#[derive(Debug)] pub struct CustomCredentialsProvider { credentials: AwsCredentials, } impl CustomCredentialsProvider { pub fn with_credentials(credentials: AwsCredentials) -> Self { CustomCredentialsProvider { credentials, } } } pub struct CustomCredentialsProviderFuture { inner: FutureResult<AwsCredentials, CredentialsError>, } impl Future for CustomCredentialsProviderFuture { type Item = AwsCredentials; type Error = CredentialsError; fn poll(&mut self) -> Poll<Self::Item, Self::Error> { self.inner.poll() } } impl ProvideAwsCredentials for CustomCredentialsProvider { type Future = CustomCredentialsProviderFuture; fn credentials(&self) -> Self::Future { CustomCredentialsProviderFuture { inner: result(Ok(self.credentials.clone())), } } } fn client_for_datastore(datastore: &DataStore) -> S3Client { // Create a credentials holder, for our provider to provide into the s3 client let credentials = AwsCredentials::new( &datastore.access_key[..], &datastore.secret_key[..], None, None, ); let provider = CustomCredentialsProvider::with_credentials(credentials); let dispatcher = HttpClient::new().expect("failed to create request dispatcher"); // A custom region is the way to point to a minio instance let region = Region::Custom { name: datastore.name.clone().unwrap(), endpoint: datastore.endpoint.clone(), }; // Build the client let s3_client = S3Client::new_with(dispatcher, provider, region); s3_client } // Verifies that a datastore is reachable by attempting a list call. pub fn can_reach_datastore(datastore: &DataStore) -> bool { // Get the Object Storage client let s3_client = client_for_datastore(&datastore); // perform list call to verify we have access let can_reach = match s3_client .list_objects(ListObjectsRequest { bucket: datastore.bucket.clone(), delimiter: None, encoding_type: None, marker: None, max_keys: Some(i64::from(1)), prefix: None, request_payer: None, }) .sync() { Ok(_) => true, Err(e) => { info!("Cannot access bucket: {}", e); 
false } }; can_reach } fn str_to_streaming_body(s: String) -> rusoto_s3::StreamingBody { s.into_bytes().into() } #[derive(Debug)] pub struct WriteDatastoreError { details: String, } impl WriteDatastoreError { pub fn new(msg: &str) -> WriteDatastoreError { WriteDatastoreError { details: msg.to_string(), } } } impl fmt::Display for WriteDatastoreError { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { write!(f, "{}", self.details) } } pub fn write_to_datastore( log_name: &str, cfg: &Config, payload: &String, ) -> Result<bool, WriteDatastoreError> { let start = Instant::now(); // Select a datastore at random to write to let datastore = rand_datastore(&cfg, &log_name).unwrap(); // Get the Object Storage client let s3_client = client_for_datastore(&datastore); let now = Utc::now(); let my_uuid = Uuid::new_v4(); let target_file = format!( "{log}/{year}/{month}/{day}/{hour}/{ts}.log", log = log_name, year = now.date().year(), month = now.date().month(), day = now.date().day(), hour = now.hour(), ts = my_uuid ); let destination = format!("minsql/{}", target_file); // turn the payload into a streaming body let strbody = str_to_streaming_body(payload.clone()); // save the payload match s3_client .put_object(PutObjectRequest { bucket: datastore.bucket.clone(), key: destination, body: Some(strbody), ..Default::default() }) .sync() { Ok(x) => x, Err(e) => { return Err(WriteDatastoreError::new( &format!("Could not write to datastore: {}", e)[..], )); } }; //TODO: Remove this metric let duration = start.elapsed(); println!("Writing to minio: {:?}", duration); Ok(true) } #[derive(Debug)] pub struct ListMslFilesError { details: String, } impl ListMslFilesError { pub fn new(msg: &str) -> ListMslFilesError { ListMslFilesError { details: msg.to_string(), } } } impl fmt::Display for ListMslFilesError { fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result { write!(f, "{}", self.details) } } // List all the files for a bucket pub fn list_msl_bucket_files( logname: &str, datastore: 
&DataStore, ) -> Result<Vec<String>, ListMslFilesError> { let s3_client = client_for_datastore(datastore); let files = match s3_client .list_objects(ListObjectsRequest { bucket: datastore.bucket.clone(), prefix: Some(format!("minsql/{}", logname)), ..Default::default() }) .sync() { Ok(l) => l, Err(e) => { return Err(ListMslFilesError::new( &format!("Could not list files in datastore: {}", e)[..], )); } }; let mut msl_files = Vec::new(); for file in files.contents.unwrap() { let fkey = file.key.unwrap(); if fkey.contains(".log") { msl_files.push(fkey) } } Ok(msl_files) } // Return the contents of a file from a datastore pub fn read_file(key: &String, datastore: &DataStore) -> Result<String, ListMslFilesError> { let s3_client = client_for_datastore(datastore); let files = match s3_client .get_object(GetObjectRequest { bucket: datastore.bucket.clone(), key: key.clone(), ..Default::default() }) .sync() { Ok(l) => l, Err(e) => { return Err(ListMslFilesError::new( &format!("Could not list files in datastore: {}", e)[..], )); } }; let stream = files.body.unwrap(); let bytes = stream.concat2().wait().unwrap(); let body = String::from_utf8(bytes).unwrap(); Ok(body) } /// Selects a datastore at random /// Will return `None` if the log_name doesn't match a valid `Log` name in the `Config`. 
fn rand_datastore<'a>(cfg: &'a Config, log_name: &str) -> Option<&'a DataStore> { let log = match cfg.log.get(log_name) { Some(v) => v, None => return None, }; if log.datastores.is_empty() { return None; } let index = Range::new(0, log.datastores.len()).ind_sample(&mut rand::thread_rng()); let datastore_name = match log.datastores.iter().skip(index).next() { Some(v) => v, None => return None, }; cfg.datastore.get(&datastore_name[..]) } #[cfg(test)] mod storage_tests { use std::collections::HashMap; use crate::config::Log; use super::*; // Generates a Config object with only one auth item for one log fn get_ds_log_config_for(log_name: String, datastore_list: &Vec<String>) -> Config { let mut datastore_map = HashMap::new(); for datastore_name in datastore_list { datastore_map.insert( datastore_name.clone(), DataStore { name: Some(datastore_name.clone()), endpoint: "".to_string(), access_key: "".to_string(), secret_key: "".to_string(), bucket: "".to_string(), prefix: "".to_string(), }, ); } let mut log_map = HashMap::new(); log_map.insert( log_name.clone(), Log { name: Some(log_name.clone()), datastores: datastore_list.clone(), commit_window: "5s".to_string(), }, ); let cfg = Config { version: "1".to_string(), server: None, datastore: datastore_map, log: log_map, auth: HashMap::new(), }; cfg } #[test] fn random_datastore_selected() { let ds_list = vec!["ds1".to_string(), "ds2".to_string()]; let cfg = get_ds_log_config_for("mylog".to_string(), &ds_list); let cfg = Box::new(cfg); let cfg: &'static _ = Box::leak(cfg); let rand_ds = rand_datastore(&cfg, "mylog"); let ds_name = match rand_ds { None => panic!("No datastore was matched"), Some(ds) => ds.name.clone().unwrap(), }; let ds_in_list = ds_list.contains(&ds_name); assert_eq!(ds_in_list, true) } #[test] fn fail_random_datastore_selected() { let ds_list = vec!["ds1".to_string(), "ds2".to_string()]; let cfg = get_ds_log_config_for("mylog".to_string(), &ds_list); let cfg = Box::new(cfg); let cfg: &'static _ = 
Box::leak(cfg); let rand_ds = rand_datastore(&cfg, "mylog2"); assert_eq!( rand_ds, None, "Select random datastore from incorrect log should have failed." ) } }
rust
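The `rand_datastore` helper in the Rust module above picks one of the datastores configured for a log, returning `None` when the log is unknown or has an empty list. The same selection logic can be sketched in Python; this is an illustrative approximation, not the project's API, and `cfg` is a plain dict standing in for the real `Config` type.

```python
import random

def rand_datastore(cfg, log_name):
    """Pick a random datastore configured for `log_name`, or None when
    the log is unknown or has no datastores (mirrors the Rust version)."""
    log = cfg["log"].get(log_name)
    if log is None or not log["datastores"]:
        return None
    # Uniform choice mirrors the Range::new(0, len) sampling in the Rust code.
    name = random.choice(log["datastores"])
    return cfg["datastore"].get(name)
```

As in the Rust unit tests, a known log name yields one of its configured datastores and an unknown name yields `None`.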
<reponame>GraphicsReplicability/sig20-paper-supplementary-materials<gh_stars>1-10 { "Timestamp": "12/18/2019 10:19:40", "Title": "Scanner: efficient video analysis at scale", "DOI": "10.1145/3197517.3201394", "PDF availability (beside dl.acm)": "On an open archive arXiv or similar open archive, On the author(s) website or its institution", "Is the code or software publicly available ?": "Yes", "Availability": "on github, gitlab or other code platform", "Main languages": "C/C++, Python", "How long did take for you to evaluate the code this paper (if any)?": 5, "Build info / instructions": "README, LICENSE or explicit license (e.g. in the readme / headers), CONTRIBUTING or explicitly mentions how to contribute to the code, AUTHORS or explicitly mentioned", "arXiv (page URL)": "https://arxiv.org/abs/1805.07339", "code URL": "https://github.com/scanner-research/scanner", "code URL 2": "", "PDF URL": "http://graphics.stanford.edu/papers/scanner/poms18_scanner.pdf", "License type": "Apache License 2.0", "Build mechanism": "Makefile, CMakeLists, Not applicable (python, Matlab..)", "Mandatory Software dependencies": "Open-source libraries, Closed source (e.g. 
commercial) software or libraries free for research purposes", "Feedback [easy deps]": "", "Feedbacks [easy to find/install dependencies (5=easy)]": 5, "Feedbacks [easy to configure/build (5=easy)]": "N/A", "Feedbacks [easy to fix bugs (5=easy N/A if no bug)]": "N/A", "Execution experience [Row 1]": "", "Does the code need data (other than examples/inputs) ?": "Cannot answer", "Available data (provided or url / ref)": "", "License for the data (if any)": "", "Feedbacks [easy to adapt / use in other contexts (5=easy)]": 5, "Feedbacks [Interface user-friendly (5=easy)]": 5, "Documentation": "User-documentation (readme-note-tutorial) more than just build instructions, API documentation", "Authors": "Academia", "ACM 1": "Image manipulation", "ACM 2": "Graphics systems and interfaces", "Feedbacks [if matlab does it run on OSS alternatives (5=easy)]": "N/A", "OS of the test": "Linux", "General comments": "I used the docker setting, it worked well (cpu). Was not able to install from source because of a problem with hwang (installed by deps.sh although optional dependency). A jupyter notebook extremely detailed is provided and helps understand the code, so it would be very easy to adapt. I did not reproduce all the results in the paper (grayscale conversion worked well with docker).", "Does the paper already have a reproducibility stamp": "No", "Citation count (google scholar)": 14, "List of the required dependencies (if any, \"/\" separated, ex: matlab, libpng,cgal...)": "tensorflow/caffe/opencv/eigen3/ffmpeg/boost/openpose/halide/storehouse/hwang/pybind/grpc/protobuf/libpqxx/cuda/cudnn", "Project URL": "http://scanner.run/", "Did I manage to perform a complete test (deps/build)?": "Yes", "Year": 2018, "Misc. 
comment": "", "Software Type": "Code", "Topic": "Image", "Deep learning": "False", "hasThumbnail": true, "Open access": false, "Feedbacks [could reproduce results (5=highly confident)]": 4, "Could paper be trivially implemented using the pseudo-code": "", "Reviewer": 1, "Documentation score": 1, "Paper authors": [ { "given": "Alex", "family": "Poms", "sequence": "first", "affiliation": [ { "name": "Carnegie Mellon University" } ] }, { "given": "Will", "family": "Crichton", "sequence": "additional", "affiliation": [ { "name": "Stanford University" } ] }, { "given": "Pat", "family": "Hanrahan", "sequence": "additional", "affiliation": [ { "name": "Stanford University" } ] }, { "given": "Kayvon", "family": "Fatahalian", "sequence": "additional", "affiliation": [ { "name": "Stanford University" } ] } ], "Altmetric score": 4, "Altmetric badge": "https://badges.altmetric.com/?size=64&score=4&types=tttttttt", "Altmetric url": "http://www.altmetric.com/details.php?citation_id=42155836" }
json
The American organizers of Formula One’s Miami Grand Prix went all-in on a showbiz-style introduction of competitors before Sunday’s race but the drivers were left far from impressed by the razzmatazz. Before the national anthem and start of the race, rapper-turned-actor L. L. Cool J introduced the drivers to the crowd with Formula One’s finest walking into dry ice and to shimmering pom-poms from the NFL Miami Dolphins cheerleaders. While it was clearly designed to appeal to the South Florida crowd, Mercedes driver George Russell said it was not ideal preparation for a 57-lap high-speed race in the heat. “It is distracting because, you know, we were on the grid for half an hour in all of our overalls in the sun,” he said. “I don’t think there’s any other sports in the world that 30 minutes before you go out to do your business that you’re out there in the sun, all the cameras on you, and making a bit of a show of it. “We spoke about it as drivers on Friday night. Everybody’s got different personalities. I guess it’s the American way of doing things, doing sport. “Personally, probably not for me. But you know, that’s just my personal opinion.” World champion Max Verstappen of Red Bull, who went on to win the race, broadly agreed, saying attitudes came down to personality and preferences. “Some people like to be more in the spotlight, some people don’t. I personally don’t,” he said. “So for me, I think that, naturally, of course, what they did today is not necessary. I prefer to just talk to my engineers, walk to my car, put the helmet on and drive. “But of course, I understand the entertainment value. So yeah, I just hope we don’t have that every single time because we have a very long season. So we don’t need an entry like that every time. 
" Red Bull team principal Christian Horner accepted that F1 is experimenting with different approaches in what is still a new market for them, but said such innovations needed to take into account the drivers’ desire for a calm preparation. “It’s quite tough for the drivers, to be honest with you, to be running through dry ice and high-fiving A-listers that they’re probably not quite sure who they are. Then thrown into the national anthem and expected to deliver," he said. “There’s not many sports that athletes have to do that. And so I think we need to be respectful. " But seven-time world champion Lewis Hamilton, who embraced L. L. Cool J when he walked out, backed the move. “They’re trying new things, they’re trying to improve the show always, and I’m in full support of it," he said. “I thought it was cool. " Read all the Latest Sports News, Check Out Orange Cap and Purple Cap holder details here(This story has not been edited by News18 staff and is published from a syndicated news agency feed - AFP)
english
/* andrew4699/File-Manager: css/files.css */
.fileContainer {
  display: inline-block;
  background: #525252;
  text-align: center;
  padding: 10px;
  margin: 0 7px 7px 0;
}

.fileContainer:hover {
  background: #424242;
  cursor: pointer;
}

.fileContainerTitle {
  font: bold 16px Helvetica, sans-serif;
  color: white;
  margin: 5px 0 0 0;
}

.fileContainerOptions {
  margin: 15px 0 0 0;
  text-shadow: 1px 1px rgba(255, 255, 255, 0.25);
}

.fileContainerOptions a:not(:last-child) {
  margin: 0 20px 0 0;
}
css
An unknown person entered a girls' hostel located in Nellimarla mandal of Vizianagaram district. A social welfare junior college, a high school and the hostel are located on the same premises. Because of ongoing repair work, some of the windows there had been removed, and the person entered the hostel through these openings. On learning of the incident, the police reached the college and began their investigation into the matter.
english
// Log a message to the extension's background page console under a prefix
// banner, using the given console method ('log', 'warn', 'error', ...).
export default function Log (prefix, msg, type = 'log') {
  const backPage = chrome.extension.getBackgroundPage()
  backPage.console[type]('----' + prefix + '--------------')
  backPage.console[type](msg)
  // When called from a page other than the background page itself,
  // also echo the message to the local console.
  if (backPage !== window) {
    console[type](msg)
  }
}
javascript