Talent development firm NIIT has partnered with the Institute of Electrical and Electronics Engineers (IEEE) with the aim of training 40,000 engineering graduates over the next three years. IEEE is the world’s largest professional association dedicated to advancing technological innovation. It has more than 2,190 chapters and counts major tech firms like AMD, AT&T and Cisco as corporate members. “Although nearly 1.3 lakh students study in universities in India, around 80 per cent of these graduates are considered not job-ready by employers,” NIIT Executive Director P Rajendran told reporters here. He added that through this course, NIIT will bring SWEBOK and other IEEE educational and skill development programmes to NIIT’s over 500 centres. “With an aim to bring engineers at par with the global industry standards, we plan to train over 40,000 engineering graduates in SWEBOK training programs in the next three years. In the first year, we expect to train 5,000-8,000 and go higher from there,” NIIT Futurz President Amitabh Lahiri said. The company is working out the details of the centres where the course will be available. It is expected to be rolled out from December onwards. According to a report by Aspiring Minds Assessment, only 18.43 per cent of engineers are employable for the software engineer-IT services role, while just 3.95 per cent are trained to be directly deployed on projects. For core roles in mechanical, electronics/electrical and civil engineering, only 7.49 per cent of engineers are employable. “By forging this strategic partnership with NIIT, we look forward to taking the SWEBOK and other IEEE educational and skill development programs global. We are glad to have come together with NIIT to reduce the skill gap among engineering graduates throughout India,” IEEE President and CEO Roberto de Marca said. 
NIIT and the IEEE Computer Society will provide three courses based on the SWEBOK certification — the SWEBOK Certificate Program (SCP), the Certified Software Development Associate (CSDA) and the Certified Software Development Professional (CSDP).
english
Five years ago, the coders at DeepMind, a London-based artificial intelligence company, watched excitedly as an AI taught itself to play a classic arcade game. They’d used the hot technique of the day, deep learning, on a seemingly whimsical task: mastering Breakout, the Atari game in which you bounce a ball at a wall of bricks, trying to make each one vanish. Deep learning is self-education for machines; you feed an AI huge amounts of data, and eventually it begins to discern patterns all by itself. In this case, the data was the activity on the screen—blocky pixels representing the bricks, the ball, and the player’s paddle. The DeepMind AI, a so-called neural network made up of layered algorithms, wasn’t programmed with any knowledge about how Breakout works, its rules, its goals, or even how to play it. The coders just let the neural net examine the results of each action, each bounce of the ball. Where would it lead? To some very impressive skills, it turns out. During the first few games, the AI flailed around. But after playing a few hundred times, it had begun accurately bouncing the ball. By the 600th game, the neural net was using a more expert move employed by human Breakout players, chipping through an entire column of bricks and setting the ball bouncing merrily along the top of the wall. “That was a big surprise for us,” Demis Hassabis, CEO of DeepMind, said at the time. “The strategy completely emerged from the underlying system.” The AI had shown itself capable of what seemed to be an unusually subtle piece of humanlike thinking, a grasping of the inherent concepts behind Breakout. Because neural nets loosely mirror the structure of the human brain, the theory was that they should mimic, in some respects, our own style of cognition. This moment seemed to serve as proof that the theory was right. Then, last year, computer scientists at Vicarious, an AI firm in San Francisco, offered an interesting reality check. 
They took an AI like the one used by DeepMind and trained it on Breakout. It played great. But then they slightly tweaked the layout of the game. They lifted the paddle up higher in one iteration; in another, they added an unbreakable area in the center of the blocks. A human player would be able to quickly adapt to these changes; the neural net couldn’t. The seemingly supersmart AI could play only the exact style of Breakout it had spent hundreds of games mastering. It couldn’t handle something new. “We humans are not just pattern recognizers,” Dileep George, a computer scientist who cofounded Vicarious, tells me. “We’re also building models about the things we see. And these are causal models—we understand about cause and effect.” Humans engage in reasoning, making logical inferences about the world around us; we have a store of common-sense knowledge that helps us figure out new situations. When we see a game of Breakout that’s a little different from the one we just played, we realize it’s likely to have mostly the same rules and goals. The neural net, on the other hand, hadn’t understood anything about Breakout. All it could do was follow the pattern. When the pattern changed, it was helpless. Deep learning is the reigning monarch of AI. In the six years since it exploded into the mainstream, it has become the dominant way to help machines sense and perceive the world around them. It powers Alexa’s speech recognition, Waymo’s self-driving cars, and Google’s on-the-fly translations. Uber is in some respects a giant optimization problem, using machine learning to figure out where riders will need cars. Baidu, the Chinese tech giant, has more than 2,000 engineers cranking away on neural net AI. For years, it seemed as though deep learning would only keep getting better, leading inexorably to a machine with the fluid, supple intelligence of a person. But some heretics argue that deep learning is hitting a wall. 
They say that, on its own, it’ll never produce generalized intelligence, because truly humanlike intelligence isn’t just pattern recognition. We need to start figuring out how to imbue AI with everyday common sense, the stuff of human smarts. If we don’t, they warn, we’ll keep bumping up against the limits of deep learning, like visual-recognition systems that can be easily fooled by changing a few inputs, making a deep-learning model think a turtle is a gun. But if we succeed, they say, we’ll witness an explosion of safer, more useful devices—health care robots that navigate a cluttered home, fraud detection systems that don’t trip on false positives, medical breakthroughs powered by machines that ponder cause and effect in disease. But what does true reasoning look like in a machine? And if deep learning can’t get us there, what can? Gary Marcus is a pensive, bespectacled 48-year-old professor of psychology and neuroscience at New York University, and he’s probably the most famous apostate of orthodox deep learning. The strategy behind deep learning is the same today as it was in the field’s early days. Say you wanted a machine to teach itself to recognize daisies. First you’d code some algorithmic “neurons,” connecting them in layers like a sandwich (when you use several layers, the sandwich gets thicker, or deep—hence “deep” learning). You’d show an image of a daisy to the first layer, and its neurons would fire or not fire based on whether the image resembled the examples of daisies it had seen before. The signal would move on to the next layer, where the process would be repeated. Eventually, the layers would winnow down to one final verdict. At first, the neural net is just guessing blindly; it starts life as a blank slate, more or less. The key is to establish a useful feedback loop. Every time the AI misses a daisy, the training process weakens the neural connections that led to the incorrect guess; every time it succeeds, it strengthens them. 
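That feedback loop can be sketched in a few lines of code. This toy example is purely illustrative: it uses a single “neuron” (a classic perceptron update) rather than the many layers and gradient-based training of a real deep net, and the two “daisy features” are invented for the sketch. The strengthen-on-success, weaken-on-failure idea is the same, though.

```python
# Toy sketch of the train-by-feedback idea: a single "neuron" learns to
# separate daisies from non-daisies. Features are hypothetical, e.g.
# [petal whiteness, center yellowness], each scaled 0..1.

def train(examples, labels, epochs=20, lr=0.1):
    w = [0.0, 0.0]   # the net starts life as a blank slate
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):
            guess = 1 if (w[0]*x[0] + w[1]*x[1] + b) > 0 else 0
            err = y - guess              # the feedback signal
            w[0] += lr * err * x[0]      # strengthen or weaken each link
            w[1] += lr * err * x[1]
            b    += lr * err
    return w, b

def predict(w, b, x):
    return 1 if (w[0]*x[0] + w[1]*x[1] + b) > 0 else 0

# Four toy training images: two daisies (label 1), two non-daisies (0).
X = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
y = [1, 1, 0, 0]
w, b = train(X, y)
```

Given enough repetitions, the weights settle into a pattern of “daisy-ness” that classifies new examples it was never shown.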
Given enough time and enough daisies, the neural net gets more accurate. It learns to intuit some pattern of daisy-ness that lets it detect the daisy (and not the sunflower or aster) each time. As the years went on, this core idea—start with a naive network and train by repetition—was improved upon and seemed useful nearly anywhere it was applied. But Marcus was never convinced. For him, the problem is the blank slate: It assumes that humans build their intelligence purely by observing the world around them, and that machines can too. But Marcus doesn’t think that’s how humans work. He walks the intellectual path laid down by Noam Chomsky, who argued that humans are born wired to learn, programmed to master language and interpret the physical world. For all their supposed braininess, he notes, neural nets don’t appear to work the way human brains do. For starters, they’re much too data-hungry. In most cases, each neural net requires thousands or millions of examples to learn from. Worse, each time you want a neural net to recognize a new type of item, you have to start from scratch. A neural net trained to recognize only canaries isn’t of any use in recognizing, say, birdsong or human speech. “We don’t need massive amounts of data to learn,” Marcus says. His kids didn’t need to see a million cars before they could recognize one. Better yet, they can generalize; when they see a tractor for the first time, they understand that it’s sort of like a car. They can also engage in counterfactuals. Google Translate can produce the French equivalent of the English sentence “The glass was pushed, so it fell off the table.” But it doesn’t know what the words mean, so it couldn’t tell you what would happen if the glass weren’t pushed. Humans, Marcus notes, grasp not just the patterns of grammar but the logic behind it. You could give a young child a fake verb like pilk, and she’d likely be able to reason that the past tense would be pilked. 
She hasn’t seen that word before, of course. She hasn’t been “trained” on it. She has just intuited some logic about how language works and can apply it to a new situation. “These deep-learning systems don’t know how to integrate abstract knowledge,” says Marcus, who founded a company that created AI to learn with less data (and sold the company to Uber in 2016). Earlier this year, Marcus published a white paper on arXiv, arguing that, without some new approaches, deep learning might never get past its current limitations. What it needs is a boost—rules that supplement or are built in to help it reason about the world. Oren Etzioni is a smiling bear of a guy. A computer scientist who runs the Allen Institute for Artificial Intelligence in Seattle, he greets me in his bright office wearing jeans and a salmon-colored shirt, ushering me in past a whiteboard scrawled with musings about machine intelligence. (“DEFINE SUCCESS,” “WHAT’S THE TASK?”) Outside, in the sun-drenched main room of the institute, young AI researchers pad around sylphlike, headphones attached, quietly pecking at keyboards. Etzioni and his team are working on the common-sense problem. He defines it in the context of two legendary AI moments—the trouncing of the chess grandmaster Garry Kasparov by IBM’s Deep Blue in 1997 and the equally shocking defeat of the world’s top Go player by DeepMind’s AlphaGo last year. (Google bought DeepMind in 2014.) “With Deep Blue we had a program that would make a superhuman chess move—while the room was on fire,” Etzioni jokes. “Right? Completely lacking context. Fast-forward 20 years, we’ve got a computer that can make a superhuman Go move—while the room is on fire.” Humans, of course, do not have this limitation. His team plays weekly games of bughouse chess, and if a fire broke out the humans would pull the alarm and run for the doors. 
Humans, in other words, possess a base of knowledge about the world (fire burns things) mixed with the ability to reason about it (you should try to move away from an out-of-control fire). For AI to truly think like people, we need to teach it the stuff that everyone knows, like physics (balls tossed in the air will fall) or the relative sizes of things (an elephant can’t fit in a bathtub). Until AI possesses these basic concepts, Etzioni figures, it won’t be able to reason. With an infusion of hundreds of millions of dollars from Paul Allen, Etzioni and his team are trying to develop a layer of common-sense reasoning to work with the existing style of neural net. (The Allen Institute is a nonprofit, so everything they discover will be published, for anyone to use.) The first problem they face is answering the question, What is common sense? Etzioni describes it as all the knowledge about the world that we take for granted but rarely state out loud. He and his colleagues have created a set of benchmark questions that a truly reasoning AI ought to be able to answer: If I put my socks in a drawer, will they be there tomorrow? If I stomp on someone’s toe, will they be mad? One way to get this knowledge is to extract it from people. Etzioni’s lab is paying crowdsourced humans on Amazon Mechanical Turk to help craft common-sense statements. The team then uses various machine-learning techniques—some old-school statistical analyses, some deep-learning neural nets—to draw lessons from those statements. If they do it right, Etzioni believes they can produce reusable Lego bricks of computer reasoning: One set that understands written words, one that grasps physics, and so on. Yejin Choi, one of Etzioni’s leading common-sense scientists, has led several of these crowdsourced efforts. In one project, she wanted to develop an AI that would understand the intent or emotion implied by a person’s actions or statements. 
She started by examining thousands of online stories, blogs, and idiom entries in Wiktionary and extracting “phrasal events,” such as the sentence “Jeff punches Roger’s lights out.” Then she’d anonymize each phrase—“Person X punches Person Y’s lights out”—and ask the Turkers to describe the intent of Person X: Why did they do that? When she had gathered 25,000 of these marked-up sentences, she used them to train a machine-learning system to analyze sentences it had never seen before and infer the emotion or intent of the subject. At best, the new system worked only half the time. But when it did, it evinced some very humanlike perception: Given a sentence like “Oren cooked Thanksgiving dinner,” it predicted that Oren was trying to impress his family. “We can also reason about others’ reactions, even if they’re not mentioned,” Choi notes. “So X’s family probably feel impressed and loved.” Another system her team built used Turkers to mark up the psychological states of people in stories; the resulting system could also draw some sharp inferences when given a new situation. It was told, for instance, about a music instructor getting angry at his band’s lousy performance and that “the instructor was furious and threw his chair.” The AI predicted that the musicians would “feel fear afterwards,” even though the story doesn’t explicitly say so. Choi, Etzioni, and their colleagues aren’t abandoning deep learning. Indeed, they regard it as a very useful tool. But they don’t think there is a shortcut to the laborious task of coaxing people to explicitly state the weird, invisible, implied knowledge we all possess. Deep learning is garbage in, garbage out. Merely feeding a neural net tons of news articles isn’t enough, because it wouldn’t pick up on the unstated knowledge, the obvious stuff that writers didn’t bother to mention. 
As Choi puts it, “People don’t say ‘My house is bigger than me.’ ” To help tackle this problem, she had the Turkers analyze the physical relationships implied by 1,100 common verbs, such as “X threw Y.” That, in turn, allowed for a simple statistical model that could take the sentence “Oren threw the ball” and infer that the ball must be smaller than Oren. Another challenge is visual reasoning. Aniruddha Kembhavi, another of Etzioni’s AI scientists, shows me a virtual robot wandering around an onscreen house. Other Allen Institute scientists built the Sims-like house, filling it with everyday items and realistic physics—kitchen cupboards full of dishes, couches that can be pushed around. Then they designed the robot, which looks like a dark gray garbage canister with arms, and told it to hunt down certain items. After thousands of tasks, the neural net gains a basic grounding in real-life facts. Etzioni and his colleagues hope that these various components—Choi’s language reasoning, the visual thinking, other work they’re doing on getting an AI to grasp textbook science information—can all eventually be combined. But how long will it take, and what will the final products look like? They don’t know. The common-sense systems they’re building still make mistakes, sometimes more than half the time. Choi estimates she’ll need around a million crowdsourced human statements as she trains her various language-parsing AIs. Building common sense, it would seem, is uncommonly hard. There are other pathways to making machines that reason, and they’re even more labor-intensive. For example, you could simply sit down and write out, by hand, all the rules that tell a machine how the world works. This is how Doug Lenat’s Cyc project works. 
For 34 years, Lenat has employed a team of engineers and philosophers to code 25 million rules of general common sense, like “water is wet” or “most people know the first names of their friends.” This lets Cyc deduce things: “Your shirt is wet, so you were probably in the rain.” The advantage is that Lenat has exquisite control over what goes into Cyc’s database; that isn’t true of crowdsourced knowledge. Brute-force, handcrafted AI has become unfashionable in the world of deep learning. That’s partly because it can be “brittle”: Without the right rules about the world, the AI can get flummoxed. This is why scripted chatbots are so frustrating; if they haven’t been explicitly told how to answer a question, they have no way to reason it out. Cyc is enormously more capable than a chatbot and has been licensed for use in health care systems, financial services, and military projects. But the work is achingly slow, and it’s expensive. Lenat says it has cost around $200 million to develop Cyc. Vicarious’ Breakout-playing AI offers a glimpse of a middle path. As it played, the system developed the ability to weigh different courses of action and their likely outcomes. This worked in reverse too: if the AI wanted to break a block in the far left corner of the screen, it reasoned that the paddle should go to the far right corner. Crucially, this meant that when Vicarious changed the layout of the game—adding new bricks or raising the paddle—the system compensated. It appeared to have extracted some general understanding about Breakout itself. Granted, there are trade-offs in this type of AI engineering. It’s arguably more painstaking to craft and takes careful planning to figure out precisely what foreordained logic to feed into the system. It’s also hard to strike the right balance of speed and accuracy when designing a new system. George says he looks for the minimum set of data “to put into the model so it can learn quickly.” The fewer assumptions you need, the more efficiently the machine will make decisions. 
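The hand-coded-rules idea behind Cyc can be sketched, in miniature, as a forward-chaining rule engine: explicit if-these-facts-then-that rules applied repeatedly until no new facts emerge. The rules below are hypothetical toys; Cyc’s real knowledge base runs to millions of rules in a far richer logic.

```python
# Minimal forward-chaining sketch of rule-based deduction.
# Each rule: (set of premises, conclusion). If every premise is a known
# fact, the conclusion becomes a fact too -- and may trigger more rules.

RULES = [
    ({"shirt is wet", "was outdoors"}, "was probably in the rain"),
    ({"was probably in the rain"}, "should dry off"),
]

def deduce(facts):
    """Apply rules repeatedly until the set of facts stops growing."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(deduce({"shirt is wet", "was outdoors"}))
```

Starting from “shirt is wet” and “was outdoors,” the engine derives “was probably in the rain” and then, by chaining, “should dry off.” The brittleness the article describes also shows up here: with no matching rule, the system simply deduces nothing.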
Once you’ve trained a deep-learning model to recognize cats, you can show it a Russian blue it has never seen and it renders the verdict—it’s a cat!—almost instantaneously. Having processed millions of photos, it knows not only what makes a cat a cat but also the fastest way to identify one. In contrast, Vicarious’ style of AI is slower, because it’s actively making logical inferences as it goes. When the Vicarious AI works well, it can learn from much less data. George’s team created an AI to bust captchas, those “I’m not a robot” obstacles online, by recognizing characters in spite of their distorted, warped appearance. Much as with the Breakout system, they endowed their AI with some abilities up front, such as knowledge that helps it discern the likely edges of characters. With that bootstrapping in place, they only needed to train the AI on 260 images before it learned to break captchas with 90.4 percent accuracy. In contrast, a neural net needed to be trained on more than 2.3 million images before it could break a captcha. Others are building common-sense-like structure into neural nets in different ways. Two researchers at DeepMind, for instance, recently created a hybrid system—part deep learning, part a more traditional technique known as inductive logic programming. The goal was to produce something that could do mathematical reasoning. Yann LeCun, a deep-learning pioneer and the current head of Facebook’s AI research wing, agrees with many of the new critiques of the field. He acknowledges that it requires too much training data, that it can’t reason, that it doesn’t have common sense. “I’ve been basically saying this over and over again for the past four years,” he reminds me. But he remains steadfast that deep learning, properly crafted, can provide the answer. He disagrees with the Chomskyite vision of human intelligence. He thinks human brains develop the ability to reason solely through interaction, not built-in rules. 
“If you think about how animals and babies learn, there’s a lot of things that are learned in the first few minutes, hours, days of life that seem to be done so fast that it looks like they are hardwired,” he notes. “But in fact they don’t need to be hardwired, because they can be learned so quickly.” In this view, to figure out the physics of the world, a baby just moves its head around, data-crunches the incoming imagery, and concludes that, hey, depth of field is a thing. Still, LeCun admits it’s not yet clear which routes will help deep learning get past its humps. It might be “adversarial” neural nets, a relatively new technique in which one neural net tries to fool another neural net with fake data—forcing the second one to develop extremely subtle internal representations of pictures, sounds, and other inputs. The advantage here is that you don’t have the “data hungriness” problem. You don’t need to collect millions of data points on which to train the neural nets, because they’re learning by studying each other. (Apocalyptic side note: A similar method is being used to create those profoundly troubling “deepfake” videos in which someone appears to be saying or doing something they are not.) I met LeCun at the offices of Facebook’s AI lab in New York. Mark Zuckerberg recruited him in 2013, with the promise that the lab’s goal would be to push the limits of ambitious AI, not just produce minor tweaks for Facebook’s products. Like an academic lab, LeCun and his researchers publish their work for others to access. LeCun, who retains the rich accent of his native France and has a Bride of Frankenstein shock of white in his thick mass of dark hair, stood at a whiteboard energetically sketching out theories of possible deep-learning advances. On the facing wall was a set of gorgeous paintings from Stanley Kubrick’s 2001: A Space Odyssey—the main spaceship floating in deep space, the wheel-like ship orbiting Earth. 
“Oh, yes,” LeCun said, when I pointed them out; they were reprints of artwork Kubrick commissioned for the movie. It was weirdly unsettling to discuss humanlike AI with those images around, because of course HAL 9000, the humanlike AI in 2001, turns out to be a highly efficient murderer. And this pointed to a deeper philosophical question that floats over the whole debate: Is smarter AI even a good idea? Vicarious’ system cracked captcha, but the whole point of captcha is to prevent bots from impersonating humans. Some AI thinkers worry that the ability to talk to humans and understand their psychology could make a rogue AI incredibly dangerous. Nick Bostrom at the University of Oxford has sounded the alarm about the dangers of creating a “superintelligence,” an AI that self-improves and rapidly outstrips humanity, able to outthink and outflank us in every way. (One way he suggests it might amass control is by manipulating people—something for which possessing a “theory of mind” would be quite useful.) Elon Musk is sufficiently convinced of this danger that he has funded OpenAI, an organization dedicated to the notion of safe AI. This future doesn’t keep Etzioni up at night. He’s not worried about AI becoming maliciously superintelligent. “We’re worried about something taking over the world,” he scoffs, “that can’t even on its own decide to play chess again.” It’s not clear how an AI would develop a desire to do so or what that desire would look like in software. Deep learning can conquer chess, but it has no inborn will to play. What does concern him is that current AI is woefully inept. So while we might not be creating HAL with a self-preserving intelligence, an “inept AI attached to deadly weapons can easily kill,” he says. This is partly why Etzioni is so determined to give AI some common sense. 
Ultimately, he argues, it will make AI safer; the idea that humanity shouldn’t be wholesale slaughtered is, of course, arguably a piece of common-sense knowledge itself. (Part of the Allen Institute’s mandate is to make AI safer by making it more reasonable.) Etzioni notes that the dystopic sci-fi visions of AI are less risky than near-term economic displacement. The better AI gets at common sense, the more rapidly it’ll take over jobs that currently are too hard for mere pattern-matching deep learning: drivers, cashiers, managers, analysts of all stripes, and even (alas) journalists. But truly reasoning AI could wreak havoc even beyond the economy. Imagine how good political disinformation bots would be if they could use common-sense knowledge to appear indistinguishably human on Twitter or Facebook or in mass phone calls. Sitting beneath the images from 2001, LeCun makes a bit of a heretical point himself. Sure, making artificial intelligence more humanlike helps AI to navigate our world. But directly replicating human styles of thought? It’s not clear that’d be useful. We already have humans who can think like humans; maybe the value of smart machines is that they are quite alien from us. “They will tend to be more useful if they have capabilities we don’t have,” he tells me. “Then they’ll become an amplifier for intelligence. So to some extent you want them to have a nonhuman form of intelligence ... You want them to be more rational than humans.” In other words, maybe it’s worth keeping artificial intelligence a little bit artificial. Clive Thompson (@pomeranian99) is a columnist for WIRED. This article appears in the December issue.
english
{"id": 9083, "name": "<NAME> passage", "qualified_name": "Seal of passage", "examine": "A seal of passage issued by Brundt the Chieftain of the Fremennik.", "members": true, "release_date": "2006-07-24", "quest": true, "weight": 4.535, "value": 1, "tradeable": false, "stackable": false, "noteable": false, "equipable": true, "tradeable_ge": false, "icon": "<KEY>", "wiki_url": "https://oldschool.runescape.wiki/w/Seal_of_passage", "equipment": {"slot": "neck", "requirements": [], "bonuses": {"attack": {"stab": 0, "slash": 0, "crush": 0, "magic": 0, "ranged": 0}, "defence": {"stab": 0, "slash": 0, "crush": 0, "magic": 0, "ranged": 0}, "strength": {"melee": 0, "ranged": 0, "magic": 0}, "prayer": 0}}}
json
.content-header{ margin-bottom: 20px; } ul.right-links{ position: absolute; right: 20px; top: 15px; } ul.right-links>li> a.btn{ color: #FFF; padding:7px 15px 7px 15px; margin: 0px; } ul.right-links>li{ margin: 0px; padding: 0px 5px 0px 0px; } .form-control{ height: 40px; } .with-border-sp{ border-top: 1px solid #e6e6e6; margin-top: 10px; padding-top: 20px; } #basicToggle{ display:none;} #items td{ vertical-align: middle; } .img-td{ text-align: center; } .reviews-summary-container{ margin:10px 0px 20px 0px; } .reviews-summary-container h2{ margin: 0px; font-size: 25px; } .reviews-summary-container h2 span{ font-size: 16px; margin-left: 20px; } .reviews-summary{ background: #FFF; margin:15px 0px 0px 0px; border: 1px solid #e6e6e6; padding: 0px 0px; display: flex; color: #4f4f4f; } .reviews-summary > [class^=col-md]{ padding: 20px 20px; } .reviews-summary >[class^=col-md]:first-child{ border-right: 1px solid #e6e6e6; } .reviews-summary-title{ font-size: 18px; font-weight: 600; display: block; margin-bottom: 15px; } .overall-rating{ font-size: 19px; } .overall-rating i.fa{ color: #ffc203; font-size: 17px; } .overall-rating span.fa-stack{ margin: 0px 30px 0px 0px; padding: 0px; display: inline-block; width: auto; } .overall-rating span.fa-stack i.fa-star{ color: #d3d3d4; } .overall-rating span.fa-stack i.fa-star-half{ } .reviews-summary-based{ color: #a6a7a7; font-size: 16px; } .reviews-summary .progress{ height: 8px; margin:8px 0px 0px 0px; background-color: #e6e6e6; } .reviews-summary .progress-bar{ background-color: #ffc203; } .reviews-summary-all [class^=col-md]{ padding-right: 0px; } .reviews-summary-all [class^=col-md]:nth-child(2){ padding-left: 20px; margin-left: -15px; } .reviews-summary-all{ margin-left: 10px; font-size: 15px; font-weight: 600; color: #747373; } .reviews-summary-all .row{ margin-top: 8px; } .review-show{ margin-left: 10px; margin-right: 10px; border-bottom: 1px solid #e6e6e6; padding-bottom: 15px; padding-top: 15px; color: #4d4c4c; position: 
relative; } .review-show h2{ margin: 0px; font-size: 20px; } .review-show-rate{ margin-bottom: 15px; } .review-show-rate i{ color: #337ab7; } .review-show-msg{ font-size: 13px; } .review-show-owner{ margin-top: 15px; font-size: 13px; } .review-show-owner span{ font-weight: bold; } .review-show-date{ position: absolute; right: 0px; top: 15px; color: #b4b3b3; } .review-helpfull{ position: absolute; bottom: 15px; right: 0px; } .review-helpfull{ font-weight: bold; font-size: 13px; } .review-helpfull a{ padding:2px 10px; margin-left: 10px; }
css
Nobel laureate Amartya Sen said the Citizenship (Amendment) Act, or CAA, violates constitutional provisions. "The CAA law that has been passed in my judgment should be turned down by the Supreme Court on the grounds of it being unconstitutional because you cannot have certain types of fundamental human rights linking citizenship with religious differences," Mr Sen told reporters at the Infosys Science Foundation's Infosys Prize 2019 in Bengaluru. The Nobel laureate said what really should matter for deciding citizenship is the place a person was born, and where the person has lived. "My reading of the (amended) law is that it violates the provision of the Constitution," he said, adding that citizenship on the basis of religion had been a matter of discussion in the constituent assembly where it was decided that "using religion for the purpose of discrimination of this kind will not be acceptable." Mr Sen, however, agreed that a Hindu who is persecuted in a country outside India deserves sympathy and his or her case must be taken into account. "It (consideration for citizenship) has to be independent of religion but take cognisance of the sufferings and other issues into account," Mr Sen said. On the mob attack at Delhi's Jawaharlal Nehru University (JNU), Mr Sen noted the university administration could not stop outsiders from coming to the campus to lead the attack. "The communication between the university administration and the police got delayed due to which ill treatment of students went on without being prevented by the law enforcement agencies," he added.
english
<filename>package.json<gh_stars>1-10 { "name": "geojson-multiply", "version": "1.0.3", "description": "project starter for npm package", "main": "index.js", "scripts": { "test": "mocha" }, "repository": { "type": "git", "url": "git+https://github.com/haoliangyu/geojson-multiply.git" }, "keywords": [ "geojson", "gis" ], "author": "<NAME> <<EMAIL>>", "license": "MIT", "bugs": { "url": "https://github.com/haoliangyu/geojson-multiply/issues" }, "homepage": "https://github.com/haoliangyu/geojson-multiply#readme", "devDependencies": { "chai": "^3.5.0", "eslint": "^2.9.0", "eslint-plugin-json": "^1.2.0", "mocha": "^2.4.5" } }
json
There is probably no mixed martial artist on the planet right now who has the same submission knowledge as Reinier de Ridder. The ONE middleweight world champion is arguably the most lethal submission artist in the sport today. He’s displayed his constricting ground game time and again during his run in ONE Championship. He’s achieved mythic status in the promotion through his sublime Brazilian jiu-jitsu, and he’ll prove that once again, but this time in submission grappling. De Ridder will take on BJJ phenom Tye Ruotolo in a submission grappling match at ONE Fight Night 10 on May 5 at 1stBank Center. The card is ONE Championship’s first on-site event in the United States and will be broadcast live and for free via Prime Video in North America. Before Reinier de Ridder heads to Colorado, let’s look back at his three best submission wins in ONE Championship. #3. Kiamrian Abbasov (ONE: Full Circle) Reinier de Ridder was on top of the world when he held both the ONE middleweight and light heavyweight world titles. After taking the 205-pound world title from Aung La N Sang, de Ridder was challenged by then-ONE welterweight world champion Kiamrian Abbasov for the middleweight belt at ONE: Full Circle. De Ridder, though, made Abbasov realize that he was far out of his element. Though he was fighting a fellow world champion, the Dutch superstar looked miles ahead of his competition. ‘The Dutch Knight’ was so utterly dominant that he submitted Abbasov twice during the match. The first one, of course, didn’t count since the Kyrgyzstani star tapped to the arm triangle choke after the bell sounded at the end of the second round. Once the third round started, de Ridder wasted no time in securing the fight-ending sequence. The judo and BJJ black belt took Abbasov down with a swift single-leg takedown before ending the match with an arm triangle choke. #2. 
Aung La N Sang I (ONE: Inside the Matrix) Aung La N Sang was seen as a ruthless force during his reign with the ONE middleweight and light heavyweight world titles from 2018 to 2020, and that was until Reinier de Ridder rode into the field. Befitting his imperious nickname, ‘The Dutch Knight’ marched through Aung La’s territory and challenged the Burmese legend for all the gold, starting with the ONE middleweight world championship. De Ridder challenged Aung La for middleweight gold at ONE: Inside the Matrix in October 2020, and put on one of the defining matches of his career. Quickly taking Aung La’s back barely a minute into the opening round, de Ridder held that position for almost three minutes before locking in a rear-naked choke that ‘The Burmese Python’ had no choice but to tap out to. De Ridder ultimately completed his golden conquest six months later when he took the ONE light heavyweight world title from Aung La at ONE on TNT IV to become ONE Championship’s third MMA double world champion. #1. Vitaly Bigdash (ONE 159) No one was stopping Reinier de Ridder during his exalted run as both the ONE middleweight and light heavyweight world champion. Even former middleweight king Vitaly Bigdash tried to usurp the throne at ONE 159, but his efforts ultimately became a footnote to the Dutch ruler’s regime. Bigdash even tried using de Ridder’s game to his advantage when he slapped on a tight guillotine choke in the first minute of the match. While it looked like he had de Ridder in trouble, the Russian slugger couldn’t hold on to the submission, which allowed the Dutch superstar to wriggle out into a dominant position. De Ridder took top position right after escaping the guillotine, before working his way into possibly the most sublime submission finish in ONE Championship history. Bigdash thought he got back to a dominant position more than three minutes into the match, but all he got was his neck trapped in a rare reverse triangle choke. 
Within seconds, Bigdash faded into unconsciousness while de Ridder successfully retained the ONE middleweight world title.
english
{
  "name": "alpha",
  "version": "1.0.0",
  "description": "A community driven 2D online multiplayer PIXEL platformer game",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/DEDAjs/alpha.git"
  },
  "keywords": [
    "2D",
    "platformer",
    "game"
  ],
  "author": "Liam & <NAME>",
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/DEDAjs/alpha/issues"
  },
  "homepage": "https://github.com/DEDAjs/alpha#readme",
  "dependencies": {
    "express": "^4.17.2",
    "gif-frames": "^1.0.1",
    "ws": "^8.4.0"
  }
}
json
The community that opposed the division of the state in a big way, and then benefited in Andhra Pradesh after the division, is now in an introspective mood. It has no political presence in Telangana. The only blessing in disguise is that the political establishment in Telangana has spared it so far, as it has become irrelevant. At this juncture, if someone writes defiant, hate-filled pieces challenging the government, in the vein of Andhra Jyothy RK's "No one is here to be afraid...", things will not remain the same. It is observed that the situation changed within just four days of such writings. No one knows what signals reached ABN's RK, but he became afraid and stopped levelling baseless allegations against the Telangana government. On top of that, the daily has started praising the government in glowing terms. According to sources, there is a big reason behind this change. It seems some community heads warned RK to keep quiet and use his head. The majority of their investments are in Telangana, and some of them have been running businesses by taking liberties that are not strictly legal. If the government wished to pursue them legally, they would become scapegoats. So they warned RK not to make himself a target and, collaterally, make them targets as well. "We had our turn at ruling in AP. You wrote whatever you wanted during the Kukatpally elections. Now we have no place in AP; our businesses are here in Telangana. Let us stay silent and let life move on. We need to learn the pro-Telangana line. On top of this, in these Corona times, businesses are running at a loss, and print media is not immune to it. With no government support in either state, you cannot survive, and we cannot help you in our way if you mess with the Telangana government," a big man from the community is said to have advised RK. As a result, the scene at Andhra Jyothy has changed, with no negative news in any corner.
english
package com.yilian.mall.ui; import android.app.Dialog; import android.content.Context; import android.content.DialogInterface; import android.content.Intent; import android.os.Bundle; import android.os.Handler; import android.os.Message; import android.text.Html; import android.text.Spanned; import android.text.TextUtils; import android.view.Display; import android.view.Gravity; import android.view.View; import android.view.ViewGroup; import android.view.Window; import android.view.WindowManager; import android.view.inputmethod.InputMethodManager; import android.widget.AdapterView; import android.widget.Button; import android.widget.ImageView; import android.widget.RelativeLayout; import android.widget.TextView; import com.alipay.sdk.app.PayTask; import com.google.gson.Gson; import com.lidroid.xutils.ViewUtils; import com.lidroid.xutils.exception.HttpException; import com.lidroid.xutils.http.ResponseInfo; import com.lidroid.xutils.http.callback.RequestCallBack; import com.lidroid.xutils.view.annotation.ViewInject; import com.orhanobut.logger.Logger; import com.tencent.mm.sdk.openapi.IWXAPI; import com.yilian.loginmodule.LeFenPhoneLoginActivity; import com.yilian.mall.BaseActivity; import com.yilian.mall.R; import com.yilian.mall.alipay.PayResult; import com.yilian.mall.entity.GoodsChargeForPayResultEntity; import com.yilian.mall.entity.MakeMallShoppingOrderEntity; import com.yilian.mall.entity.PaymentIndexEntity; import com.yilian.mall.http.MTNetRequest; import com.yilian.mall.http.MallNetRequest; import com.yilian.mall.http.MyIncomeNetRequest; import com.yilian.mall.http.PaymentNetRequest; import com.yilian.mall.http.WeiXinNetRequest; import com.yilian.mall.utils.CommonUtils; import com.yilian.mall.utils.MoneyUtil; import com.yilian.mall.utils.NumberFormat; import com.yilian.mall.utils.PreferenceUtils; import com.yilian.mall.utils.RequestOftenKey; import com.yilian.mall.utils.Toast; import 
com.yilian.mall.widgets.GridPasswordView; import com.yilian.mall.widgets.NoScrollListView; import com.yilian.mall.widgets.VersionDialog; import com.yilian.mylibrary.CheckServiceReturnEntityUtil; import com.yilian.mylibrary.Constants; import com.yilian.mylibrary.GlideUtil; import com.yilian.mylibrary.Ip; import com.yilian.mylibrary.ThreadUtil; import com.yilian.mylibrary.pay.PayFrom; import com.yilian.networkingmodule.entity.MyBalanceEntity2; import com.yilian.networkingmodule.entity.PayOkEntity; import com.yilian.networkingmodule.entity.PayTypeEntity; import com.yilian.networkingmodule.httpresult.HttpResultBean; import com.yilian.networkingmodule.retrofitutil.RetrofitUtils; import java.util.ArrayList; import java.util.HashMap; import java.util.Map; import retrofit2.Call; import retrofit2.Callback; import retrofit2.Response; import static com.yilian.mall.ui.CashDeskActivity2.PAY_FROM_TAG; /** * 收银台 * 该界面所有货币信息只是展示使用,支付时只是向服务器传递一个订单ID * 只有在充值时,才牵涉到货币金额的计算,也就是"moreOverLeBi"这个字段的计算 * 该页面已废弃 */ public class CashDeskActivity extends BaseActivity { private static final int SDK_PAY_FLAG = 1; private static final int SDK_CHECK_FLAG = 2; String order_total_lebi, orderCreateTime; String orderName; MallNetRequest mallNetRequest; String paymentIndex; String fee; float orderTotalLebi;//订单原本需要支付多少钱 单位分 IWXAPI api; @ViewInject(R.id.v3Back) private ImageView tvBack; @ViewInject(R.id.v3Title) private TextView tv_title; @ViewInject(R.id.tv_buy_price) private TextView tv_buy_price; @ViewInject(R.id.btn_surePay) private Button btnSurePay; @ViewInject(R.id.tv_usable_money) private TextView tvUsableMoney; @ViewInject(R.id.btn_whether_use_money) private ImageView btnWhetherUseMoney; @ViewInject(R.id.lv_pay_type) private NoScrollListView nslvPayType; private Boolean whetheruseMoney = true;//是否使用奖励支付 private PayFrom type;//支付来源(商品详情、购物车、商品订单、WebView、套餐支付、套餐店内消费) //支付订单的订单ID private String orderIndex; private ArrayList<MakeMallShoppingOrderEntity.MakeMallShopping> list; private int 
getPayResultTimes = 0; private GoodsChargeForPayResultEntity result; private String phone; private String addressId;//团购转配送时的type=7时,需要传入此字段 private String totalTotalBeans; private PayDialog paydialog; private int orderNumber = 0;//订单数量 private float userMoney;//用户奖励 private String payType1;//1商城订单 2 商家入驻或续费 3店内支付 6套餐 private Handler handler = new Handler() { @Override public void handleMessage(Message msg) { super.handleMessage(msg); switch (msg.what) { case SDK_PAY_FLAG: Logger.i("msg.obj" + msg.obj.toString() + " msg.obj:" + msg.obj); PayResult payResult = new PayResult((Map<String, String>) msg.obj); /**` * 同步返回的结果必须放置到服务端进行验证(验证的规则请看https://doc.open.alipay.com/doc2/ * detail.htm?spm=0.0.0.0.xdvAU6&treeId=59&articleId=103665& * docType=1) 建议商户依赖异步通知 */ String resultInfo = payResult.getResult();// 同步返回需要验证的信息 String resultStatus = payResult.getResultStatus(); Logger.i("resultStatus " + resultStatus); // 判断resultStatus 为“9000”则代表支付成功,具体状态码代表含义可参考接口文档 if (TextUtils.equals(resultStatus, "9000")) { getPayResult(); } else { // 判断resultStatus 为非"9000"则代表可能支付失败 // "8000"代表支付结果因为支付渠道原因或者系统原因还在等待支付结果确认,最终交易是否成功以服务端异步通知为准(小概率状态) if (TextUtils.equals(resultStatus, "8000")) { Toast.makeText(mContext, "支付结果确认中", Toast.LENGTH_SHORT).show(); } else if (TextUtils.equals(resultStatus, "4000")) { Toast.makeText(mContext, "请安装支付宝插件", Toast.LENGTH_SHORT).show(); } else { // 其他值就可以判断为支付失败,包括用户主动取消支付,或者系统返回的错误 Toast.makeText(mContext, "支付失败", Toast.LENGTH_SHORT).show(); } sp.edit().putString("lebiPay", "false").commit(); } break; case SDK_CHECK_FLAG: { Toast.makeText(mContext, "检查结果为:" + msg.obj, Toast.LENGTH_SHORT).show(); break; } default: break; } } }; private float moreOverLeBi;//还需要充值多少钱 单位分 private Boolean moneyEnough = true;//用户奖励是否足够 private PaymentNetRequest paymentNetRequest; private PayFragmentAdapter payTypeadapter; private ArrayList<PayTypeEntity.PayData> payList; //1支付宝 2微信 3微信公共账号 4网银 private String payType; private MyIncomeNetRequest myIncomeNetRequest; private int 
selectedPosition = -1; private WeiXinNetRequest weiXinNetRequest; private void getPayResult() { if (mallNetRequest == null) { mallNetRequest = new MallNetRequest(mContext); } Logger.i("获取支付结果OrderId:" + orderIndex + " payType1:" + payType1); mallNetRequest.getPayResult(orderIndex, payType1, new RequestCallBack<GoodsChargeForPayResultEntity>() { @Override public void onStart() { super.onStart(); startMyDialog(); } @Override public void onSuccess(ResponseInfo<GoodsChargeForPayResultEntity> responseInfo) { result = responseInfo.result; switch (result.code) { case 1: PreferenceUtils.writeBoolConfig(Constants.REFRESH_USER_FRAGMENT, true, mContext); totalTotalBeans = result.totalTotalBean; jumpToNOrderPaySuccessActivity(result.lebi,result.dealTime, result.returnBean, result.subsidy); break; case -100://10秒钟轮询接口pay_info五次 if (getPayResultTimes < 5) { ThreadUtil.getThreadPollProxy().execute(new Runnable() { @Override public void run() { try { Thread.sleep(2000); getPayResult(); getPayResultTimes++; } catch (InterruptedException e) { e.printStackTrace(); } } }); } break; default: showToast(result.msg); break; } stopMyDialog(); } @Override public void onFailure(HttpException e, String s) { stopMyDialog(); showToast(R.string.net_work_not_available); } }); } @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.activity_jp_cash_desk); ViewUtils.inject(this); initView(); initData(); initListener(); } @Override protected void onResume() { super.onResume(); phone = sp.getString(Constants.SPKEY_PHONE, ""); } private void initView() { /** * 本页面用于结算的数据,需要从上个页面传递过来的有: order_total_lebi String类型 订单所需金额 “payType”String类型 1乐享币直充 3商城订单 4.付款给商家支付(套餐店内消费) 5兑换中心扫码支付 6商家套餐支付 “orderIndex”String类型 订单ID,多个订单ID使用英文逗号连接,店内消费不传 “type” String类型 该值决定是从哪里跳转过来的,从而决定怎么处理支付金额 */ tv_title.setText("确认支付"); Intent intent = getIntent(); type = (PayFrom) intent.getSerializableExtra(PAY_FROM_TAG); order_total_lebi = 
intent.getStringExtra("order_total_lebi"); orderTotalLebi = Integer.valueOf(order_total_lebi); addressId = intent.getStringExtra("addressId"); orderCreateTime = intent.getStringExtra("orderCreateTime"); payType1 = intent.getStringExtra("payType");//1商城订单 2 商家入驻或续费 3店内支付 sp.edit().putString("payType1", payType1).commit(); /* the three original if/else branches were identical, so they are collapsed into a single assignment */ Spanned spanned = Html.fromHtml("<font color='#fe5062'><small><small>¥</small></small>" + MoneyUtil.getLeXiangBi(order_total_lebi) + "</color></font>"); tv_buy_price.setText(spanned); RequestOftenKey.getUserInfor(mContext, sp);//更新本地用户信息 switch (type) { case GOODS_DETAIL: orderIndex = intent.getStringExtra("orderIndex"); orderNumber = 1; break; case GOODS_SHOPPING_CART: //从购物车过来 list = (ArrayList<MakeMallShoppingOrderEntity.MakeMallShopping>) intent.getSerializableExtra("list"); orderIndex = ""; for (MakeMallShoppingOrderEntity.MakeMallShopping data : list ) { orderIndex += data.orderIndex + ","; Logger.i("收银台接收的orderIndex:" + data.orderIndex); } orderIndex = orderIndex.substring(0, orderIndex.length() - 1); orderNumber = list.size(); break; case GOODS_ORDER: //从订单过来 orderIndex = intent.getStringExtra("orderIndex"); orderNumber = 1; break; default: break; } Logger.i("接收到的订单号:" + orderIndex); } private void initData() { getUserBalance(); } private void initListener() { nslvPayType.setOnItemClickListener(new AdapterView.OnItemClickListener() { @Override public void onItemClick(AdapterView<?> parent, View view, int position, long id) { PayTypeEntity.PayData itemAtPosition = (PayTypeEntity.PayData) 
parent.getItemAtPosition(position); if ("1".equals(itemAtPosition.isuse)) { if (whetheruseMoney && moneyEnough) { // return;//如果选择奖励支付,且奖励足够,则不能选择充值方式 whetheruseMoney = !whetheruseMoney; btnWhetherUseMoney.setImageResource(R.mipmap.library_module_cash_desk_off); moreOverLeBi = orderTotalLebi;//需要支付的金额 和 用户奖励 的差额 即需要另外充值的金额 (未处理的金额) btnSurePay.setText("确认支付(还需支付" + MoneyUtil.set¥Money(MoneyUtil.getLeXiangBi(moreOverLeBi)) + ")"); } payType = String.valueOf(itemAtPosition.payType); selectedPosition = position; payTypeadapter.notifyDataSetChanged(); } else { showToast("支付方式暂不可用"); } } }); btnWhetherUseMoney.setOnClickListener(new View.OnClickListener() { @Override public void onClick(View v) { if (payList != null && payList.size() > 0) { whetheruseMoney = !whetheruseMoney; if (whetheruseMoney) {//使用奖励 btnWhetherUseMoney.setImageResource(R.mipmap.library_module_cash_desk_on); if (orderTotalLebi > userMoney) {//奖励不足 //默认选中支付方式列表第一个支付方式 if (payList.size() > 0) { if ("1".equals(payList.get(0).isuse)) { payType = String.valueOf(payList.get(0).payType); selectedPosition = 0; payTypeadapter.notifyDataSetChanged(); } } moreOverLeBi = orderTotalLebi - userMoney;//需要支付的金额 和 用户奖励 的差额 即需要另外充值的金额 btnSurePay.setText("确认支付(还需支付" + MoneyUtil.set¥Money(MoneyUtil.getLeXiangBi(moreOverLeBi)) + ")"); } else {//奖励足够 selectedPosition = -1; payTypeadapter.notifyDataSetChanged(); btnSurePay.setText("立即支付"); } } else {//不使用奖励 //默认选中支付方式列表第一个支付方式 if (payList.size() > 0) { if ("1".equals(payList.get(0).isuse)) { payType = String.valueOf(payList.get(0).payType); selectedPosition = 0; payTypeadapter.notifyDataSetChanged(); } } btnWhetherUseMoney.setImageResource(R.mipmap.library_module_cash_desk_off); moreOverLeBi = orderTotalLebi;//需要支付的金额 和 用户奖励 的差额 即需要另外充值的金额 (未处理的金额) btnSurePay.setText("确认支付(还需支付" + MoneyUtil.set¥Money(MoneyUtil.getLeXiangBi(moreOverLeBi)) + ")"); } } } }); btnSurePay.setOnClickListener(new View.OnClickListener() { @Override public void onClick(View v) { if ((moneyEnough && 
whetheruseMoney) || moreOverLeBi == 0) { //如果奖励足够,且使用奖励支付 pay(); } else { //1:使用奖励且奖励不足 2:不使用奖励且奖励不足 3:不使用奖励且奖励足够且支付乐享币不为0 这三种情况都走充值,充值金额是moreOverLeBi,充值成功后订单结算由服务器完成 if (payList != null) { if ("1".equals(payList.get(selectedPosition).isuse)) { Intent intent = null; switch (payList.get(selectedPosition).payType) { case "1"://支付宝 jumpToZhiFuBao(); break; default://银联或财付通支付(跳转到网页支付) intent = new Intent(mContext, WebViewActivity.class); String valueUrl = Ip.getBaseURL(mContext) + payList.get(selectedPosition).content + "&token=" + RequestOftenKey.getToken(mContext) + "&device_index=" + RequestOftenKey.getDeviceIndex(mContext) + "&pay_type=" + payList.get(selectedPosition).payType + "&type=" + payType1//商品订单 1商家入驻缴费 2扫码支付 3线下交易 4现金充值 5 未定 6 套餐 7团购转配送 + "&payment_fee=" + (int) moreOverLeBi //此处payment_fee单位是分,不能带小数点 + "&order_index=" + orderIndex + "&address_id=" + addressId; Logger.i("银联URL:" + valueUrl); intent.putExtra("url", valueUrl); intent.putExtra("isRecharge", true); startActivity(intent); break; } } else { Toast.makeText(mContext, "该方式暂不可用,请选择其他方式,谢谢!", Toast.LENGTH_SHORT).show(); stopMyDialog(); } } else { showToast("订单异常,请重新支付"); } } } }); tvBack.setOnClickListener(new View.OnClickListener() { @Override public void onClick(View view) { showDialog(null, "便宜不等人,请君三思而后行~", null, 0, Gravity.NO_GRAVITY, "去意已决", "我再想想", false, new DialogInterface.OnClickListener() { @Override public void onClick(DialogInterface dialog, int which) { switch (which) { case DialogInterface.BUTTON_POSITIVE: dialog.dismiss(); finish(); break; case DialogInterface.BUTTON_NEGATIVE: dialog.dismiss(); break; } } }, mContext); } }); } /** * 获取用户奖励 */ private void getUserBalance() { startMyDialog(); RetrofitUtils.getInstance(mContext).setDeviceIndex(RequestOftenKey.getDeviceIndex(mContext)).setToken(RequestOftenKey.getToken(mContext)) .getMyBalance(new Callback<MyBalanceEntity2>() { @Override public void onResponse(Call<MyBalanceEntity2> call, Response<MyBalanceEntity2> response) { 
HttpResultBean body = response.body(); if (CheckServiceReturnEntityUtil.checkServiceReturnEntity(mContext, body)) { if (CommonUtils.serivceReturnCode(mContext, body.code, body.msg)) { MyBalanceEntity2 entity = response.body(); switch (body.code) { case 1: userMoney = NumberFormat.convertToFloat(entity.lebi, 0); setData(); break; default: break; } } } stopMyDialog(); } @Override public void onFailure(Call<MyBalanceEntity2> call, Throwable t) { showToast("获取用户奖励失败,请重试"); stopMyDialog(); } }); } private void pay() { if (TextUtils.isEmpty(phone)) { //奖励支付时,先检测是否有绑定手机号(因为奖励支付需要支付密码,而支付密码的设置必须有手机号码) new VersionDialog.Builder(mContext) .setMessage("请绑定手机号码") .setPositiveButton("绑定", new DialogInterface.OnClickListener() { @Override public void onClick(DialogInterface dialog, int which) { startActivity(new Intent(mContext, BindPhoneActivity.class)); dialog.dismiss(); } }).create().show(); return; } // 奖励支付时,先检测是否有支付密码,如果有直接支付,如果没有则提示跳转密码支付设置界面 if (PreferenceUtils.readBoolConfig(com.yilian.mylibrary.Constants.PAY_PASSWORD, mContext, false)) { //如果有支付密码 paydialog = new PayDialog(mContext, orderIndex, handler); paydialog.show(); } else { //没有支付密码,提示跳转设置支付密码界面 new VersionDialog.Builder(mContext).setMessage("您还未设置支付密码,请设置支付密码后再支付!") .setPositiveButton("设置", new DialogInterface.OnClickListener() { @Override public void onClick(DialogInterface dialog, int which) { startActivity(new Intent(CashDeskActivity.this, InitialPayActivity.class)); dialog.dismiss(); } }) .setNegativeButton("否", new DialogInterface.OnClickListener() { @Override public void onClick(DialogInterface dialog, int which) { dialog.dismiss(); } }) .create().show(); } } /** * 支付宝支付 */ private void jumpToZhiFuBao() { payType = "1"; sp.edit().putString("lebiPay", "true").commit(); charge(); } private void setData() { Logger.i("userMoney:" + userMoney); if (userMoney == 0 || orderTotalLebi <= 0) { //如果用户奖励为0 或者 支付金额为0,或者奖券不足,则默认不适用奖励支付,且不能选择奖励支付选项 
btnWhetherUseMoney.setImageResource(R.mipmap.library_module_cash_desk_off); btnWhetherUseMoney.setEnabled(false); } tvUsableMoney.setText(Html.fromHtml("奖励支付:" + "<font color=\"#333333\">" + MoneyUtil.set¥Money(MoneyUtil.getLeXiangBi(sp.getString("lebi", "0"))) + "</font>")); moreOverLeBi = orderTotalLebi - userMoney;//需要支付的金额 和 用户奖励 的差额(未处理的金额) if (moreOverLeBi > 0) { //如果还需要充值,即钱不够,提示还需支付多少,那么奖励支付数量就是用户奖励 // selectedPosition=0;//如果需要充值,则需要根据该值初始化充值方式的选择 btnSurePay.setText("确认支付(还需支付" + MoneyUtil.set¥Money(MoneyUtil.getLeXiangBi(moreOverLeBi)) + ")"); moneyEnough = false; //默认选中支付方式列表第一个支付方式 tvUsableMoney.setText(Html.fromHtml("奖励支付:" + "<font color=\"#333333\">" + MoneyUtil.set¥Money(MoneyUtil.getLeXiangBi(userMoney)) + "</font>")); } else {//钱够了,那么奖励支付数量就是订单金额 moneyEnough = true; tvUsableMoney.setText(Html.fromHtml("奖励支付:" + "<font color=\"#333333\">" + MoneyUtil.set¥Money(MoneyUtil.getLeXiangBi(orderTotalLebi)) + "</font>")); } getPayTypeList();//获取充值方式列表 } /** * 请求充值订单 * payType 1支付宝 2微信 * paymentFree 支付总价 */ public void charge() { if (myIncomeNetRequest == null) { myIncomeNetRequest = new MyIncomeNetRequest(mContext); } Logger.i(payType + "payType" + moreOverLeBi + "lebiPrice"); Logger.i("充值参数1: payType:" + payType + " paytype1:" + payType1 + " 未处理的moreOverLeBi:" + moreOverLeBi + " orderIndex:" + orderIndex); myIncomeNetRequest.NPaymentIndexNet(payType, payType1, MoneyUtil.getLeXiangBi(moreOverLeBi), orderIndex, addressId, new RequestCallBack<PaymentIndexEntity>() { @Override public void onStart() { super.onStart(); startMyDialog(); } @Override public void onSuccess(ResponseInfo<PaymentIndexEntity> responseInfo) { PaymentIndexEntity result = responseInfo.result; Logger.i(result.code + " responseInfo.result.code" + "支付订单信息:" + result.orderString + " paytype:" + payType); switch (result.code) { case 1: paymentIndex = result.paymentIndex;//支付订单编号 fee = result.paymentFee;//充值金额 switch (payType) { case "1": zhifubao(result.orderString); break; default: break; } 
break; case -23: showToast(R.string.login_failure); startActivity(new Intent(mContext, LeFenPhoneLoginActivity.class)); break; default: showToast("支付前错误码:" + result.code + result.msg); break; } stopMyDialog(); } @Override public void onFailure(HttpException e, String s) { stopMyDialog(); showToast(R.string.net_work_not_available); } }); } /** * 获取充值方式 */ private void getPayTypeList() { startMyDialog(); RetrofitUtils.getInstance(mContext).setDeviceIndex(com.yilian.mylibrary.RequestOftenKey.getDeviceIndex(mContext)).setToken(com.yilian.mylibrary.RequestOftenKey.getToken(mContext)) .getPayTypeList(new Callback<PayTypeEntity>() { @Override public void onResponse(Call<PayTypeEntity> call, Response<PayTypeEntity> response) { PayTypeEntity body = response.body(); if (CheckServiceReturnEntityUtil.checkServiceReturnEntity(mContext, body)) { if (com.yilian.mylibrary.CommonUtils.serivceReturnCode(mContext, body.code, body.msg)) { switch (body.code) { case 1: payList = body.data; payTypeadapter = new PayFragmentAdapter(mContext, payList); nslvPayType.setAdapter(payTypeadapter); if (!moneyEnough) { //如果钱不足,则要默认选中第一条充值方式 if (payList.size() > 0) { if ("1".equals(payList.get(0).isuse)) { payType = String.valueOf(payList.get(0).payType); selectedPosition = 0; payTypeadapter.notifyDataSetChanged(); } } } break; default: break; } } } stopMyDialog(); } @Override public void onFailure(Call<PayTypeEntity> call, Throwable t) { stopMyDialog(); showToast(R.string.net_work_not_available); } }); } /** * 支付宝支付 * * @param orderString */ public void zhifubao(String orderString) { /** * 完整的符合支付宝参数规范的订单信息服务器返回不需要客户端处理 */ if (TextUtils.isEmpty(orderString)) { return; } stopMyDialog(); Runnable payRunnable = new Runnable() { @Override public void run() { // 构造PayTask 对象 PayTask alipay = new PayTask((CashDeskActivity) mContext); // 调用支付接口,获取支付结果 Map<String, String> result = alipay.payV2(orderString, true); Logger.i("result " + result.toString()); Message msg = new Message(); msg.what = SDK_PAY_FLAG; 
msg.obj = result; handler.sendMessage(msg); } }; // 必须异步调用 Thread payThread = new Thread(payRunnable); payThread.start(); } /** * 支付成功后跳转到支付成功界面 */ private void jumpToNOrderPaySuccessActivity(String lebi, String dealTime, String returnBean, String subsidy) { Intent intent = null; switch (payType1) {//1商城订单 2 商家入驻或续费(商家入驻支付页面处理) 3店内支付(网页处理) 6套餐支付 case "1": intent = new Intent(mContext, NOrderPaySuccessActivity.class); intent.putExtra("deal_time", dealTime); intent.putExtra("lebi", lebi); intent.putExtra("returnBean", returnBean); intent.putExtra("subsidy", subsidy); break; default: break; } startActivity(intent); finish(); } public class PayDialog extends Dialog { private ImageView img_dismiss; private TextView tv_forget_pwd; private GridPasswordView pwdView; private Context context; private Handler handler; private String orderIndexs; private MallNetRequest request; private MTNetRequest mtNetRequest; public PayDialog(Context context, String orderIndexs, Handler handler) { super(context, R.style.GiftDialog); this.context = context; this.handler = handler; this.orderIndexs = orderIndexs; } @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.dialog_suregift_pwd); initView(); //dialog弹出时弹出软键盘 getWindow().setSoftInputMode(WindowManager.LayoutParams.SOFT_INPUT_STATE_VISIBLE | WindowManager.LayoutParams.SOFT_INPUT_ADJUST_RESIZE); } private void initView() { img_dismiss = (ImageView) findViewById(R.id.img_dismiss); tv_forget_pwd = (TextView) findViewById(R.id.tv_forget_pwd); pwdView = (GridPasswordView) findViewById(R.id.pwd); pwdView.setOnPasswordChangedListener(new GridPasswordView.OnPasswordChangedListener() { @Override public void onChanged(String psw) { } @Override public void onMaxLength(String psw) { sendGoodsRequest(pwdView.getPassWord()); } }); img_dismiss.setOnClickListener(new View.OnClickListener() { @Override public void onClick(View v) { dismiss(); } }); 
        tv_forget_pwd.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                context.startActivity(new Intent(context, InitialPayActivity.class));
            }
        });
        Window dialogWindow = getWindow();
        WindowManager windowManager = getWindowManager();
        Display display = windowManager.getDefaultDisplay();
        WindowManager.LayoutParams lp = dialogWindow.getAttributes();
        lp.width = (int) (display.getWidth()); // set dialog width to the screen width
        dialogWindow.setAttributes(lp);
        dialogWindow.setGravity(Gravity.BOTTOM);
    }

    /**
     * Pay for goods with the payment password.
     */
    private void sendGoodsRequest(String pwd) {
        // payment password: md5(pwd) + md5(server salt)
        final String password = CommonUtils.getMD5Str(pwd).toLowerCase()
                + CommonUtils.getMD5Str(RequestOftenKey.getServerSalt(context));
        if (request == null) {
            request = new MallNetRequest(context);
        }
        request.CashDeskPayGoods(orderIndexs, password, new RequestCallBack<PayOkEntity>() {
            @Override
            public void onStart() {
                super.onStart();
                startMyDialog();
            }

            @Override
            public void onSuccess(ResponseInfo<PayOkEntity> responseInfo) {
                stopMyDialog();
                PayOkEntity result = responseInfo.result;
                Logger.i("result", new Gson().toJson(result));
                switch (result.code) {
                    case 1:
                        Toast.makeText(context, "支付成功", Toast.LENGTH_SHORT).show(); // "payment succeeded"
                        PreferenceUtils.writeBoolConfig(Constants.REFRESH_USER_FRAGMENT, true, context);
                        jumpToNOrderPaySuccessActivity(result.lebi, result.dealTime, result.returnBean, result.subsidy);
                        // hide the soft keyboard
                        dismissJP();
                        paydialog.dismiss();
                        finish();
                        break;
                    case -3:
                        paydialog.dismiss();
                        showToast("系统繁忙,请稍后再试"); // "system busy, please try again later"
                        break;
                    case -5:
                        pwdView.clearPassword();
                        paydialog.dismiss();
                        showErrorPWDDialog();
                        break;
                    case -13:
                        Toast.makeText(context, "奖励不足", Toast.LENGTH_SHORT).show(); // "insufficient reward balance"
                        paydialog.dismiss();
                        if (isLogin()) {
                            startActivity(new Intent(context, RechargeActivity.class));
                        } else {
                            startActivity(new Intent(context, LeFenPhoneLoginActivity.class));
                        }
                        break;
                    default:
                        showToast(result.msg);
                        break;
                }
            }

            @Override
            public void onFailure(HttpException e, String s) {
                stopMyDialog();
                showToast(R.string.net_work_not_available);
            }
        });
        Logger.i(orderIndexs + "orderIds" + password + "password");
    }

    // hide the soft keyboard
    public void dismissJP() {
        View view = getWindow().peekDecorView();
        if (view != null) {
            InputMethodManager inputmanger = (InputMethodManager) context.getSystemService(Context.INPUT_METHOD_SERVICE);
            inputmanger.hideSoftInputFromWindow(view.getWindowToken(), 0);
        }
    }

    /**
     * Shown after the payment password was entered incorrectly.
     */
    private void showErrorPWDDialog() {
        CashDeskActivity.this.showDialog(null, "密码错误,请重新输入", null, 0, Gravity.CENTER, "重置密码", "重新输入", false,
                new OnClickListener() {
                    @Override
                    public void onClick(DialogInterface dialog, int which) {
                        switch (which) {
                            case Dialog.BUTTON_NEGATIVE: // re-enter the password
                                dialog.dismiss();
                                break;
                            case Dialog.BUTTON_POSITIVE: // reset the password
                                context.startActivity(new Intent(context, InitialPayActivity.class));
                                dialog.dismiss();
                                break;
                        }
                    }
                }, context);
    }
}

public class PayFragmentAdapter extends android.widget.BaseAdapter {

    private final ArrayList<PayTypeEntity.PayData> list;
    private final Context context;
    private Map<Integer, Boolean> selectedMap; // checked state of each row's checkbox
    private int selectedPosition = -1; // currently selected row (assumed to be set by the caller; -1 = none)

    public PayFragmentAdapter(Context context, ArrayList<PayTypeEntity.PayData> list) {
        this.context = context;
        this.list = list;
        selectedMap = new HashMap<Integer, Boolean>();
        initData();
    }

    public void initData() {
        for (int i = 0; i < list.size(); i++) {
            selectedMap.put(i, false);
        }
    }

    @Override
    public int getCount() {
        return list.size();
    }

    @Override
    public Object getItem(int position) {
        return list.get(position);
    }

    @Override
    public long getItemId(int position) {
        return position;
    }

    @Override
    public View getView(int position, View convertView, ViewGroup parent) {
        final PayFragmentAdapter.ViewHolder holder;
        if (convertView == null) {
            convertView = View.inflate(context, R.layout.item_pay_fragment_adapter, null);
            holder = new PayFragmentAdapter.ViewHolder(convertView);
            convertView.setTag(holder);
        } else {
            holder = (PayFragmentAdapter.ViewHolder) convertView.getTag();
        }
        PayTypeEntity.PayData dataBean = list.get(position);
        GlideUtil.showImageNoSuffix(context, dataBean.icon, holder.mIvIcon); // was mContext, which is undefined here
        holder.mTvClassName.setText(dataBean.className);
        holder.mTvClassSubTitle.setText(dataBean.classSubtitle);
        if ("0".equals(dataBean.isuse)) {
            holder.mRL.setBackgroundColor(context.getResources().getColor(R.color.bac_color));
        }
        if (selectedPosition == -1) {
            holder.selectImg.setImageDrawable(context.getResources().getDrawable(R.mipmap.merchant_big_is_noselect));
        } else if (selectedPosition == 0) {
            if (position == 0) {
                holder.selectImg.setImageDrawable(context.getResources().getDrawable(R.mipmap.merchant_big_is_select));
            } else {
                holder.selectImg.setImageDrawable(context.getResources().getDrawable(R.mipmap.merchant_big_is_noselect));
            }
        } else {
            if (selectedPosition == position) {
                holder.selectImg.setImageDrawable(context.getResources().getDrawable(R.mipmap.merchant_big_is_select));
            } else {
                holder.selectImg.setImageDrawable(context.getResources().getDrawable(R.mipmap.merchant_big_is_noselect));
            }
        }
        return convertView;
    }

    public class ViewHolder {
        public RelativeLayout mRL;
        public View rootView;
        public ImageView mIvIcon;
        public TextView mTvClassName;
        public TextView mTvClassSubTitle;
        public ImageView selectImg;

        public ViewHolder(View rootView) {
            this.rootView = rootView;
            this.mIvIcon = (ImageView) rootView.findViewById(R.id.iv_icon);
            this.mTvClassName = (TextView) rootView.findViewById(R.id.tv_class_name);
            this.mTvClassSubTitle = (TextView) rootView.findViewById(R.id.tv_class_sub_title);
            this.mRL = (RelativeLayout) rootView.findViewById(R.id.rl);
            this.selectImg = (ImageView) rootView.findViewById(R.id.commit_express_icon);
        }
    }
}
java
package keys

import (
	"fmt"

	"github.com/irisnet/irishub/client/keys"
	"github.com/spf13/cobra"
)

func deleteKeyCommand() *cobra.Command {
	cmd := &cobra.Command{
		Use:   "delete <name>",
		Short: "Delete the given key",
		RunE:  runDeleteCmd,
		Args:  cobra.ExactArgs(1),
	}
	return cmd
}

func runDeleteCmd(cmd *cobra.Command, args []string) error {
	name := args[0]
	kb, err := keys.GetKeyBase()
	if err != nil {
		return err
	}
	_, err = kb.Get(name)
	if err != nil {
		return err
	}
	buf := keys.BufferStdin()
	oldpass, err := keys.GetPassword(
		"DANGER - enter password to permanently delete key:", buf)
	if err != nil {
		return err
	}
	err = kb.Delete(name, oldpass)
	if err != nil {
		return err
	}
	fmt.Println("Password deleted forever (uh oh!)")
	return nil
}
go
.page-header {
  margin: 23px 0 23px;
}

.conversation-wrap {
  box-shadow: -2px 0 3px #ddd;
  padding: 0;
  max-height: 400px;
  overflow: auto;
}

.conversation {
  padding: 5px;
  border-bottom: 1px solid #ddd;
  margin: 0;
}

.message-wrap {
  box-shadow: 0 0 3px #ddd;
  padding: 0;
}

.msg {
  padding: 5px;
  /*border-bottom:1px solid #ddd;*/
  margin: 0;
}

.msg-wrap {
  padding: 10px;
  max-height: 300px;
  overflow: auto;
}

.time {
  color: #bfbfbf;
}

.send-wrap {
  border-top: 1px solid #eee;
  border-bottom: 1px solid #eee;
  padding: 10px;
  /*background: #f8f8f8;*/
}

.highlight {
  background-color: #f7f7f9;
  border: 1px solid #e1e1e8;
}

.send-message-btn {
  border-top-left-radius: 0;
  border-top-right-radius: 0;
  border-bottom-left-radius: 0;
  border-bottom-right-radius: 0;
}

.btn-panel {
  background: #f7f7f9;
}

.btn-panel .btn {
  transition: 0.2s all ease-in-out;
}

.btn-panel .btn:hover {
  color: #666;
  background: #f8f8f8;
}

.btn-panel .btn:active {
  background: #f8f8f8;
  box-shadow: 0 0 1px #ddd;
}

.btn-panel-conversation .btn,
.btn-panel-msg .btn {
  background: #f8f8f8;
}

.btn-panel-conversation .btn:first-child {
  border-right: 1px solid #ddd;
}

.msg-wrap .media-heading {
  color: #003bb3;
  font-weight: 700;
}

.msg-date {
  background: none;
  text-align: center;
  color: #aaa;
  border: none;
  box-shadow: none;
  border-bottom: 1px solid #ddd;
}

body::-webkit-scrollbar {
  width: 12px;
}

/* Let's get this party started */
::-webkit-scrollbar {
  width: 6px;
}

/* Track */
::-webkit-scrollbar-track {
  -webkit-box-shadow: inset 0 0 6px rgba(0, 0, 0, 0.3);
  /* -webkit-border-radius: 10px; border-radius: 10px;*/
}

/* Handle */
::-webkit-scrollbar-thumb {
  /* -webkit-border-radius: 10px; border-radius: 10px;*/
  background: #ddd;
  -webkit-box-shadow: inset 0 0 6px rgba(0, 0, 0, 0.5);
}

::-webkit-scrollbar-thumb:window-inactive {
  background: #ddd;
}

[ng\:cloak],
[ng-cloak],
[data-ng-cloak],
[x-ng-cloak],
.ng-cloak,
.x-ng-cloak {
  display: none !important;
}
css
{"m_ExpandedPrefabGameObjectFileIDs":[],"m_ExpandedSceneGameObjectInstanceIDs":[-51764,-34586,-34332,-30310,-19540,-3448,-1138,-12,13800,13844,14012,14064],"m_ScrollY":0.0,"m_LastClickedFileID":0,"m_LastClickedInstanceID":-8136}
json
I like the diesel version of the Maruti Baleno. I am getting the best mileage in this car. The car looks compact but is very spacious inside, and the leg room is also decent.
english
{ ".platform-darwin": { "cmd-alt-F": "code-search:toggle" }, ".platform-win32, .platform-linux": { "ctrl-alt-F": "code-search:toggle" } }
json
export class LogEntry {
  Timestamp: Date;
  Product: string;
  Layer: string;
  Location: string;
  UserId: string;
  UserName: string;
  Message: string;
  CorrelationId: string;
  ElapsedMilliseconds: number;
  AdditionalInfo: any;
}
typescript
export default {
  getLogs(file = null, current_page = 1, search = null) {
    return Nova.request().get(
      search
        ? `/nova-vendor/KABBOUCHI/logs-tool/logs?file=${file}&page=${current_page}&search=${search}`
        : `/nova-vendor/KABBOUCHI/logs-tool/logs?file=${file}&page=${current_page}`)
      .then(response => response.data);
  },

  getDailyLogFiles(file = null, current_page = 1) {
    return Nova.request().get(`/nova-vendor/KABBOUCHI/logs-tool/daily-log-files`)
      .then(response => response.data);
  },

  deleteFile(file) {
    return Nova.request().delete(`/nova-vendor/KABBOUCHI/logs-tool/logs?file=${file}`)
      .then(response => response.data);
  },

  getLogsPermissions(file) {
    return Nova.request().get(`/nova-vendor/KABBOUCHI/logs-tool/logs/permissions`)
      .then(response => response.data);
  },
}
javascript
{ "vorgangId": "50250", "VORGANG": { "WAHLPERIODE": "17", "VORGANGSTYP": "Schriftliche Frage", "TITEL": "Verhältnismäßigkeit der Einführung einer Härtefallregelung zur Kürzung der Überschussbeteiligungen von Lebensversicherungen", "AKTUELLER_STAND": "Beantwortet", "SIGNATUR": "", "GESTA_ORDNUNGSNUMMER": "", "WICHTIGE_DRUCKSACHE": { "DRS_HERAUSGEBER": "BT", "DRS_NUMMER": "17/12008", "DRS_TYP": "Schriftliche Fragen", "DRS_LINK": "http://dipbt.bundestag.de:80/dip21/btd/17/120/1712008.pdf" }, "EU_DOK_NR": "", "SCHLAGWORT": [ { "_fundstelle": "true", "__cdata": "Lebensversicherung" }, "Mindestzuführungsverordnung" ], "ABSTRAKT": "Originaltext der Frage(n): \r\n \r\nAuf welcher empirischen Basis entschied sich die Bundesregierung dazu, über die Mindestzuführungsverordnung eine Härtefallregelung für die Kürzung der Überschussbeteiligungen von Lebensversicherungen mit einer Kappungsgrenze bei 10 Prozent der Deckungsrückstellung (wobei Versicherte durchschnittlich einen maximalen Abzug von 5 Prozent tragen müssen, weil sie nur zu 50 Prozent an den Bewertungsreserven beteiligt werden) einführen zu wollen, und wie positioniert sich die Bundesregierung zu der Kritik u. a. vom Bund der Versicherten, dass diese Kappungsgrenze zu hoch sei und daher nur in wenigen Ausnahmefällen greife, weil die meisten Kunden gerade mal rund 2,6 Prozent Gewinnbeteiligung aus den Bewertungsreserven zu erwarten hätten (vgl. Medieninformation Bund der Versicherten vom 12. Dezember 2012)?" }, "VORGANGSABLAUF": { "VORGANGSPOSITION": { "ZUORDNUNG": "BT", "URHEBER": "Schriftliche Frage/Schriftliche Antwort ", "FUNDSTELLE": "04.01.2013 - BT-Drucksache 17/12008, Nr. 40", "FUNDSTELLE_LINK": "http://dipbt.bundestag.de:80/dip21/btd/17/120/1712008.pdf", "PERSOENLICHER_URHEBER": [ { "VORNAME": "Harald", "NACHNAME": "Koch", "FUNKTION": "MdB", "FRAKTION": "DIE LINKE", "AKTIVITAETSART": "Frage" }, { "VORNAME": "Hartmut", "NACHNAME": "Koschyk", "FUNKTION": "Parl. 
Staatssekr.", "RESSORT": "Bundesministerium der Finanzen", "AKTIVITAETSART": "Antwort" } ] } } }
json
{ "@metadata": { "authors": [ "Sp5uhe" ] }, "solve_disambiguation-adding-dn-template": "Robot wspomógł poprawę ujednoznacznienia %(from)s – oznaczono jako wymagające uwagi eksperta", "solve_disambiguation-links-removed": "Robot wspomógł poprawę ujednoznacznienia %(from)s – usunięto link(i)", "solve_disambiguation-links-resolved": "Robot wspomógł poprawę ujednoznacznienia %(from)s – zmieniono link(i) do %(to)s", "solve_disambiguation-redirect-adding-dn-template": "Robot wspomógł poprawę ujednoznacznienia %(from)s – oznaczono jako wymagające uwagi eksperta", "solve_disambiguation-redirect-removed": "Robot wspomógł poprawę ujednoznacznienia %(from)s – usunięto link(i)", "solve_disambiguation-redirect-resolved": "Robot wspomógł poprawę ujednoznacznienia %(from)s – zmieniono link(i) do %(to)s", "solve_disambiguation-unknown-page": "(nieznana)" }
json
Killing living creatures is not a sport or something to be praised. Yes, I eat meat. No, I'm not a vegetarian or vegan. But I can still hold other species to a certain level of respect. We all have a place on this planet, including sharks. Killing them for fun is disgusting and should be illegal.
english
{ "className": "io.deephaven.proto.backplane.script.grpc.OpenDocumentRequestOrBuilder", "methods": { "getConsoleId": ".io.deephaven.proto.backplane.grpc.Ticket console_id = 1;\n\n:return: (io.deephaven.proto.backplane.grpc.Ticket) The consoleId.", "getConsoleIdOrBuilder": ".io.deephaven.proto.backplane.grpc.Ticket console_id = 1;\n\n:return: io.deephaven.proto.backplane.grpc.TicketOrBuilder", "getTextDocument": ".io.deephaven.proto.backplane.script.grpc.TextDocumentItem text_document = 2;\n\n:return: (io.deephaven.proto.backplane.script.grpc.TextDocumentItem) The textDocument.", "getTextDocumentOrBuilder": ".io.deephaven.proto.backplane.script.grpc.TextDocumentItem text_document = 2;\n\n:return: io.deephaven.proto.backplane.script.grpc.TextDocumentItemOrBuilder", "hasConsoleId": ".io.deephaven.proto.backplane.grpc.Ticket console_id = 1;\n\n:return: (boolean) Whether the consoleId field is set.", "hasTextDocument": ".io.deephaven.proto.backplane.script.grpc.TextDocumentItem text_document = 2;\n\n:return: (boolean) Whether the textDocument field is set." }, "path": "io.deephaven.proto.backplane.script.grpc.OpenDocumentRequestOrBuilder", "typeName": "interface" }
json
//
// Created by andrzej on 10/13/21.
//

#include "tree.h"
#include <iostream>

using namespace std;

int Node::countId = 0;

Node::Node(Node* parent) {
    id = countId;
    countId++;
    lweak_assign((Object**)&this->parent, (Object*)parent); // weak!
    if (parent != nullptr)
        depth = parent->depth + 1;
    left = right = nullptr;
}

Node::~Node() {
    cout << "delete Node depth=" << depth << " id=" << id << endl;
    //lshared_release_atomic((Object**)&parent); // weak!
    lshared_release_atomic((Object**)&left);
    lshared_release_atomic((Object**)&right);
}

void Node::addChild(Node* parent) {
    left = new Node(parent);
    right = new Node(parent);
}

void makeTree(Node* node, int depth) {
    if (depth == 0)
        return;
    node->left = new Node(node);
    lshared_init_elem(node->left);
    makeTree(node->left, depth - 1);
    node->right = new Node(node);
    lshared_init_elem(node->right);
    makeTree(node->right, depth - 1);
}

void testTree() {
    Node* root = new Node(nullptr);
    lshared_init_elem(root);
    makeTree(root, 5);
    Node* keep;
    // must be shared, not weak, although it is a copy of a weak reference!
    lshared_assign((Object**)&keep, root->left->left->parent);
    lshared_assign((Object**)&root, nullptr);
    cout << "keep" << endl;
    lshared_release_atomic((Object**)&keep);
}
cpp
If you want to save the world, you should study worst-case scenarios for the future, according to 20,000 science fiction fans. The Sci Fi Channel ran an online poll, through its Visions For Tomorrow initiative, to find out the top "things to read, watch and do to save the world." And the winners were dark tales of a world gone to hell, including Blade Runner, 1984, Firefly, the new Battlestar Galactica and The Matrix. An exclusive first look at all the winners, below the fold.

Here are the top 10 books to read to save the world, according to Sci Fi's visitors: The dystopian message of books like 1984, The Time Machine, Fahrenheit 451 and Brave New World is pretty clear: don't be too quick to give away your freedoms, watch out for false utopias and groupthink, etc. I'm not sure how some of the other books will actually help save the world. I can see most of these winning a poll for "best SF book of all time," but world-saving?

Similarly, the TV choices include a lot of paranoia, anti-authoritarianism and apocalyptic narratives, with a dash of optimism further down the list:
- 2. Battlestar Galactica (2004)

And here are the top movies. I'm not sure what the world-saving message of Jurassic Park is, other than "don't clone dinosaurs." There's a definite optimistic strain in a couple of these choices, like 2001 and Close Encounters, but otherwise it's pretty much doom across the board. Science goes too far, humans ruin the Earth, we're too violent and ignorant, and we're likely to become slaves of machines. Or enslave our own creations.
- 1. Blade Runner (1982)
- 3. The Terminator (1984)
- 5. Jurassic Park (1993)
- 6. Close Encounters of the Third Kind (1977)
- 7. The Day After Tomorrow (2004)
- 8. The Day the Earth Stood Still (1951)
- 9. Children of Men (2006)

So what do you think? Can 20,000 readers be wrong?
The 20,000 respondents in the Sci Fi poll voted "reading" the number one thing to do to save the world, so the Visions For Tomorrow initiative will partner with Booksfree.com, the internet's biggest paperback and audiobook rental service. If you sign up for Booksfree through Sci Fi's Visions For Tomorrow site, you get an extra 20 percent discount. The other activities that could save the world included recycling, giving blood, voting, eating healthy and being kind. Visions For Tomorrow is the Sci Fi Channel's public affairs campaign, which aims to use the power of science fiction to inspire people and organizations to "meet the growing challenges of the future."
english
/* * This code is from projectM a 2D side scroller using SFML. * It was made using the help of the SFML Game Development Book * * If you have any questions, please contact me at * pridexs.com * * For more information visit the repo for this project at: * github.com/pridexs * * <NAME> - 2016 */
cpp
const getApps = require('./get-apps');

module.exports = function verifyNgx(options, context) {
  const { logger } = context;
  const apps = getApps(options, context);

  if (apps.length === 0) {
    logger.log('No angular projects of type "application" exist.');
  } else {
    logger.log(`Found angular apps ${apps.map(a => a.name).join(', ')}`);
  }

  return true;
};
javascript
.customers__text {
  max-width: 815px;
}
css
from typing import Any, Dict, Tuple

import torch
from torch_geometric.nn import GATConv
from torch_sparse import SparseTensor, set_diag

from rgnn_at_scale.aggregation import ROBUST_MEANS
from rgnn_at_scale.models.gcn import GCN


class RGATConv(GATConv):
    """Extension of Pytorch Geometric's `GATConv` to execute a robust aggregation function:
    - soft_k_medoid
    - soft_medoid (not scalable)
    - k_medoid
    - medoid (not scalable)
    - dimmedian

    Parameters
    ----------
    mean : str, optional
        The desired mean (see above for the options), by default 'soft_k_medoid'
    mean_kwargs : Dict[str, Any], optional
        Arguments for the mean, by default dict(k=64, temperature=1.0, with_weight_correction=True)
    """

    def __init__(self, mean='soft_k_medoid',
                 mean_kwargs: Dict[str, Any] = dict(k=64, temperature=1.0, with_weight_correction=True),
                 **kwargs):
        kwargs['in_channels'] = 2 * [kwargs['in_channels']]
        super().__init__(**kwargs)
        self._mean = ROBUST_MEANS[mean] if mean is not None else None
        self._mean_kwargs = mean_kwargs

    def forward(self, arguments: Tuple[torch.Tensor, SparseTensor] = None) -> torch.Tensor:
        """Predictions based on the input.

        Parameters
        ----------
        arguments : Sequence[torch.Tensor]
            [x, edge indices] or [x, edge indices, edge weights], by default None

        Returns
        -------
        torch.Tensor
            the output of `GATConv`.

        Raises
        ------
        NotImplementedError
            if the arguments are not of length 2 or 3
        """
        if len(arguments) == 2:
            x, edge_index = arguments
            edge_weight = None
        elif len(arguments) == 3:
            x, edge_index, edge_weight = arguments
        else:
            raise NotImplementedError("This method is just implemented for two or three arguments")
        assert isinstance(edge_index, SparseTensor), 'GAT requires a SparseTensor as input'
        assert edge_weight is None, 'The weights must be passed via a SparseTensor'

        H, C = self.heads, self.out_channels
        assert x.dim() == 2, 'Static graphs not supported in `GATConv`.'
        x_l = x_r = self.lin_l(x).view(-1, H, C)
        alpha_l = (x_l * self.att_l).sum(dim=-1)
        alpha_r = (x_r * self.att_r).sum(dim=-1)

        if self.add_self_loops:
            edge_index = set_diag(edge_index)

        # propagate_type: (x: OptPairTensor, alpha: OptPairTensor)
        out = self.propagate(edge_index, x=(x_l, x_r), alpha=(alpha_l, alpha_r))

        alpha = self._alpha * edge_index.storage.value()[:, None]
        self._alpha = None

        if self.concat:
            out = out.view(-1, self.heads * self.out_channels)
        else:
            out = out.mean(dim=1)

        if self.bias is not None:
            out += self.bias

        attention_matrix = edge_index.set_value(alpha, layout='coo')
        attention_matrix.storage._value = attention_matrix.storage._value.squeeze()

        x = self.lin_l(x)

        if self._mean is not None:
            x = self._mean(attention_matrix, x, **self._mean_kwargs)
        else:
            x = attention_matrix @ x

        x += self.bias
        return x


class RGAT(GCN):
    """Generic Reliable Graph Neural Network (RGNN) implementation which currently supports a GCN architecture
    with the aggregation functions:
    - soft_k_medoid
    - soft_medoid (not scalable)
    - k_medoid
    - medoid (not scalable)
    - dimmedian

    and with the adjacency preprocessings:
    - SVD: <NAME>, <NAME>, <NAME>, and <NAME>. All you need is Low (rank): Defending against adversarial
      attacks on graphs.
    - GDC: <NAME>, <NAME>, and <NAME>. Diffusion Improves Graph Learning.
    - Jaccard: <NAME>, <NAME>, <NAME>, <NAME>, <NAME>, and <NAME>. Adversarial examples for graph data:
      Deep insights into attack and defense.

    Parameters
    ----------
    mean : str, optional
        The desired mean (see above for the options), by default 'soft_k_medoid'
    mean_kwargs : Dict[str, Any], optional
        Arguments for the mean, by default dict(k=64, temperature=1.0, with_weight_correction=True)
    """

    def __init__(self, mean: str = 'soft_k_medoid',
                 mean_kwargs: Dict[str, Any] = dict(k=64, temperature=1.0, with_weight_correction=True),
                 **kwargs):
        self._mean_kwargs = dict(mean_kwargs)
        self._mean = mean
        super().__init__(**kwargs)
        assert not self.do_checkpoint, 'Checkpointing is not supported'

    def _build_conv_layer(self, in_channels: int, out_channels: int):
        return RGATConv(mean=self._mean, mean_kwargs=self._mean_kwargs,
                        in_channels=in_channels, out_channels=out_channels)

    def _cache_if_option_is_set(self, callback, x, edge_idx, edge_weight):
        return SparseTensor.from_edge_index(edge_idx, edge_weight, (x.shape[0], x.shape[0])), None
python
Filming for 'The Umbrella Academy' will begin in February 2021. In his long-running, critically acclaimed, and highly profitable career, Christopher Nolan has created some of the most mind-boggling films imaginable. But the key to unraveling what he is actually trying to say lies in his movies' emotional core. Notes on queer sex, exploration and gender identities we can take away from the Netflix Original.
english
5 Results (showing 1 - 5)
- Bausch + Lomb, Advanced Eye Relief, Dry Eye, 1 fl oz (30 ml)
- Bausch + Lomb, Advanced Eye Relief, Eye Wash, 4 fl oz (118 ml)
- Bausch + Lomb, Saline Solution, Sensitive Eyes, 12 fl oz (355 ml)
- Bausch + Lomb, Soothe, Lubricant Eye Ointment, Nighttime, 1/8 oz (3.5 g)
english
Payal Rajput and Ajay Bhupathi are set to send shivers down the audience's spine with their upcoming horror thriller, 'Mangalavaram', scheduled for a grand release on November 17, 2023. The dynamic duo, known for their impactful debut film 'RX100', is gearing up to create another cinematic sensation. Having successfully completed its censor formalities, 'Mangalavaram' has been granted an A certificate by the censor board, further intensifying the anticipation for its release. Ajay Bhupathi, the visionary director, has stoked curiosity by hinting at a potential sequel to the film, generating buzz following the spine-chilling teaser and trailer. As the excitement builds, online bookings for 'Mangalavaram' are now open, revealing ticket rates that are nothing short of surprising. In Telangana, at the AMB Multiplex, where Mahesh Babu is a partner, the platinum ticket is priced at Rs. 350, while the gold rate stands at Rs. 295. In all other multiplex theaters, including Prasad Multiplex, the ticket rate is set at Rs. 295. For theaters like Chandrakala, Vishwanath, BR Hitech, the maximum ticket rate is Rs. 150, with a minimum ticket price of Rs. 50, ensuring accessibility for a wide audience. Turning our attention to Andhra Pradesh, Visakhapatnam multiplex theaters like Sarath Sangam have set a maximum ticket rate of Rs. 145 for box seats and a minimum rate of Rs. 100 for non-premium seats. Similarly, Shree Leela Mahal Theater maintains the same ticket prices. At Nataraj Theatre, the first-class ticket is priced at Rs. 145, and the second-class ticket is Rs. 70. In Vijayawada's single-screen theaters, the maximum ticket rate is Rs. 112, with a minimum of Rs. 80. 'Mangalavaram' boasts an impressive ensemble cast, including Nanditha Swetha, Divya Pillai, Azmal, Ravindra Vijay, Krishna Chaitanya, and Ajay Gosh, with the musical brilliance of Ajaneesh Lokanath.
Produced in collaboration between Mudra Media Works and A Creative Works, 'Mangalavaram' promises to deliver a thrilling cinematic experience that will leave audiences on the edge of their seats. Follow us on Google News and stay updated with the latest!
english
Kadaisi Vivasayi is one of the upcoming films from the Kollywood film industry. A number of films with a farming backdrop are in the works, and this is one among them. Touted to be a comedy-drama, the film is directed by M Manikandan, who earlier made Kaaka Muttai, the National award-winning film. There is a positive buzz around the film since it comes from a critically acclaimed filmmaker. The latest buzz around the film is that Santhosh Narayanan is on board to score the music. The trailer of Kadaisi Vivasayi is already out on social media. 85-year-old Nallandi, Vijay Sethupathi, Yogi Babu and a few others play the lead roles in the film. The film unit started the shoot in 2018 and finished it then. The release got delayed, and the latest buzz is that Ilaiyaraaja, who was initially the composer for the movie, is no longer working on the project. Instead of the maestro, the film will have Santhosh Narayanan scoring the music. This is a surprising move, as no one expected that Ilaiyaraaja would not work on the film. The reason for the change of music composer is currently not available to the media, and the film unit is unavailable to comment on the same. Sources say the director and the legendary music composer disagreed on certain aspects, which pushed them to part ways. Manikandan then approached Santhosh Narayanan, who readily came on board to finish scoring the film.
english
116/7 (17.0 ov) | 119/4 (16.1 ov) | 172/7 (20.0 ov) | 157 (20.0 ov)

SRH pacer Bhuvneshwar Kumar feels the flurry of wickets in the middle overs led to his side's defeat by 22 runs against KKR in IPL 2016. VVS Laxman writes that no matter what the format is, games of cricket are won by balanced teams with both quality batsmen and bowlers. After four back-to-back defeats, finally a victory for the MS Dhoni-led Rising Pune Supergiants. Currently, RPS are zero for the loss of a wicket in one over against SRH in IPL 2016. Moises Henriques also praised David Warner and Shikhar Dhawan. Bhuvneshwar Kumar added that SRH's plan was to bowl first and restrict the opposition to around 150. Sunrisers Hyderabad (SRH) started this season of the Indian Premier League (IPL) with two back-to-back defeats, and now they are back on track with two consecutive victories. SRH rode to a comfortable 10-wicket win over new franchise GL in Match 15 of the ongoing IPL 2016 at Rajkot. RCB have started their journey in the long tournament with a vital win under their belt, which should give them massive confidence moving ahead.
english
Oxygen cylinders (File photo)

KASKI: The Hong Kong chapter of the Non-Resident Nepalis' Association (NRNA) has provided two oxygen concentrators to Sardikhola Health Post of Machhapuchhre Rural Municipality-2 in Kaski. The two 10-litre oxygen concentrators were provided to help the health post manage treatment of COVID-19 patients. Radhika Gurung (Tika), chair of the NRNA Hong Kong chapter, took a special initiative to arrange the assistance. The oxygen concentrators were handed over to ward chair Bikram Gurung, who is also the chair of the Sardikhola Health Post management committee. The health post has been providing primary treatment to COVID-19 patients through a 10-bed isolation centre. Meanwhile, health post in-charge Anita Ghimire informed that six of the 30 COVID-19 patients admitted to the isolation centre were referred to Pokhara, while the remaining ones recovered. Similarly, two infected women gave birth at the isolation centre. So far, ward no 2 of Machhapuchhre Rural Municipality has recorded 202 cases of coronavirus infection. Handing over the oxygen concentrators, social worker Nanda Kumari Gurung recalled that NRNA Hong Kong chapter chair Radhika Gurung had sent the medical assistance to her birthplace.
english
import {
  AudioData,
  BaseApp,
  HandleRequest,
  Host,
  Jovo,
  SpeechBuilder,
  SessionConstants,
  JovoError,
  ErrorCode,
  AxiosResponse,
} from 'jovo-core';

import { Lindenbaum } from '../Lindenbaum';
import { LindenbaumRequest } from './LindenbaumRequest';
import { LindenbaumResponse, Responses } from './LindenbaumResponse';
import { LindenbaumUser } from './LindenbaumUser';
import { LindenbaumSpeechBuilder } from './LindenbaumSpeechBuilder';
import { DialogAPI, DialogAPIRequestOptions, DialogAPIData } from '../services/DialogAPI';

export class LindenbaumBot extends Jovo {
  $lindenbaumBot: LindenbaumBot;
  $user: LindenbaumUser;

  constructor(app: BaseApp, host: Host, handleRequest?: HandleRequest) {
    super(app, host, handleRequest);
    this.$lindenbaumBot = this;
    this.$response = new LindenbaumResponse();
    this.$speech = new LindenbaumSpeechBuilder(this);
    // $reprompt object has to be added even if the platform doesn't use it.
    // Is used by users as platform independent feature
    this.$reprompt = new LindenbaumSpeechBuilder(this);
    this.$user = new LindenbaumUser(this);
    this.$output.Lindenbaum = [];
  }

  setResponses(responses: Responses[]): this {
    const response = this.$response as LindenbaumResponse;
    response.responses = responses;
    return this;
  }

  /**
   * Calls the `/call/drop` endpoint to terminate the call
   */
  addDrop(): this {
    this.$output.Lindenbaum.push({
      '/call/drop': {
        dialogId: this.$request!.getSessionId(),
      },
    });
    return this;
  }

  /**
   * Calls the `/call/bridge` endpoint to bridge the call to `headNumber`
   * @param {number} extensionLength
   * @param {string} headNumber
   */
  addBridge(extensionLength: number, headNumber: string): this {
    this.$output.Lindenbaum.push({
      '/call/bridge': {
        dialogId: this.$request!.getSessionId(),
        extensionLength,
        headNumber,
      },
    });
    return this;
  }

  /**
   * Calls the `/call/forward` endpoint to forward the call to `destinationNumber`
   * @param {string} destinationNumber
   */
  addForward(destinationNumber: string): this {
    this.$output.Lindenbaum.push({
      '/call/forward': {
        dialogId: this.$request!.getSessionId(),
        destinationNumber,
      },
    });
    return this;
  }

  /**
   * Calls the `/call/data` endpoint to save additional data on the conversations
   * @param {string} key
   * @param {string} value
   */
  addData(key: string, value: string): this {
    this.$output.Lindenbaum.push({
      '/call/data': {
        dialogId: this.$request!.getSessionId(),
        key,
        value,
      },
    });
    return this;
  }

  isNewSession(): boolean {
    if (this.$user.$session) {
      return this.$user.$session.id !== this.$request!.getSessionId();
    } else {
      return false;
    }
  }

  hasAudioInterface(): boolean {
    return this.$request!.hasAudioInterface();
  }

  hasScreenInterface(): boolean {
    return this.$request!.hasScreenInterface();
  }

  hasVideoInterface(): boolean {
    return this.$request!.hasVideoInterface();
  }

  getSpeechBuilder(): SpeechBuilder | undefined {
    return new LindenbaumSpeechBuilder(this);
  }

  speechBuilder(): SpeechBuilder | undefined {
    return this.getSpeechBuilder();
  }

  getDeviceId(): undefined {
    return undefined;
  }

  getRawText(): string {
    const request = this.$request as LindenbaumRequest;
    return request.getRawText();
  }

  getTimestamp(): string {
    return this.$request!.getTimestamp();
  }

  getLocale(): string {
    return this.$request!.getLocale();
  }

  getType(): string {
    return Lindenbaum.appType;
  }

  getPlatformType(): string {
    return Lindenbaum.type;
  }

  getSelectedElementId(): undefined {
    return undefined;
  }

  getAudioData(): AudioData | undefined {
    return undefined;
  }

  /**
   * Returns the dialog data for the parsed `dialogId`.
   * If `dialogId` is not parsed, it uses the current request's `dialogId` property
   * @param {string} resellerToken
   * @param {string | undefined} dialogId
   * @returns {Promise<AxiosResponse<DialogAPIData>>}
   */
  async getDialogData(
    resellerToken: string,
    dialogId?: string,
  ): Promise<AxiosResponse<DialogAPIData>> {
    const request = this.$request as LindenbaumRequest;
    const options: DialogAPIRequestOptions = {
      resellerToken,
      dialogId: dialogId || request.dialogId,
    };
    return DialogAPI.getDialogData(options);
  }

  /**
   * Delete the dialog data for the parsed `dialogId`.
   * If `dialogId` is not parsed, it uses the current request's `dialogId` property
   * @param {string} resellerToken
   * @param {string | undefined} dialogId
   * @returns {Promise<AxiosResponse>}
   */
  async deleteDialogData(resellerToken: string, dialogId?: string): Promise<AxiosResponse> {
    const request = this.$request as LindenbaumRequest;
    const options: DialogAPIRequestOptions = {
      resellerToken,
      dialogId: dialogId || request.dialogId,
    };
    return DialogAPI.deleteDialogData(options);
  }
}
typescript
Chennai: The Income Tax (I-T) Department is carrying out simultaneous raids at 40 locations across Tamil Nadu, targeting the premises of government officers closely linked to the state's Minister for Excise, Electricity and Prohibition, V. Senthil Balaji. Friday's raids were being conducted in Chennai, Karur and Coimbatore. According to sources, stores under the Tamil Nadu State Marketing Corporation Limited (TASMAC) allegedly charged Rs 10-20 extra for every bottle of alcohol, and the money collected from across the state went to the coffers of Senthil Balaji. Tamil Nadu BJP State President K. Annamalai had demanded the resignation of Senthil Balaji from the cabinet in the backdrop of hooch tragedies in Villupuram and Chengalpattu. He had also argued that a fair investigation against Senthil Balaji is not possible while he remains in the cabinet, as the Supreme Court has directed the police and the Enforcement Directorate (ED) to resume a probe against him. The I-T sleuths had raided the G-Square real estate firm last month, allegedly connected to Chief Minister M. K. Stalin's family. The ED had conducted raids in connection with the alleged scam in the public sector electrical power generation and distribution undertaking, Tamil Nadu Generation and Distribution Corporation (TANGEDCO).
english
<reponame>GeeKaven/BangumiTV-Subject<gh_stars>0 {"date":"2004-01-12","platform":"TV","images":{"small":"https://lain.bgm.tv/pic/cover/s/46/bf/29278_ApWlA.jpg","grid":"https://lain.bgm.tv/pic/cover/g/46/bf/29278_ApWlA.jpg","large":"https://lain.bgm.tv/pic/cover/l/46/bf/29278_ApWlA.jpg","medium":"https://lain.bgm.tv/pic/cover/m/46/bf/29278_ApWlA.jpg","common":"https://lain.bgm.tv/pic/cover/c/46/bf/29278_ApWlA.jpg"},"summary":"故事时间是近代未来2023年的东京,社会国际化,流通的扩大。伴随这社会的变化,社会的黑暗面也在持续改变着。犯罪巧妙化,组织更加大规模化,社会治安持续恶化着。\r\n发现到事态严重性的当时政府,决定引入时代的系统,便是有组织并处理特殊犯罪的超常规警队。是为了要及时响应市民的要求,能迅速解决事件的少数精锐警察队之一。\r\n那就是特殊行动队伍--波利亚斯警察部队。","name":"BURN-UP SCRAMBLE","name_cn":"杀人科 SCRAMBLE","tags":[{"name":"2004","count":8},{"name":"AIC","count":8},{"name":"能登麻美子","count":7},{"name":"TV","count":5},{"name":"原创","count":4},{"name":"田中理恵","count":3},{"name":"釘宮理恵","count":3},{"name":"2004年1月","count":2},{"name":"季番組","count":2},{"name":"热血","count":2},{"name":"肉番","count":2},{"name":"补番","count":2},{"name":"TVA","count":1},{"name":"アニメ","count":1},{"name":"卖肉","count":1},{"name":"日本动画","count":1},{"name":"林宏樹","count":1},{"name":"植竹須美男","count":1},{"name":"记不清","count":1},{"name":"豊口めぐみ","count":1}],"infobox":[{"key":"中文名","value":"杀人科 SCRAMBLE"},{"key":"别名","value":[{"v":"バーンナップスクランブル"},{"v":"BURN UP SCRAMBLE"},{"v":"杀人科"}]},{"key":"话数","value":"12"},{"key":"放送开始","value":"2004年1月12日"},{"key":"放送星期","value":"星期一"},{"key":"官方网站","value":"http://www.anime-int.com/works/burn-up/scramble/"},{"key":"播放电视台","value":"ちばテレビ"},{"key":"其他电视台","value":"サンテレビ, TVK, メ〜テレ, AT-X, テレビ埼玉"},{"key":"播放结束","value":"2004年3月29日"}],"rating":{"rank":0,"total":23,"count":{"1":1,"2":0,"3":0,"4":2,"5":7,"6":6,"7":6,"8":1,"9":0,"10":0},"score":5.7},"total_episodes":12,"collection":{"on_hold":8,"dropped":5,"wish":47,"collect":38,"doing":7},"id":29278,"eps":12,"volumes":0,"locked":false,"nsfw":false,"type":2}
json
/* eslint-env jest */ import { join } from 'path' import { renderViaHTTP, findPort, launchApp, killApp, nextBuild, nextStart, getBuildManifest, } from 'next-test-utils' import fs from 'fs-extra' // test suite import clientNavigation from './client-navigation' const context = {} const appDir = join(__dirname, '../') jest.setTimeout(1000 * 60 * 5) const runTests = (isProd = false) => { clientNavigation(context, isProd) } describe('Client 404', () => { describe('dev mode', () => { beforeAll(async () => { context.appPort = await findPort() context.server = await launchApp(appDir, context.appPort) // pre-build page at the start await renderViaHTTP(context.appPort, '/') }) afterAll(() => killApp(context.server)) runTests() }) describe('production mode', () => { beforeAll(async () => { await nextBuild(appDir) context.appPort = await findPort() context.server = await nextStart(appDir, context.appPort) const manifest = await getBuildManifest(appDir) const files = manifest.pages['/missing'].filter((d) => /static[\\/]chunks[\\/]pages/.test(d) ) if (files.length < 1) { throw new Error('oops!') } await Promise.all(files.map((f) => fs.remove(join(appDir, '.next', f)))) }) afterAll(() => killApp(context.server)) runTests(true) }) })
javascript
package com.slimgears.rxrpc.sample; public class SampleCircularReferenceData { public SampleCircularReferenceData data() { return null; } }
java
{"id":"../node_modules/o3/lib/abstractMethod.js","dependencies":[{"name":"/workspace/certificates-near-smart-contract/package.json","includedInParent":true,"mtime":1646951090919},{"name":"/workspace/certificates-near-smart-contract/node_modules/o3/package.json","includedInParent":true,"mtime":1646951093000}],"generated":{"js":"module.exports = function () {\r\n throw new Error(\"Not implemented.\");\r\n};"},"sourceMaps":{"js":{"mappings":[{"source":"../node_modules/o3/lib/abstractMethod.js","original":{"line":1,"column":0},"generated":{"line":1,"column":0}},{"source":"../node_modules/o3/lib/abstractMethod.js","original":{"line":2,"column":0},"generated":{"line":2,"column":0}},{"source":"../node_modules/o3/lib/abstractMethod.js","original":{"line":3,"column":0},"generated":{"line":3,"column":0}}],"sources":{"../node_modules/o3/lib/abstractMethod.js":"module.exports = function () {\r\n throw new Error(\"Not implemented.\");\r\n};"},"lineCount":3}},"error":null,"hash":"2dfefc2444e36cb2915756081748a843","cacheData":{"env":{}}}
json
package pessoas; public class Funcionario { boolean isCLT; String name; public Funcionario(boolean isCLT, String name) { this.isCLT = isCLT; this.name = name; } public void showFuncionario() { System.out.println(this.isCLT + " " + this.name); } }
java
Washington: The State Department Friday told US citizens to leave Iraq ‘immediately’, after an American strike killed top Iranian commander Qasem Soleimani in Baghdad. “Due to heightened tensions in Iraq and the region, we urge US citizens to depart Iraq immediately,” the State Department tweeted. The US announced earlier Friday that it had killed the powerful general in a strike on Baghdad’s international airport, in which the deputy chief of Iraq’s powerful Hashed al-Shaabi paramilitary force also died. Tensions with Iraq were already running high after pro-Iranian protesters laid siege to the US embassy Tuesday, reacting to weekend airstrikes that killed at least 25 fighters from the hardline Kataeb Hezbollah paramilitary group. The strikes were in response to a 36-rocket attack last week that killed an American contractor at an Iraqi base. The Pentagon said Soleimani had orchestrated attacks on coalition bases in Iraq over the past few months, including on December 27, the day the contractor was killed. Soleimani ‘also approved the attacks’ on the US embassy in Baghdad, according to the Pentagon. The US State Department had issued a travel advisory for Iraq January 1, warning citizens not to travel to the country. In a separate statement Friday, the US embassy in Baghdad also urged American citizens in Iraq to ‘depart immediately’ for fear of fallout. “US citizens should depart via airline while possible, and failing that, to other countries via land,” the embassy said in a statement. American nationals working at Iraqi oil fields were already evacuating the country, an oil ministry spokesman said. Several had already left Friday morning and others were preparing to fly out, Assem Jihad told AFP, adding that there was ‘no impact’ on Iraq’s oil production. (AFP)
english
I love this brand of NAC. I find it to be of very high quality: the strength I need, and well packaged. Getting enough cysteine is important for many aspects of your health. Natural ingredients, correct size, and a three-month-and-ten-day supply that is worth your money. I have been using this for years and am totally satisfied!
english
The second World Test Championship final is already underway at The Oval, and on the first day, Team India looked clueless, to say the least. When the toss happened, conditions were overcast, and Rohit Sharma’s choice to bowl first must have been the result of detailed consultation with the coach, Rahul Dravid. After initial success, it was all leather-chasing for Team India for the remainder of the day. Although Australian captain Pat Cummins suggested that he too would have chosen to bowl first had he won the toss, after the first hour’s play the sun shone brightly and there was no seam or swing on offer for the Indian bowlers. Before every game, the decision about the toss is always discussed between the team captain, the coach and the senior players. But at the end of the day, the captain and the coach will decide. For such an important game as the World Test Championship final, which is after all a Test match, one wonders how a decision can be taken based on the first hour’s weather conditions. Almost all the weather apps, since last Sunday, had been suggesting bright sunny conditions throughout the five days at The Oval, except for Saturday, when there were chances of showers. Yet, the decision to bowl first based on the first hour’s conditions baffled many Indian cricket fans. Not only this, but leaving out Ravichandran Ashwin for Shardul Thakur and Umesh Yadav has also put fans in shock. It is not surprising that these two critical decisions, which put the Aussies in the driver’s seat, have made cricket fans angry not only at Rohit Sharma but also at Rahul Dravid, the coach. The fans are of the opinion that under Rahul Dravid’s coaching, Team India has not only gone defensive but also made too many changes in the team when they were absolutely not required. Some fans are also remembering the days of Virat Kohli and Ravi Shastri, when the team was largely settled and winning almost all the Test matches and series. 
Yes, India couldn’t win the inaugural World Test Championship final against New Zealand, but it didn’t look clueless during that game at least. It is evident that when Team India falters because of mistakes made by the coach, the fans will pour their anger on him on social media. Currently, ‘Dravid’ is trending on Twitter, and cricket fans are showing their anger at his defensive approach as well as at the decision to drop Ravichandran Ashwin for this all-important Test. Let us look at some of those reactions to Team India’s coach. User Balasubramanian feels that Rahul Dravid is an embarrassment as a coach. It sounds harsh, but he also has his reasons to support it. Rahul Dravid has been an embarrassment of a coach for this Indian team so far, especially in the test arena. Fumbling the absolute basics, messing up with well-set combinations, losing out on key moments, horrific management of work load and injuries and what not. If ‘horses for courses’ were the policy, Dravid shouldn’t be our T20 & ODI coach. Kirat is a bit harsh on the coach and claims he has taken the team back to 2012. Dr. Krishnamurthy Subramanian is criticizing the decision to pick Umesh Yadav over Ravichandran Ashwin by citing the difference between their bowling and batting skills. Crazy that we have played Umesh Yadav instead of Ashwin: 3. Ashwin is a much better batsman than Umesh. Tapan Joshi has tweeted something interesting; he is remembering the dark days of Indian cricket when Rahul Dravid was the captain and Greg Chappell was the coach. Rahul Dravid the coach seems to be cut from the same cloth as Greg Chappell. Give them Under-19s and they will make them cricketers. But can’t handle international cricketers. Ayush has tweeted a video in which the former coach Ravi Shastri’s attacking approach towards Test matches is shown. By tweeting this video, he is suggesting that Rahul Dravid is way too defensive as a coach. 
Niks has quoted a tweet and shown how Ravichandran Ashwin could have been effective at The Oval, as he played for the home team Surrey a few seasons back. It surely hurts when fans in general can understand what to do and what not to do in a particular situation, yet greats like Rohit Sharma and Rahul Dravid just cannot see it at the ground. With a huge score already on the board, India will be playing catch-up from here on, not to win but to draw the Test, and it will be painful to watch.
english
<gh_stars>0 // +build js package main import ( "github.com/helmutkemper/iotmaker.santa_isabel_theater.platform.webbrowser/factoryBrowserDocument" "github.com/helmutkemper/iotmaker.santa_isabel_theater.platform.webbrowser/factoryBrowserHtml" ) func main() { done := make(chan struct{}, 0) browserDocument := factoryBrowserDocument.NewDocument() factoryBrowserHtml.NewImage( browserDocument.SelfDocument, map[string]interface{}{ "id": "player", "src": "./player_big.png", }, true, true, ) <-done }
go
/* * Copyright (c) 2009-2021 jMonkeyEngine * All rights reserved. * * Redistribution and use in source and binary forms, with or without * modification, are permitted provided that the following conditions are * met: * * * Redistributions of source code must retain the above copyright * notice, this list of conditions and the following disclaimer. * * * Redistributions in binary form must reproduce the above copyright * notice, this list of conditions and the following disclaimer in the * documentation and/or other materials provided with the distribution. * * * Neither the name of 'jMonkeyEngine' nor the names of its contributors * may be used to endorse or promote products derived from this software * without specific prior written permission. * * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED * TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF * LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING * NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS * SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ package com.jme3.network.kernel; import com.jme3.network.Filter; import java.nio.ByteBuffer; /** * Defines the basic byte[] passing messaging * kernel. * * @version $Revision$ * @author <NAME> */ public interface Kernel { /** * A marker envelope returned from read() that indicates that * there are events pending. This allows a single thread to * more easily process the envelopes and endpoint events. 
*/ public static final Envelope EVENTS_PENDING = new Envelope( null, new byte[0], false ); /** * Initializes the kernel and starts any internal processing. */ public void initialize(); /** * Gracefully terminates the kernel and stops any internal * daemon processing. This method will not return until all * internal threads have been shut down. */ public void terminate() throws InterruptedException; /** * Dispatches the data to all endpoints managed by the * kernel that match the specified endpoint filter. * If 'copy' is true then the implementation will copy the byte buffer * before delivering it to endpoints. This allows the caller to reuse * the data buffer. Though it is important that the buffer not be changed * by another thread while this call is running. * Only the bytes from data.position() to data.remaining() are sent. */ public void broadcast( Filter<? super Endpoint> filter, ByteBuffer data, boolean reliable, boolean copy ); /** * Returns true if there are waiting envelopes. */ public boolean hasEnvelopes(); /** * Removes one envelope from the received messages queue or * blocks until one is available. */ public Envelope read() throws InterruptedException; /** * Removes and returns one endpoint event from the event queue or * null if there are no endpoint events. */ public EndpointEvent nextEvent(); }
java
<reponame>remondis-it/ReMap package com.remondis.remap; /** * Represents a field value mapping result. This result carries the actual value to write to the destination object and * an operation flag indicating situations where a value, <code>null</code> or nothing should be written to destination. */ public class MappedResult { private Object value; private MappingOperation operation; private MappedResult(Object value, MappingOperation operation) { super(); this.value = value; this.operation = operation; } /** * @return Returns a {@link MappedResult} that signals that the mapping should be skipped. */ public static MappedResult skip() { return new MappedResult(null, MappingOperation.SKIP); } /** * @param value The actual value the mapping returns. * @return Returns a {@link MappedResult} that signals that the mapping should be used even if <code>null</code> is * returned. */ public static MappedResult value(Object value) { return new MappedResult(value, MappingOperation.VALUE); } public Object getValue() { return value; } public MappingOperation getOperation() { return operation; } /** * @return Returns <code>true</code> if a mapped value is present. */ public boolean hasValue() { return MappingOperation.VALUE == this.getOperation(); } @Override public String toString() { return "MappedResult [value=" + value + ", operation=" + operation + "]"; } }
java
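The `MappedResult` class above pairs a value with an operation flag so that a mapped `null` and "skip this field entirely" can be told apart. A rough Python sketch of that same result-carrier pattern (the names below are mine, not part of the ReMap API):

```python
from enum import Enum

class MappingOperation(Enum):
    SKIP = "skip"    # nothing should be written to the destination
    VALUE = "value"  # the carried value (even None) should be written

class MappedResult:
    def __init__(self, value, operation):
        self.value = value
        self.operation = operation

    @classmethod
    def skip(cls):
        return cls(None, MappingOperation.SKIP)

    @classmethod
    def of(cls, value):
        # Unlike skip(), an explicit None here is still a real mapped value.
        return cls(value, MappingOperation.VALUE)

    def has_value(self):
        return self.operation is MappingOperation.VALUE

assert not MappedResult.skip().has_value()
assert MappedResult.of(None).has_value()  # None is still "a value was mapped"
```

The point of the flag is that `value` alone is ambiguous: both `skip()` and `of(None)` carry `None`, and only the operation distinguishes them.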
{ "name": "Tweego", "description": "A command line Twine compiler.", "url": "https://www.motoslave.net/tweego/" }
json
Max Muncy has been one of the premier infielders for the Los Angeles Dodgers in recent years. With his contract expiring in 2023, he will be one of the top free agents on the market next season. Although Max Muncy has become a household name in baseball circles, less is known about the Dodger slugger's personal life. Muncy married fellow Texas native Kellie Cline in 2018. Kellie was born in Midland, Texas, and graduated from Baylor University with a Bachelor of Science (BS) in Fashion Merchandising in 2013. She has worked in the fashion industry before, but she now stays at home to help raise her growing family. After marrying Max in November 2018, the pair welcomed their first child, Sophie Kate Muncy, to the world in July 2021. The pair currently live with their young daughter in the Los Angeles area. Kellie and Max Muncy are both devout Christians and have made as much clear through their various social media channels. In the offseason, the couple usually involve themselves with the First Baptist Church in Dallas. Muncy was drafted by the Cleveland Indians in 2009 but did not sign, opting instead to attend Baylor University. He was later drafted by the Oakland Athletics in the fifth round of the 2012 MLB Draft. A utility infielder, Muncy has been known to play third, second, and first base. In his first two seasons in the majors, Muncy was a mediocre player for the Athletics. He didn't become a big name in baseball until he joined the Dodgers in 2018. Muncy hit 35 home runs in consecutive years in 2018 and 2019. In addition, his 98 RBIs in 2019 earned him his first All-Star selection. His home run and six RBIs were pivotal in the Dodgers' victory over the Tampa Bay Rays in the 2020 World Series. 
Max Muncy's contract with the Dodgers expires at the end of next season, after he signed a three-year deal with the team in 2020. Muncy had 21 home runs and 69 RBIs in 2022. His batting average, however, was a dismal .196. Kellie and Max will both be hoping for a strong showing during his contract year.
english
package org.adorsys.adpharma.client.jpa.login; import javafx.concurrent.Service; import javafx.concurrent.Task; import javax.inject.Inject; public class LoginRemoveService extends Service<Login> { @Inject private LoginService remoteService; private Login entity; public LoginRemoveService setEntity(Login entity) { this.entity = entity; return this; } @Override protected Task<Login> createTask() { return new Task<Login>() { @Override protected Login call() throws Exception { return remoteService.deleteById(entity.getId()); } }; } }
java
{ "unversionedId": "snippets/python-snippets", "id": "snippets/python-snippets", "title": "Python code", "description": "Clean filenames in a folder", "source": "@site/docs/snippets/python-snippets.mdx", "sourceDirName": "snippets", "slug": "/snippets/python-snippets", "permalink": "/recohut/docs/snippets/python-snippets", "editUrl": "https://github.com/sparsh-ai/recohut/docs/snippets/python-snippets.mdx", "tags": [], "version": "current", "frontMatter": {}, "sidebar": "tutorialSidebar", "previous": { "title": "Word2vec", "permalink": "/recohut/docs/tutorials/word2vec" }, "next": { "title": "Unix shell", "permalink": "/recohut/docs/snippets/unix-shell-snippets" } }
json
Thursday November 03, 2022, Cloud automation is the use of tools and processes to reduce or minimise manual intervention associated with configuring and managing public, private, and hybrid cloud environments. Automation tools help draw optimal performance from the cloud ecosystem and streamline activities related to cloud computing. Automation can improve efficiency by reducing the need for IT teams to manage repetitive tasks or make decisions about capacity or performance in real-time. Utilising automation to run workloads in a cloud environment rather than an on-premises set-up can maximise budget and resources for organisations. As enterprises, startups and mid-sized companies move to public cloud ecosystems to bypass years of legacy technology processes and incorporate digital innovation, cloud automation has rapidly become a cornerstone of the success of setting up optimal cloud workflows. Yet, the building of sophisticated cloud automation and the overall management of these systems across separate teams and functions is something often found lacking. It is important to understand that cloud automation needs to become a central component of overall cloud strategy today. Understanding what can be automated in the cloud, which tools can help achieve automation and optimisation, and how to leverage cloud effectively at scale is imperative for organisations and industries, so that they can gain maximum ROI from their cloud expenditures. The roundtable ‘Automation on Cloud for workflow optimisation’ will feature startups across industries to discuss how automation on cloud can help accelerate business outcomes, the need for automation for digital transformation, handling scalability and complexity of cloud environments through automated workflows, accelerating competitive advantage, and optimisation of resources that are today helping organisations gain maximum ROIs from their cloud expenditures. 
Hosted by YourStory and Google, the roundtable aims to bring stakeholders from across segments on a common platform to discuss technology opportunities and share best practices in automation that can help organisations to optimise resources and gain competitive advantage in future. Amit Dixit, CTO at; Anilkumar Varma, CTO at MCXCCL; Manjunath Athrey, Head of Engineering, , Rashmi Tambe, Modernisation and Cloud Strategy Lead, ; Dinesh Varadharajan, Chief Product Officer, ; and Akshay Kapoor, Customer Engineer, Google Cloud will be the speakers for the discussion. To all stakeholders, founders, experts, this is your chance to be part of the discussion with your questions and insights to understand how cloud automation can help workflow optimisation.
english
["deadorbit","firebase-util","postmessage-json-rpc","promisescript","react-c3-wrapper","react-gateway","react-modal2","react-tinymce-mention"]
json
import datetime import os from dateutil import rrule from osgeo import gdal import numpy as np np.set_printoptions(linewidth=700, precision=2) startTime = datetime.datetime.now() print startTime def cells(array): window = array[480:510, 940:970] return window # Set start datetime object start, end = datetime.datetime(2000, 1, 1), datetime.datetime(2013, 12, 31) path = 'C:\\Recharge_GIS\\OSG_Data\\current_use' raster = 'aws_mod_4_21_10_0' aws_open = gdal.Open('{a}\\{b}.tif'.format(a=path, b=raster)) taw = np.array(aws_open.GetRasterBand(1).ReadAsArray(), dtype=float) dataset = aws_open min_val = np.ones(taw.shape) pKcb = np.zeros(taw.shape) cum_kcb = min_val taw = [] aws_open = [] x = 0 for dday in rrule.rrule(rrule.DAILY, dtstart=start, until=end): doy = dday.timetuple().tm_yday if 121 < doy < 305: pass else: x += 1 print "Time : {a} day {b}_{c}".format(a=str(datetime.datetime.now() - startTime), b=doy, c=dday.year) # NDVI to kcb if dday.year == 2000: path = 'F:\\NDVI\\NDVI_std_all' ras_list = os.listdir('F:\\NDVI\\NDVI_std_all') obj = [1, 49, 81, 113, 145, 177, 209, 241, 273, 305, 337] if doy < 49: strt = 1 band = doy nd = 48 raster = '{a}\\T{b}_{c}_2000_etrf_subset_001_048_ndvi_daily.tif'.format(a=path, b=str(strt).rjust(3, '0'), c=str(nd).rjust(3, '0')) ndvi_open = gdal.Open(raster) ndvi = np.array(ndvi_open.GetRasterBand(band).ReadAsArray(), dtype=float) ndvi_open = [] kcb = ndvi * 1.25 else: for num in obj[1:]: diff = doy - num if 0 <= diff <= 31: pos = obj.index(num) strt = obj[pos] band = diff + 1 if num == 337: nd = num + 29 else: nd = num + 31 raster = '{a}\\T{b}_{c}_2000_etrf_subset_001_048_ndvi_daily.tif'.format(a=path, b=str(strt).rjust(3, '0'), c=str(nd).rjust(3, '0')) ndvi_open = gdal.Open(raster) ndvi = np.array(ndvi_open.GetRasterBand(band).ReadAsArray(), dtype=float) ndvi_open = [] kcb = ndvi * 1.25 elif dday.year == 2001: path = "F:\\NDVI\\NDVI_std_all" obj = [1, 17, 33, 49, 65, 81, 97, 113, 129, 145, 161, 177, 193, 209, 225, 241, 257, 273, 289, 
305, 321, 337, 353] for num in obj: diff = doy - num if 0 <= diff <= 15: pos = obj.index(num) strt = obj[pos] band = diff + 1 if num == 353: nd = num + 12 else: nd = num + 15 raster = '{a}\\{b}_{c}_{d}.tif'.format(a=path, b=dday.year, c=strt, d=nd) ndvi_open = gdal.Open(raster) ndvi = np.array(ndvi_open.GetRasterBand(band).ReadAsArray(), dtype=float) ndvi_open = [] kcb = ndvi * 1.25 else: path = "F:\\NDVI\\NDVI_std_all" obj = [1, 17, 33, 49, 65, 81, 97, 113, 129, 145, 161, 177, 193, 209, 225, 241, 257, 273, 289, 305, 321, 337, 353] for num in obj: diff = doy - num if 0 <= diff <= 15: pos = obj.index(num) strt = obj[pos] band = diff + 1 if num == 353: nd = num + 12 else: nd = num + 15 raster = '{a}\\{b}_{c}.tif'.format(a=path, b=dday.year, c=pos+1, d=nd) ndvi_open = gdal.Open(raster) ndvi = np.array(ndvi_open.GetRasterBand(band).ReadAsArray(), dtype=float) ndvi_open = [] kcb = ndvi * 1.25 kcb = np.where(np.isnan(kcb) == True, pKcb, kcb) kcb = np.where(kcb > 3.0, pKcb, kcb) cum_kcb += kcb mean_kcb = cum_kcb / x outputs = [mean_kcb] output_names = ['mean_kcb2'] x = 0 now = datetime.datetime.now() tag = '{}_{}'.format(now.month, now.day) for element in outputs: name = output_names[x] print "Saving {a}".format(a=name) driver = gdal.GetDriverByName('GTiff') filename = 'C:\\Recharge_GIS\\Array_Results\\{a}.tif'.format(a=name) cols = dataset.RasterXSize rows = dataset.RasterYSize bands = dataset.RasterCount band = dataset.GetRasterBand(1) datatype = band.DataType outDataset = driver.Create(filename, cols, rows, bands, datatype) geoTransform = dataset.GetGeoTransform() outDataset.SetGeoTransform(geoTransform) proj = dataset.GetProjection() outDataset.SetProjection(proj) outBand = outDataset.GetRasterBand(1) outBand.WriteArray(element, 0, 0) x += 1
python
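The NDVI script above repeatedly maps a day-of-year to a 16-day composite (its start day, end day, and the band index within that composite file). That lookup can be isolated and checked on its own; this sketch reproduces the logic of the script's 2001-onwards branch (start days 1, 17, ..., 353, with the final composite truncated to 12 days):

```python
def composite_for_doy(doy, starts=None):
    """Return (start_day, end_day, band) for the 16-day composite containing doy."""
    if starts is None:
        starts = list(range(1, 354, 16))  # 1, 17, 33, ..., 353
    for start in starts:
        diff = doy - start
        if 0 <= diff <= 15:
            band = diff + 1  # 1-based band index within the composite raster
            # The last composite of the year only spans 12 extra days.
            end = start + (12 if start == 353 else 15)
            return start, end, band
    raise ValueError("day-of-year out of range")

assert composite_for_doy(1) == (1, 16, 1)
assert composite_for_doy(16) == (1, 16, 16)
assert composite_for_doy(17) == (17, 32, 1)
assert composite_for_doy(365) == (353, 365, 13)
```

Factoring the lookup out like this would also make the script's 2000 branch (which uses irregular start days) easier to test, since the same function accepts a custom `starts` list.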
--- Description: The following is a list of gesture glyphs that Microsoft plans to support in the future as part of the Microsoft gesture recognizer. ms.assetid: 4d504140-ff48-4a07-9bf7-a36913e44426 title: Unimplemented Glyphs ms.topic: article ms.date: 05/31/2018 --- # Unimplemented Glyphs The following is a list of gesture glyphs that Microsoft plans to support in the future as part of the *Microsoft gesture recognizer*. Work is in progress to ensure that the recognition accuracy for these glyphs is high (to avoid accidental activation) and that they map to the appropriate operations. To ensure consistency of *gestures* used for common *actions* between applications, you should adhere to the following suggestions: - The *Action* is the suggested semantic behavior associated with the gesture. - For the gestures labeled as *Fixed* in the following table, it is recommended that you not change the suggested semantic behavior. If an application does not have a need for the specified semantic behavior, it is recommended that you not reuse the gesture for another action or semantic. - You should feel free to associate relevant semantic behaviors to all other gestures. These gestures are labeled as *Application-specific*. For those gestures that have a suggested semantic behavior, it is recommended that you use the gesture for the suggested semantic if the corresponding functionality exists in your application. - The hot point of a gesture is a distinguishing point in the geometry of the gesture. The hot point can be used to determine where the gesture was made. The gestures API makes it possible to determine the hot point for a given gesture. However, not all gestures have a specific distinguishing hot point. For those that do not have a specific distinguishing hot point, the starting point is reported as the hot point. > [!Note] > Some of the gestures do have a distinguishing hot point that just happens to be the starting point. These are distinguished in the table.   
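As the passage notes, a gesture's hot point defaults to its starting point whenever no distinguishing point exists (e.g. the stroke intersection for a cross). A toy sketch of that default rule follows; the data model here is invented for illustration and is not the Windows gestures API:

```python
def hot_point(strokes, distinguishing_point=None):
    """Return the gesture's hot point.

    strokes: list of strokes, each a list of (x, y) points.
    distinguishing_point: a precomputed point such as a cross gesture's
    stroke intersection, if the gesture has one.
    Falls back to the first point of the first stroke, per the default rule.
    """
    if distinguishing_point is not None:
        return distinguishing_point
    return strokes[0][0]

gesture = [[(0, 0), (5, 5)], [(0, 5), (5, 0)]]
assert hot_point(gesture) == (0, 0)             # default: starting point
assert hot_point(gesture, (2.5, 2.5)) == (2.5, 2.5)  # cross-style override
```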
| Gesture Name | Action | Fixed | Hot point | |-----------------------------------|------------------------------------------------|---------------------------------|---------------------------------------------------------------------| | Infinity<br/> | Switch into and out of gesture mode<br/> | Fixed<br/> | Starting point<br/> | | Cross<br/> | Delete<br/> | Application-specific<br/> | Intersection of the strokes<br/> | | Paragraph mark<br/> | New paragraph<br/> | Fixed<br/> | Starting point<br/> | | Section<br/> | New section<br/> | Fixed<br/> | Starting point<br/> | | Bullet<br/> | Application-specific<br/> | Application-specific<br/> | Starting point<br/> | | Bullet-cross<br/> | Application-specific<br/> | Application-specific<br/> | Starting point<br/> | | Squiggle<br/> | Bold<br/> | Application-specific<br/> | Starting point<br/> | | Swap<br/> | Exchange content<br/> | Application-specific<br/> | Middle of the up stroke<br/> | | Openup<br/> | Open up space between words<br/> | Application-specific<br/> | Middle of the down stroke<br/> | | Closeup<br/> | Close up extra space<br/> | Application-specific<br/> | Center of the vertical space between the two parentheses<br/> | | Rectangle<br/> | Selects enclosed content<br/> | Fixed<br/> | Starting point<br/> | | Circle-tap<br/> | Application-specific<br/> | Application-specific<br/> | Tap<br/> | | Circle-circle<br/> | Application-specific<br/> | Application-specific<br/> | Starting point<br/> | | Circle-cross<br/> | Application-specific<br/> | Application-specific<br/> | Starting point<br/> | | Circle-line-vertical<br/> | Application-specific<br/> | Application-specific<br/> | Starting point<br/> | | Circle-line-horizontal<br/> | Application-specific<br/> | Application-specific<br/> | Starting point<br/> | | Double-arrow-up<br/> | Application-specific<br/> | Application-specific<br/> | Apex of the arrow head<br/> | | Double-arrow-down<br/> | Application-specific<br/> | Application-specific<br/> | Apex of the arrow 
| Gesture | Function | Type | Hot point |
| --- | --- | --- | --- |
| Double-arrow-left | Application-specific | Application-specific | Apex of the arrow head |
| Double-arrow-right | Application-specific | Application-specific | Apex of the arrow head |
| Up-arrow-left | Application-specific | Application-specific | Apex of the arrow head |
| Up-arrow-right | Application-specific | Application-specific | Apex of the arrow head |
| Down-arrow-left | Application-specific | Application-specific | Apex of the arrow head |
| Down-arrow-right | Application-specific | Application-specific | Apex of the arrow head |
| Left-arrow-up | Application-specific | Application-specific | Apex of the arrow head |
| Left-arrow-down | Application-specific | Application-specific | Apex of the arrow head |
| Right-arrow-up | Application-specific | Application-specific | Apex of the arrow head |
| Right-arrow-down | Application-specific | Application-specific | Apex of the arrow head |
| Diagonal-leftup | Application-specific | Application-specific | Starting point |
| Diagonal-rightup | Application-specific | Application-specific | Starting point |
| Diagonal-leftdown | Application-specific | Application-specific | Starting point |
| Diagonal-rightdown | Application-specific | Application-specific | Starting point |
| Latin-Letter-A | Application-specific | Application-specific | Starting point |
| Latin-Letter-B | Bold | Fixed | Starting point |
| Latin-Letter-C | Application-specific | Application-specific | Starting point |
| Latin-Letter-D | Application-specific | Application-specific | Starting point |
| Latin-Letter-E | Application-specific | Application-specific | Starting point |
| Latin-Letter-F | Application-specific | Application-specific | Starting point |
| Latin-Letter-G | Application-specific | Application-specific | Starting point |
| Latin-Letter-H | Application-specific | Application-specific | Starting point |
| Latin-Letter-I | Italic | Application-specific | Starting point |
| Latin-Letter-J | Application-specific | Application-specific | Starting point |
| Latin-Letter-K | Application-specific | Application-specific | Starting point |
| Latin-Letter-L | Application-specific | Application-specific | Starting point |
| Latin-Letter-M | Application-specific | Application-specific | Starting point |
| Latin-Letter-N | Application-specific | Application-specific | Starting point |
| Latin-Letter-O | Application-specific | Application-specific | Starting point |
| Latin-Letter-P | Application-specific | Application-specific | Starting point |
| Latin-Letter-Q | Application-specific | Application-specific | Starting point |
| Latin-Letter-R | Application-specific | Application-specific | Starting point |
| Latin-Letter-S | Application-specific | Application-specific | Starting point |
| Latin-Letter-T | Application-specific | Application-specific | Starting point |
| Latin-Letter-U | Application-specific | Application-specific | Starting point |
| Latin-Letter-V | Application-specific | Application-specific | Starting point |
| Latin-Letter-W | Application-specific | Application-specific | Starting point |
| Latin-Letter-X | Cut | Application-specific | Starting point |
| Latin-Letter-Y | Application-specific | Application-specific | Starting point |
| Latin-Letter-Z | Application-specific | Application-specific | Starting point |
| Digit-0 | Application-specific | Application-specific | Starting point |
| Digit-1 | Application-specific | Application-specific | Starting point |
| Digit-2 | Application-specific | Application-specific | Starting point |
| Digit-3 | Application-specific | Application-specific | Starting point |
| Digit-4 | Application-specific | Application-specific | Starting point |
| Digit-5 | Application-specific | Application-specific | Starting point |
| Digit-6 | Application-specific | Application-specific | Starting point |
| Digit-7 | Application-specific | Application-specific | Starting point |
| Digit-8 | Application-specific | Application-specific | Starting point |
| Digit-9 | Application-specific | Application-specific | Starting point |
| Question mark | Help | Fixed | Center along the height of the curve |
| Sharp | Application-specific | Application-specific | Starting point |
| Dollar | Application-specific | Application-specific | Starting point |
| Asterisk | Application-specific | Application-specific | Center |
| Plus | Paste | Fixed | Intersection of the strokes |
| Double-up | Scroll up | Fixed | Starting point |
| Double-down | Scroll down | Fixed | Starting point |
| Double-left | Scroll left | Fixed | Starting point |
| Double-right | Scroll right | Fixed | Starting point |
| Triple-up | Page up | Fixed | Starting point |
| Triple-down | Page down | Fixed | Starting point |
| Triple-left | Application-specific | Application-specific | Starting point |
| Triple-right | Application-specific | Application-specific | Starting point |
| Bracket-over | Application-specific | Application-specific | Midpoint |
| Bracket-under | Application-specific | Application-specific | Midpoint |
| Bracket-left | Start of selection | Fixed | Midpoint |
| Bracket-right | End of selection | Fixed | Midpoint |
| Brace-over | Application-specific | Application-specific | Midpoint |
| Brace-under | Application-specific | Application-specific | Midpoint |
| Brace-left | Start of discontinuous selection | Fixed | Midpoint |
| Brace-right | End of discontinuous selection | Fixed | Midpoint |
| Triple-tap | Application-specific | Application-specific | Starting point (distinguishing hot point) |
| Quadruple-tap | Application-specific | Application-specific | Starting point (distinguishing hot point) |
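A gesture table like this maps naturally onto a small lookup structure that a recognizer can consult after classifying a stroke. A minimal Python sketch (the `Gesture` class and `GESTURES` map are illustrative, not from any real toolkit; only a few fixed-binding rows are transcribed):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class Gesture:
    """One row of the gesture table: name, bound function, binding type, hot point."""
    name: str
    function: str   # "Application-specific" unless the binding is fixed
    binding: str    # "Fixed" or "Application-specific"
    hot_point: str


# A few representative fixed-binding rows transcribed from the table above.
GESTURES = {
    g.name: g
    for g in [
        Gesture("Latin-Letter-B", "Bold", "Fixed", "Starting point"),
        Gesture("Question mark", "Help", "Fixed", "Center along the height of the curve"),
        Gesture("Plus", "Paste", "Fixed", "Intersection of the strokes"),
        Gesture("Double-up", "Scroll up", "Fixed", "Starting point"),
        Gesture("Bracket-left", "Start of selection", "Fixed", "Midpoint"),
    ]
}


def lookup(name: str) -> Optional[Gesture]:
    """Return the fixed binding for a gesture, or None if it is application-specific."""
    return GESTURES.get(name)
```

An application would fall back to its own bindings when `lookup` returns `None`, since most rows in the table leave both function and type application-specific.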
---
layout: post
amendno: 39-4522
cadno: CAD2004-MULT-34
title: Lighting - BRUCE Cabin Floor Emergency Escape Path Marking System
date: 2004-08-10 00:00:00 +0800
effdate: 2004-08-13 00:00:00 +0800
tag: MULT
categories: Airworthiness Certification Division, CAAC Northwest Regional Administration
author: Yang Zizhou
---

## Applicability:

All Airbus A310-308, A310-324, A310-325 and A300 B4-605R aircraft on which Airbus modification 06810 or 6934 (BRUCE cabin floor emergency escape path marking system) was embodied in production, and on which Airbus SB A310-33-2045 or SB A300-33-6047 has not been accomplished.

Note: this system was installed in production on the following aircraft:

A310 aircraft: manufacturer serial numbers 0439, 0442, 0443, 0449, 0451, 0452, 0453, 0455, 0456, 0457, 0458, 0467, 0493, 0500, 0501, 0534, 0539, 0542, 0548, 0549, 0570, 0574, 0576, 0585, 0587, 0589, 0590, 0634, 0650, 0653, 0654, 0656, 0660, 0665, 0669, 0674, 0676, 0678, 0680, 0682, 0684, 0686, 0687, 0689, 0691, 0693 and 0697.

A300 B4-605R aircraft: manufacturer serial numbers 0725, 0741 and 0763.
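The applicability condition above is a simple boolean rule: an aircraft is affected if a BRUCE modification was embodied in production and neither terminating service bulletin has been accomplished. A hedged Python sketch of that rule (the function and set names are hypothetical, not from any CAAC or Airbus system):

```python
# Illustrative sketch of the AD's applicability rule; names are hypothetical.
AFFECTED_TYPES = {"A310-308", "A310-324", "A310-325", "A300 B4-605R"}
BRUCE_MODS = {"06810", "6934"}
TERMINATING_SBS = {"A310-33-2045", "A300-33-6047"}


def ad_applies(aircraft_type: str, mods_embodied: set, sbs_done: set) -> bool:
    """An aircraft is affected if a BRUCE mod was embodied and no terminating SB was done."""
    return (
        aircraft_type in AFFECTED_TYPES
        and bool(BRUCE_MODS & set(mods_embodied))
        and not (TERMINATING_SBS & set(sbs_done))
    )
```

Accomplishing either listed service bulletin would thus terminate the directive's applicability for that airframe.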
Bigg Boss 8: Interesting New Contestant Names Come To Light!

PR guru Dale Bhagwagar, who has earlier handled public relations for the maximum number of Bigg Boss contestants, including Rakhi Sawant, Kashmera Shah, Sambhavna Seth, Sherlyn Chopra, Pooja Misrra, Diana Hayden, Shamita Shetty, Ashmit Patel, Vindu Dara Singh, Zulfi Syed, Salil Ankola, Amar Upadhyay and Anita Advani, is ecstatic about the new season of the show. "... creates history," says the Bollywood publicist.

The names doing the rounds include: Deepshikha Nagpal (of serial Madhubala - Ek Ishq Ek Junoon), Diandra Soares (model and fashion designer), Karishma Tanna (of Grand Masti fame), Minissha Lamba (of Honeymoon Travels Pvt. Ltd. and Bachna Ae Haseeno fame), Natasha Stankovic (Serbian item girl from Satyagraha), Sonali Raut (of The Xpose fame), Soni Singh (of serial Ghar Ki Lakshmi Betiyann), Sukirti Kandpal (of serials Dill Mill Gayye and Pyaar Kii Ye Ek Kahaani), Aarya Babbar (Hindi and Punjabi film actor and son of actor-politician Raj Babbar), Gautam Gulati (of serial Diya Aur Baati Hum), Praneet Bhatt (Shakuni Mama of the new Mahabharat), Pritam Singh (RJ), Puneet Issar (Duryodhan of the old Mahabharat; Bhagwan Parshuram of the new Mahabharat and director of the film Garv - Pride And Honour), Sushant Divgikar (Mr Gay India 2014), Upen Patel (of 36 China Town and actress Amrita Arora's ex-boyfriend).
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<HTML>
<HEAD>
<TITLE>318_becauseiambatman.py</TITLE>
<META http-equiv="Content-Type" content="text/html; charset=UTF-8">
<script type="text/javascript">
<!--
function ZweiFrames(URL1,F1,URL2,F2)
{
parent.frames[F1].location.href=URL1;
parent.frames[F2].location.href=URL2;
}
//-->
</script>
</HEAD>
<BODY BGCOLOR="#ffffff">
<HR>
<H3><CENTER>318_becauseiambatman.py</CENTER></H3><HR>
<PRE>
class Solution:
    def minCost(self, n: int, cuts: List[int]) -&gt; int:
        cuts.sort()
        cuts.append(n)
<A NAME="0"></A>        idxs = [cuts[0]]
        for i in range(len(cuts)-1):
            idxs.append(cuts[i+1] - cuts[i])
        dp = [[0]*len<FONT color="#0000ff"><A HREF="javascript:ZweiFrames('match24-1.html#0',3,'match24-top.html#0',1)"><IMG SRC="forward.gif" ALT="other" BORDER="0" ALIGN="right"></A><B>(cuts) for _ in range(len(cuts))]
        for i in range(len(cuts)):
            for j in range(0, len(cuts)-i):
                if i == 0:
                    dp[j][j] = 0
                elif i == 1:
<A NAME="1"></A>                    dp[j][j+1] = idxs[j] + idxs[</B></FONT>j+1]
                else:
                    dp[j][i+j] = float('inf')
                    <FONT color="#f63526"><A HREF="javascript:ZweiFrames('match24-1.html#1',3,'match24-top.html#1',1)"><IMG SRC="forward.gif" ALT="other" BORDER="0" ALIGN="right"></A><B>for k in range(j, i+j):
                        dp[j][i+j] = min(dp[j][i+j], dp[j][k] + dp[k+1][i+j] + cuts[i+j] - (cuts[j-1] if j &gt;=1 else 0))
        return dp[0][</B></FONT>-1]
</PRE>
</BODY>
</HTML>
{"resourceType":"CodeSystem","id":"medication-package-form","meta":{"lastUpdated":"2019-10-24T11:53:00+11:00"},"extension":[{"url":"http://hl7.org/fhir/StructureDefinition/structuredefinition-ballot-status","valueString":"Informative"},{"url":"http://hl7.org/fhir/StructureDefinition/structuredefinition-fmm","valueInteger":1},{"url":"http://hl7.org/fhir/StructureDefinition/structuredefinition-wg","valueCode":"phx"}],"url":"http://hl7.org/fhir/medication-package-form","identifier":{"system":"urn:ietf:rfc:3986","value":"urn:oid:2.16.840.1.113883.4.642.1.362"},"version":"3.0.2","name":"MedicationContainer","status":"draft","experimental":false,"date":"2019-10-24T11:53:00+11:00","publisher":"HL7 (FHIR Project)","contact":[{"telecom":[{"system":"url","value":"http://hl7.org/fhir"},{"system":"email","value":"<EMAIL>"}]}],"description":"A coded concept defining the kind of container a medication package is packaged in","caseSensitive":true,"valueSet":"http://hl7.org/fhir/ValueSet/medication-package-form","content":"complete","concept":[{"code":"ampoule","display":"Ampoule","definition":"A sealed glass capsule containing a liquid"},{"code":"bottle","display":"Bottle","definition":"A container, typically made of glass or plastic and with a narrow neck, used for storing liquids."},{"code":"box","display":"Box","definition":"A container with a flat base and sides, typically square or rectangular and having a lid."},{"code":"cartridge","display":"Cartridge","definition":"A device of various configuration and composition used with a syringe for the application of anesthetic or other materials to a patient."},{"code":"container","display":"Container","definition":"A package intended to store pharmaceuticals."},{"code":"tube","display":"Tube","definition":"A long, hollow cylinder of metal, plastic, glass, etc., for holding medications, typically creams or
ointments"},{"code":"unitdose","display":"Unit Dose Blister","definition":"A dose of medicine prepared in an individual package for convenience, safety or monitoring."},{"code":"vial","display":"Vial","definition":"A small container, typically cylindrical and made of glass, used especially for holding liquid medications."}]}
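A CodeSystem like this is commonly flattened into a code-to-display map before use, e.g. for rendering coded values in a UI. A minimal Python sketch (the two-concept sample below abridges the resource above; a real file would be read with `json.load`):

```python
import json

# Abridged sample of the medication-package-form CodeSystem shown above.
code_system = json.loads("""
{"resourceType": "CodeSystem",
 "url": "http://hl7.org/fhir/medication-package-form",
 "concept": [
   {"code": "ampoule", "display": "Ampoule",
    "definition": "A sealed glass capsule containing a liquid"},
   {"code": "vial", "display": "Vial",
    "definition": "A small container, typically cylindrical and made of glass, used especially for holding liquid medications."}
 ]}
""")

# Flatten the concept list into a code -> display lookup table.
displays = {c["code"]: c["display"] for c in code_system.get("concept", [])}
```

Because `content` is `complete`, a lookup miss in the full resource genuinely means the code is not part of this system, not that the table is partial.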
{"id_1185":{"title":"Revelations of Divine Love","language":"English","totaltime":"7:20:02","url_librivox":"http://librivox.org/revelations-of-divine-love-by-julian-of-norwich/","url_iarchive":"http://www.archive.org/details/revelations_of_divine_love_drb_librivox","readers":["94"],"authors":"2421","genres":"Christianity - Other"}}
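Records like this store `totaltime` as an `H:MM:SS` string, which usually needs converting to seconds before durations can be sorted or summed. A small Python sketch (the inline record abridges the entry above):

```python
import json

# Abridged copy of the catalog record shown above.
record = json.loads(
    '{"id_1185": {"title": "Revelations of Divine Love", "totaltime": "7:20:02"}}'
)


def total_seconds(hms: str) -> int:
    """Convert an 'H:MM:SS' duration string to a number of seconds."""
    h, m, s = (int(part) for part in hms.split(":"))
    return h * 3600 + m * 60 + s


duration = total_seconds(record["id_1185"]["totaltime"])  # 7 h 20 m 2 s
```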
Earlier this week, Google blended its Chromecast device with Chrome OS software, introducing what it calls a Chromebit. The small stick of a computer plugs into the HDMI port of a display and supports an external keyboard and mouse. Google says the Chromebit will arrive from its hardware partner Asus this summer and cost less than $100. It's a clever approach that doesn't just take advantage of the same form-factor as its Chromecast streaming stick but also replicates another aspect: A low price. Granted, you're not getting the latest and greatest x86 chip inside. Instead, Google and Asus worked with Rockchip, which supplies a quad-core ARM-based processor for the Chromebit. Also inside the device are Wi-Fi (802.11ac) and Bluetooth 4.0 radios, 2GB of memory and 16GB of flash storage. My colleague Larry Dignan was quick to point out that the Chromebit is similar to Intel's take on the PC on a stick: The company debuted a $149 Compute Stick in January at the Consumer Electronics Show. Unlike Intel, which says the Compute Stick is "business ready," Google is aiming the Chromebit at students and educators, although I'm sure some consumers will be interested as well. I think it's a mistake for Google to overlook the enterprise and small business here, however. Both computing sticks address a key aspect of any company: Controlling costs. It's far less expensive to hand employees a plug and play HDMI dongle than it is to lease or buy laptops, desktops and other costly computers. Sure, employees will need a monitor, keyboard and mouse for either of these new computing devices, but in the case of a company that uses workstations, these should already be available. Depending on the type of business, laptops certainly make sense. In another lifetime, I managed the deployment and support of laptops for a national mortgage company. Those laptops and their 56k modem cards helped keep the sales force productive when mobile. 
The technical help desk I was responsible for in a different role, however, would have been well served by a smaller personal computer. We had shared workstations for different shifts at that time and the sign-in process to run roaming profile scripts and such wasted several minutes a day. Multiply that by each employee and you have a significant amount of unproductive activity. That same situation with a portable PC that you just plug-in, though? There's opportunity to reclaim that wasted time because a Chromebit or Compute Stick already has the personal data of an employee, plus any locally stored user data. Best of all, it can be used anywhere there's a monitor, keyboard and mouse available; perfect in the example of my help desk shifts. Neither of these devices will work for a truly mobile workforce. Nor will a Chromebit be of value if you have legacy applications or don't rely on web-based services. Obviously, you have to choose the right tool for the task. If those tasks can securely be done on a low-end computing stick, maybe there's no need to renew those laptop leases the next time they come due.
New Delhi, May 17 (IANS): Fintech unicorn Zepz is laying off 420 employees, or 26 per cent of its workforce, and has started informing those impacted, the media reported. According to CNBC, the job cuts at the London-based money transfer service provider, which is a Western Union rival, will mainly impact Zepz's customer care and engineering teams. Zepz said that it was implementing "workforce optimisation" to account for roles that had been duplicated following its combination of Sendwave with WorldRemit, according to the report. Zepz and Sendwave are being used by more than 11 million users across 150 countries. The impacted employees will be offered help via counselling, coaching, career and CV development and job applications. The decision marked "an important and necessary step in transitioning from two vast, segmented teams to one dynamic organisation under Zepz, and laying ambitious foundations towards our long-term strategic direction as a portfolio business", said Mark Lenhard, Zepz CEO. "The remittance industry has maintained robust growth despite global economic conditions, and we've seen this audience take great measures to ensure their loved ones are supported as costs rise around the world," he added. Zepz raised cash in August 2021 at a $5 billion valuation, with $292 million of new funding from investors led by hedge fund Farallon Capital. Zepz was founded by British-Somalian entrepreneur Ismail Ahmed in 2010. He stepped down as CEO in 2018, and remains on the board as non-executive chairman.
package multiverse.mars.behaviors;

import java.util.*;

import multiverse.msgsys.*;
import multiverse.server.plugins.WorldManagerClient;
import multiverse.server.engine.Behavior;
import multiverse.server.engine.Engine;

public class ChatResponseBehavior extends Behavior implements MessageCallback {

    @Override
    public void initialize() {
        // Subscribe to chat (COM) messages so handleMessage() receives them.
        MessageTypeFilter filter = new MessageTypeFilter();
        filter.addType(WorldManagerClient.MSG_TYPE_COM);
        eventSub = Engine.getAgent().createSubscription(filter, this);
    }

    @Override
    public void activate() {
    }

    @Override
    public void deactivate() {
        if (eventSub != null) {
            Engine.getAgent().removeSubscription(eventSub);
            eventSub = null;
        }
    }

    @Override
    public void handleMessage(Message msg, int flags) {
        String response = null;
        if (msg instanceof WorldManagerClient.ComMessage) {
            WorldManagerClient.ComMessage comMsg = (WorldManagerClient.ComMessage) msg;
            response = responses.get(comMsg.getString());
        }
        else if (msg instanceof WorldManagerClient.TargetedComMessage) {
            WorldManagerClient.TargetedComMessage comMsg = (WorldManagerClient.TargetedComMessage) msg;
            response = responses.get(comMsg.getString());
        }
        if (response != null) {
            // Send the canned response as a chat message from this object.
            WorldManagerClient.sendChatMsg(obj.getOid(), 1, response);
        }
    }

    public void addChatResponse(String trigger, String response) {
        responses.put(trigger, response);
    }

    Map<String, String> responses = new HashMap<>();
    Long eventSub = null;

    private static final long serialVersionUID = 1L;
}
import { CssMatcherSymbol } from "../types";
import { CssAttributeMatcher } from "../css-attribute-matcher";

// Matcher kinds whose selections can be contained in a *= (contains) match.
const supersetSymbols = [
    CssMatcherSymbol.Prefix,
    CssMatcherSymbol.Suffix,
    CssMatcherSymbol.SubCode,
    CssMatcherSymbol.Occurrence,
    CssMatcherSymbol.Contains,
    CssMatcherSymbol.Equal,
];

export class CssContainsMatcher extends CssAttributeMatcher {

    readonly symbol: CssMatcherSymbol = CssMatcherSymbol.Contains;

    supersetOf(matcher: CssAttributeMatcher): boolean {
        // [attr*=v] is a superset of another matcher whose value contains v.
        if (supersetSymbols.indexOf(matcher.symbol) !== -1) {
            return matcher.value.indexOf(this.value) !== -1;
        }
        return false;
    }
}
Lord Shiva devotees walk miles while carrying the Kanwar (water pitchers hung along the two ends of a bamboo pole) on their shoulders during a Kanwar Yatra pilgrimage. Read on to know why the Ganga water plays a pivotal role and why it is offered to Lord Shiva. Lord Shiva devotees participate in the Kanwar Yatra, which begins on the first day of the Shravan month as per the Purnimant calendar and culminates with Shravan Shivratri (i.e., Shravan Chaturdashi, Krishna Paksha). Scroll down to learn why pilgrims walk miles to collect Ganga's sacred water for Lord Shiva. Kanwar Yatra Katha: The story of the Kanwar Yatra is associated with an event that took place in the Satyuga, when Lord Vishnu took the avatar of a tortoise (Kurma) to help the Devas (who represented the good) and the Asuras (who symbolised the evil) churn the cosmic ocean (Kshirasagara) to obtain the nectar of immortality (Amrit). The two sides started churning the Kshirasagara to obtain the Amrit, with the Mandara Parvat as the churning rod and Lord Shiva's snake, Vasuki, as the rope. However, Halahala, a lethal poison, emerged from the ocean first. It was so toxic that it caused massive destruction. Therefore, to prevent further devastation, the two parties approached Lord Shiva for help. Subsequently, Lord Shiva consumed the Halahala but arrested the venom in his throat; as a result, his neck turned blue. Interestingly, as per another version of this legend, Mata Parvati held Lord Shiva's neck to stop the Halahala from gliding down his body. Nonetheless, Lord Shiva endured agony and burns due to the toxicity of the Halahala. Therefore, the Devas offered water from the sacred Ganga to Lord Shiva to lessen his suffering. The Kanwar Yatra helps devotees relive the gesture of the Devas by thanking Lord Shiva, who endured agony to save the Universe.
Hence, the Kanwariyas walk miles, carry the Gangajal and offer it to Lord Shiva as a token of gratitude.
Seven flights from Muscat, Doha, Riyadh, Bahrain and Dubai in the Gulf under the Vande Bharat Mission will fly more than a thousand distressed and stranded Indians back to the country today, including to Jammu and Kashmir. The Dubai-Srinagar flight is expected to bring back at least 150 Indians, mostly the elderly, those facing medical emergencies, pregnant women and people who have lost their jobs. Priority on all these flights is being given to distressed blue-collar workers, medical emergency cases, pregnant women, stranded tourists and the elderly. Only asymptomatic passengers will be allowed to board.
For Aili Seghetti, 48, engaging in tongue-tied conversations and offering cheesy pick-up lines are all in a day’s work. As a consumer researcher and intimacy coach, her job is to help people with emotional and physical intimacy issues. Her clients include all kinds of people across the intimacy spectrum—from singles struggling with dating to couples who want to revive their physical intimacy and individuals with issues such as premature ejaculation, erectile dysfunction, inability to orgasm and kinky fantasies, among others. In a career spanning over 15 years in India, the Finland native—now based in Mumbai—deals even with ‘misogynist’ clients and guides them on how to approach love, sex or dating in a consensual and respectful way. According to the founder of The Intimacy Curator, an organisation promoting self-discovery through emotional and sexual wellbeing, her line of work is still nascent in India. With only sex therapists and sex educators and nothing in between, the field is either pathologised or strictly pedantic—and that’s where coaches like Seghetti come into the picture. “People are more open to a coach as they need help specific to their issue and don’t want to consult a doctor. People have started realising that intimate relations require more soft skills than toys. They are more open to learn skills and look out for services instead,” says Seghetti. The culture of ‘coaching’ is picking up, whether due to a lack or excess of role models or to the ignorance many people face over their individual or unique problems. A few decades ago, people would consult religious gurus or elders in the family about soft skills such as staying healthy or choosing a career or a partner, but this set-up is now breaking down, as there are more cultural differences between generations and we are evolving faster in the digital space.
Mentors, who align people’s goals in life, are also needed for the overall wellness of an individual, as the benefits of coaching interventions are huge. “When you neglect your personal needs, you may experience low self-esteem and lack of compassion which can lead to anxious states of anger, stress and exhaustion. It is a vicious circle and can give rise to health and mental problems like heart disease, gut issues, depression, etc,” says Mumbai-based occupational therapist and intervention coach Purvi Gandhi. The allied health professional helps individuals with therapeutic practices in day-to-day activities, and works towards physical, mental, developmental and emotional ailments. “A parallel process of participation helps in the growth process and can be highly satisfying for therapists as well as the family,” she adds. YouTuber Shwetabh Gangwar, who is an author and a motivational speaker with over 2 million subscribers, gives perspectives on problems related to career, relationships, parents, and existential, social, cultural and philosophical issues. “Re-evaluating your choices, decisions and patterns is important. We rarely think about the sources or origins of behaviour and decision making, and yet we proudly represent ourselves, ideologies and actions. Maybe in between those, there comes a curiosity to understand and gather knowledge about your own mind,” he says. It all began in the late-19th and early-20th century with the establishment of international mutual aid fellowships like Alcoholics Anonymous in 1935 and literary works by authors who took the self-help approach to help readers. Books like Samuel Smiles’ Self-Help: With Illustrations of Character, Conduct, and Perseverance, Dale Carnegie’s How to Win Friends and Influence People, Norman Vincent Peale’s The Power of Positive Thinking and the autobiography of Benjamin Franklin continued this momentum by exploring business, political, social, religious and sexual matters.
In 2017, over 15 million self-help books were sold in the US alone. In fact, books like How to Win Friends and Influence People by Dale Carnegie, Who Moved My Cheese by Dr Spencer Johnson, The Power of the Subconscious Mind by Joseph Murphy, and other titles by Og Mandino, Napoleon Hill and Mitch Albom dominated the genre. Indian authors such as Jay Shetty, Robin Sharma and Gaur Gopal Das, too, have propelled the self-help book market to new heights. Author Rhonda Byrne shot to fame with her 2006 documentary, The Secret. She inspired millions with her self-help book, The Secret, which has sold over 35 million copies to date and is believed to have blown up the self-help books market—worth over $14 billion by 2025. The book talks about ways to establish healthy relationships with better boundaries with others, emphasising upon the importance of practising gratitude in relationships, physical and mental well-being. “In recent years, the genre has experienced remarkable development. India is a high-aspirational market, and readers expect more from self-help books they read than just entertainment. The availability of these books improved dramatically with the advent of online retail, and self-help authors discovered a massive market hungry for new concepts,” says Rahul Dixit, sales director, HarperCollins India, which has published books like The Subtle Art of Not Giving a Fuck, Girl Wash your Face and Think Like a Monk. But why are self-help books so popular? While people look for inspiration to grow and excel, self-help books often come from successful role models and there is always scope to learn from their experiences and expertise. So, the demand has steadfastly grown over time. “Self-help is an ongoing trend with more people exploring self-discovery by way of books till today. Books assisting self-improvement, personality development, confidence building, and positive living have been continually popular. 
There are also self-help books on soft business topics such as time management, stress management, etc, which have drawn readers,” shares Nandan Jha, EVP- sales and product and business development, Penguin Random House India, which publishes about 15-20 such books under Penguin Ananda and Ebury Press imprint every year. US market research publisher Marketdata, which specialises in analysing niche service sectors, estimates that the US self-improvement market was worth $11.6 billion in 2019, and that it contracted by 10% to $10.5 billion in 2020. There was a forecast of 7.7% rebound in 2021, to $11.3 billion, and 6.0% average annual growth to $14.0 billion by 2025. Content consumption in the last few years has given rise to self-love and storytelling in the form of audio books, guides, and podcasts. Also, the popularity of smart gadgets like Amazon Echo, Apple HomePod, and Google Home has made it easier to find and listen to podcasts. From topics on better day-to-day habits to motivational books to keep going during tough times, platforms like Kuku FM or Pocket FM have subscribers like students, shopkeepers and salaried individuals. Also, the massive adoption of audio listeners in India has made such platforms focus on regional languages through which young users listen to audiobooks and shows in their mother tongue. The focus is exclusively on solving user’s problems from Tier II and Tier III cities in English, Gujarati, Hindi, Bengali, Marathi and Tamil, among other regional languages. “We have seen a month-on-month growth rate of over 80%. With 30,000 creators, 50% of the content on the platform is exclusive, covering a wide range of genres in fiction and non-fiction like self-help, education, entertainment, personal finance, spirituality, inspiration and history,” says Lal Chandu Bisu, CEO and co-founder of Kuku FM, a vernacular audio OTT platform that recently crossed 1 million active paid subscribers in India offering online subscription in small towns. 
While audio storytelling is imbibed in Indian culture and listening to books is a frictionless medium to consume content, Ashu Behl, SVP-content of Pocket FM, an audio streaming platform, finds a large set of users preferring to listen to native language content. “As India is the fastest-growing market for the audiobook space, the size of the audiobooks market in India will soon be equivalent to or more than the physical or e-book market,” he says. The platform is building a repository of audiobooks and collaborating with Indian and global publishers and authors for exclusive audio content, also available in eight languages—Telugu, Malayalam, Hindi, Tamil, English, Kannada, Bangla and Marathi. “We have partnered with leading Indian publishers like Manjul Publishing, Prabhat Prakashan, etc. With over 15 million monthly active listeners (MAL) on the Pocket ecosystem, I think we have a highly aspirational demographic of young users who are seeking knowledge to upskill and improve in all aspects of life. Self-help content is immensely popular especially in wealth creation, health, personality development and career,” adds Behl, whose self-help audiobook category contributes to more than two-thirds of audiobook sales. Pocket FM also has Ankur Warikoo’s Do Epic Shit exclusively in audiobooks. Similarly, mentoring platform MentorKart invites various industry experts to talk on career growth, entrepreneurship, jobs and more to motivate, and provide skills and emotional support to youth—be it students, young professionals, or startup entrepreneurs. 
“The idea is to unlock the potential of students, young professionals and early-stage entrepreneurs and help them achieve their professional goals, overall career development, and industry placements by providing them with tips, lessons, mentoring from mentors and coaches from India and around the world,” says Ashish Khare, founder of MentorKart, who has partnered with more than 50 universities across the country and intends to reach over 200 campuses in the next few quarters. The dependence on health, wellness and lifestyle coaches has continued and has seen a boom as healthcare takes centre stage in the post-pandemic workspace. Many big names in the corporate sector have been inviting health and lifestyle coaches regularly to have sessions with their employees. Luke Coutinho, holistic lifestyle coach – integrative medicine and founder of YouCare-All about YOU, a holistic health store, has in the past been to corporates like Uber India and Dr Reddy’s Laboratories to give sessions to their employees. “In a fast-paced world where burnout, digital fatigue, job insecurity, financial crisis, personal loss, and the lack of social connection are challenges, investing in the health and well-being of employees is becoming necessary,” he says. To ease the unpredictable working climate, Wipro, for instance, is one corporate that made plans in consultation with coaches throughout the pandemic to manage work from home and now a hybrid work culture. Sumit Taneja, vice president, global rewards and analytics, Wipro Limited, says they now plan to expand their wellness interventions and programmes globally. Some of their programmes are centred around managing change, coping with the fear of exposure to Covid-19 if back in office, adapting to change and coping with anxiety, and more. As for Modicare Limited, The Art of Living is one of their key programmes as they work towards creating a culture driven by holistic wellness, shares its MD and CEO Samir Modi.
“Workplace ergonomics, sleep hygiene to managing anxiety and effective financial planning—we have made employee well-being an integral part of our company culture,” he adds. Coaches are also helping bridge the social gap. Payal Kothari, gut health coach and founder of GutAvatar and INUEN, says that life, well-being and health coaches are also being called in by corporates and organisations to encourage employee engagement, interaction and a stable work culture, which were lost during the pandemic. “There is no way the online medium can take over what physical engagement programmes can do. We have received feedback from the employees and the senior management confessing how a better work environment is required for productivity and mental health,” she says. Puneet Gupta, CEO and founder, Clensta International, agrees with Kothari that juggling work and personal life can sometimes get overwhelming when working remotely; his company has therefore set up an ‘Employee Wellness Committee’ that works towards ensuring organisational well-being and nurturing a culture where everyone feels comfortable coming forward with their issues and concerns. The committee also regularly organises sessions with industry experts who help employees with stress management and share tips for achieving work-life balance. Another issue raised by employees when coached was the difficulty of maintaining a balance between home, personal and work life during the pandemic, according to Harshit Malik, enrichment guide, wisdom coach and entrepreneur. He says he has seen a similar pattern across many of his employee coaching sessions. “Usually, our ego does not allow us to talk about our shortcomings or issues or failures with our friends and partners. A coach is unknown to an employee and has a neutral approach, so they can easily open up,” explains Malik.
english
import unittest

from app.models import Pitch
from app import db


class PitchTest(unittest.TestCase):
    """Test class to test the behaviour of the Pitch model"""

    def setUp(self):
        """Set up method that will run before every test"""
        self.new_pitch = Pitch(0, 'name', 'description')

    def test_instance(self):
        # isinstance() needs the class to check against as its second argument
        self.assertTrue(isinstance(self.new_pitch, Pitch))

    def tearDown(self):
        # The original deleted Review and User rows, but neither model is
        # imported here; clean up the model under test instead
        Pitch.query.delete()
python
package minimum_size_subarray_sum

// minSubArrayLen returns the length of the shortest contiguous
// subarray of nums whose sum is at least s, or 0 if none exists.
func minSubArrayLen(s int, nums []int) int {
	var left, right, res int
	min := len(nums) + 1 // sentinel: longer than any real window
	for right < len(nums) {
		// Grow the window to the right until its sum reaches s.
		for ; res < s && right < len(nums); right++ {
			res += nums[right]
		}
		// Shrink from the left while the sum is still large enough,
		// recording the window length before each shrink.
		for ; res >= s; left++ {
			res -= nums[left]
			if right-left < min {
				min = right - left
			}
		}
	}
	if min == len(nums)+1 {
		return 0
	}
	return min
}
go
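The Go record above is the classic sliding-window solution to the minimum-size-subarray-sum problem: grow the window until the running sum reaches s, then shrink it from the left while the sum stays at or above s. A minimal Python sketch of the same two-pointer idea (the function name is mine, not from the source):

```python
def min_sub_array_len(s, nums):
    """Length of the shortest contiguous subarray with sum >= s (0 if none)."""
    left, res = 0, 0
    best = len(nums) + 1  # sentinel: longer than any real window
    for right, value in enumerate(nums):
        res += value  # grow the window to the right
        while res >= s:  # shrink from the left while the sum is still enough
            best = min(best, right - left + 1)
            res -= nums[left]
            left += 1
    return 0 if best == len(nums) + 1 else best
```

Because each index enters and leaves the window at most once, the whole scan is O(n), the same property the Go version relies on.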
{
  "idArt": "664",
  "series": "Revue des Nouvelles Technologies de l'Information",
  "booktitle": "EGC",
  "year": "2008",
  "place": "Sophia Antipolis",
  "location": { "lat": 43.6163539, "lon": 7.0552218 },
  "title": "Une nouvelle approche du boosting face aux données bruitées",
  "abstract": "Reducing generalization error is one of the main motivations of machine-learning research. A large body of work has therefore addressed classifier-aggregation methods, which use voting techniques to improve on the performance of a single classifier. Among these aggregation methods, boosting is arguably the most effective, thanks to its adaptive update of the example distribution, which exponentially increases the weight of misclassified examples. On heavily noisy data, however, this method is prone to overfitting and its convergence speed suffers. In this paper, we propose a new approach based on modifications to the example-weight update and to the apparent-error computation performed inside the classic AdaBoost algorithm. An experimental study shows the value of this new approach, called the Hybrid Approach, against AdaBoost and against BrownBoost, a version of AdaBoost adapted to noisy data.",
  "authors": ["<NAME>", "<NAME>"],
  "pdf1page": "http://editions-rnti.fr/render_pdf.php?p1&p=1000623",
  "pdfarticle": "http://editions-rnti.fr/render_pdf.php?p=1000623"
}
json
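The abstract in the record above hinges on AdaBoost's adaptive distribution update, which exponentially increases the weight of misclassified examples, the very step the paper proposes to modify for noisy data. As a hedged illustration (this is the textbook AdaBoost update, not the paper's Hybrid Approach), one round of reweighting for a weak learner with weighted error eps and vote weight alpha = ½·ln((1−eps)/eps) can be sketched as:

```python
import math

def adaboost_reweight(weights, y_true, y_pred):
    """One round of AdaBoost's exponential weight update (illustrative sketch)."""
    # Weighted error of the current weak learner under the distribution
    eps = sum(w for w, t, p in zip(weights, y_true, y_pred) if t != p)
    eps = min(max(eps, 1e-12), 1 - 1e-12)  # guard against division by zero
    alpha = 0.5 * math.log((1 - eps) / eps)
    # Misclassified examples are scaled up by e^alpha, correct ones down by e^-alpha
    new_w = [w * math.exp(alpha if t != p else -alpha)
             for w, t, p in zip(weights, y_true, y_pred)]
    z = sum(new_w)  # normalise back to a probability distribution
    return [w / z for w in new_w]
```

Starting from a uniform distribution of [0.25]×4 with one example misclassified (eps = 0.25), that example's weight rises to 0.5 after normalisation; this exponential emphasis on hard examples is exactly what the paper argues becomes harmful when the "hard" examples are label noise.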
---
category: truthdig-podcast
image:
  type: ''
  url: http://feedproxy.google.com/~r/TruthdigPodcast/~5/JJa7QZxDp80/TruthdigRadio040413.mp3
layout: podcast
link: http://feedproxy.google.com/~r/TruthdigPodcast/~3/ntXsuKxw_co/
podcast:
  developer-note: ''
  length: ''
  note: ''
  size: 53178678
  type: audio/mpeg
  url: http://feedproxy.google.com/~r/TruthdigPodcast/~5/JJa7QZxDp80/TruthdigRadio040413.mp3
show:
  category:
    - News & Politics
  image:
    height: 0
    url: http://www.truthdig.com/images/itunespodcastgraphic_large.png
    width: 0
  owner:
    email: <EMAIL>
    name: <NAME>
  published:
    string: '2015-02-28T01:25:25+00:00'
    timestamp: '1427995295'
  subtitle: Interviews and spoken articles from a provocative Web magazine.
  title: Truthdig Podcast
tags:
  - News & Politics
  - truthdig-podcast
title: How We Talk About North Korea
---

This week on Truthdig Radio in association with KPFK: How the media cover—and promote—war, <NAME> defends the <!--more--> messenger, AP disappears ‘illegal’ immigrants, and America’s office slaves, otherwise known as interns, rise up.
html
{"symbol": "ARY","address": "0xa5F8fC0921880Cb7342368BD128eb8050442B1a1","overview":{"en": ""},"email": "<EMAIL>","website": "https://blockarray.com/","state": "NORMAL","links": {"blog": "","twitter": "https://twitter.com/blockarraygroup","telegram": "https://t.me/Block_Array","github": "https://github.com/blockarraygroup"}}
json
# https://docs.python.org/3.5/library/tarfile.html#tarfile-objects
import tarfile
import os

src = "/imatge/lpanagiotis/projects/saliency/epic-kitchens/object_detection_images"
dst = "/imatge/lpanagiotis/work/Epic-Kitchens/object_detection_images"

# Folder structure corresponds to: train/test -> person -> videos
for x in ["train", "test"]:
    print("Now extracting frames for {}".format(x))
    root_path = os.path.join(src, x)
    people = os.listdir(root_path)
    for person in people:
        person_path = os.path.join(src, x, person)
        videos = os.listdir(person_path)
        for video in videos:
            # Define our source tar file
            source_tar_file = os.path.join(person_path, video)
            # Define our destination directory
            video_dir = video.split(".")[0]  # remove the tar extension
            destination_dir = os.path.join(dst, x, person, video_dir)
            # makedirs (rather than mkdir) also creates any missing
            # intermediate train/test and person directories
            os.makedirs(destination_dir, exist_ok=True)
            # Extract the frames from the tar file to the destination folder;
            # the context manager closes the archive after extraction
            with tarfile.open(name=source_tar_file) as video_file:
                video_file.extractall(path=destination_dir)
            print("Video {} extracted to destination folder".format(video))
python
<html>
<head>
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
body {
  font-family: "Lato", sans-serif;
}
.sidenav {
  height: 100%;
  width: 160px;
  position: fixed;
  z-index: 1;
  top: 0;
  left: 0;
  background-color: #111;
  overflow-x: hidden;
  padding-top: 20px;
}
.sidenav a {
  padding: 6px 8px 6px 16px;
  text-decoration: none;
  font-size: 25px;
  color: #818181;
  display: block;
}
.sidenav a:hover {
  color: #000000;
  background-color: #f1f1f1;
}
div#clicked a {
  color: #000000;
  background-color: #f1f1f1;
}
div#heading {
  color: #f1f1f1;
  background-color: #111;
  height: 50px;
  width: 100%;
  text-align: center;
  font-size: 35px;
  margin-top: 10px;
  padding-top: 10px;
}
.main {
  margin-left: 160px; /* Same as the width of the sidenav */
  font-size: 28px; /* Increased text to enable scrolling */
  padding: 0px 10px;
}
@media screen and (max-height: 450px) {
  .sidenav {padding-top: 15px;}
  .sidenav a {font-size: 18px;}
}
</style>
</head>
<body>
<div class="sidenav">
  <div id="clicked">
    <a href="index.html">Home</a>
  </div>
  <a href="HistoricalPlaces.html">Historical Places</a>
  <a href="TheNoblestPeople.html">The Noblest People</a>
  <a href="educationalCenter.html">Educational Centers</a>
  <a href="demography.html">Demography</a>
  <a href="climate.html">Climate</a>
  <a href="agriculture.html">Agriculture</a>
  <a href="malls.html">Malls</a>
  <a href="resturants.html">Restaurants</a>
  <a href="contact.html">Contact</a>
</div>
<div class="main">
  <div id="heading">WORLD OF PURNEA</div>
  <p>Purnia has an area of 3,202 square km.<br>
  It is a level, depressed tract of country, consisting for the most part of a rich, loamy soil of alluvial formation.<br>
  It is traversed by several rivers flowing from the Himalayas, which afford great advantages of irrigation and water-carriage.</p>
  <p><u>LOCAL BODIES</u></p>
  <p>Blocks 14<br>
  Panchayats 246<br>
  Villages 1226</p>
  <p>Area: 3229 Sq. Km.
  Population: 32,64,619 Language: Hindi Villages: 1226 Male: 16,99,370 Female: 15,65,249</p>
  <img src="./image/MAP OF PURNEA.jpg" height="150" width="150" /><br><br>
  <p><b>HELPLINE NUMBER</b></p>
  <ul>
    <li>Child Helpline : 1098</li>
    <li>Women Helpline : 1091</li>
    <li>Crime Stopper : 1090</li>
  </ul>
  <p>Helpline District Control Room (Help Line) Contact numbers</p>
  <ul>
    <li>06454-241555</li>
    <li>06454-24300</li>
    <li>06454-241466</li>
    <li>SMS Number : 900640244</li>
  </ul>
</div>
</body>
</html>
html
// Source repo: niolikon/alexandria-Inventory
package org.niolikon.alexandria.inventory.catalog.commons.dto;

import java.io.Serializable;
import java.math.BigDecimal;

import javax.validation.constraints.NotEmpty;
import javax.validation.constraints.NotNull;

import lombok.Data;
import lombok.NoArgsConstructor;

@NoArgsConstructor
@Data
public class ProductRequest implements Serializable {

    /** Serial Version ID */
    private static final long serialVersionUID = 7413602318541965573L;

    @NotEmpty
    private String name;

    @NotEmpty
    private String description;

    @NotEmpty
    private String label;

    @NotEmpty
    private String image;

    // @NotEmpty is only defined for CharSequence, Collection, Map and arrays;
    // a BigDecimal field needs @NotNull instead
    @NotNull
    private BigDecimal price;

    private Boolean featured;
}
java
<filename>src/drive/display.rs /* * Copyright 2018 <NAME> <<EMAIL>> * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ use crate::display::*; use crate::drive::course::Segment; use crate::drive::course::{Course, MapWrapper}; use crate::drive::gps; use crate::drive::imu; use crate::drive::obdii; use crate::drive::prepare; use crate::drive::read_track::Coord; use crate::drive::temp; use crate::drive::threading::Threading; use crate::drive::threading::ThreadingRef; use dissolve::strip_html_tags; use gtk::prelude::*; use gtk::ResponseType; use plotters::prelude::*; use plotters_cairo::CairoBackend; use std::cell::RefCell; use std::rc::Rc; use std::sync::mpsc; use std::thread; use std::time::Duration; pub fn button_press_event(display: DisplayRef, track_sel_info: prepare::TrackSelectionRef) { let builder = display.builder.clone(); let stack = builder .get_object::<gtk::Stack>("MainStack") .expect("Can't find MainStack in ui file."); stack.set_visible_child_name("DrivePage"); let drive_page = builder .get_object::<gtk::Notebook>("DriveNotebook") .expect("Can't find DriveNotebook in ui file."); let map_frame = builder .get_object::<gtk::Frame>("DriveMapFrame") .expect("Can't find DriveMapFrame in ui file."); map_frame.add(&track_sel_info.map_widget); let mut champlain_view = champlain::gtk_embed::get_view(track_sel_info.map_widget.clone()); let track_points = track_sel_info.track_points.take(); let (location_tx, location_rx) = mpsc::channel::<(f64, f64, i32, Option<bool>)>(); let 
(elapsed_tx, elapsed_rx) = mpsc::channel::<Duration>(); let (times_tx, times_rx) = mpsc::channel::<(Duration, Duration, Duration)>(); let (time_diff_tx, time_diff_rx) = mpsc::channel::<(bool, Duration)>(); let (obdii_tx, obdii_rx) = mpsc::channel::<obdii::OBDIIData>(); let (imu_tx, imu_rx) = mpsc::channel::<(f64, f64, Option<f64>, Option<f64>)>(); let (imu_page_tx, imu_page_rx) = mpsc::channel::<(f64, f64, Option<f64>, Option<f64>)>(); let (temp_tx, temp_rx) = mpsc::channel::<Vec<f64>>(); let thread_info = Threading::new(); let window: gtk::ApplicationWindow = builder .get_object("MainPage") .expect("Couldn't find MainPage in ui file."); let thread_info_weak = ThreadingRef::downgrade(&thread_info); let _handler_gpsd = thread::spawn(move || { let thread_info = upgrade_weak!(thread_info_weak); let mut segments = Vec::new(); for points in track_points { segments.push(Segment::new( Coord::new( (&points).first().unwrap().lat, (&points).first().unwrap().lon, (&points).first().unwrap().head, ), Coord::new( (&points).last().unwrap().lat, (&points).last().unwrap().lon, (&points).last().unwrap().head, ), )); } let mut course_info = Course::new(segments); gps::gpsd_thread( thread_info, elapsed_tx, times_tx, time_diff_tx, location_tx, &mut course_info, ); }); let mut track_name = track_sel_info.track_file.borrow().clone(); let thread_info_weak = ThreadingRef::downgrade(&thread_info); let _handler_obdii = thread::spawn(move || { let thread_info = upgrade_weak!(thread_info_weak); obdii::obdii_thread(thread_info, obdii_tx, &mut track_name).unwrap(); }); let mut track_name = track_sel_info.track_file.borrow().clone(); let thread_info_weak = ThreadingRef::downgrade(&thread_info); let _handler_imu = thread::spawn(move || { let thread_info = upgrade_weak!(thread_info_weak); imu::imu_thread(thread_info, imu_tx, imu_page_tx, &mut track_name); }); let mut track_name = track_sel_info.track_file.borrow().clone(); let thread_info_weak = ThreadingRef::downgrade(&thread_info); let 
_handler_imu = thread::spawn(move || { let thread_info = upgrade_weak!(thread_info_weak); temp::temp_thread(thread_info, temp_tx, &mut track_name); }); let thread_info_weak = ThreadingRef::downgrade(&thread_info); let display_weak = DisplayRef::downgrade(&display); glib::timeout_add_local(10, move || { let thread_info = upgrade_weak!(thread_info_weak, glib::source::Continue(false)); let display = upgrade_weak!(display_weak, glib::source::Continue(false)); let builder = display.builder.clone(); if thread_info.close.lock().unwrap().get() { return glib::source::Continue(false); } thread_info.time_update_idle_thread(&elapsed_rx, &times_rx, &time_diff_rx, builder) }); let thread_info_weak = ThreadingRef::downgrade(&thread_info); let display_weak = DisplayRef::downgrade(&display); let obdii_data = Rc::new(RefCell::new(obdii::OBDIIGraphData::new())); thread_info.set_cairo_graphs(&builder, &obdii_data); glib::timeout_add_local(10, move || { let thread_info = upgrade_weak!(thread_info_weak, glib::source::Continue(false)); let display = upgrade_weak!(display_weak, glib::source::Continue(false)); let builder = display.builder.clone(); thread_info.obdii_update_idle_thread(&obdii_rx, builder, &obdii_data) }); let thread_info_weak = ThreadingRef::downgrade(&thread_info); let display_weak = DisplayRef::downgrade(&display); glib::timeout_add_local(10, move || { let thread_info = upgrade_weak!(thread_info_weak, glib::source::Continue(false)); let display = upgrade_weak!(display_weak, glib::source::Continue(false)); let builder = display.builder.clone(); if thread_info.close.lock().unwrap().get() { return glib::source::Continue(false); } thread_info.temp_update_idle_thread(&temp_rx, builder) }); let imu_area: gtk::DrawingArea = builder .get_object("AccelDrawingArea") .expect("Couldn't find AccelDrawingArea in ui file."); let thread_info_weak = ThreadingRef::downgrade(&thread_info); let display_weak = DisplayRef::downgrade(&display); imu_area.connect_draw(move |me, ctx| { let 
thread_info = upgrade_weak!(thread_info_weak, glib::signal::Inhibit(true)); let display = upgrade_weak!(display_weak, Inhibit(true)); let builder = display.builder.clone(); thread_info.imu_draw_idle_thread(&imu_rx, me, ctx, builder) }); let imu_page_accel_area: gtk::DrawingArea = builder .get_object("IMUPageAcellDraw") .expect("Couldn't find IMUPageAcellDraw in ui file."); let thread_info_weak = ThreadingRef::downgrade(&thread_info); let display_weak = DisplayRef::downgrade(&display); imu_page_accel_area.connect_draw(move |me, ctx| { let thread_info = upgrade_weak!(thread_info_weak, glib::signal::Inhibit(true)); let display = upgrade_weak!(display_weak, Inhibit(true)); let builder = display.builder.clone(); thread_info.imu_draw_idle_thread(&imu_page_rx, me, ctx, builder) }); glib::timeout_add_local(imu::IMU_SAMPLE_FREQ as u32, move || { imu_area.queue_draw(); imu_page_accel_area.queue_draw(); glib::source::Continue(true) }); let close_button = builder .get_object::<gtk::Button>("DriveOptionsPopOverClose") .expect("Can't find DriveOptionsPopOverClose in ui file."); let thread_info_weak = ThreadingRef::downgrade(&thread_info); close_button.connect_clicked(move |_| { let thread_info = upgrade_weak!(thread_info_weak); thread_info.close.lock().unwrap().set(true); stack.set_visible_child_name("SplashImage"); }); let save_button = builder .get_object::<gtk::Button>("DriveOptionsPopOverSave") .expect("Can't find DriveOptionsPopOverClose in ui file."); let thread_info_weak = ThreadingRef::downgrade(&thread_info); save_button.connect_clicked(move |_| { let thread_info = upgrade_weak!(thread_info_weak); let file_chooser = gtk::FileChooserNative::new( Some("Save times as"), Some(&window), gtk::FileChooserAction::Save, Some("Save"), Some("Close"), ); let response = file_chooser.run(); if response == ResponseType::Accept { if let Some(filepath) = file_chooser.get_filename() { let mut time_file = thread_info.time_file.write().unwrap(); *time_file = filepath; 
thread_info.serialise.lock().unwrap().set(true); } } }); let calibrate_button = display .builder .get_object::<gtk::Button>("CalibrateOptionsPopOverSave") .expect("Can't find CalibrateOptionsPopOverSave in ui file."); let thread_info_weak = ThreadingRef::downgrade(&thread_info); calibrate_button.connect_clicked(move |_| { let thread_info = upgrade_weak!(thread_info_weak); thread_info.calibrate.lock().unwrap().set(true); }); let mut layer = champlain::marker_layer::ChamplainMarkerLayer::new(); layer.borrow_mut_actor().show(); champlain_view.add_layer(layer.borrow_mut_layer()); let point_colour = champlain::clutter_colour::ClutterColor::new(100, 200, 255, 255); let mut point = champlain::point::ChamplainPoint::new_full(12.0, point_colour); layer.add_marker(point.borrow_mut_marker()); let mut pos_path_layer = champlain::path_layer::ChamplainPathLayer::new(); champlain_view.add_layer(pos_path_layer.borrow_mut_layer()); let colour = champlain::clutter_colour::ClutterColor::new(204, 60, 0, 255); pos_path_layer.set_stroke_colour(colour); pos_path_layer.set_visible(true); let mut neg_path_layer = champlain::path_layer::ChamplainPathLayer::new(); champlain_view.add_layer(neg_path_layer.borrow_mut_layer()); let colour = champlain::clutter_colour::ClutterColor::new(0, 153, 76, 255); neg_path_layer.set_stroke_colour(colour); neg_path_layer.set_visible(true); layer.show_all_markers(); let mut map_wrapper = MapWrapper::new(pos_path_layer, neg_path_layer, point); #[allow(clippy::redundant_clone)] let thread_info_clone = thread_info.clone(); glib::timeout_add_local(10, move || { let thread_info = ThreadingRef::downgrade(&thread_info_clone) .upgrade() .unwrap(); if thread_info.close.lock().unwrap().get() { layer.remove_all(); return glib::source::Continue(false); } thread_info.map_update_idle_thread(&location_rx, &mut map_wrapper) }); drive_page.show_all(); } impl Threading { pub fn set_cairo_graphs( &self, builder: &gtk::Builder, obdii_data: &Rc<RefCell<obdii::OBDIIGraphData>>, ) 
{ let chart = builder .get_object::<gtk::DrawingArea>("OBDIIChartOne") .expect("Can't find OBDIIChartOne in ui file."); let obdii_data_cloned = obdii_data.clone(); chart.connect_draw(move |me, cr| { let width = me.get_allocated_width() as f64 * 0.07; let height = me.get_allocated_width() as f64 * 0.07; let root = CairoBackend::new(cr, (500, 500)) .unwrap() .into_drawing_area(); let mut chart = ChartBuilder::on(&root) .margin(10) .caption("RPM", ("sans-serif", 30).into_font()) .x_label_area_size(width as u32) .y_label_area_size(height as u32) .build_cartesian_2d(0..100 as u32, 0f64..15000f64) .unwrap(); chart.configure_mesh().draw().unwrap(); chart .draw_series(AreaSeries::new( obdii_data_cloned .borrow_mut() .rpm .iter() .enumerate() .map(|(x, y)| (x as u32, *y)), 0.0, &BLUE.mix(0.2), )) .unwrap(); Inhibit(true) }); let chart = builder .get_object::<gtk::DrawingArea>("OBDIIChartTwo") .expect("Can't find OBDIIChartTwo in ui file."); let obdii_data_cloned = obdii_data.clone(); chart.connect_draw(move |me, cr| { let width = me.get_allocated_width() as f64 * 0.07; let height = me.get_allocated_width() as f64 * 0.07; let root = CairoBackend::new(cr, (500, 500)) .unwrap() .into_drawing_area(); let mut chart = ChartBuilder::on(&root) .margin(10) .caption("MAF (%)", ("sans-serif", 30).into_font()) .x_label_area_size(width as u32) .y_label_area_size(height as u32) .build_cartesian_2d(0..100 as u32, 0f64..100f64) .unwrap(); chart.configure_mesh().draw().unwrap(); chart .draw_series(AreaSeries::new( obdii_data_cloned .borrow_mut() .maf .iter() .enumerate() .map(|(x, y)| (x as u32, *y)), 0.0, &RED.mix(0.2), )) .unwrap(); Inhibit(true) }); let chart = builder .get_object::<gtk::DrawingArea>("OBDIIChartThree") .expect("Can't find OBDIIChartThree in ui file."); let obdii_data_cloned = obdii_data.clone(); chart.connect_draw(move |me, cr| { let width = me.get_allocated_width() as f64 * 0.07; let height = me.get_allocated_width() as f64 * 0.07; let root = CairoBackend::new(cr, (500, 
500)) .unwrap() .into_drawing_area(); let mut chart = ChartBuilder::on(&root) .margin(10) .caption("Throtle (%)", ("sans-serif", 30).into_font()) .x_label_area_size(width as u32) .y_label_area_size(height as u32) .build_cartesian_2d(0..100 as u32, 0f64..100f64) .unwrap(); chart.configure_mesh().draw().unwrap(); chart .draw_series(AreaSeries::new( obdii_data_cloned .borrow_mut() .throttle .iter() .enumerate() .map(|(x, y)| (x as u32, *y)), 0.0, &GREEN.mix(0.2), )) .unwrap(); Inhibit(true) }); let chart = builder .get_object::<gtk::DrawingArea>("OBDIIChartFour") .expect("Can't find OBDIIChartFour in ui file."); let obdii_data_cloned = obdii_data.clone(); chart.connect_draw(move |me, cr| { let width = me.get_allocated_width() as f64 * 0.07; let height = me.get_allocated_width() as f64 * 0.07; let root = CairoBackend::new(cr, (500, 500)) .unwrap() .into_drawing_area(); let mut chart = ChartBuilder::on(&root) .margin(10) .caption("Load (%)", ("sans-serif", 30).into_font()) .x_label_area_size(width as u32) .y_label_area_size(height as u32) .build_cartesian_2d(0..100 as u32, 0f64..100f64) .unwrap(); chart.configure_mesh().draw().unwrap(); chart .draw_series(AreaSeries::new( obdii_data_cloned .borrow_mut() .load .iter() .enumerate() .map(|(x, y)| (x as u32, *y)), 0.0, &YELLOW.mix(0.2), )) .unwrap(); Inhibit(true) }); // Setup the fonts let best_diff = builder .get_object::<gtk::Label>("BestDiff") .expect("Can't find BestDiff in ui file."); best_diff.connect_size_allocate({ move |me, allocation| { let markup = format!( "<span font_desc=\"{}\">{}</span>", allocation.width / 8, strip_html_tags(&me.get_text()).first().unwrap() ); me.set_markup(&markup); } }); let current_time = builder .get_object::<gtk::Label>("CurrentTime") .expect("Can't find CurrentTime in ui file."); current_time.connect_size_allocate({ move |me, allocation| { let markup = format!( "<span font_desc=\"{}\">{}</span>", allocation.width / 7, strip_html_tags(&me.get_text()).first().unwrap() ); 
me.set_markup(&markup); } }); let last_time = builder .get_object::<gtk::Label>("LastTime") .expect("Can't find LastTime in ui file."); last_time.connect_size_allocate({ move |me, allocation| { let markup = format!( "<span font_desc=\"{}\" foreground=\"#c4c4a0a00000\">{}</span>", allocation.width / 8, strip_html_tags(&me.get_text()).first().unwrap() ); me.set_markup(&markup); } }); let best_time = builder .get_object::<gtk::Label>("BestTime") .expect("Can't find BestTime in ui file."); best_time.connect_size_allocate({ move |me, allocation| { let markup = format!( "<span font_desc=\"{}\" foreground=\"#0b7dac5e165c\">{}</span>", allocation.width / 8, strip_html_tags(&me.get_text()).first().unwrap() ); me.set_markup(&markup); } }); let worst_time = builder .get_object::<gtk::Label>("WorstTime") .expect("Can't find WorstTime in ui file."); worst_time.connect_size_allocate({ move |me, allocation| { let markup = format!( "<span font_desc=\"{}\" foreground=\"#a4a400000000\">{}</span>", allocation.width / 8, strip_html_tags(&me.get_text()).first().unwrap() ); me.set_markup(&markup); } }); } }
rust
<reponame>lidong3527/-PC-PPT- @charset "utf-8"; /* CSS Document */ html,body,div,span,h1,h2,h3,h4,h5,h6,p,pre,a,code,em,img,small,strong,sub,sup,u,i,center,dl,dt,dd,ol,ul,li,fieldset,form,label{ margin:0;padding:0;border:0;outline:0;font-size:100%;vertical-align:baseline;background:transparent } button, a{ outline:none; } ul, li{ list-style:none; } .pay_select_center{ text-align:center; margin-top:30px; margin-bottom:20px } .pay_select_center span{ padding: 2px 20px !important; border-radius: 4px; display:inline-block; width:60px; } .pay_selected { border: #01b400 1px solid; color: #01b400; margin: 0 10px; } .pay_makeup{ width: 400px; font-size: 18px; margin-top: 50px; background: #eafbff; padding: 20px 50px; border-radius: 8px; float:left; margin-left:80px; } .pay_makeup_ul{ width: 400px; font-size: 18px; background: #eafbff; padding: 20px 50px; border-radius: 8px; margin:50px auto 0 auto } .pay_makeup_tile{ color: #333; font-size: 24px; text-align: center; padding: 15px 0; } .pay_makeup ul{ margin:0; padding:0 } .pay_makeup ul li{ list-style:none; padding:15px 0 0 0; line-height:30px; } .pay_makeup ul li span:first-child{ width:60px; display:inline-block; } .list_phone{ width:80px !important; display: inline-block; } .pay_input{ box-sizing: border-box; color: #222; font-size: 14px; width: 300px; height: 40px; border: 1px solid #cfcfcf; border-radius: 3px; background-color: #fff; margin: 0; padding: 10px; outline: 0; } .pay_input1{ background: none; border: none; color: #f00; font-size: 24px; display: inline-block; width: 55px; height: 54px; box-sizing: border-box; line-height:48px; text-decoration: none; vertical-align: middle; text-align: center; position: relative; top: -3px; } .makeup_btn{ background: #2d9afa; color: #fff; display: inline-block; width: 200px; height: 54px; box-sizing: border-box; line-height: 54px; text-decoration: none; vertical-align: middle; text-align: center; border-radius: 5px; border: none; font-size: 18px; cursor: pointer; } 
.makeup_btn:hover{ background:#4babff } .makeup_center{ text-align:center } .makeup_money{ color:#2d9afa; text-decoration: none; } .makeup_money:hover{ color:#4babff; text-decoration:underline } .makeup_top{ margin-top:30px; } .makeup_money_li{ border-bottom: 1px solid #ddd; padding: 10px 0; color:#333 } .makeup_money_li:first-child{ margin-top:20px } #pay_text{ font-size: 14px; text-align: center; display: block; margin-top: 20px; } .pay_container{ text-align:center; padding: 100px 0; } .pay_container img{ text-align:center } .pay_ok{ text-align: center; color: #03ba03; font-size: 28px; line-height: 150px; width: 450px; height: 300px; background: #eafbff; margin: 200px auto 0 auto; border-radius: 8px; } .pay_ok_in{ color: #f60; font-size: 17px; } .pay_info{ color: #f60; font-size: 16px; margin-bottom: 20px; } .pay_bg{ background:url(../img/pay_up_b.png) no-repeat; background-size: cover; } .pay_bg h2{ color: #fff; text-align: center; font-size: 32px; padding-top: 40px; font-family:'Microsoft YaHei' } .pay_bg_man{ background:url(../img/pay_up_man.png) no-repeat; width:414px; height:386px; float:left; margin-top:110px } .pay_cont{ width:1000px; margin:0 auto; }
css
// Source repo: VividcodeIO/lodash4cookbook
const escapeRegExp = require('lodash/escapeRegExp');

describe('escapeRegExp', () => {
  it('should escape special regular expression characters', () => {
    expect(escapeRegExp('[hello]')).toEqual('\\[hello\\]');
  });
});
javascript
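The test record above exercises lodash's escapeRegExp, which backslash-escapes regex metacharacters so a literal string can be embedded safely in a pattern. Python's standard library provides the analogous re.escape; a rough cross-language sketch:

```python
import re

# re.escape plays the same role as lodash's escapeRegExp: it escapes
# regex metacharacters so the string matches itself literally
escaped = re.escape('[hello]')
assert escaped == r'\[hello\]'

# The escaped form can then be embedded inside a larger pattern
assert re.search(escaped, 'say [hello] world') is not None
```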
<reponame>JacXuan/uavstack<gh_stars>1-10 /*- * << * UAVStack * == * Copyright (C) 2016 - 2017 UAVStack * == * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. * >> */ package com.creditease.uav.apphub.sso; import java.security.MessageDigest; import java.security.NoSuchAlgorithmException; import java.util.ArrayList; import java.util.HashMap; import java.util.List; import java.util.Map; import java.util.concurrent.TimeUnit; import javax.servlet.http.HttpServletRequest; import com.creditease.agent.helpers.JSONHelper; import com.creditease.agent.log.SystemLogger; import com.creditease.agent.log.api.ISystemLogger; import com.creditease.uav.cache.api.CacheManager; import com.creditease.uav.cache.api.CacheManagerFactory; public abstract class GUISSOClient { protected static final List<Map<String, String>> systemUser = new ArrayList<Map<String, String>>(); protected static char[] hex = "0123456789abcdef".toCharArray(); protected static CacheManager cm = null; protected static MessageDigest md = null; protected ISystemLogger logger = SystemLogger.getLogger(GUISSOLdapClient.class); protected GUISSOClient() { } public GUISSOClient(HttpServletRequest request) { initMessageMD5(); initSystemUser(request); initRedis(request); } public Map<String, String> getUserByLogin(String loginId, String loginPwd) { Map<String, String> userInfo = new HashMap<String, String>(); userInfo = getUserBySystem(loginId, loginPwd); if (userInfo.isEmpty()) { userInfo = getUserByCache(loginId, loginPwd); } if 
(userInfo.isEmpty()) { userInfo = getUserByLoginImpl(loginId, loginPwd); if (!userInfo.isEmpty()) { userInfo.put("loginPwd", bytes2Hex(md.digest(loginPwd.getBytes()))); cm.putJSON("apphub.ldap.cache", loginId, userInfo, 7L, TimeUnit.DAYS); loggerInfo("用户信息缓存 ", "保存7天", "成功", loginId); } } if (!userInfo.isEmpty()) { userInfo.remove("loginPwd"); loggerInfo("用户登录 ", "信息获取", "成功", userInfo.toString()); } return userInfo; } /** * return Map key schema : * * "loginId":loginId * * "groupId":所属组织架构字符串 * * "emailList":所属业务组字符串 */ protected abstract Map<String, String> getUserByLoginImpl(String loginId, String loginPwd); /** * return List<Map> key schema : * * "name":用户姓名 * * "email":用户邮箱(登录账户) * * "groupId":所属组织架构字符串 * * "emailList":所属业务组字符串 */ public abstract List<Map<String, String>> getUserByQuery(String email); /** * return Map key schema : * * "name":业务组名(String类型) * * "email":业务组邮箱或标识(String类型) * * "groupIdFiltered":业务组成员所属的所有组织架构(Set<String>类型) * * "emailList_Filtered":业务组成员所属的所有业务组 (Set<String>类型) * * "userInfo":业务组成员信息,包含name、email、groupID、emailList。(List<Map<String,String>>类型) */ public abstract Map<String, Object> getEmailListByQuery(String email); // ======================================init begin======================================== private void initMessageMD5() { try { if (null == md) { md = MessageDigest.getInstance("MD5"); } } catch (NoSuchAlgorithmException e) { loggerError("initMessageMD5", "", e); } } private void initSystemUser(HttpServletRequest request) { if (systemUser.isEmpty()) { String adminLoginid = request.getServletContext().getInitParameter("uav.apphub.sso.admin.loginid"); String adminPassword = request.getServletContext().getInitParameter("uav.apphub.sso.admin.password"); Map<String, String> adminInfo = new HashMap<String, String>(); adminInfo.put("loginId", adminLoginid); adminInfo.put("loginPwd", <PASSWORD>); adminInfo.put("groupId", "uav_admin"); adminInfo.put("emailList", "UAV.ADMIN.EMAIL.LIST"); systemUser.add(adminInfo); String 
guestLoginid = request.getServletContext().getInitParameter("uav.apphub.sso.guest.loginid"); String guestPassword = request.getServletContext().getInitParameter("uav.apphub.sso.guest.password"); Map<String, String> guestInfo = new HashMap<String, String>(); guestInfo.put("loginId", guestLoginid); guestInfo.put("loginPwd", <PASSWORD>); guestInfo.put("groupId", "uav_guest"); guestInfo.put("emailList", "UAV.GUEST.EMAIL.LIST"); systemUser.add(guestInfo); } } @SuppressWarnings("unchecked") private void initRedis(HttpServletRequest request) { if (null == cm) { String redisAddrStr = request.getServletContext().getInitParameter("uav.app.ldap.redis.store.addr"); Map<String, Object> redisParamsMap = JSONHelper.toObject( request.getServletContext().getInitParameter("uav.app.ldap.redis.store.params"), Map.class); cm = CacheManagerFactory.build(redisAddrStr, Integer.valueOf(String.valueOf(redisParamsMap.get("min"))), Integer.valueOf(String.valueOf(redisParamsMap.get("max"))), Integer.valueOf(String.valueOf(redisParamsMap.get("queue"))), String.valueOf(redisParamsMap.get("pwd"))); } } private Map<String, String> getUserBySystem(String loginId, String loginPwd) { Map<String, String> result = new HashMap<String, String>(); for (Map<String, String> user : systemUser) { if (user.get("loginId").equals(loginId) && user.get("loginPwd").equals(loginPwd)) { result.putAll(user); break; } } if (!result.isEmpty()) { loggerInfo("系统用户 ", "校验", "成功", loginId); } return result; } @SuppressWarnings("unchecked") private Map<String, String> getUserByCache(String loginId, String loginPwd) { Map<String, String> result = new HashMap<String, String>(); if (cm.exists("apphub.ldap.cache", loginId)) { Map<String, String> cache = cm.getJSON("apphub.ldap.cache", loginId, Map.class); if (bytes2Hex(md.digest(loginPwd.getBytes())).equals(cache.get("loginPwd"))) { result = cache; } } if (!result.isEmpty()) { loggerInfo("缓存用户 ", "校验", "成功", loginId); } return result; } private String bytes2Hex(byte[] bys) { 
char[] chs = new char[bys.length * 2]; int stopCondition = bys.length; int offset = 0; for (int i = 0; i < stopCondition; i++) { chs[offset++] = hex[bys[i] >> 4 & 0xf]; chs[offset++] = hex[bys[i] & 0xf]; } return new String(chs); } protected void loggerInfo(String title, String action, String result, String msg) { logger.info(this, title + "[" + action + "]" + result + " " + msg); } protected void loggerError(String title, String msg, Exception e) { logger.err(this, title + "[错误] " + msg, e.getMessage(), e); } }
java
{"smooth-scroll.js":"sha256-MMwd7yq/pl1dFNFZYEn+HQ9w8cHfShnOj+lZe4JirTo=","smooth-scroll.min.js":"sha256-+w5dKMPfbXFge3jZueYprckx25keRFuM1ENaRgzVZ1I=","smooth-scroll.polyfills.js":"sha256-OVcD+kL0rWLIGaEr2yH4VsDEvovruY9uM+OKgvxDFXs=","smooth-scroll.polyfills.min.js":"sha256-1evUOKgG2C7IxAF7VT6zZ4MWR6Jf0PpKRqxInGmxoYU="}
json
use std::convert::TryFrom; use chrono::{TimeZone, Utc}; use fake::{Fake, Faker}; use middleware_wrapper_atrust::{atrustapi::return_codes::ReturnCode, helpers::ffi, idesscd::*}; use once_cell::sync::Lazy; use serial_test::serial; use wiremock::{ matchers::{method, path}, Mock, MockServer, Request, Respond, ResponseTemplate, }; #[macro_use] mod helpers; static SCU_URL: Lazy<Option<String>> = Lazy::new(|| std::env::var("SCU_URL").ok()); const CONFIG_FILE: &str = "./tests/asigntseonline.conf"; const CONFIG_FILE_TARGET: &str = "./target/asigntseonline.conf"; static MOCK_IDESSCD: Lazy<MockIDeSscd> = Lazy::new(|| { let mut mock_idesscd = MockIDeSscd::new(); mock_idesscd.expect_get_tse_info().returning(|| Ok(Faker.fake::<TseInfo>())); mock_idesscd.expect_start_transaction().returning(|_| Ok(Faker.fake::<StartTransactionResponse>())); mock_idesscd.expect_update_transaction().returning(|_| Ok(Faker.fake::<UpdateTransactionResponse>())); mock_idesscd.expect_finish_transaction().returning(|_| Ok(Faker.fake::<FinishTransactionResponse>())); mock_idesscd.expect_set_tse_state().returning(|_| Ok(Faker.fake::<TseState>())); mock_idesscd.expect_register_client_id().returning(|_| Ok(Faker.fake::<RegisterClientIdResponse>())); mock_idesscd.expect_unregister_client_id().returning(|_| Ok(Faker.fake::<UnregisterClientIdResponse>())); mock_idesscd.expect_execute_set_tse_time().returning(|| Ok(())); mock_idesscd.expect_execute_self_test().returning(|| Ok(())); mock_idesscd.expect_start_export_session().returning(|_| Ok(Faker.fake::<StartExportSessionResponse>())); mock_idesscd.expect_start_export_session_by_time_stamp().returning(|_| Ok(Faker.fake::<StartExportSessionResponse>())); mock_idesscd.expect_start_export_session_by_transaction().returning(|_| Ok(Faker.fake::<StartExportSessionResponse>())); mock_idesscd.expect_export_data().returning(|_| Ok(Faker.fake::<ExportDataResponse>())); mock_idesscd.expect_end_export_session().returning(|_| Ok(Faker.fake::<EndExportSessionResponse>())); 
mock_idesscd.expect_echo().returning(|request| Ok(ScuDeEchoResponse { message: request.message.clone() })); mock_idesscd }); pub struct FakerResponder(Box<dyn Fn(String) -> String + Send + Sync>); impl FakerResponder { fn post<REQ: Send + Sync + for<'de> serde::Deserialize<'de>, RES: Send + Sync + serde::Serialize, C: Fn(REQ) -> RES + 'static + Send + Sync>(mock: C) -> FakerResponder { FakerResponder(Box::new(move |req: String| { let de = serde_json::de::from_str(&req).unwrap(); let res = mock(de); serde_json::to_string(&res).unwrap() })) } fn get<RES: Send + Sync + serde::Serialize, C: Fn() -> RES + 'static + Send + Sync>(mock: C) -> FakerResponder { FakerResponder(Box::new(move |_: String| { let res = mock(); serde_json::to_string(&res).unwrap() })) } } impl Respond for FakerResponder { fn respond(&self, request: &Request) -> ResponseTemplate { ResponseTemplate::new(200).set_body_string(self.0(String::from_utf8(request.body.clone()).unwrap())) } } static SETUP_MOCK_SERVER: Lazy<MockServer> = Lazy::new(|| { async_std::task::block_on(async { let mock_server = MockServer::start().await; let config = std::fs::read_to_string(CONFIG_FILE).unwrap(); if let Some(scu_url) = SCU_URL.as_ref() { std::fs::write(CONFIG_FILE_TARGET, config.replace("{{ scu_url }}", scu_url)).unwrap(); return mock_server; } std::fs::write(CONFIG_FILE_TARGET, config.replace("{{ scu_url }}", &mock_server.uri())).unwrap(); Mock::given(method("POST")).and(path("/v1/starttransaction")).respond_with(FakerResponder::post(|req| MOCK_IDESSCD.start_transaction(&req).unwrap())).mount(&mock_server).await; Mock::given(method("POST")).and(path("/v1/updatetransaction")).respond_with(FakerResponder::post(|req| MOCK_IDESSCD.update_transaction(&req).unwrap())).mount(&mock_server).await; Mock::given(method("POST")).and(path("/v1/finishtransaction")).respond_with(FakerResponder::post(|req| MOCK_IDESSCD.finish_transaction(&req).unwrap())).mount(&mock_server).await; 
Mock::given(method("GET")).and(path("/v1/tseinfo")).respond_with(FakerResponder::get(|| MOCK_IDESSCD.get_tse_info().unwrap())).mount(&mock_server).await; Mock::given(method("POST")).and(path("/v1/tsestate")).respond_with(FakerResponder::post(|req| MOCK_IDESSCD.set_tse_state(&req).unwrap())).mount(&mock_server).await; Mock::given(method("POST")).and(path("/v1/registerclientid")).respond_with(FakerResponder::post(|req| MOCK_IDESSCD.register_client_id(&req).unwrap())).mount(&mock_server).await; Mock::given(method("POST")).and(path("/v1/unregisterclientid")).respond_with(FakerResponder::post(|req| MOCK_IDESSCD.unregister_client_id(&req).unwrap())).mount(&mock_server).await; Mock::given(method("POST")).and(path("/v1/executeselftest")).respond_with(FakerResponder::get(|| MOCK_IDESSCD.execute_self_test().unwrap())).mount(&mock_server).await; Mock::given(method("POST")).and(path("/v1/executesettsetime")).respond_with(FakerResponder::get(|| MOCK_IDESSCD.execute_set_tse_time().unwrap())).mount(&mock_server).await; Mock::given(method("POST")).and(path("/v1/startexportsession")).respond_with(FakerResponder::post(|req| MOCK_IDESSCD.start_export_session(&req).unwrap())).mount(&mock_server).await; Mock::given(method("POST")) .and(path("/v1/startexportsessionbytimestamp")) .respond_with(FakerResponder::post(|req| MOCK_IDESSCD.start_export_session_by_time_stamp(&req).unwrap())) .mount(&mock_server) .await; Mock::given(method("POST")) .and(path("/v1/startexportsessionbytransaction")) .respond_with(FakerResponder::post(|req| MOCK_IDESSCD.start_export_session_by_transaction(&req).unwrap())) .mount(&mock_server) .await; Mock::given(method("POST")).and(path("/v1/exportdata")).respond_with(FakerResponder::post(|req| MOCK_IDESSCD.export_data(&req).unwrap())).mount(&mock_server).await; Mock::given(method("POST")).and(path("/v1/endexportsession")).respond_with(FakerResponder::post(|req| MOCK_IDESSCD.end_export_session(&req).unwrap())).mount(&mock_server).await; 
Mock::given(method("POST")).and(path("/v1/echo")).respond_with(FakerResponder::post(|req| MOCK_IDESSCD.echo(&req).unwrap())).mount(&mock_server).await; mock_server }) }); static SETUP_ATRUSTAPI: Lazy<dlopen::symbor::Library> = Lazy::new(|| { let dylib_path = test_cdylib::build_current_project(); let dylib = dlopen::symbor::Library::open(&dylib_path).unwrap(); let cfg_set_config_file = unsafe { dylib.symbol::<extern "C" fn(*const i8, u32) -> i32>("cfgSetConfigFile").unwrap() }; assert_eq!(0, cfg_set_config_file(CONFIG_FILE_TARGET.as_ptr() as *const i8, CONFIG_FILE_TARGET.len() as u32)); let at_load = unsafe { dylib.symbol::<extern "C" fn() -> i32>("at_load").unwrap() }; assert_eq!(0, at_load()); dylib }); #[test] #[ignore = "dotnet-test"] #[serial] fn only_setup_mocks() { Lazy::<MockServer>::force(&SETUP_MOCK_SERVER); let _ = std::io::Read::read(&mut std::io::stdin(), &mut [0u8]).unwrap(); } #[test] #[serial] fn at_run_self_tests() { Lazy::<MockServer>::force(&SETUP_MOCK_SERVER); let dylib = &SETUP_ATRUSTAPI; let at_run_self_tests = unsafe { dylib.symbol::<extern "C" fn() -> i32>("at_runSelfTests").unwrap() }; let result: ReturnCode = ReturnCode::try_from(at_run_self_tests()).unwrap(); assert_eq!(result, ReturnCode::ExecutionOk); } #[test] #[serial] fn at_get_public_key_with_tse() { Lazy::<MockServer>::force(&SETUP_MOCK_SERVER); let dylib = &SETUP_ATRUSTAPI; let at_get_public_key_with_tse = unsafe { dylib.symbol::<extern "C" fn(*mut *mut u8, *mut u32, *const i8, u32) -> i32>("at_getPublicKeyWithTse").unwrap() }; let mut pub_key = std::mem::MaybeUninit::<*mut u8>::uninit(); let mut pub_key_length = std::mem::MaybeUninit::<u32>::uninit(); let tse_id = "default"; let result: ReturnCode = ReturnCode::try_from(at_get_public_key_with_tse(pub_key.as_mut_ptr(), pub_key_length.as_mut_ptr(), tse_id.as_ptr() as *const i8, tse_id.len() as u32)).unwrap(); assert_eq!(result, ReturnCode::ExecutionOk); println!("pub_key: {}", unsafe { ffi::from_cstr(*pub_key.as_ptr() as *const i8, 
*pub_key_length.as_ptr()) }); unsafe { ffi::free_ptr(pub_key.as_mut_ptr() as *mut *mut std::os::raw::c_void) }; } #[test] #[serial] fn at_register_client_id() { at_register_client_id_internal(); } fn at_register_client_id_internal() -> String { Lazy::<MockServer>::force(&SETUP_MOCK_SERVER); let dylib = &SETUP_ATRUSTAPI; let at_register_client_id = unsafe { dylib.symbol::<extern "C" fn(*const i8, u32) -> i32>("at_registerClientId").unwrap() }; let client_id: String = Faker.fake(); let result: ReturnCode = ReturnCode::try_from(at_register_client_id(client_id.as_ptr() as *const i8, client_id.len() as u32)).unwrap(); assert_eq!(result, ReturnCode::ExecutionOk); client_id } #[test] #[serial] fn start_transaction() { Lazy::<MockServer>::force(&SETUP_MOCK_SERVER); let dylib = &SETUP_ATRUSTAPI; let start_transaction = unsafe { dylib .symbol::<extern "C" fn(*const i8, u32, *const u8, u32, *const i8, u32, *const u8, u32, *mut u32, *mut i64, *mut *mut u8, *mut u32, *mut u32, *mut *mut u8, *mut u32) -> i32>("startTransaction") .unwrap() }; let client_id = at_register_client_id_internal(); let mut transaction_number = std::mem::MaybeUninit::<u32>::uninit(); let mut log_time = std::mem::MaybeUninit::<i64>::uninit(); let mut serial_number = std::mem::MaybeUninit::<*mut u8>::uninit(); let mut serial_number_length = std::mem::MaybeUninit::<u32>::uninit(); let mut signature_counter = std::mem::MaybeUninit::<u32>::uninit(); let mut signature_value = std::mem::MaybeUninit::<*mut u8>::uninit(); let mut signature_value_length = std::mem::MaybeUninit::<u32>::uninit(); let result: ReturnCode = ReturnCode::try_from(start_transaction( client_id.as_ptr() as *const i8, client_id.len() as u32, "processData".as_bytes().as_ptr(), "processData".len() as u32, "processType".as_ptr() as *const i8, "processType".len() as u32, "additionalData".as_bytes().as_ptr(), "additionalData".len() as u32, transaction_number.as_mut_ptr(), log_time.as_mut_ptr(), serial_number.as_mut_ptr(), 
serial_number_length.as_mut_ptr(), signature_counter.as_mut_ptr(), signature_value.as_mut_ptr(), signature_value_length.as_mut_ptr(), )) .unwrap(); assert_eq!(result, ReturnCode::ExecutionOk); println!("transaction_number: {}", unsafe { *transaction_number.as_ptr() }); println!("log_time: {}", Utc.timestamp(unsafe { *log_time.as_ptr() }, 0)); println!("serial_number: {}", unsafe { ffi::from_cstr(*serial_number.as_ptr() as *const i8, *serial_number_length.as_ptr()) }); println!("serial_number_length: {}", unsafe { *serial_number_length.as_ptr() }); println!("signature_counter: {}", unsafe { *signature_counter.as_ptr() }); println!("signature_value: {}", unsafe { ffi::from_cstr(*signature_value.as_ptr() as *const i8, *signature_value_length.as_ptr()) }); println!("signature_value_length: {}", unsafe { *signature_value_length.as_ptr() }); unsafe { ffi::free_ptr(serial_number.as_mut_ptr() as *mut *mut std::os::raw::c_void) }; unsafe { ffi::free_ptr(signature_value.as_mut_ptr() as *mut *mut std::os::raw::c_void) }; } #[test] #[serial] fn export_data_with_client_id() { Lazy::<MockServer>::force(&SETUP_MOCK_SERVER); let dylib = &SETUP_ATRUSTAPI; let export_data_with_client_id = unsafe { dylib.symbol::<extern "C" fn(*const i8, u32, *mut *mut u8, *mut u32) -> i32>("exportDataWithClientId").unwrap() }; let client_id = at_register_client_id_internal(); let mut exported_data = std::mem::MaybeUninit::<*mut u8>::uninit(); let mut exported_data_length = std::mem::MaybeUninit::<u32>::uninit(); let result: ReturnCode = ReturnCode::try_from(export_data_with_client_id(client_id.as_bytes().as_ptr() as *const i8, client_id.len() as u32, exported_data.as_mut_ptr(), exported_data_length.as_mut_ptr())).unwrap(); assert_eq!(result, ReturnCode::ExecutionOk); println!("exported_data: {}", base64::encode(unsafe { ffi::from_cba(*exported_data.as_ptr(), *exported_data_length.as_ptr()) })); println!("exported_data_length: {}", unsafe { *exported_data_length.as_ptr() }); unsafe { 
ffi::free_ptr(exported_data.as_mut_ptr() as *mut *mut std::os::raw::c_void) }; }
rust
Most couples experience passionate romance and an active sex life in the initial years of married life. Somewhere down the years, these get lost amidst hectic jobs, raising kids, fulfilling family responsibilities and growing older. Only a lucky few couples still keep love and romance ignited for each other, even decades into marriage. Research indicates that sex life declines as one gets older. But it does not have to be that way! People get complacent and stop taking care of their physical appearance - this is what damages most marriages. The husband-wife relationship is kept alive when both work towards making it better. Sex is not just a physical act; it also meets emotional needs. Orgasm produces hormones called endorphins, which instill feelings of happiness and satisfaction. Apart from this physical result, sex also promotes intimacy, comfort, encouragement and the realization that each one is desired by the other. It is beyond doubt that a healthy sex life is a critical determinant of a successful marriage because it keeps the bond invigorating and helps partners come closer to each other. Generally, men are seen to be more in need of sex, while women accord higher value to their social responsibilities towards children, in-laws and the house. So, most wives ignore their husbands' desire for a physical connection, which is akin to a death sentence for the marriage. Then why worry when husbands seek love outside? All men want to feel attractive to their partners, and this comes when you give them attention and make yourself attractive to them! Remember those days when each one of you continuously exchanged love messages…where have they gone? Is it because you don't feel the need, or because you think your partner already knows? Neither reason makes for a great married life. Here are a few strategies for an exciting sex life and a stronger marriage: - Take out time for each other. - Keep yourself physically attractive. This is especially true for women. - Be adventurous, just as you were in the first year of marriage. - Don't be overburdened by the rigors of domestic, social and work life. - Do things together - listen to music, watch movies and why not 'date nights' on the weekends. - Never bypass foreplay - it's an effective tool to explore each other's sensitive parts.
english
The World Health Organisation's chief scientist, Dr Soumya Swaminathan, said that the agency is optimistic and hopeful that Covid-19 vaccines could be available before the end of this year. The WHO scientist also said that clinical trials have now definitively shown that the anti-malarial drug hydroxychloroquine does not have an impact on preventing deaths from Covid-19. In reference to a future vaccine against the deadly virus, she said there are about 10 candidates in the human-testing phase, and at least three of them are entering the promising phase-three stage, which proves a vaccine's efficacy. “I'm hopeful, I'm optimistic, but vaccine development is a complex undertaking and comes with a lot of uncertainty. The good thing is that we have many different vaccine candidates and platforms,” she said.
english
Monday December 04, 2017. Value is an imaginary concept. A Hindu icon becomes a deity after the ritual of ‘prana-pratishtha’, which means the ‘breath placement’ ceremony. After this, the icon is seen as being sentient, alive, capable of listening to the petition of the devotee and responding to it. A non-believer will never see value in the icon as a believer will. A believer will invest time, money and attention on the deity while the non-believer will simply keep the icon on a shelf, as decoration perhaps. The value of an icon depends on other things too – how old it is, how rare it is, how desired it is by other collectors. Many antique sellers in Mumbai have never understood the fascination of rich people with what they consider junk. But they indulge it for a premium price. What is being sold here? A thing, or the perception of a thing? That perception is a brand. In India, a great idea that has great political and social capital is the concept of honour, or pride, especially clan pride embodied in the moustache of a man. And it is usually located on the body of a woman. She embodies honour. She embodies pride. In Sanskrit literature there is the concept of ‘a-surya-sparsha’ or ‘one unseen by the sun’. Such a lady obviously belongs to a rich aristocratic household and does not have to work in the fields or go to the village well. Such a lady is obviously ‘un-tanned’ and ‘delicate’. In male-dominated societies, such women were kept in ivory towers with chastity belts, or in inner courtyards with veils. The fewer people saw them, the fewer people touched them, the more valuable they became, and so the greater symbols of honour. Goddesses almost. A goddess without power, of course, as she had to submit to the fierce male scrutiny and prove she was worthy of the status imposed upon her. In a world that propagates ‘equality’, such valorisation of ‘honour’ and glamorisation of ‘female purity and chastity’ makes no sense. 
But it does make sense to people who have nothing else in their life – neither economic nor political power – but honour. In such societies, battles are fought for a woman’s honour. She is a trophy to be protected or booty to be claimed. Tribal kings expected these honourable women to die with them, thus carrying their honour to the afterlife. If the king died in battle, these women were to kill themselves, so no one could claim them, preferably in fire, so even their corpses would be out of reach. Like ‘honour’, ‘holiness’ is also a currency. Icons become holy. Teachers become holy, and so everything they touch becomes holy. Followers of gurus will pay vast sums of money to own a piece of clothing worn by the guru, or a cup used by the guru. Buddhist monasteries venerate relics of the Buddha – tooth and bone fragments found after his body was cremated. In secular times, holiness is replaced by ‘glamour’. We want to own the clothes of a superstar, the notes of a scientist, the guitar of a musician, even the divorce papers of a queen. In each case, the value is far more than the material worth of the object in question. We would like to believe that the market is a rational place. That there is something called ‘fair’ value that a state or regulating authority can define. But when it comes to value, everything goes topsy-turvy. For what is fair to the believer is not just unfair but absurd to the non-believer. Charles Darwin, who wrote about evolution, did not share the religious beliefs of his wife Emma. She did not agree with him either. But they both loved each other until the very end, despite valuing different things. The ability to respect other people’s values is the hallmark of civilisation. It is most distasteful to have other people’s values imposed upon us: and that holds true for the doctrine of ‘universal values’ that assumes everyone has to believe in the same thing. 
There are many people who will not value ‘honour’, many who will not value ‘human rights’, and many who will not value ‘holiness’; the global village has to find a way of living with these multiple value systems, rather than trying to seek a universal one. This article was first published on 24th November 2017, in the Economic Times. (Disclaimer: The views and opinions expressed in this article are those of the author and do not necessarily reflect the views of YourStory.)
english
[ { "merged": "/home/abhijeet/dev/projects/jwplayer/jwplayer-sdk-android-demo/CastCompanionLibrary-android/build/intermediates/res/merged/release/drawable-mdpi-v11/ic_stat_content_remove.png", "source": "/home/abhijeet/dev/projects/jwplayer/jwplayer-sdk-android-demo/CastCompanionLibrary-android/res/drawable-mdpi-v11/ic_stat_content_remove.png" }, { "merged": "/home/abhijeet/dev/projects/jwplayer/jwplayer-sdk-android-demo/CastCompanionLibrary-android/build/intermediates/res/merged/release/drawable-mdpi-v11/ic_stat_action_democast.png", "source": "/home/abhijeet/dev/projects/jwplayer/jwplayer-sdk-android-demo/CastCompanionLibrary-android/res/drawable-mdpi-v11/ic_stat_action_democast.png" }, { "merged": "/home/abhijeet/dev/projects/jwplayer/jwplayer-sdk-android-demo/CastCompanionLibrary-android/build/intermediates/res/merged/release/drawable-mdpi-v11/ic_stat_action_notification.png", "source": "/home/abhijeet/dev/projects/jwplayer/jwplayer-sdk-android-demo/CastCompanionLibrary-android/res/drawable-mdpi-v11/ic_stat_action_notification.png" } ]
json
'use strict'; module.exports = function () { return { 'cd-badges': { 'listBadges': [{ role: 'none' }], // NOTE: Must be defined by visibility ? 'getBadge': [{ role: 'none' }], 'sendBadgeApplication': [{ role: 'basic-user', customValidator: [{ role: 'cd-dojos', cmd: 'can_award_badge' }] }], 'acceptBadge': [{ role: 'basic-user', // TODO : this is buggy, it seems the userId in the badge is not set making the validation not working // customValidator: [{ // role: 'cd-badges', // cmd: 'ownBadge' // }] }], // NOTE: Must be defined by visibility ? 'loadUserBadges': [{ role: 'basic-user' }], 'loadBadgeCategories': [{ role: 'none' }], 'loadBadgeByCode': [{ role: 'none' }], 'claimBadge': [{ role: 'basic-user' }], 'exportBadges': [{ role: 'basic-user', customValidator: [{ role: 'cd-users', cmd: 'is_self' }] }], 'kpiNumberOfBadgesAwarded': [{ role: 'cdf-admin' }], 'kpiNumberOfBadgesPublished' :[{ role: 'cdf-admin' }] } }; };
javascript
package util import ( "fmt" "github.com/go-kit/kit/log" "github.com/go-kit/kit/log/level" "net/http" "os" ) type EcsLogger struct { log.Logger } // LogHTTP implements the Logger interface of the Scaleway API. func (l *EcsLogger) LogHTTP(r *http.Request) { _ = level.Debug(l).Log("msg", "HTTP request", "method", r.Method, "url", r.URL.String()) } // Fatalf implements the Logger interface of the Scaleway API. func (l *EcsLogger) Fatalf(format string, v ...interface{}) { _ = level.Error(l).Log("msg", fmt.Sprintf(format, v...)) os.Exit(1) } // Debugf implements the Logger interface of the Scaleway API. func (l *EcsLogger) Debugf(format string, v ...interface{}) { _ = level.Debug(l).Log("msg", fmt.Sprintf(format, v...)) } // Infof implements the Logger interface of the Scaleway API. func (l *EcsLogger) Infof(format string, v ...interface{}) { _ = level.Info(l).Log("msg", fmt.Sprintf(format, v...)) } // Warnf implements the Logger interface of the Scaleway API. func (l *EcsLogger) Warnf(format string, v ...interface{}) { _ = level.Warn(l).Log("msg", fmt.Sprintf(format, v...)) } // Println implements the Logger interface of the promhttp package. func (l *EcsLogger) Println(v ...interface{}) { _ = level.Error(l).Log("msg", fmt.Sprintln(v...)) }
go
Faced with a backlog of applications for agricultural power connections, the state-owned power distribution company, Paschim Gujarat Vij Company Limited (PGVCL), plans to undertake a pilot project of installing solar water pumps on agricultural farms. "We have approval from the Gujarat Urja Vikas Nigam Limited (GUVNL), the holding company of PGVCL. We shall install five solar water pumps on a pilot basis on agricultural research farms of Junagadh Agricultural University (JAU). Tenders for the same will be issued shortly," said Sandeep Kumar, managing director (MD), PGVCL, on Thursday. Kumar said the pumps would be installed in different parts of Saurashtra to test their effectiveness and viability. The JAU would not have any financial obligation for the project, he added. The JAU has 17 research centres spread across seven districts of Saurashtra. The project is part of the efforts by GUVNL to diversify its sources of energy and focus on harnessing solar power. "Depending on the success of the pilot project, we shall decide on replicating the model on a larger scale," Kumar said. According to Kumar, if the project succeeds, the solar water pumps would help clear the backlog of applications seeking agricultural power connections. As many as 2.5 lakh applications were pending with PGVCL in February this year, while the figure for the state stood at 4.6 lakh. According to sources, Madhya Gujarat Vij Company Limited (MGVCL), a subsidiary of PGVCL, will implement a similar project in central Gujarat.
english
After four years of constant refusal to resume bilateral cricketing relations and giving Pakistan cricket the cold shoulder, the ice has finally started to melt. The Board of Control for Cricket in India (BCCI) finally decided to restart the most intense cricket rivalry in the world, having invited Pakistan to play three One-Day Internationals and two Twenty20s in India later this year. With bilateral cricket ties with India on hold ever since the 2008 Mumbai attacks, followed by the attack on the Sri Lankan team in Lahore in 2009, not to mention the 2010 spot-fixing saga and Pakistan being stripped of hosting rights of the 2011 World Cup, it was quite evident that the country’s cricket had been on a downward spiral for the past few years. In recent times, however, the national team’s creditable performance has given the fans something to cheer about. However, the off-the-field realities are the ones that have continued to ensure that Pakistan cricket remains in the doldrums on many fronts, including the painful realisation that it would be many years before we are able to host international cricket and that the Pakistan Cricket Board (PCB) will continue to face serious financial problems in the foreseeable future. In such a scenario, a series with India, even if it is not on home soil, will hopefully ensure that the severity of the financial crunch that the PCB currently faces will be alleviated to an extent. However, beyond the obvious financial benefits that both the boards will derive from this series, we need to look at Pakistan-India cricket ties on a much broader canvas. 
If anyone doubts the importance of cricket ties between the arch-rivals to the health of world cricket, one only has to recall the high voltage, riveting contest that unfolded when Pakistan met India in the semi-final of the last World Cup, en route to becoming the world champions. Top-flight cricket is played amongst a handful of nations and if two of those countries — which incidentally also provide one of the most intense rivalries in international sport — refuse to face each other, world cricket will definitely be poorer for it. More often than not, cricket relations between the two countries have been held hostage to the political climate prevalent between the neighbours. It is disappointing that certain quarters in India have not welcomed this thaw in cricket ties. One wonders why those who oppose this resumption have only spoken out against this aspect of improved relations between the two countries. Trade ties between the neighbours are improving, Pakistan’s president recently visited India, as did the country’s foreign secretary. Why this vocal outburst only against the resumption of cricket ties? To all those who believe that cricket should continue to be held hostage to the wider issues between the two countries, it would be a good reminder that just five years after a bloody Partition, which had engendered harrowing violence, bloodshed and extreme intolerance in its wake, 16 men from Pakistan went to play cricket in India. Pakistan’s inaugural 1952 tour of India was held amidst a much more hostile climate than is prevalent right now. But it still ended up giving hope that cordial relations between the people of the two countries were possible and that maintaining sporting ties was one way of generating good will on both sides of the Wagah border. (Amna Lone works for The Express Tribune’s editorial pages. The above article is reproduced with permission from http://tribune.com.pk/)
english
#![allow(non_snake_case)] //! This example proves that instantly resolving futures don't cause issues use dioxus::prelude::*; fn main() { dioxus::desktop::launch(App); } fn App(cx: Scope) -> Element { cx.render(rsx!(Demo {})) } fn Demo(cx: Scope) -> Element { let fut1 = use_future(&cx, (), |_| async move { std::thread::sleep(std::time::Duration::from_millis(100)); 10 }); cx.render(match fut1.value() { Some(value) => { let content = format!("content : {:?}", value); rsx!(div{ "{content}" }) } None => rsx!(div{"computing!"}), }) }
rust
{ "bundle.symbolic_name" : "qcor_quil_token", "bundle.activator" : true, "bundle.name" : "QUIL Token Collector", "bundle.description" : "" }
json
{ "name": "dataopen2020", "version": "1.0.0", "license": "MIT", "scripts": { "dev": "parcel visualization/index.html --out-dir visualization/dist --public-url /dataopen-f20", "build": "parcel build visualization/index.html --out-dir visualization/dist --public-url /dataopen-f20", "deploy": "gh-pages --dist visualization/dist" }, "devDependencies": { "gh-pages": "^3.1.0", "parcel-bundler": "^1.12.4", "prettier": "^2.1.2" }, "dependencies": { "mapbox-gl": "^1.12.0" } }
json
package scraper.api;

public interface InstanceAddress extends Address {}
java
/**
 * Licensed to Jasig under one or more contributor license
 * agreements. See the NOTICE file distributed with this work
 * for additional information regarding copyright ownership.
 * Jasig licenses this file to you under the Apache License,
 * Version 2.0 (the "License"); you may not use this file
 * except in compliance with the License. You may obtain a
 * copy of the License at:
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on
 * an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied. See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */
package org.apereo.portal.permission.target;

import java.io.Serializable;
import java.util.Collection;
import java.util.HashSet;
import java.util.Set;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apereo.portal.layout.dlm.remoting.IGroupListHelper;
import org.apereo.portal.layout.dlm.remoting.JsonEntityBean;
import org.apereo.portal.permission.target.IPermissionTarget.TargetType;
import org.apereo.portal.security.IPermission;
import org.springframework.beans.factory.annotation.Autowired;

/**
 * EntityTargetProviderImpl provides uPortal entity keys as targets. Instances
 * of this implementation may indicate which entity types may be used as targets.
 * Target keys will be the key of the underlying entity itself, while the
 * target human-readable name will similarly be the name of the entity.
 *
 * TODO: This implementation currently has a number of problems. The code
 * uses the EntityEnum class and is hardcoded to only recognize four types
 * of entities: uPortal person groups, person entities, portlet categories,
 * and portlet entities. This code also may perform poorly for large
 * portal installations for searches that return many results.
 *
 * @author <NAME>, <EMAIL>
 * @version $Revision$
 * @since 3.3
 */
public class EntityTargetProviderImpl implements IPermissionTargetProvider, Serializable {

    private static final IPermissionTarget ALL_CATEGORIES_TARGET =
            new PermissionTargetImpl(IPermission.ALL_CATEGORIES_TARGET,
                    IPermission.ALL_CATEGORIES_TARGET, TargetType.CATEGORY);

    private static final IPermissionTarget ALL_GROUPS_TARGET =
            new PermissionTargetImpl(IPermission.ALL_GROUPS_TARGET,
                    IPermission.ALL_GROUPS_TARGET, TargetType.GROUP);

    private static final IPermissionTarget ALL_PORTLETS_TARGET =
            new PermissionTargetImpl(IPermission.ALL_PORTLETS_TARGET,
                    IPermission.ALL_PORTLETS_TARGET, TargetType.PORTLET);

    private static final long serialVersionUID = 1L;

    private Set<TargetType> allowedTargetTypes = new HashSet<>();

    protected transient final Log log = LogFactory.getLog(getClass());

    private transient IGroupListHelper groupListHelper;

    @Autowired(required = true)
    public void setGroupListHelper(IGroupListHelper helper) {
        this.groupListHelper = helper;
    }

    /**
     * Construct a new instance of targets matching the set of allowed
     * target entity types.
     *
     * @param targetTypeNames
     */
    public EntityTargetProviderImpl(Set<String> targetTypeNames) {
        /*
         * Arguably this logic should be moved to the TargetType enum itself;
         * but this sort of mapping only occurs (afaik) for "entities."
         */
        for (String name : targetTypeNames) {
            switch (name) {
                case "person":
                    allowedTargetTypes.add(TargetType.PERSON);
                    break;
                case "group":
                    allowedTargetTypes.add(TargetType.GROUP);
                    break;
                case "portlet":
                    allowedTargetTypes.add(TargetType.PORTLET);
                    break;
                case "category":
                    allowedTargetTypes.add(TargetType.CATEGORY);
                    break;
                default:
                    String msg = "Unrecognized targetTypeName: " + name;
                    throw new RuntimeException(msg);
            }
        }
    }

    /**
     * The <code>key</code> parameter <em>should</em> specify a unique entity
     * across all 4 supported types: people, groups, portlets, and categories.
     *
     * Concrete examples of working keys:
     * <ul>
     *   <li>defaultTemplateUser (user)</li>
     *   <li>local.0 (group)</li>
     *   <li>PORTLET_ID.82 (portlet)</li>
     *   <li>local.1 (category)</li>
     * </ul>
     */
    public IPermissionTarget getTarget(String key) {
        /*
         * If the specified key matches one of the "all entity" style targets,
         * just return the appropriate target.
         */
        switch (key) {
            case IPermission.ALL_CATEGORIES_TARGET:
                return ALL_CATEGORIES_TARGET;
            case IPermission.ALL_PORTLETS_TARGET:
                return ALL_PORTLETS_TARGET;
            case IPermission.ALL_GROUPS_TARGET:
                return ALL_GROUPS_TARGET;
            // Else just fall through...
        }

        /*
         * Attempt to find a matching entity for each allowed entity type. This
         * implementation will return the first entity that it finds. If the
         * portal contains duplicate entity keys across multiple types, it's
         * possible that this implementation would demonstrate inconsistent
         * behavior.
         */
        for (TargetType targetType : allowedTargetTypes) {
            JsonEntityBean entity = groupListHelper.getEntity(targetType.toString(), key, false);
            if (entity != null) {
                IPermissionTarget target =
                        new PermissionTargetImpl(entity.getId(), entity.getName(), targetType);
                return target;
            }
        }

        return null;
    }

    /*
     * (non-Javadoc)
     * @see org.apereo.portal.permission.target.IPermissionTargetProvider#searchTargets(java.lang.String)
     */
    public Collection<IPermissionTarget> searchTargets(String term) {
        // Initialize a new collection of matching targets. We use a HashSet
        // implementation here to prevent duplicate target entries.
        Collection<IPermissionTarget> matching = new HashSet<IPermissionTarget>();

        /*
         * Attempt to find matching entities for each allowed entity type.
         * Any matching entities will be added to our collection.
         */
        for (TargetType targetType : allowedTargetTypes) {
            Set<JsonEntityBean> entities = groupListHelper.search(targetType.toString(), term);
            for (JsonEntityBean entity : entities) {
                IPermissionTarget target =
                        new PermissionTargetImpl(entity.getId(), entity.getName(), targetType);
                matching.add(target);
            }
        }

        if (IPermission.ALL_CATEGORIES_TARGET.contains(term)) {
            matching.add(ALL_CATEGORIES_TARGET);
        } else if (IPermission.ALL_PORTLETS_TARGET.contains(term)) {
            matching.add(ALL_PORTLETS_TARGET);
        } else if (IPermission.ALL_GROUPS_TARGET.contains(term)) {
            matching.add(ALL_GROUPS_TARGET);
        }

        // return the list of matching targets
        return matching;
    }
}
java
// repository: josephst/MedlineXmlToDatabase
package org.ohdsi.meshXmlToDatabase;

import java.io.FileInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.zip.GZIPInputStream;

import javax.xml.parsers.ParserConfigurationException;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;

import org.ohdsi.databases.InsertableDbTable;
import org.ohdsi.utilities.StringUtilities;
import org.ohdsi.utilities.files.Row;
import org.xml.sax.Attributes;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;

public class MainMeshParser extends DefaultHandler {

    private InsertableDbTable outTerms;
    private InsertableDbTable outRelationship;
    private Map<String, String> treeNumberToUi;
    private Row row;
    private Trace trace = new Trace();
    private String ui;

    public MainMeshParser(InsertableDbTable outTerms, InsertableDbTable outRelationship,
            Map<String, String> treeNumberToUi) {
        super();
        this.outRelationship = outRelationship;
        this.outTerms = outTerms;
        this.treeNumberToUi = treeNumberToUi;
    }

    public static void parse(String fileName, InsertableDbTable outTerms,
            InsertableDbTable outRelationship, Map<String, String> treeNumberToUi) {
        StringUtilities.outputWithTime("Parsing main file");
        try {
            FileInputStream fileInputStream = new FileInputStream(fileName);
            GZIPInputStream gzipInputStream = new GZIPInputStream(fileInputStream);
            MainMeshParser mainMeshParser = new MainMeshParser(outTerms, outRelationship, treeNumberToUi);
            SAXParserFactory factory = SAXParserFactory.newInstance();
            SAXParser saxParser = factory.newSAXParser();
            saxParser.parse(gzipInputStream, mainMeshParser);
        } catch (org.xml.sax.SAXException e) {
            e.printStackTrace();
        } catch (ParserConfigurationException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void startElement(String uri, String localName, String name, Attributes a) {
        trace.push(name);
        if (name.equalsIgnoreCase("DescriptorRecord"))
            row = new Row();
    }

    public void characters(char ch[], int start, int length) throws SAXException {
        String traceString = trace.toString();
        if (traceString.equalsIgnoreCase("DescriptorRecordSet.DescriptorRecord.DescriptorUI")) {
            ui = new String(ch, start, length);
            row.add("ui", ui);
        } else if (traceString.equalsIgnoreCase("DescriptorRecordSet.DescriptorRecord.DescriptorName.String")) {
            row.add("name", new String(ch, start, length));
        } else if (traceString.equalsIgnoreCase("DescriptorRecordSet.DescriptorRecord.TreeNumberList.TreeNumber")) {
            treeNumberToUi.put(new String(ch, start, length), ui);
        } else if (traceString.equalsIgnoreCase(
                "DescriptorRecordSet.DescriptorRecord.PharmacologicalActionList.PharmacologicalAction.DescriptorReferredTo.DescriptorUI")) {
            Row rowPa = new Row();
            rowPa.add("ui_1", ui);
            rowPa.add("ui_2", new String(ch, start, length));
            rowPa.add("relationship_id", "Pharmacological action");
            outRelationship.write(rowPa);
        }
    }

    public void endElement(String uri, String localName, String name) {
        trace.pop();
        if (name.equalsIgnoreCase("DescriptorRecord")) {
            row.add("supplement", "0");
            outTerms.write(row);
        }
    }

    private class Trace {
        private List<String> tags = new ArrayList<String>();

        public void push(String tag) {
            tags.add(tag);
        }

        public void pop() {
            tags.remove(tags.size() - 1);
        }

        public String toString() {
            return StringUtilities.join(tags, ".");
        }
    }
}
java
// aliyun-java-sdk-miniapplcdp/src/main/java/com/aliyuncs/miniapplcdp/transform/v20200113/CreateLogicModelResponseUnmarshaller.java
/*
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.aliyuncs.miniapplcdp.transform.v20200113;

import com.aliyuncs.miniapplcdp.model.v20200113.CreateLogicModelResponse;
import com.aliyuncs.miniapplcdp.model.v20200113.CreateLogicModelResponse.Data;

import java.util.Map;

import com.aliyuncs.transform.UnmarshallerContext;

public class CreateLogicModelResponseUnmarshaller {

    public static CreateLogicModelResponse unmarshall(
            CreateLogicModelResponse createLogicModelResponse, UnmarshallerContext _ctx) {

        createLogicModelResponse.setRequestId(_ctx.stringValue("CreateLogicModelResponse.RequestId"));

        Data data = new Data();
        data.setCreateTime(_ctx.stringValue("CreateLogicModelResponse.Data.CreateTime"));
        data.setModelType(_ctx.stringValue("CreateLogicModelResponse.Data.ModelType"));
        data.setSubType(_ctx.stringValue("CreateLogicModelResponse.Data.SubType"));
        data.setRevision(_ctx.integerValue("CreateLogicModelResponse.Data.Revision"));
        data.setModifiedTime(_ctx.stringValue("CreateLogicModelResponse.Data.ModifiedTime"));
        data.setDescription(_ctx.stringValue("CreateLogicModelResponse.Data.Description"));
        data.setSchemaVersion(_ctx.stringValue("CreateLogicModelResponse.Data.SchemaVersion"));
        data.setAppId(_ctx.stringValue("CreateLogicModelResponse.Data.AppId"));
        data.setProps(_ctx.mapValue("CreateLogicModelResponse.Data.Props"));
        data.setModelStatus(_ctx.stringValue("CreateLogicModelResponse.Data.ModelStatus"));
        data.setModelName(_ctx.stringValue("CreateLogicModelResponse.Data.ModelName"));
        data.setContent(_ctx.mapValue("CreateLogicModelResponse.Data.Content"));
        data.setId(_ctx.stringValue("CreateLogicModelResponse.Data.Id"));
        data.setModelId(_ctx.stringValue("CreateLogicModelResponse.Data.ModelId"));
        createLogicModelResponse.setData(data);

        return createLogicModelResponse;
    }
}
java
{ "citations" : [ { "textCitation" : "[See ex-natded5.3 on Metamath](http://us.metamath.org/mpegif/ex-natded5.3.html)" } ], "names" : [ "ex-natded5.3" ], "language" : "METAMATH_SET_MM", "lookupTerms" : [ "#T_wph", "#T_wi", "#T_wps", "#T_wi", "#T_wch", "#T_wph", "#T_wi", "#T_wch", "#T_wi", "#T_wth", "#T_wph", "#T_wi", "#T_wps", "#T_wi", "#T_wch", "#T_wa", "#T_wth" ], "metaLanguage" : "METAMATH", "remarks" : " Theorem 5.3 of [Clemente] p. 16, translated line by line using an interpretation of natural deduction in Metamath. A much more efficient proof, using more of Metamath and MPE's capabilities, is shown in ~ ex-natded5.3-2 . A proof without context is shown in ~ ex-natded5.3i . For information about ND and Metamath, see the <HTML> <A HREF=\"mmnatded.html\">page on Deduction Form and Natural Deduction in Metamath Proof Explorer</A></HTML>. The original proof, which uses Fitch style, was written as follows: <HTML> <TABLE BORDER> <TR><TH NOWRAP>#</TH><TH>MPE#</TH><TH>ND Expression</TH> <TH NOWRAP>MPE Translation</TH><TH>ND Rationale</TH> <TH>MPE Rationale</TH></TR> <TR><TD>1</TD><TD>2;3</TD><TD NOWRAP> ` ( ps -> ch ) ` </TD> <TD NOWRAP> ` ( ph -> ( ps -> ch ) ) ` </TD> <TD>Given</TD> <TD>$e; ~ adantr to move it into the ND hypothesis</TD></TR> <TR><TD>2</TD><TD>5;6</TD><TD NOWRAP> ` ( ch -> th ) ` </TD> <TD NOWRAP> ` ( ph -> ( ch -> th ) ) ` </TD> <TD>Given</TD> <TD>$e; ~ adantr to move it into the ND hypothesis</TD></TR> <TR><TD>3</TD><TD>1</TD><TD> ...| ` ps ` </TD> <TD> ` ( ( ph /\\ ps ) -> ps ) ` </TD> <TD>ND hypothesis assumption</TD> <TD> ~ simpr , to access the new assumption </TD></TR> <TR><TD>4</TD><TD>4</TD><TD> ... ` ch ` </TD> <TD> ` ( ( ph /\\ ps ) -> ch ) ` </TD> <TD> ` -> `E 1,3</TD> <TD> ~ mpd , the MPE equivalent of ` -> `E, 1.3. ~ adantr was used to transform its dependency (we could also use ~ imp to get this directly from 1) </TD></TR> <TR><TD>5</TD><TD>7</TD><TD> ... 
` th ` </TD> <TD> ` ( ( ph /\\ ps ) -> th ) ` </TD> <TD> ` -> `E 2,4</TD> <TD> ~ mpd , the MPE equivalent of ` -> `E, 4,6. ~ adantr was used to transform its dependency</TD></TR> <TR><TD>6</TD><TD>8</TD><TD> ... ` ( ch /\\ th ) ` </TD> <TD> ` ( ( ph /\\ ps ) -> ( ch /\\ th ) ) ` </TD> <TD> ` /\\ `I 4,5</TD> <TD> ~ jca , the MPE equivalent of ` /\\ `I, 4,7</TD></TR> <TR><TD>7</TD><TD>9</TD><TD NOWRAP> ` ( ps -> ( ch /\\ th ) ) ` </TD> <TD NOWRAP> ` ( ph -> ( ps -> ( ch /\\ th ) ) ) ` </TD> <TD> ` -> `I 3,6</TD> <TD> ~ ex , the MPE equivalent of ` -> `I, 8</TD></TR> </TABLE> </HTML> The original used Latin letters for predicates; we have replaced them with Greek letters to follow Metamath naming conventions and so that it is easier to follow the Metamath translation. The Metamath line-for-line translation of this natural deduction approach precedes every line with an antecedent including ` ph ` and uses the Metamath equivalents of the natural deduction rules. (Proof modification is discouraged.) (New usage is discouraged.) (Contributed by <NAME>, 9-Feb-2017.) ", "statement" : "ex-natded5.3.1 $e |- ( ph -> ( ps -> ch ) ) $.\nex-natded5.3.2 $e |- ( ph -> ( ch -> th ) ) $.\nex-natded5.3 $p |- ( ph -> ( ps -> ( ch /\\ th ) ) ) $." }
json
You are viewing a single comment's thread from: RE: Betterlife: The Diary Game ( 10 September 2023) A fascinating time spent with a successful Steem Representative (Successful Steemian). @zubaer Brother, you have written a lovely diary game. Memories of reuniting with old friends are always sweet. You had a lot of fun with your friends, and it was a delight to see. Best wishes to you.
english
[
  {
    "Id": "1479953",
    "ThreadId": "656732",
    "Html": "is Office Api supported in mono c# <br />\n",
    "PostedDate": "2016-07-27T08:25:27.573-07:00",
    "UserRole": null,
    "MarkedAsAnswerDate": null
  },
  {
    "Id": "1480158",
    "ThreadId": "656732",
    "Html": "You can compile your solution at development time with NetOffice without worries even with Mono. But at runtime you need an MS-Office version to success.\r<br />\nBUT Mono is a - Linux &lt;3 - framework and MS-Office is not available here. (i'm not sure i understand your question)\r<br />\nIf you want only create and edit office documents - you can do this with some other great projects available here on - Codeplex &lt;3 - and there works fine in Mono.\r<br />\n<br />\n*Sebastian<br />\n",
    "PostedDate": "2016-07-31T12:39:18.743-07:00",
    "UserRole": null,
    "MarkedAsAnswerDate": null
  }
]
json
Dr. Praveen Prajapat is a well-known General Surgeon in Nagaur. With 15 years of experience, he has been associated with various prestigious hospitals and clinics. A full list of Dr. Praveen Prajapat's expertise can be found on the profile page.
english