<filename>src/main/resources/static/mas_json/2015_rtss_8908612558896192307.json {"title": "Distributed Deadline and Renewable Aware Electric Vehicle Demand Response in the Smart Grid.", "fields": ["charging station", "demand response", "demand reduction", "electric vehicle", "market mechanism"], "abstract": "Demand response is an important feature and functionality of the future smart grid. Electric vehicles are recognized as a particularly promising resource for demand response given their high charging demand and flexibility in demand management. Recently, researchers have begun to apply market-based solutions to electric vehicle demand response. A clear vision, however, remains elusive because existing works overlook three key issues. (i) The hierarchy among electric vehicles (EVs), charging stations, and electric power companies (EPCs). Previous works assume direct interaction between EVs and EPCs and are thus confined to single-level market designs. The designed mechanisms are inapplicable here because they ignore the role of charging stations in the hierarchy. (ii) Temporal aspects of charging loads. Solely focusing on economic aspects may achieve significant demand reduction, but electric vehicles would end up with little allocated power because their temporal constraints are overlooked. (iii) Renewable generation co-located with charging stations. Market mechanisms that overlook the uncertainty of renewable generation would cause much inefficiency in both the economic and temporal aspects. To address these issues, we study a new demand response scheme, i.e., hierarchical demand response for electric vehicles via charging stations. We propose that a two-level market is suitable for this hierarchical scheme, and design a distributed market mechanism that is compatible with both the economic and temporal aspects of electric vehicle demand response. The market mechanism has a hierarchical decision-making structure in which the charging station leads the market and electric vehicles follow and respond to its actions. An appealing feature of the mechanism is its provable convergence to a unique equilibrium solution. At the equilibrium, neither the charging station nor the electric vehicles can improve their individual economic and/or temporal performance by changing their own strategies. Furthermore, we present a stochastic-optimization-based algorithm to optimize economic performance for the charging station at the equilibrium, given predictions of the co-located renewable generation. The algorithm has a provable robustness guarantee in terms of the variance of the prediction errors. We finally evaluate the designed mechanism via detailed simulations. The results demonstrate the efficacy of the mechanism and validate the theoretical analysis.", "citation": "Citations (5)", "year": "2015", "departments": ["McGill University", "McGill University"], "conf": "rtss", "authors": ["<NAME>.....http://dblp.org/pers/hd/k/Kong:Fanxin", "<NAME>.....http://dblp.org/pers/hd/l/Liu_0001:Xue"], "pages": 10}
json
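The abstract above describes a leader-follower (two-level) market: the charging station posts a price and EVs respond with demand until the market settles at an equilibrium. As a toy illustration only (the linear demand model, the price-update rule, and every number below are invented for exposition; they are not the paper's mechanism):

```python
# Illustrative two-level market toy: a charging station (leader) adjusts its
# price toward a capacity target while EVs (followers) respond with demand.
# All parameters and update rules here are hypothetical, for exposition only.

def ev_demand(price, willingness):
    """Each EV requests power that decreases linearly in price (floored at 0)."""
    return max(willingness - price, 0.0)

def run_market(willingness_list, capacity, price=1.0, step=0.05, iters=500):
    for _ in range(iters):
        total = sum(ev_demand(price, w) for w in willingness_list)
        # Leader raises the price when demand exceeds capacity, lowers it otherwise.
        price = max(price + step * (total - capacity), 0.0)
    return price, sum(ev_demand(price, w) for w in willingness_list)

price, total = run_market([4.0, 5.0, 6.0], capacity=6.0)
print(round(total, 2))  # prints 6.0: aggregate demand settles at station capacity
```

With these toy numbers the update is a contraction, so the price converges to the point where aggregate EV demand equals the station's capacity, loosely mirroring the equilibrium property the abstract claims.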
/*
 * Copyright [2021] JD.com, Inc.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
#include "atomic.h"

struct AtomicU32 {
private:
    typedef uint32_t V;
    atomic_t count;

public:
    ~AtomicU32(void) { }
    AtomicU32(V v = 0) { set(v); }
    inline V get(void) const { return atomic_read((atomic_t *)&count); }
    inline V set(V v) { atomic_set(&count, v); return v; }
    inline V add(V v) { return atomic_add_return(v, &count); }
    inline V sub(V v) { return atomic_sub_return(v, &count); }
    inline V clear(void) { return atomic_clear(&count); }
    inline V inc(void) { return add(1); }
    inline V dec(void) { return sub(1); }
    inline operator V(void) const { return get(); }
    inline V operator=(V v) { return set(v); }
    inline V operator+=(V v) { return add(v); }
    inline V operator-=(V v) { return sub(v); }
    inline V operator++(void) { return inc(); }
    inline V operator--(void) { return dec(); }
    inline V operator++(int) { return inc() - 1; }
    inline V operator--(int) { return dec() + 1; }
};

struct AtomicS32 {
private:
    typedef int32_t V;
    atomic_t count;

public:
    ~AtomicS32(void) { }
    AtomicS32(V v = 0) { set(v); }
    inline V get(void) const { return atomic_read((atomic_t *)&count); }
    inline V set(V v) { atomic_set(&count, v); return v; }
    inline V add(V v) { return atomic_add_return(v, &count); }
    inline V sub(V v) { return atomic_sub_return(v, &count); }
    inline V clear(void) { return atomic_clear(&count); }
    inline V inc(void) { return add(1); }
    inline V dec(void) { return sub(1); }
    inline operator V(void) const { return get(); }
    inline V operator=(V v) { return set(v); }
    inline V operator+=(V v) { return add(v); }
    inline V operator-=(V v) { return sub(v); }
    inline V operator++(void) { return inc(); }
    inline V operator--(void) { return dec(); }
    inline V operator++(int) { return inc() - 1; }
    inline V operator--(int) { return dec() + 1; }
};

#if HAS_ATOMIC8
struct AtomicS64 {
private:
    typedef int64_t V;
    atomic8_t count;

public:
    ~AtomicS64(void) { }
    AtomicS64(V v = 0) { set(v); }
    inline V get(void) const { return atomic8_read((atomic8_t *)&count); }
    inline V set(V v) { atomic8_set(&count, v); return v; }
    inline V add(V v) { return atomic8_add_return(v, &count); }
    inline V sub(V v) { return atomic8_sub_return(v, &count); }
    inline V clear(void) { return atomic8_clear(&count); }
    inline V inc(void) { return add(1); }
    inline V dec(void) { return sub(1); }
    inline operator V(void) const { return get(); }
    inline V operator=(V v) { return set(v); }
    inline V operator+=(V v) { return add(v); }
    inline V operator-=(V v) { return sub(v); }
    inline V operator++(void) { return inc(); }
    inline V operator--(void) { return dec(); }
    inline V operator++(int) { return inc() - 1; }
    inline V operator--(int) { return dec() + 1; }
};
#endif
cpp
<reponame>JosephHerreraDev/docs.es-es<filename>docs/framework/winforms/controls/get-and-set-the-current-cell-wf-datagridview-control.md
---
title: Get and set the current cell in the DataGridView control
description: Learn how to programmatically detect which cell is currently active by getting and setting the current cell in the Windows Forms DataGridView control.
ms.date: 03/30/2017
dev_langs:
- csharp
- vb
helpviewer_keywords:
- DataGridView control [Windows Forms], getting current cell
- DataGridView control [Windows Forms], setting current cell
- cells [Windows Forms], getting and setting current
ms.assetid: b0e41e57-493a-4bd0-9376-a6f76723540c
---
# <a name="how-to-get-and-set-the-current-cell-in-the-windows-forms-datagridview-control"></a>How to: Get and set the current cell in the Windows Forms DataGridView control

Interaction with the <xref:System.Windows.Forms.DataGridView> control frequently requires that you programmatically detect which cell is currently active. You may also need to change the current cell. You can perform these tasks with the <xref:System.Windows.Forms.DataGridView.CurrentCell%2A> property.

> [!NOTE]
> You cannot set the current cell in a row or column that has its <xref:System.Windows.Forms.DataGridViewBand.Visible%2A> property set to `false`.

Depending on the <xref:System.Windows.Forms.DataGridView> control's selection mode, changing the current cell can change the selection. For more information, see [Selection Modes in the Windows Forms DataGridView Control](selection-modes-in-the-windows-forms-datagridview-control.md).

### <a name="to-get-the-current-cell-programmatically"></a>To get the current cell programmatically

- Use the <xref:System.Windows.Forms.DataGridView> control's <xref:System.Windows.Forms.DataGridView.CurrentCell%2A> property.

  [!code-csharp[System.Windows.Forms.DataGridViewMisc#080](~/samples/snippets/csharp/VS_Snippets_Winforms/System.Windows.Forms.DataGridViewMisc/CS/datagridviewmisc.cs#080)]
  [!code-vb[System.Windows.Forms.DataGridViewMisc#080](~/samples/snippets/visualbasic/VS_Snippets_Winforms/System.Windows.Forms.DataGridViewMisc/VB/datagridviewmisc.vb#080)]

### <a name="to-set-the-current-cell-programmatically"></a>To set the current cell programmatically

- Set the <xref:System.Windows.Forms.DataGridView.CurrentCell%2A> property of the <xref:System.Windows.Forms.DataGridView> control. In the following code example, the current cell is set to row 0, column 1.

  [!code-csharp[System.Windows.Forms.DataGridViewMisc#085](~/samples/snippets/csharp/VS_Snippets_Winforms/System.Windows.Forms.DataGridViewMisc/CS/datagridviewmisc.cs#085)]
  [!code-vb[System.Windows.Forms.DataGridViewMisc#085](~/samples/snippets/visualbasic/VS_Snippets_Winforms/System.Windows.Forms.DataGridViewMisc/VB/datagridviewmisc.vb#085)]

## <a name="compiling-the-code"></a>Compiling the code

This example requires:

- <xref:System.Windows.Forms.Button> controls named `getCurrentCellButton` and `setCurrentCellButton`. In Visual C#, you must attach the <xref:System.Windows.Forms.Control.Click> event of each button to the associated event handler in the example code.
- A <xref:System.Windows.Forms.DataGridView> control named `dataGridView1`.
- References to the <xref:System?displayProperty=nameWithType> and <xref:System.Windows.Forms?displayProperty=nameWithType> assemblies.

## <a name="see-also"></a>See also

- <xref:System.Windows.Forms.DataGridView>
- <xref:System.Windows.Forms.DataGridView.CurrentCell%2A?displayProperty=nameWithType>
- [Basic Column, Row, and Cell Features in the Windows Forms DataGridView Control](basic-column-row-and-cell-features-wf-datagridview-control.md)
- [Selection Modes in the Windows Forms DataGridView Control](selection-modes-in-the-windows-forms-datagridview-control.md)
markdown
United Airlines has formally issued an apology for the violent handling of a passenger who was recently dragged off one of its flights. "I continue to be disturbed by what happened on this flight, and I deeply apologize to the customer forcibly removed and to all the customers aboard," United Airlines CEO Oscar Munoz said in a statement on Tuesday. According to Press TV, he added, "I want you to know that we take full responsibility and we will work to make it right. No one should ever be mistreated this way." The embattled airline has come under harsh criticism since a video released on social media on Sunday showed a man being forcibly taken off a flight at Chicago’s O’Hare airport. The airline said the 69-year-old man, identified as Dr. David Dao, had been asked to give up his seat on an overbooked flight from Chicago to Louisville in the US state of Kentucky, but he refused to cooperate. Passengers confirmed that the flight was overbooked and that the airline wanted the man and three other passengers to disembark so that four of its own employees could board. They said the tussle led to the man’s face being slammed against an armrest, causing serious bleeding. Munoz promised that the company would conduct a "thorough review" of its procedures, including "how we handle oversold situations" and how the airline partners with airport authorities and law enforcement. He also pledged to release the results of the review by the end of this month. According to the Chicago aviation department, one of the law enforcement officers was put on leave after it became clear that he had not followed protocol. The incident caused a furor among Chinese people in particular after news outlets described the mistreated passenger as being of Chinese descent. Many Chinese social media users accused the airline of racism, and others called for a boycott against it.
The Chinese hashtag, which translates to “United forcibly removes passenger from plane,” was the most popular topic on the country’s most popular social network, garnering more than 270 million views and more than 150,000 comments. United Airlines has lost nearly one billion dollars in value since the Sunday incident.
english
{ "response_code": 0, "power": "standby", "sleep": 0, "volume": 77, "mute": false, "max_volume": 161, "input": "tuner", "distribution_enable": true, "sound_program": "5ch_stereo", "direct": false, "enhancer": true, "tone_control": { "mode": "manual", "bass": 0, "treble": 0 }, "link_control": "standard", "link_audio_delay": "lip_sync", "disable_flags": 0 }
json
{ "id": 42240980, "name": "google-play-music-electron", "fullName": "hrysd/google-play-music-electron", "owner": { "login": "hrysd", "id": 1663465, "avatarUrl": "https://avatars0.githubusercontent.com/u/1663465?v=3", "gravatarId": "", "url": "https://api.github.com/users/hrysd", "htmlUrl": "https://github.com/hrysd", "followersUrl": "https://api.github.com/users/hrysd/followers", "subscriptionsUrl": "https://api.github.com/users/hrysd/subscriptions", "organizationsUrl": "https://api.github.com/users/hrysd/orgs", "reposUrl": "https://api.github.com/users/hrysd/repos", "receivedEventsUrl": "https://api.github.com/users/hrysd/received_events", "type": "User" }, "private": false, "htmlUrl": "https://github.com/hrysd/google-play-music-electron", "description": null, "fork": false, "url": "https://api.github.com/repos/hrysd/google-play-music-electron", "forksUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/forks", "teamsUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/teams", "hooksUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/hooks", "eventsUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/events", "tagsUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/tags", "languagesUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/languages", "stargazersUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/stargazers", "contributorsUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/contributors", "subscribersUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/subscribers", "subscriptionUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/subscription", "mergesUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/merges", "downloadsUrl": "https://api.github.com/repos/hrysd/google-play-music-electron/downloads", "deploymentsUrl": 
"https://api.github.com/repos/hrysd/google-play-music-electron/deployments", "createdAt": "2015-09-10T11:39:03.000Z", "updatedAt": "2015-10-13T12:07:46.000Z", "pushedAt": "2015-10-13T12:07:45.000Z", "gitUrl": "git://github.com/hrysd/google-play-music-electron.git", "sshUrl": "git@github.com:hrysd/google-play-music-electron.git", "cloneUrl": "https://github.com/hrysd/google-play-music-electron.git", "svnUrl": "https://github.com/hrysd/google-play-music-electron", "homepage": null, "size": 8052, "stargazersCount": 1, "watchersCount": 1, "language": "JavaScript", "hasIssues": true, "hasDownloads": true, "hasWiki": true, "hasPages": false, "forksCount": 0, "mirrorUrl": null, "openIssuesCount": 0, "openIssues": 0, "watchers": 1, "defaultBranch": "master", "permissions": { "admin": false, "push": false, "pull": true }, "license": null, "networkCount": 0, "subscribersCount": 1, "status": 200, "packageJSON": { "name": "google-play-music-electron", "version": "0.0.1", "description": "", "main": "./src/index.js", "lisence": "MIT", "scripts": { "app": "electron ./src/index.js", "build": "./bin/build" }, "author": "<NAME>", "license": "MIT", "dependencies": { "menubar": "^2.2.1" }, "devDependencies": { "electron-packager": "^5.1.0", "electron-prebuilt": "^0.32.1" } }, "packageStatus": 200, "firstCommit": { "sha": "fdc009ff58044081826ac0853909161dfc0d24e4", "commit": { "author": { "name": "<NAME>", "email": "<EMAIL>", "date": "2015-09-10T06:32:40Z" }, "committer": { "name": "<NAME>", "email": "<EMAIL>", "date": "2015-09-10T06:35:12Z" }, "message": "Initialize commit", "tree": { "sha": "7dbbd7db56702de3cd3d4d333ada30715f6c2a65", "url": "https://api.github.com/repos/hrysd/google-play-music-electron/git/trees/7dbbd7db56702de3cd3d4d333ada30715f6c2a65" }, "url": "https://api.github.com/repos/hrysd/google-play-music-electron/git/commits/fdc009ff58044081826ac0853909161dfc0d24e4", "commentCount": 0 } }, "filename": "hrysd___google-play-music-electron.json", "hasProjects": true, 
"lastFetchedAt": "2017-05-04T20:32:20.862Z", "packageLastFetchedAt": "2017-05-05T15:34:24.177Z" }
json
{"name":"<NAME>","id":"NI-65961-0","address":"Wellweg 43, 31157 Sarstedt","school_type":"Gymnasium - SEK I -","phone":"05066 902280","full_time_school":true,"email":"<EMAIL>","programs":{"programs":[]},"state":"NI","lon":9.861312,"lat":52.24051}
json
Realty firm DLF has decided to exit the life insurance business. It has decided to sell its 74 per cent stake in its joint venture, DLF Pramerica Life Insurance, to Dewan Housing Finance. The sale consideration is put at around Rs. 350 crore. DLF announced its foray into the life insurance business in 2001 through a joint venture with an arm of U.S. insurance giant Prudential Financial. The Indian realty firm held 74 per cent in the joint venture. The balance was held by Prudential International Insurance. The joint venture had reported a combined loss of over Rs. 250 crore during the past two fiscals. “The company, on Thursday, signed definitive agreements to sell its 74 per cent stake in the life insurance joint venture to Dewan Housing Finance and its group entities,” the company said in a statement. Although the company did not disclose the value of the deal, insiders said it could be worth Rs. 350-400 crore. “These agreements are subject to regulatory approvals. The transaction consideration shall be disclosed post receipt of all such approvals,” it said. “This transaction is in line with our ongoing strategy to divest non-core businesses or assets. We have had a very cordial relationship with Prudential, and wish them the best in their new partnership with DHFL,” DLF Group CFO Ashok Tyagi said in the statement. PTI reports: Commenting on the deal, DHFL Chairman and Managing Director Kapil Wadhawan said, “This joint venture will help DHFL extend its philosophy of financial inclusion by broadening the range of products and services available to our customers across India, especially in Tier 2 and 3 cities and towns.” DHFL believes that this joint venture would generate long-term value for its shareholders, he said. “This new partnership also reinforces Prudential Financial Inc.’s commitment to building a strong presence in India, a country whose future growth over the long term will enhance our 138-year history of providing financial security to customers across the world,” said Tim Feige, Senior Vice-President and International Insurance Group Executive at PFI. DLF has been selling its non-core assets and businesses to focus on its core business of real estate. It has already exited the wind energy business and has entered into an agreement to sell its luxury hospitality chain, Aman Resorts. The country’s largest realty firm has raised almost Rs. 10,000 crore in the last three years through divestment of its non-core businesses.
english
Alabama was voted No. 1 in the preseason USA Today coaches' poll released Monday, with Ohio State second and defending national champion Georgia third. The Associated Press preseason Top 25 will be released Aug. 15. The Crimson Tide received 54 first-place votes from a panel of 66 major college football coaches. Alabama is coming off a loss in the College Football Playoff title game to Georgia. The Buckeyes received five first-place votes and the Bulldogs got six. No. 18 Texas also received a first-place vote. Clemson was No. 4 and Notre Dame was No. 5. Michigan, coming off its first CFP appearance, was sixth, followed by Texas A&M, Utah, Oklahoma and Baylor.
english
{ "project": { "url": "https://lgtm.com/projects/g/antonyh/hippo-site-nucleus", "url-identifier": "g/antonyh/hippo-site-nucleus", "id": 1506593520124, "name": "antonyh/hippo-site-nucleus" }, "data": [ [ { "url": "https://lgtm.com/projects/g/antonyh/hippo-site-nucleus/snapshot/f01196d70ec44dcbb9295baad1219b3617a95992/files/ht/pom.xml#L72", "line": 72, "value": "repository", "file": "/ht/pom.xml" }, { "value": "Downloading or uploading artifacts over insecure protocol (eg. http or ftp) to/from repository http://maven.onehippo.com/maven2/" } ] ], "columns": [ "repository", "col1" ] }
json
<reponame>xioren/JohnDoe cities = [ 'Sao Paulo', 'Rio de Janeiro', 'Salvador', 'Fortaleza', 'Be<NAME>e', 'Brasilia', 'Curitiba', 'Manaus', 'Recife', 'Belem', 'Porto Alegre', 'Goiania', 'Guarulhos', 'Campinas', 'Nova Iguacu', 'Maceio', '<NAME>', '<NAME>', 'Natal', 'Teresina', '<NAME>', 'Campo Grande', 'Jaboatao', 'Osasco', '<NAME>', '<NAME>', 'Jaboatao dos Guararapes', 'Contagem', 'Ribeirao Preto', '<NAME> dos Campos', 'Uberlandia', 'Sorocaba', 'Cuiaba', 'Aparecida de Goiania', 'Aracaju', 'Feira de Santana', 'Londrina', 'Juiz de Fora', 'Belford Roxo', 'Joinville', 'Niteroi', '<NAME>', 'Ananindeua', 'Florianopolis', 'Santos', 'Ribeirao das Neves', 'Vila Velha', 'Serra', 'Diadema', '<NAME> Goytacazes', 'Maua', 'Betim', 'C<NAME> Sul', 'Sao Jose do Rio Preto', 'Olinda', 'Carapicuiba', 'Campina Grande', 'Piracicaba', 'Macapa', 'Itaquaquecetuba', 'Bauru', 'Montes Claros', 'Canoas', 'Mogi das Cruzes', 'Sao Vicente', 'Jundiai', 'Pelotas', 'Anapolis', 'Vitoria', 'Maringa', 'Guaruja', 'Port<NAME>', 'Franca', 'Blumenau', 'Foz do Iguacu', '<NAME>', 'Paulista', 'Limeira', 'Viamao', 'Suzano', 'Caucaia', 'Petropolis', 'Uberaba', 'Rio Branco', 'Cascavel', 'Novo Hamburgo', 'Vitoria da Conquista', 'Barueri', 'Taubate', 'Governador Valadares', 'Praia Grande', 'Varzea Grande', 'Volta Redonda', 'Santa Maria', 'Santa Luzia', 'Gravatai', 'Caruaru', 'Boa Vista', 'Ipatinga', 'Sumare', 'Juazeiro do Norte', 'Embu', 'Imperatriz', 'Colombo', 'Taboao da Serra', 'Jacarei', 'Marilia', 'Presidente Prudente', '<NAME>', 'Itabuna', '<NAME>', 'Hortolandia', 'Mossoro', 'Itapevi', 'Sete Lagoas', 'Sao Jose', 'Palmas', 'Americana', 'Petrolina', 'Divinopolis', 'Maracanau', 'Planaltina', 'Santarem', 'Camacari', "Santa Barbara d'Oeste", 'R<NAME>', 'Cachoeiro de Itapemirim', 'Itaborai', '<NAME>', 'Indaiatuba', 'Passo Fundo', 'Cotia', '<NAME>', 'Aracatuba', 'Araraquara', 'Ferraz de Vasconcelos', 'Arapiraca', 'Lages', 'Barra Mansa', 'Nossa Senhora do Socorro', 'Dourados', 'Criciuma', 'Chapeco', 'Barreiras', 
'Sobral', 'Itajai', 'Ilheus', 'Angra dos Reis', 'Nova Friburgo', 'Rondonopolis', 'Itapecerica da Serra', 'Guarapuava', 'Parnamirim', 'Caxias', 'Nilopolis', 'Pocos de Caldas', 'Maraba', 'Luziania', 'Cabo', 'Macae', 'Ibirite', 'Lauro de Freitas', 'Paranagua', 'Parnaiba', 'Itu', 'Castanhal', 'Sao Caetano do Sul', 'Queimados', 'Pindamonhangaba', 'Sapucaia', 'Jaragua do Sul', '<NAME>', 'Jequie', 'Itapetininga', '<NAME>as', '<NAME>', 'Timon', '<NAME>', 'Teresopolis', 'Uruguaiana', 'Porto Seguro', 'Alagoinhas', 'Palhoca', 'Barbacena', 'Cachoeirinha', 'Santa Rita', 'Toledo', 'Jau', 'Cubatao', 'Pinhais', '<NAME>', 'Varginha', 'Sinop', '<NAME>', 'Eunapolis', 'Botucatu', 'Jandira', '<NAME>', '<NAME>', 'Resende', 'Araucaria', 'Atibaia', '<NAME>', 'Garanhuns', 'Araruama', 'Catanduva', '<NAME> Rocha', '<NAME>', 'Ji Parana', 'Araras', 'Poa', 'Vitoria de Santo Antao', 'Umuarama', 'Apucarana', 'Santa Cruz do Sul', 'Guaratingueta', 'Linhares', 'Araguaina', 'Esmeraldas', 'Birigui', 'Assis', 'Barretos', 'Colatina', '<NAME>', 'Guaiba', 'Guarapari', '<NAME>', 'Itaguai', '<NAME> Ostras', 'Itabira', 'Votorantim', 'Sertaozinho', 'Santana de Parnaiba', 'Bage', 'Passos', 'Salto', 'Uba', 'Ourinhos', 'Trindade', 'Arapongas', 'Araguari', 'Corumba', 'Erechim', 'Japeri', 'Vespasiano', 'Campo Largo', 'Tatui', 'Patos', 'Timoteo', 'Muriae', 'Cambe', 'Bayeux', 'B<NAME>calves', 'Caraguatatuba', 'Itanhaem', 'Santana do Livramento', '<NAME>', 'Planaltina', 'Crato', 'Valinhos', '<NAME>', 'Nova Lima', 'Brusque', 'Barra do Pirai', 'Alegrete', 'Caieiras', 'Barra do Corda', 'Igarassu', '<NAME>', 'Ituiutaba', 'Esteio', 'Sarandi', 'Itaperuna', 'Santana', '<NAME>', 'Codo', 'Araxa', 'Abreu e Lima', 'Itajuba', 'Lavras', 'Avare', 'Formosa', 'Leme', 'Cruzeiro do Sul', 'Itumbiara', 'Marica', 'Ubatuba', 'Tres Lagoas', '<NAME>', '<NAME>', 'Abaetetuba', 'Sao Bento do Sul', 'Itauna', 'Sao Mateus', 'Jatai', 'Sao Joao da Boa Vista', 'Lorena', 'Santa Cruz do Capibaribe', 'Sao Sebastiao', 'Tucurui', 'Em<NAME>', 'Sapiranga', 
'Para de Minas', 'Campo Mourao', 'Cachoeira do Sul', 'Santo Antonio de Jesus', 'Paranavai', 'Jo<NAME>', 'Matao', 'Bacabal', 'Cacapava', 'Aruja', 'Cruzeiro', 'Patrocinio', 'Tres Rios', 'Bebedouro', 'Sao Cristovao', 'Alfenas', 'Ijui', 'Altamira', 'Paracatu', 'Carpina', 'Iguatu', 'Votuporanga', 'Paragominas', 'Lins', 'Jaboticabal', 'Vicosa', 'Sao Sebastiao do Paraiso', 'Balsas', 'Itatiba', 'Santa Ines', 'Tubarao', 'Pato Branco', 'Paulinia', 'Lajeado', 'Cruz Alta', 'Aquiraz', 'Itacoatiara', 'Gurupi', 'Itaituba', 'Santo Angelo', 'Parintins', 'Curvelo', 'Itabaiana', 'Cacador', 'Ouro Preto', 'Caldas Novas', 'Irece', 'Catalao', 'Tres Coracoes', 'Rio Largo', 'Vilhena', 'Valenca', 'Peruibe', 'Itapeva', 'Cataguases', 'Saquarema', 'Tupa', 'Fernandopolis', 'Senador Canedo', 'Itapira', 'Gravata', 'Valenca', 'Pirassununga', 'Unai', 'Caratinga', 'Itapetinga', 'Mococa', '<NAME>', 'Carazinho', 'Santa Rosa', '<NAME>', 'Guanambi', 'Aracruz', 'Ariquemes', 'Farroupilha', '<NAME>', 'Picos', '<NAME>', 'Arcoverde', 'Braganca', 'Vacaria', 'Cajamar', 'Janauba', 'Vinhedo', 'Formiga', 'Cianorte', 'Nova Vicosa', 'Itapipoca', '<NAME>', 'Estancia', 'Cacoal', '<NAME>', 'Concordia', 'Pacatuba', 'Viana', '<NAME>', 'Caico', 'Seropedica' ]
python
Panchumarti Anuradha, who was the youngest Mayor in India, was asked just two questions when she was interviewed by Telugu Desam Party chief and Chief Minister N. Chandrababu Naidu way back in 2000. Ms. Anuradha spoke about her selection as Vijayawada Mayor at a book release function recently. “I applied for the post and was tense about the interview, which was part of the selection process with Mr. Naidu. I bought all the books about Vijayawada and urban issues and went through them, expecting questions about civic administration and the history of the city,” she said. “The interview was set at 2 a.m. in the city. Mr. Naidu asked only two questions, about C, C++ and Oracle. I was asked to leave after I gave the answers. At 4 a.m. I got a call from his men that I had been selected as Mayor. I was 26 then.” Ms. Anuradha holds a Ph.D. in Political Communications. Andhra Pradesh Planning Commission Vice-Chairman C. Kutumbha Rao was at his eloquent best as the chief guest of a seminar organised at the Cultural Centre for Vijayawada and Amaravati (CCVA). He dwelt on various topics, including the contentious Special Category Status. One narration that went down well with the audience concerned the pressure parents put on their children over academic preferences; he shared his personal experience of an accidental foray into MPC (Maths, Physics, Chemistry) during his college days. “I wanted to study economics and I opted for MEC (Maths, Economics, Commerce). But my father forced me to opt for MPC. Bowing to his pressure, I joined MPC and later did my mechanical engineering. But destiny brought me back to economics as Vice-Chairman of the Planning Commission. The truth is my engineering background is helpful in my present job, as it allows me to think analytically.” Guntur MP Jayadev Galla, who is soft-spoken, was forced to make caustic remarks against the BJP-led NDA government in Parliament for not implementing the A.P. Reorganisation Act (APRA) in letter and spirit.
He was flooded with praise for rising to the occasion in the august House. He was given a hero’s welcome back home after the first leg of the Budget session was adjourned till March 5. The TDP leaders could not prevent their enthusiastic party workers from felicitating them in appreciation of their belligerent posture against the Central government. This, he insisted, was only the beginning of the battle. “We are geared up to corner the Centre in the coming days,” he said, suggesting that the TDP was in no mood to be submissive. Voice matters: Politicians leave no stone unturned to hog the limelight. Imagine the plight of a politician in an auditorium without proper lighting where he was to make a crucial announcement. Recently, the Bharatiya Janata Party (BJP) Krishna district unit organised a meeting of party workers at IV Palace. The beam lights were apparently dysfunctional and the front part of the stage was poorly illuminated, prompting the audience to turn restless.
english
LIBRARY libOSmesa
VERSION LIBRARY_VERSION
EXPORTS
 OSMesaCreateContext
 OSMesaDestroyContext
 OSMesaGetColorBuffer
 OSMesaGetCurrentContext
 OSMesaGetDepthBuffer
 OSMesaGetIntegerv
 OSMesaMakeCurrent
 OSMesaPixelStore
#ifndef __UNIXOS2__
 _glapi_Context
 _glapi_noop_enable_warnings
 _glapi_add_entrypoint
 _glapi_get_dispatch_table_size
 _glapi_set_dispatch
 _glapi_check_multithread
 _glapi_set_context
 glTexCoordPointer
 glColorPointer
 glNormalPointer
 glVertexPointer
 glDrawElements
#else
 OSMesaCreateContextExt
#endif /* __UNIXOS2__ */
/* $XFree86: xc/lib/GL/mesa/drivers/osmesa/OSMesa-def.cpp,v 1.1 2004/04/14 11:18:28 alanh Exp $ */
cpp
<filename>interface.py
import platform
import subprocess

import serial

from mupq import mupq


class M4Settings(mupq.PlatformSettings):
    #: Specify folders to include
    scheme_folders = [  # mupq.PlatformSettings.scheme_folders + [
        ('pqm4', 'crypto_kem', ''),
        ('pqm4', 'crypto_sign', ''),
        ('mupq', 'mupq/crypto_kem', ''),
        ('mupq', 'mupq/crypto_sign', ''),
        ('pqclean', 'mupq/pqclean/crypto_kem', "PQCLEAN"),
        ('pqclean', 'mupq/pqclean/crypto_sign', "PQCLEAN"),
    ]

    #: List of dicts, in each dict specify (Scheme class) attributes of the
    #: scheme with values, if all attributes match the scheme is skipped.
    skip_list = (
        {'scheme': 'bikel3', 'implementation': 'ref'},
        {'scheme': 'bikel3', 'implementation': 'm4f'},
        {'scheme': 'dilithium5', 'implementation': 'clean'},
        {'scheme': 'falcon-1024-tree', 'implementation': 'opt-leaktime'},
        {'scheme': 'falcon-1024-tree', 'implementation': 'opt-ct'},
        {'scheme': 'frodokem640aes', 'implementation': 'clean'},
        {'scheme': 'frodokem640aes', 'implementation': 'opt'},
        {'scheme': 'frodokem976aes', 'implementation': 'clean'},
        {'scheme': 'frodokem976aes', 'implementation': 'opt'},
        {'scheme': 'frodokem1344aes', 'implementation': 'clean'},
        {'scheme': 'frodokem1344aes', 'implementation': 'opt'},
        {'scheme': 'frodokem640shake', 'implementation': 'clean'},
        {'scheme': 'frodokem976shake', 'implementation': 'clean'},
        {'scheme': 'frodokem976shake', 'implementation': 'opt'},
        {'scheme': 'frodokem1344shake', 'implementation': 'clean'},
        {'scheme': 'frodokem1344shake', 'implementation': 'opt'},
        {'scheme': 'rainbowI-classic', 'implementation': 'clean'},
        {'scheme': 'rainbowI-circumzenithal', 'implementation': 'clean'},
        {'scheme': 'rainbowI-compressed', 'implementation': 'clean'},
        {'scheme': 'rainbowIII-classic', 'implementation': 'clean'},
        {'scheme': 'rainbowIII-circumzenithal', 'implementation': 'clean'},
        {'scheme': 'rainbowIII-compressed', 'implementation': 'clean'},
        {'scheme': 'rainbowV-classic', 'implementation': 'clean'},
        {'scheme': 'rainbowV-circumzenithal', 'implementation': 'clean'},
        {'scheme': 'rainbowV-compressed', 'implementation': 'clean'},
        {'scheme': 'mceliece348864', 'implementation': 'clean'},
        {'scheme': 'mceliece348864f', 'implementation': 'clean'},
        {'scheme': 'mceliece460896', 'implementation': 'clean'},
        {'scheme': 'mceliece460896f', 'implementation': 'clean'},
        {'scheme': 'mceliece6688128', 'implementation': 'clean'},
        {'scheme': 'mceliece6688128f', 'implementation': 'clean'},
        {'scheme': 'mceliece6960119', 'implementation': 'clean'},
        {'scheme': 'mceliece6960119f', 'implementation': 'clean'},
        {'scheme': 'mceliece8192128', 'implementation': 'clean'},
        {'scheme': 'mceliece8192128f', 'implementation': 'clean'},
        {'scheme': 'mceliece348864', 'implementation': 'vec'},
        {'scheme': 'mceliece348864f', 'implementation': 'vec'},
        {'scheme': 'mceliece460896', 'implementation': 'vec'},
        {'scheme': 'mceliece460896f', 'implementation': 'vec'},
        {'scheme': 'mceliece6688128', 'implementation': 'vec'},
        {'scheme': 'mceliece6688128f', 'implementation': 'vec'},
        {'scheme': 'mceliece6960119', 'implementation': 'vec'},
        {'scheme': 'mceliece6960119f', 'implementation': 'vec'},
        {'scheme': 'mceliece8192128', 'implementation': 'vec'},
        {'scheme': 'mceliece8192128f', 'implementation': 'vec'},
        {'scheme': 'hqc-rmrs-192', 'implementation': 'clean'},
        {'scheme': 'hqc-rmrs-256', 'implementation': 'clean'},
    )


class M4(mupq.Platform):
    def __enter__(self):
        if platform.system() == "Darwin":
            device = "/dev/tty.usbserial-0001"
        else:
            device = "/dev/ttyUSB0"
        self._dev = serial.Serial(device, 115200, timeout=10)
        return super().__enter__()

    def __exit__(self, *args, **kwargs):
        self._dev.close()
        return super().__exit__(*args, **kwargs)

    def device(self):
        return self._dev

    def flash(self, binary_path):
        super().flash(binary_path)
        subprocess.check_call(
            ["st-flash", "--reset", "write", binary_path, "0x8000000"],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
python
Leyte is an island in the Visayas group of islands in the Philippines. It is the eighth-largest island in the Philippines by land area. The island was known to 16th-century Spanish explorers as Tandaya. Its population grew rapidly after 1900, especially in the Leyte and Ormoc valleys. In World War II, U.S. forces landed on Leyte, and, after the Battle of Leyte Gulf, the Japanese were expelled. As the availability of land has been depleted, Leyte has provided countless migrants to Mindanao. Most inhabitants are farmers. Fishing is a supplementary activity. Rice and corn are the main food crops; cash crops include coconuts, abaca, tobacco, bananas, and sugarcane. There are some manganese deposits, and sandstone and limestone are quarried in the northwest. Politically, the island is divided into two provinces: Leyte and Southern Leyte. Territorially, Southern Leyte includes the island of Panaon to its south. To the north of Leyte is the island province of Biliran, a former sub-province of Leyte.
english
/*---------------------------------------------------------------------------------------------
 * Copyright (c) Microsoft Corporation. All rights reserved.
 * Licensed under the Source EULA. See License.txt in the project root for license information.
 *--------------------------------------------------------------------------------------------*/
// Do not edit this file. It is machine generated.
{
	"refAriaLabel": "{0}, git",
	"checkoutBranch": "Branche sur {0}",
	"checkoutRemoteBranch": "Branche distante à {0}",
	"checkoutTag": "Étiquette à {0}",
	"alreadyCheckedOut": "La branche {0} est déjà la branche active",
	"branchAriaLabel": "{0}, branche git",
	"createBranch": "Créer une branche {0}",
	"noBranches": "Aucune autre branche",
	"notValidBranchName": "Fournissez un nom de branche valide"
}
json
1-2 Naiye, Tiafarasi, nanu sina ewado ea. Yesu Kerisoma buna ueta faiyawere yabai utebi. Moana uwarama Danu buna ueta erebitate Danu sina yaba webitate owawa odebe yabai matebita. 3 Eno utebitaba nanu imuetana eno, Yesunu buna ueta nesia nauegou utebitaraba nesia rorowarau odimate aba siaia umane eno imuteda iba owawa utatane. 4 I buna ueta naui sina ina ba iwata ufae iba ewa owawa siaia utatane. 5 Boderewere Erodi, dana gaemani dera yawoeta amara. Dana Judia orofa yawotebi. Yawotebina i kowaro mui fuyo ode sina weta amara, danu ifu Sakaraiya, iro ibebi. Danu danata demurai uwarana Abaija waita uwara ido. Dawaini danu uwaraini dera dubu suro Godinu gaukara utebita. Sakaraiyanu awetanu ifuna Erisabeti. Erisabetinu waria amarana Eroni. Eroninu uwaraini Godinu fuyo ode sina webita. 6 Awetaini awera Godinu dabaro rorowarau inareda Danu tarawatuini Danu sinaini nauteda imuegou utebisi. Eno utebisiro Godima emanu kobere ueta sineta erebi. 7 Eno ibeda Erisabeti desini kimu uiro ero uite kuita munawa me ibebe awetaini awera buruka usi. 8 I Abaija waita uwaranu gaukara kowa ido fari. I fuyo ode sina weta uwaranu baeta eno, emanu gaukara kowa farinuba emanu ifu ifu feifaro yanatebitate bosioro youtebita. Eno utebitate guriguri utebitate ifu demurai munebita. 9-10 Enoba i kowaro Sakaraiyanu ifu muta. Mutaba Sakaraiyama dera dubu suro fuyo mafie amuite ibina i guruguru ubita uwara nesia wawaro yaubeda guriguri utebita. 11 Sakaraiya ido inaibebe erina Godinu aneru bonana ina gibori kaiyo banibira afuro bodere inaibinono inaibiro eri. 12 Eno ereda neno siosa uteda iya derawere uyari. 13 Iya uyariro aneruma wei, Sakaraiya, iya da uya wei. Godima anu guriguri ueta nausinuba anu aweta Erisabetima mui amara kuita mubiro ana danu ifu Joni maibasu wei. 14 Kuita mubiro neno kobererau uteda yaru ufawaro uwara faiyawere ema enanari yaru ufitaita wei. 15 I amakanu daisibite Godinu dera amara sibiro Godima efisu wei. Idoba anu amarama wainini siosa ogoini da ifisu. 
Danu danua dawa mubiro Godinu Imumu Kotofuma danu neno ubarero itafisu. 16 I amarama Godinu sina bunawere wegou ufite eme faiyawere ma owefiro emanu Dera Godibai anibitaita aneruma eno wei. 17 Iraijanu ueta sinetaini bunaini dana mubite dawa ko anibisu. Anibite emanu damamamutuini koiniboroini ma demurai ufiro ibifitaita. I Godinu sina naue ekoeta uwaranu neno ma owereibiro nono imueta kobererau imuteibitaita. Dawa i uwara eno ue bou ufiro Dera Waria Amara ido fafisu, aneruma eno wei. 18 Weiro Sakaraiyama aneruba wei, nanu awetaini naini buruka uwaraba anene umate iwata umau wei? 19 Weiro aneruma dawaba wei, nanu ifuna Geibero. Na Godibai ibatane wei. Godima na ewa kobere sina aba wemane iba siaia usinuro iba fanea wei. 20 Weite nono wei, nanu sina nautawaba awonaroma i kuita mubisu kowaro wate mu ufate sina wate da webasute me ibifaro Nanu wene ba enanari ufisu wei. 21 Eno utebisina i guruguru ubita uwarama Sakaraiya yowetebe osi utate we imuta, i amarana aneba ubarero yafawere ibinua we imuta. 22 Sakaraiyama dera dubu su ubareroma wawaro itarite emaba sina webie uite wiawa uiba i uwarama we imuta, dawa tubetube erinu aba we imuta. We imutaro Sakaraiyama wate mu uiba emaba sina wiawa uite agema torowa wanawana ui. 23 Eno uite moana kowaro wate wiawa me mu ue ibebe danu gaukara kowa me siniro suro ido ani. 24 Duburo Erisabetima dia munite sakara fai (5) suro feare yaubebi. 25 Dana eno wei, na kuita munawaba nanu bodere ibetana aramawere. Awonana Godima na kobererau erasuba emema na da irufitaita wei. 26 Erisabeti sakara sikisi (6) dia bobo ibina Godima Danu aneru, Geibero siaia uiro aneruma tararite Gareri orofa Nasareta suro ani. 27-28 I suro mui udiri waribo iwata me aruma ibi. Iro ibebiro mui amarama i aruma buritawa me aworau ibi. Iro me ibebiro mui amarama i arumanu ofe buritawa, i arumama mui amara awera mubie me karu inaibi. I dana mubisu amaranu ifuna Jousefa. Dawa Deiwidinu sisia amara. I arumanu ifuna Meri. 
Aneruma i arumabai farite wei, yaubinu, Godima ama kobererau utasuba ana Godinu mune odi aruma. Godima abai ibinu wei. 29 Eno weiba Merima neno siosa derawere uteda we imui, aneba i ario sina nabai eno wasu we imui? 30-31 I aneruma Meriba wei, iya da uya wei. Godima a kobererau erasuba dia mubate amara mubasuna Danu ifu Yesu maibasu wei. 32-33 Dana dera amarawere sibiro uwara nesia Dawana I Dera Godinu Amara webitaita. I dera Godima Danu sisia amara Deiwidinu yawoeta kabesi mafiro i Isaraero uwara yawofiro Danu yawoetana da me sibisu wei. 34 Weiro Merima aneruba wei, na awera meba anene umau wei? 35 Aneruma dawaba wei, Godinu Imumu Kotofuma abai tarafiro i urero Dera Godinu buna derawerema abai mafiro tare a obari ufisuba iba amara mubaro Godinu kakara amara webitaita wei. 36 Weite wei, anu atae Erisabeti buruka awetama bodere dia munite i desini kimu waita awetanu sakara awona sikisi (6) me sininu wei. 37 Ina kimu me, Godima anene webite Danu webisu enanari ufisu wei. 38 Aneruma eno weiba Merima dawaba wei, na Godinu gaukara aruma ewadoni eate anu wenu sina enanari ua wei. Eno weiro aneruma Meri ekodite ani. 39 Merima moana kowaro yaubebe uyarite mui Judia orofa maidani suro mani. 40-43 Merima Sakaraiyanu suro mane farite Erisabeti ario weina Erisabetinu kuita desini ubarero wawana ui. Eno uiro Godinu Imumu Kotofuma Erisabetinu neno ubarero itariba Erisabetima derawere wei, Godima a i aweta nesiabairoma ma kobererau utasute anu mubasu kuita ma kobererau utasu weita wei, nanu Dera Waria Amaranu danuama aneba nabai farinu wei? 44 Anu ario wenuro naunero nanu kuita desiniro yaru utedaba wawana usinu wei. 45 Ana Godinu sina nautasuba Godima a ma kobererau utasu. Dawa aba we bou ui enanari ufisu wei. 46-47 Eno weiro Merima wei, nanu neno ubareroma Godi we ma derawere utatanete Godi nanu Wiroeta Amara weda yaru uteda watane. 48 Godima Danu gaukara aruma erasuba iba eno utatane. Enoba i duburo fafitaita uwara nesia na kobererau webitaita watane. 
49 Nanu buna Godima na ma kobererau uiba Danu ifuna kakarawere. 50-51 Ana anama Dawa imuegou ufitaro awona uwaraini nesia neno arama derawere ufisu watane. Danu age bunawere. I siosa imueta sasae bobo uwara ma beratasuro aika maika anaita. 52 I dera buna yawotaita uwara mu sanasute i dera me uwara ma uyarasu. 53 I osi urasu uwaraba kobere kobere ibaiabai matasuro dia uraita. I ibaiabai faiyawere bobo uwaraba matawa utasute we yowereda aniawe wasu. 54 Danu bodere durua ueta neno arama we bou ueta imutasute ya Isaraero uwara durua utasu. 55 Godima yanu Babamutu, Eibaramuini danu abua demurai uwaraini nesia neno arama ueta we bou uiba enanari utasu watane, Merima i sina eno we ma kobererau utebi. 56 Meri iro sakara rarogonu ibebe ekodite danu suro ani. 57 Erisabetinu kuita muneta kowa fariro dana amara muni. 58 Amara muniro danu abua demurai uwaraini su uwaraini nesia nautate we imuta, Godima Erisabeti neno arama derawere usinuba iba amara muninu, we imutate yaru uta. 59-60 Kowa eita (8) me siniro i Ju uwaranu baeta kuitanu ofe tureta kowa fariba i uwara nesia fatate guruguru uteda weita, danu damamanu ifu, Sakaraiya mamia weitaro Erisabeti nauite wei, me, danu ifuna Joni wei. 61-63 Weiro nautate weita, ema, yanu abua demurai uwarabai i eno ifu meya, kodia weitate danu damamaba age wanawana utaro Sakaraiyama age moko wawawana uiro feifa mataro Sakaraiyama owawa ue odi, danu ifuna Joni, odi. Eno odiro i uwara nesia neno kirifu uta. 64 Sakaraiyama owawa ue odiro danu sina wiawa utebi me siniro sina wate weite Godi we ma kobererau utebi. 65 Eno utebiba i uwara nesia iya uyata. I sina Judia maidani orofa aboro we inarebitaro nautebita. 66 I sina nauta uwara nenoma we imuta, Godinu buna i amakanubai ibinuba daisibite ane ufisu we imuta. 67-68 Godinu Imumu Kotofuma Joninu damama Sakaraiyanu neno ubarero itariba Sakaraiyama sina eno we bou uite wei, ya Isaraero uwaranu Dera Godima ya uwarabai tararasute ya ma wirotasuba Dawa we ma kobererau utaisi. 
69-70 Godima Danu we bou ueta uwaranu bebeturoma bodere wei enanari utasu. Godima mui amara Deiwidinu sisia uwarabairoma ma uyari. I Amarana bunawere ma wiroeta Amara. 71-72 Yaba wasai utaita uwara ufite anama yaba neno ka utaita uwaranu ageroma ya ma darefite odifiro merau anibeisi. Danu bodere wei sina enanari utasu. 73-75 Godima yanu baba Eibaramu imuteda neno arama uteda Danu we bou ui sina enanari utasuba yaba wasai utaita uwaranu ageroma ya ma darefite ya ma kobererauini we ma fira ufisu. Iba iya da uyafete Godinu iboro kobere ueta sineta ue ibeda rorowarau uteibeisi wei. 76 Weite wei, a nanu amararatu, aba yanu Dera Godinu we bou ueta amara webitaita. A ko anibate yanu Dera Waria Amaranu daba ue bou ufasu. 77 Danu sina webaro naufitaro emanu neno neno mubite dubenaro sabisuba Dawa imufitaita. 78-79 Godima ya neno arama derawere utasuba i dera ario ari ureroma tarafite i dumuini uietanu utumuro ibinita uwara ario ufite yanu daba mawetu ufiro merau ibifeisi watane wei. 80 I amakanu ibebe daisinite danu neno Godibai odeda neno kobererau ui. Dawa eme ibawa orofaro ibebe ibebe Isaraero uwarabai we mawetu ufie fari.
english
Phil Mickelson, Bryson DeChambeau and nine other players involved with LIV Golf have filed suit against the PGA Tour, charging antitrust violations and alleging a wide-ranging pattern of coordinated behavior between the Tour and multiple other golf entities. In the lawsuit, first reported by the Wall Street Journal, the players are challenging their suspensions from the PGA Tour, some of which extend into 2024. In addition, three players – Talor Gooch, Hudson Swafford and Matt Jones – are seeking a temporary restraining order to allow them to play in the Tour’s upcoming FedEx Cup playoffs, an event for which they had already qualified. One of the more notable charges in the lawsuit involves Mickelson, who remains the most high-profile player on the LIV tour. The lawsuit indicates that the PGA Tour suspended Mickelson for two months on March 22, 2022, for, among other reasons, “attempting to recruit players to join [LIV Golf].” When Mickelson applied for reinstatement in June, the Tour extended that suspension to March 31, 2023, and later extended it another year, after Mickelson played in the first two LIV events. Mickelson at present cannot apply for reinstatement before March 31, 2024. DeChambeau, the lawsuit contends, has been suspended through March 31, 2023, “for talking to other Tour members about the positive experience he had had with LIV Golf.” Other players, including Abraham Ancer, Gooch, Swafford and Ian Poulter, have been suspended through March 31, 2024. The lawsuit also contends that the Tour engaged in a pattern of coordinated attempts to restrict LIV players from competing in other events, such as the majors. However, multiple players, including past champions Charl Schwartzel, Dustin Johnson and Sergio Garcia, played in the 2022 Masters and then joined LIV. At the 2022 Masters, Augusta National chairman Fred Ridley said that Mickelson had not been disinvited from the tournament. 
Mickelson at the time was on hiatus from golf and also – according to the lawsuit – under a PGA Tour suspension. Augusta National has not yet responded to a request from Yahoo Sports for comment. The lawsuit also charges that the PGA Tour “threatened companies and individuals in the golf and sports production industry that they will be blackballed from working with the Tour if they work with LIV Golf.” Vendors including providers of tents, technology and apparel either exited negotiations or did not enter them because of the fear of losing the PGA Tour’s business, the lawsuit states. The suit further indicates that both NBC and CBS have kept LIV at a distance because of existing relationships with the Tour. LIV Golf has drawn notice because of its vast payouts to all players who compete in its events, as well as its significant upfront signing bonuses to marquee players such as Mickelson, DeChambeau and Johnson. However, the lawsuit contends that the upfront payouts were necessary to convince players to weather the storm of PGA Tour reprisals, and that such a business model is a threat to LIV’s long-term viability. LIV is backed by Saudi Arabia’s sovereign wealth fund, which has already pledged $2 billion to LIV to build out a full league over the next three years. PGA Tour commissioner Jay Monahan’s letter to players did not discuss matters beyond the FedEx Cup playoffs except in general terms, instead focusing on the fate of players who are seeking to enter the playoffs. The FedEx Cup playoffs begin next week in Memphis at the FedEx St. Jude Championship. LIV Golf’s next event is scheduled for Sept. 2-4 at The Oaks in Boston. Contact Jay Busbee at jay.busbee@yahoo.com or on Twitter at @jaybusbee.
english
declare module 'react-materialize'; declare module 'react-router'; declare module 'flux';
typescript
Started back in 2011, the auto shows organized by PakWheels have become a trend in the automotive community nationwide; people eagerly wait for the auto shows to come to their cities. They have become a national phenomenon, where not only auto enthusiasts but also families come to enjoy a day of fun-filled activities. In collaboration with the Pak Army, PakWheels.com organized its 4th massive auto show in Peshawar on 9th December 2018, at Col. Sher Khan Stadium. Last year, in 2017, PakWheels.com hosted its auto show in the city and around 30,000 visitors came to the event and made it a huge success; this year, however, the number of visitors increased considerably, and over 40,000 citizens came from all over KPK to watch the iconic Peshawar Auto Show. The show featured six main categories of vehicles: vintage, exotic, luxury, modified, 4×4 and bikes. This time around, through the auto show, PakWheels extended great support to Shaukat Khanum and Army Public School, Peshawar. The PakWheels team felt honoured to invite the families of the APS victims and paid tribute to the martyrs. PakWheels Auto Shows are the evolved iteration of the PakWheels meets, which members used to arrange on their own throughout Pakistan. The roots lie in the same principle of people interacting with each other because of a common love and admiration for anything related to cars. We are dedicated and passionate about organizing the Auto Show in Peshawar next year as well.
english
/* eslint-disable import/extensions */
/* eslint-disable import/no-unresolved */
import Phaser from 'phaser';
import bootScene from '../modules/Scenes/BootScene';
import preloaderScene from '../modules/Scenes/PreloaderScene';
import menuScene from '../modules/Scenes/MenuScene';
import gameScene from '../modules/Scenes/GameScene';
import gameOverScene from '../modules/Scenes/GameOverScene';
import leaderBoardScene from '../modules/Scenes/LeaderBoardScene';
import optionsScene from '../modules/Scenes/OptionsScene';
import creditsScene from '../modules/Scenes/CreditsScene';

function newGame() {
  const config = {
    type: Phaser.WEBGL,
    parent: 'game-div',
    width: 480,
    height: 640,
    backgroundColor: 'black',
    dom: {
      createContainer: true,
    },
    physics: {
      default: 'arcade',
      arcade: {
        gravity: { x: 0, y: 0 },
      },
    },
    scene: [
      bootScene,
      preloaderScene,
      menuScene,
      gameScene,
      gameOverScene,
      leaderBoardScene,
      optionsScene,
      creditsScene,
    ],
    pixelArt: true,
    roundPixels: true,
  };
  const game = new Phaser.Game(config);
  return game;
}

export default newGame;
javascript
Here's a big piece of news slipping in quietly during the holiday weekend. The Federal Communications Commission has finally issued a ruling over the SoftBank-Sprint-Clearwire deal. The long story short is that the Commission has approved the deal. The full ruling was published on Friday afternoon. To recall, Japanese cellular giant SoftBank bought a 70 percent stake in Sprint for $20.1 billion last October. At the time, it was expected that the deal would close (subject to regulatory approval) by mid-2013. At the same time, Sprint owns just more than half of Clearwire and wants to acquire the rest of the company for $2.97 per share. But there have been a few bumps as this deal works on obtaining federal approval. For one, Clearwire shareholders asked Sprint to bump up the bid back in January. That was after Dish filed a note with the FCC to pause review of the Sprint-Softbank deal, assuming that Sprint would be forced to drop its bid for the rest of Clearwire's shares -- thus allowing Dish to wedge its way in instead. Analysts had previously predicted that the federal agency would issue a ruling as soon as May. However, that was obviously wishful thinking as the last few months have come and gone with nary a peep on the matter. The ruling noted that the original bid has been since modified as recently as early June. SoftBank bumped up the original offer to $16.64 billion with now 78 percent ownership in Sprint. While the direct investment has been lowered to $5 billion, the overall value of the deal is still higher at $21.6 billion. Sprint shareholders have also approved the revised bid. As for the Sprint-Clearwire part of the deal, with Dish Network out of the picture, Clearwire shareholders are scheduled to vote on the final Sprint offer on July 8.
english
Shahjahanpur (Uttar Pradesh), March 19 (IANS) Three persons, including a toddler, were killed when a truck hit the scooter on which they were travelling and dragged the two-wheeler for almost 500 metres. The incident took place near the Katra overbridge in the district. Ramdin, 40, his sister-in-law Surja Devi, 35, and his three-year-old nephew, all residents of Lalpur village, were on their way back home when a truck rammed into their scooter from behind. Senior Superintendent of Police (SSP) S. Anand said, "The scooter got entangled with the truck and was dragged for around 500 metres." After reaching the spot, police personnel sent the three injured victims for treatment to a Bareilly hospital, where doctors declared them brought dead, he said. The bodies have been sent for post-mortem examination, the SSP said. The officials further said that the truck driver fled the spot, leaving behind his vehicle. Further investigation is underway in the matter. Disclaimer: This story has not been edited by the Sakshi Post team and is auto-generated from a syndicated feed.
english
{"published": "2015-09-04T17:17:02Z", "media-type": "News", "title": "Easthampton roads need \nattention right now", "id": "09e30ba8-9515-4d5d-bfff-eebb9c8fd405", "content": "To the editor: \n \nI don\u2019t understand why Easthampton has not repaired the main roads in the city \u2014 Cottage, Union and Main streets. \n \nI have traveled from Easthampton to the Berkshires and Connecticut. Their main roads have been repaired where needed. Even North Adams, Cummington and Goshen have smooth roads. \n \nThere is construction all around Easthampton to improve roads. What\u2019s wrong with this city? \n \nYou want a fun ride? Take a trip up Williston Avenue towards the high school. That road has many, many patches; it\u2019s like riding on a roller coaster. Potholes are showing up again and you can\u2019t blame that on winter. Easthampton has repaired East Street, which was needed. It paved Clark Street. Why? It\u2019s a side street. The conditions of the roads in Easthampton are an embarrassment to the city. The roads have never been this bad. What happened to the money that was allocated for road repair? Is it being used for something else? \n \nAnother winter will be upon us in a couple of months. I say, let\u2019s fix these roads before snow arrives. \n \n<NAME> \n \nEasthampton", "source": "Daily Hampshire Gazette"}
json
/*
 * Copyright 2000-2017 JetBrains s.r.o.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package com.intellij.ui.win;

import com.intellij.openapi.application.ApplicationNamesInfo;
import com.intellij.openapi.application.PathManager;
import com.intellij.util.lang.UrlClassLoader;

import java.lang.ref.WeakReference;
import java.util.concurrent.atomic.AtomicBoolean;

public class RecentTasks {
  private static final AtomicBoolean initialized = new AtomicBoolean(false);
  private static final WeakReference<Thread> openerThread;
  private static final String openerThreadName;

  static {
    Thread thread = Thread.currentThread();
    openerThread = new WeakReference<>(thread);
    openerThreadName = thread.getName();
    UrlClassLoader.loadPlatformLibrary("jumpListBridge");
  }

  private synchronized static void init() {
    if (initialized.compareAndSet(false, true)) {
      initialize(ApplicationNamesInfo.getInstance().getFullProductName() + "." +
                 PathManager.getConfigPath().hashCode());
    }
  }

  /**
   * COM initialization should be invoked once per process.
   * All invocations should be made from the same thread.
   */
  native private static void initialize(String applicationId);

  native private static void addTasksNativeForCategory(@SuppressWarnings("SameParameterValue") String category, Task[] tasks);

  native static String getShortenPath(String paths);

  native private static void clearNative();

  public synchronized static void clear() {
    init();
    checkThread();
    clearNative();
  }

  /**
   * Use #clearNative method instead of passing empty array of tasks.
   */
  public synchronized static void addTasks(final Task[] tasks) {
    if (tasks.length == 0) return;
    init();
    checkThread();
    addTasksNativeForCategory("Recent", tasks);
  }

  private static void checkThread() {
    Thread thread = Thread.currentThread();
    if (!thread.equals(openerThread.get())) {
      throw new RuntimeException("Current thread is '" + thread.getName() +
                                 "'; this class should be used from '" + openerThreadName + "'");
    }
  }
}
java
<filename>core/modules/filters/untagged.js
/*\
title: $:/core/modules/filters/untagged.js
type: application/javascript
module-type: filteroperator

Filter operator returning all the selected tiddlers that are untagged

\*/
(function(){

/*jslint node: true, browser: true */
/*global $tw: false */
"use strict";

/*
Export our filter function
*/
exports.untagged = function(source,operator,options) {
	var results = [];
	// Function to check an individual title
	function checkTiddler(title) {
		var tiddler = options.wiki.getTiddler(title),
			match = tiddler && $tw.utils.isArray(tiddler.fields.tags) && tiddler.fields.tags.length > 0;
		if(operator.prefix !== "!") {
			match = !match;
		}
		if(match) {
			$tw.utils.pushTop(results,title);
		}
	}
	// Iterate through the source tiddlers
	if($tw.utils.isArray(source)) {
		$tw.utils.each(source,function(title) {
			checkTiddler(title);
		});
	} else {
		$tw.utils.each(source,function(element,title) {
			checkTiddler(title);
		});
	}
	return results;
};

})();
javascript
{
  "american": [
    "Boston Red Sox",
    "Detroit Tigers",
    "New York Yankees"
  ],
  "national": [
    "New York Mets",
    "Chicago Cubs",
    "Atlanta Braves"
  ]
}
json
<reponame>SohrabAmin/Free-Room {"id": "PHL440H1S20201", "code": "PHL440H1S", "name": "<NAME>", "description": "Advanced study of topics in bioethics, taught in conjunction with clinical bioethicists associated with the health care organization partners of the University of Toronto Joint Centre for Bioethics.", "division": "Faculty of Arts and Science", "department": "Philosophy", "prerequisites": "PHL281H1, 4.0 credits in philosophy. Limited to students enrolled in the Bioethics Specialist or Bioethics Major programs.", "exclusions": "", "level": 400, "campus": "UTSG", "term": "2020 Winter", "breadths": [], "meeting_sections": [{"code": "L0101", "instructors": ["<NAME>"], "times": [{"day": "THURSDAY", "start": 54000, "end": 64800, "duration": 10800, "location": "SS 1088"}], "size": 30, "enrolment": 30}]}
json
<reponame>fibo/SQL92-JSON<filename>util/isPartialKeyword.js<gh_stars>1-10
var partialKeywords = require('./partialKeywords.json')

function isPartialKeyword (token) {
  var TOKEN = token.toUpperCase()

  return partialKeywords.indexOf(TOKEN) > -1
}

module.exports = isPartialKeyword
javascript
{
  "name": "Longman Dictionary Bubble",
  "short_name": "Lmd bubble",
  "description": "Search a definition from Longman Dictionary of Contemporary English (5th edition)",
  "author": "nator333",
  "version": "1.1.1",
  "icons": {
    "16": "icons/Longman16.png",
    "48": "icons/Longman48.png",
    "128": "icons/Longman128.png"
  },
  "permissions": [
    "activeTab",
    "storage"
  ],
  "browser_action": {
    "default_title": "title",
    "default_popup": "pages/popup.html"
  },
  "options_page": "pages/options.html",
  "content_scripts": [
    {
      "js": [
        "js/content.js"
      ],
      "css": [
        "css/content.css"
      ],
      "run_at": "document_end",
      "matches": [
        "<all_urls>"
      ],
      "all_frames": true
    }
  ],
  "manifest_version": 2,
  "content_security_policy": "script-src 'self'; object-src 'self'"
}
json
Kuldeep Yadav to Pooran, out Lbw!! He was a goner. He knew he was plumb in front. That was a quicker one; it skidded through, did the flipper, and the batter had no answers to it. Windies slip further. India dominating this. Yadav gets one in his first over. Pooran lbw b Kuldeep Yadav 3(6)
english
{ "actors": [], "countries": [], "enb_end_date": "07-Dec-05", "enb_long_title": "Eleventh session of the Conference of the Parties to the Climate Change Convention and first meeting of the Parties to the Kyoto Protocol", "enb_short_title": "UNFCCC COP 11 and COP/MOP 1", "enb_start_date": "07-Dec-05", "enb_url": "http://www.iisd.ca/vol12/enb12288e.html", "id": "enb12288e_20", "section_title": "TECHNOLOGY TRANSFER", "sentences": [ "On development and transfer of technologies, SBSTA adopted conclusions on matters relating to the implementation of the framework (FCCC/SBSTA/2005/L.24) and the Work Programme of the EGTT for 2006 (FCCC/SBSTA/2005/L.23) and a draft COP decision (FCCC/SBSTA/2005/L.24/Add.1)." ], "subtype": "", "topics": [], "type": "" }
json
<gh_stars>1-10
# palladium3d-demo

This is the demo version of our game Palladium. The demo contains all basic game mechanics and a small part of the entire Palladium maze, with one puzzle.

![Palladium maze](https://nlbproject.com/files/pmaze.png)

# License

Anything related to the game code in this demo is MIT-licensed. Feel free to use it in your projects.

Anything inside the assets folder is proprietary. Some files can be freely downloaded elsewhere, for example the fonts or the textures from https://www.textures.com; other files are our own work for the game and can be used only with our permission.

If you have any questions regarding this game, feel free to write me an e-mail: <EMAIL>

# Related issues

Shadows look highly pixelated when using ConeTrace and simply incorrect when using RayTrace: https://github.com/godotengine/godot/issues/30929
markdown
<gh_stars>0
import { assert } from '../test_utils'
import { handle_attrs_cond_overrides, MoreAttrOption_cond_overrides } from '../../steps/attrs_override'

type StepAttrOption_ = StepAttrOptionT<MoreAttrOption_cond_overrides>
export type StepAttrsOption_ = Dictionary<StepAttrOption_>

describe('handle_attrs_cond_overrides', () => {
    // @ts-expect-error
    const test = async ({ conds, with_conds, expected_without_conds, v, expected_overrides }) => {
        const { attrs, attrs_override } = await handle_attrs_cond_overrides(conds)(with_conds)
        assert.deepEqual(attrs, expected_without_conds);
        assert.deepEqual(await attrs_override(undefined, { step: undefined, v, history: [] }), expected_overrides)
    }

    const conds = {
        foo: (v: v) => v.givenName,
    }

    it("should work", async () => {
        const test1 = {
            conds,
            with_conds: {
                sn: {
                    title: "SN",
                    cond_overrides: { foo: (null as any) },
                },
            },
            expected_without_conds: { sn: { title: "SN" } },
            v: { givenName: "Pascal" } as v,
            expected_overrides: { sn: (null as any) },
        }
        await test(test1)
        await test({
            ...test1,
            v: {} as v,
            expected_overrides: {},
        })
    });

    it("should handle if_then", async () => {
        const test1 = {
            conds,
            with_conds: {
                pager: {
                    title: "PAGER",
                    if: { optional: true },
                    then: {
                        merge_patch_parent_properties: {
                            sn: {
                                title: "SN",
                                cond_overrides: { foo: (null as any) },
                            },
                        },
                    },
                },
            },
            expected_without_conds: {
                pager: {
                    title: "PAGER",
                    if: { optional: true },
                    then: { merge_patch_parent_properties: { sn: { title: "SN" } } },
                },
            },
            v: { givenName: "Pascal" } as v,
            expected_overrides: {
                pager: { then: { merge_patch_parent_properties: { sn: (null as any) } } },
            },
        }
        await test(test1)
    });
})
typescript
<reponame>DylanSunLei/H-Piper
# Checkpoint

## Progress review

We finished the literature review for the project in the first two weeks after submitting the proposal. After understanding that the Google Inception Net contains 8 unique layers and that they can be used to construct CNN, VGG, and other nets, we decided to focus on implementing the Google Inception net, which essentially means implementing the 8 unique layers.

We are currently stuck on building caffe2 in the latedays cluster; many dependencies are missing. While waiting for the instructors to help, we decided to move on and build our own libraries or proceed with the Halide implementation first.

### SCHEDULE

| Timeline | Goal to achieve | Progress |
|:----------|:--------------| :-------|
| April 8 | Understand Google Inception Net's layers and how they work | Done |
| April 15 | Sketch the framework and define primitives | Done |
| April 22 | Implement the framework in Halide and test on the CIFAR-10 dataset | Ongoing |
| April 29 | Tune different schedules to improve performance, contact PolyMage for automatic scheduling, and compete with Caffe and MXNet | Not Started |
| May 7 | Wrap up the implementation and compare/analyse the performance in the report | Not Started |

## Updated Goal and Deliverables

We've put lots of hours into building caffe and caffe2 on the latedays cluster. We have decided to narrow our scope to supporting only protobuf and json input; we could expand the supported inputs in the future. For now, the ability to take in protobuf and json is the most important.

Also, we have decided to focus on the implementation of our H-Piper framework instead of tuning schedules. Having this framework ready could help investigate the locality-parallelism tradeoff, and PolyMage could also benefit from our framework by testing its automatically generated schedules on different nets. The performance competition will compare Caffe's performance against the PolyMage schedule's performance. 
## What to Show in Parallelism Competition We want to show the comparison of the performance of our hand-tuned schedule, the performance of PolyMage automatic schedule, and the performance of Caffe2. So it will be a graph. I think it will be nice to have a demo to show how easy it is to configure a net but given that we only have 6 minutes, we might not do this demo. ## Preliminary Results We don't have any results for now. We are still working on the correctness of our framework. ## Issues - We don't know how to write fast halide schedule - Working on installing dependencies to build caffe2 on lateday clusters - Some questions about Layers implementation of Google Inception Net. For example, there are two possible way to implement Inception Layer. We don't know which one to choose. - Our biggest concern is the time. We are working hard now and try to finish our plans in time. ## New Schedule with task | Timeline | People | Task | |:----------|:--------------| :-------| | April 22 | <NAME> | Maxpool Layer, Softmax Layer Implementation | | April 22 | <NAME> | Protobuf Input Parser and Data Layer | | April 29 | <NAME> | Test cases build. Caffe2 Dependencies. | | April 29 | <NAME> | Conv Layer, DepthConcat Layer, Net Creation| | May 7 | <NAME>, <NAME> | Wrap up implementatin and compare/analyse the performance in report |
markdown
<filename>libs/class.RpcClient.js
'use strict';

const
    http = require('http'),
    precon = require('@mintpond/mint-precon');


class RpcClient {

    /**
     * Constructor.
     *
     * @param args
     * @param args.host {string}
     * @param args.port {number}
     * @param [args.user] {string}
     * @param [args.password] {string}
     */
    constructor(args) {
        precon.string(args.host, 'host');
        precon.minMaxInteger(args.port, 1, 65535, 'port');
        precon.opt_string(args.user, 'user');
        precon.opt_string(args.password, 'password');

        const _ = this;
        _._host = args.host;
        _._port = args.port;
        _._user = args.user || '';
        _._password = args.password || '';
        _._msgId = 0;
    }

    /**
     * Send an RPC method to the wallet daemon.
     *
     * @param args
     * @param args.method {string} The RPC method name.
     * @param [args.params] {Array} Array containing method parameter arguments.
     * @param [args.callback] {function(err:*, rpcResult:*)} Function to call back when the RPC response is received.
     */
    cmd(args) {
        precon.string(args.method, 'method');
        precon.opt_array(args.params, 'params');

        const _ = this;
        const method = args.method;
        const params = args.params || [];
        const request = {
            method: method,
            params: params,
            id: _._msgId++
        };

        _._sendRequest(request, args.callback);
    }

    /**
     * Validate a wallet address.
     *
     * @param args
     * @param args.address {string}
     * @param [args.callback] {function(isValid:boolean, rpcResult:*)}
     */
    validateAddress(args) {
        precon.string(args.address, 'address');
        precon.opt_funct(args.callback, 'callback');

        const _ = this;
        const address = args.address;
        const callback = args.callback;

        _.cmd({
            method: 'validateaddress',
            params: [address],
            callback: (err, results) => {
                if (err)
                    console.error(err);
                callback && callback(!err && results.isvalid, results);
            }
        });
    }

    _sendRequest(request, callback) {
        const _ = this;
        const serialized = JSON.stringify(request);
        const options = {
            hostname: _._host,
            port: _._port,
            method: 'POST',
            auth: `${_._user}:${_._password}`,
            headers: {
                // Use the byte length rather than the string length so that
                // multi-byte characters do not truncate the request body.
                'Content-Length': Buffer.byteLength(serialized)
            }
        };

        const req = http.request(options, res => {
            let data = '';
            res.setEncoding('utf8');
            res.on('data', chunk => { data += chunk; });
            res.on('end', () => {
                callback && _._parseResponse({
                    res: res,
                    json: data,
                    callback: callback
                });
                callback = null;
            });
        });

        req.on('error', err => {
            callback && callback(err, null);
            callback = null;
        });

        req.end(serialized);
    }

    _parseResponse(args) {
        const _ = this;
        const res = args.res;
        const json = args.json;
        const callback = args.callback;

        if (res.statusCode === 401) {
            // Surface the auth failure to the caller instead of silently
            // returning, which would leave the callback hanging.
            callback(new Error('Daemon rejected username and/or password.'), null);
            return;
        }

        const parsedJson = _._tryParseJson(json);
        if (parsedJson.error) {
            callback(parsedJson.error, null);
        }
        else {
            callback(parsedJson.parsed.error, parsedJson.parsed.result);
        }
    }

    _tryParseJson(json) {
        const _ = this;
        let result;
        try {
            result = {
                error: null,
                parsed: JSON.parse(json)
            }
        }
        catch (err) {
            // Some daemons emit invalid JSON containing ":-nan"; substitute a
            // numeric zero (keeping the comma so the JSON stays valid) and retry.
            if (json.indexOf(':-nan') !== -1) {
                json = json.replace(/:-nan,/g, ':0,');
                result = _._tryParseJson(json);
            }
            else {
                result = {
                    error: err,
                    parsed: null
                };
            }
        }
        return result;
    }
}

module.exports = RpcClient;
javascript
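The `_tryParseJson` method in the class above quietly repairs a quirk of some wallet daemons, which can emit invalid JSON containing `:-nan` tokens. Here is a minimal, self-contained sketch of that repair idea; the function name `tryParseDaemonJson` and the sample payloads are illustrative, and the regex is slightly generalized from the class's version so that a trailing `:-nan}` is also handled and a non-matching string cannot recurse forever:

```javascript
// Sketch of the ':-nan' repair strategy: replace each invalid ':-nan' token
// with a numeric zero, keep the delimiter that followed it, and retry the
// parse. Returns the same { error, parsed } shape as the class method.
function tryParseDaemonJson(json) {
    try {
        return { error: null, parsed: JSON.parse(json) };
    }
    catch (err) {
        // Capture the delimiter (comma, brace, or bracket) so the repaired
        // string remains syntactically valid JSON.
        const repaired = json.replace(/:-nan([,}\]])/g, ':0$1');
        if (repaired !== json)
            return tryParseDaemonJson(repaired);
        return { error: err, parsed: null };
    }
}
```

Keeping the captured delimiter (`$1`) is what makes the retried string valid; dropping it would fuse adjacent key/value pairs together and the retry would fail again.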
From Abu al-Aswad, who said: I came to Medina and sat with Umar ibn al-Khattab, may Allah be pleased with him. A funeral procession passed by them and its deceased was praised, and Umar said: "It has become due." Then another passed and its deceased was praised, and Umar said: "It has become due." Then a third passed and its deceased was spoken of badly, and Umar said: "It has become due." Abu al-Aswad said: I asked, "What has become due, O Commander of the Faithful?" He said: "I said as the Prophet, peace and blessings be upon him, said: 'Any Muslim for whom four testify to his goodness, Allah will admit him to Paradise.' We said: 'And three?' He said: 'And three.' We said: 'And two?' He said: 'And two.' Then we did not ask him about one." Once, a funeral procession passed by Umar, may Allah be pleased with him, while other people were with him, and they praised the deceased, and Umar, may Allah be pleased with him, said: "It has become due." Then a second funeral passed, and they praised it as they had the first, and Umar said the same. Then a third funeral passed, and they spoke badly of the deceased, and Umar again said: "It has become due." This puzzled Abu al-Aswad, since he did not understand Umar's words, so he asked him to explain. Umar said: "I said what the Prophet, peace and blessings be upon him, said: 'Any Muslim for whom four testify that he was good, Allah will admit him to Paradise.' When the companions heard this from the Messenger of Allah, they asked: 'And one for whom three testify that he was righteous?' 'Likewise, one for whom three testify is assured Paradise,' he answered. Then the companions said: 'And one for whom two testify, will he be among the people of Paradise?' 'One for whom two testify is also assured Paradise,' he answered. And after that we did not ask him about the one for whom a single man testifies that he was righteous, whether he will enter Paradise."
english
let english : Array<string> = [ "the", "name", "of", "very", "for", "before", "and", "just", "form", "in", "much", "is", "great", "it", "think", "you", "say", "that", "help", "he", "low", "was", "line", "for", "before", "on", "turn", "are", "cause", "with", "same", "as", "mean", "differ", "his", "move", "they", "right", "be", "boy", "at", "old", "one", "too", "have", "does", "this", "tell", "from", "sentence", "or", "set", "had", "three", "by", "want", "hot", "air", "but", "well", "some", "also", "what", "play", "there", "small", "we", "end", "can", "put", "out", "home", "other", "read", "were", "hand", "all", "port", "your", "large", "when", "spell", "up", "add", "use", "even", "word", "land", "how", "here", "said", "must", "an", "big", "each", "high", "she", "such", "which", "follow", "do", "act", "their", "why", "time", "ask", "if", "men", "will", "change", "way", "went", "about", "light", "many", "kind", "then", "off", "them", "need", "would", "house", "write", "picture", "like", "try", "so", "us", "these", "again", "her", "animal", "long", "point", "make", "mother", "thing", "world", "see", "near", "him", "build", "two", "self", "has", "earth", "look", "father", "more", "head", "day", "stand", "could", "own", "go", "page", "come", "should", "did", "country", "my", "found", "sound", "answer", "no", "school", "most", "grow", "number", "study", "who", "still", "over", "learn", "know", "plant", "water", "cover", "than", "food", "call", "sun", "first", "four", "people", "thought", "maylet", "down", "keep", "side", "eye", "been", "never", "now", "last", "find", "door", "any", "between", "new", "city", "work", "tree", "part", "cross", "take", "since", "get", "hard", "place", "start", "made", "might", "live", "story", "where", "saw", "after", "far", "back", "sea", "little", "draw", "only", "left", "round", "late", "man", "run", "year", "don't", "came", "while", "show", "press", "every", "close", "good", "night", "me", "real", "give", "life", "our", "few", "under", 
"stop", "open", "ten", "seem", "simple", "together", "several", "next", "vowel", "white", "toward", "children", "war", "begin", "lay", "got", "against", "walk", "pattern", "example", "slow", "ease", "center", "paper", "love", "often", "person", "always", "money", "music", "serve", "those", "appear", "both", "road", "mark", "map", "book", "science", "letter", "rule", "until", "govern", "mile", "pull", "river", "cold", "car", "notice", "feet", "voice", "care", "fall", "second", "power", "group", "town", "carry", "fine", "took", "certain", "rain", "fly", "eat", "unit", "room", "lead", "friend", "cry", "began", "dark", "idea", "machine", "fish", "note", "mountain", "wait", "north", "plan", "once", "figure", "base", "star", "hear", "box", "horse", "noun", "cut", "field", "sure", "rest", "watch", "correct", "colorable", "face", "pound", "wood", "done", "main", "beauty", "enough", "drive", "plain", "stood", "girl", "contain", "usual", "front", "young", "teach", "ready", "week", "above", "final", "ever", "gave", "red", "green", "list", "oh", "though", "quick", "feel", "develop", "talk", "sleep", "bird", "warm", "soon", "free", "body", "minute", "dog", "strong", "family", "special", "direct", "mind", "pose", "behind", "leave", "clear", "song", "tail", "measure", "produce", "state", "fact", "product", "street", "black", "inch", "short", "lot", "numeral", "nothing", "class", "course", "wind", "stay", "question", "wheel", "happen", "full", "complete", "force", "ship", "blue", "area", "object", "half", "decide", "rock", "surface", "order", "deep", "fire", "moon", "south", "island", "problem", "foot", "piece", "yet", "told", "busy", "knew", "test", "pass", "record", "farm", "boat", "top", "common", "whole", "gold", "king", "possible", "size", "plane", "heard", "age", "best", "dry", "hour", "wonder", "better", "laugh", "true", "thousand", "during", "ago", "hundred", "ran", "am", "check", "remember", "game", "step", "shape", "early", "yes", "hold", "hot", "west", "miss", "ground", 
"brought", "interest", "heat", "reach", "snow", "fast", "bed", "five", "bring", "sing", "sit", "listen", "perhaps", "six", "fill", "table", "east", "travel", "weight", "less", "language", "morning", "among", "speed", "typing", "mineral", "seven", "eight", "nine", "everything", "something", "standard", "distant", "paint", ]; export default english;
typescript
Thiruvananthapuram, July 2 Congress leaders in Kerala on Sunday strongly came out against the party's Ernakulam MP, Hibi Eden's private member's bill suggesting Kochi be the state capital instead of Thiruvananthapuram. Leader of the Opposition in the Kerala Assembly, V. D. Satheeshan, said that the Congress has never discussed such a subject. He also said that the MP had moved the bill without consulting the party and added that the party had asked him to withdraw the bill. After Eden moved the private member's bill in parliament, the Union Home Ministry sought the opinion of the Kerala government. However, the state government rejected the demand, with Chief Minister Pinarayi Vijayan noting in the file that the demand was impractical and was hence being rejected. He said that he was also from Kochi but never had the urge to make it the capital city, and added that Thiruvananthapuram was a good capital city for Kerala. Thiruvananthapuram MP and senior Congress leader Dr. Shashi Tharoor also told media persons that there was no need for such a demand. Vadakara MP K. Muraleedharan said that Eden's demand has to be rejected outright, and reiterated that Thiruvananthapuram was a good place to be the capital of the state. He asked whether he could demand Vadakara as the capital of Kerala, adding that these were all immature suggestions. Several other senior state Congress leaders came out strongly against Eden's demand. Meanwhile, Kerala Industries Minister P. Rajeev told media persons that Eden's demand is an indication that he was the candidate for the Ernakulam Lok Sabha seat again.
english
Former FA Cup-winning Arsenal striker Kevin Campbell believes the Gunners need to sell Granit Xhaka this summer. Mikel Arteta's side are reportedly close to finalizing a transfer to bring in Fabio Vieira from FC Porto (as per Sky Sports) to bolster their midfield options. They have also been linked with Leicester City's Youri Tielemans, with The Guardian reporting that the Gunners are working on a deal. The club could look to finalize some outgoings as well to balance their books. Xhaka has been one of the players linked with an exit, with the Mirror reporting that Bundesliga side Bayer Leverkusen are interested in signing him. The 29-year-old still has two years left on his current deal with Arsenal. However, Campbell believes his former side should be looking to offload Xhaka this summer. He told Football Insider: Campbell praised the Swiss international for enjoying a good 2021-22 campaign for the Gunners. However, he believes that the club's "reliance" on Xhaka needs to end, opining: The 52-year-old concluded: Xhaka signed for the north London club back in the summer of 2016 from Borussia Monchengladbach. While he added some steel to their midfield and produced the odd long-range stunner, fans of the club have often been divided in their opinions about him. Xhaka's disciplinary issues on the field have played a major role in shaping supporters' perceptions of him. He has picked up 68 yellow cards and four red cards in 249 matches across competitions for the Premier League giants. The spotlight often reflects more on those numbers than his tallies of 14 goals and 22 assists. His falling out with a section of Gunners fans during a match in 2019, which saw him throw the captain's armband on the turf, didn't help his case either. Should Xhaka leave this summer, he'll end his career at the Emirates with two FA Cups and two FA Community Shields to his name.
english
A welfare scheme is like an infinite game–you can start it but cannot end it, as the objective is not winning but perpetuating. The political fallout of shutting down a scheme is too high. But how long should a government continue to drag a faulty design and keep doling out funds in the name of welfare? The story of the Mahatma Gandhi National Rural Employment Guarantee Scheme (MGNREGS) is one of an infinite game that creates a corrupt system that survives on perpetuating the scheme. The extent of MGNREGS's failure is evident from the pile of incomplete projects. The Economic Survey 2023 reveals the macro dynamics of the efficiency and effectiveness of MGNREGS. The data on incomplete projects shows that the money spent on this scheme, over the last many years, rarely translates into assets on the ground. Is half a road still a road? Will an incomplete water drain system still be useful, and can a half-finished Panchayat Bhawan be used? An incomplete road in a rural area will either wither away with disuse or, better still, will again be approved as a project. The lobby that has been lamenting the lower allocation for MGNREGS in the current budget should be worried about its utilisation. CIPP's experience with the MGNREGS scheme, through its rural lab in the most backward district of the state of Haryana, is that it has extended corruption to the bottom of the pyramid. The objective of MGNREGS was to put guaranteed income in the hands of rural landless workers and create assets like irrigation drains, roads, and Panchayat Bhawans. If the MGNREGS project is not designed to create a physical asset, then two challenges arise. One, if no asset is created on the ground, then there is no way for the government to audit the work or ensure that payments have been made to the beneficiaries for labour and material, as there is nothing to show on the ground.
Second, when there is nothing to show, it means that the system itself has siphoned off both the material and labour costs. This creates a massive amount of corruption at the village level, beginning from the panchayat itself, going upward from the Junior Engineer, ABPO, and sometimes right up to the political or administrative head of the state’s Panchayat department. This way, MGNREGS ensures and creates corruption at the bottom of the governance and political pyramid–the panchayat level. This corruption at the panchayat level is very visible and clearly evident from the amount of money that candidates are willing to spend to win the Sarpanch elections. This amount has ballooned from a few lakhs ten years back to as high as Rs 50 lakh and Rs 1 crore, depending upon a village’s population. The population of a village along with the number of landless farmers and backward castes determines the spending of the government scheme. Almost every government scheme is now reaching down to the panchayat in the hope that the scrutiny and delivery will be better. The junior engineer, block panchayat officer, panchayat secretary, and district panchayat officer have now developed well-oiled machinery to siphon off MGNREGS funds. The Central government is aware that corruption is rising and it is trying to plug the leakage of funds using technology. Hence, the department developed a portal specifically for preventing leakage. The portal demands that everything is done electronically, beginning from generating demand at the village level. The sarpanch or the panchayat secretary is supposed to generate demand depending upon the need. But before this, a very crucial exercise is to get a project approved under MGNREGS. This has now become an exercise in futility and is akin to trying to get a Minister of Roads and Surface Transportation (MoRST) to assign a new national highway. 
The power to sanction a project is based on the total funds allocated to a district by the state government. How these funds are allocated for the district is based on where the chief minister wants to spread the largesse. Anecdotal evidence from states suggests that every CM allots funds based on political requirements instead of the backwardness of the district or its need. This can be easily corrected if the Central government makes a system or formula based on income levels or unemployment in a district. They can also make the data public so that it is known which district is getting how much funding. Even project approval and completion data should be shared at the district level to see the competency of the district in completing projects. Right now, if the fund is allocated to the district and the district CEO of the panchayat department agrees, a project gets approved for a village. This depends on the persistence and political prowess of the sarpanch. The next step is sending the demand to the block panchayat officer. This begins with uploading the bank accounts of all the beneficiaries so that the labour costs can be directly sent into their accounts. The Haryana government also wants the Parivar Pehchan Patra, a new document ID for the family, to be uploaded and verified for each beneficiary. This is a task multiplied by the number of beneficiaries and it has to be done every time, as the system does not save or store the list of beneficiaries. Technically, the staff is supposed to do everything from evaluating the project to estimating the cost and preparing a budget. But, in reality, all this work of the junior engineer has to be done by the panchayat. The system at this level feels that doing its job is somebody else’s responsibility. The junior engineer sometimes does not even have the skills. 
But since these villagers are not really expected to work on the ground, as there is no tangible thing to be created, they are happy to give away a part of the money to the system for being enrolled. Every sarpanch or panchayat creates its own list of beneficiaries that will give the free money back to them. This is the free money system that has been created on the ground over the last six years–anywhere between Rs 3 lakh crore and Rs 4.5 lakh crore. If we see the data on the kind of projects that have been taken up in the last three years of MGNREGS, they are mostly projects on private land. This means that MGNREGS has exhausted, or is no longer taking up, community infrastructure projects. MGNREGS does not acknowledge that taking up any project needs enterprise, agency, and resources. The government does not pay immediately, but the material has to be paid for immediately. Small suppliers in the village will not wait for six months for the bills to be paid. This means that somebody enterprising enough steps in and takes responsibility for the project. The ABPO or CEO cannot handle multiple projects directly. Hence, in every project, an agent or a contractor is involved. MGNREGS does not acknowledge this contractor. The contractor is crucial as he puts the muster for labourers together, collects them on the site, gets the material from the shop to the location, and pays the supplier for the cement or sand or bricks supplied. All these are tasks that require time, effort, and money. The design of MGNREGS assumes that these tasks do not exist or that they will be done by the government official or panchayat. This is unrealistic, as the government official does not have the bandwidth, and even the sarpanch will not give an advance payment for material or make a payment on behalf of the government. The only one who will give an advance for the material and to the labourers to come on-site is the one who believes that he will be able to get the payment from the government.
MGNREGS ignores that there is any private enterprise, or any effort made to earn money, at the bottom of the pyramid, because the system software does not acknowledge the presence of a contractor and believes this is a utopian world where resources will be spent on behalf of the government without expecting any return. The system is broken, and it changes everything about the project, from execution to design and completion. If, as for any other project, the system acknowledged the presence of the contractor and in fact held a tender allocating the project to the lowest bidder, it would have one single person or organisation responsible for the project. If the project was treated as any other project, its cost would be standardised as per PWD standards. Currently, there is no standard: a water drain project can be designed for Rs 4 lakh or Rs 40 lakh. The biggest change that the MGNREGS project can make is to acknowledge private enterprise in the execution of the project. Make a standardised budget for quality and time, and ensure that the contractors are compensated to create it. Judge the project on quality and not on the fact that the labour cost has been transferred. Currently, a project is seen to be complete if the labour cost is sent to the beneficiary account. The material cost is sent much later, and if that is also sent, then the project is closed on the software or the system. If MGNREGS is not overhauled, it will continue to corrupt the system at the village, block, and district levels, whereas it could genuinely create jobs and entrepreneurship at those levels. K Yatish Rajawat is a public policy researcher and works at the Gurgaon-based think and do tank Centre for Innovation in Public Policy (CIPP). Views expressed are personal.
english
Arnold Schwarzenegger is known worldwide as one of the greatest bodybuilders of all time. An actor, film producer, businessman and politician, he has worn several hats in his life, making him one of the biggest personalities in the US. The seven-time Mr Olympia winner is considered a living legend by bodybuilders around the globe. It's interesting to note that the 'Terminator' star carries the nickname 'Austrian Oak'. As the nickname suggests, Arnold Schwarzenegger was born in Austria. However, the actor left his home country at a young age as he began to feel there wasn't enough there for him. In an interview, Schwarzenegger shocked a few viewers as he mentioned why he left his home country. Schwarzenegger shot to fame at a very young age. The bodybuilder moved to the United States to pursue his dream of making it big in the bodybuilding community. Speaking about why he chose the US as his destination, Schwarzenegger said that he fell in love with the country as a kid. Born and raised in Austria, the bodybuilder made enormous sacrifices to move to his dream country. Arnold Schwarzenegger is one of the biggest names in the American entertainment industry. But not many know the struggle behind it. Schwarzenegger was born shortly after World War II and grew up in a small town. During an interview with Graham Bensinger, Schwarzenegger recalled his childhood and said that his town was too small. He said: "In Austria, when you live in this little village, you see everything little. Little bridges, little cars, little four-cylinder cars, one-cylinder motorcycles and mopeds and stuff like that." Furthermore, Schwarzenegger said that he first fell in love with the US after seeing videos of its enormous landmarks. He said that he found US-made vehicles and buildings 'gigantic'. He continued: The former Mr Olympia winner said: The bodybuilder went on to add that he found the US more interesting than Austria.
Schwarzenegger said that he always felt like he 'belonged to the US' and that bodybuilding was his ticket to the country. The actor stated: "For some reason, I felt I was in the wrong place. I always felt out of place in Austria, I felt kind of like—I mean it was a beautiful country and everything—but it was kind of like I felt that it's too little for me, I was much more attracted to America." Watch the full interview here: Arnold Schwarzenegger was crowned Junior Mr Europe in 1965 at the age of 18. He won the senior version of the title the following year. The bodybuilder shot to fame in the seventies as he began to dominate international competitions. He went on to win the Mr Universe title for three years in a row from 1968 to 1970. Having made it big in the US, the seven-time Mr Olympia title winner transitioned to acting. Arnold Schwarzenegger gained popularity in Hollywood, starring in blockbuster hits like 'The Terminator' and 'Twins' in the 80s. He was twice named in the '100 most influential people in the world' by Time magazine (2004 and 2007). Later, he also ventured into politics, becoming the 38th governor of California between 2003 and 2011.
english
A viral video shows a bull standing on the side of a road and then attacking a policeman, as it throws him in the air. It is unusual to encounter stray bulls on the road. Generally, they do not attack, but it is always better to make a distance from them. However, a hair-raising video of a Delhi cop being attacked by a bull and tossed into the air has emerged on social media. This incident took place when the policeman was on duty in Sherpur Chowk in Dayalpur, New Delhi. The whole episode was recorded on the CCTV camera installed in the area. The video starts with the cop crossing a busy road while the bull arrives from another side. After crossing the street, the cop stopped and turned towards the bull's side. As the cop looked at his mobile phone, the bull suddenly turned violent and rushed towards the policeman, attacked the cop, and tossed him into the air. The cop was identified as Gyan Singh, and soon after the incident, he was rushed to the nearest hospital by his associates. According to the available sources, the constable did not have any serious injury and was discharged from the hospital. After being shared online, the video has gone crazy viral on digital media platforms, and several netizens requested that the problem of stray cattle in Delhi be addressed. A user wrote, "punishment needed for the bull. " Another person commented, "Punishment needed for the people who worship cows. . . But leave them on road. " Also Read: Halal row: HD Kumaraswamy dares CM Bommai to warn right wings, says 'act if you are man'
english
use anyhow::Result; use awsbs::Configuration; fn main() -> Result<()> { println!("{:?}", Configuration::auto()?); Ok(()) }
rust
<filename>db/data/repos/vuejs/vue-ssr-docs.json<gh_stars>0 { "_comment": "DO NOT EDIT MANUALLY - See ../../../README.md", "name": "vue-ssr-docs", "full_name": "vuejs/vue-ssr-docs", "owner": "vuejs", "private": false, "html_url": "https://github.com/vuejs/vue-ssr-docs", "description": "Vue.js Server-Side Rendering Guide", "fork": false, "url": "https://api.github.com/repos/vuejs/vue-ssr-docs", "languages_url": "https://api.github.com/repos/vuejs/vue-ssr-docs/languages", "pulls_url": "https://api.github.com/repos/vuejs/vue-ssr-docs/pulls{/number}", "created_at": "2017-04-26T05:01:09Z", "updated_at": "2018-09-12T20:21:03Z", "pushed_at": "2018-08-16T15:39:23Z", "homepage": null, "size": 2006, "stargazers_count": 337, "language": null, "mirror_url": null, "archived": false, "license": null, "default_branch": "master", "organization": { "login": "vuejs", "id": 6128107, "node_id": "MDEyOk9yZ2FuaXphdGlvbjYxMjgxMDc=", "avatar_url": "https://avatars1.githubusercontent.com/u/6128107?v=4", "gravatar_id": "", "url": "https://api.github.com/users/vuejs", "html_url": "https://github.com/vuejs", "followers_url": "https://api.github.com/users/vuejs/followers", "following_url": "https://api.github.com/users/vuejs/following{/other_user}", "gists_url": "https://api.github.com/users/vuejs/gists{/gist_id}", "starred_url": "https://api.github.com/users/vuejs/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vuejs/subscriptions", "organizations_url": "https://api.github.com/users/vuejs/orgs", "repos_url": "https://api.github.com/users/vuejs/repos", "events_url": "https://api.github.com/users/vuejs/events{/privacy}", "received_events_url": "https://api.github.com/users/vuejs/received_events", "type": "Organization", "site_admin": false }, "contributors": { "sodatea": 1, "mitsu-ksgr": 1, "chrisvfritz": 1, "lex111": 1, "nicedchy": 1, "HcySunYang": 1, "rkunev": 1, "eddyerburgh": 1, "reed-jones": 1, "paerallax": 1, "ruanjf": 1, "jiangxiaoxin": 1, "Jerevia": 1, 
"tyrion-yu": 1, "callumacrae": 1, "njleonzhang": 1, "martin-heralecky": 1, "Akryum": 1, "anhulife": 1, "conanskyforce": 1, "yoshidax": 1, "kennethlombardi": 1, "zhuyunhe": 1, "xiaoqiang-zhao": 1, "Austio": 1, "morkro": 1, "danieldiekmeier": 1, "olayinkaos": 1, "jakwuh": 1, "pi0": 1, "Snugug": 1, "obertrand": 1, "lvjinlong": 2, "jbruni": 2, "iFwu": 2, "riophae": 2, "sotayamashita": 4, "chikathreesix": 6, "Alex-Sokolov": 15, "Haeresis": 17, "ChangJoo-Park": 24, "kazupon": 27, "dear-lizhihua": 31, "yyx990803": 41 }, "pulls_authors": [ "ulivz", "mitsu-ksgr", "Alex-Sokolov", "chrisvfritz", "lex111", "nicedchy", "valsaven", "HcySunYang", "guox191", "olsonpm", "dear-lizhihua", "LennonChin", "rkunev", "tonyljl526", "lzxb", "larongbingo", "reed-jones", "paerallax", "ruanjf", "jiangxiaoxin", "Jerevia", "Haeresis", "jungminji", "sotayamashita", "kazupon", "tyrion-yu", "chikathreesix", "callumacrae", "njleonzhang", "jbruni", "martin-heralecky", "Akryum", "iFwu", "anhulife", "conanskyforce", "yoshidax", "kennethlombardi", "eidonjoe", "ChangJoo-Park", "yyx990803", "JiamingLinn", "pascalgermain", "riophae", "Austio", "francoisromain", "Ben52", "danieldiekmeier", "olayinkaos", "morkro", "jakwuh", "pi0", "Snugug", "Kocal", "obertrand" ], "languages": {}, "fetched_at": "2018-09-13T00:23:19.817Z" }
json
Sony Computer Entertainment's President Kaz Hirai says the PlayStation 3 is the top console. The numbers say otherwise. If there's one word that often fits Sony, it's "hubris." Sony Computer Entertainment America President and CEO Kaz Hirai likes to talk big, but I'm a little worried about what world the head honcho is living in. Kaz declared in the February issue of Official Playstation Magazine --in the same article he says the Xbox 360 "lacks longevity"--that he believes Sony is the "official industry leader" in the gaming console market. How does he reach this conclusion? After all, the PS3 is in third place, according to data from research firm The NPD Group. "This is not meant in terms of numbers." Oh, well then. "Or who's got the biggest install base." Um, Kaz? "Or who's selling most in any particular week or month." Aren't those the indicators people look to when considering leadership? "But I'd like to think that we continue official leadership in this industry," Kaz said. And I'd like to think I'm next up on the Amy Adams crush list, but that doesn't necessarily make it so. He's got no kind words for Nintendo's Wii, either, basically dismissing it as a non-competitor. But Kaz, whether or not you like it, you're in a three-horse race. The Wii, Microsoft Xbox 360, and PlayStation 3 are all good consoles that go in very different directions. And we're all for rattling sabers. Both of those are good things for technology and for gamers; not only do they force the console makers to constantly one-up each other, but they give the users far more choice. But that old adage is true: Actions speak louder than words. When you overtake Microsoft in sales, Sony, I'll start listening. In the meantime, feel free to preach to your choir.
english
Popular actress and 'Bigg Boss' star Vanitha Vijayakumar got married to filmmaker Peter Paul on Saturday, June 27, according to Christian customs. Vanitha's two daughters served as the bridesmaids, the groom kissed the bride, and champagne was opened. The couple's kiss has become a topic of controversy, and Peter Paul's wife had also stated in her interviews that it could adversely influence children who watch it on YouTube. Vanitha Vijayakumar's post on the kiss controversy is laden with sarcasm, as she has written "PARENTAL CONTROL ALERT. . . PLEASE DONT LET YOUR CHILDREN WATCH DISNEY CARTOONS OR Read FAIRY TALE BOOKS. . apparently it's not suitable for children. . they show adult rated kissing content. Kids are never supposed to witness or know when a man loves a woman or when they marry they kiss. . . " Peter Paul's first wife Elisabeth Helen has lodged a complaint that the marriage between her husband and Vanitha is not legal, as they have not divorced yet. The actress has responded that she would face the issue legally and has countered that it's a ploy to extort money.
english
Fans of Wendy are not happy after her label SM Entertainment made an announcement regarding her future activities. The K-POP singer is a part of Red Velvet, a girl group under SM Entertainment. They debuted in 2014 with their single titled "Happiness. " Since then, they've released several albums and singles, finding paramount success internationally. Their concept focuses on showing two sides of them through their music releases: the "Red" side, which is a fierce and concentrated side of them, and the "Velvet" side, representing their gentle and feminine side. Fierce is how fans are expressing their feelings after Wendy was confirmed to be a regular cast member for the new season of SNL Korea. SNL Korea is the Korean alternative to the American late-night show "Saturday Night Live. " On the show, regular cast members (along with famous guests) perform comedic skits in front of a live audience. ReVeluvs, or fans of Red Velvet, are extremely disappointed by the decision made to cast Wendy on the show as a regular member due to the past controversies the show has been through. In 2016, the SNL Korea cast uploaded a behind-the-scenes video of themselves along with their guests for that particular episode, K-POP band B1A4. The video was taken down soon after as fans called out the behavior displayed by the female staff of the show, allegedly touching the boy-band members inappropriately. However, this was not the first time. Fans dug up previous uploads by the very own SNL Korea cast, where other artists were supposedly also sexually harassed, including BLOCK B and INFINITE. Along with this, the show has also had a racially sensitive past, with members of the cast performing blackface in one of their skits to try to elicit laughter from their audience. While the show ended in 2018 after the onslaught of the backlash it received, it's been reported that the new season approaching will be a revival with a new cast and fresh concepts. 
As soon as news of Wendy's addition to the main cast of the next season of SNL Korea was revealed, fans stormed Twitter to express their displeasure at the decision. For the show's revival, Coupang Play, a new streaming service, will be hosting the show on its platform. The host from the previous seasons, Shin Dong Yup, will be reprising his role. Both Wendy and SM Entertainment have yet to make a statement on the current issue.
english
---
title: "CreateEncryptionKey Method"
ms.author: solsen
ms.custom: na
ms.date: 02/03/2020
ms.reviewer: na
ms.suite: na
ms.tgt_pltfrm: na
ms.topic: article
ms.service: "dynamics365-business-central"
author: SusanneWindfeldPedersen
---
[//]: # (START>DO_NOT_EDIT)
[//]: # (IMPORTANT:Do not edit any of the content between here and the END>DO_NOT_EDIT.)
[//]: # (Any modifications should be made in the .xml files in the ModernDev repo.)

# CreateEncryptionKey Method
Creates an encryption key for the current tenant.

## Syntax
```
[Ok := ] System.CreateEncryptionKey()
```

> [!NOTE]
> This method can be invoked without specifying the data type name.

## Return Value
*Ok*
&emsp;Type: [Boolean](../boolean/boolean-data-type.md)

**true** if the operation was successful; otherwise **false**. If you omit this optional return value and the operation does not execute successfully, a runtime error will occur.

[//]: # (IMPORTANT: END>DO_NOT_EDIT)

## Remarks
If a key already exists, the following error will be displayed: **Unable to create a new encryption key. An encryption key already exists**.

## Example
This code example creates an encryption key for the current tenant. It uses the [ENCRYPTIONENABLED](../../methods-auto/system/system-encryptionenabled-method.md) method to perform a check.

```
if not ENCRYPTIONENABLED then
  CREATEENCRYPTIONKEY();
```

## See Also
[System Data Type](system-data-type.md)
[Getting Started with AL](../../devenv-get-started.md)
[Developing Extensions](../../devenv-dev-overview.md)
markdown
Postpone MHT CET 2020!! As we are all aware of the COVID-19 situation around the world, the world has come to its knees and we need to rebuild its pride. So to fight against this stubborn virus we are supposed to stay indoors and stay healthy. In such a situation, public gatherings must be strictly prohibited and exams must be postponed. As it stands, MHT CET is going to be conducted on 4th July 2020, which is practically impossible, but the government of Maharashtra is not taking keen interest in postponing it or giving any update about it. Now let's take an example: if a student, after giving the exam, comes home and gets infected by this virus, and his grandparents then show the symptoms of COVID, who is responsible?? There are about 5-6 lakh students appearing for this exam. UPSC has postponed its exams to the months of September and November; similarly, the Company Secretary exams are postponed to August. We think MHT CET must also be postponed to the month of August.
english
<gh_stars>0
// Simple typographic replacements
//
// (c) (C) → ©
// (tm) (TM) → ™
// (r) (R) → ®
// +- → ±
// (p) (P) → §
// ... → … (also ?.... → ?.., !.... → !..)
// ???????? → ???, !!!!! → !!!, `,,` → `,`
// -- → &ndash;, --- → &mdash;
//
'use strict';

import State from '../../types/rules_core/state_core';
import Token = require('../token');

// TODO:
// - fractions 1/2, 1/4, 3/4 -> ½, ¼, ¾
// - multiplication 2 x 4 -> 2 × 4

const RARE_RE = /\+-|\.\.|\?\?\?\?|!!!!|,,|--/;

// Workaround for phantomjs - need regex without /g flag,
// or root check will fail every second time
const SCOPED_ABBR_TEST_RE = /\((c|tm|r|p)\)/i;

const SCOPED_ABBR_RE = /\((c|tm|r|p)\)/ig;

const SCOPED_ABBR = { c: '©', r: '®', p: '§', tm: '™' };

function replaceFn(match:string, name:string):string {
  return SCOPED_ABBR[name.toLowerCase()];
}

function replace_scoped(inlineTokens:Token[]) {
  let token:Token, inside_autolink:number = 0;

  for (let i = inlineTokens.length - 1; i >= 0; i--) {
    token = inlineTokens[i];

    if (token.type === 'text' && !inside_autolink) {
      token.content = token.content.replace(SCOPED_ABBR_RE, replaceFn);
    }

    if (token.type === 'link_open' && token.info === 'auto') {
      inside_autolink--;
    }

    if (token.type === 'link_close' && token.info === 'auto') {
      inside_autolink++;
    }
  }
}

function replace_rare(inlineTokens:Token[]) {
  let token:Token, inside_autolink:number = 0;

  for (let i = inlineTokens.length - 1; i >= 0; i--) {
    token = inlineTokens[i];

    if (token.type === 'text' && !inside_autolink) {
      if (RARE_RE.test(token.content)) {
        token.content = token.content
          .replace(/\+-/g, '±')
          // .., ..., ....... -> …
          // but ?..... & !..... -> ?.. & !..
          .replace(/\.{2,}/g, '…').replace(/([?!])…/g, '$1..')
          .replace(/([?!]){4,}/g, '$1$1$1').replace(/,{2,}/g, ',')
          // em-dash
          .replace(/(^|[^-])---([^-]|$)/mg, '$1\u2014$2')
          // en-dash
          .replace(/(^|\s)--(\s|$)/mg, '$1\u2013$2')
          .replace(/(^|[^-\s])--([^-\s]|$)/mg, '$1\u2013$2');
      }
    }

    if (token.type === 'link_open' && token.info === 'auto') {
      inside_autolink--;
    }

    if (token.type === 'link_close' && token.info === 'auto') {
      inside_autolink++;
    }
  }
}

export = function replace(state:State) {
  if (!state.md.options.typographer) { return; }

  for (let blkIdx = state.tokens.length - 1; blkIdx >= 0; blkIdx--) {
    if (state.tokens[blkIdx].type !== 'inline') { continue; }

    if (SCOPED_ABBR_TEST_RE.test(state.tokens[blkIdx].content)) {
      replace_scoped(state.tokens[blkIdx].children || []);
    }

    if (RARE_RE.test(state.tokens[blkIdx].content)) {
      replace_rare(state.tokens[blkIdx].children || []);
    }
  }
};
typescript
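The two replacement passes above can be exercised on plain strings, without the inline-token plumbing. A minimal sketch in plain JavaScript — `replaceScoped` and `replaceRare` are illustrative stand-in names mirroring the module's `SCOPED_ABBR_RE`/`replaceFn` and `replace_rare` chain, not exports of the module itself:

```javascript
// Stand-alone sketch of the typographer's two passes, applied to strings.
const SCOPED_ABBR = { c: '©', r: '®', p: '§', tm: '™' };

// (c)/(tm)/(r)/(p) → symbol, mirroring SCOPED_ABBR_RE + replaceFn
function replaceScoped(text) {
  return text.replace(/\((c|tm|r|p)\)/ig, (match, name) => SCOPED_ABBR[name.toLowerCase()]);
}

// Rare sequences: ±, ellipses, collapsed ?/!/commas, en/em dashes
function replaceRare(text) {
  return text
    .replace(/\+-/g, '±')
    .replace(/\.{2,}/g, '…').replace(/([?!])…/g, '$1..')
    .replace(/([?!]){4,}/g, '$1$1$1').replace(/,{2,}/g, ',')
    .replace(/(^|[^-])---([^-]|$)/mg, '$1\u2014$2')
    .replace(/(^|\s)--(\s|$)/mg, '$1\u2013$2')
    .replace(/(^|[^-\s])--([^-\s]|$)/mg, '$1\u2013$2');
}

console.log(replaceScoped('(c) 2020 ACME (TM)')); // → '© 2020 ACME ™'
console.log(replaceRare('Wait?.... +- 5 -- 10')); // → 'Wait?.. ± 5 – 10'
```

Note the ordering in `replaceRare`: `?....` first collapses to `?…` and is then rewritten back to `?..`, which is why the ellipsis rule runs before the `([?!])…` fix-up.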
Kerala Boat Tragedy: Broken Rules, Negligence Behind Malappuram Capsize Which Killed At Least 22? A horrible tragedy has struck Kerala's Malappuram. A double-decker boat capsized near a beach, leaving at least 22 confirmed dead, including 7 children. What was the cause of this tragedy? Watch the video to find out more.
english
The updated final NRC, which validates bonafide Indian citizens of Assam, was out on Saturday, with over 19 lakh applicants who failed to make it to the list staring at an uncertain future. - Mamata has termed the Assam NRC a "botched-up process" By India Today Web Desk: West Bengal Chief Minister Mamata Banerjee on Saturday said that the whole National Register of Citizens (NRC) exercise in Assam has exposed those who tried to take political mileage out of it. She further said that the people behind the NRC have a lot to answer to the nation. Taking to Twitter, Mamata Banerjee said, "The NRC fiasco has exposed all those who tried to take political mileage out of it. They have a lot to answer to the nation." She further said that the NRC fiasco happens when an act is "guided by an ulterior motive rather than the good of the society and the larger interest of the nation". Terming the Assam NRC a "botched-up process", Mamata Banerjee said, "My heart goes out to all those, especially the large number of Bengali speaking brothers and sisters, who are made to suffer." Earlier in the day, the Trinamool Congress hit out at the BJP government at the Centre for allegedly trying to drive out Bengalis from Assam in the name of the NRC and said it will have to take responsibility for the 19 lakh applicants who failed to make it to the final list. Mamata Banerjee also expressed concern over 19.07 lakh people being left out of the final NRC list. "People have been rendered homeless in their own country," senior TMC leader Firhad Hakim said. The TMC has been one of the most vociferous critics of the citizens' register and has accused the BJP governments both at the Centre and in Assam of trying to drive out Bengalis from the northeastern state.
"Our party [TMC] supremo [Mamata Banerjee] is very concerned about the future of the 19 lakh people who have been left out of the NRC list. What will happen to them? What is their future? The central government has to take their responsibility," Hakim said. Last year after the draft NRC list was released, Mamata Banerjee had gone all out to oppose it and had even sent a TMC delegation to Assam to talk to the people. "It is a plot to drive out Bengalis from Assam. How can the government be so insensitive that on one fine morning it is declaring citizens, who have been living in Assam for the last several decades, as foreigners," the TMC leader said.
english
<reponame>sbhbenjamin/waveportal-frontend {"ast":null,"code":"\"use strict\";\n\nimport { AbiCoder, checkResultErrors, ConstructorFragment, defaultAbiCoder, ErrorFragment, EventFragment, FormatTypes, Fragment, FunctionFragment, Indexed, Interface, LogDescription, ParamType, TransactionDescription } from \"@ethersproject/abi\";\nimport { getAddress, getCreate2Address, getContractAddress, getIcapAddress, isAddress } from \"@ethersproject/address\";\nimport * as base64 from \"@ethersproject/base64\";\nimport { Base58 as base58 } from \"@ethersproject/basex\";\nimport { arrayify, concat, hexConcat, hexDataSlice, hexDataLength, hexlify, hexStripZeros, hexValue, hexZeroPad, isBytes, isBytesLike, isHexString, joinSignature, zeroPad, splitSignature, stripZeros } from \"@ethersproject/bytes\";\nimport { _TypedDataEncoder, hashMessage, id, isValidName, namehash } from \"@ethersproject/hash\";\nimport { defaultPath, entropyToMnemonic, getAccountPath, HDNode, isValidMnemonic, mnemonicToEntropy, mnemonicToSeed } from \"@ethersproject/hdnode\";\nimport { getJsonWalletAddress } from \"@ethersproject/json-wallets\";\nimport { keccak256 } from \"@ethersproject/keccak256\";\nimport { Logger } from \"@ethersproject/logger\";\nimport { computeHmac, ripemd160, sha256, sha512 } from \"@ethersproject/sha2\";\nimport { keccak256 as solidityKeccak256, pack as solidityPack, sha256 as soliditySha256 } from \"@ethersproject/solidity\";\nimport { randomBytes, shuffled } from \"@ethersproject/random\";\nimport { checkProperties, deepCopy, defineReadOnly, getStatic, resolveProperties, shallowCopy } from \"@ethersproject/properties\";\nimport * as RLP from \"@ethersproject/rlp\";\nimport { computePublicKey, recoverPublicKey, SigningKey } from \"@ethersproject/signing-key\";\nimport { formatBytes32String, nameprep, parseBytes32String, _toEscapedUtf8String, toUtf8Bytes, toUtf8CodePoints, toUtf8String, Utf8ErrorFuncs } from \"@ethersproject/strings\";\nimport { accessListify, computeAddress, parse 
as parseTransaction, recoverAddress, serialize as serializeTransaction, TransactionTypes } from \"@ethersproject/transactions\";\nimport { commify, formatEther, parseEther, formatUnits, parseUnits } from \"@ethersproject/units\";\nimport { verifyMessage, verifyTypedData } from \"@ethersproject/wallet\";\nimport { _fetchData, fetchJson, poll } from \"@ethersproject/web\"; ////////////////////////\n// Enums\n\nimport { SupportedAlgorithm } from \"@ethersproject/sha2\";\nimport { UnicodeNormalizationForm, Utf8ErrorReason } from \"@ethersproject/strings\"; ////////////////////////\n// Exports\n\nexport { AbiCoder, defaultAbiCoder, Fragment, ConstructorFragment, ErrorFragment, EventFragment, FunctionFragment, ParamType, FormatTypes, checkResultErrors, Logger, RLP, _fetchData, fetchJson, poll, checkProperties, deepCopy, defineReadOnly, getStatic, resolveProperties, shallowCopy, arrayify, concat, stripZeros, zeroPad, isBytes, isBytesLike, defaultPath, HDNode, SigningKey, Interface, LogDescription, TransactionDescription, base58, base64, hexlify, isHexString, hexConcat, hexStripZeros, hexValue, hexZeroPad, hexDataLength, hexDataSlice, nameprep, _toEscapedUtf8String, toUtf8Bytes, toUtf8CodePoints, toUtf8String, Utf8ErrorFuncs, formatBytes32String, parseBytes32String, hashMessage, namehash, isValidName, id, _TypedDataEncoder, getAddress, getIcapAddress, getContractAddress, getCreate2Address, isAddress, formatEther, parseEther, formatUnits, parseUnits, commify, computeHmac, keccak256, ripemd160, sha256, sha512, randomBytes, shuffled, solidityPack, solidityKeccak256, soliditySha256, splitSignature, joinSignature, accessListify, parseTransaction, serializeTransaction, TransactionTypes, getJsonWalletAddress, computeAddress, recoverAddress, computePublicKey, recoverPublicKey, verifyMessage, verifyTypedData, getAccountPath, mnemonicToEntropy, entropyToMnemonic, isValidMnemonic, mnemonicToSeed, ////////////////////////\n// Enums\nSupportedAlgorithm, UnicodeNormalizationForm, 
Utf8ErrorReason, Indexed };","map":null,"metadata":{},"sourceType":"module"}
json
# Generated by Django 2.2.3 on 2019-08-03 02:26

from django.db import migrations


class Migration(migrations.Migration):

    dependencies = [
        ('quotes', '0007_auto_20190730_2313'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='quote',
            name='user_favourited',
        ),
        migrations.RemoveField(
            model_name='quote',
            name='user_liked',
        ),
    ]
python
---
title: Matt
short_name: matt
nickname: Matty-O
ragbrai: 2006
image: matt.jpg
active: false
---
markdown
Quite a cheeky debut it was for young filmmaker Adhik Ravichandran, who managed to score brownie points with GV Prakash through ‘Trisha Illana Nayantara’. It was a film that sent GV Prakash into inevitable astonishment when audiences celebrated it with dance and festivity. Undoubtedly, this news should send a heavy rush of excitement into fans' veins, as Adhik Ravichandran and GV Prakash are teaming up yet again for one more film. The film will be produced by Mr. Stephen of Steeve’s Corner, and the official announcements pertaining to the heroine and the title will be made shortly, the producer said. Adhik, the most sought-after director of the youth brigade, has come up with another unusual concept that is going to make the box office ring. GV Prakash, who hit the bull's eye with his recently released ‘Pencil’, has cemented his place as a star in the commercial arena. ‘I am indeed excited to produce this film with a proven combination like this. GV Prakash's growth as a star, apart from his proven status as a music director, is a great gift to me as a producer. We were about to start this film from the first week of May, but due to the prior commitments of the hero GV Prakash we were unable to proceed. Being part of the industry, I fully understand the logistical difficulties of the hero, and we agreed mutually to start this film once he is done with his prior commitments. Now we are all set to start this film from the first week of August,’ declared the producer Stephen of Steeve’s Corner.
english
Bengaluru FC vs Mumbai City FC Highlights: BFC vs MCFC Highlights in ISL 2023. Highlights: Mumbai City FC's streak of victories has been broken by Bengaluru FC, who were switched on right away, with Sunil Chhetri commanding the group with all of his style and charisma. In the opening 45 minutes, there was nothing to differentiate the two teams, but soon after play resumed, Sunil Chhetri's precise header gave BFC the lead. Javier Hernandez's accurate shot gave Bengaluru FC some insurance in the 70th minute, and it ultimately turned out to be the game-winning goal. Mumbai City had already accomplished their season's first goal by winning the ISL Shield. Bengaluru FC's seven-match winning streak has propelled them into the ISL playoff picture. Chhetri made his first start of the new year for Bengaluru FC as Roy Krishna served his suspension. The 38-year-old asked questions of the Islanders' backline. A little over five minutes after attempting to dink one over the keeper, Chhetri tried his luck from the edge of the box after Vignesh D's clearance deflected into his path. Moments later, Chhetri had another crack at goal after getting the better of Ahmed Jahouh. Lachenpa was quick to go down and put his body behind the low shot. But in a first half where neither team looked to be too adventurous, it was Mumbai City FC who were presented with the best opportunity to take the lead. Bengaluru FC's defenders were caught in a tangle in the middle of the pitch when a long ball set Bipin Singh on a free run at goal down the left side. However, Gurpreet Singh Sandhu made himself big at the near post and blocked the winger's effort, which was straight at him. Chhetri's persistence paid off minutes before the hour mark as the League Shield winners looked a shadow of themselves in the second half.
Hernandez whipped a corner towards the far post, where the Islanders had left Chhetri unmarked – the veteran charged in to head his team into the lead. A glimpse of just how much the Blues had rattled the Islanders was visible when centre-back Aleksandar Jovanovic, in to replace the suspended Parag Shrivas, burst past a flat-footed defence into the box to create the second goal. He squared the ball into the path of Hernandez, who drove it in first time to put his team 2-0 up in the 70th minute. This did seem to shake the league leaders up, as they clawed one back in the 77th minute when a Rowllin Borges corner was headed back across goal for Mourtada Fall to tap in. But this fightback came too little, too late for the Islanders, whose domineering run came to a halt at the hands of the league's most in-form team. Bengaluru FC are now sitting in fourth, four points clear of sixth place. Their final league game of the season will be against FC Goa on February 23. Mumbai City FC will conclude their league season against East Bengal FC on February 19. Bengaluru FC XI: Gurpreet Sandhu (GK), Prabir Das, Aleksandar Jovanovic, Sandesh Jhingan, Roshan Naorem, Javier Hernandez, Bruno Silva, Suresh Wangjam, Rohit Kumar, Siva Narayanan, Sunil Chhetri (C). Mumbai City FC XI: Phurba Lachenpa (GK), Sanjeev Stalin, Mourtada Fall (C), Mehtab Singh, Vignesh Dakshinamurthy, Ahmed Jahouh, Lalengmawia Ralte, Jorge Diaz, Lallianzuala Chhangte, Greg Stewart, Bipin Singh.
english
The Grand Theft Auto (GTA) series offers a plethora of vehicles in each of its games. They are almost always inspired by real-life vehicles, with some creative tweaks to make them unique. A new off-roader called the Karin Boor has debuted in GTA Online with the latest weekly update that is part of the Los Santos Drug Wars DLC drip-feed content. Players can't help but wonder which real-life car it is based on and whether it's worth buying in 2023. The Karin Boor in GTA Online is a two-seater utility car that takes direct inspiration from the real-life Subaru BRAT (2nd generation) coupe. Classic car enthusiasts will appreciate the vintage look of this vehicle, which resembles the Picador and Warrener HKR. Its notable exterior details, across the front, side, and rear body, include silver lettering on the driver's side (both manufacturer and vehicle names). The Karin Boor comes with a primary color in most of the bodywork. A secondary color is only applied to the lower body portions. GTA Online's Karin Boor is powered by a large-sized inline-6 engine with a 5-speed transmission. It is one of the slowest vehicles in the game and can only reach a top speed of 92.58 mph (149.00 km/h), per the game files. The lack of sheer power also results in slow acceleration, which, when combined with poor braking, doesn't offer a good driving experience for most players. Being extremely lightweight, the vehicle also doesn't offer much ramming power. The lack of grip on the road makes the Karin Boor's handling unresponsive at times. If players try to take hard corners, they will sometimes experience sliding, even if they don't intend to. On the positive side, the vehicle's suspension is soft but heavily dampened, which allows it to handle most road bumps with ease. It also has a very good turning radius, making it easier for GTA Online players to take corners despite the car's tendency to slide. 
While the cons outnumber the pros, the Karin Boor is still a decent vehicle to have and drive all around Los Santos. Interested buyers can get it from Southern San Andreas Super Autos for a price of $1,280,000.
english
import { Services, Utils } from 'czechidm-core';

/**
 * Example service:
 * - example rest methods
 *
 * @author <NAME>
 */
export default class ExampleProductService {

  /**
   * Client error example
   *
   * @param {string} some value
   * @return {Promise}
   */
  clientError(parameter = 'Test') {
    return Services.RestApiService
      .get(`/examples/client-error?parameter=${ encodeURIComponent(parameter) }`)
      .then(response => response.json())
      .then(json => {
        if (Utils.Response.hasError(json)) {
          throw Utils.Response.getFirstError(json);
        }
        return json;
      });
  }

  /**
   * Server error example
   *
   * @param {string} some value
   * @return {Promise}
   */
  serverError(parameter = 'Test') {
    return Services.RestApiService
      .get(`/examples/server-error?parameter=${ encodeURIComponent(parameter) }`)
      .then(response => response.json())
      .then(json => {
        if (Utils.Response.hasError(json)) {
          throw Utils.Response.getFirstError(json);
        }
        return json;
      });
  }
}
javascript
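Both methods follow the same pattern: GET, parse the JSON body, and surface the first backend error as a thrown object. A minimal sketch of that pattern without the czechidm-core dependency — `fakeGet`, `hasError`, and `getFirstError` below are hypothetical stand-ins for `Services.RestApiService.get` and the `Utils.Response` helpers, not the real API:

```javascript
// Hypothetical error-envelope check: czechidm-style responses are assumed
// to carry backend errors in an `_errors` array (an assumption for this sketch).
function hasError(json) {
  return Boolean(json) && Array.isArray(json._errors) && json._errors.length > 0;
}
function getFirstError(json) {
  return json._errors[0];
}

// Simulated REST call: echoes the query parameter back, or returns an
// error envelope when the parameter is 'boom'.
function fakeGet(url) {
  const parameter = decodeURIComponent(url.split('parameter=')[1]);
  const body = parameter === 'boom'
    ? { _errors: [{ statusEnum: 'EXAMPLE_CLIENT_ERROR', parameter }] }
    : { value: parameter };
  return Promise.resolve({ json: () => Promise.resolve(body) });
}

// Same shape as ExampleProductService.clientError, with the stand-ins above.
function clientError(parameter = 'Test') {
  return fakeGet(`/examples/client-error?parameter=${ encodeURIComponent(parameter) }`)
    .then(response => response.json())
    .then(json => {
      if (hasError(json)) {
        throw getFirstError(json);
      }
      return json;
    });
}
```

The design point is that callers never branch on HTTP-ish envelopes: a successful promise always resolves to clean JSON, and backend errors arrive through the rejection path, e.g. `clientError('boom').catch(err => console.log(err.statusEnum))`.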
from __future__ import absolute_import import sys, os BASE_DIR = os.path.dirname(os.path.dirname(__file__)) sys.path.append(BASE_DIR) import numpy as np import PMML43Ext as pml from skl import pre_process as pp from datetime import datetime import math import metadata import inspect from nyoka.keras.keras_model_to_pmml import KerasToPmml from nyoka.xgboost.xgboost_to_pmml import xgboost_to_pmml from nyoka.lgbm.lgb_to_pmml import lgb_to_pmml from nyoka.lgbm.lgbmTrainingAPI_to_pmml import ExportToPMML as ext def model_to_pmml(toExportDict, pmml_f_name='from_sklearn.pmml'): """ Exports scikit-learn pipeline object into pmml Parameters ---------- pipeline : Contains an instance of Pipeline with preprocessing and final estimator col_names : List Contains list of feature/column names. target_name : String Name of the target column. (Default='target') pmml_f_name : String Name of the pmml file. (Default='from_sklearn.pmml') Returns ------- Returns a pmml file """ # To support multiple models and Transformation dictionaries models_dict = {'DeepNetwork':[]} trfm_dict_kwargs = {'TransformationDictionary':[]} data_dicts = [] visited = [] categoric_values = None derived_col_names = None mining_imp_val = None for model_name in toExportDict.keys(): col_names = toExportDict[model_name]['featuresUsed'] target_name = toExportDict[model_name]['targetName'] tasktype = toExportDict[model_name]['taskType'] model = toExportDict[model_name]['modelObj'] pipelineOnly = toExportDict[model_name]['pipelineObj'] categoric_values = tuple() derived_col_names = col_names mining_imp_val = tuple() if (pipelineOnly is not None) and (pipelineOnly not in visited): derived_col_names,categoric_values,mining_imp_val,trfm_dict_kwargs = get_trfm_dict_kwargs(col_names,pipelineOnly, trfm_dict_kwargs,model,model_name) if 'keras' in str(model): KModelObj=toExportDict[model_name] if 'model_graph' in KModelObj: model_graph = KModelObj['model_graph'] with model_graph.as_default(): tf_session = 
KModelObj['tf_session'] with tf_session.as_default(): KerasPMML = KerasToPmml(model.model,model_name=pmml_f_name,targetVarName=target_name) else: KerasPMML = KerasToPmml(model,model_name=pmml_f_name,targetVarName=target_name) model_obj = KerasPMML.DeepNetwork[0] model_obj.modelName = model_name model_obj.taskType=tasktype models_dict['DeepNetwork'].append(model_obj) else: #model = pipeline.steps[-1][1] #ppln_sans_predictor = pipeline.steps[:-1] #derived_col_names,categoric_values,mining_imp_val,trfm_dict_kwargs = get_trfm_dict_kwargs(col_names,pipelineOnly, # trfm_dict_kwargs,modelobj,model_name) if ('XGBRegressor' in str(model)) or ('XGBClassifier' in str(model)): PMML_kwargs = xgboost_to_pmml(model, derived_col_names, col_names, target_name, mining_imp_val, categoric_values, tasktype) elif ('LGBMRegressor' in str(model)) or ('LGBMClassifier' in str(model)): PMML_kwargs = lgb_to_pmml(model, derived_col_names, col_names, target_name, mining_imp_val, categoric_values, tasktype) elif ('Booster' in str(model)): PMML_kwargs = ext(model,tasktype,target_name) else: PMML_kwargs = get_PMML_kwargs(model, derived_col_names, col_names, target_name, mining_imp_val, categoric_values, tasktype) model_obj = list(PMML_kwargs.values())[0][0] model_obj.modelName = model_name key = list(PMML_kwargs.keys())[0] if key in models_dict: models_dict[key].append(model_obj) else: PMML_kwargs = {key:[model_obj]} models_dict.update(PMML_kwargs) data_dicts.append(get_data_dictionary(model, col_names, target_name, categoric_values)) pmml = pml.PMML( version=get_version(), Header=get_header(), MiningBuildTask=get_mining_buildtask(toExportDict), DataDictionary=get_data_dictionary_values(data_dicts), script = get_script_execution(toExportDict), **trfm_dict_kwargs, **models_dict ) pmml.export(outfile=open(pmml_f_name, "w"), level=0) def get_trfm_dict_kwargs(col_names,pipelineOnly,trfm_dict_kwargs,model,model_name): if isinstance(col_names, np.ndarray): col_names = col_names.tolist() 
#ppln_sans_predictor = pipeline.steps[:-1] ppln_sans_predictor = pipelineOnly.steps derived_col_names = col_names categoric_values = tuple() mining_imp_val = tuple() if ppln_sans_predictor: pml_pp = pp.get_preprocess_val(ppln_sans_predictor, col_names, model, model_name) trfm_dict_kwargs['TransformationDictionary'].append(pml_pp['trfm_dict']) derived_col_names = pml_pp['derived_col_names'] col_names = pml_pp['preprocessed_col_names'] categoric_values = pml_pp['categorical_feat_values'] mining_imp_val = pml_pp['mining_imp_values'] return derived_col_names,categoric_values,mining_imp_val,trfm_dict_kwargs def processScript(scr): scr=scr.replace('&','&amp;') return scr def get_data_dictionary_values(data_dicts): data_dicts = [x for x in data_dicts if x is not None] lst = [] lislen = len(data_dicts) if lislen != 0: for indfile in data_dicts[0].DataField: lst.append(indfile.get_name()) if lislen == 0: datadict = None elif lislen == 1: datadict = data_dicts[0] else: for dd in range(1,lislen): for indfile in data_dicts[dd].DataField: if indfile.get_name() in lst and len(indfile.get_Value())==0: pass else: data_dicts[0].add_DataField(indfile) lst.append(indfile.get_name()) datadict = data_dicts[0] return datadict def get_script_execution(toExportDict): # Script execution scrps = [] for model_name in toExportDict.keys(): if toExportDict[model_name]['preProcessingScript'] is not None: lstlen = len(toExportDict[model_name]['preProcessingScript']['scripts']) for leng in range(lstlen): scrps.append(pml.script(content=processScript(toExportDict[model_name]['preProcessingScript']['scripts'][leng]), for_= model_name, class_ = 'preprocessing', scriptPurpose = toExportDict[model_name]['preProcessingScript']['scriptpurpose'][leng] )) if toExportDict[model_name]['postProcessingScript'] is not None: lstlen = len(toExportDict[model_name]['postProcessingScript']['scripts']) for leng in range(0,lstlen): 
                scrps.append(pml.script(
                    content=processScript(toExportDict[model_name]['postProcessingScript']['scripts'][leng]),
                    for_=model_name,
                    class_='postprocessing',
                    scriptPurpose=toExportDict[model_name]['postProcessingScript']['scriptpurpose'][leng]
                ))
    return scrps


def get_entire_string(pipe0):
    pipe_steps = pipe0.steps
    pipe_memory = 'memory=' + str(pipe0.memory)
    df_container = ''
    pipe_container = ''
    for step_idx, step in enumerate(pipe_steps):
        pipe_step_container = ''
        step_name = step[0]
        step_item = step[1]
        if step_item.__class__.__name__ == "DataFrameMapper":
            df_default_val = "default=" + str(step_item.default)
            df_out_val = "df_out=" + str(step_item.df_out)
            input_df_val = "input_df=" + str(step_item.input_df)
            sparse_val = "sparse=" + str(step_item.sparse)
            for feature in step_item.features:
                if not df_container:
                    df_container = df_container + str(feature)
                else:
                    df_container = df_container + ',' + str(feature)
            df_container = '[' + df_container + ']'
            df_container = 'features=' + df_container
            df_container = df_default_val + ',' + df_out_val + ',\n\t' + df_container
            df_container = df_container + ',\n\t' + input_df_val + ',' + sparse_val
            df_container = '(' + df_container + ')'
            df_container = 'DataFrameMapper' + df_container
            df_container = '\'' + step_name + '\'' + ',' + df_container
            df_container = '(' + df_container + ')'
        else:
            pipe_step_container = '\'' + step_name + '\'' + ',' + str(step_item)
            pipe_step_container = '(' + pipe_step_container + ')'
            if not pipe_container:
                pipe_container = pipe_container + pipe_step_container
            else:
                pipe_container = pipe_container + ',' + pipe_step_container
    if df_container:
        pipe_container = df_container + ',' + pipe_container
    pipe_container = '[' + pipe_container + ']'
    pipe_container = 'steps=' + pipe_container
    pipe_container = pipe_memory + ',\n ' + pipe_container
    pipe_container = 'Pipeline(' + pipe_container + ')'
    return pipe_container


def get_mining_buildtask(toExportDict):
    extension = []
    for model_name in toExportDict.keys():
        pipeline = toExportDict[model_name]['pipelineObj']
        if 'keras' in str(pipeline):
            pass
        else:
            if pipeline:
                pipeline = get_entire_string(pipeline)
                extension.append(pml.Extension(value=pipeline, for_=model_name, name="preprocessingPipeline"))
        modelobj = toExportDict[model_name]['modelObj']
        modelobj = str(modelobj)
        extension.append(pml.Extension(value=modelobj, for_=model_name, name="modelObject"))
        if toExportDict[model_name]['hyperparameters']:
            extension.append(pml.Extension(value=toExportDict[model_name]['hyperparameters'],
                                           for_=model_name, name="hyperparameters"))
    mining_bld_task = pml.MiningBuildTask(Extension=extension)
    return mining_bld_task


def any_in(seq_a, seq_b):
    return any(elem in seq_b for elem in seq_a)


def get_PMML_kwargs(model, derived_col_names, col_names, target_name,
                    mining_imp_val, categoric_values, tasktype):
    """
    It returns all the PMML elements.

    Parameters
    ----------
    model : Scikit-learn model object
        An instance of a Scikit-learn model.
    derived_col_names : List
        Contains column names after preprocessing.
    col_names : List
        Contains list of feature/column names.
    target_name : String
        Name of the target column.
    mining_imp_val : tuple
        Contains the mining_attributes, mining_strategy, mining_impute_value.
    categoric_values : tuple
        Contains categorical attribute names and their values.

    Returns
    -------
    algo_kwargs : Dictionary
        The PMML model arguments derived from the scikit-learn model object.
    """
    skl_mdl_super_cls_names = get_super_cls_names(model)
    # regression_model_names = ('LinearRegression','LinearSVR')
    # regression_mining_model_names = ('LogisticRegression', 'RidgeClassifier', 'LinearDiscriminantAnalysis',
    #                                  'SGDClassifier', 'LinearSVC',)
    regression_model_names = ('LinearRegression', 'LogisticRegression', 'RidgeClassifier', 'SGDClassifier',
                              'LinearDiscriminantAnalysis', 'LinearSVC', 'LinearSVR')
    tree_model_names = ('BaseDecisionTree',)
    support_vector_model_names = ('SVC', 'SVR')
    anomaly_model_names = ('OneClassSVM',)
    naive_bayes_model_names = ('GaussianNB',)
    mining_model_names = ('RandomForestRegressor', 'RandomForestClassifier', 'GradientBoostingClassifier',
                          'GradientBoostingRegressor', 'IsolationForest')
    neurl_netwk_model_names = ('MLPClassifier', 'MLPRegressor')
    nearest_neighbour_names = ('NeighborsBase',)
    clustering_model_names = ('KMeans',)
    if any_in(tree_model_names, skl_mdl_super_cls_names):
        algo_kwargs = {'TreeModel': get_tree_models(model, derived_col_names, col_names, target_name,
                                                    mining_imp_val, categoric_values, tasktype)}
    # elif any_in(regression_mining_model_names, skl_mdl_super_cls_names):
    #     if len(model.classes_) == 2:
    #         algo_kwargs = {'RegressionModel': get_regrs_models(model, derived_col_names, col_names, target_name,
    #                                                            mining_imp_val, categoric_values, tasktype)}
    #     else:
    #         algo_kwargs = {'MiningModel': get_reg_mining_models(model, derived_col_names, col_names, target_name,
    #                                                             mining_imp_val, categoric_values, tasktype)}
    elif any_in(regression_model_names, skl_mdl_super_cls_names):
        algo_kwargs = {'RegressionModel': get_regrs_models(model, derived_col_names, col_names, target_name,
                                                           mining_imp_val, categoric_values, tasktype)}
    elif any_in(support_vector_model_names, skl_mdl_super_cls_names):
        algo_kwargs = {'SupportVectorMachineModel': get_supportVectorMachine_models(model, derived_col_names,
                                                                                    col_names, target_name,
                                                                                    mining_imp_val,
                                                                                    categoric_values, tasktype)}
    elif any_in(mining_model_names, skl_mdl_super_cls_names):
        algo_kwargs = {'MiningModel': get_ensemble_models(model, derived_col_names, col_names, target_name,
                                                          mining_imp_val, categoric_values, tasktype)}
    elif any_in(neurl_netwk_model_names, skl_mdl_super_cls_names):
        algo_kwargs = {'NeuralNetwork': get_neural_models(model, derived_col_names, col_names, target_name,
                                                          mining_imp_val, categoric_values, tasktype)}
    elif any_in(naive_bayes_model_names, skl_mdl_super_cls_names):
        algo_kwargs = {'NaiveBayesModel': get_naiveBayesModel(model, derived_col_names, col_names, target_name,
                                                              mining_imp_val, categoric_values, tasktype)}
    elif any_in(nearest_neighbour_names, skl_mdl_super_cls_names):
        algo_kwargs = {'NearestNeighborModel': get_nearestNeighbour_model(model, derived_col_names, col_names,
                                                                          target_name, mining_imp_val,
                                                                          categoric_values, tasktype)}
    elif any_in(anomaly_model_names, skl_mdl_super_cls_names):
        algo_kwargs = {'AnomalyDetectionModel': get_anomalydetection_model(model, derived_col_names, col_names,
                                                                           target_name, mining_imp_val,
                                                                           categoric_values, tasktype)}
    elif any_in(clustering_model_names, skl_mdl_super_cls_names):
        algo_kwargs = {'ClusteringModel': get_clustering_model(model, derived_col_names, col_names, target_name,
                                                               mining_imp_val, categoric_values, tasktype)}
    else:
        raise NotImplementedError("{} is not Implemented!".format(model.__class__.__name__))
    return algo_kwargs


def get_model_kwargs(model, col_names, target_name, mining_imp_val, categoric_values):
    """
    It returns all the model elements for a specific model.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    col_names : List
        Contains list of feature/column names.
    target_name : String
        Name of the Target column.
    mining_imp_val : tuple
        Contains the mining_attributes, mining_strategy, mining_impute_value.

    Returns
    -------
    model_kwargs : Dictionary
        Returns the function name, MiningSchema and Output of the sk_model object.
    """
    model_kwargs = dict()
    model_kwargs['functionName'] = get_mining_func(model)
    model_kwargs['MiningSchema'] = get_mining_schema(model, col_names, target_name, mining_imp_val, categoric_values)
    if model.__class__.__name__ == 'IsolationForest':
        model_kwargs['Output'] = get_anomaly_detection_output(model)
    else:
        model_kwargs['Output'] = get_output(model, target_name)
    return model_kwargs


def get_reg_mining_models(model, derived_col_names, col_names, target_name,
                          mining_imp_val, categoric_values, tasktype):
    num_classes = len(model.classes_)
    model_kwargs = get_model_kwargs(model, col_names, target_name, mining_imp_val, categoric_values)
    mining_model = pml.MiningModel(modelName=model.__class__.__name__, taskType=tasktype, **model_kwargs)
    inner_mining_schema = [mfield for mfield in model_kwargs['MiningSchema'].MiningField
                           if mfield.usageType != 'target']
    segmentation = pml.Segmentation(multipleModelMethod="modelChain")
    for idx in range(num_classes):
        segment = pml.Segment(id=str(idx + 1), True_=pml.True_())
        segment.RegressionModel = pml.RegressionModel(
            functionName='regression',
            MiningSchema=pml.MiningSchema(MiningField=inner_mining_schema),
            Output=pml.Output(
                OutputField=[
                    pml.OutputField(
                        name="probability_" + str(idx),
                        optype="continuous",
                        dataType="double"
                    )
                ]
            ),
            RegressionTable=get_reg_tab_for_reg_mining_model(model, derived_col_names, idx)
        )
        if model.__class__.__name__ != 'LinearSVC':
            segment.RegressionModel.normalizationMethod = "logit"
        segmentation.add_Segment(segment)
    last_segment = pml.Segment(id=str(num_classes + 1), True_=pml.True_())
    mining_flds_for_last = [pml.MiningField(name="probability_" + str(idx)) for idx in range(num_classes)]
    mining_flds_for_last.append(pml.MiningField(name=target_name, usageType="target"))
    mining_schema_for_last = pml.MiningSchema(MiningField=mining_flds_for_last)
    reg_tab_for_last = list()
    for idx in range(num_classes):
        reg_tab_for_last.append(
            pml.RegressionTable(
                intercept="0.0",
                targetCategory=str(model.classes_[idx]),
                NumericPredictor=[pml.NumericPredictor(
                    name="probability_" + str(idx),
                    coefficient="1.0"
                )]
            )
        )
    last_segment.RegressionModel = pml.RegressionModel(
        functionName="classification",
        MiningSchema=mining_schema_for_last,
        RegressionTable=reg_tab_for_last
    )
    if model.__class__.__name__ != 'LinearSVC':
        last_segment.RegressionModel.normalizationMethod = "simplemax"
    segmentation.add_Segment(last_segment)
    mining_model.set_Segmentation(segmentation)
    return [mining_model]


def get_reg_tab_for_reg_mining_model(model, col_names, index):
    reg_tab = pml.RegressionTable(intercept="{:.16f}".format(model.intercept_[index]))
    for idx, coef in enumerate(model.coef_[index]):
        reg_tab.add_NumericPredictor(pml.NumericPredictor(name=col_names[idx],
                                                          coefficient="{:.16f}".format(coef)))
    return [reg_tab]


def get_anomalydetection_model(model, derived_col_names, col_names, target_name,
                               mining_imp_val, categoric_values, tasktype):
    """
    It returns the Anomaly Detection model element.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    derived_col_names : List
        Contains column names after preprocessing.
    col_names : List
        Contains list of feature/column names.
    target_name : String
        Name of the Target column.
    mining_imp_val : tuple
        Contains the mining_attributes, mining_strategy, mining_impute_value.
    categoric_values : tuple
        Contains categorical attribute names and their values.

    Returns
    -------
    anomaly_detection_model : List
        Returns an anomaly detection model within a list.
    """
    anomaly_detection_model = list()
    if 'OneClassSVM' in str(model.__class__):
        anomaly_detection_model.append(
            pml.AnomalyDetectionModel(
                modelName=model.__class__.__name__,
                algorithmType="ocsvm",
                functionName="regression",
                MiningSchema=get_mining_schema(model, col_names, target_name, mining_imp_val, categoric_values),
                Output=get_anomaly_detection_output(model),
                taskType=tasktype,
                SupportVectorMachineModel=get_supportVectorMachine_models(model, derived_col_names, col_names,
                                                                          target_name, mining_imp_val,
                                                                          categoric_values, tasktype)[0]
            )
        )
    # else:
    #     anomaly_detection_model.append(
    #         pml.AnomalyDetectionModel(
    #             modelName="IsolationForests",
    #             algorithmType="iforest",
    #             functionName="regression",
    #             MiningSchema=get_mining_schema(model, col_names, target_name, mining_imp_val),
    #             Output=get_anomaly_detection_output(model),
    #             ParameterList=pml.ParameterList(Parameter=[pml.Parameter(
    #                 name="training_data_count",
    #                 value=model.max_samples_)]),
    #             MiningModel=get_ensemble_models(model, derived_col_names, col_names, target_name,
    #                                             mining_imp_val, categoric_values)[0]
    #         )
    #     )
    return anomaly_detection_model


def get_anomaly_detection_output(model):
    """
    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.

    Returns
    -------
    output_fields :
        Returns an Output instance of the anomaly detection model.
    """
    output_fields = list()
    if 'OneClassSVM' in str(model.__class__):
        output_fields.append(pml.OutputField(
            name="anomalyScore",
            feature="predictedValue",
            optype="continuous",
            dataType="float"))
        output_fields.append(pml.OutputField(
            name="anomaly",
            feature="anomaly",
            optype="categorical",
            dataType="boolean",
            threshold="0"
        ))
    else:
        # Normalization constants for IsolationForest score fields.
        n = model.max_samples_
        eulers_gamma = 0.577215664901532860606512090082402431
        output_fields.append(pml.OutputField(name="rawAnomalyScore",
                                             optype="continuous",
                                             dataType="double",
                                             feature="predictedValue",
                                             isFinalResult="false"))
        output_fields.append(pml.OutputField(name="normalizedAnomalyScore",
                                             optype="continuous",
                                             dataType="double",
                                             feature="transformedValue",
                                             isFinalResult="false",
                                             Apply=pml.Apply(
                                                 function="/",
                                                 FieldRef=[pml.FieldRef(field="rawAnomalyScore")],
                                                 Constant=[pml.Constant(
                                                     dataType="double",
                                                     valueOf_=(2.0 * (math.log(n - 1.0) + eulers_gamma)) -
                                                              (2.0 * ((n - 1.0) / n)))])))
        appl_inner_inner = pml.Apply(function="*")
        cnst = pml.Constant(dataType="double", valueOf_=-1.0)
        fldref = pml.FieldRef(field="normalizedAnomalyScore")
        cnst.original_tagname_ = 'Constant'
        appl_inner_inner.add_FieldRef(cnst)
        appl_inner_inner.add_FieldRef(fldref)
        appl_inner = pml.Apply(function='pow')
        cnst = pml.Constant(dataType="double", valueOf_=2.0)
        cnst.original_tagname_ = 'Constant'
        appl_inner.add_FieldRef(cnst)
        appl_inner_inner.original_tagname_ = 'Apply'
        appl_inner.add_FieldRef(appl_inner_inner)
        appl_outer = pml.Apply(function="-")
        cnst = pml.Constant(dataType="double", valueOf_=0.5)
        cnst.original_tagname_ = 'Constant'
        appl_outer.add_FieldRef(cnst)
        appl_inner.original_tagname_ = 'Apply'
        appl_outer.add_FieldRef(appl_inner)
        output_fields.append(pml.OutputField(name="decisionFunction",
                                             optype="continuous",
                                             dataType="double",
                                             feature="transformedValue",
                                             isFinalResult="false",
                                             Apply=appl_outer))
        output_fields.append(pml.OutputField(name="outlier",
                                             optype="categorical",
                                             dataType="boolean",
                                             feature="transformedValue",
                                             isFinalResult="true",
                                             Apply=pml.Apply(
                                                 function="greaterThan",
                                                 FieldRef=[pml.FieldRef(field="decisionFunction")],
                                                 Constant=[pml.Constant(
                                                     dataType="double",
                                                     valueOf_="{:.16f}".format(model.threshold_))])))
    return pml.Output(OutputField=output_fields)


def get_clustering_model(model, derived_col_names, col_names, target_name,
                         mining_imp_val, categoric_values, tasktype):
    """
    It returns the KMeans Clustering model element.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    derived_col_names : List
        Contains column names after preprocessing.
    col_names : List
        Contains list of feature/column names.
    target_name : String
        Name of the Target column.

    Returns
    -------
    clustering_models : List
        Returns a KMeans Clustering model within a list.
    """
    clustering_models = list()
    model_kwargs = get_model_kwargs(model, col_names, target_name, mining_imp_val, categoric_values)
    values, counts = np.unique(model.labels_, return_counts=True)
    model_kwargs["Output"] = get_output_for_clustering(values)
    clustering_models.append(
        pml.ClusteringModel(
            modelClass="centerBased",
            modelName=model.__class__.__name__,
            numberOfClusters=get_cluster_num(model),
            ComparisonMeasure=get_comp_measure(),
            ClusteringField=get_clustering_flds(derived_col_names),
            Cluster=get_cluster_vals(model, counts),
            taskType=tasktype,
            **model_kwargs
        )
    )
    return clustering_models


def get_output_for_clustering(values):
    """
    Parameters
    ----------
    values :
        The unique cluster labels of the model.

    Returns
    -------
    output_fields : List
        Returns a list of OutputField wrapped in an Output instance.
    """
    output_fields = list()
    output_fields.append(pml.OutputField(name="cluster", optype="categorical",
                                         dataType="string", feature="predictedValue"))
    for idx, val in enumerate(values):
        output_fields.append(
            pml.OutputField(
                name="affinity(" + str(idx) + ")",
                optype="continuous",
                dataType="double",
                feature="entityAffinity",
                value=str(val)
            )
        )
    return pml.Output(OutputField=output_fields)


def get_cluster_vals(model, counts):
    """
    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    counts :
        The number of samples in each cluster.

    Returns
    -------
    cluster_flds : List
        Returns a list of Cluster instances.
    """
    centroids = model.cluster_centers_
    cluster_flds = []
    for centroid_idx in range(centroids.shape[0]):
        centroid_values = ""
        centroid_flds = pml.ArrayType(type_="real")
        for centroid_cordinate_idx in range(centroids.shape[1]):
            centroid_flds.content_[0].value = centroid_values + "{:.16f}".format(
                centroids[centroid_idx][centroid_cordinate_idx])
            centroid_values = centroid_flds.content_[0].value + " "
        cluster_flds.append(pml.Cluster(id=str(centroid_idx), Array=centroid_flds,
                                        size=str(counts[centroid_idx])))
    return cluster_flds


def get_cluster_num(model):
    """
    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.

    Returns
    -------
    model.n_clusters : Integer
        Returns the number of clusters.
    """
    return model.n_clusters


def get_comp_measure():
    """
    Returns
    -------
    Returns an instance of comparison measure.
    """
    comp_equation = pml.euclidean()
    return pml.ComparisonMeasure(euclidean=comp_equation, kind="distance")


def get_clustering_flds(col_names):
    """
    Parameters
    ----------
    col_names :
        Contains list of feature/column names.

    Returns
    -------
    clustering_flds : List
        Returns the list containing clustering field instances.
    """
    clustering_flds = []
    for name in col_names:
        clustering_flds.append(pml.ClusteringField(field=str(name)))
    return clustering_flds


def get_nearestNeighbour_model(model, derived_col_names, col_names, target_name,
                               mining_imp_val, categoric_values, tasktype):
    """
    It returns the Nearest Neighbour model element.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    derived_col_names : List
        Contains column names after preprocessing.
    col_names : List
        Contains list of feature/column names.
    target_name : String
        Name of the Target column.

    Returns
    -------
    nearest_neighbour_model :
        Returns a nearest neighbour model instance.
    """
    model_kwargs = get_model_kwargs(model, col_names, target_name, mining_imp_val, categoric_values)
    nearest_neighbour_model = list()
    nearest_neighbour_model.append(
        pml.NearestNeighborModel(
            modelName=model.__class__.__name__,
            continuousScoringMethod='average',
            algorithmName="KNN",
            numberOfNeighbors=model.n_neighbors,
            KNNInputs=get_knn_inputs(derived_col_names),
            ComparisonMeasure=get_comparison_measure(model),
            TrainingInstances=get_training_instances(model, derived_col_names, target_name),
            taskType=tasktype,
            **model_kwargs
        )
    )
    return nearest_neighbour_model


def get_training_instances(model, derived_col_names, target_name):
    """
    It returns the TrainingInstances element.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    derived_col_names : List
        Contains column names after preprocessing.
    target_name : String
        Name of the Target column.

    Returns
    -------
    TrainingInstances :
        Returns a TrainingInstances instance.
    """
    return pml.TrainingInstances(
        InstanceFields=get_instance_fields(derived_col_names, target_name),
        InlineTable=get_inline_table(model)
    )


def get_inline_table(model):
    """
    It returns the InlineTable element of the model.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.

    Returns
    -------
    InlineTable :
        Returns an InlineTable instance.
    """
    rows = []
    x = model._tree.get_arrays()[0].tolist()
    y = model._y.tolist()
    X = []
    for idx in range(len(model._tree.get_arrays()[0][0])):
        X.append("x" + str(idx + 1))
    for idx in range(len(x)):
        row = pml.row()
        row.elementobjs_ = ['y'] + X
        if hasattr(model, 'classes_'):
            row.y = model.classes_[y[idx]]
        else:
            row.y = y[idx]
        # setattr replaces the previous exec()-based attribute assignment; behavior is unchanged.
        for idx_2 in range(len(x[idx])):
            setattr(row, X[idx_2], x[idx][idx_2])
        rows.append(row)
    return pml.InlineTable(row=rows)


def get_instance_fields(derived_col_names, target_name):
    """
    It returns the InstanceFields element.

    Parameters
    ----------
    derived_col_names : List
        Contains column names after preprocessing.
    target_name : String
        Name of the Target column.

    Returns
    -------
    InstanceFields :
        Returns an InstanceFields instance.
    """
    instance_fields = list()
    instance_fields.append(pml.InstanceField(field=target_name, column="y"))
    for (index, name) in enumerate(derived_col_names):
        instance_fields.append(pml.InstanceField(field=str(name), column="x" + str(index + 1)))
    return pml.InstanceFields(InstanceField=instance_fields)


def get_comparison_measure(model):
    """
    It returns the ComparisonMeasure element.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.

    Returns
    -------
    comp_measure :
        Returns a ComparisonMeasure instance.
    """
    if model.effective_metric_ == 'euclidean':
        comp_measure = pml.ComparisonMeasure(euclidean=pml.euclidean(), kind="distance")
    elif model.effective_metric_ == 'minkowski':
        comp_measure = pml.ComparisonMeasure(minkowski=pml.minkowski(p_parameter=model.p), kind="distance")
    elif model.effective_metric_ in ['manhattan', 'cityblock']:
        comp_measure = pml.ComparisonMeasure(cityBlock=pml.cityBlock(), kind="distance")
    elif model.effective_metric_ == 'sqeuclidean':
        comp_measure = pml.ComparisonMeasure(squaredEuclidean=pml.squaredEuclidean(), kind="distance")
    elif model.effective_metric_ == 'chebyshev':
        comp_measure = pml.ComparisonMeasure(chebychev=pml.chebychev(), kind="distance")
    elif model.effective_metric_ == 'matching':
        comp_measure = pml.ComparisonMeasure(simpleMatching=pml.simpleMatching(), kind="similarity")
    elif model.effective_metric_ == 'jaccard':
        comp_measure = pml.ComparisonMeasure(jaccard=pml.jaccard(), kind="similarity")
    elif model.effective_metric_ == 'rogerstanimoto':
        comp_measure = pml.ComparisonMeasure(tanimoto=pml.tanimoto(), kind="similarity")
    else:
        raise NotImplementedError("{} metric is not implemented for KNN Model!".format(model.effective_metric_))
    return comp_measure


def get_knn_inputs(col_names):
    """
    It returns the KNNInputs element.

    Parameters
    ----------
    col_names : List
        Contains list of feature/column names.

    Returns
    -------
    KNNInputs :
        Returns a KNNInputs instance.
    """
    knnInput = list()
    for name in col_names:
        knnInput.append(pml.KNNInput(field=str(name)))
    return pml.KNNInputs(KNNInput=knnInput)


def get_naiveBayesModel(model, derived_col_names, col_names, target_name,
                        mining_imp_val, categoric_values, tasktype):
    """
    It returns the Naive Bayes Model element of the model.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    derived_col_names : List
        Contains column names after preprocessing.
    col_names : List
        Contains list of feature/column names.
    target_name : String
        Name of the Target column.

    Returns
    -------
    naive_bayes_model : List
        Returns the NaiveBayesModel.
    """
    model_kwargs = get_model_kwargs(model, col_names, target_name, mining_imp_val, categoric_values)
    naive_bayes_model = list()
    naive_bayes_model.append(pml.NaiveBayesModel(
        modelName=model.__class__.__name__,
        BayesInputs=get_bayes_inputs(model, derived_col_names),
        BayesOutput=get_bayes_output(model, target_name),
        threshold=get_threshold(),
        taskType=tasktype,
        **model_kwargs
    ))
    return naive_bayes_model


def get_threshold():
    """
    It returns the threshold value.

    Returns
    -------
    Returns the threshold value.
    """
    return '0.001'


def get_bayes_output(model, target_name):
    """
    It returns the BayesOutput element of the model.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    target_name : String
        Name of the Target column.

    Returns
    -------
    BayesOutput :
        Returns a BayesOutput instance.
    """
    class_counts = model.class_count_
    target_val_counts = pml.TargetValueCounts()
    for name, count in zip(model.classes_, class_counts):
        tr_val = pml.TargetValueCount(value=str(name), count=str(count))
        target_val_counts.add_TargetValueCount(tr_val)
    return pml.BayesOutput(
        fieldName=target_name,
        TargetValueCounts=target_val_counts
    )


def get_bayes_inputs(model, derived_col_names):
    """
    It returns the BayesInputs element of the model.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    derived_col_names : List
        Contains column names after preprocessing.

    Returns
    -------
    bayes_inputs :
        Returns a BayesInputs instance.
    """
    bayes_inputs = pml.BayesInputs()
    for indx, name in enumerate(derived_col_names):
        means = model.theta_[:, indx]
        variances = model.sigma_[:, indx]
        target_val_stats = pml.TargetValueStats()
        for idx, val in enumerate(model.classes_):
            target_val = pml.TargetValueStat(
                val,
                GaussianDistribution=pml.GaussianDistribution(
                    mean="{:.16f}".format(means[idx]),
                    variance="{:.16f}".format(variances[idx])))
            target_val_stats.add_TargetValueStat(target_val)
        bayes_inputs.add_BayesInput(pml.BayesInput(fieldName=str(name),
                                                   TargetValueStats=target_val_stats))
    return bayes_inputs


def get_supportVectorMachine_models(model, derived_col_names, col_names, target_names,
                                    mining_imp_val, categoric_values, tasktype):
    """
    It returns the Support Vector Machine Model element.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    derived_col_names : List
        Contains column names after preprocessing.
    col_names : List
        Contains list of feature/column names.
    target_names : String
        Name of the Target column.
    mining_imp_val : tuple
        Contains the mining_attributes, mining_strategy, mining_impute_value.
    categoric_values : tuple
        Contains categorical attribute names and their values.

    Returns
    -------
    supportVector_models : List
        Returns SupportVectorMachineModel elements which contain classificationMethod,
        VectorDictionary, SupportVectorMachine and kernelType.
    """
    model_kwargs = get_model_kwargs(model, col_names, target_names, mining_imp_val, categoric_values)
    supportVector_models = list()
    kernel_type = get_kernel_type(model)
    supportVector_models.append(pml.SupportVectorMachineModel(
        modelName=model.__class__.__name__,
        classificationMethod=get_classificationMethod(model),
        VectorDictionary=get_vectorDictionary(model, derived_col_names, categoric_values),
        SupportVectorMachine=get_supportVectorMachine(model),
        taskType=tasktype,
        **kernel_type,
        **model_kwargs
    ))
    # supportVector_models[0].export(sys.stdout,0," ")
    return supportVector_models


def get_model_name(model):
    if 'OneClassSVM' in str(model.__class__):
        return 'ocsvm'
    elif 'IsolationForest' in str(model.__class__):
        return 'iforest'
    elif 'XGB' in str(model.__class__):
        return 'XGBoostModel'
    elif 'LGB' in str(model.__class__):
        return 'LightGBModel'
    elif 'GradientBoosting' in str(model.__class__):
        return 'GradientBoostingModel'
    elif 'RandomForest' in str(model.__class__):
        return 'RandomForestModel'


def get_ensemble_models(model, derived_col_names, col_names, target_name,
                        mining_imp_val, categoric_values, tasktype):
    """
    It returns the Mining Model element of the model.

    Parameters
    ----------
    model :
        An instance of a Scikit-learn model.
    derived_col_names : List
        Contains column names after preprocessing.
    col_names : List
        Contains list of feature/column names.
    target_name : String
        Name of the Target column.
    mining_imp_val : tuple
        Contains the mining_attributes, mining_strategy, mining_impute_value.
    categoric_values : tuple
        Contains categorical attribute names and their values.

    Returns
    -------
    mining_models : List
        Returns the MiningModel of the respective ensemble model.
    """
    model_kwargs = get_model_kwargs(model, col_names, target_name, mining_imp_val, categoric_values)
    if model.__class__.__name__ == 'GradientBoostingRegressor':
        model_kwargs['Targets'] = get_targets(model, target_name)
    mining_fields = model_kwargs['MiningSchema'].MiningField
    new_mining_fields = list()
    if model.__class__.__name__ != 'IsolationForest':
        # Keep only the features that actually contribute to the ensemble.
        for idx, imp_ in enumerate(model.feature_importances_):
            if imp_ > 0:
                new_mining_fields.append(mining_fields[idx])
    else:
        for idx in range(len(col_names)):
            new_mining_fields.append(mining_fields[idx])
    for fld in mining_fields:
        if fld.usageType == 'target':
            new_mining_fields.append(fld)
    model_kwargs['MiningSchema'].MiningField = new_mining_fields
    mining_models = list()
    mining_models.append(pml.MiningModel(
        modelName=model.__class__.__name__,
        Segmentation=get_outer_segmentation(model, derived_col_names, col_names, target_name,
                                            mining_imp_val, categoric_values, tasktype),
        taskType=tasktype,
        **model_kwargs
    ))
    return mining_models


def get_targets(model, target_name):
    """
    It returns the Targets element of the model.

    Parameters
    ----------
    model :
        A Scikit-learn model instance.
    target_name : String
        Name of the Target column.

    Returns
    -------
    targets :
        Returns a Targets instance.
    """
    if model.__class__.__name__ == 'GradientBoostingRegressor':
        targets = pml.Targets(
            Target=[
                pml.Target(
                    field=target_name,
                    rescaleConstant="{:.16f}".format(model.init_.mean),
                    rescaleFactor="{:.16f}".format(model.learning_rate)
                )
            ]
        )
    else:
        targets = pml.Targets(
            Target=[
                pml.Target(
                    field=target_name,
                    rescaleConstant="{:.16f}".format(model.base_score)
                )
            ]
        )
    return targets


def get_multiple_model_method(model):
    """
    It returns the name of the Multiple Model Chain element of the model.

    Parameters
    ----------
    model :
        A Scikit-learn model instance.

    Returns
    -------
    The multiple model method for a mining model.
    """
    if model.__class__.__name__ == 'GradientBoostingClassifier':
        return 'modelChain'
    elif model.__class__.__name__ == 'GradientBoostingRegressor':
        return 'sum'
    elif model.__class__.__name__ == 'RandomForestClassifier':
        return 'majorityVote'
    elif model.__class__.__name__ in ['RandomForestRegressor', 'IsolationForest']:
        return 'average'


def get_outer_segmentation(model, derived_col_names, col_names, target_name,
                           mining_imp_val, categoric_values, tasktype):
    """
    It returns the Segmentation element of the model.

    Parameters
    ----------
    model :
        A Scikit-learn model instance.
    derived_col_names : List
        Contains column names after preprocessing.
    col_names : List
        Contains list of feature/column names.
    target_name : String
        Name of the Target column.
    mining_imp_val : tuple
        Contains the mining_attributes, mining_strategy, mining_impute_value.
    categoric_values : tuple
        Contains categorical attribute names and their values.

    Returns
    -------
    segmentation :
        A Segmentation instance.
    """
    segmentation = pml.Segmentation(
        multipleModelMethod=get_multiple_model_method(model),
        Segment=get_segments(model, derived_col_names, col_names, target_name,
                             mining_imp_val, categoric_values, tasktype)
    )
    return segmentation


def get_segments(model, derived_col_names, col_names, target_name,
                 mining_imp_val, categoric_values, tasktype):
    """
    It returns the Segment element of the model.

    Parameters
    ----------
    model :
        A Scikit-learn model instance.
    derived_col_names : List
        Contains column names after preprocessing.
    col_names : List
        Contains list of feature/column names.
    target_name : String
        Name of the Target column.
    mining_imp_val : tuple
        Contains the mining_attributes, mining_strategy, mining_impute_value.
    categoric_values : tuple
        Contains categorical attribute names and their values.

    Returns
    -------
    segments :
        A list of Segment instances.
""" segments = None if 'GradientBoostingClassifier' in str(model.__class__): segments = get_segments_for_gbc(model, derived_col_names, col_names, target_name, mining_imp_val, categoric_values,tasktype) else: segments = get_inner_segments(model, derived_col_names, col_names, 0) return segments def get_segments_for_gbc(model, derived_col_names, col_names, target_name, mining_imp_val, categoric_values,tasktype): """ It returns list of Segments element of the model. Parameters ---------- model : A Scikit-learn model instance. derived_col_names : List Contains column names after preprocessing. col_names : List Contains list of feature/column names. target_name : String Name of the Target column. mining_imp_val : tuple Contains the mining_attributes,mining_strategy, mining_impute_value categoric_values : tuple Contains Categorical attribute names and its values Returns ------- segments : List Get the Segments for the Segmentation element. """ segments = list() out_field_names = list() for estm_idx in range(len(model.estimators_[0])): mining_fields_for_first = list() # for name in col_names: for idx,imp_ in enumerate(model.feature_importances_): # mining_fields_for_first.append(pml.MiningField(name=name)) if imp_ > 0: mining_fields_for_first.append(pml.MiningField(name=col_names[idx])) miningschema_for_first = pml.MiningSchema(MiningField=mining_fields_for_first) output_fields = list() output_fields.append( pml.OutputField( name='decisionFunction(' + str(estm_idx) + ')', feature='predictedValue', dataType="double", isFinalResult=False ) ) if len(model.classes_) == 2: output_fields.append( pml.OutputField( name='transformedDecisionFunction(0)', feature='transformedValue', dataType="double", isFinalResult=True, Apply=pml.Apply( function="+", Constant=[pml.Constant( dataType="double", valueOf_="{:.16f}".format(model.init_.prior) )], Apply_member=[pml.Apply( function="*", Constant=[pml.Constant( dataType="double", valueOf_="{:.16f}".format(model.learning_rate) )], 
FieldRef=[pml.FieldRef( field="decisionFunction(0)", )] )] ) ) ) else: output_fields.append( pml.OutputField( name='transformedDecisionFunction(' + str(estm_idx) + ')', feature='transformedValue', dataType="double", isFinalResult=True, Apply=pml.Apply( function="+", Constant=[pml.Constant( dataType="double", valueOf_="{:.16f}".format(model.init_.priors[estm_idx]) )], Apply_member=[pml.Apply( function="*", Constant=[pml.Constant( dataType="double", valueOf_="{:.16f}".format(model.learning_rate) )], FieldRef=[pml.FieldRef( field="decisionFunction(" + str(estm_idx) + ")", )] )] ) ) ) out_field_names.append('transformedDecisionFunction(' + str(estm_idx) + ')') segments.append( pml.Segment( True_=pml.True_(), id=str(estm_idx), MiningModel=pml.MiningModel( functionName='regression', modelName="MiningModel", MiningSchema=miningschema_for_first, Output=pml.Output(OutputField=output_fields), Segmentation=pml.Segmentation( multipleModelMethod="sum", Segment=get_inner_segments(model, derived_col_names, col_names, estm_idx) ) ) ) ) reg_model = get_regrs_models(model, out_field_names,out_field_names, target_name, mining_imp_val, categoric_values,tasktype)[0] reg_model.Output = None if len(model.classes_) == 2: reg_model.normalizationMethod="logit" else: reg_model.normalizationMethod="softmax" segments.append( pml.Segment( id=str(len(model.estimators_[0])), True_=pml.True_(), RegressionModel=reg_model ) ) return segments def get_inner_segments(model, derived_col_names, col_names, index): """ It returns the Inner segments of the model. Parameters ---------- model : A Scikit-learn model instance. derived_col_names : List Contains column names after preprocessing. col_names : List Contains list of feature/column names. index : Integer The index of the estimator for the model Returns ------- segments : List Get the Segments for the Segmentation element. 
""" segments = list() for estm_idx in range(model.n_estimators): if np.asanyarray(model.estimators_).ndim == 1: estm = model.estimators_[estm_idx] else: estm = model.estimators_[estm_idx][index] tree_features = estm.tree_.feature features_ = list() for feat in tree_features: if feat != -2 and feat not in features_: features_.append(feat) if len(features_) != 0: mining_fields = list() # for feat in col_names: feature_importances = estm.tree_.compute_feature_importances() for idx,imp_ in enumerate(feature_importances): if imp_ > 0: # mining_fields.append(pml.MiningField(name=feat)) mining_fields.append(pml.MiningField(name=col_names[idx])) segments.append( pml.Segment( True_=pml.True_(), id=str(estm_idx), TreeModel=pml.TreeModel( modelName=estm.__class__.__name__, functionName=get_mining_func(estm), splitCharacteristic="multiSplit", MiningSchema=pml.MiningSchema(MiningField = mining_fields), Node=get_node(estm, derived_col_names, model) ) ) ) return segments def get_classificationMethod(model): """ It returns the Classification Model name of the model. Parameters ---------- model : A Scikit-learn model instance. Returns ------- Returns the classification method of the SVM model """ if model.__class__.__name__ == 'SVC': return 'OneAgainstOne' else: return 'OneAgainstAll' def get_vectorDictionary(model, derived_col_names, categoric_values): """ It return the Vector Dictionary element. Parameters ---------- model : A Scikit-learn model instance. derived_col_names : List Contains column names after preprocessing. categoric_values : tuple Contains Categorical attribute names and its values Returns ------- VectorDictionary : A Vector Dictionary instance. 
""" model_coef = model.C fieldref_element = get_vectorfields(model_coef, derived_col_names, categoric_values) vectorfields_element = pml.VectorFields(FieldRef=fieldref_element) vec_id = list(model.support_) vecinsts = list() vecs = list(model.support_vectors_) if model.support_vectors_.__class__.__name__ != 'csr_matrix': for vec_idx in range(len(vecs)): vecinsts.append(pml.VectorInstance( id=vec_id[vec_idx], REAL_SparseArray=pml.REAL_SparseArray( n=len(fieldref_element), Indices=([x for x in range(1, len(vecs[vec_idx]) + 1)]), REAL_Entries=vecs[vec_idx].tolist() ) )) else: for vec_idx in range(len(vecs)): vecinsts.append(pml.VectorInstance( id=vec_id[vec_idx], REAL_SparseArray=pml.REAL_SparseArray( n=len(fieldref_element), Indices=([x for x in range(1, len(vecs[vec_idx].todense().tolist()[0]) + 1)]), REAL_Entries=vecs[vec_idx].todense().tolist()[0] ) )) vd=pml.VectorDictionary(VectorFields=vectorfields_element, VectorInstance=vecinsts) return vd def get_vectorfields(model_coef, feat_names, categoric_values): """ It return the Vector Fields . Parameters ---------- model : A Scikit-learn model instance. derived_col_names : List Contains column names after preprocessing. categoric_values : tuple Contains Categorical attribute names and its values Returns ------- Returns the Vector Dictionary instance for Support Vector model. 
""" der_fld_len = len(feat_names) der_fld_idx = 0 row_idx = -1 predictors = list() if categoric_values: class_lbls = categoric_values[0] class_attribute = categoric_values[1] while der_fld_idx < der_fld_len: if is_labelbinarizer(feat_names[der_fld_idx]): if not is_stdscaler(feat_names[der_fld_idx]): class_id = get_classid(class_attribute, feat_names[der_fld_idx]) cat_predictors = get_categoric_pred(feat_names[der_fld_idx],row_idx, der_fld_idx, model_coef, class_lbls[class_id], class_attribute[class_id]) for predictor in cat_predictors: predictors.append(predictor) if len(class_lbls[class_id]) == 2: incrementor = 1 else: incrementor = len(class_lbls[class_id]) der_fld_idx = der_fld_idx + incrementor else: vectorfields_element = pml.FieldRef(field=feat_names[der_fld_idx]) predictors.append(vectorfields_element) der_fld_idx += 1 elif is_onehotencoder(feat_names[der_fld_idx]): if not is_stdscaler(feat_names[der_fld_idx]): class_id = get_classid(class_attribute, feat_names[der_fld_idx]) cat_predictors = get_categoric_pred(feat_names[der_fld_idx],row_idx, der_fld_idx, model_coef, class_lbls[class_id], class_attribute[class_id]) for predictor in cat_predictors: predictors.append(predictor) incrementor = len(class_lbls[class_id]) der_fld_idx = der_fld_idx + incrementor else: vectorfields_element = pml.FieldRef(field=feat_names[der_fld_idx]) predictors.append(vectorfields_element) der_fld_idx += 1 else: vectorfields_element = pml.FieldRef(field=feat_names[der_fld_idx]) predictors.append(vectorfields_element) der_fld_idx += 1 return predictors def is_onehotencoder(feat_name): """ Parameters ---------- feat_name : string Contains the name of the attribute Returns ------- Returns a boolean value that states whether OneHotEncoder has been applied or not """ if "oneHotEncoder" in feat_name: return True else: return False def get_kernel_type(model): """ It returns the kernel type element. Parameters ---------- model : A Scikit-learn model instance. 
    Returns
    -------
    kernel_kwargs : Dictionary
        Get the respective kernel type of the SVM model.
    """
    kernel_kwargs = dict()
    if model.kernel == 'linear':
        kernel_kwargs['LinearKernelType'] = pml.LinearKernelType(description='Linear Kernel Type')
    elif model.kernel == 'poly':
        kernel_kwargs['PolynomialKernelType'] = pml.PolynomialKernelType(
            description='Polynomial Kernel type',
            gamma="{:.16f}".format(model._gamma),
            coef0="{:.16f}".format(model.coef0),
            degree=model.degree
        )
    elif model.kernel == 'rbf':
        kernel_kwargs['RadialBasisKernelType'] = pml.RadialBasisKernelType(
            description='Radial Basis Kernel Type',
            gamma="{:.16f}".format(model._gamma)
        )
    elif model.kernel == 'sigmoid':
        kernel_kwargs['SigmoidKernelType'] = pml.SigmoidKernelType(
            description='Sigmoid Kernel Type',
            gamma="{:.16f}".format(model._gamma),
            coef0="{:.16f}".format(model.coef0)
        )
    else:
        raise NotImplementedError("{} kernel is not implemented!".format(model.kernel))
    return kernel_kwargs


def get_supportVectorMachine(model):
    """
    It returns the Support Vector Machine element.

    Parameters
    ----------
    model :
        A Scikit-learn model instance.
Returns ------- support_vector_machines : List Get the Support Vector Machine element which conatains targetCategory, alternateTargetCategory, SupportVectors, Coefficients """ support_vector_machines = list() if model.__class__.__name__ in ['SVR','OneClassSVM']: support_vector = list() for sv in model.support_: support_vector.append(pml.SupportVector(vectorId=sv)) support_vectors = pml.SupportVectors(SupportVector=support_vector) coefficient = list() absoValue = model.intercept_[0] if model.dual_coef_.__class__.__name__ != 'csr_matrix': for coef in model.dual_coef_: for num in coef: coefficient.append(pml.Coefficient(value="{:.16f}".format(num))) else: dual_coefficent=model.dual_coef_.data for num in dual_coefficent: coefficient.append(pml.Coefficient(value="{:.16f}".format(num))) coeff = pml.Coefficients(absoluteValue=absoValue, Coefficient=coefficient) support_vector_machines.append(pml.SupportVectorMachine(SupportVectors=support_vectors, Coefficients=coeff)) else: support_vector_locs = np.cumsum(np.hstack([[0], model.n_support_])) n_class = model.dual_coef_.shape[0] + 1 coef_abs_val_index = 0 for class1 in range(n_class): sv1 = model.support_[support_vector_locs[class1]:support_vector_locs[class1 + 1]] for class2 in range(class1 + 1, n_class): svs = list() coefs = list() sv2 = model.support_[support_vector_locs[class2]:support_vector_locs[class2 + 1]] svs.append((list(sv1) + list(sv2))) alpha1 = model.dual_coef_[class2 - 1, support_vector_locs[class1]:support_vector_locs[class1 + 1]] alpha2 = model.dual_coef_[class1, support_vector_locs[class2]:support_vector_locs[class2 + 1]] coefs.append((list(alpha1) + list(alpha2))) all_svs = list() for sv in (svs[0]): all_svs.append(pml.SupportVector(vectorId=sv)) all_coefs = list() for coef in (coefs[0]): all_coefs.append(pml.Coefficient(value="{:.16f}".format(coef))) coef_abs_value = model.intercept_[coef_abs_val_index] coef_abs_val_index += 1 if len(model.classes_) == 2: support_vector_machines.append( 
pml.SupportVectorMachine( targetCategory=model.classes_[class1], alternateTargetCategory=model.classes_[class2], SupportVectors=pml.SupportVectors(SupportVector=all_svs), Coefficients=pml.Coefficients(absoluteValue="{:.16f}".format(coef_abs_value), Coefficient=all_coefs) ) ) else: support_vector_machines.append( pml.SupportVectorMachine( targetCategory=model.classes_[class2], alternateTargetCategory=model.classes_[class1], SupportVectors=pml.SupportVectors(SupportVector=all_svs), Coefficients=pml.Coefficients(absoluteValue="{:.16f}".format(coef_abs_value), Coefficient=all_coefs) ) ) return support_vector_machines def get_tree_models(model, derived_col_names, col_names, target_name, mining_imp_val,categoric_values,tasktype): """ It return Tree Model element of the model Parameters ---------- model : A Scikit-learn model instance. derived_col_names : Contains column names after preprocessing. col_names : List Contains list of feature/column names. target_name : String Name of the Target column. mining_imp_val : tuple Contains the mining_attributes,mining_strategy, mining_impute_value Returns ------- tree_models : List Get the TreeModel element. """ model_kwargs = get_model_kwargs(model, col_names, target_name, mining_imp_val,categoric_values) tree_models = list() tree_models.append(pml.TreeModel( modelName=model.__class__.__name__, Node=get_node(model, derived_col_names), taskType=tasktype, **model_kwargs )) return tree_models def get_neural_models(model, derived_col_names, col_names, target_name, mining_imp_val, categoric_values,tasktype): """ It returns Neural Network element of the model. Parameters ---------- model : A Scikit-learn model instance. derived_col_names : List Contains column names after preprocessing. col_names : List Contains list of feature/column names. target_name : String Name of the Target column. mining_imp_val : tuple Contains the mining_attributes,mining_strategy, mining_impute_value. 
    Returns
    -------
    neural_model : List
        Model attributes for PMML file.
    """
    model_kwargs = get_model_kwargs(model, col_names, target_name, mining_imp_val, categoric_values)
    neural_layers, neural_outputs = get_neural_layer(model, derived_col_names, target_name)
    neural_model = list()
    neural_model.append(pml.NeuralNetwork(
        modelName=model.__class__.__name__,
        threshold='0',
        altitude='1.0',
        activationFunction=get_funct(model),
        NeuralInputs=get_neuron_input(derived_col_names),
        NeuralLayer=neural_layers,
        NeuralOutputs=neural_outputs,
        **model_kwargs
    ))
    return neural_model


def get_funct(sk_model):
    """
    It returns the activation function of the model.

    Parameters
    ----------
    sk_model :
        A Scikit-learn model instance.

    Returns
    -------
    a_fn : String
        Returns the activation function.
    """
    a_fn = sk_model.activation
    if a_fn == 'relu':
        a_fn = 'rectifier'
    return a_fn


def get_regrs_models(model, derived_col_names, col_names, target_name, mining_imp_val,
                     categoric_values, tasktype):
    """
    It returns the Regression Model element of the model.

    Parameters
    ----------
    model :
        A Scikit-learn model instance.
    derived_col_names : List
        Contains column names after preprocessing.
    col_names : List
        Contains list of feature/column names.
    target_name : String
        Name of the Target column.
    mining_imp_val : tuple
        Contains the mining_attributes, mining_strategy, mining_impute_value.
    categoric_values : tuple
        Contains categorical attribute names and their values.

    Returns
    -------
    regrs_models : List
        Returns a regression model of the respective model.
    """
    model_kwargs = get_model_kwargs(model, col_names, target_name, mining_imp_val, categoric_values)
    if model.__class__.__name__ not in ['LinearRegression', 'LinearSVR']:
        model_kwargs['normalizationMethod'] = 'logit'
    regrs_models = list()
    regrs_models.append(pml.RegressionModel(
        modelName=model.__class__.__name__,
        RegressionTable=get_regrs_tabl(model, derived_col_names, target_name, categoric_values),
        taskType=tasktype,
        **model_kwargs
    ))
    return regrs_models


def get_regrs_tabl(model, feature_names, target_name, categoric_values):
    """
    It returns the Regression Table element of the model.

    Parameters
    ----------
    model :
        A Scikit-learn model instance.
    feature_names : List
        Contains column names after preprocessing.
    target_name : String
        Name of the Target column.
    categoric_values : tuple
        Contains categorical attribute names and their values.

    Returns
    -------
    merge : List
        Returns a list of Regression Table.
""" merge = list() if hasattr(model, 'intercept_'): func_name = get_mining_func(model) inter = model.intercept_ model_coef = model.coef_ merge = list() target_classes = target_name row_idx = 0 if not hasattr(inter, '__iter__') or model.__class__.__name__ in ['LinearRegression','LinearSVR']: inter = np.array([inter]) target_classes = [target_classes] model_coef = np.ravel(model_coef) model_coef = model_coef.reshape(1, model_coef.shape[0]) target_cat = None else: target_classes = model.classes_ max_target_index = len(target_classes) - 1 target_cat = target_classes[max_target_index] if len(inter) == 1: regr_predictor = get_regr_predictors(model_coef, row_idx, feature_names, categoric_values) merge.append( pml.RegressionTable( intercept="{:.16f}".format(inter.item()), targetCategory=target_cat, NumericPredictor=regr_predictor ) ) if func_name != 'regression': merge.append( pml.RegressionTable( intercept="0.0", targetCategory=target_classes[0] ) ) else: for tgname, tg_idx in zip(np.unique(target_classes), range(len(np.unique(target_classes)))): row_idx = tg_idx regr_predictors = get_regr_predictors(model_coef, row_idx, feature_names, categoric_values) merge.append( pml.RegressionTable( intercept="{:.16f}".format(inter[tg_idx]), targetCategory=tgname, NumericPredictor=regr_predictors ) ) else: if len(model.classes_) == 2: merge.append( pml.RegressionTable( NumericPredictor=[pml.NumericPredictor(coefficient='1.0',name=feature_names[0])], intercept='0.0', targetCategory=str(model.classes_[-1]) ) ) merge.append( pml.RegressionTable(intercept='0.0', targetCategory=str(model.classes_[0])) ) else: for feat_idx in range(len(feature_names)): merge.append( pml.RegressionTable( NumericPredictor=[pml.NumericPredictor(coefficient='1.0',name=feature_names[feat_idx])], intercept='0.0', targetCategory=str(model.classes_[feat_idx]) ) ) return merge def get_node(model, features_names, main_model=None): """ It return the Node element of the model. 
Parameters ---------- model : An instance of the estimator of the tree object. features_names : List Contains the list of feature/column name. main_model : A Scikit-learn model instance. Returns ------- _getNode : Get all the underlying Nodes. """ tree = model.tree_ node_samples = tree.n_node_samples if main_model and main_model.__class__.__name__ == 'RandomForestClassifier': classes = main_model.classes_ elif hasattr(model,'classes_'): classes = model.classes_ tree_leaf = -1 def _getNode(idx,parent=None, cond=None): simple_pred_cond = None if cond: simple_pred_cond = cond node = pml.Node(id=idx, recordCount=float(tree.n_node_samples[idx])) if simple_pred_cond: node.SimplePredicate = simple_pred_cond else: node.True_ = pml.True_() if tree.children_left[idx] != tree_leaf: fieldName = features_names[tree.feature[idx]] prnt = None if model.__class__.__name__ == "ExtraTreeRegressor": prnt = parent + 1 simplePredicate = pml.SimplePredicate(field=fieldName, operator="lessOrEqual", value="{:.16f}".format(tree.threshold[idx])) left_child = _getNode(tree.children_left[idx],prnt, simplePredicate) simplePredicate = pml.SimplePredicate(field=fieldName, operator="greaterThan", value="{:.16f}".format(tree.threshold[idx])) right_child = _getNode(tree.children_right[idx],prnt, simplePredicate) node.add_Node(left_child) node.add_Node(right_child) else: nodeValue = list(tree.value[idx][0]) lSum = float(sum(nodeValue)) if model.__class__.__name__ == 'DecisionTreeClassifier': probs = [x / lSum for x in nodeValue] score_dst = [] for i in range(len(probs)): score_dst.append(pml.ScoreDistribution(confidence=probs[i], recordCount=float(nodeValue[i]), value=classes[i])) node.ScoreDistribution = score_dst node.score = classes[probs.index(max(probs))] else: if model.__class__.__name__ == "ExtraTreeRegressor": nd_sam=node_samples[int(idx)] node.score = "{:.16f}".format(parent+avgPathLength(nd_sam)) else: node.score="{:.16f}".format(lSum) return node if model.__class__.__name__ == 
"ExtraTreeRegressor": return _getNode(0,0) else: return _getNode(0) def avgPathLength(n): if n<=1.0: return 1.0 return 2.0*(math.log(n-1.0)+0.57721566) - 2.0*((n-1.0)/n) def get_output(model, target_name): """ It returns the output element of the model. Parameters ---------- model : A Scikit-learn model instance. target_name : String Name of the Target column. Returns ------- Output : Get the Output element. """ mining_func = get_mining_func(model) output_fields = list() if not has_target(model): output_fields.append(pml.OutputField( name='predicted', feature="predictedValue", optype="categorical", dataType="double" )) else: alt_target_name = 'predicted_' + target_name if mining_func == 'classification': for cls in model.classes_: output_fields.append(pml.OutputField( name='probability_' + str(cls), feature="probability", optype="continuous", dataType="double", value=str(cls) )) output_fields.append(pml.OutputField( name=alt_target_name, feature="predictedValue", optype="categorical", dataType="string")) else: output_fields.append(pml.OutputField( name=alt_target_name, feature="predictedValue", optype="continuous", dataType="double")) return pml.Output(OutputField=output_fields) def get_mining_func(model): """ It returns the name of the mining function of the model. Parameters ---------- model : A Scikit-learn model instance. Returns ------- func_name : String Returns the function name of the model """ if not hasattr(model, 'classes_'): if hasattr(model,'n_clusters'): func_name = 'clustering' else: func_name = 'regression' else: if isinstance(model.classes_, np.ndarray): func_name = 'classification' else: func_name = 'regression' return func_name def get_mining_schema(model, feature_names, target_name, mining_imp_val, categoric_values): """ It returns the Mining Schema of the model. Parameters ---------- model : A Scikit-learn model instance. feature_names : List Contains the list of feature/column name. target_name : String Name of the Target column. 
mining_imp_val : tuple Contains the mining_attributes,mining_strategy, mining_impute_value. Returns ------- MiningSchema : Get the MiningSchema element """ if mining_imp_val: mining_attributes = mining_imp_val[0] mining_strategy = mining_imp_val[1] mining_replacement_val = mining_imp_val[2] n_features = len(feature_names) features_pmml_optype = ['continuous'] * n_features features_pmml_utype = ['active'] * n_features target_pmml_utype = 'target' mining_func = get_mining_func(model) if mining_func == 'classification': target_pmml_optype = 'categorical' elif mining_func == 'regression': target_pmml_optype = 'continuous' mining_flds = list() mining_name_stored = list() # handling impute pre processing if mining_imp_val: for mining_item, mining_idx in zip(mining_attributes, range(len(mining_attributes))): for feat_name,feat_idx in zip(feature_names, range(len(feature_names))): if feat_name in mining_item: if feat_name not in mining_name_stored: impute_index = mining_item.index(feat_name) mining_flds.append(pml.MiningField(name=str(feat_name), optype=features_pmml_optype[feat_idx], missingValueReplacement=mining_replacement_val[mining_idx][ impute_index], missingValueTreatment=mining_strategy[mining_idx], usageType=features_pmml_utype[feat_idx])) mining_name_stored.append(feat_name) if len(categoric_values) > 0: for cls_attr in categoric_values[1]: mining_flds.append(pml.MiningField( name=cls_attr, usageType='active', optype='categorical' )) mining_name_stored.append(cls_attr) for feat_name, feat_idx in zip(feature_names, range(len(feature_names))): if feat_name not in mining_name_stored: mining_flds.append(pml.MiningField(name=str(feat_name), optype=features_pmml_optype[feat_idx], usageType=features_pmml_utype[feat_idx])) if has_target(model): mining_flds.append(pml.MiningField(name=target_name, optype=target_pmml_optype, usageType=target_pmml_utype)) return pml.MiningSchema(MiningField=mining_flds) def get_neuron_input(feature_names): """ It returns the Neural Input 
element. Parameters ---------- feature_names : List Contains the list of feature/column name. Returns ------- neural_input_element : Returns the NeuralInputs element """ neural_input = list() for features in feature_names: field_ref = pml.FieldRef(field = str(features)) derived_flds = pml.DerivedField(optype = "continuous", dataType = "double", FieldRef = field_ref) class_node = pml.NeuralInput(id = str(features), DerivedField = derived_flds) neural_input.append(class_node) neural_input_element = pml.NeuralInputs(NeuralInput = neural_input, numberOfInputs = str(len(neural_input))) return neural_input_element def get_neural_layer(model, feature_names, target_name): """ It returns the Neural Layer and Neural Ouptput element. Parameters ---------- model : A Scikit-learn model instance. feature_names : List Contains the list of feature/column name. target_name : String Name of the Target column. Returns ------- all_neuron_layer : List Return the list of NeuralLayer elelemt. neural_output_element : Return the NeuralOutput element instance """ weight = model.coefs_ bias = model.intercepts_ last_layer = bias[-1] hidden_layer_sizes = model.hidden_layer_sizes hidden_layers = list(hidden_layer_sizes) hidden_layers.append(len(last_layer)) neuron = list() all_neuron_layer = list() input_features = feature_names neuron_id = list() for count in range(len(hidden_layers)): for count1 in range(hidden_layers[count]): con = list() for count2 in range(len(input_features)): con.append(pml.Con(from_ = input_features[count2], weight = format(weight[count][count2][count1]))) neuron.append(pml.Neuron(id = str(count)+str(count1), bias = format(bias[count][count1]),Con = con)) neuron_id.append(str(count)+str(count1)) all_neuron_layer.append(pml.NeuralLayer(Neuron = neuron)) input_features = neuron_id neuron_id = list() neuron = list() if hidden_layers[-1]==1 and 'MLPClassifier' in str(model.__class__): bias1=[1.0,0.0] weight1=[-1.0,1.0] con = list() linear = ['linear/1'] i_d = ['true', 
'false'] con.append(pml.Con(from_ = input_features[0], weight = 1.0)) neuron.append(pml.Neuron(id = linear[0], bias = ('0.0'), Con = con)) all_neuron_layer.append(pml.NeuralLayer(activationFunction = "logistic", Neuron = neuron)) neuron = list() con = list() for num in range(2): con.append(pml.Con(from_ = linear[0], weight = format(weight1[num]))) neuron.append(pml.Neuron(id = i_d[num], bias = format(bias1[num]), Con = con)) con = list() all_neuron_layer.append(pml.NeuralLayer(activationFunction = "identity", Neuron = neuron)) if 'MLPClassifier' in str(model.__class__): neural_output = list() for values, count in zip(model.classes_, range(len(model.classes_))): norm_discrete = pml.NormDiscrete(field = target_name, value = str(values)) derived_flds = pml.DerivedField(optype = "categorical", dataType = 'double', NormDiscrete = norm_discrete) if len(input_features)==1: class_node = pml.NeuralOutput(outputNeuron = input_features, DerivedField = derived_flds) else: class_node = pml.NeuralOutput(outputNeuron = input_features[count],DerivedField = derived_flds) neural_output.append(class_node) neural_output_element = pml.NeuralOutputs(numberOfOutputs = None, Extension = None, NeuralOutput = neural_output) if 'MLPRegressor' in str(model.__class__): neural_output = list() fieldRef = pml.FieldRef(field = target_name) derived_flds = pml.DerivedField(optype = "continuous", dataType = "double", FieldRef = fieldRef) class_node = pml.NeuralOutput(outputNeuron = input_features, DerivedField = derived_flds) neural_output.append(class_node) neural_output_element = pml.NeuralOutputs(numberOfOutputs = None, Extension = None, NeuralOutput = neural_output) return all_neuron_layer, neural_output_element def get_super_cls_names(model_inst): """ It returns the set of Super class of the model. Parameters: ------- model_inst: Instance of the scikit-learn model Returns ------- parents : Set Returns all the parent class of the model instance. 
""" def super_cls_names(cls): nonlocal parents parents.add(cls.__name__) for super_cls in cls.__bases__: super_cls_names(super_cls) cls = model_inst.__class__ parents = set() super_cls_names(cls) return parents def get_version(): """ It returns the pmml version . Returns ------- version : String Returns the version of the pmml. """ version = '4.4' return version def get_header(): """ It returns the Header element of the pmml. Returns ------- header : Returns the header of the pmml. """ copyryt = "Copyright (c) 2019 Software AG" description = "Default Description" timestamp = pml.Timestamp(datetime.now()) application=pml.Application(name="Nyoka",version=metadata.__version__) header = pml.Header(copyright=copyryt, description=description, Timestamp=timestamp, Application=application) return header def get_dtype(feat_value): """ It return the data type of the value. Parameters ---------- feat_value : Contains a value for finding the its data type. Returns ------- Returns the respective data type of that value. """ data_type=str(type(feat_value)) if 'float' in data_type: return 'float' if 'int' in data_type: return 'integer' if 'long' in data_type: return 'long' if 'complex' in data_type: return 'complex' if 'str' in data_type: return 'string' def get_data_dictionary(model, feature_names, target_name, categoric_values=None): """ It returns the Data Dictionary element. Parameters ---------- model : A Scikit-learn model instance. feature_names : List Contains the list of feature/column name. target_name : List Name of the Target column. 
categoric_values : tuple Contains Categorical attribute names and its values Returns ------- data_dict : Return the dataDictionary instance """ categoric_feature_name = list() if categoric_values: categoric_labels = categoric_values[0] categoric_feature_name = categoric_values[1] target_attr_values = [] n_features = len(feature_names) features_pmml_optype = ['continuous'] * n_features features_pmml_dtype = ['double'] * n_features mining_func = get_mining_func(model) if mining_func == 'classification': target_pmml_optype = 'categorical' target_pmml_dtype = get_dtype(model.classes_[0]) target_attr_values = model.classes_.tolist() elif mining_func == 'regression': target_pmml_optype = 'continuous' target_pmml_dtype = 'double' data_fields = list() if categoric_values: for class_list, attr_for_class in zip(categoric_labels, categoric_feature_name): category_flds = pml.DataField(name=str(attr_for_class), optype="categorical", dataType=get_dtype(class_list[0]) if class_list else 'string') if class_list: for values in class_list: category_flds.add_Value(pml.Value(value=str(values))) data_fields.append(category_flds) attr_without_class_attr = [feat_name for feat_name in feature_names if feat_name not in categoric_feature_name] for feature_idx, feat_name in enumerate(attr_without_class_attr): data_fields.append(pml.DataField(name=str(feat_name), optype=features_pmml_optype[feature_idx], dataType=features_pmml_dtype[feature_idx])) if has_target(model): class_node = pml.DataField(name=str(target_name), optype=target_pmml_optype, dataType=target_pmml_dtype) for class_value in target_attr_values: class_node.add_Value(pml.Value(value=str(class_value))) data_fields.append(class_node) data_dict = pml.DataDictionary(numberOfFields=len(data_fields), DataField=data_fields) return data_dict def has_target(model): target_less_models = ['KMeans','OneClassSVM','IsolationForest', ] if model.__class__.__name__ in target_less_models: return False else: return True def 
get_regr_predictors(model_coef, row_idx, feat_names, categoric_values): """ Parameters ---------- model_coef : array Contains the estimators coefficient values row_idx : int Contains an integer value to differentiate between linear and svm models feat_names : list Contains the list of feature/column names categoric_values : tuple Contains Categorical attribute names and its values Returns ------- predictors : list Returns a list with instances of nyoka numeric/categorical predictor class """ der_fld_len = len(feat_names) der_fld_idx = 0 predictors = list() if categoric_values: class_lbls = categoric_values[0] class_attribute = categoric_values[1] while der_fld_idx < der_fld_len: if is_labelbinarizer(feat_names[der_fld_idx]): if not is_stdscaler(feat_names[der_fld_idx]): class_id = get_classid(class_attribute, feat_names[der_fld_idx]) cat_predictors = get_categoric_pred(feat_names[der_fld_idx], row_idx, der_fld_idx, model_coef, class_lbls[class_id], class_attribute[class_id]) for predictor in cat_predictors: predictors.append(predictor) if len(class_lbls[class_id]) == 2: incrementor = 1 else: incrementor = len(class_lbls[class_id]) der_fld_idx = der_fld_idx + incrementor else: num_predictors = get_numeric_pred(row_idx, der_fld_idx, model_coef, feat_names[der_fld_idx]) predictors.append(num_predictors) der_fld_idx += 1 elif is_onehotencoder(feat_names[der_fld_idx]): if not is_stdscaler(feat_names[der_fld_idx]): class_id = get_classid(class_attribute, feat_names[der_fld_idx]) cat_predictors = get_categoric_pred(feat_names[der_fld_idx], row_idx, der_fld_idx, model_coef, class_lbls[class_id], class_attribute[class_id]) for predictor in cat_predictors: predictors.append(predictor) incrementor = len(class_lbls[class_id]) der_fld_idx = der_fld_idx + incrementor else: vectorfields_element = pml.FieldRef(field=feat_names[der_fld_idx]) predictors.append(vectorfields_element) der_fld_idx += 1 else: num_predictors = get_numeric_pred(row_idx, der_fld_idx, model_coef, 
feat_names[der_fld_idx]) predictors.append(num_predictors) der_fld_idx += 1 return predictors def get_classid(class_attribute, feat_name): """ Parameters ---------- class_attribute: Contains the name of the attribute/column that contains categorical values feat_name : string Contains the name of the attribute/column Returns ------- class_idx:int Returns an integer value that will represent each categorical value """ for class_idx,class_attr in enumerate(class_attribute): if class_attr in feat_name: return class_idx def is_labelbinarizer(feat_name): """ Parameters ---------- feat_name : string Contains the name of the attribute Returns ------- Returns a boolean value that states whether label binarizer has been applied or not """ if "labelBinarizer" in feat_name or "one_hot_encoder" in feat_name: return True else: return False def is_stdscaler(feat_name): """ Parameters ---------- feat_name : string Contains the name of the attribute Returns ------- Returns a boolean value that states whether standard scaler has been applied or not """ if "standardScaler" in feat_name: return True else: return False def get_categoric_pred(feat_names,row_idx, der_fld_idx, model_coef, class_lbls, class_attribute): """ Parameters ---------- feat_names : str Contains the name of the field row_idx : int Contains an integer value to index attribute/column names der_fld_idx : int Contains an integer value to differentiate between linear and svm models model_coef : array Contains the estimators coefficient values class_lbls : list Contains the list of categorical values class_attribute : tuple Contains Categorical attribute name Returns ------- categoric_predictor : list Returns a list with instances of nyoka categorical predictor class """ categoric_predictor = list() classes_len = len(class_lbls) if not is_onehotencoder(feat_names): if classes_len == 2: if row_idx == -1: coef = model_coef else: coef = model_coef[row_idx][der_fld_idx ] cat_pred = 
pml.CategoricalPredictor(name=class_attribute, value=class_lbls[-1], coefficient="{:.16f}".format(coef)) cat_pred.original_tagname_ = "CategoricalPredictor" categoric_predictor.append(cat_pred) else: for cname, class_idx in zip(class_lbls, range(len(class_lbls))): if row_idx == -1: coef = model_coef else: coef = model_coef[row_idx][der_fld_idx+class_idx] cat_pred = pml.CategoricalPredictor(name=class_attribute, value=cname, coefficient="{:.16f}".format(coef)) cat_pred.original_tagname_ = "CategoricalPredictor" categoric_predictor.append(cat_pred) else: for cname, class_idx in zip(class_lbls, range(len(class_lbls))): if row_idx == -1: coef = model_coef else: coef = model_coef[row_idx][der_fld_idx + class_idx] cat_pred = pml.CategoricalPredictor(name=class_attribute, value=cname, coefficient="{:.16f}".format(coef)) cat_pred.original_tagname_ = "CategoricalPredictor" categoric_predictor.append(cat_pred) return categoric_predictor def get_numeric_pred(row_idx, der_fld_idx, model_coef, der_fld_name): """ Parameters ---------- row_idx : int Contains an integer value to index attribute/column names der_fld_idx : int Contains an integer value to differentiate between linear and svm models model_coef : array Contains the estimators coefficient values der_fld_name : string Contains the name of the attribute Returns ------- num_pred : Returns an instances of nyoka numeric predictor class """ num_pred = pml.NumericPredictor( name=der_fld_name, exponent='1', coefficient="{:.16f}".format(model_coef[row_idx][der_fld_idx])) num_pred.original_tagname_ = "NumericPredictor" return num_pred
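# The ExtraTreeRegressor scoring branch in get_node() adds avgPathLength(n) to the
# parent depth, i.e. the Isolation-Forest expected average path length
#   c(n) = 2 * (ln(n - 1) + gamma) - 2 * (n - 1) / n,  gamma ~ 0.5772 (Euler-Mascheroni).
# A minimal standalone sketch of that formula (the name `avg_path_length` is
# illustrative; it mirrors the avgPathLength helper defined above):

```python
import math

def avg_path_length(n):
    # Expected average path length of an unsuccessful BST search over n samples;
    # degenerate nodes (n <= 1) are given length 1.0, matching avgPathLength above.
    if n <= 1.0:
        return 1.0
    return 2.0 * (math.log(n - 1.0) + 0.57721566) - 2.0 * ((n - 1.0) / n)

print(avg_path_length(1))                           # 1.0
print(avg_path_length(256) > avg_path_length(16))   # grows logarithmically: True
```

# The normalisation grows logarithmically with the node's sample count, which is
# why larger leaves contribute longer expected paths to the anomaly score.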
python
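The name-inspection helpers in the Python entry above are pure functions, so their behavior can be sanity-checked in isolation. A minimal sketch (the two helpers are restated verbatim here so the snippet runs on its own; the argument values are made-up examples, not from the original module):

```python
# Restated verbatim from the module above so this snippet is self-contained.
def get_classid(class_attribute, feat_name):
    # Return the index of the first categorical attribute whose name
    # occurs inside the derived field name.
    for class_idx, class_attr in enumerate(class_attribute):
        if class_attr in feat_name:
            return class_idx

def is_labelbinarizer(feat_name):
    # A derived field produced by LabelBinarizer/OneHotEncoder carries
    # the transformer name inside its field name.
    return "labelBinarizer" in feat_name or "one_hot_encoder" in feat_name

print(get_classid(["color", "size"], "labelBinarizer(size)"))  # -> 1
print(is_labelbinarizer("one_hot_encoder(color)"))             # -> True
print(is_labelbinarizer("standardScaler(age)"))                # -> False
```

Note that `get_classid` relies on substring matching, so attribute names that are substrings of one another ("age" vs. "mileage") would match the first hit; the original module appears to accept that trade-off.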
<filename>data/THIN/conditions/condition_4058345.json {"PREVALENCE_BY_GENDER_AGE_YEAR":{"TRELLIS_NAME":[],"SERIES_NAME":[],"X_CALENDAR_YEAR":[],"Y_PREVALENCE_1000PP":[]},"PREVALENCE_BY_MONTH":{"X_CALENDAR_MONTH":[],"Y_PREVALENCE_1000PP":[]},"CONDITIONS_BY_TYPE":{"CONCEPT_NAME":"Observation recorded from EHR","COUNT_VALUE":32},"AGE_AT_FIRST_DIAGNOSIS":{"CATEGORY":["MALE","FEMALE"],"MIN_VALUE":[3,4],"P10_VALUE":[5,26],"P25_VALUE":[16,30],"MEDIAN_VALUE":[27,47],"P75_VALUE":[38,81],"P90_VALUE":[73,86],"MAX_VALUE":[83,93]}}
json
PM Narendra Modi arrived at the stadium with Gujarat CM Bhupendrabhai Patel. PM Narendra Modi presented a special cap to India captain Rohit Sharma ahead of the match. Narendra Modi stands in unison with Australian PM Anthony Albanese and captains Rohit Sharma and Steve Smith. Virat. Rohit. Modi. That's the post and that's the picture that has created the most buzz on social media. Narendra Modi singing the national anthem with Rohit and Virat is the best thing you'll see on the internet today. PM Narendra Modi and PM Anthony Albanese did a celebratory lap ahead of the 4th Test.
english
{ "id": 5581, "cites": 20, "cited_by": 143, "reference": [ "<NAME>, Monetary Targeting with Exchange Rate Constraints, Federal Reserve Bank of St. Louis Economic Review, September/October, 1989.", "<NAME>, The Practice of Monetary Targeting: A Case Study of the West German Experience, Federal Reserve Bank of San Francisco Economic Review, 30 - 44, Spring 1988.", "<NAME>, European Integration and Asymmetry in the EMS, Federal Reserve Bank of New York, mimeo, 1994.", "<NAME>, Does Monetary Policy Matter in the G-6 Countries, mimeo, Yale University, 1994.", "<NAME> and <NAME>, Liquidity and Exchange Rates, A Structural VAR Approach, mimeo, New York University, 1995.", "<NAME> and <NAME>, Searching for the Holy Grail: An Examination of German Money Demand After Unification, mimeo, Federal Reserve Board, 1994.", "Sims, <NAME>. and <NAME>, Does Monetary Policy Generate Recessions: Using Less Aggregate Price Data to Identify Monetary Policy, mimeo, Yale University. <NAME>, Discretion versus Policy Rules in Practice, Carnegie Rochester Conference on Public Policy, December, 1993, pp. 195-214. Tsatsaronis, Konstantinos, Bank Lending and the Monetary Transmission Mechanism: The Case of Germany, mimeo, 1993.", "<NAME>, A Comparison of Monetary Policy Operating Procedures in Six Industrialized Countries, Federal Reserve Bank of New York, mimeo, 1992.", "<NAME>., Stability of Monetary Policy, Stability of the Monetary System: Experience with Monetary Targeting in Germany, mimeo, 1995.", "Kahn, George and <NAME>, Lessons from West German Monetary Policy, Federal Reserve Bank of Kansas City Economic Review, 18 - 34, April 1989.", "<NAME>, <NAME>, and <NAME>, Central Bank Institutions and Policies, Economic Policy, October 1991.", "Goldman Sachs German Economic Commentary, various issues. Gordon, <NAME>. 
and <NAME>, The Dynamic Effects of Monetary Policy: An Exercise in Tentative Identification, Journal of Political Economy, 1994.", "<NAME>, How Well Does The IS/LM Model Fit Post-War U.S. Data? Quarterly Journal of Economics, 1992.", "<NAME> and <NAME>, Sources of Real Exchange Rate Fluctuations: How Important Are Nominal Shocks? Carnegie Rochester Conference on Public Policy, 1994.", "Eichenbaum, Martin, and <NAME>, Some Empirical Evidence on the Effects of Monetary Policy on Exchange Rates, mimeo, 1993.", "<NAME>. and <NAME>, A New Approach to the Decomposition of Time Series into Permanent and Transitory Components, Journal of Monetary Economics, 7, 151-174. Blanchard, <NAME>., A Traditional Interpretation of Economic Fluctuations, American Economic Review, LXXIX, 1989, 1146 - 1164. Christiano, <NAME>., <NAME> and <NAME>, The Effects of Monetary Policy Shocks: Evidence from the Flow of Funds, forthcoming, Review of Economics and Statistics. Deutsche Bundesbank: Policy Practices and Procedure, 1989.", "Bernanke, Ben, and <NAME>, Central Bank Behavior and the Strategy of Monetary Policy: Observations from Six Industrialized Countries: NBER Macroeconomics Annual, 1992.", "<NAME> and <NAME>, Measuring Monetary Policy, mimeo, Princeton University, June 1995.", "<NAME> and <NAME>, The Federal Funds Rate and the Transmission of Monetary Policy, American Economic Review 82, 901-921, 1992.", "<NAME>, What Determines the Sacrifice Ratio?, in Monetary Policy, edited by <NAME>, NBER, Chicago, 1994." ] }
json
But before any meaningful constitutional reform is possible, the Minnesota Chippewa Tribe body politic must confront the fact that the tribe has no practical measure of inherent sovereignty. In fact and in law, the tribe is a creation of the federal government under the Indian Reorganization Act of 1934 and can be abolished by the plenary authority of Congress at any time. The federal government, of course, can also insidiously accomplish much the same result simply by reducing, or eliminating, federal funding of tribal programs. Denying political reality does not foster constructive constitutional change. Robert A. Fairbanks, The Minnesota Chippewa Tribe in 1997: A new beginning, or the beginning of the end?, Native American Press, Dec. 27, 1996, at 6 (emphasis added). I recognize that two of eleven is a small sample from which to draw hard and fast conclusions. But I will also state, conversely, that if in a small, sparsely populated county in Minnesota, roughly 20% of the mayors and city councils of the cities in that county endured trials which determined that theft, fraud, kickbacks, corruption, and vote rigging had been in place in those cities for decades, it is obvious that the residents of those cities, the area legislative delegations, the state attorney general's office, and the state auditor's office would set their teeth and grimly resolve to determine how those conditions persisted for so long. You see, the guilty verdicts in those two sets of trials were not about a single act or two of mere theft or embezzlement. 
The guilty verdicts were not about in-state residents defrauding out-of-state strangers over the telephone. Rather, the evidence and the guilty counts showed a pattern of years and years of corruption, and the facts forming the foundation for the convictions proved beyond a reasonable doubt that the defendants had been stealing from their own people. Those convicted were and remain human beings, friends and relatives to the rest of the state. The unaccountability they took advantage of is a result of the currently held view of “sovereignty,” a view that denies to state officials the right to investigate, control, and regulate their own citizens living on the Minnesota reservations. All elected officials, state and federal, executive, legislative, and judicial, must shoulder their respective share of the blame stemming from the institutionalized neglect of Indian people. Due process and justice demand a major move toward formulating sound public policy to ensure this never happens again. We have stretched the law, contorted it, and tortured it to promote the view of “sovereignty” that tribal governments or reservation business councils want as “law.” It has taken us to depths that would not be fathomed for any other racial, ethnic, or social group in this country. The case of Cabazon Band of Mission Indians v. County of Riverside, 783 F.2d 900 (9th Cir. 1986), which needs to be read carefully, not just cited, and then carefully understood, is puzzling. Carried to its logical extreme, it could be renamed “The Indian Gambling and Reservation Nonaccountability Act.” Briefly, the Cabazon Band lives in California. Id. at 903. Unregulated gambling is a type of gaming the State of California does not offer its citizens.
english
[{"e_id":"4538","title":"CPPCC Vice Chairman Chen Zaidao passes away","content":"\n    On this day 28 years ago, April 6, 1993 (the 15th day of the 3rd lunar month), CPPCC Vice Chairman Chen Zaidao passed away.\n\n\tOn April 6, 1993, Chen Zaidao, Vice Chairman of the Sixth National Committee of the CPPCC, died in Beijing at the age of 84.\n\n\tChen Zaidao was born on January 24, 1909, into a poor peasant family in Chengjiachong, Xincun, Chengmagang township, Macheng, Hubei Province. In his youth, influenced by the high tide of the Great Revolution, he resolved to devote himself to the revolution and actively took part in and organized peasant associations. In 1927 he participated in the Huangma Uprising. He successively served as platoon leader and company commander in the 1st Division of the Red Fourth Army; battalion commander, regiment commander and division commander of the 11th Red Division; deputy commander and commander of the Red Fourth Army; deputy commander of the 386th Brigade and commander of the Independent Brigade of the 129th Division of the Eighth Route Army; commander of the Eastward Advance Column; commander of the southern Hebei (Jinan) Military Region; commander of the Jinan Column of the Shanxi-Hebei-Shandong-Henan Field Army; commander of the 2nd Column of the Central Plains Field Army; commander of the Henan Military Region; deputy director of the Armed Forces Supervision Department of the People's Liberation Army; commander of the Wuhan Military Region; and commander of the Railway Corps.\n\n\tChen Zaidao was awarded the rank of general in 1955. He received the First Class August 1st Medal, the First Class Independence and Freedom Medal, and the First Class Liberation Medal. In July 1988 he was awarded the PLA First Class Red Star Medal of Honor and Merit.","picNo":"1","picUrl":[{"pic_title":"On April 6, 1993, Chen Zaidao, Vice Chairman of the Sixth National Committee of the CPPCC, died in Beijing at the age of 84.","suffix":"jpeg","id":1,"url":"https://aimgs.oss-cn-shenzhen.aliyuncs.com/history_2021/4538_1.jpeg"}]}]
json
<reponame>IKATS/ikats-datamodel
/**
 * Copyright 2018-2019 CS Systèmes d'Information
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *     http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
package fr.cs.ikats.temporaldata.exception;

import javax.ws.rs.core.Response;
import javax.ws.rs.core.Response.Status;
import javax.ws.rs.ext.ExceptionMapper;
import javax.ws.rs.ext.Provider;

import org.apache.log4j.Logger;

import fr.cs.ikats.common.dao.exception.IkatsDaoConflictException;
import fr.cs.ikats.common.dao.exception.IkatsDaoException;
import fr.cs.ikats.common.dao.exception.IkatsDaoMissingResource;

/**
 * Handler of IkatsDaoExceptions
 */
@Provider
public class IkatsDaoExceptionHandler implements ExceptionMapper<IkatsDaoException> {

    private static Logger logger = Logger.getLogger(IkatsDaoExceptionHandler.class);

    /**
     * {@inheritDoc}
     * @since [#142998] handle IkatsDaoException, with different status Status.CONFLICT, Status.NOT_FOUND ...
     */
    @Override
    public Response toResponse(IkatsDaoException exception) {
        if (IkatsDaoMissingResource.class.isAssignableFrom(exception.getClass())) {
            logger.error("Error processing the request: resource not found on server: ");
            logger.error(exception);
            return Response.status(Status.NOT_FOUND).entity(exception.getMessage()).build();
        } else if (IkatsDaoConflictException.class.isAssignableFrom(exception.getClass())) {
            logger.error("Error processing the request: resource conflict on server: ");
            logger.error(exception);
            return Response.status(Status.CONFLICT).entity(exception.getMessage()).build();
        } else {
            logger.error("Error handled while processing request ", exception);
            return Response.status(Status.BAD_REQUEST).entity(exception.getMessage()).build();
        }
    }
}
java
It was exactly 17 years ago that the India-Australia ODI at the Barabati Stadium was abandoned owing to rain. The same fate awaits the teams when they meet in the fifth game of the seven-match series on Saturday. The fifth ODI had generated a buzz among the fans, but persistent rain for the past few days had left the outfield flooded. To make matters worse, there was a steady drizzle throughout Friday. The Indian team, which drove down from Bhubaneswar, played football at the nearby indoor stadium. The Australian coach, Steve Rixon, and a few members of the support staff, went to the ground to assess the damage but had to beat a hasty retreat as the underfoot conditions were too soft. It will be left to the on-field umpires, S. Ravi (India) and Nigel Llong (England), to make an official announcement (abandonment) after an inspection on Saturday. A press release from the Odisha Cricket Association said those who have bought tickets would get refunds in case of a wash-out. With only a remote chance of play in the fifth ODI, Ravindra Jadeja admitted that the host had to win the remaining two games to win the series. “If this match does not happen, we have to win the next two matches. Australia is ranked second in the world and has come to India with the motivation to do well. In the last 12 to 18 months, we have won in England, West Indies and Zimbabwe. With so much cricket going on, we are bound to slip up. We have chased well twice — in the Twenty20 game and the 360 at Jaipur. It is not as if we are playing badly; it is just that in one or two matches, the results have not gone our way,” he said. Jadeja said the two-ball rule was not affecting R. Ashwin and him. “If the pitch offers help, it does not matter if it is a new or an old ball. If you get something from the wicket, the newness of the ball would not matter,” he said. Shane Watson was categorical in saying that the Aussies wanted to win the series. 
“To be able to win a series in India in any format of the game is a huge challenge and a great achievement. We have to continue playing good cricket. We’ve been batting well in the series and everyone is looking forward to a perfect game in the next outing,” he said. “If the match is rained out on Saturday, it will make the game in Nagpur extremely important. We will try and close out the series there,” he added.
english
{ "kind": "Property", "name": "ServiceWorkerRegistration.scope", "href": "https://developer.mozilla.org/en-US/docs/Web/API/ServiceWorkerRegistration/scope", "description": "The scope read-only property of the ServiceWorkerRegistration interface returns a unique identifier for a service worker registration. The service worker must be on the same origin as the document that registers the ServiceWorker.", "refs": [ { "name": "Service Workers", "href": "https://w3c.github.io/ServiceWorker/#dom-serviceworkerregistration-scope", "description": "ServiceWorkerRegistration.scope - Service Workers" } ] }
json
New Delhi: The Congress party will challenge the Supreme Court’s recent order granting early release to six convicts who were imprisoned for the assassination of former Prime Minister Rajiv Gandhi in 1991. On 11 November, the top court released Nalini Sriharan, her Sri Lankan husband Murugan, Robert Pais, R. P. Ravichandran, Santhan and Jaikumar, noting that they had been jailed for over 30 years. The bench of Justices B. R. Gavai and B. V. Nagarathna passed the order following a direction in May which had freed A. G. Perarivalan, another life-term convict in the case. The Congress reacted sharply that day, saying the top court had not “acted in consonance with the spirit of India”. The BJP-ruled central government also filed a petition last week in the Supreme Court, seeking a review of the order. All seven convicts were incarcerated for over three decades for Rajiv Gandhi’s murder in Sriperumbudur, Tamil Nadu. Four of them, including Nalini, were to hang but their death sentences were later commuted to life. Nalini served her sentence in a special prison for women in Vellore for more than 30 years, while Ravichandran was in the Central Prison in Madurai. In May 1999, the top court had upheld the death sentences of four convicts — Perarivalan, Murugan, Santhan and Nalini. Nalini’s death sentence was commuted to life imprisonment in 2001 on the consideration that she has a daughter. In 2014, the apex court also commuted the death sentences of Perarivalan, Santhan and Murugan to life imprisonment on grounds of delay in deciding their mercy petitions. In 2008, Priyanka Gandhi had met Nalini in prison, in order to come to terms with the “loss and violence” she had experienced. “Why did you do it? ” Priyanka had asked Nalini before breaking down. Then she reportedly gave the convict a patient hearing. Priyanka has maintained that she didn’t believe in anger, hatred and violence, and couldn’t allow these things to “overpower my life”.
english
import { IApplicationLoader } from './isolate/ApplicationLoader';
import { ILocalAPIServer } from './isolate/LocalApiServer';
export declare const APPLICATION_LOADER: import("@airport/di").IDiToken<IApplicationLoader>;
export declare const LOCAL_API_SERVER: import("@airport/di").IDiToken<ILocalAPIServer>;
//# sourceMappingURL=tokens.d.ts.map
typescript
<gh_stars>1-10
/*
 * TODO: Make this a generically useful stylesheet?
 */
.chatout {
    width: 40em;
    height: 10em;
    border: 2px black solid;
    overflow: auto;
    padding: 0.5em;
}

.chatin {
    width: 40em;
    padding: 0.5em;
    border: 2px black solid;
    margin-top: 0.5em;
}
css
<filename>src/ledc/lsch2_conf1.rs
#[doc = "Reader of register LSCH2_CONF1"]
pub type R = crate::R<u32, super::LSCH2_CONF1>;
#[doc = "Writer for register LSCH2_CONF1"]
pub type W = crate::W<u32, super::LSCH2_CONF1>;
#[doc = "Register LSCH2_CONF1 `reset()`'s with value 0"]
impl crate::ResetValue for super::LSCH2_CONF1 {
    type Type = u32;
    #[inline(always)]
    fn reset_value() -> Self::Type {
        0
    }
}
#[doc = "Reader of field `DUTY_START_LSCH2`"]
pub type DUTY_START_LSCH2_R = crate::R<bool, bool>;
#[doc = "Write proxy for field `DUTY_START_LSCH2`"]
pub struct DUTY_START_LSCH2_W<'a> {
    w: &'a mut W,
}
impl<'a> DUTY_START_LSCH2_W<'a> {
    #[doc = r"Sets the field bit"]
    #[inline(always)]
    pub fn set_bit(self) -> &'a mut W {
        self.bit(true)
    }
    #[doc = r"Clears the field bit"]
    #[inline(always)]
    pub fn clear_bit(self) -> &'a mut W {
        self.bit(false)
    }
    #[doc = r"Writes raw bits to the field"]
    #[inline(always)]
    pub fn bit(self, value: bool) -> &'a mut W {
        self.w.bits = (self.w.bits & !(0x01 << 31)) | (((value as u32) & 0x01) << 31);
        self.w
    }
}
#[doc = "Reader of field `DUTY_INC_LSCH2`"]
pub type DUTY_INC_LSCH2_R = crate::R<bool, bool>;
#[doc = "Write proxy for field `DUTY_INC_LSCH2`"]
pub struct DUTY_INC_LSCH2_W<'a> {
    w: &'a mut W,
}
impl<'a> DUTY_INC_LSCH2_W<'a> {
    #[doc = r"Sets the field bit"]
    #[inline(always)]
    pub fn set_bit(self) -> &'a mut W {
        self.bit(true)
    }
    #[doc = r"Clears the field bit"]
    #[inline(always)]
    pub fn clear_bit(self) -> &'a mut W {
        self.bit(false)
    }
    #[doc = r"Writes raw bits to the field"]
    #[inline(always)]
    pub fn bit(self, value: bool) -> &'a mut W {
        self.w.bits = (self.w.bits & !(0x01 << 30)) | (((value as u32) & 0x01) << 30);
        self.w
    }
}
#[doc = "Reader of field `DUTY_NUM_LSCH2`"]
pub type DUTY_NUM_LSCH2_R = crate::R<u16, u16>;
#[doc = "Write proxy for field `DUTY_NUM_LSCH2`"]
pub struct DUTY_NUM_LSCH2_W<'a> {
    w: &'a mut W,
}
impl<'a> DUTY_NUM_LSCH2_W<'a> {
    #[doc = r"Writes raw bits to the field"]
    #[inline(always)]
    pub unsafe fn bits(self, value: u16) -> &'a mut W {
        self.w.bits = (self.w.bits & !(0x03ff << 20)) | (((value as u32) & 0x03ff) << 20);
        self.w
    }
}
#[doc = "Reader of field `DUTY_CYCLE_LSCH2`"]
pub type DUTY_CYCLE_LSCH2_R = crate::R<u16, u16>;
#[doc = "Write proxy for field `DUTY_CYCLE_LSCH2`"]
pub struct DUTY_CYCLE_LSCH2_W<'a> {
    w: &'a mut W,
}
impl<'a> DUTY_CYCLE_LSCH2_W<'a> {
    #[doc = r"Writes raw bits to the field"]
    #[inline(always)]
    pub unsafe fn bits(self, value: u16) -> &'a mut W {
        self.w.bits = (self.w.bits & !(0x03ff << 10)) | (((value as u32) & 0x03ff) << 10);
        self.w
    }
}
#[doc = "Reader of field `DUTY_SCALE_LSCH2`"]
pub type DUTY_SCALE_LSCH2_R = crate::R<u16, u16>;
#[doc = "Write proxy for field `DUTY_SCALE_LSCH2`"]
pub struct DUTY_SCALE_LSCH2_W<'a> {
    w: &'a mut W,
}
impl<'a> DUTY_SCALE_LSCH2_W<'a> {
    #[doc = r"Writes raw bits to the field"]
    #[inline(always)]
    pub unsafe fn bits(self, value: u16) -> &'a mut W {
        self.w.bits = (self.w.bits & !0x03ff) | ((value as u32) & 0x03ff);
        self.w
    }
}
impl R {
    #[doc = "Bit 31"]
    #[inline(always)]
    pub fn duty_start_lsch2(&self) -> DUTY_START_LSCH2_R {
        DUTY_START_LSCH2_R::new(((self.bits >> 31) & 0x01) != 0)
    }
    #[doc = "Bit 30"]
    #[inline(always)]
    pub fn duty_inc_lsch2(&self) -> DUTY_INC_LSCH2_R {
        DUTY_INC_LSCH2_R::new(((self.bits >> 30) & 0x01) != 0)
    }
    #[doc = "Bits 20:29"]
    #[inline(always)]
    pub fn duty_num_lsch2(&self) -> DUTY_NUM_LSCH2_R {
        DUTY_NUM_LSCH2_R::new(((self.bits >> 20) & 0x03ff) as u16)
    }
    #[doc = "Bits 10:19"]
    #[inline(always)]
    pub fn duty_cycle_lsch2(&self) -> DUTY_CYCLE_LSCH2_R {
        DUTY_CYCLE_LSCH2_R::new(((self.bits >> 10) & 0x03ff) as u16)
    }
    #[doc = "Bits 0:9"]
    #[inline(always)]
    pub fn duty_scale_lsch2(&self) -> DUTY_SCALE_LSCH2_R {
        DUTY_SCALE_LSCH2_R::new((self.bits & 0x03ff) as u16)
    }
}
impl W {
    #[doc = "Bit 31"]
    #[inline(always)]
    pub fn duty_start_lsch2(&mut self) -> DUTY_START_LSCH2_W {
        DUTY_START_LSCH2_W { w: self }
    }
    #[doc = "Bit 30"]
    #[inline(always)]
    pub fn duty_inc_lsch2(&mut self) -> DUTY_INC_LSCH2_W {
        DUTY_INC_LSCH2_W { w: self }
    }
    #[doc = "Bits 20:29"]
    #[inline(always)]
    pub fn duty_num_lsch2(&mut self) -> DUTY_NUM_LSCH2_W {
        DUTY_NUM_LSCH2_W { w: self }
    }
    #[doc = "Bits 10:19"]
    #[inline(always)]
    pub fn duty_cycle_lsch2(&mut self) -> DUTY_CYCLE_LSCH2_W {
        DUTY_CYCLE_LSCH2_W { w: self }
    }
    #[doc = "Bits 0:9"]
    #[inline(always)]
    pub fn duty_scale_lsch2(&mut self) -> DUTY_SCALE_LSCH2_W {
        DUTY_SCALE_LSCH2_W { w: self }
    }
}
rust
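The write proxies in the Rust register entry above all follow the same read-modify-write pattern: clear the field's bits with a shifted mask, then OR in the new value shifted to the field's offset. A minimal Python sketch of that arithmetic for the 10-bit `DUTY_NUM_LSCH2` field at bit offset 20 (field width and offset taken from the register code above; the register value used is an arbitrary example):

```python
FIELD_MASK = 0x03ff   # 10-bit field
FIELD_SHIFT = 20      # DUTY_NUM_LSCH2 occupies bits 20:29

def write_field(reg: int, value: int) -> int:
    # Clear the field's bits, then OR in the new value at its offset,
    # exactly as the generated `bits()` write proxy does.
    return (reg & ~(FIELD_MASK << FIELD_SHIFT)) | ((value & FIELD_MASK) << FIELD_SHIFT)

def read_field(reg: int) -> int:
    # Mirror of the reader: shift down, then mask off the field width.
    return (reg >> FIELD_SHIFT) & FIELD_MASK

reg = 0x8000_0001          # other bits already set
reg = write_field(reg, 5)  # set DUTY_NUM_LSCH2 = 5
print(hex(reg))            # -> 0x80500001
print(read_field(reg))     # -> 5
```

Because the mask is applied before the OR, bits outside the field (here bit 31 and bit 0) survive the write untouched, which is why the proxies can be chained on one register value.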
<reponame>oohyeah0331/UVa
class Solution:
    def twoSum(self, nums, target):
        """
        :type nums: List[int]
        :type target: int
        :rtype: List[int]
        """
        # Brute force: check every pair; O(n^2) time.
        for i, n in enumerate(nums):
            a = target - n
            for j, m in enumerate(nums[i + 1:], start=i + 1):
                if m == a:
                    return [i, j]
        return []  # no pair sums to target


class Solution2:
    def twoSum(self, nums, target):
        # One pass with a value -> index map; O(n) time.
        if len(nums) < 2:
            return []
        m = {}  # val: idx
        for i in range(len(nums)):
            t = target - nums[i]
            if t in m:
                return [m[t], i]
            m[nums[i]] = i
        return []


if __name__ == "__main__":
    # nums = [2, 7, 11, 15]
    # nums = [3, 2, 4]
    nums = [1, 3, 2, 3]
    target = 6
    print(Solution2().twoSum(nums, target))
python
<reponame>Jador/gigawatt
var actionFactory = require('./build/createAction.js');

var Gigawatt = {
    createAction: actionFactory.createAction,
    createActions: actionFactory.createActions,
    createStore: require('./build/createStore.js'),
    Mixin: require('./build/Mixin.js')
};

module.exports = Gigawatt;
javascript
Desiderio da Settignano (born c. 1430, Settignano, republic of Florence [Italy]—died January 1464, Florence) Florentine sculptor whose works, particularly his marble low reliefs, were unrivaled in the 15th century for subtlety and technical accomplishment. He is perhaps best known for having carved the funerary monument for the humanist Carlo Marsuppini. Desiderio was raised in a family of stone masons and entered the Stone and Wood Carvers’ Guild of Florence in 1453. Little is known about his education, although he was influenced by the Italian sculptor Donatello, particularly in his low reliefs. In his youth he worked with his brother Geri in a workshop near the Ponte Santa Trinita; his fame seems to have lasted during his lifetime and until soon after his death. Desiderio’s delicate, sensitive, highly original style is perhaps most exquisitely manifest in his sensuous portrait busts of women and children. These lyrical pieces convey a wide range of moods and emotions, from joy and charm to melancholy and pensiveness. His sense of design and his highly refined skill as a marble cutter established him as a master of low reliefs. Some of the most notable are his studies of the Madonna and Child, St. John, and Christ as an infant. Sometime after 1453 Desiderio designed and carved the monument of Marsuppini in Santa Croce in Florence. This monument was inspired by Bernardo Rossellino’s funerary monument to Leonardo Bruni in the same church. Desiderio borrowed heavily from Rossellino’s design, so the two monuments are strikingly similar. Both feature an arch, an effigy of the entombed man, a relief of the Virgin and Child, and a depiction of angels carrying a garland. With its rich architectural detail and its admirable effigy, Marsuppini’s tomb is exceptionally important in the history of Florentine wall monuments. 
Desiderio also carved the tondi for Filippo Brunelleschi’s Pazzi Chapel in Florence sometime after 1451 and completed the marble Altar of the Sacrament in San Lorenzo, Florence (1461), which is considered to be one of the decorative masterpieces of the 15th century. Desiderio masterfully employed the technique of rilievo stiacciato (low, or flattened, relief) in a style related to that of Donatello. The delicacy of contrast in his carvings gives his surfaces a glowing, ethereal quality, as seen in his Angel from the Altar of the Sacrament (1458–61) and many of his busts of women.
english
# ZurigAR Location-based Augmented Reality in Tourism - A Prototype in Zürich
markdown
{"data":{"site":{"siteMetadata":{"disqusShortname":"marcopeg","url":"https://alialfredji.github.io"}}}}
json
---
id: 13760
title: <NAME>
date: 2021-04-07T13:49:13+00:00
author: victor
layout: post
guid: https://ukdataservers.com/ignacia-antonia/
permalink: /04/07/ignacia-antonia
tags:
  - claims
  - lawyer
  - doctor
  - house
  - multi family
  - online
  - poll
  - business
  - unspecified
  - single
  - relationship
  - engaged
  - married
  - complicated
  - open relationship
  - widowed
  - separated
  - divorced
  - Husband
  - Wife
  - Boyfriend
  - Girlfriend
category: Guides
---

* some text
{: toc}

## Who is <NAME>

Chilean social media star who is famous for her ignaciaa_antonia TikTok account, where she posts lip sync videos for over 21 million fans. In 2019, she published a book titled Atrévete a Soñar and embarked on a promotional tour for the book throughout Latin America, Spain and the United States. In early 2020, she released her first original song, titled “Que Me Olvides.”

## Prior to Popularity

She began publishing regularly to her Instagram account in July 2016.

## Random data

She has amassed more than 7 million followers on her Instagram account ignaciaa_antonia and over 3.3 million subscribers on her YouTube channel. She was previously a member of the collaborative web group LIR team. In 2019, she joined the campaign #NoMásBullying to raise awareness about the dire consequences of bullying. In late 2020, she released her own makeup collection with the company Etienne.

## Family & Everyday Life of <NAME>

Her full name is <NAME>. She was originally born and raised in Chile. Her parents’ names are Mauricio and Beatriz. She has a younger brother named Felipe who she often features in her videos. She began dating fellow social star <NAME>. She also previously dated a guy named Tomas for two years.

## People Related With <NAME>

She is joined in the LIR team by <NAME>. She’s been photographed alongside social media stars <NAME> and <NAME>.
markdown
Your skin cells are constantly regenerating, so how is it that a tattoo can last for decades? Fascinating new research from France sheds new light on how this works: Your skin cells soak up the pigment, then release it when they die for other cells to pick up. This cycle continues ad infinitum—the circle of tattooed life. In the research, published this week in the Journal of Experimental Medicine, scientists tattooed the tails of mice. Immune cells called macrophages in the mice’s skin picked up the pigment. But when scientists killed off those cells, the tattoo still looked exactly the same. That’s because deep down in the skin’s dermal layer, those dead cells had released the tattoo ink. And when they did, neighboring cells picked it up. In another experiment, researchers transferred a piece of tattooed skin from one mouse to another mouse and discovered that after six weeks, the new mouse’s cells were carrying the pigment as well. The work suggested this is a continuing cycle. Prior research has investigated why macrophages pick up the ink in the first place. It turns out, it’s an immune reaction. Macrophages are attracted to the wound inflicted by the tattoo needle, and eat up the tattoo pigment in the same way they would any invading pathogen. This new research provides more insight into how that process works. But while this new work helps explain why tattoos last a lifetime, it also may offer hope for those who wish their ink wouldn’t. When tattoos are removed by lasers, pulses cause skin cells to die and release their pigment. The new research suggests ways to improve that process by making sure new cells don’t pick up pigment when the laser kills off cells. Science may one day have an answer to that unfortunate tribal tattoo that you got on your 18th birthday after all.
english
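The capture-release-recapture cycle described in the article above can be illustrated with a toy simulation (all parameters here are purely illustrative, not from the study): while replacement cells re-ingest released pigment, the tattoo's total pigment is conserved across cell generations; block recapture, as laser removal aims to, and the pigment drains away as cells die.

```python
import random

def pigment_after(generations=50, cells=100, recapture=True,
                  death_rate=0.1, seed=0):
    """Toy model of the macrophage pigment cycle (hypothetical parameters).

    Each cell holds one unit of pigment. Each generation, some cells die;
    if `recapture` is True the replacement cell ingests the released
    pigment, otherwise the pigment is cleared before re-uptake.
    """
    random.seed(seed)
    pigment = [1.0] * cells
    for _ in range(generations):
        for i in range(cells):
            if random.random() < death_rate:      # this cell dies
                pigment[i] = pigment[i] if recapture else 0.0
    return sum(pigment)

print(pigment_after(recapture=True))   # -> 100.0 : the tattoo persists
print(pigment_after(recapture=False))  # far less: pigment lost as cells die
```

The contrast between the two runs is the whole point of the study's removal idea: the cycle, not any individual cell, is what keeps the ink in place.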
Paul looked straight at the members of the Council and said, "Brothers! To this very day I have lived before God with a completely clear conscience." 2 When Paul said this, the High Priest Ananias ordered the men standing beside Paul to strike him on the mouth. 3 Paul said to the high priest, "God will certainly strike you, you hypocrite who pretends to be holy! You sit there to judge me according to the Law of Moses, yet you yourself break that Law by ordering others to strike me!" 4 The men standing beside Paul said to him, "You are insulting God's High Priest!" 5 Paul answered, "Oh, I did not know, brothers, that he is the high priest. Indeed it is written in the Scriptures, 'You shall not speak evil of a ruler of your people.'" 6 Paul saw that some of the leaders of the nation were Sadducees and some were Pharisees. So he called out in the Council, "Brothers! I am a Pharisee, a descendant of Pharisees. I am on trial here because I declare that the dead will live again." 8 (For the Sadducees say that the dead will not live again, that there are no angels, and likewise no spirits; whereas the Pharisees believe in all of these.) 9 At this a great uproar broke out. Some Pharisees who were teachers of the Law stood up and protested strongly: "In our opinion this man has done nothing wrong at all! Perhaps a spirit or an angel really did speak to him!" 10 The dispute became so violent that the commander was afraid Paul would be torn apart by them. So he ordered his troops to go and take Paul out from among the crowd and bring him into the fort. 
11 The following night the Lord Jesus stood by Paul and said, "Take courage! You have given your witness about me in Jerusalem. You must do the same in Rome." 12 Very early the next morning, the Jews formed a conspiracy. They took an oath not to eat or drink until they had killed Paul. 13 More than forty men were in the plot. 14 They went to the chief priests and the Jewish leaders and said, "We have taken a solemn oath together not to eat or drink anything until we have killed Paul. 15 Now then, you brothers and the members of the Council should send word to the Roman commander asking him to bring Paul down to you again, pretending that you want to examine his case more thoroughly. And we will be ready to kill him before he arrives." 16 But the son of Paul's sister heard of the plot. So he went to the fort and told Paul about it. 17 Paul called one of the officers and said to him, "Take this young man to the commander; he has something to report to him." 18 The officer took the young man to the commander and said, "Paul, the prisoner, called me and asked me to bring this young man to you; he has something to tell you." 19 The commander took the young man by the hand and asked, "What do you want to tell me?" 20 The young man answered, "The Jews have agreed to ask you to bring Paul down to the Council tomorrow, pretending that they want to examine his case more thoroughly. 21 But do not listen to them, for more than forty men are lying in wait to ambush him on the way. 
They have all sworn an oath not to eat or drink until they have killed Paul. They are ready now; they are only waiting for your answer." 22 The commander said, "Do not tell anyone that you have reported this to me." Then he sent the young man home. 23 Then the commander called two of his officers and said, "Get two hundred soldiers ready, together with seventy horsemen and two hundred spearmen, to leave for Caesarea at nine o'clock tonight. 24 Also provide horses for Paul to ride, and take him safely to Governor Felix." 25 Then the commander wrote a letter that read as follows: 26 "To His Excellency Governor Felix: Greetings from Claudius Lysias! 27 This man was seized by the Jews and was about to be killed by them, when I went in with my troops and rescued him, for I had heard that he is a Roman citizen. 28 Because I wanted to know what wrong they were accusing him of, I took him down to their Council. 29 I found that he had done nothing deserving death or imprisonment; the accusations against him concern questions of their own religious law. 30 Then I was informed of a plot by the Jews against this man, so I have sent him to Your Excellency at once. And I have ordered his accusers to bring their charges against him before you." 31 So the soldiers carried out their orders. They took Paul that same night and brought him as far as Antipatris. 32 The next day they let the horsemen go on with Paul, while they themselves returned to the fort. 
33 When the horsemen arrived in Caesarea, they delivered the letter to the governor and handed Paul over to him. 34 After the governor read the letter, he asked Paul what province he was from. When he learned that Paul was from Cilicia, 35 he said, "Very well, then! I will hear your case when your accusers arrive." Then he gave orders for Paul to be kept under guard in Herod's palace.
english
@import url('https://fonts.googleapis.com/css2?family=Roboto:wght@500&display=swap'); html, body, * { margin: 0; padding: 0; outline: 0; box-sizing: border-box; } body{ background-color: #212530; background-image: url(images/ellipse-bg.png); background-repeat: no-repeat; font-family: Roboto, sans-serif; background-position: top -100px right -320px; height: 500px; background-size: auto auto; } .container{ max-width: 1128px; margin: 0 auto; } header{ background-color: #161920; } .header .navbar{ display: flex; justify-content: space-between; padding: 35px 0px; } .navbar .navlinks{ display: flex; justify-content: flex-start; list-style: none; } .navlinks .navlink{ margin-right: 48px; } .navlinks .navlink-item{ text-decoration: none; color: white; opacity: 0.6; } .navlinks .active{ color: white; opacity: 1; } .navlinks .navlink-item:hover{ text-decoration: underline; text-decoration-color: blue; } .navbar .navicon{ margin-right: 20px; color: white; } .nav-icons .fa-shopping-bag{ position: relative; } .nav-icons .fa-shopping-bag::after{ content: ""; height: 7px; width: 7px; background-color: red; border-radius: 50%; display: inline-block; position: absolute; bottom: -2px; right: 0.2px; } .bodysection{ display: flex; } .bodytext{ position: relative; color: white; font-family: Roboto, sans-serif; padding: 60px 40px 0px 0px; width: 50%; } .text .title{ font-weight: 300; font-size: 50px; } .title .makered{ color: red; } .text .summary{ font-family: Roboto, sans-serif; font-weight: 200; opacity: 0.5; font-size: 14px; line-height: 24px; } .button{ margin-top: 20px; } .button .btn{ font-size: 14px; border: 1px solid #E42C47; color: white; background-color: red; padding: 10px; border-radius: 6px; } .bodytext .gamepads{ position: relative; display: flex; justify-content: space-between; padding: 65px 20px 20px 1px; } .bodyimages{ position: relative; width: 50%; } .redcard{ position: absolute; bottom: 10px; left: 40px; transform: rotate(1deg); filter: drop-shadow(0px 66px 95px rgba(33, 37, 48, 0.35)); } .gamepadblue{ position: absolute; top: 80px; left: -20px; filter: drop-shadow(0px 66px 95px rgba(33, 37, 48, 0.35)); } .gamepadwhite{ position: absolute; bottom: -10px; right: 12px; filter: drop-shadow(0px 66px 95px rgba(33, 37, 48, 0.35)); } .contact .contactlinks{ list-style: none; display: flex; opacity: 0.5; justify-content: space-between; width: 30%; margin-left: 5px; align-items: center; } .contactlinks .contact1{ justify-content: flex-start; text-decoration: none; font-family: Roboto, sans-serif; color: #FFFFFF; font-size: 10px; display: flex; align-items: center; } .contactlinks .contact1:hover{ text-decoration: underline; text-decoration-color: blue; } .fa-minus{ margin-right: 6px; }
css
fn main() {
    let mut v = [-5, 4, 1, -3, 2];
    v.sort();
    assert!(v == [-5, -3, 1, 2, 4]);
}
rust
South Africa face a major setback ahead of their 2023 ICC Men's Cricket World Cup campaign, with injury concerns turning into their worst nightmare. They will be without Anrich Nortje and Sisanda Magala at the World Cup after the duo failed to clear fitness tests. While Nortje was named in the initial 15-member squad, neither player managed to recover after the home series against Australia. Nortje's absence in particular is a major blow to the Proteas ahead of the multi-team event in India, where they will aim to secure a maiden Cricket World Cup win.
english
Anyone who says money is not important either has a lot of it or is a liar. It is one of those subjects that people have little idea how to master. We were taught subjects like geography, physics, biology, mathematics, computers and accounting, but nobody ever taught us money. Everything we know about money is what we learnt ourselves or from the people around us. Most people never knew any smart role models or experts on the subject of money; they end up learning it from people who have money issues just like them. I have read hundreds of books on the subject of money and used that knowledge to create a couple of businesses. Let me share certain principles I learnt about this indispensable subject called money. 1. Money is never enough: You will meet a lot of rich people who do not think of themselves as rich. This clearly proves the point that money is never enough. Once you have more, you start meeting people who have as much as or more money than you. Thereafter, you again feel the need to have more. So, perish the thought that one day you will have enough and you will be happy. Your current salary was once your dream salary, but now you have become used to it. Money raises your standard of living, and then you seek new standards. 2. Money does not make you happy, but it makes you comfortable: Money does not make you happy, but it surely makes you comfortable. And comfort may not be sufficient to make you happy, but it is a necessary prerequisite for being happy. It is difficult to be happy when you are behind on your bills. It is almost impossible to be happy when you have more month left at the end of the salary rather than more salary at the end of the month. So make sure you learn to earn enough money to make yourself comfortable. Also, remember that you will enjoy your money most if you make it by doing what you love to do. The way you spend your time determines how you feel about your money. 3. 
Money is a choice: No matter how hard it hits you, the truth is that the money you earn is what you have decided to earn. If you think your company does not pay you enough, remember that you chose the company. If your business does not give enough profit, remember that you chose the business. If your industry offers limited money-making opportunity, remember that you chose the industry. If you want to change the amount of money you are making, learn to change your choices. I have changed my industry multiple times to make the most of every opportunity that came my way. 4. Money is a reflection of the skills you have: Always remember that the more skilful you are, the more you will be valued in the marketplace. If you develop skills that sell, you will never be underpaid by the market; if your employers do not pay you well, their competition will. What are the most relevant skills for the next five years? I can name a few broad ones, and you can pick the ones you feel can catapult you in your professional life: public speaking, business modelling, valuation, designing, digital enablement, counselling, coaching and coding. The more skills in your kitty, the more job security you have. 5. Money is a result of the system you build or join: American entrepreneur, author and motivational speaker Jim Rohn once said, "Profits are better than wages", and I think he was right. Money comes into your life in one of six forms: salary, profit, fee, commission, rent or interest. Salary always has an upper limit, while the other five sources can go as high as the customer is willing to pay. Also remember that salary income grows at 10-20 per cent per annum on average, while other forms of income can grow geometrically. This does not mean that you should leave your job. All I am suggesting is that you learn to build multiple streams of passive income. Active income is the slowest form of making money. 6. 
Master the art of delegation: You will never be rich until you master the art of delegation. Getting rich is a full-time job. Your supreme duty is to maximise the financial return on the time you have. Find and hire people who will do what you should not be doing, so that you can spend your time doing what only you can do. If you ever observe an ultra-wealthy person, you will notice that they focus only on what they do best. 7. Money is a result of the service you give: The more people you serve, the greater your chances of making more money. Your responsibility is to find what you love to do and then find the people who will pay you to do it. If you are experiencing money issues, ask yourself what problems you are solving for the market. A billionaire always serves more people through his products or services than a millionaire does. By service, I mean business utility, not charity. Follow these seven rules of money mastery and you will see your bank balance start moving north. Master money, or it will make you its slave.
english
["ak-rn-packager","akpack","babel-globals","babel-preset-es2015-loose-rollup","babel-preset-es2015-min","babel-preset-quad","broccoli-babel-plugin","broccoli-pug-plugin","ctx-core","fl-google-maps-react","gulp-conventional-release","ima-gulp-tasks","ima.js-gulp-tasks","laravel-elixir-rollup","tim-react-native","transfigure-babel"]
json
Union Minister for Home and Cooperation, Shri Amit Shah, attended the 51st Foundation Day event of the Bureau of Police Research and Development (BPR&D) in New Delhi today as the Chief Guest. The Modi government has done a lot of work for the internal security of the country. Union Minister of State for Home Shri Nityanand Rai, the Union Home Secretary, the Director of the IB and the Director General of BPR&D, along with several senior officers of the Ministry of Home Affairs, the Police and the Central Armed Police Forces (CAPFs), were also present on the occasion. Officers of police and CAPFs from different parts of the country also joined the event virtually. The Union Home Minister presented trophies and awards to the best police training institutes and presented medals for excellence in police training. He also released the publications of BPR&D, presented the Pandit Govind Ballabh Pant Award for Hindi writing, and felicitated Tokyo Olympics 2020 silver medallist Ms. S. Mirabai Chanu on the occasion. In his address, the Union Home Minister said that it takes huge effort for an institution to maintain its relevance for 51 years, and BPR&D has done so remarkably well and proved its mettle. He said that maintaining relevance is a big challenge, as the times keep changing and institutions have to change accordingly. Shri Amit Shah said that BPR&D is doing very crucial work, and that during his last visit here he had written in the visitors' book that good policing cannot be imagined without BPR&D. Shri Amit Shah said that law and order is a state subject in the federal structure, and that to strengthen the federal structure, a link connecting the implementing agencies of all the states, that is, the police, and its federal organizations, is very important. 
He said that challenges must be seen from the perspective of the country as a whole: there are governments of different parties and ideologies in the states, and there are regional parties as well. If the law and order machinery is to be ready to face these challenges, a link is needed to hold it together; without that link, the law and order system of the country would collapse, and over 51 years BPR&D has done a great job of connecting all the states on matters of law and order. He said that different states of the country have different policies and challenges, and it would not have been possible for the states to meet these challenges without a central body that assesses all of them, applies global benchmarks, and works day and night to upgrade the forces. The Union Home Minister said that the country adopted democracy and a republic after independence, and democracy is in the nature of our country and people. The greatest thing in a democracy is the freedom of the individual, which is directly linked with law and order, and democracy can never succeed without law and order. He said that democracy is when 130 crore people get an opportunity to develop themselves according to their ability and intelligence, and the country benefits from the cumulative effect of the development of 130 crore citizens. He said that without a good law and order situation, democracy can never succeed. For a successful democracy it is very important that the safety of the individual is ensured, that he continues to enjoy freely the rights given to him under the law, and that he continues to discharge the duties the Constitution has laid down for him. The Union Home Minister said that the work of maintaining law and order is done by the police and by all the forces engaged in securing the borders of the country. He said that BPR&D has done the work of upgrading, training and rectifying the shortcomings of all these forces and police institutions. 
Shri Amit Shah said that the biggest contribution to making democracy successful is made by the beat constable, who ensures the safety of the citizen; without him, democracy cannot succeed. He said that, for some unknown reason, a campaign of sorts is going on to malign the image of the police, with some incidents exaggerated and good deeds given inadequate publicity. Shri Shah said that of all government personnel, the police have the most difficult job. The Union Home Minister said that when the whole country is celebrating festivals, policemen are doing their duty; there is hardly any government job harder than this. Shri Amit Shah said that during the COVID-19 pandemic, everyone from the Prime Minister of the country to its children hailed the services of the police force. The Prime Minister, Shri Narendra Modi, showered flower petals on the police forces from a helicopter, and that day it seemed that for the first time the police forces were receiving appreciation and respect for their hard work. He said that during the pandemic, from Kashmir to Kanyakumari and from Dwarka to Assam, the police did a very good job; this should be documented, a documentary should be made on it, and this good work, of the police forces and CAPFs of all the states together, should be remembered by the country and society for many years, because the sacrifices made by the police are too little discussed. The Union Home Minister said that more than 35,000 police personnel have laid down their lives for the country in the line of duty over the last 75 years, and that is why Prime Minister Shri Narendra Modi created the Police Memorial that today stands in Delhi, showing that the police, with 35,000 sacrifices, has always stood in the service of the nation with pride. 
He said that during his last visit to the Police Memorial, he had said that a documentary should be made of police sacrifices from different states across the country and shown to the children who visit; 16 states have since included the Police Memorial as a site for children's tours. He said that BPR&D should prepare good material for image building by compiling it from all the state governments. The Union Home Minister congratulated Tokyo Olympics 2020 silver medallist Ms. S. Mirabai Chanu, wished her all the best for her bright future, and hoped that next time she will bring glory to the country by winning a gold medal. He said that a lot is yet to be done for the convenience of sportspersons in the country. Shri Amit Shah said that challenges are not permanent; the challenges before the country keep changing. He said that today cyber-attacks, drone attacks, narcotics smuggling, fake currency and hawala rackets are the biggest challenges, and BPR&D is an institution that should adapt its work to the challenges. He said that the main task of BPR&D is to prepare our police forces by studying the best practices around the world and assessing the changing challenges. The Union Home Minister said that basic policing cannot be good without improving the beat system, and more work needs to be done to revive, update and upgrade it. Shri Amit Shah said that under the leadership of Prime Minister Shri Narendra Modi, the Ministry of Home Affairs, Government of India, is doing a lot of work to bring about fundamental changes in the CrPC, IPC and Evidence Act. He said that BPR&D has contributed very well to this endeavour: the Bureau has sent very good suggestions for change after consulting many people, including 14 states, 3 Union Territories, 8 CPOs, 6 CAPFs and 7 non-government organizations. Shri Amit Shah said that he also wants to add some things to the charter of BPR&D. 
Shri Amit Shah said that BPR&D should also work to modernize and train the CAPFs and enhance their operational skills in view of the kind of border security challenges before us today. He said that, if necessary, the Ministry of Home Affairs will amend the Bureau's charter. Shri Shah said he believes BPR&D can do this work very well by taking everyone along. He said it is very important that our land and sea borders be safe, and there should be no laxity in this. The Union Home Minister also said that the Bureau should work to institutionalize the reforms that have been implemented on the ground. Without institutionalizing this, we cannot know whether our reforms are practical or whether we have been able to motivate the police to implement them; therefore, there should be an institutional arrangement in BPR&D to track how many of the police reforms undertaken across the country actually get off the ground. Shri Amit Shah said that the government running under the leadership of Modi ji has done a lot of work for the internal security of the country. He said that we have done a great job of strengthening the legal framework: we have changed many laws, made them timely, and made arrangements that give strength to the law enforcement agencies. The abrogation of Articles 370 and 35A in Jammu and Kashmir, changes in the CrPC and IPC, cyber security, and the work of realizing Prime Minister Shri Narendra Modi's basic mantra of Minimum Government, Maximum Governance have been carried out in full spirit. He said that at the same time we have started a campaign to crack down on narcotics through amendments to the NIA Act, changes to the Arms Act, changes to the UAPA Act and a four-tier narco-coordination structure. The Union Home Minister said that the police and the Narcotics Control Bureau across the country have broken the record of the last 25 years by seizing the largest quantities of narcotics within the last two years. 
Shri Amit Shah said that the government has concluded many agreements in the Northeast: the NLFT agreement, the resettlement of Bru refugees, the Bodo peace agreement, and the agreement with Karbi Anglong to be signed this evening. He said that around 3,700 armed cadres have surrendered and joined the mainstream; these people used to live in the jungle with weapons, and in the last two years they have surrendered and returned to the mainstream. The Union Home Minister said that our aim is to make sincere efforts, through dialogue, to bring into the mainstream those who lay down their weapons; those who keep weapons in their hands, the police can deal with. But the avenues remain open for those who want to talk, and discussions are going on with all the insurgent organizations at different levels. The Union Home Minister said that in order to bring democracy down to the lowest level in Jammu and Kashmir and take it to the people, three-tier Panchayati elections were conducted. He said that democracy in Jammu and Kashmir is no longer limited to a few MPs and MLAs but includes the Panch and Sarpanch of every village, with 22,000 people contributing to this system. Shri Amit Shah said that this is a great achievement of the Modi government. The Union Home Minister said that the National Defence University and the FSL University have been established, and the criminal justice system has been coordinated through CCTNS; at the same time, we are working very fast in the direction of e-procedure and e-forensics. The National Academy for Coastal Policing was established in Gujarat, and the National Cyber Crime Portal has been dedicated to the nation; it needs to be made more popular. Shri Shah said that the Prime Minister of the country will soon dedicate NATGRID to the nation. He said that we have made very good use of the Private Security Agency Licensing Portal at the national level. 
The FCRA has also been radically changed. The Home Minister said that the internal security of the country is being handled by the police forces and the borders by our Central Armed Police Forces (CAPFs), and given the promptness with which the CAPFs are guarding the borders and the police are handling internal security, we are in absolutely safe hands. But challenges grow and challenges change, and overcoming them, so as to stay ahead of those who would destabilize us, should be our goal; BPR&D can play a big part in this. Shri Amit Shah said that the next decade is very important for internal security: with the way the country is progressing under the leadership of Prime Minister Shri Narendra Modi, we have set a target of becoming a 5 trillion-dollar economy, and many reforms are taking place; we have to ensure that all of these continue unabated. Shri Amit Shah said that our police forces and CAPFs across the country should be prepared after assessing all the challenges. The Union Home Minister said that it is our duty to take the idea of smart policing put forth by the Prime Minister to the ground. He said that BPR&D, in its journey to complete 100 years, will become even more relevant and move ahead under the leadership of the Director General.
english
New Delhi: Of the 716 women candidates analysed in the 2019 Lok Sabha elections, 100 (15 per cent) have declared criminal cases against themselves, while 78 (11 per cent) have declared serious criminal cases, according to a report by the Association for Democratic Reforms (ADR). Among the major parties, 14 (26 per cent) of the 54 women candidates fielded by the Congress, 18 (34 per cent) of 53 from the Bharatiya Janata Party (BJP), 2 (8 per cent) of 24 from the Bahujan Samaj Party (BSP), six (26 per cent) of 23 fielded by the Trinamool Congress (TMC) and 22 (10 per cent) of 222 Independent women candidates have declared criminal cases against themselves in their affidavits. Meanwhile, 10 (19 per cent) of the 54 Congress women candidates, 13 (25 per cent) of 53 from the BJP, 2 (8 per cent) of 24 from the BSP, 4 (17 per cent) of 23 fielded by the TMC and 21 (10 per cent) of 222 Independents have declared serious criminal cases against themselves in their affidavits. The National Election Watch and the ADR analysed the affidavits of 716 of the 724 women candidates contesting the Lok Sabha 2019 elections. Of the 716 women candidates, 94 participated in the first phase of the elections, 124 in the second phase, 142 in the third, 96 in the fourth, 80 in the fifth, 84 in the sixth and 96 in the seventh phase. The election watchdog's report said that 78 (11 per cent) women candidates have declared serious criminal cases, including cases related to rape, murder, attempt to murder and crimes against women, in the Lok Sabha elections, 2019. The number of such women candidates was 51 (8 per cent) in the Lok Sabha elections, 2014, the ADR said. 
Two women candidates have declared that they have been convicted, four have declared cases related to murder, 16 cases of attempt to murder, 14 crimes against women such as causing miscarriage without the woman's consent, and seven cases related to hate speech. Of the 716 women candidates analysed, the ADR said, 255 (36 per cent) are crorepatis. A total of 44 (82 per cent) of the 54 fielded by the Congress, 44 (83 per cent) of 53 from the BJP, 15 (65 per cent) of 23 from the TMC, nine (38 per cent) of 24 from the BSP, and 43 (19 per cent) of 222 Independent women candidates have declared assets worth more than Rs 1 crore. Among the major parties, the average assets per candidate for the 54 Congress women candidates are Rs 18.84 crore; the 53 BJP women candidates have average assets of Rs 22.09 crore, the 24 BSP women candidates Rs 3.03 crore, the 23 TMC women candidates Rs 2.67 crore, the 10 CPI(M) women candidates Rs 1.33 crore, the six Samajwadi Party women candidates Rs 39.85 crore, the three AAP candidates Rs 2.92 crore, and the 222 Independent women candidates Rs 1.63 crore. The BJP's Hema Malini (with Rs 250 crore) from the Mathura constituency in Uttar Pradesh is among the three richest women candidates contesting the elections, followed by the Telugu Desam Party's D.A. Sathya Prabha (with Rs 220 crore) from Andhra Pradesh's Rajampet constituency and Shiromani Akali Dal's Harsimrat Kaur Badal (with Rs 217 crore) from Punjab's Bathinda.

Meghalaya scored 92.85 out of 100 possible points in a Gaming Industry Index, proving to be India's most gaming-friendly state following its recent profound legislative changes allowing land-based and online gaming, including games of chance, under a licensing regime. 
In February last year, Meghalaya became the third state in India's northeast to legalise gambling and betting, after Sikkim and Nagaland. After consultations with the UKIBC, the state adopted the Meghalaya Regulation of Gaming Act, 2021 and repealed the Meghalaya Prevention of Gambling Act, 1970. Subsequently, in December, the Meghalaya Regulation of Gaming Rules, 2021 were notified and came into force. The move to legalise and license various forms of offline and online betting and gambling in Meghalaya is aimed at boosting tourism, creating jobs, and raising taxation revenues for the northeastern state. At the same time, the opportunity to bet and gamble legally will be reserved for tourists and visitors. "We came out with a Gaming Act and subsequently framed the Regulation of Gaming Rules, 2021. The government will accordingly issue licenses to operate games of skill and chance, both online and offline," said James P. K. Sangma, Meghalaya State Law and Taxation Minister, speaking in the capital city of Shillong. "But the legalized gambling and gaming will only be for tourists and not residents of Meghalaya," he continued. To be allowed to play, tourists and people visiting the state for work or business will have to prove their non-resident status by presenting appropriate documents, in a process similar to a bank's KYC (Know Your Customer) procedure. With 140 million people in India estimated to bet regularly on sports, and a total of 370 million desi bettors around prominent sporting events, as per data from one of the latest reports by Esse N Videri, Meghalaya is set to reach out and take a piece of a vast market. Estimates of the financial value of India's sports betting market, combined across all types of offline channels and online sports and cricket prediction and betting platforms, range between $130 and $150 billion (roughly between ₹9.7 and ₹11.5 lakh crore). 
Andhra Pradesh, Telangana and Delhi are shown to deliver the highest numbers of bettors, and Meghalaya can count on a substantial flow of tourists from their betting circles. The sports betting communities of Karnataka, Maharashtra, Uttar Pradesh and Haryana are also not to be underestimated. Among the sports, cricket is the most popular, registering 68 per cent of the total bet count analysed by Esse N Videri. Football takes second position with 11 per cent of the bets, followed by betting on FIFA at 7 per cent and on eCricket at 5 per cent. The last position in the top 5 of popular sports for betting in India is taken by tennis, with 3 per cent of the bet count. Meghalaya residents will still be permitted to participate in teer betting on arrow-shooting results. Teer is a traditional method of gambling, somewhat similar to a lottery draw, held under the rules of the Meghalaya Regulation of the Game of Arrow Shooting and the Sale of Teer Tickets Act, 2018. In teer, bettors wager on the number of arrows that reach a target placed about 50 meters away from a team of 20 archers positioned in a semicircle. The archers shoot volleys of arrows at the target for ten minutes, and players place their bets by choosing a number between 0 and 99, trying to guess the last two digits of the number of arrows that successfully pierce the target. If, for example, the number of hits is 256, anyone who has bet on 56 wins an amount eight times bigger than their wager.
english
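The teer payout rule described above (guess the last two digits of the hit count; a correct guess pays eight times the stake) can be sketched in a few lines of Java. This is an illustrative sketch only, not an official payout implementation; the class and method names are made up here, and the 8x multiplier is simply the figure quoted in the article's example.

```java
// Illustrative sketch of the teer payout rule described above:
// players guess the last two digits of the number of arrows on target;
// a correct guess pays eight times the wager (per the article's example).
public class TeerPayout {
    static final int PAYOUT_MULTIPLIER = 8; // multiplier quoted in the article

    // Winning two-digit number (0-99) for a given hit count:
    // the last two digits of the number of arrows that pierced the target.
    static int winningNumber(int hits) {
        return hits % 100;
    }

    // Amount paid out for a bet; 0 if the guess is wrong.
    static int payout(int hits, int guess, int wager) {
        return guess == winningNumber(hits) ? wager * PAYOUT_MULTIPLIER : 0;
    }

    public static void main(String[] args) {
        // The article's example: 256 hits -> winning number 56.
        System.out.println(winningNumber(256));   // 56
        System.out.println(payout(256, 56, 100)); // 800: a 100-unit bet on 56
        System.out.println(payout(256, 55, 100)); // 0: a wrong guess loses
    }
}
```

The `hits % 100` step is exactly the "last two digits" rule from the text; everything else is bookkeeping.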
Adin Ross is the latest streamer to come out in defense of KSI, or JJ, as he is called by the community, after he officially lost to Tommy Fury over the weekend. On his latest live stream on Kick, Ross gave his opinion about the much-anticipated boxing clash and was clearly unhappy with the judges' decision. For those out of the loop, JJ and Tommy Fury's fight went six rounds and finally came down to the judges who, after some controversy surrounding the tallying of scores, gave the win to Fury in a unanimous decision. However, the result has not pleased everybody, with many people like Adin Ross voicing their displeasure. The Kick streamer was reacting to Jake Paul's video about the fight when he told his audience that he did not appreciate Fury's performance: "Why did Tommy fight so ass, on god? Why was Tommy fighting like that? Was it because KSI was hugging, I don't get it." Adin then outright stated that Tommy Fury did not deserve the win because of the point deduction: "Chat, KSI should have won that fight. I am being honest. He got a point deduction." The clip of Adin Ross supporting the English celebrity has gone viral on social media, and he is not the first streamer to make headlines for this. Popular YouTube streamer IShowSpeed has also been criticizing the decision since the results were announced and even revealed that he felt like crying after Tommy Fury was announced as the winner. British YouTuber and Twitch streamer TrueGeordie, known for making content about MMA and boxing, also seemed to agree with IShowSpeed's take and, in a recent show, explained why he thought JJ deserved the win. Like TrueGeordie, Adin Ross also brought up the point deduction debacle, in which Fury lost a point in the second round after he inadvertently hit the back of his opponent's head. The Kick creator stated: "Whether you actually want to say it or not, Tommy had a point deduction. 
Technically, Tommy fought horribly, but technically KSI should have won that fight." Adin also echoed IShowSpeed's words and claimed that the true winner was robbed: "Bro, he got robbed literally." When some people in his chat protested, Adin Ross shut them down, saying: "Doesn't matter if he is hugging, it's a point system r*tards. It's literally a point system and KSI obviously had more points. It was either a draw or KSI should have won that. I'm not hating by the way, I watched the fight." The clip garnered a lot of reactions from fans of both fighters, and popular internet personalities such as Ludwig and IShowSpeed have also reacted to the result of the fight.
english
<reponame>shuyan-k/hutool /** * Hutool-db is a database utility built on top of JDBC; through its wrappers, it lets you work with databases in the ActiveRecord style.<br> * In Hutool-db, an Entity (essentially a Map) is used in place of a Bean to make database operations more flexible, while Bean-Entity conversion is provided for compatibility with traditional ORM. * * @author looly * */ package cn.hutool.db;
java
{"title": "Meta-theory \u00e0 la carte.", "fields": ["extensibility", "metatheory", "mathematical proof", "proof assistant", "constructed language"], "abstract": "Formalizing meta-theory, or proofs about programming languages, in a proof assistant has many well-known benefits. Unfortunately, the considerable effort involved in mechanizing proofs has prevented it from becoming standard practice. This cost can be amortized by reusing as much of existing mechanized formalizations as possible when building a new language or extending an existing one. One important challenge in achieving reuse is that the inductive definitions and proofs used in these formalizations are closed to extension. This forces language designers to cut and paste existing definitions and proofs in an ad-hoc manner and to expend considerable effort to patch up the results. The key contribution of this paper is the development of an induction technique for extensible Church encodings using a novel reinterpretation of the universal property of folds. These encodings provide the foundation for a framework, formalized in Coq, which uses type classes to automate the composition of proofs from modular components. This framework enables a more structured approach to the reuse of meta-theory formalizations through the composition of modular inductive definitions and proofs. Several interesting language features, including binders and general recursion, illustrate the capabilities of our framework. We reuse these features to build fully mechanized definitions and proofs for a number of languages, including a version of mini-ML. 
Bounded induction enables proofs of properties for non-inductive semantic functions, and mediating type classes enable proof adaptation for more feature-rich languages.", "citation": "Citations (25)", "departments": ["University of Texas at Austin", "National University of Singapore", "Ghent University"], "authors": ["<NAME>.....http://dblp.org/pers/hd/d/Delaware:Benjamin", "<NAME>.....http://dblp.org/pers/hd/o/Oliveira:Bruno_C=_d=_S=", "<NAME>.....http://dblp.org/pers/hd/s/Schrijvers:Tom"], "conf": "popl", "year": "2013", "pages": 12}
json
Lucknow: As the SP-BSP alliance unravelled in Uttar Pradesh, Samajwadi Party chief Akhilesh Yadav on Wednesday described the tie-up as a "trial" and said it may not always be successful, but helps in knowing the shortcomings. BSP chief Mayawati on Tuesday declared that her party will fight the assembly bypolls alone, prompting Akhilesh Yadav to say his SP too is ready to go solo. "Yes, there are trials, and sometimes you are not successful, but at least you can know your shortcomings," Yadav told reporters here on Wednesday. He also said that he respected Mayawati irrespective of the political equation between the parties. "I stand by my earlier statement that respecting Mayawati ji is the same as respecting me," the SP president said. He had made the statement earlier this year when the Samajwadi Party (SP) and Bahujan Samaj Party (BSP) announced their tie-up for the 2019 Lok Sabha elections. "As far as going solo in the bypolls is concerned, I will consult with party leaders and devise a strategy to work in this direction," Yadav said. Eleven assembly bypolls are due in UP after the respective MLAs won the Lok Sabha polls. Nine of them are from the BJP, and one each from the BSP and the SP. Announcing her party's decision on Tuesday, Mayawati said: "If I feel that the SP president is able to fulfil his duties and convert his people into missionaries, then we can still walk together in future. There has been no permanent break as of now." If he is unable to succeed in his task, it will be better for the party to walk alone, she added. The BSP chief had called a review meeting in New Delhi on Monday to analyse the Lok Sabha results. "I have to say with much sadness that the SP's base vote - meaning 'Yadav samaj' - has not stood along with the SP even in areas where they are in high numbers," she had said. (This story has not been edited by News18 staff and is published from a syndicated news agency feed - PTI)
english
<filename>src/config/styles.js export const FONT_FAMILY = `font-family: 'Barlow', sans-serif;` export const BLUE = `rgba(97, 218, 251, 1)` export const BLUE_04 = `rgba(97, 218, 251, 0.4)` export const DARK_BLUE = `rgba(0,41,56, 1)` export const DARK_BLUE_075 = `rgba(0,41,56, 0.75)` export const LIGHT_BLUE = `rgba(87,178,212,0.9)` export const YELLOW = '#D6D100' export const BROWN = '#979797' export const GREY = '#c4c4c4' export const DARK_GREY = '#4a4a4a' export const WHITE = '#fff' export const PINK = '#f388a2' export const GRAPHQL_PINK = '#DF0098' export const LIGHT_PINK = 'rgba(223, 0, 152, 0.8)' export const GREEN = '#80da8d' export const MEETUP_RED = '#F64060' export const RED = '#C0392B' export const BOX_SHADOW = '0 -2px 24px 0 rgba(0, 0, 0, 0.24), 0 2px 24px 0 rgba(0, 0, 0, 0.12);' export const TEXT_SIZE = ({ sm = false, lg = false }) => { if (sm) return `font-size: 12px;` if (lg) return `font-size: 18px;` return `font-size: 16px;` } export const Z_INDEX_TOP = 999 export const Z_INDEX_MEDIUM = 5 export const Z_INDEX_SMALL = 1 export const Z_INDEX_BG = -2 export const theme = { flexboxgrid: { gutterWidth: 1, outerMargin: 0.5, container: { sm: null, // rem md: null, // rem lg: 64, // rem }, }, // design system space: [ '0rem', // 0 '0.625rem', // 1 '0.9rem', // 2 '1rem', // 3 '1.5rem', // 4 '2rem', // 5 '3rem', // 6 '4rem', // 7 ], fonts: { barlow: `'Barlow', sans-serif`, }, fontShadows: { light: '0 18px 29px -2px rgba(0, 0, 0, 0.26)', }, fontSizes: [ '0.8rem', // 0 '0.9rem', // 1 '1rem', // 2 '1.15rem', // 3 '1.25rem', // 4 '1.563rem', // 5 '1.953rem', // 6 '2.441rem', // 7 '2.77rem', // 8 ], fontStyle: { normal: 'normal', italic: 'italic', }, lineHeights: [ '0', // 0 '1.2rem', // 1 '1.5rem', // 2 '1.845rem', // 3 '2rem', // 4 '2.5rem', // 5 '3rem', // 6 ], fontWeights: { normal: '400', bold: '800', }, shadows: { thin: '0 2px 2px 0 rgba(0, 0, 0, 0.45), 0 0 2px 0 rgba(0, 0, 0, 0.12)', light: '0 18px 29px -2px rgba(0, 0, 0, 0.26)', bold: 'rgb(74, 74, 74) 0px 0px 1px', }, }
javascript
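A minimal usage sketch of the `TEXT_SIZE` helper from the file above. The helper is repeated here verbatim so the snippet is self-contained; its behavior (the `sm`/`lg` flags and pixel sizes) is taken directly from the source.

```javascript
// TEXT_SIZE, copied from src/config/styles.js: returns a CSS declaration
// based on optional sm/lg flags, defaulting to 16px.
const TEXT_SIZE = ({ sm = false, lg = false }) => {
  if (sm) return `font-size: 12px;`
  if (lg) return `font-size: 18px;`
  return `font-size: 16px;`
}

console.log(TEXT_SIZE({ sm: true })) // font-size: 12px;
console.log(TEXT_SIZE({ lg: true })) // font-size: 18px;
console.log(TEXT_SIZE({}))           // font-size: 16px;
```

Note that the destructured parameter has no default object, so the helper must be called with at least `{}`; `TEXT_SIZE()` would throw.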
{ "_public_key": "<KEY>", "private_key": "<KEY>", "kid": "EJ[1:Ns0zxeQCJrM7rRh7vtf7uNaBu5uo9kl2JUBHXshR6mQ=:WR24XzkfnWxqf+fDcMpaMK2ertNVQoNI:cNhgUSE95b3PhjJKMycbXQEgxOiAiJ7BlFA0pRAi+0DIOHiTYbsXdjWRjpJHgIiEuq+X94rqEi2RprQ=]" }
json
Henry Miller Shreve (born Oct. 21, 1785, Burlington county, N.J., U.S.—died March 6, 1851, St. Louis, Mo.), American river captain and pioneer steamboat builder who contributed significantly to developing the potential of the Mississippi River waterway system. Shreve’s father was a Quaker who nevertheless served as a colonel in the American Revolutionary War and lost all his possessions at the hands of the British. Destitute, the Shreves were forced to emigrate to the western Pennsylvania frontier. When his father died in 1799, Shreve began to make trading voyages by keelboat and barge down the Monongahela and Ohio rivers. In 1807 he inaugurated the fur trade between St. Louis and Philadelphia, by way of Pittsburgh, and in 1810 he began carrying lead from Galena, Ill., near the upper Mississippi. He became a stockholder and skipper of the Enterprise (the second steamboat on the Mississippi), carrying supplies in 1814 for Andrew Jackson’s army and taking part himself in the Battle of New Orleans. In May 1815 the Enterprise with Shreve at the helm became the first steamboat to ascend the Mississippi and Ohio to Louisville, Ky. Shreve, however, saw the need for an entirely new design for river steamers and had built to his specifications the Washington, with a flat, shallow hull, a high-pressure steam engine on the main deck instead of in the hold, and a second deck. His round trip in the Washington in 1816 from Pittsburgh to New Orleans and back to Louisville definitively established the Mississippi steamboat type. In 1827 Shreve was appointed superintendent of western river improvements and designed the first snag boat to remove from the river system the sunken tree trunks that often wrecked steamboats. 
In the 1830s he undertook the removal of an accumulated underwater obstruction of the Red River known as the Great Raft; his success opened northern Louisiana to development, and his work camp grew into a permanent settlement, Shreveport.
english
<reponame>dineshkummarc/couchbase-beers<gh_stars>1-10 {"_id":"brewery_Atlanta_Brewing_Company","name":"<NAME>","address":["2323 Defoor Hills Rd NW"],"city":"Atlanta","state":"Georgia","code":"30318","country":"United States","phone":"404-355-5558","website":"http:\/\/www.atlantabrewing.com\/","geo":{"loc":["-84.4353","33.818"],"accuracy":"ROOFTOP"},"updated":"2010-07-22 20:00:20"}
json
<reponame>MadModsBattletech/BourbonVanilla { "Description" : { "Name" : "RELIABLE MOVEMENT", "Details" : "PASSIVE: This unit generates an extra EVASIVE charge from all movement actions. The unit can SUSTAIN evasion up to its initiative when moving and sprinting. Regular movement also grants ENTRENCHED.", "Icon" : "uixSvgIcon_action_evasivemove" } }
json
Mumbai, Mar 23 (PTI) Actor Pankaj Tripathi says he took up the role in “Kaala” as it gave him a chance to meet and talk to superstar Rajinikanth. The Tamil film, which also stars Huma Qureshi, is due to arrive in theatres on April 27. This is the first time that Pankaj has worked with Rajinikanth, and the actor said for the first few minutes of their first shot together, he quietly looked at the megastar. “The first shot was very nice. I was just quietly watching him. For 10-15 minutes I was just looking at him. I had signed this film to just meet and talk to him. I wanted to talk to him about cinema, his approach towards it, life and spirituality. I’m glad I did that,” Pankaj told PTI. The gangster film, written by Pa Ranjith, also stars Nana Patekar. Talking about his other big project, “Super 30”, which is the biopic of Indian mathematician Anand Kumar and has Hrithik Roshan in the lead role, Pankaj said he is excited to have played an important role in the film. Before Hrithik came on board, there were rumours that Pankaj was the first choice of the makers. The 41-year-old actor, however, denied this, saying the speculation arose after people discussed that he could be a candidate, given his resemblance to Anand. “There was never any truth in this. People just said that I was a suitable candidate. I am playing an important role in the film. It was a lovely experience working with Hrithik. We had worked together in ‘Agneepath’ also, we knew each other already. It was very nice,” Pankaj added. The actor had a successful 2017 with three of his films – “Bareilly Ki Barfi”, “Newton” and “Fukrey Returns”. This is published unedited from the PTI feed.
english
TIRUMALA, JAN 19: TTD's Tirumala JEO Sri KS Sreenivasa Raju took part in the pulse polio drops administration programme at Aswini Hospital in Tirumala on Sunday. He administered pulse polio drops to the infants in the hospital premises. Speaking on this occasion, he said TTD has set up 25 centres in Tirumala to administer polio drops to infants who are between 0-5 years of age. “Out of these 25 centres, 17 are meant for the visiting pilgrims while the remaining are for locals. If anybody fails to administer polio drops to their kids today, the drops will be given through the mop-up programme on 21st and 22nd January,” he added. Meanwhile TTD has set up centres in Aswini, GNC, RTC bus stand, CRO, Health office, KKC, PAC I and II, F-type Quarters, Balaji Nagar (2 centres), SV High School, TTD Employees Dispensary, Rambhageecha-III, inside temple, Vahana Mandapam, VQC I and II, Medaramitta, Varahaswamy Guest House, MBC-26, Supadham and Papavinasanam. Two more mobile centres have also been set up near the Alipiri foot path and near guest houses where pilgrim influx is higher. About 6653 infants were administered polio drops on Sunday in Tirumala. Aswini Hospital Medical Superintendent Dr D Nageshwara Rao and his team of doctors and nursing staff were also present.
english
Having made an entry in Bollywood three years back with an item song in Akshay Kumar starrer Khatta Meetha, actress Kainaat Arora is finally making her acting debut in Indra Kumar’s Grand Masti. The model-turned-actress says she is proud that she bagged the role after seeing off competition from 200 other girls during the audition of the film, which is a sequel to the 2004 hit Masti. The film will feature Vivek Oberoi, Aftab Shivdasani and Riteish Deshmukh reprising their roles from the original. “I bagged the role through the writer of the film, Milap. He approached me for this role and he said he was looking for a girl opposite Vivek. He said they had already tested 100 to 200 girls for the role. “I went for the audition and the next thing I heard was that I was on board. I am the lucky one from those 200 girls who finally bagged the role,” Kainaat told PTI in an interview. Besides the lure of working in a successful franchise, the newcomer says it was the opportunity to work with filmmaker Indra Kumar that drew her to Grand Masti. “Masti was a big hit and when I got an opportunity to work in the sequel opposite Vivek, I just jumped into that without even thinking twice. Moreover it is an Indra Kumar film. He is a very sensible and senior director and I would have been a fool to say no to such a film,” Kainaat said. Other than Kainaat, Grand Masti also stars Bruna Abdullah, Karishma Tanna, Sonalee Kulkarni, Maryam Zakaria and Manjari Fadnis. The film will hit theatres on September 13. When asked about her character in the upcoming adult comedy, Kainaat says she plays a nervous, nerdy girl, who later turns into a glam doll. “I am basically playing two characters in the film. In the beginning, I am this nerdy girl. Something like the Ugly Betty. She is very geeky, wears braces, carries a water bottle and school bag in college. But at the reunion everyone is surprised as my character has completely transformed into a beautiful, desirable girl,” she said. 
After doing the item song Aila re Aila in Priyadarshan’s Khatta Meetha back in 2010, Kainaat was offered many others in various films. But the actress declined them all as she did not want to get typecast and waited for the right role to come her way. “It is a human psyche. When you do a song, people will only approach you for songs. I did not want to do another song after Khatta Meetha. I was very adamant that I would only do a film. Hence I waited and I am excited that I am finally making my debut with Grand Masti,” Kainaat said. The young actress will do an item song in Vikram Bhatt’s Hate Story 2. “I decided to do a song now because New Year is coming and I thought of doing another hit song. The film is being produced by Vikram Bhatt and the shooting will begin later this month,” she said. Kainaat, who is the cousin of the late Bollywood actress Divya Bharti, says she is inspired by the Deewana star. “I hope if I even do a little bit of what she did in the two short years of her career, my life will be sorted out,” she said.
english