With the season having ended, Sportskeeda decided to let its viewers rate the 30 best players of the season.
We will be shortlisting 30 of the best performers this term and letting you pick your winner.
What do you think? Have your say at the end of the article.
Goalkeepers are often a forgotten lot. They are some of the worst-paid and least-respected professional footballers, and it is difficult for a goalkeeper to get the recognition he deserves. There are men like Buffon, Casillas and Van der Sar, but they are part of a very small minority. A man who seems to be working his way up the goalkeeping charts is Roman Weidenfeller.
This man, for me, has been the best keeper in Europe this season. Be it in the Champions League or the Bundesliga, Weidenfeller has been absolutely magnificent. Being a goalkeeper myself, it is a wonder to see someone so technically sound in the basics of goalkeeping. And not only is he technically sound, he is also a brilliant reader of the game and makes more than his fair share of wonder saves.
Unlike De Gea or Valdes or even Neuer, Roman is a complete keeper. Some may argue that Neuer had a better season on the basis of his clean sheets, but the sheer number and quality of saves made by the Dortmund custodian is mind-blowing. In the 48 games he has played this season, he has made 124 saves, an average of roughly 2.6 per game. He also averages 2.71 saves per goal conceded, which is pretty good and shows how much responsibility he carries for his team. As a keeper he earns a tick in every department of the game. He is almost unbeatable in one-on-one situations. He has brilliant reflexes and is quick off his line; more importantly, he recognizes the situations that require him to come forward. His shot-stopping qualities have been widely recognized and duly praised.
The most important thing, though, is mental toughness, and the Borussia vice-captain has plenty of that particular quality. He seems unfazed by whatever happens around him. He is vocal in his demeanor and is constantly organizing his team from the back; it is due to his organization that Borussia are able to break so fluidly. Another impressive aspect of his game is his long throws: not only does he manage large distances on them, they are also more often than not accurate.
What impresses most about this man is his consistency. It is only this season that his contributions are being widely noted, but as an avid Bundesliga watcher I can tell you that he has maintained a similar level of excellence throughout his career.
Without a shadow of a doubt it has to be the Champions League final. The man was an impregnable wall all through the 90 minutes. He made three excellent one-on-one stops to deny Robben, saved two missile-like shots from Alaba and Schweinsteiger, and tipped over Mandzukic’s goal-bound header with surprising agility and speed. He could not be blamed for either of the goals conceded by his team, and he was the man responsible for keeping the score down at 2-1.
The best players save their best for the biggest matches. The case with Roman was no different. In the final he made half-a-dozen world class saves, but one stood out. It was the one he made to deny Robben an easy goal. He was one-on-one against his opponent and showed his commitment and fearlessness in rushing out and denying Robben a clear angle at goal. Awake to the possibility of a chip, Weidenfeller stayed big and saved with his face.
Now whenever the world’s greatest keepers are discussed, Roman Weidenfeller will be counted amongst them, and for a man who has never been capped for his country that is a huge achievement. He has matured as a player and is a senior statesman in a young team. He leads his troops from the backline and leads them very well. The very fact that he managed to shine in a team with the likes of Reus, Gotze, Lewandowski, Gundogan and others says more than I ever could.
|
english
|
package com.zxjia.ssmp.controller;

import com.zxjia.ssmp.dto.IndexRequest;
import com.zxjia.ssmp.service.ProductService;
import com.zxjia.ssmp.vo.IndexVo;
import com.zxjia.ssmp.vo.ResultVO;
import io.swagger.annotations.Api;
import io.swagger.annotations.ApiOperation;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@Api(tags = "首页管理")
@RestController
@RequestMapping(value = "/api")
public class IndexController {

    @Autowired
    ProductService productService;

    @ApiOperation(value = "查询商品类别信息")
    @PostMapping(value = "/index/getProductByCateId")
    public ResultVO<IndexVo> getProductByCateId(@RequestBody IndexRequest request) {
        return ResultVO.success(productService.getProductCateById(request.getCatId()));
    }

    @ApiOperation(value = "搜索")
    @PostMapping(value = "/index/search")
    public ResultVO<IndexVo> search(@RequestBody IndexRequest request) {
        return ResultVO.success(productService.search(request));
    }
}
|
java
|
---
title: Enabling Debug Features in Visual C++ (-D_DEBUG) | Microsoft Docs
ms.custom: ''
ms.date: 11/15/2016
ms.prod: visual-studio-dev14
ms.reviewer: ''
ms.suite: ''
ms.technology:
- vs-ide-debug
ms.tgt_pltfrm: ''
ms.topic: article
f1_keywords:
- vs.debug
dev_langs:
- FSharp
- VB
- CSharp
- C++
helpviewer_keywords:
- /D_DEBUG compiler option [C++]
- debugging [C++], enabling debug features
- debugging [MFC], enabling debug features
- assertions, enabling debug features
- D_DEBUG compiler option
- MFC libraries, debug version
- debug builds, MFC
- _DEBUG macro
ms.assetid: 276e2254-7274-435e-ba4d-67fcef4f33bc
caps.latest.revision: 10
author: mikejo5000
ms.author: mikejo
manager: ghogen
ms.openlocfilehash: e512620e1af8da85039ed403d4280568101fbe57
ms.sourcegitcommit: 240c8b34e80952d00e90c52dcb1a077b9aff47f6
ms.translationtype: MT
ms.contentlocale: tr-TR
ms.lasthandoff: 10/23/2018
ms.locfileid: "49829267"
---
# <a name="enabling-debug-features-in-visual-c-ddebug"></a>Enabling Debug Features in Visual C++ (/D_DEBUG)
[!INCLUDE[vs2017banner](../includes/vs2017banner.md)]
In [!INCLUDE[vcprvc](../includes/vcprvc-md.md)], debug features such as assertions are enabled when you build your program with the **_DEBUG** symbol defined. You can define **_DEBUG** in one of two ways:
- Specify **#define _DEBUG** in your source code, or
- Specify the **/D_DEBUG** compiler option. (If you create your project in Visual Studio using the project wizards, **/D_DEBUG** is defined automatically in the Debug configuration.)
When **_DEBUG** is defined, the compiler compiles the sections of code surrounded by **#ifdef _DEBUG** and `#endif`.
The Debug configuration of an MFC program must be linked with the Debug version of the MFC library. The MFC header files determine the correct version of the MFC library to link against based on symbols you have defined, such as **_DEBUG** and **_UNICODE**. For details, see [MFC Library Versions](http://msdn.microsoft.com/library/3d7a8ae1-e276-4cf8-ba63-360c2f85ad0e).
## <a name="see-also"></a>See Also
[Debugging Native Code](../debugger/debugging-native-code.md)
[Project Settings for a C++ Debug Configuration](../debugger/project-settings-for-a-cpp-debug-configuration.md)
|
markdown
|
package com.toly1994.ds4android.activity;
import android.os.Bundle;
import android.support.v7.app.AppCompatActivity;
import android.widget.Toast;
import com.toly1994.ds4android.analyze.gold12.ZRandom;
import com.toly1994.ds4android.view.other.OnCtrlClickListener;
import com.toly1994.ds4android.view.SingleLinkedView;
import java.util.Arrays;
public class SingleLinkedChartActivity extends AppCompatActivity {
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
SingleLinkedView<String> view = new SingleLinkedView<>(this);
view.setOnCtrlClickListener(new OnCtrlClickListener<SingleLinkedView<String>>() {
@Override
public void onAdd(SingleLinkedView<String> view) {
// view.addData(ZRandom.randomOf3Name());
view.addData(ZRandom.randomCnName());
}
@Override
public void onAddByIndex(SingleLinkedView<String> view) {
view.addDataById(view.getSelectIndex(), ZRandom.randomCnName());
}
@Override
public void onRemove(SingleLinkedView<String> view) {
view.removeData();
}
@Override
public void onRemoveByIndex(SingleLinkedView<String> view) {
view.removeData(view.getSelectIndex());
}
@Override
public void onSet(SingleLinkedView<String> view) {
view.setData(view.getSelectIndex(), ZRandom.randomCnName());
}
@Override
public void onFind(SingleLinkedView<String> view) {
String data = view.findData(view.getSelectIndex());
Toast.makeText(SingleLinkedChartActivity.this, data, Toast.LENGTH_SHORT).show();
}
@Override
public void onFindByData(SingleLinkedView<String> view) {
int[] data = view.findData(view.getSelectData());
Toast.makeText(SingleLinkedChartActivity.this, Arrays.toString(data), Toast.LENGTH_SHORT).show();
}
@Override
public void onClear(SingleLinkedView<String> view) {
view.clearData();
}
});
setContentView(view);
}
}
|
java
|
{
"name": "@emceearjun/translate",
"version": "0.0.4",
"peerDependencies": {
"@angular/common": "^6.0.0-rc.0 || ^6.0.0",
"@angular/core": "^6.0.0-rc.0 || ^6.0.0"
},
"description": "<b>translate</b> provides a simple extension to help integrate internationalization with your Angular application.",
"main": "karma.conf.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": "git+https://github.com/emceearjun/translate.git"
},
"keywords": [
"translate",
"i18n",
"angular"
],
"author": "<NAME>",
"license": "MIT",
"bugs": {
"url": "https://github.com/emceearjun/translate/issues"
},
"homepage": "https://github.com/emceearjun/translate#readme"
}
|
json
|
#pragma once
#include <bio/gpu/gpu_Types.hpp>
namespace bio::gpu {
enum class ConnectionApi : i32 {
EGL = 1,
Cpu = 2,
Media = 3,
Camera = 4,
};
enum class DisconnectMode : u32 {
Api = 0,
AllLocal = 1,
};
struct QueueBufferOutput {
u32 width;
u32 height;
u32 transform_hint;
u32 pending_buffer_count;
};
struct Plane {
u32 width;
u32 height;
ColorFormat color_format;
Layout layout;
u32 pitch;
u32 map_handle_unused;
u32 offset;
Kind kind;
u32 block_height_log2;
DisplayScanFormat display_scan_format;
u32 second_field_offset;
u64 flags;
u64 size;
u32 unk[6];
};
static_assert(sizeof(Plane) == 0x58);
struct GraphicBufferHeader {
u32 magic;
u32 width;
u32 height;
u32 stride;
PixelFormat pixel_format;
GraphicsAllocatorUsage usage;
u32 pid;
u32 refcount;
u32 fd_count;
u32 buffer_size;
static constexpr u32 Magic = 0x47424652; // GBFR (Graphic Buffer)
};
// TODO: is the packed attribute really needed here?
struct __attribute__((packed)) GraphicBuffer {
GraphicBufferHeader header;
u32 null;
u32 map_id;
u32 zero;
u32 buffer_magic;
u32 pid;
u32 type;
GraphicsAllocatorUsage usage;
PixelFormat pixel_format;
PixelFormat external_pixel_format;
u32 stride;
u32 full_size;
u32 plane_count;
u32 zero_2;
Plane planes[3];
u64 unused;
static constexpr u32 Magic = 0xDAFFCAFF;
};
static_assert(sizeof(GraphicBuffer) == 0x16C);
struct Fence {
u32 id;
u32 value;
};
struct MultiFence {
u32 fence_count;
Fence fences[4];
};
struct Rect {
i32 left;
i32 top;
i32 right;
i32 bottom;
};
enum class Transform : u32 {
FlipH = 1,
FlipV = 2,
Rotate90 = 4,
Rotate180 = 3,
Rotate270 = 7,
};
struct QueueBufferInput {
i64 timestamp;
i32 is_auto_timestamp;
Rect crop;
i32 scaling_mode;
Transform transform;
u32 sticky_transform;
u32 unk;
u32 swap_interval;
MultiFence fence;
};
}
|
cpp
|
Citizens with an impressive track record of good work in the cause of freeing people at large from life's woes used to be recognised and honoured, with titles appropriate to their field of excellence, by the monarchs of the erstwhile Princely State of Mysore. The practice ceased when the monarchy merged with the democratic dispensation at the dawn of Independence from colonial rule seven decades ago. Mysureans of advanced age can recollect such titles as Raja Seva Dhureena, Dharma Rathnakara and so on.
The citizens thus honoured could proudly prefix these titles to their names, and the citizenry took care to accord them importance at public functions, signalling total societal acknowledgement of their esteem. As if to carry on that now-defunct practice, the State Government has introduced the Rajyotsava Awards. While the number of citizens bestowed royal recognition in days past could be counted in single digits, the awardees honoured by the Government of the day in the State are virtually a dime a dozen, considerably lowering their societal image.
In addition to awarding titles to achievers in specific fields of human endeavour, both the royalties and institutions devoted to spiritual pursuits are known to have honoured luminaries in various fields such as literature, music, sports and heroic deeds, particularly in times of threat to the country’s safety and sovereignty.
Honouring citizens nationally and regionally seems to have suffered from political patronage on many occasions. On one such occasion some years ago, one of the awardees in the State happened to be a shamiana contractor, creating a feeling of embarrassment for other awardees whose track records enjoyed societal recognition and adoration. A cursory observation of the areas in which the awardees have excelled gives one the gut feeling that two fields, namely agriculture and ecology, have not been bestowed the same level of recognition as fields such as literature, sports and arts.
The authority and decision-making for identifying the achievers to be honoured with Rajyotsava Awards is, admittedly, exercised by the elected representatives of the people in posts of the Government. So far, so good. The too-well-known feature of these same worthies, barring exceptions, amassing wealth carved out of public funds in amounts that are anybody's guess can be euphemistically called social self-service (!), truly a highly paying pursuit (!!).
|
english
|
{
"endpoint": "cicd_organization",
"input": "cicd_organization_in.json",
"output": "cicd_organization_out.json",
"query": "cicd_organization.gql"
}
|
json
|
Mumbai-based Workcell Solutions Pvt. Ltd, which runs Reap, a B2B app for pharmacies, has raised an undisclosed amount in seed funding from FreeCharge founders Kunal Shah and Sandeep Tandon, Chaayos co-founder and CEO Nitin Saluja, and Thinklabs founder Gagan Goyal, a financial daily reported on Wednesday.
The company will use the funds to develop the product further and expand operations, said a report in The Economic Times.
Reap, founded by IIT-Bombay alumnus Siddharth Gadia, solves supply-chain management issues at the pharmacy level, including distribution, stock visibility and working capital constraints. Besides, it also helps pharmacies increase their sales conversions.
E-mail queries sent by Techcircle to the company remained unanswered at the time of filing this report.
CEO Gadia told the newspaper that Reap has been able to crystallise the core problems facing pharmacies, adding that the company saw a huge disruption opportunity in the segment.
The app has so far tied up with 6,000 pharmacy stores in cities such as Jaipur, Surat, Ahmedabad and Nashik.
Reap recently launched Chemist Community, a network of pharmacy store-owners and distributors.
In September 2016, Pune-based Pharmarack Technologies Pvt. Ltd, which provides software-as-a-service applications for automated order-processing and inventory management to pharmaceutical retailers and distributors, had raised Rs 5 crore ($745,000) in a pre-Series A round of funding.
|
english
|
#!/usr/bin/env python3
'''
Write a function that will find all the anagrams of a word from a list. You will be given two inputs a word and an array with words.
You should return an array of all the anagrams or an empty array if there are none.
'''
def anagrams(word, words):
    analis = []
    for item in words:
        if sorted(word) == sorted(item):
            analis.append(item)
    return analis

# Alternative implementations
def anagrams(word, words):
    return [item for item in words if sorted(item) == sorted(word)]

def anagrams(word, words):
    # list() is needed in Python 3, where filter() returns a lazy iterator
    return list(filter(lambda x: sorted(word) == sorted(x), words))
|
python
|
/**
* Created by tsc on 4/1/16.
*/
function getPost(){
$.ajax({
/*url: '/'+module+'/'+action+'.html',*/
url: '/'+'index.php?module=post&action=default',
type: "GET",
success: function(data) {
$('#pageContent').html(data);
}
});
}
function getFormPost(id){
if(id){
$.ajax({
/*url: '/'+module+'/'+action+'.html',*/
url: '/' + 'index.php?module=post&action=edit',
type: "GET",
data: {id: id},
success: function (data) {
$('#pageContent').html(data);
}
});
}else {
$.ajax({
/*url: '/'+module+'/'+action+'.html',*/
url: '/' + 'index.php?module=post&action=create',
type: "GET",
success: function (data) {
$('#pageContent').html(data);
}
});
}
}
function postCreatePost(){
$.ajax({
/*url: '/'+module+'/'+action+'.html',*/
url: '/'+'index.php?module=post&action=create',
type: "POST",
dataType: "json",
data: $("#create-post").serialize(),
success: function(data) {
if(data['status'] == 'false'){
$('#error').html(data['error']);
}else{
getPost();
}
}
});
}
function postUpdatePost(){
$.ajax({
/*url: '/'+module+'/'+action+'.html',*/
url: '/'+'index.php?module=post&action=edit',
type: "POST",
dataType: "json",
data: $("#update-post").serialize(),
success: function(data) {
if(data['status'] == 'false'){
$('#error').html(data['error']);
}else{
getPost();
}
}
});
}
|
javascript
|
##############################################################################
#
# Copyright (c) 2001, 2002 Zope Foundation and Contributors.
# All Rights Reserved.
#
# This software is subject to the provisions of the Zope Public License,
# Version 2.1 (ZPL). A copy of the ZPL should accompany this distribution.
# THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL EXPRESS OR IMPLIED
# WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND FITNESS
# FOR A PARTICULAR PURPOSE.
#
##############################################################################
"""Directory-based resources test
"""
import os
import tempfile
import shutil
from unittest import TestCase
from zope.publisher.interfaces import NotFound
from zope.proxy import isProxy
from zope.publisher.browser import TestRequest
from zope.security import proxy
from zope.security.checker import NamesChecker, ProxyFactory
from zope.interface import implementer
from zope.location.interfaces import IContained
from zope.traversing.browser.absoluteurl import AbsoluteURL
from zope.traversing.browser.interfaces import IAbsoluteURL
from zope.component import provideAdapter, provideUtility
from zope.testing import cleanup
from zope.browserresource.directory import \
DirectoryResourceFactory, DirectoryResource
from zope.browserresource.file import FileResource
import zope.browserresource.tests as p
from zope.browserresource.tests import support
test_directory = os.path.dirname(p.__file__)
checker = NamesChecker(
('get', '__getitem__', 'request', 'publishTraverse')
)
@implementer(IContained)
class Ob(object):
__parent__ = __name__ = None
ob = Ob()
class Test(support.SiteHandler, cleanup.CleanUp, TestCase):
def setUp(self):
super(Test, self).setUp()
provideAdapter(AbsoluteURL, (None, None), IAbsoluteURL)
def testNotFound(self):
path = os.path.join(test_directory, 'testfiles')
request = TestRequest()
factory = DirectoryResourceFactory(path, checker, 'testfiles')
resource = factory(request)
self.assertRaises(NotFound, resource.publishTraverse,
resource.request, 'doesnotexist')
self.assertRaises(NotFound, resource.get, 'doesnotexist')
def testBrowserDefault(self):
path = os.path.join(test_directory, 'testfiles')
request = TestRequest()
factory = DirectoryResourceFactory(path, checker, 'testfiles')
resource = factory(request)
view, next = resource.browserDefault(request)
self.assertEqual(view(), '')
self.assertEqual(next, ())
def testGetitem(self):
path = os.path.join(test_directory, 'testfiles')
request = TestRequest()
factory = DirectoryResourceFactory(path, checker, 'testfiles')
resource = factory(request)
self.assertRaises(KeyError, resource.__getitem__, 'doesnotexist')
file = resource['test.txt']
def testForbiddenNames(self):
request = TestRequest()
old_forbidden_names = DirectoryResource.forbidden_names
path = tempfile.mkdtemp()
try:
os.mkdir(os.path.join(path, '.svn'))
with open(os.path.join(path, 'test.txt'), 'w') as f:
f.write('')
factory = DirectoryResourceFactory(path, checker, 'testfiles')
resource = factory(request)
self.assertEqual(resource.get('.svn', None), None)
self.assertNotEqual(resource.get('test.txt', None), None)
DirectoryResource.forbidden_names = ('*.txt', )
self.assertEqual(resource.get('test.txt', None), None)
self.assertNotEqual(resource.get('.svn', None), None)
finally:
shutil.rmtree(path)
DirectoryResource.forbidden_names = old_forbidden_names
def testProxy(self):
path = os.path.join(test_directory, 'testfiles')
request = TestRequest()
factory = DirectoryResourceFactory(path, checker, 'testfiles')
resource = factory(request)
file = ProxyFactory(resource['test.txt'])
self.assertTrue(isProxy(file))
def testURL(self):
request = TestRequest()
request._vh_root = support.site
path = os.path.join(test_directory, 'testfiles')
files = DirectoryResourceFactory(path, checker, 'test_files')(request)
files.__parent__ = support.site
file = files['test.gif']
self.assertEqual(file(), 'http://127.0.0.1/@@/test_files/test.gif')
def testURL2Level(self):
request = TestRequest()
request._vh_root = support.site
ob.__parent__ = support.site
ob.__name__ = 'ob'
path = os.path.join(test_directory, 'testfiles')
files = DirectoryResourceFactory(path, checker, 'test_files')(request)
files.__parent__ = ob
file = files['test.gif']
self.assertEqual(file(), 'http://127.0.0.1/@@/test_files/test.gif')
def testURL3Level(self):
request = TestRequest()
request._vh_root = support.site
ob.__parent__ = support.site
ob.__name__ = 'ob'
path = os.path.join(test_directory, 'testfiles')
files = DirectoryResourceFactory(path, checker, 'test_files')(request)
files.__parent__ = ob
file = files['test.gif']
self.assertEqual(file(), 'http://127.0.0.1/@@/test_files/test.gif')
subdir = files['subdir']
self.assertTrue(proxy.isinstance(subdir, DirectoryResource))
file = subdir['test.gif']
self.assertEqual(file(),
'http://127.0.0.1/@@/test_files/subdir/test.gif')
def testPluggableFactories(self):
path = os.path.join(test_directory, 'testfiles')
request = TestRequest()
resource = DirectoryResourceFactory(path, checker, 'files')(request)
class ImageResource(object):
def __init__(self, image, request):
pass
class ImageResourceFactory(object):
def __init__(self, path, checker, name):
pass
def __call__(self, request):
return ImageResource(None, request)
from zope.browserresource.interfaces import IResourceFactoryFactory
provideUtility(ImageResourceFactory, IResourceFactoryFactory, 'gif')
image = resource['test.gif']
self.assertTrue(proxy.isinstance(image, ImageResource))
file = resource['test.txt']
self.assertTrue(proxy.isinstance(file, FileResource))
def test_get_matches_forbidden(self):
resource = DirectoryResource(None, None)
with self.assertRaises(LookupError):
resource.get('.svn')
|
python
|
{{ define "title" }}
{{ if eq .Data.Singular "tag" }}
{{ humanize (i18n "tag") }} | {{ humanize .Data.Term }}
{{ else if eq .Data.Singular "category" }}
{{ humanize (i18n "category") }} | {{ humanize .Data.Term }}
{{ else }}{{ end }}
{{ end }}
{{ define "frontmenu" }}
{{ partial "nav.html" . }}
{{ end }}
{{ define "main" }}
<div class="ui centered grid">
<div class="sixteen wide mobile fifteen wide tablet four wide computer column">
{{ partial "header.html" . }}
</div>
<div class="sixteen wide mobile fifteen wide tablet eleven wide computer column post-list">
<div class="ui two stackable cards">
{{ partial "pagination.html" . }}
</div>
</div>
</div>
{{ end }}
|
html
|
{
"copyright_text": "Standard YouTube License",
"description": "We present the initial alpha release of QIIME 2, a Python 3 framework supporting interactive analysis and visualization of microbiomes on diverse high-performance computing resources; arbitrary interface development and platform integration; and a plugin system with automatic decentralized provenance tracking.",
"duration": 1634,
"id": 5364,
"language": "eng",
"recorded": "2016-07-15",
"related_urls": [
"https://github.com/qiime2/qiime2/wiki",
"http://qiime.org/"
],
"slug": "qiime-2-self-documenting-extensible-and-reproducible-microbiome-analysis-in-python-3-scipy-201",
"speakers": [
"<NAME>"
],
"tags": [
"biomedicine",
"qiime"
],
"thumbnail_url": "https://i.ytimg.com/vi/tLtGg21Yu9Q/maxresdefault.jpg",
"title": "QIIME 2: Self-documenting, Extensible, and Reproducible Microbiome Analysis in Python 3.",
"videos": [
{
"type": "youtube",
"url": "https://www.youtube.com/watch?v=tLtGg21Yu9Q"
}
]
}
|
json
|
Days after the National Mission for Clean Ganga (NMCG) sought a detailed report from the UP government, the state government has started collating district-wise data on “unidentified dead bodies or unclaimed corpses” fished out or recovered from the Ganga across the state.
A report in The Indian Express quoted sources as saying that this was one of the seven points on which the state government has sought a “weekly report” from the districts.
The district magistrates have also been asked to send a report on the “audit of the infrastructure for crematoria/burial grounds and their utilization.” They have also been asked to stop the dumping or burial of dead bodies or unidentified corpses in the river Ganga or on its banks.
The report added that the issue of the water quality of the rivers has also been flagged in the letter. Earlier, the NMCG had held a video conference with the chief secretaries of Uttar Pradesh and Bihar on May 15 and sent a letter to the states on May 20.
An advisory was issued by Rajiv Ranjan Mishra, Director General, National Mission for Clean Ganga on May 11 to the district magistrates, who are also the chairpersons of the district Ganga committees. This was followed up by a letter the next day to the chief secretaries to prevent the dumping of dead bodies in the river and ensure enforcement of the government guidelines on the cremation of COVID-19 victims. The letter also advised the states to provide financial assistance as well as regulate the rates for the cremation or burial process.
The NMCG Director General had written to the Chief Secretaries of five states (Uttarakhand, Uttar Pradesh, Bihar, Jharkhand and West Bengal) on May 12, asking them to issue “specific directions” to the concerned district administrations, local bodies and police authorities.
Read all the Latest News, Breaking News and Coronavirus News here. Follow us on Facebook, Twitter and Telegram.
|
english
|
import { src, dest } from 'gulp';
import { SrcOptions } from 'vinyl-fs';
/**
* Copies file(s) to the specified output dir(s).
 * @param {string | string[]} filePaths The path or array of paths to the file(s) to be copied.
 * @param {string | string[]} outputDir The path or array of paths to the output dir(s).
 * @param {SrcOptions} [srcOptions] Optional options passed through to `src`.
 */
export function copyFiles(filePaths: string | string[], outputDir: string | string[], srcOptions?: SrcOptions): NodeJS.ReadWriteStream {
let stream = src(filePaths, srcOptions);
const destDirs = outputDir instanceof Array ? outputDir : [outputDir];
destDirs.forEach(dir => stream = stream.pipe(dest(dir)));
return stream;
}
|
typescript
|
---
type: blog
date: "2021-05-22T18:41:30Z"
author: <NAME>
title: "Guernsey is Inescapable"
categories:
- Misc
series: ["Moving to London"]
---
One of the things I was particularly looking forward to about London was the sense of anonymity. In Guernsey, you see someone you know around every corner, and that leads to a sense of
feeling watched & judged, and having to perform and conform 100% of the time you're out in public. Obviously, I hated that, but I thought London would be different.
And obviously it is, by and large. But I have already bumped into three people from Guernsey that I know, and I've only been here 15 days. The first was the brother of someone in my
year at school - we were in a climbing gym and he was wearing a mask, so I stared at him for a solid 15 minutes trying to confirm his identity without being able to see his nose or
mouth. It turns out people's noses and mouths are often their distinguishing features, and now he definitely thinks I'm some kind of creepy, stare-y lunatic. The second was one of my
best friend's sisters as I was walking through Vauxhall; we went round opposite sides of a tree before both double taking and realising that actually *is* who we thought it was. The third
came today as I was walking home along the river from Covent Garden; there, enjoying a waterside pint, was my old debating cup partner from school. He doesn't
look like he's aged a day in the last ten years, and sadly hasn't grown out of the whole tweed jacket thing yet, meaning I doubt he ever will.
Admittedly the second two weren't unpleasant, but still. The possibility remains that if I am out doing something particularly rogue, it still might get fed back to the Guernsey grapevine.
Maybe I'll simply have to move further afield; Japan is tempting.
|
markdown
|
import { replaceAllDoubleDots } from '../../../src/bundle/bundle-package-json';

describe('bundle/bundle-package-json', () => {
  it('replaces relative paths', () => {
    const paths = replaceAllDoubleDots(
      { 'test': 'file:../test-package/dist' },
      'dist/src/'
    );
    // Normalize Windows backslashes so the assertion is platform-independent.
    paths.test = paths.test.replace(/\\/g, '/');
    expect(paths.test).toEqual('file:../../../test-package/dist');
  });
});
|
typescript
|
use super::{Diagnostic, SourceFile};
use std::collections::HashMap;

pub struct CompileSession {
    diagnostics: Vec<Diagnostic>,
    files: HashMap<u64, SourceFile>,
}

impl CompileSession {
    pub fn new() -> CompileSession {
        CompileSession {
            diagnostics: Vec::new(),
            files: HashMap::new(),
        }
    }

    pub fn add_source_file(&mut self, file: SourceFile) {
        self.files.insert(file.unique_key, file);
    }

    pub fn get_source_file(&self, key: u64) -> Option<&SourceFile> {
        self.files.get(&key)
    }

    pub fn add_diagnostic(&mut self, diagnostic: Diagnostic) {
        self.diagnostics.push(diagnostic);
    }

    pub fn diagnostics(&self) -> &Vec<Diagnostic> {
        &self.diagnostics
    }
}
|
rust
|
While announcing special open market operations on August 25, 2020, the Reserve Bank stated that it would continue to monitor evolving liquidity and market conditions and take measures as appropriate to ensure the orderly functioning of financial markets.
2. Recently, market sentiment has been impacted by concerns relating to the inflation outlook and the fiscal situation amidst global developments that have firmed up yields abroad.
3. On the outlook for inflation, the resolution of the Monetary Policy Committee (MPC) on August 6, 2020 identified the sources of inflation pressures and expected that although headline inflation may remain elevated in Q2:2020-21, it would moderate in H2:2020-21. Accordingly, the MPC decided to pause and remain watchful and use the available space judiciously to support the revival of the economy. There are indications that food and fuel prices are stabilising and cost push factors are moderating. In addition, the recent appreciation of the rupee is working towards containing imported inflationary pressures. The RBI remains vigilant about these developments. In support of the accommodative stance of monetary policy, the RBI is committed to ensuring comfortable liquidity and financing conditions in the economy.
4. Notwithstanding an augmented market borrowing programme for 2020-21, the RBI has managed the borrowing calendar for the first half of the year seamlessly, completing more than 90 per cent of scheduled borrowings of the Centre and States in H1:2020-21. The RBI has assured that the borrowing programme of the Centre and States for the year 2020-21 will be completed in a non-disruptive manner.
5. In order to continue to ensure orderly market conditions and congenial financial conditions, the following measures are being announced:
The Reserve Bank will conduct additional special open market operations involving the simultaneous purchase and sale of Government securities for an aggregate amount of ₹20,000 crore in two tranches of ₹10,000 crore each. The auctions will be conducted on September 10, 2020 and September 17, 2020. The RBI remains committed to conducting further such operations as warranted by market conditions.
The Reserve Bank will conduct term repo operations for an aggregate amount of ₹100,000 crore at floating rates (i.e., at the prevailing repo rate) in the middle of September to assuage pressures on the market on account of advance tax outflows. In order to reduce the cost of funds, banks that had availed of funds under long-term repo operations (LTROs) may exercise an option of reversing these transactions before maturity. Thus, the banks may reduce their interest liability by returning funds taken at the repo rate prevailing at that time (5.15 per cent) and availing funds at the current repo rate of 4 per cent. Details are being notified separately.
Currently, banks are required to maintain 18 per cent of their net demand and time liabilities (NDTL) in SLR securities. The extant limit for investments that can be held in HTM category is 25 per cent of total investment. Banks are allowed to exceed this limit provided the excess is invested in SLR securities within an overall limit of 19.5 per cent of NDTL. SLR securities held in HTM category by major banks amount to around 17.3 per cent of NDTL at present. However, there are inter-bank variations with some banks close to the 19.5 per cent of NDTL limit. Accordingly, it has been decided to allow banks to hold fresh acquisitions of SLR securities acquired from September 1, 2020 under HTM up to an overall limit of 22 per cent of NDTL up to March 31, 2021 which shall be reviewed thereafter. Details are being notified separately.
The RBI stands ready to conduct market operations as required through a variety of instruments so as to ensure orderly market functioning.
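The interest saving from the LTRO reversal described above is simply the rate differential applied to the amount switched; a rough illustration with a hypothetical amount (not an RBI figure):

```python
def annual_saving(amount_crore, old_rate_pct=5.15, new_rate_pct=4.00):
    """Annual interest saved by returning funds borrowed at the old repo
    rate (5.15%) and re-borrowing at the current rate (4%).
    The amount is hypothetical, for illustration only."""
    return amount_crore * (old_rate_pct - new_rate_pct) / 100.0
```

On a hypothetical ₹1,000 crore of LTRO funds, the saving works out to about ₹11.5 crore a year.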
6. The RBI remains committed to use all instruments at its command to revive the economy by maintaining congenial financial conditions, mitigate the impact of COVID-19 and restore the economy to a path of sustainable growth while preserving macroeconomic and financial stability.
(Yogesh Dayal)
|
english
|
IND vs AUS 3rd T20I Report: Ruturaj Gaikwad’s roaring 123* might have rattled Australia, but they had a man to answer it, and when Glenn Maxwell stepped up, Australia found their way to a five-wicket win in the third T20I against India. The win is a breather for them, as a loss would have put the series to bed. The series now stands at 2-1 in India’s favor, with the last two matches to determine the final verdict.
In the first innings, Ruturaj Gaikwad batted sensationally, finishing unbeaten on 123 off 57 balls. Despite Jason Behrendorff’s impressive spell of 1 for 12 in four overs, Australia struggled to contain the run flow. Gaikwad took advantage of loose bowling from Hardie and Sangha, showcasing a range of confident shots, including three sixes and a four in the 18th over. The final three overs brought a flurry of runs, with Gaikwad adding 67 to India’s total. Nathan Ellis tried to pull things back in the 19th, but Gaikwad’s onslaught continued in the 20th, where he scored 30 runs off Wade’s bowling, guiding India to a formidable 222. Gaikwad reached his century in 52 balls.
Australia managed to maintain a competitive run rate throughout their innings, thanks to Travis Head and Maxwell’s aggressive batting. In the crucial overs, particularly the 19th by Axar Patel and the 20th by Prasidh Krishna, Maxwell and Matthew Wade capitalized on the dewy conditions, consistently finding the boundary. They achieved an improbable win, needing 43 off 12 balls, 21 off six, and two off the last ball. Maxwell’s powerful finish, hitting the last four balls for 6, 4, 4, 4, sealed the victory, leaving the home crowd stunned.
Maxwell’s impactful entry at 66 for 2 in the sixth over proved decisive. His aggressive batting, including two sixes and a four in the eighth over, set the tone for the chase. Despite brief setbacks with the wickets of Josh Inglis and Marcus Stoinis, Maxwell’s onslaught, especially in the 17th over with back-to-back sixes, kept Australia in the game. As per the IND vs AUS 3rd T20I Report, Prasidh’s 18th over added pressure, but an expensive 19th from Axar brought the equation down, leading to Maxwell and Wade’s final charge for victory.
Australia 225 for 5 (Maxwell 104*, Head 35, Wade 28*, Bishnoi 2-32)
India 222 for 3 (Gaikwad 123*, Suryakumar 39, Tilak 31)
|
english
|
#
# Copyright (c) 2014, 2016, 2018, 2020 LexisNexis Risk Data Management Inc.
#
# This file is part of the RadSSH software package.
#
# RadSSH is free software, released under the Revised BSD License.
# You are permitted to use, modify, and redistribute this software
# according to the Revised BSD License, a copy of which should be
# included with the distribution as file LICENSE.txt
#
'''
RadSSH Module
Simplified Paramiko interface for managing clustered SSH interaction
'''
import os
import threading
import socket
import time
import uuid
import fnmatch
import netaddr
import re
import logging
import hashlib
import shlex
import subprocess
import queue
import paramiko
from .authmgr import AuthManager
from .streambuffer import StreamBuffer
from .dispatcher import Dispatcher, UnfinishedJobs
from .console import RadSSHConsole, user_password
from . import known_hosts
from . import config
from .keepalive import KeepAlive, ServerNotResponding
# If main thread gets KeyboardInterrupt, use this to signal
# running background threads to terminate prior to command completion
user_abort = threading.Event()
FILTER_TTY_ATTRS_RE = re.compile(rb"\x1b\[(\d+)(;(\d+))*m")
# Map ssh_config LogLevels to Python logging module levels
# This may need some future adjustment, as the labels don't quite line up
sshconfig_loglevels = {
'QUIET': 0,
'FATAL': logging.CRITICAL,
'ERROR': logging.ERROR,
'INFO': logging.WARNING,
'VERBOSE': logging.INFO,
'DEBUG': logging.DEBUG,
'DEBUG1': logging.DEBUG,
'DEBUG2': logging.DEBUG,
'DEBUG3': logging.DEBUG
}
def filter_tty_attrs(line):
'''Handle the attributes for colors, etc.'''
return FILTER_TTY_ATTRS_RE.sub(b'', line)
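The `filter_tty_attrs` helper above strips ANSI SGR (color/attribute) sequences from captured output before it is logged or displayed. A minimal standalone sketch of the same idea, independent of RadSSH:

```python
import re

# SGR escape sequences look like: ESC [ <n> (; <n>)* m
ANSI_SGR_RE = re.compile(rb"\x1b\[\d+(;\d+)*m")

def strip_sgr(line: bytes) -> bytes:
    """Remove ANSI SGR attribute sequences, keeping the visible text."""
    return ANSI_SGR_RE.sub(b"", line)
```

For example, `strip_sgr(b"\x1b[31mERROR\x1b[0m done")` yields `b"ERROR done"`.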
class Quota(object):
'''Quota values for auto-termination of in-flight commands'''
def __init__(self, defaults={}):
self.time_limit = int(defaults.get('quota.time', 0))
self.byte_limit = int(defaults.get('quota.bytes', 0))
self.line_limit = int(defaults.get('quota.lines', 0))
def settings(self):
return self.time_limit, self.byte_limit, self.line_limit
def time_exceeded(self, elapsed_time):
if self.time_limit and elapsed_time > self.time_limit:
return True
else:
return False
def bytes_exceeded(self, bytes):
if self.byte_limit and bytes > self.byte_limit:
return True
else:
return False
def lines_exceeded(self, lines):
if self.line_limit and lines > self.line_limit:
return True
else:
return False
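The three `*_exceeded` checks in `Quota` above share a single pattern: a limit of zero disables the check entirely. A compact sketch of that behavior:

```python
def limit_exceeded(limit: int, value) -> bool:
    """Mirror the Quota checks: a zero (unset) limit never trips;
    otherwise the quota is exceeded only when value passes the limit."""
    return bool(limit and value > limit)
```

This is why the `Quota` defaults of 0 for `quota.time`, `quota.bytes`, and `quota.lines` mean "no auto-termination".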
class CommandResult(object):
'''Generic object to save a bunch of fields'''
def __init__(self, **kwargs):
for k, v in kwargs.items():
self.__setattr__(k, v)
def __repr__(self):
return '%s "%s" : [%s]' % (self.status, self.command, self.return_code)
class Chunker(object):
'''Allow list of host connections to be chunkified into sublists'''
def __init__(self, grouping=10, delay=30):
self.data = [[]]
self.grouping = grouping
self.delay = delay
def add(self, *args):
for x in args:
if self.grouping and len(self.data[-1]) >= self.grouping:
self.data.append([])
self.data[-1].append(x)
def __len__(self):
return sum([len(x) for x in self.data])
def __iter__(self):
for x in self.data[:-1]:
yield x
if self.delay:
try:
time.sleep(self.delay)
except KeyboardInterrupt:
print('<Ctrl-C> in chunk mode delay')
# Yield the last one without adding a delay, since it is the last
yield self.data[-1]
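The grouping behavior of `Chunker` above can be illustrated in isolation (grouping only; the inter-chunk delay is omitted so the sketch runs instantly):

```python
def chunkify(items, grouping):
    """Split items into sublists of at most `grouping` elements,
    mirroring Chunker.add; a grouping of 0/None yields one chunk."""
    chunks = [[]]
    for x in items:
        if grouping and len(chunks[-1]) >= grouping:
            chunks.append([])
        chunks[-1].append(x)
    return chunks
```

With `grouping=10, delay=30` (the defaults above), a 25-host cluster runs a command in three waves with a 30-second pause between waves.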
def run_local_command(original_name, remote_hostname, port, remote_username, sshconfig):
'''
Handle preparing command line to run on connecting to remote host. This
includes doing possible substitutions that are not done by Paramiko's
ssh_config, as they involve connection specific values that are not known
at the time of original lookup.
'''
if sshconfig.get('permitlocalcommand', 'no') != 'yes':
return
cmd = sshconfig.get('localcommand')
if not cmd:
return
if '%' in cmd:
translations = {
'%d': os.path.expanduser('~'),
'%h': remote_hostname,
'%l': socket.gethostname(),
'%n': original_name,
'%p': str(port),
'%r': sshconfig.get('user', remote_username),
'%u': os.getlogin(),
'%C': hashlib.sha1((
socket.gethostname()
+ remote_hostname
+ str(port)
+ sshconfig.get('user', remote_username)).encode('UTF8')).hexdigest()
}
for token, subst in translations.items():
cmd = cmd.replace(token, subst)
logging.getLogger('radssh').info('Executing LocalCommand "%s" for connection to %s', cmd, original_name)
# Self-inflicted harm if this never returns...
p = subprocess.Popen(shlex.split(cmd))
logging.getLogger('radssh').debug('LocalCommand "%s" completed with return code %d', cmd, p.wait())
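Of the LocalCommand tokens substituted above, `%C` is the only computed one: a SHA-1 over the local hostname, remote hostname, port, and user, matching the expression in `run_local_command`. A standalone sketch:

```python
import hashlib

def percent_c(local_host, remote_host, port, user):
    """Compute the %C connection hash used for LocalCommand
    substitution: sha1(localhost + host + port + user) as hex."""
    material = (local_host + remote_host + str(port) + user).encode('UTF8')
    return hashlib.sha1(material).hexdigest()
```

The hash is deterministic for a given connection tuple, which makes it useful for per-connection file names (e.g. control sockets).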
def connection_worker(host, conn, auth, sshconfig={}):
check_host_key = True
# host is the label of the host, conn is the "real" name/ip to connect
# to, or an already established socket-like object. If conn is not
# filled in, use the label as the hostname.
if not conn:
conn = str(host)
if isinstance(conn, str):
hostname = sshconfig.get('hostname', conn)
port = sshconfig.get('port', '22')
proxy = sshconfig.get('proxycommand')
if proxy:
logging.getLogger('radssh').info('Connecting to %s via ProxyCommand "%s"', hostname, proxy)
s = paramiko.ProxyCommand(proxy)
else:
try:
timeout = sshconfig.get('connecttimeout')
if timeout:
timeout = float(timeout)
except Exception as e:
logging.getLogger('radssh').error('Invalid ConnectTimeout value "%s" ignored: %s', timeout, e)
timeout = None
s = socket.create_connection((hostname, int(port)), timeout=timeout)
run_local_command(conn, hostname, port, auth.default_user, sshconfig)
t = paramiko.Transport(s)
t.setName(host)
ciphers = sshconfig.get('ciphers')
if ciphers:
logging.getLogger('radssh').debug('Limit Ciphers to %s', ciphers)
preferred = []
for name in ciphers.split(','):
if name in t._preferred_ciphers:
preferred.append(name)
else:
logging.getLogger('radssh').debug('Ignoring cipher %s (not supported by Paramiko)', name)
t._preferred_ciphers = tuple(preferred)
logging.getLogger('radssh').debug('Setting Paramiko _preferred_ciphers to %s', t._preferred_ciphers)
kex_algorithms = sshconfig.get('kexalgorithms')
if kex_algorithms:
logging.getLogger('radssh').debug('Limit KexAlgorithms to %s', kex_algorithms)
preferred = []
for name in kex_algorithms.split(','):
if name in t._preferred_kex:
preferred.append(name)
else:
logging.getLogger('radssh').debug('Ignoring KexAlgorithm %s (not supported by Paramiko)', name)
t._preferred_kex = tuple(preferred)
logging.getLogger('radssh').debug('Setting Paramiko _preferred_kex to %s', t._preferred_kex)
macs = sshconfig.get('macs')
if macs:
logging.getLogger('radssh').debug('Limit MACs to %s', macs)
preferred = []
for name in macs.split(','):
if name in t._preferred_macs:
preferred.append(name)
else:
logging.getLogger('radssh').debug('Ignoring MAC %s (not supported by Paramiko)', name)
t._preferred_macs = tuple(preferred)
logging.getLogger('radssh').debug('Setting Paramiko _preferred_macs to %s', t._preferred_macs)
elif isinstance(conn, paramiko.Transport):
# Reuse of established Transport, don't overwrite name
# and don't bother doing host key verification
t = conn
check_host_key = False
else:
# Socket (or sock-like) which is a probably a tunneled connection
t = paramiko.Transport(conn)
port = t.getpeername()[1]
t.setName(host)
hostname = host
t.set_log_channel('radssh.paramiko.transport.%s' % host)
# Assign the ssh_config LogLevel to the paramiko.transport logger
loglevel = sshconfig.get('loglevel', 'INFO')
if loglevel.upper() in sshconfig_loglevels:
logging.getLogger(t.get_log_channel()).setLevel(sshconfig_loglevels[loglevel.upper()])
else:
logging.getLogger('radssh').warning('Unknown LogLevel (%s) for %s', loglevel, host)
try:
if check_host_key:
verify_host = sshconfig.get('hostkeyalias', str(hostname))
sys_known_hosts = known_hosts.load(sshconfig.get('globalknownhostsfile', '/etc/ssh/ssh_known_hosts'))
user_known_hosts = known_hosts.load(sshconfig.get('userknownhostsfile', '~/.ssh/known_hosts'))
keys = list(sys_known_hosts.matching_keys(verify_host, int(port)))
keys.extend(user_known_hosts.matching_keys(verify_host, int(port)))
if keys:
# Only request the key types from known_hosts
t._preferred_keys = [x.key.get_name() for x in keys]
else:
# Order per HostKeyAlgorithms, or bump Paramiko precedence of ECDSA
t._preferred_keys = sshconfig.get('hostkeyalgorithms', 'ecdsa-sha2-nistp256,ssh-rsa,ssh-dss').split(',')
if not t.is_active():
t.start_client()
# Do the key verification based on sshconfig settings
known_hosts.verify_transport_key(t, verify_host, int(port), sshconfig)
except Exception as e:
logging.getLogger('radssh').error('Unable to verify host key for %s\n%s', verify_host, repr(e))
print('Unable to verify host key for', verify_host)
print(repr(e))
t.close()
print('Connection to %s closed.' % str(hostname))
return t
# After connection and passing host key verification, now try to authenticate
auth.authenticate(t, sshconfig)
return t
def exec_command(host, t, cmd, quota, streamQ, encoding='UTF-8'):
'''Run a command across a transport via exec_cmd. Capture stdout, stderr, and return code, streaming to an optional output queue'''
return_code = None
if isinstance(t, paramiko.Transport) and t.is_authenticated():
stdout = StreamBuffer(streamQ, (str(host), False), blocksize=2048, encoding=encoding)
stderr = StreamBuffer(streamQ, (str(host), True), blocksize=2048, encoding=encoding)
keepalive = KeepAlive(t)
# If transport has a persistent session (identified by being named same as the transport.remote_version)
# then use the persistent session via send/recv to the shell quasi-interactively, rather than
# creating a single-use session with exec_command, which gives true process termination (exit_status_ready)
# and process return code capabilities.
persistent_session = None
for s in t._channels.values():
if s.get_name() == t.remote_version:
persistent_session = s
while s.recv_ready():
# clear out any accumulated data
s.recv(2048)
time.sleep(0.4)
# Send a bunch of newlines in hope to get a bunch of prompt lines
s.sendall('\n\n\n\n\n')
time.sleep(0.5)
try:
s.settimeout(3.0)
# Read the queued data as the remote host prompt
# to check for command completion if we get the prompt again
data = s.recv(2048)
prompt_lines = [x.strip() for x in data.split('\n') if x.strip()]
persist_prompt = prompt_lines[-1]
stdout.push('\n=== Start of Exec: Prompt is [%s] ===\n\n' % persist_prompt)
except (socket.timeout, IndexError):
persist_prompt = None
stdout.push('\n=== Start of Exec: Failed to read prompt [%s] ===\n\n' % data)
s.send('%s\n' % cmd)
break
else:
s = t.open_session()
s.set_name(t.getName())
xcmd = cmd
s.exec_command(xcmd)
stdout_eof = stderr_eof = False
quiet_increment = 0.4
quiet_time = 0
while not (stdout_eof and stderr_eof and s.exit_status_ready()):
# Read from stdout socket
s.settimeout(quiet_increment)
try:
data = s.recv(16384)
quiet_time = 0
if data:
stdout.push(data)
if persistent_session:
if persist_prompt and persist_prompt in data:
stdout_eof = True
process_completion = '*** Returned To Prompt ***'
break
if data.strip().endswith('--More--'):
s.sendall(' ')
else:
stdout_eof = True
# Avoid wild CPU-bound thrashing under Python3 GIL
s.status_event.wait(0.01)
except socket.timeout:
# Push out a (nothing) in case the queue needs to do a time-based dump
stdout.push('')
quiet_time += quiet_increment
try:
if quiet_time > 5.0:
keepalive.ping()
except ServerNotResponding:
t.close()
process_completion = '*** Server Not Responding ***'
break
# Read from stderr socket, altered timeout
try:
s.settimeout(0.1)
data = s.recv_stderr(4096)
if data:
stderr.push(data)
else:
stderr_eof = True
except socket.timeout:
pass
# Check quota limits
if quota.time_exceeded(quiet_time):
process_completion = '*** Time Limit (%d) Reached ***' % quota.time_limit
break
if quota.bytes_exceeded(len(stdout)):
process_completion = '*** Byte Limit (%d) Reached ***' % quota.byte_limit
break
if quota.lines_exceeded(stdout.line_count):
process_completion = '*** Line Limit (%d) Reached ***' % quota.line_limit
break
if user_abort.is_set():
process_completion = '*** <Ctrl-C> Abort ***'
break
# Make a guess if the command completed, since persistent sessions
# via invoke_shell don't do EOF or exit_status at all...
if persistent_session and quiet_time > 30:
process_completion = '*** Presumed Complete ***'
break
else:
process_completion = '*** Complete ***'
return_code = s.recv_exit_status()
if persistent_session:
return_code = 0
else:
s.close()
stdout.close()
if stdout.discards:
logging.getLogger('radssh').warning('StreamBuffer encountered %d discards', stdout.discards)
process_completion += 'StreamBuffer encountered %d discards' % stdout.discards
stderr.close()
return CommandResult(command=cmd, return_code=return_code, status=process_completion, stdout=stdout.buffer, stderr=stderr.buffer)
else:
process_completion = '*** Skipped ***'
return CommandResult(command=cmd, return_code=return_code, status=process_completion, stdout=b'', stderr=b'')
def sftp_thread(host, t, srcfile, dstfile=None, attrs=None):
if not attrs:
attrs = paramiko.sftp_attr.SFTPAttributes.from_stat(os.stat(srcfile))
s = t.open_sftp_client()
if not dstfile:
dstfile = srcfile
s.put(srcfile, dstfile)
s.chmod(dstfile, attrs.st_mode % 4096)
try:
s.chown(dstfile, attrs.st_uid, attrs.st_gid)
except IOError:
pass
s.close()
return CommandResult(command='SFTP %s -> %s' % (srcfile, dstfile),
return_code=0, status='*** Complete ***',
stdout='Transferred %d bytes' % attrs.st_size, stderr='')
def close_connection(t, k, signoff=''):
'''Close Paramiko transport connections'''
if isinstance(t, paramiko.Transport):
if t.is_authenticated():
# Scan for persistent session, and sign-off cleanly
for s in t._channels.values():
if s.get_name() == t.remote_version:
s.send('\n'.join(signoff.split(';')) + '\n')
t.close()
class Cluster(object):
'''SSH Cluster'''
def __init__(self, hostlist, auth=None, console=None, mux={}, defaults={}, commandline_options={}):
'''Create a Cluster object from a list of host entries'''
if auth:
self.auth = auth
else:
self.auth = AuthManager()
if console:
self.console = console
else:
# Limit console queue size to 4x connections to avoid excess
# bottleneck when extremely high volume output
outQ = queue.Queue(min(100, 4 * len(hostlist)))
self.console = RadSSHConsole(outQ)
self.console.quiet(True)
if defaults:
self.defaults = defaults
else:
self.defaults = config.load_default_settings()
self.log_out = self.defaults.get('log_out', 'out.log').strip()
self.log_err = self.defaults.get('log_err', 'err.log').strip()
thread_count = min(int(self.defaults.get('max_threads')), len(hostlist))
self.dispatcher = Dispatcher(outQ=queue.Queue(), threadpool_size=thread_count)
self.pending = {}
self.uuid = uuid.uuid1()
self.connections = {}
self.connect_timings = {}
self.mux = {}
self.reverse_port = {}
self.disabled = set()
self.last_result = None
self.user_vars = {}
self.quota = Quota(self.defaults)
self.chunk_size = None
self.chunk_delay = 0
self.output_mode = self.defaults['output_mode']
self.ordered_placeholder = self.defaults['ordered_placeholder']
self.sshconfig = paramiko.SSHConfig()
# Only load SSHConfig if path is set in RadSSH config
if defaults.get('ssh_config'):
logging.getLogger('radssh').warning('Loading SSH Config file: %s', defaults['ssh_config'])
try:
with open(os.path.expanduser(defaults['ssh_config'])) as user_config:
self.sshconfig.parse(user_config)
except IOError as e:
logging.getLogger('radssh').warning('Unable to process user ssh_config file: %s', e)
if os.path.isdir('/etc/ssh'):
system_config = '/etc/ssh/ssh_config'
else:
# OSX location
system_config = '/etc/ssh_config'
try:
with open(system_config) as sysconfig:
self.sshconfig.parse(sysconfig)
except IOError as e:
logging.getLogger('radssh').warning('Unable to process system ssh_config file (%s): %s', system_config, e)
for label, conn in hostlist:
config = self.get_ssh_config(label, conn)
if mux:
for idx, mux_var in enumerate(mux.get(label, [])):
mux_label = '%s:%d' % (label, idx)
self.pending[self.dispatcher.submit(connection_worker, mux_label, conn, self.auth, config)] = label
self.mux[mux_label] = mux_var
else:
self.pending[self.dispatcher.submit(connection_worker, label, conn, self.auth, config)] = label
self.update_connections()
# Start remainder of dispatcher threads
self.dispatcher.start_threads(len(self.connections))
def update_connections(self):
'''Pull completed transport creations and save in connections dict'''
while True:
try:
for pid, summary in self.dispatcher.async_results(5):
host = self.pending.pop(pid)
transport = summary.result
self.connections[host] = transport
self.connect_timings[host] = summary.end_time - summary.start_time
try:
if transport.is_authenticated():
transport.set_keepalive(int(self.defaults.get('keepalive', 0)))
self.console.progress('.')
logging.getLogger('radssh.connection').info('Authenticated to %s' % host)
# IOS switch may require invoke_shell instead of exec_command
for id_string in self.defaults.get('force_tty', '').split(','):
if id_string and id_string in transport.remote_version:
self.console.message('%s (%s)' % (host, transport.remote_version), 'FORCE TTY')
tty = transport.open_session()
tty.set_name(transport.remote_version)
tty.get_pty(width=132, height=43)
tty.invoke_shell()
# If we have a signon string, send it to the remote host
# translate semi-colons as newlines (and tack on an extra \n at the end)
if self.defaults.get('force_tty.signon'):
tty.send('\n'.join(self.defaults.get('force_tty.signon', '').split(';')) + '\n')
time.sleep(0.5)
while tty.recv_ready():
banner = tty.recv(2048)
self.console.message(str(banner), 'SIGNON')
# Issue a final empty line to trigger a fresh prompt
tty.send('\n')
else:
self.console.progress('O')
logging.getLogger('radssh.connection').warning('Failed to authenticate to %s: %s' % (host, str(transport)))
except Exception:
self.console.progress('X')
logging.getLogger('radssh.connection').warning('Failed to connect to %s: %s' % (host, str(transport)))
break
except UnfinishedJobs as e:
self.console.message(e.message, 'STALLED')
except KeyboardInterrupt:
self.console.message('Aborting %d pending connections' % len(self.pending), 'Ctrl-C')
for label in self.pending.values():
self.console.message(label, 'FAILED CONNECTION')
self.connections[label] = Exception('Failed to connect/Ctrl-C')
logging.getLogger('radssh.connection').warning('Aborted connect to %s: Ctrl-C' % host)
self.pending.clear()
# Blocked threads can cause issues with internal recordkeeping of Dispatcher
# object, and havoc if the thread ever unblocks and sends a completion message
# via outQ when async_results are used for another batch of jobs. Safest
# thing in this case is to abandon the Dispatcher blocked on connection
# results, and begin the session with a new Dispatcher object for
# the exec_command calls. The terminate() call will safely terminate the
# unblocked threads, freeing some resources for the new Dispatcher.
self.dispatcher.terminate()
new_dispatcher = Dispatcher(outQ=queue.Queue(), threadpool_size=self.dispatcher.threadpool_size)
self.dispatcher = new_dispatcher
break
self.console.progress('\n')
self.console.status('Ready')
def reauth(self, user):
'''Attempt to reconnect and reauthenticate to any hosts in the cluster that are not already properly established'''
if not user or user == self.auth.default_user:
# Not switching users, we know the existing auth won't work
retry = AuthManager(self.auth.default_user, auth_file=None,
try_auth_none=False)
else:
# Give the option of reusing the existing auth options with new user
alternate_password = user_password(
'Please enter a password for (%s) or leave blank to retry auth options with new user:' % user)
if alternate_password:
retry = AuthManager(user, auth_file=None, default_password=alternate_password)
else:
retry = self.auth
self.auth.default_user = user
for k, t in self.connections.items():
if isinstance(t, paramiko.Transport) and t.is_authenticated():
continue
if isinstance(t, paramiko.Transport) and t.is_active():
t.close()
self.console.message(str(k), 'RECONNECT')
conn = None
try:
conn = socket.create_connection((str(k), 22), timeout=float(self.defaults.get('socket.timeout', 2.0)))
except socket.gaierror as e:
if '.' not in k and e.args == (-2, 'Name or service not known'):
# Try extended domain searches
for suffix in self.defaults.get('domains', '').split():
fqdn = '%s.%s' % (k, suffix)
try:
conn = socket.create_connection((fqdn, 22), timeout=float(self.defaults.get('socket.timeout', 2.0)))
self.console.message('%s -> %s' % (k, fqdn), 'FQDN')
break
except socket.error:
pass
except Exception as e:
self.console.message('%s - %s' % (str(k), str(e)), 'EXCEPTION')
# For Reauth, do not pass sshconfig options since we're just trying to force a password authentication
# self.pending[self.dispatcher.submit(connection_worker, k, conn, retry, self.sshconfig.lookup(str(k)))] = k
self.pending[self.dispatcher.submit(connection_worker, k, conn, retry, {'identityfile': []})] = k
self.update_connections()
def get_ssh_config(self, label, connection_spec=None):
'''Lookup or create a dict of SSHConfig options for the given host'''
if isinstance(connection_spec, str):
host_spec = connection_spec
else:
host_spec = label
# Support specs in form of user@host:port
if '@' in host_spec:
supplied_user, host_spec = host_spec.split('@', 1)
else:
supplied_user = None
if ':' in host_spec:
# Attempt to disambiguate IPv6
host_spec, supplied_port = host_spec.rsplit(':', 1)
if ':' in host_spec:
if host_spec[0] == '[' and host_spec[-1] == ']':
host_spec = host_spec[1:-1]
else:
host_spec = host_spec + ':' + supplied_port
supplied_port = None
else:
supplied_port = None
config = self.sshconfig.lookup(host_spec)
# If spec included port or user, override the SSHConfig values
if supplied_port:
config['port'] = supplied_port
if supplied_user:
config['user'] = supplied_user
# if SSHConfig has no value for LogLevel, use the cluster setting
if 'loglevel' not in config:
config['loglevel'] = self.defaults['loglevel'].upper()
return config
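The `user@host:port` parsing in `get_ssh_config` above, including the bracketed-IPv6 disambiguation, can be isolated into a small helper (an illustrative sketch, not part of RadSSH):

```python
def parse_host_spec(spec):
    """Split 'user@host:port' into (user, host, port); user/port may be
    None. Bracketed IPv6 literals like [fe80::1]:2022 keep their colons;
    a bare IPv6 literal is treated as a host with no port."""
    user = port = None
    if '@' in spec:
        user, spec = spec.split('@', 1)
    if ':' in spec:
        head, tail = spec.rsplit(':', 1)
        if ':' in head:
            # More colons remain: only a bracketed form carries a port
            if head.startswith('[') and head.endswith(']'):
                spec, port = head[1:-1], tail
            # else: bare IPv6 literal, the colon is part of the address
        else:
            spec, port = head, tail
    return user, spec, port
```

For example, `parse_host_spec('root@db01:2022')` returns `('root', 'db01', '2022')`, while `parse_host_spec('fe80::1')` leaves the address intact.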
def tunnel_connections(self, hostlist, jumpbox=None):
'''Create a cluster of tunneled connections through a jumpbox'''
if jumpbox:
t = self.connections.get(jumpbox)
else:
t = list(self.connections.values())[0]
tunnel_list = []
if not t:
return None
for host in hostlist:
try:
s = t.open_channel('direct-tcpip', (host, 22), (host, 22))
tunnel_list.append((host, s))
except Exception as e:
self.console.q.put((('TUNNEL', True), 'Unable to tunnel to %s: %s' % (host, e)))
return Cluster(tunnel_list, self.auth)
def multiplex(self, mux_command='echo /mnt/gluster-brick*'):
'''Create a multiplex cluster based on the output of a command against the current cluster'''
self.console.quiet(True)
res = self.run_command(mux_command)
self.console.quiet(False)
mux_data = {}
mux_list = []
for host, vars in [(k, job.result.stdout) for k, job in res.items() if job.completed and job.result.return_code == 0]:
mux_data[host] = vars.split()
mux_list.append((host, self.connections[host]))
return Cluster(mux_list, self.auth, console=self.console, mux=mux_data, defaults=self.defaults)
def enable(self, enable_list=None):
'''Set active set of connections via list of fnmatch/IP patterns to limit run_command; pass in None to reset to enable all connections'''
self.disabled = set()
if enable_list is None:
self.console.q.put((('ENABLED', True), 'All %d hosts currently enabled' % len(self.connections)))
return
if isinstance(enable_list, str):
# Handle single value being passed instead of list
enable_list = [enable_list]
# Assemble an enabled list, then complement it at the end
enabled = set()
for pattern in enable_list:
direct_match = self.locate(pattern)
if direct_match:
enabled.add(direct_match)
continue
# Try using pattern as IP network or glob first
# if it doesn't look like either, then treat it as a name wildcard
pattern_match = set()
try:
ip_match = netaddr.IPNetwork(pattern)
except Exception:
try:
ip_match = netaddr.IPSet(netaddr.IPGlob(pattern))
except Exception:
ip_match = None
for host, t in self.connections.items():
if ip_match:
try:
if netaddr.IPAddress(t.getpeername()[0]) in ip_match:
pattern_match.add(host)
except Exception:
pass
else:
if fnmatch.fnmatch(str(host), pattern):
pattern_match.add(host)
if len(pattern_match) > 1:
self.console.q.put((('ENABLED', True), 'Pattern wildcard "%s" matched %d hosts' % (pattern, len(pattern_match))))
enabled.update(pattern_match)
# Take complement of enabled set
for host in self.connections:
if host not in enabled:
self.disabled.add(host)
self.console.q.put((('ENABLED', True), '%d hosts currently enabled' % len(enabled)))
def prep_command(self, cmd, target):
'''Preprocess command line for target host execution (%ip%, %host%, etc)'''
# Scan for all %var% constructs - we won't care to honor any quoting rules
vars = set(re.findall('%[a-zA-Z_]+%', cmd))
if not vars:
return cmd
t = self.connections[target]
# Handle predefined %%'s (host, ip, port, tunnel, mux) first
auto_vars = {
'%host%': str(target),
'%ip%': t.getpeername()[0] if isinstance(t, paramiko.Transport) else '0.0.0.0',
'%ssh_version%': t.remote_version if isinstance(t, paramiko.Transport) else 'No Connection',
'%uuid%': str(self.uuid)
}
if self.mux:
auto_vars['%mux%'] = self.mux.get(target, '')
if target in self.reverse_port:
port = self.reverse_port[target]
auto_vars['%port%'] = '%d' % port
auto_vars['%tunnel%'] = '127.0.0.1:%d' % port
for v in vars:
if v in auto_vars:
try:
cmd = cmd.replace(v, auto_vars[v])
except Exception as e:
self.console.q.put((('EXCEPTION', True),
'Substituting %s for %s: %s' % (v, target, str(e))))
return None
else:
if v not in self.user_vars:
x = input('Missing variable setting for %s\nEnter value : ' % v)
self.user_vars[v] = x
cmd = cmd.replace(v, self.user_vars[v])
return cmd
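The `%var%` templating used by `prep_command` above boils down to a regex scan plus dictionary substitution; a self-contained sketch (the interactive prompt for unknown variables is replaced by leaving the token in place):

```python
import re

# Same token shape prep_command scans for: %name% with letters/underscores
VAR_RE = re.compile(r'%[a-zA-Z_]+%')

def expand_vars(template, values):
    """Replace every %name% token found in template using `values`;
    unknown tokens are left untouched rather than prompting the user."""
    for token in set(VAR_RE.findall(template)):
        if token in values:
            template = template.replace(token, values[token])
    return template
```

For example, `expand_vars('ping -c1 %ip% # %host%', {'%ip%': '10.0.0.5', '%host%': 'db01'})` yields `'ping -c1 10.0.0.5 # db01'`.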
def run_command(self, template):
'''Execute a command line (template) string across all enabled host connections'''
result = {}
last_interrupt = 0
chunker = Chunker(self.chunk_size, self.chunk_delay)
for k in self:
if k in self.disabled:
continue
chunker.add(k)
total = len(chunker)
for chunk in chunker:
ordered_list = []
for k in chunk:
t = self.connections[k]
ordered_list.append(k)
# Patch up command line with host/mux specific data prior to queueing
cmd = self.prep_command(template, k)
if not cmd:
continue
# Now we have a legit command line to execute
if self.output_mode == 'stream':
self.pending[self.dispatcher.submit(exec_command, k, t, cmd, self.quota, self.console.q, self.defaults['character_encoding'])] = k
else:
self.pending[self.dispatcher.submit(exec_command, k, t, cmd, self.quota, None, self.defaults['character_encoding'])] = k
# Wait for background jobs to complete
while self.pending:
try:
self.console.status('Completed on %d/%d hosts' % (len(result), total))
for pid, summary in self.dispatcher.async_results():
host = self.pending.pop(pid)
result[host] = summary
if self.output_mode == 'ordered':
while ordered_list and ordered_list[0] in result:
host = ordered_list.pop(0)
job = result[host]
if job.result.stdout:
self.console.q.put(((host, False), job.result.stdout.decode(self.defaults['character_encoding'])))
elif self.ordered_placeholder == 'on':
self.console.q.put(((host, False), '[No Output]'))
if job.result.stderr:
self.console.q.put(((host, True), job.result.stderr.decode(self.defaults['character_encoding'])))
else:
ordered_list.remove(host)
self.console.status('Completed on %d/%d hosts' % (len(result), total))
except UnfinishedJobs:
pass
except KeyboardInterrupt:
self.console.status('<Ctrl-C>')
if time.time() - last_interrupt < 2.0:
user_abort.set()
# break
else:
last_interrupt = time.time()
self.console.status('Completed on %d/%d hosts' % (len(result), total))
self.console.q.put((('CONSOLE', True), '*** <Ctrl-C> ***'))
in_flight = sorted([str(k) for k in self.pending.values() if k not in result])
for host in in_flight:
self.console.replay_recent(host)
self.console.q.put((('CONSOLE', True), 'In-Flight commands running on %s' % str(in_flight)))
self.console.q.put((('CONSOLE', True), 'To kill: Press <Ctrl-C> again within 2 seconds'))
except Exception as e:
self.console.q.put((('EXCEPTION', True), '%s' % str(e)))
self.console.join()
self.console.status('Completed on %d/%d hosts' % (len(result), total))
self.console.status('Ready')
# join(True) here causes the last_lines buffer to be cleared
self.console.join(True)
user_abort.clear()
self.last_result = result
return result
def log_result(self, logdir=None, command_header=True, encoding='UTF-8'):
'''Save last_result content to a log directory - 1 file per host'''
if logdir:
for k, job in self.last_result.items():
v = job.result
if isinstance(v, CommandResult):
if self.log_out:
lines = v.stdout.strip().split(b'\n')
if lines:
with open(os.path.join(logdir, self.log_out), 'ab') as f:
f.write(('[%s] === "%s" %s [%s] ===\n' %
(str(k), v.command, v.status, v.return_code)).encode(encoding))
for line in lines:
f.write(("[{0}]".format(str(k)) + filter_tty_attrs(line).decode(encoding, 'replace') + "\n").encode(encoding))
with open(os.path.join(logdir, str(k) + '.log'), 'ab') as f:
if command_header:
f.write(('=== "%s" %s [%s] ===\n' %
(v.command, v.status, v.return_code)).encode(encoding))
f.write(v.stdout)
f.write(b'\n')
if v.stderr:
if self.log_err:
lines = v.stderr.strip().split(b'\n')
if lines:
with open(os.path.join(logdir, self.log_err), 'ab') as f:
f.write(('[%s] === "%s" %s [%s] ===\n' %
(str(k), v.command, v.status, v.return_code)).encode(encoding))
for line in lines:
f.write(("[{0}]".format(str(k)) + filter_tty_attrs(line).decode(encoding, 'replace') + "\n").encode(encoding))
with open(os.path.join(logdir, str(k) + '.stderr'), 'ab') as f:
f.write(v.stderr)
f.write(b'\n')
else:
if self.log_err:
lines = str(v).split('\n')
if lines:
with open(os.path.join(logdir, self.log_err), 'ab') as f:
for line in lines:
f.write(("[{0}]".format(str(k)) + line + "\n").encode(encoding))
with open(os.path.join(logdir, str(k) + '.log'), 'ab') as f:
f.write(('%s\n' % str(v)).encode(encoding))
def sftp(self, src, dst=None, attrs=None):
'''SFTP a file (put) to all nodes'''
for k in self:
t = self.connections[k]
if k in self.disabled:
continue
if not isinstance(t, paramiko.Transport) or not t.is_authenticated():
continue
self.pending[self.dispatcher.submit(sftp_thread, k, t, src, dst, attrs)] = k
total = len(self.pending)
result = {}
while self.pending:
try:
for pid, summary in self.dispatcher.async_results():
host = self.pending.pop(pid)
result[host] = summary
if not summary.completed:
self.console.message('%s - %s' % (str(host), repr(summary.result)), 'EXCEPTION')
self.console.status('Completed on %d/%d hosts' % (total - len(self.pending), total))
except UnfinishedJobs:
pass
except KeyboardInterrupt:
self.console.message('<Ctrl-C> SFTP Transfer ignored.')
continue
self.last_result = result
self.console.status('Ready')
return result
def status(self):
'''Return a combined list of connection status text messages'''
good = []
bad = []
for k in self:
t = self.connections[k]
if k in self.connect_timings:
connect_time = self.connect_timings[k]
else:
connect_time = -1
if isinstance(t, paramiko.Transport):
if not t.is_active():
bad.append((k, '(%7.3fs) Not connected' % connect_time))
elif not t.is_authenticated():
bad.append((k, '(%7.3fs) Connected to %s / not authenticated' % (connect_time, t.getpeername()[0])))
else:
if k in self.disabled:
good.append((k, '(%7.3fs) Authenticated as %s to %s (Disabled)' % (connect_time, t.get_username(), t.getpeername()[0])))
else:
good.append((k, '(%7.3fs) Authenticated as %s to %s' % (connect_time, t.get_username(), t.getpeername()[0])))
else:
bad.append((k, '(%8.3fs) %s' % (connect_time, str(t))))
return good + bad
def connection_summary(self):
'''Determine counts of various connection statuses'''
ready = disabled = failed_auth = failed_connect = dropped = 0
for k, t in self.connections.items():
if isinstance(t, paramiko.Transport):
if not t.is_active():
dropped += 1
elif not t.is_authenticated():
failed_auth += 1
else:
if k in self.disabled:
disabled += 1
else:
ready += 1
else:
failed_connect += 1
return (ready, disabled, failed_auth, failed_connect, dropped)
def locate(self, s):
'''Lookup cluster entry - keys may be netaddr.IPAddress, not string'''
# Trivial case, string to string match
if s in self.connections:
return s
# Loop and compare string conversion of key to s
for k in self.connections:
if str(k) == s:
return k
return None
def __iter__(self):
'''Get connection keys in (hybrid) sorted order'''
def hybrid_key(x):
return(str(type(x)), x)
result = sorted(self.connections.keys(), key=hybrid_key)
return iter(result)
def close_connections(self):
'''Disconnect from all remote hosts'''
for k in list(self.connections):
t = self.connections.pop(k)
self.dispatcher.submit(close_connection, t, k, self.defaults.get('force_tty.signoff', ''))
self.dispatcher.wait()
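The `prep_command` substitution above can be sketched in isolation. The following is a simplified, standalone version of the same `%var%` pattern; the names `substitute`, `auto_vars`, and `user_vars` are illustrative and not part of the original class:

```python
import re

def substitute(template, auto_vars, user_vars):
    """Replace %name% placeholders, preferring predefined values over user-supplied ones."""
    for var in set(re.findall('%[a-zA-Z_]+%', template)):
        value = auto_vars.get(var, user_vars.get(var))
        if value is not None:
            template = template.replace(var, value)
    return template

# e.g. substitute('ssh %host% -p %port%', {'%host%': 'web01', '%port%': '22'}, {})
# yields 'ssh web01 -p 22'
```

Unknown placeholders are left in place here, whereas the class above prompts the user for a value and caches it in `user_vars`.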
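The `hybrid_key` trick in `__iter__` exists because sorting mixed-type keys (plain strings alongside netaddr addresses) with `sorted()` raises `TypeError` in Python 3; comparing by type name first keeps unlike types from ever being compared directly. A standalone sketch, using the stdlib `ipaddress` module in place of netaddr:

```python
import ipaddress

def hybrid_key(x):
    # Compare by type name first so unlike types are never compared to each other.
    return (str(type(x)), x)

keys = ['web01', ipaddress.ip_address('10.0.0.2'), 'db01', ipaddress.ip_address('10.0.0.1')]
ordered = sorted(keys, key=hybrid_key)
# IPv4Address sorts before str by type name; each group is sorted internally:
# [IPv4Address('10.0.0.1'), IPv4Address('10.0.0.2'), 'db01', 'web01']
```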
|
python
|
<filename>doppel/__init__.py
__all__ = [
'PackageAPI',
'PackageCollection'
]
from doppel.PackageAPI import PackageAPI
from doppel.PackageCollection import PackageCollection
from doppel.DoppelTestError import DoppelTestError
from doppel.reporters import SimpleReporter
|
python
|
Apple says it plans to open up another 30 to 35 retail stores in the next year, bringing it right in line with its growth during 2012.
Apple's retail empire is still expanding.
The company today said it plans to open another 30 to 35 retail stores during its 2013 fiscal year, which wraps up next September. About three-quarters of those stores would be located outside the United States, the company said.
The mention came inside the company's annual report, which was filed with the U.S. Securities and Exchange Commission this afternoon. Apple added that it plans to spend about $850 million on the expansion, as well as on investments in its current retail infrastructure.
The plans fall in line with Apple's retail store rollout during 2012, which included the opening of 33 new stores, bringing Apple's full tally to 390 stores worldwide. Similar to the planned expansion, 28 of those 33 stores were opened up outside the U.S. The year prior, Apple opened up 40 stores.
The company's operations are under a closer watch given the departure of retail chief John Browett, who Apple this week said no longer works for the company. During Browett's tenure, reports surfaced that Apple was making cutbacks on staffing, as well as in-store features, prompting speculation that the company's retail efforts were not as healthy as they once were.
In its annual report, Apple said retail sales grew by $4.7 billion, or 33 percent, versus the year prior, mainly due to the iPhone 4S and iPhone 5, as well as its two latest iPad models. Altogether, Apple store sales made up 12 percent of Apple's total sales for 2012, down from 13 percent the year before.
|
english
|
<gh_stars>1-10
// tests go here; this will not be compiled when this package is used as a library
bluetooth.startMax6675Service(DigitalPin.P0);
|
typescript
|
import { inspect } from "util";
import {
CancellationToken,
CodeAction,
CodeActionContext,
CodeActionProvider,
Range,
Selection,
TextDocument,
WorkspaceEdit,
} from "vscode";
import { debug } from "./debug";
import Message from "./message";
export default class ArcCodeActionsProvider implements CodeActionProvider {
public provideCodeActions(
document: TextDocument,
range: Range | Selection,
context: CodeActionContext,
token: CancellationToken,
): CodeAction[] {
const messages = context.diagnostics.filter((m): m is Message => m instanceof Message);
const actions = [];
for (const m of messages) {
const action = this.createAction(m, document);
if (action) { actions.push(action); }
}
if (context.only) { return actions.filter((a) => a.kind === context.only); }
return actions;
}
public createAction(m: Message, document: TextDocument): CodeAction | undefined {
const edit = m.edit;
if (!edit) { return; }
const original = document.getText(m.range);
if (original !== m.arcMessage.original) {
debug(`Text changed:\nA: ${inspect(original)}\nB: ${inspect(m.arcMessage.original)}`);
return;
}
const action = new CodeAction(m.message, m.actionKind);
action.isPreferred = true;
action.diagnostics = [m];
action.edit = new WorkspaceEdit();
action.edit.replace(document.uri, edit.range, edit.newText);
return action;
}
}
|
typescript
|
Variety is in India’s blood. From language to customs, culture to religion, the high mountains to the never-ending rivers, India cannot be reduced to a single identity. That variety now extends to sports as well. Many think of India as a cricket-mad country, but those who know its past and present know that the country is surely more than that.
Cricket is hugely followed in India and the country has achieved massive success in it, but somewhere down the line those achievements overshadow other sports. Though only now being highlighted, India has long produced world-class players in other fields like hockey, chess, badminton, tennis, squash, boxing, billiards, snooker and kabaddi, to name a few.
The achievements in other sports should be duly applauded to ensure a capable future generation with expertise in a variety of disciplines. Now let's set our eyes upon a game for which India has produced many champions but which remains next to forgotten in the memory of the countrymen – cue sports.
Billiards and snooker are not interchangeable terms as they both are two different games which look alike. Both are table games with a different set of guiding rules and surely need different approaches.
Billiards, on one hand, is a more three-dimensional game involving a far wider range of shots than snooker. It's a game of patience and out-of-the-box thinking. Snooker, on the other hand, has a youth appeal and is the trendier game nowadays. Two players engaged at the table have a full go at each other, aiming to score more points; the player with the higher total wins. So the influence on the opponent is surely greater in snooker.
Cue sports have been prevalent in India for a long time now. They lack mass appeal, which explains why knowledge of and concern for them among the countrymen is so poor. Although the appeal is limited, India has still managed to produce not only world-class players but champions. Players like Geet Sethi, Ashok Sandilya, Michael Ferreira, Wilson Jones and Pankaj Advani have brought laurels to the country in countless events, and were and still are among the greatest of the game.
The golden chapter in India's billiards and snooker history can be said to have started with the advent of Wilson Jones. A recipient of prestigious honours like the Padma Shri and the Dronacharya Award, Jones won the World Amateur Billiards Championship in 1958, giving India its first world champion in any sport. He won the National Billiards Championship of India 12 times.
The spark he ignited turned into a flame, and soon India had a new billiards hero in Michael Ferreira. Also a recipient of high civil honours, Ferreira won his first World Amateur Billiards Championship and Open Billiards Championship in 1977.
Idolising the two champions, many more players came to the stage, but it was Geet Sethi and Ashok Sandilya who stole the show. They made many records their own. Sethi won the IBSF World Billiards Championship in 1985, while Sandilya won the Asian Games gold in 1998 in Bangkok. Sethi registered many achievements in snooker too. Both are legends, and the Government felicitated them with prestigious civil honours.
All the above players are idols to many but the player that stands out even in the line of legends, the player who has the world of billiards and snooker in his hand and is surely the greatest that India has ever produced is Pankaj Advani.
Pankaj Advani, a Bangalore-born snooker and billiards player, is a world champion in both games, an amazing feat. What's more amazing is that he has done it three times, a truly monumental achievement. An icon to millions, Advani has received multiple awards for his achievements, including the Rajiv Gandhi Khel Ratna, the prestigious Padma Shri and the Arjuna Award. This great athlete is still on his dream journey, with the hopes of millions upon him.
The future of billiards and snooker seems to be on track. India is one of the strong forces in the world of billiards, and in snooker it is doing pretty well too. To build a brighter future, measures can be taken such as the Billiards and Snooker Federation of India (BSFI) selecting a few individual players for exposure trips. This will give youngsters opportunities to face the international circuit and develop their game accordingly.
The motive is to take stock and inspire an uprising in billiards and snooker in the country. The game has a long list of achievements but has fallen short of capturing the attention it deserves. It is surely not the game's fault. What we need is a wider outlook. Restricting media and public attention to cricket, chess and badminton has acted against the athletes who have brought laurels to the country in other sports, and billiards and snooker are among them.
India has the resources to excel in any table-game sport it wants; it just needs the right medicine. The progress in providing facilities to cue sports players across the country will surely reap rewards, but this is a long-term process. The BSFI should be praised for identifying the need and adopting a more aggressive approach to popularising the game.
India is moving a step ahead every day in the field of this table-game. Let's hope that the upcoming Indian players are able to weave a beautiful future, a future we all dream of.
|
english
|
<gh_stars>0
{"topic_54": {"num_tags": 3, "name": "topic_54", "full_name": "topic_54", "num_included_tokens": 3}, "topic_53": {"num_tags": 9, "name": "topic_53", "full_name": "topic_53", "num_included_tokens": 9}, "topic_34": {"num_tags": 1, "name": "topic_34", "full_name": "topic_34", "num_included_tokens": 1}, "topic_38": {"num_tags": 2, "name": "topic_38", "full_name": "topic_38", "num_included_tokens": 2}, "topic_22": {"num_tags": 10, "name": "topic_22", "full_name": "topic_22", "num_included_tokens": 10}, "topic_49": {"num_tags": 2, "name": "topic_49", "full_name": "topic_49", "num_included_tokens": 2}, "topic_23": {"num_tags": 3, "name": "topic_23", "full_name": "topic_23", "num_included_tokens": 3}, "topic_90": {"num_tags": 29, "name": "topic_90", "full_name": "topic_90", "num_included_tokens": 29}, "topic_78": {"num_tags": 9, "name": "topic_78", "full_name": "topic_78", "num_included_tokens": 9}, "topic_95": {"num_tags": 1, "name": "topic_95", "full_name": "topic_95", "num_included_tokens": 1}, "topic_63": {"num_tags": 1, "name": "topic_63", "full_name": "topic_63", "num_included_tokens": 1}, "topic_74": {"num_tags": 18, "name": "topic_74", "full_name": "topic_74", "num_included_tokens": 18}, "topic_98": {"num_tags": 12, "name": "topic_98", "full_name": "topic_98", "num_included_tokens": 12}}
|
json
|
<gh_stars>1-10
# Query a CloudFormation stack.
# Returns:
# - Build Parameters - the parameters sent to the TemplateBuilder when generating the template
# - Stack Parameters - the parameters sent to CF during stack create or update time.
import argparse
import arguments
import logconfig
import session
from scaffold.cf import stack
def query_stack(stack_name, profile):
# Use the passed-in profile rather than reaching for the module-level args
boto3_session = session.new(profile, args.region, args.role)
return [
stack.summary(boto3_session, stack_name).build_parameters(),
stack.parameters(boto3_session, stack_name)
]
default_profile = 'default'
def get_args():
ap = argparse.ArgumentParser(description='Query a CloudFormation stack for build and template parameters',
add_help=False)
ap.add_argument('stack_name',
help='Name of the stack to query')
arguments.add_security_control_group(ap)
return ap.parse_args()
if __name__ == '__main__':
logconfig.config()
args = get_args()
build_parms, stack_parms = query_stack(args.stack_name, args.profile)
# TODO: move these to logging messages
print('Build Parameters:')
if len(build_parms) == 0:
print(' (none)')
else:
for name, value in build_parms.items():
print(' {} : {}'.format(name, value))
print('Stack Parameters:')
if len(stack_parms) == 0:
print(' (none)')
else:
for name, value in stack_parms.items():
print(' {} : {}'.format(name, value))
|
python
|
Lather up and massage over wet face avoiding the eye area. Rinse off with clean water. Use 1-2 times a day. For best results use NIVEA MEN Extra Whitening Pore Minimizer Moisturiser after cleansing.
Aqua, Potassium Myristate, Propylene Glycol, Potassium Palmitate, Potassium Stearate, Glycerin, Potassium Laurate, PEG-150, PEG-8, Glyceryl Stearate, 4-Butylresorcinol, Glycyrrhiza Glabra Root Extract, Glyceryl Glucoside, Ginkgo Biloba Leaf Extract, Panax Ginseng Root Extract, Sodium Ascorbyl Phosphate, Panthenol, Tocopheryl Acetate, Magnesium Chloride, BHT, Cera Alba, Sodium Methyl Cocoyl Taurate, Trisodium EDTA, Potassium Arachidate, Potassium Oleate, Parfum.
While M Plus strives to ensure the accuracy of its product images and information, some manufacturing changes to packaging and/or ingredients may be pending updates on our site. Although items may occasionally ship with alternate packaging, product originality is always guaranteed. We recommend that you read labels, warnings, and directions of all products before use and not rely solely on the information provided by M Plus.
|Order Total (RM)
|Delivery Fee (RM)
|
english
|
import {
DIFF_EXPR_COMPARISON_GROUP_SET,
} from '../../actionTypes/differentialExpression';
const setComparisonGroup = (group) => async (dispatch) => {
const {
cellSet, compareWith, basis, type,
} = group;
dispatch({
type: DIFF_EXPR_COMPARISON_GROUP_SET,
payload: {
type,
cellSet,
compareWith,
basis,
},
});
};
export default setComparisonGroup;
|
javascript
|
<filename>en/castorbean.json
{"word":"castor bean","definition":"The bean or seed of the castor-oil plant (Ricinus communis, or Palma Christi.)"}
|
json
|
The Kottayam vigilance court has ordered a quick verification against minister Thomas Chandy following allegations of lake encroachment.
The order was made while considering a petition filed by advocate Subhash.
He had alleged that the minister used MP fund for constructing the approach road of Lake Palace Resort, owned by the minister, without seeking the permission of the local committee. The petitioner also alleged that this resulted in a loss of ₹23 lakh to the state exchequer.
Earlier, Alappuzha District Collector T V Anupama, had in her inquiry report submitted that the resort owned by the Transport Minister had flouted rules. It was stated that he filled paddy fields for the construction of parking space and approach road to his resort and also encroached backwaters.
He was charged with violation of Kerala Conservation of Paddy and Wetland Act. The minister has been facing attack from both the Congress and the BJP ever since the allegations surfaced.
Opposition leader Ramesh Chennithala accused the Chief Minister of providing 'out of the way' support to Chandy and called Pinarayi a co-accused.
Chennithala said the Chief Minister has become a co-accused by opting to shield the minister. The Opposition leader also alleged that Pinarayi Vijayan tried to delay the inquiry against Chandy under the pretext of seeking time for initiating legal action against him.
|
english
|
<gh_stars>0
package com.ming.blog.mq.event.executor;
import com.alibaba.fastjson.JSON;
import com.ming.blog.mq.event.Event;
import com.ming.blog.mq.event.EventChannel;
import com.ming.blog.mq.event.TraceId;
import com.ming.blog.mq.event.base.FunctionMap;
import com.ming.blog.mq.event.base.FunctionNameMap;
import com.ming.blog.mq.event.function.BaseFunction;
import lombok.extern.slf4j.Slf4j;
import org.slf4j.MDC;
import org.springframework.scheduling.annotation.Async;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
import org.springframework.stereotype.Component;
import javax.annotation.Resource;
import java.util.List;
/**
* @author jiangzaiming
*/
@Slf4j
@Component
public class EventExecutor {
@Resource
ThreadPoolTaskExecutor taskExecutor;
@Resource
private EventChannel eventChannel;
@Resource
private FunctionMap functionMap;
@Resource
private FunctionNameMap functionNameMap;
private List<BaseFunction> getProcessorFunction(int type) {
return functionMap.getFunction(type);
}
@Async(value = "taskExecutor")
public void executor(Event event) {
// taskExecutor.execute(() -> {
MDC.put("traceId", TraceId.id() + "-" + event.getEventId());
List<BaseFunction> baseFunctionList = getProcessorFunction(event.getEventType());
// A matching processor exists for this event type
if (baseFunctionList != null && baseFunctionList.size() > 0) {
for (BaseFunction baseFunction : baseFunctionList) {
functionInvoke(baseFunction, event);
}
} else {
log.warn(String.format("no function to process this event, event type: %s , event message: %s",
event.getEventType(), JSON.toJSONString(event.getEventParams())));
}
// });
}
/**
* @param baseFunction the function to invoke
* @param event the event to process
* @param <T> the event parameter type
*/
private <T> void functionInvoke(BaseFunction<T> baseFunction, Event<T> event) {
try {
if (!checkEventTimeout(baseFunction, event)) {
confirm(baseFunction.getName(), event.getEventId());
log.info("event time out, event : {}", JSON.toJSONString(event));
return;
}
Event<T> ev = Event.from(event);
ev.setEventId(event.getEventId());
T param = JSON.parseObject(JSON.toJSONString(event.getEventParams()), baseFunction.getType());
ev.setEventParams(param);
try {
eventChannel.processing(baseFunction.getName(), ev);
} catch (Exception e) {
log.error(String.format("processing event error, event type: %s , event message: %s",
event.getEventType(), JSON.toJSONString(event.getEventParams())), e);
}
boolean success = baseFunction.execute(param);
if (success) {
confirm(baseFunction.getName(), event.getEventId());
}
} catch (Exception e) {
log.error(String.format("process event error, event type: %s , event message: %s",
event.getEventType(), JSON.toJSONString(event.getEventParams())), e);
}
}
private <T> boolean checkEventTimeout(BaseFunction<T> baseFunction, Event<T> event) {
// If the event has timed out, return false so it is not processed
if (baseFunction.getTimeout() == 0L) {
// No timeout configured for this function
return true;
} else {
return System.currentTimeMillis() - event.getTs() <= baseFunction.getTimeout();
}
}
public void confirm(String functionName, String eventId) {
try {
eventChannel.confirm(functionName, eventId);
} catch (Exception e) {
log.error(String.format("confirm event error, event id : %s, processor name : %s",
eventId, functionName), e);
}
}
/**
* Process an event with the named function
*
* @param functionName name of the processor function
* @param event the event
*/
public void executorFunction(String functionName, Event event) {
if (event != null) {
MDC.put("traceId", TraceId.id() + "-" + event.getEventId());
BaseFunction baseFunction = getFunctionByName(functionName);
if (baseFunction != null) {
functionInvoke(baseFunction, event);
} else {
log.error("baseFunction not exist , name : {}, event : {}",
functionName, JSON.toJSONString(event));
}
}
}
private BaseFunction getFunctionByName(String functionName) {
return functionNameMap.getFunction(functionName);
}
}
|
java
|
[{"postal_code":"72016","place_name":"Real <NAME>","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"Nuevo Paraíso","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"Rancho Guadalupe","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"<NAME>","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"Emperatriz","place_type":"Conjunto habitacional","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"<NAME>","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"<NAME>","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"INFONAVIT Villa Frontera","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"<NAME>","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"Guadalupe del Conde","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"Villas San Gregorio","place_type":"Fraccionamiento","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"Solidaridad Nacional","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"San José el Conde","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de 
Zaragoza"},{"postal_code":"72016","place_name":"<NAME>","place_type":"Unidad habitacional","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"},{"postal_code":"72016","place_name":"<NAME>","place_type":"Colonia","county":"Puebla","state":"Puebla","city":"Heroica Puebla de Zaragoza"}]
|
json
|
- Updated: Fri, 25 Oct 2019 03:38 PM (IST)
New Delhi | Jagran News Desk: Having retained Maharashtra for a second straight term, the BJP is likely to form the government in Haryana too, where it emerged as the single largest party in a fractured mandate.
Seven Independent MLAs have reportedly extended their support to the saffron party. They include Haryana Lokhit Party's Gopal Kanda and Independent candidate Ranjeet Singh. Kanda, the winning candidate from Sirsa, said that all Independent candidates have extended their support to the BJP. A third, Somveer Singh, told PTI that he will support the BJP government.
With 40 seats in its kitty, the BJP needs six more to touch the halfway mark needed to form the next government. The Congress won 31 seats, the Jannayak Janta Party 10, the Indian National Lok Dal and Haryana Lokhit Party one each. The Aam Aadmi Party, which contested 46 seats, was decimated.
Chief Minister Manohar Lal Khattar on Thursday sought Governor Satyadev Narayan Arya’s time to stake claim to form the government. According to reports, Khattar will be taking oath as the chief minister for a second straight term on Friday evening.
Meanwhile, some media reports on Thursday claimed that Dushyant Chautala is likely to extend his support to the BJP. At the same time, some other reports suggested that the Congress had also approached the JJP chief to form an alliance while offering him the chief ministerial post.
As counting trends indicated that no party would win enough seats in the 90-member House to form the government, senior Congress leader and former chief minister Bhupinder Singh Hooda sought support for his party.
"This mandate is against the BJP. The JJP, INLD and others, including the independents, should join hands with the Congress to keep the BJP at bay," he told reporters in Rohtak.
Hooda alleged the administration was putting pressure on the independents at the behest of the BJP and not allowing them to move freely.
Commenting on the early trends, JJP leader Dushyant Chautala said, "This shows there was huge anti-incumbency against the (Manohar Lal) Khattar government." But he remained non-committal on whom he would support.
"It is too early to say anything. We will first summon a meeting of our MLAs, decide who would be our leader in the House and then take it further," he said.
However, he said the people of Haryana want “change”.
State Congress chief Kumari Selja said the people of Haryana have rejected the BJP and are ready to embrace “a new dawn of justice".
In the 2014 assembly elections, the BJP formed the government on its own, winning 47 seats. The Congress had then won 17 seats, INLD 19, Shiromani Akali Dal one, Bahujan Samaj Party one and independents five.
The BJP strength in the House later rose to 48 after it won a bypoll.
Chief Minister Manohar Lal Khattar retained his Karnal seat by a margin of 45,188 votes against his Congress rival Tarlochan Singh. But, apart from the eight ministers, state BJP chief Subhash Barala (Tohana) lost.
JJP leader Dushyant Chautala won by 47,452 votes over the sitting BJP MLA Prem Lata in Uchana Kalan. Prem Lata is the wife of former Union Minister Birender Singh, and has defeated him earlier.
Among the sportspersons fielded by the BJP, only former Indian hockey captain Sandeep Singh won, from Pehowa. International wrestlers Babita Phogat (Dadri) and Yogeshwar Dutt (Baroda) lost.
TikTok star Sonali Phogat lost to senior Congress leader Kuldeep Bishnoi in Adampur.
Senior cabinet minister Anil Vij won from Ambala Cantonment, a seat he has represented five times, defeating independent candidate Chitra Sarwara by 20,165 votes.
The BJP failed to break the Congress stronghold — the Deswali or Jat dominated belt of the state comprising Rohtak, Jhajjar and Sonipat districts. It picked up just one seat there, Rai in Sonipat. The Congress won 11 seats in this belt.
The BJP held on to its Jind seat and did well in some pockets of southern Haryana and in Faridabad district.
Senior Congress leader Randeep Singh Surjewala lost to his BJP rival Leela Ram Gurjar by just 1,246 votes in Kaithal and Assembly Speaker Kanwar Pal defeated Akram Khan of the Congress by a margin of 16,373 votes in Jagadhri.
The Congress winners include Geeta Bhukkal, Raghubir Singh Kadian and Capt Ajay Yadav's son Chiranjeev Rao.
Congress candidate and former BCCI chief Ranbir Singh Mahendra lost in Badhra constituency to JJP's Naina Chautala.
Raj Kumar Saini, chief of Loktanter Suraksha Party and former BJP MP from Kurukshetra, was defeated by the Congress candidate in Gohana.
Most exit polls had predicted a comfortable victory for the BJP, which came to power for the first time on its own in Haryana in 2014.
(With PTI inputs)
|
english
|
{
"name": "slides-kottans-chernivtsi-2019-react-continued-1",
"private": true,
"version": "0.0.0",
"description": "Slides for 1st part of my talk about React for Kottans Chernivtsi Frontend Course 2019",
"main": "index.html",
"scripts": {
"start": "http-server"
},
"author": "<NAME> <<EMAIL>> (http://denysdovhan.com)",
"license": "MIT",
"dependencies": {},
"devDependencies": {
"http-server": "^0.10.0"
}
}
|
json
|
'use strict'
var camel = require('camelcase')
var extend = Object.assign // previously: require('object-assign')
// normalize properties object
// detect on-
// detect data-
// detect jsx-
// split reserved
module.exports = function parseProps(srcProps) {
var props = extend({}, srcProps)
var result = { key: props.key != null ? props.key : props.id }
result.on = pickSet(props, 'on')
result.jsx = pickSet(props, 'jsx')
result.data = pickSet(props, 'data')
result.attributes = props
return result
}
// on-something, onsomething, onSomething → on: {something}
function pickSet (props, prefix) {
var l = prefix.length
var set
// on={{something}}
if (props[prefix]) {
set = props[prefix]
delete props[prefix]
return set
}
set = {}
for (var name in props) {
    if (name.substr(0, l) === prefix) {
      // on-something → something
      if (name[l] === '-') {
        set[camel(name.substr(l + 1))] = props[name]
      }
      // onSomething, onsomething → something
      else {
        set[camel(name.substr(l))] = props[name]
      }
delete props[name]
}
}
return set
}
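To illustrate the intended splitting behaviour, here is a minimal, dependency-free sketch. The inline `camel` helper is a simplified stand-in for the `camelcase` package, and `pickSet` mirrors the prefix-collecting loop above (without the whole-object `on={{…}}` shortcut):

```javascript
// Simplified stand-in for the camelcase package:
// "foo-bar" -> "fooBar", "FooBar" -> "fooBar".
function camel(name) {
  return name
    .replace(/-(\w)/g, (_, c) => c.toUpperCase())
    .replace(/^./, (c) => c.toLowerCase());
}

// Collect every prop starting with `prefix` into one object,
// stripping the prefix (and an optional dash) from each key.
function pickSet(props, prefix) {
  const l = prefix.length;
  const set = {};
  for (const name of Object.keys(props)) {
    if (name.slice(0, l) !== prefix) continue;
    const rest = name[l] === '-' ? name.slice(l + 1) : name.slice(l);
    set[camel(rest)] = props[name];
    delete props[name];
  }
  return set;
}

const props = { 'on-click': 1, onHover: 2, id: 'x' };
const on = pickSet(props, 'on');
console.log(on);    // { click: 1, hover: 2 }
console.log(props); // { id: 'x' }
```

Both `on-click` and `onHover` land in the same `on` set, while unprefixed props such as `id` are left behind for `result.attributes`.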
|
javascript
|
{"word":"humpbacked salmon","definition":"A small salmon (Oncorhynchus gorbuscha) which ascends the rivers of the Pacific coast from California to Alaska, and also on the Asiatic side. In the breeding season the male has a large dorsal hump and distorted jaws."}
|
json
|
{
"name": "<NAME>",
"bio": "Engineer @ The DataWorks; writes React, Angular, Golang; Google GDE; Microsoft MVP",
"photoURL": "https://i0.wp.com/unravelweb.dev/wp-content/uploads/2021/03/nishugoel-e1616742561838.jpg",
"mediumURL": "https://medium.com/@nishugoel",
"gitHubUrl": "https://github.com/nishugoel"
}
|
json
|
{
"ssid": "wifissid",
"password": "<PASSWORD>",
"slack": "https://<KEY>",
"mqtt": {
"broker": "your.broker.address",
"port": "1883",
"device_id": "deviceID",
"topic": "your/topic"
},
"interval_sec": 30
}
|
json
|
New Delhi | Researchers at Indian Institute of Technology (IIT), Mandi have identified microbial pairs that can effectively convert cellulose, a major component present in agriculture residue and paper waste, into useful chemicals, biofuels, and carbon suitable for several industrial applications.
The research has been published in the journal "Bioresource Technology Reports". The method has been patented and further scale-up of the bioprocess is ongoing.
According to officials, plant dry matter, also known as lignocellulose, is one of the most abundant renewable materials on Earth. Lignocellulosic waste from agriculture, forests, and industries can be converted into valuable chemicals such as bioethanol, biodiesel, lactic acid, and fatty acids using a process called bioprocessing. Bioprocessing, however, involves multiple steps and can release undesirable chemicals, requiring multiple washing and separation steps, which increases costs.
Scientists are exploring an innovative method called consolidated bioprocessing (CBP) to convert lignocellulosic biomass into useful chemicals.
"This method involves combining saccharification (the conversion of the cellulose into simple sugars) and fermentation (the conversion of simple sugars into alcohol) into one step. One way to achieve this is by using a synthetic microbial consortium (SynCONS)," said Shyam Kumar Masakapalli, Associate Professor at IIT, Mandi.
"SynCONS are a combination of different microorganisms; in this case, two types of microbes are selected, one brings about saccharification and the other fermentation. A combination of microbes that is stable at high temperatures (thermophilic consortia) is particularly useful because fermentation is a heat-releasing process," he added.
The scientists studied two SynCONS systems for cellulose processing, which was then integrated with pyrolysis, a method that decomposes organic materials by heating them above 500 degrees Celsius in the absence of oxygen.
|
english
|
{"date":20200622,"state":"SC","positive":25701,"probableCases":null,"negative":278179,"pending":null,"totalTestResultsSource":"posNeg","totalTestResults":303880,"hospitalizedCurrently":731,"hospitalizedCumulative":2294,"inIcuCurrently":null,"inIcuCumulative":null,"onVentilatorCurrently":null,"onVentilatorCumulative":null,"recovered":10790,"lastUpdateEt":"6/21/2020 23:59","dateModified":"2020-06-21T23:59:00Z","checkTimeEt":"06/21 19:59","death":659,"hospitalized":2294,"hospitalizedDischarged":null,"dateChecked":"2020-06-21T23:59:00Z","totalTestsViral":311953,"positiveTestsViral":33774,"negativeTestsViral":278179,"positiveCasesViral":25666,"deathConfirmed":659,"deathProbable":0,"totalTestEncountersViral":null,"totalTestsPeopleViral":null,"totalTestsAntibody":33906,"positiveTestsAntibody":1630,"negativeTestsAntibody":32276,"totalTestsPeopleAntibody":null,"positiveTestsPeopleAntibody":null,"negativeTestsPeopleAntibody":null,"totalTestsPeopleAntigen":null,"positiveTestsPeopleAntigen":null,"totalTestsAntigen":null,"positiveTestsAntigen":null,"fips":"45","positiveIncrease":1008,"negativeIncrease":205,"total":303880,"totalTestResultsIncrease":1213,"posNeg":303880,"dataQualityGrade":null,"deathIncrease":6,"hospitalizedIncrease":0,"hash":"be4aa022473a653d7b036950bbc13d7bba33328d","commercialScore":0,"negativeRegularScore":0,"negativeScore":0,"positiveScore":0,"score":0,"grade":""}
|
json
|
{"shootingid":"6","shooting":{"id":6,"title":"Palmenshooting","description":"Crazy Palmenshooting","photos":[{"name":"R0000070.JPG","thumbs":true,"trash":false,"rating":1,"comments":[{"author":"<EMAIL>","text":"erster"},{"author":"<EMAIL>","text":"zweiter"}]},{"name":"R0000071.JPG","thumbs":false,"trash":true,"rating":0,"comments":[{"author":"<EMAIL>","text":"nur ein comment"}]},{"name":"R0000072.JPG","thumbs":true,"trash":false,"rating":2,"comments":[]},{"name":"R0000073.JPG","thumbs":false,"trash":true,"rating":3,"comments":[]},{"name":"R0000074.JPG","thumbs":true,"trash":false,"rating":3,"comments":[]},{"name":"R0000076.JPG","thumbs":true,"trash":false,"rating":2,"comments":[]},{"name":"R0000077.JPG","thumbs":true,"trash":false,"rating":2,"comments":[]},{"name":"R0000078.JPG","thumbs":true,"trash":false,"rating":2,"comments":[]},{"name":"R0000079.JPG","thumbs":true,"trash":false,"rating":2,"comments":[]},{"name":"R0000080.JPG","thumbs":true,"trash":false,"rating":2,"comments":[]},{"name":"R0000081.JPG","thumbs":true,"trash":false,"rating":2,"comments":[]}]}}
|
json
|
The former Sri Lanka captain, Angelo Mathews is all set to perform the role of an all-rounder in the upcoming ODI series against India, as he has been cleared to bowl. The news was confirmed by the Chairman of Selectors, Sanath Jayasuriya himself.
However, he also clarified that they are not expecting Mathews to bowl his full quota of 10 overs:
“It’s a big relief that Angelo will be able to bowl for us again. We have to manage him carefully and he is at the moment not ready to bowl full quota of ten overs, but he will certainly be able to bowl five to six overs. That’s a big plus point for us as it helps us to balance the side.” Jayasuriya told Cricbuzz.
(Read Here: R Ashwin To Play County Cricket)
It has come as a big relief for the Sri Lankan team, as there was a time during the Test series when they were in need of a second pacer but Mathews was not an option, despite being on the field. It happened in the second Test when Nuwan Pradeep walked off the field due to a hamstring injury and, incidentally, he was the only main pacer in the line-up.
Had Mathews been cleared to bowl at that time, the Sri Lankan team could have hoped for a better performance.
In the upcoming limited-overs series, experienced pacer Lasith Malinga will himself lead the pace line-up. Vishwa Fernando will assist him and Mathews, of course, will be the key due to the experience he possesses.
The limited-overs series between India and Sri Lanka starts on August 20, with the first ODI to be played at Rangiri Dambulla International Stadium in Dambulla. Four ODIs and one T20I will follow, which will also mark an end to India's tour of Sri Lanka. After a period of two months, Sri Lanka will travel to India for a full-fledged series against the hosts. It is to be noted that India will again tour Sri Lanka for the Independence Cup in the first quarter of 2018 and, again, Sri Lanka will come back to India to participate in the Asia Cup.
(Read Here: Ab de Villiers Might Play BBL)
(Read Here: What Is ‘Yo-Yo’ Test)
|
english
|
import moment from 'moment';
// Note: duration.format() used below comes from the moment-duration-format
// plugin, which must be imported once for this component to work.
import 'moment-duration-format';
import React, { useState } from 'react';
import { useInterval } from 'react-use';
import { Icon } from 'semantic-ui-react';
import styled from 'styled-components';
const FlashingLabel = styled.label`
float: right;
font-size: 0.8em;
opacity: 1;
transition: transform 500ms ease-in-out !important;
margin-right: 0 !important;
margin-top: 15px !important;
transform: scale(1);
`;
interface ITimeDistanceProps {
date: string
endDate: string
}
const TimeDistance = ({ date, endDate }: ITimeDistanceProps) => {
const getTimeData = () => {
const startDiff = moment(date).diff(moment());
const duration = moment.duration(startDiff);
const endDiff = moment(endDate).diff(moment());
const remainingDuration = moment.duration(endDiff);
return {
distance: duration.humanize(true),
exact: `in ${duration.format('hh:mm:ss')}`,
hours: duration.asHours(),
minutes: duration.asMinutes(),
past: startDiff > endDiff,
remainingDistance: remainingDuration.humanize(true),
remainingExact: `ends in ${remainingDuration.format('hh:mm:ss')}`,
remainingHours: remainingDuration.asHours(),
remainingMinutes: remainingDuration.asMinutes(),
remainingSeconds: remainingDuration.asSeconds(),
seconds: duration.asSeconds()
};
};
const [now, setNow] = useState(getTimeData());
const [scale, setScale] = useState(1);
useInterval(
() => {
setNow(getTimeData());
},
now.hours > 0 || now.remainingHours > 0 ? 1000 : null
);
useInterval(
() => {
setScale((scale === 1) ? 1.1 : 1);
},
now.minutes > 0 && now.minutes <= 5 ? 500 : null
);
return (
((now.hours <= 8 && now.seconds > 0) || (now.remainingSeconds > 0 && now.seconds < 0)) ?
<FlashingLabel
className={`ui horizontal label tiny${(now.remainingMinutes > 0 && now.minutes < 0) ? ' green' : (now.minutes <= 5 ? ((now.seconds < 59) ? ' red' : ' blue') : '')}`}
style={{
fontStyle: ((now.remainingMinutes > 0 && now.minutes < 0) ? 'italic' : 'normal'),
fontWeight: ((now.minutes <= 5 && now.minutes >= 0) ? 'bold' : 'normal'),
transform: `scale(${((now.minutes <= 5 && now.minutes >= 0) ? scale : 1)})`
}}
>
<Icon
loading={now.minutes < 0}
name={now.minutes > 0 ? 'clock outline' : 'circle notch'}
/>
{(now.remainingMinutes > 0 && now.minutes < 0) ? now.remainingExact : ((now.minutes < 0.99) ? now.exact : now.distance)}
</FlashingLabel> :
null
);
};
export default TimeDistance;
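The `'hh:mm:ss'` pattern above relies on the moment-duration-format plugin. For reference, the same formatting can be done by hand; this is a dependency-free sketch, not part of the component:

```typescript
// Format a millisecond duration as hh:mm:ss, equivalent to the
// 'hh:mm:ss' pattern used with moment-duration-format above.
function formatDuration(ms: number): string {
  const total = Math.max(0, Math.floor(ms / 1000)); // clamp negatives to 0
  const h = Math.floor(total / 3600);
  const m = Math.floor((total % 3600) / 60);
  const s = total % 60;
  const pad = (n: number) => String(n).padStart(2, '0');
  return `${pad(h)}:${pad(m)}:${pad(s)}`;
}

console.log(formatDuration(3723000)); // "01:02:03"
```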
|
typescript
|
const withImages = require('next-images');
const LodashModuleReplacementPlugin = require('lodash-webpack-plugin');
const withBundleAnalyzer = require('@next/bundle-analyzer');
const withPlugins = require('next-compose-plugins');
require('dotenv').config();
module.exports = withPlugins(
[
withBundleAnalyzer({
enabled: process.env.ANALYZE === 'true',
}),
withImages,
],
{
webpack(config, options) {
config.plugins.push(new LodashModuleReplacementPlugin({ paths: true }));
return config;
},
env: {
GITHUB_TOKEN: process.env.GITHUB_TOKEN,
},
},
);
|
javascript
|
{
"testRunner": "jest",
"reporters": [
"json",
"html",
"progress",
"clear-text"
],
"tempDirName": "stryker-tmp",
"timeoutMS": 60000,
"mutate": [
"src/{Alert,Badge,Breadcrumb}.js"
],
"concurrency": 2,
"jest": {
"projectType": "create-react-app"
}
}
|
json
|
{"data":{"sitesYaml":{"id":"c6c45746-9a57-5613-962f-1ebe86fd0b6b","title":"Healthcare Logic","main_url":"https://www.healthcarelogic.com/","featured":false,"categories":["Marketing","Healthcare","Technology"],"built_by":"Thrive","built_by_url":"https://thriveweb.com.au/","source_url":null,"description":"Revolutionary technology that empowers clinical and managerial leaders to collaborate with clarity.","childScreenshot":{"screenshotFile":{"childImageSharp":{"sizes":{"base64":"data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAABQAAAAPCAYAAADkmO9VAAAACXBIWXMAAAsSAAALEgHS3X78AAACcklEQVQ4y43U21NSURQG8HXSsZoptJkmdUYBoRAvEIHI5XC/FpJohklqB1EK0CjTeOihpr/8a+29j3gkm+nhvH781lrfhkj/CIqfgNab/BmgtQNQsAF6sQt6/hbk2wat1kDLVdBSBbRYBj0rgp7mQK4MyJkEOeIgexQ0FwZRooPHWz8QMn7DUbvAXb2lAoN7oEAd5N/hwC3QymsO3AB5X4I8JQ7Mg9xZ0EKKA3Xc92ShiVDSP+Fe9hRztUuM68es/MDKfRUauEXpfWUqC6YyLZWaUwfNi8Bkl0fmscNHoMjR9dih9zz2O1P55qZyUSiLpjIzVJI9JgJ7EGPLXUZZGOFdhk1lqKFChdK/Dc2yS81blqEaKzUZyru08y4pdcqBXRUYO7mpDHCYn4U+Fq5wmLeqhB4WulnoZqGTR54XOv7E2JT+DKnkXVK8zcqWvLjGu5wp9zBb6uJJvo3Z/AnclS7GIgd4qB/AXTAwHduFK3cIb8nAo8CGKUz3MVTGLUoee6k+QLT5E6v1SwT3Bii2f3ETzmEr9uDc7MNRPYMrzz8cq2MyUFUVoswXDjz7Wxk28CDdxnSph8lUC1PJJmZyx5hKHGIs3JDfxPqeqpAYeyGtLk7Zr5BjS2XHVB4r5doh77Fh7pIPs7KtdrnM1/ZyyT1lWSHNYyk6Zc9xrewqpRhbKOXFOTS0r4ouXw8X3ScqtMnBtxSdct8wVCZHlZYK/W/RKX8BpewrZWJEKQJDZuC/im5VUuE7pFKMfVUhqVTHucN/HBPxFiaihkW5o5Ry7BElFQYYKq0VEkp+OeOpDmy5LmyZNjQxunw5V8qaqawMn+MfilrSNU8jUYsAAAAASUVORK5CYII=","aspectRatio":1.3333333333333333,"src":"/static/c0c6d2ea2bbb43b70ffd1df751246ff8-87d53cf59203fc548a27bbf406c57449-f454e.png","srcSet":"/static/c0c6d2ea2bbb43b70ffd1df751246ff8-87d53cf59203fc548a27bbf406c57449-224d3.png 175w,\n/static/c0c6d2ea2bbb43b70ffd1df751246ff8-87d53cf59203fc548a27bbf406c57449-a7b7f.png 350w,\n/static/c0c6d2ea2bbb43b70ffd1df751246ff8-87d53cf59203fc548a27bbf406c57449-f454e.png 700w,\n/static/c0c6d2ea2bbb43b70ffd1df751246ff8-87d53cf59203fc548a27bbf406c57449-325b7.png 
1050w,\n/static/c0c6d2ea2bbb43b70ffd1df751246ff8-87d53cf59203fc548a27bbf406c57449-03420.png 1400w,\n/static/c0c6d2ea2bbb43b70ffd1df751246ff8-87d53cf59203fc548a27bbf406c57449-d961b.png 2048w","sizes":"(max-width: 700px) 100vw, 700px"},"resize":{"src":"/static/c0c6d2ea2bbb43b70ffd1df751246ff8-87d53cf59203fc548a27bbf406c57449-a97b6.jpg"}}}},"fields":{"slug":"/showcase/www.healthcarelogic.com"}}},"pageContext":{"slug":"/showcase/www.healthcarelogic.com"}}
|
json
|
const AudioSynth = () => ({
createInstrument: () => ({
play: () => null,
}),
});
export default AudioSynth;
|
javascript
|
Yuva Samrat Akkineni Naga Chaitanya’s upcoming film with director Chandoo Mondeti is currently in the pre-production stage. Yesterday, the makers dropped a hint that a leading heroine had joined the cast of the yet-to-be-titled movie.
Today, they officially announced that Sai Pallavi has come on board for the project, marking her second collaboration with Naga Chaitanya. The news was unveiled through the release of a couple of pictures.
This eagerly awaited film will be produced by Bunny Vas on a grand scale, while Allu Arvind is the presenter. Anirudh Ravichander is expected to compose the music for the movie, with shooting scheduled to commence next month.
Articles that might interest you:
|
english
|
The BJP and the AAP on Thursday took a dig at the Congress over TS Singh Deo's appointment as Chhattisgarh deputy chief minister and asked if he will be the party's chief ministerial face in the upcoming elections since people have lost faith in the incumbent CM, Bhupesh Baghel. A scion of the erstwhile Sarguja royal family, Singh Deo is currently holding Health and Family Welfare, Medical Education, Twenty Point Implementation, and Commercial Tax (GST) portfolios in the state government.
The ruling Congress announced on Wednesday that party president Mallikarjun Kharge had approved the proposal for his appointment as the deputy chief minister.
Reacting to the development, BJP IT department head Amit Malviya said Congress president Kharge appointing Singh Deo as deputy chief minister of Chhattisgarh is deeply problematic as it usurps Chief Minister Bhupesh Baghel's constitutional mandate. "Before Karnataka, Baghel used to fund the Congress, but soon after winning Karnataka, once an alternate source of funding was available, he was cut to size," the BJP leader charged in a tweet, adding, "It is a case of Gandhis using a regional satrap, an OBC leader, till it was convenient, and then dumping him." "Will Singh Deo now be projected as the CM face since Baghel faces massive corruption charges and has become a liability?" Malviya also asked.
Commenting on the development, the AAP's Chhattisgarh in-charge Sanjeev Jha said that with its decision to appoint Singh Deo as deputy chief minister ahead of the state assembly polls, the Congress has accepted that the people of Chhattisgarh do not have faith in Chief Minister Bhupesh Baghel. "It's a damage control exercise by the Congress high command, but it is not going to help the party in the state as Chief Minister Bhupesh Baghel has caused a lot of losses to the people of the state by looting its resources," Jha told PTI.
The AAP leader asked the Congress to make it clear who will be the chief ministerial candidate of the party in the assembly polls due to be held this year. "The Congress high command has not cleared the party's stand on this. They should clear their stand on whether Singh Deo will be the face of the party and if there is no faith in Baghel now… There is confusion among people and Congress workers as well (on this issue)," Jha added.
BJP Chhattisgarh unit chief Arun Sao on Wednesday said Singh Deo's appointment as deputy chief minister at the fag end of the government's tenure was an injustice to him, and said the decision will not save the Congress from being defeated in the upcoming assembly elections. "When the time has come to complete the term, Singh Deo has been made deputy chief minister. This is not only insulting to the people of Sarguja but also an injustice to Singh Deo," Sao charged.
"The Congress is a sinking ship. With such decisions, the Congress is not going to gain anything. The people of Chhattisgarh have decided to dislodge the Congress government. The Congress is not going to get anything in Chhattisgarh," he added. Senior BJP leader and former Chhattisgarh chief minister Raman Singh said the Congress decision to appoint Singh Deo as deputy chief minister ahead of the assembly polls will have far-reaching consequences. With this decision, the Congress has accepted that elections in Chhattisgarh will not be contested under the leadership of Bhupesh Baghel, he added. (This story has not been edited by News18 staff and is published from a syndicated news agency feed - PTI)
|
english
|
"""Management command for resetting GitHub auth tokens."""
from __future__ import unicode_literals
import getpass
from django.utils.six.moves import input
from django.utils.translation import ugettext as _
from djblets.util.compat.django.core.management.base import BaseCommand
from reviewboard.hostingsvcs.errors import (AuthorizationError,
TwoFactorAuthCodeRequiredError)
from reviewboard.hostingsvcs.models import HostingServiceAccount
class Command(BaseCommand):
"""Management command for resetting GitHub auth tokens."""
help = _('Resets associated GitHub tokens.')
def add_arguments(self, parser):
"""Add arguments to the command.
Args:
parser (argparse.ArgumentParser):
The argument parser for the command.
"""
parser.add_argument(
'usernames',
metavar='USERNAME',
nargs='*',
help=_('Specific GitHub account users to reset. If not '
'provided, all users will be reset.'))
parser.add_argument(
'--yes',
action='store_true',
default=False,
dest='force_yes',
help=_('Answer yes to all questions'))
parser.add_argument(
'--local-sites',
action='store',
dest='local_sites',
help=_('Comma-separated list of Local Sites to filter by'))
def handle(self, *usernames, **options):
"""Handle the command.
Args:
*usernames (tuple):
A list of usernames containing tokens to reset.
**options (dict):
Options parsed on the command line. This command uses
'force_yes' and 'local_sites'.
Raises:
django.core.management.CommandError:
There was an error with arguments or disabling the extension.
"""
force_yes = options['force_yes']
local_sites = options['local_sites']
accounts = HostingServiceAccount.objects.filter(service_name='github')
if usernames:
accounts = accounts.filter(username__in=usernames)
if local_sites:
local_site_names = local_sites.split(',')
if local_site_names:
accounts = accounts.filter(
local_site__name__in=local_site_names)
for account in accounts:
if force_yes:
reset = 'y'
else:
if account.local_site:
reset_msg = _('Reset token for %(site_name)s '
'(%(username)s) [Y/n] ') % {
'site_name': account.local_site.name,
'username': account.username,
}
else:
reset_msg = _('Reset token for %s [Y/n] ') % (
account.username)
reset = input(reset_msg)
if reset != 'n':
self._reset_token(account)
def _reset_token(self, account):
"""Reset the token for an account.
Args:
account (reviewboard.hostingsvcs.models.HostingServiceAccount):
The account containing a token to reset.
"""
service = account.service
password = None
auth_token = None
while True:
if (not password and
service.get_reset_auth_token_requires_password()):
password = getpass.getpass(_('Password for %s: ')
% account.username)
auth_token = None
try:
service.reset_auth_token(password, auth_token)
self.stdout.write(_('Successfully reset token for %s\n')
% account.username)
break
except TwoFactorAuthCodeRequiredError:
auth_token = input('Enter your two-factor auth token: ')
except AuthorizationError as e:
self.stderr.write('%s\n' % e)
password = None
except Exception as e:
self.stderr.write(_('Unexpected error: %s\n') % e)
raise
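The confirmation prompt in `handle()` is default-yes: anything other than an explicit `'n'` (including just pressing Enter) proceeds with the reset. A standalone sketch of that pattern, as a slightly more forgiving variant that also normalises case and whitespace:

```python
def confirm(answer: str) -> bool:
    """Default-yes confirmation: only an explicit 'n' declines.

    Mirrors the `reset != 'n'` check in handle() above, but also
    strips whitespace and lower-cases the input, so 'N' and ' n '
    decline as well.
    """
    return answer.strip().lower() != 'n'


# Empty input (just pressing Enter) counts as yes.
print(confirm(''))   # True
print(confirm('y'))  # True
print(confirm('N'))  # False
```

This is why the `[Y/n]` prompt capitalises the Y: the affirmative answer is the default.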
|
python
|
{"jodit.min.css":"sha512-KPrc6p+<KEY>,"jodit.min.js":"<KEY>}
|
json
|
Europe, perhaps, doesn’t even need an introduction. How do you befittingly describe enjoying a gelato near the Eiffel Tower, or walking through the ruins of Pompeii feeling your hair stand on end, or rowing down the canals of Venice with a loved one by your side, or going crazy at the Tomatina festival in Spain, or lying on your back and watching the Northern Lights in Norway, or a day reminiscing the Holocaust at German concentration camps, or enjoying some fine whisky in Scotland and some fine wine in France?
You have either been here or you haven’t. You either did that or you didn’t. The only way to know Europe is to be there, in all of its countries, doing all of the things, experiencing its diverse cultures, monuments, history, languages and people first hand. Unfortunately, there isn’t enough time in anyone's life, and the ideal Europe itinerary starts from the best and rarest experiences that will at least scratch the surface.
Europe is simply too big, in size, sights, cultures, experiences and things to do, to fit in one itinerary or one introduction.
Europe has an unbelievably smooth rail network. Eurail offers several national passes and special fares that make train travel the cheapest and most hassle-free mode of travel between countries. To explore the best of what Europe tourism has to offer, renting cars, cycling and hitching rides (a paid privilege) are popular options.
Most Parisians walk or use a Vélib bicycle when they want to travel locally. The metro and RER train system is also an inexpensive and fast way of going anywhere and everywhere within the city. Day passes are available from $14 onwards and a single ticket costs $2.
1. Explore millions of art pieces, including the Mona Lisa, at the world's most visited museum, the Louvre. Put this on your Europe itinerary.
2. Watch grand cabaret and burlesque performances at the exotic Lido Show.
3. Let loose at the Walt Disney Studio Park at Disneyland Paris. This is one of the most popular things to do in Europe.
4. Wander on the Rue Mouffetard. Shop at the outdoor market and when you are tired, sit at a cafe and watch people go by.
5. Get out early in the morning, beat the crowd and see the Eiffel Tower.
6. Marvel in the glory of the extravagant Palace of Versailles – a short train ride away from Paris.
1. Epicure – A three Michelin star holder, this restaurant serves an incredibly flavourful culinary experience.
2. Paris Picnic – A unique place that offers delectable picnic baskets that can be eaten in house or taken away to a spot of your choice.
The rail system is the best means of getting around in Switzerland. Local trains take you to all major tourist attractions. Otherwise there are also Swiss buses and cars on rent.
1. Take a picturesque train journey to the highest station, Jungfraujoch, at 3,454 meters, for a breathtaking panoramic view of pearly white mountains and glaciers.
2. Tour the gorgeous Ice Palace, full of glittering ice sculptures, located 15 metres under the surface of a glacier. You can't miss this in your Europe itinerary.
3. Put on your gear and ski down the beautiful Bernese Oberland Mountains in Interlaken.
4. Take a scenic ride on the world's first revolving gondola, Rotair, towards the summit of Titlis.
5. Stroll through the 14th-century Kapellbrücke on the Reuss River in the Old Town, Lucerne.
1. STERN Luzern – Enjoy a delicious European meal here at the heart of Lucerne.
2. Husi Bierhaus – The best pub in Interlaken with the juiciest burgers and the most delicious beers.
Train, plane and car are the three best ways of moving around in Italy. Renting a car will be cost-effective if you are a big group and you plan on stopping at several places. Sometimes, budget flights from one point to another will be cheaper than a car. The reliable train network can also be a fast way of commuting to see all the places to visit in Europe.
1. Take a slow-moving ride on a Vaporetto in Venice – make sure this is one of the things to do in Europe.
2. Try a different flavour of gelato every day. Every Europe itinerary has to have a stop at a gelato parlour.
3. Admire the intricate architecture of the Colosseum in Rome.
4. Bask in the magnificence that is the St. Peter’s Basilica, in Vatican City.
1. Likeat – With fresh ingredients and toppings, this place serves the best sandwiches and paninis in Rome.
2. Gusto Giusto – Try their fillet steak, pastas and creme brulee. You won't want to stop eating.
ÖBB is Austria's main rail provider and runs a countrywide train network. Take an ÖBB train almost anywhere in the country. And when you can't, catch a Postbus. Reasonably-priced day passes are available.
1. Take a walking tour in Old Town, Innsbruck and enjoy the beautiful architecture visible in the house-fronts, doorways and windows. This town is one of the most colourful places to visit in Europe.
2. Shop for some glittering jewels at the Swarovski Crystal Shop in Innsbruck. This has to be one of the things to do in Europe.
1. Breakfast Club – A cozy and fun place in downtown Innsbruck for delicious coffee and a whimsical breakfast menu.
2. Gasthof Weisses Rossl – This is the place to get your fill of the schnitzel. Put this on your Europe itinerary.
Using the Munich subway is perhaps the best way of getting around in this city. It'll also save you from using a car – as parking is expensive and difficult to find here.
1. Pack a food basket and some beer and eat at a picnic table at a Beer Garden.
2. Get transported to your favourite fairy tale in Neuschwanstein Castle, Bavaria. Europe tourism can't get any more scenic than this.
3. Delight in the sight of the freshest fruits, vegetables, meats and breads at Viktualienmarkt.
1. Vinpasa – If you like being looked after and flavourful Italian food, this is the place for you.
2. Broeding – Enjoy a traditional German meal at this perfect location – put it on the things to do in Europe.
Go on. Don't keep waiting. Go live your European dream.
All hotel prices are approximate and are on a per day basis. They were last updated on 19th January 2016.
|
english
|
import { ApplicationInitStatus } from '@angular/core';
import { TestBed, waitForAsync } from '@angular/core/testing';
import { NgxRxdbCollectionService } from './rxdb-collection.service';
import { MockNgxRxdbService, TEST_DB_CONFIG_1, TEST_FEATURE_CONFIG_1 } from './rxdb.mock';
import { addRxPlugin } from 'rxdb/plugins/core';
import { NgxRxdbFeatureModule, NgxRxdbModule } from './rxdb.module';
import { NgxRxdbService } from './rxdb.service';
import { RXDB_CONFIG } from './rxdb.token';
import { setupNavigationWarnStub } from './utils';
addRxPlugin(require('pouchdb-adapter-node-websql'));
describe('NgxRxdbModule', () => {
beforeAll(() => {
setupNavigationWarnStub();
});
describe('NgxRxdbModule :: init w/o forRoot', () => {
beforeEach(() => {
TestBed.configureTestingModule({
imports: [NgxRxdbModule],
});
});
it('should create', () => {
expect(NgxRxdbModule).toBeDefined();
});
it(`should not provide 'RXDB_CONFIG' token & 'NgxRxdbService'`, () => {
expect(() => TestBed.inject(RXDB_CONFIG)).toThrowError(
/InjectionToken NgxRxdbConfig is not provided. Make sure you call the 'forRoot'/
);
expect(() => TestBed.inject(NgxRxdbService)).toThrowError(
// /No provider for/
/InjectionToken NgxRxdbConfig is not provided. Make sure you call the 'forRoot'/
);
});
});
describe(`NgxRxdbModule :: forRoot()`, () => {
beforeEach(() => {
TestBed.configureTestingModule({
imports: [NgxRxdbModule.forRoot(TEST_DB_CONFIG_1)],
});
});
it(`should provide db service`, () => {
expect(TestBed.inject(NgxRxdbService)).toBeDefined();
});
it(`should provide db config`, () => {
expect(TestBed.inject(RXDB_CONFIG)).toBeDefined();
});
});
describe(`NgxRxdbModule :: init w/o forFeature`, () => {
beforeEach(() => {
TestBed.configureTestingModule({
imports: [NgxRxdbModule.forRoot(TEST_DB_CONFIG_1)],
providers: [{ provide: NgxRxdbService, useClass: MockNgxRxdbService }],
});
});
it('should create', () => {
expect(NgxRxdbModule).toBeDefined();
});
it(`should not provide feature config token & collection service`, () => {
expect(() => TestBed.inject(NgxRxdbCollectionService)).toThrowError(
/No provider for/
);
});
});
describe(`NgxRxdbModule :: forFeature`, () => {
let dbService: NgxRxdbService;
let dbInitSpy;
beforeEach(() => {
TestBed.configureTestingModule({
imports: [
NgxRxdbModule.forRoot(TEST_DB_CONFIG_1),
NgxRxdbModule.forFeature(TEST_FEATURE_CONFIG_1),
],
providers: [{ provide: NgxRxdbService, useClass: MockNgxRxdbService }],
});
dbService = TestBed.inject(NgxRxdbService);
dbInitSpy = jest.spyOn(dbService, 'initDb');
});
it(
`should init db via dbService`,
waitForAsync(async () => {
expect(dbInitSpy).toHaveBeenCalled();
const calls = dbInitSpy.mock.calls;
expect(calls[0].length).toEqual(1);
expect(calls[0][0]).toEqual(TEST_DB_CONFIG_1);
await Promise.resolve();
})
);
it(
`should provide collectionConfig & collection service`,
waitForAsync(() => {
expect(NgxRxdbFeatureModule).toBeDefined();
expect(TestBed.inject(NgxRxdbCollectionService)).toBeDefined();
})
);
});
});
|
typescript
|
/*********************************************************\
 *  File: SPRTTShaders.cpp
*
* Copyright (C) 2002-2013 The PixelLight Team (http://www.pixellight.org/)
*
* This file is part of PixelLight.
*
* Permission is hereby granted, free of charge, to any person obtaining a copy of this software
* and associated documentation files (the "Software"), to deal in the Software without
* restriction, including without limitation the rights to use, copy, modify, merge, publish,
* distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the
* Software is furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all copies or
* substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING
* BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
* NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
* DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
* FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
\*********************************************************/
//[-------------------------------------------------------]
//[ Includes ]
//[-------------------------------------------------------]
#include <PLCore/Tools/Timing.h>
#include <PLMath/Matrix4x4.h>
#include <PLMath/Rectangle.h>
#include <PLGraphics/Image/Image.h>
#include <PLGraphics/Image/ImageBuffer.h>
#include <PLRenderer/Renderer/Renderer.h>
#include <PLRenderer/Renderer/VertexBuffer.h>
#include <PLRenderer/Renderer/VertexShader.h>
#include <PLRenderer/Renderer/ProgramWrapper.h>
#include <PLRenderer/Renderer/ShaderLanguage.h>
#include <PLRenderer/Renderer/ProgramUniform.h>
#include <PLRenderer/Renderer/ProgramAttribute.h>
#include <PLRenderer/Renderer/FragmentShader.h>
#include <PLRenderer/Renderer/TextureBuffer2D.h>
#include <PLMesh/MeshHandler.h>
#include <PLMesh/MeshLODLevel.h>
#include "SPRTTShaders.h"
//[-------------------------------------------------------]
//[ Namespace ]
//[-------------------------------------------------------]
using namespace PLCore;
using namespace PLMath;
using namespace PLGraphics;
using namespace PLRenderer;
using namespace PLMesh;
//[-------------------------------------------------------]
//[ RTTI interface ]
//[-------------------------------------------------------]
pl_class_metadata(SPRTTShaders, "", SPRTT, "Shaders based render to texture surface painter")
// Constructors
pl_constructor_1_metadata(ParameterConstructor, PLRenderer::Renderer&, "Parameter constructor", "")
pl_class_metadata_end(SPRTTShaders)
//[-------------------------------------------------------]
//[ Public functions ]
//[-------------------------------------------------------]
/**
* @brief
* Constructor
*/
SPRTTShaders::SPRTTShaders(Renderer &cRenderer) : SPRTT(cRenderer),
m_pRenderTarget(nullptr),
m_pColorTarget1(nullptr),
m_pColorTarget2(nullptr),
m_pColorTarget3(nullptr),
m_pSceneVertexShader(nullptr),
m_pSceneFragmentShader(nullptr),
m_pSceneProgram(nullptr),
m_pVertexShader(nullptr),
m_pFragmentShader(nullptr),
m_pProgram(nullptr)
{
// Check/get number of supported color render targets
uint8 nMaxColorTargets = 4;
if (nMaxColorTargets > cRenderer.GetCapabilities().nMaxColorRenderTargets)
nMaxColorTargets = cRenderer.GetCapabilities().nMaxColorRenderTargets;
{ // Render targets
// Create the render target. We will create a very low resolution 2D texture buffer to see funny pixels.
m_pRenderTarget = cRenderer.CreateSurfaceTextureBuffer2D(Vector2i(64, 64), TextureBuffer::R8G8B8, SurfaceTextureBuffer::Depth|SurfaceTextureBuffer::NoMultisampleAntialiasing, nMaxColorTargets);
if (m_pRenderTarget && nMaxColorTargets > 1) {
// Set additional color render targets
if (nMaxColorTargets > 1 && !m_pColorTarget1) {
Image cImage = Image::CreateImage(DataByte, ColorRGB, Vector3i(64, 64, 1));
m_pColorTarget1 = cRenderer.CreateTextureBuffer2D(cImage, TextureBuffer::Unknown, TextureBuffer::RenderTarget);
}
if (nMaxColorTargets > 2 && !m_pColorTarget2) {
Image cImage = Image::CreateImage(DataByte, ColorRGB, Vector3i(64, 64, 1));
m_pColorTarget2 = cRenderer.CreateTextureBuffer2D(cImage, TextureBuffer::Unknown, TextureBuffer::RenderTarget);
}
if (nMaxColorTargets > 3 && !m_pColorTarget3) {
Image cImage = Image::CreateImage(DataByte, ColorRGB, Vector3i(64, 64, 1));
m_pColorTarget3 = cRenderer.CreateTextureBuffer2D(cImage, TextureBuffer::Unknown, TextureBuffer::RenderTarget);
}
}
}
// Decide which shader language should be used (for example "GLSL" or "Cg")
ShaderLanguage *pShaderLanguage = cRenderer.GetShaderLanguage(cRenderer.GetDefaultShaderLanguage());
if (pShaderLanguage) {
{ // Scene program (with MRT support)
// Construct the string containing the fragment shader definitions
String sDefinitions;
if (m_pColorTarget1)
sDefinitions += "#define MRT_1\n";
if (m_pColorTarget2)
sDefinitions += "#define MRT_2\n";
if (m_pColorTarget3)
sDefinitions += "#define MRT_3\n";
// Shader source code
String sVertexShaderSourceCode;
String sFragmentShaderSourceCode;
if (pShaderLanguage->GetShaderLanguage() == "GLSL") {
#include "SPRTTShaders_GLSL.h"
if (cRenderer.GetAPI() == "OpenGL ES 2.0") {
// Get shader source codes
sVertexShaderSourceCode = "#version 100\n" + sSceneVertexShaderSourceCodeGLSL;
sFragmentShaderSourceCode = "#version 100\n" + sDefinitions + sSceneFragmentShaderSourceCodeGLSL;
} else {
// Remove precision qualifiers so that we're able to use 120 (OpenGL 2.1 shaders) instead of 130 (OpenGL 3.0 shaders,
// which would let us keep the precision qualifiers) so that the shader requirements are as low as possible
// -> In here we're using 120 instead of 110 because matrix casts are quite comfortable...
sVertexShaderSourceCode = "#version 120\n" + Shader::RemovePrecisionQualifiersFromGLSL(sSceneVertexShaderSourceCodeGLSL);
sFragmentShaderSourceCode = "#version 120\n" + sDefinitions + Shader::RemovePrecisionQualifiersFromGLSL(sSceneFragmentShaderSourceCodeGLSL);
}
} else if (pShaderLanguage->GetShaderLanguage() == "Cg") {
#include "SPRTTShaders_Cg.h"
sVertexShaderSourceCode = sSceneVertexShaderSourceCodeCg;
sFragmentShaderSourceCode = sSceneFragmentShaderSourceCodeCg;
}
// Create a vertex shader instance
// -> I define a Cg profile because when using a GLSL Cg profile (which is the default), the shader does not work correctly on my AMD/ATI ("AMD Catalyst™ 11.3") system while it worked on the tested NVIDIA system...
m_pSceneVertexShader = pShaderLanguage->CreateVertexShader(sVertexShaderSourceCode, "arbvp1");
// Create a fragment shader instance
// -> I define a Cg profile because when using a GLSL Cg profile (which is the default), the shader does not work correctly on my AMD/ATI ("AMD Catalyst™ 11.3") system while it worked on the tested NVIDIA system...
m_pSceneFragmentShader = pShaderLanguage->CreateFragmentShader(sFragmentShaderSourceCode, "arbfp1");
// Create a program instance and assign the created vertex and fragment shaders to it
m_pSceneProgram = pShaderLanguage->CreateProgram(m_pSceneVertexShader, m_pSceneFragmentShader);
}
{ // Program
// Shader source code
String sVertexShaderSourceCode;
String sFragmentShaderSourceCode;
if (pShaderLanguage->GetShaderLanguage() == "GLSL") {
#include "SPRTTShaders_GLSL.h"
if (cRenderer.GetAPI() == "OpenGL ES 2.0") {
// Get shader source codes
sVertexShaderSourceCode = "#version 100\n" + sVertexShaderSourceCodeGLSL;
sFragmentShaderSourceCode = "#version 100\n" + sFragmentShaderSourceCodeGLSL;
} else {
// Remove precision qualifiers so that we're able to use 110 (OpenGL 2.0 shaders) instead of 130 (OpenGL 3.0 shaders,
// which would let us keep the precision qualifiers) so that the shader requirements are as low as possible
sVertexShaderSourceCode = "#version 110\n" + Shader::RemovePrecisionQualifiersFromGLSL(sVertexShaderSourceCodeGLSL);
sFragmentShaderSourceCode = "#version 110\n" + Shader::RemovePrecisionQualifiersFromGLSL(sFragmentShaderSourceCodeGLSL);
}
} else if (pShaderLanguage->GetShaderLanguage() == "Cg") {
#include "SPRTTShaders_Cg.h"
sVertexShaderSourceCode = sVertexShaderSourceCodeCg;
sFragmentShaderSourceCode = sFragmentShaderSourceCodeCg;
}
// Create a vertex shader instance
// -> I define a Cg profile because when using a GLSL Cg profile (which is the default), the shader does not work correctly on my AMD/ATI ("AMD Catalyst™ 11.3") system while it worked on the tested NVIDIA system...
m_pVertexShader = pShaderLanguage->CreateVertexShader(sVertexShaderSourceCode, "arbvp1");
// Create a fragment shader instance
// -> I define a Cg profile because when using a GLSL Cg profile (which is the default), the shader does not work correctly on my AMD/ATI ("AMD Catalyst™ 11.3") system while it worked on the tested NVIDIA system...
m_pFragmentShader = pShaderLanguage->CreateFragmentShader(sFragmentShaderSourceCode, "arbfp1");
// Create a program instance and assign the created vertex and fragment shaders to it
m_pProgram = static_cast<ProgramWrapper*>(pShaderLanguage->CreateProgram(m_pVertexShader, m_pFragmentShader));
}
}
}
/**
* @brief
* Destructor
*/
SPRTTShaders::~SPRTTShaders()
{
// Destroy the render targets
if (m_pRenderTarget)
delete m_pRenderTarget;
if (m_pColorTarget1)
delete m_pColorTarget1;
if (m_pColorTarget2)
delete m_pColorTarget2;
if (m_pColorTarget3)
delete m_pColorTarget3;
// Cleanup
if (m_pSceneProgram)
delete m_pSceneProgram;
if (m_pSceneFragmentShader)
delete m_pSceneFragmentShader;
if (m_pSceneVertexShader)
delete m_pSceneVertexShader;
if (m_pProgram)
delete m_pProgram;
if (m_pFragmentShader)
delete m_pFragmentShader;
if (m_pVertexShader)
delete m_pVertexShader;
}
//[-------------------------------------------------------]
//[ Private functions ]
//[-------------------------------------------------------]
/**
* @brief
* Draws the scene
*/
void SPRTTShaders::DrawScene(Renderer &cRenderer)
{
	// Clear the content of the currently used render target with gray (this way, in case of a graphics error we might still see at least something)
cRenderer.Clear(Clear::Color | Clear::ZBuffer, Color4::Gray);
// Make our program to the current one
if (cRenderer.SetProgram(m_pSceneProgram)) {
// Calculate the world matrix
Matrix4x4 mWorld;
{
// Build a rotation matrix by using a given Euler angle around the y-axis
mWorld.FromEulerAngleY(static_cast<float>(m_fRotation*Math::DegToRad));
}
// Set program uniforms
ProgramUniform *pProgramUniform = m_pSceneProgram->GetUniform("ObjectSpaceToClipSpaceMatrix");
if (pProgramUniform) {
// Calculate the view matrix
Matrix4x4 mView;
{
mView.SetTranslation(0.0f, -0.1f, -0.5f);
}
// Calculate the projection matrix
Matrix4x4 mProjection;
{
				const float fAspect = 1.0f;
				const float fAspectRatio = cRenderer.GetViewport().GetWidth()/(cRenderer.GetViewport().GetHeight()*fAspect);
				mProjection.PerspectiveFov(static_cast<float>(45.0f*Math::DegToRad), fAspectRatio, 0.001f, 1000.0f);
}
// Calculate the final composed world view projection matrix
const Matrix4x4 mWorldViewProjection = mProjection*mView*mWorld;
// Set object space to clip space matrix uniform
pProgramUniform->Set(mWorldViewProjection);
}
// Set object space to world space matrix uniform
pProgramUniform = m_pSceneProgram->GetUniform("ObjectSpaceToWorldSpaceMatrix");
if (pProgramUniform)
pProgramUniform->Set(mWorld);
// Set world space light direction
pProgramUniform = m_pSceneProgram->GetUniform("LightDirection");
if (pProgramUniform)
pProgramUniform->Set(Vector3::UnitZ);
// Get the used mesh
const Mesh *pMesh = m_pMeshHandler->GetMesh();
if (pMesh) {
// Get the mesh LOD level to use
const MeshLODLevel *pLODLevel = pMesh->GetLODLevel(0);
if (pLODLevel && pLODLevel->GetIndexBuffer()) {
// Get and use the index buffer of the mesh LOD level
cRenderer.SetIndexBuffer(pLODLevel->GetIndexBuffer());
// Get the vertex buffer of the mesh handler
VertexBuffer *pVertexBuffer = m_pMeshHandler->GetVertexBuffer();
if (pVertexBuffer) {
// Set program vertex attributes, this creates a connection between "Vertex Buffer Attribute" and "Vertex Shader Attribute"
ProgramAttribute *pProgramAttribute = m_pSceneProgram->GetAttribute("VertexPosition");
if (pProgramAttribute)
pProgramAttribute->Set(pVertexBuffer, VertexBuffer::Position);
pProgramAttribute = m_pSceneProgram->GetAttribute("VertexNormal");
if (pProgramAttribute)
pProgramAttribute->Set(pVertexBuffer, VertexBuffer::Normal);
// Loop through all geometries of the mesh
const Array<Geometry> &lstGeometries = *pLODLevel->GetGeometries();
for (uint32 nGeo=0; nGeo<lstGeometries.GetNumOfElements(); nGeo++) {
// Is this geometry active?
const Geometry &cGeometry = lstGeometries[nGeo];
if (cGeometry.IsActive()) {
// Draw the geometry
cRenderer.DrawIndexedPrimitives(
cGeometry.GetPrimitiveType(),
0,
pVertexBuffer->GetNumOfElements()-1,
cGeometry.GetStartIndex(),
cGeometry.GetIndexSize()
);
}
}
}
}
}
}
}
//[-------------------------------------------------------]
//[ Private virtual PLRenderer::SurfacePainter functions ]
//[-------------------------------------------------------]
void SPRTTShaders::OnPaint(Surface &cSurface)
{
// Get the used renderer
Renderer &cRenderer = GetRenderer();
	// Clear the content of the currently used render target with gray (this way, in case of a graphics error we might still see at least something)
cRenderer.Clear(Clear::Color | Clear::ZBuffer, Color4::Gray);
// Backup current render target and set the new one to render in our texture buffer
Surface *pRenderSurfaceBackup = cRenderer.GetRenderTarget();
if (cRenderer.SetRenderTarget(m_pRenderTarget)) {
if (m_pColorTarget1)
cRenderer.SetColorRenderTarget(static_cast<TextureBuffer*>(m_pColorTarget1), 1);
if (m_pColorTarget2)
cRenderer.SetColorRenderTarget(static_cast<TextureBuffer*>(m_pColorTarget2), 2);
if (m_pColorTarget3)
cRenderer.SetColorRenderTarget(static_cast<TextureBuffer*>(m_pColorTarget3), 3);
// Draw the scene
DrawScene(cRenderer);
// Reset render target
cRenderer.SetRenderTarget(pRenderSurfaceBackup);
}
	// This code is similar to the code of the triangle sample. But instead of a
	// triangle we will draw several rotating quadrangles. Further, the used vertex
	// buffer has texture coordinates and we apply the 'teapot' texture buffer on the primitives.
// Make our program to the current one
if (cRenderer.SetProgram(m_pProgram)) {
// Set program vertex attributes, this creates a connection between "Vertex Buffer Attribute" and "Vertex Shader Attribute"
m_pProgram->Set("VertexPosition", m_pPositionVertexBuffer, VertexBuffer::Position);
m_pProgram->Set("VertexTextureCoordinate", m_pPositionVertexBuffer, VertexBuffer::TexCoord);
m_pProgram->Set("VertexColor", m_pColorVertexBuffer, VertexBuffer::Color);
// Set color factor
m_pProgram->Set("ColorFactor", 0.0f);
// Calculate the composed view projection matrix - we need it multiple times
Matrix4x4 mViewProjection;
{
// Calculate the view matrix
Matrix4x4 mView;
{
mView.SetTranslation(0.0f, 0.0f, -5.0f);
}
// Calculate the projection matrix
Matrix4x4 mProjection;
{
			const float fAspect = 1.0f;
			const float fAspectRatio = cRenderer.GetViewport().GetWidth()/(cRenderer.GetViewport().GetHeight()*fAspect);
			mProjection.PerspectiveFov(static_cast<float>(45.0f*Math::DegToRad), fAspectRatio, 0.001f, 1000.0f);
}
// Calculate the composed view projection matrix
mViewProjection = mProjection*mView;
}
// No back face culling, please. Else we can only see one 'side' of the quadrangle
cRenderer.SetRenderState(RenderState::CullMode, Cull::None);
Matrix4x4 mWorld;
{ // Draw quadrangle 1: Primary render target
// Set object space to clip space matrix uniform
ProgramUniform *pProgramUniform = m_pProgram->GetUniform("ObjectSpaceToClipSpaceMatrix");
if (pProgramUniform) {
mWorld.SetTranslationMatrix(2.0f, 1.0f, 0.0f);
pProgramUniform->Set(mViewProjection*mWorld);
}
{ // Set the texture buffer we rendered our teapot in as the current texture buffer
const int nTextureUnit = m_pProgram->Set("DiffuseMap", m_pRenderTarget->GetTextureBuffer());
if (nTextureUnit >= 0) {
					// Disable mip mapping - this is required because we created/filled no mipmaps for our texture buffer
cRenderer.SetSamplerState(nTextureUnit, Sampler::MagFilter, TextureFiltering::Linear);
cRenderer.SetSamplerState(nTextureUnit, Sampler::MinFilter, TextureFiltering::Linear);
cRenderer.SetSamplerState(nTextureUnit, Sampler::MipFilter, TextureFiltering::None);
}
}
// Draw
cRenderer.DrawPrimitives(Primitive::TriangleStrip, 0, 4);
}
{ // Draw quadrangle 2: Color render target 1
// Set object space to clip space matrix uniform
ProgramUniform *pProgramUniform = m_pProgram->GetUniform("ObjectSpaceToClipSpaceMatrix");
if (pProgramUniform) {
mWorld.FromEulerAngleY(static_cast<float>(m_fRotation*Math::DegToRad));
mWorld.SetTranslation(-2.0f, 1.0f, 0.0f);
pProgramUniform->Set(mViewProjection*mWorld);
}
{ // Set the texture buffer we rendered our teapot in as the current texture buffer
const int nTextureUnit = m_pProgram->Set("DiffuseMap", m_pColorTarget1 ? static_cast<TextureBuffer*>(m_pColorTarget1) : m_pRenderTarget->GetTextureBuffer());
if (nTextureUnit >= 0) {
					// Disable mip mapping - this is required because we created/filled no mipmaps for our texture buffer
cRenderer.SetSamplerState(nTextureUnit, Sampler::MagFilter, TextureFiltering::Linear);
cRenderer.SetSamplerState(nTextureUnit, Sampler::MinFilter, TextureFiltering::Linear);
cRenderer.SetSamplerState(nTextureUnit, Sampler::MipFilter, TextureFiltering::None);
}
}
// Draw
cRenderer.DrawPrimitives(Primitive::TriangleStrip, 0, 4);
}
{ // Draw quadrangle 3: Color render target 2
// Set object space to clip space matrix uniform
ProgramUniform *pProgramUniform = m_pProgram->GetUniform("ObjectSpaceToClipSpaceMatrix");
if (pProgramUniform) {
mWorld.FromEulerAngleZ(static_cast<float>(m_fRotation*Math::DegToRad));
mWorld.SetTranslation(0.0f, 1.0f, 0.0f);
pProgramUniform->Set(mViewProjection*mWorld);
}
{ // Set the texture buffer we rendered our teapot in as the current texture buffer
const int nTextureUnit = m_pProgram->Set("DiffuseMap", m_pColorTarget2 ? static_cast<TextureBuffer*>(m_pColorTarget2) : m_pRenderTarget->GetTextureBuffer());
if (nTextureUnit >= 0) {
					// Disable mip mapping - this is required because we created/filled no mipmaps for our texture buffer
cRenderer.SetSamplerState(nTextureUnit, Sampler::MagFilter, TextureFiltering::Linear);
cRenderer.SetSamplerState(nTextureUnit, Sampler::MinFilter, TextureFiltering::Linear);
cRenderer.SetSamplerState(nTextureUnit, Sampler::MipFilter, TextureFiltering::None);
}
}
// Draw
cRenderer.DrawPrimitives(Primitive::TriangleStrip, 0, 4);
}
{ // Draw quadrangle 4: Color render target 3
// Set object space to clip space matrix uniform
ProgramUniform *pProgramUniform = m_pProgram->GetUniform("ObjectSpaceToClipSpaceMatrix");
if (pProgramUniform) {
mWorld.FromEulerAngleZ(static_cast<float>(-m_fRotation*Math::DegToRad));
mWorld.SetTranslation(-2.0f, -1.0f, 0.0f);
pProgramUniform->Set(mViewProjection*mWorld);
}
{ // Set the texture buffer we rendered our teapot in as the current texture buffer
const int nTextureUnit = m_pProgram->Set("DiffuseMap", m_pColorTarget3 ? static_cast<TextureBuffer*>(m_pColorTarget3) : m_pRenderTarget->GetTextureBuffer());
if (nTextureUnit >= 0) {
					// Disable mip mapping - this is required because we created/filled no mipmaps for our texture buffer
cRenderer.SetSamplerState(nTextureUnit, Sampler::MagFilter, TextureFiltering::Linear);
cRenderer.SetSamplerState(nTextureUnit, Sampler::MinFilter, TextureFiltering::Linear);
cRenderer.SetSamplerState(nTextureUnit, Sampler::MipFilter, TextureFiltering::None);
}
}
// Draw
cRenderer.DrawPrimitives(Primitive::TriangleStrip, 0, 4);
}
		{ // Draw quadrangle 5: Primary render target, but with per vertex color - the primitive will be quite colorful :)
// Set object space to clip space matrix uniform
ProgramUniform *pProgramUniform = m_pProgram->GetUniform("ObjectSpaceToClipSpaceMatrix");
if (pProgramUniform) {
mWorld.FromEulerAngleZ(static_cast<float>(-m_fRotation*Math::DegToRad));
mWorld.SetTranslation(2.0f, -1.0f, 0.0f);
pProgramUniform->Set(mViewProjection*mWorld);
}
// Set color factor
m_pProgram->Set("ColorFactor", 1.0f);
{ // Set the texture buffer we rendered our teapot in as the current texture buffer
const int nTextureUnit = m_pProgram->Set("DiffuseMap", m_pRenderTarget->GetTextureBuffer());
if (nTextureUnit >= 0) {
					// Disable mip mapping - this is required because we created/filled no mipmaps for our texture buffer
cRenderer.SetSamplerState(nTextureUnit, Sampler::MagFilter, TextureFiltering::Linear);
cRenderer.SetSamplerState(nTextureUnit, Sampler::MinFilter, TextureFiltering::Linear);
cRenderer.SetSamplerState(nTextureUnit, Sampler::MipFilter, TextureFiltering::None);
}
}
// Draw
cRenderer.DrawPrimitives(Primitive::TriangleStrip, 0, 4);
}
// Change the backface culling back to the default setting
cRenderer.SetRenderState(RenderState::CullMode, Cull::CCW);
}
	// Increase the rotation by the current time difference (time passed since the last frame)
// -> Normally, such work should NOT be performed within the rendering step, but we want
// to keep the implementation simple in here...
m_fRotation += Timing::GetInstance()->GetTimeDifference()*50;
}
|
cpp
|
/*=============================================================
    Author URI: www.binarytheme.com
    License: Creative Commons Attribution 3.0
    http://creativecommons.org/licenses/by/3.0/
    100% Free To Use For Personal And Commercial Use.
    IN EXCHANGE, JUST GIVE US CREDIT AND TELL YOUR FRIENDS ABOUT US
======================================================== */
/*=====================================
GENERAL STYLE SECTION
===================================*/
body {
font-family: 'Open Sans', sans-serif;
font-size:14px;
line-height:25px;
}
.text-center {
text-align:center;
}
.sub-head {
padding-bottom:50px;
display:block;
}
.pad-top-botm {
padding-top:50px;
padding-bottom:50px;
}
h2 {
color:#ff0000;
font-weight:900;
}
h3{
color:#159238;
}
#home{
background: url(../images/6.jpg) no-repeat center center;
padding: 0;
-webkit-background-size: cover;
background-size: cover;
}
#home h1 {
padding:150px 20px 20px 0px;
color: white;
font-size:40px;
font-weight:900;
}
#home .head-text span {
display: block;
font-size: 24px;
color: white;
/*margin-bottom: 25px;*/
}
#home .head-text span >i {
margin-right: 10px;
color: #0000F0;
}
#home p {
padding:10px 20px 20px 20px;
color:#fff;
font-size:16px;
line-height:20px;
font-weight:800;
text-align:center;
}
#home .overlay {
/*background:url(../img/overlays/06.png);*/
min-height:700px;
}
#home h3 {
color:#fff;
}
#about {
background-color:#122b40;
padding-top: 80px;
color: white;
}
.big-icon {
font-size:250px;
}
.why-div {
text-align: justify;
padding: 10px 50px;
}
.donars-section {
background: url(../images/7.jpg) no-repeat center center;
padding: 0;
-webkit-background-size: cover;
background-size: cover;
color:white;
}
.donars-section .overlay {
background: url(../images/overlays/02.png);
}
.slide-custom {
min-height:200px;
padding:100px 20px 100px 20px;
line-height:30px;
}
.donars-section h4 {
line-height:40px;
}
.donars-section h4 i{
padding:5px;
}
|
css
|
NAIROBI, Kenya Aug 4 – Foreign Affairs Assistant Minister Richard Onyonka was questioned by detectives at the Kenya Anti-Corruption Commission on Thursday over allegations of squandering some Sh137 million from the Kitutu Chache Constituency Development Fund (CDF) kitty.
Mr Onyonka was summoned to the Integrity Centre at 8.30am to explain why he used the money to buy sugar from Chemilil Sugar Company and why he failed to remit Sh18 million to the Kenya Revenue Authority (KRA) as tax.
“It is alleged that Hon Onyonka used CDF funds to purchase sugar worth Sh137 million (137,058,429.80) from Chemilil Sugar Company. It is also alleged that Hon Onyonka did not remit VAT worth Ksh18,461,196 to KRA,” a brief statement from the KACC said.
“The Member of Parliament for Kitutu Chache is today 4th August 2011 recording a statement at Kenya Anti-Corruption Commission’s Office at Integrity Centre,” the brief signed by the commission’s Spokesman Mr Nicholas Simani said.
Mr Simani said their investigators were in the final stages of the probe and would forward the file to the Director of Public Prosecutions for advice once they were through.
He said several officials including those at the sugar company had recorded statements and provided investigators with relevant data of the transaction.
“Investigators are also in Chemilil finalising data collection, and when they conclude their investigations the file will be forwarded to the Director of Public Prosecutions with appropriate recommendations,” the commission statement said.
Several Members of Parliament are under investigation by the commission for misappropriating money allocated for CDF projects in various parts of the country.
KACC Director Patrick Lumumba recently announced that investigations into various high-profile cases, including misuse of CDF funds, were in their final stages and that prosecutions would follow soon.
|
english
|
#include<iostream>
#include<cmath>
#include<typeinfo>
#include<cstdlib>
#include<sstream>
using namespace std;
class MyDouble {
private:
double x;
public:
MyDouble(float _x) : x(_x) {}
int trunc() {
return x;
}
MyDouble operator^(MyDouble &op) {
return MyDouble(pow(x,op.x));
}
bool operator>(MyDouble &op) {
return x>op.x;
}
ostream& put(ostream &stream) const {
stream << x;
return stream;
}
};
ostream &operator<<(ostream &stream, const MyDouble &d) {
return d.put(stream);
}
class A {
private:
float w;
protected:
static int const dim = 4;
MyDouble *mat[dim][dim];
public:
A(double _w) : w(_w) {
for(int i=0; i<dim; i++)
for (int j=0; j<dim; j++)
mat[i][j] = new MyDouble((w*10)+i+j);
}
    virtual ~A() {
        // Free the MyDouble instances allocated in the constructor
        for(int i=0; i<dim; i++)
            for (int j=0; j<dim; j++)
                delete mat[i][j];
    }
virtual double f() const = 0;
virtual string classname() const {
return "A";
}
virtual ostream &put(ostream &stream) const {
stream << "class "<< classname() << ": \tw="<<w;
return stream;
}
};
ostream &operator<<(ostream &str, const A &a) {
return a.put(str);
}
class B : public A {
public:
B(double _w) : A(_w) {}
double f() const {
int somma = 0;
for(int j=0; j<dim; j++)
somma += mat[0][j]->trunc();
return somma;
}
virtual string classname() const {
return "B";
}
virtual ostream &put(ostream &stream) const {
A::put(stream);
stream << "\tf()="<<f();
return stream;
}
};
template<typename T>
class C : public A {
private:
T c;
public:
C(double _w, T _c) : A(_w), c(_c) {}
double f() const {
MyDouble a(5.0);
int contatore = 0;
for(int i=0; i<dim; i++)
for(int j=0; j<dim; j++)
if(*mat[i][j]>a)
contatore++;
return contatore;
}
MyDouble f(T p) const {
if(typeid(T)==typeid(double))
return MyDouble(p);
else
return (*mat[0][0])^(*mat[dim-1][dim-1]);
}
virtual string classname() const {
if(typeid(T)==typeid(int)) {
return "C<int>";
} else if(typeid(T)==typeid(double)){
return "C<double>";
} else
return "C<"+string(typeid(T).name())+">";
}
virtual ostream &put(ostream &stream) const {
A::put(stream);
stream << "\tc="<<c<< "\tf()="<<f()<<"\tf(3)="<<f(3);
return stream;
}
};
int main() {
const int DIM = 30;
A *vett[DIM];
double sum = 0;
MyDouble d(0);
int d_indice=0;
bool d_found = false;
srand(328832748);
for(int i=0; i<DIM; i++){
double x=rand()/(double)RAND_MAX;
rand();
switch(rand()%3) {
case 0: vett[i] = new B(x);
break;
case 1: vett[i] = new C<double>(x,rand()/(double)RAND_MAX);
break;
case 2: vett[i] = new C<int>(x,(int)(x*10));
}
cout << i << ")" << *vett[i]<<endl;
sum+=vett[i]->f();
if(!d_found && typeid(*vett[i])==typeid(C<int>)) {
d_indice = i;
d=( (C<int>*) vett[i] )->f(3);
d_found = true;
}
}
cout << "punto 1: sum="<<sum<<"\t punto 2: f(3)="<<d<<" di indice "<<d_indice<<endl;
}
|
cpp
|
I can never get enough of the Hell Divers series! I have no idea what it takes to keep a series exciting for this many books, but Smith manages to do it every time! I'm already craving the next book!!!
No comments have been added yet.
|
english
|
<reponame>gbf-labs/rh-api
#====================================#
# AUTHOR: <NAME> #
#====================================#
# pylint: disable=no-self-use
"""General Info"""
from __future__ import division
import re
from flask import request
from library.couch_database import CouchDatabase
from library.couch_queries import Queries
from library.common import Common
class GeneralInfo(Common):
"""Class for GeneralInfo"""
# INITIALIZE
def __init__(self):
"""The Constructor for GeneralInfo class"""
self._couch_db = CouchDatabase()
self.couch_query = Queries()
self.epoch_default = 26763
super(GeneralInfo, self).__init__()
def general_info(self):
"""
This API is for getting general info of device
---
tags:
- Devices
produces:
- application/json
parameters:
- name: token
in: header
description: Token
required: true
type: string
- name: userid
in: header
description: User ID
required: true
type: string
- name: vessel_id
in: query
description: Vessel ID
required: true
type: string
responses:
500:
description: Error
200:
description: Vessel Device List
"""
# INIT DATA
data = {}
# VESSEL ID
vessel_id = request.args.get('vessel_id')
# GET DATA
token = request.headers.get('token')
userid = request.headers.get('userid')
# CHECK TOKEN
token_validation = self.validate_token(token, userid)
if not token_validation:
data["alert"] = "Invalid Token"
data['status'] = 'Failed'
# RETURN ALERT
return self.return_data(data)
if not vessel_id:
data["alert"] = "No Vessel ID"
data['status'] = 'Failed'
# RETURN ALERT
return self.return_data(data)
rows = []
heading = self.get_heading(vessel_id)
if heading:
row = {}
row['option_name'] = 'Heading'
row['value'] = "{0}°".format(float(heading['heading']))
row['data_provider'] = heading['heading_source']
rows.append(row)
speed = self.get_speed(vessel_id)
if speed:
row = {}
row['option_name'] = 'Speed'
row['value'] = speed['speed']
row['data_provider'] = speed['speed_source']
rows.append(row)
failover = self.couch_query.get_complete_values(
vessel_id,
"FAILOVER"
)
if failover:
row = {}
row['option_name'] = 'Internet provider'
row['value'] = failover['FAILOVER']['General']['Description']
row['data_provider'] = 'Failover'
rows.append(row)
        parameters = self.couch_query.get_complete_values(
            vessel_id,
            "PARAMETERS"
        )
        # GUARD AGAINST MISSING PARAMETERS DATA TO AVOID A KeyError
        info = parameters['PARAMETERS']['INFO'] if parameters else {}
if info:
row = {}
row['option_name'] = 'IMO'
row['value'] = info['IMO']
row['data_provider'] = 'PARAMETERS'
rows.append(row)
if info:
row = {}
row['option_name'] = 'MMSI'
row['value'] = info['MMSI']
row['data_provider'] = 'PARAMETERS'
rows.append(row)
data['status'] = 'ok'
data['rows'] = rows
return self.return_data(data)
def get_heading(self, vessel_id):
"""Return Heading"""
all_devices = self.couch_query.get_all_devices(vessel_id)
source = 'VSAT'
devices = self.get_device(all_devices, source)
data = {}
for device in devices:
values = self.couch_query.get_complete_values(
vessel_id,
device['device']
)
if values:
# GET DEVICE NUMBER
number = device['device'].split(source)[1]
# GET DEVICE COMPLETE NAME
heading_source = self.device_complete_name(source, number)
# SET RETURN
data['heading'] = values[device['device']]['General']['Heading']
data['heading_source'] = heading_source
source = 'NMEA'
devices = self.get_device(all_devices, source)
for device in devices:
values = self.couch_query.get_complete_values(
vessel_id,
device['device']
)
if values:
# GET DEVICE NUMBER
number = device['device'].split(source)[1]
# GET DEVICE COMPLETE NAME
heading_source = self.device_complete_name(source, number)
gen = values[device['device']].keys()
if 'GP0001' in gen:
gp0001 = values[device['device']]['GP0001']
# SET RETURN
if not data:
data['heading'] = gp0001['VTG']['courseOverGroundTrueDegrees']
data['heading_source'] = heading_source
# RETURN
return data
def get_device(self, devices, pattern):
"""Return Device"""
data = []
for device in devices:
if re.findall(r'' + pattern + r'\d', device['doc']['device']):
data.append(device['doc'])
data = sorted(data, key=lambda i: i['device'])
return data
    def get_speed(self, vessel_id):
        """Return Speed"""
        all_devices = self.couch_query.get_all_devices(vessel_id)
        source = 'NMEA'
        devices = self.get_device(all_devices, source)
        data = {}
        for device in devices:
            values = self.couch_query.get_complete_values(
                vessel_id,
                device['device']
            )
            if values:
                gen = values[device['device']].keys()
                if 'GP0001' in gen:
                    gp0001 = values[device['device']]['GP0001']
                    # SET RETURN (speed only; the heading assignments that were
                    # here previously were a copy-paste leftover from get_heading)
                    speed = float(gp0001['VTG']['speedOverGroundKnots'])
                    data['speed'] = str(speed) + " knot(s)"
                    data['speed_source'] = device['device']
                    return data
        # NO NMEA DEVICE PROVIDED A SPEED
        return {}
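The device-matching helper (`get_device`) can be exercised in isolation. A minimal sketch, using hypothetical device documents, of how the `pattern + digit` regex selects device records such as "NMEA1" and orders them by name:

```python
import re

def get_device(devices, pattern):
    # Keep only docs whose device name contains the pattern followed by a
    # digit (e.g. "NMEA1", "VSAT2"), then sort by device name - this mirrors
    # GeneralInfo.get_device above.
    data = [d['doc'] for d in devices
            if re.findall(pattern + r'\d', d['doc']['device'])]
    return sorted(data, key=lambda i: i['device'])

# Hypothetical CouchDB rows as returned by get_all_devices
devices = [
    {'doc': {'device': 'NMEA2'}},
    {'doc': {'device': 'VSAT1'}},
    {'doc': {'device': 'NMEA1'}},
    {'doc': {'device': 'FAILOVER'}},
]
print(get_device(devices, 'NMEA'))  # → [{'device': 'NMEA1'}, {'device': 'NMEA2'}]
```

Note that a bare name such as "FAILOVER" is never matched, because the pattern requires a trailing device number.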
|
python
|
<filename>src/test/java/com/perosa/bello/server/DispatchLogicTest.java
package com.perosa.bello.server;
import com.perosa.bello.core.balancer.Balancer;
import io.undertow.server.HttpHandler;
import io.undertow.server.HttpServerExchange;
import io.undertow.util.HeaderMap;
import io.undertow.util.HttpString;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;
import java.util.Map;
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.mockito.Mockito.*;
@ExtendWith(MockitoExtension.class)
class DispatchLogicTest {

    @Mock
    HttpServerExchange exchange;

    @Mock
    Balancer balancer;

    @Test
    void dispatch() {
        when(exchange.getRequestScheme()).thenReturn("https");
        when(exchange.getRequestPath()).thenReturn("/webhook");
        when(exchange.getQueryString()).thenReturn("user=perosa");

        new DispatchLogic(balancer).dispatch(exchange);

        verify(exchange, times(1)).dispatch(isA(HttpHandler.class));
        verify(balancer, times(1)).findTarget(isA(InRequest.class));
    }

    @Test
    void buildUrl() {
        when(exchange.getRequestScheme()).thenReturn("https");
        when(exchange.getRequestPath()).thenReturn("/webhook");
        when(exchange.getQueryString()).thenReturn("user=perosa");

        assertEquals("https://localhost/webhook?user=perosa",
                new DispatchLogic(balancer).buildUrl(exchange, "localhost"));
    }

    @Test
    void buildUrlWithoutQueryString() {
        when(exchange.getRequestScheme()).thenReturn("https");
        when(exchange.getRequestPath()).thenReturn("/webhook");

        assertEquals("https://localhost/webhook",
                new DispatchLogic(balancer).buildUrl(exchange, "localhost"));
    }

    @Test
    void getRequest() {
        when(exchange.getHostName()).thenReturn("localhost");

        InRequest request = new DispatchLogic(balancer).getRequest(exchange);

        assertNotNull(request);
        assertEquals("localhost", request.getHost());
        assertEquals(null, request.getPayload());
    }

    @Test
    void extractHeaders() {
        HeaderMap headerMap = new HeaderMap();
        headerMap.add(new HttpString("host"), "localhost");
        headerMap.add(new HttpString("authorization"), "bldfsdf$%45");
        when(exchange.getRequestHeaders()).thenReturn(headerMap);

        Map<String, String> map = new DispatchLogic(balancer).extractHeaders(exchange);

        assertNotNull(map);
        assertEquals(2, map.size());
    }
}
|
java
|
[{"liveId":"5c911ebf0cf2927bfdb8dfd2","title":"武晓迪的直播间","subTitle":"我武苗苗又回来了","picPath":"/mediasource/live/1552316713701QSmhJ8Z6VR.png","startTime":1553014463006,"memberId":874717,"liveType":1,"picLoopTime":0,"lrcPath":"/mediasource/live/lrc/5c911ebf0cf2927bfdb8dfd2.lrc","streamPath":"http://liveplaylk.lvb.eastmoney.com/live/2519_3721219.flv?txSecret=8f9ffebd13e80f732621a08f582a3bce&txTime=5C927042","screenMode":0,"roomId":"63374977","bwqaVersion":0},{"liveId":"5c911a7f0cf2f3b1e81af436","title":"谢艾琳的直播间","subTitle":"随便播播","picPath":"/mediasource/live/1551966416612KZbF4eku32.jpg","startTime":1553013375529,"memberId":530447,"liveType":1,"picLoopTime":0,"lrcPath":"/mediasource/live/lrc/5c911a7f0cf2f3b1e81af436.lrc","streamPath":"http://liveplaylk.lvb.eastmoney.com/live/2519_3721211.flv?txSecret=ceadd82e7b84a3eea854d143cc0461bd&txTime=5C926C06","screenMode":0,"roomId":"8761292","bwqaVersion":0},{"liveId":"5c9106030cf20f4b9b9363cd","title":"熊素君的电台","subTitle":"夜黑风高…寂寞如雪十分钟","picPath":"/mediasource/live/1550929303415MGux8l021r.jpg,/mediasource/live/15530081312164sU5H53EmF.jpg","startTime":1553008131415,"memberId":63573,"liveType":2,"picLoopTime":30000,"lrcPath":"/mediasource/live/lrc/5c9106030cf20f4b9b9363cd.lrc","streamPath":"http://livepull.48.cn/pocket48/bej48_xiong_sujun_epygh.flv","screenMode":0,"roomId":"3871120","bwqaVersion":0}]
|
json
|
The Intercontinental Championship is a creation of kayfabe in its purest form. After WWE officially withdrew from the territorial system and became a worldwide phenomenon, it employed a massive roster of top Superstars.
Obviously, not all of these famous wrestlers could be WWE World Champion, so a midcard belt was needed. Since WWE's rival, the NWA, had the United States Championship as its midcard title, WWE decided to go one better and create the Intercontinental Championship, signifying the best wrestler in both North and South America.
The reason the Intercontinental championship is mired in kayfabe is due to how the first champion was crowned. Pat Patterson was the first champion, but he didn't pin a single person to get the belt.
The WWE forwent the usual tournament format for crowning a newly minted title's first holder, and simply pretended that one had taken place.
After Patterson, a virtual who's who of wrestling legends has held the Intercontinental Championship. Macho Man Randy Savage, The Rock and Chris Jericho are just a few of the names to have held the prestigious title.
Over the course of its long and storied history, there have naturally been some fantastic bouts contested for it. Here are some of the finest, in chronological order.
Disclaimer: The opinions expressed in the article belong to the writer and don't necessarily represent Sportskeeda's stand.
#1 Ricky the Dragon Steamboat vs. Macho Man Randy Savage (C)
The Angle: When people talk about the Intercontinental Championship, it's a good bet that this match will come up, for very good reasons.
The build-up to the culminating title bout was nothing short of brilliant, and included wrestling legend George the Animal Steele. Steele had been making advances, in his own innocent, childlike way, towards Miss Elizabeth, the manager (and real-life wife) of Macho Man Randy Savage.
Fellow babyface Ricky Steamboat became involved, trying to help his friend Steele, and for his troubles had his larynx 'crushed' (kayfabe) when Macho Man leapt off the top rope and landed on him with the ringbell.
After Steamboat 'recovered' from the assault, the match was booked for the IC title at WrestleMania 3.
Why it's considered to be great: There's no question that Andre vs. Hogan was the match that got everyone out of their seats, but this bout completely stole the show. Savage and Steamboat meticulously planned their match, which turned out to be everything a pro wrestling bout should be. Drama, technical acumen and thrilling dives from the top rope all conspired with the stellar angle to make this perhaps the most beloved IC title match of all time.
#2 Bret Hart vs. Mr. Perfect (C)
The Angle: In 1991, angles were built differently. The main thrust of the story was that Bret Hart and Curt Hennig both claimed to be the best technical wrestler in the world, with Bret viewed as the underdog.
This was the start of Bret's main event build up, and most of the time that journey starts with an Intercontinental championship reign. Mr. Perfect was the perfect 'gatekeeper' for Hart to defeat and continue his climb to the top.
Why it's considered to be great: Simply put, this match is a masterwork of technical wrestling with plenty of drama thrown in for good measure. At one point, a frustrated Mr. Perfect tried to run away from the ring (because he couldn't lose the title on a count out) but Bret Hart dragged him back so fiercely that Perfect's singlet was torn.
Hart and Hennig were at the top of their game here, in their respective primes and it shows in this cornerstone of IC title history.
#3 The British Bulldog vs. Bret Hart (C)
The Place: SummerSlam 1992, live from Wembley Stadium.
The Angle: Bret Hart had won the IC title at the previous year's SummerSlam, but it was time for him to begin to ascend to the main event picture. Bulldog was a huge babyface in this era, making this one of the rare face vs. face match ups in IC title history.
Despite Bulldog being a hometown hero, the gist of the story line was that he was an underdog, no pun intended. Bret Hart reportedly advised Vince McMahon to let the match go on last, unusual for a midcard title bout, and McMahon agreed, knowing that the home town of Bulldog would pop huge for the expected title change.
Why it's considered to be great: Many people criticized Bulldog over the years for being a big stiff, but this match proves that he was, in fact, an above average worker elevated to greatness by a top of his game Bret Hart.
When Bulldog reversed a sunset flip attempt and got the pin, the resulting crowd pop is one of the loudest of all time in WWE history.
#4 Razor Ramon vs. Shawn Michaels
The Place: WrestleMania X, live from Madison Square Garden, New York, New York.
The Angle: IC strap holder Shawn Michaels was 'suspended' by the WWE (in reality he was nursing an injury) and Razor Ramon won the title in the interim. When HBK came back, he claimed to be the 'real' IC champion because he never lost the belt.
Obviously there could not be two IC champions, so the stage was set to determine who would be the undisputed champion. Both title belts were suspended above the ring, and the first man to scale a ladder and retrieve them would be recognized as the one, true IC champion.
Why it's considered to be great: For one thing, this was the first ever ladder match in WWE history. Razor and HBK put on an innovative, never before seen match up with sheer brutality and death defying falls as its hallmark. The finish is spectacular, as Razor toppled the ladder when HBK was inches away from winning. Michaels got tangled up in the ropes after racking his crotch on them, and Razor easily scaled the ladder and won the bout.
#5 Triple H vs. The Rock (C)
The Place: SummerSlam 1998, live from Madison Square Garden, New York, New York.
The Angle: What is it about SummerSlam and great IC title matches? They just seem to go together.
The build up to this match saw the ostensibly heel Rock attack Triple H with his IC title belt, damaging his knee (kayfabe) and hopefully making it impossible for Hunter to climb the ladder and get the belt.
This was the start of Triple H's build up toward the main event picture, and he was supposed to be the good guy, but there were a lot of Rock fans in the audience cheering for him as well.
Why it's considered to be great: Let's not beat around the bush: neither Rock nor Triple H is a technical wrestling master. But what they both excel at is psychology and storytelling, and that's evident here. The two men traded brutal bumps with the ladder, beat the heck out of each other with chairs, and took nasty spills to the hard mat.
The finish involved the dearly departed Chyna delivering her patented low blow to Rock, allowing Triple H to climb the ladder and win the bout.
#6 Randy Orton (C) vs. Cactus Jack
The Place: Backlash, live from Alberta, Canada.
The Angle: During this era, the young rookie Randy Orton was a member of Evolution and went by the moniker "Legend Killer." He tormented Mick Foley for months until he inadvertently brought out the most dangerous of Foley's personas, Cactus Jack.
The build up was nothing short of brilliant, and the match helped get Orton over as more than just another pretty face.
Why it's considered to be great: Let's be honest; when Cactus Jack enters a ring, you expect him to take the brunt of the hardcore offense. But Randy Orton took just as many nasty bumps as Foley, including being slammed onto a pile of thumb tacks after Cactus Jack countered the RKO.
Both men were bloody and beaten senseless, but Orton managed to eke out a win with an RKO to a barbed wire bat. One of the best hardcore matches of all time in terms of psychology and sheer brutality.
Who knows what future generations will say about great Intercontinental title matches? Could we see names like AJ Styles and Ricochet on a similar list in the future? It very well could be.
Thanks for taking this trip down memory lane with us, and as always thanks for reading!
|
english
|
It looks like Facebook wants to make your News Feed a lot more newsy, if recently leaked screenshots are to be believed: some US users have started seeing tabs for news categories like Sports and World News appear inside the official mobile apps.
Images posted by Tom Critchlow (@tomcritchlow) on Twitter were followed up by an official Facebook statement to Mashable: "People have told us they'd like options to see more stories on Facebook around specific topics they're interested in, so we have been testing a few feeds for people to view more and different stories from people and Pages based on topic areas," said a spokesperson.
It's worth remembering that Facebook trials new features and tweaks all the time, and this may not roll out to everyone if the early testers don't like it. Even so, it's a sign of how Mark Zuckerberg's social network juggernaut is adapting to changing user habits.
While Facebook was awash with personal news and posts in the early days, those kinds of intimate, diary-style posts have disappeared in recent years. People are now more likely to share personal stuff on apps like Instagram and Snapchat.
If that's the case then breaking news might be the way for Facebook to keep us all opening up the app on our phones - and of course that's an area where Twitter has traditionally excelled. Both platforms have a long history of borrowing features from each other, and this could be the latest shift.
According to early testers, the original News Feed is still available, and you will be able to choose which news categories you see inside the app. We'll see if people find the feature compelling enough for Facebook to launch it fully.
Dave is a freelance tech journalist who has been writing about gadgets, apps and the web for more than two decades. Based out of Stockport, England, on TechRadar you'll find him covering news, features and reviews, particularly for phones, tablets and wearables. Working to ensure our breaking news coverage is the best in the business over weekends, David also has bylines at Gizmodo, T3, PopSci and a few other places besides, as well as having spent many years editing the likes of PC Explorer and The Hardware Handbook.
|
english
|
import { makeReduxDuck } from "teedux";

// The payload fields were typed as the literal `null`, which made the setters
// impossible to call with real values; widened to `string | null` / `number | null`.
interface IState {
  source: string | null;
  currentAction: string | null;
  isRetrying: boolean;
  isError: boolean;
  errorStatusCode: number | null;
  isSuccess: boolean;
}

const initialState: IState = {
  source: null,
  currentAction: null,
  isRetrying: false,
  isError: false,
  errorStatusCode: null,
  isSuccess: false
};

const duck = makeReduxDuck<IState>("currentMeetingActions", initialState);

const setActionSource = duck.defineAction<{ source: string | null }>(
  "SET_ACTION_SOURCE",
  (_, { source }) => ({
    source
  })
);
export const $setActionSource = (source: string | null) => setActionSource({ source });

const startAction = duck.defineAction<{ currentAction: string | null }>(
  "START_ACTION",
  (_, { currentAction }) => ({
    currentAction,
    isError: false
  })
);
export const $startAction = (currentAction: string | null) => startAction({ currentAction });

const setActionIsRetrying = duck.definePayloadlessAction("SET_ACTION_IS_RETRYING", () => ({
  isRetrying: true
}));
export const $setActionIsRetrying = setActionIsRetrying;

const setActionError = duck.defineAction<{ errorStatusCode: number | null }>(
  "SET_ACTION_ERROR",
  (_, { errorStatusCode }) => ({
    errorStatusCode,
    isError: true,
    isRetrying: false
  })
);
export const $setActionError = (errorStatusCode: number | null) => setActionError({ errorStatusCode });

const setActionSuccess = duck.definePayloadlessAction("SET_ACTION_SUCCESS", () => ({
  isSuccess: true,
  isRetrying: false
}));
export const $setActionSuccess = setActionSuccess;

export const endAction = duck.definePayloadlessAction("END_ACTION", () => initialState);

export default duck.getReducer();
|
typescript
|
{
"id": 1636,
"source": "barnes",
"verse_id": 18543,
"verse_count": 1,
"reference": "44:9",
"title": "",
"html": " <p> <em>They that make a graven image <\/em>—A graven image is one that is cut, or sculptured out of wood or stone, in contradistinction from one that is molten, which is made by being cast. Here it is used to denote an image, or an idol-god in general. God had asserted in the previous verses his own divinity, and he now proceeds to show, at length, the vanity of idols, and of idol-worship. This same topic was introduced in <a class=\"isa\" verses=\"eyIxODQzOSI6M30=\">Isaiah 40:18-20<\/a>, (see the notes at that passage), but it is here pursued at greater length, and in a tone and manner far more sarcastic and severe. Perhaps the prophet had two immediate objects in view; first, to reprove the idolatrous spirit in his own time, which prevailed especially in the early part of the reign of Manasseh; and secondly, to show to the exile Jews in Babylon that the gods of the Babylonians could not protect their city, and that Yahweh could rescue his own people. He begins, therefore, by saying, that the makers of the idols were all of them vanity. Of course, the idols themselves could have no more power than their makers, and must be vanity also. <\/p> <p> <em>Are all of them vanity <\/em>—(See the note at <a class=\"isa\" verses=\"WzE4NDgxXQ==\">Isaiah 41:29<\/a>). <\/p> <p> <em>And their delectable things <\/em>—Margin, ‘Desirable.’ The sense is, their valued works, their idol-gods, on which they have lavished so much expense, and which they prize so highly. <\/p> <p> <em>Shall not profit <\/em>—Shall not be able to aid or protect them; shall be of no advantage to them (see <a class=\"ref\">Habakkuk 2:18<\/a>). <\/p> <p> <em>And they are their own witnesses <\/em>—They can foretell nothing; they can furnish no aid; they cannot defend in times of danger. This may refer either to the worshippers, or to the idols themselves—and was alike true of both. <\/p> <p> <em>They see not <\/em>—They have no power of discerning anything. 
How can they then foresee future events? <\/p> <p> <em>That they may be ashamed <\/em>—The same sentiment is repeated in <a class=\"isa\" verses=\"WzE4NTQ1XQ==\">Isaiah 44:11<\/a>, and in <a class=\"isa\" verses=\"WzE4NTc4XQ==\">Isaiah 45:16<\/a>. The sense is, that shame and confusion must await all who put their trust in an idol-god. <\/p>",
"audit": 1
}
|
json
|
<filename>resources/modinfo.json
{
    "type": "code",
    "modid": "discordrelay",
    "name": "Discord Relay",
    "description": "Bridges Vintage Story and a Discord channel",
    "authors": ["Derpius"],
    "version": "0.1.0",
    "dependencies": {
        "game": "1.16.4"
    },
    "website": "https://github.com/Derpius/VintageRelay",
    "side": "Server"
}
|
json
|
Walsh also added that the team is keen to bounce back in the Test series and will play their best eleven in the opener.
Former West Indies pace great and Bangladesh bowling coach Courtney Walsh reckons left-arm fast bowler Mustafizur Rahman might not be able to feature in all three Tests against New Zealand which begin on February 28.
The tourists, who were handed a 3-0 whitewash in the ODIs, are battling many injuries as they seek a consolation victory. Experienced batsman Mushfiqur Rahim is grappling with the recurrence of an earlier rib injury and will be under observation.
“In the case of Mustafizur, we have to make sure he is fine and try to get him fresh. So, the question will be how many Tests the selectors want him to play here and once we know that we can monitor his workload,” Walsh was quoted as saying by BdCrictime.com on Tuesday.
However, Walsh feels that Mustafizur, being the strike bowler, will carry a bit of a workload.
“Obviously, we are here to win the Test series, so we have to look for the best team in the first Test and I am positive about it. So, he is probably going to carry a bit of workload and he knows that,” Walsh said.
“I have got to make sure he doesn’t over-bowl in practice or in practice matches. That’s my job and responsibility to try keep him fresh.
“If it’s up to me, yes, Rubel Hossain could play some more games because unfortunately, he didn’t play a lot of games here. He has done well in the Bangladesh Premier League (BPL) and most games for Bangladesh he has played. So, I think it’s important for him to play up until the ICC World Cup 2019 and to play as many matches as he can once he is been looked after,” he added.
|
english
|
import fetchMock, { MockRequest, MockResponseFunction } from 'fetch-mock';
import { stillingApi, stillingssøkProxy } from '../api/api';
import StandardsøkDto from '../søk/standardsøk/Standardsøk';
import standardsøk from './mock-data/standardsøk';
import { resultat } from './mock-data/stillingssøk';
const adsUrl = `${stillingssøkProxy}/stilling/_search`;
const standardsøkUrl = `${stillingApi}/standardsok`;
const logg = (response: any): MockResponseFunction => (url, opts) => {
    console.info(`Mock ${opts.method} mot ${url}`, { body: opts.body, response });
    return response;
};

const putStandardsøk = (url: string, options: MockRequest): StandardsøkDto => {
    const nyttSøk = JSON.parse(options.body as string) as StandardsøkDto;
    return {
        ...standardsøk,
        søk: nyttSøk.søk,
    };
};

fetchMock.config.fallbackToNetwork = true;

fetchMock
    .post(adsUrl, logg(resultat))
    .get(standardsøkUrl, logg(standardsøk))
    // The PUT handler previously returned the logging function itself instead of
    // invoking it, so the mock answered with a function rather than the updated søk.
    .put(standardsøkUrl, (url, opts) => logg(putStandardsøk(url, opts))(url, opts));
|
typescript
|
<filename>e2e/cypress/integration/bot_accounts/promote_demote_spec.js
// Copyright (c) 2015-present Mattermost, Inc. All Rights Reserved.
// See LICENSE.txt for license information.
// ***************************************************************
// - [#] indicates a test step (e.g. # Go to a page)
// - [*] indicates an assertion (e.g. * Check the title)
// - Use element ID when selecting an element. Create one if none.
// ***************************************************************
// Stage: @prod
// Group: @bot_accounts
import {createBotPatch} from '../../support/api/bots';
describe('Managing bots in Teams and Channels', () => {
    let team;

    before(() => {
        cy.apiUpdateConfig({
            ServiceSettings: {
                EnableBotAccountCreation: true,
            },
            TeamSettings: {
                RestrictCreationToDomains: 'sample.mattermost.com',
            },
        });

        cy.apiInitSetup({loginAfter: true}).then((out) => {
            team = out.team;
        });
    });

    it('MM-T1819 Promote a BOT to team admin', () => {
        cy.makeClient().then(async (client) => {
            // # Go to channel
            const channel = await client.getChannelByName(team.id, 'off-topic');
            cy.visit(`/${team.name}/channels/${channel.name}`);

            // # Add bot to team
            const bot = await client.createBot(createBotPatch());
            await client.addToTeam(team.id, bot.user_id);

            // # Open team menu and click 'Manage Members'
            cy.uiOpenTeamMenu('Manage Members');

            // # Find bot
            cy.get('.more-modal__list').find('.more-modal__row').its('length').should('be.gt', 0);
            cy.get('#searchUsersInput').type(`${bot.username}`);

            // # Wait for loading screen
            cy.get('#teamMembersModal .loading-screen').should('be.visible');

            // # Find bot member dropdown
            cy.get(`#teamMembersDropdown_${bot.username}`).as('memberDropdown').should('contain.text', 'Member').click();

            // # Promote bot to team admin
            cy.findByTestId('userListItemActions').find('button').contains('Make Team Admin').click();

            // * Verify bot was promoted
            cy.get('@memberDropdown').should('contain.text', 'Team Admin');
        });
    });
});
|
javascript
|
Russia-Ukraine war: What is concerning RBI, SBI, other banks in India?
The ongoing war between Russia and Ukraine has dented investors' sentiments on the stock market in India.
By India Today Web Desk: The ongoing war between Russia and Ukraine has ripple effects on crucial aspects of the global economy - stock markets around the world and international oil prices. Now banks in India have started getting concerned as the conflict escalates between Moscow and Kyiv.
The State Bank of India (SBI) has stopped processing transactions of Russian entities that have been sanctioned by the West over Moscow's invasion of Ukraine. To effect this, SBI has issued a circular as it fears that any transaction with entities or sectors under sanction will invite sanction on it as well, according to a PTI report.
WHAT DOES THIS MEAN?
No transactions involving entities, banks, ports or vessels appearing on a US, European Union or United Nations sanctions list would be processed irrespective of the currency of the transaction.
Payments due to such entities have to be processed through other mechanisms rather than through the banking channel.
SBI operates a joint venture in Moscow called Commercial Indo Bank Llc, where Canara Bank is another partner with 40 per cent stake, PTI reported.
Banks in India were left scrambling after bills for imports from Russia started bouncing and payments for exports got stuck in the wake of sanctions imposed by the West on Russia following its invasion of Ukraine, Reuters reported.
The Reserve Bank of India (RBI) met with select bankers and is trying to assess the exposure that lenders have to Russia and to Ukraine and the impact it may have on Indian banks, said another senior banking executive.
The extent of the payments that are stalled is not clear at the moment and the regulator is trying to assess it, the Reuters report stated.
Meanwhile, banks have started contacting corporate clients to understand their exposure to Russia and to Ukraine to see if their business could be significantly impacted, resulting in delayed loan re-payments for banks, according to Reuters.
- Russia is one of the biggest suppliers of defence products and equipment to India, mostly under government-to-government contracts.
- Bilateral trade between India and Russia stood at USD 9.4 billion so far this fiscal year, against USD 8.1 billion in 2020-21.
- India’s main imports from Russia include fuels, mineral oils, pearls, precious or semi-precious stones, nuclear reactors, boilers, machinery and mechanical appliances; electrical machinery and equipment and fertilisers.
- Major export items from India to Russia include pharmaceutical products, electrical machinery and equipment, organic chemicals and vehicles.
Last week, the Group of Seven (G-7) major economies imposed punitive sanctions against the Russian central bank.
They also decided to remove Russian banks from the SWIFT interbank messaging system, a move intended to isolate Russia from global trade.
India has so far maintained a neutral stance on Russia's attack on Ukraine, asking both countries to resolve the issue diplomatically.
|
english
|
Direct Ishq is a masala rom-com, full of romance, comedy and action, with some beautiful songs.
Story in detail:
In the holy city of Banaras lives a girl named Dolly Pandey, who looks like a doll: naughty but full of life. She is wise and very bold, and doesn't allow any boy to come near her. Her dream is to become a big singer and make her father and the whole city proud of her. But destiny is not that simple.
She meets two boys, Vicky Shukla and Kabeer, who change her life. Vicky Shukla is a typical boy from Banaras and the president of 'Kaashi Vidyapeeth'. He is a rough, tough and bold young man who always keeps a revolver with him, ever ready to fight his enemies. At the same time, he is just the opposite when it comes to dealing with girls, around whom he becomes extremely shy. One day, he meets Dolly Pandey and falls in love with her.
On the other hand, Kabeer is a smart, good-looking guy born into a rich family of Banaras but educated in Mumbai, where he does music shows and events. His grandmother wants him to marry his friend's daughter, but he refuses. One day, he too meets Dolly Pandey and falls in love with her, and helps her build her singing career.
Who is the lucky man in Dolly's life? Is it Kabeer or Vicky?
|
english
|
package br.com.codevelopment.microservices.core.docs;
import org.springframework.context.annotation.Configuration;
import br.com.codevelopment.microservices.common.docs.BaseSwaggerConfig;
import springfox.documentation.swagger2.annotations.EnableSwagger2;
@EnableSwagger2
@Configuration
public class SwaggerCoreConfig extends BaseSwaggerConfig {

    public SwaggerCoreConfig() {
        super("br.com.codevelopment.microservices.core.controller");
    }
}
|
java
|
{
    "ConnectionStrings": {
        "NWindConnectionString": "XpoProvider=SQLite;Data Source=|DataDirectory|Data/nwind.db",
        //"HomesConnectionString": "XpoProvider=SQLite;Data Source=|DataDirectory|Data/homes.db",
        //"ContactsConnectionString": "XpoProvider=SQLite;Data Source=|DataDirectory|Data/Contacts.db",
        "DevAvConnectionString": "XpoProvider=SQLite;Data Source=|DataDirectory|Data/devav.sqlite3",
        //"CountriesConnectionString": "XpoProvider=SQLite; Data Source=|DataDirectory|Data/Countries.db"
    }
}
|
json
|
package org.apereo.cas.config.support.authentication;
import org.apereo.cas.authentication.AuthenticationEventExecutionPlan;
import org.apereo.cas.authentication.AuthenticationEventExecutionPlanConfigurer;
import org.apereo.cas.authentication.AuthenticationHandler;
import org.apereo.cas.configuration.CasConfigurationProperties;
import org.apereo.cas.util.AsciiArtUtils;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Configuration;
import ee.ria.sso.authentication.principal.TaraPrincipalResolver;
/**
*
* @author <NAME>: <EMAIL>
* @since 5.1.4
*/
@Configuration("acceptUsersAuthenticationEventExecutionPlanConfiguration")
@EnableConfigurationProperties(CasConfigurationProperties.class)
public class AcceptUsersAuthenticationEventExecutionPlanConfiguration implements AuthenticationEventExecutionPlanConfigurer {

    private static final Logger LOGGER = LoggerFactory.getLogger(AcceptUsersAuthenticationEventExecutionPlanConfiguration.class);

    @Autowired
    private TaraPrincipalResolver taraPrincipalResolver;

    @Autowired
    @Qualifier("taraAuthenticationHandler")
    private AuthenticationHandler taraAuthenticationHandler;

    @Override
    public void configureAuthenticationExecutionPlan(final AuthenticationEventExecutionPlan plan) {
        // Multi-line banner printed as a warning at startup.
        final String header = "\n"
                + "-----------------------------------------------------------------------\n"
                + "Authentication Execution Plan of RIIGI INFOSÜSTEEMI AMET has been loaded\n"
                + "-----------------------------------------------------------------------";
        AsciiArtUtils.printAsciiArtWarning(LOGGER, "RIIGI INFOSÜSTEEMI AMET", header);
        plan.registerAuthenticationHandlerWithPrincipalResolver(this.taraAuthenticationHandler, this.taraPrincipalResolver);
    }
}
|
java
|
<gh_stars>0
{
    "_guid_": "F-047_Test_Data_Base",
    "productName": "CCD Data Store",
    "operationName": "Get case ids",
    "method": "GET",
    "uri": "/caseworkers/{uid}/jurisdictions/{jid}/case-types/{ctid}/cases/ids",
    "user": {
        "_extends_": "Common_User_For_Request"
    },
    "request": {
        "_extends_": "Common_Request",
        "pathVariables": {
            "uid": "[[DEFAULT_AUTO_VALUE]]",
            "jid": "AUTOTEST1",
            "ctid": "AAT"
        },
        "queryParams": {
            "userId": "F047UserIdTest"
        }
    },
    "expectedResponse": {
        "headers": {
            "_extends_": "Common_Response_Headers",
            "Content-Length": "[[ANYTHING_PRESENT]]",
            "Content-Type": "[[ANYTHING_PRESENT]]",
            "Content-Encoding": "[[ANYTHING_PRESENT]]"
        }
    }
}
|
json
|
export {default} from "./InformeEquipos";
|
javascript
|
Ayushmann Khurrana is a proud father, as indicated by his most recent Instagram post. On Sunday, the actor and his wife wished their 10-year-old son Virajveer a happy birthday by sharing adorable posts for the little munchkin.
Taking to his Instagram handle, Ayushmann shared a series of pictures of his son indulging in different activities and captioned the post: “Happy birthday my football lover, goofball, nature lover, Lennon lover, guitar strummer.” Looks like Virajveer has inherited many skills, just like his dad.
Meanwhile, Virajveer’s mother, Tahira Kashyap, shared a slew of photos of their “nikka musician”, including a video of him strumming a guitar. She wrote in the caption, “My nikka musician. So much for you to learn and so much to learn from you! Happy birthday my love.” Fans, friends, and family members poured their hearts out in the comments section.
Uncle Aparshakti Khurrana sent a string of heart emoticons to his nephew, while filmmaker Abhishek Kapoor wrote, “Happy birthday Virajveer…have a great year and may u dance to our own tunes and have the world dance with u.”
Ayushmann and Tahira married in November 2008. The power couple frequently share photos of themselves with their children: a son, Virajveer, and a daughter, Varushka Khurrana.
On the professional front, Ayushmann was most recently seen in the Abhishek Kapoor-directed film Chandigarh Kare Aashiqui, alongside Vaani Kapoor.
|
english
|
import os
from datetime import datetime
from PIL import Image, ImageDraw, ImageFont
from pySmartDL import SmartDL
from telethon.tl import functions
from uniborg.util import admin_cmd
import asyncio
import shutil
import random
FONT_FILE_TO_USE = "/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf"
# Add telegraph media links of profile pics that are to be used
TELEGRAPH_MEDIA_LINKS = ["https://telegra.ph/file/2bc2e85fb6b256efc4088.jpg",
                         "https://telegra.ph/file/443fff8a7db51d390e1a7.jpg",
                         "https://telegra.ph/file/e49bbb9e21383f8231d85.jpg",
                         "https://telegra.ph/file/d6875213197a9d93ff181.jpg",
                         "https://telegra.ph/file/ec7da24872002e75e6af8.jpg",
                         "https://telegra.ph/file/468a2af386d10cd45df8f.jpg",
                         "https://telegra.ph/file/59c7ce59289d80f1fe830.jpg"
                         ]
@borg.on(admin_cmd(pattern="thordp ?(.*)"))
async def autopic(event):
    while True:
        # Pick a random picture and download it; a blocking download keeps the flow simple
        AUTOPP = random.choice(TELEGRAPH_MEDIA_LINKS)
        downloaded_file_name = "./ravana/original_pic.png"
        downloader = SmartDL(AUTOPP, downloaded_file_name, progress_bar=False)
        downloader.start(blocking=True)
        photo = "photo_pfp.png"
        shutil.copy(downloaded_file_name, photo)
        # Stamp the current time and date onto the picture
        current_time = datetime.now().strftime("@MrSemmy \n \nTime: %H:%M:%S \nDate: %d/%m/%y")
        img = Image.open(photo)
        drawn_text = ImageDraw.Draw(img)
        fnt = ImageFont.truetype(FONT_FILE_TO_USE, 30)
        drawn_text.text((30, 50), current_time, font=fnt, fill=(102, 209, 52))
        img.save(photo)
        file = await event.client.upload_file(photo)  # pylint:disable=E0602
        try:
            # Delete the current profile photo, upload the stamped one,
            # then wait ten minutes before the next update
            await event.client(functions.photos.DeletePhotosRequest(
                await event.client.get_profile_photos("me", limit=1)))
            await event.client(functions.photos.UploadProfilePhotoRequest(  # pylint:disable=E0602
                file
            ))
            os.remove(photo)
            await asyncio.sleep(600)
        except Exception:
            return
|
python
|
{
"text": "My two cents on changing Facebook to keep its power manageable. 1. Its social graph should be a public resource. Spin it out. The APIs should be open like the web APIs are. 2. Facebook should peer with the open web, so it isn't a silo.",
"created": "Tue, 01 Oct 2019 14:00:47 GMT",
"type": "outline"
}
|
json
|
http://data.doremus.org/expression/b4561be6-222a-3bd4-869b-5d086c9e7a65
http://data.doremus.org/expression/a4670b80-f536-317a-9fa9-79f0a30e046b
http://data.doremus.org/expression/476e9196-a8aa-3113-a8f4-6a1382df1c80
http://data.doremus.org/expression/76f2fae3-b5d1-31f0-91b5-9ae4c2715aaa
http://data.doremus.org/expression/4836fe9d-ba38-35ec-8970-373ee76421e6
http://data.doremus.org/expression/f442ad3e-bce8-384c-b14c-463a6e87e68c
http://data.doremus.org/expression/4d7d6926-490f-3c97-bd92-8ae158fd1549
http://data.doremus.org/expression/10bf5d63-9c6f-3ac2-82c2-d242a5cee49b
http://data.doremus.org/expression/bb58630b-77d6-3771-ad1d-4c20f87a42d8
http://data.doremus.org/expression/0e56f10d-f5a7-3e8d-a423-190aeecdda2e
http://data.doremus.org/expression/a325cb6c-251a-3ca8-a98d-330f7bc55c8f
|
json
|
- SpousesGrover Asmus(August 30, 1974 - January 14, 1986) (her death)Tony Owen(June 15, 1945 - 1971) (divorced, 4 children)William Tuttle(January 30, 1943 - January 8, 1945) (divorced)
- In the scene from It's a Wonderful Life (1946) where she and James Stewart throw rocks at the old Granville house, director Frank Capra had originally planned to use a double in Donna's place to throw the rock. Miss Reed, however, was an accomplished baseball player in high school and threw very well, as evidenced by her toss in the movie.
- Had a close relationship with her TV daughter, Shelley Fabares. Was considered by Fabares as her second mother until Reed's death in 1986.
- Learned of her firing from Dallas (1978) from a reporter while on vacation in Paris. She was in the process of suing the show's producers before her death in January 1986.
- Although her image was generally associated with that of the squeaky-clean, conservative 1950s housewife and mother, she won her Oscar for From Here to Eternity (1953) for playing a prostitute.
- Her last husband Grover Asmus started a program called the Donna Reed Foundation that led to the Donna Reed festival held yearly in Denison, IA. It's a celebration of Donna, and includes classes and performances. Many stars attend such as Shelley Fabares, Debbie Reynolds, and Loren Janes.
- Forty pictures I was in, and all I remember is 'What kind of bra will you be wearing today, honey?' That was always the area of big decision - from the neck to the navel.
- I've been involved with blood donation since the 1980s because there is a critical need.
- If nuclear power plants are safe, let the commercial insurance industry insure them. Until these most expert judges of risk are willing to gamble with their money, I'm not willing to gamble with the health and safety of my family.
- What we look for in the school is unrealized potential.
- I hope more people decide to become organ donors.
|
english
|
<gh_stars>0
import React from "react";
import styled from "styled-components";
import "font-awesome/css/font-awesome.min.css";
import { Link } from "react-router-dom";
import {
ContainerFooter,
Section,
Titles,
P,
ImageContainer,
A,
Image,
H3,
Right,
} from "./style";
const index = () => {
return (
<>
<ContainerFooter>
<Section>
<Titles>Sobre Nosotros</Titles>
<P>
Curabitur non nulla sit amet nisl tempus convallis quis ac
lectus. Lacinia eget consectetur sed, convallis at tellus. Nulla
porttitor accumsan tincidunt.
</P>
</Section>
<Section>
<Titles>Contacto</Titles>
<ImageContainer>
<A href="/">
<Image className="fa fa-facebook"></Image>
</A>
<A href="/">
<Image className="fa fa-google"></Image>
</A>
<A href="/">
<Image className="fa fa-instagram"></Image>
</A>
<A href="/">
<Image className="fa fa-linkedin"></Image>
</A>
</ImageContainer>
</Section>
<Section>
<Titles>Mantente en contacto</Titles>
<H3>Direccion: </H3>
<H3>Tlajomulco de Zuñiga, Jalisco, México</H3>
<H3>Contacto :</H3>
<H3>Telefono : +52 3326704013</H3>
<H3>Email : <EMAIL></H3>
</Section>
</ContainerFooter>
<Right>
© 2021 <NAME>. Todos los derechos reservados
</Right>
</>
);
};
export default index;
|
javascript
|
// Copyright 2013 The Chromium Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#include "base/command_line.h"
#include "components/autofill/core/browser/payments/payments_service_url.h"
#include "components/autofill/core/common/autofill_switches.h"
#include "testing/gtest/include/gtest/gtest.h"
#include "url/gurl.h"
namespace autofill {
namespace payments {
TEST(PaymentsServiceSandboxUrl, CheckSandboxUrls) {
base::CommandLine::ForCurrentProcess()->AppendSwitchASCII(
switches::kWalletServiceUseSandbox, "1");
EXPECT_EQ("https://wallet-web.sandbox.google.com/manage/w/1/paymentMethods",
GetManageInstrumentsUrl(1).spec());
EXPECT_EQ("https://wallet-web.sandbox.google.com/manage/w/1/settings/"
"addresses",
GetManageAddressesUrl(1).spec());
}
TEST(PaymentsServiceSandboxUrl, CheckProdUrls) {
base::CommandLine::ForCurrentProcess()->AppendSwitchASCII(
switches::kWalletServiceUseSandbox, "0");
EXPECT_EQ("https://wallet.google.com/manage/w/1/paymentMethods",
GetManageInstrumentsUrl(1).spec());
EXPECT_EQ("https://wallet.google.com/manage/w/1/settings/addresses",
GetManageAddressesUrl(1).spec());
}
} // namespace payments
} // namespace autofill
|
cpp
|
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta name="Message:Raw-Header:X-RocketDSI" content="i=172.16.17.32;s=w" />
<meta name="subject" content="Attention Please" />
<meta name="dc:creator" content="=?iso-8859-1?q?Mariam=20Abacha?= <<EMAIL>> transferring this funds out of the sub-African region. Bearing in mind t=" />
<meta name="dc:creator" content="<NAME> <<EMAIL>>" />
<meta name="MboxParser-status" content="RO" />
<meta name="MboxParser---0-236422448-1142305742=" content="43093" />
<meta name="MboxParser---0-236422448-1142305742=" content="43093" />
<meta name="MboxParser---0-236422448-1142305742=" content="43093--" />
<meta name="dcterms:created" content="2006-03-14T03:09:02Z" />
<meta name="Message:From-Email" content="<EMAIL>" />
<meta name="dc:format" content="text/html; charset=iso-8859-1" />
<meta name="Message-To" content="R@S sum of 24 million U.S dollars cash which I intend to use for investment =" />
<meta name="Message-To" content="R@S" />
<meta name="Message-Recipient-Address" content="R@S sum of 24 million U.S dollars cash which I intend to use for investment =" />
<meta name="Message:Raw-Header:Status" content="RO" />
<meta name="MboxParser-reply-to" content="<EMAIL> with your letter of acceptance and your willingness in assisting me to r=" />
<meta name="Message:Raw-Header:MIME-Version" content="1.0" />
<meta name="Multipart-Boundary" content="0-236422448-1142305742=:43093" />
<meta name="Message:Raw-Header:Message-ID" content="<20060314030902.452<EMAIL>>" />
<meta name="Message:Raw-Header:Reply-To" content="<EMAIL>" />
<meta name="dc:title" content="Attention Please" />
<meta name="MboxParser-x-rocketdsi" content="i=2192.168.3.11;s=w that we received such funds. The consignment was kept on an "OPEN BENEFI=" />
<meta name="Message:Raw-Header:X-Sieve" content="CMU Sieve 2.2 h=Message-ID:X-RocketDSI:Date:From:Reply-To:Subject:To:MIME-Version:Content-Type:Content-Transfer-Encoding; b=ACLUWg9SCHs5qTNPYKY/cZoU4c7ljCFh9lfMPW7yx5ci8tDQmFoBfiQ6Jqy+I6IrBjsXzcY3Ah8TwLNCjCrfbj0gJT0+kkW+zi1L3cLw/WTr1/SzUt1hLFdiO53+I6IKxR1rjZvXVgg1Rk4FH4UU+49Hl7aPe+qPj0PM6gteBgw= ;" />
<meta name="Content-Type-Override" content="message/rfc822" />
<meta name="Content-Type" content="message/rfc822" />
<meta name="MboxParser-content-transfer-encoding" content="7bit" />
<meta name="MboxParser-content-transfer-encoding" content="quoted-printable" />
<meta name="MboxParser-content-transfer-encoding" content="quoted-printable" />
<meta name="identifier" content="<20060314030902.4522<EMAIL>> their probe into my husbands financial resources which has led to the fr=" />
<meta name="creator" content="=?iso-8859-1?q?Mariam=20Abacha?= <<EMAIL>> transferring this funds out of the sub-African region. Bearing in mind t=" />
<meta name="creator" content="<NAME> <<EMAIL>>" />
<meta name="X-Parsed-By" content="org.apache.tika.parser.DefaultParser" />
<meta name="X-Parsed-By" content="org.apache.tika.parser.mail.RFC822Parser" />
<meta name="meta:author" content="=?iso-8859-1?q?Mariam=20Abacha?= <<EMAIL>> transferring this funds out of the sub-African region. Bearing in mind t=" />
<meta name="meta:author" content="<NAME> <<EMAIL>>" />
<meta name="meta:creation-date" content="2006-03-14T03:09:02Z" />
<meta name="format" content="text/html; charset=iso-8859-1" />
<meta name="MboxParser-x-sieve" content="CMU Sieve 2.2 b=ACLUWg9SCHs5qTNPYKY/cZoU4c7ljCFh9lfMPW7yx5ci8tDQmFoBfiQ6Jqy+I6IrBjsXzcY3Ah8TwLNCjCrfbj0gJT0+kkW+zi1L3cLw/WTr1/SzUt1hLFdiO53+I6IKxR1rjZvXVgg1Rk4FH4UU+49Hl7aPe+qPj0PM6gteBgw= ; and a Russian firm in our country's multi-billion dollar Ajaokuta steel =" />
<meta name="Creation-Date" content="2006-03-14T03:09:02Z" />
<meta name="Message:Raw-Header:Content-Transfer-Encoding" content="7bit" />
<meta name="Message:Raw-Header:Return-Path" content="<<EMAIL>>" />
<meta name="MboxParser-from" content="r Mon Mar 13 22:09:11 2006" />
<meta name="MboxParser-return-path" content="<<EMAIL>> h=Message-ID:X-RocketDSI:Date:From:Reply-To:Subject:To:MIME-Version:Content-Type:Content-Transfer-Encoding; outside Nigeria.=20" />
<meta name="MboxParser-mime-version" content="1.0 to receive the fund,also include your Physical Address, personal telepho=" />
<meta name="Message:From-Name" content="<NAME>" />
<meta name="X-TIKA:embedded_depth" content="1" />
<meta name="Author" content="=?iso-8859-1?q?Mariam=20Abacha?= <<EMAIL>> transferring this funds out of the sub-African region. Bearing in mind t=" />
<meta name="Author" content="<NAME> <<EMAIL>>" />
<meta name="Multipart-Subtype" content="alternative" />
<meta name="X-TIKA:embedded_resource_path" content="/embedded-2332" />
<meta name="Message-From" content="<NAME> <<EMAIL>>" />
<meta name="dc:identifier" content="<20060314030902.452<EMAIL>> their probe into my husbands financial resources which has led to the fr=" />
<title>Attention Please</title>
</head>
<body><table><tr> <td>Jeg har en ny e-mail-adresse!</td></tr>
</table>
Nu kan du e-maile mig på: <EMAIL>
My Dear Friend,
I am Dr. Mrs. <NAME>, wife to the late Nigerian Head of state, General Sani Abacha who died on the 8th of June 1998 while still on active service for our Country. I am contacting you with the hope that you will be of great assistance to me, I currently have within my reach the sum of 24 million U.S dollars cash which I intend to use for investment purposes outside Nigeria.
This money came as a result of a payback contract deal between my husband and a Russian firm in our country's multi-billion dollar Ajaokuta steel plant. The Russian partners returned my husbands share being the above sum after his death. Presently, the new civilian Government has intensified their probe into my husbands financial resources which has led to the freezing of all our accounts, local and foreign, the revoking of all our business licences and the arrest of my First son. In view of this I acted very fast to withdraw this money from one of our finance houses before it was closed down. I have deposited the money in a security company with the help of very loyal officials of my late husband. No record is known about this fund by the government because there is no documentation showing that we received such funds. The consignment was kept on an "OPEN BENEFICIARY MANDATE" with the Security Company to avoid detection, seizure or diversion.
Due to the current situation in the country and government attitude to my family, I cannot make use of this money within, thus I seek your help in transferring this funds out of the sub-African region. Bearing in mind that you may assist me, l have decided
to part with 20% of the total sum.
Please kindly reply through this email address: <EMAIL> with your letter of acceptance and your willingness in assisting me to receive the fund,also include your Physical Address, personal telephone and fax numbers.
Your URGENT response is needed.
Regards,
Dr. Mrs. <NAME>.
- <NAME>
</body></html>
|
html
|
<filename>lib/monitoring/aws-cloudfront/CloudFrontDistributionMetricFactory.ts
import { IDistribution } from "aws-cdk-lib/aws-cloudfront";
import { DimensionsMap } from "aws-cdk-lib/aws-cloudwatch";
import {
MetricFactory,
MetricStatistic,
RateComputationMethod,
} from "../../common";
const CloudFrontNamespace = "AWS/CloudFront";
const CloudFrontGlobalRegion = "Global";
const CloudFrontDefaultMetricRegion = "us-east-1";
export interface CloudFrontDistributionMetricFactoryProps {
readonly distribution: IDistribution;
/**
* @default true
*/
readonly fillTpsWithZeroes?: boolean;
/**
* @default average
*/
readonly rateComputationMethod?: RateComputationMethod;
}
/**
* To get the CloudFront metrics from the CloudWatch API, you must use the US East (N. Virginia) Region (us-east-1).
* https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/programming-cloudwatch-metrics.html
*/
export class CloudFrontDistributionMetricFactory {
private readonly metricFactory: MetricFactory;
private readonly fillTpsWithZeroes: boolean;
private readonly rateComputationMethod: RateComputationMethod;
private readonly dimensionsMap: DimensionsMap;
constructor(
metricFactory: MetricFactory,
props: CloudFrontDistributionMetricFactoryProps
) {
this.metricFactory = metricFactory;
this.fillTpsWithZeroes = props.fillTpsWithZeroes ?? true;
this.rateComputationMethod =
props.rateComputationMethod ?? RateComputationMethod.AVERAGE;
this.dimensionsMap = {
DistributionId: props.distribution.distributionId,
Region: CloudFrontGlobalRegion,
};
}
metricRequestCount() {
return this.metricFactory
.createMetric(
"Requests",
MetricStatistic.SUM,
"Requests",
this.dimensionsMap,
undefined,
CloudFrontNamespace
)
.with({ region: CloudFrontDefaultMetricRegion });
}
metricRequestRate() {
return this.metricFactory.toRate(
this.metricRequestCount(),
this.rateComputationMethod,
false,
"requests",
this.fillTpsWithZeroes
);
}
/**
* @deprecated use metricRequestRate
*/
metricRequestTps() {
return this.metricFactory.toRate(
this.metricRequestCount(),
RateComputationMethod.PER_SECOND,
false,
"requests",
this.fillTpsWithZeroes
);
}
metricTotalBytesUploaded() {
return this.metricFactory
.createMetric(
"BytesUploaded",
MetricStatistic.SUM,
"Uploaded",
this.dimensionsMap,
undefined,
CloudFrontNamespace
)
.with({ region: CloudFrontDefaultMetricRegion });
}
metricTotalBytesDownloaded() {
return this.metricFactory
.createMetric(
"BytesDownloaded",
MetricStatistic.SUM,
"Downloaded",
this.dimensionsMap,
undefined,
CloudFrontNamespace
)
.with({ region: CloudFrontDefaultMetricRegion });
}
metricCacheHitRateAverageInPercent() {
return this.metricFactory
.createMetric(
"CacheHitRate",
MetricStatistic.AVERAGE,
"Hit Rate",
this.dimensionsMap,
undefined,
CloudFrontNamespace
)
.with({ region: CloudFrontDefaultMetricRegion });
}
metric4xxErrorRateAverage() {
return this.metricFactory
.createMetric(
"4xxErrorRate",
MetricStatistic.AVERAGE,
"4XX",
this.dimensionsMap,
undefined,
CloudFrontNamespace
)
.with({ region: CloudFrontDefaultMetricRegion });
}
metric5xxErrorRateAverage() {
return this.metricFactory
.createMetric(
"5xxErrorRate",
MetricStatistic.AVERAGE,
"5XX",
this.dimensionsMap,
undefined,
CloudFrontNamespace
)
.with({ region: CloudFrontDefaultMetricRegion });
}
metricTotalErrorRateAverage() {
return this.metricFactory
.createMetric(
"TotalErrorRate",
MetricStatistic.AVERAGE,
"Total",
this.dimensionsMap,
undefined,
CloudFrontNamespace
)
.with({ region: CloudFrontDefaultMetricRegion });
}
}
|
typescript
|
As the Tamil film industry gears up for the felicitation function for Chief Minister M. Karunanidhi on February 6, the buzz is that the cultural performances by leading artistes are going to be not only breathtaking but also unexpected. One great item will be Kamal Haasan's soliloquy performance.
What's going to be another highlight of the cultural show is the 'live hip-hop' feat by the hot couple of the South Indian film industry - Prabhu Deva and Nayantara. The hot pair was expected to perform on stage together for the Kamal-50 show by Vijay TV last year, but they didn't do the act.
Now the best-kept secret is out and the value quotient of the event has grown manyfold.
|
english
|
// Register a template definition set named "default".
CKEDITOR.addTemplates( 'default',
{
// The name of the subfolder that contains the preview images of the templates.
imagesPath : CKEDITOR.getUrl( CKEDITOR.plugins.getPath( 'templates' ) + 'templates/images/' ),
// Template definitions.
templates :
[
{
title: 'Tab Destination 1',
//image: 'template1.gif',
description: '5 Columns x 7 Columns',
html:'<div class="row"><div class="col-sm-5"> Image </div><div class="col-sm-7"> Content </div></div><div class="row"><div class="col-sm-5"> Image </div><div class="col-sm-7"> Content </div></div><div class="row"><div class="col-sm-5"> Image </div><div class="col-sm-7"> Content </div></div><div class="row"><div class="col-sm-5"> Image </div><div class="col-sm-7"> Content </div></div><div class="row"><div class="col-sm-5"> Image </div><div class="col-sm-7"> Content </div></div><div class="row"><div class="col-sm-5"> Image </div><div class="col-sm-7"> Content </div></div>'
},
{
title: 'Tab Destination 2',
//image: 'template1.gif',
description: '6 Columns x 6 Columns',
html:'<div class="row"><div class="col-sm-6"> Image </div><div class="col-sm-6"> Content </div></div><div class="row"><div class="col-sm-6"> Image </div><div class="col-sm-6"> Content </div></div><div class="row"><div class="col-sm-6"> Image </div><div class="col-sm-6"> Content </div></div><div class="row"><div class="col-sm-6"> Image </div><div class="col-sm-6"> Content </div></div><div class="row"><div class="col-sm-6"> Image </div><div class="col-sm-6"> Content </div></div><div class="row"><div class="col-sm-6"> Image </div><div class="col-sm-6"> Content </div></div>'
},
{
title: 'Letters',
//image: 'template1.gif',
description: 'Letters',
html:'<p>Content</p><small>Signature</small>'
},
{
title: 'Blog',
//image: 'template1.gif',
description: 'Blog',
html:'<a href="#" target="_blank"><img src="path" alt=""></a><h3 class="h6"><a href="#" target="_blank">Title</a></h3><p>content</p>'
}
]
});
|
javascript
|
Zion Williamson is one of the bigger athletes in the league. Recently, fans trolled the NOLA All-Star as photos comparing his physique from his rookie year and his third campaign went viral.
Zion Williamson is among the brightest young talents in the league, and he will undoubtedly be one of the future faces of the NBA. What the 21-year-old phenom has achieved in his first two years in the league has been matched by only a select few all-time greats.
Averaging 27 points, 7.2 rebounds and 3.7 assists on an outstanding 61.1% shooting from the field, Zion proved all his haters and naysayers wrong this past season. And he dominated the paint displaying his incredibly deep offensive arsenal. As fans already know, “Zanos” has a flashy dunk package, and his touch around the bucket is truly second to none.
The Pelicans truly got an outstanding talent in Zion with their #1 pick of the 2019 Draft. However, Williamson does have one flaw. Weighing 284 pounds, the 6-foot-7 prodigy is one of the heaviest forwards in the league, and with that extra weight comes the risk of several career-ending injuries.
Recently, fans stumbled upon the media day photos from his rookie campaign. Comparing his physique from his 1st year to his 3rd year, it sure looks like the Pels star has put on some more weight.
Fans have been calling Williamson all sorts of nasty names, like “Air Gumbo,” among many others. And after the photos comparing his physique from his rookie year and third year went viral, NBA Reddit had some rather mean remarks for “Zanos.”
This surely is a matter of grave concern for the NOLA front office, which hopes to see a healthy Zion play as soon as the regular season starts. While fans want to see the best of him this upcoming campaign, everyone will be rooting for him to get back into shape.
|
english
|
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>{{ title }}</title>
{% include 'header.html' %}
</head>
<body>
{% include 'nav.html' %}
<div class="pricing-header px-3 py-3 pt-md-5 pb-md-4 mx-auto text-center">
<h1 class="display-4">Bugine</h1>
<p class="lead">For issue recommendation.</p>
</div>
<div class="container">
<div class="card-deck mb-3 text-center">
<div class="card mb-4 shadow-sm">
<div class="card-header">
<h4 class="my-0 font-weight-normal">Step 1: Description</h4>
</div>
<div class="card-body">
<ul class="list-unstyled mt-3 mb-4">
<li>Parsing source code from GitHub</li>
<li>Extract description file</li>
</ul>
<a href="descript" class="btn btn-lg btn-block btn-primary">Go</a>
</div>
</div>
<div class="card mb-4 shadow-sm">
<div class="card-header">
<h4 class="my-0 font-weight-normal">Step 2: Query</h4>
</div>
<div class="card-body">
<ul class="list-unstyled mt-3 mb-4">
<li>Submit query request</li>
<li>Select excluded app(s)</li>
</ul>
<a href="query" class="btn btn-lg btn-block btn-primary">Go</a>
</div>
</div>
<div class="card mb-4 shadow-sm">
<div class="card-header">
<h4 class="my-0 font-weight-normal">Step 3: Result</h4>
</div>
<div class="card-body">
<ul class="list-unstyled mt-3 mb-4">
<li>Check query status</li>
<li>Retrieve query result</li>
</ul>
<a href="result" class="btn btn-lg btn-block btn-primary">Go</a>
</div>
</div>
</div>
<div class="jumbotron">
<h2 class="font-weight-normal">Instructions for use</h2>
<p class="lead">Follow the workflow, passing (copy and paste) the tokens between each step.</p>
</div>
</div>
</body>
</html>
|
html
|
package ru.nsu.nsustudyhelper.dto;
import lombok.Data;
@Data
public class TokenDto {
private final String token;
}
|
java
|
# This notebook implements a proof-of-principle for
# Multi-Agent Common Knowledge Reinforcement Learning (MACKRL)
# The entire notebook can be executed online, no need to download anything
# http://pytorch.org/
from itertools import chain
import torch
import torch.nn.functional as F
from torch.multiprocessing import Pool, set_start_method, freeze_support
try:
set_start_method('spawn')
except RuntimeError:
pass
from torch.nn import init
from torch.optim import Adam, SGD
import numpy as np
import matplotlib.pyplot as plt
use_cuda = False
payoff_values = []
payoff_values.append(torch.tensor([ # payoff values
[5, 0, 0, 2, 0],
[0, 1, 2, 4, 2],
[0, 0, 0, 2, 0],
[0, 0, 0, 1, 0],
[0, 0, 0, 0, 0],
], dtype=torch.float32) * 0.2)
payoff_values.append(
torch.tensor([ # payoff values
[0, 0, 1, 0, 5],
[0, 0, 2, 0, 0],
[1, 2, 4, 2, 1],
[0, 0, 2, 0, 0],
[0, 0, 1, 0, 0],
], dtype=torch.float32) * 0.2)
n_agents = 2
n_actions = len(payoff_values[0])
n_states_dec = 5
n_states_joint = 3
n_mix_hidden = 3
p_observation = 0.5
p_ck_noise = [0.0]
# Number of gradient steps
t_max = 202
# We'll be using a high learning rate, since we have exact gradients
lr = 0.05 # DEBUG: 0.05 if exact gradients!
optim = 'adam'
# You can reduce this number if you are short on time. (Eg. n_trials = 20)
#n_trials = 100 # 30
n_trials = 20 #15 #100
std_val = 1.0
# These are the settings we can run: MACKRL, Joint-Action-Learner (always uses CK),
# and Independent Actor-Critic (always uses decentralised action selection).
# Only IAC and JAL are enabled below.
labels = ["IAC", "JAL"]
p_vec = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
final_res = []
# # Pair-Controller with 3 input state (no CK, CK & Matrix ID = 0, CK & Matrix ID = 1), n_actions^2 actions for
# # joint action + 1 action for delegation to the independent agents.
# theta_joint = init.normal_(torch.zeros(n_states_joint, n_actions ** 2 + 1, requires_grad=True), std=0.1)
# Produce marginalised policy: pi_pc[0] * pi^a * pi^b + p(u^ab)
def p_joint_all(pi_pc, pi_dec):
p_joint = pi_pc[1:].view(n_actions, n_actions).clone()
pi_a_pi_b = torch.ger(pi_dec[0], pi_dec[1])
p_joint = pi_pc[0] * pi_a_pi_b + p_joint
return p_joint
def p_joint_all_noise_alt(pi_pcs, pi_dec, p_ck_noise, ck_state):
p_none = (1-p_ck_noise) ** 2 # both unnoised
p_both = (p_ck_noise) ** 2 # both noised
p_one = (1-p_ck_noise) * p_ck_noise # exactly one noised
p_marg_ag0_ck1 = pi_pcs[1][1:].view(n_actions, n_actions).clone().sum(dim=0)
p_marg_ag0_ck2 = pi_pcs[2][1:].view(n_actions, n_actions).clone().sum(dim=0)
p_marg_ag1_ck1 = pi_pcs[1][1:].view(n_actions, n_actions).clone().sum(dim=1)
p_marg_ag1_ck2 = pi_pcs[2][1:].view(n_actions, n_actions).clone().sum(dim=1)
p_joint_ck0 = pi_pcs[0][1:].view(n_actions, n_actions).clone()
p_joint_ck1 = pi_pcs[1][1:].view(n_actions, n_actions).clone()
p_joint_ck2 = pi_pcs[2][1:].view(n_actions, n_actions).clone()
p_d_ck0 = pi_pcs[0][0]
p_d_ck1 = pi_pcs[1][0]
p_d_ck2 = pi_pcs[2][0]
def make_joint(p1, p2, mode="interval"):
"""
1. Pick uniform random variable between [0,1]
2. Do multinomial sampling through contiguous, ordered bucketing for both p1, p2
"""
p1 = p1.clone().view(-1)
p2 = p2.clone().view(-1)
p_final = p1.clone().zero_()
if mode == "interval":
for i in range(p1.shape[0]):
# calculate overlap between the probability distributions
low1 = torch.sum(p1[:i])
high1 = low1 + p1[i]
low2 = torch.sum(p2[:i])
high2 = low2 + p2[i]
if low1 >= low2 and high2 > low1:
p_final[i] = torch.min(high1, high2) - low1
pass
elif low2 >= low1 and high1 > low2:
p_final[i] = torch.min(high1, high2) - low2
else:
p_final[i] = 0
return p_final.clone().view(n_actions, n_actions)
if ck_state == 0:
p_joint = p_joint_ck0 + p_d_ck0 * torch.ger(pi_dec[0], pi_dec[1])
return p_joint # always delegate
elif ck_state == 1:
p_joint = p_none * p_joint_ck1 + \
p_both * p_joint_ck2 + \
p_one * make_joint(p_joint_ck1, p_joint_ck2) + \
p_one * make_joint(p_joint_ck2, p_joint_ck1) + \
(p_one * p_d_ck1 * p_d_ck2
+ p_one * p_d_ck2 * p_d_ck1
+ p_both * p_d_ck2
+ p_none * p_d_ck1) * torch.ger(pi_dec[0], pi_dec[1]) \
+ p_one * p_d_ck1 * (1 - p_d_ck2) * torch.ger(pi_dec[0], p_marg_ag1_ck2) \
+ p_one * (1 - p_d_ck2) * p_d_ck1 * torch.ger(p_marg_ag0_ck2, pi_dec[1]) \
+ p_one * p_d_ck2 * (1 - p_d_ck1) * torch.ger(pi_dec[0], p_marg_ag1_ck1) \
+ p_one * (1 - p_d_ck1) * p_d_ck2 * torch.ger(p_marg_ag0_ck1, pi_dec[1])
return p_joint
elif ck_state == 2:
p_joint = p_none * p_joint_ck2 + \
p_both * p_joint_ck1 + \
p_one * make_joint(p_joint_ck2, p_joint_ck1) + \
p_one * make_joint(p_joint_ck1, p_joint_ck2) + \
(p_one * p_d_ck2 * p_d_ck1
+ p_one * p_d_ck1 * p_d_ck2
+ p_both * p_d_ck1
+ p_none * p_d_ck2) * torch.ger(pi_dec[0], pi_dec[1]) \
+ p_one * p_d_ck2 * (1 - p_d_ck1) * torch.ger(pi_dec[0], p_marg_ag1_ck1) \
+ p_one * (1 - p_d_ck1) * p_d_ck2 * torch.ger(p_marg_ag0_ck1, pi_dec[1]) \
+ p_one * p_d_ck1 * (1 - p_d_ck2) * torch.ger(pi_dec[0], p_marg_ag1_ck2) \
+ p_one * (1 - p_d_ck2) * p_d_ck1 * torch.ger(p_marg_ag0_ck2, pi_dec[1])
return p_joint
pass
def get_policies(common_knowledge, observations, run, test, thetas_dec, theta_joint, p_ck_noise=0):
if test:
beta = 100
else:
beta = 1
actions = []
pi_dec = []
# common_knowledge decides whether ck_state is informative
if common_knowledge == 0:
ck_state = 0
else:
ck_state = int(observations[0] + 1)
if p_ck_noise == 0:
pol_vals = theta_joint[ck_state, :].clone()
# logits get masked out for independent learner and joint-action-learner
# independent learner has a pair controller that always delegates
if run == 'JAL':
pol_vals[0] = -10 ** 10
elif run == 'IAC':
pol_vals[1:] = -10 ** 10
# apply temperature; the high beta used at test time makes the softmax near-greedy
pi_pc = F.softmax(pol_vals * beta, -1)
# calculate decentralised policies
for i in range(n_agents):
dec_state = int(observations[i])
pi = F.softmax(thetas_dec[i][dec_state] * beta, -1)
pi_dec.append(pi)
return pi_pc, pi_dec
else:
pol_vals = theta_joint.clone()
pi_pcs = []
for i in range(n_states_joint):
if run == 'JAL':
pol_vals[i][0] = -10 ** 10
elif run == 'IAC':
pol_vals[i][1:] = -10 ** 10
# apply temperature; the high beta used at test time makes the softmax near-greedy
pi_pcs.append(F.softmax(pol_vals[i] * beta, -1))
# calculate decentralised policies
for i in range(n_agents):
dec_state = int(observations[i])
pi = F.softmax(thetas_dec[i][dec_state] * beta, -1)
pi_dec.append(pi)
return pi_pcs, pi_dec, ck_state
def get_state(common_knowledge, obs_0, obs_1, matrix_id):
receives_obs = [obs_0, obs_1]
if common_knowledge == 1:
observations = np.repeat(matrix_id, 2)
else:
observations = np.ones((n_agents)) * 2
for ag in range(n_agents):
if receives_obs[ag]:
observations[ag] += matrix_id + 1
return common_knowledge, observations, matrix_id
# Calculate the expected return: sum_{\tau} P(\tau | pi) R(\tau)
def expected_return(p_common, p_observation, thetas, run, test, p_ck_noise=0):
thetas_dec = thetas["dec"]
theta_joint = thetas["joint"]
# Probability of CK
p_common_val = [1 - p_common, p_common]
# Probability of observation given no CK)
p_obs_val = [1 - p_observation, p_observation]
# Matrices are chosen 50 / 50
p_matrix = [0.5, 0.5]
# p_matrix = [1.0, 0.0] # DEBUG!
# Initialise expected return
ret_val = 0
for ck in [0, 1]:
for matrix_id in [0, 1]:
for obs_0 in [0, 1]:
for obs_1 in [0, 1]:
p_state = p_common_val[ck] * p_obs_val[obs_0] * p_obs_val[obs_1] * p_matrix[matrix_id]
common_knowledge, observations, matrix_id = get_state(ck, obs_0, obs_1, matrix_id)
# Get final probabilities for joint actions
if p_ck_noise==0:
pi_pc, pi_dec = get_policies(common_knowledge, observations, run, test, thetas_dec, theta_joint)
p_joint_val = p_joint_all(pi_pc, pi_dec)
else:
pol_vals, pi_dec, ck_state = get_policies(common_knowledge, observations, run, test, thetas_dec, theta_joint, p_ck_noise)
p_joint_val = p_joint_all_noise_alt(pol_vals, pi_dec, p_ck_noise, ck_state)
# Expected return is just the elementwise product of rewards and action probabilities
expected_ret = (p_joint_val * payoff_values[matrix_id]).sum()
# Add return from given state
ret_val = ret_val + p_state * expected_ret
return ret_val
def _proc(args):
p_common, p_observation, run, p_ck_noise, t_max, n_trials = args
results = []
for nt in range(n_trials):
print("Run: {} P_CK_NOISE: {} P_common: {} #Trial: {}".format(run, p_ck_noise, p_common, nt))
results_log = np.zeros((t_max // (t_max // 100),))
results_log_test = np.zeros((t_max // (t_max // 100),))
thetas = {}
thetas["dec"] = [init.normal_(torch.zeros(n_states_dec, n_actions, requires_grad=True), std=std_val) for i in
range(n_agents)]
thetas["joint"] = init.normal_(torch.zeros(n_states_joint, n_actions ** 2 + 1, requires_grad=True),
std=std_val)
params = chain(*[_v if isinstance(_v, (list, tuple)) else [_v] for _v in thetas.values()])
params = list(params)
if use_cuda:
for param in params:
                param.data = param.data.to("cuda")  # move in place; rebinding the loop variable would be a no-op
if optim == 'sgd':
optimizer = SGD(params, lr=lr)
else:
optimizer = Adam(params, lr=lr)
for i in range(t_max):
if run in ['MACKRL',
'JAL',
'IAC']:
loss = - expected_return(p_common, p_observation, thetas, run, False, p_ck_noise)
r_s = -loss.data.numpy()
optimizer.zero_grad()
loss.backward()
optimizer.step()
if i % (t_max // 100) == 0:
if run in ['MACKRL',
'JAL',
'IAC']:
r_test = expected_return(p_common, p_observation, thetas, run, True, p_ck_noise)
results_log_test[i // (t_max // 100)] = r_test
results_log[i // (t_max // 100)] = r_s
results.append((results_log_test, results_log))
return results
def main():
use_mp = True
if use_mp:
pool = Pool(processes=2)
        # We'll be appending results to these lists
run_results = []
for run in labels:
noise_results = []
for pnoise in p_ck_noise:
print("Run: {} P_CK_NOISE: {}".format(run, pnoise))
results = pool.map(_proc, [ (pc, p_observation, run, pnoise, t_max, n_trials) for pc in p_vec ])
noise_results.append(results)
run_results.append(noise_results)
for p_common_id, p_common in enumerate(p_vec):
all_res = []
all_res_test = []
for run_id, run in enumerate(labels):
for pnoise_id, pnoise in enumerate(p_ck_noise):
try:
results = run_results[run_id][pnoise_id][p_common_id]
except Exception as e:
pass
all_res_test.append(np.stack([r[0] for r in results], axis=1))
all_res.append(np.stack([r[1] for r in results], axis=1))
final_res.append([all_res_test, all_res])
pool.close()
pool.join()
else:
        # We'll be appending results to these lists
run_results = []
for run in labels:
noise_results = []
for pnoise in p_ck_noise:
print("Run: {} P_CK_NOISE: {}".format(run, pnoise))
results = [_proc((pc, p_observation, run, pnoise, t_max, n_trials)) for pc in p_vec ]
noise_results.append(results)
run_results.append(noise_results)
for p_common_id, p_common in enumerate(p_vec):
all_res = []
all_res_test = []
for run_id, run in enumerate(labels):
for pnoise_id, pnoise in enumerate(p_ck_noise):
try:
results = run_results[run_id][pnoise_id][p_common_id]
except Exception as e:
pass
all_res_test.append(np.stack([r[0] for r in results], axis=1))
all_res.append(np.stack([r[1] for r in results], axis=1))
final_res.append([all_res_test, all_res])
import pickle
import uuid
import os
res_dict = {}
res_dict["final_res"] = final_res
res_dict["labels"] = labels
res_dict["p_ck_noise"] = p_ck_noise
res_dict["p_vec"] = p_vec
if not os.path.exists(os.path.join(os.path.dirname(os.path.abspath(__file__)),
"pickles")):
os.makedirs(os.path.join(os.path.dirname(os.path.abspath(__file__)),
"pickles"))
pickle.dump(res_dict, open(os.path.join(os.path.dirname(os.path.abspath(__file__)),
"pickles",
"final_res_{}.p".format(uuid.uuid4().hex[:4])), "wb"))
plt.figure(figsize=(5, 5))
color = ['b', 'r','g', 'c', 'm', 'y', 'k','b', 'r','g', 'c', 'm', 'y', 'k']
titles = ['Test', 'Train Performance']
for pl in [0,1]:
ax = plt.subplot(1, 1, 1)
for i in range(len(labels)):
for pck, pcknoise in enumerate(p_ck_noise):
mean_vals = []
min_vals = []
max_vals = []
for j, p in enumerate( p_vec ):
vals = final_res[j][pl]
this_mean = np.mean( vals[i*len(p_ck_noise) + pck], 1)[-1]
std = np.std(vals[i], 1)[-1]/0.5
low = this_mean-std / (n_trials)**0.5
high = this_mean + std / (n_trials)**0.5
mean_vals.append( this_mean )
min_vals.append( low )
max_vals.append( high )
plt.plot(p_vec,
mean_vals,
color[(i*len(p_ck_noise) + pck) % len(color)],
label = "{} p_ck_noise: {}".format(labels[i], pcknoise))
plt.fill_between(p_vec,
min_vals,
max_vals,
facecolor=color[i],
alpha=0.3)
plt.xlabel('P(common knowledge)')
plt.ylabel('Expected Return')
plt.ylim([0.0, 1.01])
plt.xlim([-0.01, 1.01])
ax.set_facecolor((1.0, 1.0, 1.0))
ax.grid(color='k', linestyle='-', linewidth=1)
ax.set_title(titles[pl])
plt.legend()
plt.xticks([0, 0.5, 1])
plt.yticks([0.5, 0.75, 1])
plt.savefig("MACKRL {}.pdf".format(titles[pl]))
plt.show(block=False)
if __name__ == "__main__":
freeze_support()
main()
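The masking step in `get_policies` above (JAL overwrites slot 0, IAC overwrites every slot but 0, each with `-10 ** 10`) reduces to a temperature-scaled softmax over masked logits. A minimal stdlib sketch with illustrative numbers (the logits and `beta` values here are made up for demonstration, not taken from the script):

```python
import math

def softmax(xs):
    # numerically stable softmax
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

beta = 5.0                            # inverse temperature, multiplied in before the softmax
theta = [0.2, -0.1, 0.4, 0.0, 1.0]    # illustrative logits: slot 0 plus n_actions**2 joint actions
masked = [theta[0]] + [-1e10] * 4     # IAC-style mask: only slot 0 survives
pi = softmax([v * beta for v in masked])
```

With the mask applied, essentially all probability mass lands on the surviving slot, which is why the `-10 ** 10` constant works as a soft negative infinity.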
|
python
|
<filename>src/test/java/net/andreho/struct/map/DataSets.java<gh_stars>0
package net.andreho.struct.map;
import java.util.List;
import java.util.stream.Collectors;
import static java.util.Collections.singletonList;
import static java.util.concurrent.ThreadLocalRandom.current;
import static java.util.stream.Collectors.toList;
public class DataSets {
protected static final String ONE_RANDOM = "oneRandom";
protected static final String FIVE_RANDOM = "fiveRandoms";
protected static final String TEN_RANDOM = "tenRandoms";
protected static final String HUNDRED_RANDOM = "hundredRandoms";
protected static final String THOUSAND_RANDOM = "thousandRandoms";
protected static final String TEN_THOUSAND_RANDOM = "tenThousandRandoms";
protected static final String HUNDRED_THOUSAND_RANDOM = "hundredThousandRandoms";
protected static final String ONE = "one";
protected static final String FIVE = "five";
protected static final String TEN = "ten";
protected static final String HUNDRED = "hundred";
protected static final String THOUSAND = "thousand";
protected static final String TEN_THOUSAND = "tenThousand";
protected static final String HUNDRED_THOUSAND = "hundredThousand";
protected static List<List<Integer>> oneRandom() {
return singletonList(
current()
.ints(1)
.boxed()
.collect(toList())
);
}
protected static List<List<Integer>> fiveRandoms() {
return singletonList(
current()
.ints(5)
.boxed()
.collect(toList())
);
}
protected static List<List<Integer>> tenRandoms() {
return singletonList(
current()
.ints(10)
.boxed()
.collect(toList())
);
}
protected static List<List<Integer>> hundredRandoms() {
return singletonList(
current()
.ints(100)
.boxed()
.collect(toList())
);
}
protected static List<List<Integer>> thousandRandoms() {
return singletonList(
current()
.ints(1000)
.boxed()
.collect(toList())
);
}
protected static List<List<Integer>> tenThousandRandoms() {
return singletonList(
current()
.ints(10_000)
.boxed()
.collect(toList())
);
}
protected static List<List<Integer>> hundredThousandRandoms() {
return singletonList(
current()
.ints(100_000)
.boxed()
.collect(toList())
);
}
private static final List<Integer> INTEGERS =
current()
.ints(100_000)
.distinct()
.boxed()
.collect(Collectors.toList());
protected static List<List<Integer>> one() {
return singletonList(INTEGERS.subList(0, 1));
}
protected static List<List<Integer>> five() {
return singletonList(INTEGERS.subList(0, 5));
}
protected static List<List<Integer>> ten() {
return singletonList(INTEGERS.subList(0, 10));
}
protected static List<List<Integer>> hundred() {
return singletonList(INTEGERS.subList(0, 100));
}
protected static List<List<Integer>> thousand() {
return singletonList(INTEGERS.subList(0, 1000));
}
protected static List<List<Integer>> tenThousand() {
return singletonList(INTEGERS.subList(0, 10_000));
}
protected static List<List<Integer>> hundredThousand() {
return singletonList(INTEGERS.subList(0, 100_000));
}
}
|
java
|
/**
 * See: http://plugins.jquery.com/project/json_autocomplete
* Json key/value autocomplete for jQuery
* Provides a transparent way to have key/value autocomplete
* Copyright (C) 2008 <NAME> www.CodeAssembly.com
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with this program. If not, see http://www.gnu.org/licenses/
*
* Examples
* $("input#example").autocomplete("autocomplete.php");//using default parameters
* $("input#example").autocomplete("autocomplete.php",{minChars:3,timeout:3000,validSelection:false,parameters:{'myparam':'myvalue'},before : function(input,text) {},after : function(input,text) {}});
* minChars = Minimum characters the input must have for the ajax request to be made
 * timeOut = Number of milliseconds after the user entered text before the ajax request is made
* validSelection = If set to true then will invalidate (set to empty) the value field if the text is not selected (or modified) from the list of items.
* parameters = Custom parameters to be passed
 * after, before = functions that will be called before/after the ajax request
*/
jQuery.fn.autocomplete = function(url, settings)
{
return this.each( function()//do it for each matched element
{
//this is the original input
var textInput = $(this);
//create a new hidden input that will be used for holding the return value when posting the form, then swap names with the original input
textInput.after('<input type=hidden name="' + textInput.attr("name") + '"/>').attr("name", textInput.attr("name") + "_text");
var valueInput = $(this).next();
//create the ul that will hold the text and values
valueInput.after('<ul class="autocomplete"></ul>');
var list = valueInput.next().css({top: textInput.offset().top + textInput.outerHeight(), left: textInput.offset().left, width: textInput.width()});
var oldText = '';
var typingTimeout;
var size = 0;
var selected = 0;
settings = jQuery.extend(//provide default settings
{
minChars : 1,
timeout: 1000,
after : null,
before : null,
validSelection : true,
parameters : {}
} , settings);
if(settings.extraParams) {
jQuery.each(settings.extraParams, function(key, param) {
settings.parameters[key] = typeof param == "function" ? param() : param;
});
}
function getData(text)
{
window.clearInterval(typingTimeout);
        // alert("Called with " + text); // debug
if (text != oldText && (settings.minChars != null && text.length >= settings.minChars))
{
clear();
            if (typeof settings.before == "function")
{
settings.before(textInput,text);
}
textInput.addClass('autocomplete-loading');
settings.parameters.q = text;
$.getJSON(url,settings.parameters,function(data)
{
var items = '';
if (data)
{
size = data.length;
                    for (var i = 0; i < data.length; i++)//iterate over all options
{
                        for (var key in data[i])//get key => value
{
items += '<li value3="' + key + '">' + data[i][key].replace(new RegExp("(" + text + ")","i"),"<strong>$1</strong>") + '</li>';
}
list.html(items);
//on mouse hover over elements set selected class and on click set the selected value and close list
list.show().children().
hover(function() { $(this).addClass("selected").siblings().removeClass("selected");}, function() { $(this).removeClass("selected") } ).
click(function () { valueInput.val( $(this).attr('value3') );textInput.val( $(this).text() ); clear(); });
}
                    if (typeof settings.after == "function")
{
settings.after(textInput,text);
}
}
textInput.removeClass('autocomplete-loading');
});
oldText = text;
}
}
function clear()
{
list.hide();
size = 0;
selected = 0;
}
textInput.keydown(function(e)
{
window.clearInterval(typingTimeout);
if(e.which == 27)//escape
{
clear();
} else if (e.which == 46 || e.which == 8)//delete and backspace
{
clear();
//invalidate previous selection
if (settings.validSelection) valueInput.val('');
}
else if(e.which == 13)//enter
{
if ( list.css("display") == "none")//if the list is not visible then make a new request, otherwise hide the list
{
getData(textInput.val());
} else
{
clear();
}
e.preventDefault();
return false;
}
else if(e.which == 40 || e.which == 9 || e.which == 38)//move up, down
{
e.preventDefault();
switch(e.which)
{
case 40:
case 9:
selected = selected >= size - 1 ? 0 : selected + 1; break;
case 38:
selected = selected <= 0 ? size - 1 : selected - 1; break;
default: break;
}
//set selected item and input values
textInput.val( list.children().removeClass('selected').eq(selected).addClass('selected').text() );
valueInput.val( list.children().eq(selected).attr('value3') );
return false;
} else
{
//invalidate previous selection
if (settings.validSelection) valueInput.val('');
typingTimeout = window.setTimeout(function() { getData(textInput.val()) },settings.timeout);
}
});
});
};
|
javascript
|
<filename>app/src/main/java/com/getinlight/controlphone/receiver/DeviceAdmin.java
package com.getinlight.controlphone.receiver;
import android.app.admin.DeviceAdminReceiver;
import android.content.Context;
import android.content.Intent;
/**
* Created by getinlight on 2018/2/12.
*/
public class DeviceAdmin extends DeviceAdminReceiver {
@Override
public void onReceive(Context context, Intent intent) {
super.onReceive(context, intent);
}
}
|
java
|
<filename>lib/bridgedb/email/request.py
# -*- coding: utf-8; test-case-name: bridgedb.test.test_email_request; -*-
#_____________________________________________________________________________
#
# This file is part of BridgeDB, a Tor bridge distribution system.
#
# :authors: <NAME> <<EMAIL>>
# <NAME> <<EMAIL>> 0xA3ADB67A2CDB8B35
# <NAME> <<EMAIL>>
# please also see AUTHORS file
# :copyright: (c) 2007-2015, The Tor Project, Inc.
# (c) 2013-2015, Isis Lovecruft
# :license: see LICENSE for licensing information
#_____________________________________________________________________________
"""
.. py:module:: bridgedb.email.request
:synopsis: Classes for parsing and storing information about requests for
bridges which are sent to the email distributor.
bridgedb.email.request
======================
Classes for parsing and storing information about requests for bridges
which are sent to the email distributor.
::
bridgedb.email.request
| |_ determineBridgeRequestOptions - Figure out which filters to apply, or
| offer help.
|_ EmailBridgeRequest - A request for bridges which was received through
the email distributor.
..
"""
from __future__ import print_function
from __future__ import unicode_literals
import logging
import re
from bridgedb import bridgerequest
from bridgedb.Dist import EmailRequestedHelp
from bridgedb.Dist import EmailRequestedKey
#: A regular expression for matching the Pluggable Transport method TYPE in
#: emailed requests for Pluggable Transports.
TRANSPORT_REGEXP = ".*transport ([a-z][_a-z0-9]*)"
TRANSPORT_PATTERN = re.compile(TRANSPORT_REGEXP)
#: A regular expression that matches country codes in requests for unblocked
#: bridges.
UNBLOCKED_REGEXP = ".*unblocked ([a-z]{2,4})"
UNBLOCKED_PATTERN = re.compile(UNBLOCKED_REGEXP)
def determineBridgeRequestOptions(lines):
"""Figure out which :class:`Bridges.BridgeFilter`s to apply, or offer help.
.. note:: If any ``'transport TYPE'`` was requested, or bridges not
blocked in a specific CC (``'unblocked CC'``), then the ``TYPE``
and/or ``CC`` will *always* be stored as a *lowercase* string.
:param list lines: A list of lines from an email, including the headers.
:raises EmailRequestedHelp: if the client requested help.
:raises EmailRequestedKey: if the client requested our GnuPG key.
:rtype: :class:`EmailBridgeRequest`
    :returns: A :class:`~bridgerequest.BridgeRequest` with all of the requested
parameters set. The returned ``BridgeRequest`` will have already had
its filters generated via :meth:`~EmailBridgeRequest.generateFilters`.
"""
request = EmailBridgeRequest()
skippedHeaders = False
for line in lines:
line = line.strip().lower()
# Ignore all lines before the first empty line:
if not line: skippedHeaders = True
if not skippedHeaders: continue
if ("help" in line) or ("halp" in line):
raise EmailRequestedHelp("Client requested help.")
if "get" in line:
request.isValid(True)
logging.debug("Email request was valid.")
if "key" in line:
request.wantsKey(True)
raise EmailRequestedKey("Email requested a copy of our GnuPG key.")
if "ipv6" in line:
request.withIPv6()
if "transport" in line:
request.withPluggableTransportType(line)
if "unblocked" in line:
request.withoutBlockInCountry(line)
logging.debug("Generating hashring filters for request.")
request.generateFilters()
return request
class EmailBridgeRequest(bridgerequest.BridgeRequestBase):
"""We received a request for bridges through the email distributor."""
def __init__(self):
"""Process a new bridge request received through the
:class:`~bridgedb.Dist.EmailBasedDistributor`.
"""
super(EmailBridgeRequest, self).__init__()
self._isValid = False
self._wantsKey = False
def isValid(self, valid=None):
"""Get or set the validity of this bridge request.
If called without parameters, this method will return the current
state, otherwise (if called with the **valid** parameter), it will set
the current state of validity for this request.
:param bool valid: If given, set the validity state of this
request. Otherwise, get the current state.
"""
if valid is not None:
self._isValid = bool(valid)
return self._isValid
def wantsKey(self, wantsKey=None):
"""Get or set whether this bridge request wanted our GnuPG key.
If called without parameters, this method will return the current
state, otherwise (if called with the **wantsKey** parameter set), it
will set the current state for whether or not this request wanted our
key.
:param bool wantsKey: If given, set the validity state of this
request. Otherwise, get the current state.
"""
if wantsKey is not None:
self._wantsKey = bool(wantsKey)
return self._wantsKey
def withoutBlockInCountry(self, line):
"""This request was for bridges not blocked in **country**.
Add any country code found in the **line** to the list of
        ``notBlockedIn``. Currently, a request for unblocked bridges is
        recognized if the email line contains the ``'unblocked'`` command.
        :param str line: The line from the email wherein the client
            requested bridges which are not blocked in a specific country.
"""
unblocked = None
logging.debug("Parsing 'unblocked' line: %r" % line)
try:
unblocked = UNBLOCKED_PATTERN.match(line).group(1)
except (TypeError, AttributeError):
pass
if unblocked:
self.notBlockedIn.append(unblocked)
logging.info("Email requested bridges not blocked in: %r"
% unblocked)
def withPluggableTransportType(self, line):
"""This request included a specific Pluggable Transport identifier.
Add any Pluggable Transport method TYPE found in the **line** to the
list of ``transports``. Currently, a request for a transport is
recognized if the email line contains the ``'transport'`` command.
:param str line: The line from the email wherein the client
requested some type of Pluggable Transport.
"""
transport = None
logging.debug("Parsing 'transport' line: %r" % line)
try:
transport = TRANSPORT_PATTERN.match(line).group(1)
except (TypeError, AttributeError):
pass
if transport:
self.transports.append(transport)
logging.info("Email requested transport type: %r" % transport)
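The two regexes above do the heavy lifting in `withPluggableTransportType` and `withoutBlockInCountry`. A self-contained sketch of how they behave on typical request lines (patterns re-declared locally so this runs standalone; `parse_request_line` is a hypothetical helper, not part of the module):

```python
import re

# Re-declared from the module above so this sketch is self-contained.
TRANSPORT_PATTERN = re.compile(".*transport ([a-z][_a-z0-9]*)")
UNBLOCKED_PATTERN = re.compile(".*unblocked ([a-z]{2,4})")

def parse_request_line(line):
    """Return (transport, country) parsed from one email body line."""
    line = line.strip().lower()
    transport = country = None
    m = TRANSPORT_PATTERN.match(line)
    if m:
        transport = m.group(1)
    m = UNBLOCKED_PATTERN.match(line)
    if m:
        country = m.group(1)
    return transport, country
```

As in the module, the extracted `TYPE` and `CC` are always lowercase because the line itself is lowercased before matching.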
|
python
|
Sunrisers Hyderabad (SRH) coach Trevor Bayliss shared why Kane Williamson was not part of the playing eleven on Sunday against KKR. The coach revealed the team wanted to give Williamson more time to get up to speed.
Kane Williamson was surprisingly left out of the side for Sunrisers Hyderabad’s match against Kolkata Knight Riders. The decision to pick Jonny Bairstow ahead of Williamson raised eyebrows, as many felt the Kiwi batsman was better suited to playing in Chennai.
Trevor Bayliss spoke to the media after SRH’s ten-run loss to KKR and explained the team’s decision to bench their former skipper.
“We just felt that Kane needed a little bit of extra time to get match fit and a little bit more time in the nets. He would have played in place of Jonny Bairstow obviously if that had occurred. But we aren’t too perturbed about that, Jonny has been in form recently in white-ball cricket here in India. Kane will obviously come into calculations as the tournament unfolds,” claimed Bayliss.
Despite not being the first choice, Jonny Bairstow had an impressive showing in the season opener. The Englishman came out to bat when the scores were 10/2. He stitched a 92-run partnership with Manish Pandey, scoring 55 from 40 balls.
Jonny Bairstow’s knock gave SRH a fighting chance, but the team ultimately failed to get over the line. His impressive knock means SRH will be left scratching their heads once Kane Williamson returns to full fitness.
Kane Williamson came out of quarantine just a couple of days before SRH's first game. He last played the New Zealand vs Australia T20I series at the start of March. The Kiwi skipper has been out of action since, as he recovers from a small tear in his left elbow tendon.
The injury caused him to miss New Zealand’s white-ball series against Bangladesh, and SRH will hope they have the dynamic batsman back as soon as possible.
While Jonny Bairstow is a reliable option in the middle order, Kane Williamson gives them stability at No. 4. The 30-year-old played 12 games last season, scoring 317 runs at a strike rate of 133.75. His presence allows others to play around him, giving SRH’s power hitters the freedom to attack in the knowledge that a dependable batsman anchors the innings.
|
english
|
<gh_stars>1-10
{"type":"FeatureCollection","features":[{"type":"Feature","properties":{"label:cs":"Boží muka na návsi v Nevcehli","P31":"http://www.wikidata.org/entity/Q3395121","P31^label:cs":"Boží muka","P131^label:cs":"Nevcehle","P6736":"28973"},"geometry":{"type":"Point","coordinates":[15.53468,49.22517]}},{"type":"Feature","properties":{"label:cs":"Boží muka u silnice jižně od Pavlova","P31":"http://www.wikidata.org/entity/Q3395121","P31^label:cs":"Boží muka","P131^label:cs":"Pavlov","P6736":"28972"},"geometry":{"type":"Point","coordinates":[15.55508,49.23749]}}]}
|
json
|