---
layout: post
title: An Open Letter to the Peking University Administration Regarding Students Summoned for Talks over Information-Disclosure Requests
date: 2018-04-25 12:30
categories: Archive
tags: Support Yue Xin
description: Joint open letter
---
Source: WeChat ~~[深约一丈](https://mp.weixin.qq.com/s/TpQ4-z-Rm_w3SxEP027acg)~~, ~~[北大BBS](https://bbs.pku.edu.cn/v2/post-read.php?bid=22&threadid=16397004)~~
---
To the respected leadership of Peking University:

Greetings!

We are Peking University students, faculty, and alumni who have been following the Shen Yang case and its subsequent developments.

First, we wish to make clear that we write this letter not only because of Yue Xin, but also because of the various unjust treatment experienced by the many students who have actively pushed for a resolution since the Shen Yang case came to light.

On the morning of the 23rd, we learned, with shock and bewilderment, the following news:

Because Yue Xin, a 2014 undergraduate in our School of Foreign Languages, had submitted an information-disclosure request to the university in pursuit of the truth about the Shen Yang case, a counselor from the School of Foreign Languages came to her dormitory late at night to summon her for a talk and unilaterally notified her family. Her family was badly frightened as a result, and Yue Xin was left isolated and helpless.

Out of a sincere wish to take part in campus affairs and a sense of responsibility as members of Peking University, Yue Xin and other students submitted an information-disclosure application form to the university on the morning of April 9. Everything that we, Yue Xin included, have done was meant simply to resolve the Shen Yang case properly on the basis of the truth, and to build institutions that genuinely protect the lawful rights and interests of students and faculty going forward.

Yet a series of events along the way has left us deeply puzzled and disappointed:

On the evening of April 7, Deng Yuhao posted that he would request information disclosure. Within hours of the post, his department summoned him for a late-night talk that lasted past three in the morning; only after many concerned students and the interviewing teachers argued on his behalf was he allowed to return.

On April 9, ten students and teachers formally submitted a written information-disclosure request at the president's office building, in a reasonable and lawful manner. In the days that followed, the relevant departments began summoning the students who had been present at Deng's talk in the Second Teaching Building on the night of the 7th and those who had submitted the request.

Although the university's conduct, and the disclosure response it issued on April 20, left us with many questions, out of trust in the university everyone chose to cooperate and work with it to build an anti-sexual-harassment system on campus and protect the lawful rights and interests of students and faculty.

Yet even as students handled the matter calmly, the news we received the morning before last was that the School of Foreign Languages had repeatedly summoned Yue Xin in recent days, pressured her parents behind her back, forced a talk in her dormitory in the small hours, and demanded that she delete all materials related to the information-disclosure request.

Although the teachers involved and the School of Foreign Languages promptly issued statements on the matter, declaring that they "have always respected every student's basic rights and strive to protect every student's lawful rights and interests," for students who have been through one talk after another, it is hard to feel any sincere care from the university.

We hope these students can return to normal life as soon as possible, and we absolutely do not want anything like this to happen on campus again. We sincerely hope that students' lawful rights will no longer be violated, that such "misunderstandings" between the administration and students will no longer arise at Yan Yuan, and that Peking University's reputation and image at home and abroad will be preserved for the long term.

We would add that, after all this, we feel keenly how powerless and small an individual is, and how much the support of teachers and fellow students matters when one's views are to be heard.

In view of this, and in response to what Yue Xin has experienced and the university's recent series of summons, we jointly put forward the following four proposals to the administration:

1. Properly redress the harm these talks have done to Yue Xin. Urge the School of Foreign Languages to promptly hold a briefing on the matter, open to all students and faculty across the university, with participation details announced in advance, so as to respond effectively to everyone's questions and doubts. Guarantee that no pressure of any kind will be put on Yue Xin over this matter again, proactively clarify the matter to her family, and dispel unnecessary misunderstandings. Guarantee that her graduation and future prospects will suffer no adverse consequences from the university's interference.

2. Strengthen institutional constraints. Make the university's summons policy explicit to all students and faculty, issue implementation rules, and put the "student-centered" principle into practice: students' normal study and life must come first, and arbitrarily summoning students under the pretext of "caring for them," in violation of their lawful rights, must be strictly prohibited.

3. Fully protect students' lawful rights. Before conducting any talk, explain the relevant circumstances openly, inform the student that they are free to decline, and never go behind a student's back to pressure their family or friends.

4. Improve public oversight of and checks on the summons system. On the premise of students' voluntary consent, the details of such talks
The [UNIX command](http://en.wikipedia.org/wiki/Rm_(Unix)) `rm -rf` for node.
Install with `npm install rimraf`, or just drop rimraf.js somewhere.
## API
`rimraf(f, [opts], callback)`
The first parameter will be interpreted as a globbing pattern for files. If you
want to disable globbing you can do so with `opts.disableGlob` (defaults to
`false`). This might be handy, for instance, if you have filenames that contain
globbing wildcard characters.
The callback will be called with an error if there is one. Certain
errors are handled for you:
* Windows: `EBUSY` and `ENOTEMPTY` - rimraf will back off a maximum of
`opts.maxBusyTries` times before giving up, adding 100ms of wait
between each attempt. The default `maxBusyTries` is 3.
* `ENOENT` - If the file doesn't exist, rimraf will return
successfully, since your desired outcome is already the case.
* `EMFILE` - Since `readdir` requires opening a file descriptor, it's
possible to hit `EMFILE` if too many file descriptors are in use.
In the sync case, there's nothing to be done for this. But in the
async case, rimraf will gradually back off with timeouts up to
`opts.emfileWait` ms, which defaults to 1000.
## rimraf.sync
It can remove stuff synchronously, too. But that's not so good. Use
the async API. It's better.
## CLI
If installed with `npm install rimraf -g` it can be used as a global
command `rimraf <path> [<path> ...]` which is useful for cross platform support.
## mkdirp
If you need to create a directory recursively, check out
[mkdirp](https://github.com/substack/node-mkdirp).
/**
******************************************************************************
* @file usb_bsp.c
* @author MCD Application Team
* @version V2.2.1
* @date 17-March-2018
* @brief This file implements the board support package for the USB host library
******************************************************************************
* @attention
*
* <h2><center>© Copyright (c) 2015 STMicroelectronics.
* All rights reserved.</center></h2>
*
* This software component is licensed by ST under Ultimate Liberty license
* SLA0044, the "License"; You may not use this file except in compliance with
* the License. You may obtain a copy of the License at:
* <http://www.st.com/SLA0044>
*
******************************************************************************
*/
/* Includes ------------------------------------------------------------------ */
#include "usbh_usr.h"
#include "usb_bsp.h"
#include "usb_hcd_int.h"
#include "usbh_core.h"
#include "delay.h"
#include "variants.h"
#ifdef U_DISK_SUPPORT
/**
* @brief USB_OTG_BSP_Init
* Initializes BSP configurations
* @param None
* @retval None
*/
void USB_OTG_BSP_Init(USB_OTG_CORE_HANDLE * pdev)
{
// EXTI_InitTypeDef EXTI_InitStructure;
#ifdef STM32F10X_CL
#if defined(MKS_32_V1_4)
RCC_OTGFSCLKConfig(RCC_OTGFSCLKSource_PLLVCO_Div2);
#else
RCC_OTGFSCLKConfig(RCC_OTGFSCLKSource_PLLVCO_Div3);
#endif
RCC_AHBPeriphClockCmd(RCC_AHBPeriph_OTG_FS, ENABLE);
#else // USE_STM322xG_EVAL
GPIO_InitTypeDef GPIO_InitStructure;
#ifdef USE_USB_OTG_FS
RCC_AHB1PeriphClockCmd(RCC_AHB1Periph_GPIOA, ENABLE);
/* Configure DM DP Pins */
GPIO_InitStructure.GPIO_Pin = GPIO_Pin_11 | GPIO_Pin_12;
GPIO_InitStructure.GPIO_Speed = GPIO_Speed_100MHz;
GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AF;
GPIO_InitStructure.GPIO_OType = GPIO_OType_PP;
GPIO_InitStructure.GPIO_PuPd = GPIO_PuPd_NOPULL;
GPIO_Init(GPIOA, &GPIO_InitStructure);
GPIO_PinAFConfig(GPIOA, GPIO_PinSource11, GPIO_AF_OTG1_FS);
GPIO_PinAFConfig(GPIOA, GPIO_PinSource12, GPIO_AF_OTG1_FS);
RCC_APB2PeriphClockCmd(RCC_APB2Periph_SYSCFG, ENABLE);
RCC_AHB2PeriphClockCmd(RCC_AHB2Periph_OTG_FS, ENABLE);
#else // USE_USB_OTG_HS
#ifdef USE_ULPI_PHY // ULPI
RCC_AHB1PeriphClockCmd(RCC_AHB1Periph_GPIOA | RCC_AHB1Periph_GPIOB |
RCC_AHB1Periph_GPIOC | RCC_AHB1Periph_GPIOH |
RCC_AHB1Periph_GPIOI, ENABLE);
GPIO_PinAFConfig(GPIOA, GPIO_PinSource3, GPIO_AF_OTG2_HS); // D0
GPIO_PinAFConfig(GPIOA, GPIO_PinSource5, GPIO_AF_OTG2_HS); // CLK
GPIO_PinAFConfig(GPIOB, GPIO_PinSource0, GPIO_AF_OTG2_HS); // D1
GPIO_PinAFConfig(GPIOB, GPIO_PinSource1, GPIO_AF_OTG2_HS); // D2
GPIO_PinAFConfig(GPIOB, GPIO_PinSource5, GPIO_AF_OTG2_HS); // D7
GPIO_PinAFConfig(GPIOB, GPIO_PinSource10, GPIO_AF_OTG2_HS); // D3
GPIO_PinAFConfig(GPIOB, GPIO_PinSource11, GPIO_AF_OTG2_HS); // D4
GPIO_PinAFConfig(GPIOB, GPIO_PinSource12, GPIO_AF_OTG2_HS); // D5
GPIO_PinAFConfig(GPIOB, GPIO_PinSource13, GPIO_AF_OTG2_HS); // D6
GPIO_PinAFConfig(GPIOH, GPIO_PinSource4, GPIO_AF_OTG2_HS); // NXT
GPIO_PinAFConfig(GPIOI, GPIO_PinSource11, GPIO_AF_OTG2_HS); // DIR
GPIO_PinAFConfig(GPIOC, GPIO_PinSource0, GPIO_AF_OTG2_HS); // STP
// CLK
GPIO_InitStructure.GPIO_Pin = GPIO_Pin_5;
GPIO_InitStructure.GPIO_Speed = GPIO_Speed_100MHz;
GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AF;
GPIO_Init(GPIOA, &GPIO_InitStructure);
// D0
GPIO_InitStructure.GPIO_Pin = GPIO_Pin_3;
GPIO_InitStructure.GPIO_Speed = GPIO_Speed_100MHz;
GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AF;
GPIO_InitStructure.GPIO_OType = GPIO_OType_PP;
GPIO_InitStructure.GPIO_PuPd = GPIO_PuPd_NOPULL;
GPIO_Init(GPIOA, &GPIO_InitStructure);
// D1 D2 D3 D4 D5 D6 D7
GPIO_InitStructure.GPIO_Pin = GPIO_Pin_0 | GPIO_Pin_1 |
GPIO_Pin_5 | GPIO_Pin_10 | GPIO_Pin_11 | GPIO_Pin_12 | GPIO_Pin_13;
GPIO_InitStructure.GPIO_Speed = GPIO_Speed_100MHz;
GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AF;
GPIO_InitStructure.GPIO_OType = GPIO_OType_PP;
GPIO_InitStructure.GPIO_PuPd = GPIO_PuPd_NOPULL;
GPIO_Init(GPIOB, &GPIO_InitStructure);
// STP
GPIO_InitStructure.GPIO_Pin = GPIO_Pin_0;
GPIO_InitStructure.GPIO_Speed = GPIO_Speed_100MHz;
GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AF;
GPIO_Init(GPIOC, &GPIO_InitStructure);
// NXT
GPIO_InitStructure.GPIO_Pin = GPIO_Pin_4;
GPIO_InitStructure.GPIO_Speed = GPIO_Speed_100MHz;
GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AF;
GPIO_Init(GPIOH, &GPIO_InitStructure);
// DIR
GPIO_InitStructure.GPIO_Pin = GPIO_Pin_11;
GPIO_InitStructure.GPIO_Speed = GPIO_Speed_100MHz;
GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AF;
GPIO_Init(GPIOI, &GPIO_InitStructure);
RCC_AHB1PeriphClockCmd(RCC_AHB1Periph_OTG_HS |
RCC_AHB1Periph_OTG_HS_ULPI, ENABLE);
#else
RCC_AHB1PeriphClockCmd(RCC_AHB1Periph_GPIOB, ENABLE);
GPIO_InitStructure.GPIO_Pin = GPIO_Pin_12 | GPIO_Pin_14 | GPIO_Pin_15;
GPIO_InitStructure.GPIO_Speed = GPIO_Speed_100MHz;
GPIO_InitStructure.GPIO_Mode = GPIO_Mode_AF;
GPIO_Init(GPIOB, &GPIO_InitStructure);
GPIO_PinAFConfig(GPIOB, GPIO_PinSource12, GPIO_AF_OTG2_FS);
GPIO_PinAFConfig(GPIOB, GPIO_PinSource14, GPIO_AF_OTG2_FS);
GPIO_PinAFConfig(GPIOB, GPIO_
Usage
=====
To install and use `waveform`, simply run:
```
$ go install github.com/mdlayher/waveform/...
```
The `waveform` binary is now installed in your `$GOPATH`. It has several options available
for generating waveform images:
```
$ waveform -h
Usage of waveform:
-alt="": hex alternate color of output waveform image
-bg="#FFFFFF": hex background color of output waveform image
-fg="#000000": hex foreground color of output waveform image
-fn="solid": function used to color output waveform image [options: fuzz, gradient, solid, stripe]
-resolution=1: number of times audio is read and drawn per second of audio
-sharpness=1: sharpening factor used to add curvature to a scaled image
-x=1: scaling factor for image X-axis
-y=1: scaling factor for image Y-axis
```
`waveform` currently supports both WAV and FLAC audio files. An audio stream must
be passed on `stdin`, and the resulting, PNG-encoded image will be written to `stdout`.
Any errors which occur will be written to `stderr`.
#include "defines.hpp"
#include <sstream>
#include <string>
#include <vector>
#include <components/debug/debuglog.hpp>
#include <components/misc/stringops.hpp>
namespace Interpreter{
bool check(const std::string& str, const std::string& escword, unsigned int* i, unsigned int* start)
{
bool retval = str.find(escword) == 0;
if(retval){
(*i) += escword.length();
(*start) = (*i) + 1;
}
return retval;
}
std::vector<std::string> globals;
bool longerStr(const std::string& a, const std::string& b)
{
return a.length() > b.length();
}
std::string fixDefinesReal(std::string text, bool dialogue, Context& context)
{
unsigned int start = 0;
std::ostringstream retval;
for(unsigned int i = 0; i < text.length(); i++)
{
char eschar = text[i];
if(eschar == '%' || eschar == '^')
{
retval << text.substr(start, i - start);
std::string temp = Misc::StringUtils::lowerCase(text.substr(i+1, 100));
bool found = false;
try
{
if( (found = check(temp, "actionslideright", &i, &start))){
retval << context.getActionBinding("#{sRight}");
}
else if((found = check(temp, "actionreadymagic", &i, &start))){
retval << context.getActionBinding("#{sReady_Magic}");
}
else if((found = check(temp, "actionprevweapon", &i, &start))){
retval << context.getActionBinding("#{sPrevWeapon}");
}
else if((found = check(temp, "actionnextweapon", &i, &start))){
retval << context.getActionBinding("#{sNextWeapon}");
}
else if((found = check(temp, "actiontogglerun", &i, &start))){
retval << context.getActionBinding("#{sAuto_Run}");
}
else if((found = check(temp, "actionslideleft", &i, &start))){
retval << context.getActionBinding("#{sLeft}");
}
else if((found = check(temp, "actionreadyitem", &i, &start))){
retval << context.getActionBinding("#{sReady_Weapon}");
}
else if((found = check(temp, "actionprevspell", &i, &start))){
retval << context.getActionBinding("#{sPrevSpell}");
}
else if((found = check(temp, "actionnextspell", &i, &start))){
retval << context.getActionBinding("#{sNextSpell}");
}
else if((found = check(temp, "actionrestmenu", &i, &start))){
retval << context.getActionBinding("#{sRestKey}");
}
else if((found = check(temp, "actionmenumode", &i, &start))){
retval << context.getActionBinding("#{sInventory}");
}
else if((found = check(temp, "actionactivate", &i, &start))){
retval << context.getActionBinding("#{sActivate}");
}
else if((found = check(temp, "actionjournal", &i, &start))){
retval << context.getActionBinding("#{sJournal}");
}
else if((found = check(temp, "actionforward", &i, &start))){
retval << context.getActionBinding("#{sForward}");
}
else if((found = check(temp, "pccrimelevel", &i, &start))){
retval << context.getPCBounty();
}
else if((found = check(temp, "actioncrouch", &i, &start))){
retval << context.getActionBinding("#{sCrouch_Sneak}");
}
else if((found = check(temp, "actionjump", &i, &start))){
retval << context.getActionBinding("#{sJump}");
}
else if((found = check(temp, "actionback", &i, &start))){
retval << context.getActionBinding("#{sBack}");
}
else if((found = check(temp, "actionuse", &i, &start))){
retval << context.getActionBinding("#{sUse}");
}
else if((found = check(temp, "actionrun", &i, &start))){
retval << context.getActionBinding("#{sRun}");
}
else if((found = check(temp, "pcclass", &i, &start))){
retval << context.getPCClass();
}
else if((found = check(temp, "pcrace", &i, &start))){
retval << context.getPCRace();
}
else if((found = check(temp, "pcname", &i, &start))){
retval << context.getPCName();
}
else if((found = check(temp, "cell", &i, &start))){
retval << context.getCurrentCellName();
}
else if(dialogue) { // In Dialogue, not messagebox
if( (found = check(temp, "faction", &i, &start))){
retval << context.getNPCFaction();
}
else if((found = check(temp, "nextpcrank", &i, &start))){
retval << context.getPCNextRank();
}
else if((found = check(temp, "pcnextrank", &i, &start))){
retval << context.getPCNextRank();
}
else if((found = check(temp, "pcrank", &i, &start))){
retval << context.getPCRank();
}
else if((found = check(temp, "rank", &i, &start))){
retval << context.getNPCRank();
}
else if((found = check(temp, "class", &i, &start))){
retval << context.getNPCClass();
}
else if((found = check(temp, "race", &i, &start))){
retval << context.getNPCRace();
}
else if((found = check(temp, "name", &i, &start))){
retval << context.getActorName();
}
}
else { // In messagebox or book, not dialogue
/* empty outside dialogue */
if( (found = check(temp, "faction", &i, &start)));
else if((found = check(temp, "nextpcrank", &i, &start)));
else if((found = check(temp, "pcnextrank", &i, &start)));
else if((found = check(temp, "pcrank", &i, &start)));
else if((found = check(temp, "rank", &i, &start)));
/* uses pc in messageboxes */
else if((found = check(temp, "class", &i, &start))){
retval << context.getPCClass();
}
else if((found = check(temp, "race", &i, &start))){
retval << context.getPCRace();
}
else if((found = check(temp, "name", &i, &start))){
retval << context.getPCName();
}
}
/* Not a builtin, try global variables */
if(!found){
/* if list of globals is empty, grab it and sort it by descending string length */
if(globals.empty()){
globals = context.getGlobals();
sort(globals.begin(), globals.end(), longerStr);
}
for(unsigned int j = 0; j < globals.size(); j++){
if(globals[j].length() > temp.length()){ // Just in case there's a global with a huuuge name
temp = text.substr(i+1, globals[j].length
// Copyright 1998-2017 Epic Games, Inc. All Rights Reserved.
#include "CoreMinimal.h"
#include "Interfaces/NetworkPredictionInterface.h"
UNetworkPredictionInterface::UNetworkPredictionInterface(const FObjectInitializer& ObjectInitializer)
: Super(ObjectInitializer)
{
}
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import json
from alipay.aop.api.constant.ParamConstants import *
class ZolozIdentificationCustomerEnrollcertifyQueryModel(object):
def __init__(self):
self._biz_id = None
self._face_type = None
self._need_img = None
self._zim_id = None
@property
def biz_id(self):
return self._biz_id
@biz_id.setter
def biz_id(self, value):
self._biz_id = value
@property
def face_type(self):
return self._face_type
@face_type.setter
def face_type(self, value):
self._face_type = value
@property
def need_img(self):
return self._need_img
@need_img.setter
def need_img(self, value):
self._need_img = value
@property
def zim_id(self):
return self._zim_id
@zim_id.setter
def zim_id(self, value):
self._zim_id = value
def to_alipay_dict(self):
params = dict()
if self.biz_id:
if hasattr(self.biz_id, 'to_alipay_dict'):
params['biz_id'] = self.biz_id.to_alipay_dict()
else:
params['biz_id'] = self.biz_id
if self.face_type:
if hasattr(self.face_type, 'to_alipay_dict'):
params['face_type'] = self.face_type.to_alipay_dict()
else:
params['face_type'] = self.face_type
if self.need_img:
if hasattr(self.need_img, 'to_alipay_dict'):
params['need_img'] = self.need_img.to_alipay_dict()
else:
params['need_img'] = self.need_img
if self.zim_id:
if hasattr(self.zim_id, 'to_alipay_dict'):
params['zim_id'] = self.zim_id.to_alipay_dict()
else:
params['zim_id'] = self.zim_id
return params
@staticmethod
def from_alipay_dict(d):
if not d:
return None
o = ZolozIdentificationCustomerEnrollcertifyQueryModel()
if 'biz_id' in d:
o.biz_id = d['biz_id']
if 'face_type' in d:
o.face_type = d['face_type']
if 'need_img' in d:
o.need_img = d['need_img']
if 'zim_id' in d:
o.zim_id = d['zim_id']
return o
/* impure.c. Handling of re-entrancy data structure for OpenRISC 1000.
Copyright (C) 2014, Authors
Contributor Stefan Wallentowitz <stefan.wallentowitz@tum.de>
* The authors hereby grant permission to use, copy, modify, distribute,
* and license this software and its documentation for any purpose, provided
* that existing copyright notices are retained in all copies and that this
* notice is included verbatim in any distributions. No written agreement,
* license, or royalty fee is required for any of the authorized uses.
* Modifications to this software may be copyrighted by their authors
* and need not follow the licensing terms described here, provided that
* the new terms are clearly indicated on the first page of each file where
* they apply.
*/
#include <reent.h>
#include "or1k-internals.h"
#include <string.h>
/* As an exception handler may also use the library, it is better to use
* a different re-entrancy data structure for the exceptions.
* This data structure is configured here and as part of the exception
* handler (or1k_exception_handler) temporarily replaces the software's
* impure data pointer.
*
* During initialization, the libraries standard _impure_data and the exception
* impure data (_exception_impure_data) are initialized. Afterwards,
* the current value _current_impure_ptr is set to _impure_ptr.
*
* At runtime __getreent is called to return the current reentrancy pointer,
* which is stored in _current_impure_ptr.
*
* In the or1k_exception_handler the _current_impure_ptr is set to point to
* _exception_impure_ptr. After the exception handler returned, it is set back
* to _impure_ptr.
*/
/* Link in the external impure_data structure */
extern struct _reent *__ATTRIBUTE_IMPURE_PTR__ _impure_ptr;
#ifdef __OR1K_MULTICORE__
struct _reent **_or1k_impure_ptr;
struct _reent **_or1k_exception_impure_ptr;
struct _reent **_or1k_current_impure_ptr;
#else
struct _reent *__ATTRIBUTE_IMPURE_PTR__ _or1k_impure_ptr;
/* Create exception impure data structure */
static struct _reent _or1k_exception_impure_data = _REENT_INIT (_or1k_exception_impure_data);
/* Link to the exception impure data structure */
struct _reent *__ATTRIBUTE_IMPURE_PTR__ _or1k_exception_impure_ptr = &_or1k_exception_impure_data;
/* Link to the currently used data structure. */
struct _reent *__ATTRIBUTE_IMPURE_PTR__ _or1k_current_impure_ptr;
#endif
#ifdef __OR1K_MULTICORE__
#define OR1K_LIBC_GETREENT _or1k_current_impure_ptr[or1k_coreid()]
#else
#define OR1K_LIBC_GETREENT _or1k_current_impure_ptr
#endif
void
_or1k_libc_impure_init (void)
{
#ifdef __OR1K_MULTICORE__
uint32_t c;
_or1k_impure_ptr = _sbrk_r(0, sizeof(struct _reent*) * or1k_numcores());
_or1k_exception_impure_ptr = _sbrk_r(0, sizeof(struct _reent*) * or1k_numcores());
_or1k_current_impure_ptr = _sbrk_r(0, sizeof(struct _reent*) * or1k_numcores());
_or1k_impure_ptr[0] = _impure_ptr;
_REENT_INIT_PTR(_impure_ptr);
for (c = 1; c < or1k_numcores(); c++) {
_or1k_impure_ptr[c] = _sbrk_r(0, sizeof(struct _reent));
_REENT_INIT_PTR(_or1k_impure_ptr[c]);
}
for (c = 0; c < or1k_numcores(); c++) {
_or1k_exception_impure_ptr[c] = _sbrk_r(0, sizeof(struct _reent));
_REENT_INIT_PTR(_or1k_exception_impure_ptr[c]);
}
for (c = 0; c < or1k_numcores(); c++) {
_or1k_current_impure_ptr[c] = _or1k_impure_ptr[c];
}
#else
// Initialize both impure data structures
_REENT_INIT_PTR (_impure_ptr);
_REENT_INIT_PTR (_or1k_exception_impure_ptr);
// Use the standard impure ptr during normal software run
_or1k_impure_ptr = _impure_ptr;
// Set current to standard impure pointer
_or1k_current_impure_ptr = _impure_ptr;
#endif
}
struct _reent*
_or1k_libc_getreent(void) {
return OR1K_LIBC_GETREENT;
}
#ifdef __OR1K_MULTICORE__
struct _or1k_reent (*_or1k_reent)[];
#else
struct _or1k_reent _or1k_reent;
#endif
void
_or1k_reent_init(void)
{
#ifdef __OR1K_MULTICORE__
size_t memsize = sizeof(struct _or1k_reent) * or1k_numcores();
_or1k_reent = (struct _or1k_reent*) _sbrk_r(0, memsize);
#endif
}
; This file is for use with available_externally_a.ll
; RUN: true
@foo = hidden unnamed_addr constant i32 0
{
"name": "angular",
"version": "1.4.8",
"description": "HTML enhanced for web apps",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"repository": {
"type": "git",
"url": "https://github.com/angular/angular.js.git"
},
"keywords": [
"angular",
"framework",
"browser",
"client-side"
],
"author": "Angular Core Team <angular-core+npm@google.com>",
"license": "MIT",
"bugs": {
"url": "https://github.com/angular/angular.js/issues"
},
"homepage": "http://angularjs.org"
}
/*
Copyright 2017 The Kubernetes Authors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
package storage
import (
"fmt"
"mime"
"k8s.io/apimachinery/pkg/runtime"
"k8s.io/apimachinery/pkg/runtime/schema"
"k8s.io/apimachinery/pkg/runtime/serializer/recognizer"
"k8s.io/apiserver/pkg/storage/storagebackend"
)
// StorageCodecConfig are the arguments passed to newStorageCodecFn
type StorageCodecConfig struct {
StorageMediaType string
StorageSerializer runtime.StorageSerializer
StorageVersion schema.GroupVersion
MemoryVersion schema.GroupVersion
Config storagebackend.Config
EncoderDecoratorFn func(runtime.Encoder) runtime.Encoder
DecoderDecoratorFn func([]runtime.Decoder) []runtime.Decoder
}
// NewStorageCodec assembles a storage codec for the provided storage media type, the provided serializer, and the requested
// storage and memory versions.
func NewStorageCodec(opts StorageCodecConfig) (runtime.Codec, runtime.GroupVersioner, error) {
mediaType, _, err := mime.ParseMediaType(opts.StorageMediaType)
if err != nil {
return nil, nil, fmt.Errorf("%q is not a valid mime-type", opts.StorageMediaType)
}
serializer, ok := runtime.SerializerInfoForMediaType(opts.StorageSerializer.SupportedMediaTypes(), mediaType)
if !ok {
return nil, nil, fmt.Errorf("unable to find serializer for %q", mediaType)
}
s := serializer.Serializer
// Give callers the opportunity to wrap encoders and decoders. For decoders, each returned decoder will
// be passed to the recognizer so that multiple decoders are available.
var encoder runtime.Encoder = s
if opts.EncoderDecoratorFn != nil {
encoder = opts.EncoderDecoratorFn(encoder)
}
decoders := []runtime.Decoder{
// selected decoder as the primary
s,
// universal deserializer as a fallback
opts.StorageSerializer.UniversalDeserializer(),
// base64-wrapped universal deserializer as a last resort.
// this allows reading base64-encoded protobuf, which should only exist if etcd2+protobuf was used at some point.
// data written that way could exist in etcd2, or could have been migrated to etcd3.
// TODO: flag this type of data if we encounter it, require migration (read to decode, write to persist using a supported encoder), and remove in 1.8
runtime.NewBase64Serializer(nil, opts.StorageSerializer.UniversalDeserializer()),
}
if opts.DecoderDecoratorFn != nil {
decoders = opts.DecoderDecoratorFn(decoders)
}
encodeVersioner := runtime.NewMultiGroupVersioner(
opts.StorageVersion,
schema.GroupKind{Group: opts.StorageVersion.Group},
schema.GroupKind{Group: opts.MemoryVersion.Group},
)
// Ensure the storage receives the correct version.
encoder = opts.StorageSerializer.EncoderForVersion(
encoder,
encodeVersioner,
)
decoder := opts.StorageSerializer.DecoderToVersion(
recognizer.NewDecoder(decoders...),
runtime.NewCoercingMultiGroupVersioner(
opts.MemoryVersion,
schema.GroupKind{Group: opts.MemoryVersion.Group},
schema.GroupKind{Group: opts.StorageVersion.Group},
),
)
return runtime.NewCodec(encoder, decoder), encodeVersioner, nil
}
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import _init_paths
import os
import cv2
import time
# from opts import opts
from opts_pose import opts
from detectors.detector_factory import detector_factory
image_ext = ['jpg', 'jpeg', 'png', 'webp']
video_ext = ['mp4', 'mov', 'avi', 'mkv']
time_stats = ['tot', 'load', 'pre', 'net', 'dec', 'post', 'merge']
def demo(opt):
os.environ['CUDA_VISIBLE_DEVICES'] = opt.gpus_str
Detector = detector_factory[opt.task]
detector = Detector(opt)
if opt.demo == 'webcam' or \
opt.demo[opt.demo.rfind('.') + 1:].lower() in video_ext:
cam = cv2.VideoCapture(0 if opt.demo == 'webcam' else opt.demo)
detector.pause = False
i = 0
start_time = time.time()
if opt.output_video:
fourcc = cv2.VideoWriter_fourcc(*'mp4v')  # for .mp4 output, the FourCC code needs to be mp4v
im_width = int(cam.get(cv2.CAP_PROP_FRAME_WIDTH))
im_height = int(cam.get(cv2.CAP_PROP_FRAME_HEIGHT))
write_cap = cv2.VideoWriter(opt.output_video, fourcc, 25, (im_width, im_height))
while cam.grab():
i += 1
_, img = cam.retrieve()
cv2.imshow('input', img)
ret = detector.run(img)
time_str = ''
for stat in time_stats:
time_str = time_str + '{} {:.3f}s |'.format(stat, ret[stat])
if opt.output_video:
write_cap.write(ret['plot_img'])
print('fps:{:.3f}'.format(i/(time.time()-start_time)), time_str)
if cv2.waitKey(1) == 27:
return # esc to quit
else:
if os.path.isdir(opt.demo):
image_names = []
ls = os.listdir(opt.demo)
for file_name in sorted(ls):
ext = file_name[file_name.rfind('.') + 1:].lower()
if ext in image_ext:
image_names.append(os.path.join(opt.demo, file_name))
else:
image_names = [opt.demo]
for (image_name) in image_names:
ret = detector.run(image_name)
time_str = ''
for stat in time_stats:
time_str = time_str + '{} {:.3f}s |'.format(stat, ret[stat])
print(time_str)
if __name__ == '__main__':
opt = opts().init()
demo(opt)
# ignore-walk
Nested/recursive `.gitignore`/`.npmignore` parsing and filtering.
Walk a directory creating a list of entries, parsing any `.ignore`
files met along the way to exclude files.
## USAGE
```javascript
const walk = require('ignore-walk')
// All options are optional, defaults provided.
// this function returns a promise, but you can also pass a cb
// if you like that approach better.
walk({
path: '...', // root dir to start in. defaults to process.cwd()
ignoreFiles: [ '.gitignore' ], // list of filenames. defaults to ['.ignore']
includeEmpty: true|false, // true to include empty dirs, default false
follow: true|false // true to follow symlink dirs, default false
}, callback)
// to walk synchronously, do it this way:
const result = walk.sync({ path: '/wow/such/filepath' })
```
If you want to get at the underlying classes, they're at `walk.Walker`
and `walk.WalkerSync`.
## OPTIONS
* `path` The path to start in. Defaults to `process.cwd()`
* `ignoreFiles` Filenames to treat as ignore files. The default is
`['.ignore']`. (This is where you'd put `.gitignore` or
`.npmignore` or whatever.) If multiple ignore files are in a
directory, then rules from each are applied in the order that the
files are listed.
* `includeEmpty` Set to `true` to include empty directories, assuming
they are not excluded by any of the ignore rules. If not set, then
this follows the standard `git` behavior of not including
directories that are empty.
Note: this will cause an empty directory to be included if it
would contain an included entry, even if it would have otherwise
been excluded itself.
For example, given the rules `*` (ignore everything) and `!/a/b/c`
(re-include the entry at `/a/b/c`), the directory `/a/b` will be
included if it is empty.
* `follow` Set to `true` to treat symbolically linked directories as
directories, recursing into them. There is no handling for nested
symlinks, so `ELOOP` errors can occur in some cases when using this
option. Defaults to `false`.
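The last-match-wins ordering of ignore rules described in the options above can be sketched in a few lines. This is not ignore-walk's real implementation (which handles full minimatch patterns and nested ignore files); the matcher here is deliberately tiny, supporting only `*` and exact rooted paths:

```javascript
// Sketch of last-match-wins ignore-rule evaluation: rules are applied in
// order, and a later match (negated or not) overrides an earlier one.
function isIgnored(entry, rules) {
  let ignored = false;
  for (const rule of rules) {
    const negated = rule.startsWith('!');
    const pattern = negated ? rule.slice(1) : rule;
    if (matches(entry, pattern)) ignored = !negated;
  }
  return ignored;
}

// Deliberately tiny matcher: '*' matches everything; otherwise the
// pattern must equal the entry's rooted path exactly.
function matches(entry, pattern) {
  if (pattern === '*') return true;
  return '/' + entry === pattern;
}

// The example from the options above: ignore everything, re-include /a/b/c.
console.log(isIgnored('a/b/c', ['*', '!/a/b/c'])); // prints "false"
console.log(isIgnored('a/x', ['*', '!/a/b/c']));   // prints "true"
```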
/*
* Hibernate OGM, Domain model persistence for NoSQL datastores
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later
* See the lgpl.txt file in the root directory or <http://www.gnu.org/licenses/lgpl-2.1.html>.
*/
package org.hibernate.ogm.backendtck.type.converter;
import javax.persistence.AttributeConverter;
/**
* @author Gunnar Morling
*/
public class MyStringToUpperCaseStringConverter implements AttributeConverter<MyString, String> {
@Override
public String convertToDatabaseColumn(MyString attribute) {
return attribute != null ? attribute.toString().toUpperCase() : null;
}
@Override
public MyString convertToEntityAttribute(String dbData) {
return dbData != null ? new MyString( dbData.toLowerCase() ) : null;
}
}
/*
Copyright (c) 2003-2015, CKSource - Frederico Knabben. All rights reserved.
For licensing, see LICENSE.md or http://ckeditor.com/license
*/
CKEDITOR.plugins.setLang( 'maximize', 'en-au', {
maximize: 'Maximize',
minimize: 'Minimize' // MISSING
} );
/*!
* jQVMap Version 1.0
*
* http://jqvmap.com
*
* Copyright 2012, Peter Schmalfeldt <manifestinteractive@gmail.com>
* Licensed under the MIT license.
*
* Fork Me @ https://github.com/manifestinteractive/jqvmap
*/
.jqvmap-label {
position: absolute;
display: none;
-webkit-border-radius: 3px;
-moz-border-radius: 3px;
border-radius: 3px;
background: #292929;
color: white;
font-family: sans-serif, Verdana;
font-size: smaller;
padding: 3px;
}
.jqvmap-zoomin, .jqvmap-zoomout {
position: absolute;
left: 10px;
-webkit-border-radius: 3px;
-moz-border-radius: 3px;
border-radius: 3px;
background: #000000;
padding: 0px 7px;
color: white;
cursor: pointer;
line-height: 20px;
text-align: center;
}
.jqvmap-zoomin {
top: 10px;
}
.jqvmap-zoomout {
top: 40px;
}
.jqvmap-region {
cursor: pointer;
}
.jqvmap-ajax_response {
width: 100%;
height: 500px;
}
{
"SourceFileNode": {
"statementList": [
{
"InlineHtml": {
"scriptSectionEndTag": null,
"text": null,
"scriptSectionStartTag": {
"kind": "ScriptSectionStartTag",
"textLength": 6
}
}
},
{
"ClassDeclaration": {
"attributes": null,
"abstractOrFinalModifier": null,
"classKeyword": {
"kind": "ClassKeyword",
"textLength": 5
},
"name": {
"kind": "Name",
"textLength": 1
},
"classBaseClause": null,
"classInterfaceClause": null,
"classMembers": {
"ClassMembersNode": {
"openBrace": {
"kind": "OpenBraceToken",
"textLength": 1
},
"classMemberDeclarations": [
{
"ClassConstDeclaration": {
"attributes": null,
"modifiers": [
{
"kind": "PublicKeyword",
"textLength": 6
}
],
"constKeyword": {
"kind": "ConstKeyword",
"textLength": 5
},
"constElements": {
"ConstElementList": {
"children": [
{
"ConstElement": {
"name": {
"kind": "Name",
"textLength": 1
},
"equalsToken": {
"kind": "EqualsToken",
"textLength": 1
},
"assignment": {
"Variable": {
"dollar": null,
"name": {
"kind": "VariableName",
"textLength": 2
}
}
}
}
}
]
}
},
"semicolon": {
"kind": "SemicolonToken",
"textLength": 1
}
}
}
],
"closeBrace": {
"kind": "CloseBraceToken",
"textLength": 1
}
}
}
}
}
],
"endOfFileToken": {
"kind": "EndOfFileToken",
"textLength": 0
}
}
}
|
{
"pile_set_name": "Github"
}
|
az: азербејџански
az_AZ: 'азербејџански (Азербејџан)'
az_Latn_AZ: 'азербејџански (латиница, Азербејџан)'
az_Latn: 'азербејџански (латиница)'
az_Cyrl_AZ: 'азербејџански (ћирилица, Азербејџан)'
az_Cyrl: 'азербејџански (ћирилица)'
ak: акан
ak_GH: 'акан (Гана)'
sq: албански
sq_AL: 'албански (Албанија)'
sq_XK: 'албански (Косово)'
sq_MK: 'албански (Македонија)'
am: амхарски
am_ET: 'амхарски (Етиопија)'
ar: арапски
ar_DZ: 'арапски (Алжир)'
ar_BH: 'арапски (Бахреин)'
ar_EG: 'арапски (Египат)'
ar_ER: 'арапски (Еритреја)'
ar_EH: 'арапски (Западна Сахара)'
ar_IL: 'арапски (Израел)'
ar_IQ: 'арапски (Ирак)'
ar_YE: 'арапски (Јемен)'
ar_JO: 'арапски (Јордан)'
ar_SS: 'арапски (Јужни Судан)'
ar_QA: 'арапски (Катар)'
ar_KM: 'арапски (Коморска Острва)'
ar_KW: 'арапски (Кувајт)'
ar_LB: 'арапски (Либан)'
ar_LY: 'арапски (Либија)'
ar_MA: 'арапски (Мароко)'
ar_MR: 'арапски (Мауританија)'
ar_OM: 'арапски (Оман)'
ar_PS: 'арапски (Палестинске територије)'
ar_SA: 'арапски (Саудијска Арабија)'
ar_SY: 'арапски (Сирија)'
ar_SO: 'арапски (Сомалија)'
ar_SD: 'арапски (Судан)'
ar_TN: 'арапски (Тунис)'
ar_AE: 'арапски (Уједињени Арапски Емирати)'
ar_TD: 'арапски (Чад)'
ar_DJ: 'арапски (Џибути)'
as: асамски
as_IN: 'асамски (Индија)'
af: африканс
af_ZA: 'африканс (Јужноафричка Република)'
af_NA: 'африканс (Намибија)'
bm: бамбара
bm_Latn_ML: 'бамбара (латиница, Мали)'
bm_Latn: 'бамбара (латиница)'
eu: баскијски
eu_ES: 'баскијски (Шпанија)'
be: белоруски
be_BY: 'белоруски (Белорусија)'
bn: бенгалски
bn_BD: 'бенгалски (Бангладеш)'
bn_IN: 'бенгалски (Индија)'
bs: босански
bs_BA: 'босански (Босна и Херцеговина)'
bs_Latn_BA: 'босански (латиница, Босна и Херцеговина)'
bs_Latn: 'босански (латиница)'
bs_Cyrl_BA: 'босански (ћирилица, Босна и Херцеговина)'
bs_Cyrl: 'босански (ћирилица)'
br: бретонски
br_FR: 'бретонски (Француска)'
bg: бугарски
bg_BG: 'бугарски (Бугарска)'
my: бурмански
my_MM: 'бурмански (Мијанмар (Бурма))'
cy: велшки
cy_GB: 'велшки (Велика Британија)'
vi: вијетнамски
vi_VN: 'вијетнамски (Вијетнам)'
gl: галицијски
gl_ES: 'галицијски (Шпанија)'
lg: ганда
lg_UG: 'ганда (Уганда)'
ka: грузијски
ka_GE: 'грузијски (Грузија)'
el: грчки
el_GR: 'грчки (Грчка)'
el_CY: 'грчки (Кипар)'
gu: гуџарати
gu_IN: 'гуџарати (Индија)'
da: дански
da_GL: 'дански (Гренланд)'
da_DK: 'дански (Данска)'
ee: еве
ee_GH: 'еве (Гана)'
ee_TG: 'еве (Того)'
en: енглески
en_VI: 'енглески (Америчка Девичанска Острва)'
en_AS: 'енглески (Америчка Самоа)'
en_AI: 'енглески (Ангвила)'
en_AG: 'енглески (Антигва и Барбуда)'
en_AU: 'енглески (Аустралија)'
en_BB: 'енглески (Барбадос)'
en_BS: 'енглески (Бахами)'
en_BE: 'енглески (Белгија)'
en_BZ: 'енглески (Белизе)'
en_BM: 'енглески (Бермуда)'
en_CX: 'енглески (Божићно острво)'
en_BW: 'енглески (Боцвана)'
en_VG: 'енглески (Британска Девичанска Острва)'
en_IO: 'енглески (Британска територија у Индијском океану)'
en_VU: 'енглески (Вануату)'
en_GB: 'енглески (Велика Британија)'
en_GM: 'енглески (Гамбија)'
en_GH: 'енглески (Гана)'
en_GY: 'енглески (Гвајана)'
en_GI: 'енглески (Гибралтар)'
en_GD: 'енглески (Гренада)'
en_GU: 'енглески (Гуам)'
en_GG: 'енглески (Гурнси)'
en_DG
|
{
"pile_set_name": "Github"
}
|
// Copyright (c) 2020, NVIDIA CORPORATION. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#include <gtest/gtest.h>
#include <cmath>
#include <complex>
#include <tuple>
#include <vector>
#include "dali/kernels/common/utils.h"
#include "dali/kernels/imgproc/convolution/baseline_convolution.h"
#include "dali/kernels/imgproc/convolution/separable_convolution_cpu.h"
#include "dali/kernels/scratch.h"
#include "dali/test/tensor_test_utils.h"
#include "dali/test/test_tensors.h"
namespace dali {
namespace kernels {
template <typename T>
void InitTriangleWindow(const TensorView<StorageCPU, T, 1> &window) {
int radius = window.num_elements() / 2;
for (int i = 0; i < radius; i++) {
*window(i) = i + 1;
*window(window.num_elements() - i - 1) = i + 1;
}
*window(radius) = radius + 1;
}
TEST(SeparableConvolutionTest, Axes1WithChannels) {
std::array<int, 1> window_dims = {5};
TestTensorList<float, 1> kernel_window;
TestTensorList<float, 2> input;
TestTensorList<int, 2> output, baseline_output;
TensorListShape<2> data_shape = uniform_list_shape<2>(1, {16, 3});
kernel_window.reshape(uniform_list_shape<1>(1, {window_dims[0]}));
input.reshape(data_shape);
output.reshape(data_shape);
baseline_output.reshape(data_shape);
auto kernel_window_v = kernel_window.cpu()[0];
auto in_v = input.cpu()[0];
auto out_v = output.cpu()[0];
auto baseline_out_v = baseline_output.cpu()[0];
std::mt19937 rng;
UniformRandomFill(in_v, rng, 0, 255);
InitTriangleWindow(kernel_window_v);
SeparableConvolutionCpu<int, float, float, 1, true> kernel;
KernelContext ctx;
auto req = kernel.Setup(ctx, data_shape[0], window_dims);
ScratchpadAllocator scratch_alloc;
scratch_alloc.Reserve(req.scratch_sizes);
auto scratchpad = scratch_alloc.GetScratchpad();
ctx.scratchpad = &scratchpad;
kernel.Run(ctx, out_v, in_v,
uniform_array<1, TensorView<StorageCPU, const float, 1>>(kernel_window_v));
testing::BaselineConvolve(baseline_out_v, in_v, kernel_window_v, 0, window_dims[0] / 2);
Check(out_v, baseline_out_v);
}
TEST(SeparableConvolutionTest, Axes1NoChannels) {
std::array<int, 1> window_dims = {5};
TestTensorList<float, 1> kernel_window;
TestTensorList<float, 2> input;
TestTensorList<int, 1> output;
TestTensorList<int, 2> baseline_output;
TensorListShape<2> data_shape = uniform_list_shape<2>(1, {16, 1});
kernel_window.reshape(uniform_list_shape<1>(1, {window_dims[0]}));
input.reshape(data_shape);
output.reshape(data_shape.first<1>());
baseline_output.reshape(data_shape);
auto kernel_window_v = kernel_window.cpu()[0];
auto baseline_in_v = input.cpu()[0];
TensorView<StorageCPU, float, 1> in_v = {baseline_in_v.data, baseline_in_v.shape.first<1>()};
auto out_v = output.cpu()[0];
auto baseline_out_v = baseline_output.cpu()[0];
std::mt19937 rng;
UniformRandomFill(in_v, rng, 0, 255);
InitTriangleWindow(kernel_window_v);
SeparableConvolutionCpu<int, float, float, 1, false> kernel;
KernelContext ctx;
auto req = kernel.Setup(ctx, data_shape[0].first<1>(), window_dims);
ScratchpadAllocator scratch_alloc;
scratch_alloc.Reserve(req.scratch_sizes);
auto scratchpad = scratch_alloc.GetScratchpad();
ctx.scratchpad = &scratchpad;
kernel.Run(ctx, out_v, in_v,
uniform_array<1, TensorView<StorageCPU, const float, 1>>(kernel_window_v));
testing::BaselineConvolve(baseline_out_v, baseline_in_v, kernel_window_v, 0, window_dims[0] / 2);
TensorView<StorageCPU, int, 1> compare_v = {baseline_out_v.data, baseline_out_v.shape.first<1>()};
Check(out_v, compare_v);
}
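For intuition, here is a minimal Python sketch (not part of DALI) of what the tests above verify: the triangle window that `InitTriangleWindow` builds, and a same-size 1D convolution against it. Zero-valued borders are an assumption made here for illustration; the DALI baseline takes explicit border arguments and may handle them differently.

```python
def triangle_window(size):
    # Mirrors InitTriangleWindow: 1, 2, ..., radius+1, ..., 2, 1.
    radius = size // 2
    w = [0] * size
    for i in range(radius):
        w[i] = i + 1
        w[size - i - 1] = i + 1
    w[radius] = radius + 1
    return w

def convolve_same(signal, window):
    # Same-size 1D convolution; samples outside the border count as 0
    # (an assumption for this sketch, not necessarily DALI's behavior).
    radius = len(window) // 2
    out = []
    for i in range(len(signal)):
        acc = 0
        for k, wk in enumerate(window):
            j = i + k - radius
            if 0 <= j < len(signal):
                acc += wk * signal[j]
        out.append(acc)
    return out

print(triangle_window(5))                                  # [1, 2, 3, 2, 1]
print(convolve_same([0, 0, 1, 0, 0], triangle_window(5)))  # [1, 2, 3, 2, 1]
```

Convolving a unit impulse reproduces the window itself, which is why a triangle window makes a convenient test kernel.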
TEST(SeparableConvolutionTest, Axes2WithChannels) {
std::array<int, 2> window_dims = {5, 7};
TestTensorList<float, 1> kernel_window_0, kernel_window_1;
TestTensorList<int, 3> input;
TestTensorList<float, 3> intermediate;
TestTensorList<int, 3> output, baseline_output;
TensorListShape<3> data_shape = uniform_list_shape<3>(1, {20, 16, 3});
kernel_window_0.reshape(uniform_list_shape<1>(1, {window_dims[0]}));
kernel_window_1.reshape(uniform_list_shape<1>(1, {window_dims[1]}));
input.reshape(data_shape);
intermediate.reshape(data_shape);
output.reshape(data_shape);
baseline_output.reshape(data_shape);
auto kernel_window_0_v = kernel_window_0.cpu()[0];
auto kernel_window_1_v = kernel_window_1.cpu()[0];
auto in_v = input.cpu()[0];
auto interm_v = intermediate.cpu()[0];
auto out_v = output.cpu()[0];
auto baseline_out_v = baseline_output.cpu()[0];
std::mt19937 rng;
UniformRandomFill(in_v, rng, 0, 255);
InitTriangleWindow(kernel_window_0_v);
InitTriangleWindow(kernel_window_1_v);
SeparableConvolutionCpu<int, int, float, 2, true> kernel;
static_assert(
std::is_same<typename SeparableConvolutionCpu<int, int, float, 2, true>::Intermediate,
float>::value,
"Unexpected intermediate type");
KernelContext ctx;
auto req = kernel.Setup(ctx, data_shape[0], window_dims);
ScratchpadAllocator scratch_alloc;
scratch_alloc.Reserve(req.scratch_sizes);
auto scratchpad = scratch_alloc.GetScratchpad();
ctx.scratchpad = &scratchpad;
kernel.Run(ctx, out_v, in_v, {kernel_window_0_v, kernel_window_1_v});
testing::
|
{
"pile_set_name": "Github"
}
|
<?php
/**
* CoreShop.
*
* This source file is subject to the GNU General Public License version 3 (GPLv3)
* For the full copyright and license information, please view the LICENSE.md and gpl-3.0.txt
* files that are distributed with this source code.
*
* @copyright Copyright (c) 2015-2020 Dominik Pfaffenbauer (https://www.pfaffenbauer.at)
* @license https://www.coreshop.org/license GNU General Public License version 3 (GPLv3)
*/
declare(strict_types=1);
namespace CoreShop\Behat\Model\Index;
use CoreShop\Component\Index\Model\IndexableInterface;
use CoreShop\Component\Index\Model\IndexInterface;
use CoreShop\Component\Resource\Exception\ImplementedByPimcoreException;
use CoreShop\Component\Resource\Pimcore\Model\AbstractPimcoreModel;
class TestEnableIndex extends AbstractPimcoreModel implements IndexableInterface
{
/**
* {@inheritdoc}
*/
public function getIndexable(IndexInterface $index): bool
{
return $this->getIndexableEnabled($index) && $this->getPublished();
}
/**
* {@inheritdoc}
*/
public function getIndexableEnabled(IndexInterface $index): bool
{
$enabled = $this->getEnabled();
if (is_bool($enabled)) {
return $enabled;
}
return false;
}
/**
* {@inheritdoc}
*/
public function getIndexableName(IndexInterface $index, string $language): string
{
$name = $this->getName($language);
if (null === $name) {
return '';
}
if (!is_string($name)) {
return '';
}
return $name;
}
/**
* {@inheritdoc}
*/
public function getEnabled()
{
throw new ImplementedByPimcoreException(__CLASS__, __METHOD__);
}
/**
* {@inheritdoc}
*/
public function getName($language)
{
throw new ImplementedByPimcoreException(__CLASS__, __METHOD__);
}
}
|
{
"pile_set_name": "Github"
}
|
@import url("gtk-main.css");
|
{
"pile_set_name": "Github"
}
|
"""Statistics analyzer for HotShot."""
import profile
import pstats
import hotshot.log
from hotshot.log import ENTER, EXIT
def load(filename):
return StatsLoader(filename).load()
class StatsLoader:
def __init__(self, logfn):
self._logfn = logfn
self._code = {}
self._stack = []
self.pop_frame = self._stack.pop
def load(self):
# The timer selected by the profiler should never be used, so make
# sure it doesn't work:
p = Profile()
p.get_time = _brokentimer
log = hotshot.log.LogReader(self._logfn)
taccum = 0
for event in log:
what, (filename, lineno, funcname), tdelta = event
if tdelta > 0:
taccum += tdelta
# We multiply taccum to convert from the microseconds we
# have to the seconds that the profile/pstats modules work
# with; this allows the numbers to have some basis in
# reality (ignoring calibration issues for now).
if what == ENTER:
frame = self.new_frame(filename, lineno, funcname)
p.trace_dispatch_call(frame, taccum * .000001)
taccum = 0
elif what == EXIT:
frame = self.pop_frame()
p.trace_dispatch_return(frame, taccum * .000001)
taccum = 0
# no further work for line events
assert not self._stack
return pstats.Stats(p)
def new_frame(self, *args):
# args must be filename, firstlineno, funcname
# our code objects are cached since we don't need to create
# new ones every time
try:
code = self._code[args]
except KeyError:
code = FakeCode(*args)
self._code[args] = code
# frame objects are created fresh, since the back pointer will
# vary considerably
if self._stack:
back = self._stack[-1]
else:
back = None
frame = FakeFrame(code, back)
self._stack.append(frame)
return frame
class Profile(profile.Profile):
def simulate_cmd_complete(self):
pass
class FakeCode:
def __init__(self, filename, firstlineno, funcname):
self.co_filename = filename
self.co_firstlineno = firstlineno
self.co_name = self.__name__ = funcname
class FakeFrame:
def __init__(self, code, back):
self.f_back = back
self.f_code = code
def _brokentimer():
raise RuntimeError, "this timer should not be called"
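The hotshot package above is Python 2 only and was removed in Python 3, but note that `load()` ultimately returns an ordinary `pstats.Stats` object. The same downstream consumption pattern therefore works with `cProfile`, as this sketch shows:

```python
import cProfile
import io
import pstats

# Profile a small piece of code; cProfile produces data that pstats.Stats
# accepts directly, just like the Profile subclass built by StatsLoader.
pr = cProfile.Profile()
pr.enable()
sum(i * i for i in range(1000))  # code under measurement
pr.disable()

buf = io.StringIO()
stats = pstats.Stats(pr, stream=buf)
stats.sort_stats("cumulative").print_stats(5)
print(buf.getvalue())
```

Any code written to consume the `pstats.Stats` returned by `load()` above can consume this object unchanged.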
|
{
"pile_set_name": "Github"
}
|
"""Tokenization help for Python programs.
generate_tokens(readline) is a generator that breaks a stream of
text into Python tokens. It accepts a readline-like method which is called
repeatedly to get the next line of input (or "" for EOF). It generates
5-tuples with these members:
the token type (see token.py)
the token (a string)
the starting (row, column) indices of the token (a 2-tuple of ints)
the ending (row, column) indices of the token (a 2-tuple of ints)
the original line (string)
It is designed to match the working of the Python tokenizer exactly, except
that it produces COMMENT tokens for comments and gives type OP for all
operators
Older entry points
tokenize_loop(readline, tokeneater)
tokenize(readline, tokeneater=printtoken)
are the same, except instead of generating tokens, tokeneater is a callback
function to which the 5 fields described above are passed as 5 arguments,
each time a new token is found."""
__author__ = 'Ka-Ping Yee <ping@lfw.org>'
__credits__ = ('GvR, ESR, Tim Peters, Thomas Wouters, Fred Drake, '
'Skip Montanaro, Raymond Hettinger')
from itertools import chain
import string, re
from token import *
import token
__all__ = [x for x in dir(token) if not x.startswith("_")]
__all__ += ["COMMENT", "tokenize", "generate_tokens", "NL", "untokenize"]
del x
del token
COMMENT = N_TOKENS
tok_name[COMMENT] = 'COMMENT'
NL = N_TOKENS + 1
tok_name[NL] = 'NL'
N_TOKENS += 2
def group(*choices): return '(' + '|'.join(choices) + ')'
def any(*choices): return group(*choices) + '*'
def maybe(*choices): return group(*choices) + '?'
Whitespace = r'[ \f\t]*'
Comment = r'#[^\r\n]*'
Ignore = Whitespace + any(r'\\\r?\n' + Whitespace) + maybe(Comment)
Name = r'[a-zA-Z_]\w*'
Hexnumber = r'0[xX][\da-fA-F]+[lL]?'
Octnumber = r'(0[oO][0-7]+)|(0[0-7]*)[lL]?'
Binnumber = r'0[bB][01]+[lL]?'
Decnumber = r'[1-9]\d*[lL]?'
Intnumber = group(Hexnumber, Binnumber, Octnumber, Decnumber)
Exponent = r'[eE][-+]?\d+'
Pointfloat = group(r'\d+\.\d*', r'\.\d+') + maybe(Exponent)
Expfloat = r'\d+' + Exponent
Floatnumber = group(Pointfloat, Expfloat)
Imagnumber = group(r'\d+[jJ]', Floatnumber + r'[jJ]')
Number = group(Imagnumber, Floatnumber, Intnumber)
# Tail end of ' string.
Single = r"[^'\\]*(?:\\.[^'\\]*)*'"
# Tail end of " string.
Double = r'[^"\\]*(?:\\.[^"\\]*)*"'
# Tail end of ''' string.
Single3 = r"[^'\\]*(?:(?:\\.|'(?!''))[^'\\]*)*'''"
# Tail end of """ string.
Double3 = r'[^"\\]*(?:(?:\\.|"(?!""))[^"\\]*)*"""'
Triple = group("[uUbB]?[rR]?'''", '[uUbB]?[rR]?"""')
# Single-line ' or " string.
String = group(r"[uUbB]?[rR]?'[^\n'\\]*(?:\\.[^\n'\\]*)*'",
r'[uUbB]?[rR]?"[^\n"\\]*(?:\\.[^\n"\\]*)*"')
# Because of leftmost-then-longest match semantics, be sure to put the
# longest operators first (e.g., if = came before ==, == would get
# recognized as two instances of =).
Operator = group(r"\*\*=?", r">>=?", r"<<=?", r"<>", r"!=",
r"//=?",
r"[+\-*/%&|^=<>]=?",
r"~")
Bracket = '[][(){}]'
Special = group(r'\r?\n', r'[:;.,`@]')
Funny = group(Operator, Bracket, Special)
PlainToken = group(Number, Funny, String, Name)
Token = Ignore + PlainToken
# First (or only) line of ' or " string.
ContStr = group(r"[uUbB]?[rR]?'[^\n'\\]*(?:\\.[^\n'\\]*)*" +
group("'", r'\\\r?\n'),
r'[uUbB]?[rR]?"[^\n"\\]*(?:\\.[^\n"\\]*)*' +
group('"', r'\\\r?\n'))
PseudoExtras = group(r'\\\r?\n|\Z', Comment, Triple)
PseudoToken = Whitespace + group(PseudoExtras, Number, Funny, ContStr, Name)
tokenprog, pseudoprog, single3prog, double3prog = map(
re.compile, (Token, PseudoToken, Single3, Double3))
endprogs = {"'": re.compile(Single), '"': re.compile(Double),
"'''": single3prog, '"""': double3prog,
"r'''": single3prog, 'r"""': double3prog,
"u'''": single3prog, 'u"""': double3prog,
"ur'''": single3prog, 'ur"""': double3prog,
"R'''": single3prog, 'R"""': double3prog,
"U'''": single3prog, 'U"""': double3prog,
"uR'''": single3prog, 'uR"""': double3prog,
"Ur'''": single3prog, 'Ur"""': double3prog,
"UR'''": single3prog, 'UR"""': double3prog,
"b'''": single3prog, 'b"""': double3prog,
"br'''": single3prog, 'br"""': double3prog,
"B'''": single3prog, 'B"""': double3prog,
"bR'''": single3prog, 'bR"""': double3prog,
"Br'''": single3prog, 'Br"""': double3prog,
"BR'''": single3prog, 'BR"""': double3prog,
'r': None, 'R': None, 'u': None, 'U': None,
'b': None, 'B': None}
triple_quoted = {}
for t in ("'''", '"""',
"r'''", 'r"""', "R'''", 'R"""',
"u'''", 'u"""', "U'''", 'U"""',
"ur'''", 'ur"""', "Ur'''", 'Ur"""',
"uR'''", 'uR"""', "UR'''", 'UR"""',
"b'''", 'b"""', "B'''", 'B"""',
"br'''", 'br"""', "Br'''", 'Br"""',
"bR'''", 'bR"""', "BR'''", 'BR"""'):
triple_quoted[t] = t
single_quoted = {}
for t in ("'", '"',
"r'", 'r"', "R'", 'R"',
"u'", 'u"', "U'", 'U"',
"ur'", 'ur"', "Ur'", 'Ur"',
"uR'", 'uR"', "UR'", 'UR"',
"b'", 'b"', "B'", 'B"',
"br'", 'br"', "Br'", 'Br"',
"bR'", 'bR"', "BR'", 'BR"' ):
single_quoted[t] = t
tabsize = 8
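The chunk above is the Python 2 implementation, but the generator interface its docstring describes survives essentially unchanged in Python 3's stdlib `tokenize` module; a small usage sketch:

```python
import io
import tokenize

# Each generated item is the 5-tuple described in the docstring above:
# (token type, token string, (start row, col), (end row, col), source line).
source = "x = 1 + 2  # a comment\n"
tokens = list(tokenize.generate_tokens(io.StringIO(source).readline))
for tok_type, tok_string, start, end, line in tokens:
    print(tokenize.tok_name[tok_type], repr(tok_string))
```

Note that, as the docstring says, comments come through as `COMMENT` tokens rather than being discarded, and rows in the position tuples are 1-based.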
|
{
"pile_set_name": "Github"
}
|
Copyright (c) 2003-2019, CKSource - Frederico Knabben. All rights reserved.
For licensing, see LICENSE.md or https://ckeditor.com/legal/ckeditor-oss-license
cs.js Found: 118 Missing: 0
cy.js Found: 118 Missing: 0
de.js Found: 118 Missing: 0
el.js Found: 16 Missing: 102
eo.js Found: 118 Missing: 0
et.js Found: 31 Missing: 87
fa.js Found: 24 Missing: 94
fi.js Found: 23 Missing: 95
fr.js Found: 118 Missing: 0
hr.js Found: 23 Missing: 95
it.js Found: 118 Missing: 0
nb.js Found: 118 Missing: 0
nl.js Found: 118 Missing: 0
no.js Found: 118 Missing: 0
tr.js Found: 118 Missing: 0
ug.js Found: 39 Missing: 79
zh-cn.js Found: 118 Missing: 0
|
{
"pile_set_name": "Github"
}
|
/*
* This file is part of EasyRPG Player.
*
* EasyRPG Player is free software: you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* EasyRPG Player is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with EasyRPG Player. If not, see <http://www.gnu.org/licenses/>.
*/
#ifndef EP_SCENE_ACTORTARGET_H
#define EP_SCENE_ACTORTARGET_H
// Headers
#include "scene.h"
#include "window_actortarget.h"
#include "window_help.h"
#include "window_targetstatus.h"
/**
* Scene ActorTarget class.
* Manages using of Items and Spells.
*/
class Scene_ActorTarget : public Scene {
public:
/**
* Constructor.
*
* @param item_id item ID of item to use.
*/
Scene_ActorTarget(int item_id);
/**
* Constructor.
*
* @param skill_id skill ID of skill to use.
* @param actor_index index of the spell caster in party.
*/
Scene_ActorTarget(int skill_id, int actor_index);
void Start() override;
void Update() override;
/**
* Update function used when an item will be used.
*/
void UpdateItem();
/**
* Update function used when a skill will be used.
*/
void UpdateSkill();
private:
/** Contains the actors of the party. */
std::unique_ptr<Window_ActorTarget> target_window;
/** Contains the name of the item/skill that will be used. */
std::unique_ptr<Window_Help> help_window;
/** Contains quantity/cost of item/spell. */
std::unique_ptr<Window_TargetStatus> status_window;
/** ID of item/skill to use. */
int id;
/** Index of spell caster in party (only for skills). */
int actor_index;
/** True if item, false if skill. */
bool use_item;
};
#endif
|
{
"pile_set_name": "Github"
}
|
<?php
defined('YII_DEBUG') or define('YII_DEBUG', true);
defined('YII_ENV') or define('YII_ENV', 'test');
defined('YII_APP_BASE_PATH') or define('YII_APP_BASE_PATH', dirname(dirname(dirname(__DIR__))));
defined('FRONTEND_ENTRY_URL') or define('FRONTEND_ENTRY_URL', parse_url(\Codeception\Configuration::config()['config']['test_entry_url'], PHP_URL_PATH));
defined('FRONTEND_ENTRY_FILE') or define('FRONTEND_ENTRY_FILE', YII_APP_BASE_PATH . '/frontend/web/index-test.php');
require_once(YII_APP_BASE_PATH . '/vendor/autoload.php');
require_once(YII_APP_BASE_PATH . '/vendor/yiisoft/yii2/Yii.php');
require_once(YII_APP_BASE_PATH . '/common/config/bootstrap.php');
require_once(YII_APP_BASE_PATH . '/frontend/config/bootstrap.php');
// set correct script paths
// the entry script file path for functional and acceptance tests
$_SERVER['SCRIPT_FILENAME'] = FRONTEND_ENTRY_FILE;
$_SERVER['SCRIPT_NAME'] = FRONTEND_ENTRY_URL;
$_SERVER['SERVER_NAME'] = parse_url(\Codeception\Configuration::config()['config']['test_entry_url'], PHP_URL_HOST);
$_SERVER['SERVER_PORT'] = parse_url(\Codeception\Configuration::config()['config']['test_entry_url'], PHP_URL_PORT) ?: '80';
Yii::setAlias('@tests', dirname(dirname(__DIR__)));
|
{
"pile_set_name": "Github"
}
|
'use strict';
const acorn = require('acorn-jsx');
const esrecurse = require('../esrecurse/esrecurse');
const escodegen = require('escodegen');
const esquery = require('../esquery/esquery');
const bfs = require('acorn-bfs');
const htmlElements = require('./constants.js').htmlElements;
const reactMethods = require('./constants.js').reactMethods;
function getReactStates(node) {
const stateStr = escodegen.generate(node);
let states;
eval(`states = ${stateStr}`);
const output = [];
for (const state in states) {
output.push({
name: state,
value: states[state],
});
}
return output;
}
/**
* Returns array of props from React component passed to input
* @param {Node} node
* @returns {Array} Array of all JSX props on React component
*/
function getReactProps(node, parent) {
if (node.openingElement.attributes.length === 0 ||
htmlElements.indexOf(node.openingElement.name.name) >= 0) return {};
const result = node.openingElement.attributes
.map(attribute => {
const name = attribute.name.name;
let valueName;
if (attribute.value === null) valueName = undefined;
else if (attribute.value.type === 'Literal') valueName = attribute.value.value;
else if (attribute.value.expression.type === 'Literal') {
valueName = attribute.value.expression.value;
} else if (attribute.value.expression.type === 'Identifier') {
valueName = attribute.value.expression.name;
} else if (attribute.value.expression.type === 'CallExpression') {
valueName = attribute.value.expression.callee.object.property.name;
} else if (attribute.value.expression.type === 'BinaryExpression') {
valueName = attribute.value.expression.left.name
+ attribute.value.expression.operator
+ (attribute.value.expression.right.name
|| attribute.value.expression.right.value);
} else if (attribute.value.expression.type === 'MemberExpression') {
let current = attribute.value.expression;
while (current && current.property) {
// && !current.property.name.match(/(state|props)/)
valueName = `.${current.property.name}${valueName || ''}`;
current = current.object;
if (current.type === 'Identifier') {
valueName = `.${current.name}${valueName || ''}`;
break;
}
}
valueName = valueName.replace('.', '');
} else if (attribute.value.expression.type === 'LogicalExpression') {
valueName = attribute.value.expression.left.property.name;
// valueType = attribute.value.expression.left.object.name;
} else if (attribute.value.expression.type === 'JSXElement') {
const nodez = attribute.value.expression;
const output = {
name: nodez.openingElement.name.name,
children: getChildJSXElements(nodez, parent),
props: getReactProps(nodez, parent),
state: {},
methods: [],
};
valueName = output;
} else valueName = escodegen.generate(attribute.value);
return {
name,
value: valueName,
parent,
};
});
return result;
}
/**
* Returns array of children components of React component passed to input
* @param {Node} node
* @returns {Array} Array of (nested) children of React component passed in
*/
function getChildJSXElements(node, parent) {
if (node.children.length === 0) return [];
const childJsxComponentsArr = node
.children
.filter(jsx => jsx.type === 'JSXElement'
&& htmlElements.indexOf(jsx.openingElement.name.name) < 0);
return childJsxComponentsArr
.map(child => {
return {
name: child.openingElement.name.name,
children: getChildJSXElements(child, parent),
props: getReactProps(child, parent),
state: {},
methods: [],
};
});
}
function forInFinder(arr, name) {
const result = arr.map(ele => {
const jsxnode = esquery(ele, 'JSXElement')[0];
const obj = {};
obj.variables = {};
esquery(ele, 'VariableDeclarator').forEach(vars => {
if (vars.id.name !== 'i' && vars.init) {
obj.variables[vars.id.name] = escodegen.generate(vars.init).replace('this.', '');
}
});
if (ele.left.declarations) obj.variables[ele.left.declarations[0].id.name] = '[key]';
else if (ele.left.type === 'Identifier') obj.variables[ele.left.name] = '[key]';
if (jsxnode && htmlElements.indexOf(jsxnode.openingElement.name.name)) {
let current = ele.right;
let found;
while (current && current.property) {
found = `.${current.property.name}${found || ''}`;
current = current.object;
if (current.type === 'Identifier') {
found = `.${current.name}${found || ''}`;
break;
}
}
obj.jsx = {
name: jsxnode.openingElement.name.name,
children: getChildJSXElements(jsxnode, name),
props: getReactProps(jsxnode, name),
state: {},
methods: [],
iterated: 'forIn',
source: found.replace('.', ''),
};
const propsArr = obj.jsx.props;
for (let i = 0; i < propsArr.length; i++) {
for (const key in obj.variables) {
if (propsArr[i].value.includes(key)) {
if (obj.variables[key] === '[key]') {
propsArr[i].value = propsArr[i].value.replace(`.${key}`, obj.variables[key]);
} else propsArr[i].value = propsArr[i].value.replace(key, obj.variables[key]);
}
}
}
}
return obj;
});
return result;
}
function forLoopFinder(arr, name) {
const result = arr.map(ele => {
const jsxnode = esquery(ele, 'JSXElement')[0];
const obj = {};
obj.variables = {};
// finding variables in case information was reassigned
esquery(ele, 'VariableDeclarator').forEach(vars => {
if (vars.id.name !== 'i' && vars.init) {
obj.variables[vars.id.name] = escodegen.generate(vars.init)
.replace('this.', '').replace('.length', '');
}
});
// defaulting each iteration to be represented by 'i'
if (ele.init.declarations) obj.variables[ele.init.declarations[0].id.name] = '[i]';
else if (ele.init.type === 'AssignmentExpression') obj.variables[ele.init.left.name] = '[i]';
// building the object name
if (jsxnode && htmlElements.indexOf(jsxnode.openingElement.name.name)) {
let current = ele.test.right;
let found;
while (current && current.property) {
found = `.${current.property.name}${found || ''}`;
current = current.object;
if (current.type === 'Identifier') {
found = `.${current.name}${found || ''}`;
break;
}
}
obj.jsx = {
name: jsxnode.openingElement.name.name,
children: getChildJSXElements(jsxnode, name),
props: getReact
|
{
"pile_set_name": "Github"
}
|
To: GitHub, Inc
Attn: DMCA Agent
88 Colin P Kelly Jr St
San Francisco, CA. 94107
via copyright@github.com
[private], November 14th, 2018
Infringement Notice
Dear Sirs/Madams,
We have been notified that licence keys and activation codes for products of our company JetBrains
s.r.o. (such as
Upsource, PhpStorm, WebStorm, PyCharm, RubyMine, IntelliJ IDEA, AppCode, ReSharper, CLion,
DataGrip), the URLs of the illegally run license servers,
source and binary codes of illegal license servers, cracks (software which intentionally and
illegally modifies JetBrains tools in order to run it on behalf of a third party without an
authorization from JetBrains), and instructions on how to use the cracks and activation codes are
publicly accessible on your website github.com.
JetBrains s.r.o. is an owner of registered trademarks for all of the products specified above; these
are proprietary desktop software products.
The materials infringing JetBrains s.r.o. rights mentioned above are to be found in particular on
the following links:
https://github.com/stephenfri/jbl-server
https://github.com/kosmosr/license_server
https://github.com/HackersZone/JetBrains-License-Server
[repository disabled per previous DMCA takedown]
[repository disabled per previous DMCA takedown]
[repository disabled per previous DMCA takedown]
https://github.com/CrazyNing98/JetbrainsCrack
https://github.com/LinuxDigger/JetbrainsCrack
https://github.com/GalaxySuze/JetbrainsCrack
https://github.com/sandeepvalapi/DevOps/wiki/Intellij_Idea_crack
Because:
1) you and github.com users owning repositories / gists mentioned above are not the owners of the
trademarks of these products and
2) you and github.com users owning repositories / gists mentioned above do not have any agreement
with our company which would entitle you or them to redistribute our products, license keys or
license keys generators for our products,
we consider the occurrence of cracks, license keys, license keys generators to our products, license
servers
source and binary code, and URLs of license servers which are not authorized by JetBrains to distribute
licenses to our products, source and binary codes of illegal license servers, as well as specific
software which aims to circumvent technical measures which protect JetBrains tools from being run on
behalf of a third party without an authorization from JetBrains, on the website github.com to be
infringement of our proprietary rights.
That's why we demand that you:
1) immediately cease all use of our trademarks by the links specified above,
2) immediately remove all cracks, license keys, source code of license keys generators, license servers
source code, and URLs of license servers which are not authorized by JetBrains to distribute
licenses to our products, and any other references to our products from the website github.com, by
the links specified above,
3) forbear from all further unauthorised use of our trademarks and from all further making
accessible of license keys to our products, as well as source code of license keys generating software.
I have read and understand GitHub's Guide to Filing a DMCA Notice.
I have a good faith belief that use of the copyrighted materials described above on the infringing
web pages is not authorized by the copyright owner, or its agent, or the law.
I swear, under penalty of perjury, that the information in this notification is accurate and that I
am the copyright owner, or am authorized to act on behalf of the owner, of an exclusive right that
is allegedly infringed.
You can contact JetBrains s.r.o. on any matters via email private. The address is private. Phone: private. Fax: private. The contact person is private (direct email
private).
This DMCA claim has been prepared and filed by private on behalf of private
(private, JetBrains s.r.o.).
A list of the links specified above is attached in a CSV format for your convenience.
Thank you!
Kind regards,
private (on behalf of private, private, JetBrains s.r.o.)
e-mail: private
|
{
"pile_set_name": "Github"
}
|
/***********************************************************************
Copyright (c) 2006-2011, Skype Limited. All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, (subject to the limitations in the disclaimer below)
are permitted provided that the following conditions are met:
- Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
- Neither the name of Skype Limited, nor the names of specific
contributors, may be used to endorse or promote products derived from
this software without specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED
BY THIS LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND
CONTRIBUTORS ''AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING,
BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF
USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
***********************************************************************/
#include "SKP_Silk_typedef.h"
#include "SKP_Silk_common_pitch_est_defines.h"
/********************************************************/
/* Auto Generated File from generate_pitch_est_tables.m */
/********************************************************/
const SKP_int16 SKP_Silk_CB_lags_stage2[PITCH_EST_NB_SUBFR][PITCH_EST_NB_CBKS_STAGE2_EXT] =
{
{0, 2,-1,-1,-1, 0, 0, 1, 1, 0, 1},
{0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0},
{0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0},
{0,-1, 2, 1, 0, 1, 1, 0, 0,-1,-1}
};
const SKP_int16 SKP_Silk_CB_lags_stage3[PITCH_EST_NB_SUBFR][PITCH_EST_NB_CBKS_STAGE3_MAX] =
{
{-9,-7,-6,-5,-5,-4,-4,-3,-3,-2,-2,-2,-1,-1,-1, 0, 0, 0, 1, 1, 0, 1, 2, 2, 2, 3, 3, 4, 4, 5, 6, 5, 6, 8},
{-3,-2,-2,-2,-1,-1,-1,-1,-1, 0, 0,-1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 1, 1, 2, 1, 2, 2, 2, 2, 3},
{ 3, 3, 2, 2, 2, 2, 1, 2, 1, 1, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0,-1, 0, 0,-1,-1,-1,-1,-1,-2,-2,-2},
{ 9, 8, 6, 5, 6, 5, 4, 4, 3, 3, 2, 2, 2, 1, 0, 1, 1, 0, 0, 0,-1,-1,-1,-2,-2,-2,-3,-3,-4,-4,-5,-5,-6,-7}
};
const SKP_int16 SKP_Silk_Lag_range_stage3[ SKP_Silk_PITCH_EST_MAX_COMPLEX + 1 ] [ PITCH_EST_NB_SUBFR ][ 2 ] =
{
/* Lags to search for low number of stage3 cbks */
{
{-2,6},
{-1,5},
{-1,5},
{-2,7}
},
/* Lags to search for middle number of stage3 cbks */
{
{-4,8},
{-1,6},
{-1,6},
{-4,9}
},
/* Lags to search for max number of stage3 cbks */
{
{-9,12},
{-3,7},
{-2,7},
{-7,13}
}
};
const SKP_int16 SKP_Silk_cbk_sizes_stage3[SKP_Silk_PITCH_EST_MAX_COMPLEX + 1] =
{
PITCH_EST_NB_CBKS_STAGE3_MIN,
PITCH_EST_NB_CBKS_STAGE3_MID,
PITCH_EST_NB_CBKS_STAGE3_MAX
};
const SKP_int16 SKP_Silk_cbk_offsets_stage3[SKP_Silk_PITCH_EST_MAX_COMPLEX + 1] =
{
((PITCH_EST_NB_CBKS_STAGE3_MAX - PITCH_EST_NB_CBKS_STAGE3_MIN) >> 1),
((PITCH_EST_NB_CBKS_STAGE3_MAX - PITCH_EST_NB_CBKS_STAGE3_MID) >> 1),
0
};
|
{
"pile_set_name": "Github"
}
|
# -*- coding: utf-8 -*-
# Generated by Django 1.11.15 on 2018-12-19 12:35
from __future__ import unicode_literals
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('CMDB', '0036_auto_20181213_1404'),
]
operations = [
migrations.AlterUniqueTogether(
name='networkcard_assets',
unique_together=set([('host', 'macaddress', 'ip')]),
),
]
|
{
"pile_set_name": "Github"
}
|
# SPDX-License-Identifier: GPL-2.0-only
acp_audio_dma-objs := acp-pcm-dma.o
snd-soc-acp-da7219mx98357-mach-objs := acp-da7219-max98357a.o
snd-soc-acp-rt5645-mach-objs := acp-rt5645.o
snd-soc-acp-rt5682-mach-objs := acp3x-rt5682-max9836.o
obj-$(CONFIG_SND_SOC_AMD_ACP) += acp_audio_dma.o
obj-$(CONFIG_SND_SOC_AMD_CZ_DA7219MX98357_MACH) += snd-soc-acp-da7219mx98357-mach.o
obj-$(CONFIG_SND_SOC_AMD_CZ_RT5645_MACH) += snd-soc-acp-rt5645-mach.o
obj-$(CONFIG_SND_SOC_AMD_ACP3x) += raven/
obj-$(CONFIG_SND_SOC_AMD_RV_RT5682_MACH) += snd-soc-acp-rt5682-mach.o
obj-$(CONFIG_SND_SOC_AMD_RENOIR) += renoir/
|
{
"pile_set_name": "Github"
}
|
#include "wsconnection.h"
#include <stdlib.h>
#include <vector>
#include <assert.h>
#include <arpa/inet.h>
#include <string.h>
#include <endian.h>
#include "connection.h"
#include "httprequest.h"
#include "httpresponse.h"
#include "stringutil.h"
#include "sockutil.h"
#include "log.h"
using namespace std;
using namespace std::placeholders;
namespace tnet
{
size_t WsConnection::ms_maxPayloadLen = 10 * 1024 * 1024;
static void dummyCallback()
{}
WsConnection::WsConnection(const ConnectionPtr_t& conn, const WsCallback_t& callback)
{
m_fd = conn->getSockFd();
m_conn = conn;
m_callback = callback;
m_payloadLen = 0;
m_final = 0;
m_opcode = 0;
m_mask = 0;
m_lastOpcode = 0;
m_sendCallback = std::bind(&dummyCallback);
}
WsConnection::~WsConnection()
{
LOG_INFO("wsconnection destroyed");
}
void WsConnection::onOpen(const void* context)
{
m_status = FrameStart;
m_callback(shared_from_this(), Ws_OpenEvent, context);
}
void WsConnection::onError()
{
m_callback(shared_from_this(), Ws_ErrorEvent, 0);
}
void WsConnection::onConnEvent(const ConnectionPtr_t& conn, ConnEvent event, const void* context)
{
switch(event)
{
case Conn_ReadEvent:
{
const StackBuffer* buffer = (const StackBuffer*)context;
onRead(conn, buffer->buffer, buffer->count);
}
break;
case Conn_WriteCompleteEvent:
{
m_sendCallback();
m_sendCallback = std::bind(&dummyCallback);
}
break;
default:
break;
}
}
#define HANDLE(func) \
ret = func(data + readLen, count - readLen); \
readLen += (ret > 0 ? ret : 0);
ssize_t WsConnection::onRead(const ConnectionPtr_t& conn, const char* data, size_t count)
{
size_t readLen = 0;
ssize_t ret = 1;
while(readLen < count && ret > 0)
{
switch(m_status)
{
case FrameStart:
HANDLE(onFrameStart);
//deliberate fall-through: the next byte is the payload-length byte
case FramePayloadLen:
HANDLE(onFramePayloadLen);
break;
case FramePayloadLen16:
HANDLE(onFramePayloadLen16);
break;
case FramePayloadLen64:
HANDLE(onFramePayloadLen64);
break;
case FrameMaskingKey:
HANDLE(onFrameMaskingKey);
break;
case FrameData:
HANDLE(onFrameData);
break;
case FrameFinal:
ret = handleFrameData(conn);
readLen += (ret > 0 ? ret : 0);
break;
default:
return -1;
break;
}
}
if(ret > 0 && m_status == FrameFinal)
{
ret = handleFrameData(conn);
}
if(ret < 0)
{
LOG_ERROR("onReadError");
m_status = FrameError;
m_callback(shared_from_this(), Ws_ErrorEvent, 0);
//an error occurred; all we can do is shut down the connection
conn->shutDown();
}
return ret;
}
ssize_t WsConnection::onFrameStart(const char* data, size_t count)
{
m_cache.clear();
char header = data[0];
if(header & 0x70)
{
//reserved bits are not supported for now; abort
return -1;
}
m_final = header & 0x80;
m_opcode = header & 0x0f;
m_status = FramePayloadLen;
return 1;
}
ssize_t WsConnection::handleFramePayloadLen(size_t payloadLen)
{
m_payloadLen = payloadLen;
m_cache.reserve(payloadLen);
m_cache.clear();
if(m_payloadLen == 0)
{
m_status = FrameFinal;
}
else
{
m_status = isMaskFrame() ? FrameMaskingKey : FrameData;
}
return 0;
}
ssize_t WsConnection::onFramePayloadLen(const char* data, size_t count)
{
uint8_t payloadLen = (uint8_t)data[0];
m_mask = payloadLen & 0x80;
payloadLen &= 0x7f;
if(isControlFrame() && payloadLen >= 126)
{
//control frames must have payload < 126
return -1;
}
if(payloadLen < 126)
{
handleFramePayloadLen(payloadLen);
}
else if(payloadLen == 126)
{
m_status = FramePayloadLen16;
}
else if(payloadLen == 127)
{
m_status = FramePayloadLen64;
}
else
{
//payload error
return -1;
}
return 1;
}
ssize_t WsConnection::tryRead(const char* data, size_t count, size_t tryReadData)
{
assert(m_cache.size() < tryReadData);
size_t pendingSize = m_cache.size();
if(pendingSize + count < tryReadData)
{
m_cache.append(data, count);
return 0;
}
m_cache.append(data, tryReadData - m_cache.size());
return tryReadData - pendingSize;
}
ssize_t WsConnection::onFramePayloadLen16(const char* data, size_t count)
{
ssize_t readLen = tryRead(data, count, 2);
if(readLen == 0)
{
return readLen;
}
uint16_t payloadLen = *(uint16_t*)m_cache.data();
//memcpy(&payloadLen, m_cache.data(), sizeof(uint16_t));
//convert from network byte order before comparing against the limit
payloadLen = ntohs(payloadLen);
if(payloadLen > ms_maxPayloadLen)
{
return -1;
}
handleFramePayloadLen(payloadLen);
return readLen;
}
ssize_t WsConnection::onFramePayloadLen64(const char* data, size_t count)
{
ssize_t readLen = tryRead(data, count, 8);
if(readLen == 0)
{
return readLen;
}
uint64_t payloadLen = *(uint64_t*)m_cache.data();
//memcpy(&payloadLen, m_cache.data(), sizeof(uint64_t));
//convert from network byte order before comparing against the limit
payloadLen = be64toh(payloadLen);
if(payloadLen > ms_maxPayloadLen)
{
return -1;
}
handleFramePayloadLen(payloadLen);
return readLen;
}
ssize_t WsConnection::onFrameMaskingKey(const char* data, size_t count)
{
ssize_t readLen = tryRead(data, count, 4);
if(readLen == 0)
{
return 0;
}
memcpy(m_maskingKey, m_cache.data(), sizeof(m_maskingKey));
m_cache.clear();
m_status = FrameData;
return readLen;
}
ssize_t WsConnection::onFrame
|
{
"pile_set_name": "Github"
}
|
# (C) Copyright David Abrahams 2001.
# (C) Copyright MetaCommunications, Inc. 2004.
# Distributed under the Boost Software License, Version 1.0. (See
# accompanying file LICENSE_1_0.txt or copy at
# http://www.boost.org/LICENSE_1_0.txt)
# The following #// line will be used by the regression test table generation
# program as the column heading for HTML tables. Must not include a version
# number.
#//<a href="http://www.comeaucomputing.com/">Comeau<br>C++</a>
import common ;
import como ;
import feature ;
import generators ;
import toolset : flags ;
feature.extend-subfeature toolset como : platform : win ;
# Initializes the Comeau toolset for windows. The command is the command which
# invokes the compiler. You should either set environment variable
# COMO_XXX_INCLUDE where XXX is the used backend (as described in the
# documentation), or pass that as part of command, e.g:
#
# using como-win : 4.3 : "set COMO_BCC_INCLUDE=C:/include &&" como.exe ;
#
rule init ( version ? : command * : options * )
{
local condition = [ common.check-init-parameters como-win
: version $(version) ] ;
command = [ common.get-invocation-command como-win : como.exe :
$(command) ] ;
common.handle-options como-win : $(condition) : $(command) : $(options) ;
}
generators.register-c-compiler como-win.compile.c++ : CPP : OBJ
: <toolset>como <toolset-como:platform>win ;
generators.register-c-compiler como-win.compile.c : C : OBJ
: <toolset>como <toolset-como:platform>win ;
generators.register-linker como-win.link
: OBJ SEARCHED_LIB STATIC_LIB IMPORT_LIB
: EXE
: <toolset>como <toolset-como:platform>win ;
# Note that status of shared libraries support is not clear, so we do not define
# the link.dll generator.
generators.register-archiver como-win.archive
: OBJ : STATIC_LIB
: <toolset>como <toolset-como:platform>win ;
flags como-win C++FLAGS <exception-handling>off : --no_exceptions ;
flags como-win C++FLAGS <exception-handling>on : --exceptions ;
flags como-win CFLAGS <inlining>off : --no_inlining ;
flags como-win CFLAGS <inlining>on <inlining>full : --inlining ;
# The following seem to be VC-specific options. At least, when I uncomment
# them, Comeau with bcc as backend reports that bcc32 invocation failed.
#
#flags como-win CFLAGS <debug-symbols>on : /Zi ;
#flags como-win CFLAGS <optimization>off : /Od ;
flags como-win CFLAGS <cflags> ;
flags como-win CFLAGS : -D_WIN32 ; # Make sure that we get the Boost Win32 platform config header.
flags como-win CFLAGS <threading>multi : -D_MT ; # Make sure that our config knows that threading is on.
flags como-win C++FLAGS <cxxflags> ;
flags como-win DEFINES <define> ;
flags como-win UNDEFS <undef> ;
flags como-win HDRS <include> ;
flags como-win SYSHDRS <sysinclude> ;
flags como-win LINKFLAGS <linkflags> ;
flags como-win ARFLAGS <arflags> ;
flags como-win NO_WARN <no-warn> ;
#flags como-win STDHDRS : $(COMO_INCLUDE_PATH) ;
#flags como-win STDLIB_PATH : $(COMO_STDLIB_PATH)$(SLASH) ;
flags como-win LIBPATH <library-path> ;
flags como-win LIBRARIES <library-file> ;
flags como-win FINDLIBS <find-shared-library> ;
flags como-win FINDLIBS <find-static-library> ;
nl = "
" ;
# For como, we repeat all libraries so that dependencies are always resolved.
#
actions link bind LIBRARIES
{
$(CONFIG_COMMAND) --no_version --no_prelink_verbose $(LINKFLAGS) -o "$(<[1]:S=)" @"@($(<[1]:W).rsp:E=$(nl)"$(>)")" "$(LIBRARIES)" "$(FINDLIBS:S=.lib)"
}
actions compile.c
{
$(CONFIG_COMMAND) -c --c99 -e5 --no_version --display_error_number --diag_suppress=9,21,161,748,940,962 -U$(UNDEFS) -D$(DEFINES) $(WARN) $(CFLAGS) -I"$(HDRS)" -I"$(STDHDRS)" -I"$(SYSHDRS)" -o "$(<:D=)" "$(>)"
}
actions compile.c++
{
$(CONFIG_COMMAND) -c -e5 --no_version --no_prelink_verbose --display_error_number --long_long --diag_suppress=9,21,161,748,940,962 --diag_error=461 -D__STL_LONG_LONG -U$(UNDEFS) -D$(DEFINES) $(WARN) $(CFLAGS) $(C++FLAGS) -I"$(HDRS)" -I"$(STDHDRS)" -I"$(SYSHDRS)" -o "$(<)" "$(>)"
}
actions archive
{
$(CONFIG_COMMAND) --no_version --no_prelink_verbose --prelink_object @"@($(<[1]:W).rsp:E=$(nl)"$(>)")"
lib $(ARFLAGS) /nologo /out:"$(<:S=.lib)" @"@($(<[1]:W).rsp:E=$(nl)"$(>)")"
}
|
{
"pile_set_name": "Github"
}
|
// Source : https://leetcode.com/problems/next-greater-element-ii/
// Author : Han Zichi
// Date : 2017-02-07
/**
* @param {number[]} nums
* @return {number[]}
*/
var nextGreaterElements = function(nums) {
let len = nums.length;
nums = nums.concat(nums.slice(0, len - 1));
let ans = [];
let stack = [];
// monotonic stack
nums.forEach((item, index) => {
if (index === 0) {
stack.push({
num: item,
index: index
});
} else {
while (true) {
if (!stack.length) break;
if (item > stack[stack.length - 1].num) {
let lastItem = stack.pop();
ans[lastItem.index] = item;
} else {
break;
}
}
stack.push({
num: item,
index: index
});
}
});
ans = ans.slice(0, len);
for (let i = 0; i < len; i++)
if (ans[i] === undefined)
ans[i] = -1;
return ans;
};
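The loop above maintains a monotonically decreasing stack over the array doubled to handle wrap-around. A compact, self-contained sketch of the same idea (illustrative, independent of the solution above; names are mine):

```javascript
// Next greater element in a circular array via a monotonic
// (decreasing) stack of indices; O(n) time overall.
function nextGreater(nums) {
  const n = nums.length;
  const ans = new Array(n).fill(-1);
  const stack = []; // indices whose values are strictly decreasing
  for (let i = 0; i < 2 * n; i++) {
    const v = nums[i % n];
    while (stack.length && nums[stack[stack.length - 1]] < v) {
      ans[stack.pop()] = v; // v is the first greater element to the right
    }
    if (i < n) stack.push(i);
  }
  return ans;
}

console.log(nextGreater([1, 2, 1])); // [2, -1, 2]
```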
|
{
"pile_set_name": "Github"
}
|
typedef unsigned short uint16;
typedef unsigned uint32;
typedef unsigned char uint8;
typedef struct
{
uint16 len;
uint16 maxlen;
uint32 offset;
}tSmbStrHeader;
typedef struct
{
char ident[8];
uint32 msgType;
uint32 flags;
tSmbStrHeader user;
tSmbStrHeader domain;
uint8 buffer[1024];
uint32 bufIndex;
}tSmbNtlmAuthRequest;
typedef struct
{
char ident[8];
uint32 msgType;
tSmbStrHeader uDomain;
uint32 flags;
uint8 challengeData[8];
uint8 reserved[8];
tSmbStrHeader emptyString;
uint8 buffer[1024];
uint32 bufIndex;
}tSmbNtlmAuthChallenge;
typedef struct
{
char ident[8];
uint32 msgType;
tSmbStrHeader lmResponse;
tSmbStrHeader ntResponse;
tSmbStrHeader uDomain;
tSmbStrHeader uUser;
tSmbStrHeader uWks;
tSmbStrHeader sessionKey;
uint32 flags;
uint8 buffer[1024];
uint32 bufIndex;
}tSmbNtlmAuthResponse;
#define SmbLength(ptr) (((ptr)->buffer - (uint8*)(ptr)) + (ptr)->bufIndex)
|
{
"pile_set_name": "Github"
}
|
diff --git a/chrome/android/java/src/org/chromium/chrome/browser/notifications/NotificationWrapperBuilderFactory.java b/chrome/android/java/src/org/chromium/chrome/browser/notifications/NotificationWrapperBuilderFactory.java
index cf33525790377668a857f209c651a6759a6e30a5..f83e924c6d7caef6f87f2b01d7998ca46bbabe81 100644
--- a/chrome/android/java/src/org/chromium/chrome/browser/notifications/NotificationWrapperBuilderFactory.java
+++ b/chrome/android/java/src/org/chromium/chrome/browser/notifications/NotificationWrapperBuilderFactory.java
@@ -71,7 +71,7 @@ public class NotificationWrapperBuilderFactory {
}
NotificationManagerProxyImpl notificationManagerProxy =
- new NotificationManagerProxyImpl(context);
+ new BraveNotificationManagerProxyImpl(context);
ChannelsInitializer channelsInitializer = new ChannelsInitializer(notificationManagerProxy,
ChromeChannelDefinitions.getInstance(), context.getResources());
|
{
"pile_set_name": "Github"
}
|
{% extends "two_factor/_base_focus.html" %}
{% load i18n %}
{% block content %}
<h1>{% block title %}{% trans "Permission Denied" %}{% endblock %}</h1>
  <p>{% blocktrans %}The page you requested requires users to verify using
    two-factor authentication for security reasons. You need to enable these
    security features in order to access this page.{% endblocktrans %}</p>
<p>{% blocktrans %}Two-factor authentication is not enabled for your
account. Enable two-factor authentication for enhanced account
security.{% endblocktrans %}</p>
<p>
<a href="javascript:history.go(-1)"
class="pull-right btn btn-link">{% trans "Go back" %}</a>
<a href="{% url 'two_factor:setup' %}" class="btn btn-primary">
{% trans "Enable Two-Factor Authentication" %}</a>
</p>
{% endblock %}
|
{
"pile_set_name": "Github"
}
|
/*
* Copyright (C) 2010 Apple Inc. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
*
* THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS'' AND ANY
* EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
* WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
* DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS BE LIABLE FOR ANY
* DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
* ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
* SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
#ifndef PluginViewBase_h
#define PluginViewBase_h
#include "AudioHardwareListener.h"
#include "BridgeJSC.h"
#include "PlatformLayer.h"
#include "ScrollTypes.h"
#include "Widget.h"
#include <wtf/text/WTFString.h>
namespace JSC {
class ExecState;
class JSGlobalObject;
class JSObject;
}
namespace WebCore {
class Scrollbar;
// PluginViewBase is a widget that all plug-in views inherit from, both in Webkit and WebKit2.
// It's intended as a stopgap measure until we can merge all plug-in views into a single plug-in view.
class PluginViewBase : public Widget {
public:
virtual PlatformLayer* platformLayer() const { return 0; }
#if PLATFORM(IOS)
virtual bool willProvidePluginLayer() const { return false; }
virtual void attachPluginLayer() { }
virtual void detachPluginLayer() { }
#endif
virtual JSC::JSObject* scriptObject(JSC::JSGlobalObject*) { return 0; }
virtual void storageBlockingStateChanged() { }
virtual void privateBrowsingStateChanged(bool) { }
virtual bool getFormValue(String&) { return false; }
virtual bool scroll(ScrollDirection, ScrollGranularity) { return false; }
// A plug-in can ask WebKit to handle scrollbars for it.
virtual Scrollbar* horizontalScrollbar() { return 0; }
virtual Scrollbar* verticalScrollbar() { return 0; }
// FIXME: This is a hack that works around the fact that the WebKit2 PluginView isn't a ScrollableArea.
virtual bool wantsWheelEvents() { return false; }
virtual bool supportsKeyboardFocus() const { return false; }
virtual bool canProcessDrag() const { return false; }
virtual bool shouldAlwaysAutoStart() const { return false; }
virtual void beginSnapshottingRunningPlugin() { }
virtual bool shouldAllowNavigationFromDrags() const { return false; }
virtual bool isPluginViewBase() const { return true; }
virtual bool shouldNotAddLayer() const { return false; }
virtual AudioHardwareActivityType audioHardwareActivity() const { return AudioHardwareActivityType::Unknown; }
virtual void setJavaScriptPaused(bool) { }
virtual RefPtr<JSC::Bindings::Instance> bindingInstance() { return nullptr; }
virtual void willDetatchRenderer() { }
protected:
explicit PluginViewBase(PlatformWidget widget = 0) : Widget(widget) { }
};
} // namespace WebCore
SPECIALIZE_TYPE_TRAITS_WIDGET(PluginViewBase, isPluginViewBase())
#endif // PluginViewBase_h
|
{
"pile_set_name": "Github"
}
|
LOCAL_PATH:= $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := imageutils$(LIB_SUFFIX)
LOCAL_SRC_FILES := blur-jni.cpp \
similar-jni.cpp \
blur.cpp \
similar.cpp
ifeq ($(TARGET_ARCH_ABI),armeabi-v7a)
LOCAL_CFLAGS += -DHAVE_ARMEABI_V7A=1 -mfloat-abi=softfp -mfpu=neon
LOCAL_C_INCLUDES += $(NDK_ROOT)/sources/android/cpufeatures
LOCAL_STATIC_LIBRARIES += cpufeatures
endif
LOCAL_C_INCLUDES += $(LOCAL_PATH)/../common
LOCAL_STATIC_LIBRARIES += common
include $(BUILD_SHARED_LIBRARY)
|
{
"pile_set_name": "Github"
}
|
package fr.lteconsulting.pomexplorer.commands;
import java.io.File;
import java.util.Set;
import fr.lteconsulting.pomexplorer.ApplicationSession;
import fr.lteconsulting.pomexplorer.Client;
import fr.lteconsulting.pomexplorer.DefaultPomFileLoader;
import fr.lteconsulting.pomexplorer.Log;
import fr.lteconsulting.pomexplorer.PomAnalysis;
import fr.lteconsulting.pomexplorer.Project;
import fr.lteconsulting.pomexplorer.Tools;
import fr.lteconsulting.pomexplorer.graph.PomGraph.PomGraphReadTransaction;
import fr.lteconsulting.pomexplorer.model.Gav;
import fr.lteconsulting.pomexplorer.tools.FilteredGAVs;
public class GavCommand
{
@Help( "list the session's GAVs" )
public void main( ApplicationSession session, Log log )
{
list( session, log );
}
@Help( "list the session's GAVs" )
public void list( ApplicationSession session, Log log )
{
list( session, null, log );
}
@Help( "list the session's GAVs, with filtering" )
public void list( ApplicationSession session, FilteredGAVs gavFilter, Log log )
{
StringBuilder sb = new StringBuilder();
sb.append( "<br/>GAV list filtered with '" + (gavFilter != null ? gavFilter.getFilterDescription() : "no filter") + "' :<br/>" );
if( gavFilter != null )
{
gavFilter.getGavs( session.session() ).stream().sorted( ( g1, g2 ) -> g1.toString().compareTo( g2.toString() ) ).forEach( gav -> sb.append( gav + "<br/>" ) );
}
else
{
PomGraphReadTransaction tx = session.graph().read();
tx.gavs().stream().sorted( ( g1, g2 ) -> g1.toString().compareTo( g2.toString() ) ).forEach( gav -> sb.append( gav + "<br/>" ) );
}
log.html( sb.toString() );
}
@Help( "analyze all the gav's dependencies and add them in the pom graph." )
public void add( ApplicationSession session, Log log, Client client, Gav gav )
{
log.html( "analyzing " + gav + "...<br/>" );
DefaultPomFileLoader loader = new DefaultPomFileLoader( session.session(), true );
File pomFile = loader.loadPomFileForGav( gav, null, log );
if( pomFile == null )
{
log.html( Tools.errorMessage( "cannot fetch project " + gav ) );
return;
}
PomAnalysis analysis = new PomAnalysis( session.session(), loader, null, false, log );
analysis.addFile( pomFile );
analysis.loadProjects();
analysis.completeLoadedProjects();
analysis.addCompletedProjectsToSession();
Set<Project> addedToGraph = analysis.addCompletedProjectsToGraph();
log.html( "project " + gav + " fetched successfully, " + addedToGraph.size() + " project(s) added to the graph.<br/>" );
}
@Help( "analyze gavs which have no associated project" )
public void resolve( ApplicationSession session, Log log, Client client )
{
DefaultPomFileLoader loader = new DefaultPomFileLoader( session.session(), true );
PomAnalysis analysis = new PomAnalysis( session.session(), loader, null, false, log );
session.graph().read().gavs().stream().filter( gav -> session.projects().forGav( gav ) == null ).forEach( gav -> {
log.html( "fetching pom file for " + gav + "...<br/>" );
File pomFile = loader.loadPomFileForGav( gav, null, log );
if( pomFile == null )
{
log.html( Tools.errorMessage( "cannot fetch project " + gav ) );
return;
}
analysis.addFile( pomFile );
} );
analysis.loadProjects();
analysis.completeLoadedProjects();
analysis.addCompletedProjectsToSession();
Set<Project> addedToGraph = analysis.addCompletedProjectsToGraph();
log.html( "finished, " + addedToGraph.size() + " project(s) added to the graph.<br/>" );
}
}
|
{
"pile_set_name": "Github"
}
|
#pragma once
#include <elle/reactor/fsm/Transition.hh>
#include <elle/reactor/fsm/fwd.hh>
namespace elle
{
namespace reactor
{
namespace fsm
{
/// A specialized Transition that occurs at the end of the source State.
class EndTransition
: public Transition
{
public:
using Condition = std::function<bool ()>;
public:
/// Called when the origin State is over.
///
/// \param trigger The Transition that triggered, if any. If trigger
///                is null, exn is null, and our condition is absent
///                or true, this transition sets trigger = this.
/// \param exn An exception pointer.
virtual
void
done(Transition*& trigger, std::exception_ptr& exn) override;
protected:
/// Create a EndTransition.
///
/// \param start The source State of the Transition.
/// \param end The destination State of the Transition.
EndTransition(State& start,
State& end);
/// Create a EndTransition.
///
/// \param start The source State of the Transition.
/// \param end The destination State of the Transition.
/// \param condition The Condition to trigger the Transition.
EndTransition(State& start,
State& end,
Condition const& condition);
friend class Machine;
ELLE_ATTRIBUTE(boost::optional<Condition>, condition);
/*----------.
| Printable |
`----------*/
public:
void
print(std::ostream& stream) const override;
};
}
}
}
|
{
"pile_set_name": "Github"
}
|
// go run mksyscall.go -l32 -openbsd -arm -tags openbsd,arm syscall_bsd.go syscall_openbsd.go syscall_openbsd_arm.go
// Code generated by the command above; see README.md. DO NOT EDIT.
// +build openbsd,arm
package unix
import (
"syscall"
"unsafe"
)
var _ syscall.Errno
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func getgroups(ngid int, gid *_Gid_t) (n int, err error) {
r0, _, e1 := RawSyscall(SYS_GETGROUPS, uintptr(ngid), uintptr(unsafe.Pointer(gid)), 0)
n = int(r0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func setgroups(ngid int, gid *_Gid_t) (err error) {
_, _, e1 := RawSyscall(SYS_SETGROUPS, uintptr(ngid), uintptr(unsafe.Pointer(gid)), 0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func wait4(pid int, wstatus *_C_int, options int, rusage *Rusage) (wpid int, err error) {
r0, _, e1 := Syscall6(SYS_WAIT4, uintptr(pid), uintptr(unsafe.Pointer(wstatus)), uintptr(options), uintptr(unsafe.Pointer(rusage)), 0, 0)
wpid = int(r0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func accept(s int, rsa *RawSockaddrAny, addrlen *_Socklen) (fd int, err error) {
r0, _, e1 := Syscall(SYS_ACCEPT, uintptr(s), uintptr(unsafe.Pointer(rsa)), uintptr(unsafe.Pointer(addrlen)))
fd = int(r0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func bind(s int, addr unsafe.Pointer, addrlen _Socklen) (err error) {
_, _, e1 := Syscall(SYS_BIND, uintptr(s), uintptr(addr), uintptr(addrlen))
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func connect(s int, addr unsafe.Pointer, addrlen _Socklen) (err error) {
_, _, e1 := Syscall(SYS_CONNECT, uintptr(s), uintptr(addr), uintptr(addrlen))
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func socket(domain int, typ int, proto int) (fd int, err error) {
r0, _, e1 := RawSyscall(SYS_SOCKET, uintptr(domain), uintptr(typ), uintptr(proto))
fd = int(r0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func getsockopt(s int, level int, name int, val unsafe.Pointer, vallen *_Socklen) (err error) {
_, _, e1 := Syscall6(SYS_GETSOCKOPT, uintptr(s), uintptr(level), uintptr(name), uintptr(val), uintptr(unsafe.Pointer(vallen)), 0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func setsockopt(s int, level int, name int, val unsafe.Pointer, vallen uintptr) (err error) {
_, _, e1 := Syscall6(SYS_SETSOCKOPT, uintptr(s), uintptr(level), uintptr(name), uintptr(val), uintptr(vallen), 0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func getpeername(fd int, rsa *RawSockaddrAny, addrlen *_Socklen) (err error) {
_, _, e1 := RawSyscall(SYS_GETPEERNAME, uintptr(fd), uintptr(unsafe.Pointer(rsa)), uintptr(unsafe.Pointer(addrlen)))
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func getsockname(fd int, rsa *RawSockaddrAny, addrlen *_Socklen) (err error) {
_, _, e1 := RawSyscall(SYS_GETSOCKNAME, uintptr(fd), uintptr(unsafe.Pointer(rsa)), uintptr(unsafe.Pointer(addrlen)))
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func Shutdown(s int, how int) (err error) {
_, _, e1 := Syscall(SYS_SHUTDOWN, uintptr(s), uintptr(how), 0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func socketpair(domain int, typ int, proto int, fd *[2]int32) (err error) {
_, _, e1 := RawSyscall6(SYS_SOCKETPAIR, uintptr(domain), uintptr(typ), uintptr(proto), uintptr(unsafe.Pointer(fd)), 0, 0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func recvfrom(fd int, p []byte, flags int, from *RawSockaddrAny, fromlen *_Socklen) (n int, err error) {
var _p0 unsafe.Pointer
if len(p) > 0 {
_p0 = unsafe.Pointer(&p[0])
} else {
_p0 = unsafe.Pointer(&_zero)
}
r0, _, e1 := Syscall6(SYS_RECVFROM, uintptr(fd), uintptr(_p0), uintptr(len(p)), uintptr(flags), uintptr(unsafe.Pointer(from)), uintptr(unsafe.Pointer(fromlen)))
n = int(r0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func sendto(s int, buf []byte, flags int, to unsafe.Pointer, addrlen _Socklen) (err error) {
var _p0 unsafe.Pointer
if len(buf) > 0 {
_p0 = unsafe.Pointer(&buf[0])
} else {
_p0 = unsafe.Pointer(&_zero)
}
_, _, e1 := Syscall6(SYS_SENDTO, uintptr(s), uintptr(_p0), uintptr(len(buf)), uintptr(flags), uintptr(to), uintptr(addrlen))
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func recvmsg(s int, msg *Msghdr, flags int) (n int, err error) {
r0, _, e1 := Syscall(SYS_RECVMSG, uintptr(s), uintptr(unsafe.Pointer(msg)), uintptr(flags))
n = int(r0)
if e1 != 0 {
err = errnoErr(e1)
}
return
}
// THIS FILE IS GENERATED BY THE COMMAND AT THE TOP; DO NOT EDIT
func sendmsg(s int, msg *Msghdr, flags int) (n int, err error) {
r0, _, e1 := Syscall(SYS_SENDMSG, uintptr(s), uintptr(unsafe.Pointer(msg)), uintptr(flags))
n = int
|
{
"pile_set_name": "Github"
}
|
exports.f = Object.getOwnPropertySymbols;
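This core-js entry simply re-exports the native Object.getOwnPropertySymbols. A short illustration of what the exported function does (assumed usage, not part of the module): symbol-keyed properties are invisible to Object.keys and for-in, but getOwnPropertySymbols exposes them.

```javascript
var getSyms = Object.getOwnPropertySymbols;

var tag = Symbol('tag');
var obj = { plain: 1 };
obj[tag] = 2; // symbol-keyed property

console.log(Object.keys(obj));      // ["plain"]
console.log(getSyms(obj).length);   // 1
console.log(obj[getSyms(obj)[0]]);  // 2
```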
|
{
"pile_set_name": "Github"
}
|
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.openjfx</groupId>
<artifactId>javafx</artifactId>
<version>@VERSION@</version>
<packaging>pom</packaging>
<name>openjfx</name>
<description>OpenJFX JavaFX</description>
<properties>
<javafx.version>@VERSION@</javafx.version>
</properties>
<dependencyManagement>
</dependencyManagement>
<profiles>
<profile>
<id>linux</id>
<activation>
<os>
<name>linux</name>
</os>
</activation>
<properties>
<javafx.platform>linux</javafx.platform>
</properties>
</profile>
<profile>
<id>macosx</id>
<activation>
<os>
<name>mac os x</name>
</os>
</activation>
<properties>
<javafx.platform>mac</javafx.platform>
</properties>
</profile>
<profile>
<id>windows</id>
<activation>
<os>
<family>windows</family>
</os>
</activation>
<properties>
<javafx.platform>win</javafx.platform>
</properties>
</profile>
<profile>
<id>javafx.platform.custom</id>
<activation>
<property>
<name>javafx.platform</name>
</property>
</activation>
<properties>
<javafx.platform>${javafx.platform}</javafx.platform>
</properties>
</profile>
</profiles>
</project>
|
{
"pile_set_name": "Github"
}
|
'use strict';
Object.defineProperty(exports, '__esModule', { value: true });
var prefix = 'fas';
var iconName = 'suitcase';
var width = 512;
var height = 512;
var ligatures = [];
var unicode = 'f0f2';
var svgPathData = 'M128 480h256V80c0-26.5-21.5-48-48-48H176c-26.5 0-48 21.5-48 48v400zm64-384h128v32H192V96zm320 80v256c0 26.5-21.5 48-48 48h-48V128h48c26.5 0 48 21.5 48 48zM96 480H48c-26.5 0-48-21.5-48-48V176c0-26.5 21.5-48 48-48h48v352z';
exports.definition = {
prefix: prefix,
iconName: iconName,
icon: [
width,
height,
ligatures,
unicode,
svgPathData
]};
exports.faSuitcase = exports.definition;
exports.prefix = prefix;
exports.iconName = iconName;
exports.width = width;
exports.height = height;
exports.ligatures = ligatures;
exports.unicode = unicode;
exports.svgPathData = svgPathData;
|
{
"pile_set_name": "Github"
}
|
---
title: Cells.Application Property (Word)
keywords: vbawd10.chm155845608
f1_keywords:
- vbawd10.chm155845608
ms.prod: word
api_name:
- Word.Cells.Application
ms.assetid: be60412c-86a7-bfd5-25d6-e35d9c7cca96
ms.date: 06/08/2017
---
# Cells.Application Property (Word)
Returns an **[Application](application-object-word.md)** object that represents the Microsoft Word application.
## Syntax
_expression_ . **Application**
_expression_ A variable that represents a **[Cells](cells-object-word.md)** object.
## See also
#### Concepts
[Cells Collection Object](cells-object-word.md)
|
{
"pile_set_name": "Github"
}
|
<?php
/**
* JBZoo Application
*
* This file is part of the JBZoo CCK package.
* For the full copyright and license information, please view the LICENSE
* file that was distributed with this source code.
*
* @package Application
* @license GPL-2.0
* @copyright Copyright (C) JBZoo.com, All rights reserved.
* @link https://github.com/JBZoo/JBZoo
* @author Denis Smetannikov <denis@jbzoo.com>
*/
// no direct access
defined('_JEXEC') or die('Restricted access');
$this->app->jbdebug->mark('layout::item_columns::start');
if ($vars['count']) {
$i = 0;
$bootstrap = $this->app->jbbootstrap;
$count = $vars['count'];
$rowItems = array_chunk($vars['objects'], $vars['cols_num']);
$rowClass = $bootstrap->getRowClass();
$colClass = $bootstrap->columnClass($vars['cols_num']);
echo '<div class="items items-col-' . $vars['cols_num'] . '">';
foreach ($rowItems as $row) {
echo '<div class="' . $rowClass . ' item-row-' . $i . '">';
$j = 0;
$i++;
foreach ($row as $item) {
$classes = array(
'item-column', $colClass
);
            if ($j == 0) {
                $classes[] = 'first';
            }
            if ($j == $count - 1) {
                $classes[] = 'last';
            }
$j++;
$isLast = $j % $vars['cols_num'] == 0 && $vars['cols_order'] == 0;
if ($isLast) {
$classes[] = 'last';
}
echo '<div class="' . implode(' ', $classes) . '">' .
' <div class="item-box well">' . $item . '</div>' .
'</div>';
}
echo '</div>';
}
echo '</div>';
}
$this->app->jbdebug->mark('layout::item_columns::finish');
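The row/column layout the template builds — chunking the item list into rows of `cols_num` with `array_chunk` and tagging the first and last column of each row — can be sketched in Python (illustrative only; the helper names here are mine, not JBZoo's):

```python
def chunk_rows(items, cols_num):
    """Split items into rows of cols_num, mirroring PHP's array_chunk()."""
    return [items[i:i + cols_num] for i in range(0, len(items), cols_num)]

def column_classes(row, cols_num):
    """CSS classes per column: 'first' on the first, 'last' on the row's last."""
    classes = []
    for j, _ in enumerate(row):
        cls = ["item-column"]
        if j == 0:
            cls.append("first")
        if j == len(row) - 1 or (j + 1) % cols_num == 0:
            cls.append("last")
        classes.append(" ".join(cls))
    return classes

rows = chunk_rows(["a", "b", "c", "d", "e"], 2)
print(rows)                        # [['a', 'b'], ['c', 'd'], ['e']]
print(column_classes(rows[0], 2))  # ['item-column first', 'item-column last']
```

The last row may be shorter than `cols_num`, which is why the sketch (like the template) tags the final element of each row rather than assuming full rows.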
|
{
"pile_set_name": "Github"
}
|
### Chinese Human Rights Lawyer Zhu Shengwu: Learning from Dafa Disciples (Video)
<div class="zhidingtu">
<div class="ar-wrap-3x2">
<img alt="Chinese human rights lawyer Zhu Shengwu at the Vancouver rally marking 20 years of Falun Gong practitioners' resistance to the persecution. (Image source: Tang Feng/The Epoch Times)" class="ar-wrap-inside-fill" src="http://img.soundofhope.org/2019/07/dsc-0488-600x400-600x400.jpg"/>
</div>
<div class="caption">
Chinese human rights lawyer Zhu Shengwu at the Vancouver rally marking 20 years of Falun Gong practitioners' resistance to the persecution. (Image source: Tang Feng/The Epoch Times)
</div>
</div>
<hr/>
<div class="content">
<p>
<span class="content-info-date">
[Sound of Hope, July 21, 2019]
</span>
<span class="content-info-type">
(Author: Zhu Shengwu)
</span>
<em>
Editor's note: At around 1 p.m. on the afternoon of July 20, Falun Gong practitioners from the Greater Vancouver area held a rally in front of the Vancouver Art Gallery downtown to mark twenty years of resisting the persecution. Zhu Shengwu, a human rights lawyer from mainland China, attended the rally and spoke. The title of his speech was "Learning from Falun Dafa Disciples." The following is a transcript based on lawyer Zhu Shengwu's speech (video attached):
</em>
</p>
<p>
It is a great honor to take part in this Vancouver rally marking twenty years of Falun Dafa's resistance to the persecution! I am Zhu Shengwu, a human rights lawyer from mainland China.
</p>
<p>
In September 2017, my lawyer's license was revoked for "opposing the Party and opposing socialism," and I was forced to close the Shandong Xinchang Law Firm that I ran. Before my license was revoked, I was a very well-known lawyer in China's internet copyright dispute field. In April 2018, my wife and I fled mainland China and came to Vancouver.
</p>
<p>
Today, standing on this land of freedom and democracy, facing all of you Dafa disciples, I am deeply moved. I want to call out loudly: learn from Falun Dafa disciples!
</p>
<p>
In May 2017, I began defending Dafa disciples. My time serving Dafa was very short, but Dafa disciples have had a profound influence on my thinking. I defended abducted Dafa disciples in Qiqihar, Yuxi, Wenshan Zhuang and Miao Autonomous Prefecture, Qujing, and Kunming.
</p>
<p>
Not one of these suffering Dafa disciples was frightened by the loss of their freedom; not one sighed in despair over their hardships; not one renounced their faith because of the Communist Party's inhuman torture and imprisonment. They always held their heads high and praised Falun Dafa in loud voices.
</p>
<p>
The way Dafa disciples defend their faith with their lives shook me to the core and completely changed my understanding of the word "faith." For Chinese people, guarding one's faith is far more than daily devotion; it demands that believers have the courage to endure the extremely brutal persecution inflicted by the Chinese (Communist) government.
</p>
<p>
Li Haiyan, a 59-year-old Dafa disciple from Yuxi, was abducted for telling people about Dafa in a residential community. When I met her, she had just spent a month in shackles and handcuffs for refusing to renounce her faith. She had been on a hunger strike for many days; her body was very weak, and her hands, feet, and face were swollen. Wearing shackles and handcuffs, she could only walk out bent over, leaning against the wall. During our meeting, the Communist Party's enforcers stood beside Li Haiyan, watching. Looking at her strong, dignified, tormented body, I wept. I admire her from the bottom of my heart.
</p>
<p>
Wang Wenying, a 62-year-old Dafa disciple from Wenshan Zhuang and Miao Autonomous Prefecture, came from a well-off family; her husband was a division-level official. After Wang Wenying was abducted, the Chinese (Communist) government tried to coerce her through her husband's job, and she flatly refused. Her husband was implicated and forced into early retirement. She always told me her story cheerfully. We even secretly smuggled the detention center's list of "three withdrawals" (editor's note: statements of withdrawal from the Communist Party, the Communist Youth League, and the Young Pioneers) out and handed it to the Dafa contact person. At her trial she showed no fear at all and defended Dafa in a loud voice.
</p>
<div>
</div>
<p>
Li Qun, a 65-year-old Dafa disciple from Wenshan Zhuang and Miao Autonomous Prefecture, was abducted together with Wang Wenying
|
{
"pile_set_name": "Github"
}
|
/*
** ClanLib SDK
** Copyright (c) 1997-2016 The ClanLib Team
**
** This software is provided 'as-is', without any express or implied
** warranty. In no event will the authors be held liable for any damages
** arising from the use of this software.
**
** Permission is granted to anyone to use this software for any purpose,
** including commercial applications, and to alter it and redistribute it
** freely, subject to the following restrictions:
**
** 1. The origin of this software must not be misrepresented; you must not
** claim that you wrote the original software. If you use this software
** in a product, an acknowledgment in the product documentation would be
** appreciated but is not required.
** 2. Altered source versions must be plainly marked as such, and must not be
** misrepresented as being the original software.
** 3. This notice may not be removed or altered from any source distribution.
**
** Note: Some of the libraries ClanLib may link to may have additional
** requirements or restrictions.
**
** File Author(s):
**
** Magnus Norddahl
*/
#pragma once
#include "../View/view.h"
namespace clan
{
class Image;
class ProgressView : public View
{
public:
float progress() const;
void set_progress(float value, bool animated = false);
Image track_image() const;
void set_track_image(const Image &value);
Image progress_image() const;
void set_progress_image(const Image &value);
Colorf progress_color() const;
void set_progress_color(const Colorf &value);
};
}
|
{
"pile_set_name": "Github"
}
|
#!/usr/bin/env python3
#pylint: disable=missing-docstring
#* This file is part of the MOOSE framework
#* https://www.mooseframework.org
#*
#* All rights reserved, see COPYRIGHT for full restrictions
#* https://github.com/idaholab/moose/blob/master/COPYRIGHT
#*
#* Licensed under LGPL 2.1, please see LICENSE for details
#* https://www.gnu.org/licenses/lgpl-2.1.html
import chigger
n = 3
line0 = chigger.graphs.Line(marker='circle', color=[0,0,1])
line1 = chigger.graphs.Line(marker='plus', color=[0,1,0], corner='right-top')
graph = chigger.graphs.Graph(line0, line1)
graph.setOptions('xaxis', lim=[0,n])
graph.setOptions('yaxis', lim=[0,n])
graph.setOptions('x2axis', lim=[0,n])
graph.setOptions('y2axis', lim=[0,n])
graph.setOptions('legend', visible=False)
window = chigger.RenderWindow(graph, size=[300,300], test=True)
window.write('secondary_initial.png')
for i in range(n+1):
line0.setOptions(x=[i], y=[i], append=True)
line1.setOptions(x=[i], y=[n-i], append=True)
window.write('secondary_' + str(i) + '.png')
window.start()
|
{
"pile_set_name": "Github"
}
|
# frozen_string_literal: true
module Krane
class DeployTaskConfigValidator < TaskConfigValidator
def initialize(protected_namespaces, prune, *arguments)
super(*arguments)
@protected_namespaces = protected_namespaces
@allow_protected_ns = !protected_namespaces.empty?
@prune = prune
@validations += %i(validate_protected_namespaces)
end
private
def validate_protected_namespaces
if @protected_namespaces.include?(namespace)
if @allow_protected_ns && @prune
@errors << "Refusing to deploy to protected namespace '#{namespace}' with pruning enabled"
elsif @allow_protected_ns
logger.warn("You're deploying to protected namespace #{namespace}, which cannot be pruned.")
logger.warn("Existing resources can only be removed manually with kubectl. " \
"Removing templates from the set deployed will have no effect.")
logger.warn("***Please do not deploy to #{namespace} unless you really know what you are doing.***")
else
@errors << "Refusing to deploy to protected namespace '#{namespace}'"
end
end
end
end
end
|
{
"pile_set_name": "Github"
}
|
/*
* Initialization and support routines for self-booting compressed image.
*
* $Copyright Open Broadcom Corporation$
*
* $Id: circularbuf.h 452258 2014-01-29 19:17:57Z $
*/
#ifndef __CIRCULARBUF_H_INCLUDED__
#define __CIRCULARBUF_H_INCLUDED__
#include <osl.h>
#include <typedefs.h>
#include <bcmendian.h>
/* Enumerations of return values provided by MsgBuf implementation */
typedef enum {
CIRCULARBUF_FAILURE = -1,
CIRCULARBUF_SUCCESS
} circularbuf_ret_t;
/* Core circularbuf circular buffer structure */
typedef struct circularbuf_s
{
uint16 depth; /* Depth of circular buffer */
uint16 r_ptr; /* Read Ptr */
uint16 w_ptr; /* Write Ptr */
uint16 e_ptr; /* End Ptr */
uint16 wp_ptr; /* wp_ptr/pending - scheduled for DMA. But, not yet complete. */
uint16 rp_ptr; /* rp_ptr/pending - scheduled for DMA. But, not yet complete. */
uint8 *buf_addr;
void *mb_ctx;
void (*mb_ring_bell)(void *ctx);
} circularbuf_t;
#define CBUF_ERROR_VAL 0x00000001 /* Error level tracing */
#define CBUF_TRACE_VAL 0x00000002 /* Function level tracing */
#define CBUF_INFORM_VAL 0x00000004 /* debug level tracing */
extern int cbuf_msg_level;
#define CBUF_ERROR(args) do {if (cbuf_msg_level & CBUF_ERROR_VAL) printf args;} while (0)
#define CBUF_TRACE(args) do {if (cbuf_msg_level & CBUF_TRACE_VAL) printf args;} while (0)
#define CBUF_INFO(args) do {if (cbuf_msg_level & CBUF_INFORM_VAL) printf args;} while (0)
#define CIRCULARBUF_START(x) ((x)->buf_addr)
#define CIRCULARBUF_WRITE_PTR(x) ((x)->w_ptr)
#define CIRCULARBUF_READ_PTR(x) ((x)->r_ptr)
#define CIRCULARBUF_END_PTR(x) ((x)->e_ptr)
#define circularbuf_debug_print(handle) \
CBUF_INFO(("%s:%d:\t%p rp=%4d r=%4d wp=%4d w=%4d e=%4d\n", \
__FUNCTION__, __LINE__, \
(void *) CIRCULARBUF_START(handle), \
(int) (handle)->rp_ptr, (int) (handle)->r_ptr, \
(int) (handle)->wp_ptr, (int) (handle)->w_ptr, \
(int) (handle)->e_ptr));
/* Callback registered by application/mail-box with the circularbuf implementation.
* This will be invoked by the circularbuf implementation when write is complete and
* ready for informing the peer
*/
typedef void (*mb_ring_t)(void *ctx);
/* Public Functions exposed by circularbuf */
void
circularbuf_init(circularbuf_t *handle, void *buf_base_addr, uint16 total_buf_len);
void
circularbuf_register_cb(circularbuf_t *handle, mb_ring_t mb_ring_func, void *ctx);
/* Write Functions */
void *
circularbuf_reserve_for_write(circularbuf_t *handle, uint16 size);
void
circularbuf_write_complete(circularbuf_t *handle, uint16 bytes_written);
/* Read Functions */
void *
circularbuf_get_read_ptr(circularbuf_t *handle, uint16 *avail_len);
circularbuf_ret_t
circularbuf_read_complete(circularbuf_t *handle, uint16 bytes_read);
/*
* circularbuf_get_read_ptr() updates rp_ptr by the amount that the consumer
* is supposed to read. The consumer may not read the entire amount.
* In such a case, circularbuf_revert_rp_ptr() call follows a corresponding
* circularbuf_get_read_ptr() call to revert the rp_ptr back to
* the point till which data has actually been processed.
* It is not valid if it is preceded by multiple get_read_ptr() calls
*/
circularbuf_ret_t
circularbuf_revert_rp_ptr(circularbuf_t *handle, uint16 bytes);
#endif /* __CIRCULARBUF_H_INCLUDED__ */
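The read-pointer revert semantics spelled out in the comment above can be modeled with a short Python sketch (a toy model of the pointer bookkeeping only — the names mirror the header, but this is not the actual C implementation):

```python
class CircularBufModel:
    """Toy model of the r_ptr / rp_ptr bookkeeping described in circularbuf.h."""

    def __init__(self, depth):
        self.depth = depth
        self.r_ptr = 0   # data fully consumed up to here
        self.rp_ptr = 0  # data handed out to the consumer, not yet committed

    def get_read_ptr(self, avail):
        # Offer `avail` bytes to the consumer and advance the pending pointer.
        start = self.rp_ptr
        self.rp_ptr = (self.rp_ptr + avail) % self.depth
        return start

    def read_complete(self, bytes_read):
        # Commit the amount the consumer actually processed.
        self.r_ptr = (self.r_ptr + bytes_read) % self.depth

    def revert_rp_ptr(self, unread):
        # Consumer processed less than it was handed; move rp_ptr back.
        self.rp_ptr = (self.rp_ptr - unread) % self.depth


buf = CircularBufModel(depth=64)
buf.get_read_ptr(16)   # consumer is offered 16 bytes...
buf.revert_rp_ptr(6)   # ...but only processed 10 of them
buf.read_complete(10)
print(buf.r_ptr, buf.rp_ptr)  # 10 10
```

As the header notes, a revert is only meaningful directly after a single `get_read_ptr()` call — after multiple offers, the pending pointer no longer identifies which offer is being reverted.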
|
{
"pile_set_name": "Github"
}
|
$ groovy UserScript.groovy
Caught: groovy.lang.MissingMethodException: No signature of method: User.givenName() is applicable for argument types: () values: []
Possible solutions: giveName(), getName(), setName(java.lang.String)
at UserScript.run(UserScript.groovy:7)
|
{
"pile_set_name": "Github"
}
|
"""
ACE parser
From wotsit.org and the SDK header (bitflags)
Partial study of a new block type (5) I've called "new_recovery", as its
syntax is very close to the former one (of type 2).
Status: can only fully read the file and header blocks.
Author: Christophe Gisquet <christophe.gisquet@free.fr>
Creation date: 19 January 2006
"""
from hachoir_py3.parser import Parser
from hachoir_py3.field import (StaticFieldSet, FieldSet,
Bit, Bits, NullBits, RawBytes, Enum,
UInt8, UInt16, UInt32,
PascalString8, PascalString16, String,
TimeDateMSDOS32)
from hachoir_py3.core.text_handler import textHandler, filesizeHandler, hexadecimal
from hachoir_py3.core.endian import LITTLE_ENDIAN
from hachoir_py3.parser.common.msdos import MSDOSFileAttr32
MAGIC = b"**ACE**"
OS_MSDOS = 0
OS_WIN32 = 2
HOST_OS = {
0: "MS-DOS",
1: "OS/2",
2: "Win32",
3: "Unix",
4: "MAC-OS",
5: "Win NT",
6: "Primos",
7: "APPLE GS",
8: "ATARI",
9: "VAX VMS",
10: "AMIGA",
11: "NEXT",
}
COMPRESSION_TYPE = {
0: "Store",
1: "Lempel-Ziv 77",
2: "ACE v2.0",
}
COMPRESSION_MODE = {
0: "fastest",
1: "fast",
2: "normal",
3: "good",
4: "best",
}
# TODO: Computing the CRC16 would also prove useful
# def markerValidate(self):
# return not self["extend"].value and self["signature"].value == MAGIC and \
# self["host_os"].value<12
class MarkerFlags(StaticFieldSet):
format = (
(Bit, "extend", "Whether the header is extended"),
(Bit, "has_comment", "Whether the archive has a comment"),
(NullBits, "unused", 7, "Reserved bits"),
(Bit, "sfx", "SFX"),
(Bit, "limited_dict", "Junior SFX with 256K dictionary"),
(Bit, "multi_volume", "Part of a set of ACE archives"),
(Bit, "has_av_string", "This header holds an AV-string"),
(Bit, "recovery_record", "Recovery record preset"),
(Bit, "locked", "Archive is locked"),
(Bit, "solid", "Archive uses solid compression")
)
def markerFlags(self):
yield MarkerFlags(self, "flags", "Marker flags")
def markerHeader(self):
yield String(self, "signature", 7, "Signature")
yield UInt8(self, "ver_extract", "Version needed to extract archive")
yield UInt8(self, "ver_created", "Version used to create archive")
yield Enum(UInt8(self, "host_os", "OS where the files were compressed"), HOST_OS)
yield UInt8(self, "vol_num", "Volume number")
yield TimeDateMSDOS32(self, "time", "Date and time (MS DOS format)")
yield Bits(self, "reserved", 64, "Reserved size for future extensions")
flags = self["flags"]
if flags["has_av_string"].value:
yield PascalString8(self, "av_string", "AV String")
if flags["has_comment"].value:
size = filesizeHandler(UInt16(self, "comment_size", "Comment size"))
yield size
if size.value > 0:
yield RawBytes(self, "compressed_comment", size.value,
"Compressed comment")
class FileFlags(StaticFieldSet):
format = (
(Bit, "extend", "Whether the header is extended"),
(Bit, "has_comment", "Presence of file comment"),
(Bits, "unused", 10, "Unused bit flags"),
(Bit, "encrypted", "File encrypted with password"),
(Bit, "previous", "File continued from previous volume"),
(Bit, "next", "File continues on the next volume"),
(Bit, "solid", "File compressed using previously archived files")
)
def fileFlags(self):
yield FileFlags(self, "flags", "File flags")
def fileHeader(self):
yield filesizeHandler(UInt32(self, "compressed_size", "Size of the compressed file"))
yield filesizeHandler(UInt32(self, "uncompressed_size", "Uncompressed file size"))
yield TimeDateMSDOS32(self, "ftime", "Date and time (MS DOS format)")
if self["/header/host_os"].value in (OS_MSDOS, OS_WIN32):
yield MSDOSFileAttr32(self, "file_attr", "File attributes")
else:
yield textHandler(UInt32(self, "file_attr", "File attributes"), hexadecimal)
    yield textHandler(UInt32(self, "file_crc32", "CRC32 checksum over the compressed file"), hexadecimal)
yield Enum(UInt8(self, "compression_type", "Type of compression"), COMPRESSION_TYPE)
yield Enum(UInt8(self, "compression_mode", "Quality of compression"), COMPRESSION_MODE)
yield textHandler(UInt16(self, "parameters", "Compression parameters"), hexadecimal)
yield textHandler(UInt16(self, "reserved", "Reserved data"), hexadecimal)
# Filename
yield PascalString16(self, "filename", "Filename")
# Comment
if self["flags/has_comment"].value:
yield filesizeHandler(UInt16(self, "comment_size", "Size of the compressed comment"))
if self["comment_size"].value > 0:
yield RawBytes(self, "comment_data", self["comment_size"].value, "Comment data")
def fileBody(self):
size = self["compressed_size"].value
if size > 0:
yield RawBytes(self, "compressed_data", size, "Compressed data")
def fileDesc(self):
return "File entry: %s (%s)" % (self["filename"].value, self["compressed_size"].display)
def recoveryHeader(self):
yield filesizeHandler(UInt32(self, "rec_blk_size", "Size of recovery data"))
self.body_size = self["rec_blk_size"].size
yield String(self, "signature", 7, "Signature, normally '**ACE**'")
yield textHandler(UInt32(self, "relative_start",
                             "Relative start (to this block) of the data this block is made of"),
hexadecimal)
yield UInt32(self, "num_blocks", "Number of blocks the data is split into")
yield UInt32(self, "size_blocks", "Size of these blocks")
yield UInt16(self, "crc16_blocks", "CRC16 over recovery data")
# size_blocks blocks of size size_blocks follow
# The ultimate data is the xor data of all those blocks
size = self["size_blocks"].value
for index in range(self["num_blocks"].value):
yield RawBytes(self, "data[]", size, "Recovery block %i" % index)
yield RawBytes(self, "xor_data", size, "The XOR value of the above data blocks")
def recoveryDesc(self):
    return "Recovery block, size=%s" % self["body_size"].display
def newRecoveryHeader(self):
"""
This header is described nowhere
"""
if self["flags/extend"].value:
yield filesizeHandler(UInt32(self, "body_size", "Size of the unknown body following"))
self.body_size = self["body_size"].value
yield textHandler(UInt32(
|
{
"pile_set_name": "Github"
}
|
# 2009 January 29
#
# The author disclaims copyright to this source code. In place of
# a legal notice, here is a blessing:
#
# May you do good and not evil.
# May you find forgiveness for yourself and forgive others.
# May you share freely, never taking more than you give.
#
#***********************************************************************
#
# Verify that certain keywords can be used as identifiers.
#
# $Id: keyword1.test,v 1.1 2009/01/29 19:27:47 drh Exp $
set testdir [file dirname $argv0]
source $testdir/tester.tcl
db eval {
CREATE TABLE t1(a, b);
INSERT INTO t1 VALUES(1, 'one');
INSERT INTO t1 VALUES(2, 'two');
INSERT INTO t1 VALUES(3, 'three');
}
set kwlist {
abort
after
analyze
asc
attach
before
begin
by
cascade
cast
column
conflict
current_date
current_time
current_timestamp
database
deferred
desc
detach
end
each
exclusive
explain
fail
for
glob
if
ignore
immediate
initially
instead
key
like
match
of
offset
plan
pragma
query
raise
recursive
regexp
reindex
release
rename
replace
restrict
rollback
row
savepoint
temp
temporary
trigger
vacuum
view
virtual
with
without
};
set exprkw {
cast
current_date
current_time
current_timestamp
raise
}
foreach kw $kwlist {
do_test keyword1-$kw.1 {
if {$kw=="if"} {
db eval "CREATE TABLE \"$kw\"($kw $kw)"
} else {
db eval "CREATE TABLE ${kw}($kw $kw)"
}
db eval "INSERT INTO $kw VALUES(99)"
db eval "INSERT INTO $kw SELECT a FROM t1"
if {[lsearch $exprkw $kw]<0} {
db eval "SELECT * FROM $kw ORDER BY $kw ASC"
} else {
db eval "SELECT * FROM $kw ORDER BY \"$kw\" ASC"
}
} {1 2 3 99}
do_test keyword1-$kw.2 {
if {$kw=="if"} {
db eval "DROP TABLE \"$kw\""
db eval "CREATE INDEX \"$kw\" ON t1(a)"
} else {
db eval "DROP TABLE $kw"
db eval "CREATE INDEX $kw ON t1(a)"
}
db eval "SELECT b FROM t1 INDEXED BY $kw WHERE a=2"
} {two}
}
finish_test
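The property this test exercises — non-reserved SQLite keywords being usable unquoted as table and column names — can be reproduced quickly with Python's built-in sqlite3 module (a minimal illustration, separate from the TCL test suite):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# "abort" is a non-reserved SQLite keyword, so it is legal unquoted both as
# a table name and as a column name, just as keyword1.test verifies in TCL.
cur.execute("CREATE TABLE abort(abort)")
cur.execute("INSERT INTO abort VALUES(99)")
rows = cur.execute("SELECT abort FROM abort ORDER BY abort ASC").fetchall()
print(rows)  # [(99,)]
```

Fully reserved words (and `if`, as the test shows) still need double quotes, which is why the TCL loop special-cases them.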
|
{
"pile_set_name": "Github"
}
|
################################################################################
#
# uftp
#
################################################################################
UFTP_VERSION = 4.10.2
UFTP_SITE = http://sourceforge.net/projects/uftp-multicast/files/source-tar
UFTP_LICENSE = GPL-3.0+
UFTP_LICENSE_FILES = LICENSE.txt
ifeq ($(BR2_PACKAGE_OPENSSL),y)
UFTP_DEPENDENCIES += host-pkgconf openssl
UFTP_MAKE_OPTS += CRYPT_LIB="`$(PKG_CONFIG_HOST_BINARY) --libs libcrypto`"
else
UFTP_MAKE_OPTS += NO_ENCRYPTION=1
endif
define UFTP_BUILD_CMDS
$(TARGET_CONFIGURE_OPTS) $(MAKE) -C $(@D) $(UFTP_MAKE_OPTS)
endef
define UFTP_INSTALL_TARGET_CMDS
$(TARGET_CONFIGURE_OPTS) $(MAKE) -C $(@D) $(UFTP_MAKE_OPTS) \
DESTDIR=$(TARGET_DIR) install
endef
$(eval $(generic-package))
|
{
"pile_set_name": "Github"
}
|
package mockit;
import static org.junit.Assert.*;
import org.junit.*;
public final class TestedClassWithNoDITest
{
public static final class TestedClass {
private final Dependency dependency = new Dependency();
public boolean doSomeOperation() { return dependency.doSomething() > 0; }
}
static class Dependency { int doSomething() { return -1; } }
@Tested TestedClass tested1;
@Tested final TestedClass tested2 = new TestedClass();
@Tested TestedClass tested3;
@Tested NonPublicTestedClass tested4;
@Tested final TestedClass tested5 = null;
@Mocked Dependency mock;
TestedClass tested;
@Before
public void setUp() {
assertNotNull(mock);
assertNull(tested);
tested = new TestedClass();
assertNull(tested3);
tested3 = tested;
assertNull(tested1);
assertNotNull(tested2);
assertNull(tested4);
assertNull(tested5);
}
@Test
public void verifyTestedFields() {
assertNull(tested5);
assertNotNull(tested4);
assertNotNull(tested3);
assertSame(tested, tested3);
assertNotNull(tested2);
assertNotNull(tested1);
}
@Test
public void exerciseAutomaticallyInstantiatedTestedObject() {
new Expectations() {{ mock.doSomething(); result = 1; }};
assertTrue(tested1.doSomeOperation());
}
@Test
public void exerciseManuallyInstantiatedTestedObject() {
new Expectations() {{ mock.doSomething(); result = 1; }};
assertTrue(tested2.doSomeOperation());
new FullVerifications() {};
}
@Test
public void exerciseAnotherManuallyInstantiatedTestedObject() {
assertFalse(tested3.doSomeOperation());
new Verifications() {{ mock.doSomething(); times = 1; }};
}
}
class NonPublicTestedClass {
@SuppressWarnings("RedundantNoArgConstructor")
NonPublicTestedClass() {}
}
|
{
"pile_set_name": "Github"
}
|
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta http-equiv="Content-Type" content="text/xhtml;charset=UTF-8"/>
<meta http-equiv="X-UA-Compatible" content="IE=9"/>
<meta name="generator" content="Doxygen 1.8.6"/>
<title>qLibc: utilities/qio.c Source File</title>
<link href="tabs.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="jquery.js"></script>
<script type="text/javascript" src="dynsections.js"></script>
<link href="navtree.css" rel="stylesheet" type="text/css"/>
<script type="text/javascript" src="resize.js"></script>
<script type="text/javascript" src="navtree.js"></script>
<script type="text/javascript">
$(document).ready(initResizable);
$(window).load(resizeHeight);
</script>
<link href="doxygen.css" rel="stylesheet" type="text/css" />
</head>
<body>
<div id="top"><!-- do not remove this div, it is closed by doxygen! -->
<div id="titlearea">
<table cellspacing="0" cellpadding="0">
<tbody>
<tr style="height: 56px;">
<td style="padding-left: 0.5em;">
<div id="projectname">qLibc
</div>
</td>
</tr>
</tbody>
</table>
</div>
<!-- end header part -->
<!-- Generated by Doxygen 1.8.6 -->
<div id="navrow1" class="tabs">
<ul class="tablist">
<li><a href="index.html"><span>Main Page</span></a></li>
<li class="current"><a href="files.html"><span>Files</span></a></li>
</ul>
</div>
<div id="navrow2" class="tabs2">
<ul class="tablist">
<li><a href="files.html"><span>File List</span></a></li>
<li><a href="globals.html"><span>Globals</span></a></li>
</ul>
</div>
</div><!-- top -->
<div id="side-nav" class="ui-resizable side-nav-resizable">
<div id="nav-tree">
<div id="nav-tree-contents">
<div id="nav-sync" class="sync"></div>
</div>
</div>
<div id="splitbar" style="-moz-user-select:none;"
class="ui-resizable-handle">
</div>
</div>
<script type="text/javascript">
$(document).ready(function(){initNavTree('qio_8c_source.html','');});
</script>
<div id="doc-content">
<div class="header">
<div class="headertitle">
<div class="title">qio.c</div> </div>
</div><!--header-->
<div class="contents">
<a href="qio_8c.html">Go to the documentation of this file.</a><div class="fragment"><div class="line"><a name="l00001"></a><span class="lineno"> 1</span> <span class="comment">/******************************************************************************</span></div>
<div class="line"><a name="l00002"></a><span class="lineno"> 2</span> <span class="comment"> * qLibc</span></div>
<div class="line"><a name="l00003"></a><span class="lineno"> 3</span> <span class="comment"> *</span></div>
<div class="line"><a name="l00004"></a><span class="lineno"> 4</span> <span class="comment"> * Copyright (c) 2010-2015 Seungyoung Kim.</span></div>
<div class="line"><a name="l00005"></a><span class="lineno"> 5</span> <span class="comment"> * All rights reserved.</span></div>
<div class="line"><a name="l00006"></a><span class="lineno"> 6</span> <span class="comment"> *</span></div>
<div class="line"><a name="l00007"></a><span class="lineno"> 7</span> <span class="comment"> * Redistribution and use in source and binary forms, with or without</span></div>
<div class="line"><a name="l00008"></a><span class="lineno"> 8</span> <span class="comment"> * modification, are permitted provided that the following conditions are met:</span></div>
<div class="line"><a name="l00009"></a><span class="lineno"> 9</span> <span class="comment"> *</span></div>
<div class="line"><a name="l00010"></a><span class="lineno"> 10</span> <span class="comment"> * 1. Redistributions of source code must retain the above copyright notice,</span></div>
<div class="line"><a name="l00011"></a><span class="lineno"> 11</span> <span class="comment"> * this list of conditions and the following disclaimer.</span></div>
<div class="line"><a name="l00012"></a><span class="lineno"> 12</span> <span class="comment"> * 2. Redistributions in binary form must reproduce the above copyright notice,</span></div>
<div class="line"><a name="l00013"></a><span class="lineno"> 13</span> <span class="comment"> * this list of conditions and the following disclaimer in the documentation</span></div>
<div class="line"><a name="l00014"></a><span class="lineno"> 14</span> <span class="comment"> * and/or other materials provided with the distribution.</span></div>
<div class="line"><a name="l00015"></a><span class="lineno"> 15</span> <span class="comment"> *</span></div>
<div class="line"><a name="l00016"></a><span class="lineno"> 16</span> <span class="comment"> * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"</span></div>
<div class="line"><a name="l00017"></a><span class="lineno"> 17</span> <span class="comment"> * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE</span></div>
<div class="line"><a name="l00018"></a><span class="lineno"> 18</span> <span class="comment"> * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE</span></div>
<div class="line"><a name="l00019"></a><span class="lineno"> 19</span> <span class="comment"> * ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE</span></div>
<div class="line"><a name="l00020"></a><span class="lineno"> 20</span> <span class="comment"> * LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR</span></div>
<div class="line"><a name="l00021"></a><span class="lineno"> 21</span> <span class="comment"> * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF</span></div>
<div class="line"><a name="l00022"></a><span class="lineno"> 22</span> <span class="comment"> * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS</span></div>
<div class="line"><a name="l00023"></a><span class="lineno"> 23</span> <span class="comment"> * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN</span></div>
<div class="line"><a name="l00024"></a><span class="lineno"> 24</span> <
|
{
"pile_set_name": "Github"
}
|
# Azure Service Bus Geo-disaster recovery
To learn more about Azure Service Bus, please visit our [marketing page](https://azure.microsoft.com/en-us/services/service-bus/).
To learn more about our Geo-DR feature in general please follow [this](https://docs.microsoft.com/azure/service-bus-messaging/service-bus-geo-dr) link.
This sample shows how to:
1. Achieve Geo-DR for an Service Bus namespace.
2. Create a namespace with live metadata replication between two customer-chosen regions.
This sample consists of three parts:
1. The main scenario showing management (Setup, failover, remove pairing) of new or existing namespaces sample can be found [here](https://github.com/Azure/azure-service-bus/tree/master/samples/DotNet/Microsoft.ServiceBus.Messaging/GeoDR/SBGeoDR2/SBGeoDR2)
2. The scenario in which you want to use an existing namespace name as alias can be found [here](https://github.com/Azure/azure-service-bus/tree/master/samples/DotNet/Microsoft.ServiceBus.Messaging/GeoDR/SBGeoDR2/SBGeoDR_existing_namespace_name). Make sure to look thoroughly through the comments, as this diverges slightly from the main scenario. Examine both App.config and Program.cs. ***Note:*** If you do not fail over but only break the pairing, there is no need to delete the alias, as the namespace name and alias are the same. If you do fail over, you need to delete the alias if you want to use the namespace outside of a DR setup.
3. A sample on how to access the alias connection string which can be found [here](https://github.com/Azure/azure-service-bus/tree/master/samples/DotNet/Microsoft.ServiceBus.Messaging/GeoDR/TestGeoDR/ConsoleApp1).
## Getting Started
### Prerequisites
In order to get started using the sample (as it uses the Service Bus management libraries), you must authenticate with Azure Active Directory (AAD). This requires you to authenticate as a Service Principal, which provides access to your Azure resources.
To obtain a service principal please do the following steps:
1. Go to the Azure Portal and select Azure Active Directory in the left menu.
2. Create a new Application under App registrations / + New application registration.
1. The application should be of type Web app / API.
2. You can provide any URL for your application as sign on URL.
3. Navigate to your newly created application
3. Application or AppId is the client Id. Note it down as you will need it for the sample.
4. Select keys and create a new key. Note down the Value as you won't be able to access it later.
5. Go back to the root Azure Active Directory and select properties.
1. Note down the Directory ID as this is your TenantId.
6. You must have ‘Owner’ permissions under Role for the resource group that you wish to run the sample on. Regardless of whether you want to use an existing namespace or create a new one, make sure to add the newly created application as owner under Access Control (IAM).
For more information on creating a Service Principal, refer to the following articles:
* [Use the Azure Portal to create Active Directory application and service principal that can access resources](https://docs.microsoft.com/azure/azure-resource-manager/resource-group-create-service-principal-portal)
* [Use Azure PowerShell to create a service principal to access resources](https://docs.microsoft.com/azure/azure-resource-manager/resource-group-authenticate-service-principal)
* [Use Azure CLI to create a service principal to access resources](https://docs.microsoft.com/azure/azure-resource-manager/resource-group-authenticate-service-principal-cli)
<!-- The above articles helps you to obtain an AppId (ClientId), TenantId, and ClientSecret (Authentication Key), all of which are required to authenticate the management libraries. Finally, when creating your Active Directory application, if you do not have a sign-on URL to input in the create step, simply input any URL format string e.g. https://contoso.org/exampleapp -->
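Before wiring the ClientId, TenantId, and ClientSecret into App.config, it can help to sanity-check them. The sketch below (in Python for brevity; the function and argument names are illustrative and not part of the sample) catches the most common copy/paste mistake — a ClientId or TenantId that is not a GUID:

```python
import uuid

def check_aad_values(client_id, tenant_id, client_secret):
    """Return a list of problems with the gathered AAD values."""
    problems = []
    # ClientId (the AppId) and TenantId (the Directory ID) are both GUIDs.
    for name, value in (("ClientId", client_id), ("TenantId", tenant_id)):
        try:
            uuid.UUID(value)
        except (ValueError, AttributeError, TypeError):
            problems.append(f"{name} is not a valid GUID")
    if not client_secret:
        problems.append("ClientSecret (the key Value) is empty")
    return problems

# Hypothetical values -- the GUID below is just an example.
print(check_aad_values("not-a-guid",
                       "564d9bef-acd9-b4e0-c8f0-aea8b9103515",
                       "some-key-value"))
# -> ['ClientId is not a valid GUID']
```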
### Required NuGet packages
1. Microsoft.Azure.Management.ServiceBus
2. Microsoft.IdentityModel.Clients.ActiveDirectory - used to authenticate with AAD
### Running the sample
1. Please use Visual Studio 2017.
2. Make sure all assemblies are in place.
3. Populate the corresponding values in the App.config.
4. Build the solution.
5. Make sure to execute on-screen option A before any other option.
The Geo DR actions are:
* CreatePairing
For creating a paired region. After this, you should see the metadata (i.e. Queues, Topics, and Subscriptions) replicated to the secondary namespace.
* FailOver
Simulating a failover. After this action, the secondary namespace becomes the primary.
* BreakPairing
For breaking the pairing between a primary and secondary namespace
* DeleteAlias
For deleting an alias, that contains information about the primary-secondary pairing
* GetConnectionStrings
In a Geo DR enabled namespace, Service Bus should be accessed only via the alias. This is because the alias can point to either the primary namespace or the failed-over one; that way, users do not have to adjust the connection strings in their apps to point to a different namespace in the case of a failover.
The way to get the alias connection string is shown in a separate console app, which you can also use to test your newly geo-paired namespaces. It can be found [here](https://github.com/Azure/azure-service-bus/tree/master/samples/DotNet/Microsoft.ServiceBus.Messaging/GeoDR/TestGeoDR).
***Note:*** The AAD access data for the GeoDR sample must also be used for the Test sample.
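The alias connection string returned by GetConnectionStrings follows the standard `Endpoint=...;SharedAccessKeyName=...;SharedAccessKey=...` layout. As a quick illustration of that layout (shown in Python for brevity — the alias name and key below are made up, and the sample itself obtains the string through the C# management client), it can be split into its parts like this:

```python
def parse_connection_string(cs):
    """Split a Service Bus connection string into its key/value parts."""
    # Split on ';' first, then on the *first* '=' only, because the
    # SharedAccessKey value itself may end with '=' padding characters.
    return dict(part.split("=", 1) for part in cs.rstrip(";").split(";"))

# Hypothetical alias connection string -- not a real namespace or key.
alias_cs = ("Endpoint=sb://mygeodralias.servicebus.windows.net/;"
            "SharedAccessKeyName=RootManageSharedAccessKey;"
            "SharedAccessKey=fakeKeyValue=")

parts = parse_connection_string(alias_cs)
print(parts["Endpoint"])  # -> sb://mygeodralias.servicebus.windows.net/
```

Because the endpoint names the alias rather than a concrete namespace, the same parsed parts stay valid before and after a failover.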
|
{
"pile_set_name": "Github"
}
|
/*
* SonarLint for Visual Studio
* Copyright (C) 2016-2020 SonarSource SA
* mailto:info AT sonarsource DOT com
*
* This program is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; either
* version 3 of the License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public License
* along with this program; if not, write to the Free Software Foundation,
* Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
*/
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.IO.Abstractions;
using System.Linq;
using EnvDTE;
using SonarLint.VisualStudio.Core.Binding;
using SonarLint.VisualStudio.Core.Helpers;
using SonarLint.VisualStudio.Integration.Resources;
namespace SonarLint.VisualStudio.Integration
{
internal class SolutionRuleSetsInformationProvider : ISolutionRuleSetsInformationProvider
{
public const char RuleSetDirectoriesValueSpliter = ';';
private readonly IServiceProvider serviceProvider;
private readonly ILogger logger;
private readonly IFileSystem fileSystem;
public SolutionRuleSetsInformationProvider(IServiceProvider serviceProvider, ILogger logger)
: this(serviceProvider, logger, new FileSystem())
{
}
internal SolutionRuleSetsInformationProvider(IServiceProvider serviceProvider, ILogger logger, IFileSystem fileSystem)
{
this.serviceProvider = serviceProvider ?? throw new ArgumentNullException(nameof(serviceProvider));
this.logger = logger ?? throw new ArgumentNullException(nameof(logger));
this.fileSystem = fileSystem ?? throw new ArgumentNullException(nameof(fileSystem));
}
public IEnumerable<RuleSetDeclaration> GetProjectRuleSetsDeclarations(Project project)
{
if (project == null)
{
throw new ArgumentNullException(nameof(project));
}
return GetProjectRuleSetsDeclarationsIterator(project);
}
private IEnumerable<RuleSetDeclaration> GetProjectRuleSetsDeclarationsIterator(Project project)
{
/* This method walks through all of the available configurations (e.g. Debug, Release, Foo) and
* attempts to fetch the values of a couple of properties from the project (CodeAnalysisRuleSet
* and CodeAnalysisRuleSetDirectories). The collected data is put into a data object
* and returned to the caller. The collected data includes the DTE Property object itself, which
* is used later to update the ruleset value.
*
* TODO: consider refactoring. The code seems over-complicated: it finds the "ruleset"
* property for all configurations, then backtracks to find the configuration, then looks
* for the corresponding "ruleset directories" property.
* Note: we are now fetching the "ruleset directories" property from the MSBuild project,
* rather than through the DTE (the previous version of this code that used the DTE fails
* for C# and VB projects that use the new project system).
*/
var declarations = new List<RuleSetDeclaration>();
var projectSystem = this.serviceProvider.GetService<IProjectSystemHelper>();
var ruleSetProperties = VsShellUtils.GetProjectProperties(project, Constants.CodeAnalysisRuleSetPropertyKey);
Debug.Assert(ruleSetProperties != null);
Debug.Assert(ruleSetProperties.All(p => p != null), "Not expecting nulls in the list of properties");
if (!ruleSetProperties.Any())
{
logger.WriteLine(Strings.CouldNotFindCodeAnalysisRuleSetPropertyOnProject, project.UniqueName);
}
foreach (Property ruleSetProperty in ruleSetProperties)
{
string activationContext = TryGetPropertyConfiguration(ruleSetProperty)?.ConfigurationName ?? string.Empty;
string ruleSetDirectoriesValue = projectSystem.GetProjectProperty(project, Constants.CodeAnalysisRuleSetDirectoriesPropertyKey, activationContext);
string[] ruleSetDirectories = ruleSetDirectoriesValue?.Split(new[] { RuleSetDirectoriesValueSpliter }, StringSplitOptions.RemoveEmptyEntries) ?? new string[0];
string ruleSetValue = ruleSetProperty.Value as string;
var declaration = new RuleSetDeclaration(project, ruleSetProperty, ruleSetValue, activationContext, ruleSetDirectories);
declarations.Add(declaration);
}
return declarations;
}
public string GetSolutionSonarQubeRulesFolder(SonarLintMode bindingMode)
{
bindingMode.ThrowIfNotConnected();
var projectSystem = this.serviceProvider.GetService<IProjectSystemHelper>();
string solutionFullPath = projectSystem.GetCurrentActiveSolution()?.FullName;
// Solution closed?
if (string.IsNullOrWhiteSpace(solutionFullPath))
{
return null;
}
string solutionRoot = Path.GetDirectoryName(solutionFullPath);
string ruleSetDirectoryRoot = Path.Combine(solutionRoot,
bindingMode == SonarLintMode.LegacyConnected ?
Constants.LegacySonarQubeManagedFolderName :
Constants.SonarlintManagedFolderName);
return ruleSetDirectoryRoot;
}
public bool TryGetProjectRuleSetFilePath(RuleSetDeclaration declaration, out string fullFilePath)
{
if (string.IsNullOrWhiteSpace(declaration.RuleSetPath))
{
fullFilePath = null;
return false;
}
var options = new List<string>();
options.Add(declaration.RuleSetPath); // Might be a full path
options.Add(PathHelper.ResolveRelativePath(declaration.RuleSetPath, declaration.RuleSetProjectFullName)); // Relative to project
// Note: currently we don't search in rule set directories since we expect the project rule set
// to be relative to the project. We can add this in the future if it will be needed.
fullFilePath = options.FirstOrDefault(fileSystem.File.Exists);
return !string.IsNullOrWhiteSpace(fullFilePath);
}
private static Configuration TryGetPropertyConfiguration(Property property)
{
Configuration configuration = property.Collection.Parent as Configuration; // Could be null if the one used is the Project level one.
Debug.Assert(configuration != null || property.Collection.Parent is Project, $"Unexpected property parent type: {property.Collection.Parent.GetType().FullName}");
return configuration;
}
}
}
|
{
"pile_set_name": "Github"
}
|
// VirtThread.cpp
#include "VirtThread.h"
static THREAD_FUNC_DECL CoderThread(void *p)
{
for (;;)
{
CVirtThread *t = (CVirtThread *)p;
t->StartEvent.Lock();
if (t->Exit)
return 0;
t->Execute();
t->FinishedEvent.Set();
}
}
WRes CVirtThread::Create()
{
RINOK(StartEvent.CreateIfNotCreated());
RINOK(FinishedEvent.CreateIfNotCreated());
StartEvent.Reset();
FinishedEvent.Reset();
Exit = false;
if (Thread.IsCreated())
return S_OK;
return Thread.Create(CoderThread, this);
}
void CVirtThread::Start()
{
Exit = false;
StartEvent.Set();
}
void CVirtThread::WaitThreadFinish()
{
Exit = true;
if (StartEvent.IsCreated())
StartEvent.Set();
if (Thread.IsCreated())
{
Thread.Wait();
Thread.Close();
}
}
|
{
"pile_set_name": "Github"
}
|
apiVersion: v1
kind: Pod
metadata:
name: website
labels:
app: website
role: frontend
spec:
containers:
- name: website
image: nginx
volumeMounts:
- mountPath: /cache
name: cache-volume
ports:
- containerPort: 80
volumes:
- name: cache-volume
emptyDir: {}
|
{
"pile_set_name": "Github"
}
|
/*
LiquidCrystal Library - Hello World
Demonstrates the use a 16x2 LCD display. The LiquidCrystal
library works with all LCD displays that are compatible with the
Hitachi HD44780 driver. There are many of them out there, and you
can usually tell them by the 16-pin interface.
This sketch prints "Hello World!" to the LCD
and shows the time.
The circuit:
* LCD RS pin to digital pin 12
* LCD Enable pin to digital pin 11
* LCD D4 pin to digital pin 5
* LCD D5 pin to digital pin 4
* LCD D6 pin to digital pin 3
* LCD D7 pin to digital pin 2
* LCD R/W pin to ground
* 10K resistor:
* ends to +5V and ground
* wiper to LCD VO pin (pin 3)
Library originally added 18 Apr 2008
by David A. Mellis
library modified 5 Jul 2009
by Limor Fried (http://www.ladyada.net)
example added 9 Jul 2009
by Tom Igoe
modified 22 Nov 2010
by Tom Igoe
This example code is in the public domain.
http://www.arduino.cc/en/Tutorial/LiquidCrystal
*/
// include the library code:
#include <LiquidCrystal.h>
// initialize the library with the numbers of the interface pins
LiquidCrystal lcd(12, 11, 5, 4, 3, 2);
void setup() {
// set up the LCD's number of columns and rows:
lcd.begin(16, 2);
// Print a message to the LCD.
lcd.print("hello, world!");
}
void loop() {
// set the cursor to column 0, line 1
// (note: line 1 is the second row, since counting begins with 0):
lcd.setCursor(0, 1);
// print the number of seconds since reset:
lcd.print(millis()/1000);
}
|
{
"pile_set_name": "Github"
}
|
#include <stdio.h>
#include <stdlib.h>
#include "item.h"
#include "list.h"
link new(Item item)
{
    /* allocate the node itself (sizeof *t), not just a pointer:
     * malloc(sizeof(link)) would only reserve pointer-sized storage */
    link t = malloc(sizeof *t);
    t->item = item;
    t->next = NULL;
    return t;
}
link list_init(Item item)
{
return new(item);
}
void list_insert_next(link last, Item item)
{
last->next = new(item);
}
void list_print(link t)
{
while (t != NULL)
{
print_item(t->item);
t = t->next;
}
}
|
{
"pile_set_name": "Github"
}
|
// (C) Copyright Gennadiy Rozental 2002-2008.
// Distributed under the Boost Software License, Version 1.0.
// (See accompanying file LICENSE_1_0.txt or copy at
// http://www.boost.org/LICENSE_1_0.txt)
// See http://www.boost.org/libs/test for the library home page.
//
// File : $RCSfile$
//
// Version : $Revision: 49312 $
//
// Description : simple minimal testing definitions and implementation
// ***************************************************************************
#ifndef BOOST_TEST_MINIMAL_HPP_071894GER
#define BOOST_TEST_MINIMAL_HPP_071894GER
#define BOOST_CHECK(exp) \
( (exp) \
? static_cast<void>(0) \
: boost::minimal_test::report_error(#exp,__FILE__,__LINE__, BOOST_CURRENT_FUNCTION) )
#define BOOST_REQUIRE(exp) \
( (exp) \
? static_cast<void>(0) \
: boost::minimal_test::report_critical_error(#exp,__FILE__,__LINE__,BOOST_CURRENT_FUNCTION))
#define BOOST_ERROR( msg_ ) \
boost::minimal_test::report_error( (msg_),__FILE__,__LINE__, BOOST_CURRENT_FUNCTION, true )
#define BOOST_FAIL( msg_ ) \
boost::minimal_test::report_critical_error( (msg_),__FILE__,__LINE__, BOOST_CURRENT_FUNCTION, true )
//____________________________________________________________________________//
// Boost.Test
#include <boost/test/detail/global_typedef.hpp>
#include <boost/test/impl/execution_monitor.ipp>
#include <boost/test/impl/debug.ipp>
#include <boost/test/utils/class_properties.hpp>
#include <boost/test/utils/basic_cstring/io.hpp>
// Boost
#include <boost/cstdlib.hpp> // for exit codes
#include <boost/current_function.hpp> // for BOOST_CURRENT_FUNCTION
// STL
#include <iostream> // std::cerr, std::endl
#include <string> // std::string
#include <boost/test/detail/suppress_warnings.hpp>
//____________________________________________________________________________//
int test_main( int argc, char* argv[] ); // prototype for users test_main()
namespace boost {
namespace minimal_test {
typedef boost::unit_test::const_string const_string;
inline unit_test::counter_t& errors_counter() { static unit_test::counter_t ec = 0; return ec; }
inline void
report_error( const char* msg, const char* file, int line, const_string func_name, bool is_msg = false )
{
++errors_counter();
std::cerr << file << "(" << line << "): ";
if( is_msg )
std::cerr << msg;
else
std::cerr << "test " << msg << " failed";
if( func_name != "(unknown)" )
std::cerr << " in function: '" << func_name << "'";
std::cerr << std::endl;
}
inline void
report_critical_error( const char* msg, const char* file, int line, const_string func_name, bool is_msg = false )
{
report_error( msg, file, line, func_name, is_msg );
throw boost::execution_aborted();
}
class caller {
public:
// constructor
caller( int argc, char** argv )
: m_argc( argc ), m_argv( argv ) {}
// execution monitor hook implementation
int operator()() { return test_main( m_argc, m_argv ); }
private:
// Data members
int m_argc;
char** m_argv;
}; // monitor
} // namespace minimal_test
} // namespace boost
//____________________________________________________________________________//
int BOOST_TEST_CALL_DECL main( int argc, char* argv[] )
{
using namespace boost::minimal_test;
try {
::boost::execution_monitor ex_mon;
int run_result = ex_mon.execute( caller( argc, argv ) );
BOOST_CHECK( run_result == 0 || run_result == boost::exit_success );
}
catch( boost::execution_exception const& exex ) {
if( exex.code() != boost::execution_exception::no_error )
BOOST_ERROR( (std::string( "exception \"" ).
append( exex.what().begin(), exex.what().end() ).
append( "\" caught" ) ).c_str() );
std::cerr << "\n**** Testing aborted.";
}
if( boost::minimal_test::errors_counter() != 0 ) {
std::cerr << "\n**** " << errors_counter()
<< " error" << (errors_counter() > 1 ? "s" : "" ) << " detected\n";
return boost::exit_test_failure;
}
std::cout << "\n**** no errors detected\n";
return boost::exit_success;
}
//____________________________________________________________________________//
#include <boost/test/detail/enable_warnings.hpp>
#endif // BOOST_TEST_MINIMAL_HPP_071894GER
|
{
"pile_set_name": "Github"
}
|
package zmq.socket.radiodish;
import zmq.Ctx;
import zmq.Msg;
import zmq.Options;
import zmq.SocketBase;
import zmq.ZError;
import zmq.ZMQ;
import zmq.io.IOThread;
import zmq.io.SessionBase;
import zmq.io.net.Address;
import zmq.pipe.Pipe;
import zmq.socket.pubsub.Dist;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
public class Radio extends SocketBase
{
private final Map<String, List<Pipe>> subscriptions;
private final Dist dist;
public Radio(Ctx parent, int tid, int sid)
{
super(parent, tid, sid, true);
options.type = ZMQ.ZMQ_RADIO;
subscriptions = new HashMap<>();
dist = new Dist();
}
@Override
public void xattachPipe(Pipe pipe, boolean subscribe2all, boolean isLocallyInitiated)
{
assert (pipe != null);
pipe.setNoDelay();
dist.attach(pipe);
xreadActivated(pipe);
}
@Override
public void xreadActivated(Pipe pipe)
{
Msg msg = pipe.read();
while (msg != null) {
if (msg.isJoin()) {
List<Pipe> pipes = subscriptions.computeIfAbsent(msg.getGroup(), k -> new ArrayList<>());
pipes.add(pipe);
}
else if (msg.isLeave()) {
List<Pipe> pipes = subscriptions.get(msg.getGroup());
if (pipes != null) {
pipes.remove(pipe);
if (pipes.isEmpty()) {
subscriptions.remove(msg.getGroup());
}
}
}
msg = pipe.read();
}
}
@Override
public void xwriteActivated(Pipe pipe)
{
dist.activated(pipe);
}
@Override
public void xpipeTerminated(Pipe pipe)
{
subscriptions.entrySet().removeIf(entry -> {
entry.getValue().remove(pipe);
return entry.getValue().isEmpty();
});
dist.terminated(pipe);
}
@Override
protected boolean xsend(Msg msg)
{
// RADIO sockets do not allow multipart data (ZMQ_SNDMORE)
if (msg.hasMore()) {
errno.set(ZError.EINVAL);
return false;
}
dist.unmatch();
List<Pipe> range = subscriptions.get(msg.getGroup());
if (range != null) {
for (int i = 0; i < range.size(); i++) {
dist.match(range.get(i));
}
}
dist.sendToMatching(msg);
return true;
}
@Override
protected Msg xrecv()
{
errno.set(ZError.ENOTSUP);
// Messages cannot be received from RADIO socket.
throw new UnsupportedOperationException();
}
@Override
protected boolean xhasIn()
{
return false;
}
@Override
protected boolean xhasOut()
{
return dist.hasOut();
}
public static class RadioSession extends SessionBase
{
enum State
{
GROUP,
BODY
}
private State state;
private Msg pending;
public RadioSession(IOThread ioThread, boolean connect, SocketBase socket, final Options options,
final Address addr)
{
super(ioThread, connect, socket, options, addr);
state = State.GROUP;
}
@Override
public boolean pushMsg(Msg msg)
{
if (msg.isCommand()) {
byte commandNameSize = msg.get(0);
if (msg.size() < commandNameSize + 1) {
return super.pushMsg(msg);
}
byte[] data = msg.data();
String commandName = new String(data, 1, commandNameSize, StandardCharsets.US_ASCII);
int groupLength;
String group;
Msg joinLeaveMsg = new Msg();
// Set the msg type to either JOIN or LEAVE
if (commandName.equals("JOIN")) {
groupLength = msg.size() - 5;
group = new String(data, 5, groupLength, StandardCharsets.US_ASCII);
joinLeaveMsg.initJoin();
}
else if (commandName.equals("LEAVE")) {
groupLength = msg.size() - 6;
group = new String(data, 6, groupLength, StandardCharsets.US_ASCII);
joinLeaveMsg.initLeave();
}
// If it is not a JOIN or LEAVE just push the message
else {
return super.pushMsg(msg);
}
// Set the group
joinLeaveMsg.setGroup(group);
// Push the join or leave command
msg = joinLeaveMsg;
return super.pushMsg(msg);
}
return super.pushMsg(msg);
}
@Override
protected Msg pullMsg()
{
Msg msg;
switch (state) {
case GROUP:
pending = super.pullMsg();
if (pending == null) {
return null;
}
// First frame is the group
msg = new Msg(pending.getGroup().getBytes(StandardCharsets.US_ASCII));
msg.setFlags(Msg.MORE);
// Next status is the body
state = State.BODY;
break;
case BODY:
msg = pending;
state = State.GROUP;
break;
default:
throw new IllegalStateException();
}
return msg;
}
@Override
protected void reset()
{
super.reset();
state = State.GROUP;
}
}
}
|
{
"pile_set_name": "Github"
}
|
---
title: Rosling.bubbles()
subtitle: "The Bubbles Animation in Hans Roslings Talk"
date: '2017-04-04'
slug: Rosling-bubbles
---
In Hans Rosling's fascinating talk `Debunking third-world myths with the best
stats you've ever seen`, he used a lot of bubble plots to illustrate trends
behind the data over time. This function gives an imitation of those moving
bubbles; moreover, since it is based on `symbols`, we can
also make use of other symbols such as squares, rectangles, thermometers,
etc.
Suppose we have observations of $n$ individuals over
`ani.options('nmax')` years. In this animation, the data of each year
will be shown in the bubbles (symbols) plot; as time goes on, certain trends
will be revealed (like those in Rosling's talk). Please note that the
arrangement of the data for bubbles (symbols) should be a matrix like
$A_{ijk}$ in which $i$ is the individual id (from 1 to n), $j$
denotes the $j$-th variable (from 1 to p) and $k$ indicates the time
from 1 to `ani.options('nmax')`.
And the length of `x` and `y` should be equal to the number of rows
of this matrix.
```{r demo-a, cache=TRUE, interval=.2}
library(animation)
ani.options(interval = 0.2, nmax = 50)
## use default arguments (random numbers); you may try to find the real data
par(mar = c(4, 4, 0.2, 0.2))
Rosling.bubbles()
```
```{r demo-b, cache=TRUE, interval=.2}
## rectangles
Rosling.bubbles(type = 'rectangles', data = matrix(abs(rnorm(50 * 10 * 2)), ncol = 2))
```
|
{
"pile_set_name": "Github"
}
|
/* vi:set ts=8 sts=4 sw=4 noet:
*
* VIM - Vi IMproved by Bram Moolenaar
*
* Do ":help uganda" in Vim to read copying and usage conditions.
* Do ":help credits" in Vim to see a list of people who contributed.
* See README.txt for an overview of the Vim source code.
*/
/*
* diff.c: code for diff'ing two, three or four buffers.
*
* There are three ways to diff:
* - Shell out to an external diff program, using files.
* - Use the compiled-in xdiff library.
* - Let 'diffexpr' do the work, using files.
*/
#include "vim.h"
#include "xdiff/xdiff.h"
#if defined(FEAT_DIFF) || defined(PROTO)
static int diff_busy = FALSE; // using diff structs, don't change them
static int diff_need_update = FALSE; // ex_diffupdate needs to be called
// flags obtained from the 'diffopt' option
#define DIFF_FILLER 0x001 // display filler lines
#define DIFF_IBLANK 0x002 // ignore empty lines
#define DIFF_ICASE 0x004 // ignore case
#define DIFF_IWHITE 0x008 // ignore change in white space
#define DIFF_IWHITEALL 0x010 // ignore all white space changes
#define DIFF_IWHITEEOL 0x020 // ignore change in white space at EOL
#define DIFF_HORIZONTAL 0x040 // horizontal splits
#define DIFF_VERTICAL 0x080 // vertical splits
#define DIFF_HIDDEN_OFF 0x100 // diffoff when hidden
#define DIFF_INTERNAL 0x200 // use internal xdiff algorithm
#define DIFF_CLOSE_OFF 0x400 // diffoff when closing window
#define ALL_WHITE_DIFF (DIFF_IWHITE | DIFF_IWHITEALL | DIFF_IWHITEEOL)
static int diff_flags = DIFF_INTERNAL | DIFF_FILLER | DIFF_CLOSE_OFF;
static long diff_algorithm = 0;
#define LBUFLEN 50 // length of line in diff file
static int diff_a_works = MAYBE; // TRUE when "diff -a" works, FALSE when it
// doesn't work, MAYBE when not checked yet
#if defined(MSWIN)
static int diff_bin_works = MAYBE; // TRUE when "diff --binary" works, FALSE
// when it doesn't work, MAYBE when not
// checked yet
#endif
// used for diff input
typedef struct {
char_u *din_fname; // used for external diff
mmfile_t din_mmfile; // used for internal diff
} diffin_T;
// used for diff result
typedef struct {
char_u *dout_fname; // used for external diff
garray_T dout_ga; // used for internal diff
} diffout_T;
// two diff inputs and one result
typedef struct {
diffin_T dio_orig; // original file input
diffin_T dio_new; // new file input
diffout_T dio_diff; // diff result
int dio_internal; // using internal diff
} diffio_T;
static int diff_buf_idx(buf_T *buf);
static int diff_buf_idx_tp(buf_T *buf, tabpage_T *tp);
static void diff_mark_adjust_tp(tabpage_T *tp, int idx, linenr_T line1, linenr_T line2, long amount, long amount_after);
static void diff_check_unchanged(tabpage_T *tp, diff_T *dp);
static int diff_check_sanity(tabpage_T *tp, diff_T *dp);
static int check_external_diff(diffio_T *diffio);
static int diff_file(diffio_T *diffio);
static int diff_equal_entry(diff_T *dp, int idx1, int idx2);
static int diff_cmp(char_u *s1, char_u *s2);
#ifdef FEAT_FOLDING
static void diff_fold_update(diff_T *dp, int skip_idx);
#endif
static void diff_read(int idx_orig, int idx_new, diffout_T *fname);
static void diff_copy_entry(diff_T *dprev, diff_T *dp, int idx_orig, int idx_new);
static diff_T *diff_alloc_new(tabpage_T *tp, diff_T *dprev, diff_T *dp);
static int parse_diff_ed(char_u *line, linenr_T *lnum_orig, long *count_orig, linenr_T *lnum_new, long *count_new);
static int parse_diff_unified(char_u *line, linenr_T *lnum_orig, long *count_orig, linenr_T *lnum_new, long *count_new);
static int xdiff_out(void *priv, mmbuffer_t *mb, int nbuf);
/*
* Called when deleting or unloading a buffer: No longer make a diff with it.
*/
void
diff_buf_delete(buf_T *buf)
{
int i;
tabpage_T *tp;
FOR_ALL_TABPAGES(tp)
{
i = diff_buf_idx_tp(buf, tp);
if (i != DB_COUNT)
{
tp->tp_diffbuf[i] = NULL;
tp->tp_diff_invalid = TRUE;
if (tp == curtab)
diff_redraw(TRUE);
}
}
}
/*
* Check if the current buffer should be added to or removed from the list of
* diff buffers.
*/
void
diff_buf_adjust(win_T *win)
{
win_T *wp;
int i;
if (!win->w_p_diff)
{
// When there is no window showing a diff for this buffer, remove
// it from the diffs.
FOR_ALL_WINDOWS(wp)
if (wp->w_buffer == win->w_buffer && wp->w_p_diff)
break;
if (wp == NULL)
{
i = diff_buf_idx(win->w_buffer);
if (i != DB_COUNT)
{
curtab->tp_diffbuf[i] = NULL;
curtab->tp_diff_invalid = TRUE;
diff_redraw(TRUE);
}
}
}
else
diff_buf_add(win->w_buffer);
}
/*
* Add a buffer to make diffs for.
* Call this when a new buffer is being edited in the current window where
* 'diff' is set.
* Marks the current buffer as being part of the diff and requiring updating.
* This must be done before any autocmd, because a command may use info
* about the screen contents.
*/
void
diff_buf_add(buf_T *buf)
{
int i;
if (diff_buf_idx(buf) != DB_COUNT)
return; // It's already there.
for (i = 0; i < DB_COUNT; ++i)
if (curtab->tp_diffbuf[i] == NULL)
{
curtab->tp_diffbuf[i] = buf;
curtab->tp_diff_invalid = TRUE;
diff_redraw(TRUE);
return;
}
semsg(_("E96: Cannot diff more than %d buffers"), DB_COUNT);
}
/*
* Remove all buffers to make diffs for.
*/
static void
diff_buf_clear(void)
{
int i;
for (i = 0; i < DB_COUNT; ++i)
if (curtab->tp_diffbuf[i] != NULL)
{
curtab->tp_diffbuf[i] = NULL;
curtab->tp_diff_invalid = TRUE;
	    diff_redraw(TRUE);
	}
}
|
{
"pile_set_name": "Github"
}
|
<domain type='vmware'>
<name>firmware-efi</name>
<uuid>564d9bef-acd9-b4e0-c8f0-aea8b9103515</uuid>
<memory unit='KiB'>4096</memory>
<os firmware='efi'>
<type>hvm</type>
</os>
</domain>
|
{
"pile_set_name": "Github"
}
|
package com.bird.web.sso;
import lombok.Getter;
import lombok.Setter;
/**
* @author liuxx
* @date 2019/3/1
*/
@Getter
@Setter
public abstract class SsoProperties {
    /**
     * Cookie name; defaults to "Sso-Token"
     */
    private String cookieName = "Sso-Token";

    /**
     * Login page URL
     */
    private String loginPath;
}
|
{
"pile_set_name": "Github"
}
|
/*
* Copyright (c) 2007-2017 Xplenty, Inc. All Rights Reserved.
*
* Project and contact information: http://www.cascading.org/
*
* This file is part of the Cascading project.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package cascading.tuple.io;
import java.util.List;
import cascading.tuple.Tuple;
/** Class IndexTuple allows for managing an int index value with a Tuple instance. Used internally for co-grouping values. */
public class IndexTuple extends Tuple implements Comparable<Object>
{
int index;
Tuple tuple;
/** Constructor IndexTuple creates a new IndexTuple instance. */
public IndexTuple()
{
super( (List<Object>) null );
}
/**
* Constructor IndexTuple creates a new IndexTuple instance.
*
* @param index of type int
* @param tuple of type Tuple
*/
public IndexTuple( int index, Tuple tuple )
{
super( (List<Comparable>) null );
this.index = index;
this.tuple = tuple;
}
public void setIndex( int index )
{
this.index = index;
}
public int getIndex()
{
return index;
}
public void setTuple( Tuple tuple )
{
this.tuple = tuple;
}
public Tuple getTuple()
{
return tuple;
}
@Override
public String print()
{
return printTo( new StringBuffer() ).toString();
}
public StringBuffer printTo( StringBuffer buffer )
{
buffer.append( "{" );
buffer.append( index ).append( ":" );
tuple.printTo( buffer );
buffer.append( "}" );
return buffer;
}
public int compareTo( Object object )
{
if( object instanceof IndexTuple )
return compareTo( (IndexTuple) object );
return -1;
}
public int compareTo( IndexTuple indexTuple )
{
int c = this.index - indexTuple.index;
if( c != 0 )
return c;
return this.tuple.compareTo( indexTuple.tuple );
}
@Override
public boolean equals( Object object )
{
if( this == object )
return true;
if( object == null || getClass() != object.getClass() )
return false;
IndexTuple that = (IndexTuple) object;
if( index != that.index )
return false;
if( tuple != null ? !tuple.equals( that.tuple ) : that.tuple != null )
return false;
return true;
}
@Override
public int hashCode()
{
int result = index;
result = 31 * result + ( tuple != null ? tuple.hashCode() : 0 );
return result;
}
@Override
public String toString()
{
return "[" + index + "]" + tuple;
}
}
|
{
"pile_set_name": "Github"
}
|
# AnotherFakeApi
All URIs are relative to *http://petstore.swagger.io:80/v2*
Method | HTTP request | Description
------------- | ------------- | -------------
[**testSpecialTags**](AnotherFakeApi.md#testSpecialTags) | **PATCH** another-fake/dummy | To test special tags
<a name="testSpecialTags"></a>
# **testSpecialTags**
> Client testSpecialTags(body)
To test special tags
To test special tags
### Example
```java
// Import classes:
//import io.swagger.client.ApiException;
//import io.swagger.client.api.AnotherFakeApi;
AnotherFakeApi apiInstance = new AnotherFakeApi();
Client body = new Client(); // Client | client model
try {
Client result = apiInstance.testSpecialTags(body);
System.out.println(result);
} catch (ApiException e) {
System.err.println("Exception when calling AnotherFakeApi#testSpecialTags");
e.printStackTrace();
}
```
### Parameters
Name | Type | Description | Notes
------------- | ------------- | ------------- | -------------
**body** | [**Client**](Client.md)| client model |
### Return type
[**Client**](Client.md)
### Authorization
No authorization required
### HTTP request headers
- **Content-Type**: application/json
- **Accept**: application/json
|
{
"pile_set_name": "Github"
}
|
{
"token": "",
"team_id": "",
"enterprise_id": "",
"api_app_id": "",
"event": {
"type": "star_added",
"user": "",
"item": {
"type": "message",
"channel": "",
"message": {
"bot_id": "",
"type": "message",
"text": "",
"user": "",
"ts": "0000000000.000000",
"team": "",
"bot_profile": {
"id": "",
"deleted": false,
"name": "",
"updated": 12345,
"app_id": "",
"icons": {
"image_36": "https://www.example.com/",
"image_48": "https://www.example.com/",
"image_72": "https://www.example.com/"
},
"team_id": ""
},
"edited": {
"user": "",
"ts": "0000000000.000000"
},
"attachments": [
{
"service_name": "",
"service_url": "https://www.example.com/",
"title": "",
"title_link": "https://www.example.com/",
"author_name": "",
"author_link": "https://www.example.com/",
"thumb_url": "https://www.example.com/",
"thumb_width": 12345,
"thumb_height": 12345,
"fallback": "",
"video_html": "",
"video_html_width": 12345,
"video_html_height": 12345,
"from_url": "https://www.example.com/",
"service_icon": "https://www.example.com/",
"id": 12345,
"original_url": "https://www.example.com/",
"msg_subtype": "",
"callback_id": "",
"color": "",
"pretext": "",
"author_id": "",
"author_icon": "",
"author_subname": "",
"channel_id": "",
"channel_name": "",
"bot_id": "",
"indent": false,
"is_msg_unfurl": false,
"is_reply_unfurl": false,
"is_thread_root_unfurl": false,
"is_app_unfurl": false,
"app_unfurl_url": "",
"text": "",
"fields": [
{
"title": "",
"value": "",
"short": false
}
],
"footer": "",
"footer_icon": "",
"ts": "",
"mrkdwn_in": [
""
],
"actions": [
{
"id": "",
"name": "",
"text": "",
"style": "",
"type": "button",
"value": "",
"confirm": {
"title": "",
"text": "",
"ok_text": "",
"dismiss_text": ""
},
"options": [
{
"text": "",
"value": ""
}
],
"selected_options": [
{
"text": "",
"value": ""
}
],
"data_source": "",
"min_query_length": 12345,
"option_groups": [
{
"text": ""
}
],
"url": "https://www.example.com/"
}
],
"filename": "",
"size": 12345,
"mimetype": "",
"url": "https://www.example.com/",
"metadata": {
"thumb_64": false,
"thumb_80": false,
"thumb_160": false,
"original_w": 12345,
"original_h": 12345,
"thumb_360_w": 12345,
"thumb_360_h": 12345,
"format": "",
"extension": "",
"rotation": 12345,
"thumb_tiny": ""
}
}
],
"is_starred": false,
"permalink": "https://www.example.com/"
},
"date_create": 12345
},
"event_ts": "0000000000.000000"
},
"type": "event_callback",
"event_id": "",
"event_time": 12345,
"authed_users": [
""
]
}
|
{
"pile_set_name": "Github"
}
|
#include <KlayGE/KlayGE.hpp>
#include <KFL/ErrorHandling.hpp>
#include <KFL/Util.hpp>
#include <KlayGE/Texture.hpp>
#include <KFL/Math.hpp>
#include <KlayGE/TexCompressionBC.hpp>
#include <KlayGE/ResLoader.hpp>
#include <iostream>
#include <fstream>
#include <vector>
#include <cstring>
using namespace std;
using namespace KlayGE;
namespace
{
void DecompressNormal(std::vector<uint8_t>& res_normals, std::vector<uint8_t> const & com_normals)
{
for (size_t i = 0; i < com_normals.size() / 4; ++ i)
{
float x = com_normals[i * 4 + 2] / 255.0f * 2 - 1;
float y = com_normals[i * 4 + 1] / 255.0f * 2 - 1;
float z = sqrt(1 - x * x - y * y);
res_normals[i * 4 + 0] = static_cast<uint8_t>(MathLib::clamp(static_cast<int>((z * 0.5f + 0.5f) * 255 + 0.5f), 0, 255));
res_normals[i * 4 + 1] = com_normals[i * 4 + 1];
res_normals[i * 4 + 2] = com_normals[i * 4 + 2];
res_normals[i * 4 + 3] = 0;
}
}
void DecompressNormalMapSubresource(uint32_t width, uint32_t height, ElementFormat restored_format,
ElementInitData& restored_data, std::vector<uint8_t>& restored_data_block, ElementFormat com_format, ElementInitData const & com_data)
{
KFL_UNUSED(restored_format);
std::vector<uint8_t> normals(width * height * 4);
if (IsCompressedFormat(com_format))
{
std::unique_ptr<TexCompression> tex_codec;
switch (com_format)
{
case EF_BC3:
tex_codec = MakeUniquePtr<TexCompressionBC3>();
break;
case EF_BC5:
tex_codec = MakeUniquePtr<TexCompressionBC5>();
break;
default:
KFL_UNREACHABLE("Compression formats other than BC3 and BC5 are not supported");
}
for (uint32_t y_base = 0; y_base < height; y_base += 4)
{
for (uint32_t x_base = 0; x_base < width; x_base += 4)
{
uint32_t argb[16];
if (EF_BC5 == com_format)
{
uint16_t gr[16];
tex_codec->DecodeBlock(gr, static_cast<uint8_t const *>(com_data.data) + ((y_base / 4) * width / 4 + x_base / 4) * 16);
for (int i = 0; i < 16; ++ i)
{
argb[i] = (gr[i] & 0xFF00) | ((gr[i] & 0xFF) << 16);
}
}
else
{
BOOST_ASSERT(EF_BC3 == com_format);
tex_codec->DecodeBlock(argb, static_cast<uint8_t const *>(com_data.data) + ((y_base / 4) * width / 4 + x_base / 4) * 16);
}
for (int y = 0; y < 4; ++ y)
{
if (y_base + y < height)
{
for (int x = 0; x < 4; ++ x)
{
if (x_base + x < width)
{
std::memcpy(&normals[((y_base + y) * width + (x_base + x)) * 4], &argb[y * 4 + x], sizeof(uint32_t));
}
}
}
}
}
}
}
else
{
if (EF_GR8 == com_format)
{
uint8_t const * gr_data = static_cast<uint8_t const *>(com_data.data);
for (uint32_t y = 0; y < height; ++ y)
{
for (uint32_t x = 0; x < width; ++ x)
{
normals[(y * width + x) * 4 + 0] = 0;
normals[(y * width + x) * 4 + 1] = gr_data[y * com_data.row_pitch + x * 2 + 1];
normals[(y * width + x) * 4 + 2] = gr_data[y * com_data.row_pitch + x * 2 + 0];
normals[(y * width + x) * 4 + 3] = 0xFF;
}
}
}
else
{
BOOST_ASSERT(EF_ABGR8 == com_format);
uint8_t const * abgr_data = static_cast<uint8_t const *>(com_data.data);
for (uint32_t y = 0; y < height; ++ y)
{
for (uint32_t x = 0; x < width; ++ x)
{
normals[(y * width + x) * 4 + 0] = 0;
normals[(y * width + x) * 4 + 1] = abgr_data[y * com_data.row_pitch + x * 4 + 1];
normals[(y * width + x) * 4 + 2] = abgr_data[y * com_data.row_pitch + x * 4 + 0];
normals[(y * width + x) * 4 + 3] = 0xFF;
}
}
}
}
if (restored_format != EF_ARGB8)
{
std::vector<uint8_t> argb8_normals(width * height * 4);
ResizeTexture(&argb8_normals[0], width * 4, width * height * 4, EF_ARGB8, width, height, 1, &normals[0], width * 4,
width * height * 4, restored_format, width, height, 1, TextureFilter::Point);
normals.swap(argb8_normals);
}
restored_data_block.resize(width * height * 4);
restored_data.row_pitch = width * 4;
restored_data.slice_pitch = width * height * 4;
restored_data.data = &restored_data_block[0];
DecompressNormal(restored_data_block, normals);
}
void Normal2NaLength(std::string const & in_file, std::string const & out_file, ElementFormat new_format)
{
TexturePtr in_tex = LoadSoftwareTexture(in_file);
auto const in_type = in_tex->Type();
auto const in_width = in_tex->Width(0);
auto const in_height = in_tex->Height(0);
auto const in_depth = in_tex->Depth(0);
auto const in_num_mipmaps = in_tex->NumMipMaps();
auto const in_array_size = in_tex->ArraySize();
auto const in_format = in_tex->Format();
auto const & in_data = checked_cast<SoftwareTexture&>(*in_tex).SubresourceData();
TexCompressionBC4 bc4_codec;
std::vector<std::vector<uint8_t>> level_lengths(in_num_mipmaps * in_array_size);
std::vector<ElementInitData> new_data(level_lengths.size());
for (size_t array_index = 0; array_index < in_array_size; ++ array_index)
{
ElementInitData restored_data;
std::vector<uint8_t> restored_data_
|
{
"pile_set_name": "Github"
}
|
package cm.aptoide.pt.app.view;
import android.view.LayoutInflater;
import android.view.ViewGroup;
import androidx.recyclerview.widget.RecyclerView;
import cm.aptoide.pt.R;
import cm.aptoide.pt.app.AppViewSimilarApp;
import cm.aptoide.pt.app.view.similar.SimilarAppClickEvent;
import java.text.DecimalFormat;
import java.util.List;
import rx.subjects.PublishSubject;
/**
* Created by franciscocalado on 11/05/18.
*/
public class AppViewSimilarAppsAdapter extends RecyclerView.Adapter<AppViewSimilarAppViewHolder> {
private List<AppViewSimilarApp> similarApps;
private DecimalFormat oneDecimalFormater;
private PublishSubject<SimilarAppClickEvent> appClicked;
private SimilarAppType type;
public AppViewSimilarAppsAdapter(List<AppViewSimilarApp> similarApps,
DecimalFormat oneDecimalFormater, PublishSubject<SimilarAppClickEvent> appClicked,
SimilarAppType type) {
this.similarApps = similarApps;
this.oneDecimalFormater = oneDecimalFormater;
this.appClicked = appClicked;
this.type = type;
}
@Override
public AppViewSimilarAppViewHolder onCreateViewHolder(ViewGroup viewGroup, int viewType) {
return new AppViewSimilarAppViewHolder(LayoutInflater.from(viewGroup.getContext())
.inflate(R.layout.displayable_grid_ad, viewGroup, false), oneDecimalFormater, appClicked);
}
@Override public void onBindViewHolder(AppViewSimilarAppViewHolder appViewSimilarAppViewHolder,
int position) {
if (similarApps.get(position) != null) {
appViewSimilarAppViewHolder.setSimilarApp(similarApps.get(position), type);
}
}
@Override public int getItemViewType(int position) {
return similarApps.get(position)
.getNetworkAdType();
}
@Override public int getItemCount() {
return similarApps.size();
}
public void update(List<AppViewSimilarApp> apps) {
similarApps = apps;
notifyDataSetChanged();
}
public enum SimilarAppType {
APPC_SIMILAR_APPS("appc_similar_apps"), SIMILAR_APPS("similar_apps");
private final String description;
SimilarAppType(String description) {
this.description = description;
}
public String getDescription() {
return description;
}
}
}
|
{
"pile_set_name": "Github"
}
|
#include "Swift_watchOS_Native/Swift_watchOS_Native_base.xcconfig"
VALIDATE_PRODUCT = YES
|
{
"pile_set_name": "Github"
}
|
/**
* Copyright (c) 2010-2020 Contributors to the openHAB project
*
* See the NOTICE file(s) distributed with this work for additional
* information.
*
* This program and the accompanying materials are made available under the
* terms of the Eclipse Public License 2.0 which is available at
* http://www.eclipse.org/legal/epl-2.0
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.openhab.binding.lifx.internal.protocol;
import java.lang.reflect.Constructor;
import java.lang.reflect.Field;
import java.nio.ByteBuffer;
import org.eclipse.jdt.annotation.NonNullByDefault;
/**
* A generic handler that dynamically creates "standard" packet instances.
*
* <p>
* Packet types must have an empty constructor and cannot require any
* additional logic (other than parsing).
*
* @param <T> the packet subtype this handler constructs
*
* @author Tim Buckley - Initial Contribution
* @author Karel Goderis - Enhancement for the V2 LIFX Firmware and LAN Protocol Specification
*/
@NonNullByDefault
public class GenericHandler<T extends Packet> implements PacketHandler<T> {
private Constructor<T> constructor;
private boolean typeFound;
private int type;
public boolean isTypeFound() {
return typeFound;
}
public int getType() {
return type;
}
public GenericHandler(Class<T> clazz) {
try {
constructor = clazz.getConstructor();
} catch (NoSuchMethodException ex) {
throw new IllegalArgumentException("Packet class cannot be handled by GenericHandler", ex);
}
try {
Field typeField = clazz.getField("TYPE");
type = (int) typeField.get(null);
typeFound = true;
} catch (NoSuchFieldException | IllegalAccessException ex) {
// silently ignore
typeFound = false;
}
}
@Override
public T handle(ByteBuffer buf) {
try {
T ret = constructor.newInstance();
ret.parse(buf);
return ret;
} catch (ReflectiveOperationException ex) {
throw new IllegalArgumentException("Unable to instantiate empty packet", ex);
}
}
}
|
{
"pile_set_name": "Github"
}
|
* {
box-sizing: border-box;
}
body {
padding: 0;
margin: 0;
font-family: 'Dosis', sans-serif;
font-size: 18px;
background-color: #eee;
overflow: hidden;
position: absolute;
top: 0px;
bottom: 0px;
right: 0px;
left: 0px;
}
h1 {
margin-left: 38px;
font-size: 42px;
font-weight: 200;
letter-spacing: 1px;
}
h2 {
margin: 30px 0 20px 0;
font-weight: 700;
font-size: 22px;
letter-spacing: 1px;
}
.column {
position: absolute;
top: 110px;
bottom: 0px;
border-left: 1px dotted #bbb;
padding: 0 0 0 20px;
display: none;
}
#example-list {
position: absolute;
top: 50px;
left: 20px;
right: 0px;
bottom: 0px;
  font-family: Monaco, Menlo, "Ubuntu Mono", Consolas, source-code-pro, monospace;
font-size: 12px;
overflow: hidden;
padding-bottom: 20px;
}
.example-item {
padding: 4px 0 4px 0;
cursor: pointer;
}
.example-item:hover {
background-color: rgba(0, 0, 0, 0.05);
}
#code-editor {
position: absolute;
top: 50px;
left: 20px;
right: 0px;
bottom: 0px;
background: #eee;
}
#log-output {
position: absolute;
top: 50px;
left: 20px;
right: 0px;
bottom: 0px;
padding-bottom: 20px;
  font-family: Monaco, Menlo, "Ubuntu Mono", Consolas, source-code-pro, monospace;
font-size: 12px;
background: #eee;
overflow: auto;
white-space: pre-wrap;
word-wrap: break-word;
}
.log-entry {
margin-bottom: 4px;
padding-bottom: 4px;
border-bottom: 1px dotted #b4b4b4;
}
.log-entry-details {
color: #b4b4b4;
margin-right: 8px;
}
.log-entry-info {
color: black;
}
.log-entry-debug {
color: black;
}
.log-entry-warn {
color: #ff9100;
}
.log-entry-error {
color: red;
}
#run-button {
position: absolute;
top: -2px;
left: 24px;
width: 26px;
height: 26px;
border-radius: 50%;
z-index: 20000;
cursor: pointer;
background-color: #82CA6D;
background-image: url("../img/run-icon.svg");
background-size: 16px 16px;
background-position: center center;
background-repeat: no-repeat;
}
#run-button:hover {
background-color: #64a151;
}
#run-button:active {
top: -1px;
left: 25px;
}
.ace_gutter-cell {
opacity: 0.3 !important;
}
|
{
"pile_set_name": "Github"
}
|
package org.rrd4j.core.jrrd;
import java.io.IOException;
import java.io.PrintStream;
import java.text.NumberFormat;
/**
* Instances of this class model a data source in an RRD file.
*
* @author <a href="mailto:ciaran@codeloop.com">Ciaran Treanor</a>
* @version $Revision: 1.1 $
*/
public class DataSource {
private static enum ds_param_en { DS_mrhb_cnt, DS_min_val, DS_max_val, DS_cde }
private final long offset;
private final long size;
private final String name;
private final DataSourceType type;
private final int minimumHeartbeat;
private final double minimum;
private final double maximum;
// initialized during RRDatabase construction
private PDPStatusBlock pdpStatusBlock;
DataSource(RRDFile file) throws IOException {
offset = file.getFilePointer();
name = file.readString(Constants.DS_NAM_SIZE);
type = DataSourceType.valueOf(file.readString(Constants.DST_SIZE).toUpperCase());
UnivalArray par = file.getUnivalArray(10);
minimumHeartbeat = (int) par.getLong(ds_param_en.DS_mrhb_cnt);
minimum = par.getDouble(ds_param_en.DS_min_val);
maximum = par.getDouble(ds_param_en.DS_max_val);
size = file.getFilePointer() - offset;
}
void loadPDPStatusBlock(RRDFile file) throws IOException {
pdpStatusBlock = new PDPStatusBlock(file);
}
/**
* Returns the primary data point status block for this data source.
*
* @return the primary data point status block for this data source.
*/
public PDPStatusBlock getPDPStatusBlock() {
return pdpStatusBlock;
}
/**
* Returns the minimum required heartbeat for this data source.
*
* @return the minimum required heartbeat for this data source.
*/
public int getMinimumHeartbeat() {
return minimumHeartbeat;
}
/**
* Returns the minimum value input to this data source can have.
*
* @return the minimum value input to this data source can have.
*/
public double getMinimum() {
return minimum;
}
/**
* Returns the type this data source is.
*
* @return the type this data source is.
* @see DataSourceType
*/
public DataSourceType getType() {
return type;
}
/**
* Returns the maximum value input to this data source can have.
*
* @return the maximum value input to this data source can have.
*/
public double getMaximum() {
return maximum;
}
/**
* Returns the name of this data source.
*
* @return the name of this data source.
*/
public String getName() {
return name;
}
void printInfo(PrintStream s, NumberFormat numberFormat) {
StringBuilder sb = new StringBuilder("ds[");
sb.append(name);
s.print(sb);
s.print("].type = \"");
s.print(type);
s.println("\"");
s.print(sb);
s.print("].minimal_heartbeat = ");
s.println(minimumHeartbeat);
s.print(sb);
s.print("].min = ");
s.println(Double.isNaN(minimum)
? "NaN"
: numberFormat.format(minimum));
s.print(sb);
s.print("].max = ");
s.println(Double.isNaN(maximum)
? "NaN"
: numberFormat.format(maximum));
s.print(sb);
s.print("].last_ds = ");
s.println(pdpStatusBlock.lastReading);
s.print(sb);
s.print("].value = ");
double value = pdpStatusBlock.value;
s.println(Double.isNaN(value)
? "NaN"
: numberFormat.format(value));
s.print(sb);
s.print("].unknown_sec = ");
s.println(pdpStatusBlock.unknownSeconds);
}
void toXml(PrintStream s) {
s.println("\t<ds>");
s.print("\t\t<name> ");
s.print(name);
s.println(" </name>");
s.print("\t\t<type> ");
s.print(type);
s.println(" </type>");
s.print("\t\t<minimal_heartbeat> ");
s.print(minimumHeartbeat);
s.println(" </minimal_heartbeat>");
s.print("\t\t<min> ");
s.print(minimum);
s.println(" </min>");
s.print("\t\t<max> ");
s.print(maximum);
s.println(" </max>");
s.println();
s.println("\t\t<!-- PDP Status -->");
s.print("\t\t<last_ds> ");
s.print(pdpStatusBlock.lastReading);
s.println(" </last_ds>");
s.print("\t\t<value> ");
s.print(pdpStatusBlock.value);
s.println(" </value>");
s.print("\t\t<unknown_sec> ");
s.print(pdpStatusBlock.unknownSeconds);
s.println(" </unknown_sec>");
s.println("\t</ds>");
s.println();
}
/**
     * Returns a summary of the contents of this data source.
*
* @return a summary of the information contained in this data source.
*/
public String toString() {
StringBuilder sb = new StringBuilder("[DataSource: OFFSET=0x");
sb.append(Long.toHexString(offset));
sb.append(", SIZE=0x");
sb.append(Long.toHexString(size));
sb.append(", name=");
sb.append(name);
sb.append(", type=");
sb.append(type.toString());
sb.append(", minHeartbeat=");
sb.append(minimumHeartbeat);
sb.append(", min=");
sb.append(minimum);
sb.append(", max=");
sb.append(maximum);
sb.append("]");
sb.append("\n\t\t");
sb.append(pdpStatusBlock.toString());
return sb.toString();
}
}
|
{
"pile_set_name": "Github"
}
|
<?php
/*
* This file is part of Piplin.
*
* Copyright (C) 2016-2017 piplin.com
*
* For the full copyright and license information, please view the LICENSE
* file that was distributed with this source code.
*/
namespace Piplin\Http\Requests;
use Piplin\Http\Requests\Request;
/**
* Validate the user name and password.
*/
class StoreProfileRequest extends Request
{
/**
* Get the validation rules that apply to the request.
*
* @return array
*/
public function rules()
{
$rules = [
'nickname' => 'required|max:255',
'password' => 'required|confirmed|min:6',
];
if (empty($this->get('password'))) {
unset($rules['password']);
}
return $rules;
}
}
|
{
"pile_set_name": "Github"
}
|
package ws.schild.jave.filters;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
/**
* A filterchain as described by <a
* href="https://ffmpeg.org/ffmpeg-filters.html#Filtergraph-syntax-1">FFMPEG Documentation</a>.
*
* <p>A filterchain is a comma separated series of filters.
*
* @author mressler
*/
public class FilterChain implements VideoFilter {
private List<Filter> filters;
/** Create an empty filterchain. */
public FilterChain() {
filters = new ArrayList<>();
}
/**
* Create a filterchain with the specified filters
*
* @param filters The ordered list of filters in this chain
*/
public FilterChain(Filter... filters) {
this.filters = new ArrayList<>(Arrays.asList(filters));
}
/**
* Add one Filter to this filterchain
*
* @param filter The Filter to add to this chain.
* @return this FilterChain for builder pattern magic
*/
public FilterChain addFilter(Filter filter) {
filters.add(filter);
return this;
}
public FilterChain prependFilter(Filter filter) {
filters.add(0, filter);
return this;
}
/**
* Adds an input label to the first filter in this chain.
* @param label The label to use for the input label for the first filter in this chain
* @return this FilterChain for builder pattern magic
* @throws IndexOutOfBoundsException if there are no filters in this chain.
*/
public FilterChain setInputLabel(String label) {
filters.get(0).addInputLabel(label);
return this;
}
/**
   * Adds an output label to the last filter in this chain.
* @param label The label to use for the output label for the last filter in this chain
* @return this FilterChain for builder pattern magic
* @throws IndexOutOfBoundsException if there are no filters in this chain.
*/
public FilterChain setOutputLabel(String label) {
filters.get(filters.size() - 1).addOutputLabel(label);
return this;
}
@Override
public String getExpression() {
return filters.stream().map(VideoFilter::getExpression).collect(Collectors.joining(","));
}
}
|
{
"pile_set_name": "Github"
}
|
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<!--NewPage-->
<HTML>
<HEAD>
<!-- Generated by javadoc (build 1.6.0_22) on Tue Mar 08 17:53:56 GMT 2011 -->
<META http-equiv="Content-Type" content="text/html; charset=UTF-8">
<TITLE>
Issue (jslint4java parent 1.4.7 API)
</TITLE>
<META NAME="date" CONTENT="2011-03-08">
<LINK REL ="stylesheet" TYPE="text/css" HREF="../../../stylesheet.css" TITLE="Style">
<SCRIPT type="text/javascript">
function windowTitle()
{
if (location.href.indexOf('is-external=true') == -1) {
parent.document.title="Issue (jslint4java parent 1.4.7 API)";
}
}
</SCRIPT>
<NOSCRIPT>
</NOSCRIPT>
</HEAD>
<BODY BGCOLOR="white" onload="windowTitle();">
<HR>
<!-- ========= START OF TOP NAVBAR ======= -->
<A NAME="navbar_top"><!-- --></A>
<A HREF="#skip-navbar_top" title="Skip navigation links"></A>
<TABLE BORDER="0" WIDTH="100%" CELLPADDING="1" CELLSPACING="0" SUMMARY="">
<TR>
<TD COLSPAN=2 BGCOLOR="#EEEEFF" CLASS="NavBarCell1">
<A NAME="navbar_top_firstrow"><!-- --></A>
<TABLE BORDER="0" CELLPADDING="0" CELLSPACING="3" SUMMARY="">
<TR ALIGN="center" VALIGN="top">
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../overview-summary.html"><FONT CLASS="NavBarFont1"><B>Overview</B></FONT></A> </TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="package-summary.html"><FONT CLASS="NavBarFont1"><B>Package</B></FONT></A> </TD>
<TD BGCOLOR="#FFFFFF" CLASS="NavBarCell1Rev"> <FONT CLASS="NavBarFont1Rev"><B>Class</B></FONT> </TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="class-use/Issue.html"><FONT CLASS="NavBarFont1"><B>Use</B></FONT></A> </TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="package-tree.html"><FONT CLASS="NavBarFont1"><B>Tree</B></FONT></A> </TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../deprecated-list.html"><FONT CLASS="NavBarFont1"><B>Deprecated</B></FONT></A> </TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../index-all.html"><FONT CLASS="NavBarFont1"><B>Index</B></FONT></A> </TD>
<TD BGCOLOR="#EEEEFF" CLASS="NavBarCell1"> <A HREF="../../../help-doc.html"><FONT CLASS="NavBarFont1"><B>Help</B></FONT></A> </TD>
</TR>
</TABLE>
</TD>
<TD ALIGN="right" VALIGN="top" ROWSPAN=3><EM>
</EM>
</TD>
</TR>
<TR>
<TD BGCOLOR="white" CLASS="NavBarCell2"><FONT SIZE="-2">
PREV CLASS
<A HREF="../../../com/googlecode/jslint4java/Issue.IssueBuilder.html" title="class in com.googlecode.jslint4java"><B>NEXT CLASS</B></A></FONT></TD>
<TD BGCOLOR="white" CLASS="NavBarCell2"><FONT SIZE="-2">
<A HREF="../../../index.html?com/googlecode/jslint4java/Issue.html" target="_top"><B>FRAMES</B></A>
<A HREF="Issue.html" target="_top"><B>NO FRAMES</B></A>
<SCRIPT type="text/javascript">
<!--
if(window==top) {
document.writeln('<A HREF="../../../allclasses-noframe.html"><B>All Classes</B></A>');
}
//-->
</SCRIPT>
<NOSCRIPT>
<A HREF="../../../allclasses-noframe.html"><B>All Classes</B></A>
</NOSCRIPT>
</FONT></TD>
</TR>
<TR>
<TD VALIGN="top" CLASS="NavBarCell3"><FONT SIZE="-2">
SUMMARY: <A HREF="#nested_class_summary">NESTED</A> | FIELD | CONSTR | <A HREF="#method_summary">METHOD</A></FONT></TD>
<TD VALIGN="top" CLASS="NavBarCell3"><FONT SIZE="-2">
DETAIL: FIELD | CONSTR | <A HREF="#method_detail">METHOD</A></FONT></TD>
</TR>
</TABLE>
<A NAME="skip-navbar_top"></A>
<!-- ========= END OF TOP NAVBAR ========= -->
<HR>
<!-- ======== START OF CLASS DATA ======== -->
<H2>
<FONT SIZE="-1">
com.googlecode.jslint4java</FONT>
<BR>
Class Issue</H2>
<PRE>
<A HREF="http://java.sun.com/j2se/1.5.0/docs/api/java/lang/Object.html?is-external=true" title="class or interface in java.lang">java.lang.Object</A>
<IMG SRC="../../../resources/inherit.gif" ALT="extended by "><B>com.googlecode.jslint4java.Issue</B>
</PRE>
<HR>
<DL>
<DT><PRE>public class <B>Issue</B><DT>extends <A HREF="http://java.sun.com/j2se/1.5.0/docs/api/java/lang/Object.html?is-external=true" title="class or interface in java.lang">Object</A></DL>
</PRE>
<P>
A single issue with the code that is being checked for problems.
<P>
<P>
<DL>
<DT><B>Author:</B></DT>
<DD>dom</DD>
</DL>
<HR>
<P>
<!-- ======== NESTED CLASS SUMMARY ======== -->
<A NAME="nested_class_summary"><!-- --></A>
<TABLE BORDER="1" WIDTH="100%" CELLPADDING="3" CELLSPACING="0" SUMMARY="">
<TR BGCOLOR="#CCCCFF" CLASS="TableHeadingColor">
<TH ALIGN="left" COLSPAN="2"><FONT SIZE="+2">
<B>Nested Class Summary</B></FONT></TH>
</TR>
<TR BGCOLOR="white" CLASS="TableRowColor">
<TD ALIGN="right" VALIGN="top" WIDTH="1%"><FONT SIZE="-1">
<CODE>static class</CODE></FONT></TD>
<TD><CODE><B><A HREF="../../../com/googlecode/jslint4java/Issue.IssueBuilder.html" title="class in com.googlecode.jslint4java">Issue.IssueBuilder</A></B></CODE>
<BR>
Allow creating an issue in a couple of different ways.</TD>
</TR>
</TABLE>
<!-- ========== METHOD SUMMARY =========== -->
<A NAME="method_summary"><!-- --></A>
<TABLE BORDER="1" WIDTH="100%" CELLPADDING="3" CELLSPACING="0" SUMMARY="">
<TR BGCOLOR="#CCCCFF" CLASS="TableHeadingColor">
<TH ALIGN="left" COLSPAN="2"><FONT SIZE="+2">
<B>Method Summary</B></FONT></TH>
</TR>
<TR BGCOLOR="white" CLASS="TableRowColor">
<TD ALIGN="right" VALIGN="top" WIDTH="1%"><FONT SIZE="-1">
<CODE> <A HREF="http://
|
{
"pile_set_name": "Github"
}
|
package problem235
|
{
"pile_set_name": "Github"
}
|
//
// Copyright (c) Microsoft Corporation. All rights reserved.
//
namespace Microsoft.Zelig.CodeGeneration.IR
{
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using Microsoft.Zelig.Runtime.TypeSystem;
public class ExternalDataDescriptor : DataManager.DataDescriptor
{
public interface IExternalDataContext
{
byte[] RawData { get; }
void WriteData( ImageBuilders.SequentialRegion region );
object DataSection { get; }
}
private IExternalDataContext m_externContext;
internal ExternalDataDescriptor()
{
}
public ExternalDataDescriptor( DataManager owner ,
IExternalDataContext context ,
DataManager.Attributes flags ,
Abstractions.PlacementRequirements pr ) : base( owner, null, flags, pr )
{
m_externContext = context;
}
public IExternalDataContext ExternContext
{
get { return m_externContext; }
}
public override object GetDataAtOffset( Runtime.TypeSystem.FieldRepresentation[] accessPath, int accessPathIndex, int offset )
{
throw new NotImplementedException();
}
internal override void IncludeExtraTypes( Runtime.TypeSystem.TypeSystem.Reachability reachability, CompilationSteps.PhaseDriver phase )
{
throw new NotImplementedException();
}
internal override void Reduce( GrowOnlySet<DataManager.DataDescriptor> visited, Runtime.TypeSystem.TypeSystem.Reachability reachability, bool fApply )
{
}
internal override void RefreshValues( CompilationSteps.PhaseDriver phase )
{
}
internal override void Write( ImageBuilders.SequentialRegion region )
{
m_externContext.WriteData( region );
}
protected override string ToString( bool fVerbose )
{
return "ExternalDataDescriptor";
}
}
}
|
{
"pile_set_name": "Github"
}
|
---
layout: page
title: Sandboxes - Sinon.JS
breadcrumb: sandbox
---
Sandboxes remove the need to keep track of every fake created, which greatly simplifies cleanup.
```javascript
var sandbox = require('sinon').createSandbox();
var myAPI = { hello: function () {} };
describe('myAPI.hello method', function () {
beforeEach(function () {
// stub out the `hello` method
sandbox.stub(myAPI, 'hello');
});
afterEach(function () {
// completely restore all fakes created through the sandbox
sandbox.restore();
});
it('should be called once', function () {
myAPI.hello();
sandbox.assert.calledOnce(myAPI.hello);
});
it('should be called twice', function () {
myAPI.hello();
myAPI.hello();
sandbox.assert.calledTwice(myAPI.hello);
});
});
```
## Sandbox API
#### Default sandbox
Since `sinon@5.0.0`, the `sinon` object is a default sandbox. Unless you have a very advanced setup or need a special configuration, you probably want to only use that one.
```javascript
const myObject = {
'hello': 'world'
};
sinon.stub(myObject, 'hello').value('Sinon');
console.log(myObject.hello);
// Sinon
sinon.restore();
console.log(myObject.hello);
// world
```
#### `var sandbox = sinon.createSandbox();`
Creates a new sandbox object with spies, stubs, and mocks.
#### `var sandbox = sinon.createSandbox(config);`
The `sinon.createSandbox(config)` method is mainly an integration feature; it can be used, for instance, to coordinate all fakes through a shared global object.
Sandboxes are partially configured by default such that calling:
```javascript
var sandbox = sinon.createSandbox({});
```
will merge in extra defaults analogous to:
```javascript
var sandbox = sinon.createSandbox({
// ...
injectInto: null,
properties: ["spy", "stub", "mock"],
useFakeTimers: false,
useFakeServer: false
});
```
The `useFakeTimers` and `useFakeServer` options are **false** as opposed to the [defaults in `sinon.defaultConfig`](https://github.com/sinonjs/sinon/blob/master/lib/sinon/util/core/default-config.js):
```javascript
sinon.defaultConfig = {
// ...
injectInto: null,
properties: ["spy", "stub", "mock", "clock", "server", "requests"],
useFakeTimers: true,
useFakeServer: true
}
```
To get a full sandbox with stubs, spies, etc. **and** fake timers and servers, you can call:
```javascript
// Inject the sinon defaults explicitly.
var sandbox = sinon.createSandbox(sinon.defaultConfig);
// (OR) Add the extra properties that differ from the sinon defaults.
var sandbox = sinon.createSandbox({
    useFakeTimers: true,
useFakeServer: true
});
```
##### injectInto
The sandbox's methods can be injected into another object for convenience. The
`injectInto` configuration option can name an object to add properties to.
##### properties
What properties to inject. Note that naming `"server"` here alone is not
sufficient to have a `server` property show up in the target object; you also
have to set `useFakeServer` to `true`.
The list of properties that can be injected are the ones exposed by the object
returned by the function `inject`, namely:
```javascript
{
//...
properties: [
"spy", "stub", "mock", "createStubInstance", "fake", "replace",
"replaceSetter", "replaceGetter", "clock", "server", "requests", "match"
]
}
```
##### useFakeTimers
If set to `true`, the sandbox will have a `clock` property. You can optionally pass
in a configuration object that follows the [specification for fake timers](../fake-timers),
such as `{ toFake: ["setTimeout", "setInterval"] }`.
##### useFakeServer
If `true`, `server` and `requests` properties are added to the sandbox. Can
also be an object to use for fake server. The default one is `sinon.fakeServer`,
but if you're using jQuery 1.3.x or some other library that does not set the XHR's
`onreadystatechange` handler, you might want to do:
```javascript
sinon.config = {
useFakeServer: sinon.fakeServerWithClock
};
```
##### exposing sandbox example
To create an object `sandboxFacade` which gets the method `spy` injected, you
can code:
```javascript
// object that will have the spy method injected into it
var sandboxFacade = {};
// create sandbox and inject properties (in this case spy) into sandboxFacade
var sandbox = sinon.createSandbox({
injectInto: sandboxFacade,
properties: ["spy"]
});
```
#### `sandbox.assert();`
A convenience reference for [`sinon.assert`](./assertions)
*Since `sinon@2.0.0`*
#### `sandbox.replace(object, property, replacement);`
Replaces `property` on `object` with `replacement` argument. Attempts to replace an already replaced value cause an exception.
`replacement` can be any value, including `spies`, `stubs` and `fakes`.
This method only works on non-accessor properties; for replacing accessors, use `sandbox.replaceGetter()` and `sandbox.replaceSetter()`.
```js
var myObject = {
myMethod: function() {
return 'apple pie';
}
};
sandbox.replace(myObject, 'myMethod', function () {
return 'strawberry';
});
console.log(myObject.myMethod());
// strawberry
```
#### `sandbox.replaceGetter();`
Replaces getter for `property` on `object` with `replacement` argument. Attempts to replace an already replaced getter cause an exception.
`replacement` must be a `Function`, and can be instances of `spies`, `stubs` and `fakes`.
```js
var myObject = {
    get myProperty() {
return 'apple pie';
}
};
sandbox.replaceGetter(myObject, 'myProperty', function () {
return 'strawberry';
});
console.log(myObject.myProperty);
// strawberry
```
#### `sandbox.replaceSetter();`
Replaces setter for `property` on `object` with `replacement` argument. Attempts to replace an already replaced setter cause an exception.
`replacement` must be a `Function`, and can be instances of `spies`, `stubs` and `fakes`.
```js
var object = {
set myProperty(value) {
this.prop = value;
}
};
sandbox.replaceSetter(object, 'myProperty', function (value) {
this.prop = 'strawberry ' + value;
});
object.myProperty = 'pie';
console.log(object.prop);
// strawberry pie
```
#### `sandbox.spy();`
Works exactly like `sinon.spy`
#### `sandbox.createStubInstance();`
Works almost exactly like `sinon.createStubInstance`, except that it also adds the returned stubs to the internal collection of fakes, so they are restored by `sandbox.restore()`.
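A minimal sketch of the difference (the `Database` class below is hypothetical, purely for illustration):

```javascript
class Database {
  query() { /* would talk to a real database */ }
}

// Stubs every method of Database.prototype on a fake instance.
const stubbedDb = sandbox.createStubInstance(Database);
stubbedDb.query.returns(['row1']);

stubbedDb.query(); // the stub returns ['row1'] instead of hitting a database

// Because the stubs were created through the sandbox, one call cleans them up:
sandbox.restore();
```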
#### `sandbox.stub();`
Works exactly like `sinon.stub`.
##### Stubbing a non-function property
```javascript
const myObject = {
    'hello': 'world'
};

sandbox.stub(myObject, 'hello').value('Sinon');

console.log(myObject.hello);
// Sinon

sandbox.restore();

console.log(myObject.hello);
// world
```
#### `sandbox.mock();`
Works exactly like `sinon.mock`.
|
{
"pile_set_name": "Github"
}
|
panic: too many errors [recovered]
github.com/stephens2424/php.func·006
github.com/stephens2424/php.(*Parser).errorf
github.com/stephens2424/php.(*Parser).expected
github.com/stephens2424/php.(*Parser).expectCurrent
github.com/stephens2424/php.(*Parser).expect
github.com/stephens2424/php.(*Parser).expectStmtEnd
github.com/stephens2424/php.(*Parser).parseStmt
github.com/stephens2424/php.(*Parser).parseStmt
github.com/stephens2424/php.(*Parser).parseNode
github.com/stephens2424/php.(*Parser).Parse
github.com/stephens2424/php.Fuzz
github.com/dvyukov/go-fuzz/go-fuzz-dep.Main
main.main
|
{
"pile_set_name": "Github"
}
|
{
"name": "dbgobject-tree",
"headless": true,
"author": "Peter Salas",
"description": "Provides an interface for traversing a tree of DbgObjects.",
"dependencies": ["dbgobject", "dbgobject-inspector"],
"includes": ["dbgobject-tree.css", "dbgobject-tree.js", "dbgobject-renderer.js", "tree-readers.js", "dbgobject-tree-renderer.js"]
}
|
{
"pile_set_name": "Github"
}
|
// Copyright 2017-2020 Lei Ni (nilei81@gmail.com) and other Dragonboat authors.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package rsm
import (
"crypto/md5"
"encoding/binary"
"sort"
"strings"
"github.com/lni/goutils/logutil"
pb "github.com/lni/dragonboat/v3/raftpb"
)
func addressEqual(addr1 string, addr2 string) bool {
return strings.EqualFold(strings.TrimSpace(addr1),
strings.TrimSpace(addr2))
}
func deepCopyMembership(m pb.Membership) pb.Membership {
c := pb.Membership{
ConfigChangeId: m.ConfigChangeId,
Addresses: make(map[uint64]string),
Removed: make(map[uint64]bool),
Observers: make(map[uint64]string),
Witnesses: make(map[uint64]string),
}
for nid, addr := range m.Addresses {
c.Addresses[nid] = addr
}
for nid := range m.Removed {
c.Removed[nid] = true
}
for nid, addr := range m.Observers {
c.Observers[nid] = addr
}
for nid, addr := range m.Witnesses {
c.Witnesses[nid] = addr
}
return c
}
type membership struct {
clusterID uint64
nodeID uint64
ordered bool
members *pb.Membership
}
func newMembership(clusterID uint64, nodeID uint64, ordered bool) *membership {
return &membership{
clusterID: clusterID,
nodeID: nodeID,
ordered: ordered,
members: &pb.Membership{
Addresses: make(map[uint64]string),
Observers: make(map[uint64]string),
Removed: make(map[uint64]bool),
Witnesses: make(map[uint64]string),
},
}
}
func (m *membership) id() string {
return logutil.DescribeSM(m.clusterID, m.nodeID)
}
func (m *membership) set(n pb.Membership) {
cm := deepCopyMembership(n)
m.members = &cm
}
func (m *membership) get() pb.Membership {
return deepCopyMembership(*m.members)
}
func (m *membership) getHash() uint64 {
vals := make([]uint64, 0)
for v := range m.members.Addresses {
vals = append(vals, v)
}
sort.Slice(vals, func(i, j int) bool { return vals[i] < vals[j] })
vals = append(vals, m.members.ConfigChangeId)
data := make([]byte, 8)
hash := md5.New()
for _, v := range vals {
binary.LittleEndian.PutUint64(data, v)
if _, err := hash.Write(data); err != nil {
panic(err)
}
}
md5sum := hash.Sum(nil)
return binary.LittleEndian.Uint64(md5sum[:8])
}
func (m *membership) isEmpty() bool {
return len(m.members.Addresses) == 0
}
func (m *membership) isConfChangeUpToDate(cc pb.ConfigChange) bool {
if !m.ordered || cc.Initialize {
return true
}
if m.members.ConfigChangeId == cc.ConfigChangeId {
return true
}
return false
}
func (m *membership) isAddingRemovedNode(cc pb.ConfigChange) bool {
if cc.Type == pb.AddNode ||
cc.Type == pb.AddObserver ||
cc.Type == pb.AddWitness {
_, ok := m.members.Removed[cc.NodeID]
return ok
}
return false
}
func (m *membership) isPromotingObserver(cc pb.ConfigChange) bool {
if cc.Type == pb.AddNode {
oa, ok := m.members.Observers[cc.NodeID]
return ok && addressEqual(oa, string(cc.Address))
}
return false
}
func (m *membership) isInvalidObserverPromotion(cc pb.ConfigChange) bool {
if cc.Type == pb.AddNode {
oa, ok := m.members.Observers[cc.NodeID]
return ok && !addressEqual(oa, string(cc.Address))
}
return false
}
func (m *membership) isAddingExistingMember(cc pb.ConfigChange) bool {
// try to add again with the same node ID
if cc.Type == pb.AddNode {
_, ok := m.members.Addresses[cc.NodeID]
if ok {
return true
}
}
if cc.Type == pb.AddObserver {
_, ok := m.members.Observers[cc.NodeID]
if ok {
return true
}
}
if cc.Type == pb.AddWitness {
_, ok := m.members.Witnesses[cc.NodeID]
if ok {
return true
}
}
if m.isPromotingObserver(cc) {
return false
}
if cc.Type == pb.AddNode ||
cc.Type == pb.AddObserver ||
cc.Type == pb.AddWitness {
for _, addr := range m.members.Addresses {
if addressEqual(addr, string(cc.Address)) {
return true
}
}
for _, addr := range m.members.Observers {
if addressEqual(addr, string(cc.Address)) {
return true
}
}
for _, addr := range m.members.Witnesses {
if addressEqual(addr, string(cc.Address)) {
return true
}
}
}
return false
}
func (m *membership) isAddingNodeAsObserver(cc pb.ConfigChange) bool {
if cc.Type == pb.AddObserver {
_, ok := m.members.Addresses[cc.NodeID]
return ok
}
return false
}
func (m *membership) isAddingNodeAsWitness(cc pb.ConfigChange) bool {
if cc.Type == pb.AddWitness {
_, ok := m.members.Addresses[cc.NodeID]
return ok
}
return false
}
func (m *membership) isAddingWitnessAsObserver(cc pb.ConfigChange) bool {
if cc.Type == pb.AddObserver {
_, ok := m.members.Witnesses[cc.NodeID]
return ok
}
return false
}
func (m *membership) isAddingWitnessAsNode(cc pb.ConfigChange) bool {
if cc.Type == pb.AddNode {
_, ok := m.members.Witnesses[cc.NodeID]
return ok
}
return false
}
func (m *membership) isAddingObserverAsWitness(cc pb.ConfigChange) bool {
if cc.Type == pb.AddWitness {
_, ok := m.members.Observers[cc.NodeID]
return ok
}
return false
}
func (
|
{
"pile_set_name": "Github"
}
|
add_definitions(-DTRANSLATION_DOMAIN=\"kdevstandardoutputview\")
########### next target ###############
declare_qt_logging_category(standardoutputview_LOG_PART_SRCS
TYPE PLUGIN
IDENTIFIER PLUGIN_STANDARDOUTPUTVIEW
CATEGORY_BASENAME "standardoutputview"
)
set(standardoutputview_LIB_SRCS
standardoutputview.cpp
outputwidget.cpp
toolviewdata.cpp
standardoutputviewmetadata.cpp
${standardoutputview_LOG_PART_SRCS}
)
kdevplatform_add_plugin(kdevstandardoutputview JSON kdevstandardoutputview.json SOURCES ${standardoutputview_LIB_SRCS})
target_link_libraries(kdevstandardoutputview
KDev::Interfaces
KDev::Sublime
KDev::Util
KDev::OutputView
)
if(BUILD_TESTING)
add_subdirectory(tests)
endif()
|
{
"pile_set_name": "Github"
}
|
{
"type": "attributeSelector",
"content": [
{
"type": "attributeName",
"content": [
{
"type": "ident",
"content": "a",
"syntax": "less",
"start": {
"line": 1,
"column": 2
},
"end": {
"line": 1,
"column": 2
}
}
],
"syntax": "less",
"start": {
"line": 1,
"column": 2
},
"end": {
"line": 1,
"column": 2
}
},
{
"type": "attributeMatch",
"content": "=",
"syntax": "less",
"start": {
"line": 1,
"column": 3
},
"end": {
"line": 1,
"column": 3
}
},
{
"type": "attributeValue",
"content": [
{
"type": "ident",
"content": "b",
"syntax": "less",
"start": {
"line": 1,
"column": 4
},
"end": {
"line": 1,
"column": 4
}
}
],
"syntax": "less",
"start": {
"line": 1,
"column": 4
},
"end": {
"line": 1,
"column": 4
}
},
{
"type": "space",
"content": " ",
"syntax": "less",
"start": {
"line": 1,
"column": 5
},
"end": {
"line": 1,
"column": 5
}
},
{
"type": "attributeFlags",
"content": [
{
"type": "ident",
"content": "i",
"syntax": "less",
"start": {
"line": 1,
"column": 6
},
"end": {
"line": 1,
"column": 6
}
}
],
"syntax": "less",
"start": {
"line": 1,
"column": 6
},
"end": {
"line": 1,
"column": 6
}
}
],
"syntax": "less",
"start": {
"line": 1,
"column": 1
},
"end": {
"line": 1,
"column": 7
}
}
|
{
"pile_set_name": "Github"
}
|
import random

c_choice = random.randint(100,999)
a = int(input("Guess a number?(100-999) "))
running = True

while running:
    if a == c_choice:
        print("BINGO!")
        running = False
    elif a < c_choice:
        print("The answer is bigger.")
        a = int(input("Try again~ "))
    elif a > c_choice:
        print("The answer is smaller.")
        a = int(input("Try again~ "))
|
{
"pile_set_name": "Github"
}
|
// Copyright 2014 The Serviced Authors.
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
package dao
// --------------------------------------------------------------------------------------------------
// --------------------------------------------------------------------------------------------------
// **** USE OF THE METHODS IN THIS FILE IS DEPRECATED ****
//
// THAT MEANS DO NOT ADD MORE METHODS TO dao.ControlPlane
//
// Instead of adding new RPC calls via dao.ControlPlane, new RPCs should be added to
// rpc/master.ClientInterface
// --------------------------------------------------------------------------------------------------
// --------------------------------------------------------------------------------------------------
import (
"time"
"github.com/control-center/serviced/domain/addressassignment"
"github.com/control-center/serviced/domain/logfilter"
"github.com/control-center/serviced/domain/service"
"github.com/control-center/serviced/metrics"
)
// ControlPlaneError is a generic ControlPlane error
type ControlPlaneError struct {
Msg string
}
// Implement the Error() interface for ControlPlaneError
func (s ControlPlaneError) Error() string {
return s.Msg
}
// EntityRequest is a request for a control center object.
type EntityRequest interface{}
// ServiceRequest identifies a service plus some query parameters.
type ServiceRequest struct {
Tags []string
TenantID string
UpdatedSince time.Duration
NameRegex string
}
// ServiceCloneRequest specifies a service to clone and how to modify the clone's name.
type ServiceCloneRequest struct {
ServiceID string
Suffix string
}
// ServiceMigrationRequest is request to modify one or more services.
type ServiceMigrationRequest struct {
ServiceID string // The tenant service ID
Modified []*service.Service // Services modified by the migration
Added []*service.Service // Services added by the migration
Deploy []*ServiceDeploymentRequest // ServiceDefinitions to be deployed by the migration
LogFilters map[string]logfilter.LogFilter // LogFilters to add/replace
}
// ServiceStateRequest specifies a request for a service's service state.
type ServiceStateRequest struct {
ServiceID string
ServiceStateID string
}
// ScheduleServiceRequest specifies a request to schedule a service to run.
type ScheduleServiceRequest struct {
ServiceIDs []string
AutoLaunch bool
Synchronous bool
}
// WaitServiceRequest is a request to wait for a set of services to gain the requested status.
type WaitServiceRequest struct {
ServiceIDs []string // List of service IDs to monitor
DesiredState service.DesiredState // State which to monitor for
Timeout time.Duration // Time to wait before cancelling the subprocess
Recursive bool // Recursively wait for the desired state
}
// HostServiceRequest is a request for the service state of a host.
type HostServiceRequest struct {
HostID string
ServiceStateID string
}
// AttachRequest is a request to run a command in the container of a running service.
type AttachRequest struct {
Running *RunningService
Command string
Args []string
}
// FindChildRequest is a request to locate a service's child by name.
type FindChildRequest struct {
ServiceID string
ChildName string
}
// SnapshotRequest is a request to create a snapshot.
type SnapshotRequest struct {
ServiceID string
Message string
Tag string
ContainerID string
SnapshotSpacePercent int
}
// TagSnapshotRequest is a request to add a tag (label) to the specified snapshot.
type TagSnapshotRequest struct {
SnapshotID string
TagName string
}
// SnapshotByTagRequest is a request for the snapshot identified by the tag name.
type SnapshotByTagRequest struct {
ServiceID string
TagName string
}
// RollbackRequest is a request to apply a snapshot to the current system.
type RollbackRequest struct {
SnapshotID string
ForceRestart bool
}
// MetricRequest is a request for the metrics of the instances of a service.
type MetricRequest struct {
StartTime time.Time
HostID string
ServiceID string
Instances []metrics.ServiceInstance
}
// The ControlPlane interface is the API for a serviced master.
type ControlPlane interface {
//---------------------------------------------------------------------------
// Service CRUD
// Add a new service
AddService(svc service.Service, serviceID *string) error
// Clones a new service
CloneService(request ServiceCloneRequest, serviceID *string) error
// Deploy a new service
DeployService(svc ServiceDeploymentRequest, serviceID *string) error
// Update an existing service
UpdateService(svc service.Service, _ *int) error
// Migrate a service definition
MigrateServices(request ServiceMigrationRequest, _ *int) error
// Remove a service definition
RemoveService(serviceID string, _ *int) error
// Get a service from serviced
GetService(serviceID string, svc *service.Service) error
// Find a child service with the given name
FindChildService(request FindChildRequest, svc *service.Service) error
// Assign IP addresses to all services at and below the provided service
AssignIPs(assignmentRequest addressassignment.AssignmentRequest, _ *int) (err error)
// Get a list of tenant IDs
GetTenantIDs(_ struct{}, tenantIDs *[]string) error
//---------------------------------------------------------------------------
//ServiceState CRUD
// Schedule the given service to start
StartService(request ScheduleServiceRequest, affected *int) error
// Schedule the given service to restart
RestartService(request ScheduleServiceRequest, affected *int) error
// Schedule the given service to rebalance
RebalanceService(request ScheduleServiceRequest, affected *int) error
// Schedule the given service to stop
StopService(request ScheduleServiceRequest, affected *int) error
// Schedule the given service to pause
PauseService(request ScheduleServiceRequest, affected *int) error
// Stop a running instance of a service
StopRunningInstance(request HostServiceRequest, _ *int) error
// Wait for a particular service state
WaitService(request WaitServiceRequest, _ *int) error
// Computes the status of the service based on its service instances
GetServiceStatus(serviceID string, status *[]service.Instance) error
// Get logs for the given app
GetServiceLogs(serviceID string, logs *string) error
// Get logs for the given service state
GetServiceStateLogs(request ServiceStateRequest, logs *string) error
// Get all running services
GetRunningServices(request EntityRequest, runningServices *[]RunningService) error
// Get the running services for a given host
GetRunningServicesForHost(hostID string, runningServices *[]RunningService) error
// Get the service instances for a given service
GetRunningServicesForService(serviceID string, runningServices *[]RunningService) error
// Attach to a running container with a predefined action
Action(request AttachRequest, _ *int) error
// ------------------------------------------------------------------------
// Metrics
// Get service memory stats for a particular host
GetHostMemoryStats(req MetricRequest, stats *metrics.MemoryUsageStats) error
// Get service memory stats for a particular service
GetServiceMemoryStats(req MetricRequest, stats *metrics.MemoryUsageStats) error
// Get service memory stats for a particular service instance
GetInstanceMemoryStats(req MetricRequest, stats *[]metrics.MemoryUsageStats) error
// -----------------------------------------------------------------------
// Filesystem CRUD
// Backup captures the state of the application stack and writes the output
// to disk.
Backup(backupRequest BackupRequest, filename *string) (err error)
// GetBackupEstimate estimates space required to take backup and space available
GetBackupEstimate(backupRequest BackupRequest, estimate *BackupEstimate) (err error)
// AsyncBackup is the same as backup but asynchronous
AsyncBackup(backupRequest BackupRequest, filename *string)
|
{
"pile_set_name": "Github"
}
|
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package net.neoremind.kraps.rpc
/**
* A callback that [[RpcEndpoint]] can use to send back a message or failure. It's thread-safe
* and can be called in any thread.
*/
trait RpcCallContext {
/**
* Send a reply message to the sender. If the sender is [[RpcEndpoint]], its [[RpcEndpoint.receive]]
* will be called.
*/
def reply(response: Any): Unit
/**
* Report a failure to the sender.
*/
def sendFailure(e: Throwable): Unit
/**
* The sender of this message.
*/
def senderAddress: RpcAddress
}
|
{
"pile_set_name": "Github"
}
|
begin
select * from t1 where id in (select id from t2)
commit
|
{
"pile_set_name": "Github"
}
|
useLogger(CustomEventLogger())

class CustomEventLogger() : BuildAdapter(), TaskExecutionListener {
    override fun beforeExecute(task: Task) {
        println("[${task.name}]")
    }

    override fun afterExecute(task: Task, state: TaskState) {
        println()
    }

    override fun buildFinished(result: BuildResult) {
        println("build completed")
        if (result.failure != null) {
            (result.failure as Throwable).printStackTrace()
        }
    }
}
|
{
"pile_set_name": "Github"
}
|
<table class="action" align="center" width="100%" cellpadding="0" cellspacing="0">
<tr>
<td align="center">
<table width="100%" border="0" cellpadding="0" cellspacing="0">
<tr>
<td align="center">
<table border="0" cellpadding="0" cellspacing="0">
<tr>
<td>
<a href="{{ $url }}" class="button button-{{ $color or 'blue' }}" target="_blank">{{ $slot }}</a>
</td>
</tr>
</table>
</td>
</tr>
</table>
</td>
</tr>
</table>
|
{
"pile_set_name": "Github"
}
|
---
name: Bug Report
about: Create a report to help us improve.
title: ''
labels: bug
assignees: ''
---
<!-- If you have an issue with a plugin create an issue on that plugin's GitHub page instead. -->
<!-- Before opening an issue, please review the Troubleshooting Page on the Wiki to ensure that this is a new issue, and alternatively search the closed issues for similar problems. -->
<!-- Link to the Wiki - https://github.com/homebridge/HAP-NodeJS/wiki -->
<!-- Provide a general summary in the Title above -->
**Describe The Bug:**
<!-- A clear and concise description of what the bug is. -->
**To Reproduce:**
<!-- Steps to reproduce the behavior. -->
**Expected behavior:**
<!-- A clear and concise description of what you expected to happen. -->
**Logs:**
<!-- Paste relevant output between the two ``` lines below -->
<!-- Remove any sensitive information, passwords, etc. -->
<!-- Please include the beginning of the log where the HAP-NodeJS initialization happens -->
**Screenshots:**
<!-- If applicable, add screenshots to help explain your problem. -->
**Environment:**
* **Node.js Version**: <!-- node -v -->
* **NPM Version**: <!-- npm -v -->
* **Operating System**: Raspbian / Ubuntu / Debian / Windows / macOS / Docker / other
* **Process Supervisor**: Systemd / init.d / pm2 / launchctl / Docker / hb-service / other / none
<!-- Click the "Preview" tab before you submit to ensure the formatting is correct. -->
|
{
"pile_set_name": "Github"
}
|
# Open Source Help Wanted Projects
The purpose of this repo is to maintain a list of open source projects by language that allows the community to quickly identify areas where they can contribute. This is very helpful for individuals who are new to a language. I'm also hoping this will be used for hackathons and meetups.
## Go
[golang.org](http://golang.org)
- [Help Wanted](https://github.com/golang/go/labels/help%20wanted)
- [Documentation](https://github.com/golang/go/issues?q=is%3Aopen+is%3Aissue+label%3ADocumentation)
- [Contribution Guidelines](https://golang.org/doc/contribute.html)
#### [InfluxDB](http://github.com/influxdata/influxdb)
To see only help wanted issues, follow these links:
- [All Help Wanted](https://github.com/influxdata/influxdb/issues?q=is%3Aopen+is%3Aissue+label%3Astatus%2Fhelp-wanted)
- [Help Wanted - Difficulty: low](https://github.com/influxdata/influxdb/issues?q=is%3Aopen+is%3Aissue+label%3Astatus%2Fhelp-wanted+label%3Adifficulty%2Flow)
- [Help Wanted - Difficulty: medium](https://github.com/influxdata/influxdb/issues?q=is%3Aopen+is%3Aissue+label%3Astatus%2Fhelp-wanted+label%3Adifficulty%2Fmedium)
- [Help Wanted - Difficulty: high](https://github.com/influxdata/influxdb/issues?q=is%3Aopen+is%3Aissue+label%3Astatus%2Fhelp-wanted+label%3Adifficulty%2Fhigh)
#### [Kapacitor](http://github.com/influxdata/kapacitor)
To see only help wanted issues, follow these links:
- [All Help Wanted](https://github.com/influxdata/kapacitor/labels/help%20wanted)
- [Help Wanted - Difficulty: easy](https://github.com/influxdata/kapacitor/labels/difficulty-easy)
- [Help Wanted - Difficulty: medium](https://github.com/influxdata/kapacitor/labels/difficulty-medium)
- [Help Wanted - Difficulty: hard](https://github.com/influxdata/kapacitor/labels/difficulty-hard)
#### [Docker](https://github.com/docker/docker)
- [Beginner](https://github.com/docker/docker/issues?q=is%3Aopen+is%3Aissue+label%3Aexp%2Fbeginner)
- [Intermediate](https://github.com/docker/docker/issues?q=is%3Aopen+is%3Aissue+label%3Aexp%2Fintermediate)
- [Expert](https://github.com/docker/docker/issues?q=is%3Aopen+is%3Aissue+label%3Aexp%2Fexpert)
#### [Mattermost Server](https://github.com/mattermost/mattermost-server)
Open source Slack-alternative built on Go and React.
- [All Help Wanted](https://mattermost.com/pl/help-wanted)
- [Go Help Wanted - Difficulty: Easy](https://github.com/mattermost/mattermost-server/issues?q=is%3Aissue+is%3Aopen+label%3ADifficulty%2F1%3AEasy+label%3A%22Tech%2FGo%22)
- [Go Help Wanted - Difficulty: Medium](https://github.com/mattermost/mattermost-server/issues?q=is%3Aissue+is%3Aopen+label%3ADifficulty%2F2%3AMedium+label%3A%22Tech%2FGo%22+)
- [Go Help Wanted - Difficulty: Hard](https://github.com/mattermost/mattermost-server/issues?q=is%3Aissue+is%3Aopen+label%3ADifficulty%2F3%3AHard+label%3A%22Tech%2FGo%22+)
#### [CockroachDB](https://github.com/cockroachdb/cockroach)
- [All Help Wanted](https://github.com/cockroachdb/cockroach/labels/helpwanted)
- [Help Wanted - Easy](https://github.com/cockroachdb/cockroach/issues?q=is%3Aopen+label%3Ahelpwanted+label%3Aeasy)
#### [Caddy](https://github.com/mholt/caddy)
- [All Help Wanted](https://github.com/mholt/caddy/labels/help%20wanted)
- [Help Wanted - Easy](https://github.com/mholt/caddy/labels/easy)
#### [Gogs](https://github.com/gogits/gogs)
- [Help Wanted](https://github.com/gogits/gogs/issues?q=is%3Aopen+is%3Aissue+label%3A%22help+wanted%22)
#### [Arduino Create Agent](https://github.com/arduino/arduino-create-agent)
This project works, but it's a mess of non-idiomatic Go code without tests.
- [Help wanted - Refactor](https://github.com/arduino/arduino-create-agent/issues/3)
#### [universal-translator](https://github.com/go-playground/universal-translator)
- [Help Wanted - Tests](https://github.com/go-playground/universal-translator/issues/1)
#### [Go kit](https://github.com/go-kit/kit)
- [Help Wanted](https://github.com/go-kit/kit/issues?q=is%3Aissue+is%3Aopen+label%3A%22help+wanted%22)
- [Help Wanted, Newbie Friendly](https://github.com/go-kit/kit/issues?utf8=✓&q=is%3Aissue+is%3Aopen+label%3A"help+wanted"+label%3A"newbie+friendly")
#### [fsnotify](https://github.com/fsnotify/fsnotify)
- Help wanted to [investigate and reproduce issues](https://github.com/fsnotify/fsnotify/issues?q=is%3Aissue+is%3Aopen+label%3Ainvestigate), review [pull requests](https://github.com/fsnotify/fsnotify/pulls), support additional platforms, and clean up the existing code.
#### [log](https://github.com/go-playground/log)
- [Help Wanted](https://github.com/go-playground/log/issues/1)
#### [gorilla/mux](https://github.com/gorilla/mux)
- [Help Wanted](https://github.com/gorilla/mux/labels/helpwanted)
- [Documentation](https://github.com/gorilla/mux/labels/documentation)
#### [gorouter](https://github.com/vardius/gorouter)
- [Help Wanted](https://github.com/vardius/gorouter/labels/help%20wanted)
#### [go-api-boilerplate](https://github.com/vardius/go-api-boilerplate)
- [Help Wanted](https://github.com/vardius/go-api-boilerplate/labels/help%20wanted)
#### [httpexpect](https://github.com/gavv/httpexpect)
- [Help Wanted](https://github.com/gavv/httpexpect/labels/help%20wanted)
## C++
#### [Roc Toolkit](https://roc-streaming.org/)
- [Help Wanted](https://github.com/roc-streaming/roc-toolkit/labels/help%20wanted)
- [Contribution Guidelines](https://roc-streaming.org/toolkit/docs/development/contribution_guidelines.html)
|
{
"pile_set_name": "Github"
}
|
<?php
/*
*
* ____ _ _ __ __ _ __ __ ____
* | _ \ ___ ___| | _____| |_| \/ (_)_ __ ___ | \/ | _ \
* | |_) / _ \ / __| |/ / _ \ __| |\/| | | '_ \ / _ \_____| |\/| | |_) |
* | __/ (_) | (__| < __/ |_| | | | | | | | __/_____| | | | __/
* |_| \___/ \___|_|\_\___|\__|_| |_|_|_| |_|\___| |_| |_|_|
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* @author PocketMine Team
* @link http://www.pocketmine.net/
*
*
*/
namespace pocketmine\permission;
use pocketmine\plugin\Plugin;
use pocketmine\utils\PluginException;
class PermissionAttachment{
/** @var PermissionRemovedExecutor */
private $removed = null;
/**
* @var bool[]
*/
private $permissions = [];
/** @var Permissible */
private $permissible;
/** @var Plugin */
private $plugin;
/**
* @param Plugin $plugin
* @param Permissible $permissible
*
* @throws PluginException
*/
public function __construct(Plugin $plugin, Permissible $permissible){
if(!$plugin->isEnabled()){
throw new PluginException("Plugin " . $plugin->getDescription()->getName() . " is disabled");
}
$this->permissible = $permissible;
$this->plugin = $plugin;
}
/**
* @return Plugin
*/
public function getPlugin(){
return $this->plugin;
}
/**
* @param PermissionRemovedExecutor $ex
*/
public function setRemovalCallback(PermissionRemovedExecutor $ex){
$this->removed = $ex;
}
/**
* @return PermissionRemovedExecutor
*/
public function getRemovalCallback(){
return $this->removed;
}
/**
* @return Permissible
*/
public function getPermissible(){
return $this->permissible;
}
/**
* @return bool[]
*/
public function getPermissions(){
return $this->permissions;
}
/**
* @return void
*/
public function clearPermissions(){
$this->permissions = [];
$this->permissible->recalculatePermissions();
}
/**
* @param bool[] $permissions
*/
public function setPermissions(array $permissions){
foreach($permissions as $key => $value){
$this->permissions[$key] = (bool) $value;
}
$this->permissible->recalculatePermissions();
}
/**
* @param string[] $permissions
*/
public function unsetPermissions(array $permissions){
foreach($permissions as $node){
unset($this->permissions[$node]);
}
$this->permissible->recalculatePermissions();
}
/**
* @param string|Permission $name
* @param bool $value
*/
public function setPermission($name, $value){
$name = $name instanceof Permission ? $name->getName() : $name;
if(isset($this->permissions[$name])){
if($this->permissions[$name] === $value){
return;
}
unset($this->permissions[$name]); //Fixes children getting overwritten
}
$this->permissions[$name] = $value;
$this->permissible->recalculatePermissions();
}
/**
* @param string|Permission $name
*/
public function unsetPermission($name){
$name = $name instanceof Permission ? $name->getName() : $name;
if(isset($this->permissions[$name])){
unset($this->permissions[$name]);
$this->permissible->recalculatePermissions();
}
}
/**
* @return void
*/
public function remove(){
$this->permissible->removeAttachment($this);
}
}
|
{
"pile_set_name": "Github"
}
|
af: afrikáans
af_NA: 'afrikáans (Namibia)'
af_ZA: 'afrikáans (Sudáfrica)'
ak: akan
ak_GH: 'akan (Ghana)'
sq: albanés
sq_AL: 'albanés (Albania)'
sq_XK: 'albanés (Kosovo)'
sq_MK: 'albanés (Macedonia)'
de: alemán
de_DE: 'alemán (Alemania)'
de_AT: 'alemán (Austria)'
de_BE: 'alemán (Bélgica)'
de_LI: 'alemán (Liechtenstein)'
de_LU: 'alemán (Luxemburgo)'
de_CH: 'alemán (Suiza)'
am: amárico
am_ET: 'amárico (Etiopía)'
ar: árabe
ar_SA: 'árabe (Arabia Saudí)'
ar_DZ: 'árabe (Argelia)'
ar_BH: 'árabe (Baréin)'
ar_QA: 'árabe (Catar)'
ar_TD: 'árabe (Chad)'
ar_KM: 'árabe (Comoras)'
ar_EG: 'árabe (Egipto)'
ar_AE: 'árabe (Emiratos Árabes Unidos)'
ar_ER: 'árabe (Eritrea)'
ar_IQ: 'árabe (Iraq)'
ar_IL: 'árabe (Israel)'
ar_JO: 'árabe (Jordania)'
ar_KW: 'árabe (Kuwait)'
ar_LB: 'árabe (Líbano)'
ar_LY: 'árabe (Libia)'
ar_MA: 'árabe (Marruecos)'
ar_MR: 'árabe (Mauritania)'
ar_OM: 'árabe (Omán)'
ar_EH: 'árabe (Sáhara Occidental)'
ar_SY: 'árabe (Siria)'
ar_SO: 'árabe (Somalia)'
ar_SS: 'árabe (Sudán del Sur)'
ar_SD: 'árabe (Sudán)'
ar_PS: 'árabe (Territorios Palestinos)'
ar_TN: 'árabe (Túnez)'
ar_YE: 'árabe (Yemen)'
ar_DJ: 'árabe (Yibuti)'
hy: armenio
hy_AM: 'armenio (Armenia)'
as: asamés
as_IN: 'asamés (India)'
az: azerí
az_AZ: 'azerí (Azerbaiyán)'
az_Cyrl_AZ: 'azerí (cirílico, Azerbaiyán)'
az_Cyrl: 'azerí (cirílico)'
az_Latn_AZ: 'azerí (latín, Azerbaiyán)'
az_Latn: 'azerí (latín)'
bm: bambara
bm_Latn_ML: 'bambara (latín, Mali)'
bm_Latn: 'bambara (latín)'
bn: bengalí
bn_BD: 'bengalí (Bangladés)'
bn_IN: 'bengalí (India)'
be: bielorruso
be_BY: 'bielorruso (Bielorrusia)'
my: birmano
my_MM: 'birmano (Myanmar (Birmania))'
nb: 'bokmal noruego'
nb_NO: 'bokmal noruego (Noruega)'
nb_SJ: 'bokmal noruego (Svalbard y Jan Mayen)'
bs: bosnio
bs_BA: 'bosnio (Bosnia-Herzegovina)'
bs_Cyrl_BA: 'bosnio (cirílico, Bosnia-Herzegovina)'
bs_Cyrl: 'bosnio (cirílico)'
bs_Latn_BA: 'bosnio (latín, Bosnia-Herzegovina)'
bs_Latn: 'bosnio (latín)'
br: bretón
br_FR: 'bretón (Francia)'
bg: búlgaro
bg_BG: 'búlgaro (Bulgaria)'
ks: cachemiro
ks_Arab_IN: 'cachemiro (árabe, India)'
ks_Arab: 'cachemiro (árabe)'
ks_IN: 'cachemiro (India)'
kn: canarés
kn_IN: 'canarés (India)'
ca: catalán
ca_AD: 'catalán (Andorra)'
ca_ES: 'catalán (España)'
ca_FR: 'catalán (Francia)'
ca_IT: 'catalán (Italia)'
cs: checo
cs_CZ: 'checo (República Checa)'
zh: chino
zh_CN: 'chino (China)'
zh_HK: 'chino (RAE de Hong Kong (China))'
zh_MO: 'chino (RAE de Macao (China))'
zh_Hans_CN: 'chino (simplificado, China)'
zh_Hans_HK: 'chino (simplificado, RAE de Hong Kong (China))'
zh_Hans_MO: 'chino (simplificado, RAE de Macao (China))'
zh_Hans_SG: 'chino (simplificado, Singapur)'
zh_Hans: 'chino (simplificado)'
zh_SG: 'chino (Singapur)'
zh_TW: 'chino (Taiwán)'
zh_Hant_HK: 'chino (tradicional, RAE de Hong Kong (China))'
zh_Hant_MO: 'chino (tradicional, RAE de Macao (China))'
zh_Hant_TW: 'chino (tradicional, Taiwán)'
zh_Hant: 'chino (tradicional)'
si: cingalés
si_LK: 'cingalés (Sri Lanka)'
ko: coreano
ko_KP: 'coreano (Corea del Norte)'
ko_KR: 'coreano (Corea del Sur)'
kw: córnico
kw_GB: 'córnico (Reino Unido)'
hr: croata
hr_BA: 'croata (Bosnia-Herzegovina)'
hr_HR: 'croata (Croacia)'
da: danés
da_DK: 'danés (Dinamarca)'
da_GL: 'danés (Groenlandia)'
dz: dzongkha
dz_BT: 'dzongkha (Bután)'
sk: eslovaco
sk_SK: 'eslovaco (Eslovaquia)'
sl: esloveno
sl_SI: 'esloveno (Eslovenia)'
es: español
es_AR: 'español (Argentina)'
es_BO: 'español (Bolivia)'
es_EA: 'español (Ceuta y Melilla)'
es_CL: 'español (Chile)'
es_CO: 'español (Colombia)'
es_CR: 'español (Costa Rica)'
es_CU: 'español (Cuba)'
es_EC: 'español (Ecuador)'
es_SV: 'español (El Salvador)'
es_ES: 'español (España)'
es_US: 'español (Estados Unidos)'
es_PH: 'español (Filipinas)'
es_GT: 'español (Guatemala)'
es_GQ: 'español (Guinea Ecuatorial)'
es_HN: 'español (Honduras)'
es_IC: 'español (islas Canarias)'
es_MX: 'español (México)'
es_NI: 'español (Nicaragua)'
es_PA: 'español (Panamá)'
es_PY: 'español (Paraguay)'
es_PE: 'español (Perú)'
es_PR: 'español (Puerto Rico)'
es_DO: 'español (República Dominicana)'
es_UY: 'español (Uruguay)'
es_VE: 'español (Venezuela)'
eo: esperanto
et
|
{
"pile_set_name": "Github"
}
|