# Introduction
This document presents the principles that govern how the SARIF TC incorporates changes
into the SARIF spec, and describes the workflow for incorporating agreed-upon changes.
# Principles and process
1. We encourage discussion on the public issues. The Editors will ensure that issues are
filed in the GitHub repo, and will "curate" them by applying issue labels, ensuring
content from other sources (email) is recorded in the GitHub issue, _etc._
1. Any issues that require time-sensitive discussion will be driven through the mailing
list (this should be rare).
1. Once a revision of the Working Draft has been accepted (for example,
`sarif-v1.0-wd01.docx`), the Editors will create a "provisional" draft with the next
number (for example, `sarif-v1.0-wd02-provisional.docx`)
in the Documents\ProvisionalDrafts folder of the repo.
1. When a change is proposed, the Editors will push a copy of the _current_ provisional draft
to the Documents\ChangeDrafts folder of the repo.
The copy will have a name of the form `sarif-issue-<issueNumber>-<mnemonic>.docx`.
We refer to this copy as a "change draft".
**NOTE**: This does *not* require a PR.
The "mnemonic" is a short string (at most a few words) that distinguishes
this change from other, competing proposals that address the same issue.
For example, if there are two competing proposals to address Issue #3,
there might be two change drafts:
* `sarif-issue-3-multiple-classifications.docx`
* `sarif-issue-3-single-classification.docx`
1. The Editors will make all changes in the change draft, with change tracking enabled.
1. The Editors will inform the TC of the location of the change draft, and will invite comments on the draft in the associated GitHub issue.
1. When an issue is ready for final approval, the Editors will:
1. Label the issue `ready-for-approval`.
1. Place a comment at the top of the change draft stating that the proposal is ready.
1. Push the latest version of the change draft to the repo.
**NOTE**: This does *not* require a PR.
1. If an issue requires further discussion at the next TC meeting, the Editors will:
1. Label the issue `discussion-ongoing`.
1. Place a comment at the top of the change draft stating that the proposal requires
further discussion.
1. Push the latest version of the change draft to the repo.
**NOTE**: This does *not* require a PR.
1. The TC decides whether to approve the change at the next TC meeting, as follows:
1. The Editors announce the impending vote by including an item in the meeting agenda.
1. When the agenda item arises, one of the Editors moves that the change be adopted.
1. The motion is debated as usual under Robert's Rules (although ideally, any controversy will have been resolved in previous meetings or in the GitHub issue discussion thread).
1. A simple majority vote adopts the motion to incorporate the changes from the change draft into the current Provisional Draft.
1. If the TC approves a change (or one of a set of competing changes), the Editors will:
1. Merge the changes from the approved change draft into the provisional draft, with change tracking enabled.
**NOTE**: We can use Word's "Combine documents" feature (on the Review tab of the ribbon) to accomplish this easily.
1. Label the issue `resolved-fixed`.
1. Close the issue.
1. Place a comment at the top of the approved change draft stating that the proposal was approved.
1. Place a comment at the top of any competing change draft stating that the proposal was rejected.
1. If the TC rejects a change (or rejects every one of a set of competing changes),
and if the TC further decides not to continue to address the issue, the Editors will:
1. Label the issue `resolved-wont-fix`.
1. Close the issue.
1. Place a comment at the top of every associated change draft stating that the proposal
was rejected.
1. At certain times, the TC might decide to capture the current state of the provisional draft
in a revised Working Draft (with the next revision number). In that case, the Editors
will:
1. Accept all change-tracked changes in the provisional draft
(for example, `sarif-v1.0-wd02-provisional.docx`), and remove all comments.
1. Modify the document metadata in the provisional draft so that the correct document identifier
(in this example, `sarif-v1.0-wd02`) appears in the document footer.
1. Copy the provisional draft to a file with the correct name for the next Working Draft
(in this example, copying `sarif-v1.0-wd02-provisional.docx` to `sarif-v1.0-wd02.docx`).
1. Rename the provisional draft to the next version number (in this example,
renaming `sarif-v1.0-wd02-provisional.docx` to `sarif-v1.0-wd03-provisional.docx`).
1. This process will be documented in the file `Workflow.md` in the repository.
1. Changes to this process will be requested by opening an issue in the repository, and
must be approved at a TC meeting.
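The change-draft and Working-Draft file operations described above can be sketched as shell commands. All file names, issue numbers, and revision numbers below are illustrative (paths shown POSIX-style; the repo folders use backslashes on Windows):

```shell
# Illustrative setup: the repo layout described in this document.
mkdir -p Documents/ProvisionalDrafts Documents/ChangeDrafts
touch Documents/ProvisionalDrafts/sarif-v1.0-wd02-provisional.docx

# Proposing a change: copy the current provisional draft to a change draft
# named for the issue it addresses (issue number and mnemonic are examples).
cp Documents/ProvisionalDrafts/sarif-v1.0-wd02-provisional.docx \
   Documents/ChangeDrafts/sarif-issue-3-multiple-classifications.docx

# Capturing a Working Draft: copy the provisional draft to the Working Draft
# name, then rename the provisional draft to the next revision number.
cp Documents/ProvisionalDrafts/sarif-v1.0-wd02-provisional.docx \
   Documents/sarif-v1.0-wd02.docx
mv Documents/ProvisionalDrafts/sarif-v1.0-wd02-provisional.docx \
   Documents/ProvisionalDrafts/sarif-v1.0-wd03-provisional.docx
```

Both copy steps are plain file copies followed by a commit and push; as noted above, neither requires a PR.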
# Issue labels
We will track the workflow status of issues in the repo with a set of labels.
These labels are mostly a subset of the labels in the [original spec repo](https://github.com/sarif-standard/sarif-spec):
- `bug`: The issue prevents the file format from correctly and consistently representing the information necessary to support the scenarios for which it is designed.
- `enhancement`: The proposal adds support to the spec for a new scenario, or provides richer support for a supported scenario.
- `domain-result-management`: The issue is specific to the domain of result management.
- `domain-security`: The issue is specific to the security domain.
- `impact-breaks-consumers`: The proposed change to the format would prevent consumers of the format, such as viewers or result management systems, from consuming some or all valid log files.
- `impact-breaks-producers`: The proposed change to the format would render invalid log files created by existing producers, such as analysis tools or converters.
- `impact-documentation-only`: The proposed change clarifies or enhances the documentation, but does not affect the format.
- `impact-non-breaking-change`: The proposed change to the format is backward compatible with all conforming producers and consumers.
- `process`: The issue relates to the process of producing the TC's work products, rather than to the content of the work products themselves.
- `prototype-needed`: The practicality of implementing the proposed change is unclear; the change should be prototyped in code before being considered for adoption.
- `question`: The issue is a request for information, not a proposal to change the format or the documentation.
- `discussion-ongoing`: The issue requires further discussion at the next TC meeting.
- `ready-for-approval`: The issue has been discussed and a resolution reached, the spec has been edited with change tracking to reflect the change, and the change is ready for approval at the next TC meeting.
- `resolved-by-design`: If the issue is a bug, this label means that the existing behavior is as intended and will not be changed.
- `resolved-deferred`: The issue is deferred for consideration in a future version of the specification.
- `resolved-duplicate`: The issue is a duplicate of another.
- `resolved-fixed`: The changes to the spec have been approved at a TC meeting and have been merged into the Working Draft.
- `resolved-wont-fix`: If the issue is a bug, this label means that the bug will not be fixed. (This should be rare!) If the issue is an enhancement, this label means that the TC has decided not to incorporate the proposed change into the spec.
|
package io.openfuture.api.entity.scaffold
import io.openfuture.api.entity.base.Dictionary
enum class Blockchain(
private val id: Int,
private val value: String
) : Dictionary {
Ethereum(1, "Ethereum"),
OPEN(2, "OPEN Chain")
;
override fun getId(): Int = id
fun getValue(): String = value
}
|
mod poly_impl;
#[proc_macro]
pub fn poly(input: proc_macro::TokenStream) -> proc_macro::TokenStream {
poly_impl::poly(input)
}
|
---
title: Workload
---
A process/binary deployed by operators in Istio, typically represented by entities such as containers, pods, or VMs.
* A workload can expose zero or more [service endpoints](#service-endpoint).
* A workload can consume zero or more [services](#service).
* Each workload has a single canonical [service name](#service-name) associated with it, but
may also represent additional service names.
|
Tutorial
--------
### Functions
A function, similar to those in math, is a collection of statements designed to perform a specific task. Commonly, functions take inputs (parameters), which are then processed to return an output. Below is the general format of a function in C++:
type name_of_function (parameters, parameters...){
statements
}
* **type**: Type of variable the function will return
* **name_of_function**: Name of the function that is used to call the function
* **parameters**: Inputs that the function will use. Specify the data type followed by the variable name. Separate multiple inputs with `,`.
* **statements**: Lines of code that will perform the function's task.
To call the function, you simply write the name of the function followed by parentheses (if the function requires parameters, the arguments go between the parentheses):
int squareNumbers(int x){ // declares function "squareNumbers" that takes a parameter x
    int y = x*x; // creates an int variable equal to x squared
    return y;    // returns the value of y when the function is called
}
int main(){
    int input = 9;
    int output = squareNumbers(input);
    // the function is called, setting the int variable "output" to input squared
}
Another example:
void helloThere(string name){//void means this function doesn't return anything
cout << "Hello, " << name;
}
int main(){
helloThere("Celina"); //prints out "Hello, Celina"
}
Most commonly, functions are defined outside of the `main` function.
Exercise
---------
In this exercise, you will create a function that prints the sum of the given variables a, b, and c. Below is the given code.
Tutorial Code
-------------
#include <iostream>
using namespace std;
// your code goes here (declare function)
int main (){
int a = 1;
int b = 4;
int c = 3;
// your code goes here (call the function)
}
Expected Output
---------------
8
Solution
--------
#include <iostream>
using namespace std;
void addition(int a, int b, int c){
cout << a+b+c;
}
int main (){
int a = 1;
int b = 4;
int c = 3;
addition(a,b,c);
}
|
package com.kimi.my.shop.web.admin.dao;
import com.kimi.my.shop.commons.persistence.BaseDao;
import com.kimi.my.shop.domain.Product;
import org.springframework.stereotype.Repository;
@Repository
public interface ProductDao extends BaseDao<Product> {
}
|
module Refinery
class UserPlugin < ActiveRecord::Base
belongs_to :user
attr_accessible :user_id, :name, :position
end
end
|
<?php
namespace Kms\Request\V20160120;
/**
* @deprecated Please use https://github.com/aliyun/openapi-sdk-php
*
* Request of GetParametersForImport
*
* @method string getKeyId()
* @method string getWrappingAlgorithm()
* @method string getWrappingKeySpec()
*/
class GetParametersForImportRequest extends \RpcAcsRequest
{
/**
* @var string
*/
protected $requestScheme = 'https';
/**
* @var string
*/
protected $method = 'POST';
/**
* Class constructor.
*/
public function __construct()
{
parent::__construct(
'Kms',
'2016-01-20',
'GetParametersForImport',
'kms'
);
}
/**
* @param string $keyId
*
* @return $this
*/
public function setKeyId($keyId)
{
$this->requestParameters['KeyId'] = $keyId;
$this->queryParameters['KeyId'] = $keyId;
return $this;
}
/**
* @param string $wrappingAlgorithm
*
* @return $this
*/
public function setWrappingAlgorithm($wrappingAlgorithm)
{
$this->requestParameters['WrappingAlgorithm'] = $wrappingAlgorithm;
$this->queryParameters['WrappingAlgorithm'] = $wrappingAlgorithm;
return $this;
}
/**
* @param string $wrappingKeySpec
*
* @return $this
*/
public function setWrappingKeySpec($wrappingKeySpec)
{
$this->requestParameters['WrappingKeySpec'] = $wrappingKeySpec;
$this->queryParameters['WrappingKeySpec'] = $wrappingKeySpec;
return $this;
}
}
|
// Copyright 2019 The Fuchsia Authors. All rights reserved.
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
use {
crate::story_context_store::StoryContextStore,
crate::utils,
failure::{format_err, Error, ResultExt},
fidl_fuchsia_app_discover::{
ModuleIdentifier, ModuleOutputWriterRequest, ModuleOutputWriterRequestStream,
},
fuchsia_async as fasync,
fuchsia_syslog::macros::*,
futures::prelude::*,
parking_lot::Mutex,
std::sync::Arc,
};
/// The ModuleOutputWriter protocol implementation.
pub struct ModuleOutputWriterService {
/// The story id to which the module belongs.
story_id: String,
/// The module id in story |story_id| to which the output belongs.
module_id: String,
/// Reference to the context store.
story_context_store: Arc<Mutex<StoryContextStore>>,
}
impl ModuleOutputWriterService {
/// Create a new module writer instance from an identifier.
pub fn new(
story_context_store: Arc<Mutex<StoryContextStore>>,
module: ModuleIdentifier,
) -> Result<Self, Error> {
Ok(ModuleOutputWriterService {
story_id: module.story_id.ok_or(format_err!("expected story id"))?,
module_id: utils::encoded_module_path(
module.module_path.ok_or(format_err!("expected mod path"))?,
),
story_context_store,
})
}
/// Handle a stream of ModuleOutputWriter requests.
pub fn spawn(self, mut stream: ModuleOutputWriterRequestStream) {
fasync::spawn(
async move {
while let Some(request) = await!(stream.try_next()).context(format!(
"Error running module output for {:?} {:?}",
self.story_id, self.module_id,
))? {
match request {
ModuleOutputWriterRequest::Write {
output_name,
entity_reference,
responder,
} => {
self.handle_write(output_name, entity_reference);
responder.send(&mut Ok(()))?;
}
}
}
Ok(())
}
.unwrap_or_else(|e: Error| fx_log_err!("error serving module output {}", e)),
)
}
/// Write to the given |entity_reference| to the context store and associate
/// it to this module output |output_name|. If no entity reference is given,
/// clear that output.
fn handle_write(&self, output_name: String, entity_reference: Option<String>) {
// TODO(miguelfrde): verify the output_name matches an output in
// the manifest.
fx_log_info!(
"Got write for parameter name:{}, story:{}, mod:{:?}",
output_name,
self.story_id,
self.module_id,
);
match entity_reference {
// TODO(miguelfrde): verify the reference exists.
Some(reference) => self.story_context_store.lock().contribute(
&self.story_id,
&self.module_id,
&output_name,
&reference,
),
None => self.story_context_store.lock().withdraw(
&self.story_id,
&self.module_id,
vec![&output_name],
),
}
}
}
#[cfg(test)]
mod tests {
use super::*;
use crate::story_context_store::{ContextEntity, Contributor};
use fidl_fuchsia_app_discover::ModuleOutputWriterMarker;
#[fasync::run_until_stalled(test)]
async fn test_write() {
let state = Arc::new(Mutex::new(StoryContextStore::new()));
// Initialize service client and server.
let (client, request_stream) =
fidl::endpoints::create_proxy_and_stream::<ModuleOutputWriterMarker>().unwrap();
let module = ModuleIdentifier {
story_id: Some("story1".to_string()),
module_path: Some(vec!["mod-a".to_string()]),
};
ModuleOutputWriterService::new(state.clone(), module).unwrap().spawn(request_stream);
// Write a module output.
assert!(await!(client.write("param-foo", Some("foo"))).is_ok());
// Verify we have one entity with the right contributor.
{
let context_store = state.lock();
let result = context_store.current().collect::<Vec<&ContextEntity>>();
let mut expected_entity = ContextEntity::new("foo");
expected_entity.add_contributor(Contributor::module_new(
"story1",
"mod-a",
"param-foo",
));
assert_eq!(result.len(), 1);
assert_eq!(result[0], &expected_entity);
}
// Write no entity to the same output. This should withdraw the entity.
assert!(await!(client.write("param-foo", None)).is_ok());
// Verify we have no values.
let context_store = state.lock();
let result = context_store.current().collect::<Vec<&ContextEntity>>();
assert_eq!(result.len(), 0);
}
}
|
---
title: 2020 Photos
slug: "2020"
thumbnail: /photos/2020/images/20200219-DSC00844.jpg
---
|
require 'test_helper'
class IndustriesControllerTest < ActionDispatch::IntegrationTest
include Devise::Test::IntegrationHelpers
setup do
@industry = industries(:one)
@user = users(:one)
end
test "should get index" do
get industries_url
assert_response :success
end
test "should get new" do
sign_in @user
get new_industry_url
assert_response :success
end
test "should create industry" do
sign_in @user
assert_difference('Industry.count') do
post industries_url, params: { industry: { name: "#{@industry.name}3", sector_id: @industry.sector_id } }
end
assert_redirected_to industry_url(Industry.last)
end
test "should not create industry" do
sign_in @user
assert_difference('Industry.count', 0) do
post industries_url, params: { industry: { name: "", sector_id: @industry.sector_id } }
end
assert_response :success
end
test "should show industry" do
get industry_url(@industry)
assert_response :success
end
test "should get edit" do
sign_in @user
get edit_industry_url(@industry)
assert_response :success
end
test "should update industry" do
sign_in @user
patch industry_url(@industry), params: { industry: { name: @industry.name, sector_id: @industry.sector_id } }
assert_redirected_to industry_url(@industry)
end
test "should not update industry" do
sign_in @user
patch industry_url(@industry), params: { industry: { name: "", sector_id: @industry.sector_id } }
assert_response :success
end
test "should destroy industry" do
sign_in @user
assert_difference('Industry.count', 0) do
assert_difference('Industry.count') do
post industries_url, params: { industry: { name: "#{@industry.name}4", sector_id: @industry.sector_id } }
end
assert_difference('Industry.count', -1) do
delete industry_url(Industry.find_by_name("#{@industry.name}4"))
end
assert_redirected_to industries_url
end
end
end
|
import React, { useState } from 'react';
import { useSelector, useDispatch } from 'react-redux';
import {
Box,
createStyles,
MenuItem,
Select,
Switch,
Typography,
WithStyles,
withStyles,
} from '@material-ui/core';
import { LayerKey, LayerType } from '../../../../config/types';
import {
getDisplayBoundaryLayers,
getBoundaryLayerSingleton,
LayerDefinitions,
} from '../../../../config/utils';
import {
safeDispatchAddLayer,
safeDispatchRemoveLayer,
} from '../../../../utils/map-utils';
import {
layersSelector,
mapSelector,
} from '../../../../context/mapStateSlice/selectors';
import { useUrlHistory } from '../../../../utils/url-utils';
import { removeLayer } from '../../../../context/mapStateSlice';
import { useSafeTranslation } from '../../../../i18n';
import { clearDataset } from '../../../../context/datasetStateSlice';
function SwitchItem({ classes, layer }: SwitchItemProps) {
const { t } = useSafeTranslation();
const selectedLayers = useSelector(layersSelector);
const map = useSelector(mapSelector);
const dispatch = useDispatch();
const { updateHistory, removeKeyFromUrl } = useUrlHistory();
const { id: layerId, title: layerTitle, group } = layer;
const selected = selectedLayers.some(({ id: testId }) => {
return (
testId === layerId || (group && group.layers.some(l => l.id === testId))
);
});
const selectedActiveLayer = selected
? selectedLayers.filter(sl => {
return (
(group?.activateAll &&
group?.layers.find(l => l.id === sl.id && l.main)) ||
(!group?.activateAll && group?.layers.map(l => l.id).includes(sl.id))
);
})
: [];
const initialActiveLayer =
selectedActiveLayer.length > 0 ? selectedActiveLayer[0].id : null;
const [activeLayer, setActiveLayer] = useState(
initialActiveLayer || (group?.layers?.find(l => l.main)?.id as string),
);
const validatedTitle = t(group?.groupTitle || layerTitle || '');
const toggleLayerValue = (selectedLayerId: string, checked: boolean) => {
const ADMIN_LEVEL_DATA_LAYER_KEY = 'admin_level_data';
// clear previous table dataset loaded first
// to close the dataseries and thus close chart
dispatch(clearDataset());
const urlLayerKey =
layer.type === ADMIN_LEVEL_DATA_LAYER_KEY
? 'baselineLayerId'
: 'hazardLayerId';
const selectedLayer = group
? LayerDefinitions[selectedLayerId as LayerKey]
: layer;
if (checked) {
updateHistory(urlLayerKey, selectedLayer.id);
const defaultBoundary = getBoundaryLayerSingleton();
if (
!('boundary' in selectedLayer) &&
selectedLayer.type === ADMIN_LEVEL_DATA_LAYER_KEY
) {
safeDispatchAddLayer(map, defaultBoundary, dispatch);
}
} else {
removeKeyFromUrl(urlLayerKey);
dispatch(removeLayer(selectedLayer));
// For admin boundary layers with boundary property
// we have to de-activate the unique boundary and re-activate
// default boundaries
if ('boundary' in selectedLayer) {
const boundaryId = selectedLayer.boundary || '';
if (Object.keys(LayerDefinitions).includes(boundaryId)) {
const displayBoundaryLayers = getDisplayBoundaryLayers();
const uniqueBoundaryLayer = LayerDefinitions[boundaryId as LayerKey];
if (
!displayBoundaryLayers
.map(l => l.id)
.includes(uniqueBoundaryLayer.id)
) {
safeDispatchRemoveLayer(map, uniqueBoundaryLayer, dispatch);
}
displayBoundaryLayers.forEach(l => {
safeDispatchAddLayer(map, l, dispatch);
});
}
}
}
};
const handleSelect = (
event: React.ChangeEvent<{ value: string | unknown }>,
) => {
const selectedId = event.target.value;
setActiveLayer(selectedId as string);
toggleLayerValue(selectedId as string, true);
};
const menuTitle = group ? (
<>
<Typography className={classes.title}>{validatedTitle}</Typography>
{!group.activateAll && (
<Select
className={classes.select}
classes={{ root: classes.selectItem }}
value={activeLayer}
onChange={e => handleSelect(e)}
>
{group.layers.map(menu => (
<MenuItem key={menu.id} value={menu.id}>
{t(menu.label)}
</MenuItem>
))}
</Select>
)}
</>
) : (
<Typography className={classes.title}>{validatedTitle}</Typography>
);
return (
<Box key={layerId} display="flex" mb={1}>
<Switch
size="small"
color="default"
className={classes.switch}
checked={selected}
onChange={e => toggleLayerValue(activeLayer, e.target.checked)}
inputProps={{
'aria-label': validatedTitle,
}}
/>
{menuTitle}
</Box>
);
}
const styles = () =>
createStyles({
title: {
lineHeight: 1.8,
},
select: {
'&::before': {
border: 'none',
},
},
selectItem: {
whiteSpace: 'normal',
fontSize: 13,
fontWeight: 300,
padding: 0,
marginLeft: 5,
},
switch: {
marginRight: 2,
},
});
export interface SwitchItemProps extends WithStyles<typeof styles> {
layer: LayerType;
}
export default withStyles(styles)(SwitchItem);
|
import bpy
from bpy.props import *
from . base import InsertClassTemplateBase, insert_template
from .. utils.variable_name_conversion import (get_valid_variable_name,
get_lower_case_with_underscores,
get_separated_capitalized_words)
menu_type_items = [
("NORMAL", "Normal", ""),
("PIE", "Pie", "") ]
class InsertMenu(bpy.types.Operator, InsertClassTemplateBase):
bl_idname = "code_autocomplete.insert_menu"
bl_label = "Insert Menu"
bl_description = ""
menu_type = EnumProperty(items = menu_type_items, default = "NORMAL")
def execute(self, context):
if self.menu_type == "NORMAL": code = menu_template
if self.menu_type == "PIE": code = pie_menu_template
changes = {
"CLASS_NAME" : get_valid_variable_name(self.class_name),
"ID_NAME" : "view3d." + get_lower_case_with_underscores(self.class_name),
"LABEL" : get_separated_capitalized_words(self.class_name) }
insert_template(code, changes)
return {"FINISHED"}
menu_template = '''class CLASS_NAME(bpy.types.Menu):
bl_idname = "ID_NAME"
bl_label = "LABEL"
def draw(self, context):
layout = self.layout
'''
pie_menu_template = '''class CLASS_NAME(bpy.types.Menu):
bl_idname = "ID_NAME"
bl_label = "LABEL"
def draw(self, context):
pie = self.layout.menu_pie()
'''
|
using MongoDB.Bson.Serialization.Attributes;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
namespace KnowledgeApi.Models
{
public class Article : MongoBaseModel
{
[BsonElement("title")]
//[Required(ErrorMessage = "Başlık alanı boş geçilemez.")]
public string Title { get; set; }
[BsonElement("content")]
//[Required(ErrorMessage = "İçerik alanı boş geçilemez.")]
public string Content { get; set; }
[BsonElement("description")]
//[Required(ErrorMessage = "Açıklama alanı boş geçilemez.")]
public string Description { get; set; }
[BsonElement("topics")]
public string Topics { get; set; }
[BsonElement("url")]
//[Required(ErrorMessage = "Url alanı boş geçilemez.")]
public string Url { get; set; }
[BsonElement("articletype")]
public string ArticleType { get; set; }
[BsonElement("image")]
//[Required(ErrorMessage = "Resim alanı boş geçilemez.")]
public string Image{ get; set; }
[BsonElement("dates")]
public string Dates { get; set; }
[BsonElement("arttypedetail")]
public IList<ArtType> ArttypeDetail { get; set; }
}
}
|
SELECT FirstName,
MiddleName,
LastName
FROM Employees
|
import { createBrowserHistory } from 'history';
import { nodeStore } from './nodeStore'
import { gatewayStore } from './gatewayStore'
import { userStore } from './userStore'
import { authenticationStore } from './authenticationStore'
import { tradeOrderStore } from './tradeOrderStore';
import { tradeAccountStore } from './tradeAccountStore';
import { tradePositionStore } from './tradePositionStore';
import { tradeContractStore } from './tradeContractStore';
import { tradeTradeStore } from './tradeTradeStore';
import { tradeTickStore } from './tradeTickStore';
import { tradeActionStore } from './tradeActionStore';
import { customizeStore } from './customizeStore';
import { marketDataRecordingStore } from './marketDataRecordingStore';
import { operatorStore } from './operatorStore';
import { observable, action, makeObservable, observe } from 'mobx';
export class RouterStore {
location = null;
history: any = null;
constructor() {
makeObservable(this, {
location: observable,
_updateLocation: action
});
this.push = this.push.bind(this);
this.replace = this.replace.bind(this);
this.go = this.go.bind(this);
this.goBack = this.goBack.bind(this);
this.goForward = this.goForward.bind(this);
}
_updateLocation(newState: any) {
this.location = newState;
}
/*
* History methods
*/
push = (location: any, state: any) => {
this.history.push(location, state);
}
replace = (location: any, state: any) => {
this.history.replace(location, state);
}
go = (n: any) => {
this.history.go(n);
}
goBack = () => {
this.history.goBack();
}
goForward = () => {
this.history.goForward();
}
};
export const syncHistoryWithStore = (history: any, store: any) => {
// Initialise store
store.history = history;
// Handle update from history object
const handleLocationChange = (location: any) => {
store._updateLocation(location);
};
const unsubscribeFromHistory = history.listen(handleLocationChange);
handleLocationChange(history.location);
const subscribe = (listener: any) => {
const onStoreChange = () => {
const rawLocation = { ...store.location };
listener(rawLocation, history.action);
};
// Listen for changes to location state in store
const unsubscribeFromStore = observe(store, 'location', onStoreChange);
listener(store.location, history.action);
return unsubscribeFromStore;
};
history.subscribe = subscribe;
history.unsubscribe = unsubscribeFromHistory;
return history;
};
export const browserHistory = createBrowserHistory();
export const routingStore = new RouterStore();
export const history = syncHistoryWithStore(browserHistory, routingStore);
export {
authenticationStore,
nodeStore,
gatewayStore,
userStore,
tradeOrderStore,
tradeAccountStore,
tradePositionStore,
tradeContractStore,
tradeTradeStore,
tradeTickStore,
tradeActionStore,
customizeStore,
marketDataRecordingStore,
operatorStore
}
|
import pytest
from brownie import accounts, reverts
from settings import *
# reset the chain after every test case
@pytest.fixture(autouse=True)
def isolation(fn_isolation):
pass
@pytest.fixture(scope='function')
def test_mint_token(nft):
_from = accounts[0]
to = accounts[1]
token_price = 0.1 * TENPOW18
status = True
old_token_id = nft.tokenId()
old_balance = nft.balanceOf(to)
tx = nft.mintToken(to, token_price, "", status, {"from": _from})
assert 'TokenMinted' in tx.events
new_token_id = nft.tokenId()
new_balance = nft.balanceOf(to)
token_owner = nft.ownerOf(new_token_id)
assert new_token_id == old_token_id + 1
assert new_balance == old_balance + 1
assert token_owner == to
def test_burn_token(nft, test_mint_token):
token_id = nft.tokenId()
assert token_id != 0
tx = nft.burnToken(token_id, {"from": accounts[1] })
assert 'TokenBurned' in tx.events
with reverts():
nft.ownerOf(token_id)
def test_buy_token(nft, test_mint_token):
token_id = nft.tokenId()
assert token_id != 0
value = 0.1 * TENPOW18
old_owner = nft.ownerOf(token_id)
tx = nft.buyToken(token_id, {"from": accounts[2], "value": value })
assert 'TokenBought' in tx.events
new_owner = nft.ownerOf(token_id)
assert old_owner != new_owner
assert new_owner == accounts[2]
|
/**
* ZeroDark.cloud Framework
*
* Homepage : https://www.zerodark.cloud
* GitHub : https://github.com/4th-ATechnologies/ZeroDark.cloud
* Documentation : https://zerodarkcloud.readthedocs.io/en/latest/
* API Reference : https://apis.zerodark.cloud
**/
#import <UIKit/UIKit.h>
//
// from http://stackoverflow.com/questions/18977527/how-do-i-display-the-standard-checkmark-on-a-uicollectionviewcell
//
typedef NS_ENUM( NSUInteger, ZDCCheckMarkStyle )
{
ZDCCheckMarkStyleOpenCircle,
ZDCCheckMarkStyleGrayedOut,
};
@interface ZDCCheckMark : UIView
@property (nonatomic) BOOL checked;
@property (nonatomic) ZDCCheckMarkStyle checkMarkStyle;
@end
|
module EthJsonRpc
module Exception
class EthJsonRpcError < StandardError; end
class ConnectionError < EthJsonRpcError; end
class BadStatusCodeError < EthJsonRpcError; end
class BadJsonError < EthJsonRpcError; end
class BadResponseError < EthJsonRpcError; end
end
end
|
using System.Collections.Generic;
using System.Runtime.Serialization;
namespace Domain.ViewModels
{
[DataContract]
public class UserRecoveryCodesViewModel
{
[DataMember(Name = "items", EmitDefaultValue = false)]
public IEnumerable<string> Items { get; set; }
}
}
|
# GlobalStyle
```js
import { GlobalStyle } from '@truework/ui'
```
## Usage
Provides base styles and a simple CSS reset based on the `theme`.
```jsx
import { ThemeProvider } from 'styled-components' // assumed peer dependency
import { theme, GlobalStyle } from '@truework/ui'

export default ({ children }) => (
  <ThemeProvider theme={theme}>
    <GlobalStyle />
    {children}
  </ThemeProvider>
)
```
### License
MIT License © [Truework](https://truework.com)
|
import { Provider } from 'react-redux';
import { mount, shallow } from 'enzyme';
import React from 'react';
import { createAppStore } from './store';
import ConnectedSignUpPage, { SignUpPage } from './SignUpPage';
const store = createAppStore();
it('renders without crashing', () => {
mount(
<Provider store={store}>
<ConnectedSignUpPage />
</Provider>,
);
});
describe('<App />', () => {
it('Generates App', () => {
const dispatch = jest.fn();
const wrapper = shallow(<SignUpPage dispatch={dispatch} isAuthenticated={false} />);
expect(wrapper.find('Connect(SignUp)')).toHaveLength(1);
wrapper.find('Connect(SignUp)').simulate('signUp', 'test');
expect(dispatch.mock.calls.length).toBe(1);
});
it('redirects to / if authenticated', () => {
const dispatch = jest.fn();
const wrapper = shallow(<SignUpPage dispatch={dispatch} isAuthenticated={true} />);
expect(wrapper.find('SignUp')).toHaveLength(0);
expect(wrapper.find('Redirect')).toHaveLength(1);
});
});
|
package payload.response
import payload.response.common.OrderFromRecent
import kotlinx.serialization.Serializable
@Serializable
data class RecentOrders private constructor(
val buy_orders: List<OrderFromRecent>,
val sell_orders: List<OrderFromRecent>
)
|
#![feature(no_coverage)]
#![feature(trivial_bounds)]
mod alternation_char_mutators;
mod char_mutators;
mod constrained_integer;
mod derived_recursive_struct;
mod expansions;
#[cfg(feature = "regex_grammar")]
mod grammar_based_mutators;
mod option;
mod vector;
|
/*********************************************************************
*
* BSD 3-Clause License
*
* Copyright (c) 2021, dengpw
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
*
* 1 Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2 Redistributions in binary form must reproduce the above
* copyright notice, this list of conditions and the following
* disclaimer in the documentation and/or other materials provided
* with the distribution.
* 3 Neither the name of the copyright holder nor the names of its
* contributors may be used to endorse or promote products derived from
* this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
* COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
* BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
* ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGE.
*
* Author: dengpw
*********************************************************************/
#include "planner_core.h"
#include "node3d.h"
#include <tf/transform_datatypes.h>
namespace hybrid_astar_planner {
void HybridAStarPlanner::publishPlan(const std::vector<geometry_msgs::PoseStamped>& path) {
if (!initialized_) {
ROS_ERROR(
"This planner has not been initialized yet; call initialize() before use");
return;
}
// Create a message for the plan
geometry_msgs::PoseStamped transform_path;
nav_msgs::Path gui_path;
unsigned int size = path.size();
gui_path.poses.resize(size);
gui_path.header.frame_id = frame_id_;
gui_path.header.stamp = ros::Time::now();
// Extract the plan in world coordinates; we assume the path is all in the same frame
for (unsigned int i = 0; i < size; i++) {
transform_path.pose.position = path[i].pose.position;
gui_path.poses[i] = transform_path;
}
plan_pub_.publish(gui_path);
// ROS_INFO("Publish the path to Rviz");
}//end of publishPlan
void HybridAStarPlanner::publishPathNodes(const std::vector<geometry_msgs::PoseStamped>& path) {
if (!initialized_) {
ROS_ERROR(
"This planner has not been initialized yet; call initialize() before use");
return;
}
visualization_msgs::Marker pathVehicle;
int nodeSize = path.size();
pathVehicle.header.stamp = ros::Time(0);
pathVehicle.color.r = 52.f / 255.f;
pathVehicle.color.g = 250.f / 255.f;
pathVehicle.color.b = 52.f / 255.f;
pathVehicle.type = visualization_msgs::Marker::ARROW;
pathVehicle.header.frame_id = frame_id_;
pathVehicle.scale.x = 0.22;
pathVehicle.scale.y = 0.18;
pathVehicle.scale.z = 0.12;
pathVehicle.color.a = 0.1;
// Convert the nodes, attaching timestamps and other header info
for (int i = 0; i < nodeSize; i++) {
pathVehicle.header.stamp = ros::Time(0);
pathVehicle.pose = path[i].pose;
pathVehicle.id = i;
pathNodes.markers.push_back(pathVehicle);
}
// Publish the vehicle position markers
path_vehicles_pub_.publish(pathNodes);
}//end of publishPathNodes
void HybridAStarPlanner::clearPathNodes() {
// Initialize a marker configured to clear all existing markers
visualization_msgs::Marker node;
pathNodes.markers.clear();
node.action = visualization_msgs::Marker::DELETEALL;
node.header.frame_id = frame_id_;
node.header.stamp = ros::Time(0);
node.id = 0;
pathNodes.markers.push_back(node);
path_vehicles_pub_.publish(pathNodes);
// ROS_INFO("Clean the path nodes");
}
void publishSearchNodes(Node3D node, ros::Publisher& pub,
visualization_msgs::MarkerArray& pathNodes, int i) {
visualization_msgs::Marker pathVehicle;
pathVehicle.header.stamp = ros::Time(0);
pathVehicle.color.r = 250.f / 255.f;
pathVehicle.color.g = 250.f / 255.f;
pathVehicle.color.b = 52.f / 255.f;
pathVehicle.type = visualization_msgs::Marker::ARROW;
pathVehicle.header.frame_id = "map";
pathVehicle.scale.x = 0.22;
pathVehicle.scale.y = 0.18;
pathVehicle.scale.z = 0.12;
pathVehicle.color.a = 0.1;
// Convert the node, attaching a timestamp and other header info
pathVehicle.header.stamp = ros::Time(0);
pathVehicle.pose.position.x = node.getX();
pathVehicle.pose.position.y = node.getY();
pathVehicle.pose.position.z = 0;
pathVehicle.pose.orientation = tf::createQuaternionMsgFromYaw(node.getT());
pathVehicle.id = i;
pathNodes.markers.push_back(pathVehicle);
// Publish the vehicle position markers
pub.publish(pathNodes);
}//end of publishSearchNodes
}
|
> [!NOTE]
> This topic applies to Dynamics 365 Finance and Dynamics 365 Retail.
|
import { CourseService } from "v1/api/course/course.service";
import { getDescribe } from "v1/tests/helpers/get-describe";
import { courseMock } from "v1/tests/mocks/course";
// eslint-disable-next-line jest/valid-title
describe(getDescribe(__filename), () => {
let service: CourseService;
beforeAll(async () => {
service = await courseMock.service();
});
it("should be defined", () => {
expect(service).toBeDefined();
});
});
|
# frozen_string_literal: true
module Configuration
class MefController < ApplicationController
layout "configuration"
before_action :identification_agent
before_action :set_mef, only: %i[show edit update destroy]
def index
etablissement = @agent_connecte.etablissement
@mef_service = MefDestination.new(etablissement)
@mef = Mef.where(etablissement: etablissement).order(:libelle)
@dossiers_sans_mef = DossierEleve.where(etablissement: etablissement, mef_destination: nil)
end
def new
@mef = Mef.new
end
def edit; end
def create
@mef = Mef.new(mef_params)
@mef.etablissement = agent_connecte.etablissement
if @mef.save
redirect_to configuration_mef_index_url, notice: t(".mef_cree")
else
flash[:alert] = t(".erreur_create_mef", champs: @mef.errors.first[0], erreur: @mef.errors.first[1])
render :new
end
end
def update
if @mef.update(mef_params)
redirect_to configuration_mef_index_url, notice: t(".mef_mis_a_jour")
else
render :edit
end
end
def destroy
mef_origine = DossierEleve.where(mef_origine: @mef)
mef_destination = DossierEleve.where(mef_destination: @mef)
if mef_origine.blank? && mef_destination.blank?
@mef.destroy
redirect_to configuration_mef_index_url, notice: t(".mef_supprime")
else
@mef = Mef.where(etablissement: agent_connecte.etablissement)
flash[:alert] = t(".mef_utilise")
render :index
end
end
private
def set_mef
@mef = Mef.find(params[:id])
end
def mef_params
params.require(:mef).permit(:libelle, :code, :etablissement_id)
end
end
end
|
package com.trib3.graphql.execution
import assertk.assertThat
import assertk.assertions.isEqualTo
import com.expediagroup.graphql.generator.SchemaGeneratorConfig
import com.expediagroup.graphql.generator.TopLevelObject
import com.expediagroup.graphql.generator.toSchema
import graphql.GraphQL
import org.slf4j.MDC
import org.testng.annotations.Test
class Query {
fun test(): String {
return "test"
}
}
class RequestIdInstrumentationTest {
val config =
SchemaGeneratorConfig(listOf(this::class.java.packageName))
val graphQL =
GraphQL.newGraphQL(
toSchema(
config,
listOf(TopLevelObject(Query()))
)
).instrumentation(RequestIdInstrumentation()).build()
@Test
fun testInstrumentation() {
val prevMDC = MDC.get("RequestId")
MDC.put("RequestId", "RequestIdInstrumentationTest::testInstrumentation")
try {
val result = graphQL.execute("query {}")
assertThat(result.extensions["RequestId"])
.isEqualTo("RequestIdInstrumentationTest::testInstrumentation")
} finally {
if (prevMDC == null) {
MDC.clear()
} else {
MDC.put("RequestId", prevMDC)
}
}
}
}
|
#!/usr/bin/env ruby
require 'yaml'
require 'sensu/plugins/prometheus/checks/runner'
config_file = ARGV[0] || 'config.yml'
abort("Can't find configuration file at '#{config_file}'") \
unless File.exist?(config_file)
runner = nil
begin
runner = Sensu::Plugins::Prometheus::Checks::Runner.new(
YAML.load_file(config_file)
)
runner.run
rescue RuntimeError => e
puts "ERROR: #{e}"
exit(1)
end
puts("\n")
puts(runner.output)
exit(runner.status)
|
import React from 'react';
const UserVote = (props) => (
<div style={{ background: "#eee", borderRadius: '5px', padding: '0 10px' }}>
<p><strong>{props.user}</strong> says:</p>
<p>{props.points}</p>
</div>
);
export default UserVote;
|
"use strict";
Object.defineProperty(exports, "__esModule", { value: true });
const articles_dialog_box_1 = require("./articles-dialog-box");
class Main {
constructor() {
const dialog = new articles_dialog_box_1.ArticlesDialogBox();
dialog.simulateUserInteraction();
}
}
new Main();
|
#!/bin/bash
#--------------------------------------------------
# global variables
#--------------------------------------------------
# zsh arrays are 1-indexed, while bash arrays start at 0
# shellcheck disable=SC2034
case "${SHELL}" in
'/bin/bash'|'/usr/bin/bash'|'/usr/bin/ash'|'/usr/bin/sh')
LBOUND=0
;;
'/usr/bin/zsh')
LBOUND=1
;;
esac
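As a quick illustration of why `LBOUND` matters, the sketch below (a minimal example assuming a bash-compatible shell; the `fruits` array is hypothetical) indexes the first element through the shell-dependent lower bound instead of a hard-coded literal:

```shell
#!/bin/bash
# In bash the first array element lives at index 0; in zsh it lives at index 1.
# Indexing through LBOUND keeps the same loop or lookup correct under both shells.
LBOUND=0
fruits=(apple banana cherry)
first="${fruits[$LBOUND]}"
echo "$first"
```

Under zsh the same lookup with `LBOUND=1` would still return the first element.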
# set vim as default editor
export VISUAL=vim
export EDITOR="${VISUAL}"
|
/*
* Copyright 2020 Google LLC
*
* Use of this source code is governed by a BSD-style license that can be
* found in the LICENSE file.
*/
#include "include/core/SkColor.h"
#include "include/core/SkRefCnt.h"
#include "include/gpu/GrTypes.h"
class GrDirectContext;
class SkColorSpace;
class SkImage;
class SkPixmap;
struct SkISize;
namespace sk_gpu_test {
/**
* Creates a backend texture with pixmap contents and wraps it in a SkImage that safely deletes
 * the texture when it goes away. Unlike using makeTextureImage() on a non-GPU image, this will
 * fail rather than fall back if the pixmap's color type doesn't map to a supported texture format.
* For testing purposes the texture can be made renderable to exercise different code paths for
* renderable textures/formats.
*/
sk_sp<SkImage> MakeBackendTextureImage(
GrDirectContext*, const SkPixmap&, GrRenderable, GrSurfaceOrigin);
/** Creates an image with a solid color. */
sk_sp<SkImage> MakeBackendTextureImage(
GrDirectContext*, const SkImageInfo& info, SkColor4f, GrMipmapped = GrMipmapped::kNo,
GrRenderable = GrRenderable::kNo, GrSurfaceOrigin = GrSurfaceOrigin::kTopLeft_GrSurfaceOrigin);
} // namespace sk_gpu_test
|
# @quercia/mock
## 0.0.2
### Patch Changes
- [`27ba2d8`](https://github.com/lucat1/quercia/commit/27ba2d8a8056a67d42b00c3d263c2d7e1987163b)
Thanks [@lucat1](https://github.com/lucat1)! - feat(mock): finish work on the
mock package
- Updated dependencies []:
- @quercia/logger@0.0.2
## 0.0.2-next.0
### Patch Changes
- [`27ba2d8`](https://github.com/lucat1/quercia/commit/27ba2d8a8056a67d42b00c3d263c2d7e1987163b)
Thanks [@lucat1](https://github.com/lucat1)! - feat(mock): finish work on the
mock package
|
#pbuilder update --distribution precise --override-config
pbuilder-dist precise update --override-config
|
import 'package:flutter/material.dart';
import 'package:flutter_localizations/flutter_localizations.dart';
import 'package:get/get.dart';
import 'package:get_storage/get_storage.dart';
import 'package:logger/logger.dart';
import 'package:repainter/class/MainState.dart';
import 'constant.dart';
import 'pages/mainpage.dart';
import 'pages/start.dart';
Future<void> main() async {
WidgetsFlutterBinding.ensureInitialized();
Get.put<Logger>(Logger(
printer: PrettyPrinter(),
));
Get.put<MainState>(MainState());
await GetStorage().initStorage;
runApp(const MyApp());
}
class MyApp extends StatelessWidget {
const MyApp({Key? key}) : super(key: key);
@override
Widget build(BuildContext context) {
return GetMaterialApp(
theme: lightTheme(),
darkTheme: darkTheme(),
themeMode: ThemeMode.system,
locale: locale,
supportedLocales: const [
locale,
],
localizationsDelegates: const [
GlobalMaterialLocalizations.delegate,
GlobalWidgetsLocalizations.delegate,
GlobalCupertinoLocalizations.delegate
],
title: 'Repainter',
initialRoute: '/start',
getPages: [
GetPage<GetPage>(
name: '/main',
page: () => MainPage(),
),
GetPage<GetPage>(
name: '/start',
page: () => StartPage(),
)
],
);
}
}
|
[CmdletBinding()]
Param (
[Parameter(Mandatory = $True, Position = 0, ParameterSetName = 'path')]
[string]$ConfigPath,
[Parameter(Mandatory = $True, Position = 0, ParameterSetName = 'array')]
[array]$ConfigArray
)
Import-Module ./PowerSwitch -Force -Verbose:$false
Get-EosInventoryFromConfig -ConfigPath $ConfigPath
<#
$Files = gci $ConfigPath -File | ? { $_.BaseName -notmatch '_route$'}
#$Files = $Files[0..4]
$ReturnArray = @()
$i = 0
#$Files = gci $ConfigPath -File | ? { $_.BaseName -match '^\d+\.\d+\.\d+\.\d+$'}
foreach ($file in $Files) {
$i++
Write-Warning "$i/$($Files.Count): $($file.Name)"
$ThisConfigArray = gc $file
$PsParams = @{}
$PsParams.ConfigArray = $ThisConfigArray
$HostConfig = Get-CiscoHostConfig @PsParams
$TimeConfig = Get-CiscoTimeConfig @PsParams
$MgmtConfig = Get-CiscoMgmtConfig @PsParams
$StpConfig = Get-CiscoSpantreeConfig @PsParams
# new object
$NewObject = "" | Select-Object LogFile,Name,IpAddress,
SntpServer,
SshEnabled,WebviewEnabled,
RadiusServer,
StpEnabled,StpMode,StpPriority,
DhcpSnoopingEnabled,
ArpInspectionEnabled,
PortAuthenticationEnabled
$ReturnArray += $NewObject
# HostConfig
$NewObject.LogFile = $file.Name
$NewObject.Name = $HostConfig.Name
$NewObject.IpAddress = $HostConfig.IpAddress
# TimeConfig
$NewObject.SntpServer = ($TimeConfig.SntpServer | Sort-Object) -join ','
# MgmtConfig
#$NewObject.TelnetEnabled = $MgmtConfig.TelnetEnabled # not reliable
$NewObject.SshEnabled = $MgmtConfig.SshEnabled
$NewObject.WebviewEnabled = $MgmtConfig.WebviewEnabled
# StpConfig
$NewObject.StpEnabled = $StpConfig.Enabled
$NewObject.StpMode = $StpConfig.Mode
$NewObject.StpPriority = $StpConfig.Priority
# Know Disabled
$NewObject.DhcpSnoopingEnabled = $false
$NewObject.ArpInspectionEnabled = $false
$NewObject.PortAuthenticationEnabled = $false
}
$ReturnArray #>
|
require 'spec_helper'
describe Bihash do
it 'should be enumerable' do
Bihash.must_include Enumerable
end
Bihash::UNIMPLEMENTED_METHODS.each do |method|
it "should report that it does not respond to ##{method}" do
Bihash.new.respond_to?(method).must_equal false
end
it "should raise NoMethodError if ##{method} is called" do
error = -> { Bihash.new.send(method) }.must_raise NoMethodError
error.message.must_equal "Bihash##{method} not implemented"
end
end
describe '::[]' do
it 'should be able to create an empty bihash' do
bh = Bihash[]
bh.must_be_instance_of Bihash
bh.must_be_empty
end
it 'should convert a hash to a bihash' do
bh = Bihash[:key => 'value']
bh.must_be_instance_of Bihash
bh[:key].must_equal 'value'
bh['value'].must_equal :key
end
it 'should not accept a hash with duplicate values' do
-> { Bihash[:k1 => 'val', :k2 => 'val'] }.must_raise ArgumentError
end
it 'should not accept a hash that would result in ambiguous mappings' do
-> { Bihash[1, 2, 2, 3] }.must_raise ArgumentError
end
it 'should accept a hash where a key equals its value' do
bh = Bihash[:key => :key]
bh.must_be_instance_of Bihash
bh[:key].must_equal :key
end
it 'should always return the value object if key-value pairs are equal' do
key, value = [], []
bh = Bihash[key => value]
bh.must_be_instance_of Bihash
bh[key].object_id.must_equal value.object_id
bh[value].object_id.must_equal value.object_id
end
it 'should accept an even number of arguments' do
bh = Bihash[:k1, 1, :k2, 2]
bh.must_be_instance_of Bihash
bh[:k1].must_equal 1
bh[:k2].must_equal 2
bh[1].must_equal :k1
bh[2].must_equal :k2
end
it 'should accept an array of key-value pairs packaged in arrays' do
array = [[:k1, 1], [:k2, 2]]
bh = Bihash[array]
bh.must_be_instance_of Bihash
bh[:k1].must_equal 1
bh[:k2].must_equal 2
bh[1].must_equal :k1
bh[2].must_equal :k2
end
end
describe '::new' do
it 'should create an empty bihash with a default of nil if no args' do
bh = Bihash.new
bh.must_be_instance_of Bihash
bh.must_be_empty
bh[:not_a_key].must_be_nil
end
it 'should create an empty bihash with a default if given an object arg' do
bh = Bihash.new('default')
bh.must_be_instance_of Bihash
bh.must_be_empty
bh[:not_a_key].must_equal 'default'
bh[:not_a_key].tr!('ealt', '3417')
bh[:still_not_a_key].must_equal 'd3f4u17'
end
it 'should create an empty bihash with a default if given a block arg' do
bh = Bihash.new { 'd3f4u17' }
bh.must_be_instance_of Bihash
bh.must_be_empty
bh[:not_a_key].must_equal 'd3f4u17'
bh[:not_a_key].tr!('3417', 'ealt')
bh[:not_a_key].must_equal 'd3f4u17'
end
it 'should allow assignment of new pairs if given a block arg' do
bh = Bihash.new { |bihash, key| bihash[key] = key.to_s }
bh[404].must_equal '404'
bh.size.must_equal 1
bh.must_include 404
bh.must_include '404'
end
it 'should not accept both an object and a block' do
-> { Bihash.new('default 1') { 'default 2' } }.must_raise ArgumentError
end
end
describe '::try_convert' do
it 'should convert an object to a bihash if it responds to #to_hash' do
hash = {:k1 => 1, :k2 => 2}
bh = Bihash.try_convert(hash)
bh.must_be_instance_of Bihash
bh[:k1].must_equal 1
bh[:k2].must_equal 2
bh[1].must_equal :k1
bh[2].must_equal :k2
end
it 'should convert a bihash to a bihash' do
bh = Bihash[:key => 'value']
Bihash.try_convert(bh).must_equal bh
end
it 'should return nil if the object does not respond to #to_hash' do
Bihash.try_convert(Object.new).must_be_nil
end
it 'should not accept a hash with duplicate values' do
-> { Bihash.try_convert(:k1 => 1, :k2 => 1) }.must_raise ArgumentError
end
end
describe '#<' do
it 'should raise an error if the right hand side is not a bihash' do
-> { Bihash[a: 1, b: 2] < {a: 1, b: 2, c: 3} }.must_raise TypeError
end
it 'should return true when the argument is a strict subset of self' do
(Bihash[a: 1, b: 2] < Bihash[a: 1, b: 2, c: 3]).must_equal true
end
it 'should return false when the argument is equal to self' do
(Bihash[a: 1, b: 2] < Bihash[a: 1, b: 2]).must_equal false
end
it 'should return false when the argument is not a subset of self' do
(Bihash[a: 1, b: 2, c: 3] < Bihash[a: 1, b: 2]).must_equal false
end
end
describe '#<=' do
it 'should raise an error if the right hand side is not a bihash' do
-> { Bihash[a: 1, b: 2] <= {a: 1, b: 2, c: 3} }.must_raise TypeError
end
it 'should return true when the argument is a strict subset of self' do
(Bihash[a: 1, b: 2] <= Bihash[a: 1, b: 2, c: 3]).must_equal true
end
it 'should return true when the argument is equal to self' do
(Bihash[a: 1, b: 2] <= Bihash[a: 1, b: 2]).must_equal true
end
it 'should return false when the argument is not a subset of self' do
(Bihash[a: 1, b: 2, c: 3] <= Bihash[a: 1, b: 2]).must_equal false
end
end
describe '#==' do
it 'should return true when two bihashes have the same pairs' do
bh1, bh2 = Bihash[:k1 => 1, :k2 => 2], Bihash[2 => :k2, 1 => :k1]
(bh1 == bh2).must_equal true
end
it 'should return false when two bihashes do not have the same pairs' do
bh1, bh2 = Bihash[:k1 => 1, :k2 => 2], Bihash[:k1 => 1, :k2 => 99]
(bh1 == bh2).must_equal false
end
it 'should be aliased to #eql?' do
bh = Bihash.new
bh.method(:eql?).must_equal bh.method(:==)
end
end
describe '#>' do
it 'should raise an error if the right hand side is not a bihash' do
-> { Bihash[a: 1, b: 2] > {a: 1, b: 2, c: 3} }.must_raise TypeError
end
it 'should return true when the argument is a strict superset of self' do
(Bihash[a: 1, b: 2, c: 3] > Bihash[a: 1, b: 2]).must_equal true
end
it 'should return false when the argument is equal to self' do
(Bihash[a: 1, b: 2] > Bihash[a: 1, b: 2]).must_equal false
end
it 'should return false when the argument is not a superset of self' do
(Bihash[a: 1, b: 2] > Bihash[a: 1, b: 2, c: 3]).must_equal false
end
end
describe '#>=' do
it 'should raise an error if the right hand side is not a bihash' do
-> { Bihash[a: 1, b: 2] >= {a: 1, b: 2, c: 3} }.must_raise TypeError
end
it 'should return true when the argument is a strict superset of self' do
(Bihash[a: 1, b: 2, c: 3] >= Bihash[a: 1, b: 2]).must_equal true
end
it 'should return true when the argument is equal to self' do
(Bihash[a: 1, b: 2] >= Bihash[a: 1, b: 2]).must_equal true
end
it 'should return false when the argument is not a superset of self' do
(Bihash[a: 1, b: 2] >= Bihash[a: 1, b: 2, c: 3]).must_equal false
end
end
describe '#[]' do
it 'should return the other pair' do
bh = Bihash[:key => 'value']
bh[:key].must_equal 'value'
bh['value'].must_equal :key
end
it 'should return falsey values correctly' do
bh1 = Bihash[nil => false]
bh1[nil].must_equal false
bh1[false].must_be_nil
bh2 = Bihash[false => nil]
bh2[false].must_be_nil
bh2[nil].must_equal false
end
end
describe '#[]=' do
it 'should allow assignment of new pairs' do
bh = Bihash.new
bh[:key] = 'value'
bh[:key].must_equal 'value'
bh['value'].must_equal :key
end
it 'should remove old pairs if old keys are re-assigned' do
bh = Bihash[1 => 'one', 2 => 'two']
bh[1] = 'uno'
bh[1].must_equal 'uno'
bh['uno'].must_equal 1
bh.wont_include 'one'
end
it 'should always return the value object if key-value pairs are equal' do
key, value = [], []
bh = Bihash.new
bh[key] = value
bh[key].object_id.must_equal value.object_id
bh[value].object_id.must_equal value.object_id
end
it 'should be aliased to #store' do
bh = Bihash.new
bh.method(:store).must_equal bh.method(:[]=)
bh.store(:key, 'value')
bh[:key].must_equal 'value'
bh['value'].must_equal :key
end
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash.new.freeze[:key] = 'value' }.must_raise RuntimeError
end
end
describe '#assoc' do
it 'should return the pair if the argument is a key' do
bh = Bihash[:k1 => 'v1', :k2 => 'v2']
bh.assoc(:k1).must_equal [:k1, 'v1']
bh.assoc('v2').must_equal ['v2', :k2]
end
it 'should return nil if the argument is not a key' do
bh = Bihash.new(404)
bh.assoc(:not_a_key).must_be_nil
end
it 'should find the key using #==' do
bh = Bihash[[] => 'array']
bh['array'] << 'modified'
bh.assoc(['modified']).must_equal [['modified'], 'array']
bh.assoc([]).must_be_nil
end
end
describe '#clear' do
it 'should remove all pairs and return the bihash' do
bh = Bihash[:key => 'value']
bh.clear.object_id.must_equal bh.object_id
bh.must_be_empty
end
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash.new.freeze.clear }.must_raise RuntimeError
end
end
describe '#clone' do
it 'should make a copy of the bihash' do
bh = Bihash[1 => :one]
clone = bh.clone
clone[2] = :two
bh[2].must_be_nil
end
end
describe '#compare_by_identity' do
it 'should set bihash to compare by identity instead of equality' do
bh = Bihash.new.compare_by_identity
key1, key2 = 'key', 'value'
bh[key1] = key2
bh['key'].must_be_nil
bh['value'].must_be_nil
bh[key1].must_equal 'value'
bh[key2].must_equal 'key'
end
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash.new.freeze.compare_by_identity }.must_raise RuntimeError
end
end
describe '#compare_by_identity?' do
it 'should indicate whether bihash is comparing by identity' do
Bihash.new.compare_by_identity.compare_by_identity?.must_equal true
Bihash.new.compare_by_identity?.must_equal false
end
end
describe '#default' do
it 'should not accept more than one argument' do
-> { Bihash.new.default(1,2) }.must_raise ArgumentError
end
describe 'when there is not a default proc' do
it 'should return the default' do
bh1 = Bihash[:key => 'value']
bh1.default.must_be_nil
bh1.default(:not_a_key).must_be_nil
bh1.default(:key).must_be_nil
bh2 = Bihash.new(404)
bh2[:key] = 'value'
bh2.default.must_equal 404
bh2.default(:not_a_key).must_equal 404
bh2.default(:key).must_equal 404
end
end
describe 'when there is a default proc' do
it 'should return the default if called with no argument' do
Bihash.new { 'proc called' }.default.must_be_nil
end
it 'should call the default proc when called with an argument' do
bh = Bihash.new { |bihash, key| bihash[key] = key.to_s }
bh[:key] = 'value'
bh.default(:key).must_equal 'key'
bh[:key].must_equal 'key'
bh.default(404).must_equal '404'
bh[404].must_equal '404'
end
end
end
describe '#default=' do
it 'should set the default object' do
bh = Bihash.new { 'proc called' }
bh[:not_a_key].must_equal 'proc called'
(bh.default = 404).must_equal 404
bh[:not_a_key].must_equal 404
end
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash.new.freeze.default = 404 }.must_raise RuntimeError
end
end
describe '#default_proc' do
it 'should return the default proc if it exists' do
bh = Bihash.new { |bihash, key| bihash[key] = key }
prc = bh.default_proc
array = []
prc.call(array, 2)
array.must_equal [nil, nil, 2]
end
it 'should return nil if there is no default proc' do
Bihash.new.default_proc.must_be_nil
Bihash.new(404).default_proc.must_be_nil
end
end
describe '#default_proc=' do
it 'should set the default proc' do
bh = Bihash.new(:default_object)
bh[:not_a_key].must_equal :default_object
(bh.default_proc = ->(bihash, key) { '404' }).must_be_instance_of Proc
bh[:not_a_key].must_equal '404'
end
it 'should set the default value to nil if argument is nil' do
bh = Bihash.new(:default_object)
bh[:not_a_key].must_equal :default_object
(bh.default_proc = nil).must_be_nil
bh[:not_a_key].must_be_nil
end
it 'should raise TypeError if given a non-proc (except nil)' do
-> { Bihash.new.default_proc = :not_a_proc }.must_raise TypeError
end
it 'should raise TypeError given a lambda without 2 args' do
-> { Bihash.new.default_proc = -> { '404' } }.must_raise TypeError
end
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash[].freeze.default_proc = proc { '' } }.must_raise RuntimeError
end
end
describe '#delete' do
it 'should return the other key if the given key is found' do
Bihash[:key => 'value'].delete(:key).must_equal 'value'
Bihash[:key => 'value'].delete('value').must_equal :key
end
it 'should remove both keys' do
bh1 = Bihash[:key => 'value']
bh1.delete(:key)
bh1.wont_include :key
bh1.wont_include 'value'
bh2 = Bihash[:key => 'value']
bh2.delete('value')
bh2.wont_include :key
bh2.wont_include 'value'
end
it 'should call the block (if given) when the key is not found' do
out = Bihash[:key => 'value'].delete(404) { |key| "#{key} not found" }
out.must_equal '404 not found'
end
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash.new.freeze.delete(:key) }.must_raise RuntimeError
end
end
describe '#delete_if' do
it 'should delete any pairs for which the block evaluates to true' do
bh = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
bh_id = bh.object_id
bh.delete_if { |key1, key2| key1.even? }.object_id.must_equal bh_id
bh.must_equal Bihash[1 => :one, 3 => :three]
end
it 'should raise RuntimeError if called on a frozen bihash with a block' do
-> { Bihash.new.freeze.delete_if { false } }.must_raise RuntimeError
end
it 'should return an enumerator if not given a block' do
enum = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four].delete_if
enum.must_be_instance_of Enumerator
enum.each { |k1, k2| k1.even? }.must_equal Bihash[1 => :one, 3 => :three]
end
end
describe '#dig' do
it 'should traverse nested bihashes' do
bh = Bihash[foo: Bihash[bar: Bihash[baz: 4]]]
bh.dig(:foo, :bar, :baz).must_equal 4
bh.dig(:foo, :bar, 4).must_equal :baz
end
it 'should traverse nested hashes' do
bh = Bihash[foo: {bar: {baz: 4}}]
bh.dig(:foo, :bar, :baz).must_equal 4
end
it 'should traverse nested arrays' do
bh = Bihash[foo: [[4]]]
bh.dig(:foo, 0, 0).must_equal 4
end
it 'should return nil if any intermediate step is nil' do
bh = Bihash[foo: Bihash[bar: Bihash[baz: 4]]]
bh.dig(:foo, :bur, :boz).must_be_nil
end
end
describe '#dup' do
it 'should make a copy of the bihash' do
bh = Bihash[1 => :one]
dup = bh.dup
dup[2] = :two
bh[2].must_be_nil
end
end
describe '#each' do
it 'should iterate over each pair in the bihash' do
array = []
Bihash[:k1 => 'v1', :k2 => 'v2'].each { |pair| array << pair }
array.must_equal [[:k1, 'v1'], [:k2, 'v2']]
end
it 'should return the bihash if given a block' do
bh = Bihash.new
bh.each { |p| }.must_be_instance_of Bihash
bh.each { |p| }.object_id.must_equal bh.object_id
end
it 'should return an enumerator if not given a block' do
enum = Bihash[:k1 => 'v1', :k2 => 'v2'].each
enum.must_be_instance_of Enumerator
enum.each { |pair| pair }.must_equal Bihash[:k1 => 'v1', :k2 => 'v2']
end
it 'should be aliased to #each_pair' do
bh = Bihash.new
bh.method(:each_pair).must_equal bh.method(:each)
end
end
describe '#empty?' do
it 'should indicate if the bihash is empty' do
Bihash.new.empty?.must_equal true
Bihash[:key => 'value'].empty?.must_equal false
end
end
describe '#fetch' do
it 'should return the other pair' do
bh = Bihash[:key => 'value']
bh.fetch(:key).must_equal 'value'
bh.fetch('value').must_equal :key
end
it 'should return falsey values correctly' do
bh1 = Bihash[nil => false]
bh1.fetch(nil).must_equal false
bh1.fetch(false).must_be_nil
bh2 = Bihash[false => nil]
bh2.fetch(false).must_be_nil
bh2.fetch(nil).must_equal false
end
describe 'when the key is not found' do
it 'should raise KeyError when not supplied any default' do
-> { Bihash[].fetch(:not_a_key) }.must_raise KeyError
end
it 'should return the second arg when supplied with one' do
Bihash[].fetch(:not_a_key, :second_arg).must_equal :second_arg
end
it 'should call the block if supplied with one' do
Bihash[].fetch(404) { |k| "#{k} not found" }.must_equal '404 not found'
end
end
end
describe '#fetch_values' do
it 'should return an array of values corresponding to the given keys' do
Bihash[1 => :one, 2 => :two].fetch_values(1, 2).must_equal [:one, :two]
Bihash[1 => :one, 2 => :two].fetch_values(:one, :two).must_equal [1, 2]
Bihash[1 => :one, 2 => :two].fetch_values(1, :two).must_equal [:one, 2]
end
it 'should raise a KeyError if any key is not found' do
-> { Bihash.new.fetch_values(404) }.must_raise KeyError
end
it 'should not duplicate entries if a key equals its value' do
Bihash[:key => :key].fetch_values(:key).must_equal [:key]
end
it 'should return an empty array with no args' do
Bihash[:key => 'value'].fetch_values.must_equal []
end
end
describe '#flatten' do
it 'should extract the pairs into an array' do
Bihash[:k1 => 'v1', :k2 => 'v2'].flatten.must_equal [:k1, 'v1', :k2, 'v2']
end
it 'should not flatten array keys if no argument is given' do
Bihash[:key => ['v1', 'v2']].flatten.must_equal [:key, ['v1', 'v2']]
end
it 'should flatten to the level given as an argument' do
Bihash[:key => ['v1', 'v2']].flatten(2).must_equal [:key, 'v1', 'v2']
end
end
describe '#hash' do
it 'should return the same hash code if two bihashes have the same pairs' do
bh1, bh2 = Bihash[:k1 => 1, :k2 => 2], Bihash[2 => :k2, 1 => :k1]
bh1.hash.must_equal bh2.hash
end
end
describe '#include?' do
it 'should indicate if the bihash contains the argument' do
bh = Bihash[:key => 'value']
bh.include?(:key).must_equal true
bh.include?('value').must_equal true
bh.include?(:not_a_key).must_equal false
end
it 'should be aliased to #has_key?, #key?, and #member?' do
bh = Bihash.new
bh.method(:has_key?).must_equal bh.method(:include?)
bh.method(:key?).must_equal bh.method(:include?)
bh.method(:member?).must_equal bh.method(:include?)
end
end
describe '#keep_if' do
it 'should retain any pairs for which the block evaluates to true' do
bh = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
bh_id = bh.object_id
bh.keep_if { |key1, key2| key1.even? }.object_id.must_equal bh_id
bh.must_equal Bihash[2 => :two, 4 => :four]
end
it 'should raise RuntimeError if called on a frozen bihash with a block' do
-> { Bihash.new.freeze.keep_if { true } }.must_raise RuntimeError
end
it 'should return an enumerator if not given a block' do
enum = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four].keep_if
enum.must_be_instance_of Enumerator
enum.each { |k1, k2| k1.even? }.must_equal Bihash[2 => :two, 4 => :four]
end
end
describe '#length' do
it 'should return the number of pairs in the bihash' do
Bihash[1 => :one, 2 => :two].length.must_equal 2
end
end
describe '#merge' do
it 'should merge bihashes, assigning each arg pair to a copy of reciever' do
receiver = Bihash[:chips => :salsa, :milk => :cookies, :fish => :rice]
original_receiver = receiver.dup
argument = Bihash[:fish => :chips, :soup => :salad]
return_value = Bihash[:milk => :cookies, :fish => :chips, :soup => :salad]
receiver.merge(argument).must_equal return_value
receiver.must_equal original_receiver
end
it 'should raise TypeError if arg is not a bihash' do
-> { Bihash.new.merge({:key => 'value'}) }.must_raise TypeError
end
end
describe '#merge!' do
it 'should merge bihashes, assigning each arg pair to the receiver' do
receiver = Bihash[:chips => :salsa, :milk => :cookies, :fish => :rice]
argument = Bihash[:fish => :chips, :soup => :salad]
return_value = Bihash[:milk => :cookies, :fish => :chips, :soup => :salad]
receiver.merge!(argument).must_equal return_value
receiver.must_equal return_value
end
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash.new.freeze.merge!(Bihash.new) }.must_raise RuntimeError
end
it 'should raise TypeError if arg is not a bihash' do
-> { Bihash.new.merge!({:key => 'value'}) }.must_raise TypeError
end
it 'should be aliased to #update' do
bh = Bihash.new
bh.method(:update).must_equal bh.method(:merge!)
end
end
describe '#rehash' do
it 'should recompute all key hash values and return the bihash' do
bh = Bihash[[] => :array]
bh[:array] << 1
bh[[1]].must_be_nil
bh.rehash[[1]].must_equal :array
bh[[1]].must_equal :array
end
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash.new.freeze.rehash }.must_raise RuntimeError
end
it 'should raise RuntimeError if called when key duplicated outside pair' do
bh = Bihash[[1], [2], [3], [4]]
(bh[[4]] << 1).shift
-> { bh.rehash }.must_raise RuntimeError
end
end
describe '#reject' do
describe 'when some items are rejected' do
it 'should return a bihash with items not rejected by the block' do
bh = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
bh.reject { |k1,k2| k1.even? }.must_equal Bihash[1 => :one, 3 => :three]
bh.must_equal Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
end
end
describe 'when no items are rejected' do
it 'should return a bihash with items not rejected by the block' do
bh = Bihash[1 => :one, 3 => :three, 5 => :five, 7 => :seven]
bh.reject { |k1,k2| k1.even? }.must_equal bh
bh.must_equal bh
end
end
it 'should return an enumerator if not given a block' do
enum = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four].reject
enum.must_be_instance_of Enumerator
enum.each { |k1,k2| k1.even? }.must_equal Bihash[1 => :one, 3 => :three]
end
end
describe '#reject!' do
it 'should delete any pairs for which the block evaluates to true' do
bh = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
bh_id = bh.object_id
bh.reject! { |key1, key2| key1.even? }.object_id.must_equal bh_id
bh.must_equal Bihash[1 => :one, 3 => :three]
end
it 'should return nil if no changes were made to the bihash' do
bh = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
bh.reject! { |key1, key2| key1 > 5 }.must_be_nil
bh.must_equal Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
end
it 'should raise RuntimeError if called on a frozen bihash with a block' do
-> { Bihash.new.freeze.reject! { false } }.must_raise RuntimeError
end
it 'should return an enumerator if not given a block' do
enum = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four].reject!
enum.must_be_instance_of Enumerator
enum.each { |k1, k2| k1.even? }.must_equal Bihash[1 => :one, 3 => :three]
end
end
describe '#replace' do
it 'should replace the contents of receiver with the contents of the arg' do
receiver = Bihash[]
original_id = receiver.object_id
arg = Bihash[:key => 'value']
receiver.replace(arg).must_equal Bihash[:key => 'value']
arg[:another_key] = 'another_value'
receiver.object_id.must_equal original_id
receiver.must_equal Bihash[:key => 'value']
end
it 'should raise TypeError if arg is not a bihash' do
-> { Bihash.new.replace({:key => 'value'}) }.must_raise TypeError
end
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash.new.freeze.replace(Bihash[:k, 'v']) }.must_raise RuntimeError
end
end
describe '#select' do
describe 'when only some items are selected' do
it 'should return a bihash with items selected by the block' do
bh = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
bh.select { |k1,k2| k1.even? }.must_equal Bihash[2 => :two, 4 => :four]
bh.must_equal Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
end
end
describe 'when all items are selected' do
it 'should return a bihash with items selected by the block' do
bh = Bihash[2 => :two, 4 => :four, 6 => :six, 8 => :eight]
bh.select { |k1,k2| k1.even? }.must_equal bh
bh.must_equal bh
end
end
it 'should return an enumerator if not given a block' do
enum = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four].select
enum.must_be_instance_of Enumerator
enum.each { |k1,k2| k1.even? }.must_equal Bihash[2 => :two, 4 => :four]
end
it 'should be aliased to #filter' do
bh = Bihash.new
bh.method(:filter).must_equal bh.method(:select)
end
end
describe '#select!' do
it 'should retain any pairs for which the block evaluates to true' do
bh = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
bh_id = bh.object_id
bh.select! { |key1, key2| key1.even? }.object_id.must_equal bh_id
bh.must_equal Bihash[2 => :two, 4 => :four]
end
it 'should return nil if no changes were made to the bihash' do
bh = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
bh.select! { |key1, key2| key1 < 5 }.must_be_nil
bh.must_equal Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four]
end
it 'should raise RuntimeError if called on a frozen bihash with a block' do
-> { Bihash.new.freeze.select! { true } }.must_raise RuntimeError
end
it 'should return an enumerator if not given a block' do
enum = Bihash[1 => :one, 2 => :two, 3 => :three, 4 => :four].select!
enum.must_be_instance_of Enumerator
enum.each { |k1, k2| k1.even? }.must_equal Bihash[2 => :two, 4 => :four]
end
it 'should be aliased to #filter!' do
bh = Bihash.new
bh.method(:filter!).must_equal bh.method(:select!)
end
end
describe '#shift' do
it 'should remove the oldest pair from the bihash and return it' do
bh = Bihash[1 => :one, 2 => :two, 3 => :three]
bh.shift.must_equal [1, :one]
bh.must_equal Bihash[2 => :two, 3 => :three]
end
it 'should return the default value if bihash is empty' do
Bihash.new.shift.must_be_nil
Bihash.new(404).shift.must_equal 404
Bihash.new { 'd3f4u17' }.shift.must_equal 'd3f4u17'
end
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash.new.freeze.shift }.must_raise RuntimeError
end
end
describe '#size' do
it 'should return the number of pairs in the bihash' do
Bihash[1 => :one, 2 => :two].size.must_equal 2
end
end
describe '#slice' do
it 'should return a new bihash with only the pairs that are in the args' do
bh = Bihash[1 => :one, 2 => :two, 3 => :three]
bh.slice(1, :one, :two, "nope").must_equal Bihash[1 => :one, 2 => :two]
bh.must_equal Bihash[1 => :one, 2 => :two, 3 => :three]
end
it 'should return a vanilla bihash without default values, etc.' do
sliced_bh = Bihash.new(404).slice
sliced_bh.default.must_be_nil
end
end
describe '#to_h' do
it 'should return a copy of the forward hash' do
bh = Bihash[:key1 => 'val1', :key2 => 'val2']
h = bh.to_h
h.must_equal Hash[:key1 => 'val1', :key2 => 'val2']
h.delete(:key1)
bh.must_include :key1
end
it 'should be aliased to #to_hash' do
bh = Bihash.new
bh.method(:to_hash).must_equal bh.method(:to_h)
end
end
describe '#to_proc' do
it 'should convert the bihash to a proc' do
Bihash[].to_proc.must_be_instance_of Proc
end
it 'should call #[] on the bihash when the proc is called' do
Bihash[:key => 'value'].to_proc.call(:key).must_equal 'value'
end
end
describe '#to_s' do
it 'should return a nice string representing the bihash' do
bh = Bihash[:k1 => 'v1', :k2 => [:v2], :k3 => {:k4 => 'v4'}]
bh.to_s.must_equal 'Bihash[:k1=>"v1", :k2=>[:v2], :k3=>{:k4=>"v4"}]'
end
it 'should be aliased to #inspect' do
bh = Bihash.new
bh.method(:inspect).must_equal bh.method(:to_s)
end
end
describe '#values_at' do
it 'should return an array of values corresponding to the given keys' do
Bihash[1 => :one, 2 => :two].values_at(1, 2).must_equal [:one, :two]
Bihash[1 => :one, 2 => :two].values_at(:one, :two).must_equal [1, 2]
Bihash[1 => :one, 2 => :two].values_at(1, :two).must_equal [:one, 2]
end
it 'should use the default if a given key is not found' do
bh = Bihash.new(404)
bh[1] = :one
bh[2] = :two
bh.values_at(1, 2, 3).must_equal [:one, :two, 404]
bh.values_at(:one, :two, :three).must_equal [1, 2, 404]
end
it 'should not duplicate entries if a key equals its value' do
Bihash[:key => :key].values_at(:key).must_equal [:key]
end
it 'should return an empty array with no args' do
Bihash[:key => 'value'].values_at.must_equal []
end
end
describe '#initialize' do
it 'should raise RuntimeError if called on a frozen bihash' do
-> { Bihash.new.freeze.send(:initialize) }.must_raise RuntimeError
end
end
end
|
!> @brief This procedure implements the differentiated version of the BLAS1 routine dscal().
!> @param[in] n vector length
!> @param[in] dx vector x
!> @param[in] g_dx derivative object associated to dx
!> @param[in] da scalar a = alpha
!> @param[in] incx stride in vector x
!> @param[out] dx vector x
!> @param[out] g_dx derivative object associated to dx
!> @details
!> This procedure implements the differentiated version of the\n
!> BLAS1 routine dscal(). Based on original BLAS1 and changed\n
!> to a differentiated handcoded version.\n
!> Technique:\n
!> Changes: add support for (incx .le. 0)\n
!> Original BLAS description:\n
!> scales a vector by a constant.\n
!> uses unrolled loops for increment equal to one.\n
!> jack dongarra, linpack, 3/11/78.\n
!> modified 3/93 to return if incx .le. 0.\n
!> modified 12/3/93, array(1) declarations changed to array(*)\n
SUBROUTINE g_dscal(n,da,dx,g_dx,incx)
DOUBLE PRECISION da, dx(*), g_dx(*)
INTEGER incx, n
! Check out input:
!hAW correct incx .le. 0
!hAW if( n.le.0 .or. incx.le.0 ) return
IF (n<=0) RETURN
! Start of computation:
!hAW seems a little bit slower than the code for increment not equal to 1
CALL dscal(n,da,g_dx,incx)
! Call orig. function after
CALL dscal(n,da,dx,incx)
RETURN
END
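The two dscal calls above implement forward-mode differentiation of y = a*x: since scaling is linear in x, the tangent is simply g_y = a*g_x, so the derivative object is scaled by the same constant as the primal vector. A minimal Python sketch of that idea (illustrative only, not the BLAS calling convention; it ignores n and incx for simplicity):

```python
def g_dscal(da, dx, g_dx):
    """Forward-mode derivative of scaling a vector x by a constant a.

    y = a*x is linear in x, so the tangent is g_y = a*g_x: both the
    primal vector and its derivative object are scaled in place,
    mirroring the two dscal calls in the Fortran routine above.
    """
    for i in range(len(dx)):
        g_dx[i] *= da  # derivative first, as in the routine above
        dx[i] *= da
    return dx, g_dx
```

Both lists are mutated in place, matching the Fortran routine's in-out arguments.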
|
#!/bin/sh
# Prerequisites:
# Install and configure pyenv.
# Install poetry after pyenv.
# Install project dependencies.
poetry install
|
(ns gaia.core
(:require
[clojure.walk :as walk]
[clojure.string :as string]
[clojure.tools.cli :as cli]
[aleph.http :as http]
[ring.middleware.resource :as resource]
[ring.middleware.params :as params]
[ring.middleware.keyword-params :as keyword]
[cheshire.core :as json]
[ubergraph.core :as graph]
[polaris.core :as polaris]
[sisyphus.kafka :as kafka]
[sisyphus.log :as log]
[gaia.config :as config]
[gaia.store :as store]
[gaia.util :as util]
[gaia.executor :as executor]
[gaia.command :as command]
[gaia.flow :as flow]
[gaia.sync :as sync])
(:import
[java.io InputStreamReader]))
(defn atom?
[x]
(instance? clojure.lang.IAtom x))
(defn read-json
[body]
(json/parse-stream (InputStreamReader. body) keyword))
(defn json-seq
[path]
(map
#(json/parse-string % keyword)
(string/split (slurp path) #"\n")))
(defn json-response
([body]
(json-response body 200))
([body status-code]
{:status status-code
:headers {"content-type" "application/json"}
:body (json/generate-string body)}))
(defn serializable
"Make internal data JSON-serializable by expanding each atom's value. This is
useful for debugging but doesn't promise API results that client code can
depend on."
[data]
(walk/prewalk
(fn [node]
(if (atom? node)
@node
node))
data))
(defn index-handler
[state]
(fn [request]
{:status 200
:headers {"content-type" "text/html"}
:body (slurp "resources/public/index.html")}))
(defn initialize-flow!
"Construct the named workflow, connected to storage and kafka messaging."
[{:keys [store kafka] :as state} workflow]
(when-not (seq (name workflow))
(throw (IllegalArgumentException. "empty workflow name")))
(let [pointed (store (name workflow))]
(sync/generate-sync workflow kafka pointed)))
(defn find-flow!
"Find or construct the named workflow."
[{:keys [flows] :as state} workflow]
(if-let [flow (get @flows (keyword workflow))]
flow
(let [flow (initialize-flow! state workflow)]
(swap! flows assoc workflow flow)
flow)))
(defn handle-kafka
[{:keys [executor] :as state} topic message]
(let [flow (find-flow! state (:workflow message))]
(sync/executor-events! flow executor topic message)))
(defn executor-status!
[{:keys [kafka] :as state}]
(let [status-topic (get kafka :status-topic "gaia-status")
topics ["gaia-events" status-topic]
kafka (update kafka :subscribe concat topics)
handle (partial handle-kafka state)]
(kafka/boot-consumer kafka handle)))
(defn boot
[config]
(let [flows (atom {})
store (config/load-store (:store config))
kafka (:kafka config)
producer (kafka/boot-producer kafka)
kafka (assoc kafka :producer producer)
exec-config (assoc
(:executor config)
:kafka kafka)
executor (config/load-executor exec-config)
state {:config config
:kafka kafka
:flows flows
:store store
:executor executor}]
(assoc-in
state
[:kafka :consumer]
(executor-status! state))))
(defn merge-properties!
"Merge the new properties into the named workflow."
; :owner - owner name for filtering and sorting workflow lists
; :description - to organize workflows
; Requested number of workers?
; Worker node resource needs?
; Storage root path?
[state workflow properties]
(let [flow (find-flow! state workflow)]
(sync/merge-properties! flow properties)
state))
(defn merge-commands!
"Merge the new commands `merging` into the named workflow."
[{:keys [executor] :as state} workflow merging]
(let [flow (find-flow! state workflow)]
(sync/merge-commands! flow executor merging)
state))
(defn merge-steps!
"Merge the new steps into the named workflow."
[{:keys [executor] :as state} workflow steps]
(let [flow (find-flow! state workflow)]
(sync/merge-steps! flow executor steps)
state))
(defn run-flow!
[{:keys [executor] :as state} workflow]
(let [flow (find-flow! state workflow)]
(sync/run-flow! flow executor)
state))
(defn halt-flow!
[{:keys [executor] :as state} workflow]
(let [flow (find-flow! state workflow)]
(sync/halt-flow! flow executor)
state))
(defn expire-keys!
[{:keys [executor] :as state} workflow expire]
(when (seq expire)
(log/info! "expiring storage path keys" expire "of" workflow))
(let [flow (find-flow! state workflow)]
(sync/expire-keys! flow executor expire)
state))
(defn flow-status!
[state workflow debug]
(let [flow (find-flow! state workflow)
{:keys [state data tasks]} @(:status flow)
complete (sync/complete-keys data)
; TODO(jerry): :state is in #{:initialized :running :complete :halted :error}?
; Add a :stalled state?
; TODO(jerry): Add :steps.
status {:state state
:commands @(:commands flow)
:waiting-inputs (flow/missing-data @(:flow flow) complete)}
status (if debug ; include internal guts
(merge status
(serializable
{:flow @(:flow flow)
:tasks tasks
:data data}))
status)]
status))
(defn workflows-info
"Return a map of workflow names to their summary info."
[{:keys [flows] :as state}]
(util/map-vals sync/summarize-flow @flows))
(defn command-handler
"Merge the given commands (transformed to a keyword -> value map `index`) into
the named workflow."
[state]
(fn [request]
(let [{:keys [workflow commands] :as body} (read-json (:body request))
workflow (keyword workflow)
index (command/index-key :name commands)]
(log/info! "merge commands request" body)
(merge-commands! state workflow index)
(json-response
{:commands
(deref
(:commands
(find-flow! state workflow)))}))))
(defn merge-handler
"Merge the given steps into the named workflow."
[state]
(fn [request]
(let [{:keys [workflow steps] :as body} (read-json (:body request))
workflow (keyword workflow)]
(log/info! "merge steps request" body)
(merge-steps! state workflow steps)
; TODO(jerry): Return ALL the steps or at least all their names.
(json-response
{:steps (map :name steps)}))))
(defn upload-handler
"Upload a new workflow in one request: properties, commands, and steps."
[{:keys [flows] :as state}]
(fn [request]
(let [{:keys [workflow properties commands steps] :as body}
(read-json (:body request))
workflow (keyword workflow)
index (command/index-key :name commands)]
(log/info! "upload workflow request" body)
(when (get @flows workflow)
(throw (IllegalArgumentException.
(str "workflow already exists: " (name workflow)))))
(merge-properties! state workflow properties)
(merge-commands! state workflow index)
(merge-steps! state workflow steps)
(json-response
{:workflow {workflow (map :name steps)}}))))
(defn run-handler
"Trigger the named workflow if it's not already running."
[state]
(fn [request]
(let [{:keys [workflow] :as body} (read-json (:body request))
workflow (keyword workflow)]
(log/info! "run request" body)
(run-flow! state workflow)
(json-response
{:run workflow}))))
(defn halt-handler
"Immediately stop the named workflow and cancel its running tasks."
[state]
(fn [request]
(let [{:keys [workflow] :as body} (read-json (:body request))
workflow (keyword workflow)]
(log/info! "halt request" body)
(halt-flow! state workflow)
(json-response
{:halt workflow}))))
(defn status-handler
"Return information about the named workflow."
[state]
(fn [request]
(let [{:keys [workflow debug] :as body} (read-json (:body request))
workflow (keyword workflow)]
(log/info! "status request" body)
(json-response
{:workflow workflow
:status
(flow-status! state workflow debug)}))))
(defn expire-handler
"Expire the named steps and/or data files (by storage keys) from the named
workflow."
[state]
(fn [request]
(let [{:keys [workflow expire] :as body} (read-json (:body request))
workflow (keyword workflow)]
(log/info! "expire request" body)
(expire-keys! state workflow expire)
(json-response
{:expire expire}))))
(defn workflows-handler
"List the current workflows in a map with summary info about each one."
[state]
(fn [request]
(log/info! "list workflows request")
(json-response {:workflows (workflows-info state)})))
(defn gaia-routes
[state]
[["/" :index (index-handler state)]
["/command" :command (command-handler state)]
["/merge" :merge (merge-handler state)]
["/upload" :upload (upload-handler state)]
["/run" :run (run-handler state)]
["/halt" :halt (halt-handler state)]
["/status" :status (status-handler state)]
["/expire" :expire (expire-handler state)]
["/workflows" :workflows (workflows-handler state)]])
(def parse-args
[["-c" "--config CONFIG" "path to config file"]
["-i" "--input INPUT" "input file or directory"]])
(defn wrap-error
[handler]
(fn [request]
(try
(handler request)
(catch Exception e
; TODO: The request body is just an empty stream now.
; Adopt ring-json/wrap-json-response and ring-json/wrap-json-params to
; log the request body?
(log/exception! e "Bad Request") ; or 500 Internal Server Error
(json-response {:error (str e)} 400)))))
(defn start
[options]
(let [path (or (:config options) "resources/config/gaia.clj")
config (config/read-path path)
state (boot config)
routes (polaris/build-routes (gaia-routes state))
router (polaris/router routes)
app (-> router
(resource/wrap-resource "public")
(keyword/wrap-keyword-params)
(params/wrap-params)
(wrap-error))]
(println config)
(http/start-server app {:port 24442})
state))
(defn -main
[& args]
(let [env (:options (cli/parse-opts args parse-args))]
(start env)))
|
class RecentWork {
final String image, category, title;
final int id;
RecentWork({this.id, this.image, this.category, this.title});
}
// Demo List of my works
List<RecentWork> recentWorks = [
RecentWork(
id: 1,
title: "New & Fresh Looking Portfolio indeed at the end",
category: "Graphic Design",
image: "assets/images/work_1.png",
),
RecentWork(
id: 2,
title: "New & Fresh Looking Portfolio indeed at the end",
category: "Graphic Design",
image: "assets/images/work_2.png",
),
RecentWork(
id: 3,
title: "New & Fresh Looking Portfolio indeed at the end",
category: "Graphic Design",
image: "assets/images/work_3.png",
),
RecentWork(
id: 4,
title: "New & Fresh Looking Portfolio indeed at the end",
category: "Graphic Design",
image: "assets/images/work_4.png",
),
];
|
# -*- coding: utf-8 -*-
from colorsys import rgb_to_hsv, hsv_to_rgb
import logging
from unicorn.apps.color.utils import hex_to_rgb
logger = logging.getLogger('unicorn.utils')
def get_rgb_brightness(color):
"""
Given an RGB _color (hex format), return it's brightness.
"""
r, g, b = hex_to_rgb(color)
hsv = rgb_to_hsv(r/255.0, g/255.0, b/255.0)
return hsv[2] * 100
def set_rgb_brightness(color, brightness):
"""
Given an RGB _color (hex format) and brightness, return a new RGB _color.
:param color: hex format RGB _color string
:param brightness: Integer from 0 to 100
"""
r, g, b = hex_to_rgb(color)
hsv = rgb_to_hsv(r/255.0, g/255.0, b/255.0)
rgb = hsv_to_rgb(hsv[0], hsv[1], brightness/100.0)
return (
int(rgb[0] * 255.0 + 0.5),
int(rgb[1] * 255.0 + 0.5),
int(rgb[2] * 255.0 + 0.5),
)
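The brightness round-trip above (hex → RGB → HSV, replace V, back to RGB) can be exercised standalone. This sketch inlines a hypothetical `hex_to_rgb`; the real one lives in `unicorn.apps.color.utils` and may differ, e.g. in how it handles a leading `#`:

```python
from colorsys import rgb_to_hsv, hsv_to_rgb

def hex_to_rgb(color):
    # Hypothetical stand-in for unicorn.apps.color.utils.hex_to_rgb.
    color = color.lstrip('#')
    return tuple(int(color[i:i + 2], 16) for i in (0, 2, 4))

def get_rgb_brightness(color):
    r, g, b = hex_to_rgb(color)
    # HSV "value" is the brightness component, scaled to a percentage.
    return rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)[2] * 100

def set_rgb_brightness(color, brightness):
    r, g, b = hex_to_rgb(color)
    h, s, _ = rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    # Keep hue and saturation, substitute the requested brightness.
    rgb = hsv_to_rgb(h, s, brightness / 100.0)
    # +0.5 rounds to the nearest integer channel value.
    return tuple(int(c * 255.0 + 0.5) for c in rgb)
```

For example, halving the brightness of pure red `#ff0000` yields `(128, 0, 0)`.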
|
package org.jetbrains.plugins.scala
package lang
package psi
package api
package toplevel
package typedef
import org.jetbrains.plugins.scala.lang.psi.api.statements.ScFunction
trait ScGivenAlias extends ScGiven with ScFunction {
}
|
import os
POSTGRES_DB=os.environ['POSTGRES_DB']
POSTGRES_USER=os.environ['POSTGRES_USER']
POSTGRES_PASSWORD=os.environ['POSTGRES_PASSWORD']
POSTGRES_HOST=os.environ['POSTGRES_HOST']
POSTGRES_PORT=os.environ['POSTGRES_PORT']
'''Path to Wikidata dump from /import folder'''
DUMP = 'latest-all.json.gz'
'''Max number of lines to read from dump'''
LIMIT = 100000000
# LIMIT = 1000000
'''OSM tables to fetch Wikidata for'''
OSM_TABLES = [
'osm_aerodrome_label_point',
'osm_peak_point',
'osm_city_point',
'osm_continent_point',
'osm_country_point',
'osm_island_point',
'osm_island_polygon',
'osm_state_point',
'osm_poi_point',
'osm_poi_polygon',
'osm_marine_point',
'osm_water_polygon',
'osm_waterway_linestring'
]
'''Table with imported wikidata'''
TABLE_NAME = 'wd_names'
|
const nodeLib = require('/lib/xp/node');
const taskLib = require('/lib/xp/task');
const queryLib = require('/lib/query');
exports.post = function (req) {
const bean = __.newBean('systems.rcd.enonic.datatoolbox.RcdReportScriptBean');
const body = JSON.parse(req.body);
const repositoryName = body.repositoryName;
const branchName = body.branchName;
const query = body.query;
const filters = body.filters;
const parsedFilters = filters ? JSON.parse(filters) : null;
const sort = body.sort ? decodeURIComponent(body.sort) : undefined;
const reportName = body.reportName;
const taskId = taskLib.submit({
description: 'Report generation',
task: function () {
taskLib.progress({info: 'Querying...'});
const queryResult = executeQuery(repositoryName, branchName, query, parsedFilters, sort);
taskLib.progress({
info: 'Generating report (0/' + queryResult.total + ')...',
current: 0,
total: queryResult.total
});
const createReportFileCallback = __.toScriptValue(
(createEntryConsumer) => {
generateReportEntries(repositoryName, branchName, queryResult, createEntryConsumer);
generateReportMeta({
repository: repositoryName || undefined,
branch: branchName || undefined,
query: query,
filters: parsedFilters || undefined,
sort: sort,
},
{
total: queryResult.total
},
createEntryConsumer
);
});
const result = bean.createReportFile(reportName, createReportFileCallback);
taskLib.progress({
info: result,
current: queryResult.total,
total: queryResult.total
});
}
});
return {
contentType: 'application/json',
body: {taskId: taskId}
}
};
function executeQuery(repositoryName, branchName, query, filters, sort) {
const repoConnection = queryLib.createRepoConnection(repositoryName, branchName);
return repoConnection.query({
query: query,
filters: filters,
start: 0,
count: -1,
sort: sort
});
}
function generateReportEntries(repositoryName, branchName, queryResult, createEntryConsumer) {
const repoConnection = queryLib.createRepoConnection(repositoryName, branchName);
let current = 0;
const total = queryResult.total;
queryResult.hits.forEach(hit => {
const node = (repositoryName && branchName) ? repoConnection.get(hit.id) : nodeLib.connect({
repoId: hit.repoId,
branch: hit.branch
}).get(hit.id);
createEntryConsumer((repositoryName || hit.repoId) + '/' + (branchName || hit.branch) + node._path + '.json',
JSON.stringify(node, null, 2));
current++;
if (current % 10 === 0) {
taskLib.progress({
info: 'Generating report (' + current + '/' + total + ')...',
current: current,
total: total
});
}
});
}
function generateReportMeta(queryParams, queryResult, createEntryConsumer) {
createEntryConsumer('report.json', JSON.stringify({
version: "1",
format: 'Node as JSON',
params: queryParams,
result: queryResult
}, null, 2));
}
|
package testenv
import forex.Main.{AppEnv, AppTask}
import forex.domain.{Price, Rate, Timestamp}
import forex.services.rates.clients.ForexClient.{ForexClient, Service}
import forex.services.rates.errors
import zio.clock.Clock
import zio.console.Console
import zio.{IO, Layer, ZIO, ZLayer}
object TestEnvironment {
def forexClientTest: Layer[Nothing, ForexClient] =
ZLayer.succeed((pair: Rate.Pair) => IO.succeed(Rate(pair, Price(BigDecimal(100)), Timestamp.now)))
def withEnv[A](task: AppTask[A]) =
ZIO.environment[AppEnv].provideCustomLayer(Console.live ++ Clock.live ++ forexClientTest) >>> task
}
|
/*
Copyright 2016 The Kubernetes Authors.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
package secret
import (
"sync"
"k8s.io/api/core/v1"
clientset "k8s.io/client-go/kubernetes"
podutil "k8s.io/kubernetes/pkg/api/v1/pod"
metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
"k8s.io/apimachinery/pkg/util/sets"
)
type Manager interface {
// Get secret by secret namespace and name.
GetSecret(namespace, name string) (*v1.Secret, error)
// WARNING: Register/UnregisterPod functions should be efficient,
// i.e. should not block on network operations.
// RegisterPod registers all secrets from a given pod.
RegisterPod(pod *v1.Pod)
// UnregisterPod unregisters secrets from a given pod that are not
// used by any other registered pod.
UnregisterPod(pod *v1.Pod)
}
type objectKey struct {
namespace string
name string
}
// simpleSecretManager implements the Manager interface with
// simple operations against the apiserver.
type simpleSecretManager struct {
kubeClient clientset.Interface
}
func NewSimpleSecretManager(kubeClient clientset.Interface) Manager {
return &simpleSecretManager{kubeClient: kubeClient}
}
func (s *simpleSecretManager) GetSecret(namespace, name string) (*v1.Secret, error) {
return s.kubeClient.CoreV1().Secrets(namespace).Get(name, metav1.GetOptions{})
}
func (s *simpleSecretManager) RegisterPod(pod *v1.Pod) {
}
func (s *simpleSecretManager) UnregisterPod(pod *v1.Pod) {
}
// store is the interface for a secrets cache that
// can be used by cacheBasedSecretManager.
type store interface {
// AddReference adds a reference to the secret to the store.
// Note that multiple additions to the store have to be allowed
// in the implementations and effectively treated as refcounted.
AddReference(namespace, name string)
// DeleteReference deletes a reference to the secret from the store.
// Note that the secret should be deleted only when there has been a
// corresponding DeleteReference call for each AddReference call
// (effectively when the refcount drops to zero).
DeleteReference(namespace, name string)
// Get a secret from a store.
Get(namespace, name string) (*v1.Secret, error)
}
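The refcounting contract the store interface's comments describe can be sketched as follows (Python, illustrative only; real kubelet store implementations also cache secret data, TTLs, etc.):

```python
class RefCountedStore:
    """Minimal sketch of the refcounted store semantics: repeated
    AddReference calls for the same secret are counted, and the entry
    is forgotten only when the count returns to zero."""

    def __init__(self):
        self._refs = {}

    def add_reference(self, namespace, name):
        key = (namespace, name)
        self._refs[key] = self._refs.get(key, 0) + 1

    def delete_reference(self, namespace, name):
        key = (namespace, name)
        if key in self._refs:
            self._refs[key] -= 1
            if self._refs[key] == 0:
                # Refcount hit zero: stop tracking the secret.
                del self._refs[key]

    def is_tracked(self, namespace, name):
        return (namespace, name) in self._refs
```

This mirrors why RegisterPod below may safely re-add references for an updated pod before DeleteReference is called for the previous pod's secrets: the counts net out.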
// cacheBasedSecretManager keeps a store with secrets necessary
// for registered pods. Different implementations of the store
// may result in different semantics for freshness of secrets
// (e.g. ttl-based implementation vs watch-based implementation).
type cacheBasedSecretManager struct {
secretStore store
lock sync.Mutex
registeredPods map[objectKey]*v1.Pod
}
func newCacheBasedSecretManager(secretStore store) Manager {
return &cacheBasedSecretManager{
secretStore: secretStore,
registeredPods: make(map[objectKey]*v1.Pod),
}
}
func (c *cacheBasedSecretManager) GetSecret(namespace, name string) (*v1.Secret, error) {
return c.secretStore.Get(namespace, name)
}
func getSecretNames(pod *v1.Pod) sets.String {
result := sets.NewString()
podutil.VisitPodSecretNames(pod, func(name string) bool {
result.Insert(name)
return true
})
return result
}
func (c *cacheBasedSecretManager) RegisterPod(pod *v1.Pod) {
names := getSecretNames(pod)
c.lock.Lock()
defer c.lock.Unlock()
for name := range names {
c.secretStore.AddReference(pod.Namespace, name)
}
var prev *v1.Pod
key := objectKey{namespace: pod.Namespace, name: pod.Name}
prev = c.registeredPods[key]
c.registeredPods[key] = pod
if prev != nil {
for name := range getSecretNames(prev) {
// On an update, the .Add() call above will have re-incremented the
// ref count of any existing secrets, so any secrets that are in both
// names and prev need to have their ref counts decremented. Any that
// are only in prev need to be completely removed. This unconditional
// call takes care of both cases.
c.secretStore.DeleteReference(prev.Namespace, name)
}
}
}
func (c *cacheBasedSecretManager) UnregisterPod(pod *v1.Pod) {
var prev *v1.Pod
key := objectKey{namespace: pod.Namespace, name: pod.Name}
c.lock.Lock()
defer c.lock.Unlock()
prev = c.registeredPods[key]
delete(c.registeredPods, key)
if prev != nil {
for name := range getSecretNames(prev) {
c.secretStore.DeleteReference(prev.Namespace, name)
}
}
}
|
---
title: Teaching
layout: index_eng
dropdown: true
content:
- bsc
- msc
- doktori
- temakiiras
---
|
package com.dsource.idc.jellowintl;
import android.Manifest;
import android.content.Context;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.os.AsyncTask;
import android.os.Build;
import android.os.Bundle;
import com.crashlytics.android.Crashlytics;
import com.dsource.idc.jellowintl.cache.CacheManager;
import com.dsource.idc.jellowintl.factories.TextFactory;
import com.dsource.idc.jellowintl.utility.CreateDataBase;
import com.dsource.idc.jellowintl.utility.DataBaseHelper;
import com.google.android.gms.tasks.OnCompleteListener;
import com.google.android.gms.tasks.Task;
import com.google.firebase.remoteconfig.FirebaseRemoteConfig;
import com.google.firebase.remoteconfig.FirebaseRemoteConfigSettings;
import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;
import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.Timer;
import java.util.TimerTask;
import androidx.annotation.NonNull;
import androidx.core.content.ContextCompat;
import static com.dsource.idc.jellowintl.utility.Analytics.isAnalyticsActive;
import static com.dsource.idc.jellowintl.utility.Analytics.resetAnalytics;
import static com.dsource.idc.jellowintl.utility.Analytics.setCrashlyticsCustomKey;
import static com.dsource.idc.jellowintl.utility.Analytics.setUserProperty;
import static com.dsource.idc.jellowintl.utility.SessionManager.BN_IN;
import static com.dsource.idc.jellowintl.utility.SessionManager.ENG_AU;
import static com.dsource.idc.jellowintl.utility.SessionManager.ENG_IN;
import static com.dsource.idc.jellowintl.utility.SessionManager.ENG_UK;
import static com.dsource.idc.jellowintl.utility.SessionManager.ENG_US;
import static com.dsource.idc.jellowintl.utility.SessionManager.HI_IN;
import static com.dsource.idc.jellowintl.utility.SessionManager.MR_IN;
/**
* Created by ekalpa on 7/12/2016.
*/
public class SplashActivity extends BaseActivity {
//Field to create IconDatabase
CreateDataBase iconDatabase;
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_splash);
getSupportActionBar().hide();
//updateLangPackagesIfUpdateAvail();
new DataBaseHelper(this).createDataBase();
PlayGifView pGif = findViewById(R.id.viewGif);
pGif.setImageResource(R.drawable.jellow_j);
if (getSession().isRequiredToPerformDbOperations()) {
performDatabaseOperations();
getSession().setCompletedDbOperations(true);
}
if((Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) &&
(ContextCompat.checkSelfPermission(this, Manifest.permission.CALL_PHONE)
!= PackageManager.PERMISSION_GRANTED))
getSession().setEnableCalling(false);
if (getSession().isLanguageChanged() == 1) {
CacheManager.clearCache();
TextFactory.clearJson();
}
iconDatabase=new CreateDataBase(this);
iconDatabase.execute();
checkIfDatabaseCreated();
}
@Override
protected void onResume() {
super.onResume();
if(!isAnalyticsActive()) {
resetAnalytics(this, getSession().getCaregiverNumber().substring(1));
}
setUserParameters();
}
private void setUserParameters() {
final int GRID_3BY3 = 1, PICTURE_TEXT = 0;
if(getSession().isGridSizeKeyExist()) {
if(getSession().getGridSize() == GRID_3BY3){
setUserProperty("GridSize", "9");
setCrashlyticsCustomKey("GridSize", "9");
}else{
setUserProperty("GridSize", "3");
setCrashlyticsCustomKey("GridSize", "3");
}
}else{
setUserProperty("GridSize", "9");
setCrashlyticsCustomKey("GridSize", "9");
}
if(getSession().getPictureViewMode() == PICTURE_TEXT) {
setUserProperty("PictureViewMode", "PictureText");
setCrashlyticsCustomKey("PictureViewMode", "PictureText");
}else{
setUserProperty("PictureViewMode", "PictureOnly");
setCrashlyticsCustomKey("PictureViewMode", "PictureOnly");
}
}
private void checkIfDatabaseCreated()
{
if(!(getSession().isLanguageChanged()==2))
startApp(); // If changes exist in the app, check whether the database has been created before starting.
else
startJellow(); // If the language has not changed, simply start the app.
}
private void startApp() {
final Timer timer=new Timer();
timer.scheduleAtFixedRate(new TimerTask() {
@Override
public void run() {
if(iconDatabase.getStatus()== AsyncTask.Status.FINISHED)
{
startJellow();
timer.cancel();
}
}
},0,100);
}
private void startJellow() {
startActivity(new Intent(SplashActivity.this, MainActivity.class));
finishAffinity();
}
private void performDatabaseOperations() {
DataBaseHelper helper = new DataBaseHelper(this);
helper.openDataBase();
helper.addLanguageDataToDatabase();
}
private void updateLangPackagesIfUpdateAvail() {
//This function checks whether any language package has been updated on Firebase.
// If so, the user is required to download that package.
final FirebaseRemoteConfig frc = FirebaseRemoteConfig.getInstance();
FirebaseRemoteConfigSettings configSettings = new FirebaseRemoteConfigSettings.Builder()
.setDeveloperModeEnabled(BuildConfig.DEBUG)
.build();
frc.setConfigSettings(configSettings);
frc.setDefaults(R.xml.remote_config_default);
frc.fetch(1)
.addOnCompleteListener(this, new OnCompleteListener<Void>() {
@Override
public void onComplete(@NonNull Task<Void> task) {
if (task.isSuccessful()) {
// After config data is successfully fetched, it must be activated before
// newly fetched values are returned.
frc.activateFetched();
String updateLangPackageData = frc.getString("vcTwentyUpdateLanguagePackages");
if(updateLangPackageData.isEmpty())
return;
StringBuilder lang = new StringBuilder();
//1)Get local packages list.
//2)Compare which package from local list has updates.
//3)Add language name to update list.
try {
JSONObject jObj = new JSONObject(updateLangPackageData);
for (String langName: getOfflineLanguages()) {
try {
JSONArray jArray = jObj.getJSONArray(langName);
String path = getBaseContext().getDir(langName, Context.MODE_PRIVATE).getPath();
for (int i = 0; i < jArray.length(); i++) {
if (!(new File(path + jArray.get(i))).exists()) {
lang.append(langName+",");
break;
}
}
}catch (JSONException e) {
e.printStackTrace();
}
}
} catch (JSONException e) {
e.printStackTrace();
}
if (lang.toString().isEmpty())
return;
startActivity(new Intent(SplashActivity.this,
LanguagePackageUpdateActivity.class).putExtra("packageList", lang.toString()));
finish();
} else {
Crashlytics.log("RemoteConfigFetchFailed");
}
}
});
}
private String[] getOfflineLanguages(){
List<String> lang = new ArrayList<>();
if(getSession().isDownloaded(ENG_IN))
lang.add(ENG_IN);
if(getSession().isDownloaded(ENG_US))
lang.add(ENG_US);
if(getSession().isDownloaded(ENG_AU))
lang.add(ENG_AU);
if(getSession().isDownloaded(ENG_UK))
lang.add(ENG_UK);
if(getSession().isDownloaded(HI_IN))
lang.add(HI_IN);
if(getSession().isDownloaded(MR_IN))
lang.add(MR_IN);
if(getSession().isDownloaded(BN_IN))
lang.add(BN_IN);
return lang.toArray(new String[lang.size()]);
}
}
|
import { core } from './core'
try {
const app = core()
app.start();
} catch (error) {
console.error(error)
}
|
#!/bin/bash
# This script is intended to be used on Debian systems for building
# the project. It has been tested with Debian 8
USERNAME=$USER
SIGNING_NAME='perfect-mail'
SDK_VERSION='r24.3.3'
SDK_DIR=$HOME/android-sdk
cd ..
PROJECT_HOME=$(pwd)
sudo apt-get install -y build-essential default-jdk \
lib32stdc++6 lib32z1 lib32z1-dev
if [ ! -d $SDK_DIR ]; then
mkdir -p $SDK_DIR
fi
cd $SDK_DIR
# download the SDK
if [ ! -f $SDK_DIR/android-sdk_$SDK_VERSION-linux.tgz ]; then
wget https://dl.google.com/android/android-sdk_$SDK_VERSION-linux.tgz
tar -xzvf android-sdk_$SDK_VERSION-linux.tgz
fi
SDK_DIR=$SDK_DIR/android-sdk-linux
echo 'Check that you have the SDK tools installed for Android 17, SDK 19.1'
if [ ! -f $SDK_DIR/tools/android ]; then
echo "$SDK_DIR/tools/android not found"
exit 1
fi
cd $SDK_DIR
chmod -R 0755 $SDK_DIR
chmod a+rx $SDK_DIR/tools
ANDROID_HOME=$SDK_DIR
echo "sdk.dir=$SDK_DIR" > $ANDROID_HOME/local.properties
PATH=${PATH}:$ANDROID_HOME/tools:$ANDROID_HOME/platform-tools
android sdk
cd $PROJECT_HOME
if [ ! -f $SDK_DIR/tools/templates/gradle/wrapper/gradlew ]; then
echo "$SDK_DIR/tools/templates/gradle/wrapper/gradlew not found"
exit 2
fi
. $SDK_DIR/tools/templates/gradle/wrapper/gradlew build
#cd ~/develop/$PROJECT_NAME/build/outputs/apk
#keytool -genkey -v -keystore example.keystore -alias \
# "$SIGNING_NAME" -keyalg RSA -keysize 4096
#jarsigner -verbose -keystore example.keystore \
# perfect-mail-release-unsigned.apk "$SIGNING_NAME"
# cleaning up
cd $PROJECT_HOME/perfect-mail/build/outputs/apk
if [ ! -f perfect-mail-debug.apk ]; then
echo 'perfect-mail-debug.apk was not found'
exit 3
fi
echo 'Build script ended successfully'
echo -n 'apk is available at: '
echo "$PROJECT_HOME/perfect-mail/build/outputs/apk/perfect-mail-debug.apk"
exit 0
|
/*
* Copyright © 2014 - 2018 Leipzig University (Database Research Group)
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.gradoop.flink.model.impl.operators.matching.single.simulation.dual.tuples;
import org.apache.flink.api.java.tuple.Tuple4;
import org.gradoop.common.model.impl.id.GradoopId;
import org.gradoop.flink.model.impl.operators.matching.single.simulation.dual.util.MessageType;
import java.util.List;
/**
* Bundles a set of {@link Deletion} messages into a single message to update
* the vertex state accordingly.
*
* f0: recipient vertex id
* f1: sender vertex ids
* f2: candidate deletions
* f3: message types
*/
public class Message extends
Tuple4<GradoopId, List<GradoopId>, List<Long>, List<MessageType>> {
public GradoopId getRecipientId() {
return f0;
}
public void setRecipientId(GradoopId recipientId) {
f0 = recipientId;
}
public List<GradoopId> getSenderIds() {
return f1;
}
public void setSenderIds(List<GradoopId> senderIds) {
f1 = senderIds;
}
public List<Long> getDeletions() {
return f2;
}
public void setDeletions(List<Long> deletions) {
f2 = deletions;
}
public List<MessageType> getMessageTypes() {
return f3;
}
public void setMessageTypes(List<MessageType> messageTypes) {
f3 = messageTypes;
}
}
|
## message_spec.rb
## Used to test wrapper for message results
require 'spec_helper'
describe Wit::REST::Message do
let(:json) {%(
{
"msg_id" : "c20ad081-2cb9-4c63-8dd6-6667409514fa",
"outcomes" : [ {
"_text" : "how many people between Tuesday and Friday",
"intent" : "query_metrics",
"entities" : {
"metric" : [ {
"metadata" : "{'code' : 324}",
"value" : "metric_visitor"
} ],
"datetime" : [ {
"value" : {
"from" : "2014-06-24T00:00:00.000+02:00",
"to" : "2014-06-25T00:00:00.000+02:00"
}
}, {
"value" : {
"from" : "2014-06-27T00:00:00.000+02:00",
"to" : "2014-06-28T00:00:00.000+02:00"
}
} ]
},
"confidence" : 0.986
} ]
}
)}
let(:rand_path) {"rand_path"}
let(:rand_body) {"rand_body"}
let(:rest_code) {"get"}
let(:message_results) {Wit::REST::Message.new(MultiJson.load(json), rand_path, rand_body, rest_code)}
it "should have the following parameters, confidence and intent and entities" do
expect(message_results.confidence).to eql(0.986)
expect(message_results.intent).to eql("query_metrics")
expect(message_results.metric.class).to eql(Wit::REST::EntityArray)
expect(message_results.datetime.class).to eql(Wit::REST::EntityArray)
end
it "should have the right values in the entities" do
expect(message_results.metric[0].value).to eql("metric_visitor")
expect(message_results.datetime[0].value["from"]).to eql("2014-06-24T00:00:00.000+02:00")
end
it "should be able to return back an array of strings of the names of each entity" do
expect(message_results.entity_names).to eql(["metric", "datetime"])
end
end
|
import { BrowserModule } from '@angular/platform-browser';
import {CUSTOM_ELEMENTS_SCHEMA, NgModule} from '@angular/core';
import {MatMenuModule} from '@angular/material/menu';
import { AppRoutingModule } from './app-routing.module';
import { AppComponent } from './app.component';
import { HomePageComponent } from './home-page/home-page.component';
import { ULibraryComponent } from './u-library/u-library.component';
import { AddItemsComponent } from './add-items/add-items.component';
import { ScanItemsComponent } from './scan-items/scan-items.component';
import { HttpClientModule } from '@angular/common/http';
import {MatGridListModule} from '@angular/material/grid-list';
import {FormsModule} from '@angular/forms';
import { UBookComponent } from './u-book/u-book.component';
@NgModule({
declarations: [
AppComponent,
HomePageComponent,
ULibraryComponent,
AddItemsComponent,
ScanItemsComponent,
UBookComponent
],
imports: [
BrowserModule,
AppRoutingModule,
MatMenuModule,
HttpClientModule,
MatGridListModule,
FormsModule
],
schemas: [ CUSTOM_ELEMENTS_SCHEMA ],
providers: [],
bootstrap: [AppComponent]
})
export class AppModule { }
|
package just.fp
import just.fp.compat.EitherCompat
/**
* @author Kevin Lee
* @since 2019-03-16
*/
trait Monad[M[_]] extends Applicative[M] {
override def map[A, B](ma: M[A])(f: A => B): M[B] =
flatMap(ma)(a => pure(f(a)))
def flatMap[A, B](ma: M[A])(f: A => M[B]): M[B]
override def ap[A, B](ma: => M[A])(f: => M[A => B]): M[B] =
flatMap(ma) { a =>
map(f)(fab => fab(a))
}
trait MonadLaw extends ApplicativeLaw {
/*
* return a >>= f === f a
*/
def leftIdentity[A, B](a: A, f: A => M[B])(implicit MB: Equal[M[B]]): Boolean =
MB.equal(flatMap(pure(a))(f), f(a))
/*
* m >>= return === m
*/
def rightIdentity[A](a: M[A])(implicit MA: Equal[M[A]]): Boolean =
MA.equal(flatMap(a)(pure(_: A)), a)
/*
* (m >>= f) >>= g === m >>= (\x -> f x >>= g)
*/
def associativity[A, B, C](a: M[A], f: A => M[B], g: B => M[C])(implicit MC: Equal[M[C]]): Boolean =
MC.equal(flatMap(flatMap(a)(f))(g), flatMap(a)(x => flatMap(f(x))(g)))
}
def monadLaw: MonadLaw = new MonadLaw {}
}
object Monad extends MonadInstances {
@inline final def apply[M[_] : Monad]: Monad[M] = implicitly[Monad[M]]
}
private[fp] trait MonadInstances
extends IdInstance
with OptionMonadInstance
with EitherMonadInstance
with ListMonadInstance
with VectorMonadInstance
with FutureMonadInstance
private[fp] trait IdInstance {
implicit val idInstance: Functor[Id] with Applicative[Id] with Monad[Id] =
new Functor[Id] with Applicative[Id] with Monad[Id] {
override def pure[A](a: => A): Id[A] = a
override def flatMap[A, B](ma: Id[A])(f: A => Id[B]): Id[B] = f(ma)
}
}
private[fp] trait OptionMonad extends Monad[Option] with OptionApplicative {
override def flatMap[A, B](ma: Option[A])(f: A => Option[B]): Option[B] =
ma.flatMap(f)
override def pure[A](a: => A): Option[A] = Option(a)
}
private[fp] trait EitherMonad[A] extends Monad[Either[A, *]] with EitherApplicative[A] {
override def flatMap[B, C](ma: Either[A, B])(f: B => Either[A, C]): Either[A, C] =
EitherCompat.flatMap(ma)(f)
override def pure[B](b: => B): Either[A, B] = Right(b)
}
private[fp] trait ListMonad extends Monad[List] with ListApplicative {
override def flatMap[A, B](ma: List[A])(f: A => List[B]): List[B] =
ma.flatMap(f)
override def pure[A](a: => A): List[A] = List(a)
}
private[fp] trait VectorMonad extends Monad[Vector] with VectorApplicative {
override def flatMap[A, B](ma: Vector[A])(f: A => Vector[B]): Vector[B] =
ma.flatMap(f)
override def pure[A](a: => A): Vector[A] = Vector(a)
}
import scala.concurrent.Future
private[fp] trait FutureMonad extends Monad[Future] with FutureApplicative {
import scala.concurrent.ExecutionContext
override implicit def executor: ExecutionContext
override def flatMap[A, B](ma: Future[A])(f: A => Future[B]): Future[B] =
ma.flatMap(f)
override def pure[A](a: => A): Future[A] = Future(a)
}
private[fp] trait OptionMonadInstance extends OptionApplicativeInstance {
implicit val optionMonad: Monad[Option] = new OptionMonad {}
}
private[fp] trait EitherMonadInstance extends EitherApplicativeInstance {
implicit def eitherMonad[A]: Monad[Either[A, *]] = new EitherMonad[A] {}
}
private[fp] trait ListMonadInstance extends ListApplicativeInstance {
implicit val listMonad: Monad[List] = new ListMonad {}
}
private[fp] trait VectorMonadInstance extends VectorApplicativeInstance {
implicit val vectorMonad: Monad[Vector] = new VectorMonad {}
}
private[fp] trait FutureMonadInstance extends FutureApplicativeInstance {
import scala.concurrent.ExecutionContext
@SuppressWarnings(Array("org.wartremover.warts.ImplicitParameter"))
implicit def futureMonad(implicit executor0: ExecutionContext): Monad[Future] =
new FutureMonad {
override implicit def executor: ExecutionContext = executor0
}
}
|
# IDEA Version Control: Git (a different way to work)
### View the author and commit time of each line of code (most people don't know this)

**After selecting it, you'll see the view shown below**

**Hover over a line to see the full commit details**

### Clone a remote repository
> git clone <url>

The regular approach

**The show-off approach**

### Pull remote code
> git pull

**Shortcut**
```
ctrl + t
```
### Commit staged code to the local repository
> git commit -m 'message'


### Push the local repository to the remote repository
> git push

Shortcut
```
ctrl + shift + k
```
or
```
alt + 1 + 8
```
### Switch branches, or check out a remote branch

Here are a few shortcuts:
```
ctrl + shift + `
```
or
```
alt + ~ + 7
```
or

### View the history of the currently open class
```
alt + ~ + 4
```
#### View the project's history
With the project selected:
```
alt + ~ + 4
```
or press `alt + 9` to switch to the `Version Control` panel and select the Log tab

|
<?php
declare(strict_types=1);
namespace webignition\BasilParser;
use webignition\BasilModels\Page\Page;
use webignition\BasilModels\Page\PageInterface;
class PageParser
{
private const KEY_URL = 'url';
private const KEY_ELEMENTS = 'elements';
public static function create(): PageParser
{
return new PageParser();
}
/**
* @param array<string, mixed> $pageData
*/
public function parse(string $importName, array $pageData): PageInterface
{
$url = $pageData[self::KEY_URL] ?? '';
$url = is_string($url) ? trim($url) : '';
$elements = $pageData[self::KEY_ELEMENTS] ?? [];
$elements = is_array($elements) ? $elements : [];
return new Page($importName, $url, $elements);
}
}
|
# Stats
This repository is a collection of scripts for generating project statistics
and data.
Stats generated by these scripts have been, are and will be used in curl
related blog posts and presentations. By providing the scripts in a public
repository, everyone can reproduce the results and verify their correctness.
It also allows everyone to help improve the scripts and to provide new ones
that generate even more, better and more interesting project stats.
## How to run the scripts
### Check out the main curl git repository
git clone https://github.com/curl/curl.git
### Check out this repository as a subdirectory
cd curl
git clone https://github.com/curl/stats.git
### Run the stats scripts
The scripts are (primarily) written in perl and are intended to be run from
the curl source code root.
Example:
perl stats/CI-jobs-over-time.pl
## Output
The scripts output CSV data, usually with dates and/or curl release
versions included in each line.
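As a minimal sketch (the exact columns and delimiter vary per script; this assumes a hypothetical two-column date/count output, not real curl data), such CSV lines can be consumed like this:

```python
import csv
import io

# Hypothetical sample in the "date and counter" shape many scripts use;
# real column layouts vary per script.
sample = "2020-01-01,1234\n2021-01-01,1456\n"

rows = list(csv.reader(io.StringIO(sample)))
for date, count in rows:
    print(date, int(count))
```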
# Scripts
## API-calls-over-time
Iterates over git tags. Extracts the number of function calls as mentioned in
the RELEASE-NOTES of the tag moment. Outputs version, date and a counter.
## CI-jobs-over-time
Iterates over all git tags. It then counts how many CI jobs seem to have
been enabled at that time. Outputs version, date, total count, Travis count,
Cirrus count, AppVeyor count, Azure count and GitHub count.
## CI-platforms
Iterates over all git tags. It then counts how many CI jobs seem to have
been enabled at that time. Outputs date, total count, Linux count, macOS
count, Windows count and FreeBSD count.
## authors
Iterates over the git log. Counts how many commits each author did and when,
then lists all dates when a new author appeared in the project. Outputs date,
single-committer count, total author count and a single/total share.
## authors-per-month
Iterates over the git log. Counts the number of different authors every month,
then for all years after 2009, outputs: year, first-committers, unique authors,
drive-by count and total uniques. The *drive-by* count is an author with fewer
than three commits done within that month.
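The *drive-by* rule above can be sketched like this (hypothetical commit counts for one month, not real curl data):

```python
from collections import Counter

# Hypothetical commit counts for one month: author -> commits
commits = Counter({"alice": 12, "bob": 2, "carol": 1})

# A "drive-by" author has fewer than three commits within the month.
drive_by = sorted(a for a, n in commits.items() if n < 3)
print(drive_by)
```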
## authors-per-year
Iterates over the git log. Counts the number of different authors every year,
then outputs date (as "$year-01-01"), total count and first-timer count, the
latter being the number of authors who made their first commit that year.
## bugbounty-over-time
Iterates over all vulnerabilities in `vuln.pm` (in the curl-www repo). Outputs
cve, date, the accumulated amount and the individual payout amount. Amounts in
USD. Note that this then does not include CVE reports that received payout but
have since been retracted from the list of vulnerabilities.
## bugfix-frequency
Iterates over all releases in `releases.csv` (built in the curl-www repo). For
each release, it outputs version release date, total number of bugfixes in
that release and then the averaged bugs-per-day count for the last 5
releases. The first 5 lines obviously have fewer releases to average over.
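The 5-release rolling average described above might look like this (made-up bugfix-per-day numbers, purely for illustration):

```python
# Hypothetical bugfixes-per-day values, one per release, oldest first.
rates = [1.2, 1.5, 0.9, 2.0, 1.1, 1.8, 1.4]

averages = []
for i in range(len(rates)):
    window = rates[max(0, i - 4):i + 1]  # up to the last 5 releases
    averages.append(sum(window) / len(window))
print(round(averages[-1], 2))
```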
## cmdline-options-over-time
Iterates over all git tags. Extracts the counter from the `RELEASE-NOTES` of
each tag. This script also contains a bunch of manually added lines from the
times before the `RELEASE-NOTES` file contained the necessary information.
Outputs version, date and a counter.
## commits-per-month
Iterates over the git log. Counts number of commits done per month. Outputs
date ("$y-$m-01") and a counter.
## commits-per-year
Iterates over the git log. Counts number of commits done per year. Outputs
date ("$y-01-01") and a counter.
## contributors-over-time
Iterates over all git tags. Extracts the counter from the `RELEASE-NOTES` of
each tag. The script contains a set of manually added numbers from the time
before the number was added to `RELEASE-NOTES`.
## coreteam-over-time
Iterates over the git log. Counts how many authors that have done 10 commits
or more within the same calendar year, count them as "core team" members and
outputs information about them.
## cve-age
Iterates over `vuln.pm` and `releases.csv` (from curl-www). Outputs CVE, date,
flaw period, project age at that point, days since previous CVE, total CVE
count to that point.
## cve-plot
Iterates over `vuln.pm` and `releases.csv` (from curl-www). Outputs CVE, total
count, flaw period, project age at that point.
## daniel-per-year
Iterates over the git log. Counts how many commits Daniel did and how many
others did each year. Outputs date ("$year-12-31") and a share for that year.
## daniel-vs-rest
Iterates over the git log. Outputs date, total commit count, Daniel's share of
all commits and the others' share of all commits.
## days-per-release
Iterates over `releases.csv` (from curl-www). Outputs version, date and number
of days between this release and the previous.
## docs-over-time
Iterates over the git log and all commits done to the `docs/` folder. Outputs
date and number of lines.
## gh-monthly
Uses the generated github.csv file to generate graphs on github activity.
## gh-age
Uses the generated github.csv file to generate graphs on github issue ages.
## lines-over-time
Iterates over the git log and all commits done to the `src/`, `lib/` and
`include/` folders. Outputs date and number of lines. The script contains a
set of versions and LOC counts, manually counted from the time before the git
repo.
## mail
Downloads the server-side CSV and generates a mailing list activity graph.
## protocols-over-time
Iterates over `protocol-history.md`, which is a human maintained input
source. Outputs date, protocol, total count. The protocol being the one that
was added at that particular moment in time. The initial protocols were added
*before* the first curl release...
## setopts-over-time
Iterates over all git tags. Extracts the setopt counter from the
`RELEASE-NOTES` of each tag. The script contains a set of manually added
numbers from the time before the number was added to `RELEASE-NOTES`. Outputs
version, date and counter.
## files-over-time
Iterates over all git tags. Counts the number of files in the repository at
the time of each tag. Outputs version, date and counter.
## tests-over-time
Iterates over all git tags. Counts the number of files matching
`tests/data/test[num]` at the time of each tag. Outputs version, date and
counter.
## tls-over-time
Iterates over `tls-history.md`, which is a human maintained input
source. Outputs date, backend, total count. The "backend" being the TLS
library that was added (or removed) at that particular moment in time. Removed
backends are prefixed with a minus.
## vulns-over-time
Iterates over all vulnerabilities in `vuln.pm` (in the curl-www repo). Outputs
cve, date, the total CVE count.
## vulns-per-year
Iterates over all vulnerabilities in `vuln.pm` (in the curl-www repo). Outputs
date ("$year-01-01"), CVEs that year and the total CVE count up to and
including that year.
# License
The scripts are provided under [MIT](LICENSE).
|
"""Initialize the module."""
from .ugrid_2d_data_extractor import UGrid2dDataExtractor # NOQA: F401
from .ugrid_2d_polyline_data_extractor import UGrid2dPolylineDataExtractor # NOQA: F401
__version__ = '5.0.1'
|
package net.wessendorf.beam;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.amqp.AmqpIO;
import org.apache.beam.sdk.io.jms.JmsIO;
import org.apache.beam.sdk.io.jms.JmsRecord;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.transforms.ParDo;
import org.apache.beam.sdk.transforms.Values;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.qpid.jms.JmsConnectionFactory;
import org.apache.qpid.proton.message.Message;
import org.joda.time.Duration;
import java.util.Collections;
public class Runner {
public static void main(String[] args) throws Exception {
PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation()
.as(PipelineOptions.class);
Pipeline pipeline = Pipeline.create(options);
// Apache Artemis:
PCollection<JmsRecord> output = pipeline.apply(JmsIO.read()
.withConnectionFactory(new JmsConnectionFactory("amqp://172.17.0.10:5672"))
.withQueue("someQueue")
.withUsername("admin")
.withPassword("admin"));
// Apache Qpid (using older proton-j lib)
// PCollection<Message> output = pipeline.apply(AmqpIO.read()
// .withMaxNumRecords(Long.MAX_VALUE)
// .withAddresses(Collections.singletonList("amqp://172.17.0.10:5672/myQueue")));
PCollection<String> words = output.apply(ParDo.of(new UppercaseFn()));
words.apply(KafkaIO.<String, String>write().withBootstrapServers("172.17.0.11:9092")
.withTopic("beam.results").withValueSerializer(StringSerializer.class)
.withKeySerializer(StringSerializer.class).values());
pipeline.run();
}
}
|
# SSI PHP SERVER
Local server supporting SSI, PHP, hot reload and Shift-JIS encoded files
<br>
## Dependencies
- gulp
- gulp-connect-php
- gulp-convert-encoding
- connect-ssi
- browser-sync
<br>
<br>
## Configuration
First, you need to have PHP installed on your machine in order to run PHP scripts.
Then open the [gulpfile.js] file and set the paths to your php.exe and php.ini files.
```javascript
// before
const phpExe = 'path-to-php.exe';
const phpIni = 'path-to-php.ini';
// after:
const phpExe = 'C:/foo/bar/php.exe';
const phpIni = 'C:/foo/bar/php.ini';
```
That's all!
<br>
<br>
## Install
<br>
```sh
npm i
# or
pnpm i
```
<br>
<br>
## Launch SSI + PHP server with hot reload
<br>
## UTF-8 encoded files
```sh
npm run utf8
# or
gulp utf8
```
- serve [src] folder
- watch changes in [src] folder and reload browser
<br><br>
## Shift_JIS encoded files
```sh
npm run shiftJis
# or
gulp shiftJis
```
- copy [src] files to [dist] folder
- convert HTML, CSS and JS files from shift-jis to utf-8
- serve [dist] folder
- watch changes in [src] folder
- copy [src] files to [dist] folder
- convert HTML, CSS and JS files from shift-jis to utf-8
- reload browser
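The encoding conversion step boils down to decoding Shift_JIS bytes and re-encoding them as UTF-8, roughly like this (a standalone sketch, not the actual gulp-convert-encoding internals):

```python
# Pretend these bytes were read from a Shift_JIS encoded file in src/.
sjis_bytes = "こんにちは".encode("shift_jis")

# Decode the legacy encoding, then re-encode as UTF-8 for dist/.
text = sjis_bytes.decode("shift_jis")
utf8_bytes = text.encode("utf-8")
print(utf8_bytes.decode("utf-8"))
```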
|
# University Data Analysis
Data analysis of US colleges and universities.
Data source: *College Scorecard*.
##### Dependencies:
* [NumPy](https://numpy.org)
* [Pandas](https://pandas.pydata.org)
* [Matplotlib](https://matplotlib.org)
|
import * as React from 'react'
import { Wrapper, Guide } from './elements'
import { TGuide } from '../App'
type Props = {
guides: TGuide[]
isLoading: boolean
}
const Guides: React.FC<Props> = ({ guides }) => (
<Wrapper>
{guides.map((guide) => (
<Guide
key={guide.fileName}
href={`/guides/${guide.type}/${guide.fileName.split('.')[0]}`}
>
{guide.title}
<span>{guide.type}</span>
</Guide>
))}
</Wrapper>
)
export default Guides
|
package itest
import (
"context"
"fmt"
"testing"
"time"
"github.com/btcsuite/btcutil"
"github.com/lightningnetwork/lnd/lnrpc"
"github.com/lightningnetwork/lnd/lntest"
"github.com/lightningnetwork/lnd/macaroons"
"github.com/stretchr/testify/require"
"google.golang.org/protobuf/proto"
"gopkg.in/macaroon.v2"
)
// testRPCMiddlewareInterceptor tests that the RPC middleware interceptor can
// be used correctly and in a safe way.
func testRPCMiddlewareInterceptor(net *lntest.NetworkHarness, t *harnessTest) {
// Let's first enable the middleware interceptor.
net.Alice.Cfg.ExtraArgs = append(
net.Alice.Cfg.ExtraArgs, "--rpcmiddleware.enable",
)
err := net.RestartNode(net.Alice, nil)
require.NoError(t.t, err)
// Let's set up a channel between Alice and Bob, just to get some useful
// data to inspect when doing RPC calls to Alice later.
net.EnsureConnected(t.t, net.Alice, net.Bob)
net.SendCoins(t.t, btcutil.SatoshiPerBitcoin, net.Alice)
_ = openChannelAndAssert(
t, net, net.Alice, net.Bob, lntest.OpenChannelParams{
Amt: 1_234_567,
},
)
// Load or bake the macaroons that the simulated users will use to
// access the RPC.
readonlyMac, err := net.Alice.ReadMacaroon(
net.Alice.ReadMacPath(), defaultTimeout,
)
require.NoError(t.t, err)
customCaveatMac, err := macaroons.SafeCopyMacaroon(readonlyMac)
require.NoError(t.t, err)
addConstraint := macaroons.CustomConstraint(
"itest-caveat", "itest-value",
)
require.NoError(t.t, addConstraint(customCaveatMac))
// Run all sub-tests now. We can't run anything in parallel because that
// would cause the main test function to exit and the nodes to be
// cleaned up.
t.t.Run("registration restrictions", func(tt *testing.T) {
middlewareRegistrationRestrictionTests(tt, net.Alice)
})
t.t.Run("read-only intercept", func(tt *testing.T) {
registration := registerMiddleware(
tt, net.Alice, &lnrpc.MiddlewareRegistration{
MiddlewareName: "itest-interceptor",
ReadOnlyMode: true,
},
)
defer registration.cancel()
middlewareInterceptionTest(
tt, net.Alice, net.Bob, registration, readonlyMac,
customCaveatMac, true,
)
})
// We've manually disconnected Bob from Alice in the previous test, make
// sure they're connected again.
net.EnsureConnected(t.t, net.Alice, net.Bob)
t.t.Run("encumbered macaroon intercept", func(tt *testing.T) {
registration := registerMiddleware(
tt, net.Alice, &lnrpc.MiddlewareRegistration{
MiddlewareName: "itest-interceptor",
CustomMacaroonCaveatName: "itest-caveat",
},
)
defer registration.cancel()
middlewareInterceptionTest(
tt, net.Alice, net.Bob, registration, customCaveatMac,
readonlyMac, false,
)
})
// Next, run the response manipulation tests.
net.EnsureConnected(t.t, net.Alice, net.Bob)
t.t.Run("read-only not allowed to manipulate", func(tt *testing.T) {
registration := registerMiddleware(
tt, net.Alice, &lnrpc.MiddlewareRegistration{
MiddlewareName: "itest-interceptor",
ReadOnlyMode: true,
},
)
defer registration.cancel()
middlewareManipulationTest(
tt, net.Alice, net.Bob, registration, readonlyMac, true,
)
})
net.EnsureConnected(t.t, net.Alice, net.Bob)
t.t.Run("encumbered macaroon manipulate", func(tt *testing.T) {
registration := registerMiddleware(
tt, net.Alice, &lnrpc.MiddlewareRegistration{
MiddlewareName: "itest-interceptor",
CustomMacaroonCaveatName: "itest-caveat",
},
)
defer registration.cancel()
middlewareManipulationTest(
tt, net.Alice, net.Bob, registration, customCaveatMac,
false,
)
})
// And finally make sure mandatory middleware is always checked for any
// RPC request.
t.t.Run("mandatory middleware", func(tt *testing.T) {
middlewareMandatoryTest(tt, net.Alice, net)
})
}
// middlewareRegistrationRestrictionTests tests all restrictions that apply to
// registering a middleware.
func middlewareRegistrationRestrictionTests(t *testing.T,
node *lntest.HarnessNode) {
testCases := []struct {
registration *lnrpc.MiddlewareRegistration
expectedErr string
}{{
registration: &lnrpc.MiddlewareRegistration{
MiddlewareName: "foo",
},
expectedErr: "invalid middleware name",
}, {
registration: &lnrpc.MiddlewareRegistration{
MiddlewareName: "itest-interceptor",
CustomMacaroonCaveatName: "foo",
},
expectedErr: "custom caveat name of at least",
}, {
registration: &lnrpc.MiddlewareRegistration{
MiddlewareName: "itest-interceptor",
CustomMacaroonCaveatName: "itest-caveat",
ReadOnlyMode: true,
},
expectedErr: "cannot set read-only and custom caveat name",
}}
for idx, tc := range testCases {
tc := tc
t.Run(fmt.Sprintf("%d", idx), func(tt *testing.T) {
invalidName := registerMiddleware(
tt, node, tc.registration,
)
_, err := invalidName.stream.Recv()
require.Error(tt, err)
require.Contains(tt, err.Error(), tc.expectedErr)
invalidName.cancel()
})
}
}
// middlewareInterceptionTest tests that unary and streaming requests can be
// intercepted. It also makes sure that depending on the mode (read-only or
// custom macaroon caveat) a middleware only gets access to the requests it
// should be allowed access to.
func middlewareInterceptionTest(t *testing.T, node *lntest.HarnessNode,
peer *lntest.HarnessNode, registration *middlewareHarness,
userMac *macaroon.Macaroon, disallowedMac *macaroon.Macaroon,
readOnly bool) {
// Everything we test here should be executed in a matter of
// milliseconds, so we can use one single timeout context for all calls.
ctxb := context.Background()
ctxc, cancel := context.WithTimeout(ctxb, defaultTimeout)
defer cancel()
// Create a client connection that we'll use to simulate user requests
// to lnd with.
cleanup, client := macaroonClient(t, node, userMac)
defer cleanup()
// We're going to send a simple RPC request to list all channels.
// We need to invoke the intercept logic in a goroutine because we'd
// block the execution of the main task otherwise.
req := &lnrpc.ListChannelsRequest{ActiveOnly: true}
go registration.interceptUnary(
"/lnrpc.Lightning/ListChannels", req, nil,
)
// Do the actual call now and wait for the interceptor to do its thing.
resp, err := client.ListChannels(ctxc, req)
require.NoError(t, err)
// Did we receive the correct intercept message?
assertInterceptedType(t, resp, <-registration.responsesChan)
// Let's test the same for a streaming endpoint.
req2 := &lnrpc.PeerEventSubscription{}
go registration.interceptStream(
"/lnrpc.Lightning/SubscribePeerEvents", req2, nil,
)
// Do the actual call now and wait for the interceptor to do its thing.
peerCtx, peerCancel := context.WithCancel(ctxb)
resp2, err := client.SubscribePeerEvents(peerCtx, req2)
require.NoError(t, err)
// Disconnect Bob to trigger a peer event without using Alice's RPC
// interface itself.
_, err = peer.DisconnectPeer(ctxc, &lnrpc.DisconnectPeerRequest{
PubKey: node.PubKeyStr,
})
require.NoError(t, err)
peerEvent, err := resp2.Recv()
require.NoError(t, err)
require.Equal(t, lnrpc.PeerEvent_PEER_OFFLINE, peerEvent.GetType())
// Stop the peer stream again, otherwise we'll produce more events.
peerCancel()
// Did we receive the correct intercept message?
assertInterceptedType(t, peerEvent, <-registration.responsesChan)
// Make sure that with the other macaroon we aren't allowed to access
// the interceptor. If we registered for read-only access then there is
// no middleware that handles the custom macaroon caveat. If we
// registered for a custom caveat then there is no middleware that
// handles unencumbered read-only access.
cleanup, client = macaroonClient(t, node, disallowedMac)
defer cleanup()
// We need to make sure we don't get any interception messages for
// requests made with the disallowed macaroon.
var (
errChan = make(chan error, 1)
msgChan = make(chan *lnrpc.RPCMiddlewareRequest, 1)
)
go func() {
req, err := registration.stream.Recv()
if err != nil {
errChan <- err
return
}
msgChan <- req
}()
// Let's invoke the same request again but with the other macaroon.
resp, err = client.ListChannels(ctxc, req)
// Depending on what mode we're in, we expect something different. If we
// are in read-only mode then an encumbered macaroon cannot be used
// since there is no middleware registered for it. If we registered for
// a custom macaroon caveat and a request with a non-encumbered macaroon
// comes in, we expect to just not get any intercept messages.
if readOnly {
require.Error(t, err)
require.Contains(
t, err.Error(), "cannot accept macaroon with custom "+
"caveat 'itest-caveat', no middleware "+
"registered",
)
} else {
require.NoError(t, err)
// We disconnected Bob so there should be no active channels.
require.Len(t, resp.Channels, 0)
}
// There should be neither an error nor any interception messages in the
// channels.
select {
case err := <-errChan:
t.Fatalf("Unexpected error, not expecting messages: %v", err)
case msg := <-msgChan:
t.Fatalf("Unexpected intercept message: %v", msg)
case <-time.After(time.Second):
// Nothing came in for a second, we're fine.
}
}
// middlewareManipulationTest tests that unary and streaming requests can be
// intercepted and also manipulated, at least if the middleware didn't register
// for read-only access.
func middlewareManipulationTest(t *testing.T, node *lntest.HarnessNode,
peer *lntest.HarnessNode, registration *middlewareHarness,
userMac *macaroon.Macaroon, readOnly bool) {
// Everything we test here should be executed in a matter of
// milliseconds, so we can use one single timeout context for all calls.
ctxb := context.Background()
ctxc, cancel := context.WithTimeout(ctxb, defaultTimeout)
defer cancel()
// Create a client connection that we'll use to simulate user requests
// to lnd with.
cleanup, client := macaroonClient(t, node, userMac)
defer cleanup()
// We're going to attempt to replace the response with our own. But
// since we only registered for read-only access, our replacement should
// just be ignored.
replacementResponse := &lnrpc.ListChannelsResponse{
Channels: []*lnrpc.Channel{{
ChannelPoint: "f000:0",
}, {
ChannelPoint: "f000:1",
}},
}
// We're going to send a simple RPC request to list all channels.
// We need to invoke the intercept logic in a goroutine because we'd
// block the execution of the main task otherwise.
req := &lnrpc.ListChannelsRequest{ActiveOnly: true}
go registration.interceptUnary(
"/lnrpc.Lightning/ListChannels", req, replacementResponse,
)
// Do the actual call now and wait for the interceptor to do its thing.
resp, err := client.ListChannels(ctxc, req)
require.NoError(t, err)
// Did we get the manipulated response (2 fake channels) or the original
// one (1 channel)?
if readOnly {
require.Len(t, resp.Channels, 1)
} else {
require.Len(t, resp.Channels, 2)
}
// Let's test the same for a streaming endpoint.
replacementResponse2 := &lnrpc.PeerEvent{
Type: lnrpc.PeerEvent_PEER_ONLINE,
PubKey: "foo",
}
req2 := &lnrpc.PeerEventSubscription{}
go registration.interceptStream(
"/lnrpc.Lightning/SubscribePeerEvents", req2,
replacementResponse2,
)
// Do the actual call now and wait for the interceptor to do its thing.
peerCtx, peerCancel := context.WithCancel(ctxb)
resp2, err := client.SubscribePeerEvents(peerCtx, req2)
require.NoError(t, err)
// Disconnect Bob to trigger a peer event without using Alice's RPC
// interface itself.
_, err = peer.DisconnectPeer(ctxc, &lnrpc.DisconnectPeerRequest{
PubKey: node.PubKeyStr,
})
require.NoError(t, err)
peerEvent, err := resp2.Recv()
require.NoError(t, err)
// Did we get the correct, original response?
if readOnly {
require.Equal(
t, lnrpc.PeerEvent_PEER_OFFLINE, peerEvent.GetType(),
)
require.Equal(t, peer.PubKeyStr, peerEvent.PubKey)
} else {
require.Equal(
t, lnrpc.PeerEvent_PEER_ONLINE, peerEvent.GetType(),
)
require.Equal(t, "foo", peerEvent.PubKey)
}
// Stop the peer stream again, otherwise we'll produce more events.
peerCancel()
}
// middlewareMandatoryTest tests that all RPC requests are blocked if there is
// a mandatory middleware declared that's currently not registered.
func middlewareMandatoryTest(t *testing.T, node *lntest.HarnessNode,
net *lntest.NetworkHarness) {
// Let's declare our itest interceptor as mandatory but don't register
// it just yet. That should cause all RPC requests to fail, except for
// the registration itself.
node.Cfg.ExtraArgs = append(
node.Cfg.ExtraArgs,
"--rpcmiddleware.addmandatory=itest-interceptor",
)
err := net.RestartNodeNoUnlock(node, nil, false)
require.NoError(t, err)
// The "wait for node to start" flag of the above restart does too much
// and has a call to GetInfo built in, which will fail in this special
// test case. So we need to do the wait and client setup manually here.
conn, err := node.ConnectRPC(true)
require.NoError(t, err)
node.InitRPCClients(conn)
err = node.WaitUntilStateReached(lnrpc.WalletState_RPC_ACTIVE)
require.NoError(t, err)
node.LightningClient = lnrpc.NewLightningClient(conn)
ctxb := context.Background()
ctxc, cancel := context.WithTimeout(ctxb, defaultTimeout)
defer cancel()
// Test a unary request first.
_, err = node.ListChannels(ctxc, &lnrpc.ListChannelsRequest{})
require.Error(t, err)
require.Contains(
t, err.Error(), "middleware 'itest-interceptor' is "+
"currently not registered",
)
// Then a streaming one.
stream, err := node.SubscribeInvoices(ctxc, &lnrpc.InvoiceSubscription{})
require.NoError(t, err)
_, err = stream.Recv()
require.Error(t, err)
require.Contains(
t, err.Error(), "middleware 'itest-interceptor' is "+
"currently not registered",
)
// Now let's register the middleware and try again.
registration := registerMiddleware(
t, node, &lnrpc.MiddlewareRegistration{
MiddlewareName: "itest-interceptor",
CustomMacaroonCaveatName: "itest-caveat",
},
)
defer registration.cancel()
// Both the unary and streaming requests should now be allowed.
time.Sleep(500 * time.Millisecond)
_, err = node.ListChannels(ctxc, &lnrpc.ListChannelsRequest{})
require.NoError(t, err)
_, err = node.SubscribeInvoices(ctxc, &lnrpc.InvoiceSubscription{})
require.NoError(t, err)
// We now shut down the node manually to prevent the test from failing
// because we can't call the stop RPC if we unregister the middleware in
// the defer statement above.
err = net.ShutdownNode(node)
require.NoError(t, err)
}
// assertInterceptedType makes sure that the intercept message sent by the RPC
// interceptor is correct for a proto message that was sent or received over the
// RPC interface.
func assertInterceptedType(t *testing.T, rpcMessage proto.Message,
interceptMessage *lnrpc.RPCMessage) {
t.Helper()
require.Equal(
t, string(proto.MessageName(rpcMessage)),
interceptMessage.TypeName,
)
rawRequest, err := proto.Marshal(rpcMessage)
require.NoError(t, err)
// Make sure we don't trip over nil vs. empty slice in the equality
// check below.
if len(rawRequest) == 0 {
rawRequest = nil
}
require.Equal(t, rawRequest, interceptMessage.Serialized)
}
// middlewareStream is a type alias to shorten the long definition.
type middlewareStream lnrpc.Lightning_RegisterRPCMiddlewareClient
// middlewareHarness is a test harness that holds one instance of a simulated
// middleware.
type middlewareHarness struct {
t *testing.T
cancel func()
stream middlewareStream
responsesChan chan *lnrpc.RPCMessage
}
// registerMiddleware creates a new middleware harness and sends the initial
// register message to the RPC server.
func registerMiddleware(t *testing.T, node *lntest.HarnessNode,
registration *lnrpc.MiddlewareRegistration) *middlewareHarness {
ctxc, cancel := context.WithCancel(context.Background())
middlewareStream, err := node.RegisterRPCMiddleware(ctxc)
require.NoError(t, err)
err = middlewareStream.Send(&lnrpc.RPCMiddlewareResponse{
MiddlewareMessage: &lnrpc.RPCMiddlewareResponse_Register{
Register: registration,
},
})
require.NoError(t, err)
return &middlewareHarness{
t: t,
cancel: cancel,
stream: middlewareStream,
responsesChan: make(chan *lnrpc.RPCMessage),
}
}
// interceptUnary intercepts a unary call, optionally requesting to replace the
// response sent to the client. A unary call is expected to receive one
// intercept message for the request and one for the response.
//
// NOTE: Must be called in a goroutine as this will block until the response is
// read from the response channel.
func (h *middlewareHarness) interceptUnary(methodURI string,
expectedRequest proto.Message, responseReplacement proto.Message) {
// Read intercept message and make sure it's for an RPC request.
reqIntercept, err := h.stream.Recv()
require.NoError(h.t, err)
req := reqIntercept.GetRequest()
require.NotNil(h.t, req)
// We know the request we're going to send so make sure we get the right
// type and content from the interceptor.
require.Equal(h.t, methodURI, req.MethodFullUri)
assertInterceptedType(h.t, expectedRequest, req)
// We need to accept the request.
h.sendAccept(reqIntercept.MsgId, nil)
// Now read the intercept message for the response.
respIntercept, err := h.stream.Recv()
require.NoError(h.t, err)
res := respIntercept.GetResponse()
require.NotNil(h.t, res)
// We expect the request ID to be the same for the request intercept
// and the response intercept messages. But the message IDs must be
// different/unique.
require.Equal(h.t, reqIntercept.RequestId, respIntercept.RequestId)
require.NotEqual(h.t, reqIntercept.MsgId, respIntercept.MsgId)
// We need to accept the response as well.
h.sendAccept(respIntercept.MsgId, responseReplacement)
h.responsesChan <- res
}
// interceptStream intercepts a streaming call, optionally requesting to replace
// the (first) response sent to the client. A streaming call is expected to
// receive one intercept message for the stream authentication, one for the
// first request and one for the first response.
//
// NOTE: Must be called in a goroutine as this will block until the first
// response is read from the response channel.
func (h *middlewareHarness) interceptStream(methodURI string,
expectedRequest proto.Message, responseReplacement proto.Message) {
// Read intercept message and make sure it's for an RPC stream auth.
authIntercept, err := h.stream.Recv()
require.NoError(h.t, err)
auth := authIntercept.GetStreamAuth()
require.NotNil(h.t, auth)
// This is just the authentication, so we can only look at the URI.
require.Equal(h.t, methodURI, auth.MethodFullUri)
// We need to accept the auth.
h.sendAccept(authIntercept.MsgId, nil)
// Read intercept message and make sure it's for an RPC request.
reqIntercept, err := h.stream.Recv()
require.NoError(h.t, err)
req := reqIntercept.GetRequest()
require.NotNil(h.t, req)
// We know the request we're going to send so make sure we get the right
// type and content from the interceptor.
require.Equal(h.t, methodURI, req.MethodFullUri)
assertInterceptedType(h.t, expectedRequest, req)
// We need to accept the request.
h.sendAccept(reqIntercept.MsgId, nil)
// Now read the intercept message for the response.
respIntercept, err := h.stream.Recv()
require.NoError(h.t, err)
res := respIntercept.GetResponse()
require.NotNil(h.t, res)
// We expect the request ID to be the same for the auth intercept,
// request intercept and the response intercept messages. But the
// message IDs must be different/unique.
require.Equal(h.t, authIntercept.RequestId, respIntercept.RequestId)
require.Equal(h.t, reqIntercept.RequestId, respIntercept.RequestId)
require.NotEqual(h.t, authIntercept.MsgId, reqIntercept.MsgId)
require.NotEqual(h.t, authIntercept.MsgId, respIntercept.MsgId)
require.NotEqual(h.t, reqIntercept.MsgId, respIntercept.MsgId)
// We need to accept the response as well.
h.sendAccept(respIntercept.MsgId, responseReplacement)
h.responsesChan <- res
}
// sendAccept sends an accept feedback to the RPC server.
func (h *middlewareHarness) sendAccept(msgID uint64,
responseReplacement proto.Message) {
var replacementBytes []byte
if responseReplacement != nil {
var err error
replacementBytes, err = proto.Marshal(responseReplacement)
require.NoError(h.t, err)
}
err := h.stream.Send(&lnrpc.RPCMiddlewareResponse{
MiddlewareMessage: &lnrpc.RPCMiddlewareResponse_Feedback{
Feedback: &lnrpc.InterceptFeedback{
ReplaceResponse: len(replacementBytes) > 0,
ReplacementSerialized: replacementBytes,
},
},
RefMsgId: msgID,
})
require.NoError(h.t, err)
}
|
using Microsoft.AspNetCore.Mvc.RazorPages;
namespace AspNetCoreIdentityFido2Mfa.Areas.Identity.Pages.Account.Manage
{
public class MfaModel : PageModel
{
public void OnGet()
{
}
public void OnPost()
{
}
}
}
|
module Day01 where
import AOC
main = do
ns <- parseInput $ eachLine number
print $ sum ns
print $ firstDuplicate $ scanl (+) (0 :: Integer) $ cycle ns
|
package org.kynosarges.tektosyne.demo;
import javafx.application.Application;
import javafx.event.*;
import javafx.geometry.Insets;
import javafx.scene.Scene;
import javafx.scene.control.*;
import javafx.scene.input.*;
import javafx.scene.layout.*;
import javafx.stage.Stage;
/**
* Defines the Tektosyne Demo application for JavaFX.
* @author Christoph Nahr
* @version 6.1.0
*/
public class TektosyneDemo extends Application {
/**
* Starts the {@link TektosyneDemo} application.
* @param primaryStage the primary {@link Stage} for the application
*/
@Override
public void start(Stage primaryStage) {
Global.setPrimaryStage(primaryStage);
final Label caption = new Label("Tektosyne Demo Application");
caption.setFont(Global.boldFont(16));
caption.setPadding(new Insets(8));
final Label message = new Label("Select a menu item to demonstrate Tektosyne features.");
message.setPadding(new Insets(8));
final VBox root = new VBox(createMenuBar(), caption, message);
final Scene scene = new Scene(root, 400, 300);
// background thread executor must be shut down manually
primaryStage.setOnCloseRequest(t -> BenchmarkDialog.EXECUTOR.shutdownNow());
primaryStage.setTitle("Tektosyne Demo");
primaryStage.setResizable(false);
primaryStage.setScene(scene);
primaryStage.centerOnScreen();
primaryStage.show();
}
private static MenuBar createMenuBar() {
final MenuBar menu = new MenuBar();
final Menu fileMenu = createMenu("_File",
createMenuItem("_About", t -> new AboutDialog().showAndWait(), null),
createMenuItem("_Benchmarks", t -> new BenchmarkDialog().showAndWait(),
new KeyCodeCombination(KeyCode.B, KeyCombination.SHORTCUT_DOWN)),
new SeparatorMenuItem(),
createMenuItem("E_xit", t -> Global.primaryStage().close(), null)
);
final Menu geoMenu = createMenu("_Geometry",
createMenuItem("Convex _Hull", t -> new ConvexHullDialog().showAndWait(),
new KeyCodeCombination(KeyCode.H, KeyCombination.SHORTCUT_DOWN)),
createMenuItem("Line _Intersection", t -> new LineIntersectionDialog().showAndWait(),
new KeyCodeCombination(KeyCode.I, KeyCombination.SHORTCUT_DOWN)),
createMenuItem("_Point in Polygon", t -> new PointInPolygonDialog().showAndWait(),
new KeyCodeCombination(KeyCode.P, KeyCombination.SHORTCUT_DOWN)),
new SeparatorMenuItem(),
createMenuItem("_Subdivision", t -> new SubdivisionDialog().showAndWait(),
new KeyCodeCombination(KeyCode.S, KeyCombination.SHORTCUT_DOWN)),
createMenuItem("Subdivision In_tersection", t -> new SubdivisionInterDialog().showAndWait(),
new KeyCodeCombination(KeyCode.T, KeyCombination.SHORTCUT_DOWN)),
createMenuItem("_Voronoi & Delaunay", t -> new VoronoiDialog().showAndWait(),
new KeyCodeCombination(KeyCode.V, KeyCombination.SHORTCUT_DOWN))
);
final Menu graphMenu = createMenu("_Polygon & Graph",
createMenuItem("_Regular Polygon", t -> new RegularPolygonDialog().showAndWait(),
new KeyCodeCombination(KeyCode.R, KeyCombination.SHORTCUT_DOWN, KeyCombination.SHIFT_DOWN)),
createMenuItem("Polygon _Grid", t -> new PolygonGridDialog().showAndWait(),
new KeyCodeCombination(KeyCode.G, KeyCombination.SHORTCUT_DOWN, KeyCombination.SHIFT_DOWN)),
createMenuItem("_Save & Print Grid", t -> new MakeGridDialog().showAndWait(),
new KeyCodeCombination(KeyCode.S, KeyCombination.SHORTCUT_DOWN, KeyCombination.SHIFT_DOWN)),
new SeparatorMenuItem(),
createMenuItem("Graph _Algorithms", t -> new GraphDialog().showAndWait(),
new KeyCodeCombination(KeyCode.A, KeyCombination.SHORTCUT_DOWN, KeyCombination.SHIFT_DOWN))
);
menu.getMenus().addAll(fileMenu, geoMenu, graphMenu);
return menu;
}
private static Menu createMenu(String title, MenuItem... items) {
final Menu menu = new Menu(title);
menu.setMnemonicParsing(true);
menu.getItems().addAll(items);
return menu;
}
private static MenuItem createMenuItem(String text,
EventHandler<ActionEvent> onAction, KeyCombination key) {
final MenuItem item = new MenuItem(text);
item.setMnemonicParsing(true);
item.setOnAction(onAction);
if (key != null) item.setAccelerator(key);
return item;
}
/**
* Launches the {@link TektosyneDemo} application.
* @param args the command line arguments
*/
public static void main(String[] args) {
launch(args);
}
}
|
# Phrase
class Phrase
attr_reader :phrase
def initialize(phrase)
@phrase = phrase
end
def word_count
words.each_with_object(Hash.new(0)) { |word, counts| counts[word] += 1 }
end
private
def words
phrase.downcase.gsub(/[^'a-z0-9\s]/i, ' ').split(' ')
end
end
|
package com.vanniktech.espresso.core.utils;
import android.graphics.drawable.Drawable;
import androidx.annotation.CheckResult;
import androidx.annotation.DrawableRes;
import androidx.test.espresso.matcher.BoundedMatcher;
import android.view.View;
import android.widget.ImageView;
import org.hamcrest.Description;
import static com.vanniktech.espresso.core.utils.Utils.NO_DRAWABLE;
import static com.vanniktech.espresso.core.utils.Utils.drawableMatches;
public final class DrawableMatcher extends BoundedMatcher<View, ImageView> {
/**
* Matches that the given view has the expected drawable.
*
* <p>Example usage:</p>
* <code>onView(withId(R.id.view)).check(matches(withDrawable(R.drawable.android)));</code>
*/
@CheckResult public static DrawableMatcher withDrawable(@DrawableRes final int resourceId) {
return new DrawableMatcher(resourceId);
}
/**
* Matches that the given view has no drawable.
*
* <p>Example usage:</p>
* <code>onView(withId(R.id.view)).check(matches(withNoDrawable()));</code>
*/
@CheckResult public static DrawableMatcher withNoDrawable() {
return new DrawableMatcher(NO_DRAWABLE);
}
private final int expectedId;
private DrawableMatcher(final int expectedId) {
super(ImageView.class);
this.expectedId = expectedId;
}
@Override protected boolean matchesSafely(final ImageView imageView) {
final Drawable drawable = imageView.getDrawable();
return drawableMatches(imageView, drawable, expectedId);
}
@Override public void describeTo(final Description description) {
if (expectedId == NO_DRAWABLE) {
description.appendText("with no drawable");
} else {
description.appendText("with drawable from resource id: ").appendValue(expectedId);
}
}
}
|
#!/usr/bin/env ruby
require 'whois'
require 'whois-parser'
require 'pry-byebug'
require 'colorize'
require 'time_difference'
domain = ARGV[0]
record = Whois.whois(domain)
parser = record.parser
created_on = parser.created_on if parser.registered?
lookup_result = parser.available? ? "available!" : "occupied"
time_diff = TimeDifference.between(created_on, Time.now).humanize if created_on
yehooo = "\u{1f600}"
shit = "\u{1f4a9}"
puts "Domain '#{domain}' is #{lookup_result} #{time_diff}"
# Add support for multiple arguments
# ARGV.each do|argument|
# puts "Argument: #{argument}"
# end
## Notices
# Try to use OptionParser
# https://ruby-doc.org/stdlib-2.7.1/libdoc/optparse/rdoc/OptionParser.html
# Try to use cliqr
# https://github.com/anshulverma/cliqr
|
import numpy as np
import matplotlib.pyplot as plt
from skimage import io
def laplace_sharpen(input_image, c, choice):
input_image_cp = io.imread(input_image)
m, n = input_image_cp.shape
input_image_cp = np.pad(input_image_cp, ((1, 1), (1, 1)), mode='constant', constant_values=0)
process_image = np.zeros(input_image_cp.shape)
scaled_image = np.copy(input_image_cp)
output_image = np.copy(input_image_cp)
laplace_filter1 = np.array([
[0, 1, 0],
[1, -4, 1],
[0, 1, 0],
])
laplace_filter2 = np.array([
[1, 1, 1],
[1, -8, 1],
[1, 1, 1],
])
if choice == 1:
laplace_filter = laplace_filter1
name = input_image.replace(".tif", "") + " laplace_sharpen_type 1_11810506.tif"
else:
laplace_filter = laplace_filter2
name = input_image.replace(".tif", "") + " laplace_sharpen_type 2_11810506.tif"
for i in range(1, m + 1):
for j in range(1, n + 1):
process_image[i, j] = np.sum(laplace_filter * input_image_cp[i - 1:i + 2, j - 1:j + 2])
a = np.min(process_image)
b = 255 / np.max(process_image)
for i in range(1, m + 1):
for j in range(1, n + 1):
scaled_image[i, j] = int((process_image[i, j] - a) * b)
output_image[i, j] = input_image_cp[i, j] + c * process_image[i, j]
if process_image[i, j] < 0:
process_image[i, j] = 0
# Strip the 1-pixel zero padding to recover the original m x n size.
process_image = process_image[1:m + 1, 1:n + 1]
scaled_image = scaled_image[1:m + 1, 1:n + 1]
output_image = output_image[1:m + 1, 1:n + 1]
plt.figure(figsize=(6, 6))
plt.subplot(221)
plt.title('input_image', fontsize=10)
plt.imshow(input_image_cp, cmap='gray')
plt.subplot(222)
plt.title('laplace_image', fontsize=10)
plt.imshow(process_image, cmap='gray')
plt.subplot(223)
plt.title('scaled_image', fontsize=10)
plt.imshow(scaled_image, cmap='gray')
plt.subplot(224)
plt.title('output_image', fontsize=10)
plt.imshow(output_image, cmap='gray')
plt.savefig(name)
plt.close()
return 0
|
<?php
/**
* Created by PhpStorm.
* User: enesdayanc
* Date: 18/08/2017
* Time: 11:40
*/
namespace PaymentGateway\VPosGaranti\Constant;
class MotoInd
{
const MAIL_ORDER = 'Y';
const E_COMMERCE = 'N';
}
|
using System;
using System.Collections.Generic;
using System.Linq;
namespace CustomerTestsExcel.ExcelToCode
{
public class ExcelToCodeState
{
public ExcelToCodeGiven Given { get; }
public ExcelToCodeTable Table { get; }
public ExcelToCodeSimpleProperty SimpleProperty { get; }
public ExcelToCodeComplexProperty ComplexProperty { get; }
public ExcelToCodeList List { get; }
public ExcelToCodeWhen When { get; }
public ExcelToCodeThen Then { get; }
public LogState Log { get; }
public ExcelState Excel { get; }
public CodeState Code { get; }
public ICodeNameToExcelNameConverter Converter { get; }
public ExcelToCodeState(ICodeNameToExcelNameConverter converter)
{
Converter = converter ?? throw new ArgumentNullException(nameof(converter));
Log = new LogState();
Code = new CodeState();
Excel = new ExcelState();
Given = new ExcelToCodeGiven(this);
Table = new ExcelToCodeTable(this);
SimpleProperty = new ExcelToCodeSimpleProperty(this);
ComplexProperty = new ExcelToCodeComplexProperty(this);
List = new ExcelToCodeList(this);
When = new ExcelToCodeWhen(this);
Then = new ExcelToCodeThen(this);
}
public void Initialise(ITabularPage worksheet)
{
Excel.Initialise(worksheet);
Code.Initialise();
Log.Initialise();
}
}
}
|
#!/usr/bin/env ruby
require 'prophet'
require 'logger'
require 'yaml'
require_relative 'ci_executor'
Prophet.setup do |config|
# Setup Github access.
CONFIG_FILE = './options.yml'
if File.exists?(CONFIG_FILE)
options = YAML.load_file(CONFIG_FILE)
config.username_pass = options['default']['username_pass']
config.access_token_pass = options['default']['access_token_pass']
config.username_fail = options['default']['username_fail']
config.access_token_fail = options['default']['access_token_fail']
end
# Setup logging.
config.logger = log = @logger = Logger.new(STDOUT)
log.level = Logger::INFO
# Now that GitHub has fixed their notifications system, we can dare to increase
# Prophet's verbosity and use a new comment for every run.
config.reuse_comments = false
config.comment_failure = 'Prophet reports failure.'
config.comment_success = 'Well Done! Your tests are still passing.'
# Specify which tests to run. (Defaults to `rake test`.)
# NOTE: Either ensure the last call in that block runs your tests
# or manually set @result to a boolean inside this block.
config.execution do
executor = SCC::CiExecutor.new(logger: config.logger)
executor.run!
config.success = executor.success?
if config.success
log.info 'All tests are passing.'
else
config.comment_failure += "\n#{executor.fail_message}"
log.info 'Some tests are failing.'
executor.inspect_failed
raise RuntimeError, config.comment_failure
end
end
end
# Finally, run Prophet!
Prophet.run
|
package com.jayway.jsonpath.spi.transformer;
import java.text.MessageFormat;
import java.util.ResourceBundle;
public abstract class ValidationError {
private String errorCode;
private String description;
public ValidationError(String errorCode, ResourceBundle bundle) {
this.errorCode = errorCode;
this.description = bundle.containsKey(errorCode) ? bundle.getString(errorCode) : errorCode;
}
public ValidationError(String errorCode, ResourceBundle bundle, Object... params) {
this.errorCode = errorCode;
this.description = MessageFormat.format(bundle.getString(errorCode), params);
}
public String getErrorCode() {
return errorCode;
}
public String getDescription() {
return description;
}
@Override
public String toString() {
return new StringBuilder().append("\t").append("errorCode=")
.append(errorCode).append(" : ").append("description=")
.append(description).append("\n").toString();
}
}
|
# python-email-sending-with-attachment
This code demonstrates how to send email with attachments using Python.
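The technique the README describes can be sketched with only the Python standard library (`email.message` and `smtplib`). The SMTP host, port, and credentials below are placeholders, not values from this repository, and must be replaced before use:

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path


def build_email(sender, recipient, subject, body, attachment_path):
    """Build a MIME message with one file attachment."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    # Attach the file as a generic binary part; a real mailer might
    # guess the MIME type from the filename instead.
    data = Path(attachment_path).read_bytes()
    msg.add_attachment(
        data,
        maintype="application",
        subtype="octet-stream",
        filename=Path(attachment_path).name,
    )
    return msg


def send_email(msg, host="smtp.example.com", port=587, user=None, password=None):
    # Connect over STARTTLS and send; host and credentials are placeholders.
    with smtplib.SMTP(host, port) as smtp:
        smtp.starttls()
        if user is not None:
            smtp.login(user, password)
        smtp.send_message(msg)
```

Building the message separately from sending it makes the attachment logic testable without a live SMTP server.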
|
#!/usr/bin/env bash
CUDA_VISIBLE_DEVICES=0 python main.py \
--output_dir ./experiment_SRGAN_VGG54/ \
--summary_dir ./experiment_SRGAN_VGG54/log/ \
--mode train \
--is_training True \
--task SRGAN \
--batch_size 16 \
--flip True \
--random_crop True \
--crop_size 24 \
--input_dir_LR ./data/training_lr_images/ \
--input_dir_HR ./data/training_hr_images/ \
--num_resblock 16 \
--perceptual_mode VGG54 \
--name_queue_capacity 4096 \
--image_queue_capacity 4096 \
--ratio 0.001 \
--learning_rate 0.0001 \
--decay_step 100000 \
--decay_rate 0.1 \
--stair True \
--beta 0.9 \
--max_iter 200000 \
--queue_thread 10 \
--vgg_scaling 0.0061 \
--pre_trained_model True \
--checkpoint ./experiment_SRGAN_MSE/model-500000
|
from httplib import HTTPConnection
import mimetypes
import random
import string
class ApiInterface():
# Custom HTTP Headers
auth_header = "X-Authorization"
user_header = "X-User"
def __init__(self, user_name, auth_key, scan_host, scan_port):
"""
Constructor sets the connection info
The API key for a user can be found on PhageScan's Account page
"""
self.user = str(user_name)
self.api_key = str(auth_key)
self.port = str(scan_port)
self.host = str(scan_host)
def submit_sample(self, file_path):
"""
Given a file path, submit a sample to PhageScan
Returns JSON message with a sample id number (usable to retrieve scan results) or error message
"""
body, headers = self.encode_multipart_data({"sample": file_path})
conn = HTTPConnection("{0}:{1}".format(self.host,self.port))
conn.request("POST", "/api/upload/", body, headers)
result = conn.getresponse().read()
return result
def get_results(self, sample_id):
"""
Given a sample id, get results (if any)
Output is JSON formatted
"""
headers = {self.auth_header : self.api_key, self.user_header : self.user}
conn = HTTPConnection("{0}:{1}".format(self.host,self.port))
conn.request("GET", "/api/scanresult/" + str(sample_id), "", headers)
result = conn.getresponse().read()
return result
def encode_multipart_data(self, files):
"""
Helper function that forms the HTTP body and headers
"""
boundary = self.random_string (30)
def get_content_type(filename):
return mimetypes.guess_type (filename)[0] or 'application/octet-stream'
def encode_file(field_name):
filename = files [field_name]
return ('--{0}'.format(boundary),
'Content-Disposition: form-data; name="{0}"; filename="{1}"'.format(field_name, filename),
'Content-Type: {0}'.format(get_content_type(filename)),
'', open(filename, 'rb').read())
lines = []
for name in files:
lines.extend (encode_file (name))
lines.extend (('--{0}--'.format(boundary), ''))
body = '\r\n'.join(lines)
headers = {'content-type': 'multipart/form-data; boundary={0}'.format(boundary),
'content-length': str(len (body))}
headers["Accept"] = "text/json"
headers[self.auth_header] = self.api_key
headers[self.user_header] = self.user
return body, headers
def random_string(self, length):
"""
Helper function to generate the boundary
"""
return ''.join(random.choice(string.letters) for ii in range(length))
|
angular.module('info', [])
.controller('MainCtrl', ['$scope', '$http', '$location', function($scope, $http){
$http.get("/info/json/" + document.location.pathname.split('/').pop()).then(function(success){
$scope.profile = success.data.profile;
}, function(error){
$scope.profile = {};
});
}]);
|
lmxrk=22
nrows[1]=54000; nb[1]=1125; acc[1]=8; maxrank[1]=$lmxrk;
nrows[2]=81000; nb[2]=1125; acc[2]=8; maxrank[2]=$lmxrk;
nrows[3]=108000; nb[3]=1125; acc[3]=8; maxrank[3]=$lmxrk;
nrows[4]=135000; nb[4]=1125; acc[4]=8; maxrank[4]=$lmxrk;
nrows[5]=162000; nb[5]=1125; acc[5]=8; maxrank[5]=$lmxrk;
nrows[6]=189000; nb[6]=1125; acc[6]=8; maxrank[6]=$lmxrk;
nrows[7]=216000; nb[7]=1125; acc[7]=8; maxrank[7]=$lmxrk;
nrows[8]=243000; nb[8]=1125; acc[8]=8; maxrank[8]=$lmxrk;
nrows[9]=270000; nb[9]=1125; acc[9]=8; maxrank[9]=$lmxrk;
nrows[10]=297000; nb[10]=1125; acc[10]=8; maxrank[10]=$lmxrk;
nrows[11]=324000; nb[11]=1125; acc[11]=8; maxrank[11]=$lmxrk;
nrows[12]=351000; nb[12]=1125; acc[12]=8; maxrank[12]=$lmxrk;
nrows[13]=378000; nb[13]=1125; acc[13]=8; maxrank[13]=$lmxrk;
nrows[14]=405000; nb[14]=1125; acc[14]=8; maxrank[14]=$lmxrk;
nrows[15]=432000; nb[15]=1125; acc[15]=8; maxrank[15]=$lmxrk;
nrows[16]=459000; nb[16]=1125; acc[16]=8; maxrank[16]=$lmxrk;
nrows[17]=486000; nb[17]=1125; acc[17]=8; maxrank[17]=$lmxrk;
nrows[18]=513000; nb[18]=1125; acc[18]=8; maxrank[18]=$lmxrk;
nrows[19]=540000; nb[19]=1125; acc[19]=8; maxrank[19]=$lmxrk;
nrows[20]=567000; nb[20]=1125; acc[20]=8; maxrank[20]=$lmxrk;
nrows[21]=594000; nb[21]=1125; acc[21]=8; maxrank[21]=$lmxrk;
allcaseids[16]="`seq 1 21`"
nrows[1]=54000; nb[1]=1350; acc[1]=8; maxrank[1]=22;
nrows[2]=54000; nb[2]=1500; acc[2]=8; maxrank[2]=22;
nrows[3]=54000; nb[3]=1800; acc[3]=8; maxrank[3]=22;
nrows[4]=54000; nb[4]=2250; acc[4]=8; maxrank[4]=22;
nrows[5]=81000; nb[5]=1350; acc[5]=8; maxrank[5]=22;
nrows[6]=81000; nb[6]=1500; acc[6]=8; maxrank[6]=22;
nrows[7]=81000; nb[7]=1800; acc[7]=8; maxrank[7]=22;
nrows[8]=81000; nb[8]=2250; acc[8]=8; maxrank[8]=22;
nrows[9]=108000; nb[9]=1350; acc[9]=8; maxrank[9]=22;
nrows[10]=108000; nb[10]=1500; acc[10]=8; maxrank[10]=22;
nrows[11]=108000; nb[11]=1800; acc[11]=8; maxrank[11]=22;
nrows[12]=108000; nb[12]=2250; acc[12]=8; maxrank[12]=22;
nrows[13]=135000; nb[13]=1350; acc[13]=8; maxrank[13]=22;
nrows[14]=135000; nb[14]=1500; acc[14]=8; maxrank[14]=22;
nrows[15]=135000; nb[15]=1800; acc[15]=8; maxrank[15]=22;
nrows[16]=135000; nb[16]=2250; acc[16]=8; maxrank[16]=22;
nrows[17]=162000; nb[17]=1350; acc[17]=8; maxrank[17]=22;
nrows[18]=162000; nb[18]=1500; acc[18]=8; maxrank[18]=22;
nrows[19]=162000; nb[19]=1800; acc[19]=8; maxrank[19]=22;
nrows[20]=162000; nb[20]=2250; acc[20]=8; maxrank[20]=22;
nrows[21]=189000; nb[21]=1350; acc[21]=8; maxrank[21]=22;
nrows[22]=189000; nb[22]=1500; acc[22]=8; maxrank[22]=22;
nrows[23]=189000; nb[23]=1800; acc[23]=8; maxrank[23]=22;
nrows[24]=189000; nb[24]=2250; acc[24]=8; maxrank[24]=22;
nrows[25]=216000; nb[25]=1350; acc[25]=8; maxrank[25]=22;
nrows[26]=216000; nb[26]=1500; acc[26]=8; maxrank[26]=22;
nrows[27]=216000; nb[27]=1800; acc[27]=8; maxrank[27]=22;
nrows[28]=216000; nb[28]=2250; acc[28]=8; maxrank[28]=22;
nrows[29]=243000; nb[29]=1350; acc[29]=8; maxrank[29]=22;
nrows[30]=243000; nb[30]=1500; acc[30]=8; maxrank[30]=22;
nrows[31]=243000; nb[31]=1800; acc[31]=8; maxrank[31]=22;
nrows[32]=243000; nb[32]=2250; acc[32]=8; maxrank[32]=22;
nrows[33]=270000; nb[33]=1350; acc[33]=8; maxrank[33]=22;
nrows[34]=270000; nb[34]=1500; acc[34]=8; maxrank[34]=22;
nrows[35]=270000; nb[35]=1800; acc[35]=8; maxrank[35]=22;
nrows[36]=270000; nb[36]=2250; acc[36]=8; maxrank[36]=22;
nrows[37]=297000; nb[37]=1350; acc[37]=8; maxrank[37]=22;
nrows[38]=297000; nb[38]=1500; acc[38]=8; maxrank[38]=22;
nrows[39]=297000; nb[39]=1800; acc[39]=8; maxrank[39]=22;
nrows[40]=297000; nb[40]=2250; acc[40]=8; maxrank[40]=22;
nrows[41]=324000; nb[41]=1350; acc[41]=8; maxrank[41]=22;
nrows[42]=324000; nb[42]=1500; acc[42]=8; maxrank[42]=22;
nrows[43]=324000; nb[43]=1800; acc[43]=8; maxrank[43]=22;
nrows[44]=324000; nb[44]=2250; acc[44]=8; maxrank[44]=22;
nrows[45]=351000; nb[45]=1350; acc[45]=8; maxrank[45]=22;
nrows[46]=351000; nb[46]=1500; acc[46]=8; maxrank[46]=22;
nrows[47]=351000; nb[47]=1800; acc[47]=8; maxrank[47]=22;
nrows[48]=351000; nb[48]=2250; acc[48]=8; maxrank[48]=22;
nrows[49]=378000; nb[49]=1350; acc[49]=8; maxrank[49]=22;
nrows[50]=378000; nb[50]=1500; acc[50]=8; maxrank[50]=22;
nrows[51]=378000; nb[51]=1800; acc[51]=8; maxrank[51]=22;
nrows[52]=378000; nb[52]=2250; acc[52]=8; maxrank[52]=22;
nrows[53]=405000; nb[53]=1350; acc[53]=8; maxrank[53]=22;
nrows[54]=405000; nb[54]=1500; acc[54]=8; maxrank[54]=22;
nrows[55]=405000; nb[55]=1800; acc[55]=8; maxrank[55]=22;
nrows[56]=405000; nb[56]=2250; acc[56]=8; maxrank[56]=22;
nrows[57]=432000; nb[57]=1350; acc[57]=8; maxrank[57]=22;
nrows[58]=432000; nb[58]=1500; acc[58]=8; maxrank[58]=22;
nrows[59]=432000; nb[59]=1800; acc[59]=8; maxrank[59]=22;
nrows[60]=432000; nb[60]=2250; acc[60]=8; maxrank[60]=22;
nrows[61]=459000; nb[61]=1350; acc[61]=8; maxrank[61]=22;
nrows[62]=459000; nb[62]=1500; acc[62]=8; maxrank[62]=22;
nrows[63]=459000; nb[63]=1800; acc[63]=8; maxrank[63]=22;
nrows[64]=459000; nb[64]=2250; acc[64]=8; maxrank[64]=22;
nrows[65]=486000; nb[65]=1350; acc[65]=8; maxrank[65]=22;
nrows[66]=486000; nb[66]=1500; acc[66]=8; maxrank[66]=22;
nrows[67]=486000; nb[67]=1800; acc[67]=8; maxrank[67]=22;
nrows[68]=486000; nb[68]=2250; acc[68]=8; maxrank[68]=22;
nrows[69]=513000; nb[69]=1350; acc[69]=8; maxrank[69]=22;
nrows[70]=513000; nb[70]=1500; acc[70]=8; maxrank[70]=22;
nrows[71]=513000; nb[71]=1800; acc[71]=8; maxrank[71]=22;
nrows[72]=513000; nb[72]=2250; acc[72]=8; maxrank[72]=22;
nrows[73]=540000; nb[73]=1350; acc[73]=8; maxrank[73]=22;
nrows[74]=540000; nb[74]=1500; acc[74]=8; maxrank[74]=22;
nrows[75]=540000; nb[75]=1800; acc[75]=8; maxrank[75]=22;
nrows[76]=540000; nb[76]=2250; acc[76]=8; maxrank[76]=22;
nrows[77]=567000; nb[77]=1350; acc[77]=8; maxrank[77]=22;
nrows[78]=567000; nb[78]=1500; acc[78]=8; maxrank[78]=22;
nrows[79]=567000; nb[79]=1800; acc[79]=8; maxrank[79]=22;
nrows[80]=567000; nb[80]=2250; acc[80]=8; maxrank[80]=22;
nrows[81]=594000; nb[81]=1350; acc[81]=8; maxrank[81]=22;
nrows[82]=594000; nb[82]=1500; acc[82]=8; maxrank[82]=22;
nrows[83]=594000; nb[83]=1800; acc[83]=8; maxrank[83]=22;
nrows[84]=594000; nb[84]=2250; acc[84]=8; maxrank[84]=22;
note="Hicma rnd"
allcaseids[16]="`seq 1 84`"
#allcaseids[16]="1"
step=1
nprocs="16"
_appdata="--rnd"; _decay=0.41;timelimit="06:00:00"
_compmaxrank=22
|
class Client {
String id;
String email;
String name;
String lastName;
Client(this.email, this.name, this.lastName);
Client.fromMap(String id, Map<String, dynamic> data) {
this.id = id;
email = data["email"];
name = data["name"];
lastName = data["lastName"];
}
Map<String, dynamic> toMap() =>
{"email": email, "name": name, "lastName": lastName};
}
|
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace Guppy.UI.Utilities.Units
{
/// <summary>
/// A simple static pixel value that
/// never changes.
/// </summary>
public class PixelUnit : MultiUnit
{
#region Private Fields
private Int32 _value;
#endregion
#region Constructors
public PixelUnit(Int32 value, params Unit[] padding) : base(padding)
{
_value = value;
}
#endregion
#region Unit Implementation
/// <inheritdoc />
public override Unit Flip()
=> new PixelUnit(-_value, base.Flip());
/// <inheritdoc />
public override int ToPixel(int parent)
=> _value + base.ToPixel(parent);
        public override bool Equals(object obj)
        {
            if (obj is PixelUnit p && p._value == this._value)
                return base.Equals(obj);

            return false;
        }

        public override int GetHashCode()
            => _value.GetHashCode();
#endregion
}
}
|
@page
@model RegisterModel
@{
ViewData["Title"] = "Register";
}
<h1>@ViewData["Title"]</h1>
<div class="row">
<div class="col-md-4">
<form asp-route-returnUrl="@Model.ReturnUrl" method="post">
<h4>Create a new account.</h4>
<hr />
<div asp-validation-summary="All" class="text-danger"></div>
<div class="form-group">
<label asp-for="Input.Email"></label>
<input asp-for="Input.Email" class="form-control" />
<span asp-validation-for="Input.Email" class="text-danger"></span>
</div>
<div class="form-group">
<label asp-for="Input.FirstName"></label>
<input asp-for="Input.FirstName" class="form-control" />
<span asp-validation-for="Input.FirstName" class="text-danger"></span>
</div>
<div class="form-group">
<label asp-for="Input.LastName"></label>
<input asp-for="Input.LastName" class="form-control" />
<span asp-validation-for="Input.LastName" class="text-danger"></span>
</div>
<div class="form-group">
<label asp-for="Input.Phone"></label>
<input asp-for="Input.Phone" class="form-control" />
<span asp-validation-for="Input.Phone" class="text-danger"></span>
</div>
<div class="form-group">
<label asp-for="Input.UserRole"></label>
<select asp-for="Input.UserRole" class="form-control" id="RoleGroup">
<option value="None"> Choose role </option>
<option value="Analyst"> Analyst </option>
<option value="Farmer"> Farmer </option>
</select>
</div>
<div class="form-group" id="FarmGroup">
<label asp-for="Input.FarmID"></label>
<select asp-for="Input.FarmID" class="form-control" asp-items="ViewBag.Farms">
<option value=0> Choose farm </option>
</select>
</div>
<div class="form-group">
<label asp-for="Input.Password"></label>
<input asp-for="Input.Password" class="form-control" />
<span asp-validation-for="Input.Password" class="text-danger"></span>
</div>
<div class="form-group">
<label asp-for="Input.ConfirmPassword"></label>
<input asp-for="Input.ConfirmPassword" class="form-control" />
<span asp-validation-for="Input.ConfirmPassword" class="text-danger"></span>
</div>
<button type="submit" class="btn btn-primary">Register</button>
</form>
</div>
</div>
@section Scripts {
<partial name="_ValidationScriptsPartial" />
<script src="~/lib/jquery/dist/jquery.js" type="text/javascript"></script>
<script type="text/javascript" src="~/js/RegisterUserRole.js"></script>
}
|
package library.enrichment.messaging
import au.com.dius.pact.consumer.MessagePactBuilder
import au.com.dius.pact.consumer.dsl.PactDslJsonBody
import au.com.dius.pact.model.PactSpecVersion
import au.com.dius.pact.model.v3.messaging.Message
import com.nhaarman.mockitokotlin2.check
import com.nhaarman.mockitokotlin2.mock
import com.nhaarman.mockitokotlin2.verify
import library.enrichment.core.BookAddedEvent
import library.enrichment.core.BookAddedEventHandler
import org.assertj.core.api.Assertions.assertThat
import org.junit.jupiter.api.Test
import org.springframework.amqp.core.MessageProperties
import utils.classification.UnitTest
import utils.objectMapper
@UnitTest
internal class MessagingContractTest {
companion object {
const val pactContractFolder = "../library-service/src/test/pacts/message"
const val eventId = "aa1dc09f-7b64-4e7e-a6f6-7eb50dcd6e9d"
const val bookId = "9bf258be-19d4-4338-b172-60a1b7ef076b"
}
val configuration = MessagingConfiguration(mock(), mock())
val objectMapper = objectMapper()
val messageConverter = configuration.messageConverter(objectMapper)
val handler: BookAddedEventHandler = mock()
val cut = BookAddedEventMessageListener(objectMapper, handler)
@Test fun `book-added contract`() {
val pact = MessagePactBuilder
.consumer("library-enrichment")
.hasPactWith("library-service")
.expectsToReceive("'The Martian' was added event")
.withContent(PactDslJsonBody()
.stringType("id", eventId)
.stringType("bookId", bookId)
.stringValue("isbn", "9780091956141")
)
.toPact()
pact.messages.forEach {
val message = toMessage(readEvent(it), it.contentType)
cut.onMessage(message)
}
verify(handler).handle(check {
assertThat(it.id).isEqualTo(eventId)
assertThat(it.bookId).isEqualTo(bookId)
assertThat(it.isbn).isEqualTo("9780091956141")
})
pact.write(pactContractFolder, PactSpecVersion.V3)
}
private fun toMessage(event: BookAddedEvent, contentType: String): org.springframework.amqp.core.Message {
val properties = MessageProperties()
properties.contentType = contentType
properties.consumerQueue = MessagingConfiguration.BOOK_ADDED_QUEUE
return messageConverter.toMessage(event, properties)
}
private fun readEvent(it: Message) =
objectMapper.readValue(it.contentsAsBytes(), BookAddedEvent::class.java)
}
|
/* ========================================
* StereoFX - StereoFX.h
* Copyright (c) 2016 airwindows, All rights reserved
* ======================================== */
#ifndef __StereoFX_H
#include "StereoFX.h"
#endif
void StereoFX::processReplacing(float **inputs, float **outputs, VstInt32 sampleFrames)
{
float* in1 = inputs[0];
float* in2 = inputs[1];
float* out1 = outputs[0];
float* out2 = outputs[1];
double overallscale = 1.0;
overallscale /= 44100.0;
overallscale *= getSampleRate();
long double inputSampleL;
long double inputSampleR;
long double mid;
long double side;
//High Impact section
double stereowide = A;
double centersquish = C;
double density = stereowide * 2.4;
double sustain = 1.0 - (1.0/(1.0 + (density/7.0)));
//this way, enhance increases up to 50% and then mid falls off beyond that
double bridgerectifier;
double count;
//Highpass section
double iirAmount = pow(B,3)/overallscale;
double tight = -0.33333333333333;
double offset;
//we are setting it up so that to either extreme we can get an audible sound,
//but sort of scaled so small adjustments don't shift the cutoff frequency yet.
while (--sampleFrames >= 0)
{
inputSampleL = *in1;
inputSampleR = *in2;
if (inputSampleL<1.2e-38 && -inputSampleL<1.2e-38) {
static int noisesource = 0;
//this declares a variable before anything else is compiled. It won't keep assigning
//it to 0 for every sample, it's as if the declaration doesn't exist in this context,
//but it lets me add this denormalization fix in a single place rather than updating
//it in three different locations. The variable isn't thread-safe but this is only
//a random seed and we can share it with whatever.
noisesource = noisesource % 1700021; noisesource++;
int residue = noisesource * noisesource;
residue = residue % 170003; residue *= residue;
residue = residue % 17011; residue *= residue;
residue = residue % 1709; residue *= residue;
residue = residue % 173; residue *= residue;
residue = residue % 17;
double applyresidue = residue;
applyresidue *= 0.00000001;
applyresidue *= 0.00000001;
inputSampleL = applyresidue;
}
if (inputSampleR<1.2e-38 && -inputSampleR<1.2e-38) {
static int noisesource = 0;
noisesource = noisesource % 1700021; noisesource++;
int residue = noisesource * noisesource;
residue = residue % 170003; residue *= residue;
residue = residue % 17011; residue *= residue;
residue = residue % 1709; residue *= residue;
residue = residue % 173; residue *= residue;
residue = residue % 17;
double applyresidue = residue;
applyresidue *= 0.00000001;
applyresidue *= 0.00000001;
inputSampleR = applyresidue;
//this denormalization routine produces a white noise at -300 dB which the noise
//shaping will interact with to produce a bipolar output, but the noise is actually
//all positive. That should stop any variables from going denormal, and the routine
//only kicks in if digital black is input. As a final touch, if you save to 24-bit
//the silence will return to being digital black again.
}
//assign working variables
mid = inputSampleL + inputSampleR;
side = inputSampleL - inputSampleR;
//assign mid and side. Now, High Impact code
count = density;
while (count > 1.0)
{
bridgerectifier = fabs(side)*1.57079633;
if (bridgerectifier > 1.57079633) bridgerectifier = 1.57079633;
//max value for sine function
bridgerectifier = sin(bridgerectifier);
if (side > 0.0) side = bridgerectifier;
else side = -bridgerectifier;
count = count - 1.0;
}
//we have now accounted for any really high density settings.
bridgerectifier = fabs(side)*1.57079633;
if (bridgerectifier > 1.57079633) bridgerectifier = 1.57079633;
//max value for sine function
bridgerectifier = sin(bridgerectifier);
if (side > 0) side = (side*(1-count))+(bridgerectifier*count);
else side = (side*(1-count))-(bridgerectifier*count);
//blend according to density control
//done first density. Next, sustain-reducer
bridgerectifier = fabs(side)*1.57079633;
if (bridgerectifier > 1.57079633) bridgerectifier = 1.57079633;
bridgerectifier = (1-cos(bridgerectifier))*3.141592653589793;
if (side > 0) side = (side*(1-sustain))+(bridgerectifier*sustain);
else side = (side*(1-sustain))-(bridgerectifier*sustain);
//done with High Impact code
//now, Highpass code
offset = 0.666666666666666 + ((1-fabs(side))*tight);
if (offset < 0) offset = 0;
if (offset > 1) offset = 1;
if (flip)
{
iirSampleA = (iirSampleA * (1 - (offset * iirAmount))) + (side * (offset * iirAmount));
side = side - iirSampleA;
}
else
{
iirSampleB = (iirSampleB * (1 - (offset * iirAmount))) + (side * (offset * iirAmount));
side = side - iirSampleB;
}
//done with Highpass code
bridgerectifier = fabs(mid)/1.273239544735162;
if (bridgerectifier > 1.57079633) bridgerectifier = 1.57079633;
bridgerectifier = sin(bridgerectifier)*1.273239544735162;
if (mid > 0) mid = (mid*(1-centersquish))+(bridgerectifier*centersquish);
else mid = (mid*(1-centersquish))-(bridgerectifier*centersquish);
//done with the mid saturating section.
inputSampleL = (mid+side)/2.0;
inputSampleR = (mid-side)/2.0;
//stereo 32 bit dither, made small and tidy.
int expon; frexpf((float)inputSampleL, &expon);
long double dither = (rand()/(RAND_MAX*7.737125245533627e+25))*pow(2,expon+62);
inputSampleL += (dither-fpNShapeL); fpNShapeL = dither;
frexpf((float)inputSampleR, &expon);
dither = (rand()/(RAND_MAX*7.737125245533627e+25))*pow(2,expon+62);
inputSampleR += (dither-fpNShapeR); fpNShapeR = dither;
//end 32 bit dither
*out1 = inputSampleL;
*out2 = inputSampleR;
		in1++;
		in2++;
		out1++;
		out2++;
}
}
void StereoFX::processDoubleReplacing(double **inputs, double **outputs, VstInt32 sampleFrames)
{
double* in1 = inputs[0];
double* in2 = inputs[1];
double* out1 = outputs[0];
double* out2 = outputs[1];
double overallscale = 1.0;
overallscale /= 44100.0;
overallscale *= getSampleRate();
long double inputSampleL;
long double inputSampleR;
long double mid;
long double side;
//High Impact section
double stereowide = A;
double centersquish = C;
double density = stereowide * 2.4;
double sustain = 1.0 - (1.0/(1.0 + (density/7.0)));
//this way, enhance increases up to 50% and then mid falls off beyond that
double bridgerectifier;
double count;
//Highpass section
double iirAmount = pow(B,3)/overallscale;
double tight = -0.33333333333333;
double offset;
//we are setting it up so that to either extreme we can get an audible sound,
//but sort of scaled so small adjustments don't shift the cutoff frequency yet.
while (--sampleFrames >= 0)
{
inputSampleL = *in1;
inputSampleR = *in2;
if (inputSampleL<1.2e-38 && -inputSampleL<1.2e-38) {
static int noisesource = 0;
//this declares a variable before anything else is compiled. It won't keep assigning
//it to 0 for every sample, it's as if the declaration doesn't exist in this context,
//but it lets me add this denormalization fix in a single place rather than updating
//it in three different locations. The variable isn't thread-safe but this is only
//a random seed and we can share it with whatever.
noisesource = noisesource % 1700021; noisesource++;
int residue = noisesource * noisesource;
residue = residue % 170003; residue *= residue;
residue = residue % 17011; residue *= residue;
residue = residue % 1709; residue *= residue;
residue = residue % 173; residue *= residue;
residue = residue % 17;
double applyresidue = residue;
applyresidue *= 0.00000001;
applyresidue *= 0.00000001;
inputSampleL = applyresidue;
}
if (inputSampleR<1.2e-38 && -inputSampleR<1.2e-38) {
static int noisesource = 0;
noisesource = noisesource % 1700021; noisesource++;
int residue = noisesource * noisesource;
residue = residue % 170003; residue *= residue;
residue = residue % 17011; residue *= residue;
residue = residue % 1709; residue *= residue;
residue = residue % 173; residue *= residue;
residue = residue % 17;
double applyresidue = residue;
applyresidue *= 0.00000001;
applyresidue *= 0.00000001;
inputSampleR = applyresidue;
//this denormalization routine produces a white noise at -300 dB which the noise
//shaping will interact with to produce a bipolar output, but the noise is actually
//all positive. That should stop any variables from going denormal, and the routine
//only kicks in if digital black is input. As a final touch, if you save to 24-bit
//the silence will return to being digital black again.
}
//assign working variables
mid = inputSampleL + inputSampleR;
side = inputSampleL - inputSampleR;
//assign mid and side. Now, High Impact code
count = density;
while (count > 1.0)
{
bridgerectifier = fabs(side)*1.57079633;
if (bridgerectifier > 1.57079633) bridgerectifier = 1.57079633;
//max value for sine function
bridgerectifier = sin(bridgerectifier);
if (side > 0.0) side = bridgerectifier;
else side = -bridgerectifier;
count = count - 1.0;
}
//we have now accounted for any really high density settings.
bridgerectifier = fabs(side)*1.57079633;
if (bridgerectifier > 1.57079633) bridgerectifier = 1.57079633;
//max value for sine function
bridgerectifier = sin(bridgerectifier);
if (side > 0) side = (side*(1-count))+(bridgerectifier*count);
else side = (side*(1-count))-(bridgerectifier*count);
//blend according to density control
//done first density. Next, sustain-reducer
bridgerectifier = fabs(side)*1.57079633;
if (bridgerectifier > 1.57079633) bridgerectifier = 1.57079633;
bridgerectifier = (1-cos(bridgerectifier))*3.141592653589793;
if (side > 0) side = (side*(1-sustain))+(bridgerectifier*sustain);
else side = (side*(1-sustain))-(bridgerectifier*sustain);
//done with High Impact code
//now, Highpass code
offset = 0.666666666666666 + ((1-fabs(side))*tight);
if (offset < 0) offset = 0;
if (offset > 1) offset = 1;
if (flip)
{
iirSampleA = (iirSampleA * (1 - (offset * iirAmount))) + (side * (offset * iirAmount));
side = side - iirSampleA;
}
else
{
iirSampleB = (iirSampleB * (1 - (offset * iirAmount))) + (side * (offset * iirAmount));
side = side - iirSampleB;
}
//done with Highpass code
bridgerectifier = fabs(mid)/1.273239544735162;
if (bridgerectifier > 1.57079633) bridgerectifier = 1.57079633;
bridgerectifier = sin(bridgerectifier)*1.273239544735162;
if (mid > 0) mid = (mid*(1-centersquish))+(bridgerectifier*centersquish);
else mid = (mid*(1-centersquish))-(bridgerectifier*centersquish);
//done with the mid saturating section.
inputSampleL = (mid+side)/2.0;
inputSampleR = (mid-side)/2.0;
//stereo 64 bit dither, made small and tidy.
int expon; frexp((double)inputSampleL, &expon);
long double dither = (rand()/(RAND_MAX*7.737125245533627e+25))*pow(2,expon+62);
dither /= 536870912.0; //needs this to scale to 64 bit zone
inputSampleL += (dither-fpNShapeL); fpNShapeL = dither;
frexp((double)inputSampleR, &expon);
dither = (rand()/(RAND_MAX*7.737125245533627e+25))*pow(2,expon+62);
dither /= 536870912.0; //needs this to scale to 64 bit zone
inputSampleR += (dither-fpNShapeR); fpNShapeR = dither;
//end 64 bit dither
*out1 = inputSampleL;
*out2 = inputSampleR;
		in1++;
		in2++;
		out1++;
		out2++;
}
}
|
<?php
namespace App\Models;
use Illuminate\Database\Eloquent\Model;
/**
* App\Models\PassportScans
*
* @property int $id
* @property int $user_id
* @property string $path
* @property int $is_confirm
* @property \Carbon\Carbon|null $created_at
* @property \Carbon\Carbon|null $updated_at
* @method static \Illuminate\Database\Eloquent\Builder|\App\Models\PassportScans whereCreatedAt($value)
* @method static \Illuminate\Database\Eloquent\Builder|\App\Models\PassportScans whereId($value)
* @method static \Illuminate\Database\Eloquent\Builder|\App\Models\PassportScans whereIsConfirm($value)
* @method static \Illuminate\Database\Eloquent\Builder|\App\Models\PassportScans wherePath($value)
* @method static \Illuminate\Database\Eloquent\Builder|\App\Models\PassportScans whereUpdatedAt($value)
* @method static \Illuminate\Database\Eloquent\Builder|\App\Models\PassportScans whereUserId($value)
* @mixin \Eloquent
* @property string|null $photo
* @property string|null $preview
* @method static \Illuminate\Database\Eloquent\Builder|\App\Models\PassportScans wherePhoto($value)
* @method static \Illuminate\Database\Eloquent\Builder|\App\Models\PassportScans wherePreview($value)
*/
class PassportScans extends Model
{
protected $table = 'passport_scans';
public $timestamps = true;
protected $fillable = [
'user_id',
'path',
'is_confirm'
];
}
|
---
layout: page
title: How to Contribute
subtitle: Context
use-site-title: false
---
## Video
## Transcript
|
function editDistance(s1, s2)
m=length(s1)+1
n=length(s2)+1
i=0
j=0
tbl = Dict{Tuple{Int64,Int64},Int64}()
for i=1:m
tbl[i,1]=i-1
end
for j=1:n
tbl[1,j]=j-1
end
for i=2:m
for j=2:n
cost = s1[i-1] == s2[j-1] ? 0 : 1
tbl[i,j] = min(tbl[i, j-1]+1, tbl[i-1, j]+1, tbl[i-1, j-1]+cost)
end
end
tbl[m,n]
end
d=-1
for i=1:1000000
global d
d=editDistance("AAAATTTTCCCCGGGGAAAANTTTTCCCCGGGG", "AAAATTTTCCCCGGGGAAAAMTTTTCCCCGGGG")
end
println(d)
|
package com.shop.service.impl;
import com.shop.service.CustomerService;
import org.springframework.beans.factory.DisposableBean;
import org.springframework.stereotype.Service;
import java.util.concurrent.TimeUnit;
@Service("customerService")
public class CustomerServiceImpl implements CustomerService, DisposableBean {
@Override
public String actionInStore(String product) {
if(product.equalsIgnoreCase("cash")) {
throw new IllegalArgumentException("Don't touch our cash!!!");
}
        try {
            TimeUnit.SECONDS.sleep(2);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            e.printStackTrace();
        }
return "He bought a: " + product;
}
@Override
public void destroy() throws Exception {
}
}
|
import * as moment from 'moment-timezone';
import * as React from 'react';
import { useDispatch, useSelector } from 'react-redux';
import { setSettings } from '../../store/actions';
import { ConfigProps } from '../../types';
const getSettings = state => state.bookings.config;
export const useCalendarSettings = () => {
const curState: ConfigProps = useSelector(getSettings);
const dispatch = useDispatch();
const [weekends, setWeekends] = React.useState<boolean>(curState.weekends);
const [slotDuration, setSlotDuration] = React.useState<any>(
curState.slotDuration,
);
const [startTime, setStartTime] = React.useState<any>(
curState.businessHours.startTime,
);
const [endTime, setEndTime] = React.useState<any>(
curState.businessHours.endTime,
);
const [daysOfWeek, setDaysOfWeek] = React.useState<number[]>(
curState.businessHours.daysOfWeek,
);
const [timeZone, setTimeZone] = React.useState<any>(moment.tz.guess());
React.useEffect(() => {
dispatch(
setSettings({
weekends,
slotDuration,
businessHours: {
startTime,
endTime,
daysOfWeek,
},
}),
);
}, [
weekends,
startTime,
endTime,
daysOfWeek,
slotDuration,
timeZone,
dispatch,
]);
return {
weekends,
setWeekends,
daysOfWeek,
setDaysOfWeek,
slotDuration,
setSlotDuration,
startTime,
setStartTime,
endTime,
setEndTime,
timeZone,
setTimeZone,
};
};
|
#include "ComponentObject.h"
ComponentObject::ComponentObject()
{
_enabled = true;
}
ComponentObject::~ComponentObject()
{
}
void ComponentObject::update(float dt)
{
}
ComponentObject* ComponentObject::create()
{
ComponentObject *node = new ComponentObject();
if (node)
{
node->autorelease();
return node;
}
CC_SAFE_DELETE(node);
return nullptr;
}
|
# typed: strict
module Types
class ActivityFeedType < Types::BaseEnum
description "Options for filtering events in the activity feed."
    value "GLOBAL", value: "global", description: "Events from everyone."
    value "FOLLOWING", value: "following", description: "Events from the current user and anyone they follow."
end
end
|
// Copyright (c) 2020 Oxford-Hainan Blockchain Research Institute
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
#pragma once
#include "jsonrpc.h"
#include "types.h"
namespace Ethereum {
template <class TTag, typename TParams, typename TResult>
struct RpcBuilder {
using Tag = TTag;
using Params = TParams;
using Result = TResult;
using In = jsonrpc::ProcedureCall<TParams>;
using Out = jsonrpc::Response<TResult>;
static constexpr auto name = TTag::name;
static In make(ccf::SeqNo n = 0) {
In in;
in.id = n;
in.method = TTag::name;
return in;
}
};
namespace ethrpc {
struct GetAccountsTag {
static constexpr auto name = "eth_accounts";
};
using GetAccounts = RpcBuilder<GetAccountsTag, void, std::vector<eevm::Address>>;
struct GetChainIdTag {
static constexpr auto name = "eth_chainId";
};
using GetChainId = RpcBuilder<GetChainIdTag, void, size_t>;
struct GetGasPriceTag {
static constexpr auto name = "eth_gasPrice";
};
using GetGasPrice = RpcBuilder<GetGasPriceTag, void, size_t>;
struct GetEstimateGasTag {
static constexpr auto name = "eth_estimateGas";
};
using GetEstimateGas = RpcBuilder<GetEstimateGasTag, EstimateGas, Result>;
struct GetBalanceTag {
static constexpr auto name = "eth_getBalance";
};
using GetBalance = RpcBuilder<GetBalanceTag, AddressWithBlock, Balance>;
struct GetCodeTag {
static constexpr auto name = "eth_getCode";
};
using GetCode = RpcBuilder<GetCodeTag, AddressWithBlock, ByteData>;
struct GetTransactionCountTag {
static constexpr auto name = "eth_getTransactionCount";
};
using GetTransactionCount = RpcBuilder<GetTransactionCountTag, GetTransactionCount, size_t>;
struct GetTransactionReceiptTag {
static constexpr auto name = "eth_getTransactionReceipt";
};
using GetTransactionReceipt =
RpcBuilder<GetTransactionReceiptTag, GetTransactionReceipt, ReceiptResponse>;
struct SendRawTransactionTag {
static constexpr auto name = "eth_sendRawTransaction";
};
using SendRawTransaction = RpcBuilder<SendRawTransactionTag, SendRawTransaction, TxHash>;
} // namespace ethrpc
} // namespace Ethereum
|
#include <stdio.h>
char stack[1000002];
int main(void)
{
    int n, ptr;
    int ch;
scanf("%d", &n);
getchar();
while(n--)
{
ptr = 0;
        while((ch = getchar()) != '\n' && ch != EOF)
{
stack[ptr] = ch;
if(ptr && stack[ptr]==stack[ptr-1])
ptr--;
else
ptr++;
}
stack[ptr] = '\0';
printf("%s\n",stack);
}
return 0;
}
|
## Copyright and license

This component is under the [MIT license](https://github.com/gpupo/common-sdk/blob/master/LICENSE).
For copyright and license information, read the [license](https://github.com/gpupo/common-sdk/blob/master/LICENSE)
file distributed with this source code.

### License summary

Required:

- License and copyright notice

Permitted:

- Commercial use
- Modification
- Distribution
- Sublicensing

Forbidden:

- Hold liable
---
## Installation

    git clone --depth=1 https://github.com/gpupo/bash-utilities.git ~/bash-utilities;

### Update

    cd ~/bash-utilities && git pull;

## Usage

Log and monitor the load time of a URL every minute:

    ~/bash-utilities/loadTime.sh www.google.com /var/log/loadTime.log

Back up the crontab of every user on the server:

    sudo $HOME/bash-utilities/crontabDump.sh /var/log/cron

crontab line:

    1 6 * * * sudo $HOME/bash-utilities/crontabDump.sh /var/log/cron
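Once the load-time log has accumulated entries, it can be inspected with standard tools. The sketch below assumes each log line ends in a load time in seconds — a hypothetical format, since the actual layout written by `loadTime.sh` is not documented here:

```shell
# Hypothetical log format: "<timestamp> <seconds>" per line.
# Build a small sample, then list the two slowest entries.
printf '2024-01-01T00:00 0.41\n2024-01-01T00:01 1.73\n2024-01-01T00:02 0.09\n' > /tmp/loadTime.sample
sort -k2 -n /tmp/loadTime.sample | tail -2
```

Adjust the field number passed to `sort -k` to match the real log layout.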
## License
MIT, see LICENSE.
|
import { render, screen } from '@testing-library/react';
import fetchMock from 'fetch-mock';
import { ContextFactory as mockContextFactory } from 'utils/test/factories';
import createQueryClient from 'utils/react-query/createQueryClient';
import { Maybe, Nullable } from 'types/utils';
import { User } from 'types/User';
import { SessionProvider, useSession } from '.';
jest.mock('utils/context', () => ({
__esModule: true,
default: mockContextFactory({
authentication: undefined,
}).generate(),
}));
describe('useSession', () => {
const queryClient = createQueryClient({ persistor: true });
beforeEach(() => {
jest.useFakeTimers('modern');
queryClient.clear();
fetchMock.restore();
});
afterEach(() => {
jest.clearAllTimers();
});
it('provides a null user if there is no authentication backend', async () => {
let userInComponent: Maybe<Nullable<User>>;
const Component = () => {
const { user } = useSession();
userInComponent = user;
return <span>component rendered</span>;
};
render(
<SessionProvider>
<Component />
</SessionProvider>,
);
expect(userInComponent).toBeNull();
screen.getByText('component rendered');
});
});
|
//using System;
//using System.Collections;
//using System.Collections.Generic;
//using System.Linq;
//using System.Text;
//using System.Threading.Tasks;
//using MongoDB.Bson;
//namespace Khooversoft.MongoDb
//{
// public class ExpressionNode<T> : IExpressionNode, IEnumerable<IExpressionNode>
// {
// private readonly List<IExpressionNode> _children = new List<IExpressionNode>();
// public ExpressionNode(T type)
// {
// Type = type;
// }
// public ExpressionNode(T type, string name)
// : this(type)
// {
// Name = name;
// }
// public ExpressionNode(T type, string name, IEnumerable<IExpressionNode> nodes)
// : this(type, name)
// {
// _children.AddRange(nodes);
// }
// public T Type { get; }
// public string Name { get; }
// public IExpressionNode this[int index]
// {
// get { return _children[index]; }
// set { _children[index] = value; }
// }
// public int Count => _children.Count;
// public void Clear()
// {
// _children.Clear();
// }
// public void Add(IExpressionNode node)
// {
// _children.Add(node);
// }
// public void RemoveAt(int index)
// {
// _children.RemoveAt(index);
// }
// public BsonDocument ToBsonDocument()
// {
// var document = new BsonDocument();
// foreach (var item in this)
// {
// switch (item)
// {
// case TerminalNode<T> terminal:
// document.AddRange(terminal.ToBsonDocument());
// break;
// case ExpressionNode<T> expression:
// document.Add(expression.Name, new BsonArray { expression.ToBsonDocument() });
// break;
// default:
// throw new ArgumentException($"Unknown type-{item.GetType().FullName}");
// }
// }
// return document;
// }
// public IEnumerator<IExpressionNode> GetEnumerator()
// {
// return _children.GetEnumerator();
// }
// IEnumerator IEnumerable.GetEnumerator()
// {
// return _children.GetEnumerator();
// }
// public static ExpressionNode<T> operator +(ExpressionNode<T> rootNode, IExpressionNode nodeToAdd)
// {
// rootNode.Add(nodeToAdd);
// return rootNode;
// }
// }
//}
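The commented-out C# `ExpressionNode<T>` above is a composite: terminals merge their key/value pairs into the current document, while named child expressions nest their serialized children in an array under the expression's name. As an illustrative sketch only (not a port of the original, and using plain maps in place of `BsonDocument`), the same pattern looks like this in Go:

```go
package main

import "fmt"

// Node is either a terminal (Key/Value set) or an expression (Name/Children set).
type Node struct {
	Key      string
	Value    interface{}
	Name     string
	Children []*Node
}

// Terminal creates a leaf node carrying a single key/value pair.
func Terminal(key string, value interface{}) *Node {
	return &Node{Key: key, Value: value}
}

// Expression creates a named node grouping child nodes, mirroring the
// ExpressionNode<T> constructor that takes a name and child nodes.
func Expression(name string, children ...*Node) *Node {
	return &Node{Name: name, Children: children}
}

// ToDoc mirrors ToBsonDocument: terminal children merge into the current
// document; expression children nest their document in an array under Name.
func (n *Node) ToDoc() map[string]interface{} {
	doc := map[string]interface{}{}
	for _, child := range n.Children {
		if child.Name == "" && len(child.Children) == 0 {
			doc[child.Key] = child.Value // terminal: merge in place
		} else {
			doc[child.Name] = []interface{}{child.ToDoc()} // expression: nest
		}
	}
	return doc
}

func main() {
	root := Expression("",
		Terminal("status", "open"),
		Expression("$and", Terminal("age", 21)))
	fmt.Println(root.ToDoc()) // map[$and:[map[age:21]] status:open]
}
```

The `default` branch of the original `switch` (throwing on unknown node types) is omitted here because this sketch has a single concrete node type.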
package appspacerouter
import (
"testing"
pathToRegexp "github.com/soongo/path-to-regexp"
)
func TestP2R(t *testing.T) {
toEnd := false
r, err := pathToRegexp.PathToRegexp("/abc", nil, &pathToRegexp.Options{End: &toEnd})
if err != nil {
t.Error(err)
}
m, err := r.FindStringMatch("/abc/def")
if err != nil {
t.Error(err)
}
if m == nil {
t.Fatal("expected a match")
}
if len(m.Groups()) != 1 {
t.Error("expected 1 group")
}
m, err = r.FindStringMatch("/uvw/def")
if err != nil {
t.Error(err)
}
if m != nil {
t.Error("expected no match")
}
m, err = r.FindStringMatch("/ab")
if err != nil {
t.Error(err)
}
if m != nil {
t.Error("expected no match")
}
}
func TestP2RTokens(t *testing.T) {
var tokens []pathToRegexp.Token
r, err := pathToRegexp.PathToRegexp("/abc/:id", &tokens, nil)
if err != nil {
t.Error(err)
}
if len(tokens) != 1 {
t.Error("expected 1 token")
}
m, err := r.FindStringMatch("/abc")
if err != nil {
t.Error(err)
}
if m != nil {
t.Error("expected no match")
}
m, err = r.FindStringMatch("/abc/")
if err != nil {
t.Error(err)
}
if m != nil {
t.Error("expected no match")
}
m, err = r.FindStringMatch("/abc/x")
if err != nil {
t.Error(err)
}
if m == nil {
t.Fatal("expected a match")
}
groups := m.Groups()
if len(groups) != 2 {
t.Error("expected 2 groups")
}
group := groups[1]
if group.String() != "x" {
t.Error("expected group value x, got " + group.String())
}
}
func TestP2RMatch(t *testing.T) {
abcIDMatch, err := pathToRegexp.Match("/abc/:id", nil)
if err != nil {
t.Error(err)
}
m, err := abcIDMatch("/abc/x")
if err != nil {
t.Error(err)
}
if m == nil {
t.Error("expected a match")
return
}
// if m.Path != "/abc/:id" {
// t.Error("expected the router path " + m.Path)
// }
id, ok := m.Params["id"]
if !ok {
t.Error("expected an id entry in params")
}
if id != "x" {
t.Error("expected id to be x")
}
}
// TODO: add actual tests for V0approutes.
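The tests above exercise a third-party library that compiles patterns like `/abc/:id` into regular expressions. As a minimal stdlib-only sketch (not the `soongo/path-to-regexp` implementation, and covering only the simple `:name` segment syntax used in these tests), each `:name` segment can be turned into a named capture group that matches exactly one non-empty path segment:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// compilePattern turns "/abc/:id" into ^/abc/(?P<id>[^/]+)$.
func compilePattern(pattern string) (*regexp.Regexp, error) {
	var b strings.Builder
	b.WriteString("^")
	for _, seg := range strings.Split(strings.TrimPrefix(pattern, "/"), "/") {
		b.WriteString("/")
		if strings.HasPrefix(seg, ":") {
			// Parameter segment: named group, one non-empty path segment.
			b.WriteString("(?P<" + seg[1:] + ">[^/]+)")
		} else {
			// Literal segment: escape any regexp metacharacters.
			b.WriteString(regexp.QuoteMeta(seg))
		}
	}
	b.WriteString("$")
	return regexp.Compile(b.String())
}

// matchParams returns the named parameters for a path, or nil if no match.
func matchParams(re *regexp.Regexp, path string) map[string]string {
	m := re.FindStringSubmatch(path)
	if m == nil {
		return nil
	}
	params := map[string]string{}
	for i, name := range re.SubexpNames() {
		if name != "" {
			params[name] = m[i]
		}
	}
	return params
}

func main() {
	re, _ := compilePattern("/abc/:id")
	fmt.Println(matchParams(re, "/abc/x")) // map[id:x]
	fmt.Println(matchParams(re, "/abc"))   // map[]
}
```

Because `[^/]+` requires at least one character, `/abc` and `/abc/` do not match, mirroring the behavior asserted in TestP2RTokens above.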